WebKit Bugzilla
Attachment 356735 Details for Bug 186933: [WPE][GTK] Implement WebAudioSourceProviderGStreamer to allow bridging MediaStream and the WebAudio APIs
Description: Patch
Filename: bug-186933-20181206143613.patch
MIME Type: text/plain
Creator: Thibault Saunier
Created: 2018-12-06 09:36:14 PST
Size: 31.54 KB
>Subversion Revision: 238666
>diff --git a/Source/WebCore/ChangeLog b/Source/WebCore/ChangeLog
>index 0f2b337981efbb85889ff67fc1bdb56516af2658..6774fc04754dc775dab64961c266f1409f454594 100644
>--- a/Source/WebCore/ChangeLog
>+++ b/Source/WebCore/ChangeLog
>@@ -1,3 +1,47 @@
>+2018-12-05  Thibault Saunier  <tsaunier@igalia.com>
>+
>+        [WPE][GTK] Implement WebAudioSourceProviderGStreamer to allow bridging MediaStream and the WebAudio APIs
>+        https://bugs.webkit.org/show_bug.cgi?id=186933
>+
>+        Reusing the AudioSourceProviderGStreamer itself as it was doing almost everything we needed,
>+        just added a constructor to be able to create it from a MediaStreamTrackPrivate and made it a
>+        WebAudioSourceProvider which only means it is now a ThreadSafeRefCounted.
>+
>+        Sensibly refactored GStreamerMediaStreamSource so that we could reuse it to track a single
>+        MediaStreamTrack.
>+
>+        Reviewed by NOBODY (OOPS!).
>+
>+        Enabled all tests depending on that feature.
>+
>+        * platform/audio/gstreamer/AudioSourceProviderGStreamer.cpp:
>+        (WebCore::AudioSourceProviderGStreamer::AudioSourceProviderGStreamer):
>+        (WebCore::AudioSourceProviderGStreamer::~AudioSourceProviderGStreamer):
>+        (WebCore::AudioSourceProviderGStreamer::setClient):
>+        * platform/audio/gstreamer/AudioSourceProviderGStreamer.h:
>+        * platform/mediastream/MediaStreamTrackPrivate.cpp:
>+        (WebCore::MediaStreamTrackPrivate::audioSourceProvider):
>+        * platform/mediastream/gstreamer/GStreamerAudioCapturer.cpp:
>+        (WebCore::GStreamerAudioCapturer::GStreamerAudioCapturer):
>+        * platform/mediastream/gstreamer/GStreamerAudioStreamDescription.h:
>+        * platform/mediastream/gstreamer/GStreamerMediaStreamSource.cpp:
>+        (WebCore::webkitMediaStreamSrcSetupSrc):
>+        (WebCore::webkitMediaStreamSrcSetupAppSrc):
>+        (WebCore::webkitMediaStreamSrcAddTrack):
>+        (WebCore::webkitMediaStreamSrcSetStream):
>+        (WebCore::webkitMediaStreamSrcNew):
>+        * platform/mediastream/gstreamer/GStreamerMediaStreamSource.h:
>+        * platform/mediastream/gstreamer/MockGStreamerAudioCaptureSource.cpp:
>+        (WebCore::WrappedMockRealtimeAudioSource::WrappedMockRealtimeAudioSource):
>+        (WebCore::WrappedMockRealtimeAudioSource::start):
>+        (WebCore::WrappedMockRealtimeAudioSource::addHum):
>+        (WebCore::WrappedMockRealtimeAudioSource::render):
>+        (WebCore::WrappedMockRealtimeAudioSource::settingsDidChange):
>+        (WebCore::MockGStreamerAudioCaptureSource::startProducingData):
>+        * platform/mediastream/gstreamer/RealtimeOutgoingAudioSourceLibWebRTC.cpp:
>+        (WebCore::RealtimeOutgoingAudioSourceLibWebRTC::pullAudioData): Handle the case where input buffers
>+        are "big" and process all the data we can for each run of the method.
>+
> 2018-11-29  Rob Buis  <rbuis@igalia.com>
>
>         Remove some superfluous code in ContentSecurityPolicy::upgradeInsecureRequestIfNeeded
>diff --git a/Source/WebCore/platform/audio/gstreamer/AudioSourceProviderGStreamer.cpp b/Source/WebCore/platform/audio/gstreamer/AudioSourceProviderGStreamer.cpp
>index d4a0ca2bf8a6e6be502aeb277adee2a8a6c35993..e7111f3c023ae615f0195302979116c3577b649f 100644
>--- a/Source/WebCore/platform/audio/gstreamer/AudioSourceProviderGStreamer.cpp
>+++ b/Source/WebCore/platform/audio/gstreamer/AudioSourceProviderGStreamer.cpp
>@@ -27,6 +27,10 @@
> #include <gst/audio/audio-info.h>
> #include <gst/base/gstadapter.h>
>
>+#if ENABLE(MEDIA_STREAM) && USE(LIBWEBRTC)
>+#include "GStreamerAudioData.h"
>+#include "GStreamerMediaStreamSource.h"
>+#endif
>
> namespace WebCore {
>
>@@ -94,6 +98,31 @@ AudioSourceProviderGStreamer::AudioSourceProviderGStreamer()
>     m_frontRightAdapter = gst_adapter_new();
> }
>
>+#if ENABLE(MEDIA_STREAM) && USE(LIBWEBRTC)
>+AudioSourceProviderGStreamer::AudioSourceProviderGStreamer(MediaStreamTrackPrivate& source)
>+    : m_notifier(MainThreadNotifier<MainThreadNotification>::create())
>+    , m_client(nullptr)
>+    , m_deinterleaveSourcePads(0)
>+    , m_deinterleavePadAddedHandlerId(0)
>+    , m_deinterleaveNoMorePadsHandlerId(0)
>+    , m_deinterleavePadRemovedHandlerId(0)
>+{
>+    m_frontLeftAdapter = gst_adapter_new();
>+    m_frontRightAdapter = gst_adapter_new();
>+    auto pipelineName = String::format("WebAudioProvider_MediaStreamTrack_%s", source.id().utf8().data());
>+    m_pipeline = adoptGRef(GST_ELEMENT(g_object_ref_sink(gst_element_factory_make("pipeline", pipelineName.utf8().data()))));
>+    auto src = webkitMediaStreamSrcNew();
>+    webkitMediaStreamSrcAddTrack(WEBKIT_MEDIA_STREAM_SRC(src), &source, true);
>+
>+    m_audioSinkBin = adoptGRef(GST_ELEMENT(g_object_ref_sink(gst_parse_bin_from_description("tee name=audioTee", true, nullptr))));
>+
>+    gst_bin_add_many(GST_BIN(m_pipeline.get()), src, m_audioSinkBin.get(), nullptr);
>+    gst_element_link(src, m_audioSinkBin.get());
>+
>+    connectSimpleBusMessageCallback(m_pipeline.get());
>+}
>+#endif
>+
> AudioSourceProviderGStreamer::~AudioSourceProviderGStreamer()
> {
>     m_notifier->invalidate();
>@@ -105,6 +134,9 @@ AudioSourceProviderGStreamer::~AudioSourceProviderGStreamer()
>         g_signal_handler_disconnect(deinterleave.get(), m_deinterleavePadRemovedHandlerId);
>     }
>
>+    if (m_pipeline)
>+        gst_element_set_state(m_pipeline.get(), GST_STATE_NULL);
>+
>     g_object_unref(m_frontLeftAdapter);
>     g_object_unref(m_frontRightAdapter);
> }
>@@ -205,12 +237,17 @@ void AudioSourceProviderGStreamer::setClient(AudioSourceProviderClient* client)
>     ASSERT(client);
>     m_client = client;
>
>+    if (m_pipeline)
>+        gst_element_set_state(m_pipeline.get(), GST_STATE_PLAYING);
>+
>     // The volume element is used to mute audio playback towards the
>     // autoaudiosink. This is needed to avoid double playback of audio
>     // from our audio sink and from the WebAudio AudioDestination node
>     // supposedly configured already by application side.
>     GRefPtr<GstElement> volumeElement = adoptGRef(gst_bin_get_by_name(GST_BIN(m_audioSinkBin.get()), "volume"));
>-    g_object_set(volumeElement.get(), "mute", TRUE, nullptr);
>+
>+    if (volumeElement)
>+        g_object_set(volumeElement.get(), "mute", TRUE, nullptr);
>
>     // The audioconvert and audioresample elements are needed to
>     // ensure deinterleave and the sinks downstream receive buffers in
>diff --git a/Source/WebCore/platform/audio/gstreamer/AudioSourceProviderGStreamer.h b/Source/WebCore/platform/audio/gstreamer/AudioSourceProviderGStreamer.h
>index ab60198c880fdb94463bd7192dedfd6be07349f7..92d0a86f45e75d7dcf54f803e2879c5211b9db70 100644
>--- a/Source/WebCore/platform/audio/gstreamer/AudioSourceProviderGStreamer.h
>+++ b/Source/WebCore/platform/audio/gstreamer/AudioSourceProviderGStreamer.h
>@@ -28,14 +28,31 @@
> #include <wtf/Forward.h>
> #include <wtf/Noncopyable.h>
>
>+#if ENABLE(MEDIA_STREAM) && USE(LIBWEBRTC)
>+#include "GStreamerAudioStreamDescription.h"
>+#include "MediaStreamTrackPrivate.h"
>+#include "WebAudioSourceProvider.h"
>+#endif
>+
> typedef struct _GstAdapter GstAdapter;
> typedef struct _GstAppSink GstAppSink;
>
> namespace WebCore {
>
>+#if ENABLE(MEDIA_STREAM) && USE(LIBWEBRTC)
>+class AudioSourceProviderGStreamer final : public WebAudioSourceProvider {
>+public:
>+    static Ref<AudioSourceProviderGStreamer> create(MediaStreamTrackPrivate& source)
>+    {
>+        return adoptRef(*new AudioSourceProviderGStreamer(source));
>+    }
>+    AudioSourceProviderGStreamer(MediaStreamTrackPrivate&);
>+#else
> class AudioSourceProviderGStreamer : public AudioSourceProvider {
>     WTF_MAKE_NONCOPYABLE(AudioSourceProviderGStreamer);
> public:
>+#endif
>+
>     AudioSourceProviderGStreamer();
>     ~AudioSourceProviderGStreamer();
>
>@@ -54,6 +71,7 @@ public:
>     void clearAdapters();
>
> private:
>+    GRefPtr<GstElement> m_pipeline;
>     enum MainThreadNotification {
>         DeinterleavePadsConfigured = 1 << 0,
>     };
>diff --git a/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.cpp b/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.cpp
>index 5166de6f7fa1caa3a51fecda38fcdee046a359e1..95fa95ed4f94b84cd09d6f18bbfdf61d53f588f9 100644
>--- a/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.cpp
>+++ b/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.cpp
>@@ -35,6 +35,8 @@
>
> #if PLATFORM(COCOA)
> #include "WebAudioSourceProviderAVFObjC.h"
>+#elif ENABLE(WEB_AUDIO) && ENABLE(MEDIA_STREAM) && USE(LIBWEBRTC) && USE(GSTREAMER)
>+#include "AudioSourceProviderGStreamer.h"
> #else
> #include "WebAudioSourceProvider.h"
> #endif
>@@ -178,6 +180,9 @@ AudioSourceProvider* MediaStreamTrackPrivate::audioSourceProvider()
> #if PLATFORM(COCOA)
>     if (!m_audioSourceProvider)
>         m_audioSourceProvider = WebAudioSourceProviderAVFObjC::create(*this);
>+#elif USE(LIBWEBRTC) && USE(GSTREAMER)
>+    if (!m_audioSourceProvider)
>+        m_audioSourceProvider = AudioSourceProviderGStreamer::create(*this);
> #endif
>     return m_audioSourceProvider.get();
> }
>diff --git a/Source/WebCore/platform/mediastream/gstreamer/GStreamerAudioCapturer.cpp b/Source/WebCore/platform/mediastream/gstreamer/GStreamerAudioCapturer.cpp
>index 8df758a109fee65e823993ea842418b4b3218a68..14e23189d8e63bc888d61500a507219f439cf531 100644
>--- a/Source/WebCore/platform/mediastream/gstreamer/GStreamerAudioCapturer.cpp
>+++ b/Source/WebCore/platform/mediastream/gstreamer/GStreamerAudioCapturer.cpp
>@@ -36,20 +36,10 @@ GStreamerAudioCapturer::GStreamerAudioCapturer(GStreamerCaptureDevice device)
> }
>
> GStreamerAudioCapturer::GStreamerAudioCapturer()
>-    : GStreamerCapturer("audiotestsrc", adoptGRef(gst_caps_new_simple("audio/x-raw", "rate", G_TYPE_INT, LibWebRTCAudioFormat::sampleRate, nullptr)))
>+    : GStreamerCapturer("appsrc", adoptGRef(gst_caps_new_simple("audio/x-raw", "rate", G_TYPE_INT, LibWebRTCAudioFormat::sampleRate, nullptr)))
> {
> }
>
>-GstElement* GStreamerAudioCapturer::createSource()
>-{
>-    GstElement* source = GStreamerCapturer::createSource();
>-
>-    if (!m_device)
>-        gst_util_set_object_arg(G_OBJECT(m_src.get()), "wave", "ticks");
>-
>-    return source;
>-}
>-
> GstElement* GStreamerAudioCapturer::createConverter()
> {
>     auto converter = gst_parse_bin_from_description("audioconvert ! audioresample", TRUE, nullptr);
>@@ -62,14 +52,15 @@ GstElement* GStreamerAudioCapturer::createConverter()
> bool GStreamerAudioCapturer::setSampleRate(int sampleRate)
> {
>
>-    if (sampleRate > 0) {
>-        GST_INFO_OBJECT(m_pipeline.get(), "Setting SampleRate %d", sampleRate);
>-        m_caps = adoptGRef(gst_caps_new_simple("audio/x-raw", "rate", G_TYPE_INT, sampleRate, nullptr));
>-    } else {
>+    if (sampleRate <= 0) {
>         GST_INFO_OBJECT(m_pipeline.get(), "Not forcing sample rate");
>-        m_caps = adoptGRef(gst_caps_new_empty_simple("audio/x-raw"));
>+
>+        return false;
>     }
>
>+    GST_INFO_OBJECT(m_pipeline.get(), "Setting SampleRate %d", sampleRate);
>+    m_caps = adoptGRef(gst_caps_new_simple("audio/x-raw", "rate", G_TYPE_INT, sampleRate, nullptr));
>+
>     if (!m_capsfilter.get())
>         return false;
>
>diff --git a/Source/WebCore/platform/mediastream/gstreamer/GStreamerAudioCapturer.h b/Source/WebCore/platform/mediastream/gstreamer/GStreamerAudioCapturer.h
>index 7b279ee344e5ba7967a7adab59acedff5571fab0..9aaf761d6c4b43c44ef44711e837b0f1252d1f5a 100644
>--- a/Source/WebCore/platform/mediastream/gstreamer/GStreamerAudioCapturer.h
>+++ b/Source/WebCore/platform/mediastream/gstreamer/GStreamerAudioCapturer.h
>@@ -32,7 +32,6 @@ public:
>     GStreamerAudioCapturer(GStreamerCaptureDevice);
>     GStreamerAudioCapturer();
>
>-    GstElement* createSource() final;
>     GstElement* createConverter() final;
>     const char* name() final { return "Audio"; }
>
>diff --git a/Source/WebCore/platform/mediastream/gstreamer/GStreamerAudioStreamDescription.h b/Source/WebCore/platform/mediastream/gstreamer/GStreamerAudioStreamDescription.h
>index 861adef0108211d0c43b4ba25762bc6008f620b1..64fced400333d6b5294c7c6db9dc8570df2c3cc8 100644
>--- a/Source/WebCore/platform/mediastream/gstreamer/GStreamerAudioStreamDescription.h
>+++ b/Source/WebCore/platform/mediastream/gstreamer/GStreamerAudioStreamDescription.h
>@@ -82,6 +82,7 @@ public:
>     bool isSignedInteger() const final { return GST_AUDIO_INFO_IS_INTEGER(&m_info); }
>     bool isNativeEndian() const final { return GST_AUDIO_INFO_ENDIANNESS(&m_info) == G_BYTE_ORDER; }
>     bool isFloat() const final { return GST_AUDIO_INFO_IS_FLOAT(&m_info); }
>+    int bytesPerFrame() { return GST_AUDIO_INFO_BPF(&m_info); }
>
>     uint32_t numberOfInterleavedChannels() const final { return isInterleaved() ? GST_AUDIO_INFO_CHANNELS(&m_info) : TRUE; }
>     uint32_t numberOfChannelStreams() const final { return GST_AUDIO_INFO_CHANNELS(&m_info); }
>diff --git a/Source/WebCore/platform/mediastream/gstreamer/GStreamerMediaStreamSource.cpp b/Source/WebCore/platform/mediastream/gstreamer/GStreamerMediaStreamSource.cpp
>index 2513ca94be9c7e000c46608ca110dcae1378ac2f..927d986c205bb3f0cb21d9ffceb00c30286a1e8c 100644
>--- a/Source/WebCore/platform/mediastream/gstreamer/GStreamerMediaStreamSource.cpp
>+++ b/Source/WebCore/platform/mediastream/gstreamer/GStreamerMediaStreamSource.cpp
>@@ -41,7 +41,6 @@ namespace WebCore {
> static void webkitMediaStreamSrcPushVideoSample(WebKitMediaStreamSrc* self, GstSample* gstsample);
> static void webkitMediaStreamSrcPushAudioSample(WebKitMediaStreamSrc* self, GstSample* gstsample);
> static void webkitMediaStreamSrcTrackEnded(WebKitMediaStreamSrc* self, MediaStreamTrackPrivate&);
>-static void webkitMediaStreamSrcAddTrack(WebKitMediaStreamSrc* self, MediaStreamTrackPrivate*);
> static void webkitMediaStreamSrcRemoveTrackByType(WebKitMediaStreamSrc* self, RealtimeMediaSource::Type trackType);
>
> static GstStaticPadTemplate videoSrcTemplate = GST_STATIC_PAD_TEMPLATE("video_src",
>@@ -156,7 +155,7 @@ public:
>
>     void didAddTrack(MediaStreamTrackPrivate& track) final
>     {
>-        webkitMediaStreamSrcAddTrack(m_mediaStreamSrc.get(), &track);
>+        webkitMediaStreamSrcAddTrack(m_mediaStreamSrc.get(), &track, false);
>     }
>
>     void didRemoveTrack(MediaStreamTrackPrivate& track) final
>@@ -182,8 +181,8 @@ struct _WebKitMediaStreamSrc {
>     std::unique_ptr<WebKitMediaStreamTrackObserver> mediaStreamTrackObserver;
>     std::unique_ptr<WebKitMediaStreamObserver> mediaStreamObserver;
>     volatile gint npads;
>-    gulong probeid;
>     RefPtr<MediaStreamPrivate> stream;
>+    RefPtr<MediaStreamTrackPrivate> track;
>
>     GstFlowCombiner* flowCombiner;
>     GRefPtr<GstStreamCollection> streamCollection;
>@@ -314,8 +313,11 @@ static GstStateChangeReturn webkitMediaStreamSrcChangeState(GstElement* element,
>     if (transition == GST_STATE_CHANGE_PAUSED_TO_READY) {
>
>         GST_OBJECT_LOCK(self);
>-        for (auto& track : self->stream->tracks())
>-            track->removeObserver(*self->mediaStreamTrackObserver.get());
>+        if (self->stream) {
>+            for (auto& track : self->stream->tracks())
>+                track->removeObserver(*self->mediaStreamTrackObserver.get());
>+        } else if (self->track)
>+            self->track->removeObserver(*self->mediaStreamTrackObserver.get());
>         GST_OBJECT_UNLOCK(self);
>     }
>
>@@ -435,22 +437,26 @@ static GstPadProbeReturn webkitMediaStreamSrcPadProbeCb(GstPad* pad, GstPadProbe
>
> static gboolean webkitMediaStreamSrcSetupSrc(WebKitMediaStreamSrc* self,
>     MediaStreamTrackPrivate* track, GstElement* element,
>-    GstStaticPadTemplate* pad_template, gboolean observe_track)
>+    GstStaticPadTemplate* pad_template, gboolean observe_track,
>+    bool onlyTrack)
> {
>     auto pad = adoptGRef(gst_element_get_static_pad(element, "src"));
>
>     gst_bin_add(GST_BIN(self), element);
>
>-    ProbeData* data = new ProbeData;
>-    data->self = WEBKIT_MEDIA_STREAM_SRC(self);
>-    data->pad_template = pad_template;
>-    data->track = track;
>-
>-    self->probeid = gst_pad_add_probe(pad.get(), (GstPadProbeType)GST_PAD_PROBE_TYPE_EVENT_DOWNSTREAM,
>-        (GstPadProbeCallback)webkitMediaStreamSrcPadProbeCb, data,
>-        [](gpointer data) {
>-            delete (ProbeData*)data;
>-        });
>+    if (!onlyTrack) {
>+        ProbeData* data = new ProbeData;
>+        data->self = WEBKIT_MEDIA_STREAM_SRC(self);
>+        data->pad_template = pad_template;
>+        data->track = track;
>+
>+        gst_pad_add_probe(pad.get(), (GstPadProbeType)GST_PAD_PROBE_TYPE_EVENT_DOWNSTREAM,
>+            (GstPadProbeCallback)webkitMediaStreamSrcPadProbeCb, data,
>+            [](gpointer data) {
>+                delete (ProbeData*)data;
>+            });
>+    } else
>+        webkitMediaStreamSrcAddPad(self, pad.get(), pad_template);
>
>     if (observe_track)
>         track->addObserver(*self->mediaStreamTrackObserver.get());
>
>@@ -461,12 +467,12 @@ static gboolean webkitMediaStreamSrcSetupSrc(WebKitMediaStreamSrc* self,
>
> static gboolean webkitMediaStreamSrcSetupAppSrc(WebKitMediaStreamSrc* self,
>     MediaStreamTrackPrivate* track, GstElement** element,
>-    GstStaticPadTemplate* pad_template)
>+    GstStaticPadTemplate* pad_template, bool onlyTrack)
> {
>     *element = gst_element_factory_make("appsrc", nullptr);
>     g_object_set(*element, "is-live", true, "format", GST_FORMAT_TIME, nullptr);
>
>-    return webkitMediaStreamSrcSetupSrc(self, track, *element, pad_template, TRUE);
>+    return webkitMediaStreamSrcSetupSrc(self, track, *element, pad_template, TRUE, onlyTrack);
> }
>
> static void webkitMediaStreamSrcPostStreamCollection(WebKitMediaStreamSrc* self, MediaStreamPrivate* stream)
>@@ -484,14 +490,20 @@ static void webkitMediaStreamSrcPostStreamCollection(WebKitMediaStreamSrc* self,
>         gst_message_new_stream_collection(GST_OBJECT(self), self->streamCollection.get()));
> }
>
>-static void webkitMediaStreamSrcAddTrack(WebKitMediaStreamSrc* self, MediaStreamTrackPrivate* track)
>+bool webkitMediaStreamSrcAddTrack(WebKitMediaStreamSrc* self, MediaStreamTrackPrivate* track, bool onlyTrack)
> {
>+    bool res = false;
>     if (track->type() == RealtimeMediaSource::Type::Audio)
>-        webkitMediaStreamSrcSetupAppSrc(self, track, &self->audioSrc, &audioSrcTemplate);
>+        res = webkitMediaStreamSrcSetupAppSrc(self, track, &self->audioSrc, &audioSrcTemplate, onlyTrack);
>     else if (track->type() == RealtimeMediaSource::Type::Video)
>-        webkitMediaStreamSrcSetupAppSrc(self, track, &self->videoSrc, &videoSrcTemplate);
>+        res = webkitMediaStreamSrcSetupAppSrc(self, track, &self->videoSrc, &videoSrcTemplate, onlyTrack);
>     else
>         GST_INFO("Unsupported track type: %d", static_cast<int>(track->type()));
>+
>+    if (onlyTrack && res)
>+        self->track = track;
>+
>+    return false;
> }
>
> static void webkitMediaStreamSrcRemoveTrackByType(WebKitMediaStreamSrc* self, RealtimeMediaSource::Type trackType)
>@@ -524,7 +536,7 @@ bool webkitMediaStreamSrcSetStream(WebKitMediaStreamSrc* self, MediaStreamPrivat
>     self->stream = stream;
>     self->stream->addObserver(*self->mediaStreamObserver.get());
>     for (auto& track : stream->tracks())
>-        webkitMediaStreamSrcAddTrack(self, track.get());
>+        webkitMediaStreamSrcAddTrack(self, track.get(), false);
>
>     return TRUE;
> }
>@@ -593,6 +605,11 @@ static void webkitMediaStreamSrcTrackEnded(WebKitMediaStreamSrc* self,
>     gst_pad_push_event(pad.get(), gst_event_new_eos());
> }
>
>+GstElement* webkitMediaStreamSrcNew(void)
>+{
>+    return GST_ELEMENT(g_object_new(webkit_media_stream_src_get_type(), nullptr));
>+}
>+
> } // WebCore
> #endif // GST_CHECK_VERSION(1, 10, 0)
> #endif // ENABLE(VIDEO) && ENABLE(MEDIA_STREAM) && USE(LIBWEBRTC)
>diff --git a/Source/WebCore/platform/mediastream/gstreamer/GStreamerMediaStreamSource.h b/Source/WebCore/platform/mediastream/gstreamer/GStreamerMediaStreamSource.h
>index 1599bae7862714a7bb9d342ce4a3208026434632..c33bd0285b6111941540eea1f47e5f9c89e14042 100644
>--- a/Source/WebCore/platform/mediastream/gstreamer/GStreamerMediaStreamSource.h
>+++ b/Source/WebCore/platform/mediastream/gstreamer/GStreamerMediaStreamSource.h
>@@ -41,6 +41,8 @@ typedef struct _WebKitMediaStreamSrc WebKitMediaStreamSrc;
> #define WEBKIT_TYPE_MEDIA_STREAM_SRC (webkit_media_stream_src_get_type())
> GType webkit_media_stream_src_get_type(void) G_GNUC_CONST;
> bool webkitMediaStreamSrcSetStream(WebKitMediaStreamSrc*, MediaStreamPrivate*);
>+bool webkitMediaStreamSrcAddTrack(WebKitMediaStreamSrc*, MediaStreamTrackPrivate*, bool onlyTrack);
>+GstElement * webkitMediaStreamSrcNew(void);
> } // WebCore
>
> #endif // ENABLE(VIDEO) && ENABLE(MEDIA_STREAM) && USE(LIBWEBRTC)
>diff --git a/Source/WebCore/platform/mediastream/gstreamer/MockGStreamerAudioCaptureSource.cpp b/Source/WebCore/platform/mediastream/gstreamer/MockGStreamerAudioCaptureSource.cpp
>index fd52ebf496e7b1e697696b85daac61576a061adf..16bb68cb64ac5c878100c6df296a6054113aacaa 100644
>--- a/Source/WebCore/platform/mediastream/gstreamer/MockGStreamerAudioCaptureSource.cpp
>+++ b/Source/WebCore/platform/mediastream/gstreamer/MockGStreamerAudioCaptureSource.cpp
>@@ -24,16 +24,110 @@
> #if ENABLE(MEDIA_STREAM) && USE(LIBWEBRTC) && USE(GSTREAMER)
> #include "MockGStreamerAudioCaptureSource.h"
>
>+#include "GStreamerAudioStreamDescription.h"
> #include "MockRealtimeAudioSource.h"
>
>+#include <gst/app/gstappsrc.h>
>+
> namespace WebCore {
>
>+static const double s_Tau = 2 * M_PI;
>+static const double s_BipBopDuration = 0.07;
>+static const double s_BipBopVolume = 0.5;
>+static const double s_BipFrequency = 1500;
>+static const double s_BopFrequency = 500;
>+static const double s_HumFrequency = 150;
>+static const double s_HumVolume = 0.1;
>+
> class WrappedMockRealtimeAudioSource : public MockRealtimeAudioSource {
> public:
>     WrappedMockRealtimeAudioSource(String&& deviceID, String&& name, String&& hashSalt)
>         : MockRealtimeAudioSource(WTFMove(deviceID), WTFMove(name), WTFMove(hashSalt))
>+        , m_src(nullptr)
>+    {
>+    }
>+
>+    void start(GRefPtr<GstElement> src)
>+    {
>+        m_src = src;
>+        if (m_streamFormat)
>+            gst_app_src_set_caps(GST_APP_SRC(m_src.get()), m_streamFormat->caps());
>+        MockRealtimeAudioSource::start();
>+    }
>+
>+    void addHum(float amplitude, float frequency, float sampleRate, uint64_t start, float *p, uint64_t count)
>+    {
>+        float humPeriod = sampleRate / frequency;
>+        for (uint64_t i = start, end = start + count; i < end; ++i) {
>+            float a = amplitude * sin(i * s_Tau / humPeriod);
>+            a += *p;
>+            *p++ = a;
>+        }
>+    }
>+
>+    void render(Seconds delta)
>     {
>+        ASSERT(m_src);
>+
>+        uint32_t totalFrameCount = GST_ROUND_UP_16(static_cast<size_t>(delta.seconds() * sampleRate()));
>+        uint32_t frameCount = std::min(totalFrameCount, m_maximiumFrameCount);
>+        while (frameCount) {
>+            uint32_t bipBopStart = m_samplesRendered % m_bipBopBuffer.size();
>+            uint32_t bipBopRemain = m_bipBopBuffer.size() - bipBopStart;
>+            uint32_t bipBopCount = std::min(frameCount, bipBopRemain);
>+
>+            GstBuffer* buffer = gst_buffer_new_allocate(nullptr, bipBopCount * m_streamFormat->bytesPerFrame(), nullptr);
>+            {
>+                GstMappedBuffer map(buffer, GST_MAP_WRITE);
>+
>+                if (!muted()) {
>+                    memcpy(map.data(), &m_bipBopBuffer[bipBopStart], sizeof(float) * bipBopCount);
>+                    addHum(s_HumVolume, s_HumFrequency, sampleRate(), m_samplesRendered, (float*)map.data(), bipBopCount);
>+                } else
>+                    memset(map.data(), 0, sizeof(float) * bipBopCount);
>+            }
>+
>+            gst_app_src_push_buffer(GST_APP_SRC(m_src.get()), buffer);
>+            m_samplesRendered += bipBopCount;
>+            totalFrameCount -= bipBopCount;
>+            frameCount = std::min(totalFrameCount, m_maximiumFrameCount);
>+        }
>     }
>+
>+    void settingsDidChange(OptionSet<RealtimeMediaSourceSettings::Flag> settings)
>+    {
>+        if (settings.contains(RealtimeMediaSourceSettings::Flag::SampleRate)) {
>+            GstAudioInfo info;
>+            auto rate = sampleRate();
>+            size_t sampleCount = 2 * rate;
>+
>+            m_maximiumFrameCount = WTF::roundUpToPowerOfTwo(renderInterval().seconds() * sampleRate());
>+            gst_audio_info_set_format(&info, GST_AUDIO_FORMAT_F32LE, rate, 1, nullptr);
>+            m_streamFormat = GStreamerAudioStreamDescription(info);
>+
>+            if (m_src)
>+                gst_app_src_set_caps(GST_APP_SRC(m_src.get()), m_streamFormat->caps());
>+
>+            m_bipBopBuffer.grow(sampleCount);
>+            m_bipBopBuffer.fill(0);
>+
>+            size_t bipBopSampleCount = ceil(s_BipBopDuration * rate);
>+            size_t bipStart = 0;
>+            size_t bopStart = rate;
>+
>+            addHum(s_BipBopVolume, s_BipFrequency, rate, 0, static_cast<float*>(m_bipBopBuffer.data() + bipStart), bipBopSampleCount);
>+            addHum(s_BipBopVolume, s_BopFrequency, rate, 0, static_cast<float*>(m_bipBopBuffer.data() + bopStart), bipBopSampleCount);
>+        }
>+
>+        MockRealtimeAudioSource::settingsDidChange(settings);
>+    }
>+
>+    GRefPtr<GstElement> m_src;
>+    std::optional<GStreamerAudioStreamDescription> m_streamFormat;
>+    Vector<float> m_bipBopBuffer;
>+    uint32_t m_maximiumFrameCount;
>+    uint64_t m_samplesEmitted { 0 };
>+    uint64_t m_samplesRendered { 0 };
> };
>
> CaptureSourceOrError MockRealtimeAudioSource::create(String&& deviceID,
>@@ -80,7 +174,7 @@ void MockGStreamerAudioCaptureSource::stopProducingData()
> void MockGStreamerAudioCaptureSource::startProducingData()
> {
>     GStreamerAudioCaptureSource::startProducingData();
>-    m_wrappedSource->start();
>+    static_cast<WrappedMockRealtimeAudioSource*>(m_wrappedSource.get())->start(capturer()->source());
> }
>
> const RealtimeMediaSourceSettings& MockGStreamerAudioCaptureSource::settings()
>diff --git a/Source/WebCore/platform/mediastream/gstreamer/RealtimeOutgoingAudioSourceLibWebRTC.cpp b/Source/WebCore/platform/mediastream/gstreamer/RealtimeOutgoingAudioSourceLibWebRTC.cpp
>index 534af7701037211273f983e00f4d4b021adc161d..5e7f977a96b2179db618fe800b2d5635db9d6a1f 100644
>--- a/Source/WebCore/platform/mediastream/gstreamer/RealtimeOutgoingAudioSourceLibWebRTC.cpp
>+++ b/Source/WebCore/platform/mediastream/gstreamer/RealtimeOutgoingAudioSourceLibWebRTC.cpp
>@@ -108,33 +108,26 @@ void RealtimeOutgoingAudioSourceLibWebRTC::pullAudioData()
>     size_t inChunkSampleCount = gst_audio_converter_get_in_frames(m_sampleConverter.get(), outChunkSampleCount);
>     size_t inBufferSize = inChunkSampleCount * m_inputStreamDescription->getInfo()->bpf;
>
>-    auto available = gst_adapter_available(m_adapter.get());
>-    if (inBufferSize > available) {
>-        GST_DEBUG("Not enough data: wanted: %ld > %ld available",
>-            inBufferSize, available);
>-
>-        return;
>-    }
>-
>-    auto inBuffer = adoptGRef(gst_adapter_take_buffer(m_adapter.get(), inBufferSize));
>-    auto outBuffer = adoptGRef(gst_buffer_new_allocate(nullptr, outBufferSize, 0));
>-    GstMappedBuffer outMap(outBuffer.get(), GST_MAP_WRITE);
>-    if (isSilenced())
>-        gst_audio_format_fill_silence(m_outputStreamDescription->getInfo()->finfo, outMap.data(), outMap.size());
>-    else {
>-        GstMappedBuffer inMap(inBuffer.get(), GST_MAP_READ);
>-
>-        gpointer in[1] = { inMap.data() };
>-        gpointer out[1] = { outMap.data() };
>-        if (!gst_audio_converter_samples(m_sampleConverter.get(), static_cast<GstAudioConverterFlags>(0), in, inChunkSampleCount, out, outChunkSampleCount)) {
>-            GST_ERROR("Could not convert samples.");
>-
>-            return;
>+    while (gst_adapter_available(m_adapter.get()) > inBufferSize) {
>+        auto inBuffer = adoptGRef(gst_adapter_take_buffer(m_adapter.get(), inBufferSize));
>+        m_audioBuffer.grow(outBufferSize);
>+        if (isSilenced())
>+            gst_audio_format_fill_silence(m_outputStreamDescription->getInfo()->finfo, m_audioBuffer.data(), outBufferSize);
>+        else {
>+            GstMappedBuffer inMap(inBuffer.get(), GST_MAP_READ);
>+
>+            gpointer in[1] = { inMap.data() };
>+            gpointer out[1] = { m_audioBuffer.data() };
>+            if (!gst_audio_converter_samples(m_sampleConverter.get(), static_cast<GstAudioConverterFlags>(0), in, inChunkSampleCount, out, outChunkSampleCount)) {
>+                GST_ERROR("Could not convert samples.");
>+
>+                return;
>+            }
>         }
>-    }
>
>-    sendAudioFrames(outMap.data(), LibWebRTCAudioFormat::sampleSize, static_cast<int>(m_outputStreamDescription->sampleRate()),
>-        static_cast<int>(m_outputStreamDescription->numberOfChannels()), outChunkSampleCount);
>+        sendAudioFrames(m_audioBuffer.data(), LibWebRTCAudioFormat::sampleSize, static_cast<int>(m_outputStreamDescription->sampleRate()),
>+            static_cast<int>(m_outputStreamDescription->numberOfChannels()), outChunkSampleCount);
>+    }
> }
>
> bool RealtimeOutgoingAudioSourceLibWebRTC::isReachingBufferedAudioDataHighLimit()
>diff --git a/Source/WebCore/platform/mediastream/gstreamer/RealtimeOutgoingAudioSourceLibWebRTC.h b/Source/WebCore/platform/mediastream/gstreamer/RealtimeOutgoingAudioSourceLibWebRTC.h
>index 4609e9d9bad25f6e92844241f8f1555564087fc0..090405e1848facf8a87f08f565b18b747becb162 100644
>--- a/Source/WebCore/platform/mediastream/gstreamer/RealtimeOutgoingAudioSourceLibWebRTC.h
>+++ b/Source/WebCore/platform/mediastream/gstreamer/RealtimeOutgoingAudioSourceLibWebRTC.h
>@@ -54,6 +54,7 @@ private:
>
>     Lock m_adapterMutex;
>     GRefPtr<GstAdapter> m_adapter;
>+    Vector<uint8_t> m_audioBuffer;
> };
>
> } // namespace WebCore
>diff --git a/LayoutTests/ChangeLog b/LayoutTests/ChangeLog
>index 7257553e71bfa3a1436fa80c98be72c58f810a34..b5b8be49ab7d634d7af3f07722e93eb920611d9c 100644
>--- a/LayoutTests/ChangeLog
>+++ b/LayoutTests/ChangeLog
>@@ -1,3 +1,15 @@
>+2018-12-05  Thibault Saunier  <tsaunier@igalia.com>
>+
>+        [WPE][GTK] Implement WebAudioSourceProviderGStreamer to allow bridging MediaStream and the WebAudio APIs
>+        https://bugs.webkit.org/show_bug.cgi?id=186933
>+
>+        Reviewed by NOBODY (OOPS!).
>+
>+        Enabled all tests depending on that feature.
>+
>+        * platform/gtk/TestExpectations:
>+        * webrtc/clone-audio-track.html:
>+
> 2018-11-28  Said Abou-Hallawa  <sabouhallawa@apple.com>
>
>         Updating href on linearGradient and radialGradient doesn't update its rendering
>diff --git a/LayoutTests/platform/gtk/TestExpectations b/LayoutTests/platform/gtk/TestExpectations
>index 4f6e27d2820b6d89f97b0ca5ce95684bec024b1b..7feb17c2aef4cd943a3304580879e178dc625496 100644
>--- a/LayoutTests/platform/gtk/TestExpectations
>+++ b/LayoutTests/platform/gtk/TestExpectations
>@@ -586,16 +586,6 @@ webkit.org/b/187064 webrtc/libwebrtc/descriptionGetters.html
> webkit.org/b/177533 webrtc/video-interruption.html
>
> webkit.org/b/186933 webrtc/peer-connection-createMediaStreamDestination.html
>-webkit.org/b/186933 webrtc/peer-connection-remote-audio-mute2.html
>-webkit.org/b/186933 webrtc/peer-connection-audio-unmute.html
>-webkit.org/b/186933 webrtc/peer-connection-audio-mute2.html
>-webkit.org/b/186933 webrtc/peer-connection-audio-mute.html
>-webkit.org/b/186933 webrtc/peer-connection-remote-audio-mute.html
>-webkit.org/b/186933 webrtc/clone-audio-track.html
>-webkit.org/b/186933 webrtc/audio-replace-track.html
>-webkit.org/b/186933 webrtc/audio-peer-connection-webaudio.html
>-webkit.org/b/186933 webrtc/audio-muted-stats.html
>-webkit.org/b/186933 webrtc/getUserMedia-webaudio-autoplay.html
>
> imported/w3c/web-platform-tests/webrtc/ [ Skip ]
> http/tests/webrtc [ Skip ]
>diff --git a/LayoutTests/webrtc/clone-audio-track.html b/LayoutTests/webrtc/clone-audio-track.html
>index 649b446e8584253b513ab7293a090c15f0dbf543..0f9374849083f4ade02766574a153b582d91bbdb 100644
>--- a/LayoutTests/webrtc/clone-audio-track.html
>+++ b/LayoutTests/webrtc/clone-audio-track.html
>@@ -33,7 +33,7 @@
>             });
>         }).then(() => {
>             return analyseAudio(remoteStream, 200, context).then((results) => {
>-                assert_false(results.heardHum, "Did not heard hum from remote enabled track");
>+                assert_false(results.heardHum, "Did not hear hum from remote disabled track");
>             });
>         }).then(() => {
>             return analyseAudio(new MediaStream([clonedTrack]), 200, context).then((results) => {
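For readers coming to this patch from the WebAudio side, the web-facing path it enables is the standard MediaStream-to-WebAudio bridge: a page obtains an audio MediaStreamTrack (from getUserMedia() or a remote RTCPeerConnection) and wires it into an AudioContext with createMediaStreamSource(); on the GTK and WPE ports that call now reaches MediaStreamTrackPrivate::audioSourceProvider() and the new AudioSourceProviderGStreamer. The snippet below is only an illustrative page-level sketch of that path, in the spirit of the re-enabled tests such as webrtc/audio-peer-connection-webaudio.html; it is not part of the patch.

// Hypothetical page-level example (not part of the patch): route a captured
// audio track into WebAudio, exercising the new GStreamer provider path.
const context = new AudioContext();

navigator.mediaDevices.getUserMedia({ audio: true }).then((stream) => {
    // createMediaStreamSource() pulls the track's audio through the
    // platform AudioSourceProvider (AudioSourceProviderGStreamer here).
    const source = context.createMediaStreamSource(stream);
    const analyser = context.createAnalyser();
    source.connect(analyser);

    // Periodically read frequency data to check that audio is flowing.
    const bins = new Uint8Array(analyser.frequencyBinCount);
    setInterval(() => analyser.getByteFrequencyData(bins), 200);
});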