WebKit Bugzilla
Attachment 357973 Details for Bug 187896: 'ended' Event doesn't fire on MediaStreamTrack when a USB camera is unplugged
Description: Patch
Filename: bug-187896-20181221132240.patch
MIME Type: text/plain
Creator: Eric Carlson
Created: 2018-12-21 13:22:41 PST
Size: 18.67 KB
Flags: patch, obsolete
Subversion Revision: 239461
diff --git a/Source/WebCore/ChangeLog b/Source/WebCore/ChangeLog
index 6c10bac620ed624c6f2f1c07551afd5f72f42b6b..d070168e459ea30edfb0bead1d00f829cc2ed2bc 100644
--- a/Source/WebCore/ChangeLog
+++ b/Source/WebCore/ChangeLog
@@ -1,3 +1,38 @@
+2018-12-21  Eric Carlson  <eric.carlson@apple.com>
+
+        'ended' Event doesn't fire on MediaStreamTrack when a USB camera is unplugged
+        https://bugs.webkit.org/show_bug.cgi?id=187896
+        <rdar://problem/42681445>
+
+        Reviewed by NOBODY (OOPS!).
+
+        No new tests, tested manually.
+
+        * platform/mediastream/mac/AVVideoCaptureSource.h:
+        * platform/mediastream/mac/AVVideoCaptureSource.mm:
+        (WebCore::AVVideoCaptureSource::deviceDisconnected):
+        (-[WebCoreAVVideoCaptureSourceObserver addNotificationObservers]):
+        (-[WebCoreAVVideoCaptureSourceObserver removeNotificationObservers]):
+        (-[WebCoreAVVideoCaptureSourceObserver deviceConnectedDidChange:]):
+        * platform/mediastream/mac/CoreAudioCaptureDeviceManager.cpp:
+        (WebCore::deviceHasInputStreams):
+        (WebCore::isValidCaptureDevice):
+        (WebCore::CoreAudioCaptureDeviceManager::coreAudioCaptureDevices):
+        (WebCore::CoreAudioCaptureDeviceManager::refreshAudioCaptureDevices):
+        (WebCore::CoreAudioCaptureDeviceManager::devicesChanged): Deleted.
+        * platform/mediastream/mac/CoreAudioCaptureDeviceManager.h:
+        * platform/mediastream/mac/CoreAudioCaptureSource.cpp:
+        (WebCore::CoreAudioSharedUnit::setCaptureDevice):
+        (WebCore::CoreAudioSharedUnit::devicesChanged):
+        (WebCore::CoreAudioSharedUnit::startProducingData):
+        (WebCore::CoreAudioSharedUnit::startInternal):
+        (WebCore::CoreAudioSharedUnit::verifyIsCapturing):
+        (WebCore::CoreAudioSharedUnit::captureFailed):
+        (WebCore::CoreAudioCaptureSourceFactory::devicesChanged):
+        (WebCore::CoreAudioCaptureSource::CoreAudioCaptureSource):
+        (WebCore::CoreAudioSharedUnit::setCaptureDeviceID): Deleted.
+        * platform/mediastream/mac/CoreAudioCaptureSource.h:
+
 2018-12-20  Chris Dumez  <cdumez@apple.com>
 
         Use Optional::valueOr() instead of Optional::value_or()
diff --git a/Source/WebCore/PAL/ChangeLog b/Source/WebCore/PAL/ChangeLog
index 2ba4673b479fbe0bac49b2bbe658224018c5e18c..13244cd599fd35b7fb3e870ddf73fa4bcce9f364 100644
--- a/Source/WebCore/PAL/ChangeLog
+++ b/Source/WebCore/PAL/ChangeLog
@@ -1,3 +1,13 @@
+2018-12-21  Eric Carlson  <eric.carlson@apple.com>
+
+        'ended' Event doesn't fire on MediaStreamTrack when a USB camera is unplugged
+        https://bugs.webkit.org/show_bug.cgi?id=187896
+        <rdar://problem/42681445>
+
+        Reviewed by NOBODY (OOPS!).
+
+        * pal/spi/cf/CoreAudioSPI.h:
+
 2018-12-19  Chris Dumez  <cdumez@apple.com>
 
         wtf/Optional.h: move-constructor and move-assignment operator should disengage the value being moved from
diff --git a/Source/WebCore/PAL/pal/spi/cf/CoreAudioSPI.h b/Source/WebCore/PAL/pal/spi/cf/CoreAudioSPI.h
index 37799d2b1015f5c1cb7a97aa397a42764e50a517..88a858e910d44d81e2fa9b066036c5a3964a67bd 100644
--- a/Source/WebCore/PAL/pal/spi/cf/CoreAudioSPI.h
+++ b/Source/WebCore/PAL/pal/spi/cf/CoreAudioSPI.h
@@ -55,7 +55,8 @@ CF_ENUM(AudioObjectPropertyScope)
 
 CF_ENUM(AudioObjectPropertySelector)
 {
-    kAudioHardwarePropertyDefaultInputDevice = 'dIn '
+    kAudioHardwarePropertyDefaultInputDevice = 'dIn ',
+    kAudioDevicePropertyTapEnabled = 'tapd',
 };
 
 CF_ENUM(int)
diff --git a/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.h b/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.h
index 2a53054314684a2e475798fe8799652ddafe165e..603817263deeb03849c87186418649c034f1fb0d 100644
--- a/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.h
+++ b/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.h
@@ -60,6 +60,7 @@ public:
     enum class InterruptionReason { None, VideoNotAllowedInBackground, AudioInUse, VideoInUse, VideoNotAllowedInSideBySide };
     void captureSessionBeginInterruption(RetainPtr<NSNotification>);
     void captureSessionEndInterruption(RetainPtr<NSNotification>);
+    void deviceDisconnected(RetainPtr<NSNotification>);
 
     AVCaptureSession* session() const { return m_session.get(); }
 
diff --git a/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm b/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm
index a54cd14d6eaa76b8800e800ea7be2a8a108c554b..f055ba38acaf03c1d22f0433095ceb4729e5f407 100644
--- a/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm
+++ b/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm
@@ -78,6 +78,9 @@ SOFT_LINK_CLASS(AVFoundation, AVCaptureSession)
 
 SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeVideo, NSString *)
 
+SOFT_LINK_CONSTANT(AVFoundation, AVCaptureDeviceWasDisconnectedNotification, NSString *)
+#define AVCaptureDeviceWasDisconnectedNotification getAVCaptureDeviceWasDisconnectedNotification()
+
 #if PLATFORM(IOS_FAMILY)
 SOFT_LINK_POINTER_OPTIONAL(AVFoundation, AVCaptureSessionRuntimeErrorNotification, NSString *)
 SOFT_LINK_POINTER_OPTIONAL(AVFoundation, AVCaptureSessionWasInterruptedNotification, NSString *)
@@ -109,6 +112,7 @@ using namespace PAL;
 -(void)sessionRuntimeError:(NSNotification*)notification;
 -(void)beginSessionInterrupted:(NSNotification*)notification;
 -(void)endSessionInterrupted:(NSNotification*)notification;
+-(void)deviceConnectedDidChange:(NSNotification*)notification;
 #endif
 @end
 
@@ -626,6 +630,14 @@ void AVVideoCaptureSource::captureSessionEndInterruption(RetainPtr<NSNotificatio
 }
 #endif
 
+void AVVideoCaptureSource::deviceDisconnected(RetainPtr<NSNotification> notification)
+{
+    AVCaptureDeviceTypedef *device = [notification object];
+    if (this->device() == device)
+        captureFailed();
+}
+
+
 } // namespace WebCore
 
 @implementation WebCoreAVVideoCaptureSourceObserver
@@ -650,12 +662,14 @@ void AVVideoCaptureSource::captureSessionEndInterruption(RetainPtr<NSNotificatio
 
 - (void)addNotificationObservers
 {
-#if PLATFORM(IOS_FAMILY)
     ASSERT(m_callback);
 
     NSNotificationCenter* center = [NSNotificationCenter defaultCenter];
-    AVCaptureSessionType* session = m_callback->session();
 
+    [center addObserver:self selector:@selector(deviceConnectedDidChange:) name:AVCaptureDeviceWasDisconnectedNotification object:nil];
+
+#if PLATFORM(IOS_FAMILY)
+    AVCaptureSessionType* session = m_callback->session();
     [center addObserver:self selector:@selector(sessionRuntimeError:) name:AVCaptureSessionRuntimeErrorNotification object:session];
     [center addObserver:self selector:@selector(beginSessionInterrupted:) name:AVCaptureSessionWasInterruptedNotification object:session];
     [center addObserver:self selector:@selector(endSessionInterrupted:) name:AVCaptureSessionInterruptionEndedNotification object:session];
@@ -664,9 +678,7 @@ void AVVideoCaptureSource::captureSessionEndInterruption(RetainPtr<NSNotificatio
 
 - (void)removeNotificationObservers
 {
-#if PLATFORM(IOS_FAMILY)
     [[NSNotificationCenter defaultCenter] removeObserver:self];
-#endif
 }
 
 - (void)captureOutput:(AVCaptureOutputType*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnectionType*)connection
@@ -704,6 +716,16 @@ void AVVideoCaptureSource::captureSessionEndInterruption(RetainPtr<NSNotificatio
     m_callback->captureDeviceSuspendedDidChange();
 }
 
+- (void)deviceConnectedDidChange:(NSNotification*)notification
+{
+    LOG(Media, "WebCoreAVVideoCaptureSourceObserver::deviceConnectedDidChange(%p)", self);
+
+    if (!m_callback)
+        return;
+
+    m_callback->deviceDisconnected(notification);
+}
+
 #if PLATFORM(IOS_FAMILY)
 - (void)sessionRuntimeError:(NSNotification*)notification
 {
diff --git a/Source/WebCore/platform/mediastream/mac/CoreAudioCaptureDeviceManager.cpp b/Source/WebCore/platform/mediastream/mac/CoreAudioCaptureDeviceManager.cpp
index 10956ba1fc9694f03f3558c07736d353e6206d48..799709d517cd113fa3d5201369c44e70373a3a1e 100644
--- a/Source/WebCore/platform/mediastream/mac/CoreAudioCaptureDeviceManager.cpp
+++ b/Source/WebCore/platform/mediastream/mac/CoreAudioCaptureDeviceManager.cpp
@@ -29,10 +29,12 @@
 #if ENABLE(MEDIA_STREAM) && PLATFORM(MAC)
 
 #include "CoreAudioCaptureDevice.h"
+#include "CoreAudioCaptureSource.h"
 #include "Logging.h"
 #include "RealtimeMediaSourceCenter.h"
 #include <AudioUnit/AudioUnit.h>
 #include <CoreMedia/CMSync.h>
+#include <pal/spi/cf/CoreAudioSPI.h>
 #include <wtf/Assertions.h>
 #include <wtf/NeverDestroyed.h>
 
@@ -67,7 +69,6 @@ static bool deviceHasInputStreams(AudioObjectID deviceID)
     UInt32 dataSize = 0;
     AudioObjectPropertyAddress address = { kAudioDevicePropertyStreamConfiguration, kAudioDevicePropertyScopeInput, kAudioObjectPropertyElementMaster };
     auto err = AudioObjectGetPropertyDataSize(deviceID, &address, 0, nullptr, &dataSize);
-
     if (err || !dataSize)
         return false;
 
@@ -80,6 +81,23 @@ static bool deviceHasInputStreams(AudioObjectID deviceID)
 
 static bool isValidCaptureDevice(const CoreAudioCaptureDevice& device)
 {
+    // Ignore output devices that have input only for echo cancellation.
+    AudioObjectPropertyAddress address = { kAudioDevicePropertyTapEnabled, kAudioDevicePropertyScopeOutput, kAudioObjectPropertyElementMaster };
+    if (AudioObjectHasProperty(device.deviceID(), &address))
+        return false;
+
+    // Filter out non-aggregable devices.
+    UInt32 dataSize = 0;
+    address = { kAudioObjectPropertyCreator, kAudioObjectPropertyScopeGlobal, kAudioObjectPropertyElementMaster };
+    CFStringRef name = nullptr;
+    dataSize = sizeof(name);
+    AudioObjectGetPropertyData(device.deviceID(), &address, 0, nullptr, &dataSize, &name);
+    bool isNonAggregable = !name || !String { name }.startsWith("com.apple.audio.CoreAudio");
+    if (name)
+        CFRelease(name);
+    if (isNonAggregable)
+        return false;
+
     // Ignore unnamed devices and aggregate devices created by VPIO.
     return !device.label().isEmpty() && !device.label().startsWith("VPAUAggregateAudioDevice");
 }
@@ -91,8 +109,24 @@ Vector<CoreAudioCaptureDevice>& CoreAudioCaptureDeviceManager::coreAudioCaptureD
         initialized = true;
         refreshAudioCaptureDevices(DoNotNotify);
 
+        auto weakThis = makeWeakPtr(*this);
+        m_listenerBlock = Block_copy(^(UInt32 count, const AudioObjectPropertyAddress properties[]) {
+            if (!weakThis)
+                return;
+
+            for (UInt32 i = 0; i < count; ++i) {
+                const AudioObjectPropertyAddress& property = properties[i];
+
+                if (property.mSelector != kAudioHardwarePropertyDevices)
+                    continue;
+
+                weakThis->refreshAudioCaptureDevices(Notify);
+                return;
+            }
+        });
+
         AudioObjectPropertyAddress address = { kAudioHardwarePropertyDevices, kAudioObjectPropertyScopeGlobal, kAudioObjectPropertyElementMaster };
-        auto err = AudioObjectAddPropertyListener(kAudioObjectSystemObject, &address, devicesChanged, this);
+        auto err = AudioObjectAddPropertyListenerBlock(kAudioObjectSystemObject, &address, dispatch_get_main_queue(), m_listenerBlock);
         if (err)
             LOG_ERROR("CoreAudioCaptureDeviceManager::devices(%p) AudioObjectAddPropertyListener returned error %d (%.4s)", this, (int)err, (char*)&err);
     }
@@ -164,14 +198,10 @@ void CoreAudioCaptureDeviceManager::refreshAudioCaptureDevices(NotifyIfDevicesHa
             m_devices.append(captureDevice);
     }
 
-    if (notify == Notify)
+    if (notify == Notify) {
         deviceChanged();
-}
-
-OSStatus CoreAudioCaptureDeviceManager::devicesChanged(AudioObjectID, UInt32, const AudioObjectPropertyAddress*, void* userData)
-{
-    static_cast<CoreAudioCaptureDeviceManager*>(userData)->refreshAudioCaptureDevices(Notify);
-    return 0;
+        CoreAudioCaptureSourceFactory::singleton().devicesChanged(m_devices);
+    }
 }
 
 } // namespace WebCore
diff --git a/Source/WebCore/platform/mediastream/mac/CoreAudioCaptureDeviceManager.h b/Source/WebCore/platform/mediastream/mac/CoreAudioCaptureDeviceManager.h
index 5c972f1864fa6a1078762ae4c4ff0ffe897f659e..19462e48dde4a9017e8960d5d136abd4b77d8f77 100644
--- a/Source/WebCore/platform/mediastream/mac/CoreAudioCaptureDeviceManager.h
+++ b/Source/WebCore/platform/mediastream/mac/CoreAudioCaptureDeviceManager.h
@@ -52,7 +52,6 @@ private:
     CoreAudioCaptureDeviceManager() = default;
     ~CoreAudioCaptureDeviceManager() = default;
 
-    static OSStatus devicesChanged(AudioObjectID, UInt32, const AudioObjectPropertyAddress*, void*);
     Vector<CoreAudioCaptureDevice>& coreAudioCaptureDevices();
 
     enum NotifyIfDevicesHaveChanged { Notify, DoNotNotify };
@@ -60,6 +59,8 @@ private:
 
     Vector<CaptureDevice> m_devices;
     Vector<CoreAudioCaptureDevice> m_coreAudioCaptureDevices;
+
+    AudioObjectPropertyListenerBlock m_listenerBlock;
 };
 
 } // namespace WebCore
diff --git a/Source/WebCore/platform/mediastream/mac/CoreAudioCaptureSource.cpp b/Source/WebCore/platform/mediastream/mac/CoreAudioCaptureSource.cpp
index 57d6b014fd5058e010c96852e49f6c45987d8a0b..5b76548ea515a5eda65e328900277c7e8a2d25fc 100644
--- a/Source/WebCore/platform/mediastream/mac/CoreAudioCaptureSource.cpp
+++ b/Source/WebCore/platform/mediastream/mac/CoreAudioCaptureSource.cpp
@@ -102,7 +102,9 @@ public:
 
     bool hasAudioUnit() const { return m_ioUnit; }
 
-    void setCaptureDeviceID(uint32_t);
+    void setCaptureDevice(const String&, uint32_t);
+
+    void devicesChanged(const Vector<CaptureDevice>&);
 
 private:
     OSStatus configureSpeakerProc();
@@ -116,10 +118,12 @@ private:
     static OSStatus speakerCallback(void*, AudioUnitRenderActionFlags*, const AudioTimeStamp*, UInt32, UInt32, AudioBufferList*);
     OSStatus provideSpeakerData(AudioUnitRenderActionFlags&, const AudioTimeStamp&, UInt32, UInt32, AudioBufferList*);
 
-    void startInternal();
+    OSStatus startInternal();
     void stopInternal();
 
     void verifyIsCapturing();
+    void devicesChanged();
+    void captureFailed();
 
     Vector<std::reference_wrapper<CoreAudioCaptureSource>> m_clients;
 
@@ -162,6 +166,8 @@ private:
     uint64_t m_speakerProcsCalled { 0 };
 #endif
 
+    String m_persistentID;
+
     uint64_t m_microphoneProcsCalled { 0 };
     uint64_t m_microphoneProcsCalledLastTime { 0 };
     Timer m_verifyCapturingTimer;
@@ -197,8 +203,10 @@ void CoreAudioSharedUnit::removeClient(CoreAudioCaptureSource& client)
     });
 }
 
-void CoreAudioSharedUnit::setCaptureDeviceID(uint32_t captureDeviceID)
+void CoreAudioSharedUnit::setCaptureDevice(const String& persistentID, uint32_t captureDeviceID)
 {
+    m_persistentID = persistentID;
+
 #if PLATFORM(MAC)
     if (m_captureDeviceID == captureDeviceID)
         return;
@@ -210,6 +218,23 @@ void CoreAudioSharedUnit::setCaptureDeviceID(uint32_t captureDeviceID)
 #endif
 }
 
+void CoreAudioSharedUnit::devicesChanged(const Vector<CaptureDevice>& devices)
+{
+    if (!m_ioUnit)
+        return;
+
+    bool deviceIsAlive = false;
+    for (auto& device : devices) {
+        if (m_persistentID == device.persistentId()) {
+            deviceIsAlive = true;
+            break;
+        }
+    }
+
+    if (!deviceIsAlive)
+        captureFailed();
+}
+
 void CoreAudioSharedUnit::addEchoCancellationSource(AudioSampleDataSource& source)
 {
     if (!source.setOutputFormat(m_speakerProcFormat)) {
@@ -559,7 +584,8 @@ void CoreAudioSharedUnit::startProducingData()
         ASSERT(!m_ioUnit);
     }
 
-    startInternal();
+    if (startInternal())
+        captureFailed();
 }
 
 OSStatus CoreAudioSharedUnit::resume()
@@ -578,7 +604,7 @@ OSStatus CoreAudioSharedUnit::resume()
     return 0;
 }
 
-void CoreAudioSharedUnit::startInternal()
+OSStatus CoreAudioSharedUnit::startInternal()
 {
     OSStatus err;
     if (!m_ioUnit) {
@@ -586,7 +612,7 @@ void CoreAudioSharedUnit::startInternal()
         if (err) {
             cleanupAudioUnit();
             ASSERT(!m_ioUnit);
-            return;
+            return err;
         }
         ASSERT(m_ioUnit);
     }
@@ -598,7 +624,7 @@ void CoreAudioSharedUnit::startInternal()
     err = AudioOutputUnitStart(m_ioUnit);
     if (err) {
         RELEASE_LOG_ERROR(Media, "CoreAudioSharedUnit::start(%p) AudioOutputUnitStart failed with error %d (%.4s)", this, (int)err, (char*)&err);
-        return;
+        return err;
     }
 
     m_ioUnitStarted = true;
@@ -606,6 +632,8 @@ void CoreAudioSharedUnit::startInternal()
     m_verifyCapturingTimer.startRepeating(10_s);
     m_microphoneProcsCalled = 0;
     m_microphoneProcsCalledLastTime = 0;
+
+    return 0;
 }
 
 void CoreAudioSharedUnit::verifyIsCapturing()
@@ -617,8 +645,14 @@ void CoreAudioSharedUnit::verifyIsCapturing()
         return;
     }
 
+    captureFailed();
+}
+
+
+void CoreAudioSharedUnit::captureFailed()
+{
 #if !RELEASE_LOG_DISABLED
-    RELEASE_LOG_ERROR(Media, "CoreAudioSharedUnit::verifyIsCapturing - capture failed\n");
+    RELEASE_LOG_ERROR(Media, "CoreAudioSharedUnit::captureFailed - capture failed\n");
 #endif
     for (CoreAudioCaptureSource& client : m_clients)
         client.captureFailed();
@@ -783,12 +817,17 @@ CaptureDeviceManager& CoreAudioCaptureSourceFactory::audioCaptureDeviceManager()
 #endif
 }
 
-CoreAudioCaptureSource::CoreAudioCaptureSource(String&& deviceID, String&& label, String&& hashSalt, uint32_t persistentID)
+void CoreAudioCaptureSourceFactory::devicesChanged(const Vector<CaptureDevice>& devices)
+{
+    CoreAudioSharedUnit::singleton().devicesChanged(devices);
+}
+
+CoreAudioCaptureSource::CoreAudioCaptureSource(String&& deviceID, String&& label, String&& hashSalt, uint32_t captureDeviceID)
     : RealtimeMediaSource(RealtimeMediaSource::Type::Audio, WTFMove(label), WTFMove(deviceID), WTFMove(hashSalt))
-    , m_captureDeviceID(persistentID)
+    , m_captureDeviceID(captureDeviceID)
 {
     auto& unit = CoreAudioSharedUnit::singleton();
-    unit.setCaptureDeviceID(m_captureDeviceID);
+    unit.setCaptureDevice(persistentID(), m_captureDeviceID);
 
     initializeEchoCancellation(unit.enableEchoCancellation());
     initializeSampleRate(unit.sampleRate());
diff --git a/Source/WebCore/platform/mediastream/mac/CoreAudioCaptureSource.h b/Source/WebCore/platform/mediastream/mac/CoreAudioCaptureSource.h
index d9fa0ae43406aa3960dd659feb7b1258c805973a..f5203731ccd01e46b275876505e6543ec3eb32e8 100644
--- a/Source/WebCore/platform/mediastream/mac/CoreAudioCaptureSource.h
+++ b/Source/WebCore/platform/mediastream/mac/CoreAudioCaptureSource.h
@@ -117,6 +117,8 @@ public:
     void endInterruption();
     void scheduleReconfiguration();
 
+    void devicesChanged(const Vector<CaptureDevice>&);
+
 #if PLATFORM(IOS_FAMILY)
     void setCoreAudioActiveSource(CoreAudioCaptureSource& source) { setActiveSource(source); }
     void unsetCoreAudioActiveSource(CoreAudioCaptureSource& source) { unsetActiveSource(source); }