WebKit Bugzilla
Attachment 348551 Details for Bug 189159: [MediaStream] Remove AVMediaCaptureSource
Patch for landing
bug-189159-20180830143011.patch (text/plain), 39.72 KB, created by Eric Carlson on 2018-08-30 14:30:12 PDT
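The patch below removes the AVMediaCaptureSource base class and folds its session management into AVVideoCaptureSource. In doing so it replaces the eagerly-maintained settings member with a lazily computed cache: `settings() const` fills a `mutable std::optional` on first use, and `settingsDidChange()` clears it so the next call recomputes. A minimal C++ sketch of that caching idiom, using illustrative names rather than the real WebCore types:

```cpp
#include <optional>

// Illustrative stand-in for RealtimeMediaSourceSettings; not a WebCore type.
struct Settings {
    int width = 0;
    int height = 0;
};

class VideoSource {
public:
    // Lazily compute and cache the settings, mirroring the patch's
    // AVVideoCaptureSource::settings() const.
    const Settings& settings() const
    {
        if (m_currentSettings)
            return *m_currentSettings;

        Settings settings;
        settings.width = m_width;
        settings.height = m_height;
        m_currentSettings = settings;
        ++m_computeCount; // Counts cache misses, for illustration only.
        return *m_currentSettings;
    }

    // Invalidate the cache, mirroring the patch's settingsDidChange().
    void settingsDidChange() { m_currentSettings = std::nullopt; }

    void setSize(int width, int height)
    {
        m_width = width;
        m_height = height;
        settingsDidChange();
    }

    int computeCount() const { return m_computeCount; }

private:
    int m_width = 640;
    int m_height = 480;
    // mutable so the const accessor can populate the cache,
    // as with the patch's m_currentSettings / m_capabilities members.
    mutable std::optional<Settings> m_currentSettings;
    mutable int m_computeCount = 0;
};
```

The same shape is used twice in the patch, for `m_currentSettings` and `m_capabilities`; caching in the const accessor keeps repeated queries cheap while letting any configuration change invalidate both with a single call.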
>Subversion Revision: 235515 >diff --git a/Source/WebCore/ChangeLog b/Source/WebCore/ChangeLog >index 1c4dde3f2b5225f5a73fb3d71221c596ff863222..10b65694ba4f9f5ed0661d5bec749d9899638f65 100644 >--- a/Source/WebCore/ChangeLog >+++ b/Source/WebCore/ChangeLog >@@ -1,5 +1,54 @@ > 2018-08-30 Eric Carlson <eric.carlson@apple.com> > >+ [MediaStream] Remove AVMediaCaptureSource >+ https://bugs.webkit.org/show_bug.cgi?id=189159 >+ >+ Reviewed by Youenn Fablet. >+ >+ No new tests, no change in functionality. >+ >+ Refactor video capture to get rid of a base class we don't >+ need any more. >+ >+ * WebCore.xcodeproj/project.pbxproj: >+ * platform/mediastream/mac/AVMediaCaptureSource.h: Removed. >+ * platform/mediastream/mac/AVMediaCaptureSource.mm: Removed. >+ * platform/mediastream/mac/AVVideoCaptureSource.h: >+ (WebCore::AVVideoCaptureSource::session const): >+ (WebCore::AVVideoCaptureSource::device const): >+ * platform/mediastream/mac/AVVideoCaptureSource.mm: >+ (WebCore::globaVideoCaptureSerialQueue): >+ (WebCore::AVVideoCaptureSource::AVVideoCaptureSource): >+ (WebCore::AVVideoCaptureSource::~AVVideoCaptureSource): >+ (WebCore::AVVideoCaptureSource::startProducingData): >+ (WebCore::AVVideoCaptureSource::stopProducingData): >+ (WebCore::AVVideoCaptureSource::beginConfiguration): >+ (WebCore::AVVideoCaptureSource::commitConfiguration): >+ (WebCore::AVVideoCaptureSource::settingsDidChange): >+ (WebCore::AVVideoCaptureSource::settings const): >+ (WebCore::AVVideoCaptureSource::capabilities const): >+ (WebCore::AVVideoCaptureSource::setPreset): >+ (WebCore::AVVideoCaptureSource::setupSession): >+ (WebCore::AVVideoCaptureSource::setupCaptureSession): >+ (WebCore::AVVideoCaptureSource::captureSessionIsRunningDidChange): >+ (WebCore::AVVideoCaptureSource::interrupted const): >+ (WebCore::AVVideoCaptureSource::captureSessionRuntimeError): >+ (WebCore::AVVideoCaptureSource::captureSessionBeginInterruption): >+ (WebCore::AVVideoCaptureSource::captureSessionEndInterruption): >+ 
(-[WebCoreAVVideoCaptureSourceObserver initWithCallback:]): >+ (-[WebCoreAVVideoCaptureSourceObserver disconnect]): >+ (-[WebCoreAVVideoCaptureSourceObserver addNotificationObservers]): >+ (-[WebCoreAVVideoCaptureSourceObserver removeNotificationObservers]): >+ (-[WebCoreAVVideoCaptureSourceObserver captureOutput:didOutputSampleBuffer:fromConnection:]): >+ (-[WebCoreAVVideoCaptureSourceObserver observeValueForKeyPath:ofObject:change:context:]): >+ (-[WebCoreAVVideoCaptureSourceObserver sessionRuntimeError:]): >+ (-[WebCoreAVVideoCaptureSourceObserver beginSessionInterrupted:]): >+ (-[WebCoreAVVideoCaptureSourceObserver endSessionInterrupted:]): >+ (WebCore::AVVideoCaptureSource::initializeCapabilities): Deleted. >+ (WebCore::AVVideoCaptureSource::initializeSupportedConstraints): Deleted. >+ (WebCore::AVVideoCaptureSource::updateSettings): Deleted. >+ >+2018-08-30 Eric Carlson <eric.carlson@apple.com> > Mock video devices should only support discrete sizes > https://bugs.webkit.org/show_bug.cgi?id=189000 > <rdar://problem/43766551> >diff --git a/Source/WebCore/WebCore.xcodeproj/project.pbxproj b/Source/WebCore/WebCore.xcodeproj/project.pbxproj >index f361874753b23abe7e4907b2dbbe1502bbf1b8f9..93990094bdbd673236f79363b54fb6c466d8d990 100644 >--- a/Source/WebCore/WebCore.xcodeproj/project.pbxproj >+++ b/Source/WebCore/WebCore.xcodeproj/project.pbxproj >@@ -72,8 +72,6 @@ > 070334D9145A006F008D8D45 /* TrackBase.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 070334D8145A006F008D8D45 /* TrackBase.cpp */; }; > 070363E2181A1CDC00C074A5 /* AVCaptureDeviceManager.h in Headers */ = {isa = PBXBuildFile; fileRef = 070363DA181A1CDC00C074A5 /* AVCaptureDeviceManager.h */; settings = {ATTRIBUTES = (Private, ); }; }; > 070363E3181A1CDC00C074A5 /* AVCaptureDeviceManager.mm in Sources */ = {isa = PBXBuildFile; fileRef = 070363DB181A1CDC00C074A5 /* AVCaptureDeviceManager.mm */; }; >- 070363E4181A1CDC00C074A5 /* AVMediaCaptureSource.h in Headers */ = {isa = PBXBuildFile; fileRef = 
070363DC181A1CDC00C074A5 /* AVMediaCaptureSource.h */; }; >- 070363E5181A1CDC00C074A5 /* AVMediaCaptureSource.mm in Sources */ = {isa = PBXBuildFile; fileRef = 070363DD181A1CDC00C074A5 /* AVMediaCaptureSource.mm */; }; > 070363E6181A1CDC00C074A5 /* AVVideoCaptureSource.h in Headers */ = {isa = PBXBuildFile; fileRef = 070363DE181A1CDC00C074A5 /* AVVideoCaptureSource.h */; }; > 070363E7181A1CDC00C074A5 /* AVVideoCaptureSource.mm in Sources */ = {isa = PBXBuildFile; fileRef = 070363DF181A1CDC00C074A5 /* AVVideoCaptureSource.mm */; }; > 0704A4081D6DE9F10086DCDB /* OverconstrainedError.h in Headers */ = {isa = PBXBuildFile; fileRef = 0704A4051D6DE9F10086DCDB /* OverconstrainedError.h */; }; >@@ -5173,8 +5171,6 @@ > 070334E8145A1F35008D8D45 /* JSTrackCustom.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = JSTrackCustom.cpp; sourceTree = "<group>"; }; > 070363DA181A1CDC00C074A5 /* AVCaptureDeviceManager.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = AVCaptureDeviceManager.h; sourceTree = "<group>"; }; > 070363DB181A1CDC00C074A5 /* AVCaptureDeviceManager.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = AVCaptureDeviceManager.mm; sourceTree = "<group>"; }; >- 070363DC181A1CDC00C074A5 /* AVMediaCaptureSource.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = AVMediaCaptureSource.h; sourceTree = "<group>"; }; >- 070363DD181A1CDC00C074A5 /* AVMediaCaptureSource.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = AVMediaCaptureSource.mm; sourceTree = "<group>"; }; > 070363DE181A1CDC00C074A5 /* AVVideoCaptureSource.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = AVVideoCaptureSource.h; sourceTree = "<group>"; }; > 070363DF181A1CDC00C074A5 /* AVVideoCaptureSource.mm */ = {isa = PBXFileReference; 
fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = AVVideoCaptureSource.mm; sourceTree = "<group>"; }; > 0704A4031D6DE9F10086DCDB /* OverconstrainedError.idl */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text; path = OverconstrainedError.idl; sourceTree = "<group>"; }; >@@ -15203,8 +15199,6 @@ > 07C046C61E42512F007201E7 /* AudioTrackPrivateMediaStreamCocoa.h */, > 070363DA181A1CDC00C074A5 /* AVCaptureDeviceManager.h */, > 070363DB181A1CDC00C074A5 /* AVCaptureDeviceManager.mm */, >- 070363DC181A1CDC00C074A5 /* AVMediaCaptureSource.h */, >- 070363DD181A1CDC00C074A5 /* AVMediaCaptureSource.mm */, > 070363DE181A1CDC00C074A5 /* AVVideoCaptureSource.h */, > 070363DF181A1CDC00C074A5 /* AVVideoCaptureSource.mm */, > 3F8020311E9E381D00DEC61D /* CoreAudioCaptureDevice.cpp */, >@@ -27099,7 +27093,6 @@ > CDC675231EAEA9B700727C84 /* AVAudioSessionCaptureDeviceManager.h in Headers */, > 070363E2181A1CDC00C074A5 /* AVCaptureDeviceManager.h in Headers */, > 07F4E93320B3587F002E3803 /* AVFoundationMIMETypeCache.h in Headers */, >- 070363E4181A1CDC00C074A5 /* AVMediaCaptureSource.h in Headers */, > CD336F6217F9F64700DDDCD0 /* AVTrackPrivateAVFObjCImpl.h in Headers */, > 070363E6181A1CDC00C074A5 /* AVVideoCaptureSource.h in Headers */, > F45C231E1995B73B00A6E2E3 /* AxisScrollSnapOffsets.h in Headers */, >@@ -31328,7 +31321,6 @@ > CDC675221EAEA9B700727C84 /* AVAudioSessionCaptureDeviceManager.mm in Sources */, > 070363E3181A1CDC00C074A5 /* AVCaptureDeviceManager.mm in Sources */, > 0719427F1D088F21002AA51D /* AVFoundationMIMETypeCache.mm in Sources */, >- 070363E5181A1CDC00C074A5 /* AVMediaCaptureSource.mm in Sources */, > CD336F6117F9F64700DDDCD0 /* AVTrackPrivateAVFObjCImpl.mm in Sources */, > 070363E7181A1CDC00C074A5 /* AVVideoCaptureSource.mm in Sources */, > 7A45032F18DB717200377B34 /* BufferedLineReader.cpp in Sources */, >diff --git a/Source/WebCore/platform/OrientationNotifier.h b/Source/WebCore/platform/OrientationNotifier.h >index 
6e9e890429d846e90c058a674ea4ce3f8234b231..0e6ccb10f9861e0eeb744711e8e856cb69c1a012 100644 >--- a/Source/WebCore/platform/OrientationNotifier.h >+++ b/Source/WebCore/platform/OrientationNotifier.h >@@ -25,6 +25,8 @@ > > #pragma once > >+#include <wtf/Vector.h> >+ > namespace WebCore { > > class OrientationNotifier { >diff --git a/Source/WebCore/platform/mediastream/mac/AVCaptureDeviceManager.mm b/Source/WebCore/platform/mediastream/mac/AVCaptureDeviceManager.mm >index 0efc92a0fee902ad9689e6ceacc55a6572f4fbd9..9e195ca988dd37d43bea4486dde559c67431da0a 100644 >--- a/Source/WebCore/platform/mediastream/mac/AVCaptureDeviceManager.mm >+++ b/Source/WebCore/platform/mediastream/mac/AVCaptureDeviceManager.mm >@@ -28,7 +28,6 @@ > > #if ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION) > >-#import "AVMediaCaptureSource.h" > #import "AVVideoCaptureSource.h" > #import "AudioSourceProvider.h" > #import "Logging.h" >@@ -55,20 +54,12 @@ SOFT_LINK_CLASS(AVFoundation, AVCaptureSession) > SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeAudio, NSString *) > SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeMuxed, NSString *) > SOFT_LINK_CONSTANT(AVFoundation, AVMediaTypeVideo, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVCaptureSessionPreset1280x720, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVCaptureSessionPreset640x480, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVCaptureSessionPreset352x288, NSString *) >-SOFT_LINK_CONSTANT(AVFoundation, AVCaptureSessionPresetLow, NSString *) > SOFT_LINK_CONSTANT(AVFoundation, AVCaptureDeviceWasConnectedNotification, NSString *) > SOFT_LINK_CONSTANT(AVFoundation, AVCaptureDeviceWasDisconnectedNotification, NSString *) > > #define AVMediaTypeAudio getAVMediaTypeAudio() > #define AVMediaTypeMuxed getAVMediaTypeMuxed() > #define AVMediaTypeVideo getAVMediaTypeVideo() >-#define AVCaptureSessionPreset1280x720 getAVCaptureSessionPreset1280x720() >-#define AVCaptureSessionPreset640x480 getAVCaptureSessionPreset640x480() >-#define AVCaptureSessionPreset352x288 
getAVCaptureSessionPreset352x288() >-#define AVCaptureSessionPresetLow getAVCaptureSessionPresetLow() > #define AVCaptureDeviceWasConnectedNotification getAVCaptureDeviceWasConnectedNotification() > #define AVCaptureDeviceWasDisconnectedNotification getAVCaptureDeviceWasDisconnectedNotification() > >diff --git a/Source/WebCore/platform/mediastream/mac/AVMediaCaptureSource.h b/Source/WebCore/platform/mediastream/mac/AVMediaCaptureSource.h >deleted file mode 100644 >index 5f48476de6d6f2bfaa75d25ff5b1bff3030e4055..0000000000000000000000000000000000000000 >--- a/Source/WebCore/platform/mediastream/mac/AVMediaCaptureSource.h >+++ /dev/null >@@ -1,116 +0,0 @@ >-/* >- * Copyright (C) 2013-2017 Apple Inc. All rights reserved. >- * >- * Redistribution and use in source and binary forms, with or without >- * modification, are permitted provided that the following conditions >- * are met: >- * 1. Redistributions of source code must retain the above copyright >- * notice, this list of conditions and the following disclaimer. >- * 2. Redistributions in binary form must reproduce the above copyright >- * notice, this list of conditions and the following disclaimer in the >- * documentation and/or other materials provided with the distribution. >- * >- * THIS SOFTWARE IS PROVIDED BY APPLE INC. ``AS IS'' AND ANY >- * EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE >- * IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR >- * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. 
OR >- * CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, >- * EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, >- * PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR >- * PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY >- * OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT >- * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE >- * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. >- */ >- >-#ifndef AVMediaCaptureSource_h >-#define AVMediaCaptureSource_h >- >-#if ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION) >- >-#include "GenericTaskQueue.h" >-#include "RealtimeMediaSource.h" >-#include "Timer.h" >-#include <wtf/Function.h> >- >-OBJC_CLASS AVCaptureAudioDataOutput; >-OBJC_CLASS AVCaptureConnection; >-OBJC_CLASS AVCaptureDevice; >-OBJC_CLASS AVCaptureOutput; >-OBJC_CLASS AVCaptureSession; >-OBJC_CLASS AVCaptureVideoDataOutput; >-OBJC_CLASS NSError; >-OBJC_CLASS NSNotification; >-OBJC_CLASS WebCoreAVMediaCaptureSourceObserver; >- >-typedef struct opaqueCMSampleBuffer *CMSampleBufferRef; >- >-namespace WebCore { >- >-class AVMediaCaptureSource; >- >-class AVMediaCaptureSource : public RealtimeMediaSource { >-public: >- virtual ~AVMediaCaptureSource(); >- >- virtual void captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutput*, CMSampleBufferRef, AVCaptureConnection*) = 0; >- >- void captureSessionIsRunningDidChange(bool); >- void captureSessionRuntimeError(RetainPtr<NSError>); >- >- enum class InterruptionReason { None, VideoNotAllowedInBackground, AudioInUse, VideoInUse, VideoNotAllowedInSideBySide }; >- void captureSessionBeginInterruption(RetainPtr<NSNotification>); >- void captureSessionEndInterruption(RetainPtr<NSNotification>); >- >- AVCaptureSession *session() const { return m_session.get(); } >- >- const RealtimeMediaSourceSettings& settings() const final; >- >- void startProducingData() final; >- void stopProducingData() final; 
>- >-protected: >- AVMediaCaptureSource(AVCaptureDevice*, const AtomicString&, RealtimeMediaSource::Type); >- >- virtual bool setupCaptureSession() = 0; >- virtual void shutdownCaptureSession() = 0; >- virtual void updateSettings(RealtimeMediaSourceSettings&) = 0; >- virtual void initializeCapabilities(RealtimeMediaSourceCapabilities&) = 0; >- virtual void initializeSupportedConstraints(RealtimeMediaSourceSupportedConstraints&) = 0; >- >- AVCaptureDevice *device() const { return m_device.get(); } >- >- RealtimeMediaSourceSupportedConstraints& supportedConstraints(); >- const RealtimeMediaSourceCapabilities& capabilities() const final; >- >- void setVideoSampleBufferDelegate(AVCaptureVideoDataOutput*); >- void setAudioSampleBufferDelegate(AVCaptureAudioDataOutput*); >- >-private: >- bool setupSession(); >- >- void beginConfiguration() final; >- void commitConfiguration() final; >- >- bool isCaptureSource() const final { return true; } >- >- bool interrupted() const final; >- >- void initializeSettings(); >- void initializeCapabilities(); >- >- RealtimeMediaSourceSettings m_currentSettings; >- RealtimeMediaSourceSupportedConstraints m_supportedConstraints; >- RetainPtr<WebCoreAVMediaCaptureSourceObserver> m_objcObserver; >- std::unique_ptr<RealtimeMediaSourceCapabilities> m_capabilities; >- RetainPtr<AVCaptureSession> m_session; >- RetainPtr<AVCaptureDevice> m_device; >- InterruptionReason m_interruption { InterruptionReason::None }; >- bool m_isRunning { false }; >-}; >- >-} // namespace WebCore >- >-#endif // ENABLE(MEDIA_STREAM) >- >-#endif // AVMediaCaptureSource_h >diff --git a/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.h b/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.h >index f8dcd00afe230a2f520fddf83a05db956d0cf31a..748e4800c551f89527bca60eb9637c4495f4e768 100644 >--- a/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.h >+++ b/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.h >@@ -27,23 +27,24 @@ > > 
#if ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION) > >-#include "AVMediaCaptureSource.h" > #include "OrientationNotifier.h" >+#include "RealtimeMediaSource.h" >+#include <wtf/text/StringHash.h> > >-OBJC_CLASS CALayer; >-OBJC_CLASS AVFrameRateRange; >+typedef struct opaqueCMSampleBuffer* CMSampleBufferRef; > >-typedef struct CGImage *CGImageRef; >-typedef const struct opaqueCMFormatDescription *CMFormatDescriptionRef; >-typedef struct opaqueCMSampleBuffer *CMSampleBufferRef; >+OBJC_CLASS AVCaptureConnection; >+OBJC_CLASS AVCaptureDevice; >+OBJC_CLASS AVCaptureOutput; >+OBJC_CLASS AVCaptureSession; >+OBJC_CLASS AVCaptureVideoDataOutput; >+OBJC_CLASS NSError; >+OBJC_CLASS NSNotification; >+OBJC_CLASS WebCoreAVVideoCaptureSourceObserver; > > namespace WebCore { > >-class FloatRect; >-class GraphicsContext; >-class PixelBufferConformerCV; >- >-class AVVideoCaptureSource : public AVMediaCaptureSource, private OrientationNotifier::Observer { >+class AVVideoCaptureSource : public RealtimeMediaSource, private OrientationNotifier::Observer { > public: > static CaptureSourceOrError create(const AtomicString&, const MediaConstraints*); > >@@ -52,43 +53,59 @@ public: > int32_t width() const { return m_width; } > int32_t height() const { return m_height; } > >+ enum class InterruptionReason { None, VideoNotAllowedInBackground, AudioInUse, VideoInUse, VideoNotAllowedInSideBySide }; >+ void captureSessionBeginInterruption(RetainPtr<NSNotification>); >+ void captureSessionEndInterruption(RetainPtr<NSNotification>); >+ >+ AVCaptureSession* session() const { return m_session.get(); } >+ >+ void captureSessionIsRunningDidChange(bool); >+ void captureSessionRuntimeError(RetainPtr<NSError>); >+ void captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutput*, CMSampleBufferRef, AVCaptureConnection*); >+ > private: > AVVideoCaptureSource(AVCaptureDevice*, const AtomicString&); > virtual ~AVVideoCaptureSource(); > >- bool setupCaptureSession() final; >- void shutdownCaptureSession() 
final; >- >- void updateSettings(RealtimeMediaSourceSettings&) final; >+ bool setupSession(); >+ bool setupCaptureSession(); >+ void shutdownCaptureSession(); > >+ const RealtimeMediaSourceCapabilities& capabilities() const final; > void applySizeAndFrameRate(std::optional<int> width, std::optional<int> height, std::optional<double>) final; > bool applySize(const IntSize&) final; > bool applyFrameRate(double) final; >- bool setPreset(NSString*); >- >+ const RealtimeMediaSourceSettings& settings() const final; >+ void startProducingData() final; >+ void stopProducingData() final; >+ bool supportsSizeAndFrameRate(std::optional<int> width, std::optional<int> height, std::optional<double>) final; >+ void settingsDidChange() final; > void monitorOrientation(OrientationNotifier&) final; >+ void beginConfiguration() final; >+ void commitConfiguration() final; >+ bool isCaptureSource() const final { return true; } >+ bool interrupted() const final; >+ >+ bool setPreset(NSString*); > void computeSampleRotation(); > > bool isFrameRateSupported(double frameRate); > >- NSString *bestSessionPresetForVideoDimensions(std::optional<int> width, std::optional<int> height); >- bool supportsSizeAndFrameRate(std::optional<int> width, std::optional<int> height, std::optional<double>) final; >- >- void initializeCapabilities(RealtimeMediaSourceCapabilities&) final; >- void initializeSupportedConstraints(RealtimeMediaSourceSupportedConstraints&) final; >+ NSString* bestSessionPresetForVideoDimensions(std::optional<int> width, std::optional<int> height); > > // OrientationNotifier::Observer API > void orientationChanged(int orientation) final; > > bool setFrameRateConstraint(double minFrameRate, double maxFrameRate); > >- void captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutput*, CMSampleBufferRef, AVCaptureConnection*) final; > void processNewFrame(RetainPtr<CMSampleBufferRef>, RetainPtr<AVCaptureConnection>); > IntSize sizeForPreset(NSString*); > > using VideoPresetMap = 
HashMap<String, IntSize>; > VideoPresetMap& videoPresets() { return m_supportedPresets; } > >+ AVCaptureDevice* device() const { return m_device.get(); } >+ > RetainPtr<NSString> m_pendingPreset; > RetainPtr<CMSampleBufferRef> m_buffer; > RetainPtr<AVCaptureVideoDataOutput> m_videoOutput; >@@ -101,6 +118,15 @@ private: > MediaSample::VideoRotation m_sampleRotation { MediaSample::VideoRotation::None }; > > VideoPresetMap m_supportedPresets; >+ >+ mutable std::optional<RealtimeMediaSourceSettings> m_currentSettings; >+ mutable std::optional<RealtimeMediaSourceCapabilities> m_capabilities; >+ RetainPtr<WebCoreAVVideoCaptureSourceObserver> m_objcObserver; >+ RetainPtr<AVCaptureSession> m_session; >+ RetainPtr<AVCaptureDevice> m_device; >+ InterruptionReason m_interruption { InterruptionReason::None }; >+ bool m_isRunning { false }; >+ > }; > > } // namespace WebCore >diff --git a/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm b/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm >index facc38f08a8809d2b3b8339f29fe7949e27b76e9..17e0556d9c67810293ee289d1e9f02a7ecca204f 100644 >--- a/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm >+++ b/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm >@@ -28,30 +28,21 @@ > > #if ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION) > >-#import "AVCaptureDeviceManager.h" >-#import "GraphicsContextCG.h" > #import "ImageBuffer.h" > #import "IntRect.h" > #import "Logging.h" > #import "MediaConstraints.h" > #import "MediaSampleAVFObjC.h" >-#import "NotImplemented.h" >-#import "PixelBufferConformerCV.h" > #import "PlatformLayer.h" > #import "RealtimeMediaSourceCenterMac.h" > #import "RealtimeMediaSourceSettings.h" >-#import "WebActionDisablingCALayerDelegate.h" > #import <AVFoundation/AVCaptureDevice.h> > #import <AVFoundation/AVCaptureInput.h> > #import <AVFoundation/AVCaptureOutput.h> > #import <AVFoundation/AVCaptureSession.h> >+#import <AVFoundation/AVError.h> > #import <objc/runtime.h> > 
>-#if PLATFORM(IOS) >-#include "WebCoreThread.h" >-#include "WebCoreThreadRun.h" >-#endif >- > #import <pal/cf/CoreMediaSoftLink.h> > #import "CoreVideoSoftLink.h" > >@@ -62,6 +53,7 @@ typedef AVCaptureDeviceInput AVCaptureDeviceInputType; > typedef AVCaptureOutput AVCaptureOutputType; > typedef AVCaptureVideoDataOutput AVCaptureVideoDataOutputType; > typedef AVFrameRateRange AVFrameRateRangeType; >+typedef AVCaptureSession AVCaptureSessionType; > > SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) > >@@ -72,6 +64,7 @@ SOFT_LINK_CLASS(AVFoundation, AVCaptureDeviceInput) > SOFT_LINK_CLASS(AVFoundation, AVCaptureOutput) > SOFT_LINK_CLASS(AVFoundation, AVCaptureVideoDataOutput) > SOFT_LINK_CLASS(AVFoundation, AVFrameRateRange) >+SOFT_LINK_CLASS(AVFoundation, AVCaptureSession) > > #define AVCaptureConnection getAVCaptureConnectionClass() > #define AVCaptureDevice getAVCaptureDeviceClass() >@@ -97,13 +90,40 @@ SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVCaptureSessionPreset320x240, NSStrin > #if PLATFORM(IOS) > SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVCaptureSessionPreset3840x2160, NSString *) > SOFT_LINK_CONSTANT_MAY_FAIL(AVFoundation, AVCaptureSessionPreset1920x1080, NSString *) >+SOFT_LINK_POINTER_OPTIONAL(AVFoundation, AVCaptureSessionRuntimeErrorNotification, NSString *) >+SOFT_LINK_POINTER_OPTIONAL(AVFoundation, AVCaptureSessionWasInterruptedNotification, NSString *) >+SOFT_LINK_POINTER_OPTIONAL(AVFoundation, AVCaptureSessionInterruptionEndedNotification, NSString *) >+SOFT_LINK_POINTER_OPTIONAL(AVFoundation, AVCaptureSessionInterruptionReasonKey, NSString *) >+SOFT_LINK_POINTER_OPTIONAL(AVFoundation, AVCaptureSessionErrorKey, NSString *) > > #define AVCaptureSessionPreset3840x2160 getAVCaptureSessionPreset3840x2160() > #define AVCaptureSessionPreset1920x1080 getAVCaptureSessionPreset1920x1080() >+#define AVCaptureSessionRuntimeErrorNotification getAVCaptureSessionRuntimeErrorNotification() >+#define AVCaptureSessionWasInterruptedNotification 
getAVCaptureSessionWasInterruptedNotification() >+#define AVCaptureSessionInterruptionEndedNotification getAVCaptureSessionInterruptionEndedNotification() >+#define AVCaptureSessionInterruptionReasonKey getAVCaptureSessionInterruptionReasonKey() >+#define AVCaptureSessionErrorKey getAVCaptureSessionErrorKey() > #endif > > using namespace WebCore; > >+@interface WebCoreAVVideoCaptureSourceObserver : NSObject<AVCaptureVideoDataOutputSampleBufferDelegate> { >+ AVVideoCaptureSource* m_callback; >+} >+ >+-(id)initWithCallback:(AVVideoCaptureSource*)callback; >+-(void)disconnect; >+-(void)addNotificationObservers; >+-(void)removeNotificationObservers; >+-(void)captureOutput:(AVCaptureOutputType*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnectionType*)connection; >+-(void)observeValueForKeyPath:keyPath ofObject:(id)object change:(NSDictionary*)change context:(void*)context; >+#if PLATFORM(IOS) >+-(void)sessionRuntimeError:(NSNotification*)notification; >+-(void)beginSessionInterrupted:(NSNotification*)notification; >+-(void)endSessionInterrupted:(NSNotification*)notification; >+#endif >+@end >+ > namespace WebCore { > > #if PLATFORM(MAC) >@@ -112,6 +132,16 @@ const OSType videoCaptureFormat = kCVPixelFormatType_420YpCbCr8Planar; > const OSType videoCaptureFormat = kCVPixelFormatType_420YpCbCr8BiPlanarFullRange; > #endif > >+static dispatch_queue_t globaVideoCaptureSerialQueue() >+{ >+ static dispatch_queue_t globalQueue; >+ static dispatch_once_t onceToken; >+ dispatch_once(&onceToken, ^{ >+ globalQueue = dispatch_queue_create_with_target("WebCoreAVVideoCaptureSource video capture queue", DISPATCH_QUEUE_SERIAL, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0)); >+ }); >+ return globalQueue; >+} >+ > CaptureSourceOrError AVVideoCaptureSource::create(const AtomicString& id, const MediaConstraints* constraints) > { > AVCaptureDeviceTypedef *device = [getAVCaptureDeviceClass() deviceWithUniqueID:id]; >@@ -129,7 
+159,9 @@ CaptureSourceOrError AVVideoCaptureSource::create(const AtomicString& id, const > } > > AVVideoCaptureSource::AVVideoCaptureSource(AVCaptureDeviceTypedef* device, const AtomicString& id) >- : AVMediaCaptureSource(device, id, Type::Video) >+ : RealtimeMediaSource(id, Type::Video, device.localizedName) >+ , m_objcObserver(adoptNS([[WebCoreAVVideoCaptureSourceObserver alloc] initWithCallback:this])) >+ , m_device(device) > { > struct VideoPreset { > bool symbolAvailable; >@@ -157,12 +189,60 @@ AVVideoCaptureSource::AVVideoCaptureSource(AVCaptureDeviceTypedef* device, const > > presetsMap->add(String(preset.name), IntSize(preset.width, preset.height)); > } >+ >+#if PLATFORM(IOS) >+ static_assert(static_cast<int>(InterruptionReason::VideoNotAllowedInBackground) == static_cast<int>(AVCaptureSessionInterruptionReasonVideoDeviceNotAvailableInBackground), "InterruptionReason::VideoNotAllowedInBackground is not AVCaptureSessionInterruptionReasonVideoDeviceNotAvailableInBackground as expected"); >+ static_assert(static_cast<int>(InterruptionReason::VideoNotAllowedInSideBySide) == AVCaptureSessionInterruptionReasonVideoDeviceNotAvailableWithMultipleForegroundApps, "InterruptionReason::VideoNotAllowedInSideBySide is not AVCaptureSessionInterruptionReasonVideoDeviceNotAvailableWithMultipleForegroundApps as expected"); >+ static_assert(static_cast<int>(InterruptionReason::VideoInUse) == AVCaptureSessionInterruptionReasonVideoDeviceInUseByAnotherClient, "InterruptionReason::VideoInUse is not AVCaptureSessionInterruptionReasonVideoDeviceInUseByAnotherClient as expected"); >+ static_assert(static_cast<int>(InterruptionReason::AudioInUse) == AVCaptureSessionInterruptionReasonAudioDeviceInUseByAnotherClient, "InterruptionReason::AudioInUse is not AVCaptureSessionInterruptionReasonAudioDeviceInUseByAnotherClient as expected"); >+#endif >+ >+ setPersistentID(String(device.uniqueID)); > } > > AVVideoCaptureSource::~AVVideoCaptureSource() > { > #if PLATFORM(IOS) > 
RealtimeMediaSourceCenterMac::videoCaptureSourceFactory().unsetActiveSource(*this); >+#endif >+ [m_objcObserver disconnect]; >+ >+ if (!m_session) >+ return; >+ >+ [m_session removeObserver:m_objcObserver.get() forKeyPath:@"rate"]; >+ if ([m_session isRunning]) >+ [m_session stopRunning]; >+ >+} >+ >+void AVVideoCaptureSource::startProducingData() >+{ >+ if (!m_session) { >+ if (!setupSession()) >+ return; >+ } >+ >+ if ([m_session isRunning]) >+ return; >+ >+ [m_objcObserver addNotificationObservers]; >+ [m_session startRunning]; >+} >+ >+void AVVideoCaptureSource::stopProducingData() >+{ >+ if (!m_session) >+ return; >+ >+ [m_objcObserver removeNotificationObservers]; >+ >+ if ([m_session isRunning]) >+ [m_session stopRunning]; >+ >+ m_interruption = InterruptionReason::None; >+#if PLATFORM(IOS) >+ m_session = nullptr; > #endif > } > >@@ -178,8 +258,65 @@ static void updateAspectRatioMinMax(double& min, double& max, double value) > max = std::max<double>(max, value); > } > >-void AVVideoCaptureSource::initializeCapabilities(RealtimeMediaSourceCapabilities& capabilities) >+void AVVideoCaptureSource::beginConfiguration() >+{ >+ if (m_session) >+ [m_session beginConfiguration]; >+} >+ >+void AVVideoCaptureSource::commitConfiguration() > { >+ if (m_session) >+ [m_session commitConfiguration]; >+} >+ >+void AVVideoCaptureSource::settingsDidChange() >+{ >+ m_currentSettings = std::nullopt; >+ RealtimeMediaSource::settingsDidChange(); >+} >+ >+const RealtimeMediaSourceSettings& AVVideoCaptureSource::settings() const >+{ >+ if (m_currentSettings) >+ return *m_currentSettings; >+ >+ RealtimeMediaSourceSettings settings; >+ if ([device() position] == AVCaptureDevicePositionFront) >+ settings.setFacingMode(RealtimeMediaSourceSettings::User); >+ else if ([device() position] == AVCaptureDevicePositionBack) >+ settings.setFacingMode(RealtimeMediaSourceSettings::Environment); >+ else >+ settings.setFacingMode(RealtimeMediaSourceSettings::Unknown); >+ >+ auto maxFrameDuration = 
>+    auto maxFrameDuration = [device() activeVideoMaxFrameDuration];
>+    settings.setFrameRate(maxFrameDuration.timescale / maxFrameDuration.value);
>+    settings.setWidth(m_width);
>+    settings.setHeight(m_height);
>+    settings.setDeviceId(id());
>+
>+    RealtimeMediaSourceSupportedConstraints supportedConstraints;
>+    supportedConstraints.setSupportsDeviceId(true);
>+    supportedConstraints.setSupportsFacingMode([device() position] != AVCaptureDevicePositionUnspecified);
>+    supportedConstraints.setSupportsWidth(true);
>+    supportedConstraints.setSupportsHeight(true);
>+    supportedConstraints.setSupportsAspectRatio(true);
>+    supportedConstraints.setSupportsFrameRate(true);
>+
>+    settings.setSupportedConstraints(supportedConstraints);
>+
>+    m_currentSettings = WTFMove(settings);
>+
>+    return *m_currentSettings;
>+}
>+
>+const RealtimeMediaSourceCapabilities& AVVideoCaptureSource::capabilities() const
>+{
>+    if (m_capabilities)
>+        return *m_capabilities;
>+
>+    RealtimeMediaSourceCapabilities capabilities(settings().supportedConstraints());
>+    capabilities.setDeviceId(id());
>     AVCaptureDeviceTypedef *videoDevice = device();
>
>     if ([videoDevice position] == AVCaptureDevicePositionFront)
>@@ -204,45 +341,20 @@ void AVVideoCaptureSource::initializeCapabilities(RealtimeMediaSourceCapabilitie
>         }
>     }
>
>-    for (auto& preset : videoPresets()) {
>+    for (auto& preset : m_supportedPresets) {
>         auto values = preset.value;
>         updateSizeMinMax(minimumWidth, maximumWidth, values.width());
>         updateSizeMinMax(minimumHeight, maximumHeight, values.height());
>         updateAspectRatioMinMax(minimumAspectRatio, maximumAspectRatio, static_cast<double>(values.width()) / values.height());
>     }
>-
>     capabilities.setFrameRate(CapabilityValueOrRange(lowestFrameRateRange, highestFrameRateRange));
>     capabilities.setWidth(CapabilityValueOrRange(minimumWidth, maximumWidth));
>     capabilities.setHeight(CapabilityValueOrRange(minimumHeight, maximumHeight));
>     capabilities.setAspectRatio(CapabilityValueOrRange(minimumAspectRatio, maximumAspectRatio));
>-}
>-
>-void AVVideoCaptureSource::initializeSupportedConstraints(RealtimeMediaSourceSupportedConstraints& supportedConstraints)
>-{
>-    supportedConstraints.setSupportsFacingMode([device() position] != AVCaptureDevicePositionUnspecified);
>-    supportedConstraints.setSupportsWidth(true);
>-    supportedConstraints.setSupportsHeight(true);
>-    supportedConstraints.setSupportsAspectRatio(true);
>-    supportedConstraints.setSupportsFrameRate(true);
>-}
>-
>-void AVVideoCaptureSource::updateSettings(RealtimeMediaSourceSettings& settings)
>-{
>-    settings.setDeviceId(id());
>
>-    if ([device() position] == AVCaptureDevicePositionFront)
>-        settings.setFacingMode(RealtimeMediaSourceSettings::User);
>-    else if ([device() position] == AVCaptureDevicePositionBack)
>-        settings.setFacingMode(RealtimeMediaSourceSettings::Environment);
>-    else
>-        settings.setFacingMode(RealtimeMediaSourceSettings::Unknown);
>+    m_capabilities = WTFMove(capabilities);
>
>-    // FIXME: Observe frame rate changes.
>-    auto maxFrameDuration = [device() activeVideoMaxFrameDuration];
>-    settings.setFrameRate(maxFrameDuration.timescale / maxFrameDuration.value);
>-    settings.setWidth(m_width);
>-    settings.setHeight(m_height);
>-    settings.setAspectRatio(static_cast<float>(m_width) / m_height);
>+    return *m_capabilities;
> }
>
> bool AVVideoCaptureSource::applySize(const IntSize& size)
>@@ -285,7 +397,11 @@ bool AVVideoCaptureSource::setPreset(NSString *preset)
>     @try {
>         session().sessionPreset = preset;
> #if PLATFORM(MAC)
>-        auto settingsDictionary = @{ (__bridge NSString *)kCVPixelBufferPixelFormatTypeKey: @(videoCaptureFormat), (__bridge NSString *)kCVPixelBufferWidthKey: @(size.width()), (__bridge NSString *)kCVPixelBufferHeightKey: @(size.height()), };
>+        auto settingsDictionary = @{
>+            (__bridge NSString *)kCVPixelBufferPixelFormatTypeKey: @(videoCaptureFormat),
>+            (__bridge NSString *)kCVPixelBufferWidthKey: @(size.width()),
>+            (__bridge NSString *)kCVPixelBufferHeightKey: @(size.height())
>+        };
>         [m_videoOutput setVideoSettings:settingsDictionary];
> #endif
>     } @catch(NSException *exception) {
>@@ -381,6 +497,24 @@ static inline int sensorOrientationFromVideoOutput(AVCaptureVideoDataOutputType*
>     return connection ? sensorOrientation([connection videoOrientation]) : 0;
> }
>
>+bool AVVideoCaptureSource::setupSession()
>+{
>+    if (m_session)
>+        return true;
>+
>+    m_session = adoptNS([allocAVCaptureSessionInstance() init]);
>+    [m_session addObserver:m_objcObserver.get() forKeyPath:@"rate" options:NSKeyValueObservingOptionNew context:(void *)nil];
>+
>+    [m_session beginConfiguration];
>+    bool success = setupCaptureSession();
>+    [m_session commitConfiguration];
>+
>+    if (!success)
>+        captureFailed();
>+
>+    return success;
>+}
>+
> bool AVVideoCaptureSource::setupCaptureSession()
> {
> #if PLATFORM(IOS)
>@@ -412,7 +546,7 @@ bool AVVideoCaptureSource::setupCaptureSession()
>
>     [m_videoOutput setVideoSettings:settingsDictionary.get()];
>     [m_videoOutput setAlwaysDiscardsLateVideoFrames:YES];
>-    setVideoSampleBufferDelegate(m_videoOutput.get());
>+    [m_videoOutput setSampleBufferDelegate:m_objcObserver.get() queue:globaVideoCaptureSerialQueue()];
>
>     if (![session() canAddOutput:m_videoOutput.get()]) {
>         RELEASE_LOG(Media, "AVVideoCaptureSource::setupCaptureSession(%p), unable to add video sample buffer output delegate", this);
>@@ -555,6 +689,157 @@ bool AVVideoCaptureSource::supportsSizeAndFrameRate(std::optional<int> width, st
>     return isFrameRateSupported(frameRate.value());
> }
>
>+void AVVideoCaptureSource::captureSessionIsRunningDidChange(bool state)
>+{
>+    scheduleDeferredTask([this, state] {
>+        if ((state == m_isRunning) && (state == !muted()))
>+            return;
>+
>+        m_isRunning = state;
>+        notifyMutedChange(!m_isRunning);
>+    });
>+}
>+
>+bool AVVideoCaptureSource::interrupted() const
>+{
>+    if (m_interruption != InterruptionReason::None)
>+        return true;
>+
>+    return RealtimeMediaSource::interrupted();
>+}
>+
>+#if PLATFORM(IOS)
>+void AVVideoCaptureSource::captureSessionRuntimeError(RetainPtr<NSError> error)
>+{
>+    if (!m_isRunning || error.get().code != AVErrorMediaServicesWereReset)
>+        return;
>+
>+    // Try to restart the session, but reset m_isRunning immediately so if it fails we won't try again.
>+    [m_session startRunning];
>+    m_isRunning = [m_session isRunning];
>+}
>+
>+void AVVideoCaptureSource::captureSessionBeginInterruption(RetainPtr<NSNotification> notification)
>+{
>+    m_interruption = static_cast<AVVideoCaptureSource::InterruptionReason>([notification.get().userInfo[AVCaptureSessionInterruptionReasonKey] integerValue]);
>+}
>+
>+void AVVideoCaptureSource::captureSessionEndInterruption(RetainPtr<NSNotification>)
>+{
>+    InterruptionReason reason = m_interruption;
>+
>+    m_interruption = InterruptionReason::None;
>+    if (reason != InterruptionReason::VideoNotAllowedInSideBySide || m_isRunning || !m_session)
>+        return;
>+
>+    [m_session startRunning];
>+    m_isRunning = [m_session isRunning];
>+}
>+#endif
>+
> } // namespace WebCore
>
>+@implementation WebCoreAVVideoCaptureSourceObserver
>+
>+- (id)initWithCallback:(AVVideoCaptureSource*)callback
>+{
>+    self = [super init];
>+    if (!self)
>+        return nil;
>+
>+    m_callback = callback;
>+
>+    return self;
>+}
>+
>+- (void)disconnect
>+{
>+    [NSObject cancelPreviousPerformRequestsWithTarget:self];
>+    [self removeNotificationObservers];
>+    m_callback = nullptr;
>+}
>+
>+- (void)addNotificationObservers
>+{
>+#if PLATFORM(IOS)
>+    ASSERT(m_callback);
>+
>+    NSNotificationCenter* center = [NSNotificationCenter defaultCenter];
>+    AVCaptureSessionType* session = m_callback->session();
>+
>+    [center addObserver:self selector:@selector(sessionRuntimeError:) name:AVCaptureSessionRuntimeErrorNotification object:session];
>+    [center addObserver:self selector:@selector(beginSessionInterrupted:) name:AVCaptureSessionWasInterruptedNotification object:session];
>+    [center addObserver:self selector:@selector(endSessionInterrupted:) name:AVCaptureSessionInterruptionEndedNotification object:session];
>+#endif
>+}
>+
>+- (void)removeNotificationObservers
>+{
>+#if PLATFORM(IOS)
>+    [[NSNotificationCenter defaultCenter] removeObserver:self];
>+#endif
>+}
>+
>+- (void)captureOutput:(AVCaptureOutputType*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnectionType*)connection
>+{
>+    if (!m_callback)
>+        return;
>+
>+    m_callback->captureOutputDidOutputSampleBufferFromConnection(captureOutput, sampleBuffer, connection);
>+}
>+
>+- (void)observeValueForKeyPath:keyPath ofObject:(id)object change:(NSDictionary*)change context:(void*)context
>+{
>+    UNUSED_PARAM(object);
>+    UNUSED_PARAM(context);
>+
>+    if (!m_callback)
>+        return;
>+
>+    id newValue = [change valueForKey:NSKeyValueChangeNewKey];
>+
>+#if !LOG_DISABLED
>+    bool willChange = [[change valueForKey:NSKeyValueChangeNotificationIsPriorKey] boolValue];
>+
>+    if (willChange)
>+        LOG(Media, "WebCoreAVVideoCaptureSourceObserver::observeValueForKeyPath(%p) - will change, keyPath = %s", self, [keyPath UTF8String]);
>+    else {
>+        RetainPtr<NSString> valueString = adoptNS([[NSString alloc] initWithFormat:@"%@", newValue]);
>+        LOG(Media, "WebCoreAVVideoCaptureSourceObserver::observeValueForKeyPath(%p) - did change, keyPath = %s, value = %s", self, [keyPath UTF8String], [valueString.get() UTF8String]);
>+    }
>+#endif
>+
>+    if ([keyPath isEqualToString:@"running"])
>+        m_callback->captureSessionIsRunningDidChange([newValue boolValue]);
>+}
>+
>+#if PLATFORM(IOS)
>+- (void)sessionRuntimeError:(NSNotification*)notification
>+{
>+    NSError *error = notification.userInfo[AVCaptureSessionErrorKey];
>+    LOG(Media, "WebCoreAVVideoCaptureSourceObserver::sessionRuntimeError(%p) - error = %s", self, [[error localizedDescription] UTF8String]);
>+
>+    if (m_callback)
>+        m_callback->captureSessionRuntimeError(error);
>+}
>+
>+- (void)beginSessionInterrupted:(NSNotification*)notification
>+{
>+    LOG(Media, "WebCoreAVVideoCaptureSourceObserver::beginSessionInterrupted(%p) - reason = %d", self, [notification.userInfo[AVCaptureSessionInterruptionReasonKey] integerValue]);
>+
>+    if (m_callback)
>+        m_callback->captureSessionBeginInterruption(notification);
>+}
>+
>+- (void)endSessionInterrupted:(NSNotification*)notification
>+{
>+    LOG(Media, "WebCoreAVVideoCaptureSourceObserver::endSessionInterrupted(%p)", self);
>+
>+    if (m_callback)
>+        m_callback->captureSessionEndInterruption(notification);
>+}
>+#endif
>+
>+@end
>+
> #endif // ENABLE(MEDIA_STREAM)