WebKit Bugzilla
Attachment 356469 Details for Bug 192316: Update libwebrtc up to 0d007d7c4f
Patch
bug-192316-20181203222033.patch (text/plain), 7.96 MB, created by youenn fablet on 2018-12-03 22:20:46 PST
>Subversion Revision: 238822
>diff --git a/Source/ThirdParty/libwebrtc/ChangeLog b/Source/ThirdParty/libwebrtc/ChangeLog
>index 77516ad964359e32e300b150f31df881ae851a66..09002c3a782fd718123192b6d58ed9e205e04b36 100644
>--- a/Source/ThirdParty/libwebrtc/ChangeLog
>+++ b/Source/ThirdParty/libwebrtc/ChangeLog
>@@ -1,3 +1,20 @@
>+2018-12-03 Youenn Fablet <youenn@apple.com>
>+
>+ Update libwebrtc up to 0d007d7c4f
>+ https://bugs.webkit.org/show_bug.cgi?id=192316
>+
>+ Reviewed by NOBODY (OOPS!).
>+
>+ Updating to the latest libwebrtc will allow cherry-picking important bug fixes.
>+
>+ * Configurations/libwebrtc.iOS.exp:
>+ * Configurations/libwebrtc.iOSsim.exp:
>+ * Configurations/libwebrtc.mac.exp:
>+ * Source/third_party/abseil-cpp: refreshed.
>+ * Source/webrtc: refreshed.
>+ * WebKit/0001-libwebrtc-changes.patch: Removed.
>+ * libwebrtc.xcodeproj/project.pbxproj:
>+
> 2018-12-01 Thibault Saunier <tsaunier@igalia.com>
>
> [GStreamer][WebRTC] Build opus decoder support in libwebrtc
>diff --git a/Source/WebCore/ChangeLog b/Source/WebCore/ChangeLog
>index 3f9ac3b4eff5181831e0db9643bc845350d10d5b..d29be4dc9c77f39264839e50f7de5dcfd32a88a7 100644
>--- a/Source/WebCore/ChangeLog
>+++ b/Source/WebCore/ChangeLog
>@@ -1,3 +1,14 @@
>+2018-12-03 Youenn Fablet <youenn@apple.com>
>+
>+ Update libwebrtc up to 0d007d7c4f
>+ https://bugs.webkit.org/show_bug.cgi?id=192316
>+
>+ Reviewed by NOBODY (OOPS!).
>+
>+ Update includes according to the new libwebrtc.
>+
>+ * platform/mediastream/libwebrtc/LibWebRTCProvider.cpp:
>+
> 2018-12-03 Youenn Fablet <youenn@apple.com>
>
> A sender created through addTransceiver and populated using addTrack should have its source set
>diff --git a/Source/WebKit/ChangeLog b/Source/WebKit/ChangeLog
>index f0223263ce8fcf9639425f1ebb66c2bb7ed09b77..7385402226c0a3283e48901e5542dceb0b61f7da 100644
>--- a/Source/WebKit/ChangeLog
>+++ b/Source/WebKit/ChangeLog
>@@ -1,3 +1,22 @@
>+2018-12-03 Youenn Fablet <youenn@apple.com>
>+
>+ Update libwebrtc up to 0d007d7c4f
>+ https://bugs.webkit.org/show_bug.cgi?id=192316
>+
>+ Reviewed by NOBODY (OOPS!).
>+
>+ Update the code base according to the new libwebrtc backend.
>+ In particular, use int64_t directly for packet time.
>+
>+ * Configurations/WebKit.xcconfig:
>+ * NetworkProcess/webrtc/LibWebRTCSocketClient.cpp:
>+ (WebKit::LibWebRTCSocketClient::signalReadPacket):
>+ * NetworkProcess/webrtc/LibWebRTCSocketClient.h:
>+ * NetworkProcess/webrtc/NetworkRTCProvider.cpp:
>+ * NetworkProcess/webrtc/NetworkRTCSocket.h:
>+ * WebProcess/Network/webrtc/LibWebRTCSocket.cpp:
>+ (WebKit::LibWebRTCSocket::signalReadPacket):
>+
> 2018-12-03 Youenn Fablet <youenn@apple.com>
>
> Device orientation may be wrong on page reload after crash
>diff --git a/Source/ThirdParty/libwebrtc/CMakeLists.txt b/Source/ThirdParty/libwebrtc/CMakeLists.txt
>index 25478847fe771ac9f9251076838ef66e5353c880..fa21b279c79522a63b4004dede6e2706eac7efc9 100644
>--- a/Source/ThirdParty/libwebrtc/CMakeLists.txt
>+++ b/Source/ThirdParty/libwebrtc/CMakeLists.txt
>@@ -414,12 +414,9 @@ set(webrtc_SOURCES
> Source/webrtc/audio/audio_state.cc
> Source/webrtc/audio/audio_transport_impl.cc
> Source/webrtc/audio/channel_receive.cc
>- Source/webrtc/audio/channel_receive_proxy.cc
> Source/webrtc/audio/channel_send.cc
>- Source/webrtc/audio/channel_send_proxy.cc
> Source/webrtc/audio/null_audio_poller.cc
> Source/webrtc/audio/remix_resample.cc
>- Source/webrtc/audio/time_interval.cc
> Source/webrtc/audio/transport_feedback_packet_loss_tracker.cc
> Source/webrtc/audio/utility/audio_frame_operations.cc
> Source/webrtc/call/audio_receive_stream.cc
>@@ -583,7 +580,6 @@ set(webrtc_SOURCES
> Source/webrtc/modules/audio_coding/acm2/acm_resampler.cc
> Source/webrtc/modules/audio_coding/acm2/audio_coding_module.cc
> Source/webrtc/modules/audio_coding/acm2/call_statistics.cc
>- Source/webrtc/modules/audio_coding/acm2/codec_manager.cc
> Source/webrtc/modules/audio_coding/acm2/rent_a_codec.cc
> Source/webrtc/modules/audio_coding/audio_network_adaptor/audio_network_adaptor_config.cc
> Source/webrtc/modules/audio_coding/audio_network_adaptor/audio_network_adaptor_impl.cc
>@@ -775,7 +771,6 @@ set(webrtc_SOURCES
> Source/webrtc/modules/audio_coding/neteq/preemptive_expand.cc
> Source/webrtc/modules/audio_coding/neteq/random_vector.cc
> Source/webrtc/modules/audio_coding/neteq/red_payload_splitter.cc
>- Source/webrtc/modules/audio_coding/neteq/rtcp.cc
> Source/webrtc/modules/audio_coding/neteq/statistics_calculator.cc
> Source/webrtc/modules/audio_coding/neteq/sync_buffer.cc
> Source/webrtc/modules/audio_coding/neteq/tick_timer.cc
>@@ -857,9 +852,7 @@ set(webrtc_SOURCES
> Source/webrtc/modules/audio_processing/agc2/compute_interpolated_gain_curve.cc
> Source/webrtc/modules/audio_processing/agc2/down_sampler.cc
> Source/webrtc/modules/audio_processing/agc2/fixed_digital_level_estimator.cc
>- Source/webrtc/modules/audio_processing/agc2/fixed_gain_controller.cc
> Source/webrtc/modules/audio_processing/agc2/gain_applier.cc
>- Source/webrtc/modules/audio_processing/agc2/gain_curve_applier.cc
> Source/webrtc/modules/audio_processing/agc2/interpolated_gain_curve.cc
> Source/webrtc/modules/audio_processing/agc2/limiter.cc
> Source/webrtc/modules/audio_processing/agc2/noise_level_estimator.cc
>@@ -928,19 +921,16 @@ set(webrtc_SOURCES
> Source/webrtc/modules/congestion_controller/bbr/data_transfer_tracker.cc
> Source/webrtc/modules/congestion_controller/bbr/loss_rate_filter.cc
> Source/webrtc/modules/congestion_controller/bbr/rtt_stats.cc
>- Source/webrtc/modules/congestion_controller/congestion_window_pushback_controller.cc
> Source/webrtc/modules/congestion_controller/goog_cc/acknowledged_bitrate_estimator.cc
> Source/webrtc/modules/congestion_controller/goog_cc/alr_detector.cc
> Source/webrtc/modules/congestion_controller/goog_cc/bitrate_estimator.cc
> Source/webrtc/modules/congestion_controller/goog_cc/delay_based_bwe.cc
>- Source/webrtc/modules/congestion_controller/goog_cc/goog_cc_factory.cc
> Source/webrtc/modules/congestion_controller/goog_cc/goog_cc_network_control.cc
> Source/webrtc/modules/congestion_controller/goog_cc/median_slope_estimator.cc
> Source/webrtc/modules/congestion_controller/goog_cc/probe_bitrate_estimator.cc
> Source/webrtc/modules/congestion_controller/goog_cc/probe_controller.cc
> Source/webrtc/modules/congestion_controller/goog_cc/trendline_estimator.cc
> Source/webrtc/modules/congestion_controller/receive_side_congestion_controller.cc
>- Source/webrtc/modules/congestion_controller/rtp/pacer_controller.cc
> Source/webrtc/modules/congestion_controller/rtp/send_side_congestion_controller.cc
> Source/webrtc/modules/congestion_controller/rtp/send_time_history.cc
> Source/webrtc/modules/congestion_controller/rtp/transport_feedback_adapter.cc
>@@ -1050,7 +1040,6 @@ set(webrtc_SOURCES
> Source/webrtc/modules/video_coding/codecs/i420/i420.cc
> Source/webrtc/modules/video_coding/codecs/vp8/default_temporal_layers.cc
> Source/webrtc/modules/video_coding/codecs/vp8/screenshare_layers.cc
>- Source/webrtc/modules/video_coding/codecs/vp8/vp8_temporal_layers.cc
> Source/webrtc/modules/video_coding/codecs/vp8/libvpx_vp8_decoder.cc
> Source/webrtc/modules/video_coding/codecs/vp8/libvpx_interface.cc
> Source/webrtc/modules/video_coding/codecs/vp8/temporal_layers_checker.cc
>@@ -1087,7 +1076,6 @@ set(webrtc_SOURCES
> Source/webrtc/modules/video_coding/utility/default_video_bitrate_allocator.cc
> Source/webrtc/modules/video_coding/utility/frame_dropper.cc
> Source/webrtc/modules/video_coding/utility/ivf_file_writer.cc
>- Source/webrtc/modules/video_coding/utility/moving_average.cc
> Source/webrtc/modules/video_coding/utility/quality_scaler.cc
> Source/webrtc/modules/video_coding/utility/simulcast_rate_allocator.cc
> Source/webrtc/modules/video_coding/utility/simulcast_utility.cc
>@@ -1130,15 +1118,12 @@ set(webrtc_SOURCES
> Source/webrtc/p2p/base/transportdescriptionfactory.cc
> Source/webrtc/p2p/base/turnport.cc
> Source/webrtc/p2p/base/turnserver.cc
>- Source/webrtc/p2p/base/udptransport.cc
> Source/webrtc/p2p/client/basicportallocator.cc
> Source/webrtc/p2p/client/turnportfactory.cc
> Source/webrtc/p2p/stunprober/stunprober.cc
> Source/webrtc/pc/audiotrack.cc
>- Source/webrtc/pc/bundlefilter.cc
> Source/webrtc/pc/channel.cc
> Source/webrtc/pc/channelmanager.cc
>- Source/webrtc/pc/createpeerconnectionfactory.cc
> Source/webrtc/pc/datachannel.cc
> Source/webrtc/pc/dtlssrtptransport.cc
> Source/webrtc/pc/dtmfsender.cc
>@@ -1205,7 +1190,6 @@ set(webrtc_SOURCES
> Source/webrtc/rtc_base/experiments/rtt_mult_experiment.cc
> Source/webrtc/rtc_base/file.cc
> Source/webrtc/rtc_base/file_posix.cc
>- Source/webrtc/rtc_base/filerotatingstream.cc
> Source/webrtc/rtc_base/fileutils.cc
> Source/webrtc/rtc_base/firewallsocketserver.cc
> Source/webrtc/rtc_base/flags.cc
>@@ -1228,7 +1212,6 @@ set(webrtc_SOURCES
> Source/webrtc/rtc_base/nethelpers.cc
> Source/webrtc/rtc_base/network.cc
> Source/webrtc/rtc_base/networkmonitor.cc
>- Source/webrtc/rtc_base/noop.cc
> Source/webrtc/rtc_base/nullsocketserver.cc
> Source/webrtc/rtc_base/numerics/exp_filter.cc
> Source/webrtc/rtc_base/numerics/histogram_percentile_counter.cc
>@@ -1240,8 +1223,6 @@ set(webrtc_SOURCES
> Source/webrtc/rtc_base/opensslsessioncache.cc
> Source/webrtc/rtc_base/opensslstreamadapter.cc
> Source/webrtc/rtc_base/opensslutility.cc
>- Source/webrtc/rtc_base/optionsfile.cc
>- Source/webrtc/rtc_base/pathutils.cc
> Source/webrtc/rtc_base/physicalsocketserver.cc
> Source/webrtc/rtc_base/platform_file.cc
> Source/webrtc/rtc_base/platform_thread.cc
>@@ -1285,7 +1266,6 @@ set(webrtc_SOURCES
> Source/webrtc/rtc_base/time/timestamp_extrapolator.cc
> Source/webrtc/rtc_base/timestampaligner.cc
> Source/webrtc/rtc_base/timeutils.cc
>- Source/webrtc/rtc_base/unixfilesystem.cc
> Source/webrtc/rtc_base/virtualsocketserver.cc
> Source/webrtc/rtc_base/weak_ptr.cc
> Source/webrtc/rtc_base/zero_memory.cc
>diff --git a/Source/ThirdParty/libwebrtc/Configurations/libwebrtc.iOS.exp b/Source/ThirdParty/libwebrtc/Configurations/libwebrtc.iOS.exp
>index d95d51ea0c817ad06ace80ea2d026c483ae7b2ab..62286edc552daadd29d3143738507175564deeb8 100644
>--- a/Source/ThirdParty/libwebrtc/Configurations/libwebrtc.iOS.exp
>+++ b/Source/ThirdParty/libwebrtc/Configurations/libwebrtc.iOS.exp
>@@ -218,7 +218,6 @@ __ZNK6webrtc18RtpSenderInterface19init_send_encodingsEv
> __ZNK6webrtc18RtpSenderInterface17GetFrameEncryptorEv
> __ZN6webrtc18RtpSenderInterface17SetFrameEncryptorEN3rtc13scoped_refptrINS_23FrameEncryptorInterfaceEEE
> __ZN6webrtc30PeerConnectionFactoryInterface17CreateVideoSourceENSt3__110unique_ptrIN7cricket13VideoCapturerENS1_14default_deleteIS4_EEEE
>-__ZNK3rtc14NetworkManager16GetMDnsResponderEv
> __ZTVN6webrtc18RtpSenderInterfaceE
> __ZN3rtc23RTCCertificateGeneratorC1EPNS_6ThreadES2_
> __ZN3rtc9KeyParams3RSAEii
>@@ -233,3 +232,9 @@ __ZN6webrtc26PeerConnectionDependenciesC1EPNS_22PeerConnectionObserverE
> __ZN6webrtc30PeerConnectionFactoryInterface20CreatePeerConnectionERKNS_23PeerConnectionInterface16RTCConfigurationENSt3__110unique_ptrIN7cricket13PortAllocatorENS5_14default_deleteIS8_EEEENS6_IN3rtc32RTCCertificateGeneratorInterfaceENS9_ISD_EEEEPNS_22PeerConnectionObserverE
> __ZN6webrtc29SetSessionDescriptionObserver9OnFailureERKNSt3__112basic_stringIcNS1_11char_traitsIcEENS1_9allocatorIcEEEE
> __ZN6webrtc32CreateSessionDescriptionObserver9OnFailureERKNSt3__112basic_stringIcNS1_11char_traitsIcEENS1_9allocatorIcEEEE
>+__ZN6webrtc13CryptoOptionsC1ERKS0_
>+__ZN6webrtc13CryptoOptionsD1Ev
>+__ZNK6webrtc20AudioSourceInterface7optionsEv
>+__ZTVN6webrtc20AudioSourceInterfaceE
>+__ZN6webrtc23PeerConnectionInterface21peer_connection_stateEv
>+__ZNK3rtc14NetworkManager16GetMdnsResponderEv
>diff --git a/Source/ThirdParty/libwebrtc/Configurations/libwebrtc.iOSsim.exp b/Source/ThirdParty/libwebrtc/Configurations/libwebrtc.iOSsim.exp
>index 19d0fe6b201d96cbfd1c1c1d086740977107a6f0..e3126bf06fd9870aa9f3f99c4753ce1c80138c85 100644
>--- a/Source/ThirdParty/libwebrtc/Configurations/libwebrtc.iOSsim.exp
>+++ b/Source/ThirdParty/libwebrtc/Configurations/libwebrtc.iOSsim.exp
>@@ -219,7 +219,6 @@ __ZNK6webrtc18RtpSenderInterface19init_send_encodingsEv
> __ZNK6webrtc18RtpSenderInterface17GetFrameEncryptorEv
> __ZN6webrtc18RtpSenderInterface17SetFrameEncryptorEN3rtc13scoped_refptrINS_23FrameEncryptorInterfaceEEE
> __ZN6webrtc30PeerConnectionFactoryInterface17CreateVideoSourceENSt3__110unique_ptrIN7cricket13VideoCapturerENS1_14default_deleteIS4_EEEE
>-__ZNK3rtc14NetworkManager16GetMDnsResponderEv
> __ZTVN6webrtc18RtpSenderInterfaceE
> __ZN3rtc23RTCCertificateGeneratorC1EPNS_6ThreadES2_
> __ZN3rtc9KeyParams3RSAEii
>@@ -234,3 +233,9 @@ __ZN6webrtc26PeerConnectionDependenciesC1EPNS_22PeerConnectionObserverE
> __ZN6webrtc30PeerConnectionFactoryInterface20CreatePeerConnectionERKNS_23PeerConnectionInterface16RTCConfigurationENSt3__110unique_ptrIN7cricket13PortAllocatorENS5_14default_deleteIS8_EEEENS6_IN3rtc32RTCCertificateGeneratorInterfaceENS9_ISD_EEEEPNS_22PeerConnectionObserverE
> __ZN6webrtc29SetSessionDescriptionObserver9OnFailureERKNSt3__112basic_stringIcNS1_11char_traitsIcEENS1_9allocatorIcEEEE
> __ZN6webrtc32CreateSessionDescriptionObserver9OnFailureERKNSt3__112basic_stringIcNS1_11char_traitsIcEENS1_9allocatorIcEEEE
>+__ZN6webrtc13CryptoOptionsC1ERKS0_
>+__ZN6webrtc13CryptoOptionsD1Ev
>+__ZNK6webrtc20AudioSourceInterface7optionsEv
>+__ZTVN6webrtc20AudioSourceInterfaceE
>+__ZN6webrtc23PeerConnectionInterface21peer_connection_stateEv
>+__ZNK3rtc14NetworkManager16GetMdnsResponderEv
>diff --git a/Source/ThirdParty/libwebrtc/Configurations/libwebrtc.mac.exp b/Source/ThirdParty/libwebrtc/Configurations/libwebrtc.mac.exp
>index 19d0fe6b201d96cbfd1c1c1d086740977107a6f0..e3126bf06fd9870aa9f3f99c4753ce1c80138c85 100644
>--- a/Source/ThirdParty/libwebrtc/Configurations/libwebrtc.mac.exp
>+++ b/Source/ThirdParty/libwebrtc/Configurations/libwebrtc.mac.exp
>@@ -219,7 +219,6 @@ __ZNK6webrtc18RtpSenderInterface19init_send_encodingsEv
> __ZNK6webrtc18RtpSenderInterface17GetFrameEncryptorEv
> __ZN6webrtc18RtpSenderInterface17SetFrameEncryptorEN3rtc13scoped_refptrINS_23FrameEncryptorInterfaceEEE
> __ZN6webrtc30PeerConnectionFactoryInterface17CreateVideoSourceENSt3__110unique_ptrIN7cricket13VideoCapturerENS1_14default_deleteIS4_EEEE
>-__ZNK3rtc14NetworkManager16GetMDnsResponderEv
> __ZTVN6webrtc18RtpSenderInterfaceE
> __ZN3rtc23RTCCertificateGeneratorC1EPNS_6ThreadES2_
> __ZN3rtc9KeyParams3RSAEii
>@@ -234,3 +233,9 @@ __ZN6webrtc26PeerConnectionDependenciesC1EPNS_22PeerConnectionObserverE
> __ZN6webrtc30PeerConnectionFactoryInterface20CreatePeerConnectionERKNS_23PeerConnectionInterface16RTCConfigurationENSt3__110unique_ptrIN7cricket13PortAllocatorENS5_14default_deleteIS8_EEEENS6_IN3rtc32RTCCertificateGeneratorInterfaceENS9_ISD_EEEEPNS_22PeerConnectionObserverE
> __ZN6webrtc29SetSessionDescriptionObserver9OnFailureERKNSt3__112basic_stringIcNS1_11char_traitsIcEENS1_9allocatorIcEEEE
> __ZN6webrtc32CreateSessionDescriptionObserver9OnFailureERKNSt3__112basic_stringIcNS1_11char_traitsIcEENS1_9allocatorIcEEEE
>+__ZN6webrtc13CryptoOptionsC1ERKS0_
>+__ZN6webrtc13CryptoOptionsD1Ev
>+__ZNK6webrtc20AudioSourceInterface7optionsEv
>+__ZTVN6webrtc20AudioSourceInterfaceE
>+__ZN6webrtc23PeerConnectionInterface21peer_connection_stateEv
>+__ZNK3rtc14NetworkManager16GetMdnsResponderEv
>diff --git a/Source/ThirdParty/libwebrtc/Configurations/libwebrtc.xcconfig b/Source/ThirdParty/libwebrtc/Configurations/libwebrtc.xcconfig
>index 2dcf039079fa19427d09d80882b8a7c3ca15d988..57599fcdfc944e1cf1686fe635d14897a2d3b096 100644
>--- a/Source/ThirdParty/libwebrtc/Configurations/libwebrtc.xcconfig
>+++ b/Source/ThirdParty/libwebrtc/Configurations/libwebrtc.xcconfig
>@@ -32,8 +32,8 @@ GCC_PREPROCESSOR_DEFINITIONS[sdk=macosx*] = $(inherited) WEBRTC_USE_VTB_HARDWARE
> GCC_PREPROCESSOR_DEFINITIONS[sdk=iphoneos*] = $(inherited) WEBRTC_IOS;
> GCC_PREPROCESSOR_DEFINITIONS[sdk=iphonesimulator*] = $(inherited) WEBRTC_IOS;
>
>-EXCLUDED_SOURCE_FILE_NAMES[sdk=iphoneos*] = *_sse.cc *_sse2.cc macutils.cc macwindowpicker.cc audio_device_mac.cc audio_mixer_manager_mac.cc;
>-EXCLUDED_SOURCE_FILE_NAMES[sdk=iphonesimulator*] = macutils.cc macwindowpicker.cc audio_device_mac.cc audio_mixer_manager_mac.cc;
>+EXCLUDED_SOURCE_FILE_NAMES[sdk=iphoneos*] = *_sse.cc *_sse2.cc macutils.cc macwindowpicker.cc audio_device_mac.cc audio_mixer_manager_mac.cc logging_mac.mm;
>+EXCLUDED_SOURCE_FILE_NAMES[sdk=iphonesimulator*] = macutils.cc macwindowpicker.cc audio_device_mac.cc audio_mixer_manager_mac.cc logging_mac.mm;
> EXCLUDED_SOURCE_FILE_NAMES[sdk=macosx*] = voice_processing_audio_unit.mm;
>
> OTHER_LDFLAGS[sdk=macosx10.13*] = $(inherited);
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/CMake/AbseilHelpers.cmake b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/CMake/AbseilHelpers.cmake
>index e4eafe49e47906515e448a2def59f7629b2ba393..0c9341747512ceb6173541c7f4a2667a15b57727 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/CMake/AbseilHelpers.cmake
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/CMake/AbseilHelpers.cmake
>@@ -123,7 +123,7 @@ endfunction()
> #
> # all tests will be register for execution with add_test()
> #
>-# test compilation and execution is disable when BUILD_TESTING=OFF
>+# test compilation and execution is disable when ABSL_RUN_TESTS=OFF
> #
> function(absl_test)
>
>@@ -135,7 +135,7 @@ function(absl_test)
> )
>
>
>- if(BUILD_TESTING)
>+ if(ABSL_RUN_TESTS)
>
> set(_NAME ${ABSL_TEST_TARGET})
> string(TOUPPER ${_NAME} _UPPER_NAME)
>@@ -153,7 +153,7 @@ function(absl_test)
> set_property(TARGET ${_NAME}_bin PROPERTY FOLDER ${ABSL_IDE_FOLDER})
>
> add_test(${_NAME} ${_NAME}_bin)
>- endif(BUILD_TESTING)
>+ endif(ABSL_RUN_TESTS)
>
> endfunction()
>
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/CMakeLists.txt b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/CMakeLists.txt
>index 89a3386f7e258efc9739b3517566c591f02ade7e..9a7e1031b2351104e3ec832114a8d6b75d5504c1 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/CMakeLists.txt
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/CMakeLists.txt
>@@ -32,7 +32,7 @@ if (MSVC)
> # /wd4244 conversion from 'type1' to 'type2'
> # /wd4267 conversion from 'size_t' to 'type2'
> # /wd4800 force value to bool 'true' or 'false' (performance warning)
>- add_compile_options(/W3 /WX /wd4005 /wd4068 /wd4244 /wd4267 /wd4800)
>+ add_compile_options(/W3 /wd4005 /wd4068 /wd4244 /wd4267 /wd4800)
> add_definitions(/DNOMINMAX /DWIN32_LEAN_AND_MEAN=1 /D_CRT_SECURE_NO_WARNINGS /D_SCL_SECURE_NO_WARNINGS)
> else()
> set(ABSL_STD_CXX_FLAG "-std=c++11" CACHE STRING "c++ std flag (default: c++11)")
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/OWNERS b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/OWNERS
>index ceca4a8ba2cb16bd8d1d7833bedf78e6f09c93a1..38ebaf2295d1a616056650403abb89a09280dd16 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/OWNERS
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/OWNERS
>@@ -1,2 +1,4 @@
>+danilchap@chromium.org
>+kwiberg@chromium.org
> mbonadei@chromium.org
> phoglund@chromium.org
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/README.chromium b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/README.chromium
>index 91d6d381a47704450b599a009f049c707b1a9232..6026a2f4e20613edb53e168bfb438f20fe4c1fd9 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/README.chromium
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/README.chromium
>@@ -4,7 +4,7 @@ URL: https://github.com/abseil/abseil-cpp
> License: Apache 2.0
> License File: LICENSE
> Version: 0
>-Revision: bea85b52733022294eef108a2e42d77b616ddca2
>+Revision: fb462224c058487763f263b7995d70efd0242c17
> Security Critical: yes
>
> Description:
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/WORKSPACE b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/WORKSPACE
>index e4a911978dc6825695a28413bdfa2355accda27d..72ef13980c29263832cbb22f98f79ae859821ffe 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/WORKSPACE
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/WORKSPACE
>@@ -5,11 +5,11 @@ load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
> http_archive(
> name = "bazel_toolchains",
> urls = [
>- "https://mirror.bazel.build/github.com/bazelbuild/bazel-toolchains/archive/287b64e0a211fb7c23b74695f8d5f5205b61f4eb.tar.gz",
>- "https://github.com/bazelbuild/bazel-toolchains/archive/287b64e0a211fb7c23b74695f8d5f5205b61f4eb.tar.gz",
>+ "https://mirror.bazel.build/github.com/bazelbuild/bazel-toolchains/archive/bc09b995c137df042bb80a395b73d7ce6f26afbe.tar.gz",
>+ "https://github.com/bazelbuild/bazel-toolchains/archive/bc09b995c137df042bb80a395b73d7ce6f26afbe.tar.gz",
> ],
>- strip_prefix = "bazel-toolchains-287b64e0a211fb7c23b74695f8d5f5205b61f4eb",
>- sha256 = "aca8ac6afd7745027ee4a43032b51a725a61a75a30f02cc58681ee87e4dcdf4b",
>+ strip_prefix = "bazel-toolchains-bc09b995c137df042bb80a395b73d7ce6f26afbe",
>+ sha256 = "4329663fe6c523425ad4d3c989a8ac026b04e1acedeceb56aa4b190fa7f3973c",
> )
>
> # GoogleTest/GoogleMock framework. Used by most unit-tests.
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/CMakeLists.txt b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/CMakeLists.txt
>index 689f64e258baac99201cfa7894dbcb80542408df..1d09b1935d84735d6dc90742a645b28850aeec76 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/CMakeLists.txt
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/CMakeLists.txt
>@@ -20,6 +20,7 @@ add_subdirectory(base)
> add_subdirectory(algorithm)
> add_subdirectory(container)
> add_subdirectory(debugging)
>+add_subdirectory(hash)
> add_subdirectory(memory)
> add_subdirectory(meta)
> add_subdirectory(numeric)
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/algorithm/container.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/algorithm/container.h
>index 6af8c09799e56f223ad3eeca246d83ab07063a19..53ab15686c6e10e6217ed47b0b4d7a333756807c 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/algorithm/container.h
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/algorithm/container.h
>@@ -494,7 +494,7 @@ BidirectionalIterator c_copy_backward(const C& src,
> // Container-based version of the <algorithm> `std::move()` function to move
> // a container's elements into an iterator.
> template <typename C, typename OutputIterator>
>-OutputIterator c_move(C& src, OutputIterator dest) {
>+OutputIterator c_move(C&& src, OutputIterator dest) {
> return std::move(container_algorithm_internal::c_begin(src),
> container_algorithm_internal::c_end(src), dest);
> }
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/algorithm/container_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/algorithm/container_test.cc
>index de66f14680f1b8f22d2e965d441a05efc58a6c03..1502b17f8636b0d5edec9fa4535c6e8b80d35075 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/algorithm/container_test.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/algorithm/container_test.cc
>@@ -636,6 +636,21 @@ TEST(MutatingTest, Move) {
> Pointee(5)));
> }
>
>+TEST(MutatingTest, MoveWithRvalue) {
>+ auto MakeRValueSrc = [] {
>+ std::vector<std::unique_ptr<int>> src;
>+ src.emplace_back(absl::make_unique<int>(1));
>+ src.emplace_back(absl::make_unique<int>(2));
>+ src.emplace_back(absl::make_unique<int>(3));
>+ return src;
>+ };
>+
>+ std::vector<std::unique_ptr<int>> dest = MakeRValueSrc();
>+ absl::c_move(MakeRValueSrc(), std::back_inserter(dest));
>+ EXPECT_THAT(dest, ElementsAre(Pointee(1), Pointee(2), Pointee(3), Pointee(1),
>+ Pointee(2), Pointee(3)));
>+}
>+
> TEST(MutatingTest, SwapRanges) {
> std::vector<int> odds = {2, 4, 6};
> std::vector<int> evens = {1, 3, 5};
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/BUILD.bazel b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/BUILD.bazel
>index 06d092ebdfa83abf924bb3ee032694259b2e4b2d..f7d810135b8ccc5fb05de4ab725d6c1cd8df3a46 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/BUILD.bazel
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/BUILD.bazel
>@@ -19,6 +19,7 @@ load(
> "ABSL_DEFAULT_COPTS",
> "ABSL_TEST_COPTS",
> "ABSL_EXCEPTIONS_FLAG",
>+ "ABSL_EXCEPTIONS_FLAG_LINKOPTS",
> )
>
> package(default_visibility = ["//visibility:public"])
>@@ -29,6 +30,7 @@ cc_library(
> name = "spinlock_wait",
> srcs = [
> "internal/spinlock_akaros.inc",
>+ "internal/spinlock_linux.inc",
> "internal/spinlock_posix.inc",
> "internal/spinlock_wait.cc",
> "internal/spinlock_win32.inc",
>@@ -179,6 +181,7 @@ cc_library(
> srcs = ["internal/throw_delegate.cc"],
> hdrs = ["internal/throw_delegate.h"],
> copts = ABSL_DEFAULT_COPTS + ABSL_EXCEPTIONS_FLAG,
>+ linkopts = ABSL_EXCEPTIONS_FLAG_LINKOPTS,
> visibility = [
> "//absl:__subpackages__",
> ],
>@@ -193,6 +196,7 @@ cc_test(
> name = "throw_delegate_test",
> srcs = ["throw_delegate_test.cc"],
> copts = ABSL_TEST_COPTS + ABSL_EXCEPTIONS_FLAG,
>+ linkopts = ABSL_EXCEPTIONS_FLAG_LINKOPTS,
> deps = [
> ":throw_delegate",
> "@com_google_googletest//:gtest_main",
> ],
>@@ -225,6 +229,7 @@ cc_library(
> srcs = ["internal/exception_safety_testing.cc"],
> hdrs = ["internal/exception_safety_testing.h"],
> copts = ABSL_TEST_COPTS + ABSL_EXCEPTIONS_FLAG,
>+ linkopts = ABSL_EXCEPTIONS_FLAG_LINKOPTS,
> deps = [
> ":base",
> ":config",
>@@ -241,6 +246,7 @@ cc_test(
> name = "exception_safety_testing_test",
> srcs = ["exception_safety_testing_test.cc"],
> copts = ABSL_TEST_COPTS + ABSL_EXCEPTIONS_FLAG,
>+ linkopts = ABSL_EXCEPTIONS_FLAG_LINKOPTS,
> deps = [
> ":exception_safety_testing",
> "//absl/memory",
>@@ -299,6 +305,7 @@ cc_test(
> size = "medium",
> srcs = ["spinlock_test_common.cc"],
> copts = ABSL_TEST_COPTS,
>+ tags = ["no_test_wasm"],
> deps = [
> ":base",
> ":core_headers",
>@@ -337,6 +344,9 @@ cc_test(
> name = "config_test",
> srcs = ["config_test.cc"],
> copts = ABSL_TEST_COPTS,
>+ tags = [
>+ "no_test_wasm",
>+ ],
> deps = [
> ":config",
> "//absl/synchronization:thread_pool",
>@@ -348,6 +358,9 @@ cc_test(
> name = "call_once_test",
> srcs = ["call_once_test.cc"],
> copts = ABSL_TEST_COPTS,
>+ tags = [
>+ "no_test_wasm",
>+ ],
> deps = [
> ":base",
> ":core_headers",
>@@ -401,6 +414,9 @@ cc_test(
> "//absl:windows": [],
> "//conditions:default": ["-pthread"],
> }),
>+ tags = [
>+ "no_test_wasm",
>+ ],
> deps = [
> ":base",
> ":core_headers",
>@@ -421,3 +437,23 @@ cc_test(
> "@com_github_google_benchmark//:benchmark_main",
> ],
> )
>+
>+cc_library(
>+ name = "bits",
>+ hdrs = ["internal/bits.h"],
>+ visibility = [
>+ "//absl:__subpackages__",
>+ ],
>+ deps = [":core_headers"],
>+)
>+
>+cc_test(
>+ name = "bits_test",
>+ size = "small",
>+ srcs = ["internal/bits_test.cc"],
>+ copts = ABSL_TEST_COPTS,
>+ deps = [
>+ ":bits",
>+ "@com_google_googletest//:gtest_main",
>+ ],
>+)
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/BUILD.gn
>index a714656462290bcd40cd4999ca1fef20c8637e7d..6c540f3b2bb29ca96e45696a1cb335ccb4a0b1bf 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/BUILD.gn
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/BUILD.gn
>@@ -23,6 +23,7 @@ source_set("spinlock_wait") {
> public_configs = [ "//third_party/abseil-cpp:absl_include_config" ]
> sources = [
> "internal/spinlock_akaros.inc",
>+ "internal/spinlock_linux.inc",
> "internal/spinlock_posix.inc",
> "internal/spinlock_wait.cc",
> "internal/spinlock_win32.inc",
>@@ -71,6 +72,7 @@ source_set("dynamic_annotations") {
> public = [
> "dynamic_annotations.h",
> ]
>+
> # Abseil's dynamic annotations are only visible inside Abseil because
> # their usage is deprecated in Chromium (see README.chromium for more info).
> visibility = []
>@@ -296,3 +298,19 @@ source_set("endian") {
> ":core_headers",
> ]
> }
>+
>+source_set("bits") {
>+ configs -= [ "//build/config/compiler:chromium_code" ]
>+ configs += [
>+ "//build/config/compiler:no_chromium_code",
>+ "//third_party/abseil-cpp:absl_default_cflags_cc",
>+ ]
>+ public = [
>+ "internal/bits.h",
>+ ]
>+ deps = [
>+ ":core_headers",
>+ ]
>+ visibility = []
>+ visibility += [ "../*" ]
>+}
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/CMakeLists.txt b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/CMakeLists.txt
>index 01d2af085f58e15f3e2639c516cf66d46d121bed..04a6eb31955b602202ad597ba2b34ec04bc7e557 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/CMakeLists.txt
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/CMakeLists.txt
>@@ -31,6 +31,7 @@ list(APPEND BASE_PUBLIC_HEADERS
>
> list(APPEND BASE_INTERNAL_HEADERS
> "internal/atomic_hook.h"
>+ "internal/bits.h"
> "internal/cycleclock.h"
> "internal/direct_mmap.h"
> "internal/endian.h"
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/attributes.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/attributes.h
>index b1883b6d752cc15cd76e079240d0857ce18e8d0a..e8500224443bce5bf65ba5ce1fcd00d8c7044c8f 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/attributes.h
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/attributes.h
>@@ -100,7 +100,7 @@
> // ABSL_PRINTF_ATTRIBUTE
> // ABSL_SCANF_ATTRIBUTE
> //
>-// Tells the compiler to perform `printf` format std::string checking if the
>+// Tells the compiler to perform `printf` format string checking if the
> // compiler supports it; see the 'format' attribute in
> // <http://gcc.gnu.org/onlinedocs/gcc-4.7.0/gcc/Function-Attributes.html>.
> //
>@@ -494,14 +494,27 @@
> #define ABSL_XRAY_LOG_ARGS(N)
> #endif
>
>+// ABSL_ATTRIBUTE_REINITIALIZES
>+//
>+// Indicates that a member function reinitializes the entire object to a known
>+// state, independent of the previous state of the object.
>+//
>+// The clang-tidy check bugprone-use-after-move allows member functions marked
>+// with this attribute to be called on objects that have been moved from;
>+// without the attribute, this would result in a use-after-move warning.
>+#if ABSL_HAVE_CPP_ATTRIBUTE(clang::reinitializes)
>+#define ABSL_ATTRIBUTE_REINITIALIZES [[clang::reinitializes]]
>+#else
>+#define ABSL_ATTRIBUTE_REINITIALIZES
>+#endif
>+
> // -----------------------------------------------------------------------------
> // Variable Attributes
> // -----------------------------------------------------------------------------
>
> // ABSL_ATTRIBUTE_UNUSED
> //
>-// Prevents the compiler from complaining about or optimizing away variables
>-// that appear unused.
>+// Prevents the compiler from complaining about variables that appear unused.
> #if ABSL_HAVE_ATTRIBUTE(unused) || (defined(__GNUC__) && !defined(__clang__))
> #undef ABSL_ATTRIBUTE_UNUSED
> #define ABSL_ATTRIBUTE_UNUSED __attribute__((__unused__))
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/config.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/config.h
>index 6890e313bf27af7e5e8a5c051325013e27215a0e..d4eb7d0cbd76b718f428979dce755d735afcb00d 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/config.h
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/config.h
>@@ -199,7 +199,7 @@
> #define ABSL_HAVE_INTRINSIC_INT128 1
> #elif defined(__CUDACC__)
> // __CUDACC_VER__ is a full version number before CUDA 9, and is defined to a
>-// std::string explaining that it has been removed starting with CUDA 9. We use
>+// string explaining that it has been removed starting with CUDA 9. We use
> // nested #ifs because there is no short-circuiting in the preprocessor.
> // NOTE: `__CUDACC__` could be undefined while `__CUDACC_VER__` is defined.
> #if __CUDACC_VER__ >= 70000
>@@ -414,14 +414,13 @@
> // <string_view>, <variant> is implemented) or higher. Also, `__cplusplus` is
> // not correctly set by MSVC, so we use `_MSVC_LANG` to check the language
> // version.
>-// TODO(zhangxy): fix tests before enabling aliasing for `std::any`,
>-// `std::string_view`.
>+// TODO(zhangxy): fix tests before enabling aliasing for `std::any`.
> #if defined(_MSC_VER) && _MSC_VER >= 1910 && \
> ((defined(_MSVC_LANG) && _MSVC_LANG > 201402) || __cplusplus > 201402)
> // #define ABSL_HAVE_STD_ANY 1
> #define ABSL_HAVE_STD_OPTIONAL 1
> #define ABSL_HAVE_STD_VARIANT 1
>-// #define ABSL_HAVE_STD_STRING_VIEW 1
>+#define ABSL_HAVE_STD_STRING_VIEW 1
> #endif
>
> #endif // ABSL_BASE_CONFIG_H_
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/exception_safety_testing_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/exception_safety_testing_test.cc
>index 97c8d6f831894a880a984a1d0e090e04877282ee..106bc34b00f7ce27a7b43752ad710699fbc455f2 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/exception_safety_testing_test.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/exception_safety_testing_test.cc
>@@ -38,7 +38,7 @@ template <typename F>
> void ExpectNoThrow(const F& f) {
> try {
> f();
>- } catch (TestException e) {
>+ } catch (const TestException& e) {
> ADD_FAILURE() << "Unexpected exception thrown from " << e.what();
> }
> }
>@@ -179,7 +179,7 @@ TEST(ThrowingValueTest, ThrowingStreamOps) {
> }
>
> // Tests the operator<< of ThrowingValue by forcing ConstructorTracker to emit
>-// a nonfatal failure that contains the std::string representation of the Thrower
>+// a nonfatal failure that contains the string representation of the Thrower
> TEST(ThrowingValueTest, StreamOpsOutput) {
> using ::testing::TypeSpec;
> exceptions_internal::ConstructorTracker ct(exceptions_internal::countdown);
>@@ -548,21 +548,21 @@ TEST(ExceptionSafetyTesterTest, IncompleteTypesAreNotTestable) {
> // Test that providing operation and inveriants still does not allow for the
> // the invocation of .Test() and .Test(op) because it lacks a factory
> auto without_fac =
>- testing::MakeExceptionSafetyTester().WithOperation(op).WithInvariants(
>+ testing::MakeExceptionSafetyTester().WithOperation(op).WithContracts(
> inv, testing::strong_guarantee);
> EXPECT_FALSE(HasNullaryTest(without_fac));
> EXPECT_FALSE(HasUnaryTest(without_fac));
>
>- // Test that providing invariants and factory allows the invocation of
>+ // Test that providing contracts and factory allows the invocation of
> // .Test(op) but does not allow for .Test() because it lacks an operation
> auto without_op = testing::MakeExceptionSafetyTester()
>- .WithInvariants(inv, testing::strong_guarantee)
>+ .WithContracts(inv, testing::strong_guarantee)
> .WithFactory(fac);
> EXPECT_FALSE(HasNullaryTest(without_op));
> EXPECT_TRUE(HasUnaryTest(without_op));
>
> // Test that providing operation and factory still does not allow for the
>- // the invocation of .Test() and .Test(op) because it lacks invariants
>+ // the invocation of .Test() and .Test(op) because it lacks contracts
> auto without_inv =
> testing::MakeExceptionSafetyTester().WithOperation(op).WithFactory(fac);
> EXPECT_FALSE(HasNullaryTest(without_inv));
>@@ -577,7 +577,7 @@ std::unique_ptr<ExampleStruct> ExampleFunctionFactory() {
>
> void ExampleFunctionOperation(ExampleStruct*) {}
>
>-testing::AssertionResult ExampleFunctionInvariant(ExampleStruct*) {
>+testing::AssertionResult ExampleFunctionContract(ExampleStruct*) {
> return testing::AssertionSuccess();
> }
>
>@@ -593,16 +593,16 @@ struct {
>
> struct {
> testing::AssertionResult operator()(ExampleStruct* example_struct) const {
>- return
ExampleFunctionInvariant(example_struct); >+ return ExampleFunctionContract(example_struct); > } >-} example_struct_invariant; >+} example_struct_contract; > > auto example_lambda_factory = []() { return ExampleFunctionFactory(); }; > > auto example_lambda_operation = [](ExampleStruct*) {}; > >-auto example_lambda_invariant = [](ExampleStruct* example_struct) { >- return ExampleFunctionInvariant(example_struct); >+auto example_lambda_contract = [](ExampleStruct* example_struct) { >+ return ExampleFunctionContract(example_struct); > }; > > // Testing that function references, pointers, structs with operator() and >@@ -612,28 +612,28 @@ TEST(ExceptionSafetyTesterTest, MixedFunctionTypes) { > EXPECT_TRUE(testing::MakeExceptionSafetyTester() > .WithFactory(ExampleFunctionFactory) > .WithOperation(ExampleFunctionOperation) >- .WithInvariants(ExampleFunctionInvariant) >+ .WithContracts(ExampleFunctionContract) > .Test()); > > // function pointer > EXPECT_TRUE(testing::MakeExceptionSafetyTester() > .WithFactory(&ExampleFunctionFactory) > .WithOperation(&ExampleFunctionOperation) >- .WithInvariants(&ExampleFunctionInvariant) >+ .WithContracts(&ExampleFunctionContract) > .Test()); > > // struct > EXPECT_TRUE(testing::MakeExceptionSafetyTester() > .WithFactory(example_struct_factory) > .WithOperation(example_struct_operation) >- .WithInvariants(example_struct_invariant) >+ .WithContracts(example_struct_contract) > .Test()); > > // lambda > EXPECT_TRUE(testing::MakeExceptionSafetyTester() > .WithFactory(example_lambda_factory) > .WithOperation(example_lambda_operation) >- .WithInvariants(example_lambda_invariant) >+ .WithContracts(example_lambda_contract) > .Test()); > } > >@@ -658,9 +658,9 @@ struct { > } invoker; > > auto tester = >- testing::MakeExceptionSafetyTester().WithOperation(invoker).WithInvariants( >+ testing::MakeExceptionSafetyTester().WithOperation(invoker).WithContracts( > CheckNonNegativeInvariants); >-auto strong_tester = 
tester.WithInvariants(testing::strong_guarantee); >+auto strong_tester = tester.WithContracts(testing::strong_guarantee); > > struct FailsBasicGuarantee : public NonNegative { > void operator()() { >@@ -690,7 +690,7 @@ TEST(ExceptionCheckTest, StrongGuaranteeFailure) { > EXPECT_FALSE(strong_tester.WithInitialValue(FollowsBasicGuarantee{}).Test()); > } > >-struct BasicGuaranteeWithExtraInvariants : public NonNegative { >+struct BasicGuaranteeWithExtraContracts : public NonNegative { > // After operator(), i is incremented. If operator() throws, i is set to 9999 > void operator()() { > int old_i = i; >@@ -701,21 +701,21 @@ struct BasicGuaranteeWithExtraInvariants : public NonNegative { > > static constexpr int kExceptionSentinel = 9999; > }; >-constexpr int BasicGuaranteeWithExtraInvariants::kExceptionSentinel; >+constexpr int BasicGuaranteeWithExtraContracts::kExceptionSentinel; > >-TEST(ExceptionCheckTest, BasicGuaranteeWithExtraInvariants) { >+TEST(ExceptionCheckTest, BasicGuaranteeWithExtraContracts) { > auto tester_with_val = >- tester.WithInitialValue(BasicGuaranteeWithExtraInvariants{}); >+ tester.WithInitialValue(BasicGuaranteeWithExtraContracts{}); > EXPECT_TRUE(tester_with_val.Test()); > EXPECT_TRUE( > tester_with_val >- .WithInvariants([](BasicGuaranteeWithExtraInvariants* o) { >- if (o->i == BasicGuaranteeWithExtraInvariants::kExceptionSentinel) { >+ .WithContracts([](BasicGuaranteeWithExtraContracts* o) { >+ if (o->i == BasicGuaranteeWithExtraContracts::kExceptionSentinel) { > return testing::AssertionSuccess(); > } > return testing::AssertionFailure() > << "i should be " >- << BasicGuaranteeWithExtraInvariants::kExceptionSentinel >+ << BasicGuaranteeWithExtraContracts::kExceptionSentinel > << ", but is " << o->i; > }) > .Test()); >@@ -740,7 +740,7 @@ struct HasReset : public NonNegative { > void reset() { i = 0; } > }; > >-testing::AssertionResult CheckHasResetInvariants(HasReset* h) { >+testing::AssertionResult CheckHasResetContracts(HasReset* h) { > 
h->reset(); > return testing::AssertionResult(h->i == 0); > } >@@ -759,14 +759,14 @@ TEST(ExceptionCheckTest, ModifyingChecker) { > }; > > EXPECT_FALSE(tester.WithInitialValue(FollowsBasicGuarantee{}) >- .WithInvariants(set_to_1000, is_1000) >+ .WithContracts(set_to_1000, is_1000) > .Test()); > EXPECT_TRUE(strong_tester.WithInitialValue(FollowsStrongGuarantee{}) >- .WithInvariants(increment) >+ .WithContracts(increment) > .Test()); > EXPECT_TRUE(testing::MakeExceptionSafetyTester() > .WithInitialValue(HasReset{}) >- .WithInvariants(CheckHasResetInvariants) >+ .WithContracts(CheckHasResetContracts) > .Test(invoker)); > } > >@@ -799,7 +799,7 @@ TEST(ExceptionCheckTest, NonEqualityComparable) { > return testing::AssertionResult(nec->i == NonEqualityComparable().i); > }; > auto strong_nec_tester = tester.WithInitialValue(NonEqualityComparable{}) >- .WithInvariants(nec_is_strong); >+ .WithContracts(nec_is_strong); > > EXPECT_TRUE(strong_nec_tester.Test()); > EXPECT_FALSE(strong_nec_tester.Test( >@@ -833,14 +833,14 @@ struct { > testing::AssertionResult operator()(ExhaustivenessTester<T>*) const { > return testing::AssertionSuccess(); > } >-} CheckExhaustivenessTesterInvariants; >+} CheckExhaustivenessTesterContracts; > > template <typename T> > unsigned char ExhaustivenessTester<T>::successes = 0; > > TEST(ExceptionCheckTest, Exhaustiveness) { > auto exhaust_tester = testing::MakeExceptionSafetyTester() >- .WithInvariants(CheckExhaustivenessTesterInvariants) >+ .WithContracts(CheckExhaustivenessTesterContracts) > .WithOperation(invoker); > > EXPECT_TRUE( >@@ -849,7 +849,7 @@ TEST(ExceptionCheckTest, Exhaustiveness) { > > EXPECT_TRUE( > exhaust_tester.WithInitialValue(ExhaustivenessTester<ThrowingValue<>>{}) >- .WithInvariants(testing::strong_guarantee) >+ .WithContracts(testing::strong_guarantee) > .Test()); > EXPECT_EQ(ExhaustivenessTester<ThrowingValue<>>::successes, 0xF); > } >@@ -931,8 +931,8 @@ TEST(ThrowingValueTraitsTest, RelationalOperators) { > } > > 
TEST(ThrowingAllocatorTraitsTest, Assignablility) { >- EXPECT_TRUE(std::is_move_assignable<ThrowingAllocator<int>>::value); >- EXPECT_TRUE(std::is_copy_assignable<ThrowingAllocator<int>>::value); >+ EXPECT_TRUE(absl::is_move_assignable<ThrowingAllocator<int>>::value); >+ EXPECT_TRUE(absl::is_copy_assignable<ThrowingAllocator<int>>::value); > EXPECT_TRUE(std::is_nothrow_move_assignable<ThrowingAllocator<int>>::value); > EXPECT_TRUE(std::is_nothrow_copy_assignable<ThrowingAllocator<int>>::value); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/bits.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/bits.h >new file mode 100644 >index 0000000000000000000000000000000000000000..bc7faaee3b257edc37cd9090b7777d7035d3c3b8 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/bits.h >@@ -0,0 +1,193 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. >+ >+#ifndef ABSL_BASE_INTERNAL_BITS_H_ >+#define ABSL_BASE_INTERNAL_BITS_H_ >+ >+// This file contains bitwise ops which are implementation details of various >+// absl libraries. >+ >+#include <cstdint> >+ >+// Clang on Windows has __builtin_clzll; otherwise we need to use the >+// windows intrinsic functions. 
>+#if defined(_MSC_VER)
>+#include <intrin.h>
>+#if defined(_M_X64)
>+#pragma intrinsic(_BitScanReverse64)
>+#pragma intrinsic(_BitScanForward64)
>+#endif
>+#pragma intrinsic(_BitScanReverse)
>+#pragma intrinsic(_BitScanForward)
>+#endif
>+
>+#include "absl/base/attributes.h"
>+
>+#if defined(_MSC_VER)
>+// We can achieve something similar to attribute((always_inline)) with MSVC by
>+// using the __forceinline keyword; however, this is not perfect. MSVC is
>+// much less aggressive about inlining, even with the __forceinline keyword.
>+#define ABSL_BASE_INTERNAL_FORCEINLINE __forceinline
>+#else
>+// Use default attribute inline.
>+#define ABSL_BASE_INTERNAL_FORCEINLINE inline ABSL_ATTRIBUTE_ALWAYS_INLINE
>+#endif
>+
>+
>+namespace absl {
>+namespace base_internal {
>+
>+ABSL_BASE_INTERNAL_FORCEINLINE int CountLeadingZeros64Slow(uint64_t n) {
>+ int zeroes = 60;
>+ if (n >> 32) zeroes -= 32, n >>= 32;
>+ if (n >> 16) zeroes -= 16, n >>= 16;
>+ if (n >> 8) zeroes -= 8, n >>= 8;
>+ if (n >> 4) zeroes -= 4, n >>= 4;
>+ return "\4\3\2\2\1\1\1\1\0\0\0\0\0\0\0"[n] + zeroes;
>+}
>+
>+ABSL_BASE_INTERNAL_FORCEINLINE int CountLeadingZeros64(uint64_t n) {
>+#if defined(_MSC_VER) && defined(_M_X64)
>+ // MSVC does not have __builtin_clzll. Use _BitScanReverse64.
>+ unsigned long result = 0; // NOLINT(runtime/int)
>+ if (_BitScanReverse64(&result, n)) {
>+ return 63 - result;
>+ }
>+ return 64;
>+#elif defined(_MSC_VER)
>+ // MSVC does not have __builtin_clzll. 
Compose two calls to _BitScanReverse >+ unsigned long result = 0; // NOLINT(runtime/int) >+ if ((n >> 32) && _BitScanReverse(&result, n >> 32)) { >+ return 31 - result; >+ } >+ if (_BitScanReverse(&result, n)) { >+ return 63 - result; >+ } >+ return 64; >+#elif defined(__GNUC__) >+ // Use __builtin_clzll, which uses the following instructions: >+ // x86: bsr >+ // ARM64: clz >+ // PPC: cntlzd >+ static_assert(sizeof(unsigned long long) == sizeof(n), // NOLINT(runtime/int) >+ "__builtin_clzll does not take 64-bit arg"); >+ >+ // Handle 0 as a special case because __builtin_clzll(0) is undefined. >+ if (n == 0) { >+ return 64; >+ } >+ return __builtin_clzll(n); >+#else >+ return CountLeadingZeros64Slow(n); >+#endif >+} >+ >+ABSL_BASE_INTERNAL_FORCEINLINE int CountLeadingZeros32Slow(uint64_t n) { >+ int zeroes = 28; >+ if (n >> 16) zeroes -= 16, n >>= 16; >+ if (n >> 8) zeroes -= 8, n >>= 8; >+ if (n >> 4) zeroes -= 4, n >>= 4; >+ return "\4\3\2\2\1\1\1\1\0\0\0\0\0\0\0"[n] + zeroes; >+} >+ >+ABSL_BASE_INTERNAL_FORCEINLINE int CountLeadingZeros32(uint32_t n) { >+#if defined(_MSC_VER) >+ unsigned long result = 0; // NOLINT(runtime/int) >+ if (_BitScanReverse(&result, n)) { >+ return 31 - result; >+ } >+ return 32; >+#elif defined(__GNUC__) >+ // Use __builtin_clz, which uses the following instructions: >+ // x86: bsr >+ // ARM64: clz >+ // PPC: cntlzd >+ static_assert(sizeof(int) == sizeof(n), >+ "__builtin_clz does not take 32-bit arg"); >+ >+ // Handle 0 as a special case because __builtin_clz(0) is undefined. 
>+ if (n == 0) { >+ return 32; >+ } >+ return __builtin_clz(n); >+#else >+ return CountLeadingZeros32Slow(n); >+#endif >+} >+ >+ABSL_BASE_INTERNAL_FORCEINLINE int CountTrailingZerosNonZero64Slow(uint64_t n) { >+ int c = 63; >+ n &= ~n + 1; >+ if (n & 0x00000000FFFFFFFF) c -= 32; >+ if (n & 0x0000FFFF0000FFFF) c -= 16; >+ if (n & 0x00FF00FF00FF00FF) c -= 8; >+ if (n & 0x0F0F0F0F0F0F0F0F) c -= 4; >+ if (n & 0x3333333333333333) c -= 2; >+ if (n & 0x5555555555555555) c -= 1; >+ return c; >+} >+ >+ABSL_BASE_INTERNAL_FORCEINLINE int CountTrailingZerosNonZero64(uint64_t n) { >+#if defined(_MSC_VER) && defined(_M_X64) >+ unsigned long result = 0; // NOLINT(runtime/int) >+ _BitScanForward64(&result, n); >+ return result; >+#elif defined(_MSC_VER) >+ unsigned long result = 0; // NOLINT(runtime/int) >+ if (static_cast<uint32_t>(n) == 0) { >+ _BitScanForward(&result, n >> 32); >+ return result + 32; >+ } >+ _BitScanForward(&result, n); >+ return result; >+#elif defined(__GNUC__) >+ static_assert(sizeof(unsigned long long) == sizeof(n), // NOLINT(runtime/int) >+ "__builtin_ctzll does not take 64-bit arg"); >+ return __builtin_ctzll(n); >+#else >+ return CountTrailingZerosNonZero64Slow(n); >+#endif >+} >+ >+ABSL_BASE_INTERNAL_FORCEINLINE int CountTrailingZerosNonZero32Slow(uint32_t n) { >+ int c = 31; >+ n &= ~n + 1; >+ if (n & 0x0000FFFF) c -= 16; >+ if (n & 0x00FF00FF) c -= 8; >+ if (n & 0x0F0F0F0F) c -= 4; >+ if (n & 0x33333333) c -= 2; >+ if (n & 0x55555555) c -= 1; >+ return c; >+} >+ >+ABSL_BASE_INTERNAL_FORCEINLINE int CountTrailingZerosNonZero32(uint32_t n) { >+#if defined(_MSC_VER) >+ unsigned long result = 0; // NOLINT(runtime/int) >+ _BitScanForward(&result, n); >+ return result; >+#elif defined(__GNUC__) >+ static_assert(sizeof(int) == sizeof(n), >+ "__builtin_ctz does not take 32-bit arg"); >+ return __builtin_ctz(n); >+#else >+ return CountTrailingZerosNonZero32Slow(n); >+#endif >+} >+ >+#undef ABSL_BASE_INTERNAL_FORCEINLINE >+ >+} // namespace base_internal >+} // 
namespace absl >+ >+#endif // ABSL_BASE_INTERNAL_BITS_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/bits_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/bits_test.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..e5d991d6725874aa9db803b15117817aa3086b51 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/bits_test.cc >@@ -0,0 +1,97 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. 
>+ >+#include "absl/base/internal/bits.h" >+ >+#include "gtest/gtest.h" >+ >+namespace { >+ >+int CLZ64(uint64_t n) { >+ int fast = absl::base_internal::CountLeadingZeros64(n); >+ int slow = absl::base_internal::CountLeadingZeros64Slow(n); >+ EXPECT_EQ(fast, slow) << n; >+ return fast; >+} >+ >+TEST(BitsTest, CountLeadingZeros64) { >+ EXPECT_EQ(64, CLZ64(uint64_t{})); >+ EXPECT_EQ(0, CLZ64(~uint64_t{})); >+ >+ for (int index = 0; index < 64; index++) { >+ uint64_t x = static_cast<uint64_t>(1) << index; >+ const auto cnt = 63 - index; >+ ASSERT_EQ(cnt, CLZ64(x)) << index; >+ ASSERT_EQ(cnt, CLZ64(x + x - 1)) << index; >+ } >+} >+ >+int CLZ32(uint32_t n) { >+ int fast = absl::base_internal::CountLeadingZeros32(n); >+ int slow = absl::base_internal::CountLeadingZeros32Slow(n); >+ EXPECT_EQ(fast, slow) << n; >+ return fast; >+} >+ >+TEST(BitsTest, CountLeadingZeros32) { >+ EXPECT_EQ(32, CLZ32(uint32_t{})); >+ EXPECT_EQ(0, CLZ32(~uint32_t{})); >+ >+ for (int index = 0; index < 32; index++) { >+ uint32_t x = static_cast<uint32_t>(1) << index; >+ const auto cnt = 31 - index; >+ ASSERT_EQ(cnt, CLZ32(x)) << index; >+ ASSERT_EQ(cnt, CLZ32(x + x - 1)) << index; >+ ASSERT_EQ(CLZ64(x), CLZ32(x) + 32); >+ } >+} >+ >+int CTZ64(uint64_t n) { >+ int fast = absl::base_internal::CountTrailingZerosNonZero64(n); >+ int slow = absl::base_internal::CountTrailingZerosNonZero64Slow(n); >+ EXPECT_EQ(fast, slow) << n; >+ return fast; >+} >+ >+TEST(BitsTest, CountTrailingZerosNonZero64) { >+ EXPECT_EQ(0, CTZ64(~uint64_t{})); >+ >+ for (int index = 0; index < 64; index++) { >+ uint64_t x = static_cast<uint64_t>(1) << index; >+ const auto cnt = index; >+ ASSERT_EQ(cnt, CTZ64(x)) << index; >+ ASSERT_EQ(cnt, CTZ64(~(x - 1))) << index; >+ } >+} >+ >+int CTZ32(uint32_t n) { >+ int fast = absl::base_internal::CountTrailingZerosNonZero32(n); >+ int slow = absl::base_internal::CountTrailingZerosNonZero32Slow(n); >+ EXPECT_EQ(fast, slow) << n; >+ return fast; >+} >+ >+TEST(BitsTest, 
CountTrailingZerosNonZero32) { >+ EXPECT_EQ(0, CTZ32(~uint32_t{})); >+ >+ for (int index = 0; index < 32; index++) { >+ uint32_t x = static_cast<uint32_t>(1) << index; >+ const auto cnt = index; >+ ASSERT_EQ(cnt, CTZ32(x)) << index; >+ ASSERT_EQ(cnt, CTZ32(~(x - 1))) << index; >+ } >+} >+ >+ >+} // namespace >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/exception_safety_testing.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/exception_safety_testing.h >index 8c2f5093fc4da828afa82342be44995d18a3bbe7..5665a1b67db2ec4ca2ad80ffa06fe6e9cfc491b9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/exception_safety_testing.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/exception_safety_testing.h >@@ -191,19 +191,19 @@ class TrackedObject { > ~TrackedObject() noexcept { ConstructorTracker::ObjectDestructed(this); } > }; > >-template <typename Factory, typename Operation, typename Invariant> >-absl::optional<testing::AssertionResult> TestSingleInvariantAtCountdownImpl( >+template <typename Factory, typename Operation, typename Contract> >+absl::optional<testing::AssertionResult> TestSingleContractAtCountdownImpl( > const Factory& factory, const Operation& operation, int count, >- const Invariant& invariant) { >+ const Contract& contract) { > auto t_ptr = factory(); > absl::optional<testing::AssertionResult> current_res; > SetCountdown(count); > try { > operation(t_ptr.get()); > } catch (const exceptions_internal::TestException& e) { >- current_res.emplace(invariant(t_ptr.get())); >+ current_res.emplace(contract(t_ptr.get())); > if (!current_res.value()) { >- *current_res << e.what() << " failed invariant check"; >+ *current_res << e.what() << " failed contract check"; > } > } > UnsetCountdown(); >@@ -211,22 +211,22 @@ absl::optional<testing::AssertionResult> TestSingleInvariantAtCountdownImpl( > } > > template <typename 
Factory, typename Operation> >-absl::optional<testing::AssertionResult> TestSingleInvariantAtCountdownImpl( >+absl::optional<testing::AssertionResult> TestSingleContractAtCountdownImpl( > const Factory& factory, const Operation& operation, int count, > StrongGuaranteeTagType) { > using TPtr = typename decltype(factory())::pointer; > auto t_is_strong = [&](TPtr t) { return *t == *factory(); }; >- return TestSingleInvariantAtCountdownImpl(factory, operation, count, >- t_is_strong); >+ return TestSingleContractAtCountdownImpl(factory, operation, count, >+ t_is_strong); > } > >-template <typename Factory, typename Operation, typename Invariant> >-int TestSingleInvariantAtCountdown( >+template <typename Factory, typename Operation, typename Contract> >+int TestSingleContractAtCountdown( > const Factory& factory, const Operation& operation, int count, >- const Invariant& invariant, >+ const Contract& contract, > absl::optional<testing::AssertionResult>* reduced_res) { > // If reduced_res is empty, it means the current call to >- // TestSingleInvariantAtCountdown(...) is the first test being run so we do >+ // TestSingleContractAtCountdown(...) is the first test being run so we do > // want to run it. Alternatively, if it's not empty (meaning a previous test > // has run) we want to check if it passed. If the previous test did pass, we > // want to contine running tests so we do want to run the current one. If it >@@ -234,22 +234,22 @@ int TestSingleInvariantAtCountdown( > // output. If that's the case, we do not run the current test and instead we > // simply return. > if (!reduced_res->has_value() || reduced_res->value()) { >- *reduced_res = TestSingleInvariantAtCountdownImpl(factory, operation, count, >- invariant); >+ *reduced_res = >+ TestSingleContractAtCountdownImpl(factory, operation, count, contract); > } > return 0; > } > >-template <typename Factory, typename Operation, typename... 
Invariants> >-inline absl::optional<testing::AssertionResult> TestAllInvariantsAtCountdown( >+template <typename Factory, typename Operation, typename... Contracts> >+inline absl::optional<testing::AssertionResult> TestAllContractsAtCountdown( > const Factory& factory, const Operation& operation, int count, >- const Invariants&... invariants) { >+ const Contracts&... contracts) { > absl::optional<testing::AssertionResult> reduced_res; > > // Run each checker, short circuiting after the first failure > int dummy[] = { >- 0, (TestSingleInvariantAtCountdown(factory, operation, count, invariants, >- &reduced_res))...}; >+ 0, (TestSingleContractAtCountdown(factory, operation, count, contracts, >+ &reduced_res))...}; > static_cast<void>(dummy); > return reduced_res; > } >@@ -858,7 +858,7 @@ testing::AssertionResult TestNothrowOp(const Operation& operation) { > try { > operation(); > return testing::AssertionSuccess(); >- } catch (exceptions_internal::TestException) { >+ } catch (const exceptions_internal::TestException&) { > return testing::AssertionFailure() > << "TestException thrown during call to operation() when nothrow " > "guarantee was expected."; >@@ -884,15 +884,15 @@ class DefaultFactory { > T t_; > }; > >-template <size_t LazyInvariantsCount, typename LazyFactory, >+template <size_t LazyContractsCount, typename LazyFactory, > typename LazyOperation> > using EnableIfTestable = typename absl::enable_if_t< >- LazyInvariantsCount != 0 && >+ LazyContractsCount != 0 && > !std::is_same<LazyFactory, UninitializedT>::value && > !std::is_same<LazyOperation, UninitializedT>::value>; > > template <typename Factory = UninitializedT, >- typename Operation = UninitializedT, typename... Invariants> >+ typename Operation = UninitializedT, typename... 
Contracts> > class ExceptionSafetyTester; > > } // namespace exceptions_internal >@@ -903,7 +903,7 @@ namespace exceptions_internal { > > /* > * Builds a tester object that tests if performing a operation on a T follows >- * exception safety guarantees. Verification is done via invariant assertion >+ * exception safety guarantees. Verification is done via contract assertion > * callbacks applied to T instances post-throw. > * > * Template parameters for ExceptionSafetyTester: >@@ -921,18 +921,18 @@ namespace exceptions_internal { > * fresh T instance so it's free to modify and destroy the T instances as it > * pleases. > * >- * - Invariants...: The invariant assertion callback objects (passed in via >- * tester.WithInvariants(...)) must be invocable with the signature >+ * - Contracts...: The contract assertion callback objects (passed in via >+ * tester.WithContracts(...)) must be invocable with the signature > * `testing::AssertionResult operator()(T*) const` where T is the type being >- * tested. Invariant assertion callbacks are provided T instances post-throw. >- * They must return testing::AssertionSuccess when the type invariants of the >- * provided T instance hold. If the type invariants of the T instance do not >+ * tested. Contract assertion callbacks are provided T instances post-throw. >+ * They must return testing::AssertionSuccess when the type contracts of the >+ * provided T instance hold. If the type contracts of the T instance do not > * hold, they must return testing::AssertionFailure. Execution order of >- * Invariants... is unspecified. They will each individually get a fresh T >+ * Contracts... is unspecified. They will each individually get a fresh T > * instance so they are free to modify and destroy the T instances as they > * please. > */ >-template <typename Factory, typename Operation, typename... Invariants> >+template <typename Factory, typename Operation, typename... 
Contracts> > class ExceptionSafetyTester { > public: > /* >@@ -948,7 +948,7 @@ class ExceptionSafetyTester { > * tester.WithFactory(...). > */ > template <typename T> >- ExceptionSafetyTester<DefaultFactory<T>, Operation, Invariants...> >+ ExceptionSafetyTester<DefaultFactory<T>, Operation, Contracts...> > WithInitialValue(const T& t) const { > return WithFactory(DefaultFactory<T>(t)); > } >@@ -961,9 +961,9 @@ class ExceptionSafetyTester { > * method tester.WithInitialValue(...). > */ > template <typename NewFactory> >- ExceptionSafetyTester<absl::decay_t<NewFactory>, Operation, Invariants...> >+ ExceptionSafetyTester<absl::decay_t<NewFactory>, Operation, Contracts...> > WithFactory(const NewFactory& new_factory) const { >- return {new_factory, operation_, invariants_}; >+ return {new_factory, operation_, contracts_}; > } > > /* >@@ -972,39 +972,39 @@ class ExceptionSafetyTester { > * tester. > */ > template <typename NewOperation> >- ExceptionSafetyTester<Factory, absl::decay_t<NewOperation>, Invariants...> >+ ExceptionSafetyTester<Factory, absl::decay_t<NewOperation>, Contracts...> > WithOperation(const NewOperation& new_operation) const { >- return {factory_, new_operation, invariants_}; >+ return {factory_, new_operation, contracts_}; > } > > /* >- * Returns a new ExceptionSafetyTester with the provided MoreInvariants... >- * combined with the Invariants... that were already included in the instance >- * on which the method was called. Invariants... cannot be removed or replaced >+ * Returns a new ExceptionSafetyTester with the provided MoreContracts... >+ * combined with the Contracts... that were already included in the instance >+ * on which the method was called. Contracts... cannot be removed or replaced > * once added to an ExceptionSafetyTester instance. A fresh object must be >- * created in order to get an empty Invariants... list. >+ * created in order to get an empty Contracts... list. 
> * >- * In addition to passing in custom invariant assertion callbacks, this method >+ * In addition to passing in custom contract assertion callbacks, this method > * accepts `testing::strong_guarantee` as an argument which checks T instances > * post-throw against freshly created T instances via operator== to verify > * that any state changes made during the execution of the operation were > * properly rolled back. > */ >- template <typename... MoreInvariants> >- ExceptionSafetyTester<Factory, Operation, Invariants..., >- absl::decay_t<MoreInvariants>...> >- WithInvariants(const MoreInvariants&... more_invariants) const { >- return {factory_, operation_, >- std::tuple_cat(invariants_, >- std::tuple<absl::decay_t<MoreInvariants>...>( >- more_invariants...))}; >+ template <typename... MoreContracts> >+ ExceptionSafetyTester<Factory, Operation, Contracts..., >+ absl::decay_t<MoreContracts>...> >+ WithContracts(const MoreContracts&... more_contracts) const { >+ return { >+ factory_, operation_, >+ std::tuple_cat(contracts_, std::tuple<absl::decay_t<MoreContracts>...>( >+ more_contracts...))}; > } > > /* > * Returns a testing::AssertionResult that is the reduced result of the > * exception safety algorithm. The algorithm short circuits and returns >- * AssertionFailure after the first invariant callback returns an >- * AssertionFailure. Otherwise, if all invariant callbacks return an >+ * AssertionFailure after the first contract callback returns an >+ * AssertionFailure. Otherwise, if all contract callbacks return an > * AssertionSuccess, the reduced result is AssertionSuccess. 
> * > * The passed-in testable operation will not be saved in a new tester instance >@@ -1013,33 +1013,33 @@ class ExceptionSafetyTester { > * > * Preconditions for tester.Test(const NewOperation& new_operation): > * >- * - May only be called after at least one invariant assertion callback and a >+ * - May only be called after at least one contract assertion callback and a > * factory or initial value have been provided. > */ > template < > typename NewOperation, >- typename = EnableIfTestable<sizeof...(Invariants), Factory, NewOperation>> >+ typename = EnableIfTestable<sizeof...(Contracts), Factory, NewOperation>> > testing::AssertionResult Test(const NewOperation& new_operation) const { >- return TestImpl(new_operation, absl::index_sequence_for<Invariants...>()); >+ return TestImpl(new_operation, absl::index_sequence_for<Contracts...>()); > } > > /* > * Returns a testing::AssertionResult that is the reduced result of the > * exception safety algorithm. The algorithm short circuits and returns >- * AssertionFailure after the first invariant callback returns an >- * AssertionFailure. Otherwise, if all invariant callbacks return an >+ * AssertionFailure after the first contract callback returns an >+ * AssertionFailure. Otherwise, if all contract callbacks return an > * AssertionSuccess, the reduced result is AssertionSuccess. > * > * Preconditions for tester.Test(): > * >- * - May only be called after at least one invariant assertion callback, a >+ * - May only be called after at least one contract assertion callback, a > * factory or initial value and a testable operation have been provided. 
> */ >- template <typename LazyOperation = Operation, >- typename = >- EnableIfTestable<sizeof...(Invariants), Factory, LazyOperation>> >+ template < >+ typename LazyOperation = Operation, >+ typename = EnableIfTestable<sizeof...(Contracts), Factory, LazyOperation>> > testing::AssertionResult Test() const { >- return TestImpl(operation_, absl::index_sequence_for<Invariants...>()); >+ return TestImpl(operation_, absl::index_sequence_for<Contracts...>()); > } > > private: >@@ -1051,8 +1051,8 @@ class ExceptionSafetyTester { > ExceptionSafetyTester() {} > > ExceptionSafetyTester(const Factory& f, const Operation& o, >- const std::tuple<Invariants...>& i) >- : factory_(f), operation_(o), invariants_(i) {} >+ const std::tuple<Contracts...>& i) >+ : factory_(f), operation_(o), contracts_(i) {} > > template <typename SelectedOperation, size_t... Indices> > testing::AssertionResult TestImpl(const SelectedOperation& selected_operation, >@@ -1064,28 +1064,28 @@ class ExceptionSafetyTester { > > // Run the full exception safety test algorithm for the current countdown > auto reduced_res = >- TestAllInvariantsAtCountdown(factory_, selected_operation, count, >- std::get<Indices>(invariants_)...); >- // If there is no value in the optional, no invariants were run because no >+ TestAllContractsAtCountdown(factory_, selected_operation, count, >+ std::get<Indices>(contracts_)...); >+ // If there is no value in the optional, no contracts were run because no > // exception was thrown. This means that the test is complete and the loop > // can exit successfully. > if (!reduced_res.has_value()) { > return testing::AssertionSuccess(); > } >- // If the optional is not empty and the value is falsy, an invariant check >+ // If the optional is not empty and the value is falsy, an contract check > // failed so the test must exit to propegate the failure. 
> if (!reduced_res.value()) { > return reduced_res.value(); > } > // If the optional is not empty and the value is not falsy, it means >- // exceptions were thrown but the invariants passed so the test must >+ // exceptions were thrown but the contracts passed so the test must > // continue to run. > } > } > > Factory factory_; > Operation operation_; >- std::tuple<Invariants...> invariants_; >+ std::tuple<Contracts...> contracts_; > }; > > } // namespace exceptions_internal >@@ -1096,7 +1096,7 @@ class ExceptionSafetyTester { > * instances of ExceptionSafetyTester. > * > * In order to test a T for exception safety, a factory for that T, a testable >- * operation, and at least one invariant callback returning an assertion >+ * operation, and at least one contract callback returning an assertion > * result must be applied using the respective methods. > */ > inline exceptions_internal::ExceptionSafetyTester<> >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/low_level_alloc.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/low_level_alloc.cc >index 0626cd5478d97004eb1e9cde215a90e4862d29f1..159f945f5edd07078f8cafcea8b3730a3e8c718d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/low_level_alloc.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/low_level_alloc.cc >@@ -401,16 +401,20 @@ bool LowLevelAlloc::DeleteArena(Arena *arena) { > ABSL_RAW_CHECK(munmap_result != 0, > "LowLevelAlloc::DeleteArena: VitualFree failed"); > #else >+#ifndef ABSL_LOW_LEVEL_ALLOC_ASYNC_SIGNAL_SAFE_MISSING > if ((arena->flags & LowLevelAlloc::kAsyncSignalSafe) == 0) { > munmap_result = munmap(region, size); > } else { > munmap_result = base_internal::DirectMunmap(region, size); > } >+#else >+ munmap_result = munmap(region, size); >+#endif // ABSL_LOW_LEVEL_ALLOC_ASYNC_SIGNAL_SAFE_MISSING > if (munmap_result != 0) { > ABSL_RAW_LOG(FATAL, 
"LowLevelAlloc::DeleteArena: munmap failed: %d", > errno); > } >-#endif >+#endif // _WIN32 > } > section.Leave(); > arena->~Arena(); >@@ -545,6 +549,7 @@ static void *DoAllocWithArena(size_t request, LowLevelAlloc::Arena *arena) { > MEM_RESERVE | MEM_COMMIT, PAGE_READWRITE); > ABSL_RAW_CHECK(new_pages != nullptr, "VirtualAlloc failed"); > #else >+#ifndef ABSL_LOW_LEVEL_ALLOC_ASYNC_SIGNAL_SAFE_MISSING > if ((arena->flags & LowLevelAlloc::kAsyncSignalSafe) != 0) { > new_pages = base_internal::DirectMmap(nullptr, new_pages_size, > PROT_WRITE|PROT_READ, MAP_ANONYMOUS|MAP_PRIVATE, -1, 0); >@@ -552,10 +557,15 @@ static void *DoAllocWithArena(size_t request, LowLevelAlloc::Arena *arena) { > new_pages = mmap(nullptr, new_pages_size, PROT_WRITE | PROT_READ, > MAP_ANONYMOUS | MAP_PRIVATE, -1, 0); > } >+#else >+ new_pages = mmap(nullptr, new_pages_size, PROT_WRITE | PROT_READ, >+ MAP_ANONYMOUS | MAP_PRIVATE, -1, 0); >+#endif // ABSL_LOW_LEVEL_ALLOC_ASYNC_SIGNAL_SAFE_MISSING > if (new_pages == MAP_FAILED) { > ABSL_RAW_LOG(FATAL, "mmap error: %d", errno); > } >-#endif >+ >+#endif // _WIN32 > arena->mu.Lock(); > s = reinterpret_cast<AllocList *>(new_pages); > s->header.size = new_pages_size; >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/low_level_alloc.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/low_level_alloc.h >index 3c15605bed3578899389cc4ea296ceb5ba57a15e..fba9466a757427c01ed121464c3ab112de20c9a6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/low_level_alloc.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/low_level_alloc.h >@@ -39,10 +39,13 @@ > #define ABSL_LOW_LEVEL_ALLOC_MISSING 1 > #endif > >-// Using LowLevelAlloc with kAsyncSignalSafe isn't supported on Windows. >+// Using LowLevelAlloc with kAsyncSignalSafe isn't supported on Windows or >+// asm.js / WebAssembly. 
>+// See https://kripken.github.io/emscripten-site/docs/porting/pthreads.html >+// for more information. > #ifdef ABSL_LOW_LEVEL_ALLOC_ASYNC_SIGNAL_SAFE_MISSING > #error ABSL_LOW_LEVEL_ALLOC_ASYNC_SIGNAL_SAFE_MISSING cannot be directly set >-#elif defined(_WIN32) >+#elif defined(_WIN32) || defined(__asmjs__) || defined(__wasm__) > #define ABSL_LOW_LEVEL_ALLOC_ASYNC_SIGNAL_SAFE_MISSING 1 > #endif > >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/raw_logging.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/raw_logging.h >index 67abfd30798db8fff60b284aa6c92dfe23a45bef..79a7bb9b2f7a62bb1969ceeac01782ce4e9dec5a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/raw_logging.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/raw_logging.h >@@ -114,7 +114,7 @@ void SafeWriteToStderr(const char *s, size_t len); > > // compile-time function to get the "base" filename, that is, the part of > // a filename after the last "/" or "\" path separator. The search starts at >-// the end of the std::string; the second parameter is the length of the std::string. >+// the end of the string; the second parameter is the length of the string. > constexpr const char* Basename(const char* fname, int offset) { > return offset == 0 || fname[offset - 1] == '/' || fname[offset - 1] == '\\' > ? fname + offset >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/spinlock_linux.inc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/spinlock_linux.inc >new file mode 100644 >index 0000000000000000000000000000000000000000..94c861dc6ca23c28ed2082289ba23ef253860101 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/spinlock_linux.inc >@@ -0,0 +1,72 @@ >+// Copyright 2018 The Abseil Authors. 
>+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. >+// >+// This file is a Linux-specific part of spinlock_wait.cc >+ >+#include <linux/futex.h> >+#include <sys/syscall.h> >+#include <unistd.h> >+ >+#include <atomic> >+#include <cerrno> >+#include <climits> >+#include <cstdint> >+#include <ctime> >+ >+#include "absl/base/attributes.h" >+ >+// The SpinLock lockword is `std::atomic<uint32_t>`. Here we assert that >+// `std::atomic<uint32_t>` is bitwise equivalent of the `int` expected >+// by SYS_futex. We also assume that reads/writes done to the lockword >+// by SYS_futex have rational semantics with regard to the >+// std::atomic<> API. C++ provides no guarantees of these assumptions, >+// but they are believed to hold in practice. >+static_assert(sizeof(std::atomic<uint32_t>) == sizeof(int), >+ "SpinLock lockword has the wrong size for a futex"); >+ >+// Some Android headers are missing these definitions even though they >+// support these futex operations. >+#ifdef __BIONIC__ >+#ifndef SYS_futex >+#define SYS_futex __NR_futex >+#endif >+#ifndef FUTEX_PRIVATE_FLAG >+#define FUTEX_PRIVATE_FLAG 128 >+#endif >+#endif >+ >+extern "C" { >+ >+ABSL_ATTRIBUTE_WEAK void AbslInternalSpinLockDelay( >+ std::atomic<uint32_t> *w, uint32_t value, int loop, >+ absl::base_internal::SchedulingMode) { >+ if (loop != 0) { >+ int save_errno = errno; >+ struct timespec tm; >+ tm.tv_sec = 0; >+ // Increase the delay; we expect (but do not rely on) explicit wakeups. 
>+ // We don't rely on explicit wakeups because we intentionally allow for >+ // a race on the kSpinLockSleeper bit. >+ tm.tv_nsec = 16 * absl::base_internal::SpinLockSuggestedDelayNS(loop); >+ syscall(SYS_futex, w, FUTEX_WAIT | FUTEX_PRIVATE_FLAG, value, &tm); >+ errno = save_errno; >+ } >+} >+ >+ABSL_ATTRIBUTE_WEAK void AbslInternalSpinLockWake(std::atomic<uint32_t> *w, >+ bool all) { >+ syscall(SYS_futex, w, FUTEX_WAKE | FUTEX_PRIVATE_FLAG, all ? INT_MAX : 1, 0); >+} >+ >+} // extern "C" >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/spinlock_wait.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/spinlock_wait.cc >index 9f6e9911e1096b2f187cc6a8430544e9fc166333..0fde9c04b73baf57d13499820cbd4471f35fdde2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/spinlock_wait.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/spinlock_wait.cc >@@ -13,7 +13,7 @@ > // limitations under the License. > > // The OS-specific header included below must provide two calls: >-// base::subtle::SpinLockDelay() and base::subtle::SpinLockWake(). >+// AbslInternalSpinLockDelay() and AbslInternalSpinLockWake(). > // See spinlock_wait.h for the specs. 
> > #include <atomic> >@@ -23,6 +23,8 @@ > > #if defined(_WIN32) > #include "absl/base/internal/spinlock_win32.inc" >+#elif defined(__linux__) >+#include "absl/base/internal/spinlock_linux.inc" > #elif defined(__akaros__) > #include "absl/base/internal/spinlock_akaros.inc" > #else >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/thread_identity.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/thread_identity.cc >index 678e8568d7421e5148e9d19cd56e68084a003705..cff9c1b4f4dc39542b2aed944c8c7ed0663fa2cf 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/thread_identity.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/thread_identity.cc >@@ -68,6 +68,14 @@ void SetCurrentThreadIdentity( > // NOTE: Not async-safe. But can be open-coded. > absl::call_once(init_thread_identity_key_once, AllocateThreadIdentityKey, > reclaimer); >+ >+#ifdef __EMSCRIPTEN__ >+ // Emscripten PThread implementation does not support signals. >+ // See https://kripken.github.io/emscripten-site/docs/porting/pthreads.html >+ // for more information. >+ pthread_setspecific(thread_identity_pthread_key, >+ reinterpret_cast<void*>(identity)); >+#else > // We must mask signals around the call to setspecific as with current glibc, > // a concurrent getspecific (needed for GetCurrentThreadIdentityIfPresent()) > // may zero our value. >@@ -81,6 +89,8 @@ void SetCurrentThreadIdentity( > pthread_setspecific(thread_identity_pthread_key, > reinterpret_cast<void*>(identity)); > pthread_sigmask(SIG_SETMASK, &curr_signals, nullptr); >+#endif // !__EMSCRIPTEN__ >+ > #elif ABSL_THREAD_IDENTITY_MODE == ABSL_THREAD_IDENTITY_MODE_USE_TLS > // NOTE: Not async-safe. But can be open-coded. 
> absl::call_once(init_thread_identity_key_once, AllocateThreadIdentityKey, >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/unaligned_access.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/unaligned_access.h >index 5c7517abd71a6bf5250719d1d58cc7a05a906e1a..f9df3b7848c40a974fd9697def8cf7ffbfe029ef 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/unaligned_access.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/internal/unaligned_access.h >@@ -65,6 +65,7 @@ void __sanitizer_unaligned_store64(void *p, uint64_t v); > } // extern "C" > > namespace absl { >+namespace base_internal { > > inline uint16_t UnalignedLoad16(const void *p) { > return __sanitizer_unaligned_load16(p); >@@ -90,22 +91,27 @@ inline void UnalignedStore64(void *p, uint64_t v) { > __sanitizer_unaligned_store64(p, v); > } > >+} // namespace base_internal > } // namespace absl > >-#define ABSL_INTERNAL_UNALIGNED_LOAD16(_p) (absl::UnalignedLoad16(_p)) >-#define ABSL_INTERNAL_UNALIGNED_LOAD32(_p) (absl::UnalignedLoad32(_p)) >-#define ABSL_INTERNAL_UNALIGNED_LOAD64(_p) (absl::UnalignedLoad64(_p)) >+#define ABSL_INTERNAL_UNALIGNED_LOAD16(_p) \ >+ (absl::base_internal::UnalignedLoad16(_p)) >+#define ABSL_INTERNAL_UNALIGNED_LOAD32(_p) \ >+ (absl::base_internal::UnalignedLoad32(_p)) >+#define ABSL_INTERNAL_UNALIGNED_LOAD64(_p) \ >+ (absl::base_internal::UnalignedLoad64(_p)) > > #define ABSL_INTERNAL_UNALIGNED_STORE16(_p, _val) \ >- (absl::UnalignedStore16(_p, _val)) >+ (absl::base_internal::UnalignedStore16(_p, _val)) > #define ABSL_INTERNAL_UNALIGNED_STORE32(_p, _val) \ >- (absl::UnalignedStore32(_p, _val)) >+ (absl::base_internal::UnalignedStore32(_p, _val)) > #define ABSL_INTERNAL_UNALIGNED_STORE64(_p, _val) \ >- (absl::UnalignedStore64(_p, _val)) >+ (absl::base_internal::UnalignedStore64(_p, _val)) > > #elif defined(UNDEFINED_BEHAVIOR_SANITIZER) > > namespace absl 
{ >+namespace base_internal { > > inline uint16_t UnalignedLoad16(const void *p) { > uint16_t t; >@@ -131,18 +137,22 @@ inline void UnalignedStore32(void *p, uint32_t v) { memcpy(p, &v, sizeof v); } > > inline void UnalignedStore64(void *p, uint64_t v) { memcpy(p, &v, sizeof v); } > >+} // namespace base_internal > } // namespace absl > >-#define ABSL_INTERNAL_UNALIGNED_LOAD16(_p) (absl::UnalignedLoad16(_p)) >-#define ABSL_INTERNAL_UNALIGNED_LOAD32(_p) (absl::UnalignedLoad32(_p)) >-#define ABSL_INTERNAL_UNALIGNED_LOAD64(_p) (absl::UnalignedLoad64(_p)) >+#define ABSL_INTERNAL_UNALIGNED_LOAD16(_p) \ >+ (absl::base_internal::UnalignedLoad16(_p)) >+#define ABSL_INTERNAL_UNALIGNED_LOAD32(_p) \ >+ (absl::base_internal::UnalignedLoad32(_p)) >+#define ABSL_INTERNAL_UNALIGNED_LOAD64(_p) \ >+ (absl::base_internal::UnalignedLoad64(_p)) > > #define ABSL_INTERNAL_UNALIGNED_STORE16(_p, _val) \ >- (absl::UnalignedStore16(_p, _val)) >+ (absl::base_internal::UnalignedStore16(_p, _val)) > #define ABSL_INTERNAL_UNALIGNED_STORE32(_p, _val) \ >- (absl::UnalignedStore32(_p, _val)) >+ (absl::base_internal::UnalignedStore32(_p, _val)) > #define ABSL_INTERNAL_UNALIGNED_STORE64(_p, _val) \ >- (absl::UnalignedStore64(_p, _val)) >+ (absl::base_internal::UnalignedStore64(_p, _val)) > > #elif defined(__x86_64__) || defined(_M_X64) || defined(__i386) || \ > defined(_M_IX86) || defined(__ppc__) || defined(__PPC__) || \ >@@ -199,7 +209,7 @@ inline void UnalignedStore64(void *p, uint64_t v) { memcpy(p, &v, sizeof v); } > // so we do that. > > namespace absl { >-namespace internal { >+namespace base_internal { > > struct Unaligned16Struct { > uint16_t value; >@@ -211,22 +221,25 @@ struct Unaligned32Struct { > uint8_t dummy; // To make the size non-power-of-two. 
> } ABSL_ATTRIBUTE_PACKED; > >-} // namespace internal >+} // namespace base_internal > } // namespace absl > >-#define ABSL_INTERNAL_UNALIGNED_LOAD16(_p) \ >- ((reinterpret_cast<const ::absl::internal::Unaligned16Struct *>(_p))->value) >-#define ABSL_INTERNAL_UNALIGNED_LOAD32(_p) \ >- ((reinterpret_cast<const ::absl::internal::Unaligned32Struct *>(_p))->value) >+#define ABSL_INTERNAL_UNALIGNED_LOAD16(_p) \ >+ ((reinterpret_cast<const ::absl::base_internal::Unaligned16Struct *>(_p)) \ >+ ->value) >+#define ABSL_INTERNAL_UNALIGNED_LOAD32(_p) \ >+ ((reinterpret_cast<const ::absl::base_internal::Unaligned32Struct *>(_p)) \ >+ ->value) > >-#define ABSL_INTERNAL_UNALIGNED_STORE16(_p, _val) \ >- ((reinterpret_cast< ::absl::internal::Unaligned16Struct *>(_p))->value = \ >- (_val)) >-#define ABSL_INTERNAL_UNALIGNED_STORE32(_p, _val) \ >- ((reinterpret_cast< ::absl::internal::Unaligned32Struct *>(_p))->value = \ >- (_val)) >+#define ABSL_INTERNAL_UNALIGNED_STORE16(_p, _val) \ >+ ((reinterpret_cast< ::absl::base_internal::Unaligned16Struct *>(_p)) \ >+ ->value = (_val)) >+#define ABSL_INTERNAL_UNALIGNED_STORE32(_p, _val) \ >+ ((reinterpret_cast< ::absl::base_internal::Unaligned32Struct *>(_p)) \ >+ ->value = (_val)) > > namespace absl { >+namespace base_internal { > > inline uint64_t UnalignedLoad64(const void *p) { > uint64_t t; >@@ -236,11 +249,13 @@ inline uint64_t UnalignedLoad64(const void *p) { > > inline void UnalignedStore64(void *p, uint64_t v) { memcpy(p, &v, sizeof v); } > >+} // namespace base_internal > } // namespace absl > >-#define ABSL_INTERNAL_UNALIGNED_LOAD64(_p) (absl::UnalignedLoad64(_p)) >+#define ABSL_INTERNAL_UNALIGNED_LOAD64(_p) \ >+ (absl::base_internal::UnalignedLoad64(_p)) > #define ABSL_INTERNAL_UNALIGNED_STORE64(_p, _val) \ >- (absl::UnalignedStore64(_p, _val)) >+ (absl::base_internal::UnalignedStore64(_p, _val)) > > #else > >@@ -252,6 +267,7 @@ inline void UnalignedStore64(void *p, uint64_t v) { memcpy(p, &v, sizeof v); } > // unaligned loads 
and stores. > > namespace absl { >+namespace base_internal { > > inline uint16_t UnalignedLoad16(const void *p) { > uint16_t t; >@@ -277,18 +293,22 @@ inline void UnalignedStore32(void *p, uint32_t v) { memcpy(p, &v, sizeof v); } > > inline void UnalignedStore64(void *p, uint64_t v) { memcpy(p, &v, sizeof v); } > >+} // namespace base_internal > } // namespace absl > >-#define ABSL_INTERNAL_UNALIGNED_LOAD16(_p) (absl::UnalignedLoad16(_p)) >-#define ABSL_INTERNAL_UNALIGNED_LOAD32(_p) (absl::UnalignedLoad32(_p)) >-#define ABSL_INTERNAL_UNALIGNED_LOAD64(_p) (absl::UnalignedLoad64(_p)) >+#define ABSL_INTERNAL_UNALIGNED_LOAD16(_p) \ >+ (absl::base_internal::UnalignedLoad16(_p)) >+#define ABSL_INTERNAL_UNALIGNED_LOAD32(_p) \ >+ (absl::base_internal::UnalignedLoad32(_p)) >+#define ABSL_INTERNAL_UNALIGNED_LOAD64(_p) \ >+ (absl::base_internal::UnalignedLoad64(_p)) > > #define ABSL_INTERNAL_UNALIGNED_STORE16(_p, _val) \ >- (absl::UnalignedStore16(_p, _val)) >+ (absl::base_internal::UnalignedStore16(_p, _val)) > #define ABSL_INTERNAL_UNALIGNED_STORE32(_p, _val) \ >- (absl::UnalignedStore32(_p, _val)) >+ (absl::base_internal::UnalignedStore32(_p, _val)) > #define ABSL_INTERNAL_UNALIGNED_STORE64(_p, _val) \ >- (absl::UnalignedStore64(_p, _val)) >+ (absl::base_internal::UnalignedStore64(_p, _val)) > > #endif > >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/log_severity.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/log_severity.h >index e2931c34d1df54a7bbf7ed71de858503362c7337..5770d3629e7264689f6029cb6d64285545bf4fd1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/log_severity.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/log_severity.h >@@ -39,7 +39,7 @@ constexpr std::array<absl::LogSeverity, 4> LogSeverities() { > absl::LogSeverity::kError, absl::LogSeverity::kFatal}}; > } > >-// Returns the all-caps std::string representation (e.g. 
"INFO") of the specified >+// Returns the all-caps string representation (e.g. "INFO") of the specified > // severity level if it is one of the normal levels and "UNKNOWN" otherwise. > constexpr const char* LogSeverityName(absl::LogSeverity s) { > return s == absl::LogSeverity::kInfo >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/raw_logging_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/raw_logging_test.cc >index ebbc5db906727f6cb2c03a63e99a897f5ceb5aa4..b21cf651758e3c249262e69c1c8ba17c1c9a7aee 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/raw_logging_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/base/raw_logging_test.cc >@@ -40,7 +40,7 @@ TEST(RawLoggingCompilationTest, PassingCheck) { > } > > // Not all platforms support output from raw log, so we don't verify any >-// particular output for RAW check failures (expecting the empty std::string >+// particular output for RAW check failures (expecting the empty string > // accomplishes this). This test is primarily a compilation test, but we > // are verifying process death when EXPECT_DEATH works for a platform. 
> const char kExpectedDeathOutput[] = ""; >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/BUILD.bazel b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/BUILD.bazel >index 6d5c958f382ba6397f0b0df1cf996678a7a3c5a6..f2210e3c197983fab6034d818154779b71732f2f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/BUILD.bazel >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/BUILD.bazel >@@ -19,6 +19,7 @@ load( > "ABSL_DEFAULT_COPTS", > "ABSL_TEST_COPTS", > "ABSL_EXCEPTIONS_FLAG", >+ "ABSL_EXCEPTIONS_FLAG_LINKOPTS", > ) > > package(default_visibility = ["//visibility:public"]) >@@ -62,9 +63,11 @@ cc_test( > name = "fixed_array_test", > srcs = ["fixed_array_test.cc"], > copts = ABSL_TEST_COPTS + ABSL_EXCEPTIONS_FLAG, >+ linkopts = ABSL_EXCEPTIONS_FLAG_LINKOPTS, > deps = [ > ":fixed_array", > "//absl/base:exception_testing", >+ "//absl/hash:hash_testing", > "//absl/memory", > "@com_google_googletest//:gtest_main", > ], >@@ -77,6 +80,7 @@ cc_test( > deps = [ > ":fixed_array", > "//absl/base:exception_testing", >+ "//absl/hash:hash_testing", > "//absl/memory", > "@com_google_googletest//:gtest_main", > ], >@@ -86,6 +90,7 @@ cc_test( > name = "fixed_array_exception_safety_test", > srcs = ["fixed_array_exception_safety_test.cc"], > copts = ABSL_TEST_COPTS + ABSL_EXCEPTIONS_FLAG, >+ linkopts = ABSL_EXCEPTIONS_FLAG_LINKOPTS, > deps = [ > ":fixed_array", > "//absl/base:exception_safety_testing", >@@ -120,12 +125,14 @@ cc_test( > name = "inlined_vector_test", > srcs = ["inlined_vector_test.cc"], > copts = ABSL_TEST_COPTS + ABSL_EXCEPTIONS_FLAG, >+ linkopts = ABSL_EXCEPTIONS_FLAG_LINKOPTS, > deps = [ > ":inlined_vector", > ":test_instance_tracker", > "//absl/base", > "//absl/base:core_headers", > "//absl/base:exception_testing", >+ "//absl/hash:hash_testing", > "//absl/memory", > "//absl/strings", > "@com_google_googletest//:gtest_main", >@@ -142,6 +149,7 @@ 
cc_test( > "//absl/base", > "//absl/base:core_headers", > "//absl/base:exception_testing", >+ "//absl/hash:hash_testing", > "//absl/memory", > "//absl/strings", > "@com_google_googletest//:gtest_main", >@@ -181,3 +189,459 @@ cc_test( > "@com_google_googletest//:gtest_main", > ], > ) >+ >+NOTEST_TAGS_NONMOBILE = [ >+ "no_test_darwin_x86_64", >+ "no_test_loonix", >+] >+ >+NOTEST_TAGS_MOBILE = [ >+ "no_test_android_arm", >+ "no_test_android_arm64", >+ "no_test_android_x86", >+ "no_test_ios_x86_64", >+] >+ >+NOTEST_TAGS = NOTEST_TAGS_MOBILE + NOTEST_TAGS_NONMOBILE >+ >+cc_library( >+ name = "flat_hash_map", >+ hdrs = ["flat_hash_map.h"], >+ copts = ABSL_DEFAULT_COPTS, >+ deps = [ >+ ":container_memory", >+ ":hash_function_defaults", >+ ":raw_hash_map", >+ "//absl/memory", >+ ], >+) >+ >+cc_test( >+ name = "flat_hash_map_test", >+ srcs = ["flat_hash_map_test.cc"], >+ copts = ABSL_TEST_COPTS + ["-DUNORDERED_MAP_CXX17"], >+ tags = NOTEST_TAGS_NONMOBILE, >+ deps = [ >+ ":flat_hash_map", >+ ":hash_generator_testing", >+ ":unordered_map_constructor_test", >+ ":unordered_map_lookup_test", >+ ":unordered_map_modifiers_test", >+ "//absl/types:any", >+ "@com_google_googletest//:gtest_main", >+ ], >+) >+ >+cc_library( >+ name = "flat_hash_set", >+ hdrs = ["flat_hash_set.h"], >+ copts = ABSL_DEFAULT_COPTS, >+ deps = [ >+ ":container_memory", >+ ":hash_function_defaults", >+ ":raw_hash_set", >+ "//absl/base:core_headers", >+ "//absl/memory", >+ ], >+) >+ >+cc_test( >+ name = "flat_hash_set_test", >+ srcs = ["flat_hash_set_test.cc"], >+ copts = ABSL_TEST_COPTS + ["-DUNORDERED_SET_CXX17"], >+ tags = NOTEST_TAGS_NONMOBILE, >+ deps = [ >+ ":flat_hash_set", >+ ":hash_generator_testing", >+ ":unordered_set_constructor_test", >+ ":unordered_set_lookup_test", >+ ":unordered_set_modifiers_test", >+ "//absl/memory", >+ "//absl/strings", >+ "@com_google_googletest//:gtest_main", >+ ], >+) >+ >+cc_library( >+ name = "node_hash_map", >+ hdrs = ["node_hash_map.h"], >+ copts = ABSL_DEFAULT_COPTS, 
>+ deps = [ >+ ":container_memory", >+ ":hash_function_defaults", >+ ":node_hash_policy", >+ ":raw_hash_map", >+ "//absl/memory", >+ ], >+) >+ >+cc_test( >+ name = "node_hash_map_test", >+ srcs = ["node_hash_map_test.cc"], >+ copts = ABSL_TEST_COPTS + ["-DUNORDERED_MAP_CXX17"], >+ tags = NOTEST_TAGS_NONMOBILE, >+ deps = [ >+ ":hash_generator_testing", >+ ":node_hash_map", >+ ":tracked", >+ ":unordered_map_constructor_test", >+ ":unordered_map_lookup_test", >+ ":unordered_map_modifiers_test", >+ "@com_google_googletest//:gtest_main", >+ ], >+) >+ >+cc_library( >+ name = "node_hash_set", >+ hdrs = ["node_hash_set.h"], >+ copts = ABSL_DEFAULT_COPTS, >+ deps = [ >+ ":container_memory", >+ ":hash_function_defaults", >+ ":node_hash_policy", >+ ":raw_hash_set", >+ "//absl/memory", >+ ], >+) >+ >+cc_test( >+ name = "node_hash_set_test", >+ srcs = ["node_hash_set_test.cc"], >+ copts = ABSL_TEST_COPTS + ["-DUNORDERED_SET_CXX17"], >+ tags = NOTEST_TAGS_NONMOBILE, >+ deps = [ >+ ":hash_generator_testing", >+ ":node_hash_set", >+ ":unordered_set_constructor_test", >+ ":unordered_set_lookup_test", >+ ":unordered_set_modifiers_test", >+ "@com_google_googletest//:gtest_main", >+ ], >+) >+ >+cc_library( >+ name = "container_memory", >+ hdrs = ["internal/container_memory.h"], >+ copts = ABSL_DEFAULT_COPTS, >+ deps = [ >+ "//absl/memory", >+ "//absl/utility", >+ ], >+) >+ >+cc_test( >+ name = "container_memory_test", >+ srcs = ["internal/container_memory_test.cc"], >+ copts = ABSL_TEST_COPTS, >+ tags = NOTEST_TAGS_NONMOBILE, >+ deps = [ >+ ":container_memory", >+ "//absl/strings", >+ "@com_google_googletest//:gtest_main", >+ ], >+) >+ >+cc_library( >+ name = "hash_function_defaults", >+ hdrs = ["internal/hash_function_defaults.h"], >+ copts = ABSL_DEFAULT_COPTS, >+ deps = [ >+ "//absl/base:config", >+ "//absl/hash", >+ "//absl/strings", >+ ], >+) >+ >+cc_test( >+ name = "hash_function_defaults_test", >+ srcs = ["internal/hash_function_defaults_test.cc"], >+ copts = ABSL_TEST_COPTS, 
>+ tags = NOTEST_TAGS, >+ deps = [ >+ ":hash_function_defaults", >+ "//absl/hash", >+ "//absl/strings", >+ "@com_google_googletest//:gtest_main", >+ ], >+) >+ >+cc_library( >+ name = "hash_generator_testing", >+ testonly = 1, >+ srcs = ["internal/hash_generator_testing.cc"], >+ hdrs = ["internal/hash_generator_testing.h"], >+ copts = ABSL_TEST_COPTS, >+ deps = [ >+ ":hash_policy_testing", >+ "//absl/meta:type_traits", >+ "//absl/strings", >+ ], >+) >+ >+cc_library( >+ name = "hash_policy_testing", >+ testonly = 1, >+ hdrs = ["internal/hash_policy_testing.h"], >+ copts = ABSL_TEST_COPTS, >+ deps = [ >+ "//absl/hash", >+ "//absl/strings", >+ ], >+) >+ >+cc_test( >+ name = "hash_policy_testing_test", >+ srcs = ["internal/hash_policy_testing_test.cc"], >+ copts = ABSL_TEST_COPTS, >+ deps = [ >+ ":hash_policy_testing", >+ "@com_google_googletest//:gtest_main", >+ ], >+) >+ >+cc_library( >+ name = "hash_policy_traits", >+ hdrs = ["internal/hash_policy_traits.h"], >+ copts = ABSL_DEFAULT_COPTS, >+ deps = ["//absl/meta:type_traits"], >+) >+ >+cc_test( >+ name = "hash_policy_traits_test", >+ srcs = ["internal/hash_policy_traits_test.cc"], >+ copts = ABSL_TEST_COPTS, >+ deps = [ >+ ":hash_policy_traits", >+ "@com_google_googletest//:gtest_main", >+ ], >+) >+ >+cc_library( >+ name = "hashtable_debug", >+ hdrs = ["internal/hashtable_debug.h"], >+ copts = ABSL_DEFAULT_COPTS, >+ deps = [ >+ ":hashtable_debug_hooks", >+ ], >+) >+ >+cc_library( >+ name = "hashtable_debug_hooks", >+ hdrs = ["internal/hashtable_debug_hooks.h"], >+ copts = ABSL_DEFAULT_COPTS, >+) >+ >+cc_library( >+ name = "node_hash_policy", >+ hdrs = ["internal/node_hash_policy.h"], >+ copts = ABSL_DEFAULT_COPTS, >+) >+ >+cc_test( >+ name = "node_hash_policy_test", >+ srcs = ["internal/node_hash_policy_test.cc"], >+ copts = ABSL_TEST_COPTS, >+ deps = [ >+ ":hash_policy_traits", >+ ":node_hash_policy", >+ "@com_google_googletest//:gtest_main", >+ ], >+) >+ >+cc_library( >+ name = "raw_hash_map", >+ hdrs = 
["internal/raw_hash_map.h"], >+ copts = ABSL_DEFAULT_COPTS, >+ deps = [ >+ ":container_memory", >+ ":raw_hash_set", >+ ], >+) >+ >+cc_library( >+ name = "raw_hash_set", >+ srcs = ["internal/raw_hash_set.cc"], >+ hdrs = ["internal/raw_hash_set.h"], >+ copts = ABSL_DEFAULT_COPTS, >+ deps = [ >+ ":compressed_tuple", >+ ":container_memory", >+ ":hash_policy_traits", >+ ":hashtable_debug_hooks", >+ ":layout", >+ "//absl/base:bits", >+ "//absl/base:config", >+ "//absl/base:core_headers", >+ "//absl/base:endian", >+ "//absl/memory", >+ "//absl/meta:type_traits", >+ "//absl/types:optional", >+ "//absl/utility", >+ ], >+) >+ >+cc_test( >+ name = "raw_hash_set_test", >+ srcs = ["internal/raw_hash_set_test.cc"], >+ copts = ABSL_TEST_COPTS, >+ linkstatic = 1, >+ tags = NOTEST_TAGS, >+ deps = [ >+ ":container_memory", >+ ":hash_function_defaults", >+ ":hash_policy_testing", >+ ":hashtable_debug", >+ ":raw_hash_set", >+ "//absl/base", >+ "//absl/base:core_headers", >+ "//absl/strings", >+ "@com_google_googletest//:gtest_main", >+ ], >+) >+ >+cc_test( >+ name = "raw_hash_set_allocator_test", >+ size = "small", >+ srcs = ["internal/raw_hash_set_allocator_test.cc"], >+ copts = ABSL_TEST_COPTS, >+ deps = [ >+ ":raw_hash_set", >+ ":tracked", >+ "//absl/base:core_headers", >+ "@com_google_googletest//:gtest_main", >+ ], >+) >+ >+cc_library( >+ name = "layout", >+ hdrs = ["internal/layout.h"], >+ copts = ABSL_DEFAULT_COPTS, >+ deps = [ >+ "//absl/base:core_headers", >+ "//absl/meta:type_traits", >+ "//absl/strings", >+ "//absl/types:span", >+ "//absl/utility", >+ ], >+) >+ >+cc_test( >+ name = "layout_test", >+ size = "small", >+ srcs = ["internal/layout_test.cc"], >+ copts = ABSL_TEST_COPTS, >+ tags = NOTEST_TAGS, >+ visibility = ["//visibility:private"], >+ deps = [ >+ ":layout", >+ "//absl/base", >+ "//absl/base:core_headers", >+ "//absl/types:span", >+ "@com_google_googletest//:gtest_main", >+ ], >+) >+ >+cc_library( >+ name = "tracked", >+ testonly = 1, >+ hdrs = 
["internal/tracked.h"], >+ copts = ABSL_TEST_COPTS, >+) >+ >+cc_library( >+ name = "unordered_map_constructor_test", >+ testonly = 1, >+ hdrs = ["internal/unordered_map_constructor_test.h"], >+ copts = ABSL_TEST_COPTS, >+ deps = [ >+ ":hash_generator_testing", >+ ":hash_policy_testing", >+ "@com_google_googletest//:gtest", >+ ], >+) >+ >+cc_library( >+ name = "unordered_map_lookup_test", >+ testonly = 1, >+ hdrs = ["internal/unordered_map_lookup_test.h"], >+ copts = ABSL_TEST_COPTS, >+ deps = [ >+ ":hash_generator_testing", >+ ":hash_policy_testing", >+ "@com_google_googletest//:gtest", >+ ], >+) >+ >+cc_library( >+ name = "unordered_map_modifiers_test", >+ testonly = 1, >+ hdrs = ["internal/unordered_map_modifiers_test.h"], >+ copts = ABSL_TEST_COPTS, >+ deps = [ >+ ":hash_generator_testing", >+ ":hash_policy_testing", >+ "@com_google_googletest//:gtest", >+ ], >+) >+ >+cc_library( >+ name = "unordered_set_constructor_test", >+ testonly = 1, >+ hdrs = ["internal/unordered_set_constructor_test.h"], >+ copts = ABSL_TEST_COPTS, >+ deps = [ >+ ":hash_generator_testing", >+ ":hash_policy_testing", >+ "@com_google_googletest//:gtest", >+ ], >+) >+ >+cc_library( >+ name = "unordered_set_lookup_test", >+ testonly = 1, >+ hdrs = ["internal/unordered_set_lookup_test.h"], >+ copts = ABSL_TEST_COPTS, >+ deps = [ >+ ":hash_generator_testing", >+ ":hash_policy_testing", >+ "@com_google_googletest//:gtest", >+ ], >+) >+ >+cc_library( >+ name = "unordered_set_modifiers_test", >+ testonly = 1, >+ hdrs = ["internal/unordered_set_modifiers_test.h"], >+ copts = ABSL_TEST_COPTS, >+ deps = [ >+ ":hash_generator_testing", >+ ":hash_policy_testing", >+ "@com_google_googletest//:gtest", >+ ], >+) >+ >+cc_test( >+ name = "unordered_set_test", >+ srcs = ["internal/unordered_set_test.cc"], >+ copts = ABSL_TEST_COPTS, >+ tags = NOTEST_TAGS_NONMOBILE, >+ deps = [ >+ ":unordered_set_constructor_test", >+ ":unordered_set_lookup_test", >+ ":unordered_set_modifiers_test", >+ 
"@com_google_googletest//:gtest_main", >+ ], >+) >+ >+cc_test( >+ name = "unordered_map_test", >+ srcs = ["internal/unordered_map_test.cc"], >+ copts = ABSL_TEST_COPTS, >+ tags = NOTEST_TAGS_NONMOBILE, >+ deps = [ >+ ":unordered_map_constructor_test", >+ ":unordered_map_lookup_test", >+ ":unordered_map_modifiers_test", >+ "@com_google_googletest//:gtest_main", >+ ], >+) >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/BUILD.gn >index 001a2a36d4846d57e851f8143f8aaa5c9915830a..a2fbd543316d456b5df1473133e31d0409b7270e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/BUILD.gn >@@ -84,3 +84,389 @@ source_set("test_instance_tracker") { > visibility = [] > visibility += [ "../*" ] > } >+ >+source_set("flat_hash_map") { >+ configs -= [ "//build/config/compiler:chromium_code" ] >+ configs += [ >+ "//build/config/compiler:no_chromium_code", >+ "//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ public = [ >+ "flat_hash_map.h", >+ ] >+ deps = [ >+ ":container_memory", >+ ":hash_function_defaults", >+ ":raw_hash_map", >+ "../memory", >+ ] >+} >+ >+source_set("flat_hash_set") { >+ configs -= [ "//build/config/compiler:chromium_code" ] >+ configs += [ >+ "//build/config/compiler:no_chromium_code", >+ "//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ public = [ >+ "flat_hash_set.h", >+ ] >+ deps = [ >+ ":container_memory", >+ ":hash_function_defaults", >+ ":raw_hash_set", >+ "../base:core_headers", >+ "../memory", >+ ] >+} >+ >+source_set("node_hash_map") { >+ configs -= [ "//build/config/compiler:chromium_code" ] >+ configs += [ >+ "//build/config/compiler:no_chromium_code", >+ 
"//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ public = [ >+ "node_hash_map.h", >+ ] >+ deps = [ >+ ":container_memory", >+ ":hash_function_defaults", >+ ":node_hash_policy", >+ ":raw_hash_map", >+ "../memory", >+ ] >+} >+ >+source_set("node_hash_set") { >+ configs -= [ "//build/config/compiler:chromium_code" ] >+ configs += [ >+ "//build/config/compiler:no_chromium_code", >+ "//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ public = [ >+ "node_hash_set.h", >+ ] >+ deps = [ >+ ":container_memory", >+ ":hash_function_defaults", >+ ":node_hash_policy", >+ ":raw_hash_set", >+ "../memory", >+ ] >+} >+ >+source_set("container_memory") { >+ configs -= [ "//build/config/compiler:chromium_code" ] >+ configs += [ >+ "//build/config/compiler:no_chromium_code", >+ "//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ public = [ >+ "internal/container_memory.h", >+ ] >+ deps = [ >+ "../memory", >+ "../utility", >+ ] >+} >+ >+source_set("hash_function_defaults") { >+ configs -= [ "//build/config/compiler:chromium_code" ] >+ configs += [ >+ "//build/config/compiler:no_chromium_code", >+ "//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ public = [ >+ "internal/hash_function_defaults.h", >+ ] >+ deps = [ >+ "../base:config", >+ "../hash", >+ "../strings", >+ ] >+} >+ >+source_set("hash_generator_testing") { >+ testonly = true >+ configs -= [ "//build/config/compiler:chromium_code" ] >+ configs += [ >+ "//build/config/compiler:no_chromium_code", >+ "//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ sources = [ >+ "internal/hash_generator_testing.cc", >+ ] >+ public = [ >+ 
"internal/hash_generator_testing.h", >+ ] >+ deps = [ >+ ":hash_policy_testing", >+ "../meta:type_traits", >+ "../strings", >+ ] >+} >+ >+source_set("hash_policy_testing") { >+ testonly = true >+ configs -= [ "//build/config/compiler:chromium_code" ] >+ configs += [ >+ "//build/config/compiler:no_chromium_code", >+ "//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ public = [ >+ "internal/hash_policy_testing.h", >+ ] >+ deps = [ >+ "../hash", >+ "../strings", >+ ] >+} >+ >+source_set("hash_policy_traits") { >+ configs -= [ "//build/config/compiler:chromium_code" ] >+ configs += [ >+ "//build/config/compiler:no_chromium_code", >+ "//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ public = [ >+ "internal/hash_policy_traits.h", >+ ] >+ deps = [ >+ "../meta:type_traits", >+ ] >+} >+ >+source_set("hashtable_debug") { >+ configs -= [ "//build/config/compiler:chromium_code" ] >+ configs += [ >+ "//build/config/compiler:no_chromium_code", >+ "//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ public = [ >+ "internal/hashtable_debug.h", >+ ] >+ deps = [ >+ ":hashtable_debug_hooks", >+ ] >+} >+ >+source_set("hashtable_debug_hooks") { >+ configs -= [ "//build/config/compiler:chromium_code" ] >+ configs += [ >+ "//build/config/compiler:no_chromium_code", >+ "//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ public = [ >+ "internal/hashtable_debug_hooks.h", >+ ] >+} >+ >+source_set("node_hash_policy") { >+ configs -= [ "//build/config/compiler:chromium_code" ] >+ configs += [ >+ "//build/config/compiler:no_chromium_code", >+ "//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ public = [ >+ 
"internal/node_hash_policy.h", >+ ] >+} >+ >+source_set("raw_hash_map") { >+ configs -= [ "//build/config/compiler:chromium_code" ] >+ configs += [ >+ "//build/config/compiler:no_chromium_code", >+ "//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ public = [ >+ "internal/raw_hash_map.h", >+ ] >+ deps = [ >+ ":container_memory", >+ ":raw_hash_set", >+ ] >+} >+ >+source_set("raw_hash_set") { >+ configs -= [ "//build/config/compiler:chromium_code" ] >+ configs += [ >+ "//build/config/compiler:no_chromium_code", >+ "//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ sources = [ >+ "internal/raw_hash_set.cc", >+ ] >+ public = [ >+ "internal/raw_hash_set.h", >+ ] >+ deps = [ >+ ":compressed_tuple", >+ ":container_memory", >+ ":hash_policy_traits", >+ ":hashtable_debug_hooks", >+ ":layout", >+ "../base:bits", >+ "../base:config", >+ "../base:core_headers", >+ "../base:endian", >+ "../memory", >+ "../meta:type_traits", >+ "../types:optional", >+ "../utility", >+ ] >+} >+ >+source_set("layout") { >+ configs -= [ "//build/config/compiler:chromium_code" ] >+ configs += [ >+ "//build/config/compiler:no_chromium_code", >+ "//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ public = [ >+ "internal/layout.h", >+ ] >+ deps = [ >+ "../base:core_headers", >+ "../meta:type_traits", >+ "../strings", >+ "../types:span", >+ "../utility", >+ ] >+} >+ >+source_set("tracked") { >+ testonly = true >+ configs -= [ "//build/config/compiler:chromium_code" ] >+ configs += [ >+ "//build/config/compiler:no_chromium_code", >+ "//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ public = [ >+ "internal/tracked.h", >+ ] >+} >+ >+source_set("unordered_map_constructor_test") { >+ testonly = 
true >+ configs -= [ "//build/config/compiler:chromium_code" ] >+ configs += [ >+ "//build/config/compiler:no_chromium_code", >+ "//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ public = [ >+ "internal/unordered_map_constructor_test.h", >+ ] >+ deps = [ >+ ":hash_generator_testing", >+ ":hash_policy_testing", >+ "//testing/gtest", >+ ] >+} >+ >+source_set("unordered_map_lookup_test") { >+ testonly = true >+ configs -= [ "//build/config/compiler:chromium_code" ] >+ configs += [ >+ "//build/config/compiler:no_chromium_code", >+ "//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ public = [ >+ "internal/unordered_map_lookup_test.h", >+ ] >+ deps = [ >+ ":hash_generator_testing", >+ ":hash_policy_testing", >+ "//testing/gtest", >+ ] >+} >+ >+source_set("unordered_map_modifiers_test") { >+ testonly = true >+ configs -= [ "//build/config/compiler:chromium_code" ] >+ configs += [ >+ "//build/config/compiler:no_chromium_code", >+ "//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ public = [ >+ "internal/unordered_map_modifiers_test.h", >+ ] >+ deps = [ >+ ":hash_generator_testing", >+ ":hash_policy_testing", >+ "//testing/gtest", >+ ] >+} >+ >+source_set("unordered_set_constructor_test") { >+ testonly = true >+ configs -= [ "//build/config/compiler:chromium_code" ] >+ configs += [ >+ "//build/config/compiler:no_chromium_code", >+ "//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ public = [ >+ "internal/unordered_set_constructor_test.h", >+ ] >+ deps = [ >+ ":hash_generator_testing", >+ ":hash_policy_testing", >+ "//testing/gtest", >+ ] >+} >+ >+source_set("unordered_set_lookup_test") { >+ testonly = true >+ configs -= [ "//build/config/compiler:chromium_code" ] 
>+ configs += [ >+ "//build/config/compiler:no_chromium_code", >+ "//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ public = [ >+ "internal/unordered_set_lookup_test.h", >+ ] >+ deps = [ >+ ":hash_generator_testing", >+ ":hash_policy_testing", >+ "//testing/gtest", >+ ] >+} >+ >+source_set("unordered_set_modifiers_test") { >+ testonly = true >+ configs -= [ "//build/config/compiler:chromium_code" ] >+ configs += [ >+ "//build/config/compiler:no_chromium_code", >+ "//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ public = [ >+ "internal/unordered_set_modifiers_test.h", >+ ] >+ deps = [ >+ ":hash_generator_testing", >+ ":hash_policy_testing", >+ "//testing/gtest", >+ ] >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/CMakeLists.txt b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/CMakeLists.txt >index 123e4c4849aabe2eedea21f05efd2d0263830fb8..9e406902b0585deda3ba3072bee0ff277484ebc3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/CMakeLists.txt >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/CMakeLists.txt >@@ -17,18 +17,42 @@ > > list(APPEND CONTAINER_PUBLIC_HEADERS > "fixed_array.h" >+ "flat_hash_map.h" >+ "flat_hash_set.h" > "inlined_vector.h" >+ "node_hash_map.h" >+ "node_hash_set.h" > ) > > > list(APPEND CONTAINER_INTERNAL_HEADERS >+ "internal/compressed_tuple.h" >+ "internal/container_memory.h" >+ "internal/hash_function_defaults.h" >+ "internal/hash_generator_testing.h" >+ "internal/hash_policy_testing.h" >+ "internal/hash_policy_traits.h" >+ "internal/hashtable_debug.h" >+ "internal/layout.h" >+ "internal/node_hash_policy.h" >+ "internal/raw_hash_map.h" >+ "internal/raw_hash_set.h" > "internal/test_instance_tracker.h" >+ "internal/tracked.h" >+ 
"internal/unordered_map_constructor_test.h" >+ "internal/unordered_map_lookup_test.h" >+ "internal/unordered_map_modifiers_test.h" >+ "internal/unordered_set_constructor_test.h" >+ "internal/unordered_set_lookup_test.h" >+ "internal/unordered_set_modifiers_test.h" > ) > > >-absl_header_library( >+absl_library( > TARGET > absl_container >+ SOURCES >+ "internal/raw_hash_set.cc" > EXPORT_NAME > container > ) >@@ -142,3 +166,11 @@ absl_test( > ) > > >+absl_test( >+ TARGET >+ raw_hash_set_test >+ SOURCES >+ "internal/raw_hash_set_test.cc" >+ PUBLIC_LIBRARIES >+ absl::base absl::hash absl_throw_delegate test_instance_tracker_lib >+) >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/fixed_array.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/fixed_array.h >index 182258f623e13aea80b7578e1906dad9478388b0..6d9fa5f745dcc9563c7a709160db1837cffbc2d7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/fixed_array.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/fixed_array.h >@@ -138,8 +138,8 @@ class FixedArray { > explicit FixedArray(size_type n, const allocator_type& a = allocator_type()) > : storage_(n, a) { > if (DefaultConstructorIsNonTrivial()) { >- memory_internal::ConstructStorage(storage_.alloc(), storage_.begin(), >- storage_.end()); >+ memory_internal::ConstructRange(storage_.alloc(), storage_.begin(), >+ storage_.end()); > } > } > >@@ -147,8 +147,8 @@ class FixedArray { > FixedArray(size_type n, const value_type& val, > const allocator_type& a = allocator_type()) > : storage_(n, a) { >- memory_internal::ConstructStorage(storage_.alloc(), storage_.begin(), >- storage_.end(), val); >+ memory_internal::ConstructRange(storage_.alloc(), storage_.begin(), >+ storage_.end(), val); > } > > // Creates an array initialized with the size and contents of `init_list`. 
>@@ -163,13 +163,12 @@ class FixedArray { > FixedArray(Iterator first, Iterator last, > const allocator_type& a = allocator_type()) > : storage_(std::distance(first, last), a) { >- memory_internal::CopyToStorageFromRange(storage_.alloc(), storage_.begin(), >- first, last); >+ memory_internal::CopyRange(storage_.alloc(), storage_.begin(), first, last); > } > > ~FixedArray() noexcept { > for (auto* cur = storage_.begin(); cur != storage_.end(); ++cur) { >- AllocatorTraits::destroy(*storage_.alloc(), cur); >+ AllocatorTraits::destroy(storage_.alloc(), cur); > } > } > >@@ -359,6 +358,13 @@ class FixedArray { > friend bool operator>=(const FixedArray& lhs, const FixedArray& rhs) { > return !(lhs < rhs); > } >+ >+ template <typename H> >+ friend H AbslHashValue(H h, const FixedArray& v) { >+ return H::combine(H::combine_contiguous(std::move(h), v.data(), v.size()), >+ v.size()); >+ } >+ > private: > // StorageElement > // >@@ -446,15 +452,15 @@ class FixedArray { > if (UsingInlinedStorage(size())) { > InlinedStorage::AnnotateDestruct(size()); > } else { >- AllocatorTraits::deallocate(*alloc(), AsValueType(begin()), size()); >+ AllocatorTraits::deallocate(alloc(), AsValueType(begin()), size()); > } > } > > size_type size() const { return size_alloc_.template get<0>(); } > StorageElement* begin() const { return data_; } > StorageElement* end() const { return begin() + size(); } >- allocator_type* alloc() { >- return std::addressof(size_alloc_.template get<1>()); >+ allocator_type& alloc() { >+ return size_alloc_.template get<1>(); > } > > private: >@@ -468,7 +474,7 @@ class FixedArray { > return InlinedStorage::data(); > } else { > return reinterpret_cast<StorageElement*>( >- AllocatorTraits::allocate(*alloc(), size())); >+ AllocatorTraits::allocate(alloc(), size())); > } > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/fixed_array_exception_safety_test.cc 
b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/fixed_array_exception_safety_test.cc >index c123c2a1c0d2d38484056c34f89101489e5c1424..da63dbfe38e71d34a33e800dfe42aa9f3ea317fa 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/fixed_array_exception_safety_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/fixed_array_exception_safety_test.cc >@@ -97,7 +97,7 @@ testing::AssertionResult ReadMemory(FixedArr* fixed_arr) { > > TEST(FixedArrayExceptionSafety, Fill) { > auto test_fill = testing::MakeExceptionSafetyTester() >- .WithInvariants(ReadMemory) >+ .WithContracts(ReadMemory) > .WithOperation([&](FixedArr* fixed_arr_ptr) { > auto thrower = > Thrower(kUpdatedValue, testing::nothrow_ctor); >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/fixed_array_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/fixed_array_test.cc >index b07ebcb6d9ca6b0e426ca63ac34c06f8140983ce..205ff41fe1142bbe8e187a44f17a5a5a73b5c548 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/fixed_array_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/fixed_array_test.cc >@@ -27,6 +27,7 @@ > #include "gmock/gmock.h" > #include "gtest/gtest.h" > #include "absl/base/internal/exception_testing.h" >+#include "absl/hash/hash_testing.h" > #include "absl/memory/memory.h" > > using ::testing::ElementsAreArray; >@@ -867,4 +868,5 @@ TEST(FixedArrayTest, AddressSanitizerAnnotations4) { > EXPECT_DEATH(raw[21] = ThreeInts(), "container-overflow"); > } > #endif // ADDRESS_SANITIZER >+ > } // namespace >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/flat_hash_map.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/flat_hash_map.h >new file mode 100644 >index 
0000000000000000000000000000000000000000..e5570d1004413166abdddb6c546bb57cabd4bf04 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/flat_hash_map.h >@@ -0,0 +1,528 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. >+// >+// ----------------------------------------------------------------------------- >+// File: flat_hash_map.h >+// ----------------------------------------------------------------------------- >+// >+// An `absl::flat_hash_map<K, V>` is an unordered associative container of >+// unique keys and associated values designed to be a more efficient replacement >+// for `std::unordered_map`. Like `unordered_map`, search, insertion, and >+// deletion of map elements can be done as an `O(1)` operation. However, >+// `flat_hash_map` (and other unordered associative containers known as the >+// collection of Abseil "Swiss tables") contain other optimizations that result >+// in both memory and computation advantages. >+// >+// In most cases, your default choice for a hash map should be a map of type >+// `flat_hash_map`. 
>+ >+#ifndef ABSL_CONTAINER_FLAT_HASH_MAP_H_ >+#define ABSL_CONTAINER_FLAT_HASH_MAP_H_ >+ >+#include <cstddef> >+#include <new> >+#include <type_traits> >+#include <utility> >+ >+#include "absl/container/internal/container_memory.h" >+#include "absl/container/internal/hash_function_defaults.h" // IWYU pragma: export >+#include "absl/container/internal/raw_hash_map.h" // IWYU pragma: export >+#include "absl/memory/memory.h" >+ >+namespace absl { >+namespace container_internal { >+template <class K, class V> >+struct FlatHashMapPolicy; >+} // namespace container_internal >+ >+// ----------------------------------------------------------------------------- >+// absl::flat_hash_map >+// ----------------------------------------------------------------------------- >+// >+// An `absl::flat_hash_map<K, V>` is an unordered associative container which >+// has been optimized for both speed and memory footprint in most common use >+// cases. Its interface is similar to that of `std::unordered_map<K, V>` with >+// the following notable differences: >+// >+// * Requires keys that are CopyConstructible >+// * Requires values that are MoveConstructible >+// * Supports heterogeneous lookup, through `find()`, `operator[]()` and >+// `insert()`, provided that the map is provided a compatible heterogeneous >+// hashing function and equality operator. >+// * Invalidates any references and pointers to elements within the table after >+// `rehash()`. >+// * Contains a `capacity()` member function indicating the number of element >+// slots (open, deleted, and empty) within the hash map. >+// * Returns `void` from the `erase(iterator)` overload. >+// >+// By default, `flat_hash_map` uses the `absl::Hash` hashing framework. >+// All fundamental and Abseil types that support the `absl::Hash` framework have >+// a compatible equality operator for comparing insertions into `flat_hash_map`. 
>+// If your type is not yet supported by the `absl::Hash` framework, see >+// absl/hash/hash.h for information on extending Abseil hashing to user-defined >+// types. >+// >+// NOTE: A `flat_hash_map` stores its value types directly inside its >+// implementation array to avoid memory indirection. Because a `flat_hash_map` >+// is designed to move data when rehashed, map values will not retain pointer >+// stability. If you require pointer stability, or your values are large, >+// consider using `absl::flat_hash_map<Key, std::unique_ptr<Value>>` instead. >+// If your types are not moveable or you require pointer stability for keys, >+// consider `absl::node_hash_map`. >+// >+// Example: >+// >+// // Create a flat hash map of three strings (that map to strings) >+// absl::flat_hash_map<std::string, std::string> ducks = >+// {{"a", "huey"}, {"b", "dewey"}, {"c", "louie"}}; >+// >+// // Insert a new element into the flat hash map >+// ducks.insert({"d", "donald"}); >+// >+// // Force a rehash of the flat hash map >+// ducks.rehash(0); >+// >+// // Find the element with the key "b" >+// std::string search_key = "b"; >+// auto result = ducks.find(search_key); >+// if (result != ducks.end()) { >+// std::cout << "Result: " << result->second << std::endl; >+// } >+template <class K, class V, >+ class Hash = absl::container_internal::hash_default_hash<K>, >+ class Eq = absl::container_internal::hash_default_eq<K>, >+ class Allocator = std::allocator<std::pair<const K, V>>> >+class flat_hash_map : public absl::container_internal::raw_hash_map< >+ absl::container_internal::FlatHashMapPolicy<K, V>, >+ Hash, Eq, Allocator> { >+ using Base = typename flat_hash_map::raw_hash_map; >+ >+ public: >+ flat_hash_map() {} >+ using Base::Base; >+ >+ // flat_hash_map::begin() >+ // >+ // Returns an iterator to the beginning of the `flat_hash_map`. >+ using Base::begin; >+ >+ // flat_hash_map::cbegin() >+ // >+ // Returns a const iterator to the beginning of the `flat_hash_map`.
>+ using Base::cbegin; >+ >+ // flat_hash_map::cend() >+ // >+ // Returns a const iterator to the end of the `flat_hash_map`. >+ using Base::cend; >+ >+ // flat_hash_map::end() >+ // >+ // Returns an iterator to the end of the `flat_hash_map`. >+ using Base::end; >+ >+ // flat_hash_map::capacity() >+ // >+ // Returns the number of element slots (assigned, deleted, and empty) >+ // available within the `flat_hash_map`. >+ // >+ // NOTE: this member function is particular to `absl::flat_hash_map` and is >+ // not provided in the `std::unordered_map` API. >+ using Base::capacity; >+ >+ // flat_hash_map::empty() >+ // >+ // Returns whether or not the `flat_hash_map` is empty. >+ using Base::empty; >+ >+ // flat_hash_map::max_size() >+ // >+ // Returns the largest theoretically possible number of elements within a >+ // `flat_hash_map` under current memory constraints. This value can be thought >+ // of as the largest value of `std::distance(begin(), end())` for a >+ // `flat_hash_map<K, V>`. >+ using Base::max_size; >+ >+ // flat_hash_map::size() >+ // >+ // Returns the number of elements currently within the `flat_hash_map`. >+ using Base::size; >+ >+ // flat_hash_map::clear() >+ // >+ // Removes all elements from the `flat_hash_map`. Invalidates any references, >+ // pointers, or iterators referring to contained elements. >+ // >+ // NOTE: this operation may shrink the underlying buffer. To avoid shrinking >+ // the underlying buffer call `erase(begin(), end())`. >+ using Base::clear; >+ >+ // flat_hash_map::erase() >+ // >+ // Erases elements within the `flat_hash_map`. Erasing does not trigger a >+ // rehash. Overloads are listed below. >+ // >+ // void erase(const_iterator pos): >+ // >+ // Erases the element at `position` of the `flat_hash_map`, returning >+ // `void`. >+ // >+ // NOTE: this return behavior is different than that of STL containers in >+ // general and `std::unordered_map` in particular.
>+ // >+ // iterator erase(const_iterator first, const_iterator last): >+ // >+ // Erases the elements in the half-open interval [`first`, `last`), returning >+ // an iterator pointing to `last`. >+ // >+ // size_type erase(const key_type& key): >+ // >+ // Erases the element with the matching key, if it exists. >+ using Base::erase; >+ >+ // flat_hash_map::insert() >+ // >+ // Inserts an element of the specified value into the `flat_hash_map`, >+ // returning an iterator pointing to the newly inserted element, provided that >+ // an element with the given key does not already exist. If rehashing occurs >+ // due to the insertion, all iterators are invalidated. Overloads are listed >+ // below. >+ // >+ // std::pair<iterator,bool> insert(const init_type& value): >+ // >+ // Inserts a value into the `flat_hash_map`. Returns a pair consisting of an >+ // iterator to the inserted element (or to the element that prevented the >+ // insertion) and a bool denoting whether the insertion took place. >+ // >+ // std::pair<iterator,bool> insert(T&& value): >+ // std::pair<iterator,bool> insert(init_type&& value): >+ // >+ // Inserts a moveable value into the `flat_hash_map`. Returns a pair >+ // consisting of an iterator to the inserted element (or to the element that >+ // prevented the insertion) and a bool denoting whether the insertion took >+ // place. >+ // >+ // iterator insert(const_iterator hint, const init_type& value): >+ // iterator insert(const_iterator hint, T&& value): >+ // iterator insert(const_iterator hint, init_type&& value): >+ // >+ // Inserts a value, using the position of `hint` as a non-binding suggestion >+ // for where to begin the insertion search. Returns an iterator to the >+ // inserted element, or to the existing element that prevented the >+ // insertion. >+ // >+ // void insert(InputIterator first, InputIterator last): >+ // >+ // Inserts a range of values [`first`, `last`).
>+ // >+ // NOTE: Although the STL does not specify which element may be inserted if >+ // multiple keys compare equivalently, for `flat_hash_map` we guarantee the >+ // first match is inserted. >+ // >+ // void insert(std::initializer_list<init_type> ilist): >+ // >+ // Inserts the elements within the initializer list `ilist`. >+ // >+ // NOTE: Although the STL does not specify which element may be inserted if >+ // multiple keys compare equivalently within the initializer list, for >+ // `flat_hash_map` we guarantee the first match is inserted. >+ using Base::insert; >+ >+ // flat_hash_map::insert_or_assign() >+ // >+ // Inserts an element of the specified value into the `flat_hash_map` provided >+ // that a value with the given key does not already exist, or replaces it with >+ // the element value if a key for that value already exists, returning an >+ // iterator pointing to the newly inserted element. If rehashing occurs due >+ // to the insertion, all existing iterators are invalidated. Overloads are >+ // listed below. >+ // >+ // pair<iterator, bool> insert_or_assign(const init_type& k, T&& obj): >+ // pair<iterator, bool> insert_or_assign(init_type&& k, T&& obj): >+ // >+ // Inserts/Assigns (or moves) the element of the specified key into the >+ // `flat_hash_map`. >+ // >+ // iterator insert_or_assign(const_iterator hint, >+ // const init_type& k, T&& obj): >+ // iterator insert_or_assign(const_iterator hint, init_type&& k, T&& obj): >+ // >+ // Inserts/Assigns (or moves) the element of the specified key into the >+ // `flat_hash_map` using the position of `hint` as a non-binding suggestion >+ // for where to begin the insertion search. >+ using Base::insert_or_assign; >+ >+ // flat_hash_map::emplace() >+ // >+ // Inserts an element of the specified value by constructing it in-place >+ // within the `flat_hash_map`, provided that no element with the given key >+ // already exists. 
>+ // >+ // The element may be constructed even if there already is an element with the >+ // key in the container, in which case the newly constructed element will be >+ // destroyed immediately. Prefer `try_emplace()` unless your key is not >+ // copyable or moveable. >+ // >+ // If rehashing occurs due to the insertion, all iterators are invalidated. >+ using Base::emplace; >+ >+ // flat_hash_map::emplace_hint() >+ // >+ // Inserts an element of the specified value by constructing it in-place >+ // within the `flat_hash_map`, using the position of `hint` as a non-binding >+ // suggestion for where to begin the insertion search, and only inserts >+ // provided that no element with the given key already exists. >+ // >+ // The element may be constructed even if there already is an element with the >+ // key in the container, in which case the newly constructed element will be >+ // destroyed immediately. Prefer `try_emplace()` unless your key is not >+ // copyable or moveable. >+ // >+ // If rehashing occurs due to the insertion, all iterators are invalidated. >+ using Base::emplace_hint; >+ >+ // flat_hash_map::try_emplace() >+ // >+ // Inserts an element of the specified value by constructing it in-place >+ // within the `flat_hash_map`, provided that no element with the given key >+ // already exists. Unlike `emplace()`, if an element with the given key >+ // already exists, we guarantee that no element is constructed. >+ // >+ // If rehashing occurs due to the insertion, all iterators are invalidated. >+ // Overloads are listed below. >+ // >+ // pair<iterator, bool> try_emplace(const key_type& k, Args&&... args): >+ // pair<iterator, bool> try_emplace(key_type&& k, Args&&... args): >+ // >+ // Inserts (via copy or move) the element of the specified key into the >+ // `flat_hash_map`. >+ // >+ // iterator try_emplace(const_iterator hint, >+ // const init_type& k, Args&&... args): >+ // iterator try_emplace(const_iterator hint, init_type&& k, Args&&... 
args): >+ // >+ // Inserts (via copy or move) the element of the specified key into the >+ // `flat_hash_map` using the position of `hint` as a non-binding suggestion >+ // for where to begin the insertion search. >+ using Base::try_emplace; >+ >+ // flat_hash_map::extract() >+ // >+ // Extracts the indicated element, erasing it in the process, and returns it >+ // as a C++17-compatible node handle. Overloads are listed below. >+ // >+ // node_type extract(const_iterator position): >+ // >+ // Extracts the key,value pair of the element at the indicated position and >+ // returns a node handle owning that extracted data. >+ // >+ // node_type extract(const key_type& x): >+ // >+ // Extracts the key,value pair of the element with a key matching the passed >+ // key value and returns a node handle owning that extracted data. If the >+ // `flat_hash_map` does not contain an element with a matching key, this >+ // function returns an empty node handle. >+ using Base::extract; >+ >+ // flat_hash_map::merge() >+ // >+ // Extracts elements from a given `source` flat hash map into this >+ // `flat_hash_map`. If the destination `flat_hash_map` already contains an >+ // element with an equivalent key, that element is not extracted. >+ using Base::merge; >+ >+ // flat_hash_map::swap(flat_hash_map& other) >+ // >+ // Exchanges the contents of this `flat_hash_map` with those of the `other` >+ // flat hash map, avoiding invocation of any move, copy, or swap operations on >+ // individual elements. >+ // >+ // All iterators and references on the `flat_hash_map` remain valid, excepting >+ // for the past-the-end iterator, which is invalidated. >+ // >+ // `swap()` requires that the flat hash map's hashing and key equivalence >+ // functions be Swappable, and are exchanged using unqualified calls to >+ // non-member `swap()`.
If the map's allocator has >+ // `std::allocator_traits<allocator_type>::propagate_on_container_swap::value` >+ // set to `true`, the allocators are also exchanged using an unqualified call >+ // to non-member `swap()`; otherwise, the allocators are not swapped. >+ using Base::swap; >+ >+ // flat_hash_map::rehash(count) >+ // >+ // Rehashes the `flat_hash_map`, setting the number of slots to be at least >+ // the passed value. If the new number of slots increases the load factor more >+ // than the current maximum load factor >+ // (`count` < `size()` / `max_load_factor()`), then the new number of slots >+ // will be at least `size()` / `max_load_factor()`. >+ // >+ // To force a rehash, pass rehash(0). >+ // >+ // NOTE: unlike behavior in `std::unordered_map`, references are also >+ // invalidated upon a `rehash()`. >+ using Base::rehash; >+ >+ // flat_hash_map::reserve(count) >+ // >+ // Sets the number of slots in the `flat_hash_map` to the number needed to >+ // accommodate at least `count` total elements without exceeding the current >+ // maximum load factor, and may rehash the container if needed. >+ using Base::reserve; >+ >+ // flat_hash_map::at() >+ // >+ // Returns a reference to the mapped value of the element with key equivalent >+ // to the passed key. >+ using Base::at; >+ >+ // flat_hash_map::contains() >+ // >+ // Determines whether an element with a key comparing equal to the given `key` >+ // exists within the `flat_hash_map`, returning `true` if so or `false` >+ // otherwise. >+ using Base::contains; >+ >+ // flat_hash_map::count(const Key& key) const >+ // >+ // Returns the number of elements with a key comparing equal to the given >+ // `key` within the `flat_hash_map`. note that this function will return >+ // either `1` or `0` since duplicate keys are not allowed within a >+ // `flat_hash_map`. 
>+ using Base::count; >+ >+ // flat_hash_map::equal_range() >+ // >+ // Returns a closed range [first, last], defined by a `std::pair` of two >+ // iterators, containing all elements with the passed key in the >+ // `flat_hash_map`. >+ using Base::equal_range; >+ >+ // flat_hash_map::find() >+ // >+ // Finds an element with the passed `key` within the `flat_hash_map`. >+ using Base::find; >+ >+ // flat_hash_map::operator[]() >+ // >+ // Returns a reference to the value mapped to the passed key within the >+ // `flat_hash_map`, performing an `insert()` if the key does not already >+ // exist. >+ // >+ // If an insertion occurs and results in a rehashing of the container, all >+ // iterators are invalidated. Otherwise iterators are not affected and >+ // references are not invalidated. Overloads are listed below. >+ // >+ // T& operator[](const Key& key): >+ // >+ // Inserts an init_type object constructed in-place if the element with the >+ // given key does not exist. >+ // >+ // T& operator[](Key&& key): >+ // >+ // Inserts an init_type object constructed in-place provided that an element >+ // with the given key does not exist. >+ using Base::operator[]; >+ >+ // flat_hash_map::bucket_count() >+ // >+ // Returns the number of "buckets" within the `flat_hash_map`. Note that >+ // because a flat hash map contains all elements within its internal storage, >+ // this value simply equals the current capacity of the `flat_hash_map`. >+ using Base::bucket_count; >+ >+ // flat_hash_map::load_factor() >+ // >+ // Returns the current load factor of the `flat_hash_map` (the average number >+ // of slots occupied with a value within the hash map). >+ using Base::load_factor; >+ >+ // flat_hash_map::max_load_factor() >+ // >+ // Manages the maximum load factor of the `flat_hash_map`. Overloads are >+ // listed below. >+ // >+ // float flat_hash_map::max_load_factor() >+ // >+ // Returns the current maximum load factor of the `flat_hash_map`. 
>+ // >+ // void flat_hash_map::max_load_factor(float ml) >+ // >+ // Sets the maximum load factor of the `flat_hash_map` to the passed value. >+ // >+ // NOTE: This overload is provided only for API compatibility with the STL; >+ // `flat_hash_map` will ignore any set load factor and manage its rehashing >+ // internally as an implementation detail. >+ using Base::max_load_factor; >+ >+ // flat_hash_map::get_allocator() >+ // >+ // Returns the allocator function associated with this `flat_hash_map`. >+ using Base::get_allocator; >+ >+ // flat_hash_map::hash_function() >+ // >+ // Returns the hashing function used to hash the keys within this >+ // `flat_hash_map`. >+ using Base::hash_function; >+ >+ // flat_hash_map::key_eq() >+ // >+ // Returns the function used for comparing keys equality. >+ using Base::key_eq; >+}; >+ >+namespace container_internal { >+ >+template <class K, class V> >+struct FlatHashMapPolicy { >+ using slot_type = container_internal::slot_type<K, V>; >+ using key_type = K; >+ using mapped_type = V; >+ using init_type = std::pair</*non const*/ key_type, mapped_type>; >+ >+ template <class Allocator, class... Args> >+ static void construct(Allocator* alloc, slot_type* slot, Args&&... args) { >+ slot_type::construct(alloc, slot, std::forward<Args>(args)...); >+ } >+ >+ template <class Allocator> >+ static void destroy(Allocator* alloc, slot_type* slot) { >+ slot_type::destroy(alloc, slot); >+ } >+ >+ template <class Allocator> >+ static void transfer(Allocator* alloc, slot_type* new_slot, >+ slot_type* old_slot) { >+ slot_type::transfer(alloc, new_slot, old_slot); >+ } >+ >+ template <class F, class... Args> >+ static decltype(absl::container_internal::DecomposePair( >+ std::declval<F>(), std::declval<Args>()...)) >+ apply(F&& f, Args&&... 
args) { >+ return absl::container_internal::DecomposePair(std::forward<F>(f), >+ std::forward<Args>(args)...); >+ } >+ >+ static size_t space_used(const slot_type*) { return 0; } >+ >+ static std::pair<const K, V>& element(slot_type* slot) { return slot->value; } >+ >+ static V& value(std::pair<const K, V>* kv) { return kv->second; } >+ static const V& value(const std::pair<const K, V>* kv) { return kv->second; } >+}; >+ >+} // namespace container_internal >+} // namespace absl >+#endif // ABSL_CONTAINER_FLAT_HASH_MAP_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/flat_hash_map_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/flat_hash_map_test.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..10a781ffd6c7d9d149086e443174a7aa76ad22ae >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/flat_hash_map_test.cc >@@ -0,0 +1,241 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. 
>+ >+#include "absl/container/flat_hash_map.h" >+ >+#include "absl/container/internal/hash_generator_testing.h" >+#include "absl/container/internal/unordered_map_constructor_test.h" >+#include "absl/container/internal/unordered_map_lookup_test.h" >+#include "absl/container/internal/unordered_map_modifiers_test.h" >+#include "absl/types/any.h" >+ >+namespace absl { >+namespace container_internal { >+namespace { >+using ::absl::container_internal::hash_internal::Enum; >+using ::absl::container_internal::hash_internal::EnumClass; >+using ::testing::_; >+using ::testing::Pair; >+using ::testing::UnorderedElementsAre; >+ >+template <class K, class V> >+using Map = >+ flat_hash_map<K, V, StatefulTestingHash, StatefulTestingEqual, Alloc<>>; >+ >+static_assert(!std::is_standard_layout<NonStandardLayout>(), ""); >+ >+using MapTypes = >+ ::testing::Types<Map<int, int>, Map<std::string, int>, Map<Enum, std::string>, >+ Map<EnumClass, int>, Map<int, NonStandardLayout>, >+ Map<NonStandardLayout, int>>; >+ >+INSTANTIATE_TYPED_TEST_CASE_P(FlatHashMap, ConstructorTest, MapTypes); >+INSTANTIATE_TYPED_TEST_CASE_P(FlatHashMap, LookupTest, MapTypes); >+INSTANTIATE_TYPED_TEST_CASE_P(FlatHashMap, ModifiersTest, MapTypes); >+ >+TEST(FlatHashMap, StandardLayout) { >+ struct Int { >+ explicit Int(size_t value) : value(value) {} >+ Int() : value(0) { ADD_FAILURE(); } >+ Int(const Int& other) : value(other.value) { ADD_FAILURE(); } >+ Int(Int&&) = default; >+ bool operator==(const Int& other) const { return value == other.value; } >+ size_t value; >+ }; >+ static_assert(std::is_standard_layout<Int>(), ""); >+ >+ struct Hash { >+ size_t operator()(const Int& obj) const { return obj.value; } >+ }; >+ >+ // Verify that neither the key nor the value get default-constructed or >+ // copy-constructed. 
>+  {
>+    flat_hash_map<Int, Int, Hash> m;
>+    m.try_emplace(Int(1), Int(2));
>+    m.try_emplace(Int(3), Int(4));
>+    m.erase(Int(1));
>+    m.rehash(2 * m.bucket_count());
>+  }
>+  {
>+    flat_hash_map<Int, Int, Hash> m;
>+    m.try_emplace(Int(1), Int(2));
>+    m.try_emplace(Int(3), Int(4));
>+    m.erase(Int(1));
>+    m.clear();
>+  }
>+}
>+
>+// gcc becomes unhappy if this is inside the method, so pull it out here.
>+struct balast {};
>+
>+TEST(FlatHashMap, IteratesMsan) {
>+  // Because SwissTable randomizes on pointer addresses, we keep old tables
>+  // around to ensure we don't reuse old memory.
>+  std::vector<absl::flat_hash_map<int, balast>> garbage;
>+  for (int i = 0; i < 100; ++i) {
>+    absl::flat_hash_map<int, balast> t;
>+    for (int j = 0; j < 100; ++j) {
>+      t[j];
>+      for (const auto& p : t) EXPECT_THAT(p, Pair(_, _));
>+    }
>+    garbage.push_back(std::move(t));
>+  }
>+}
>+
>+// Demonstration of the "Lazy Key" pattern. This uses heterogeneous insert to
>+// avoid creating expensive key elements when the item is already present in the
>+// map.
>+struct LazyInt {
>+  explicit LazyInt(size_t value, int* tracker)
>+      : value(value), tracker(tracker) {}
>+
>+  explicit operator size_t() const {
>+    ++*tracker;
>+    return value;
>+  }
>+
>+  size_t value;
>+  int* tracker;
>+};
>+
>+struct Hash {
>+  using is_transparent = void;
>+  int* tracker;
>+  size_t operator()(size_t obj) const {
>+    ++*tracker;
>+    return obj;
>+  }
>+  size_t operator()(const LazyInt& obj) const {
>+    ++*tracker;
>+    return obj.value;
>+  }
>+};
>+
>+struct Eq {
>+  using is_transparent = void;
>+  bool operator()(size_t lhs, size_t rhs) const {
>+    return lhs == rhs;
>+  }
>+  bool operator()(size_t lhs, const LazyInt& rhs) const {
>+    return lhs == rhs.value;
>+  }
>+};
>+
>+TEST(FlatHashMap, LazyKeyPattern) {
>+  // Hash call counts are only guaranteed in opt mode; debug-mode assertions
>+  // can cause extra calls to hash, so hash-count expectations are guarded by
>+  // NDEBUG.
>+ int conversions = 0; >+ int hashes = 0; >+ flat_hash_map<size_t, size_t, Hash, Eq> m(0, Hash{&hashes}); >+ >+ m[LazyInt(1, &conversions)] = 1; >+ EXPECT_THAT(m, UnorderedElementsAre(Pair(1, 1))); >+ EXPECT_EQ(conversions, 1); >+#ifdef NDEBUG >+ EXPECT_EQ(hashes, 1); >+#endif >+ >+ m[LazyInt(1, &conversions)] = 2; >+ EXPECT_THAT(m, UnorderedElementsAre(Pair(1, 2))); >+ EXPECT_EQ(conversions, 1); >+#ifdef NDEBUG >+ EXPECT_EQ(hashes, 2); >+#endif >+ >+ m.try_emplace(LazyInt(2, &conversions), 3); >+ EXPECT_THAT(m, UnorderedElementsAre(Pair(1, 2), Pair(2, 3))); >+ EXPECT_EQ(conversions, 2); >+#ifdef NDEBUG >+ EXPECT_EQ(hashes, 3); >+#endif >+ >+ m.try_emplace(LazyInt(2, &conversions), 4); >+ EXPECT_THAT(m, UnorderedElementsAre(Pair(1, 2), Pair(2, 3))); >+ EXPECT_EQ(conversions, 2); >+#ifdef NDEBUG >+ EXPECT_EQ(hashes, 4); >+#endif >+} >+ >+TEST(FlatHashMap, BitfieldArgument) { >+ union { >+ int n : 1; >+ }; >+ n = 0; >+ flat_hash_map<int, int> m; >+ m.erase(n); >+ m.count(n); >+ m.prefetch(n); >+ m.find(n); >+ m.contains(n); >+ m.equal_range(n); >+ m.insert_or_assign(n, n); >+ m.insert_or_assign(m.end(), n, n); >+ m.try_emplace(n); >+ m.try_emplace(m.end(), n); >+ m.at(n); >+ m[n]; >+} >+ >+TEST(FlatHashMap, MergeExtractInsert) { >+ // We can't test mutable keys, or non-copyable keys with flat_hash_map. >+ // Test that the nodes have the proper API. 
>+ absl::flat_hash_map<int, int> m = {{1, 7}, {2, 9}}; >+ auto node = m.extract(1); >+ EXPECT_TRUE(node); >+ EXPECT_EQ(node.key(), 1); >+ EXPECT_EQ(node.mapped(), 7); >+ EXPECT_THAT(m, UnorderedElementsAre(Pair(2, 9))); >+ >+ node.mapped() = 17; >+ m.insert(std::move(node)); >+ EXPECT_THAT(m, UnorderedElementsAre(Pair(1, 17), Pair(2, 9))); >+} >+#if !defined(__ANDROID__) && !defined(__APPLE__) && !defined(__EMSCRIPTEN__) >+TEST(FlatHashMap, Any) { >+ absl::flat_hash_map<int, absl::any> m; >+ m.emplace(1, 7); >+ auto it = m.find(1); >+ ASSERT_NE(it, m.end()); >+ EXPECT_EQ(7, absl::any_cast<int>(it->second)); >+ >+ m.emplace(std::piecewise_construct, std::make_tuple(2), std::make_tuple(8)); >+ it = m.find(2); >+ ASSERT_NE(it, m.end()); >+ EXPECT_EQ(8, absl::any_cast<int>(it->second)); >+ >+ m.emplace(std::piecewise_construct, std::make_tuple(3), >+ std::make_tuple(absl::any(9))); >+ it = m.find(3); >+ ASSERT_NE(it, m.end()); >+ EXPECT_EQ(9, absl::any_cast<int>(it->second)); >+ >+ struct H { >+ size_t operator()(const absl::any&) const { return 0; } >+ }; >+ struct E { >+ bool operator()(const absl::any&, const absl::any&) const { return true; } >+ }; >+ absl::flat_hash_map<absl::any, int, H, E> m2; >+ m2.emplace(1, 7); >+ auto it2 = m2.find(1); >+ ASSERT_NE(it2, m2.end()); >+ EXPECT_EQ(7, it2->second); >+} >+#endif // __ANDROID__ >+ >+} // namespace >+} // namespace container_internal >+} // namespace absl >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/flat_hash_set.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/flat_hash_set.h >new file mode 100644 >index 0000000000000000000000000000000000000000..98aead1a8ae209e2a45a956167be95e7270c544b >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/flat_hash_set.h >@@ -0,0 +1,439 @@ >+// Copyright 2018 The Abseil Authors. 
>+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. >+// >+// ----------------------------------------------------------------------------- >+// File: flat_hash_set.h >+// ----------------------------------------------------------------------------- >+// >+// An `absl::flat_hash_set<T>` is an unordered associative container designed to >+// be a more efficient replacement for `std::unordered_set`. Like >+// `unordered_set`, search, insertion, and deletion of set elements can be done >+// as an `O(1)` operation. However, `flat_hash_set` (and other unordered >+// associative containers known as the collection of Abseil "Swiss tables") >+// contain other optimizations that result in both memory and computation >+// advantages. >+// >+// In most cases, your default choice for a hash set should be a set of type >+// `flat_hash_set`. 
>+#ifndef ABSL_CONTAINER_FLAT_HASH_SET_H_
>+#define ABSL_CONTAINER_FLAT_HASH_SET_H_
>+
>+#include <type_traits>
>+#include <utility>
>+
>+#include "absl/base/macros.h"
>+#include "absl/container/internal/container_memory.h"
>+#include "absl/container/internal/hash_function_defaults.h"  // IWYU pragma: export
>+#include "absl/container/internal/raw_hash_set.h"  // IWYU pragma: export
>+#include "absl/memory/memory.h"
>+
>+namespace absl {
>+namespace container_internal {
>+template <typename T>
>+struct FlatHashSetPolicy;
>+}  // namespace container_internal
>+
>+// -----------------------------------------------------------------------------
>+// absl::flat_hash_set
>+// -----------------------------------------------------------------------------
>+//
>+// An `absl::flat_hash_set<T>` is an unordered associative container which has
>+// been optimized for both speed and memory footprint in most common use cases.
>+// Its interface is similar to that of `std::unordered_set<T>` with the
>+// following notable differences:
>+//
>+// * Requires keys that are CopyConstructible
>+// * Supports heterogeneous lookup, through `find()` and `insert()`, provided
>+//   that the set is provided a compatible heterogeneous hashing function and
>+//   equality operator.
>+// * Invalidates any references and pointers to elements within the table after
>+//   `rehash()`.
>+// * Contains a `capacity()` member function indicating the number of element
>+//   slots (open, deleted, and empty) within the hash set.
>+// * Returns `void` from the `erase(iterator)` overload.
>+//
>+// By default, `flat_hash_set` uses the `absl::Hash` hashing framework. All
>+// fundamental and Abseil types that support the `absl::Hash` framework have a
>+// compatible equality operator for comparing insertions into `flat_hash_set`.
>+// If your type is not yet supported by the `absl::Hash` framework, see
>+// absl/hash/hash.h for information on extending Abseil hashing to user-defined
>+// types.
>+//
>+// NOTE: A `flat_hash_set` stores its keys directly inside its implementation
>+// array to avoid memory indirection. Because a `flat_hash_set` is designed to
>+// move data when rehashed, set keys will not retain pointer stability. If you
>+// require pointer stability, consider using
>+// `absl::flat_hash_set<std::unique_ptr<T>>`. If your type is not moveable and
>+// you require pointer stability, consider `absl::node_hash_set` instead.
>+//
>+// Example:
>+//
>+//   // Create a flat hash set of three strings
>+//   absl::flat_hash_set<std::string> ducks =
>+//     {"huey", "dewey", "louie"};
>+//
>+//   // Insert a new element into the flat hash set
>+//   ducks.insert("donald");
>+//
>+//   // Force a rehash of the flat hash set
>+//   ducks.rehash(0);
>+//
>+//   // See if "dewey" is present
>+//   if (ducks.contains("dewey")) {
>+//     std::cout << "We found dewey!" << std::endl;
>+//   }
>+template <class T, class Hash = absl::container_internal::hash_default_hash<T>,
>+          class Eq = absl::container_internal::hash_default_eq<T>,
>+          class Allocator = std::allocator<T>>
>+class flat_hash_set
>+    : public absl::container_internal::raw_hash_set<
>+          absl::container_internal::FlatHashSetPolicy<T>, Hash, Eq, Allocator> {
>+  using Base = typename flat_hash_set::raw_hash_set;
>+
>+ public:
>+  flat_hash_set() {}
>+  using Base::Base;
>+
>+  // flat_hash_set::begin()
>+  //
>+  // Returns an iterator to the beginning of the `flat_hash_set`.
>+  using Base::begin;
>+
>+  // flat_hash_set::cbegin()
>+  //
>+  // Returns a const iterator to the beginning of the `flat_hash_set`.
>+  using Base::cbegin;
>+
>+  // flat_hash_set::cend()
>+  //
>+  // Returns a const iterator to the end of the `flat_hash_set`.
>+  using Base::cend;
>+
>+  // flat_hash_set::end()
>+  //
>+  // Returns an iterator to the end of the `flat_hash_set`.
>+  using Base::end;
>+
>+  // flat_hash_set::capacity()
>+  //
>+  // Returns the number of element slots (assigned, deleted, and empty)
>+  // available within the `flat_hash_set`.
>+  //
>+  // NOTE: this member function is particular to `absl::flat_hash_set` and is
>+  // not provided in the `std::unordered_set` API.
>+  using Base::capacity;
>+
>+  // flat_hash_set::empty()
>+  //
>+  // Returns whether or not the `flat_hash_set` is empty.
>+  using Base::empty;
>+
>+  // flat_hash_set::max_size()
>+  //
>+  // Returns the largest theoretical possible number of elements within a
>+  // `flat_hash_set` under current memory constraints. This value can be
>+  // thought of as the largest value of `std::distance(begin(), end())` for a
>+  // `flat_hash_set<T>`.
>+  using Base::max_size;
>+
>+  // flat_hash_set::size()
>+  //
>+  // Returns the number of elements currently within the `flat_hash_set`.
>+  using Base::size;
>+
>+  // flat_hash_set::clear()
>+  //
>+  // Removes all elements from the `flat_hash_set`. Invalidates any references,
>+  // pointers, or iterators referring to contained elements.
>+  //
>+  // NOTE: this operation may shrink the underlying buffer. To avoid shrinking
>+  // the underlying buffer call `erase(begin(), end())`.
>+  using Base::clear;
>+
>+  // flat_hash_set::erase()
>+  //
>+  // Erases elements within the `flat_hash_set`. Erasing does not trigger a
>+  // rehash. Overloads are listed below.
>+  //
>+  // void erase(const_iterator pos):
>+  //
>+  //   Erases the element at `pos` of the `flat_hash_set`, returning
>+  //   `void`.
>+  //
>+  //   NOTE: this return behavior is different than that of STL containers in
>+  //   general and `std::unordered_set` in particular.
>+  //
>+  // iterator erase(const_iterator first, const_iterator last):
>+  //
>+  //   Erases the elements in the open interval [`first`, `last`), returning an
>+  //   iterator pointing to `last`.
>+  //
>+  // size_type erase(const key_type& key):
>+  //
>+  //   Erases the element with the matching key, if it exists.
>+ using Base::erase; >+ >+ // flat_hash_set::insert() >+ // >+ // Inserts an element of the specified value into the `flat_hash_set`, >+ // returning an iterator pointing to the newly inserted element, provided that >+ // an element with the given key does not already exist. If rehashing occurs >+ // due to the insertion, all iterators are invalidated. Overloads are listed >+ // below. >+ // >+ // std::pair<iterator,bool> insert(const T& value): >+ // >+ // Inserts a value into the `flat_hash_set`. Returns a pair consisting of an >+ // iterator to the inserted element (or to the element that prevented the >+ // insertion) and a bool denoting whether the insertion took place. >+ // >+ // std::pair<iterator,bool> insert(T&& value): >+ // >+ // Inserts a moveable value into the `flat_hash_set`. Returns a pair >+ // consisting of an iterator to the inserted element (or to the element that >+ // prevented the insertion) and a bool denoting whether the insertion took >+ // place. >+ // >+ // iterator insert(const_iterator hint, const T& value): >+ // iterator insert(const_iterator hint, T&& value): >+ // >+ // Inserts a value, using the position of `hint` as a non-binding suggestion >+ // for where to begin the insertion search. Returns an iterator to the >+ // inserted element, or to the existing element that prevented the >+ // insertion. >+ // >+ // void insert(InputIterator first, InputIterator last): >+ // >+ // Inserts a range of values [`first`, `last`). >+ // >+ // NOTE: Although the STL does not specify which element may be inserted if >+ // multiple keys compare equivalently, for `flat_hash_set` we guarantee the >+ // first match is inserted. >+ // >+ // void insert(std::initializer_list<T> ilist): >+ // >+ // Inserts the elements within the initializer list `ilist`. 
>+  //
>+  // NOTE: Although the STL does not specify which element may be inserted if
>+  // multiple keys compare equivalently within the initializer list, for
>+  // `flat_hash_set` we guarantee the first match is inserted.
>+  using Base::insert;
>+
>+  // flat_hash_set::emplace()
>+  //
>+  // Inserts an element of the specified value by constructing it in-place
>+  // within the `flat_hash_set`, provided that no element with the given key
>+  // already exists.
>+  //
>+  // The element may be constructed even if there already is an element with
>+  // the key in the container, in which case the newly constructed element will
>+  // be destroyed immediately.
>+  //
>+  // If rehashing occurs due to the insertion, all iterators are invalidated.
>+  using Base::emplace;
>+
>+  // flat_hash_set::emplace_hint()
>+  //
>+  // Inserts an element of the specified value by constructing it in-place
>+  // within the `flat_hash_set`, using the position of `hint` as a non-binding
>+  // suggestion for where to begin the insertion search, and only inserts
>+  // provided that no element with the given key already exists.
>+  //
>+  // The element may be constructed even if there already is an element with
>+  // the key in the container, in which case the newly constructed element will
>+  // be destroyed immediately.
>+  //
>+  // If rehashing occurs due to the insertion, all iterators are invalidated.
>+  using Base::emplace_hint;
>+
>+  // flat_hash_set::extract()
>+  //
>+  // Extracts the indicated element, erasing it in the process, and returns it
>+  // as a C++17-compatible node handle. Overloads are listed below.
>+  //
>+  // node_type extract(const_iterator position):
>+  //
>+  //   Extracts the element at the indicated position and returns a node handle
>+  //   owning that extracted data.
>+  //
>+  // node_type extract(const key_type& x):
>+  //
>+  //   Extracts the element with the key matching the passed key value and
>+  //   returns a node handle owning that extracted data. If the `flat_hash_set`
>+  //   does not contain an element with a matching key, this function returns an
>+  //   empty node handle.
>+  using Base::extract;
>+
>+  // flat_hash_set::merge()
>+  //
>+  // Extracts elements from a given `source` flat hash set into this
>+  // `flat_hash_set`. If the destination `flat_hash_set` already contains an
>+  // element with an equivalent key, that element is not extracted.
>+  using Base::merge;
>+
>+  // flat_hash_set::swap(flat_hash_set& other)
>+  //
>+  // Exchanges the contents of this `flat_hash_set` with those of the `other`
>+  // flat hash set, avoiding invocation of any move, copy, or swap operations on
>+  // individual elements.
>+  //
>+  // All iterators and references on the `flat_hash_set` remain valid, excepting
>+  // for the past-the-end iterator, which is invalidated.
>+  //
>+  // `swap()` requires that the flat hash set's hashing and key equivalence
>+  // functions be Swappable, and are exchanged using unqualified calls to
>+  // non-member `swap()`. If the set's allocator has
>+  // `std::allocator_traits<allocator_type>::propagate_on_container_swap::value`
>+  // set to `true`, the allocators are also exchanged using an unqualified call
>+  // to non-member `swap()`; otherwise, the allocators are not swapped.
>+  using Base::swap;
>+
>+  // flat_hash_set::rehash(count)
>+  //
>+  // Rehashes the `flat_hash_set`, setting the number of slots to be at least
>+  // the passed value. If the new number of slots increases the load factor more
>+  // than the current maximum load factor
>+  // (`count` < `size()` / `max_load_factor()`), then the new number of slots
>+  // will be at least `size()` / `max_load_factor()`.
>+  //
>+  // To force a rehash, pass rehash(0).
>+  //
>+  // NOTE: unlike behavior in `std::unordered_set`, references are also
>+  // invalidated upon a `rehash()`.
>+  using Base::rehash;
>+
>+  // flat_hash_set::reserve(count)
>+  //
>+  // Sets the number of slots in the `flat_hash_set` to the number needed to
>+  // accommodate at least `count` total elements without exceeding the current
>+  // maximum load factor, and may rehash the container if needed.
>+  using Base::reserve;
>+
>+  // flat_hash_set::contains()
>+  //
>+  // Determines whether an element comparing equal to the given `key` exists
>+  // within the `flat_hash_set`, returning `true` if so or `false` otherwise.
>+  using Base::contains;
>+
>+  // flat_hash_set::count(const Key& key) const
>+  //
>+  // Returns the number of elements comparing equal to the given `key` within
>+  // the `flat_hash_set`. Note that this function will return either `1` or `0`
>+  // since duplicate elements are not allowed within a `flat_hash_set`.
>+  using Base::count;
>+
>+  // flat_hash_set::equal_range()
>+  //
>+  // Returns a closed range [first, last], defined by a `std::pair` of two
>+  // iterators, containing all elements with the passed key in the
>+  // `flat_hash_set`.
>+  using Base::equal_range;
>+
>+  // flat_hash_set::find()
>+  //
>+  // Finds an element with the passed `key` within the `flat_hash_set`.
>+  using Base::find;
>+
>+  // flat_hash_set::bucket_count()
>+  //
>+  // Returns the number of "buckets" within the `flat_hash_set`. Note that
>+  // because a flat hash set contains all elements within its internal storage,
>+  // this value simply equals the current capacity of the `flat_hash_set`.
>+  using Base::bucket_count;
>+
>+  // flat_hash_set::load_factor()
>+  //
>+  // Returns the current load factor of the `flat_hash_set` (the average number
>+  // of slots occupied with a value within the hash set).
>+  using Base::load_factor;
>+
>+  // flat_hash_set::max_load_factor()
>+  //
>+  // Manages the maximum load factor of the `flat_hash_set`.
Overloads are >+ // listed below. >+ // >+ // float flat_hash_set::max_load_factor() >+ // >+ // Returns the current maximum load factor of the `flat_hash_set`. >+ // >+ // void flat_hash_set::max_load_factor(float ml) >+ // >+ // Sets the maximum load factor of the `flat_hash_set` to the passed value. >+ // >+ // NOTE: This overload is provided only for API compatibility with the STL; >+ // `flat_hash_set` will ignore any set load factor and manage its rehashing >+ // internally as an implementation detail. >+ using Base::max_load_factor; >+ >+ // flat_hash_set::get_allocator() >+ // >+ // Returns the allocator function associated with this `flat_hash_set`. >+ using Base::get_allocator; >+ >+ // flat_hash_set::hash_function() >+ // >+ // Returns the hashing function used to hash the keys within this >+ // `flat_hash_set`. >+ using Base::hash_function; >+ >+ // flat_hash_set::key_eq() >+ // >+ // Returns the function used for comparing keys equality. >+ using Base::key_eq; >+}; >+ >+namespace container_internal { >+ >+template <class T> >+struct FlatHashSetPolicy { >+ using slot_type = T; >+ using key_type = T; >+ using init_type = T; >+ using constant_iterators = std::true_type; >+ >+ template <class Allocator, class... Args> >+ static void construct(Allocator* alloc, slot_type* slot, Args&&... args) { >+ absl::allocator_traits<Allocator>::construct(*alloc, slot, >+ std::forward<Args>(args)...); >+ } >+ >+ template <class Allocator> >+ static void destroy(Allocator* alloc, slot_type* slot) { >+ absl::allocator_traits<Allocator>::destroy(*alloc, slot); >+ } >+ >+ template <class Allocator> >+ static void transfer(Allocator* alloc, slot_type* new_slot, >+ slot_type* old_slot) { >+ construct(alloc, new_slot, std::move(*old_slot)); >+ destroy(alloc, old_slot); >+ } >+ >+ static T& element(slot_type* slot) { return *slot; } >+ >+ template <class F, class... 
Args> >+ static decltype(absl::container_internal::DecomposeValue( >+ std::declval<F>(), std::declval<Args>()...)) >+ apply(F&& f, Args&&... args) { >+ return absl::container_internal::DecomposeValue( >+ std::forward<F>(f), std::forward<Args>(args)...); >+ } >+ >+ static size_t space_used(const T*) { return 0; } >+}; >+} // namespace container_internal >+} // namespace absl >+#endif // ABSL_CONTAINER_FLAT_HASH_SET_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/flat_hash_set_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/flat_hash_set_test.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..e52fd532cbeec75158be6205e561181cb174d4bb >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/flat_hash_set_test.cc >@@ -0,0 +1,126 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. 
>+ >+#include "absl/container/flat_hash_set.h" >+ >+#include <vector> >+ >+#include "absl/container/internal/hash_generator_testing.h" >+#include "absl/container/internal/unordered_set_constructor_test.h" >+#include "absl/container/internal/unordered_set_lookup_test.h" >+#include "absl/container/internal/unordered_set_modifiers_test.h" >+#include "absl/memory/memory.h" >+#include "absl/strings/string_view.h" >+ >+namespace absl { >+namespace container_internal { >+namespace { >+ >+using ::absl::container_internal::hash_internal::Enum; >+using ::absl::container_internal::hash_internal::EnumClass; >+using ::testing::Pointee; >+using ::testing::UnorderedElementsAre; >+using ::testing::UnorderedElementsAreArray; >+ >+template <class T> >+using Set = >+ absl::flat_hash_set<T, StatefulTestingHash, StatefulTestingEqual, Alloc<T>>; >+ >+using SetTypes = >+ ::testing::Types<Set<int>, Set<std::string>, Set<Enum>, Set<EnumClass>>; >+ >+INSTANTIATE_TYPED_TEST_CASE_P(FlatHashSet, ConstructorTest, SetTypes); >+INSTANTIATE_TYPED_TEST_CASE_P(FlatHashSet, LookupTest, SetTypes); >+INSTANTIATE_TYPED_TEST_CASE_P(FlatHashSet, ModifiersTest, SetTypes); >+ >+TEST(FlatHashSet, EmplaceString) { >+ std::vector<std::string> v = {"a", "b"}; >+ absl::flat_hash_set<absl::string_view> hs(v.begin(), v.end()); >+ EXPECT_THAT(hs, UnorderedElementsAreArray(v)); >+} >+ >+TEST(FlatHashSet, BitfieldArgument) { >+ union { >+ int n : 1; >+ }; >+ n = 0; >+ absl::flat_hash_set<int> s = {n}; >+ s.insert(n); >+ s.insert(s.end(), n); >+ s.insert({n}); >+ s.erase(n); >+ s.count(n); >+ s.prefetch(n); >+ s.find(n); >+ s.contains(n); >+ s.equal_range(n); >+} >+ >+TEST(FlatHashSet, MergeExtractInsert) { >+ struct Hash { >+ size_t operator()(const std::unique_ptr<int>& p) const { return *p; } >+ }; >+ struct Eq { >+ bool operator()(const std::unique_ptr<int>& a, >+ const std::unique_ptr<int>& b) const { >+ return *a == *b; >+ } >+ }; >+ absl::flat_hash_set<std::unique_ptr<int>, Hash, Eq> set1, set2; >+ 
set1.insert(absl::make_unique<int>(7)); >+ set1.insert(absl::make_unique<int>(17)); >+ >+ set2.insert(absl::make_unique<int>(7)); >+ set2.insert(absl::make_unique<int>(19)); >+ >+ EXPECT_THAT(set1, UnorderedElementsAre(Pointee(7), Pointee(17))); >+ EXPECT_THAT(set2, UnorderedElementsAre(Pointee(7), Pointee(19))); >+ >+ set1.merge(set2); >+ >+ EXPECT_THAT(set1, UnorderedElementsAre(Pointee(7), Pointee(17), Pointee(19))); >+ EXPECT_THAT(set2, UnorderedElementsAre(Pointee(7))); >+ >+ auto node = set1.extract(absl::make_unique<int>(7)); >+ EXPECT_TRUE(node); >+ EXPECT_THAT(node.value(), Pointee(7)); >+ EXPECT_THAT(set1, UnorderedElementsAre(Pointee(17), Pointee(19))); >+ >+ auto insert_result = set2.insert(std::move(node)); >+ EXPECT_FALSE(node); >+ EXPECT_FALSE(insert_result.inserted); >+ EXPECT_TRUE(insert_result.node); >+ EXPECT_THAT(insert_result.node.value(), Pointee(7)); >+ EXPECT_EQ(**insert_result.position, 7); >+ EXPECT_NE(insert_result.position->get(), insert_result.node.value().get()); >+ EXPECT_THAT(set2, UnorderedElementsAre(Pointee(7))); >+ >+ node = set1.extract(absl::make_unique<int>(17)); >+ EXPECT_TRUE(node); >+ EXPECT_THAT(node.value(), Pointee(17)); >+ EXPECT_THAT(set1, UnorderedElementsAre(Pointee(19))); >+ >+ node.value() = absl::make_unique<int>(23); >+ >+ insert_result = set2.insert(std::move(node)); >+ EXPECT_FALSE(node); >+ EXPECT_TRUE(insert_result.inserted); >+ EXPECT_FALSE(insert_result.node); >+ EXPECT_EQ(**insert_result.position, 23); >+ EXPECT_THAT(set2, UnorderedElementsAre(Pointee(7), Pointee(23))); >+} >+ >+} // namespace >+} // namespace container_internal >+} // namespace absl >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/inlined_vector.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/inlined_vector.h >index ca36fd3699b905c2ede9bb10725b4d9b66986877..12756bb820cf6ae956ca464a8ac7cd732cbc9e1f 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/inlined_vector.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/inlined_vector.h >@@ -620,6 +620,12 @@ class InlinedVector { > // Returns the allocator of this inlined vector. > allocator_type get_allocator() const { return allocator(); } > >+ template <typename H> >+ friend H AbslHashValue(H h, const InlinedVector& v) { >+ return H::combine(H::combine_contiguous(std::move(h), v.data(), v.size()), >+ v.size()); >+ } >+ > private: > static_assert(N > 0, "inlined vector with nonpositive size"); > >@@ -648,8 +654,7 @@ class InlinedVector { > // our instance of it for free. > class AllocatorAndTag : private allocator_type { > public: >- explicit AllocatorAndTag(const allocator_type& a, Tag t = Tag()) >- : allocator_type(a), tag_(t) {} >+ explicit AllocatorAndTag(const allocator_type& a) : allocator_type(a) {} > Tag& tag() { return tag_; } > const Tag& tag() const { return tag_; } > allocator_type& allocator() { return *this; } >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/inlined_vector_benchmark.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/inlined_vector_benchmark.cc >index 24f2174928a38b01fef99ae27cb89e4198265ab3..a3ad0f8ae4f1a163c7f232932e46a38ee4ea670e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/inlined_vector_benchmark.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/inlined_vector_benchmark.cc >@@ -66,7 +66,7 @@ BENCHMARK(BM_StdVectorFill)->Range(0, 1024); > // The purpose of the next two benchmarks is to verify that > // absl::InlinedVector is efficient when moving is more efficent than > // copying. To do so, we use strings that are larger than the short >-// std::string optimization. >+// string optimization. 
> bool StringRepresentedInline(std::string s) { > const char* chars = s.data(); > std::string s1 = std::move(s); >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/inlined_vector_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/inlined_vector_test.cc >index 196a1bed976c4bbd060a0c81a30819aced4669b2..08dcd3ef66fb6a3302f862c52ad91f536260d400 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/inlined_vector_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/inlined_vector_test.cc >@@ -31,6 +31,7 @@ > #include "absl/base/internal/raw_logging.h" > #include "absl/base/macros.h" > #include "absl/container/internal/test_instance_tracker.h" >+#include "absl/hash/hash_testing.h" > #include "absl/memory/memory.h" > #include "absl/strings/str_cat.h" > >@@ -1788,4 +1789,5 @@ TEST(AllocatorSupportTest, SizeAllocConstructor) { > EXPECT_THAT(v, AllOf(SizeIs(len), Each(0))); > } > } >+ > } // anonymous namespace >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/container_memory.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/container_memory.h >new file mode 100644 >index 0000000000000000000000000000000000000000..56c5d2df67314e829fc9f591b36e65ab9df3e0ca >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/container_memory.h >@@ -0,0 +1,405 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. 
>+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. >+ >+#ifndef ABSL_CONTAINER_INTERNAL_CONTAINER_MEMORY_H_ >+#define ABSL_CONTAINER_INTERNAL_CONTAINER_MEMORY_H_ >+ >+#ifdef ADDRESS_SANITIZER >+#include <sanitizer/asan_interface.h> >+#endif >+ >+#ifdef MEMORY_SANITIZER >+#include <sanitizer/msan_interface.h> >+#endif >+ >+#include <cassert> >+#include <cstddef> >+#include <memory> >+#include <tuple> >+#include <type_traits> >+#include <utility> >+ >+#include "absl/memory/memory.h" >+#include "absl/utility/utility.h" >+ >+namespace absl { >+namespace container_internal { >+ >+// Allocates at least n bytes aligned to the specified alignment. >+// Alignment must be a power of 2. It must be positive. >+// >+// Note that many allocators don't honor alignment requirements above a certain >+// threshold (usually either alignof(std::max_align_t) or alignof(void*)). >+// Allocate() doesn't apply alignment corrections. If the underlying allocator >+// returns an insufficiently aligned pointer, that's what you are going to get. 
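An editorial aside: the technique the comment above describes, obtaining over-aligned raw storage by rebinding the allocator to an `alignas` struct, can be sketched with `std::allocator_traits` alone (illustrative names, not part of the patch):

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <memory>

// Rebind the caller's allocator to an over-aligned empty struct M so
// that raw byte storage comes back aligned to Alignment (a power of 2).
template <std::size_t Alignment, class Alloc>
void* AlignedAllocate(Alloc* alloc, std::size_t n) {
  struct alignas(Alignment) M {};
  using A = typename std::allocator_traits<Alloc>::template rebind_alloc<M>;
  using AT = typename std::allocator_traits<Alloc>::template rebind_traits<M>;
  A mem_alloc(*alloc);
  // Round the byte count up to a whole number of M objects.
  void* p = AT::allocate(mem_alloc, (n + sizeof(M) - 1) / sizeof(M));
  assert(reinterpret_cast<std::uintptr_t>(p) % Alignment == 0);
  return p;
}

// Must mirror the rebinding done at allocation time.
template <std::size_t Alignment, class Alloc>
void AlignedDeallocate(Alloc* alloc, void* p, std::size_t n) {
  struct alignas(Alignment) M {};
  using A = typename std::allocator_traits<Alloc>::template rebind_alloc<M>;
  using AT = typename std::allocator_traits<Alloc>::template rebind_traits<M>;
  A mem_alloc(*alloc);
  AT::deallocate(mem_alloc, static_cast<M*>(p), (n + sizeof(M) - 1) / sizeof(M));
}
```

Note the caveat from the comment still applies: an allocator that ignores extended alignment will trip the assert rather than get corrected.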
>+template <size_t Alignment, class Alloc> >+void* Allocate(Alloc* alloc, size_t n) { >+ static_assert(Alignment > 0, ""); >+ assert(n && "n must be positive"); >+ struct alignas(Alignment) M {}; >+ using A = typename absl::allocator_traits<Alloc>::template rebind_alloc<M>; >+ using AT = typename absl::allocator_traits<Alloc>::template rebind_traits<M>; >+ A mem_alloc(*alloc); >+ void* p = AT::allocate(mem_alloc, (n + sizeof(M) - 1) / sizeof(M)); >+ assert(reinterpret_cast<uintptr_t>(p) % Alignment == 0 && >+ "allocator does not respect alignment"); >+ return p; >+} >+ >+// The pointer must have been previously obtained by calling >+// Allocate<Alignment>(alloc, n). >+template <size_t Alignment, class Alloc> >+void Deallocate(Alloc* alloc, void* p, size_t n) { >+ static_assert(Alignment > 0, ""); >+ assert(n && "n must be positive"); >+ struct alignas(Alignment) M {}; >+ using A = typename absl::allocator_traits<Alloc>::template rebind_alloc<M>; >+ using AT = typename absl::allocator_traits<Alloc>::template rebind_traits<M>; >+ A mem_alloc(*alloc); >+ AT::deallocate(mem_alloc, static_cast<M*>(p), >+ (n + sizeof(M) - 1) / sizeof(M)); >+} >+ >+namespace memory_internal { >+ >+// Constructs T into uninitialized storage pointed by `ptr` using the args >+// specified in the tuple. >+template <class Alloc, class T, class Tuple, size_t... I> >+void ConstructFromTupleImpl(Alloc* alloc, T* ptr, Tuple&& t, >+ absl::index_sequence<I...>) { >+ absl::allocator_traits<Alloc>::construct( >+ *alloc, ptr, std::get<I>(std::forward<Tuple>(t))...); >+} >+ >+template <class T, class F> >+struct WithConstructedImplF { >+ template <class... Args> >+ decltype(std::declval<F>()(std::declval<T>())) operator()( >+ Args&&... args) const { >+ return std::forward<F>(f)(T(std::forward<Args>(args)...)); >+ } >+ F&& f; >+}; >+ >+template <class T, class Tuple, size_t... 
Is, class F> >+decltype(std::declval<F>()(std::declval<T>())) WithConstructedImpl( >+ Tuple&& t, absl::index_sequence<Is...>, F&& f) { >+ return WithConstructedImplF<T, F>{std::forward<F>(f)}( >+ std::get<Is>(std::forward<Tuple>(t))...); >+} >+ >+template <class T, size_t... Is> >+auto TupleRefImpl(T&& t, absl::index_sequence<Is...>) >+ -> decltype(std::forward_as_tuple(std::get<Is>(std::forward<T>(t))...)) { >+ return std::forward_as_tuple(std::get<Is>(std::forward<T>(t))...); >+} >+ >+// Returns a tuple of references to the elements of the input tuple. T must be a >+// tuple. >+template <class T> >+auto TupleRef(T&& t) -> decltype( >+ TupleRefImpl(std::forward<T>(t), >+ absl::make_index_sequence< >+ std::tuple_size<typename std::decay<T>::type>::value>())) { >+ return TupleRefImpl( >+ std::forward<T>(t), >+ absl::make_index_sequence< >+ std::tuple_size<typename std::decay<T>::type>::value>()); >+} >+ >+template <class F, class K, class V> >+decltype(std::declval<F>()(std::declval<const K&>(), std::piecewise_construct, >+ std::declval<std::tuple<K>>(), std::declval<V>())) >+DecomposePairImpl(F&& f, std::pair<std::tuple<K>, V> p) { >+ const auto& key = std::get<0>(p.first); >+ return std::forward<F>(f)(key, std::piecewise_construct, std::move(p.first), >+ std::move(p.second)); >+} >+ >+} // namespace memory_internal >+ >+// Constructs T into uninitialized storage pointed by `ptr` using the args >+// specified in the tuple. >+template <class Alloc, class T, class Tuple> >+void ConstructFromTuple(Alloc* alloc, T* ptr, Tuple&& t) { >+ memory_internal::ConstructFromTupleImpl( >+ alloc, ptr, std::forward<Tuple>(t), >+ absl::make_index_sequence< >+ std::tuple_size<typename std::decay<Tuple>::type>::value>()); >+} >+ >+// Constructs T using the args specified in the tuple and calls F with the >+// constructed value. 
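An editorial aside: the `index_sequence` expansion behind `ConstructFromTuple` and `WithConstructed` boils down to unpacking a tuple into a constructor call; a minimal std-only sketch (illustrative names, not part of the patch):

```cpp
#include <cassert>
#include <string>
#include <tuple>
#include <utility>

// Illustrative: expand tuple elements into a constructor call via a
// pack of indices, the same shape as ConstructFromTupleImpl above
// (which additionally routes through allocator_traits::construct).
template <class T, class Tuple, std::size_t... I>
T MakeFromTupleImpl(Tuple&& t, std::index_sequence<I...>) {
  return T(std::get<I>(std::forward<Tuple>(t))...);
}

template <class T, class Tuple>
T MakeFromTuple(Tuple&& t) {
  return MakeFromTupleImpl<T>(
      std::forward<Tuple>(t),
      std::make_index_sequence<
          std::tuple_size<typename std::decay<Tuple>::type>::value>());
}
```

For example, `MakeFromTuple<std::string>(std::make_tuple(5, 'a'))` invokes the `string(count, char)` constructor, matching the `ConstructTwoArg` test later in this patch.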
>+template <class T, class Tuple, class F> >+decltype(std::declval<F>()(std::declval<T>())) WithConstructed( >+ Tuple&& t, F&& f) { >+ return memory_internal::WithConstructedImpl<T>( >+ std::forward<Tuple>(t), >+ absl::make_index_sequence< >+ std::tuple_size<typename std::decay<Tuple>::type>::value>(), >+ std::forward<F>(f)); >+} >+ >+// Given arguments of an std::pair's constructor, PairArgs() returns a pair of >+// tuples with references to the passed arguments. The tuples contain >+// constructor arguments for the first and the second elements of the pair. >+// >+// The following two snippets are equivalent. >+// >+// 1. std::pair<F, S> p(args...); >+// >+// 2. auto a = PairArgs(args...); >+// std::pair<F, S> p(std::piecewise_construct, >+// std::move(a.first), std::move(a.second)); >+inline std::pair<std::tuple<>, std::tuple<>> PairArgs() { return {}; } >+template <class F, class S> >+std::pair<std::tuple<F&&>, std::tuple<S&&>> PairArgs(F&& f, S&& s) { >+ return {std::piecewise_construct, std::forward_as_tuple(std::forward<F>(f)), >+ std::forward_as_tuple(std::forward<S>(s))}; >+} >+template <class F, class S> >+std::pair<std::tuple<const F&>, std::tuple<const S&>> PairArgs( >+ const std::pair<F, S>& p) { >+ return PairArgs(p.first, p.second); >+} >+template <class F, class S> >+std::pair<std::tuple<F&&>, std::tuple<S&&>> PairArgs(std::pair<F, S>&& p) { >+ return PairArgs(std::forward<F>(p.first), std::forward<S>(p.second)); >+} >+template <class F, class S> >+auto PairArgs(std::piecewise_construct_t, F&& f, S&& s) >+ -> decltype(std::make_pair(memory_internal::TupleRef(std::forward<F>(f)), >+ memory_internal::TupleRef(std::forward<S>(s)))) { >+ return std::make_pair(memory_internal::TupleRef(std::forward<F>(f)), >+ memory_internal::TupleRef(std::forward<S>(s))); >+} >+ >+// A helper function for implementing apply() in map policies. >+template <class F, class... Args> >+auto DecomposePair(F&& f, Args&&... 
args) >+ -> decltype(memory_internal::DecomposePairImpl( >+ std::forward<F>(f), PairArgs(std::forward<Args>(args)...))) { >+ return memory_internal::DecomposePairImpl( >+ std::forward<F>(f), PairArgs(std::forward<Args>(args)...)); >+} >+ >+// A helper function for implementing apply() in set policies. >+template <class F, class Arg> >+decltype(std::declval<F>()(std::declval<const Arg&>(), std::declval<Arg>())) >+DecomposeValue(F&& f, Arg&& arg) { >+ const auto& key = arg; >+ return std::forward<F>(f)(key, std::forward<Arg>(arg)); >+} >+ >+// Helper functions for asan and msan. >+inline void SanitizerPoisonMemoryRegion(const void* m, size_t s) { >+#ifdef ADDRESS_SANITIZER >+ ASAN_POISON_MEMORY_REGION(m, s); >+#endif >+#ifdef MEMORY_SANITIZER >+ __msan_poison(m, s); >+#endif >+ (void)m; >+ (void)s; >+} >+ >+inline void SanitizerUnpoisonMemoryRegion(const void* m, size_t s) { >+#ifdef ADDRESS_SANITIZER >+ ASAN_UNPOISON_MEMORY_REGION(m, s); >+#endif >+#ifdef MEMORY_SANITIZER >+ __msan_unpoison(m, s); >+#endif >+ (void)m; >+ (void)s; >+} >+ >+template <typename T> >+inline void SanitizerPoisonObject(const T* object) { >+ SanitizerPoisonMemoryRegion(object, sizeof(T)); >+} >+ >+template <typename T> >+inline void SanitizerUnpoisonObject(const T* object) { >+ SanitizerUnpoisonMemoryRegion(object, sizeof(T)); >+} >+ >+namespace memory_internal { >+ >+// If Pair is a standard-layout type, OffsetOf<Pair>::kFirst and >+// OffsetOf<Pair>::kSecond are equivalent to offsetof(Pair, first) and >+// offsetof(Pair, second) respectively. Otherwise they are -1. >+// >+// The purpose of OffsetOf is to avoid calling offsetof() on non-standard-layout >+// type, which is non-portable. 
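An editorial aside: the portability concern above, `offsetof()` being well-defined only for standard-layout types, can be checked entirely at compile time; a small sketch (illustrative, not part of the patch):

```cpp
#include <cassert>
#include <cstddef>
#include <type_traits>

// Illustrative: offsetof() is only portable on standard-layout types,
// which is why the OffsetOf helper in the patch is SFINAE-guarded on
// std::is_standard_layout and falls back to -1 otherwise.
struct PlainPair {
  int first;
  long second;
};

static_assert(std::is_standard_layout<PlainPair>::value,
              "offsetof(PlainPair, ...) is well-defined");
static_assert(offsetof(PlainPair, first) == 0,
              "first member of a standard-layout type sits at offset 0");
```

The layout-compatibility test in `IsLayoutCompatible` compares exactly these offsets between `std::pair<K, V>` and `std::pair<const K, V>` before allowing the union trick.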
>+template <class Pair, class = std::true_type> >+struct OffsetOf { >+ static constexpr size_t kFirst = -1; >+ static constexpr size_t kSecond = -1; >+}; >+ >+template <class Pair> >+struct OffsetOf<Pair, typename std::is_standard_layout<Pair>::type> { >+ static constexpr size_t kFirst = offsetof(Pair, first); >+ static constexpr size_t kSecond = offsetof(Pair, second); >+}; >+ >+template <class K, class V> >+struct IsLayoutCompatible { >+ private: >+ struct Pair { >+ K first; >+ V second; >+ }; >+ >+ // Is P layout-compatible with Pair? >+ template <class P> >+ static constexpr bool LayoutCompatible() { >+ return std::is_standard_layout<P>() && sizeof(P) == sizeof(Pair) && >+ alignof(P) == alignof(Pair) && >+ memory_internal::OffsetOf<P>::kFirst == >+ memory_internal::OffsetOf<Pair>::kFirst && >+ memory_internal::OffsetOf<P>::kSecond == >+ memory_internal::OffsetOf<Pair>::kSecond; >+ } >+ >+ public: >+ // Whether pair<const K, V> and pair<K, V> are layout-compatible. If they are, >+ // then it is safe to store them in a union and read from either. >+ static constexpr bool value = std::is_standard_layout<K>() && >+ std::is_standard_layout<Pair>() && >+ memory_internal::OffsetOf<Pair>::kFirst == 0 && >+ LayoutCompatible<std::pair<K, V>>() && >+ LayoutCompatible<std::pair<const K, V>>(); >+}; >+ >+} // namespace memory_internal >+ >+// If kMutableKeys is false, only the value member is accessed. >+// >+// If kMutableKeys is true, key is accessed through all slots while value and >+// mutable_value are accessed only via INITIALIZED slots. Slots are created and >+// destroyed via mutable_value so that the key can be moved later. >+template <class K, class V> >+union slot_type { >+ private: >+ static void emplace(slot_type* slot) { >+ // The construction of union doesn't do anything at runtime but it allows us >+ // to access its members without violating aliasing rules. 
>+ new (slot) slot_type; >+ } >+ // If pair<const K, V> and pair<K, V> are layout-compatible, we can accept one >+ // or the other via slot_type. We are also free to access the key via >+ // slot_type::key in this case. >+ using kMutableKeys = >+ std::integral_constant<bool, >+ memory_internal::IsLayoutCompatible<K, V>::value>; >+ >+ public: >+ slot_type() {} >+ ~slot_type() = delete; >+ using value_type = std::pair<const K, V>; >+ using mutable_value_type = std::pair<K, V>; >+ >+ value_type value; >+ mutable_value_type mutable_value; >+ K key; >+ >+ template <class Allocator, class... Args> >+ static void construct(Allocator* alloc, slot_type* slot, Args&&... args) { >+ emplace(slot); >+ if (kMutableKeys::value) { >+ absl::allocator_traits<Allocator>::construct(*alloc, &slot->mutable_value, >+ std::forward<Args>(args)...); >+ } else { >+ absl::allocator_traits<Allocator>::construct(*alloc, &slot->value, >+ std::forward<Args>(args)...); >+ } >+ } >+ >+ // Construct this slot by moving from another slot. 
>+ template <class Allocator> >+ static void construct(Allocator* alloc, slot_type* slot, slot_type* other) { >+ emplace(slot); >+ if (kMutableKeys::value) { >+ absl::allocator_traits<Allocator>::construct( >+ *alloc, &slot->mutable_value, std::move(other->mutable_value)); >+ } else { >+ absl::allocator_traits<Allocator>::construct(*alloc, &slot->value, >+ std::move(other->value)); >+ } >+ } >+ >+ template <class Allocator> >+ static void destroy(Allocator* alloc, slot_type* slot) { >+ if (kMutableKeys::value) { >+ absl::allocator_traits<Allocator>::destroy(*alloc, &slot->mutable_value); >+ } else { >+ absl::allocator_traits<Allocator>::destroy(*alloc, &slot->value); >+ } >+ } >+ >+ template <class Allocator> >+ static void transfer(Allocator* alloc, slot_type* new_slot, >+ slot_type* old_slot) { >+ emplace(new_slot); >+ if (kMutableKeys::value) { >+ absl::allocator_traits<Allocator>::construct( >+ *alloc, &new_slot->mutable_value, std::move(old_slot->mutable_value)); >+ } else { >+ absl::allocator_traits<Allocator>::construct(*alloc, &new_slot->value, >+ std::move(old_slot->value)); >+ } >+ destroy(alloc, old_slot); >+ } >+ >+ template <class Allocator> >+ static void swap(Allocator* alloc, slot_type* a, slot_type* b) { >+ if (kMutableKeys::value) { >+ using std::swap; >+ swap(a->mutable_value, b->mutable_value); >+ } else { >+ value_type tmp = std::move(a->value); >+ absl::allocator_traits<Allocator>::destroy(*alloc, &a->value); >+ absl::allocator_traits<Allocator>::construct(*alloc, &a->value, >+ std::move(b->value)); >+ absl::allocator_traits<Allocator>::destroy(*alloc, &b->value); >+ absl::allocator_traits<Allocator>::construct(*alloc, &b->value, >+ std::move(tmp)); >+ } >+ } >+ >+ template <class Allocator> >+ static void move(Allocator* alloc, slot_type* src, slot_type* dest) { >+ if (kMutableKeys::value) { >+ dest->mutable_value = std::move(src->mutable_value); >+ } else { >+ absl::allocator_traits<Allocator>::destroy(*alloc, &dest->value); >+ 
absl::allocator_traits<Allocator>::construct(*alloc, &dest->value, >+ std::move(src->value)); >+ } >+ } >+ >+ template <class Allocator> >+ static void move(Allocator* alloc, slot_type* first, slot_type* last, >+ slot_type* result) { >+ for (slot_type *src = first, *dest = result; src != last; ++src, ++dest) >+ move(alloc, src, dest); >+ } >+}; >+ >+} // namespace container_internal >+} // namespace absl >+ >+#endif // ABSL_CONTAINER_INTERNAL_CONTAINER_MEMORY_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/container_memory_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/container_memory_test.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..f1c4058298c157409fec9ac990850c0b1494971f >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/container_memory_test.cc >@@ -0,0 +1,188 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. 
>+ >+#include "absl/container/internal/container_memory.h" >+ >+#include <cstdint> >+#include <tuple> >+#include <utility> >+ >+#include "gmock/gmock.h" >+#include "gtest/gtest.h" >+#include "absl/strings/string_view.h" >+ >+namespace absl { >+namespace container_internal { >+namespace { >+ >+using ::testing::Pair; >+ >+TEST(Memory, AlignmentLargerThanBase) { >+ std::allocator<int8_t> alloc; >+ void* mem = Allocate<2>(&alloc, 3); >+ EXPECT_EQ(0, reinterpret_cast<uintptr_t>(mem) % 2); >+ memcpy(mem, "abc", 3); >+ Deallocate<2>(&alloc, mem, 3); >+} >+ >+TEST(Memory, AlignmentSmallerThanBase) { >+ std::allocator<int64_t> alloc; >+ void* mem = Allocate<2>(&alloc, 3); >+ EXPECT_EQ(0, reinterpret_cast<uintptr_t>(mem) % 2); >+ memcpy(mem, "abc", 3); >+ Deallocate<2>(&alloc, mem, 3); >+} >+ >+class Fixture : public ::testing::Test { >+ using Alloc = std::allocator<std::string>; >+ >+ public: >+ Fixture() { ptr_ = std::allocator_traits<Alloc>::allocate(*alloc(), 1); } >+ ~Fixture() override { >+ std::allocator_traits<Alloc>::destroy(*alloc(), ptr_); >+ std::allocator_traits<Alloc>::deallocate(*alloc(), ptr_, 1); >+ } >+ std::string* ptr() { return ptr_; } >+ Alloc* alloc() { return &alloc_; } >+ >+ private: >+ Alloc alloc_; >+ std::string* ptr_; >+}; >+ >+TEST_F(Fixture, ConstructNoArgs) { >+ ConstructFromTuple(alloc(), ptr(), std::forward_as_tuple()); >+ EXPECT_EQ(*ptr(), ""); >+} >+ >+TEST_F(Fixture, ConstructOneArg) { >+ ConstructFromTuple(alloc(), ptr(), std::forward_as_tuple("abcde")); >+ EXPECT_EQ(*ptr(), "abcde"); >+} >+ >+TEST_F(Fixture, ConstructTwoArg) { >+ ConstructFromTuple(alloc(), ptr(), std::forward_as_tuple(5, 'a')); >+ EXPECT_EQ(*ptr(), "aaaaa"); >+} >+ >+TEST(PairArgs, NoArgs) { >+ EXPECT_THAT(PairArgs(), >+ Pair(std::forward_as_tuple(), std::forward_as_tuple())); >+} >+ >+TEST(PairArgs, TwoArgs) { >+ EXPECT_EQ( >+ std::make_pair(std::forward_as_tuple(1), std::forward_as_tuple('A')), >+ PairArgs(1, 'A')); >+} >+ >+TEST(PairArgs, Pair) { >+ EXPECT_EQ( >+ 
std::make_pair(std::forward_as_tuple(1), std::forward_as_tuple('A')), >+ PairArgs(std::make_pair(1, 'A'))); >+} >+ >+TEST(PairArgs, Piecewise) { >+ EXPECT_EQ( >+ std::make_pair(std::forward_as_tuple(1), std::forward_as_tuple('A')), >+ PairArgs(std::piecewise_construct, std::forward_as_tuple(1), >+ std::forward_as_tuple('A'))); >+} >+ >+TEST(WithConstructed, Simple) { >+ EXPECT_EQ(1, WithConstructed<absl::string_view>( >+ std::make_tuple(std::string("a")), >+ [](absl::string_view str) { return str.size(); })); >+} >+ >+template <class F, class Arg> >+decltype(DecomposeValue(std::declval<F>(), std::declval<Arg>())) >+DecomposeValueImpl(int, F&& f, Arg&& arg) { >+ return DecomposeValue(std::forward<F>(f), std::forward<Arg>(arg)); >+} >+ >+template <class F, class Arg> >+const char* DecomposeValueImpl(char, F&& f, Arg&& arg) { >+ return "not decomposable"; >+} >+ >+template <class F, class Arg> >+decltype(DecomposeValueImpl(0, std::declval<F>(), std::declval<Arg>())) >+TryDecomposeValue(F&& f, Arg&& arg) { >+ return DecomposeValueImpl(0, std::forward<F>(f), std::forward<Arg>(arg)); >+} >+ >+TEST(DecomposeValue, Decomposable) { >+ auto f = [](const int& x, int&& y) { >+ EXPECT_EQ(&x, &y); >+ EXPECT_EQ(42, x); >+ return 'A'; >+ }; >+ EXPECT_EQ('A', TryDecomposeValue(f, 42)); >+} >+ >+TEST(DecomposeValue, NotDecomposable) { >+ auto f = [](void*) { >+ ADD_FAILURE() << "Must not be called"; >+ return 'A'; >+ }; >+ EXPECT_STREQ("not decomposable", TryDecomposeValue(f, 42)); >+} >+ >+template <class F, class... Args> >+decltype(DecomposePair(std::declval<F>(), std::declval<Args>()...)) >+DecomposePairImpl(int, F&& f, Args&&... args) { >+ return DecomposePair(std::forward<F>(f), std::forward<Args>(args)...); >+} >+ >+template <class F, class... Args> >+const char* DecomposePairImpl(char, F&& f, Args&&... args) { >+ return "not decomposable"; >+} >+ >+template <class F, class... 
Args> >+decltype(DecomposePairImpl(0, std::declval<F>(), std::declval<Args>()...)) >+TryDecomposePair(F&& f, Args&&... args) { >+ return DecomposePairImpl(0, std::forward<F>(f), std::forward<Args>(args)...); >+} >+ >+TEST(DecomposePair, Decomposable) { >+ auto f = [](const int& x, std::piecewise_construct_t, std::tuple<int&&> k, >+ std::tuple<double>&& v) { >+ EXPECT_EQ(&x, &std::get<0>(k)); >+ EXPECT_EQ(42, x); >+ EXPECT_EQ(0.5, std::get<0>(v)); >+ return 'A'; >+ }; >+ EXPECT_EQ('A', TryDecomposePair(f, 42, 0.5)); >+ EXPECT_EQ('A', TryDecomposePair(f, std::make_pair(42, 0.5))); >+ EXPECT_EQ('A', TryDecomposePair(f, std::piecewise_construct, >+ std::make_tuple(42), std::make_tuple(0.5))); >+} >+ >+TEST(DecomposePair, NotDecomposable) { >+ auto f = [](...) { >+ ADD_FAILURE() << "Must not be called"; >+ return 'A'; >+ }; >+ EXPECT_STREQ("not decomposable", >+ TryDecomposePair(f)); >+ EXPECT_STREQ("not decomposable", >+ TryDecomposePair(f, std::piecewise_construct, std::make_tuple(), >+ std::make_tuple(0.5))); >+} >+ >+} // namespace >+} // namespace container_internal >+} // namespace absl >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hash_function_defaults.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hash_function_defaults.h >new file mode 100644 >index 0000000000000000000000000000000000000000..1f0d794d966388186dd7dfd69a8ae9a6adf709f6 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hash_function_defaults.h >@@ -0,0 +1,143 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. 
>+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. >+// >+// Define the default Hash and Eq functions for SwissTable containers. >+// >+// std::hash<T> and std::equal_to<T> are not appropriate hash and equal >+// functions for SwissTable containers. There are two reasons for this. >+// >+// SwissTable containers are power of 2 sized containers: >+// >+// This means they use the lower bits of the hash value to find the slot for >+// each entry. The typical hash function for integral types is the identity. >+// This is a very weak hash function for SwissTable and any power of 2 sized >+// hashtable implementation which will lead to excessive collisions. For >+// SwissTable we use murmur3 style mixing to reduce collisions to a minimum. >+// >+// SwissTable containers support heterogeneous lookup: >+// >+// In order to make heterogeneous lookup work, hash and equal functions must be >+// polymorphic. At the same time they have to satisfy the same requirements the >+// C++ standard imposes on hash functions and equality operators. That is: >+// >+// if hash_default_eq<T>(a, b) returns true for any a and b of type T, then >+// hash_default_hash<T>(a) must equal hash_default_hash<T>(b) >+// >+// For SwissTable containers this requirement is relaxed to allow a and b of >+// any and possibly different types. Note that like the standard the hash and >+// equal functions are still bound to T. This is important because some type U >+// can be hashed by/tested for equality differently depending on T. A notable >+// example is `const char*`. 
`const char*` is treated as a c-style string when >+// the hash function is hash<string> but as a pointer when the hash function is >+// hash<void*>. >+// >+#ifndef ABSL_CONTAINER_INTERNAL_HASH_FUNCTION_DEFAULTS_H_ >+#define ABSL_CONTAINER_INTERNAL_HASH_FUNCTION_DEFAULTS_H_ >+ >+#include <stdint.h> >+#include <cstddef> >+#include <memory> >+#include <string> >+#include <type_traits> >+ >+#include "absl/base/config.h" >+#include "absl/hash/hash.h" >+#include "absl/strings/string_view.h" >+ >+namespace absl { >+namespace container_internal { >+ >+// The hash of an object of type T is computed by using absl::Hash. >+template <class T, class E = void> >+struct HashEq { >+ using Hash = absl::Hash<T>; >+ using Eq = std::equal_to<T>; >+}; >+ >+struct StringHash { >+ using is_transparent = void; >+ >+ size_t operator()(absl::string_view v) const { >+ return absl::Hash<absl::string_view>{}(v); >+ } >+}; >+ >+// Supports heterogeneous lookup for string-like elements. >+struct StringHashEq { >+ using Hash = StringHash; >+ struct Eq { >+ using is_transparent = void; >+ bool operator()(absl::string_view lhs, absl::string_view rhs) const { >+ return lhs == rhs; >+ } >+ }; >+}; >+template <> >+struct HashEq<std::string> : StringHashEq {}; >+template <> >+struct HashEq<absl::string_view> : StringHashEq {}; >+ >+// Supports heterogeneous lookup for pointers and smart pointers. 
>+template <class T> >+struct HashEq<T*> { >+ struct Hash { >+ using is_transparent = void; >+ template <class U> >+ size_t operator()(const U& ptr) const { >+ return absl::Hash<const T*>{}(HashEq::ToPtr(ptr)); >+ } >+ }; >+ struct Eq { >+ using is_transparent = void; >+ template <class A, class B> >+ bool operator()(const A& a, const B& b) const { >+ return HashEq::ToPtr(a) == HashEq::ToPtr(b); >+ } >+ }; >+ >+ private: >+ static const T* ToPtr(const T* ptr) { return ptr; } >+ template <class U, class D> >+ static const T* ToPtr(const std::unique_ptr<U, D>& ptr) { >+ return ptr.get(); >+ } >+ template <class U> >+ static const T* ToPtr(const std::shared_ptr<U>& ptr) { >+ return ptr.get(); >+ } >+}; >+ >+template <class T, class D> >+struct HashEq<std::unique_ptr<T, D>> : HashEq<T*> {}; >+template <class T> >+struct HashEq<std::shared_ptr<T>> : HashEq<T*> {}; >+ >+// This header's visibility is restricted. If you need to access the default >+// hasher please use the container's ::hasher alias instead. >+// >+// Example: typename Hash = typename absl::flat_hash_map<K, V>::hasher >+template <class T> >+using hash_default_hash = typename container_internal::HashEq<T>::Hash; >+ >+// This header's visibility is restricted. If you need to access the default >+// key equal please use the container's ::key_equal alias instead. 
>+// >+// Example: typename Eq = typename absl::flat_hash_map<K, V, Hash>::key_equal >+template <class T> >+using hash_default_eq = typename container_internal::HashEq<T>::Eq; >+ >+} // namespace container_internal >+} // namespace absl >+ >+#endif // ABSL_CONTAINER_INTERNAL_HASH_FUNCTION_DEFAULTS_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hash_function_defaults_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hash_function_defaults_test.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..464baae02cddb0c41e612e59788d837d59f917fc >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hash_function_defaults_test.cc >@@ -0,0 +1,299 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. 
>+ >+#include "absl/container/internal/hash_function_defaults.h" >+ >+#include <functional> >+#include <type_traits> >+#include <utility> >+ >+#include "gtest/gtest.h" >+#include "absl/strings/string_view.h" >+ >+namespace absl { >+namespace container_internal { >+namespace { >+ >+using ::testing::Types; >+ >+TEST(Eq, Int32) { >+ hash_default_eq<int32_t> eq; >+ EXPECT_TRUE(eq(1, 1u)); >+ EXPECT_TRUE(eq(1, char{1})); >+ EXPECT_TRUE(eq(1, true)); >+ EXPECT_TRUE(eq(1, double{1.1})); >+ EXPECT_FALSE(eq(1, char{2})); >+ EXPECT_FALSE(eq(1, 2u)); >+ EXPECT_FALSE(eq(1, false)); >+ EXPECT_FALSE(eq(1, 2.)); >+} >+ >+TEST(Hash, Int32) { >+ hash_default_hash<int32_t> hash; >+ auto h = hash(1); >+ EXPECT_EQ(h, hash(1u)); >+ EXPECT_EQ(h, hash(char{1})); >+ EXPECT_EQ(h, hash(true)); >+ EXPECT_EQ(h, hash(double{1.1})); >+ EXPECT_NE(h, hash(2u)); >+ EXPECT_NE(h, hash(char{2})); >+ EXPECT_NE(h, hash(false)); >+ EXPECT_NE(h, hash(2.)); >+} >+ >+enum class MyEnum { A, B, C, D }; >+ >+TEST(Eq, Enum) { >+ hash_default_eq<MyEnum> eq; >+ EXPECT_TRUE(eq(MyEnum::A, MyEnum::A)); >+ EXPECT_FALSE(eq(MyEnum::A, MyEnum::B)); >+} >+ >+TEST(Hash, Enum) { >+ hash_default_hash<MyEnum> hash; >+ >+ for (MyEnum e : {MyEnum::A, MyEnum::B, MyEnum::C}) { >+ auto h = hash(e); >+ EXPECT_EQ(h, hash_default_hash<int>{}(static_cast<int>(e))); >+ EXPECT_NE(h, hash(MyEnum::D)); >+ } >+} >+ >+using StringTypes = ::testing::Types<std::string, absl::string_view>; >+ >+template <class T> >+struct EqString : ::testing::Test { >+ hash_default_eq<T> key_eq; >+}; >+ >+TYPED_TEST_CASE(EqString, StringTypes); >+ >+template <class T> >+struct HashString : ::testing::Test { >+ hash_default_hash<T> hasher; >+}; >+ >+TYPED_TEST_CASE(HashString, StringTypes); >+ >+TYPED_TEST(EqString, Works) { >+ auto eq = this->key_eq; >+ EXPECT_TRUE(eq("a", "a")); >+ EXPECT_TRUE(eq("a", absl::string_view("a"))); >+ EXPECT_TRUE(eq("a", std::string("a"))); >+ EXPECT_FALSE(eq("a", "b")); >+ EXPECT_FALSE(eq("a", absl::string_view("b"))); >+ 
EXPECT_FALSE(eq("a", std::string("b"))); >+} >+ >+TYPED_TEST(HashString, Works) { >+ auto hash = this->hasher; >+ auto h = hash("a"); >+ EXPECT_EQ(h, hash(absl::string_view("a"))); >+ EXPECT_EQ(h, hash(std::string("a"))); >+ EXPECT_NE(h, hash(absl::string_view("b"))); >+ EXPECT_NE(h, hash(std::string("b"))); >+} >+ >+struct NoDeleter { >+ template <class T> >+ void operator()(const T* ptr) const {} >+}; >+ >+using PointerTypes = >+ ::testing::Types<const int*, int*, std::unique_ptr<const int>, >+ std::unique_ptr<const int, NoDeleter>, >+ std::unique_ptr<int>, std::unique_ptr<int, NoDeleter>, >+ std::shared_ptr<const int>, std::shared_ptr<int>>; >+ >+template <class T> >+struct EqPointer : ::testing::Test { >+ hash_default_eq<T> key_eq; >+}; >+ >+TYPED_TEST_CASE(EqPointer, PointerTypes); >+ >+template <class T> >+struct HashPointer : ::testing::Test { >+ hash_default_hash<T> hasher; >+}; >+ >+TYPED_TEST_CASE(HashPointer, PointerTypes); >+ >+TYPED_TEST(EqPointer, Works) { >+ int dummy; >+ auto eq = this->key_eq; >+ auto sptr = std::make_shared<int>(); >+ std::shared_ptr<const int> csptr = sptr; >+ int* ptr = sptr.get(); >+ const int* cptr = ptr; >+ std::unique_ptr<int, NoDeleter> uptr(ptr); >+ std::unique_ptr<const int, NoDeleter> cuptr(ptr); >+ >+ EXPECT_TRUE(eq(ptr, cptr)); >+ EXPECT_TRUE(eq(ptr, sptr)); >+ EXPECT_TRUE(eq(ptr, uptr)); >+ EXPECT_TRUE(eq(ptr, csptr)); >+ EXPECT_TRUE(eq(ptr, cuptr)); >+ EXPECT_FALSE(eq(&dummy, cptr)); >+ EXPECT_FALSE(eq(&dummy, sptr)); >+ EXPECT_FALSE(eq(&dummy, uptr)); >+ EXPECT_FALSE(eq(&dummy, csptr)); >+ EXPECT_FALSE(eq(&dummy, cuptr)); >+} >+ >+TEST(Hash, DerivedAndBase) { >+ struct Base {}; >+ struct Derived : Base {}; >+ >+ hash_default_hash<Base*> hasher; >+ >+ Base base; >+ Derived derived; >+ EXPECT_NE(hasher(&base), hasher(&derived)); >+ EXPECT_EQ(hasher(static_cast<Base*>(&derived)), hasher(&derived)); >+ >+ auto dp = std::make_shared<Derived>(); >+ EXPECT_EQ(hasher(static_cast<Base*>(dp.get())), hasher(dp)); >+} >+ 
>+TEST(Hash, FunctionPointer) { >+ using Func = int (*)(); >+ hash_default_hash<Func> hasher; >+ hash_default_eq<Func> eq; >+ >+ Func p1 = [] { return 1; }, p2 = [] { return 2; }; >+ EXPECT_EQ(hasher(p1), hasher(p1)); >+ EXPECT_TRUE(eq(p1, p1)); >+ >+ EXPECT_NE(hasher(p1), hasher(p2)); >+ EXPECT_FALSE(eq(p1, p2)); >+} >+ >+TYPED_TEST(HashPointer, Works) { >+ int dummy; >+ auto hash = this->hasher; >+ auto sptr = std::make_shared<int>(); >+ std::shared_ptr<const int> csptr = sptr; >+ int* ptr = sptr.get(); >+ const int* cptr = ptr; >+ std::unique_ptr<int, NoDeleter> uptr(ptr); >+ std::unique_ptr<const int, NoDeleter> cuptr(ptr); >+ >+ EXPECT_EQ(hash(ptr), hash(cptr)); >+ EXPECT_EQ(hash(ptr), hash(sptr)); >+ EXPECT_EQ(hash(ptr), hash(uptr)); >+ EXPECT_EQ(hash(ptr), hash(csptr)); >+ EXPECT_EQ(hash(ptr), hash(cuptr)); >+ EXPECT_NE(hash(&dummy), hash(cptr)); >+ EXPECT_NE(hash(&dummy), hash(sptr)); >+ EXPECT_NE(hash(&dummy), hash(uptr)); >+ EXPECT_NE(hash(&dummy), hash(csptr)); >+ EXPECT_NE(hash(&dummy), hash(cuptr)); >+} >+ >+// Cartesian product of (string, std::string, absl::string_view) >+// with (string, std::string, absl::string_view, const char*). 
>+using StringTypesCartesianProduct = Types< >+ // clang-format off >+ >+ std::pair<std::string, std::string>, >+ std::pair<std::string, absl::string_view>, >+ std::pair<std::string, const char*>, >+ >+ std::pair<absl::string_view, std::string>, >+ std::pair<absl::string_view, absl::string_view>, >+ std::pair<absl::string_view, const char*>>; >+// clang-format on >+ >+constexpr char kFirstString[] = "abc123"; >+constexpr char kSecondString[] = "ijk456"; >+ >+template <typename T> >+struct StringLikeTest : public ::testing::Test { >+ typename T::first_type a1{kFirstString}; >+ typename T::second_type b1{kFirstString}; >+ typename T::first_type a2{kSecondString}; >+ typename T::second_type b2{kSecondString}; >+ hash_default_eq<typename T::first_type> eq; >+ hash_default_hash<typename T::first_type> hash; >+}; >+ >+TYPED_TEST_CASE_P(StringLikeTest); >+ >+TYPED_TEST_P(StringLikeTest, Eq) { >+ EXPECT_TRUE(this->eq(this->a1, this->b1)); >+ EXPECT_TRUE(this->eq(this->b1, this->a1)); >+} >+ >+TYPED_TEST_P(StringLikeTest, NotEq) { >+ EXPECT_FALSE(this->eq(this->a1, this->b2)); >+ EXPECT_FALSE(this->eq(this->b2, this->a1)); >+} >+ >+TYPED_TEST_P(StringLikeTest, HashEq) { >+ EXPECT_EQ(this->hash(this->a1), this->hash(this->b1)); >+ EXPECT_EQ(this->hash(this->a2), this->hash(this->b2)); >+ // It would be a poor hash function which collides on these strings. >+ EXPECT_NE(this->hash(this->a1), this->hash(this->b2)); >+} >+ >+TYPED_TEST_CASE(StringLikeTest, StringTypesCartesianProduct); >+ >+} // namespace >+} // namespace container_internal >+} // namespace absl >+ >+enum Hash : size_t { >+ kStd = 0x2, // std::hash >+#ifdef _MSC_VER >+ kExtension = kStd, // In MSVC, std::hash == ::hash >+#else // _MSC_VER >+ kExtension = 0x4, // ::hash (GCC extension) >+#endif // _MSC_VER >+}; >+ >+// H is a bitmask of Hash enumerations. >+// Hashable<H> is hashable via all means specified in H. 
>+template <int H> >+struct Hashable { >+ static constexpr bool HashableBy(Hash h) { return h & H; } >+}; >+ >+namespace std { >+template <int H> >+struct hash<Hashable<H>> { >+ template <class E = Hashable<H>, >+ class = typename std::enable_if<E::HashableBy(kStd)>::type> >+ size_t operator()(E) const { >+ return kStd; >+ } >+}; >+} // namespace std >+ >+namespace absl { >+namespace container_internal { >+namespace { >+ >+template <class T> >+size_t Hash(const T& v) { >+ return hash_default_hash<T>()(v); >+} >+ >+TEST(Delegate, HashDispatch) { >+ EXPECT_EQ(Hash(kStd), Hash(Hashable<kStd>())); >+} >+ >+} // namespace >+} // namespace container_internal >+} // namespace absl >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hash_generator_testing.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hash_generator_testing.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..0d6a9df16f8a336c143afaeec2069142361b889c >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hash_generator_testing.cc >@@ -0,0 +1,72 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. 
>+ >+#include "absl/container/internal/hash_generator_testing.h" >+ >+#include <deque> >+ >+namespace absl { >+namespace container_internal { >+namespace hash_internal { >+namespace { >+ >+class RandomDeviceSeedSeq { >+ public: >+ using result_type = typename std::random_device::result_type; >+ >+ template <class Iterator> >+ void generate(Iterator start, Iterator end) { >+ while (start != end) { >+ *start = gen_(); >+ ++start; >+ } >+ } >+ >+ private: >+ std::random_device gen_; >+}; >+ >+} // namespace >+ >+std::mt19937_64* GetThreadLocalRng() { >+ RandomDeviceSeedSeq seed_seq; >+ thread_local auto* rng = new std::mt19937_64(seed_seq); >+ return rng; >+} >+ >+std::string Generator<std::string>::operator()() const { >+ // NOLINTNEXTLINE(runtime/int) >+ std::uniform_int_distribution<short> chars(0x20, 0x7E); >+ std::string res; >+ res.resize(32); >+ std::generate(res.begin(), res.end(), >+ [&]() { return chars(*GetThreadLocalRng()); }); >+ return res; >+} >+ >+absl::string_view Generator<absl::string_view>::operator()() const { >+ static auto* arena = new std::deque<std::string>(); >+ // NOLINTNEXTLINE(runtime/int) >+ std::uniform_int_distribution<short> chars(0x20, 0x7E); >+ arena->emplace_back(); >+ auto& res = arena->back(); >+ res.resize(32); >+ std::generate(res.begin(), res.end(), >+ [&]() { return chars(*GetThreadLocalRng()); }); >+ return res; >+} >+ >+} // namespace hash_internal >+} // namespace container_internal >+} // namespace absl >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hash_generator_testing.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hash_generator_testing.h >new file mode 100644 >index 0000000000000000000000000000000000000000..50d771026c7b5005135d2863165a009cb4dbca98 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hash_generator_testing.h >@@ -0,0 +1,150 @@ >+// Copyright 2018 The Abseil Authors. 
>+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. >+// >+// Generates random values for testing. Specialized only for the few types we >+// care about. >+ >+#ifndef ABSL_CONTAINER_INTERNAL_HASH_GENERATOR_TESTING_H_ >+#define ABSL_CONTAINER_INTERNAL_HASH_GENERATOR_TESTING_H_ >+ >+#include <stdint.h> >+#include <algorithm> >+#include <iosfwd> >+#include <random> >+#include <tuple> >+#include <type_traits> >+#include <utility> >+ >+#include "absl/container/internal/hash_policy_testing.h" >+#include "absl/meta/type_traits.h" >+#include "absl/strings/string_view.h" >+ >+namespace absl { >+namespace container_internal { >+namespace hash_internal { >+namespace generator_internal { >+ >+template <class Container, class = void> >+struct IsMap : std::false_type {}; >+ >+template <class Map> >+struct IsMap<Map, absl::void_t<typename Map::mapped_type>> : std::true_type {}; >+ >+} // namespace generator_internal >+ >+std::mt19937_64* GetThreadLocalRng(); >+ >+enum Enum { >+ kEnumEmpty, >+ kEnumDeleted, >+}; >+ >+enum class EnumClass : uint64_t { >+ kEmpty, >+ kDeleted, >+}; >+ >+inline std::ostream& operator<<(std::ostream& o, const EnumClass& ec) { >+ return o << static_cast<uint64_t>(ec); >+} >+ >+template <class T, class E = void> >+struct Generator; >+ >+template <class T> >+struct Generator<T, typename std::enable_if<std::is_integral<T>::value>::type> { >+ T operator()() const { >+ std::uniform_int_distribution<T> dist; >+ return 
dist(*GetThreadLocalRng()); >+ } >+}; >+ >+template <> >+struct Generator<Enum> { >+ Enum operator()() const { >+ std::uniform_int_distribution<typename std::underlying_type<Enum>::type> >+ dist; >+ while (true) { >+ auto variate = dist(*GetThreadLocalRng()); >+ if (variate != kEnumEmpty && variate != kEnumDeleted) >+ return static_cast<Enum>(variate); >+ } >+ } >+}; >+ >+template <> >+struct Generator<EnumClass> { >+ EnumClass operator()() const { >+ std::uniform_int_distribution< >+ typename std::underlying_type<EnumClass>::type> >+ dist; >+ while (true) { >+ EnumClass variate = static_cast<EnumClass>(dist(*GetThreadLocalRng())); >+ if (variate != EnumClass::kEmpty && variate != EnumClass::kDeleted) >+ return static_cast<EnumClass>(variate); >+ } >+ } >+}; >+ >+template <> >+struct Generator<std::string> { >+ std::string operator()() const; >+}; >+ >+template <> >+struct Generator<absl::string_view> { >+ absl::string_view operator()() const; >+}; >+ >+template <> >+struct Generator<NonStandardLayout> { >+ NonStandardLayout operator()() const { >+ return NonStandardLayout(Generator<std::string>()()); >+ } >+}; >+ >+template <class K, class V> >+struct Generator<std::pair<K, V>> { >+ std::pair<K, V> operator()() const { >+ return std::pair<K, V>(Generator<typename std::decay<K>::type>()(), >+ Generator<typename std::decay<V>::type>()()); >+ } >+}; >+ >+template <class... 
Ts> >+struct Generator<std::tuple<Ts...>> { >+ std::tuple<Ts...> operator()() const { >+ return std::tuple<Ts...>(Generator<typename std::decay<Ts>::type>()()...); >+ } >+}; >+ >+template <class U> >+struct Generator<U, absl::void_t<decltype(std::declval<U&>().key()), >+ decltype(std::declval<U&>().value())>> >+ : Generator<std::pair< >+ typename std::decay<decltype(std::declval<U&>().key())>::type, >+ typename std::decay<decltype(std::declval<U&>().value())>::type>> {}; >+ >+template <class Container> >+using GeneratedType = decltype( >+ std::declval<const Generator< >+ typename std::conditional<generator_internal::IsMap<Container>::value, >+ typename Container::value_type, >+ typename Container::key_type>::type>&>()()); >+ >+} // namespace hash_internal >+} // namespace container_internal >+} // namespace absl >+ >+#endif // ABSL_CONTAINER_INTERNAL_HASH_GENERATOR_TESTING_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hash_policy_testing.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hash_policy_testing.h >new file mode 100644 >index 0000000000000000000000000000000000000000..ffc76ead7a6842e9fcbcf32042f12f7eac6e9f98 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hash_policy_testing.h >@@ -0,0 +1,178 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. 
>+// >+// Utilities to help tests verify that hash tables properly handle stateful >+// allocators and hash functions. >+ >+#ifndef ABSL_CONTAINER_INTERNAL_HASH_POLICY_TESTING_H_ >+#define ABSL_CONTAINER_INTERNAL_HASH_POLICY_TESTING_H_ >+ >+#include <cstdlib> >+#include <limits> >+#include <memory> >+#include <ostream> >+#include <type_traits> >+#include <utility> >+#include <vector> >+ >+#include "absl/hash/hash.h" >+#include "absl/strings/string_view.h" >+ >+namespace absl { >+namespace container_internal { >+namespace hash_testing_internal { >+ >+template <class Derived> >+struct WithId { >+ WithId() : id_(next_id<Derived>()) {} >+ WithId(const WithId& that) : id_(that.id_) {} >+ WithId(WithId&& that) : id_(that.id_) { that.id_ = 0; } >+ WithId& operator=(const WithId& that) { >+ id_ = that.id_; >+ return *this; >+ } >+ WithId& operator=(WithId&& that) { >+ id_ = that.id_; >+ that.id_ = 0; >+ return *this; >+ } >+ >+ size_t id() const { return id_; } >+ >+ friend bool operator==(const WithId& a, const WithId& b) { >+ return a.id_ == b.id_; >+ } >+ friend bool operator!=(const WithId& a, const WithId& b) { return !(a == b); } >+ >+ protected: >+ explicit WithId(size_t id) : id_(id) {} >+ >+ private: >+ size_t id_; >+ >+ template <class T> >+ static size_t next_id() { >+ // 0 is reserved for moved from state. 
>+ static size_t gId = 1; >+ return gId++; >+ } >+}; >+ >+} // namespace hash_testing_internal >+ >+struct NonStandardLayout { >+ NonStandardLayout() {} >+ explicit NonStandardLayout(std::string s) : value(std::move(s)) {} >+ virtual ~NonStandardLayout() {} >+ >+ friend bool operator==(const NonStandardLayout& a, >+ const NonStandardLayout& b) { >+ return a.value == b.value; >+ } >+ friend bool operator!=(const NonStandardLayout& a, >+ const NonStandardLayout& b) { >+ return a.value != b.value; >+ } >+ >+ template <typename H> >+ friend H AbslHashValue(H h, const NonStandardLayout& v) { >+ return H::combine(std::move(h), v.value); >+ } >+ >+ std::string value; >+}; >+ >+struct StatefulTestingHash >+ : absl::container_internal::hash_testing_internal::WithId< >+ StatefulTestingHash> { >+ template <class T> >+ size_t operator()(const T& t) const { >+ return absl::Hash<T>{}(t); >+ } >+}; >+ >+struct StatefulTestingEqual >+ : absl::container_internal::hash_testing_internal::WithId< >+ StatefulTestingEqual> { >+ template <class T, class U> >+ bool operator()(const T& t, const U& u) const { >+ return t == u; >+ } >+}; >+ >+// It is expected that Alloc() == Alloc() for all allocators so we cannot use >+// WithId base. We need to explicitly assign ids. >+template <class T = int> >+struct Alloc : std::allocator<T> { >+ using propagate_on_container_swap = std::true_type; >+ >+ // Using old paradigm for this to ensure compatibility. 
>+ explicit Alloc(size_t id = 0) : id_(id) {} >+ >+ Alloc(const Alloc&) = default; >+ Alloc& operator=(const Alloc&) = default; >+ >+ template <class U> >+ Alloc(const Alloc<U>& that) : std::allocator<T>(that), id_(that.id()) {} >+ >+ template <class U> >+ struct rebind { >+ using other = Alloc<U>; >+ }; >+ >+ size_t id() const { return id_; } >+ >+ friend bool operator==(const Alloc& a, const Alloc& b) { >+ return a.id_ == b.id_; >+ } >+ friend bool operator!=(const Alloc& a, const Alloc& b) { return !(a == b); } >+ >+ private: >+ size_t id_ = std::numeric_limits<size_t>::max(); >+}; >+ >+template <class Map> >+auto items(const Map& m) -> std::vector< >+ std::pair<typename Map::key_type, typename Map::mapped_type>> { >+ using std::get; >+ std::vector<std::pair<typename Map::key_type, typename Map::mapped_type>> res; >+ res.reserve(m.size()); >+ for (const auto& v : m) res.emplace_back(get<0>(v), get<1>(v)); >+ return res; >+} >+ >+template <class Set> >+auto keys(const Set& s) >+ -> std::vector<typename std::decay<typename Set::key_type>::type> { >+ std::vector<typename std::decay<typename Set::key_type>::type> res; >+ res.reserve(s.size()); >+ for (const auto& v : s) res.emplace_back(v); >+ return res; >+} >+ >+} // namespace container_internal >+} // namespace absl >+ >+// ABSL_UNORDERED_SUPPORTS_ALLOC_CTORS is false for glibcxx versions >+// where the unordered containers are missing certain constructors that >+// take allocator arguments. This test is defined ad-hoc for the platforms >+// we care about (notably Crosstool 17) because libstdcxx's useless >+// versioning scheme precludes a more principled solution. 
>+#if defined(__GLIBCXX__) && __GLIBCXX__ <= 20140425 >+#define ABSL_UNORDERED_SUPPORTS_ALLOC_CTORS 0 >+#else >+#define ABSL_UNORDERED_SUPPORTS_ALLOC_CTORS 1 >+#endif >+ >+#endif // ABSL_CONTAINER_INTERNAL_HASH_POLICY_TESTING_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hash_policy_testing_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hash_policy_testing_test.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..c215c4237ae551686dd824d42da7cfb93b0ce915 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hash_policy_testing_test.cc >@@ -0,0 +1,43 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. 
>+ >+#include "absl/container/internal/hash_policy_testing.h" >+ >+#include "gtest/gtest.h" >+ >+namespace absl { >+namespace container_internal { >+namespace { >+ >+TEST(_, Hash) { >+ StatefulTestingHash h1; >+ EXPECT_EQ(1, h1.id()); >+ StatefulTestingHash h2; >+ EXPECT_EQ(2, h2.id()); >+ StatefulTestingHash h1c(h1); >+ EXPECT_EQ(1, h1c.id()); >+ StatefulTestingHash h2m(std::move(h2)); >+ EXPECT_EQ(2, h2m.id()); >+ EXPECT_EQ(0, h2.id()); >+ StatefulTestingHash h3; >+ EXPECT_EQ(3, h3.id()); >+ h3 = StatefulTestingHash(); >+ EXPECT_EQ(4, h3.id()); >+ h3 = std::move(h1); >+ EXPECT_EQ(1, h3.id()); >+} >+ >+} // namespace >+} // namespace container_internal >+} // namespace absl >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hash_policy_traits.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hash_policy_traits.h >new file mode 100644 >index 0000000000000000000000000000000000000000..029e47e175c9dfae3524b7cb153931026de5bf97 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hash_policy_traits.h >@@ -0,0 +1,189 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. 
>+ >+#ifndef ABSL_CONTAINER_INTERNAL_HASH_POLICY_TRAITS_H_ >+#define ABSL_CONTAINER_INTERNAL_HASH_POLICY_TRAITS_H_ >+ >+#include <cstddef> >+#include <memory> >+#include <type_traits> >+#include <utility> >+ >+#include "absl/meta/type_traits.h" >+ >+namespace absl { >+namespace container_internal { >+ >+// Defines how slots are initialized/destroyed/moved. >+template <class Policy, class = void> >+struct hash_policy_traits { >+ private: >+ struct ReturnKey { >+ // We return `Key` here. >+ // When Key=T&, we forward the lvalue reference. >+ // When Key=T, we return by value to avoid a dangling reference. >+ // eg, for string_hash_map. >+ template <class Key, class... Args> >+ Key operator()(Key&& k, const Args&...) const { >+ return std::forward<Key>(k); >+ } >+ }; >+ >+ template <class P = Policy, class = void> >+ struct ConstantIteratorsImpl : std::false_type {}; >+ >+ template <class P> >+ struct ConstantIteratorsImpl<P, absl::void_t<typename P::constant_iterators>> >+ : P::constant_iterators {}; >+ >+ public: >+ // The actual object stored in the hash table. >+ using slot_type = typename Policy::slot_type; >+ >+ // The type of the keys stored in the hashtable. >+ using key_type = typename Policy::key_type; >+ >+ // The argument type for insertions into the hashtable. This is different >+ // from value_type for increased performance. See initializer_list constructor >+ // and insert() member functions for more details. >+ using init_type = typename Policy::init_type; >+ >+ using reference = decltype(Policy::element(std::declval<slot_type*>())); >+ using pointer = typename std::remove_reference<reference>::type*; >+ using value_type = typename std::remove_reference<reference>::type; >+ >+ // Policies can set this variable to tell raw_hash_set that all iterators >+ // should be constant, even `iterator`. This is useful for set-like >+ // containers. >+ // Defaults to false if not provided by the policy. 
>+ using constant_iterators = ConstantIteratorsImpl<>; >+ >+ // PRECONDITION: `slot` is UNINITIALIZED >+ // POSTCONDITION: `slot` is INITIALIZED >+ template <class Alloc, class... Args> >+ static void construct(Alloc* alloc, slot_type* slot, Args&&... args) { >+ Policy::construct(alloc, slot, std::forward<Args>(args)...); >+ } >+ >+ // PRECONDITION: `slot` is INITIALIZED >+ // POSTCONDITION: `slot` is UNINITIALIZED >+ template <class Alloc> >+ static void destroy(Alloc* alloc, slot_type* slot) { >+ Policy::destroy(alloc, slot); >+ } >+ >+ // Transfers the `old_slot` to `new_slot`. Any memory allocated by the >+ // allocator inside `old_slot` to `new_slot` can be transfered. >+ // >+ // OPTIONAL: defaults to: >+ // >+ // clone(new_slot, std::move(*old_slot)); >+ // destroy(old_slot); >+ // >+ // PRECONDITION: `new_slot` is UNINITIALIZED and `old_slot` is INITIALIZED >+ // POSTCONDITION: `new_slot` is INITIALIZED and `old_slot` is >+ // UNINITIALIZED >+ template <class Alloc> >+ static void transfer(Alloc* alloc, slot_type* new_slot, slot_type* old_slot) { >+ transfer_impl(alloc, new_slot, old_slot, 0); >+ } >+ >+ // PRECONDITION: `slot` is INITIALIZED >+ // POSTCONDITION: `slot` is INITIALIZED >+ template <class P = Policy> >+ static auto element(slot_type* slot) -> decltype(P::element(slot)) { >+ return P::element(slot); >+ } >+ >+ // Returns the amount of memory owned by `slot`, exclusive of `sizeof(*slot)`. >+ // >+ // If `slot` is nullptr, returns the constant amount of memory owned by any >+ // full slot or -1 if slots own variable amounts of memory. >+ // >+ // PRECONDITION: `slot` is INITIALIZED or nullptr >+ template <class P = Policy> >+ static size_t space_used(const slot_type* slot) { >+ return P::space_used(slot); >+ } >+ >+ // Provides generalized access to the key for elements, both for elements in >+ // the table and for elements that have not yet been inserted (or even >+ // constructed). 
We would like an API that allows us to say: `key(args...)` >+ // but we cannot do that for all cases, so we use this more general API that >+ // can be used for many things, including the following: >+ // >+ // - Given an element in a table, get its key. >+ // - Given an element initializer, get its key. >+ // - Given `emplace()` arguments, get the element key. >+ // >+ // Implementations of this must adhere to a very strict technical >+ // specification around aliasing and consuming arguments: >+ // >+ // Let `value_type` be the result type of `element()` without ref- and >+ // cv-qualifiers. The first argument is a functor, the rest are constructor >+ // arguments for `value_type`. Returns `std::forward<F>(f)(k, xs...)`, where >+ // `k` is the element key, and `xs...` are the new constructor arguments for >+ // `value_type`. It's allowed for `k` to alias `xs...`, and for both to alias >+ // `ts...`. The key won't be touched once `xs...` are used to construct an >+ // element; `ts...` won't be touched at all, which allows `apply()` to consume >+ // any rvalues among them. >+ // >+ // If `value_type` is constructible from `Ts&&...`, `Policy::apply()` must not >+ // trigger a hard compile error unless it originates from `f`. In other words, >+ // `Policy::apply()` must be SFINAE-friendly. If `value_type` is not >+ // constructible from `Ts&&...`, either SFINAE or a hard compile error is OK. >+ // >+ // If `Ts...` is `[cv] value_type[&]` or `[cv] init_type[&]`, >+ // `Policy::apply()` must work. A compile error is not allowed, SFINAE or not. >+ template <class F, class... Ts, class P = Policy> >+ static auto apply(F&& f, Ts&&... ts) >+ -> decltype(P::apply(std::forward<F>(f), std::forward<Ts>(ts)...)) { >+ return P::apply(std::forward<F>(f), std::forward<Ts>(ts)...); >+ } >+ >+ // Returns the "key" portion of the slot. >+ // Used for node handle manipulation. 
>+ template <class P = Policy> >+ static auto key(slot_type* slot) >+ -> decltype(P::apply(ReturnKey(), element(slot))) { >+ return P::apply(ReturnKey(), element(slot)); >+ } >+ >+ // Returns the "value" (as opposed to the "key") portion of the element. Used >+ // by maps to implement `operator[]`, `at()` and `insert_or_assign()`. >+ template <class T, class P = Policy> >+ static auto value(T* elem) -> decltype(P::value(elem)) { >+ return P::value(elem); >+ } >+ >+ private: >+ // Use auto -> decltype as an enabler. >+ template <class Alloc, class P = Policy> >+ static auto transfer_impl(Alloc* alloc, slot_type* new_slot, >+ slot_type* old_slot, int) >+ -> decltype((void)P::transfer(alloc, new_slot, old_slot)) { >+ P::transfer(alloc, new_slot, old_slot); >+ } >+ template <class Alloc> >+ static void transfer_impl(Alloc* alloc, slot_type* new_slot, >+ slot_type* old_slot, char) { >+ construct(alloc, new_slot, std::move(element(old_slot))); >+ destroy(alloc, old_slot); >+ } >+}; >+ >+} // namespace container_internal >+} // namespace absl >+ >+#endif // ABSL_CONTAINER_INTERNAL_HASH_POLICY_TRAITS_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hash_policy_traits_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hash_policy_traits_test.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..423f1548e7ca138ee5aca09242afd3bc4f972b26 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hash_policy_traits_test.cc >@@ -0,0 +1,142 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. 
>+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. >+ >+#include "absl/container/internal/hash_policy_traits.h" >+ >+#include <functional> >+#include <memory> >+#include <new> >+ >+#include "gmock/gmock.h" >+#include "gtest/gtest.h" >+ >+namespace absl { >+namespace container_internal { >+namespace { >+ >+using ::testing::MockFunction; >+using ::testing::Return; >+using ::testing::ReturnRef; >+ >+using Alloc = std::allocator<int>; >+using Slot = int; >+ >+struct PolicyWithoutOptionalOps { >+ using slot_type = Slot; >+ using key_type = Slot; >+ using init_type = Slot; >+ >+ static std::function<void(void*, Slot*, Slot)> construct; >+ static std::function<void(void*, Slot*)> destroy; >+ >+ static std::function<Slot&(Slot*)> element; >+ static int apply(int v) { return apply_impl(v); } >+ static std::function<int(int)> apply_impl; >+ static std::function<Slot&(Slot*)> value; >+}; >+ >+std::function<void(void*, Slot*, Slot)> PolicyWithoutOptionalOps::construct; >+std::function<void(void*, Slot*)> PolicyWithoutOptionalOps::destroy; >+ >+std::function<Slot&(Slot*)> PolicyWithoutOptionalOps::element; >+std::function<int(int)> PolicyWithoutOptionalOps::apply_impl; >+std::function<Slot&(Slot*)> PolicyWithoutOptionalOps::value; >+ >+struct PolicyWithOptionalOps : PolicyWithoutOptionalOps { >+ static std::function<void(void*, Slot*, Slot*)> transfer; >+}; >+ >+std::function<void(void*, Slot*, Slot*)> PolicyWithOptionalOps::transfer; >+ >+struct Test : ::testing::Test { >+ Test() { >+ PolicyWithoutOptionalOps::construct = [&](void* a1, Slot* a2, Slot a3) { >+ construct.Call(a1, a2, std::move(a3)); >+ 
}; >+ PolicyWithoutOptionalOps::destroy = [&](void* a1, Slot* a2) { >+ destroy.Call(a1, a2); >+ }; >+ >+ PolicyWithoutOptionalOps::element = [&](Slot* a1) -> Slot& { >+ return element.Call(a1); >+ }; >+ PolicyWithoutOptionalOps::apply_impl = [&](int a1) -> int { >+ return apply.Call(a1); >+ }; >+ PolicyWithoutOptionalOps::value = [&](Slot* a1) -> Slot& { >+ return value.Call(a1); >+ }; >+ >+ PolicyWithOptionalOps::transfer = [&](void* a1, Slot* a2, Slot* a3) { >+ return transfer.Call(a1, a2, a3); >+ }; >+ } >+ >+ std::allocator<int> alloc; >+ int a = 53; >+ >+ MockFunction<void(void*, Slot*, Slot)> construct; >+ MockFunction<void(void*, Slot*)> destroy; >+ >+ MockFunction<Slot&(Slot*)> element; >+ MockFunction<int(int)> apply; >+ MockFunction<Slot&(Slot*)> value; >+ >+ MockFunction<void(void*, Slot*, Slot*)> transfer; >+}; >+ >+TEST_F(Test, construct) { >+ EXPECT_CALL(construct, Call(&alloc, &a, 53)); >+ hash_policy_traits<PolicyWithoutOptionalOps>::construct(&alloc, &a, 53); >+} >+ >+TEST_F(Test, destroy) { >+ EXPECT_CALL(destroy, Call(&alloc, &a)); >+ hash_policy_traits<PolicyWithoutOptionalOps>::destroy(&alloc, &a); >+} >+ >+TEST_F(Test, element) { >+ int b = 0; >+ EXPECT_CALL(element, Call(&a)).WillOnce(ReturnRef(b)); >+ EXPECT_EQ(&b, &hash_policy_traits<PolicyWithoutOptionalOps>::element(&a)); >+} >+ >+TEST_F(Test, apply) { >+ EXPECT_CALL(apply, Call(42)).WillOnce(Return(1337)); >+ EXPECT_EQ(1337, (hash_policy_traits<PolicyWithoutOptionalOps>::apply(42))); >+} >+ >+TEST_F(Test, value) { >+ int b = 0; >+ EXPECT_CALL(value, Call(&a)).WillOnce(ReturnRef(b)); >+ EXPECT_EQ(&b, &hash_policy_traits<PolicyWithoutOptionalOps>::value(&a)); >+} >+ >+TEST_F(Test, without_transfer) { >+ int b = 42; >+ EXPECT_CALL(element, Call(&b)).WillOnce(::testing::ReturnRef(b)); >+ EXPECT_CALL(construct, Call(&alloc, &a, b)); >+ EXPECT_CALL(destroy, Call(&alloc, &b)); >+ hash_policy_traits<PolicyWithoutOptionalOps>::transfer(&alloc, &a, &b); >+} >+ >+TEST_F(Test, with_transfer) { >+ 
int b = 42;
>+  EXPECT_CALL(transfer, Call(&alloc, &a, &b));
>+  hash_policy_traits<PolicyWithOptionalOps>::transfer(&alloc, &a, &b);
>+}
>+
>+}  // namespace
>+}  // namespace container_internal
>+}  // namespace absl
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hashtable_debug.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hashtable_debug.h
>new file mode 100644
>index 0000000000000000000000000000000000000000..c3bd65c9c4ec52d5f4a796f76b18dbcda7292b34
>--- /dev/null
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hashtable_debug.h
>@@ -0,0 +1,108 @@
>+// Copyright 2018 The Abseil Authors.
>+//
>+// Licensed under the Apache License, Version 2.0 (the "License");
>+// you may not use this file except in compliance with the License.
>+// You may obtain a copy of the License at
>+//
>+//      http://www.apache.org/licenses/LICENSE-2.0
>+//
>+// Unless required by applicable law or agreed to in writing, software
>+// distributed under the License is distributed on an "AS IS" BASIS,
>+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
>+// See the License for the specific language governing permissions and
>+// limitations under the License.
>+//
>+// This library provides APIs to debug the probing behavior of hash tables.
>+//
>+// In general, the probing behavior is a black box for users and only the
>+// side effects can be measured in the form of performance differences.
>+// These APIs give a glimpse of the actual behavior of the probing algorithms in
>+// these hashtables given a specified hash function and a set of elements.
>+//
>+// The probe count distribution can be used to assess the quality of the hash
>+// function for that particular hash table. Note that a hash function that
>+// performs well in one hash table implementation does not necessarily perform
>+// well in a different one.
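As a concrete illustration of what these APIs measure, here is a standalone sketch (not part of the patch) of the default probe-counting strategy that the hooks header uses for `std::unordered_*` containers: walk the bucket the key hashes to and count the elements inspected before the match.

```cpp
#include <cassert>
#include <cstddef>
#include <unordered_set>

// Standalone sketch: count how many elements in `key`'s bucket are
// inspected before `key` itself is found (0 means no collisions).
template <class Set>
std::size_t NumProbes(const Set& s, const typename Set::key_type& key) {
  if (!s.bucket_count()) return 0;
  std::size_t probes = 0;
  const std::size_t bucket = s.bucket(key);
  for (auto it = s.begin(bucket), e = s.end(bucket); it != e; ++it, ++probes) {
    if (s.key_eq()(key, *it)) return probes;
  }
  return probes;  // key absent: every element in the bucket was probed
}
```

Summing these counts per element, as the histogram function below does, gives the probe count distribution discussed above.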
>+//
>+// This library supports std::unordered_{set,map}, dense_hash_{set,map} and
>+// absl::{flat,node,string}_hash_{set,map}.
>+
>+#ifndef ABSL_CONTAINER_INTERNAL_HASHTABLE_DEBUG_H_
>+#define ABSL_CONTAINER_INTERNAL_HASHTABLE_DEBUG_H_
>+
>+#include <cstddef>
>+#include <algorithm>
>+#include <type_traits>
>+#include <vector>
>+
>+#include "absl/container/internal/hashtable_debug_hooks.h"
>+
>+namespace absl {
>+namespace container_internal {
>+
>+// Returns the number of probes required to look up `key`. Returns 0 for a
>+// search with no collisions. Higher values mean more hash collisions occurred;
>+// however, the exact meaning of this number varies according to the container
>+// type.
>+template <typename C>
>+size_t GetHashtableDebugNumProbes(
>+    const C& c, const typename C::key_type& key) {
>+  return absl::container_internal::hashtable_debug_internal::
>+      HashtableDebugAccess<C>::GetNumProbes(c, key);
>+}
>+
>+// Gets a histogram of the number of probes for each element in the container.
>+// The sum of all the values in the vector is equal to container.size().
>+template <typename C>
>+std::vector<size_t> GetHashtableDebugNumProbesHistogram(const C& container) {
>+  std::vector<size_t> v;
>+  for (auto it = container.begin(); it != container.end(); ++it) {
>+    size_t num_probes = GetHashtableDebugNumProbes(
>+        container,
>+        absl::container_internal::hashtable_debug_internal::GetKey<C>(*it, 0));
>+    v.resize(std::max(v.size(), num_probes + 1));
>+    v[num_probes]++;
>+  }
>+  return v;
>+}
>+
>+struct HashtableDebugProbeSummary {
>+  size_t total_elements;
>+  size_t total_num_probes;
>+  double mean;
>+};
>+
>+// Gets a summary of the probe count distribution for the elements in the
>+// container.
>+template <typename C> >+HashtableDebugProbeSummary GetHashtableDebugProbeSummary(const C& container) { >+ auto probes = GetHashtableDebugNumProbesHistogram(container); >+ HashtableDebugProbeSummary summary = {}; >+ for (size_t i = 0; i < probes.size(); ++i) { >+ summary.total_elements += probes[i]; >+ summary.total_num_probes += probes[i] * i; >+ } >+ summary.mean = 1.0 * summary.total_num_probes / summary.total_elements; >+ return summary; >+} >+ >+// Returns the number of bytes requested from the allocator by the container >+// and not freed. >+template <typename C> >+size_t AllocatedByteSize(const C& c) { >+ return absl::container_internal::hashtable_debug_internal:: >+ HashtableDebugAccess<C>::AllocatedByteSize(c); >+} >+ >+// Returns a tight lower bound for AllocatedByteSize(c) where `c` is of type `C` >+// and `c.size()` is equal to `num_elements`. >+template <typename C> >+size_t LowerBoundAllocatedByteSize(size_t num_elements) { >+ return absl::container_internal::hashtable_debug_internal:: >+ HashtableDebugAccess<C>::LowerBoundAllocatedByteSize(num_elements); >+} >+ >+} // namespace container_internal >+} // namespace absl >+ >+#endif // ABSL_CONTAINER_INTERNAL_HASHTABLE_DEBUG_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hashtable_debug_hooks.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hashtable_debug_hooks.h >new file mode 100644 >index 0000000000000000000000000000000000000000..8f219726bee01fe97a8f10e3306494868094e92b >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/hashtable_debug_hooks.h >@@ -0,0 +1,81 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. 
>+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. >+// >+// Provides the internal API for hashtable_debug.h. >+ >+#ifndef ABSL_CONTAINER_INTERNAL_HASHTABLE_DEBUG_HOOKS_H_ >+#define ABSL_CONTAINER_INTERNAL_HASHTABLE_DEBUG_HOOKS_H_ >+ >+#include <cstddef> >+ >+#include <algorithm> >+#include <type_traits> >+#include <vector> >+ >+namespace absl { >+namespace container_internal { >+namespace hashtable_debug_internal { >+ >+// If it is a map, call get<0>(). >+using std::get; >+template <typename T, typename = typename T::mapped_type> >+auto GetKey(const typename T::value_type& pair, int) -> decltype(get<0>(pair)) { >+ return get<0>(pair); >+} >+ >+// If it is not a map, return the value directly. >+template <typename T> >+const typename T::key_type& GetKey(const typename T::key_type& key, char) { >+ return key; >+} >+ >+// Containers should specialize this to provide debug information for that >+// container. >+template <class Container, typename Enabler = void> >+struct HashtableDebugAccess { >+ // Returns the number of probes required to find `key` in `c`. The "number of >+ // probes" is a concept that can vary by container. Implementations should >+ // return 0 when `key` was found in the minimum number of operations and >+ // should increment the result for each non-trivial operation required to find >+ // `key`. >+ // >+ // The default implementation uses the bucket api from the standard and thus >+ // works for `std::unordered_*` containers. 
>+ static size_t GetNumProbes(const Container& c, >+ const typename Container::key_type& key) { >+ if (!c.bucket_count()) return {}; >+ size_t num_probes = 0; >+ size_t bucket = c.bucket(key); >+ for (auto it = c.begin(bucket), e = c.end(bucket);; ++it, ++num_probes) { >+ if (it == e) return num_probes; >+ if (c.key_eq()(key, GetKey<Container>(*it, 0))) return num_probes; >+ } >+ } >+ >+ // Returns the number of bytes requested from the allocator by the container >+ // and not freed. >+ // >+ // static size_t AllocatedByteSize(const Container& c); >+ >+ // Returns a tight lower bound for AllocatedByteSize(c) where `c` is of type >+ // `Container` and `c.size()` is equal to `num_elements`. >+ // >+ // static size_t LowerBoundAllocatedByteSize(size_t num_elements); >+}; >+ >+} // namespace hashtable_debug_internal >+} // namespace container_internal >+} // namespace absl >+ >+#endif // ABSL_CONTAINER_INTERNAL_HASHTABLE_DEBUG_HOOKS_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/layout.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/layout.h >new file mode 100644 >index 0000000000000000000000000000000000000000..676c7d67ee68dcd566e17094d4958e3747547d99 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/layout.h >@@ -0,0 +1,732 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
>+// See the License for the specific language governing permissions and >+// limitations under the License. >+// >+// MOTIVATION AND TUTORIAL >+// >+// If you want to put in a single heap allocation N doubles followed by M ints, >+// it's easy if N and M are known at compile time. >+// >+// struct S { >+// double a[N]; >+// int b[M]; >+// }; >+// >+// S* p = new S; >+// >+// But what if N and M are known only in run time? Class template Layout to the >+// rescue! It's a portable generalization of the technique known as struct hack. >+// >+// // This object will tell us everything we need to know about the memory >+// // layout of double[N] followed by int[M]. It's structurally identical to >+// // size_t[2] that stores N and M. It's very cheap to create. >+// const Layout<double, int> layout(N, M); >+// >+// // Allocate enough memory for both arrays. `AllocSize()` tells us how much >+// // memory is needed. We are free to use any allocation function we want as >+// // long as it returns aligned memory. >+// std::unique_ptr<unsigned char[]> p(new unsigned char[layout.AllocSize()]); >+// >+// // Obtain the pointer to the array of doubles. >+// // Equivalent to `reinterpret_cast<double*>(p.get())`. >+// // >+// // We could have written layout.Pointer<0>(p) instead. If all the types are >+// // unique you can use either form, but if some types are repeated you must >+// // use the index form. >+// double* a = layout.Pointer<double>(p.get()); >+// >+// // Obtain the pointer to the array of ints. >+// // Equivalent to `reinterpret_cast<int*>(p.get() + N * 8)`. >+// int* b = layout.Pointer<int>(p); >+// >+// If we are unable to specify sizes of all fields, we can pass as many sizes as >+// we can to `Partial()`. In return, it'll allow us to access the fields whose >+// locations and sizes can be computed from the provided information. >+// `Partial()` comes in handy when the array sizes are embedded into the >+// allocation. 
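Before the `Partial()` example below, it may help to see the size-embedding idiom written by hand, without Layout (a sketch with made-up helper names, not Abseil APIs): one `size_t` holding `n`, followed by `char[n]`. Layout automates exactly this offset bookkeeping, plus the padding needed for element types with stricter alignment.

```cpp
#include <cassert>
#include <cstddef>
#include <cstring>
#include <memory>

// Allocate size_t[1] followed by char[n] in one block; store n up front.
// memcpy is used for the header to stay clear of aliasing rules.
std::unique_ptr<unsigned char[]> Allocate(std::size_t n) {
  auto p = std::make_unique<unsigned char[]>(sizeof(std::size_t) + n);
  std::memcpy(p.get(), &n, sizeof(std::size_t));  // like layout.Pointer<0>(p)
  return p;
}

// Read back the embedded element count.
std::size_t StoredSize(const unsigned char* p) {
  std::size_t n;
  std::memcpy(&n, p, sizeof(std::size_t));
  return n;
}

// The payload starts right after the header; char needs no padding.
unsigned char* Payload(unsigned char* p) {
  return p + sizeof(std::size_t);  // like layout.Pointer<1>(p)
}
```

Every manual `sizeof`/offset computation here is what `Layout::Partial()` derives for you from the type list and the known counts.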
>+// >+// // size_t[1] containing N, size_t[1] containing M, double[N], int[M]. >+// using L = Layout<size_t, size_t, double, int>; >+// >+// unsigned char* Allocate(size_t n, size_t m) { >+// const L layout(1, 1, n, m); >+// unsigned char* p = new unsigned char[layout.AllocSize()]; >+// *layout.Pointer<0>(p) = n; >+// *layout.Pointer<1>(p) = m; >+// return p; >+// } >+// >+// void Use(unsigned char* p) { >+// // First, extract N and M. >+// // Specify that the first array has only one element. Using `prefix` we >+// // can access the first two arrays but not more. >+// constexpr auto prefix = L::Partial(1); >+// size_t n = *prefix.Pointer<0>(p); >+// size_t m = *prefix.Pointer<1>(p); >+// >+// // Now we can get pointers to the payload. >+// const L layout(1, 1, n, m); >+// double* a = layout.Pointer<double>(p); >+// int* b = layout.Pointer<int>(p); >+// } >+// >+// The layout we used above combines fixed-size with dynamically-sized fields. >+// This is quite common. Layout is optimized for this use case and generates >+// optimal code. All computations that can be performed at compile time are >+// indeed performed at compile time. >+// >+// Efficiency tip: The order of fields matters. In `Layout<T1, ..., TN>` try to >+// ensure that `alignof(T1) >= ... >= alignof(TN)`. This way you'll have no >+// padding in between arrays. >+// >+// You can manually override the alignment of an array by wrapping the type in >+// `Aligned<T, N>`. `Layout<..., Aligned<T, N>, ...>` has exactly the same API >+// and behavior as `Layout<..., T, ...>` except that the first element of the >+// array of `T` is aligned to `N` (the rest of the elements follow without >+// padding). `N` cannot be less than `alignof(T)`. >+// >+// `AllocSize()` and `Pointer()` are the most basic methods for dealing with >+// memory layouts. Check out the reference or code below to discover more. >+// >+// EXAMPLE >+// >+// // Immutable move-only string with sizeof equal to sizeof(void*). 
The >+// // string size and the characters are kept in the same heap allocation. >+// class CompactString { >+// public: >+// CompactString(const char* s = "") { >+// const size_t size = strlen(s); >+// // size_t[1] followed by char[size + 1]. >+// const L layout(1, size + 1); >+// p_.reset(new unsigned char[layout.AllocSize()]); >+// // If running under ASAN, mark the padding bytes, if any, to catch >+// // memory errors. >+// layout.PoisonPadding(p_.get()); >+// // Store the size in the allocation. >+// *layout.Pointer<size_t>(p_.get()) = size; >+// // Store the characters in the allocation. >+// memcpy(layout.Pointer<char>(p_.get()), s, size + 1); >+// } >+// >+// size_t size() const { >+// // Equivalent to reinterpret_cast<size_t&>(*p). >+// return *L::Partial().Pointer<size_t>(p_.get()); >+// } >+// >+// const char* c_str() const { >+// // Equivalent to reinterpret_cast<char*>(p.get() + sizeof(size_t)). >+// // The argument in Partial(1) specifies that we have size_t[1] in front >+// // of the characters. >+// return L::Partial(1).Pointer<char>(p_.get()); >+// } >+// >+// private: >+// // Our heap allocation contains a size_t followed by an array of chars. >+// using L = Layout<size_t, char>; >+// std::unique_ptr<unsigned char[]> p_; >+// }; >+// >+// int main() { >+// CompactString s = "hello"; >+// assert(s.size() == 5); >+// assert(strcmp(s.c_str(), "hello") == 0); >+// } >+// >+// DOCUMENTATION >+// >+// The interface exported by this file consists of: >+// - class `Layout<>` and its public members. >+// - The public members of class `internal_layout::LayoutImpl<>`. That class >+// isn't intended to be used directly, and its name and template parameter >+// list are internal implementation details, but the class itself provides >+// most of the functionality in this file. See comments on its members for >+// detailed documentation. >+// >+// `Layout<T1,... Tn>::Partial(count1,..., countm)` (where `m` <= `n`) returns a >+// `LayoutImpl<>` object. 
`Layout<T1,..., Tn> layout(count1,..., countn)` >+// creates a `Layout` object, which exposes the same functionality by inheriting >+// from `LayoutImpl<>`. >+ >+#ifndef ABSL_CONTAINER_INTERNAL_LAYOUT_H_ >+#define ABSL_CONTAINER_INTERNAL_LAYOUT_H_ >+ >+#include <assert.h> >+#include <stddef.h> >+#include <stdint.h> >+#include <ostream> >+#include <string> >+#include <tuple> >+#include <type_traits> >+#include <typeinfo> >+#include <utility> >+ >+#ifdef ADDRESS_SANITIZER >+#include <sanitizer/asan_interface.h> >+#endif >+ >+#include "absl/meta/type_traits.h" >+#include "absl/strings/str_cat.h" >+#include "absl/types/span.h" >+#include "absl/utility/utility.h" >+ >+#if defined(__GXX_RTTI) >+#define ABSL_INTERNAL_HAS_CXA_DEMANGLE >+#endif >+ >+#ifdef ABSL_INTERNAL_HAS_CXA_DEMANGLE >+#include <cxxabi.h> >+#endif >+ >+namespace absl { >+namespace container_internal { >+ >+// A type wrapper that instructs `Layout` to use the specific alignment for the >+// array. `Layout<..., Aligned<T, N>, ...>` has exactly the same API >+// and behavior as `Layout<..., T, ...>` except that the first element of the >+// array of `T` is aligned to `N` (the rest of the elements follow without >+// padding). >+// >+// Requires: `N >= alignof(T)` and `N` is a power of 2. 
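The power-of-two requirement exists because the layout machinery rounds offsets up with bit masking. A standalone restatement of the two helpers involved (same bodies as in the `adl_barrier` namespace below):

```cpp
#include <cstddef>

// IsPow2: n & (n - 1) clears the lowest set bit, so the result is zero
// exactly when n has at most one bit set (note 0 also passes this test).
// Align: rounds n up to the next multiple of the power-of-two m by adding
// m - 1 and masking off the low bits.
constexpr bool IsPow2(std::size_t n) { return !(n & (n - 1)); }
constexpr std::size_t Align(std::size_t n, std::size_t m) {
  return (n + m - 1) & ~(m - 1);
}

static_assert(IsPow2(8) && !IsPow2(12), "power-of-two check");
static_assert(Align(13, 8) == 16, "rounds up");
static_assert(Align(16, 8) == 16, "already aligned is unchanged");
```

If `m` were not a power of two, the mask trick would be wrong, which is why `IsLegalElementType` enforces it for every alignment used.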
>+template <class T, size_t N> >+struct Aligned; >+ >+namespace internal_layout { >+ >+template <class T> >+struct NotAligned {}; >+ >+template <class T, size_t N> >+struct NotAligned<const Aligned<T, N>> { >+ static_assert(sizeof(T) == 0, "Aligned<T, N> cannot be const-qualified"); >+}; >+ >+template <size_t> >+using IntToSize = size_t; >+ >+template <class> >+using TypeToSize = size_t; >+ >+template <class T> >+struct Type : NotAligned<T> { >+ using type = T; >+}; >+ >+template <class T, size_t N> >+struct Type<Aligned<T, N>> { >+ using type = T; >+}; >+ >+template <class T> >+struct SizeOf : NotAligned<T>, std::integral_constant<size_t, sizeof(T)> {}; >+ >+template <class T, size_t N> >+struct SizeOf<Aligned<T, N>> : std::integral_constant<size_t, sizeof(T)> {}; >+ >+template <class T> >+struct AlignOf : NotAligned<T>, std::integral_constant<size_t, alignof(T)> {}; >+ >+template <class T, size_t N> >+struct AlignOf<Aligned<T, N>> : std::integral_constant<size_t, N> { >+ static_assert(N % alignof(T) == 0, >+ "Custom alignment can't be lower than the type's alignment"); >+}; >+ >+// Does `Ts...` contain `T`? >+template <class T, class... Ts> >+using Contains = absl::disjunction<std::is_same<T, Ts>...>; >+ >+template <class From, class To> >+using CopyConst = >+ typename std::conditional<std::is_const<From>::value, const To, To>::type; >+ >+template <class T> >+using SliceType = absl::Span<T>; >+ >+// This namespace contains no types. It prevents functions defined in it from >+// being found by ADL. >+namespace adl_barrier { >+ >+template <class Needle, class... Ts> >+constexpr size_t Find(Needle, Needle, Ts...) { >+ static_assert(!Contains<Needle, Ts...>(), "Duplicate element type"); >+ return 0; >+} >+ >+template <class Needle, class T, class... Ts> >+constexpr size_t Find(Needle, T, Ts...) { >+ return adl_barrier::Find(Needle(), Ts()...) 
+ 1;
>+}
>+
>+constexpr bool IsPow2(size_t n) { return !(n & (n - 1)); }
>+
>+// Returns `q * m` for the smallest `q` such that `q * m >= n`.
>+// Requires: `m` is a power of two. It's enforced by IsLegalElementType below.
>+constexpr size_t Align(size_t n, size_t m) { return (n + m - 1) & ~(m - 1); }
>+
>+constexpr size_t Min(size_t a, size_t b) { return b < a ? b : a; }
>+
>+constexpr size_t Max(size_t a) { return a; }
>+
>+template <class... Ts>
>+constexpr size_t Max(size_t a, size_t b, Ts... rest) {
>+  return adl_barrier::Max(b < a ? a : b, rest...);
>+}
>+
>+template <class T>
>+std::string TypeName() {
>+  std::string out;
>+  int status = 0;
>+  char* demangled = nullptr;
>+#ifdef ABSL_INTERNAL_HAS_CXA_DEMANGLE
>+  demangled = abi::__cxa_demangle(typeid(T).name(), nullptr, nullptr, &status);
>+#endif
>+  if (status == 0 && demangled != nullptr) {  // Demangling succeeded.
>+    absl::StrAppend(&out, "<", demangled, ">");
>+    free(demangled);
>+  } else {
>+#if defined(__GXX_RTTI) || defined(_CPPRTTI)
>+    absl::StrAppend(&out, "<", typeid(T).name(), ">");
>+#endif
>+  }
>+  return out;
>+}
>+
>+}  // namespace adl_barrier
>+
>+template <bool C>
>+using EnableIf = typename std::enable_if<C, int>::type;
>+
>+// Can `T` be a template argument of `Layout`?
>+template <class T>
>+using IsLegalElementType = std::integral_constant<
>+    bool, !std::is_reference<T>::value && !std::is_volatile<T>::value &&
>+              !std::is_reference<typename Type<T>::type>::value &&
>+              !std::is_volatile<typename Type<T>::type>::value &&
>+              adl_barrier::IsPow2(AlignOf<T>::value)>;
>+
>+template <class Elements, class SizeSeq, class OffsetSeq>
>+class LayoutImpl;
>+
>+// Public base class of `Layout` and the result type of `Layout::Partial()`.
>+//
>+// `Elements...` contains all template arguments of `Layout` that created this
>+// instance.
>+//
>+// `SizeSeq...` is `[0, NumSizes)` where `NumSizes` is the number of arguments
>+// passed to `Layout::Partial()` or `Layout::Layout()`.
>+// >+// `OffsetSeq...` is `[0, NumOffsets)` where `NumOffsets` is >+// `Min(sizeof...(Elements), NumSizes + 1)` (the number of arrays for which we >+// can compute offsets). >+template <class... Elements, size_t... SizeSeq, size_t... OffsetSeq> >+class LayoutImpl<std::tuple<Elements...>, absl::index_sequence<SizeSeq...>, >+ absl::index_sequence<OffsetSeq...>> { >+ private: >+ static_assert(sizeof...(Elements) > 0, "At least one field is required"); >+ static_assert(absl::conjunction<IsLegalElementType<Elements>...>::value, >+ "Invalid element type (see IsLegalElementType)"); >+ >+ enum { >+ NumTypes = sizeof...(Elements), >+ NumSizes = sizeof...(SizeSeq), >+ NumOffsets = sizeof...(OffsetSeq), >+ }; >+ >+ // These are guaranteed by `Layout`. >+ static_assert(NumOffsets == adl_barrier::Min(NumTypes, NumSizes + 1), >+ "Internal error"); >+ static_assert(NumTypes > 0, "Internal error"); >+ >+ // Returns the index of `T` in `Elements...`. Results in a compilation error >+ // if `Elements...` doesn't contain exactly one instance of `T`. >+ template <class T> >+ static constexpr size_t ElementIndex() { >+ static_assert(Contains<Type<T>, Type<typename Type<Elements>::type>...>(), >+ "Type not found"); >+ return adl_barrier::Find(Type<T>(), >+ Type<typename Type<Elements>::type>()...); >+ } >+ >+ template <size_t N> >+ using ElementAlignment = >+ AlignOf<typename std::tuple_element<N, std::tuple<Elements...>>::type>; >+ >+ public: >+ // Element types of all arrays packed in a tuple. >+ using ElementTypes = std::tuple<typename Type<Elements>::type...>; >+ >+ // Element type of the Nth array. >+ template <size_t N> >+ using ElementType = typename std::tuple_element<N, ElementTypes>::type; >+ >+ constexpr explicit LayoutImpl(IntToSize<SizeSeq>... sizes) >+ : size_{sizes...} {} >+ >+ // Alignment of the layout, equal to the strictest alignment of all elements. >+ // All pointers passed to the methods of layout must be aligned to this value. 
>+  static constexpr size_t Alignment() {
>+    return adl_barrier::Max(AlignOf<Elements>::value...);
>+  }
>+
>+  // Offset in bytes of the Nth array.
>+  //
>+  //   // int[3], 4 bytes of padding, double[4].
>+  //   Layout<int, double> x(3, 4);
>+  //   assert(x.Offset<0>() == 0);   // The ints start from 0.
>+  //   assert(x.Offset<1>() == 16);  // The doubles start from 16.
>+  //
>+  // Requires: `N <= NumSizes && N < sizeof...(Ts)`.
>+  template <size_t N, EnableIf<N == 0> = 0>
>+  constexpr size_t Offset() const {
>+    return 0;
>+  }
>+
>+  template <size_t N, EnableIf<N != 0> = 0>
>+  constexpr size_t Offset() const {
>+    static_assert(N < NumOffsets, "Index out of bounds");
>+    return adl_barrier::Align(
>+        Offset<N - 1>() + SizeOf<ElementType<N - 1>>() * size_[N - 1],
>+        ElementAlignment<N>());
>+  }
>+
>+  // Offset in bytes of the array with the specified element type. There must
>+  // be exactly one such array and its zero-based index must be at most
>+  // `NumSizes`.
>+  //
>+  //   // int[3], 4 bytes of padding, double[4].
>+  //   Layout<int, double> x(3, 4);
>+  //   assert(x.Offset<int>() == 0);      // The ints start from 0.
>+  //   assert(x.Offset<double>() == 16);  // The doubles start from 16.
>+  template <class T>
>+  constexpr size_t Offset() const {
>+    return Offset<ElementIndex<T>()>();
>+  }
>+
>+  // Offsets in bytes of all arrays for which the offsets are known.
>+  constexpr std::array<size_t, NumOffsets> Offsets() const {
>+    return {{Offset<OffsetSeq>()...}};
>+  }
>+
>+  // The number of elements in the Nth array. This is the Nth argument of
>+  // `Layout::Partial()` or `Layout::Layout()` (zero-based).
>+  //
>+  //   // int[3], 4 bytes of padding, double[4].
>+  //   Layout<int, double> x(3, 4);
>+  //   assert(x.Size<0>() == 3);
>+  //   assert(x.Size<1>() == 4);
>+  //
>+  // Requires: `N < NumSizes`.
>+ template <size_t N> >+ constexpr size_t Size() const { >+ static_assert(N < NumSizes, "Index out of bounds"); >+ return size_[N]; >+ } >+ >+ // The number of elements in the array with the specified element type. >+ // There must be exactly one such array and its zero-based index must be >+ // at most `NumSizes`. >+ // >+ // // int[3], 4 bytes of padding, double[4]. >+ // Layout<int, double> x(3, 4); >+ // assert(x.Size<int>() == 3); >+ // assert(x.Size<double>() == 4); >+ template <class T> >+ constexpr size_t Size() const { >+ return Size<ElementIndex<T>()>(); >+ } >+ >+ // The number of elements of all arrays for which they are known. >+ constexpr std::array<size_t, NumSizes> Sizes() const { >+ return {{Size<SizeSeq>()...}}; >+ } >+ >+ // Pointer to the beginning of the Nth array. >+ // >+ // `Char` must be `[const] [signed|unsigned] char`. >+ // >+ // // int[3], 4 bytes of padding, double[4]. >+ // Layout<int, double> x(3, 4); >+ // unsigned char* p = new unsigned char[x.AllocSize()]; >+ // int* ints = x.Pointer<0>(p); >+ // double* doubles = x.Pointer<1>(p); >+ // >+ // Requires: `N <= NumSizes && N < sizeof...(Ts)`. >+ // Requires: `p` is aligned to `Alignment()`. >+ template <size_t N, class Char> >+ CopyConst<Char, ElementType<N>>* Pointer(Char* p) const { >+ using C = typename std::remove_const<Char>::type; >+ static_assert( >+ std::is_same<C, char>() || std::is_same<C, unsigned char>() || >+ std::is_same<C, signed char>(), >+ "The argument must be a pointer to [const] [signed|unsigned] char"); >+ constexpr size_t alignment = Alignment(); >+ (void)alignment; >+ assert(reinterpret_cast<uintptr_t>(p) % alignment == 0); >+ return reinterpret_cast<CopyConst<Char, ElementType<N>>*>(p + Offset<N>()); >+ } >+ >+ // Pointer to the beginning of the array with the specified element type. >+ // There must be exactly one such array and its zero-based index must be at >+ // most `NumSizes`. >+ // >+ // `Char` must be `[const] [signed|unsigned] char`. 
>+ // >+ // // int[3], 4 bytes of padding, double[4]. >+ // Layout<int, double> x(3, 4); >+ // unsigned char* p = new unsigned char[x.AllocSize()]; >+ // int* ints = x.Pointer<int>(p); >+ // double* doubles = x.Pointer<double>(p); >+ // >+ // Requires: `p` is aligned to `Alignment()`. >+ template <class T, class Char> >+ CopyConst<Char, T>* Pointer(Char* p) const { >+ return Pointer<ElementIndex<T>()>(p); >+ } >+ >+ // Pointers to all arrays for which pointers are known. >+ // >+ // `Char` must be `[const] [signed|unsigned] char`. >+ // >+ // // int[3], 4 bytes of padding, double[4]. >+ // Layout<int, double> x(3, 4); >+ // unsigned char* p = new unsigned char[x.AllocSize()]; >+ // >+ // int* ints; >+ // double* doubles; >+ // std::tie(ints, doubles) = x.Pointers(p); >+ // >+ // Requires: `p` is aligned to `Alignment()`. >+ // >+ // Note: We're not using ElementType alias here because it does not compile >+ // under MSVC. >+ template <class Char> >+ std::tuple<CopyConst< >+ Char, typename std::tuple_element<OffsetSeq, ElementTypes>::type>*...> >+ Pointers(Char* p) const { >+ return std::tuple<CopyConst<Char, ElementType<OffsetSeq>>*...>( >+ Pointer<OffsetSeq>(p)...); >+ } >+ >+ // The Nth array. >+ // >+ // `Char` must be `[const] [signed|unsigned] char`. >+ // >+ // // int[3], 4 bytes of padding, double[4]. >+ // Layout<int, double> x(3, 4); >+ // unsigned char* p = new unsigned char[x.AllocSize()]; >+ // Span<int> ints = x.Slice<0>(p); >+ // Span<double> doubles = x.Slice<1>(p); >+ // >+ // Requires: `N < NumSizes`. >+ // Requires: `p` is aligned to `Alignment()`. >+ template <size_t N, class Char> >+ SliceType<CopyConst<Char, ElementType<N>>> Slice(Char* p) const { >+ return SliceType<CopyConst<Char, ElementType<N>>>(Pointer<N>(p), Size<N>()); >+ } >+ >+ // The array with the specified element type. There must be exactly one >+ // such array and its zero-based index must be less than `NumSizes`. >+ // >+ // `Char` must be `[const] [signed|unsigned] char`. 
>+ // >+ // // int[3], 4 bytes of padding, double[4]. >+ // Layout<int, double> x(3, 4); >+ // unsigned char* p = new unsigned char[x.AllocSize()]; >+ // Span<int> ints = x.Slice<int>(p); >+ // Span<double> doubles = x.Slice<double>(p); >+ // >+ // Requires: `p` is aligned to `Alignment()`. >+ template <class T, class Char> >+ SliceType<CopyConst<Char, T>> Slice(Char* p) const { >+ return Slice<ElementIndex<T>()>(p); >+ } >+ >+ // All arrays with known sizes. >+ // >+ // `Char` must be `[const] [signed|unsigned] char`. >+ // >+ // // int[3], 4 bytes of padding, double[4]. >+ // Layout<int, double> x(3, 4); >+ // unsigned char* p = new unsigned char[x.AllocSize()]; >+ // >+ // Span<int> ints; >+ // Span<double> doubles; >+ // std::tie(ints, doubles) = x.Slices(p); >+ // >+ // Requires: `p` is aligned to `Alignment()`. >+ // >+ // Note: We're not using ElementType alias here because it does not compile >+ // under MSVC. >+ template <class Char> >+ std::tuple<SliceType<CopyConst< >+ Char, typename std::tuple_element<SizeSeq, ElementTypes>::type>>...> >+ Slices(Char* p) const { >+ // Workaround for https://gcc.gnu.org/bugzilla/show_bug.cgi?id=63875 (fixed >+ // in 6.1). >+ (void)p; >+ return std::tuple<SliceType<CopyConst<Char, ElementType<SizeSeq>>>...>( >+ Slice<SizeSeq>(p)...); >+ } >+ >+ // The size of the allocation that fits all arrays. >+ // >+ // // int[3], 4 bytes of padding, double[4]. >+ // Layout<int, double> x(3, 4); >+ // unsigned char* p = new unsigned char[x.AllocSize()]; // 48 bytes >+ // >+ // Requires: `NumSizes == sizeof...(Ts)`. >+ constexpr size_t AllocSize() const { >+ static_assert(NumTypes == NumSizes, "You must specify sizes of all fields"); >+ return Offset<NumTypes - 1>() + >+ SizeOf<ElementType<NumTypes - 1>>() * size_[NumTypes - 1]; >+ } >+ >+ // If built with --config=asan, poisons padding bytes (if any) in the >+ // allocation. The pointer must point to a memory block at least >+ // `AllocSize()` bytes in length. 
>+ // >+ // `Char` must be `[const] [signed|unsigned] char`. >+ // >+ // Requires: `p` is aligned to `Alignment()`. >+ template <class Char, size_t N = NumOffsets - 1, EnableIf<N == 0> = 0> >+ void PoisonPadding(const Char* p) const { >+ Pointer<0>(p); // verify the requirements on `Char` and `p` >+ } >+ >+ template <class Char, size_t N = NumOffsets - 1, EnableIf<N != 0> = 0> >+ void PoisonPadding(const Char* p) const { >+ static_assert(N < NumOffsets, "Index out of bounds"); >+ (void)p; >+#ifdef ADDRESS_SANITIZER >+ PoisonPadding<Char, N - 1>(p); >+ // The `if` is an optimization. It doesn't affect the observable behaviour. >+ if (ElementAlignment<N - 1>() % ElementAlignment<N>()) { >+ size_t start = >+ Offset<N - 1>() + SizeOf<ElementType<N - 1>>() * size_[N - 1]; >+ ASAN_POISON_MEMORY_REGION(p + start, Offset<N>() - start); >+ } >+#endif >+ } >+ >+ // Human-readable description of the memory layout. Useful for debugging. >+ // Slow. >+ // >+ // // char[5], 3 bytes of padding, int[3], 4 bytes of padding, followed >+ // // by an unknown number of doubles. >+ // auto x = Layout<char, int, double>::Partial(5, 3); >+ // assert(x.DebugString() == >+ // "@0<char>(1)[5]; @8<int>(4)[3]; @24<double>(8)"); >+ // >+ // Each field is in the following format: @offset<type>(sizeof)[size] (<type> >+ // may be missing depending on the target platform). For example, >+ // @8<int>(4)[3] means that at offset 8 we have an array of ints, where each >+ // int is 4 bytes, and we have 3 of those ints. The size of the last field may >+ // be missing (as in the example above). Only fields with known offsets are >+ // described. Type names may differ across platforms: one compiler might >+ // produce "unsigned*" where another produces "unsigned int *". 
>+ std::string DebugString() const { >+ const auto offsets = Offsets(); >+ const size_t sizes[] = {SizeOf<ElementType<OffsetSeq>>()...}; >+ const std::string types[] = {adl_barrier::TypeName<ElementType<OffsetSeq>>()...}; >+ std::string res = absl::StrCat("@0", types[0], "(", sizes[0], ")"); >+ for (size_t i = 0; i != NumOffsets - 1; ++i) { >+ absl::StrAppend(&res, "[", size_[i], "]; @", offsets[i + 1], types[i + 1], >+ "(", sizes[i + 1], ")"); >+ } >+ // NumSizes is a constant that may be zero. Some compilers cannot see that >+ // inside the if statement "size_[NumSizes - 1]" must be valid. >+ int last = static_cast<int>(NumSizes) - 1; >+ if (NumTypes == NumSizes && last >= 0) { >+ absl::StrAppend(&res, "[", size_[last], "]"); >+ } >+ return res; >+ } >+ >+ private: >+ // Arguments of `Layout::Partial()` or `Layout::Layout()`. >+ size_t size_[NumSizes > 0 ? NumSizes : 1]; >+}; >+ >+template <size_t NumSizes, class... Ts> >+using LayoutType = LayoutImpl< >+ std::tuple<Ts...>, absl::make_index_sequence<NumSizes>, >+ absl::make_index_sequence<adl_barrier::Min(sizeof...(Ts), NumSizes + 1)>>; >+ >+} // namespace internal_layout >+ >+// Descriptor of arrays of various types and sizes laid out in memory one after >+// another. See the top of the file for documentation. >+// >+// Check out the public API of internal_layout::LayoutImpl above. The type is >+// internal to the library but its methods are public, and they are inherited >+// by `Layout`. >+template <class... Ts> >+class Layout : public internal_layout::LayoutType<sizeof...(Ts), Ts...> { >+ public: >+ static_assert(sizeof...(Ts) > 0, "At least one field is required"); >+ static_assert( >+ absl::conjunction<internal_layout::IsLegalElementType<Ts>...>::value, >+ "Invalid element type (see IsLegalElementType)"); >+ >+ // The result type of `Partial()` with `NumSizes` arguments. 
>+ template <size_t NumSizes> >+ using PartialType = internal_layout::LayoutType<NumSizes, Ts...>; >+ >+ // `Layout` knows the element types of the arrays we want to lay out in >+ // memory but not the number of elements in each array. >+ // `Partial(size1, ..., sizeN)` allows us to specify the latter. The >+ // resulting immutable object can be used to obtain pointers to the >+ // individual arrays. >+ // >+ // It's allowed to pass fewer array sizes than the number of arrays. E.g., >+ // if all you need is the offset of the second array, you only need to >+ // pass one argument -- the number of elements in the first array. >+ // >+ // // int[3] followed by 4 bytes of padding and an unknown number of >+ // // doubles. >+ // auto x = Layout<int, double>::Partial(3); >+ // // doubles start at byte 16. >+ // assert(x.Offset<1>() == 16); >+ // >+ // If you know the number of elements in all arrays, you can still call >+ // `Partial()` but it's more convenient to use the constructor of `Layout`. >+ // >+ // Layout<int, double> x(3, 5); >+ // >+ // Note: The sizes of the arrays must be specified in number of elements, >+ // not in bytes. >+ // >+ // Requires: `sizeof...(Sizes) <= sizeof...(Ts)`. >+ // Requires: all arguments are convertible to `size_t`. >+ template <class... Sizes> >+ static constexpr PartialType<sizeof...(Sizes)> Partial(Sizes&&... sizes) { >+ static_assert(sizeof...(Sizes) <= sizeof...(Ts), ""); >+ return PartialType<sizeof...(Sizes)>(absl::forward<Sizes>(sizes)...); >+ } >+ >+ // Creates a layout with the sizes of all arrays specified. If you know >+ // only the sizes of the first N arrays (where N can be zero), you can use >+ // `Partial()` defined above. The constructor is essentially equivalent to >+ // calling `Partial()` and passing in all array sizes; the constructor is >+ // provided as a convenient abbreviation. >+ // >+ // Note: The sizes of the arrays must be specified in number of elements, >+ // not in bytes. 
>+ constexpr explicit Layout(internal_layout::TypeToSize<Ts>... sizes) >+ : internal_layout::LayoutType<sizeof...(Ts), Ts...>(sizes...) {} >+}; >+ >+} // namespace container_internal >+} // namespace absl >+ >+#endif // ABSL_CONTAINER_INTERNAL_LAYOUT_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/layout_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/layout_test.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..f35157a3bd85219886946fd062691c57053b2ede >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/layout_test.cc >@@ -0,0 +1,1552 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. 
>+ >+#include "absl/container/internal/layout.h" >+ >+// We need ::max_align_t because some libstdc++ versions don't provide >+// std::max_align_t >+#include <stddef.h> >+#include <cstdint> >+#include <memory> >+#include <sstream> >+#include <type_traits> >+ >+#include "gmock/gmock.h" >+#include "gtest/gtest.h" >+#include "absl/base/internal/raw_logging.h" >+#include "absl/types/span.h" >+ >+namespace absl { >+namespace container_internal { >+namespace { >+ >+using ::absl::Span; >+using ::testing::ElementsAre; >+ >+size_t Distance(const void* from, const void* to) { >+ ABSL_RAW_CHECK(from <= to, "Distance must be non-negative"); >+ return static_cast<const char*>(to) - static_cast<const char*>(from); >+} >+ >+template <class Expected, class Actual> >+Expected Type(Actual val) { >+ static_assert(std::is_same<Expected, Actual>(), ""); >+ return val; >+} >+ >+using Int128 = int64_t[2]; >+ >+// Properties of types that this test relies on. >+static_assert(sizeof(int8_t) == 1, ""); >+static_assert(alignof(int8_t) == 1, ""); >+static_assert(sizeof(int16_t) == 2, ""); >+static_assert(alignof(int16_t) == 2, ""); >+static_assert(sizeof(int32_t) == 4, ""); >+static_assert(alignof(int32_t) == 4, ""); >+static_assert(sizeof(Int128) == 16, ""); >+static_assert(alignof(Int128) == 8, ""); >+ >+template <class Expected, class Actual> >+void SameType() { >+ static_assert(std::is_same<Expected, Actual>(), ""); >+} >+ >+TEST(Layout, ElementType) { >+ { >+ using L = Layout<int32_t>; >+ SameType<int32_t, L::ElementType<0>>(); >+ SameType<int32_t, decltype(L::Partial())::ElementType<0>>(); >+ SameType<int32_t, decltype(L::Partial(0))::ElementType<0>>(); >+ } >+ { >+ using L = Layout<int32_t, int32_t>; >+ SameType<int32_t, L::ElementType<0>>(); >+ SameType<int32_t, L::ElementType<1>>(); >+ SameType<int32_t, decltype(L::Partial())::ElementType<0>>(); >+ SameType<int32_t, decltype(L::Partial())::ElementType<1>>(); >+ SameType<int32_t, decltype(L::Partial(0))::ElementType<0>>(); >+ 
SameType<int32_t, decltype(L::Partial(0))::ElementType<1>>(); >+ } >+ { >+ using L = Layout<int8_t, int32_t, Int128>; >+ SameType<int8_t, L::ElementType<0>>(); >+ SameType<int32_t, L::ElementType<1>>(); >+ SameType<Int128, L::ElementType<2>>(); >+ SameType<int8_t, decltype(L::Partial())::ElementType<0>>(); >+ SameType<int8_t, decltype(L::Partial(0))::ElementType<0>>(); >+ SameType<int32_t, decltype(L::Partial(0))::ElementType<1>>(); >+ SameType<int8_t, decltype(L::Partial(0, 0))::ElementType<0>>(); >+ SameType<int32_t, decltype(L::Partial(0, 0))::ElementType<1>>(); >+ SameType<Int128, decltype(L::Partial(0, 0))::ElementType<2>>(); >+ SameType<int8_t, decltype(L::Partial(0, 0, 0))::ElementType<0>>(); >+ SameType<int32_t, decltype(L::Partial(0, 0, 0))::ElementType<1>>(); >+ SameType<Int128, decltype(L::Partial(0, 0, 0))::ElementType<2>>(); >+ } >+} >+ >+TEST(Layout, ElementTypes) { >+ { >+ using L = Layout<int32_t>; >+ SameType<std::tuple<int32_t>, L::ElementTypes>(); >+ SameType<std::tuple<int32_t>, decltype(L::Partial())::ElementTypes>(); >+ SameType<std::tuple<int32_t>, decltype(L::Partial(0))::ElementTypes>(); >+ } >+ { >+ using L = Layout<int32_t, int32_t>; >+ SameType<std::tuple<int32_t, int32_t>, L::ElementTypes>(); >+ SameType<std::tuple<int32_t, int32_t>, decltype(L::Partial())::ElementTypes>(); >+ SameType<std::tuple<int32_t, int32_t>, decltype(L::Partial(0))::ElementTypes>(); >+ } >+ { >+ using L = Layout<int8_t, int32_t, Int128>; >+ SameType<std::tuple<int8_t, int32_t, Int128>, L::ElementTypes>(); >+ SameType<std::tuple<int8_t, int32_t, Int128>, >+ decltype(L::Partial())::ElementTypes>(); >+ SameType<std::tuple<int8_t, int32_t, Int128>, >+ decltype(L::Partial(0))::ElementTypes>(); >+ SameType<std::tuple<int8_t, int32_t, Int128>, >+ decltype(L::Partial(0, 0))::ElementTypes>(); >+ SameType<std::tuple<int8_t, int32_t, Int128>, >+ decltype(L::Partial(0, 0, 0))::ElementTypes>(); >+ } >+} >+ >+TEST(Layout, OffsetByIndex) { >+ { >+ using L = Layout<int32_t>; >+ 
EXPECT_EQ(0, L::Partial().Offset<0>()); >+ EXPECT_EQ(0, L::Partial(3).Offset<0>()); >+ EXPECT_EQ(0, L(3).Offset<0>()); >+ } >+ { >+ using L = Layout<int32_t, int32_t>; >+ EXPECT_EQ(0, L::Partial().Offset<0>()); >+ EXPECT_EQ(0, L::Partial(3).Offset<0>()); >+ EXPECT_EQ(12, L::Partial(3).Offset<1>()); >+ EXPECT_EQ(0, L::Partial(3, 5).Offset<0>()); >+ EXPECT_EQ(12, L::Partial(3, 5).Offset<1>()); >+ EXPECT_EQ(0, L(3, 5).Offset<0>()); >+ EXPECT_EQ(12, L(3, 5).Offset<1>()); >+ } >+ { >+ using L = Layout<int8_t, int32_t, Int128>; >+ EXPECT_EQ(0, L::Partial().Offset<0>()); >+ EXPECT_EQ(0, L::Partial(0).Offset<0>()); >+ EXPECT_EQ(0, L::Partial(0).Offset<1>()); >+ EXPECT_EQ(0, L::Partial(1).Offset<0>()); >+ EXPECT_EQ(4, L::Partial(1).Offset<1>()); >+ EXPECT_EQ(0, L::Partial(5).Offset<0>()); >+ EXPECT_EQ(8, L::Partial(5).Offset<1>()); >+ EXPECT_EQ(0, L::Partial(0, 0).Offset<0>()); >+ EXPECT_EQ(0, L::Partial(0, 0).Offset<1>()); >+ EXPECT_EQ(0, L::Partial(0, 0).Offset<2>()); >+ EXPECT_EQ(0, L::Partial(1, 0).Offset<0>()); >+ EXPECT_EQ(4, L::Partial(1, 0).Offset<1>()); >+ EXPECT_EQ(8, L::Partial(1, 0).Offset<2>()); >+ EXPECT_EQ(0, L::Partial(5, 3).Offset<0>()); >+ EXPECT_EQ(8, L::Partial(5, 3).Offset<1>()); >+ EXPECT_EQ(24, L::Partial(5, 3).Offset<2>()); >+ EXPECT_EQ(0, L::Partial(0, 0, 0).Offset<0>()); >+ EXPECT_EQ(0, L::Partial(0, 0, 0).Offset<1>()); >+ EXPECT_EQ(0, L::Partial(0, 0, 0).Offset<2>()); >+ EXPECT_EQ(0, L::Partial(1, 0, 0).Offset<0>()); >+ EXPECT_EQ(4, L::Partial(1, 0, 0).Offset<1>()); >+ EXPECT_EQ(8, L::Partial(1, 0, 0).Offset<2>()); >+ EXPECT_EQ(0, L::Partial(5, 3, 1).Offset<0>()); >+ EXPECT_EQ(24, L::Partial(5, 3, 1).Offset<2>()); >+ EXPECT_EQ(8, L::Partial(5, 3, 1).Offset<1>()); >+ EXPECT_EQ(0, L(5, 3, 1).Offset<0>()); >+ EXPECT_EQ(24, L(5, 3, 1).Offset<2>()); >+ EXPECT_EQ(8, L(5, 3, 1).Offset<1>()); >+ } >+} >+ >+TEST(Layout, OffsetByType) { >+ { >+ using L = Layout<int32_t>; >+ EXPECT_EQ(0, L::Partial().Offset<int32_t>()); >+ EXPECT_EQ(0, 
L::Partial(3).Offset<int32_t>()); >+ EXPECT_EQ(0, L(3).Offset<int32_t>()); >+ } >+ { >+ using L = Layout<int8_t, int32_t, Int128>; >+ EXPECT_EQ(0, L::Partial().Offset<int8_t>()); >+ EXPECT_EQ(0, L::Partial(0).Offset<int8_t>()); >+ EXPECT_EQ(0, L::Partial(0).Offset<int32_t>()); >+ EXPECT_EQ(0, L::Partial(1).Offset<int8_t>()); >+ EXPECT_EQ(4, L::Partial(1).Offset<int32_t>()); >+ EXPECT_EQ(0, L::Partial(5).Offset<int8_t>()); >+ EXPECT_EQ(8, L::Partial(5).Offset<int32_t>()); >+ EXPECT_EQ(0, L::Partial(0, 0).Offset<int8_t>()); >+ EXPECT_EQ(0, L::Partial(0, 0).Offset<int32_t>()); >+ EXPECT_EQ(0, L::Partial(0, 0).Offset<Int128>()); >+ EXPECT_EQ(0, L::Partial(1, 0).Offset<int8_t>()); >+ EXPECT_EQ(4, L::Partial(1, 0).Offset<int32_t>()); >+ EXPECT_EQ(8, L::Partial(1, 0).Offset<Int128>()); >+ EXPECT_EQ(0, L::Partial(5, 3).Offset<int8_t>()); >+ EXPECT_EQ(8, L::Partial(5, 3).Offset<int32_t>()); >+ EXPECT_EQ(24, L::Partial(5, 3).Offset<Int128>()); >+ EXPECT_EQ(0, L::Partial(0, 0, 0).Offset<int8_t>()); >+ EXPECT_EQ(0, L::Partial(0, 0, 0).Offset<int32_t>()); >+ EXPECT_EQ(0, L::Partial(0, 0, 0).Offset<Int128>()); >+ EXPECT_EQ(0, L::Partial(1, 0, 0).Offset<int8_t>()); >+ EXPECT_EQ(4, L::Partial(1, 0, 0).Offset<int32_t>()); >+ EXPECT_EQ(8, L::Partial(1, 0, 0).Offset<Int128>()); >+ EXPECT_EQ(0, L::Partial(5, 3, 1).Offset<int8_t>()); >+ EXPECT_EQ(24, L::Partial(5, 3, 1).Offset<Int128>()); >+ EXPECT_EQ(8, L::Partial(5, 3, 1).Offset<int32_t>()); >+ EXPECT_EQ(0, L(5, 3, 1).Offset<int8_t>()); >+ EXPECT_EQ(24, L(5, 3, 1).Offset<Int128>()); >+ EXPECT_EQ(8, L(5, 3, 1).Offset<int32_t>()); >+ } >+} >+ >+TEST(Layout, Offsets) { >+ { >+ using L = Layout<int32_t>; >+ EXPECT_THAT(L::Partial().Offsets(), ElementsAre(0)); >+ EXPECT_THAT(L::Partial(3).Offsets(), ElementsAre(0)); >+ EXPECT_THAT(L(3).Offsets(), ElementsAre(0)); >+ } >+ { >+ using L = Layout<int32_t, int32_t>; >+ EXPECT_THAT(L::Partial().Offsets(), ElementsAre(0)); >+ EXPECT_THAT(L::Partial(3).Offsets(), ElementsAre(0, 12)); >+ 
EXPECT_THAT(L::Partial(3, 5).Offsets(), ElementsAre(0, 12)); >+ EXPECT_THAT(L(3, 5).Offsets(), ElementsAre(0, 12)); >+ } >+ { >+ using L = Layout<int8_t, int32_t, Int128>; >+ EXPECT_THAT(L::Partial().Offsets(), ElementsAre(0)); >+ EXPECT_THAT(L::Partial(1).Offsets(), ElementsAre(0, 4)); >+ EXPECT_THAT(L::Partial(5).Offsets(), ElementsAre(0, 8)); >+ EXPECT_THAT(L::Partial(0, 0).Offsets(), ElementsAre(0, 0, 0)); >+ EXPECT_THAT(L::Partial(1, 0).Offsets(), ElementsAre(0, 4, 8)); >+ EXPECT_THAT(L::Partial(5, 3).Offsets(), ElementsAre(0, 8, 24)); >+ EXPECT_THAT(L::Partial(0, 0, 0).Offsets(), ElementsAre(0, 0, 0)); >+ EXPECT_THAT(L::Partial(1, 0, 0).Offsets(), ElementsAre(0, 4, 8)); >+ EXPECT_THAT(L::Partial(5, 3, 1).Offsets(), ElementsAre(0, 8, 24)); >+ EXPECT_THAT(L(5, 3, 1).Offsets(), ElementsAre(0, 8, 24)); >+ } >+} >+ >+TEST(Layout, AllocSize) { >+ { >+ using L = Layout<int32_t>; >+ EXPECT_EQ(0, L::Partial(0).AllocSize()); >+ EXPECT_EQ(12, L::Partial(3).AllocSize()); >+ EXPECT_EQ(12, L(3).AllocSize()); >+ } >+ { >+ using L = Layout<int32_t, int32_t>; >+ EXPECT_EQ(32, L::Partial(3, 5).AllocSize()); >+ EXPECT_EQ(32, L(3, 5).AllocSize()); >+ } >+ { >+ using L = Layout<int8_t, int32_t, Int128>; >+ EXPECT_EQ(0, L::Partial(0, 0, 0).AllocSize()); >+ EXPECT_EQ(8, L::Partial(1, 0, 0).AllocSize()); >+ EXPECT_EQ(8, L::Partial(0, 1, 0).AllocSize()); >+ EXPECT_EQ(16, L::Partial(0, 0, 1).AllocSize()); >+ EXPECT_EQ(24, L::Partial(1, 1, 1).AllocSize()); >+ EXPECT_EQ(136, L::Partial(3, 5, 7).AllocSize()); >+ EXPECT_EQ(136, L(3, 5, 7).AllocSize()); >+ } >+} >+ >+TEST(Layout, SizeByIndex) { >+ { >+ using L = Layout<int32_t>; >+ EXPECT_EQ(0, L::Partial(0).Size<0>()); >+ EXPECT_EQ(3, L::Partial(3).Size<0>()); >+ EXPECT_EQ(3, L(3).Size<0>()); >+ } >+ { >+ using L = Layout<int32_t, int32_t>; >+ EXPECT_EQ(0, L::Partial(0).Size<0>()); >+ EXPECT_EQ(3, L::Partial(3).Size<0>()); >+ EXPECT_EQ(3, L::Partial(3, 5).Size<0>()); >+ EXPECT_EQ(5, L::Partial(3, 5).Size<1>()); >+ EXPECT_EQ(3, L(3, 
5).Size<0>()); >+ EXPECT_EQ(5, L(3, 5).Size<1>()); >+ } >+ { >+ using L = Layout<int8_t, int32_t, Int128>; >+ EXPECT_EQ(3, L::Partial(3).Size<0>()); >+ EXPECT_EQ(3, L::Partial(3, 5).Size<0>()); >+ EXPECT_EQ(5, L::Partial(3, 5).Size<1>()); >+ EXPECT_EQ(3, L::Partial(3, 5, 7).Size<0>()); >+ EXPECT_EQ(5, L::Partial(3, 5, 7).Size<1>()); >+ EXPECT_EQ(7, L::Partial(3, 5, 7).Size<2>()); >+ EXPECT_EQ(3, L(3, 5, 7).Size<0>()); >+ EXPECT_EQ(5, L(3, 5, 7).Size<1>()); >+ EXPECT_EQ(7, L(3, 5, 7).Size<2>()); >+ } >+} >+ >+TEST(Layout, SizeByType) { >+ { >+ using L = Layout<int32_t>; >+ EXPECT_EQ(0, L::Partial(0).Size<int32_t>()); >+ EXPECT_EQ(3, L::Partial(3).Size<int32_t>()); >+ EXPECT_EQ(3, L(3).Size<int32_t>()); >+ } >+ { >+ using L = Layout<int8_t, int32_t, Int128>; >+ EXPECT_EQ(3, L::Partial(3).Size<int8_t>()); >+ EXPECT_EQ(3, L::Partial(3, 5).Size<int8_t>()); >+ EXPECT_EQ(5, L::Partial(3, 5).Size<int32_t>()); >+ EXPECT_EQ(3, L::Partial(3, 5, 7).Size<int8_t>()); >+ EXPECT_EQ(5, L::Partial(3, 5, 7).Size<int32_t>()); >+ EXPECT_EQ(7, L::Partial(3, 5, 7).Size<Int128>()); >+ EXPECT_EQ(3, L(3, 5, 7).Size<int8_t>()); >+ EXPECT_EQ(5, L(3, 5, 7).Size<int32_t>()); >+ EXPECT_EQ(7, L(3, 5, 7).Size<Int128>()); >+ } >+} >+ >+TEST(Layout, Sizes) { >+ { >+ using L = Layout<int32_t>; >+ EXPECT_THAT(L::Partial().Sizes(), ElementsAre()); >+ EXPECT_THAT(L::Partial(3).Sizes(), ElementsAre(3)); >+ EXPECT_THAT(L(3).Sizes(), ElementsAre(3)); >+ } >+ { >+ using L = Layout<int32_t, int32_t>; >+ EXPECT_THAT(L::Partial().Sizes(), ElementsAre()); >+ EXPECT_THAT(L::Partial(3).Sizes(), ElementsAre(3)); >+ EXPECT_THAT(L::Partial(3, 5).Sizes(), ElementsAre(3, 5)); >+ EXPECT_THAT(L(3, 5).Sizes(), ElementsAre(3, 5)); >+ } >+ { >+ using L = Layout<int8_t, int32_t, Int128>; >+ EXPECT_THAT(L::Partial().Sizes(), ElementsAre()); >+ EXPECT_THAT(L::Partial(3).Sizes(), ElementsAre(3)); >+ EXPECT_THAT(L::Partial(3, 5).Sizes(), ElementsAre(3, 5)); >+ EXPECT_THAT(L::Partial(3, 5, 7).Sizes(), ElementsAre(3, 5, 7)); >+ 
EXPECT_THAT(L(3, 5, 7).Sizes(), ElementsAre(3, 5, 7)); >+ } >+} >+ >+TEST(Layout, PointerByIndex) { >+ alignas(max_align_t) const unsigned char p[100] = {}; >+ { >+ using L = Layout<int32_t>; >+ EXPECT_EQ(0, Distance(p, Type<const int32_t*>(L::Partial().Pointer<0>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<const int32_t*>(L::Partial(3).Pointer<0>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<const int32_t*>(L(3).Pointer<0>(p)))); >+ } >+ { >+ using L = Layout<int32_t, int32_t>; >+ EXPECT_EQ(0, Distance(p, Type<const int32_t*>(L::Partial().Pointer<0>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<const int32_t*>(L::Partial(3).Pointer<0>(p)))); >+ EXPECT_EQ(12, Distance(p, Type<const int32_t*>(L::Partial(3).Pointer<1>(p)))); >+ EXPECT_EQ(0, >+ Distance(p, Type<const int32_t*>(L::Partial(3, 5).Pointer<0>(p)))); >+ EXPECT_EQ(12, >+ Distance(p, Type<const int32_t*>(L::Partial(3, 5).Pointer<1>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<const int32_t*>(L(3, 5).Pointer<0>(p)))); >+ EXPECT_EQ(12, Distance(p, Type<const int32_t*>(L(3, 5).Pointer<1>(p)))); >+ } >+ { >+ using L = Layout<int8_t, int32_t, Int128>; >+ EXPECT_EQ(0, Distance(p, Type<const int8_t*>(L::Partial().Pointer<0>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<const int8_t*>(L::Partial(0).Pointer<0>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<const int32_t*>(L::Partial(0).Pointer<1>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<const int8_t*>(L::Partial(1).Pointer<0>(p)))); >+ EXPECT_EQ(4, Distance(p, Type<const int32_t*>(L::Partial(1).Pointer<1>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<const int8_t*>(L::Partial(5).Pointer<0>(p)))); >+ EXPECT_EQ(8, Distance(p, Type<const int32_t*>(L::Partial(5).Pointer<1>(p)))); >+ EXPECT_EQ(0, >+ Distance(p, Type<const int8_t*>(L::Partial(0, 0).Pointer<0>(p)))); >+ EXPECT_EQ(0, >+ Distance(p, Type<const int32_t*>(L::Partial(0, 0).Pointer<1>(p)))); >+ EXPECT_EQ(0, >+ Distance(p, Type<const Int128*>(L::Partial(0, 0).Pointer<2>(p)))); >+ EXPECT_EQ(0, >+ Distance(p, Type<const int8_t*>(L::Partial(1, 
0).Pointer<0>(p)))); >+ EXPECT_EQ(4, >+ Distance(p, Type<const int32_t*>(L::Partial(1, 0).Pointer<1>(p)))); >+ EXPECT_EQ(8, >+ Distance(p, Type<const Int128*>(L::Partial(1, 0).Pointer<2>(p)))); >+ EXPECT_EQ(0, >+ Distance(p, Type<const int8_t*>(L::Partial(5, 3).Pointer<0>(p)))); >+ EXPECT_EQ(8, >+ Distance(p, Type<const int32_t*>(L::Partial(5, 3).Pointer<1>(p)))); >+ EXPECT_EQ(24, >+ Distance(p, Type<const Int128*>(L::Partial(5, 3).Pointer<2>(p)))); >+ EXPECT_EQ( >+ 0, Distance(p, Type<const int8_t*>(L::Partial(0, 0, 0).Pointer<0>(p)))); >+ EXPECT_EQ( >+ 0, Distance(p, Type<const int32_t*>(L::Partial(0, 0, 0).Pointer<1>(p)))); >+ EXPECT_EQ( >+ 0, Distance(p, Type<const Int128*>(L::Partial(0, 0, 0).Pointer<2>(p)))); >+ EXPECT_EQ( >+ 0, Distance(p, Type<const int8_t*>(L::Partial(1, 0, 0).Pointer<0>(p)))); >+ EXPECT_EQ( >+ 4, Distance(p, Type<const int32_t*>(L::Partial(1, 0, 0).Pointer<1>(p)))); >+ EXPECT_EQ( >+ 8, Distance(p, Type<const Int128*>(L::Partial(1, 0, 0).Pointer<2>(p)))); >+ EXPECT_EQ( >+ 0, Distance(p, Type<const int8_t*>(L::Partial(5, 3, 1).Pointer<0>(p)))); >+ EXPECT_EQ( >+ 24, >+ Distance(p, Type<const Int128*>(L::Partial(5, 3, 1).Pointer<2>(p)))); >+ EXPECT_EQ( >+ 8, Distance(p, Type<const int32_t*>(L::Partial(5, 3, 1).Pointer<1>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<const int8_t*>(L(5, 3, 1).Pointer<0>(p)))); >+ EXPECT_EQ(24, Distance(p, Type<const Int128*>(L(5, 3, 1).Pointer<2>(p)))); >+ EXPECT_EQ(8, Distance(p, Type<const int32_t*>(L(5, 3, 1).Pointer<1>(p)))); >+ } >+} >+ >+TEST(Layout, PointerByType) { >+ alignas(max_align_t) const unsigned char p[100] = {}; >+ { >+ using L = Layout<int32_t>; >+ EXPECT_EQ(0, >+ Distance(p, Type<const int32_t*>(L::Partial().Pointer<int32_t>(p)))); >+ EXPECT_EQ(0, >+ Distance(p, Type<const int32_t*>(L::Partial(3).Pointer<int32_t>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<const int32_t*>(L(3).Pointer<int32_t>(p)))); >+ } >+ { >+ using L = Layout<int8_t, int32_t, Int128>; >+ EXPECT_EQ(0, Distance(p, Type<const 
int8_t*>(L::Partial().Pointer<int8_t>(p)))); >+ EXPECT_EQ(0, >+ Distance(p, Type<const int8_t*>(L::Partial(0).Pointer<int8_t>(p)))); >+ EXPECT_EQ(0, >+ Distance(p, Type<const int32_t*>(L::Partial(0).Pointer<int32_t>(p)))); >+ EXPECT_EQ(0, >+ Distance(p, Type<const int8_t*>(L::Partial(1).Pointer<int8_t>(p)))); >+ EXPECT_EQ(4, >+ Distance(p, Type<const int32_t*>(L::Partial(1).Pointer<int32_t>(p)))); >+ EXPECT_EQ(0, >+ Distance(p, Type<const int8_t*>(L::Partial(5).Pointer<int8_t>(p)))); >+ EXPECT_EQ(8, >+ Distance(p, Type<const int32_t*>(L::Partial(5).Pointer<int32_t>(p)))); >+ EXPECT_EQ( >+ 0, Distance(p, Type<const int8_t*>(L::Partial(0, 0).Pointer<int8_t>(p)))); >+ EXPECT_EQ( >+ 0, Distance(p, Type<const int32_t*>(L::Partial(0, 0).Pointer<int32_t>(p)))); >+ EXPECT_EQ( >+ 0, >+ Distance(p, Type<const Int128*>(L::Partial(0, 0).Pointer<Int128>(p)))); >+ EXPECT_EQ( >+ 0, Distance(p, Type<const int8_t*>(L::Partial(1, 0).Pointer<int8_t>(p)))); >+ EXPECT_EQ( >+ 4, Distance(p, Type<const int32_t*>(L::Partial(1, 0).Pointer<int32_t>(p)))); >+ EXPECT_EQ( >+ 8, >+ Distance(p, Type<const Int128*>(L::Partial(1, 0).Pointer<Int128>(p)))); >+ EXPECT_EQ( >+ 0, Distance(p, Type<const int8_t*>(L::Partial(5, 3).Pointer<int8_t>(p)))); >+ EXPECT_EQ( >+ 8, Distance(p, Type<const int32_t*>(L::Partial(5, 3).Pointer<int32_t>(p)))); >+ EXPECT_EQ( >+ 24, >+ Distance(p, Type<const Int128*>(L::Partial(5, 3).Pointer<Int128>(p)))); >+ EXPECT_EQ( >+ 0, >+ Distance(p, Type<const int8_t*>(L::Partial(0, 0, 0).Pointer<int8_t>(p)))); >+ EXPECT_EQ( >+ 0, >+ Distance(p, Type<const int32_t*>(L::Partial(0, 0, 0).Pointer<int32_t>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<const Int128*>( >+ L::Partial(0, 0, 0).Pointer<Int128>(p)))); >+ EXPECT_EQ( >+ 0, >+ Distance(p, Type<const int8_t*>(L::Partial(1, 0, 0).Pointer<int8_t>(p)))); >+ EXPECT_EQ( >+ 4, >+ Distance(p, Type<const int32_t*>(L::Partial(1, 0, 0).Pointer<int32_t>(p)))); >+ EXPECT_EQ(8, Distance(p, Type<const Int128*>( >+ L::Partial(1, 0, 
0).Pointer<Int128>(p)))); >+ EXPECT_EQ( >+ 0, >+ Distance(p, Type<const int8_t*>(L::Partial(5, 3, 1).Pointer<int8_t>(p)))); >+ EXPECT_EQ(24, Distance(p, Type<const Int128*>( >+ L::Partial(5, 3, 1).Pointer<Int128>(p)))); >+ EXPECT_EQ( >+ 8, >+ Distance(p, Type<const int32_t*>(L::Partial(5, 3, 1).Pointer<int32_t>(p)))); >+ EXPECT_EQ(24, >+ Distance(p, Type<const Int128*>(L(5, 3, 1).Pointer<Int128>(p)))); >+ EXPECT_EQ(8, Distance(p, Type<const int32_t*>(L(5, 3, 1).Pointer<int32_t>(p)))); >+ } >+} >+ >+TEST(Layout, MutablePointerByIndex) { >+ alignas(max_align_t) unsigned char p[100]; >+ { >+ using L = Layout<int32_t>; >+ EXPECT_EQ(0, Distance(p, Type<int32_t*>(L::Partial().Pointer<0>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int32_t*>(L::Partial(3).Pointer<0>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int32_t*>(L(3).Pointer<0>(p)))); >+ } >+ { >+ using L = Layout<int32_t, int32_t>; >+ EXPECT_EQ(0, Distance(p, Type<int32_t*>(L::Partial().Pointer<0>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int32_t*>(L::Partial(3).Pointer<0>(p)))); >+ EXPECT_EQ(12, Distance(p, Type<int32_t*>(L::Partial(3).Pointer<1>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int32_t*>(L::Partial(3, 5).Pointer<0>(p)))); >+ EXPECT_EQ(12, Distance(p, Type<int32_t*>(L::Partial(3, 5).Pointer<1>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int32_t*>(L(3, 5).Pointer<0>(p)))); >+ EXPECT_EQ(12, Distance(p, Type<int32_t*>(L(3, 5).Pointer<1>(p)))); >+ } >+ { >+ using L = Layout<int8_t, int32_t, Int128>; >+ EXPECT_EQ(0, Distance(p, Type<int8_t*>(L::Partial().Pointer<0>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int8_t*>(L::Partial(0).Pointer<0>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int32_t*>(L::Partial(0).Pointer<1>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int8_t*>(L::Partial(1).Pointer<0>(p)))); >+ EXPECT_EQ(4, Distance(p, Type<int32_t*>(L::Partial(1).Pointer<1>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int8_t*>(L::Partial(5).Pointer<0>(p)))); >+ EXPECT_EQ(8, Distance(p, Type<int32_t*>(L::Partial(5).Pointer<1>(p)))); >+ EXPECT_EQ(0, 
Distance(p, Type<int8_t*>(L::Partial(0, 0).Pointer<0>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int32_t*>(L::Partial(0, 0).Pointer<1>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<Int128*>(L::Partial(0, 0).Pointer<2>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int8_t*>(L::Partial(1, 0).Pointer<0>(p)))); >+ EXPECT_EQ(4, Distance(p, Type<int32_t*>(L::Partial(1, 0).Pointer<1>(p)))); >+ EXPECT_EQ(8, Distance(p, Type<Int128*>(L::Partial(1, 0).Pointer<2>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int8_t*>(L::Partial(5, 3).Pointer<0>(p)))); >+ EXPECT_EQ(8, Distance(p, Type<int32_t*>(L::Partial(5, 3).Pointer<1>(p)))); >+ EXPECT_EQ(24, Distance(p, Type<Int128*>(L::Partial(5, 3).Pointer<2>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int8_t*>(L::Partial(0, 0, 0).Pointer<0>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int32_t*>(L::Partial(0, 0, 0).Pointer<1>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<Int128*>(L::Partial(0, 0, 0).Pointer<2>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int8_t*>(L::Partial(1, 0, 0).Pointer<0>(p)))); >+ EXPECT_EQ(4, Distance(p, Type<int32_t*>(L::Partial(1, 0, 0).Pointer<1>(p)))); >+ EXPECT_EQ(8, Distance(p, Type<Int128*>(L::Partial(1, 0, 0).Pointer<2>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int8_t*>(L::Partial(5, 3, 1).Pointer<0>(p)))); >+ EXPECT_EQ(24, >+ Distance(p, Type<Int128*>(L::Partial(5, 3, 1).Pointer<2>(p)))); >+ EXPECT_EQ(8, Distance(p, Type<int32_t*>(L::Partial(5, 3, 1).Pointer<1>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int8_t*>(L(5, 3, 1).Pointer<0>(p)))); >+ EXPECT_EQ(24, Distance(p, Type<Int128*>(L(5, 3, 1).Pointer<2>(p)))); >+ EXPECT_EQ(8, Distance(p, Type<int32_t*>(L(5, 3, 1).Pointer<1>(p)))); >+ } >+} >+ >+TEST(Layout, MutablePointerByType) { >+ alignas(max_align_t) unsigned char p[100]; >+ { >+ using L = Layout<int32_t>; >+ EXPECT_EQ(0, Distance(p, Type<int32_t*>(L::Partial().Pointer<int32_t>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int32_t*>(L::Partial(3).Pointer<int32_t>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int32_t*>(L(3).Pointer<int32_t>(p)))); >+ } >+ { 
>+ using L = Layout<int8_t, int32_t, Int128>; >+ EXPECT_EQ(0, Distance(p, Type<int8_t*>(L::Partial().Pointer<int8_t>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int8_t*>(L::Partial(0).Pointer<int8_t>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int32_t*>(L::Partial(0).Pointer<int32_t>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int8_t*>(L::Partial(1).Pointer<int8_t>(p)))); >+ EXPECT_EQ(4, Distance(p, Type<int32_t*>(L::Partial(1).Pointer<int32_t>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int8_t*>(L::Partial(5).Pointer<int8_t>(p)))); >+ EXPECT_EQ(8, Distance(p, Type<int32_t*>(L::Partial(5).Pointer<int32_t>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int8_t*>(L::Partial(0, 0).Pointer<int8_t>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int32_t*>(L::Partial(0, 0).Pointer<int32_t>(p)))); >+ EXPECT_EQ(0, >+ Distance(p, Type<Int128*>(L::Partial(0, 0).Pointer<Int128>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int8_t*>(L::Partial(1, 0).Pointer<int8_t>(p)))); >+ EXPECT_EQ(4, Distance(p, Type<int32_t*>(L::Partial(1, 0).Pointer<int32_t>(p)))); >+ EXPECT_EQ(8, >+ Distance(p, Type<Int128*>(L::Partial(1, 0).Pointer<Int128>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int8_t*>(L::Partial(5, 3).Pointer<int8_t>(p)))); >+ EXPECT_EQ(8, Distance(p, Type<int32_t*>(L::Partial(5, 3).Pointer<int32_t>(p)))); >+ EXPECT_EQ(24, >+ Distance(p, Type<Int128*>(L::Partial(5, 3).Pointer<Int128>(p)))); >+ EXPECT_EQ(0, >+ Distance(p, Type<int8_t*>(L::Partial(0, 0, 0).Pointer<int8_t>(p)))); >+ EXPECT_EQ(0, >+ Distance(p, Type<int32_t*>(L::Partial(0, 0, 0).Pointer<int32_t>(p)))); >+ EXPECT_EQ( >+ 0, Distance(p, Type<Int128*>(L::Partial(0, 0, 0).Pointer<Int128>(p)))); >+ EXPECT_EQ(0, >+ Distance(p, Type<int8_t*>(L::Partial(1, 0, 0).Pointer<int8_t>(p)))); >+ EXPECT_EQ(4, >+ Distance(p, Type<int32_t*>(L::Partial(1, 0, 0).Pointer<int32_t>(p)))); >+ EXPECT_EQ( >+ 8, Distance(p, Type<Int128*>(L::Partial(1, 0, 0).Pointer<Int128>(p)))); >+ EXPECT_EQ(0, >+ Distance(p, Type<int8_t*>(L::Partial(5, 3, 1).Pointer<int8_t>(p)))); >+ EXPECT_EQ( >+ 
24, Distance(p, Type<Int128*>(L::Partial(5, 3, 1).Pointer<Int128>(p)))); >+ EXPECT_EQ(8, >+ Distance(p, Type<int32_t*>(L::Partial(5, 3, 1).Pointer<int32_t>(p)))); >+ EXPECT_EQ(0, Distance(p, Type<int8_t*>(L(5, 3, 1).Pointer<int8_t>(p)))); >+ EXPECT_EQ(24, Distance(p, Type<Int128*>(L(5, 3, 1).Pointer<Int128>(p)))); >+ EXPECT_EQ(8, Distance(p, Type<int32_t*>(L(5, 3, 1).Pointer<int32_t>(p)))); >+ } >+} >+ >+TEST(Layout, Pointers) { >+ alignas(max_align_t) const unsigned char p[100] = {}; >+ using L = Layout<int8_t, int8_t, Int128>; >+ { >+ const auto x = L::Partial(); >+ EXPECT_EQ(std::make_tuple(x.Pointer<0>(p)), >+ Type<std::tuple<const int8_t*>>(x.Pointers(p))); >+ } >+ { >+ const auto x = L::Partial(1); >+ EXPECT_EQ(std::make_tuple(x.Pointer<0>(p), x.Pointer<1>(p)), >+ (Type<std::tuple<const int8_t*, const int8_t*>>(x.Pointers(p)))); >+ } >+ { >+ const auto x = L::Partial(1, 2); >+ EXPECT_EQ( >+ std::make_tuple(x.Pointer<0>(p), x.Pointer<1>(p), x.Pointer<2>(p)), >+ (Type<std::tuple<const int8_t*, const int8_t*, const Int128*>>( >+ x.Pointers(p)))); >+ } >+ { >+ const auto x = L::Partial(1, 2, 3); >+ EXPECT_EQ( >+ std::make_tuple(x.Pointer<0>(p), x.Pointer<1>(p), x.Pointer<2>(p)), >+ (Type<std::tuple<const int8_t*, const int8_t*, const Int128*>>( >+ x.Pointers(p)))); >+ } >+ { >+ const L x(1, 2, 3); >+ EXPECT_EQ( >+ std::make_tuple(x.Pointer<0>(p), x.Pointer<1>(p), x.Pointer<2>(p)), >+ (Type<std::tuple<const int8_t*, const int8_t*, const Int128*>>( >+ x.Pointers(p)))); >+ } >+} >+ >+TEST(Layout, MutablePointers) { >+ alignas(max_align_t) unsigned char p[100]; >+ using L = Layout<int8_t, int8_t, Int128>; >+ { >+ const auto x = L::Partial(); >+ EXPECT_EQ(std::make_tuple(x.Pointer<0>(p)), >+ Type<std::tuple<int8_t*>>(x.Pointers(p))); >+ } >+ { >+ const auto x = L::Partial(1); >+ EXPECT_EQ(std::make_tuple(x.Pointer<0>(p), x.Pointer<1>(p)), >+ (Type<std::tuple<int8_t*, int8_t*>>(x.Pointers(p)))); >+ } >+ { >+ const auto x = L::Partial(1, 2); >+ EXPECT_EQ( >+ 
std::make_tuple(x.Pointer<0>(p), x.Pointer<1>(p), x.Pointer<2>(p)), >+ (Type<std::tuple<int8_t*, int8_t*, Int128*>>(x.Pointers(p)))); >+ } >+ { >+ const auto x = L::Partial(1, 2, 3); >+ EXPECT_EQ( >+ std::make_tuple(x.Pointer<0>(p), x.Pointer<1>(p), x.Pointer<2>(p)), >+ (Type<std::tuple<int8_t*, int8_t*, Int128*>>(x.Pointers(p)))); >+ } >+ { >+ const L x(1, 2, 3); >+ EXPECT_EQ( >+ std::make_tuple(x.Pointer<0>(p), x.Pointer<1>(p), x.Pointer<2>(p)), >+ (Type<std::tuple<int8_t*, int8_t*, Int128*>>(x.Pointers(p)))); >+ } >+} >+ >+TEST(Layout, SliceByIndexSize) { >+ alignas(max_align_t) const unsigned char p[100] = {}; >+ { >+ using L = Layout<int32_t>; >+ EXPECT_EQ(0, L::Partial(0).Slice<0>(p).size()); >+ EXPECT_EQ(3, L::Partial(3).Slice<0>(p).size()); >+ EXPECT_EQ(3, L(3).Slice<0>(p).size()); >+ } >+ { >+ using L = Layout<int32_t, int32_t>; >+ EXPECT_EQ(3, L::Partial(3).Slice<0>(p).size()); >+ EXPECT_EQ(5, L::Partial(3, 5).Slice<1>(p).size()); >+ EXPECT_EQ(5, L(3, 5).Slice<1>(p).size()); >+ } >+ { >+ using L = Layout<int8_t, int32_t, Int128>; >+ EXPECT_EQ(3, L::Partial(3).Slice<0>(p).size()); >+ EXPECT_EQ(3, L::Partial(3, 5).Slice<0>(p).size()); >+ EXPECT_EQ(5, L::Partial(3, 5).Slice<1>(p).size()); >+ EXPECT_EQ(3, L::Partial(3, 5, 7).Slice<0>(p).size()); >+ EXPECT_EQ(5, L::Partial(3, 5, 7).Slice<1>(p).size()); >+ EXPECT_EQ(7, L::Partial(3, 5, 7).Slice<2>(p).size()); >+ EXPECT_EQ(3, L(3, 5, 7).Slice<0>(p).size()); >+ EXPECT_EQ(5, L(3, 5, 7).Slice<1>(p).size()); >+ EXPECT_EQ(7, L(3, 5, 7).Slice<2>(p).size()); >+ } >+} >+ >+TEST(Layout, SliceByTypeSize) { >+ alignas(max_align_t) const unsigned char p[100] = {}; >+ { >+ using L = Layout<int32_t>; >+ EXPECT_EQ(0, L::Partial(0).Slice<int32_t>(p).size()); >+ EXPECT_EQ(3, L::Partial(3).Slice<int32_t>(p).size()); >+ EXPECT_EQ(3, L(3).Slice<int32_t>(p).size()); >+ } >+ { >+ using L = Layout<int8_t, int32_t, Int128>; >+ EXPECT_EQ(3, L::Partial(3).Slice<int8_t>(p).size()); >+ EXPECT_EQ(3, L::Partial(3, 
5).Slice<int8_t>(p).size()); >+ EXPECT_EQ(5, L::Partial(3, 5).Slice<int32_t>(p).size()); >+ EXPECT_EQ(3, L::Partial(3, 5, 7).Slice<int8_t>(p).size()); >+ EXPECT_EQ(5, L::Partial(3, 5, 7).Slice<int32_t>(p).size()); >+ EXPECT_EQ(7, L::Partial(3, 5, 7).Slice<Int128>(p).size()); >+ EXPECT_EQ(3, L(3, 5, 7).Slice<int8_t>(p).size()); >+ EXPECT_EQ(5, L(3, 5, 7).Slice<int32_t>(p).size()); >+ EXPECT_EQ(7, L(3, 5, 7).Slice<Int128>(p).size()); >+ } >+} >+ >+TEST(Layout, MutableSliceByIndexSize) { >+ alignas(max_align_t) unsigned char p[100]; >+ { >+ using L = Layout<int32_t>; >+ EXPECT_EQ(0, L::Partial(0).Slice<0>(p).size()); >+ EXPECT_EQ(3, L::Partial(3).Slice<0>(p).size()); >+ EXPECT_EQ(3, L(3).Slice<0>(p).size()); >+ } >+ { >+ using L = Layout<int32_t, int32_t>; >+ EXPECT_EQ(3, L::Partial(3).Slice<0>(p).size()); >+ EXPECT_EQ(5, L::Partial(3, 5).Slice<1>(p).size()); >+ EXPECT_EQ(5, L(3, 5).Slice<1>(p).size()); >+ } >+ { >+ using L = Layout<int8_t, int32_t, Int128>; >+ EXPECT_EQ(3, L::Partial(3).Slice<0>(p).size()); >+ EXPECT_EQ(3, L::Partial(3, 5).Slice<0>(p).size()); >+ EXPECT_EQ(5, L::Partial(3, 5).Slice<1>(p).size()); >+ EXPECT_EQ(3, L::Partial(3, 5, 7).Slice<0>(p).size()); >+ EXPECT_EQ(5, L::Partial(3, 5, 7).Slice<1>(p).size()); >+ EXPECT_EQ(7, L::Partial(3, 5, 7).Slice<2>(p).size()); >+ EXPECT_EQ(3, L(3, 5, 7).Slice<0>(p).size()); >+ EXPECT_EQ(5, L(3, 5, 7).Slice<1>(p).size()); >+ EXPECT_EQ(7, L(3, 5, 7).Slice<2>(p).size()); >+ } >+} >+ >+TEST(Layout, MutableSliceByTypeSize) { >+ alignas(max_align_t) unsigned char p[100]; >+ { >+ using L = Layout<int32_t>; >+ EXPECT_EQ(0, L::Partial(0).Slice<int32_t>(p).size()); >+ EXPECT_EQ(3, L::Partial(3).Slice<int32_t>(p).size()); >+ EXPECT_EQ(3, L(3).Slice<int32_t>(p).size()); >+ } >+ { >+ using L = Layout<int8_t, int32_t, Int128>; >+ EXPECT_EQ(3, L::Partial(3).Slice<int8_t>(p).size()); >+ EXPECT_EQ(3, L::Partial(3, 5).Slice<int8_t>(p).size()); >+ EXPECT_EQ(5, L::Partial(3, 5).Slice<int32_t>(p).size()); >+ EXPECT_EQ(3, 
L::Partial(3, 5, 7).Slice<int8_t>(p).size()); >+ EXPECT_EQ(5, L::Partial(3, 5, 7).Slice<int32_t>(p).size()); >+ EXPECT_EQ(7, L::Partial(3, 5, 7).Slice<Int128>(p).size()); >+ EXPECT_EQ(3, L(3, 5, 7).Slice<int8_t>(p).size()); >+ EXPECT_EQ(5, L(3, 5, 7).Slice<int32_t>(p).size()); >+ EXPECT_EQ(7, L(3, 5, 7).Slice<Int128>(p).size()); >+ } >+} >+ >+TEST(Layout, SliceByIndexData) { >+ alignas(max_align_t) const unsigned char p[100] = {}; >+ { >+ using L = Layout<int32_t>; >+ EXPECT_EQ( >+ 0, >+ Distance(p, Type<Span<const int32_t>>(L::Partial(0).Slice<0>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance(p, Type<Span<const int32_t>>(L::Partial(3).Slice<0>(p)).data())); >+ EXPECT_EQ(0, Distance(p, Type<Span<const int32_t>>(L(3).Slice<0>(p)).data())); >+ } >+ { >+ using L = Layout<int32_t, int32_t>; >+ EXPECT_EQ( >+ 0, >+ Distance(p, Type<Span<const int32_t>>(L::Partial(3).Slice<0>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance(p, >+ Type<Span<const int32_t>>(L::Partial(3, 5).Slice<0>(p)).data())); >+ EXPECT_EQ( >+ 12, >+ Distance(p, >+ Type<Span<const int32_t>>(L::Partial(3, 5).Slice<1>(p)).data())); >+ EXPECT_EQ(0, >+ Distance(p, Type<Span<const int32_t>>(L(3, 5).Slice<0>(p)).data())); >+ EXPECT_EQ(12, >+ Distance(p, Type<Span<const int32_t>>(L(3, 5).Slice<1>(p)).data())); >+ } >+ { >+ using L = Layout<int8_t, int32_t, Int128>; >+ EXPECT_EQ( >+ 0, >+ Distance(p, Type<Span<const int8_t>>(L::Partial(0).Slice<0>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance(p, Type<Span<const int8_t>>(L::Partial(1).Slice<0>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance(p, Type<Span<const int8_t>>(L::Partial(5).Slice<0>(p)).data())); >+ EXPECT_EQ( >+ 0, Distance( >+ p, Type<Span<const int8_t>>(L::Partial(0, 0).Slice<0>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance(p, >+ Type<Span<const int32_t>>(L::Partial(0, 0).Slice<1>(p)).data())); >+ EXPECT_EQ( >+ 0, Distance( >+ p, Type<Span<const int8_t>>(L::Partial(1, 0).Slice<0>(p)).data())); >+ EXPECT_EQ( >+ 4, >+ Distance(p, >+ Type<Span<const 
int32_t>>(L::Partial(1, 0).Slice<1>(p)).data())); >+ EXPECT_EQ( >+ 0, Distance( >+ p, Type<Span<const int8_t>>(L::Partial(5, 3).Slice<0>(p)).data())); >+ EXPECT_EQ( >+ 8, >+ Distance(p, >+ Type<Span<const int32_t>>(L::Partial(5, 3).Slice<1>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance( >+ p, Type<Span<const int8_t>>(L::Partial(0, 0, 0).Slice<0>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance( >+ p, >+ Type<Span<const int32_t>>(L::Partial(0, 0, 0).Slice<1>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance( >+ p, >+ Type<Span<const Int128>>(L::Partial(0, 0, 0).Slice<2>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance( >+ p, Type<Span<const int8_t>>(L::Partial(1, 0, 0).Slice<0>(p)).data())); >+ EXPECT_EQ( >+ 4, >+ Distance( >+ p, >+ Type<Span<const int32_t>>(L::Partial(1, 0, 0).Slice<1>(p)).data())); >+ EXPECT_EQ( >+ 8, >+ Distance( >+ p, >+ Type<Span<const Int128>>(L::Partial(1, 0, 0).Slice<2>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance( >+ p, Type<Span<const int8_t>>(L::Partial(5, 3, 1).Slice<0>(p)).data())); >+ EXPECT_EQ( >+ 24, >+ Distance( >+ p, >+ Type<Span<const Int128>>(L::Partial(5, 3, 1).Slice<2>(p)).data())); >+ EXPECT_EQ( >+ 8, >+ Distance( >+ p, >+ Type<Span<const int32_t>>(L::Partial(5, 3, 1).Slice<1>(p)).data())); >+ EXPECT_EQ( >+ 0, Distance(p, Type<Span<const int8_t>>(L(5, 3, 1).Slice<0>(p)).data())); >+ EXPECT_EQ( >+ 24, >+ Distance(p, Type<Span<const Int128>>(L(5, 3, 1).Slice<2>(p)).data())); >+ EXPECT_EQ( >+ 8, Distance(p, Type<Span<const int32_t>>(L(5, 3, 1).Slice<1>(p)).data())); >+ } >+} >+ >+TEST(Layout, SliceByTypeData) { >+ alignas(max_align_t) const unsigned char p[100] = {}; >+ { >+ using L = Layout<int32_t>; >+ EXPECT_EQ( >+ 0, >+ Distance( >+ p, Type<Span<const int32_t>>(L::Partial(0).Slice<int32_t>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance( >+ p, Type<Span<const int32_t>>(L::Partial(3).Slice<int32_t>(p)).data())); >+ EXPECT_EQ( >+ 0, Distance(p, Type<Span<const int32_t>>(L(3).Slice<int32_t>(p)).data())); >+ } >+ { >+ using L = 
Layout<int8_t, int32_t, Int128>; >+ EXPECT_EQ( >+ 0, Distance( >+ p, Type<Span<const int8_t>>(L::Partial(0).Slice<int8_t>(p)).data())); >+ EXPECT_EQ( >+ 0, Distance( >+ p, Type<Span<const int8_t>>(L::Partial(1).Slice<int8_t>(p)).data())); >+ EXPECT_EQ( >+ 0, Distance( >+ p, Type<Span<const int8_t>>(L::Partial(5).Slice<int8_t>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance( >+ p, Type<Span<const int8_t>>(L::Partial(0, 0).Slice<int8_t>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance( >+ p, >+ Type<Span<const int32_t>>(L::Partial(0, 0).Slice<int32_t>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance( >+ p, Type<Span<const int8_t>>(L::Partial(1, 0).Slice<int8_t>(p)).data())); >+ EXPECT_EQ( >+ 4, >+ Distance( >+ p, >+ Type<Span<const int32_t>>(L::Partial(1, 0).Slice<int32_t>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance( >+ p, Type<Span<const int8_t>>(L::Partial(5, 3).Slice<int8_t>(p)).data())); >+ EXPECT_EQ( >+ 8, >+ Distance( >+ p, >+ Type<Span<const int32_t>>(L::Partial(5, 3).Slice<int32_t>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance( >+ p, >+ Type<Span<const int8_t>>(L::Partial(0, 0, 0).Slice<int8_t>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance(p, Type<Span<const int32_t>>(L::Partial(0, 0, 0).Slice<int32_t>(p)) >+ .data())); >+ EXPECT_EQ(0, Distance(p, Type<Span<const Int128>>( >+ L::Partial(0, 0, 0).Slice<Int128>(p)) >+ .data())); >+ EXPECT_EQ( >+ 0, >+ Distance( >+ p, >+ Type<Span<const int8_t>>(L::Partial(1, 0, 0).Slice<int8_t>(p)).data())); >+ EXPECT_EQ( >+ 4, >+ Distance(p, Type<Span<const int32_t>>(L::Partial(1, 0, 0).Slice<int32_t>(p)) >+ .data())); >+ EXPECT_EQ(8, Distance(p, Type<Span<const Int128>>( >+ L::Partial(1, 0, 0).Slice<Int128>(p)) >+ .data())); >+ EXPECT_EQ( >+ 0, >+ Distance( >+ p, >+ Type<Span<const int8_t>>(L::Partial(5, 3, 1).Slice<int8_t>(p)).data())); >+ EXPECT_EQ(24, Distance(p, Type<Span<const Int128>>( >+ L::Partial(5, 3, 1).Slice<Int128>(p)) >+ .data())); >+ EXPECT_EQ( >+ 8, >+ Distance(p, Type<Span<const int32_t>>(L::Partial(5, 3, 
1).Slice<int32_t>(p)) >+ .data())); >+ EXPECT_EQ( >+ 0, >+ Distance(p, Type<Span<const int8_t>>(L(5, 3, 1).Slice<int8_t>(p)).data())); >+ EXPECT_EQ( >+ 24, >+ Distance(p, >+ Type<Span<const Int128>>(L(5, 3, 1).Slice<Int128>(p)).data())); >+ EXPECT_EQ( >+ 8, Distance( >+ p, Type<Span<const int32_t>>(L(5, 3, 1).Slice<int32_t>(p)).data())); >+ } >+} >+ >+TEST(Layout, MutableSliceByIndexData) { >+ alignas(max_align_t) unsigned char p[100]; >+ { >+ using L = Layout<int32_t>; >+ EXPECT_EQ(0, >+ Distance(p, Type<Span<int32_t>>(L::Partial(0).Slice<0>(p)).data())); >+ EXPECT_EQ(0, >+ Distance(p, Type<Span<int32_t>>(L::Partial(3).Slice<0>(p)).data())); >+ EXPECT_EQ(0, Distance(p, Type<Span<int32_t>>(L(3).Slice<0>(p)).data())); >+ } >+ { >+ using L = Layout<int32_t, int32_t>; >+ EXPECT_EQ(0, >+ Distance(p, Type<Span<int32_t>>(L::Partial(3).Slice<0>(p)).data())); >+ EXPECT_EQ( >+ 0, Distance(p, Type<Span<int32_t>>(L::Partial(3, 5).Slice<0>(p)).data())); >+ EXPECT_EQ( >+ 12, >+ Distance(p, Type<Span<int32_t>>(L::Partial(3, 5).Slice<1>(p)).data())); >+ EXPECT_EQ(0, Distance(p, Type<Span<int32_t>>(L(3, 5).Slice<0>(p)).data())); >+ EXPECT_EQ(12, Distance(p, Type<Span<int32_t>>(L(3, 5).Slice<1>(p)).data())); >+ } >+ { >+ using L = Layout<int8_t, int32_t, Int128>; >+ EXPECT_EQ(0, >+ Distance(p, Type<Span<int8_t>>(L::Partial(0).Slice<0>(p)).data())); >+ EXPECT_EQ(0, >+ Distance(p, Type<Span<int8_t>>(L::Partial(1).Slice<0>(p)).data())); >+ EXPECT_EQ(0, >+ Distance(p, Type<Span<int8_t>>(L::Partial(5).Slice<0>(p)).data())); >+ EXPECT_EQ( >+ 0, Distance(p, Type<Span<int8_t>>(L::Partial(0, 0).Slice<0>(p)).data())); >+ EXPECT_EQ( >+ 0, Distance(p, Type<Span<int32_t>>(L::Partial(0, 0).Slice<1>(p)).data())); >+ EXPECT_EQ( >+ 0, Distance(p, Type<Span<int8_t>>(L::Partial(1, 0).Slice<0>(p)).data())); >+ EXPECT_EQ( >+ 4, Distance(p, Type<Span<int32_t>>(L::Partial(1, 0).Slice<1>(p)).data())); >+ EXPECT_EQ( >+ 0, Distance(p, Type<Span<int8_t>>(L::Partial(5, 3).Slice<0>(p)).data())); >+ EXPECT_EQ( 
>+ 8, Distance(p, Type<Span<int32_t>>(L::Partial(5, 3).Slice<1>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance(p, Type<Span<int8_t>>(L::Partial(0, 0, 0).Slice<0>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance(p, Type<Span<int32_t>>(L::Partial(0, 0, 0).Slice<1>(p)).data())); >+ EXPECT_EQ( >+ 0, Distance( >+ p, Type<Span<Int128>>(L::Partial(0, 0, 0).Slice<2>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance(p, Type<Span<int8_t>>(L::Partial(1, 0, 0).Slice<0>(p)).data())); >+ EXPECT_EQ( >+ 4, >+ Distance(p, Type<Span<int32_t>>(L::Partial(1, 0, 0).Slice<1>(p)).data())); >+ EXPECT_EQ( >+ 8, Distance( >+ p, Type<Span<Int128>>(L::Partial(1, 0, 0).Slice<2>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance(p, Type<Span<int8_t>>(L::Partial(5, 3, 1).Slice<0>(p)).data())); >+ EXPECT_EQ( >+ 24, Distance( >+ p, Type<Span<Int128>>(L::Partial(5, 3, 1).Slice<2>(p)).data())); >+ EXPECT_EQ( >+ 8, >+ Distance(p, Type<Span<int32_t>>(L::Partial(5, 3, 1).Slice<1>(p)).data())); >+ EXPECT_EQ(0, Distance(p, Type<Span<int8_t>>(L(5, 3, 1).Slice<0>(p)).data())); >+ EXPECT_EQ(24, >+ Distance(p, Type<Span<Int128>>(L(5, 3, 1).Slice<2>(p)).data())); >+ EXPECT_EQ(8, Distance(p, Type<Span<int32_t>>(L(5, 3, 1).Slice<1>(p)).data())); >+ } >+} >+ >+TEST(Layout, MutableSliceByTypeData) { >+ alignas(max_align_t) unsigned char p[100]; >+ { >+ using L = Layout<int32_t>; >+ EXPECT_EQ( >+ 0, >+ Distance(p, Type<Span<int32_t>>(L::Partial(0).Slice<int32_t>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance(p, Type<Span<int32_t>>(L::Partial(3).Slice<int32_t>(p)).data())); >+ EXPECT_EQ(0, Distance(p, Type<Span<int32_t>>(L(3).Slice<int32_t>(p)).data())); >+ } >+ { >+ using L = Layout<int8_t, int32_t, Int128>; >+ EXPECT_EQ( >+ 0, Distance(p, Type<Span<int8_t>>(L::Partial(0).Slice<int8_t>(p)).data())); >+ EXPECT_EQ( >+ 0, Distance(p, Type<Span<int8_t>>(L::Partial(1).Slice<int8_t>(p)).data())); >+ EXPECT_EQ( >+ 0, Distance(p, Type<Span<int8_t>>(L::Partial(5).Slice<int8_t>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance(p, 
Type<Span<int8_t>>(L::Partial(0, 0).Slice<int8_t>(p)).data())); >+ EXPECT_EQ( >+ 0, Distance( >+ p, Type<Span<int32_t>>(L::Partial(0, 0).Slice<int32_t>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance(p, Type<Span<int8_t>>(L::Partial(1, 0).Slice<int8_t>(p)).data())); >+ EXPECT_EQ( >+ 4, Distance( >+ p, Type<Span<int32_t>>(L::Partial(1, 0).Slice<int32_t>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance(p, Type<Span<int8_t>>(L::Partial(5, 3).Slice<int8_t>(p)).data())); >+ EXPECT_EQ( >+ 8, Distance( >+ p, Type<Span<int32_t>>(L::Partial(5, 3).Slice<int32_t>(p)).data())); >+ EXPECT_EQ( >+ 0, Distance( >+ p, Type<Span<int8_t>>(L::Partial(0, 0, 0).Slice<int8_t>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance( >+ p, Type<Span<int32_t>>(L::Partial(0, 0, 0).Slice<int32_t>(p)).data())); >+ EXPECT_EQ( >+ 0, >+ Distance( >+ p, >+ Type<Span<Int128>>(L::Partial(0, 0, 0).Slice<Int128>(p)).data())); >+ EXPECT_EQ( >+ 0, Distance( >+ p, Type<Span<int8_t>>(L::Partial(1, 0, 0).Slice<int8_t>(p)).data())); >+ EXPECT_EQ( >+ 4, >+ Distance( >+ p, Type<Span<int32_t>>(L::Partial(1, 0, 0).Slice<int32_t>(p)).data())); >+ EXPECT_EQ( >+ 8, >+ Distance( >+ p, >+ Type<Span<Int128>>(L::Partial(1, 0, 0).Slice<Int128>(p)).data())); >+ EXPECT_EQ( >+ 0, Distance( >+ p, Type<Span<int8_t>>(L::Partial(5, 3, 1).Slice<int8_t>(p)).data())); >+ EXPECT_EQ( >+ 24, >+ Distance( >+ p, >+ Type<Span<Int128>>(L::Partial(5, 3, 1).Slice<Int128>(p)).data())); >+ EXPECT_EQ( >+ 8, >+ Distance( >+ p, Type<Span<int32_t>>(L::Partial(5, 3, 1).Slice<int32_t>(p)).data())); >+ EXPECT_EQ(0, >+ Distance(p, Type<Span<int8_t>>(L(5, 3, 1).Slice<int8_t>(p)).data())); >+ EXPECT_EQ( >+ 24, >+ Distance(p, Type<Span<Int128>>(L(5, 3, 1).Slice<Int128>(p)).data())); >+ EXPECT_EQ( >+ 8, Distance(p, Type<Span<int32_t>>(L(5, 3, 1).Slice<int32_t>(p)).data())); >+ } >+} >+ >+MATCHER_P(IsSameSlice, slice, "") { >+ return arg.size() == slice.size() && arg.data() == slice.data(); >+} >+ >+template <typename... 
M> >+class TupleMatcher { >+ public: >+ explicit TupleMatcher(M... matchers) : matchers_(std::move(matchers)...) {} >+ >+ template <typename Tuple> >+ bool MatchAndExplain(const Tuple& p, >+ testing::MatchResultListener* /* listener */) const { >+ static_assert(std::tuple_size<Tuple>::value == sizeof...(M), ""); >+ return MatchAndExplainImpl( >+ p, absl::make_index_sequence<std::tuple_size<Tuple>::value>{}); >+ } >+ >+ // For the matcher concept. Left empty as we don't really need the diagnostics >+ // right now. >+ void DescribeTo(::std::ostream* os) const {} >+ void DescribeNegationTo(::std::ostream* os) const {} >+ >+ private: >+ template <typename Tuple, size_t... Is> >+ bool MatchAndExplainImpl(const Tuple& p, absl::index_sequence<Is...>) const { >+ // Using std::min as a simple variadic "and". >+ return std::min( >+ {true, testing::SafeMatcherCast< >+ const typename std::tuple_element<Is, Tuple>::type&>( >+ std::get<Is>(matchers_)) >+ .Matches(std::get<Is>(p))...}); >+ } >+ >+ std::tuple<M...> matchers_; >+}; >+ >+template <typename... M> >+testing::PolymorphicMatcher<TupleMatcher<M...>> Tuple(M... 
matchers) { >+ return testing::MakePolymorphicMatcher( >+ TupleMatcher<M...>(std::move(matchers)...)); >+} >+ >+TEST(Layout, Slices) { >+ alignas(max_align_t) const unsigned char p[100] = {}; >+ using L = Layout<int8_t, int8_t, Int128>; >+ { >+ const auto x = L::Partial(); >+ EXPECT_THAT(Type<std::tuple<>>(x.Slices(p)), Tuple()); >+ } >+ { >+ const auto x = L::Partial(1); >+ EXPECT_THAT(Type<std::tuple<Span<const int8_t>>>(x.Slices(p)), >+ Tuple(IsSameSlice(x.Slice<0>(p)))); >+ } >+ { >+ const auto x = L::Partial(1, 2); >+ EXPECT_THAT( >+ (Type<std::tuple<Span<const int8_t>, Span<const int8_t>>>(x.Slices(p))), >+ Tuple(IsSameSlice(x.Slice<0>(p)), IsSameSlice(x.Slice<1>(p)))); >+ } >+ { >+ const auto x = L::Partial(1, 2, 3); >+ EXPECT_THAT((Type<std::tuple<Span<const int8_t>, Span<const int8_t>, >+ Span<const Int128>>>(x.Slices(p))), >+ Tuple(IsSameSlice(x.Slice<0>(p)), IsSameSlice(x.Slice<1>(p)), >+ IsSameSlice(x.Slice<2>(p)))); >+ } >+ { >+ const L x(1, 2, 3); >+ EXPECT_THAT((Type<std::tuple<Span<const int8_t>, Span<const int8_t>, >+ Span<const Int128>>>(x.Slices(p))), >+ Tuple(IsSameSlice(x.Slice<0>(p)), IsSameSlice(x.Slice<1>(p)), >+ IsSameSlice(x.Slice<2>(p)))); >+ } >+} >+ >+TEST(Layout, MutableSlices) { >+ alignas(max_align_t) unsigned char p[100] = {}; >+ using L = Layout<int8_t, int8_t, Int128>; >+ { >+ const auto x = L::Partial(); >+ EXPECT_THAT(Type<std::tuple<>>(x.Slices(p)), Tuple()); >+ } >+ { >+ const auto x = L::Partial(1); >+ EXPECT_THAT(Type<std::tuple<Span<int8_t>>>(x.Slices(p)), >+ Tuple(IsSameSlice(x.Slice<0>(p)))); >+ } >+ { >+ const auto x = L::Partial(1, 2); >+ EXPECT_THAT((Type<std::tuple<Span<int8_t>, Span<int8_t>>>(x.Slices(p))), >+ Tuple(IsSameSlice(x.Slice<0>(p)), IsSameSlice(x.Slice<1>(p)))); >+ } >+ { >+ const auto x = L::Partial(1, 2, 3); >+ EXPECT_THAT( >+ (Type<std::tuple<Span<int8_t>, Span<int8_t>, Span<Int128>>>(x.Slices(p))), >+ Tuple(IsSameSlice(x.Slice<0>(p)), IsSameSlice(x.Slice<1>(p)), >+ IsSameSlice(x.Slice<2>(p)))); >+ } >+ 
{ >+ const L x(1, 2, 3); >+ EXPECT_THAT( >+ (Type<std::tuple<Span<int8_t>, Span<int8_t>, Span<Int128>>>(x.Slices(p))), >+ Tuple(IsSameSlice(x.Slice<0>(p)), IsSameSlice(x.Slice<1>(p)), >+ IsSameSlice(x.Slice<2>(p)))); >+ } >+} >+ >+TEST(Layout, UnalignedTypes) { >+ constexpr Layout<unsigned char, unsigned char, unsigned char> x(1, 2, 3); >+ alignas(max_align_t) unsigned char p[x.AllocSize() + 1]; >+ EXPECT_THAT(x.Pointers(p + 1), Tuple(p + 1, p + 2, p + 4)); >+} >+ >+TEST(Layout, CustomAlignment) { >+ constexpr Layout<unsigned char, Aligned<unsigned char, 8>> x(1, 2); >+ alignas(max_align_t) unsigned char p[x.AllocSize()]; >+ EXPECT_EQ(10, x.AllocSize()); >+ EXPECT_THAT(x.Pointers(p), Tuple(p + 0, p + 8)); >+} >+ >+TEST(Layout, OverAligned) { >+ constexpr size_t M = alignof(max_align_t); >+ constexpr Layout<unsigned char, Aligned<unsigned char, 2 * M>> x(1, 3); >+ alignas(2 * M) unsigned char p[x.AllocSize()]; >+ EXPECT_EQ(2 * M + 3, x.AllocSize()); >+ EXPECT_THAT(x.Pointers(p), Tuple(p + 0, p + 2 * M)); >+} >+ >+TEST(Layout, Alignment) { >+ static_assert(Layout<int8_t>::Alignment() == 1, ""); >+ static_assert(Layout<int32_t>::Alignment() == 4, ""); >+ static_assert(Layout<int64_t>::Alignment() == 8, ""); >+ static_assert(Layout<Aligned<int8_t, 64>>::Alignment() == 64, ""); >+ static_assert(Layout<int8_t, int32_t, int64_t>::Alignment() == 8, ""); >+ static_assert(Layout<int8_t, int64_t, int32_t>::Alignment() == 8, ""); >+ static_assert(Layout<int32_t, int8_t, int64_t>::Alignment() == 8, ""); >+ static_assert(Layout<int32_t, int64_t, int8_t>::Alignment() == 8, ""); >+ static_assert(Layout<int64_t, int8_t, int32_t>::Alignment() == 8, ""); >+ static_assert(Layout<int64_t, int32_t, int8_t>::Alignment() == 8, ""); >+} >+ >+TEST(Layout, ConstexprPartial) { >+ constexpr size_t M = alignof(max_align_t); >+ constexpr Layout<unsigned char, Aligned<unsigned char, 2 * M>> x(1, 3); >+ static_assert(x.Partial(1).template Offset<1>() == 2 * M, ""); >+} >+// [from, to) >+struct 
Region { >+ size_t from; >+ size_t to; >+}; >+ >+void ExpectRegionPoisoned(const unsigned char* p, size_t n, bool poisoned) { >+#ifdef ADDRESS_SANITIZER >+ for (size_t i = 0; i != n; ++i) { >+ EXPECT_EQ(poisoned, __asan_address_is_poisoned(p + i)); >+ } >+#endif >+} >+ >+template <size_t N> >+void ExpectPoisoned(const unsigned char (&buf)[N], >+ std::initializer_list<Region> reg) { >+ size_t prev = 0; >+ for (const Region& r : reg) { >+ ExpectRegionPoisoned(buf + prev, r.from - prev, false); >+ ExpectRegionPoisoned(buf + r.from, r.to - r.from, true); >+ prev = r.to; >+ } >+ ExpectRegionPoisoned(buf + prev, N - prev, false); >+} >+ >+TEST(Layout, PoisonPadding) { >+ using L = Layout<int8_t, int64_t, int32_t, Int128>; >+ >+ constexpr size_t n = L::Partial(1, 2, 3, 4).AllocSize(); >+ { >+ constexpr auto x = L::Partial(); >+ alignas(max_align_t) const unsigned char c[n] = {}; >+ x.PoisonPadding(c); >+ EXPECT_EQ(x.Slices(c), x.Slices(c)); >+ ExpectPoisoned(c, {}); >+ } >+ { >+ constexpr auto x = L::Partial(1); >+ alignas(max_align_t) const unsigned char c[n] = {}; >+ x.PoisonPadding(c); >+ EXPECT_EQ(x.Slices(c), x.Slices(c)); >+ ExpectPoisoned(c, {{1, 8}}); >+ } >+ { >+ constexpr auto x = L::Partial(1, 2); >+ alignas(max_align_t) const unsigned char c[n] = {}; >+ x.PoisonPadding(c); >+ EXPECT_EQ(x.Slices(c), x.Slices(c)); >+ ExpectPoisoned(c, {{1, 8}}); >+ } >+ { >+ constexpr auto x = L::Partial(1, 2, 3); >+ alignas(max_align_t) const unsigned char c[n] = {}; >+ x.PoisonPadding(c); >+ EXPECT_EQ(x.Slices(c), x.Slices(c)); >+ ExpectPoisoned(c, {{1, 8}, {36, 40}}); >+ } >+ { >+ constexpr auto x = L::Partial(1, 2, 3, 4); >+ alignas(max_align_t) const unsigned char c[n] = {}; >+ x.PoisonPadding(c); >+ EXPECT_EQ(x.Slices(c), x.Slices(c)); >+ ExpectPoisoned(c, {{1, 8}, {36, 40}}); >+ } >+ { >+ constexpr L x(1, 2, 3, 4); >+ alignas(max_align_t) const unsigned char c[n] = {}; >+ x.PoisonPadding(c); >+ EXPECT_EQ(x.Slices(c), x.Slices(c)); >+ ExpectPoisoned(c, {{1, 8}, {36, 40}}); 
>+ } >+} >+ >+TEST(Layout, DebugString) { >+ const std::string int64_type = >+#ifdef _MSC_VER >+ "__int64"; >+#else // _MSC_VER >+ std::is_same<int64_t, long long>::value ? "long long" : "long"; // NOLINT >+#endif // _MSC_VER >+ { >+ constexpr auto x = Layout<int8_t, int32_t, int8_t, Int128>::Partial(); >+ EXPECT_EQ("@0<signed char>(1)", x.DebugString()); >+ } >+ { >+ constexpr auto x = Layout<int8_t, int32_t, int8_t, Int128>::Partial(1); >+ EXPECT_EQ("@0<signed char>(1)[1]; @4<int>(4)", x.DebugString()); >+ } >+ { >+ constexpr auto x = Layout<int8_t, int32_t, int8_t, Int128>::Partial(1, 2); >+ EXPECT_EQ("@0<signed char>(1)[1]; @4<int>(4)[2]; @12<signed char>(1)", >+ x.DebugString()); >+ } >+ { >+ constexpr auto x = Layout<int8_t, int32_t, int8_t, Int128>::Partial(1, 2, 3); >+ EXPECT_EQ( >+ "@0<signed char>(1)[1]; @4<int>(4)[2]; @12<signed char>(1)[3]; " >+ "@16<" + >+ int64_type + " [2]>(16)", >+ x.DebugString()); >+ } >+ { >+ constexpr auto x = Layout<int8_t, int32_t, int8_t, Int128>::Partial(1, 2, 3, 4); >+ EXPECT_EQ( >+ "@0<signed char>(1)[1]; @4<int>(4)[2]; @12<signed char>(1)[3]; " >+ "@16<" + >+ int64_type + " [2]>(16)[4]", >+ x.DebugString()); >+ } >+ { >+ constexpr Layout<int8_t, int32_t, int8_t, Int128> x(1, 2, 3, 4); >+ EXPECT_EQ( >+ "@0<signed char>(1)[1]; @4<int>(4)[2]; @12<signed char>(1)[3]; " >+ "@16<" + >+ int64_type + " [2]>(16)[4]", >+ x.DebugString()); >+ } >+} >+ >+TEST(Layout, CharTypes) { >+ constexpr Layout<int32_t> x(1); >+ alignas(max_align_t) char c[x.AllocSize()] = {}; >+ alignas(max_align_t) unsigned char uc[x.AllocSize()] = {}; >+ alignas(max_align_t) signed char sc[x.AllocSize()] = {}; >+ alignas(max_align_t) const char cc[x.AllocSize()] = {}; >+ alignas(max_align_t) const unsigned char cuc[x.AllocSize()] = {}; >+ alignas(max_align_t) const signed char csc[x.AllocSize()] = {}; >+ >+ Type<int32_t*>(x.Pointer<0>(c)); >+ Type<int32_t*>(x.Pointer<0>(uc)); >+ Type<int32_t*>(x.Pointer<0>(sc)); >+ Type<const int32_t*>(x.Pointer<0>(cc)); >+ 
Type<const int32_t*>(x.Pointer<0>(cuc)); >+ Type<const int32_t*>(x.Pointer<0>(csc)); >+ >+ Type<int32_t*>(x.Pointer<int32_t>(c)); >+ Type<int32_t*>(x.Pointer<int32_t>(uc)); >+ Type<int32_t*>(x.Pointer<int32_t>(sc)); >+ Type<const int32_t*>(x.Pointer<int32_t>(cc)); >+ Type<const int32_t*>(x.Pointer<int32_t>(cuc)); >+ Type<const int32_t*>(x.Pointer<int32_t>(csc)); >+ >+ Type<std::tuple<int32_t*>>(x.Pointers(c)); >+ Type<std::tuple<int32_t*>>(x.Pointers(uc)); >+ Type<std::tuple<int32_t*>>(x.Pointers(sc)); >+ Type<std::tuple<const int32_t*>>(x.Pointers(cc)); >+ Type<std::tuple<const int32_t*>>(x.Pointers(cuc)); >+ Type<std::tuple<const int32_t*>>(x.Pointers(csc)); >+ >+ Type<Span<int32_t>>(x.Slice<0>(c)); >+ Type<Span<int32_t>>(x.Slice<0>(uc)); >+ Type<Span<int32_t>>(x.Slice<0>(sc)); >+ Type<Span<const int32_t>>(x.Slice<0>(cc)); >+ Type<Span<const int32_t>>(x.Slice<0>(cuc)); >+ Type<Span<const int32_t>>(x.Slice<0>(csc)); >+ >+ Type<std::tuple<Span<int32_t>>>(x.Slices(c)); >+ Type<std::tuple<Span<int32_t>>>(x.Slices(uc)); >+ Type<std::tuple<Span<int32_t>>>(x.Slices(sc)); >+ Type<std::tuple<Span<const int32_t>>>(x.Slices(cc)); >+ Type<std::tuple<Span<const int32_t>>>(x.Slices(cuc)); >+ Type<std::tuple<Span<const int32_t>>>(x.Slices(csc)); >+} >+ >+TEST(Layout, ConstElementType) { >+ constexpr Layout<const int32_t> x(1); >+ alignas(int32_t) char c[x.AllocSize()] = {}; >+ const char* cc = c; >+ const int32_t* p = reinterpret_cast<const int32_t*>(cc); >+ >+ EXPECT_EQ(alignof(int32_t), x.Alignment()); >+ >+ EXPECT_EQ(0, x.Offset<0>()); >+ EXPECT_EQ(0, x.Offset<const int32_t>()); >+ >+ EXPECT_THAT(x.Offsets(), ElementsAre(0)); >+ >+ EXPECT_EQ(1, x.Size<0>()); >+ EXPECT_EQ(1, x.Size<const int32_t>()); >+ >+ EXPECT_THAT(x.Sizes(), ElementsAre(1)); >+ >+ EXPECT_EQ(sizeof(int32_t), x.AllocSize()); >+ >+ EXPECT_EQ(p, Type<const int32_t*>(x.Pointer<0>(c))); >+ EXPECT_EQ(p, Type<const int32_t*>(x.Pointer<0>(cc))); >+ >+ EXPECT_EQ(p, Type<const int32_t*>(x.Pointer<const 
int32_t>(c))); >+ EXPECT_EQ(p, Type<const int32_t*>(x.Pointer<const int32_t>(cc))); >+ >+ EXPECT_THAT(Type<std::tuple<const int32_t*>>(x.Pointers(c)), Tuple(p)); >+ EXPECT_THAT(Type<std::tuple<const int32_t*>>(x.Pointers(cc)), Tuple(p)); >+ >+ EXPECT_THAT(Type<Span<const int32_t>>(x.Slice<0>(c)), >+ IsSameSlice(Span<const int32_t>(p, 1))); >+ EXPECT_THAT(Type<Span<const int32_t>>(x.Slice<0>(cc)), >+ IsSameSlice(Span<const int32_t>(p, 1))); >+ >+ EXPECT_THAT(Type<Span<const int32_t>>(x.Slice<const int32_t>(c)), >+ IsSameSlice(Span<const int32_t>(p, 1))); >+ EXPECT_THAT(Type<Span<const int32_t>>(x.Slice<const int32_t>(cc)), >+ IsSameSlice(Span<const int32_t>(p, 1))); >+ >+ EXPECT_THAT(Type<std::tuple<Span<const int32_t>>>(x.Slices(c)), >+ Tuple(IsSameSlice(Span<const int32_t>(p, 1)))); >+ EXPECT_THAT(Type<std::tuple<Span<const int32_t>>>(x.Slices(cc)), >+ Tuple(IsSameSlice(Span<const int32_t>(p, 1)))); >+} >+ >+namespace example { >+ >+// Immutable move-only string with sizeof equal to sizeof(void*). The string >+// size and the characters are kept in the same heap allocation. >+class CompactString { >+ public: >+ CompactString(const char* s = "") { // NOLINT >+ const size_t size = strlen(s); >+ // size_t[1], followed by char[size + 1]. >+ // This statement doesn't allocate memory. >+ const L layout(1, size + 1); >+ // AllocSize() tells us how much memory we need to allocate for all our >+ // data. >+ p_.reset(new unsigned char[layout.AllocSize()]); >+ // If running under ASAN, mark the padding bytes, if any, to catch memory >+ // errors. >+ layout.PoisonPadding(p_.get()); >+ // Store the size in the allocation. >+ // Pointer<size_t>() is a synonym for Pointer<0>(). >+ *layout.Pointer<size_t>(p_.get()) = size; >+ // Store the characters in the allocation. >+ memcpy(layout.Pointer<char>(p_.get()), s, size + 1); >+ } >+ >+ size_t size() const { >+ // Equivalent to reinterpret_cast<size_t&>(*p). 
>+ return *L::Partial().Pointer<size_t>(p_.get()); >+ } >+ >+ const char* c_str() const { >+ // Equivalent to reinterpret_cast<char*>(p.get() + sizeof(size_t)). >+ // The argument in Partial(1) specifies that we have size_t[1] in front of >+ // the >+ // characters. >+ return L::Partial(1).Pointer<char>(p_.get()); >+ } >+ >+ private: >+ // Our heap allocation contains a size_t followed by an array of chars. >+ using L = Layout<size_t, char>; >+ std::unique_ptr<unsigned char[]> p_; >+}; >+ >+TEST(CompactString, Works) { >+ CompactString s = "hello"; >+ EXPECT_EQ(5, s.size()); >+ EXPECT_STREQ("hello", s.c_str()); >+} >+ >+} // namespace example >+ >+} // namespace >+} // namespace container_internal >+} // namespace absl >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/node_hash_policy.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/node_hash_policy.h >new file mode 100644 >index 0000000000000000000000000000000000000000..065e7009e7e7e4c5b033a5d258cad93572cfda5e >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/node_hash_policy.h >@@ -0,0 +1,88 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. >+// >+// Adapts a policy for nodes. 
>+// >+// The node policy should model: >+// >+// struct Policy { >+// // Returns a new node allocated and constructed using the allocator, using >+// // the specified arguments. >+// template <class Alloc, class... Args> >+// value_type* new_element(Alloc* alloc, Args&&... args) const; >+// >+// // Destroys and deallocates node using the allocator. >+// template <class Alloc> >+// void delete_element(Alloc* alloc, value_type* node) const; >+// }; >+// >+// It may also optionally define `value()` and `apply()`. For documentation on >+// these, see hash_policy_traits.h. >+ >+#ifndef ABSL_CONTAINER_INTERNAL_NODE_HASH_POLICY_H_ >+#define ABSL_CONTAINER_INTERNAL_NODE_HASH_POLICY_H_ >+ >+#include <cassert> >+#include <cstddef> >+#include <memory> >+#include <type_traits> >+#include <utility> >+ >+namespace absl { >+namespace container_internal { >+ >+template <class Reference, class Policy> >+struct node_hash_policy { >+ static_assert(std::is_lvalue_reference<Reference>::value, ""); >+ >+ using slot_type = typename std::remove_cv< >+ typename std::remove_reference<Reference>::type>::type*; >+ >+ template <class Alloc, class... Args> >+ static void construct(Alloc* alloc, slot_type* slot, Args&&... args) { >+ *slot = Policy::new_element(alloc, std::forward<Args>(args)...); >+ } >+ >+ template <class Alloc> >+ static void destroy(Alloc* alloc, slot_type* slot) { >+ Policy::delete_element(alloc, *slot); >+ } >+ >+ template <class Alloc> >+ static void transfer(Alloc*, slot_type* new_slot, slot_type* old_slot) { >+ *new_slot = *old_slot; >+ } >+ >+ static size_t space_used(const slot_type* slot) { >+ if (slot == nullptr) return Policy::element_space_used(nullptr); >+ return Policy::element_space_used(*slot); >+ } >+ >+ static Reference element(slot_type* slot) { return **slot; } >+ >+ template <class T, class P = Policy> >+ static auto value(T* elem) -> decltype(P::value(elem)) { >+ return P::value(elem); >+ } >+ >+ template <class... 
Ts, class P = Policy> >+ static auto apply(Ts&&... ts) -> decltype(P::apply(std::forward<Ts>(ts)...)) { >+ return P::apply(std::forward<Ts>(ts)...); >+ } >+}; >+ >+} // namespace container_internal >+} // namespace absl >+ >+#endif // ABSL_CONTAINER_INTERNAL_NODE_HASH_POLICY_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/node_hash_policy_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/node_hash_policy_test.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..43d287e3c4a04ebd618c305a15b76305f72a7497 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/node_hash_policy_test.cc >@@ -0,0 +1,67 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. 
>+ >+#include "absl/container/internal/node_hash_policy.h" >+ >+#include <memory> >+ >+#include "gmock/gmock.h" >+#include "gtest/gtest.h" >+#include "absl/container/internal/hash_policy_traits.h" >+ >+namespace absl { >+namespace container_internal { >+namespace { >+ >+using ::testing::Pointee; >+ >+struct Policy : node_hash_policy<int&, Policy> { >+ using key_type = int; >+ using init_type = int; >+ >+ template <class Alloc> >+ static int* new_element(Alloc* alloc, int value) { >+ return new int(value); >+ } >+ >+ template <class Alloc> >+ static void delete_element(Alloc* alloc, int* elem) { >+ delete elem; >+ } >+}; >+ >+using NodePolicy = hash_policy_traits<Policy>; >+ >+struct NodeTest : ::testing::Test { >+ std::allocator<int> alloc; >+ int n = 53; >+ int* a = &n; >+}; >+ >+TEST_F(NodeTest, ConstructDestroy) { >+ NodePolicy::construct(&alloc, &a, 42); >+ EXPECT_THAT(a, Pointee(42)); >+ NodePolicy::destroy(&alloc, &a); >+} >+ >+TEST_F(NodeTest, transfer) { >+ int s = 42; >+ int* b = &s; >+ NodePolicy::transfer(&alloc, &a, &b); >+ EXPECT_EQ(&s, a); >+} >+ >+} // namespace >+} // namespace container_internal >+} // namespace absl >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/raw_hash_map.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/raw_hash_map.h >new file mode 100644 >index 0000000000000000000000000000000000000000..1edc0071e7de829de2ddc79675846632f6604c46 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/raw_hash_map.h >@@ -0,0 +1,182 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. 
>+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. >+ >+#ifndef ABSL_CONTAINER_INTERNAL_RAW_HASH_MAP_H_ >+#define ABSL_CONTAINER_INTERNAL_RAW_HASH_MAP_H_ >+ >+#include <tuple> >+#include <type_traits> >+#include <utility> >+ >+#include "absl/container/internal/container_memory.h" >+#include "absl/container/internal/raw_hash_set.h" // IWYU pragma: export >+ >+namespace absl { >+namespace container_internal { >+ >+template <class Policy, class Hash, class Eq, class Alloc> >+class raw_hash_map : public raw_hash_set<Policy, Hash, Eq, Alloc> { >+ // P is Policy. It's passed as a template argument to support maps that have >+ // incomplete types as values, as in unordered_map<K, IncompleteType>. >+ // MappedReference<> may be a non-reference type. >+ template <class P> >+ using MappedReference = decltype(P::value( >+ std::addressof(std::declval<typename raw_hash_map::reference>()))); >+ >+ // MappedConstReference<> may be a non-reference type. >+ template <class P> >+ using MappedConstReference = decltype(P::value( >+ std::addressof(std::declval<typename raw_hash_map::const_reference>()))); >+ >+ public: >+ using key_type = typename Policy::key_type; >+ using mapped_type = typename Policy::mapped_type; >+ template <typename K> >+ using key_arg = typename raw_hash_map::raw_hash_set::template key_arg<K>; >+ >+ static_assert(!std::is_reference<key_type>::value, ""); >+ // TODO(alkis): remove this assertion and verify that reference mapped_type is >+ // supported. 
>+ static_assert(!std::is_reference<mapped_type>::value, ""); >+ >+ using iterator = typename raw_hash_map::raw_hash_set::iterator; >+ using const_iterator = typename raw_hash_map::raw_hash_set::const_iterator; >+ >+ raw_hash_map() {} >+ using raw_hash_map::raw_hash_set::raw_hash_set; >+ >+ // The last two template parameters ensure that both arguments are rvalues >+ // (lvalue arguments are handled by the overloads below). This is necessary >+ // for supporting bitfield arguments. >+ // >+ // union { int n : 1; }; >+ // flat_hash_map<int, int> m; >+ // m.insert_or_assign(n, n); >+ template <class K = key_type, class V = mapped_type, K* = nullptr, >+ V* = nullptr> >+ std::pair<iterator, bool> insert_or_assign(key_arg<K>&& k, V&& v) { >+ return insert_or_assign_impl(std::forward<K>(k), std::forward<V>(v)); >+ } >+ >+ template <class K = key_type, class V = mapped_type, K* = nullptr> >+ std::pair<iterator, bool> insert_or_assign(key_arg<K>&& k, const V& v) { >+ return insert_or_assign_impl(std::forward<K>(k), v); >+ } >+ >+ template <class K = key_type, class V = mapped_type, V* = nullptr> >+ std::pair<iterator, bool> insert_or_assign(const key_arg<K>& k, V&& v) { >+ return insert_or_assign_impl(k, std::forward<V>(v)); >+ } >+ >+ template <class K = key_type, class V = mapped_type> >+ std::pair<iterator, bool> insert_or_assign(const key_arg<K>& k, const V& v) { >+ return insert_or_assign_impl(k, v); >+ } >+ >+ template <class K = key_type, class V = mapped_type, K* = nullptr, >+ V* = nullptr> >+ iterator insert_or_assign(const_iterator, key_arg<K>&& k, V&& v) { >+ return insert_or_assign(std::forward<K>(k), std::forward<V>(v)).first; >+ } >+ >+ template <class K = key_type, class V = mapped_type, K* = nullptr> >+ iterator insert_or_assign(const_iterator, key_arg<K>&& k, const V& v) { >+ return insert_or_assign(std::forward<K>(k), v).first; >+ } >+ >+ template <class K = key_type, class V = mapped_type, V* = nullptr> >+ iterator insert_or_assign(const_iterator, const 
key_arg<K>& k, V&& v) { >+ return insert_or_assign(k, std::forward<V>(v)).first; >+ } >+ >+ template <class K = key_type, class V = mapped_type> >+ iterator insert_or_assign(const_iterator, const key_arg<K>& k, const V& v) { >+ return insert_or_assign(k, v).first; >+ } >+ >+ template <class K = key_type, class... Args, >+ typename std::enable_if< >+ !std::is_convertible<K, const_iterator>::value, int>::type = 0, >+ K* = nullptr> >+ std::pair<iterator, bool> try_emplace(key_arg<K>&& k, Args&&... args) { >+ return try_emplace_impl(std::forward<K>(k), std::forward<Args>(args)...); >+ } >+ >+ template <class K = key_type, class... Args, >+ typename std::enable_if< >+ !std::is_convertible<K, const_iterator>::value, int>::type = 0> >+ std::pair<iterator, bool> try_emplace(const key_arg<K>& k, Args&&... args) { >+ return try_emplace_impl(k, std::forward<Args>(args)...); >+ } >+ >+ template <class K = key_type, class... Args, K* = nullptr> >+ iterator try_emplace(const_iterator, key_arg<K>&& k, Args&&... args) { >+ return try_emplace(std::forward<K>(k), std::forward<Args>(args)...).first; >+ } >+ >+ template <class K = key_type, class... Args> >+ iterator try_emplace(const_iterator, const key_arg<K>& k, Args&&... 
args) { >+ return try_emplace(k, std::forward<Args>(args)...).first; >+ } >+ >+ template <class K = key_type, class P = Policy> >+ MappedReference<P> at(const key_arg<K>& key) { >+ auto it = this->find(key); >+ if (it == this->end()) std::abort(); >+ return Policy::value(&*it); >+ } >+ >+ template <class K = key_type, class P = Policy> >+ MappedConstReference<P> at(const key_arg<K>& key) const { >+ auto it = this->find(key); >+ if (it == this->end()) std::abort(); >+ return Policy::value(&*it); >+ } >+ >+ template <class K = key_type, class P = Policy, K* = nullptr> >+ MappedReference<P> operator[](key_arg<K>&& key) { >+ return Policy::value(&*try_emplace(std::forward<K>(key)).first); >+ } >+ >+ template <class K = key_type, class P = Policy> >+ MappedReference<P> operator[](const key_arg<K>& key) { >+ return Policy::value(&*try_emplace(key).first); >+ } >+ >+ private: >+ template <class K, class V> >+ std::pair<iterator, bool> insert_or_assign_impl(K&& k, V&& v) { >+ auto res = this->find_or_prepare_insert(k); >+ if (res.second) >+ this->emplace_at(res.first, std::forward<K>(k), std::forward<V>(v)); >+ else >+ Policy::value(&*this->iterator_at(res.first)) = std::forward<V>(v); >+ return {this->iterator_at(res.first), res.second}; >+ } >+ >+ template <class K = key_type, class... Args> >+ std::pair<iterator, bool> try_emplace_impl(K&& k, Args&&... 
args) { >+ auto res = this->find_or_prepare_insert(k); >+ if (res.second) >+ this->emplace_at(res.first, std::piecewise_construct, >+ std::forward_as_tuple(std::forward<K>(k)), >+ std::forward_as_tuple(std::forward<Args>(args)...)); >+ return {this->iterator_at(res.first), res.second}; >+ } >+}; >+ >+} // namespace container_internal >+} // namespace absl >+ >+#endif // ABSL_CONTAINER_INTERNAL_RAW_HASH_MAP_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/raw_hash_set.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/raw_hash_set.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..10153129fd1e6ebe7513009ea8423fdec45e3634 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/raw_hash_set.cc >@@ -0,0 +1,45 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. >+ >+#include "absl/container/internal/raw_hash_set.h" >+ >+#include <cstddef> >+ >+#include "absl/base/config.h" >+ >+namespace absl { >+namespace container_internal { >+ >+constexpr size_t Group::kWidth; >+ >+// Returns "random" seed. 
>+inline size_t RandomSeed() { >+#if ABSL_HAVE_THREAD_LOCAL >+ static thread_local size_t counter = 0; >+ size_t value = ++counter; >+#else // ABSL_HAVE_THREAD_LOCAL >+ static std::atomic<size_t> counter; >+ size_t value = counter.fetch_add(1, std::memory_order_relaxed); >+#endif // ABSL_HAVE_THREAD_LOCAL >+ return value ^ static_cast<size_t>(reinterpret_cast<uintptr_t>(&counter)); >+} >+ >+bool ShouldInsertBackwards(size_t hash, ctrl_t* ctrl) { >+ // To avoid problems with weak hashes and single bit tests, we use % 13. >+ // TODO(kfm,sbenza): revisit after we do unconditional mixing >+ return (H1(hash, ctrl) ^ RandomSeed()) % 13 > 6; >+} >+ >+} // namespace container_internal >+} // namespace absl >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/raw_hash_set.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/raw_hash_set.h >new file mode 100644 >index 0000000000000000000000000000000000000000..70da90f79a4fc63a9676582727cfecc7ed8c2be5 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/raw_hash_set.h >@@ -0,0 +1,1915 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. >+// >+// An open-addressing >+// hashtable with quadratic probing. 
>+// >+// This is a low level hashtable on top of which different interfaces can be >+// implemented, like flat_hash_set, node_hash_set, string_hash_set, etc. >+// >+// The table interface is similar to that of std::unordered_set. Notable >+// differences are that most member functions support heterogeneous keys when >+// BOTH the hash and eq functions are marked as transparent. They do so by >+// providing a typedef called `is_transparent`. >+// >+// When heterogeneous lookup is enabled, functions that take key_type act as if >+// they have an overload set like: >+// >+// iterator find(const key_type& key); >+// template <class K> >+// iterator find(const K& key); >+// >+// size_type erase(const key_type& key); >+// template <class K> >+// size_type erase(const K& key); >+// >+// std::pair<iterator, iterator> equal_range(const key_type& key); >+// template <class K> >+// std::pair<iterator, iterator> equal_range(const K& key); >+// >+// When heterogeneous lookup is disabled, only the explicit `key_type` overloads >+// exist. >+// >+// find() also supports passing the hash explicitly: >+// >+// iterator find(const key_type& key, size_t hash); >+// template <class U> >+// iterator find(const U& key, size_t hash); >+// >+// In addition the pointer to element and iterator stability guarantees are >+// weaker: all iterators and pointers are invalidated after a new element is >+// inserted. >+// >+// IMPLEMENTATION DETAILS >+// >+// The table stores elements inline in a slot array. In addition to the slot >+// array the table maintains some control state per slot. The extra state is one >+// byte per slot and stores empty or deleted marks, or alternatively 7 bits from >+// the hash of an occupied slot. 
The table is split into logical groups of >+// slots, like so: >+// >+// Group 1 Group 2 Group 3 >+// +---------------+---------------+---------------+ >+// | | | | | | | | | | | | | | | | | | | | | | | | | >+// +---------------+---------------+---------------+ >+// >+// On lookup the hash is split into two parts: >+// - H2: 7 bits (those stored in the control bytes) >+// - H1: the rest of the bits >+// The groups are probed using H1. For each group the slots are matched to H2 in >+// parallel. Because H2 is 7 bits (128 states) and the number of slots per group >+// is low (8 or 16) in almost all cases a match in H2 is also a lookup hit. >+// >+// On insert, once the right group is found (as in lookup), its slots are >+// filled in order. >+// >+// On erase a slot is cleared. In case the group did not have any empty slots >+// before the erase, the erased slot is marked as deleted. >+// >+// Groups without empty slots (but maybe with deleted slots) extend the probe >+// sequence. The probing algorithm is quadratic. Given N the number of groups, >+// the probing function for the i'th probe is: >+// >+// P(0) = H1 % N >+// >+// P(i) = (P(i - 1) + i) % N >+// >+// This probing function guarantees that after N probes, all the groups of the >+// table will be probed exactly once. >+ >+#ifndef ABSL_CONTAINER_INTERNAL_RAW_HASH_SET_H_ >+#define ABSL_CONTAINER_INTERNAL_RAW_HASH_SET_H_ >+ >+#ifndef SWISSTABLE_HAVE_SSE2 >+#ifdef __SSE2__ >+#define SWISSTABLE_HAVE_SSE2 1 >+#else >+#define SWISSTABLE_HAVE_SSE2 0 >+#endif >+#endif >+ >+#ifndef SWISSTABLE_HAVE_SSSE3 >+#ifdef __SSSE3__ >+#define SWISSTABLE_HAVE_SSSE3 1 >+#else >+#define SWISSTABLE_HAVE_SSSE3 0 >+#endif >+#endif >+ >+#if SWISSTABLE_HAVE_SSSE3 && !SWISSTABLE_HAVE_SSE2 >+#error "Bad configuration!" 
>+#endif >+ >+#if SWISSTABLE_HAVE_SSE2 >+#include <x86intrin.h> >+#endif >+ >+#include <algorithm> >+#include <cmath> >+#include <cstdint> >+#include <cstring> >+#include <iterator> >+#include <limits> >+#include <memory> >+#include <tuple> >+#include <type_traits> >+#include <utility> >+ >+#include "absl/base/internal/bits.h" >+#include "absl/base/internal/endian.h" >+#include "absl/base/port.h" >+#include "absl/container/internal/compressed_tuple.h" >+#include "absl/container/internal/container_memory.h" >+#include "absl/container/internal/hash_policy_traits.h" >+#include "absl/container/internal/hashtable_debug_hooks.h" >+#include "absl/container/internal/layout.h" >+#include "absl/memory/memory.h" >+#include "absl/meta/type_traits.h" >+#include "absl/types/optional.h" >+#include "absl/utility/utility.h" >+ >+namespace absl { >+namespace container_internal { >+ >+template <size_t Width> >+class probe_seq { >+ public: >+ probe_seq(size_t hash, size_t mask) { >+ assert(((mask + 1) & mask) == 0 && "not a mask"); >+ mask_ = mask; >+ offset_ = hash & mask_; >+ } >+ size_t offset() const { return offset_; } >+ size_t offset(size_t i) const { return (offset_ + i) & mask_; } >+ >+ void next() { >+ index_ += Width; >+ offset_ += index_; >+ offset_ &= mask_; >+ } >+ // 0-based probe index. The i-th probe in the probe sequence. >+ size_t index() const { return index_; } >+ >+ private: >+ size_t mask_; >+ size_t offset_; >+ size_t index_ = 0; >+}; >+ >+template <class ContainerKey, class Hash, class Eq> >+struct RequireUsableKey { >+ template <class PassedKey, class... Args> >+ std::pair< >+ decltype(std::declval<const Hash&>()(std::declval<const PassedKey&>())), >+ decltype(std::declval<const Eq&>()(std::declval<const ContainerKey&>(), >+ std::declval<const PassedKey&>()))>* >+ operator()(const PassedKey&, const Args&...) const; >+}; >+ >+template <class E, class Policy, class Hash, class Eq, class... 
Ts> >+struct IsDecomposable : std::false_type {}; >+ >+template <class Policy, class Hash, class Eq, class... Ts> >+struct IsDecomposable< >+ absl::void_t<decltype( >+ Policy::apply(RequireUsableKey<typename Policy::key_type, Hash, Eq>(), >+ std::declval<Ts>()...))>, >+ Policy, Hash, Eq, Ts...> : std::true_type {}; >+ >+template <class, class = void> >+struct IsTransparent : std::false_type {}; >+template <class T> >+struct IsTransparent<T, absl::void_t<typename T::is_transparent>> >+ : std::true_type {}; >+ >+// TODO(alkis): Switch to std::is_nothrow_swappable when gcc/clang supports it. >+template <class T> >+constexpr bool IsNoThrowSwappable() { >+ using std::swap; >+ return noexcept(swap(std::declval<T&>(), std::declval<T&>())); >+} >+ >+template <typename T> >+int TrailingZeros(T x) { >+ return sizeof(T) == 8 ? base_internal::CountTrailingZerosNonZero64(x) >+ : base_internal::CountTrailingZerosNonZero32(x); >+} >+ >+template <typename T> >+int LeadingZeros(T x) { >+ return sizeof(T) == 8 ? base_internal::CountLeadingZeros64(x) >+ : base_internal::CountLeadingZeros32(x); >+} >+ >+// An abstraction over a bitmask. It provides an easy way to iterate through the >+// indexes of the set bits of a bitmask. When Shift=0 (platforms with SSE), >+// this is a true bitmask. On non-SSE platforms, the arithmetic used to >+// emulate the SSE behavior works in bytes (Shift=3) and leaves each byte as >+// either 0x00 or 0x80. >+// >+// For example: >+// for (int i : BitMask<uint32_t, 16>(0x5)) -> yields 0, 2 >+// for (int i : BitMask<uint64_t, 8, 3>(0x0000000080800000)) -> yields 2, 3 >+template <class T, int SignificantBits, int Shift = 0> >+class BitMask { >+ static_assert(std::is_unsigned<T>::value, ""); >+ static_assert(Shift == 0 || Shift == 3, ""); >+ >+ public: >+ // These are useful for unit tests (gunit).
>+ using value_type = int; >+ using iterator = BitMask; >+ using const_iterator = BitMask; >+ >+ explicit BitMask(T mask) : mask_(mask) {} >+ BitMask& operator++() { >+ mask_ &= (mask_ - 1); >+ return *this; >+ } >+ explicit operator bool() const { return mask_ != 0; } >+ int operator*() const { return LowestBitSet(); } >+ int LowestBitSet() const { >+ return container_internal::TrailingZeros(mask_) >> Shift; >+ } >+ int HighestBitSet() const { >+ return (sizeof(T) * CHAR_BIT - container_internal::LeadingZeros(mask_) - >+ 1) >> >+ Shift; >+ } >+ >+ BitMask begin() const { return *this; } >+ BitMask end() const { return BitMask(0); } >+ >+ int TrailingZeros() const { >+ return container_internal::TrailingZeros(mask_) >> Shift; >+ } >+ >+ int LeadingZeros() const { >+ constexpr int total_significant_bits = SignificantBits << Shift; >+ constexpr int extra_bits = sizeof(T) * 8 - total_significant_bits; >+ return container_internal::LeadingZeros(mask_ << extra_bits) >> Shift; >+ } >+ >+ private: >+ friend bool operator==(const BitMask& a, const BitMask& b) { >+ return a.mask_ == b.mask_; >+ } >+ friend bool operator!=(const BitMask& a, const BitMask& b) { >+ return a.mask_ != b.mask_; >+ } >+ >+ T mask_; >+}; >+ >+using ctrl_t = signed char; >+using h2_t = uint8_t; >+ >+// The values here are selected for maximum performance. See the static asserts >+// below for details. 
>+enum Ctrl : ctrl_t { >+ kEmpty = -128, // 0b10000000 >+ kDeleted = -2, // 0b11111110 >+ kSentinel = -1, // 0b11111111 >+}; >+static_assert( >+ kEmpty & kDeleted & kSentinel & 0x80, >+ "Special markers need to have the MSB to make checking for them efficient"); >+static_assert(kEmpty < kSentinel && kDeleted < kSentinel, >+ "kEmpty and kDeleted must be smaller than kSentinel to make the " >+ "SIMD test of IsEmptyOrDeleted() efficient"); >+static_assert(kSentinel == -1, >+ "kSentinel must be -1 to elide loading it from memory into SIMD " >+ "registers (pcmpeqd xmm, xmm)"); >+static_assert(kEmpty == -128, >+ "kEmpty must be -128 to make the SIMD check for its " >+ "existence efficient (psignb xmm, xmm)"); >+static_assert(~kEmpty & ~kDeleted & kSentinel & 0x7F, >+ "kEmpty and kDeleted must share an unset bit that is not shared " >+ "by kSentinel to make the scalar test for MatchEmptyOrDeleted() " >+ "efficient"); >+static_assert(kDeleted == -2, >+ "kDeleted must be -2 to make the implementation of " >+ "ConvertSpecialToEmptyAndFullToDeleted efficient"); >+ >+// A single block of empty control bytes for tables without any slots allocated. >+// This enables removing a branch in the hot path of find(). >+inline ctrl_t* EmptyGroup() { >+ alignas(16) static constexpr ctrl_t empty_group[] = { >+ kSentinel, kEmpty, kEmpty, kEmpty, kEmpty, kEmpty, kEmpty, kEmpty, >+ kEmpty, kEmpty, kEmpty, kEmpty, kEmpty, kEmpty, kEmpty, kEmpty}; >+ return const_cast<ctrl_t*>(empty_group); >+} >+ >+// Mixes a randomly generated per-process seed with `hash` and `ctrl` to >+// randomize insertion order within groups. >+bool ShouldInsertBackwards(size_t hash, ctrl_t* ctrl); >+ >+// Returns a hash seed. >+// >+// The seed consists of the ctrl_ pointer, which adds enough entropy to ensure >+// non-determinism of iteration order in most cases. >+inline size_t HashSeed(const ctrl_t* ctrl) { >+ // The low bits of the pointer have little or no entropy because of >+ // alignment. 
We shift the pointer to try to use higher entropy bits. A >+ // good number seems to be 12 bits, because that aligns with page size. >+ return reinterpret_cast<uintptr_t>(ctrl) >> 12; >+} >+ >+inline size_t H1(size_t hash, const ctrl_t* ctrl) { >+ return (hash >> 7) ^ HashSeed(ctrl); >+} >+inline ctrl_t H2(size_t hash) { return hash & 0x7F; } >+ >+inline bool IsEmpty(ctrl_t c) { return c == kEmpty; } >+inline bool IsFull(ctrl_t c) { return c >= 0; } >+inline bool IsDeleted(ctrl_t c) { return c == kDeleted; } >+inline bool IsEmptyOrDeleted(ctrl_t c) { return c < kSentinel; } >+ >+#if SWISSTABLE_HAVE_SSE2 >+struct Group { >+ static constexpr size_t kWidth = 16; // the number of slots per group >+ >+ explicit Group(const ctrl_t* pos) { >+ ctrl = _mm_loadu_si128(reinterpret_cast<const __m128i*>(pos)); >+ } >+ >+ // Returns a bitmask representing the positions of slots that match hash. >+ BitMask<uint32_t, kWidth> Match(h2_t hash) const { >+ auto match = _mm_set1_epi8(hash); >+ return BitMask<uint32_t, kWidth>( >+ _mm_movemask_epi8(_mm_cmpeq_epi8(match, ctrl))); >+ } >+ >+ // Returns a bitmask representing the positions of empty slots. >+ BitMask<uint32_t, kWidth> MatchEmpty() const { >+#if SWISSTABLE_HAVE_SSSE3 >+ // This only works because kEmpty is -128. >+ return BitMask<uint32_t, kWidth>( >+ _mm_movemask_epi8(_mm_sign_epi8(ctrl, ctrl))); >+#else >+ return Match(kEmpty); >+#endif >+ } >+ >+ // Returns a bitmask representing the positions of empty or deleted slots. >+ BitMask<uint32_t, kWidth> MatchEmptyOrDeleted() const { >+ auto special = _mm_set1_epi8(kSentinel); >+ return BitMask<uint32_t, kWidth>( >+ _mm_movemask_epi8(_mm_cmpgt_epi8(special, ctrl))); >+ } >+ >+ // Returns the number of trailing empty or deleted elements in the group. 
>+ uint32_t CountLeadingEmptyOrDeleted() const { >+ auto special = _mm_set1_epi8(kSentinel); >+ return TrailingZeros(_mm_movemask_epi8(_mm_cmpgt_epi8(special, ctrl)) + 1); >+ } >+ >+ void ConvertSpecialToEmptyAndFullToDeleted(ctrl_t* dst) const { >+ auto msbs = _mm_set1_epi8(0x80); >+ auto x126 = _mm_set1_epi8(126); >+#if SWISSTABLE_HAVE_SSSE3 >+ auto res = _mm_or_si128(_mm_shuffle_epi8(x126, ctrl), msbs); >+#else >+ auto zero = _mm_setzero_si128(); >+ auto special_mask = _mm_cmpgt_epi8(zero, ctrl); >+ auto res = _mm_or_si128(msbs, _mm_andnot_si128(special_mask, x126)); >+#endif >+ _mm_storeu_si128(reinterpret_cast<__m128i*>(dst), res); >+ } >+ >+ __m128i ctrl; >+}; >+#else >+struct Group { >+ static constexpr size_t kWidth = 8; >+ >+ explicit Group(const ctrl_t* pos) : ctrl(little_endian::Load64(pos)) {} >+ >+ BitMask<uint64_t, kWidth, 3> Match(h2_t hash) const { >+ // For the technique, see: >+ // http://graphics.stanford.edu/~seander/bithacks.html##ValueInWord >+ // (Determine if a word has a byte equal to n). 
>+ // >+ // Caveat: there are false positives but: >+ // - they only occur if there is a real match >+ // - they never occur on kEmpty, kDeleted, kSentinel >+ // - they will be handled gracefully by subsequent checks in code >+ // >+ // Example: >+ // v = 0x1716151413121110 >+ // hash = 0x12 >+ // retval = (v - lsbs) & ~v & msbs = 0x0000000080800000 >+ constexpr uint64_t msbs = 0x8080808080808080ULL; >+ constexpr uint64_t lsbs = 0x0101010101010101ULL; >+ auto x = ctrl ^ (lsbs * hash); >+ return BitMask<uint64_t, kWidth, 3>((x - lsbs) & ~x & msbs); >+ } >+ >+ BitMask<uint64_t, kWidth, 3> MatchEmpty() const { >+ constexpr uint64_t msbs = 0x8080808080808080ULL; >+ return BitMask<uint64_t, kWidth, 3>((ctrl & (~ctrl << 6)) & msbs); >+ } >+ >+ BitMask<uint64_t, kWidth, 3> MatchEmptyOrDeleted() const { >+ constexpr uint64_t msbs = 0x8080808080808080ULL; >+ return BitMask<uint64_t, kWidth, 3>((ctrl & (~ctrl << 7)) & msbs); >+ } >+ >+ uint32_t CountLeadingEmptyOrDeleted() const { >+ constexpr uint64_t gaps = 0x00FEFEFEFEFEFEFEULL; >+ return (TrailingZeros(((~ctrl & (ctrl >> 7)) | gaps) + 1) + 7) >> 3; >+ } >+ >+ void ConvertSpecialToEmptyAndFullToDeleted(ctrl_t* dst) const { >+ constexpr uint64_t msbs = 0x8080808080808080ULL; >+ constexpr uint64_t lsbs = 0x0101010101010101ULL; >+ auto x = ctrl & msbs; >+ auto res = (~x + (x >> 7)) & ~lsbs; >+ little_endian::Store64(dst, res); >+ } >+ >+ uint64_t ctrl; >+}; >+#endif // SWISSTABLE_HAVE_SSE2 >+ >+template <class Policy, class Hash, class Eq, class Alloc> >+class raw_hash_set; >+ >+ >+inline bool IsValidCapacity(size_t n) { >+ return ((n + 1) & n) == 0 && n >= Group::kWidth - 1; >+} >+ >+// PRECONDITION: >+// IsValidCapacity(capacity) >+// ctrl[capacity] == kSentinel >+// ctrl[i] != kSentinel for all i < capacity >+// Applies mapping for every byte in ctrl: >+// DELETED -> EMPTY >+// EMPTY -> EMPTY >+// FULL -> DELETED >+inline void ConvertDeletedToEmptyAndFullToDeleted( >+ ctrl_t* ctrl, size_t capacity) { >+ 
assert(ctrl[capacity] == kSentinel); >+ assert(IsValidCapacity(capacity)); >+ for (ctrl_t* pos = ctrl; pos != ctrl + capacity + 1; pos += Group::kWidth) { >+ Group{pos}.ConvertSpecialToEmptyAndFullToDeleted(pos); >+ } >+ // Copy the cloned ctrl bytes. >+ std::memcpy(ctrl + capacity + 1, ctrl, Group::kWidth); >+ ctrl[capacity] = kSentinel; >+} >+ >+// Rounds up the capacity to the next power of 2 minus 1 and ensures it is >+// greater or equal to Group::kWidth - 1. >+inline size_t NormalizeCapacity(size_t n) { >+ constexpr size_t kMinCapacity = Group::kWidth - 1; >+ return n <= kMinCapacity >+ ? kMinCapacity >+ : std::numeric_limits<size_t>::max() >> LeadingZeros(n); >+} >+ >+// The node_handle concept from C++17. >+// We specialize node_handle for sets and maps. node_handle_base holds the >+// common API of both. >+template <typename Policy, typename Alloc> >+class node_handle_base { >+ protected: >+ using PolicyTraits = hash_policy_traits<Policy>; >+ using slot_type = typename PolicyTraits::slot_type; >+ >+ public: >+ using allocator_type = Alloc; >+ >+ constexpr node_handle_base() {} >+ node_handle_base(node_handle_base&& other) noexcept { >+ *this = std::move(other); >+ } >+ ~node_handle_base() { destroy(); } >+ node_handle_base& operator=(node_handle_base&& other) { >+ destroy(); >+ if (!other.empty()) { >+ alloc_ = other.alloc_; >+ PolicyTraits::transfer(alloc(), slot(), other.slot()); >+ other.reset(); >+ } >+ return *this; >+ } >+ >+ bool empty() const noexcept { return !alloc_; } >+ explicit operator bool() const noexcept { return !empty(); } >+ allocator_type get_allocator() const { return *alloc_; } >+ >+ protected: >+ template <typename, typename, typename, typename> >+ friend class raw_hash_set; >+ >+ node_handle_base(const allocator_type& a, slot_type* s) : alloc_(a) { >+ PolicyTraits::transfer(alloc(), slot(), s); >+ } >+ >+ void destroy() { >+ if (!empty()) { >+ PolicyTraits::destroy(alloc(), slot()); >+ reset(); >+ } >+ } >+ >+ void reset() { >+ 
assert(alloc_.has_value()); >+ alloc_ = absl::nullopt; >+ } >+ >+ slot_type* slot() const { >+ assert(!empty()); >+ return reinterpret_cast<slot_type*>(std::addressof(slot_space_)); >+ } >+ allocator_type* alloc() { return std::addressof(*alloc_); } >+ >+ private: >+ absl::optional<allocator_type> alloc_; >+ mutable absl::aligned_storage_t<sizeof(slot_type), alignof(slot_type)> >+ slot_space_; >+}; >+ >+// For sets. >+template <typename Policy, typename Alloc, typename = void> >+class node_handle : public node_handle_base<Policy, Alloc> { >+ using Base = typename node_handle::node_handle_base; >+ >+ public: >+ using value_type = typename Base::PolicyTraits::value_type; >+ >+ constexpr node_handle() {} >+ >+ value_type& value() const { >+ return Base::PolicyTraits::element(this->slot()); >+ } >+ >+ private: >+ template <typename, typename, typename, typename> >+ friend class raw_hash_set; >+ >+ node_handle(const Alloc& a, typename Base::slot_type* s) : Base(a, s) {} >+}; >+ >+// For maps. >+template <typename Policy, typename Alloc> >+class node_handle<Policy, Alloc, absl::void_t<typename Policy::mapped_type>> >+ : public node_handle_base<Policy, Alloc> { >+ using Base = typename node_handle::node_handle_base; >+ >+ public: >+ using key_type = typename Policy::key_type; >+ using mapped_type = typename Policy::mapped_type; >+ >+ constexpr node_handle() {} >+ >+ auto key() const -> decltype(Base::PolicyTraits::key(this->slot())) { >+ return Base::PolicyTraits::key(this->slot()); >+ } >+ >+ mapped_type& mapped() const { >+ return Base::PolicyTraits::value( >+ &Base::PolicyTraits::element(this->slot())); >+ } >+ >+ private: >+ template <typename, typename, typename, typename> >+ friend class raw_hash_set; >+ >+ node_handle(const Alloc& a, typename Base::slot_type* s) : Base(a, s) {} >+}; >+ >+// Implement the insert_return_type<> concept of C++17. 
>+template <class Iterator, class NodeType> >+struct insert_return_type { >+ Iterator position; >+ bool inserted; >+ NodeType node; >+}; >+ >+// Helper trait to allow or disallow arbitrary keys when the hash and >+// eq functions are transparent. >+// It is very important that the inner template is an alias and that the type it >+// produces is not a dependent type. Otherwise, type deduction would fail. >+template <bool is_transparent> >+struct KeyArg { >+ // Transparent. Forward `K`. >+ template <typename K, typename key_type> >+ using type = K; >+}; >+ >+template <> >+struct KeyArg<false> { >+ // Not transparent. Always use `key_type`. >+ template <typename K, typename key_type> >+ using type = key_type; >+}; >+ >+// Policy: a policy defines how to perform different operations on >+// the slots of the hashtable (see hash_policy_traits.h for the full interface >+// of policy). >+// >+// Hash: a (possibly polymorphic) functor that hashes keys of the hashtable. The >+// functor should accept a key and return size_t as hash. For best performance >+// it is important that the hash function provides high entropy across all bits >+// of the hash. >+// >+// Eq: a (possibly polymorphic) functor that compares two keys for equality. It >+// should accept two (of possibly different type) keys and return a bool: true >+// if they are equal, false if they are not. If two keys compare equal, then >+// their hash values as defined by Hash MUST be equal. >+// >+// Allocator: an Allocator [http://devdocs.io/cpp/concept/allocator] with which >+// the storage of the hashtable will be allocated and the elements will be >+// constructed and destroyed. 
>+template <class Policy, class Hash, class Eq, class Alloc> >+class raw_hash_set { >+ using PolicyTraits = hash_policy_traits<Policy>; >+ using KeyArgImpl = container_internal::KeyArg<IsTransparent<Eq>::value && >+ IsTransparent<Hash>::value>; >+ >+ public: >+ using init_type = typename PolicyTraits::init_type; >+ using key_type = typename PolicyTraits::key_type; >+ // TODO(sbenza): Hide slot_type as it is an implementation detail. Needs user >+ // code fixes! >+ using slot_type = typename PolicyTraits::slot_type; >+ using allocator_type = Alloc; >+ using size_type = size_t; >+ using difference_type = ptrdiff_t; >+ using hasher = Hash; >+ using key_equal = Eq; >+ using policy_type = Policy; >+ using value_type = typename PolicyTraits::value_type; >+ using reference = value_type&; >+ using const_reference = const value_type&; >+ using pointer = typename absl::allocator_traits< >+ allocator_type>::template rebind_traits<value_type>::pointer; >+ using const_pointer = typename absl::allocator_traits< >+ allocator_type>::template rebind_traits<value_type>::const_pointer; >+ >+ // Alias used for heterogeneous lookup functions. >+ // `key_arg<K>` evaluates to `K` when the functors are tranparent and to >+ // `key_type` otherwise. It permits template argument deduction on `K` for the >+ // transparent case. >+ template <class K> >+ using key_arg = typename KeyArgImpl::template type<K, key_type>; >+ >+ private: >+ // Give an early error when key_type is not hashable/eq. 
>+ auto KeyTypeCanBeHashed(const Hash& h, const key_type& k) -> decltype(h(k)); >+ auto KeyTypeCanBeEq(const Eq& eq, const key_type& k) -> decltype(eq(k, k)); >+ >+ using Layout = absl::container_internal::Layout<ctrl_t, slot_type>; >+ >+ static Layout MakeLayout(size_t capacity) { >+ assert(IsValidCapacity(capacity)); >+ return Layout(capacity + Group::kWidth + 1, capacity); >+ } >+ >+ using AllocTraits = absl::allocator_traits<allocator_type>; >+ using SlotAlloc = typename absl::allocator_traits< >+ allocator_type>::template rebind_alloc<slot_type>; >+ using SlotAllocTraits = typename absl::allocator_traits< >+ allocator_type>::template rebind_traits<slot_type>; >+ >+ static_assert(std::is_lvalue_reference<reference>::value, >+ "Policy::element() must return a reference"); >+ >+ template <typename T> >+ struct SameAsElementReference >+ : std::is_same<typename std::remove_cv< >+ typename std::remove_reference<reference>::type>::type, >+ typename std::remove_cv< >+ typename std::remove_reference<T>::type>::type> {}; >+ >+ // An enabler for insert(T&&): T must be convertible to init_type or be the >+ // same as [cv] value_type [ref]. >+ // Note: we separate SameAsElementReference into its own type to avoid using >+ // reference unless we need to. MSVC doesn't seem to like it in some >+ // cases. >+ template <class T> >+ using RequiresInsertable = typename std::enable_if< >+ absl::disjunction<std::is_convertible<T, init_type>, >+ SameAsElementReference<T>>::value, >+ int>::type; >+ >+ // RequiresNotInit is a workaround for gcc prior to 7.1. >+ // See https://godbolt.org/g/Y4xsUh. >+ template <class T> >+ using RequiresNotInit = >+ typename std::enable_if<!std::is_same<T, init_type>::value, int>::type; >+ >+ template <class... 
Ts> >+ using IsDecomposable = IsDecomposable<void, PolicyTraits, Hash, Eq, Ts...>; >+ >+ public: >+ static_assert(std::is_same<pointer, value_type*>::value, >+ "Allocators with custom pointer types are not supported"); >+ static_assert(std::is_same<const_pointer, const value_type*>::value, >+ "Allocators with custom pointer types are not supported"); >+ >+ class iterator { >+ friend class raw_hash_set; >+ >+ public: >+ using iterator_category = std::forward_iterator_tag; >+ using value_type = typename raw_hash_set::value_type; >+ using reference = >+ absl::conditional_t<PolicyTraits::constant_iterators::value, >+ const value_type&, value_type&>; >+ using pointer = absl::remove_reference_t<reference>*; >+ using difference_type = typename raw_hash_set::difference_type; >+ >+ iterator() {} >+ >+ // PRECONDITION: not an end() iterator. >+ reference operator*() const { return PolicyTraits::element(slot_); } >+ >+ // PRECONDITION: not an end() iterator. >+ pointer operator->() const { return &operator*(); } >+ >+ // PRECONDITION: not an end() iterator. >+ iterator& operator++() { >+ ++ctrl_; >+ ++slot_; >+ skip_empty_or_deleted(); >+ return *this; >+ } >+ // PRECONDITION: not an end() iterator. >+ iterator operator++(int) { >+ auto tmp = *this; >+ ++*this; >+ return tmp; >+ } >+ >+ friend bool operator==(const iterator& a, const iterator& b) { >+ return a.ctrl_ == b.ctrl_; >+ } >+ friend bool operator!=(const iterator& a, const iterator& b) { >+ return !(a == b); >+ } >+ >+ private: >+ iterator(ctrl_t* ctrl) : ctrl_(ctrl) {} // for end() >+ iterator(ctrl_t* ctrl, slot_type* slot) : ctrl_(ctrl), slot_(slot) {} >+ >+ void skip_empty_or_deleted() { >+ while (IsEmptyOrDeleted(*ctrl_)) { >+ // ctrl is not necessarily aligned to Group::kWidth. It is also likely >+ // to read past the space for ctrl bytes and into slots. This is ok >+ // because ctrl has sizeof() == 1 and slot has sizeof() >= 1 so there >+ // is no way to read outside the combined slot array. 
>+ uint32_t shift = Group{ctrl_}.CountLeadingEmptyOrDeleted(); >+ ctrl_ += shift; >+ slot_ += shift; >+ } >+ } >+ >+ ctrl_t* ctrl_ = nullptr; >+ slot_type* slot_; >+ }; >+ >+ class const_iterator { >+ friend class raw_hash_set; >+ >+ public: >+ using iterator_category = typename iterator::iterator_category; >+ using value_type = typename raw_hash_set::value_type; >+ using reference = typename raw_hash_set::const_reference; >+ using pointer = typename raw_hash_set::const_pointer; >+ using difference_type = typename raw_hash_set::difference_type; >+ >+ const_iterator() {} >+ // Implicit construction from iterator. >+ const_iterator(iterator i) : inner_(std::move(i)) {} >+ >+ reference operator*() const { return *inner_; } >+ pointer operator->() const { return inner_.operator->(); } >+ >+ const_iterator& operator++() { >+ ++inner_; >+ return *this; >+ } >+ const_iterator operator++(int) { return inner_++; } >+ >+ friend bool operator==(const const_iterator& a, const const_iterator& b) { >+ return a.inner_ == b.inner_; >+ } >+ friend bool operator!=(const const_iterator& a, const const_iterator& b) { >+ return !(a == b); >+ } >+ >+ private: >+ const_iterator(const ctrl_t* ctrl, const slot_type* slot) >+ : inner_(const_cast<ctrl_t*>(ctrl), const_cast<slot_type*>(slot)) {} >+ >+ iterator inner_; >+ }; >+ >+ using node_type = container_internal::node_handle<Policy, Alloc>; >+ >+ raw_hash_set() noexcept( >+ std::is_nothrow_default_constructible<hasher>::value&& >+ std::is_nothrow_default_constructible<key_equal>::value&& >+ std::is_nothrow_default_constructible<allocator_type>::value) {} >+ >+ explicit raw_hash_set(size_t bucket_count, const hasher& hash = hasher(), >+ const key_equal& eq = key_equal(), >+ const allocator_type& alloc = allocator_type()) >+ : ctrl_(EmptyGroup()), settings_(0, hash, eq, alloc) { >+ if (bucket_count) { >+ capacity_ = NormalizeCapacity(bucket_count); >+ growth_left() = static_cast<size_t>(capacity_ * kMaxLoadFactor); >+ initialize_slots(); >+ 
} >+ } >+ >+ raw_hash_set(size_t bucket_count, const hasher& hash, >+ const allocator_type& alloc) >+ : raw_hash_set(bucket_count, hash, key_equal(), alloc) {} >+ >+ raw_hash_set(size_t bucket_count, const allocator_type& alloc) >+ : raw_hash_set(bucket_count, hasher(), key_equal(), alloc) {} >+ >+ explicit raw_hash_set(const allocator_type& alloc) >+ : raw_hash_set(0, hasher(), key_equal(), alloc) {} >+ >+ template <class InputIter> >+ raw_hash_set(InputIter first, InputIter last, size_t bucket_count = 0, >+ const hasher& hash = hasher(), const key_equal& eq = key_equal(), >+ const allocator_type& alloc = allocator_type()) >+ : raw_hash_set(bucket_count, hash, eq, alloc) { >+ insert(first, last); >+ } >+ >+ template <class InputIter> >+ raw_hash_set(InputIter first, InputIter last, size_t bucket_count, >+ const hasher& hash, const allocator_type& alloc) >+ : raw_hash_set(first, last, bucket_count, hash, key_equal(), alloc) {} >+ >+ template <class InputIter> >+ raw_hash_set(InputIter first, InputIter last, size_t bucket_count, >+ const allocator_type& alloc) >+ : raw_hash_set(first, last, bucket_count, hasher(), key_equal(), alloc) {} >+ >+ template <class InputIter> >+ raw_hash_set(InputIter first, InputIter last, const allocator_type& alloc) >+ : raw_hash_set(first, last, 0, hasher(), key_equal(), alloc) {} >+ >+ // Instead of accepting std::initializer_list<value_type> as the first >+ // argument like std::unordered_set<value_type> does, we have two overloads >+ // that accept std::initializer_list<T> and std::initializer_list<init_type>. >+ // This is advantageous for performance. >+ // >+ // // Turns {"abc", "def"} into std::initializer_list<std::string>, then copies >+ // // the strings into the set. >+ // std::unordered_set<std::string> s = {"abc", "def"}; >+ // >+ // // Turns {"abc", "def"} into std::initializer_list<const char*>, then >+ // // copies the strings into the set. 
>+ // absl::flat_hash_set<std::string> s = {"abc", "def"}; >+ // >+ // The same trick is used in insert(). >+ // >+ // The enabler is necessary to prevent this constructor from triggering where >+ // the copy constructor is meant to be called. >+ // >+ // absl::flat_hash_set<int> a, b{a}; >+ // >+ // RequiresNotInit<T> is a workaround for gcc prior to 7.1. >+ template <class T, RequiresNotInit<T> = 0, RequiresInsertable<T> = 0> >+ raw_hash_set(std::initializer_list<T> init, size_t bucket_count = 0, >+ const hasher& hash = hasher(), const key_equal& eq = key_equal(), >+ const allocator_type& alloc = allocator_type()) >+ : raw_hash_set(init.begin(), init.end(), bucket_count, hash, eq, alloc) {} >+ >+ raw_hash_set(std::initializer_list<init_type> init, size_t bucket_count = 0, >+ const hasher& hash = hasher(), const key_equal& eq = key_equal(), >+ const allocator_type& alloc = allocator_type()) >+ : raw_hash_set(init.begin(), init.end(), bucket_count, hash, eq, alloc) {} >+ >+ template <class T, RequiresNotInit<T> = 0, RequiresInsertable<T> = 0> >+ raw_hash_set(std::initializer_list<T> init, size_t bucket_count, >+ const hasher& hash, const allocator_type& alloc) >+ : raw_hash_set(init, bucket_count, hash, key_equal(), alloc) {} >+ >+ raw_hash_set(std::initializer_list<init_type> init, size_t bucket_count, >+ const hasher& hash, const allocator_type& alloc) >+ : raw_hash_set(init, bucket_count, hash, key_equal(), alloc) {} >+ >+ template <class T, RequiresNotInit<T> = 0, RequiresInsertable<T> = 0> >+ raw_hash_set(std::initializer_list<T> init, size_t bucket_count, >+ const allocator_type& alloc) >+ : raw_hash_set(init, bucket_count, hasher(), key_equal(), alloc) {} >+ >+ raw_hash_set(std::initializer_list<init_type> init, size_t bucket_count, >+ const allocator_type& alloc) >+ : raw_hash_set(init, bucket_count, hasher(), key_equal(), alloc) {} >+ >+ template <class T, RequiresNotInit<T> = 0, RequiresInsertable<T> = 0> >+ raw_hash_set(std::initializer_list<T> init, 
const allocator_type& alloc) >+ : raw_hash_set(init, 0, hasher(), key_equal(), alloc) {} >+ >+ raw_hash_set(std::initializer_list<init_type> init, >+ const allocator_type& alloc) >+ : raw_hash_set(init, 0, hasher(), key_equal(), alloc) {} >+ >+ raw_hash_set(const raw_hash_set& that) >+ : raw_hash_set(that, AllocTraits::select_on_container_copy_construction( >+ that.alloc_ref())) {} >+ >+ raw_hash_set(const raw_hash_set& that, const allocator_type& a) >+ : raw_hash_set(0, that.hash_ref(), that.eq_ref(), a) { >+ reserve(that.size()); >+ // Because the table is guaranteed to be empty, we can do something faster >+ // than a full `insert`. >+ for (const auto& v : that) { >+ const size_t hash = PolicyTraits::apply(HashElement{hash_ref()}, v); >+ const size_t i = find_first_non_full(hash); >+ set_ctrl(i, H2(hash)); >+ emplace_at(i, v); >+ } >+ size_ = that.size(); >+ growth_left() -= that.size(); >+ } >+ >+ raw_hash_set(raw_hash_set&& that) noexcept( >+ std::is_nothrow_copy_constructible<hasher>::value&& >+ std::is_nothrow_copy_constructible<key_equal>::value&& >+ std::is_nothrow_copy_constructible<allocator_type>::value) >+ : ctrl_(absl::exchange(that.ctrl_, EmptyGroup())), >+ slots_(absl::exchange(that.slots_, nullptr)), >+ size_(absl::exchange(that.size_, 0)), >+ capacity_(absl::exchange(that.capacity_, 0)), >+ // Hash, equality and allocator are copied instead of moved because >+ // `that` must be left valid. If Hash is std::function<Key>, moving it >+ // would create a nullptr functor that cannot be called. >+ settings_(that.settings_) { >+ // growth_left was copied above, reset the one from `that`. 
>+ that.growth_left() = 0; >+ } >+ >+ raw_hash_set(raw_hash_set&& that, const allocator_type& a) >+ : ctrl_(EmptyGroup()), >+ slots_(nullptr), >+ size_(0), >+ capacity_(0), >+ settings_(0, that.hash_ref(), that.eq_ref(), a) { >+ if (a == that.alloc_ref()) { >+ std::swap(ctrl_, that.ctrl_); >+ std::swap(slots_, that.slots_); >+ std::swap(size_, that.size_); >+ std::swap(capacity_, that.capacity_); >+ std::swap(growth_left(), that.growth_left()); >+ } else { >+ reserve(that.size()); >+ // Note: this will copy elements of dense_set and unordered_set instead of >+ // moving them. This can be fixed if it ever becomes an issue. >+ for (auto& elem : that) insert(std::move(elem)); >+ } >+ } >+ >+ raw_hash_set& operator=(const raw_hash_set& that) { >+ raw_hash_set tmp(that, >+ AllocTraits::propagate_on_container_copy_assignment::value >+ ? that.alloc_ref() >+ : alloc_ref()); >+ swap(tmp); >+ return *this; >+ } >+ >+ raw_hash_set& operator=(raw_hash_set&& that) noexcept( >+ absl::allocator_traits<allocator_type>::is_always_equal::value&& >+ std::is_nothrow_move_assignable<hasher>::value&& >+ std::is_nothrow_move_assignable<key_equal>::value) { >+ // TODO(sbenza): We should only use the operations from the noexcept clause >+ // to make sure we actually adhere to that contract. 
>+ return move_assign( >+ std::move(that), >+ typename AllocTraits::propagate_on_container_move_assignment()); >+ } >+ >+ ~raw_hash_set() { destroy_slots(); } >+ >+ iterator begin() { >+ auto it = iterator_at(0); >+ it.skip_empty_or_deleted(); >+ return it; >+ } >+ iterator end() { return {ctrl_ + capacity_}; } >+ >+ const_iterator begin() const { >+ return const_cast<raw_hash_set*>(this)->begin(); >+ } >+ const_iterator end() const { return const_cast<raw_hash_set*>(this)->end(); } >+ const_iterator cbegin() const { return begin(); } >+ const_iterator cend() const { return end(); } >+ >+ bool empty() const { return !size(); } >+ size_t size() const { return size_; } >+ size_t capacity() const { return capacity_; } >+ size_t max_size() const { return std::numeric_limits<size_t>::max(); } >+ >+ void clear() { >+ // Iterating over this container is O(bucket_count()). When bucket_count() >+ // is much greater than size(), iteration becomes prohibitively expensive. >+ // For clear() it is more important to reuse the allocated array when the >+ // container is small because allocation takes comparatively long time >+ // compared to destruction of the elements of the container. So we pick the >+ // largest bucket_count() threshold for which iteration is still fast and >+ // past that we simply deallocate the array. >+ if (capacity_ > 127) { >+ destroy_slots(); >+ } else if (capacity_) { >+ for (size_t i = 0; i != capacity_; ++i) { >+ if (IsFull(ctrl_[i])) { >+ PolicyTraits::destroy(&alloc_ref(), slots_ + i); >+ } >+ } >+ size_ = 0; >+ reset_ctrl(); >+ growth_left() = static_cast<size_t>(capacity_ * kMaxLoadFactor); >+ } >+ assert(empty()); >+ } >+ >+ // This overload kicks in when the argument is an rvalue of insertable and >+ // decomposable type other than init_type. 
>+ // >+ // flat_hash_map<std::string, int> m; >+ // m.insert(std::make_pair("abc", 42)); >+ template <class T, RequiresInsertable<T> = 0, >+ typename std::enable_if<IsDecomposable<T>::value, int>::type = 0, >+ T* = nullptr> >+ std::pair<iterator, bool> insert(T&& value) { >+ return emplace(std::forward<T>(value)); >+ } >+ >+ // This overload kicks in when the argument is a bitfield or an lvalue of >+ // insertable and decomposable type. >+ // >+ // union { int n : 1; }; >+ // flat_hash_set<int> s; >+ // s.insert(n); >+ // >+ // flat_hash_set<std::string> s; >+ // const char* p = "hello"; >+ // s.insert(p); >+ // >+ // TODO(romanp): Once we stop supporting gcc 5.1 and below, replace >+ // RequiresInsertable<T> with RequiresInsertable<const T&>. >+ // We are hitting this bug: https://godbolt.org/g/1Vht4f. >+ template < >+ class T, RequiresInsertable<T> = 0, >+ typename std::enable_if<IsDecomposable<const T&>::value, int>::type = 0> >+ std::pair<iterator, bool> insert(const T& value) { >+ return emplace(value); >+ } >+ >+ // This overload kicks in when the argument is an rvalue of init_type. Its >+ // purpose is to handle brace-init-list arguments. >+ // >+ // flat_hash_set<std::string, int> s; >+ // s.insert({"abc", 42}); >+ std::pair<iterator, bool> insert(init_type&& value) { >+ return emplace(std::move(value)); >+ } >+ >+ template <class T, RequiresInsertable<T> = 0, >+ typename std::enable_if<IsDecomposable<T>::value, int>::type = 0, >+ T* = nullptr> >+ iterator insert(const_iterator, T&& value) { >+ return insert(std::forward<T>(value)).first; >+ } >+ >+ // TODO(romanp): Once we stop supporting gcc 5.1 and below, replace >+ // RequiresInsertable<T> with RequiresInsertable<const T&>. >+ // We are hitting this bug: https://godbolt.org/g/1Vht4f. 
>+ template < >+ class T, RequiresInsertable<T> = 0, >+ typename std::enable_if<IsDecomposable<const T&>::value, int>::type = 0> >+ iterator insert(const_iterator, const T& value) { >+ return insert(value).first; >+ } >+ >+ iterator insert(const_iterator, init_type&& value) { >+ return insert(std::move(value)).first; >+ } >+ >+ template <class InputIt> >+ void insert(InputIt first, InputIt last) { >+ for (; first != last; ++first) insert(*first); >+ } >+ >+ template <class T, RequiresNotInit<T> = 0, RequiresInsertable<const T&> = 0> >+ void insert(std::initializer_list<T> ilist) { >+ insert(ilist.begin(), ilist.end()); >+ } >+ >+ void insert(std::initializer_list<init_type> ilist) { >+ insert(ilist.begin(), ilist.end()); >+ } >+ >+ insert_return_type<iterator, node_type> insert(node_type&& node) { >+ if (!node) return {end(), false, node_type()}; >+ const auto& elem = PolicyTraits::element(node.slot()); >+ auto res = PolicyTraits::apply( >+ InsertSlot<false>{*this, std::move(*node.slot())}, elem); >+ if (res.second) { >+ node.reset(); >+ return {res.first, true, node_type()}; >+ } else { >+ return {res.first, false, std::move(node)}; >+ } >+ } >+ >+ iterator insert(const_iterator, node_type&& node) { >+ return insert(std::move(node)).first; >+ } >+ >+ // This overload kicks in if we can deduce the key from args. This enables us >+ // to avoid constructing value_type if an entry with the same key already >+ // exists. >+ // >+ // For example: >+ // >+ // flat_hash_map<std::string, std::string> m = {{"abc", "def"}}; >+ // // Creates no std::string copies and makes no heap allocations. >+ // m.emplace("abc", "xyz"); >+ template <class... Args, typename std::enable_if< >+ IsDecomposable<Args...>::value, int>::type = 0> >+ std::pair<iterator, bool> emplace(Args&&... args) { >+ return PolicyTraits::apply(EmplaceDecomposable{*this}, >+ std::forward<Args>(args)...); >+ } >+ >+ // This overload kicks in if we cannot deduce the key from args. 
It constructs >+ // value_type unconditionally and then either moves it into the table or >+ // destroys. >+ template <class... Args, typename std::enable_if< >+ !IsDecomposable<Args...>::value, int>::type = 0> >+ std::pair<iterator, bool> emplace(Args&&... args) { >+ typename std::aligned_storage<sizeof(slot_type), alignof(slot_type)>::type >+ raw; >+ slot_type* slot = reinterpret_cast<slot_type*>(&raw); >+ >+ PolicyTraits::construct(&alloc_ref(), slot, std::forward<Args>(args)...); >+ const auto& elem = PolicyTraits::element(slot); >+ return PolicyTraits::apply(InsertSlot<true>{*this, std::move(*slot)}, elem); >+ } >+ >+ template <class... Args> >+ iterator emplace_hint(const_iterator, Args&&... args) { >+ return emplace(std::forward<Args>(args)...).first; >+ } >+ >+ // Extension API: support for lazy emplace. >+ // >+ // Looks up key in the table. If found, returns the iterator to the element. >+ // Otherwise calls f with one argument of type raw_hash_set::constructor. f >+ // MUST call raw_hash_set::constructor with arguments as if a >+ // raw_hash_set::value_type is constructed, otherwise the behavior is >+ // undefined. >+ // >+ // For example: >+ // >+ // std::unordered_set<ArenaString> s; >+ // // Makes ArenaStr even if "abc" is in the map. >+ // s.insert(ArenaString(&arena, "abc")); >+ // >+ // flat_hash_set<ArenaStr> s; >+ // // Makes ArenaStr only if "abc" is not in the map. >+ // s.lazy_emplace("abc", [&](const constructor& ctor) { >+ // ctor(&arena, "abc"); >+ // }); >+ // >+ // WARNING: This API is currently experimental. If there is a way to implement >+ // the same thing with the rest of the API, prefer that. >+ class constructor { >+ friend class raw_hash_set; >+ >+ public: >+ template <class... Args> >+ void operator()(Args&&... 
args) const { >+ assert(*slot_); >+ PolicyTraits::construct(alloc_, *slot_, std::forward<Args>(args)...); >+ *slot_ = nullptr; >+ } >+ >+ private: >+ constructor(allocator_type* a, slot_type** slot) : alloc_(a), slot_(slot) {} >+ >+ allocator_type* alloc_; >+ slot_type** slot_; >+ }; >+ >+ template <class K = key_type, class F> >+ iterator lazy_emplace(const key_arg<K>& key, F&& f) { >+ auto res = find_or_prepare_insert(key); >+ if (res.second) { >+ slot_type* slot = slots_ + res.first; >+ std::forward<F>(f)(constructor(&alloc_ref(), &slot)); >+ assert(!slot); >+ } >+ return iterator_at(res.first); >+ } >+ >+ // Extension API: support for heterogeneous keys. >+ // >+ // std::unordered_set<std::string> s; >+ // // Turns "abc" into std::string. >+ // s.erase("abc"); >+ // >+ // flat_hash_set<std::string> s; >+ // // Uses "abc" directly without copying it into std::string. >+ // s.erase("abc"); >+ template <class K = key_type> >+ size_type erase(const key_arg<K>& key) { >+ auto it = find(key); >+ if (it == end()) return 0; >+ erase(it); >+ return 1; >+ } >+ >+ // Erases the element pointed to by `it`. Unlike `std::unordered_set::erase`, >+ // this method returns void to reduce algorithmic complexity to O(1). In >+ // order to erase while iterating across a map, use the following idiom (which >+ // also works for standard containers): >+ // >+ // for (auto it = m.begin(), end = m.end(); it != end;) { >+ // if (<pred>) { >+ // m.erase(it++); >+ // } else { >+ // ++it; >+ // } >+ // } >+ void erase(const_iterator cit) { erase(cit.inner_); } >+ >+ // This overload is necessary because otherwise erase<K>(const K&) would be >+ // a better match if non-const iterator is passed as an argument. 
>+ void erase(iterator it) { >+ assert(it != end()); >+ PolicyTraits::destroy(&alloc_ref(), it.slot_); >+ erase_meta_only(it); >+ } >+ >+ iterator erase(const_iterator first, const_iterator last) { >+ while (first != last) { >+ erase(first++); >+ } >+ return last.inner_; >+ } >+ >+ // Moves elements from `src` into `this`. >+ // If the element already exists in `this`, it is left unmodified in `src`. >+ template <typename H, typename E> >+ void merge(raw_hash_set<Policy, H, E, Alloc>& src) { // NOLINT >+ assert(this != &src); >+ for (auto it = src.begin(), e = src.end(); it != e; ++it) { >+ if (PolicyTraits::apply(InsertSlot<false>{*this, std::move(*it.slot_)}, >+ PolicyTraits::element(it.slot_)) >+ .second) { >+ src.erase_meta_only(it); >+ } >+ } >+ } >+ >+ template <typename H, typename E> >+ void merge(raw_hash_set<Policy, H, E, Alloc>&& src) { >+ merge(src); >+ } >+ >+ node_type extract(const_iterator position) { >+ node_type node(alloc_ref(), position.inner_.slot_); >+ erase_meta_only(position); >+ return node; >+ } >+ >+ template < >+ class K = key_type, >+ typename std::enable_if<!std::is_same<K, iterator>::value, int>::type = 0> >+ node_type extract(const key_arg<K>& key) { >+ auto it = find(key); >+ return it == end() ? node_type() : extract(const_iterator{it}); >+ } >+ >+ void swap(raw_hash_set& that) noexcept( >+ IsNoThrowSwappable<hasher>() && IsNoThrowSwappable<key_equal>() && >+ (!AllocTraits::propagate_on_container_swap::value || >+ IsNoThrowSwappable<allocator_type>())) { >+ using std::swap; >+ swap(ctrl_, that.ctrl_); >+ swap(slots_, that.slots_); >+ swap(size_, that.size_); >+ swap(capacity_, that.capacity_); >+ swap(growth_left(), that.growth_left()); >+ swap(hash_ref(), that.hash_ref()); >+ swap(eq_ref(), that.eq_ref()); >+ if (AllocTraits::propagate_on_container_swap::value) { >+ swap(alloc_ref(), that.alloc_ref()); >+ } else { >+ // If the allocators do not compare equal it is officially undefined >+ // behavior. We choose to do nothing. 
>+ } >+ } >+ >+ void rehash(size_t n) { >+ if (n == 0 && capacity_ == 0) return; >+ if (n == 0 && size_ == 0) return destroy_slots(); >+ auto m = NormalizeCapacity(std::max(n, NumSlotsFast(size()))); >+ // n == 0 unconditionally rehashes as per the standard. >+ if (n == 0 || m > capacity_) { >+ resize(m); >+ } >+ } >+ >+ void reserve(size_t n) { >+ rehash(NumSlotsFast(n)); >+ } >+ >+ // Extension API: support for heterogeneous keys. >+ // >+ // std::unordered_set<std::string> s; >+ // // Turns "abc" into std::string. >+ // s.count("abc"); >+ // >+ // ch_set<std::string> s; >+ // // Uses "abc" directly without copying it into std::string. >+ // s.count("abc"); >+ template <class K = key_type> >+ size_t count(const key_arg<K>& key) const { >+ return find(key) == end() ? 0 : 1; >+ } >+ >+ // Issues CPU prefetch instructions for the memory needed to find or insert >+ // a key. Like all lookup functions, this support heterogeneous keys. >+ // >+ // NOTE: This is a very low level operation and should not be used without >+ // specific benchmarks indicating its importance. >+ template <class K = key_type> >+ void prefetch(const key_arg<K>& key) const { >+ (void)key; >+#if defined(__GNUC__) >+ auto seq = probe(hash_ref()(key)); >+ __builtin_prefetch(static_cast<const void*>(ctrl_ + seq.offset())); >+ __builtin_prefetch(static_cast<const void*>(slots_ + seq.offset())); >+#endif // __GNUC__ >+ } >+ >+ // The API of find() has two extensions. >+ // >+ // 1. The hash can be passed by the user. It must be equal to the hash of the >+ // key. >+ // >+ // 2. The type of the key argument doesn't have to be key_type. This is so >+ // called heterogeneous key support. 
>+ template <class K = key_type> >+ iterator find(const key_arg<K>& key, size_t hash) { >+ auto seq = probe(hash); >+ while (true) { >+ Group g{ctrl_ + seq.offset()}; >+ for (int i : g.Match(H2(hash))) { >+ if (ABSL_PREDICT_TRUE(PolicyTraits::apply( >+ EqualElement<K>{key, eq_ref()}, >+ PolicyTraits::element(slots_ + seq.offset(i))))) >+ return iterator_at(seq.offset(i)); >+ } >+ if (ABSL_PREDICT_TRUE(g.MatchEmpty())) return end(); >+ seq.next(); >+ } >+ } >+ template <class K = key_type> >+ iterator find(const key_arg<K>& key) { >+ return find(key, hash_ref()(key)); >+ } >+ >+ template <class K = key_type> >+ const_iterator find(const key_arg<K>& key, size_t hash) const { >+ return const_cast<raw_hash_set*>(this)->find(key, hash); >+ } >+ template <class K = key_type> >+ const_iterator find(const key_arg<K>& key) const { >+ return find(key, hash_ref()(key)); >+ } >+ >+ template <class K = key_type> >+ bool contains(const key_arg<K>& key) const { >+ return find(key) != end(); >+ } >+ >+ template <class K = key_type> >+ std::pair<iterator, iterator> equal_range(const key_arg<K>& key) { >+ auto it = find(key); >+ if (it != end()) return {it, std::next(it)}; >+ return {it, it}; >+ } >+ template <class K = key_type> >+ std::pair<const_iterator, const_iterator> equal_range( >+ const key_arg<K>& key) const { >+ auto it = find(key); >+ if (it != end()) return {it, std::next(it)}; >+ return {it, it}; >+ } >+ >+ size_t bucket_count() const { return capacity_; } >+ float load_factor() const { >+ return capacity_ ? static_cast<double>(size()) / capacity_ : 0.0; >+ } >+ float max_load_factor() const { return 1.0f; } >+ void max_load_factor(float) { >+ // Does nothing. 
>+  }
>+
>+  hasher hash_function() const { return hash_ref(); }
>+  key_equal key_eq() const { return eq_ref(); }
>+  allocator_type get_allocator() const { return alloc_ref(); }
>+
>+  friend bool operator==(const raw_hash_set& a, const raw_hash_set& b) {
>+    if (a.size() != b.size()) return false;
>+    const raw_hash_set* outer = &a;
>+    const raw_hash_set* inner = &b;
>+    if (outer->capacity() > inner->capacity()) std::swap(outer, inner);
>+    for (const value_type& elem : *outer)
>+      if (!inner->has_element(elem)) return false;
>+    return true;
>+  }
>+
>+  friend bool operator!=(const raw_hash_set& a, const raw_hash_set& b) {
>+    return !(a == b);
>+  }
>+
>+  friend void swap(raw_hash_set& a,
>+                   raw_hash_set& b) noexcept(noexcept(a.swap(b))) {
>+    a.swap(b);
>+  }
>+
>+ private:
>+  template <class Container, typename Enabler>
>+  friend struct absl::container_internal::hashtable_debug_internal::
>+      HashtableDebugAccess;
>+
>+  struct FindElement {
>+    template <class K, class... Args>
>+    const_iterator operator()(const K& key, Args&&...) const {
>+      return s.find(key);
>+    }
>+    const raw_hash_set& s;
>+  };
>+
>+  struct HashElement {
>+    template <class K, class... Args>
>+    size_t operator()(const K& key, Args&&...) const {
>+      return h(key);
>+    }
>+    const hasher& h;
>+  };
>+
>+  template <class K1>
>+  struct EqualElement {
>+    template <class K2, class... Args>
>+    bool operator()(const K2& lhs, Args&&...) const {
>+      return eq(lhs, rhs);
>+    }
>+    const K1& rhs;
>+    const key_equal& eq;
>+  };
>+
>+  struct EmplaceDecomposable {
>+    template <class K, class... Args>
>+    std::pair<iterator, bool> operator()(const K& key, Args&&... args) const {
>+      auto res = s.find_or_prepare_insert(key);
>+      if (res.second) {
>+        s.emplace_at(res.first, std::forward<Args>(args)...);
>+      }
>+      return {s.iterator_at(res.first), res.second};
>+    }
>+    raw_hash_set& s;
>+  };
>+
>+  template <bool do_destroy>
>+  struct InsertSlot {
>+    template <class K, class... Args>
>+    std::pair<iterator, bool> operator()(const K& key, Args&&...) && {
>+      auto res = s.find_or_prepare_insert(key);
>+      if (res.second) {
>+        PolicyTraits::transfer(&s.alloc_ref(), s.slots_ + res.first, &slot);
>+      } else if (do_destroy) {
>+        PolicyTraits::destroy(&s.alloc_ref(), &slot);
>+      }
>+      return {s.iterator_at(res.first), res.second};
>+    }
>+    raw_hash_set& s;
>+    // Constructed slot. Either moved into place or destroyed.
>+    slot_type&& slot;
>+  };
>+
>+  // Computes std::ceil(n / kMaxLoadFactor). Faster than calling std::ceil.
>+  static inline size_t NumSlotsFast(size_t n) {
>+    return static_cast<size_t>(
>+        (n * kMaxLoadFactorDenominator + (kMaxLoadFactorNumerator - 1)) /
>+        kMaxLoadFactorNumerator);
>+  }
>+
>+  // "erases" the object from the container, except that it doesn't actually
>+  // destroy the object. It only updates all the metadata of the class.
>+  // This can be used in conjunction with Policy::transfer to move the object
>+  // to another place.
>+  void erase_meta_only(const_iterator it) {
>+    assert(IsFull(*it.inner_.ctrl_) && "erasing a dangling iterator");
>+    --size_;
>+    const size_t index = it.inner_.ctrl_ - ctrl_;
>+    const size_t index_before = (index - Group::kWidth) & capacity_;
>+    const auto empty_after = Group(it.inner_.ctrl_).MatchEmpty();
>+    const auto empty_before = Group(ctrl_ + index_before).MatchEmpty();
>+
>+    // We count how many consecutive non empties we have to the right and to
>+    // the left of `it`. If the sum is >= kWidth then there is at least one
>+    // probe window that might have seen a full group.
>+    bool was_never_full =
>+        empty_before && empty_after &&
>+        static_cast<size_t>(empty_after.TrailingZeros() +
>+                            empty_before.LeadingZeros()) < Group::kWidth;
>+
>+    set_ctrl(index, was_never_full ? kEmpty : kDeleted);
>+    growth_left() += was_never_full;
>+  }
>+
>+  void initialize_slots() {
>+    assert(capacity_);
>+    auto layout = MakeLayout(capacity_);
>+    char* mem = static_cast<char*>(
>+        Allocate<Layout::Alignment()>(&alloc_ref(), layout.AllocSize()));
>+    ctrl_ = reinterpret_cast<ctrl_t*>(layout.template Pointer<0>(mem));
>+    slots_ = layout.template Pointer<1>(mem);
>+    reset_ctrl();
>+    growth_left() = static_cast<size_t>(capacity_ * kMaxLoadFactor) - size_;
>+  }
>+
>+  void destroy_slots() {
>+    if (!capacity_) return;
>+    for (size_t i = 0; i != capacity_; ++i) {
>+      if (IsFull(ctrl_[i])) {
>+        PolicyTraits::destroy(&alloc_ref(), slots_ + i);
>+      }
>+    }
>+    auto layout = MakeLayout(capacity_);
>+    // Unpoison before returning the memory to the allocator.
>+    SanitizerUnpoisonMemoryRegion(slots_, sizeof(slot_type) * capacity_);
>+    Deallocate<Layout::Alignment()>(&alloc_ref(), ctrl_, layout.AllocSize());
>+    ctrl_ = EmptyGroup();
>+    slots_ = nullptr;
>+    size_ = 0;
>+    capacity_ = 0;
>+    growth_left() = 0;
>+  }
>+
>+  void resize(size_t new_capacity) {
>+    assert(IsValidCapacity(new_capacity));
>+    auto* old_ctrl = ctrl_;
>+    auto* old_slots = slots_;
>+    const size_t old_capacity = capacity_;
>+    capacity_ = new_capacity;
>+    initialize_slots();
>+
>+    for (size_t i = 0; i != old_capacity; ++i) {
>+      if (IsFull(old_ctrl[i])) {
>+        size_t hash = PolicyTraits::apply(HashElement{hash_ref()},
>+                                          PolicyTraits::element(old_slots + i));
>+        size_t new_i = find_first_non_full(hash);
>+        set_ctrl(new_i, H2(hash));
>+        PolicyTraits::transfer(&alloc_ref(), slots_ + new_i, old_slots + i);
>+      }
>+    }
>+    if (old_capacity) {
>+      SanitizerUnpoisonMemoryRegion(old_slots,
>+                                    sizeof(slot_type) * old_capacity);
>+      auto layout = MakeLayout(old_capacity);
>+      Deallocate<Layout::Alignment()>(&alloc_ref(), old_ctrl,
>+                                      layout.AllocSize());
>+    }
>+  }
>+
>+  void drop_deletes_without_resize() ABSL_ATTRIBUTE_NOINLINE {
>+    assert(IsValidCapacity(capacity_));
>+    // Algorithm:
>+    // - mark all DELETED slots as EMPTY
>+    // - mark all FULL slots as DELETED
>+    // - for each slot marked as DELETED
>+    //     hash = Hash(element)
>+    //     target = find_first_non_full(hash)
>+    //     if target is in the same group
>+    //       mark slot as FULL
>+    //     else if target is EMPTY
>+    //       transfer element to target
>+    //       mark slot as EMPTY
>+    //       mark target as FULL
>+    //     else if target is DELETED
>+    //       swap current element with target element
>+    //       mark target as FULL
>+    //       repeat procedure for current slot with moved from element (target)
>+    ConvertDeletedToEmptyAndFullToDeleted(ctrl_, capacity_);
>+    typename std::aligned_storage<sizeof(slot_type), alignof(slot_type)>::type
>+        raw;
>+    slot_type* slot = reinterpret_cast<slot_type*>(&raw);
>+    for (size_t i = 0; i != capacity_; ++i) {
>+      if (!IsDeleted(ctrl_[i])) continue;
>+      size_t hash = PolicyTraits::apply(HashElement{hash_ref()},
>+                                        PolicyTraits::element(slots_ + i));
>+      size_t new_i = find_first_non_full(hash);
>+
>+      // Verify if the old and new i fall within the same group wrt the hash.
>+      // If they do, we don't need to move the object as it falls already in
>+      // the best probe we can.
>+      const auto probe_index = [&](size_t pos) {
>+        return ((pos - probe(hash).offset()) & capacity_) / Group::kWidth;
>+      };
>+
>+      // Element doesn't move.
>+      if (ABSL_PREDICT_TRUE(probe_index(new_i) == probe_index(i))) {
>+        set_ctrl(i, H2(hash));
>+        continue;
>+      }
>+      if (IsEmpty(ctrl_[new_i])) {
>+        // Transfer element to the empty spot.
>+        // set_ctrl poisons/unpoisons the slots so we have to call it at the
>+        // right time.
>+        set_ctrl(new_i, H2(hash));
>+        PolicyTraits::transfer(&alloc_ref(), slots_ + new_i, slots_ + i);
>+        set_ctrl(i, kEmpty);
>+      } else {
>+        assert(IsDeleted(ctrl_[new_i]));
>+        set_ctrl(new_i, H2(hash));
>+        // Until we are done rehashing, DELETED marks previously FULL slots.
>+        // Swap i and new_i elements.
>+        PolicyTraits::transfer(&alloc_ref(), slot, slots_ + i);
>+        PolicyTraits::transfer(&alloc_ref(), slots_ + i, slots_ + new_i);
>+        PolicyTraits::transfer(&alloc_ref(), slots_ + new_i, slot);
>+        --i;  // repeat
>+      }
>+    }
>+    growth_left() = static_cast<size_t>(capacity_ * kMaxLoadFactor) - size_;
>+  }
>+
>+  void rehash_and_grow_if_necessary() {
>+    if (capacity_ == 0) {
>+      resize(Group::kWidth - 1);
>+    } else if (size() <= kMaxLoadFactor / 2 * capacity_) {
>+      // Squash DELETED without growing if there is enough capacity.
>+      drop_deletes_without_resize();
>+    } else {
>+      // Otherwise grow the container.
>+      resize(capacity_ * 2 + 1);
>+    }
>+  }
>+
>+  bool has_element(const value_type& elem) const {
>+    size_t hash = PolicyTraits::apply(HashElement{hash_ref()}, elem);
>+    auto seq = probe(hash);
>+    while (true) {
>+      Group g{ctrl_ + seq.offset()};
>+      for (int i : g.Match(H2(hash))) {
>+        if (ABSL_PREDICT_TRUE(PolicyTraits::element(slots_ + seq.offset(i)) ==
>+                              elem))
>+          return true;
>+      }
>+      if (ABSL_PREDICT_TRUE(g.MatchEmpty())) return false;
>+      seq.next();
>+      assert(seq.index() < capacity_ && "full table!");
>+    }
>+    return false;
>+  }
>+
>+  // Probes the raw_hash_set with the probe sequence for hash and returns the
>+  // pointer to the first empty or deleted slot.
>+  // NOTE: this function must work with tables having both kEmpty and kDelete
>+  // in one group. Such tables appears during drop_deletes_without_resize.
>+  //
>+  // This function is very useful when insertions happen and:
>+  // - the input is already a set
>+  // - there are enough slots
>+  // - the element with the hash is not in the table
>+  size_t find_first_non_full(size_t hash) {
>+    auto seq = probe(hash);
>+    while (true) {
>+      Group g{ctrl_ + seq.offset()};
>+      auto mask = g.MatchEmptyOrDeleted();
>+      if (mask) {
>+#if !defined(NDEBUG)
>+        // We want to force small tables to have random entries too, so
>+        // in debug build we will randomly insert in either the front or back
>+        // of the group.
>+        // TODO(kfm,sbenza): revisit after we do unconditional mixing
>+        if (ShouldInsertBackwards(hash, ctrl_))
>+          return seq.offset(mask.HighestBitSet());
>+        else
>+          return seq.offset(mask.LowestBitSet());
>+#else
>+        return seq.offset(mask.LowestBitSet());
>+#endif
>+      }
>+      assert(seq.index() < capacity_ && "full table!");
>+      seq.next();
>+    }
>+  }
>+
>+  // TODO(alkis): Optimize this assuming *this and that don't overlap.
>+  raw_hash_set& move_assign(raw_hash_set&& that, std::true_type) {
>+    raw_hash_set tmp(std::move(that));
>+    swap(tmp);
>+    return *this;
>+  }
>+  raw_hash_set& move_assign(raw_hash_set&& that, std::false_type) {
>+    raw_hash_set tmp(std::move(that), alloc_ref());
>+    swap(tmp);
>+    return *this;
>+  }
>+
>+ protected:
>+  template <class K>
>+  std::pair<size_t, bool> find_or_prepare_insert(const K& key) {
>+    auto hash = hash_ref()(key);
>+    auto seq = probe(hash);
>+    while (true) {
>+      Group g{ctrl_ + seq.offset()};
>+      for (int i : g.Match(H2(hash))) {
>+        if (ABSL_PREDICT_TRUE(PolicyTraits::apply(
>+                EqualElement<K>{key, eq_ref()},
>+                PolicyTraits::element(slots_ + seq.offset(i)))))
>+          return {seq.offset(i), false};
>+      }
>+      if (ABSL_PREDICT_TRUE(g.MatchEmpty())) break;
>+      seq.next();
>+    }
>+    return {prepare_insert(hash), true};
>+  }
>+
>+  size_t prepare_insert(size_t hash) ABSL_ATTRIBUTE_NOINLINE {
>+    size_t target = find_first_non_full(hash);
>+    if (ABSL_PREDICT_FALSE(growth_left() == 0 && !IsDeleted(ctrl_[target]))) {
>+      rehash_and_grow_if_necessary();
>+      target = find_first_non_full(hash);
>+    }
>+    ++size_;
>+    growth_left() -= IsEmpty(ctrl_[target]);
>+    set_ctrl(target, H2(hash));
>+    return target;
>+  }
>+
>+  // Constructs the value in the space pointed by the iterator. This only
>+  // works after an unsuccessful find_or_prepare_insert() and before any other
>+  // modifications happen in the raw_hash_set.
>+  //
>+  // PRECONDITION: i is an index returned from find_or_prepare_insert(k), where
>+  // k is the key decomposed from `forward<Args>(args)...`, and the bool
>+  // returned by find_or_prepare_insert(k) was true.
>+  // POSTCONDITION: *m.iterator_at(i) == value_type(forward<Args>(args)...).
>+  template <class... Args>
>+  void emplace_at(size_t i, Args&&... args) {
>+    PolicyTraits::construct(&alloc_ref(), slots_ + i,
>+                            std::forward<Args>(args)...);
>+
>+    assert(PolicyTraits::apply(FindElement{*this}, *iterator_at(i)) ==
>+               iterator_at(i) &&
>+           "constructed value does not match the lookup key");
>+  }
>+
>+  iterator iterator_at(size_t i) { return {ctrl_ + i, slots_ + i}; }
>+  const_iterator iterator_at(size_t i) const { return {ctrl_ + i, slots_ + i}; }
>+
>+ private:
>+  friend struct RawHashSetTestOnlyAccess;
>+
>+  probe_seq<Group::kWidth> probe(size_t hash) const {
>+    return probe_seq<Group::kWidth>(H1(hash, ctrl_), capacity_);
>+  }
>+
>+  // Reset all ctrl bytes back to kEmpty, except the sentinel.
>+  void reset_ctrl() {
>+    std::memset(ctrl_, kEmpty, capacity_ + Group::kWidth);
>+    ctrl_[capacity_] = kSentinel;
>+    SanitizerPoisonMemoryRegion(slots_, sizeof(slot_type) * capacity_);
>+  }
>+
>+  // Sets the control byte, and if `i < Group::kWidth`, set the cloned byte at
>+  // the end too.
>+  void set_ctrl(size_t i, ctrl_t h) {
>+    assert(i < capacity_);
>+
>+    if (IsFull(h)) {
>+      SanitizerUnpoisonObject(slots_ + i);
>+    } else {
>+      SanitizerPoisonObject(slots_ + i);
>+    }
>+
>+    ctrl_[i] = h;
>+    ctrl_[((i - Group::kWidth) & capacity_) + Group::kWidth] = h;
>+  }
>+
>+  size_t& growth_left() { return settings_.template get<0>(); }
>+
>+  hasher& hash_ref() { return settings_.template get<1>(); }
>+  const hasher& hash_ref() const { return settings_.template get<1>(); }
>+  key_equal& eq_ref() { return settings_.template get<2>(); }
>+  const key_equal& eq_ref() const { return settings_.template get<2>(); }
>+  allocator_type& alloc_ref() { return settings_.template get<3>(); }
>+  const allocator_type& alloc_ref() const {
>+    return settings_.template get<3>();
>+  }
>+
>+  // On average each group has 2 empty slot (for the vectorized case).
>+  static constexpr int64_t kMaxLoadFactorNumerator = 14;
>+  static constexpr int64_t kMaxLoadFactorDenominator = 16;
>+  static constexpr float kMaxLoadFactor =
>+      1.0 * kMaxLoadFactorNumerator / kMaxLoadFactorDenominator;
>+
>+  // TODO(alkis): Investigate removing some of these fields:
>+  // - ctrl/slots can be derived from each other
>+  // - size can be moved into the slot array
>+  ctrl_t* ctrl_ = EmptyGroup();  // [(capacity + 1) * ctrl_t]
>+  slot_type* slots_ = nullptr;   // [capacity * slot_type]
>+  size_t size_ = 0;              // number of full slots
>+  size_t capacity_ = 0;          // total number of slots
>+  absl::container_internal::CompressedTuple<size_t /* growth_left */, hasher,
>+                                            key_equal, allocator_type>
>+      settings_{0, hasher{}, key_equal{}, allocator_type{}};
>+};
>+
>+namespace hashtable_debug_internal {
>+template <typename Set>
>+struct HashtableDebugAccess<Set, absl::void_t<typename Set::raw_hash_set>> {
>+  using Traits = typename Set::PolicyTraits;
>+  using Slot = typename Traits::slot_type;
>+
>+  static size_t GetNumProbes(const Set& set,
>+                             const typename Set::key_type& key) {
>+    size_t num_probes = 0;
>+    size_t hash = set.hash_ref()(key);
>+    auto seq = set.probe(hash);
>+    while (true) {
>+      container_internal::Group g{set.ctrl_ + seq.offset()};
>+      for (int i : g.Match(container_internal::H2(hash))) {
>+        if (Traits::apply(
>+                typename Set::template EqualElement<typename Set::key_type>{
>+                    key, set.eq_ref()},
>+                Traits::element(set.slots_ + seq.offset(i))))
>+          return num_probes;
>+        ++num_probes;
>+      }
>+      if (g.MatchEmpty()) return num_probes;
>+      seq.next();
>+      ++num_probes;
>+    }
>+  }
>+
>+  static size_t AllocatedByteSize(const Set& c) {
>+    size_t capacity = c.capacity_;
>+    if (capacity == 0) return 0;
>+    auto layout = Set::MakeLayout(capacity);
>+    size_t m = layout.AllocSize();
>+
>+    size_t per_slot = Traits::space_used(static_cast<const Slot*>(nullptr));
>+    if (per_slot != ~size_t{}) {
>+      m += per_slot * c.size();
>+    } else {
>+      for (size_t i = 0; i != capacity; ++i) {
>+        if (container_internal::IsFull(c.ctrl_[i])) {
>+          m += Traits::space_used(c.slots_ + i);
>+        }
>+      }
>+    }
>+    return m;
>+  }
>+
>+  static size_t LowerBoundAllocatedByteSize(size_t size) {
>+    size_t capacity = container_internal::NormalizeCapacity(
>+        std::ceil(size / Set::kMaxLoadFactor));
>+    if (capacity == 0) return 0;
>+    auto layout = Set::MakeLayout(capacity);
>+    size_t m = layout.AllocSize();
>+    size_t per_slot = Traits::space_used(static_cast<const Slot*>(nullptr));
>+    if (per_slot != ~size_t{}) {
>+      m += per_slot * size;
>+    }
>+    return m;
>+  }
>+};
>+
>+}  // namespace hashtable_debug_internal
>+}  // namespace container_internal
>+}  // namespace absl
>+
>+#endif  // ABSL_CONTAINER_INTERNAL_RAW_HASH_SET_H_
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/raw_hash_set_allocator_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/raw_hash_set_allocator_test.cc
>new file mode 100644
>index 0000000000000000000000000000000000000000..891fa450fe08c223e202b58fc38a02b8bf77822b
>--- /dev/null
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/raw_hash_set_allocator_test.cc
>@@ -0,0 +1,428 @@
>+// Copyright 2018 The Abseil Authors.
>+//
>+// Licensed under the Apache License, Version 2.0 (the "License");
>+// you may not use this file except in compliance with the License.
>+// You may obtain a copy of the License at
>+//
>+//      http://www.apache.org/licenses/LICENSE-2.0
>+//
>+// Unless required by applicable law or agreed to in writing, software
>+// distributed under the License is distributed on an "AS IS" BASIS,
>+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
>+// See the License for the specific language governing permissions and
>+// limitations under the License.
>+
>+#include <limits>
>+#include <scoped_allocator>
>+
>+#include "gtest/gtest.h"
>+#include "absl/container/internal/raw_hash_set.h"
>+#include "absl/container/internal/tracked.h"
>+
>+namespace absl {
>+namespace container_internal {
>+namespace {
>+
>+enum AllocSpec {
>+  kPropagateOnCopy = 1,
>+  kPropagateOnMove = 2,
>+  kPropagateOnSwap = 4,
>+};
>+
>+struct AllocState {
>+  size_t num_allocs = 0;
>+  std::set<void*> owned;
>+};
>+
>+template <class T,
>+          int Spec = kPropagateOnCopy | kPropagateOnMove | kPropagateOnSwap>
>+class CheckedAlloc {
>+ public:
>+  template <class, int>
>+  friend class CheckedAlloc;
>+
>+  using value_type = T;
>+
>+  CheckedAlloc() {}
>+  explicit CheckedAlloc(size_t id) : id_(id) {}
>+  CheckedAlloc(const CheckedAlloc&) = default;
>+  CheckedAlloc& operator=(const CheckedAlloc&) = default;
>+
>+  template <class U>
>+  CheckedAlloc(const CheckedAlloc<U, Spec>& that)
>+      : id_(that.id_), state_(that.state_) {}
>+
>+  template <class U>
>+  struct rebind {
>+    using other = CheckedAlloc<U, Spec>;
>+  };
>+
>+  using propagate_on_container_copy_assignment =
>+      std::integral_constant<bool, (Spec & kPropagateOnCopy) != 0>;
>+
>+  using propagate_on_container_move_assignment =
>+      std::integral_constant<bool, (Spec & kPropagateOnMove) != 0>;
>+
>+  using propagate_on_container_swap =
>+      std::integral_constant<bool, (Spec & kPropagateOnSwap) != 0>;
>+
>+  CheckedAlloc select_on_container_copy_construction() const {
>+    if (Spec & kPropagateOnCopy) return *this;
>+    return {};
>+  }
>+
>+  T* allocate(size_t n) {
>+    T* ptr = std::allocator<T>().allocate(n);
>+    track_alloc(ptr);
>+    return ptr;
>+  }
>+  void deallocate(T* ptr, size_t n) {
>+    memset(ptr, 0, n * sizeof(T));  // The freed memory must be unpoisoned.
>+    track_dealloc(ptr);
>+    return std::allocator<T>().deallocate(ptr, n);
>+  }
>+
>+  friend bool operator==(const CheckedAlloc& a, const CheckedAlloc& b) {
>+    return a.id_ == b.id_;
>+  }
>+  friend bool operator!=(const CheckedAlloc& a, const CheckedAlloc& b) {
>+    return !(a == b);
>+  }
>+
>+  size_t num_allocs() const { return state_->num_allocs; }
>+
>+  void swap(CheckedAlloc& that) {
>+    using std::swap;
>+    swap(id_, that.id_);
>+    swap(state_, that.state_);
>+  }
>+
>+  friend void swap(CheckedAlloc& a, CheckedAlloc& b) { a.swap(b); }
>+
>+  friend std::ostream& operator<<(std::ostream& o, const CheckedAlloc& a) {
>+    return o << "alloc(" << a.id_ << ")";
>+  }
>+
>+ private:
>+  void track_alloc(void* ptr) {
>+    AllocState* state = state_.get();
>+    ++state->num_allocs;
>+    if (!state->owned.insert(ptr).second)
>+      ADD_FAILURE() << *this << " got previously allocated memory: " << ptr;
>+  }
>+  void track_dealloc(void* ptr) {
>+    if (state_->owned.erase(ptr) != 1)
>+      ADD_FAILURE() << *this
>+                    << " deleting memory owned by another allocator: " << ptr;
>+  }
>+
>+  size_t id_ = std::numeric_limits<size_t>::max();
>+
>+  std::shared_ptr<AllocState> state_ = std::make_shared<AllocState>();
>+};
>+
>+struct Identity {
>+  int32_t operator()(int32_t v) const { return v; }
>+};
>+
>+struct Policy {
>+  using slot_type = Tracked<int32_t>;
>+  using init_type = Tracked<int32_t>;
>+  using key_type = int32_t;
>+
>+  template <class allocator_type, class... Args>
>+  static void construct(allocator_type* alloc, slot_type* slot,
>+                        Args&&... args) {
>+    std::allocator_traits<allocator_type>::construct(
>+        *alloc, slot, std::forward<Args>(args)...);
>+  }
>+
>+  template <class allocator_type>
>+  static void destroy(allocator_type* alloc, slot_type* slot) {
>+    std::allocator_traits<allocator_type>::destroy(*alloc, slot);
>+  }
>+
>+  template <class allocator_type>
>+  static void transfer(allocator_type* alloc, slot_type* new_slot,
>+                       slot_type* old_slot) {
>+    construct(alloc, new_slot, std::move(*old_slot));
>+    destroy(alloc, old_slot);
>+  }
>+
>+  template <class F>
>+  static auto apply(F&& f, int32_t v) -> decltype(std::forward<F>(f)(v, v)) {
>+    return std::forward<F>(f)(v, v);
>+  }
>+
>+  template <class F>
>+  static auto apply(F&& f, const slot_type& v)
>+      -> decltype(std::forward<F>(f)(v.val(), v)) {
>+    return std::forward<F>(f)(v.val(), v);
>+  }
>+
>+  template <class F>
>+  static auto apply(F&& f, slot_type&& v)
>+      -> decltype(std::forward<F>(f)(v.val(), std::move(v))) {
>+    return std::forward<F>(f)(v.val(), std::move(v));
>+  }
>+
>+  static slot_type& element(slot_type* slot) { return *slot; }
>+};
>+
>+template <int Spec>
>+struct PropagateTest : public ::testing::Test {
>+  using Alloc = CheckedAlloc<Tracked<int32_t>, Spec>;
>+
>+  using Table = raw_hash_set<Policy, Identity, std::equal_to<int32_t>, Alloc>;
>+
>+  PropagateTest() {
>+    EXPECT_EQ(a1, t1.get_allocator());
>+    EXPECT_NE(a2, t1.get_allocator());
>+  }
>+
>+  Alloc a1 = Alloc(1);
>+  Table t1 = Table(0, a1);
>+  Alloc a2 = Alloc(2);
>+};
>+
>+using PropagateOnAll =
>+    PropagateTest<kPropagateOnCopy | kPropagateOnMove | kPropagateOnSwap>;
>+using NoPropagateOnCopy = PropagateTest<kPropagateOnMove | kPropagateOnSwap>;
>+using NoPropagateOnMove = PropagateTest<kPropagateOnCopy | kPropagateOnSwap>;
>+
>+TEST_F(PropagateOnAll, Empty) { EXPECT_EQ(0, a1.num_allocs()); }
>+
>+TEST_F(PropagateOnAll, InsertAllocates) {
>+  auto it = t1.insert(0).first;
>+  EXPECT_EQ(1, a1.num_allocs());
>+  EXPECT_EQ(0, it->num_moves());
>+  EXPECT_EQ(0, it->num_copies());
>+}
>+
>+TEST_F(PropagateOnAll, InsertDecomposes) {
>+  auto it = t1.insert(0).first;
>+  EXPECT_EQ(1, a1.num_allocs());
>+  EXPECT_EQ(0, it->num_moves());
>+  EXPECT_EQ(0, it->num_copies());
>+
>+  EXPECT_FALSE(t1.insert(0).second);
>+  EXPECT_EQ(1, a1.num_allocs());
>+  EXPECT_EQ(0, it->num_moves());
>+  EXPECT_EQ(0, it->num_copies());
>+}
>+
>+TEST_F(PropagateOnAll, RehashMoves) {
>+  auto it = t1.insert(0).first;
>+  EXPECT_EQ(0, it->num_moves());
>+  t1.rehash(2 * t1.capacity());
>+  EXPECT_EQ(2, a1.num_allocs());
>+  it = t1.find(0);
>+  EXPECT_EQ(1, it->num_moves());
>+  EXPECT_EQ(0, it->num_copies());
>+}
>+
>+TEST_F(PropagateOnAll, CopyConstructor) {
>+  auto it = t1.insert(0).first;
>+  Table u(t1);
>+  EXPECT_EQ(2, a1.num_allocs());
>+  EXPECT_EQ(0, it->num_moves());
>+  EXPECT_EQ(1, it->num_copies());
>+}
>+
>+TEST_F(NoPropagateOnCopy, CopyConstructor) {
>+  auto it = t1.insert(0).first;
>+  Table u(t1);
>+  EXPECT_EQ(1, a1.num_allocs());
>+  EXPECT_EQ(1, u.get_allocator().num_allocs());
>+  EXPECT_EQ(0, it->num_moves());
>+  EXPECT_EQ(1, it->num_copies());
>+}
>+
>+TEST_F(PropagateOnAll, CopyConstructorWithSameAlloc) {
>+  auto it = t1.insert(0).first;
>+  Table u(t1, a1);
>+  EXPECT_EQ(2, a1.num_allocs());
>+  EXPECT_EQ(0, it->num_moves());
>+  EXPECT_EQ(1, it->num_copies());
>+}
>+
>+TEST_F(NoPropagateOnCopy, CopyConstructorWithSameAlloc) {
>+  auto it = t1.insert(0).first;
>+  Table u(t1, a1);
>+  EXPECT_EQ(2, a1.num_allocs());
>+  EXPECT_EQ(0, it->num_moves());
>+  EXPECT_EQ(1, it->num_copies());
>+}
>+
>+TEST_F(PropagateOnAll, CopyConstructorWithDifferentAlloc) {
>+  auto it = t1.insert(0).first;
>+  Table u(t1, a2);
>+  EXPECT_EQ(a2, u.get_allocator());
>+  EXPECT_EQ(1, a1.num_allocs());
>+  EXPECT_EQ(1, a2.num_allocs());
>+  EXPECT_EQ(0, it->num_moves());
>+  EXPECT_EQ(1, it->num_copies());
>+}
>+
>+TEST_F(NoPropagateOnCopy, CopyConstructorWithDifferentAlloc) {
>+  auto it = t1.insert(0).first;
>+  Table u(t1, a2);
>+  EXPECT_EQ(a2, u.get_allocator());
>+  EXPECT_EQ(1, a1.num_allocs());
>+  EXPECT_EQ(1, a2.num_allocs());
>+  EXPECT_EQ(0, it->num_moves());
>+  EXPECT_EQ(1, it->num_copies());
>+}
>+
>+TEST_F(PropagateOnAll, MoveConstructor) {
>+  auto it = t1.insert(0).first;
>+  Table u(std::move(t1));
>+  EXPECT_EQ(1, a1.num_allocs());
>+  EXPECT_EQ(0, it->num_moves());
>+  EXPECT_EQ(0, it->num_copies());
>+}
>+
>+TEST_F(NoPropagateOnMove, MoveConstructor) {
>+  auto it = t1.insert(0).first;
>+  Table u(std::move(t1));
>+  EXPECT_EQ(1, a1.num_allocs());
>+  EXPECT_EQ(0, it->num_moves());
>+  EXPECT_EQ(0, it->num_copies());
>+}
>+
>+TEST_F(PropagateOnAll, MoveConstructorWithSameAlloc) {
>+  auto it = t1.insert(0).first;
>+  Table u(std::move(t1), a1);
>+  EXPECT_EQ(1, a1.num_allocs());
>+  EXPECT_EQ(0, it->num_moves());
>+  EXPECT_EQ(0, it->num_copies());
>+}
>+
>+TEST_F(NoPropagateOnMove, MoveConstructorWithSameAlloc) {
>+  auto it = t1.insert(0).first;
>+  Table u(std::move(t1), a1);
>+  EXPECT_EQ(1, a1.num_allocs());
>+  EXPECT_EQ(0, it->num_moves());
>+  EXPECT_EQ(0, it->num_copies());
>+}
>+
>+TEST_F(PropagateOnAll, MoveConstructorWithDifferentAlloc) {
>+  auto it = t1.insert(0).first;
>+  Table u(std::move(t1), a2);
>+  it = u.find(0);
>+  EXPECT_EQ(a2, u.get_allocator());
>+  EXPECT_EQ(1, a1.num_allocs());
>+  EXPECT_EQ(1, a2.num_allocs());
>+  EXPECT_EQ(1, it->num_moves());
>+  EXPECT_EQ(0, it->num_copies());
>+}
>+
>+TEST_F(NoPropagateOnMove, MoveConstructorWithDifferentAlloc) {
>+  auto it = t1.insert(0).first;
>+  Table u(std::move(t1), a2);
>+  it = u.find(0);
>+  EXPECT_EQ(a2, u.get_allocator());
>+  EXPECT_EQ(1, a1.num_allocs());
>+  EXPECT_EQ(1, a2.num_allocs());
>+  EXPECT_EQ(1, it->num_moves());
>+  EXPECT_EQ(0, it->num_copies());
>+}
>+
>+TEST_F(PropagateOnAll, CopyAssignmentWithSameAlloc) {
>+  auto it = t1.insert(0).first;
>+  Table u(0, a1);
>+  u = t1;
>+  EXPECT_EQ(2, a1.num_allocs());
>+  EXPECT_EQ(0, it->num_moves());
>+  EXPECT_EQ(1, it->num_copies());
>+}
>+
>+TEST_F(NoPropagateOnCopy, CopyAssignmentWithSameAlloc) {
>+  auto it = t1.insert(0).first;
>+  Table u(0, a1);
>+  u = t1;
>+  EXPECT_EQ(2, a1.num_allocs());
>+  EXPECT_EQ(0, it->num_moves());
>+  EXPECT_EQ(1, it->num_copies());
>+}
>+
>+TEST_F(PropagateOnAll, CopyAssignmentWithDifferentAlloc) {
>+  auto it = t1.insert(0).first;
>+  Table u(0, a2);
>+  u = t1;
>+  EXPECT_EQ(a1, u.get_allocator());
>+  EXPECT_EQ(2, a1.num_allocs());
>+  EXPECT_EQ(0, a2.num_allocs());
>+  EXPECT_EQ(0, it->num_moves());
>+  EXPECT_EQ(1, it->num_copies());
>+}
>+
>+TEST_F(NoPropagateOnCopy, CopyAssignmentWithDifferentAlloc) {
>+  auto it = t1.insert(0).first;
>+  Table u(0, a2);
>+  u = t1;
>+  EXPECT_EQ(a2, u.get_allocator());
>+  EXPECT_EQ(1, a1.num_allocs());
>+  EXPECT_EQ(1, a2.num_allocs());
>+  EXPECT_EQ(0, it->num_moves());
>+  EXPECT_EQ(1, it->num_copies());
>+}
>+
>+TEST_F(PropagateOnAll, MoveAssignmentWithSameAlloc) {
>+  auto it = t1.insert(0).first;
>+  Table u(0, a1);
>+  u = std::move(t1);
>+  EXPECT_EQ(a1, u.get_allocator());
>+  EXPECT_EQ(1, a1.num_allocs());
>+  EXPECT_EQ(0, it->num_moves());
>+  EXPECT_EQ(0, it->num_copies());
>+}
>+
>+TEST_F(NoPropagateOnMove, MoveAssignmentWithSameAlloc) {
>+  auto it = t1.insert(0).first;
>+  Table u(0, a1);
>+  u = std::move(t1);
>+  EXPECT_EQ(a1, u.get_allocator());
>+  EXPECT_EQ(1, a1.num_allocs());
>+  EXPECT_EQ(0, it->num_moves());
>+  EXPECT_EQ(0, it->num_copies());
>+}
>+
>+TEST_F(PropagateOnAll, MoveAssignmentWithDifferentAlloc) {
>+  auto it = t1.insert(0).first;
>+  Table u(0, a2);
>+  u = std::move(t1);
>+  EXPECT_EQ(a1, u.get_allocator());
>+  EXPECT_EQ(1, a1.num_allocs());
>+  EXPECT_EQ(0, a2.num_allocs());
>+  EXPECT_EQ(0, it->num_moves());
>+  EXPECT_EQ(0, it->num_copies());
>+}
>+
>+TEST_F(NoPropagateOnMove, MoveAssignmentWithDifferentAlloc) {
>+  auto it = t1.insert(0).first;
>+  Table u(0, a2);
>+  u = std::move(t1);
>+  it = u.find(0);
>+  EXPECT_EQ(a2, u.get_allocator());
>+  EXPECT_EQ(1, a1.num_allocs());
>+  EXPECT_EQ(1, a2.num_allocs());
>+  EXPECT_EQ(1, it->num_moves());
>+  EXPECT_EQ(0, it->num_copies());
>+}
>+
>+TEST_F(PropagateOnAll, Swap) {
>+  auto it = t1.insert(0).first;
>+  Table u(0, a2);
>+  u.swap(t1);
>+  EXPECT_EQ(a1, u.get_allocator());
>+  EXPECT_EQ(a2, t1.get_allocator());
>+  EXPECT_EQ(1, a1.num_allocs());
>+  EXPECT_EQ(0, a2.num_allocs());
>+  EXPECT_EQ(0, it->num_moves());
>+  EXPECT_EQ(0, it->num_copies());
>+}
>+
>+}  // namespace
>+}  // namespace container_internal
>+}  // namespace absl
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/raw_hash_set_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/raw_hash_set_test.cc
>new file mode 100644
>index 0000000000000000000000000000000000000000..cd33a3acb9178888855ff91de94146934c837556
>--- /dev/null
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/raw_hash_set_test.cc
>@@ -0,0 +1,1961 @@
>+// Copyright 2018 The Abseil Authors.
>+//
>+// Licensed under the Apache License, Version 2.0 (the "License");
>+// you may not use this file except in compliance with the License.
>+// You may obtain a copy of the License at
>+//
>+//      http://www.apache.org/licenses/LICENSE-2.0
>+//
>+// Unless required by applicable law or agreed to in writing, software
>+// distributed under the License is distributed on an "AS IS" BASIS,
>+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
>+// See the License for the specific language governing permissions and
>+// limitations under the License.
>+
>+#include "absl/container/internal/raw_hash_set.h"
>+
>+#include <array>
>+#include <cmath>
>+#include <cstdint>
>+#include <deque>
>+#include <functional>
>+#include <memory>
>+#include <numeric>
>+#include <random>
>+#include <string>
>+
>+#include "gmock/gmock.h"
>+#include "gtest/gtest.h"
>+#include "absl/base/attributes.h"
>+#include "absl/base/internal/cycleclock.h"
>+#include "absl/base/internal/raw_logging.h"
>+#include "absl/container/internal/container_memory.h"
>+#include "absl/container/internal/hash_function_defaults.h"
>+#include "absl/container/internal/hash_policy_testing.h"
>+#include "absl/container/internal/hashtable_debug.h"
>+#include "absl/strings/string_view.h"
>+
>+namespace absl {
>+namespace container_internal {
>+
>+struct RawHashSetTestOnlyAccess {
>+  template <typename C>
>+  static auto GetSlots(const C& c) -> decltype(c.slots_) {
>+    return c.slots_;
>+  }
>+};
>+
>+namespace {
>+
>+using ::testing::DoubleNear;
>+using ::testing::ElementsAre;
>+using ::testing::Optional;
>+using ::testing::Pair;
>+using ::testing::UnorderedElementsAre;
>+
>+TEST(Util, NormalizeCapacity) {
>+  constexpr size_t kMinCapacity = Group::kWidth - 1;
>+  EXPECT_EQ(kMinCapacity, NormalizeCapacity(0));
>+  EXPECT_EQ(kMinCapacity, NormalizeCapacity(1));
>+  EXPECT_EQ(kMinCapacity, NormalizeCapacity(2));
>+  EXPECT_EQ(kMinCapacity, NormalizeCapacity(kMinCapacity));
>+  EXPECT_EQ(kMinCapacity * 2 + 1, NormalizeCapacity(kMinCapacity + 1));
>+  EXPECT_EQ(kMinCapacity * 2 + 1, NormalizeCapacity(kMinCapacity + 2));
>+}
>+
>+TEST(Util, probe_seq) {
>+  probe_seq<16> seq(0, 127);
>+  auto gen = [&]() {
>+    size_t res = seq.offset();
>+    seq.next();
>+    return res;
>+  };
>+  std::vector<size_t> offsets(8);
>+  std::generate_n(offsets.begin(), 8, gen);
>+  EXPECT_THAT(offsets, ElementsAre(0, 16, 48, 96, 32, 112, 80, 64));
>+  seq = probe_seq<16>(128, 127);
>+  std::generate_n(offsets.begin(), 8, gen);
>+  EXPECT_THAT(offsets, ElementsAre(0, 16, 48, 96, 32, 112, 80, 64));
>+}
>+
>+TEST(BitMask, Smoke) {
>+  EXPECT_FALSE((BitMask<uint8_t, 8>(0)));
>+  EXPECT_TRUE((BitMask<uint8_t, 8>(5)));
>+
>+  EXPECT_THAT((BitMask<uint8_t, 8>(0)), ElementsAre());
>+  EXPECT_THAT((BitMask<uint8_t, 8>(0x1)), ElementsAre(0));
>+  EXPECT_THAT((BitMask<uint8_t, 8>(0x2)), ElementsAre(1));
>+  EXPECT_THAT((BitMask<uint8_t, 8>(0x3)), ElementsAre(0, 1));
>+  EXPECT_THAT((BitMask<uint8_t, 8>(0x4)), ElementsAre(2));
>+  EXPECT_THAT((BitMask<uint8_t, 8>(0x5)), ElementsAre(0, 2));
>+  EXPECT_THAT((BitMask<uint8_t, 8>(0x55)), ElementsAre(0, 2, 4, 6));
>+  EXPECT_THAT((BitMask<uint8_t, 8>(0xAA)), ElementsAre(1, 3, 5, 7));
>+}
>+
>+TEST(BitMask, WithShift) {
>+  // See the non-SSE version of Group for details on what this math is for.
>+  uint64_t ctrl = 0x1716151413121110;
>+  uint64_t hash = 0x12;
>+  constexpr uint64_t msbs = 0x8080808080808080ULL;
>+  constexpr uint64_t lsbs = 0x0101010101010101ULL;
>+  auto x = ctrl ^ (lsbs * hash);
>+  uint64_t mask = (x - lsbs) & ~x & msbs;
>+  EXPECT_EQ(0x0000000080800000, mask);
>+
>+  BitMask<uint64_t, 8, 3> b(mask);
>+  EXPECT_EQ(*b, 2);
>+}
>+
>+TEST(BitMask, LeadingTrailing) {
>+  EXPECT_EQ((BitMask<uint32_t, 16>(0b0001101001000000).LeadingZeros()), 3);
>+  EXPECT_EQ((BitMask<uint32_t, 16>(0b0001101001000000).TrailingZeros()), 6);
>+
>+  EXPECT_EQ((BitMask<uint32_t, 16>(0b0000000000000001).LeadingZeros()), 15);
>+  EXPECT_EQ((BitMask<uint32_t, 16>(0b0000000000000001).TrailingZeros()), 0);
>+
>+  EXPECT_EQ((BitMask<uint32_t, 16>(0b1000000000000000).LeadingZeros()), 0);
>+  EXPECT_EQ((BitMask<uint32_t, 16>(0b1000000000000000).TrailingZeros()), 15);
>+
>+  EXPECT_EQ((BitMask<uint64_t, 8, 3>(0x0000008080808000).LeadingZeros()), 3);
>+  EXPECT_EQ((BitMask<uint64_t, 8, 3>(0x0000008080808000).TrailingZeros()), 1);
>+
>+  EXPECT_EQ((BitMask<uint64_t, 8, 3>(0x0000000000000080).LeadingZeros()), 7);
>+  EXPECT_EQ((BitMask<uint64_t, 8, 3>(0x0000000000000080).TrailingZeros()), 0);
>+
>+  EXPECT_EQ((BitMask<uint64_t, 8, 3>(0x8000000000000000).LeadingZeros()), 0);
>+  EXPECT_EQ((BitMask<uint64_t, 8, 3>(0x8000000000000000).TrailingZeros()), 7);
>+}
>+
>+TEST(Group, EmptyGroup) {
>+  for (h2_t h = 0; h != 128; ++h) EXPECT_FALSE(Group{EmptyGroup()}.Match(h));
>+}
>+
>+#if SWISSTABLE_HAVE_SSE2
>+TEST(Group, Match) {
>+  ctrl_t group[] = {kEmpty, 1, kDeleted, 3, kEmpty, 5, kSentinel, 7,
>+                    7, 5, 3, 1, 1, 1, 1, 1};
>+  EXPECT_THAT(Group{group}.Match(0), ElementsAre());
>+  EXPECT_THAT(Group{group}.Match(1), ElementsAre(1, 11, 12, 13, 14, 15));
>+  EXPECT_THAT(Group{group}.Match(3), ElementsAre(3, 10));
>+  EXPECT_THAT(Group{group}.Match(5), ElementsAre(5, 9));
>+  EXPECT_THAT(Group{group}.Match(7), ElementsAre(7, 8));
>+}
>+
>+TEST(Group, MatchEmpty) {
>+  ctrl_t group[] = {kEmpty, 1, kDeleted, 3, kEmpty, 5, kSentinel, 7,
>+                    7, 5, 3, 1, 1, 1, 1, 1};
>+  EXPECT_THAT(Group{group}.MatchEmpty(), ElementsAre(0, 4));
>+}
>+
>+TEST(Group, MatchEmptyOrDeleted) {
>+  ctrl_t group[] = {kEmpty, 1, kDeleted, 3, kEmpty, 5, kSentinel, 7,
>+                    7, 5, 3, 1, 1, 1, 1, 1};
>+  EXPECT_THAT(Group{group}.MatchEmptyOrDeleted(), ElementsAre(0, 2, 4));
>+}
>+#else
>+TEST(Group, Match) {
>+  ctrl_t group[] = {kEmpty, 1, 2, kDeleted, 2, 1, kSentinel, 1};
>+  EXPECT_THAT(Group{group}.Match(0), ElementsAre());
>+  EXPECT_THAT(Group{group}.Match(1), ElementsAre(1, 5, 7));
>+  EXPECT_THAT(Group{group}.Match(2), ElementsAre(2, 4));
>+}
>+TEST(Group, MatchEmpty) {
>+  ctrl_t group[] = {kEmpty, 1, 2, kDeleted, 2, 1, kSentinel, 1};
>+  EXPECT_THAT(Group{group}.MatchEmpty(), ElementsAre(0));
>+}
>+
>+TEST(Group, MatchEmptyOrDeleted) {
>+  ctrl_t group[] = {kEmpty, 1, 2, kDeleted, 2, 1, kSentinel, 1};
>+  EXPECT_THAT(Group{group}.MatchEmptyOrDeleted(), ElementsAre(0, 3));
>+}
>+#endif
>+
>+TEST(Batch, DropDeletes) {
>+  constexpr size_t kCapacity = 63;
>+  constexpr size_t kGroupWidth = container_internal::Group::kWidth;
>+  std::vector<ctrl_t> ctrl(kCapacity + 1 + kGroupWidth);
>+  ctrl[kCapacity] = kSentinel;
>+  std::vector<ctrl_t> pattern = {kEmpty, 2, kDeleted, 2, kEmpty, 1, kDeleted};
>+  for
(size_t i = 0; i != kCapacity; ++i) { >+ ctrl[i] = pattern[i % pattern.size()]; >+ if (i < kGroupWidth - 1) >+ ctrl[i + kCapacity + 1] = pattern[i % pattern.size()]; >+ } >+ ConvertDeletedToEmptyAndFullToDeleted(ctrl.data(), kCapacity); >+ ASSERT_EQ(ctrl[kCapacity], kSentinel); >+ for (size_t i = 0; i < kCapacity + 1 + kGroupWidth; ++i) { >+ ctrl_t expected = pattern[i % (kCapacity + 1) % pattern.size()]; >+ if (i == kCapacity) expected = kSentinel; >+ if (expected == kDeleted) expected = kEmpty; >+ if (IsFull(expected)) expected = kDeleted; >+ EXPECT_EQ(ctrl[i], expected) >+ << i << " " << int{pattern[i % pattern.size()]}; >+ } >+} >+ >+TEST(Group, CountLeadingEmptyOrDeleted) { >+ const std::vector<ctrl_t> empty_examples = {kEmpty, kDeleted}; >+ const std::vector<ctrl_t> full_examples = {0, 1, 2, 3, 5, 9, 127, kSentinel}; >+ >+ for (ctrl_t empty : empty_examples) { >+ std::vector<ctrl_t> e(Group::kWidth, empty); >+ EXPECT_EQ(Group::kWidth, Group{e.data()}.CountLeadingEmptyOrDeleted()); >+ for (ctrl_t full : full_examples) { >+ for (size_t i = 0; i != Group::kWidth; ++i) { >+ std::vector<ctrl_t> f(Group::kWidth, empty); >+ f[i] = full; >+ EXPECT_EQ(i, Group{f.data()}.CountLeadingEmptyOrDeleted()); >+ } >+ std::vector<ctrl_t> f(Group::kWidth, empty); >+ f[Group::kWidth * 2 / 3] = full; >+ f[Group::kWidth / 2] = full; >+ EXPECT_EQ( >+ Group::kWidth / 2, Group{f.data()}.CountLeadingEmptyOrDeleted()); >+ } >+ } >+} >+ >+struct IntPolicy { >+ using slot_type = int64_t; >+ using key_type = int64_t; >+ using init_type = int64_t; >+ >+ static void construct(void*, int64_t* slot, int64_t v) { *slot = v; } >+ static void destroy(void*, int64_t*) {} >+ static void transfer(void*, int64_t* new_slot, int64_t* old_slot) { >+ *new_slot = *old_slot; >+ } >+ >+ static int64_t& element(slot_type* slot) { return *slot; } >+ >+ template <class F> >+ static auto apply(F&& f, int64_t x) -> decltype(std::forward<F>(f)(x, x)) { >+ return std::forward<F>(f)(x, x); >+ } >+}; >+ >+class 
StringPolicy { >+ template <class F, class K, class V, >+ class = typename std::enable_if< >+ std::is_convertible<const K&, absl::string_view>::value>::type> >+ decltype(std::declval<F>()( >+ std::declval<const absl::string_view&>(), std::piecewise_construct, >+ std::declval<std::tuple<K>>(), >+ std::declval<V>())) static apply_impl(F&& f, >+ std::pair<std::tuple<K>, V> p) { >+ const absl::string_view& key = std::get<0>(p.first); >+ return std::forward<F>(f)(key, std::piecewise_construct, std::move(p.first), >+ std::move(p.second)); >+ } >+ >+ public: >+ struct slot_type { >+ struct ctor {}; >+ >+ template <class... Ts> >+ slot_type(ctor, Ts&&... ts) : pair(std::forward<Ts>(ts)...) {} >+ >+ std::pair<std::string, std::string> pair; >+ }; >+ >+ using key_type = std::string; >+ using init_type = std::pair<std::string, std::string>; >+ >+ template <class allocator_type, class... Args> >+ static void construct(allocator_type* alloc, slot_type* slot, Args... args) { >+ std::allocator_traits<allocator_type>::construct( >+ *alloc, slot, typename slot_type::ctor(), std::forward<Args>(args)...); >+ } >+ >+ template <class allocator_type> >+ static void destroy(allocator_type* alloc, slot_type* slot) { >+ std::allocator_traits<allocator_type>::destroy(*alloc, slot); >+ } >+ >+ template <class allocator_type> >+ static void transfer(allocator_type* alloc, slot_type* new_slot, >+ slot_type* old_slot) { >+ construct(alloc, new_slot, std::move(old_slot->pair)); >+ destroy(alloc, old_slot); >+ } >+ >+ static std::pair<std::string, std::string>& element(slot_type* slot) { >+ return slot->pair; >+ } >+ >+ template <class F, class... Args> >+ static auto apply(F&& f, Args&&... 
args) >+ -> decltype(apply_impl(std::forward<F>(f), >+ PairArgs(std::forward<Args>(args)...))) { >+ return apply_impl(std::forward<F>(f), >+ PairArgs(std::forward<Args>(args)...)); >+ } >+}; >+ >+struct StringHash : absl::Hash<absl::string_view> { >+ using is_transparent = void; >+}; >+struct StringEq : std::equal_to<absl::string_view> { >+ using is_transparent = void; >+}; >+ >+struct StringTable >+ : raw_hash_set<StringPolicy, StringHash, StringEq, std::allocator<int>> { >+ using Base = typename StringTable::raw_hash_set; >+ StringTable() {} >+ using Base::Base; >+}; >+ >+struct IntTable >+ : raw_hash_set<IntPolicy, container_internal::hash_default_hash<int64_t>, >+ std::equal_to<int64_t>, std::allocator<int64_t>> { >+ using Base = typename IntTable::raw_hash_set; >+ IntTable() {} >+ using Base::Base; >+}; >+ >+struct BadFastHash { >+ template <class T> >+ size_t operator()(const T&) const { >+ return 0; >+ } >+}; >+ >+struct BadTable : raw_hash_set<IntPolicy, BadFastHash, std::equal_to<int>, >+ std::allocator<int>> { >+ using Base = typename BadTable::raw_hash_set; >+ BadTable() {} >+ using Base::Base; >+}; >+ >+TEST(Table, EmptyFunctorOptimization) { >+ static_assert(std::is_empty<std::equal_to<absl::string_view>>::value, ""); >+ static_assert(std::is_empty<std::allocator<int>>::value, ""); >+ >+ struct MockTable { >+ void* ctrl; >+ void* slots; >+ size_t size; >+ size_t capacity; >+ size_t growth_left; >+ }; >+ struct StatelessHash { >+ size_t operator()(absl::string_view) const { return 0; } >+ }; >+ struct StatefulHash : StatelessHash { >+ size_t dummy; >+ }; >+ >+ EXPECT_EQ( >+ sizeof(MockTable), >+ sizeof( >+ raw_hash_set<StringPolicy, StatelessHash, >+ std::equal_to<absl::string_view>, std::allocator<int>>)); >+ >+ EXPECT_EQ( >+ sizeof(MockTable) + sizeof(StatefulHash), >+ sizeof( >+ raw_hash_set<StringPolicy, StatefulHash, >+ std::equal_to<absl::string_view>, std::allocator<int>>)); >+} >+ >+TEST(Table, Empty) { >+ IntTable t; >+ EXPECT_EQ(0, t.size()); 
>+ EXPECT_TRUE(t.empty()); >+} >+ >+#ifdef __GNUC__ >+template <class T> >+ABSL_ATTRIBUTE_ALWAYS_INLINE inline void DoNotOptimize(const T& v) { >+ asm volatile("" : : "r,m"(v) : "memory"); >+} >+#endif >+ >+TEST(Table, Prefetch) { >+ IntTable t; >+ t.emplace(1); >+ // Works for both present and absent keys. >+ t.prefetch(1); >+ t.prefetch(2); >+ >+ // Do not run in debug mode, when prefetch is not implemented, or when >+ // sanitizers are enabled. >+#if defined(NDEBUG) && defined(__GNUC__) && !defined(ADDRESS_SANITIZER) && \ >+ !defined(MEMORY_SANITIZER) && !defined(THREAD_SANITIZER) && \ >+ !defined(UNDEFINED_BEHAVIOR_SANITIZER) >+ const auto now = [] { return absl::base_internal::CycleClock::Now(); }; >+ >+ static constexpr int size = 1000000; >+ for (int i = 0; i < size; ++i) t.insert(i); >+ >+ int64_t no_prefetch = 0, prefetch = 0; >+ for (int iter = 0; iter < 10; ++iter) { >+ int64_t time = now(); >+ for (int i = 0; i < size; ++i) { >+ DoNotOptimize(t.find(i)); >+ } >+ no_prefetch += now() - time; >+ >+ time = now(); >+ for (int i = 0; i < size; ++i) { >+ t.prefetch(i + 20); >+ DoNotOptimize(t.find(i)); >+ } >+ prefetch += now() - time; >+ } >+ >+ // no_prefetch is at least 30% slower. 
>+ EXPECT_GE(1.0 * no_prefetch / prefetch, 1.3); >+#endif >+} >+ >+TEST(Table, LookupEmpty) { >+ IntTable t; >+ auto it = t.find(0); >+ EXPECT_TRUE(it == t.end()); >+} >+ >+TEST(Table, Insert1) { >+ IntTable t; >+ EXPECT_TRUE(t.find(0) == t.end()); >+ auto res = t.emplace(0); >+ EXPECT_TRUE(res.second); >+ EXPECT_THAT(*res.first, 0); >+ EXPECT_EQ(1, t.size()); >+ EXPECT_THAT(*t.find(0), 0); >+} >+ >+TEST(Table, Insert2) { >+ IntTable t; >+ EXPECT_TRUE(t.find(0) == t.end()); >+ auto res = t.emplace(0); >+ EXPECT_TRUE(res.second); >+ EXPECT_THAT(*res.first, 0); >+ EXPECT_EQ(1, t.size()); >+ EXPECT_TRUE(t.find(1) == t.end()); >+ res = t.emplace(1); >+ EXPECT_TRUE(res.second); >+ EXPECT_THAT(*res.first, 1); >+ EXPECT_EQ(2, t.size()); >+ EXPECT_THAT(*t.find(0), 0); >+ EXPECT_THAT(*t.find(1), 1); >+} >+ >+TEST(Table, InsertCollision) { >+ BadTable t; >+ EXPECT_TRUE(t.find(1) == t.end()); >+ auto res = t.emplace(1); >+ EXPECT_TRUE(res.second); >+ EXPECT_THAT(*res.first, 1); >+ EXPECT_EQ(1, t.size()); >+ >+ EXPECT_TRUE(t.find(2) == t.end()); >+ res = t.emplace(2); >+ EXPECT_THAT(*res.first, 2); >+ EXPECT_TRUE(res.second); >+ EXPECT_EQ(2, t.size()); >+ >+ EXPECT_THAT(*t.find(1), 1); >+ EXPECT_THAT(*t.find(2), 2); >+} >+ >+// Test that we do not add existent element in case we need to search through >+// many groups with deleted elements >+TEST(Table, InsertCollisionAndFindAfterDelete) { >+ BadTable t; // all elements go to the same group. >+ // Have at least 2 groups with Group::kWidth collisions >+ // plus some extra collisions in the last group. >+ constexpr size_t kNumInserts = Group::kWidth * 2 + 5; >+ for (size_t i = 0; i < kNumInserts; ++i) { >+ auto res = t.emplace(i); >+ EXPECT_TRUE(res.second); >+ EXPECT_THAT(*res.first, i); >+ EXPECT_EQ(i + 1, t.size()); >+ } >+ >+ // Remove elements one by one and check >+ // that we still can find all other elements. 
>+ for (size_t i = 0; i < kNumInserts; ++i) { >+ EXPECT_EQ(1, t.erase(i)) << i; >+ for (size_t j = i + 1; j < kNumInserts; ++j) { >+ EXPECT_THAT(*t.find(j), j); >+ auto res = t.emplace(j); >+ EXPECT_FALSE(res.second) << i << " " << j; >+ EXPECT_THAT(*res.first, j); >+ EXPECT_EQ(kNumInserts - i - 1, t.size()); >+ } >+ } >+ EXPECT_TRUE(t.empty()); >+} >+ >+TEST(Table, LazyEmplace) { >+ StringTable t; >+ bool called = false; >+ auto it = t.lazy_emplace("abc", [&](const StringTable::constructor& f) { >+ called = true; >+ f("abc", "ABC"); >+ }); >+ EXPECT_TRUE(called); >+ EXPECT_THAT(*it, Pair("abc", "ABC")); >+ called = false; >+ it = t.lazy_emplace("abc", [&](const StringTable::constructor& f) { >+ called = true; >+ f("abc", "DEF"); >+ }); >+ EXPECT_FALSE(called); >+ EXPECT_THAT(*it, Pair("abc", "ABC")); >+} >+ >+TEST(Table, ContainsEmpty) { >+ IntTable t; >+ >+ EXPECT_FALSE(t.contains(0)); >+} >+ >+TEST(Table, Contains1) { >+ IntTable t; >+ >+ EXPECT_TRUE(t.insert(0).second); >+ EXPECT_TRUE(t.contains(0)); >+ EXPECT_FALSE(t.contains(1)); >+ >+ EXPECT_EQ(1, t.erase(0)); >+ EXPECT_FALSE(t.contains(0)); >+} >+ >+TEST(Table, Contains2) { >+ IntTable t; >+ >+ EXPECT_TRUE(t.insert(0).second); >+ EXPECT_TRUE(t.contains(0)); >+ EXPECT_FALSE(t.contains(1)); >+ >+ t.clear(); >+ EXPECT_FALSE(t.contains(0)); >+} >+ >+int decompose_constructed; >+struct DecomposeType { >+ DecomposeType(int i) : i(i) { // NOLINT >+ ++decompose_constructed; >+ } >+ >+ explicit DecomposeType(const char* d) : DecomposeType(*d) {} >+ >+ int i; >+}; >+ >+struct DecomposeHash { >+ using is_transparent = void; >+ size_t operator()(DecomposeType a) const { return a.i; } >+ size_t operator()(int a) const { return a; } >+ size_t operator()(const char* a) const { return *a; } >+}; >+ >+struct DecomposeEq { >+ using is_transparent = void; >+ bool operator()(DecomposeType a, DecomposeType b) const { return a.i == b.i; } >+ bool operator()(DecomposeType a, int b) const { return a.i == b; } >+ bool 
operator()(DecomposeType a, const char* b) const { return a.i == *b; } >+}; >+ >+struct DecomposePolicy { >+ using slot_type = DecomposeType; >+ using key_type = DecomposeType; >+ using init_type = DecomposeType; >+ >+ template <typename T> >+ static void construct(void*, DecomposeType* slot, T&& v) { >+ *slot = DecomposeType(std::forward<T>(v)); >+ } >+ static void destroy(void*, DecomposeType*) {} >+ static DecomposeType& element(slot_type* slot) { return *slot; } >+ >+ template <class F, class T> >+ static auto apply(F&& f, const T& x) -> decltype(std::forward<F>(f)(x, x)) { >+ return std::forward<F>(f)(x, x); >+ } >+}; >+ >+template <typename Hash, typename Eq> >+void TestDecompose(bool construct_three) { >+ DecomposeType elem{0}; >+ const int one = 1; >+ const char* three_p = "3"; >+ const auto& three = three_p; >+ >+ raw_hash_set<DecomposePolicy, Hash, Eq, std::allocator<int>> set1; >+ >+ decompose_constructed = 0; >+ int expected_constructed = 0; >+ EXPECT_EQ(expected_constructed, decompose_constructed); >+ set1.insert(elem); >+ EXPECT_EQ(expected_constructed, decompose_constructed); >+ set1.insert(1); >+ EXPECT_EQ(++expected_constructed, decompose_constructed); >+ set1.emplace("3"); >+ EXPECT_EQ(++expected_constructed, decompose_constructed); >+ EXPECT_EQ(expected_constructed, decompose_constructed); >+ >+ { // insert(T&&) >+ set1.insert(1); >+ EXPECT_EQ(expected_constructed, decompose_constructed); >+ } >+ >+ { // insert(const T&) >+ set1.insert(one); >+ EXPECT_EQ(expected_constructed, decompose_constructed); >+ } >+ >+ { // insert(hint, T&&) >+ set1.insert(set1.begin(), 1); >+ EXPECT_EQ(expected_constructed, decompose_constructed); >+ } >+ >+ { // insert(hint, const T&) >+ set1.insert(set1.begin(), one); >+ EXPECT_EQ(expected_constructed, decompose_constructed); >+ } >+ >+ { // emplace(...) 
>+ set1.emplace(1); >+ EXPECT_EQ(expected_constructed, decompose_constructed); >+ set1.emplace("3"); >+ expected_constructed += construct_three; >+ EXPECT_EQ(expected_constructed, decompose_constructed); >+ set1.emplace(one); >+ EXPECT_EQ(expected_constructed, decompose_constructed); >+ set1.emplace(three); >+ expected_constructed += construct_three; >+ EXPECT_EQ(expected_constructed, decompose_constructed); >+ } >+ >+ { // emplace_hint(...) >+ set1.emplace_hint(set1.begin(), 1); >+ EXPECT_EQ(expected_constructed, decompose_constructed); >+ set1.emplace_hint(set1.begin(), "3"); >+ expected_constructed += construct_three; >+ EXPECT_EQ(expected_constructed, decompose_constructed); >+ set1.emplace_hint(set1.begin(), one); >+ EXPECT_EQ(expected_constructed, decompose_constructed); >+ set1.emplace_hint(set1.begin(), three); >+ expected_constructed += construct_three; >+ EXPECT_EQ(expected_constructed, decompose_constructed); >+ } >+} >+ >+TEST(Table, Decompose) { >+ TestDecompose<DecomposeHash, DecomposeEq>(false); >+ >+ struct TransparentHashIntOverload { >+ size_t operator()(DecomposeType a) const { return a.i; } >+ size_t operator()(int a) const { return a; } >+ }; >+ struct TransparentEqIntOverload { >+ bool operator()(DecomposeType a, DecomposeType b) const { >+ return a.i == b.i; >+ } >+ bool operator()(DecomposeType a, int b) const { return a.i == b; } >+ }; >+ TestDecompose<TransparentHashIntOverload, DecomposeEq>(true); >+ TestDecompose<TransparentHashIntOverload, TransparentEqIntOverload>(true); >+ TestDecompose<DecomposeHash, TransparentEqIntOverload>(true); >+} >+ >+// Returns the largest m such that a table with m elements has the same number >+// of buckets as a table with n elements. 
>+size_t MaxDensitySize(size_t n) {
>+  IntTable t;
>+  t.reserve(n);
>+  for (size_t i = 0; i != n; ++i) t.emplace(i);
>+  const size_t c = t.bucket_count();
>+  while (c == t.bucket_count()) t.emplace(n++);
>+  return t.size() - 1;
>+}
>+
>+struct Modulo1000Hash {
>+  size_t operator()(int x) const { return x % 1000; }
>+};
>+
>+struct Modulo1000HashTable
>+    : public raw_hash_set<IntPolicy, Modulo1000Hash, std::equal_to<int>,
>+                          std::allocator<int>> {};
>+
>+// Test that a rehash with no resize happens in case of many deleted slots.
>+TEST(Table, RehashWithNoResize) {
>+  Modulo1000HashTable t;
>+  // Adding same-length (and same-hash) keys
>+  // to have at least kMinFullGroups groups
>+  // with Group::kWidth collisions. Then fill up to MaxDensitySize.
>+  const size_t kMinFullGroups = 7;
>+  std::vector<int> keys;
>+  for (size_t i = 0; i < MaxDensitySize(Group::kWidth * kMinFullGroups); ++i) {
>+    int k = i * 1000;
>+    t.emplace(k);
>+    keys.push_back(k);
>+  }
>+  const size_t capacity = t.capacity();
>+
>+  // Remove elements from all groups except the first and the last one.
>+  // All elements removed from full groups will be marked as kDeleted.
>+  const size_t erase_begin = Group::kWidth / 2;
>+  const size_t erase_end = (t.size() / Group::kWidth - 1) * Group::kWidth;
>+  for (size_t i = erase_begin; i < erase_end; ++i) {
>+    EXPECT_EQ(1, t.erase(keys[i])) << i;
>+  }
>+  keys.erase(keys.begin() + erase_begin, keys.begin() + erase_end);
>+
>+  auto last_key = keys.back();
>+  size_t last_key_num_probes = GetHashtableDebugNumProbes(t, last_key);
>+
>+  // Make sure that we have to make a lot of probes for the last key.
>+  ASSERT_GT(last_key_num_probes, kMinFullGroups);
>+
>+  int x = 1;
>+  // Insert and erase one element at a time, before the in-place rehash happens.
>+  while (last_key_num_probes == GetHashtableDebugNumProbes(t, last_key)) {
>+    t.emplace(x);
>+    ASSERT_EQ(capacity, t.capacity());
>+    // All elements should be there. 
>+ ASSERT_TRUE(t.find(x) != t.end()) << x; >+ for (const auto& k : keys) { >+ ASSERT_TRUE(t.find(k) != t.end()) << k; >+ } >+ t.erase(x); >+ ++x; >+ } >+} >+ >+TEST(Table, InsertEraseStressTest) { >+ IntTable t; >+ const size_t kMinElementCount = 250; >+ std::deque<int> keys; >+ size_t i = 0; >+ for (; i < MaxDensitySize(kMinElementCount); ++i) { >+ t.emplace(i); >+ keys.push_back(i); >+ } >+ const size_t kNumIterations = 1000000; >+ for (; i < kNumIterations; ++i) { >+ ASSERT_EQ(1, t.erase(keys.front())); >+ keys.pop_front(); >+ t.emplace(i); >+ keys.push_back(i); >+ } >+} >+ >+TEST(Table, InsertOverloads) { >+ StringTable t; >+ // These should all trigger the insert(init_type) overload. >+ t.insert({{}, {}}); >+ t.insert({"ABC", {}}); >+ t.insert({"DEF", "!!!"}); >+ >+ EXPECT_THAT(t, UnorderedElementsAre(Pair("", ""), Pair("ABC", ""), >+ Pair("DEF", "!!!"))); >+} >+ >+TEST(Table, LargeTable) { >+ IntTable t; >+ for (int64_t i = 0; i != 100000; ++i) t.emplace(i << 40); >+ for (int64_t i = 0; i != 100000; ++i) ASSERT_EQ(i << 40, *t.find(i << 40)); >+} >+ >+// Timeout if copy is quadratic as it was in Rust. >+TEST(Table, EnsureNonQuadraticAsInRust) { >+ static const size_t kLargeSize = 1 << 15; >+ >+ IntTable t; >+ for (size_t i = 0; i != kLargeSize; ++i) { >+ t.insert(i); >+ } >+ >+ // If this is quadratic, the test will timeout. 
>+ IntTable t2; >+ for (const auto& entry : t) t2.insert(entry); >+} >+ >+TEST(Table, ClearBug) { >+ IntTable t; >+ constexpr size_t capacity = container_internal::Group::kWidth - 1; >+ constexpr size_t max_size = capacity / 2; >+ for (size_t i = 0; i < max_size; ++i) { >+ t.insert(i); >+ } >+ ASSERT_EQ(capacity, t.capacity()); >+ intptr_t original = reinterpret_cast<intptr_t>(&*t.find(2)); >+ t.clear(); >+ ASSERT_EQ(capacity, t.capacity()); >+ for (size_t i = 0; i < max_size; ++i) { >+ t.insert(i); >+ } >+ ASSERT_EQ(capacity, t.capacity()); >+ intptr_t second = reinterpret_cast<intptr_t>(&*t.find(2)); >+ // We are checking that original and second are close enough to each other >+ // that they are probably still in the same group. This is not strictly >+ // guaranteed. >+ EXPECT_LT(std::abs(original - second), >+ capacity * sizeof(IntTable::value_type)); >+} >+ >+TEST(Table, Erase) { >+ IntTable t; >+ EXPECT_TRUE(t.find(0) == t.end()); >+ auto res = t.emplace(0); >+ EXPECT_TRUE(res.second); >+ EXPECT_EQ(1, t.size()); >+ t.erase(res.first); >+ EXPECT_EQ(0, t.size()); >+ EXPECT_TRUE(t.find(0) == t.end()); >+} >+ >+// Collect N bad keys by following algorithm: >+// 1. Create an empty table and reserve it to 2 * N. >+// 2. Insert N random elements. >+// 3. Take first Group::kWidth - 1 to bad_keys array. >+// 4. Clear the table without resize. >+// 5. 
Go to point 2 while N keys not collected >+std::vector<int64_t> CollectBadMergeKeys(size_t N) { >+ static constexpr int kGroupSize = Group::kWidth - 1; >+ >+ auto topk_range = [](size_t b, size_t e, IntTable* t) -> std::vector<int64_t> { >+ for (size_t i = b; i != e; ++i) { >+ t->emplace(i); >+ } >+ std::vector<int64_t> res; >+ res.reserve(kGroupSize); >+ auto it = t->begin(); >+ for (size_t i = b; i != e && i != b + kGroupSize; ++i, ++it) { >+ res.push_back(*it); >+ } >+ return res; >+ }; >+ >+ std::vector<int64_t> bad_keys; >+ bad_keys.reserve(N); >+ IntTable t; >+ t.reserve(N * 2); >+ >+ for (size_t b = 0; bad_keys.size() < N; b += N) { >+ auto keys = topk_range(b, b + N, &t); >+ bad_keys.insert(bad_keys.end(), keys.begin(), keys.end()); >+ t.erase(t.begin(), t.end()); >+ EXPECT_TRUE(t.empty()); >+ } >+ return bad_keys; >+} >+ >+struct ProbeStats { >+ // Number of elements with specific probe length over all tested tables. >+ std::vector<size_t> all_probes_histogram; >+ // Ratios total_probe_length/size for every tested table. >+ std::vector<double> single_table_ratios; >+ >+ friend ProbeStats operator+(const ProbeStats& a, const ProbeStats& b) { >+ ProbeStats res = a; >+ res.all_probes_histogram.resize(std::max(res.all_probes_histogram.size(), >+ b.all_probes_histogram.size())); >+ std::transform(b.all_probes_histogram.begin(), b.all_probes_histogram.end(), >+ res.all_probes_histogram.begin(), >+ res.all_probes_histogram.begin(), std::plus<size_t>()); >+ res.single_table_ratios.insert(res.single_table_ratios.end(), >+ b.single_table_ratios.begin(), >+ b.single_table_ratios.end()); >+ return res; >+ } >+ >+ // Average ratio total_probe_length/size over tables. >+ double AvgRatio() const { >+ return std::accumulate(single_table_ratios.begin(), >+ single_table_ratios.end(), 0.0) / >+ single_table_ratios.size(); >+ } >+ >+ // Maximum ratio total_probe_length/size over tables. 
>+ double MaxRatio() const { >+ return *std::max_element(single_table_ratios.begin(), >+ single_table_ratios.end()); >+ } >+ >+ // Percentile ratio total_probe_length/size over tables. >+ double PercentileRatio(double Percentile = 0.95) const { >+ auto r = single_table_ratios; >+ auto mid = r.begin() + static_cast<size_t>(r.size() * Percentile); >+ if (mid != r.end()) { >+ std::nth_element(r.begin(), mid, r.end()); >+ return *mid; >+ } else { >+ return MaxRatio(); >+ } >+ } >+ >+ // Maximum probe length over all elements and all tables. >+ size_t MaxProbe() const { return all_probes_histogram.size(); } >+ >+ // Fraction of elements with specified probe length. >+ std::vector<double> ProbeNormalizedHistogram() const { >+ double total_elements = std::accumulate(all_probes_histogram.begin(), >+ all_probes_histogram.end(), 0ull); >+ std::vector<double> res; >+ for (size_t p : all_probes_histogram) { >+ res.push_back(p / total_elements); >+ } >+ return res; >+ } >+ >+ size_t PercentileProbe(double Percentile = 0.99) const { >+ size_t idx = 0; >+ for (double p : ProbeNormalizedHistogram()) { >+ if (Percentile > p) { >+ Percentile -= p; >+ ++idx; >+ } else { >+ return idx; >+ } >+ } >+ return idx; >+ } >+ >+ friend std::ostream& operator<<(std::ostream& out, const ProbeStats& s) { >+ out << "{AvgRatio:" << s.AvgRatio() << ", MaxRatio:" << s.MaxRatio() >+ << ", PercentileRatio:" << s.PercentileRatio() >+ << ", MaxProbe:" << s.MaxProbe() << ", Probes=["; >+ for (double p : s.ProbeNormalizedHistogram()) { >+ out << p << ","; >+ } >+ out << "]}"; >+ >+ return out; >+ } >+}; >+ >+struct ExpectedStats { >+ double avg_ratio; >+ double max_ratio; >+ std::vector<std::pair<double, double>> pecentile_ratios; >+ std::vector<std::pair<double, double>> pecentile_probes; >+ >+ friend std::ostream& operator<<(std::ostream& out, const ExpectedStats& s) { >+ out << "{AvgRatio:" << s.avg_ratio << ", MaxRatio:" << s.max_ratio >+ << ", PercentileRatios: ["; >+ for (auto el : 
s.pecentile_ratios) { >+ out << el.first << ":" << el.second << ", "; >+ } >+ out << "], PercentileProbes: ["; >+ for (auto el : s.pecentile_probes) { >+ out << el.first << ":" << el.second << ", "; >+ } >+ out << "]}"; >+ >+ return out; >+ } >+}; >+ >+void VerifyStats(size_t size, const ExpectedStats& exp, >+ const ProbeStats& stats) { >+ EXPECT_LT(stats.AvgRatio(), exp.avg_ratio) << size << " " << stats; >+ EXPECT_LT(stats.MaxRatio(), exp.max_ratio) << size << " " << stats; >+ for (auto pr : exp.pecentile_ratios) { >+ EXPECT_LE(stats.PercentileRatio(pr.first), pr.second) >+ << size << " " << pr.first << " " << stats; >+ } >+ >+ for (auto pr : exp.pecentile_probes) { >+ EXPECT_LE(stats.PercentileProbe(pr.first), pr.second) >+ << size << " " << pr.first << " " << stats; >+ } >+} >+ >+using ProbeStatsPerSize = std::map<size_t, ProbeStats>; >+ >+// Collect total ProbeStats on num_iters iterations of the following algorithm: >+// 1. Create new table and reserve it to keys.size() * 2 >+// 2. Insert all keys xored with seed >+// 3. Collect ProbeStats from final table. 
>+ProbeStats CollectProbeStatsOnKeysXoredWithSeed(const std::vector<int64_t>& keys, >+ size_t num_iters) { >+ const size_t reserve_size = keys.size() * 2; >+ >+ ProbeStats stats; >+ >+ int64_t seed = 0x71b1a19b907d6e33; >+ while (num_iters--) { >+ seed = static_cast<int64_t>(static_cast<uint64_t>(seed) * 17 + 13); >+ IntTable t1; >+ t1.reserve(reserve_size); >+ for (const auto& key : keys) { >+ t1.emplace(key ^ seed); >+ } >+ >+ auto probe_histogram = GetHashtableDebugNumProbesHistogram(t1); >+ stats.all_probes_histogram.resize( >+ std::max(stats.all_probes_histogram.size(), probe_histogram.size())); >+ std::transform(probe_histogram.begin(), probe_histogram.end(), >+ stats.all_probes_histogram.begin(), >+ stats.all_probes_histogram.begin(), std::plus<size_t>()); >+ >+ size_t total_probe_seq_length = 0; >+ for (size_t i = 0; i < probe_histogram.size(); ++i) { >+ total_probe_seq_length += i * probe_histogram[i]; >+ } >+ stats.single_table_ratios.push_back(total_probe_seq_length * 1.0 / >+ keys.size()); >+ t1.erase(t1.begin(), t1.end()); >+ } >+ return stats; >+} >+ >+ExpectedStats XorSeedExpectedStats() { >+ constexpr bool kRandomizesInserts = >+#if NDEBUG >+ false; >+#else // NDEBUG >+ true; >+#endif // NDEBUG >+ >+ // The effective load factor is larger in non-opt mode because we insert >+ // elements out of order. 
>+ switch (container_internal::Group::kWidth) { >+ case 8: >+ if (kRandomizesInserts) { >+ return {0.05, >+ 1.0, >+ {{0.95, 0.5}}, >+ {{0.95, 0}, {0.99, 2}, {0.999, 4}, {0.9999, 10}}}; >+ } else { >+ return {0.05, >+ 2.0, >+ {{0.95, 0.1}}, >+ {{0.95, 0}, {0.99, 2}, {0.999, 4}, {0.9999, 10}}}; >+ } >+ break; >+ case 16: >+ if (kRandomizesInserts) { >+ return {0.1, >+ 1.0, >+ {{0.95, 0.1}}, >+ {{0.95, 0}, {0.99, 1}, {0.999, 8}, {0.9999, 15}}}; >+ } else { >+ return {0.05, >+ 1.0, >+ {{0.95, 0.05}}, >+ {{0.95, 0}, {0.99, 1}, {0.999, 4}, {0.9999, 10}}}; >+ } >+ break; >+ default: >+ ABSL_RAW_LOG(FATAL, "%s", "Unknown Group width"); >+ } >+ return {}; >+} >+TEST(Table, DISABLED_EnsureNonQuadraticTopNXorSeedByProbeSeqLength) { >+ ProbeStatsPerSize stats; >+ std::vector<size_t> sizes = {Group::kWidth << 5, Group::kWidth << 10}; >+ for (size_t size : sizes) { >+ stats[size] = >+ CollectProbeStatsOnKeysXoredWithSeed(CollectBadMergeKeys(size), 200); >+ } >+ auto expected = XorSeedExpectedStats(); >+ for (size_t size : sizes) { >+ auto& stat = stats[size]; >+ VerifyStats(size, expected, stat); >+ } >+} >+ >+// Collect total ProbeStats on num_iters iterations of the following algorithm: >+// 1. Create new table >+// 2. Select 10% of keys and insert 10 elements key * 17 + j * 13 >+// 3. 
Collect ProbeStats from final table >+ProbeStats CollectProbeStatsOnLinearlyTransformedKeys( >+ const std::vector<int64_t>& keys, size_t num_iters) { >+ ProbeStats stats; >+ >+ std::random_device rd; >+ std::mt19937 rng(rd()); >+ auto linear_transform = [](size_t x, size_t y) { return x * 17 + y * 13; }; >+ std::uniform_int_distribution<size_t> dist(0, keys.size()-1); >+ while (num_iters--) { >+ IntTable t1; >+ size_t num_keys = keys.size() / 10; >+ size_t start = dist(rng); >+ for (size_t i = 0; i != num_keys; ++i) { >+ for (size_t j = 0; j != 10; ++j) { >+ t1.emplace(linear_transform(keys[(i + start) % keys.size()], j)); >+ } >+ } >+ >+ auto probe_histogram = GetHashtableDebugNumProbesHistogram(t1); >+ stats.all_probes_histogram.resize( >+ std::max(stats.all_probes_histogram.size(), probe_histogram.size())); >+ std::transform(probe_histogram.begin(), probe_histogram.end(), >+ stats.all_probes_histogram.begin(), >+ stats.all_probes_histogram.begin(), std::plus<size_t>()); >+ >+ size_t total_probe_seq_length = 0; >+ for (size_t i = 0; i < probe_histogram.size(); ++i) { >+ total_probe_seq_length += i * probe_histogram[i]; >+ } >+ stats.single_table_ratios.push_back(total_probe_seq_length * 1.0 / >+ t1.size()); >+ t1.erase(t1.begin(), t1.end()); >+ } >+ return stats; >+} >+ >+ExpectedStats LinearTransformExpectedStats() { >+ constexpr bool kRandomizesInserts = >+#if NDEBUG >+ false; >+#else // NDEBUG >+ true; >+#endif // NDEBUG >+ >+ // The effective load factor is larger in non-opt mode because we insert >+ // elements out of order. 
>+ switch (container_internal::Group::kWidth) { >+ case 8: >+ if (kRandomizesInserts) { >+ return {0.1, >+ 0.5, >+ {{0.95, 0.3}}, >+ {{0.95, 0}, {0.99, 1}, {0.999, 8}, {0.9999, 15}}}; >+ } else { >+ return {0.15, >+ 0.5, >+ {{0.95, 0.3}}, >+ {{0.95, 0}, {0.99, 3}, {0.999, 15}, {0.9999, 25}}}; >+ } >+ break; >+ case 16: >+ if (kRandomizesInserts) { >+ return {0.1, >+ 0.4, >+ {{0.95, 0.3}}, >+ {{0.95, 0}, {0.99, 1}, {0.999, 8}, {0.9999, 15}}}; >+ } else { >+ return {0.05, >+ 0.2, >+ {{0.95, 0.1}}, >+ {{0.95, 0}, {0.99, 1}, {0.999, 6}, {0.9999, 10}}}; >+ } >+ break; >+ default: >+ ABSL_RAW_LOG(FATAL, "%s", "Unknown Group width"); >+ } >+ return {}; >+} >+TEST(Table, DISABLED_EnsureNonQuadraticTopNLinearTransformByProbeSeqLength) { >+ ProbeStatsPerSize stats; >+ std::vector<size_t> sizes = {Group::kWidth << 5, Group::kWidth << 10}; >+ for (size_t size : sizes) { >+ stats[size] = CollectProbeStatsOnLinearlyTransformedKeys( >+ CollectBadMergeKeys(size), 300); >+ } >+ auto expected = LinearTransformExpectedStats(); >+ for (size_t size : sizes) { >+ auto& stat = stats[size]; >+ VerifyStats(size, expected, stat); >+ } >+} >+ >+TEST(Table, EraseCollision) { >+ BadTable t; >+ >+ // 1 2 3 >+ t.emplace(1); >+ t.emplace(2); >+ t.emplace(3); >+ EXPECT_THAT(*t.find(1), 1); >+ EXPECT_THAT(*t.find(2), 2); >+ EXPECT_THAT(*t.find(3), 3); >+ EXPECT_EQ(3, t.size()); >+ >+ // 1 DELETED 3 >+ t.erase(t.find(2)); >+ EXPECT_THAT(*t.find(1), 1); >+ EXPECT_TRUE(t.find(2) == t.end()); >+ EXPECT_THAT(*t.find(3), 3); >+ EXPECT_EQ(2, t.size()); >+ >+ // DELETED DELETED 3 >+ t.erase(t.find(1)); >+ EXPECT_TRUE(t.find(1) == t.end()); >+ EXPECT_TRUE(t.find(2) == t.end()); >+ EXPECT_THAT(*t.find(3), 3); >+ EXPECT_EQ(1, t.size()); >+ >+ // DELETED DELETED DELETED >+ t.erase(t.find(3)); >+ EXPECT_TRUE(t.find(1) == t.end()); >+ EXPECT_TRUE(t.find(2) == t.end()); >+ EXPECT_TRUE(t.find(3) == t.end()); >+ EXPECT_EQ(0, t.size()); >+} >+ >+TEST(Table, EraseInsertProbing) { >+ BadTable t(100); >+ >+ // 1 2 3 4 
>+ t.emplace(1); >+ t.emplace(2); >+ t.emplace(3); >+ t.emplace(4); >+ >+ // 1 DELETED 3 DELETED >+ t.erase(t.find(2)); >+ t.erase(t.find(4)); >+ >+ // 1 10 3 11 12 >+ t.emplace(10); >+ t.emplace(11); >+ t.emplace(12); >+ >+ EXPECT_EQ(5, t.size()); >+ EXPECT_THAT(t, UnorderedElementsAre(1, 10, 3, 11, 12)); >+} >+ >+TEST(Table, Clear) { >+ IntTable t; >+ EXPECT_TRUE(t.find(0) == t.end()); >+ t.clear(); >+ EXPECT_TRUE(t.find(0) == t.end()); >+ auto res = t.emplace(0); >+ EXPECT_TRUE(res.second); >+ EXPECT_EQ(1, t.size()); >+ t.clear(); >+ EXPECT_EQ(0, t.size()); >+ EXPECT_TRUE(t.find(0) == t.end()); >+} >+ >+TEST(Table, Swap) { >+ IntTable t; >+ EXPECT_TRUE(t.find(0) == t.end()); >+ auto res = t.emplace(0); >+ EXPECT_TRUE(res.second); >+ EXPECT_EQ(1, t.size()); >+ IntTable u; >+ t.swap(u); >+ EXPECT_EQ(0, t.size()); >+ EXPECT_EQ(1, u.size()); >+ EXPECT_TRUE(t.find(0) == t.end()); >+ EXPECT_THAT(*u.find(0), 0); >+} >+ >+TEST(Table, Rehash) { >+ IntTable t; >+ EXPECT_TRUE(t.find(0) == t.end()); >+ t.emplace(0); >+ t.emplace(1); >+ EXPECT_EQ(2, t.size()); >+ t.rehash(128); >+ EXPECT_EQ(2, t.size()); >+ EXPECT_THAT(*t.find(0), 0); >+ EXPECT_THAT(*t.find(1), 1); >+} >+ >+TEST(Table, RehashDoesNotRehashWhenNotNecessary) { >+ IntTable t; >+ t.emplace(0); >+ t.emplace(1); >+ auto* p = &*t.find(0); >+ t.rehash(1); >+ EXPECT_EQ(p, &*t.find(0)); >+} >+ >+TEST(Table, RehashZeroDoesNotAllocateOnEmptyTable) { >+ IntTable t; >+ t.rehash(0); >+ EXPECT_EQ(0, t.bucket_count()); >+} >+ >+TEST(Table, RehashZeroDeallocatesEmptyTable) { >+ IntTable t; >+ t.emplace(0); >+ t.clear(); >+ EXPECT_NE(0, t.bucket_count()); >+ t.rehash(0); >+ EXPECT_EQ(0, t.bucket_count()); >+} >+ >+TEST(Table, RehashZeroForcesRehash) { >+ IntTable t; >+ t.emplace(0); >+ t.emplace(1); >+ auto* p = &*t.find(0); >+ t.rehash(0); >+ EXPECT_NE(p, &*t.find(0)); >+} >+ >+TEST(Table, ConstructFromInitList) { >+ using P = std::pair<std::string, std::string>; >+ struct Q { >+ operator P() const { return {}; } >+ }; >+ 
StringTable t = {P(), Q(), {}, {{}, {}}}; >+} >+ >+TEST(Table, CopyConstruct) { >+ IntTable t; >+ t.max_load_factor(.321f); >+ t.emplace(0); >+ EXPECT_EQ(1, t.size()); >+ { >+ IntTable u(t); >+ EXPECT_EQ(1, u.size()); >+ EXPECT_EQ(t.max_load_factor(), u.max_load_factor()); >+ EXPECT_THAT(*u.find(0), 0); >+ } >+ { >+ IntTable u{t}; >+ EXPECT_EQ(1, u.size()); >+ EXPECT_EQ(t.max_load_factor(), u.max_load_factor()); >+ EXPECT_THAT(*u.find(0), 0); >+ } >+ { >+ IntTable u = t; >+ EXPECT_EQ(1, u.size()); >+ EXPECT_EQ(t.max_load_factor(), u.max_load_factor()); >+ EXPECT_THAT(*u.find(0), 0); >+ } >+} >+ >+TEST(Table, CopyConstructWithAlloc) { >+ StringTable t; >+ t.max_load_factor(.321f); >+ t.emplace("a", "b"); >+ EXPECT_EQ(1, t.size()); >+ StringTable u(t, Alloc<std::pair<std::string, std::string>>()); >+ EXPECT_EQ(1, u.size()); >+ EXPECT_EQ(t.max_load_factor(), u.max_load_factor()); >+ EXPECT_THAT(*u.find("a"), Pair("a", "b")); >+} >+ >+struct ExplicitAllocIntTable >+ : raw_hash_set<IntPolicy, container_internal::hash_default_hash<int64_t>, >+ std::equal_to<int64_t>, Alloc<int64_t>> { >+ ExplicitAllocIntTable() {} >+}; >+ >+TEST(Table, AllocWithExplicitCtor) { >+ ExplicitAllocIntTable t; >+ EXPECT_EQ(0, t.size()); >+} >+ >+TEST(Table, MoveConstruct) { >+ { >+ StringTable t; >+ t.max_load_factor(.321f); >+ const float lf = t.max_load_factor(); >+ t.emplace("a", "b"); >+ EXPECT_EQ(1, t.size()); >+ >+ StringTable u(std::move(t)); >+ EXPECT_EQ(1, u.size()); >+ EXPECT_EQ(lf, u.max_load_factor()); >+ EXPECT_THAT(*u.find("a"), Pair("a", "b")); >+ } >+ { >+ StringTable t; >+ t.max_load_factor(.321f); >+ const float lf = t.max_load_factor(); >+ t.emplace("a", "b"); >+ EXPECT_EQ(1, t.size()); >+ >+ StringTable u{std::move(t)}; >+ EXPECT_EQ(1, u.size()); >+ EXPECT_EQ(lf, u.max_load_factor()); >+ EXPECT_THAT(*u.find("a"), Pair("a", "b")); >+ } >+ { >+ StringTable t; >+ t.max_load_factor(.321f); >+ const float lf = t.max_load_factor(); >+ t.emplace("a", "b"); >+ EXPECT_EQ(1, 
t.size()); >+ >+ StringTable u = std::move(t); >+ EXPECT_EQ(1, u.size()); >+ EXPECT_EQ(lf, u.max_load_factor()); >+ EXPECT_THAT(*u.find("a"), Pair("a", "b")); >+ } >+} >+ >+TEST(Table, MoveConstructWithAlloc) { >+ StringTable t; >+ t.max_load_factor(.321f); >+ const float lf = t.max_load_factor(); >+ t.emplace("a", "b"); >+ EXPECT_EQ(1, t.size()); >+ StringTable u(std::move(t), Alloc<std::pair<std::string, std::string>>()); >+ EXPECT_EQ(1, u.size()); >+ EXPECT_EQ(lf, u.max_load_factor()); >+ EXPECT_THAT(*u.find("a"), Pair("a", "b")); >+} >+ >+TEST(Table, CopyAssign) { >+ StringTable t; >+ t.max_load_factor(.321f); >+ t.emplace("a", "b"); >+ EXPECT_EQ(1, t.size()); >+ StringTable u; >+ u = t; >+ EXPECT_EQ(1, u.size()); >+ EXPECT_EQ(t.max_load_factor(), u.max_load_factor()); >+ EXPECT_THAT(*u.find("a"), Pair("a", "b")); >+} >+ >+TEST(Table, CopySelfAssign) { >+ StringTable t; >+ t.max_load_factor(.321f); >+ const float lf = t.max_load_factor(); >+ t.emplace("a", "b"); >+ EXPECT_EQ(1, t.size()); >+ t = *&t; >+ EXPECT_EQ(1, t.size()); >+ EXPECT_EQ(lf, t.max_load_factor()); >+ EXPECT_THAT(*t.find("a"), Pair("a", "b")); >+} >+ >+TEST(Table, MoveAssign) { >+ StringTable t; >+ t.max_load_factor(.321f); >+ const float lf = t.max_load_factor(); >+ t.emplace("a", "b"); >+ EXPECT_EQ(1, t.size()); >+ StringTable u; >+ u = std::move(t); >+ EXPECT_EQ(1, u.size()); >+ EXPECT_EQ(lf, u.max_load_factor()); >+ EXPECT_THAT(*u.find("a"), Pair("a", "b")); >+} >+ >+TEST(Table, Equality) { >+ StringTable t; >+ std::vector<std::pair<std::string, std::string>> v = {{"a", "b"}, {"aa", "bb"}}; >+ t.insert(std::begin(v), std::end(v)); >+ StringTable u = t; >+ EXPECT_EQ(u, t); >+} >+ >+TEST(Table, Equality2) { >+ StringTable t; >+ std::vector<std::pair<std::string, std::string>> v1 = {{"a", "b"}, {"aa", "bb"}}; >+ t.insert(std::begin(v1), std::end(v1)); >+ StringTable u; >+ std::vector<std::pair<std::string, std::string>> v2 = {{"a", "a"}, {"aa", "aa"}}; >+ u.insert(std::begin(v2), 
std::end(v2)); >+ EXPECT_NE(u, t); >+} >+ >+TEST(Table, Equality3) { >+ StringTable t; >+ std::vector<std::pair<std::string, std::string>> v1 = {{"b", "b"}, {"bb", "bb"}}; >+ t.insert(std::begin(v1), std::end(v1)); >+ StringTable u; >+ std::vector<std::pair<std::string, std::string>> v2 = {{"a", "a"}, {"aa", "aa"}}; >+ u.insert(std::begin(v2), std::end(v2)); >+ EXPECT_NE(u, t); >+} >+ >+TEST(Table, NumDeletedRegression) { >+ IntTable t; >+ t.emplace(0); >+ t.erase(t.find(0)); >+ // construct over a deleted slot. >+ t.emplace(0); >+ t.clear(); >+} >+ >+TEST(Table, FindFullDeletedRegression) { >+ IntTable t; >+ for (int i = 0; i < 1000; ++i) { >+ t.emplace(i); >+ t.erase(t.find(i)); >+ } >+ EXPECT_EQ(0, t.size()); >+} >+ >+TEST(Table, ReplacingDeletedSlotDoesNotRehash) { >+ size_t n; >+ { >+ // Compute n such that n is the maximum number of elements before rehash. >+ IntTable t; >+ t.emplace(0); >+ size_t c = t.bucket_count(); >+ for (n = 1; c == t.bucket_count(); ++n) t.emplace(n); >+ --n; >+ } >+ IntTable t; >+ t.rehash(n); >+ const size_t c = t.bucket_count(); >+ for (size_t i = 0; i != n; ++i) t.emplace(i); >+ EXPECT_EQ(c, t.bucket_count()) << "rehashing threshold = " << n; >+ t.erase(0); >+ t.emplace(0); >+ EXPECT_EQ(c, t.bucket_count()) << "rehashing threshold = " << n; >+} >+ >+TEST(Table, NoThrowMoveConstruct) { >+ ASSERT_TRUE( >+ std::is_nothrow_copy_constructible<absl::Hash<absl::string_view>>::value); >+ ASSERT_TRUE(std::is_nothrow_copy_constructible< >+ std::equal_to<absl::string_view>>::value); >+ ASSERT_TRUE(std::is_nothrow_copy_constructible<std::allocator<int>>::value); >+ EXPECT_TRUE(std::is_nothrow_move_constructible<StringTable>::value); >+} >+ >+TEST(Table, NoThrowMoveAssign) { >+ ASSERT_TRUE( >+ std::is_nothrow_move_assignable<absl::Hash<absl::string_view>>::value); >+ ASSERT_TRUE( >+ std::is_nothrow_move_assignable<std::equal_to<absl::string_view>>::value); >+ ASSERT_TRUE(std::is_nothrow_move_assignable<std::allocator<int>>::value); >+ 
ASSERT_TRUE( >+ absl::allocator_traits<std::allocator<int>>::is_always_equal::value); >+ EXPECT_TRUE(std::is_nothrow_move_assignable<StringTable>::value); >+} >+ >+TEST(Table, NoThrowSwappable) { >+ ASSERT_TRUE( >+ container_internal::IsNoThrowSwappable<absl::Hash<absl::string_view>>()); >+ ASSERT_TRUE(container_internal::IsNoThrowSwappable< >+ std::equal_to<absl::string_view>>()); >+ ASSERT_TRUE(container_internal::IsNoThrowSwappable<std::allocator<int>>()); >+ EXPECT_TRUE(container_internal::IsNoThrowSwappable<StringTable>()); >+} >+ >+TEST(Table, HeterogeneousLookup) { >+ struct Hash { >+ size_t operator()(int64_t i) const { return i; } >+ size_t operator()(double i) const { >+ ADD_FAILURE(); >+ return i; >+ } >+ }; >+ struct Eq { >+ bool operator()(int64_t a, int64_t b) const { return a == b; } >+ bool operator()(double a, int64_t b) const { >+ ADD_FAILURE(); >+ return a == b; >+ } >+ bool operator()(int64_t a, double b) const { >+ ADD_FAILURE(); >+ return a == b; >+ } >+ bool operator()(double a, double b) const { >+ ADD_FAILURE(); >+ return a == b; >+ } >+ }; >+ >+ struct THash { >+ using is_transparent = void; >+ size_t operator()(int64_t i) const { return i; } >+ size_t operator()(double i) const { return i; } >+ }; >+ struct TEq { >+ using is_transparent = void; >+ bool operator()(int64_t a, int64_t b) const { return a == b; } >+ bool operator()(double a, int64_t b) const { return a == b; } >+ bool operator()(int64_t a, double b) const { return a == b; } >+ bool operator()(double a, double b) const { return a == b; } >+ }; >+ >+ raw_hash_set<IntPolicy, Hash, Eq, Alloc<int64_t>> s{0, 1, 2}; >+ // It will convert to int64_t before the query. >+ EXPECT_EQ(1, *s.find(double{1.1})); >+ >+ raw_hash_set<IntPolicy, THash, TEq, Alloc<int64_t>> ts{0, 1, 2}; >+ // It will try to use the double, and fail to find the object. 
>+ EXPECT_TRUE(ts.find(1.1) == ts.end()); >+} >+ >+template <class Table> >+using CallFind = decltype(std::declval<Table&>().find(17)); >+ >+template <class Table> >+using CallErase = decltype(std::declval<Table&>().erase(17)); >+ >+template <class Table> >+using CallExtract = decltype(std::declval<Table&>().extract(17)); >+ >+template <class Table> >+using CallPrefetch = decltype(std::declval<Table&>().prefetch(17)); >+ >+template <class Table> >+using CallCount = decltype(std::declval<Table&>().count(17)); >+ >+template <template <typename> class C, class Table, class = void> >+struct VerifyResultOf : std::false_type {}; >+ >+template <template <typename> class C, class Table> >+struct VerifyResultOf<C, Table, absl::void_t<C<Table>>> : std::true_type {}; >+ >+TEST(Table, HeterogeneousLookupOverloads) { >+ using NonTransparentTable = >+ raw_hash_set<StringPolicy, absl::Hash<absl::string_view>, >+ std::equal_to<absl::string_view>, std::allocator<int>>; >+ >+ EXPECT_FALSE((VerifyResultOf<CallFind, NonTransparentTable>())); >+ EXPECT_FALSE((VerifyResultOf<CallErase, NonTransparentTable>())); >+ EXPECT_FALSE((VerifyResultOf<CallExtract, NonTransparentTable>())); >+ EXPECT_FALSE((VerifyResultOf<CallPrefetch, NonTransparentTable>())); >+ EXPECT_FALSE((VerifyResultOf<CallCount, NonTransparentTable>())); >+ >+ using TransparentTable = raw_hash_set< >+ StringPolicy, >+ absl::container_internal::hash_default_hash<absl::string_view>, >+ absl::container_internal::hash_default_eq<absl::string_view>, >+ std::allocator<int>>; >+ >+ EXPECT_TRUE((VerifyResultOf<CallFind, TransparentTable>())); >+ EXPECT_TRUE((VerifyResultOf<CallErase, TransparentTable>())); >+ EXPECT_TRUE((VerifyResultOf<CallExtract, TransparentTable>())); >+ EXPECT_TRUE((VerifyResultOf<CallPrefetch, TransparentTable>())); >+ EXPECT_TRUE((VerifyResultOf<CallCount, TransparentTable>())); >+} >+ >+// TODO(alkis): Expand iterator tests. 
>+TEST(Iterator, IsDefaultConstructible) { >+ StringTable::iterator i; >+ EXPECT_TRUE(i == StringTable::iterator()); >+} >+ >+TEST(ConstIterator, IsDefaultConstructible) { >+ StringTable::const_iterator i; >+ EXPECT_TRUE(i == StringTable::const_iterator()); >+} >+ >+TEST(Iterator, ConvertsToConstIterator) { >+ StringTable::iterator i; >+ EXPECT_TRUE(i == StringTable::const_iterator()); >+} >+ >+TEST(Iterator, Iterates) { >+ IntTable t; >+ for (size_t i = 3; i != 6; ++i) EXPECT_TRUE(t.emplace(i).second); >+ EXPECT_THAT(t, UnorderedElementsAre(3, 4, 5)); >+} >+ >+TEST(Table, Merge) { >+ StringTable t1, t2; >+ t1.emplace("0", "-0"); >+ t1.emplace("1", "-1"); >+ t2.emplace("0", "~0"); >+ t2.emplace("2", "~2"); >+ >+ EXPECT_THAT(t1, UnorderedElementsAre(Pair("0", "-0"), Pair("1", "-1"))); >+ EXPECT_THAT(t2, UnorderedElementsAre(Pair("0", "~0"), Pair("2", "~2"))); >+ >+ t1.merge(t2); >+ EXPECT_THAT(t1, UnorderedElementsAre(Pair("0", "-0"), Pair("1", "-1"), >+ Pair("2", "~2"))); >+ EXPECT_THAT(t2, UnorderedElementsAre(Pair("0", "~0"))); >+} >+ >+TEST(Nodes, EmptyNodeType) { >+ using node_type = StringTable::node_type; >+ node_type n; >+ EXPECT_FALSE(n); >+ EXPECT_TRUE(n.empty()); >+ >+ EXPECT_TRUE((std::is_same<node_type::allocator_type, >+ StringTable::allocator_type>::value)); >+} >+ >+TEST(Nodes, ExtractInsert) { >+ constexpr char k0[] = "Very long std::string zero."; >+ constexpr char k1[] = "Very long std::string one."; >+ constexpr char k2[] = "Very long std::string two."; >+ StringTable t = {{k0, ""}, {k1, ""}, {k2, ""}}; >+ EXPECT_THAT(t, >+ UnorderedElementsAre(Pair(k0, ""), Pair(k1, ""), Pair(k2, ""))); >+ >+ auto node = t.extract(k0); >+ EXPECT_THAT(t, UnorderedElementsAre(Pair(k1, ""), Pair(k2, ""))); >+ EXPECT_TRUE(node); >+ EXPECT_FALSE(node.empty()); >+ >+ StringTable t2; >+ auto res = t2.insert(std::move(node)); >+ EXPECT_TRUE(res.inserted); >+ EXPECT_THAT(*res.position, Pair(k0, "")); >+ EXPECT_FALSE(res.node); >+ EXPECT_THAT(t2, 
UnorderedElementsAre(Pair(k0, ""))); >+ >+ // Not there. >+ EXPECT_THAT(t, UnorderedElementsAre(Pair(k1, ""), Pair(k2, ""))); >+ node = t.extract("Not there!"); >+ EXPECT_THAT(t, UnorderedElementsAre(Pair(k1, ""), Pair(k2, ""))); >+ EXPECT_FALSE(node); >+ >+ // Inserting nothing. >+ res = t2.insert(std::move(node)); >+ EXPECT_FALSE(res.inserted); >+ EXPECT_EQ(res.position, t2.end()); >+ EXPECT_FALSE(res.node); >+ EXPECT_THAT(t2, UnorderedElementsAre(Pair(k0, ""))); >+ >+ t.emplace(k0, "1"); >+ node = t.extract(k0); >+ >+ // Insert duplicate. >+ res = t2.insert(std::move(node)); >+ EXPECT_FALSE(res.inserted); >+ EXPECT_THAT(*res.position, Pair(k0, "")); >+ EXPECT_TRUE(res.node); >+ EXPECT_FALSE(node); >+} >+ >+StringTable MakeSimpleTable(size_t size) { >+ StringTable t; >+ for (size_t i = 0; i < size; ++i) t.emplace(std::string(1, 'A' + i), ""); >+ return t; >+} >+ >+std::string OrderOfIteration(const StringTable& t) { >+ std::string order; >+ for (auto& p : t) order += p.first; >+ return order; >+} >+ >+TEST(Table, IterationOrderChangesByInstance) { >+ // Needs to be more than kWidth elements to be able to affect order. >+ const StringTable reference = MakeSimpleTable(20); >+ >+ // Since order is non-deterministic we can't just try once and verify. >+ // We'll try until we find that order changed. It should not take many tries >+ // for that. >+ // Important: we have to keep the old tables around. Otherwise tcmalloc will >+ // just give us the same blocks and we would be doing the same order again. >+ std::vector<StringTable> garbage; >+ for (int i = 0; i < 10; ++i) { >+ auto trial = MakeSimpleTable(20); >+ if (OrderOfIteration(trial) != OrderOfIteration(reference)) { >+ // We are done. >+ return; >+ } >+ garbage.push_back(std::move(trial)); >+ } >+ FAIL(); >+} >+ >+TEST(Table, IterationOrderChangesOnRehash) { >+ // Since order is non-deterministic we can't just try once and verify. >+ // We'll try until we find that order changed. 
It should not take many tries >+ // for that. >+ // Important: we have to keep the old tables around. Otherwise tcmalloc will >+ // just give us the same blocks and we would be doing the same order again. >+ std::vector<StringTable> garbage; >+ for (int i = 0; i < 10; ++i) { >+ // Needs to be more than kWidth elements to be able to affect order. >+ StringTable t = MakeSimpleTable(20); >+ const std::string reference = OrderOfIteration(t); >+ // Force rehash to the same size. >+ t.rehash(0); >+ std::string trial = OrderOfIteration(t); >+ if (trial != reference) { >+ // We are done. >+ return; >+ } >+ garbage.push_back(std::move(t)); >+ } >+ FAIL(); >+} >+ >+TEST(Table, IterationOrderChangesForSmallTables) { >+ // Since order is non-deterministic we can't just try once and verify. >+ // We'll try until we find that order changed. >+ // Important: we have to keep the old tables around. Otherwise tcmalloc will >+ // just give us the same blocks and we would be doing the same order again. >+ StringTable reference_table = MakeSimpleTable(5); >+ const std::string reference = OrderOfIteration(reference_table); >+ std::vector<StringTable> garbage; >+ for (int i = 0; i < 50; ++i) { >+ StringTable t = MakeSimpleTable(5); >+ std::string trial = OrderOfIteration(t); >+ if (trial != reference) { >+ // We are done. >+ return; >+ } >+ garbage.push_back(std::move(t)); >+ } >+ FAIL() << "Iteration order remained the same across many attempts."; >+} >+ >+// Fill the table to 3 different load factors (min, median, max) and evaluate >+// the percentage of perfect hits using the debug API. >+template <class Table, class AddFn> >+std::vector<double> CollectPerfectRatios(Table t, AddFn add) { >+ using Key = typename Table::key_type; >+ >+ // First, fill enough to have a good distribution. >+ constexpr size_t kMinSize = 10000; >+ std::vector<Key> keys; >+ while (t.size() < kMinSize) keys.push_back(add(t)); >+ // Then, insert until we reach min load factor. 
>+  double lf = t.load_factor();
>+  while (lf <= t.load_factor()) keys.push_back(add(t));
>+
>+  // We are now at min load factor. Take a snapshot.
>+  size_t perfect = 0;
>+  auto update_perfect = [&](Key k) {
>+    perfect += GetHashtableDebugNumProbes(t, k) == 0;
>+  };
>+  for (const auto& k : keys) update_perfect(k);
>+
>+  std::vector<double> perfect_ratios;
>+  // Keep going until we hit max load factor.
>+  while (t.load_factor() < .6) {
>+    perfect_ratios.push_back(1.0 * perfect / t.size());
>+    update_perfect(add(t));
>+  }
>+  while (t.load_factor() > .5) {
>+    perfect_ratios.push_back(1.0 * perfect / t.size());
>+    update_perfect(add(t));
>+  }
>+  return perfect_ratios;
>+}
>+
>+std::vector<std::pair<double, double>> StringTablePefectRatios() {
>+  constexpr bool kRandomizesInserts =
>+#ifdef NDEBUG
>+      false;
>+#else   // NDEBUG
>+      true;
>+#endif  // NDEBUG
>+
>+  // The effective load factor is larger in non-opt mode because we insert
>+  // elements out of order.
>+  switch (container_internal::Group::kWidth) {
>+    case 8:
>+      if (kRandomizesInserts) {
>+        return {{0.986, 0.02}, {0.95, 0.02}, {0.89, 0.02}};
>+      } else {
>+        return {{0.995, 0.01}, {0.97, 0.01}, {0.89, 0.02}};
>+      }
>+      break;
>+    case 16:
>+      if (kRandomizesInserts) {
>+        return {{0.973, 0.01}, {0.965, 0.01}, {0.92, 0.02}};
>+      } else {
>+        return {{0.995, 0.005}, {0.99, 0.005}, {0.94, 0.01}};
>+      }
>+      break;
>+    default:
>+      // Ignore anything else.
>+      return {};
>+  }
>+}
>+
>+// This is almost a change detector, but it allows us to know how we are
>+// affecting the probe distribution. 
>+TEST(Table, EffectiveLoadFactorStrings) { >+ std::vector<double> perfect_ratios = >+ CollectPerfectRatios(StringTable(), [](StringTable& t) { >+ return t.emplace(std::to_string(t.size()), "").first->first; >+ }); >+ >+ auto ratios = StringTablePefectRatios(); >+ if (ratios.empty()) return; >+ >+ EXPECT_THAT(perfect_ratios.front(), >+ DoubleNear(ratios[0].first, ratios[0].second)); >+ EXPECT_THAT(perfect_ratios[perfect_ratios.size() / 2], >+ DoubleNear(ratios[1].first, ratios[1].second)); >+ EXPECT_THAT(perfect_ratios.back(), >+ DoubleNear(ratios[2].first, ratios[2].second)); >+} >+ >+std::vector<std::pair<double, double>> IntTablePefectRatios() { >+ constexpr bool kRandomizesInserts = >+#ifdef NDEBUG >+ false; >+#else // NDEBUG >+ true; >+#endif // NDEBUG >+ >+ // The effective load factor is larger in non-opt mode because we insert >+ // elements out of order. >+ switch (container_internal::Group::kWidth) { >+ case 8: >+ if (kRandomizesInserts) { >+ return {{0.99, 0.02}, {0.985, 0.02}, {0.95, 0.05}}; >+ } else { >+ return {{0.99, 0.01}, {0.99, 0.01}, {0.95, 0.02}}; >+ } >+ break; >+ case 16: >+ if (kRandomizesInserts) { >+ return {{0.98, 0.02}, {0.978, 0.02}, {0.96, 0.02}}; >+ } else { >+ return {{0.998, 0.003}, {0.995, 0.01}, {0.975, 0.02}}; >+ } >+ break; >+ default: >+ // Ignore anything else. >+ return {}; >+ } >+} >+ >+// This is almost a change detector, but it allows us to know how we are >+// affecting the probe distribution. 
>+TEST(Table, EffectiveLoadFactorInts) { >+ std::vector<double> perfect_ratios = CollectPerfectRatios( >+ IntTable(), [](IntTable& t) { return *t.emplace(t.size()).first; }); >+ >+ auto ratios = IntTablePefectRatios(); >+ if (ratios.empty()) return; >+ >+ EXPECT_THAT(perfect_ratios.front(), >+ DoubleNear(ratios[0].first, ratios[0].second)); >+ EXPECT_THAT(perfect_ratios[perfect_ratios.size() / 2], >+ DoubleNear(ratios[1].first, ratios[1].second)); >+ EXPECT_THAT(perfect_ratios.back(), >+ DoubleNear(ratios[2].first, ratios[2].second)); >+} >+ >+// Confirm that we assert if we try to erase() end(). >+TEST(TableDeathTest, EraseOfEndAsserts) { >+ // Use an assert with side-effects to figure out if they are actually enabled. >+ bool assert_enabled = false; >+ assert([&]() { >+ assert_enabled = true; >+ return true; >+ }()); >+ if (!assert_enabled) return; >+ >+ IntTable t; >+ // Extra simple "regexp" as regexp support is highly varied across platforms. >+ constexpr char kDeathMsg[] = "it != end"; >+ EXPECT_DEATH_IF_SUPPORTED(t.erase(t.end()), kDeathMsg); >+} >+ >+#ifdef ADDRESS_SANITIZER >+TEST(Sanitizer, PoisoningUnused) { >+ IntTable t; >+ // Insert something to force an allocation. >+ int64_t& v1 = *t.insert(0).first; >+ >+ // Make sure there is something to test. 
>+ ASSERT_GT(t.capacity(), 1); >+ >+ int64_t* slots = RawHashSetTestOnlyAccess::GetSlots(t); >+ for (size_t i = 0; i < t.capacity(); ++i) { >+ EXPECT_EQ(slots + i != &v1, __asan_address_is_poisoned(slots + i)); >+ } >+} >+ >+TEST(Sanitizer, PoisoningOnErase) { >+ IntTable t; >+ int64_t& v = *t.insert(0).first; >+ >+ EXPECT_FALSE(__asan_address_is_poisoned(&v)); >+ t.erase(0); >+ EXPECT_TRUE(__asan_address_is_poisoned(&v)); >+} >+#endif // ADDRESS_SANITIZER >+ >+} // namespace >+} // namespace container_internal >+} // namespace absl >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/test_instance_tracker.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/test_instance_tracker.cc >index fe00aca8fb98f120f235202369b0b5be876d4747..b18e0bb7a1ea5fffdc9618f1e793b8ff92ed9038 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/test_instance_tracker.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/test_instance_tracker.cc >@@ -21,6 +21,7 @@ int BaseCountedInstance::num_live_instances_ = 0; > int BaseCountedInstance::num_moves_ = 0; > int BaseCountedInstance::num_copies_ = 0; > int BaseCountedInstance::num_swaps_ = 0; >+int BaseCountedInstance::num_comparisons_ = 0; > > } // namespace test_internal > } // namespace absl >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/test_instance_tracker.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/test_instance_tracker.h >index cf8f3a531e6212fcaa13d6535170af86bffa698a..ec45f574037fc8e0b6937bcd8fc1f42756e08108 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/test_instance_tracker.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/test_instance_tracker.h >@@ -22,8 +22,8 @@ namespace absl { > namespace 
test_internal { > > // A type that counts number of occurences of the type, the live occurrences of >-// the type, as well as the number of copies, moves, and swaps that have >-// occurred on the type. This is used as a base class for the copyable, >+// the type, as well as the number of copies, moves, swaps, and comparisons that >+// have occurred on the type. This is used as a base class for the copyable, > // copyable+movable, and movable types below that are used in actual tests. Use > // InstanceTracker in tests to track the number of instances. > class BaseCountedInstance { >@@ -66,6 +66,36 @@ class BaseCountedInstance { > return *this; > } > >+ bool operator==(const BaseCountedInstance& x) const { >+ ++num_comparisons_; >+ return value_ == x.value_; >+ } >+ >+ bool operator!=(const BaseCountedInstance& x) const { >+ ++num_comparisons_; >+ return value_ != x.value_; >+ } >+ >+ bool operator<(const BaseCountedInstance& x) const { >+ ++num_comparisons_; >+ return value_ < x.value_; >+ } >+ >+ bool operator>(const BaseCountedInstance& x) const { >+ ++num_comparisons_; >+ return value_ > x.value_; >+ } >+ >+ bool operator<=(const BaseCountedInstance& x) const { >+ ++num_comparisons_; >+ return value_ <= x.value_; >+ } >+ >+ bool operator>=(const BaseCountedInstance& x) const { >+ ++num_comparisons_; >+ return value_ >= x.value_; >+ } >+ > int value() const { > if (!is_live_) std::abort(); > return value_; >@@ -108,6 +138,9 @@ class BaseCountedInstance { > > // Number of times that BaseCountedInstance objects were swapped. > static int num_swaps_; >+ >+ // Number of times that BaseCountedInstance objects were compared. >+ static int num_comparisons_; > }; > > // Helper to track the BaseCountedInstance instance counters. Expects that the >@@ -152,13 +185,21 @@ class InstanceTracker { > // construction or the last call to ResetCopiesMovesSwaps(). 
> int swaps() const { return BaseCountedInstance::num_swaps_ - start_swaps_; } > >- // Resets the base values for moves, copies and swaps to the current values, >- // so that subsequent Get*() calls for moves, copies and swaps will compare to >- // the situation at the point of this call. >+ // Returns the number of comparisons on BaseCountedInstance objects since >+ // construction or the last call to ResetCopiesMovesSwaps(). >+ int comparisons() const { >+ return BaseCountedInstance::num_comparisons_ - start_comparisons_; >+ } >+ >+ // Resets the base values for moves, copies, comparisons, and swaps to the >+ // current values, so that subsequent Get*() calls for moves, copies, >+ // comparisons, and swaps will compare to the situation at the point of this >+ // call. > void ResetCopiesMovesSwaps() { > start_moves_ = BaseCountedInstance::num_moves_; > start_copies_ = BaseCountedInstance::num_copies_; > start_swaps_ = BaseCountedInstance::num_swaps_; >+ start_comparisons_ = BaseCountedInstance::num_comparisons_; > } > > private: >@@ -167,6 +208,7 @@ class InstanceTracker { > int start_moves_; > int start_copies_; > int start_swaps_; >+ int start_comparisons_; > }; > > // Copyable, not movable. 
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/test_instance_tracker_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/test_instance_tracker_test.cc >index 9efb6771cf086db16fef39460c591921ecd5f691..0ae57636df21338979210395bb3996bbd8c97b8e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/test_instance_tracker_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/test_instance_tracker_test.cc >@@ -157,4 +157,26 @@ TEST(TestInstanceTracker, ExistingInstances) { > EXPECT_EQ(1, tracker.moves()); > } > >+TEST(TestInstanceTracker, Comparisons) { >+ InstanceTracker tracker; >+ MovableOnlyInstance one(1), two(2); >+ >+ EXPECT_EQ(0, tracker.comparisons()); >+ EXPECT_FALSE(one == two); >+ EXPECT_EQ(1, tracker.comparisons()); >+ EXPECT_TRUE(one != two); >+ EXPECT_EQ(2, tracker.comparisons()); >+ EXPECT_TRUE(one < two); >+ EXPECT_EQ(3, tracker.comparisons()); >+ EXPECT_FALSE(one > two); >+ EXPECT_EQ(4, tracker.comparisons()); >+ EXPECT_TRUE(one <= two); >+ EXPECT_EQ(5, tracker.comparisons()); >+ EXPECT_FALSE(one >= two); >+ EXPECT_EQ(6, tracker.comparisons()); >+ >+ tracker.ResetCopiesMovesSwaps(); >+ EXPECT_EQ(0, tracker.comparisons()); >+} >+ > } // namespace >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/tracked.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/tracked.h >new file mode 100644 >index 0000000000000000000000000000000000000000..7d14af03138d3efd66237b797f3ef73c003c8815 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/tracked.h >@@ -0,0 +1,78 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. 
>+// You may obtain a copy of the License at
>+//
>+// http://www.apache.org/licenses/LICENSE-2.0
>+//
>+// Unless required by applicable law or agreed to in writing, software
>+// distributed under the License is distributed on an "AS IS" BASIS,
>+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
>+// See the License for the specific language governing permissions and
>+// limitations under the License.
>+
>+#ifndef ABSL_CONTAINER_INTERNAL_TRACKED_H_
>+#define ABSL_CONTAINER_INTERNAL_TRACKED_H_
>+
>+#include <stddef.h>
>+#include <memory>
>+#include <utility>
>+
>+namespace absl {
>+namespace container_internal {
>+
>+// A class that tracks its copies and moves so that it can be queried in tests.
>+template <class T>
>+class Tracked {
>+ public:
>+  Tracked() {}
>+  // NOLINTNEXTLINE(runtime/explicit)
>+  Tracked(const T& val) : val_(val) {}
>+  Tracked(const Tracked& that)
>+      : val_(that.val_),
>+        num_moves_(that.num_moves_),
>+        num_copies_(that.num_copies_) {
>+    ++(*num_copies_);
>+  }
>+  Tracked(Tracked&& that)
>+      : val_(std::move(that.val_)),
>+        num_moves_(std::move(that.num_moves_)),
>+        num_copies_(std::move(that.num_copies_)) {
>+    ++(*num_moves_);
>+  }
>+  Tracked& operator=(const Tracked& that) {
>+    val_ = that.val_;
>+    num_moves_ = that.num_moves_;
>+    num_copies_ = that.num_copies_;
>+    ++(*num_copies_);
>+    return *this;
>+  }
>+  Tracked& operator=(Tracked&& that) {
>+    val_ = std::move(that.val_);
>+    num_moves_ = std::move(that.num_moves_);
>+    num_copies_ = std::move(that.num_copies_);
>+    ++(*num_moves_);
>+    return *this;
>+  }
>+
>+  const T& val() const { return val_; }
>+
>+  friend bool operator==(const Tracked& a, const Tracked& b) {
>+    return a.val_ == b.val_;
>+  }
>+  friend bool operator!=(const Tracked& a, const Tracked& b) {
>+    return !(a == b);
>+  }
>+
>+  size_t num_copies() { return *num_copies_; }
>+  size_t num_moves() { return *num_moves_; }
>+
>+ private:
>+  T val_;
>+  std::shared_ptr<size_t> num_moves_ = std::make_shared<size_t>(0);
>+  std::shared_ptr<size_t> 
num_copies_ = std::make_shared<size_t>(0); >+}; >+ >+} // namespace container_internal >+} // namespace absl >+ >+#endif // ABSL_CONTAINER_INTERNAL_TRACKED_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/unordered_map_constructor_test.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/unordered_map_constructor_test.h >new file mode 100644 >index 0000000000000000000000000000000000000000..2ffb646cb264a65efa4beccdef2ca28e14e9a5d7 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/unordered_map_constructor_test.h >@@ -0,0 +1,404 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. 
>+ >+#ifndef ABSL_CONTAINER_INTERNAL_UNORDERED_MAP_CONSTRUCTOR_TEST_H_ >+#define ABSL_CONTAINER_INTERNAL_UNORDERED_MAP_CONSTRUCTOR_TEST_H_ >+ >+#include <algorithm> >+#include <vector> >+ >+#include "gmock/gmock.h" >+#include "gtest/gtest.h" >+#include "absl/container/internal/hash_generator_testing.h" >+#include "absl/container/internal/hash_policy_testing.h" >+ >+namespace absl { >+namespace container_internal { >+ >+template <class UnordMap> >+class ConstructorTest : public ::testing::Test {}; >+ >+TYPED_TEST_CASE_P(ConstructorTest); >+ >+TYPED_TEST_P(ConstructorTest, NoArgs) { >+ TypeParam m; >+ EXPECT_TRUE(m.empty()); >+ EXPECT_THAT(m, ::testing::UnorderedElementsAre()); >+} >+ >+TYPED_TEST_P(ConstructorTest, BucketCount) { >+ TypeParam m(123); >+ EXPECT_TRUE(m.empty()); >+ EXPECT_THAT(m, ::testing::UnorderedElementsAre()); >+ EXPECT_GE(m.bucket_count(), 123); >+} >+ >+TYPED_TEST_P(ConstructorTest, BucketCountHash) { >+ using H = typename TypeParam::hasher; >+ H hasher; >+ TypeParam m(123, hasher); >+ EXPECT_EQ(m.hash_function(), hasher); >+ EXPECT_TRUE(m.empty()); >+ EXPECT_THAT(m, ::testing::UnorderedElementsAre()); >+ EXPECT_GE(m.bucket_count(), 123); >+} >+ >+TYPED_TEST_P(ConstructorTest, BucketCountHashEqual) { >+ using H = typename TypeParam::hasher; >+ using E = typename TypeParam::key_equal; >+ H hasher; >+ E equal; >+ TypeParam m(123, hasher, equal); >+ EXPECT_EQ(m.hash_function(), hasher); >+ EXPECT_EQ(m.key_eq(), equal); >+ EXPECT_TRUE(m.empty()); >+ EXPECT_THAT(m, ::testing::UnorderedElementsAre()); >+ EXPECT_GE(m.bucket_count(), 123); >+} >+ >+TYPED_TEST_P(ConstructorTest, BucketCountHashEqualAlloc) { >+ using H = typename TypeParam::hasher; >+ using E = typename TypeParam::key_equal; >+ using A = typename TypeParam::allocator_type; >+ H hasher; >+ E equal; >+ A alloc(0); >+ TypeParam m(123, hasher, equal, alloc); >+ EXPECT_EQ(m.hash_function(), hasher); >+ EXPECT_EQ(m.key_eq(), equal); >+ EXPECT_EQ(m.get_allocator(), alloc); >+ 
EXPECT_TRUE(m.empty()); >+ EXPECT_THAT(m, ::testing::UnorderedElementsAre()); >+ EXPECT_GE(m.bucket_count(), 123); >+} >+ >+TYPED_TEST_P(ConstructorTest, BucketCountAlloc) { >+#if defined(UNORDERED_MAP_CXX14) || defined(UNORDERED_MAP_CXX17) >+ using A = typename TypeParam::allocator_type; >+ A alloc(0); >+ TypeParam m(123, alloc); >+ EXPECT_EQ(m.get_allocator(), alloc); >+ EXPECT_TRUE(m.empty()); >+ EXPECT_THAT(m, ::testing::UnorderedElementsAre()); >+ EXPECT_GE(m.bucket_count(), 123); >+#endif >+} >+ >+TYPED_TEST_P(ConstructorTest, BucketCountHashAlloc) { >+#if defined(UNORDERED_MAP_CXX14) || defined(UNORDERED_MAP_CXX17) >+ using H = typename TypeParam::hasher; >+ using A = typename TypeParam::allocator_type; >+ H hasher; >+ A alloc(0); >+ TypeParam m(123, hasher, alloc); >+ EXPECT_EQ(m.hash_function(), hasher); >+ EXPECT_EQ(m.get_allocator(), alloc); >+ EXPECT_TRUE(m.empty()); >+ EXPECT_THAT(m, ::testing::UnorderedElementsAre()); >+ EXPECT_GE(m.bucket_count(), 123); >+#endif >+} >+ >+TYPED_TEST_P(ConstructorTest, BucketAlloc) { >+#if ABSL_UNORDERED_SUPPORTS_ALLOC_CTORS >+ using A = typename TypeParam::allocator_type; >+ A alloc(0); >+ TypeParam m(alloc); >+ EXPECT_EQ(m.get_allocator(), alloc); >+ EXPECT_TRUE(m.empty()); >+ EXPECT_THAT(m, ::testing::UnorderedElementsAre()); >+#endif >+} >+ >+TYPED_TEST_P(ConstructorTest, InputIteratorBucketHashEqualAlloc) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using H = typename TypeParam::hasher; >+ using E = typename TypeParam::key_equal; >+ using A = typename TypeParam::allocator_type; >+ H hasher; >+ E equal; >+ A alloc(0); >+ std::vector<T> values; >+ std::generate_n(std::back_inserter(values), 10, >+ hash_internal::Generator<T>()); >+ TypeParam m(values.begin(), values.end(), 123, hasher, equal, alloc); >+ EXPECT_EQ(m.hash_function(), hasher); >+ EXPECT_EQ(m.key_eq(), equal); >+ EXPECT_EQ(m.get_allocator(), alloc); >+ EXPECT_THAT(items(m), ::testing::UnorderedElementsAreArray(values)); >+ 
EXPECT_GE(m.bucket_count(), 123); >+} >+ >+TYPED_TEST_P(ConstructorTest, InputIteratorBucketAlloc) { >+#if defined(UNORDERED_MAP_CXX14) || defined(UNORDERED_MAP_CXX17) >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using A = typename TypeParam::allocator_type; >+ A alloc(0); >+ std::vector<T> values; >+ std::generate_n(std::back_inserter(values), 10, >+ hash_internal::Generator<T>()); >+ TypeParam m(values.begin(), values.end(), 123, alloc); >+ EXPECT_EQ(m.get_allocator(), alloc); >+ EXPECT_THAT(items(m), ::testing::UnorderedElementsAreArray(values)); >+ EXPECT_GE(m.bucket_count(), 123); >+#endif >+} >+ >+TYPED_TEST_P(ConstructorTest, InputIteratorBucketHashAlloc) { >+#if defined(UNORDERED_MAP_CXX14) || defined(UNORDERED_MAP_CXX17) >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using H = typename TypeParam::hasher; >+ using A = typename TypeParam::allocator_type; >+ H hasher; >+ A alloc(0); >+ std::vector<T> values; >+ std::generate_n(std::back_inserter(values), 10, >+ hash_internal::Generator<T>()); >+ TypeParam m(values.begin(), values.end(), 123, hasher, alloc); >+ EXPECT_EQ(m.hash_function(), hasher); >+ EXPECT_EQ(m.get_allocator(), alloc); >+ EXPECT_THAT(items(m), ::testing::UnorderedElementsAreArray(values)); >+ EXPECT_GE(m.bucket_count(), 123); >+#endif >+} >+ >+TYPED_TEST_P(ConstructorTest, CopyConstructor) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using H = typename TypeParam::hasher; >+ using E = typename TypeParam::key_equal; >+ using A = typename TypeParam::allocator_type; >+ H hasher; >+ E equal; >+ A alloc(0); >+ TypeParam m(123, hasher, equal, alloc); >+ for (size_t i = 0; i != 10; ++i) m.insert(hash_internal::Generator<T>()()); >+ TypeParam n(m); >+ EXPECT_EQ(m.hash_function(), n.hash_function()); >+ EXPECT_EQ(m.key_eq(), n.key_eq()); >+ EXPECT_EQ(m.get_allocator(), n.get_allocator()); >+ EXPECT_EQ(m, n); >+} >+ >+TYPED_TEST_P(ConstructorTest, CopyConstructorAlloc) { >+#if ABSL_UNORDERED_SUPPORTS_ALLOC_CTORS >+ 
using T = hash_internal::GeneratedType<TypeParam>; >+ using H = typename TypeParam::hasher; >+ using E = typename TypeParam::key_equal; >+ using A = typename TypeParam::allocator_type; >+ H hasher; >+ E equal; >+ A alloc(0); >+ TypeParam m(123, hasher, equal, alloc); >+ for (size_t i = 0; i != 10; ++i) m.insert(hash_internal::Generator<T>()()); >+ TypeParam n(m, A(11)); >+ EXPECT_EQ(m.hash_function(), n.hash_function()); >+ EXPECT_EQ(m.key_eq(), n.key_eq()); >+ EXPECT_NE(m.get_allocator(), n.get_allocator()); >+ EXPECT_EQ(m, n); >+#endif >+} >+ >+// TODO(alkis): Test non-propagating allocators on copy constructors. >+ >+TYPED_TEST_P(ConstructorTest, MoveConstructor) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using H = typename TypeParam::hasher; >+ using E = typename TypeParam::key_equal; >+ using A = typename TypeParam::allocator_type; >+ H hasher; >+ E equal; >+ A alloc(0); >+ TypeParam m(123, hasher, equal, alloc); >+ for (size_t i = 0; i != 10; ++i) m.insert(hash_internal::Generator<T>()()); >+ TypeParam t(m); >+ TypeParam n(std::move(t)); >+ EXPECT_EQ(m.hash_function(), n.hash_function()); >+ EXPECT_EQ(m.key_eq(), n.key_eq()); >+ EXPECT_EQ(m.get_allocator(), n.get_allocator()); >+ EXPECT_EQ(m, n); >+} >+ >+TYPED_TEST_P(ConstructorTest, MoveConstructorAlloc) { >+#if ABSL_UNORDERED_SUPPORTS_ALLOC_CTORS >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using H = typename TypeParam::hasher; >+ using E = typename TypeParam::key_equal; >+ using A = typename TypeParam::allocator_type; >+ H hasher; >+ E equal; >+ A alloc(0); >+ TypeParam m(123, hasher, equal, alloc); >+ for (size_t i = 0; i != 10; ++i) m.insert(hash_internal::Generator<T>()()); >+ TypeParam t(m); >+ TypeParam n(std::move(t), A(1)); >+ EXPECT_EQ(m.hash_function(), n.hash_function()); >+ EXPECT_EQ(m.key_eq(), n.key_eq()); >+ EXPECT_NE(m.get_allocator(), n.get_allocator()); >+ EXPECT_EQ(m, n); >+#endif >+} >+ >+// TODO(alkis): Test non-propagating allocators on move constructors. 
>+ >+TYPED_TEST_P(ConstructorTest, InitializerListBucketHashEqualAlloc) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ hash_internal::Generator<T> gen; >+ std::initializer_list<T> values = {gen(), gen(), gen(), gen(), gen()}; >+ using H = typename TypeParam::hasher; >+ using E = typename TypeParam::key_equal; >+ using A = typename TypeParam::allocator_type; >+ H hasher; >+ E equal; >+ A alloc(0); >+ TypeParam m(values, 123, hasher, equal, alloc); >+ EXPECT_EQ(m.hash_function(), hasher); >+ EXPECT_EQ(m.key_eq(), equal); >+ EXPECT_EQ(m.get_allocator(), alloc); >+ EXPECT_THAT(items(m), ::testing::UnorderedElementsAreArray(values)); >+ EXPECT_GE(m.bucket_count(), 123); >+} >+ >+TYPED_TEST_P(ConstructorTest, InitializerListBucketAlloc) { >+#if defined(UNORDERED_MAP_CXX14) || defined(UNORDERED_MAP_CXX17) >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using A = typename TypeParam::allocator_type; >+ hash_internal::Generator<T> gen; >+ std::initializer_list<T> values = {gen(), gen(), gen(), gen(), gen()}; >+ A alloc(0); >+ TypeParam m(values, 123, alloc); >+ EXPECT_EQ(m.get_allocator(), alloc); >+ EXPECT_THAT(items(m), ::testing::UnorderedElementsAreArray(values)); >+ EXPECT_GE(m.bucket_count(), 123); >+#endif >+} >+ >+TYPED_TEST_P(ConstructorTest, InitializerListBucketHashAlloc) { >+#if defined(UNORDERED_MAP_CXX14) || defined(UNORDERED_MAP_CXX17) >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using H = typename TypeParam::hasher; >+ using A = typename TypeParam::allocator_type; >+ H hasher; >+ A alloc(0); >+ hash_internal::Generator<T> gen; >+ std::initializer_list<T> values = {gen(), gen(), gen(), gen(), gen()}; >+ TypeParam m(values, 123, hasher, alloc); >+ EXPECT_EQ(m.hash_function(), hasher); >+ EXPECT_EQ(m.get_allocator(), alloc); >+ EXPECT_THAT(items(m), ::testing::UnorderedElementsAreArray(values)); >+ EXPECT_GE(m.bucket_count(), 123); >+#endif >+} >+ >+TYPED_TEST_P(ConstructorTest, Assignment) { >+ using T = 
hash_internal::GeneratedType<TypeParam>; >+ using H = typename TypeParam::hasher; >+ using E = typename TypeParam::key_equal; >+ using A = typename TypeParam::allocator_type; >+ H hasher; >+ E equal; >+ A alloc(0); >+ hash_internal::Generator<T> gen; >+ TypeParam m({gen(), gen(), gen()}, 123, hasher, equal, alloc); >+ TypeParam n; >+ n = m; >+ EXPECT_EQ(m.hash_function(), n.hash_function()); >+ EXPECT_EQ(m.key_eq(), n.key_eq()); >+ EXPECT_EQ(m, n); >+} >+ >+// TODO(alkis): Test [non-]propagating allocators on move/copy assignments >+// (it depends on traits). >+ >+TYPED_TEST_P(ConstructorTest, MoveAssignment) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using H = typename TypeParam::hasher; >+ using E = typename TypeParam::key_equal; >+ using A = typename TypeParam::allocator_type; >+ H hasher; >+ E equal; >+ A alloc(0); >+ hash_internal::Generator<T> gen; >+ TypeParam m({gen(), gen(), gen()}, 123, hasher, equal, alloc); >+ TypeParam t(m); >+ TypeParam n; >+ n = std::move(t); >+ EXPECT_EQ(m.hash_function(), n.hash_function()); >+ EXPECT_EQ(m.key_eq(), n.key_eq()); >+ EXPECT_EQ(m, n); >+} >+ >+TYPED_TEST_P(ConstructorTest, AssignmentFromInitializerList) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ hash_internal::Generator<T> gen; >+ std::initializer_list<T> values = {gen(), gen(), gen(), gen(), gen()}; >+ TypeParam m; >+ m = values; >+ EXPECT_THAT(items(m), ::testing::UnorderedElementsAreArray(values)); >+} >+ >+TYPED_TEST_P(ConstructorTest, AssignmentOverwritesExisting) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ hash_internal::Generator<T> gen; >+ TypeParam m({gen(), gen(), gen()}); >+ TypeParam n({gen()}); >+ n = m; >+ EXPECT_EQ(m, n); >+} >+ >+TYPED_TEST_P(ConstructorTest, MoveAssignmentOverwritesExisting) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ hash_internal::Generator<T> gen; >+ TypeParam m({gen(), gen(), gen()}); >+ TypeParam t(m); >+ TypeParam n({gen()}); >+ n = std::move(t); >+ EXPECT_EQ(m, n); 
>+}
>+
>+TYPED_TEST_P(ConstructorTest, AssignmentFromInitializerListOverwritesExisting) {
>+  using T = hash_internal::GeneratedType<TypeParam>;
>+  hash_internal::Generator<T> gen;
>+  std::initializer_list<T> values = {gen(), gen(), gen(), gen(), gen()};
>+  TypeParam m;
>+  m = values;
>+  EXPECT_THAT(items(m), ::testing::UnorderedElementsAreArray(values));
>+}
>+
>+TYPED_TEST_P(ConstructorTest, AssignmentOnSelf) {
>+  using T = hash_internal::GeneratedType<TypeParam>;
>+  hash_internal::Generator<T> gen;
>+  std::initializer_list<T> values = {gen(), gen(), gen(), gen(), gen()};
>+  TypeParam m(values);
>+  m = *&m;  // Avoid -Wself-assign
>+  EXPECT_THAT(items(m), ::testing::UnorderedElementsAreArray(values));
>+}
>+
>+// We cannot test self-move as the standard states that it leaves standard
>+// containers in an unspecified state (and in practice it causes memory leaks
>+// according to the heap-checker!).
>+
>+REGISTER_TYPED_TEST_CASE_P(
>+    ConstructorTest, NoArgs, BucketCount, BucketCountHash, BucketCountHashEqual,
>+    BucketCountHashEqualAlloc, BucketCountAlloc, BucketCountHashAlloc,
>+    BucketAlloc, InputIteratorBucketHashEqualAlloc, InputIteratorBucketAlloc,
>+    InputIteratorBucketHashAlloc, CopyConstructor, CopyConstructorAlloc,
>+    MoveConstructor, MoveConstructorAlloc, InitializerListBucketHashEqualAlloc,
>+    InitializerListBucketAlloc, InitializerListBucketHashAlloc, Assignment,
>+    MoveAssignment, AssignmentFromInitializerList,
>+    AssignmentOverwritesExisting, MoveAssignmentOverwritesExisting,
>+    AssignmentFromInitializerListOverwritesExisting, AssignmentOnSelf);
>+
>+}  // namespace container_internal
>+}  // namespace absl
>+#endif  // ABSL_CONTAINER_INTERNAL_UNORDERED_MAP_CONSTRUCTOR_TEST_H_
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/unordered_map_lookup_test.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/unordered_map_lookup_test.h
>new file mode 100644
>index
0000000000000000000000000000000000000000..1f1b6b489b301eb9fadd1bf84919cad637c9ab8b >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/unordered_map_lookup_test.h >@@ -0,0 +1,114 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. >+ >+#ifndef ABSL_CONTAINER_INTERNAL_UNORDERED_MAP_LOOKUP_TEST_H_ >+#define ABSL_CONTAINER_INTERNAL_UNORDERED_MAP_LOOKUP_TEST_H_ >+ >+#include "gmock/gmock.h" >+#include "gtest/gtest.h" >+#include "absl/container/internal/hash_generator_testing.h" >+#include "absl/container/internal/hash_policy_testing.h" >+ >+namespace absl { >+namespace container_internal { >+ >+template <class UnordMap> >+class LookupTest : public ::testing::Test {}; >+ >+TYPED_TEST_CASE_P(LookupTest); >+ >+TYPED_TEST_P(LookupTest, At) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ std::vector<T> values; >+ std::generate_n(std::back_inserter(values), 10, >+ hash_internal::Generator<T>()); >+ TypeParam m(values.begin(), values.end()); >+ for (const auto& p : values) { >+ const auto& val = m.at(p.first); >+ EXPECT_EQ(p.second, val) << ::testing::PrintToString(p.first); >+ } >+} >+ >+TYPED_TEST_P(LookupTest, OperatorBracket) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using V = typename TypeParam::mapped_type; >+ std::vector<T> values; >+ std::generate_n(std::back_inserter(values), 10, >+ hash_internal::Generator<T>()); >+ TypeParam m; 
>+ for (const auto& p : values) { >+ auto& val = m[p.first]; >+ EXPECT_EQ(V(), val) << ::testing::PrintToString(p.first); >+ val = p.second; >+ } >+ for (const auto& p : values) >+ EXPECT_EQ(p.second, m[p.first]) << ::testing::PrintToString(p.first); >+} >+ >+TYPED_TEST_P(LookupTest, Count) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ std::vector<T> values; >+ std::generate_n(std::back_inserter(values), 10, >+ hash_internal::Generator<T>()); >+ TypeParam m; >+ for (const auto& p : values) >+ EXPECT_EQ(0, m.count(p.first)) << ::testing::PrintToString(p.first); >+ m.insert(values.begin(), values.end()); >+ for (const auto& p : values) >+ EXPECT_EQ(1, m.count(p.first)) << ::testing::PrintToString(p.first); >+} >+ >+TYPED_TEST_P(LookupTest, Find) { >+ using std::get; >+ using T = hash_internal::GeneratedType<TypeParam>; >+ std::vector<T> values; >+ std::generate_n(std::back_inserter(values), 10, >+ hash_internal::Generator<T>()); >+ TypeParam m; >+ for (const auto& p : values) >+ EXPECT_TRUE(m.end() == m.find(p.first)) >+ << ::testing::PrintToString(p.first); >+ m.insert(values.begin(), values.end()); >+ for (const auto& p : values) { >+ auto it = m.find(p.first); >+ EXPECT_TRUE(m.end() != it) << ::testing::PrintToString(p.first); >+ EXPECT_EQ(p.second, get<1>(*it)) << ::testing::PrintToString(p.first); >+ } >+} >+ >+TYPED_TEST_P(LookupTest, EqualRange) { >+ using std::get; >+ using T = hash_internal::GeneratedType<TypeParam>; >+ std::vector<T> values; >+ std::generate_n(std::back_inserter(values), 10, >+ hash_internal::Generator<T>()); >+ TypeParam m; >+ for (const auto& p : values) { >+ auto r = m.equal_range(p.first); >+ ASSERT_EQ(0, std::distance(r.first, r.second)); >+ } >+ m.insert(values.begin(), values.end()); >+ for (const auto& p : values) { >+ auto r = m.equal_range(p.first); >+ ASSERT_EQ(1, std::distance(r.first, r.second)); >+ EXPECT_EQ(p.second, get<1>(*r.first)) << ::testing::PrintToString(p.first); >+ } >+} >+ 
>+REGISTER_TYPED_TEST_CASE_P(LookupTest, At, OperatorBracket, Count, Find, >+ EqualRange); >+ >+} // namespace container_internal >+} // namespace absl >+#endif // ABSL_CONTAINER_INTERNAL_UNORDERED_MAP_LOOKUP_TEST_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/unordered_map_modifiers_test.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/unordered_map_modifiers_test.h >new file mode 100644 >index 0000000000000000000000000000000000000000..b6c633ae27357c145e6e1f206ba2d2cc24c8854a >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/unordered_map_modifiers_test.h >@@ -0,0 +1,272 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. 
>+ >+#ifndef ABSL_CONTAINER_INTERNAL_UNORDERED_MAP_MODIFIERS_TEST_H_ >+#define ABSL_CONTAINER_INTERNAL_UNORDERED_MAP_MODIFIERS_TEST_H_ >+ >+#include "gmock/gmock.h" >+#include "gtest/gtest.h" >+#include "absl/container/internal/hash_generator_testing.h" >+#include "absl/container/internal/hash_policy_testing.h" >+ >+namespace absl { >+namespace container_internal { >+ >+template <class UnordMap> >+class ModifiersTest : public ::testing::Test {}; >+ >+TYPED_TEST_CASE_P(ModifiersTest); >+ >+TYPED_TEST_P(ModifiersTest, Clear) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ std::vector<T> values; >+ std::generate_n(std::back_inserter(values), 10, >+ hash_internal::Generator<T>()); >+ TypeParam m(values.begin(), values.end()); >+ ASSERT_THAT(items(m), ::testing::UnorderedElementsAreArray(values)); >+ m.clear(); >+ EXPECT_THAT(items(m), ::testing::UnorderedElementsAre()); >+ EXPECT_TRUE(m.empty()); >+} >+ >+TYPED_TEST_P(ModifiersTest, Insert) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using V = typename TypeParam::mapped_type; >+ T val = hash_internal::Generator<T>()(); >+ TypeParam m; >+ auto p = m.insert(val); >+ EXPECT_TRUE(p.second); >+ EXPECT_EQ(val, *p.first); >+ T val2 = {val.first, hash_internal::Generator<V>()()}; >+ p = m.insert(val2); >+ EXPECT_FALSE(p.second); >+ EXPECT_EQ(val, *p.first); >+} >+ >+TYPED_TEST_P(ModifiersTest, InsertHint) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using V = typename TypeParam::mapped_type; >+ T val = hash_internal::Generator<T>()(); >+ TypeParam m; >+ auto it = m.insert(m.end(), val); >+ EXPECT_TRUE(it != m.end()); >+ EXPECT_EQ(val, *it); >+ T val2 = {val.first, hash_internal::Generator<V>()()}; >+ it = m.insert(it, val2); >+ EXPECT_TRUE(it != m.end()); >+ EXPECT_EQ(val, *it); >+} >+ >+TYPED_TEST_P(ModifiersTest, InsertRange) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ std::vector<T> values; >+ std::generate_n(std::back_inserter(values), 10, >+ 
hash_internal::Generator<T>()); >+ TypeParam m; >+ m.insert(values.begin(), values.end()); >+ ASSERT_THAT(items(m), ::testing::UnorderedElementsAreArray(values)); >+} >+ >+TYPED_TEST_P(ModifiersTest, InsertOrAssign) { >+#ifdef UNORDERED_MAP_CXX17 >+ using std::get; >+ using K = typename TypeParam::key_type; >+ using V = typename TypeParam::mapped_type; >+ K k = hash_internal::Generator<K>()(); >+ V val = hash_internal::Generator<V>()(); >+ TypeParam m; >+ auto p = m.insert_or_assign(k, val); >+ EXPECT_TRUE(p.second); >+ EXPECT_EQ(k, get<0>(*p.first)); >+ EXPECT_EQ(val, get<1>(*p.first)); >+ V val2 = hash_internal::Generator<V>()(); >+ p = m.insert_or_assign(k, val2); >+ EXPECT_FALSE(p.second); >+ EXPECT_EQ(k, get<0>(*p.first)); >+ EXPECT_EQ(val2, get<1>(*p.first)); >+#endif >+} >+ >+TYPED_TEST_P(ModifiersTest, InsertOrAssignHint) { >+#ifdef UNORDERED_MAP_CXX17 >+ using std::get; >+ using K = typename TypeParam::key_type; >+ using V = typename TypeParam::mapped_type; >+ K k = hash_internal::Generator<K>()(); >+ V val = hash_internal::Generator<V>()(); >+ TypeParam m; >+ auto it = m.insert_or_assign(m.end(), k, val); >+ EXPECT_TRUE(it != m.end()); >+ EXPECT_EQ(k, get<0>(*it)); >+ EXPECT_EQ(val, get<1>(*it)); >+ V val2 = hash_internal::Generator<V>()(); >+ it = m.insert_or_assign(it, k, val2); >+ EXPECT_EQ(k, get<0>(*it)); >+ EXPECT_EQ(val2, get<1>(*it)); >+#endif >+} >+ >+TYPED_TEST_P(ModifiersTest, Emplace) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using V = typename TypeParam::mapped_type; >+ T val = hash_internal::Generator<T>()(); >+ TypeParam m; >+ // TODO(alkis): We need a way to run emplace in a more meaningful way. Perhaps >+ // with test traits/policy. 
>+ auto p = m.emplace(val); >+ EXPECT_TRUE(p.second); >+ EXPECT_EQ(val, *p.first); >+ T val2 = {val.first, hash_internal::Generator<V>()()}; >+ p = m.emplace(val2); >+ EXPECT_FALSE(p.second); >+ EXPECT_EQ(val, *p.first); >+} >+ >+TYPED_TEST_P(ModifiersTest, EmplaceHint) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using V = typename TypeParam::mapped_type; >+ T val = hash_internal::Generator<T>()(); >+ TypeParam m; >+ // TODO(alkis): We need a way to run emplace in a more meaningful way. Perhaps >+ // with test traits/policy. >+ auto it = m.emplace_hint(m.end(), val); >+ EXPECT_EQ(val, *it); >+ T val2 = {val.first, hash_internal::Generator<V>()()}; >+ it = m.emplace_hint(it, val2); >+ EXPECT_EQ(val, *it); >+} >+ >+TYPED_TEST_P(ModifiersTest, TryEmplace) { >+#ifdef UNORDERED_MAP_CXX17 >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using V = typename TypeParam::mapped_type; >+ T val = hash_internal::Generator<T>()(); >+ TypeParam m; >+ // TODO(alkis): We need a way to run emplace in a more meaningful way. Perhaps >+ // with test traits/policy. >+ auto p = m.try_emplace(val.first, val.second); >+ EXPECT_TRUE(p.second); >+ EXPECT_EQ(val, *p.first); >+ T val2 = {val.first, hash_internal::Generator<V>()()}; >+ p = m.try_emplace(val2.first, val2.second); >+ EXPECT_FALSE(p.second); >+ EXPECT_EQ(val, *p.first); >+#endif >+} >+ >+TYPED_TEST_P(ModifiersTest, TryEmplaceHint) { >+#ifdef UNORDERED_MAP_CXX17 >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using V = typename TypeParam::mapped_type; >+ T val = hash_internal::Generator<T>()(); >+ TypeParam m; >+ // TODO(alkis): We need a way to run emplace in a more meaningful way. Perhaps >+ // with test traits/policy. 
>+ auto it = m.try_emplace(m.end(), val.first, val.second); >+ EXPECT_EQ(val, *it); >+ T val2 = {val.first, hash_internal::Generator<V>()()}; >+ it = m.try_emplace(it, val2.first, val2.second); >+ EXPECT_EQ(val, *it); >+#endif >+} >+ >+template <class V> >+using IfNotVoid = typename std::enable_if<!std::is_void<V>::value, V>::type; >+ >+// In openmap we chose not to return the iterator from erase because that's >+// more expensive. As such we adapt erase to return an iterator here. >+struct EraseFirst { >+ template <class Map> >+ auto operator()(Map* m, int) const >+ -> IfNotVoid<decltype(m->erase(m->begin()))> { >+ return m->erase(m->begin()); >+ } >+ template <class Map> >+ typename Map::iterator operator()(Map* m, ...) const { >+ auto it = m->begin(); >+ m->erase(it++); >+ return it; >+ } >+}; >+ >+TYPED_TEST_P(ModifiersTest, Erase) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using std::get; >+ std::vector<T> values; >+ std::generate_n(std::back_inserter(values), 10, >+ hash_internal::Generator<T>()); >+ TypeParam m(values.begin(), values.end()); >+ ASSERT_THAT(items(m), ::testing::UnorderedElementsAreArray(values)); >+ auto& first = *m.begin(); >+ std::vector<T> values2; >+ for (const auto& val : values) >+ if (get<0>(val) != get<0>(first)) values2.push_back(val); >+ auto it = EraseFirst()(&m, 0); >+ ASSERT_TRUE(it != m.end()); >+ EXPECT_EQ(1, std::count(values2.begin(), values2.end(), *it)); >+ EXPECT_THAT(items(m), ::testing::UnorderedElementsAreArray(values2.begin(), >+ values2.end())); >+} >+ >+TYPED_TEST_P(ModifiersTest, EraseRange) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ std::vector<T> values; >+ std::generate_n(std::back_inserter(values), 10, >+ hash_internal::Generator<T>()); >+ TypeParam m(values.begin(), values.end()); >+ ASSERT_THAT(items(m), ::testing::UnorderedElementsAreArray(values)); >+ auto it = m.erase(m.begin(), m.end()); >+ EXPECT_THAT(items(m), ::testing::UnorderedElementsAre()); >+ EXPECT_TRUE(it == 
m.end()); >+} >+ >+TYPED_TEST_P(ModifiersTest, EraseKey) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ std::vector<T> values; >+ std::generate_n(std::back_inserter(values), 10, >+ hash_internal::Generator<T>()); >+ TypeParam m(values.begin(), values.end()); >+ ASSERT_THAT(items(m), ::testing::UnorderedElementsAreArray(values)); >+ EXPECT_EQ(1, m.erase(values[0].first)); >+ EXPECT_EQ(0, std::count(m.begin(), m.end(), values[0])); >+ EXPECT_THAT(items(m), ::testing::UnorderedElementsAreArray(values.begin() + 1, >+ values.end())); >+} >+ >+TYPED_TEST_P(ModifiersTest, Swap) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ std::vector<T> v1; >+ std::vector<T> v2; >+ std::generate_n(std::back_inserter(v1), 5, hash_internal::Generator<T>()); >+ std::generate_n(std::back_inserter(v2), 5, hash_internal::Generator<T>()); >+ TypeParam m1(v1.begin(), v1.end()); >+ TypeParam m2(v2.begin(), v2.end()); >+ EXPECT_THAT(items(m1), ::testing::UnorderedElementsAreArray(v1)); >+ EXPECT_THAT(items(m2), ::testing::UnorderedElementsAreArray(v2)); >+ m1.swap(m2); >+ EXPECT_THAT(items(m1), ::testing::UnorderedElementsAreArray(v2)); >+ EXPECT_THAT(items(m2), ::testing::UnorderedElementsAreArray(v1)); >+} >+ >+// TODO(alkis): Write tests for extract. >+// TODO(alkis): Write tests for merge. 
>+ >+REGISTER_TYPED_TEST_CASE_P(ModifiersTest, Clear, Insert, InsertHint, >+ InsertRange, InsertOrAssign, InsertOrAssignHint, >+ Emplace, EmplaceHint, TryEmplace, TryEmplaceHint, >+ Erase, EraseRange, EraseKey, Swap); >+ >+} // namespace container_internal >+} // namespace absl >+#endif // ABSL_CONTAINER_INTERNAL_UNORDERED_MAP_MODIFIERS_TEST_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/unordered_map_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/unordered_map_test.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..40e799cda89c3c90553b38db7ab32a2c14ed0280 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/unordered_map_test.cc >@@ -0,0 +1,38 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. 
>+ >+#include <unordered_map> >+ >+#include "absl/container/internal/unordered_map_constructor_test.h" >+#include "absl/container/internal/unordered_map_lookup_test.h" >+#include "absl/container/internal/unordered_map_modifiers_test.h" >+ >+namespace absl { >+namespace container_internal { >+namespace { >+ >+using MapTypes = ::testing::Types< >+ std::unordered_map<int, int, StatefulTestingHash, StatefulTestingEqual, >+ Alloc<std::pair<const int, int>>>, >+ std::unordered_map<std::string, std::string, StatefulTestingHash, >+ StatefulTestingEqual, >+ Alloc<std::pair<const std::string, std::string>>>>; >+ >+INSTANTIATE_TYPED_TEST_CASE_P(UnorderedMap, ConstructorTest, MapTypes); >+INSTANTIATE_TYPED_TEST_CASE_P(UnorderedMap, LookupTest, MapTypes); >+INSTANTIATE_TYPED_TEST_CASE_P(UnorderedMap, ModifiersTest, MapTypes); >+ >+} // namespace >+} // namespace container_internal >+} // namespace absl >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/unordered_set_constructor_test.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/unordered_set_constructor_test.h >new file mode 100644 >index 0000000000000000000000000000000000000000..cb593704685c47faba1303e2c5397bc155244e22 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/unordered_set_constructor_test.h >@@ -0,0 +1,408 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
>+// See the License for the specific language governing permissions and >+// limitations under the License. >+ >+#ifndef ABSL_CONTAINER_INTERNAL_UNORDERED_SET_CONSTRUCTOR_TEST_H_ >+#define ABSL_CONTAINER_INTERNAL_UNORDERED_SET_CONSTRUCTOR_TEST_H_ >+ >+#include <algorithm> >+#include <vector> >+ >+#include "gmock/gmock.h" >+#include "gtest/gtest.h" >+#include "absl/container/internal/hash_generator_testing.h" >+#include "absl/container/internal/hash_policy_testing.h" >+ >+namespace absl { >+namespace container_internal { >+ >+template <class UnordMap> >+class ConstructorTest : public ::testing::Test {}; >+ >+TYPED_TEST_CASE_P(ConstructorTest); >+ >+TYPED_TEST_P(ConstructorTest, NoArgs) { >+ TypeParam m; >+ EXPECT_TRUE(m.empty()); >+ EXPECT_THAT(keys(m), ::testing::UnorderedElementsAre()); >+} >+ >+TYPED_TEST_P(ConstructorTest, BucketCount) { >+ TypeParam m(123); >+ EXPECT_TRUE(m.empty()); >+ EXPECT_THAT(keys(m), ::testing::UnorderedElementsAre()); >+ EXPECT_GE(m.bucket_count(), 123); >+} >+ >+TYPED_TEST_P(ConstructorTest, BucketCountHash) { >+ using H = typename TypeParam::hasher; >+ H hasher; >+ TypeParam m(123, hasher); >+ EXPECT_EQ(m.hash_function(), hasher); >+ EXPECT_TRUE(m.empty()); >+ EXPECT_THAT(keys(m), ::testing::UnorderedElementsAre()); >+ EXPECT_GE(m.bucket_count(), 123); >+} >+ >+TYPED_TEST_P(ConstructorTest, BucketCountHashEqual) { >+ using H = typename TypeParam::hasher; >+ using E = typename TypeParam::key_equal; >+ H hasher; >+ E equal; >+ TypeParam m(123, hasher, equal); >+ EXPECT_EQ(m.hash_function(), hasher); >+ EXPECT_EQ(m.key_eq(), equal); >+ EXPECT_TRUE(m.empty()); >+ EXPECT_THAT(keys(m), ::testing::UnorderedElementsAre()); >+ EXPECT_GE(m.bucket_count(), 123); >+} >+ >+TYPED_TEST_P(ConstructorTest, BucketCountHashEqualAlloc) { >+ using H = typename TypeParam::hasher; >+ using E = typename TypeParam::key_equal; >+ using A = typename TypeParam::allocator_type; >+ H hasher; >+ E equal; >+ A alloc(0); >+ TypeParam m(123, hasher, equal, alloc); >+ 
EXPECT_EQ(m.hash_function(), hasher); >+ EXPECT_EQ(m.key_eq(), equal); >+ EXPECT_EQ(m.get_allocator(), alloc); >+ EXPECT_TRUE(m.empty()); >+ EXPECT_THAT(keys(m), ::testing::UnorderedElementsAre()); >+ EXPECT_GE(m.bucket_count(), 123); >+ >+ const auto& cm = m; >+ EXPECT_EQ(cm.hash_function(), hasher); >+ EXPECT_EQ(cm.key_eq(), equal); >+ EXPECT_EQ(cm.get_allocator(), alloc); >+ EXPECT_TRUE(cm.empty()); >+ EXPECT_THAT(keys(cm), ::testing::UnorderedElementsAre()); >+ EXPECT_GE(cm.bucket_count(), 123); >+} >+ >+TYPED_TEST_P(ConstructorTest, BucketCountAlloc) { >+#if defined(UNORDERED_SET_CXX14) || defined(UNORDERED_SET_CXX17) >+ using A = typename TypeParam::allocator_type; >+ A alloc(0); >+ TypeParam m(123, alloc); >+ EXPECT_EQ(m.get_allocator(), alloc); >+ EXPECT_TRUE(m.empty()); >+ EXPECT_THAT(keys(m), ::testing::UnorderedElementsAre()); >+ EXPECT_GE(m.bucket_count(), 123); >+#endif >+} >+ >+TYPED_TEST_P(ConstructorTest, BucketCountHashAlloc) { >+#if defined(UNORDERED_SET_CXX14) || defined(UNORDERED_SET_CXX17) >+ using H = typename TypeParam::hasher; >+ using A = typename TypeParam::allocator_type; >+ H hasher; >+ A alloc(0); >+ TypeParam m(123, hasher, alloc); >+ EXPECT_EQ(m.hash_function(), hasher); >+ EXPECT_EQ(m.get_allocator(), alloc); >+ EXPECT_TRUE(m.empty()); >+ EXPECT_THAT(keys(m), ::testing::UnorderedElementsAre()); >+ EXPECT_GE(m.bucket_count(), 123); >+#endif >+} >+ >+TYPED_TEST_P(ConstructorTest, BucketAlloc) { >+#if ABSL_UNORDERED_SUPPORTS_ALLOC_CTORS >+ using A = typename TypeParam::allocator_type; >+ A alloc(0); >+ TypeParam m(alloc); >+ EXPECT_EQ(m.get_allocator(), alloc); >+ EXPECT_TRUE(m.empty()); >+ EXPECT_THAT(keys(m), ::testing::UnorderedElementsAre()); >+#endif >+} >+ >+TYPED_TEST_P(ConstructorTest, InputIteratorBucketHashEqualAlloc) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using H = typename TypeParam::hasher; >+ using E = typename TypeParam::key_equal; >+ using A = typename TypeParam::allocator_type; >+ H hasher; >+ E 
equal; >+ A alloc(0); >+ std::vector<T> values; >+ for (size_t i = 0; i != 10; ++i) >+ values.push_back(hash_internal::Generator<T>()()); >+ TypeParam m(values.begin(), values.end(), 123, hasher, equal, alloc); >+ EXPECT_EQ(m.hash_function(), hasher); >+ EXPECT_EQ(m.key_eq(), equal); >+ EXPECT_EQ(m.get_allocator(), alloc); >+ EXPECT_THAT(keys(m), ::testing::UnorderedElementsAreArray(values)); >+ EXPECT_GE(m.bucket_count(), 123); >+} >+ >+TYPED_TEST_P(ConstructorTest, InputIteratorBucketAlloc) { >+#if defined(UNORDERED_SET_CXX14) || defined(UNORDERED_SET_CXX17) >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using A = typename TypeParam::allocator_type; >+ A alloc(0); >+ std::vector<T> values; >+ for (size_t i = 0; i != 10; ++i) >+ values.push_back(hash_internal::Generator<T>()()); >+ TypeParam m(values.begin(), values.end(), 123, alloc); >+ EXPECT_EQ(m.get_allocator(), alloc); >+ EXPECT_THAT(keys(m), ::testing::UnorderedElementsAreArray(values)); >+ EXPECT_GE(m.bucket_count(), 123); >+#endif >+} >+ >+TYPED_TEST_P(ConstructorTest, InputIteratorBucketHashAlloc) { >+#if defined(UNORDERED_SET_CXX14) || defined(UNORDERED_SET_CXX17) >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using H = typename TypeParam::hasher; >+ using A = typename TypeParam::allocator_type; >+ H hasher; >+ A alloc(0); >+ std::vector<T> values; >+ for (size_t i = 0; i != 10; ++i) >+ values.push_back(hash_internal::Generator<T>()()); >+ TypeParam m(values.begin(), values.end(), 123, hasher, alloc); >+ EXPECT_EQ(m.hash_function(), hasher); >+ EXPECT_EQ(m.get_allocator(), alloc); >+ EXPECT_THAT(keys(m), ::testing::UnorderedElementsAreArray(values)); >+ EXPECT_GE(m.bucket_count(), 123); >+#endif >+} >+ >+TYPED_TEST_P(ConstructorTest, CopyConstructor) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using H = typename TypeParam::hasher; >+ using E = typename TypeParam::key_equal; >+ using A = typename TypeParam::allocator_type; >+ H hasher; >+ E equal; >+ A alloc(0); >+ 
TypeParam m(123, hasher, equal, alloc); >+ for (size_t i = 0; i != 10; ++i) m.insert(hash_internal::Generator<T>()()); >+ TypeParam n(m); >+ EXPECT_EQ(m.hash_function(), n.hash_function()); >+ EXPECT_EQ(m.key_eq(), n.key_eq()); >+ EXPECT_EQ(m.get_allocator(), n.get_allocator()); >+ EXPECT_EQ(m, n); >+} >+ >+TYPED_TEST_P(ConstructorTest, CopyConstructorAlloc) { >+#if ABSL_UNORDERED_SUPPORTS_ALLOC_CTORS >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using H = typename TypeParam::hasher; >+ using E = typename TypeParam::key_equal; >+ using A = typename TypeParam::allocator_type; >+ H hasher; >+ E equal; >+ A alloc(0); >+ TypeParam m(123, hasher, equal, alloc); >+ for (size_t i = 0; i != 10; ++i) m.insert(hash_internal::Generator<T>()()); >+ TypeParam n(m, A(11)); >+ EXPECT_EQ(m.hash_function(), n.hash_function()); >+ EXPECT_EQ(m.key_eq(), n.key_eq()); >+ EXPECT_NE(m.get_allocator(), n.get_allocator()); >+ EXPECT_EQ(m, n); >+#endif >+} >+ >+// TODO(alkis): Test non-propagating allocators on copy constructors. 
>+ >+TYPED_TEST_P(ConstructorTest, MoveConstructor) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using H = typename TypeParam::hasher; >+ using E = typename TypeParam::key_equal; >+ using A = typename TypeParam::allocator_type; >+ H hasher; >+ E equal; >+ A alloc(0); >+ TypeParam m(123, hasher, equal, alloc); >+ for (size_t i = 0; i != 10; ++i) m.insert(hash_internal::Generator<T>()()); >+ TypeParam t(m); >+ TypeParam n(std::move(t)); >+ EXPECT_EQ(m.hash_function(), n.hash_function()); >+ EXPECT_EQ(m.key_eq(), n.key_eq()); >+ EXPECT_EQ(m.get_allocator(), n.get_allocator()); >+ EXPECT_EQ(m, n); >+} >+ >+TYPED_TEST_P(ConstructorTest, MoveConstructorAlloc) { >+#if ABSL_UNORDERED_SUPPORTS_ALLOC_CTORS >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using H = typename TypeParam::hasher; >+ using E = typename TypeParam::key_equal; >+ using A = typename TypeParam::allocator_type; >+ H hasher; >+ E equal; >+ A alloc(0); >+ TypeParam m(123, hasher, equal, alloc); >+ for (size_t i = 0; i != 10; ++i) m.insert(hash_internal::Generator<T>()()); >+ TypeParam t(m); >+ TypeParam n(std::move(t), A(1)); >+ EXPECT_EQ(m.hash_function(), n.hash_function()); >+ EXPECT_EQ(m.key_eq(), n.key_eq()); >+ EXPECT_NE(m.get_allocator(), n.get_allocator()); >+ EXPECT_EQ(m, n); >+#endif >+} >+ >+// TODO(alkis): Test non-propagating allocators on move constructors. 
>+ >+TYPED_TEST_P(ConstructorTest, InitializerListBucketHashEqualAlloc) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ hash_internal::Generator<T> gen; >+ std::initializer_list<T> values = {gen(), gen(), gen(), gen(), gen()}; >+ using H = typename TypeParam::hasher; >+ using E = typename TypeParam::key_equal; >+ using A = typename TypeParam::allocator_type; >+ H hasher; >+ E equal; >+ A alloc(0); >+ TypeParam m(values, 123, hasher, equal, alloc); >+ EXPECT_EQ(m.hash_function(), hasher); >+ EXPECT_EQ(m.key_eq(), equal); >+ EXPECT_EQ(m.get_allocator(), alloc); >+ EXPECT_THAT(keys(m), ::testing::UnorderedElementsAreArray(values)); >+ EXPECT_GE(m.bucket_count(), 123); >+} >+ >+TYPED_TEST_P(ConstructorTest, InitializerListBucketAlloc) { >+#if defined(UNORDERED_SET_CXX14) || defined(UNORDERED_SET_CXX17) >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using A = typename TypeParam::allocator_type; >+ hash_internal::Generator<T> gen; >+ std::initializer_list<T> values = {gen(), gen(), gen(), gen(), gen()}; >+ A alloc(0); >+ TypeParam m(values, 123, alloc); >+ EXPECT_EQ(m.get_allocator(), alloc); >+ EXPECT_THAT(keys(m), ::testing::UnorderedElementsAreArray(values)); >+ EXPECT_GE(m.bucket_count(), 123); >+#endif >+} >+ >+TYPED_TEST_P(ConstructorTest, InitializerListBucketHashAlloc) { >+#if defined(UNORDERED_SET_CXX14) || defined(UNORDERED_SET_CXX17) >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using H = typename TypeParam::hasher; >+ using A = typename TypeParam::allocator_type; >+ H hasher; >+ A alloc(0); >+ hash_internal::Generator<T> gen; >+ std::initializer_list<T> values = {gen(), gen(), gen(), gen(), gen()}; >+ TypeParam m(values, 123, hasher, alloc); >+ EXPECT_EQ(m.hash_function(), hasher); >+ EXPECT_EQ(m.get_allocator(), alloc); >+ EXPECT_THAT(keys(m), ::testing::UnorderedElementsAreArray(values)); >+ EXPECT_GE(m.bucket_count(), 123); >+#endif >+} >+ >+TYPED_TEST_P(ConstructorTest, Assignment) { >+ using T = 
hash_internal::GeneratedType<TypeParam>; >+ using H = typename TypeParam::hasher; >+ using E = typename TypeParam::key_equal; >+ using A = typename TypeParam::allocator_type; >+ H hasher; >+ E equal; >+ A alloc(0); >+ hash_internal::Generator<T> gen; >+ TypeParam m({gen(), gen(), gen()}, 123, hasher, equal, alloc); >+ TypeParam n; >+ n = m; >+ EXPECT_EQ(m.hash_function(), n.hash_function()); >+ EXPECT_EQ(m.key_eq(), n.key_eq()); >+ EXPECT_EQ(m, n); >+} >+ >+// TODO(alkis): Test [non-]propagating allocators on move/copy assignments >+// (it depends on traits). >+ >+TYPED_TEST_P(ConstructorTest, MoveAssignment) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ using H = typename TypeParam::hasher; >+ using E = typename TypeParam::key_equal; >+ using A = typename TypeParam::allocator_type; >+ H hasher; >+ E equal; >+ A alloc(0); >+ hash_internal::Generator<T> gen; >+ TypeParam m({gen(), gen(), gen()}, 123, hasher, equal, alloc); >+ TypeParam t(m); >+ TypeParam n; >+ n = std::move(t); >+ EXPECT_EQ(m.hash_function(), n.hash_function()); >+ EXPECT_EQ(m.key_eq(), n.key_eq()); >+ EXPECT_EQ(m, n); >+} >+ >+TYPED_TEST_P(ConstructorTest, AssignmentFromInitializerList) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ hash_internal::Generator<T> gen; >+ std::initializer_list<T> values = {gen(), gen(), gen(), gen(), gen()}; >+ TypeParam m; >+ m = values; >+ EXPECT_THAT(keys(m), ::testing::UnorderedElementsAreArray(values)); >+} >+ >+TYPED_TEST_P(ConstructorTest, AssignmentOverwritesExisting) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ hash_internal::Generator<T> gen; >+ TypeParam m({gen(), gen(), gen()}); >+ TypeParam n({gen()}); >+ n = m; >+ EXPECT_EQ(m, n); >+} >+ >+TYPED_TEST_P(ConstructorTest, MoveAssignmentOverwritesExisting) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ hash_internal::Generator<T> gen; >+ TypeParam m({gen(), gen(), gen()}); >+ TypeParam t(m); >+ TypeParam n({gen()}); >+ n = std::move(t); >+ EXPECT_EQ(m, n); 
>+} >+ >+TYPED_TEST_P(ConstructorTest, AssignmentFromInitializerListOverwritesExisting) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ hash_internal::Generator<T> gen; >+ std::initializer_list<T> values = {gen(), gen(), gen(), gen(), gen()}; >+ TypeParam m; >+ m = values; >+ EXPECT_THAT(keys(m), ::testing::UnorderedElementsAreArray(values)); >+} >+ >+TYPED_TEST_P(ConstructorTest, AssignmentOnSelf) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ hash_internal::Generator<T> gen; >+ std::initializer_list<T> values = {gen(), gen(), gen(), gen(), gen()}; >+ TypeParam m(values); >+ m = *&m; // Avoid -Wself-assign. >+ EXPECT_THAT(keys(m), ::testing::UnorderedElementsAreArray(values)); >+} >+ >+REGISTER_TYPED_TEST_CASE_P( >+ ConstructorTest, NoArgs, BucketCount, BucketCountHash, BucketCountHashEqual, >+ BucketCountHashEqualAlloc, BucketCountAlloc, BucketCountHashAlloc, >+ BucketAlloc, InputIteratorBucketHashEqualAlloc, InputIteratorBucketAlloc, >+ InputIteratorBucketHashAlloc, CopyConstructor, CopyConstructorAlloc, >+ MoveConstructor, MoveConstructorAlloc, InitializerListBucketHashEqualAlloc, >+ InitializerListBucketAlloc, InitializerListBucketHashAlloc, Assignment, >+ MoveAssignment, AssignmentFromInitializerList, >+ AssignmentOverwritesExisting, MoveAssignmentOverwritesExisting, >+ AssignmentFromInitializerListOverwritesExisting, AssignmentOnSelf); >+ >+} // namespace container_internal >+} // namespace absl >+#endif // ABSL_CONTAINER_INTERNAL_UNORDERED_SET_CONSTRUCTOR_TEST_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/unordered_set_lookup_test.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/unordered_set_lookup_test.h >new file mode 100644 >index 0000000000000000000000000000000000000000..aca9c6a5df7b725c175993f47069da4e921729d3 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/unordered_set_lookup_test.h 
>@@ -0,0 +1,88 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. >+ >+#ifndef ABSL_CONTAINER_INTERNAL_UNORDERED_SET_LOOKUP_TEST_H_ >+#define ABSL_CONTAINER_INTERNAL_UNORDERED_SET_LOOKUP_TEST_H_ >+ >+#include "gmock/gmock.h" >+#include "gtest/gtest.h" >+#include "absl/container/internal/hash_generator_testing.h" >+#include "absl/container/internal/hash_policy_testing.h" >+ >+namespace absl { >+namespace container_internal { >+ >+template <class UnordSet> >+class LookupTest : public ::testing::Test {}; >+ >+TYPED_TEST_CASE_P(LookupTest); >+ >+TYPED_TEST_P(LookupTest, Count) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ std::vector<T> values; >+ std::generate_n(std::back_inserter(values), 10, >+ hash_internal::Generator<T>()); >+ TypeParam m; >+ for (const auto& v : values) >+ EXPECT_EQ(0, m.count(v)) << ::testing::PrintToString(v); >+ m.insert(values.begin(), values.end()); >+ for (const auto& v : values) >+ EXPECT_EQ(1, m.count(v)) << ::testing::PrintToString(v); >+} >+ >+TYPED_TEST_P(LookupTest, Find) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ std::vector<T> values; >+ std::generate_n(std::back_inserter(values), 10, >+ hash_internal::Generator<T>()); >+ TypeParam m; >+ for (const auto& v : values) >+ EXPECT_TRUE(m.end() == m.find(v)) << ::testing::PrintToString(v); >+ m.insert(values.begin(), values.end()); >+ for (const auto& v : values) { >+ typename TypeParam::iterator it = 
m.find(v); >+ static_assert(std::is_same<const typename TypeParam::value_type&, >+ decltype(*it)>::value, >+ ""); >+ static_assert(std::is_same<const typename TypeParam::value_type*, >+ decltype(it.operator->())>::value, >+ ""); >+ EXPECT_TRUE(m.end() != it) << ::testing::PrintToString(v); >+ EXPECT_EQ(v, *it) << ::testing::PrintToString(v); >+ } >+} >+ >+TYPED_TEST_P(LookupTest, EqualRange) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ std::vector<T> values; >+ std::generate_n(std::back_inserter(values), 10, >+ hash_internal::Generator<T>()); >+ TypeParam m; >+ for (const auto& v : values) { >+ auto r = m.equal_range(v); >+ ASSERT_EQ(0, std::distance(r.first, r.second)); >+ } >+ m.insert(values.begin(), values.end()); >+ for (const auto& v : values) { >+ auto r = m.equal_range(v); >+ ASSERT_EQ(1, std::distance(r.first, r.second)); >+ EXPECT_EQ(v, *r.first); >+ } >+} >+ >+REGISTER_TYPED_TEST_CASE_P(LookupTest, Count, Find, EqualRange); >+ >+} // namespace container_internal >+} // namespace absl >+#endif // ABSL_CONTAINER_INTERNAL_UNORDERED_SET_LOOKUP_TEST_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/unordered_set_modifiers_test.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/unordered_set_modifiers_test.h >new file mode 100644 >index 0000000000000000000000000000000000000000..9beacf3316973dd7bcdcb9382a136c18d0a8c76b >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/unordered_set_modifiers_test.h >@@ -0,0 +1,187 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. 
>+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. >+ >+#ifndef ABSL_CONTAINER_INTERNAL_UNORDERED_SET_MODIFIERS_TEST_H_ >+#define ABSL_CONTAINER_INTERNAL_UNORDERED_SET_MODIFIERS_TEST_H_ >+ >+#include "gmock/gmock.h" >+#include "gtest/gtest.h" >+#include "absl/container/internal/hash_generator_testing.h" >+#include "absl/container/internal/hash_policy_testing.h" >+ >+namespace absl { >+namespace container_internal { >+ >+template <class UnordSet> >+class ModifiersTest : public ::testing::Test {}; >+ >+TYPED_TEST_CASE_P(ModifiersTest); >+ >+TYPED_TEST_P(ModifiersTest, Clear) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ std::vector<T> values; >+ std::generate_n(std::back_inserter(values), 10, >+ hash_internal::Generator<T>()); >+ TypeParam m(values.begin(), values.end()); >+ ASSERT_THAT(keys(m), ::testing::UnorderedElementsAreArray(values)); >+ m.clear(); >+ EXPECT_THAT(keys(m), ::testing::UnorderedElementsAre()); >+ EXPECT_TRUE(m.empty()); >+} >+ >+TYPED_TEST_P(ModifiersTest, Insert) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ T val = hash_internal::Generator<T>()(); >+ TypeParam m; >+ auto p = m.insert(val); >+ EXPECT_TRUE(p.second); >+ EXPECT_EQ(val, *p.first); >+ p = m.insert(val); >+ EXPECT_FALSE(p.second); >+} >+ >+TYPED_TEST_P(ModifiersTest, InsertHint) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ T val = hash_internal::Generator<T>()(); >+ TypeParam m; >+ auto it = m.insert(m.end(), val); >+ EXPECT_TRUE(it != m.end()); >+ EXPECT_EQ(val, *it); >+ it = m.insert(it, val); >+ EXPECT_TRUE(it != m.end()); >+ EXPECT_EQ(val, *it); >+} >+ 
>+TYPED_TEST_P(ModifiersTest, InsertRange) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ std::vector<T> values; >+ std::generate_n(std::back_inserter(values), 10, >+ hash_internal::Generator<T>()); >+ TypeParam m; >+ m.insert(values.begin(), values.end()); >+ ASSERT_THAT(keys(m), ::testing::UnorderedElementsAreArray(values)); >+} >+ >+TYPED_TEST_P(ModifiersTest, Emplace) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ T val = hash_internal::Generator<T>()(); >+ TypeParam m; >+ // TODO(alkis): We need a way to run emplace in a more meaningful way. Perhaps >+ // with test traits/policy. >+ auto p = m.emplace(val); >+ EXPECT_TRUE(p.second); >+ EXPECT_EQ(val, *p.first); >+ p = m.emplace(val); >+ EXPECT_FALSE(p.second); >+ EXPECT_EQ(val, *p.first); >+} >+ >+TYPED_TEST_P(ModifiersTest, EmplaceHint) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ T val = hash_internal::Generator<T>()(); >+ TypeParam m; >+ // TODO(alkis): We need a way to run emplace in a more meaningful way. Perhaps >+ // with test traits/policy. >+ auto it = m.emplace_hint(m.end(), val); >+ EXPECT_EQ(val, *it); >+ it = m.emplace_hint(it, val); >+ EXPECT_EQ(val, *it); >+} >+ >+template <class V> >+using IfNotVoid = typename std::enable_if<!std::is_void<V>::value, V>::type; >+ >+// In openmap we chose not to return the iterator from erase because that's >+// more expensive. As such we adapt erase to return an iterator here. >+struct EraseFirst { >+ template <class Map> >+ auto operator()(Map* m, int) const >+ -> IfNotVoid<decltype(m->erase(m->begin()))> { >+ return m->erase(m->begin()); >+ } >+ template <class Map> >+ typename Map::iterator operator()(Map* m, ...) 
const { >+ auto it = m->begin(); >+ m->erase(it++); >+ return it; >+ } >+}; >+ >+TYPED_TEST_P(ModifiersTest, Erase) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ std::vector<T> values; >+ std::generate_n(std::back_inserter(values), 10, >+ hash_internal::Generator<T>()); >+ TypeParam m(values.begin(), values.end()); >+ ASSERT_THAT(keys(m), ::testing::UnorderedElementsAreArray(values)); >+ std::vector<T> values2; >+ for (const auto& val : values) >+ if (val != *m.begin()) values2.push_back(val); >+ auto it = EraseFirst()(&m, 0); >+ ASSERT_TRUE(it != m.end()); >+ EXPECT_EQ(1, std::count(values2.begin(), values2.end(), *it)); >+ EXPECT_THAT(keys(m), ::testing::UnorderedElementsAreArray(values2.begin(), >+ values2.end())); >+} >+ >+TYPED_TEST_P(ModifiersTest, EraseRange) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ std::vector<T> values; >+ std::generate_n(std::back_inserter(values), 10, >+ hash_internal::Generator<T>()); >+ TypeParam m(values.begin(), values.end()); >+ ASSERT_THAT(keys(m), ::testing::UnorderedElementsAreArray(values)); >+ auto it = m.erase(m.begin(), m.end()); >+ EXPECT_THAT(keys(m), ::testing::UnorderedElementsAre()); >+ EXPECT_TRUE(it == m.end()); >+} >+ >+TYPED_TEST_P(ModifiersTest, EraseKey) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ std::vector<T> values; >+ std::generate_n(std::back_inserter(values), 10, >+ hash_internal::Generator<T>()); >+ TypeParam m(values.begin(), values.end()); >+ ASSERT_THAT(keys(m), ::testing::UnorderedElementsAreArray(values)); >+ EXPECT_EQ(1, m.erase(values[0])); >+ EXPECT_EQ(0, std::count(m.begin(), m.end(), values[0])); >+ EXPECT_THAT(keys(m), ::testing::UnorderedElementsAreArray(values.begin() + 1, >+ values.end())); >+} >+ >+TYPED_TEST_P(ModifiersTest, Swap) { >+ using T = hash_internal::GeneratedType<TypeParam>; >+ std::vector<T> v1; >+ std::vector<T> v2; >+ std::generate_n(std::back_inserter(v1), 5, hash_internal::Generator<T>()); >+ 
std::generate_n(std::back_inserter(v2), 5, hash_internal::Generator<T>()); >+ TypeParam m1(v1.begin(), v1.end()); >+ TypeParam m2(v2.begin(), v2.end()); >+ EXPECT_THAT(keys(m1), ::testing::UnorderedElementsAreArray(v1)); >+ EXPECT_THAT(keys(m2), ::testing::UnorderedElementsAreArray(v2)); >+ m1.swap(m2); >+ EXPECT_THAT(keys(m1), ::testing::UnorderedElementsAreArray(v2)); >+ EXPECT_THAT(keys(m2), ::testing::UnorderedElementsAreArray(v1)); >+} >+ >+// TODO(alkis): Write tests for extract. >+// TODO(alkis): Write tests for merge. >+ >+REGISTER_TYPED_TEST_CASE_P(ModifiersTest, Clear, Insert, InsertHint, >+ InsertRange, Emplace, EmplaceHint, Erase, EraseRange, >+ EraseKey, Swap); >+ >+} // namespace container_internal >+} // namespace absl >+#endif // ABSL_CONTAINER_INTERNAL_UNORDERED_SET_MODIFIERS_TEST_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/unordered_set_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/unordered_set_test.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..1281ce53d5e2e40728fb6ca8dd90c0c9498aefa3 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/internal/unordered_set_test.cc >@@ -0,0 +1,37 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. 
>+ >+#include <unordered_set> >+ >+#include "absl/container/internal/unordered_set_constructor_test.h" >+#include "absl/container/internal/unordered_set_lookup_test.h" >+#include "absl/container/internal/unordered_set_modifiers_test.h" >+ >+namespace absl { >+namespace container_internal { >+namespace { >+ >+using SetTypes = >+ ::testing::Types<std::unordered_set<int, StatefulTestingHash, >+ StatefulTestingEqual, Alloc<int>>, >+ std::unordered_set<std::string, StatefulTestingHash, >+ StatefulTestingEqual, Alloc<std::string>>>; >+ >+INSTANTIATE_TYPED_TEST_CASE_P(UnorderedSet, ConstructorTest, SetTypes); >+INSTANTIATE_TYPED_TEST_CASE_P(UnorderedSet, LookupTest, SetTypes); >+INSTANTIATE_TYPED_TEST_CASE_P(UnorderedSet, ModifiersTest, SetTypes); >+ >+} // namespace >+} // namespace container_internal >+} // namespace absl >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/node_hash_map.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/node_hash_map.h >new file mode 100644 >index 0000000000000000000000000000000000000000..6369ec3ad6431a47e64316588e46714317e61d9a >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/node_hash_map.h >@@ -0,0 +1,530 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. 
>+// >+// ----------------------------------------------------------------------------- >+// File: node_hash_map.h >+// ----------------------------------------------------------------------------- >+// >+// An `absl::node_hash_map<K, V>` is an unordered associative container of >+// unique keys and associated values designed to be a more efficient replacement >+// for `std::unordered_map`. Like `unordered_map`, search, insertion, and >+// deletion of map elements can be done as an `O(1)` operation. However, >+// `node_hash_map` (and other unordered associative containers known as the >+// collection of Abseil "Swiss tables") contain other optimizations that result >+// in both memory and computation advantages. >+// >+// In most cases, your default choice for a hash map should be a map of type >+// `flat_hash_map`. However, if you need pointer stability and cannot store >+// a `flat_hash_map` with `unique_ptr` elements, a `node_hash_map` may be a >+// valid alternative. As well, if you are migrating your code from using >+// `std::unordered_map`, a `node_hash_map` provides a more straightforward >+// migration, because it guarantees pointer stability. Consider migrating to >+// `node_hash_map` and perhaps converting to a more efficient `flat_hash_map` >+// upon further review. 
>+ >+#ifndef ABSL_CONTAINER_NODE_HASH_MAP_H_ >+#define ABSL_CONTAINER_NODE_HASH_MAP_H_ >+ >+#include <tuple> >+#include <type_traits> >+#include <utility> >+ >+#include "absl/container/internal/container_memory.h" >+#include "absl/container/internal/hash_function_defaults.h" // IWYU pragma: export >+#include "absl/container/internal/node_hash_policy.h" >+#include "absl/container/internal/raw_hash_map.h" // IWYU pragma: export >+#include "absl/memory/memory.h" >+ >+namespace absl { >+namespace container_internal { >+template <class Key, class Value> >+class NodeHashMapPolicy; >+} // namespace container_internal >+ >+// ----------------------------------------------------------------------------- >+// absl::node_hash_map >+// ----------------------------------------------------------------------------- >+// >+// An `absl::node_hash_map<K, V>` is an unordered associative container which >+// has been optimized for both speed and memory footprint in most common use >+// cases. Its interface is similar to that of `std::unordered_map<K, V>` with >+// the following notable differences: >+// >+// * Supports heterogeneous lookup, through `find()`, `operator[]()` and >+// `insert()`, provided that the map is provided a compatible heterogeneous >+// hashing function and equality operator. >+// * Contains a `capacity()` member function indicating the number of element >+// slots (open, deleted, and empty) within the hash map. >+// * Returns `void` from the `erase(iterator)` overload. >+// >+// By default, `node_hash_map` uses the `absl::Hash` hashing framework. >+// All fundamental and Abseil types that support the `absl::Hash` framework have >+// a compatible equality operator for comparing insertions into `node_hash_map`. >+// If your type is not yet supported by the `absl::Hash` framework, see >+// absl/hash/hash.h for information on extending Abseil hashing to user-defined >+// types.
>+// >+// Example: >+// >+// // Create a node hash map of three strings (that map to strings) >+// absl::node_hash_map<std::string, std::string> ducks = >+// {{"a", "huey"}, {"b", "dewey"}, {"c", "louie"}}; >+// >+// // Insert a new element into the node hash map >+// ducks.insert({"d", "donald"}); >+// >+// // Force a rehash of the node hash map >+// ducks.rehash(0); >+// >+// // Find the element with the key "b" >+// std::string search_key = "b"; >+// auto result = ducks.find(search_key); >+// if (result != ducks.end()) { >+// std::cout << "Result: " << result->second << std::endl; >+// } >+template <class Key, class Value, >+ class Hash = absl::container_internal::hash_default_hash<Key>, >+ class Eq = absl::container_internal::hash_default_eq<Key>, >+ class Alloc = std::allocator<std::pair<const Key, Value>>> >+class node_hash_map >+ : public absl::container_internal::raw_hash_map< >+ absl::container_internal::NodeHashMapPolicy<Key, Value>, Hash, Eq, >+ Alloc> { >+ using Base = typename node_hash_map::raw_hash_map; >+ >+ public: >+ node_hash_map() {} >+ using Base::Base; >+ >+ // node_hash_map::begin() >+ // >+ // Returns an iterator to the beginning of the `node_hash_map`. >+ using Base::begin; >+ >+ // node_hash_map::cbegin() >+ // >+ // Returns a const iterator to the beginning of the `node_hash_map`. >+ using Base::cbegin; >+ >+ // node_hash_map::cend() >+ // >+ // Returns a const iterator to the end of the `node_hash_map`. >+ using Base::cend; >+ >+ // node_hash_map::end() >+ // >+ // Returns an iterator to the end of the `node_hash_map`. >+ using Base::end; >+ >+ // node_hash_map::capacity() >+ // >+ // Returns the number of element slots (assigned, deleted, and empty) >+ // available within the `node_hash_map`. >+ // >+ // NOTE: this member function is particular to `absl::node_hash_map` and is >+ // not provided in the `std::unordered_map` API.
>+ using Base::capacity; >+ >+ // node_hash_map::empty() >+ // >+ // Returns whether or not the `node_hash_map` is empty. >+ using Base::empty; >+ >+ // node_hash_map::max_size() >+ // >+ // Returns the largest theoretical possible number of elements within a >+ // `node_hash_map` under current memory constraints. This value can be thought >+ // of as the largest value of `std::distance(begin(), end())` for a >+ // `node_hash_map<K, V>`. >+ using Base::max_size; >+ >+ // node_hash_map::size() >+ // >+ // Returns the number of elements currently within the `node_hash_map`. >+ using Base::size; >+ >+ // node_hash_map::clear() >+ // >+ // Removes all elements from the `node_hash_map`. Invalidates any references, >+ // pointers, or iterators referring to contained elements. >+ // >+ // NOTE: this operation may shrink the underlying buffer. To avoid shrinking >+ // the underlying buffer call `erase(begin(), end())`. >+ using Base::clear; >+ >+ // node_hash_map::erase() >+ // >+ // Erases elements within the `node_hash_map`. Erasing does not trigger a >+ // rehash. Overloads are listed below. >+ // >+ // void erase(const_iterator pos): >+ // >+ // Erases the element at `position` of the `node_hash_map`, returning >+ // `void`. >+ // >+ // NOTE: this return behavior is different than that of STL containers in >+ // general and `std::unordered_map` in particular. >+ // >+ // iterator erase(const_iterator first, const_iterator last): >+ // >+ // Erases the elements in the open interval [`first`, `last`), returning an >+ // iterator pointing to `last`. >+ // >+ // size_type erase(const key_type& key): >+ // >+ // Erases the element with the matching key, if it exists. >+ using Base::erase; >+ >+ // node_hash_map::insert() >+ // >+ // Inserts an element of the specified value into the `node_hash_map`, >+ // returning an iterator pointing to the newly inserted element, provided that >+ // an element with the given key does not already exist. 
If rehashing occurs >+ // due to the insertion, all iterators are invalidated. Overloads are listed >+ // below. >+ // >+ // std::pair<iterator,bool> insert(const init_type& value): >+ // >+ // Inserts a value into the `node_hash_map`. Returns a pair consisting of an >+ // iterator to the inserted element (or to the element that prevented the >+ // insertion) and a `bool` denoting whether the insertion took place. >+ // >+ // std::pair<iterator,bool> insert(T&& value): >+ // std::pair<iterator,bool> insert(init_type&& value): >+ // >+ // Inserts a moveable value into the `node_hash_map`. Returns a `std::pair` >+ // consisting of an iterator to the inserted element (or to the element that >+ // prevented the insertion) and a `bool` denoting whether the insertion took >+ // place. >+ // >+ // iterator insert(const_iterator hint, const init_type& value): >+ // iterator insert(const_iterator hint, T&& value): >+ // iterator insert(const_iterator hint, init_type&& value); >+ // >+ // Inserts a value, using the position of `hint` as a non-binding suggestion >+ // for where to begin the insertion search. Returns an iterator to the >+ // inserted element, or to the existing element that prevented the >+ // insertion. >+ // >+ // void insert(InputIterator first, InputIterator last): >+ // >+ // Inserts a range of values [`first`, `last`). >+ // >+ // NOTE: Although the STL does not specify which element may be inserted if >+ // multiple keys compare equivalently, for `node_hash_map` we guarantee the >+ // first match is inserted. >+ // >+ // void insert(std::initializer_list<init_type> ilist): >+ // >+ // Inserts the elements within the initializer list `ilist`. >+ // >+ // NOTE: Although the STL does not specify which element may be inserted if >+ // multiple keys compare equivalently within the initializer list, for >+ // `node_hash_map` we guarantee the first match is inserted. 
>+  using Base::insert;
>+
>+  // node_hash_map::insert_or_assign()
>+  //
>+  // Inserts an element of the specified value into the `node_hash_map` provided
>+  // that a value with the given key does not already exist, or replaces it with
>+  // the element value if a key for that value already exists, returning an
>+  // iterator pointing to the newly inserted element. If rehashing occurs due to
>+  // the insertion, all iterators are invalidated. Overloads are listed
>+  // below.
>+  //
>+  // std::pair<iterator, bool> insert_or_assign(const key_type& k, T&& obj):
>+  // std::pair<iterator, bool> insert_or_assign(key_type&& k, T&& obj):
>+  //
>+  //   Inserts/Assigns (or moves) the element of the specified key into the
>+  //   `node_hash_map`.
>+  //
>+  // iterator insert_or_assign(const_iterator hint,
>+  //                           const key_type& k, T&& obj):
>+  // iterator insert_or_assign(const_iterator hint, key_type&& k, T&& obj):
>+  //
>+  //   Inserts/Assigns (or moves) the element of the specified key into the
>+  //   `node_hash_map` using the position of `hint` as a non-binding suggestion
>+  //   for where to begin the insertion search.
>+  using Base::insert_or_assign;
>+
>+  // node_hash_map::emplace()
>+  //
>+  // Inserts an element of the specified value by constructing it in-place
>+  // within the `node_hash_map`, provided that no element with the given key
>+  // already exists.
>+  //
>+  // The element may be constructed even if there already is an element with the
>+  // key in the container, in which case the newly constructed element will be
>+  // destroyed immediately. Prefer `try_emplace()` unless your key is not
>+  // copyable or moveable.
>+  //
>+  // If rehashing occurs due to the insertion, all iterators are invalidated.
>+  using Base::emplace;
>+
>+  // node_hash_map::emplace_hint()
>+  //
>+  // Inserts an element of the specified value by constructing it in-place
>+  // within the `node_hash_map`, using the position of `hint` as a non-binding
>+  // suggestion for where to begin the insertion search, and only inserts
>+  // provided that no element with the given key already exists.
>+  //
>+  // The element may be constructed even if there already is an element with the
>+  // key in the container, in which case the newly constructed element will be
>+  // destroyed immediately. Prefer `try_emplace()` unless your key is not
>+  // copyable or moveable.
>+  //
>+  // If rehashing occurs due to the insertion, all iterators are invalidated.
>+  using Base::emplace_hint;
>+
>+  // node_hash_map::try_emplace()
>+  //
>+  // Inserts an element of the specified value by constructing it in-place
>+  // within the `node_hash_map`, provided that no element with the given key
>+  // already exists. Unlike `emplace()`, if an element with the given key
>+  // already exists, we guarantee that no element is constructed.
>+  //
>+  // If rehashing occurs due to the insertion, all iterators are invalidated.
>+  // Overloads are listed below.
>+  //
>+  // std::pair<iterator, bool> try_emplace(const key_type& k, Args&&... args):
>+  // std::pair<iterator, bool> try_emplace(key_type&& k, Args&&... args):
>+  //
>+  //   Inserts (via copy or move) the element of the specified key into the
>+  //   `node_hash_map`.
>+  //
>+  // iterator try_emplace(const_iterator hint,
>+  //                      const key_type& k, Args&&... args):
>+  // iterator try_emplace(const_iterator hint, key_type&& k, Args&&... args):
>+  //
>+  //   Inserts (via copy or move) the element of the specified key into the
>+  //   `node_hash_map` using the position of `hint` as a non-binding suggestion
>+  //   for where to begin the insertion search.
>+  using Base::try_emplace;
>+
>+  // node_hash_map::extract()
>+  //
>+  // Extracts the indicated element, erasing it in the process, and returns it
>+  // as a C++17-compatible node handle. Overloads are listed below.
>+  //
>+  // node_type extract(const_iterator position):
>+  //
>+  //   Extracts the key,value pair of the element at the indicated position and
>+  //   returns a node handle owning that extracted data.
>+  //
>+  // node_type extract(const key_type& x):
>+  //
>+  //   Extracts the key,value pair of the element with a key matching the passed
>+  //   key value and returns a node handle owning that extracted data. If the
>+  //   `node_hash_map` does not contain an element with a matching key, this
>+  //   function returns an empty node handle.
>+  using Base::extract;
>+
>+  // node_hash_map::merge()
>+  //
>+  // Extracts elements from a given `source` node hash map into this
>+  // `node_hash_map`. If the destination `node_hash_map` already contains an
>+  // element with an equivalent key, that element is not extracted.
>+  using Base::merge;
>+
>+  // node_hash_map::swap(node_hash_map& other)
>+  //
>+  // Exchanges the contents of this `node_hash_map` with those of the `other`
>+  // node hash map, avoiding invocation of any move, copy, or swap operations on
>+  // individual elements.
>+  //
>+  // All iterators and references on the `node_hash_map` remain valid, except
>+  // for the past-the-end iterator, which is invalidated.
>+  //
>+  // `swap()` requires that the node hash map's hashing and key equivalence
>+  // functions be Swappable, and are exchanged using unqualified calls to
>+  // non-member `swap()`. If the map's allocator has
>+  // `std::allocator_traits<allocator_type>::propagate_on_container_swap::value`
>+  // set to `true`, the allocators are also exchanged using an unqualified call
>+  // to non-member `swap()`; otherwise, the allocators are not swapped.
>+  using Base::swap;
>+
>+  // node_hash_map::rehash(count)
>+  //
>+  // Rehashes the `node_hash_map`, setting the number of slots to be at least
>+  // the passed value. If the new number of slots increases the load factor more
>+  // than the current maximum load factor
>+  // (`count` < `size()` / `max_load_factor()`), then the new number of slots
>+  // will be at least `size()` / `max_load_factor()`.
>+  //
>+  // To force a rehash, pass rehash(0).
>+  using Base::rehash;
>+
>+  // node_hash_map::reserve(count)
>+  //
>+  // Sets the number of slots in the `node_hash_map` to the number needed to
>+  // accommodate at least `count` total elements without exceeding the current
>+  // maximum load factor, and may rehash the container if needed.
>+  using Base::reserve;
>+
>+  // node_hash_map::at()
>+  //
>+  // Returns a reference to the mapped value of the element with key equivalent
>+  // to the passed key.
>+  using Base::at;
>+
>+  // node_hash_map::contains()
>+  //
>+  // Determines whether an element with a key comparing equal to the given `key`
>+  // exists within the `node_hash_map`, returning `true` if so or `false`
>+  // otherwise.
>+  using Base::contains;
>+
>+  // node_hash_map::count(const Key& key) const
>+  //
>+  // Returns the number of elements with a key comparing equal to the given
>+  // `key` within the `node_hash_map`. Note that this function will return
>+  // either `1` or `0` since duplicate keys are not allowed within a
>+  // `node_hash_map`.
>+  using Base::count;
>+
>+  // node_hash_map::equal_range()
>+  //
>+  // Returns a half-open range `[first, last)`, defined by a `std::pair` of two
>+  // iterators, containing all elements with the passed key in the
>+  // `node_hash_map`.
>+  using Base::equal_range;
>+
>+  // node_hash_map::find()
>+  //
>+  // Finds an element with the passed `key` within the `node_hash_map`.
>+ using Base::find; >+ >+ // node_hash_map::operator[]() >+ // >+ // Returns a reference to the value mapped to the passed key within the >+ // `node_hash_map`, performing an `insert()` if the key does not already >+ // exist. If an insertion occurs and results in a rehashing of the container, >+ // all iterators are invalidated. Otherwise iterators are not affected and >+ // references are not invalidated. Overloads are listed below. >+ // >+ // T& operator[](const Key& key): >+ // >+ // Inserts an init_type object constructed in-place if the element with the >+ // given key does not exist. >+ // >+ // T& operator[](Key&& key): >+ // >+ // Inserts an init_type object constructed in-place provided that an element >+ // with the given key does not exist. >+ using Base::operator[]; >+ >+ // node_hash_map::bucket_count() >+ // >+ // Returns the number of "buckets" within the `node_hash_map`. >+ using Base::bucket_count; >+ >+ // node_hash_map::load_factor() >+ // >+ // Returns the current load factor of the `node_hash_map` (the average number >+ // of slots occupied with a value within the hash map). >+ using Base::load_factor; >+ >+ // node_hash_map::max_load_factor() >+ // >+ // Manages the maximum load factor of the `node_hash_map`. Overloads are >+ // listed below. >+ // >+ // float node_hash_map::max_load_factor() >+ // >+ // Returns the current maximum load factor of the `node_hash_map`. >+ // >+ // void node_hash_map::max_load_factor(float ml) >+ // >+ // Sets the maximum load factor of the `node_hash_map` to the passed value. >+ // >+ // NOTE: This overload is provided only for API compatibility with the STL; >+ // `node_hash_map` will ignore any set load factor and manage its rehashing >+ // internally as an implementation detail. >+ using Base::max_load_factor; >+ >+ // node_hash_map::get_allocator() >+ // >+ // Returns the allocator function associated with this `node_hash_map`. 
>+ using Base::get_allocator; >+ >+ // node_hash_map::hash_function() >+ // >+ // Returns the hashing function used to hash the keys within this >+ // `node_hash_map`. >+ using Base::hash_function; >+ >+ // node_hash_map::key_eq() >+ // >+ // Returns the function used for comparing keys equality. >+ using Base::key_eq; >+ >+ ABSL_DEPRECATED("Call `hash_function()` instead.") >+ typename Base::hasher hash_funct() { return this->hash_function(); } >+ >+ ABSL_DEPRECATED("Call `rehash()` instead.") >+ void resize(typename Base::size_type hint) { this->rehash(hint); } >+}; >+ >+namespace container_internal { >+ >+template <class Key, class Value> >+class NodeHashMapPolicy >+ : public absl::container_internal::node_hash_policy< >+ std::pair<const Key, Value>&, NodeHashMapPolicy<Key, Value>> { >+ using value_type = std::pair<const Key, Value>; >+ >+ public: >+ using key_type = Key; >+ using mapped_type = Value; >+ using init_type = std::pair</*non const*/ key_type, mapped_type>; >+ >+ template <class Allocator, class... Args> >+ static value_type* new_element(Allocator* alloc, Args&&... args) { >+ using PairAlloc = typename absl::allocator_traits< >+ Allocator>::template rebind_alloc<value_type>; >+ PairAlloc pair_alloc(*alloc); >+ value_type* res = >+ absl::allocator_traits<PairAlloc>::allocate(pair_alloc, 1); >+ absl::allocator_traits<PairAlloc>::construct(pair_alloc, res, >+ std::forward<Args>(args)...); >+ return res; >+ } >+ >+ template <class Allocator> >+ static void delete_element(Allocator* alloc, value_type* pair) { >+ using PairAlloc = typename absl::allocator_traits< >+ Allocator>::template rebind_alloc<value_type>; >+ PairAlloc pair_alloc(*alloc); >+ absl::allocator_traits<PairAlloc>::destroy(pair_alloc, pair); >+ absl::allocator_traits<PairAlloc>::deallocate(pair_alloc, pair, 1); >+ } >+ >+ template <class F, class... Args> >+ static decltype(absl::container_internal::DecomposePair( >+ std::declval<F>(), std::declval<Args>()...)) >+ apply(F&& f, Args&&... 
args) { >+ return absl::container_internal::DecomposePair(std::forward<F>(f), >+ std::forward<Args>(args)...); >+ } >+ >+ static size_t element_space_used(const value_type*) { >+ return sizeof(value_type); >+ } >+ >+ static Value& value(value_type* elem) { return elem->second; } >+ static const Value& value(const value_type* elem) { return elem->second; } >+}; >+} // namespace container_internal >+} // namespace absl >+#endif // ABSL_CONTAINER_NODE_HASH_MAP_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/node_hash_map_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/node_hash_map_test.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..bd789644c24e8c81741493e55bf0b61e2f356660 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/node_hash_map_test.cc >@@ -0,0 +1,218 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. 
>+
>+#include "absl/container/node_hash_map.h"
>+
>+#include "absl/container/internal/tracked.h"
>+#include "absl/container/internal/unordered_map_constructor_test.h"
>+#include "absl/container/internal/unordered_map_lookup_test.h"
>+#include "absl/container/internal/unordered_map_modifiers_test.h"
>+
>+namespace absl {
>+namespace container_internal {
>+namespace {
>+
>+using ::testing::Field;
>+using ::testing::Pair;
>+using ::testing::UnorderedElementsAre;
>+
>+using MapTypes = ::testing::Types<
>+    absl::node_hash_map<int, int, StatefulTestingHash, StatefulTestingEqual,
>+                        Alloc<std::pair<const int, int>>>,
>+    absl::node_hash_map<std::string, std::string, StatefulTestingHash,
>+                        StatefulTestingEqual,
>+                        Alloc<std::pair<const std::string, std::string>>>>;
>+
>+INSTANTIATE_TYPED_TEST_CASE_P(NodeHashMap, ConstructorTest, MapTypes);
>+INSTANTIATE_TYPED_TEST_CASE_P(NodeHashMap, LookupTest, MapTypes);
>+INSTANTIATE_TYPED_TEST_CASE_P(NodeHashMap, ModifiersTest, MapTypes);
>+
>+using M = absl::node_hash_map<std::string, Tracked<int>>;
>+
>+TEST(NodeHashMap, Emplace) {
>+  M m;
>+  Tracked<int> t(53);
>+  m.emplace("a", t);
>+  ASSERT_EQ(0, t.num_moves());
>+  ASSERT_EQ(1, t.num_copies());
>+
>+  m.emplace(std::string("a"), t);
>+  ASSERT_EQ(0, t.num_moves());
>+  ASSERT_EQ(1, t.num_copies());
>+
>+  std::string a("a");
>+  m.emplace(a, t);
>+  ASSERT_EQ(0, t.num_moves());
>+  ASSERT_EQ(1, t.num_copies());
>+
>+  const std::string ca("a");
>+  m.emplace(ca, t);
>+  ASSERT_EQ(0, t.num_moves());
>+  ASSERT_EQ(1, t.num_copies());
>+
>+  m.emplace(std::make_pair("a", t));
>+  ASSERT_EQ(0, t.num_moves());
>+  ASSERT_EQ(2, t.num_copies());
>+
>+  m.emplace(std::make_pair(std::string("a"), t));
>+  ASSERT_EQ(0, t.num_moves());
>+  ASSERT_EQ(3, t.num_copies());
>+
>+  std::pair<std::string, Tracked<int>> p("a", t);
>+  ASSERT_EQ(0, t.num_moves());
>+  ASSERT_EQ(4, t.num_copies());
>+  m.emplace(p);
>+  ASSERT_EQ(0, t.num_moves());
>+  ASSERT_EQ(4, t.num_copies());
>+
>+  const std::pair<std::string,
Tracked<int>> cp("a", t); >+ ASSERT_EQ(0, t.num_moves()); >+ ASSERT_EQ(5, t.num_copies()); >+ m.emplace(cp); >+ ASSERT_EQ(0, t.num_moves()); >+ ASSERT_EQ(5, t.num_copies()); >+ >+ std::pair<const std::string, Tracked<int>> pc("a", t); >+ ASSERT_EQ(0, t.num_moves()); >+ ASSERT_EQ(6, t.num_copies()); >+ m.emplace(pc); >+ ASSERT_EQ(0, t.num_moves()); >+ ASSERT_EQ(6, t.num_copies()); >+ >+ const std::pair<const std::string, Tracked<int>> cpc("a", t); >+ ASSERT_EQ(0, t.num_moves()); >+ ASSERT_EQ(7, t.num_copies()); >+ m.emplace(cpc); >+ ASSERT_EQ(0, t.num_moves()); >+ ASSERT_EQ(7, t.num_copies()); >+ >+ m.emplace(std::piecewise_construct, std::forward_as_tuple("a"), >+ std::forward_as_tuple(t)); >+ ASSERT_EQ(0, t.num_moves()); >+ ASSERT_EQ(7, t.num_copies()); >+ >+ m.emplace(std::piecewise_construct, std::forward_as_tuple(std::string("a")), >+ std::forward_as_tuple(t)); >+ ASSERT_EQ(0, t.num_moves()); >+ ASSERT_EQ(7, t.num_copies()); >+} >+ >+TEST(NodeHashMap, AssignRecursive) { >+ struct Tree { >+ // Verify that unordered_map<K, IncompleteType> can be instantiated. >+ absl::node_hash_map<int, Tree> children; >+ }; >+ Tree root; >+ const Tree& child = root.children.emplace().first->second; >+ // Verify that `lhs = rhs` doesn't read rhs after clearing lhs. 
>+  root = child;
>+}
>+
>+TEST(NodeHashMap, MoveOnlyKey) {
>+  struct Key {
>+    Key() = default;
>+    Key(Key&&) = default;
>+    Key& operator=(Key&&) = default;
>+  };
>+  struct Eq {
>+    bool operator()(const Key&, const Key&) const { return true; }
>+  };
>+  struct Hash {
>+    size_t operator()(const Key&) const { return 0; }
>+  };
>+  absl::node_hash_map<Key, int, Hash, Eq> m;
>+  m[Key()];
>+}
>+
>+struct NonMovableKey {
>+  explicit NonMovableKey(int i) : i(i) {}
>+  NonMovableKey(NonMovableKey&&) = delete;
>+  int i;
>+};
>+struct NonMovableKeyHash {
>+  using is_transparent = void;
>+  size_t operator()(const NonMovableKey& k) const { return k.i; }
>+  size_t operator()(int k) const { return k; }
>+};
>+struct NonMovableKeyEq {
>+  using is_transparent = void;
>+  bool operator()(const NonMovableKey& a, const NonMovableKey& b) const {
>+    return a.i == b.i;
>+  }
>+  bool operator()(const NonMovableKey& a, int b) const { return a.i == b; }
>+};
>+
>+TEST(NodeHashMap, MergeExtractInsert) {
>+  absl::node_hash_map<NonMovableKey, int, NonMovableKeyHash, NonMovableKeyEq>
>+      set1, set2;
>+  set1.emplace(std::piecewise_construct, std::make_tuple(7),
>+               std::make_tuple(-7));
>+  set1.emplace(std::piecewise_construct, std::make_tuple(17),
>+               std::make_tuple(-17));
>+
>+  set2.emplace(std::piecewise_construct, std::make_tuple(7),
>+               std::make_tuple(-70));
>+  set2.emplace(std::piecewise_construct, std::make_tuple(19),
>+               std::make_tuple(-190));
>+
>+  auto Elem = [](int key, int value) {
>+    return Pair(Field(&NonMovableKey::i, key), value);
>+  };
>+
>+  EXPECT_THAT(set1, UnorderedElementsAre(Elem(7, -7), Elem(17, -17)));
>+  EXPECT_THAT(set2, UnorderedElementsAre(Elem(7, -70), Elem(19, -190)));
>+
>+  // NonMovableKey is neither copyable nor movable. We should still be able to
>+  // move nodes around.
>+ static_assert(!std::is_move_constructible<NonMovableKey>::value, ""); >+ set1.merge(set2); >+ >+ EXPECT_THAT(set1, >+ UnorderedElementsAre(Elem(7, -7), Elem(17, -17), Elem(19, -190))); >+ EXPECT_THAT(set2, UnorderedElementsAre(Elem(7, -70))); >+ >+ auto node = set1.extract(7); >+ EXPECT_TRUE(node); >+ EXPECT_EQ(node.key().i, 7); >+ EXPECT_EQ(node.mapped(), -7); >+ EXPECT_THAT(set1, UnorderedElementsAre(Elem(17, -17), Elem(19, -190))); >+ >+ auto insert_result = set2.insert(std::move(node)); >+ EXPECT_FALSE(node); >+ EXPECT_FALSE(insert_result.inserted); >+ EXPECT_TRUE(insert_result.node); >+ EXPECT_EQ(insert_result.node.key().i, 7); >+ EXPECT_EQ(insert_result.node.mapped(), -7); >+ EXPECT_THAT(*insert_result.position, Elem(7, -70)); >+ EXPECT_THAT(set2, UnorderedElementsAre(Elem(7, -70))); >+ >+ node = set1.extract(17); >+ EXPECT_TRUE(node); >+ EXPECT_EQ(node.key().i, 17); >+ EXPECT_EQ(node.mapped(), -17); >+ EXPECT_THAT(set1, UnorderedElementsAre(Elem(19, -190))); >+ >+ node.mapped() = 23; >+ >+ insert_result = set2.insert(std::move(node)); >+ EXPECT_FALSE(node); >+ EXPECT_TRUE(insert_result.inserted); >+ EXPECT_FALSE(insert_result.node); >+ EXPECT_THAT(*insert_result.position, Elem(17, 23)); >+ EXPECT_THAT(set2, UnorderedElementsAre(Elem(7, -70), Elem(17, 23))); >+} >+ >+} // namespace >+} // namespace container_internal >+} // namespace absl >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/node_hash_set.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/node_hash_set.h >new file mode 100644 >index 0000000000000000000000000000000000000000..90d4ce0c1631d78cfecfcd8fc29a1c484ae9f637 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/node_hash_set.h >@@ -0,0 +1,439 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. 
>+// You may obtain a copy of the License at
>+//
>+//      http://www.apache.org/licenses/LICENSE-2.0
>+//
>+// Unless required by applicable law or agreed to in writing, software
>+// distributed under the License is distributed on an "AS IS" BASIS,
>+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
>+// See the License for the specific language governing permissions and
>+// limitations under the License.
>+//
>+// -----------------------------------------------------------------------------
>+// File: node_hash_set.h
>+// -----------------------------------------------------------------------------
>+//
>+// An `absl::node_hash_set<T>` is an unordered associative container designed to
>+// be a more efficient replacement for `std::unordered_set`. Like
>+// `unordered_set`, search, insertion, and deletion of set elements can be done
>+// as an `O(1)` operation. However, `node_hash_set` (and other unordered
>+// associative containers known as the collection of Abseil "Swiss tables")
>+// contain other optimizations that result in both memory and computation
>+// advantages.
>+//
>+// In most cases, your default choice for a hash table should be a map of type
>+// `flat_hash_map` or a set of type `flat_hash_set`. However, if you need
>+// pointer stability, a `node_hash_set` should be your preferred choice. As
>+// well, if you are migrating your code from using `std::unordered_set`, a
>+// `node_hash_set` should be an easy migration. Consider migrating to
>+// `node_hash_set` and perhaps converting to a more efficient `flat_hash_set`
>+// upon further review.
>+
>+#ifndef ABSL_CONTAINER_NODE_HASH_SET_H_
>+#define ABSL_CONTAINER_NODE_HASH_SET_H_
>+
>+#include <type_traits>
>+
>+#include "absl/container/internal/hash_function_defaults.h"  // IWYU pragma: export
>+#include "absl/container/internal/node_hash_policy.h"
>+#include "absl/container/internal/raw_hash_set.h"  // IWYU pragma: export
>+#include "absl/memory/memory.h"
>+
>+namespace absl {
>+namespace container_internal {
>+template <typename T>
>+struct NodeHashSetPolicy;
>+}  // namespace container_internal
>+
>+// -----------------------------------------------------------------------------
>+// absl::node_hash_set
>+// -----------------------------------------------------------------------------
>+//
>+// An `absl::node_hash_set<T>` is an unordered associative container which
>+// has been optimized for both speed and memory footprint in most common use
>+// cases. Its interface is similar to that of `std::unordered_set<T>` with the
>+// following notable differences:
>+//
>+// * Supports heterogeneous lookup, through `find()` and `insert()`, provided
>+//   that the set is provided a compatible heterogeneous hashing function and
>+//   equality operator.
>+// * Contains a `capacity()` member function indicating the number of element
>+//   slots (open, deleted, and empty) within the hash set.
>+// * Returns `void` from the `erase(iterator)` overload.
>+//
>+// By default, `node_hash_set` uses the `absl::Hash` hashing framework.
>+// All fundamental and Abseil types that support the `absl::Hash` framework have
>+// a compatible equality operator for comparing insertions into `node_hash_set`.
>+// If your type is not yet supported by the `absl::Hash` framework, see
>+// absl/hash/hash.h for information on extending Abseil hashing to user-defined
>+// types.
>+//
>+// Example:
>+//
>+//   // Create a node hash set of three strings
>+//   absl::node_hash_set<std::string> ducks =
>+//     {"huey", "dewey", "louie"};
>+//
>+//   // Insert a new element into the node hash set
>+//   ducks.insert("donald");
>+//
>+//   // Force a rehash of the node hash set
>+//   ducks.rehash(0);
>+//
>+//   // See if "dewey" is present
>+//   if (ducks.contains("dewey")) {
>+//     std::cout << "We found dewey!" << std::endl;
>+//   }
>+template <class T, class Hash = absl::container_internal::hash_default_hash<T>,
>+          class Eq = absl::container_internal::hash_default_eq<T>,
>+          class Alloc = std::allocator<T>>
>+class node_hash_set
>+    : public absl::container_internal::raw_hash_set<
>+          absl::container_internal::NodeHashSetPolicy<T>, Hash, Eq, Alloc> {
>+  using Base = typename node_hash_set::raw_hash_set;
>+
>+ public:
>+  node_hash_set() {}
>+  using Base::Base;
>+
>+  // node_hash_set::begin()
>+  //
>+  // Returns an iterator to the beginning of the `node_hash_set`.
>+  using Base::begin;
>+
>+  // node_hash_set::cbegin()
>+  //
>+  // Returns a const iterator to the beginning of the `node_hash_set`.
>+  using Base::cbegin;
>+
>+  // node_hash_set::cend()
>+  //
>+  // Returns a const iterator to the end of the `node_hash_set`.
>+  using Base::cend;
>+
>+  // node_hash_set::end()
>+  //
>+  // Returns an iterator to the end of the `node_hash_set`.
>+  using Base::end;
>+
>+  // node_hash_set::capacity()
>+  //
>+  // Returns the number of element slots (assigned, deleted, and empty)
>+  // available within the `node_hash_set`.
>+  //
>+  // NOTE: this member function is particular to `absl::node_hash_set` and is
>+  // not provided in the `std::unordered_set` API.
>+  using Base::capacity;
>+
>+  // node_hash_set::empty()
>+  //
>+  // Returns whether or not the `node_hash_set` is empty.
>+  using Base::empty;
>+
>+  // node_hash_set::max_size()
>+  //
>+  // Returns the largest theoretically possible number of elements within a
>+  // `node_hash_set` under current memory constraints. This value can be
>+  // thought of as the largest value of `std::distance(begin(), end())` for a
>+  // `node_hash_set<T>`.
>+  using Base::max_size;
>+
>+  // node_hash_set::size()
>+  //
>+  // Returns the number of elements currently within the `node_hash_set`.
>+  using Base::size;
>+
>+  // node_hash_set::clear()
>+  //
>+  // Removes all elements from the `node_hash_set`. Invalidates any references,
>+  // pointers, or iterators referring to contained elements.
>+  //
>+  // NOTE: this operation may shrink the underlying buffer. To avoid shrinking
>+  // the underlying buffer, call `erase(begin(), end())`.
>+  using Base::clear;
>+
>+  // node_hash_set::erase()
>+  //
>+  // Erases elements within the `node_hash_set`. Erasing does not trigger a
>+  // rehash. Overloads are listed below.
>+  //
>+  // void erase(const_iterator pos):
>+  //
>+  //   Erases the element at `pos` of the `node_hash_set`, returning `void`.
>+  //
>+  //   NOTE: this return behavior is different from that of STL containers in
>+  //   general and `std::unordered_set` in particular.
>+  //
>+  // iterator erase(const_iterator first, const_iterator last):
>+  //
>+  //   Erases the elements in the open interval [`first`, `last`), returning an
>+  //   iterator pointing to `last`.
>+  //
>+  // size_type erase(const key_type& key):
>+  //
>+  //   Erases the element with the matching key, if it exists.
>+  using Base::erase;
>+
>+  // node_hash_set::insert()
>+  //
>+  // Inserts an element of the specified value into the `node_hash_set`,
>+  // returning an iterator pointing to the newly inserted element, provided that
>+  // an element with the given key does not already exist. If rehashing occurs
>+  // due to the insertion, all iterators are invalidated. Overloads are listed
>+  // below.
>+  //
>+  // std::pair<iterator,bool> insert(const T& value):
>+  //
>+  //   Inserts a value into the `node_hash_set`.
Returns a pair consisting of an >+ // iterator to the inserted element (or to the element that prevented the >+ // insertion) and a bool denoting whether the insertion took place. >+ // >+ // std::pair<iterator,bool> insert(T&& value): >+ // >+ // Inserts a moveable value into the `node_hash_set`. Returns a pair >+ // consisting of an iterator to the inserted element (or to the element that >+ // prevented the insertion) and a bool denoting whether the insertion took >+ // place. >+ // >+ // iterator insert(const_iterator hint, const T& value): >+ // iterator insert(const_iterator hint, T&& value): >+ // >+ // Inserts a value, using the position of `hint` as a non-binding suggestion >+ // for where to begin the insertion search. Returns an iterator to the >+ // inserted element, or to the existing element that prevented the >+ // insertion. >+ // >+ // void insert(InputIterator first, InputIterator last): >+ // >+ // Inserts a range of values [`first`, `last`). >+ // >+ // NOTE: Although the STL does not specify which element may be inserted if >+ // multiple keys compare equivalently, for `node_hash_set` we guarantee the >+ // first match is inserted. >+ // >+ // void insert(std::initializer_list<T> ilist): >+ // >+ // Inserts the elements within the initializer list `ilist`. >+ // >+ // NOTE: Although the STL does not specify which element may be inserted if >+ // multiple keys compare equivalently within the initializer list, for >+ // `node_hash_set` we guarantee the first match is inserted. >+ using Base::insert; >+ >+ // node_hash_set::emplace() >+ // >+ // Inserts an element of the specified value by constructing it in-place >+ // within the `node_hash_set`, provided that no element with the given key >+ // already exists. >+ // >+ // The element may be constructed even if there already is an element with the >+ // key in the container, in which case the newly constructed element will be >+ // destroyed immediately. 
>+  //
>+  // If rehashing occurs due to the insertion, all iterators are invalidated.
>+  using Base::emplace;
>+
>+  // node_hash_set::emplace_hint()
>+  //
>+  // Inserts an element of the specified value by constructing it in-place
>+  // within the `node_hash_set`, using the position of `hint` as a non-binding
>+  // suggestion for where to begin the insertion search, and only inserts
>+  // provided that no element with the given key already exists.
>+  //
>+  // The element may be constructed even if there already is an element with the
>+  // key in the container, in which case the newly constructed element will be
>+  // destroyed immediately.
>+  //
>+  // If rehashing occurs due to the insertion, all iterators are invalidated.
>+  using Base::emplace_hint;
>+
>+  // node_hash_set::extract()
>+  //
>+  // Extracts the indicated element, erasing it in the process, and returns it
>+  // as a C++17-compatible node handle. Overloads are listed below.
>+  //
>+  // node_type extract(const_iterator position):
>+  //
>+  //   Extracts the element at the indicated position and returns a node handle
>+  //   owning that extracted data.
>+  //
>+  // node_type extract(const key_type& x):
>+  //
>+  //   Extracts the element with the key matching the passed key value and
>+  //   returns a node handle owning that extracted data. If the `node_hash_set`
>+  //   does not contain an element with a matching key, this function returns an
>+  //   empty node handle.
>+  using Base::extract;
>+
>+  // node_hash_set::merge()
>+  //
>+  // Extracts elements from a given `source` node hash set into this
>+  // `node_hash_set`. If the destination `node_hash_set` already contains an
>+  // element with an equivalent key, that element is not extracted.
>+ using Base::merge;
>+
>+ // node_hash_set::swap(node_hash_set& other)
>+ //
>+ // Exchanges the contents of this `node_hash_set` with those of the `other`
>+ // node hash set, avoiding invocation of any move, copy, or swap operations on
>+ // individual elements.
>+ //
>+ // All iterators and references on the `node_hash_set` remain valid, except
>+ // for the past-the-end iterator, which is invalidated.
>+ //
>+ // `swap()` requires that the node hash set's hashing and key equivalence
>+ // functions be Swappable, and are exchanged using unqualified calls to
>+ // non-member `swap()`. If the set's allocator has
>+ // `std::allocator_traits<allocator_type>::propagate_on_container_swap::value`
>+ // set to `true`, the allocators are also exchanged using an unqualified call
>+ // to non-member `swap()`; otherwise, the allocators are not swapped.
>+ using Base::swap;
>+
>+ // node_hash_set::rehash(count)
>+ //
>+ // Rehashes the `node_hash_set`, setting the number of slots to be at least
>+ // the passed value. If the new number of slots increases the load factor more
>+ // than the current maximum load factor
>+ // (`count` < `size()` / `max_load_factor()`), then the new number of slots
>+ // will be at least `size()` / `max_load_factor()`.
>+ //
>+ // To force a rehash, pass rehash(0).
>+ //
>+ // NOTE: unlike behavior in `std::unordered_set`, references are also
>+ // invalidated upon a `rehash()`.
>+ using Base::rehash;
>+
>+ // node_hash_set::reserve(count)
>+ //
>+ // Sets the number of slots in the `node_hash_set` to the number needed to
>+ // accommodate at least `count` total elements without exceeding the current
>+ // maximum load factor, and may rehash the container if needed.
>+ using Base::reserve;
>+
>+ // node_hash_set::contains()
>+ //
>+ // Determines whether an element comparing equal to the given `key` exists
>+ // within the `node_hash_set`, returning `true` if so or `false` otherwise.
>+ using Base::contains;
>+
>+ // node_hash_set::count(const Key& key) const
>+ //
>+ // Returns the number of elements comparing equal to the given `key` within
>+ // the `node_hash_set`. Note that this function will return either `1` or `0`
>+ // since duplicate elements are not allowed within a `node_hash_set`.
>+ using Base::count;
>+
>+ // node_hash_set::equal_range()
>+ //
>+ // Returns a half-open range [first, last), defined by a `std::pair` of two
>+ // iterators, containing all elements with the passed key in the
>+ // `node_hash_set`.
>+ using Base::equal_range;
>+
>+ // node_hash_set::find()
>+ //
>+ // Finds an element with the passed `key` within the `node_hash_set`.
>+ using Base::find;
>+
>+ // node_hash_set::bucket_count()
>+ //
>+ // Returns the number of "buckets" within the `node_hash_set`. Note that
>+ // because a `node_hash_set` contains all elements within its internal storage,
>+ // this value simply equals the current capacity of the `node_hash_set`.
>+ using Base::bucket_count;
>+
>+ // node_hash_set::load_factor()
>+ //
>+ // Returns the current load factor of the `node_hash_set` (the average number
>+ // of slots occupied with a value within the hash set).
>+ using Base::load_factor;
>+
>+ // node_hash_set::max_load_factor()
>+ //
>+ // Manages the maximum load factor of the `node_hash_set`. Overloads are
>+ // listed below.
>+ //
>+ // float node_hash_set::max_load_factor()
>+ //
>+ // Returns the current maximum load factor of the `node_hash_set`.
>+ //
>+ // void node_hash_set::max_load_factor(float ml)
>+ //
>+ // Sets the maximum load factor of the `node_hash_set` to the passed value.
>+ //
>+ // NOTE: This overload is provided only for API compatibility with the STL;
>+ // `node_hash_set` will ignore any set load factor and manage its rehashing
>+ // internally as an implementation detail.
>+ using Base::max_load_factor;
>+
>+ // node_hash_set::get_allocator()
>+ //
>+ // Returns the allocator associated with this `node_hash_set`.
>+ using Base::get_allocator;
>+
>+ // node_hash_set::hash_function()
>+ //
>+ // Returns the hashing function used to hash the keys within this
>+ // `node_hash_set`.
>+ using Base::hash_function;
>+
>+ // node_hash_set::key_eq()
>+ //
>+ // Returns the function used for comparing keys for equality.
>+ using Base::key_eq;
>+
>+ ABSL_DEPRECATED("Call `hash_function()` instead.")
>+ typename Base::hasher hash_funct() { return this->hash_function(); }
>+
>+ ABSL_DEPRECATED("Call `rehash()` instead.")
>+ void resize(typename Base::size_type hint) { this->rehash(hint); }
>+};
>+
>+namespace container_internal {
>+
>+template <class T>
>+struct NodeHashSetPolicy
>+ : absl::container_internal::node_hash_policy<T&, NodeHashSetPolicy<T>> {
>+ using key_type = T;
>+ using init_type = T;
>+ using constant_iterators = std::true_type;
>+
>+ template <class Allocator, class... Args>
>+ static T* new_element(Allocator* alloc, Args&&... args) {
>+ using ValueAlloc =
>+ typename absl::allocator_traits<Allocator>::template rebind_alloc<T>;
>+ ValueAlloc value_alloc(*alloc);
>+ T* res = absl::allocator_traits<ValueAlloc>::allocate(value_alloc, 1);
>+ absl::allocator_traits<ValueAlloc>::construct(value_alloc, res,
>+ std::forward<Args>(args)...);
>+ return res;
>+ }
>+
>+ template <class Allocator>
>+ static void delete_element(Allocator* alloc, T* elem) {
>+ using ValueAlloc =
>+ typename absl::allocator_traits<Allocator>::template rebind_alloc<T>;
>+ ValueAlloc value_alloc(*alloc);
>+ absl::allocator_traits<ValueAlloc>::destroy(value_alloc, elem);
>+ absl::allocator_traits<ValueAlloc>::deallocate(value_alloc, elem, 1);
>+ }
>+
>+ template <class F, class... Args>
>+ static decltype(absl::container_internal::DecomposeValue(
>+ std::declval<F>(), std::declval<Args>()...))
>+ apply(F&& f, Args&&...
args) { >+ return absl::container_internal::DecomposeValue( >+ std::forward<F>(f), std::forward<Args>(args)...); >+ } >+ >+ static size_t element_space_used(const T*) { return sizeof(T); } >+}; >+} // namespace container_internal >+} // namespace absl >+#endif // ABSL_CONTAINER_NODE_HASH_SET_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/node_hash_set_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/node_hash_set_test.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..7e498f0f779b4d6e764ecafca62b1cb08ccfd5ae >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/container/node_hash_set_test.cc >@@ -0,0 +1,103 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. 
>+ >+#include "absl/container/node_hash_set.h" >+ >+#include "absl/container/internal/unordered_set_constructor_test.h" >+#include "absl/container/internal/unordered_set_lookup_test.h" >+#include "absl/container/internal/unordered_set_modifiers_test.h" >+ >+namespace absl { >+namespace container_internal { >+namespace { >+using ::absl::container_internal::hash_internal::Enum; >+using ::absl::container_internal::hash_internal::EnumClass; >+using ::testing::Pointee; >+using ::testing::UnorderedElementsAre; >+ >+using SetTypes = ::testing::Types< >+ node_hash_set<int, StatefulTestingHash, StatefulTestingEqual, Alloc<int>>, >+ node_hash_set<std::string, StatefulTestingHash, StatefulTestingEqual, >+ Alloc<int>>, >+ node_hash_set<Enum, StatefulTestingHash, StatefulTestingEqual, Alloc<Enum>>, >+ node_hash_set<EnumClass, StatefulTestingHash, StatefulTestingEqual, >+ Alloc<EnumClass>>>; >+ >+INSTANTIATE_TYPED_TEST_CASE_P(NodeHashSet, ConstructorTest, SetTypes); >+INSTANTIATE_TYPED_TEST_CASE_P(NodeHashSet, LookupTest, SetTypes); >+INSTANTIATE_TYPED_TEST_CASE_P(NodeHashSet, ModifiersTest, SetTypes); >+ >+TEST(NodeHashSet, MoveableNotCopyableCompiles) { >+ node_hash_set<std::unique_ptr<void*>> t; >+ node_hash_set<std::unique_ptr<void*>> u; >+ u = std::move(t); >+} >+ >+TEST(NodeHashSet, MergeExtractInsert) { >+ struct Hash { >+ size_t operator()(const std::unique_ptr<int>& p) const { return *p; } >+ }; >+ struct Eq { >+ bool operator()(const std::unique_ptr<int>& a, >+ const std::unique_ptr<int>& b) const { >+ return *a == *b; >+ } >+ }; >+ absl::node_hash_set<std::unique_ptr<int>, Hash, Eq> set1, set2; >+ set1.insert(absl::make_unique<int>(7)); >+ set1.insert(absl::make_unique<int>(17)); >+ >+ set2.insert(absl::make_unique<int>(7)); >+ set2.insert(absl::make_unique<int>(19)); >+ >+ EXPECT_THAT(set1, UnorderedElementsAre(Pointee(7), Pointee(17))); >+ EXPECT_THAT(set2, UnorderedElementsAre(Pointee(7), Pointee(19))); >+ >+ set1.merge(set2); >+ >+ EXPECT_THAT(set1, 
UnorderedElementsAre(Pointee(7), Pointee(17), Pointee(19))); >+ EXPECT_THAT(set2, UnorderedElementsAre(Pointee(7))); >+ >+ auto node = set1.extract(absl::make_unique<int>(7)); >+ EXPECT_TRUE(node); >+ EXPECT_THAT(node.value(), Pointee(7)); >+ EXPECT_THAT(set1, UnorderedElementsAre(Pointee(17), Pointee(19))); >+ >+ auto insert_result = set2.insert(std::move(node)); >+ EXPECT_FALSE(node); >+ EXPECT_FALSE(insert_result.inserted); >+ EXPECT_TRUE(insert_result.node); >+ EXPECT_THAT(insert_result.node.value(), Pointee(7)); >+ EXPECT_EQ(**insert_result.position, 7); >+ EXPECT_NE(insert_result.position->get(), insert_result.node.value().get()); >+ EXPECT_THAT(set2, UnorderedElementsAre(Pointee(7))); >+ >+ node = set1.extract(absl::make_unique<int>(17)); >+ EXPECT_TRUE(node); >+ EXPECT_THAT(node.value(), Pointee(17)); >+ EXPECT_THAT(set1, UnorderedElementsAre(Pointee(19))); >+ >+ node.value() = absl::make_unique<int>(23); >+ >+ insert_result = set2.insert(std::move(node)); >+ EXPECT_FALSE(node); >+ EXPECT_TRUE(insert_result.inserted); >+ EXPECT_FALSE(insert_result.node); >+ EXPECT_EQ(**insert_result.position, 23); >+ EXPECT_THAT(set2, UnorderedElementsAre(Pointee(7), Pointee(23))); >+} >+ >+} // namespace >+} // namespace container_internal >+} // namespace absl >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/copts.bzl b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/copts.bzl >index 0168ac5abddc39ff7c4781ff5e3e2e9be0a2057e..5c508f17df9747618ca52c5e4b933f818ee9b9c8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/copts.bzl >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/copts.bzl >@@ -35,7 +35,6 @@ GCC_TEST_FLAGS = [ > # Docs on groups of flags is preceded by ###. 
> > LLVM_FLAGS = [ >- # All warnings are treated as errors by implicit -Werror flag > "-Wall", > "-Wextra", > "-Weverything", >@@ -54,6 +53,10 @@ LLVM_FLAGS = [ > "-Wno-packed", > "-Wno-padded", > ### >+ # Google style does not use unsigned integers, though STL containers >+ # have unsigned types. >+ "-Wno-sign-compare", >+ ### > "-Wno-float-conversion", > "-Wno-float-equal", > "-Wno-format-nonliteral", >@@ -101,6 +104,7 @@ LLVM_TEST_FLAGS = [ > "-Wno-c99-extensions", > "-Wno-missing-noreturn", > "-Wno-missing-prototypes", >+ "-Wno-missing-variable-declarations", > "-Wno-null-conversion", > "-Wno-shadow", > "-Wno-shift-sign-overflow", >@@ -112,13 +116,15 @@ LLVM_TEST_FLAGS = [ > "-Wno-unused-template", > "-Wno-used-but-marked-unused", > "-Wno-zero-as-null-pointer-constant", >+ # gtest depends on this GNU extension being offered. >+ "-Wno-gnu-zero-variadic-macro-arguments", > ] > > MSVC_FLAGS = [ > "/W3", >- "/WX", > "/wd4005", # macro-redefinition > "/wd4068", # unknown pragma >+ "/wd4180", # qualifier applied to function type has no meaning; ignored > "/wd4244", # conversion from 'type1' to 'type2', possible loss of data > "/wd4267", # conversion from 'size_t' to 'type', possible loss of data > "/wd4800", # forcing value to bool 'true' or 'false' (performance warning) >@@ -152,3 +158,7 @@ ABSL_EXCEPTIONS_FLAG = select({ > "//absl:windows": ["/U_HAS_EXCEPTIONS", "/D_HAS_EXCEPTIONS=1", "/EHsc"], > "//conditions:default": ["-fexceptions"], > }) >+ >+ABSL_EXCEPTIONS_FLAG_LINKOPTS = select({ >+ "//conditions:default": [], >+}) >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/debugging/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/debugging/BUILD.gn >index b50b0a6ddc8321968eb1fa176321bc8e1f974254..d3582d6a426f775716eb11762d57a1991b04cb32 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/debugging/BUILD.gn >+++ 
b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/debugging/BUILD.gn >@@ -92,10 +92,10 @@ source_set("failure_signal_handler") { > ] > public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] > sources = [ >- "failure_signal_handler.cc" >+ "failure_signal_handler.cc", > ] > public = [ >- "failure_signal_handler.h" >+ "failure_signal_handler.h", > ] > deps = [ > ":examine_stack", >@@ -126,6 +126,7 @@ source_set("debugging_internal") { > ] > deps = [ > "../base", >+ "../base:core_headers", > "../base:dynamic_annotations", > ] > } >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/debugging/internal/demangle.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/debugging/internal/demangle.cc >index c9ca2f3bdb170e77d28dec15795ba58829fb868a..48354459bc85f86dfa0d319e9741188b3e9d6746 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/debugging/internal/demangle.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/debugging/internal/demangle.cc >@@ -340,7 +340,7 @@ static bool ZeroOrMore(ParseFunc parse_func, State *state) { > } > > // Append "str" at "out_cur_idx". If there is an overflow, out_cur_idx is >-// set to out_end_idx+1. The output std::string is ensured to >+// set to out_end_idx+1. The output string is ensured to > // always terminate with '\0' as long as there is no overflow. > static void Append(State *state, const char *const str, const int length) { > for (int i = 0; i < length; ++i) { >@@ -840,7 +840,7 @@ static bool ParseNumber(State *state, int *number_out) { > } > > // Floating-point literals are encoded using a fixed-length lowercase >-// hexadecimal std::string. >+// hexadecimal string. 
> static bool ParseFloatNumber(State *state) { > ComplexityGuard guard(state); > if (guard.IsTooComplex()) return false; >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/debugging/stacktrace.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/debugging/stacktrace.cc >index 61fee6190f545842adc362c8fcfcec5c0e75658a..463fad269c16afba3e383eed12906c548849f38f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/debugging/stacktrace.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/debugging/stacktrace.cc >@@ -80,11 +80,13 @@ ABSL_ATTRIBUTE_ALWAYS_INLINE inline int Unwind(void** result, int* sizes, > > } // anonymous namespace > >+ABSL_ATTRIBUTE_NOINLINE > int GetStackFrames(void** result, int* sizes, int max_depth, int skip_count) { > return Unwind<true, false>(result, sizes, max_depth, skip_count, nullptr, > nullptr); > } > >+ABSL_ATTRIBUTE_NOINLINE > int GetStackFramesWithContext(void** result, int* sizes, int max_depth, > int skip_count, const void* uc, > int* min_dropped_frames) { >@@ -92,11 +94,13 @@ int GetStackFramesWithContext(void** result, int* sizes, int max_depth, > min_dropped_frames); > } > >+ABSL_ATTRIBUTE_NOINLINE > int GetStackTrace(void** result, int max_depth, int skip_count) { > return Unwind<false, false>(result, nullptr, max_depth, skip_count, nullptr, > nullptr); > } > >+ABSL_ATTRIBUTE_NOINLINE > int GetStackTraceWithContext(void** result, int max_depth, int skip_count, > const void* uc, int* min_dropped_frames) { > return Unwind<false, true>(result, nullptr, max_depth, skip_count, uc, >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/debugging/symbolize_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/debugging/symbolize_test.cc >index 5f2af47ee45d654b422d0e864fbf060cb5c51075..8029fbe9d2e6fb720816ef0435e07099cc387292 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/debugging/symbolize_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/debugging/symbolize_test.cc >@@ -321,7 +321,7 @@ TEST(Symbolize, SymbolizeWithMultipleMaps) { > } > } > >-// Appends std::string(*args->arg) to args->symbol_buf. >+// Appends string(*args->arg) to args->symbol_buf. > static void DummySymbolDecorator( > const absl::debugging_internal::SymbolDecoratorArgs *args) { > std::string *message = static_cast<std::string *>(args->arg); >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/BUILD.bazel b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/BUILD.bazel >new file mode 100644 >index 0000000000000000000000000000000000000000..50aa5506c11c87ed0b4a0642b950ba160f8a2f79 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/BUILD.bazel >@@ -0,0 +1,114 @@ >+# >+# Copyright 2018 The Abseil Authors. >+# >+# Licensed under the Apache License, Version 2.0 (the "License"); >+# you may not use this file except in compliance with the License. >+# You may obtain a copy of the License at >+# >+# http://www.apache.org/licenses/LICENSE-2.0 >+# >+# Unless required by applicable law or agreed to in writing, software >+# distributed under the License is distributed on an "AS IS" BASIS, >+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+# See the License for the specific language governing permissions and >+# limitations under the License. 
>+# >+ >+load( >+ "//absl:copts.bzl", >+ "ABSL_DEFAULT_COPTS", >+ "ABSL_TEST_COPTS", >+) >+ >+package(default_visibility = ["//visibility:public"]) >+ >+licenses(["notice"]) # Apache 2.0 >+ >+cc_library( >+ name = "hash", >+ srcs = [ >+ "internal/hash.cc", >+ "internal/hash.h", >+ ], >+ hdrs = ["hash.h"], >+ copts = ABSL_DEFAULT_COPTS, >+ deps = [ >+ ":city", >+ "//absl/base:core_headers", >+ "//absl/base:endian", >+ "//absl/container:fixed_array", >+ "//absl/meta:type_traits", >+ "//absl/numeric:int128", >+ "//absl/strings", >+ "//absl/types:optional", >+ "//absl/types:variant", >+ "//absl/utility", >+ ], >+) >+ >+cc_library( >+ name = "hash_testing", >+ testonly = 1, >+ hdrs = ["hash_testing.h"], >+ deps = [ >+ ":spy_hash_state", >+ "//absl/meta:type_traits", >+ "//absl/strings", >+ "//absl/types:variant", >+ "@com_google_googletest//:gtest", >+ ], >+) >+ >+cc_test( >+ name = "hash_test", >+ srcs = ["hash_test.cc"], >+ copts = ABSL_TEST_COPTS, >+ deps = [ >+ ":hash", >+ ":hash_testing", >+ "//absl/base:core_headers", >+ "//absl/container:flat_hash_set", >+ "//absl/hash:spy_hash_state", >+ "//absl/meta:type_traits", >+ "//absl/numeric:int128", >+ "@com_google_googletest//:gtest_main", >+ ], >+) >+ >+cc_library( >+ name = "spy_hash_state", >+ testonly = 1, >+ hdrs = ["internal/spy_hash_state.h"], >+ copts = ABSL_DEFAULT_COPTS, >+ visibility = ["//visibility:private"], >+ deps = [ >+ ":hash", >+ "//absl/strings", >+ "//absl/strings:str_format", >+ ], >+) >+ >+cc_library( >+ name = "city", >+ srcs = ["internal/city.cc"], >+ hdrs = [ >+ "internal/city.h", >+ "internal/city_crc.h", >+ ], >+ copts = ABSL_DEFAULT_COPTS, >+ deps = [ >+ "//absl/base:config", >+ "//absl/base:core_headers", >+ "//absl/base:endian", >+ ], >+) >+ >+cc_test( >+ name = "city_test", >+ srcs = ["internal/city_test.cc"], >+ copts = ABSL_TEST_COPTS, >+ deps = [ >+ ":city", >+ "@com_google_googletest//:gtest_main", >+ ], >+) >diff --git 
a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/BUILD.gn >new file mode 100644 >index 0000000000000000000000000000000000000000..40b6b3a22b37132bb6ef92abccfba1f669cfe47f >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/BUILD.gn >@@ -0,0 +1,104 @@ >+# Copyright 2018 The Chromium Authors. All rights reserved. >+# Use of this source code is governed by a BSD-style license that can be >+# found in the LICENSE file. >+ >+import("//build_overrides/build.gni") >+ >+if (build_with_chromium) { >+ visibility = [ >+ "//third_party/webrtc/*", >+ "//third_party/abseil-cpp/*", >+ "//third_party/googletest:gtest", >+ ] >+} else { >+ visibility = [ "*" ] >+} >+ >+source_set("hash") { >+ configs -= [ "//build/config/compiler:chromium_code" ] >+ configs += [ >+ "//build/config/compiler:no_chromium_code", >+ "//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ sources = [ >+ "internal/hash.cc", >+ "internal/hash.h", >+ ] >+ public = [ >+ "hash.h", >+ ] >+ deps = [ >+ ":city", >+ "../base:core_headers", >+ "../base:endian", >+ "../container:fixed_array", >+ "../meta:type_traits", >+ "../numeric:int128", >+ "../strings", >+ "../types:optional", >+ "../types:variant", >+ "../utility", >+ ] >+} >+ >+source_set("hash_testing") { >+ testonly = true >+ configs -= [ "//build/config/compiler:chromium_code" ] >+ configs += [ >+ "//build/config/compiler:no_chromium_code", >+ "//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ public = [ >+ "hash_testing.h", >+ ] >+ deps = [ >+ ":spy_hash_state", >+ "../meta:type_traits", >+ "../strings", >+ "../types:variant", >+ "//testing/gtest", >+ ] >+} >+ >+source_set("spy_hash_state") { >+ testonly = true >+ configs -= [ "//build/config/compiler:chromium_code" ] >+ 
configs += [ >+ "//build/config/compiler:no_chromium_code", >+ "//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ public = [ >+ "internal/spy_hash_state.h", >+ ] >+ deps = [ >+ ":hash", >+ "../strings", >+ "../strings:str_format", >+ ] >+ visibility = [] >+ visibility += [ "../*" ] >+} >+ >+source_set("city") { >+ configs -= [ "//build/config/compiler:chromium_code" ] >+ configs += [ >+ "//build/config/compiler:no_chromium_code", >+ "//third_party/abseil-cpp:absl_default_cflags_cc", >+ ] >+ public_configs = [ "//third_party/abseil-cpp:absl_include_config" ] >+ public = [ >+ "internal/city.h", >+ "internal/city_crc.h", >+ ] >+ sources = [ >+ "internal/city.cc", >+ ] >+ deps = [ >+ "../base:config", >+ "../base:core_headers", >+ "../base:endian", >+ ] >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/CMakeLists.txt b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/CMakeLists.txt >new file mode 100644 >index 0000000000000000000000000000000000000000..35081e3774f8f322a9d75da948ef27a2213d227e >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/CMakeLists.txt >@@ -0,0 +1,80 @@ >+# >+# Copyright 2018 The Abseil Authors. >+# >+# Licensed under the Apache License, Version 2.0 (the "License"); >+# you may not use this file except in compliance with the License. >+# You may obtain a copy of the License at >+# >+# http://www.apache.org/licenses/LICENSE-2.0 >+# >+# Unless required by applicable law or agreed to in writing, software >+# distributed under the License is distributed on an "AS IS" BASIS, >+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+# See the License for the specific language governing permissions and >+# limitations under the License. 
>+#
>+
>+list(APPEND HASH_PUBLIC_HEADERS
>+ "hash.h"
>+)
>+
>+list(APPEND HASH_INTERNAL_HEADERS
>+ "internal/city.h"
>+ "internal/city_crc.h"
>+ "internal/hash.h"
>+)
>+
>+# absl_hash library
>+list(APPEND HASH_SRC
>+ "internal/city.cc"
>+ "internal/hash.cc"
>+ ${HASH_PUBLIC_HEADERS}
>+ ${HASH_INTERNAL_HEADERS}
>+)
>+
>+set(HASH_PUBLIC_LIBRARIES absl::hash absl::container absl::strings absl::str_format absl::utility)
>+
>+absl_library(
>+ TARGET
>+ absl_hash
>+ SOURCES
>+ ${HASH_SRC}
>+ PUBLIC_LIBRARIES
>+ ${HASH_PUBLIC_LIBRARIES}
>+ EXPORT_NAME
>+ hash
>+)
>+
>+#
>+## TESTS
>+#
>+
>+# testing support
>+set(HASH_TEST_HEADERS hash_testing.h internal/spy_hash_state.h)
>+set(HASH_TEST_PUBLIC_LIBRARIES absl::hash absl::container absl::numeric absl::strings absl::str_format)
>+
>+# hash_test
>+set(HASH_TEST_SRC "hash_test.cc" ${HASH_TEST_HEADERS})
>+
>+absl_test(
>+ TARGET
>+ hash_test
>+ SOURCES
>+ ${HASH_TEST_SRC}
>+ PUBLIC_LIBRARIES
>+ ${HASH_TEST_PUBLIC_LIBRARIES}
>+)
>+
>+# city_test
>+set(CITY_TEST_SRC "internal/city_test.cc")
>+
>+absl_test(
>+ TARGET
>+ city_test
>+ SOURCES
>+ ${CITY_TEST_SRC}
>+ PUBLIC_LIBRARIES
>+ ${HASH_TEST_PUBLIC_LIBRARIES}
>+)
>+
>+
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/hash.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/hash.h
>new file mode 100644
>index 0000000000000000000000000000000000000000..c7ba4c2b7a5c198f74ce273910bac881c4559f89
>--- /dev/null
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/hash.h
>@@ -0,0 +1,312 @@
>+// Copyright 2018 The Abseil Authors.
>+//
>+// Licensed under the Apache License, Version 2.0 (the "License");
>+// you may not use this file except in compliance with the License.
>+// You may obtain a copy of the License at
>+//
>+// http://www.apache.org/licenses/LICENSE-2.0
>+//
>+// Unless required by applicable law or agreed to in writing, software
>+// distributed under the License is distributed on an "AS IS" BASIS,
>+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
>+// See the License for the specific language governing permissions and
>+// limitations under the License.
>+//
>+// -----------------------------------------------------------------------------
>+// File: hash.h
>+// -----------------------------------------------------------------------------
>+//
>+// This header file defines the Abseil `hash` library and the Abseil hashing
>+// framework. This framework consists of the following:
>+//
>+// * The `absl::Hash` functor, which is used to invoke the hasher within the
>+// Abseil hashing framework. `absl::Hash<T>` supports most basic types and
>+// a number of Abseil types out of the box.
>+// * `AbslHashValue`, an extension point that allows you to extend types to
>+// support Abseil hashing without requiring you to define a hashing
>+// algorithm.
>+// * `HashState`, a type-erased class which implements the manipulation of the
>+// hash state (H) itself, containing member functions `combine()` and
>+// `combine_contiguous()`, which you can use to contribute to an existing
>+// hash state when hashing your types.
>+//
>+// Unlike `std::hash` or other hashing frameworks, the Abseil hashing framework
>+// provides most of its utility by abstracting away the hash algorithm (and its
>+// implementation) entirely. Instead, a type invokes the Abseil hashing
>+// framework by simply combining its state with the state of known, hashable
>+// types. Hashing of that combined state is separately done by `absl::Hash`.
>+//
>+// Example:
>+//
>+// // Suppose we have a class `Circle` for which we want to add hashing
>+// class Circle {
>+// public:
>+// ...
>+// private: >+// std::pair<int, int> center_; >+// int radius_; >+// }; >+// >+// // To add hashing support to `Circle`, we simply need to add an ordinary >+// // function `AbslHashValue()`, and return the combined hash state of the >+// // existing hash state and the class state: >+// >+// template <typename H> >+// friend H AbslHashValue(H h, const Circle& c) { >+// return H::combine(std::move(h), c.center_, c.radius_); >+// } >+// >+// For more information, see Adding Type Support to `absl::Hash` below. >+// >+#ifndef ABSL_HASH_HASH_H_ >+#define ABSL_HASH_HASH_H_ >+ >+#include "absl/hash/internal/hash.h" >+ >+namespace absl { >+ >+// ----------------------------------------------------------------------------- >+// `absl::Hash` >+// ----------------------------------------------------------------------------- >+// >+// `absl::Hash<T>` is a convenient general-purpose hash functor for a type `T` >+// satisfying any of the following conditions (in order): >+// >+// * T is an arithmetic or pointer type >+// * T defines an overload for `AbslHashValue(H, const T&)` for an arbitrary >+// hash state `H`. >+// - T defines a specialization of `HASH_NAMESPACE::hash<T>` >+// - T defines a specialization of `std::hash<T>` >+// >+// `absl::Hash` intrinsically supports the following types: >+// >+// * All integral types (including bool) >+// * All enum types >+// * All floating-point types (although hashing them is discouraged) >+// * All pointer types, including nullptr_t >+// * std::pair<T1, T2>, if T1 and T2 are hashable >+// * std::tuple<Ts...>, if all the Ts... 
are hashable
>+// * std::unique_ptr and std::shared_ptr
>+// * All string-like types including:
>+// * std::string
>+// * std::string_view (as well as any instance of std::basic_string that
>+// uses char and std::char_traits)
>+// * All the standard sequence containers (provided the elements are hashable)
>+// * All the standard ordered associative containers (provided the elements are
>+// hashable)
>+// * absl types such as the following:
>+// * absl::string_view
>+// * absl::InlinedVector
>+// * absl::FixedArray
>+// * absl::uint128
>+// * absl::Time, absl::Duration, and absl::TimeZone
>+//
>+// Note: the list above is not meant to be exhaustive. Additional type support
>+// may be added, in which case the above list will be updated.
>+//
>+// -----------------------------------------------------------------------------
>+// absl::Hash Invocation Evaluation
>+// -----------------------------------------------------------------------------
>+//
>+// When invoked, `absl::Hash<T>` searches for supplied hash functions in the
>+// following order:
>+//
>+// * Natively supported types out of the box (see above)
>+// * Types for which an `AbslHashValue()` overload is provided (such as
>+// user-defined types). See "Adding Type Support to `absl::Hash`" below.
>+// * Types which define a `HASH_NAMESPACE::hash<T>` specialization (aka
>+// `__gnu_cxx::hash<T>` for gcc/Clang or `stdext::hash<T>` for MSVC)
>+// * Types which define a `std::hash<T>` specialization
>+//
>+// The fallback to legacy hash functions exists mainly for backwards
>+// compatibility. If you have a choice, prefer defining an `AbslHashValue`
>+// overload instead of specializing any legacy hash functors.
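To make the extension-point shape concrete, here is a minimal, self-contained sketch of the `AbslHashValue` pattern the header describes. `ToyState`, `ToyHashOf`, and the multiply-and-add mixing are illustrative stand-ins, not the real absl machinery; only the hidden-friend/ADL shape matches the documented API:

```cpp
#include <cstddef>
#include <functional>
#include <utility>

// Toy stand-in for a framework hash state `H`, with a variadic combine()
// of the shape described above (state passed by value, fields mixed in).
struct ToyState {
    std::size_t value = 0;
    static ToyState combine(ToyState s) { return s; }
    template <typename T, typename... Ts>
    static ToyState combine(ToyState s, const T& v, const Ts&... rest) {
        s.value = s.value * 31 + std::hash<T>{}(v);  // mix one field
        return combine(std::move(s), rest...);       // then the remaining ones
    }
};

// A user type opts in by providing a free AbslHashValue() overload; as a
// hidden friend it is discovered via ADL on the Circle argument.
struct Circle {
    int cx = 0, cy = 0, radius = 0;
    template <typename H>
    friend H AbslHashValue(H h, const Circle& c) {
        return H::combine(std::move(h), c.cx, c.cy, c.radius);
    }
};

// Generic caller: hand a fresh state to the ADL-discovered overload and
// read back the combined result.
template <typename T>
std::size_t ToyHashOf(const T& v) {
    return AbslHashValue(ToyState{}, v).value;
}
```

Equal values hash equally because both calls combine the same fields in the same order, which is exactly the guarantee `H::combine` is documented to provide.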
>+// >+// ----------------------------------------------------------------------------- >+// The Hash State Concept, and using `HashState` for Type Erasure >+// ----------------------------------------------------------------------------- >+// >+// The `absl::Hash` framework relies on the Concept of a "hash state." Such a >+// hash state is used in several places: >+// >+// * Within existing implementations of `absl::Hash<T>` to store the hashed >+// state of an object. Note that it is up to the implementation how it stores >+// such state. A hash table, for example, may mix the state to produce an >+// integer value; a testing framework may simply hold a vector of that state. >+// * Within implementations of `AbslHashValue()` used to extend user-defined >+// types. (See "Adding Type Support to absl::Hash" below.) >+// * Inside a `HashState`, providing type erasure for the concept of a hash >+// state, which you can use to extend the `absl::Hash` framework for types >+// that are otherwise difficult to extend using `AbslHashValue()`. (See the >+// `HashState` class below.) >+// >+// The "hash state" concept contains two member functions for mixing hash state: >+// >+// * `H::combine()` >+// >+// Combines an arbitrary number of values into a hash state, returning the >+// updated state. Note that the existing hash state is move-only and must be >+// passed by value. >+// >+// Each of the value types T must be hashable by H. >+// >+// NOTE: >+// >+// state = H::combine(std::move(state), value1, value2, value3); >+// >+// must be guaranteed to produce the same hash expansion as >+// >+// state = H::combine(std::move(state), value1); >+// state = H::combine(std::move(state), value2); >+// state = H::combine(std::move(state), value3); >+// >+// * `H::combine_contiguous()` >+// >+// Combines a contiguous array of `size` elements into a hash state, >+// returning the updated state. Note that the existing hash state is >+// move-only and must be passed by value. 
>+//
>+// NOTE:
>+//
>+// state = H::combine_contiguous(std::move(state), data, size);
>+//
>+// need NOT be guaranteed to produce the same hash expansion as a loop
>+// (it may perform internal optimizations). If you need this guarantee, use a
>+// loop instead.
>+//
>+// -----------------------------------------------------------------------------
>+// Adding Type Support to `absl::Hash`
>+// -----------------------------------------------------------------------------
>+//
>+// To add support for your user-defined type, add a proper `AbslHashValue()`
>+// overload as a free (non-member) function. The overload will take an
>+// existing hash state and should combine that state with state from the type.
>+//
>+// Example:
>+//
>+// template <typename H>
>+// H AbslHashValue(H state, const MyType& v) {
>+// return H::combine(std::move(state), v.field1, ..., v.fieldN);
>+// }
>+//
>+// where `(field1, ..., fieldN)` are the members you would use on your
>+// `operator==` to define equality.
>+//
>+// Notice that `AbslHashValue` is not a class member, but an ordinary function.
>+// An `AbslHashValue` overload for a type should only be declared in the same
>+// file and namespace as said type. The proper `AbslHashValue` implementation
>+// for a given type will be discovered via ADL.
>+//
>+// Note: unlike `std::hash`, `absl::Hash` should never be specialized. It must
>+// only be extended by adding `AbslHashValue()` overloads.
>+//
>+template <typename T>
>+using Hash = absl::hash_internal::Hash<T>;
>+
>+// HashState
>+//
>+// A type erased version of the hash state concept, for use in user-defined
>+// `AbslHashValue` implementations that can't use templates (such as PImpl
>+// classes, virtual functions, etc.). The type erasure adds overhead so it
>+// should be avoided unless necessary. 
>+// >+// Note: This wrapper will only erase calls to: >+// combine_contiguous(H, const unsigned char*, size_t) >+// >+// All other calls will be handled internally and will not invoke overloads >+// provided by the wrapped class. >+// >+// Users of this class should still define a template `AbslHashValue` function, >+// but can use `absl::HashState::Create(&state)` to erase the type of the hash >+// state and dispatch to their private hashing logic. >+// >+// This state can be used like any other hash state. In particular, you can call >+// `HashState::combine()` and `HashState::combine_contiguous()` on it. >+// >+// Example: >+// >+// class Interface { >+// public: >+// template <typename H> >+// friend H AbslHashValue(H state, const Interface& value) { >+// state = H::combine(std::move(state), std::type_index(typeid(*this))); >+// value.HashValue(absl::HashState::Create(&state)); >+// return state; >+// } >+// private: >+// virtual void HashValue(absl::HashState state) const = 0; >+// }; >+// >+// class Impl : Interface { >+// private: >+// void HashValue(absl::HashState state) const override { >+// absl::HashState::combine(std::move(state), v1_, v2_); >+// } >+// int v1_; >+// string v2_; >+// }; >+class HashState : public hash_internal::HashStateBase<HashState> { >+ public: >+ // HashState::Create() >+ // >+ // Create a new `HashState` instance that wraps `state`. All calls to >+ // `combine()` and `combine_contiguous()` on the new instance will be >+ // redirected to the original `state` object. The `state` object must outlive >+ // the `HashState` instance. 
>+ template <typename T> >+ static HashState Create(T* state) { >+ HashState s; >+ s.Init(state); >+ return s; >+ } >+ >+ HashState(const HashState&) = delete; >+ HashState& operator=(const HashState&) = delete; >+ HashState(HashState&&) = default; >+ HashState& operator=(HashState&&) = default; >+ >+ // HashState::combine() >+ // >+ // Combines an arbitrary number of values into a hash state, returning the >+ // updated state. >+ using HashState::HashStateBase::combine; >+ >+ // HashState::combine_contiguous() >+ // >+ // Combines a contiguous array of `size` elements into a hash state, returning >+ // the updated state. >+ static HashState combine_contiguous(HashState hash_state, >+ const unsigned char* first, size_t size) { >+ hash_state.combine_contiguous_(hash_state.state_, first, size); >+ return hash_state; >+ } >+ using HashState::HashStateBase::combine_contiguous; >+ >+ private: >+ HashState() = default; >+ >+ template <typename T> >+ static void CombineContiguousImpl(void* p, const unsigned char* first, >+ size_t size) { >+ T& state = *static_cast<T*>(p); >+ state = T::combine_contiguous(std::move(state), first, size); >+ } >+ >+ template <typename T> >+ void Init(T* state) { >+ state_ = state; >+ combine_contiguous_ = &CombineContiguousImpl<T>; >+ } >+ >+ // Do not erase an already erased state. 
>+ void Init(HashState* state) { >+ state_ = state->state_; >+ combine_contiguous_ = state->combine_contiguous_; >+ } >+ >+ void* state_; >+ void (*combine_contiguous_)(void*, const unsigned char*, size_t); >+}; >+ >+} // namespace absl >+#endif // ABSL_HASH_HASH_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/hash_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/hash_test.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..7b6fb2e7ac1cb0931c5e6b9833def64eace2ac77 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/hash_test.cc >@@ -0,0 +1,425 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. 
>+ >+#include "absl/hash/hash.h" >+ >+#include <array> >+#include <cstring> >+#include <deque> >+#include <forward_list> >+#include <functional> >+#include <iterator> >+#include <limits> >+#include <list> >+#include <map> >+#include <memory> >+#include <numeric> >+#include <random> >+#include <set> >+#include <string> >+#include <tuple> >+#include <type_traits> >+#include <unordered_map> >+#include <utility> >+#include <vector> >+ >+#include "gmock/gmock.h" >+#include "gtest/gtest.h" >+#include "absl/container/flat_hash_set.h" >+#include "absl/hash/hash_testing.h" >+#include "absl/hash/internal/spy_hash_state.h" >+#include "absl/meta/type_traits.h" >+#include "absl/numeric/int128.h" >+ >+namespace { >+ >+using absl::Hash; >+using absl::hash_internal::SpyHashState; >+ >+template <typename T> >+class HashValueIntTest : public testing::Test { >+}; >+TYPED_TEST_CASE_P(HashValueIntTest); >+ >+template <typename T> >+SpyHashState SpyHash(const T& value) { >+ return SpyHashState::combine(SpyHashState(), value); >+} >+ >+// Helper trait to verify if T is hashable. We use absl::Hash's poison status to >+// detect it. >+template <typename T> >+using is_hashable = std::is_default_constructible<absl::Hash<T>>; >+ >+TYPED_TEST_P(HashValueIntTest, BasicUsage) { >+ EXPECT_TRUE((is_hashable<TypeParam>::value)); >+ >+ TypeParam n = 42; >+ EXPECT_EQ(SpyHash(n), SpyHash(TypeParam{42})); >+ EXPECT_NE(SpyHash(n), SpyHash(TypeParam{0})); >+ EXPECT_NE(SpyHash(std::numeric_limits<TypeParam>::max()), >+ SpyHash(std::numeric_limits<TypeParam>::min())); >+} >+ >+TYPED_TEST_P(HashValueIntTest, FastPath) { >+ // Test the fast-path to make sure the values are the same. 
>+ TypeParam n = 42; >+ EXPECT_EQ(absl::Hash<TypeParam>{}(n), >+ absl::Hash<std::tuple<TypeParam>>{}(std::tuple<TypeParam>(n))); >+} >+ >+REGISTER_TYPED_TEST_CASE_P(HashValueIntTest, BasicUsage, FastPath); >+using IntTypes = testing::Types<unsigned char, char, int, int32_t, int64_t, uint32_t, >+ uint64_t, size_t>; >+INSTANTIATE_TYPED_TEST_CASE_P(My, HashValueIntTest, IntTypes); >+ >+template <typename T, typename = void> >+struct IsHashCallble : std::false_type {}; >+ >+template <typename T> >+struct IsHashCallble<T, absl::void_t<decltype(std::declval<absl::Hash<T>>()( >+ std::declval<const T&>()))>> : std::true_type {}; >+ >+template <typename T, typename = void> >+struct IsAggregateInitializable : std::false_type {}; >+ >+template <typename T> >+struct IsAggregateInitializable<T, absl::void_t<decltype(T{})>> >+ : std::true_type {}; >+ >+TEST(IsHashableTest, ValidHash) { >+ EXPECT_TRUE((is_hashable<int>::value)); >+ EXPECT_TRUE(std::is_default_constructible<absl::Hash<int>>::value); >+ EXPECT_TRUE(std::is_copy_constructible<absl::Hash<int>>::value); >+ EXPECT_TRUE(std::is_move_constructible<absl::Hash<int>>::value); >+ EXPECT_TRUE(absl::is_copy_assignable<absl::Hash<int>>::value); >+ EXPECT_TRUE(absl::is_move_assignable<absl::Hash<int>>::value); >+ EXPECT_TRUE(IsHashCallble<int>::value); >+ EXPECT_TRUE(IsAggregateInitializable<absl::Hash<int>>::value); >+} >+#if ABSL_HASH_INTERNAL_CAN_POISON_ && !defined(__APPLE__) >+TEST(IsHashableTest, PoisonHash) { >+ struct X {}; >+ EXPECT_FALSE((is_hashable<X>::value)); >+ EXPECT_FALSE(std::is_default_constructible<absl::Hash<X>>::value); >+ EXPECT_FALSE(std::is_copy_constructible<absl::Hash<X>>::value); >+ EXPECT_FALSE(std::is_move_constructible<absl::Hash<X>>::value); >+ EXPECT_FALSE(absl::is_copy_assignable<absl::Hash<X>>::value); >+ EXPECT_FALSE(absl::is_move_assignable<absl::Hash<X>>::value); >+ EXPECT_FALSE(IsHashCallble<X>::value); >+ EXPECT_FALSE(IsAggregateInitializable<absl::Hash<X>>::value); >+} >+#endif // 
ABSL_HASH_INTERNAL_CAN_POISON_ >+ >+// Hashable types >+// >+// These types exist simply to exercise various AbslHashValue behaviors, so >+// they are named by what their AbslHashValue overload does. >+struct NoOp { >+ template <typename HashCode> >+ friend HashCode AbslHashValue(HashCode h, NoOp n) { >+ return std::move(h); >+ } >+}; >+ >+struct EmptyCombine { >+ template <typename HashCode> >+ friend HashCode AbslHashValue(HashCode h, EmptyCombine e) { >+ return HashCode::combine(std::move(h)); >+ } >+}; >+ >+template <typename Int> >+struct CombineIterative { >+ template <typename HashCode> >+ friend HashCode AbslHashValue(HashCode h, CombineIterative c) { >+ for (int i = 0; i < 5; ++i) { >+ h = HashCode::combine(std::move(h), Int(i)); >+ } >+ return h; >+ } >+}; >+ >+template <typename Int> >+struct CombineVariadic { >+ template <typename HashCode> >+ friend HashCode AbslHashValue(HashCode h, CombineVariadic c) { >+ return HashCode::combine(std::move(h), Int(0), Int(1), Int(2), Int(3), >+ Int(4)); >+ } >+}; >+ >+using InvokeTag = absl::hash_internal::InvokeHashTag; >+template <InvokeTag T> >+using InvokeTagConstant = std::integral_constant<InvokeTag, T>; >+ >+template <InvokeTag... Tags> >+struct MinTag; >+ >+template <InvokeTag a, InvokeTag b, InvokeTag... Tags> >+struct MinTag<a, b, Tags...> : MinTag<(a < b ? a : b), Tags...> {}; >+ >+template <InvokeTag a> >+struct MinTag<a> : InvokeTagConstant<a> {}; >+ >+template <InvokeTag... Tags> >+struct CustomHashType { >+ size_t value; >+}; >+ >+template <InvokeTag allowed, InvokeTag... tags> >+struct EnableIfContained >+ : std::enable_if<absl::disjunction< >+ std::integral_constant<bool, allowed == tags>...>::value> {}; >+ >+template < >+ typename H, InvokeTag... 
Tags, >+ typename = typename EnableIfContained<InvokeTag::kHashValue, Tags...>::type> >+H AbslHashValue(H state, CustomHashType<Tags...> t) { >+ static_assert(MinTag<Tags...>::value == InvokeTag::kHashValue, ""); >+ return H::combine(std::move(state), >+ t.value + static_cast<int>(InvokeTag::kHashValue)); >+} >+ >+} // namespace >+ >+namespace absl { >+namespace hash_internal { >+template <InvokeTag... Tags> >+struct is_uniquely_represented< >+ CustomHashType<Tags...>, >+ typename EnableIfContained<InvokeTag::kUniquelyRepresented, Tags...>::type> >+ : std::true_type {}; >+} // namespace hash_internal >+} // namespace absl >+ >+#if ABSL_HASH_INTERNAL_SUPPORT_LEGACY_HASH_ >+namespace ABSL_INTERNAL_LEGACY_HASH_NAMESPACE { >+template <InvokeTag... Tags> >+struct hash<CustomHashType<Tags...>> { >+ template <InvokeTag... TagsIn, typename = typename EnableIfContained< >+ InvokeTag::kLegacyHash, TagsIn...>::type> >+ size_t operator()(CustomHashType<TagsIn...> t) const { >+ static_assert(MinTag<Tags...>::value == InvokeTag::kLegacyHash, ""); >+ return t.value + static_cast<int>(InvokeTag::kLegacyHash); >+ } >+}; >+} // namespace ABSL_INTERNAL_LEGACY_HASH_NAMESPACE >+#endif // ABSL_HASH_INTERNAL_SUPPORT_LEGACY_HASH_ >+ >+namespace std { >+template <InvokeTag... Tags> // NOLINT >+struct hash<CustomHashType<Tags...>> { >+ template <InvokeTag... TagsIn, typename = typename EnableIfContained< >+ InvokeTag::kStdHash, TagsIn...>::type> >+ size_t operator()(CustomHashType<TagsIn...> t) const { >+ static_assert(MinTag<Tags...>::value == InvokeTag::kStdHash, ""); >+ return t.value + static_cast<int>(InvokeTag::kStdHash); >+ } >+}; >+} // namespace std >+ >+namespace { >+ >+template <typename... T> >+void TestCustomHashType(InvokeTagConstant<InvokeTag::kNone>, T...) 
{ >+ using type = CustomHashType<T::value...>; >+ SCOPED_TRACE(testing::PrintToString(std::vector<InvokeTag>{T::value...})); >+ EXPECT_TRUE(is_hashable<type>()); >+ EXPECT_TRUE(is_hashable<const type>()); >+ EXPECT_TRUE(is_hashable<const type&>()); >+ >+ const size_t offset = static_cast<int>(std::min({T::value...})); >+ EXPECT_EQ(SpyHash(type{7}), SpyHash(size_t{7 + offset})); >+} >+ >+void TestCustomHashType(InvokeTagConstant<InvokeTag::kNone>) { >+#if ABSL_HASH_INTERNAL_CAN_POISON_ >+ // is_hashable is false if we don't support any of the hooks. >+ using type = CustomHashType<>; >+ EXPECT_FALSE(is_hashable<type>()); >+ EXPECT_FALSE(is_hashable<const type>()); >+ EXPECT_FALSE(is_hashable<const type&>()); >+#endif // ABSL_HASH_INTERNAL_CAN_POISON_ >+} >+ >+template <InvokeTag Tag, typename... T> >+void TestCustomHashType(InvokeTagConstant<Tag> tag, T... t) { >+ constexpr auto next = static_cast<InvokeTag>(static_cast<int>(Tag) + 1); >+ TestCustomHashType(InvokeTagConstant<next>(), tag, t...); >+ TestCustomHashType(InvokeTagConstant<next>(), t...); >+} >+ >+TEST(HashTest, CustomHashType) { >+ TestCustomHashType(InvokeTagConstant<InvokeTag{}>()); >+} >+ >+TEST(HashTest, NoOpsAreEquivalent) { >+ EXPECT_EQ(Hash<NoOp>()({}), Hash<NoOp>()({})); >+ EXPECT_EQ(Hash<NoOp>()({}), Hash<EmptyCombine>()({})); >+} >+ >+template <typename T> >+class HashIntTest : public testing::Test { >+}; >+TYPED_TEST_CASE_P(HashIntTest); >+ >+TYPED_TEST_P(HashIntTest, BasicUsage) { >+ EXPECT_NE(Hash<NoOp>()({}), Hash<TypeParam>()(0)); >+ EXPECT_NE(Hash<NoOp>()({}), >+ Hash<TypeParam>()(std::numeric_limits<TypeParam>::max())); >+ if (std::numeric_limits<TypeParam>::min() != 0) { >+ EXPECT_NE(Hash<NoOp>()({}), >+ Hash<TypeParam>()(std::numeric_limits<TypeParam>::min())); >+ } >+ >+ EXPECT_EQ(Hash<CombineIterative<TypeParam>>()({}), >+ Hash<CombineVariadic<TypeParam>>()({})); >+} >+ >+REGISTER_TYPED_TEST_CASE_P(HashIntTest, BasicUsage); >+using IntTypes = testing::Types<unsigned char, char, int, 
int32_t, int64_t, uint32_t, >+ uint64_t, size_t>; >+INSTANTIATE_TYPED_TEST_CASE_P(My, HashIntTest, IntTypes); >+ >+struct StructWithPadding { >+ char c; >+ int i; >+ >+ template <typename H> >+ friend H AbslHashValue(H hash_state, const StructWithPadding& s) { >+ return H::combine(std::move(hash_state), s.c, s.i); >+ } >+}; >+ >+static_assert(sizeof(StructWithPadding) > sizeof(char) + sizeof(int), >+ "StructWithPadding doesn't have padding"); >+static_assert(std::is_standard_layout<StructWithPadding>::value, ""); >+ >+// This check has to be disabled because libstdc++ doesn't support it. >+// static_assert(std::is_trivially_constructible<StructWithPadding>::value, ""); >+ >+template <typename T> >+struct ArraySlice { >+ T* begin; >+ T* end; >+ >+ template <typename H> >+ friend H AbslHashValue(H hash_state, const ArraySlice& slice) { >+ for (auto t = slice.begin; t != slice.end; ++t) { >+ hash_state = H::combine(std::move(hash_state), *t); >+ } >+ return hash_state; >+ } >+}; >+ >+TEST(HashTest, HashNonUniquelyRepresentedType) { >+ // Create equal StructWithPadding objects that are known to have non-equal >+ // padding bytes. 
>+ static const size_t kNumStructs = 10; >+ unsigned char buffer1[kNumStructs * sizeof(StructWithPadding)]; >+ std::memset(buffer1, 0, sizeof(buffer1)); >+ auto* s1 = reinterpret_cast<StructWithPadding*>(buffer1); >+ >+ unsigned char buffer2[kNumStructs * sizeof(StructWithPadding)]; >+ std::memset(buffer2, 255, sizeof(buffer2)); >+ auto* s2 = reinterpret_cast<StructWithPadding*>(buffer2); >+ for (int i = 0; i < kNumStructs; ++i) { >+ SCOPED_TRACE(i); >+ s1[i].c = s2[i].c = '0' + i; >+ s1[i].i = s2[i].i = i; >+ ASSERT_FALSE(memcmp(buffer1 + i * sizeof(StructWithPadding), >+ buffer2 + i * sizeof(StructWithPadding), >+ sizeof(StructWithPadding)) == 0) >+ << "Bug in test code: objects do not have unequal" >+ << " object representations"; >+ } >+ >+ EXPECT_EQ(Hash<StructWithPadding>()(s1[0]), Hash<StructWithPadding>()(s2[0])); >+ EXPECT_EQ(Hash<ArraySlice<StructWithPadding>>()({s1, s1 + kNumStructs}), >+ Hash<ArraySlice<StructWithPadding>>()({s2, s2 + kNumStructs})); >+} >+ >+TEST(HashTest, StandardHashContainerUsage) { >+ std::unordered_map<int, std::string, Hash<int>> map = {{0, "foo"}, { 42, "bar" }}; >+ >+ EXPECT_NE(map.find(0), map.end()); >+ EXPECT_EQ(map.find(1), map.end()); >+ EXPECT_NE(map.find(0u), map.end()); >+} >+ >+struct ConvertibleFromNoOp { >+ ConvertibleFromNoOp(NoOp) {} // NOLINT(runtime/explicit) >+ >+ template <typename H> >+ friend H AbslHashValue(H hash_state, ConvertibleFromNoOp) { >+ return H::combine(std::move(hash_state), 1); >+ } >+}; >+ >+TEST(HashTest, HeterogeneousCall) { >+ EXPECT_NE(Hash<ConvertibleFromNoOp>()(NoOp()), >+ Hash<NoOp>()(NoOp())); >+} >+ >+TEST(IsUniquelyRepresentedTest, SanityTest) { >+ using absl::hash_internal::is_uniquely_represented; >+ >+ EXPECT_TRUE(is_uniquely_represented<unsigned char>::value); >+ EXPECT_TRUE(is_uniquely_represented<int>::value); >+ EXPECT_FALSE(is_uniquely_represented<bool>::value); >+ EXPECT_FALSE(is_uniquely_represented<int*>::value); >+} >+ >+struct IntAndString { >+ int i; >+ std::string s; >+ 
>+ template <typename H> >+ friend H AbslHashValue(H hash_state, IntAndString int_and_string) { >+ return H::combine(std::move(hash_state), int_and_string.s, >+ int_and_string.i); >+ } >+}; >+ >+TEST(HashTest, SmallValueOn64ByteBoundary) { >+ Hash<IntAndString>()(IntAndString{0, std::string(63, '0')}); >+} >+ >+struct TypeErased { >+ size_t n; >+ >+ template <typename H> >+ friend H AbslHashValue(H hash_state, const TypeErased& v) { >+ v.HashValue(absl::HashState::Create(&hash_state)); >+ return hash_state; >+ } >+ >+ void HashValue(absl::HashState state) const { >+ absl::HashState::combine(std::move(state), n); >+ } >+}; >+ >+TEST(HashTest, TypeErased) { >+ EXPECT_TRUE((is_hashable<TypeErased>::value)); >+ EXPECT_TRUE((is_hashable<std::pair<TypeErased, int>>::value)); >+ >+ EXPECT_EQ(SpyHash(TypeErased{7}), SpyHash(size_t{7})); >+ EXPECT_NE(SpyHash(TypeErased{7}), SpyHash(size_t{13})); >+ >+ EXPECT_EQ(SpyHash(std::make_pair(TypeErased{7}, 17)), >+ SpyHash(std::make_pair(size_t{7}, 17))); >+} >+ >+} // namespace >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/hash_testing.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/hash_testing.h >new file mode 100644 >index 0000000000000000000000000000000000000000..1e3cda64467dfab0ee1661c075f63c06bc21800d >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/hash_testing.h >@@ -0,0 +1,372 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
>+// See the License for the specific language governing permissions and >+// limitations under the License. >+ >+#ifndef ABSL_HASH_HASH_TESTING_H_ >+#define ABSL_HASH_HASH_TESTING_H_ >+ >+#include <initializer_list> >+#include <tuple> >+#include <type_traits> >+#include <vector> >+ >+#include "gmock/gmock.h" >+#include "gtest/gtest.h" >+#include "absl/hash/internal/spy_hash_state.h" >+#include "absl/meta/type_traits.h" >+#include "absl/strings/str_cat.h" >+#include "absl/types/variant.h" >+ >+namespace absl { >+ >+// Run the absl::Hash algorithm over all the elements passed in and verify that >+// their hash expansion is congruent with their `==` operator. >+// >+// It is used in conjunction with EXPECT_TRUE. Failures will output information >+// on what requirement failed and on which objects. >+// >+// Users should pass a collection of types as either an initializer list or a >+// container of cases. >+// >+// EXPECT_TRUE(absl::VerifyTypeImplementsAbslHashCorrectly( >+// {v1, v2, ..., vN})); >+// >+// std::vector<MyType> cases; >+// // Fill cases... >+// EXPECT_TRUE(absl::VerifyTypeImplementsAbslHashCorrectly(cases)); >+// >+// Users can pass a variety of types for testing heterogeneous lookup with >+// `std::make_tuple`: >+// >+// EXPECT_TRUE(absl::VerifyTypeImplementsAbslHashCorrectly( >+// std::make_tuple(v1, v2, ..., vN))); >+// >+// >+// Ideally, the values passed should provide enough coverage of the `==` >+// operator and the AbslHashValue implementations. >+// For dynamically sized types, the empty state should usually be included in >+// the values. >+// >+// The function accepts an optional comparator function, in case that `==` is >+// not enough for the values provided. >+// >+// Usage: >+// >+// EXPECT_TRUE(absl::VerifyTypeImplementsAbslHashCorrectly( >+// std::make_tuple(v1, v2, ..., vN), MyCustomEq{})); >+// >+// It checks the following requirements: >+// 1. The expansion for a value is deterministic. >+// 2. 
For any two objects `a` and `b` in the sequence, if `a == b` evaluates >+// to true, then their hash expansion must be equal. >+// 3. If `a == b` evaluates to false their hash expansion must be unequal. >+// 4. If `a == b` evaluates to false neither hash expansion can be a >+// suffix of the other. >+// 5. AbslHashValue overloads should not be called by the user. They are only >+// meant to be called by the framework. Users should call H::combine() and >+// H::combine_contiguous(). >+// 6. No moved-from instance of the hash state is used in the implementation >+// of AbslHashValue. >+// >+// The values do not have to have the same type. This can be useful for >+// equivalent types that support heterogeneous lookup. >+// >+// A possible reason for breaking (2) is combining state in the hash expansion >+// that was not used in `==`. >+// For example: >+// >+// struct Bad2 { >+// int a, b; >+// template <typename H> >+// friend H AbslHashValue(H state, Bad2 x) { >+// // Uses a and b. >+// return H::combine(x.a, x.b); >+// } >+// friend bool operator==(Bad2 x, Bad2 y) { >+// // Only uses a. >+// return x.a == y.a; >+// } >+// }; >+// >+// As for (3), breaking this usually means that there is state being passed to >+// the `==` operator that is not used in the hash expansion. >+// For example: >+// >+// struct Bad3 { >+// int a, b; >+// template <typename H> >+// friend H AbslHashValue(H state, Bad3 x) { >+// // Only uses a. >+// return H::combine(x.a); >+// } >+// friend bool operator==(Bad3 x, Bad3 y) { >+// // Uses a and b. >+// return x.a == y.a && x.b == y.b; >+// } >+// }; >+// >+// Finally, a common way to break 4 is by combining dynamic ranges without >+// combining the size of the range. 
>+// For example: >+// >+// struct Bad4 { >+// int *p, size; >+// template <typename H> >+// friend H AbslHashValue(H state, Bad4 x) { >+// return H::combine_range(x.p, x.p + x.size); >+// } >+// friend bool operator==(Bad4 x, Bad4 y) { >+// return std::equal(x.p, x.p + x.size, y.p, y.p + y.size); >+// } >+// }; >+// >+// An easy solution to this is to combine the size after combining the range, >+// like so: >+// template <typename H> >+// friend H AbslHashValue(H state, Bad4 x) { >+// return H::combine(H::combine_range(x.p, x.p + x.size), x.size); >+// } >+// >+template <int&... ExplicitBarrier, typename Container> >+ABSL_MUST_USE_RESULT testing::AssertionResult >+VerifyTypeImplementsAbslHashCorrectly(const Container& values); >+ >+template <int&... ExplicitBarrier, typename Container, typename Eq> >+ABSL_MUST_USE_RESULT testing::AssertionResult >+VerifyTypeImplementsAbslHashCorrectly(const Container& values, Eq equals); >+ >+template <int&..., typename T> >+ABSL_MUST_USE_RESULT testing::AssertionResult >+VerifyTypeImplementsAbslHashCorrectly(std::initializer_list<T> values); >+ >+template <int&..., typename T, typename Eq> >+ABSL_MUST_USE_RESULT testing::AssertionResult >+VerifyTypeImplementsAbslHashCorrectly(std::initializer_list<T> values, >+ Eq equals); >+ >+namespace hash_internal { >+ >+struct PrintVisitor { >+ size_t index; >+ template <typename T> >+ std::string operator()(const T* value) const { >+ return absl::StrCat("#", index, "(", testing::PrintToString(*value), ")"); >+ } >+}; >+ >+template <typename Eq> >+struct EqVisitor { >+ Eq eq; >+ template <typename T, typename U> >+ bool operator()(const T* t, const U* u) const { >+ return eq(*t, *u); >+ } >+}; >+ >+struct ExpandVisitor { >+ template <typename T> >+ SpyHashState operator()(const T* value) const { >+ return SpyHashState::combine(SpyHashState(), *value); >+ } >+}; >+ >+template <typename Container, typename Eq> >+ABSL_MUST_USE_RESULT testing::AssertionResult 
>+VerifyTypeImplementsAbslHashCorrectly(const Container& values, Eq equals) { >+ using V = typename Container::value_type; >+ >+ struct Info { >+ const V& value; >+ size_t index; >+ std::string ToString() const { return absl::visit(PrintVisitor{index}, value); } >+ SpyHashState expand() const { return absl::visit(ExpandVisitor{}, value); } >+ }; >+ >+ using EqClass = std::vector<Info>; >+ std::vector<EqClass> classes; >+ >+ // Gather the values in equivalence classes. >+ size_t i = 0; >+ for (const auto& value : values) { >+ EqClass* c = nullptr; >+ for (auto& eqclass : classes) { >+ if (absl::visit(EqVisitor<Eq>{equals}, value, eqclass[0].value)) { >+ c = &eqclass; >+ break; >+ } >+ } >+ if (c == nullptr) { >+ classes.emplace_back(); >+ c = &classes.back(); >+ } >+ c->push_back({value, i}); >+ ++i; >+ >+ // Verify potential errors captured by SpyHashState. >+ if (auto error = c->back().expand().error()) { >+ return testing::AssertionFailure() << *error; >+ } >+ } >+ >+ if (classes.size() < 2) { >+ return testing::AssertionFailure() >+ << "At least two equivalence classes are expected."; >+ } >+ >+ // We assume that equality is correctly implemented. >+ // Now we verify that AbslHashValue is also correctly implemented. >+ >+ for (const auto& c : classes) { >+ // All elements of the equivalence class must have the same hash expansion. >+ const SpyHashState expected = c[0].expand(); >+ for (const Info& v : c) { >+ if (v.expand() != v.expand()) { >+ return testing::AssertionFailure() >+ << "Hash expansion for " << v.ToString() >+ << " is non-deterministic."; >+ } >+ if (v.expand() != expected) { >+ return testing::AssertionFailure() >+ << "Values " << c[0].ToString() << " and " << v.ToString() >+ << " evaluate as equal but have an unequal hash expansion."; >+ } >+ } >+ >+ // Elements from other classes must have different hash expansion. 
>+ for (const auto& c2 : classes) { >+ if (&c == &c2) continue; >+ const SpyHashState c2_hash = c2[0].expand(); >+ switch (SpyHashState::Compare(expected, c2_hash)) { >+ case SpyHashState::CompareResult::kEqual: >+ return testing::AssertionFailure() >+ << "Values " << c[0].ToString() << " and " << c2[0].ToString() >+ << " evaluate as unequal but have an equal hash expansion."; >+ case SpyHashState::CompareResult::kBSuffixA: >+ return testing::AssertionFailure() >+ << "Hash expansion of " << c2[0].ToString() >+ << " is a suffix of the hash expansion of " << c[0].ToString() >+ << "."; >+ case SpyHashState::CompareResult::kASuffixB: >+ return testing::AssertionFailure() >+ << "Hash expansion of " << c[0].ToString() >+ << " is a suffix of the hash expansion of " << c2[0].ToString() >+ << "."; >+ case SpyHashState::CompareResult::kUnequal: >+ break; >+ } >+ } >+ } >+ return testing::AssertionSuccess(); >+} >+ >+template <typename... T> >+struct TypeSet { >+ template <typename U, bool = disjunction<std::is_same<T, U>...>::value> >+ struct Insert { >+ using type = TypeSet<U, T...>; >+ }; >+ template <typename U> >+ struct Insert<U, true> { >+ using type = TypeSet; >+ }; >+ >+ template <template <typename...> class C> >+ using apply = C<T...>; >+}; >+ >+template <typename... T> >+struct MakeTypeSet : TypeSet<>{}; >+template <typename T, typename... Ts> >+struct MakeTypeSet<T, Ts...> : MakeTypeSet<Ts...>::template Insert<T>::type {}; >+ >+template <typename... T> >+using VariantForTypes = typename MakeTypeSet< >+ const typename std::decay<T>::type*...>::template apply<absl::variant>; >+ >+template <typename Container> >+struct ContainerAsVector { >+ using V = absl::variant<const typename Container::value_type*>; >+ using Out = std::vector<V>; >+ >+ static Out Do(const Container& values) { >+ Out out; >+ for (const auto& v : values) out.push_back(&v); >+ return out; >+ } >+}; >+ >+template <typename... 
T> >+struct ContainerAsVector<std::tuple<T...>> { >+ using V = VariantForTypes<T...>; >+ using Out = std::vector<V>; >+ >+ template <size_t... I> >+ static Out DoImpl(const std::tuple<T...>& tuple, absl::index_sequence<I...>) { >+ return Out{&std::get<I>(tuple)...}; >+ } >+ >+ static Out Do(const std::tuple<T...>& values) { >+ return DoImpl(values, absl::index_sequence_for<T...>()); >+ } >+}; >+ >+template <> >+struct ContainerAsVector<std::tuple<>> { >+ static std::vector<VariantForTypes<int>> Do(std::tuple<>) { return {}; } >+}; >+ >+struct DefaultEquals { >+ template <typename T, typename U> >+ bool operator()(const T& t, const U& u) const { >+ return t == u; >+ } >+}; >+ >+} // namespace hash_internal >+ >+template <int&..., typename Container> >+ABSL_MUST_USE_RESULT testing::AssertionResult >+VerifyTypeImplementsAbslHashCorrectly(const Container& values) { >+ return hash_internal::VerifyTypeImplementsAbslHashCorrectly( >+ hash_internal::ContainerAsVector<Container>::Do(values), >+ hash_internal::DefaultEquals{}); >+} >+ >+template <int&..., typename Container, typename Eq> >+ABSL_MUST_USE_RESULT testing::AssertionResult >+VerifyTypeImplementsAbslHashCorrectly(const Container& values, Eq equals) { >+ return hash_internal::VerifyTypeImplementsAbslHashCorrectly( >+ hash_internal::ContainerAsVector<Container>::Do(values), >+ equals); >+} >+ >+template <int&..., typename T> >+ABSL_MUST_USE_RESULT testing::AssertionResult >+VerifyTypeImplementsAbslHashCorrectly(std::initializer_list<T> values) { >+ return hash_internal::VerifyTypeImplementsAbslHashCorrectly( >+ hash_internal::ContainerAsVector<std::initializer_list<T>>::Do(values), >+ hash_internal::DefaultEquals{}); >+} >+ >+template <int&..., typename T, typename Eq> >+ABSL_MUST_USE_RESULT testing::AssertionResult >+VerifyTypeImplementsAbslHashCorrectly(std::initializer_list<T> values, >+ Eq equals) { >+ return hash_internal::VerifyTypeImplementsAbslHashCorrectly( >+ 
hash_internal::ContainerAsVector<std::initializer_list<T>>::Do(values), >+ equals); >+} >+ >+} // namespace absl >+ >+#endif // ABSL_HASH_HASH_TESTING_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/internal/city.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/internal/city.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..8f72dd1b7d0c9f538d91d627dff0662e4e0c9b2c >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/internal/city.cc >@@ -0,0 +1,590 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. >+// >+// This file provides CityHash64() and related functions. >+// >+// It's probably possible to create even faster hash functions by >+// writing a program that systematically explores some of the space of >+// possible hash functions, by using SIMD instructions, or by >+// compromising on hash quality. 
>+ >+#include "absl/hash/internal/city.h" >+ >+#include <string.h> // for memcpy and memset >+#include <algorithm> >+ >+#include "absl/base/config.h" >+#include "absl/base/internal/endian.h" >+#include "absl/base/internal/unaligned_access.h" >+#include "absl/base/optimization.h" >+ >+namespace absl { >+namespace hash_internal { >+ >+#ifdef ABSL_IS_BIG_ENDIAN >+#define uint32_in_expected_order(x) (absl::gbswap_32(x)) >+#define uint64_in_expected_order(x) (absl::gbswap_64(x)) >+#else >+#define uint32_in_expected_order(x) (x) >+#define uint64_in_expected_order(x) (x) >+#endif >+ >+static uint64_t Fetch64(const char *p) { >+ return uint64_in_expected_order(ABSL_INTERNAL_UNALIGNED_LOAD64(p)); >+} >+ >+static uint32_t Fetch32(const char *p) { >+ return uint32_in_expected_order(ABSL_INTERNAL_UNALIGNED_LOAD32(p)); >+} >+ >+// Some primes between 2^63 and 2^64 for various uses. >+static const uint64_t k0 = 0xc3a5c85c97cb3127ULL; >+static const uint64_t k1 = 0xb492b66fbe98f273ULL; >+static const uint64_t k2 = 0x9ae16a3b2f90404fULL; >+ >+// Magic numbers for 32-bit hashing. Copied from Murmur3. >+static const uint32_t c1 = 0xcc9e2d51; >+static const uint32_t c2 = 0x1b873593; >+ >+// A 32-bit to 32-bit integer hash copied from Murmur3. >+static uint32_t fmix(uint32_t h) { >+ h ^= h >> 16; >+ h *= 0x85ebca6b; >+ h ^= h >> 13; >+ h *= 0xc2b2ae35; >+ h ^= h >> 16; >+ return h; >+} >+ >+static uint32_t Rotate32(uint32_t val, int shift) { >+ // Avoid shifting by 32: doing so yields an undefined result. >+ return shift == 0 ? val : ((val >> shift) | (val << (32 - shift))); >+} >+ >+#undef PERMUTE3 >+#define PERMUTE3(a, b, c) \ >+ do { \ >+ std::swap(a, b); \ >+ std::swap(a, c); \ >+ } while (0) >+ >+static uint32_t Mur(uint32_t a, uint32_t h) { >+ // Helper from Murmur3 for combining two 32-bit values. 
>+ a *= c1; >+ a = Rotate32(a, 17); >+ a *= c2; >+ h ^= a; >+ h = Rotate32(h, 19); >+ return h * 5 + 0xe6546b64; >+} >+ >+static uint32_t Hash32Len13to24(const char *s, size_t len) { >+ uint32_t a = Fetch32(s - 4 + (len >> 1)); >+ uint32_t b = Fetch32(s + 4); >+ uint32_t c = Fetch32(s + len - 8); >+ uint32_t d = Fetch32(s + (len >> 1)); >+ uint32_t e = Fetch32(s); >+ uint32_t f = Fetch32(s + len - 4); >+ uint32_t h = len; >+ >+ return fmix(Mur(f, Mur(e, Mur(d, Mur(c, Mur(b, Mur(a, h))))))); >+} >+ >+static uint32_t Hash32Len0to4(const char *s, size_t len) { >+ uint32_t b = 0; >+ uint32_t c = 9; >+ for (size_t i = 0; i < len; i++) { >+ signed char v = s[i]; >+ b = b * c1 + v; >+ c ^= b; >+ } >+ return fmix(Mur(b, Mur(len, c))); >+} >+ >+static uint32_t Hash32Len5to12(const char *s, size_t len) { >+ uint32_t a = len, b = len * 5, c = 9, d = b; >+ a += Fetch32(s); >+ b += Fetch32(s + len - 4); >+ c += Fetch32(s + ((len >> 1) & 4)); >+ return fmix(Mur(c, Mur(b, Mur(a, d)))); >+} >+ >+uint32_t CityHash32(const char *s, size_t len) { >+ if (len <= 24) { >+ return len <= 12 >+ ? (len <= 4 ? 
Hash32Len0to4(s, len) : Hash32Len5to12(s, len)) >+ : Hash32Len13to24(s, len); >+ } >+ >+ // len > 24 >+ uint32_t h = len, g = c1 * len, f = g; >+ >+ uint32_t a0 = Rotate32(Fetch32(s + len - 4) * c1, 17) * c2; >+ uint32_t a1 = Rotate32(Fetch32(s + len - 8) * c1, 17) * c2; >+ uint32_t a2 = Rotate32(Fetch32(s + len - 16) * c1, 17) * c2; >+ uint32_t a3 = Rotate32(Fetch32(s + len - 12) * c1, 17) * c2; >+ uint32_t a4 = Rotate32(Fetch32(s + len - 20) * c1, 17) * c2; >+ h ^= a0; >+ h = Rotate32(h, 19); >+ h = h * 5 + 0xe6546b64; >+ h ^= a2; >+ h = Rotate32(h, 19); >+ h = h * 5 + 0xe6546b64; >+ g ^= a1; >+ g = Rotate32(g, 19); >+ g = g * 5 + 0xe6546b64; >+ g ^= a3; >+ g = Rotate32(g, 19); >+ g = g * 5 + 0xe6546b64; >+ f += a4; >+ f = Rotate32(f, 19); >+ f = f * 5 + 0xe6546b64; >+ size_t iters = (len - 1) / 20; >+ do { >+ uint32_t b0 = Rotate32(Fetch32(s) * c1, 17) * c2; >+ uint32_t b1 = Fetch32(s + 4); >+ uint32_t b2 = Rotate32(Fetch32(s + 8) * c1, 17) * c2; >+ uint32_t b3 = Rotate32(Fetch32(s + 12) * c1, 17) * c2; >+ uint32_t b4 = Fetch32(s + 16); >+ h ^= b0; >+ h = Rotate32(h, 18); >+ h = h * 5 + 0xe6546b64; >+ f += b1; >+ f = Rotate32(f, 19); >+ f = f * c1; >+ g += b2; >+ g = Rotate32(g, 18); >+ g = g * 5 + 0xe6546b64; >+ h ^= b3 + b1; >+ h = Rotate32(h, 19); >+ h = h * 5 + 0xe6546b64; >+ g ^= b4; >+ g = absl::gbswap_32(g) * 5; >+ h += b4 * 5; >+ h = absl::gbswap_32(h); >+ f += b0; >+ PERMUTE3(f, h, g); >+ s += 20; >+ } while (--iters != 0); >+ g = Rotate32(g, 11) * c1; >+ g = Rotate32(g, 17) * c1; >+ f = Rotate32(f, 11) * c1; >+ f = Rotate32(f, 17) * c1; >+ h = Rotate32(h + g, 19); >+ h = h * 5 + 0xe6546b64; >+ h = Rotate32(h, 17) * c1; >+ h = Rotate32(h + f, 19); >+ h = h * 5 + 0xe6546b64; >+ h = Rotate32(h, 17) * c1; >+ return h; >+} >+ >+// Bitwise right rotate. Normally this will compile to a single >+// instruction, especially if the shift is a manifest constant. 
>+static uint64_t Rotate(uint64_t val, int shift) { >+ // Avoid shifting by 64: doing so yields an undefined result. >+ return shift == 0 ? val : ((val >> shift) | (val << (64 - shift))); >+} >+ >+static uint64_t ShiftMix(uint64_t val) { return val ^ (val >> 47); } >+ >+static uint64_t HashLen16(uint64_t u, uint64_t v) { >+ return Hash128to64(uint128(u, v)); >+} >+ >+static uint64_t HashLen16(uint64_t u, uint64_t v, uint64_t mul) { >+ // Murmur-inspired hashing. >+ uint64_t a = (u ^ v) * mul; >+ a ^= (a >> 47); >+ uint64_t b = (v ^ a) * mul; >+ b ^= (b >> 47); >+ b *= mul; >+ return b; >+} >+ >+static uint64_t HashLen0to16(const char *s, size_t len) { >+ if (len >= 8) { >+ uint64_t mul = k2 + len * 2; >+ uint64_t a = Fetch64(s) + k2; >+ uint64_t b = Fetch64(s + len - 8); >+ uint64_t c = Rotate(b, 37) * mul + a; >+ uint64_t d = (Rotate(a, 25) + b) * mul; >+ return HashLen16(c, d, mul); >+ } >+ if (len >= 4) { >+ uint64_t mul = k2 + len * 2; >+ uint64_t a = Fetch32(s); >+ return HashLen16(len + (a << 3), Fetch32(s + len - 4), mul); >+ } >+ if (len > 0) { >+ uint8_t a = s[0]; >+ uint8_t b = s[len >> 1]; >+ uint8_t c = s[len - 1]; >+ uint32_t y = static_cast<uint32_t>(a) + (static_cast<uint32_t>(b) << 8); >+ uint32_t z = len + (static_cast<uint32_t>(c) << 2); >+ return ShiftMix(y * k2 ^ z * k0) * k2; >+ } >+ return k2; >+} >+ >+// This probably works well for 16-byte strings as well, but it may be overkill >+// in that case. >+static uint64_t HashLen17to32(const char *s, size_t len) { >+ uint64_t mul = k2 + len * 2; >+ uint64_t a = Fetch64(s) * k1; >+ uint64_t b = Fetch64(s + 8); >+ uint64_t c = Fetch64(s + len - 8) * mul; >+ uint64_t d = Fetch64(s + len - 16) * k2; >+ return HashLen16(Rotate(a + b, 43) + Rotate(c, 30) + d, >+ a + Rotate(b + k2, 18) + c, mul); >+} >+ >+// Return a 16-byte hash for 48 bytes. Quick and dirty. >+// Callers do best to use "random-looking" values for a and b. 
>+static std::pair<uint64_t, uint64_t> WeakHashLen32WithSeeds(uint64_t w, uint64_t x, >+ uint64_t y, uint64_t z, >+ uint64_t a, uint64_t b) { >+ a += w; >+ b = Rotate(b + a + z, 21); >+ uint64_t c = a; >+ a += x; >+ a += y; >+ b += Rotate(a, 44); >+ return std::make_pair(a + z, b + c); >+} >+ >+// Return a 16-byte hash for s[0] ... s[31], a, and b. Quick and dirty. >+static std::pair<uint64_t, uint64_t> WeakHashLen32WithSeeds(const char *s, uint64_t a, >+ uint64_t b) { >+ return WeakHashLen32WithSeeds(Fetch64(s), Fetch64(s + 8), Fetch64(s + 16), >+ Fetch64(s + 24), a, b); >+} >+ >+// Return an 8-byte hash for 33 to 64 bytes. >+static uint64_t HashLen33to64(const char *s, size_t len) { >+ uint64_t mul = k2 + len * 2; >+ uint64_t a = Fetch64(s) * k2; >+ uint64_t b = Fetch64(s + 8); >+ uint64_t c = Fetch64(s + len - 24); >+ uint64_t d = Fetch64(s + len - 32); >+ uint64_t e = Fetch64(s + 16) * k2; >+ uint64_t f = Fetch64(s + 24) * 9; >+ uint64_t g = Fetch64(s + len - 8); >+ uint64_t h = Fetch64(s + len - 16) * mul; >+ uint64_t u = Rotate(a + g, 43) + (Rotate(b, 30) + c) * 9; >+ uint64_t v = ((a + g) ^ d) + f + 1; >+ uint64_t w = absl::gbswap_64((u + v) * mul) + h; >+ uint64_t x = Rotate(e + f, 42) + c; >+ uint64_t y = (absl::gbswap_64((v + w) * mul) + g) * mul; >+ uint64_t z = e + f + c; >+ a = absl::gbswap_64((x + z) * mul + y) + b; >+ b = ShiftMix((z + a) * mul + d + h) * mul; >+ return b + x; >+} >+ >+uint64_t CityHash64(const char *s, size_t len) { >+ if (len <= 32) { >+ if (len <= 16) { >+ return HashLen0to16(s, len); >+ } else { >+ return HashLen17to32(s, len); >+ } >+ } else if (len <= 64) { >+ return HashLen33to64(s, len); >+ } >+ >+ // For strings over 64 bytes we hash the end first, and then as we >+ // loop we keep 56 bytes of state: v, w, x, y, and z. 
>+ uint64_t x = Fetch64(s + len - 40); >+ uint64_t y = Fetch64(s + len - 16) + Fetch64(s + len - 56); >+ uint64_t z = HashLen16(Fetch64(s + len - 48) + len, Fetch64(s + len - 24)); >+ std::pair<uint64_t, uint64_t> v = WeakHashLen32WithSeeds(s + len - 64, len, z); >+ std::pair<uint64_t, uint64_t> w = WeakHashLen32WithSeeds(s + len - 32, y + k1, x); >+ x = x * k1 + Fetch64(s); >+ >+ // Decrease len to the nearest multiple of 64, and operate on 64-byte chunks. >+ len = (len - 1) & ~static_cast<size_t>(63); >+ do { >+ x = Rotate(x + y + v.first + Fetch64(s + 8), 37) * k1; >+ y = Rotate(y + v.second + Fetch64(s + 48), 42) * k1; >+ x ^= w.second; >+ y += v.first + Fetch64(s + 40); >+ z = Rotate(z + w.first, 33) * k1; >+ v = WeakHashLen32WithSeeds(s, v.second * k1, x + w.first); >+ w = WeakHashLen32WithSeeds(s + 32, z + w.second, y + Fetch64(s + 16)); >+ std::swap(z, x); >+ s += 64; >+ len -= 64; >+ } while (len != 0); >+ return HashLen16(HashLen16(v.first, w.first) + ShiftMix(y) * k1 + z, >+ HashLen16(v.second, w.second) + x); >+} >+ >+uint64_t CityHash64WithSeed(const char *s, size_t len, uint64_t seed) { >+ return CityHash64WithSeeds(s, len, k2, seed); >+} >+ >+uint64_t CityHash64WithSeeds(const char *s, size_t len, uint64_t seed0, >+ uint64_t seed1) { >+ return HashLen16(CityHash64(s, len) - seed0, seed1); >+} >+ >+// A subroutine for CityHash128(). Returns a decent 128-bit hash for strings >+// of any length representable in signed long. Based on City and Murmur. >+static uint128 CityMurmur(const char *s, size_t len, uint128 seed) { >+ uint64_t a = Uint128Low64(seed); >+ uint64_t b = Uint128High64(seed); >+ uint64_t c = 0; >+ uint64_t d = 0; >+ int64_t l = len - 16; >+ if (l <= 0) { // len <= 16 >+ a = ShiftMix(a * k1) * k1; >+ c = b * k1 + HashLen0to16(s, len); >+ d = ShiftMix(a + (len >= 8 ? 
Fetch64(s) : c)); >+ } else { // len > 16 >+ c = HashLen16(Fetch64(s + len - 8) + k1, a); >+ d = HashLen16(b + len, c + Fetch64(s + len - 16)); >+ a += d; >+ do { >+ a ^= ShiftMix(Fetch64(s) * k1) * k1; >+ a *= k1; >+ b ^= a; >+ c ^= ShiftMix(Fetch64(s + 8) * k1) * k1; >+ c *= k1; >+ d ^= c; >+ s += 16; >+ l -= 16; >+ } while (l > 0); >+ } >+ a = HashLen16(a, c); >+ b = HashLen16(d, b); >+ return uint128(a ^ b, HashLen16(b, a)); >+} >+ >+uint128 CityHash128WithSeed(const char *s, size_t len, uint128 seed) { >+ if (len < 128) { >+ return CityMurmur(s, len, seed); >+ } >+ >+ // We expect len >= 128 to be the common case. Keep 56 bytes of state: >+ // v, w, x, y, and z. >+ std::pair<uint64_t, uint64_t> v, w; >+ uint64_t x = Uint128Low64(seed); >+ uint64_t y = Uint128High64(seed); >+ uint64_t z = len * k1; >+ v.first = Rotate(y ^ k1, 49) * k1 + Fetch64(s); >+ v.second = Rotate(v.first, 42) * k1 + Fetch64(s + 8); >+ w.first = Rotate(y + z, 35) * k1 + x; >+ w.second = Rotate(x + Fetch64(s + 88), 53) * k1; >+ >+ // This is the same inner loop as CityHash64(), manually unrolled. 
>+ do { >+ x = Rotate(x + y + v.first + Fetch64(s + 8), 37) * k1; >+ y = Rotate(y + v.second + Fetch64(s + 48), 42) * k1; >+ x ^= w.second; >+ y += v.first + Fetch64(s + 40); >+ z = Rotate(z + w.first, 33) * k1; >+ v = WeakHashLen32WithSeeds(s, v.second * k1, x + w.first); >+ w = WeakHashLen32WithSeeds(s + 32, z + w.second, y + Fetch64(s + 16)); >+ std::swap(z, x); >+ s += 64; >+ x = Rotate(x + y + v.first + Fetch64(s + 8), 37) * k1; >+ y = Rotate(y + v.second + Fetch64(s + 48), 42) * k1; >+ x ^= w.second; >+ y += v.first + Fetch64(s + 40); >+ z = Rotate(z + w.first, 33) * k1; >+ v = WeakHashLen32WithSeeds(s, v.second * k1, x + w.first); >+ w = WeakHashLen32WithSeeds(s + 32, z + w.second, y + Fetch64(s + 16)); >+ std::swap(z, x); >+ s += 64; >+ len -= 128; >+ } while (ABSL_PREDICT_TRUE(len >= 128)); >+ x += Rotate(v.first + z, 49) * k0; >+ y = y * k0 + Rotate(w.second, 37); >+ z = z * k0 + Rotate(w.first, 27); >+ w.first *= 9; >+ v.first *= k0; >+ // If 0 < len < 128, hash up to 4 chunks of 32 bytes each from the end of s. >+ for (size_t tail_done = 0; tail_done < len;) { >+ tail_done += 32; >+ y = Rotate(x + y, 42) * k0 + v.second; >+ w.first += Fetch64(s + len - tail_done + 16); >+ x = x * k0 + w.first; >+ z += w.second + Fetch64(s + len - tail_done); >+ w.second += v.first; >+ v = WeakHashLen32WithSeeds(s + len - tail_done, v.first + z, v.second); >+ v.first *= k0; >+ } >+ // At this point our 56 bytes of state should contain more than >+ // enough information for a strong 128-bit hash. We use two >+ // different 56-byte-to-8-byte hashes to get a 16-byte final result. >+ x = HashLen16(x, v.first); >+ y = HashLen16(y + z, w.first); >+ return uint128(HashLen16(x + v.second, w.second) + y, >+ HashLen16(x + w.second, y + v.second)); >+} >+ >+uint128 CityHash128(const char *s, size_t len) { >+ return len >= 16 >+ ? 
CityHash128WithSeed(s + 16, len - 16, >+ uint128(Fetch64(s), Fetch64(s + 8) + k0)) >+ : CityHash128WithSeed(s, len, uint128(k0, k1)); >+} >+} // namespace hash_internal >+} // namespace absl >+ >+#ifdef __SSE4_2__ >+#include <nmmintrin.h> >+#include "absl/hash/internal/city_crc.h" >+ >+namespace absl { >+namespace hash_internal { >+ >+// Requires len >= 240. >+static void CityHashCrc256Long(const char *s, size_t len, uint32_t seed, >+ uint64_t *result) { >+ uint64_t a = Fetch64(s + 56) + k0; >+ uint64_t b = Fetch64(s + 96) + k0; >+ uint64_t c = result[0] = HashLen16(b, len); >+ uint64_t d = result[1] = Fetch64(s + 120) * k0 + len; >+ uint64_t e = Fetch64(s + 184) + seed; >+ uint64_t f = 0; >+ uint64_t g = 0; >+ uint64_t h = c + d; >+ uint64_t x = seed; >+ uint64_t y = 0; >+ uint64_t z = 0; >+ >+ // 240 bytes of input per iter. >+ size_t iters = len / 240; >+ len -= iters * 240; >+ do { >+#undef CHUNK >+#define CHUNK(r) \ >+ PERMUTE3(x, z, y); \ >+ b += Fetch64(s); \ >+ c += Fetch64(s + 8); \ >+ d += Fetch64(s + 16); \ >+ e += Fetch64(s + 24); \ >+ f += Fetch64(s + 32); \ >+ a += b; \ >+ h += f; \ >+ b += c; \ >+ f += d; \ >+ g += e; \ >+ e += z; \ >+ g += x; \ >+ z = _mm_crc32_u64(z, b + g); \ >+ y = _mm_crc32_u64(y, e + h); \ >+ x = _mm_crc32_u64(x, f + a); \ >+ e = Rotate(e, r); \ >+ c += e; \ >+ s += 40 >+ >+ CHUNK(0); >+ PERMUTE3(a, h, c); >+ CHUNK(33); >+ PERMUTE3(a, h, f); >+ CHUNK(0); >+ PERMUTE3(b, h, f); >+ CHUNK(42); >+ PERMUTE3(b, h, d); >+ CHUNK(0); >+ PERMUTE3(b, h, e); >+ CHUNK(33); >+ PERMUTE3(a, h, e); >+ } while (--iters > 0); >+ >+ while (len >= 40) { >+ CHUNK(29); >+ e ^= Rotate(a, 20); >+ h += Rotate(b, 30); >+ g ^= Rotate(c, 40); >+ f += Rotate(d, 34); >+ PERMUTE3(c, h, g); >+ len -= 40; >+ } >+ if (len > 0) { >+ s = s + len - 40; >+ CHUNK(33); >+ e ^= Rotate(a, 43); >+ h += Rotate(b, 42); >+ g ^= Rotate(c, 41); >+ f += Rotate(d, 40); >+ } >+ result[0] ^= h; >+ result[1] ^= g; >+ g += h; >+ a = HashLen16(a, g + z); >+ x += y << 32; >+ b += x; 
>+ c = HashLen16(c, z) + h; >+ d = HashLen16(d, e + result[0]); >+ g += e; >+ h += HashLen16(x, f); >+ e = HashLen16(a, d) + g; >+ z = HashLen16(b, c) + a; >+ y = HashLen16(g, h) + c; >+ result[0] = e + z + y + x; >+ a = ShiftMix((a + y) * k0) * k0 + b; >+ result[1] += a + result[0]; >+ a = ShiftMix(a * k0) * k0 + c; >+ result[2] = a + result[1]; >+ a = ShiftMix((a + e) * k0) * k0; >+ result[3] = a + result[2]; >+} >+ >+// Requires len < 240. >+static void CityHashCrc256Short(const char *s, size_t len, uint64_t *result) { >+ char buf[240]; >+ memcpy(buf, s, len); >+ memset(buf + len, 0, 240 - len); >+ CityHashCrc256Long(buf, 240, ~static_cast<uint32_t>(len), result); >+} >+ >+void CityHashCrc256(const char *s, size_t len, uint64_t *result) { >+ if (ABSL_PREDICT_TRUE(len >= 240)) { >+ CityHashCrc256Long(s, len, 0, result); >+ } else { >+ CityHashCrc256Short(s, len, result); >+ } >+} >+ >+uint128 CityHashCrc128WithSeed(const char *s, size_t len, uint128 seed) { >+ if (len <= 900) { >+ return CityHash128WithSeed(s, len, seed); >+ } else { >+ uint64_t result[4]; >+ CityHashCrc256(s, len, result); >+ uint64_t u = Uint128High64(seed) + result[0]; >+ uint64_t v = Uint128Low64(seed) + result[1]; >+ return uint128(HashLen16(u, v + result[2]), >+ HashLen16(Rotate(v, 32), u * k0 + result[3])); >+ } >+} >+ >+uint128 CityHashCrc128(const char *s, size_t len) { >+ if (len <= 900) { >+ return CityHash128(s, len); >+ } else { >+ uint64_t result[4]; >+ CityHashCrc256(s, len, result); >+ return uint128(result[2], result[3]); >+ } >+} >+ >+} // namespace hash_internal >+} // namespace absl >+ >+#endif >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/internal/city.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/internal/city.h >new file mode 100644 >index 0000000000000000000000000000000000000000..55b37b875f0b3fd8467da99dbed4297ac89f1819 >--- /dev/null >+++ 
b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/internal/city.h >@@ -0,0 +1,108 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. >+// >+// http://code.google.com/p/cityhash/ >+// >+// This file provides a few functions for hashing strings. All of them are >+// high-quality functions in the sense that they pass standard tests such >+// as Austin Appleby's SMHasher. They are also fast. >+// >+// For 64-bit x86 code, on short strings, we don't know of anything faster than >+// CityHash64 that is of comparable quality. We believe our nearest competitor >+// is Murmur3. For 64-bit x86 code, CityHash64 is an excellent choice for hash >+// tables and most other hashing (excluding cryptography). >+// >+// For 64-bit x86 code, on long strings, the picture is more complicated. >+// On many recent Intel CPUs, such as Nehalem, Westmere, Sandy Bridge, etc., >+// CityHashCrc128 appears to be faster than all competitors of comparable >+// quality. CityHash128 is also good but not quite as fast. We believe our >+// nearest competitor is Bob Jenkins' Spooky. We don't have great data for >+// other 64-bit CPUs, but for long strings we know that Spooky is slightly >+// faster than CityHash on some relatively recent AMD x86-64 CPUs, for example. >+// Note that CityHashCrc128 is declared in citycrc.h. 
>+// >+// For 32-bit x86 code, we don't know of anything faster than CityHash32 that >+// is of comparable quality. We believe our nearest competitor is Murmur3A. >+// (On 64-bit CPUs, it is typically faster to use the other CityHash variants.) >+// >+// Functions in the CityHash family are not suitable for cryptography. >+// >+// Please see CityHash's README file for more details on our performance >+// measurements and so on. >+// >+// WARNING: This code has been only lightly tested on big-endian platforms! >+// It is known to work well on little-endian platforms that have a small penalty >+// for unaligned reads, such as current Intel and AMD moderate-to-high-end CPUs. >+// It should work on all 32-bit and 64-bit platforms that allow unaligned reads; >+// bug reports are welcome. >+// >+// By the way, for some hash functions, given strings a and b, the hash >+// of a+b is easily derived from the hashes of a and b. This property >+// doesn't hold for any hash functions in this file. >+ >+#ifndef ABSL_HASH_INTERNAL_CITY_H_ >+#define ABSL_HASH_INTERNAL_CITY_H_ >+ >+#include <stdint.h> >+#include <stdlib.h> // for size_t. >+#include <utility> >+ >+ >+namespace absl { >+namespace hash_internal { >+ >+typedef std::pair<uint64_t, uint64_t> uint128; >+ >+inline uint64_t Uint128Low64(const uint128 &x) { return x.first; } >+inline uint64_t Uint128High64(const uint128 &x) { return x.second; } >+ >+// Hash function for a byte array. >+uint64_t CityHash64(const char *s, size_t len); >+ >+// Hash function for a byte array. For convenience, a 64-bit seed is also >+// hashed into the result. >+uint64_t CityHash64WithSeed(const char *s, size_t len, uint64_t seed); >+ >+// Hash function for a byte array. For convenience, two seeds are also >+// hashed into the result. >+uint64_t CityHash64WithSeeds(const char *s, size_t len, uint64_t seed0, >+ uint64_t seed1); >+ >+// Hash function for a byte array. 
>+uint128 CityHash128(const char *s, size_t len); >+ >+// Hash function for a byte array. For convenience, a 128-bit seed is also >+// hashed into the result. >+uint128 CityHash128WithSeed(const char *s, size_t len, uint128 seed); >+ >+// Hash function for a byte array. Most useful in 32-bit binaries. >+uint32_t CityHash32(const char *s, size_t len); >+ >+// Hash 128 input bits down to 64 bits of output. >+// This is intended to be a reasonably good hash function. >+inline uint64_t Hash128to64(const uint128 &x) { >+ // Murmur-inspired hashing. >+ const uint64_t kMul = 0x9ddfea08eb382d69ULL; >+ uint64_t a = (Uint128Low64(x) ^ Uint128High64(x)) * kMul; >+ a ^= (a >> 47); >+ uint64_t b = (Uint128High64(x) ^ a) * kMul; >+ b ^= (b >> 47); >+ b *= kMul; >+ return b; >+} >+ >+} // namespace hash_internal >+} // namespace absl >+ >+#endif // ABSL_HASH_INTERNAL_CITY_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/internal/city_crc.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/internal/city_crc.h >new file mode 100644 >index 0000000000000000000000000000000000000000..6be6557d213f2b40a8c474c6a0c279f0a6bcaf27 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/internal/city_crc.h >@@ -0,0 +1,41 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. 
>+// >+// This file declares the subset of the CityHash functions that require >+// _mm_crc32_u64(). See the CityHash README for details. >+// >+// Functions in the CityHash family are not suitable for cryptography. >+ >+#ifndef ABSL_HASH_INTERNAL_CITY_CRC_H_ >+#define ABSL_HASH_INTERNAL_CITY_CRC_H_ >+ >+#include "absl/hash/internal/city.h" >+ >+namespace absl { >+namespace hash_internal { >+ >+// Hash function for a byte array. >+uint128 CityHashCrc128(const char *s, size_t len); >+ >+// Hash function for a byte array. For convenience, a 128-bit seed is also >+// hashed into the result. >+uint128 CityHashCrc128WithSeed(const char *s, size_t len, uint128 seed); >+ >+// Hash function for a byte array. Sets result[0] ... result[3]. >+void CityHashCrc256(const char *s, size_t len, uint64_t *result); >+ >+} // namespace hash_internal >+} // namespace absl >+ >+#endif // ABSL_HASH_INTERNAL_CITY_CRC_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/internal/city_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/internal/city_test.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..678da53d34d04d6011fe87b06a7e26613bdbef99 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/internal/city_test.cc >@@ -0,0 +1,1812 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. 
>+ >+#include "absl/hash/internal/city.h" >+ >+#include <string.h> >+#include <cstdio> >+#include <iostream> >+#include "gtest/gtest.h" >+#ifdef __SSE4_2__ >+#include "absl/hash/internal/city_crc.h" >+#endif >+ >+namespace absl { >+namespace hash_internal { >+ >+static const uint64_t k0 = 0xc3a5c85c97cb3127ULL; >+static const uint64_t kSeed0 = 1234567; >+static const uint64_t kSeed1 = k0; >+static const uint128 kSeed128(kSeed0, kSeed1); >+static const int kDataSize = 1 << 20; >+static const int kTestSize = 300; >+ >+static char data[kDataSize]; >+ >+// Initialize data to pseudorandom values. >+void setup() { >+ uint64_t a = 9; >+ uint64_t b = 777; >+ for (int i = 0; i < kDataSize; i++) { >+ a += b; >+ b += a; >+ a = (a ^ (a >> 41)) * k0; >+ b = (b ^ (b >> 41)) * k0 + i; >+ uint8_t u = b >> 37; >+ memcpy(data + i, &u, 1); // uint8_t -> char >+ } >+} >+ >+#define C(x) 0x##x##ULL >+static const uint64_t testdata[kTestSize][16] = { >+ {C(9ae16a3b2f90404f), C(75106db890237a4a), C(3feac5f636039766), >+ C(3df09dfc64c09a2b), C(3cb540c392e51e29), C(6b56343feac0663), >+ C(5b7bc50fd8e8ad92), C(3df09dfc64c09a2b), C(3cb540c392e51e29), >+ C(6b56343feac0663), C(5b7bc50fd8e8ad92), C(95162f24e6a5f930), >+ C(6808bdf4f1eb06e0), C(b3b1f3a67b624d82), C(c9a62f12bd4cd80b), >+ C(dc56d17a)}, >+ {C(541150e87f415e96), C(1aef0d24b3148a1a), C(bacc300e1e82345a), >+ C(c3cdc41e1df33513), C(2c138ff2596d42f6), C(f58e9082aed3055f), >+ C(162e192b2957163d), C(c3cdc41e1df33513), C(2c138ff2596d42f6), >+ C(f58e9082aed3055f), C(162e192b2957163d), C(fb99e85e0d16f90c), >+ C(608462c15bdf27e8), C(e7d2c5c943572b62), C(1baaa9327642798c), >+ C(99929334)}, >+ {C(f3786a4b25827c1), C(34ee1a2bf767bd1c), C(2f15ca2ebfb631f2), >+ C(3149ba1dac77270d), C(70e2e076e30703c), C(59bcc9659bc5296), >+ C(9ecbc8132ae2f1d7), C(3149ba1dac77270d), C(70e2e076e30703c), >+ C(59bcc9659bc5296), C(9ecbc8132ae2f1d7), C(a01d30789bad7cf2), >+ C(ae03fe371981a0e0), C(127e3883b8788934), C(d0ac3d4c0a6fca32), >+ C(4252edb7)}, >+ 
{C(ef923a7a1af78eab), C(79163b1e1e9a9b18), C(df3b2aca6e1e4a30), >+ C(2193fb7620cbf23b), C(8b6a8ff06cda8302), C(1a44469afd3e091f), >+ C(8b0449376612506), C(2193fb7620cbf23b), C(8b6a8ff06cda8302), >+ C(1a44469afd3e091f), C(8b0449376612506), C(e9d9d41c32ad91d1), >+ C(b44ab09f58e3c608), C(19e9175f9fcf784), C(839b3c9581b4a480), C(ebc34f3c)}, >+ {C(11df592596f41d88), C(843ec0bce9042f9c), C(cce2ea1e08b1eb30), >+ C(4d09e42f09cc3495), C(666236631b9f253b), C(d28b3763cd02b6a3), >+ C(43b249e57c4d0c1b), C(4d09e42f09cc3495), C(666236631b9f253b), >+ C(d28b3763cd02b6a3), C(43b249e57c4d0c1b), C(3887101c8adea101), >+ C(8a9355d4efc91df0), C(3e610944cc9fecfd), C(5bf9eb60b08ac0ce), >+ C(26f2b463)}, >+ {C(831f448bdc5600b3), C(62a24be3120a6919), C(1b44098a41e010da), >+ C(dc07df53b949c6b), C(d2b11b2081aeb002), C(d212b02c1b13f772), >+ C(c0bed297b4be1912), C(dc07df53b949c6b), C(d2b11b2081aeb002), >+ C(d212b02c1b13f772), C(c0bed297b4be1912), C(682d3d2ad304e4af), >+ C(40e9112a655437a1), C(268b09f7ee09843f), C(6b9698d43859ca47), >+ C(b042c047)}, >+ {C(3eca803e70304894), C(d80de767e4a920a), C(a51cfbb292efd53d), >+ C(d183dcda5f73edfa), C(3a93cbf40f30128c), C(1a92544d0b41dbda), >+ C(aec2c4bee81975e1), C(d183dcda5f73edfa), C(3a93cbf40f30128c), >+ C(1a92544d0b41dbda), C(aec2c4bee81975e1), C(5f91814d1126ba4b), >+ C(f8ac57eee87fcf1f), C(c55c644a5d0023cd), C(adb761e827825ff2), >+ C(e73bb0a8)}, >+ {C(1b5a063fb4c7f9f1), C(318dbc24af66dee9), C(10ef7b32d5c719af), >+ C(b140a02ef5c97712), C(b7d00ef065b51b33), C(635121d532897d98), >+ C(532daf21b312a6d6), C(b140a02ef5c97712), C(b7d00ef065b51b33), >+ C(635121d532897d98), C(532daf21b312a6d6), C(c0b09b75d943910), >+ C(8c84dfb5ef2a8e96), C(e5c06034b0353433), C(3170faf1c33a45dd), >+ C(91dfdd75)}, >+ {C(a0f10149a0e538d6), C(69d008c20f87419f), C(41b36376185b3e9e), >+ C(26b6689960ccf81d), C(55f23b27bb9efd94), C(3a17f6166dd765db), >+ C(c891a8a62931e782), C(26b6689960ccf81d), C(55f23b27bb9efd94), >+ C(3a17f6166dd765db), C(c891a8a62931e782), C(23852dc37ddd2607), >+ 
C(8b7f1b1ec897829e), C(d1d69452a54eed8a), C(56431f2bd766ec24), >+ C(c87f95de)}, >+ {C(fb8d9c70660b910b), C(a45b0cc3476bff1b), C(b28d1996144f0207), >+ C(98ec31113e5e35d2), C(5e4aeb853f1b9aa7), C(bcf5c8fe4465b7c8), >+ C(b1ea3a8243996f15), C(98ec31113e5e35d2), C(5e4aeb853f1b9aa7), >+ C(bcf5c8fe4465b7c8), C(b1ea3a8243996f15), C(cabbccedb6407571), >+ C(d1e40a84c445ec3a), C(33302aa908cf4039), C(9f15f79211b5cdf8), >+ C(3f5538ef)}, >+ {C(236827beae282a46), C(e43970221139c946), C(4f3ac6faa837a3aa), >+ C(71fec0f972248915), C(2170ec2061f24574), C(9eb346b6caa36e82), >+ C(2908f0fdbca48e73), C(71fec0f972248915), C(2170ec2061f24574), >+ C(9eb346b6caa36e82), C(2908f0fdbca48e73), C(8101c99f07c64abb), >+ C(b9f4b02b1b6a96a7), C(583a2b10cd222f88), C(199dae4cf9db24c), C(70eb1a1f)}, >+ {C(c385e435136ecf7c), C(d9d17368ff6c4a08), C(1b31eed4e5251a67), >+ C(df01a322c43a6200), C(298b65a1714b5a7e), C(933b83f0aedf23c), >+ C(157bcb44d63f765a), C(df01a322c43a6200), C(298b65a1714b5a7e), >+ C(933b83f0aedf23c), C(157bcb44d63f765a), C(d6e9fc7a272d8b51), >+ C(3ee5073ef1a9b777), C(63149e31fac02c59), C(2f7979ff636ba1d8), >+ C(cfd63b83)}, >+ {C(e3f6828b6017086d), C(21b4d1900554b3b0), C(bef38be1809e24f1), >+ C(d93251758985ee6c), C(32a9e9f82ba2a932), C(3822aacaa95f3329), >+ C(db349b2f90a490d8), C(d93251758985ee6c), C(32a9e9f82ba2a932), >+ C(3822aacaa95f3329), C(db349b2f90a490d8), C(8d49194a894a19ca), >+ C(79a78b06e42738e6), C(7e0f1eda3d390c66), C(1c291d7e641100a5), >+ C(894a52ef)}, >+ {C(851fff285561dca0), C(4d1277d73cdf416f), C(28ccffa61010ebe2), >+ C(77a4ccacd131d9ee), C(e1d08eeb2f0e29aa), C(70b9e3051383fa45), >+ C(582d0120425caba), C(77a4ccacd131d9ee), C(e1d08eeb2f0e29aa), >+ C(70b9e3051383fa45), C(582d0120425caba), C(a740eef1846e4564), >+ C(572dddb74ac3ae00), C(fdb5ca9579163bbd), C(a649b9b799c615d2), >+ C(9cde6a54)}, >+ {C(61152a63595a96d9), C(d1a3a91ef3a7ba45), C(443b6bb4a493ad0c), >+ C(a154296d11362d06), C(d0f0bf1f1cb02fc1), C(ccb87e09309f90d1), >+ C(b24a8e4881911101), C(a154296d11362d06), 
C(d0f0bf1f1cb02fc1), >+ C(ccb87e09309f90d1), C(b24a8e4881911101), C(1a481b4528559f58), >+ C(bf837a3150896995), C(4989ef6b941a3757), C(2e725ab72d0b2948), >+ C(6c4898d5)}, >+ {C(44473e03be306c88), C(30097761f872472a), C(9fd1b669bfad82d7), >+ C(3bab18b164396783), C(47e385ff9d4c06f), C(18062081bf558df), >+ C(63416eb68f104a36), C(3bab18b164396783), C(47e385ff9d4c06f), >+ C(18062081bf558df), C(63416eb68f104a36), C(4abda1560c47ac80), >+ C(1ea0e63dc6587aee), C(33ec79d92ebc1de), C(94f9dccef771e048), C(13e1978e)}, >+ {C(3ead5f21d344056), C(fb6420393cfb05c3), C(407932394cbbd303), >+ C(ac059617f5906673), C(94d50d3dcd3069a7), C(2b26c3b92dea0f0), >+ C(99b7374cc78fc3fb), C(ac059617f5906673), C(94d50d3dcd3069a7), >+ C(2b26c3b92dea0f0), C(99b7374cc78fc3fb), C(1a8e3c73cdd40ee8), >+ C(cbb5fca06747f45b), C(ceec44238b291841), C(28bf35cce9c90a25), C(51b4ba8)}, >+ {C(6abbfde37ee03b5b), C(83febf188d2cc113), C(cda7b62d94d5b8ee), >+ C(a4375590b8ae7c82), C(168fd42f9ecae4ff), C(23bbde43de2cb214), >+ C(a8c333112a243c8c), C(a4375590b8ae7c82), C(168fd42f9ecae4ff), >+ C(23bbde43de2cb214), C(a8c333112a243c8c), C(10ac012e8c518b49), >+ C(64a44605d8b29458), C(a67e701d2a679075), C(3a3a20f43ec92303), >+ C(b6b06e40)}, >+ {C(943e7ed63b3c080), C(1ef207e9444ef7f8), C(ef4a9f9f8c6f9b4a), >+ C(6b54fc38d6a84108), C(32f4212a47a4665), C(6b5a9a8f64ee1da6), >+ C(9f74e86c6da69421), C(6b54fc38d6a84108), C(32f4212a47a4665), >+ C(6b5a9a8f64ee1da6), C(9f74e86c6da69421), C(946dd0cb30c1a08e), >+ C(fdf376956907eaaa), C(a59074c6eec03028), C(b1a3abcf283f34ac), C(240a2f2)}, >+ {C(d72ce05171ef8a1a), C(c6bd6bd869203894), C(c760e6396455d23a), >+ C(f86af0b40dcce7b), C(8d3c15d613394d3c), C(491e400491cd4ece), >+ C(7c19d3530ea3547f), C(f86af0b40dcce7b), C(8d3c15d613394d3c), >+ C(491e400491cd4ece), C(7c19d3530ea3547f), C(1362963a1dc32af9), >+ C(fb9bc11762e1385c), C(9e164ef1f5376083), C(6c15819b5e828a7e), >+ C(5dcefc30)}, >+ {C(4182832b52d63735), C(337097e123eea414), C(b5a72ca0456df910), >+ C(7ebc034235bc122f), C(d9a7783d4edd8049), 
C(5f8b04a15ae42361), >+ C(fc193363336453dd), C(7ebc034235bc122f), C(d9a7783d4edd8049), >+ C(5f8b04a15ae42361), C(fc193363336453dd), C(9b6c50224ef8c4f8), >+ C(ba225c7942d16c3f), C(6f6d55226a73c412), C(abca061fe072152a), >+ C(7a48b105)}, >+ {C(d6cdae892584a2cb), C(58de0fa4eca17dcd), C(43df30b8f5f1cb00), >+ C(9e4ea5a4941e097d), C(547e048d5a9daaba), C(eb6ecbb0b831d185), >+ C(e0168df5fad0c670), C(9e4ea5a4941e097d), C(547e048d5a9daaba), >+ C(eb6ecbb0b831d185), C(e0168df5fad0c670), C(afa9705f98c2c96a), >+ C(749436f48137a96b), C(759c041fc21df486), C(b23bf400107aa2ec), >+ C(fd55007b)}, >+ {C(5c8e90bc267c5ee4), C(e9ae044075d992d9), C(f234cbfd1f0a1e59), >+ C(ce2744521944f14c), C(104f8032f99dc152), C(4e7f425bfac67ca7), >+ C(9461b911a1c6d589), C(ce2744521944f14c), C(104f8032f99dc152), >+ C(4e7f425bfac67ca7), C(9461b911a1c6d589), C(5e5ecc726db8b60d), >+ C(cce68b0586083b51), C(8a7f8e54a9cba0fc), C(42f010181d16f049), >+ C(6b95894c)}, >+ {C(bbd7f30ac310a6f3), C(b23b570d2666685f), C(fb13fb08c9814fe7), >+ C(4ee107042e512374), C(1e2c8c0d16097e13), C(210c7500995aa0e6), >+ C(6c13190557106457), C(4ee107042e512374), C(1e2c8c0d16097e13), >+ C(210c7500995aa0e6), C(6c13190557106457), C(a99b31c96777f381), >+ C(8312ae8301d386c0), C(ed5042b2a4fa96a3), C(d71d1bb23907fe97), >+ C(3360e827)}, >+ {C(36a097aa49519d97), C(8204380a73c4065), C(77c2004bdd9e276a), >+ C(6ee1f817ce0b7aee), C(e9dcb3507f0596ca), C(6bc63c666b5100e2), >+ C(e0b056f1821752af), C(6ee1f817ce0b7aee), C(e9dcb3507f0596ca), >+ C(6bc63c666b5100e2), C(e0b056f1821752af), C(8ea1114e60292678), >+ C(904b80b46becc77), C(46cd9bb6e9dff52f), C(4c91e3b698355540), C(45177e0b)}, >+ {C(dc78cb032c49217), C(112464083f83e03a), C(96ae53e28170c0f5), >+ C(d367ff54952a958), C(cdad930657371147), C(aa24dc2a9573d5fe), >+ C(eb136daa89da5110), C(d367ff54952a958), C(cdad930657371147), >+ C(aa24dc2a9573d5fe), C(eb136daa89da5110), C(de623005f6d46057), >+ C(b50c0c92b95e9b7f), C(a8aa54050b81c978), C(573fb5c7895af9b5), >+ C(7c6fffe4)}, >+ {C(441593e0da922dfe), 
C(936ef46061469b32), C(204a1921197ddd87), >+ C(50d8a70e7a8d8f56), C(256d150ae75dab76), C(e81f4c4a1989036a), >+ C(d0f8db365f9d7e00), C(50d8a70e7a8d8f56), C(256d150ae75dab76), >+ C(e81f4c4a1989036a), C(d0f8db365f9d7e00), C(753d686677b14522), >+ C(9f76e0cb6f2d0a66), C(ab14f95988ec0d39), C(97621d9da9c9812f), >+ C(bbc78da4)}, >+ {C(2ba3883d71cc2133), C(72f2bbb32bed1a3c), C(27e1bd96d4843251), >+ C(a90f761e8db1543a), C(c339e23c09703cd8), C(f0c6624c4b098fd3), >+ C(1bae2053e41fa4d9), C(a90f761e8db1543a), C(c339e23c09703cd8), >+ C(f0c6624c4b098fd3), C(1bae2053e41fa4d9), C(3589e273c22ba059), >+ C(63798246e5911a0b), C(18e710ec268fc5dc), C(714a122de1d074f3), >+ C(c5c25d39)}, >+ {C(f2b6d2adf8423600), C(7514e2f016a48722), C(43045743a50396ba), >+ C(23dacb811652ad4f), C(c982da480e0d4c7d), C(3a9c8ed5a399d0a9), >+ C(951b8d084691d4e4), C(23dacb811652ad4f), C(c982da480e0d4c7d), >+ C(3a9c8ed5a399d0a9), C(951b8d084691d4e4), C(d9f87b4988cff2f7), >+ C(217a191d986aa3bc), C(6ad23c56b480350), C(dd78673938ceb2e7), C(b6e5d06e)}, >+ {C(38fffe7f3680d63c), C(d513325255a7a6d1), C(31ed47790f6ca62f), >+ C(c801faaa0a2e331f), C(491dbc58279c7f88), C(9c0178848321c97a), >+ C(9d934f814f4d6a3c), C(c801faaa0a2e331f), C(491dbc58279c7f88), >+ C(9c0178848321c97a), C(9d934f814f4d6a3c), C(606a3e4fc8763192), >+ C(bc15cb36a677ee84), C(52d5904157e1fe71), C(1588dd8b1145b79b), >+ C(6178504e)}, >+ {C(b7477bf0b9ce37c6), C(63b1c580a7fd02a4), C(f6433b9f10a5dac), >+ C(68dd76db9d64eca7), C(36297682b64b67), C(42b192d71f414b7a), >+ C(79692cef44fa0206), C(68dd76db9d64eca7), C(36297682b64b67), >+ C(42b192d71f414b7a), C(79692cef44fa0206), C(f0979252f4776d07), >+ C(4b87cd4f1c9bbf52), C(51b84bbc6312c710), C(150720fbf85428a7), >+ C(bd4c3637)}, >+ {C(55bdb0e71e3edebd), C(c7ab562bcf0568bc), C(43166332f9ee684f), >+ C(b2e25964cd409117), C(a010599d6287c412), C(fa5d6461e768dda2), >+ C(cb3ce74e8ec4f906), C(b2e25964cd409117), C(a010599d6287c412), >+ C(fa5d6461e768dda2), C(cb3ce74e8ec4f906), C(6120abfd541a2610), >+ C(aa88b148cc95794d), 
C(2686ca35df6590e3), C(c6b02d18616ce94d), >+ C(6e7ac474)}, >+ {C(782fa1b08b475e7), C(fb7138951c61b23b), C(9829105e234fb11e), >+ C(9a8c431f500ef06e), C(d848581a580b6c12), C(fecfe11e13a2bdb4), >+ C(6c4fa0273d7db08c), C(9a8c431f500ef06e), C(d848581a580b6c12), >+ C(fecfe11e13a2bdb4), C(6c4fa0273d7db08c), C(482f43bf5ae59fcb), >+ C(f651fbca105d79e6), C(f09f78695d865817), C(7a99d0092085cf47), >+ C(1fb4b518)}, >+ {C(c5dc19b876d37a80), C(15ffcff666cfd710), C(e8c30c72003103e2), >+ C(7870765b470b2c5d), C(78a9103ff960d82), C(7bb50ffc9fac74b3), >+ C(477e70ab2b347db2), C(7870765b470b2c5d), C(78a9103ff960d82), >+ C(7bb50ffc9fac74b3), C(477e70ab2b347db2), C(a625238bdf7c07cf), >+ C(1128d515174809f5), C(b0f1647e82f45873), C(17792d1c4f222c39), >+ C(31d13d6d)}, >+ {C(5e1141711d2d6706), C(b537f6dee8de6933), C(3af0a1fbbe027c54), >+ C(ea349dbc16c2e441), C(38a7455b6a877547), C(5f97b9750e365411), >+ C(e8cde7f93af49a3), C(ea349dbc16c2e441), C(38a7455b6a877547), >+ C(5f97b9750e365411), C(e8cde7f93af49a3), C(ba101925ec1f7e26), >+ C(d5e84cab8192c71e), C(e256427726fdd633), C(a4f38e2c6116890d), >+ C(26fa72e3)}, >+ {C(782edf6da001234f), C(f48cbd5c66c48f3), C(808754d1e64e2a32), >+ C(5d9dde77353b1a6d), C(11f58c54581fa8b1), C(da90fa7c28c37478), >+ C(5e9a2eafc670a88a), C(5d9dde77353b1a6d), C(11f58c54581fa8b1), >+ C(da90fa7c28c37478), C(5e9a2eafc670a88a), C(e35e1bc172e011ef), >+ C(bf9255a4450ae7fe), C(55f85194e26bc55f), C(4f327873e14d0e54), >+ C(6a7433bf)}, >+ {C(d26285842ff04d44), C(8f38d71341eacca9), C(5ca436f4db7a883c), >+ C(bf41e5376b9f0eec), C(2252d21eb7e1c0e9), C(f4b70a971855e732), >+ C(40c7695aa3662afd), C(bf41e5376b9f0eec), C(2252d21eb7e1c0e9), >+ C(f4b70a971855e732), C(40c7695aa3662afd), C(770fe19e16ab73bb), >+ C(d603ebda6393d749), C(e58c62439aa50dbd), C(96d51e5a02d2d7cf), >+ C(4e6df758)}, >+ {C(c6ab830865a6bae6), C(6aa8e8dd4b98815c), C(efe3846713c371e5), >+ C(a1924cbf0b5f9222), C(7f4872369c2b4258), C(cd6da30530f3ea89), >+ C(b7f8b9a704e6cea1), C(a1924cbf0b5f9222), C(7f4872369c2b4258), >+ 
C(cd6da30530f3ea89), C(b7f8b9a704e6cea1), C(fa06ff40433fd535), >+ C(fb1c36fe8f0737f1), C(bb7050561171f80), C(b1bc23235935d897), C(d57f63ea)}, >+ {C(44b3a1929232892), C(61dca0e914fc217), C(a607cc142096b964), >+ C(f7dbc8433c89b274), C(2f5f70581c9b7d32), C(39bf5e5fec82dcca), >+ C(8ade56388901a619), C(f7dbc8433c89b274), C(2f5f70581c9b7d32), >+ C(39bf5e5fec82dcca), C(8ade56388901a619), C(c1c6a725caab3ea9), >+ C(c1c7906c2f80b898), C(9c3871a04cc884e6), C(df01813cbbdf217f), >+ C(52ef73b3)}, >+ {C(4b603d7932a8de4f), C(fae64c464b8a8f45), C(8fafab75661d602a), >+ C(8ffe870ef4adc087), C(65bea2be41f55b54), C(82f3503f636aef1), >+ C(5f78a282378b6bb0), C(8ffe870ef4adc087), C(65bea2be41f55b54), >+ C(82f3503f636aef1), C(5f78a282378b6bb0), C(7bf2422c0beceddb), >+ C(9d238d4780114bd), C(7ad198311906597f), C(ec8f892c0422aca3), C(3cb36c3)}, >+ {C(4ec0b54cf1566aff), C(30d2c7269b206bf4), C(77c22e82295e1061), >+ C(3df9b04434771542), C(feddce785ccb661f), C(a644aff716928297), >+ C(dd46aee73824b4ed), C(3df9b04434771542), C(feddce785ccb661f), >+ C(a644aff716928297), C(dd46aee73824b4ed), C(bf8d71879da29b02), >+ C(fc82dccbfc8022a0), C(31bfcd0d9f48d1d3), C(c64ee24d0e7b5f8b), >+ C(72c39bea)}, >+ {C(ed8b7a4b34954ff7), C(56432de31f4ee757), C(85bd3abaa572b155), >+ C(7d2c38a926dc1b88), C(5245b9eb4cd6791d), C(fb53ab03b9ad0855), >+ C(3664026c8fc669d7), C(7d2c38a926dc1b88), C(5245b9eb4cd6791d), >+ C(fb53ab03b9ad0855), C(3664026c8fc669d7), C(45024d5080bc196), >+ C(b236ebec2cc2740), C(27231ad0e3443be4), C(145780b63f809250), C(a65aa25c)}, >+ {C(5d28b43694176c26), C(714cc8bc12d060ae), C(3437726273a83fe6), >+ C(864b1b28ec16ea86), C(6a78a5a4039ec2b9), C(8e959533e35a766), >+ C(347b7c22b75ae65f), C(864b1b28ec16ea86), C(6a78a5a4039ec2b9), >+ C(8e959533e35a766), C(347b7c22b75ae65f), C(5005892bb61e647c), >+ C(fe646519b4a1894d), C(cd801026f74a8a53), C(8713463e9a1ab9ce), >+ C(74740539)}, >+ {C(6a1ef3639e1d202e), C(919bc1bd145ad928), C(30f3f7e48c28a773), >+ C(2e8c49d7c7aaa527), C(5e2328fc8701db7c), C(89ef1afca81f7de8), 
>+ C(b1857db11985d296), C(2e8c49d7c7aaa527), C(5e2328fc8701db7c), >+ C(89ef1afca81f7de8), C(b1857db11985d296), C(17763d695f616115), >+ C(b8f7bf1fcdc8322c), C(cf0c61938ab07a27), C(1122d3e6edb4e866), >+ C(c3ae3c26)}, >+ {C(159f4d9e0307b111), C(3e17914a5675a0c), C(af849bd425047b51), >+ C(3b69edadf357432b), C(3a2e311c121e6bf2), C(380fad1e288d57e5), >+ C(bf7c7e8ef0e3b83a), C(3b69edadf357432b), C(3a2e311c121e6bf2), >+ C(380fad1e288d57e5), C(bf7c7e8ef0e3b83a), C(92966d5f4356ae9b), >+ C(2a03fc66c4d6c036), C(2516d8bddb0d5259), C(b3ffe9737ff5090), C(f29db8a2)}, >+ {C(cc0a840725a7e25b), C(57c69454396e193a), C(976eaf7eee0b4540), >+ C(cd7a46850b95e901), C(c57f7d060dda246f), C(6b9406ead64079bf), >+ C(11b28e20a573b7bd), C(cd7a46850b95e901), C(c57f7d060dda246f), >+ C(6b9406ead64079bf), C(11b28e20a573b7bd), C(2d6db356e9369ace), >+ C(dc0afe10fba193), C(5cdb10885dbbfce), C(5c700e205782e35a), C(1ef4cbf4)}, >+ {C(a2b27ee22f63c3f1), C(9ebde0ce1b3976b2), C(2fe6a92a257af308), >+ C(8c1df927a930af59), C(a462f4423c9e384e), C(236542255b2ad8d9), >+ C(595d201a2c19d5bc), C(8c1df927a930af59), C(a462f4423c9e384e), >+ C(236542255b2ad8d9), C(595d201a2c19d5bc), C(22c87d4604a67f3), >+ C(585a06eb4bc44c4f), C(b4175a7ac7eabcd8), C(a457d3eeba14ab8c), >+ C(a9be6c41)}, >+ {C(d8f2f234899bcab3), C(b10b037297c3a168), C(debea2c510ceda7f), >+ C(9498fefb890287ce), C(ae68c2be5b1a69a6), C(6189dfba34ed656c), >+ C(91658f95836e5206), C(9498fefb890287ce), C(ae68c2be5b1a69a6), >+ C(6189dfba34ed656c), C(91658f95836e5206), C(c0bb4fff32aecd4d), >+ C(94125f505a50eef9), C(6ac406e7cfbce5bb), C(344a4b1dcdb7f5d8), C(fa31801)}, >+ {C(584f28543864844f), C(d7cee9fc2d46f20d), C(a38dca5657387205), >+ C(7a0b6dbab9a14e69), C(c6d0a9d6b0e31ac4), C(a674d85812c7cf6), >+ C(63538c0351049940), C(7a0b6dbab9a14e69), C(c6d0a9d6b0e31ac4), >+ C(a674d85812c7cf6), C(63538c0351049940), C(9710e5f0bc93d1d), >+ C(c2bea5bd7c54ddd4), C(48739af2bed0d32d), C(ba2c4e09e21fba85), >+ C(8331c5d8)}, >+ {C(a94be46dd9aa41af), C(a57e5b7723d3f9bd), 
C(34bf845a52fd2f), >+ C(843b58463c8df0ae), C(74b258324e916045), C(bdd7353230eb2b38), >+ C(fad31fced7abade5), C(843b58463c8df0ae), C(74b258324e916045), >+ C(bdd7353230eb2b38), C(fad31fced7abade5), C(2436aeafb0046f85), >+ C(65bc9af9e5e33161), C(92733b1b3ae90628), C(f48143eaf78a7a89), >+ C(e9876db8)}, >+ {C(9a87bea227491d20), C(a468657e2b9c43e7), C(af9ba60db8d89ef7), >+ C(cc76f429ea7a12bb), C(5f30eaf2bb14870a), C(434e824cb3e0cd11), >+ C(431a4d382e39d16e), C(cc76f429ea7a12bb), C(5f30eaf2bb14870a), >+ C(434e824cb3e0cd11), C(431a4d382e39d16e), C(9e51f913c4773a8), >+ C(32ab1925823d0add), C(99c61b54c1d8f69d), C(38cfb80f02b43b1f), >+ C(27b0604e)}, >+ {C(27688c24958d1a5c), C(e3b4a1c9429cf253), C(48a95811f70d64bc), >+ C(328063229db22884), C(67e9c95f8ba96028), C(7c6bf01c60436075), >+ C(fa55161e7d9030b2), C(328063229db22884), C(67e9c95f8ba96028), >+ C(7c6bf01c60436075), C(fa55161e7d9030b2), C(dadbc2f0dab91681), >+ C(da39d7a4934ca11), C(162e845d24c1b45c), C(eb5b9dcd8c6ed31b), C(dcec07f2)}, >+ {C(5d1d37790a1873ad), C(ed9cd4bcc5fa1090), C(ce51cde05d8cd96a), >+ C(f72c26e624407e66), C(a0eb541bdbc6d409), C(c3f40a2f40b3b213), >+ C(6a784de68794492d), C(f72c26e624407e66), C(a0eb541bdbc6d409), >+ C(c3f40a2f40b3b213), C(6a784de68794492d), C(10a38a23dbef7937), >+ C(6a5560f853252278), C(c3387bbf3c7b82ba), C(fbee7c12eb072805), >+ C(cff0a82a)}, >+ {C(1f03fd18b711eea9), C(566d89b1946d381a), C(6e96e83fc92563ab), >+ C(405f66cf8cae1a32), C(d7261740d8f18ce6), C(fea3af64a413d0b2), >+ C(d64d1810e83520fe), C(405f66cf8cae1a32), C(d7261740d8f18ce6), >+ C(fea3af64a413d0b2), C(d64d1810e83520fe), C(e1334a00a580c6e8), >+ C(454049e1b52c15f), C(8895d823d9778247), C(efa7f2e88b826618), C(fec83621)}, >+ {C(f0316f286cf527b6), C(f84c29538de1aa5a), C(7612ed3c923d4a71), >+ C(d4eccebe9393ee8a), C(2eb7867c2318cc59), C(1ce621fd700fe396), >+ C(686450d7a346878a), C(d4eccebe9393ee8a), C(2eb7867c2318cc59), >+ C(1ce621fd700fe396), C(686450d7a346878a), C(75a5f37579f8b4cb), >+ C(500cc16eb6541dc7), C(b7b02317b539d9a6), 
C(3519ddff5bc20a29), C(743d8dc)}, >+ {C(297008bcb3e3401d), C(61a8e407f82b0c69), C(a4a35bff0524fa0e), >+ C(7a61d8f552a53442), C(821d1d8d8cfacf35), C(7cc06361b86d0559), >+ C(119b617a8c2be199), C(7a61d8f552a53442), C(821d1d8d8cfacf35), >+ C(7cc06361b86d0559), C(119b617a8c2be199), C(2996487da6721759), >+ C(61a901376070b91d), C(d88dee12ae9c9b3c), C(5665491be1fa53a7), >+ C(64d41d26)}, >+ {C(43c6252411ee3be), C(b4ca1b8077777168), C(2746dc3f7da1737f), >+ C(2247a4b2058d1c50), C(1b3fa184b1d7bcc0), C(deb85613995c06ed), >+ C(cbe1d957485a3ccd), C(2247a4b2058d1c50), C(1b3fa184b1d7bcc0), >+ C(deb85613995c06ed), C(cbe1d957485a3ccd), C(dfe241f8f33c96b6), >+ C(6597eb05019c2109), C(da344b2a63a219cf), C(79b8e3887612378a), >+ C(acd90c81)}, >+ {C(ce38a9a54fad6599), C(6d6f4a90b9e8755e), C(c3ecc79ff105de3f), >+ C(e8b9ee96efa2d0e), C(90122905c4ab5358), C(84f80c832d71979c), >+ C(229310f3ffbbf4c6), C(e8b9ee96efa2d0e), C(90122905c4ab5358), >+ C(84f80c832d71979c), C(229310f3ffbbf4c6), C(cc9eb42100cd63a7), >+ C(7a283f2f3da7b9f), C(359b061d314e7a72), C(d0d959720028862), C(7c746a4b)}, >+ {C(270a9305fef70cf), C(600193999d884f3a), C(f4d49eae09ed8a1), >+ C(2e091b85660f1298), C(bfe37fae1cdd64c9), C(8dddfbab930f6494), >+ C(2ccf4b08f5d417a), C(2e091b85660f1298), C(bfe37fae1cdd64c9), >+ C(8dddfbab930f6494), C(2ccf4b08f5d417a), C(365c2ee85582fe6), >+ C(dee027bcd36db62a), C(b150994d3c7e5838), C(fdfd1a0e692e436d), >+ C(b1047e99)}, >+ {C(e71be7c28e84d119), C(eb6ace59932736e6), C(70c4397807ba12c5), >+ C(7a9d77781ac53509), C(4489c3ccfda3b39c), C(fa722d4f243b4964), >+ C(25f15800bffdd122), C(7a9d77781ac53509), C(4489c3ccfda3b39c), >+ C(fa722d4f243b4964), C(25f15800bffdd122), C(ed85e4157fbd3297), >+ C(aab1967227d59efd), C(2199631212eb3839), C(3e4c19359aae1cc2), >+ C(d1fd1068)}, >+ {C(b5b58c24b53aaa19), C(d2a6ab0773dd897f), C(ef762fe01ecb5b97), >+ C(9deefbcfa4cab1f1), C(b58f5943cd2492ba), C(a96dcc4d1f4782a7), >+ C(102b62a82309dde5), C(9deefbcfa4cab1f1), C(b58f5943cd2492ba), >+ C(a96dcc4d1f4782a7), 
C(102b62a82309dde5), C(35fe52684763b338), >+ C(afe2616651eaad1f), C(43e38715bdfa05e7), C(83c9ba83b5ec4a40), >+ C(56486077)}, >+ {C(44dd59bd301995cf), C(3ccabd76493ada1a), C(540db4c87d55ef23), >+ C(cfc6d7adda35797), C(14c7d1f32332cf03), C(2d553ffbff3be99d), >+ C(c91c4ee0cb563182), C(cfc6d7adda35797), C(14c7d1f32332cf03), >+ C(2d553ffbff3be99d), C(c91c4ee0cb563182), C(9aa5e507f49136f0), >+ C(760c5dd1a82c4888), C(beea7e974a1cfb5c), C(640b247774fe4bf7), >+ C(6069be80)}, >+ {C(b4d4789eb6f2630b), C(bf6973263ce8ef0e), C(d1c75c50844b9d3), >+ C(bce905900c1ec6ea), C(c30f304f4045487d), C(a5c550166b3a142b), >+ C(2f482b4e35327287), C(bce905900c1ec6ea), C(c30f304f4045487d), >+ C(a5c550166b3a142b), C(2f482b4e35327287), C(15b21ddddf355438), >+ C(496471fa3006bab), C(2a8fd458d06c1a32), C(db91e8ae812f0b8d), C(2078359b)}, >+ {C(12807833c463737c), C(58e927ea3b3776b4), C(72dd20ef1c2f8ad0), >+ C(910b610de7a967bf), C(801bc862120f6bf5), C(9653efeed5897681), >+ C(f5367ff83e9ebbb3), C(910b610de7a967bf), C(801bc862120f6bf5), >+ C(9653efeed5897681), C(f5367ff83e9ebbb3), C(cf56d489afd1b0bf), >+ C(c7c793715cae3de8), C(631f91d64abae47c), C(5f1f42fb14a444a2), >+ C(9ea21004)}, >+ {C(e88419922b87176f), C(bcf32f41a7ddbf6f), C(d6ebefd8085c1a0f), >+ C(d1d44fe99451ef72), C(ec951ba8e51e3545), C(c0ca86b360746e96), >+ C(aa679cc066a8040b), C(d1d44fe99451ef72), C(ec951ba8e51e3545), >+ C(c0ca86b360746e96), C(aa679cc066a8040b), C(51065861ece6ffc1), >+ C(76777368a2997e11), C(87f278f46731100c), C(bbaa4140bdba4527), >+ C(9c9cfe88)}, >+ {C(105191e0ec8f7f60), C(5918dbfcca971e79), C(6b285c8a944767b9), >+ C(d3e86ac4f5eccfa4), C(e5399df2b106ca1), C(814aadfacd217f1d), >+ C(2754e3def1c405a9), C(d3e86ac4f5eccfa4), C(e5399df2b106ca1), >+ C(814aadfacd217f1d), C(2754e3def1c405a9), C(99290323b9f06e74), >+ C(a9782e043f271461), C(13c8b3b8c275a860), C(6038d620e581e9e7), >+ C(b70a6ddd)}, >+ {C(a5b88bf7399a9f07), C(fca3ddfd96461cc4), C(ebe738fdc0282fc6), >+ C(69afbc800606d0fb), C(6104b97a9db12df7), C(fcc09198bb90bf9f), >+ 
C(c5e077e41a65ba91), C(69afbc800606d0fb), C(6104b97a9db12df7), >+ C(fcc09198bb90bf9f), C(c5e077e41a65ba91), C(db261835ee8aa08e), >+ C(db0ee662e5796dc9), C(fc1880ecec499e5f), C(648866fbe1502034), >+ C(dea37298)}, >+ {C(d08c3f5747d84f50), C(4e708b27d1b6f8ac), C(70f70fd734888606), >+ C(909ae019d761d019), C(368bf4aab1b86ef9), C(308bd616d5460239), >+ C(4fd33269f76783ea), C(909ae019d761d019), C(368bf4aab1b86ef9), >+ C(308bd616d5460239), C(4fd33269f76783ea), C(7d53b37c19713eab), >+ C(6bba6eabda58a897), C(91abb50efc116047), C(4e902f347e0e0e35), >+ C(8f480819)}, >+ {C(2f72d12a40044b4b), C(889689352fec53de), C(f03e6ad87eb2f36), >+ C(ef79f28d874b9e2d), C(b512089e8e63b76c), C(24dc06833bf193a9), >+ C(3c23308ba8e99d7e), C(ef79f28d874b9e2d), C(b512089e8e63b76c), >+ C(24dc06833bf193a9), C(3c23308ba8e99d7e), C(5ceff7b85cacefb7), >+ C(ef390338898cd73), C(b12967d7d2254f54), C(de874cbd8aef7b75), C(30b3b16)}, >+ {C(aa1f61fdc5c2e11e), C(c2c56cd11277ab27), C(a1e73069fdf1f94f), >+ C(8184bab36bb79df0), C(c81929ce8655b940), C(301b11bf8a4d8ce8), >+ C(73126fd45ab75de9), C(8184bab36bb79df0), C(c81929ce8655b940), >+ C(301b11bf8a4d8ce8), C(73126fd45ab75de9), C(4bd6f76e4888229a), >+ C(9aae355b54a756d5), C(ca3de9726f6e99d5), C(83f80cac5bc36852), >+ C(f31bc4e8)}, >+ {C(9489b36fe2246244), C(3355367033be74b8), C(5f57c2277cbce516), >+ C(bc61414f9802ecaf), C(8edd1e7a50562924), C(48f4ab74a35e95f2), >+ C(cc1afcfd99a180e7), C(bc61414f9802ecaf), C(8edd1e7a50562924), >+ C(48f4ab74a35e95f2), C(cc1afcfd99a180e7), C(517dd5e3acf66110), >+ C(7dd3ad9e8978b30d), C(1f6d5dfc70de812b), C(947daaba6441aaf3), >+ C(419f953b)}, >+ {C(358d7c0476a044cd), C(e0b7b47bcbd8854f), C(ffb42ec696705519), >+ C(d45e44c263e95c38), C(df61db53923ae3b1), C(f2bc948cc4fc027c), >+ C(8a8000c6066772a3), C(d45e44c263e95c38), C(df61db53923ae3b1), >+ C(f2bc948cc4fc027c), C(8a8000c6066772a3), C(9fd93c942d31fa17), >+ C(d7651ecebe09cbd3), C(68682cefb6a6f165), C(541eb99a2dcee40e), >+ C(20e9e76d)}, >+ {C(b0c48df14275265a), C(9da4448975905efa), 
C(d716618e414ceb6d), >+ C(30e888af70df1e56), C(4bee54bd47274f69), C(178b4059e1a0afe5), >+ C(6e2c96b7f58e5178), C(30e888af70df1e56), C(4bee54bd47274f69), >+ C(178b4059e1a0afe5), C(6e2c96b7f58e5178), C(bb429d3b9275e9bc), >+ C(c198013f09cafdc6), C(ec0a6ee4fb5de348), C(744e1e8ed2eb1eb0), >+ C(646f0ff8)}, >+ {C(daa70bb300956588), C(410ea6883a240c6d), C(f5c8239fb5673eb3), >+ C(8b1d7bb4903c105f), C(cfb1c322b73891d4), C(5f3b792b22f07297), >+ C(fd64061f8be86811), C(8b1d7bb4903c105f), C(cfb1c322b73891d4), >+ C(5f3b792b22f07297), C(fd64061f8be86811), C(1d2db712921cfc2b), >+ C(cd1b2b2f2cee18ae), C(6b6f8790dc7feb09), C(46c179efa3f0f518), >+ C(eeb7eca8)}, >+ {C(4ec97a20b6c4c7c2), C(5913b1cd454f29fd), C(a9629f9daf06d685), >+ C(852c9499156a8f3), C(3a180a6abfb79016), C(9fc3c4764037c3c9), >+ C(2890c42fc0d972cf), C(852c9499156a8f3), C(3a180a6abfb79016), >+ C(9fc3c4764037c3c9), C(2890c42fc0d972cf), C(1f92231d4e537651), >+ C(fab8bb07aa54b7b9), C(e05d2d771c485ed4), C(d50b34bf808ca731), C(8112bb9)}, >+ {C(5c3323628435a2e8), C(1bea45ce9e72a6e3), C(904f0a7027ddb52e), >+ C(939f31de14dcdc7b), C(a68fdf4379df068), C(f169e1f0b835279d), >+ C(7498e432f9619b27), C(939f31de14dcdc7b), C(a68fdf4379df068), >+ C(f169e1f0b835279d), C(7498e432f9619b27), C(1aa2a1f11088e785), >+ C(d6ad72f45729de78), C(9a63814157c80267), C(55538e35c648e435), >+ C(85a6d477)}, >+ {C(c1ef26bea260abdb), C(6ee423f2137f9280), C(df2118b946ed0b43), >+ C(11b87fb1b900cc39), C(e33e59b90dd815b1), C(aa6cb5c4bafae741), >+ C(739699951ca8c713), C(11b87fb1b900cc39), C(e33e59b90dd815b1), >+ C(aa6cb5c4bafae741), C(739699951ca8c713), C(2b4389a967310077), >+ C(1d5382568a31c2c9), C(55d1e787fbe68991), C(277c254bc31301e7), >+ C(56f76c84)}, >+ {C(6be7381b115d653a), C(ed046190758ea511), C(de6a45ffc3ed1159), >+ C(a64760e4041447d0), C(e3eac49f3e0c5109), C(dd86c4d4cb6258e2), >+ C(efa9857afd046c7f), C(a64760e4041447d0), C(e3eac49f3e0c5109), >+ C(dd86c4d4cb6258e2), C(efa9857afd046c7f), C(fab793dae8246f16), >+ C(c9e3b121b31d094c), C(a2a0f55858465226), 
C(dba6f0ff39436344), >+ C(9af45d55)}, >+ {C(ae3eece1711b2105), C(14fd3f4027f81a4a), C(abb7e45177d151db), >+ C(501f3e9b18861e44), C(465201170074e7d8), C(96d5c91970f2cb12), >+ C(40fd28c43506c95d), C(501f3e9b18861e44), C(465201170074e7d8), >+ C(96d5c91970f2cb12), C(40fd28c43506c95d), C(e86c4b07802aaff3), >+ C(f317d14112372a70), C(641b13e587711650), C(4915421ab1090eaa), >+ C(d1c33760)}, >+ {C(376c28588b8fb389), C(6b045e84d8491ed2), C(4e857effb7d4e7dc), >+ C(154dd79fd2f984b4), C(f11171775622c1c3), C(1fbe30982e78e6f0), >+ C(a460a15dcf327e44), C(154dd79fd2f984b4), C(f11171775622c1c3), >+ C(1fbe30982e78e6f0), C(a460a15dcf327e44), C(f359e0900cc3d582), >+ C(7e11070447976d00), C(324e6daf276ea4b5), C(7aa6e2df0cc94fa2), >+ C(c56bbf69)}, >+ {C(58d943503bb6748f), C(419c6c8e88ac70f6), C(586760cbf3d3d368), >+ C(b7e164979d5ccfc1), C(12cb4230d26bf286), C(f1bf910d44bd84cb), >+ C(b32c24c6a40272), C(b7e164979d5ccfc1), C(12cb4230d26bf286), >+ C(f1bf910d44bd84cb), C(b32c24c6a40272), C(11ed12e34c48c039), >+ C(b0c2538e51d0a6ac), C(4269bb773e1d553a), C(e35a9dbabd34867), C(abecfb9b)}, >+ {C(dfff5989f5cfd9a1), C(bcee2e7ea3a96f83), C(681c7874adb29017), >+ C(3ff6c8ac7c36b63a), C(48bc8831d849e326), C(30b078e76b0214e2), >+ C(42954e6ad721b920), C(3ff6c8ac7c36b63a), C(48bc8831d849e326), >+ C(30b078e76b0214e2), C(42954e6ad721b920), C(f9aeb33d164b4472), >+ C(7b353b110831dbdc), C(16f64c82f44ae17b), C(b71244cc164b3b2b), >+ C(8de13255)}, >+ {C(7fb19eb1a496e8f5), C(d49e5dfdb5c0833f), C(c0d5d7b2f7c48dc7), >+ C(1a57313a32f22dde), C(30af46e49850bf8b), C(aa0fe8d12f808f83), >+ C(443e31d70873bb6b), C(1a57313a32f22dde), C(30af46e49850bf8b), >+ C(aa0fe8d12f808f83), C(443e31d70873bb6b), C(bbeb67c49c9fdc13), >+ C(18f1e2a88f59f9d5), C(fb1b05038e5def11), C(d0450b5ce4c39c52), >+ C(a98ee299)}, >+ {C(5dba5b0dadccdbaa), C(4ba8da8ded87fcdc), C(f693fdd25badf2f0), >+ C(e9029e6364286587), C(ae69f49ecb46726c), C(18e002679217c405), >+ C(bd6d66e85332ae9f), C(e9029e6364286587), C(ae69f49ecb46726c), >+ C(18e002679217c405), 
C(bd6d66e85332ae9f), C(6bf330b1c353dd2a), >+ C(74e9f2e71e3a4152), C(3f85560b50f6c413), C(d33a52a47eaed2b4), >+ C(3015f556)}, >+ {C(688bef4b135a6829), C(8d31d82abcd54e8e), C(f95f8a30d55036d7), >+ C(3d8c90e27aa2e147), C(2ec937ce0aa236b4), C(89b563996d3a0b78), >+ C(39b02413b23c3f08), C(3d8c90e27aa2e147), C(2ec937ce0aa236b4), >+ C(89b563996d3a0b78), C(39b02413b23c3f08), C(8d475a2e64faf2d2), >+ C(48567f7dca46ecaf), C(254cda08d5f87a6d), C(ec6ae9f729c47039), >+ C(5a430e29)}, >+ {C(d8323be05433a412), C(8d48fa2b2b76141d), C(3d346f23978336a5), >+ C(4d50c7537562033f), C(57dc7625b61dfe89), C(9723a9f4c08ad93a), >+ C(5309596f48ab456b), C(4d50c7537562033f), C(57dc7625b61dfe89), >+ C(9723a9f4c08ad93a), C(5309596f48ab456b), C(7e453088019d220f), >+ C(8776067ba6ab9714), C(67e1d06bd195de39), C(74a1a32f8994b918), >+ C(2797add0)}, >+ {C(3b5404278a55a7fc), C(23ca0b327c2d0a81), C(a6d65329571c892c), >+ C(45504801e0e6066b), C(86e6c6d6152a3d04), C(4f3db1c53eca2952), >+ C(d24d69b3e9ef10f3), C(45504801e0e6066b), C(86e6c6d6152a3d04), >+ C(4f3db1c53eca2952), C(d24d69b3e9ef10f3), C(93a0de2219e66a70), >+ C(8932c7115ccb1f8a), C(5ef503fdf2841a8c), C(38064dd9efa80a41), >+ C(27d55016)}, >+ {C(2a96a3f96c5e9bbc), C(8caf8566e212dda8), C(904de559ca16e45e), >+ C(f13bc2d9c2fe222e), C(be4ccec9a6cdccfd), C(37b2cbdd973a3ac9), >+ C(7b3223cd9c9497be), C(f13bc2d9c2fe222e), C(be4ccec9a6cdccfd), >+ C(37b2cbdd973a3ac9), C(7b3223cd9c9497be), C(d5904440f376f889), >+ C(62b13187699c473c), C(4751b89251f26726), C(9500d84fa3a61ba8), >+ C(84945a82)}, >+ {C(22bebfdcc26d18ff), C(4b4d8dcb10807ba1), C(40265eee30c6b896), >+ C(3752b423073b119a), C(377dc5eb7c662bdb), C(2b9f07f93a6c25b9), >+ C(96f24ede2bdc0718), C(3752b423073b119a), C(377dc5eb7c662bdb), >+ C(2b9f07f93a6c25b9), C(96f24ede2bdc0718), C(f7699b12c31417bd), >+ C(17b366f401c58b2), C(bf60188d5f437b37), C(484436e56df17f04), C(3ef7e224)}, >+ {C(627a2249ec6bbcc2), C(c0578b462a46735a), C(4974b8ee1c2d4f1f), >+ C(ebdbb918eb6d837f), C(8fb5f218dd84147c), C(c77dd1f881df2c54), >+ 
C(62eac298ec226dc3), C(ebdbb918eb6d837f), C(8fb5f218dd84147c), >+ C(c77dd1f881df2c54), C(62eac298ec226dc3), C(43eded83c4b60bd0), >+ C(9a0a403b5487503b), C(25f305d9147f0bda), C(3ad417f511bc1e64), >+ C(35ed8dc8)}, >+ {C(3abaf1667ba2f3e0), C(ee78476b5eeadc1), C(7e56ac0a6ca4f3f4), >+ C(f1b9b413df9d79ed), C(a7621b6fd02db503), C(d92f7ba9928a4ffe), >+ C(53f56babdcae96a6), C(f1b9b413df9d79ed), C(a7621b6fd02db503), >+ C(d92f7ba9928a4ffe), C(53f56babdcae96a6), C(5302b89fc48713ab), >+ C(d03e3b04dbe7a2f2), C(fa74ef8af6d376a7), C(103c8cdea1050ef2), >+ C(6a75e43d)}, >+ {C(3931ac68c5f1b2c9), C(efe3892363ab0fb0), C(40b707268337cd36), >+ C(a53a6b64b1ac85c9), C(d50e7f86ee1b832b), C(7bab08fdd26ba0a4), >+ C(7587743c18fe2475), C(a53a6b64b1ac85c9), C(d50e7f86ee1b832b), >+ C(7bab08fdd26ba0a4), C(7587743c18fe2475), C(e3b5d5d490cf5761), >+ C(dfc053f7d065edd5), C(42ffd8d5fb70129f), C(599ca38677cccdc3), >+ C(235d9805)}, >+ {C(b98fb0606f416754), C(46a6e5547ba99c1e), C(c909d82112a8ed2), >+ C(dbfaae9642b3205a), C(f676a1339402bcb9), C(f4f12a5b1ac11f29), >+ C(7db8bad81249dee4), C(dbfaae9642b3205a), C(f676a1339402bcb9), >+ C(f4f12a5b1ac11f29), C(7db8bad81249dee4), C(b26e46f2da95922e), >+ C(2aaedd5e12e3c611), C(a0e2d9082966074), C(c64da8a167add63d), C(f7d69572)}, >+ {C(7f7729a33e58fcc4), C(2e4bc1e7a023ead4), C(e707008ea7ca6222), >+ C(47418a71800334a0), C(d10395d8fc64d8a4), C(8257a30062cb66f), >+ C(6786f9b2dc1ff18a), C(47418a71800334a0), C(d10395d8fc64d8a4), >+ C(8257a30062cb66f), C(6786f9b2dc1ff18a), C(5633f437bb2f180f), >+ C(e5a3a405737d22d6), C(ca0ff1ef6f7f0b74), C(d0ae600684b16df8), >+ C(bacd0199)}, >+ {C(42a0aa9ce82848b3), C(57232730e6bee175), C(f89bb3f370782031), >+ C(caa33cf9b4f6619c), C(b2c8648ad49c209f), C(9e89ece0712db1c0), >+ C(101d8274a711a54b), C(caa33cf9b4f6619c), C(b2c8648ad49c209f), >+ C(9e89ece0712db1c0), C(101d8274a711a54b), C(538e79f1e70135cd), >+ C(e1f5a76f983c844e), C(653c082fd66088fc), C(1b9c9b464b654958), >+ C(e428f50e)}, >+ {C(6b2c6d38408a4889), C(de3ef6f68fb25885), 
C(20754f456c203361), >+ C(941f5023c0c943f9), C(dfdeb9564fd66f24), C(2140cec706b9d406), >+ C(7b22429b131e9c72), C(941f5023c0c943f9), C(dfdeb9564fd66f24), >+ C(2140cec706b9d406), C(7b22429b131e9c72), C(94215c22eb940f45), >+ C(d28b9ed474f7249a), C(6f25e88f2fbf9f56), C(b6718f9e605b38ac), >+ C(81eaaad3)}, >+ {C(930380a3741e862a), C(348d28638dc71658), C(89dedcfd1654ea0d), >+ C(7e7f61684080106), C(837ace9794582976), C(5ac8ca76a357eb1b), >+ C(32b58308625661fb), C(7e7f61684080106), C(837ace9794582976), >+ C(5ac8ca76a357eb1b), C(32b58308625661fb), C(c09705c4572025d9), >+ C(f9187f6af0291303), C(1c0edd8ee4b02538), C(e6cb105daa0578a), C(addbd3e3)}, >+ {C(94808b5d2aa25f9a), C(cec72968128195e0), C(d9f4da2bdc1e130f), >+ C(272d8dd74f3006cc), C(ec6c2ad1ec03f554), C(4ad276b249a5d5dd), >+ C(549a22a17c0cde12), C(272d8dd74f3006cc), C(ec6c2ad1ec03f554), >+ C(4ad276b249a5d5dd), C(549a22a17c0cde12), C(602119cb824d7cde), >+ C(f4d3cef240ef35fa), C(e889895e01911bc7), C(785a7e5ac20e852b), >+ C(e66dbca0)}, >+ {C(b31abb08ae6e3d38), C(9eb9a95cbd9e8223), C(8019e79b7ee94ea9), >+ C(7b2271a7a3248e22), C(3b4f700e5a0ba523), C(8ebc520c227206fe), >+ C(da3f861490f5d291), C(7b2271a7a3248e22), C(3b4f700e5a0ba523), >+ C(8ebc520c227206fe), C(da3f861490f5d291), C(d08a689f9f3aa60e), >+ C(547c1b97a068661f), C(4b15a67fa29172f0), C(eaf40c085191d80f), >+ C(afe11fd5)}, >+ {C(dccb5534a893ea1a), C(ce71c398708c6131), C(fe2396315457c164), >+ C(3f1229f4d0fd96fb), C(33130aa5fa9d43f2), C(e42693d5b34e63ab), >+ C(2f4ef2be67f62104), C(3f1229f4d0fd96fb), C(33130aa5fa9d43f2), >+ C(e42693d5b34e63ab), C(2f4ef2be67f62104), C(372e5153516e37b9), >+ C(af9ec142ab12cc86), C(777920c09345e359), C(e7c4a383bef8adc6), >+ C(a71a406f)}, >+ {C(6369163565814de6), C(8feb86fb38d08c2f), C(4976933485cc9a20), >+ C(7d3e82d5ba29a90d), C(d5983cc93a9d126a), C(37e9dfd950e7b692), >+ C(80673be6a7888b87), C(7d3e82d5ba29a90d), C(d5983cc93a9d126a), >+ C(37e9dfd950e7b692), C(80673be6a7888b87), C(57f732dc600808bc), >+ C(59477199802cc78b), C(f824810eb8f2c2de), 
C(c4a3437f05b3b61c), >+ C(9d90eaf5)}, >+ {C(edee4ff253d9f9b3), C(96ef76fb279ef0ad), C(a4d204d179db2460), >+ C(1f3dcdfa513512d6), C(4dc7ec07283117e4), C(4438bae88ae28bf9), >+ C(aa7eae72c9244a0d), C(1f3dcdfa513512d6), C(4dc7ec07283117e4), >+ C(4438bae88ae28bf9), C(aa7eae72c9244a0d), C(b9aedc8d3ecc72df), >+ C(b75a8eb090a77d62), C(6b15677f9cd91507), C(51d8282cb3a9ddbf), >+ C(6665db10)}, >+ {C(941993df6e633214), C(929bc1beca5b72c6), C(141fc52b8d55572d), >+ C(b3b782ad308f21ed), C(4f2676485041dee0), C(bfe279aed5cb4bc8), >+ C(2a62508a467a22ff), C(b3b782ad308f21ed), C(4f2676485041dee0), >+ C(bfe279aed5cb4bc8), C(2a62508a467a22ff), C(e74d29eab742385d), >+ C(56b05cd90ecfc293), C(c603728ea73f8844), C(8638fcd21bc692c4), >+ C(9c977cbf)}, >+ {C(859838293f64cd4c), C(484403b39d44ad79), C(bf674e64d64b9339), >+ C(44d68afda9568f08), C(478568ed51ca1d65), C(679c204ad3d9e766), >+ C(b28e788878488dc1), C(44d68afda9568f08), C(478568ed51ca1d65), >+ C(679c204ad3d9e766), C(b28e788878488dc1), C(d001a84d3a84fae6), >+ C(d376958fe4cb913e), C(17435277e36c86f0), C(23657b263c347aa6), >+ C(ee83ddd4)}, >+ {C(c19b5648e0d9f555), C(328e47b2b7562993), C(e756b92ba4bd6a51), >+ C(c3314e362764ddb8), C(6481c084ee9ec6b5), C(ede23fb9a251771), >+ C(bd617f2643324590), C(c3314e362764ddb8), C(6481c084ee9ec6b5), >+ C(ede23fb9a251771), C(bd617f2643324590), C(d2d30c9b95e030f5), >+ C(8a517312ffc5795e), C(8b1f325033bd535e), C(3ee6e867e03f2892), C(26519cc)}, >+ {C(f963b63b9006c248), C(9e9bf727ffaa00bc), C(c73bacc75b917e3a), >+ C(2c6aa706129cc54c), C(17a706f59a49f086), C(c7c1eec455217145), >+ C(6adfdc6e07602d42), C(2c6aa706129cc54c), C(17a706f59a49f086), >+ C(c7c1eec455217145), C(6adfdc6e07602d42), C(fb75fca30d848dd2), >+ C(5228c9ed14653ed4), C(953958910153b1a2), C(a430103a24f42a5d), >+ C(a485a53f)}, >+ {C(6a8aa0852a8c1f3b), C(c8f1e5e206a21016), C(2aa554aed1ebb524), >+ C(fc3e3c322cd5d89b), C(b7e3911dc2bd4ebb), C(fcd6da5e5fae833a), >+ C(51ed3c41f87f9118), C(fc3e3c322cd5d89b), C(b7e3911dc2bd4ebb), >+ C(fcd6da5e5fae833a), 
C(51ed3c41f87f9118), C(f31750cbc19c420a), >+ C(186dab1abada1d86), C(ca7f88cb894b3cd7), C(2859eeb1c373790c), >+ C(f62bc412)}, >+ {C(740428b4d45e5fb8), C(4c95a4ce922cb0a5), C(e99c3ba78feae796), >+ C(914f1ea2fdcebf5c), C(9566453c07cd0601), C(9841bf66d0462cd), >+ C(79140c1c18536aeb), C(914f1ea2fdcebf5c), C(9566453c07cd0601), >+ C(9841bf66d0462cd), C(79140c1c18536aeb), C(a963b930b05820c2), >+ C(6a7d9fa0c8c45153), C(64214c40d07cf39b), C(7057daf1d806c014), >+ C(8975a436)}, >+ {C(658b883b3a872b86), C(2f0e303f0f64827a), C(975337e23dc45e1), >+ C(99468a917986162b), C(7b31434aac6e0af0), C(f6915c1562c7d82f), >+ C(e4071d82a6dd71db), C(99468a917986162b), C(7b31434aac6e0af0), >+ C(f6915c1562c7d82f), C(e4071d82a6dd71db), C(5f5331f077b5d996), >+ C(7b314ba21b747a4f), C(5a73cb9521da17f5), C(12ed435fae286d86), >+ C(94ff7f41)}, >+ {C(6df0a977da5d27d4), C(891dd0e7cb19508), C(fd65434a0b71e680), >+ C(8799e4740e573c50), C(9e739b52d0f341e8), C(cdfd34ba7d7b03eb), >+ C(5061812ce6c88499), C(8799e4740e573c50), C(9e739b52d0f341e8), >+ C(cdfd34ba7d7b03eb), C(5061812ce6c88499), C(612b8d8f2411dc5c), >+ C(878bd883d29c7787), C(47a846727182bb), C(ec4949508c8b3b9a), C(760aa031)}, >+ {C(a900275464ae07ef), C(11f2cfda34beb4a3), C(9abf91e5a1c38e4), >+ C(8063d80ab26f3d6d), C(4177b4b9b4f0393f), C(6de42ba8672b9640), >+ C(d0bccdb72c51c18), C(8063d80ab26f3d6d), C(4177b4b9b4f0393f), >+ C(6de42ba8672b9640), C(d0bccdb72c51c18), C(af3f611b7f22cf12), >+ C(3863c41492645755), C(928c7a616a8f14f9), C(a82c78eb2eadc58b), >+ C(3bda76df)}, >+ {C(810bc8aa0c40bcb0), C(448a019568d01441), C(f60ec52f60d3aeae), >+ C(52c44837aa6dfc77), C(15d8d8fccdd6dc5b), C(345b793ccfa93055), >+ C(932160fe802ca975), C(52c44837aa6dfc77), C(15d8d8fccdd6dc5b), >+ C(345b793ccfa93055), C(932160fe802ca975), C(a624b0dd93fc18cd), >+ C(d955b254c2037f1e), C(e540533d370a664c), C(2ba4ec12514e9d7), C(498e2e65)}, >+ {C(22036327deb59ed7), C(adc05ceb97026a02), C(48bff0654262672b), >+ C(c791b313aba3f258), C(443c7757a4727bee), C(e30e4b2372171bdf), >+ 
C(f3db986c4156f3cb), C(c791b313aba3f258), C(443c7757a4727bee), >+ C(e30e4b2372171bdf), C(f3db986c4156f3cb), C(a939aefab97c6e15), >+ C(dbeb8acf1d5b0e6c), C(1e0eab667a795bba), C(80dd539902df4d50), >+ C(d38deb48)}, >+ {C(7d14dfa9772b00c8), C(595735efc7eeaed7), C(29872854f94c3507), >+ C(bc241579d8348401), C(16dc832804d728f0), C(e9cc71ae64e3f09e), >+ C(bef634bc978bac31), C(bc241579d8348401), C(16dc832804d728f0), >+ C(e9cc71ae64e3f09e), C(bef634bc978bac31), C(7f64b1fa2a9129e), >+ C(71d831bd530ac7f3), C(c7ad0a8a6d5be6f1), C(82a7d3a815c7aaab), >+ C(82b3fb6b)}, >+ {C(2d777cddb912675d), C(278d7b10722a13f9), C(f5c02bfb7cc078af), >+ C(4283001239888836), C(f44ca39a6f79db89), C(ed186122d71bcc9f), >+ C(8620017ab5f3ba3b), C(4283001239888836), C(f44ca39a6f79db89), >+ C(ed186122d71bcc9f), C(8620017ab5f3ba3b), C(e787472187f176c), >+ C(267e64c4728cf181), C(f1ba4b3007c15e30), C(8e3a75d5b02ecfc0), >+ C(e500e25f)}, >+ {C(f2ec98824e8aa613), C(5eb7e3fb53fe3bed), C(12c22860466e1dd4), >+ C(374dd4288e0b72e5), C(ff8916db706c0df4), C(cb1a9e85de5e4b8d), >+ C(d4d12afb67a27659), C(374dd4288e0b72e5), C(ff8916db706c0df4), >+ C(cb1a9e85de5e4b8d), C(d4d12afb67a27659), C(feb69095d1ba175a), >+ C(e2003aab23a47fad), C(8163a3ecab894b49), C(46d356674ce041f6), >+ C(bd2bb07c)}, >+ {C(5e763988e21f487f), C(24189de8065d8dc5), C(d1519d2403b62aa0), >+ C(9136456740119815), C(4d8ff7733b27eb83), C(ea3040bc0c717ef8), >+ C(7617ab400dfadbc), C(9136456740119815), C(4d8ff7733b27eb83), >+ C(ea3040bc0c717ef8), C(7617ab400dfadbc), C(fb336770c10b17a1), >+ C(6123b68b5b31f151), C(1e147d5f295eccf2), C(9ecbb1333556f977), >+ C(3a2b431d)}, >+ {C(48949dc327bb96ad), C(e1fd21636c5c50b4), C(3f6eb7f13a8712b4), >+ C(14cf7f02dab0eee8), C(6d01750605e89445), C(4f1cf4006e613b78), >+ C(57c40c4db32bec3b), C(14cf7f02dab0eee8), C(6d01750605e89445), >+ C(4f1cf4006e613b78), C(57c40c4db32bec3b), C(1fde5a347f4a326e), >+ C(cb5a54308adb0e3f), C(14994b2ba447a23c), C(7067d0abb4257b68), >+ C(7322a83d)}, >+ {C(b7c4209fb24a85c5), C(b35feb319c79ce10), 
C(f0d3de191833b922), >+ C(570d62758ddf6397), C(5e0204fb68a7b800), C(4383a9236f8b5a2b), >+ C(7bc1a64641d803a4), C(570d62758ddf6397), C(5e0204fb68a7b800), >+ C(4383a9236f8b5a2b), C(7bc1a64641d803a4), C(5434d61285099f7a), >+ C(d49449aacdd5dd67), C(97855ba0e9a7d75d), C(da67328062f3a62f), >+ C(a645ca1c)}, >+ {C(9c9e5be0943d4b05), C(b73dc69e45201cbb), C(aab17180bfe5083d), >+ C(c738a77a9a55f0e2), C(705221addedd81df), C(fd9bd8d397abcfa3), >+ C(8ccf0004aa86b795), C(c738a77a9a55f0e2), C(705221addedd81df), >+ C(fd9bd8d397abcfa3), C(8ccf0004aa86b795), C(2bb5db2280068206), >+ C(8c22d29f307a01d), C(274a22de02f473c8), C(b8791870f4268182), C(8909a45a)}, >+ {C(3898bca4dfd6638d), C(f911ff35efef0167), C(24bdf69e5091fc88), >+ C(9b82567ab6560796), C(891b69462b41c224), C(8eccc7e4f3af3b51), >+ C(381e54c3c8f1c7d0), C(9b82567ab6560796), C(891b69462b41c224), >+ C(8eccc7e4f3af3b51), C(381e54c3c8f1c7d0), C(c80fbc489a558a55), >+ C(1ba88e062a663af7), C(af7b1ef1c0116303), C(bd20e1a5a6b1a0cd), >+ C(bd30074c)}, >+ {C(5b5d2557400e68e7), C(98d610033574cee), C(dfd08772ce385deb), >+ C(3c13e894365dc6c2), C(26fc7bbcda3f0ef), C(dbb71106cdbfea36), >+ C(785239a742c6d26d), C(3c13e894365dc6c2), C(26fc7bbcda3f0ef), >+ C(dbb71106cdbfea36), C(785239a742c6d26d), C(f810c415ae05b2f4), >+ C(bb9b9e7398526088), C(70128f1bf830a32b), C(bcc73f82b6410899), >+ C(c17cf001)}, >+ {C(a927ed8b2bf09bb6), C(606e52f10ae94eca), C(71c2203feb35a9ee), >+ C(6e65ec14a8fb565), C(34bff6f2ee5a7f79), C(2e329a5be2c011b), >+ C(73161c93331b14f9), C(6e65ec14a8fb565), C(34bff6f2ee5a7f79), >+ C(2e329a5be2c011b), C(73161c93331b14f9), C(15d13f2408aecf88), >+ C(9f5b61b8a4b55b31), C(8fe25a43b296dba6), C(bdad03b7300f284e), >+ C(26ffd25a)}, >+ {C(8d25746414aedf28), C(34b1629d28b33d3a), C(4d5394aea5f82d7b), >+ C(379f76458a3c8957), C(79dd080f9843af77), C(c46f0a7847f60c1d), >+ C(af1579c5797703cc), C(379f76458a3c8957), C(79dd080f9843af77), >+ C(c46f0a7847f60c1d), C(af1579c5797703cc), C(8b7d31f338755c14), >+ C(2eff97679512aaa8), C(df07d68e075179ed), 
C(c8fa6c7a729e7f1f), >+ C(f1d8ce3c)}, >+ {C(b5bbdb73458712f2), C(1ff887b3c2a35137), C(7f7231f702d0ace9), >+ C(1e6f0910c3d25bd8), C(ad9e250862102467), C(1c842a07abab30cd), >+ C(cd8124176bac01ac), C(1e6f0910c3d25bd8), C(ad9e250862102467), >+ C(1c842a07abab30cd), C(cd8124176bac01ac), C(ea6ebe7a79b67edc), >+ C(73f598ac9db26713), C(4f4e72d7460b8fc), C(365dc4b9fdf13f21), C(3ee8fb17)}, >+ {C(3d32a26e3ab9d254), C(fc4070574dc30d3a), C(f02629579c2b27c9), >+ C(b1cf09b0184a4834), C(5c03db48eb6cc159), C(f18c7fcf34d1df47), >+ C(dfb043419ecf1fa9), C(b1cf09b0184a4834), C(5c03db48eb6cc159), >+ C(f18c7fcf34d1df47), C(dfb043419ecf1fa9), C(dcd78d13f9ca658f), >+ C(4355d408ffe8e49f), C(81eefee908b593b4), C(590c213c20e981a3), >+ C(a77acc2a)}, >+ {C(9371d3c35fa5e9a5), C(42967cf4d01f30), C(652d1eeae704145c), >+ C(ceaf1a0d15234f15), C(1450a54e45ba9b9), C(65e9c1fd885aa932), >+ C(354d4bc034ba8cbe), C(ceaf1a0d15234f15), C(1450a54e45ba9b9), >+ C(65e9c1fd885aa932), C(354d4bc034ba8cbe), C(8fd4ff484c08fb4b), >+ C(bf46749866f69ba0), C(cf1c21ede82c9477), C(4217548c43da109), C(f4556dee)}, >+ {C(cbaa3cb8f64f54e0), C(76c3b48ee5c08417), C(9f7d24e87e61ce9), >+ C(85b8e53f22e19507), C(bb57137739ca486b), C(c77f131cca38f761), >+ C(c56ac3cf275be121), C(85b8e53f22e19507), C(bb57137739ca486b), >+ C(c77f131cca38f761), C(c56ac3cf275be121), C(9ec1a6c9109d2685), >+ C(3dad0922e76afdb0), C(fd58cbf952958103), C(7b04c908e78639a1), >+ C(de287a64)}, >+ {C(b2e23e8116c2ba9f), C(7e4d9c0060101151), C(3310da5e5028f367), >+ C(adc52dddb76f6e5e), C(4aad4e925a962b68), C(204b79b7f7168e64), >+ C(df29ed6671c36952), C(adc52dddb76f6e5e), C(4aad4e925a962b68), >+ C(204b79b7f7168e64), C(df29ed6671c36952), C(e02927cac396d210), >+ C(5d500e71742b638a), C(5c9998af7f27b124), C(3fba9a2573dc2f7), C(878e55b9)}, >+ {C(8aa77f52d7868eb9), C(4d55bd587584e6e2), C(d2db37041f495f5), >+ C(ce030d15b5fe2f4), C(86b4a7a0780c2431), C(ee070a9ae5b51db7), >+ C(edc293d9595be5d8), C(ce030d15b5fe2f4), C(86b4a7a0780c2431), >+ C(ee070a9ae5b51db7), 
C(edc293d9595be5d8), C(3dfc5ec108260a2b), >+ C(8afe28c7123bf4e2), C(da82ef38023a7a5f), C(3e1f77b0174b77c3), C(7648486)}, >+ {C(858fea922c7fe0c3), C(cfe8326bf733bc6f), C(4e5e2018cf8f7dfc), >+ C(64fd1bc011e5bab7), C(5c9e858728015568), C(97ac42c2b00b29b1), >+ C(7f89caf08c109aee), C(64fd1bc011e5bab7), C(5c9e858728015568), >+ C(97ac42c2b00b29b1), C(7f89caf08c109aee), C(9a8af34fd0e9dacf), >+ C(bbc54161aa1507e0), C(7cda723ccbbfe5ee), C(2c289d839fb93f58), >+ C(57ac0fb1)}, >+ {C(46ef25fdec8392b1), C(e48d7b6d42a5cd35), C(56a6fe1c175299ca), >+ C(fdfa836b41dcef62), C(2f8db8030e847e1b), C(5ba0a49ac4f9b0f8), >+ C(dae897ed3e3fce44), C(fdfa836b41dcef62), C(2f8db8030e847e1b), >+ C(5ba0a49ac4f9b0f8), C(dae897ed3e3fce44), C(9c432e31aef626e7), >+ C(9a36e1c6cd6e3dd), C(5095a167c34d19d), C(a70005cfa6babbea), C(d01967ca)}, >+ {C(8d078f726b2df464), C(b50ee71cdcabb299), C(f4af300106f9c7ba), >+ C(7d222caae025158a), C(cc028d5fd40241b9), C(dd42515b639e6f97), >+ C(e08e86531a58f87f), C(7d222caae025158a), C(cc028d5fd40241b9), >+ C(dd42515b639e6f97), C(e08e86531a58f87f), C(d93612c835b37d7b), >+ C(91dd61729b2fa7f4), C(ba765a1bdda09db7), C(55258b451b2b1297), >+ C(96ecdf74)}, >+ {C(35ea86e6960ca950), C(34fe1fe234fc5c76), C(a00207a3dc2a72b7), >+ C(80395e48739e1a67), C(74a67d8f7f43c3d7), C(dd2bdd1d62246c6e), >+ C(a1f44298ba80acf6), C(80395e48739e1a67), C(74a67d8f7f43c3d7), >+ C(dd2bdd1d62246c6e), C(a1f44298ba80acf6), C(ad86d86c187bf38), >+ C(26feea1f2eee240d), C(ed7f1fd066b23897), C(a768cf1e0fbb502), C(779f5506)}, >+ {C(8aee9edbc15dd011), C(51f5839dc8462695), C(b2213e17c37dca2d), >+ C(133b299a939745c5), C(796e2aac053f52b3), C(e8d9fe1521a4a222), >+ C(819a8863e5d1c290), C(133b299a939745c5), C(796e2aac053f52b3), >+ C(e8d9fe1521a4a222), C(819a8863e5d1c290), C(c0737f0fe34d36ad), >+ C(e6d6d4a267a5cc31), C(98300a7911674c23), C(bef189661c257098), >+ C(3c94c2de)}, >+ {C(c3e142ba98432dda), C(911d060cab126188), C(b753fbfa8365b844), >+ C(fd1a9ba5e71b08a2), C(7ac0dc2ed7778533), C(b543161ff177188a), >+ 
C(492fc08a6186f3f4), C(fd1a9ba5e71b08a2), C(7ac0dc2ed7778533), >+ C(b543161ff177188a), C(492fc08a6186f3f4), C(fc4745f516afd3b6), >+ C(88c30370a53080e), C(65a1bb34abc465e2), C(abbd14662911c8b3), C(39f98faf)}, >+ {C(123ba6b99c8cd8db), C(448e582672ee07c4), C(cebe379292db9e65), >+ C(938f5bbab544d3d6), C(d2a95f9f2d376d73), C(68b2f16149e81aa3), >+ C(ad7e32f82d86c79d), C(938f5bbab544d3d6), C(d2a95f9f2d376d73), >+ C(68b2f16149e81aa3), C(ad7e32f82d86c79d), C(4574015ae8626ce2), >+ C(455aa6137386a582), C(658ad2542e8ec20), C(e31d7be2ca35d00), C(7af31199)}, >+ {C(ba87acef79d14f53), C(b3e0fcae63a11558), C(d5ac313a593a9f45), >+ C(eea5f5a9f74af591), C(578710bcc36fbea2), C(7a8393432188931d), >+ C(705cfc5ec7cc172), C(eea5f5a9f74af591), C(578710bcc36fbea2), >+ C(7a8393432188931d), C(705cfc5ec7cc172), C(da85ebe5fc427976), >+ C(bfa5c7a454df54c8), C(4632b72a81bf66d2), C(5dd72877db539ee2), >+ C(e341a9d6)}, >+ {C(bcd3957d5717dc3), C(2da746741b03a007), C(873816f4b1ece472), >+ C(2b826f1a2c08c289), C(da50f56863b55e74), C(b18712f6b3eed83b), >+ C(bdc7cc05ab4c685f), C(2b826f1a2c08c289), C(da50f56863b55e74), >+ C(b18712f6b3eed83b), C(bdc7cc05ab4c685f), C(9e45fb833d1b0af), >+ C(d7213081db29d82e), C(d2a6b6c6a09ed55e), C(98a7686cba323ca9), >+ C(ca24aeeb)}, >+ {C(61442ff55609168e), C(6447c5fc76e8c9cf), C(6a846de83ae15728), >+ C(effc2663cffc777f), C(93214f8f463afbed), C(a156ef06066f4e4e), >+ C(a407b6ed8769d51e), C(effc2663cffc777f), C(93214f8f463afbed), >+ C(a156ef06066f4e4e), C(a407b6ed8769d51e), C(bb2f9ed29745c02a), >+ C(981eecd435b36ad9), C(461a5a05fb9cdff4), C(bd6cb2a87b9f910c), >+ C(b2252b57)}, >+ {C(dbe4b1b2d174757f), C(506512da18712656), C(6857f3e0b8dd95f), >+ C(5a4fc2728a9bb671), C(ebb971522ec38759), C(1a5a093e6cf1f72b), >+ C(729b057fe784f504), C(5a4fc2728a9bb671), C(ebb971522ec38759), >+ C(1a5a093e6cf1f72b), C(729b057fe784f504), C(71fcbf42a767f9cf), >+ C(114cfe772da6cdd), C(60cdf9cb629d9d7a), C(e270d10ad088b24e), C(72c81da1)}, >+ {C(531e8e77b363161c), C(eece0b43e2dae030), 
C(8294b82c78f34ed1), >+ C(e777b1fd580582f2), C(7b880f58da112699), C(562c6b189a6333f4), >+ C(139d64f88a611d4), C(e777b1fd580582f2), C(7b880f58da112699), >+ C(562c6b189a6333f4), C(139d64f88a611d4), C(53d8ef17eda64fa4), >+ C(bf3eded14dc60a04), C(2b5c559cf5ec07c5), C(8895f7339d03a48a), >+ C(6b9fce95)}, >+ {C(f71e9c926d711e2b), C(d77af2853a4ceaa1), C(9aa0d6d76a36fae7), >+ C(dd16cd0fbc08393), C(29a414a5d8c58962), C(72793d8d1022b5b2), >+ C(2e8e69cf7cbffdf0), C(dd16cd0fbc08393), C(29a414a5d8c58962), >+ C(72793d8d1022b5b2), C(2e8e69cf7cbffdf0), C(3721c0473aa99c9a), >+ C(1cff4ed9c31cd91c), C(4990735033cc482b), C(7fdf8c701c72f577), >+ C(19399857)}, >+ {C(cb20ac28f52df368), C(e6705ee7880996de), C(9b665cc3ec6972f2), >+ C(4260e8c254e9924b), C(f197a6eb4591572d), C(8e867ff0fb7ab27c), >+ C(f95502fb503efaf3), C(4260e8c254e9924b), C(f197a6eb4591572d), >+ C(8e867ff0fb7ab27c), C(f95502fb503efaf3), C(30c41876b08e3e22), >+ C(958e2419e3cd22f4), C(f0f3aa1fe119a107), C(481662310a379100), >+ C(3c57a994)}, >+ {C(e4a794b4acb94b55), C(89795358057b661b), C(9c4cdcec176d7a70), >+ C(4890a83ee435bc8b), C(d8c1c00fceb00914), C(9e7111ba234f900f), >+ C(eb8dbab364d8b604), C(4890a83ee435bc8b), C(d8c1c00fceb00914), >+ C(9e7111ba234f900f), C(eb8dbab364d8b604), C(b3261452963eebb), >+ C(6cf94b02792c4f95), C(d88fa815ef1e8fc), C(2d687af66604c73), C(c053e729)}, >+ {C(cb942e91443e7208), C(e335de8125567c2a), C(d4d74d268b86df1f), >+ C(8ba0fdd2ffc8b239), C(f413b366c1ffe02f), C(c05b2717c59a8a28), >+ C(981188eab4fcc8fb), C(8ba0fdd2ffc8b239), C(f413b366c1ffe02f), >+ C(c05b2717c59a8a28), C(981188eab4fcc8fb), C(e563f49a1d9072ba), >+ C(3c6a3aa4a26367dc), C(ba0db13448653f34), C(31065d756074d7d6), >+ C(51cbbba7)}, >+ {C(ecca7563c203f7ba), C(177ae2423ef34bb2), C(f60b7243400c5731), >+ C(cf1edbfe7330e94e), C(881945906bcb3cc6), C(4acf0293244855da), >+ C(65ae042c1c2a28c2), C(cf1edbfe7330e94e), C(881945906bcb3cc6), >+ C(4acf0293244855da), C(65ae042c1c2a28c2), C(b25fa0a1cab33559), >+ C(d98e8daa28124131), C(fce17f50b9c351b3), 
C(3f995ccf7386864b), >+ C(1acde79a)}, >+ {C(1652cb940177c8b5), C(8c4fe7d85d2a6d6d), C(f6216ad097e54e72), >+ C(f6521b912b368ae6), C(a9fe4eff81d03e73), C(d6f623629f80d1a3), >+ C(2b9604f32cb7dc34), C(f6521b912b368ae6), C(a9fe4eff81d03e73), >+ C(d6f623629f80d1a3), C(2b9604f32cb7dc34), C(2a43d84dcf59c7e2), >+ C(d0a197c70c5dae0b), C(6e84d4bbc71d76a0), C(c7e94620378c6cb2), >+ C(2d160d13)}, >+ {C(31fed0fc04c13ce8), C(3d5d03dbf7ff240a), C(727c5c9b51581203), >+ C(6b5ffc1f54fecb29), C(a8e8e7ad5b9a21d9), C(c4d5a32cd6aac22d), >+ C(d7e274ad22d4a79a), C(6b5ffc1f54fecb29), C(a8e8e7ad5b9a21d9), >+ C(c4d5a32cd6aac22d), C(d7e274ad22d4a79a), C(368841ea5731a112), >+ C(feaf7bc2e73ca48f), C(636fb272e9ea1f6), C(5d9cb7580c3f6207), C(787f5801)}, >+ {C(e7b668947590b9b3), C(baa41ad32938d3fa), C(abcbc8d4ca4b39e4), >+ C(381ee1b7ea534f4e), C(da3759828e3de429), C(3e015d76729f9955), >+ C(cbbec51a6485fbde), C(381ee1b7ea534f4e), C(da3759828e3de429), >+ C(3e015d76729f9955), C(cbbec51a6485fbde), C(9b86605281f20727), >+ C(fc6fcf508676982a), C(3b135f7a813a1040), C(d3a4706bea1db9c9), >+ C(c9629828)}, >+ {C(1de2119923e8ef3c), C(6ab27c096cf2fe14), C(8c3658edca958891), >+ C(4cc8ed3ada5f0f2), C(4a496b77c1f1c04e), C(9085b0a862084201), >+ C(a1894bde9e3dee21), C(4cc8ed3ada5f0f2), C(4a496b77c1f1c04e), >+ C(9085b0a862084201), C(a1894bde9e3dee21), C(367fb472dc5b277d), >+ C(7d39ccca16fc6745), C(763f988d70db9106), C(a8b66f7fecb70f02), >+ C(be139231)}, >+ {C(1269df1e69e14fa7), C(992f9d58ac5041b7), C(e97fcf695a7cbbb4), >+ C(e5d0549802d15008), C(424c134ecd0db834), C(6fc44fd91be15c6c), >+ C(a1a5ef95d50e537d), C(e5d0549802d15008), C(424c134ecd0db834), >+ C(6fc44fd91be15c6c), C(a1a5ef95d50e537d), C(d1e3daf5d05f5308), >+ C(4c7f81600eaa1327), C(109d1b8d1f9d0d2b), C(871e8699e0aeb862), >+ C(7df699ef)}, >+ {C(820826d7aba567ff), C(1f73d28e036a52f3), C(41c4c5a73f3b0893), >+ C(aa0d74d4a98db89b), C(36fd486d07c56e1d), C(d0ad23cbb6660d8a), >+ C(1264a84665b35e19), C(aa0d74d4a98db89b), C(36fd486d07c56e1d), >+ C(d0ad23cbb6660d8a), 
C(1264a84665b35e19), C(789682bf7d781b33), >+ C(6bfa6abd2fb5722d), C(6779cb3623d33900), C(435ca5214e1ee5f0), >+ C(8ce6b96d)}, >+ {C(ffe0547e4923cef9), C(3534ed49b9da5b02), C(548a273700fba03d), >+ C(28ac84ca70958f7e), C(d8ae575a68faa731), C(2aaaee9b9dcffd4c), >+ C(6c7faab5c285c6da), C(28ac84ca70958f7e), C(d8ae575a68faa731), >+ C(2aaaee9b9dcffd4c), C(6c7faab5c285c6da), C(45d94235f99ba78f), >+ C(ab5ea16f39497f5b), C(fb4d6c86fccbdca3), C(8104e6310a5fd2c7), >+ C(6f9ed99c)}, >+ {C(72da8d1b11d8bc8b), C(ba94b56b91b681c6), C(4e8cc51bd9b0fc8c), >+ C(43505ed133be672a), C(e8f2f9d973c2774e), C(677b9b9c7cad6d97), >+ C(4e1f5d56ef17b906), C(43505ed133be672a), C(e8f2f9d973c2774e), >+ C(677b9b9c7cad6d97), C(4e1f5d56ef17b906), C(eea3a6038f983767), >+ C(87109f077f86db01), C(ecc1ca41f74d61cc), C(34a87e86e83bed17), >+ C(e0244796)}, >+ {C(d62ab4e3f88fc797), C(ea86c7aeb6283ae4), C(b5b93e09a7fe465), >+ C(4344a1a0134afe2), C(ff5c17f02b62341d), C(3214c6a587ce4644), >+ C(a905e7ed0629d05c), C(4344a1a0134afe2), C(ff5c17f02b62341d), >+ C(3214c6a587ce4644), C(a905e7ed0629d05c), C(b5c72690cd716e82), >+ C(7c6097649e6ebe7b), C(7ceee8c6e56a4dcd), C(80ca849dc53eb9e4), >+ C(4ccf7e75)}, >+ {C(d0f06c28c7b36823), C(1008cb0874de4bb8), C(d6c7ff816c7a737b), >+ C(489b697fe30aa65f), C(4da0fb621fdc7817), C(dc43583b82c58107), >+ C(4b0261debdec3cd6), C(489b697fe30aa65f), C(4da0fb621fdc7817), >+ C(dc43583b82c58107), C(4b0261debdec3cd6), C(a9748d7b6c0e016c), >+ C(7e8828f7ba4b034b), C(da0fa54348a2512a), C(ebf9745c0962f9ad), >+ C(915cef86)}, >+ {C(99b7042460d72ec6), C(2a53e5e2b8e795c2), C(53a78132d9e1b3e3), >+ C(c043e67e6fc64118), C(ff0abfe926d844d3), C(f2a9fe5db2e910fe), >+ C(ce352cdc84a964dd), C(c043e67e6fc64118), C(ff0abfe926d844d3), >+ C(f2a9fe5db2e910fe), C(ce352cdc84a964dd), C(b89bc028aa5e6063), >+ C(a354e7fdac04459c), C(68d6547e6e980189), C(c968dddfd573773e), >+ C(5cb59482)}, >+ {C(4f4dfcfc0ec2bae5), C(841233148268a1b8), C(9248a76ab8be0d3), >+ C(334c5a25b5903a8c), C(4c94fef443122128), C(743e7d8454655c40), >+ 
C(1ab1e6d1452ae2cd), C(334c5a25b5903a8c), C(4c94fef443122128), >+ C(743e7d8454655c40), C(1ab1e6d1452ae2cd), C(fec766de4a8e476c), >+ C(cc0929da9567e71b), C(5f9ef5b5f150c35a), C(87659cabd649768f), >+ C(6ca3f532)}, >+ {C(fe86bf9d4422b9ae), C(ebce89c90641ef9c), C(1c84e2292c0b5659), >+ C(8bde625a10a8c50d), C(eb8271ded1f79a0b), C(14dc6844f0de7a3c), >+ C(f85b2f9541e7e6da), C(8bde625a10a8c50d), C(eb8271ded1f79a0b), >+ C(14dc6844f0de7a3c), C(f85b2f9541e7e6da), C(2fe22cfd1683b961), >+ C(ea1d75c5b7aa01ca), C(9eef60a44876bb95), C(950c818e505c6f7f), >+ C(e24f3859)}, >+ {C(a90d81060932dbb0), C(8acfaa88c5fbe92b), C(7c6f3447e90f7f3f), >+ C(dd52fc14c8dd3143), C(1bc7508516e40628), C(3059730266ade626), >+ C(ffa526822f391c2), C(dd52fc14c8dd3143), C(1bc7508516e40628), >+ C(3059730266ade626), C(ffa526822f391c2), C(e25232d7afc8a406), >+ C(d2b8a5a3f3b5f670), C(6630f33edb7dfe32), C(c71250ba68c4ea86), >+ C(adf5a9c7)}, >+ {C(17938a1b0e7f5952), C(22cadd2f56f8a4be), C(84b0d1183d5ed7c1), >+ C(c1336b92fef91bf6), C(80332a3945f33fa9), C(a0f68b86f726ff92), >+ C(a3db5282cf5f4c0b), C(c1336b92fef91bf6), C(80332a3945f33fa9), >+ C(a0f68b86f726ff92), C(a3db5282cf5f4c0b), C(82640b6fc4916607), >+ C(2dc2a3aa1a894175), C(8b4c852bdee7cc9), C(10b9d0a08b55ff83), C(32264b75)}, >+ {C(de9e0cb0e16f6e6d), C(238e6283aa4f6594), C(4fb9c914c2f0a13b), >+ C(497cb912b670f3b), C(d963a3f02ff4a5b6), C(4fccefae11b50391), >+ C(42ba47db3f7672f), C(497cb912b670f3b), C(d963a3f02ff4a5b6), >+ C(4fccefae11b50391), C(42ba47db3f7672f), C(1d6b655a1889feef), >+ C(5f319abf8fafa19f), C(715c2e49deb14620), C(8d9153082ecdcea4), >+ C(a64b3376)}, >+ {C(6d4b876d9b146d1a), C(aab2d64ce8f26739), C(d315f93600e83fe5), >+ C(2fe9fabdbe7fdd4), C(755db249a2d81a69), C(f27929f360446d71), >+ C(79a1bf957c0c1b92), C(2fe9fabdbe7fdd4), C(755db249a2d81a69), >+ C(f27929f360446d71), C(79a1bf957c0c1b92), C(3c8a28d4c936c9cd), >+ C(df0d3d13b2c6a902), C(c76702dd97cd2edd), C(1aa220f7be16517), C(d33890e)}, >+ {C(e698fa3f54e6ea22), C(bd28e20e7455358c), 
C(9ace161f6ea76e66), >+ C(d53fb7e3c93a9e4), C(737ae71b051bf108), C(7ac71feb84c2df42), >+ C(3d8075cd293a15b4), C(d53fb7e3c93a9e4), C(737ae71b051bf108), >+ C(7ac71feb84c2df42), C(3d8075cd293a15b4), C(bf8cee5e095d8a7c), >+ C(e7086b3c7608143a), C(e55b0c2fa938d70c), C(fffb5f58e643649c), >+ C(926d4b63)}, >+ {C(7bc0deed4fb349f7), C(1771aff25dc722fa), C(19ff0644d9681917), >+ C(cf7d7f25bd70cd2c), C(9464ed9baeb41b4f), C(b9064f5c3cb11b71), >+ C(237e39229b012b20), C(cf7d7f25bd70cd2c), C(9464ed9baeb41b4f), >+ C(b9064f5c3cb11b71), C(237e39229b012b20), C(dd54d3f5d982dffe), >+ C(7fc7562dbfc81dbf), C(5b0dd1924f70945), C(f1760537d8261135), C(d51ba539)}, >+ {C(db4b15e88533f622), C(256d6d2419b41ce9), C(9d7c5378396765d5), >+ C(9040e5b936b8661b), C(276e08fa53ac27fd), C(8c944d39c2bdd2cc), >+ C(e2514c9802a5743c), C(9040e5b936b8661b), C(276e08fa53ac27fd), >+ C(8c944d39c2bdd2cc), C(e2514c9802a5743c), C(e82107b11ac90386), >+ C(7d6a22bc35055e6), C(fd6ea9d1c438d8ae), C(be6015149e981553), C(7f37636d)}, >+ {C(922834735e86ecb2), C(363382685b88328e), C(e9c92960d7144630), >+ C(8431b1bfd0a2379c), C(90383913aea283f9), C(a6163831eb4924d2), >+ C(5f3921b4f9084aee), C(8431b1bfd0a2379c), C(90383913aea283f9), >+ C(a6163831eb4924d2), C(5f3921b4f9084aee), C(7a70061a1473e579), >+ C(5b19d80dcd2c6331), C(6196b97931faad27), C(869bf6828e237c3f), >+ C(b98026c0)}, >+ {C(30f1d72c812f1eb8), C(b567cd4a69cd8989), C(820b6c992a51f0bc), >+ C(c54677a80367125e), C(3204fbdba462e606), C(8563278afc9eae69), >+ C(262147dd4bf7e566), C(c54677a80367125e), C(3204fbdba462e606), >+ C(8563278afc9eae69), C(262147dd4bf7e566), C(2178b63e7ee2d230), >+ C(e9c61ad81f5bff26), C(9af7a81b3c501eca), C(44104a3859f0238f), >+ C(b877767e)}, >+ {C(168884267f3817e9), C(5b376e050f637645), C(1c18314abd34497a), >+ C(9598f6ab0683fcc2), C(1c805abf7b80e1ee), C(dec9ac42ee0d0f32), >+ C(8cd72e3912d24663), C(9598f6ab0683fcc2), C(1c805abf7b80e1ee), >+ C(dec9ac42ee0d0f32), C(8cd72e3912d24663), C(1f025d405f1c1d87), >+ C(bf7b6221e1668f8f), C(52316f64e692dbb0), 
C(7bf43df61ec51b39), C(aefae77)}, >+ {C(82e78596ee3e56a7), C(25697d9c87f30d98), C(7600a8342834924d), >+ C(6ba372f4b7ab268b), C(8c3237cf1fe243df), C(3833fc51012903df), >+ C(8e31310108c5683f), C(6ba372f4b7ab268b), C(8c3237cf1fe243df), >+ C(3833fc51012903df), C(8e31310108c5683f), C(126593715c2de429), >+ C(48ca8f35a3f54b90), C(b9322b632f4f8b0), C(926bb169b7337693), C(f686911)}, >+ {C(aa2d6cf22e3cc252), C(9b4dec4f5e179f16), C(76fb0fba1d99a99a), >+ C(9a62af3dbba140da), C(27857ea044e9dfc1), C(33abce9da2272647), >+ C(b22a7993aaf32556), C(9a62af3dbba140da), C(27857ea044e9dfc1), >+ C(33abce9da2272647), C(b22a7993aaf32556), C(bf8f88f8019bedf0), >+ C(ed2d7f01fb273905), C(6b45f15901b481cd), C(f88ebb413ba6a8d5), >+ C(3deadf12)}, >+ {C(7bf5ffd7f69385c7), C(fc077b1d8bc82879), C(9c04e36f9ed83a24), >+ C(82065c62e6582188), C(8ef787fd356f5e43), C(2922e53e36e17dfa), >+ C(9805f223d385010b), C(82065c62e6582188), C(8ef787fd356f5e43), >+ C(2922e53e36e17dfa), C(9805f223d385010b), C(692154f3491b787d), >+ C(e7e64700e414fbf), C(757d4d4ab65069a0), C(cd029446a8e348e2), C(ccf02a4e)}, >+ {C(e89c8ff9f9c6e34b), C(f54c0f669a49f6c4), C(fc3e46f5d846adef), >+ C(22f2aa3df2221cc), C(f66fea90f5d62174), C(b75defaeaa1dd2a7), >+ C(9b994cd9a7214fd5), C(22f2aa3df2221cc), C(f66fea90f5d62174), >+ C(b75defaeaa1dd2a7), C(9b994cd9a7214fd5), C(fac675a31804b773), >+ C(98bcb3b820c50fc6), C(e14af64d28cf0885), C(27466fbd2b360eb5), >+ C(176c1722)}, >+ {C(a18fbcdccd11e1f4), C(8248216751dfd65e), C(40c089f208d89d7c), >+ C(229b79ab69ae97d), C(a87aabc2ec26e582), C(be2b053721eb26d2), >+ C(10febd7f0c3d6fcb), C(229b79ab69ae97d), C(a87aabc2ec26e582), >+ C(be2b053721eb26d2), C(10febd7f0c3d6fcb), C(9cc5b9b2f6e3bf7b), >+ C(655d8495fe624a86), C(6381a9f3d1f2bd7e), C(79ebabbfc25c83e2), C(26f82ad)}, >+ {C(2d54f40cc4088b17), C(59d15633b0cd1399), C(a8cc04bb1bffd15b), >+ C(d332cdb073d8dc46), C(272c56466868cb46), C(7e7fcbe35ca6c3f3), >+ C(ee8f51e5a70399d4), C(d332cdb073d8dc46), C(272c56466868cb46), >+ C(7e7fcbe35ca6c3f3), 
C(ee8f51e5a70399d4), C(16737a9c7581fe7b), >+ C(ed04bf52f4b75dcb), C(9707ffb36bd30c1a), C(1390f236fdc0de3e), >+ C(b5244f42)}, >+ {C(69276946cb4e87c7), C(62bdbe6183be6fa9), C(3ba9773dac442a1a), >+ C(702e2afc7f5a1825), C(8c49b11ea8151fdc), C(caf3fef61f5a86fa), >+ C(ef0b2ee8649d7272), C(702e2afc7f5a1825), C(8c49b11ea8151fdc), >+ C(caf3fef61f5a86fa), C(ef0b2ee8649d7272), C(9e34a4e08d9441e1), >+ C(7bdc0cd64d5af533), C(a926b14d99e3d868), C(fca923a17788cce4), >+ C(49a689e5)}, >+ {C(668174a3f443df1d), C(407299392da1ce86), C(c2a3f7d7f2c5be28), >+ C(a590b202a7a5807b), C(968d2593f7ccb54e), C(9dd8d669e3e95dec), >+ C(ee0cc5dd58b6e93a), C(a590b202a7a5807b), C(968d2593f7ccb54e), >+ C(9dd8d669e3e95dec), C(ee0cc5dd58b6e93a), C(ac65d5a9466fb483), >+ C(221be538b2c9d806), C(5cbe9441784f9fd9), C(d4c7d5d6e3c122b8), C(59fcdd3)}, >+ {C(5e29be847bd5046), C(b561c7f19c8f80c3), C(5e5abd5021ccaeaf), >+ C(7432d63888e0c306), C(74bbceeed479cb71), C(6471586599575fdf), >+ C(6a859ad23365cba2), C(7432d63888e0c306), C(74bbceeed479cb71), >+ C(6471586599575fdf), C(6a859ad23365cba2), C(f9ceec84acd18dcc), >+ C(74a242ff1907437c), C(f70890194e1ee913), C(777dfcb4bb01f0ba), >+ C(4f4b04e9)}, >+ {C(cd0d79f2164da014), C(4c386bb5c5d6ca0c), C(8e771b03647c3b63), >+ C(69db23875cb0b715), C(ada8dd91504ae37f), C(46bf18dbf045ed6a), >+ C(e1b5f67b0645ab63), C(69db23875cb0b715), C(ada8dd91504ae37f), >+ C(46bf18dbf045ed6a), C(e1b5f67b0645ab63), C(877be8f5dcddff4), >+ C(6d471b5f9ca2e2d1), C(802c86d6f495b9bb), C(a1f9b9b22b3be704), >+ C(8b00f891)}, >+ {C(e0e6fc0b1628af1d), C(29be5fb4c27a2949), C(1c3f781a604d3630), >+ C(c4af7faf883033aa), C(9bd296c4e9453cac), C(ca45426c1f7e33f9), >+ C(a6bbdcf7074d40c5), C(c4af7faf883033aa), C(9bd296c4e9453cac), >+ C(ca45426c1f7e33f9), C(a6bbdcf7074d40c5), C(e13a005d7142733b), >+ C(c02b7925c5eeefaf), C(d39119a60441e2d5), C(3c24c710df8f4d43), >+ C(16e114f3)}, >+ {C(2058927664adfd93), C(6e8f968c7963baa5), C(af3dced6fff7c394), >+ C(42e34cf3d53c7876), C(9cddbb26424dc5e), C(64f6340a6d8eddad), >+ 
C(2196e488eb2a3a4b), C(42e34cf3d53c7876), C(9cddbb26424dc5e), >+ C(64f6340a6d8eddad), C(2196e488eb2a3a4b), C(c9e9da25911a16fd), >+ C(e21b4683f3e196a8), C(cb80bf1a4c6fdbb4), C(53792e9b3c3e67f8), >+ C(d6b6dadc)}, >+ {C(dc107285fd8e1af7), C(a8641a0609321f3f), C(db06e89ffdc54466), >+ C(bcc7a81ed5432429), C(b6d7bdc6ad2e81f1), C(93605ec471aa37db), >+ C(a2a73f8a85a8e397), C(bcc7a81ed5432429), C(b6d7bdc6ad2e81f1), >+ C(93605ec471aa37db), C(a2a73f8a85a8e397), C(10a012b8ca7ac24b), >+ C(aac5fd63351595cf), C(5bb4c648a226dea0), C(9d11ecb2b5c05c5f), >+ C(897e20ac)}, >+ {C(fbba1afe2e3280f1), C(755a5f392f07fce), C(9e44a9a15402809a), >+ C(6226a32e25099848), C(ea895661ecf53004), C(4d7e0158db2228b9), >+ C(e5a7d82922f69842), C(6226a32e25099848), C(ea895661ecf53004), >+ C(4d7e0158db2228b9), C(e5a7d82922f69842), C(2cea7713b69840ca), >+ C(18de7b9ae938375b), C(f127cca08f3cc665), C(b1c22d727665ad2), C(f996e05d)}, >+ {C(bfa10785ddc1011b), C(b6e1c4d2f670f7de), C(517d95604e4fcc1f), >+ C(ca6552a0dfb82c73), C(b024cdf09e34ba07), C(66cd8c5a95d7393b), >+ C(e3939acf790d4a74), C(ca6552a0dfb82c73), C(b024cdf09e34ba07), >+ C(66cd8c5a95d7393b), C(e3939acf790d4a74), C(97827541a1ef051e), >+ C(ac2fce47ebe6500c), C(b3f06d3bddf3bd6a), C(1d74afb25e1ce5fe), >+ C(c4306af6)}, >+ {C(534cc35f0ee1eb4e), C(b703820f1f3b3dce), C(884aa164cf22363), >+ C(f14ef7f47d8a57a3), C(80d1f86f2e061d7c), C(401d6c2f151b5a62), >+ C(e988460224108944), C(f14ef7f47d8a57a3), C(80d1f86f2e061d7c), >+ C(401d6c2f151b5a62), C(e988460224108944), C(7804d4135f68cd19), >+ C(5487b4b39e69fe8e), C(8cc5999015358a27), C(8f3729b61c2d5601), >+ C(6dcad433)}, >+ {C(7ca6e3933995dac), C(fd118c77daa8188), C(3aceb7b5e7da6545), >+ C(c8389799445480db), C(5389f5df8aacd50d), C(d136581f22fab5f), >+ C(c2f31f85991da417), C(c8389799445480db), C(5389f5df8aacd50d), >+ C(d136581f22fab5f), C(c2f31f85991da417), C(aefbf9ff84035a43), >+ C(8accbaf44adadd7c), C(e57f3657344b67f5), C(21490e5e8abdec51), >+ C(3c07374d)}, >+ {C(f0d6044f6efd7598), C(e044d6ba4369856e), 
C(91968e4f8c8a1a4c), >+ C(70bd1968996bffc2), C(4c613de5d8ab32ac), C(fe1f4f97206f79d8), >+ C(ac0434f2c4e213a9), C(70bd1968996bffc2), C(4c613de5d8ab32ac), >+ C(fe1f4f97206f79d8), C(ac0434f2c4e213a9), C(7490e9d82cfe22ca), >+ C(5fbbf7f987454238), C(c39e0dc8368ce949), C(22201d3894676c71), >+ C(f0f4602c)}, >+ {C(3d69e52049879d61), C(76610636ea9f74fe), C(e9bf5602f89310c0), >+ C(8eeb177a86053c11), C(e390122c345f34a2), C(1e30e47afbaaf8d6), >+ C(7b892f68e5f91732), C(8eeb177a86053c11), C(e390122c345f34a2), >+ C(1e30e47afbaaf8d6), C(7b892f68e5f91732), C(b87922525fa44158), >+ C(f440a1ee1a1a766b), C(ee8efad279d08c5c), C(421f910c5b60216e), >+ C(3e1ea071)}, >+ {C(79da242a16acae31), C(183c5f438e29d40), C(6d351710ae92f3de), >+ C(27233b28b5b11e9b), C(c7dfe8988a942700), C(570ed11c4abad984), >+ C(4b4c04632f48311a), C(27233b28b5b11e9b), C(c7dfe8988a942700), >+ C(570ed11c4abad984), C(4b4c04632f48311a), C(12f33235442cbf9), >+ C(a35315ca0b5b8cdb), C(d8abde62ead5506b), C(fc0fcf8478ad5266), >+ C(67580f0c)}, >+ {C(461c82656a74fb57), C(d84b491b275aa0f7), C(8f262cb29a6eb8b2), >+ C(49fa3070bc7b06d0), C(f12ed446bd0c0539), C(6d43ac5d1dd4b240), >+ C(7609524fe90bec93), C(49fa3070bc7b06d0), C(f12ed446bd0c0539), >+ C(6d43ac5d1dd4b240), C(7609524fe90bec93), C(391c2b2e076ec241), >+ C(f5e62deda7839f7b), C(3c7b3186a10d870f), C(77ef4f2cba4f1005), >+ C(4e109454)}, >+ {C(53c1a66d0b13003), C(731f060e6fe797fc), C(daa56811791371e3), >+ C(57466046cf6896ed), C(8ac37e0e8b25b0c6), C(3e6074b52ad3cf18), >+ C(aa491ce7b45db297), C(57466046cf6896ed), C(8ac37e0e8b25b0c6), >+ C(3e6074b52ad3cf18), C(aa491ce7b45db297), C(f7a9227c5e5e22c3), >+ C(3d92e0841e29ce28), C(2d30da5b2859e59d), C(ff37fa1c9cbfafc2), >+ C(88a474a7)}, >+ {C(d3a2efec0f047e9), C(1cabce58853e58ea), C(7a17b2eae3256be4), >+ C(c2dcc9758c910171), C(cb5cddaeff4ddb40), C(5d7cc5869baefef1), >+ C(9644c5853af9cfeb), C(c2dcc9758c910171), C(cb5cddaeff4ddb40), >+ C(5d7cc5869baefef1), C(9644c5853af9cfeb), C(255c968184694ee1), >+ C(4e4d726eda360927), C(7d27dd5b6d100377), 
C(9a300e2020ddea2c), C(5b5bedd)}, >+ {C(43c64d7484f7f9b2), C(5da002b64aafaeb7), C(b576c1e45800a716), >+ C(3ee84d3d5b4ca00b), C(5cbc6d701894c3f9), C(d9e946f5ae1ca95), >+ C(24ca06e67f0b1833), C(3ee84d3d5b4ca00b), C(5cbc6d701894c3f9), >+ C(d9e946f5ae1ca95), C(24ca06e67f0b1833), C(3413d46b4152650e), >+ C(cbdfdbc2ab516f9c), C(2aad8acb739e0c6c), C(2bfc950d9f9fa977), >+ C(1aaddfa7)}, >+ {C(a7dec6ad81cf7fa1), C(180c1ab708683063), C(95e0fd7008d67cff), >+ C(6b11c5073687208), C(7e0a57de0d453f3), C(e48c267d4f646867), >+ C(2168e9136375f9cb), C(6b11c5073687208), C(7e0a57de0d453f3), >+ C(e48c267d4f646867), C(2168e9136375f9cb), C(64da194aeeea7fdf), >+ C(a3b9f01fa5885678), C(c316f8ee2eb2bd17), C(a7e4d80f83e4427f), >+ C(5be07fd8)}, >+ {C(5408a1df99d4aff), C(b9565e588740f6bd), C(abf241813b08006e), >+ C(7da9e81d89fda7ad), C(274157cabe71440d), C(2c22d9a480b331f7), >+ C(e835c8ac746472d5), C(7da9e81d89fda7ad), C(274157cabe71440d), >+ C(2c22d9a480b331f7), C(e835c8ac746472d5), C(2038ce817a201ae4), >+ C(46f3289dfe1c5e40), C(435578a42d4b7c56), C(f96d9f409fcf561), C(cbca8606)}, >+ {C(a8b27a6bcaeeed4b), C(aec1eeded6a87e39), C(9daf246d6fed8326), >+ C(d45a938b79f54e8f), C(366b219d6d133e48), C(5b14be3c25c49405), >+ C(fdd791d48811a572), C(d45a938b79f54e8f), C(366b219d6d133e48), >+ C(5b14be3c25c49405), C(fdd791d48811a572), C(3de67b8d9e95d335), >+ C(903c01307cfbeed5), C(af7d65f32274f1d1), C(4dba141b5fc03c42), >+ C(bde64d01)}, >+ {C(9a952a8246fdc269), C(d0dcfcac74ef278c), C(250f7139836f0f1f), >+ C(c83d3c5f4e5f0320), C(694e7adeb2bf32e5), C(7ad09538a3da27f5), >+ C(2b5c18f934aa5303), C(c83d3c5f4e5f0320), C(694e7adeb2bf32e5), >+ C(7ad09538a3da27f5), C(2b5c18f934aa5303), C(c4dad7703d34326e), >+ C(825569e2bcdc6a25), C(b83d267709ca900d), C(44ed05151f5d74e6), >+ C(ee90cf33)}, >+ {C(c930841d1d88684f), C(5eb66eb18b7f9672), C(e455d413008a2546), >+ C(bc271bc0df14d647), C(b071100a9ff2edbb), C(2b1a4c1cc31a119a), >+ C(b5d7caa1bd946cef), C(bc271bc0df14d647), C(b071100a9ff2edbb), >+ C(2b1a4c1cc31a119a), 
C(b5d7caa1bd946cef), C(e02623ae10f4aadd), >+ C(d79f600389cd06fd), C(1e8da7965303e62b), C(86f50e10eeab0925), >+ C(4305c3ce)}, >+ {C(94dc6971e3cf071a), C(994c7003b73b2b34), C(ea16e85978694e5), >+ C(336c1b59a1fc19f6), C(c173acaecc471305), C(db1267d24f3f3f36), >+ C(e9a5ee98627a6e78), C(336c1b59a1fc19f6), C(c173acaecc471305), >+ C(db1267d24f3f3f36), C(e9a5ee98627a6e78), C(718f334204305ae5), >+ C(e3b53c148f98d22c), C(a184012df848926), C(6e96386127d51183), C(4b3a1d76)}, >+ {C(7fc98006e25cac9), C(77fee0484cda86a7), C(376ec3d447060456), >+ C(84064a6dcf916340), C(fbf55a26790e0ebb), C(2e7f84151c31a5c2), >+ C(9f7f6d76b950f9bf), C(84064a6dcf916340), C(fbf55a26790e0ebb), >+ C(2e7f84151c31a5c2), C(9f7f6d76b950f9bf), C(125e094fbee2b146), >+ C(5706aa72b2eef7c2), C(1c4a2daa905ee66e), C(83d48029b5451694), >+ C(a8bb6d80)}, >+ {C(bd781c4454103f6), C(612197322f49c931), C(b9cf17fd7e5462d5), >+ C(e38e526cd3324364), C(85f2b63a5b5e840a), C(485d7cef5aaadd87), >+ C(d2b837a462f6db6d), C(e38e526cd3324364), C(85f2b63a5b5e840a), >+ C(485d7cef5aaadd87), C(d2b837a462f6db6d), C(3e41cef031520d9a), >+ C(82df73902d7f67e), C(3ba6fd54c15257cb), C(22f91f079be42d40), C(1f9fa607)}, >+ {C(da60e6b14479f9df), C(3bdccf69ece16792), C(18ebf45c4fecfdc9), >+ C(16818ee9d38c6664), C(5519fa9a1e35a329), C(cbd0001e4b08ed8), >+ C(41a965e37a0c731b), C(16818ee9d38c6664), C(5519fa9a1e35a329), >+ C(cbd0001e4b08ed8), C(41a965e37a0c731b), C(66e7b5dcca1ca28f), >+ C(963b2d993614347d), C(9b6fc6f41d411106), C(aaaecaccf7848c0c), >+ C(8d0e4ed2)}, >+ {C(4ca56a348b6c4d3), C(60618537c3872514), C(2fbb9f0e65871b09), >+ C(30278016830ddd43), C(f046646d9012e074), C(c62a5804f6e7c9da), >+ C(98d51f5830e2bc1e), C(30278016830ddd43), C(f046646d9012e074), >+ C(c62a5804f6e7c9da), C(98d51f5830e2bc1e), C(7b2cbe5d37e3f29e), >+ C(7b8c3ed50bda4aa0), C(3ea60cc24639e038), C(f7706de9fb0b5801), >+ C(1bf31347)}, >+ {C(ebd22d4b70946401), C(6863602bf7139017), C(c0b1ac4e11b00666), >+ C(7d2782b82bd494b6), C(97159ba1c26b304b), C(42b3b0fd431b2ac2), >+ 
C(faa81f82691c830c), C(7d2782b82bd494b6), C(97159ba1c26b304b), >+ C(42b3b0fd431b2ac2), C(faa81f82691c830c), C(7cc6449234c7e185), >+ C(aeaa6fa643ca86a5), C(1412db1c0f2e0133), C(4df2fe3e4072934f), >+ C(1ae3fc5b)}, >+ {C(3cc4693d6cbcb0c), C(501689ea1c70ffa), C(10a4353e9c89e364), >+ C(58c8aba7475e2d95), C(3e2f291698c9427a), C(e8710d19c9de9e41), >+ C(65dda22eb04cf953), C(58c8aba7475e2d95), C(3e2f291698c9427a), >+ C(e8710d19c9de9e41), C(65dda22eb04cf953), C(d7729c48c250cffa), >+ C(ef76162b2ddfba4b), C(52371e17f4d51f6d), C(ddd002112ff0c833), >+ C(459c3930)}, >+ {C(38908e43f7ba5ef0), C(1ab035d4e7781e76), C(41d133e8c0a68ff7), >+ C(d1090893afaab8bc), C(96c4fe6922772807), C(4522426c2b4205eb), >+ C(efad99a1262e7e0d), C(d1090893afaab8bc), C(96c4fe6922772807), >+ C(4522426c2b4205eb), C(efad99a1262e7e0d), C(c7696029abdb465e), >+ C(4e18eaf03d517651), C(d006bced54c86ac8), C(4330326d1021860c), >+ C(e00c4184)}, >+ {C(34983ccc6aa40205), C(21802cad34e72bc4), C(1943e8fb3c17bb8), >+ C(fc947167f69c0da5), C(ae79cfdb91b6f6c1), C(7b251d04c26cbda3), >+ C(128a33a79060d25e), C(fc947167f69c0da5), C(ae79cfdb91b6f6c1), >+ C(7b251d04c26cbda3), C(128a33a79060d25e), C(1eca842dbfe018dd), >+ C(50a4cd2ee0ba9c63), C(c2f5c97d8399682f), C(3f929fc7cbe8ecbb), >+ C(ffc7a781)}, >+ {C(86215c45dcac9905), C(ea546afe851cae4b), C(d85b6457e489e374), >+ C(b7609c8e70386d66), C(36e6ccc278d1636d), C(2f873307c08e6a1c), >+ C(10f252a758505289), C(b7609c8e70386d66), C(36e6ccc278d1636d), >+ C(2f873307c08e6a1c), C(10f252a758505289), C(c8977646e81ab4b6), >+ C(8017b745cd80213b), C(960687db359bea0), C(ef4a470660799488), C(6a125480)}, >+ {C(420fc255c38db175), C(d503cd0f3c1208d1), C(d4684e74c825a0bc), >+ C(4c10537443152f3d), C(720451d3c895e25d), C(aff60c4d11f513fd), >+ C(881e8d6d2d5fb953), C(4c10537443152f3d), C(720451d3c895e25d), >+ C(aff60c4d11f513fd), C(881e8d6d2d5fb953), C(9dec034a043f1f55), >+ C(e27a0c22e7bfb39d), C(2220b959128324), C(53240272152dbd8b), C(88a1512b)}, >+ {C(1d7a31f5bc8fe2f9), C(4763991092dcf836), 
C(ed695f55b97416f4), >+ C(f265edb0c1c411d7), C(30e1e9ec5262b7e6), C(c2c3ba061ce7957a), >+ C(d975f93b89a16409), C(f265edb0c1c411d7), C(30e1e9ec5262b7e6), >+ C(c2c3ba061ce7957a), C(d975f93b89a16409), C(e9d703123f43450a), >+ C(41383fedfed67c82), C(6e9f43ecbbbd6004), C(c7ccd23a24e77b8), C(549bbbe5)}, >+ {C(94129a84c376a26e), C(c245e859dc231933), C(1b8f74fecf917453), >+ C(e9369d2e9007e74b), C(b1375915d1136052), C(926c2021fe1d2351), >+ C(1d943addaaa2e7e6), C(e9369d2e9007e74b), C(b1375915d1136052), >+ C(926c2021fe1d2351), C(1d943addaaa2e7e6), C(f5f515869c246738), >+ C(7e309cd0e1c0f2a0), C(153c3c36cf523e3b), C(4931c66872ea6758), >+ C(c133d38c)}, >+ {C(1d3a9809dab05c8d), C(adddeb4f71c93e8), C(ef342eb36631edb), >+ C(301d7a61c4b3dbca), C(861336c3f0552d61), C(12c6db947471300f), >+ C(a679ef0ed761deb9), C(301d7a61c4b3dbca), C(861336c3f0552d61), >+ C(12c6db947471300f), C(a679ef0ed761deb9), C(5f713b720efcd147), >+ C(37ac330a333aa6b), C(3309dc9ec1616eef), C(52301d7a908026b5), C(fcace348)}, >+ {C(90fa3ccbd60848da), C(dfa6e0595b569e11), C(e585d067a1f5135d), >+ C(6cef866ec295abea), C(c486c0d9214beb2d), C(d6e490944d5fe100), >+ C(59df3175d72c9f38), C(6cef866ec295abea), C(c486c0d9214beb2d), >+ C(d6e490944d5fe100), C(59df3175d72c9f38), C(3f23aeb4c04d1443), >+ C(9bf0515cd8d24770), C(958554f60ccaade2), C(5182863c90132fe8), >+ C(ed7b6f9a)}, >+ {C(2dbb4fc71b554514), C(9650e04b86be0f82), C(60f2304fba9274d3), >+ C(fcfb9443e997cab), C(f13310d96dec2772), C(709cad2045251af2), >+ C(afd0d30cc6376dad), C(fcfb9443e997cab), C(f13310d96dec2772), >+ C(709cad2045251af2), C(afd0d30cc6376dad), C(59d4bed30d550d0d), >+ C(58006d4e22d8aad1), C(eee12d2362d1f13b), C(35cf1d7faaf1d228), >+ C(6d907dda)}, >+ {C(b98bf4274d18374a), C(1b669fd4c7f9a19a), C(b1f5972b88ba2b7a), >+ C(73119c99e6d508be), C(5d4036a187735385), C(8fa66e192fd83831), >+ C(2abf64b6b592ed57), C(73119c99e6d508be), C(5d4036a187735385), >+ C(8fa66e192fd83831), C(2abf64b6b592ed57), C(d4501f95dd84b08c), >+ C(bf1552439c8bea02), C(4f56fe753ba7e0ba), 
C(4ca8d35cc058cfcd), >+ C(7a4d48d5)}, >+ {C(d6781d0b5e18eb68), C(b992913cae09b533), C(58f6021caaee3a40), >+ C(aafcb77497b5a20b), C(411819e5e79b77a3), C(bd779579c51c77ce), >+ C(58d11f5dcf5d075d), C(aafcb77497b5a20b), C(411819e5e79b77a3), >+ C(bd779579c51c77ce), C(58d11f5dcf5d075d), C(9eae76cde1cb4233), >+ C(32fe25a9bf657970), C(1c0c807948edb06a), C(b8f29a3dfaee254d), >+ C(e686f3db)}, >+ {C(226651cf18f4884c), C(595052a874f0f51c), C(c9b75162b23bab42), >+ C(3f44f873be4812ec), C(427662c1dbfaa7b2), C(a207ff9638fb6558), >+ C(a738d919e45f550f), C(3f44f873be4812ec), C(427662c1dbfaa7b2), >+ C(a207ff9638fb6558), C(a738d919e45f550f), C(cb186ea05717e7d6), >+ C(1ca7d68a5871fdc1), C(5d4c119ea8ef3750), C(72b6a10fa2ff9406), C(cce7c55)}, >+ {C(a734fb047d3162d6), C(e523170d240ba3a5), C(125a6972809730e8), >+ C(d396a297799c24a1), C(8fee992e3069bad5), C(2e3a01b0697ccf57), >+ C(ee9c7390bd901cfa), C(d396a297799c24a1), C(8fee992e3069bad5), >+ C(2e3a01b0697ccf57), C(ee9c7390bd901cfa), C(56f2d9da0af28af2), >+ C(3fdd37b2fe8437cb), C(3d13eeeb60d6aec0), C(2432ae62e800a5ce), C(f58b96b)}, >+ {C(c6df6364a24f75a3), C(c294e2c84c4f5df8), C(a88df65c6a89313b), >+ C(895fe8443183da74), C(c7f2f6f895a67334), C(a0d6b6a506691d31), >+ C(24f51712b459a9f0), C(895fe8443183da74), C(c7f2f6f895a67334), >+ C(a0d6b6a506691d31), C(24f51712b459a9f0), C(173a699481b9e088), >+ C(1dee9b77bcbf45d3), C(32b98a646a8667d0), C(3adcd4ee28f42a0e), >+ C(1bbf6f60)}, >+ {C(d8d1364c1fbcd10), C(2d7cc7f54832deaa), C(4e22c876a7c57625), >+ C(a3d5d1137d30c4bd), C(1e7d706a49bdfb9e), C(c63282b20ad86db2), >+ C(aec97fa07916bfd6), C(a3d5d1137d30c4bd), C(1e7d706a49bdfb9e), >+ C(c63282b20ad86db2), C(aec97fa07916bfd6), C(7c9ba3e52d44f73e), >+ C(af62fd245811185d), C(8a9d2dacd8737652), C(bd2cce277d5fbec0), >+ C(ce5e0cc2)}, >+ {C(aae06f9146db885f), C(3598736441e280d9), C(fba339b117083e55), >+ C(b22bf08d9f8aecf7), C(c182730de337b922), C(2b9adc87a0450a46), >+ C(192c29a9cfc00aad), C(b22bf08d9f8aecf7), C(c182730de337b922), >+ C(2b9adc87a0450a46), 
C(192c29a9cfc00aad), C(9fd733f1d84a59d9), >+ C(d86bd5c9839ace15), C(af20b57303172876), C(9f63cb7161b5364c), >+ C(584cfd6f)}, >+ {C(8955ef07631e3bcc), C(7d70965ea3926f83), C(39aed4134f8b2db6), >+ C(882efc2561715a9c), C(ef8132a18a540221), C(b20a3c87a8c257c1), >+ C(f541b8628fad6c23), C(882efc2561715a9c), C(ef8132a18a540221), >+ C(b20a3c87a8c257c1), C(f541b8628fad6c23), C(9552aed57a6e0467), >+ C(4d9fdd56867611a7), C(c330279bf23b9eab), C(44dbbaea2fcb8eba), >+ C(8f9bbc33)}, >+ {C(ad611c609cfbe412), C(d3c00b18bf253877), C(90b2172e1f3d0bfd), >+ C(371a98b2cb084883), C(33a2886ee9f00663), C(be9568818ed6e6bd), >+ C(f244a0fa2673469a), C(371a98b2cb084883), C(33a2886ee9f00663), >+ C(be9568818ed6e6bd), C(f244a0fa2673469a), C(b447050bd3e559e9), >+ C(d3b695dae7a13383), C(ded0bb65be471188), C(ca3c7a2b78922cae), >+ C(d7640d95)}, >+ {C(d5339adc295d5d69), C(b633cc1dcb8b586a), C(ee84184cf5b1aeaf), >+ C(89f3aab99afbd636), C(f420e004f8148b9a), C(6818073faa797c7c), >+ C(dd3b4e21cbbf42ca), C(89f3aab99afbd636), C(f420e004f8148b9a), >+ C(6818073faa797c7c), C(dd3b4e21cbbf42ca), C(6a2b7db261164844), >+ C(cbead63d1895852a), C(93d37e1eae05e2f9), C(5d06db2703fbc3ae), C(3d12a2b)}, >+ {C(40d0aeff521375a8), C(77ba1ad7ecebd506), C(547c6f1a7d9df427), >+ C(21c2be098327f49b), C(7e035065ac7bbef5), C(6d7348e63023fb35), >+ C(9d427dc1b67c3830), C(21c2be098327f49b), C(7e035065ac7bbef5), >+ C(6d7348e63023fb35), C(9d427dc1b67c3830), C(4e3d018a43858341), >+ C(cf924bb44d6b43c5), C(4618b6a26e3446ae), C(54d3013fac3ed469), >+ C(aaeafed0)}, >+ {C(8b2d54ae1a3df769), C(11e7adaee3216679), C(3483781efc563e03), >+ C(9d097dd3152ab107), C(51e21d24126e8563), C(cba56cac884a1354), >+ C(39abb1b595f0a977), C(9d097dd3152ab107), C(51e21d24126e8563), >+ C(cba56cac884a1354), C(39abb1b595f0a977), C(81e6dd1c1109848f), >+ C(1644b209826d7b15), C(6ac67e4e4b4812f0), C(b3a9f5622c935bf7), >+ C(95b9b814)}, >+ {C(99c175819b4eae28), C(932e8ff9f7a40043), C(ec78dcab07ca9f7c), >+ C(c1a78b82ba815b74), C(458cbdfc82eb322a), C(17f4a192376ed8d7), >+ 
C(6f9e92968bc8ccef), C(c1a78b82ba815b74), C(458cbdfc82eb322a), >+ C(17f4a192376ed8d7), C(6f9e92968bc8ccef), C(93e098c333b39905), >+ C(d59b1cace44b7fdc), C(f7a64ed78c64c7c5), C(7c6eca5dd87ec1ce), >+ C(45fbe66e)}, >+ {C(2a418335779b82fc), C(af0295987849a76b), C(c12bc5ff0213f46e), >+ C(5aeead8d6cb25bb9), C(739315f7743ec3ff), C(9ab48d27111d2dcc), >+ C(5b87bd35a975929b), C(5aeead8d6cb25bb9), C(739315f7743ec3ff), >+ C(9ab48d27111d2dcc), C(5b87bd35a975929b), C(c3dd8d6d95a46bb3), >+ C(7bf9093215a4f483), C(cb557d6ed84285bd), C(daf58422f261fdb5), >+ C(b4baa7a8)}, >+ {C(3b1fc6a3d279e67d), C(70ea1e49c226396), C(25505adcf104697c), >+ C(ba1ffba29f0367aa), C(a20bec1dd15a8b6c), C(e9bf61d2dab0f774), >+ C(f4f35bf5870a049c), C(ba1ffba29f0367aa), C(a20bec1dd15a8b6c), >+ C(e9bf61d2dab0f774), C(f4f35bf5870a049c), C(26787efa5b92385), >+ C(3d9533590ce30b59), C(a4da3e40530a01d4), C(6395deaefb70067c), >+ C(83e962fe)}, >+ {C(d97eacdf10f1c3c9), C(b54f4654043a36e0), C(b128f6eb09d1234), >+ C(d8ad7ec84a9c9aa2), C(e256cffed11f69e6), C(2cf65e4958ad5bda), >+ C(cfbf9b03245989a7), C(d8ad7ec84a9c9aa2), C(e256cffed11f69e6), >+ C(2cf65e4958ad5bda), C(cfbf9b03245989a7), C(9fa51e6686cf4444), >+ C(9425c117a34609d5), C(b25f7e2c6f30e96), C(ea5477c3f2b5afd1), C(aac3531c)}, >+ {C(293a5c1c4e203cd4), C(6b3329f1c130cefe), C(f2e32f8ec76aac91), >+ C(361e0a62c8187bff), C(6089971bb84d7133), C(93df7741588dd50b), >+ C(c2a9b6abcd1d80b1), C(361e0a62c8187bff), C(6089971bb84d7133), >+ C(93df7741588dd50b), C(c2a9b6abcd1d80b1), C(4d2f86869d79bc59), >+ C(85cd24d8aa570ff), C(b0dcf6ef0e94bbb5), C(2037c69aa7a78421), C(2b1db7cc)}, >+ {C(4290e018ffaedde7), C(a14948545418eb5e), C(72d851b202284636), >+ C(4ec02f3d2f2b23f2), C(ab3580708aa7c339), C(cdce066fbab3f65), >+ C(d8ed3ecf3c7647b9), C(4ec02f3d2f2b23f2), C(ab3580708aa7c339), >+ C(cdce066fbab3f65), C(d8ed3ecf3c7647b9), C(6d2204b3e31f344a), >+ C(61a4d87f80ee61d7), C(446c43dbed4b728f), C(73130ac94f58747e), >+ C(cf00cd31)}, >+ {C(f919a59cbde8bf2f), C(a56d04203b2dc5a5), 
C(38b06753ac871e48), >+ C(c2c9fc637dbdfcfa), C(292ab8306d149d75), C(7f436b874b9ffc07), >+ C(a5b56b0129218b80), C(c2c9fc637dbdfcfa), C(292ab8306d149d75), >+ C(7f436b874b9ffc07), C(a5b56b0129218b80), C(9188f7bdc47ec050), >+ C(cfe9345d03a15ade), C(40b520fb2750c49e), C(c2e83d343968af2e), >+ C(7d3c43b8)}, >+ {C(1d70a3f5521d7fa4), C(fb97b3fdc5891965), C(299d49bbbe3535af), >+ C(e1a8286a7d67946e), C(52bd956f047b298), C(cbd74332dd4204ac), >+ C(12b5be7752721976), C(e1a8286a7d67946e), C(52bd956f047b298), >+ C(cbd74332dd4204ac), C(12b5be7752721976), C(278426e27f6204b6), >+ C(932ca7a7cd610181), C(41647321f0a5914d), C(48f4aa61a0ae80db), >+ C(cbd5fac6)}, >+ {C(6af98d7b656d0d7c), C(d2e99ae96d6b5c0c), C(f63bd1603ef80627), >+ C(bde51033ac0413f8), C(bc0272f691aec629), C(6204332651bebc44), >+ C(1cbf00de026ea9bd), C(bde51033ac0413f8), C(bc0272f691aec629), >+ C(6204332651bebc44), C(1cbf00de026ea9bd), C(b9c7ed6a75f3ff1e), >+ C(7e310b76a5808e4f), C(acbbd1aad5531885), C(fc245f2473adeb9c), >+ C(76d0fec4)}, >+ {C(395b7a8adb96ab75), C(582df7165b20f4a), C(e52bd30e9ff657f9), >+ C(6c71064996cbec8b), C(352c535edeefcb89), C(ac7f0aba15cd5ecd), >+ C(3aba1ca8353e5c60), C(6c71064996cbec8b), C(352c535edeefcb89), >+ C(ac7f0aba15cd5ecd), C(3aba1ca8353e5c60), C(5c30a288a80ce646), >+ C(c2940488b6617674), C(925f8cc66b370575), C(aa65d1283b9bb0ef), >+ C(405e3402)}, >+ {C(3822dd82c7df012f), C(b9029b40bd9f122b), C(fd25b988468266c4), >+ C(43e47bd5bab1e0ef), C(4a71f363421f282f), C(880b2f32a2b4e289), >+ C(1299d4eda9d3eadf), C(43e47bd5bab1e0ef), C(4a71f363421f282f), >+ C(880b2f32a2b4e289), C(1299d4eda9d3eadf), C(d713a40226f5564), >+ C(4d8d34fedc769406), C(a85001b29cd9cac3), C(cae92352a41fd2b0), >+ C(c732c481)}, >+ {C(79f7efe4a80b951a), C(dd3a3fddfc6c9c41), C(ab4c812f9e27aa40), >+ C(832954ec9d0de333), C(94c390aa9bcb6b8a), C(f3b32afdc1f04f82), >+ C(d229c3b72e4b9a74), C(832954ec9d0de333), C(94c390aa9bcb6b8a), >+ C(f3b32afdc1f04f82), C(d229c3b72e4b9a74), C(1d11860d7ed624a6), >+ C(cadee20b3441b984), C(75307079bf306f7b), 
C(87902aa3b9753ba4), >+ C(a8d123c9)}, >+ {C(ae6e59f5f055921a), C(e9d9b7bf68e82), C(5ce4e4a5b269cc59), >+ C(4960111789727567), C(149b8a37c7125ab6), C(78c7a13ab9749382), >+ C(1c61131260ca151a), C(4960111789727567), C(149b8a37c7125ab6), >+ C(78c7a13ab9749382), C(1c61131260ca151a), C(1e93276b35c309a0), >+ C(2618f56230acde58), C(af61130a18e4febf), C(7145deb18e89befe), >+ C(1e80ad7d)}, >+ {C(8959dbbf07387d36), C(b4658afce48ea35d), C(8f3f82437d8cb8d6), >+ C(6566d74954986ba5), C(99d5235cc82519a7), C(257a23805c2d825), >+ C(ad75ccb968e93403), C(6566d74954986ba5), C(99d5235cc82519a7), >+ C(257a23805c2d825), C(ad75ccb968e93403), C(b45bd4cf78e11f7f), >+ C(80c5536bdc487983), C(a4fd76ecbf018c8a), C(3b9dac78a7a70d43), >+ C(52aeb863)}, >+ {C(4739613234278a49), C(99ea5bcd340bf663), C(258640912e712b12), >+ C(c8a2827404991402), C(7ee5e78550f02675), C(2ec53952db5ac662), >+ C(1526405a9df6794b), C(c8a2827404991402), C(7ee5e78550f02675), >+ C(2ec53952db5ac662), C(1526405a9df6794b), C(eddc6271170c5e1f), >+ C(f5a85f986001d9d6), C(95427c677bf58d58), C(53ed666dfa85cb29), >+ C(ef7c0c18)}, >+ {C(420e6c926bc54841), C(96dbbf6f4e7c75cd), C(d8d40fa70c3c67bb), >+ C(3edbc10e4bfee91b), C(f0d681304c28ef68), C(77ea602029aaaf9c), >+ C(90f070bd24c8483c), C(3edbc10e4bfee91b), C(f0d681304c28ef68), >+ C(77ea602029aaaf9c), C(90f070bd24c8483c), C(28bc8e41e08ceb86), >+ C(1eb56e48a65691ef), C(9fea5301c9202f0e), C(3fcb65091aa9f135), >+ C(b6ad4b68)}, >+ {C(c8601bab561bc1b7), C(72b26272a0ff869a), C(56fdfc986d6bc3c4), >+ C(83707730cad725d4), C(c9ca88c3a779674a), C(e1c696fbbd9aa933), >+ C(723f3baab1c17a45), C(83707730cad725d4), C(c9ca88c3a779674a), >+ C(e1c696fbbd9aa933), C(723f3baab1c17a45), C(f82abc7a1d851682), >+ C(30683836818e857d), C(78bfa3e89a5ab23f), C(6928234482b31817), >+ C(c1e46b17)}, >+ {C(b2d294931a0e20eb), C(284ffd9a0815bc38), C(1f8a103aac9bbe6), >+ C(1ef8e98e1ea57269), C(5971116272f45a8b), C(187ad68ce95d8eac), >+ C(e94e93ee4e8ecaa6), C(1ef8e98e1ea57269), C(5971116272f45a8b), >+ C(187ad68ce95d8eac), 
C(e94e93ee4e8ecaa6), C(a0ff2a58611838b5), >+ C(b01e03849bfbae6f), C(d081e202e28ea3ab), C(51836bcee762bf13), >+ C(57b8df25)}, >+ {C(7966f53c37b6c6d7), C(8e6abcfb3aa2b88f), C(7f2e5e0724e5f345), >+ C(3eeb60c3f5f8143d), C(a25aec05c422a24f), C(b026b03ad3cca4db), >+ C(e6e030028cc02a02), C(3eeb60c3f5f8143d), C(a25aec05c422a24f), >+ C(b026b03ad3cca4db), C(e6e030028cc02a02), C(16fe679338b34bfc), >+ C(c1be385b5c8a9de4), C(65af5df6567530eb), C(ed3b303df4dc6335), >+ C(e9fa36d6)}, >+ {C(be9bb0abd03b7368), C(13bca93a3031be55), C(e864f4f52b55b472), >+ C(36a8d13a2cbb0939), C(254ac73907413230), C(73520d1522315a70), >+ C(8c9fdb5cf1e1a507), C(36a8d13a2cbb0939), C(254ac73907413230), >+ C(73520d1522315a70), C(8c9fdb5cf1e1a507), C(b3640570b926886), >+ C(fba2344ee87f7bab), C(de57341ab448df05), C(385612ee094fa977), >+ C(8f8daefc)}, >+ {C(a08d128c5f1649be), C(a8166c3dbbe19aad), C(cb9f914f829ec62c), >+ C(5b2b7ca856fad1c3), C(8093022d682e375d), C(ea5d163ba7ea231f), >+ C(d6181d012c0de641), C(5b2b7ca856fad1c3), C(8093022d682e375d), >+ C(ea5d163ba7ea231f), C(d6181d012c0de641), C(e7d40d0ab8b08159), >+ C(2e82320f51b3a67e), C(27c2e356ea0b63a3), C(58842d01a2b1d077), C(6e1bb7e)}, >+ {C(7c386f0ffe0465ac), C(530419c9d843dbf3), C(7450e3a4f72b8d8c), >+ C(48b218e3b721810d), C(d3757ac8609bc7fc), C(111ba02a88aefc8), >+ C(e86343137d3bfc2a), C(48b218e3b721810d), C(d3757ac8609bc7fc), >+ C(111ba02a88aefc8), C(e86343137d3bfc2a), C(44ad26b51661b507), >+ C(db1268670274f51e), C(62a5e75beae875f3), C(e266e7a44c5f28c6), >+ C(fd0076f0)}, >+ {C(bb362094e7ef4f8), C(ff3c2a48966f9725), C(55152803acd4a7fe), >+ C(15747d8c505ffd00), C(438a15f391312cd6), C(e46ca62c26d821f5), >+ C(be78d74c9f79cb44), C(15747d8c505ffd00), C(438a15f391312cd6), >+ C(e46ca62c26d821f5), C(be78d74c9f79cb44), C(a8aa19f3aa59f09a), >+ C(effb3cddab2c9267), C(d78e41ad97cb16a5), C(ace6821513527d32), >+ C(899b17b6)}, >+ {C(cd80dea24321eea4), C(52b4fdc8130c2b15), C(f3ea100b154bfb82), >+ C(d9ccef1d4be46988), C(5ede0c4e383a5e66), C(da69683716a54d1e), >+ 
C(bfc3fdf02d242d24), C(d9ccef1d4be46988), C(5ede0c4e383a5e66), >+ C(da69683716a54d1e), C(bfc3fdf02d242d24), C(20ed30274651b3f5), >+ C(4c659824169e86c6), C(637226dae5b52a0e), C(7e050dbd1c71dc7f), >+ C(e3e84e31)}, >+ {C(d599a04125372c3a), C(313136c56a56f363), C(1e993c3677625832), >+ C(2870a99c76a587a4), C(99f74cc0b182dda4), C(8a5e895b2f0ca7b6), >+ C(3d78882d5e0bb1dc), C(2870a99c76a587a4), C(99f74cc0b182dda4), >+ C(8a5e895b2f0ca7b6), C(3d78882d5e0bb1dc), C(f466123732a3e25e), >+ C(aca5e59716a40e50), C(261d2e7383d0e686), C(ce9362d6a42c15a7), >+ C(eef79b6b)}, >+ {C(dbbf541e9dfda0a), C(1479fceb6db4f844), C(31ab576b59062534), >+ C(a3335c417687cf3a), C(92ff114ac45cda75), C(c3b8a627384f13b5), >+ C(c4f25de33de8b3f7), C(a3335c417687cf3a), C(92ff114ac45cda75), >+ C(c3b8a627384f13b5), C(c4f25de33de8b3f7), C(eacbf520578c5964), >+ C(4cb19c5ab24f3215), C(e7d8a6f67f0c6e7), C(325c2413eb770ada), C(868e3315)}, >+ {C(c2ee3288be4fe2bf), C(c65d2f5ddf32b92), C(af6ecdf121ba5485), >+ C(c7cd48f7abf1fe59), C(ce600656ace6f53a), C(8a94a4381b108b34), >+ C(f9d1276c64bf59fb), C(c7cd48f7abf1fe59), C(ce600656ace6f53a), >+ C(8a94a4381b108b34), C(f9d1276c64bf59fb), C(219ce70ff5a112a5), >+ C(e6026c576e2d28d7), C(b8e467f25015e3a6), C(950cb904f37af710), >+ C(4639a426)}, >+ {C(d86603ced1ed4730), C(f9de718aaada7709), C(db8b9755194c6535), >+ C(d803e1eead47604c), C(ad00f7611970a71b), C(bc50036b16ce71f5), >+ C(afba96210a2ca7d6), C(d803e1eead47604c), C(ad00f7611970a71b), >+ C(bc50036b16ce71f5), C(afba96210a2ca7d6), C(28f7a7be1d6765f0), >+ C(97bd888b93938c68), C(6ad41d1b407ded49), C(b9bfec098dc543e4), >+ C(f3213646)}, >+ {C(915263c671b28809), C(a815378e7ad762fd), C(abec6dc9b669f559), >+ C(d17c928c5342477f), C(745130b795254ad5), C(8c5db926fe88f8ba), >+ C(742a95c953e6d974), C(d17c928c5342477f), C(745130b795254ad5), >+ C(8c5db926fe88f8ba), C(742a95c953e6d974), C(279db8057b5d3e96), >+ C(98168411565b4ec4), C(50a72c54fa1125fa), C(27766a635db73638), >+ C(17f148e9)}, >+ {C(2b67cdd38c307a5e), C(cb1d45bb5c9fe1c), 
C(800baf2a02ec18ad), >+ C(6531c1fe32bcb417), C(8c970d8df8cdbeb4), C(917ba5fc67e72b40), >+ C(4b65e4e263e0a426), C(6531c1fe32bcb417), C(8c970d8df8cdbeb4), >+ C(917ba5fc67e72b40), C(4b65e4e263e0a426), C(e0de33ce88a8b3a9), >+ C(f8ef98a437e16b08), C(a5162c0c7c5f7b62), C(dbdac43361b2b881), >+ C(bfd94880)}, >+ {C(2d107419073b9cd0), C(a96db0740cef8f54), C(ec41ee91b3ecdc1b), >+ C(ffe319654c8e7ebc), C(6a67b8f13ead5a72), C(6dd10a34f80d532f), >+ C(6e9cfaece9fbca4), C(ffe319654c8e7ebc), C(6a67b8f13ead5a72), >+ C(6dd10a34f80d532f), C(6e9cfaece9fbca4), C(b4468eb6a30aa7e9), >+ C(e87995bee483222a), C(d036c2c90c609391), C(853306e82fa32247), >+ C(bb1fa7f3)}, >+ {C(f3e9487ec0e26dfc), C(1ab1f63224e837fa), C(119983bb5a8125d8), >+ C(8950cfcf4bdf622c), C(8847dca82efeef2f), C(646b75b026708169), >+ C(21cab4b1687bd8b), C(8950cfcf4bdf622c), C(8847dca82efeef2f), >+ C(646b75b026708169), C(21cab4b1687bd8b), C(243b489a9eae6231), >+ C(5f3e634c4b779876), C(ff8abd1548eaf646), C(c7962f5f0151914b), C(88816b1)}, >+ {C(1160987c8fe86f7d), C(879e6db1481eb91b), C(d7dcb802bfe6885d), >+ C(14453b5cc3d82396), C(4ef700c33ed278bc), C(1639c72ffc00d12e), >+ C(fb140ee6155f700d), C(14453b5cc3d82396), C(4ef700c33ed278bc), >+ C(1639c72ffc00d12e), C(fb140ee6155f700d), C(2e6b5c96a6620862), >+ C(a1f136998cbe19c), C(74e058a3b6c5a712), C(93dcf6bd33928b17), C(5c2faeb3)}, >+ {C(eab8112c560b967b), C(97f550b58e89dbae), C(846ed506d304051f), >+ C(276aa37744b5a028), C(8c10800ee90ea573), C(e6e57d2b33a1e0b7), >+ C(91f83563cd3b9dda), C(276aa37744b5a028), C(8c10800ee90ea573), >+ C(e6e57d2b33a1e0b7), C(91f83563cd3b9dda), C(afbb4739570738a1), >+ C(440ba98da5d8f69), C(fde4e9b0eda20350), C(e67dfa5a2138fa1), C(51b5fc6f)}, >+ {C(1addcf0386d35351), C(b5f436561f8f1484), C(85d38e22181c9bb1), >+ C(ff5c03f003c1fefe), C(e1098670afe7ff6), C(ea445030cf86de19), >+ C(f155c68b5c2967f8), C(ff5c03f003c1fefe), C(e1098670afe7ff6), >+ C(ea445030cf86de19), C(f155c68b5c2967f8), C(95d31b145dbb2e9e), >+ C(914fe1ca3deb3265), C(6066020b1358ccc1), 
C(c74bb7e2dee15036), >+ C(33d94752)}, >+ {C(d445ba84bf803e09), C(1216c2497038f804), C(2293216ea2237207), >+ C(e2164451c651adfb), C(b2534e65477f9823), C(4d70691a69671e34), >+ C(15be4963dbde8143), C(e2164451c651adfb), C(b2534e65477f9823), >+ C(4d70691a69671e34), C(15be4963dbde8143), C(762e75c406c5e9a3), >+ C(7b7579f7e0356841), C(480533eb066dfce5), C(90ae14ea6bfeb4ae), >+ C(b0c92948)}, >+ {C(37235a096a8be435), C(d9b73130493589c2), C(3b1024f59378d3be), >+ C(ad159f542d81f04e), C(49626a97a946096), C(d8d3998bf09fd304), >+ C(d127a411eae69459), C(ad159f542d81f04e), C(49626a97a946096), >+ C(d8d3998bf09fd304), C(d127a411eae69459), C(8f3253c4eb785a7b), >+ C(4049062f37e62397), C(b9fa04d3b670e5c1), C(1211a7967ac9350f), >+ C(c7171590)}, >+ {C(763ad6ea2fe1c99d), C(cf7af5368ac1e26b), C(4d5e451b3bb8d3d4), >+ C(3712eb913d04e2f2), C(2f9500d319c84d89), C(4ac6eb21a8cf06f9), >+ C(7d1917afcde42744), C(3712eb913d04e2f2), C(2f9500d319c84d89), >+ C(4ac6eb21a8cf06f9), C(7d1917afcde42744), C(6b58604b5dd10903), >+ C(c4288dfbc1e319fc), C(230f75ca96817c6e), C(8894cba3b763756c), >+ C(240a67fb)}, >+ {C(ea627fc84cd1b857), C(85e372494520071f), C(69ec61800845780b), >+ C(a3c1c5ca1b0367), C(eb6933997272bb3d), C(76a72cb62692a655), >+ C(140bb5531edf756e), C(a3c1c5ca1b0367), C(eb6933997272bb3d), >+ C(76a72cb62692a655), C(140bb5531edf756e), C(8d0d8067d1c925f4), >+ C(7b3fa56d8d77a10c), C(2bd00287b0946d88), C(f08c8e4bd65b8970), >+ C(e1843cd5)}, >+ {C(1f2ffd79f2cdc0c8), C(726a1bc31b337aaa), C(678b7f275ef96434), >+ C(5aa82bfaa99d3978), C(c18f96cade5ce18d), C(38404491f9e34c03), >+ C(891fb8926ba0418c), C(5aa82bfaa99d3978), C(c18f96cade5ce18d), >+ C(38404491f9e34c03), C(891fb8926ba0418c), C(e5f69a6398114c15), >+ C(7b8ded3623bc6b1d), C(2f3e5c5da5ff70e8), C(1ab142addea6a9ec), >+ C(fda1452b)}, >+ {C(39a9e146ec4b3210), C(f63f75802a78b1ac), C(e2e22539c94741c3), >+ C(8b305d532e61226e), C(caeae80da2ea2e), C(88a6289a76ac684e), >+ C(8ce5b5f9df1cbd85), C(8b305d532e61226e), C(caeae80da2ea2e), >+ C(88a6289a76ac684e), 
C(8ce5b5f9df1cbd85), C(8ae1fc4798e00d57), >+ C(e7164b8fb364fc46), C(6a978c9bd3a66943), C(ef10d5ae4dd08dc), C(a2cad330)}, >+ {C(74cba303e2dd9d6d), C(692699b83289fad1), C(dfb9aa7874678480), >+ C(751390a8a5c41bdc), C(6ee5fbf87605d34), C(6ca73f610f3a8f7c), >+ C(e898b3c996570ad), C(751390a8a5c41bdc), C(6ee5fbf87605d34), >+ C(6ca73f610f3a8f7c), C(e898b3c996570ad), C(98168a5858fc7110), >+ C(6f987fa27aa0daa2), C(f25e3e180d4b36a3), C(d0b03495aeb1be8a), >+ C(53467e16)}, >+ {C(4cbc2b73a43071e0), C(56c5db4c4ca4e0b7), C(1b275a162f46bd3d), >+ C(b87a326e413604bf), C(d8f9a5fa214b03ab), C(8a8bb8265771cf88), >+ C(a655319054f6e70f), C(b87a326e413604bf), C(d8f9a5fa214b03ab), >+ C(8a8bb8265771cf88), C(a655319054f6e70f), C(b499cb8e65a9af44), >+ C(bee7fafcc8307491), C(5d2e55fa9b27cda2), C(63b120f5fb2d6ee5), >+ C(da14a8d0)}, >+ {C(875638b9715d2221), C(d9ba0615c0c58740), C(616d4be2dfe825aa), >+ C(5df25f13ea7bc284), C(165edfaafd2598fb), C(af7215c5c718c696), >+ C(e9f2f9ca655e769), C(5df25f13ea7bc284), C(165edfaafd2598fb), >+ C(af7215c5c718c696), C(e9f2f9ca655e769), C(e459cfcb565d3d2d), >+ C(41d032631be2418a), C(c505db05fd946f60), C(54990394a714f5de), >+ C(67333551)}, >+ {C(fb686b2782994a8d), C(edee60693756bb48), C(e6bc3cae0ded2ef5), >+ C(58eb4d03b2c3ddf5), C(6d2542995f9189f1), C(c0beec58a5f5fea2), >+ C(ed67436f42e2a78b), C(58eb4d03b2c3ddf5), C(6d2542995f9189f1), >+ C(c0beec58a5f5fea2), C(ed67436f42e2a78b), C(dfec763cdb2b5193), >+ C(724a8d5345bd2d6), C(94d4fd1b81457c23), C(28e87c50cdede453), C(a0ebd66e)}, >+ {C(ab21d81a911e6723), C(4c31b07354852f59), C(835da384c9384744), >+ C(7f759dddc6e8549a), C(616dd0ca022c8735), C(94717ad4bc15ceb3), >+ C(f66c7be808ab36e), C(7f759dddc6e8549a), C(616dd0ca022c8735), >+ C(94717ad4bc15ceb3), C(f66c7be808ab36e), C(af8286b550b2f4b7), >+ C(745bd217d20a9f40), C(c73bfb9c5430f015), C(55e65922666e3fc2), >+ C(4b769593)}, >+ {C(33d013cc0cd46ecf), C(3de726423aea122c), C(116af51117fe21a9), >+ C(f271ba474edc562d), C(e6596e67f9dd3ebd), C(c0a288edf808f383), >+ 
C(b3def70681c6babc), C(f271ba474edc562d), C(e6596e67f9dd3ebd), >+ C(c0a288edf808f383), C(b3def70681c6babc), C(7da7864e9989b095), >+ C(bf2f8718693cd8a1), C(264a9144166da776), C(61ad90676870beb6), >+ C(6aa75624)}, >+ {C(8ca92c7cd39fae5d), C(317e620e1bf20f1), C(4f0b33bf2194b97f), >+ C(45744afcf131dbee), C(97222392c2559350), C(498a19b280c6d6ed), >+ C(83ac2c36acdb8d49), C(45744afcf131dbee), C(97222392c2559350), >+ C(498a19b280c6d6ed), C(83ac2c36acdb8d49), C(7a69645c294daa62), >+ C(abe9d2be8275b3d2), C(39542019de371085), C(7f4efac8488cd6ad), >+ C(602a3f96)}, >+ {C(fdde3b03f018f43e), C(38f932946c78660), C(c84084ce946851ee), >+ C(b6dd09ba7851c7af), C(570de4e1bb13b133), C(c4e784eb97211642), >+ C(8285a7fcdcc7c58d), C(b6dd09ba7851c7af), C(570de4e1bb13b133), >+ C(c4e784eb97211642), C(8285a7fcdcc7c58d), C(d421f47990da899b), >+ C(8aed409c997eaa13), C(7a045929c2e29ccf), C(b373682a6202c86b), >+ C(cd183c4d)}, >+ {C(9c8502050e9c9458), C(d6d2a1a69964beb9), C(1675766f480229b5), >+ C(216e1d6c86cb524c), C(d01cf6fd4f4065c0), C(fffa4ec5b482ea0f), >+ C(a0e20ee6a5404ac1), C(216e1d6c86cb524c), C(d01cf6fd4f4065c0), >+ C(fffa4ec5b482ea0f), C(a0e20ee6a5404ac1), C(c1b037e4eebaf85e), >+ C(634e3d7c3ebf89eb), C(bcda972358c67d1), C(fd1352181e5b8578), C(960a4d07)}, >+ {C(348176ca2fa2fdd2), C(3a89c514cc360c2d), C(9f90b8afb318d6d0), >+ C(bceee07c11a9ac30), C(2e2d47dff8e77eb7), C(11a394cd7b6d614a), >+ C(1d7c41d54e15cb4a), C(bceee07c11a9ac30), C(2e2d47dff8e77eb7), >+ C(11a394cd7b6d614a), C(1d7c41d54e15cb4a), C(15baa5ae7312b0fc), >+ C(f398f596cc984635), C(8ab8fdf87a6788e8), C(b2b5c1234ab47e2), C(9ae998c4)}, >+ {C(4a3d3dfbbaea130b), C(4e221c920f61ed01), C(553fd6cd1304531f), >+ C(bd2b31b5608143fe), C(ab717a10f2554853), C(293857f04d194d22), >+ C(d51be8fa86f254f0), C(bd2b31b5608143fe), C(ab717a10f2554853), >+ C(293857f04d194d22), C(d51be8fa86f254f0), C(1eee39e07686907e), >+ C(639039fe0e8d3052), C(d6ec1470cef97ff), C(370c82b860034f0f), C(74e2179d)}, >+ {C(b371f768cdf4edb9), C(bdef2ace6d2de0f0), 
C(e05b4100f7f1baec), >+ C(b9e0d415b4ebd534), C(c97c2a27efaa33d7), C(591cdb35f84ef9da), >+ C(a57d02d0e8e3756c), C(b9e0d415b4ebd534), C(c97c2a27efaa33d7), >+ C(591cdb35f84ef9da), C(a57d02d0e8e3756c), C(23f55f12d7c5c87b), >+ C(4c7ca0fe23221101), C(dbc3020480334564), C(d985992f32c236b1), >+ C(ee9bae25)}, >+ {C(7a1d2e96934f61f), C(eb1760ae6af7d961), C(887eb0da063005df), >+ C(2228d6725e31b8ab), C(9b98f7e4d0142e70), C(b6a8c2115b8e0fe7), >+ C(b591e2f5ab9b94b1), C(2228d6725e31b8ab), C(9b98f7e4d0142e70), >+ C(b6a8c2115b8e0fe7), C(b591e2f5ab9b94b1), C(6c1feaa8065318e0), >+ C(4e7e2ca21c2e81fb), C(e9fe5d8ce7993c45), C(ee411fa2f12cf8df), >+ C(b66edf10)}, >+ {C(8be53d466d4728f2), C(86a5ac8e0d416640), C(984aa464cdb5c8bb), >+ C(87049e68f5d38e59), C(7d8ce44ec6bd7751), C(cc28d08ab414839c), >+ C(6c8f0bd34fe843e3), C(87049e68f5d38e59), C(7d8ce44ec6bd7751), >+ C(cc28d08ab414839c), C(6c8f0bd34fe843e3), C(b8496dcdc01f3e47), >+ C(2f03125c282ac26), C(82a8797ba3f5ef07), C(7c977a4d10bf52b8), C(d6209737)}, >+ {C(829677eb03abf042), C(43cad004b6bc2c0), C(f2f224756803971a), >+ C(98d0dbf796480187), C(fbcb5f3e1bef5742), C(5af2a0463bf6e921), >+ C(ad9555bf0120b3a3), C(98d0dbf796480187), C(fbcb5f3e1bef5742), >+ C(5af2a0463bf6e921), C(ad9555bf0120b3a3), C(283e39b3dc99f447), >+ C(bedaa1a4a0250c28), C(9d50546624ff9a57), C(4abaf523d1c090f6), C(b994a88)}, >+ {C(754435bae3496fc), C(5707fc006f094dcf), C(8951c86ab19d8e40), >+ C(57c5208e8f021a77), C(f7653fbb69cd9276), C(a484410af21d75cb), >+ C(f19b6844b3d627e8), C(57c5208e8f021a77), C(f7653fbb69cd9276), >+ C(a484410af21d75cb), C(f19b6844b3d627e8), C(f37400fc3ffd9514), >+ C(36ae0d821734edfd), C(5f37820af1f1f306), C(be637d40e6a5ad0), C(a05d43c0)}, >+ {C(fda9877ea8e3805f), C(31e868b6ffd521b7), C(b08c90681fb6a0fd), >+ C(68110a7f83f5d3ff), C(6d77e045901b85a8), C(84ef681113036d8b), >+ C(3b9f8e3928f56160), C(68110a7f83f5d3ff), C(6d77e045901b85a8), >+ C(84ef681113036d8b), C(3b9f8e3928f56160), C(fc8b7f56c130835), >+ C(a11f3e800638e841), C(d9572267f5cf28c1), 
C(7897c8149803f2aa), >+ C(c79f73a8)}, >+ {C(2e36f523ca8f5eb5), C(8b22932f89b27513), C(331cd6ecbfadc1bb), >+ C(d1bfe4df12b04cbf), C(f58c17243fd63842), C(3a453cdba80a60af), >+ C(5737b2ca7470ea95), C(d1bfe4df12b04cbf), C(f58c17243fd63842), >+ C(3a453cdba80a60af), C(5737b2ca7470ea95), C(54d44a3f4477030c), >+ C(8168e02d4869aa7f), C(77f383a17778559d), C(95e1737d77a268fc), >+ C(a490aff5)}, >+ {C(21a378ef76828208), C(a5c13037fa841da2), C(506d22a53fbe9812), >+ C(61c9c95d91017da5), C(16f7c83ba68f5279), C(9c0619b0808d05f7), >+ C(83c117ce4e6b70a3), C(61c9c95d91017da5), C(16f7c83ba68f5279), >+ C(9c0619b0808d05f7), C(83c117ce4e6b70a3), C(cfb4c8af7fd01413), >+ C(fdef04e602e72296), C(ed6124d337889b1), C(4919c86707b830da), C(dfad65b4)}, >+ {C(ccdd5600054b16ca), C(f78846e84204cb7b), C(1f9faec82c24eac9), >+ C(58634004c7b2d19a), C(24bb5f51ed3b9073), C(46409de018033d00), >+ C(4a9805eed5ac802e), C(58634004c7b2d19a), C(24bb5f51ed3b9073), >+ C(46409de018033d00), C(4a9805eed5ac802e), C(e18de8db306baf82), >+ C(46bbf75f1fa025ff), C(5faf2fb09be09487), C(3fbc62bd4e558fb3), C(1d07dfb)}, >+ {C(7854468f4e0cabd0), C(3a3f6b4f098d0692), C(ae2423ec7799d30d), >+ C(29c3529eb165eeba), C(443de3703b657c35), C(66acbce31ae1bc8d), >+ C(1acc99effe1d547e), C(29c3529eb165eeba), C(443de3703b657c35), >+ C(66acbce31ae1bc8d), C(1acc99effe1d547e), C(cf07f8a57906573d), >+ C(31bafb0bbb9a86e7), C(40c69492702a9346), C(7df61fdaa0b858af), >+ C(416df9a0)}, >+ {C(7f88db5346d8f997), C(88eac9aacc653798), C(68a4d0295f8eefa1), >+ C(ae59ca86f4c3323d), C(25906c09906d5c4c), C(8dd2aa0c0a6584ae), >+ C(232a7d96b38f40e9), C(ae59ca86f4c3323d), C(25906c09906d5c4c), >+ C(8dd2aa0c0a6584ae), C(232a7d96b38f40e9), C(8986ee00a2ed0042), >+ C(c49ae7e428c8a7d1), C(b7dd8280713ac9c2), C(e018720aed1ebc28), >+ C(1f8fb9cc)}, >+ {C(bb3fb5fb01d60fcf), C(1b7cc0847a215eb6), C(1246c994437990a1), >+ C(d4edc954c07cd8f3), C(224f47e7c00a30ab), C(d5ad7ad7f41ef0c6), >+ C(59e089281d869fd7), C(d4edc954c07cd8f3), C(224f47e7c00a30ab), >+ C(d5ad7ad7f41ef0c6), 
C(59e089281d869fd7), C(f29340d07a14b6f1), >+ C(c87c5ef76d9c4ef3), C(463118794193a9a), C(2922dcb0540f0dbc), C(7abf48e3)}, >+ {C(2e783e1761acd84d), C(39158042bac975a0), C(1cd21c5a8071188d), >+ C(b1b7ec44f9302176), C(5cb476450dc0c297), C(dc5ef652521ef6a2), >+ C(3cc79a9e334e1f84), C(b1b7ec44f9302176), C(5cb476450dc0c297), >+ C(dc5ef652521ef6a2), C(3cc79a9e334e1f84), C(769e2a283dbcc651), >+ C(9f24b105c8511d3f), C(c31c15575de2f27e), C(ecfecf32c3ae2d66), >+ C(dea4e3dd)}, >+ {C(392058251cf22acc), C(944ec4475ead4620), C(b330a10b5cb94166), >+ C(54bc9bee7cbe1767), C(485820bdbe442431), C(54d6120ea2972e90), >+ C(f437a0341f29b72a), C(54bc9bee7cbe1767), C(485820bdbe442431), >+ C(54d6120ea2972e90), C(f437a0341f29b72a), C(8f30885c784d5704), >+ C(aa95376b16c7906a), C(e826928cfaf93dc3), C(20e8f54d1c16d7d8), >+ C(c6064f22)}, >+ {C(adf5c1e5d6419947), C(2a9747bc659d28aa), C(95c5b8cb1f5d62c), >+ C(80973ea532b0f310), C(a471829aa9c17dd9), C(c2ff3479394804ab), >+ C(6bf44f8606753636), C(80973ea532b0f310), C(a471829aa9c17dd9), >+ C(c2ff3479394804ab), C(6bf44f8606753636), C(5184d2973e6dd827), >+ C(121b96369a332d9a), C(5c25d3475ab69e50), C(26d2961d62884168), >+ C(743bed9c)}, >+ {C(6bc1db2c2bee5aba), C(e63b0ed635307398), C(7b2eca111f30dbbc), >+ C(230d2b3e47f09830), C(ec8624a821c1caf4), C(ea6ec411cdbf1cb1), >+ C(5f38ae82af364e27), C(230d2b3e47f09830), C(ec8624a821c1caf4), >+ C(ea6ec411cdbf1cb1), C(5f38ae82af364e27), C(a519ef515ea7187c), >+ C(6bad5efa7ebae05f), C(748abacb11a74a63), C(a28eef963d1396eb), >+ C(fce254d5)}, >+ {C(b00f898229efa508), C(83b7590ad7f6985c), C(2780e70a0592e41d), >+ C(7122413bdbc94035), C(e7f90fae33bf7763), C(4b6bd0fb30b12387), >+ C(557359c0c44f48ca), C(7122413bdbc94035), C(e7f90fae33bf7763), >+ C(4b6bd0fb30b12387), C(557359c0c44f48ca), C(d5656c3d6bc5f0d), >+ C(983ff8e5e784da99), C(628479671b445bf), C(e179a1e27ce68f5d), C(e47ec9d1)}, >+ {C(b56eb769ce0d9a8c), C(ce196117bfbcaf04), C(b26c3c3797d66165), >+ C(5ed12338f630ab76), C(fab19fcb319116d), C(167f5f42b521724b), >+ 
C(c4aa56c409568d74), C(5ed12338f630ab76), C(fab19fcb319116d), >+ C(167f5f42b521724b), C(c4aa56c409568d74), C(75fff4b42f8e9778), >+ C(94218f94710c1ea3), C(b7b05efb738b06a6), C(83fff2deabf9cd3), C(334a145c)}, >+ {C(70c0637675b94150), C(259e1669305b0a15), C(46e1dd9fd387a58d), >+ C(fca4e5bc9292788e), C(cd509dc1facce41c), C(bbba575a59d82fe), >+ C(4e2e71c15b45d4d3), C(fca4e5bc9292788e), C(cd509dc1facce41c), >+ C(bbba575a59d82fe), C(4e2e71c15b45d4d3), C(5dc54582ead999c), >+ C(72612d1571963c6f), C(30318a9d2d3d1829), C(785dd00f4cc9c9a0), >+ C(adec1e3c)}, >+ {C(74c0b8a6821faafe), C(abac39d7491370e7), C(faf0b2a48a4e6aed), >+ C(967e970df9673d2a), C(d465247cffa415c0), C(33a1df0ca1107722), >+ C(49fc2a10adce4a32), C(967e970df9673d2a), C(d465247cffa415c0), >+ C(33a1df0ca1107722), C(49fc2a10adce4a32), C(c5707e079a284308), >+ C(573028266635dda6), C(f786f5eee6127fa0), C(b30d79cebfb51266), >+ C(f6a9fbf8)}, >+ {C(5fb5e48ac7b7fa4f), C(a96170f08f5acbc7), C(bbf5c63d4f52a1e5), >+ C(6cc09e60700563e9), C(d18f23221e964791), C(ffc23eeef7af26eb), >+ C(693a954a3622a315), C(815308a32a9b0daf), C(efb2ab27bf6fd0bd), >+ C(9f1ffc0986111118), C(f9a3aa1778ea3985), C(698fe54b2b93933b), >+ C(dacc2b28404d0f10), C(815308a32a9b0daf), C(efb2ab27bf6fd0bd), >+ C(5398210c)}, >+}; >+ >+void TestUnchanging(const uint64_t* expected, int offset, int len) { >+ const uint128 u = CityHash128(data + offset, len); >+ const uint128 v = CityHash128WithSeed(data + offset, len, kSeed128); >+ EXPECT_EQ(expected[0], CityHash64(data + offset, len)); >+ EXPECT_EQ(expected[15], CityHash32(data + offset, len)); >+ EXPECT_EQ(expected[1], CityHash64WithSeed(data + offset, len, kSeed0)); >+ EXPECT_EQ(expected[2], >+ CityHash64WithSeeds(data + offset, len, kSeed0, kSeed1)); >+ EXPECT_EQ(expected[3], Uint128Low64(u)); >+ EXPECT_EQ(expected[4], Uint128High64(u)); >+ EXPECT_EQ(expected[5], Uint128Low64(v)); >+ EXPECT_EQ(expected[6], Uint128High64(v)); >+#ifdef __SSE4_2__ >+ const uint128 y = CityHashCrc128(data + offset, len); >+ const 
uint128 z = CityHashCrc128WithSeed(data + offset, len, kSeed128); >+ uint64_t crc256_results[4]; >+ CityHashCrc256(data + offset, len, crc256_results); >+ EXPECT_EQ(expected[7], Uint128Low64(y)); >+ EXPECT_EQ(expected[8], Uint128High64(y)); >+ EXPECT_EQ(expected[9], Uint128Low64(z)); >+ EXPECT_EQ(expected[10], Uint128High64(z)); >+ for (int i = 0; i < 4; i++) { >+ EXPECT_EQ(expected[11 + i], crc256_results[i]); >+ } >+#endif >+} >+ >+TEST(CityHashTest, Unchanging) { >+ setup(); >+ int i = 0; >+ for (; i < kTestSize - 1; i++) { >+ TestUnchanging(testdata[i], i * i, i); >+ } >+ TestUnchanging(testdata[i], 0, kDataSize); >+} >+ >+} // namespace hash_internal >+} // namespace absl >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/internal/hash.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/internal/hash.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..4bf64096e7eaa3b346ed2f2a22df2cd2a8e71574 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/internal/hash.cc >@@ -0,0 +1,23 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. 
>+ >+#include "absl/hash/internal/hash.h" >+ >+namespace absl { >+namespace hash_internal { >+ >+ABSL_CONST_INIT const void* const CityHashState::kSeed = &kSeed; >+ >+} // namespace hash_internal >+} // namespace absl >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/internal/hash.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/internal/hash.h >new file mode 100644 >index 0000000000000000000000000000000000000000..78217d2fe760003cb83e8a317a59185e82a730f7 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/internal/hash.h >@@ -0,0 +1,887 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. 
>+// >+// ----------------------------------------------------------------------------- >+// File: hash.h >+// ----------------------------------------------------------------------------- >+// >+#ifndef ABSL_HASH_INTERNAL_HASH_H_ >+#define ABSL_HASH_INTERNAL_HASH_H_ >+ >+#include <algorithm> >+#include <array> >+#include <cmath> >+#include <cstring> >+#include <deque> >+#include <forward_list> >+#include <functional> >+#include <iterator> >+#include <limits> >+#include <list> >+#include <map> >+#include <memory> >+#include <set> >+#include <string> >+#include <tuple> >+#include <type_traits> >+#include <utility> >+#include <vector> >+ >+#include "absl/base/internal/endian.h" >+#include "absl/base/port.h" >+#include "absl/container/fixed_array.h" >+#include "absl/meta/type_traits.h" >+#include "absl/numeric/int128.h" >+#include "absl/strings/string_view.h" >+#include "absl/types/optional.h" >+#include "absl/types/variant.h" >+#include "absl/utility/utility.h" >+#include "absl/hash/internal/city.h" >+ >+namespace absl { >+namespace hash_internal { >+ >+// HashStateBase >+// >+// A hash state object represents an intermediate state in the computation >+// of an unspecified hash algorithm. `HashStateBase` provides a CRTP style >+// base class for hash state implementations. Developers adding type support >+// for `absl::Hash` should not rely on any parts of the state object other than >+// the following member functions: >+// >+// * HashStateBase::combine() >+// * HashStateBase::combine_contiguous() >+// >+// A derived hash state class of type `H` must provide a static member function >+// with a signature similar to the following: >+// >+// `static H combine_contiguous(H state, const unsigned char*, size_t)`. >+// >+// `HashStateBase` will provide a complete implementation for a hash state >+// object in terms of this method. >+// >+// Example: >+// >+// // Use CRTP to define your derived class.
>+// struct MyHashState : HashStateBase<MyHashState> { >+// static H combine_contiguous(H state, const unsigned char*, size_t); >+// using MyHashState::HashStateBase::combine; >+// using MyHashState::HashStateBase::combine_contiguous; >+// }; >+template <typename H> >+class HashStateBase { >+ public: >+ // HashStateBase::combine() >+ // >+ // Combines an arbitrary number of values into a hash state, returning the >+ // updated state. >+ // >+ // Each of the value types `T` must be separately hashable by the Abseil >+ // hashing framework. >+ // >+ // NOTE: >+ // >+ // state = H::combine(std::move(state), value1, value2, value3); >+ // >+ // is guaranteed to produce the same hash expansion as: >+ // >+ // state = H::combine(std::move(state), value1); >+ // state = H::combine(std::move(state), value2); >+ // state = H::combine(std::move(state), value3); >+ template <typename T, typename... Ts> >+ static H combine(H state, const T& value, const Ts&... values); >+ static H combine(H state) { return state; } >+ >+ // HashStateBase::combine_contiguous() >+ // >+ // Combines a contiguous array of `size` elements into a hash state, returning >+ // the updated state. >+ // >+ // NOTE: >+ // >+ // state = H::combine_contiguous(std::move(state), data, size); >+ // >+ // is NOT guaranteed to produce the same hash expansion as a for-loop (it may >+ // perform internal optimizations). If you need this guarantee, use the >+ // for-loop instead. >+ template <typename T> >+ static H combine_contiguous(H state, const T* data, size_t size); >+}; >+ >+// is_uniquely_represented >+// >+// `is_uniquely_represented<T>` is a trait class that indicates whether `T` >+// is uniquely represented. >+// >+// A type is "uniquely represented" if two equal values of that type are >+// guaranteed to have the same bytes in their underlying storage. In other >+// words, if `a == b`, then `memcmp(&a, &b, sizeof(T))` is guaranteed to be >+// zero. 
This property cannot be detected automatically, so this trait is false >+// by default, but can be specialized by types that wish to assert that they are >+// uniquely represented. This makes them eligible for certain optimizations. >+// >+// If you have any doubt whatsoever, do not specialize this template. >+// The default is completely safe, and merely disables some optimizations >+// that will not matter for most types. Specializing this template, >+// on the other hand, can be very hazardous. >+// >+// To be uniquely represented, a type must not have multiple ways of >+// representing the same value; for example, float and double are not >+// uniquely represented, because they have distinct representations for >+// +0 and -0. Furthermore, the type's byte representation must consist >+// solely of user-controlled data, with no padding bits and no compiler- >+// controlled data such as vptrs or sanitizer metadata. This is usually >+// very difficult to guarantee, because in most cases the compiler can >+// insert data and padding bits at its own discretion. >+// >+// If you specialize this template for a type `T`, you must do so in the file >+// that defines that type (or in this file). If you define that specialization >+// anywhere else, `is_uniquely_represented<T>` could have different meanings >+// in different places. >+// >+// The Enable parameter is meaningless; it is provided as a convenience, >+// to support certain SFINAE techniques when defining specializations. >+template <typename T, typename Enable = void> >+struct is_uniquely_represented : std::false_type {}; >+ >+// is_uniquely_represented<unsigned char> >+// >+// unsigned char is a synonym for "byte", so it is guaranteed to be >+// uniquely represented. 
>+template <> >+struct is_uniquely_represented<unsigned char> : std::true_type {}; >+ >+// is_uniquely_represented for non-standard integral types >+// >+// Integral types other than bool should be uniquely represented on any >+// platform that this will plausibly be ported to. >+template <typename Integral> >+struct is_uniquely_represented< >+ Integral, typename std::enable_if<std::is_integral<Integral>::value>::type> >+ : std::true_type {}; >+ >+// is_uniquely_represented<bool> >+// >+// >+template <> >+struct is_uniquely_represented<bool> : std::false_type {}; >+ >+// hash_bytes() >+// >+// Convenience function that combines `hash_state` with the byte representation >+// of `value`. >+template <typename H, typename T> >+H hash_bytes(H hash_state, const T& value) { >+ const unsigned char* start = reinterpret_cast<const unsigned char*>(&value); >+ return H::combine_contiguous(std::move(hash_state), start, sizeof(value)); >+} >+ >+// ----------------------------------------------------------------------------- >+// AbslHashValue for Basic Types >+// ----------------------------------------------------------------------------- >+ >+// Note: Default `AbslHashValue` implementations live in `hash_internal`. This >+// allows us to block lexical scope lookup when doing an unqualified call to >+// `AbslHashValue` below. User-defined implementations of `AbslHashValue` can >+// only be found via ADL. >+ >+// AbslHashValue() for hashing bool values >+// >+// We use SFINAE to ensure that this overload only accepts bool, not types that >+// are convertible to bool. >+template <typename H, typename B> >+typename std::enable_if<std::is_same<B, bool>::value, H>::type AbslHashValue( >+ H hash_state, B value) { >+ return H::combine(std::move(hash_state), >+ static_cast<unsigned char>(value ? 
1 : 0)); >+} >+ >+// AbslHashValue() for hashing enum values >+template <typename H, typename Enum> >+typename std::enable_if<std::is_enum<Enum>::value, H>::type AbslHashValue( >+ H hash_state, Enum e) { >+ // In practice, we could almost certainly just invoke hash_bytes directly, >+ // but it's possible that a sanitizer might one day want to >+ // store data in the unused bits of an enum. To avoid that risk, we >+ // convert to the underlying type before hashing. Hopefully this will get >+ // optimized away; if not, we can reopen discussion with c-toolchain-team. >+ return H::combine(std::move(hash_state), >+ static_cast<typename std::underlying_type<Enum>::type>(e)); >+} >+// AbslHashValue() for hashing floating-point values >+template <typename H, typename Float> >+typename std::enable_if<std::is_floating_point<Float>::value, H>::type >+AbslHashValue(H hash_state, Float value) { >+ return hash_internal::hash_bytes(std::move(hash_state), >+ value == 0 ? 0 : value); >+} >+ >+// Long double has the property that it might have extra unused bytes in it. >+// For example, in x86 sizeof(long double)==16 but it only really uses 80-bits >+// of it. This means we can't use hash_bytes on a long double and have to >+// convert it to something else first. >+template <typename H> >+H AbslHashValue(H hash_state, long double value) { >+ const int category = std::fpclassify(value); >+ switch (category) { >+ case FP_INFINITE: >+ // Add the sign bit to differentiate between +Inf and -Inf >+ hash_state = H::combine(std::move(hash_state), std::signbit(value)); >+ break; >+ >+ case FP_NAN: >+ case FP_ZERO: >+ default: >+ // Category is enough for these. >+ break; >+ >+ case FP_NORMAL: >+ case FP_SUBNORMAL: >+ // We can't convert `value` directly to double because this would have >+ // undefined behavior if the value is out of range. >+ // std::frexp gives us a value in the range (-1, -.5] or [.5, 1) that is >+ // guaranteed to be in range for `double`. 
The truncation is >+ // implementation defined, but that works as long as it is deterministic. >+ int exp; >+ auto mantissa = static_cast<double>(std::frexp(value, &exp)); >+ hash_state = H::combine(std::move(hash_state), mantissa, exp); >+ } >+ >+ return H::combine(std::move(hash_state), category); >+} >+ >+// AbslHashValue() for hashing pointers >+template <typename H, typename T> >+H AbslHashValue(H hash_state, T* ptr) { >+ return hash_internal::hash_bytes(std::move(hash_state), ptr); >+} >+ >+// AbslHashValue() for hashing nullptr_t >+template <typename H> >+H AbslHashValue(H hash_state, std::nullptr_t) { >+ return H::combine(std::move(hash_state), static_cast<void*>(nullptr)); >+} >+ >+// ----------------------------------------------------------------------------- >+// AbslHashValue for Composite Types >+// ----------------------------------------------------------------------------- >+ >+// is_hashable() >+// >+// Trait class which returns true if T is hashable by the absl::Hash framework. >+// Used for the AbslHashValue implementations for composite types below. >+template <typename T> >+struct is_hashable; >+ >+// AbslHashValue() for hashing pairs >+template <typename H, typename T1, typename T2> >+typename std::enable_if<is_hashable<T1>::value && is_hashable<T2>::value, >+ H>::type >+AbslHashValue(H hash_state, const std::pair<T1, T2>& p) { >+ return H::combine(std::move(hash_state), p.first, p.second); >+} >+ >+// hash_tuple() >+// >+// Helper function for hashing a tuple. The third argument should >+// be an index_sequence running from 0 to tuple_size<Tuple> - 1. >+template <typename H, typename Tuple, size_t... Is> >+H hash_tuple(H hash_state, const Tuple& t, absl::index_sequence<Is...>) { >+ return H::combine(std::move(hash_state), std::get<Is>(t)...); >+} >+ >+// AbslHashValue for hashing tuples >+template <typename H, typename... Ts> >+#if defined(_MSC_VER) >+// This SFINAE gets MSVC confused under some conditions. 
Let's just disable it >+// for now. >+H >+#else // _MSC_VER >+typename std::enable_if<absl::conjunction<is_hashable<Ts>...>::value, H>::type >+#endif // _MSC_VER >+AbslHashValue(H hash_state, const std::tuple<Ts...>& t) { >+ return hash_internal::hash_tuple(std::move(hash_state), t, >+ absl::make_index_sequence<sizeof...(Ts)>()); >+} >+ >+// ----------------------------------------------------------------------------- >+// AbslHashValue for Pointers >+// ----------------------------------------------------------------------------- >+ >+// AbslHashValue for hashing unique_ptr >+template <typename H, typename T, typename D> >+H AbslHashValue(H hash_state, const std::unique_ptr<T, D>& ptr) { >+ return H::combine(std::move(hash_state), ptr.get()); >+} >+ >+// AbslHashValue for hashing shared_ptr >+template <typename H, typename T> >+H AbslHashValue(H hash_state, const std::shared_ptr<T>& ptr) { >+ return H::combine(std::move(hash_state), ptr.get()); >+} >+ >+// ----------------------------------------------------------------------------- >+// AbslHashValue for String-Like Types >+// ----------------------------------------------------------------------------- >+ >+// AbslHashValue for hashing strings >+// >+// All the string-like types supported here provide the same hash expansion for >+// the same character sequence. These types are: >+// >+// - `std::string` (and std::basic_string<char, std::char_traits<char>, A> for >+// any allocator A) >+// - `absl::string_view` and `std::string_view` >+// >+// For simplicity, we currently support only `char` strings. This support may >+// be broadened, if necessary, but with some caution - this overload would >+// misbehave in cases where the traits' `eq()` member isn't equivalent to `==` >+// on the underlying character type. 
>+template <typename H> >+H AbslHashValue(H hash_state, absl::string_view str) { >+ return H::combine( >+ H::combine_contiguous(std::move(hash_state), str.data(), str.size()), >+ str.size()); >+} >+ >+// ----------------------------------------------------------------------------- >+// AbslHashValue for Sequence Containers >+// ----------------------------------------------------------------------------- >+ >+// AbslHashValue for hashing std::array >+template <typename H, typename T, size_t N> >+typename std::enable_if<is_hashable<T>::value, H>::type AbslHashValue( >+ H hash_state, const std::array<T, N>& array) { >+ return H::combine_contiguous(std::move(hash_state), array.data(), >+ array.size()); >+} >+ >+// AbslHashValue for hashing std::deque >+template <typename H, typename T, typename Allocator> >+typename std::enable_if<is_hashable<T>::value, H>::type AbslHashValue( >+ H hash_state, const std::deque<T, Allocator>& deque) { >+ // TODO(gromer): investigate a more efficient implementation taking >+ // advantage of the chunk structure. 
>+ for (const auto& t : deque) { >+ hash_state = H::combine(std::move(hash_state), t); >+ } >+ return H::combine(std::move(hash_state), deque.size()); >+} >+ >+// AbslHashValue for hashing std::forward_list >+template <typename H, typename T, typename Allocator> >+typename std::enable_if<is_hashable<T>::value, H>::type AbslHashValue( >+ H hash_state, const std::forward_list<T, Allocator>& list) { >+ size_t size = 0; >+ for (const T& t : list) { >+ hash_state = H::combine(std::move(hash_state), t); >+ ++size; >+ } >+ return H::combine(std::move(hash_state), size); >+} >+ >+// AbslHashValue for hashing std::list >+template <typename H, typename T, typename Allocator> >+typename std::enable_if<is_hashable<T>::value, H>::type AbslHashValue( >+ H hash_state, const std::list<T, Allocator>& list) { >+ for (const auto& t : list) { >+ hash_state = H::combine(std::move(hash_state), t); >+ } >+ return H::combine(std::move(hash_state), list.size()); >+} >+ >+// AbslHashValue for hashing std::vector >+// >+// Do not use this for vector<bool>. It does not have a .data(), and a fallback >+// for std::hash<> is most likely faster. 
>+template <typename H, typename T, typename Allocator> >+typename std::enable_if<is_hashable<T>::value && !std::is_same<T, bool>::value, >+ H>::type >+AbslHashValue(H hash_state, const std::vector<T, Allocator>& vector) { >+ return H::combine(H::combine_contiguous(std::move(hash_state), vector.data(), >+ vector.size()), >+ vector.size()); >+} >+ >+// ----------------------------------------------------------------------------- >+// AbslHashValue for Ordered Associative Containers >+// ----------------------------------------------------------------------------- >+ >+// AbslHashValue for hashing std::map >+template <typename H, typename Key, typename T, typename Compare, >+ typename Allocator> >+typename std::enable_if<is_hashable<Key>::value && is_hashable<T>::value, >+ H>::type >+AbslHashValue(H hash_state, const std::map<Key, T, Compare, Allocator>& map) { >+ for (const auto& t : map) { >+ hash_state = H::combine(std::move(hash_state), t); >+ } >+ return H::combine(std::move(hash_state), map.size()); >+} >+ >+// AbslHashValue for hashing std::multimap >+template <typename H, typename Key, typename T, typename Compare, >+ typename Allocator> >+typename std::enable_if<is_hashable<Key>::value && is_hashable<T>::value, >+ H>::type >+AbslHashValue(H hash_state, >+ const std::multimap<Key, T, Compare, Allocator>& map) { >+ for (const auto& t : map) { >+ hash_state = H::combine(std::move(hash_state), t); >+ } >+ return H::combine(std::move(hash_state), map.size()); >+} >+ >+// AbslHashValue for hashing std::set >+template <typename H, typename Key, typename Compare, typename Allocator> >+typename std::enable_if<is_hashable<Key>::value, H>::type AbslHashValue( >+ H hash_state, const std::set<Key, Compare, Allocator>& set) { >+ for (const auto& t : set) { >+ hash_state = H::combine(std::move(hash_state), t); >+ } >+ return H::combine(std::move(hash_state), set.size()); >+} >+ >+// AbslHashValue for hashing std::multiset >+template <typename H, typename Key, typename 
Compare, typename Allocator> >+typename std::enable_if<is_hashable<Key>::value, H>::type AbslHashValue( >+ H hash_state, const std::multiset<Key, Compare, Allocator>& set) { >+ for (const auto& t : set) { >+ hash_state = H::combine(std::move(hash_state), t); >+ } >+ return H::combine(std::move(hash_state), set.size()); >+} >+ >+// ----------------------------------------------------------------------------- >+// AbslHashValue for Wrapper Types >+// ----------------------------------------------------------------------------- >+ >+// AbslHashValue for hashing absl::optional >+template <typename H, typename T> >+typename std::enable_if<is_hashable<T>::value, H>::type AbslHashValue( >+ H hash_state, const absl::optional<T>& opt) { >+ if (opt) hash_state = H::combine(std::move(hash_state), *opt); >+ return H::combine(std::move(hash_state), opt.has_value()); >+} >+ >+// VariantVisitor >+template <typename H> >+struct VariantVisitor { >+ H&& hash_state; >+ template <typename T> >+ H operator()(const T& t) const { >+ return H::combine(std::move(hash_state), t); >+ } >+}; >+ >+// AbslHashValue for hashing absl::variant >+template <typename H, typename... T> >+typename std::enable_if<conjunction<is_hashable<T>...>::value, H>::type >+AbslHashValue(H hash_state, const absl::variant<T...>& v) { >+ if (!v.valueless_by_exception()) { >+ hash_state = absl::visit(VariantVisitor<H>{std::move(hash_state)}, v); >+ } >+ return H::combine(std::move(hash_state), v.index()); >+} >+// ----------------------------------------------------------------------------- >+ >+// hash_range_or_bytes() >+// >+// Mixes all values in the range [data, data+size) into the hash state. >+// This overload accepts only uniquely-represented types, and hashes them by >+// hashing the entire range of bytes. 
>+template <typename H, typename T> >+typename std::enable_if<is_uniquely_represented<T>::value, H>::type >+hash_range_or_bytes(H hash_state, const T* data, size_t size) { >+ const auto* bytes = reinterpret_cast<const unsigned char*>(data); >+ return H::combine_contiguous(std::move(hash_state), bytes, sizeof(T) * size); >+} >+ >+// hash_range_or_bytes() >+template <typename H, typename T> >+typename std::enable_if<!is_uniquely_represented<T>::value, H>::type >+hash_range_or_bytes(H hash_state, const T* data, size_t size) { >+ for (const auto end = data + size; data < end; ++data) { >+ hash_state = H::combine(std::move(hash_state), *data); >+ } >+ return hash_state; >+} >+ >+// InvokeHashTag >+// >+// InvokeHash(H, const T&) invokes the appropriate hash implementation for a >+// hasher of type `H` and a value of type `T`. If `T` is not hashable, there >+// will be no matching overload of InvokeHash(). >+// Note: Some platforms (e.g. MSVC) do not support the detection idiom on >+// std::hash. On those platforms the last fallback will be std::hash and >+// InvokeHash() will always have a valid overload even if std::hash<T> is not >+// valid. >+// >+// We try the following options in order: >+// * If is_uniquely_represented, hash bytes directly. >+// * ADL AbslHashValue(H, const T&) call. >+// * std::hash<T> >+ >+// In MSVC we can't probe std::hash or stdext::hash because it triggers a >+// static_assert instead of failing substitution.
>+#if defined(_MSC_VER) >+#define ABSL_HASH_INTERNAL_CAN_POISON_ 0 >+#else // _MSC_VER >+#define ABSL_HASH_INTERNAL_CAN_POISON_ 1 >+#endif // _MSC_VER >+ >+#if defined(ABSL_INTERNAL_LEGACY_HASH_NAMESPACE) && \ >+ ABSL_HASH_INTERNAL_CAN_POISON_ >+#define ABSL_HASH_INTERNAL_SUPPORT_LEGACY_HASH_ 1 >+#else >+#define ABSL_HASH_INTERNAL_SUPPORT_LEGACY_HASH_ 0 >+#endif >+ >+enum class InvokeHashTag { >+ kUniquelyRepresented, >+ kHashValue, >+#if ABSL_HASH_INTERNAL_SUPPORT_LEGACY_HASH_ >+ kLegacyHash, >+#endif // ABSL_HASH_INTERNAL_SUPPORT_LEGACY_HASH_ >+ kStdHash, >+ kNone >+}; >+ >+// HashSelect >+// >+// Type trait to select the appropriate hash implementation to use. >+// HashSelect<T>::value is an instance of InvokeHashTag that indicates the best >+// available hashing mechanism. >+// See `Note` above about MSVC. >+template <typename T> >+struct HashSelect { >+ private: >+ struct State : HashStateBase<State> { >+ static State combine_contiguous(State hash_state, const unsigned char*, >+ size_t); >+ using State::HashStateBase::combine_contiguous; >+ }; >+ >+ // `Probe<V, Tag>::value` evaluates to `V<T>::value` if it is a valid >+ // expression, and `false` otherwise. >+ // `Probe<V, Tag>::tag` always evaluates to `Tag`. 
>+ template <template <typename> class V, InvokeHashTag Tag> >+ struct Probe { >+ private: >+ template <typename U, typename std::enable_if<V<U>::value, int>::type = 0> >+ static std::true_type Test(int); >+ template <typename U> >+ static std::false_type Test(char); >+ >+ public: >+ static constexpr InvokeHashTag kTag = Tag; >+ static constexpr bool value = decltype( >+ Test<absl::remove_const_t<absl::remove_reference_t<T>>>(0))::value; >+ }; >+ >+ template <typename U> >+ using ProbeUniquelyRepresented = is_uniquely_represented<U>; >+ >+ template <typename U> >+ using ProbeHashValue = >+ std::is_same<State, decltype(AbslHashValue(std::declval<State>(), >+ std::declval<const U&>()))>; >+ >+#if ABSL_HASH_INTERNAL_SUPPORT_LEGACY_HASH_ >+ template <typename U> >+ using ProbeLegacyHash = >+ std::is_convertible<decltype(ABSL_INTERNAL_LEGACY_HASH_NAMESPACE::hash< >+ U>()(std::declval<const U&>())), >+ size_t>; >+#endif // ABSL_HASH_INTERNAL_SUPPORT_LEGACY_HASH_ >+ >+ template <typename U> >+ using ProbeStdHash = >+#if ABSL_HASH_INTERNAL_CAN_POISON_ >+ std::is_convertible<decltype(std::hash<U>()(std::declval<const U&>())), >+ size_t>; >+#else // ABSL_HASH_INTERNAL_CAN_POISON_ >+ std::true_type; >+#endif // ABSL_HASH_INTERNAL_CAN_POISON_ >+ >+ template <typename U> >+ using ProbeNone = std::true_type; >+ >+ public: >+ // Probe each implementation in order. >+ // disjunction provides short circuiting wrt instantiation.
>+ static constexpr InvokeHashTag value = absl::disjunction< >+ Probe<ProbeUniquelyRepresented, InvokeHashTag::kUniquelyRepresented>, >+ Probe<ProbeHashValue, InvokeHashTag::kHashValue>, >+#if ABSL_HASH_INTERNAL_SUPPORT_LEGACY_HASH_ >+ Probe<ProbeLegacyHash, InvokeHashTag::kLegacyHash>, >+#endif // ABSL_HASH_INTERNAL_SUPPORT_LEGACY_HASH_ >+ Probe<ProbeStdHash, InvokeHashTag::kStdHash>, >+ Probe<ProbeNone, InvokeHashTag::kNone>>::kTag; >+}; >+ >+template <typename T> >+struct is_hashable : std::integral_constant<bool, HashSelect<T>::value != >+ InvokeHashTag::kNone> {}; >+ >+template <typename H, typename T> >+absl::enable_if_t<HashSelect<T>::value == InvokeHashTag::kUniquelyRepresented, >+ H> >+InvokeHash(H state, const T& value) { >+ return hash_internal::hash_bytes(std::move(state), value); >+} >+ >+template <typename H, typename T> >+absl::enable_if_t<HashSelect<T>::value == InvokeHashTag::kHashValue, H> >+InvokeHash(H state, const T& value) { >+ return AbslHashValue(std::move(state), value); >+} >+ >+#if ABSL_HASH_INTERNAL_SUPPORT_LEGACY_HASH_ >+template <typename H, typename T> >+absl::enable_if_t<HashSelect<T>::value == InvokeHashTag::kLegacyHash, H> >+InvokeHash(H state, const T& value) { >+ return hash_internal::hash_bytes( >+ std::move(state), ABSL_INTERNAL_LEGACY_HASH_NAMESPACE::hash<T>{}(value)); >+} >+#endif // ABSL_HASH_INTERNAL_SUPPORT_LEGACY_HASH_ >+ >+template <typename H, typename T> >+absl::enable_if_t<HashSelect<T>::value == InvokeHashTag::kStdHash, H> >+InvokeHash(H state, const T& value) { >+ return hash_internal::hash_bytes(std::move(state), std::hash<T>{}(value)); >+} >+ >+// CityHashState >+class CityHashState : public HashStateBase<CityHashState> { >+ // absl::uint128 is not an alias or a thin wrapper around the intrinsic. >+ // We use the intrinsic when available to improve performance. 
>+#ifdef ABSL_HAVE_INTRINSIC_INT128 >+ using uint128 = __uint128_t; >+#else // ABSL_HAVE_INTRINSIC_INT128 >+ using uint128 = absl::uint128; >+#endif // ABSL_HAVE_INTRINSIC_INT128 >+ >+ static constexpr uint64_t kMul = >+ sizeof(size_t) == 4 ? uint64_t{0xcc9e2d51} : uint64_t{0x9ddfea08eb382d69}; >+ >+ template <typename T> >+ using IntegralFastPath = >+ conjunction<std::is_integral<T>, is_uniquely_represented<T>>; >+ >+ public: >+ // Move only >+ CityHashState(CityHashState&&) = default; >+ CityHashState& operator=(CityHashState&&) = default; >+ >+ // CityHashState::combine_contiguous() >+ // >+ // Fundamental base case for hash recursion: mixes the given range of bytes >+ // into the hash state. >+ static CityHashState combine_contiguous(CityHashState hash_state, >+ const unsigned char* first, >+ size_t size) { >+ return CityHashState( >+ CombineContiguousImpl(hash_state.state_, first, size, >+ std::integral_constant<int, sizeof(size_t)>{})); >+ } >+ using CityHashState::HashStateBase::combine_contiguous; >+ >+ // CityHashState::hash() >+ // >+ // For performance reasons in non-opt mode, we specialize this for >+ // integral types. >+ // Otherwise we would be instantiating and calling dozens of functions for >+ // something that is just one multiplication and a couple xor's. >+ // The result should be the same as running the whole algorithm, but faster. >+ template <typename T, absl::enable_if_t<IntegralFastPath<T>::value, int> = 0> >+ static size_t hash(T value) { >+ return static_cast<size_t>(Mix(Seed(), static_cast<uint64_t>(value))); >+ } >+ >+ // Overload of CityHashState::hash() >+ template <typename T, absl::enable_if_t<!IntegralFastPath<T>::value, int> = 0> >+ static size_t hash(const T& value) { >+ return static_cast<size_t>(combine(CityHashState{}, value).state_); >+ } >+ >+ private: >+ // Invoked only once for a given argument; that plus the fact that this is >+ // move-only ensures that there is only one non-moved-from object. 
>+ CityHashState() : state_(Seed()) {} >+ >+ // Workaround for MSVC bug. >+ // We make the type copyable to fix the calling convention, even though we >+ // never actually copy it. Keep it private to not affect the public API of the >+ // type. >+ CityHashState(const CityHashState&) = default; >+ >+ explicit CityHashState(uint64_t state) : state_(state) {} >+ >+ // Implementation of the base case for combine_contiguous where we actually >+ // mix the bytes into the state. >+ // Dispatch to different implementations of the combine_contiguous depending >+ // on the value of `sizeof(size_t)`. >+ static uint64_t CombineContiguousImpl(uint64_t state, >+ const unsigned char* first, size_t len, >+ std::integral_constant<int, 4> >+ /* sizeof_size_t */); >+ static uint64_t CombineContiguousImpl(uint64_t state, >+ const unsigned char* first, size_t len, >+ std::integral_constant<int, 8> >+ /* sizeof_size_t*/); >+ >+ // Reads 9 to 16 bytes from p. >+ // The first 8 bytes are in .first, the rest (zero padded) bytes are in >+ // .second. >+ static std::pair<uint64_t, uint64_t> Read9To16(const unsigned char* p, >+ size_t len) { >+ uint64_t high = little_endian::Load64(p + len - 8); >+ return {little_endian::Load64(p), high >> (128 - len * 8)}; >+ } >+ >+ // Reads 4 to 8 bytes from p. Zero pads to fill uint64_t. >+ static uint64_t Read4To8(const unsigned char* p, size_t len) { >+ return (static_cast<uint64_t>(little_endian::Load32(p + len - 4)) >+ << (len - 4) * 8) | >+ little_endian::Load32(p); >+ } >+ >+ // Reads 1 to 3 bytes from p. Zero pads to fill uint32_t. 
>+ static uint32_t Read1To3(const unsigned char* p, size_t len) { >+ return static_cast<uint32_t>((p[0]) | // >+ (p[len / 2] << (len / 2 * 8)) | // >+ (p[len - 1] << ((len - 1) * 8))); >+ } >+ >+ ABSL_ATTRIBUTE_ALWAYS_INLINE static uint64_t Mix(uint64_t state, uint64_t v) { >+ using MultType = >+ absl::conditional_t<sizeof(size_t) == 4, uint64_t, uint128>; >+ // We do the addition in 64-bit space to make sure the 128-bit >+ // multiplication is fast. If we were to do it as MultType the compiler has >+ // to assume that the high word is non-zero and needs to perform 2 >+ // multiplications instead of one. >+ MultType m = state + v; >+ m *= kMul; >+ return static_cast<uint64_t>(m ^ (m >> (sizeof(m) * 8 / 2))); >+ } >+ >+ // Seed() >+ // >+ // A non-deterministic seed. >+ // >+ // The current purpose of this seed is to generate non-deterministic results >+ // and prevent having users depend on the particular hash values. >+ // It is not meant as a security feature right now, but it leaves the door >+ // open to upgrade it to a true per-process random seed. A true random seed >+ // costs more and we don't need to pay for that right now. >+ // >+ // On platforms with ASLR, we take advantage of it to make a per-process >+ // random value. >+ // See https://en.wikipedia.org/wiki/Address_space_layout_randomization >+ // >+ // On other platforms this is still going to be non-deterministic but most >+ // probably per-build and not per-process. >+ ABSL_ATTRIBUTE_ALWAYS_INLINE static uint64_t Seed() { >+ return static_cast<uint64_t>(reinterpret_cast<uintptr_t>(kSeed)); >+ } >+ static const void* const kSeed; >+ >+ uint64_t state_; >+}; >+ >+// CityHashState::CombineContiguousImpl() >+inline uint64_t CityHashState::CombineContiguousImpl( >+ uint64_t state, const unsigned char* first, size_t len, >+ std::integral_constant<int, 4> /* sizeof_size_t */) { >+ // For large values we use CityHash, for small ones we just use a >+ // multiplicative hash. 
>+ uint64_t v; >+ if (len > 8) { >+ v = absl::hash_internal::CityHash32(reinterpret_cast<const char*>(first), len); >+ } else if (len >= 4) { >+ v = Read4To8(first, len); >+ } else if (len > 0) { >+ v = Read1To3(first, len); >+ } else { >+ // Empty ranges have no effect. >+ return state; >+ } >+ return Mix(state, v); >+} >+ >+// Overload of CityHashState::CombineContiguousImpl() >+inline uint64_t CityHashState::CombineContiguousImpl( >+ uint64_t state, const unsigned char* first, size_t len, >+ std::integral_constant<int, 8> /* sizeof_size_t */) { >+ // For large values we use CityHash, for small ones we just use a >+ // multiplicative hash. >+ uint64_t v; >+ if (len > 16) { >+ v = absl::hash_internal::CityHash64(reinterpret_cast<const char*>(first), len); >+ } else if (len > 8) { >+ auto p = Read9To16(first, len); >+ state = Mix(state, p.first); >+ v = p.second; >+ } else if (len >= 4) { >+ v = Read4To8(first, len); >+ } else if (len > 0) { >+ v = Read1To3(first, len); >+ } else { >+ // Empty ranges have no effect. >+ return state; >+ } >+ return Mix(state, v); >+} >+ >+ >+struct AggregateBarrier {}; >+ >+// HashImpl >+ >+// Add a private base class to make sure this type is not an aggregate. >+// Aggregates can be aggregate initialized even if the default constructor is >+// deleted. >+struct PoisonedHash : private AggregateBarrier { >+ PoisonedHash() = delete; >+ PoisonedHash(const PoisonedHash&) = delete; >+ PoisonedHash& operator=(const PoisonedHash&) = delete; >+}; >+ >+template <typename T> >+struct HashImpl { >+ size_t operator()(const T& value) const { return CityHashState::hash(value); } >+}; >+ >+template <typename T> >+struct Hash >+ : absl::conditional_t<is_hashable<T>::value, HashImpl<T>, PoisonedHash> {}; >+ >+template <typename H> >+template <typename T, typename... Ts> >+H HashStateBase<H>::combine(H state, const T& value, const Ts&... 
values) { >+ return H::combine(hash_internal::InvokeHash(std::move(state), value), >+ values...); >+} >+ >+// HashStateBase::combine_contiguous() >+template <typename H> >+template <typename T> >+H HashStateBase<H>::combine_contiguous(H state, const T* data, size_t size) { >+ return hash_internal::hash_range_or_bytes(std::move(state), data, size); >+} >+} // namespace hash_internal >+} // namespace absl >+ >+#endif // ABSL_HASH_INTERNAL_HASH_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/internal/print_hash_of.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/internal/print_hash_of.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..b6df31cc5e85b9046e0a54ebcbe839bc60402175 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/internal/print_hash_of.cc >@@ -0,0 +1,23 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. >+ >+#include <cstdlib> >+ >+#include "absl/hash/hash.h" >+ >+// Prints the hash of argv[1]. 
>+int main(int argc, char** argv) { >+ if (argc < 2) return 1; >+ printf("%zu\n", absl::Hash<int>{}(std::atoi(argv[1]))); // NOLINT >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/internal/spy_hash_state.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/internal/spy_hash_state.h >new file mode 100644 >index 0000000000000000000000000000000000000000..03d795b09001e3dc303013738f6246aa2de0ac5f >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/hash/internal/spy_hash_state.h >@@ -0,0 +1,218 @@ >+// Copyright 2018 The Abseil Authors. >+// >+// Licensed under the Apache License, Version 2.0 (the "License"); >+// you may not use this file except in compliance with the License. >+// You may obtain a copy of the License at >+// >+// http://www.apache.org/licenses/LICENSE-2.0 >+// >+// Unless required by applicable law or agreed to in writing, software >+// distributed under the License is distributed on an "AS IS" BASIS, >+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >+// See the License for the specific language governing permissions and >+// limitations under the License. >+ >+#ifndef ABSL_HASH_INTERNAL_SPY_HASH_STATE_H_ >+#define ABSL_HASH_INTERNAL_SPY_HASH_STATE_H_ >+ >+#include <ostream> >+#include <string> >+#include <vector> >+ >+#include "absl/hash/hash.h" >+#include "absl/strings/match.h" >+#include "absl/strings/str_format.h" >+#include "absl/strings/str_join.h" >+ >+namespace absl { >+namespace hash_internal { >+ >+// SpyHashState is an implementation of the HashState API that simply >+// accumulates all input bytes in an internal buffer. This makes it useful >+// for testing AbslHashValue overloads (so long as they are templated on the >+// HashState parameter), since it can report the exact hash representation >+// that the AbslHashValue overload produces. 
>+// >+// Sample usage: >+// EXPECT_EQ(SpyHashState::combine(SpyHashState(), foo), >+// SpyHashState::combine(SpyHashState(), bar)); >+template <typename T> >+class SpyHashStateImpl : public HashStateBase<SpyHashStateImpl<T>> { >+ public: >+ SpyHashStateImpl() >+ : error_(std::make_shared<absl::optional<std::string>>()) { >+ static_assert(std::is_void<T>::value, ""); >+ } >+ >+ // Move-only >+ SpyHashStateImpl(const SpyHashStateImpl&) = delete; >+ SpyHashStateImpl& operator=(const SpyHashStateImpl&) = delete; >+ >+ SpyHashStateImpl(SpyHashStateImpl&& other) noexcept { >+ *this = std::move(other); >+ } >+ >+ SpyHashStateImpl& operator=(SpyHashStateImpl&& other) noexcept { >+ hash_representation_ = std::move(other.hash_representation_); >+ error_ = other.error_; >+ moved_from_ = other.moved_from_; >+ other.moved_from_ = true; >+ return *this; >+ } >+ >+ template <typename U> >+ SpyHashStateImpl(SpyHashStateImpl<U>&& other) { // NOLINT >+ hash_representation_ = std::move(other.hash_representation_); >+ error_ = other.error_; >+ moved_from_ = other.moved_from_; >+ other.moved_from_ = true; >+ } >+ >+ template <typename A, typename... Args> >+ static SpyHashStateImpl combine(SpyHashStateImpl s, const A& a, >+ const Args&... args) { >+ // Pass an instance of SpyHashStateImpl<A> when trying to combine `A`. This >+ // allows us to test that the user only uses this instance for combine calls >+ // and does not call AbslHashValue directly. >+ // See AbslHashValue implementation at the bottom. 
>+ s = SpyHashStateImpl<A>::HashStateBase::combine(std::move(s), a); >+ return SpyHashStateImpl::combine(std::move(s), args...); >+ } >+ static SpyHashStateImpl combine(SpyHashStateImpl s) { >+ if (direct_absl_hash_value_error_) { >+ *s.error_ = "AbslHashValue should not be invoked directly."; >+ } else if (s.moved_from_) { >+ *s.error_ = "Used moved-from instance of the hash state object."; >+ } >+ return s; >+ } >+ >+ static void SetDirectAbslHashValueError() { >+ direct_absl_hash_value_error_ = true; >+ } >+ >+ // Two SpyHashStateImpl objects are equal if they hold equal hash >+ // representations. >+ friend bool operator==(const SpyHashStateImpl& lhs, >+ const SpyHashStateImpl& rhs) { >+ return lhs.hash_representation_ == rhs.hash_representation_; >+ } >+ >+ friend bool operator!=(const SpyHashStateImpl& lhs, >+ const SpyHashStateImpl& rhs) { >+ return !(lhs == rhs); >+ } >+ >+ enum class CompareResult { >+ kEqual, >+ kASuffixB, >+ kBSuffixA, >+ kUnequal, >+ }; >+ >+ static CompareResult Compare(const SpyHashStateImpl& a, >+ const SpyHashStateImpl& b) { >+ const std::string a_flat = absl::StrJoin(a.hash_representation_, ""); >+ const std::string b_flat = absl::StrJoin(b.hash_representation_, ""); >+ if (a_flat == b_flat) return CompareResult::kEqual; >+ if (absl::EndsWith(a_flat, b_flat)) return CompareResult::kBSuffixA; >+ if (absl::EndsWith(b_flat, a_flat)) return CompareResult::kASuffixB; >+ return CompareResult::kUnequal; >+ } >+ >+ // operator<< prints the hash representation as a hex and ASCII dump, to >+ // facilitate debugging. 
>+ friend std::ostream& operator<<(std::ostream& out, >+ const SpyHashStateImpl& hash_state) { >+ out << "[\n"; >+ for (auto& s : hash_state.hash_representation_) { >+ size_t offset = 0; >+ for (char c : s) { >+ if (offset % 16 == 0) { >+ out << absl::StreamFormat("\n0x%04x: ", offset); >+ } >+ if (offset % 2 == 0) { >+ out << " "; >+ } >+ out << absl::StreamFormat("%02x", c); >+ ++offset; >+ } >+ out << "\n"; >+ } >+ return out << "]"; >+ } >+ >+ // The base case of the combine recursion, which writes raw bytes into the >+ // internal buffer. >+ static SpyHashStateImpl combine_contiguous(SpyHashStateImpl hash_state, >+ const unsigned char* begin, >+ size_t size) { >+ hash_state.hash_representation_.emplace_back( >+ reinterpret_cast<const char*>(begin), size); >+ return hash_state; >+ } >+ >+ using SpyHashStateImpl::HashStateBase::combine_contiguous; >+ >+ absl::optional<std::string> error() const { >+ if (moved_from_) { >+ return "Returned a moved-from instance of the hash state object."; >+ } >+ return *error_; >+ } >+ >+ private: >+ template <typename U> >+ friend class SpyHashStateImpl; >+ >+ // This is true if SpyHashStateImpl<T> has been passed to a call of >+ // AbslHashValue with the wrong type. This detects that the user called >+ // AbslHashValue directly (because the hash state type does not match). >+ static bool direct_absl_hash_value_error_; >+ >+ >+ std::vector<std::string> hash_representation_; >+ // This is a shared_ptr because we want all instances of the particular >+ // SpyHashState run to share the field. This way we can set the error for >+ // use-after-move and all the copies will see it. 
>+ std::shared_ptr<absl::optional<std::string>> error_; >+ bool moved_from_ = false; >+}; >+ >+template <typename T> >+bool SpyHashStateImpl<T>::direct_absl_hash_value_error_; >+ >+template <bool& B> >+struct OdrUse { >+ constexpr OdrUse() {} >+ bool& b = B; >+}; >+ >+template <void (*)()> >+struct RunOnStartup { >+ static bool run; >+ static constexpr OdrUse<run> kOdrUse{}; >+}; >+ >+template <void (*f)()> >+bool RunOnStartup<f>::run = (f(), true); >+ >+template < >+ typename T, typename U, >+ // Only trigger for when (T != U), >+ absl::enable_if_t<!std::is_same<T, U>::value, int> = 0, >+ // This statement works in two ways: >+ // - First, it instantiates RunOnStartup and forces the initialization of >+ // `run`, which set the global variable. >+ // - Second, it triggers a SFINAE error disabling the overload to prevent >+ // compile time errors. If we didn't disable the overload we would get >+ // ambiguous overload errors, which we don't want. >+ int = RunOnStartup<SpyHashStateImpl<T>::SetDirectAbslHashValueError>::run> >+void AbslHashValue(SpyHashStateImpl<T>, const U&); >+ >+using SpyHashState = SpyHashStateImpl<void>; >+ >+} // namespace hash_internal >+} // namespace absl >+ >+#endif // ABSL_HASH_INTERNAL_SPY_HASH_STATE_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/memory/BUILD.bazel b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/memory/BUILD.bazel >index 46f47b12bb2adb2f672fb050fb40d77ecf85bc21..89a312eac4b19bf88ba1f2100ca045d1cc32785e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/memory/BUILD.bazel >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/memory/BUILD.bazel >@@ -19,6 +19,7 @@ load( > "ABSL_DEFAULT_COPTS", > "ABSL_TEST_COPTS", > "ABSL_EXCEPTIONS_FLAG", >+ "ABSL_EXCEPTIONS_FLAG_LINKOPTS", > ) > > package(default_visibility = ["//visibility:public"]) >@@ -53,6 +54,7 @@ cc_test( > "memory_exception_safety_test.cc", > ], > copts = ABSL_TEST_COPTS + 
ABSL_EXCEPTIONS_FLAG, >+ linkopts = ABSL_EXCEPTIONS_FLAG_LINKOPTS, > deps = [ > ":memory", > "//absl/base:exception_safety_testing", >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/memory/memory.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/memory/memory.h >index c7caf8b94dc91b2838d13eb48904d172d2a30f73..1eaec0f40aead57d692e4b858e64214045645f81 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/memory/memory.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/memory/memory.h >@@ -39,16 +39,30 @@ namespace absl { > // Function Template: WrapUnique() > // ----------------------------------------------------------------------------- > // >-// Adopts ownership from a raw pointer and transfers it to the returned >-// `std::unique_ptr`, whose type is deduced. DO NOT specify the template type T >-// when calling WrapUnique. >+// Adopts ownership from a raw pointer and transfers it to the returned >+// `std::unique_ptr`, whose type is deduced. Because of this deduction, *do not* >+// specify the template type `T` when calling `WrapUnique`. > // > // Example: > // X* NewX(int, int); > // auto x = WrapUnique(NewX(1, 2)); // 'x' is std::unique_ptr<X>. > // >-// `absl::WrapUnique` is useful for capturing the output of a raw pointer >-// factory. However, prefer 'absl::make_unique<T>(args...) over >+// The purpose of WrapUnique is to automatically deduce the pointer type. If you >+// wish to make the type explicit, for readability reasons or because you prefer >+// to use a base-class pointer rather than a derived one, just use >+// `std::unique_ptr` directly. >+// >+// Example: >+// X* Factory(int, int); >+// auto x = std::unique_ptr<X>(Factory(1, 2)); >+// - or - >+// std::unique_ptr<X> x(Factory(1, 2)); >+// >+// This has the added advantage of working whether Factory returns a raw >+// pointer or a `std::unique_ptr`. 
>+// >+// While `absl::WrapUnique` is useful for capturing the output of a raw >+// pointer factory, prefer 'absl::make_unique<T>(args...)' over > // 'absl::WrapUnique(new T(args...))'. > // > // auto x = WrapUnique(new X(1, 2)); // works, but nonideal. >@@ -641,55 +655,59 @@ struct default_allocator_is_nothrow : std::false_type {}; > #endif > > namespace memory_internal { >-#ifdef ABSL_HAVE_EXCEPTIONS >-template <typename Allocator, typename StorageElement, typename... Args> >-void ConstructStorage(Allocator* alloc, StorageElement* first, >- StorageElement* last, const Args&... args) { >- for (StorageElement* cur = first; cur != last; ++cur) { >+#ifdef ABSL_HAVE_EXCEPTIONS // ConstructRange >+template <typename Allocator, typename Iterator, typename... Args> >+void ConstructRange(Allocator& alloc, Iterator first, Iterator last, >+ const Args&... args) { >+ for (Iterator cur = first; cur != last; ++cur) { > try { >- std::allocator_traits<Allocator>::construct(*alloc, cur, args...); >+ std::allocator_traits<Allocator>::construct(alloc, cur, args...); > } catch (...) { > while (cur != first) { > --cur; >- std::allocator_traits<Allocator>::destroy(*alloc, cur); >+ std::allocator_traits<Allocator>::destroy(alloc, cur); > } > throw; > } > } > } >-template <typename Allocator, typename StorageElement, typename Iterator> >-void CopyToStorageFromRange(Allocator* alloc, StorageElement* destination, >- Iterator first, Iterator last) { >- for (StorageElement* cur = destination; first != last; >+#else // ABSL_HAVE_EXCEPTIONS // ConstructRange >+template <typename Allocator, typename Iterator, typename... Args> >+void ConstructRange(Allocator& alloc, Iterator first, Iterator last, >+ const Args&... 
args) { >+ for (; first != last; ++first) { >+ std::allocator_traits<Allocator>::construct(alloc, first, args...); >+ } >+} >+#endif // ABSL_HAVE_EXCEPTIONS // ConstructRange >+ >+#ifdef ABSL_HAVE_EXCEPTIONS // CopyRange >+template <typename Allocator, typename Iterator, typename InputIterator> >+void CopyRange(Allocator& alloc, Iterator destination, InputIterator first, >+ InputIterator last) { >+ for (Iterator cur = destination; first != last; > static_cast<void>(++cur), static_cast<void>(++first)) { > try { >- std::allocator_traits<Allocator>::construct(*alloc, cur, *first); >+ std::allocator_traits<Allocator>::construct(alloc, cur, *first); > } catch (...) { > while (cur != destination) { > --cur; >- std::allocator_traits<Allocator>::destroy(*alloc, cur); >+ std::allocator_traits<Allocator>::destroy(alloc, cur); > } > throw; > } > } > } >-#else // ABSL_HAVE_EXCEPTIONS >-template <typename Allocator, typename StorageElement, typename... Args> >-void ConstructStorage(Allocator* alloc, StorageElement* first, >- StorageElement* last, const Args&... 
args) { >- for (; first != last; ++first) { >- std::allocator_traits<Allocator>::construct(*alloc, first, args...); >- } >-} >-template <typename Allocator, typename StorageElement, typename Iterator> >-void CopyToStorageFromRange(Allocator* alloc, StorageElement* destination, >- Iterator first, Iterator last) { >+#else // ABSL_HAVE_EXCEPTIONS // CopyRange >+template <typename Allocator, typename Iterator, typename InputIterator> >+void CopyRange(Allocator& alloc, Iterator destination, InputIterator first, >+ InputIterator last) { > for (; first != last; > static_cast<void>(++destination), static_cast<void>(++first)) { >- std::allocator_traits<Allocator>::construct(*alloc, destination, *first); >+ std::allocator_traits<Allocator>::construct(alloc, destination, *first); > } > } >-#endif // ABSL_HAVE_EXCEPTIONS >+#endif // ABSL_HAVE_EXCEPTIONS // CopyRange > } // namespace memory_internal > } // namespace absl > >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/memory/memory_exception_safety_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/memory/memory_exception_safety_test.cc >index d1f6e84f10f8a668ac25fb6d90ccc5a783202219..00d2b192cfc32df624363ca863ec79f90f6185b5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/memory/memory_exception_safety_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/memory/memory_exception_safety_test.cc >@@ -32,7 +32,7 @@ TEST(MakeUnique, CheckForLeaks) { > .WithInitialValue(Thrower(kValue)) > // Ensures make_unique does not modify the input. The real > // test, though, is ConstructorTracker checking for leaks. 
>- .WithInvariants(testing::strong_guarantee); >+ .WithContracts(testing::strong_guarantee); > > EXPECT_TRUE(tester.Test([](Thrower* thrower) { > static_cast<void>(absl::make_unique<Thrower>(*thrower)); >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/memory/memory_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/memory/memory_test.cc >index dee9b486a30d30d255705ac3814f19f781994053..2e453e22ba5ba39cfd690fc591e4afe487cb78a9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/memory/memory_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/memory/memory_test.cc >@@ -149,7 +149,6 @@ TEST(Make_UniqueTest, NotAmbiguousWithStdMakeUnique) { > } > > #if 0 >-// TODO(billydonahue): Make a proper NC test. > // These tests shouldn't compile. > TEST(MakeUniqueTestNC, AcceptMoveOnlyLvalue) { > auto m = MoveOnly(); >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/meta/type_traits.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/meta/type_traits.h >index 457b890841a64fb4ed36cde8621d7822190b798d..23ebd6ed0990fd3d9510d4d91066f074cfe11101 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/meta/type_traits.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/meta/type_traits.h >@@ -105,8 +105,25 @@ template <class To, template <class...> class Op, class... 
Args> > struct is_detected_convertible > : is_detected_convertible_impl<void, To, Op, Args...>::type {}; > >+template <typename T> >+using IsCopyAssignableImpl = >+ decltype(std::declval<T&>() = std::declval<const T&>()); >+ >+template <typename T> >+using IsMoveAssignableImpl = decltype(std::declval<T&>() = std::declval<T&&>()); >+ > } // namespace type_traits_internal > >+template <typename T> >+struct is_copy_assignable : type_traits_internal::is_detected< >+ type_traits_internal::IsCopyAssignableImpl, T> { >+}; >+ >+template <typename T> >+struct is_move_assignable : type_traits_internal::is_detected< >+ type_traits_internal::IsMoveAssignableImpl, T> { >+}; >+ > // void_t() > // > // Ignores the type of any its arguments and returns `void`. In general, this >@@ -309,7 +326,7 @@ template <typename T> > struct is_trivially_copy_assignable > : std::integral_constant< > bool, __has_trivial_assign(typename std::remove_reference<T>::type) && >- std::is_copy_assignable<T>::value> { >+ absl::is_copy_assignable<T>::value> { > #ifdef ABSL_HAVE_STD_IS_TRIVIALLY_ASSIGNABLE > private: > static constexpr bool compliant = >@@ -409,11 +426,11 @@ struct IsHashEnabled > : absl::conjunction<std::is_default_constructible<std::hash<Key>>, > std::is_copy_constructible<std::hash<Key>>, > std::is_destructible<std::hash<Key>>, >- std::is_copy_assignable<std::hash<Key>>, >+ absl::is_copy_assignable<std::hash<Key>>, > IsHashable<Key>> {}; >+ > } // namespace type_traits_internal > > } // namespace absl > >- > #endif // ABSL_META_TYPE_TRAITS_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/meta/type_traits_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/meta/type_traits_test.cc >index 81b4bd323977e5e3cd27660c50cced86ab225017..f51f5dedee435d683b4b2f5eb3012bd61b5e2aa3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/meta/type_traits_test.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/meta/type_traits_test.cc >@@ -877,4 +877,80 @@ TEST(TypeTraitsTest, TestResultOf) { > EXPECT_EQ(TypeEnum::D, GetTypeExt(Wrap<TypeD>())); > } > >+template <typename T> >+bool TestCopyAssign() { >+ return absl::is_copy_assignable<T>::value == >+ std::is_copy_assignable<T>::value; >+} >+ >+TEST(TypeTraitsTest, IsCopyAssignable) { >+ EXPECT_TRUE(TestCopyAssign<int>()); >+ EXPECT_TRUE(TestCopyAssign<int&>()); >+ EXPECT_TRUE(TestCopyAssign<int&&>()); >+ >+ struct S {}; >+ EXPECT_TRUE(TestCopyAssign<S>()); >+ EXPECT_TRUE(TestCopyAssign<S&>()); >+ EXPECT_TRUE(TestCopyAssign<S&&>()); >+ >+ class C { >+ public: >+ explicit C(C* c) : c_(c) {} >+ ~C() { delete c_; } >+ >+ private: >+ C* c_; >+ }; >+ EXPECT_TRUE(TestCopyAssign<C>()); >+ EXPECT_TRUE(TestCopyAssign<C&>()); >+ EXPECT_TRUE(TestCopyAssign<C&&>()); >+ >+ // Reason for ifndef: add_lvalue_reference<T> in libc++ breaks for these cases >+#ifndef _LIBCPP_VERSION >+ EXPECT_TRUE(TestCopyAssign<int()>()); >+ EXPECT_TRUE(TestCopyAssign<int(int) const>()); >+ EXPECT_TRUE(TestCopyAssign<int(...) volatile&>()); >+ EXPECT_TRUE(TestCopyAssign<int(int, ...) 
const volatile&&>()); >+#endif // _LIBCPP_VERSION >+} >+ >+template <typename T> >+bool TestMoveAssign() { >+ return absl::is_move_assignable<T>::value == >+ std::is_move_assignable<T>::value; >+} >+ >+TEST(TypeTraitsTest, IsMoveAssignable) { >+ EXPECT_TRUE(TestMoveAssign<int>()); >+ EXPECT_TRUE(TestMoveAssign<int&>()); >+ EXPECT_TRUE(TestMoveAssign<int&&>()); >+ >+ struct S {}; >+ EXPECT_TRUE(TestMoveAssign<S>()); >+ EXPECT_TRUE(TestMoveAssign<S&>()); >+ EXPECT_TRUE(TestMoveAssign<S&&>()); >+ >+ class C { >+ public: >+ explicit C(C* c) : c_(c) {} >+ ~C() { delete c_; } >+ void operator=(const C&) = delete; >+ void operator=(C&&) = delete; >+ >+ private: >+ C* c_; >+ }; >+ EXPECT_TRUE(TestMoveAssign<C>()); >+ EXPECT_TRUE(TestMoveAssign<C&>()); >+ EXPECT_TRUE(TestMoveAssign<C&&>()); >+ >+ // Reason for ifndef: add_lvalue_reference<T> in libc++ breaks for these cases >+#ifndef _LIBCPP_VERSION >+ EXPECT_TRUE(TestMoveAssign<int()>()); >+ EXPECT_TRUE(TestMoveAssign<int(int) const>()); >+ EXPECT_TRUE(TestMoveAssign<int(...) volatile&>()); >+ EXPECT_TRUE(TestMoveAssign<int(int, ...) 
const volatile&&>()); >+#endif // _LIBCPP_VERSION >+} >+ > } // namespace >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/numeric/BUILD.bazel b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/numeric/BUILD.bazel >index f49571ebb3e1042434151a2237127a99df1ad301..324ce6695e402c8973fc5b5df257aa5f265203dd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/numeric/BUILD.bazel >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/numeric/BUILD.bazel >@@ -49,6 +49,7 @@ cc_test( > ":int128", > "//absl/base", > "//absl/base:core_headers", >+ "//absl/hash:hash_testing", > "//absl/meta:type_traits", > "@com_google_googletest//:gtest_main", > ], >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/numeric/int128.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/numeric/int128.h >index 2d131b8bda1c348d91743e7c42d38a57968c330b..79b62a7586695d5b2f431d38af52a0ab2274ae2c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/numeric/int128.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/numeric/int128.h >@@ -192,6 +192,12 @@ class > // Returns the highest value for a 128-bit unsigned integer. > friend constexpr uint128 Uint128Max(); > >+ // Support for absl::Hash. 
>+ template <typename H> >+ friend H AbslHashValue(H h, uint128 v) { >+ return H::combine(std::move(h), Uint128High64(v), Uint128Low64(v)); >+ } >+ > private: > constexpr uint128(uint64_t high, uint64_t low); > >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/numeric/int128_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/numeric/int128_test.cc >index 1eb3e0ec8961c1c6c88756ef1f201f0203aee0a2..dfe3475aca0627b29e633cc2f85ce846b61bc3f1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/numeric/int128_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/numeric/int128_test.cc >@@ -23,6 +23,7 @@ > > #include "gtest/gtest.h" > #include "absl/base/internal/cycleclock.h" >+#include "absl/hash/hash_testing.h" > #include "absl/meta/type_traits.h" > > #if defined(_MSC_VER) && _MSC_VER == 1900 >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/BUILD.bazel b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/BUILD.bazel >index 3a5f1332cda0e1f7040ef0dbdae557b018213854..6d7c2619a530e000f74ea3115dfd797d702054ff 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/BUILD.bazel >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/BUILD.bazel >@@ -19,6 +19,7 @@ load( > "ABSL_DEFAULT_COPTS", > "ABSL_TEST_COPTS", > "ABSL_EXCEPTIONS_FLAG", >+ "ABSL_EXCEPTIONS_FLAG_LINKOPTS", > ) > > package( >@@ -69,6 +70,7 @@ cc_library( > deps = [ > ":internal", > "//absl/base", >+ "//absl/base:bits", > "//absl/base:config", > "//absl/base:core_headers", > "//absl/base:endian", >@@ -86,7 +88,6 @@ cc_library( > "internal/utf8.cc", > ], > hdrs = [ >- "internal/bits.h", > "internal/char_map.h", > "internal/ostringstream.h", > "internal/resize_uninitialized.h", >@@ -212,7 +213,6 @@ cc_test( > visibility = ["//visibility:private"], > deps = [ > ":internal", >- ":strings", > 
"//absl/base:core_headers", > "@com_google_googletest//:gtest_main", > ], >@@ -237,6 +237,7 @@ cc_test( > size = "small", > srcs = ["string_view_test.cc"], > copts = ABSL_TEST_COPTS + ABSL_EXCEPTIONS_FLAG, >+ linkopts = ABSL_EXCEPTIONS_FLAG_LINKOPTS, > visibility = ["//visibility:private"], > deps = [ > ":strings", >@@ -373,7 +374,6 @@ cc_test( > visibility = ["//visibility:private"], > deps = [ > ":strings", >- "//absl/memory", > "@com_github_google_benchmark//:benchmark_main", > ], > ) >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/BUILD.gn >index 9a8a10d0e7108e79e42c9c5b36355cf879016b95..20af83c0e518b650eb2e899d3e7b3c9efe7c108b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/BUILD.gn >@@ -59,6 +59,7 @@ source_set("strings") { > deps = [ > ":internal", > "../base", >+ "../base:bits", > "../base:config", > "../base:core_headers", > "../base:endian", >@@ -81,7 +82,6 @@ source_set("internal") { > "internal/utf8.cc", > ] > public = [ >- "internal/bits.h", > "internal/char_map.h", > "internal/ostringstream.h", > "internal/resize_uninitialized.h", >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/CMakeLists.txt b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/CMakeLists.txt >index cd122134729fa686e98610269f18acb18f8d3007..f3e4162387f15c832d75062adcc6b7ed5d5da17a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/CMakeLists.txt >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/CMakeLists.txt >@@ -32,7 +32,6 @@ list(APPEND STRINGS_PUBLIC_HEADERS > > > list(APPEND STRINGS_INTERNAL_HEADERS >- "internal/bits.h" > "internal/char_map.h" > "internal/charconv_bigint.h" > "internal/charconv_parse.h" >diff --git 
a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/ascii.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/ascii.h >index 96a64541c321a710d29efc8d9263862f1b9960e7..48a9da22e769a85128a69cf2dfc332980afed445 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/ascii.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/ascii.h >@@ -165,7 +165,7 @@ inline char ascii_tolower(unsigned char c) { > // Converts the characters in `s` to lowercase, changing the contents of `s`. > void AsciiStrToLower(std::string* s); > >-// Creates a lowercase std::string from a given absl::string_view. >+// Creates a lowercase string from a given absl::string_view. > ABSL_MUST_USE_RESULT inline std::string AsciiStrToLower(absl::string_view s) { > std::string result(s); > absl::AsciiStrToLower(&result); >@@ -183,7 +183,7 @@ inline char ascii_toupper(unsigned char c) { > // Converts the characters in `s` to uppercase, changing the contents of `s`. > void AsciiStrToUpper(std::string* s); > >-// Creates an uppercase std::string from a given absl::string_view. >+// Creates an uppercase string from a given absl::string_view. > ABSL_MUST_USE_RESULT inline std::string AsciiStrToUpper(absl::string_view s) { > std::string result(s); > absl::AsciiStrToUpper(&result); >@@ -195,10 +195,10 @@ ABSL_MUST_USE_RESULT inline std::string AsciiStrToUpper(absl::string_view s) { > ABSL_MUST_USE_RESULT inline absl::string_view StripLeadingAsciiWhitespace( > absl::string_view str) { > auto it = std::find_if_not(str.begin(), str.end(), absl::ascii_isspace); >- return absl::string_view(it, str.end() - it); >+ return str.substr(it - str.begin()); > } > >-// Strips in place whitespace from the beginning of the given std::string. >+// Strips in place whitespace from the beginning of the given string. 
> inline void StripLeadingAsciiWhitespace(std::string* str) { > auto it = std::find_if_not(str->begin(), str->end(), absl::ascii_isspace); > str->erase(str->begin(), it); >@@ -209,10 +209,10 @@ inline void StripLeadingAsciiWhitespace(std::string* str) { > ABSL_MUST_USE_RESULT inline absl::string_view StripTrailingAsciiWhitespace( > absl::string_view str) { > auto it = std::find_if_not(str.rbegin(), str.rend(), absl::ascii_isspace); >- return absl::string_view(str.begin(), str.rend() - it); >+ return str.substr(0, str.rend() - it); > } > >-// Strips in place whitespace from the end of the given std::string >+// Strips in place whitespace from the end of the given string > inline void StripTrailingAsciiWhitespace(std::string* str) { > auto it = std::find_if_not(str->rbegin(), str->rend(), absl::ascii_isspace); > str->erase(str->rend() - it); >@@ -225,7 +225,7 @@ ABSL_MUST_USE_RESULT inline absl::string_view StripAsciiWhitespace( > return StripTrailingAsciiWhitespace(StripLeadingAsciiWhitespace(str)); > } > >-// Strips in place whitespace from both ends of the given std::string >+// Strips in place whitespace from both ends of the given string > inline void StripAsciiWhitespace(std::string* str) { > StripTrailingAsciiWhitespace(str); > StripLeadingAsciiWhitespace(str); >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/charconv.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/charconv.cc >index 08c3947eccd475dee33310d247bf8740039980c8..c7b8c98b1b4459c56ea851e9281e35a92613c60f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/charconv.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/charconv.cc >@@ -20,8 +20,8 @@ > #include <cstring> > > #include "absl/base/casts.h" >+#include "absl/base/internal/bits.h" > #include "absl/numeric/int128.h" >-#include "absl/strings/internal/bits.h" > #include "absl/strings/internal/charconv_bigint.h" > #include 
"absl/strings/internal/charconv_parse.h" > >@@ -243,9 +243,9 @@ struct CalculatedFloat { > // minus the number of leading zero bits.) > int BitWidth(uint128 value) { > if (Uint128High64(value) == 0) { >- return 64 - strings_internal::CountLeadingZeros64(Uint128Low64(value)); >+ return 64 - base_internal::CountLeadingZeros64(Uint128Low64(value)); > } >- return 128 - strings_internal::CountLeadingZeros64(Uint128High64(value)); >+ return 128 - base_internal::CountLeadingZeros64(Uint128High64(value)); > } > > // Calculates how far to the right a mantissa needs to be shifted to create a >@@ -518,7 +518,7 @@ CalculatedFloat CalculateFromParsedHexadecimal( > const strings_internal::ParsedFloat& parsed_hex) { > uint64_t mantissa = parsed_hex.mantissa; > int exponent = parsed_hex.exponent; >- int mantissa_width = 64 - strings_internal::CountLeadingZeros64(mantissa); >+ int mantissa_width = 64 - base_internal::CountLeadingZeros64(mantissa); > const int shift = NormalizedShiftSize<FloatType>(mantissa_width, exponent); > bool result_exact; > exponent += shift; >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/charconv.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/charconv.h >index 3e313679c9618aa88dc9e8b55e88e96a0f7f696b..0735382962356cff7bfb5f3ebd3a0393d2d2810a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/charconv.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/charconv.h >@@ -22,7 +22,7 @@ namespace absl { > // Workalike compatibilty version of std::chars_format from C++17. > // > // This is an bitfield enumerator which can be passed to absl::from_chars to >-// configure the std::string-to-float conversion. >+// configure the string-to-float conversion. 
> enum class chars_format { > scientific = 1, > fixed = 2, >@@ -30,7 +30,7 @@ enum class chars_format { > general = fixed | scientific, > }; > >-// The return result of a std::string-to-number conversion. >+// The return result of a string-to-number conversion. > // > // `ec` will be set to `invalid_argument` if a well-formed number was not found > // at the start of the input range, `result_out_of_range` if a well-formed >@@ -67,7 +67,7 @@ struct from_chars_result { > // If `fmt` is set, it must be one of the enumerator values of the chars_format. > // (This is despite the fact that chars_format is a bitmask type.) If set to > // `scientific`, a matching number must contain an exponent. If set to `fixed`, >-// then an exponent will never match. (For example, the std::string "1e5" will be >+// then an exponent will never match. (For example, the string "1e5" will be > // parsed as "1".) If set to `hex`, then a hexadecimal float is parsed in the > // format that strtod() accepts, except that a "0x" prefix is NOT matched. > // (In particular, in `hex` mode, the input "0xff" results in the largest >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/charconv_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/charconv_test.cc >index f8d71cc6c448c7a3e7f0caf98098fc7b4d2d5642..89418fe948ffedc4b75ac2fa810fd4416605c3bd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/charconv_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/charconv_test.cc >@@ -33,7 +33,7 @@ namespace { > > #if ABSL_COMPILER_DOES_EXACT_ROUNDING > >-// Tests that the given std::string is accepted by absl::from_chars, and that it >+// Tests that the given string is accepted by absl::from_chars, and that it > // converts exactly equal to the given number. 
> void TestDoubleParse(absl::string_view str, double expected_number) { > SCOPED_TRACE(str); >@@ -250,7 +250,7 @@ TEST(FromChars, NearRoundingCasesExplicit) { > EXPECT_EQ(ToFloat("459926601011.e15"), ldexpf(12466336, 65)); > } > >-// Common test logic for converting a std::string which lies exactly halfway between >+// Common test logic for converting a string which lies exactly halfway between > // two target floats. > // > // mantissa and exponent represent the precise value between two floating point >@@ -655,7 +655,7 @@ int NextStep(int step) { > // is correct for in-bounds values, and that overflow and underflow are done > // correctly for out-of-bounds values. > // >-// input_generator maps from an integer index to a std::string to test. >+// input_generator maps from an integer index to a string to test. > // expected_generator maps from an integer index to an expected Float value. > // from_chars conversion of input_generator(i) should result in > // expected_generator(i). >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/escaping.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/escaping.cc >index fbc9f756315fa7fe9a7bb025d48c3fb284b9e8fb..8d8b00b2001ce8307b48788007cc5655a75f54fa 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/escaping.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/escaping.cc >@@ -89,7 +89,7 @@ inline bool IsSurrogate(char32_t c, absl::string_view src, std::string* error) { > // > // Unescapes C escape sequences and is the reverse of CEscape(). > // >-// If 'source' is valid, stores the unescaped std::string and its size in >+// If 'source' is valid, stores the unescaped string and its size in > // 'dest' and 'dest_len' respectively, and returns true. Otherwise > // returns false and optionally stores the error description in > // 'error'. Set 'error' to nullptr to disable error reporting. 
>@@ -104,7 +104,7 @@ bool CUnescapeInternal(absl::string_view source, bool leave_nulls_escaped, > char* dest, ptrdiff_t* dest_len, std::string* error) { > char* d = dest; > const char* p = source.data(); >- const char* end = source.end(); >+ const char* end = p + source.size(); > const char* last_byte = end - 1; > > // Small optimization for case where source = dest and there's no escaping >@@ -294,7 +294,7 @@ bool CUnescapeInternal(absl::string_view source, bool leave_nulls_escaped, > // ---------------------------------------------------------------------- > // CUnescapeInternal() > // >-// Same as above but uses a C++ std::string for output. 'source' and 'dest' >+// Same as above but uses a C++ string for output. 'source' and 'dest' > // may be the same. > // ---------------------------------------------------------------------- > bool CUnescapeInternal(absl::string_view source, bool leave_nulls_escaped, >@@ -304,7 +304,7 @@ bool CUnescapeInternal(absl::string_view source, bool leave_nulls_escaped, > ptrdiff_t dest_size; > if (!CUnescapeInternal(source, > leave_nulls_escaped, >- const_cast<char*>(dest->data()), >+ &(*dest)[0], > &dest_size, > error)) { > return false; >@@ -684,7 +684,7 @@ bool Base64UnescapeInternal(const char* src_param, size_t szsrc, char* dest, > // The arrays below were generated by the following code > // #include <sys/time.h> > // #include <stdlib.h> >-// #include <std::string.h> >+// #include <string.h> > // main() > // { > // static const char Base64[] = >@@ -939,7 +939,8 @@ constexpr char kBase64Chars[] = > constexpr char kWebSafeBase64Chars[] = > "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-_"; > >-void Base64EscapeInternal(const unsigned char* src, size_t szsrc, std::string* dest, >+template <typename String> >+void Base64EscapeInternal(const unsigned char* src, size_t szsrc, String* dest, > bool do_padding, const char* base64_chars) { > const size_t calc_escaped_size = > CalculateBase64EscapedLenInternal(szsrc, 
do_padding); >@@ -951,7 +952,8 @@ void Base64EscapeInternal(const unsigned char* src, size_t szsrc, std::string* d > dest->erase(escaped_len); > } > >-bool Base64UnescapeInternal(const char* src, size_t slen, std::string* dest, >+template <typename String> >+bool Base64UnescapeInternal(const char* src, size_t slen, String* dest, > const signed char* unbase64) { > // Determine the size of the output std::string. Base64 encodes every 3 bytes into > // 4 characters. any leftover chars are added directly for good measure. >@@ -999,7 +1001,7 @@ constexpr char kHexValue[256] = { > /* clang-format on */ > > // This is a templated function so that T can be either a char* >-// or a std::string. This works because we use the [] operator to access >+// or a string. This works because we use the [] operator to access > // individual characters at a time. > template <typename T> > void HexStringToBytesInternal(const char* from, T to, ptrdiff_t num) { >@@ -1009,7 +1011,7 @@ void HexStringToBytesInternal(const char* from, T to, ptrdiff_t num) { > } > } > >-// This is a templated function so that T can be either a char* or a std::string. >+// This is a templated function so that T can be either a char* or a string. 
> template <typename T> > void BytesToHexStringInternal(const unsigned char* src, T dest, ptrdiff_t num) { > auto dest_ptr = &dest[0]; >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/escaping.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/escaping.h >index 7f1ab96d8ff15d56042b4659d66148214705f0a5..296597304e6ce3f5865dc03a967f34318e69b91c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/escaping.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/escaping.h >@@ -17,7 +17,7 @@ > // File: escaping.h > // ----------------------------------------------------------------------------- > // >-// This header file contains std::string utilities involved in escaping and >+// This header file contains string utilities involved in escaping and > // unescaping strings in various ways. > // > >@@ -37,7 +37,7 @@ namespace absl { > > // CUnescape() > // >-// Unescapes a `source` std::string and copies it into `dest`, rewriting C-style >+// Unescapes a `source` string and copies it into `dest`, rewriting C-style > // escape sequences (http://en.cppreference.com/w/cpp/language/escape) into > // their proper code point equivalents, returning `true` if successful. > // >@@ -57,9 +57,10 @@ namespace absl { > // 0x99). > // > // >-// If any errors are encountered, this function returns `false` and stores the >-// first encountered error in `error`. To disable error reporting, set `error` >-// to `nullptr` or use the overload with no error reporting below. >+// If any errors are encountered, this function returns `false`, leaving the >+// `dest` output parameter in an unspecified state, and stores the first >+// encountered error in `error`. To disable error reporting, set `error` to >+// `nullptr` or use the overload with no error reporting below. 
> // > // Example: > // >@@ -78,7 +79,7 @@ inline bool CUnescape(absl::string_view source, std::string* dest) { > > // CEscape() > // >-// Escapes a 'src' std::string using C-style escapes sequences >+// Escapes a 'src' string using C-style escapes sequences > // (http://en.cppreference.com/w/cpp/language/escape), escaping other > // non-printable/non-whitespace bytes as octal sequences (e.g. "\377"). > // >@@ -91,7 +92,7 @@ std::string CEscape(absl::string_view src); > > // CHexEscape() > // >-// Escapes a 'src' std::string using C-style escape sequences, escaping >+// Escapes a 'src' string using C-style escape sequences, escaping > // other non-printable/non-whitespace bytes as hexadecimal sequences (e.g. > // "\xFF"). > // >@@ -104,7 +105,7 @@ std::string CHexEscape(absl::string_view src); > > // Utf8SafeCEscape() > // >-// Escapes a 'src' std::string using C-style escape sequences, escaping bytes as >+// Escapes a 'src' string using C-style escape sequences, escaping bytes as > // octal sequences, and passing through UTF-8 characters without conversion. > // I.e., when encountering any bytes with their high bit set, this function > // will not escape those values, whether or not they are valid UTF-8. >@@ -112,47 +113,47 @@ std::string Utf8SafeCEscape(absl::string_view src); > > // Utf8SafeCHexEscape() > // >-// Escapes a 'src' std::string using C-style escape sequences, escaping bytes as >+// Escapes a 'src' string using C-style escape sequences, escaping bytes as > // hexadecimal sequences, and passing through UTF-8 characters without > // conversion. > std::string Utf8SafeCHexEscape(absl::string_view src); > > // Base64Unescape() > // >-// Converts a `src` std::string encoded in Base64 to its binary equivalent, writing >+// Converts a `src` string encoded in Base64 to its binary equivalent, writing > // it to a `dest` buffer, returning `true` on success. If `src` contains invalid > // characters, `dest` is cleared and returns `false`. 
> bool Base64Unescape(absl::string_view src, std::string* dest); > >-// WebSafeBase64Unescape(absl::string_view, std::string*) >+// WebSafeBase64Unescape() > // >-// Converts a `src` std::string encoded in Base64 to its binary equivalent, writing >+// Converts a `src` string encoded in Base64 to its binary equivalent, writing > // it to a `dest` buffer, but using '-' instead of '+', and '_' instead of '/'. > // If `src` contains invalid characters, `dest` is cleared and returns `false`. > bool WebSafeBase64Unescape(absl::string_view src, std::string* dest); > > // Base64Escape() > // >-// Encodes a `src` std::string into a `dest` buffer using base64 encoding, with >+// Encodes a `src` string into a `dest` buffer using base64 encoding, with > // padding characters. This function conforms with RFC 4648 section 4 (base64). > void Base64Escape(absl::string_view src, std::string* dest); > > // WebSafeBase64Escape() > // >-// Encodes a `src` std::string into a `dest` buffer using '-' instead of '+' and >+// Encodes a `src` string into a `dest` buffer using '-' instead of '+' and > // '_' instead of '/', and without padding. This function conforms with RFC 4648 > // section 5 (base64url). > void WebSafeBase64Escape(absl::string_view src, std::string* dest); > > // HexStringToBytes() > // >-// Converts an ASCII hex std::string into bytes, returning binary data of length >+// Converts an ASCII hex string into bytes, returning binary data of length > // `from.size()/2`. > std::string HexStringToBytes(absl::string_view from); > > // BytesToHexString() > // >-// Converts binary data into an ASCII text std::string, returning a std::string of size >+// Converts binary data into an ASCII text string, returning a string of size > // `2*from.size()`. 
> std::string BytesToHexString(absl::string_view from); > >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/escaping_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/escaping_test.cc >index 3f65ec107f4555c950ff777e89e93d23e43378e5..9dc27f3f8fa768a2a91cdaf95037d74cff01aebc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/escaping_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/escaping_test.cc >@@ -543,18 +543,19 @@ static struct { > {"abcdefghijklmnopqrstuvwxyz", "YWJjZGVmZ2hpamtsbW5vcHFyc3R1dnd4eXo="}, > }; > >-TEST(Base64, EscapeAndUnescape) { >+template <typename StringType> >+void TestEscapeAndUnescape() { > // Check the short strings; this tests the math (and boundaries) > for (const auto& tc : base64_tests) { >- std::string encoded("this junk should be ignored"); >+ StringType encoded("this junk should be ignored"); > absl::Base64Escape(tc.plaintext, &encoded); > EXPECT_EQ(encoded, tc.cyphertext); > >- std::string decoded("this junk should be ignored"); >+ StringType decoded("this junk should be ignored"); > EXPECT_TRUE(absl::Base64Unescape(encoded, &decoded)); > EXPECT_EQ(decoded, tc.plaintext); > >- std::string websafe(tc.cyphertext); >+ StringType websafe(tc.cyphertext); > for (int c = 0; c < websafe.size(); ++c) { > if ('+' == websafe[c]) websafe[c] = '-'; > if ('/' == websafe[c]) websafe[c] = '_'; >@@ -576,7 +577,7 @@ TEST(Base64, EscapeAndUnescape) { > > // Now try the long strings, this tests the streaming > for (const auto& tc : absl::strings_internal::base64_strings()) { >- std::string buffer; >+ StringType buffer; > absl::WebSafeBase64Escape(tc.plaintext, &buffer); > EXPECT_EQ(tc.cyphertext, buffer); > } >@@ -586,7 +587,7 @@ TEST(Base64, EscapeAndUnescape) { > absl::string_view data_set[] = {"ab-/", absl::string_view("\0bcd", 4), > absl::string_view("abc.\0", 5)}; > for (absl::string_view bad_data : data_set) { 
>- std::string buf; >+ StringType buf; > EXPECT_FALSE(absl::Base64Unescape(bad_data, &buf)); > EXPECT_FALSE(absl::WebSafeBase64Unescape(bad_data, &buf)); > EXPECT_TRUE(buf.empty()); >@@ -594,6 +595,10 @@ TEST(Base64, EscapeAndUnescape) { > } > } > >+TEST(Base64, EscapeAndUnescape) { >+ TestEscapeAndUnescape<std::string>(); >+} >+ > TEST(Base64, DISABLED_HugeData) { > const size_t kSize = size_t(3) * 1000 * 1000 * 1000; > static_assert(kSize % 3 == 0, "kSize must be divisible by 3"); >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/bits.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/bits.h >deleted file mode 100644 >index 901082ccddfca78457c746fbe22ee5709933bfaf..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/bits.h >+++ /dev/null >@@ -1,53 +0,0 @@ >-// Copyright 2018 The Abseil Authors. >-// >-// Licensed under the Apache License, Version 2.0 (the "License"); >-// you may not use this file except in compliance with the License. >-// You may obtain a copy of the License at >-// >-// http://www.apache.org/licenses/LICENSE-2.0 >-// >-// Unless required by applicable law or agreed to in writing, software >-// distributed under the License is distributed on an "AS IS" BASIS, >-// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. >-// See the License for the specific language governing permissions and >-// limitations under the License. >- >-#ifndef ABSL_STRINGS_INTERNAL_BITS_H_ >-#define ABSL_STRINGS_INTERNAL_BITS_H_ >- >-#include <cstdint> >- >-#if defined(_MSC_VER) && defined(_M_X64) >-#include <intrin.h> >-#pragma intrinsic(_BitScanReverse64) >-#endif >- >-namespace absl { >-namespace strings_internal { >- >-// Returns the number of leading 0 bits in a 64-bit value. 
>-inline int CountLeadingZeros64(uint64_t n) { >-#if defined(__GNUC__) >- static_assert(sizeof(unsigned long long) == sizeof(n), // NOLINT(runtime/int) >- "__builtin_clzll does not take 64bit arg"); >- return n == 0 ? 64 : __builtin_clzll(n); >-#elif defined(_MSC_VER) && defined(_M_X64) >- unsigned long result; // NOLINT(runtime/int) >- if (_BitScanReverse64(&result, n)) { >- return 63 - result; >- } >- return 64; >-#else >- int zeroes = 60; >- if (n >> 32) zeroes -= 32, n >>= 32; >- if (n >> 16) zeroes -= 16, n >>= 16; >- if (n >> 8) zeroes -= 8, n >>= 8; >- if (n >> 4) zeroes -= 4, n >>= 4; >- return "\4\3\2\2\1\1\1\1\0\0\0\0\0\0\0\0"[n] + zeroes; >-#endif >-} >- >-} // namespace strings_internal >-} // namespace absl >- >-#endif // ABSL_STRINGS_INTERNAL_BITS_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/charconv_parse.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/charconv_parse.cc >index a04cc67669a7f75f936a767c5f223cb3592d9ebb..7e4dabc2262eff7726b72f54803154a042aaa718 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/charconv_parse.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/charconv_parse.cc >@@ -91,7 +91,7 @@ static_assert(std::numeric_limits<int>::digits10 >= kDecimalExponentDigitsMax, > > // To avoid incredibly large inputs causing integer overflow for our exponent, > // we impose an arbitrary but very large limit on the number of significant >-// digits we will accept. The implementation refuses to match a std::string with >+// digits we will accept. The implementation refuses to match a string with > // more consecutive significant mantissa digits than this. 
> constexpr int kDecimalDigitLimit = 50000000; > >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/charconv_parse_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/charconv_parse_test.cc >index 1ff86004973a82fcd70e301c808a94349ed3bbb1..f48b9aee1a5e84db3b1cc296aab363f2c33db372 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/charconv_parse_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/charconv_parse_test.cc >@@ -29,16 +29,16 @@ using absl::strings_internal::ParseFloat; > > namespace { > >-// Check that a given std::string input is parsed to the expected mantissa and >+// Check that a given string input is parsed to the expected mantissa and > // exponent. > // >-// Input std::string `s` must contain a '$' character. It marks the end of the >+// Input string `s` must contain a '$' character. It marks the end of the > // characters that should be consumed by the match. It is stripped from the > // input to ParseFloat. > // >-// If input std::string `s` contains '[' and ']' characters, these mark the region >+// If input string `s` contains '[' and ']' characters, these mark the region > // of characters that should be marked as the "subrange". For NaNs, this is >-// the location of the extended NaN std::string. For numbers, this is the location >+// the location of the extended NaN string. For numbers, this is the location > // of the full, over-large mantissa. > template <int base> > void ExpectParsedFloat(std::string s, absl::chars_format format_flags, >@@ -92,10 +92,10 @@ void ExpectParsedFloat(std::string s, absl::chars_format format_flags, > EXPECT_EQ(characters_matched, expected_characters_matched); > } > >-// Check that a given std::string input is parsed to the expected mantissa and >+// Check that a given string input is parsed to the expected mantissa and > // exponent. 
> // >-// Input std::string `s` must contain a '$' character. It marks the end of the >+// Input string `s` must contain a '$' character. It marks the end of the > // characters that were consumed by the match. > template <int base> > void ExpectNumber(std::string s, absl::chars_format format_flags, >@@ -106,7 +106,7 @@ void ExpectNumber(std::string s, absl::chars_format format_flags, > expected_literal_exponent); > } > >-// Check that a given std::string input is parsed to the given special value. >+// Check that a given string input is parsed to the given special value. > // > // This tests against both number bases, since infinities and NaNs have > // identical representations in both modes. >@@ -116,7 +116,7 @@ void ExpectSpecial(const std::string& s, absl::chars_format format_flags, > ExpectParsedFloat<16>(s, format_flags, type, 0, 0); > } > >-// Check that a given input std::string is not matched by Float. >+// Check that a given input string is not matched by Float. > template <int base> > void ExpectFailedParse(absl::string_view s, absl::chars_format format_flags) { > ParsedFloat parsed = >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/memutil.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/memutil.h >index a6f1c69138e3872f127e70aeb26e159a82925a0c..7de383b19ec3fc0f6c9331a41d94173c551aecbd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/memutil.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/memutil.h >@@ -14,7 +14,7 @@ > // limitations under the License. > // > >-// These routines provide mem versions of standard C std::string routines, >+// These routines provide mem versions of standard C string routines, > // such as strpbrk. They function exactly the same as the str versions, > // so if you wonder what they are, replace the word "mem" by > // "str" and check out the man page. 
I could return void*, as the >@@ -22,14 +22,14 @@ > // since this is by far the most common way these functions are called. > // > // The difference between the mem and str versions is the mem version >-// takes a pointer and a length, rather than a '\0'-terminated std::string. >+// takes a pointer and a length, rather than a '\0'-terminated string. > // The memcase* routines defined here assume the locale is "C" > // (they use absl::ascii_tolower instead of tolower). > // > // These routines are based on the BSD library. > // >-// Here's a list of routines from std::string.h, and their mem analogues. >-// Functions in lowercase are defined in std::string.h; those in UPPERCASE >+// Here's a list of routines from string.h, and their mem analogues. >+// Functions in lowercase are defined in string.h; those in UPPERCASE > // are defined here: > // > // strlen -- >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/ostringstream.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/ostringstream.h >index 6e1325b9140f8a0c3eedb6f5a13e21d8d6e0aaf6..e81a89affea25be26bbcd3120cc7094502a2d806 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/ostringstream.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/ostringstream.h >@@ -25,18 +25,18 @@ > namespace absl { > namespace strings_internal { > >-// The same as std::ostringstream but appends to a user-specified std::string, >+// The same as std::ostringstream but appends to a user-specified string, > // and is faster. It is ~70% faster to create, ~50% faster to write to, and >-// completely free to extract the result std::string. >+// completely free to extract the result string. > // >-// std::string s; >+// string s; > // OStringStream strm(&s); > // strm << 42 << ' ' << 3.14; // appends to `s` > // > // The stream object doesn't have to be named. 
Starting from C++11 operator<< > // works with rvalues of std::ostream. > // >-// std::string s; >+// string s; > // OStringStream(&s) << 42 << ' ' << 3.14; // appends to `s` > // > // OStringStream is faster to create than std::ostringstream but it's still >@@ -45,14 +45,14 @@ namespace strings_internal { > // > // Creates unnecessary instances of OStringStream: slow. > // >-// std::string s; >+// string s; > // OStringStream(&s) << 42; > // OStringStream(&s) << ' '; > // OStringStream(&s) << 3.14; > // > // Creates a single instance of OStringStream and reuses it: fast. > // >-// std::string s; >+// string s; > // OStringStream strm(&s); > // strm << 42; > // strm << ' '; >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/resize_uninitialized.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/resize_uninitialized.h >index 0157ca0245f350bece6dcbe1987bdb80b4a54551..a94e0547b50f154234374e8e608042a6c7033bc9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/resize_uninitialized.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/resize_uninitialized.h >@@ -44,8 +44,8 @@ void ResizeUninit(string_type* s, size_t new_size, std::false_type) { > s->resize(new_size); > } > >-// Returns true if the std::string implementation supports a resize where >-// the new characters added to the std::string are left untouched. >+// Returns true if the string implementation supports a resize where >+// the new characters added to the string are left untouched. > // > // (A better name might be "STLStringSupportsUninitializedResize", alluding to > // the previous function.) 
>@@ -57,7 +57,7 @@ inline constexpr bool STLStringSupportsNontrashingResize(string_type*) { > // Like str->resize(new_size), except any new characters added to "*str" as a > // result of resizing may be left uninitialized, rather than being filled with > // '0' bytes. Typically used when code is then going to overwrite the backing >-// store of the std::string with known data. Uses a Google extension to std::string. >+// store of the string with known data. Uses a Google extension to ::string. > template <typename string_type, typename = void> > inline void STLStringResizeUninitialized(string_type* s, size_t new_size) { > ResizeUninit(s, new_size, HasResizeUninitialized<string_type>()); >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/arg.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/arg.cc >index eafb068fe2867446270f228a6449e0db5a4ccab5..b40be8ff3824e9c66b4f7c1ecc0856e4178fdd3d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/arg.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/arg.cc >@@ -112,7 +112,7 @@ class ConvertedIntInfo { > // Note: 'o' conversions do not have a base indicator, it's just that > // the '#' flag is specified to modify the precision for 'o' conversions. 
> string_view BaseIndicator(const ConvertedIntInfo &info, >- const ConversionSpec &conv) { >+ const ConversionSpec conv) { > bool alt = conv.flags().alt; > int radix = conv.conv().radix(); > if (conv.conv().id() == ConversionChar::p) >@@ -127,7 +127,7 @@ string_view BaseIndicator(const ConvertedIntInfo &info, > return {}; > } > >-string_view SignColumn(bool neg, const ConversionSpec &conv) { >+string_view SignColumn(bool neg, const ConversionSpec conv) { > if (conv.conv().is_signed()) { > if (neg) return "-"; > if (conv.flags().show_pos) return "+"; >@@ -136,7 +136,7 @@ string_view SignColumn(bool neg, const ConversionSpec &conv) { > return {}; > } > >-bool ConvertCharImpl(unsigned char v, const ConversionSpec &conv, >+bool ConvertCharImpl(unsigned char v, const ConversionSpec conv, > FormatSinkImpl *sink) { > size_t fill = 0; > if (conv.width() >= 0) fill = conv.width(); >@@ -148,7 +148,7 @@ bool ConvertCharImpl(unsigned char v, const ConversionSpec &conv, > } > > bool ConvertIntImplInner(const ConvertedIntInfo &info, >- const ConversionSpec &conv, FormatSinkImpl *sink) { >+ const ConversionSpec conv, FormatSinkImpl *sink) { > // Print as a sequence of Substrings: > // [left_spaces][sign][base_indicator][zeroes][formatted][right_spaces] > size_t fill = 0; >@@ -202,8 +202,7 @@ bool ConvertIntImplInner(const ConvertedIntInfo &info, > } > > template <typename T> >-bool ConvertIntImplInner(T v, const ConversionSpec &conv, >- FormatSinkImpl *sink) { >+bool ConvertIntImplInner(T v, const ConversionSpec conv, FormatSinkImpl *sink) { > ConvertedIntInfo info(v, conv.conv()); > if (conv.flags().basic && conv.conv().id() != ConversionChar::p) { > if (info.is_neg()) sink->Append(1, '-'); >@@ -218,7 +217,7 @@ bool ConvertIntImplInner(T v, const ConversionSpec &conv, > } > > template <typename T> >-bool ConvertIntArg(T v, const ConversionSpec &conv, FormatSinkImpl *sink) { >+bool ConvertIntArg(T v, const ConversionSpec conv, FormatSinkImpl *sink) { > if (conv.conv().is_float()) 
{ > return FormatConvertImpl(static_cast<double>(v), conv, sink).value; > } >@@ -234,11 +233,11 @@ bool ConvertIntArg(T v, const ConversionSpec &conv, FormatSinkImpl *sink) { > } > > template <typename T> >-bool ConvertFloatArg(T v, const ConversionSpec &conv, FormatSinkImpl *sink) { >+bool ConvertFloatArg(T v, const ConversionSpec conv, FormatSinkImpl *sink) { > return conv.conv().is_float() && ConvertFloatImpl(v, conv, sink); > } > >-inline bool ConvertStringArg(string_view v, const ConversionSpec &conv, >+inline bool ConvertStringArg(string_view v, const ConversionSpec conv, > FormatSinkImpl *sink) { > if (conv.conv().id() != ConversionChar::s) > return false; >@@ -254,19 +253,19 @@ inline bool ConvertStringArg(string_view v, const ConversionSpec &conv, > > // ==================== Strings ==================== > ConvertResult<Conv::s> FormatConvertImpl(const std::string &v, >- const ConversionSpec &conv, >+ const ConversionSpec conv, > FormatSinkImpl *sink) { > return {ConvertStringArg(v, conv, sink)}; > } > > ConvertResult<Conv::s> FormatConvertImpl(string_view v, >- const ConversionSpec &conv, >+ const ConversionSpec conv, > FormatSinkImpl *sink) { > return {ConvertStringArg(v, conv, sink)}; > } > > ConvertResult<Conv::s | Conv::p> FormatConvertImpl(const char *v, >- const ConversionSpec &conv, >+ const ConversionSpec conv, > FormatSinkImpl *sink) { > if (conv.conv().id() == ConversionChar::p) > return {FormatConvertImpl(VoidPtr(v), conv, sink).value}; >@@ -283,7 +282,7 @@ ConvertResult<Conv::s | Conv::p> FormatConvertImpl(const char *v, > } > > // ==================== Raw pointers ==================== >-ConvertResult<Conv::p> FormatConvertImpl(VoidPtr v, const ConversionSpec &conv, >+ConvertResult<Conv::p> FormatConvertImpl(VoidPtr v, const ConversionSpec conv, > FormatSinkImpl *sink) { > if (conv.conv().id() != ConversionChar::p) > return {false}; >@@ -295,104 +294,83 @@ ConvertResult<Conv::p> FormatConvertImpl(VoidPtr v, const ConversionSpec &conv, > } > > 
// ==================== Floats ==================== >-FloatingConvertResult FormatConvertImpl(float v, const ConversionSpec &conv, >+FloatingConvertResult FormatConvertImpl(float v, const ConversionSpec conv, > FormatSinkImpl *sink) { > return {ConvertFloatArg(v, conv, sink)}; > } >-FloatingConvertResult FormatConvertImpl(double v, const ConversionSpec &conv, >+FloatingConvertResult FormatConvertImpl(double v, const ConversionSpec conv, > FormatSinkImpl *sink) { > return {ConvertFloatArg(v, conv, sink)}; > } > FloatingConvertResult FormatConvertImpl(long double v, >- const ConversionSpec &conv, >+ const ConversionSpec conv, > FormatSinkImpl *sink) { > return {ConvertFloatArg(v, conv, sink)}; > } > > // ==================== Chars ==================== >-IntegralConvertResult FormatConvertImpl(char v, const ConversionSpec &conv, >+IntegralConvertResult FormatConvertImpl(char v, const ConversionSpec conv, > FormatSinkImpl *sink) { > return {ConvertIntArg(v, conv, sink)}; > } > IntegralConvertResult FormatConvertImpl(signed char v, >- const ConversionSpec &conv, >+ const ConversionSpec conv, > FormatSinkImpl *sink) { > return {ConvertIntArg(v, conv, sink)}; > } > IntegralConvertResult FormatConvertImpl(unsigned char v, >- const ConversionSpec &conv, >+ const ConversionSpec conv, > FormatSinkImpl *sink) { > return {ConvertIntArg(v, conv, sink)}; > } > > // ==================== Ints ==================== > IntegralConvertResult FormatConvertImpl(short v, // NOLINT >- const ConversionSpec &conv, >+ const ConversionSpec conv, > FormatSinkImpl *sink) { > return {ConvertIntArg(v, conv, sink)}; > } > IntegralConvertResult FormatConvertImpl(unsigned short v, // NOLINT >- const ConversionSpec &conv, >+ const ConversionSpec conv, > FormatSinkImpl *sink) { > return {ConvertIntArg(v, conv, sink)}; > } >-IntegralConvertResult FormatConvertImpl(int v, const ConversionSpec &conv, >+IntegralConvertResult FormatConvertImpl(int v, const ConversionSpec conv, > FormatSinkImpl *sink) { > 
return {ConvertIntArg(v, conv, sink)}; > } >-IntegralConvertResult FormatConvertImpl(unsigned v, const ConversionSpec &conv, >+IntegralConvertResult FormatConvertImpl(unsigned v, const ConversionSpec conv, > FormatSinkImpl *sink) { > return {ConvertIntArg(v, conv, sink)}; > } > IntegralConvertResult FormatConvertImpl(long v, // NOLINT >- const ConversionSpec &conv, >+ const ConversionSpec conv, > FormatSinkImpl *sink) { > return {ConvertIntArg(v, conv, sink)}; > } > IntegralConvertResult FormatConvertImpl(unsigned long v, // NOLINT >- const ConversionSpec &conv, >+ const ConversionSpec conv, > FormatSinkImpl *sink) { > return {ConvertIntArg(v, conv, sink)}; > } > IntegralConvertResult FormatConvertImpl(long long v, // NOLINT >- const ConversionSpec &conv, >+ const ConversionSpec conv, > FormatSinkImpl *sink) { > return {ConvertIntArg(v, conv, sink)}; > } > IntegralConvertResult FormatConvertImpl(unsigned long long v, // NOLINT >- const ConversionSpec &conv, >+ const ConversionSpec conv, > FormatSinkImpl *sink) { > return {ConvertIntArg(v, conv, sink)}; > } > IntegralConvertResult FormatConvertImpl(absl::uint128 v, >- const ConversionSpec &conv, >+ const ConversionSpec conv, > FormatSinkImpl *sink) { > return {ConvertIntArg(v, conv, sink)}; > } > >-template struct FormatArgImpl::TypedVTable<str_format_internal::VoidPtr>; >- >-template struct FormatArgImpl::TypedVTable<bool>; >-template struct FormatArgImpl::TypedVTable<char>; >-template struct FormatArgImpl::TypedVTable<signed char>; >-template struct FormatArgImpl::TypedVTable<unsigned char>; >-template struct FormatArgImpl::TypedVTable<short>; // NOLINT >-template struct FormatArgImpl::TypedVTable<unsigned short>; // NOLINT >-template struct FormatArgImpl::TypedVTable<int>; >-template struct FormatArgImpl::TypedVTable<unsigned>; >-template struct FormatArgImpl::TypedVTable<long>; // NOLINT >-template struct FormatArgImpl::TypedVTable<unsigned long>; // NOLINT >-template struct FormatArgImpl::TypedVTable<long 
long>; // NOLINT >-template struct FormatArgImpl::TypedVTable<unsigned long long>; // NOLINT >-template struct FormatArgImpl::TypedVTable<absl::uint128>; >- >-template struct FormatArgImpl::TypedVTable<float>; >-template struct FormatArgImpl::TypedVTable<double>; >-template struct FormatArgImpl::TypedVTable<long double>; >- >-template struct FormatArgImpl::TypedVTable<const char *>; >-template struct FormatArgImpl::TypedVTable<std::string>; >-template struct FormatArgImpl::TypedVTable<string_view>; >+ABSL_INTERNAL_FORMAT_DISPATCH_OVERLOADS_EXPAND_(); >+ > > } // namespace str_format_internal > >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/arg.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/arg.h >index a9562188ea91973a629828e0dec82bd470c9c07a..3376d48afda20ff233a1d1b7f41b30078ae1cce6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/arg.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/arg.h >@@ -33,7 +33,7 @@ struct HasUserDefinedConvert : std::false_type {}; > template <typename T> > struct HasUserDefinedConvert< > T, void_t<decltype(AbslFormatConvert( >- std::declval<const T&>(), std::declval<const ConversionSpec&>(), >+ std::declval<const T&>(), std::declval<ConversionSpec>(), > std::declval<FormatSink*>()))>> : std::true_type {}; > template <typename T> > class StreamedWrapper; >@@ -50,25 +50,23 @@ struct VoidPtr { > : value(ptr ? reinterpret_cast<uintptr_t>(ptr) : 0) {} > uintptr_t value; > }; >-ConvertResult<Conv::p> FormatConvertImpl(VoidPtr v, const ConversionSpec& conv, >+ConvertResult<Conv::p> FormatConvertImpl(VoidPtr v, ConversionSpec conv, > FormatSinkImpl* sink); > > // Strings. 
>-ConvertResult<Conv::s> FormatConvertImpl(const std::string& v, >- const ConversionSpec& conv, >+ConvertResult<Conv::s> FormatConvertImpl(const std::string& v, ConversionSpec conv, > FormatSinkImpl* sink); >-ConvertResult<Conv::s> FormatConvertImpl(string_view v, >- const ConversionSpec& conv, >+ConvertResult<Conv::s> FormatConvertImpl(string_view v, ConversionSpec conv, > FormatSinkImpl* sink); > ConvertResult<Conv::s | Conv::p> FormatConvertImpl(const char* v, >- const ConversionSpec& conv, >+ ConversionSpec conv, > FormatSinkImpl* sink); > template <class AbslCord, > typename std::enable_if< > std::is_same<AbslCord, ::Cord>::value>::type* = nullptr, > class AbslCordReader = ::CordReader> > ConvertResult<Conv::s> FormatConvertImpl(const AbslCord& value, >- const ConversionSpec& conv, >+ ConversionSpec conv, > FormatSinkImpl* sink) { > if (conv.conv().id() != ConversionChar::s) return {false}; > >@@ -104,51 +102,48 @@ using IntegralConvertResult = > using FloatingConvertResult = ConvertResult<Conv::floating>; > > // Floats. >-FloatingConvertResult FormatConvertImpl(float v, const ConversionSpec& conv, >+FloatingConvertResult FormatConvertImpl(float v, ConversionSpec conv, > FormatSinkImpl* sink); >-FloatingConvertResult FormatConvertImpl(double v, const ConversionSpec& conv, >+FloatingConvertResult FormatConvertImpl(double v, ConversionSpec conv, > FormatSinkImpl* sink); >-FloatingConvertResult FormatConvertImpl(long double v, >- const ConversionSpec& conv, >+FloatingConvertResult FormatConvertImpl(long double v, ConversionSpec conv, > FormatSinkImpl* sink); > > // Chars. 
>-IntegralConvertResult FormatConvertImpl(char v, const ConversionSpec& conv, >+IntegralConvertResult FormatConvertImpl(char v, ConversionSpec conv, > FormatSinkImpl* sink); >-IntegralConvertResult FormatConvertImpl(signed char v, >- const ConversionSpec& conv, >+IntegralConvertResult FormatConvertImpl(signed char v, ConversionSpec conv, > FormatSinkImpl* sink); >-IntegralConvertResult FormatConvertImpl(unsigned char v, >- const ConversionSpec& conv, >+IntegralConvertResult FormatConvertImpl(unsigned char v, ConversionSpec conv, > FormatSinkImpl* sink); > > // Ints. > IntegralConvertResult FormatConvertImpl(short v, // NOLINT >- const ConversionSpec& conv, >+ ConversionSpec conv, > FormatSinkImpl* sink); > IntegralConvertResult FormatConvertImpl(unsigned short v, // NOLINT >- const ConversionSpec& conv, >+ ConversionSpec conv, > FormatSinkImpl* sink); >-IntegralConvertResult FormatConvertImpl(int v, const ConversionSpec& conv, >+IntegralConvertResult FormatConvertImpl(int v, ConversionSpec conv, > FormatSinkImpl* sink); >-IntegralConvertResult FormatConvertImpl(unsigned v, const ConversionSpec& conv, >+IntegralConvertResult FormatConvertImpl(unsigned v, ConversionSpec conv, > FormatSinkImpl* sink); > IntegralConvertResult FormatConvertImpl(long v, // NOLINT >- const ConversionSpec& conv, >+ ConversionSpec conv, > FormatSinkImpl* sink); > IntegralConvertResult FormatConvertImpl(unsigned long v, // NOLINT >- const ConversionSpec& conv, >+ ConversionSpec conv, > FormatSinkImpl* sink); > IntegralConvertResult FormatConvertImpl(long long v, // NOLINT >- const ConversionSpec& conv, >+ ConversionSpec conv, > FormatSinkImpl* sink); > IntegralConvertResult FormatConvertImpl(unsigned long long v, // NOLINT >- const ConversionSpec& conv, >+ ConversionSpec conv, > FormatSinkImpl* sink); >-IntegralConvertResult FormatConvertImpl(uint128 v, const ConversionSpec& conv, >+IntegralConvertResult FormatConvertImpl(uint128 v, ConversionSpec conv, > FormatSinkImpl* sink); > template 
<typename T, enable_if_t<std::is_same<T, bool>::value, int> = 0> >-IntegralConvertResult FormatConvertImpl(T v, const ConversionSpec& conv, >+IntegralConvertResult FormatConvertImpl(T v, ConversionSpec conv, > FormatSinkImpl* sink) { > return FormatConvertImpl(static_cast<int>(v), conv, sink); > } >@@ -159,11 +154,11 @@ template <typename T> > typename std::enable_if<std::is_enum<T>::value && > !HasUserDefinedConvert<T>::value, > IntegralConvertResult>::type >-FormatConvertImpl(T v, const ConversionSpec& conv, FormatSinkImpl* sink); >+FormatConvertImpl(T v, ConversionSpec conv, FormatSinkImpl* sink); > > template <typename T> > ConvertResult<Conv::s> FormatConvertImpl(const StreamedWrapper<T>& v, >- const ConversionSpec& conv, >+ ConversionSpec conv, > FormatSinkImpl* out) { > std::ostringstream oss; > oss << v.v_; >@@ -176,7 +171,7 @@ ConvertResult<Conv::s> FormatConvertImpl(const StreamedWrapper<T>& v, > struct FormatCountCaptureHelper { > template <class T = int> > static ConvertResult<Conv::n> ConvertHelper(const FormatCountCapture& v, >- const ConversionSpec& conv, >+ ConversionSpec conv, > FormatSinkImpl* sink) { > const absl::enable_if_t<sizeof(T) != 0, FormatCountCapture>& v2 = v; > >@@ -189,7 +184,7 @@ struct FormatCountCaptureHelper { > > template <class T = int> > ConvertResult<Conv::n> FormatConvertImpl(const FormatCountCapture& v, >- const ConversionSpec& conv, >+ ConversionSpec conv, > FormatSinkImpl* sink) { > return FormatCountCaptureHelper::ConvertHelper(v, conv, sink); > } >@@ -199,20 +194,20 @@ ConvertResult<Conv::n> FormatConvertImpl(const FormatCountCapture& v, > struct FormatArgImplFriend { > template <typename Arg> > static bool ToInt(Arg arg, int* out) { >- if (!arg.vtbl_->to_int) return false; >- *out = arg.vtbl_->to_int(arg.data_); >- return true; >+ // A value initialized ConversionSpec has a `none` conv, which tells the >+ // dispatcher to run the `int` conversion. 
>+ return arg.dispatcher_(arg.data_, {}, out); > } > > template <typename Arg> >- static bool Convert(Arg arg, const str_format_internal::ConversionSpec& conv, >+ static bool Convert(Arg arg, str_format_internal::ConversionSpec conv, > FormatSinkImpl* out) { >- return arg.vtbl_->convert(arg.data_, conv, out); >+ return arg.dispatcher_(arg.data_, conv, out); > } > > template <typename Arg> >- static const void* GetVTablePtrForTest(Arg arg) { >- return arg.vtbl_; >+ static typename Arg::Dispatcher GetVTablePtrForTest(Arg arg) { >+ return arg.dispatcher_; > } > }; > >@@ -229,11 +224,7 @@ class FormatArgImpl { > char buf[kInlinedSpace]; > }; > >- struct VTable { >- bool (*convert)(Data, const str_format_internal::ConversionSpec& conv, >- FormatSinkImpl* out); >- int (*to_int)(Data); >- }; >+ using Dispatcher = bool (*)(Data, ConversionSpec, void* out); > > template <typename T> > struct store_by_value >@@ -253,10 +244,6 @@ class FormatArgImpl { > : ByPointer))> { > }; > >- // An instance of an FormatArgImpl::VTable suitable for 'T'. >- template <typename T> >- struct TypedVTable; >- > // To reduce the number of vtables we will decay values before hand. > // Anything with a user-defined Convert will get its own vtable. 
> // For everything else: >@@ -338,7 +325,10 @@ class FormatArgImpl { > }; > > template <typename T> >- void Init(const T& value); >+ void Init(const T& value) { >+ data_ = Manager<T>::SetValue(value); >+ dispatcher_ = &Dispatch<T>; >+ } > > template <typename T> > static int ToIntVal(const T& val) { >@@ -355,79 +345,75 @@ class FormatArgImpl { > return static_cast<int>(val); > } > >- Data data_; >- const VTable* vtbl_; >-}; >- >-template <typename T> >-struct FormatArgImpl::TypedVTable { >- private: >- static bool ConvertImpl(Data arg, >- const str_format_internal::ConversionSpec& conv, >- FormatSinkImpl* out) { >- return str_format_internal::FormatConvertImpl(Manager<T>::Value(arg), conv, >- out) >- .value; >+ template <typename T> >+ static bool ToInt(Data arg, int* out, std::true_type /* is_integral */, >+ std::false_type) { >+ *out = ToIntVal(Manager<T>::Value(arg)); >+ return true; > } > >- template <typename U = T, typename = void> >- struct ToIntImpl { >- static constexpr int (*value)(Data) = nullptr; >- }; >+ template <typename T> >+ static bool ToInt(Data arg, int* out, std::false_type, >+ std::true_type /* is_enum */) { >+ *out = ToIntVal(static_cast<typename std::underlying_type<T>::type>( >+ Manager<T>::Value(arg))); >+ return true; >+ } > >- template <typename U> >- struct ToIntImpl<U, >- typename std::enable_if<std::is_integral<U>::value>::type> { >- static int Invoke(Data arg) { return ToIntVal(Manager<T>::Value(arg)); } >- static constexpr int (*value)(Data) = &Invoke; >- }; >+ template <typename T> >+ static bool ToInt(Data, int*, std::false_type, std::false_type) { >+ return false; >+ } > >- template <typename U> >- struct ToIntImpl<U, typename std::enable_if<std::is_enum<U>::value>::type> { >- static int Invoke(Data arg) { >- return ToIntVal(static_cast<typename std::underlying_type<T>::type>( >- Manager<T>::Value(arg))); >+ template <typename T> >+ static bool Dispatch(Data arg, ConversionSpec spec, void* out) { >+ // A `none` conv indicates 
that we want the `int` conversion. >+ if (ABSL_PREDICT_FALSE(spec.conv().id() == ConversionChar::none)) { >+ return ToInt<T>(arg, static_cast<int*>(out), std::is_integral<T>(), >+ std::is_enum<T>()); > } >- static constexpr int (*value)(Data) = &Invoke; >- }; > >- public: >- static constexpr VTable value{&ConvertImpl, ToIntImpl<>::value}; >-}; >+ return str_format_internal::FormatConvertImpl( >+ Manager<T>::Value(arg), spec, static_cast<FormatSinkImpl*>(out)) >+ .value; >+ } > >-template <typename T> >-constexpr FormatArgImpl::VTable FormatArgImpl::TypedVTable<T>::value; >+ Data data_; >+ Dispatcher dispatcher_; >+}; > >-template <typename T> >-void FormatArgImpl::Init(const T& value) { >- data_ = Manager<T>::SetValue(value); >- vtbl_ = &TypedVTable<T>::value; >-} >+#define ABSL_INTERNAL_FORMAT_DISPATCH_INSTANTIATE_(T, E) \ >+ E template bool FormatArgImpl::Dispatch<T>(Data, ConversionSpec, void*) >+ >+#define ABSL_INTERNAL_FORMAT_DISPATCH_OVERLOADS_EXPAND_(...) \ >+ ABSL_INTERNAL_FORMAT_DISPATCH_INSTANTIATE_(str_format_internal::VoidPtr, \ >+ __VA_ARGS__); \ >+ ABSL_INTERNAL_FORMAT_DISPATCH_INSTANTIATE_(bool, __VA_ARGS__); \ >+ ABSL_INTERNAL_FORMAT_DISPATCH_INSTANTIATE_(char, __VA_ARGS__); \ >+ ABSL_INTERNAL_FORMAT_DISPATCH_INSTANTIATE_(signed char, __VA_ARGS__); \ >+ ABSL_INTERNAL_FORMAT_DISPATCH_INSTANTIATE_(unsigned char, __VA_ARGS__); \ >+ ABSL_INTERNAL_FORMAT_DISPATCH_INSTANTIATE_(short, __VA_ARGS__); /* NOLINT */ \ >+ ABSL_INTERNAL_FORMAT_DISPATCH_INSTANTIATE_(unsigned short, /* NOLINT */ \ >+ __VA_ARGS__); \ >+ ABSL_INTERNAL_FORMAT_DISPATCH_INSTANTIATE_(int, __VA_ARGS__); \ >+ ABSL_INTERNAL_FORMAT_DISPATCH_INSTANTIATE_(unsigned int, __VA_ARGS__); \ >+ ABSL_INTERNAL_FORMAT_DISPATCH_INSTANTIATE_(long, __VA_ARGS__); /* NOLINT */ \ >+ ABSL_INTERNAL_FORMAT_DISPATCH_INSTANTIATE_(unsigned long, /* NOLINT */ \ >+ __VA_ARGS__); \ >+ ABSL_INTERNAL_FORMAT_DISPATCH_INSTANTIATE_(long long, /* NOLINT */ \ >+ __VA_ARGS__); \ >+ 
ABSL_INTERNAL_FORMAT_DISPATCH_INSTANTIATE_(unsigned long long, /* NOLINT */ \ >+ __VA_ARGS__); \ >+ ABSL_INTERNAL_FORMAT_DISPATCH_INSTANTIATE_(uint128, __VA_ARGS__); \ >+ ABSL_INTERNAL_FORMAT_DISPATCH_INSTANTIATE_(float, __VA_ARGS__); \ >+ ABSL_INTERNAL_FORMAT_DISPATCH_INSTANTIATE_(double, __VA_ARGS__); \ >+ ABSL_INTERNAL_FORMAT_DISPATCH_INSTANTIATE_(long double, __VA_ARGS__); \ >+ ABSL_INTERNAL_FORMAT_DISPATCH_INSTANTIATE_(const char*, __VA_ARGS__); \ >+ ABSL_INTERNAL_FORMAT_DISPATCH_INSTANTIATE_(std::string, __VA_ARGS__); \ >+ ABSL_INTERNAL_FORMAT_DISPATCH_INSTANTIATE_(string_view, __VA_ARGS__) >+ >+ABSL_INTERNAL_FORMAT_DISPATCH_OVERLOADS_EXPAND_(extern); > >-extern template struct FormatArgImpl::TypedVTable<str_format_internal::VoidPtr>; >- >-extern template struct FormatArgImpl::TypedVTable<bool>; >-extern template struct FormatArgImpl::TypedVTable<char>; >-extern template struct FormatArgImpl::TypedVTable<signed char>; >-extern template struct FormatArgImpl::TypedVTable<unsigned char>; >-extern template struct FormatArgImpl::TypedVTable<short>; // NOLINT >-extern template struct FormatArgImpl::TypedVTable<unsigned short>; // NOLINT >-extern template struct FormatArgImpl::TypedVTable<int>; >-extern template struct FormatArgImpl::TypedVTable<unsigned>; >-extern template struct FormatArgImpl::TypedVTable<long>; // NOLINT >-extern template struct FormatArgImpl::TypedVTable<unsigned long>; // NOLINT >-extern template struct FormatArgImpl::TypedVTable<long long>; // NOLINT >-extern template struct FormatArgImpl::TypedVTable< >- unsigned long long>; // NOLINT >-extern template struct FormatArgImpl::TypedVTable<uint128>; >- >-extern template struct FormatArgImpl::TypedVTable<float>; >-extern template struct FormatArgImpl::TypedVTable<double>; >-extern template struct FormatArgImpl::TypedVTable<long double>; >- >-extern template struct FormatArgImpl::TypedVTable<const char*>; >-extern template struct FormatArgImpl::TypedVTable<std::string>; >-extern template struct 
FormatArgImpl::TypedVTable<string_view>; > } // namespace str_format_internal > } // namespace absl > >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/bind.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/bind.cc >index 33e8641558ab86f3e892addeecd9a6def8b40a97..c4eddd17ddbe56a76c5ec1a5184079bf107166bc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/bind.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/bind.cc >@@ -103,15 +103,15 @@ class ConverterConsumer { > }; > > template <typename Converter> >-bool ConvertAll(const UntypedFormatSpecImpl& format, >- absl::Span<const FormatArgImpl> args, >- const Converter& converter) { >- const ParsedFormatBase* pc = format.parsed_conversion(); >- if (pc) >- return pc->ProcessFormat(ConverterConsumer<Converter>(converter, args)); >- >- return ParseFormatString(format.str(), >- ConverterConsumer<Converter>(converter, args)); >+bool ConvertAll(const UntypedFormatSpecImpl format, >+ absl::Span<const FormatArgImpl> args, Converter converter) { >+ if (format.has_parsed_conversion()) { >+ return format.parsed_conversion()->ProcessFormat( >+ ConverterConsumer<Converter>(converter, args)); >+ } else { >+ return ParseFormatString(format.str(), >+ ConverterConsumer<Converter>(converter, args)); >+ } > } > > class DefaultConverter { >@@ -158,7 +158,7 @@ bool BindWithPack(const UnboundConversion* props, > return ArgContext(pack).Bind(props, bound); > } > >-std::string Summarize(const UntypedFormatSpecImpl& format, >+std::string Summarize(const UntypedFormatSpecImpl format, > absl::Span<const FormatArgImpl> args) { > typedef SummarizingConverter Converter; > std::string out; >@@ -167,23 +167,18 @@ std::string Summarize(const UntypedFormatSpecImpl& format, > // flush. 
> FormatSinkImpl sink(&out); > if (!ConvertAll(format, args, Converter(&sink))) { >- sink.Flush(); >- out.clear(); >+ return ""; > } > } > return out; > } > > bool FormatUntyped(FormatRawSinkImpl raw_sink, >- const UntypedFormatSpecImpl& format, >+ const UntypedFormatSpecImpl format, > absl::Span<const FormatArgImpl> args) { > FormatSinkImpl sink(raw_sink); > using Converter = DefaultConverter; >- if (!ConvertAll(format, args, Converter(&sink))) { >- sink.Flush(); >- return false; >- } >- return true; >+ return ConvertAll(format, args, Converter(&sink)); > } > > std::ostream& Streamable::Print(std::ostream& os) const { >@@ -191,14 +186,16 @@ std::ostream& Streamable::Print(std::ostream& os) const { > return os; > } > >-std::string& AppendPack(std::string* out, const UntypedFormatSpecImpl& format, >+std::string& AppendPack(std::string* out, const UntypedFormatSpecImpl format, > absl::Span<const FormatArgImpl> args) { > size_t orig = out->size(); >- if (!FormatUntyped(out, format, args)) out->resize(orig); >+ if (ABSL_PREDICT_FALSE(!FormatUntyped(out, format, args))) { >+ out->erase(orig); >+ } > return *out; > } > >-int FprintF(std::FILE* output, const UntypedFormatSpecImpl& format, >+int FprintF(std::FILE* output, const UntypedFormatSpecImpl format, > absl::Span<const FormatArgImpl> args) { > FILERawSink sink(output); > if (!FormatUntyped(&sink, format, args)) { >@@ -216,7 +213,7 @@ int FprintF(std::FILE* output, const UntypedFormatSpecImpl& format, > return static_cast<int>(sink.count()); > } > >-int SnprintF(char* output, size_t size, const UntypedFormatSpecImpl& format, >+int SnprintF(char* output, size_t size, const UntypedFormatSpecImpl format, > absl::Span<const FormatArgImpl> args) { > BufferRawSink sink(output, size ? 
size - 1 : 0); > if (!FormatUntyped(&sink, format, args)) { >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/bind.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/bind.h >index 4008611211cfe6053c51af0648a051354bdf81a2..1b52df9c7f5655f1d78dadc50825c4d23e4260f5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/bind.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/bind.h >@@ -33,13 +33,21 @@ class UntypedFormatSpecImpl { > public: > UntypedFormatSpecImpl() = delete; > >- explicit UntypedFormatSpecImpl(string_view s) : str_(s), pc_() {} >+ explicit UntypedFormatSpecImpl(string_view s) >+ : data_(s.data()), size_(s.size()) {} > explicit UntypedFormatSpecImpl( > const str_format_internal::ParsedFormatBase* pc) >- : pc_(pc) {} >- string_view str() const { return str_; } >+ : data_(pc), size_(~size_t{}) {} >+ >+ bool has_parsed_conversion() const { return size_ == ~size_t{}; } >+ >+ string_view str() const { >+ assert(!has_parsed_conversion()); >+ return string_view(static_cast<const char*>(data_), size_); >+ } > const str_format_internal::ParsedFormatBase* parsed_conversion() const { >- return pc_; >+ assert(has_parsed_conversion()); >+ return static_cast<const str_format_internal::ParsedFormatBase*>(data_); > } > > template <typename T> >@@ -48,8 +56,8 @@ class UntypedFormatSpecImpl { > } > > private: >- string_view str_; >- const str_format_internal::ParsedFormatBase* pc_; >+ const void* data_; >+ size_t size_; > }; > > template <typename T, typename...> >@@ -144,31 +152,31 @@ class Streamable { > }; > > // for testing >-std::string Summarize(const UntypedFormatSpecImpl& format, >+std::string Summarize(UntypedFormatSpecImpl format, > absl::Span<const FormatArgImpl> args); > bool BindWithPack(const UnboundConversion* props, > absl::Span<const FormatArgImpl> pack, 
BoundConversion* bound); > > bool FormatUntyped(FormatRawSinkImpl raw_sink, >- const UntypedFormatSpecImpl& format, >+ UntypedFormatSpecImpl format, > absl::Span<const FormatArgImpl> args); > >-std::string& AppendPack(std::string* out, const UntypedFormatSpecImpl& format, >+std::string& AppendPack(std::string* out, UntypedFormatSpecImpl format, > absl::Span<const FormatArgImpl> args); > >-inline std::string FormatPack(const UntypedFormatSpecImpl& format, >+inline std::string FormatPack(const UntypedFormatSpecImpl format, > absl::Span<const FormatArgImpl> args) { > std::string out; > AppendPack(&out, format, args); > return out; > } > >-int FprintF(std::FILE* output, const UntypedFormatSpecImpl& format, >+int FprintF(std::FILE* output, UntypedFormatSpecImpl format, > absl::Span<const FormatArgImpl> args); >-int SnprintF(char* output, size_t size, const UntypedFormatSpecImpl& format, >+int SnprintF(char* output, size_t size, UntypedFormatSpecImpl format, > absl::Span<const FormatArgImpl> args); > >-// Returned by Streamed(v). Converts via '%s' to the std::string created >+// Returned by Streamed(v). Converts via '%s' to the string created > // by std::ostream << v. 
> template <typename T> > class StreamedWrapper { >@@ -178,7 +186,7 @@ class StreamedWrapper { > private: > template <typename S> > friend ConvertResult<Conv::s> FormatConvertImpl(const StreamedWrapper<S>& v, >- const ConversionSpec& conv, >+ ConversionSpec conv, > FormatSinkImpl* out); > const T& v_; > }; >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/extension.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/extension.h >index 810330b9d71bd188c26eae215abd61d48ed4bd78..11b996aefd37d157a3f08daa5a22dc01c0a2d023 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/extension.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/extension.h >@@ -18,6 +18,7 @@ > #define ABSL_STRINGS_INTERNAL_STR_FORMAT_EXTENSION_H_ > > #include <limits.h> >+#include <cstddef> > #include <cstring> > #include <ostream> > >@@ -57,7 +58,7 @@ class FormatRawSinkImpl { > void (*write_)(void*, string_view); > }; > >-// An abstraction to which conversions write their std::string data. >+// An abstraction to which conversions write their string data. > class FormatSinkImpl { > public: > explicit FormatSinkImpl(FormatRawSinkImpl raw) : raw_(raw) {} >@@ -307,7 +308,12 @@ class ConversionSpec { > public: > Flags flags() const { return flags_; } > LengthMod length_mod() const { return length_mod_; } >- ConversionChar conv() const { return conv_; } >+ ConversionChar conv() const { >+ // Keep this field first in the struct . It generates better code when >+ // accessing it when ConversionSpec is passed by value in registers. >+ static_assert(offsetof(ConversionSpec, conv_) == 0, ""); >+ return conv_; >+ } > > // Returns the specified width. If width is unspecfied, it returns a negative > // value. 
>@@ -324,9 +330,9 @@ class ConversionSpec { > void set_left(bool b) { flags_.left = b; } > > private: >+ ConversionChar conv_; > Flags flags_; > LengthMod length_mod_; >- ConversionChar conv_; > int width_; > int precision_; > }; >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/output.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/output.h >index 3b0aa5e7157ecbdaf8f243e44a13a751638b575d..12ecd99e680d1ebd679d3c882f7f26f3e9fb6e46 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/output.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/output.h >@@ -69,10 +69,10 @@ class FILERawSink { > > // Provide RawSink integration with common types from the STL. > inline void AbslFormatFlush(std::string* out, string_view s) { >- out->append(s.begin(), s.size()); >+ out->append(s.data(), s.size()); > } > inline void AbslFormatFlush(std::ostream* out, string_view s) { >- out->write(s.begin(), s.size()); >+ out->write(s.data(), s.size()); > } > > template <class AbslCord, typename = typename std::enable_if< >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/parser.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/parser.cc >index 10114f489c0189f0586149685378e13a8df94684..5e3d0d0547d1dbad47b3a503b685076374373320 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/parser.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/parser.cc >@@ -81,19 +81,27 @@ static constexpr std::int8_t kIds[] = { > template <bool is_positional> > bool ConsumeConversion(string_view *src, UnboundConversion *conv, > int *next_arg) { >- const char *pos = src->begin(); >- const char *const end = src->end(); 
>+ const char *pos = src->data(); >+ const char *const end = pos + src->size(); > char c; >- // Read the next char into `c` and update `pos`. Reads '\0' if at end. >- const auto get_char = [&] { c = pos == end ? '\0' : *pos++; }; >+ // Read the next char into `c` and update `pos`. Returns false if there are >+ // no more chars to read. >+#define ABSL_FORMAT_PARSER_INTERNAL_GET_CHAR() \ >+ do { \ >+ if (ABSL_PREDICT_FALSE(pos == end)) return false; \ >+ c = *pos++; \ >+ } while (0) > > const auto parse_digits = [&] { > int digits = c - '0'; >- // We do not want to overflow `digits` so we consume at most digits10-1 >+ // We do not want to overflow `digits` so we consume at most digits10 > // digits. If there are more digits the parsing will fail later on when the > // digit doesn't match the expected characters. >- int num_digits = std::numeric_limits<int>::digits10 - 2; >- for (get_char(); num_digits && std::isdigit(c); get_char()) { >+ int num_digits = std::numeric_limits<int>::digits10; >+ for (;;) { >+ if (ABSL_PREDICT_FALSE(pos == end || !num_digits)) break; >+ c = *pos++; >+ if (!std::isdigit(c)) break; > --num_digits; > digits = 10 * digits + c - '0'; > } >@@ -101,14 +109,14 @@ bool ConsumeConversion(string_view *src, UnboundConversion *conv, > }; > > if (is_positional) { >- get_char(); >- if (c < '1' || c > '9') return false; >+ ABSL_FORMAT_PARSER_INTERNAL_GET_CHAR(); >+ if (ABSL_PREDICT_FALSE(c < '1' || c > '9')) return false; > conv->arg_position = parse_digits(); > assert(conv->arg_position > 0); >- if (c != '$') return false; >+ if (ABSL_PREDICT_FALSE(c != '$')) return false; > } > >- get_char(); >+ ABSL_FORMAT_PARSER_INTERNAL_GET_CHAR(); > > // We should start with the basic flag on. 
> assert(conv->flags.basic); >@@ -119,32 +127,39 @@ bool ConsumeConversion(string_view *src, UnboundConversion *conv, > if (c < 'A') { > conv->flags.basic = false; > >- for (; c <= '0'; get_char()) { >+ for (; c <= '0';) { >+ // FIXME: We might be able to speed this up reusing the kIds lookup table >+ // from above. >+ // It might require changing Flags to be a plain integer where we can |= a >+ // value. > switch (c) { > case '-': > conv->flags.left = true; >- continue; >+ break; > case '+': > conv->flags.show_pos = true; >- continue; >+ break; > case ' ': > conv->flags.sign_col = true; >- continue; >+ break; > case '#': > conv->flags.alt = true; >- continue; >+ break; > case '0': > conv->flags.zero = true; >- continue; >+ break; >+ default: >+ goto flags_done; > } >- break; >+ ABSL_FORMAT_PARSER_INTERNAL_GET_CHAR(); > } >+flags_done: > > if (c <= '9') { > if (c >= '0') { > int maybe_width = parse_digits(); > if (!is_positional && c == '$') { >- if (*next_arg != 0) return false; >+ if (ABSL_PREDICT_FALSE(*next_arg != 0)) return false; > // Positional conversion. 
>       *next_arg = -1;
>       conv->flags = Flags();
>@@ -153,12 +168,12 @@ bool ConsumeConversion(string_view *src, UnboundConversion *conv,
>     }
>     conv->width.set_value(maybe_width);
>   } else if (c == '*') {
>-    get_char();
>+    ABSL_FORMAT_PARSER_INTERNAL_GET_CHAR();
>     if (is_positional) {
>-      if (c < '1' || c > '9') return false;
>+      if (ABSL_PREDICT_FALSE(c < '1' || c > '9')) return false;
>       conv->width.set_from_arg(parse_digits());
>-      if (c != '$') return false;
>-      get_char();
>+      if (ABSL_PREDICT_FALSE(c != '$')) return false;
>+      ABSL_FORMAT_PARSER_INTERNAL_GET_CHAR();
>     } else {
>       conv->width.set_from_arg(++*next_arg);
>     }
>@@ -166,16 +181,16 @@ bool ConsumeConversion(string_view *src, UnboundConversion *conv,
>   }
> 
>   if (c == '.') {
>-    get_char();
>+    ABSL_FORMAT_PARSER_INTERNAL_GET_CHAR();
>     if (std::isdigit(c)) {
>       conv->precision.set_value(parse_digits());
>     } else if (c == '*') {
>-      get_char();
>+      ABSL_FORMAT_PARSER_INTERNAL_GET_CHAR();
>       if (is_positional) {
>-        if (c < '1' || c > '9') return false;
>+        if (ABSL_PREDICT_FALSE(c < '1' || c > '9')) return false;
>         conv->precision.set_from_arg(parse_digits());
>         if (c != '$') return false;
>-        get_char();
>+        ABSL_FORMAT_PARSER_INTERNAL_GET_CHAR();
>       } else {
>         conv->precision.set_from_arg(++*next_arg);
>       }
>@@ -188,23 +203,23 @@ bool ConsumeConversion(string_view *src, UnboundConversion *conv,
>   std::int8_t id = kIds[static_cast<unsigned char>(c)];
> 
>   if (id < 0) {
>-    if (id == none) return false;
>+    if (ABSL_PREDICT_FALSE(id == none)) return false;
> 
>     // It is a length modifier.
>     using str_format_internal::LengthMod;
>     LengthMod length_mod = LengthMod::FromId(static_cast<LM>(~id));
>-    get_char();
>+    ABSL_FORMAT_PARSER_INTERNAL_GET_CHAR();
>     if (c == 'h' && length_mod.id() == LengthMod::h) {
>       conv->length_mod = LengthMod::FromId(LengthMod::hh);
>-      get_char();
>+      ABSL_FORMAT_PARSER_INTERNAL_GET_CHAR();
>     } else if (c == 'l' && length_mod.id() == LengthMod::l) {
>       conv->length_mod = LengthMod::FromId(LengthMod::ll);
>-      get_char();
>+      ABSL_FORMAT_PARSER_INTERNAL_GET_CHAR();
>     } else {
>       conv->length_mod = length_mod;
>     }
>     id = kIds[static_cast<unsigned char>(c)];
>-    if (id < 0) return false;
>+    if (ABSL_PREDICT_FALSE(id < 0)) return false;
>   }
> 
>   assert(CheckFastPathSetting(*conv));
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/parser.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/parser.h
>index 5bebc95540e6c28d39d0847305366742ced9edfa..1022f06297a74a584e82e41ca4b824fbbf63391c 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/parser.h
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/parser.h
>@@ -75,14 +75,14 @@ struct UnboundConversion {
> bool ConsumeUnboundConversion(string_view* src, UnboundConversion* conv,
>                               int* next_arg);
> 
>-// Parse the format std::string provided in 'src' and pass the identified items into
>+// Parse the format string provided in 'src' and pass the identified items into
> // 'consumer'.
> // Text runs will be passed by calling
> //   Consumer::Append(string_view);
> // ConversionItems will be passed by calling
> //   Consumer::ConvertOne(UnboundConversion, string_view);
> // In the case of ConvertOne, the string_view that is passed is the
>-// portion of the format std::string corresponding to the conversion, not including
>+// portion of the format string corresponding to the conversion, not including
> // the leading %. On success, it returns true. On failure, it stops and returns
> // false.
> template <typename Consumer>
>@@ -90,7 +90,7 @@ bool ParseFormatString(string_view src, Consumer consumer) {
>   int next_arg = 0;
>   while (!src.empty()) {
>     const char* percent =
>-        static_cast<const char*>(memchr(src.begin(), '%', src.size()));
>+        static_cast<const char*>(memchr(src.data(), '%', src.size()));
>     if (!percent) {
>       // We found the last substring.
>       return consumer.Append(src);
>@@ -98,7 +98,7 @@ bool ParseFormatString(string_view src, Consumer consumer) {
>     // We found a percent, so push the text run then process the percent.
>     size_t percent_loc = percent - src.data();
>     if (!consumer.Append(string_view(src.data(), percent_loc))) return false;
>-    if (percent + 1 >= src.end()) return false;
>+    if (percent + 1 >= src.data() + src.size()) return false;
> 
>     UnboundConversion conv;
> 
>@@ -178,7 +178,8 @@ class ParsedFormatBase {
>     const char* const base = data_.get();
>     string_view text(base, 0);
>     for (const auto& item : items_) {
>-      text = string_view(text.end(), (base + item.text_end) - text.end());
>+      const char* const end = text.data() + text.size();
>+      text = string_view(end, (base + item.text_end) - end);
>       if (item.is_conversion) {
>         if (!consumer.ConvertOne(item.conv, text)) return false;
>       } else {
>@@ -237,7 +238,7 @@ class ParsedFormatBase {
> // This class also supports runtime format checking with the ::New() and
> // ::NewAllowIgnored() factory functions.
> // This is the only API that allows the user to pass a runtime specified format
>-// std::string. These factory functions will return NULL if the format does not match
>+// string. These factory functions will return NULL if the format does not match
> // the conversions requested by the user.
> template <str_format_internal::Conv... C>
> class ExtendedParsedFormat : public str_format_internal::ParsedFormatBase {
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/parser_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/parser_test.cc
>index e698020b1aba38452c7b8899fd3ed2769b0c3e51..ae40203191b97f9c3c4f0c4be281a22921db6e6b 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/parser_test.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_format/parser_test.cc
>@@ -66,10 +66,10 @@ class ConsumeUnboundConversionTest : public ::testing::Test {
>   typedef UnboundConversion Props;
>   string_view Consume(string_view* src) {
>     int next = 0;
>-    const char* prev_begin = src->begin();
>+    const char* prev_begin = src->data();
>     o = UnboundConversion();  // refresh
>     ConsumeUnboundConversion(src, &o, &next);
>-    return {prev_begin, static_cast<size_t>(src->begin() - prev_begin)};
>+    return {prev_begin, static_cast<size_t>(src->data() - prev_begin)};
>   }
> 
>   bool Run(const char *fmt, bool force_positional = false) {
>@@ -84,9 +84,9 @@ class ConsumeUnboundConversionTest : public ::testing::Test {
> TEST_F(ConsumeUnboundConversionTest, ConsumeSpecification) {
>   struct Expectation {
>     int line;
>-    const char *src;
>-    const char *out;
>-    const char *src_post;
>+    string_view src;
>+    string_view out;
>+    string_view src_post;
>   };
>   const Expectation kExpect[] = {
>       {__LINE__, "", "", "" },
>@@ -236,6 +236,16 @@ TEST_F(ConsumeUnboundConversionTest, WidthAndPrecision) {
>   EXPECT_EQ(9, o.precision.get_from_arg());
> 
>   EXPECT_FALSE(Run(".*0$d")) << "no arg 0";
>+
>+  // Large values
>+  EXPECT_TRUE(Run("999999999.999999999d"));
>+  EXPECT_FALSE(o.width.is_from_arg());
>+  EXPECT_EQ(999999999, o.width.value());
>+  EXPECT_FALSE(o.precision.is_from_arg());
>+  EXPECT_EQ(999999999, o.precision.value());
>+
>+  EXPECT_FALSE(Run("1000000000.999999999d"));
>+  EXPECT_FALSE(Run("999999999.1000000000d"));
> }
> 
> TEST_F(ConsumeUnboundConversionTest, Flags) {
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_join_internal.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_join_internal.h
>index a734758c21ee30951f77a2c69c96ec75d71ee96a..0058fc8aa4d5a81210a98f2d1bb11bc0c65a965c 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_join_internal.h
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_join_internal.h
>@@ -189,7 +189,7 @@ struct DefaultFormatter<std::unique_ptr<ValueType>>
> //
> 
> // The main joining algorithm. This simply joins the elements in the given
>-// iterator range, each separated by the given separator, into an output std::string,
>+// iterator range, each separated by the given separator, into an output string,
> // and formats each element using the provided Formatter object.
> template <typename Iterator, typename Formatter>
> std::string JoinAlgorithm(Iterator start, Iterator end, absl::string_view s,
>@@ -205,20 +205,20 @@ std::string JoinAlgorithm(Iterator start, Iterator end, absl::string_view s,
> }
> 
> // A joining algorithm that's optimized for a forward iterator range of
>-// std::string-like objects that do not need any additional formatting. This is to
>-// optimize the common case of joining, say, a std::vector<std::string> or a
>+// string-like objects that do not need any additional formatting. This is to
>+// optimize the common case of joining, say, a std::vector<string> or a
> // std::vector<absl::string_view>.
> //
> // This is an overload of the previous JoinAlgorithm() function. Here the
> // Formatter argument is of type NoFormatter. Since NoFormatter is an internal
> // type, this overload is only invoked when strings::Join() is called with a
>-// range of std::string-like objects (e.g., std::string, absl::string_view), and an
>+// range of string-like objects (e.g., string, absl::string_view), and an
> // explicit Formatter argument was NOT specified.
> //
> // The optimization is that the needed space will be reserved in the output
>-// std::string to avoid the need to resize while appending. To do this, the iterator
>+// string to avoid the need to resize while appending. To do this, the iterator
> // range will be traversed twice: once to calculate the total needed size, and
>-// then again to copy the elements and delimiters to the output std::string.
>+// then again to copy the elements and delimiters to the output string.
> template <typename Iterator,
>           typename = typename std::enable_if<std::is_convertible<
>               typename std::iterator_traits<Iterator>::iterator_category,
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_split_internal.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_split_internal.h
>index 9cf0833f49028ab039c7c422504a996d99e943dc..81e8d55544680d671e56eb0254995b42bd548d40 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_split_internal.h
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/internal/str_split_internal.h
>@@ -51,7 +51,7 @@ namespace strings_internal {
> 
> // This class is implicitly constructible from everything that absl::string_view
> // is implicitly constructible from. If it's constructed from a temporary
>-// std::string, the data is moved into a data member so its lifetime matches that of
>+// string, the data is moved into a data member so its lifetime matches that of
> // the ConvertibleToStringView instance.
> class ConvertibleToStringView {
>  public:
>@@ -102,7 +102,7 @@ ConvertibleToStringView(std::string&& s)  // NOLINT(runtime/explicit)
>   absl::string_view value_;
> };
> 
>-// An iterator that enumerates the parts of a std::string from a Splitter. The text
>+// An iterator that enumerates the parts of a string from a Splitter. The text
> // to be split, the Delimiter, and the Predicate are all taken from the given
> // Splitter object. Iterators may only be compared if they refer to the same
> // Splitter instance.
>@@ -159,7 +159,7 @@ class SplitIterator {
>     }
>     const absl::string_view text = splitter_->text();
>     const absl::string_view d = delimiter_.Find(text, pos_);
>-    if (d.data() == text.end()) state_ = kLastState;
>+    if (d.data() == text.data() + text.size()) state_ = kLastState;
>     curr_ = text.substr(pos_, d.data() - (text.data() + pos_));
>     pos_ += curr_.size() + d.size();
>   } while (!predicate_(curr_));
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/match.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/match.cc
>index 25bd7f0b828bfdec72ca587d4871bffb692862e3..3d10c57784ebe92549855bcfb0776799966f1717 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/match.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/match.cc
>@@ -27,6 +27,13 @@ bool CaseEqual(absl::string_view piece1, absl::string_view piece2) {
> }
> }  // namespace
> 
>+bool EqualsIgnoreCase(absl::string_view piece1, absl::string_view piece2) {
>+  return (piece1.size() == piece2.size() &&
>+          0 == absl::strings_internal::memcasecmp(piece1.data(), piece2.data(),
>+                                                  piece1.size()));
>+  // memcasecmp uses absl::ascii_tolower().
>+}
>+
> bool StartsWithIgnoreCase(absl::string_view text, absl::string_view prefix) {
>   return (text.size() >= prefix.size()) &&
>          CaseEqual(text.substr(0, prefix.size()), prefix);
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/match.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/match.h
>index 108b6048b048ec265f518aa3a6ee04d1019f3436..6e8ed10fc50feb6f7196db419c4a38a4240b4f7b 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/match.h
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/match.h
>@@ -17,7 +17,7 @@
> // File: match.h
> // -----------------------------------------------------------------------------
> //
>-// This file contains simple utilities for performing std::string matching checks.
>+// This file contains simple utilities for performing string matching checks.
> // All of these function parameters are specified as `absl::string_view`,
> // meaning that these functions can accept `std::string`, `absl::string_view` or
> // nul-terminated C-style strings.
>@@ -41,14 +41,14 @@ namespace absl {
> 
> // StrContains()
> //
>-// Returns whether a given std::string `haystack` contains the substring `needle`.
>+// Returns whether a given string `haystack` contains the substring `needle`.
> inline bool StrContains(absl::string_view haystack, absl::string_view needle) {
>   return haystack.find(needle, 0) != haystack.npos;
> }
> 
> // StartsWith()
> //
>-// Returns whether a given std::string `text` begins with `prefix`.
>+// Returns whether a given string `text` begins with `prefix`.
> inline bool StartsWith(absl::string_view text, absl::string_view prefix) {
>   return prefix.empty() ||
>          (text.size() >= prefix.size() &&
>@@ -57,7 +57,7 @@ inline bool StartsWith(absl::string_view text, absl::string_view prefix) {
> 
> // EndsWith()
> //
>-// Returns whether a given std::string `text` ends with `suffix`.
>+// Returns whether a given string `text` ends with `suffix`.
> inline bool EndsWith(absl::string_view text, absl::string_view suffix) {
>   return suffix.empty() ||
>          (text.size() >= suffix.size() &&
>@@ -66,16 +66,22 @@ inline bool EndsWith(absl::string_view text, absl::string_view suffix) {
>   );
> }
> 
>-// StartsWithIgnoreCase()
>+// EqualsIgnoreCase()
> //
>-// Returns whether a given std::string `text` starts with `starts_with`, ignoring
>+// Returns whether given ASCII strings `piece1` and `piece2` are equal, ignoring
> // case in the comparison.
>+bool EqualsIgnoreCase(absl::string_view piece1, absl::string_view piece2);
>+
>+// StartsWithIgnoreCase()
>+//
>+// Returns whether a given ASCII string `text` starts with `starts_with`,
>+// ignoring case in the comparison.
> bool StartsWithIgnoreCase(absl::string_view text, absl::string_view prefix);
> 
> // EndsWithIgnoreCase()
> //
>-// Returns whether a given std::string `text` ends with `ends_with`, ignoring case
>-// in the comparison.
>+// Returns whether a given ASCII string `text` ends with `ends_with`, ignoring
>+// case in the comparison.
> bool EndsWithIgnoreCase(absl::string_view text, absl::string_view suffix);
> 
> }  // namespace absl
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/match_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/match_test.cc
>index d194f0e689bf6e001656c52e357b5d778a67b55d..c21e00bf807c94f7f7c5a88f265f20694cdf1143 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/match_test.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/match_test.cc
>@@ -80,6 +80,17 @@ TEST(MatchTest, ContainsNull) {
>   EXPECT_FALSE(absl::StrContains(cs, sv2));
> }
> 
>+TEST(MatchTest, EqualsIgnoreCase) {
>+  std::string text = "the";
>+  absl::string_view data(text);
>+
>+  EXPECT_TRUE(absl::EqualsIgnoreCase(data, "The"));
>+  EXPECT_TRUE(absl::EqualsIgnoreCase(data, "THE"));
>+  EXPECT_TRUE(absl::EqualsIgnoreCase(data, "the"));
>+  EXPECT_FALSE(absl::EqualsIgnoreCase(data, "Quick"));
>+  EXPECT_FALSE(absl::EqualsIgnoreCase(data, "then"));
>+}
>+
> TEST(MatchTest, StartsWithIgnoreCase) {
>   EXPECT_TRUE(absl::StartsWithIgnoreCase("foo", "foo"));
>   EXPECT_TRUE(absl::StartsWithIgnoreCase("foo", "Fo"));
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/numbers.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/numbers.cc
>index f842ed85e9f595efb8da1ec050275dd368a26e97..9309f9da9dcc5c907e6835eca6d047c90f2c6f32 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/numbers.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/numbers.cc
>@@ -12,7 +12,7 @@
> // See the License for the specific language governing permissions and
> // limitations under the License.
> 
>-// This file contains std::string processing functions related to
>+// This file contains string processing functions related to
> // numeric values.
> 
> #include "absl/strings/numbers.h"
>@@ -30,10 +30,10 @@
> #include <memory>
> #include <utility>
> 
>+#include "absl/base/internal/bits.h"
> #include "absl/base/internal/raw_logging.h"
> #include "absl/strings/ascii.h"
> #include "absl/strings/charconv.h"
>-#include "absl/strings/internal/bits.h"
> #include "absl/strings/internal/memutil.h"
> #include "absl/strings/str_cat.h"
> 
>@@ -161,8 +161,8 @@ bool SimpleAtob(absl::string_view str, bool* value) {
> //    their output to the beginning of the buffer. The caller is responsible
> //    for ensuring that the buffer has enough space to hold the output.
> //
>-//    Returns a pointer to the end of the std::string (i.e. the null character
>-//    terminating the std::string).
>+//    Returns a pointer to the end of the string (i.e. the null character
>+//    terminating the string).
> // ----------------------------------------------------------------------
> 
> namespace {
>@@ -344,7 +344,7 @@ static std::pair<uint64_t, uint64_t> Mul32(std::pair<uint64_t, uint64_t> num,
>   uint64_t bits128_up = (bits96_127 >> 32) + (bits64_127 < bits64_95);
>   if (bits128_up == 0) return {bits64_127, bits0_63};
> 
>-  int shift = 64 - strings_internal::CountLeadingZeros64(bits128_up);
>+  int shift = 64 - base_internal::CountLeadingZeros64(bits128_up);
>   uint64_t lo = (bits0_63 >> shift) + (bits64_127 << (64 - shift));
>   uint64_t hi = (bits64_127 >> shift) + (bits128_up << (64 - shift));
>   return {hi, lo};
>@@ -375,7 +375,7 @@ static std::pair<uint64_t, uint64_t> PowFive(uint64_t num, int expfive) {
>       5 * 5 * 5 * 5 * 5 * 5 * 5 * 5 * 5 * 5 * 5,
>       5 * 5 * 5 * 5 * 5 * 5 * 5 * 5 * 5 * 5 * 5 * 5};
>   result = Mul32(result, powers_of_five[expfive & 15]);
>-  int shift = strings_internal::CountLeadingZeros64(result.first);
>+  int shift = base_internal::CountLeadingZeros64(result.first);
>   if (shift != 0) {
>     result.first = (result.first << shift) + (result.second >> (64 - shift));
>     result.second = (result.second << shift);
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/numbers.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/numbers.h
>index cf3c597266cf71474d405dd0cfdea5cdeb58badc..f9b2ccef94d8bd55b1bd82de582e589cbd212f51 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/numbers.h
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/numbers.h
>@@ -41,32 +41,32 @@ namespace absl {
> 
> // SimpleAtoi()
> //
>-// Converts the given std::string into an integer value, returning `true` if
>-// successful. The std::string must reflect a base-10 integer (optionally followed or
>+// Converts the given string into an integer value, returning `true` if
>+// successful. The string must reflect a base-10 integer (optionally followed or
> // preceded by ASCII whitespace) whose value falls within the range of the
>-// integer type,
>+// integer type.
> template <typename int_type>
> ABSL_MUST_USE_RESULT bool SimpleAtoi(absl::string_view s, int_type* out);
> 
> // SimpleAtof()
> //
>-// Converts the given std::string (optionally followed or preceded by ASCII
>+// Converts the given string (optionally followed or preceded by ASCII
> // whitespace) into a float, which may be rounded on overflow or underflow.
>-// See http://en.cppreference.com/w/c/std::string/byte/strtof for details about the
>+// See http://en.cppreference.com/w/c/string/byte/strtof for details about the
> // allowed formats for `str`.
> ABSL_MUST_USE_RESULT bool SimpleAtof(absl::string_view str, float* value);
> 
> // SimpleAtod()
> //
>-// Converts the given std::string (optionally followed or preceded by ASCII
>+// Converts the given string (optionally followed or preceded by ASCII
> // whitespace) into a double, which may be rounded on overflow or underflow.
>-// See http://en.cppreference.com/w/c/std::string/byte/strtof for details about the
>+// See http://en.cppreference.com/w/c/string/byte/strtof for details about the
> // allowed formats for `str`.
> ABSL_MUST_USE_RESULT bool SimpleAtod(absl::string_view str, double* value);
> 
> // SimpleAtob()
> //
>-// Converts the given std::string into a boolean, returning `true` if successful.
>+// Converts the given string into a boolean, returning `true` if successful.
> // The following case-insensitive strings are interpreted as boolean `true`:
> // "true", "t", "yes", "y", "1". The following case-insensitive strings
> // are interpreted as boolean `false`: "false", "f", "no", "n", "0".
>@@ -169,9 +169,9 @@ ABSL_MUST_USE_RESULT bool safe_strtoi_base(absl::string_view s, int_type* out,
> 
> // SimpleAtoi()
> //
>-// Converts a std::string to an integer, using `safe_strto?()` functions for actual
>+// Converts a string to an integer, using `safe_strto?()` functions for actual
> // parsing, returning `true` if successful. The `safe_strto?()` functions apply
>-// strict checking; the std::string must be a base-10 integer, optionally followed or
>+// strict checking; the string must be a base-10 integer, optionally followed or
> // preceded by ASCII whitespace, with a value in the range of the corresponding
> // integer type.
> template <typename int_type>
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/numbers_benchmark.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/numbers_benchmark.cc
>index 8ef650b9a3eccdd79664dc79d2d6f261a6686a32..0570b758633ee0406d1217e9e70012b28c180c71 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/numbers_benchmark.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/numbers_benchmark.cc
>@@ -158,7 +158,7 @@ BENCHMARK(BM_safe_strtou64_string)
>     ->ArgPair(16, 10)
>     ->ArgPair(16, 16);
> 
>-// Returns a vector of `num_strings` strings. Each std::string represents a
>+// Returns a vector of `num_strings` strings. Each string represents a
> // floating point number with `num_digits` digits before the decimal point and
> // another `num_digits` digits after.
> std::vector<std::string> MakeFloatStrings(int num_strings, int num_digits) {
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/numbers_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/numbers_test.cc
>index 27cc0479e3412c838b4785dfafb8895efe096f63..36fc0d64fd5130507b1e811b38dfe57e32493b4d 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/numbers_test.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/numbers_test.cc
>@@ -12,7 +12,7 @@
> // See the License for the specific language governing permissions and
> // limitations under the License.
> 
>-// This file tests std::string processing functions related to numeric values.
>+// This file tests string processing functions related to numeric values.
> 
> #include "absl/strings/numbers.h"
> 
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_cat.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_cat.cc
>index 3fe8c95eca9ef979491693638900858b08725d0b..efa4fd73dbac77048747ecc4a7f1fcb0e67d7706 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_cat.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_cat.cc
>@@ -79,7 +79,7 @@ AlphaNum::AlphaNum(Dec dec) {
> // ----------------------------------------------------------------------
> // StrCat()
> //    This merges the given strings or integers, with no delimiter. This
>-//    is designed to be the fastest possible way to construct a std::string out
>+//    is designed to be the fastest possible way to construct a string out
> //    of a mix of raw C strings, StringPieces, strings, and integer values.
> // ----------------------------------------------------------------------
> 
>@@ -154,10 +154,10 @@ std::string CatPieces(std::initializer_list<absl::string_view> pieces) {
> }
> 
> // It's possible to call StrAppend with an absl::string_view that is itself a
>-// fragment of the std::string we're appending to. However the results of this are
>+// fragment of the string we're appending to. However the results of this are
> // random. Therefore, check for this in debug mode. Use unsigned math so we
> // only have to do one comparison. Note, there's an exception case: appending an
>-// empty std::string is always allowed.
>+// empty string is always allowed.
> #define ASSERT_NO_OVERLAP(dest, src) \
>   assert(((src).size() == 0) ||      \
>          (uintptr_t((src).data() - (dest).data()) > uintptr_t((dest).size())))
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_cat.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_cat.h
>index e5501a5012ee8d9cf79cbe6b50b04756d191bf9e..da9ed9a269e990b87922ff2d19aa9717e5e0d721 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_cat.h
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_cat.h
>@@ -23,7 +23,7 @@
> // designed to be used as a parameter type that efficiently manages conversion
> // to strings and avoids copies in the above operations.
> //
>-// Any routine accepting either a std::string or a number may accept `AlphaNum`.
>+// Any routine accepting either a string or a number may accept `AlphaNum`.
> // The basic idea is that by accepting a `const AlphaNum &` as an argument
> // to your function, your callers will automagically convert bools, integers,
> // and floating point values to strings for you.
>@@ -65,7 +65,7 @@
> namespace absl {
> 
> namespace strings_internal {
>-// AlphaNumBuffer allows a way to pass a std::string to StrCat without having to do
>+// AlphaNumBuffer allows a way to pass a string to StrCat without having to do
> // memory allocation. It is simply a pair of a fixed-size character array, and
> // a size. Please don't use outside of absl, yet.
> template <size_t max_size>
>@@ -119,8 +119,8 @@ enum PadSpec : uint8_t {
> // Hex
> // -----------------------------------------------------------------------------
> //
>-// `Hex` stores a set of hexadecimal std::string conversion parameters for use
>-// within `AlphaNum` std::string conversions.
>+// `Hex` stores a set of hexadecimal string conversion parameters for use
>+// within `AlphaNum` string conversions.
> struct Hex {
>   uint64_t value;
>   uint8_t width;
>@@ -168,8 +168,8 @@ struct Hex {
> // Dec
> // -----------------------------------------------------------------------------
> //
>-// `Dec` stores a set of decimal std::string conversion parameters for use
>-// within `AlphaNum` std::string conversions. Dec is slower than the default
>+// `Dec` stores a set of decimal string conversion parameters for use
>+// within `AlphaNum` string conversions. Dec is slower than the default
> // integer conversion, so use it only if you need padding.
> struct Dec {
>   uint64_t value;
>@@ -271,7 +271,7 @@ class AlphaNum {
> //
> // Merges given strings or numbers, using no delimiter(s).
> //
>-// `StrCat()` is designed to be the fastest possible way to construct a std::string
>+// `StrCat()` is designed to be the fastest possible way to construct a string
> // out of a mix of raw C strings, string_views, strings, bool values,
> // and numeric values.
> //
>@@ -279,7 +279,7 @@ class AlphaNum {
> // works poorly on strings built up out of fragments.
> //
> // For clarity and performance, don't use `StrCat()` when appending to a
>-// std::string. Use `StrAppend()` instead. In particular, avoid using any of these
>+// string. Use `StrAppend()` instead. In particular, avoid using any of these
> // (anti-)patterns:
> //
> //   str.append(StrCat(...))
>@@ -328,26 +328,26 @@ ABSL_MUST_USE_RESULT inline std::string StrCat(const AlphaNum& a, const AlphaNum
> // StrAppend()
> // -----------------------------------------------------------------------------
> //
>-// Appends a std::string or set of strings to an existing std::string, in a similar
>+// Appends a string or set of strings to an existing string, in a similar
> // fashion to `StrCat()`.
> //
> // WARNING: `StrAppend(&str, a, b, c, ...)` requires that none of the
> // a, b, c, parameters be a reference into str. For speed, `StrAppend()` does
> // not try to check each of its input arguments to be sure that they are not
>-// a subset of the std::string being appended to. That is, while this will work:
>+// a subset of the string being appended to. That is, while this will work:
> //
>-//   std::string s = "foo";
>+//   string s = "foo";
> //   s += s;
> //
> // This output is undefined:
> //
>-//   std::string s = "foo";
>+//   string s = "foo";
> //   StrAppend(&s, s);
> //
> // This output is undefined as well, since `absl::string_view` does not own its
> // data:
> //
>-//   std::string s = "foobar";
>+//   string s = "foobar";
> //   absl::string_view p = s;
> //   StrAppend(&s, p);
> 
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_cat_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_cat_test.cc
>index e9d67690638271920d26234d9739053c30b27489..555d8db3eb85438be5f8cc7f59dd5507d6d5ba54 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_cat_test.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_cat_test.cc
>@@ -28,7 +28,8 @@
> #define ABSL_EXPECT_DEBUG_DEATH(statement, regex) \
>   EXPECT_DEBUG_DEATH(statement, ".*")
> #else
>-#define ABSL_EXPECT_DEBUG_DEATH EXPECT_DEBUG_DEATH
>+#define ABSL_EXPECT_DEBUG_DEATH(statement, regex) \
>+  EXPECT_DEBUG_DEATH(statement, regex)
> #endif
> 
> namespace {
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_format.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_format.h
>index 70a811b75e2ff8863c9e3687c6c282f90c2317f9..2d07725de21fcf7906950f02d20615b5ad1098ad 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_format.h
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_format.h
>@@ -18,20 +18,20 @@
> // -----------------------------------------------------------------------------
> //
> // The `str_format` library is a typesafe replacement for the family of
>-// `printf()` std::string formatting routines within the `<cstdio>` standard library
>+// `printf()` string formatting routines within the `<cstdio>` standard library
> // header. Like the `printf` family, the `str_format` uses a "format string" to
> // perform argument substitutions based on types.
> //
> // Example:
> //
>-//   std::string s = absl::StrFormat("%s %s You have $%d!", "Hello", name, dollars);
>+//   string s = absl::StrFormat("%s %s You have $%d!", "Hello", name, dollars);
> //
> // The library consists of the following basic utilities:
> //
> //   * `absl::StrFormat()`, a type-safe replacement for `std::sprintf()`, to
>-//     write a format std::string to a `string` value.
>-//   * `absl::StrAppendFormat()` to append a format std::string to a `string`
>-//   * `absl::StreamFormat()` to more efficiently write a format std::string to a
>+//     write a format string to a `string` value.
>+//   * `absl::StrAppendFormat()` to append a format string to a `string`
>+//   * `absl::StreamFormat()` to more efficiently write a format string to a
> //     stream, such as`std::cout`.
> //   * `absl::PrintF()`, `absl::FPrintF()` and `absl::SNPrintF()` as
> //     replacements for `std::printf()`, `std::fprintf()` and `std::snprintf()`.
>@@ -39,15 +39,15 @@
> //     Note: a version of `std::sprintf()` is not supported as it is
> //     generally unsafe due to buffer overflows.
> //
>-// Additionally, you can provide a format std::string (and its associated arguments)
>+// Additionally, you can provide a format string (and its associated arguments)
> // using one of the following abstractions:
> //
>-//   * A `FormatSpec` class template fully encapsulates a format std::string and its
>+//   * A `FormatSpec` class template fully encapsulates a format string and its
> //     type arguments and is usually provided to `str_format` functions as a
> //     variadic argument of type `FormatSpec<Arg...>`. The `FormatSpec<Args...>`
> //     template is evaluated at compile-time, providing type safety.
> //   * A `ParsedFormat` instance, which encapsulates a specific, pre-compiled
>-//     format std::string for a specific set of type(s), and which can be passed
>+//     format string for a specific set of type(s), and which can be passed
> //     between API boundaries. (The `FormatSpec` type should not be used
> //     directly.)
> //
>@@ -60,7 +60,7 @@
> //
> //   * A `FormatUntyped()` function that is similar to `Format()` except it is
> //     loosely typed. `FormatUntyped()` is not a template and does not perform
>-//     any compile-time checking of the format std::string; instead, it returns a
>+//     any compile-time checking of the format string; instead, it returns a
> //     boolean from a runtime check.
> //
> // In addition, the `str_format` library provides extension points for
>@@ -89,7 +89,7 @@ namespace absl {
> // Example:
> //
> //   absl::UntypedFormatSpec format("%d");
>-//   std::string out;
>+//   string out;
> //   CHECK(absl::FormatUntyped(&out, format, {absl::FormatArg(1)}));
> class UntypedFormatSpec {
>  public:
>@@ -135,7 +135,7 @@ str_format_internal::StreamedWrapper<T> FormatStreamed(const T& v) {
> // Example:
> //
> //   int n = 0;
>-//   std::string s = absl::StrFormat("%s%d%n", "hello", 123,
>+//   string s = absl::StrFormat("%s%d%n", "hello", 123,
> //                      absl::FormatCountCapture(&n));
> //   EXPECT_EQ(8, n);
> class FormatCountCapture {
>@@ -155,10 +155,10 @@ class FormatCountCapture {
> 
> // FormatSpec
> //
>-// The `FormatSpec` type defines the makeup of a format std::string within the
>+// The `FormatSpec` type defines the makeup of a format string within the
> // `str_format` library. You should not need to use or manipulate this type
> // directly. A `FormatSpec` is a variadic class template that is evaluated at
>-// compile-time, according to the format std::string and arguments that are passed
>+// compile-time, according to the format string and arguments that are passed
> // to it.
> //
> // For a `FormatSpec` to be valid at compile-time, it must be provided as
>@@ -166,12 +166,12 @@ class FormatCountCapture {
> //
> //   * A `constexpr` literal or `absl::string_view`, which is how it most often
> //     used.
>-//   * A `ParsedFormat` instantiation, which ensures the format std::string is
>+//   * A `ParsedFormat` instantiation, which ensures the format string is
> //     valid before use. (See below.)
> //
> // Example:
> //
>-//   // Provided as a std::string literal.
>+//   // Provided as a string literal.
> //   absl::StrFormat("Welcome to %s, Number %d!", "The Village", 6);
> //
> //   // Provided as a constexpr absl::string_view.
>@@ -183,7 +183,7 @@ class FormatCountCapture {
> //   absl::ParsedFormat<'s', 'd'> formatString("Welcome to %s, Number %d!");
> //   absl::StrFormat(formatString, "TheVillage", 6);
> //
>-// A format std::string generally follows the POSIX syntax as used within the POSIX
>+// A format string generally follows the POSIX syntax as used within the POSIX
> // `printf` specification.
> //
> // (See http://pubs.opengroup.org/onlinepubs/9699919799/utilities/printf.html.)
>@@ -223,7 +223,7 @@ class FormatCountCapture {
> //     "%p", *int -> "0x7ffdeb6ad2a4"
> //
> //     int n = 0;
>-//     std::string s = absl::StrFormat(
>+//     string s = absl::StrFormat(
> //         "%s%d%n", "hello", 123, absl::FormatCountCapture(&n));
> //     EXPECT_EQ(8, n);
> //
>@@ -236,7 +236,7 @@ class FormatCountCapture {
> //
> // However, in the `str_format` library, a format conversion specifies a broader
> // C++ conceptual category instead of an exact type. For example, `%s` binds to
>-// any std::string-like argument, so `std::string`, `absl::string_view`, and
>+// any string-like argument, so `std::string`, `absl::string_view`, and
> // `const char*` are all accepted. Likewise, `%d` accepts any integer-like
> // argument, etc.
> 
>@@ -248,7 +248,7 @@ using FormatSpec =
> //
> // A `ParsedFormat` is a class template representing a preparsed `FormatSpec`,
> // with template arguments specifying the conversion characters used within the
>-// format std::string. Such characters must be valid format type specifiers, and
>+// format string. Such characters must be valid format type specifiers, and
> // these type specifiers are checked at compile-time.
> //
> // Instances of `ParsedFormat` can be created, copied, and reused to speed up
>@@ -275,26 +275,26 @@ using ParsedFormat = str_format_internal::ExtendedParsedFormat<
> 
> // StrFormat()
> //
>-// Returns a `string` given a `printf()`-style format std::string and zero or more
>+// Returns a `string` given a `printf()`-style format string and zero or more
> // additional arguments. Use it as you would `sprintf()`. `StrFormat()` is the
> // primary formatting function within the `str_format` library, and should be
> // used in most cases where you need type-safe conversion of types into
> // formatted strings.
> //
>-// The format std::string generally consists of ordinary character data along with
>+// The format string generally consists of ordinary character data along with
> // one or more format conversion specifiers (denoted by the `%` character).
>-// Ordinary character data is returned unchanged into the result std::string, while
>+// Ordinary character data is returned unchanged into the result string, while
> // each conversion specification performs a type substitution from
> // `StrFormat()`'s other arguments. See the comments for `FormatSpec` for full
>-// information on the makeup of this format std::string.
>+// information on the makeup of this format string.
> //
> // Example:
> //
>-//   std::string s = absl::StrFormat(
>+//   string s = absl::StrFormat(
> //       "Welcome to %s, Number %d!", "The Village", 6);
> //   EXPECT_EQ("Welcome to The Village, Number 6!", s);
> //
>-// Returns an empty std::string in case of error.
>+// Returns an empty string in case of error.
> template <typename... Args>
> ABSL_MUST_USE_RESULT std::string StrFormat(const FormatSpec<Args...>& format,
>                                            const Args&...
args) { >@@ -305,13 +305,13 @@ ABSL_MUST_USE_RESULT std::string StrFormat(const FormatSpec<Args...>& format, > > // StrAppendFormat() > // >-// Appends to a `dst` std::string given a format std::string, and zero or more additional >+// Appends to a `dst` string given a format string, and zero or more additional > // arguments, returning `*dst` as a convenience for chaining purposes. Appends > // nothing in case of error (but possibly alters its capacity). > // > // Example: > // >-// std::string orig("For example PI is approximately "); >+// string orig("For example PI is approximately "); > // std::cout << StrAppendFormat(&orig, "%12.6f", 3.14); > template <typename... Args> > std::string& StrAppendFormat(std::string* dst, const FormatSpec<Args...>& format, >@@ -323,7 +323,7 @@ std::string& StrAppendFormat(std::string* dst, const FormatSpec<Args...>& format > > // StreamFormat() > // >-// Writes to an output stream given a format std::string and zero or more arguments, >+// Writes to an output stream given a format string and zero or more arguments, > // generally in a manner that is more efficient than streaming the result of > // `absl:: StrFormat()`. The returned object must be streamed before the full > // expression ends. >@@ -341,7 +341,7 @@ ABSL_MUST_USE_RESULT str_format_internal::Streamable StreamFormat( > > // PrintF() > // >-// Writes to stdout given a format std::string and zero or more arguments. This >+// Writes to stdout given a format string and zero or more arguments. This > // function is functionally equivalent to `std::printf()` (and type-safe); > // prefer `absl::PrintF()` over `std::printf()`. > // >@@ -361,14 +361,14 @@ int PrintF(const FormatSpec<Args...>& format, const Args&... args) { > > // FPrintF() > // >-// Writes to a file given a format std::string and zero or more arguments. This >+// Writes to a file given a format string and zero or more arguments. 
This > // function is functionally equivalent to `std::fprintf()` (and type-safe); > // prefer `absl::FPrintF()` over `std::fprintf()`. > // > // Example: > // > // std::string_view s = "Ulaanbaatar"; >-// absl::FPrintF("The capital of Mongolia is %s", s); >+// absl::FPrintF(stdout, "The capital of Mongolia is %s", s); > // > // Outputs: "The capital of Mongolia is Ulaanbaatar" > // >@@ -382,7 +382,7 @@ int FPrintF(std::FILE* output, const FormatSpec<Args...>& format, > > // SNPrintF() > // >-// Writes to a sized buffer given a format std::string and zero or more arguments. >+// Writes to a sized buffer given a format string and zero or more arguments. > // This function is functionally equivalent to `std::snprintf()` (and > // type-safe); prefer `absl::SNPrintF()` over `std::snprintf()`. > // >@@ -430,14 +430,14 @@ class FormatRawSink { > > // Format() > // >-// Writes a formatted std::string to an arbitrary sink object (implementing the >-// `absl::FormatRawSink` interface), using a format std::string and zero or more >+// Writes a formatted string to an arbitrary sink object (implementing the >+// `absl::FormatRawSink` interface), using a format string and zero or more > // additional arguments. > // > // By default, `string` and `std::ostream` are supported as destination objects. > // > // `absl::Format()` is a generic version of `absl::StrFormat(), for custom >-// sinks. The format std::string, like format strings for `StrFormat()`, is checked >+// sinks. The format string, like format strings for `StrFormat()`, is checked > // at compile-time. 
> // > // On failure, this function returns `false` and the state of the sink is >@@ -463,13 +463,13 @@ using FormatArg = str_format_internal::FormatArgImpl; > > // FormatUntyped() > // >-// Writes a formatted std::string to an arbitrary sink object (implementing the >+// Writes a formatted string to an arbitrary sink object (implementing the > // `absl::FormatRawSink` interface), using an `UntypedFormatSpec` and zero or > // more additional arguments. > // > // This function acts as the most generic formatting function in the > // `str_format` library. The caller provides a raw sink, an unchecked format >-// std::string, and (usually) a runtime specified list of arguments; no compile-time >+// string, and (usually) a runtime specified list of arguments; no compile-time > // checking of formatting is performed within this function. As a result, a > // caller should check the return value to verify that no error occurred. > // On failure, this function returns `false` and the state of the sink is >@@ -483,9 +483,9 @@ using FormatArg = str_format_internal::FormatArgImpl; > // > // Example: > // >-// std::optional<std::string> FormatDynamic(const std::string& in_format, >-// const vector<std::string>& in_args) { >-// std::string out; >+// std::optional<string> FormatDynamic(const string& in_format, >+// const vector<string>& in_args) { >+// string out; > // std::vector<absl::FormatArg> args; > // for (const auto& v : in_args) { > // // It is important that 'v' is a reference to the objects in in_args. 
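The `str_format.h` comment changes above are documentation-only, but the contract they describe (printf-style formatting that returns a `std::string`, empty on error) can be sketched with the standard library alone. This is an illustrative stand-in, not abseil's implementation, and unlike `absl::StrFormat` it performs no compile-time checking of the format string:

```cpp
#include <cstdio>
#include <string>

// Standard-library sketch of the StrFormat contract documented above:
// format into a std::string, returning an empty string on error.
// SketchStrFormat is a hypothetical name; it is not part of abseil.
template <typename... Args>
std::string SketchStrFormat(const char* format, Args... args) {
  // First pass: ask snprintf how many characters are needed.
  int needed = std::snprintf(nullptr, 0, format, args...);
  if (needed < 0) return std::string();  // error: empty string, per the docs
  std::string out(static_cast<size_t>(needed), '\0');
  // Second pass: write into the string's buffer (snprintf appends the NUL).
  std::snprintf(&out[0], out.size() + 1, format, args...);
  return out;
}
```

The two-pass `snprintf` idiom trades a second format pass for exact allocation; the real `absl::StrFormat` avoids both the double pass and the runtime type risks.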
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_format_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_format_test.cc >index fed75fafb5c65c11b92f8edbca538a4acdd1177c..aa14e211709ab7393796a320830c755916acc883 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_format_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_format_test.cc >@@ -391,7 +391,7 @@ TEST(StrFormat, BehavesAsDocumented) { > #endif > > // Other conversion >- int64_t value = 0x7ffdeb6; >+ int64_t value = 0x7ffdeb4; > auto ptr_value = static_cast<uintptr_t>(value); > const int& something = *reinterpret_cast<const int*>(ptr_value); > EXPECT_EQ(StrFormat("%p", &something), StrFormat("0x%x", ptr_value)); >@@ -601,3 +601,39 @@ TEST_F(ParsedFormatTest, RegressionMixPositional) { > > } // namespace > } // namespace absl >+ >+// Some codegen thunks that we can use to easily dump the generated assembly for >+// different StrFormat calls. 
>+ >+inline std::string CodegenAbslStrFormatInt(int i) { >+ return absl::StrFormat("%d", i); >+} >+ >+inline std::string CodegenAbslStrFormatIntStringInt64(int i, const std::string& s, >+ int64_t i64) { >+ return absl::StrFormat("%d %s %d", i, s, i64); >+} >+ >+inline void CodegenAbslStrAppendFormatInt(std::string* out, int i) { >+ absl::StrAppendFormat(out, "%d", i); >+} >+ >+inline void CodegenAbslStrAppendFormatIntStringInt64(std::string* out, int i, >+ const std::string& s, >+ int64_t i64) { >+ absl::StrAppendFormat(out, "%d %s %d", i, s, i64); >+} >+ >+auto absl_internal_str_format_force_codegen_funcs = std::make_tuple( >+ CodegenAbslStrFormatInt, CodegenAbslStrFormatIntStringInt64, >+ CodegenAbslStrAppendFormatInt, CodegenAbslStrAppendFormatIntStringInt64); >+ >+bool absl_internal_str_format_force_codegen_always_false; >+// Force the compiler to generate the functions by making it look like we >+// escape the function pointers. >+// It can't statically know that >+// absl_internal_str_format_force_codegen_always_false is not changed by someone >+// else. >+bool absl_internal_str_format_force_codegen = >+ absl_internal_str_format_force_codegen_always_false && >+ printf("%p", &absl_internal_str_format_force_codegen_funcs) == 0; >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_join.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_join.h >index bd4d0e1d9324cc729214449935b57337a6716fb7..f9611ad3b47d72d0d992bd2bef9075c8608fd25e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_join.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_join.h >@@ -18,10 +18,10 @@ > // ----------------------------------------------------------------------------- > // > // This header file contains functions for joining a range of elements and >-// returning the result as a std::string. 
StrJoin operations are specified by passing >-// a range, a separator std::string to use between the elements joined, and an >+// returning the result as a string. StrJoin operations are specified by passing >+// a range, a separator string to use between the elements joined, and an > // optional Formatter responsible for converting each argument in the range to a >-// std::string. If omitted, a default `AlphaNumFormatter()` is called on the elements >+// string. If omitted, a default `AlphaNumFormatter()` is called on the elements > // to be joined, using the same formatting that `absl::StrCat()` uses. This > // package defines a number of default formatters, and you can define your own > // implementations. >@@ -29,7 +29,7 @@ > // Ranges are specified by passing a container with `std::begin()` and > // `std::end()` iterators, container-specific `begin()` and `end()` iterators, a > // brace-initialized `std::initializer_list`, or a `std::tuple` of heterogeneous >-// objects. The separator std::string is specified as an `absl::string_view`. >+// objects. The separator string is specified as an `absl::string_view`. > // > // Because the default formatter uses the `absl::AlphaNum` class, > // `absl::StrJoin()`, like `absl::StrCat()`, will work out-of-the-box on >@@ -37,8 +37,8 @@ > // > // Example: > // >-// std::vector<std::string> v = {"foo", "bar", "baz"}; >-// std::string s = absl::StrJoin(v, "-"); >+// std::vector<string> v = {"foo", "bar", "baz"}; >+// string s = absl::StrJoin(v, "-"); > // EXPECT_EQ("foo-bar-baz", s); > // > // See comments on the `absl::StrJoin()` function for more examples. 
>@@ -52,6 +52,7 @@ > #include <iterator> > #include <string> > #include <tuple> >+#include <type_traits> > #include <utility> > > #include "absl/base/macros.h" >@@ -65,16 +66,16 @@ namespace absl { > // ----------------------------------------------------------------------------- > // > // A Formatter is a function object that is responsible for formatting its >-// argument as a std::string and appending it to a given output std::string. Formatters >+// argument as a string and appending it to a given output string. Formatters > // may be implemented as function objects, lambdas, or normal functions. You may > // provide your own Formatter to enable `absl::StrJoin()` to work with arbitrary > // types. > // > // The following is an example of a custom Formatter that simply uses >-// `std::to_string()` to format an integer as a std::string. >+// `std::to_string()` to format an integer as a string. > // > // struct MyFormatter { >-// void operator()(std::string* out, int i) const { >+// void operator()(string* out, int i) const { > // out->append(std::to_string(i)); > // } > // }; >@@ -83,7 +84,7 @@ namespace absl { > // argument to `absl::StrJoin()`: > // > // std::vector<int> v = {1, 2, 3, 4}; >-// std::string s = absl::StrJoin(v, "-", MyFormatter()); >+// string s = absl::StrJoin(v, "-", MyFormatter()); > // EXPECT_EQ("1-2-3-4", s); > // > // The following standard formatters are provided within this file: >@@ -155,10 +156,10 @@ DereferenceFormatter() { > // StrJoin() > // ----------------------------------------------------------------------------- > // >-// Joins a range of elements and returns the result as a std::string. >-// `absl::StrJoin()` takes a range, a separator std::string to use between the >+// Joins a range of elements and returns the result as a string. 
>+// `absl::StrJoin()` takes a range, a separator string to use between the > // elements joined, and an optional Formatter responsible for converting each >-// argument in the range to a std::string. >+// argument in the range to a string. > // > // If omitted, the default `AlphaNumFormatter()` is called on the elements to be > // joined. >@@ -166,22 +167,22 @@ DereferenceFormatter() { > // Example 1: > // // Joins a collection of strings. This pattern also works with a collection > // // of `absl::string_view` or even `const char*`. >-// std::vector<std::string> v = {"foo", "bar", "baz"}; >-// std::string s = absl::StrJoin(v, "-"); >+// std::vector<string> v = {"foo", "bar", "baz"}; >+// string s = absl::StrJoin(v, "-"); > // EXPECT_EQ("foo-bar-baz", s); > // > // Example 2: > // // Joins the values in the given `std::initializer_list<>` specified using > // // brace initialization. This pattern also works with an initializer_list > // // of ints or `absl::string_view` -- any `AlphaNum`-compatible type. >-// std::string s = absl::StrJoin({"foo", "bar", "baz"}, "-"); >+// string s = absl::StrJoin({"foo", "bar", "baz"}, "-"); > // EXPECT_EQ("foo-bar-baz", s); > // > // Example 3: > // // Joins a collection of ints. This pattern also works with floats, > // // doubles, int64s -- any `StrCat()`-compatible type. > // std::vector<int> v = {1, 2, 3, -4}; >-// std::string s = absl::StrJoin(v, "-"); >+// string s = absl::StrJoin(v, "-"); > // EXPECT_EQ("1-2-3--4", s); > // > // Example 4: >@@ -192,7 +193,7 @@ DereferenceFormatter() { > // // `std::vector<int*>`. 
> // int x = 1, y = 2, z = 3; > // std::vector<int*> v = {&x, &y, &z}; >-// std::string s = absl::StrJoin(v, "-"); >+// string s = absl::StrJoin(v, "-"); > // EXPECT_EQ("1-2-3", s); > // > // Example 5: >@@ -201,42 +202,42 @@ DereferenceFormatter() { > // v.emplace_back(new int(1)); > // v.emplace_back(new int(2)); > // v.emplace_back(new int(3)); >-// std::string s = absl::StrJoin(v, "-"); >+// string s = absl::StrJoin(v, "-"); > // EXPECT_EQ("1-2-3", s); > // > // Example 6: > // // Joins a `std::map`, with each key-value pair separated by an equals > // // sign. This pattern would also work with, say, a > // // `std::vector<std::pair<>>`. >-// std::map<std::string, int> m = { >+// std::map<string, int> m = { > // std::make_pair("a", 1), > // std::make_pair("b", 2), > // std::make_pair("c", 3)}; >-// std::string s = absl::StrJoin(m, ",", absl::PairFormatter("=")); >+// string s = absl::StrJoin(m, ",", absl::PairFormatter("=")); > // EXPECT_EQ("a=1,b=2,c=3", s); > // > // Example 7: > // // These examples show how `absl::StrJoin()` handles a few common edge > // // cases: >-// std::vector<std::string> v_empty; >+// std::vector<string> v_empty; > // EXPECT_EQ("", absl::StrJoin(v_empty, "-")); > // >-// std::vector<std::string> v_one_item = {"foo"}; >+// std::vector<string> v_one_item = {"foo"}; > // EXPECT_EQ("foo", absl::StrJoin(v_one_item, "-")); > // >-// std::vector<std::string> v_empty_string = {""}; >+// std::vector<string> v_empty_string = {""}; > // EXPECT_EQ("", absl::StrJoin(v_empty_string, "-")); > // >-// std::vector<std::string> v_one_item_empty_string = {"a", ""}; >+// std::vector<string> v_one_item_empty_string = {"a", ""}; > // EXPECT_EQ("a-", absl::StrJoin(v_one_item_empty_string, "-")); > // >-// std::vector<std::string> v_two_empty_string = {"", ""}; >+// std::vector<string> v_two_empty_string = {"", ""}; > // EXPECT_EQ("-", absl::StrJoin(v_two_empty_string, "-")); > // > // Example 8: > // // Joins a `std::tuple<T...>` of heterogeneous types, 
converting each to >-// // a std::string using the `absl::AlphaNum` class. >-// std::string s = absl::StrJoin(std::make_tuple(123, "abc", 0.456), "-"); >+// // a string using the `absl::AlphaNum` class. >+// string s = absl::StrJoin(std::make_tuple(123, "abc", 0.456), "-"); > // EXPECT_EQ("123-abc-0.456", s); > > template <typename Iterator, typename Formatter> >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_replace.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_replace.h >index f4d9bb9545d689762180a1a430e64b5065bcf27c..3bfe4c61ebd1504d3c8eb8e50ba870fa8fa68939 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_replace.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_replace.h >@@ -17,19 +17,19 @@ > // File: str_replace.h > // ----------------------------------------------------------------------------- > // >-// This file defines `absl::StrReplaceAll()`, a general-purpose std::string >+// This file defines `absl::StrReplaceAll()`, a general-purpose string > // replacement function designed for large, arbitrary text substitutions, > // especially on strings which you are receiving from some other system for > // further processing (e.g. processing regular expressions, escaping HTML >-// entities, etc. `StrReplaceAll` is designed to be efficient even when only >+// entities, etc.). `StrReplaceAll` is designed to be efficient even when only > // one substitution is being performed, or when substitution is rare. > // >-// If the std::string being modified is known at compile-time, and the substitutions >+// If the string being modified is known at compile-time, and the substitutions > // vary, `absl::Substitute()` may be a better choice. 
> //
> // Example:
> //
>-// std::string html_escaped = absl::StrReplaceAll(user_input, {
>+// string html_escaped = absl::StrReplaceAll(user_input, {
> // {"&", "&amp;"},
> // {"<", "&lt;"},
> // {">", "&gt;"},
>@@ -49,16 +49,16 @@ namespace absl {
> 
> // StrReplaceAll()
> //
>-// Replaces character sequences within a given std::string with replacements provided
>+// Replaces character sequences within a given string with replacements provided
> // within an initializer list of key/value pairs. Candidate replacements are
>-// considered in order as they occur within the std::string, with earlier matches
>+// considered in order as they occur within the string, with earlier matches
> // taking precedence, and longer matches taking precedence for candidates
>-// starting at the same position in the std::string. Once a substitution is made, the
>+// starting at the same position in the string. Once a substitution is made, the
> // replaced text is not considered for any further substitutions.
> //
> // Example:
> //
>-// std::string s = absl::StrReplaceAll("$who bought $count #Noun. Thanks $who!",
>+// string s = absl::StrReplaceAll("$who bought $count #Noun. Thanks $who!",
> // {{"$count", absl::StrCat(5)},
> // {"$who", "Bob"},
> // {"#Noun", "Apples"}});
>@@ -78,28 +78,28 @@ ABSL_MUST_USE_RESULT std::string StrReplaceAll(
> // replacements["$who"] = "Bob";
> // replacements["$count"] = "5";
> // replacements["#Noun"] = "Apples";
>-// std::string s = absl::StrReplaceAll("$who bought $count #Noun. Thanks $who!",
>+// string s = absl::StrReplaceAll("$who bought $count #Noun. Thanks $who!",
> // replacements);
> // EXPECT_EQ("Bob bought 5 Apples. Thanks Bob!", s);
> //
> // // A std::vector of std::pair elements can be more efficient.
>-// std::vector<std::pair<const absl::string_view, std::string>> replacements;
>+// std::vector<std::pair<const absl::string_view, string>> replacements;
> // replacements.push_back({"&", "&amp;"});
> // replacements.push_back({"<", "&lt;"});
> // replacements.push_back({">", "&gt;"});
>-// std::string s = absl::StrReplaceAll("if (ptr < &foo)",
>+// string s = absl::StrReplaceAll("if (ptr < &foo)",
> // replacements);
> // EXPECT_EQ("if (ptr &lt; &amp;foo)", s);
> template <typename StrToStrMapping>
> std::string StrReplaceAll(absl::string_view s, const StrToStrMapping& replacements);
> 
> // Overload of `StrReplaceAll()` to replace character sequences within a given
>-// output std::string *in place* with replacements provided within an initializer
>+// output string *in place* with replacements provided within an initializer
> // list of key/value pairs, returning the number of substitutions that occurred.
> //
> // Example:
> //
>-// std::string s = std::string("$who bought $count #Noun. Thanks $who!");
>+// string s = std::string("$who bought $count #Noun. Thanks $who!");
> // int count;
> // count = absl::StrReplaceAll({{"$count", absl::StrCat(5)},
> // {"$who", "Bob"},
>@@ -112,12 +112,12 @@ int StrReplaceAll(
> std::string* target);
> 
> // Overload of `StrReplaceAll()` to replace patterns within a given output
>-// std::string *in place* with replacements provided within a container of key/value
>+// string *in place* with replacements provided within a container of key/value
> // pairs.
> //
> // Example:
> //
>-// std::string s = std::string("if (ptr < &foo)");
>+// string s = std::string("if (ptr < &foo)");
> // int count = absl::StrReplaceAll({{"&", "&amp;"},
> // {"<", "&lt;"},
> // {">", "&gt;"}}, &s);
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_replace_benchmark.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_replace_benchmark.cc
>index e608de8d19e41ba73c3d54ffa37643c72be802db..8386f2e6bf10ece46ba479ae253d879667c5e1f7 100644
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_replace_benchmark.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_replace_benchmark.cc
>@@ -38,16 +38,16 @@ struct Replacement {
> {"liquor", "shakes"}, //
> };
> 
>-// Here, we set up a std::string for use in global-replace benchmarks.
>+// Here, we set up a string for use in global-replace benchmarks.
> // We started with a million blanks, and then deterministically insert
>-// 10,000 copies each of two pangrams. The result is a std::string that is
>+// 10,000 copies each of two pangrams. The result is a string that is
> // 40% blank space and 60% these words. 'the' occurs 18,247 times and
> // all the substitutions together occur 49,004 times.
> //
>-// We then create "after_replacing_the" to be a std::string that is a result of
>+// We then create "after_replacing_the" to be a string that is a result of
> // replacing "the" with "box" in big_string.
> //
>-// And then we create "after_replacing_many" to be a std::string that is result
>+// And then we create "after_replacing_many" to be a string that is result
> // of preferring several substitutions.
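The ordering rules documented in the `str_replace.h` comments above (earlier matches take precedence, longer keys win at the same position, and replaced text is never rescanned) amount to a single left-to-right pass. The sketch below models that documented contract with the standard library; it is not abseil's (more efficient) implementation, and `SketchStrReplaceAll` is a hypothetical name:

```cpp
#include <string>
#include <utility>
#include <vector>

// Single left-to-right pass: at each position, pick the longest matching
// key, emit its replacement, and skip past the match so replaced text is
// never considered for further substitutions.
std::string SketchStrReplaceAll(
    const std::string& s,
    const std::vector<std::pair<std::string, std::string>>& replacements) {
  std::string out;
  size_t pos = 0;
  while (pos < s.size()) {
    const std::pair<std::string, std::string>* best = nullptr;
    for (const auto& r : replacements) {
      if (!r.first.empty() && s.compare(pos, r.first.size(), r.first) == 0 &&
          (best == nullptr || r.first.size() > best->first.size())) {
        best = &r;  // longest match at this position wins
      }
    }
    if (best != nullptr) {
      out += best->second;
      pos += best->first.size();
    } else {
      out += s[pos++];  // no key matches here; copy one character
    }
  }
  return out;
}
```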
> void SetUpStrings() { > if (big_string == nullptr) { >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_split.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_split.cc >index 0207213c26baf913d9981497943d7867a1cdeeea..0a68c52d14f19b88d3add2e37227c513c59d1246 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_split.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_split.cc >@@ -43,10 +43,11 @@ absl::string_view GenericFind(absl::string_view text, > if (delimiter.empty() && text.length() > 0) { > // Special case for empty std::string delimiters: always return a zero-length > // absl::string_view referring to the item at position 1 past pos. >- return absl::string_view(text.begin() + pos + 1, 0); >+ return absl::string_view(text.data() + pos + 1, 0); > } > size_t found_pos = absl::string_view::npos; >- absl::string_view found(text.end(), 0); // By default, not found >+ absl::string_view found(text.data() + text.size(), >+ 0); // By default, not found > found_pos = find_policy.Find(text, delimiter, pos); > if (found_pos != absl::string_view::npos) { > found = absl::string_view(text.data() + found_pos, >@@ -87,7 +88,7 @@ absl::string_view ByString::Find(absl::string_view text, size_t pos) const { > // absl::string_view. 
> size_t found_pos = text.find(delimiter_[0], pos); > if (found_pos == absl::string_view::npos) >- return absl::string_view(text.end(), 0); >+ return absl::string_view(text.data() + text.size(), 0); > return text.substr(found_pos, 1); > } > return GenericFind(text, delimiter_, pos, LiteralPolicy()); >@@ -100,7 +101,7 @@ absl::string_view ByString::Find(absl::string_view text, size_t pos) const { > absl::string_view ByChar::Find(absl::string_view text, size_t pos) const { > size_t found_pos = text.find(c_, pos); > if (found_pos == absl::string_view::npos) >- return absl::string_view(text.end(), 0); >+ return absl::string_view(text.data() + text.size(), 0); > return text.substr(found_pos, 1); > } > >@@ -128,9 +129,9 @@ absl::string_view ByLength::Find(absl::string_view text, > // If the std::string is shorter than the chunk size we say we > // "can't find the delimiter" so this will be the last chunk. > if (substr.length() <= static_cast<size_t>(length_)) >- return absl::string_view(text.end(), 0); >+ return absl::string_view(text.data() + text.size(), 0); > >- return absl::string_view(substr.begin() + length_, 0); >+ return absl::string_view(substr.data() + length_, 0); > } > > } // namespace absl >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_split.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_split.h >index 9a7be2b0534425872c25e986aebc21db7e7fb756..c7eb280c41130ab86e343369c7e5fc90cd69de26 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_split.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_split.h >@@ -19,13 +19,13 @@ > // > // This file contains functions for splitting strings. It defines the main > // `StrSplit()` function, several delimiters for determining the boundaries on >-// which to split the std::string, and predicates for filtering delimited results. 
>+// which to split the string, and predicates for filtering delimited results. > // `StrSplit()` adapts the returned collection to the type specified by the > // caller. > // > // Example: > // >-// // Splits the given std::string on commas. Returns the results in a >+// // Splits the given string on commas. Returns the results in a > // // vector of strings. > // std::vector<std::string> v = absl::StrSplit("a,b,c", ','); > // // Can also use "," >@@ -55,7 +55,7 @@ namespace absl { > //------------------------------------------------------------------------------ > // > // `StrSplit()` uses delimiters to define the boundaries between elements in the >-// provided input. Several `Delimiter` types are defined below. If a std::string >+// provided input. Several `Delimiter` types are defined below. If a string > // (`const char*`, `std::string`, or `absl::string_view`) is passed in place of > // an explicit `Delimiter` object, `StrSplit()` treats it the same way as if it > // were passed a `ByString` delimiter. >@@ -65,7 +65,7 @@ namespace absl { > // > // The following `Delimiter` types are available for use within `StrSplit()`: > // >-// - `ByString` (default for std::string arguments) >+// - `ByString` (default for string arguments) > // - `ByChar` (default for a char argument) > // - `ByAnyChar` > // - `ByLength` >@@ -76,15 +76,15 @@ namespace absl { > // be split and the position to begin searching for the next delimiter in the > // input text. The returned absl::string_view should refer to the next > // occurrence (after pos) of the represented delimiter; this returned >-// absl::string_view represents the next location where the input std::string should >+// absl::string_view represents the next location where the input string should > // be broken. The returned absl::string_view may be zero-length if the Delimiter >-// does not represent a part of the std::string (e.g., a fixed-length delimiter). 
If >+// does not represent a part of the string (e.g., a fixed-length delimiter). If > // no delimiter is found in the given text, a zero-length absl::string_view > // referring to text.end() should be returned (e.g., > // absl::string_view(text.end(), 0)). It is important that the returned > // absl::string_view always be within the bounds of input text given as an >-// argument--it must not refer to a std::string that is physically located outside of >-// the given std::string. >+// argument--it must not refer to a string that is physically located outside of >+// the given string. > // > // The following example is a simple Delimiter object that is created with a > // single char and will look for that char in the text passed to the Find() >@@ -104,13 +104,13 @@ namespace absl { > > // ByString > // >-// A sub-std::string delimiter. If `StrSplit()` is passed a std::string in place of a >-// `Delimiter` object, the std::string will be implicitly converted into a >+// A sub-string delimiter. If `StrSplit()` is passed a string in place of a >+// `Delimiter` object, the string will be implicitly converted into a > // `ByString` delimiter. > // > // Example: > // >-// // Because a std::string literal is converted to an `absl::ByString`, >+// // Because a string literal is converted to an `absl::ByString`, > // // the following two splits are equivalent. > // > // std::vector<std::string> v1 = absl::StrSplit("a, b, c", ", "); >@@ -131,7 +131,7 @@ class ByString { > // ByChar > // > // A single character delimiter. `ByChar` is functionally equivalent to a >-// 1-char std::string within a `ByString` delimiter, but slightly more >+// 1-char string within a `ByString` delimiter, but slightly more > // efficient. > // > // Example: >@@ -164,9 +164,9 @@ class ByChar { > // ByAnyChar > // > // A delimiter that will match any of the given byte-sized characters within >-// its provided std::string. >+// its provided string. 
> // >-// Note: this delimiter works with single-byte std::string data, but does not work >+// Note: this delimiter works with single-byte string data, but does not work > // with variable-width encodings, such as UTF-8. > // > // Example: >@@ -175,8 +175,8 @@ class ByChar { > // std::vector<std::string> v = absl::StrSplit("a,b=c", ByAnyChar(",=")); > // // v[0] == "a", v[1] == "b", v[2] == "c" > // >-// If `ByAnyChar` is given the empty std::string, it behaves exactly like >-// `ByString` and matches each individual character in the input std::string. >+// If `ByAnyChar` is given the empty string, it behaves exactly like >+// `ByString` and matches each individual character in the input string. > // > class ByAnyChar { > public: >@@ -192,7 +192,7 @@ class ByAnyChar { > // A delimiter for splitting into equal-length strings. The length argument to > // the constructor must be greater than 0. > // >-// Note: this delimiter works with single-byte std::string data, but does not work >+// Note: this delimiter works with single-byte string data, but does not work > // with variable-width encodings, such as UTF-8. > // > // Example: >@@ -202,7 +202,7 @@ class ByAnyChar { > > // // v[0] == "123", v[1] == "456", v[2] == "789" > // >-// Note that the std::string does not have to be a multiple of the fixed split >+// Note that the string does not have to be a multiple of the fixed split > // length. In such a case, the last substring will be shorter. > // > // using absl::ByLength; >@@ -223,9 +223,9 @@ namespace strings_internal { > // A traits-like metafunction for selecting the default Delimiter object type > // for a particular Delimiter type. The base case simply exposes type Delimiter > // itself as the delimiter's Type. However, there are specializations for >-// std::string-like objects that map them to the ByString delimiter object. >+// string-like objects that map them to the ByString delimiter object. 
> // This allows functions like absl::StrSplit() and absl::MaxSplits() to accept >-// std::string-like objects (e.g., ',') as delimiter arguments but they will be >+// string-like objects (e.g., ',') as delimiter arguments but they will be > // treated as if a ByString delimiter was given. > template <typename Delimiter> > struct SelectDelimiter { >@@ -261,7 +261,8 @@ class MaxSplitsImpl { > : delimiter_(delimiter), limit_(limit), count_(0) {} > absl::string_view Find(absl::string_view text, size_t pos) { > if (count_++ == limit_) { >- return absl::string_view(text.end(), 0); // No more matches. >+ return absl::string_view(text.data() + text.size(), >+ 0); // No more matches. > } > return delimiter_.Find(text, pos); > } >@@ -331,7 +332,7 @@ struct AllowEmpty { > // SkipEmpty() > // > // Returns `false` if the given `absl::string_view` is empty, indicating that >-// `StrSplit()` should omit the empty std::string. >+// `StrSplit()` should omit the empty string. > // > // Example: > // >@@ -339,7 +340,7 @@ struct AllowEmpty { > // > // // v[0] == "a", v[1] == "b" > // >-// Note: `SkipEmpty()` does not consider a std::string containing only whitespace >+// Note: `SkipEmpty()` does not consider a string containing only whitespace > // to be empty. To skip such whitespace as well, use the `SkipWhitespace()` > // predicate. > struct SkipEmpty { >@@ -349,7 +350,7 @@ struct SkipEmpty { > // SkipWhitespace() > // > // Returns `false` if the given `absl::string_view` is empty *or* contains only >-// whitespace, indicating that `StrSplit()` should omit the std::string. >+// whitespace, indicating that `StrSplit()` should omit the string. > // > // Example: > // >@@ -373,7 +374,7 @@ struct SkipWhitespace { > > // StrSplit() > // >-// Splits a given std::string based on the provided `Delimiter` object, returning the >+// Splits a given string based on the provided `Delimiter` object, returning the > // elements within the type specified by the caller. 
Optionally, you may pass a > // `Predicate` to `StrSplit()` indicating whether to include or exclude the > // resulting element within the final result set. (See the overviews for >@@ -412,7 +413,7 @@ struct SkipWhitespace { > // > // The `StrSplit()` function adapts the returned collection to the collection > // specified by the caller (e.g. `std::vector` above). The returned collections >-// may contain `string`, `absl::string_view` (in which case the original std::string >+// may contain `string`, `absl::string_view` (in which case the original string > // being split must ensure that it outlives the collection), or any object that > // can be explicitly created from an `absl::string_view`. This behavior works > // for: >@@ -424,7 +425,7 @@ struct SkipWhitespace { > // Example: > // > // // The results are returned as `absl::string_view` objects. Note that we >-// // have to ensure that the input std::string outlives any results. >+// // have to ensure that the input string outlives any results. > // std::vector<absl::string_view> v = absl::StrSplit("a,b,c", ','); > // > // // Stores results in a std::set<std::string>, which also performs >@@ -444,7 +445,7 @@ struct SkipWhitespace { > // // is provided as a series of key/value pairs. For example, the 0th element > // // resulting from the split will be stored as a key to the 1st element. If > // // an odd number of elements are resolved, the last element is paired with >-// // a default-constructed value (e.g., empty std::string). >+// // a default-constructed value (e.g., empty string). > // std::map<std::string, std::string> m = absl::StrSplit("a,b,c", ','); > // // m["a"] == "b", m["c"] == "" // last component value equals "" > // >@@ -452,14 +453,14 @@ struct SkipWhitespace { > // elements and is not a collection type. When splitting to a `std::pair` the > // first two split strings become the `std::pair` `.first` and `.second` > // members, respectively. The remaining split substrings are discarded. 
If there >-// are less than two split substrings, the empty std::string is used for the >+// are less than two split substrings, the empty string is used for the > // corresponding > // `std::pair` member. > // > // Example: > // > // // Stores first two split strings as the members in a std::pair. >-// std::pair<std::string, std::string> p = absl::StrSplit("a,b,c", ','); >+// std::pair<string, string> p = absl::StrSplit("a,b,c", ','); > // // p.first == "a", p.second == "b" // "c" is omitted. > // > // The `StrSplit()` function can be used multiple times to perform more >@@ -467,9 +468,9 @@ struct SkipWhitespace { > // > // Example: > // >-// // The input std::string "a=b=c,d=e,f=,g" becomes >+// // The input string "a=b=c,d=e,f=,g" becomes > // // { "a" => "b=c", "d" => "e", "f" => "", "g" => "" } >-// std::map<std::string, std::string> m; >+// std::map<string, string> m; > // for (absl::string_view sp : absl::StrSplit("a=b=c,d=e,f=,g", ',')) { > // m.insert(absl::StrSplit(sp, absl::MaxSplits('=', 1))); > // } >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_split_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_split_test.cc >index 29a68047c99aef31bc7d3673a32b7228542c79a3..413ad318bf9b29337b7e88788a550a9db3e50eae 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_split_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/str_split_test.cc >@@ -276,7 +276,7 @@ TEST(SplitIterator, Basics) { > EXPECT_EQ(it, end); > } > >-// Simple Predicate to skip a particular std::string. >+// Simple Predicate to skip a particular string. 
> class Skip { > public: > explicit Skip(const std::string& s) : s_(s) {} >@@ -763,12 +763,12 @@ template <typename Delimiter> > static bool IsFoundAtStartingPos(absl::string_view text, Delimiter d, > size_t starting_pos, int expected_pos) { > absl::string_view found = d.Find(text, starting_pos); >- return found.data() != text.end() && >+ return found.data() != text.data() + text.size() && > expected_pos == found.data() - text.data(); > } > > // Helper function for testing Delimiter objects. Returns true if the given >-// Delimiter is found in the given std::string at the given position. This function >+// Delimiter is found in the given string at the given position. This function > // tests two cases: > // 1. The actual text given, staring at position 0 > // 2. The text given with leading padding that should be ignored >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/string_view.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/string_view.h >index 1b8f8a374745cb6466b1cd622197fdc2f6dcd9d5..ec9f62be15733e3cde8256df1cfc520c22e11f0d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/string_view.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/string_view.h >@@ -19,7 +19,7 @@ > // > // This file contains the definition of the `absl::string_view` class. A > // `string_view` points to a contiguous span of characters, often part or all of >-// another `std::string`, double-quoted std::string literal, character array, or even >+// another `std::string`, double-quoted string literal, character array, or even > // another `string_view`. 
> // > // This `absl::string_view` abstraction is designed to be a drop-in >@@ -56,15 +56,15 @@ namespace absl { > > // absl::string_view > // >-// A `string_view` provides a lightweight view into the std::string data provided by >-// a `std::string`, double-quoted std::string literal, character array, or even >-// another `string_view`. A `string_view` does *not* own the std::string to which it >+// A `string_view` provides a lightweight view into the string data provided by >+// a `std::string`, double-quoted string literal, character array, or even >+// another `string_view`. A `string_view` does *not* own the string to which it > // points, and that data cannot be modified through the view. > // > // You can use `string_view` as a function or method parameter anywhere a >-// parameter can receive a double-quoted std::string literal, `const char*`, >+// parameter can receive a double-quoted string literal, `const char*`, > // `std::string`, or another `absl::string_view` argument with no need to copy >-// the std::string data. Systematic use of `string_view` within function arguments >+// the string data. Systematic use of `string_view` within function arguments > // reduces data copies and `strlen()` calls. > // > // Because of its small size, prefer passing `string_view` by value: >@@ -97,8 +97,8 @@ namespace absl { > // `string_view` this way, it is your responsibility to ensure that the object > // pointed to by the `string_view` outlives the `string_view`. > // >-// A `string_view` may represent a whole std::string or just part of a std::string. For >-// example, when splitting a std::string, `std::vector<absl::string_view>` is a >+// A `string_view` may represent a whole string or just part of a string. For >+// example, when splitting a string, `std::vector<absl::string_view>` is a > // natural data type for the output. 
> // > // >@@ -141,7 +141,7 @@ namespace absl { > // All empty `string_view` objects whether null or not, are equal: > // > // absl::string_view() == absl::string_view("", 0) >-// absl::string_view(nullptr, 0) == absl:: string_view("abcdef"+6, 0) >+// absl::string_view(nullptr, 0) == absl::string_view("abcdef"+6, 0) > class string_view { > public: > using traits_type = std::char_traits<char>; >@@ -173,8 +173,19 @@ class string_view { > // Implicit constructor of a `string_view` from nul-terminated `str`. When > // accepting possibly null strings, use `absl::NullSafeStringView(str)` > // instead (see below). >+#if ABSL_HAVE_BUILTIN(__builtin_strlen) || \ >+ (defined(__GNUC__) && !defined(__clang__)) >+ // GCC has __builtin_strlen according to >+ // https://gcc.gnu.org/onlinedocs/gcc-4.7.0/gcc/Other-Builtins.html, but >+ // ABSL_HAVE_BUILTIN doesn't detect that, so we use the extra checks above. >+ // __builtin_strlen is constexpr. >+ constexpr string_view(const char* str) // NOLINT(runtime/explicit) >+ : ptr_(str), >+ length_(CheckLengthInternal(str ? __builtin_strlen(str) : 0)) {} >+#else > constexpr string_view(const char* str) // NOLINT(runtime/explicit) >- : ptr_(str), length_(CheckLengthInternal(StrLenInternal(str))) {} >+ : ptr_(str), length_(CheckLengthInternal(str ? strlen(str) : 0)) {} >+#endif > > // Implicit constructor of a `string_view` from a `const char*` and length. > constexpr string_view(const char* data, size_type len) >@@ -340,7 +351,7 @@ class string_view { > // > // Returns a "substring" of the `string_view` (at offset `pos` and length > // `n`) as another string_view. This function throws `std::out_of_bounds` if >- // `pos > size'. >+ // `pos > size`. 
> string_view substr(size_type pos, size_type n = npos) const { > if (ABSL_PREDICT_FALSE(pos > length_)) > base_internal::ThrowStdOutOfRange("absl::string_view::substr"); >@@ -351,7 +362,7 @@ class string_view { > // string_view::compare() > // > // Performs a lexicographical comparison between the `string_view` and >- // another `absl::string_view), returning -1 if `this` is less than, 0 if >+ // another `absl::string_view`, returning -1 if `this` is less than, 0 if > // `this` is equal to, and 1 if `this` is greater than the passed std::string > // view. Note that in the case of data equality, a further comparison is made > // on the respective sizes of the two `string_view`s to determine which is >@@ -481,22 +492,6 @@ class string_view { > static constexpr size_type kMaxSize = > std::numeric_limits<difference_type>::max(); > >- // check whether __builtin_strlen is provided by the compiler. >- // GCC doesn't have __has_builtin() >- // (https://gcc.gnu.org/bugzilla/show_bug.cgi?id=66970), >- // but has __builtin_strlen according to >- // https://gcc.gnu.org/onlinedocs/gcc-4.7.0/gcc/Other-Builtins.html. >-#if ABSL_HAVE_BUILTIN(__builtin_strlen) || \ >- (defined(__GNUC__) && !defined(__clang__)) >- static constexpr size_type StrLenInternal(const char* str) { >- return str ? __builtin_strlen(str) : 0; >- } >-#else >- static constexpr size_type StrLenInternal(const char* str) { >- return str ? 
strlen(str) : 0; >- } >-#endif >- > static constexpr size_type CheckLengthInternal(size_type len) { > return len; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/string_view_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/string_view_test.cc >index b19d07c7fd61d821223bfc0bf97b2a2a6234d38c..a94e82254ee4521dfc11b05aa89d0e741ca5b7ca 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/string_view_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/string_view_test.cc >@@ -35,7 +35,8 @@ > #define ABSL_EXPECT_DEATH_IF_SUPPORTED(statement, regex) \ > EXPECT_DEATH_IF_SUPPORTED(statement, ".*") > #else >-#define ABSL_EXPECT_DEATH_IF_SUPPORTED EXPECT_DEATH_IF_SUPPORTED >+#define ABSL_EXPECT_DEATH_IF_SUPPORTED(statement, regex) \ >+ EXPECT_DEATH_IF_SUPPORTED(statement, regex) > #endif > > namespace { >@@ -283,7 +284,7 @@ TEST(StringViewTest, ComparisonOperatorsByCharacterPosition) { > } > #undef COMPARE > >-// Sadly, our users often confuse std::string::npos with absl::string_view::npos; >+// Sadly, our users often confuse string::npos with absl::string_view::npos; > // So much so that we test here that they are the same. They need to > // both be unsigned, and both be the maximum-valued integer of their type. > >@@ -811,15 +812,18 @@ TEST(StringViewTest, FrontBackSingleChar) { > } > > // `std::string_view::string_view(const char*)` calls >-// `std::char_traits<char>::length(const char*)` to get the std::string length. In >+// `std::char_traits<char>::length(const char*)` to get the string length. In > // libc++, it doesn't allow `nullptr` in the constexpr context, with the error > // "read of dereferenced null pointer is not allowed in a constant expression". > // At run time, the behavior of `std::char_traits::length()` on `nullptr` is >-// undefined by the standard and usually results in crash with libc++. 
This >-// conforms to the standard, but `absl::string_view` implements a different >+// undefined by the standard and usually results in crash with libc++. >+// In MSVC, creating a constexpr string_view from nullptr also triggers an >+// "unevaluable pointer value" error. This compiler implementation conforms >+// to the standard, but `absl::string_view` implements a different > // behavior for historical reasons. We work around tests that construct > // `string_view` from `nullptr` when using libc++. >-#if !defined(ABSL_HAVE_STD_STRING_VIEW) || !defined(_LIBCPP_VERSION) >+#if !defined(ABSL_HAVE_STD_STRING_VIEW) || \ >+ (!defined(_LIBCPP_VERSION) && !defined(_MSC_VER)) > #define ABSL_HAVE_STRING_VIEW_FROM_NULLPTR 1 > #endif // !defined(ABSL_HAVE_STD_STRING_VIEW) || !defined(_LIBCPP_VERSION) > >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/strip.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/strip.h >index 2f8d21f7deb9fca02938ca7a9240a0e17756f211..8d0d7c6bfc1230f0c39a151218e1cc927b8836bf 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/strip.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/strip.h >@@ -17,7 +17,7 @@ > // File: strip.h > // ----------------------------------------------------------------------------- > // >-// This file contains various functions for stripping substrings from a std::string. >+// This file contains various functions for stripping substrings from a string. > #ifndef ABSL_STRINGS_STRIP_H_ > #define ABSL_STRINGS_STRIP_H_ > >@@ -33,7 +33,7 @@ namespace absl { > > // ConsumePrefix() > // >-// Strips the `expected` prefix from the start of the given std::string, returning >+// Strips the `expected` prefix from the start of the given string, returning > // `true` if the strip operation succeeded or false otherwise. 
> // > // Example: >@@ -48,7 +48,7 @@ inline bool ConsumePrefix(absl::string_view* str, absl::string_view expected) { > } > // ConsumeSuffix() > // >-// Strips the `expected` suffix from the end of the given std::string, returning >+// Strips the `expected` suffix from the end of the given string, returning > // `true` if the strip operation succeeded or false otherwise. > // > // Example: >@@ -64,9 +64,9 @@ inline bool ConsumeSuffix(absl::string_view* str, absl::string_view expected) { > > // StripPrefix() > // >-// Returns a view into the input std::string 'str' with the given 'prefix' removed, >-// but leaving the original std::string intact. If the prefix does not match at the >-// start of the std::string, returns the original std::string instead. >+// Returns a view into the input string 'str' with the given 'prefix' removed, >+// but leaving the original string intact. If the prefix does not match at the >+// start of the string, returns the original string instead. > ABSL_MUST_USE_RESULT inline absl::string_view StripPrefix( > absl::string_view str, absl::string_view prefix) { > if (absl::StartsWith(str, prefix)) str.remove_prefix(prefix.size()); >@@ -75,9 +75,9 @@ ABSL_MUST_USE_RESULT inline absl::string_view StripPrefix( > > // StripSuffix() > // >-// Returns a view into the input std::string 'str' with the given 'suffix' removed, >-// but leaving the original std::string intact. If the suffix does not match at the >-// end of the std::string, returns the original std::string instead. >+// Returns a view into the input string 'str' with the given 'suffix' removed, >+// but leaving the original string intact. If the suffix does not match at the >+// end of the string, returns the original string instead. 
> ABSL_MUST_USE_RESULT inline absl::string_view StripSuffix( > absl::string_view str, absl::string_view suffix) { > if (absl::EndsWith(str, suffix)) str.remove_suffix(suffix.size()); >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/strip_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/strip_test.cc >index 205c160c1929263796a59323ac59509e8ab6589e..40c4c6071211208104c7ddb2fdcf376519924347 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/strip_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/strip_test.cc >@@ -12,8 +12,8 @@ > // See the License for the specific language governing permissions and > // limitations under the License. > >-// This file contains functions that remove a defined part from the std::string, >-// i.e., strip the std::string. >+// This file contains functions that remove a defined part from the string, >+// i.e., strip the string. > > #include "absl/strings/strip.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/substitute.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/substitute.h >index c4b25ba70952c19d64ebc849b7e3c09cafbc52cf..4de7b4e75ff23b41341fd635cf221da669bd51d4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/substitute.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/strings/substitute.h >@@ -17,46 +17,46 @@ > // File: substitute.h > // ----------------------------------------------------------------------------- > // >-// This package contains functions for efficiently performing std::string >-// substitutions using a format std::string with positional notation: >+// This package contains functions for efficiently performing string >+// substitutions using a format string with positional notation: > // `Substitute()` and `SubstituteAndAppend()`. 
> // > // Unlike printf-style format specifiers, `Substitute()` functions do not need > // to specify the type of the substitution arguments. Supported arguments >-// following the format std::string, such as strings, string_views, ints, >+// following the format string, such as strings, string_views, ints, > // floats, and bools, are automatically converted to strings during the > // substitution process. (See below for a full list of supported types.) > // > // `Substitute()` does not allow you to specify *how* to format a value, beyond >-// the default conversion to std::string. For example, you cannot format an integer >+// the default conversion to string. For example, you cannot format an integer > // in hex. > // >-// The format std::string uses positional identifiers indicated by a dollar sign ($) >+// The format string uses positional identifiers indicated by a dollar sign ($) > // and single digit positional ids to indicate which substitution arguments to >-// use at that location within the format std::string. >+// use at that location within the format string. > // > // Example 1: >-// std::string s = Substitute("$1 purchased $0 $2. Thanks $1!", >+// string s = Substitute("$1 purchased $0 $2. Thanks $1!", > // 5, "Bob", "Apples"); > // EXPECT_EQ("Bob purchased 5 Apples. Thanks Bob!", s); > // > // Example 2: >-// std::string s = "Hi. "; >+// string s = "Hi. "; > // SubstituteAndAppend(&s, "My name is $0 and I am $1 years old.", "Bob", 5); > // EXPECT_EQ("Hi. 
My name is Bob and I am 5 years old.", s); > // > // > // Supported types: >-// * absl::string_view, std::string, const char* (null is equivalent to "") >+// * absl::string_view, string, const char* (null is equivalent to "") > // * int32_t, int64_t, uint32_t, uint64 > // * float, double > // * bool (Printed as "true" or "false") >-// * pointer types other than char* (Printed as "0x<lower case hex std::string>", >+// * pointer types other than char* (Printed as "0x<lower case hex string>", > // except that null is printed as "NULL") > // >-// If an invalid format std::string is provided, Substitute returns an empty std::string >-// and SubstituteAndAppend does not change the provided output std::string. >-// A format std::string is invalid if it: >+// If an invalid format string is provided, Substitute returns an empty string >+// and SubstituteAndAppend does not change the provided output string. >+// A format string is invalid if it: > // * ends in an unescaped $ character, > // e.g. "Hello $", or > // * calls for a position argument which is not provided, >@@ -88,7 +88,7 @@ namespace substitute_internal { > // > // This class provides an argument type for `absl::Substitute()` and > // `absl::SubstituteAndAppend()`. `Arg` handles implicit conversion of various >-// types to a std::string. (`Arg` is very similar to the `AlphaNum` class in >+// types to a string. (`Arg` is very similar to the `AlphaNum` class in > // `StrCat()`.) > // > // This class has implicit constructors. >@@ -197,8 +197,8 @@ constexpr int PlaceholderBitmask(const char* format) { > > // SubstituteAndAppend() > // >-// Substitutes variables into a given format std::string and appends to a given >-// output std::string. See file comments above for usage. >+// Substitutes variables into a given format string and appends to a given >+// output string. See file comments above for usage. 
> // > // The declarations of `SubstituteAndAppend()` below consist of overloads > // for passing 0 to 10 arguments, respectively. >@@ -444,7 +444,7 @@ void SubstituteAndAppend( > > // Substitute() > // >-// Substitutes variables into a given format std::string. See file comments above >+// Substitutes variables into a given format string. See file comments above > // for usage. > // > // The declarations of `Substitute()` below consist of overloads for passing 0 >@@ -456,7 +456,7 @@ void SubstituteAndAppend( > // Example: > // template <typename... Args> > // void VarMsg(absl::string_view format, const Args&... args) { >-// std::string s = absl::Substitute(format, args...); >+// string s = absl::Substitute(format, args...); > > ABSL_MUST_USE_RESULT inline std::string Substitute(absl::string_view format) { > std::string result; >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/BUILD.bazel b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/BUILD.bazel >index 8d302e01223b45bd92a21670a11f5592f9b15c64..f52e9d41644a8d931bf39fae7ec84327beaf41e4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/BUILD.bazel >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/BUILD.bazel >@@ -88,6 +88,9 @@ cc_test( > size = "small", > srcs = ["barrier_test.cc"], > copts = ABSL_TEST_COPTS, >+ tags = [ >+ "no_test_wasm", >+ ], > deps = [ > ":synchronization", > "//absl/time", >@@ -100,6 +103,9 @@ cc_test( > size = "small", > srcs = ["blocking_counter_test.cc"], > copts = ABSL_TEST_COPTS, >+ tags = [ >+ "no_test_wasm", >+ ], > deps = [ > ":synchronization", > "//absl/time", >@@ -138,6 +144,9 @@ cc_library( > name = "thread_pool", > testonly = 1, > hdrs = ["internal/thread_pool.h"], >+ visibility = [ >+ "//absl:__subpackages__", >+ ], > deps = [ > ":synchronization", > "//absl/base:core_headers", >@@ -149,6 +158,7 @@ cc_test( > size = "large", > 
srcs = ["mutex_test.cc"], > copts = ABSL_TEST_COPTS, >+ shard_count = 25, > deps = [ > ":synchronization", > ":thread_pool", >@@ -205,6 +215,7 @@ cc_test( > name = "per_thread_sem_test", > size = "medium", > copts = ABSL_TEST_COPTS, >+ tags = ["no_test_wasm"], > deps = [ > ":per_thread_sem_test_common", > ":synchronization", >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/BUILD.gn >index 3664aa13274f4040f9369b95f40deef2b590db8f..e6f763a060ba9ce0aaee9c141810fff6b4e78c11 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/BUILD.gn >@@ -50,8 +50,8 @@ source_set("synchronization") { > "internal/create_thread_identity.cc", > "internal/per_thread_sem.cc", > "internal/waiter.cc", >+ "mutex.cc", > "notification.cc", >- "mutex.cc" > ] > public = [ > "barrier.h", >@@ -93,6 +93,8 @@ source_set("thread_pool") { > ":synchronization", > "../base:core_headers", > ] >+ visibility = [] >+ visibility += [ "../*" ] > } > > source_set("per_thread_sem_test_common") { >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/internal/graphcycles.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/internal/graphcycles.cc >index ab1f3f84e74c1d7edef6379e02684b00b53a7e34..d3878de2b68f21929f00b8634bcb9794a0e4f086 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/internal/graphcycles.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/internal/graphcycles.cc >@@ -204,8 +204,7 @@ class NodeSet { > } > > private: >- static const int32_t kEmpty; >- static const int32_t kDel; >+ enum : int32_t { kEmpty = -1, kDel = -2 }; > Vec<int32_t> table_; > uint32_t occupied_; // Count of non-empty 
slots (includes deleted slots) > >@@ -255,9 +254,6 @@ class NodeSet { > NodeSet& operator=(const NodeSet&) = delete; > }; > >-const int32_t NodeSet::kEmpty = -1; >-const int32_t NodeSet::kDel = -2; >- > // We encode a node index and a node version in GraphId. The version > // number is incremented when the GraphId is freed which automatically > // invalidates all copies of the GraphId. >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/internal/per_thread_sem_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/internal/per_thread_sem_test.cc >index 2b52ea76ab0e942601eb50f160fe845b9b0685cf..c29d8403df43fbc3d8d2c5c7ccdaf3aeed2f2ad4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/internal/per_thread_sem_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/internal/per_thread_sem_test.cc >@@ -153,12 +153,15 @@ TEST_F(PerThreadSemTest, WithTimeout) { > > TEST_F(PerThreadSemTest, Timeouts) { > absl::Time timeout = absl::Now() + absl::Milliseconds(50); >+ // Allow for a slight early return, to account for quality of implementation >+ // issues on various platforms. 
>+ const absl::Duration slop = absl::Microseconds(200); > EXPECT_FALSE(Wait(timeout)); >- EXPECT_LE(timeout, absl::Now()); >+ EXPECT_LE(timeout, absl::Now() + slop); > > absl::Time negative_timeout = absl::UnixEpoch() - absl::Milliseconds(100); > EXPECT_FALSE(Wait(negative_timeout)); >- EXPECT_LE(negative_timeout, absl::Now()); // trivially true :) >+ EXPECT_LE(negative_timeout, absl::Now() + slop); // trivially true :) > > Post(GetOrCreateCurrentThreadIdentity()); > // The wait here has an expired timeout, but we have a wake to consume, >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/mutex.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/mutex.cc >index 80f34f035fc6b679eeb0b14d89cfaeaa1f80f051..9d59a79d92d9f7928f3dfe2a730434e154acec47 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/mutex.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/mutex.cc >@@ -298,7 +298,7 @@ static struct SynchEvent { // this is a trivial hash table for the events > // set "bits" in the word there (waiting until lockbit is clear before doing > // so), and return a refcounted reference that will remain valid until > // UnrefSynchEvent() is called. If a new SynchEvent is allocated, >-// the std::string name is copied into it. >+// the string name is copied into it. > // When used with a mutex, the caller should also ensure that kMuEvent > // is set in the mutex word, and similarly for condition variables and kCVEvent. > static SynchEvent *EnsureSynchEvent(std::atomic<intptr_t> *addr, >@@ -1827,8 +1827,8 @@ bool Mutex::LockSlowWithDeadline(MuHow how, const Condition *cond, > cond == nullptr || EvalConditionAnnotated(cond, this, true, how); > } > >-// RAW_CHECK_FMT() takes a condition, a printf-style format std::string, and >-// the printf-style argument list. The format std::string must be a literal. 
>+// RAW_CHECK_FMT() takes a condition, a printf-style format string, and >+// the printf-style argument list. The format string must be a literal. > // Arguments after the first are not evaluated unless the condition is true. > #define RAW_CHECK_FMT(cond, ...) \ > do { \ >@@ -1975,7 +1975,7 @@ void Mutex::LockSlowLoop(SynchWaitParams *waitp, int flags) { > // Unlock this mutex, which is held by the current thread. > // If waitp is non-zero, it must be the wait parameters for the current thread > // which holds the lock but is not runnable because its condition is false >-// or it n the process of blocking on a condition variable; it must requeue >+// or it is in the process of blocking on a condition variable; it must requeue > // itself on the mutex/condvar to wait for its condition to become true. > void Mutex::UnlockSlow(SynchWaitParams *waitp) { > intptr_t v = mu_.load(std::memory_order_relaxed); >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/mutex.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/mutex.h >index 83c214867ac53a9122dd4e9753fc8bf72e128be5..a378190538e2d236eaf399315007d00872e20869 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/mutex.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/mutex.h >@@ -962,7 +962,7 @@ void RegisterMutexTracer(void (*fn)(const char *msg, const void *obj, > // > // The function pointer registered here will be called here on various CondVar > // events. The callback is given an opaque handle to the CondVar object and >-// a std::string identifying the event. This is thread-safe, but only a single >+// a string identifying the event. This is thread-safe, but only a single > // tracer can be registered. 
> // > // Events that can be sent are "Wait", "Unwait", "Signal wakeup", and >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/notification_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/notification_test.cc >index 9b3b6a5a9e84ec1a3655a2c8f43eb8fbc5832ef6..d8708d551a1575746ea2a85426972fd981911016 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/notification_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/synchronization/notification_test.cc >@@ -71,10 +71,13 @@ static void BasicTests(bool notify_before_waiting, Notification* notification) { > notification->WaitForNotificationWithTimeout(absl::Milliseconds(0))); > EXPECT_FALSE(notification->WaitForNotificationWithDeadline(absl::Now())); > >+ const absl::Duration delay = absl::Milliseconds(50); >+ // Allow for a slight early return, to account for quality of implementation >+ // issues on various platforms. 
>+ const absl::Duration slop = absl::Microseconds(200); > absl::Time start = absl::Now(); >- EXPECT_FALSE( >- notification->WaitForNotificationWithTimeout(absl::Milliseconds(50))); >- EXPECT_LE(start + absl::Milliseconds(50), absl::Now()); >+ EXPECT_FALSE(notification->WaitForNotificationWithTimeout(delay)); >+ EXPECT_LE(start + delay, absl::Now() + slop); > > ThreadSafeCounter ready_counter; > ThreadSafeCounter done_counter; >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/BUILD.gn >index 9927af8343cb16099e022cd3780f486e7afa411d..5758e9dc1e3c3dc45f28a525ffe6013a83314749 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/BUILD.gn >@@ -62,8 +62,9 @@ source_set("test_util") { > ":time", > "../base", > "../time/internal/cctz:time_zone", >- "//testing/gtest", > "//testing/gmock", >+ "//testing/gtest", >+ "//third_party/googletest/:gmock", > ] > visibility = [] > visibility += [ "../time:*" ] >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/duration.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/duration.cc >index f402137b0a650d2b7ed605262fd954062aab29a0..2950c7cdc6320b4d3f8434b1e2b01ad56d72db40 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/duration.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/duration.cc >@@ -666,7 +666,7 @@ std::chrono::hours ToChronoHours(Duration d) { > } > > // >-// To/From std::string formatting. >+// To/From string formatting. 
> // > > namespace { >@@ -744,7 +744,7 @@ void AppendNumberUnit(std::string* out, double n, DisplayUnit unit) { > } // namespace > > // From Go's doc at http://golang.org/pkg/time/#Duration.String >-// [FormatDuration] returns a std::string representing the duration in the >+// [FormatDuration] returns a string representing the duration in the > // form "72h3m0.5s". Leading zero units are omitted. As a special > // case, durations less than one second format use a smaller unit > // (milli-, micro-, or nanoseconds) to ensure that the leading digit >@@ -787,8 +787,8 @@ std::string FormatDuration(Duration d) { > namespace { > > // A helper for ParseDuration() that parses a leading number from the given >-// std::string and stores the result in *int_part/*frac_part/*frac_scale. The >-// given std::string pointer is modified to point to the first unconsumed char. >+// string and stores the result in *int_part/*frac_part/*frac_scale. The >+// given string pointer is modified to point to the first unconsumed char. > bool ConsumeDurationNumber(const char** dpp, int64_t* int_part, > int64_t* frac_part, int64_t* frac_scale) { > *int_part = 0; >@@ -816,8 +816,8 @@ bool ConsumeDurationNumber(const char** dpp, int64_t* int_part, > } > > // A helper for ParseDuration() that parses a leading unit designator (e.g., >-// ns, us, ms, s, m, h) from the given std::string and stores the resulting unit >-// in "*unit". The given std::string pointer is modified to point to the first >+// ns, us, ms, s, m, h) from the given string and stores the resulting unit >+// in "*unit". The given string pointer is modified to point to the first > // unconsumed char. > bool ConsumeDurationUnit(const char** start, Duration* unit) { > const char *s = *start; >@@ -850,7 +850,7 @@ bool ConsumeDurationUnit(const char** start, Duration* unit) { > } // namespace > > // From Go's doc at http://golang.org/pkg/time/#ParseDuration >-// [ParseDuration] parses a duration std::string. 
A duration std::string is >+// [ParseDuration] parses a duration string. A duration string is > // a possibly signed sequence of decimal numbers, each with optional > // fraction and a unit suffix, such as "300ms", "-1.5h" or "2h45m". > // Valid time units are "ns", "us" "ms", "s", "m", "h". >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/format.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/format.cc >index e98e60a372bfc3a4472c8a17ac7a5383782b887a..ee597e407a9ddd226a8f880659f2d50427f9192b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/format.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/format.cc >@@ -88,7 +88,7 @@ bool ParseTime(const std::string& format, const std::string& input, absl::Time* > return absl::ParseTime(format, input, absl::UTCTimeZone(), time, err); > } > >-// If the input std::string does not contain an explicit UTC offset, interpret >+// If the input string does not contain an explicit UTC offset, interpret > // the fields with respect to the given TimeZone. > bool ParseTime(const std::string& format, const std::string& input, absl::TimeZone tz, > absl::Time* time, std::string* err) { >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/include/cctz/civil_time.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/include/cctz/civil_time.h >index 898222b4c7af29830ea20507d9c294f2d0ef969c..0842fa4a468952fbb71fe9e98b883e2fa47d96e1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/include/cctz/civil_time.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/include/cctz/civil_time.h >@@ -59,7 +59,7 @@ namespace cctz { > // inferior fields to their minimum valid value (as described above). 
The > // following are examples of how each of the six types would align the fields > // representing November 22, 2015 at 12:34:56 in the afternoon. (Note: the >-// std::string format used here is not important; it's just a shorthand way of >+// string format used here is not important; it's just a shorthand way of > // showing the six YMDHMS fields.) > // > // civil_second 2015-11-22 12:34:56 >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/include/cctz/civil_time_detail.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/include/cctz/civil_time_detail.h >index 2362a4f4fbc958fa30dcfe7a14187688df462adf..d7f72717ece254088740e7375828596c963fb013 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/include/cctz/civil_time_detail.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/include/cctz/civil_time_detail.h >@@ -355,11 +355,11 @@ class civil_time { > : civil_time(ct.f_) {} > > // Factories for the maximum/minimum representable civil_time. >- static civil_time max() { >+ static CONSTEXPR_F civil_time max() { > const auto max_year = std::numeric_limits<std::int_least64_t>::max(); > return civil_time(max_year, 12, 31, 23, 59, 59); > } >- static civil_time min() { >+ static CONSTEXPR_F civil_time min() { > const auto min_year = std::numeric_limits<std::int_least64_t>::min(); > return civil_time(min_year, 1, 1, 0, 0, 0); > } >@@ -416,6 +416,12 @@ class civil_time { > return difference(T{}, lhs.f_, rhs.f_); > } > >+ template <typename H> >+ friend H AbslHashValue(H h, civil_time a) { >+ return H::combine(std::move(h), a.f_.y, a.f_.m, a.f_.d, >+ a.f_.hh, a.f_.mm, a.f_.ss); >+ } >+ > private: > // All instantiations of this template are allowed to call the following > // private constructor and access the private fields member. 
>@@ -508,12 +514,8 @@ CONSTEXPR_F weekday get_weekday(const civil_day& cd) noexcept { > CONSTEXPR_D int k_weekday_offsets[1 + 12] = { > -1, 0, 3, 2, 5, 0, 3, 5, 1, 4, 6, 2, 4, > }; >- year_t wd = cd.year() - (cd.month() < 3); >- if (wd >= 0) { >- wd += wd / 4 - wd / 100 + wd / 400; >- } else { >- wd += (wd - 3) / 4 - (wd - 99) / 100 + (wd - 399) / 400; >- } >+ year_t wd = 2400 + (cd.year() % 400) - (cd.month() < 3); >+ wd += wd / 4 - wd / 100 + wd / 400; > wd += k_weekday_offsets[cd.month()] + cd.day(); > return k_weekday_by_sun_off[(wd % 7 + 7) % 7]; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/include/cctz/time_zone.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/include/cctz/time_zone.h >index 0b9764ea72a9caa3e5f2f7fee0427357aed28808..f28dad17550659b07f0c20038bc47f25a02f62a1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/include/cctz/time_zone.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/include/cctz/time_zone.h >@@ -224,6 +224,11 @@ class time_zone { > return !(lhs == rhs); > } > >+ template <typename H> >+ friend H AbslHashValue(H h, time_zone tz) { >+ return H::combine(std::move(h), &tz.effective_impl()); >+ } >+ > class Impl; > > private: >@@ -279,7 +284,7 @@ bool parse(const std::string&, const std::string&, const time_zone&, > } // namespace detail > > // Formats the given time_point in the given cctz::time_zone according to >-// the provided format std::string. Uses strftime()-like formatting options, >+// the provided format string. Uses strftime()-like formatting options, > // with the following extensions: > // > // - %Ez - RFC3339-compatible numeric UTC offset (+hh:mm or -hh:mm) >@@ -298,7 +303,7 @@ bool parse(const std::string&, const std::string&, const time_zone&, > // more than four characters, just like %Y. 
> // > // Tip: Format strings should include the UTC offset (e.g., %z, %Ez, or %E*z) >-// so that the resulting std::string uniquely identifies an absolute time. >+// so that the resulting string uniquely identifies an absolute time. > // > // Example: > // cctz::time_zone lax; >@@ -314,7 +319,7 @@ inline std::string format(const std::string& fmt, const time_point<D>& tp, > return detail::format(fmt, p.first, n, tz); > } > >-// Parses an input std::string according to the provided format std::string and >+// Parses an input string according to the provided format string and > // returns the corresponding time_point. Uses strftime()-like formatting > // options, with the same extensions as cctz::format(), but with the > // exceptions that %E#S is interpreted as %E*S, and %E#f as %E*f. %Ez >@@ -328,7 +333,7 @@ inline std::string format(const std::string& fmt, const time_point<D>& tp, > // > // "1970-01-01 00:00:00.0 +0000" > // >-// For example, parsing a std::string of "15:45" (%H:%M) will return a time_point >+// For example, parsing a string of "15:45" (%H:%M) will return a time_point > // that represents "1970-01-01 15:45:00.0 +0000". 
> // > // Note that parse() returns time instants, so it makes most sense to parse >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/cctz_benchmark.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/cctz_benchmark.cc >index c97df78c09c83945e68f5511b7b92e4af31770c4..4498d7d0b6b7e0c4a6fdb92149c78796d3d08b6f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/cctz_benchmark.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/cctz_benchmark.cc >@@ -778,13 +778,13 @@ void BM_Zone_UTCTimeZone(benchmark::State& state) { > } > BENCHMARK(BM_Zone_UTCTimeZone); > >-// In each "ToDateTime" benchmark we switch between two instants >-// separated by at least one transition in order to defeat any >-// internal caching of previous results (e.g., see local_time_hint_). >+// In each "ToCivil" benchmark we switch between two instants separated >+// by at least one transition in order to defeat any internal caching of >+// previous results (e.g., see local_time_hint_). > // > // The "UTC" variants use UTC instead of the Google/local time zone. > >-void BM_Time_ToDateTime_CCTZ(benchmark::State& state) { >+void BM_Time_ToCivil_CCTZ(benchmark::State& state) { > const cctz::time_zone tz = TestTimeZone(); > std::chrono::system_clock::time_point tp = > std::chrono::system_clock::from_time_t(1384569027); >@@ -796,9 +796,9 @@ void BM_Time_ToDateTime_CCTZ(benchmark::State& state) { > benchmark::DoNotOptimize(cctz::convert(tp, tz)); > } > } >-BENCHMARK(BM_Time_ToDateTime_CCTZ); >+BENCHMARK(BM_Time_ToCivil_CCTZ); > >-void BM_Time_ToDateTime_Libc(benchmark::State& state) { >+void BM_Time_ToCivil_Libc(benchmark::State& state) { > // No timezone support, so just use localtime. 
> time_t t = 1384569027; > time_t t2 = 1418962578; >@@ -813,9 +813,9 @@ void BM_Time_ToDateTime_Libc(benchmark::State& state) { > #endif > } > } >-BENCHMARK(BM_Time_ToDateTime_Libc); >+BENCHMARK(BM_Time_ToCivil_Libc); > >-void BM_Time_ToDateTimeUTC_CCTZ(benchmark::State& state) { >+void BM_Time_ToCivilUTC_CCTZ(benchmark::State& state) { > const cctz::time_zone tz = cctz::utc_time_zone(); > std::chrono::system_clock::time_point tp = > std::chrono::system_clock::from_time_t(1384569027); >@@ -824,9 +824,9 @@ void BM_Time_ToDateTimeUTC_CCTZ(benchmark::State& state) { > benchmark::DoNotOptimize(cctz::convert(tp, tz)); > } > } >-BENCHMARK(BM_Time_ToDateTimeUTC_CCTZ); >+BENCHMARK(BM_Time_ToCivilUTC_CCTZ); > >-void BM_Time_ToDateTimeUTC_Libc(benchmark::State& state) { >+void BM_Time_ToCivilUTC_Libc(benchmark::State& state) { > time_t t = 1384569027; > struct tm tm; > while (state.KeepRunning()) { >@@ -838,16 +838,16 @@ void BM_Time_ToDateTimeUTC_Libc(benchmark::State& state) { > #endif > } > } >-BENCHMARK(BM_Time_ToDateTimeUTC_Libc); >+BENCHMARK(BM_Time_ToCivilUTC_Libc); > >-// In each "FromDateTime" benchmark we switch between two YMDhms >-// values separated by at least one transition in order to defeat any >-// internal caching of previous results (e.g., see time_local_hint_). >+// In each "FromCivil" benchmark we switch between two YMDhms values >+// separated by at least one transition in order to defeat any internal >+// caching of previous results (e.g., see time_local_hint_). > // > // The "UTC" variants use UTC instead of the Google/local time zone. > // The "Day0" variants require normalization of the day of month. 
> >-void BM_Time_FromDateTime_CCTZ(benchmark::State& state) { >+void BM_Time_FromCivil_CCTZ(benchmark::State& state) { > const cctz::time_zone tz = TestTimeZone(); > int i = 0; > while (state.KeepRunning()) { >@@ -860,9 +860,9 @@ void BM_Time_FromDateTime_CCTZ(benchmark::State& state) { > } > } > } >-BENCHMARK(BM_Time_FromDateTime_CCTZ); >+BENCHMARK(BM_Time_FromCivil_CCTZ); > >-void BM_Time_FromDateTime_Libc(benchmark::State& state) { >+void BM_Time_FromCivil_Libc(benchmark::State& state) { > // No timezone support, so just use localtime. > int i = 0; > while (state.KeepRunning()) { >@@ -886,20 +886,20 @@ void BM_Time_FromDateTime_Libc(benchmark::State& state) { > benchmark::DoNotOptimize(mktime(&tm)); > } > } >-BENCHMARK(BM_Time_FromDateTime_Libc); >+BENCHMARK(BM_Time_FromCivil_Libc); > >-void BM_Time_FromDateTimeUTC_CCTZ(benchmark::State& state) { >+void BM_Time_FromCivilUTC_CCTZ(benchmark::State& state) { > const cctz::time_zone tz = cctz::utc_time_zone(); > while (state.KeepRunning()) { > benchmark::DoNotOptimize( > cctz::convert(cctz::civil_second(2014, 12, 18, 20, 16, 18), tz)); > } > } >-BENCHMARK(BM_Time_FromDateTimeUTC_CCTZ); >+BENCHMARK(BM_Time_FromCivilUTC_CCTZ); > >-// There is no BM_Time_FromDateTimeUTC_Libc. >+// There is no BM_Time_FromCivilUTC_Libc. > >-void BM_Time_FromDateTimeDay0_CCTZ(benchmark::State& state) { >+void BM_Time_FromCivilDay0_CCTZ(benchmark::State& state) { > const cctz::time_zone tz = TestTimeZone(); > int i = 0; > while (state.KeepRunning()) { >@@ -912,9 +912,9 @@ void BM_Time_FromDateTimeDay0_CCTZ(benchmark::State& state) { > } > } > } >-BENCHMARK(BM_Time_FromDateTimeDay0_CCTZ); >+BENCHMARK(BM_Time_FromCivilDay0_CCTZ); > >-void BM_Time_FromDateTimeDay0_Libc(benchmark::State& state) { >+void BM_Time_FromCivilDay0_Libc(benchmark::State& state) { > // No timezone support, so just use localtime. 
> int i = 0; > while (state.KeepRunning()) { >@@ -938,7 +938,7 @@ void BM_Time_FromDateTimeDay0_Libc(benchmark::State& state) { > benchmark::DoNotOptimize(mktime(&tm)); > } > } >-BENCHMARK(BM_Time_FromDateTimeDay0_Libc); >+BENCHMARK(BM_Time_FromCivilDay0_Libc); > > const char* const kFormats[] = { > RFC1123_full, // 0 >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/civil_time_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/civil_time_test.cc >index f6648c8f1f21f3207aba1b5e4bb066061b064b65..faffde470af4f45c3b8a4baa0c0d2bcfa318804a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/civil_time_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/civil_time_test.cc >@@ -620,7 +620,7 @@ TEST(CivilTime, Relational) { > TEST_RELATIONAL(civil_second(2014, 1, 1, 1, 1, 0), > civil_second(2014, 1, 1, 1, 1, 1)); > >- // Tests the relational operators of two different CivilTime types. >+ // Tests the relational operators of two different civil-time types. 
> TEST_RELATIONAL(civil_day(2014, 1, 1), civil_minute(2014, 1, 1, 1, 1)); > TEST_RELATIONAL(civil_day(2014, 1, 1), civil_month(2014, 2)); > >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/time_zone_fixed.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/time_zone_fixed.cc >index 598b08fde42236cf137e0e82d09d9f2c1f1b22f0..db9a475a9b4904cb2c4da6b5cf280ca8ca437270 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/time_zone_fixed.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/time_zone_fixed.cc >@@ -15,8 +15,8 @@ > #include "time_zone_fixed.h" > > #include <algorithm> >+#include <cassert> > #include <chrono> >-#include <cstdio> > #include <cstring> > #include <string> > >@@ -29,8 +29,15 @@ namespace { > // The prefix used for the internal names of fixed-offset zones. > const char kFixedOffsetPrefix[] = "Fixed/UTC"; > >+const char kDigits[] = "0123456789"; >+ >+char* Format02d(char* p, int v) { >+ *p++ = kDigits[(v / 10) % 10]; >+ *p++ = kDigits[v % 10]; >+ return p; >+} >+ > int Parse02d(const char* p) { >- static const char kDigits[] = "0123456789"; > if (const char* ap = std::strchr(kDigits, *p)) { > int v = static_cast<int>(ap - kDigits); > if (const char* bp = std::strchr(kDigits, *++p)) { >@@ -95,9 +102,17 @@ std::string FixedOffsetToName(const seconds& offset) { > } > int hours = minutes / 60; > minutes %= 60; >- char buf[sizeof(kFixedOffsetPrefix) + sizeof("-24:00:00")]; >- snprintf(buf, sizeof(buf), "%s%c%02d:%02d:%02d", >- kFixedOffsetPrefix, sign, hours, minutes, seconds); >+ char buf[sizeof(kFixedOffsetPrefix) - 1 + sizeof("-24:00:00")]; >+ std::strcpy(buf, kFixedOffsetPrefix); >+ char* ep = buf + sizeof(kFixedOffsetPrefix) - 1; >+ *ep++ = sign; >+ ep = Format02d(ep, hours); >+ *ep++ = ':'; >+ ep = Format02d(ep, minutes); >+ *ep++ = ':'; >+ ep = Format02d(ep, seconds); 
>+ *ep++ = '\0'; >+ assert(ep == buf + sizeof(buf)); > return buf; > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/time_zone_format.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/time_zone_format.cc >index 1b023848efa18e81562d57b3955ac120df855474..a02b1e341ac0ccefd60e0fa859014e8d0ad05b09 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/time_zone_format.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/time_zone_format.cc >@@ -533,7 +533,7 @@ const char* ParseSubSeconds(const char* dp, detail::femtoseconds* subseconds) { > return dp; > } > >-// Parses a std::string into a std::tm using strptime(3). >+// Parses a string into a std::tm using strptime(3). > const char* ParseTM(const char* dp, const char* fmt, std::tm* tm) { > if (dp != nullptr) { > dp = strptime(dp, fmt, tm); >@@ -743,7 +743,7 @@ bool parse(const std::string& format, const std::string& input, > data = ParseTM(data, spec.c_str(), &tm); > > // If we successfully parsed %p we need to remember whether the result >- // was AM or PM so that we can adjust tm_hour before ConvertDateTime(). >+ // was AM or PM so that we can adjust tm_hour before time_zone::lookup(). > // So reparse the input with a known AM hour, and check if it is shifted > // to a PM hour. 
> if (spec == "%p" && data != nullptr) { >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/time_zone_format_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/time_zone_format_test.cc >index a90dda7603a8a10e37b3b58d9665968b583962be..6b9928ed32f431027c4b92a49c32e9a08a18d891 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/time_zone_format_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/time_zone_format_test.cc >@@ -669,13 +669,13 @@ TEST(Parse, WithTimeZone) { > utc_time_zone(), &tp)); > ExpectTime(tp, tz, 2013, 6, 28, 19 - 8 - 7, 8, 9, -7 * 60 * 60, true, "PDT"); > >- // Check a skipped time (a Spring DST transition). parse() returns >- // the preferred-offset result, as defined for ConvertDateTime(). >+ // Check a skipped time (a Spring DST transition). parse() uses the >+ // pre-transition offset. > EXPECT_TRUE(parse("%Y-%m-%d %H:%M:%S", "2011-03-13 02:15:00", tz, &tp)); > ExpectTime(tp, tz, 2011, 3, 13, 3, 15, 0, -7 * 60 * 60, true, "PDT"); > >- // Check a repeated time (a Fall DST transition). parse() returns >- // the preferred-offset result, as defined for ConvertDateTime(). >+ // Check a repeated time (a Fall DST transition). parse() uses the >+ // pre-transition offset. 
> EXPECT_TRUE(parse("%Y-%m-%d %H:%M:%S", "2011-11-06 01:15:00", tz, &tp)); > ExpectTime(tp, tz, 2011, 11, 6, 1, 15, 0, -7 * 60 * 60, true, "PDT"); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/time_zone_info.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/time_zone_info.cc >index bf73635d4c6a8c033b6e487925c41a2eab805184..2cb358d048e2d82191de8bcf202829c7ea2cb883 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/time_zone_info.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/time_zone_info.cc >@@ -286,7 +286,7 @@ bool TimeZoneInfo::EquivTransitions(std::uint_fast8_t tt1_index, > return true; > } > >-// Use the POSIX-TZ-environment-variable-style std::string to handle times >+// Use the POSIX-TZ-environment-variable-style string to handle times > // in years after the last transition stored in the zoneinfo data. > void TimeZoneInfo::ExtendTransitions(const std::string& name, > const Header& hdr) { >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/time_zone_posix.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/time_zone_posix.h >index 6619f27edcf63895abeb23f3ddb48eff4f9f2ac8..9ccd4a8b68bd5761f56e314b9d8a250171b48d69 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/time_zone_posix.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/internal/cctz/src/time_zone_posix.h >@@ -89,7 +89,7 @@ struct PosixTransition { > } time; > }; > >-// The entirety of a POSIX-std::string specified time-zone rule. The standard >+// The entirety of a POSIX-string specified time-zone rule. The standard > // abbreviation and offset are always given. 
If the time zone includes > // daylight saving, then the daylight abbrevation is non-empty and the > // remaining fields are also valid. Note that the start/end transitions >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/time.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/time.h >index c41cb89c5eff1d4daf8ecb16c5b15639702f6462..50bf971df0a29cb4750aef70c11e705447646a22 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/time.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/time/time.h >@@ -25,11 +25,12 @@ > // * `absl::TimeZone` defines geopolitical time zone regions (as collected > // within the IANA Time Zone database (https://www.iana.org/time-zones)). > // >+// > // Example: > // > // absl::TimeZone nyc; > // >-// // LoadTimeZone may fail so it's always better to check for success. >+// // LoadTimeZone() may fail so it's always better to check for success. > // if (!absl::LoadTimeZone("America/New_York", &nyc)) { > // // handle error case > // } >@@ -162,6 +163,11 @@ class Duration { > Duration& operator*=(float r) { return *this *= static_cast<double>(r); } > Duration& operator/=(float r) { return *this /= static_cast<double>(r); } > >+ template <typename H> >+ friend H AbslHashValue(H h, Duration d) { >+ return H::combine(std::move(h), d.rep_hi_, d.rep_lo_); >+ } >+ > private: > friend constexpr int64_t time_internal::GetRepHi(Duration d); > friend constexpr uint32_t time_internal::GetRepLo(Duration d); >@@ -321,6 +327,9 @@ Duration Ceil(Duration d, Duration unit); > // 0 == d / inf > // INT64_MAX == inf / d > // >+// d < inf >+// -inf < d >+// > // // Division by zero returns infinity, or INT64_MIN/MAX where appropriate. 
> // inf == d / 0 > // INT64_MAX == d / absl::ZeroDuration() >@@ -481,7 +490,7 @@ std::chrono::hours ToChronoHours(Duration d); > > // FormatDuration() > // >-// Returns a std::string representing the duration in the form "72h3m0.5s". >+// Returns a string representing the duration in the form "72h3m0.5s". > // Returns "inf" or "-inf" for +/- `InfiniteDuration()`. > std::string FormatDuration(Duration d); > >@@ -492,11 +501,11 @@ inline std::ostream& operator<<(std::ostream& os, Duration d) { > > // ParseDuration() > // >-// Parses a duration std::string consisting of a possibly signed sequence of >+// Parses a duration string consisting of a possibly signed sequence of > // decimal numbers, each with an optional fractional part and a unit > // suffix. The valid suffixes are "ns", "us" "ms", "s", "m", and "h". > // Simple examples include "300ms", "-1.5h", and "2h45m". Parses "0" as >-// `ZeroDuration()`. Parses "inf" and "-inf" as +/- `InfiniteDuration()`. >+// `ZeroDuration()`. Parses "inf" and "-inf" as +/- `InfiniteDuration()`. > bool ParseDuration(const std::string& dur_string, Duration* d); > > // Support for flag values of type Duration. Duration flags must be specified >@@ -607,6 +616,11 @@ class Time { > // Returns the breakdown of this instant in the given TimeZone. > Breakdown In(TimeZone tz) const; > >+ template <typename H> >+ friend H AbslHashValue(H h, Time t) { >+ return H::combine(std::move(h), t.rep_); >+ } >+ > private: > friend constexpr Time time_internal::FromUnixDuration(Duration d); > friend constexpr Duration time_internal::ToUnixDuration(Time t); >@@ -689,7 +703,9 @@ constexpr Time InfinitePast() { > // Examples: > // > // absl::TimeZone lax; >-// if (!absl::LoadTimeZone("America/Los_Angeles", &lax)) { ... 
} >+// if (!absl::LoadTimeZone("America/Los_Angeles", &lax)) { >+// // handle error case >+// } > // > // // A unique civil time > // absl::TimeConversion jan01 = >@@ -759,7 +775,9 @@ TimeConversion ConvertDateTime(int64_t year, int mon, int day, int hour, > // Example: > // > // absl::TimeZone seattle; >-// if (!absl::LoadTimeZone("America/Los_Angeles", &seattle)) { ... } >+// if (!absl::LoadTimeZone("America/Los_Angeles", &seattle)) { >+// // handle error case >+// } > // absl::Time t = absl::FromDateTime(2017, 9, 26, 9, 30, 0, seattle); > Time FromDateTime(int64_t year, int mon, int day, int hour, int min, int sec, > TimeZone tz); >@@ -871,8 +889,10 @@ std::chrono::system_clock::time_point ToChronoTime(Time); > // FormatTime()/ParseTime() format specifiers for RFC3339 date/time strings, > // with trailing zeros trimmed or with fractional seconds omitted altogether. > // >-// Note that RFC3339_sec[] matches an ISO 8601 extended format for date >-// and time with UTC offset. >+// Note that RFC3339_sec[] matches an ISO 8601 extended format for date and >+// time with UTC offset. Also note the use of "%Y": RFC3339 mandates that >+// years have exactly four digits, but we allow them to take their natural >+// width. > extern const char RFC3339_full[]; // %Y-%m-%dT%H:%M:%E*S%Ez > extern const char RFC3339_sec[]; // %Y-%m-%dT%H:%M:%S%Ez > >@@ -886,7 +906,7 @@ extern const char RFC1123_no_wday[]; // %d %b %E4Y %H:%M:%S %z > // FormatTime() > // > // Formats the given `absl::Time` in the `absl::TimeZone` according to the >-// provided format std::string. Uses strftime()-like formatting options, with >+// provided format string. Uses strftime()-like formatting options, with > // the following extensions: > // > // - %Ez - RFC3339-compatible numeric UTC offset (+hh:mm or -hh:mm) >@@ -910,16 +930,18 @@ extern const char RFC1123_no_wday[]; // %d %b %E4Y %H:%M:%S %z > // Example: > // > // absl::TimeZone lax; >-// if (!absl::LoadTimeZone("America/Los_Angeles", &lax)) { ... 
} >+// if (!absl::LoadTimeZone("America/Los_Angeles", &lax)) { >+// // handle error case >+// } > // absl::Time t = absl::FromDateTime(2013, 1, 2, 3, 4, 5, lax); > // >-// std::string f = absl::FormatTime("%H:%M:%S", t, lax); // "03:04:05" >+// string f = absl::FormatTime("%H:%M:%S", t, lax); // "03:04:05" > // f = absl::FormatTime("%H:%M:%E3S", t, lax); // "03:04:05.000" > // > // Note: If the given `absl::Time` is `absl::InfiniteFuture()`, the returned >-// std::string will be exactly "infinite-future". If the given `absl::Time` is >-// `absl::InfinitePast()`, the returned std::string will be exactly "infinite-past". >-// In both cases the given format std::string and `absl::TimeZone` are ignored. >+// string will be exactly "infinite-future". If the given `absl::Time` is >+// `absl::InfinitePast()`, the returned string will be exactly "infinite-past". >+// In both cases the given format string and `absl::TimeZone` are ignored. > // > std::string FormatTime(const std::string& format, Time t, TimeZone tz); > >@@ -936,7 +958,7 @@ inline std::ostream& operator<<(std::ostream& os, Time t) { > > // ParseTime() > // >-// Parses an input std::string according to the provided format std::string and >+// Parses an input string according to the provided format string and > // returns the corresponding `absl::Time`. Uses strftime()-like formatting > // options, with the same extensions as FormatTime(), but with the > // exceptions that %E#S is interpreted as %E*S, and %E#f as %E*f. %Ez >@@ -950,7 +972,7 @@ inline std::ostream& operator<<(std::ostream& os, Time t) { > // > // "1970-01-01 00:00:00.0 +0000" > // >-// For example, parsing a std::string of "15:45" (%H:%M) will return an absl::Time >+// For example, parsing a string of "15:45" (%H:%M) will return an absl::Time > // that represents "1970-01-01 15:45:00.0 +0000". 
> // > // Note that since ParseTime() returns time instants, it makes the most sense >@@ -977,15 +999,15 @@ inline std::ostream& operator<<(std::ostream& os, Time t) { > // Errors are indicated by returning false and assigning an error message > // to the "err" out param if it is non-null. > // >-// Note: If the input std::string is exactly "infinite-future", the returned >+// Note: If the input string is exactly "infinite-future", the returned > // `absl::Time` will be `absl::InfiniteFuture()` and `true` will be returned. >-// If the input std::string is "infinite-past", the returned `absl::Time` will be >+// If the input string is "infinite-past", the returned `absl::Time` will be > // `absl::InfinitePast()` and `true` will be returned. > // > bool ParseTime(const std::string& format, const std::string& input, Time* time, > std::string* err); > >-// Like ParseTime() above, but if the format std::string does not contain a UTC >+// Like ParseTime() above, but if the format string does not contain a UTC > // offset specification (%z/%Ez/%E*z) then the input is interpreted in the > // given TimeZone. This means that the input, by itself, does not identify a > // unique instant. Being time-zone dependent, it also admits the possibility >@@ -1028,7 +1050,9 @@ std::string UnparseFlag(Time t); > // absl::TimeZone pst = absl::FixedTimeZone(-8 * 60 * 60); > // absl::TimeZone loc = absl::LocalTimeZone(); > // absl::TimeZone lax; >-// if (!absl::LoadTimeZone("America/Los_Angeles", &lax)) { ... 
} >+// if (!absl::LoadTimeZone("America/Los_Angeles", &lax)) { >+// // handle error case >+// } > // > // See also: > // - https://github.com/google/cctz >@@ -1045,6 +1069,11 @@ class TimeZone { > > std::string name() const { return cz_.name(); } > >+ template <typename H> >+ friend H AbslHashValue(H h, TimeZone tz) { >+ return H::combine(std::move(h), tz.cz_); >+ } >+ > private: > friend bool operator==(TimeZone a, TimeZone b) { return a.cz_ == b.cz_; } > friend bool operator!=(TimeZone a, TimeZone b) { return a.cz_ != b.cz_; } >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/BUILD.bazel b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/BUILD.bazel >index 096c119e638cbd35040550bed10b192e7b467401..32f690c94c1d347c1437283b8088d2b195f51cbb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/BUILD.bazel >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/BUILD.bazel >@@ -19,6 +19,7 @@ load( > "ABSL_DEFAULT_COPTS", > "ABSL_TEST_COPTS", > "ABSL_EXCEPTIONS_FLAG", >+ "ABSL_EXCEPTIONS_FLAG_LINKOPTS", > ) > > package(default_visibility = ["//visibility:public"]) >@@ -55,6 +56,7 @@ cc_library( > "bad_any_cast.h", > ], > copts = ABSL_EXCEPTIONS_FLAG + ABSL_DEFAULT_COPTS, >+ linkopts = ABSL_EXCEPTIONS_FLAG_LINKOPTS, > visibility = ["//visibility:private"], > deps = [ > "//absl/base", >@@ -69,6 +71,7 @@ cc_test( > "any_test.cc", > ], > copts = ABSL_TEST_COPTS + ABSL_EXCEPTIONS_FLAG, >+ linkopts = ABSL_EXCEPTIONS_FLAG_LINKOPTS, > deps = [ > ":any", > "//absl/base", >@@ -100,6 +103,7 @@ cc_test( > name = "any_exception_safety_test", > srcs = ["any_exception_safety_test.cc"], > copts = ABSL_TEST_COPTS + ABSL_EXCEPTIONS_FLAG, >+ linkopts = ABSL_EXCEPTIONS_FLAG_LINKOPTS, > deps = [ > ":any", > "//absl/base:exception_safety_testing", >@@ -124,6 +128,7 @@ cc_test( > size = "small", > srcs = ["span_test.cc"], > copts = ABSL_TEST_COPTS + ABSL_EXCEPTIONS_FLAG, >+ linkopts = 
ABSL_EXCEPTIONS_FLAG_LINKOPTS, > deps = [ > ":span", > "//absl/base:config", >@@ -131,6 +136,7 @@ cc_test( > "//absl/base:exception_testing", > "//absl/container:fixed_array", > "//absl/container:inlined_vector", >+ "//absl/hash:hash_testing", > "//absl/strings", > "@com_google_googletest//:gtest_main", > ], >@@ -148,6 +154,7 @@ cc_test( > "//absl/base:exception_testing", > "//absl/container:fixed_array", > "//absl/container:inlined_vector", >+ "//absl/hash:hash_testing", > "//absl/strings", > "@com_google_googletest//:gtest_main", > ], >@@ -172,6 +179,7 @@ cc_library( > srcs = ["bad_optional_access.cc"], > hdrs = ["bad_optional_access.h"], > copts = ABSL_DEFAULT_COPTS + ABSL_EXCEPTIONS_FLAG, >+ linkopts = ABSL_EXCEPTIONS_FLAG_LINKOPTS, > deps = [ > "//absl/base", > "//absl/base:config", >@@ -183,6 +191,7 @@ cc_library( > srcs = ["bad_variant_access.cc"], > hdrs = ["bad_variant_access.h"], > copts = ABSL_EXCEPTIONS_FLAG + ABSL_DEFAULT_COPTS, >+ linkopts = ABSL_EXCEPTIONS_FLAG_LINKOPTS, > deps = [ > "//absl/base", > "//absl/base:config", >@@ -196,6 +205,7 @@ cc_test( > "optional_test.cc", > ], > copts = ABSL_TEST_COPTS + ABSL_EXCEPTIONS_FLAG, >+ linkopts = ABSL_EXCEPTIONS_FLAG_LINKOPTS, > deps = [ > ":optional", > "//absl/base", >@@ -212,6 +222,7 @@ cc_test( > "optional_exception_safety_test.cc", > ], > copts = ABSL_TEST_COPTS + ABSL_EXCEPTIONS_FLAG, >+ linkopts = ABSL_EXCEPTIONS_FLAG_LINKOPTS, > deps = [ > ":optional", > "//absl/base:exception_safety_testing", >@@ -239,6 +250,7 @@ cc_test( > size = "small", > srcs = ["variant_test.cc"], > copts = ABSL_TEST_COPTS + ABSL_EXCEPTIONS_FLAG, >+ linkopts = ABSL_EXCEPTIONS_FLAG_LINKOPTS, > deps = [ > ":variant", > "//absl/base:config", >@@ -271,6 +283,7 @@ cc_test( > "variant_exception_safety_test.cc", > ], > copts = ABSL_TEST_COPTS + ABSL_EXCEPTIONS_FLAG, >+ linkopts = ABSL_EXCEPTIONS_FLAG_LINKOPTS, > deps = [ > ":variant", > "//absl/base:exception_safety_testing", >diff --git 
a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/BUILD.gn >index fd7e89dee34c3b32905a7fa78fbfbfe2abb01e55..bfdafb317e3b4bf1d3fec252cd0cd1d193ec1929 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/BUILD.gn >@@ -104,6 +104,7 @@ source_set("optional") { > deps = [ > ":bad_optional_access", > "../base:config", >+ "../base:core_headers", > "../memory", > "../meta:type_traits", > "../utility", >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/any_exception_safety_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/any_exception_safety_test.cc >index 36955f6c5c376d32fb11a55895ba4ba06fe5a78e..f9dd8c482c25543b9142e42ba175a8c0149c3566 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/any_exception_safety_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/any_exception_safety_test.cc >@@ -62,7 +62,7 @@ testing::AssertionResult AnyInvariants(absl::any* a) { > static_cast<void>(unused); > return AssertionFailure() > << "A reset `any` should not be able to be any_cast"; >- } catch (absl::bad_any_cast) { >+ } catch (const absl::bad_any_cast&) { > } catch (...) 
{ > return AssertionFailure() > << "Unexpected exception thrown from absl::any_cast"; >@@ -107,7 +107,7 @@ TEST(AnyExceptionSafety, Assignment) { > }; > auto any_strong_tester = testing::MakeExceptionSafetyTester() > .WithInitialValue(original) >- .WithInvariants(AnyInvariants, any_is_strong); >+ .WithContracts(AnyInvariants, any_is_strong); > > Thrower val(2); > absl::any any_val(val); >@@ -129,7 +129,7 @@ TEST(AnyExceptionSafety, Assignment) { > auto strong_empty_any_tester = > testing::MakeExceptionSafetyTester() > .WithInitialValue(absl::any{}) >- .WithInvariants(AnyInvariants, empty_any_is_strong); >+ .WithContracts(AnyInvariants, empty_any_is_strong); > > EXPECT_TRUE(strong_empty_any_tester.Test(assign_any)); > EXPECT_TRUE(strong_empty_any_tester.Test(assign_val)); >@@ -142,7 +142,7 @@ TEST(AnyExceptionSafety, Emplace) { > absl::any{absl::in_place_type_t<Thrower>(), 1, testing::nothrow_ctor}; > auto one_tester = testing::MakeExceptionSafetyTester() > .WithInitialValue(initial_val) >- .WithInvariants(AnyInvariants, AnyIsEmpty); >+ .WithContracts(AnyInvariants, AnyIsEmpty); > > auto emp_thrower = [](absl::any* ap) { ap->emplace<Thrower>(2); }; > auto emp_throwervec = [](absl::any* ap) { >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/internal/variant.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/internal/variant.h >index 7708e67cb14f416ade16719e2369aaf648bb6283..eff4fefed409d92d507dba8669fe1231ed4cb15e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/internal/variant.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/internal/variant.h >@@ -1,4 +1,4 @@ >-// Copyright 2017 The Abseil Authors. >+// Copyright 2018 The Abseil Authors. > // > // Licensed under the Apache License, Version 2.0 (the "License"); > // you may not use this file except in compliance with the License. 
>@@ -37,6 +37,8 @@ > #include "absl/types/bad_variant_access.h" > #include "absl/utility/utility.h" > >+#if !defined(ABSL_HAVE_STD_VARIANT) >+ > namespace absl { > > template <class... Types> >@@ -908,6 +910,11 @@ struct PerformVisitation { > template <std::size_t... TupIs, std::size_t... Is> > constexpr ReturnType Run(std::false_type /*has_valueless*/, > index_sequence<TupIs...>, SizeT<Is>...) const { >+ static_assert( >+ std::is_same<ReturnType, >+ absl::result_of_t<Op(VariantAccessResult< >+ Is, QualifiedVariants>...)>>::value, >+ "All visitation overloads must have the same return type."); > return absl::base_internal::Invoke( > absl::forward<Op>(op), > VariantCoreAccess::Access<Is>( >@@ -1227,23 +1234,23 @@ using VariantCopyBase = absl::conditional_t< > // Base that is dependent on whether or not the move-assign can be trivial. > template <class... T> > using VariantMoveAssignBase = absl::conditional_t< >- absl::disjunction<absl::conjunction<std::is_move_assignable<Union<T...>>, >+ absl::disjunction<absl::conjunction<absl::is_move_assignable<Union<T...>>, > std::is_move_constructible<Union<T...>>, > std::is_destructible<Union<T...>>>, > absl::negation<absl::conjunction< > std::is_move_constructible<T>..., >- std::is_move_assignable<T>...>>>::value, >+ absl::is_move_assignable<T>...>>>::value, > VariantCopyBase<T...>, VariantMoveAssignBaseNontrivial<T...>>; > > // Base that is dependent on whether or not the copy-assign can be trivial. > template <class... 
T> > using VariantCopyAssignBase = absl::conditional_t< >- absl::disjunction<absl::conjunction<std::is_copy_assignable<Union<T...>>, >+ absl::disjunction<absl::conjunction<absl::is_copy_assignable<Union<T...>>, > std::is_copy_constructible<Union<T...>>, > std::is_destructible<Union<T...>>>, > absl::negation<absl::conjunction< > std::is_copy_constructible<T>..., >- std::is_copy_assignable<T>...>>>::value, >+ absl::is_copy_assignable<T>...>>>::value, > VariantMoveAssignBase<T...>, VariantCopyAssignBaseNontrivial<T...>>; > > template <class... T> >@@ -1612,4 +1619,5 @@ struct VariantHashBase<Variant, > } // namespace variant_internal > } // namespace absl > >+#endif // !defined(ABSL_HAVE_STD_VARIANT) > #endif // ABSL_TYPES_variant_internal_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/optional.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/optional.h >index c837cddeef48942647e7f8292d0163750946b2b6..14210018815c4c661207fe40f452b0135c9a0af3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/optional.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/optional.h >@@ -59,6 +59,7 @@ using std::nullopt; > #include <type_traits> > #include <utility> > >+#include "absl/base/attributes.h" > #include "absl/memory/memory.h" > #include "absl/meta/type_traits.h" > #include "absl/types/bad_optional_access.h" >@@ -411,10 +412,10 @@ constexpr copy_traits get_ctor_copy_traits() { > > template <typename T> > constexpr copy_traits get_assign_copy_traits() { >- return std::is_copy_assignable<T>::value && >+ return absl::is_copy_assignable<T>::value && > std::is_copy_constructible<T>::value > ? copy_traits::copyable >- : std::is_move_assignable<T>::value && >+ : absl::is_move_assignable<T>::value && > std::is_move_constructible<T>::value > ? 
copy_traits::movable > : copy_traits::non_movable; >@@ -700,7 +701,7 @@ class optional : private optional_internal::optional_data<T>, > // optional::reset() > // > // Destroys the inner `T` value of an `absl::optional` if one is present. >- void reset() noexcept { this->destruct(); } >+ ABSL_ATTRIBUTE_REINITIALIZES void reset() noexcept { this->destruct(); } > > // optional::emplace() > // >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/optional_exception_safety_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/optional_exception_safety_test.cc >index d2ef04b8d1b083c755c2306855460aa8f0de81dd..d117ee5184919fe6b12d1233de16e9aee80e803e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/optional_exception_safety_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/optional_exception_safety_test.cc >@@ -38,12 +38,12 @@ constexpr int kUpdatedInteger = 10; > template <typename OptionalT> > bool ValueThrowsBadOptionalAccess(const OptionalT& optional) try { > return (static_cast<void>(optional.value()), false); >-} catch (absl::bad_optional_access) { >+} catch (const absl::bad_optional_access&) { > return true; > } > > template <typename OptionalT> >-AssertionResult CheckInvariants(OptionalT* optional_ptr) { >+AssertionResult OptionalInvariants(OptionalT* optional_ptr) { > // Check the current state post-throw for validity > auto& optional = *optional_ptr; > >@@ -123,8 +123,8 @@ TEST(OptionalExceptionSafety, NothrowConstructors) { > TEST(OptionalExceptionSafety, Emplace) { > // Test the basic guarantee plus test the result of optional::has_value() > // is false in all cases >- auto disengaged_test = MakeExceptionSafetyTester().WithInvariants( >- CheckInvariants<Optional>, CheckDisengaged<Optional>); >+ auto disengaged_test = MakeExceptionSafetyTester().WithContracts( >+ OptionalInvariants<Optional>, CheckDisengaged<Optional>); > auto 
disengaged_test_empty = disengaged_test.WithInitialValue(Optional()); > auto disengaged_test_nonempty = > disengaged_test.WithInitialValue(Optional(kInitialInteger)); >@@ -147,11 +147,11 @@ TEST(OptionalExceptionSafety, EverythingThrowsSwap) { > // Test the basic guarantee plus test the result of optional::has_value() > // remains the same > auto test = >- MakeExceptionSafetyTester().WithInvariants(CheckInvariants<Optional>); >+ MakeExceptionSafetyTester().WithContracts(OptionalInvariants<Optional>); > auto disengaged_test_empty = test.WithInitialValue(Optional()) >- .WithInvariants(CheckDisengaged<Optional>); >+ .WithContracts(CheckDisengaged<Optional>); > auto engaged_test_nonempty = test.WithInitialValue(Optional(kInitialInteger)) >- .WithInvariants(CheckEngaged<Optional>); >+ .WithContracts(CheckEngaged<Optional>); > > auto swap_empty = [](Optional* optional_ptr) { > auto empty = Optional(); >@@ -192,11 +192,11 @@ TEST(OptionalExceptionSafety, CopyAssign) { > // Test the basic guarantee plus test the result of optional::has_value() > // remains the same > auto test = >- MakeExceptionSafetyTester().WithInvariants(CheckInvariants<Optional>); >+ MakeExceptionSafetyTester().WithContracts(OptionalInvariants<Optional>); > auto disengaged_test_empty = test.WithInitialValue(Optional()) >- .WithInvariants(CheckDisengaged<Optional>); >+ .WithContracts(CheckDisengaged<Optional>); > auto engaged_test_nonempty = test.WithInitialValue(Optional(kInitialInteger)) >- .WithInvariants(CheckEngaged<Optional>); >+ .WithContracts(CheckEngaged<Optional>); > > auto copyassign_nonempty = [](Optional* optional_ptr) { > auto nonempty = >@@ -218,11 +218,11 @@ TEST(OptionalExceptionSafety, MoveAssign) { > // Test the basic guarantee plus test the result of optional::has_value() > // remains the same > auto test = >- MakeExceptionSafetyTester().WithInvariants(CheckInvariants<Optional>); >+ MakeExceptionSafetyTester().WithContracts(OptionalInvariants<Optional>); > auto disengaged_test_empty 
= test.WithInitialValue(Optional()) >- .WithInvariants(CheckDisengaged<Optional>); >+ .WithContracts(CheckDisengaged<Optional>); > auto engaged_test_nonempty = test.WithInitialValue(Optional(kInitialInteger)) >- .WithInvariants(CheckEngaged<Optional>); >+ .WithContracts(CheckEngaged<Optional>); > > auto moveassign_empty = [](Optional* optional_ptr) { > auto empty = Optional(); >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/optional_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/optional_test.cc >index 179bfd66d2fee0eb9321b0559b2b9dc18910dbb3..d90db9f821b52d4a935557792e212e14199e7809 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/optional_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/optional_test.cc >@@ -607,11 +607,12 @@ TEST(optionalTest, CopyAssignment) { > opt2_to_empty = empty; > EXPECT_FALSE(opt2_to_empty); > >- EXPECT_FALSE(std::is_copy_assignable<absl::optional<const int>>::value); >- EXPECT_TRUE(std::is_copy_assignable<absl::optional<Copyable>>::value); >- EXPECT_FALSE(std::is_copy_assignable<absl::optional<MoveableThrow>>::value); >- EXPECT_FALSE(std::is_copy_assignable<absl::optional<MoveableNoThrow>>::value); >- EXPECT_FALSE(std::is_copy_assignable<absl::optional<NonMovable>>::value); >+ EXPECT_FALSE(absl::is_copy_assignable<absl::optional<const int>>::value); >+ EXPECT_TRUE(absl::is_copy_assignable<absl::optional<Copyable>>::value); >+ EXPECT_FALSE(absl::is_copy_assignable<absl::optional<MoveableThrow>>::value); >+ EXPECT_FALSE( >+ absl::is_copy_assignable<absl::optional<MoveableNoThrow>>::value); >+ EXPECT_FALSE(absl::is_copy_assignable<absl::optional<NonMovable>>::value); > > EXPECT_TRUE(absl::is_trivially_copy_assignable<int>::value); > EXPECT_TRUE(absl::is_trivially_copy_assignable<volatile int>::value); >@@ -625,9 +626,9 @@ TEST(optionalTest, CopyAssignment) { > }; > > 
EXPECT_TRUE(absl::is_trivially_copy_assignable<Trivial>::value); >- EXPECT_FALSE(std::is_copy_assignable<const Trivial>::value); >- EXPECT_FALSE(std::is_copy_assignable<volatile Trivial>::value); >- EXPECT_TRUE(std::is_copy_assignable<NonTrivial>::value); >+ EXPECT_FALSE(absl::is_copy_assignable<const Trivial>::value); >+ EXPECT_FALSE(absl::is_copy_assignable<volatile Trivial>::value); >+ EXPECT_TRUE(absl::is_copy_assignable<NonTrivial>::value); > EXPECT_FALSE(absl::is_trivially_copy_assignable<NonTrivial>::value); > > // std::optional doesn't support volatile nontrivial types. >@@ -695,11 +696,11 @@ TEST(optionalTest, MoveAssignment) { > EXPECT_EQ(1, listener.volatile_move_assign); > } > #endif // ABSL_HAVE_STD_OPTIONAL >- EXPECT_FALSE(std::is_move_assignable<absl::optional<const int>>::value); >- EXPECT_TRUE(std::is_move_assignable<absl::optional<Copyable>>::value); >- EXPECT_TRUE(std::is_move_assignable<absl::optional<MoveableThrow>>::value); >- EXPECT_TRUE(std::is_move_assignable<absl::optional<MoveableNoThrow>>::value); >- EXPECT_FALSE(std::is_move_assignable<absl::optional<NonMovable>>::value); >+ EXPECT_FALSE(absl::is_move_assignable<absl::optional<const int>>::value); >+ EXPECT_TRUE(absl::is_move_assignable<absl::optional<Copyable>>::value); >+ EXPECT_TRUE(absl::is_move_assignable<absl::optional<MoveableThrow>>::value); >+ EXPECT_TRUE(absl::is_move_assignable<absl::optional<MoveableNoThrow>>::value); >+ EXPECT_FALSE(absl::is_move_assignable<absl::optional<NonMovable>>::value); > > EXPECT_FALSE( > std::is_nothrow_move_assignable<absl::optional<MoveableThrow>>::value); >@@ -1619,7 +1620,7 @@ TEST(optionalTest, AssignmentConstraints) { > EXPECT_TRUE( > (std::is_assignable<absl::optional<AnyLike>&, const AnyLike&>::value)); > EXPECT_TRUE(std::is_move_assignable<absl::optional<AnyLike>>::value); >- EXPECT_TRUE(std::is_copy_assignable<absl::optional<AnyLike>>::value); >+ EXPECT_TRUE(absl::is_copy_assignable<absl::optional<AnyLike>>::value); > } > > } // namespace 
>diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/span.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/span.h >index 76be819ecca2186e5dd49dba0b2fd4a6d27ff9e8..911af0c57a61a6785557a91fbb692d7d90191458 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/span.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/span.h >@@ -87,7 +87,7 @@ constexpr auto GetDataImpl(C& c, char) noexcept // NOLINT(runtime/references) > return c.data(); > } > >-// Before C++17, std::string::data returns a const char* in all cases. >+// Before C++17, string::data returns a const char* in all cases. > inline char* GetDataImpl(std::string& s, // NOLINT(runtime/references) > int) noexcept { > return &s[0]; >@@ -379,64 +379,70 @@ class Span { > // > // Returns a reference to the i'th element of this span. > constexpr reference at(size_type i) const { >- return ABSL_PREDICT_TRUE(i < size()) >- ? ptr_[i] >+ return ABSL_PREDICT_TRUE(i < size()) // >+ ? *(data() + i) > : (base_internal::ThrowStdOutOfRange( > "Span::at failed bounds check"), >- ptr_[i]); >+ *(data() + i)); > } > > // Span::front() > // > // Returns a reference to the first element of this span. >- reference front() const noexcept { return ABSL_ASSERT(size() > 0), ptr_[0]; } >+ constexpr reference front() const noexcept { >+ return ABSL_ASSERT(size() > 0), *data(); >+ } > > // Span::back() > // > // Returns a reference to the last element of this span. >- reference back() const noexcept { >- return ABSL_ASSERT(size() > 0), ptr_[size() - 1]; >+ constexpr reference back() const noexcept { >+ return ABSL_ASSERT(size() > 0), *(data() + size() - 1); > } > > // Span::begin() > // > // Returns an iterator to the first element of this span. 
>- constexpr iterator begin() const noexcept { return ptr_; } >+ constexpr iterator begin() const noexcept { return data(); } > > // Span::cbegin() > // > // Returns a const iterator to the first element of this span. >- constexpr const_iterator cbegin() const noexcept { return ptr_; } >+ constexpr const_iterator cbegin() const noexcept { return begin(); } > > // Span::end() > // > // Returns an iterator to the last element of this span. >- iterator end() const noexcept { return ptr_ + len_; } >+ constexpr iterator end() const noexcept { return data() + size(); } > > // Span::cend() > // > // Returns a const iterator to the last element of this span. >- const_iterator cend() const noexcept { return end(); } >+ constexpr const_iterator cend() const noexcept { return end(); } > > // Span::rbegin() > // > // Returns a reverse iterator starting at the last element of this span. >- reverse_iterator rbegin() const noexcept { return reverse_iterator(end()); } >+ constexpr reverse_iterator rbegin() const noexcept { >+ return reverse_iterator(end()); >+ } > > // Span::crbegin() > // > // Returns a reverse const iterator starting at the last element of this span. >- const_reverse_iterator crbegin() const noexcept { return rbegin(); } >+ constexpr const_reverse_iterator crbegin() const noexcept { return rbegin(); } > > // Span::rend() > // > // Returns a reverse iterator starting at the first element of this span. >- reverse_iterator rend() const noexcept { return reverse_iterator(begin()); } >+ constexpr reverse_iterator rend() const noexcept { >+ return reverse_iterator(begin()); >+ } > > // Span::crend() > // > // Returns a reverse iterator starting at the first element of this span. >- const_reverse_iterator crend() const noexcept { return rend(); } >+ constexpr const_reverse_iterator crend() const noexcept { return rend(); } > > // Span mutations > >@@ -444,7 +450,7 @@ class Span { > // > // Removes the first `n` elements from the span. 
> void remove_prefix(size_type n) noexcept { >- assert(len_ >= n); >+ assert(size() >= n); > ptr_ += n; > len_ -= n; > } >@@ -453,7 +459,7 @@ class Span { > // > // Removes the last `n` elements from the span. > void remove_suffix(size_type n) noexcept { >- assert(len_ >= n); >+ assert(size() >= n); > len_ -= n; > } > >@@ -474,11 +480,18 @@ class Span { > // absl::MakeSpan(vec).subspan(4); // {} > // absl::MakeSpan(vec).subspan(5); // throws std::out_of_range > constexpr Span subspan(size_type pos = 0, size_type len = npos) const { >- return (pos <= len_) >- ? Span(ptr_ + pos, span_internal::Min(len_ - pos, len)) >+ return (pos <= size()) >+ ? Span(data() + pos, span_internal::Min(size() - pos, len)) > : (base_internal::ThrowStdOutOfRange("pos > size()"), Span()); > } > >+ // Support for absl::Hash. >+ template <typename H> >+ friend H AbslHashValue(H h, Span v) { >+ return H::combine(H::combine_contiguous(std::move(h), v.data(), v.size()), >+ v.size()); >+ } >+ > private: > pointer ptr_; > size_type len_; >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/span_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/span_test.cc >index fbce7e87479841cf89138026993238c399862168..bd739ff2a070a6fcec1a64ea133d1b81f00cfc7a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/span_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/span_test.cc >@@ -29,6 +29,7 @@ > #include "absl/base/internal/exception_testing.h" > #include "absl/container/fixed_array.h" > #include "absl/container/inlined_vector.h" >+#include "absl/hash/hash_testing.h" > #include "absl/strings/str_cat.h" > > namespace { >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/variant.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/variant.h >index 17e0634de03746433674e99ec077b6f773e8b2d9..2f78722f8b7c956d6d176e902d4e009ca9694a2d 100644 
>--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/variant.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/variant.h >@@ -414,9 +414,9 @@ constexpr absl::add_pointer_t<const T> get_if( > // }; > // > // // Declare our variant, and call `absl::visit()` on it. >-// std::variant<int, std::string> foo = std::string("foo"); >+// absl::variant<int, std::string> foo = std::string("foo"); > // GetVariant visitor; >-// std::visit(visitor, foo); // Prints `The variant's value is: foo' >+// absl::visit(visitor, foo); // Prints `The variant's value is: foo' > template <typename Visitor, typename... Variants> > variant_internal::VisitResult<Visitor, Variants...> visit(Visitor&& vis, > Variants&&... vars) { >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/variant_exception_safety_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/variant_exception_safety_test.cc >index 27c0b96ca6f340850b0251bc6e916d37cdf11645..58436f07b843748e7a63d11c829e16f5551a6db8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/variant_exception_safety_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/variant_exception_safety_test.cc >@@ -53,13 +53,13 @@ void ToValuelessByException(ThrowingVariant& v) { // NOLINT > try { > v.emplace<Thrower>(); > v.emplace<Thrower>(ExceptionOnConversion<Thrower>()); >- } catch (ConversionException& /*e*/) { >+ } catch (const ConversionException&) { > // This space intentionally left blank. > } > } > > // Check that variant is still in a usable state after an exception is thrown. 
>-testing::AssertionResult CheckInvariants(ThrowingVariant* v) { >+testing::AssertionResult VariantInvariants(ThrowingVariant* v) { > using testing::AssertionFailure; > using testing::AssertionSuccess; > >@@ -100,7 +100,7 @@ testing::AssertionResult CheckInvariants(ThrowingVariant* v) { > auto unused = absl::get<Thrower>(*v); > static_cast<void>(unused); > return AssertionFailure() << "Variant should not contain Thrower"; >- } catch (absl::bad_variant_access) { >+ } catch (const absl::bad_variant_access&) { > } catch (...) { > return AssertionFailure() << "Unexpected exception throw from absl::get"; > } >@@ -213,8 +213,8 @@ TEST(VariantExceptionSafetyTest, CopyAssign) { > MakeExceptionSafetyTester() > .WithInitialValue(WithThrower()) > .WithOperation([&rhs](ThrowingVariant* lhs) { *lhs = rhs; }); >- EXPECT_TRUE(tester.WithInvariants(CheckInvariants).Test()); >- EXPECT_FALSE(tester.WithInvariants(strong_guarantee).Test()); >+ EXPECT_TRUE(tester.WithContracts(VariantInvariants).Test()); >+ EXPECT_FALSE(tester.WithContracts(strong_guarantee).Test()); > } > { > const ThrowingVariant rhs(ExpectedThrowerVec()); >@@ -222,8 +222,8 @@ TEST(VariantExceptionSafetyTest, CopyAssign) { > MakeExceptionSafetyTester() > .WithInitialValue(WithThrowerVec()) > .WithOperation([&rhs](ThrowingVariant* lhs) { *lhs = rhs; }); >- EXPECT_TRUE(tester.WithInvariants(CheckInvariants).Test()); >- EXPECT_FALSE(tester.WithInvariants(strong_guarantee).Test()); >+ EXPECT_TRUE(tester.WithContracts(VariantInvariants).Test()); >+ EXPECT_FALSE(tester.WithContracts(strong_guarantee).Test()); > } > // libstdc++ std::variant has bugs on copy assignment regarding exception > // safety. 
>@@ -251,12 +251,12 @@ TEST(VariantExceptionSafetyTest, CopyAssign) { > .WithInitialValue(WithCopyNoThrow()) > .WithOperation([&rhs](ThrowingVariant* lhs) { *lhs = rhs; }); > EXPECT_TRUE(tester >- .WithInvariants(CheckInvariants, >- [](ThrowingVariant* lhs) { >- return lhs->valueless_by_exception(); >- }) >+ .WithContracts(VariantInvariants, >+ [](ThrowingVariant* lhs) { >+ return lhs->valueless_by_exception(); >+ }) > .Test()); >- EXPECT_FALSE(tester.WithInvariants(strong_guarantee).Test()); >+ EXPECT_FALSE(tester.WithContracts(strong_guarantee).Test()); > } > #endif // !(defined(ABSL_HAVE_STD_VARIANT) && defined(__GLIBCXX__)) > { >@@ -268,7 +268,7 @@ TEST(VariantExceptionSafetyTest, CopyAssign) { > const ThrowingVariant rhs(MoveNothrow{}); > EXPECT_TRUE(MakeExceptionSafetyTester() > .WithInitialValue(WithThrower()) >- .WithInvariants(CheckInvariants, strong_guarantee) >+ .WithContracts(VariantInvariants, strong_guarantee) > .Test([&rhs](ThrowingVariant* lhs) { *lhs = rhs; })); > } > } >@@ -304,11 +304,11 @@ TEST(VariantExceptionSafetyTest, MoveAssign) { > *lhs = std::move(copy); > }); > EXPECT_TRUE(tester >- .WithInvariants( >- CheckInvariants, >+ .WithContracts( >+ VariantInvariants, > [&](ThrowingVariant* lhs) { return lhs->index() == j; }) > .Test()); >- EXPECT_FALSE(tester.WithInvariants(strong_guarantee).Test()); >+ EXPECT_FALSE(tester.WithContracts(strong_guarantee).Test()); > } > { > // - otherwise (index() != j), equivalent to >@@ -318,10 +318,10 @@ TEST(VariantExceptionSafetyTest, MoveAssign) { > ThrowingVariant rhs(CopyNothrow{}); > EXPECT_TRUE(MakeExceptionSafetyTester() > .WithInitialValue(WithThrower()) >- .WithInvariants(CheckInvariants, >- [](ThrowingVariant* lhs) { >- return lhs->valueless_by_exception(); >- }) >+ .WithContracts(VariantInvariants, >+ [](ThrowingVariant* lhs) { >+ return lhs->valueless_by_exception(); >+ }) > .Test([&](ThrowingVariant* lhs) { > auto copy = rhs; > *lhs = std::move(copy); >@@ -347,12 +347,12 @@ 
TEST(VariantExceptionSafetyTest, ValueAssign) { > .WithInitialValue(WithThrower()) > .WithOperation([rhs](ThrowingVariant* lhs) { *lhs = rhs; }); > EXPECT_TRUE(copy_tester >- .WithInvariants(CheckInvariants, >- [](ThrowingVariant* lhs) { >- return !lhs->valueless_by_exception(); >- }) >+ .WithContracts(VariantInvariants, >+ [](ThrowingVariant* lhs) { >+ return !lhs->valueless_by_exception(); >+ }) > .Test()); >- EXPECT_FALSE(copy_tester.WithInvariants(strong_guarantee).Test()); >+ EXPECT_FALSE(copy_tester.WithContracts(strong_guarantee).Test()); > // move assign > auto move_tester = MakeExceptionSafetyTester() > .WithInitialValue(WithThrower()) >@@ -361,13 +361,13 @@ TEST(VariantExceptionSafetyTest, ValueAssign) { > *lhs = std::move(copy); > }); > EXPECT_TRUE(move_tester >- .WithInvariants(CheckInvariants, >- [](ThrowingVariant* lhs) { >- return !lhs->valueless_by_exception(); >- }) >+ .WithContracts(VariantInvariants, >+ [](ThrowingVariant* lhs) { >+ return !lhs->valueless_by_exception(); >+ }) > .Test()); > >- EXPECT_FALSE(move_tester.WithInvariants(strong_guarantee).Test()); >+ EXPECT_FALSE(move_tester.WithContracts(strong_guarantee).Test()); > } > // Otherwise (*this holds something else), if is_nothrow_constructible_v<Tj, > // T> || !is_nothrow_move_constructible_v<Tj> is true, equivalent to >@@ -400,12 +400,12 @@ TEST(VariantExceptionSafetyTest, ValueAssign) { > .WithInitialValue(WithCopyNoThrow()) > .WithOperation([&rhs](ThrowingVariant* lhs) { *lhs = rhs; }); > EXPECT_TRUE(copy_tester >- .WithInvariants(CheckInvariants, >- [](ThrowingVariant* lhs) { >- return lhs->valueless_by_exception(); >- }) >+ .WithContracts(VariantInvariants, >+ [](ThrowingVariant* lhs) { >+ return lhs->valueless_by_exception(); >+ }) > .Test()); >- EXPECT_FALSE(copy_tester.WithInvariants(strong_guarantee).Test()); >+ EXPECT_FALSE(copy_tester.WithContracts(strong_guarantee).Test()); > // move > auto move_tester = MakeExceptionSafetyTester() > .WithInitialValue(WithCopyNoThrow()) >@@ 
-413,12 +413,12 @@ TEST(VariantExceptionSafetyTest, ValueAssign) { > *lhs = ExpectedThrower(testing::nothrow_ctor); > }); > EXPECT_TRUE(move_tester >- .WithInvariants(CheckInvariants, >- [](ThrowingVariant* lhs) { >- return lhs->valueless_by_exception(); >- }) >+ .WithContracts(VariantInvariants, >+ [](ThrowingVariant* lhs) { >+ return lhs->valueless_by_exception(); >+ }) > .Test()); >- EXPECT_FALSE(move_tester.WithInvariants(strong_guarantee).Test()); >+ EXPECT_FALSE(move_tester.WithContracts(strong_guarantee).Test()); > } > // Otherwise (if is_nothrow_constructible_v<Tj, T> == false && > // is_nothrow_move_constructible<Tj> == true), >@@ -432,7 +432,7 @@ TEST(VariantExceptionSafetyTest, ValueAssign) { > MoveNothrow rhs; > EXPECT_TRUE(MakeExceptionSafetyTester() > .WithInitialValue(WithThrower()) >- .WithInvariants(CheckInvariants, strong_guarantee) >+ .WithContracts(VariantInvariants, strong_guarantee) > .Test([&rhs](ThrowingVariant* lhs) { *lhs = rhs; })); > } > #endif // !(defined(ABSL_HAVE_STD_VARIANT) && defined(__GLIBCXX__)) >@@ -450,12 +450,12 @@ TEST(VariantExceptionSafetyTest, Emplace) { > v->emplace<Thrower>(args); > }); > EXPECT_TRUE(tester >- .WithInvariants(CheckInvariants, >- [](ThrowingVariant* v) { >- return v->valueless_by_exception(); >- }) >+ .WithContracts(VariantInvariants, >+ [](ThrowingVariant* v) { >+ return v->valueless_by_exception(); >+ }) > .Test()); >- EXPECT_FALSE(tester.WithInvariants(strong_guarantee).Test()); >+ EXPECT_FALSE(tester.WithContracts(strong_guarantee).Test()); > } > } > >@@ -472,7 +472,7 @@ TEST(VariantExceptionSafetyTest, Swap) { > ThrowingVariant rhs = ExpectedThrower(); > EXPECT_TRUE(MakeExceptionSafetyTester() > .WithInitialValue(WithThrower()) >- .WithInvariants(CheckInvariants) >+ .WithContracts(VariantInvariants) > .Test([&](ThrowingVariant* lhs) { > auto copy = rhs; > lhs->swap(copy); >@@ -486,7 +486,7 @@ TEST(VariantExceptionSafetyTest, Swap) { > ThrowingVariant rhs = ExpectedThrower(); > 
EXPECT_TRUE(MakeExceptionSafetyTester() > .WithInitialValue(WithCopyNoThrow()) >- .WithInvariants(CheckInvariants) >+ .WithContracts(VariantInvariants) > .Test([&](ThrowingVariant* lhs) { > auto copy = rhs; > lhs->swap(copy); >@@ -496,7 +496,7 @@ TEST(VariantExceptionSafetyTest, Swap) { > ThrowingVariant rhs = ExpectedThrower(); > EXPECT_TRUE(MakeExceptionSafetyTester() > .WithInitialValue(WithCopyNoThrow()) >- .WithInvariants(CheckInvariants) >+ .WithContracts(VariantInvariants) > .Test([&](ThrowingVariant* lhs) { > auto copy = rhs; > copy.swap(*lhs); >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/variant_test.cc b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/variant_test.cc >index 262bd9446c1841a570715745c3434ee1b8e19f95..bfb8bd7a0223f3cd9834e4764dc5a73bc1664def 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/variant_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/types/variant_test.cc >@@ -403,7 +403,7 @@ struct is_trivially_move_constructible > > template <class T> > struct is_trivially_move_assignable >- : std::is_move_assignable<SingleUnion<T>>::type {}; >+ : absl::is_move_assignable<SingleUnion<T>>::type {}; > > TEST(VariantTest, NothrowMoveConstructible) { > // Verify that variant is nothrow move constructible iff its template >@@ -2439,14 +2439,14 @@ TEST(VariantTest, TestMoveConversionViaConvertVariantTo) { > > TEST(VariantTest, TestCopyAndMoveTypeTraits) { > EXPECT_TRUE(std::is_copy_constructible<variant<std::string>>::value); >- EXPECT_TRUE(std::is_copy_assignable<variant<std::string>>::value); >+ EXPECT_TRUE(absl::is_copy_assignable<variant<std::string>>::value); > EXPECT_TRUE(std::is_move_constructible<variant<std::string>>::value); >- EXPECT_TRUE(std::is_move_assignable<variant<std::string>>::value); >+ EXPECT_TRUE(absl::is_move_assignable<variant<std::string>>::value); > 
EXPECT_TRUE(std::is_move_constructible<variant<std::unique_ptr<int>>>::value); >- EXPECT_TRUE(std::is_move_assignable<variant<std::unique_ptr<int>>>::value); >+ EXPECT_TRUE(absl::is_move_assignable<variant<std::unique_ptr<int>>>::value); > EXPECT_FALSE( > std::is_copy_constructible<variant<std::unique_ptr<int>>>::value); >- EXPECT_FALSE(std::is_copy_assignable<variant<std::unique_ptr<int>>>::value); >+ EXPECT_FALSE(absl::is_copy_assignable<variant<std::unique_ptr<int>>>::value); > > EXPECT_FALSE( > absl::is_trivially_copy_constructible<variant<std::string>>::value); >diff --git a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/utility/utility.h b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/utility/utility.h >index d73602c47d3b6da166357fbcdc5d41519d623561..aef4baa02dcd67aceb254f0a9e4bff00da4ce5b0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/utility/utility.h >+++ b/Source/ThirdParty/libwebrtc/Source/third_party/abseil-cpp/absl/utility/utility.h >@@ -235,13 +235,13 @@ auto apply_helper(Functor&& functor, Tuple&& t, index_sequence<Indexes...>) > // Example: > // > // class Foo{void Bar(int);}; >-// void user_function(int, std::string); >+// void user_function(int, string); > // void user_function(std::unique_ptr<Foo>); > // > // int main() > // { >-// std::tuple<int, std::string> tuple1(42, "bar"); >-// // Invokes the user function overload on int, std::string. >+// std::tuple<int, string> tuple1(42, "bar"); >+// // Invokes the user function overload on int, string. 
> // absl::apply(&user_function, tuple1); > // > // auto foo = absl::make_unique<Foo>(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/AUTHORS b/Source/ThirdParty/libwebrtc/Source/webrtc/AUTHORS >index 6b0afd861810fabf5b6d53f335b88579e137d647..1216b48d95905562cd6e1ffe797f0c63ed23d8a9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/AUTHORS >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/AUTHORS >@@ -16,6 +16,7 @@ Cody Barnes <conceptgenesis@gmail.com> > Colin Plumb > David Porter <david@porter.me> > Dax Booysen <dax@younow.com> >+Danail Kirov <dkirovbroadsoft@gmail.com> > Dmitry Lizin <sdkdimon@gmail.com> > Eric Rescorla, RTFM Inc. <ekr@rtfm.com> > Frederik Riedel, Frogg GmbH <frederik.riedel@frogg.io> >@@ -47,6 +48,7 @@ Peng Yu <yupeng323@gmail.com> > Rafael Lopez Diez <rafalopezdiez@gmail.com> > Ralph Giles <giles@ghostscript.com> > Riku Voipio <riku.voipio@linaro.org> >+Robert Bares <robert@bares.me> > Robert Nagy <robert.nagy@gmail.com> > Ryan Yoakum <ryoakum@skobalt.com> > Satender Saroha <ssaroha@yahoo.com> >@@ -56,6 +58,7 @@ Silviu Caragea <silviu.cpp@gmail.com> > Stefan Gula <steweg@gmail.com> > Steve Reid <sreid@sea-to-sky.net> > Tarun Chawla <trnkumarchawla@gmail.com> >+Uladzislau Susha <landby@gmail.com> > Vladimir Beloborodov <VladimirTechMan@gmail.com> > Vicken Simonian <vsimon@gmail.com> > Victor Costan <costan@gmail.com> >@@ -70,6 +73,10 @@ Sergio Garcia Murillo <sergio.garcia.murillo@gmail.com> > Maxim Pavlov <pavllovmax@gmail.com> > Yusuke Suzuki <utatane.tea@gmail.com> > Piasy Xu <xz4215@gmail.com> >+Tomas Popela <tomas.popela@gmail.com> >+Jan Grulich <grulja@gmail.com> >+Eike Rathke <erathke@redhat.com> >+Michel Promonet <michel.promonet.1@gmail.com> > > &yet LLC <*@andyet.com> > Agora IO <*@agora.io> >@@ -80,6 +87,7 @@ Google Inc. <*@google.com> > HyperConnect Inc. <*@hpcnt.com> > Life On Air Inc. 
<*@lifeonair.com> > Intel Corporation <*@intel.com> >+Microsoft Corporation <*@microsoft.com> > MIPS Technologies <*@mips.com> > Mozilla Foundation <*@mozilla.com> > Opera Software ASA <*@opera.com> >@@ -97,4 +105,5 @@ Wire Swiss GmbH <*@wire.com> > Miguel Paris <mparisdiaz@gmail.com> > Vewd Software AS <*@vewd.com> > Highfive, Inc. <*@highfive.com> >-CoSMo Software Consulting, Pte Ltd <*@cosmosoftware.io> >\ No newline at end of file >+CoSMo Software Consulting, Pte Ltd <*@cosmosoftware.io> >+Tuple, LLC <*@tuple.app> >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/BUILD.gn >index 5a1446d9563063214847f419f10e140c0ff32673..249094bdd6ed8810810bb6a0f375e48a03d75386 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/BUILD.gn >@@ -40,12 +40,14 @@ if (!build_with_chromium) { > if (rtc_include_tests) { > deps += [ > ":rtc_unittests", >+ ":slow_tests", > ":video_engine_tests", > ":webrtc_nonparallel_tests", > ":webrtc_perf_tests", > "call:fake_network_unittests", > "common_audio:common_audio_unittests", > "common_video:common_video_unittests", >+ "examples:examples_unittests", > "media:rtc_media_unittests", > "modules:modules_tests", > "modules:modules_unittests", >@@ -54,10 +56,8 @@ if (!build_with_chromium) { > "modules/remote_bitrate_estimator:bwe_simulations_tests", > "modules/rtp_rtcp:test_packet_masks_metrics", > "modules/video_capture:video_capture_internal_impl", >- "ortc:ortc_unittests", > "pc:peerconnection_unittests", > "pc:rtc_pc_unittests", >- "rtc_base:rtc_base_tests_utils", > "stats:rtc_stats_unittests", > "system_wrappers:system_wrappers_unittests", > "test", >@@ -98,6 +98,10 @@ config("common_inherited_config") { > cflags = [] > ldflags = [] > >+ if (rtc_enable_symbol_export) { >+ defines = [ "WEBRTC_ENABLE_SYMBOL_EXPORT" ] >+ } >+ > if (build_with_mozilla) { > defines += [ "WEBRTC_MOZILLA_BUILD" ] > } >@@ -218,8 +222,8 @@ 
config("common_config") { > defines += [ "WEBRTC_INCLUDE_INTERNAL_AUDIO_DEVICE" ] > } > >- if (!rtc_libvpx_build_vp9) { >- defines += [ "RTC_DISABLE_VP9" ] >+ if (rtc_libvpx_build_vp9) { >+ defines += [ "RTC_ENABLE_VP9" ] > } > > if (rtc_enable_sctp) { >@@ -374,29 +378,33 @@ if (!build_with_chromium) { > > deps = [ > ":webrtc_common", >+ "api:libjingle_peerconnection_api", > "api:transport_api", > "audio", > "call", > "common_audio", > "common_video", >+ "logging:rtc_event_log_api", >+ "logging:rtc_event_log_impl_base", > "media", > "modules", > "modules/video_capture:video_capture_internal_impl", >- "ortc", >+ "p2p:rtc_p2p", >+ "pc:libjingle_peerconnection", >+ "pc:peerconnection", >+ "pc:rtc_pc", >+ "pc:rtc_pc_base", > "rtc_base", > "sdk", > "video", > ] > >- # Additional factory functions to be exposed. >- # Rational: These factories are small enough (89 KiB / 32+ MiB) >- # to be unconditionaly included for user convenience. >- # That begin said: >- # TODO(yvesg) Consider making all non-core APIs optional, so that users >- # can build a customized library tailored to their needs. >+ # Include audio and video codecs by default. 
> deps += [ > "api/audio_codecs:builtin_audio_decoder_factory", > "api/audio_codecs:builtin_audio_encoder_factory", >+ "api/video_codecs:builtin_video_decoder_factory", >+ "api/video_codecs:builtin_video_encoder_factory", > ] > > if (build_with_mozilla) { >@@ -430,9 +438,9 @@ rtc_source_set("webrtc_common") { > deps = [ > "api:array_view", > "api/video:video_bitrate_allocation", >+ "api/video:video_frame", > "rtc_base:checks", >- "rtc_base:deprecation", >- "rtc_base:stringutils", >+ "//third_party/abseil-cpp/absl/strings", > ] > > if (!build_with_chromium && is_clang) { >@@ -467,7 +475,6 @@ if (rtc_include_tests) { > "p2p:rtc_p2p_unittests", > "rtc_base:rtc_base_approved_unittests", > "rtc_base:rtc_base_tests_main", >- "rtc_base:rtc_base_tests_utils", > "rtc_base:rtc_base_unittests", > "rtc_base:rtc_json_unittests", > "rtc_base:rtc_numerics_unittests", >@@ -499,6 +506,17 @@ if (rtc_include_tests) { > } > } > >+ # This runs tests that must run in real time and therefore can take some >+ # time to execute. They are in a separate executable to avoid making the >+ # regular unittest suite too slow to run frequently. >+ rtc_test("slow_tests") { >+ testonly = true >+ deps = [ >+ "modules/congestion_controller/goog_cc:goog_cc_slow_tests", >+ "test:test_main", >+ ] >+ } >+ > # TODO(pbos): Rename test suite, this is no longer "just" for video targets. > video_engine_tests_resources = [ > "resources/foreman_cif_short.yuv", >@@ -524,7 +542,6 @@ if (rtc_include_tests) { > # should move them to a more appropriate test suite. 
> "call:call_tests", > "modules/video_capture", >- "rtc_base:rtc_base_tests_utils", > "test:test_common", > "test:test_main", > "test:video_test_common", >@@ -617,6 +634,7 @@ if (rtc_include_tests) { > "sdk/android/tests/src/org/webrtc/GlGenericDrawerTest.java", > "sdk/android/tests/src/org/webrtc/HardwareVideoEncoderTest.java", > "sdk/android/tests/src/org/webrtc/ScalingSettingsTest.java", >+ "sdk/android/tests/src/org/webrtc/CryptoOptionsTest.java", > ] > > deps = [ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/DEPS b/Source/ThirdParty/libwebrtc/Source/webrtc/DEPS >index 67aa4a803c20919c26355ca3f09e8c25bde1d51d..555c27f10e7e199730006d4db97d360d2c158376 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/DEPS >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/DEPS >@@ -7,16 +7,16 @@ vars = { > 'checkout_configuration': 'default', > 'checkout_instrumented_libraries': 'checkout_linux and checkout_configuration == "default"', > 'webrtc_git': 'https://webrtc.googlesource.com', >- 'chromium_revision': 'ccb83d4a5512802f1193a506b5431e85e0138043', >+ 'chromium_revision': 'b04e513f825d6ce22e147601cbfc75ea23eb9073', > 'boringssl_git': 'https://boringssl.googlesource.com', > # Three lines of non-changing comments so that > # the commit queue can handle CLs rolling swarming_client > # and whatever else without interference from each other. >- 'swarming_revision': '486c9b53c4d54dd4b95bb6ce0e31160e600dfc11', >+ 'swarming_revision': '157bec8a25cc4ebd6a16052510d08b05b6102aad', > # Three lines of non-changing comments so that > # the commit queue can handle CLs rolling BoringSSL > # and whatever else without interference from each other. >- 'boringssl_revision': 'ce00828c89df3d4c40de7d715b1a032eb03c525c', >+ 'boringssl_revision': '6965d25602754bc419c5f757d008ba1f4da49ae4', > # Three lines of non-changing comments so that > # the commit queue can handle CLs rolling lss > # and whatever else without interference from each other. 
>@@ -24,29 +24,29 @@ vars = { > # Three lines of non-changing comments so that > # the commit queue can handle CLs rolling catapult > # and whatever else without interference from each other. >- 'catapult_revision': '4fc4281d2152a36762a0bf47ba76bf8d4a6ee234', >+ 'catapult_revision': '17079a5cc1f27a2dc57558a0897666cac4a5bc64', > # Three lines of non-changing comments so that > # the commit queue can handle CLs rolling libFuzzer > # and whatever else without interference from each other. >- 'libfuzzer_revision': 'a305a5eb85ed42edc5c965c14f308f576cb245ca', >+ 'libfuzzer_revision': '2a53098584c48af50aec3fb51febe5e651489774', > # Three lines of non-changing comments so that > # the commit queue can handle CLs rolling freetype > # and whatever else without interference from each other. >- 'freetype_revision': 'abd997aa7cf2bc9219136782c7363d14d325199c', >+ 'freetype_revision': 'fb0d66d04c4dd8d7f9604af1a6001b2737cb5098', > # Three lines of non-changing comments so that > # the commit queue can handle CLs rolling HarfBuzz > # and whatever else without interference from each other. >- 'harfbuzz_revision': '54d332dd9b0263821376161cdffb60ffb3c7847f', >+ 'harfbuzz_revision': '574d888c8a409295a952361a39c8e83a52a0fc3d', > } > deps = { > # TODO(kjellander): Move this to be Android-only once the libevent dependency > # in base/third_party/libevent is solved. 
> 'src/base': >- Var('chromium_git') + '/chromium/src/base' + '@' + '0db61379d8f76807e6f78a35ccff9227205b6195', >+ Var('chromium_git') + '/chromium/src/base' + '@' + 'b9901f9bc5b65a1169d5db696ef4428a65cec622', > 'src/build': >- Var('chromium_git') + '/chromium/src/build' + '@' + '8bb9fef862bd7570c5b465860c427701c81c1531', >+ Var('chromium_git') + '/chromium/src/build' + '@' + 'bbd67a350d745200e5798cace11a28edfd9fc3b2', > 'src/buildtools': >- Var('chromium_git') + '/chromium/buildtools.git' + '@' + '2dff9c9c74e9d732e6fe57c84ef7fd044cc45d96', >+ Var('chromium_git') + '/chromium/buildtools.git' + '@' + '04161ec8d7c781e4498c699254c69ba0dd959fde', > # Gradle 4.3-rc4. Used for testing Android Studio project generation for WebRTC. > 'src/examples/androidtests/third_party/gradle': { > 'url': Var('chromium_git') + '/external/github.com/gradle/gradle.git' + '@' + >@@ -54,21 +54,44 @@ deps = { > 'condition': 'checkout_android', > }, > 'src/ios': { >- 'url': Var('chromium_git') + '/chromium/src/ios' + '@' + 'd4e010516156c2ef2ccd974d4aa0abd9ef48c698', >+ 'url': Var('chromium_git') + '/chromium/src/ios' + '@' + 'eb9e8e09dc4b28e691cfd70242ba50ceecb6ab20', > 'condition': 'checkout_ios', > }, > 'src/testing': >- Var('chromium_git') + '/chromium/src/testing' + '@' + '677e1ef2627ca32226f5e801ebec39613035873b', >+ Var('chromium_git') + '/chromium/src/testing' + '@' + '5a776bca050aedfb0269cdd8408ff0559acde2aa', > 'src/third_party': >- Var('chromium_git') + '/chromium/src/third_party' + '@' + 'dcfac1c5eda73c8fbd0451c8e40454b284c67110', >+ Var('chromium_git') + '/chromium/src/third_party' + '@' + '77f6c16720eff7721b0c0b2438ce9126f59e1f0b', > 'src/third_party/android_ndk': { > 'url': Var('chromium_git') + '/android_ndk.git' + '@' + '4e2cea441bfd43f0863d14f57b1e1844260b9884', > 'condition': 'checkout_android', > }, > 'src/third_party/android_tools': { >- 'url': Var('chromium_git') + '/android_tools.git' + '@' + '130499e25286f4d56acafa252fee09f3cc595c49', >+ 'url': Var('chromium_git') + 
'/android_tools.git' + '@' + '6fecaa542f73dd5aeed170d9a4cf340159b42976', > 'condition': 'checkout_android', > }, >+ >+ 'src/third_party/android_build_tools/aapt2': { >+ 'packages': [ >+ { >+ 'package': 'chromium/third_party/android_build_tools/aapt2', >+ 'version': 'version:3.3.0-beta01-5013011-cr0', >+ }, >+ ], >+ 'condition': 'checkout_android', >+ 'dep_type': 'cipd', >+ }, >+ >+ 'src/third_party/android_build_tools/bundletool': { >+ 'packages': [ >+ { >+ 'package': 'chromium/third_party/android_tools_bundletool', >+ 'version': 'version:0.7.1-cr0', >+ }, >+ ], >+ 'condition': 'checkout_android', >+ 'dep_type': 'cipd', >+ }, >+ > 'src/third_party/auto/src': { > 'url': Var('chromium_git') + '/external/github.com/google/auto.git' + '@' + '8a81a858ae7b78a1aef71ac3905fade0bbd64e82', > 'condition': 'checkout_android', >@@ -84,7 +107,7 @@ deps = { > 'src/third_party/colorama/src': > Var('chromium_git') + '/external/colorama.git' + '@' + '799604a1041e9b3bc5d2789ecbd7e8db2e18e6b8', > 'src/third_party/depot_tools': >- Var('chromium_git') + '/chromium/tools/depot_tools.git' + '@' + '71e3be7a50c21faeee91ed99a8d5addfb7594e7c', >+ Var('chromium_git') + '/chromium/tools/depot_tools.git' + '@' + '44d4b29082f0d8bacacd623f91c4d29637b4b901', > 'src/third_party/errorprone/lib': { > 'url': Var('chromium_git') + '/chromium/third_party/errorprone.git' + '@' + '980d49e839aa4984015efed34b0134d4b2c9b6d7', > 'condition': 'checkout_android', >@@ -101,7 +124,7 @@ deps = { > Var('chromium_git') + '/external/github.com/harfbuzz/harfbuzz.git' + '@' + Var('harfbuzz_revision'), > # WebRTC-only dependency (not present in Chromium). 
> 'src/third_party/gtest-parallel': >- Var('chromium_git') + '/external/github.com/google/gtest-parallel' + '@' + 'fe7f791f14769390d0b124ef8231cde4d575eb12', >+ Var('chromium_git') + '/external/github.com/google/gtest-parallel' + '@' + 'e472187d1129e508890aa20ac914adeac2e7f7b6', > 'src/third_party/google-truth': { > 'packages': [ > { >@@ -113,9 +136,9 @@ deps = { > 'dep_type': 'cipd', > }, > 'src/third_party/googletest/src': >- Var('chromium_git') + '/external/github.com/google/googletest.git' + '@' + '2e68926a9d4929e9289373cd49e40ddcb9a628f7', >+ Var('chromium_git') + '/external/github.com/google/googletest.git' + '@' + '879ac092fde0a19e1b3a61b2546b2a422b1528bc', > 'src/third_party/icu': { >- 'url': Var('chromium_git') + '/chromium/deps/icu.git' + '@' + 'c52a2a250d6c5f5cbdd015dff36af7c5d0ae1150', >+ 'url': Var('chromium_git') + '/chromium/deps/icu.git' + '@' + '407b39301e71006b68bd38e770f35d32398a7b14', > }, > 'src/third_party/jsr-305/src': { > 'url': Var('chromium_git') + '/external/jsr-305.git' + '@' + '642c508235471f7220af6d5df2d3210e3bfc0919', >@@ -135,9 +158,9 @@ deps = { > 'src/third_party/libsrtp': > Var('chromium_git') + '/chromium/deps/libsrtp.git' + '@' + '650611720ecc23e0e6b32b0e3100f8b4df91696c', > 'src/third_party/libvpx/source/libvpx': >- Var('chromium_git') + '/webm/libvpx.git' + '@' + '2beb5c9f91e7166c2c9d01c94bf84767815121e4', >+ Var('chromium_git') + '/webm/libvpx.git' + '@' + 'ac3eccdc24bccece5f73ee67b88154f3bf4a4e9a', > 'src/third_party/libyuv': >- Var('chromium_git') + '/libyuv/libyuv.git' + '@' + '9a07219dc8fbf2b77e390d16bd24809444838a91', >+ Var('chromium_git') + '/libyuv/libyuv.git' + '@' + 'b36c86fdfe746d7be904c3a565b047b24d58087e', > 'src/third_party/lss': { > 'url': Var('chromium_git') + '/linux-syscall-support.git' + '@' + Var('lss_revision'), > 'condition': 'checkout_android or checkout_linux', >@@ -146,13 +169,20 @@ deps = { > 'url': Var('chromium_git') + '/external/mockito/mockito.git' + '@' + 
'04a2a289a4222f80ad20717c25144981210d2eac', > 'condition': 'checkout_android', > }, >+ >+ # Used by boringssl. >+ 'src/third_party/nasm': { >+ 'url': Var('chromium_git') + '/chromium/deps/nasm.git' + '@' + >+ 'a0a6951e259bd347c133969740348bb5ebb468c4' >+ }, >+ > 'src/third_party/openh264/src': > Var('chromium_git') + '/external/github.com/cisco/openh264' + '@' + '3b51f16a4a41df729f8d647f03e48c5f272911ff', > 'src/third_party/r8': { > 'packages': [ > { > 'package': 'chromium/third_party/r8', >- 'version': 'version:1.2.48', >+ 'version': 'version:1.4.4-cr0', > }, > ], > 'condition': 'checkout_android', >@@ -195,7 +225,7 @@ deps = { > 'src/third_party/yasm/source/patched-yasm': > Var('chromium_git') + '/chromium/deps/yasm/patched-yasm.git' + '@' + '720b70524a4424b15fc57e82263568c8ba0496ad', > 'src/tools': >- Var('chromium_git') + '/chromium/src/tools' + '@' + '731f63f3126cd6f5ceb2aafceade80857900a409', >+ Var('chromium_git') + '/chromium/src/tools' + '@' + 'b47c6315878eabd73bf71c1aa0cbd9b816716789', > 'src/tools/swarming_client': > Var('chromium_git') + '/infra/luci/client-py.git' + '@' + Var('swarming_revision'), > >@@ -895,17 +925,6 @@ deps = { > 'dep_type': 'cipd', > }, > >- 'src/third_party/android_deps/libs/com_google_android_play_core': { >- 'packages': [ >- { >- 'package': 'chromium/third_party/android_deps/libs/com_google_android_play_core', >- 'version': 'version:1.3.0-cr0', >- }, >- ], >- 'condition': 'checkout_android', >- 'dep_type': 'cipd', >- }, >- > 'src/third_party/android_deps/libs/com_google_code_findbugs_jsr305': { > 'packages': [ > { >@@ -1016,6 +1035,17 @@ deps = { > 'dep_type': 'cipd', > }, > >+ 'src/third_party/android_deps/libs/com_google_protobuf_protobuf_lite': { >+ 'packages': [ >+ { >+ 'package': 'chromium/third_party/android_deps/libs/com_google_protobuf_protobuf_lite', >+ 'version': 'version:3.0.1-cr0', >+ }, >+ ], >+ 'condition': 'checkout_android', >+ 'dep_type': 'cipd', >+ }, >+ > 
'src/third_party/android_deps/libs/com_squareup_javapoet': { > 'packages': [ > { >@@ -1393,8 +1423,6 @@ hooks = [ > recursedeps = [ > # buildtools provides clang_format, libc++, and libc++abi. > 'src/buildtools', >- # android_tools manages the NDK. >- 'src/third_party/android_tools', > ] > > # Define rules for which include paths are allowed in our source. >@@ -1416,9 +1444,12 @@ include_rules = [ > "+test", > "+rtc_tools", > >- # Abseil whitelist. Keep this in sync with abseil-in-webrtc-md. >+ # Abseil whitelist. Keep this in sync with abseil-in-webrtc.md. > "+absl/container/inlined_vector.h", > "+absl/memory/memory.h", >+ "+absl/meta/type_traits.h", >+ "+absl/strings/ascii.h", >+ "+absl/strings/match.h", > "+absl/strings/string_view.h", > "+absl/types/optional.h", > "+absl/types/variant.h", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/OWNERS b/Source/ThirdParty/libwebrtc/Source/webrtc/OWNERS >index bfa7801bd827b825e431fa359afddecc90f34234..289b5a3c7ea1839318bb52fee9635516e8615628 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/OWNERS >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/OWNERS >@@ -4,19 +4,14 @@ mflodman@webrtc.org > niklas.enbom@webrtc.org > phoglund@webrtc.org > stefan@webrtc.org >-tina.legrand@webrtc.org > tommi@webrtc.org > per-file .gitignore=* > per-file .gn=mbonadei@webrtc.org > per-file *.gn=mbonadei@webrtc.org > per-file *.gni=mbonadei@webrtc.org >-per-file *.py=phoglund@webrtc.org > per-file AUTHORS=* >-per-file BUILD.gn=phoglund@webrtc.org > per-file DEPS=* > per-file pylintrc=phoglund@webrtc.org >-per-file THIRD_PARTY_DEPS=phoglund@webrtc.org >-per-file THIRD_PARTY_DEPS=titovartem@webrtc.org > per-file WATCHLISTS=* > per-file abseil-in-webrtc.md=danilchap@webrtc.org > per-file abseil-in-webrtc.md=kwiberg@webrtc.org >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/PRESUBMIT.py b/Source/ThirdParty/libwebrtc/Source/webrtc/PRESUBMIT.py >index 
a7b5cc14a48fa0b80702462aa9c9101d8a027571..34b5adf777838390aa588e986513f520a2e305cf 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/PRESUBMIT.py >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/PRESUBMIT.py >@@ -481,7 +481,8 @@ def CheckNoStreamUsageIsAdded(input_api, output_api, > file_filter = lambda x: (input_api.FilterSourceFile(x) > and source_file_filter(x)) > for f in input_api.AffectedSourceFiles(file_filter): >- if f.LocalPath() == 'PRESUBMIT.py': >+ # Usage of stringstream is allowed under examples/. >+ if f.LocalPath() == 'PRESUBMIT.py' or f.LocalPath().startswith('examples'): > continue > for line_num, line in f.ChangedContents(): > if ((include_re.search(line) or usage_re.search(line)) >@@ -862,8 +863,89 @@ def CommonChecks(input_api, output_api): > results.extend(CheckNoStreamUsageIsAdded( > input_api, output_api, non_third_party_sources)) > results.extend(CheckAddedDepsHaveTargetApprovals(input_api, output_api)) >+ results.extend(CheckApiDepsFileIsUpToDate(input_api, output_api)) >+ results.extend(CheckAbslMemoryInclude( >+ input_api, output_api, non_third_party_sources)) >+ return results >+ >+ >+def CheckApiDepsFileIsUpToDate(input_api, output_api): >+ """Check that 'include_rules' in api/DEPS is up to date. >+ >+ The file api/DEPS must be kept up to date in order to avoid >+ including internal headers from WebRTC's api/ headers. >+ >+ This check is focused on ensuring that 'include_rules' contains a deny >+ rule for each root level directory. More focused allow rules can be >+ added to 'specific_include_rules'. >+ """ >+ results = [] >+ api_deps = os.path.join(input_api.PresubmitLocalPath(), 'api', 'DEPS') >+ with open(api_deps) as f: >+ deps_content = _ParseDeps(f.read()) >+ >+ include_rules = deps_content.get('include_rules', []) >+ >+ # Only check top level directories affected by the current CL.
>+ dirs_to_check = set() >+ for f in input_api.AffectedFiles(): >+ path_tokens = [t for t in f.LocalPath().split(os.sep) if t] >+ if len(path_tokens) > 1: >+ if (path_tokens[0] != 'api' and >+ os.path.isdir(os.path.join(input_api.PresubmitLocalPath(), >+ path_tokens[0]))): >+ dirs_to_check.add(path_tokens[0]) >+ >+ missing_include_rules = set() >+ for p in dirs_to_check: >+ rule = '-%s' % p >+ if rule not in include_rules: >+ missing_include_rules.add(rule) >+ >+ if missing_include_rules: >+ error_msg = [ >+ 'include_rules = [\n', >+ ' ...\n', >+ ] >+ >+ for r in sorted(missing_include_rules): >+ error_msg.append(' "%s",\n' % str(r)) >+ >+ error_msg.append(' ...\n') >+ error_msg.append(']\n') >+ >+ results.append(output_api.PresubmitError( >+ 'New root level directory detected! WebRTC api/ headers should ' >+ 'not #include headers from \n' >+ 'the new directory, so please update "include_rules" in file\n' >+ '"%s". Example:\n%s\n' % (api_deps, ''.join(error_msg)))) >+ > return results > >+def CheckAbslMemoryInclude(input_api, output_api, source_file_filter): >+ pattern = input_api.re.compile( >+ r'^#include\s*"absl/memory/memory.h"', input_api.re.MULTILINE) >+ file_filter = lambda f: (f.LocalPath().endswith(('.cc', '.h')) >+ and source_file_filter(f)) >+ >+ files = [] >+ for f in input_api.AffectedFiles( >+ include_deletes=False, file_filter=file_filter): >+ contents = input_api.ReadFile(f) >+ if pattern.search(contents): >+ continue >+ for _, line in f.ChangedContents(): >+ if 'absl::make_unique' in line: >+ files.append(f) >+ break >+ >+ if len(files): >+ return [output_api.PresubmitError( >+ 'Please include "absl/memory/memory.h" header for' >+ ' absl::make_unique.\nThis header may or may not be included' >+ ' transitively, depending on the C++ standard version.', >+ files)] >+ return [] > > def CheckChangeOnUpload(input_api, output_api): > results = [] >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/WATCHLISTS
b/Source/ThirdParty/libwebrtc/Source/webrtc/WATCHLISTS >index dc8d48c51f95bc8c43cbf0bdb2e6bb0da18f62e6..4260bf91bbc599730e04cd5324e475419d841927 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/WATCHLISTS >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/WATCHLISTS >@@ -79,6 +79,9 @@ > 'bitrate_controller': { > 'filepath': 'modules/bitrate_controller/.*' > }, >+ 'congestion_controller': { >+ 'filepath': 'modules/congestion_controller/.*' >+ }, > 'remote_bitrate_estimator': { > 'filepath': 'modules/remote_bitrate_estimator/.*' > }, >@@ -97,8 +100,8 @@ > }, > > 'WATCHLISTS': { >- 'this_file': [''], >- 'all_webrtc': ['tterriberry@mozilla.com'], >+ 'this_file': [], >+ 'all_webrtc': [], > 'root_files': ['niklas.enbom@webrtc.org', > 'peah@webrtc.org', > 'qiang.lu@intel.com', >@@ -106,16 +109,14 @@ > 'build_files': ['mbonadei@webrtc.org'], > 'common_audio': ['alessiob@webrtc.org', > 'aluebs@webrtc.org', >- 'andrew@webrtc.org', > 'audio-team@agora.io', >- 'bjornv@webrtc.org', > 'minyue@webrtc.org', > 'peah@webrtc.org'], >- 'audio': ['solenberg@webrtc.org', >- 'tina.legrand@webrtc.org'], >+ 'audio': ['solenberg@webrtc.org'], > 'api': ['kwiberg@webrtc.org', > 'solenberg@webrtc.org'], >- 'base': ['kwiberg@webrtc.org'], >+ 'base': ['kwiberg@webrtc.org', >+ 'benwright@webrtc.org'], > 'call': ['mflodman@webrtc.org', > 'solenberg@webrtc.org', > 'stefan@webrtc.org'], >@@ -137,8 +138,7 @@ > 'henrik.lundin@webrtc.org', > 'kwiberg@webrtc.org', > 'minyue@webrtc.org', >- 'peah@webrtc.org', >- 'tina.legrand@webrtc.org'], >+ 'peah@webrtc.org'], > 'neteq': ['alessiob@webrtc.org', > 'audio-team@agora.io', > 'henrik.lundin@webrtc.org', >@@ -149,9 +149,7 @@ > 'audio_processing': ['aleloi@webrtc.org', > 'alessiob@webrtc.org', > 'aluebs@webrtc.org', >- 'andrew@webrtc.org', > 'audio-team@agora.io', >- 'bjornv@webrtc.org', > 'henrik.lundin@webrtc.org', > 'kwiberg@webrtc.org', > 'minyue@webrtc.org', >@@ -166,11 +164,14 @@ > 'zhengzhonghou@agora.io'], > 'bitrate_controller': 
['mflodman@webrtc.org', > 'stefan@webrtc.org', >+ 'srte@webrtc.org', > 'zhuangzesen@agora.io'], >+ 'congestion_controller': ['srte@webrtc.org'], > 'remote_bitrate_estimator': ['mflodman@webrtc.org', > 'stefan@webrtc.org', > 'zhuangzesen@agora.io'], > 'pacing': ['mflodman@webrtc.org', >+ 'srte@webrtc.org', > 'stefan@webrtc.org', > 'zhuangzesen@agora.io'], > 'rtp_rtcp': ['mflodman@webrtc.org', >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/abseil-in-webrtc.md b/Source/ThirdParty/libwebrtc/Source/webrtc/abseil-in-webrtc.md >index 04ae8d28de12b74ccba5cd51a6e521f384329d47..1075aa9b7962015e2e973c3550c5e69b55da25da 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/abseil-in-webrtc.md >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/abseil-in-webrtc.md >@@ -15,6 +15,10 @@ adds the first use. > * `absl::make_unique` and `absl::WrapUnique` > * `absl::optional` and related stuff from `absl/types/optional.h`. > * `absl::string_view` >+* The functions in `absl/strings/ascii.h` and `absl/strings/match.h` >+* `absl::is_trivially_copy_constructible`, >+ `absl::is_trivially_copy_assignable`, and >+ `absl::is_trivially_destructible` from `absl/meta/type_traits.h`. > * `absl::variant` and related stuff from `absl/types/variant.h`. 
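The vocabulary types in the allowed list above can be illustrated with a short sketch. Abseil itself may not be available in a standalone build, so the C++17 standard-library equivalents (`std::optional`, `std::string_view`, `std::make_unique`) stand in here for `absl::optional`, `absl::string_view`, and `absl::make_unique`; the function names are hypothetical and not part of the patch.

```cpp
#include <memory>
#include <optional>
#include <string>
#include <string_view>

// Hypothetical helper: parse a codec name, returning nullopt when empty.
// std::optional and std::string_view play the role that absl::optional and
// absl::string_view play in WebRTC API headers.
std::optional<std::string> ParseCodecName(std::string_view line) {
  if (line.empty())
    return std::nullopt;
  return std::string(line);
}

// absl::make_unique is the pre-C++14 spelling of std::make_unique used here.
std::unique_ptr<std::string> MakeOwnedName(std::string_view line) {
  return std::make_unique<std::string>(line);
}
```

`absl::optional` fills the same role in API headers that `std::optional` fills in this sketch: an explicit "no value" state without sentinel values or raw pointers.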
> > ## **Disallowed** >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/api/BUILD.gn >index 5b21334fe9859e4eb4d04affb346253f89b7c7be..fe27c7b8880b8f2eba155f39ef7fb49a3629bc8d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/BUILD.gn >@@ -44,6 +44,33 @@ rtc_source_set("callfactory_api") { > ] > } > >+rtc_static_library("create_peerconnection_factory") { >+ visibility = [ "*" ] >+ allow_poison = [ "software_video_codecs" ] >+ sources = [ >+ "create_peerconnection_factory.cc", >+ "create_peerconnection_factory.h", >+ ] >+ deps = [ >+ ":callfactory_api", >+ ":fec_controller_api", >+ ":libjingle_peerconnection_api", >+ "../logging:rtc_event_log_api", >+ "../logging:rtc_event_log_impl_base", >+ "../media:rtc_audio_video", >+ "../modules/audio_device:audio_device_api", >+ "../modules/audio_processing:api", >+ "../pc:peerconnection", >+ "../rtc_base:ptr_util", >+ "../rtc_base:rtc_base", >+ "../rtc_base:rtc_base_approved", >+ "audio:audio_mixer_api", >+ "audio_codecs:audio_codecs_api", >+ "transport:network_control", >+ "video_codecs:video_codecs_api", >+ ] >+} >+ > rtc_static_library("libjingle_peerconnection_api") { > visibility = [ "*" ] > cflags = [] >@@ -52,6 +79,8 @@ rtc_static_library("libjingle_peerconnection_api") { > "bitrate_constraints.h", > "candidate.cc", > "candidate.h", >+ "crypto/cryptooptions.cc", >+ "crypto/cryptooptions.h", > "crypto/framedecryptorinterface.h", > "crypto/frameencryptorinterface.h", > "cryptoparams.h", >@@ -63,6 +92,7 @@ rtc_static_library("libjingle_peerconnection_api") { > "jsepicecandidate.cc", > "jsepicecandidate.h", > "jsepsessiondescription.h", >+ "media_transport_interface.cc", > "media_transport_interface.h", > "mediaconstraintsinterface.cc", > "mediaconstraintsinterface.h", >@@ -109,6 +139,7 @@ rtc_static_library("libjingle_peerconnection_api") { > "audio_codecs:audio_codecs_api", > 
"transport:bitrate_settings", > "transport:network_control", >+ "video:encoded_image", > "video:video_frame", > "//third_party/abseil-cpp/absl/types:optional", > >@@ -124,6 +155,7 @@ rtc_static_library("libjingle_peerconnection_api") { > "../rtc_base:rtc_base", > "../rtc_base:rtc_base_approved", > "../rtc_base:stringutils", >+ "../rtc_base/system:rtc_export", > ] > > if (is_nacl) { >@@ -132,6 +164,13 @@ rtc_static_library("libjingle_peerconnection_api") { > } > } > >+rtc_source_set("scoped_refptr") { >+ visibility = [ "*" ] >+ sources = [ >+ "scoped_refptr.h", >+ ] >+} >+ > rtc_source_set("video_quality_test_fixture_api") { > visibility = [ "*" ] > testonly = true >@@ -164,6 +203,7 @@ rtc_source_set("test_dependency_factory") { > deps = [ > ":video_quality_test_fixture_api", > "../rtc_base:thread_checker", >+ "//third_party/abseil-cpp/absl/memory", > ] > if (!build_with_chromium && is_clang) { > # Suppress warnings from the Chromium Clang plugin (bugs.webrtc.org/163). >@@ -203,34 +243,15 @@ rtc_source_set("libjingle_logging_api") { > rtc_source_set("ortc_api") { > visibility = [ "*" ] > sources = [ >- "ortc/mediadescription.cc", >- "ortc/mediadescription.h", >- "ortc/ortcfactoryinterface.h", >- "ortc/ortcrtpreceiverinterface.h", >- "ortc/ortcrtpsenderinterface.h", > "ortc/packettransportinterface.h", >- "ortc/rtptransportcontrollerinterface.h", > "ortc/rtptransportinterface.h", >- "ortc/sessiondescription.cc", >- "ortc/sessiondescription.h", > "ortc/srtptransportinterface.h", >- "ortc/udptransportinterface.h", > ] > >- # For mediastreaminterface.h, etc. >- # TODO(deadbeef): Create a separate target for the common things ORTC and >- # PeerConnection code shares, so that ortc_api can depend on that instead of >- # libjingle_peerconnection_api. 
> deps = [ > ":libjingle_peerconnection_api", >- "..:webrtc_common", >- "../rtc_base:rtc_base", > "//third_party/abseil-cpp/absl/types:optional", > ] >- if (!build_with_chromium && is_clang) { >- # Suppress warnings from the Chromium Clang plugin (bugs.webrtc.org/163). >- suppressed_configs += [ "//build/config/clang:find_bad_constructs" ] >- } > } > > rtc_source_set("rtc_stats_api") { >@@ -246,6 +267,7 @@ rtc_source_set("rtc_stats_api") { > deps = [ > "../rtc_base:checks", > "../rtc_base:rtc_base_approved", >+ "../rtc_base/system:rtc_export", > ] > } > >@@ -270,6 +292,17 @@ rtc_source_set("transport_api") { > ] > } > >+rtc_source_set("bitrate_allocation") { >+ visibility = [ "*" ] >+ sources = [ >+ "call/bitrate_allocation.h", >+ ] >+ deps = [ >+ "units:data_rate", >+ "units:time_delta", >+ ] >+} >+ > rtc_source_set("simulated_network_api") { > visibility = [ "*" ] > sources = [ >@@ -347,6 +380,7 @@ if (rtc_include_tests) { > ] > > deps = [ >+ "../modules/audio_processing:api", > "../modules/audio_processing:audio_processing", > "../modules/audio_processing:audioproc_f_impl", > ] >@@ -446,11 +480,33 @@ if (rtc_include_tests) { > ] > } > >- rtc_source_set("fake_frame_crypto") { >+ rtc_source_set("mock_frame_encryptor") { >+ testonly = true >+ sources = [ >+ "test/mock_frame_encryptor.cc", >+ "test/mock_frame_encryptor.h", >+ ] >+ deps = [ >+ ":libjingle_peerconnection_api", >+ "../test:test_support", >+ ] >+ } >+ >+ rtc_source_set("mock_frame_decryptor") { >+ testonly = true >+ sources = [ >+ "test/mock_frame_decryptor.cc", >+ "test/mock_frame_decryptor.h", >+ ] >+ deps = [ >+ ":libjingle_peerconnection_api", >+ "../test:test_support", >+ ] >+ } >+ >+ rtc_source_set("fake_frame_encryptor") { > testonly = true > sources = [ >- "test/fake_frame_decryptor.cc", >- "test/fake_frame_decryptor.h", > "test/fake_frame_encryptor.cc", > "test/fake_frame_encryptor.h", > ] >@@ -463,6 +519,21 @@ if (rtc_include_tests) { > ] > } > >+ rtc_source_set("fake_frame_decryptor") { >+ 
testonly = true >+ sources = [ >+ "test/fake_frame_decryptor.cc", >+ "test/fake_frame_decryptor.h", >+ ] >+ deps = [ >+ ":array_view", >+ ":libjingle_peerconnection_api", >+ "..:webrtc_common", >+ "../rtc_base:checks", >+ "../rtc_base:rtc_base_approved", >+ ] >+ } >+ > rtc_source_set("mock_peerconnectioninterface") { > testonly = true > sources = [ >@@ -500,6 +571,18 @@ if (rtc_include_tests) { > ] > } > >+ rtc_source_set("mock_video_bitrate_allocator_factory") { >+ testonly = true >+ sources = [ >+ "test/mock_video_bitrate_allocator_factory.h", >+ ] >+ >+ deps = [ >+ "../api/video:video_bitrate_allocator_factory", >+ "../test:test_support", >+ ] >+ } >+ > rtc_source_set("mock_video_codec_factory") { > testonly = true > sources = [ >@@ -513,15 +596,72 @@ if (rtc_include_tests) { > ] > } > >+ rtc_source_set("mock_video_decoder") { >+ visibility = [ "*" ] >+ >+ testonly = true >+ sources = [ >+ "test/mock_video_decoder.cc", >+ "test/mock_video_decoder.h", >+ ] >+ >+ deps = [ >+ "../api/video_codecs:video_codecs_api", >+ "../test:test_support", >+ ] >+ } >+ >+ rtc_source_set("mock_video_encoder") { >+ visibility = [ "*" ] >+ >+ testonly = true >+ sources = [ >+ "test/mock_video_encoder.cc", >+ "test/mock_video_encoder.h", >+ ] >+ >+ deps = [ >+ "../api/video_codecs:video_codecs_api", >+ "../test:test_support", >+ ] >+ } >+ >+ rtc_source_set("fake_media_transport") { >+ testonly = true >+ >+ sources = [ >+ "test/fake_media_transport.h", >+ ] >+ >+ deps = [ >+ ":libjingle_peerconnection_api", >+ "../rtc_base:checks", >+ "//third_party/abseil-cpp/absl/memory:memory", >+ ] >+ } >+ >+ rtc_source_set("loopback_media_transport") { >+ testonly = true >+ >+ sources = [ >+ "test/loopback_media_transport.h", >+ ] >+ >+ deps = [ >+ ":libjingle_peerconnection_api", >+ "../rtc_base:checks", >+ "../rtc_base:rtc_base", >+ ] >+ } >+ > rtc_source_set("rtc_api_unittests") { > testonly = true > > sources = [ > "array_view_unittest.cc", >- "ortc/mediadescription_unittest.cc", >- 
"ortc/sessiondescription_unittest.cc", > "rtcerror_unittest.cc", > "rtpparameters_unittest.cc", >+ "test/loopback_media_transport_unittest.cc", > ] > > if (!build_with_chromium && is_clang) { >@@ -532,10 +672,10 @@ if (rtc_include_tests) { > deps = [ > ":array_view", > ":libjingle_peerconnection_api", >- ":ortc_api", >+ ":loopback_media_transport", > "../rtc_base:checks", >+ "../rtc_base:gunit_helpers", > "../rtc_base:rtc_base_approved", >- "../rtc_base:rtc_base_tests_utils", > "../test:test_support", > "units:units_unittests", > ] >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/DEPS b/Source/ThirdParty/libwebrtc/Source/webrtc/api/DEPS >index 847cce259f9ae63a020e0d62a24201bca72457a2..286d8d1e1e6c778b40458f77d065e882212a8a30 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/DEPS >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/DEPS >@@ -1,15 +1,55 @@ >+# This is supposed to be a complete list of top-level directories, >+# excepting only api/ itself. > include_rules = [ >- "+third_party/libyuv", >- "+common_video", >- "+media", >- "+p2p", >- "+pc", >- "+logging/rtc_event_log/rtc_event_log_factory_interface.h", >- "+modules/audio_processing/include", >- "+system_wrappers/include", >+ "-audio", >+ "-base", >+ "-build", >+ "-buildtools", >+ "-build_overrides", >+ "-call", >+ "-common_audio", >+ "-common_video", >+ "-data", >+ "-examples", >+ "-ios", >+ "-infra", >+ "-logging", >+ "-media", >+ "-modules", >+ "-out", >+ "-p2p", >+ "-pc", >+ "-resources", >+ "-rtc_base", >+ "-rtc_tools", >+ "-sdk", >+ "-stats", >+ "-style-guide", >+ "-system_wrappers", >+ "-test", >+ "-testing", >+ "-third_party", >+ "-tools", >+ "-tools_webrtc", >+ "-video", >+ "-external/webrtc/webrtc", # Android platform build. 
>+ "-libyuv", >+ "-common_types.h", >+ "-WebRTC", > ] > > specific_include_rules = { >+ # Some internal headers are allowed even in API headers: >+ ".*\.h": [ >+ "+rtc_base/checks.h", >+ "+rtc_base/system/rtc_export.h", >+ "+rtc_base/units/unit_base.h", >+ ], >+ >+ "array_view\.h": [ >+ "+rtc_base/type_traits.h", >+ ], >+ > # Needed because AudioEncoderOpus is in the wrong place for > # backwards compatibilty reasons. See > # https://bugs.chromium.org/p/webrtc/issues/detail?id=7847 >@@ -17,13 +57,252 @@ specific_include_rules = { > "+modules/audio_coding/codecs/opus/audio_encoder_opus.h", > ], > >- # We allow .cc files in webrtc/api/ to #include a bunch of stuff >- # that's off-limits for the .h files. That's because .h files leak >- # their #includes to whoever's #including them, but .cc files do not >- # since no one #includes them. >+ "asyncresolverfactory\.h": [ >+ "+rtc_base/asyncresolverinterface.h", >+ ], >+ >+ "candidate\.h": [ >+ "+rtc_base/network_constants.h", >+ "+rtc_base/socketaddress.h", >+ ], >+ >+ "create_peerconnection_factory\.h": [ >+ "+rtc_base/scoped_ref_ptr.h", >+ ], >+ >+ "datachannelinterface\.h": [ >+ "+rtc_base/copyonwritebuffer.h", >+ "+rtc_base/refcount.h", >+ ], >+ >+ "dtmfsenderinterface\.h": [ >+ "+rtc_base/refcount.h", >+ ], >+ >+ "fec_controller\.h": [ >+ "+modules/include/module_fec_types.h", >+ ], >+ >+ "jsep\.h": [ >+ "+rtc_base/refcount.h", >+ ], >+ >+ "jsepicecandidate\.h": [ >+ "+rtc_base/constructormagic.h", >+ ], >+ >+ "jsepsessiondescription\.h": [ >+ "+rtc_base/constructormagic.h", >+ ], >+ >+ "mediastreaminterface\.h": [ >+ "+modules/audio_processing/include/audio_processing_statistics.h", >+ "+rtc_base/refcount.h", >+ "+rtc_base/scoped_ref_ptr.h", >+ ], >+ >+ "media_transport_interface\.h": [ >+ "+rtc_base/copyonwritebuffer.h", # As used by datachannelinterface.h >+ "+rtc_base/networkroute.h" >+ ], >+ >+ "peerconnectionfactoryproxy\.h": [ >+ "+rtc_base/bind.h", >+ ], >+ >+ "refcountedbase\.h": [ >+ 
"+rtc_base/constructormagic.h", >+ "+rtc_base/refcount.h", >+ "+rtc_base/refcounter.h", >+ ], >+ >+ "rtcerror\.h": [ >+ "+rtc_base/logging.h", >+ ], >+ >+ "rtpreceiverinterface\.h": [ >+ "+rtc_base/refcount.h", >+ "+rtc_base/scoped_ref_ptr.h", >+ ], >+ >+ "rtpsenderinterface\.h": [ >+ "+rtc_base/refcount.h", >+ "+rtc_base/scoped_ref_ptr.h", >+ ], >+ >+ "rtptransceiverinterface\.h": [ >+ "+rtc_base/refcount.h", >+ ], >+ >+ "setremotedescriptionobserverinterface\.h": [ >+ "+rtc_base/refcount.h", >+ ], >+ >+ "statstypes\.h": [ >+ "+rtc_base/constructormagic.h", >+ "+rtc_base/refcount.h", >+ "+rtc_base/scoped_ref_ptr.h", >+ "+rtc_base/stringencode.h", >+ "+rtc_base/thread_checker.h", >+ ], >+ >+ "umametrics\.h": [ >+ "+rtc_base/refcount.h", >+ ], >+ >+ "audio_frame\.h": [ >+ "+rtc_base/constructormagic.h", >+ ], >+ >+ "audio_mixer\.h": [ >+ "+rtc_base/refcount.h", >+ ], >+ >+ "audio_decoder\.h": [ >+ "+rtc_base/buffer.h", >+ "+rtc_base/constructormagic.h", >+ ], >+ >+ "audio_decoder_factory\.h": [ >+ "+rtc_base/refcount.h", >+ ], >+ >+ "audio_decoder_factory_template\.h": [ >+ "+rtc_base/refcountedobject.h", >+ "+rtc_base/scoped_ref_ptr.h", >+ ], >+ >+ "audio_encoder\.h": [ >+ "+rtc_base/buffer.h", >+ "+rtc_base/deprecation.h", >+ ], >+ >+ "audio_encoder_factory\.h": [ >+ "+rtc_base/refcount.h", >+ ], >+ >+ "audio_encoder_factory_template\.h": [ >+ "+rtc_base/refcountedobject.h", >+ "+rtc_base/scoped_ref_ptr.h", >+ ], >+ >+ "builtin_audio_decoder_factory\.h": [ >+ "+rtc_base/scoped_ref_ptr.h", >+ ], >+ >+ "builtin_audio_encoder_factory\.h": [ >+ "+rtc_base/scoped_ref_ptr.h", >+ ], >+ >+ "framedecryptorinterface\.h": [ >+ "+rtc_base/refcount.h", >+ ], >+ >+ "frameencryptorinterface\.h": [ >+ "+rtc_base/refcount.h", >+ ], >+ >+ "ortcfactoryinterface\.h": [ >+ "+rtc_base/network.h", >+ "+rtc_base/scoped_ref_ptr.h", >+ "+rtc_base/thread.h", >+ ], >+ >+ "udptransportinterface\.h": [ >+ "+rtc_base/socketaddress.h", >+ ], >+ >+ "rtcstatscollectorcallback\.h": [ >+ 
"+rtc_base/refcount.h", >+ "+rtc_base/scoped_ref_ptr.h", >+ ], >+ >+ "rtcstatsreport\.h": [ >+ "+rtc_base/refcount.h", >+ "+rtc_base/refcountedobject.h", >+ "+rtc_base/scoped_ref_ptr.h", >+ ], >+ >+ "audioproc_float\.h": [ >+ "+modules/audio_processing/include/audio_processing.h", >+ ], >+ >+ "fake_frame_decryptor\.h": [ >+ "+rtc_base/refcountedobject.h", >+ ], >+ >+ "fake_frame_encryptor\.h": [ >+ "+rtc_base/refcountedobject.h", >+ ], >+ >+ "fakeconstraints\.h": [ >+ "+rtc_base/stringencode.h", >+ ], >+ >+ "mock.*\.h": [ >+ "+test/gmock.h", >+ ], >+ >+ "simulated_network\.h": [ >+ "+rtc_base/criticalsection.h", >+ "+rtc_base/random.h", >+ "+rtc_base/thread_annotations.h", >+ ], >+ >+ "test_dependency_factory\.h": [ >+ "+rtc_base/thread_checker.h", >+ ], >+ >+ "videocodec_test_fixture\.h": [ >+ "+modules/video_coding/include/video_codec_interface.h" >+ ], >+ >+ "i010_buffer\.h": [ >+ "+rtc_base/memory/aligned_malloc.h" >+ ], >+ >+ "i420_buffer\.h": [ >+ "+rtc_base/memory/aligned_malloc.h", >+ ], >+ >+ "video_frame_buffer\.h": [ >+ "+rtc_base/refcount.h", >+ "+rtc_base/scoped_ref_ptr.h", >+ ], >+ >+ "video_timing\.h": [ >+ "+rtc_base/numerics/safe_conversions.h", >+ ], >+ >+ "video_encoder_config\.h": [ >+ "+rtc_base/refcount.h", >+ "+rtc_base/scoped_ref_ptr.h", >+ ], >+ >+ # .cc files in api/ should not be restricted in what they can #include, >+ # so we re-add all the top-level directories here. (That's because .h >+ # files leak their #includes to whoever's #including them, but .cc files >+ # do not since no one #includes them.) 
> ".*\.cc": [ >- "+modules/audio_coding", >- "+modules/audio_processing", >- "+modules/video_coding", >+ "+audio", >+ "+call", >+ "+common_audio", >+ "+common_video", >+ "+examples", >+ "+logging", >+ "+media", >+ "+modules", >+ "+p2p", >+ "+pc", >+ "+rtc_base", >+ "+rtc_tools", >+ "+sdk", >+ "+stats", >+ "+system_wrappers", >+ "+test", >+ "+tools", >+ "+tools_webrtc", >+ "+video", >+ "+third_party", > ], > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/OWNERS b/Source/ThirdParty/libwebrtc/Source/webrtc/api/OWNERS >index 593de230d37879a2598f5859df2d432aacfb32b2..1ae5c646e2317abae35464ba83fa3ffe0be12a0b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/OWNERS >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/OWNERS >@@ -13,3 +13,10 @@ per-file peerconnection*=hbos@webrtc.org > > per-file *.gn=phoglund@webrtc.org > per-file *.gni=phoglund@webrtc.org >+ >+per-file DEPS=mbonadei@webrtc.org >+per-file DEPS=kwiberg@webrtc.org >+ >+per-file media_transport*=sukhanov@webrtc.org >+per-file media_transport*=psla@webrtc.org >+per-file media_transport*=mellem@webrtc.org >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/BUILD.gn >index 286a5a6f1c6cf523f420fe4ff7c927b439bc5015..b66be763431f72b33f000f541dcc0c1da18389af 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/BUILD.gn >@@ -39,6 +39,27 @@ rtc_source_set("aec3_config") { > "echo_canceller3_config.cc", > "echo_canceller3_config.h", > ] >+ deps = [ >+ "../../rtc_base:checks", >+ "../../rtc_base:rtc_base_approved", >+ "../../rtc_base:safe_minmax", >+ "../../rtc_base/system:rtc_export", >+ ] >+} >+ >+rtc_source_set("aec3_config_json") { >+ visibility = [ "*" ] >+ sources = [ >+ "echo_canceller3_config_json.cc", >+ "echo_canceller3_config_json.h", >+ ] >+ deps = [ >+ ":aec3_config", >+ "../../rtc_base:rtc_base_approved", >+ 
"../../rtc_base:rtc_json", >+ "../../rtc_base/system:rtc_export", >+ "//third_party/abseil-cpp/absl/strings:strings", >+ ] > } > > rtc_source_set("aec3_factory") { >@@ -54,6 +75,7 @@ rtc_source_set("aec3_factory") { > ":echo_control", > "../../modules/audio_processing/aec3", > "../../rtc_base:rtc_base_approved", >+ "../../rtc_base/system:rtc_export", > "//third_party/abseil-cpp/absl/memory", > ] > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/echo_canceller3_config.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/echo_canceller3_config.cc >index 3b03d13c1ca51d913e9ee8429859a5e31529c110..3eb2a8d2a009b8fe1a4ef870e76dce4709189274 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/echo_canceller3_config.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/echo_canceller3_config.cc >@@ -9,7 +9,36 @@ > */ > #include "api/audio/echo_canceller3_config.h" > >+#include <algorithm> >+#include <cmath> >+ >+#include "rtc_base/checks.h" >+#include "rtc_base/numerics/safe_minmax.h" >+ > namespace webrtc { >+namespace { >+bool Limit(float* value, float min, float max) { >+ float clamped = rtc::SafeClamp(*value, min, max); >+ clamped = std::isfinite(clamped) ? 
clamped : min; >+ bool res = *value == clamped; >+ *value = clamped; >+ return res; >+} >+ >+bool Limit(size_t* value, size_t min, size_t max) { >+ size_t clamped = rtc::SafeClamp(*value, min, max); >+ bool res = *value == clamped; >+ *value = clamped; >+ return res; >+} >+ >+bool Limit(int* value, int min, int max) { >+ int clamped = rtc::SafeClamp(*value, min, max); >+ bool res = *value == clamped; >+ *value = clamped; >+ return res; >+} >+} // namespace > > EchoCanceller3Config::EchoCanceller3Config() = default; > EchoCanceller3Config::EchoCanceller3Config(const EchoCanceller3Config& e) = >@@ -18,9 +47,6 @@ EchoCanceller3Config::Delay::Delay() = default; > EchoCanceller3Config::Delay::Delay(const EchoCanceller3Config::Delay& e) = > default; > >-EchoCanceller3Config::Mask::Mask() = default; >-EchoCanceller3Config::Mask::Mask(const EchoCanceller3Config::Mask& m) = default; >- > EchoCanceller3Config::EchoModel::EchoModel() = default; > EchoCanceller3Config::EchoModel::EchoModel( > const EchoCanceller3Config::EchoModel& e) = default; >@@ -51,4 +77,179 @@ EchoCanceller3Config::Suppressor::Tuning::Tuning(MaskingThresholds mask_lf, > EchoCanceller3Config::Suppressor::Tuning::Tuning( > const EchoCanceller3Config::Suppressor::Tuning& e) = default; > >+bool EchoCanceller3Config::Validate(EchoCanceller3Config* config) { >+ RTC_DCHECK(config); >+ EchoCanceller3Config* c = config; >+ bool res = true; >+ >+ if (c->delay.down_sampling_factor != 4 && >+ c->delay.down_sampling_factor != 8) { >+ c->delay.down_sampling_factor = 4; >+ res = false; >+ } >+ if (c->delay.delay_headroom_blocks <= 1 && >+ c->delay.hysteresis_limit_1_blocks == 1) { >+ c->delay.hysteresis_limit_1_blocks = 0; >+ res = false; >+ } >+ res = res & Limit(&c->delay.default_delay, 0, 5000); >+ res = res & Limit(&c->delay.num_filters, 0, 5000); >+ res = res & Limit(&c->delay.api_call_jitter_blocks, 1, 5000); >+ res = res & Limit(&c->delay.min_echo_path_delay_blocks, 0, 5000); >+ res = res & 
Limit(&c->delay.delay_headroom_blocks, 0, 5000); >+ res = res & Limit(&c->delay.hysteresis_limit_1_blocks, 0, 5000); >+ res = res & Limit(&c->delay.hysteresis_limit_2_blocks, 0, 5000); >+ res = res & Limit(&c->delay.skew_hysteresis_blocks, 0, 5000); >+ res = res & Limit(&c->delay.fixed_capture_delay_samples, 0, 5000); >+ res = res & Limit(&c->delay.delay_estimate_smoothing, 0.f, 1.f); >+ res = res & Limit(&c->delay.delay_candidate_detection_threshold, 0.f, 1.f); >+ res = res & Limit(&c->delay.delay_selection_thresholds.initial, 1, 250); >+ res = res & Limit(&c->delay.delay_selection_thresholds.converged, 1, 250); >+ >+ res = res & Limit(&c->filter.main.length_blocks, 1, 50); >+ res = res & Limit(&c->filter.main.leakage_converged, 0.f, 1000.f); >+ res = res & Limit(&c->filter.main.leakage_diverged, 0.f, 1000.f); >+ res = res & Limit(&c->filter.main.error_floor, 0.f, 1000.f); >+ res = res & Limit(&c->filter.main.error_ceil, 0.f, 100000000.f); >+ res = res & Limit(&c->filter.main.noise_gate, 0.f, 100000000.f); >+ >+ res = res & Limit(&c->filter.main_initial.length_blocks, 1, 50); >+ res = res & Limit(&c->filter.main_initial.leakage_converged, 0.f, 1000.f); >+ res = res & Limit(&c->filter.main_initial.leakage_diverged, 0.f, 1000.f); >+ res = res & Limit(&c->filter.main_initial.error_floor, 0.f, 1000.f); >+ res = res & Limit(&c->filter.main_initial.error_ceil, 0.f, 100000000.f); >+ res = res & Limit(&c->filter.main_initial.noise_gate, 0.f, 100000000.f); >+ >+ if (c->filter.main.length_blocks < c->filter.main_initial.length_blocks) { >+ c->filter.main_initial.length_blocks = c->filter.main.length_blocks; >+ res = false; >+ } >+ >+ res = res & Limit(&c->filter.shadow.length_blocks, 1, 50); >+ res = res & Limit(&c->filter.shadow.rate, 0.f, 1.f); >+ res = res & Limit(&c->filter.shadow.noise_gate, 0.f, 100000000.f); >+ >+ res = res & Limit(&c->filter.shadow_initial.length_blocks, 1, 50); >+ res = res & Limit(&c->filter.shadow_initial.rate, 0.f, 1.f); >+ res = res & 
Limit(&c->filter.shadow_initial.noise_gate, 0.f, 100000000.f); >+ >+ if (c->filter.shadow.length_blocks < c->filter.shadow_initial.length_blocks) { >+ c->filter.shadow_initial.length_blocks = c->filter.shadow.length_blocks; >+ res = false; >+ } >+ >+ res = res & Limit(&c->filter.config_change_duration_blocks, 0, 100000); >+ res = res & Limit(&c->filter.initial_state_seconds, 0.f, 100.f); >+ >+ res = res & Limit(&c->erle.min, 1.f, 100000.f); >+ res = res & Limit(&c->erle.max_l, 1.f, 100000.f); >+ res = res & Limit(&c->erle.max_h, 1.f, 100000.f); >+ if (c->erle.min > c->erle.max_l || c->erle.min > c->erle.max_h) { >+ c->erle.min = std::min(c->erle.max_l, c->erle.max_h); >+ res = false; >+ } >+ res = res & Limit(&c->erle.num_sections, 1, c->filter.main.length_blocks); >+ >+ res = res & Limit(&c->ep_strength.lf, 0.f, 1000000.f); >+ res = res & Limit(&c->ep_strength.mf, 0.f, 1000000.f); >+ res = res & Limit(&c->ep_strength.hf, 0.f, 1000000.f); >+ res = res & Limit(&c->ep_strength.default_len, -1.f, 1.f); >+ >+ res = >+ res & Limit(&c->echo_audibility.low_render_limit, 0.f, 32768.f * 32768.f); >+ res = res & >+ Limit(&c->echo_audibility.normal_render_limit, 0.f, 32768.f * 32768.f); >+ res = res & Limit(&c->echo_audibility.floor_power, 0.f, 32768.f * 32768.f); >+ res = res & Limit(&c->echo_audibility.audibility_threshold_lf, 0.f, >+ 32768.f * 32768.f); >+ res = res & Limit(&c->echo_audibility.audibility_threshold_mf, 0.f, >+ 32768.f * 32768.f); >+ res = res & Limit(&c->echo_audibility.audibility_threshold_hf, 0.f, >+ 32768.f * 32768.f); >+ >+ res = res & >+ Limit(&c->render_levels.active_render_limit, 0.f, 32768.f * 32768.f); >+ res = res & Limit(&c->render_levels.poor_excitation_render_limit, 0.f, >+ 32768.f * 32768.f); >+ res = res & Limit(&c->render_levels.poor_excitation_render_limit_ds8, 0.f, >+ 32768.f * 32768.f); >+ >+ res = >+ res & Limit(&c->echo_removal_control.gain_rampup.initial_gain, 0.f, 1.f); >+ res = res & 
Limit(&c->echo_removal_control.gain_rampup.first_non_zero_gain, >+ 0.f, 1.f); >+ res = res & Limit(&c->echo_removal_control.gain_rampup.non_zero_gain_blocks, >+ 0, 100000); >+ res = res & >+ Limit(&c->echo_removal_control.gain_rampup.full_gain_blocks, 0, 100000); >+ >+ res = res & Limit(&c->echo_model.noise_floor_hold, 0, 1000); >+ res = res & Limit(&c->echo_model.min_noise_floor_power, 0, 2000000.f); >+ res = res & Limit(&c->echo_model.stationary_gate_slope, 0, 1000000.f); >+ res = res & Limit(&c->echo_model.noise_gate_power, 0, 1000000.f); >+ res = res & Limit(&c->echo_model.noise_gate_slope, 0, 1000000.f); >+ res = res & Limit(&c->echo_model.render_pre_window_size, 0, 100); >+ res = res & Limit(&c->echo_model.render_post_window_size, 0, 100); >+ res = res & Limit(&c->echo_model.render_pre_window_size_init, 0, 100); >+ res = res & Limit(&c->echo_model.render_post_window_size_init, 0, 100); >+ res = res & Limit(&c->echo_model.nonlinear_hold, 0, 100); >+ res = res & Limit(&c->echo_model.nonlinear_release, 0, 1.f); >+ >+ res = res & Limit(&c->suppressor.nearend_average_blocks, 1, 5000); >+ >+ res = res & >+ Limit(&c->suppressor.normal_tuning.mask_lf.enr_transparent, 0.f, 100.f); >+ res = res & >+ Limit(&c->suppressor.normal_tuning.mask_lf.enr_suppress, 0.f, 100.f); >+ res = res & >+ Limit(&c->suppressor.normal_tuning.mask_lf.emr_transparent, 0.f, 100.f); >+ res = res & >+ Limit(&c->suppressor.normal_tuning.mask_hf.enr_transparent, 0.f, 100.f); >+ res = res & >+ Limit(&c->suppressor.normal_tuning.mask_hf.enr_suppress, 0.f, 100.f); >+ res = res & >+ Limit(&c->suppressor.normal_tuning.mask_hf.emr_transparent, 0.f, 100.f); >+ res = res & Limit(&c->suppressor.normal_tuning.max_inc_factor, 0.f, 100.f); >+ res = res & Limit(&c->suppressor.normal_tuning.max_dec_factor_lf, 0.f, 100.f); >+ >+ res = res & Limit(&c->suppressor.nearend_tuning.mask_lf.enr_transparent, 0.f, >+ 100.f); >+ res = res & >+ Limit(&c->suppressor.nearend_tuning.mask_lf.enr_suppress, 0.f, 100.f); >+ res = 
res & Limit(&c->suppressor.nearend_tuning.mask_lf.emr_transparent, 0.f, >+ 100.f); >+ res = res & Limit(&c->suppressor.nearend_tuning.mask_hf.enr_transparent, 0.f, >+ 100.f); >+ res = res & >+ Limit(&c->suppressor.nearend_tuning.mask_hf.enr_suppress, 0.f, 100.f); >+ res = res & Limit(&c->suppressor.nearend_tuning.mask_hf.emr_transparent, 0.f, >+ 100.f); >+ res = res & Limit(&c->suppressor.nearend_tuning.max_inc_factor, 0.f, 100.f); >+ res = >+ res & Limit(&c->suppressor.nearend_tuning.max_dec_factor_lf, 0.f, 100.f); >+ >+ res = res & Limit(&c->suppressor.dominant_nearend_detection.enr_threshold, >+ 0.f, 1000000.f); >+ res = res & Limit(&c->suppressor.dominant_nearend_detection.snr_threshold, >+ 0.f, 1000000.f); >+ res = res & Limit(&c->suppressor.dominant_nearend_detection.hold_duration, 0, >+ 10000); >+ res = res & Limit(&c->suppressor.dominant_nearend_detection.trigger_threshold, >+ 0, 10000); >+ >+ res = res & Limit(&c->suppressor.high_bands_suppression.enr_threshold, 0.f, >+ 1000000.f); >+ res = res & Limit(&c->suppressor.high_bands_suppression.max_gain_during_echo, >+ 0.f, 1.f); >+ >+ res = res & Limit(&c->suppressor.floor_first_increase, 0.f, 1000000.f); >+ >+ if (c->delay.delay_headroom_blocks > >+ c->filter.main_initial.length_blocks - 1) { >+ c->delay.delay_headroom_blocks = c->filter.main_initial.length_blocks - 1; >+ res = false; >+ } >+ >+ return res; >+} > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/echo_canceller3_config.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/echo_canceller3_config.h >index e7376ed7bcd3c4eb9c826c4344172051ad6ebdd5..ea6e51baf9c53d1236f0759556b26e2d921ac769 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/echo_canceller3_config.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/echo_canceller3_config.h >@@ -13,18 +13,31 @@ > > #include <stddef.h> // size_t > >+#include "rtc_base/system/rtc_export.h" >+ > namespace webrtc { > > // Configuration struct 
for EchoCanceller3 >-struct EchoCanceller3Config { >+struct RTC_EXPORT EchoCanceller3Config { >+ // Checks and updates the config parameters to lie within (mostly) reasonable >+ // ranges. Returns true if and only if the config did not need to be changed. >+ static bool Validate(EchoCanceller3Config* config); >+ > EchoCanceller3Config(); > EchoCanceller3Config(const EchoCanceller3Config& e); >+ >+ struct Buffering { >+ bool use_new_render_buffering = true; >+ size_t excess_render_detection_interval_blocks = 250; >+ size_t max_allowed_excess_render_blocks = 8; >+ } buffering; >+ > struct Delay { > Delay(); > Delay(const Delay& e); > size_t default_delay = 5; > size_t down_sampling_factor = 4; >- size_t num_filters = 6; >+ size_t num_filters = 5; > size_t api_call_jitter_blocks = 26; > size_t min_echo_path_delay_blocks = 0; > size_t delay_headroom_blocks = 2; >@@ -37,7 +50,7 @@ struct EchoCanceller3Config { > struct DelaySelectionThresholds { > int initial; > int converged; >- } delay_selection_thresholds = {25, 25}; >+ } delay_selection_thresholds = {5, 20}; > } delay; > > struct Filter { >@@ -46,6 +59,7 @@ struct EchoCanceller3Config { > float leakage_converged; > float leakage_diverged; > float error_floor; >+ float error_ceil; > float noise_gate; > }; > >@@ -55,10 +69,11 @@ struct EchoCanceller3Config { > float noise_gate; > }; > >- MainConfiguration main = {13, 0.00005f, 0.01f, 0.1f, 20075344.f}; >+ MainConfiguration main = {13, 0.00005f, 0.05f, 0.001f, 2.f, 20075344.f}; > ShadowConfiguration shadow = {13, 0.7f, 20075344.f}; > >- MainConfiguration main_initial = {12, 0.005f, 0.5f, 0.001f, 20075344.f}; >+ MainConfiguration main_initial = {12, 0.005f, 0.5f, >+ 0.001f, 2.f, 20075344.f}; > ShadowConfiguration shadow_initial = {12, 0.9f, 20075344.f}; > > size_t config_change_duration_blocks = 250; >@@ -72,38 +87,19 @@ struct EchoCanceller3Config { > float max_l = 4.f; > float max_h = 1.5f; > bool onset_detection = true; >+ size_t num_sections = 1; > } erle; > > 
struct EpStrength { > float lf = 1.f; > float mf = 1.f; > float hf = 1.f; >- float default_len = 0.88f; >+ float default_len = 0.83f; > bool reverb_based_on_render = true; > bool echo_can_saturate = true; > bool bounded_erl = false; > } ep_strength; > >- struct Mask { >- Mask(); >- Mask(const Mask& m); >- float m0 = 0.1f; >- float m1 = 0.01f; >- float m2 = 0.0001f; >- float m3 = 0.01f; >- float m5 = 0.01f; >- float m6 = 0.0001f; >- float m7 = 0.01f; >- float m8 = 0.0001f; >- float m9 = 0.1f; >- >- float gain_curve_offset = 1.45f; >- float gain_curve_slope = 5.f; >- float temporal_masking_lf = 0.9f; >- float temporal_masking_hf = 0.6f; >- size_t temporal_masking_lf_bands = 3; >- } gain_mask; >- > struct EchoAudibility { > float low_render_limit = 4 * 64.f; > float normal_render_limit = 64.f; >@@ -111,7 +107,7 @@ struct EchoCanceller3Config { > float audibility_threshold_lf = 10; > float audibility_threshold_mf = 10; > float audibility_threshold_hf = 10; >- bool use_stationary_properties = true; >+ bool use_stationary_properties = false; > bool use_stationarity_properties_at_init = false; > } echo_audibility; > >@@ -176,20 +172,22 @@ struct EchoCanceller3Config { > float max_dec_factor_lf; > }; > >- Tuning normal_tuning = Tuning(MaskingThresholds(.2f, .3f, .3f), >+ Tuning normal_tuning = Tuning(MaskingThresholds(.3f, .4f, .3f), > MaskingThresholds(.07f, .1f, .3f), > 2.0f, > 0.25f); >- Tuning nearend_tuning = Tuning(MaskingThresholds(.2f, .3f, .3f), >- MaskingThresholds(.07f, .1f, .3f), >+ Tuning nearend_tuning = Tuning(MaskingThresholds(1.09f, 1.1f, .3f), >+ MaskingThresholds(.1f, .3f, .3f), > 2.0f, > 0.25f); > > struct DominantNearendDetection { >- float enr_threshold = 10.f; >- float snr_threshold = 10.f; >- int hold_duration = 25; >- int trigger_threshold = 15; >+ float enr_threshold = 4.f; >+ float enr_exit_threshold = .1f; >+ float snr_threshold = 30.f; >+ int hold_duration = 50; >+ int trigger_threshold = 12; >+ bool use_during_initial_phase = true; > } 
dominant_nearend_detection; > > struct HighBandsSuppression { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/echo_canceller3_config_json.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/echo_canceller3_config_json.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..01a831cb4c996082cca08255df5db84effb1edf2 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/echo_canceller3_config_json.cc >@@ -0,0 +1,583 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+#include "api/audio/echo_canceller3_config_json.h" >+ >+#include <string> >+#include <vector> >+ >+#include "rtc_base/logging.h" >+#include "rtc_base/strings/json.h" >+#include "rtc_base/strings/string_builder.h" >+ >+namespace webrtc { >+namespace { >+void ReadParam(const Json::Value& root, std::string param_name, bool* param) { >+ RTC_DCHECK(param); >+ bool v; >+ if (rtc::GetBoolFromJsonObject(root, param_name, &v)) { >+ *param = v; >+ } >+} >+ >+void ReadParam(const Json::Value& root, std::string param_name, size_t* param) { >+ RTC_DCHECK(param); >+ int v; >+ if (rtc::GetIntFromJsonObject(root, param_name, &v)) { >+ RTC_DCHECK_GE(v, 0); >+ *param = v; >+ } >+} >+ >+void ReadParam(const Json::Value& root, std::string param_name, int* param) { >+ RTC_DCHECK(param); >+ int v; >+ if (rtc::GetIntFromJsonObject(root, param_name, &v)) { >+ *param = v; >+ } >+} >+ >+void ReadParam(const Json::Value& root, std::string param_name, float* param) { >+ RTC_DCHECK(param); >+ double v; >+ if (rtc::GetDoubleFromJsonObject(root, param_name, &v)) { >+ *param = static_cast<float>(v); >+ } >+} 
>+ >+void ReadParam(const Json::Value& root, >+ std::string param_name, >+ EchoCanceller3Config::Filter::MainConfiguration* param) { >+ RTC_DCHECK(param); >+ Json::Value json_array; >+ if (rtc::GetValueFromJsonObject(root, param_name, &json_array)) { >+ std::vector<double> v; >+ rtc::JsonArrayToDoubleVector(json_array, &v); >+ if (v.size() != 6) { >+ RTC_LOG(LS_ERROR) << "Incorrect array size for " << param_name; >+ return; >+ } >+ param->length_blocks = static_cast<size_t>(v[0]); >+ param->leakage_converged = static_cast<float>(v[1]); >+ param->leakage_diverged = static_cast<float>(v[2]); >+ param->error_floor = static_cast<float>(v[3]); >+ param->error_ceil = static_cast<float>(v[4]); >+ param->noise_gate = static_cast<float>(v[5]); >+ } >+} >+ >+void ReadParam(const Json::Value& root, >+ std::string param_name, >+ EchoCanceller3Config::Filter::ShadowConfiguration* param) { >+ RTC_DCHECK(param); >+ Json::Value json_array; >+ if (rtc::GetValueFromJsonObject(root, param_name, &json_array)) { >+ std::vector<double> v; >+ rtc::JsonArrayToDoubleVector(json_array, &v); >+ if (v.size() != 3) { >+ RTC_LOG(LS_ERROR) << "Incorrect array size for " << param_name; >+ return; >+ } >+ param->length_blocks = static_cast<size_t>(v[0]); >+ param->rate = static_cast<float>(v[1]); >+ param->noise_gate = static_cast<float>(v[2]); >+ } >+} >+ >+void ReadParam(const Json::Value& root, >+ std::string param_name, >+ EchoCanceller3Config::Suppressor::MaskingThresholds* param) { >+ RTC_DCHECK(param); >+ Json::Value json_array; >+ if (rtc::GetValueFromJsonObject(root, param_name, &json_array)) { >+ std::vector<double> v; >+ rtc::JsonArrayToDoubleVector(json_array, &v); >+ if (v.size() != 3) { >+ RTC_LOG(LS_ERROR) << "Incorrect array size for " << param_name; >+ return; >+ } >+ param->enr_transparent = static_cast<float>(v[0]); >+ param->enr_suppress = static_cast<float>(v[1]); >+ param->emr_transparent = static_cast<float>(v[2]); >+ } >+} >+} // namespace >+ >+void 
Aec3ConfigFromJsonString(absl::string_view json_string, >+ EchoCanceller3Config* config, >+ bool* parsing_successful) { >+ RTC_DCHECK(config); >+ RTC_DCHECK(parsing_successful); >+ EchoCanceller3Config& cfg = *config; >+ cfg = EchoCanceller3Config(); >+ *parsing_successful = true; >+ >+ Json::Value root; >+ bool success = Json::Reader().parse(std::string(json_string), root); >+ if (!success) { >+ RTC_LOG(LS_ERROR) << "Incorrect JSON format: " << json_string; >+ *parsing_successful = false; >+ return; >+ } >+ >+ Json::Value aec3_root; >+ success = rtc::GetValueFromJsonObject(root, "aec3", &aec3_root); >+ if (!success) { >+ RTC_LOG(LS_ERROR) << "Missing AEC3 config field: " << json_string; >+ *parsing_successful = false; >+ return; >+ } >+ >+ Json::Value section; >+ if (rtc::GetValueFromJsonObject(root, "buffering", &section)) { >+ ReadParam(section, "use_new_render_buffering", >+ &cfg.buffering.use_new_render_buffering); >+ ReadParam(section, "excess_render_detection_interval_blocks", >+ &cfg.buffering.excess_render_detection_interval_blocks); >+ ReadParam(section, "max_allowed_excess_render_blocks", >+ &cfg.buffering.max_allowed_excess_render_blocks); >+ } >+ >+ if (rtc::GetValueFromJsonObject(aec3_root, "delay", &section)) { >+ ReadParam(section, "default_delay", &cfg.delay.default_delay); >+ ReadParam(section, "down_sampling_factor", &cfg.delay.down_sampling_factor); >+ ReadParam(section, "num_filters", &cfg.delay.num_filters); >+ ReadParam(section, "api_call_jitter_blocks", >+ &cfg.delay.api_call_jitter_blocks); >+ ReadParam(section, "min_echo_path_delay_blocks", >+ &cfg.delay.min_echo_path_delay_blocks); >+ ReadParam(section, "delay_headroom_blocks", >+ &cfg.delay.delay_headroom_blocks); >+ ReadParam(section, "hysteresis_limit_1_blocks", >+ &cfg.delay.hysteresis_limit_1_blocks); >+ ReadParam(section, "hysteresis_limit_2_blocks", >+ &cfg.delay.hysteresis_limit_2_blocks); >+ ReadParam(section, "skew_hysteresis_blocks", >+ &cfg.delay.skew_hysteresis_blocks); >+ 
ReadParam(section, "fixed_capture_delay_samples", >+ &cfg.delay.fixed_capture_delay_samples); >+ ReadParam(section, "delay_estimate_smoothing", >+ &cfg.delay.delay_estimate_smoothing); >+ ReadParam(section, "delay_candidate_detection_threshold", >+ &cfg.delay.delay_candidate_detection_threshold); >+ >+ Json::Value subsection; >+ if (rtc::GetValueFromJsonObject(section, "delay_selection_thresholds", >+ &subsection)) { >+ ReadParam(subsection, "initial", >+ &cfg.delay.delay_selection_thresholds.initial); >+ ReadParam(subsection, "converged", >+ &cfg.delay.delay_selection_thresholds.converged); >+ } >+ } >+ >+ if (rtc::GetValueFromJsonObject(aec3_root, "filter", &section)) { >+ ReadParam(section, "main", &cfg.filter.main); >+ ReadParam(section, "shadow", &cfg.filter.shadow); >+ ReadParam(section, "main_initial", &cfg.filter.main_initial); >+ ReadParam(section, "shadow_initial", &cfg.filter.shadow_initial); >+ ReadParam(section, "config_change_duration_blocks", >+ &cfg.filter.config_change_duration_blocks); >+ ReadParam(section, "initial_state_seconds", >+ &cfg.filter.initial_state_seconds); >+ ReadParam(section, "conservative_initial_phase", >+ &cfg.filter.conservative_initial_phase); >+ ReadParam(section, "enable_shadow_filter_output_usage", >+ &cfg.filter.enable_shadow_filter_output_usage); >+ } >+ >+ if (rtc::GetValueFromJsonObject(aec3_root, "erle", &section)) { >+ ReadParam(section, "min", &cfg.erle.min); >+ ReadParam(section, "max_l", &cfg.erle.max_l); >+ ReadParam(section, "max_h", &cfg.erle.max_h); >+ ReadParam(section, "onset_detection", &cfg.erle.onset_detection); >+ ReadParam(section, "num_sections", &cfg.erle.num_sections); >+ } >+ >+ if (rtc::GetValueFromJsonObject(aec3_root, "ep_strength", &section)) { >+ ReadParam(section, "lf", &cfg.ep_strength.lf); >+ ReadParam(section, "mf", &cfg.ep_strength.mf); >+ ReadParam(section, "hf", &cfg.ep_strength.hf); >+ ReadParam(section, "default_len", &cfg.ep_strength.default_len); >+ ReadParam(section, "reverb_based_on_render", >+ 
&cfg.ep_strength.reverb_based_on_render); >+ ReadParam(section, "echo_can_saturate", &cfg.ep_strength.echo_can_saturate); >+ ReadParam(section, "bounded_erl", &cfg.ep_strength.bounded_erl); >+ } >+ >+ if (rtc::GetValueFromJsonObject(aec3_root, "echo_audibility", &section)) { >+ ReadParam(section, "low_render_limit", >+ &cfg.echo_audibility.low_render_limit); >+ ReadParam(section, "normal_render_limit", >+ &cfg.echo_audibility.normal_render_limit); >+ >+ ReadParam(section, "floor_power", &cfg.echo_audibility.floor_power); >+ ReadParam(section, "audibility_threshold_lf", >+ &cfg.echo_audibility.audibility_threshold_lf); >+ ReadParam(section, "audibility_threshold_mf", >+ &cfg.echo_audibility.audibility_threshold_mf); >+ ReadParam(section, "audibility_threshold_hf", >+ &cfg.echo_audibility.audibility_threshold_hf); >+ ReadParam(section, "use_stationary_properties", >+ &cfg.echo_audibility.use_stationary_properties); >+ ReadParam(section, "use_stationarity_properties_at_init", >+ &cfg.echo_audibility.use_stationarity_properties_at_init); >+ } >+ >+ if (rtc::GetValueFromJsonObject(aec3_root, "render_levels", &section)) { >+ ReadParam(section, "active_render_limit", >+ &cfg.render_levels.active_render_limit); >+ ReadParam(section, "poor_excitation_render_limit", >+ &cfg.render_levels.poor_excitation_render_limit); >+ ReadParam(section, "poor_excitation_render_limit_ds8", >+ &cfg.render_levels.poor_excitation_render_limit_ds8); >+ } >+ >+ if (rtc::GetValueFromJsonObject(aec3_root, "echo_removal_control", >+ &section)) { >+ Json::Value subsection; >+ if (rtc::GetValueFromJsonObject(section, "gain_rampup", &subsection)) { >+ ReadParam(subsection, "initial_gain", >+ &cfg.echo_removal_control.gain_rampup.initial_gain); >+ ReadParam(subsection, "first_non_zero_gain", >+ &cfg.echo_removal_control.gain_rampup.first_non_zero_gain); >+ ReadParam(subsection, "non_zero_gain_blocks", >+ &cfg.echo_removal_control.gain_rampup.non_zero_gain_blocks); >+ ReadParam(subsection, "full_gain_blocks", >+ 
&cfg.echo_removal_control.gain_rampup.full_gain_blocks); >+ } >+ ReadParam(section, "has_clock_drift", >+ &cfg.echo_removal_control.has_clock_drift); >+ ReadParam(section, "linear_and_stable_echo_path", >+ &cfg.echo_removal_control.linear_and_stable_echo_path); >+ } >+ >+ if (rtc::GetValueFromJsonObject(aec3_root, "echo_model", &section)) { >+ Json::Value subsection; >+ ReadParam(section, "noise_floor_hold", &cfg.echo_model.noise_floor_hold); >+ ReadParam(section, "min_noise_floor_power", >+ &cfg.echo_model.min_noise_floor_power); >+ ReadParam(section, "stationary_gate_slope", >+ &cfg.echo_model.stationary_gate_slope); >+ ReadParam(section, "noise_gate_power", &cfg.echo_model.noise_gate_power); >+ ReadParam(section, "noise_gate_slope", &cfg.echo_model.noise_gate_slope); >+ ReadParam(section, "render_pre_window_size", >+ &cfg.echo_model.render_pre_window_size); >+ ReadParam(section, "render_post_window_size", >+ &cfg.echo_model.render_post_window_size); >+ ReadParam(section, "render_pre_window_size_init", >+ &cfg.echo_model.render_pre_window_size_init); >+ ReadParam(section, "render_post_window_size_init", >+ &cfg.echo_model.render_post_window_size_init); >+ ReadParam(section, "nonlinear_hold", &cfg.echo_model.nonlinear_hold); >+ ReadParam(section, "nonlinear_release", &cfg.echo_model.nonlinear_release); >+ } >+ >+ Json::Value subsection; >+ if (rtc::GetValueFromJsonObject(aec3_root, "suppressor", &section)) { >+ ReadParam(section, "nearend_average_blocks", >+ &cfg.suppressor.nearend_average_blocks); >+ >+ if (rtc::GetValueFromJsonObject(section, "normal_tuning", &subsection)) { >+ ReadParam(subsection, "mask_lf", &cfg.suppressor.normal_tuning.mask_lf); >+ ReadParam(subsection, "mask_hf", &cfg.suppressor.normal_tuning.mask_hf); >+ ReadParam(subsection, "max_inc_factor", >+ &cfg.suppressor.normal_tuning.max_inc_factor); >+ ReadParam(subsection, "max_dec_factor_lf", >+ &cfg.suppressor.normal_tuning.max_dec_factor_lf); >+ } >+ >+ if (rtc::GetValueFromJsonObject(section, 
"nearend_tuning", &subsection)) { >+ ReadParam(subsection, "mask_lf", &cfg.suppressor.nearend_tuning.mask_lf); >+ ReadParam(subsection, "mask_hf", &cfg.suppressor.nearend_tuning.mask_hf); >+ ReadParam(subsection, "max_inc_factor", >+ &cfg.suppressor.nearend_tuning.max_inc_factor); >+ ReadParam(subsection, "max_dec_factor_lf", >+ &cfg.suppressor.nearend_tuning.max_dec_factor_lf); >+ } >+ >+ if (rtc::GetValueFromJsonObject(section, "dominant_nearend_detection", >+ &subsection)) { >+ ReadParam(subsection, "enr_threshold", >+ &cfg.suppressor.dominant_nearend_detection.enr_threshold); >+ ReadParam(subsection, "enr_exit_threshold", >+ &cfg.suppressor.dominant_nearend_detection.enr_exit_threshold); >+ ReadParam(subsection, "snr_threshold", >+ &cfg.suppressor.dominant_nearend_detection.snr_threshold); >+ ReadParam(subsection, "hold_duration", >+ &cfg.suppressor.dominant_nearend_detection.hold_duration); >+ ReadParam(subsection, "trigger_threshold", >+ &cfg.suppressor.dominant_nearend_detection.trigger_threshold); >+ ReadParam( >+ subsection, "use_during_initial_phase", >+ &cfg.suppressor.dominant_nearend_detection.use_during_initial_phase); >+ } >+ >+ if (rtc::GetValueFromJsonObject(section, "high_bands_suppression", >+ &subsection)) { >+ ReadParam(subsection, "enr_threshold", >+ &cfg.suppressor.high_bands_suppression.enr_threshold); >+ ReadParam(subsection, "max_gain_during_echo", >+ &cfg.suppressor.high_bands_suppression.max_gain_during_echo); >+ } >+ >+ ReadParam(section, "floor_first_increase", >+ &cfg.suppressor.floor_first_increase); >+ ReadParam(section, "enforce_transparent", >+ &cfg.suppressor.enforce_transparent); >+ ReadParam(section, "enforce_empty_higher_bands", >+ &cfg.suppressor.enforce_empty_higher_bands); >+ } >+} >+ >+EchoCanceller3Config Aec3ConfigFromJsonString(absl::string_view json_string) { >+ EchoCanceller3Config cfg; >+ bool not_used; >+ Aec3ConfigFromJsonString(json_string, &cfg, &not_used); >+ return cfg; >+} >+ >+std::string 
Aec3ConfigToJsonString(const EchoCanceller3Config& config) { >+ rtc::StringBuilder ost; >+ ost << "{"; >+ ost << "\"aec3\": {"; >+ ost << "\"delay\": {"; >+ ost << "\"default_delay\": " << config.delay.default_delay << ","; >+ ost << "\"down_sampling_factor\": " << config.delay.down_sampling_factor >+ << ","; >+ ost << "\"num_filters\": " << config.delay.num_filters << ","; >+ ost << "\"api_call_jitter_blocks\": " << config.delay.api_call_jitter_blocks >+ << ","; >+ ost << "\"min_echo_path_delay_blocks\": " >+ << config.delay.min_echo_path_delay_blocks << ","; >+ ost << "\"delay_headroom_blocks\": " << config.delay.delay_headroom_blocks >+ << ","; >+ ost << "\"hysteresis_limit_1_blocks\": " >+ << config.delay.hysteresis_limit_1_blocks << ","; >+ ost << "\"hysteresis_limit_2_blocks\": " >+ << config.delay.hysteresis_limit_2_blocks << ","; >+ ost << "\"skew_hysteresis_blocks\": " << config.delay.skew_hysteresis_blocks >+ << ","; >+ ost << "\"fixed_capture_delay_samples\": " >+ << config.delay.fixed_capture_delay_samples << ","; >+ ost << "\"delay_estimate_smoothing\": " >+ << config.delay.delay_estimate_smoothing << ","; >+ ost << "\"delay_candidate_detection_threshold\": " >+ << config.delay.delay_candidate_detection_threshold << ","; >+ >+ ost << "\"delay_selection_thresholds\": {"; >+ ost << "\"initial\": " << config.delay.delay_selection_thresholds.initial >+ << ","; >+ ost << "\"converged\": " << config.delay.delay_selection_thresholds.converged; >+ ost << "}"; >+ >+ ost << "},"; >+ >+ ost << "\"filter\": {"; >+ ost << "\"main\": ["; >+ ost << config.filter.main.length_blocks << ","; >+ ost << config.filter.main.leakage_converged << ","; >+ ost << config.filter.main.leakage_diverged << ","; >+ ost << config.filter.main.error_floor << ","; >+ ost << config.filter.main.error_ceil << ","; >+ ost << config.filter.main.noise_gate; >+ ost << "],"; >+ >+ ost << "\"shadow\": ["; >+ ost << config.filter.shadow.length_blocks << ","; >+ ost << config.filter.shadow.rate << 
","; >+ ost << config.filter.shadow.noise_gate; >+ ost << "],"; >+ >+ ost << "\"main_initial\": ["; >+ ost << config.filter.main_initial.length_blocks << ","; >+ ost << config.filter.main_initial.leakage_converged << ","; >+ ost << config.filter.main_initial.leakage_diverged << ","; >+ ost << config.filter.main_initial.error_floor << ","; >+ ost << config.filter.main_initial.error_ceil << ","; >+ ost << config.filter.main_initial.noise_gate; >+ ost << "],"; >+ >+ ost << "\"shadow_initial\": ["; >+ ost << config.filter.shadow_initial.length_blocks << ","; >+ ost << config.filter.shadow_initial.rate << ","; >+ ost << config.filter.shadow_initial.noise_gate; >+ ost << "],"; >+ >+ ost << "\"config_change_duration_blocks\": " >+ << config.filter.config_change_duration_blocks << ","; >+ ost << "\"initial_state_seconds\": " << config.filter.initial_state_seconds >+ << ","; >+ ost << "\"conservative_initial_phase\": " >+ << (config.filter.conservative_initial_phase ? "true" : "false") << ","; >+ ost << "\"enable_shadow_filter_output_usage\": " >+ << (config.filter.enable_shadow_filter_output_usage ? "true" : "false"); >+ >+ ost << "},"; >+ >+ ost << "\"erle\": {"; >+ ost << "\"min\": " << config.erle.min << ","; >+ ost << "\"max_l\": " << config.erle.max_l << ","; >+ ost << "\"max_h\": " << config.erle.max_h << ","; >+ ost << "\"onset_detection\": " >+ << (config.erle.onset_detection ? "true" : "false") << ","; >+ ost << "\"num_sections\": " << config.erle.num_sections; >+ ost << "},"; >+ >+ ost << "\"ep_strength\": {"; >+ ost << "\"lf\": " << config.ep_strength.lf << ","; >+ ost << "\"mf\": " << config.ep_strength.mf << ","; >+ ost << "\"hf\": " << config.ep_strength.hf << ","; >+ ost << "\"default_len\": " << config.ep_strength.default_len << ","; >+ ost << "\"reverb_based_on_render\": " >+ << (config.ep_strength.reverb_based_on_render ? "true" : "false") << ","; >+ ost << "\"echo_can_saturate\": " >+ << (config.ep_strength.echo_can_saturate ? 
"true" : "false") << ","; >+ ost << "\"bounded_erl\": " >+ << (config.ep_strength.bounded_erl ? "true" : "false"); >+ >+ ost << "},"; >+ >+ ost << "\"echo_audibility\": {"; >+ ost << "\"low_render_limit\": " << config.echo_audibility.low_render_limit >+ << ","; >+ ost << "\"normal_render_limit\": " >+ << config.echo_audibility.normal_render_limit << ","; >+ ost << "\"floor_power\": " << config.echo_audibility.floor_power << ","; >+ ost << "\"audibility_threshold_lf\": " >+ << config.echo_audibility.audibility_threshold_lf << ","; >+ ost << "\"audibility_threshold_mf\": " >+ << config.echo_audibility.audibility_threshold_mf << ","; >+ ost << "\"audibility_threshold_hf\": " >+ << config.echo_audibility.audibility_threshold_hf << ","; >+ ost << "\"use_stationary_properties\": " >+ << (config.echo_audibility.use_stationary_properties ? "true" : "false") >+ << ","; >+ ost << "\"use_stationarity_properties_at_init\": " >+ << (config.echo_audibility.use_stationarity_properties_at_init ? "true" >+ : "false"); >+ ost << "},"; >+ >+ ost << "\"render_levels\": {"; >+ ost << "\"active_render_limit\": " << config.render_levels.active_render_limit >+ << ","; >+ ost << "\"poor_excitation_render_limit\": " >+ << config.render_levels.poor_excitation_render_limit << ","; >+ ost << "\"poor_excitation_render_limit_ds8\": " >+ << config.render_levels.poor_excitation_render_limit_ds8; >+ ost << "},"; >+ >+ ost << "\"echo_removal_control\": {"; >+ ost << "\"gain_rampup\": {"; >+ ost << "\"initial_gain\": " >+ << config.echo_removal_control.gain_rampup.initial_gain << ","; >+ ost << "\"first_non_zero_gain\": " >+ << config.echo_removal_control.gain_rampup.first_non_zero_gain << ","; >+ ost << "\"non_zero_gain_blocks\": " >+ << config.echo_removal_control.gain_rampup.non_zero_gain_blocks << ","; >+ ost << "\"full_gain_blocks\": " >+ << config.echo_removal_control.gain_rampup.full_gain_blocks; >+ ost << "},"; >+ ost << "\"has_clock_drift\": " >+ << 
(config.echo_removal_control.has_clock_drift ? "true" : "false") >+ << ","; >+ ost << "\"linear_and_stable_echo_path\": " >+ << (config.echo_removal_control.linear_and_stable_echo_path ? "true" >+ : "false"); >+ >+ ost << "},"; >+ >+ ost << "\"echo_model\": {"; >+ ost << "\"noise_floor_hold\": " << config.echo_model.noise_floor_hold << ","; >+ ost << "\"min_noise_floor_power\": " >+ << config.echo_model.min_noise_floor_power << ","; >+ ost << "\"stationary_gate_slope\": " >+ << config.echo_model.stationary_gate_slope << ","; >+ ost << "\"noise_gate_power\": " << config.echo_model.noise_gate_power << ","; >+ ost << "\"noise_gate_slope\": " << config.echo_model.noise_gate_slope << ","; >+ ost << "\"render_pre_window_size\": " >+ << config.echo_model.render_pre_window_size << ","; >+ ost << "\"render_post_window_size\": " >+ << config.echo_model.render_post_window_size << ","; >+ ost << "\"render_pre_window_size_init\": " >+ << config.echo_model.render_pre_window_size_init << ","; >+ ost << "\"render_post_window_size_init\": " >+ << config.echo_model.render_post_window_size_init << ","; >+ ost << "\"nonlinear_hold\": " << config.echo_model.nonlinear_hold << ","; >+ ost << "\"nonlinear_release\": " << config.echo_model.nonlinear_release; >+ ost << "},"; >+ >+ ost << "\"suppressor\": {"; >+ ost << "\"nearend_average_blocks\": " >+ << config.suppressor.nearend_average_blocks << ","; >+ ost << "\"normal_tuning\": {"; >+ ost << "\"mask_lf\": ["; >+ ost << config.suppressor.normal_tuning.mask_lf.enr_transparent << ","; >+ ost << config.suppressor.normal_tuning.mask_lf.enr_suppress << ","; >+ ost << config.suppressor.normal_tuning.mask_lf.emr_transparent; >+ ost << "],"; >+ ost << "\"mask_hf\": ["; >+ ost << config.suppressor.normal_tuning.mask_hf.enr_transparent << ","; >+ ost << config.suppressor.normal_tuning.mask_hf.enr_suppress << ","; >+ ost << config.suppressor.normal_tuning.mask_hf.emr_transparent; >+ ost << "],"; >+ ost << "\"max_inc_factor\": " >+ << 
config.suppressor.normal_tuning.max_inc_factor << ","; >+ ost << "\"max_dec_factor_lf\": " >+ << config.suppressor.normal_tuning.max_dec_factor_lf; >+ ost << "},"; >+ ost << "\"nearend_tuning\": {"; >+ ost << "\"mask_lf\": ["; >+ ost << config.suppressor.nearend_tuning.mask_lf.enr_transparent << ","; >+ ost << config.suppressor.nearend_tuning.mask_lf.enr_suppress << ","; >+ ost << config.suppressor.nearend_tuning.mask_lf.emr_transparent; >+ ost << "],"; >+ ost << "\"mask_hf\": ["; >+ ost << config.suppressor.nearend_tuning.mask_hf.enr_transparent << ","; >+ ost << config.suppressor.nearend_tuning.mask_hf.enr_suppress << ","; >+ ost << config.suppressor.nearend_tuning.mask_hf.emr_transparent; >+ ost << "],"; >+ ost << "\"max_inc_factor\": " >+ << config.suppressor.nearend_tuning.max_inc_factor << ","; >+ ost << "\"max_dec_factor_lf\": " >+ << config.suppressor.nearend_tuning.max_dec_factor_lf; >+ ost << "},"; >+ ost << "\"dominant_nearend_detection\": {"; >+ ost << "\"enr_threshold\": " >+ << config.suppressor.dominant_nearend_detection.enr_threshold << ","; >+ ost << "\"enr_exit_threshold\": " >+ << config.suppressor.dominant_nearend_detection.enr_exit_threshold << ","; >+ ost << "\"snr_threshold\": " >+ << config.suppressor.dominant_nearend_detection.snr_threshold << ","; >+ ost << "\"hold_duration\": " >+ << config.suppressor.dominant_nearend_detection.hold_duration << ","; >+ ost << "\"trigger_threshold\": " >+ << config.suppressor.dominant_nearend_detection.trigger_threshold << ","; >+ ost << "\"use_during_initial_phase\": " >+ << config.suppressor.dominant_nearend_detection.use_during_initial_phase; >+ ost << "},"; >+ ost << "\"high_bands_suppression\": {"; >+ ost << "\"enr_threshold\": " >+ << config.suppressor.high_bands_suppression.enr_threshold << ","; >+ ost << "\"max_gain_during_echo\": " >+ << config.suppressor.high_bands_suppression.max_gain_during_echo; >+ ost << "},"; >+ ost << "\"floor_first_increase\": " << config.suppressor.floor_first_increase >+ 
<< ","; >+ ost << "\"enforce_transparent\": " >+ << (config.suppressor.enforce_transparent ? "true" : "false") << ","; >+ ost << "\"enforce_empty_higher_bands\": " >+ << (config.suppressor.enforce_empty_higher_bands ? "true" : "false"); >+ ost << "}"; >+ ost << "}"; >+ ost << "}"; >+ >+ return ost.Release(); >+} >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/echo_canceller3_config_json.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/echo_canceller3_config_json.h >new file mode 100644 >index 0000000000000000000000000000000000000000..8973650f85c1a61753cbc591d4f1621286a981d4 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/echo_canceller3_config_json.h >@@ -0,0 +1,44 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef API_AUDIO_ECHO_CANCELLER3_CONFIG_JSON_H_ >+#define API_AUDIO_ECHO_CANCELLER3_CONFIG_JSON_H_ >+ >+#include <string> >+ >+#include "absl/strings/string_view.h" >+#include "api/audio/echo_canceller3_config.h" >+#include "rtc_base/system/rtc_export.h" >+ >+namespace webrtc { >+// Parses a JSON-encoded string into an Aec3 config. Fields correspond to >+// substruct names, with the addition that there must be a top-level node >+// "aec3". Produces default config values for anything that cannot be parsed >+// from the string. If any error was found in the parsing, parsing_successful is >+// set to false. >+RTC_EXPORT void Aec3ConfigFromJsonString(absl::string_view json_string, >+ EchoCanceller3Config* config, >+ bool* parsing_successful); >+ >+// To be deprecated. 
>+// Parses a JSON-encoded string into an Aec3 config. Fields correspond to >+// substruct names, with the addition that there must be a top-level node >+// "aec3". Returns default config values for anything that cannot be parsed from >+// the string. >+RTC_EXPORT EchoCanceller3Config >+Aec3ConfigFromJsonString(absl::string_view json_string); >+ >+// Encodes an Aec3 config in JSON format. Fields correspond to substruct names, >+// with the addition that the top-level node is named "aec3". >+std::string Aec3ConfigToJsonString(const EchoCanceller3Config& config); >+ >+} // namespace webrtc >+ >+#endif  // API_AUDIO_ECHO_CANCELLER3_CONFIG_JSON_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/echo_canceller3_factory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/echo_canceller3_factory.h >index f6db11685037a1a2c32d84a6f765a287fc475b7f..9052d99bb1eb891bdc3ceb9f9ffa6a920ac32622 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/echo_canceller3_factory.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/echo_canceller3_factory.h >@@ -15,10 +15,11 @@ > > #include "api/audio/echo_canceller3_config.h" > #include "api/audio/echo_control.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > >-class EchoCanceller3Factory : public EchoControlFactory { >+class RTC_EXPORT EchoCanceller3Factory : public EchoControlFactory { > public: > // Factory producing EchoCanceller3 instances with the default configuration. 
> EchoCanceller3Factory(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/test/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/test/BUILD.gn >index 9a797788f68dcea72de92123c1b5524a51447bad..4e04a8a64a1048af0332431b46c157fda6ddded1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/test/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/test/BUILD.gn >@@ -17,8 +17,12 @@ if (rtc_include_tests) { > testonly = true > sources = [ > "audio_frame_unittest.cc", >+ "echo_canceller3_config_json_unittest.cc", >+ "echo_canceller3_config_unittest.cc", > ] > deps = [ >+ "..:aec3_config", >+ "..:aec3_config_json", > "..:audio_frame_api", > "../../../rtc_base:rtc_base_approved", > "../../../test:test_support", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/test/echo_canceller3_config_json_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/test/echo_canceller3_config_json_unittest.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..7c3a445c60a3412d03a0a2cef2947de14d156b5d >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/test/echo_canceller3_config_json_unittest.cc >@@ -0,0 +1,41 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "api/audio/echo_canceller3_config_json.h" >+#include "api/audio/echo_canceller3_config.h" >+#include "test/gtest.h" >+ >+namespace webrtc { >+ >+TEST(EchoCanceller3JsonHelpers, ToStringAndParseJson) { >+ EchoCanceller3Config cfg; >+ cfg.delay.down_sampling_factor = 1u; >+ cfg.filter.shadow_initial.length_blocks = 7u; >+ cfg.suppressor.normal_tuning.mask_hf.enr_suppress = .5f; >+ std::string json_string = Aec3ConfigToJsonString(cfg); >+ EchoCanceller3Config cfg_transformed = Aec3ConfigFromJsonString(json_string); >+ >+ // Expect unchanged values to remain default. >+ EXPECT_EQ(cfg.filter.main.error_floor, >+ cfg_transformed.filter.main.error_floor); >+ EXPECT_EQ(cfg.ep_strength.default_len, >+ cfg_transformed.ep_strength.default_len); >+ EXPECT_EQ(cfg.suppressor.normal_tuning.mask_lf.enr_suppress, >+ cfg_transformed.suppressor.normal_tuning.mask_lf.enr_suppress); >+ >+ // Expect changed values to carry through the transformation. >+ EXPECT_EQ(cfg.delay.down_sampling_factor, >+ cfg_transformed.delay.down_sampling_factor); >+ EXPECT_EQ(cfg.filter.shadow_initial.length_blocks, >+ cfg_transformed.filter.shadow_initial.length_blocks); >+ EXPECT_EQ(cfg.suppressor.normal_tuning.mask_hf.enr_suppress, >+ cfg_transformed.suppressor.normal_tuning.mask_hf.enr_suppress); >+} >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/test/echo_canceller3_config_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/test/echo_canceller3_config_unittest.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..ffe72ac958df59c554e910297707b5d7ca14c30e >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio/test/echo_canceller3_config_unittest.cc >@@ -0,0 +1,47 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. 
An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include <string> >+ >+#include "api/audio/echo_canceller3_config.h" >+#include "api/audio/echo_canceller3_config_json.h" >+#include "test/gtest.h" >+ >+namespace webrtc { >+ >+TEST(EchoCanceller3Config, ValidConfigIsNotModified) { >+ EchoCanceller3Config config; >+ EXPECT_TRUE(EchoCanceller3Config::Validate(&config)); >+ EchoCanceller3Config default_config; >+ EXPECT_EQ(Aec3ConfigToJsonString(config), >+ Aec3ConfigToJsonString(default_config)); >+} >+ >+TEST(EchoCanceller3Config, InvalidConfigIsCorrected) { >+ // Change a parameter and validate. >+ EchoCanceller3Config config; >+ config.echo_model.min_noise_floor_power = -1600000.f; >+ EXPECT_FALSE(EchoCanceller3Config::Validate(&config)); >+ EXPECT_GE(config.echo_model.min_noise_floor_power, 0.f); >+ // Verify remaining parameters are unchanged. 
>+ EchoCanceller3Config default_config; >+ config.echo_model.min_noise_floor_power = >+ default_config.echo_model.min_noise_floor_power; >+ EXPECT_EQ(Aec3ConfigToJsonString(config), >+ Aec3ConfigToJsonString(default_config)); >+} >+ >+TEST(EchoCanceller3Config, ValidatedConfigsAreValid) { >+ EchoCanceller3Config config; >+ config.delay.down_sampling_factor = 983; >+ EXPECT_FALSE(EchoCanceller3Config::Validate(&config)); >+ EXPECT_TRUE(EchoCanceller3Config::Validate(&config)); >+} >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/BUILD.gn >index a7060a229adf0171814724333a698343e2ccaad5..31fe2bec58b03be53e1428494a2702a0e6254389 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/BUILD.gn >@@ -30,11 +30,13 @@ rtc_source_set("audio_codecs_api") { > ] > deps = [ > "..:array_view", >+ "..:bitrate_allocation", > "../..:webrtc_common", > "../../rtc_base:checks", > "../../rtc_base:deprecation", > "../../rtc_base:rtc_base_approved", > "../../rtc_base:sanitizer", >+ "../../rtc_base/system:rtc_export", > "//third_party/abseil-cpp/absl/strings", > "//third_party/abseil-cpp/absl/types:optional", > ] >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/L16/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/L16/BUILD.gn >index 972480500125fa9dce27307b4753d6211661fe04..34ec2e4208ef34856b5854ba9c0cdef4b99372cd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/L16/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/L16/BUILD.gn >@@ -21,11 +21,12 @@ rtc_static_library("audio_encoder_L16") { > ] > deps = [ > "..:audio_codecs_api", >- "../../..:webrtc_common", > "../../../modules/audio_coding:pcm16b", > "../../../rtc_base:rtc_base_approved", > "../../../rtc_base:safe_minmax", >+ 
"../../../rtc_base/system:rtc_export", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings", > "//third_party/abseil-cpp/absl/types:optional", > ] > } >@@ -39,10 +40,11 @@ rtc_static_library("audio_decoder_L16") { > ] > deps = [ > "..:audio_codecs_api", >- "../../..:webrtc_common", > "../../../modules/audio_coding:pcm16b", > "../../../rtc_base:rtc_base_approved", >+ "../../../rtc_base/system:rtc_export", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings", > "//third_party/abseil-cpp/absl/types:optional", > ] > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/L16/audio_decoder_L16.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/L16/audio_decoder_L16.cc >index a71e3087007df3fd76fa232fc3e3468fa3821e99..be0c6b56b25b569eae5e65bc7785987c4c586217 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/L16/audio_decoder_L16.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/L16/audio_decoder_L16.cc >@@ -11,7 +11,7 @@ > #include "api/audio_codecs/L16/audio_decoder_L16.h" > > #include "absl/memory/memory.h" >-#include "common_types.h" // NOLINT(build/include) >+#include "absl/strings/match.h" > #include "modules/audio_coding/codecs/pcm16b/audio_decoder_pcm16b.h" > #include "modules/audio_coding/codecs/pcm16b/pcm16b_common.h" > #include "rtc_base/numerics/safe_conversions.h" >@@ -23,7 +23,7 @@ absl::optional<AudioDecoderL16::Config> AudioDecoderL16::SdpToConfig( > Config config; > config.sample_rate_hz = format.clockrate_hz; > config.num_channels = rtc::checked_cast<int>(format.num_channels); >- return STR_CASE_CMP(format.name.c_str(), "L16") == 0 && config.IsOk() >+ return absl::EqualsIgnoreCase(format.name, "L16") && config.IsOk() > ? 
absl::optional<Config>(config) > : absl::nullopt; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/L16/audio_decoder_L16.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/L16/audio_decoder_L16.h >index 184ec24ed92f1e22d30c9b2e6be7f3d7f77e15c8..f0be03659c548c1c478b3ae0e9564e5595303135 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/L16/audio_decoder_L16.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/L16/audio_decoder_L16.h >@@ -18,14 +18,13 @@ > #include "api/audio_codecs/audio_codec_pair_id.h" > #include "api/audio_codecs/audio_decoder.h" > #include "api/audio_codecs/audio_format.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > > // L16 decoder API for use as a template parameter to > // CreateAudioDecoderFactory<...>(). >-// >-// NOTE: This struct is still under development and may change without notice. >-struct AudioDecoderL16 { >+struct RTC_EXPORT AudioDecoderL16 { > struct Config { > bool IsOk() const { > return (sample_rate_hz == 8000 || sample_rate_hz == 16000 || >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/L16/audio_encoder_L16.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/L16/audio_encoder_L16.cc >index b516f62863da24fce35ef52d46a0e04c2d5c797f..1bb552bc5c1e6e3257b18572f97b3345fdae3b7f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/L16/audio_encoder_L16.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/L16/audio_encoder_L16.cc >@@ -11,7 +11,7 @@ > #include "api/audio_codecs/L16/audio_encoder_L16.h" > > #include "absl/memory/memory.h" >-#include "common_types.h" // NOLINT(build/include) >+#include "absl/strings/match.h" > #include "modules/audio_coding/codecs/pcm16b/audio_encoder_pcm16b.h" > #include "modules/audio_coding/codecs/pcm16b/pcm16b_common.h" > #include "rtc_base/numerics/safe_conversions.h" >@@ -35,7 +35,7 @@ absl::optional<AudioEncoderL16::Config> 
AudioEncoderL16::SdpToConfig( > config.frame_size_ms = rtc::SafeClamp(10 * (*ptime / 10), 10, 60); > } > } >- return STR_CASE_CMP(format.name.c_str(), "L16") == 0 && config.IsOk() >+ return absl::EqualsIgnoreCase(format.name, "L16") && config.IsOk() > ? absl::optional<Config>(config) > : absl::nullopt; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/L16/audio_encoder_L16.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/L16/audio_encoder_L16.h >index 340e3af47fbaf77caaeb844311a74b89506edfca..b410286802fe7689acc4f9889778b88cc8ee8a3a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/L16/audio_encoder_L16.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/L16/audio_encoder_L16.h >@@ -18,14 +18,13 @@ > #include "api/audio_codecs/audio_codec_pair_id.h" > #include "api/audio_codecs/audio_encoder.h" > #include "api/audio_codecs/audio_format.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > > // L16 encoder API for use as a template parameter to > // CreateAudioEncoderFactory<...>(). >-// >-// NOTE: This struct is still under development and may change without notice. 
>-struct AudioEncoderL16 { >+struct RTC_EXPORT AudioEncoderL16 { > struct Config { > bool IsOk() const { > return (sample_rate_hz == 8000 || sample_rate_hz == 16000 || >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_codec_pair_id.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_codec_pair_id.cc >index ac84107eda108249d94d31a8717763e9f04af178..6cb51ed6b72c7e4f4652dc728aa33ae3e3952614 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_codec_pair_id.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_codec_pair_id.cc >@@ -11,6 +11,7 @@ > #include "api/audio_codecs/audio_codec_pair_id.h" > > #include <atomic> >+#include <cstdint> > > #include "rtc_base/checks.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_decoder.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_decoder.h >index b01a66a064a44ebf89e5e87891c9bd13bdda9894..889e2c62d0ed89fa04879f8b7a8971ebb3a5fef4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_decoder.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_decoder.h >@@ -11,6 +11,8 @@ > #ifndef API_AUDIO_CODECS_AUDIO_DECODER_H_ > #define API_AUDIO_CODECS_AUDIO_DECODER_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <memory> > #include <vector> > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_decoder_factory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_decoder_factory.h >index 90f93f0cd5fc908e99183196fad06fddf78ee4fb..55e197a14a92708226d3dfabc357f23c0aa75f1f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_decoder_factory.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_decoder_factory.h >@@ -23,7 +23,6 @@ > namespace webrtc { > > // A factory that creates AudioDecoders. 
>-// NOTE: This class is still under development and may change without notice. > class AudioDecoderFactory : public rtc::RefCountInterface { > public: > virtual std::vector<AudioCodecSpec> GetSupportedDecoders() = 0; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_decoder_factory_template.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_decoder_factory_template.h >index cdbe8bde7a842e8e8d5a2a863ca8ae1e8cecb287..22bb7de472136e8c4eeac2b1d3e864144436311b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_decoder_factory_template.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_decoder_factory_template.h >@@ -112,8 +112,6 @@ class AudioDecoderFactoryT : public AudioDecoderFactory { > // decoder types in the order they were specified in the template argument > // list, stopping at the first one that claims to be able to do the job. > // >-// NOTE: This function is still under development and may change without notice. >-// > // TODO(kwiberg): Point at CreateBuiltinAudioDecoderFactory() for an example of > // how it is used. > template <typename... 
Ts> >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_encoder.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_encoder.cc >index 595c11131e7c41066f34ee241e7f19f32906274a..1d885f9cecf84d8098f4e4c241271f07d7efe58f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_encoder.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_encoder.cc >@@ -92,6 +92,11 @@ void AudioEncoder::OnReceivedUplinkBandwidth( > int target_audio_bitrate_bps, > absl::optional<int64_t> bwe_period_ms) {} > >+void AudioEncoder::OnReceivedUplinkAllocation(BitrateAllocationUpdate update) { >+ OnReceivedUplinkBandwidth(update.target_bitrate.bps(), >+ update.bwe_period.ms()); >+} >+ > void AudioEncoder::OnReceivedRtt(int rtt_ms) {} > > void AudioEncoder::OnReceivedOverhead(size_t overhead_bytes_per_packet) {} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_encoder.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_encoder.h >index 6a1f6534b21a8a3ffb6d209d95571c89ae208a16..c908518063d4da7c94aa0e9346db7ee3acbef1e9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_encoder.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_encoder.h >@@ -11,13 +11,13 @@ > #ifndef API_AUDIO_CODECS_AUDIO_ENCODER_H_ > #define API_AUDIO_CODECS_AUDIO_ENCODER_H_ > >-#include <algorithm> > #include <memory> > #include <string> > #include <vector> > > #include "absl/types/optional.h" > #include "api/array_view.h" >+#include "api/call/bitrate_allocation.h" > #include "rtc_base/buffer.h" > #include "rtc_base/deprecation.h" > >@@ -222,6 +222,10 @@ class AudioEncoder { > virtual void OnReceivedUplinkBandwidth(int target_audio_bitrate_bps, > absl::optional<int64_t> bwe_period_ms); > >+ // Provides target audio bitrate and corresponding probing interval of >+ // the bandwidth estimator to this encoder to allow it to adapt. 
>+ virtual void OnReceivedUplinkAllocation(BitrateAllocationUpdate update); >+ > // Provides RTT to this encoder to allow it to adapt. > virtual void OnReceivedRtt(int rtt_ms); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_encoder_factory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_encoder_factory.h >index fb4e23ffe78b5b22c67e22888a3272c67ad3b9b9..b290967c7803fd747d8b47b3465f449e74beae09 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_encoder_factory.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_encoder_factory.h >@@ -23,7 +23,6 @@ > namespace webrtc { > > // A factory that creates AudioEncoders. >-// NOTE: This class is still under development and may change without notice. > class AudioEncoderFactory : public rtc::RefCountInterface { > public: > // Returns a prioritized list of audio codecs, to use for signaling etc. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_encoder_factory_template.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_encoder_factory_template.h >index 376b39e4c6f9fc8eaa3c7ef37adcdee61c428383..20886190196262ba2777ce6cbba78fede669e760 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_encoder_factory_template.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_encoder_factory_template.h >@@ -131,8 +131,6 @@ class AudioEncoderFactoryT : public AudioEncoderFactory { > // encoders in the order they were specified in the template argument list, > // stopping at the first one that claims to be able to do the job. > // >-// NOTE: This function is still under development and may change without notice. >-// > // TODO(kwiberg): Point at CreateBuiltinAudioEncoderFactory() for an example of > // how it is used. > template <typename... 
Ts> >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_format.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_format.cc >index 81ed299ddb66aaf836967d4f446f73e908bc7211..11788b91e829198bf277bc34dffe80b9ff539f6e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_format.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_format.cc >@@ -10,7 +10,7 @@ > > #include "api/audio_codecs/audio_format.h" > >-#include "common_types.h" // NOLINT(build/include) >+#include "absl/strings/match.h" > > namespace webrtc { > >@@ -32,7 +32,7 @@ SdpAudioFormat::SdpAudioFormat(absl::string_view name, > parameters(param) {} > > bool SdpAudioFormat::Matches(const SdpAudioFormat& o) const { >- return STR_CASE_CMP(name.c_str(), o.name.c_str()) == 0 && >+ return absl::EqualsIgnoreCase(name, o.name) && > clockrate_hz == o.clockrate_hz && num_channels == o.num_channels; > } > >@@ -41,19 +41,11 @@ SdpAudioFormat& SdpAudioFormat::operator=(const SdpAudioFormat&) = default; > SdpAudioFormat& SdpAudioFormat::operator=(SdpAudioFormat&&) = default; > > bool operator==(const SdpAudioFormat& a, const SdpAudioFormat& b) { >- return STR_CASE_CMP(a.name.c_str(), b.name.c_str()) == 0 && >+ return absl::EqualsIgnoreCase(a.name, b.name) && > a.clockrate_hz == b.clockrate_hz && a.num_channels == b.num_channels && > a.parameters == b.parameters; > } > >-void swap(SdpAudioFormat& a, SdpAudioFormat& b) { >- using std::swap; >- swap(a.name, b.name); >- swap(a.clockrate_hz, b.clockrate_hz); >- swap(a.num_channels, b.num_channels); >- swap(a.parameters, b.parameters); >-} >- > AudioCodecInfo::AudioCodecInfo(int sample_rate_hz, > size_t num_channels, > int bitrate_bps) >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_format.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_format.h >index aa5dbb13b65d72ed0f4d6d6ee686d47202fd479c..b8b042178ba1a5e24275e55957d76a52d8a480a4 
100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_format.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/audio_format.h >@@ -11,19 +11,18 @@ > #ifndef API_AUDIO_CODECS_AUDIO_FORMAT_H_ > #define API_AUDIO_CODECS_AUDIO_FORMAT_H_ > >+#include <stddef.h> > #include <map> > #include <string> >-#include <utility> > > #include "absl/strings/string_view.h" >-#include "absl/types/optional.h" > #include "rtc_base/checks.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > > // SDP specification for a single audio codec. >-// NOTE: This class is still under development and may change without notice. >-struct SdpAudioFormat { >+struct RTC_EXPORT SdpAudioFormat { > using Parameters = std::map<std::string, std::string>; > > SdpAudioFormat(const SdpAudioFormat&); >@@ -54,8 +53,6 @@ struct SdpAudioFormat { > Parameters parameters; > }; > >-void swap(SdpAudioFormat& a, SdpAudioFormat& b); >- > // Information about how an audio format is treated by the codec implementation. > // Contains basic information, such as sample rate and number of channels, which > // isn't uniformly presented by SDP. Also contains flags indicating support for >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/builtin_audio_decoder_factory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/builtin_audio_decoder_factory.h >index 3127403e243fb901863105a76d94dc66b937dd68..3accd4a3020d2429918149a4b6b9bf3e685ec1f6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/builtin_audio_decoder_factory.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/builtin_audio_decoder_factory.h >@@ -17,7 +17,6 @@ > namespace webrtc { > > // Creates a new factory that can create the built-in types of audio decoders. >-// NOTE: This function is still under development and may change without notice. 
> rtc::scoped_refptr<AudioDecoderFactory> CreateBuiltinAudioDecoderFactory(); > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/builtin_audio_encoder_factory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/builtin_audio_encoder_factory.h >index d37ff257e6e32e39cc777845fe8d3972eb3e13cc..3c67dd30e66b2351ed40aceac154b1b428fbd459 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/builtin_audio_encoder_factory.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/builtin_audio_encoder_factory.h >@@ -17,7 +17,6 @@ > namespace webrtc { > > // Creates a new factory that can create the built-in types of audio encoders. >-// NOTE: This function is still under development and may change without notice. > rtc::scoped_refptr<AudioEncoderFactory> CreateBuiltinAudioEncoderFactory(); > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g711/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g711/BUILD.gn >index 169172a66829ab793f51831ad4c9ba888507666b..3b8f23c1bc6e7430755ac5f72ee928ffa4342b8d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g711/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g711/BUILD.gn >@@ -21,11 +21,12 @@ rtc_static_library("audio_encoder_g711") { > ] > deps = [ > "..:audio_codecs_api", >- "../../..:webrtc_common", > "../../../modules/audio_coding:g711", > "../../../rtc_base:rtc_base_approved", > "../../../rtc_base:safe_minmax", >+ "../../../rtc_base/system:rtc_export", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings", > "//third_party/abseil-cpp/absl/types:optional", > ] > } >@@ -39,10 +40,11 @@ rtc_static_library("audio_decoder_g711") { > ] > deps = [ > "..:audio_codecs_api", >- "../../..:webrtc_common", > "../../../modules/audio_coding:g711", > "../../../rtc_base:rtc_base_approved", >+ 
"../../../rtc_base/system:rtc_export", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings", > "//third_party/abseil-cpp/absl/types:optional", > ] > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g711/audio_decoder_g711.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g711/audio_decoder_g711.cc >index cb16584cdee5402eb205b43b47b939531a77a609..91599c416adf93c459895c3003cdfdf32789b6a8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g711/audio_decoder_g711.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g711/audio_decoder_g711.cc >@@ -14,7 +14,7 @@ > #include <vector> > > #include "absl/memory/memory.h" >-#include "common_types.h" // NOLINT(build/include) >+#include "absl/strings/match.h" > #include "modules/audio_coding/codecs/g711/audio_decoder_pcm.h" > #include "rtc_base/numerics/safe_conversions.h" > >@@ -22,8 +22,8 @@ namespace webrtc { > > absl::optional<AudioDecoderG711::Config> AudioDecoderG711::SdpToConfig( > const SdpAudioFormat& format) { >- const bool is_pcmu = STR_CASE_CMP(format.name.c_str(), "PCMU") == 0; >- const bool is_pcma = STR_CASE_CMP(format.name.c_str(), "PCMA") == 0; >+ const bool is_pcmu = absl::EqualsIgnoreCase(format.name, "PCMU"); >+ const bool is_pcma = absl::EqualsIgnoreCase(format.name, "PCMA"); > if (format.clockrate_hz == 8000 && format.num_channels >= 1 && > (is_pcmu || is_pcma)) { > Config config; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g711/audio_decoder_g711.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g711/audio_decoder_g711.h >index 8275a8c7a14bff4df4d14c4c8b38593673789c2e..ccd1ee0480052733b716c4683efd7924d2b519dc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g711/audio_decoder_g711.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g711/audio_decoder_g711.h >@@ -18,14 +18,13 @@ > #include 
"api/audio_codecs/audio_codec_pair_id.h" > #include "api/audio_codecs/audio_decoder.h" > #include "api/audio_codecs/audio_format.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > > // G711 decoder API for use as a template parameter to > // CreateAudioDecoderFactory<...>(). >-// >-// NOTE: This struct is still under development and may change without notice. >-struct AudioDecoderG711 { >+struct RTC_EXPORT AudioDecoderG711 { > struct Config { > enum class Type { kPcmU, kPcmA }; > bool IsOk() const { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g711/audio_encoder_g711.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g711/audio_encoder_g711.cc >index 1d5e541276b9aedd86c68d01f5c7c707cc28fe3c..0cc8dc4ddd29ef5b55ee071927a6943e60334d89 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g711/audio_encoder_g711.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g711/audio_encoder_g711.cc >@@ -14,7 +14,7 @@ > #include <vector> > > #include "absl/memory/memory.h" >-#include "common_types.h" // NOLINT(build/include) >+#include "absl/strings/match.h" > #include "modules/audio_coding/codecs/g711/audio_encoder_pcm.h" > #include "rtc_base/numerics/safe_conversions.h" > #include "rtc_base/numerics/safe_minmax.h" >@@ -24,8 +24,8 @@ namespace webrtc { > > absl::optional<AudioEncoderG711::Config> AudioEncoderG711::SdpToConfig( > const SdpAudioFormat& format) { >- const bool is_pcmu = STR_CASE_CMP(format.name.c_str(), "PCMU") == 0; >- const bool is_pcma = STR_CASE_CMP(format.name.c_str(), "PCMA") == 0; >+ const bool is_pcmu = absl::EqualsIgnoreCase(format.name, "PCMU"); >+ const bool is_pcma = absl::EqualsIgnoreCase(format.name, "PCMA"); > if (format.clockrate_hz == 8000 && format.num_channels >= 1 && > (is_pcmu || is_pcma)) { > Config config; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g711/audio_encoder_g711.h 
b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g711/audio_encoder_g711.h >index 6b6eb5fce0ce54fcbd60c64fea67d9bcda1b1686..23ae18b5e3f032b839cf7e909e08445f3b68e0eb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g711/audio_encoder_g711.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g711/audio_encoder_g711.h >@@ -18,14 +18,13 @@ > #include "api/audio_codecs/audio_codec_pair_id.h" > #include "api/audio_codecs/audio_encoder.h" > #include "api/audio_codecs/audio_format.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > > // G711 encoder API for use as a template parameter to > // CreateAudioEncoderFactory<...>(). >-// >-// NOTE: This struct is still under development and may change without notice. >-struct AudioEncoderG711 { >+struct RTC_EXPORT AudioEncoderG711 { > struct Config { > enum class Type { kPcmU, kPcmA }; > bool IsOk() const { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g722/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g722/BUILD.gn >index 50b1396d2915ebff2c6d11ce35dcc7891c3b1c30..e4321d299317af8f01ad3e562a043096ee1ef214 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g722/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g722/BUILD.gn >@@ -29,11 +29,12 @@ rtc_static_library("audio_encoder_g722") { > deps = [ > ":audio_encoder_g722_config", > "..:audio_codecs_api", >- "../../..:webrtc_common", > "../../../modules/audio_coding:g722", > "../../../rtc_base:rtc_base_approved", > "../../../rtc_base:safe_minmax", >+ "../../../rtc_base/system:rtc_export", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings", > "//third_party/abseil-cpp/absl/types:optional", > ] > } >@@ -47,10 +48,11 @@ rtc_static_library("audio_decoder_g722") { > ] > deps = [ > "..:audio_codecs_api", >- "../../..:webrtc_common", > "../../../modules/audio_coding:g722", > 
"../../../rtc_base:rtc_base_approved", >+ "../../../rtc_base/system:rtc_export", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings", > "//third_party/abseil-cpp/absl/types:optional", > ] > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g722/audio_decoder_g722.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g722/audio_decoder_g722.cc >index f1e2afbab50d0ff7d038645e84cf7f256b198e1a..2cc16c37d94d650b4eeabb26cdc4d29b8045583a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g722/audio_decoder_g722.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g722/audio_decoder_g722.cc >@@ -14,7 +14,7 @@ > #include <vector> > > #include "absl/memory/memory.h" >-#include "common_types.h" // NOLINT(build/include) >+#include "absl/strings/match.h" > #include "modules/audio_coding/codecs/g722/audio_decoder_g722.h" > #include "rtc_base/numerics/safe_conversions.h" > >@@ -22,7 +22,7 @@ namespace webrtc { > > absl::optional<AudioDecoderG722::Config> AudioDecoderG722::SdpToConfig( > const SdpAudioFormat& format) { >- return STR_CASE_CMP(format.name.c_str(), "G722") == 0 && >+ return absl::EqualsIgnoreCase(format.name, "G722") && > format.clockrate_hz == 8000 && > (format.num_channels == 1 || format.num_channels == 2) > ? 
absl::optional<Config>( >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g722/audio_decoder_g722.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g722/audio_decoder_g722.h >index b7bb0893f3303e3a4f98a57e687bee620504ec9e..2a674926dbccf1c76c090595deeb1390c3a13f81 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g722/audio_decoder_g722.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g722/audio_decoder_g722.h >@@ -18,14 +18,13 @@ > #include "api/audio_codecs/audio_codec_pair_id.h" > #include "api/audio_codecs/audio_decoder.h" > #include "api/audio_codecs/audio_format.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > > // G722 decoder API for use as a template parameter to > // CreateAudioDecoderFactory<...>(). >-// >-// NOTE: This struct is still under development and may change without notice. >-struct AudioDecoderG722 { >+struct RTC_EXPORT AudioDecoderG722 { > struct Config { > bool IsOk() const { return num_channels == 1 || num_channels == 2; } > int num_channels; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g722/audio_encoder_g722.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g722/audio_encoder_g722.cc >index 0cf71639bf32f937d2fd452470a41be87f89ab4f..6374ae802faa87748ccba46e172752bb7668c376 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g722/audio_encoder_g722.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g722/audio_encoder_g722.cc >@@ -14,7 +14,7 @@ > #include <vector> > > #include "absl/memory/memory.h" >-#include "common_types.h" // NOLINT(build/include) >+#include "absl/strings/match.h" > #include "modules/audio_coding/codecs/g722/audio_encoder_g722.h" > #include "rtc_base/numerics/safe_conversions.h" > #include "rtc_base/numerics/safe_minmax.h" >@@ -24,7 +24,7 @@ namespace webrtc { > > absl::optional<AudioEncoderG722Config> AudioEncoderG722::SdpToConfig( > const 
SdpAudioFormat& format) { >- if (STR_CASE_CMP(format.name.c_str(), "g722") != 0 || >+ if (!absl::EqualsIgnoreCase(format.name, "g722") || > format.clockrate_hz != 8000) { > return absl::nullopt; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g722/audio_encoder_g722.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g722/audio_encoder_g722.h >index b97fe1b147890d0854276d1311208e99b10ccf9a..327c0af04aa7ce0455841f981c662abf862b6dbd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g722/audio_encoder_g722.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g722/audio_encoder_g722.h >@@ -19,14 +19,13 @@ > #include "api/audio_codecs/audio_encoder.h" > #include "api/audio_codecs/audio_format.h" > #include "api/audio_codecs/g722/audio_encoder_g722_config.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > > // G722 encoder API for use as a template parameter to > // CreateAudioEncoderFactory<...>(). >-// >-// NOTE: This struct is still under development and may change without notice. >-struct AudioEncoderG722 { >+struct RTC_EXPORT AudioEncoderG722 { > using Config = AudioEncoderG722Config; > static absl::optional<AudioEncoderG722Config> SdpToConfig( > const SdpAudioFormat& audio_format); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g722/audio_encoder_g722_config.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g722/audio_encoder_g722_config.h >index 773e430ce3c0c8c06ad2e5c55a0453914038274d..287898589fcaaa02fbbd895a1cfbfc97d3891c99 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g722/audio_encoder_g722_config.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/g722/audio_encoder_g722_config.h >@@ -13,7 +13,6 @@ > > namespace webrtc { > >-// NOTE: This struct is still under development and may change without notice. 
> struct AudioEncoderG722Config { > bool IsOk() const { > return frame_size_ms > 0 && frame_size_ms % 10 == 0 && num_channels >= 1; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/BUILD.gn >index d4f504fd4f9137cbcffc42818e8c04aa9c21dfc7..d766fa4a97a138c62aa44206ee14944d30da4dce 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/BUILD.gn >@@ -29,11 +29,11 @@ rtc_static_library("audio_encoder_ilbc") { > deps = [ > ":audio_encoder_ilbc_config", > "..:audio_codecs_api", >- "../../..:webrtc_common", > "../../../modules/audio_coding:ilbc", > "../../../rtc_base:rtc_base_approved", > "../../../rtc_base:safe_minmax", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings", > "//third_party/abseil-cpp/absl/types:optional", > ] > } >@@ -47,10 +47,10 @@ rtc_static_library("audio_decoder_ilbc") { > ] > deps = [ > "..:audio_codecs_api", >- "../../..:webrtc_common", > "../../../modules/audio_coding:ilbc", > "../../../rtc_base:rtc_base_approved", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings", > "//third_party/abseil-cpp/absl/types:optional", > ] > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_decoder_ilbc.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_decoder_ilbc.cc >index 1f4c475c406428b771634cdefc3541e6937dfc72..4a00f8dbf0054c6a3a6a169a1ea1faa6ead19ebf 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_decoder_ilbc.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_decoder_ilbc.cc >@@ -14,14 +14,14 @@ > #include <vector> > > #include "absl/memory/memory.h" >-#include "common_types.h" // NOLINT(build/include) >+#include "absl/strings/match.h" > #include 
"modules/audio_coding/codecs/ilbc/audio_decoder_ilbc.h" > > namespace webrtc { > > absl::optional<AudioDecoderIlbc::Config> AudioDecoderIlbc::SdpToConfig( > const SdpAudioFormat& format) { >- return STR_CASE_CMP(format.name.c_str(), "ILBC") == 0 && >+ return absl::EqualsIgnoreCase(format.name, "ILBC") && > format.clockrate_hz == 8000 && format.num_channels == 1 > ? absl::optional<Config>(Config()) > : absl::nullopt; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_decoder_ilbc.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_decoder_ilbc.h >index 20f6ffd287f9c8f0976b850872e164f6a7ed8255..9ab847977d81127169903c065bab3d9f5b34ae86 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_decoder_ilbc.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_decoder_ilbc.h >@@ -23,8 +23,6 @@ namespace webrtc { > > // ILBC decoder API for use as a template parameter to > // CreateAudioDecoderFactory<...>(). >-// >-// NOTE: This struct is still under development and may change without notice. > struct AudioDecoderIlbc { > struct Config {}; // Empty---no config values needed! 
> static absl::optional<Config> SdpToConfig(const SdpAudioFormat& audio_format); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_encoder_ilbc.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_encoder_ilbc.cc >index 2ae75474ccd8dea9d503de7f80af541d0d8050e1..4d028c9aaf4c0884c38b1034ea07300fe72125d2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_encoder_ilbc.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_encoder_ilbc.cc >@@ -14,7 +14,7 @@ > #include <vector> > > #include "absl/memory/memory.h" >-#include "common_types.h" // NOLINT(build/include) >+#include "absl/strings/match.h" > #include "modules/audio_coding/codecs/ilbc/audio_encoder_ilbc.h" > #include "rtc_base/numerics/safe_conversions.h" > #include "rtc_base/numerics/safe_minmax.h" >@@ -40,7 +40,7 @@ int GetIlbcBitrate(int ptime) { > > absl::optional<AudioEncoderIlbcConfig> AudioEncoderIlbc::SdpToConfig( > const SdpAudioFormat& format) { >- if (STR_CASE_CMP(format.name.c_str(), "ILBC") != 0 || >+ if (!absl::EqualsIgnoreCase(format.name.c_str(), "ILBC") || > format.clockrate_hz != 8000 || format.num_channels != 1) { > return absl::nullopt; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_encoder_ilbc.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_encoder_ilbc.h >index 0a86b162e81c628ccfbfdd907a01dd8a895e533b..e4aeca70de840e174621398541e9561b34285027 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_encoder_ilbc.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_encoder_ilbc.h >@@ -24,8 +24,6 @@ namespace webrtc { > > // ILBC encoder API for use as a template parameter to > // CreateAudioEncoderFactory<...>(). >-// >-// NOTE: This struct is still under development and may change without notice. 
> struct AudioEncoderIlbc { > using Config = AudioEncoderIlbcConfig; > static absl::optional<AudioEncoderIlbcConfig> SdpToConfig( >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_encoder_ilbc_config.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_encoder_ilbc_config.h >index 22909a957bb257b683983c129c757187c0e86664..4d82f9901cdfe193a4da543d5830ebad20f9947a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_encoder_ilbc_config.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_encoder_ilbc_config.h >@@ -13,7 +13,6 @@ > > namespace webrtc { > >-// NOTE: This struct is still under development and may change without notice. > struct AudioEncoderIlbcConfig { > bool IsOk() const { > return (frame_size_ms == 20 || frame_size_ms == 30 || frame_size_ms == 40 || >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/BUILD.gn >index ed9d9622577cf6aa1cdc46f68d0d808147a01d5c..c7d6e43271529113cf1fd56fd8dcd80fcdaf0a10 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/BUILD.gn >@@ -77,10 +77,10 @@ rtc_static_library("audio_encoder_isac_fix") { > ] > deps = [ > "..:audio_codecs_api", >- "../../..:webrtc_common", > "../../../modules/audio_coding:isac_fix", > "../../../rtc_base:rtc_base_approved", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings", > "//third_party/abseil-cpp/absl/types:optional", > ] > } >@@ -94,10 +94,10 @@ rtc_static_library("audio_decoder_isac_fix") { > ] > deps = [ > "..:audio_codecs_api", >- "../../..:webrtc_common", > "../../../modules/audio_coding:isac_fix", > "../../../rtc_base:rtc_base_approved", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings", > 
"//third_party/abseil-cpp/absl/types:optional", > ] > } >@@ -111,10 +111,11 @@ rtc_static_library("audio_encoder_isac_float") { > ] > deps = [ > "..:audio_codecs_api", >- "../../..:webrtc_common", > "../../../modules/audio_coding:isac", > "../../../rtc_base:rtc_base_approved", >+ "../../../rtc_base/system:rtc_export", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings", > "//third_party/abseil-cpp/absl/types:optional", > ] > } >@@ -128,10 +129,11 @@ rtc_static_library("audio_decoder_isac_float") { > ] > deps = [ > "..:audio_codecs_api", >- "../../..:webrtc_common", > "../../../modules/audio_coding:isac", > "../../../rtc_base:rtc_base_approved", >+ "../../../rtc_base/system:rtc_export", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings", > "//third_party/abseil-cpp/absl/types:optional", > ] > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_decoder_isac_fix.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_decoder_isac_fix.cc >index 446640ff70c06b4808528bbe720f3ebbc967a3e7..51ae572fa982e5e49ba972e973e1709e58d1087b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_decoder_isac_fix.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_decoder_isac_fix.cc >@@ -11,14 +11,14 @@ > #include "api/audio_codecs/isac/audio_decoder_isac_fix.h" > > #include "absl/memory/memory.h" >-#include "common_types.h" // NOLINT(build/include) >+#include "absl/strings/match.h" > #include "modules/audio_coding/codecs/isac/fix/include/audio_decoder_isacfix.h" > > namespace webrtc { > > absl::optional<AudioDecoderIsacFix::Config> AudioDecoderIsacFix::SdpToConfig( > const SdpAudioFormat& format) { >- return STR_CASE_CMP(format.name.c_str(), "ISAC") == 0 && >+ return absl::EqualsIgnoreCase(format.name, "ISAC") && > format.clockrate_hz == 16000 && format.num_channels == 1 > ? 
absl::optional<Config>(Config()) > : absl::nullopt; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_decoder_isac_fix.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_decoder_isac_fix.h >index a4ce6858385852d1c3b6adeeca5413b6c4f6bec6..b7a5cef6dc5cd73629949e12f08668e992b60b2b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_decoder_isac_fix.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_decoder_isac_fix.h >@@ -23,8 +23,6 @@ namespace webrtc { > > // iSAC decoder API (fixed-point implementation) for use as a template > // parameter to CreateAudioDecoderFactory<...>(). >-// >-// NOTE: This struct is still under development and may change without notice. > struct AudioDecoderIsacFix { > struct Config {}; // Empty---no config values needed! > static absl::optional<Config> SdpToConfig(const SdpAudioFormat& audio_format); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_decoder_isac_float.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_decoder_isac_float.cc >index 1c1926f1a1ebfbc0ae8c3647ab4f2cad198cdfa1..d9de3a08b30121849f99c644c320e7049a25e00d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_decoder_isac_float.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_decoder_isac_float.cc >@@ -11,14 +11,14 @@ > #include "api/audio_codecs/isac/audio_decoder_isac_float.h" > > #include "absl/memory/memory.h" >-#include "common_types.h" // NOLINT(build/include) >+#include "absl/strings/match.h" > #include "modules/audio_coding/codecs/isac/main/include/audio_decoder_isac.h" > > namespace webrtc { > > absl::optional<AudioDecoderIsacFloat::Config> > AudioDecoderIsacFloat::SdpToConfig(const SdpAudioFormat& format) { >- if (STR_CASE_CMP(format.name.c_str(), "ISAC") == 0 && >+ if (absl::EqualsIgnoreCase(format.name, "ISAC") && > 
(format.clockrate_hz == 16000 || format.clockrate_hz == 32000) && > format.num_channels == 1) { > Config config; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_decoder_isac_float.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_decoder_isac_float.h >index cc139634196ddf5ee3d2a43f6726834d66388ec8..e78f8b81ee01fd634a07711a36fb45a64a6d3da9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_decoder_isac_float.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_decoder_isac_float.h >@@ -18,14 +18,13 @@ > #include "api/audio_codecs/audio_codec_pair_id.h" > #include "api/audio_codecs/audio_decoder.h" > #include "api/audio_codecs/audio_format.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > > // iSAC decoder API (floating-point implementation) for use as a template > // parameter to CreateAudioDecoderFactory<...>(). >-// >-// NOTE: This struct is still under development and may change without notice. 
>-struct AudioDecoderIsacFloat { >+struct RTC_EXPORT AudioDecoderIsacFloat { > struct Config { > bool IsOk() const { > return sample_rate_hz == 16000 || sample_rate_hz == 32000; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_encoder_isac_fix.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_encoder_isac_fix.cc >index cd8975398b4178c9e985495cbc4ad7c055d5c66c..a10d1ee61ae8f896cd2c4cad4b368aec4d0a8650 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_encoder_isac_fix.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_encoder_isac_fix.cc >@@ -11,7 +11,7 @@ > #include "api/audio_codecs/isac/audio_encoder_isac_fix.h" > > #include "absl/memory/memory.h" >-#include "common_types.h" // NOLINT(build/include) >+#include "absl/strings/match.h" > #include "modules/audio_coding/codecs/isac/fix/include/audio_encoder_isacfix.h" > #include "rtc_base/string_to_number.h" > >@@ -19,7 +19,7 @@ namespace webrtc { > > absl::optional<AudioEncoderIsacFix::Config> AudioEncoderIsacFix::SdpToConfig( > const SdpAudioFormat& format) { >- if (STR_CASE_CMP(format.name.c_str(), "ISAC") == 0 && >+ if (absl::EqualsIgnoreCase(format.name, "ISAC") && > format.clockrate_hz == 16000 && format.num_channels == 1) { > Config config; > const auto ptime_iter = format.parameters.find("ptime"); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_encoder_isac_fix.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_encoder_isac_fix.h >index 731e48d0c27668aa77c3b08a56b1610edb6bc5a6..c3c3672b583d1959cd0c01c2f48f21681d9e1e30 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_encoder_isac_fix.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_encoder_isac_fix.h >@@ -23,8 +23,6 @@ namespace webrtc { > > // iSAC encoder API (fixed-point implementation) for use as a template > // 
parameter to CreateAudioEncoderFactory<...>(). >-// >-// NOTE: This struct is still under development and may change without notice. > struct AudioEncoderIsacFix { > struct Config { > bool IsOk() const { return frame_size_ms == 30 || frame_size_ms == 60; } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_encoder_isac_float.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_encoder_isac_float.cc >index 83d1faff2fc24ff20bff42c51dd226364ba30103..37982b1f15872f73490578aa4a35bf114684706d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_encoder_isac_float.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_encoder_isac_float.cc >@@ -11,7 +11,7 @@ > #include "api/audio_codecs/isac/audio_encoder_isac_float.h" > > #include "absl/memory/memory.h" >-#include "common_types.h" // NOLINT(build/include) >+#include "absl/strings/match.h" > #include "modules/audio_coding/codecs/isac/main/include/audio_encoder_isac.h" > #include "rtc_base/string_to_number.h" > >@@ -19,7 +19,7 @@ namespace webrtc { > > absl::optional<AudioEncoderIsacFloat::Config> > AudioEncoderIsacFloat::SdpToConfig(const SdpAudioFormat& format) { >- if (STR_CASE_CMP(format.name.c_str(), "ISAC") == 0 && >+ if (absl::EqualsIgnoreCase(format.name, "ISAC") && > (format.clockrate_hz == 16000 || format.clockrate_hz == 32000) && > format.num_channels == 1) { > Config config; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_encoder_isac_float.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_encoder_isac_float.h >index 5df2dd3d6f4b065d74d065fd2e06bd35fa85b320..0cb9c17d7105ed3865c82dab909a110416ec2589 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_encoder_isac_float.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/isac/audio_encoder_isac_float.h >@@ -18,14 +18,13 @@ > #include 
"api/audio_codecs/audio_codec_pair_id.h" > #include "api/audio_codecs/audio_encoder.h" > #include "api/audio_codecs/audio_format.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > > // iSAC encoder API (floating-point implementation) for use as a template > // parameter to CreateAudioEncoderFactory<...>(). >-// >-// NOTE: This struct is still under development and may change without notice. >-struct AudioEncoderIsacFloat { >+struct RTC_EXPORT AudioEncoderIsacFloat { > struct Config { > bool IsOk() const { > switch (sample_rate_hz) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/opus/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/opus/BUILD.gn >index 800abbe279bfe3a65d16bb637bcdff053f092e7a..5552c21fa51079676d89705a0992b13e94ff7d83 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/opus/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/opus/BUILD.gn >@@ -20,6 +20,7 @@ rtc_static_library("audio_encoder_opus_config") { > ] > deps = [ > "../../../rtc_base:rtc_base_approved", >+ "../../../rtc_base/system:rtc_export", > "//third_party/abseil-cpp/absl/types:optional", > ] > defines = [] >@@ -44,6 +45,8 @@ rtc_source_set("audio_encoder_opus") { > "..:audio_codecs_api", > "../../../modules/audio_coding:webrtc_opus", > "../../../rtc_base:rtc_base_approved", >+ "../../../rtc_base/system:rtc_export", >+ "//third_party/abseil-cpp/absl/strings", > "//third_party/abseil-cpp/absl/types:optional", > ] > } >@@ -57,10 +60,11 @@ rtc_static_library("audio_decoder_opus") { > ] > deps = [ > "..:audio_codecs_api", >- "../../..:webrtc_common", > "../../../modules/audio_coding:webrtc_opus", > "../../../rtc_base:rtc_base_approved", >+ "../../../rtc_base/system:rtc_export", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings", > "//third_party/abseil-cpp/absl/types:optional", > ] > } >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/opus/audio_decoder_opus.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/opus/audio_decoder_opus.cc >index 41397f04d2024893e26922cde0edb61542b21f9f..2f1668b825bc020908d983981a512bf7f3760dd4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/opus/audio_decoder_opus.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/opus/audio_decoder_opus.cc >@@ -15,7 +15,7 @@ > #include <vector> > > #include "absl/memory/memory.h" >-#include "common_types.h" // NOLINT(build/include) >+#include "absl/strings/match.h" > #include "modules/audio_coding/codecs/opus/audio_decoder_opus.h" > > namespace webrtc { >@@ -35,7 +35,7 @@ absl::optional<AudioDecoderOpus::Config> AudioDecoderOpus::SdpToConfig( > } > return 1; // Default to mono. > }(); >- if (STR_CASE_CMP(format.name.c_str(), "opus") == 0 && >+ if (absl::EqualsIgnoreCase(format.name, "opus") && > format.clockrate_hz == 48000 && format.num_channels == 2 && > num_channels) { > return Config{*num_channels}; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/opus/audio_decoder_opus.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/opus/audio_decoder_opus.h >index de26026b7833979b3fed7ad9b5740e3ea720607a..6fbbcb598a14d77ae6dbf246bb387e4f8b8b6d49 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/opus/audio_decoder_opus.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/opus/audio_decoder_opus.h >@@ -18,14 +18,13 @@ > #include "api/audio_codecs/audio_codec_pair_id.h" > #include "api/audio_codecs/audio_decoder.h" > #include "api/audio_codecs/audio_format.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > > // Opus decoder API for use as a template parameter to > // CreateAudioDecoderFactory<...>(). >-// >-// NOTE: This struct is still under development and may change without notice. 
>-struct AudioDecoderOpus { >+struct RTC_EXPORT AudioDecoderOpus { > struct Config { > int num_channels; > }; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/opus/audio_encoder_opus.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/opus/audio_encoder_opus.h >index 20aaaf726f93149490320d082e02fbb0df33ddbc..03cb0d6b38c8334569fb668872ae4d8e76a4d640 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/opus/audio_encoder_opus.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/opus/audio_encoder_opus.h >@@ -19,14 +19,13 @@ > #include "api/audio_codecs/audio_encoder.h" > #include "api/audio_codecs/audio_format.h" > #include "api/audio_codecs/opus/audio_encoder_opus_config.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > > // Opus encoder API for use as a template parameter to > // CreateAudioEncoderFactory<...>(). >-// >-// NOTE: This struct is still under development and may change without notice. >-struct AudioEncoderOpus { >+struct RTC_EXPORT AudioEncoderOpus { > using Config = AudioEncoderOpusConfig; > static absl::optional<AudioEncoderOpusConfig> SdpToConfig( > const SdpAudioFormat& audio_format); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/opus/audio_encoder_opus_config.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/opus/audio_encoder_opus_config.h >index c7067bb1c324186f2b6ddc56731c5ebbe2ad3567..0a3ee2c8dfb0d40d8a2c12080c0513f5dd5061cc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/opus/audio_encoder_opus_config.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/opus/audio_encoder_opus_config.h >@@ -16,11 +16,11 @@ > #include <vector> > > #include "absl/types/optional.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > >-// NOTE: This struct is still under development and may change without notice. 
>-struct AudioEncoderOpusConfig { >+struct RTC_EXPORT AudioEncoderOpusConfig { > static constexpr int kDefaultFrameSizeMs = 20; > > // Opus API allows a min bitrate of 500bps, but Opus documentation suggests >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_options.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_options.cc >index a4a37d219a43c9ea25765ea59cc3a3716a9593fa..e33214bad04d6ce9ba4f482fcc450d8ecbba4f43 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_options.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_options.cc >@@ -49,8 +49,9 @@ void AudioOptions::SetAll(const AudioOptions& change) { > change.audio_jitter_buffer_max_packets); > SetFrom(&audio_jitter_buffer_fast_accelerate, > change.audio_jitter_buffer_fast_accelerate); >+ SetFrom(&audio_jitter_buffer_min_delay_ms, >+ change.audio_jitter_buffer_min_delay_ms); > SetFrom(&typing_detection, change.typing_detection); >- SetFrom(&aecm_generate_comfort_noise, change.aecm_generate_comfort_noise); > SetFrom(&experimental_agc, change.experimental_agc); > SetFrom(&extended_filter_aec, change.extended_filter_aec); > SetFrom(&delay_agnostic_aec, change.delay_agnostic_aec); >@@ -77,8 +78,9 @@ bool AudioOptions::operator==(const AudioOptions& o) const { > audio_jitter_buffer_max_packets == o.audio_jitter_buffer_max_packets && > audio_jitter_buffer_fast_accelerate == > o.audio_jitter_buffer_fast_accelerate && >+ audio_jitter_buffer_min_delay_ms == >+ o.audio_jitter_buffer_min_delay_ms && > typing_detection == o.typing_detection && >- aecm_generate_comfort_noise == o.aecm_generate_comfort_noise && > experimental_agc == o.experimental_agc && > extended_filter_aec == o.extended_filter_aec && > delay_agnostic_aec == o.delay_agnostic_aec && >@@ -109,8 +111,9 @@ std::string AudioOptions::ToString() const { > audio_jitter_buffer_max_packets); > ToStringIfSet(&result, "audio_jitter_buffer_fast_accelerate", > audio_jitter_buffer_fast_accelerate); >+ 
ToStringIfSet(&result, "audio_jitter_buffer_min_delay_ms", >+ audio_jitter_buffer_min_delay_ms); > ToStringIfSet(&result, "typing", typing_detection); >- ToStringIfSet(&result, "comfort_noise", aecm_generate_comfort_noise); > ToStringIfSet(&result, "experimental_agc", experimental_agc); > ToStringIfSet(&result, "extended_filter_aec", extended_filter_aec); > ToStringIfSet(&result, "delay_agnostic_aec", delay_agnostic_aec); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_options.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_options.h >index aecff414298cd248ee42c106f2d530e53e632551..c2d1f4487c36874d5389d6d54a9b1ea4218b5842 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_options.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_options.h >@@ -11,6 +11,7 @@ > #ifndef API_AUDIO_OPTIONS_H_ > #define API_AUDIO_OPTIONS_H_ > >+#include <stdint.h> > #include <string> > > #include "absl/types/optional.h" >@@ -53,9 +54,10 @@ struct AudioOptions { > absl::optional<int> audio_jitter_buffer_max_packets; > // Audio receiver jitter buffer (NetEq) fast accelerate mode. > absl::optional<bool> audio_jitter_buffer_fast_accelerate; >+ // Audio receiver jitter buffer (NetEq) minimum target delay in milliseconds. >+ absl::optional<int> audio_jitter_buffer_min_delay_ms; > // Audio processing to detect typing. 
> absl::optional<bool> typing_detection; >- absl::optional<bool> aecm_generate_comfort_noise; > absl::optional<bool> experimental_agc; > absl::optional<bool> extended_filter_aec; > absl::optional<bool> delay_agnostic_aec; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/call/bitrate_allocation.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/call/bitrate_allocation.h >new file mode 100644 >index 0000000000000000000000000000000000000000..2d7f21bc1e941ca49bf85d0c7d36ff3630a16517 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/call/bitrate_allocation.h >@@ -0,0 +1,41 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+#ifndef API_CALL_BITRATE_ALLOCATION_H_ >+#define API_CALL_BITRATE_ALLOCATION_H_ >+ >+#include "api/units/data_rate.h" >+#include "api/units/time_delta.h" >+ >+namespace webrtc { >+ >+// BitrateAllocationUpdate provides information to allocated streams about their >+// bitrate allocation. It originates from the BitrateAllocater class and is >+// propagated from there. >+struct BitrateAllocationUpdate { >+ // The allocated target bitrate. Media streams should produce this amount of >+ // data. (Note that this may include packet overhead depending on >+ // configuration.) >+ DataRate target_bitrate = DataRate::Zero(); >+ // The allocated part of the estimated link capacity. This is more stable than >+ // the target as it is based on the underlying link capacity estimate. This >+ // should be used to change encoder configuration when the cost of change is >+ // high. >+ DataRate link_capacity = DataRate::Zero(); >+ // Predicted packet loss ratio. 
>+ double packet_loss_ratio = 0; >+ // Predicted round trip time. >+ TimeDelta round_trip_time = TimeDelta::PlusInfinity(); >+ // |bwe_period| is deprecated, use the link capacity allocation instead. >+ TimeDelta bwe_period = TimeDelta::PlusInfinity(); >+}; >+ >+} // namespace webrtc >+ >+#endif // API_CALL_BITRATE_ALLOCATION_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/call/transport.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/call/transport.cc >index 0a9dd5bcc7e15851b448fc27811eb4b5e1707dbb..bcadc762de4d1934f4083dac18f25ab8dda237b8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/call/transport.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/call/transport.cc >@@ -10,6 +10,8 @@ > > #include "api/call/transport.h" > >+#include <cstdint> >+ > namespace webrtc { > > PacketOptions::PacketOptions() = default; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/call/transport.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/call/transport.h >index 18d22705cd6df915c1c6234efa84cf3b59e99796..32e5ddf874fb1c4cc10b5f1bfd913ede31151872 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/call/transport.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/call/transport.h >@@ -32,6 +32,8 @@ struct PacketOptions { > std::vector<uint8_t> application_data; > // Whether this is a retransmission of an earlier packet. 
> bool is_retransmit = false; >+ bool included_in_feedback = false; >+ bool included_in_allocation = false; > }; > > class Transport { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/candidate.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/candidate.cc >index c2c6c530172cce1b5e19e50cffa601ab807285f3..275b17366c21cdcd0e7d4e5ea3c7808ee5473ca2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/candidate.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/candidate.cc >@@ -10,6 +10,8 @@ > > #include "api/candidate.h" > >+#include "rtc_base/helpers.h" >+#include "rtc_base/ipaddress.h" > #include "rtc_base/strings/string_builder.h" > > namespace cricket { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/candidate.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/candidate.h >index 6e0547b5e2fd84e8df013887f50adfd4170440ff..4c650d9bdbd52d3c34de0f35406d17b802cfac08 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/candidate.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/candidate.h >@@ -18,16 +18,16 @@ > #include <string> > > #include "rtc_base/checks.h" >-#include "rtc_base/helpers.h" > #include "rtc_base/network_constants.h" > #include "rtc_base/socketaddress.h" >+#include "rtc_base/system/rtc_export.h" > > namespace cricket { > > // Candidate for ICE based connection discovery. > // TODO(phoglund): remove things in here that are not needed in the public API. 
> >-class Candidate { >+class RTC_EXPORT Candidate { > public: > Candidate(); > // TODO(pthatcher): Match the ordering and param list as per RFC 5245 >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/create_peerconnection_factory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/create_peerconnection_factory.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..1a6d086f6600d17c15c069da86220c29578d70f6 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/create_peerconnection_factory.cc >@@ -0,0 +1,182 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "api/create_peerconnection_factory.h" >+ >+#include <memory> >+#include <utility> >+ >+#include "api/call/callfactoryinterface.h" >+#include "api/peerconnectioninterface.h" >+#include "api/video_codecs/video_decoder_factory.h" >+#include "api/video_codecs/video_encoder_factory.h" >+#include "logging/rtc_event_log/rtc_event_log_factory.h" >+#include "logging/rtc_event_log/rtc_event_log_factory_interface.h" >+#include "media/engine/webrtcmediaengine.h" >+#include "modules/audio_device/include/audio_device.h" >+#include "modules/audio_processing/include/audio_processing.h" >+#include "rtc_base/bind.h" >+#include "rtc_base/scoped_ref_ptr.h" >+#include "rtc_base/thread.h" >+ >+namespace webrtc { >+ >+#if defined(USE_BUILTIN_SW_CODECS) >+rtc::scoped_refptr<PeerConnectionFactoryInterface> CreatePeerConnectionFactory( >+ rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >+ rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory) { >+ return CreatePeerConnectionFactoryWithAudioMixer( 
>+ nullptr /*network_thread*/, nullptr /*worker_thread*/, >+ nullptr /*signaling_thread*/, nullptr /*default_adm*/, >+ audio_encoder_factory, audio_decoder_factory, >+ nullptr /*video_encoder_factory*/, nullptr /*video_decoder_factory*/, >+ nullptr /*audio_mixer*/); >+} >+ >+// Note: all the other CreatePeerConnectionFactory variants just end up calling >+// this, ultimately. >+rtc::scoped_refptr<PeerConnectionFactoryInterface> CreatePeerConnectionFactory( >+ rtc::Thread* network_thread, >+ rtc::Thread* worker_thread, >+ rtc::Thread* signaling_thread, >+ AudioDeviceModule* default_adm, >+ rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >+ rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory, >+ cricket::WebRtcVideoEncoderFactory* video_encoder_factory, >+ cricket::WebRtcVideoDecoderFactory* video_decoder_factory, >+ rtc::scoped_refptr<AudioMixer> audio_mixer, >+ rtc::scoped_refptr<AudioProcessing> audio_processing) { >+ rtc::scoped_refptr<AudioProcessing> audio_processing_use = audio_processing; >+ if (!audio_processing_use) { >+ audio_processing_use = AudioProcessingBuilder().Create(); >+ } >+ >+ std::unique_ptr<cricket::MediaEngineInterface> media_engine( >+ cricket::WebRtcMediaEngineFactory::Create( >+ default_adm, audio_encoder_factory, audio_decoder_factory, >+ video_encoder_factory, video_decoder_factory, audio_mixer, >+ audio_processing_use)); >+ >+ std::unique_ptr<CallFactoryInterface> call_factory = CreateCallFactory(); >+ >+ std::unique_ptr<RtcEventLogFactoryInterface> event_log_factory = >+ CreateRtcEventLogFactory(); >+ >+ return CreateModularPeerConnectionFactory( >+ network_thread, worker_thread, signaling_thread, std::move(media_engine), >+ std::move(call_factory), std::move(event_log_factory)); >+} >+ >+rtc::scoped_refptr<PeerConnectionFactoryInterface> CreatePeerConnectionFactory( >+ rtc::Thread* network_thread, >+ rtc::Thread* worker_thread, >+ rtc::Thread* signaling_thread, >+ AudioDeviceModule* default_adm, >+ 
rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >+ rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory, >+ cricket::WebRtcVideoEncoderFactory* video_encoder_factory, >+ cricket::WebRtcVideoDecoderFactory* video_decoder_factory, >+ rtc::scoped_refptr<AudioMixer> audio_mixer, >+ rtc::scoped_refptr<AudioProcessing> audio_processing, >+ std::unique_ptr<FecControllerFactoryInterface> fec_controller_factory, >+ std::unique_ptr<NetworkControllerFactoryInterface> >+ network_controller_factory) { >+ rtc::scoped_refptr<AudioProcessing> audio_processing_use = audio_processing; >+ if (!audio_processing_use) { >+ audio_processing_use = AudioProcessingBuilder().Create(); >+ } >+ >+ std::unique_ptr<cricket::MediaEngineInterface> media_engine( >+ cricket::WebRtcMediaEngineFactory::Create( >+ default_adm, audio_encoder_factory, audio_decoder_factory, >+ video_encoder_factory, video_decoder_factory, audio_mixer, >+ audio_processing_use)); >+ >+ std::unique_ptr<CallFactoryInterface> call_factory = CreateCallFactory(); >+ >+ std::unique_ptr<RtcEventLogFactoryInterface> event_log_factory = >+ CreateRtcEventLogFactory(); >+ >+ return CreateModularPeerConnectionFactory( >+ network_thread, worker_thread, signaling_thread, std::move(media_engine), >+ std::move(call_factory), std::move(event_log_factory), >+ std::move(fec_controller_factory), std::move(network_controller_factory)); >+} >+#endif >+ >+rtc::scoped_refptr<PeerConnectionFactoryInterface> CreatePeerConnectionFactory( >+ rtc::Thread* network_thread, >+ rtc::Thread* worker_thread, >+ rtc::Thread* signaling_thread, >+ rtc::scoped_refptr<AudioDeviceModule> default_adm, >+ rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >+ rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory, >+ std::unique_ptr<VideoEncoderFactory> video_encoder_factory, >+ std::unique_ptr<VideoDecoderFactory> video_decoder_factory, >+ rtc::scoped_refptr<AudioMixer> audio_mixer, >+ rtc::scoped_refptr<AudioProcessing> 
audio_processing) { >+ if (!audio_processing) >+ audio_processing = AudioProcessingBuilder().Create(); >+ >+ std::unique_ptr<cricket::MediaEngineInterface> media_engine = >+ cricket::WebRtcMediaEngineFactory::Create( >+ default_adm, audio_encoder_factory, audio_decoder_factory, >+ std::move(video_encoder_factory), std::move(video_decoder_factory), >+ audio_mixer, audio_processing); >+ >+ std::unique_ptr<CallFactoryInterface> call_factory = CreateCallFactory(); >+ >+ std::unique_ptr<RtcEventLogFactoryInterface> event_log_factory = >+ CreateRtcEventLogFactory(); >+ PeerConnectionFactoryDependencies dependencies; >+ dependencies.network_thread = network_thread; >+ dependencies.worker_thread = worker_thread; >+ dependencies.signaling_thread = signaling_thread; >+ dependencies.media_engine = std::move(media_engine); >+ dependencies.call_factory = std::move(call_factory); >+ dependencies.event_log_factory = std::move(event_log_factory); >+ return CreateModularPeerConnectionFactory(std::move(dependencies)); >+} >+ >+#if defined(USE_BUILTIN_SW_CODECS) >+rtc::scoped_refptr<PeerConnectionFactoryInterface> >+CreatePeerConnectionFactoryWithAudioMixer( >+ rtc::Thread* network_thread, >+ rtc::Thread* worker_thread, >+ rtc::Thread* signaling_thread, >+ AudioDeviceModule* default_adm, >+ rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >+ rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory, >+ cricket::WebRtcVideoEncoderFactory* video_encoder_factory, >+ cricket::WebRtcVideoDecoderFactory* video_decoder_factory, >+ rtc::scoped_refptr<AudioMixer> audio_mixer) { >+ return CreatePeerConnectionFactory( >+ network_thread, worker_thread, signaling_thread, default_adm, >+ audio_encoder_factory, audio_decoder_factory, video_encoder_factory, >+ video_decoder_factory, audio_mixer, nullptr); >+} >+ >+rtc::scoped_refptr<PeerConnectionFactoryInterface> CreatePeerConnectionFactory( >+ rtc::Thread* network_thread, >+ rtc::Thread* worker_thread, >+ rtc::Thread* 
signaling_thread, >+ AudioDeviceModule* default_adm, >+ rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >+ rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory, >+ cricket::WebRtcVideoEncoderFactory* video_encoder_factory, >+ cricket::WebRtcVideoDecoderFactory* video_decoder_factory) { >+ return CreatePeerConnectionFactoryWithAudioMixer( >+ network_thread, worker_thread, signaling_thread, default_adm, >+ audio_encoder_factory, audio_decoder_factory, video_encoder_factory, >+ video_decoder_factory, nullptr); >+} >+#endif >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/create_peerconnection_factory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/create_peerconnection_factory.h >new file mode 100644 >index 0000000000000000000000000000000000000000..baa50c7bd8063d562395e01f7f2330212efaf4e1 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/create_peerconnection_factory.h >@@ -0,0 +1,179 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef API_CREATE_PEERCONNECTION_FACTORY_H_ >+#define API_CREATE_PEERCONNECTION_FACTORY_H_ >+ >+#include <memory> >+ >+#include "api/audio/audio_mixer.h" >+#include "api/audio_codecs/audio_decoder_factory.h" >+#include "api/audio_codecs/audio_encoder_factory.h" >+#include "api/fec_controller.h" >+#include "api/peerconnectioninterface.h" >+#include "api/transport/network_control.h" >+#include "rtc_base/scoped_ref_ptr.h" >+ >+namespace rtc { >+// TODO(bugs.webrtc.org/9987): Move rtc::Thread to api/ or expose a better >+// type. 
At the moment, rtc::Thread is not part of api/ so it cannot be >+// included in order to avoid to leak internal types. >+class Thread; >+} // namespace rtc >+ >+namespace cricket { >+class WebRtcVideoDecoderFactory; >+class WebRtcVideoEncoderFactory; >+} // namespace cricket >+ >+namespace webrtc { >+ >+class AudioDeviceModule; >+class AudioProcessing; >+ >+#if defined(USE_BUILTIN_SW_CODECS) >+// Create a new instance of PeerConnectionFactoryInterface. >+// >+// This method relies on the thread it's called on as the "signaling thread" >+// for the PeerConnectionFactory it creates. >+// >+// As such, if the current thread is not already running an rtc::Thread message >+// loop, an application using this method must eventually either call >+// rtc::Thread::Current()->Run(), or call >+// rtc::Thread::Current()->ProcessMessages() within the application's own >+// message loop. >+RTC_EXPORT rtc::scoped_refptr<PeerConnectionFactoryInterface> >+CreatePeerConnectionFactory( >+ rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >+ rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory); >+ >+// Create a new instance of PeerConnectionFactoryInterface. >+// >+// |network_thread|, |worker_thread| and |signaling_thread| are >+// the only mandatory parameters. >+// >+// If non-null, a reference is added to |default_adm|, and ownership of >+// |video_encoder_factory| and |video_decoder_factory| is transferred to the >+// returned factory. >+// TODO(deadbeef): Use rtc::scoped_refptr<> and std::unique_ptr<> to make this >+// ownership transfer and ref counting more obvious. 
>+RTC_EXPORT rtc::scoped_refptr<PeerConnectionFactoryInterface>
>+CreatePeerConnectionFactory(
>+ rtc::Thread* network_thread,
>+ rtc::Thread* worker_thread,
>+ rtc::Thread* signaling_thread,
>+ AudioDeviceModule* default_adm,
>+ rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory,
>+ rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory,
>+ cricket::WebRtcVideoEncoderFactory* video_encoder_factory,
>+ cricket::WebRtcVideoDecoderFactory* video_decoder_factory);
>+
>+// Create a new instance of PeerConnectionFactoryInterface with optional
>+// external audio mixer and audio processing modules.
>+//
>+// If |audio_mixer| is null, an internal audio mixer will be created and used.
>+// If |audio_processing| is null, an internal audio processing module will be
>+// created and used.
>+RTC_EXPORT rtc::scoped_refptr<PeerConnectionFactoryInterface>
>+CreatePeerConnectionFactory(
>+ rtc::Thread* network_thread,
>+ rtc::Thread* worker_thread,
>+ rtc::Thread* signaling_thread,
>+ AudioDeviceModule* default_adm,
>+ rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory,
>+ rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory,
>+ cricket::WebRtcVideoEncoderFactory* video_encoder_factory,
>+ cricket::WebRtcVideoDecoderFactory* video_decoder_factory,
>+ rtc::scoped_refptr<AudioMixer> audio_mixer,
>+ rtc::scoped_refptr<AudioProcessing> audio_processing);
>+
>+// Create a new instance of PeerConnectionFactoryInterface with optional
>+// external audio mixer, audio processing, and fec controller modules.
>+//
>+// If |audio_mixer| is null, an internal audio mixer will be created and used.
>+// If |audio_processing| is null, an internal audio processing module will be
>+// created and used.
>+// If |fec_controller_factory| is null, an internal fec controller module will
>+// be created and used.
>+// If |network_controller_factory| is provided, it will be used if enabled via
>+// field trial.
>+RTC_EXPORT rtc::scoped_refptr<PeerConnectionFactoryInterface>
>+CreatePeerConnectionFactory(
>+ rtc::Thread* network_thread,
>+ rtc::Thread* worker_thread,
>+ rtc::Thread* signaling_thread,
>+ AudioDeviceModule* default_adm,
>+ rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory,
>+ rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory,
>+ cricket::WebRtcVideoEncoderFactory* video_encoder_factory,
>+ cricket::WebRtcVideoDecoderFactory* video_decoder_factory,
>+ rtc::scoped_refptr<AudioMixer> audio_mixer,
>+ rtc::scoped_refptr<AudioProcessing> audio_processing,
>+ std::unique_ptr<FecControllerFactoryInterface> fec_controller_factory,
>+ std::unique_ptr<NetworkControllerFactoryInterface>
>+ network_controller_factory = nullptr);
>+#endif // defined(USE_BUILTIN_SW_CODECS)
>+
>+// Create a new instance of PeerConnectionFactoryInterface with optional video
>+// codec factories. These video factories represent all video codecs, i.e. no
>+// extra internal video codecs will be added.
>+// When building WebRTC with rtc_use_builtin_sw_codecs = false, this is the
>+// only available CreatePeerConnectionFactory overload.
>+RTC_EXPORT rtc::scoped_refptr<PeerConnectionFactoryInterface>
>+CreatePeerConnectionFactory(
>+ rtc::Thread* network_thread,
>+ rtc::Thread* worker_thread,
>+ rtc::Thread* signaling_thread,
>+ rtc::scoped_refptr<AudioDeviceModule> default_adm,
>+ rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory,
>+ rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory,
>+ std::unique_ptr<VideoEncoderFactory> video_encoder_factory,
>+ std::unique_ptr<VideoDecoderFactory> video_decoder_factory,
>+ rtc::scoped_refptr<AudioMixer> audio_mixer,
>+ rtc::scoped_refptr<AudioProcessing> audio_processing);
>+
>+#if defined(USE_BUILTIN_SW_CODECS)
>+// Create a new instance of PeerConnectionFactoryInterface with external audio
>+// mixer.
>+//
>+// If |audio_mixer| is null, an internal audio mixer will be created and used.
>+RTC_EXPORT rtc::scoped_refptr<PeerConnectionFactoryInterface> >+CreatePeerConnectionFactoryWithAudioMixer( >+ rtc::Thread* network_thread, >+ rtc::Thread* worker_thread, >+ rtc::Thread* signaling_thread, >+ AudioDeviceModule* default_adm, >+ rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >+ rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory, >+ cricket::WebRtcVideoEncoderFactory* video_encoder_factory, >+ cricket::WebRtcVideoDecoderFactory* video_decoder_factory, >+ rtc::scoped_refptr<AudioMixer> audio_mixer); >+ >+// Create a new instance of PeerConnectionFactoryInterface. >+// Same thread is used as worker and network thread. >+RTC_EXPORT inline rtc::scoped_refptr<PeerConnectionFactoryInterface> >+CreatePeerConnectionFactory( >+ rtc::Thread* worker_and_network_thread, >+ rtc::Thread* signaling_thread, >+ AudioDeviceModule* default_adm, >+ rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >+ rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory, >+ cricket::WebRtcVideoEncoderFactory* video_encoder_factory, >+ cricket::WebRtcVideoDecoderFactory* video_decoder_factory) { >+ return CreatePeerConnectionFactory( >+ worker_and_network_thread, worker_and_network_thread, signaling_thread, >+ default_adm, audio_encoder_factory, audio_decoder_factory, >+ video_encoder_factory, video_decoder_factory); >+} >+#endif // defined(USE_BUILTIN_SW_CODECS) >+ >+} // namespace webrtc >+ >+#endif // API_CREATE_PEERCONNECTION_FACTORY_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/crypto/cryptooptions.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/crypto/cryptooptions.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..7f34f19b5385722dc2abaefc9b52f1d06ffb4350 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/crypto/cryptooptions.cc >@@ -0,0 +1,78 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. 
>+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "api/crypto/cryptooptions.h" >+#include "rtc_base/sslstreamadapter.h" >+ >+namespace webrtc { >+ >+CryptoOptions::CryptoOptions() {} >+ >+CryptoOptions::CryptoOptions(const CryptoOptions& other) { >+ srtp = other.srtp; >+ sframe = other.sframe; >+} >+ >+CryptoOptions::~CryptoOptions() {} >+ >+// static >+CryptoOptions CryptoOptions::NoGcm() { >+ CryptoOptions options; >+ options.srtp.enable_gcm_crypto_suites = false; >+ return options; >+} >+ >+std::vector<int> CryptoOptions::GetSupportedDtlsSrtpCryptoSuites() const { >+ std::vector<int> crypto_suites; >+ if (srtp.enable_gcm_crypto_suites) { >+ crypto_suites.push_back(rtc::SRTP_AEAD_AES_256_GCM); >+ crypto_suites.push_back(rtc::SRTP_AEAD_AES_128_GCM); >+ } >+ // Note: SRTP_AES128_CM_SHA1_80 is what is required to be supported (by >+ // draft-ietf-rtcweb-security-arch), but SRTP_AES128_CM_SHA1_32 is allowed as >+ // well, and saves a few bytes per packet if it ends up selected. >+ // As the cipher suite is potentially insecure, it will only be used if >+ // enabled by both peers. 
>+ if (srtp.enable_aes128_sha1_32_crypto_cipher) { >+ crypto_suites.push_back(rtc::SRTP_AES128_CM_SHA1_32); >+ } >+ crypto_suites.push_back(rtc::SRTP_AES128_CM_SHA1_80); >+ return crypto_suites; >+} >+ >+bool CryptoOptions::operator==(const CryptoOptions& other) const { >+ struct data_being_tested_for_equality { >+ struct Srtp { >+ bool enable_gcm_crypto_suites; >+ bool enable_aes128_sha1_32_crypto_cipher; >+ bool enable_encrypted_rtp_header_extensions; >+ } srtp; >+ struct SFrame { >+ bool require_frame_encryption; >+ } sframe; >+ }; >+ static_assert(sizeof(data_being_tested_for_equality) == sizeof(*this), >+ "Did you add something to CryptoOptions and forget to " >+ "update operator==?"); >+ >+ return srtp.enable_gcm_crypto_suites == other.srtp.enable_gcm_crypto_suites && >+ srtp.enable_aes128_sha1_32_crypto_cipher == >+ other.srtp.enable_aes128_sha1_32_crypto_cipher && >+ srtp.enable_encrypted_rtp_header_extensions == >+ other.srtp.enable_encrypted_rtp_header_extensions && >+ sframe.require_frame_encryption == >+ other.sframe.require_frame_encryption; >+} >+ >+bool CryptoOptions::operator!=(const CryptoOptions& other) const { >+ return !(*this == other); >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/crypto/cryptooptions.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/crypto/cryptooptions.h >new file mode 100644 >index 0000000000000000000000000000000000000000..e730ab208959800702857d9f469c94d659a8fb9d >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/crypto/cryptooptions.h >@@ -0,0 +1,66 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#ifndef API_CRYPTO_CRYPTOOPTIONS_H_ >+#define API_CRYPTO_CRYPTOOPTIONS_H_ >+ >+#include <vector> >+ >+namespace webrtc { >+ >+// CryptoOptions defines advanced cryptographic settings for native WebRTC. >+// These settings must be passed into PeerConnectionFactoryInterface::Options >+// and are only applicable to native use cases of WebRTC. >+struct CryptoOptions { >+ CryptoOptions(); >+ CryptoOptions(const CryptoOptions& other); >+ ~CryptoOptions(); >+ >+ // Helper method to return an instance of the CryptoOptions with GCM crypto >+ // suites disabled. This method should be used instead of depending on current >+ // default values set by the constructor. >+ static CryptoOptions NoGcm(); >+ >+ // Returns a list of the supported DTLS-SRTP Crypto suites based on this set >+ // of crypto options. >+ std::vector<int> GetSupportedDtlsSrtpCryptoSuites() const; >+ >+ bool operator==(const CryptoOptions& other) const; >+ bool operator!=(const CryptoOptions& other) const; >+ >+ // SRTP Related Peer Connection options. >+ struct Srtp { >+ // Enable GCM crypto suites from RFC 7714 for SRTP. GCM will only be used >+ // if both sides enable it. >+ bool enable_gcm_crypto_suites = false; >+ >+ // If set to true, the (potentially insecure) crypto cipher >+ // SRTP_AES128_CM_SHA1_32 will be included in the list of supported ciphers >+ // during negotiation. It will only be used if both peers support it and no >+ // other ciphers get preferred. >+ bool enable_aes128_sha1_32_crypto_cipher = false; >+ >+ // If set to true, encrypted RTP header extensions as defined in RFC 6904 >+ // will be negotiated. They will only be used if both peers support them. >+ bool enable_encrypted_rtp_header_extensions = false; >+ } srtp; >+ >+ // Options to be used when the FrameEncryptor / FrameDecryptor APIs are used. >+ struct SFrame { >+ // If set all RtpSenders must have an FrameEncryptor attached to them before >+ // they are allowed to send packets. 
All RtpReceivers must have a >+ // FrameDecryptor attached to them before they are able to receive packets. >+ bool require_frame_encryption = false; >+ } sframe; >+}; >+ >+} // namespace webrtc >+ >+#endif // API_CRYPTO_CRYPTOOPTIONS_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/cryptoparams.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/cryptoparams.h >index 2350528358f26e012b30fa160576c21294e325a3..abe905546229c8327e2d7a9c685f8ee44698db20 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/cryptoparams.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/cryptoparams.h >@@ -16,6 +16,8 @@ > namespace cricket { > > // Parameters for SRTP negotiation, as described in RFC 4568. >+// TODO(benwright) - Rename to SrtpCryptoParams as these only apply to SRTP and >+// not generic crypto parameters for WebRTC. > struct CryptoParams { > CryptoParams() : tag(0) {} > CryptoParams(int t, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/datachannelinterface.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/datachannelinterface.h >index a0d2b3b51a59858b1ba84fdeae8b69133d19cce9..7cb55822337f8045b3f0998546213f1ae1033dc8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/datachannelinterface.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/datachannelinterface.h >@@ -14,6 +14,8 @@ > #ifndef API_DATACHANNELINTERFACE_H_ > #define API_DATACHANNELINTERFACE_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <string> > > #include "rtc_base/checks.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/jsep.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/jsep.cc >index 52a60f95e32e90ad2196b671d837c2a54718e08a..01f5720563d908347f5dcb5f4381b9b21c4bb98b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/jsep.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/jsep.cc >@@ -38,4 +38,32 @@ void SetSessionDescriptionObserver::OnFailure(const std::string& error) { > 
OnFailure(RTCError(RTCErrorType::INTERNAL_ERROR, std::string(error))); > } > >+const char SessionDescriptionInterface::kOffer[] = "offer"; >+const char SessionDescriptionInterface::kPrAnswer[] = "pranswer"; >+const char SessionDescriptionInterface::kAnswer[] = "answer"; >+ >+const char* SdpTypeToString(SdpType type) { >+ switch (type) { >+ case SdpType::kOffer: >+ return SessionDescriptionInterface::kOffer; >+ case SdpType::kPrAnswer: >+ return SessionDescriptionInterface::kPrAnswer; >+ case SdpType::kAnswer: >+ return SessionDescriptionInterface::kAnswer; >+ } >+ return ""; >+} >+ >+absl::optional<SdpType> SdpTypeFromString(const std::string& type_str) { >+ if (type_str == SessionDescriptionInterface::kOffer) { >+ return SdpType::kOffer; >+ } else if (type_str == SessionDescriptionInterface::kPrAnswer) { >+ return SdpType::kPrAnswer; >+ } else if (type_str == SessionDescriptionInterface::kAnswer) { >+ return SdpType::kAnswer; >+ } else { >+ return absl::nullopt; >+ } >+} >+ > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/jsep.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/jsep.h >index 4d4bcc0bfb62ed034b978689bddac7c3b4826266..1c50455d04df9764dfbfa77728b205e5de768043 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/jsep.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/jsep.h >@@ -29,6 +29,7 @@ > #include "absl/types/optional.h" > #include "api/rtcerror.h" > #include "rtc_base/refcount.h" >+#include "rtc_base/system/rtc_export.h" > > namespace cricket { > class Candidate; >@@ -73,13 +74,13 @@ class IceCandidateInterface { > // Creates a IceCandidateInterface based on SDP string. > // Returns null if the sdp string can't be parsed. > // |error| may be null. 
>-IceCandidateInterface* CreateIceCandidate(const std::string& sdp_mid, >- int sdp_mline_index, >- const std::string& sdp, >- SdpParseError* error); >+RTC_EXPORT IceCandidateInterface* CreateIceCandidate(const std::string& sdp_mid, >+ int sdp_mline_index, >+ const std::string& sdp, >+ SdpParseError* error); > > // Creates an IceCandidateInterface based on a parsed candidate structure. >-std::unique_ptr<IceCandidateInterface> CreateIceCandidate( >+RTC_EXPORT std::unique_ptr<IceCandidateInterface> CreateIceCandidate( > const std::string& sdp_mid, > int sdp_mline_index, > const cricket::Candidate& candidate); >@@ -121,7 +122,7 @@ absl::optional<SdpType> SdpTypeFromString(const std::string& type_str); > // and is therefore not expected to be thread safe. > // > // An instance can be created by CreateSessionDescription. >-class SessionDescriptionInterface { >+class RTC_EXPORT SessionDescriptionInterface { > public: > // String representations of the supported SDP types. > static const char kOffer[]; >@@ -181,21 +182,21 @@ class SessionDescriptionInterface { > // |error| may be null. > // TODO(steveanton): This function is deprecated. Please use the functions below > // which take an SdpType enum instead. Remove this once it is no longer used. >-SessionDescriptionInterface* CreateSessionDescription(const std::string& type, >- const std::string& sdp, >- SdpParseError* error); >+RTC_EXPORT SessionDescriptionInterface* CreateSessionDescription( >+ const std::string& type, >+ const std::string& sdp, >+ SdpParseError* error); > > // Creates a SessionDescriptionInterface based on the SDP string and the type. > // Returns null if the SDP string cannot be parsed. > // If using the signature with |error_out|, details of the parsing error may be > // written to |error_out| if it is not null. 
>-std::unique_ptr<SessionDescriptionInterface> CreateSessionDescription( >- SdpType type, >- const std::string& sdp); >-std::unique_ptr<SessionDescriptionInterface> CreateSessionDescription( >- SdpType type, >- const std::string& sdp, >- SdpParseError* error_out); >+RTC_EXPORT std::unique_ptr<SessionDescriptionInterface> >+CreateSessionDescription(SdpType type, const std::string& sdp); >+RTC_EXPORT std::unique_ptr<SessionDescriptionInterface> >+CreateSessionDescription(SdpType type, >+ const std::string& sdp, >+ SdpParseError* error_out); > > // Creates a SessionDescriptionInterface based on a parsed SDP structure and the > // given type, ID and version. >@@ -206,7 +207,8 @@ std::unique_ptr<SessionDescriptionInterface> CreateSessionDescription( > std::unique_ptr<cricket::SessionDescription> description); > > // CreateOffer and CreateAnswer callback interface. >-class CreateSessionDescriptionObserver : public rtc::RefCountInterface { >+class RTC_EXPORT CreateSessionDescriptionObserver >+ : public rtc::RefCountInterface { > public: > // This callback transfers the ownership of the |desc|. > // TODO(deadbeef): Make this take an std::unique_ptr<> to avoid confusion >@@ -227,7 +229,7 @@ class CreateSessionDescriptionObserver : public rtc::RefCountInterface { > }; > > // SetLocalDescription and SetRemoteDescription callback interface. >-class SetSessionDescriptionObserver : public rtc::RefCountInterface { >+class RTC_EXPORT SetSessionDescriptionObserver : public rtc::RefCountInterface { > public: > virtual void OnSuccess() = 0; > // See description in CreateSessionDescriptionObserver for OnFailure. 
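A side note on the new api/jsep.cc helpers earlier in this patch: SdpTypeToString and SdpTypeFromString form a plain enum/string round-trip over the three SDP type constants. The sketch below reproduces that mapping standalone, outside the WebRTC tree; it uses std::optional in place of absl::optional, and string literals in place of the SessionDescriptionInterface::kOffer/kPrAnswer/kAnswer constants.

```cpp
#include <optional>
#include <string>

// Same three SDP types as webrtc::SdpType at this revision.
enum class SdpType { kOffer, kPrAnswer, kAnswer };

// Mirrors SdpTypeToString() from the jsep.cc hunk: every enumerator maps to
// its canonical SDP string. The trailing return is unreachable when the
// switch is exhaustive but keeps compilers quiet.
const char* SdpTypeToString(SdpType type) {
  switch (type) {
    case SdpType::kOffer:
      return "offer";
    case SdpType::kPrAnswer:
      return "pranswer";
    case SdpType::kAnswer:
      return "answer";
  }
  return "";
}

// Mirrors SdpTypeFromString(): an unknown string yields an empty optional
// rather than an error, so callers must check has_value().
std::optional<SdpType> SdpTypeFromString(const std::string& type_str) {
  if (type_str == "offer") return SdpType::kOffer;
  if (type_str == "pranswer") return SdpType::kPrAnswer;
  if (type_str == "answer") return SdpType::kAnswer;
  return std::nullopt;
}
```

The round-trip holds for all three values; anything else, including "rollback", comes back empty at this revision.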
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/jsepicecandidate.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/jsepicecandidate.cc >index b9ba2fea29aaaad62c0bcf832ba8683fc702ec04..ed2f7927f272b34de36d3dc03f0c630041edaa3e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/jsepicecandidate.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/jsepicecandidate.cc >@@ -10,6 +10,9 @@ > > #include "api/jsepicecandidate.h" > >+#include <algorithm> >+#include <utility> >+ > namespace webrtc { > > std::string JsepIceCandidate::sdp_mid() const { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/jsepicecandidate.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/jsepicecandidate.h >index 50520fe7279b2a6696c05d3bfcfea001df027951..9cc7443f3fbbdc784fd7b51bfdba9c1871b438f6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/jsepicecandidate.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/jsepicecandidate.h >@@ -14,8 +14,8 @@ > #ifndef API_JSEPICECANDIDATE_H_ > #define API_JSEPICECANDIDATE_H_ > >+#include <stddef.h> > #include <string> >-#include <utility> > #include <vector> > > #include "api/candidate.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/media_transport_interface.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/media_transport_interface.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..ef223aa788022e7a79f8c9870b7ba8db51a82088 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/media_transport_interface.cc >@@ -0,0 +1,124 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+// This is EXPERIMENTAL interface for media transport. >+// >+// The goal is to refactor WebRTC code so that audio and video frames >+// are sent / received through the media transport interface. This will >+// enable different media transport implementations, including QUIC-based >+// media transport. >+ >+#include "api/media_transport_interface.h" >+ >+#include <cstdint> >+#include <utility> >+ >+namespace webrtc { >+ >+MediaTransportSettings::MediaTransportSettings() = default; >+MediaTransportSettings::MediaTransportSettings(const MediaTransportSettings&) = >+ default; >+MediaTransportSettings& MediaTransportSettings::operator=( >+ const MediaTransportSettings&) = default; >+MediaTransportSettings::~MediaTransportSettings() = default; >+ >+MediaTransportEncodedAudioFrame::~MediaTransportEncodedAudioFrame() {} >+ >+MediaTransportEncodedAudioFrame::MediaTransportEncodedAudioFrame( >+ int sampling_rate_hz, >+ int starting_sample_index, >+ int samples_per_channel, >+ int sequence_number, >+ FrameType frame_type, >+ uint8_t payload_type, >+ std::vector<uint8_t> encoded_data) >+ : sampling_rate_hz_(sampling_rate_hz), >+ starting_sample_index_(starting_sample_index), >+ samples_per_channel_(samples_per_channel), >+ sequence_number_(sequence_number), >+ frame_type_(frame_type), >+ payload_type_(payload_type), >+ encoded_data_(std::move(encoded_data)) {} >+ >+MediaTransportEncodedAudioFrame& MediaTransportEncodedAudioFrame::operator=( >+ const MediaTransportEncodedAudioFrame&) = default; >+ >+MediaTransportEncodedAudioFrame& MediaTransportEncodedAudioFrame::operator=( >+ MediaTransportEncodedAudioFrame&&) = default; >+ >+MediaTransportEncodedAudioFrame::MediaTransportEncodedAudioFrame( >+ const MediaTransportEncodedAudioFrame&) = default; >+ >+MediaTransportEncodedAudioFrame::MediaTransportEncodedAudioFrame( >+ MediaTransportEncodedAudioFrame&&) = default; >+ >+MediaTransportEncodedVideoFrame::~MediaTransportEncodedVideoFrame() {} >+ 
>+MediaTransportEncodedVideoFrame::MediaTransportEncodedVideoFrame( >+ int64_t frame_id, >+ std::vector<int64_t> referenced_frame_ids, >+ VideoCodecType codec_type, >+ const webrtc::EncodedImage& encoded_image) >+ : codec_type_(codec_type), >+ encoded_image_(encoded_image), >+ frame_id_(frame_id), >+ referenced_frame_ids_(std::move(referenced_frame_ids)) {} >+ >+MediaTransportEncodedVideoFrame& MediaTransportEncodedVideoFrame::operator=( >+ const MediaTransportEncodedVideoFrame&) = default; >+ >+MediaTransportEncodedVideoFrame& MediaTransportEncodedVideoFrame::operator=( >+ MediaTransportEncodedVideoFrame&&) = default; >+ >+MediaTransportEncodedVideoFrame::MediaTransportEncodedVideoFrame( >+ const MediaTransportEncodedVideoFrame&) = default; >+ >+MediaTransportEncodedVideoFrame::MediaTransportEncodedVideoFrame( >+ MediaTransportEncodedVideoFrame&&) = default; >+ >+SendDataParams::SendDataParams() = default; >+ >+RTCErrorOr<std::unique_ptr<MediaTransportInterface>> >+MediaTransportFactory::CreateMediaTransport( >+ rtc::PacketTransportInternal* packet_transport, >+ rtc::Thread* network_thread, >+ bool is_caller) { >+ MediaTransportSettings settings; >+ settings.is_caller = is_caller; >+ return CreateMediaTransport(packet_transport, network_thread, settings); >+} >+ >+RTCErrorOr<std::unique_ptr<MediaTransportInterface>> >+MediaTransportFactory::CreateMediaTransport( >+ rtc::PacketTransportInternal* packet_transport, >+ rtc::Thread* network_thread, >+ const MediaTransportSettings& settings) { >+ return std::unique_ptr<MediaTransportInterface>(nullptr); >+} >+ >+absl::optional<TargetTransferRate> >+MediaTransportInterface::GetLatestTargetTransferRate() { >+ return absl::nullopt; >+} >+ >+void MediaTransportInterface::SetNetworkChangeCallback( >+ MediaTransportNetworkChangeCallback* callback) {} >+ >+void MediaTransportInterface::RemoveTargetTransferRateObserver( >+ webrtc::TargetTransferRateObserver* observer) {} >+ >+void 
MediaTransportInterface::AddTargetTransferRateObserver( >+ webrtc::TargetTransferRateObserver* observer) {} >+ >+size_t MediaTransportInterface::GetAudioPacketOverhead() const { >+ return 0; >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/media_transport_interface.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/media_transport_interface.h >index 2c88cd4ce6a1502a472e955dbc439db521dd2bf2..b10dd63a465c45a982f8589565dee98d23abea5c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/media_transport_interface.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/media_transport_interface.h >@@ -1,5 +1,4 @@ >-/* >- * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+/* Copyright 2018 The WebRTC project authors. All Rights Reserved. > * > * Use of this source code is governed by a BSD-style license > * that can be found in the LICENSE file in the root of the source >@@ -18,12 +17,18 @@ > #ifndef API_MEDIA_TRANSPORT_INTERFACE_H_ > #define API_MEDIA_TRANSPORT_INTERFACE_H_ > >+#include <api/transport/network_control.h> > #include <memory> >+#include <string> > #include <utility> > #include <vector> > >+#include "absl/types/optional.h" >+#include "api/array_view.h" > #include "api/rtcerror.h" >-#include "common_types.h" // NOLINT(build/include) >+#include "api/video/encoded_image.h" >+#include "rtc_base/copyonwritebuffer.h" >+#include "rtc_base/networkroute.h" > > namespace rtc { > class PacketTransportInternal; >@@ -32,16 +37,36 @@ class Thread; > > namespace webrtc { > >+// A collection of settings for creation of media transport. >+struct MediaTransportSettings final { >+ MediaTransportSettings(); >+ MediaTransportSettings(const MediaTransportSettings&); >+ MediaTransportSettings& operator=(const MediaTransportSettings&); >+ ~MediaTransportSettings(); >+ >+ // Group calls are not currently supported, in 1:1 call one side must set >+ // is_caller = true and another is_caller = false. 
>+ bool is_caller; >+ >+ // Must be set if a pre-shared key is used for the call. >+ // TODO(bugs.webrtc.org/9944): This should become zero buffer in the distant >+ // future. >+ absl::optional<std::string> pre_shared_key; >+}; >+ > // Represents encoded audio frame in any encoding (type of encoding is opaque). > // To avoid copying of encoded data use move semantics when passing by value. >-class MediaTransportEncodedAudioFrame { >+class MediaTransportEncodedAudioFrame final { > public: > enum class FrameType { > // Normal audio frame (equivalent to webrtc::kAudioFrameSpeech). > kSpeech, > > // DTX frame (equivalent to webrtc::kAudioFrameCN). >- kDiscountinuousTransmission, >+ // DTX frame (equivalent to webrtc::kAudioFrameCN). >+ kDiscontinuousTransmission, >+ // TODO(nisse): Mis-spelled version, update users, then delete. >+ kDiscountinuousTransmission = kDiscontinuousTransmission, > }; > > MediaTransportEncodedAudioFrame( >@@ -75,14 +100,15 @@ class MediaTransportEncodedAudioFrame { > uint8_t payload_type, > > // Vector with opaque encoded data. >- std::vector<uint8_t> encoded_data) >- : sampling_rate_hz_(sampling_rate_hz), >- starting_sample_index_(starting_sample_index), >- samples_per_channel_(samples_per_channel), >- sequence_number_(sequence_number), >- frame_type_(frame_type), >- payload_type_(payload_type), >- encoded_data_(std::move(encoded_data)) {} >+ std::vector<uint8_t> encoded_data); >+ >+ ~MediaTransportEncodedAudioFrame(); >+ MediaTransportEncodedAudioFrame(const MediaTransportEncodedAudioFrame&); >+ MediaTransportEncodedAudioFrame& operator=( >+ const MediaTransportEncodedAudioFrame& other); >+ MediaTransportEncodedAudioFrame& operator=( >+ MediaTransportEncodedAudioFrame&& other); >+ MediaTransportEncodedAudioFrame(MediaTransportEncodedAudioFrame&&); > > // Getters. 
> int sampling_rate_hz() const { return sampling_rate_hz_; } >@@ -113,6 +139,16 @@ class MediaTransportEncodedAudioFrame { > std::vector<uint8_t> encoded_data_; > }; > >+// Callback to notify about network route changes. >+class MediaTransportNetworkChangeCallback { >+ public: >+ virtual ~MediaTransportNetworkChangeCallback() = default; >+ >+ // Called when the network route is changed, with the new network route. >+ virtual void OnNetworkRouteChanged( >+ const rtc::NetworkRoute& new_network_route) = 0; >+}; >+ > // Interface for receiving encoded audio frames from MediaTransportInterface > // implementations. > class MediaTransportAudioSinkInterface { >@@ -125,16 +161,19 @@ class MediaTransportAudioSinkInterface { > }; > > // Represents encoded video frame, along with the codec information. >-class MediaTransportEncodedVideoFrame { >+class MediaTransportEncodedVideoFrame final { > public: > MediaTransportEncodedVideoFrame(int64_t frame_id, > std::vector<int64_t> referenced_frame_ids, > VideoCodecType codec_type, >- const webrtc::EncodedImage& encoded_image) >- : codec_type_(codec_type), >- encoded_image_(encoded_image), >- frame_id_(frame_id), >- referenced_frame_ids_(std::move(referenced_frame_ids)) {} >+ const webrtc::EncodedImage& encoded_image); >+ ~MediaTransportEncodedVideoFrame(); >+ MediaTransportEncodedVideoFrame(const MediaTransportEncodedVideoFrame&); >+ MediaTransportEncodedVideoFrame& operator=( >+ const MediaTransportEncodedVideoFrame& other); >+ MediaTransportEncodedVideoFrame& operator=( >+ MediaTransportEncodedVideoFrame&& other); >+ MediaTransportEncodedVideoFrame(MediaTransportEncodedVideoFrame&&); > > VideoCodecType codec_type() const { return codec_type_; } > const webrtc::EncodedImage& encoded_image() const { return encoded_image_; } >@@ -183,6 +222,85 @@ class MediaTransportVideoSinkInterface { > virtual void OnKeyFrameRequested(uint64_t channel_id) = 0; > }; > >+// State of the media transport. Media transport begins in the pending state. 
>+// It transitions to writable when it is ready to send media. It may transition >+// back to pending if the connection is blocked. It may transition to closed at >+// any time. Closed is terminal: a transport will never re-open once closed. >+enum class MediaTransportState { >+ kPending, >+ kWritable, >+ kClosed, >+}; >+ >+// Callback invoked whenever the state of the media transport changes. >+class MediaTransportStateCallback { >+ public: >+ virtual ~MediaTransportStateCallback() = default; >+ >+ // Invoked whenever the state of the media transport changes. >+ virtual void OnStateChanged(MediaTransportState state) = 0; >+}; >+ >+// Supported types of application data messages. >+enum class DataMessageType { >+ // Application data buffer with the binary bit unset. >+ kText, >+ >+ // Application data buffer with the binary bit set. >+ kBinary, >+ >+ // Transport-agnostic control messages, such as open or open-ack messages. >+ kControl, >+}; >+ >+// Parameters for sending data. The parameters may change from message to >+// message, even within a single channel. For example, control messages may be >+// sent reliably and in-order, even if the data channel is configured for >+// unreliable delivery. >+struct SendDataParams { >+ SendDataParams(); >+ >+ DataMessageType type = DataMessageType::kText; >+ >+ // Whether to deliver the message in order with respect to other ordered >+ // messages with the same channel_id. >+ bool ordered = false; >+ >+ // If set, the maximum number of times this message may be >+ // retransmitted by the transport before it is dropped. >+ // Setting this value to zero disables retransmission. >+ // Must be non-negative. |max_rtx_count| and |max_rtx_ms| may not be set >+ // simultaneously. >+ absl::optional<int> max_rtx_count; >+ >+ // If set, the maximum number of milliseconds for which the transport >+ // may retransmit this message before it is dropped. >+ // Setting this value to zero disables retransmission. >+ // Must be non-negative. 
|max_rtx_count| and |max_rtx_ms| may not be set >+ // simultaneously. >+ absl::optional<int> max_rtx_ms; >+}; >+ >+// Sink for callbacks related to a data channel. >+class DataChannelSink { >+ public: >+ virtual ~DataChannelSink() = default; >+ >+ // Callback issued when data is received by the transport. >+ virtual void OnDataReceived(int channel_id, >+ DataMessageType type, >+ const rtc::CopyOnWriteBuffer& buffer) = 0; >+ >+ // Callback issued when a remote data channel begins the closing procedure. >+ // Messages sent after the closing procedure begins will not be transmitted. >+ virtual void OnChannelClosing(int channel_id) = 0; >+ >+ // Callback issued when a (remote or local) data channel completes the closing >+ // procedure. Closing channels become closed after all pending data has been >+ // transmitted. >+ virtual void OnChannelClosed(int channel_id) = 0; >+}; >+ > // Media transport interface for sending / receiving encoded audio/video frames > // and receiving bandwidth estimate update from congestion control. > class MediaTransportInterface { >@@ -217,8 +335,62 @@ class MediaTransportInterface { > // pass a nullptr. > virtual void SetReceiveVideoSink(MediaTransportVideoSinkInterface* sink) = 0; > >+ // Adds a target bitrate observer. Before media transport is destructed >+ // the observer must be unregistered (by calling >+ // RemoveTargetTransferRateObserver). >+ // A newly registered observer will be called back with the latest recorded >+ // target rate, if available. >+ virtual void AddTargetTransferRateObserver( >+ webrtc::TargetTransferRateObserver* observer); >+ >+ // Removes an existing |observer| from observers. If observer was never >+ // registered, an error is logged and method does nothing. >+ virtual void RemoveTargetTransferRateObserver( >+ webrtc::TargetTransferRateObserver* observer); >+ >+ // Returns the last known target transfer rate as reported to the above >+ // observers. 
>+ virtual absl::optional<TargetTransferRate> GetLatestTargetTransferRate(); >+ >+ // Gets the audio packet overhead in bytes. Returned overhead does not include >+ // transport overhead (ipv4/6, turn channeldata, tcp/udp, etc.). >+ // If the transport is capable of fusing packets together, this overhead >+ // might not be a very accurate number. >+ virtual size_t GetAudioPacketOverhead() const; >+ >+ // Sets an observer for network change events. If the network route is already >+ // established when the callback is set, |callback| will be called immediately >+ // with the current network route. >+ // Before media transport is destroyed, the callback must be unregistered by >+ // setting it to nullptr. >+ virtual void SetNetworkChangeCallback( >+ MediaTransportNetworkChangeCallback* callback); >+ >+ // Sets a state observer callback. Before media transport is destroyed, the >+ // callback must be unregistered by setting it to nullptr. >+ // A newly registered callback will be called with the current state. >+ // Media transport does not invoke this callback concurrently. >+ virtual void SetMediaTransportStateCallback( >+ MediaTransportStateCallback* callback) = 0; >+ >+ // Sends a data buffer to the remote endpoint using the given send parameters. >+ // |buffer| may not be larger than 256 KiB. Returns an error if the send >+ // fails. >+ virtual RTCError SendData(int channel_id, >+ const SendDataParams& params, >+ const rtc::CopyOnWriteBuffer& buffer) = 0; >+ >+ // Closes |channel_id| gracefully. Returns an error if |channel_id| is not >+ // open. Data sent after the closing procedure begins will not be >+ // transmitted. The channel becomes closed after pending data is transmitted. >+ virtual RTCError CloseChannel(int channel_id) = 0; >+ >+ // Sets a sink for data messages and channel state callbacks. Before media >+ // transport is destroyed, the sink must be unregistered by setting it to >+ // nullptr. 
>+ virtual void SetDataSink(DataChannelSink* sink) = 0; >+ > // TODO(sukhanov): RtcEventLogs. >- // TODO(sukhanov): Bandwidth updates. > }; > > // If media transport factory is set in peer connection factory, it will be >@@ -236,10 +408,21 @@ class MediaTransportFactory { > // - Does not take ownership of packet_transport or network_thread. > // - Does not support group calls, in 1:1 call one side must set > // is_caller = true and another is_caller = false. >+ // TODO(bugs.webrtc.org/9938) This constructor will be removed and replaced >+ // with the one below. >+ virtual RTCErrorOr<std::unique_ptr<MediaTransportInterface>> >+ CreateMediaTransport(rtc::PacketTransportInternal* packet_transport, >+ rtc::Thread* network_thread, >+ bool is_caller); >+ >+ // Creates media transport. >+ // - Does not take ownership of packet_transport or network_thread. >+ // TODO(bugs.webrtc.org/9938): remove default implementation once all children >+ // override it. > virtual RTCErrorOr<std::unique_ptr<MediaTransportInterface>> > CreateMediaTransport(rtc::PacketTransportInternal* packet_transport, > rtc::Thread* network_thread, >- bool is_caller) = 0; >+ const MediaTransportSettings& settings); > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/mediaconstraintsinterface.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/mediaconstraintsinterface.cc >index 55677869bedd7c3212400b87cfae3277acc34d30..b66ba0503ac7f58f7170d675918de9a24c6a6992 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/mediaconstraintsinterface.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/mediaconstraintsinterface.cc >@@ -10,7 +10,9 @@ > > #include "api/mediaconstraintsinterface.h" > >+#include "absl/types/optional.h" > #include "api/peerconnectioninterface.h" >+#include "media/base/mediaconfig.h" > #include "rtc_base/stringencode.h" > > namespace { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/mediaconstraintsinterface.h 
b/Source/ThirdParty/libwebrtc/Source/webrtc/api/mediaconstraintsinterface.h >index c6a914aa564f9831b128ba3086baef7a580e397c..c9a6e1b18858df599a13b2cd2103565c05970d3d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/mediaconstraintsinterface.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/mediaconstraintsinterface.h >@@ -21,10 +21,11 @@ > #ifndef API_MEDIACONSTRAINTSINTERFACE_H_ > #define API_MEDIACONSTRAINTSINTERFACE_H_ > >+#include <stddef.h> > #include <string> > #include <vector> > >-#include "absl/types/optional.h" >+#include "api/audio_options.h" > #include "api/peerconnectioninterface.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/mediastreaminterface.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/mediastreaminterface.cc >index 6f08a0caf1c1b1234ac51779f13d5cada99c474c..955e7e4c608ee505fc0e0c90cda3415f1b0d1d74 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/mediastreaminterface.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/mediastreaminterface.cc >@@ -17,34 +17,6 @@ namespace webrtc { > const char MediaStreamTrackInterface::kVideoKind[] = "video"; > const char MediaStreamTrackInterface::kAudioKind[] = "audio"; > >-void AudioProcessorInterface::GetStats(AudioProcessorStats* /*stats*/) { >- RTC_NOTREACHED() << "Old-style GetStats() is called but it has no " >- << "implementation."; >- RTC_LOG(LS_ERROR) << "Old-style GetStats() is called but it has no " >- << "implementation."; >-} >- >-// TODO(ivoc): Remove this when the function becomes pure virtual. 
>-AudioProcessorInterface::AudioProcessorStatistics >-AudioProcessorInterface::GetStats(bool /*has_remote_tracks*/) { >- AudioProcessorStats stats; >- GetStats(&stats); >- AudioProcessorStatistics new_stats; >- new_stats.apm_statistics.divergent_filter_fraction = >- stats.aec_divergent_filter_fraction; >- new_stats.apm_statistics.delay_median_ms = stats.echo_delay_median_ms; >- new_stats.apm_statistics.delay_standard_deviation_ms = >- stats.echo_delay_std_ms; >- new_stats.apm_statistics.echo_return_loss = stats.echo_return_loss; >- new_stats.apm_statistics.echo_return_loss_enhancement = >- stats.echo_return_loss_enhancement; >- new_stats.apm_statistics.residual_echo_likelihood = >- stats.residual_echo_likelihood; >- new_stats.apm_statistics.residual_echo_likelihood_recent_max = >- stats.residual_echo_likelihood_recent_max; >- return new_stats; >-} >- > VideoTrackInterface::ContentHint VideoTrackInterface::content_hint() const { > return ContentHint::kNone; > } >@@ -58,4 +30,8 @@ AudioTrackInterface::GetAudioProcessor() { > return nullptr; > } > >+const cricket::AudioOptions AudioSourceInterface::options() const { >+ return {}; >+} >+ > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/mediastreaminterface.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/mediastreaminterface.h >index b661351e9fc985ed6d15621afe274de8ce1c47e7..6d967660c573adb8c598a3534f67f5fc8f274a77 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/mediastreaminterface.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/mediastreaminterface.h >@@ -23,18 +23,13 @@ > #include <vector> > > #include "absl/types/optional.h" >+#include "api/audio_options.h" > #include "api/video/video_frame.h" >-// TODO(zhihuang): Remove unrelated headers once downstream applications stop >-// relying on them; they were previously transitively included by >-// mediachannel.h, which is no longer a dependency of this file. 
> #include "api/video/video_sink_interface.h" > #include "api/video/video_source_interface.h" > #include "modules/audio_processing/include/audio_processing_statistics.h" >-#include "rtc_base/ratetracker.h" > #include "rtc_base/refcount.h" > #include "rtc_base/scoped_ref_ptr.h" >-#include "rtc_base/thread.h" >-#include "rtc_base/timeutils.h" > > namespace webrtc { > >@@ -213,53 +208,28 @@ class AudioSourceInterface : public MediaSourceInterface { > // TODO(tommi): Make pure virtual. > virtual void AddSink(AudioTrackSinkInterface* sink) {} > virtual void RemoveSink(AudioTrackSinkInterface* sink) {} >+ >+ // Returns options for the AudioSource. >+ // (for some of the settings this approach is broken, e.g. setting >+ // audio network adaptation on the source is the wrong layer of abstraction). >+ virtual const cricket::AudioOptions options() const; > }; > > // Interface of the audio processor used by the audio track to collect > // statistics. > class AudioProcessorInterface : public rtc::RefCountInterface { > public: >- // Deprecated, use AudioProcessorStatistics instead. >- // TODO(ivoc): Remove this when all implementations have switched to the new >- // GetStats function. See b/67926135. >- struct AudioProcessorStats { >- AudioProcessorStats() >- : typing_noise_detected(false), >- echo_return_loss(0), >- echo_return_loss_enhancement(0), >- echo_delay_median_ms(0), >- echo_delay_std_ms(0), >- residual_echo_likelihood(0.0f), >- residual_echo_likelihood_recent_max(0.0f), >- aec_divergent_filter_fraction(0.0) {} >- ~AudioProcessorStats() {} >- >- bool typing_noise_detected; >- int echo_return_loss; >- int echo_return_loss_enhancement; >- int echo_delay_median_ms; >- int echo_delay_std_ms; >- float residual_echo_likelihood; >- float residual_echo_likelihood_recent_max; >- float aec_divergent_filter_fraction; >- }; >- // This struct maintains the optionality of the stats, and will replace the >- // regular stats struct when all users have been updated. 
> struct AudioProcessorStatistics { > bool typing_noise_detected = false; > AudioProcessingStats apm_statistics; > }; > >- // Get audio processor statistics. >- virtual void GetStats(AudioProcessorStats* stats); >- > // Get audio processor statistics. The |has_remote_tracks| argument should be > // set if there are active remote tracks (this would usually be true during > // a call). If there are no remote tracks some of the stats will not be set by > // the AudioProcessor, because they only make sense if there is at least one > // remote track. >- // TODO(ivoc): Make pure virtual when all implementions are updated. >- virtual AudioProcessorStatistics GetStats(bool has_remote_tracks); >+ virtual AudioProcessorStatistics GetStats(bool has_remote_tracks) = 0; > > protected: > ~AudioProcessorInterface() override = default; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/mediadescription.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/mediadescription.cc >deleted file mode 100644 >index d5155f22fe9a35885cdd5370af282753861f337d..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/mediadescription.cc >+++ /dev/null >@@ -1,13 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include "api/ortc/mediadescription.h" >- >-namespace webrtc {} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/mediadescription.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/mediadescription.h >deleted file mode 100644 >index 5cf1d1a67be0187702c5819d6bff6219efae027f..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/mediadescription.h >+++ /dev/null >@@ -1,53 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef API_ORTC_MEDIADESCRIPTION_H_ >-#define API_ORTC_MEDIADESCRIPTION_H_ >- >-#include <string> >-#include <utility> >-#include <vector> >- >-#include "absl/types/optional.h" >-#include "api/cryptoparams.h" >- >-namespace webrtc { >- >-// A structured representation of a media description within an SDP session >-// description. >-class MediaDescription { >- public: >- explicit MediaDescription(std::string mid) : mid_(std::move(mid)) {} >- >- ~MediaDescription() {} >- >- // The mid(media stream identification) is used for identifying media streams >- // within a session description. >- // https://tools.ietf.org/html/rfc5888#section-6 >- absl::optional<std::string> mid() const { return mid_; } >- void set_mid(std::string mid) { mid_.emplace(std::move(mid)); } >- >- // Security keys and parameters for this media stream. Can be used to >- // negotiate parameters for SRTP. 
>- // https://tools.ietf.org/html/rfc4568#page-5 >- std::vector<cricket::CryptoParams>& sdes_params() { return sdes_params_; } >- const std::vector<cricket::CryptoParams>& sdes_params() const { >- return sdes_params_; >- } >- >- private: >- absl::optional<std::string> mid_; >- >- std::vector<cricket::CryptoParams> sdes_params_; >-}; >- >-} // namespace webrtc >- >-#endif // API_ORTC_MEDIADESCRIPTION_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/mediadescription_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/mediadescription_unittest.cc >deleted file mode 100644 >index 9ff943af6f405fa8cbfca229e7b02336b40b33c7..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/mediadescription_unittest.cc >+++ /dev/null >@@ -1,30 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include "api/ortc/mediadescription.h" >-#include "test/gtest.h" >- >-namespace webrtc { >- >-class MediaDescriptionTest : public testing::Test {}; >- >-TEST_F(MediaDescriptionTest, CreateMediaDescription) { >- MediaDescription m("a"); >- EXPECT_EQ("a", m.mid()); >-} >- >-TEST_F(MediaDescriptionTest, AddSdesParam) { >- MediaDescription m("a"); >- m.sdes_params().push_back(cricket::CryptoParams()); >- const std::vector<cricket::CryptoParams>& params = m.sdes_params(); >- EXPECT_EQ(1u, params.size()); >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/ortcfactoryinterface.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/ortcfactoryinterface.h >deleted file mode 100644 >index 99373528318cd7b24683898757314b496d7c3317..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/ortcfactoryinterface.h >+++ /dev/null >@@ -1,232 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef API_ORTC_ORTCFACTORYINTERFACE_H_ >-#define API_ORTC_ORTCFACTORYINTERFACE_H_ >- >-#include <memory> >-#include <string> >-#include <utility> // For std::move. 
>- >-#include "api/mediastreaminterface.h" >-#include "api/mediatypes.h" >-#include "api/ortc/ortcrtpreceiverinterface.h" >-#include "api/ortc/ortcrtpsenderinterface.h" >-#include "api/ortc/packettransportinterface.h" >-#include "api/ortc/rtptransportcontrollerinterface.h" >-#include "api/ortc/rtptransportinterface.h" >-#include "api/ortc/srtptransportinterface.h" >-#include "api/ortc/udptransportinterface.h" >-#include "api/peerconnectioninterface.h" >-#include "api/rtcerror.h" >-#include "api/rtpparameters.h" >-#include "rtc_base/network.h" >-#include "rtc_base/scoped_ref_ptr.h" >-#include "rtc_base/thread.h" >- >-namespace webrtc { >- >-// TODO(deadbeef): This should be part of /api/, but currently it's not and >-// including its header violates checkdeps rules. >-class AudioDeviceModule; >- >-// WARNING: This is experimental/under development, so use at your own risk; no >-// guarantee about API stability is guaranteed here yet. >-// >-// This class is the ORTC analog of PeerConnectionFactory. It acts as a factory >-// for ORTC objects that can be connected to each other. >-// >-// Some of these objects may not be represented by the ORTC specification, but >-// follow the same general principles. >-// >-// If one of the factory methods takes another object as an argument, it MUST >-// have been created by the same OrtcFactory. >-// >-// On object lifetimes: objects should be destroyed in this order: >-// 1. Objects created by the factory. >-// 2. The factory itself. >-// 3. Objects passed into OrtcFactoryInterface::Create. >-class OrtcFactoryInterface { >- public: >- // |network_thread| is the thread on which packets are sent and received. >- // If null, a new rtc::Thread with a default socket server is created. >- // >- // |signaling_thread| is used for callbacks to the consumer of the API. 
If >- // null, the current thread will be used, which assumes that the API consumer >- // is running a message loop on this thread (either using an existing >- // rtc::Thread, or by calling rtc::Thread::Current()->ProcessMessages). >- // >- // |network_manager| is used to determine which network interfaces are >- // available. This is used for ICE, for example. If null, a default >- // implementation will be used. Only accessed on |network_thread|. >- // >- // |socket_factory| is used (on the network thread) for creating sockets. If >- // it's null, a default implementation will be used, which assumes >- // |network_thread| is a normal rtc::Thread. >- // >- // |adm| is optional, and allows a different audio device implementation to >- // be injected; otherwise a platform-specific module will be used that will >- // use the default audio input. >- // >- // |audio_encoder_factory| and |audio_decoder_factory| are used to >- // instantiate audio codecs; they determine what codecs are supported. >- // >- // Note that the OrtcFactoryInterface does not take ownership of any of the >- // objects passed in by raw pointer, and as previously stated, these objects >- // can't be destroyed before the factory is. >- static RTCErrorOr<std::unique_ptr<OrtcFactoryInterface>> Create( >- rtc::Thread* network_thread, >- rtc::Thread* signaling_thread, >- rtc::NetworkManager* network_manager, >- rtc::PacketSocketFactory* socket_factory, >- AudioDeviceModule* adm, >- rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >- rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory); >- >- // Constructor for convenience which uses default implementations where >- // possible (though does still require that the current thread runs a message >- // loop; see above). 
>- static RTCErrorOr<std::unique_ptr<OrtcFactoryInterface>> Create( >- rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >- rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory) { >- return Create(nullptr, nullptr, nullptr, nullptr, nullptr, >- audio_encoder_factory, audio_decoder_factory); >- } >- >- virtual ~OrtcFactoryInterface() {} >- >- // Creates an RTP transport controller, which is used in calls to >- // CreateRtpTransport methods. If your application has some notion of a >- // "call", you should create one transport controller per call. >- // >- // However, if you only are using one RtpTransport object, this doesn't need >- // to be called explicitly; CreateRtpTransport will create one automatically >- // if |rtp_transport_controller| is null. See below. >- // >- // TODO(deadbeef): Add MediaConfig and RtcEventLog arguments? >- virtual RTCErrorOr<std::unique_ptr<RtpTransportControllerInterface>> >- CreateRtpTransportController() = 0; >- >- // Creates an RTP transport using the provided packet transports and >- // transport controller. >- // >- // |rtp| will be used for sending RTP packets, and |rtcp| for RTCP packets. >- // >- // |rtp| can't be null. |rtcp| must be non-null if and only if >- // |rtp_parameters.rtcp.mux| is false, indicating that RTCP muxing isn't used. >- // Note that if RTCP muxing isn't enabled initially, it can still enabled >- // later through SetParameters. >- // >- // If |transport_controller| is null, one will automatically be created, and >- // its lifetime managed by the returned RtpTransport. This should only be >- // done if a single RtpTransport is being used to communicate with the remote >- // endpoint. 
>- virtual RTCErrorOr<std::unique_ptr<RtpTransportInterface>> CreateRtpTransport( >- const RtpTransportParameters& rtp_parameters, >- PacketTransportInterface* rtp, >- PacketTransportInterface* rtcp, >- RtpTransportControllerInterface* transport_controller) = 0; >- >- // Creates an SrtpTransport which is an RTP transport that uses SRTP. >- virtual RTCErrorOr<std::unique_ptr<SrtpTransportInterface>> >- CreateSrtpTransport( >- const RtpTransportParameters& rtp_parameters, >- PacketTransportInterface* rtp, >- PacketTransportInterface* rtcp, >- RtpTransportControllerInterface* transport_controller) = 0; >- >- // Returns the capabilities of an RTP sender of type |kind|. These >- // capabilities can be used to determine what RtpParameters to use to create >- // an RtpSender. >- // >- // If for some reason you pass in MEDIA_TYPE_DATA, returns an empty structure. >- virtual RtpCapabilities GetRtpSenderCapabilities( >- cricket::MediaType kind) const = 0; >- >- // Creates an RTP sender with |track|. Will not start sending until Send is >- // called. This is provided as a convenience; it's equivalent to calling >- // CreateRtpSender with a kind (see below), followed by SetTrack. >- // >- // |track| and |transport| must not be null. >- virtual RTCErrorOr<std::unique_ptr<OrtcRtpSenderInterface>> CreateRtpSender( >- rtc::scoped_refptr<MediaStreamTrackInterface> track, >- RtpTransportInterface* transport) = 0; >- >- // Overload of CreateRtpSender allows creating the sender without a track. >- // >- // |kind| must be MEDIA_TYPE_AUDIO or MEDIA_TYPE_VIDEO. >- virtual RTCErrorOr<std::unique_ptr<OrtcRtpSenderInterface>> CreateRtpSender( >- cricket::MediaType kind, >- RtpTransportInterface* transport) = 0; >- >- // Returns the capabilities of an RTP receiver of type |kind|. These >- // capabilities can be used to determine what RtpParameters to use to create >- // an RtpReceiver. >- // >- // If for some reason you pass in MEDIA_TYPE_DATA, returns an empty structure. 
>- virtual RtpCapabilities GetRtpReceiverCapabilities( >- cricket::MediaType kind) const = 0; >- >- // Creates an RTP receiver of type |kind|. Will not start receiving media >- // until Receive is called. >- // >- // |kind| must be MEDIA_TYPE_AUDIO or MEDIA_TYPE_VIDEO. >- // >- // |transport| must not be null. >- virtual RTCErrorOr<std::unique_ptr<OrtcRtpReceiverInterface>> >- CreateRtpReceiver(cricket::MediaType kind, >- RtpTransportInterface* transport) = 0; >- >- // Create a UDP transport with IP address family |family|, using a port >- // within the specified range. >- // >- // |family| must be AF_INET or AF_INET6. >- // >- // |min_port|/|max_port| values of 0 indicate no range restriction. >- // >- // Returns an error if the transport wasn't successfully created. >- virtual RTCErrorOr<std::unique_ptr<UdpTransportInterface>> >- CreateUdpTransport(int family, uint16_t min_port, uint16_t max_port) = 0; >- >- // Method for convenience that has no port range restrictions. >- RTCErrorOr<std::unique_ptr<UdpTransportInterface>> CreateUdpTransport( >- int family) { >- return CreateUdpTransport(family, 0, 0); >- } >- >- // NOTE: The methods below to create tracks/sources return scoped_refptrs >- // rather than unique_ptrs, because these interfaces are also used with >- // PeerConnection, where everything is ref-counted. >- >- // Creates a audio source representing the default microphone input. >- // |options| decides audio processing settings. >- virtual rtc::scoped_refptr<AudioSourceInterface> CreateAudioSource( >- const cricket::AudioOptions& options) = 0; >- >- // Version of the above method that uses default options. >- rtc::scoped_refptr<AudioSourceInterface> CreateAudioSource() { >- return CreateAudioSource(cricket::AudioOptions()); >- } >- >- // Creates a new local video track wrapping |source|. The same |source| can >- // be used in several tracks. 
>- virtual rtc::scoped_refptr<VideoTrackInterface> CreateVideoTrack( >- const std::string& id, >- VideoTrackSourceInterface* source) = 0; >- >- // Creates an new local audio track wrapping |source|. >- virtual rtc::scoped_refptr<AudioTrackInterface> CreateAudioTrack( >- const std::string& id, >- AudioSourceInterface* source) = 0; >-}; >- >-} // namespace webrtc >- >-#endif // API_ORTC_ORTCFACTORYINTERFACE_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/ortcrtpreceiverinterface.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/ortcrtpreceiverinterface.h >deleted file mode 100644 >index 59ff97762135372a9f1b250c39b5d477cb8e20a4..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/ortcrtpreceiverinterface.h >+++ /dev/null >@@ -1,84 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-// This file contains interfaces for RtpReceivers: >-// http://publications.ortc.org/2016/20161202/#rtcrtpreceiver* >-// >-// However, underneath the RtpReceiver is an RtpTransport, rather than a >-// DtlsTransport. This is to allow different types of RTP transports (besides >-// DTLS-SRTP) to be used. 
>- >-#ifndef API_ORTC_ORTCRTPRECEIVERINTERFACE_H_ >-#define API_ORTC_ORTCRTPRECEIVERINTERFACE_H_ >- >-#include "api/mediastreaminterface.h" >-#include "api/mediatypes.h" >-#include "api/ortc/rtptransportinterface.h" >-#include "api/rtcerror.h" >-#include "api/rtpparameters.h" >- >-namespace webrtc { >- >-// Note: Since receiver capabilities may depend on how the OrtcFactory was >-// created, instead of a static "GetCapabilities" method on this interface, >-// there is a "GetRtpReceiverCapabilities" method on the OrtcFactory. >-class OrtcRtpReceiverInterface { >- public: >- virtual ~OrtcRtpReceiverInterface() {} >- >- // Returns a track representing the media received by this receiver. >- // >- // Currently, this will return null until Receive has been successfully >- // called. Also, a new track will be created every time the primary SSRC >- // changes. >- // >- // If encodings are removed, GetTrack will return null. Though deactivating >- // an encoding (setting |active| to false) will not do this. >- // >- // In the future, these limitations will be fixed, and GetTrack will return >- // the same track for the lifetime of the RtpReceiver. So it's not >- // recommended to write code that depends on this non-standard behavior. >- virtual rtc::scoped_refptr<MediaStreamTrackInterface> GetTrack() const = 0; >- >- // Once supported, will switch to receiving media on a new transport. >- // However, this is not currently supported and will always return an error. >- virtual RTCError SetTransport(RtpTransportInterface* transport) = 0; >- // Returns previously set (or constructed-with) transport. >- virtual RtpTransportInterface* GetTransport() const = 0; >- >- // Start receiving media with |parameters| (if |parameters| contains an >- // active encoding). >- // >- // There are no limitations to how the parameters can be changed after the >- // initial call to Receive, as long as they're valid (for example, they can't >- // use the same payload type for two codecs). 
>- virtual RTCError Receive(const RtpParameters& parameters) = 0; >- // Returns parameters that were last successfully passed into Receive, or >- // empty parameters if that hasn't yet occurred. >- // >- // Note that for parameters that are described as having an "implementation >- // default" value chosen, GetParameters() will return those chosen defaults, >- // with the exception of SSRCs which have special behavior. See >- // rtpparameters.h for more details. >- virtual RtpParameters GetParameters() const = 0; >- >- // Audio or video receiver? >- // >- // Once GetTrack() starts always returning a track, this method will be >- // redundant, as one can call "GetTrack()->kind()". However, it's still a >- // nice convenience, and is symmetric with OrtcRtpSenderInterface::GetKind. >- virtual cricket::MediaType GetKind() const = 0; >- >- // TODO(deadbeef): GetContributingSources >-}; >- >-} // namespace webrtc >- >-#endif // API_ORTC_ORTCRTPRECEIVERINTERFACE_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/ortcrtpsenderinterface.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/ortcrtpsenderinterface.h >deleted file mode 100644 >index fd4dfaa7907ffc28c7716c6bed0726eeb4d8fec4..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/ortcrtpsenderinterface.h >+++ /dev/null >@@ -1,77 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-// This file contains interfaces for RtpSenders: >-// http://publications.ortc.org/2016/20161202/#rtcrtpsender* >-// >-// However, underneath the RtpSender is an RtpTransport, rather than a >-// DtlsTransport. This is to allow different types of RTP transports (besides >-// DTLS-SRTP) to be used. >- >-#ifndef API_ORTC_ORTCRTPSENDERINTERFACE_H_ >-#define API_ORTC_ORTCRTPSENDERINTERFACE_H_ >- >-#include "api/mediastreaminterface.h" >-#include "api/mediatypes.h" >-#include "api/ortc/rtptransportinterface.h" >-#include "api/rtcerror.h" >-#include "api/rtpparameters.h" >- >-namespace webrtc { >- >-// Note: Since sender capabilities may depend on how the OrtcFactory was >-// created, instead of a static "GetCapabilities" method on this interface, >-// there is a "GetRtpSenderCapabilities" method on the OrtcFactory. >-class OrtcRtpSenderInterface { >- public: >- virtual ~OrtcRtpSenderInterface() {} >- >- // Sets the source of media that will be sent by this sender. >- // >- // If Send has already been called, will immediately switch to sending this >- // track. If |track| is null, will stop sending media. >- // >- // Returns INVALID_PARAMETER error if an audio track is set on a video >- // RtpSender, or vice-versa. >- virtual RTCError SetTrack(MediaStreamTrackInterface* track) = 0; >- // Returns previously set (or constructed-with) track. >- virtual rtc::scoped_refptr<MediaStreamTrackInterface> GetTrack() const = 0; >- >- // Once supported, will switch to sending media on a new transport. However, >- // this is not currently supported and will always return an error. >- virtual RTCError SetTransport(RtpTransportInterface* transport) = 0; >- // Returns previously set (or constructed-with) transport. >- virtual RtpTransportInterface* GetTransport() const = 0; >- >- // Start sending media with |parameters| (if |parameters| contains an active >- // encoding). 
>- // >- // There are no limitations to how the parameters can be changed after the >- // initial call to Send, as long as they're valid (for example, they can't >- // use the same payload type for two codecs). >- virtual RTCError Send(const RtpParameters& parameters) = 0; >- // Returns parameters that were last successfully passed into Send, or empty >- // parameters if that hasn't yet occurred. >- // >- // Note that for parameters that are described as having an "implementation >- // default" value chosen, GetParameters() will return those chosen defaults, >- // with the exception of SSRCs which have special behavior. See >- // rtpparameters.h for more details. >- virtual RtpParameters GetParameters() const = 0; >- >- // Audio or video sender? >- virtual cricket::MediaType GetKind() const = 0; >- >- // TODO(deadbeef): SSRC conflict signal. >-}; >- >-} // namespace webrtc >- >-#endif // API_ORTC_ORTCRTPSENDERINTERFACE_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/rtptransportcontrollerinterface.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/rtptransportcontrollerinterface.h >deleted file mode 100644 >index 85f37fa7a0d5d208dd769e3d9bcc0377b9975cab..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/rtptransportcontrollerinterface.h >+++ /dev/null >@@ -1,57 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#ifndef API_ORTC_RTPTRANSPORTCONTROLLERINTERFACE_H_ >-#define API_ORTC_RTPTRANSPORTCONTROLLERINTERFACE_H_ >- >-#include <vector> >- >-#include "api/ortc/rtptransportinterface.h" >- >-namespace webrtc { >- >-class RtpTransportControllerAdapter; >- >-// Used to group RTP transports between a local endpoint and the same remote >-// endpoint, for the purpose of sharing bandwidth estimation and other things. >-// >-// Comparing this to the PeerConnection model, non-budled audio/video would use >-// two RtpTransports with a single RtpTransportController, whereas bundled >-// media would use a single RtpTransport, and two PeerConnections would use >-// independent RtpTransportControllers. >-// >-// RtpTransports are associated with this controller when they're created, by >-// passing the controller into OrtcFactory's relevant "CreateRtpTransport" >-// method. When a transport is destroyed, it's automatically disassociated. >-// GetTransports returns all currently associated transports. >-// >-// This is the RTP equivalent of "IceTransportController" in ORTC; RtpTransport >-// is to RtpTransportController as IceTransport is to IceTransportController. >-class RtpTransportControllerInterface { >- public: >- virtual ~RtpTransportControllerInterface() {} >- >- // Returns all transports associated with this controller (see explanation >- // above). No ordering is guaranteed. >- virtual std::vector<RtpTransportInterface*> GetTransports() const = 0; >- >- protected: >- // Only for internal use. Returns a pointer to an internal interface, for use >- // by the implementation. >- virtual RtpTransportControllerAdapter* GetInternal() = 0; >- >- // Classes that can use this internal interface. 
>- friend class OrtcFactory; >- friend class RtpTransportAdapter; >-}; >- >-} // namespace webrtc >- >-#endif // API_ORTC_RTPTRANSPORTCONTROLLERINTERFACE_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/rtptransportinterface.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/rtptransportinterface.h >index b0d30e8560e27a528ba67a5354afb32da0a2db64..4b9d16158f65a0481dcf7653f6b42f4de9d82ede 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/rtptransportinterface.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/rtptransportinterface.h >@@ -18,7 +18,6 @@ > #include "api/rtcerror.h" > #include "api/rtp_headers.h" > #include "api/rtpparameters.h" >-#include "common_types.h" // NOLINT(build/include) > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/sessiondescription.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/sessiondescription.cc >deleted file mode 100644 >index 1078884ecb45b207896534a89028a115753f9e77..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/sessiondescription.cc >+++ /dev/null >@@ -1,13 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include "api/ortc/sessiondescription.h" >- >-namespace webrtc {} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/sessiondescription.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/sessiondescription.h >deleted file mode 100644 >index ebbaa27d6ffc0011450326e51c17c8b17df9f952..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/sessiondescription.h >+++ /dev/null >@@ -1,45 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef API_ORTC_SESSIONDESCRIPTION_H_ >-#define API_ORTC_SESSIONDESCRIPTION_H_ >- >-#include <string> >-#include <utility> >- >-namespace webrtc { >- >-// A structured representation of an SDP session description. >-class SessionDescription { >- public: >- SessionDescription(int64_t session_id, std::string session_version) >- : session_id_(session_id), session_version_(std::move(session_version)) {} >- >- // https://tools.ietf.org/html/rfc4566#section-5.2 >- // o=<username> <sess-id> <sess-version> <nettype> <addrtype> >- // <unicast-address> >- // session_id_ is the "sess-id" field. >- // session_version_ is the "sess-version" field. 
>- int64_t session_id() { return session_id_; } >- void set_session_id(int64_t session_id) { session_id_ = session_id; } >- >- const std::string& session_version() const { return session_version_; } >- void set_session_version(std::string session_version) { >- session_version_ = std::move(session_version); >- } >- >- private: >- int64_t session_id_; >- std::string session_version_; >-}; >- >-} // namespace webrtc >- >-#endif // API_ORTC_SESSIONDESCRIPTION_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/sessiondescription_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/sessiondescription_unittest.cc >deleted file mode 100644 >index e4611c6c6b2bf7a0be4f5c92166f40934b05f706..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/sessiondescription_unittest.cc >+++ /dev/null >@@ -1,23 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include "api/ortc/sessiondescription.h" >-#include "test/gtest.h" >- >-namespace webrtc { >- >-class SessionDescriptionTest : public testing::Test {}; >- >-TEST_F(SessionDescriptionTest, CreateSessionDescription) { >- SessionDescription s(-1, "0"); >- EXPECT_EQ(-1, s.session_id()); >- EXPECT_EQ("0", s.session_version()); >-} >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/udptransportinterface.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/udptransportinterface.h >deleted file mode 100644 >index f246a25e9ddd0f5ecafe0069bc863d0308ef2b81..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/ortc/udptransportinterface.h >+++ /dev/null >@@ -1,49 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef API_ORTC_UDPTRANSPORTINTERFACE_H_ >-#define API_ORTC_UDPTRANSPORTINTERFACE_H_ >- >-#include "api/ortc/packettransportinterface.h" >-#include "api/proxy.h" >-#include "rtc_base/socketaddress.h" >- >-namespace webrtc { >- >-// Interface for a raw UDP transport (not using ICE), meaning a combination of >-// a local/remote IP address/port. >-// >-// An instance can be instantiated using OrtcFactory. >-// >-// Each instance reserves a UDP port, which will be freed when the >-// UdpTransportInterface destructor is called. >-// >-// Calling SetRemoteAddress sets the destination of outgoing packets; without a >-// destination, packets can't be sent, but they can be received. 
>-class UdpTransportInterface : public virtual PacketTransportInterface { >- public: >- // Get the address of the socket allocated for this transport. >- virtual rtc::SocketAddress GetLocalAddress() const = 0; >- >- // Sets the address to which packets will be delivered. >- // >- // Calling with a "nil" (default-constructed) address is legal, and unsets >- // any previously set destination. >- // >- // However, calling with an incomplete address (port or IP not set) will >- // fail. >- virtual bool SetRemoteAddress(const rtc::SocketAddress& dest) = 0; >- // Simple getter. If never set, returns nil address. >- virtual rtc::SocketAddress GetRemoteAddress() const = 0; >-}; >- >-} // namespace webrtc >- >-#endif // API_ORTC_UDPTRANSPORTINTERFACE_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/peerconnectioninterface.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/peerconnectioninterface.cc >index b4148d7480aabe443cb99e74df742e0d584a1507..f2953b5b0c28c611fb10efbf8306de245e673f2b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/peerconnectioninterface.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/peerconnectioninterface.cc >@@ -159,6 +159,11 @@ RTCError PeerConnectionInterface::SetBitrate( > return SetBitrate(bitrate); > } > >+PeerConnectionInterface::PeerConnectionState >+PeerConnectionInterface::peer_connection_state() { >+ return PeerConnectionInterface::PeerConnectionState::kNew; >+} >+ > bool PeerConnectionInterface::StartRtcEventLog(rtc::PlatformFile file, > int64_t max_size_bytes) { > return false; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/peerconnectioninterface.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/peerconnectioninterface.h >index 1c32b69692d00e6efab782c5bdba1cd7a830e2c8..54161b8da59b0226083a6abf98296ae8339df6b2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/peerconnectioninterface.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/peerconnectioninterface.h >@@ -69,7 +69,6 @@ > 
> #include <memory> > #include <string> >-#include <utility> > #include <vector> > > #include "api/asyncresolverfactory.h" >@@ -78,10 +77,11 @@ > #include "api/audio_codecs/audio_encoder_factory.h" > #include "api/audio_options.h" > #include "api/call/callfactoryinterface.h" >+#include "api/crypto/cryptooptions.h" > #include "api/datachannelinterface.h" >-#include "api/dtmfsenderinterface.h" > #include "api/fec_controller.h" > #include "api/jsep.h" >+#include "api/media_transport_interface.h" > #include "api/mediastreaminterface.h" > #include "api/rtcerror.h" > #include "api/rtceventlogoutput.h" >@@ -94,7 +94,6 @@ > #include "api/transport/bitrate_settings.h" > #include "api/transport/network_control.h" > #include "api/turncustomizer.h" >-#include "api/umametrics.h" > #include "logging/rtc_event_log/rtc_event_log_factory_interface.h" > #include "media/base/mediaconfig.h" > // TODO(bugs.webrtc.org/6353): cricket::VideoCapturer is deprecated and should >@@ -114,6 +113,7 @@ > #include "rtc_base/socketaddress.h" > #include "rtc_base/sslcertificate.h" > #include "rtc_base/sslstreamadapter.h" >+#include "rtc_base/system/rtc_export.h" > > namespace rtc { > class SSLIdentity; >@@ -160,7 +160,7 @@ enum class SdpSemantics { kPlanB, kUnifiedPlan }; > > class PeerConnectionInterface : public rtc::RefCountInterface { > public: >- // See https://w3c.github.io/webrtc-pc/#state-definitions >+ // See https://w3c.github.io/webrtc-pc/#dom-rtcsignalingstate > enum SignalingState { > kStable, > kHaveLocalOffer, >@@ -170,12 +170,24 @@ class PeerConnectionInterface : public rtc::RefCountInterface { > kClosed, > }; > >+ // See https://w3c.github.io/webrtc-pc/#dom-rtcicegatheringstate > enum IceGatheringState { > kIceGatheringNew, > kIceGatheringGathering, > kIceGatheringComplete > }; > >+ // See https://w3c.github.io/webrtc-pc/#dom-rtcpeerconnectionstate >+ enum class PeerConnectionState { >+ kNew, >+ kConnecting, >+ kConnected, >+ kDisconnected, >+ kFailed, >+ kClosed, >+ }; >+ >+ // See 
https://w3c.github.io/webrtc-pc/#dom-rtciceconnectionstate > enum IceConnectionState { > kIceConnectionNew, > kIceConnectionChecking, >@@ -282,7 +294,7 @@ class PeerConnectionInterface : public rtc::RefCountInterface { > // organization of the implementation, which isn't stable. So we > // need getters and setters at least for fields which applications > // are interested in. >- struct RTCConfiguration { >+ struct RTC_EXPORT RTCConfiguration { > // This struct is subject to reorganization, both for naming > // consistency, and to group settings to match where they are used > // in the implementation. To do that, we need getter and setter >@@ -328,6 +340,22 @@ class PeerConnectionInterface : public rtc::RefCountInterface { > media_config.video.experiment_cpu_load_estimator = enable; > } > >+ int audio_rtcp_report_interval_ms() const { >+ return media_config.audio.rtcp_report_interval_ms; >+ } >+ void set_audio_rtcp_report_interval_ms(int audio_rtcp_report_interval_ms) { >+ media_config.audio.rtcp_report_interval_ms = >+ audio_rtcp_report_interval_ms; >+ } >+ >+ int video_rtcp_report_interval_ms() const { >+ return media_config.video.rtcp_report_interval_ms; >+ } >+ void set_video_rtcp_report_interval_ms(int video_rtcp_report_interval_ms) { >+ media_config.video.rtcp_report_interval_ms = >+ video_rtcp_report_interval_ms; >+ } >+ > static const int kUndefined = -1; > // Default maximum number of packets in the audio jitter buffer. > static const int kAudioJitterBufferMaxPackets = 50; >@@ -394,6 +422,7 @@ class PeerConnectionInterface : public rtc::RefCountInterface { > // Use new combined audio/video bandwidth estimation? > absl::optional<bool> combined_audio_video_bwe; > >+ // TODO(bugs.webrtc.org/9891) - Move to crypto_options > // Can be used to disable DTLS-SRTP. This should never be done, but can be > // useful for testing purposes, for example in setting up a loopback call > // with a single PeerConnection. 
>@@ -421,6 +450,9 @@ class PeerConnectionInterface : public rtc::RefCountInterface { > // if it falls behind. > bool audio_jitter_buffer_fast_accelerate = false; > >+ // The minimum delay in milliseconds for the audio jitter buffer. >+ int audio_jitter_buffer_min_delay_ms = 0; >+ > // Timeout in milliseconds before an ICE candidate pair is considered to be > // "not receiving", after which a lower priority candidate pair may be > // selected. >@@ -556,6 +588,7 @@ class PeerConnectionInterface : public rtc::RefCountInterface { > // For all other users, specify kUnifiedPlan. > SdpSemantics sdp_semantics = SdpSemantics::kPlanB; > >+ // TODO(bugs.webrtc.org/9891) - Move to crypto_options or remove. > // Actively reset the SRTP parameters whenever the DTLS transports > // underneath are reset for every offer/answer negotiation. > // This is only intended to be a workaround for crbug.com/835958 >@@ -563,6 +596,34 @@ class PeerConnectionInterface : public rtc::RefCountInterface { > // correctly. This flag will be deprecated soon. Do not rely on it. > bool active_reset_srtp_params = false; > >+ // If MediaTransportFactory is provided in PeerConnectionFactory, this flag >+ // informs PeerConnection that it should use the MediaTransportInterface. >+ // It's invalid to set it to |true| if the MediaTransportFactory wasn't >+ // provided. >+ bool use_media_transport = false; >+ >+ // If MediaTransportFactory is provided in PeerConnectionFactory, this flag >+ // informs PeerConnection that it should use the MediaTransportInterface for >+ // data channels. It's invalid to set it to |true| if the >+ // MediaTransportFactory wasn't provided. Data channels over media >+ // transport are not compatible with RTP or SCTP data channels. Setting >+ // both |use_media_transport_for_data_channels| and >+ // |enable_rtp_data_channel| is invalid. 
>+ bool use_media_transport_for_data_channels = false; >+ >+ // Defines advanced optional cryptographic settings related to SRTP and >+ // frame encryption for native WebRTC. Setting this will overwrite any >+ // settings set in PeerConnectionFactory (which is deprecated). >+ absl::optional<CryptoOptions> crypto_options; >+ >+ // Configure if we should include the SDP attribute extmap-allow-mixed in >+ // our offer. Although we currently do support this, it's not included in >+ // our offer by default due to a previous bug that caused the SDP parser to >+ // abort parsing if this attribute was present. This is fixed in Chrome 71. >+ // TODO(webrtc:9985): Change default to true once sufficient time has >+ // passed. >+ bool offer_extmap_allow_mixed = false; >+ > // > // Don't forget to update operator== if adding something. > // >@@ -972,11 +1033,13 @@ class PeerConnectionInterface : public rtc::RefCountInterface { > virtual SignalingState signaling_state() = 0; > > // Returns the aggregate state of all ICE *and* DTLS transports. >- // TODO(deadbeef): Implement "PeerConnectionState" according to the standard, >- // to aggregate ICE+DTLS state, and change the scope of IceConnectionState to >- // be just the ICE layer. See: crbug.com/webrtc/6145 >+ // TODO(jonasolsson): Replace with standardized_ice_connection_state once it >+ // is ready, see crbug.com/webrtc/6145 > virtual IceConnectionState ice_connection_state() = 0; > >+ // Returns the aggregated state of all ICE and DTLS transports. >+ virtual PeerConnectionState peer_connection_state(); >+ > virtual IceGatheringState ice_gathering_state() = 0; > > // Starts RtcEventLog using existing file. Takes ownership of |file| and >@@ -1045,6 +1108,10 @@ class PeerConnectionObserver { > virtual void OnIceConnectionChange( > PeerConnectionInterface::IceConnectionState new_state) = 0; > >+ // Called any time the PeerConnectionState changes. 
>+ virtual void OnConnectionChange( >+ PeerConnectionInterface::PeerConnectionState new_state) {} >+ > // Called any time the IceGatheringState changes. > virtual void OnIceGatheringChange( > PeerConnectionInterface::IceGatheringState new_state) = 0; >@@ -1156,6 +1223,7 @@ struct PeerConnectionFactoryDependencies final { > std::unique_ptr<RtcEventLogFactoryInterface> event_log_factory; > std::unique_ptr<FecControllerFactoryInterface> fec_controller_factory; > std::unique_ptr<NetworkControllerFactoryInterface> network_controller_factory; >+ std::unique_ptr<MediaTransportFactory> media_transport_factory; > }; > > // PeerConnectionFactoryInterface is the factory interface used for creating >@@ -1175,7 +1243,7 @@ class PeerConnectionFactoryInterface : public rtc::RefCountInterface { > public: > class Options { > public: >- Options() : crypto_options(rtc::CryptoOptions::NoGcm()) {} >+ Options() {} > > // If set to true, created PeerConnections won't enforce any SRTP > // requirement, allowing unsecured media. Should only be used for >@@ -1204,7 +1272,7 @@ class PeerConnectionFactoryInterface : public rtc::RefCountInterface { > rtc::SSLProtocolVersion ssl_max_version = rtc::SSL_PROTOCOL_DTLS_12; > > // Sets crypto related options, e.g. enabled cipher suites. >- rtc::CryptoOptions crypto_options; >+ CryptoOptions crypto_options = CryptoOptions::NoGcm(); > }; > > // Set the options to be used for subsequently created PeerConnections. >@@ -1307,137 +1375,6 @@ class PeerConnectionFactoryInterface : public rtc::RefCountInterface { > ~PeerConnectionFactoryInterface() override = default; > }; > >-#if defined(USE_BUILTIN_SW_CODECS) >-// Create a new instance of PeerConnectionFactoryInterface. >-// >-// This method relies on the thread it's called on as the "signaling thread" >-// for the PeerConnectionFactory it creates. 
>-// >-// As such, if the current thread is not already running an rtc::Thread message >-// loop, an application using this method must eventually either call >-// rtc::Thread::Current()->Run(), or call >-// rtc::Thread::Current()->ProcessMessages() within the application's own >-// message loop. >-rtc::scoped_refptr<PeerConnectionFactoryInterface> CreatePeerConnectionFactory( >- rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >- rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory); >- >-// Create a new instance of PeerConnectionFactoryInterface. >-// >-// |network_thread|, |worker_thread| and |signaling_thread| are >-// the only mandatory parameters. >-// >-// If non-null, a reference is added to |default_adm|, and ownership of >-// |video_encoder_factory| and |video_decoder_factory| is transferred to the >-// returned factory. >-// TODO(deadbeef): Use rtc::scoped_refptr<> and std::unique_ptr<> to make this >-// ownership transfer and ref counting more obvious. >-rtc::scoped_refptr<PeerConnectionFactoryInterface> CreatePeerConnectionFactory( >- rtc::Thread* network_thread, >- rtc::Thread* worker_thread, >- rtc::Thread* signaling_thread, >- AudioDeviceModule* default_adm, >- rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >- rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory, >- cricket::WebRtcVideoEncoderFactory* video_encoder_factory, >- cricket::WebRtcVideoDecoderFactory* video_decoder_factory); >- >-// Create a new instance of PeerConnectionFactoryInterface with optional >-// external audio mixed and audio processing modules. >-// >-// If |audio_mixer| is null, an internal audio mixer will be created and used. >-// If |audio_processing| is null, an internal audio processing module will be >-// created and used. 
>-rtc::scoped_refptr<PeerConnectionFactoryInterface> CreatePeerConnectionFactory( >- rtc::Thread* network_thread, >- rtc::Thread* worker_thread, >- rtc::Thread* signaling_thread, >- AudioDeviceModule* default_adm, >- rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >- rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory, >- cricket::WebRtcVideoEncoderFactory* video_encoder_factory, >- cricket::WebRtcVideoDecoderFactory* video_decoder_factory, >- rtc::scoped_refptr<AudioMixer> audio_mixer, >- rtc::scoped_refptr<AudioProcessing> audio_processing); >- >-// Create a new instance of PeerConnectionFactoryInterface with optional >-// external audio mixer, audio processing, and fec controller modules. >-// >-// If |audio_mixer| is null, an internal audio mixer will be created and used. >-// If |audio_processing| is null, an internal audio processing module will be >-// created and used. >-// If |fec_controller_factory| is null, an internal fec controller module will >-// be created and used. >-// If |network_controller_factory| is provided, it will be used if enabled via >-// field trial. >-rtc::scoped_refptr<PeerConnectionFactoryInterface> CreatePeerConnectionFactory( >- rtc::Thread* network_thread, >- rtc::Thread* worker_thread, >- rtc::Thread* signaling_thread, >- AudioDeviceModule* default_adm, >- rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >- rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory, >- cricket::WebRtcVideoEncoderFactory* video_encoder_factory, >- cricket::WebRtcVideoDecoderFactory* video_decoder_factory, >- rtc::scoped_refptr<AudioMixer> audio_mixer, >- rtc::scoped_refptr<AudioProcessing> audio_processing, >- std::unique_ptr<FecControllerFactoryInterface> fec_controller_factory, >- std::unique_ptr<NetworkControllerFactoryInterface> >- network_controller_factory = nullptr); >-#endif >- >-// Create a new instance of PeerConnectionFactoryInterface with optional video >-// codec factories. 
These video factories represents all video codecs, i.e. no >-// extra internal video codecs will be added. >-// When building WebRTC with rtc_use_builtin_sw_codecs = false, this is the >-// only available CreatePeerConnectionFactory overload. >-rtc::scoped_refptr<PeerConnectionFactoryInterface> CreatePeerConnectionFactory( >- rtc::Thread* network_thread, >- rtc::Thread* worker_thread, >- rtc::Thread* signaling_thread, >- rtc::scoped_refptr<AudioDeviceModule> default_adm, >- rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >- rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory, >- std::unique_ptr<VideoEncoderFactory> video_encoder_factory, >- std::unique_ptr<VideoDecoderFactory> video_decoder_factory, >- rtc::scoped_refptr<AudioMixer> audio_mixer, >- rtc::scoped_refptr<AudioProcessing> audio_processing); >- >-#if defined(USE_BUILTIN_SW_CODECS) >-// Create a new instance of PeerConnectionFactoryInterface with external audio >-// mixer. >-// >-// If |audio_mixer| is null, an internal audio mixer will be created and used. >-rtc::scoped_refptr<PeerConnectionFactoryInterface> >-CreatePeerConnectionFactoryWithAudioMixer( >- rtc::Thread* network_thread, >- rtc::Thread* worker_thread, >- rtc::Thread* signaling_thread, >- AudioDeviceModule* default_adm, >- rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >- rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory, >- cricket::WebRtcVideoEncoderFactory* video_encoder_factory, >- cricket::WebRtcVideoDecoderFactory* video_decoder_factory, >- rtc::scoped_refptr<AudioMixer> audio_mixer); >- >-// Create a new instance of PeerConnectionFactoryInterface. >-// Same thread is used as worker and network thread. 
>-inline rtc::scoped_refptr<PeerConnectionFactoryInterface> >-CreatePeerConnectionFactory( >- rtc::Thread* worker_and_network_thread, >- rtc::Thread* signaling_thread, >- AudioDeviceModule* default_adm, >- rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >- rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory, >- cricket::WebRtcVideoEncoderFactory* video_encoder_factory, >- cricket::WebRtcVideoDecoderFactory* video_decoder_factory) { >- return CreatePeerConnectionFactory( >- worker_and_network_thread, worker_and_network_thread, signaling_thread, >- default_adm, audio_encoder_factory, audio_decoder_factory, >- video_encoder_factory, video_decoder_factory); >-} >-#endif >- > // This is a lower-level version of the CreatePeerConnectionFactory functions > // above. It's implemented in the "peerconnection" build target, whereas the > // above methods are only implemented in the broader "libjingle_peerconnection" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/proxy.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/proxy.cc >index c86bddfa8b1a946d010c3ef92e7efb40df8877f0..e668285ba29ba61d3a7275990dea890f76d0e27f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/proxy.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/proxy.cc >@@ -14,7 +14,7 @@ namespace webrtc { > namespace internal { > > SynchronousMethodCall::SynchronousMethodCall(rtc::MessageHandler* proxy) >- : e_(), proxy_(proxy) {} >+ : proxy_(proxy) {} > > SynchronousMethodCall::~SynchronousMethodCall() = default; > >@@ -23,15 +23,14 @@ void SynchronousMethodCall::Invoke(const rtc::Location& posted_from, > if (t->IsCurrent()) { > proxy_->OnMessage(nullptr); > } else { >- e_.reset(new rtc::Event(false, false)); > t->Post(posted_from, this, 0); >- e_->Wait(rtc::Event::kForever); >+ e_.Wait(rtc::Event::kForever); > } > } > > void SynchronousMethodCall::OnMessage(rtc::Message*) { > proxy_->OnMessage(nullptr); >- e_->Set(); >+ e_.Set(); > } > > } // namespace internal 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/proxy.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/proxy.h >index c8962efed7c8d2c903eebe70ec0f5509927f197e..99160517deebd79cd33d371febe21c4a7710caa1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/proxy.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/proxy.h >@@ -143,7 +143,7 @@ class SynchronousMethodCall : public rtc::MessageData, > private: > void OnMessage(rtc::Message*) override; > >- std::unique_ptr<rtc::Event> e_; >+ rtc::Event e_; > rtc::MessageHandler* proxy_; > }; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtp_headers.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtp_headers.cc >index da7f1ea9fd4002f77234239058116e9d867bfdba..bf973b6fe5771d42361adb8369b30cd9b2b372c6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtp_headers.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtp_headers.cc >@@ -10,14 +10,6 @@ > > #include "api/rtp_headers.h" > >-#include <string.h> >-#include <algorithm> >-#include <limits> >-#include <type_traits> >- >-#include "rtc_base/checks.h" >-#include "rtc_base/stringutils.h" >- > namespace webrtc { > > RTPHeaderExtension::RTPHeaderExtension() >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtp_headers.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtp_headers.h >index 41f39888b8081a6bfb935907f3b2cd4a4d783dba..c766899b9fd88f4a2c2ee9090f7f8cb69e9462bd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtp_headers.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtp_headers.h >@@ -12,20 +12,17 @@ > #define API_RTP_HEADERS_H_ > > #include <stddef.h> >+#include <stdint.h> > #include <string.h> >-#include <string> >-#include <vector> > > #include "absl/types/optional.h" > #include "api/array_view.h" >+#include "api/video/color_space.h" > #include "api/video/video_content_type.h" > #include "api/video/video_frame_marking.h" > #include "api/video/video_rotation.h" > #include 
"api/video/video_timing.h" >- > #include "common_types.h" // NOLINT(build/include) >-#include "rtc_base/checks.h" >-#include "rtc_base/deprecation.h" > > namespace webrtc { > >@@ -131,6 +128,8 @@ struct RTPHeaderExtension { > // For identifying the media section used to interpret this RTP packet. See > // https://tools.ietf.org/html/draft-ietf-mmusic-sdp-bundle-negotiation-38 > Mid mid; >+ >+ absl::optional<ColorSpace> color_space; > }; > > struct RTPHeader { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtpparameters.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtpparameters.cc >index 98ced9b3ead35bed8f7c9b37d4d210c5aea6bfb0..063d106a00a5901492ab17aa8a0fad01571eff15 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtpparameters.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtpparameters.cc >@@ -12,7 +12,7 @@ > #include <algorithm> > #include <string> > >-#include "rtc_base/checks.h" >+#include "api/array_view.h" > #include "rtc_base/strings/string_builder.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtpparameters.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtpparameters.h >index 678ac02c4b7da7dc5a3bcaa5011e39452e7ee197..badda074599a93bb5da325bbefdd783b07ff1b18 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtpparameters.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtpparameters.h >@@ -11,12 +11,14 @@ > #ifndef API_RTPPARAMETERS_H_ > #define API_RTPPARAMETERS_H_ > >+#include <stdint.h> > #include <string> > #include <unordered_map> > #include <vector> > > #include "absl/types/optional.h" > #include "api/mediatypes.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > >@@ -404,6 +406,14 @@ struct RtpEncodingParameters { > // bitrate priority. 
> double bitrate_priority = kDefaultBitratePriority; > >+ // The relative DiffServ Code Point priority for this encoding, allowing >+ // packets to be marked relatively higher or lower without affecting >+ // bandwidth allocations. See https://w3c.github.io/webrtc-dscp-exp/ . NB >+ // we follow chromium's translation of the allowed string enum values for >+ // this field to 1.0, 0.5, et cetera, similar to bitrate_priority above. >+ // TODO(http://crbug.com/webrtc/8630): Implement this per encoding parameter. >+ double network_priority = kDefaultBitratePriority; >+ > // Indicates the preferred duration of media represented by a packet in > // milliseconds for this encoding. If set, this will take precedence over the > // ptime set in the RtpCodecParameters. This could happen if SDP negotiation >@@ -471,7 +481,8 @@ struct RtpEncodingParameters { > bool operator==(const RtpEncodingParameters& o) const { > return ssrc == o.ssrc && codec_payload_type == o.codec_payload_type && > fec == o.fec && rtx == o.rtx && dtx == o.dtx && >- bitrate_priority == o.bitrate_priority && ptime == o.ptime && >+ bitrate_priority == o.bitrate_priority && >+ network_priority == o.network_priority && ptime == o.ptime && > max_bitrate_bps == o.max_bitrate_bps && > min_bitrate_bps == o.min_bitrate_bps && > max_framerate == o.max_framerate && >@@ -607,7 +618,7 @@ struct RtcpParameters final { > bool operator!=(const RtcpParameters& o) const { return !(*this == o); } > }; > >-struct RtpParameters { >+struct RTC_EXPORT RtpParameters { > RtpParameters(); > RtpParameters(const RtpParameters&); > ~RtpParameters(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtpsenderinterface.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtpsenderinterface.h >index 9554c1c4da79735547caebd28401816997420d8c..7c94c216e64efbeb5c4628bebe6925a6f5023e15 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtpsenderinterface.h >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtpsenderinterface.h >@@ -24,7 +24,6 @@ > #include "api/proxy.h" > #include "api/rtcerror.h" > #include "api/rtpparameters.h" >-#include "rtc_base/deprecation.h" > #include "rtc_base/refcount.h" > #include "rtc_base/scoped_ref_ptr.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtptransceiverinterface.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtptransceiverinterface.cc >index d833339fa6646850faa9be22e99806abd81c3ebe..e62b014c6564335f70694115df988660aac43513 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtptransceiverinterface.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtptransceiverinterface.cc >@@ -10,6 +10,8 @@ > > #include "api/rtptransceiverinterface.h" > >+#include "rtc_base/checks.h" >+ > namespace webrtc { > > RtpTransceiverInit::RtpTransceiverInit() = default; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtptransceiverinterface.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtptransceiverinterface.h >index 301a3809f8b4def3e76d569508064b403d258a6b..c01fdafb8323cf91bbb557fd2bf272a45ee61157 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtptransceiverinterface.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/rtptransceiverinterface.h >@@ -16,9 +16,12 @@ > > #include "absl/types/optional.h" > #include "api/array_view.h" >+#include "api/mediatypes.h" >+#include "api/rtpparameters.h" > #include "api/rtpreceiverinterface.h" > #include "api/rtpsenderinterface.h" > #include "rtc_base/refcount.h" >+#include "rtc_base/scoped_ref_ptr.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/scoped_refptr.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/scoped_refptr.h >new file mode 100644 >index 0000000000000000000000000000000000000000..0993e032cc7249ae060bbf83292135d2e7fc296b >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/scoped_refptr.h >@@ -0,0 +1,162 @@ >+/* >+ * Copyright 2011 
The WebRTC Project Authors. All rights reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+// Originally these classes are from Chromium. >+// http://src.chromium.org/viewvc/chrome/trunk/src/base/memory/ref_counted.h?view=markup >+ >+// >+// A smart pointer class for reference counted objects. Use this class instead >+// of calling AddRef and Release manually on a reference counted object to >+// avoid common memory leaks caused by forgetting to Release an object >+// reference. Sample usage: >+// >+// class MyFoo : public RefCounted<MyFoo> { >+// ... >+// }; >+// >+// void some_function() { >+// scoped_refptr<MyFoo> foo = new MyFoo(); >+// foo->Method(param); >+// // |foo| is released when this function returns >+// } >+// >+// void some_other_function() { >+// scoped_refptr<MyFoo> foo = new MyFoo(); >+// ... >+// foo = nullptr; // explicitly releases |foo| >+// ... >+// if (foo) >+// foo->Method(param); >+// } >+// >+// The above examples show how scoped_refptr<T> acts like a pointer to T. >+// Given two scoped_refptr<T> classes, it is also possible to exchange >+// references between the two objects, like so: >+// >+// { >+// scoped_refptr<MyFoo> a = new MyFoo(); >+// scoped_refptr<MyFoo> b; >+// >+// b.swap(a); >+// // now, |b| references the MyFoo object, and |a| references null. >+// } >+// >+// To make both |a| and |b| in the above example reference the same MyFoo >+// object, simply use the assignment operator: >+// >+// { >+// scoped_refptr<MyFoo> a = new MyFoo(); >+// scoped_refptr<MyFoo> b; >+// >+// b = a; >+// // now, |a| and |b| each own a reference to the same MyFoo object. 
>+// } >+// >+ >+#ifndef API_SCOPED_REFPTR_H_ >+#define API_SCOPED_REFPTR_H_ >+ >+#include <memory> >+#include <utility> >+ >+namespace rtc { >+ >+template <class T> >+class scoped_refptr { >+ public: >+ scoped_refptr() : ptr_(nullptr) {} >+ >+ scoped_refptr(T* p) : ptr_(p) { // NOLINT(runtime/explicit) >+ if (ptr_) >+ ptr_->AddRef(); >+ } >+ >+ scoped_refptr(const scoped_refptr<T>& r) : ptr_(r.ptr_) { >+ if (ptr_) >+ ptr_->AddRef(); >+ } >+ >+ template <typename U> >+ scoped_refptr(const scoped_refptr<U>& r) : ptr_(r.get()) { >+ if (ptr_) >+ ptr_->AddRef(); >+ } >+ >+ // Move constructors. >+ scoped_refptr(scoped_refptr<T>&& r) : ptr_(r.release()) {} >+ >+ template <typename U> >+ scoped_refptr(scoped_refptr<U>&& r) : ptr_(r.release()) {} >+ >+ ~scoped_refptr() { >+ if (ptr_) >+ ptr_->Release(); >+ } >+ >+ T* get() const { return ptr_; } >+ operator T*() const { return ptr_; } >+ T* operator->() const { return ptr_; } >+ >+ // Returns the (possibly null) raw pointer, and makes the scoped_refptr hold a >+ // null pointer, all without touching the reference count of the underlying >+ // pointed-to object. The object is still reference counted, and the caller of >+ // release() is now the proud owner of one reference, so it is responsible for >+ // calling Release() once on the object when no longer using it. 
>+ T* release() { >+ T* retVal = ptr_; >+ ptr_ = nullptr; >+ return retVal; >+ } >+ >+ scoped_refptr<T>& operator=(T* p) { >+ // AddRef first so that self assignment should work >+ if (p) >+ p->AddRef(); >+ if (ptr_) >+ ptr_->Release(); >+ ptr_ = p; >+ return *this; >+ } >+ >+ scoped_refptr<T>& operator=(const scoped_refptr<T>& r) { >+ return *this = r.ptr_; >+ } >+ >+ template <typename U> >+ scoped_refptr<T>& operator=(const scoped_refptr<U>& r) { >+ return *this = r.get(); >+ } >+ >+ scoped_refptr<T>& operator=(scoped_refptr<T>&& r) { >+ scoped_refptr<T>(std::move(r)).swap(*this); >+ return *this; >+ } >+ >+ template <typename U> >+ scoped_refptr<T>& operator=(scoped_refptr<U>&& r) { >+ scoped_refptr<T>(std::move(r)).swap(*this); >+ return *this; >+ } >+ >+ void swap(T** pp) { >+ T* p = ptr_; >+ ptr_ = *pp; >+ *pp = p; >+ } >+ >+ void swap(scoped_refptr<T>& r) { swap(&r.ptr_); } >+ >+ protected: >+ T* ptr_; >+}; >+ >+} // namespace rtc >+ >+#endif // API_SCOPED_REFPTR_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/stats/rtcstats.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/stats/rtcstats.h >index 1705f6ab58ee79ddf69a5a576f65f72d2eaa0f6e..7fd0d84e0bb145581b78c5b2059f7d980a65970d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/stats/rtcstats.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/stats/rtcstats.h >@@ -18,6 +18,7 @@ > #include <vector> > > #include "rtc_base/checks.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > >@@ -47,7 +48,7 @@ class RTCStatsMemberInterface; > // for (const RTCStatsMemberInterface* member : foo.Members()) { > // printf("%s = %s\n", member->name(), member->ValueToString().c_str()); > // } >-class RTCStats { >+class RTC_EXPORT RTCStats { > public: > RTCStats(const std::string& id, int64_t timestamp_us) > : id_(id), timestamp_us_(timestamp_us) {} >@@ -247,7 +248,7 @@ class RTCStatsMemberInterface { > // (undefined reference to |kType|). 
The supported types are the ones described > // by |RTCStatsMemberInterface::Type|. > template <typename T> >-class RTCStatsMember : public RTCStatsMemberInterface { >+class RTC_EXPORT RTCStatsMember : public RTCStatsMemberInterface { > public: > static const Type kType; > >@@ -294,15 +295,6 @@ class RTCStatsMember : public RTCStatsMemberInterface { > is_defined_ = true; > return value_; > } >- T& operator=(const RTCStatsMember<T>& other) { >- RTC_DCHECK(other.is_defined_); >- // Shouldn't be attempting to assign an RTCNonStandardStatsMember to an >- // RTCStatsMember or vice versa. >- RTC_DCHECK(is_standardized() == other.is_standardized()); >- value_ = other.value_; >- is_defined_ = true; >- return value_; >- } > > // Value getters. > T& operator*() { >@@ -347,6 +339,11 @@ class RTCNonStandardStatsMember : public RTCStatsMember<T> { > : RTCStatsMember<T>(std::move(other)) {} > > bool is_standardized() const override { return false; } >+ >+ T& operator=(const T& value) { return RTCStatsMember<T>::operator=(value); } >+ T& operator=(const T&& value) { >+ return RTCStatsMember<T>::operator=(std::move(value)); >+ } > }; > } // namespace webrtc > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/stats/rtcstats_objects.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/stats/rtcstats_objects.h >index f377809ba3a71da62505987a1b5697797c4f5804..8392231218953331da532ed8d2914fb554e8217f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/stats/rtcstats_objects.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/stats/rtcstats_objects.h >@@ -16,6 +16,7 @@ > #include <vector> > > #include "api/stats/rtcstats.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > >@@ -73,7 +74,7 @@ struct RTCNetworkType { > }; > > // https://w3c.github.io/webrtc-stats/#certificatestats-dict* >-class RTCCertificateStats final : public RTCStats { >+class RTC_EXPORT RTCCertificateStats final : public RTCStats { > public: > WEBRTC_RTCSTATS_DECL(); > >@@ -89,7 
+90,7 @@ class RTCCertificateStats final : public RTCStats { > }; > > // https://w3c.github.io/webrtc-stats/#codec-dict* >-class RTCCodecStats final : public RTCStats { >+class RTC_EXPORT RTCCodecStats final : public RTCStats { > public: > WEBRTC_RTCSTATS_DECL(); > >@@ -110,7 +111,7 @@ class RTCCodecStats final : public RTCStats { > }; > > // https://w3c.github.io/webrtc-stats/#dcstats-dict* >-class RTCDataChannelStats final : public RTCStats { >+class RTC_EXPORT RTCDataChannelStats final : public RTCStats { > public: > WEBRTC_RTCSTATS_DECL(); > >@@ -132,7 +133,7 @@ class RTCDataChannelStats final : public RTCStats { > > // https://w3c.github.io/webrtc-stats/#candidatepair-dict* > // TODO(hbos): Tracking bug https://bugs.webrtc.org/7062 >-class RTCIceCandidatePairStats final : public RTCStats { >+class RTC_EXPORT RTCIceCandidatePairStats final : public RTCStats { > public: > WEBRTC_RTCSTATS_DECL(); > >@@ -186,7 +187,7 @@ class RTCIceCandidatePairStats final : public RTCStats { > // crbug.com/632723 > // TODO(qingsi): Add the stats of STUN binding requests (keepalives) and collect > // them in the new PeerConnection::GetStats. >-class RTCIceCandidateStats : public RTCStats { >+class RTC_EXPORT RTCIceCandidateStats : public RTCStats { > public: > WEBRTC_RTCSTATS_DECL(); > >@@ -221,7 +222,7 @@ class RTCIceCandidateStats : public RTCStats { > // |kType| need to be different ("RTCStatsType type") in the local/remote case. > // https://w3c.github.io/webrtc-stats/#rtcstatstype-str* > // This forces us to have to override copy() and type(). 
>-class RTCLocalIceCandidateStats final : public RTCIceCandidateStats { >+class RTC_EXPORT RTCLocalIceCandidateStats final : public RTCIceCandidateStats { > public: > static const char kType[]; > RTCLocalIceCandidateStats(const std::string& id, int64_t timestamp_us); >@@ -230,7 +231,8 @@ class RTCLocalIceCandidateStats final : public RTCIceCandidateStats { > const char* type() const override; > }; > >-class RTCRemoteIceCandidateStats final : public RTCIceCandidateStats { >+class RTC_EXPORT RTCRemoteIceCandidateStats final >+ : public RTCIceCandidateStats { > public: > static const char kType[]; > RTCRemoteIceCandidateStats(const std::string& id, int64_t timestamp_us); >@@ -241,7 +243,7 @@ class RTCRemoteIceCandidateStats final : public RTCIceCandidateStats { > > // https://w3c.github.io/webrtc-stats/#msstats-dict* > // TODO(hbos): Tracking bug crbug.com/660827 >-class RTCMediaStreamStats final : public RTCStats { >+class RTC_EXPORT RTCMediaStreamStats final : public RTCStats { > public: > WEBRTC_RTCSTATS_DECL(); > >@@ -256,7 +258,7 @@ class RTCMediaStreamStats final : public RTCStats { > > // https://w3c.github.io/webrtc-stats/#mststats-dict* > // TODO(hbos): Tracking bug crbug.com/659137 >-class RTCMediaStreamTrackStats final : public RTCStats { >+class RTC_EXPORT RTCMediaStreamTrackStats final : public RTCStats { > public: > WEBRTC_RTCSTATS_DECL(); > >@@ -306,10 +308,14 @@ class RTCMediaStreamTrackStats final : public RTCStats { > RTCStatsMember<double> total_samples_duration; > RTCStatsMember<uint64_t> concealed_samples; > RTCStatsMember<uint64_t> concealment_events; >+ // Non-standard audio-only member >+ // TODO(kuddai): Add description to standard. 
crbug.com/webrtc/10042 >+ RTCNonStandardStatsMember<uint64_t> jitter_buffer_flushes; >+ RTCNonStandardStatsMember<uint64_t> delayed_packet_outage_samples; > }; > > // https://w3c.github.io/webrtc-stats/#pcstats-dict* >-class RTCPeerConnectionStats final : public RTCStats { >+class RTC_EXPORT RTCPeerConnectionStats final : public RTCStats { > public: > WEBRTC_RTCSTATS_DECL(); > >@@ -324,7 +330,7 @@ class RTCPeerConnectionStats final : public RTCStats { > > // https://w3c.github.io/webrtc-stats/#streamstats-dict* > // TODO(hbos): Tracking bug crbug.com/657854 >-class RTCRTPStreamStats : public RTCStats { >+class RTC_EXPORT RTCRTPStreamStats : public RTCStats { > public: > WEBRTC_RTCSTATS_DECL(); > >@@ -362,7 +368,7 @@ class RTCRTPStreamStats : public RTCStats { > // https://w3c.github.io/webrtc-stats/#inboundrtpstats-dict* > // TODO(hbos): Support the remote case |is_remote = true|. > // https://bugs.webrtc.org/7065 >-class RTCInboundRTPStreamStats final : public RTCRTPStreamStats { >+class RTC_EXPORT RTCInboundRTPStreamStats final : public RTCRTPStreamStats { > public: > WEBRTC_RTCSTATS_DECL(); > >@@ -406,7 +412,7 @@ class RTCInboundRTPStreamStats final : public RTCRTPStreamStats { > // https://w3c.github.io/webrtc-stats/#outboundrtpstats-dict* > // TODO(hbos): Support the remote case |is_remote = true|. 
> // https://bugs.webrtc.org/7066 >-class RTCOutboundRTPStreamStats final : public RTCRTPStreamStats { >+class RTC_EXPORT RTCOutboundRTPStreamStats final : public RTCRTPStreamStats { > public: > WEBRTC_RTCSTATS_DECL(); > >@@ -423,7 +429,7 @@ class RTCOutboundRTPStreamStats final : public RTCRTPStreamStats { > }; > > // https://w3c.github.io/webrtc-stats/#transportstats-dict* >-class RTCTransportStats final : public RTCStats { >+class RTC_EXPORT RTCTransportStats final : public RTCStats { > public: > WEBRTC_RTCSTATS_DECL(); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/stats/rtcstatsreport.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/stats/rtcstatsreport.h >index f7410b39f79affa2376366ebcf93be614aec302f..8bf1cb57d9d55ae50432b256355dd21f284fb436 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/stats/rtcstatsreport.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/stats/rtcstatsreport.h >@@ -20,12 +20,13 @@ > #include "rtc_base/refcount.h" > #include "rtc_base/refcountedobject.h" > #include "rtc_base/scoped_ref_ptr.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > > // A collection of stats. > // This is accessible as a map from |RTCStats::id| to |RTCStats|. 
>-class RTCStatsReport : public rtc::RefCountInterface { >+class RTC_EXPORT RTCStatsReport : public rtc::RefCountInterface { > public: > typedef std::map<std::string, std::unique_ptr<const RTCStats>> StatsMap; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/DEPS b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/DEPS >index 98b1ad391fca2b644c61a3f50328dd5efede2edf..9c293b323e703fc6cd49d5d9830cab7f8f54f98d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/DEPS >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/DEPS >@@ -5,4 +5,10 @@ specific_include_rules = { > ".*": [ > "+video" > ], >+ "loopback_media_transport\.h": [ >+ "+rtc_base/asyncinvoker.h", >+ "+rtc_base/criticalsection.h", >+ "+rtc_base/thread.h", >+ "+rtc_base/thread_checker.h", >+ ], > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/fake_frame_decryptor.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/fake_frame_decryptor.cc >index 432664a03063c8b58ed243694e3cdb02b5cb93da..b77017fdb43684003fe552f3b8446bc09361a908 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/fake_frame_decryptor.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/fake_frame_decryptor.cc >@@ -25,19 +25,20 @@ int FakeFrameDecryptor::Decrypt(cricket::MediaType media_type, > rtc::ArrayView<uint8_t> frame, > size_t* bytes_written) { > if (fail_decryption_) { >- return 1; >+ return static_cast<int>(FakeDecryptStatus::FORCED_FAILURE); > } > > RTC_CHECK_EQ(frame.size() + 1, encrypted_frame.size()); > for (size_t i = 0; i < frame.size(); i++) { >- frame[i] ^= fake_key_; >+ frame[i] = encrypted_frame[i] ^ fake_key_; > } > > if (encrypted_frame[frame.size()] != expected_postfix_byte_) { >- return 1; >+ return static_cast<int>(FakeDecryptStatus::INVALID_POSTFIX); > } > >- return 0; >+ *bytes_written = frame.size(); >+ return static_cast<int>(FakeDecryptStatus::OK); > } > > size_t FakeFrameDecryptor::GetMaxPlaintextByteSize( >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/fake_frame_decryptor.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/fake_frame_decryptor.h >index b945def59a8b0a5b74745f52b74208bf165de5c3..bcb8080fd4c84c66b335242c093148d2983b709a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/fake_frame_decryptor.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/fake_frame_decryptor.h >@@ -22,34 +22,40 @@ namespace webrtc { > // FrameDecryptorInterface. It is constructed with a simple single digit key and > // a fixed postfix byte. This is just to validate that the core code works > // as expected. >-class FakeFrameDecryptor >+class FakeFrameDecryptor final > : public rtc::RefCountedObject<FrameDecryptorInterface> { > public: > // Provide a key (0,255) and some postfix byte (0,255) this should match the > // byte you expect from the FakeFrameEncryptor. >- explicit FakeFrameDecryptor(uint8_t fake_key = 1, >+ explicit FakeFrameDecryptor(uint8_t fake_key = 0xAA, > uint8_t expected_postfix_byte = 255); >- >- // FrameDecryptorInterface implementation >+ // Fake decryption that just xors the payload with the 1 byte key and checks >+ // the postfix byte. This will always fail if fail_decryption_ is set to true. > int Decrypt(cricket::MediaType media_type, > const std::vector<uint32_t>& csrcs, > rtc::ArrayView<const uint8_t> additional_data, > rtc::ArrayView<const uint8_t> encrypted_frame, > rtc::ArrayView<uint8_t> frame, > size_t* bytes_written) override; >- >+ // Always returns 1 less than the size of the encrypted frame. > size_t GetMaxPlaintextByteSize(cricket::MediaType media_type, > size_t encrypted_frame_size) override; >- >+ // Sets the fake key to use for encryption. > void SetFakeKey(uint8_t fake_key); >- >+ // Returns the fake key used for encryption. > uint8_t GetFakeKey() const; >- >+ // Set the Postfix byte that is expected in the encrypted payload. 
> void SetExpectedPostfixByte(uint8_t expected_postfix_byte); >- >+ // Returns the postfix byte that will be checked for in the encrypted payload. > uint8_t GetExpectedPostfixByte() const; >- >+ // If set to true will force all encryption to fail. > void SetFailDecryption(bool fail_decryption); >+ // Simple error codes for tests to validate against. >+ enum class FakeDecryptStatus : int { >+ OK = 0, >+ FORCED_FAILURE = 1, >+ INVALID_POSTFIX = 2 >+ }; > > private: > uint8_t fake_key_ = 0; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/fake_frame_encryptor.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/fake_frame_encryptor.cc >index 013058f96aa6dfe9b0613938e57daef5da38c36c..edf3cc2136ad00f785d76f7a490b28554ad2d53d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/fake_frame_encryptor.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/fake_frame_encryptor.cc >@@ -22,18 +22,18 @@ int FakeFrameEncryptor::Encrypt(cricket::MediaType media_type, > rtc::ArrayView<const uint8_t> frame, > rtc::ArrayView<uint8_t> encrypted_frame, > size_t* bytes_written) { >- // Useful if you want to test failure cases. 
> if (fail_encryption_) { >- return 1; >+ return static_cast<int>(FakeEncryptionStatus::FORCED_FAILURE); > } > > RTC_CHECK_EQ(frame.size() + 1, encrypted_frame.size()); > for (size_t i = 0; i < frame.size(); i++) { >- encrypted_frame[i] ^= fake_key_; >+ encrypted_frame[i] = frame[i] ^ fake_key_; > } >+ > encrypted_frame[frame.size()] = postfix_byte_; > *bytes_written = encrypted_frame.size(); >- return 0; >+ return static_cast<int>(FakeEncryptionStatus::OK); > } > > size_t FakeFrameEncryptor::GetMaxCiphertextByteSize( >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/fake_frame_encryptor.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/fake_frame_encryptor.h >index 61ae9383f6a54390997bb14c3b096822fee0d4e4..0bec967dc229646eaafdb1bc6ebcf9b6660a7afb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/fake_frame_encryptor.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/fake_frame_encryptor.h >@@ -24,29 +24,35 @@ class FakeFrameEncryptor > : public rtc::RefCountedObject<FrameEncryptorInterface> { > public: > // Provide a key (0,255) and some postfix byte (0,255). >- explicit FakeFrameEncryptor(uint8_t fake_key = 1, uint8_t postfix_byte = 255); >- >- // FrameEncryptorInterface implementation >+ explicit FakeFrameEncryptor(uint8_t fake_key = 0xAA, >+ uint8_t postfix_byte = 255); >+ // Simply xors each payload with the provided fake key and adds the postfix >+ // bit to the end. This will always fail if fail_encryption_ is set to true. > int Encrypt(cricket::MediaType media_type, > uint32_t ssrc, > rtc::ArrayView<const uint8_t> additional_data, > rtc::ArrayView<const uint8_t> frame, > rtc::ArrayView<uint8_t> encrypted_frame, > size_t* bytes_written) override; >- >+ // Always returns 1 more than the size of the frame. > size_t GetMaxCiphertextByteSize(cricket::MediaType media_type, > size_t frame_size) override; >- >+ // Sets the fake key to use during encryption. 
> void SetFakeKey(uint8_t fake_key); >- >+ // Returns the fake key used during encryption. > uint8_t GetFakeKey() const; >- >+ // Set the postfix byte to use. > void SetPostfixByte(uint8_t expected_postfix_byte); >- >+ // Return a postfix byte added to each outgoing payload. > uint8_t GetPostfixByte() const; >- >+ // Force all encryptions to fail. > void SetFailEncryption(bool fail_encryption); > >+ enum class FakeEncryptionStatus : int { >+ OK = 0, >+ FORCED_FAILURE = 1, >+ }; >+ > private: > uint8_t fake_key_ = 0; > uint8_t postfix_byte_ = 0; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/fake_media_transport.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/fake_media_transport.h >new file mode 100644 >index 0000000000000000000000000000000000000000..eb6ec93ae2b7922b8223d9891a1d358d8a225534 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/fake_media_transport.h >@@ -0,0 +1,131 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef API_TEST_FAKE_MEDIA_TRANSPORT_H_ >+#define API_TEST_FAKE_MEDIA_TRANSPORT_H_ >+ >+#include <memory> >+#include <string> >+#include <utility> >+#include <vector> >+ >+#include "absl/memory/memory.h" >+#include "api/media_transport_interface.h" >+ >+namespace webrtc { >+ >+// TODO(sukhanov): For now fake media transport does nothing and is used only >+// in jsepcontroller unittests. In the future we should implement fake media >+// transport, which forwards frames to another fake media transport, so we >+// could unit test audio / video integration. 
>+class FakeMediaTransport : public MediaTransportInterface { >+ public: >+ explicit FakeMediaTransport(const MediaTransportSettings& settings) >+ : settings_(settings) {} >+ ~FakeMediaTransport() = default; >+ >+ RTCError SendAudioFrame(uint64_t channel_id, >+ MediaTransportEncodedAudioFrame frame) override { >+ return RTCError::OK(); >+ } >+ >+ RTCError SendVideoFrame( >+ uint64_t channel_id, >+ const MediaTransportEncodedVideoFrame& frame) override { >+ return RTCError::OK(); >+ } >+ >+ RTCError RequestKeyFrame(uint64_t channel_id) override { >+ return RTCError::OK(); >+ }; >+ >+ void SetReceiveAudioSink(MediaTransportAudioSinkInterface* sink) override {} >+ void SetReceiveVideoSink(MediaTransportVideoSinkInterface* sink) override {} >+ >+ // Returns true if fake media transport was created as a caller. >+ bool is_caller() const { return settings_.is_caller; } >+ absl::optional<std::string> pre_shared_key() const { >+ return settings_.pre_shared_key; >+ } >+ >+ RTCError SendData(int channel_id, >+ const SendDataParams& params, >+ const rtc::CopyOnWriteBuffer& buffer) override { >+ return RTCError::OK(); >+ } >+ >+ RTCError CloseChannel(int channel_id) override { return RTCError::OK(); } >+ >+ void SetDataSink(DataChannelSink* sink) override {} >+ >+ void SetMediaTransportStateCallback( >+ MediaTransportStateCallback* callback) override { >+ state_callback_ = callback; >+ } >+ >+ void SetState(webrtc::MediaTransportState state) { >+ if (state_callback_) { >+ state_callback_->OnStateChanged(state); >+ } >+ } >+ >+ void AddTargetTransferRateObserver( >+ webrtc::TargetTransferRateObserver* observer) override { >+ RTC_CHECK(std::find(target_rate_observers_.begin(), >+ target_rate_observers_.end(), >+ observer) == target_rate_observers_.end()); >+ target_rate_observers_.push_back(observer); >+ } >+ >+ void RemoveTargetTransferRateObserver( >+ webrtc::TargetTransferRateObserver* observer) override { >+ auto it = std::find(target_rate_observers_.begin(), >+ 
target_rate_observers_.end(), observer); >+ if (it != target_rate_observers_.end()) { >+ target_rate_observers_.erase(it); >+ } >+ } >+ >+ int target_rate_observers_size() { return static_cast<int>(target_rate_observers_.size()); } >+ >+ private: >+ const MediaTransportSettings settings_; >+ MediaTransportStateCallback* state_callback_ = nullptr; >+ std::vector<webrtc::TargetTransferRateObserver*> target_rate_observers_; >+}; >+ >+// Fake media transport factory creates fake media transport. >+class FakeMediaTransportFactory : public MediaTransportFactory { >+ public: >+ FakeMediaTransportFactory() = default; >+ ~FakeMediaTransportFactory() = default; >+ >+ RTCErrorOr<std::unique_ptr<MediaTransportInterface>> CreateMediaTransport( >+ rtc::PacketTransportInternal* packet_transport, >+ rtc::Thread* network_thread, >+ bool is_caller) override { >+ MediaTransportSettings settings; >+ settings.is_caller = is_caller; >+ return CreateMediaTransport(packet_transport, network_thread, settings); >+ } >+ >+ RTCErrorOr<std::unique_ptr<MediaTransportInterface>> CreateMediaTransport( >+ rtc::PacketTransportInternal* packet_transport, >+ rtc::Thread* network_thread, >+ const MediaTransportSettings& settings) override { >+ std::unique_ptr<MediaTransportInterface> media_transport = >+ absl::make_unique<FakeMediaTransport>(settings); >+ return std::move(media_transport); >+ } >+}; >+ >+} // namespace webrtc >+ >+#endif // API_TEST_FAKE_MEDIA_TRANSPORT_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/loopback_media_transport.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/loopback_media_transport.h >new file mode 100644 >index 0000000000000000000000000000000000000000..48255b135e7508cab00496630a537fdc379b473c >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/loopback_media_transport.h >@@ -0,0 +1,291 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. 
>+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef API_TEST_LOOPBACK_MEDIA_TRANSPORT_H_ >+#define API_TEST_LOOPBACK_MEDIA_TRANSPORT_H_ >+ >+#include <memory> >+#include <utility> >+ >+#include "api/media_transport_interface.h" >+#include "rtc_base/asyncinvoker.h" >+#include "rtc_base/criticalsection.h" >+#include "rtc_base/thread.h" >+#include "rtc_base/thread_checker.h" >+ >+namespace webrtc { >+ >+// Wrapper used to hand out unique_ptrs to loopback media transports without >+// ownership changes. >+class WrapperMediaTransport : public MediaTransportInterface { >+ public: >+ explicit WrapperMediaTransport(MediaTransportInterface* wrapped) >+ : wrapped_(wrapped) {} >+ >+ RTCError SendAudioFrame(uint64_t channel_id, >+ MediaTransportEncodedAudioFrame frame) override { >+ return wrapped_->SendAudioFrame(channel_id, std::move(frame)); >+ } >+ >+ RTCError SendVideoFrame( >+ uint64_t channel_id, >+ const MediaTransportEncodedVideoFrame& frame) override { >+ return wrapped_->SendVideoFrame(channel_id, frame); >+ } >+ >+ RTCError RequestKeyFrame(uint64_t channel_id) override { >+ return wrapped_->RequestKeyFrame(channel_id); >+ } >+ >+ void SetReceiveAudioSink(MediaTransportAudioSinkInterface* sink) override { >+ wrapped_->SetReceiveAudioSink(sink); >+ } >+ >+ void SetReceiveVideoSink(MediaTransportVideoSinkInterface* sink) override { >+ wrapped_->SetReceiveVideoSink(sink); >+ } >+ >+ void SetMediaTransportStateCallback( >+ MediaTransportStateCallback* callback) override { >+ wrapped_->SetMediaTransportStateCallback(callback); >+ } >+ >+ RTCError SendData(int channel_id, >+ const SendDataParams& params, >+ const rtc::CopyOnWriteBuffer& buffer) override { >+ return 
wrapped_->SendData(channel_id, params, buffer); >+ } >+ >+ RTCError CloseChannel(int channel_id) override { >+ return wrapped_->CloseChannel(channel_id); >+ } >+ >+ void SetDataSink(DataChannelSink* sink) override { >+ wrapped_->SetDataSink(sink); >+ } >+ >+ private: >+ MediaTransportInterface* wrapped_; >+}; >+ >+class WrapperMediaTransportFactory : public MediaTransportFactory { >+ public: >+ explicit WrapperMediaTransportFactory(MediaTransportInterface* wrapped) >+ : wrapped_(wrapped) {} >+ >+ RTCErrorOr<std::unique_ptr<MediaTransportInterface>> CreateMediaTransport( >+ rtc::PacketTransportInternal* packet_transport, >+ rtc::Thread* network_thread, >+ const MediaTransportSettings& settings) override { >+ return {absl::make_unique<WrapperMediaTransport>(wrapped_)}; >+ } >+ >+ private: >+ MediaTransportInterface* wrapped_; >+}; >+ >+// Contains two MediaTransportInterfaces that are connected to each other. >+// Currently supports audio only. >+class MediaTransportPair { >+ public: >+ struct Stats { >+ int sent_audio_frames = 0; >+ int received_audio_frames = 0; >+ }; >+ >+ explicit MediaTransportPair(rtc::Thread* thread) >+ : first_(thread, &second_), second_(thread, &first_) {} >+ >+ // Ownership stays with MediaTransportPair. >+ MediaTransportInterface* first() { return &first_; } >+ MediaTransportInterface* second() { return &second_; } >+ >+ std::unique_ptr<MediaTransportFactory> first_factory() { >+ return absl::make_unique<WrapperMediaTransportFactory>(&first_); >+ } >+ >+ std::unique_ptr<MediaTransportFactory> second_factory() { >+ return absl::make_unique<WrapperMediaTransportFactory>(&second_); >+ } >+ >+ void SetState(MediaTransportState state) { >+ first_.SetState(state); >+ second_.SetState(state); >+ } >+ >+ void FlushAsyncInvokes() { >+ first_.FlushAsyncInvokes(); >+ second_.FlushAsyncInvokes(); >+ } >+ >+ Stats FirstStats() { return first_.GetStats(); } >+ Stats SecondStats() { return second_.GetStats(); } >+ >+ private: >+ class 
LoopbackMediaTransport : public MediaTransportInterface { >+ public: >+ LoopbackMediaTransport(rtc::Thread* thread, LoopbackMediaTransport* other) >+ : thread_(thread), other_(other) {} >+ >+ ~LoopbackMediaTransport() { >+ rtc::CritScope lock(&sink_lock_); >+ RTC_CHECK(sink_ == nullptr); >+ RTC_CHECK(data_sink_ == nullptr); >+ } >+ >+ RTCError SendAudioFrame(uint64_t channel_id, >+ MediaTransportEncodedAudioFrame frame) override { >+ { >+ rtc::CritScope lock(&stats_lock_); >+ ++stats_.sent_audio_frames; >+ } >+ invoker_.AsyncInvoke<void>(RTC_FROM_HERE, thread_, >+ [this, channel_id, frame] { >+ other_->OnData(channel_id, std::move(frame)); >+ }); >+ return RTCError::OK(); >+ }; >+ >+ RTCError SendVideoFrame( >+ uint64_t channel_id, >+ const MediaTransportEncodedVideoFrame& frame) override { >+ return RTCError::OK(); >+ } >+ >+ RTCError RequestKeyFrame(uint64_t channel_id) override { >+ return RTCError::OK(); >+ } >+ >+ void SetReceiveAudioSink(MediaTransportAudioSinkInterface* sink) override { >+ rtc::CritScope lock(&sink_lock_); >+ if (sink) { >+ RTC_CHECK(sink_ == nullptr); >+ } >+ sink_ = sink; >+ } >+ >+ void SetReceiveVideoSink(MediaTransportVideoSinkInterface* sink) override {} >+ >+ void SetMediaTransportStateCallback( >+ MediaTransportStateCallback* callback) override { >+ rtc::CritScope lock(&sink_lock_); >+ state_callback_ = callback; >+ invoker_.AsyncInvoke<void>(RTC_FROM_HERE, thread_, [this] { >+ RTC_DCHECK_RUN_ON(thread_); >+ OnStateChanged(); >+ }); >+ } >+ >+ RTCError SendData(int channel_id, >+ const SendDataParams& params, >+ const rtc::CopyOnWriteBuffer& buffer) override { >+ invoker_.AsyncInvoke<void>( >+ RTC_FROM_HERE, thread_, [this, channel_id, params, buffer] { >+ other_->OnData(channel_id, params.type, buffer); >+ }); >+ return RTCError::OK(); >+ } >+ >+ RTCError CloseChannel(int channel_id) override { >+ invoker_.AsyncInvoke<void>(RTC_FROM_HERE, thread_, [this, channel_id] { >+ other_->OnRemoteCloseChannel(channel_id); >+ rtc::CritScope 
lock(&sink_lock_); >+ if (data_sink_) { >+ data_sink_->OnChannelClosed(channel_id); >+ } >+ }); >+ return RTCError::OK(); >+ } >+ >+ void SetDataSink(DataChannelSink* sink) override { >+ rtc::CritScope lock(&sink_lock_); >+ data_sink_ = sink; >+ } >+ >+ void SetState(MediaTransportState state) { >+ invoker_.AsyncInvoke<void>(RTC_FROM_HERE, thread_, [this, state] { >+ RTC_DCHECK_RUN_ON(thread_); >+ state_ = state; >+ OnStateChanged(); >+ }); >+ } >+ >+ void FlushAsyncInvokes() { invoker_.Flush(thread_); } >+ >+ Stats GetStats() { >+ rtc::CritScope lock(&stats_lock_); >+ return stats_; >+ } >+ >+ private: >+ void OnData(uint64_t channel_id, MediaTransportEncodedAudioFrame frame) { >+ { >+ rtc::CritScope lock(&sink_lock_); >+ if (sink_) { >+ sink_->OnData(channel_id, frame); >+ } >+ } >+ { >+ rtc::CritScope lock(&stats_lock_); >+ ++stats_.received_audio_frames; >+ } >+ } >+ >+ void OnData(int channel_id, >+ DataMessageType type, >+ const rtc::CopyOnWriteBuffer& buffer) { >+ rtc::CritScope lock(&sink_lock_); >+ if (data_sink_) { >+ data_sink_->OnDataReceived(channel_id, type, buffer); >+ } >+ } >+ >+ void OnRemoteCloseChannel(int channel_id) { >+ rtc::CritScope lock(&sink_lock_); >+ if (data_sink_) { >+ data_sink_->OnChannelClosing(channel_id); >+ data_sink_->OnChannelClosed(channel_id); >+ } >+ } >+ >+ void OnStateChanged() RTC_RUN_ON(thread_) { >+ rtc::CritScope lock(&sink_lock_); >+ if (state_callback_) { >+ state_callback_->OnStateChanged(state_); >+ } >+ } >+ >+ rtc::Thread* const thread_; >+ rtc::CriticalSection sink_lock_; >+ rtc::CriticalSection stats_lock_; >+ >+ MediaTransportAudioSinkInterface* sink_ RTC_GUARDED_BY(sink_lock_) = >+ nullptr; >+ DataChannelSink* data_sink_ RTC_GUARDED_BY(sink_lock_) = nullptr; >+ MediaTransportStateCallback* state_callback_ RTC_GUARDED_BY(sink_lock_) = >+ nullptr; >+ >+ MediaTransportState state_ RTC_GUARDED_BY(thread_) = >+ MediaTransportState::kPending; >+ >+ LoopbackMediaTransport* const other_; >+ >+ Stats stats_ 
RTC_GUARDED_BY(stats_lock_); >+ >+ rtc::AsyncInvoker invoker_; >+ }; >+ >+ LoopbackMediaTransport first_; >+ LoopbackMediaTransport second_; >+}; >+ >+} // namespace webrtc >+ >+#endif // API_TEST_LOOPBACK_MEDIA_TRANSPORT_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/loopback_media_transport_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/loopback_media_transport_unittest.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..ba741a05ca79862d68340e825d7ccd50deaf27b7 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/loopback_media_transport_unittest.cc >@@ -0,0 +1,171 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include <memory> >+#include <vector> >+ >+#include "api/test/loopback_media_transport.h" >+#include "test/gmock.h" >+ >+namespace webrtc { >+ >+namespace { >+ >+class MockMediaTransportAudioSinkInterface >+ : public MediaTransportAudioSinkInterface { >+ public: >+ MOCK_METHOD2(OnData, void(uint64_t, MediaTransportEncodedAudioFrame)); >+}; >+ >+class MockDataChannelSink : public DataChannelSink { >+ public: >+ MOCK_METHOD3(OnDataReceived, >+ void(int, DataMessageType, const rtc::CopyOnWriteBuffer&)); >+ MOCK_METHOD1(OnChannelClosing, void(int)); >+ MOCK_METHOD1(OnChannelClosed, void(int)); >+}; >+ >+class MockStateCallback : public MediaTransportStateCallback { >+ public: >+ MOCK_METHOD1(OnStateChanged, void(MediaTransportState)); >+}; >+ >+// Test only uses the sequence number. 
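The MediaTransportPair above connects two LoopbackMediaTransport endpoints so that a frame sent on one side is delivered to the sink registered on the other. A simplified synchronous sketch of that wiring (all names here are hypothetical; the real class marshals delivery through rtc::AsyncInvoker onto a thread and guards its sinks with locks):

```cpp
#include <cstdint>

// Receiver interface: one callback per delivered frame.
class FrameSink {
 public:
  virtual ~FrameSink() = default;
  virtual void OnData(uint64_t channel_id, int sequence_number) = 0;
};

// One side of the loopback: sending delivers to the peer's sink, and a
// send with no sink registered on the peer is silently ignored (as the
// AudioWithNoSinkSilentlyIgnored test above expects).
class LoopbackEndpoint {
 public:
  void set_peer(LoopbackEndpoint* peer) { peer_ = peer; }
  void SetReceiveSink(FrameSink* sink) { sink_ = sink; }

  void SendFrame(uint64_t channel_id, int sequence_number) {
    if (peer_ && peer_->sink_) {
      peer_->sink_->OnData(channel_id, sequence_number);
    }
  }

 private:
  LoopbackEndpoint* peer_ = nullptr;
  FrameSink* sink_ = nullptr;
};

// Pair wiring, mirroring MediaTransportPair: construction points the two
// endpoints at each other.
struct LoopbackPair {
  LoopbackPair() {
    first.set_peer(&second);
    second.set_peer(&first);
  }
  LoopbackEndpoint first;
  LoopbackEndpoint second;
};
```

Delivering synchronously keeps the sketch short; the asynchronous hop in the real code is what makes `FlushAsyncInvokes()` necessary in the tests that follow.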
>+MediaTransportEncodedAudioFrame CreateAudioFrame(int sequence_number) { >+ static constexpr int kSamplingRateHz = 48000; >+ static constexpr int kStartingSampleIndex = 0; >+ static constexpr int kSamplesPerChannel = 480; >+ static constexpr uint8_t kPayloadType = 17; >+ >+ return MediaTransportEncodedAudioFrame( >+ kSamplingRateHz, kStartingSampleIndex, kSamplesPerChannel, >+ sequence_number, MediaTransportEncodedAudioFrame::FrameType::kSpeech, >+ kPayloadType, std::vector<uint8_t>(kSamplesPerChannel)); >+} >+ >+} // namespace >+ >+TEST(LoopbackMediaTransport, AudioWithNoSinkSilentlyIgnored) { >+ std::unique_ptr<rtc::Thread> thread = rtc::Thread::Create(); >+ thread->Start(); >+ MediaTransportPair transport_pair(thread.get()); >+ transport_pair.first()->SendAudioFrame(1, CreateAudioFrame(0)); >+ transport_pair.second()->SendAudioFrame(2, CreateAudioFrame(0)); >+ transport_pair.FlushAsyncInvokes(); >+} >+ >+TEST(LoopbackMediaTransport, AudioDeliveredToSink) { >+ std::unique_ptr<rtc::Thread> thread = rtc::Thread::Create(); >+ thread->Start(); >+ MediaTransportPair transport_pair(thread.get()); >+ testing::StrictMock<MockMediaTransportAudioSinkInterface> sink; >+ EXPECT_CALL(sink, >+ OnData(1, testing::Property( >+ &MediaTransportEncodedAudioFrame::sequence_number, >+ testing::Eq(10)))); >+ transport_pair.second()->SetReceiveAudioSink(&sink); >+ transport_pair.first()->SendAudioFrame(1, CreateAudioFrame(10)); >+ >+ transport_pair.FlushAsyncInvokes(); >+ transport_pair.second()->SetReceiveAudioSink(nullptr); >+} >+ >+TEST(LoopbackMediaTransport, DataDeliveredToSink) { >+ std::unique_ptr<rtc::Thread> thread = rtc::Thread::Create(); >+ thread->Start(); >+ MediaTransportPair transport_pair(thread.get()); >+ >+ MockDataChannelSink sink; >+ transport_pair.first()->SetDataSink(&sink); >+ >+ const int channel_id = 1; >+ EXPECT_CALL(sink, >+ OnDataReceived( >+ channel_id, DataMessageType::kText, >+ testing::Property<rtc::CopyOnWriteBuffer, const char*>( >+ 
&rtc::CopyOnWriteBuffer::cdata, testing::StrEq("foo")))); >+ >+ SendDataParams params; >+ params.type = DataMessageType::kText; >+ rtc::CopyOnWriteBuffer buffer("foo"); >+ transport_pair.second()->SendData(channel_id, params, buffer); >+ >+ transport_pair.FlushAsyncInvokes(); >+ transport_pair.first()->SetDataSink(nullptr); >+} >+ >+TEST(LoopbackMediaTransport, CloseDeliveredToSink) { >+ std::unique_ptr<rtc::Thread> thread = rtc::Thread::Create(); >+ thread->Start(); >+ MediaTransportPair transport_pair(thread.get()); >+ >+ MockDataChannelSink first_sink; >+ transport_pair.first()->SetDataSink(&first_sink); >+ >+ MockDataChannelSink second_sink; >+ transport_pair.second()->SetDataSink(&second_sink); >+ >+ const int channel_id = 1; >+ { >+ testing::InSequence s; >+ EXPECT_CALL(second_sink, OnChannelClosing(channel_id)); >+ EXPECT_CALL(second_sink, OnChannelClosed(channel_id)); >+ EXPECT_CALL(first_sink, OnChannelClosed(channel_id)); >+ } >+ >+ transport_pair.first()->CloseChannel(channel_id); >+ >+ transport_pair.FlushAsyncInvokes(); >+ transport_pair.first()->SetDataSink(nullptr); >+ transport_pair.second()->SetDataSink(nullptr); >+} >+ >+TEST(LoopbackMediaTransport, InitialStateDeliveredWhenCallbackSet) { >+ std::unique_ptr<rtc::Thread> thread = rtc::Thread::Create(); >+ thread->Start(); >+ MediaTransportPair transport_pair(thread.get()); >+ >+ MockStateCallback state_callback; >+ >+ EXPECT_CALL(state_callback, OnStateChanged(MediaTransportState::kPending)); >+ transport_pair.first()->SetMediaTransportStateCallback(&state_callback); >+ transport_pair.FlushAsyncInvokes(); >+} >+ >+TEST(LoopbackMediaTransport, ChangedStateDeliveredWhenCallbackSet) { >+ std::unique_ptr<rtc::Thread> thread = rtc::Thread::Create(); >+ thread->Start(); >+ MediaTransportPair transport_pair(thread.get()); >+ >+ transport_pair.SetState(MediaTransportState::kWritable); >+ transport_pair.FlushAsyncInvokes(); >+ >+ MockStateCallback state_callback; >+ >+ EXPECT_CALL(state_callback, 
OnStateChanged(MediaTransportState::kWritable)); >+ transport_pair.first()->SetMediaTransportStateCallback(&state_callback); >+ transport_pair.FlushAsyncInvokes(); >+} >+ >+TEST(LoopbackMediaTransport, StateChangeDeliveredToCallback) { >+ std::unique_ptr<rtc::Thread> thread = rtc::Thread::Create(); >+ thread->Start(); >+ MediaTransportPair transport_pair(thread.get()); >+ >+ MockStateCallback state_callback; >+ >+ EXPECT_CALL(state_callback, OnStateChanged(MediaTransportState::kPending)); >+ EXPECT_CALL(state_callback, OnStateChanged(MediaTransportState::kWritable)); >+ transport_pair.first()->SetMediaTransportStateCallback(&state_callback); >+ transport_pair.SetState(MediaTransportState::kWritable); >+ transport_pair.FlushAsyncInvokes(); >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_frame_decryptor.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_frame_decryptor.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..f4b54f966cd8c06158566a4b50573529549f0aec >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_frame_decryptor.cc >@@ -0,0 +1,18 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "api/test/mock_frame_decryptor.h" >+ >+namespace webrtc { >+ >+MockFrameDecryptor::MockFrameDecryptor() = default; >+MockFrameDecryptor::~MockFrameDecryptor() = default; >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_frame_decryptor.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_frame_decryptor.h >new file mode 100644 >index 0000000000000000000000000000000000000000..e80396554f4e05bf72c8f7ccdd67862984ad1dc1 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_frame_decryptor.h >@@ -0,0 +1,40 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#ifndef API_TEST_MOCK_FRAME_DECRYPTOR_H_ >+#define API_TEST_MOCK_FRAME_DECRYPTOR_H_ >+ >+#include <vector> >+ >+#include "api/crypto/framedecryptorinterface.h" >+#include "test/gmock.h" >+ >+namespace webrtc { >+ >+class MockFrameDecryptor : public FrameDecryptorInterface { >+ public: >+ MockFrameDecryptor(); >+ ~MockFrameDecryptor() override; >+ >+ MOCK_METHOD6(Decrypt, >+ int(cricket::MediaType, >+ const std::vector<uint32_t>&, >+ rtc::ArrayView<const uint8_t>, >+ rtc::ArrayView<const uint8_t>, >+ rtc::ArrayView<uint8_t>, >+ size_t*)); >+ >+ MOCK_METHOD2(GetMaxPlaintextByteSize, >+ size_t(cricket::MediaType, size_t encrypted_frame_size)); >+}; >+ >+} // namespace webrtc >+ >+#endif // API_TEST_MOCK_FRAME_DECRYPTOR_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_frame_encryptor.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_frame_encryptor.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..0972259f057ec72b471df07312a04a4918dee26c >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_frame_encryptor.cc >@@ -0,0 +1,19 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "api/test/mock_frame_encryptor.h" >+#include "test/gmock.h" >+ >+namespace webrtc { >+ >+MockFrameEncryptor::MockFrameEncryptor() = default; >+MockFrameEncryptor::~MockFrameEncryptor() = default; >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_frame_encryptor.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_frame_encryptor.h >new file mode 100644 >index 0000000000000000000000000000000000000000..c4fb1fde87b825a17a73fdf4e6060dda6f01dbc0 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_frame_encryptor.h >@@ -0,0 +1,38 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#ifndef API_TEST_MOCK_FRAME_ENCRYPTOR_H_ >+#define API_TEST_MOCK_FRAME_ENCRYPTOR_H_ >+ >+#include "api/crypto/frameencryptorinterface.h" >+#include "test/gmock.h" >+ >+namespace webrtc { >+ >+class MockFrameEncryptor : public FrameEncryptorInterface { >+ public: >+ MockFrameEncryptor(); >+ ~MockFrameEncryptor() override; >+ >+ MOCK_METHOD6(Encrypt, >+ int(cricket::MediaType, >+ uint32_t, >+ rtc::ArrayView<const uint8_t>, >+ rtc::ArrayView<const uint8_t>, >+ rtc::ArrayView<uint8_t>, >+ size_t*)); >+ >+ MOCK_METHOD2(GetMaxCiphertextByteSize, >+ size_t(cricket::MediaType media_type, size_t frame_size)); >+}; >+ >+} // namespace webrtc >+ >+#endif // API_TEST_MOCK_FRAME_ENCRYPTOR_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_video_bitrate_allocator_factory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_video_bitrate_allocator_factory.h >new file mode 100644 >index 0000000000000000000000000000000000000000..0cae061ab75dcd667570dbb7cf333cf5506f1f84 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_video_bitrate_allocator_factory.h >@@ -0,0 +1,37 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#ifndef API_TEST_MOCK_VIDEO_BITRATE_ALLOCATOR_FACTORY_H_ >+#define API_TEST_MOCK_VIDEO_BITRATE_ALLOCATOR_FACTORY_H_ >+ >+#include <memory> >+ >+#include "api/video/video_bitrate_allocator_factory.h" >+#include "test/gmock.h" >+ >+namespace webrtc { >+ >+class MockVideoBitrateAllocatorFactory >+ : public webrtc::VideoBitrateAllocatorFactory { >+ public: >+ virtual std::unique_ptr<VideoBitrateAllocator> CreateVideoBitrateAllocator( >+ const VideoCodec& codec) { >+ return std::unique_ptr<VideoBitrateAllocator>( >+ CreateVideoBitrateAllocatorProxy(codec)); >+ } >+ ~MockVideoBitrateAllocatorFactory() { Die(); } >+ MOCK_METHOD1(CreateVideoBitrateAllocatorProxy, >+ VideoBitrateAllocator*(const VideoCodec&)); >+ MOCK_METHOD0(Die, void()); >+}; >+ >+} // namespace webrtc >+ >+#endif // API_TEST_MOCK_VIDEO_BITRATE_ALLOCATOR_FACTORY_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_video_decoder.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_video_decoder.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..85ed0e16782aa9039a4dea663c9561fe3e7158d3 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_video_decoder.cc >@@ -0,0 +1,20 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "api/test/mock_video_decoder.h" >+ >+namespace webrtc { >+ >+MockDecodedImageCallback::MockDecodedImageCallback() = default; >+MockDecodedImageCallback::~MockDecodedImageCallback() = default; >+MockVideoDecoder::MockVideoDecoder() = default; >+MockVideoDecoder::~MockVideoDecoder() = default; >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_video_decoder.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_video_decoder.h >new file mode 100644 >index 0000000000000000000000000000000000000000..56ff5463fde1407dbd821dfa4e35ea43c31ad0f0 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_video_decoder.h >@@ -0,0 +1,57 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#ifndef API_TEST_MOCK_VIDEO_DECODER_H_ >+#define API_TEST_MOCK_VIDEO_DECODER_H_ >+ >+#include "api/video_codecs/video_decoder.h" >+#include "test/gmock.h" >+ >+namespace webrtc { >+ >+class MockDecodedImageCallback : public DecodedImageCallback { >+ public: >+ MockDecodedImageCallback(); >+ ~MockDecodedImageCallback() override; >+ >+ MOCK_METHOD1(Decoded, int32_t(VideoFrame& decodedImage)); // NOLINT >+ MOCK_METHOD2(Decoded, >+ int32_t(VideoFrame& decodedImage, // NOLINT >+ int64_t decode_time_ms)); >+ MOCK_METHOD3(Decoded, >+ void(VideoFrame& decodedImage, // NOLINT >+ absl::optional<int32_t> decode_time_ms, >+ absl::optional<uint8_t> qp)); >+ MOCK_METHOD1(ReceivedDecodedReferenceFrame, >+ int32_t(const uint64_t pictureId)); >+ MOCK_METHOD1(ReceivedDecodedFrame, int32_t(const uint64_t pictureId)); >+}; >+ >+class MockVideoDecoder : public VideoDecoder { >+ public: >+ MockVideoDecoder(); >+ ~MockVideoDecoder() override; >+ >+ MOCK_METHOD2(InitDecode, >+ int32_t(const VideoCodec* codecSettings, int32_t numberOfCores)); >+ MOCK_METHOD4(Decode, >+ int32_t(const EncodedImage& inputImage, >+ bool missingFrames, >+ const CodecSpecificInfo* codecSpecificInfo, >+ int64_t renderTimeMs)); >+ MOCK_METHOD1(RegisterDecodeCompleteCallback, >+ int32_t(DecodedImageCallback* callback)); >+ MOCK_METHOD0(Release, int32_t()); >+ MOCK_METHOD0(Copy, VideoDecoder*()); >+}; >+ >+} // namespace webrtc >+ >+#endif // API_TEST_MOCK_VIDEO_DECODER_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_video_encoder.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_video_encoder.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..a0d82b1c33b1fc27ebaee948a6b9bc8e3cf39ea7 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_video_encoder.cc >@@ -0,0 +1,20 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. 
>+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "api/test/mock_video_encoder.h" >+ >+namespace webrtc { >+ >+MockEncodedImageCallback::MockEncodedImageCallback() = default; >+MockEncodedImageCallback::~MockEncodedImageCallback() = default; >+MockVideoEncoder::MockVideoEncoder() = default; >+MockVideoEncoder::~MockVideoEncoder() = default; >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_video_encoder.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_video_encoder.h >new file mode 100644 >index 0000000000000000000000000000000000000000..62f17bab884255be8890f5648e1d9a77d78a9fdc >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/mock_video_encoder.h >@@ -0,0 +1,57 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#ifndef API_TEST_MOCK_VIDEO_ENCODER_H_ >+#define API_TEST_MOCK_VIDEO_ENCODER_H_ >+ >+#include <vector> >+ >+#include "api/video_codecs/video_encoder.h" >+#include "test/gmock.h" >+ >+namespace webrtc { >+ >+class MockEncodedImageCallback : public EncodedImageCallback { >+ public: >+ MockEncodedImageCallback(); >+ ~MockEncodedImageCallback(); >+ MOCK_METHOD3(OnEncodedImage, >+ Result(const EncodedImage& encodedImage, >+ const CodecSpecificInfo* codecSpecificInfo, >+ const RTPFragmentationHeader* fragmentation)); >+}; >+ >+class MockVideoEncoder : public VideoEncoder { >+ public: >+ MockVideoEncoder(); >+ ~MockVideoEncoder(); >+ MOCK_CONST_METHOD2(Version, int32_t(int8_t* version, int32_t length)); >+ MOCK_METHOD3(InitEncode, >+ int32_t(const VideoCodec* codecSettings, >+ int32_t numberOfCores, >+ size_t maxPayloadSize)); >+ MOCK_METHOD3(Encode, >+ int32_t(const VideoFrame& inputImage, >+ const CodecSpecificInfo* codecSpecificInfo, >+ const std::vector<FrameType>* frame_types)); >+ MOCK_METHOD1(RegisterEncodeCompleteCallback, >+ int32_t(EncodedImageCallback* callback)); >+ MOCK_METHOD0(Release, int32_t()); >+ MOCK_METHOD0(Reset, int32_t()); >+ MOCK_METHOD2(SetRates, int32_t(uint32_t newBitRate, uint32_t frameRate)); >+ MOCK_METHOD2(SetRateAllocation, >+ int32_t(const VideoBitrateAllocation& newBitRate, >+ uint32_t frameRate)); >+ MOCK_CONST_METHOD0(GetEncoderInfo, EncoderInfo(void)); >+}; >+ >+} // namespace webrtc >+ >+#endif // API_TEST_MOCK_VIDEO_ENCODER_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/simulated_network.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/simulated_network.h >index aec300caa2b3e4fd8eb6eb29569b52cf81ec6574..c03b4ca4cbf28b5adb1b9d56153ef9cbac1521f3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/simulated_network.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/simulated_network.h >@@ -42,11 +42,11 @@ struct PacketDeliveryInfo { > uint64_t packet_id; > }; > >-// 
DefaultNetworkSimulationConfig is a default network behavior configuration >-// for default network behavior that will be used by WebRTC if no custom >+// BuiltInNetworkBehaviorConfig is the built-in network behavior >+// configuration that will be used by WebRTC if no custom > // NetworkBehaviorInterface is provided. >-struct DefaultNetworkSimulationConfig { >- DefaultNetworkSimulationConfig() {} >+struct BuiltInNetworkBehaviorConfig { >+ BuiltInNetworkBehaviorConfig() {} > // Queue length in number of packets. > size_t queue_length_packets = 0; > // Delay in addition to capacity induced delay. >@@ -75,9 +75,6 @@ class NetworkBehaviorInterface { > virtual ~NetworkBehaviorInterface() = default; > }; > >-// Deprecated. DO NOT USE. Use NetworkBehaviorInterface instead. >-using NetworkSimulationInterface = NetworkBehaviorInterface; >- > } // namespace webrtc > > #endif // API_TEST_SIMULATED_NETWORK_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/simulcast_test_fixture.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/simulcast_test_fixture.h >index e7eab241348da180ca5c83d565a31191cf889472..5270d133064a0d14928b318ed0b8260e08f4296d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/simulcast_test_fixture.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/simulcast_test_fixture.h >@@ -33,6 +33,7 @@ class SimulcastTestFixture { > virtual void TestSpatioTemporalLayers333PatternEncoder() = 0; > virtual void TestSpatioTemporalLayers321PatternEncoder() = 0; > virtual void TestStrideEncodeDecode() = 0; >+ virtual void TestDecodeWidthHeightSet() = 0; > }; > > } // namespace test >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/test_dependency_factory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/test_dependency_factory.cc >index 3e59bc095c87474970b54d86172770e54792de04..179c623940f2bf4ff0bb091dd79399d9bc5dad76 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/test_dependency_factory.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/test_dependency_factory.cc >@@ -11,6 +11,7 @@ > #include <memory> > #include <utility> > >+#include "absl/memory/memory.h" > #include "api/test/test_dependency_factory.h" > #include "rtc_base/thread_checker.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/video/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/video/BUILD.gn >new file mode 100644 >index 0000000000000000000000000000000000000000..9b63a4acb3da2f4a59ea92d70fe7ea2f8042b44e >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/video/BUILD.gn >@@ -0,0 +1,24 @@ >+# Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+# >+# Use of this source code is governed by a BSD-style license >+# that can be found in the LICENSE file in the root of the source >+# tree. An additional intellectual property rights grant can be found >+# in the file PATENTS. All contributing project authors may >+# be found in the AUTHORS file in the root of the source tree. >+ >+import("../../../webrtc.gni") >+ >+rtc_source_set("function_video_factory") { >+ visibility = [ "*" ] >+ testonly = true >+ public = [ >+ "function_video_decoder_factory.h", >+ "function_video_encoder_factory.h", >+ ] >+ >+ deps = [ >+ "../../../rtc_base:checks", >+ "../../video_codecs:video_codecs_api", >+ "//third_party/abseil-cpp/absl/memory", >+ ] >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/video/function_video_decoder_factory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/video/function_video_decoder_factory.h >new file mode 100644 >index 0000000000000000000000000000000000000000..03a43239976a01139ef709e294e038f1990492f9 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/video/function_video_decoder_factory.h >@@ -0,0 +1,56 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. 
All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef API_TEST_VIDEO_FUNCTION_VIDEO_DECODER_FACTORY_H_ >+#define API_TEST_VIDEO_FUNCTION_VIDEO_DECODER_FACTORY_H_ >+ >+#include <functional> >+#include <memory> >+#include <utility> >+#include <vector> >+ >+#include "api/video_codecs/sdp_video_format.h" >+#include "api/video_codecs/video_decoder_factory.h" >+#include "rtc_base/checks.h" >+ >+namespace webrtc { >+namespace test { >+ >+// A decoder factory producing decoders by calling a supplied create function. >+class FunctionVideoDecoderFactory final : public VideoDecoderFactory { >+ public: >+ explicit FunctionVideoDecoderFactory( >+ std::function<std::unique_ptr<VideoDecoder>()> create) >+ : create_([create](const SdpVideoFormat&) { return create(); }) {} >+ explicit FunctionVideoDecoderFactory( >+ std::function<std::unique_ptr<VideoDecoder>(const SdpVideoFormat&)> >+ create) >+ : create_(std::move(create)) {} >+ >+ // Unused by tests. 
>+ std::vector<SdpVideoFormat> GetSupportedFormats() const override { >+ RTC_NOTREACHED(); >+ return {}; >+ } >+ >+ std::unique_ptr<VideoDecoder> CreateVideoDecoder( >+ const SdpVideoFormat& format) override { >+ return create_(format); >+ } >+ >+ private: >+ const std::function<std::unique_ptr<VideoDecoder>(const SdpVideoFormat&)> >+ create_; >+}; >+ >+} // namespace test >+} // namespace webrtc >+ >+#endif // API_TEST_VIDEO_FUNCTION_VIDEO_DECODER_FACTORY_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/video/function_video_encoder_factory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/video/function_video_encoder_factory.h >new file mode 100644 >index 0000000000000000000000000000000000000000..85f848cd1fdc85328a2402db2bad81ad0178cc29 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/video/function_video_encoder_factory.h >@@ -0,0 +1,65 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef API_TEST_VIDEO_FUNCTION_VIDEO_ENCODER_FACTORY_H_ >+#define API_TEST_VIDEO_FUNCTION_VIDEO_ENCODER_FACTORY_H_ >+ >+#include <functional> >+#include <memory> >+#include <utility> >+#include <vector> >+ >+#include "api/video_codecs/sdp_video_format.h" >+#include "api/video_codecs/video_encoder_factory.h" >+#include "rtc_base/checks.h" >+ >+namespace webrtc { >+namespace test { >+ >+// An encoder factory producing encoders by calling a supplied create >+// function. 
>+class FunctionVideoEncoderFactory final : public VideoEncoderFactory { >+ public: >+ explicit FunctionVideoEncoderFactory( >+ std::function<std::unique_ptr<VideoEncoder>()> create) >+ : create_([create](const SdpVideoFormat&) { return create(); }) {} >+ explicit FunctionVideoEncoderFactory( >+ std::function<std::unique_ptr<VideoEncoder>(const SdpVideoFormat&)> >+ create) >+ : create_(std::move(create)) {} >+ >+ // Unused by tests. >+ std::vector<SdpVideoFormat> GetSupportedFormats() const override { >+ RTC_NOTREACHED(); >+ return {}; >+ } >+ >+ CodecInfo QueryVideoEncoder( >+ const SdpVideoFormat& /* format */) const override { >+ CodecInfo codec_info; >+ codec_info.is_hardware_accelerated = false; >+ codec_info.has_internal_source = false; >+ return codec_info; >+ } >+ >+ std::unique_ptr<VideoEncoder> CreateVideoEncoder( >+ const SdpVideoFormat& format) override { >+ return create_(format); >+ } >+ >+ private: >+ const std::function<std::unique_ptr<VideoEncoder>(const SdpVideoFormat&)> >+ create_; >+}; >+ >+} // namespace test >+} // namespace webrtc >+ >+#endif // API_TEST_VIDEO_FUNCTION_VIDEO_ENCODER_FACTORY_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/video_quality_test_fixture.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/video_quality_test_fixture.h >index ff1472e9a58e61354e805192528f3ef875cec923..069a24353de7ebccde392c068a5d4e05cc627cf3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/video_quality_test_fixture.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/video_quality_test_fixture.h >@@ -82,15 +82,11 @@ class VideoQualityTestFixtureInterface { > std::string graph_data_output_filename; > std::string graph_title; > } analyzer; >- // Deprecated. DO NOT USE. Use config instead. This is not pipe actually, >- // it is just configuration, that will be passed to default implementation >- // of simulation layer. >- DefaultNetworkSimulationConfig pipe; > // Config for default simulation implementation. 
Must be nullopt if > // `sender_network` and `receiver_network` in InjectionComponents are > // non-null. May be nullopt even if `sender_network` and `receiver_network` > // are null; in that case, a default config will be used. >- absl::optional<DefaultNetworkSimulationConfig> config; >+ absl::optional<BuiltInNetworkBehaviorConfig> config; > struct SS { // Spatial scalability. > std::vector<VideoStream> streams; // If empty, one stream is assumed. > size_t selected_stream; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/videocodec_test_fixture.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/videocodec_test_fixture.h >index a0b8083d7236c37a068beb7012de36d6637bbf56..925c60c3c92c9f5ebd149d53f188335731a1e424 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/videocodec_test_fixture.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/test/videocodec_test_fixture.h >@@ -123,10 +123,6 @@ class VideoCodecTestFixture { > webrtc::H264PacketizationMode::NonInterleaved; > } h264_codec_settings; > >- // Should hardware accelerated codecs be used? >- bool hw_encoder = false; >- bool hw_decoder = false; >- > // Custom checker that will be called for each frame. 
> const EncodedFrameChecker* encoded_frame_checker = nullptr; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/BUILD.gn >index f473179dd14adfa4ff61f1c224b0ef97a2c4a2c8..357ff0054bd0db6de234ee73ecb978ff68865bc8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/BUILD.gn >@@ -19,7 +19,15 @@ rtc_source_set("bitrate_settings") { > ] > } > >+rtc_source_set("enums") { >+ visibility = [ "*" ] >+ sources = [ >+ "enums.h", >+ ] >+} >+ > rtc_static_library("network_control") { >+ visibility = [ "*" ] > sources = [ > "network_control.h", > "network_types.cc", >@@ -35,6 +43,19 @@ rtc_static_library("network_control") { > ] > } > >+rtc_static_library("goog_cc") { >+ visibility = [ "*" ] >+ sources = [ >+ "goog_cc_factory.cc", >+ "goog_cc_factory.h", >+ ] >+ deps = [ >+ ":network_control", >+ "../../modules/congestion_controller/goog_cc", >+ "//third_party/abseil-cpp/absl/memory", >+ ] >+} >+ > if (rtc_include_tests) { > rtc_source_set("network_control_test") { > testonly = true >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/OWNERS b/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/OWNERS >new file mode 100644 >index 0000000000000000000000000000000000000000..5991f6fc56a079d56a9c9310ba6f7828db34e11f >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/OWNERS >@@ -0,0 +1,2 @@ >+srte@webrtc.org >+terelius@webrtc.org >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/enums.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/enums.h >new file mode 100644 >index 0000000000000000000000000000000000000000..b1d5770cb962aa7987f890481931b0648300dec7 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/enums.h >@@ -0,0 +1,32 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. 
>+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef API_TRANSPORT_ENUMS_H_ >+#define API_TRANSPORT_ENUMS_H_ >+ >+namespace webrtc { >+ >+// See https://w3c.github.io/webrtc-pc/#rtcicetransportstate >+// Note that kFailed is currently not a terminal state, and a transport might >+// incorrectly be marked as failed while gathering candidates, see >+// bugs.webrtc.org/8833 >+enum class IceTransportState { >+ kNew, >+ kChecking, >+ kConnected, >+ kCompleted, >+ kFailed, >+ kDisconnected, >+ kClosed, >+}; >+ >+} // namespace webrtc >+ >+#endif // API_TRANSPORT_ENUMS_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/goog_cc_factory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/goog_cc_factory.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..119e2dcf84abcb8d3ffdc6963919bb7d2471e088 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/goog_cc_factory.cc >@@ -0,0 +1,43 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "api/transport/goog_cc_factory.h" >+ >+#include "absl/memory/memory.h" >+#include "modules/congestion_controller/goog_cc/goog_cc_network_control.h" >+namespace webrtc { >+GoogCcNetworkControllerFactory::GoogCcNetworkControllerFactory( >+ RtcEventLog* event_log) >+ : event_log_(event_log) {} >+ >+std::unique_ptr<NetworkControllerInterface> >+GoogCcNetworkControllerFactory::Create(NetworkControllerConfig config) { >+ return absl::make_unique<GoogCcNetworkController>(event_log_, config, false); >+} >+ >+TimeDelta GoogCcNetworkControllerFactory::GetProcessInterval() const { >+ const int64_t kUpdateIntervalMs = 25; >+ return TimeDelta::ms(kUpdateIntervalMs); >+} >+ >+GoogCcFeedbackNetworkControllerFactory::GoogCcFeedbackNetworkControllerFactory( >+ RtcEventLog* event_log) >+ : event_log_(event_log) {} >+ >+std::unique_ptr<NetworkControllerInterface> >+GoogCcFeedbackNetworkControllerFactory::Create(NetworkControllerConfig config) { >+ return absl::make_unique<GoogCcNetworkController>(event_log_, config, true); >+} >+ >+TimeDelta GoogCcFeedbackNetworkControllerFactory::GetProcessInterval() const { >+ const int64_t kUpdateIntervalMs = 25; >+ return TimeDelta::ms(kUpdateIntervalMs); >+} >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/goog_cc_factory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/goog_cc_factory.h >new file mode 100644 >index 0000000000000000000000000000000000000000..2e3b3172d02fd6ffa0b1f3ffe338ce57e2a574d4 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/goog_cc_factory.h >@@ -0,0 +1,47 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. 
All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef API_TRANSPORT_GOOG_CC_FACTORY_H_ >+#define API_TRANSPORT_GOOG_CC_FACTORY_H_ >+#include <memory> >+ >+#include "api/transport/network_control.h" >+ >+namespace webrtc { >+class RtcEventLog; >+ >+class GoogCcNetworkControllerFactory >+ : public NetworkControllerFactoryInterface { >+ public: >+ explicit GoogCcNetworkControllerFactory(RtcEventLog*); >+ std::unique_ptr<NetworkControllerInterface> Create( >+ NetworkControllerConfig config) override; >+ TimeDelta GetProcessInterval() const override; >+ >+ private: >+ RtcEventLog* const event_log_; >+}; >+ >+// Factory to create packet feedback only GoogCC, this can be used for >+// connections providing packet receive time feedback but no other reports. >+class GoogCcFeedbackNetworkControllerFactory >+ : public NetworkControllerFactoryInterface { >+ public: >+ explicit GoogCcFeedbackNetworkControllerFactory(RtcEventLog*); >+ std::unique_ptr<NetworkControllerInterface> Create( >+ NetworkControllerConfig config) override; >+ TimeDelta GetProcessInterval() const override; >+ >+ private: >+ RtcEventLog* const event_log_; >+}; >+} // namespace webrtc >+ >+#endif // API_TRANSPORT_GOOG_CC_FACTORY_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/network_types.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/network_types.cc >index 48bdcab162bbe2a08f2af22db025af93b71f8c3d..80214de1a2381247875ba13f1c1aaad6e30b1846 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/network_types.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/network_types.cc >@@ -38,7 +38,7 @@ std::vector<PacketResult> TransportPacketsFeedback::ReceivedWithSendInfo() > const { > std::vector<PacketResult> res; > for (const PacketResult& fb : packet_feedbacks) { >- if (fb.receive_time.IsFinite() && fb.sent_packet.has_value()) { >+ if (fb.receive_time.IsFinite()) { > 
res.push_back(fb); > } > } >@@ -48,7 +48,7 @@ std::vector<PacketResult> TransportPacketsFeedback::ReceivedWithSendInfo() > std::vector<PacketResult> TransportPacketsFeedback::LostWithSendInfo() const { > std::vector<PacketResult> res; > for (const PacketResult& fb : packet_feedbacks) { >- if (fb.receive_time.IsPlusInfinity() && fb.sent_packet.has_value()) { >+ if (fb.receive_time.IsPlusInfinity()) { > res.push_back(fb); > } > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/network_types.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/network_types.h >index ad20abc53d710f820d884af62dcf915440b6b98a..0f1d7ab68334c3d40ef09a29bd376f727fc6fe7a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/network_types.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/network_types.h >@@ -88,12 +88,13 @@ struct PacedPacketInfo { > struct SentPacket { > Timestamp send_time = Timestamp::PlusInfinity(); > DataSize size = DataSize::Zero(); >+ DataSize prior_unacked_data = DataSize::Zero(); > PacedPacketInfo pacing_info; > // Transport independent sequence number, any tracked packet should have a > // sequence number that is unique over the whole call and increasing by 1 for > // each packet. > int64_t sequence_number; >- // Data in flight when the packet was sent, including the packet. >+ // Tracked data in flight when the packet was sent, excluding unacked data. > DataSize data_in_flight = DataSize::Zero(); > }; > >@@ -125,7 +126,7 @@ struct PacketResult { > PacketResult(const PacketResult&); > ~PacketResult(); > >- absl::optional<SentPacket> sent_packet; >+ SentPacket sent_packet; > Timestamp receive_time = Timestamp::PlusInfinity(); > }; > >@@ -139,6 +140,9 @@ struct TransportPacketsFeedback { > DataSize prior_in_flight = DataSize::Zero(); > std::vector<PacketResult> packet_feedbacks; > >+ // Arrival times for messages without send time information. 
>+ std::vector<Timestamp> sendless_arrival_times; >+ > std::vector<PacketResult> ReceivedWithSendInfo() const; > std::vector<PacketResult> LostWithSendInfo() const; > std::vector<PacketResult> PacketsWithFeedback() const; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/test/network_control_tester.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/test/network_control_tester.cc >index 2edabd0717e5949a0be7744ac6c043dd9e9ee029..c590ae88bc4130a78c16d5d083216b7c95b2b033 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/test/network_control_tester.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/transport/test/network_control_tester.cc >@@ -87,7 +87,7 @@ void NetworkControllerTester::RunSimulation(TimeDelta duration, > if (state_.congestion_window && state_.congestion_window->IsFinite()) { > DataSize data_in_flight = DataSize::Zero(); > for (PacketResult& packet : outstanding_packets_) >- data_in_flight += packet.sent_packet->size; >+ data_in_flight += packet.sent_packet.size; > if (data_in_flight > *state_.congestion_window) > send_packet = false; > } >@@ -98,7 +98,7 @@ void NetworkControllerTester::RunSimulation(TimeDelta duration, > sent_packet.sequence_number = packet_sequence_number_++; > sent_packet.data_in_flight = sent_packet.size; > for (PacketResult& packet : outstanding_packets_) >- sent_packet.data_in_flight += packet.sent_packet->size; >+ sent_packet.data_in_flight += packet.sent_packet.size; > Update(&state_, controller_->OnSentPacket(sent_packet)); > > TimeDelta time_in_flight = sent_packet.size / actual_bandwidth; >@@ -120,7 +120,7 @@ void NetworkControllerTester::RunSimulation(TimeDelta duration, > TransportPacketsFeedback feedback; > feedback.prior_in_flight = DataSize::Zero(); > for (PacketResult& packet : outstanding_packets_) >- feedback.prior_in_flight += packet.sent_packet->size; >+ feedback.prior_in_flight += packet.sent_packet.size; > while (!outstanding_packets_.empty() && > 
current_time_ >= outstanding_packets_.front().receive_time + > propagation_delay) { >@@ -131,7 +131,7 @@ void NetworkControllerTester::RunSimulation(TimeDelta duration, > feedback.packet_feedbacks.back().receive_time + propagation_delay; > feedback.data_in_flight = DataSize::Zero(); > for (PacketResult& packet : outstanding_packets_) >- feedback.data_in_flight += packet.sent_packet->size; >+ feedback.data_in_flight += packet.sent_packet.size; > Update(&state_, controller_->OnTransportPacketsFeedback(feedback)); > } > current_time_ += packet_interval; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/BUILD.gn >index 7dbddf4e622f4ed7fbc1f73fcc952de3428fbe6b..7e150f176679b23343571e1d148b2b30929bce02 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/BUILD.gn >@@ -9,6 +9,7 @@ > import("../../webrtc.gni") > > rtc_source_set("data_rate") { >+ visibility = [ "*" ] > sources = [ > "data_rate.cc", > "data_rate.h", >@@ -18,11 +19,13 @@ rtc_source_set("data_rate") { > ":data_size", > ":time_delta", > "../../rtc_base:checks", >- "../../rtc_base:rtc_base_approved", >+ "../../rtc_base:stringutils", >+ "../../rtc_base/units:unit_base", > ] > } > > rtc_source_set("data_size") { >+ visibility = [ "*" ] > sources = [ > "data_size.cc", > "data_size.h", >@@ -30,10 +33,13 @@ rtc_source_set("data_size") { > > deps = [ > "../../rtc_base:checks", >- "../../rtc_base:rtc_base_approved", >+ "../../rtc_base:stringutils", >+ "../../rtc_base/units:unit_base", > ] > } >+ > rtc_source_set("time_delta") { >+ visibility = [ "*" ] > sources = [ > "time_delta.cc", > "time_delta.h", >@@ -41,11 +47,13 @@ rtc_source_set("time_delta") { > > deps = [ > "../../rtc_base:checks", >- "../../rtc_base:rtc_base_approved", >+ "../../rtc_base:stringutils", >+ "../../rtc_base/units:unit_base", > ] > } > > rtc_source_set("timestamp") { >+ visibility = [ "*" 
] > sources = [ > "timestamp.cc", > "timestamp.h", >@@ -54,7 +62,8 @@ rtc_source_set("timestamp") { > deps = [ > ":time_delta", > "../../rtc_base:checks", >- "../../rtc_base:rtc_base_approved", >+ "../../rtc_base:stringutils", >+ "../../rtc_base/units:unit_base", > ] > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/data_rate.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/data_rate.cc >index 9170627715849809fc0dfcd6c3855e078775eea2..d72d958c028f976b2c150f2d67df0a3cbdc0f60b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/data_rate.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/data_rate.cc >@@ -14,7 +14,7 @@ > > namespace webrtc { > >-std::string ToString(const DataRate& value) { >+std::string ToString(DataRate value) { > char buf[64]; > rtc::SimpleStringBuilder sb(buf); > if (value.IsInfinite()) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/data_rate.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/data_rate.h >index 4bb988b65888bb934addaadfd0629f6196262e92..7119284874e92b7c54e8d2cda5555892a5a3bf64 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/data_rate.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/data_rate.h >@@ -15,22 +15,17 @@ > #include <ostream> // no-presubmit-check TODO(webrtc:8982) > #endif // UNIT_TEST > >-#include <stdint.h> >-#include <algorithm> >-#include <cmath> > #include <limits> > #include <string> >- >-#include "rtc_base/checks.h" >-#include "rtc_base/numerics/safe_conversions.h" >+#include <type_traits> > > #include "api/units/data_size.h" > #include "api/units/time_delta.h" >+#include "rtc_base/checks.h" >+#include "rtc_base/units/unit_base.h" > > namespace webrtc { > namespace data_rate_impl { >-constexpr int64_t kPlusInfinityVal = std::numeric_limits<int64_t>::max(); >- > inline int64_t Microbits(const DataSize& size) { > constexpr int64_t kMaxBeforeConversion = > std::numeric_limits<int64_t>::max() / 8000000; >@@ 
-43,184 +38,64 @@ inline int64_t Microbits(const DataSize& size) { > // DataRate is a class that represents a given data rate. This can be used to > // represent bandwidth, encoding bitrate, etc. The internal storage is bits per > // second (bps). >-class DataRate { >+class DataRate final : public rtc_units_impl::RelativeUnit<DataRate> { > public: > DataRate() = delete; >- static constexpr DataRate Zero() { return DataRate(0); } >- static constexpr DataRate Infinity() { >- return DataRate(data_rate_impl::kPlusInfinityVal); >- } >+ static constexpr DataRate Infinity() { return PlusInfinity(); } > template <int64_t bps> > static constexpr DataRate BitsPerSec() { >- static_assert(bps >= 0, ""); >- static_assert(bps < data_rate_impl::kPlusInfinityVal, ""); >- return DataRate(bps); >+ return FromStaticValue<bps>(); > } > template <int64_t kbps> > static constexpr DataRate KilobitsPerSec() { >- static_assert(kbps >= 0, ""); >- static_assert(kbps < data_rate_impl::kPlusInfinityVal / 1000, ""); >- return DataRate(kbps * 1000); >+ return FromStaticFraction<kbps, 1000>(); > } >- >- template < >- typename T, >- typename std::enable_if<std::is_integral<T>::value>::type* = nullptr> >- static DataRate bps(T bits_per_second) { >- RTC_DCHECK_GE(bits_per_second, 0); >- RTC_DCHECK_LT(bits_per_second, data_rate_impl::kPlusInfinityVal); >- return DataRate(rtc::dchecked_cast<int64_t>(bits_per_second)); >- } >- template < >- typename T, >- typename std::enable_if<std::is_integral<T>::value>::type* = nullptr> >- static DataRate kbps(T kilobits_per_sec) { >- RTC_DCHECK_GE(kilobits_per_sec, 0); >- RTC_DCHECK_LT(kilobits_per_sec, data_rate_impl::kPlusInfinityVal / 1000); >- return DataRate::bps(rtc::dchecked_cast<int64_t>(kilobits_per_sec) * 1000); >- } >- >- template <typename T, >- typename std::enable_if<std::is_floating_point<T>::value>::type* = >- nullptr> >- static DataRate bps(T bits_per_second) { >- if (bits_per_second == std::numeric_limits<T>::infinity()) { >- return Infinity(); 
>- } else { >- RTC_DCHECK(!std::isnan(bits_per_second)); >- RTC_DCHECK_GE(bits_per_second, 0); >- RTC_DCHECK_LT(bits_per_second, data_rate_impl::kPlusInfinityVal); >- return DataRate(rtc::dchecked_cast<int64_t>(bits_per_second)); >- } >+ template <typename T> >+ static constexpr DataRate bps(T bits_per_second) { >+ return FromValue(bits_per_second); > } >- template <typename T, >- typename std::enable_if<std::is_floating_point<T>::value>::type* = >- nullptr> >- static DataRate kbps(T kilobits_per_sec) { >- return DataRate::bps(kilobits_per_sec * 1e3); >+ template <typename T> >+ static constexpr DataRate kbps(T kilobits_per_sec) { >+ return FromFraction<1000>(kilobits_per_sec); > } >- > template <typename T = int64_t> >- typename std::enable_if<std::is_integral<T>::value, T>::type bps() const { >- RTC_DCHECK(IsFinite()); >- return rtc::dchecked_cast<T>(bits_per_sec_); >+ constexpr T bps() const { >+ return ToValue<T>(); > } > template <typename T = int64_t> >- typename std::enable_if<std::is_integral<T>::value, T>::type kbps() const { >- RTC_DCHECK(IsFinite()); >- return rtc::dchecked_cast<T>(UnsafeKilobitsPerSec()); >- } >- >- template <typename T> >- typename std::enable_if<std::is_floating_point<T>::value, >- T>::type constexpr bps() const { >- return IsInfinite() ? std::numeric_limits<T>::infinity() : bits_per_sec_; >- } >- template <typename T> >- typename std::enable_if<std::is_floating_point<T>::value, >- T>::type constexpr kbps() const { >- return bps<T>() * 1e-3; >+ T kbps() const { >+ return ToFraction<1000, T>(); > } >- > constexpr int64_t bps_or(int64_t fallback_value) const { >- return IsFinite() ? bits_per_sec_ : fallback_value; >+ return ToValueOr(fallback_value); > } > constexpr int64_t kbps_or(int64_t fallback_value) const { >- return IsFinite() ? 
UnsafeKilobitsPerSec() : fallback_value; >- } >- >- constexpr bool IsZero() const { return bits_per_sec_ == 0; } >- constexpr bool IsInfinite() const { >- return bits_per_sec_ == data_rate_impl::kPlusInfinityVal; >- } >- constexpr bool IsFinite() const { return !IsInfinite(); } >- DataRate Clamped(DataRate min_rate, DataRate max_rate) const { >- return std::max(min_rate, std::min(*this, max_rate)); >- } >- void Clamp(DataRate min_rate, DataRate max_rate) { >- *this = Clamped(min_rate, max_rate); >- } >- DataRate operator-(const DataRate& other) const { >- return DataRate::bps(bps() - other.bps()); >- } >- DataRate operator+(const DataRate& other) const { >- return DataRate::bps(bps() + other.bps()); >- } >- DataRate& operator-=(const DataRate& other) { >- *this = *this - other; >- return *this; >- } >- DataRate& operator+=(const DataRate& other) { >- *this = *this + other; >- return *this; >- } >- constexpr double operator/(const DataRate& other) const { >- return bps<double>() / other.bps<double>(); >- } >- constexpr bool operator==(const DataRate& other) const { >- return bits_per_sec_ == other.bits_per_sec_; >- } >- constexpr bool operator!=(const DataRate& other) const { >- return bits_per_sec_ != other.bits_per_sec_; >- } >- constexpr bool operator<=(const DataRate& other) const { >- return bits_per_sec_ <= other.bits_per_sec_; >- } >- constexpr bool operator>=(const DataRate& other) const { >- return bits_per_sec_ >= other.bits_per_sec_; >- } >- constexpr bool operator>(const DataRate& other) const { >- return bits_per_sec_ > other.bits_per_sec_; >- } >- constexpr bool operator<(const DataRate& other) const { >- return bits_per_sec_ < other.bits_per_sec_; >+ return ToFractionOr<1000>(fallback_value); > } > > private: > // Bits per second used internally to simplify debugging by making the value > // more recognizable. 
>- explicit constexpr DataRate(int64_t bits_per_second) >- : bits_per_sec_(bits_per_second) {} >- constexpr int64_t UnsafeKilobitsPerSec() const { >- return (bits_per_sec_ + 500) / 1000; >- } >- int64_t bits_per_sec_; >+ friend class rtc_units_impl::UnitBase<DataRate>; >+ using RelativeUnit::RelativeUnit; >+ static constexpr bool one_sided = true; > }; > >-inline DataRate operator*(const DataRate& rate, const double& scalar) { >- return DataRate::bps(std::round(rate.bps() * scalar)); >-} >-inline DataRate operator*(const double& scalar, const DataRate& rate) { >- return rate * scalar; >-} >-inline DataRate operator*(const DataRate& rate, const int64_t& scalar) { >- return DataRate::bps(rate.bps() * scalar); >-} >-inline DataRate operator*(const int64_t& scalar, const DataRate& rate) { >- return rate * scalar; >-} >-inline DataRate operator*(const DataRate& rate, const int32_t& scalar) { >- return DataRate::bps(rate.bps() * scalar); >-} >-inline DataRate operator*(const int32_t& scalar, const DataRate& rate) { >- return rate * scalar; >-} >- >-inline DataRate operator/(const DataSize& size, const TimeDelta& duration) { >+inline DataRate operator/(const DataSize size, const TimeDelta duration) { > return DataRate::bps(data_rate_impl::Microbits(size) / duration.us()); > } >-inline TimeDelta operator/(const DataSize& size, const DataRate& rate) { >+inline TimeDelta operator/(const DataSize size, const DataRate rate) { > return TimeDelta::us(data_rate_impl::Microbits(size) / rate.bps()); > } >-inline DataSize operator*(const DataRate& rate, const TimeDelta& duration) { >+inline DataSize operator*(const DataRate rate, const TimeDelta duration) { > int64_t microbits = rate.bps() * duration.us(); > return DataSize::bytes((microbits + 4000000) / 8000000); > } >-inline DataSize operator*(const TimeDelta& duration, const DataRate& rate) { >+inline DataSize operator*(const TimeDelta duration, const DataRate rate) { > return rate * duration; > } > >-std::string ToString(const 
DataRate& value); >+std::string ToString(DataRate value); > > #ifdef UNIT_TEST > inline std::ostream& operator<<( // no-presubmit-check TODO(webrtc:8982) >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/data_rate_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/data_rate_unittest.cc >index 8e5b6605948fd88fda51533c15ee531e8c9b3d9b..996298c1fcee8bff67f0cf78dd35241b16721aec 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/data_rate_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/data_rate_unittest.cc >@@ -130,6 +130,9 @@ TEST(DataRateTest, MathOperations) { > > EXPECT_EQ(rate_a / rate_b, static_cast<double>(kValueA) / kValueB); > >+ EXPECT_EQ((rate_a / 10).bps(), kValueA / 10); >+ EXPECT_NEAR((rate_a / 0.5).bps(), kValueA * 2, 1); >+ > DataRate mutable_rate = DataRate::bps(kValueA); > mutable_rate += rate_b; > EXPECT_EQ(mutable_rate.bps(), kValueA + kValueB); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/data_size.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/data_size.cc >index 4440f89d03c2ff84a1b50cfabae43f1e979a9d4e..8a87786c77a3e97eab433b94ab65acca117588a7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/data_size.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/data_size.cc >@@ -14,7 +14,7 @@ > > namespace webrtc { > >-std::string ToString(const DataSize& value) { >+std::string ToString(DataSize value) { > char buf[64]; > rtc::SimpleStringBuilder sb(buf); > if (value.IsInfinite()) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/data_size.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/data_size.h >index 8958b2469730cf200efa4824a88e84c6fb4ffc65..b4cbb652f3c907c097196eb5f722e9643a19f319 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/data_size.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/data_size.h >@@ -15,143 +15,44 @@ > #include <ostream> // no-presubmit-check 
TODO(webrtc:8982)
> #endif // UNIT_TEST
>
>-#include <stdint.h>
>-#include <cmath>
>-#include <limits>
> #include <string>
> #include <type_traits>
>
>-#include "rtc_base/checks.h"
>-#include "rtc_base/numerics/safe_conversions.h"
>+#include "rtc_base/units/unit_base.h"
>
> namespace webrtc {
>-namespace data_size_impl {
>-constexpr int64_t kPlusInfinityVal = std::numeric_limits<int64_t>::max();
>-}  // namespace data_size_impl
>-
> // DataSize is a class representing a count of bytes.
>-class DataSize {
>+class DataSize final : public rtc_units_impl::RelativeUnit<DataSize> {
> public:
> DataSize() = delete;
>- static constexpr DataSize Zero() { return DataSize(0); }
>- static constexpr DataSize Infinity() {
>- return DataSize(data_size_impl::kPlusInfinityVal);
>- }
>+ static constexpr DataSize Infinity() { return PlusInfinity(); }
> template <int64_t bytes>
> static constexpr DataSize Bytes() {
>- static_assert(bytes >= 0, "");
>- static_assert(bytes < data_size_impl::kPlusInfinityVal, "");
>- return DataSize(bytes);
>+ return FromStaticValue<bytes>();
> }
>
> template <
> typename T,
>- typename std::enable_if<std::is_integral<T>::value>::type* = nullptr>
>- static DataSize bytes(T bytes) {
>- RTC_DCHECK_GE(bytes, 0);
>- RTC_DCHECK_LT(bytes, data_size_impl::kPlusInfinityVal);
>- return DataSize(rtc::dchecked_cast<int64_t>(bytes));
>- }
>-
>- template <typename T,
>- typename std::enable_if<std::is_floating_point<T>::value>::type* =
>- nullptr>
>+ typename std::enable_if<std::is_arithmetic<T>::value>::type* = nullptr>
> static DataSize bytes(T bytes) {
>- if (bytes == std::numeric_limits<T>::infinity()) {
>- return Infinity();
>- } else {
>- RTC_DCHECK(!std::isnan(bytes));
>- RTC_DCHECK_GE(bytes, 0);
>- RTC_DCHECK_LT(bytes, data_size_impl::kPlusInfinityVal);
>- return DataSize(rtc::dchecked_cast<int64_t>(bytes));
>- }
>+ return FromValue(bytes);
> }
>- 
> template <typename T = int64_t>
>- typename std::enable_if<std::is_integral<T>::value, T>::type bytes() const {
>- RTC_DCHECK(IsFinite()); >- return rtc::dchecked_cast<T>(bytes_); >- } >- >- template <typename T> >- constexpr typename std::enable_if<std::is_floating_point<T>::value, T>::type >- bytes() const { >- return IsInfinite() ? std::numeric_limits<T>::infinity() : bytes_; >+ typename std::enable_if<std::is_arithmetic<T>::value, T>::type bytes() const { >+ return ToValue<T>(); > } > > constexpr int64_t bytes_or(int64_t fallback_value) const { >- return IsFinite() ? bytes_ : fallback_value; >- } >- >- constexpr bool IsZero() const { return bytes_ == 0; } >- constexpr bool IsInfinite() const { >- return bytes_ == data_size_impl::kPlusInfinityVal; >- } >- constexpr bool IsFinite() const { return !IsInfinite(); } >- DataSize operator-(const DataSize& other) const { >- return DataSize::bytes(bytes() - other.bytes()); >- } >- DataSize operator+(const DataSize& other) const { >- return DataSize::bytes(bytes() + other.bytes()); >- } >- DataSize& operator-=(const DataSize& other) { >- *this = *this - other; >- return *this; >- } >- DataSize& operator+=(const DataSize& other) { >- *this = *this + other; >- return *this; >- } >- constexpr double operator/(const DataSize& other) const { >- return bytes<double>() / other.bytes<double>(); >- } >- constexpr bool operator==(const DataSize& other) const { >- return bytes_ == other.bytes_; >- } >- constexpr bool operator!=(const DataSize& other) const { >- return bytes_ != other.bytes_; >- } >- constexpr bool operator<=(const DataSize& other) const { >- return bytes_ <= other.bytes_; >- } >- constexpr bool operator>=(const DataSize& other) const { >- return bytes_ >= other.bytes_; >- } >- constexpr bool operator>(const DataSize& other) const { >- return bytes_ > other.bytes_; >- } >- constexpr bool operator<(const DataSize& other) const { >- return bytes_ < other.bytes_; >+ return ToValueOr(fallback_value); > } > > private: >- explicit constexpr DataSize(int64_t bytes) : bytes_(bytes) {} >- int64_t bytes_; >+ friend class 
rtc_units_impl::UnitBase<DataSize>; >+ using RelativeUnit::RelativeUnit; >+ static constexpr bool one_sided = true; > }; > >-inline DataSize operator*(const DataSize& size, const double& scalar) { >- return DataSize::bytes(std::round(size.bytes() * scalar)); >-} >-inline DataSize operator*(const double& scalar, const DataSize& size) { >- return size * scalar; >-} >-inline DataSize operator*(const DataSize& size, const int64_t& scalar) { >- return DataSize::bytes(size.bytes() * scalar); >-} >-inline DataSize operator*(const int64_t& scalar, const DataSize& size) { >- return size * scalar; >-} >-inline DataSize operator*(const DataSize& size, const int32_t& scalar) { >- return DataSize::bytes(size.bytes() * scalar); >-} >-inline DataSize operator*(const int32_t& scalar, const DataSize& size) { >- return size * scalar; >-} >-inline DataSize operator/(const DataSize& size, const int64_t& scalar) { >- return DataSize::bytes(size.bytes() / scalar); >-} >- >-std::string ToString(const DataSize& value); >+std::string ToString(DataSize value); > > #ifdef UNIT_TEST > inline std::ostream& operator<<( // no-presubmit-check TODO(webrtc:8982) >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/time_delta.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/time_delta.cc >index d38387a5669aee4b952adccca6d884bfee7bc13c..f90451b429a7a0b1d99d3ba968c3d6615cbad91b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/time_delta.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/time_delta.cc >@@ -14,7 +14,7 @@ > > namespace webrtc { > >-std::string ToString(const TimeDelta& value) { >+std::string ToString(TimeDelta value) { > char buf[64]; > rtc::SimpleStringBuilder sb(buf); > if (value.IsPlusInfinity()) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/time_delta.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/time_delta.h >index ec364172aa18df3a84fccdcafa1ffaaf92cd7dfd..64583698937120f94e2f6113dd6852cbc924db2d 100644 
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/time_delta.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/time_delta.h >@@ -15,21 +15,13 @@ > #include <ostream> // no-presubmit-check TODO(webrtc:8982) > #endif // UNIT_TEST > >-#include <stdint.h> >-#include <cmath> > #include <cstdlib> >-#include <limits> > #include <string> >+#include <type_traits> > >-#include "rtc_base/checks.h" >-#include "rtc_base/numerics/safe_conversions.h" >+#include "rtc_base/units/unit_base.h" > > namespace webrtc { >-namespace timedelta_impl { >-constexpr int64_t kPlusInfinityVal = std::numeric_limits<int64_t>::max(); >-constexpr int64_t kMinusInfinityVal = std::numeric_limits<int64_t>::min(); >-} // namespace timedelta_impl >- > // TimeDelta represents the difference between two timestamps. Commonly this can > // be a duration. However since two Timestamps are not guaranteed to have the > // same epoch (they might come from different computers, making exact >@@ -37,245 +29,69 @@ constexpr int64_t kMinusInfinityVal = std::numeric_limits<int64_t>::min(); > // undefined. To simplify usage, it can be constructed and converted to > // different units, specifically seconds (s), milliseconds (ms) and > // microseconds (us). 
>-class TimeDelta { >+class TimeDelta final : public rtc_units_impl::RelativeUnit<TimeDelta> { > public: > TimeDelta() = delete; >- static constexpr TimeDelta Zero() { return TimeDelta(0); } >- static constexpr TimeDelta PlusInfinity() { >- return TimeDelta(timedelta_impl::kPlusInfinityVal); >- } >- static constexpr TimeDelta MinusInfinity() { >- return TimeDelta(timedelta_impl::kMinusInfinityVal); >- } > template <int64_t seconds> > static constexpr TimeDelta Seconds() { >- static_assert(seconds > timedelta_impl::kMinusInfinityVal / 1000000, ""); >- static_assert(seconds < timedelta_impl::kPlusInfinityVal / 1000000, ""); >- return TimeDelta(seconds * 1000000); >+ return FromStaticFraction<seconds, 1000000>(); > } > template <int64_t ms> > static constexpr TimeDelta Millis() { >- static_assert(ms > timedelta_impl::kMinusInfinityVal / 1000, ""); >- static_assert(ms < timedelta_impl::kPlusInfinityVal / 1000, ""); >- return TimeDelta(ms * 1000); >+ return FromStaticFraction<ms, 1000>(); > } > template <int64_t us> > static constexpr TimeDelta Micros() { >- static_assert(us > timedelta_impl::kMinusInfinityVal, ""); >- static_assert(us < timedelta_impl::kPlusInfinityVal, ""); >- return TimeDelta(us); >- } >- >- template < >- typename T, >- typename std::enable_if<std::is_integral<T>::value>::type* = nullptr> >- static TimeDelta seconds(T seconds) { >- RTC_DCHECK_GT(seconds, timedelta_impl::kMinusInfinityVal / 1000000); >- RTC_DCHECK_LT(seconds, timedelta_impl::kPlusInfinityVal / 1000000); >- return TimeDelta(rtc::dchecked_cast<int64_t>(seconds) * 1000000); >+ return FromStaticValue<us>(); > } >- template < >- typename T, >- typename std::enable_if<std::is_integral<T>::value>::type* = nullptr> >- static TimeDelta ms(T milliseconds) { >- RTC_DCHECK_GT(milliseconds, timedelta_impl::kMinusInfinityVal / 1000); >- RTC_DCHECK_LT(milliseconds, timedelta_impl::kPlusInfinityVal / 1000); >- return TimeDelta(rtc::dchecked_cast<int64_t>(milliseconds) * 1000); >- } >- template < >- 
typename T, >- typename std::enable_if<std::is_integral<T>::value>::type* = nullptr> >- static TimeDelta us(T microseconds) { >- RTC_DCHECK_GT(microseconds, timedelta_impl::kMinusInfinityVal); >- RTC_DCHECK_LT(microseconds, timedelta_impl::kPlusInfinityVal); >- return TimeDelta(rtc::dchecked_cast<int64_t>(microseconds)); >- } >- >- template <typename T, >- typename std::enable_if<std::is_floating_point<T>::value>::type* = >- nullptr> >+ template <typename T> > static TimeDelta seconds(T seconds) { >- return TimeDelta::us(seconds * 1e6); >+ return FromFraction<1000000>(seconds); > } >- template <typename T, >- typename std::enable_if<std::is_floating_point<T>::value>::type* = >- nullptr> >+ template <typename T> > static TimeDelta ms(T milliseconds) { >- return TimeDelta::us(milliseconds * 1e3); >+ return FromFraction<1000>(milliseconds); > } >- template <typename T, >- typename std::enable_if<std::is_floating_point<T>::value>::type* = >- nullptr> >+ template <typename T> > static TimeDelta us(T microseconds) { >- if (microseconds == std::numeric_limits<T>::infinity()) { >- return PlusInfinity(); >- } else if (microseconds == -std::numeric_limits<T>::infinity()) { >- return MinusInfinity(); >- } else { >- RTC_DCHECK(!std::isnan(microseconds)); >- RTC_DCHECK_GT(microseconds, timedelta_impl::kMinusInfinityVal); >- RTC_DCHECK_LT(microseconds, timedelta_impl::kPlusInfinityVal); >- return TimeDelta(rtc::dchecked_cast<int64_t>(microseconds)); >- } >+ return FromValue(microseconds); > } >- > template <typename T = int64_t> >- typename std::enable_if<std::is_integral<T>::value, T>::type seconds() const { >- RTC_DCHECK(IsFinite()); >- return rtc::dchecked_cast<T>(UnsafeSeconds()); >+ T seconds() const { >+ return ToFraction<1000000, T>(); > } > template <typename T = int64_t> >- typename std::enable_if<std::is_integral<T>::value, T>::type ms() const { >- RTC_DCHECK(IsFinite()); >- return rtc::dchecked_cast<T>(UnsafeMillis()); >+ T ms() const { >+ return ToFraction<1000, 
T>(); > } > template <typename T = int64_t> >- typename std::enable_if<std::is_integral<T>::value, T>::type us() const { >- RTC_DCHECK(IsFinite()); >- return rtc::dchecked_cast<T>(microseconds_); >+ T us() const { >+ return ToValue<T>(); > } > template <typename T = int64_t> >- typename std::enable_if<std::is_integral<T>::value, T>::type ns() const { >- RTC_DCHECK_GE(us(), std::numeric_limits<T>::min() / 1000); >- RTC_DCHECK_LE(us(), std::numeric_limits<T>::max() / 1000); >- return rtc::dchecked_cast<T>(us() * 1000); >- } >- >- template <typename T> >- constexpr typename std::enable_if<std::is_floating_point<T>::value, T>::type >- seconds() const { >- return us<T>() * 1e-6; >- } >- template <typename T> >- constexpr typename std::enable_if<std::is_floating_point<T>::value, T>::type >- ms() const { >- return us<T>() * 1e-3; >- } >- template <typename T> >- constexpr typename std::enable_if<std::is_floating_point<T>::value, T>::type >- us() const { >- return IsPlusInfinity() >- ? std::numeric_limits<T>::infinity() >- : IsMinusInfinity() ? -std::numeric_limits<T>::infinity() >- : microseconds_; >- } >- template <typename T> >- constexpr typename std::enable_if<std::is_floating_point<T>::value, T>::type >- ns() const { >- return us<T>() * 1e3; >+ T ns() const { >+ return ToMultiple<1000, T>(); > } > > constexpr int64_t seconds_or(int64_t fallback_value) const { >- return IsFinite() ? UnsafeSeconds() : fallback_value; >+ return ToFractionOr<1000000>(fallback_value); > } > constexpr int64_t ms_or(int64_t fallback_value) const { >- return IsFinite() ? UnsafeMillis() : fallback_value; >+ return ToFractionOr<1000>(fallback_value); > } > constexpr int64_t us_or(int64_t fallback_value) const { >- return IsFinite() ? 
microseconds_ : fallback_value; >+ return ToValueOr(fallback_value); > } > > TimeDelta Abs() const { return TimeDelta::us(std::abs(us())); } >- constexpr bool IsZero() const { return microseconds_ == 0; } >- constexpr bool IsFinite() const { return !IsInfinite(); } >- constexpr bool IsInfinite() const { >- return microseconds_ == timedelta_impl::kPlusInfinityVal || >- microseconds_ == timedelta_impl::kMinusInfinityVal; >- } >- constexpr bool IsPlusInfinity() const { >- return microseconds_ == timedelta_impl::kPlusInfinityVal; >- } >- constexpr bool IsMinusInfinity() const { >- return microseconds_ == timedelta_impl::kMinusInfinityVal; >- } >- TimeDelta operator+(const TimeDelta& other) const { >- if (IsPlusInfinity() || other.IsPlusInfinity()) { >- RTC_DCHECK(!IsMinusInfinity()); >- RTC_DCHECK(!other.IsMinusInfinity()); >- return PlusInfinity(); >- } else if (IsMinusInfinity() || other.IsMinusInfinity()) { >- RTC_DCHECK(!IsPlusInfinity()); >- RTC_DCHECK(!other.IsPlusInfinity()); >- return MinusInfinity(); >- } >- return TimeDelta::us(us() + other.us()); >- } >- TimeDelta operator-(const TimeDelta& other) const { >- if (IsPlusInfinity() || other.IsMinusInfinity()) { >- RTC_DCHECK(!IsMinusInfinity()); >- RTC_DCHECK(!other.IsPlusInfinity()); >- return PlusInfinity(); >- } else if (IsMinusInfinity() || other.IsPlusInfinity()) { >- RTC_DCHECK(!IsPlusInfinity()); >- RTC_DCHECK(!other.IsMinusInfinity()); >- return MinusInfinity(); >- } >- return TimeDelta::us(us() - other.us()); >- } >- TimeDelta& operator-=(const TimeDelta& other) { >- *this = *this - other; >- return *this; >- } >- TimeDelta& operator+=(const TimeDelta& other) { >- *this = *this + other; >- return *this; >- } >- constexpr double operator/(const TimeDelta& other) const { >- return us<double>() / other.us<double>(); >- } >- constexpr bool operator==(const TimeDelta& other) const { >- return microseconds_ == other.microseconds_; >- } >- constexpr bool operator!=(const TimeDelta& other) const { >- return 
microseconds_ != other.microseconds_; >- } >- constexpr bool operator<=(const TimeDelta& other) const { >- return microseconds_ <= other.microseconds_; >- } >- constexpr bool operator>=(const TimeDelta& other) const { >- return microseconds_ >= other.microseconds_; >- } >- constexpr bool operator>(const TimeDelta& other) const { >- return microseconds_ > other.microseconds_; >- } >- constexpr bool operator<(const TimeDelta& other) const { >- return microseconds_ < other.microseconds_; >- } > > private: >- explicit constexpr TimeDelta(int64_t us) : microseconds_(us) {} >- constexpr int64_t UnsafeSeconds() const { >- return (microseconds_ + (microseconds_ >= 0 ? 500000 : -500000)) / 1000000; >- } >- constexpr int64_t UnsafeMillis() const { >- return (microseconds_ + (microseconds_ >= 0 ? 500 : -500)) / 1000; >- } >- int64_t microseconds_; >+ friend class rtc_units_impl::UnitBase<TimeDelta>; >+ using RelativeUnit::RelativeUnit; >+ static constexpr bool one_sided = false; > }; > >-inline TimeDelta operator*(const TimeDelta& delta, const double& scalar) { >- return TimeDelta::us(std::round(delta.us() * scalar)); >-} >-inline TimeDelta operator*(const double& scalar, const TimeDelta& delta) { >- return delta * scalar; >-} >-inline TimeDelta operator*(const TimeDelta& delta, const int64_t& scalar) { >- return TimeDelta::us(delta.us() * scalar); >-} >-inline TimeDelta operator*(const int64_t& scalar, const TimeDelta& delta) { >- return delta * scalar; >-} >-inline TimeDelta operator*(const TimeDelta& delta, const int32_t& scalar) { >- return TimeDelta::us(delta.us() * scalar); >-} >-inline TimeDelta operator*(const int32_t& scalar, const TimeDelta& delta) { >- return delta * scalar; >-} >- >-inline TimeDelta operator/(const TimeDelta& delta, const int64_t& scalar) { >- return TimeDelta::us(delta.us() / scalar); >-} >-std::string ToString(const TimeDelta& value); >+std::string ToString(TimeDelta value); > > #ifdef UNIT_TEST > inline std::ostream& operator<<( // 
no-presubmit-check TODO(webrtc:8982) >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/time_delta_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/time_delta_unittest.cc >index bf8bbce69befed616228d1b4ce44c163bc447e3f..a46ba835cb18f4e5bb62a9a39a873cded37dff13 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/time_delta_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/time_delta_unittest.cc >@@ -10,6 +10,8 @@ > > #include "api/units/time_delta.h" > >+#include <limits> >+ > #include "test/gtest.h" > > namespace webrtc { >@@ -106,6 +108,27 @@ TEST(TimeDeltaTest, ComparisonOperators) { > EXPECT_LT(TimeDelta::MinusInfinity(), TimeDelta::Zero()); > } > >+TEST(TimeDeltaTest, Clamping) { >+ const TimeDelta upper = TimeDelta::ms(800); >+ const TimeDelta lower = TimeDelta::ms(100); >+ const TimeDelta under = TimeDelta::ms(100); >+ const TimeDelta inside = TimeDelta::ms(500); >+ const TimeDelta over = TimeDelta::ms(1000); >+ EXPECT_EQ(under.Clamped(lower, upper), lower); >+ EXPECT_EQ(inside.Clamped(lower, upper), inside); >+ EXPECT_EQ(over.Clamped(lower, upper), upper); >+ >+ TimeDelta mutable_delta = lower; >+ mutable_delta.Clamp(lower, upper); >+ EXPECT_EQ(mutable_delta, lower); >+ mutable_delta = inside; >+ mutable_delta.Clamp(lower, upper); >+ EXPECT_EQ(mutable_delta, inside); >+ mutable_delta = over; >+ mutable_delta.Clamp(lower, upper); >+ EXPECT_EQ(mutable_delta, upper); >+} >+ > TEST(TimeDeltaTest, CanBeInititializedFromLargeInt) { > const int kMaxInt = std::numeric_limits<int>::max(); > EXPECT_EQ(TimeDelta::seconds(kMaxInt).us(), >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/timestamp.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/timestamp.cc >index feb144789ca2b59f024dd9b735010b47d4f5d1d2..d3417cfb05b2f2673e44779f9941e39553d9b387 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/timestamp.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/timestamp.cc >@@ -13,7 +13,7 @@ > #include "rtc_base/strings/string_builder.h" > > namespace webrtc { >-std::string ToString(const Timestamp& value) { >+std::string ToString(Timestamp value) { > char buf[64]; > rtc::SimpleStringBuilder sb(buf); > if (value.IsInfinite()) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/timestamp.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/timestamp.h >index 0298f5da979342bdbc29980876622b2ea040acf1..a6e450f6850f559332c3d56a9ea3ee9402c55d4c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/timestamp.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/units/timestamp.h >@@ -15,189 +15,94 @@ > #include <ostream> // no-presubmit-check TODO(webrtc:8982) > #endif // UNIT_TEST > >-#include <stdint.h> >-#include <limits> > #include <string> >+#include <type_traits> > > #include "api/units/time_delta.h" > #include "rtc_base/checks.h" >-#include "rtc_base/numerics/safe_conversions.h" > > namespace webrtc { >-namespace timestamp_impl { >-constexpr int64_t kPlusInfinityVal = std::numeric_limits<int64_t>::max(); >-constexpr int64_t kMinusInfinityVal = std::numeric_limits<int64_t>::min(); >-} // namespace timestamp_impl >- > // Timestamp represents the time that has passed since some unspecified epoch. > // The epoch is assumed to be before any represented timestamps, this means that > // negative values are not valid. The most notable feature is that the > // difference of two Timestamps results in a TimeDelta. 
>-class Timestamp { >+class Timestamp final : public rtc_units_impl::UnitBase<Timestamp> { > public: > Timestamp() = delete; >- static constexpr Timestamp PlusInfinity() { >- return Timestamp(timestamp_impl::kPlusInfinityVal); >- } >- static constexpr Timestamp MinusInfinity() { >- return Timestamp(timestamp_impl::kMinusInfinityVal); >- } >+ > template <int64_t seconds> > static constexpr Timestamp Seconds() { >- static_assert(seconds >= 0, ""); >- static_assert(seconds < timestamp_impl::kPlusInfinityVal / 1000000, ""); >- return Timestamp(seconds * 1000000); >+ return FromStaticFraction<seconds, 1000000>(); > } > template <int64_t ms> > static constexpr Timestamp Millis() { >- static_assert(ms >= 0, ""); >- static_assert(ms < timestamp_impl::kPlusInfinityVal / 1000, ""); >- return Timestamp(ms * 1000); >+ return FromStaticFraction<ms, 1000>(); > } > template <int64_t us> > static constexpr Timestamp Micros() { >- static_assert(us >= 0, ""); >- static_assert(us < timestamp_impl::kPlusInfinityVal, ""); >- return Timestamp(us); >- } >- >- template < >- typename T, >- typename std::enable_if<std::is_integral<T>::value>::type* = nullptr> >- static Timestamp seconds(T seconds) { >- RTC_DCHECK_GE(seconds, 0); >- RTC_DCHECK_LT(seconds, timestamp_impl::kPlusInfinityVal / 1000000); >- return Timestamp(rtc::dchecked_cast<int64_t>(seconds) * 1000000); >- } >- >- template < >- typename T, >- typename std::enable_if<std::is_integral<T>::value>::type* = nullptr> >- static Timestamp ms(T milliseconds) { >- RTC_DCHECK_GE(milliseconds, 0); >- RTC_DCHECK_LT(milliseconds, timestamp_impl::kPlusInfinityVal / 1000); >- return Timestamp(rtc::dchecked_cast<int64_t>(milliseconds) * 1000); >+ return FromStaticValue<us>(); > } > >- template < >- typename T, >- typename std::enable_if<std::is_integral<T>::value>::type* = nullptr> >- static Timestamp us(T microseconds) { >- RTC_DCHECK_GE(microseconds, 0); >- RTC_DCHECK_LT(microseconds, timestamp_impl::kPlusInfinityVal); >- return 
Timestamp(rtc::dchecked_cast<int64_t>(microseconds)); >- } >- >- template <typename T, >- typename std::enable_if<std::is_floating_point<T>::value>::type* = >- nullptr> >+ template <typename T> > static Timestamp seconds(T seconds) { >- return Timestamp::us(seconds * 1e6); >+ return FromFraction<1000000>(seconds); > } >- >- template <typename T, >- typename std::enable_if<std::is_floating_point<T>::value>::type* = >- nullptr> >+ template <typename T> > static Timestamp ms(T milliseconds) { >- return Timestamp::us(milliseconds * 1e3); >+ return FromFraction<1000>(milliseconds); > } >- template <typename T, >- typename std::enable_if<std::is_floating_point<T>::value>::type* = >- nullptr> >+ template <typename T> > static Timestamp us(T microseconds) { >- if (microseconds == std::numeric_limits<double>::infinity()) { >- return PlusInfinity(); >- } else if (microseconds == -std::numeric_limits<double>::infinity()) { >- return MinusInfinity(); >- } else { >- RTC_DCHECK(!std::isnan(microseconds)); >- RTC_DCHECK_GE(microseconds, 0); >- RTC_DCHECK_LT(microseconds, timestamp_impl::kPlusInfinityVal); >- return Timestamp(rtc::dchecked_cast<int64_t>(microseconds)); >- } >+ return FromValue(microseconds); > } >- > template <typename T = int64_t> >- typename std::enable_if<std::is_integral<T>::value, T>::type seconds() const { >- RTC_DCHECK(IsFinite()); >- return rtc::dchecked_cast<T>(UnsafeSeconds()); >+ T seconds() const { >+ return ToFraction<1000000, T>(); > } > template <typename T = int64_t> >- typename std::enable_if<std::is_integral<T>::value, T>::type ms() const { >- RTC_DCHECK(IsFinite()); >- return rtc::dchecked_cast<T>(UnsafeMillis()); >+ T ms() const { >+ return ToFraction<1000, T>(); > } > template <typename T = int64_t> >- typename std::enable_if<std::is_integral<T>::value, T>::type us() const { >- RTC_DCHECK(IsFinite()); >- return rtc::dchecked_cast<T>(microseconds_); >- } >- >- template <typename T> >- constexpr typename 
std::enable_if<std::is_floating_point<T>::value, T>::type >- seconds() const { >- return us<T>() * 1e-6; >- } >- template <typename T> >- constexpr typename std::enable_if<std::is_floating_point<T>::value, T>::type >- ms() const { >- return us<T>() * 1e-3; >- } >- template <typename T> >- constexpr typename std::enable_if<std::is_floating_point<T>::value, T>::type >- us() const { >- return IsPlusInfinity() >- ? std::numeric_limits<T>::infinity() >- : IsMinusInfinity() ? -std::numeric_limits<T>::infinity() >- : microseconds_; >+ T us() const { >+ return ToValue<T>(); > } > > constexpr int64_t seconds_or(int64_t fallback_value) const { >- return IsFinite() ? UnsafeSeconds() : fallback_value; >+ return ToFractionOr<1000000>(fallback_value); > } > constexpr int64_t ms_or(int64_t fallback_value) const { >- return IsFinite() ? UnsafeMillis() : fallback_value; >+ return ToFractionOr<1000>(fallback_value); > } > constexpr int64_t us_or(int64_t fallback_value) const { >- return IsFinite() ? microseconds_ : fallback_value; >+ return ToValueOr(fallback_value); > } > >- constexpr bool IsFinite() const { return !IsInfinite(); } >- constexpr bool IsInfinite() const { >- return microseconds_ == timedelta_impl::kPlusInfinityVal || >- microseconds_ == timedelta_impl::kMinusInfinityVal; >- } >- constexpr bool IsPlusInfinity() const { >- return microseconds_ == timedelta_impl::kPlusInfinityVal; >- } >- constexpr bool IsMinusInfinity() const { >- return microseconds_ == timedelta_impl::kMinusInfinityVal; >- } >- Timestamp operator+(const TimeDelta& other) const { >- if (IsPlusInfinity() || other.IsPlusInfinity()) { >+ Timestamp operator+(const TimeDelta delta) const { >+ if (IsPlusInfinity() || delta.IsPlusInfinity()) { > RTC_DCHECK(!IsMinusInfinity()); >- RTC_DCHECK(!other.IsMinusInfinity()); >+ RTC_DCHECK(!delta.IsMinusInfinity()); > return PlusInfinity(); >- } else if (IsMinusInfinity() || other.IsMinusInfinity()) { >+ } else if (IsMinusInfinity() || delta.IsMinusInfinity()) { > 
RTC_DCHECK(!IsPlusInfinity()); >- RTC_DCHECK(!other.IsPlusInfinity()); >+ RTC_DCHECK(!delta.IsPlusInfinity()); > return MinusInfinity(); > } >- return Timestamp::us(us() + other.us()); >+ return Timestamp::us(us() + delta.us()); > } >- Timestamp operator-(const TimeDelta& other) const { >- if (IsPlusInfinity() || other.IsMinusInfinity()) { >+ Timestamp operator-(const TimeDelta delta) const { >+ if (IsPlusInfinity() || delta.IsMinusInfinity()) { > RTC_DCHECK(!IsMinusInfinity()); >- RTC_DCHECK(!other.IsPlusInfinity()); >+ RTC_DCHECK(!delta.IsPlusInfinity()); > return PlusInfinity(); >- } else if (IsMinusInfinity() || other.IsPlusInfinity()) { >+ } else if (IsMinusInfinity() || delta.IsPlusInfinity()) { > RTC_DCHECK(!IsPlusInfinity()); >- RTC_DCHECK(!other.IsMinusInfinity()); >+ RTC_DCHECK(!delta.IsMinusInfinity()); > return MinusInfinity(); > } >- return Timestamp::us(us() - other.us()); >+ return Timestamp::us(us() - delta.us()); > } >- TimeDelta operator-(const Timestamp& other) const { >+ TimeDelta operator-(const Timestamp other) const { > if (IsPlusInfinity() || other.IsMinusInfinity()) { > RTC_DCHECK(!IsMinusInfinity()); > RTC_DCHECK(!other.IsPlusInfinity()); >@@ -209,45 +114,22 @@ class Timestamp { > } > return TimeDelta::us(us() - other.us()); > } >- Timestamp& operator-=(const TimeDelta& other) { >- *this = *this - other; >+ Timestamp& operator-=(const TimeDelta delta) { >+ *this = *this - delta; > return *this; > } >- Timestamp& operator+=(const TimeDelta& other) { >- *this = *this + other; >+ Timestamp& operator+=(const TimeDelta delta) { >+ *this = *this + delta; > return *this; > } >- constexpr bool operator==(const Timestamp& other) const { >- return microseconds_ == other.microseconds_; >- } >- constexpr bool operator!=(const Timestamp& other) const { >- return microseconds_ != other.microseconds_; >- } >- constexpr bool operator<=(const Timestamp& other) const { >- return microseconds_ <= other.microseconds_; >- } >- constexpr bool operator>=(const 
Timestamp& other) const { >- return microseconds_ >= other.microseconds_; >- } >- constexpr bool operator>(const Timestamp& other) const { >- return microseconds_ > other.microseconds_; >- } >- constexpr bool operator<(const Timestamp& other) const { >- return microseconds_ < other.microseconds_; >- } > > private: >- explicit constexpr Timestamp(int64_t us) : microseconds_(us) {} >- constexpr int64_t UnsafeSeconds() const { >- return (microseconds_ + 500000) / 1000000; >- } >- constexpr int64_t UnsafeMillis() const { >- return (microseconds_ + 500) / 1000; >- } >- int64_t microseconds_; >+ friend class rtc_units_impl::UnitBase<Timestamp>; >+ using UnitBase::UnitBase; >+ static constexpr bool one_sided = true; > }; > >-std::string ToString(const Timestamp& value); >+std::string ToString(Timestamp value); > > #ifdef UNIT_TEST > inline std::ostream& operator<<( // no-presubmit-check TODO(webrtc:8982) >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/BUILD.gn >index d04126021da5a6c1a292cc5b5289c502eaa5f800..5ed70898f7b7081df0d27b0fdb179a2ce8c91ddd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/BUILD.gn >@@ -13,6 +13,9 @@ rtc_source_set("video_frame") { > sources = [ > "color_space.cc", > "color_space.h", >+ "hdr_metadata.cc", >+ "hdr_metadata.h", >+ "video_codec_type.h", > "video_content_type.cc", > "video_content_type.h", > "video_frame.cc", >@@ -31,6 +34,7 @@ rtc_source_set("video_frame") { > deps = [ > "../../rtc_base:checks", > "../../rtc_base:rtc_base_approved", >+ "../../rtc_base/system:rtc_export", > "//third_party/abseil-cpp/absl/types:optional", > ] > } >@@ -46,6 +50,7 @@ rtc_source_set("video_frame_i420") { > "../../rtc_base:checks", > "../../rtc_base:rtc_base", > "../../rtc_base/memory:aligned_malloc", >+ "../../rtc_base/system:rtc_export", > "//third_party/libyuv", > ] > } >@@ -66,6 +71,24 @@ 
rtc_source_set("video_frame_i010") { > ] > } > >+rtc_source_set("encoded_image") { >+ sources = [ >+ "encoded_image.cc", >+ "encoded_image.h", >+ ] >+ deps = [ >+ ":video_frame", >+ "../..:webrtc_common", >+ >+ # TODO(bugs.webrtc.org/8311) Cleanup when kMaxSpatialLayers is moved to proper include. >+ "../../api/video:video_bitrate_allocation", >+ "../../rtc_base:checks", >+ "../../rtc_base:rtc_base_approved", >+ "../../rtc_base/system:rtc_export", >+ "//third_party/abseil-cpp/absl/types:optional", >+ ] >+} >+ > rtc_source_set("encoded_frame") { > visibility = [ "*" ] > sources = [ >@@ -152,6 +175,7 @@ rtc_source_set("video_stream_encoder") { > ] > > deps = [ >+ ":video_bitrate_allocator_factory", > ":video_frame", > > # For rtpparameters.h >@@ -182,3 +206,22 @@ rtc_source_set("video_stream_encoder_create") { > "//third_party/abseil-cpp/absl/memory", > ] > } >+ >+rtc_static_library("builtin_video_bitrate_allocator_factory") { >+ visibility = [ "*" ] >+ sources = [ >+ "builtin_video_bitrate_allocator_factory.cc", >+ "builtin_video_bitrate_allocator_factory.h", >+ ] >+ >+ deps = [ >+ ":video_bitrate_allocation", >+ ":video_bitrate_allocator_factory", >+ "../../media:rtc_media_base", >+ "../../modules/video_coding:video_coding_utility", >+ "../../modules/video_coding:webrtc_vp9_helpers", >+ "../../rtc_base:ptr_util", >+ "../../rtc_base/system:fallthrough", >+ "//third_party/abseil-cpp/absl/memory", >+ ] >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/builtin_video_bitrate_allocator_factory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/builtin_video_bitrate_allocator_factory.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..70f6ad04cbe2520d9342d4979f95a7402a22d652 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/builtin_video_bitrate_allocator_factory.cc >@@ -0,0 +1,56 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. 
>+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "api/video/builtin_video_bitrate_allocator_factory.h" >+ >+#include "absl/memory/memory.h" >+#include "media/base/codec.h" >+#include "modules/video_coding/codecs/vp9/svc_rate_allocator.h" >+#include "modules/video_coding/utility/default_video_bitrate_allocator.h" >+#include "modules/video_coding/utility/simulcast_rate_allocator.h" >+#include "rtc_base/system/fallthrough.h" >+ >+namespace webrtc { >+ >+namespace { >+ >+class BuiltinVideoBitrateAllocatorFactory >+ : public VideoBitrateAllocatorFactory { >+ public: >+ BuiltinVideoBitrateAllocatorFactory() = default; >+ ~BuiltinVideoBitrateAllocatorFactory() override = default; >+ >+ std::unique_ptr<VideoBitrateAllocator> CreateVideoBitrateAllocator( >+ const VideoCodec& codec) override { >+ std::unique_ptr<VideoBitrateAllocator> rate_allocator; >+ switch (codec.codecType) { >+ case kVideoCodecVP8: >+ RTC_FALLTHROUGH(); >+ case kVideoCodecH264: >+ rate_allocator.reset(new SimulcastRateAllocator(codec)); >+ break; >+ case kVideoCodecVP9: >+ rate_allocator.reset(new SvcRateAllocator(codec)); >+ break; >+ default: >+ rate_allocator.reset(new DefaultVideoBitrateAllocator(codec)); >+ } >+ return rate_allocator; >+ } >+}; >+ >+} // namespace >+ >+std::unique_ptr<VideoBitrateAllocatorFactory> >+CreateBuiltinVideoBitrateAllocatorFactory() { >+ return absl::make_unique<BuiltinVideoBitrateAllocatorFactory>(); >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/builtin_video_bitrate_allocator_factory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/builtin_video_bitrate_allocator_factory.h >new file mode 100644 >index 
0000000000000000000000000000000000000000..ac880a0863e8ce42c14637de7d7b0a4b398dcdb7 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/builtin_video_bitrate_allocator_factory.h >@@ -0,0 +1,25 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef API_VIDEO_BUILTIN_VIDEO_BITRATE_ALLOCATOR_FACTORY_H_ >+#define API_VIDEO_BUILTIN_VIDEO_BITRATE_ALLOCATOR_FACTORY_H_ >+ >+#include <memory> >+ >+#include "api/video/video_bitrate_allocator_factory.h" >+ >+namespace webrtc { >+ >+std::unique_ptr<VideoBitrateAllocatorFactory> >+CreateBuiltinVideoBitrateAllocatorFactory(); >+ >+} // namespace webrtc >+ >+#endif // API_VIDEO_BUILTIN_VIDEO_BITRATE_ALLOCATOR_FACTORY_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/color_space.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/color_space.cc >index a8be5cd8280202d5fcf93ac7c5983b96de12dc64..ad138abe8093d0d6befaf56d2442f899651e4c64 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/color_space.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/color_space.cc >@@ -10,18 +10,72 @@ > > #include "api/video/color_space.h" > >+namespace { >+// Try to convert |enum_value| into the enum class T. |enum_bitmask| is created >+// by the function below. Returns true if conversion was successful, false >+// otherwise.
>+template <typename T> >+bool SetFromUint8(uint8_t enum_value, uint64_t enum_bitmask, T* out) { >+ if ((enum_value < 64) && ((enum_bitmask >> enum_value) & 1)) { >+ *out = static_cast<T>(enum_value); >+ return true; >+ } >+ return false; >+} >+ >+// This function serves as an assert for the constexpr function below. It's on >+// purpose not declared as constexpr so that it causes a build problem if enum >+// values of 64 or above are used. The bitmask and the code generating it would >+// have to be extended if the standard is updated to include enum values >= 64. >+int EnumMustBeLessThan64() { >+ return -1; >+} >+ >+template <typename T, size_t N> >+constexpr int MakeMask(const int index, const int length, T (&values)[N]) { >+ return length > 1 >+ ? (MakeMask(index, 1, values) + >+ MakeMask(index + 1, length - 1, values)) >+ : (static_cast<uint8_t>(values[index]) < 64 >+ ? (uint64_t{1} << static_cast<uint8_t>(values[index])) >+ : EnumMustBeLessThan64()); >+} >+ >+// Create a bitmask where each bit corresponds to one potential enum value. >+// |values| should be an array listing all possible enum values. The bit is set >+// to one if the corresponding enum exists. Only works for enums with values >+// less than 64. 
>+template <typename T, size_t N> >+constexpr uint64_t CreateEnumBitmask(T (&values)[N]) { >+ return MakeMask(0, N, values); >+} >+ >+} // namespace >+ > namespace webrtc { > > ColorSpace::ColorSpace() = default; >+ColorSpace::ColorSpace(const ColorSpace& other) = default; >+ColorSpace::ColorSpace(ColorSpace&& other) = default; >+ColorSpace& ColorSpace::operator=(const ColorSpace& other) = default; > > ColorSpace::ColorSpace(PrimaryID primaries, > TransferID transfer, > MatrixID matrix, > RangeID range) >+ : ColorSpace(primaries, transfer, matrix, range, nullptr) {} >+ >+ColorSpace::ColorSpace(PrimaryID primaries, >+ TransferID transfer, >+ MatrixID matrix, >+ RangeID range, >+ const HdrMetadata* hdr_metadata) > : primaries_(primaries), > transfer_(transfer), > matrix_(matrix), >- range_(range) {} >+ range_(range), >+ hdr_metadata_(hdr_metadata ? absl::make_optional(*hdr_metadata) >+ : absl::nullopt) {} > > ColorSpace::PrimaryID ColorSpace::primaries() const { > return primaries_; >@@ -39,4 +93,61 @@ ColorSpace::RangeID ColorSpace::range() const { > return range_; > } > >+const HdrMetadata* ColorSpace::hdr_metadata() const { >+ return hdr_metadata_ ? 
&*hdr_metadata_ : nullptr; >+} >+ >+bool ColorSpace::set_primaries_from_uint8(uint8_t enum_value) { >+ constexpr PrimaryID kPrimaryIds[] = { >+ PrimaryID::kInvalid, PrimaryID::kBT709, PrimaryID::kUNSPECIFIED, >+ PrimaryID::kBT470M, PrimaryID::kBT470BG, PrimaryID::kSMPTE170M, >+ PrimaryID::kSMPTE240M, PrimaryID::kFILM, PrimaryID::kBT2020, >+ PrimaryID::kSMPTEST428, PrimaryID::kSMPTEST431, PrimaryID::kSMPTEST432, >+ PrimaryID::kJEDECP22}; >+ constexpr uint64_t enum_bitmask = CreateEnumBitmask(kPrimaryIds); >+ >+ return SetFromUint8(enum_value, enum_bitmask, &primaries_); >+} >+ >+bool ColorSpace::set_transfer_from_uint8(uint8_t enum_value) { >+ constexpr TransferID kTransferIds[] = { >+ TransferID::kInvalid, TransferID::kBT709, >+ TransferID::kUNSPECIFIED, TransferID::kGAMMA22, >+ TransferID::kGAMMA28, TransferID::kSMPTE170M, >+ TransferID::kSMPTE240M, TransferID::kLINEAR, >+ TransferID::kLOG, TransferID::kLOG_SQRT, >+ TransferID::kIEC61966_2_4, TransferID::kBT1361_ECG, >+ TransferID::kIEC61966_2_1, TransferID::kBT2020_10, >+ TransferID::kBT2020_12, TransferID::kSMPTEST2084, >+ TransferID::kSMPTEST428, TransferID::kARIB_STD_B67}; >+ constexpr uint64_t enum_bitmask = CreateEnumBitmask(kTransferIds); >+ >+ return SetFromUint8(enum_value, enum_bitmask, &transfer_); >+} >+ >+bool ColorSpace::set_matrix_from_uint8(uint8_t enum_value) { >+ constexpr MatrixID kMatrixIds[] = { >+ MatrixID::kRGB, MatrixID::kBT709, MatrixID::kUNSPECIFIED, >+ MatrixID::kFCC, MatrixID::kBT470BG, MatrixID::kSMPTE170M, >+ MatrixID::kSMPTE240M, MatrixID::kYCOCG, MatrixID::kBT2020_NCL, >+ MatrixID::kBT2020_CL, MatrixID::kSMPTE2085, MatrixID::kCDNCLS, >+ MatrixID::kCDCLS, MatrixID::kBT2100_ICTCP, MatrixID::kInvalid}; >+ constexpr uint64_t enum_bitmask = CreateEnumBitmask(kMatrixIds); >+ >+ return SetFromUint8(enum_value, enum_bitmask, &matrix_); >+} >+ >+bool ColorSpace::set_range_from_uint8(uint8_t enum_value) { >+ constexpr RangeID kRangeIds[] = {RangeID::kInvalid, RangeID::kLimited, >+ 
RangeID::kFull, RangeID::kDerived}; >+ constexpr uint64_t enum_bitmask = CreateEnumBitmask(kRangeIds); >+ >+ return SetFromUint8(enum_value, enum_bitmask, &range_); >+} >+ >+void ColorSpace::set_hdr_metadata(const HdrMetadata* hdr_metadata) { >+ hdr_metadata_ = >+ hdr_metadata ? absl::make_optional(*hdr_metadata) : absl::nullopt; >+} >+ > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/color_space.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/color_space.h >index 8102647b6ff8847a049004cbb59858e08b4d8471..79a15f59b393287bb89b98658ac64b44e186a069 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/color_space.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/color_space.h >@@ -11,95 +11,145 @@ > #ifndef API_VIDEO_COLOR_SPACE_H_ > #define API_VIDEO_COLOR_SPACE_H_ > >+#include <stdint.h> >+ >+#include "absl/types/optional.h" >+#include "api/video/hdr_metadata.h" >+ > namespace webrtc { > >-// Used to represent a color space for the purpose of color conversion. This >-// class only represents color information that can be transferred through the >-// bitstream of WebRTC's internal supported codecs: >+// This class represents color information as specified in T-REC H.273, >+// available from https://www.itu.int/rec/T-REC-H.273. >+// >+// WebRTC's supported codecs: > // - VP9 supports color profiles, see VP9 Bitstream & Decoding Process > // Specification Version 0.6 Section 7.2.2 "Color config semantics" available > // from https://www.webmproject.org. >-// TODO(emircan): Extract these values from decode and add to the existing ones. > // - VP8 only supports BT.601, see > // https://tools.ietf.org/html/rfc6386#section-9.2 >-// - H264 supports different color primaries, transfer characteristics, matrix >-// coefficients and range. See T-REC-H.264 E.2.1, "VUI parameters semantics", >-// available from https://www.itu.int/rec/T-REC-H.264. >+// - H264 uses the exact same representation as T-REC H.273. 
See T-REC-H.264 >+// E.2.1, "VUI parameters semantics", available from >+// https://www.itu.int/rec/T-REC-H.264. >+ > class ColorSpace { > public: >- enum class PrimaryID { >- kInvalid, >- kBT709, >- kBT470M, >- kBT470BG, >- kSMPTE170M, // Identical to BT601 >- kSMPTE240M, >- kFILM, >- kBT2020, >- kSMPTEST428, >- kSMPTEST431, >- kSMPTEST432, >- kJEDECP22, >+ enum class PrimaryID : uint8_t { >+ // The indices are equal to the values specified in T-REC H.273 Table 2. >+ kInvalid = 0, >+ kBT709 = 1, >+ kUNSPECIFIED = 2, >+ kBT470M = 4, >+ kBT470BG = 5, >+ kSMPTE170M = 6, // Identical to BT601 >+ kSMPTE240M = 7, >+ kFILM = 8, >+ kBT2020 = 9, >+ kSMPTEST428 = 10, >+ kSMPTEST431 = 11, >+ kSMPTEST432 = 12, >+ kJEDECP22 = 22, // Identical to EBU3213-E >+ // When adding/removing entries here, please make sure to do the >+ // corresponding change to kPrimaryIds. > }; > >- enum class TransferID { >- kInvalid, >- kBT709, >- kGAMMA22, >- kGAMMA28, >- kSMPTE170M, >- kSMPTE240M, >- kLINEAR, >- kLOG, >- kLOG_SQRT, >- kIEC61966_2_4, >- kBT1361_ECG, >- kIEC61966_2_1, >- kBT2020_10, >- kBT2020_12, >- kSMPTEST2084, >- kSMPTEST428, >- kARIB_STD_B67, >+ enum class TransferID : uint8_t { >+ // The indices are equal to the values specified in T-REC H.273 Table 3. >+ kInvalid = 0, >+ kBT709 = 1, >+ kUNSPECIFIED = 2, >+ kGAMMA22 = 4, >+ kGAMMA28 = 5, >+ kSMPTE170M = 6, >+ kSMPTE240M = 7, >+ kLINEAR = 8, >+ kLOG = 9, >+ kLOG_SQRT = 10, >+ kIEC61966_2_4 = 11, >+ kBT1361_ECG = 12, >+ kIEC61966_2_1 = 13, >+ kBT2020_10 = 14, >+ kBT2020_12 = 15, >+ kSMPTEST2084 = 16, >+ kSMPTEST428 = 17, >+ kARIB_STD_B67 = 18, >+ // When adding/removing entries here, please make sure to do the >+ // corresponding change to kTransferIds. 
> }; > >- enum class MatrixID { >- kInvalid, >- kRGB, >- kBT709, >- kFCC, >- kBT470BG, >- kSMPTE170M, >- kSMPTE240M, >- kYCOCG, >- kBT2020_NCL, >- kBT2020_CL, >- kSMPTE2085, >+ enum class MatrixID : uint8_t { >+ // The indices are equal to the values specified in T-REC H.273 Table 4. >+ kRGB = 0, >+ kBT709 = 1, >+ kUNSPECIFIED = 2, >+ kFCC = 4, >+ kBT470BG = 5, >+ kSMPTE170M = 6, >+ kSMPTE240M = 7, >+ kYCOCG = 8, >+ kBT2020_NCL = 9, >+ kBT2020_CL = 10, >+ kSMPTE2085 = 11, >+ kCDNCLS = 12, >+ kCDCLS = 13, >+ kBT2100_ICTCP = 14, >+ kInvalid = 63, >+ // When adding/removing entries here, please make sure to do the >+ // corresponding change to kMatrixIds. > }; > > enum class RangeID { >- kInvalid, >+ // The indices are equal to the values specified at >+ // https://www.webmproject.org/docs/container/#colour for the element Range. >+ kInvalid = 0, > // Limited Rec. 709 color range with RGB values ranging from 16 to 235. >- kLimited, >+ kLimited = 1, > // Full RGB color range with RGB values from 0 to 255. >- kFull, >+ kFull = 2, >+ // Range is defined by MatrixCoefficients/TransferCharacteristics. >+ kDerived = 3, >+ // When adding/removing entries here, please make sure to do the >+ // corresponding change to kRangeIds.
> }; > > ColorSpace(); >+ ColorSpace(const ColorSpace& other); >+ ColorSpace(ColorSpace&& other); >+ ColorSpace& operator=(const ColorSpace& other); > ColorSpace(PrimaryID primaries, > TransferID transfer, > MatrixID matrix, > RangeID full_range); >+ ColorSpace(PrimaryID primaries, >+ TransferID transfer, >+ MatrixID matrix, >+ RangeID range, >+ const HdrMetadata* hdr_metadata); >+ bool operator==(const ColorSpace& other) const { >+ return primaries_ == other.primaries() && transfer_ == other.transfer() && >+ matrix_ == other.matrix() && range_ == other.range() && >+ ((hdr_metadata_.has_value() && other.hdr_metadata() && >+ *hdr_metadata_ == *other.hdr_metadata()) || >+ (!hdr_metadata_.has_value() && other.hdr_metadata() == nullptr)); >+ } > > PrimaryID primaries() const; > TransferID transfer() const; > MatrixID matrix() const; > RangeID range() const; >+ const HdrMetadata* hdr_metadata() const; >+ >+ bool set_primaries_from_uint8(uint8_t enum_value); >+ bool set_transfer_from_uint8(uint8_t enum_value); >+ bool set_matrix_from_uint8(uint8_t enum_value); >+ bool set_range_from_uint8(uint8_t enum_value); >+ void set_hdr_metadata(const HdrMetadata* hdr_metadata); > > private: > PrimaryID primaries_ = PrimaryID::kInvalid; > TransferID transfer_ = TransferID::kInvalid; > MatrixID matrix_ = MatrixID::kInvalid; > RangeID range_ = RangeID::kInvalid; >+ absl::optional<HdrMetadata> hdr_metadata_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/encoded_image.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/encoded_image.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..e7c6fadee703c89acce249813c1d2224aef2238c >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/encoded_image.cc >@@ -0,0 +1,42 @@ >+/* >+ * Copyright (c) 2012 The WebRTC project authors. All Rights Reserved. 
>+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "api/video/encoded_image.h" >+ >+#include <string.h> >+ >+namespace webrtc { >+ >+// FFmpeg's decoder, used by H264DecoderImpl, requires up to 8 bytes padding due >+// to optimized bitstream readers. See avcodec_decode_video2. >+const size_t EncodedImage::kBufferPaddingBytesH264 = 8; >+ >+size_t EncodedImage::GetBufferPaddingBytes(VideoCodecType codec_type) { >+ switch (codec_type) { >+ case kVideoCodecH264: >+ return kBufferPaddingBytesH264; >+ default: >+ return 0; >+ } >+} >+ >+EncodedImage::EncodedImage() : EncodedImage(nullptr, 0, 0) {} >+ >+EncodedImage::EncodedImage(const EncodedImage&) = default; >+ >+EncodedImage::EncodedImage(uint8_t* buffer, size_t length, size_t size) >+ : _buffer(buffer), _length(length), _size(size) {} >+ >+void EncodedImage::SetEncodeTime(int64_t encode_start_ms, >+ int64_t encode_finish_ms) { >+ timing_.encode_start_ms = encode_start_ms; >+ timing_.encode_finish_ms = encode_finish_ms; >+} >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/encoded_image.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/encoded_image.h >new file mode 100644 >index 0000000000000000000000000000000000000000..a7c719ce3a5678d51ac24fc732039eebd1b90df0 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/encoded_image.h >@@ -0,0 +1,109 @@ >+/* >+ * Copyright (c) 2014 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. 
An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef API_VIDEO_ENCODED_IMAGE_H_ >+#define API_VIDEO_ENCODED_IMAGE_H_ >+ >+#include <stdint.h> >+ >+#include "absl/types/optional.h" >+#include "api/video/color_space.h" >+#include "api/video/video_bitrate_allocation.h" >+#include "api/video/video_codec_type.h" >+#include "api/video/video_content_type.h" >+#include "api/video/video_rotation.h" >+#include "api/video/video_timing.h" >+#include "common_types.h" // NOLINT(build/include) >+#include "rtc_base/checks.h" >+#include "rtc_base/system/rtc_export.h" >+ >+namespace webrtc { >+ >+// TODO(bug.webrtc.org/9378): This is a legacy api class, which is slowly being >+// cleaned up. Direct use of its members is strongly discouraged. >+class RTC_EXPORT EncodedImage { >+ public: >+ static const size_t kBufferPaddingBytesH264; >+ >+ // Some decoders require encoded image buffers to be padded with a small >+ // number of additional bytes (due to over-reading byte readers). >+ static size_t GetBufferPaddingBytes(VideoCodecType codec_type); >+ >+ EncodedImage(); >+ EncodedImage(const EncodedImage&); >+ EncodedImage(uint8_t* buffer, size_t length, size_t size); >+ >+ // TODO(nisse): Change style to timestamp(), set_timestamp(), for consistency >+ // with the VideoFrame class. >+ // Set frame timestamp (90kHz). >+ void SetTimestamp(uint32_t timestamp) { timestamp_rtp_ = timestamp; } >+ >+ // Get frame timestamp (90kHz). 
>+ uint32_t Timestamp() const { return timestamp_rtp_; } >+ >+ void SetEncodeTime(int64_t encode_start_ms, int64_t encode_finish_ms); >+ >+ absl::optional<int> SpatialIndex() const { >+ return spatial_index_; >+ } >+ void SetSpatialIndex(absl::optional<int> spatial_index) { >+ RTC_DCHECK_GE(spatial_index.value_or(0), 0); >+ RTC_DCHECK_LT(spatial_index.value_or(0), kMaxSpatialLayers); >+ spatial_index_ = spatial_index; >+ } >+ >+ const webrtc::ColorSpace* ColorSpace() const { >+ return color_space_ ? &*color_space_ : nullptr; >+ } >+ void SetColorSpace(const webrtc::ColorSpace* color_space) { >+ color_space_ = >+ color_space ? absl::make_optional(*color_space) : absl::nullopt; >+ } >+ >+ uint32_t _encodedWidth = 0; >+ uint32_t _encodedHeight = 0; >+ // NTP time of the capture time in local timebase in milliseconds. >+ int64_t ntp_time_ms_ = 0; >+ int64_t capture_time_ms_ = 0; >+ FrameType _frameType = kVideoFrameDelta; >+ uint8_t* _buffer; >+ size_t _length; >+ size_t _size; >+ VideoRotation rotation_ = kVideoRotation_0; >+ VideoContentType content_type_ = VideoContentType::UNSPECIFIED; >+ bool _completeFrame = false; >+ int qp_ = -1; // Quantizer value. >+ >+ // When an application indicates non-zero values here, it is taken as an >+ // indication that all future frames will be constrained with those limits >+ // until the application indicates a change again. 
>+ PlayoutDelay playout_delay_ = {-1, -1}; >+ >+ struct Timing { >+ uint8_t flags = VideoSendTiming::kInvalid; >+ int64_t encode_start_ms = 0; >+ int64_t encode_finish_ms = 0; >+ int64_t packetization_finish_ms = 0; >+ int64_t pacer_exit_ms = 0; >+ int64_t network_timestamp_ms = 0; >+ int64_t network2_timestamp_ms = 0; >+ int64_t receive_start_ms = 0; >+ int64_t receive_finish_ms = 0; >+ } timing_; >+ >+ private: >+ uint32_t timestamp_rtp_ = 0; >+ absl::optional<int> spatial_index_; >+ absl::optional<webrtc::ColorSpace> color_space_; >+}; >+ >+} // namespace webrtc >+ >+#endif // API_VIDEO_ENCODED_IMAGE_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/hdr_metadata.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/hdr_metadata.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..e2a669c98a3c0f079af5c176945ec29828bd2ddd >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/hdr_metadata.cc >@@ -0,0 +1,21 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "api/video/hdr_metadata.h" >+ >+namespace webrtc { >+ >+HdrMasteringMetadata::Chromaticity::Chromaticity() = default; >+ >+HdrMasteringMetadata::HdrMasteringMetadata() = default; >+ >+HdrMetadata::HdrMetadata() = default; >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/hdr_metadata.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/hdr_metadata.h >new file mode 100644 >index 0000000000000000000000000000000000000000..676a900c99486421430f04dbed2b288713cee3bd >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/hdr_metadata.h >@@ -0,0 +1,87 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef API_VIDEO_HDR_METADATA_H_ >+#define API_VIDEO_HDR_METADATA_H_ >+ >+#include <stdint.h> >+ >+namespace webrtc { >+ >+// SMPTE ST 2086 mastering metadata, >+// see https://ieeexplore.ieee.org/document/8353899. >+struct HdrMasteringMetadata { >+ struct Chromaticity { >+ // xy chromaticity coordinates must be calculated as specified in ISO >+ // 11664-3:2012 Section 7, and must be specified with four decimal places. >+ // The x coordinate must be in the range [0.0001, 0.7400] and the y >+ // coordinate must be in the range [0.0001, 0.8400]. >+ float x = 0.0f; >+ float y = 0.0f; >+ bool operator==(const Chromaticity& rhs) const { >+ return x == rhs.x && y == rhs.y; >+ } >+ >+ Chromaticity(); >+ }; >+ >+ // The nominal primaries of the mastering display. >+ Chromaticity primary_r; >+ Chromaticity primary_g; >+ Chromaticity primary_b; >+ >+ // The nominal chromaticity of the white point of the mastering display. 
>+ Chromaticity white_point; >+ >+ // The nominal maximum display luminance of the mastering display. Specified >+ // in the unit candela/m2. The value must be in the range [5, 10000] with zero >+ // decimal places. >+ float luminance_max = 0.0f; >+ >+ // The nominal minimum display luminance of the mastering display. Specified >+ // in the unit candela/m2. The value must be in the range [0.0001, 5.0000] >+ // with four decimal places. >+ float luminance_min = 0.0f; >+ >+ HdrMasteringMetadata(); >+ >+ bool operator==(const HdrMasteringMetadata& rhs) const { >+ return ((primary_r == rhs.primary_r) && (primary_g == rhs.primary_g) && >+ (primary_b == rhs.primary_b) && (white_point == rhs.white_point) && >+ (luminance_max == rhs.luminance_max) && >+ (luminance_min == rhs.luminance_min)); >+ } >+}; >+ >+// High dynamic range (HDR) metadata common for HDR10 and WebM/VP9-based HDR >+// formats. This struct replicates the HDRMetadata struct defined in >+// https://cs.chromium.org/chromium/src/media/base/hdr_metadata.h >+struct HdrMetadata { >+ HdrMasteringMetadata mastering_metadata; >+ // Max content light level (CLL), i.e. maximum brightness level present in the >+ // stream, in nits. 1 nit = 1 candela/m2. >+ uint32_t max_content_light_level = 0; >+ // Max frame-average light level (FALL), i.e. maximum average brightness of >+ // the brightest frame in the stream, in nits. 
>+ uint32_t max_frame_average_light_level = 0; >+ >+ HdrMetadata(); >+ >+ bool operator==(const HdrMetadata& rhs) const { >+ return ( >+ (max_content_light_level == rhs.max_content_light_level) && >+ (max_frame_average_light_level == rhs.max_frame_average_light_level) && >+ (mastering_metadata == rhs.mastering_metadata)); >+ } >+}; >+ >+} // namespace webrtc >+ >+#endif // API_VIDEO_HDR_METADATA_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/i420_buffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/i420_buffer.h >index 282b2426ca36e159228ee5b3ee71835c8bf77119..631e394e534f0b27479c1c6f6df61a051cc55f5b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/i420_buffer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/i420_buffer.h >@@ -11,16 +11,19 @@ > #ifndef API_VIDEO_I420_BUFFER_H_ > #define API_VIDEO_I420_BUFFER_H_ > >+#include <stdint.h> > #include <memory> > > #include "api/video/video_frame_buffer.h" > #include "api/video/video_rotation.h" > #include "rtc_base/memory/aligned_malloc.h" >+#include "rtc_base/scoped_ref_ptr.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > > // Plain I420 buffer in standard memory. 
>-class I420Buffer : public I420BufferInterface { >+class RTC_EXPORT I420Buffer : public I420BufferInterface { > public: > static rtc::scoped_refptr<I420Buffer> Create(int width, int height); > static rtc::scoped_refptr<I420Buffer> Create(int width, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/test/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/test/BUILD.gn >index cba42458d2215d13c5f50fe894ac8fca4c78fad2..c8269efdc079bbe37ede277aaa7f39b51205c89b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/test/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/test/BUILD.gn >@@ -11,10 +11,12 @@ import("../../../webrtc.gni") > rtc_source_set("rtc_api_video_unittests") { > testonly = true > sources = [ >+ "color_space_unittest.cc", > "video_bitrate_allocation_unittest.cc", > ] > deps = [ > "..:video_bitrate_allocation", >+ "..:video_frame", > "../../../test:test_support", > ] > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/test/color_space_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/test/color_space_unittest.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..b71bcef86a6bc52c1d16ed9b39e647607bb0b5f5 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/test/color_space_unittest.cc >@@ -0,0 +1,55 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include <stdint.h> >+ >+#include "api/video/color_space.h" >+#include "test/gtest.h" >+ >+namespace webrtc { >+TEST(ColorSpace, TestSettingPrimariesFromUint8) { >+ ColorSpace color_space; >+ EXPECT_TRUE(color_space.set_primaries_from_uint8( >+ static_cast<uint8_t>(ColorSpace::PrimaryID::kBT470BG))); >+ EXPECT_EQ(ColorSpace::PrimaryID::kBT470BG, color_space.primaries()); >+ EXPECT_FALSE(color_space.set_primaries_from_uint8(3)); >+ EXPECT_FALSE(color_space.set_primaries_from_uint8(23)); >+ EXPECT_FALSE(color_space.set_primaries_from_uint8(64)); >+} >+ >+TEST(ColorSpace, TestSettingTransferFromUint8) { >+ ColorSpace color_space; >+ EXPECT_TRUE(color_space.set_transfer_from_uint8( >+ static_cast<uint8_t>(ColorSpace::TransferID::kBT2020_10))); >+ EXPECT_EQ(ColorSpace::TransferID::kBT2020_10, color_space.transfer()); >+ EXPECT_FALSE(color_space.set_transfer_from_uint8(3)); >+ EXPECT_FALSE(color_space.set_transfer_from_uint8(19)); >+ EXPECT_FALSE(color_space.set_transfer_from_uint8(128)); >+} >+ >+TEST(ColorSpace, TestSettingMatrixFromUint8) { >+ ColorSpace color_space; >+ EXPECT_TRUE(color_space.set_matrix_from_uint8( >+ static_cast<uint8_t>(ColorSpace::MatrixID::kCDNCLS))); >+ EXPECT_EQ(ColorSpace::MatrixID::kCDNCLS, color_space.matrix()); >+ EXPECT_FALSE(color_space.set_matrix_from_uint8(3)); >+ EXPECT_FALSE(color_space.set_matrix_from_uint8(15)); >+ EXPECT_FALSE(color_space.set_matrix_from_uint8(255)); >+} >+ >+TEST(ColorSpace, TestSettingRangeFromUint8) { >+ ColorSpace color_space; >+ EXPECT_TRUE(color_space.set_range_from_uint8( >+ static_cast<uint8_t>(ColorSpace::RangeID::kFull))); >+ EXPECT_EQ(ColorSpace::RangeID::kFull, color_space.range()); >+ EXPECT_FALSE(color_space.set_range_from_uint8(4)); >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_bitrate_allocation.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_bitrate_allocation.cc >index 
6c5ad1eae5fac150d1340682c31369f132bfc449..1b3569047e0e7809c34d3eb0d8be51418e399f5c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_bitrate_allocation.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_bitrate_allocation.cc >@@ -10,12 +10,11 @@ > > #include "api/video/video_bitrate_allocation.h" > >-#include <limits> >+#include <cstdint> > > #include "rtc_base/checks.h" > #include "rtc_base/numerics/safe_conversions.h" > #include "rtc_base/strings/string_builder.h" >-#include "rtc_base/stringutils.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_bitrate_allocation.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_bitrate_allocation.h >index 9e2501d85a09aacb919e9cb8afaf70dae4e49c9b..d1771b44a435cb83a29ea94c631d2890307a9cff 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_bitrate_allocation.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_bitrate_allocation.h >@@ -11,6 +11,8 @@ > #ifndef API_VIDEO_VIDEO_BITRATE_ALLOCATION_H_ > #define API_VIDEO_VIDEO_BITRATE_ALLOCATION_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <limits> > #include <string> > #include <vector> >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_codec_type.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_codec_type.h >new file mode 100644 >index 0000000000000000000000000000000000000000..447723cf088cdab2832bd1b51fc28810b120935c >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_codec_type.h >@@ -0,0 +1,30 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. 
All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef API_VIDEO_VIDEO_CODEC_TYPE_H_ >+#define API_VIDEO_VIDEO_CODEC_TYPE_H_ >+ >+namespace webrtc { >+ >+// Video codec types >+enum VideoCodecType { >+ // There are various memset(..., 0, ...) calls in the code that rely on >+ // kVideoCodecGeneric being zero. >+ kVideoCodecGeneric = 0, >+ kVideoCodecVP8, >+ kVideoCodecVP9, >+ kVideoCodecH264, >+ kVideoCodecI420, >+ kVideoCodecMultiplex, >+}; >+ >+} // namespace webrtc >+ >+#endif // API_VIDEO_VIDEO_CODEC_TYPE_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_content_type.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_content_type.h >index 8c6460288d821cb66ef7140d9895e3c569893426..2d38a62366c577576a82bfd2d0bd4894dbc23731 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_content_type.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_content_type.h >@@ -13,8 +13,6 @@ > > #include <stdint.h> > >-#include <string> >- > namespace webrtc { > > enum class VideoContentType : uint8_t { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_frame.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_frame.cc >index 5521ad8a587441223743c2f9a8c91d43590050d9..eaae33b2b3ffd84f5f42e5553e053c1d2aafad7f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_frame.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_frame.cc >@@ -64,6 +64,13 @@ VideoFrame::Builder& VideoFrame::Builder::set_color_space( > return *this; > } > >+VideoFrame::Builder& VideoFrame::Builder::set_color_space( >+ const ColorSpace* color_space) { >+ color_space_ = >+ color_space ? 
absl::make_optional(*color_space) : absl::nullopt; >+ return *this; >+} >+ > VideoFrame::VideoFrame(const rtc::scoped_refptr<VideoFrameBuffer>& buffer, > webrtc::VideoRotation rotation, > int64_t timestamp_us) >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_frame.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_frame.h >index dcb533e29f0c09f2bae8797313ba18d52e716ffc..2c5d081ba621dc76dba6cef8fcc1296218e39992 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_frame.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_frame.h >@@ -15,12 +15,15 @@ > > #include "absl/types/optional.h" > #include "api/video/color_space.h" >+#include "api/video/hdr_metadata.h" > #include "api/video/video_frame_buffer.h" > #include "api/video/video_rotation.h" >+#include "rtc_base/scoped_ref_ptr.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > >-class VideoFrame { >+class RTC_EXPORT VideoFrame { > public: > // Preferred way of building VideoFrame objects. > class Builder { >@@ -37,6 +40,7 @@ class VideoFrame { > Builder& set_ntp_time_ms(int64_t ntp_time_ms); > Builder& set_rotation(VideoRotation rotation); > Builder& set_color_space(const ColorSpace& color_space); >+ Builder& set_color_space(const ColorSpace* color_space); > > private: > rtc::scoped_refptr<webrtc::VideoFrameBuffer> video_frame_buffer_; >@@ -110,8 +114,10 @@ class VideoFrame { > VideoRotation rotation() const { return rotation_; } > void set_rotation(VideoRotation rotation) { rotation_ = rotation; } > >- // Set Color space when available. >- absl::optional<ColorSpace> color_space() const { return color_space_; } >+ // Get color space when available. >+ const ColorSpace* color_space() const { >+ return color_space_ ? &*color_space_ : nullptr; >+ } > > // Get render time in milliseconds. > // TODO(nisse): Deprecated. Migrate all users to timestamp_us(). 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_source_interface.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_source_interface.h >index 4ee471945529df133c3d4d1833f8069e70d8e887..2bf73704bfaf1ab13c657b75e7f6544112a55413 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_source_interface.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_source_interface.h >@@ -15,12 +15,13 @@ > > #include "absl/types/optional.h" > #include "api/video/video_sink_interface.h" >+#include "rtc_base/system/rtc_export.h" > > namespace rtc { > > // VideoSinkWants is used for notifying the source of properties a video frame > // should have when it is delivered to a certain sink. >-struct VideoSinkWants { >+struct RTC_EXPORT VideoSinkWants { > VideoSinkWants(); > VideoSinkWants(const VideoSinkWants&); > ~VideoSinkWants(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_stream_encoder_observer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_stream_encoder_observer.h >index b940efdf1150031f104e2a3971d8e73460eb57d3..98b5cfcb2cd30d9dfee04b3a183991f9361a53e4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_stream_encoder_observer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_stream_encoder_observer.h >@@ -11,6 +11,7 @@ > #ifndef API_VIDEO_VIDEO_STREAM_ENCODER_OBSERVER_H_ > #define API_VIDEO_VIDEO_STREAM_ENCODER_OBSERVER_H_ > >+#include <string> > #include <vector> > > #include "absl/types/optional.h" >@@ -68,6 +69,9 @@ class VideoStreamEncoderObserver : public CpuOveruseMetricsObserver { > virtual void OnSendEncodedImage(const EncodedImage& encoded_image, > const CodecSpecificInfo* codec_info) = 0; > >+ virtual void OnEncoderImplementationChanged( >+ const std::string& implementation_name) = 0; >+ > virtual void OnFrameDropped(DropReason reason) = 0; > > // Used to indicate change in content type, which may require a change in >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_stream_encoder_settings.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_stream_encoder_settings.h >index b67f33c982dbbecc08b18774b548bf0d89fc5831..37c1de7015f21b3a85d69bf20b0330cbaccfd9c6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_stream_encoder_settings.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video/video_stream_encoder_settings.h >@@ -11,6 +11,7 @@ > #ifndef API_VIDEO_VIDEO_STREAM_ENCODER_SETTINGS_H_ > #define API_VIDEO_VIDEO_STREAM_ENCODER_SETTINGS_H_ > >+#include "api/video/video_bitrate_allocator_factory.h" > #include "api/video_codecs/video_encoder_factory.h" > > namespace webrtc { >@@ -24,6 +25,9 @@ struct VideoStreamEncoderSettings { > > // Ownership stays with WebrtcVideoEngine (delegated from PeerConnection). > VideoEncoderFactory* encoder_factory = nullptr; >+ >+ // Ownership stays with WebrtcVideoEngine (delegated from PeerConnection). >+ VideoBitrateAllocatorFactory* bitrate_allocator_factory = nullptr; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/BUILD.gn >index 55674d09010a98eda9a236b66b8c42ee4dcf2db6..d0b1b9ba8ea48b32fc0d1d41078701374089d97e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/BUILD.gn >@@ -28,15 +28,18 @@ rtc_source_set("video_codecs_api") { > "video_encoder_config.cc", > "video_encoder_config.h", > "video_encoder_factory.h", >+ "vp8_temporal_layers.h", > ] > > deps = [ > "../..:webrtc_common", >- "../../common_video", > "../../rtc_base:checks", > "../../rtc_base:rtc_base_approved", >+ "../../rtc_base/system:rtc_export", >+ "../video:encoded_image", > "../video:video_bitrate_allocation", > "../video:video_frame", >+ "//third_party/abseil-cpp/absl/strings", > 
"//third_party/abseil-cpp/absl/types:optional", > ] > } >@@ -56,6 +59,7 @@ rtc_static_library("builtin_video_decoder_factory") { > ":video_codecs_api", > "../../media:rtc_internal_video_codecs", > "../../rtc_base:ptr_util", >+ "../../rtc_base/system:rtc_export", > "//third_party/abseil-cpp/absl/memory", > ] > } >@@ -75,8 +79,29 @@ rtc_static_library("builtin_video_encoder_factory") { > ":video_codecs_api", > "../../media:rtc_internal_video_codecs", > "../../media:rtc_media_base", >+ "../../media:rtc_vp8_encoder_simulcast_proxy", > "../../rtc_base:ptr_util", >+ "../../rtc_base/system:rtc_export", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings", >+ ] >+} >+ >+rtc_static_library("create_vp8_temporal_layers") { >+ visibility = [ "*" ] >+ allow_poison = [ "software_video_codecs" ] >+ sources = [ >+ "create_vp8_temporal_layers.cc", >+ "create_vp8_temporal_layers.h", >+ ] >+ >+ deps = [ >+ ":video_codecs_api", >+ "../..:webrtc_common", >+ "../../modules/video_coding:video_codec_interface", >+ "../../modules/video_coding:webrtc_vp8_temporal_layers", >+ "../../system_wrappers:system_wrappers", >+ "//third_party/abseil-cpp/absl/memory:memory", > ] > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/OWNERS b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/OWNERS >new file mode 100644 >index 0000000000000000000000000000000000000000..0a70bddd34d3b8375fa11b4b36043f5189ab75a1 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/OWNERS >@@ -0,0 +1,4 @@ >+magjed@webrtc.org >+nisse@webrtc.org >+sprang@webrtc.org >+brandtr@webrtc.org >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/builtin_video_decoder_factory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/builtin_video_decoder_factory.h >index 1f8e75c9ae7d52ada95f33ba0ec572612d1b658b..d516077d9991bb18af8b02cd7e6fee9e2d7008d5 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/builtin_video_decoder_factory.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/builtin_video_decoder_factory.h >@@ -14,11 +14,13 @@ > #include <memory> > > #include "api/video_codecs/video_decoder_factory.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > > // Creates a new factory that can create the built-in types of video decoders. >-std::unique_ptr<VideoDecoderFactory> CreateBuiltinVideoDecoderFactory(); >+RTC_EXPORT std::unique_ptr<VideoDecoderFactory> >+CreateBuiltinVideoDecoderFactory(); > > } // namespace webrtc > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/builtin_video_encoder_factory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/builtin_video_encoder_factory.cc >index ca389b99e3b5612d91cc7a852060dc86fe1dcb4c..1d0827cbc0c4d996f71499149080bae53a4dc0ba 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/builtin_video_encoder_factory.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/builtin_video_encoder_factory.cc >@@ -13,6 +13,7 @@ > #include <vector> > > #include "absl/memory/memory.h" >+#include "absl/strings/match.h" > #include "api/video_codecs/sdp_video_format.h" > #include "media/base/codec.h" > #include "media/base/mediaconstants.h" >@@ -59,7 +60,7 @@ class BuiltinVideoEncoderFactory : public VideoEncoderFactory { > if (IsFormatSupported(internal_encoder_factory_->GetSupportedFormats(), > format)) { > internal_encoder = >- cricket::CodecNamesEq(format.name.c_str(), cricket::kVp8CodecName) >+ absl::EqualsIgnoreCase(format.name, cricket::kVp8CodecName) > ? 
absl::make_unique<VP8EncoderSimulcastProxy>( > internal_encoder_factory_.get(), format) > : internal_encoder_factory_->CreateVideoEncoder(format); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/builtin_video_encoder_factory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/builtin_video_encoder_factory.h >index 6a6618fb32724e4a138bda51d51bf99cf41c9e6d..2c4537205c59418f52c100f305bad4368c4642f9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/builtin_video_encoder_factory.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/builtin_video_encoder_factory.h >@@ -14,12 +14,14 @@ > #include <memory> > > #include "api/video_codecs/video_encoder_factory.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > > // Creates a new factory that can create the built-in types of video encoders. > // The factory has simulcast support for VP8. >-std::unique_ptr<VideoEncoderFactory> CreateBuiltinVideoEncoderFactory(); >+RTC_EXPORT std::unique_ptr<VideoEncoderFactory> >+CreateBuiltinVideoEncoderFactory(); > > } // namespace webrtc > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/create_vp8_temporal_layers.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/create_vp8_temporal_layers.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..dd290f39fdd477eba05d4498a05aef8cc045825d >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/create_vp8_temporal_layers.cc >@@ -0,0 +1,33 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "api/video_codecs/create_vp8_temporal_layers.h" >+#include "absl/memory/memory.h" >+#include "api/video_codecs/vp8_temporal_layers.h" >+#include "modules/video_coding/codecs/vp8/default_temporal_layers.h" >+#include "modules/video_coding/codecs/vp8/screenshare_layers.h" >+#include "system_wrappers/include/clock.h" >+ >+namespace webrtc { >+ >+std::unique_ptr<Vp8TemporalLayers> CreateVp8TemporalLayers( >+ Vp8TemporalLayersType type, >+ int num_temporal_layers) { >+ switch (type) { >+ case Vp8TemporalLayersType::kFixedPattern: >+ return absl::make_unique<DefaultTemporalLayers>(num_temporal_layers); >+ case Vp8TemporalLayersType::kBitrateDynamic: >+ // Conference mode temporal layering for screen content in base stream. >+ return absl::make_unique<ScreenshareLayers>(num_temporal_layers, >+ Clock::GetRealTimeClock()); >+ } >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/create_vp8_temporal_layers.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/create_vp8_temporal_layers.h >new file mode 100644 >index 0000000000000000000000000000000000000000..52f14c8eda39f34cea5ccd3f2ddfea93a9ddeb35 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/create_vp8_temporal_layers.h >@@ -0,0 +1,26 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#ifndef API_VIDEO_CODECS_CREATE_VP8_TEMPORAL_LAYERS_H_ >+#define API_VIDEO_CODECS_CREATE_VP8_TEMPORAL_LAYERS_H_ >+ >+#include <memory> >+ >+#include "api/video_codecs/vp8_temporal_layers.h" >+ >+namespace webrtc { >+ >+std::unique_ptr<Vp8TemporalLayers> CreateVp8TemporalLayers( >+ Vp8TemporalLayersType type, >+ int num_temporal_layers); >+ >+} // namespace webrtc >+ >+#endif // API_VIDEO_CODECS_CREATE_VP8_TEMPORAL_LAYERS_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/sdp_video_format.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/sdp_video_format.h >index c25b857fde7f705c2741435cf370dd99c3ee17c8..edb78191c75318fec5ec3b6c0e61bcca6a45bac4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/sdp_video_format.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/sdp_video_format.h >@@ -14,11 +14,13 @@ > #include <map> > #include <string> > >+#include "rtc_base/system/rtc_export.h" >+ > namespace webrtc { > > // SDP specification for a single video codec. > // NOTE: This class is still under development and may change without notice. 
>-struct SdpVideoFormat { >+struct RTC_EXPORT SdpVideoFormat { > using Parameters = std::map<std::string, std::string>; > > explicit SdpVideoFormat(const std::string& name); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/test/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/test/BUILD.gn >index fa191625a1a4c021a5442ff924ff5e8ead031836..6f9dcaa2d8a1dee1a5b7b58202e51f7c55709da1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/test/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/test/BUILD.gn >@@ -21,6 +21,7 @@ if (rtc_include_tests) { > "..:builtin_video_encoder_factory", > "..:rtc_software_fallback_wrappers", > "..:video_codecs_api", >+ "../..:mock_video_encoder", > "../../../modules/video_coding:video_codec_interface", > "../../../modules/video_coding:video_coding_utility", > "../../../modules/video_coding:webrtc_vp8", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/test/builtin_video_encoder_factory_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/test/builtin_video_encoder_factory_unittest.cc >index 73957c999f04d35b6102e007de031413a9619787..30754d9bdbd1a18b98e3ae19f7d6bbc8bd339a8a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/test/builtin_video_encoder_factory_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/test/builtin_video_encoder_factory_unittest.cc >@@ -27,11 +27,11 @@ TEST(BuiltinVideoEncoderFactoryTest, AnnouncesVp9AccordingToBuildFlags) { > break; > } > } >-#if defined(RTC_DISABLE_VP9) >- EXPECT_FALSE(claims_vp9_support); >-#else >+#if defined(RTC_ENABLE_VP9) > EXPECT_TRUE(claims_vp9_support); >-#endif // defined(RTC_DISABLE_VP9) >+#else >+ EXPECT_FALSE(claims_vp9_support); >+#endif // defined(RTC_ENABLE_VP9) > } > > } // namespace webrtc >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/test/video_encoder_software_fallback_wrapper_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/test/video_encoder_software_fallback_wrapper_unittest.cc >index 98ebed230511d8075ee256ba80ed97f5d1d02d5b..b374f3d1e1638da7f24906f06fa94250f9def9bb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/test/video_encoder_software_fallback_wrapper_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/test/video_encoder_software_fallback_wrapper_unittest.cc >@@ -8,23 +8,26 @@ > * be found in the AUTHORS file in the root of the source tree. > */ > >-#include "api/video_codecs/video_encoder_software_fallback_wrapper.h" >- > #include <utility> > >+#include "api/test/mock_video_encoder.h" > #include "api/video/i420_buffer.h" > #include "api/video/video_bitrate_allocation.h" >+#include "api/video_codecs/video_encoder_software_fallback_wrapper.h" >+#include "api/video_codecs/vp8_temporal_layers.h" > #include "modules/video_coding/codecs/vp8/include/vp8.h" >-#include "modules/video_coding/codecs/vp8/include/vp8_temporal_layers.h" > #include "modules/video_coding/include/video_codec_interface.h" > #include "modules/video_coding/include/video_error_codes.h" > #include "modules/video_coding/utility/simulcast_rate_allocator.h" > #include "rtc_base/checks.h" > #include "rtc_base/fakeclock.h" > #include "test/field_trial.h" >+#include "test/gmock.h" > #include "test/gtest.h" > > namespace webrtc { >+using ::testing::Return; >+ > namespace { > const int kWidth = 320; > const int kHeight = 240; >@@ -34,6 +37,12 @@ const size_t kMaxPayloadSize = 800; > const int kDefaultMinPixelsPerFrame = 320 * 180; > const int kLowThreshold = 10; > const int kHighThreshold = 20; >+ >+VideoEncoder::EncoderInfo GetEncoderInfo(bool trusted_rate_controller) { >+ VideoEncoder::EncoderInfo info; >+ info.has_trusted_rate_controller = trusted_rate_controller; >+ return info; >+} > } // 
namespace > > class VideoEncoderSoftwareFallbackWrapperTest : public ::testing::Test { >@@ -62,9 +71,7 @@ class VideoEncoderSoftwareFallbackWrapperTest : public ::testing::Test { > ++encode_count_; > if (encode_complete_callback_ && > encode_return_code_ == WEBRTC_VIDEO_CODEC_OK) { >- CodecSpecificInfo info; >- info.codec_name = ImplementationName(); >- encode_complete_callback_->OnEncodedImage(EncodedImage(), &info, >+ encode_complete_callback_->OnEncodedImage(EncodedImage(), nullptr, > nullptr); > } > return encode_return_code_; >@@ -81,26 +88,19 @@ class VideoEncoderSoftwareFallbackWrapperTest : public ::testing::Test { > return WEBRTC_VIDEO_CODEC_OK; > } > >- int32_t SetChannelParameters(uint32_t packet_loss, int64_t rtt) override { >- ++set_channel_parameters_count_; >- return WEBRTC_VIDEO_CODEC_OK; >- } >- > int32_t SetRateAllocation(const VideoBitrateAllocation& bitrate_allocation, > uint32_t framerate) override { > ++set_rates_count_; > return WEBRTC_VIDEO_CODEC_OK; > } > >- bool SupportsNativeHandle() const override { >+ EncoderInfo GetEncoderInfo() const override { > ++supports_native_handle_count_; >- return supports_native_handle_; >- } >- >- const char* ImplementationName() const override { return "fake-encoder"; } >- >- VideoEncoder::ScalingSettings GetScalingSettings() const override { >- return VideoEncoder::ScalingSettings(kLowThreshold, kHighThreshold); >+ EncoderInfo info; >+ info.scaling_settings = ScalingSettings(kLowThreshold, kHighThreshold); >+ info.supports_native_handle = supports_native_handle_; >+ info.implementation_name = "fake-encoder"; >+ return info; > } > > int init_encode_count_ = 0; >@@ -109,7 +109,6 @@ class VideoEncoderSoftwareFallbackWrapperTest : public ::testing::Test { > int encode_count_ = 0; > EncodedImageCallback* encode_complete_callback_ = nullptr; > int release_count_ = 0; >- int set_channel_parameters_count_ = 0; > int set_rates_count_ = 0; > mutable int supports_native_handle_count_ = 0; > bool 
supports_native_handle_ = false; >@@ -122,11 +121,9 @@ class VideoEncoderSoftwareFallbackWrapperTest : public ::testing::Test { > const CodecSpecificInfo* codec_specific_info, > const RTPFragmentationHeader* fragmentation) override { > ++callback_count_; >- last_codec_name_ = codec_specific_info->codec_name; > return Result(Result::OK, callback_count_); > } > int callback_count_ = 0; >- std::string last_codec_name_; > }; > > void UtilizeFallbackEncoder(); >@@ -134,7 +131,8 @@ class VideoEncoderSoftwareFallbackWrapperTest : public ::testing::Test { > void EncodeFrame(); > void EncodeFrame(int expected_ret); > void CheckLastEncoderName(const char* expected_name) { >- EXPECT_STREQ(expected_name, callback_.last_codec_name_.c_str()); >+ EXPECT_EQ(expected_name, >+ fallback_wrapper_->GetEncoderInfo().implementation_name); > } > > test::ScopedFieldTrials override_field_trials_; >@@ -271,15 +269,6 @@ TEST_F(VideoEncoderSoftwareFallbackWrapperTest, > EXPECT_EQ(WEBRTC_VIDEO_CODEC_OK, fallback_wrapper_->Release()); > } > >-TEST_F(VideoEncoderSoftwareFallbackWrapperTest, >- SetChannelParametersForwardedDuringFallback) { >- UtilizeFallbackEncoder(); >- EXPECT_EQ(0, fake_encoder_->set_channel_parameters_count_); >- fallback_wrapper_->SetChannelParameters(1, 1); >- EXPECT_EQ(1, fake_encoder_->set_channel_parameters_count_); >- EXPECT_EQ(WEBRTC_VIDEO_CODEC_OK, fallback_wrapper_->Release()); >-} >- > TEST_F(VideoEncoderSoftwareFallbackWrapperTest, > SetRatesForwardedDuringFallback) { > UtilizeFallbackEncoder(); >@@ -291,15 +280,19 @@ TEST_F(VideoEncoderSoftwareFallbackWrapperTest, > > TEST_F(VideoEncoderSoftwareFallbackWrapperTest, > SupportsNativeHandleForwardedWithoutFallback) { >- fallback_wrapper_->SupportsNativeHandle(); >+ fallback_wrapper_->GetEncoderInfo(); > EXPECT_EQ(1, fake_encoder_->supports_native_handle_count_); > } > > TEST_F(VideoEncoderSoftwareFallbackWrapperTest, > SupportsNativeHandleNotForwardedDuringFallback) { >+ // Fake encoder signals support for native 
handle, default (libvpx) does not. >+ fake_encoder_->supports_native_handle_ = true; >+ EXPECT_TRUE(fallback_wrapper_->GetEncoderInfo().supports_native_handle); > UtilizeFallbackEncoder(); >- fallback_wrapper_->SupportsNativeHandle(); >- EXPECT_EQ(0, fake_encoder_->supports_native_handle_count_); >+ EXPECT_FALSE(fallback_wrapper_->GetEncoderInfo().supports_native_handle); >+ // Both times, both encoders are queried. >+ EXPECT_EQ(2, fake_encoder_->supports_native_handle_count_); > EXPECT_EQ(WEBRTC_VIDEO_CODEC_OK, fallback_wrapper_->Release()); > } > >@@ -489,7 +482,7 @@ TEST_F(ForcedFallbackTestDisabled, GetScaleSettings) { > EncodeFrameAndVerifyLastName("fake-encoder"); > > // Default min pixels per frame should be used. >- const auto settings = fallback_wrapper_->GetScalingSettings(); >+ const auto settings = fallback_wrapper_->GetEncoderInfo().scaling_settings; > EXPECT_TRUE(settings.thresholds.has_value()); > EXPECT_EQ(kDefaultMinPixelsPerFrame, settings.min_pixels_per_frame); > } >@@ -500,7 +493,7 @@ TEST_F(ForcedFallbackTestEnabled, GetScaleSettingsWithNoFallback) { > EncodeFrameAndVerifyLastName("fake-encoder"); > > // Configured min pixels per frame should be used. >- const auto settings = fallback_wrapper_->GetScalingSettings(); >+ const auto settings = fallback_wrapper_->GetEncoderInfo().scaling_settings; > EXPECT_EQ(kMinPixelsPerFrame, settings.min_pixels_per_frame); > ASSERT_TRUE(settings.thresholds); > EXPECT_EQ(kLowThreshold, settings.thresholds->low); >@@ -513,7 +506,7 @@ TEST_F(ForcedFallbackTestEnabled, GetScaleSettingsWithFallback) { > EncodeFrameAndVerifyLastName("libvpx"); > > // Configured min pixels per frame should be used. 
>- const auto settings = fallback_wrapper_->GetScalingSettings(); >+ const auto settings = fallback_wrapper_->GetEncoderInfo().scaling_settings; > EXPECT_TRUE(settings.thresholds.has_value()); > EXPECT_EQ(kMinPixelsPerFrame, settings.min_pixels_per_frame); > } >@@ -525,8 +518,77 @@ TEST_F(ForcedFallbackTestEnabled, ScalingDisabledIfResizeOff) { > EncodeFrameAndVerifyLastName("libvpx"); > > // Should be disabled for automatic resize off. >- const auto settings = fallback_wrapper_->GetScalingSettings(); >+ const auto settings = fallback_wrapper_->GetEncoderInfo().scaling_settings; > EXPECT_FALSE(settings.thresholds.has_value()); > } > >+TEST(SoftwareFallbackEncoderTest, BothRateControllersNotTrusted) { >+ auto* sw_encoder = new testing::NiceMock<MockVideoEncoder>(); >+ auto* hw_encoder = new testing::NiceMock<MockVideoEncoder>(); >+ >+ EXPECT_CALL(*sw_encoder, GetEncoderInfo()) >+ .WillRepeatedly(Return(GetEncoderInfo(false))); >+ EXPECT_CALL(*hw_encoder, GetEncoderInfo()) >+ .WillRepeatedly(Return(GetEncoderInfo(false))); >+ >+ std::unique_ptr<VideoEncoder> wrapper = >+ CreateVideoEncoderSoftwareFallbackWrapper( >+ std::unique_ptr<VideoEncoder>(sw_encoder), >+ std::unique_ptr<VideoEncoder>(hw_encoder)); >+ EXPECT_FALSE(wrapper->GetEncoderInfo().has_trusted_rate_controller); >+} >+ >+TEST(SoftwareFallbackEncoderTest, SwRateControllerTrusted) { >+ auto* sw_encoder = new testing::NiceMock<MockVideoEncoder>(); >+ auto* hw_encoder = new testing::NiceMock<MockVideoEncoder>(); >+ EXPECT_CALL(*sw_encoder, GetEncoderInfo()) >+ .WillRepeatedly(Return(GetEncoderInfo(true))); >+ EXPECT_CALL(*hw_encoder, GetEncoderInfo()) >+ .WillRepeatedly(Return(GetEncoderInfo(false))); >+ >+ std::unique_ptr<VideoEncoder> wrapper = >+ CreateVideoEncoderSoftwareFallbackWrapper( >+ std::unique_ptr<VideoEncoder>(sw_encoder), >+ std::unique_ptr<VideoEncoder>(hw_encoder)); >+ EXPECT_FALSE(wrapper->GetEncoderInfo().has_trusted_rate_controller); >+} >+ >+TEST(SoftwareFallbackEncoderTest, 
HwRateControllerTrusted) { >+ auto* sw_encoder = new testing::NiceMock<MockVideoEncoder>(); >+ auto* hw_encoder = new testing::NiceMock<MockVideoEncoder>(); >+ EXPECT_CALL(*sw_encoder, GetEncoderInfo()) >+ .WillRepeatedly(Return(GetEncoderInfo(false))); >+ EXPECT_CALL(*hw_encoder, GetEncoderInfo()) >+ .WillRepeatedly(Return(GetEncoderInfo(true))); >+ >+ std::unique_ptr<VideoEncoder> wrapper = >+ CreateVideoEncoderSoftwareFallbackWrapper( >+ std::unique_ptr<VideoEncoder>(sw_encoder), >+ std::unique_ptr<VideoEncoder>(hw_encoder)); >+ EXPECT_TRUE(wrapper->GetEncoderInfo().has_trusted_rate_controller); >+ >+ // Trigger fallback to software. >+ EXPECT_CALL(*hw_encoder, Encode) >+ .WillOnce(Return(WEBRTC_VIDEO_CODEC_FALLBACK_SOFTWARE)); >+ VideoFrame frame = VideoFrame::Builder().build(); >+ wrapper->Encode(frame, nullptr, nullptr); >+ >+ EXPECT_FALSE(wrapper->GetEncoderInfo().has_trusted_rate_controller); >+} >+ >+TEST(SoftwareFallbackEncoderTest, BothRateControllersTrusted) { >+ auto* sw_encoder = new testing::NiceMock<MockVideoEncoder>(); >+ auto* hw_encoder = new testing::NiceMock<MockVideoEncoder>(); >+ EXPECT_CALL(*sw_encoder, GetEncoderInfo()) >+ .WillRepeatedly(Return(GetEncoderInfo(true))); >+ EXPECT_CALL(*hw_encoder, GetEncoderInfo()) >+ .WillRepeatedly(Return(GetEncoderInfo(true))); >+ >+ std::unique_ptr<VideoEncoder> wrapper = >+ CreateVideoEncoderSoftwareFallbackWrapper( >+ std::unique_ptr<VideoEncoder>(sw_encoder), >+ std::unique_ptr<VideoEncoder>(hw_encoder)); >+ EXPECT_TRUE(wrapper->GetEncoderInfo().has_trusted_rate_controller); >+} >+ > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_codec.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_codec.cc >index d36535b31008d36dec340cb5553f75d9f8989d2e..0819c822a06de799c13d7a5d2d8206f470ed3708 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_codec.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_codec.cc >@@ -11,13 +11,10 @@ > #include "api/video_codecs/video_codec.h" > > #include <string.h> >-#include <algorithm> >-#include <limits> > #include <string> >-#include <type_traits> > >+#include "absl/strings/match.h" > #include "rtc_base/checks.h" >-#include "rtc_base/strings/string_builder.h" > #include "rtc_base/stringutils.h" > > namespace webrtc { >@@ -118,10 +115,6 @@ static const char* kPayloadNameI420 = "I420"; > static const char* kPayloadNameGeneric = "Generic"; > static const char* kPayloadNameMultiplex = "Multiplex"; > >-static bool CodecNamesEq(const char* name1, const char* name2) { >- return _stricmp(name1, name2) == 0; >-} >- > const char* CodecTypeToPayloadString(VideoCodecType type) { > switch (type) { > case kVideoCodecVP8: >@@ -139,15 +132,15 @@ const char* CodecTypeToPayloadString(VideoCodecType type) { > } > > VideoCodecType PayloadStringToCodecType(const std::string& name) { >- if (CodecNamesEq(name.c_str(), kPayloadNameVp8)) >+ if (absl::EqualsIgnoreCase(name, kPayloadNameVp8)) > return kVideoCodecVP8; >- if (CodecNamesEq(name.c_str(), kPayloadNameVp9)) >+ if (absl::EqualsIgnoreCase(name, kPayloadNameVp9)) > return kVideoCodecVP9; >- if (CodecNamesEq(name.c_str(), kPayloadNameH264)) >+ if (absl::EqualsIgnoreCase(name, kPayloadNameH264)) > return kVideoCodecH264; >- if (CodecNamesEq(name.c_str(), kPayloadNameI420)) >+ if (absl::EqualsIgnoreCase(name, kPayloadNameI420)) > return kVideoCodecI420; >- if (CodecNamesEq(name.c_str(), kPayloadNameMultiplex)) >+ if (absl::EqualsIgnoreCase(name, kPayloadNameMultiplex)) > return kVideoCodecMultiplex; > return kVideoCodecGeneric; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_codec.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_codec.h >index 5b4ebb25c074604bbeece813bdd4f6b6621c7660..9fba60c4e872710fcc8579ef24094febdbb47541 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_codec.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_codec.h >@@ -11,9 +11,14 @@ > #ifndef API_VIDEO_CODECS_VIDEO_CODEC_H_ > #define API_VIDEO_CODECS_VIDEO_CODEC_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <string> > >+#include "api/video/video_bitrate_allocation.h" >+#include "api/video/video_codec_type.h" > #include "common_types.h" // NOLINT(build/include) >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > >@@ -86,8 +91,8 @@ struct VideoCodecH264 { > }; > > // Translates from name of codec to codec type and vice versa. >-const char* CodecTypeToPayloadString(VideoCodecType type); >-VideoCodecType PayloadStringToCodecType(const std::string& name); >+RTC_EXPORT const char* CodecTypeToPayloadString(VideoCodecType type); >+RTC_EXPORT VideoCodecType PayloadStringToCodecType(const std::string& name); > > union VideoCodecUnion { > VideoCodecVP8 VP8; >@@ -98,7 +103,7 @@ union VideoCodecUnion { > enum class VideoCodecMode { kRealtimeVideo, kScreensharing }; > > // Common video codec properties >-class VideoCodec { >+class RTC_EXPORT VideoCodec { > public: > VideoCodec(); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_decoder.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_decoder.h >index 7995fccb0563da6e10e3ce969a59d2ecee667991..36cf80fe84c763db25ef2ae7398b240421634806 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_decoder.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_decoder.h >@@ -15,9 +15,10 @@ > #include <string> > #include <vector> > >+#include "api/video/encoded_image.h" > #include "api/video/video_frame.h" > #include "api/video_codecs/video_codec.h" >-#include "common_video/include/video_frame.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > >@@ -25,7 +26,7 @@ namespace webrtc { > struct CodecSpecificInfo; 
> class VideoCodec; > >-class DecodedImageCallback { >+class RTC_EXPORT DecodedImageCallback { > public: > virtual ~DecodedImageCallback() {} > >@@ -47,7 +48,7 @@ class DecodedImageCallback { > virtual int32_t ReceivedDecodedFrame(const uint64_t pictureId); > }; > >-class VideoDecoder { >+class RTC_EXPORT VideoDecoder { > public: > virtual ~VideoDecoder() {} > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_encoder.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_encoder.cc >index 008780e38c4c19e2007df344441d1b7ab4f15c64..135687c970b7b6db891bd26f884c1a9682eb2473 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_encoder.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_encoder.cc >@@ -10,6 +10,10 @@ > > #include "api/video_codecs/video_encoder.h" > >+#include <string.h> >+ >+#include "rtc_base/checks.h" >+ > namespace webrtc { > > // TODO(mflodman): Add default complexity for VP9 and VP9. >@@ -79,6 +83,14 @@ VideoEncoder::ScalingSettings::~ScalingSettings() {} > constexpr VideoEncoder::ScalingSettings::KOff > VideoEncoder::ScalingSettings::kOff; > >+VideoEncoder::EncoderInfo::EncoderInfo() >+ : scaling_settings(VideoEncoder::ScalingSettings::kOff), >+ supports_native_handle(false), >+ implementation_name("unknown"), >+ has_trusted_rate_controller(false) {} >+ >+VideoEncoder::EncoderInfo::~EncoderInfo() = default; >+ > int32_t VideoEncoder::SetRates(uint32_t bitrate, uint32_t framerate) { > RTC_NOTREACHED() << "SetRate(uint32_t, uint32_t) is deprecated."; > return -1; >@@ -101,4 +113,14 @@ bool VideoEncoder::SupportsNativeHandle() const { > const char* VideoEncoder::ImplementationName() const { > return "unknown"; > } >+ >+// TODO(webrtc:9722): Remove and make pure virtual when the three legacy >+// methods called here are gone. 
>+VideoEncoder::EncoderInfo VideoEncoder::GetEncoderInfo() const { >+ EncoderInfo info; >+ info.scaling_settings = GetScalingSettings(); >+ info.supports_native_handle = SupportsNativeHandle(); >+ info.implementation_name = ImplementationName(); >+ return info; >+} > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_encoder.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_encoder.h >index 68d9b445580f388253d256915f9b8f96964a1dbd..e7deb7dbb9cdb2670e0424c2ba2d8a20d21292b1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_encoder.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_encoder.h >@@ -16,11 +16,12 @@ > #include <vector> > > #include "absl/types/optional.h" >+#include "api/video/encoded_image.h" > #include "api/video/video_bitrate_allocation.h" > #include "api/video/video_frame.h" > #include "api/video_codecs/video_codec.h" >-#include "common_video/include/video_frame.h" > #include "rtc_base/checks.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > >@@ -73,7 +74,7 @@ class EncodedImageCallback { > virtual void OnDroppedFrame(DropReason reason) {} > }; > >-class VideoEncoder { >+class RTC_EXPORT VideoEncoder { > public: > struct QpThresholds { > QpThresholds(int l, int h) : low(l), high(h) {} >@@ -102,13 +103,13 @@ class VideoEncoder { > ScalingSettings(KOff); // NOLINT(runtime/explicit) > ~ScalingSettings(); > >- const absl::optional<QpThresholds> thresholds; >+ absl::optional<QpThresholds> thresholds; > > // We will never ask for a resolution lower than this. > // TODO(kthelgason): Lower this limit when better testing > // on MediaCodec and fallback implementations are in place. 
> // See https://bugs.chromium.org/p/webrtc/issues/detail?id=7206 >- const int min_pixels_per_frame = 320 * 180; >+ int min_pixels_per_frame = 320 * 180; > > private: > // Private constructor; to get an object without thresholds, use >@@ -116,6 +117,36 @@ > ScalingSettings(); > }; > >+ // Struct containing metadata about the encoder implementing this interface. >+ struct EncoderInfo { >+ EncoderInfo(); >+ ~EncoderInfo(); >+ >+ // Any encoder implementation wishing to use the WebRTC provided >+ // quality scaler must populate this field. >+ ScalingSettings scaling_settings; >+ >+ // If true, encoder supports working with a native handle (e.g. texture >+ // handle for hw codecs) rather than requiring a raw I420 buffer. >+ bool supports_native_handle; >+ >+ // The name of this particular encoder implementation, e.g. "libvpx". >+ std::string implementation_name; >+ >+ // If this field is true, the encoder rate controller must perform >+ // well even in difficult situations, and produce close to the specified >+ // target bitrate seen over a reasonable time window, drop frames if >+ // necessary in order to keep the rate correct, and react quickly to >+ // changing bitrate targets. If this field is true, we disable the >+ // frame dropper in the media optimization module and rely entirely on the >+ // encoder to produce media at a bitrate that closely matches the target. >+ // Any overshooting may result in delay buildup. If this field is >+ // false (default behavior), the media opt frame dropper will drop input >+ // frames if it suspects encoder misbehavior. Misbehavior is common, >+ // especially in hardware codecs. Disable media opt at your own risk. 
>+ bool has_trusted_rate_controller; >+ }; >+ > static VideoCodecVP8 GetDefaultVp8Settings(); > static VideoCodecVP9 GetDefaultVp9Settings(); > static VideoCodecH264 GetDefaultH264Settings(); >@@ -171,17 +202,6 @@ class VideoEncoder { > const CodecSpecificInfo* codec_specific_info, > const std::vector<FrameType>* frame_types) = 0; > >- // Inform the encoder of the new packet loss rate and the round-trip time of >- // the network. >- // >- // Input: >- // - packet_loss : Fraction lost >- // (loss rate in percent = 100 * packetLoss / 255) >- // - rtt : Round-trip time in milliseconds >- // Return value : WEBRTC_VIDEO_CODEC_OK if OK >- // <0 - Errors: WEBRTC_VIDEO_CODEC_ERROR >- virtual int32_t SetChannelParameters(uint32_t packet_loss, int64_t rtt) = 0; >- > // Inform the encoder about the new target bit rate. > // > // Input: >@@ -196,12 +216,17 @@ class VideoEncoder { > virtual int32_t SetRateAllocation(const VideoBitrateAllocation& allocation, > uint32_t framerate); > >- // Any encoder implementation wishing to use the WebRTC provided >- // quality scaler must implement this method. >+ // GetScalingSettings(), SupportsNativeHandle(), ImplementationName() are >+ // deprecated, use GetEncoderInfo() instead. > virtual ScalingSettings GetScalingSettings() const; >- > virtual bool SupportsNativeHandle() const; > virtual const char* ImplementationName() const; >+ >+ // Returns meta-data about the encoder, such as implementation name. >+ // The output of this method may change during runtime. For instance if a >+ // hardware encoder fails, it may fall back to doing software encoding using >+ // an implementation with different characteristics. 
>+ virtual EncoderInfo GetEncoderInfo() const; > }; > } // namespace webrtc > #endif // API_VIDEO_CODECS_VIDEO_ENCODER_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_encoder_config.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_encoder_config.cc >index 74977eaea3fdd59fc573c1b183044209d71a89e9..66ff7c94eb43b756a33623c598d2b063a6e6f50a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_encoder_config.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_encoder_config.cc >@@ -9,7 +9,6 @@ > */ > #include "api/video_codecs/video_encoder_config.h" > >-#include <algorithm> > #include <string> > > #include "rtc_base/checks.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_encoder_config.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_encoder_config.h >index e10f08190c68a332c31cc2cdaad49e2caebf8e32..9dd8e4fa5d02c4495cdbd9835c63a0cb5630b9c0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_encoder_config.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_encoder_config.h >@@ -11,6 +11,7 @@ > #ifndef API_VIDEO_CODECS_VIDEO_ENCODER_CONFIG_H_ > #define API_VIDEO_CODECS_VIDEO_ENCODER_CONFIG_H_ > >+#include <stddef.h> > #include <string> > #include <vector> > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_encoder_software_fallback_wrapper.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_encoder_software_fallback_wrapper.cc >index 99f29707e441c029250b7edd9b05d983c1d9219c..fd0678543abefaa3d730852323d9d425087f6cc3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_encoder_software_fallback_wrapper.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/video_encoder_software_fallback_wrapper.cc >@@ -87,12 +87,9 @@ class VideoEncoderSoftwareFallbackWrapper final : public 
VideoEncoder { > int32_t Encode(const VideoFrame& frame, > const CodecSpecificInfo* codec_specific_info, > const std::vector<FrameType>* frame_types) override; >- int32_t SetChannelParameters(uint32_t packet_loss, int64_t rtt) override; > int32_t SetRateAllocation(const VideoBitrateAllocation& bitrate_allocation, > uint32_t framerate) override; >- bool SupportsNativeHandle() const override; >- ScalingSettings GetScalingSettings() const override; >- const char* ImplementationName() const override; >+ EncoderInfo GetEncoderInfo() const override; > > private: > bool InitFallbackEncoder(); >@@ -162,7 +159,7 @@ VideoEncoderSoftwareFallbackWrapper::VideoEncoderSoftwareFallbackWrapper( > if (forced_fallback_possible_) { > GetForcedFallbackParamsFromFieldTrialGroup( > &forced_fallback_.min_pixels_, &forced_fallback_.max_pixels_, >- encoder_->GetScalingSettings().min_pixels_per_frame - >+ encoder_->GetEncoderInfo().scaling_settings.min_pixels_per_frame - > 1); // No HW below. > } > } >@@ -185,8 +182,6 @@ bool VideoEncoderSoftwareFallbackWrapper::InitFallbackEncoder() { > fallback_encoder_->RegisterEncodeCompleteCallback(callback_); > if (rates_set_) > fallback_encoder_->SetRateAllocation(bitrate_allocation_, framerate_); >- if (channel_parameters_set_) >- fallback_encoder_->SetChannelParameters(packet_loss_, rtt_); > > // Since we're switching to the fallback encoder, Release the real encoder. It > // may be re-initialized via InitEncode later, and it will continue to get >@@ -206,7 +201,6 @@ int32_t VideoEncoderSoftwareFallbackWrapper::InitEncode( > max_payload_size_ = max_payload_size; > // Clear stored rate/channel parameters. > rates_set_ = false; >- channel_parameters_set_ = false; > ValidateSettingsForForcedFallback(); > > // Try to reinit forced software codec if it is in use. 
>@@ -270,18 +264,6 @@ int32_t VideoEncoderSoftwareFallbackWrapper::Encode( > return ret; > } > >-int32_t VideoEncoderSoftwareFallbackWrapper::SetChannelParameters( >- uint32_t packet_loss, >- int64_t rtt) { >- channel_parameters_set_ = true; >- packet_loss_ = packet_loss; >- rtt_ = rtt; >- int32_t ret = encoder_->SetChannelParameters(packet_loss, rtt); >- if (use_fallback_encoder_) >- return fallback_encoder_->SetChannelParameters(packet_loss, rtt); >- return ret; >-} >- > int32_t VideoEncoderSoftwareFallbackWrapper::SetRateAllocation( > const VideoBitrateAllocation& bitrate_allocation, > uint32_t framerate) { >@@ -294,29 +276,29 @@ int32_t VideoEncoderSoftwareFallbackWrapper::SetRateAllocation( > return ret; > } > >-bool VideoEncoderSoftwareFallbackWrapper::SupportsNativeHandle() const { >- return use_fallback_encoder_ ? fallback_encoder_->SupportsNativeHandle() >- : encoder_->SupportsNativeHandle(); >-} >+VideoEncoder::EncoderInfo VideoEncoderSoftwareFallbackWrapper::GetEncoderInfo() >+ const { >+ EncoderInfo fallback_encoder_info = fallback_encoder_->GetEncoderInfo(); >+ EncoderInfo default_encoder_info = encoder_->GetEncoderInfo(); >+ >+ EncoderInfo info = >+ use_fallback_encoder_ ? fallback_encoder_info : default_encoder_info; > >-VideoEncoder::ScalingSettings >-VideoEncoderSoftwareFallbackWrapper::GetScalingSettings() const { > if (forced_fallback_possible_) { > const auto settings = forced_fallback_.active_ >- ? fallback_encoder_->GetScalingSettings() >- : encoder_->GetScalingSettings(); >- return settings.thresholds >- ? VideoEncoder::ScalingSettings(settings.thresholds->low, >- settings.thresholds->high, >- forced_fallback_.min_pixels_) >- : VideoEncoder::ScalingSettings::kOff; >+ ? fallback_encoder_info.scaling_settings >+ : default_encoder_info.scaling_settings; >+ info.scaling_settings = >+ settings.thresholds >+ ? 
VideoEncoder::ScalingSettings(settings.thresholds->low, >+ settings.thresholds->high, >+ forced_fallback_.min_pixels_) >+ : VideoEncoder::ScalingSettings::kOff; >+ } else { >+ info.scaling_settings = default_encoder_info.scaling_settings; > } >- return encoder_->GetScalingSettings(); >-} > >-const char* VideoEncoderSoftwareFallbackWrapper::ImplementationName() const { >- return use_fallback_encoder_ ? fallback_encoder_->ImplementationName() >- : encoder_->ImplementationName(); >+ return info; > } > > bool VideoEncoderSoftwareFallbackWrapper::IsForcedFallbackActive() const { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/vp8_temporal_layers.h b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/vp8_temporal_layers.h >new file mode 100644 >index 0000000000000000000000000000000000000000..70223a42096545ac85727e8cce31ee7850783047 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/video_codecs/vp8_temporal_layers.h >@@ -0,0 +1,195 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef API_VIDEO_CODECS_VP8_TEMPORAL_LAYERS_H_ >+#define API_VIDEO_CODECS_VP8_TEMPORAL_LAYERS_H_ >+ >+#include <memory> >+#include <vector> >+ >+namespace webrtc { >+ >+// Some notes on the prerequisites of the TemporalLayers interface. >+// * Vp8TemporalLayers is not thread safe, synchronization is the caller's >+// responsibility. >+// * The encoder is assumed to encode all frames in order, and callbacks to >+// PopulateCodecSpecific() / FrameEncoded() must happen in the same order. 
>+// >+// This means that in the case of pipelining encoders, it is OK to have a chain >+// of calls such as this: >+// - UpdateLayerConfig(timestampA) >+// - UpdateLayerConfig(timestampB) >+// - PopulateCodecSpecific(timestampA, ...) >+// - UpdateLayerConfig(timestampC) >+// - OnEncodeDone(timestampA, 1234, ...) >+// - UpdateLayerConfig(timestampC) >+// - OnEncodeDone(timestampB, 0, ...) >+// - OnEncodeDone(timestampC, 1234, ...) >+// Note that UpdateLayerConfig() for a new frame can happen before >+// FrameEncoded() for a previous one, but calls themselves must be both >+// synchronized (e.g. run on a task queue) and in order (per type). >+ >+// Two different flavors of temporal layers are currently available: >+// kFixedPattern uses a fixed repeating pattern of 1-4 layers. >+// kBitrateDynamic can allocate frames dynamically to 1 or 2 layers, based on >+// the bitrate produced. >+enum class Vp8TemporalLayersType { kFixedPattern, kBitrateDynamic }; >+ >+struct CodecSpecificInfoVP8; >+ >+struct Vp8EncoderConfig { >+ static constexpr size_t kMaxPeriodicity = 16; >+ static constexpr size_t kMaxLayers = 5; >+ >+ // Number of active temporal layers. Set to 0 if not used. >+ uint32_t ts_number_layers; >+ // Arrays of length |ts_number_layers|, indicating (cumulative) target bitrate >+ // and rate decimator (e.g. 4 if every 4th frame is in the given layer) for >+ // each active temporal layer, starting with temporal id 0. >+ uint32_t ts_target_bitrate[kMaxLayers]; >+ uint32_t ts_rate_decimator[kMaxLayers]; >+ >+ // The periodicity of the temporal pattern. Set to 0 if not used. >+ uint32_t ts_periodicity; >+ // Array of length |ts_periodicity| indicating the sequence of temporal id's >+ // to assign to incoming frames. >+ uint32_t ts_layer_id[kMaxPeriodicity]; >+ >+ // Target bitrate, in bps. >+ uint32_t rc_target_bitrate; >+ >+ // Clamp QP to min/max. Use 0 to disable clamping. 
>+ uint32_t rc_min_quantizer; >+ uint32_t rc_max_quantizer; >+}; >+ >+// Defined bit-maskable reference to the three buffers available in VP8. >+enum class Vp8BufferReference : uint8_t { >+ kNone = 0, >+ kLast = 1, >+ kGolden = 2, >+ kAltref = 4 >+}; >+ >+// This interface defines a way of getting the encoder settings needed to >+// realize a temporal layer structure. >+class Vp8TemporalLayers { >+ public: >+ enum BufferFlags : int { >+ kNone = 0, >+ kReference = 1, >+ kUpdate = 2, >+ kReferenceAndUpdate = kReference | kUpdate, >+ }; >+ enum FreezeEntropy { kFreezeEntropy }; >+ >+ struct FrameConfig { >+ FrameConfig(); >+ >+ FrameConfig(BufferFlags last, BufferFlags golden, BufferFlags arf); >+ FrameConfig(BufferFlags last, >+ BufferFlags golden, >+ BufferFlags arf, >+ FreezeEntropy); >+ >+ bool drop_frame; >+ BufferFlags last_buffer_flags; >+ BufferFlags golden_buffer_flags; >+ BufferFlags arf_buffer_flags; >+ >+ // The encoder layer ID is used to utilize the correct bitrate allocator >+ // inside the encoder. It does not control references nor determine which >+ // "actual" temporal layer this is. The packetizer temporal index determines >+ // which layer the encoded frame should be packetized into. >+ // Normally these are the same, but current temporal-layer strategies for >+ // screenshare use one bitrate allocator for all layers, but attempt to >+ // packetize / utilize references to split a stream into multiple layers, >+ // with different quantizer settings, to hit target bitrate. >+ // TODO(pbos): Screenshare layers are being reconsidered at the time of >+ // writing, we might be able to remove this distinction, and have a temporal >+ // layer imply both (the normal case). >+ int encoder_layer_id; >+ int packetizer_temporal_idx; >+ >+ bool layer_sync; >+ >+ bool freeze_entropy; >+ >+ // Indicates in which order the encoder should search the reference buffers >+ // when doing motion prediction. Set to kNone to use unspecified order. 
Any >+ // buffer indicated here must not have the corresponding no_ref bit set. >+ // If all three buffers can be referenced, the one not listed here should be >+ // searched last. >+ Vp8BufferReference first_reference; >+ Vp8BufferReference second_reference; >+ >+ private: >+ FrameConfig(BufferFlags last, >+ BufferFlags golden, >+ BufferFlags arf, >+ bool freeze_entropy); >+ }; >+ >+ virtual ~Vp8TemporalLayers() = default; >+ >+ // If this method returns true, the encoder is free to drop frames for >+ // instance in an effort to uphold encoding bitrate. >+ // If this returns false, the encoder must not drop any frames unless: >+ // 1. Requested to do so via FrameConfig.drop_frame >+ // 2. The frame to be encoded is requested to be a keyframe >+ // 3. The encoder detected a large overshoot and decided to drop and then >+ // re-encode the image at a low bitrate. In this case the encoder should >+ // call OnEncodeDone() once with size = 0 to indicate drop, and then call >+ // OnEncodeDone() again when the frame has actually been encoded. >+ virtual bool SupportsEncoderFrameDropping() const = 0; >+ >+ // New target bitrate, per temporal layer. >+ virtual void OnRatesUpdated(const std::vector<uint32_t>& bitrates_bps, >+ int framerate_fps) = 0; >+ >+ // Called by the encoder before encoding a frame. |cfg| contains the current >+ // configuration. If the TemporalLayers instance wishes any part of that >+ // to be changed before the encode step, |cfg| should be changed and then >+ // return true. If false is returned, the encoder will proceed without >+ // updating the configuration. >+ virtual bool UpdateConfiguration(Vp8EncoderConfig* cfg) = 0; >+ >+ // Returns the recommended VP8 encode flags needed, and moves the temporal >+ // pattern to the next frame. >+ // The timestamp may be used as both a time and a unique identifier, and so >+ // the caller must make sure no two frames use the same timestamp. >+ // The timestamp uses a 90kHz RTP clock. 
>+ // After calling this method, first call the actual encoder with the provided >+ // frame configuration, and then OnEncodeDone() below. >+ virtual FrameConfig UpdateLayerConfig(uint32_t rtp_timestamp) = 0; >+ >+ // Called after the encode step is done. |rtp_timestamp| must match the >+ // parameter used in the UpdateLayerConfig() call. >+ // |is_keyframe| must be true iff the encoder decided to encode this frame as >+ // a keyframe. >+ // If the encoder decided to drop this frame, |size_bytes| must be set to 0, >+ // otherwise it should indicate the size in bytes of the encoded frame. >+ // If |size_bytes| > 0, and |vp8_info| is not null, the TemporalLayers >+ // instance may update |vp8_info| with codec specific data such as temporal id. >+ // Some fields of this struct may have already been populated by the encoder, >+ // check before overwriting. >+ // If |size_bytes| > 0, |qp| should indicate the frame-level QP this frame was >+ // encoded at. If the encoder does not support extracting this, |qp| should be >+ // set to 0. 
>+ virtual void OnEncodeDone(uint32_t rtp_timestamp, >+ size_t size_bytes, >+ bool is_keyframe, >+ int qp, >+ CodecSpecificInfoVP8* vp8_info) = 0; >+}; >+ >+} // namespace webrtc >+ >+#endif // API_VIDEO_CODECS_VP8_TEMPORAL_LAYERS_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/BUILD.gn >index 91cece93cb9aae50ab51aba645391de535d153d1..c045af637651d37e881ae31a89d7237bbfe96b4b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/BUILD.gn >@@ -26,19 +26,13 @@ rtc_static_library("audio") { > "audio_transport_impl.h", > "channel_receive.cc", > "channel_receive.h", >- "channel_receive_proxy.cc", >- "channel_receive_proxy.h", > "channel_send.cc", > "channel_send.h", >- "channel_send_proxy.cc", >- "channel_send_proxy.h", > "conversion.h", > "null_audio_poller.cc", > "null_audio_poller.h", > "remix_resample.cc", > "remix_resample.h", >- "time_interval.cc", >- "time_interval.h", > "transport_feedback_packet_loss_tracker.cc", > "transport_feedback_packet_loss_tracker.h", > ] >@@ -49,7 +43,6 @@ rtc_static_library("audio") { > } > > deps = [ >- "..:webrtc_common", > "../api:array_view", > "../api:call_api", > "../api:libjingle_peerconnection_api", >@@ -65,12 +58,14 @@ rtc_static_library("audio") { > "../common_audio:common_audio_c", > "../logging:rtc_event_audio", > "../logging:rtc_event_log_api", >+ "../logging:rtc_stream_config", > "../modules/audio_coding", >+ "../modules/audio_coding:audio_encoder_cng", > "../modules/audio_coding:audio_format_conversion", > "../modules/audio_coding:audio_network_adaptor_config", >- "../modules/audio_coding:cng", > "../modules/audio_device", > "../modules/audio_processing", >+ "../modules/audio_processing:api", > "../modules/bitrate_controller:bitrate_controller", > "../modules/pacing:pacing", > "../modules/remote_bitrate_estimator:remote_bitrate_estimator", >@@ -128,22 +123,34 @@ if 
(rtc_include_tests) { > "mock_voe_channel_proxy.h", > "remix_resample_unittest.cc", > "test/audio_stats_test.cc", >- "time_interval_unittest.cc", >+ "test/media_transport_test.cc", > "transport_feedback_packet_loss_tracker_unittest.cc", > ] > deps = [ > ":audio", > ":audio_end_to_end_test", >+ "../api:loopback_media_transport", > "../api:mock_audio_mixer", >+ "../api:mock_frame_decryptor", >+ "../api:mock_frame_encryptor", > "../api/audio:audio_frame_api", >+ "../api/audio_codecs:audio_codecs_api", >+ "../api/audio_codecs/opus:audio_decoder_opus", >+ "../api/audio_codecs/opus:audio_encoder_opus", > "../api/units:time_delta", >+ "../call:mock_bitrate_allocator", > "../call:mock_call_interfaces", > "../call:mock_rtp_interfaces", > "../call:rtp_interfaces", > "../call:rtp_receiver", > "../common_audio", > "../logging:mocks", >+ "../logging:rtc_event_log_api", > "../modules/audio_device:mock_audio_device", >+ "../rtc_base:rtc_base_tests_utils", >+ >+ # For TestAudioDeviceModule >+ "../modules/audio_device:audio_device_impl", > "../modules/audio_mixer:audio_mixer_impl", > "../modules/audio_processing:audio_processing_statistics", > "../modules/audio_processing:mocks", >@@ -151,9 +158,9 @@ if (rtc_include_tests) { > "../modules/pacing:pacing", > "../modules/rtp_rtcp:mock_rtp_rtcp", > "../modules/rtp_rtcp:rtp_rtcp_format", >+ "../modules/utility", > "../rtc_base:checks", > "../rtc_base:rtc_base_approved", >- "../rtc_base:rtc_base_tests_utils", > "../rtc_base:rtc_task_queue", > "../rtc_base:safe_compare", > "../system_wrappers:system_wrappers", >@@ -265,6 +272,7 @@ if (rtc_include_tests) { > "../test:single_threaded_task_queue", > "../test:test_common", > "../test:test_main", >+ "../test:test_support", > "//testing/gtest", > "//third_party/abseil-cpp/absl/memory", > ] >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_receive_stream.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_receive_stream.cc >index 
563e8a036a303fde7879caeda6c523b8c2d043b9..8d4afe08e3767cbebb64d6dbde4a72cd23632ad3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_receive_stream.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_receive_stream.cc >@@ -13,14 +13,17 @@ > #include <string> > #include <utility> > >+#include "absl/memory/memory.h" >+#include "api/array_view.h" >+#include "api/audio_codecs/audio_format.h" > #include "api/call/audio_sink.h" >+#include "api/rtpparameters.h" > #include "audio/audio_send_stream.h" > #include "audio/audio_state.h" >-#include "audio/channel_receive_proxy.h" >+#include "audio/channel_receive.h" > #include "audio/conversion.h" >+#include "call/rtp_config.h" > #include "call/rtp_stream_receiver_controller_interface.h" >-#include "modules/remote_bitrate_estimator/include/remote_bitrate_estimator.h" >-#include "modules/rtp_rtcp/include/rtp_rtcp.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" > #include "rtc_base/strings/string_builder.h" >@@ -53,6 +56,7 @@ std::string AudioReceiveStream::Config::ToString() const { > ss << "{rtp: " << rtp.ToString(); > ss << ", rtcp_send_transport: " > << (rtcp_send_transport ? "(Transport)" : "null"); >+ ss << ", media_transport: " << (media_transport ? 
"(Transport)" : "null"); > if (!sync_group.empty()) { > ss << ", sync_group: " << sync_group; > } >@@ -62,7 +66,7 @@ std::string AudioReceiveStream::Config::ToString() const { > > namespace internal { > namespace { >-std::unique_ptr<voe::ChannelReceiveProxy> CreateChannelAndProxy( >+std::unique_ptr<voe::ChannelReceiveInterface> CreateChannelReceive( > webrtc::AudioState* audio_state, > ProcessThread* module_process_thread, > const webrtc::AudioReceiveStream::Config& config, >@@ -70,13 +74,13 @@ std::unique_ptr<voe::ChannelReceiveProxy> CreateChannelAndProxy( > RTC_DCHECK(audio_state); > internal::AudioState* internal_audio_state = > static_cast<internal::AudioState*>(audio_state); >- return absl::make_unique<voe::ChannelReceiveProxy>( >- absl::make_unique<voe::ChannelReceive>( >- module_process_thread, internal_audio_state->audio_device_module(), >- config.rtcp_send_transport, event_log, config.rtp.remote_ssrc, >- config.jitter_buffer_max_packets, >- config.jitter_buffer_fast_accelerate, config.decoder_factory, >- config.codec_pair_id, config.frame_decryptor)); >+ return voe::CreateChannelReceive( >+ module_process_thread, internal_audio_state->audio_device_module(), >+ config.media_transport, config.rtcp_send_transport, event_log, >+ config.rtp.remote_ssrc, config.jitter_buffer_max_packets, >+ config.jitter_buffer_fast_accelerate, config.jitter_buffer_min_delay_ms, >+ config.decoder_factory, config.codec_pair_id, config.frame_decryptor, >+ config.crypto_options); > } > } // namespace > >@@ -92,10 +96,10 @@ AudioReceiveStream::AudioReceiveStream( > config, > audio_state, > event_log, >- CreateChannelAndProxy(audio_state.get(), >- module_process_thread, >- config, >- event_log)) {} >+ CreateChannelReceive(audio_state.get(), >+ module_process_thread, >+ config, >+ event_log)) {} > > AudioReceiveStream::AudioReceiveStream( > RtpStreamReceiverControllerInterface* receiver_controller, >@@ -103,25 +107,26 @@ AudioReceiveStream::AudioReceiveStream( > const 
webrtc::AudioReceiveStream::Config& config, > const rtc::scoped_refptr<webrtc::AudioState>& audio_state, > webrtc::RtcEventLog* event_log, >- std::unique_ptr<voe::ChannelReceiveProxy> channel_proxy) >- : audio_state_(audio_state), channel_proxy_(std::move(channel_proxy)) { >+ std::unique_ptr<voe::ChannelReceiveInterface> channel_receive) >+ : audio_state_(audio_state), channel_receive_(std::move(channel_receive)) { > RTC_LOG(LS_INFO) << "AudioReceiveStream: " << config.rtp.remote_ssrc; >- RTC_DCHECK(receiver_controller); >- RTC_DCHECK(packet_router); > RTC_DCHECK(config.decoder_factory); > RTC_DCHECK(config.rtcp_send_transport); > RTC_DCHECK(audio_state_); >- RTC_DCHECK(channel_proxy_); >+ RTC_DCHECK(channel_receive_); > > module_process_thread_checker_.DetachFromThread(); > >- // Configure bandwidth estimation. >- channel_proxy_->RegisterReceiverCongestionControlObjects(packet_router); >- >- // Register with transport. >- rtp_stream_receiver_ = receiver_controller->CreateReceiver( >- config.rtp.remote_ssrc, channel_proxy_.get()); >+ if (!config.media_transport) { >+ RTC_DCHECK(receiver_controller); >+ RTC_DCHECK(packet_router); >+ // Configure bandwidth estimation. >+ channel_receive_->RegisterReceiverCongestionControlObjects(packet_router); > >+ // Register with transport. 
>+ rtp_stream_receiver_ = receiver_controller->CreateReceiver( >+ config.rtp.remote_ssrc, channel_receive_.get()); >+ } > ConfigureStream(this, config, true); > } > >@@ -129,8 +134,10 @@ AudioReceiveStream::~AudioReceiveStream() { > RTC_DCHECK_RUN_ON(&worker_thread_checker_); > RTC_LOG(LS_INFO) << "~AudioReceiveStream: " << config_.rtp.remote_ssrc; > Stop(); >- channel_proxy_->DisassociateSendChannel(); >- channel_proxy_->ResetReceiverCongestionControlObjects(); >+ channel_receive_->SetAssociatedSendChannel(nullptr); >+ if (!config_.media_transport) { >+ channel_receive_->ResetReceiverCongestionControlObjects(); >+ } > } > > void AudioReceiveStream::Reconfigure( >@@ -144,7 +151,7 @@ void AudioReceiveStream::Start() { > if (playing_) { > return; > } >- channel_proxy_->StartPlayout(); >+ channel_receive_->StartPlayout(); > playing_ = true; > audio_state()->AddReceivingStream(this); > } >@@ -154,7 +161,7 @@ void AudioReceiveStream::Stop() { > if (!playing_) { > return; > } >- channel_proxy_->StopPlayout(); >+ channel_receive_->StopPlayout(); > playing_ = false; > audio_state()->RemoveReceivingStream(this); > } >@@ -165,11 +172,11 @@ webrtc::AudioReceiveStream::Stats AudioReceiveStream::GetStats() const { > stats.remote_ssrc = config_.rtp.remote_ssrc; > > webrtc::CallReceiveStatistics call_stats = >- channel_proxy_->GetRTCPStatistics(); >+ channel_receive_->GetRTCPStatistics(); > // TODO(solenberg): Don't return here if we can't get the codec - return the > // stats we *can* get. 
> webrtc::CodecInst codec_inst = {0}; >- if (!channel_proxy_->GetRecCodec(&codec_inst)) { >+ if (!channel_receive_->GetRecCodec(&codec_inst)) { > return stats; > } > >@@ -186,13 +193,13 @@ webrtc::AudioReceiveStream::Stats AudioReceiveStream::GetStats() const { > if (codec_inst.plfreq / 1000 > 0) { > stats.jitter_ms = call_stats.jitterSamples / (codec_inst.plfreq / 1000); > } >- stats.delay_estimate_ms = channel_proxy_->GetDelayEstimate(); >- stats.audio_level = channel_proxy_->GetSpeechOutputLevelFullRange(); >- stats.total_output_energy = channel_proxy_->GetTotalOutputEnergy(); >- stats.total_output_duration = channel_proxy_->GetTotalOutputDuration(); >+ stats.delay_estimate_ms = channel_receive_->GetDelayEstimate(); >+ stats.audio_level = channel_receive_->GetSpeechOutputLevelFullRange(); >+ stats.total_output_energy = channel_receive_->GetTotalOutputEnergy(); >+ stats.total_output_duration = channel_receive_->GetTotalOutputDuration(); > > // Get jitter buffer and total delay (alg + jitter + playout) stats. 
>- auto ns = channel_proxy_->GetNetworkStatistics(); >+ auto ns = channel_receive_->GetNetworkStatistics(); > stats.jitter_buffer_ms = ns.currentBufferSize; > stats.jitter_buffer_preferred_ms = ns.preferredBufferSize; > stats.total_samples_received = ns.totalSamplesReceived; >@@ -207,8 +214,10 @@ webrtc::AudioReceiveStream::Stats AudioReceiveStream::GetStats() const { > stats.secondary_discarded_rate = Q14ToFloat(ns.currentSecondaryDiscardedRate); > stats.accelerate_rate = Q14ToFloat(ns.currentAccelerateRate); > stats.preemptive_expand_rate = Q14ToFloat(ns.currentPreemptiveRate); >+ stats.jitter_buffer_flushes = ns.packetBufferFlushes; >+ stats.delayed_packet_outage_samples = ns.delayedPacketOutageSamples; > >- auto ds = channel_proxy_->GetDecodingCallStatistics(); >+ auto ds = channel_receive_->GetDecodingCallStatistics(); > stats.decoding_calls_to_silence_generator = ds.calls_to_silence_generator; > stats.decoding_calls_to_neteq = ds.calls_to_neteq; > stats.decoding_normal = ds.decoded_normal; >@@ -222,23 +231,23 @@ webrtc::AudioReceiveStream::Stats AudioReceiveStream::GetStats() const { > > void AudioReceiveStream::SetSink(AudioSinkInterface* sink) { > RTC_DCHECK_RUN_ON(&worker_thread_checker_); >- channel_proxy_->SetSink(sink); >+ channel_receive_->SetSink(sink); > } > > void AudioReceiveStream::SetGain(float gain) { > RTC_DCHECK_RUN_ON(&worker_thread_checker_); >- channel_proxy_->SetChannelOutputVolumeScaling(gain); >+ channel_receive_->SetChannelOutputVolumeScaling(gain); > } > > std::vector<RtpSource> AudioReceiveStream::GetSources() const { > RTC_DCHECK_RUN_ON(&worker_thread_checker_); >- return channel_proxy_->GetSources(); >+ return channel_receive_->GetSources(); > } > > AudioMixer::Source::AudioFrameInfo AudioReceiveStream::GetAudioFrameWithInfo( > int sample_rate_hz, > AudioFrame* audio_frame) { >- return channel_proxy_->GetAudioFrameWithInfo(sample_rate_hz, audio_frame); >+ return channel_receive_->GetAudioFrameWithInfo(sample_rate_hz, audio_frame); > 
} > > int AudioReceiveStream::Ssrc() const { >@@ -246,7 +255,7 @@ int AudioReceiveStream::Ssrc() const { > } > > int AudioReceiveStream::PreferredSampleRate() const { >- return channel_proxy_->PreferredSampleRate(); >+ return channel_receive_->PreferredSampleRate(); > } > > int AudioReceiveStream::id() const { >@@ -256,32 +265,29 @@ int AudioReceiveStream::id() const { > > absl::optional<Syncable::Info> AudioReceiveStream::GetInfo() const { > RTC_DCHECK_RUN_ON(&module_process_thread_checker_); >- absl::optional<Syncable::Info> info = channel_proxy_->GetSyncInfo(); >+ absl::optional<Syncable::Info> info = channel_receive_->GetSyncInfo(); > > if (!info) > return absl::nullopt; > >- info->current_delay_ms = channel_proxy_->GetDelayEstimate(); >+ info->current_delay_ms = channel_receive_->GetDelayEstimate(); > return info; > } > > uint32_t AudioReceiveStream::GetPlayoutTimestamp() const { > // Called on video capture thread. >- return channel_proxy_->GetPlayoutTimestamp(); >+ return channel_receive_->GetPlayoutTimestamp(); > } > > void AudioReceiveStream::SetMinimumPlayoutDelay(int delay_ms) { > RTC_DCHECK_RUN_ON(&module_process_thread_checker_); >- return channel_proxy_->SetMinimumPlayoutDelay(delay_ms); >+ return channel_receive_->SetMinimumPlayoutDelay(delay_ms); > } > > void AudioReceiveStream::AssociateSendStream(AudioSendStream* send_stream) { > RTC_DCHECK_RUN_ON(&worker_thread_checker_); >- if (send_stream) { >- channel_proxy_->AssociateSendChannel(send_stream->GetChannelProxy()); >- } else { >- channel_proxy_->DisassociateSendChannel(); >- } >+ channel_receive_->SetAssociatedSendChannel( >+ send_stream ? send_stream->GetChannel() : nullptr); > associated_send_stream_ = send_stream; > } > >@@ -294,7 +300,7 @@ bool AudioReceiveStream::DeliverRtcp(const uint8_t* packet, size_t length) { > // calls on the worker thread. We should move towards always using a network > // thread. Then this check can be enabled. 
> // RTC_DCHECK(!thread_checker_.CalledOnValidThread()); >- return channel_proxy_->ReceivedRTCPPacket(packet, length); >+ return channel_receive_->ReceivedRTCPPacket(packet, length); > } > > void AudioReceiveStream::OnRtpPacket(const RtpPacketReceived& packet) { >@@ -302,7 +308,7 @@ void AudioReceiveStream::OnRtpPacket(const RtpPacketReceived& packet) { > // calls on the worker thread. We should move towards always using a network > // thread. Then this check can be enabled. > // RTC_DCHECK(!thread_checker_.CalledOnValidThread()); >- channel_proxy_->OnRtpPacket(packet); >+ channel_receive_->OnRtpPacket(packet); > } > > const webrtc::AudioReceiveStream::Config& AudioReceiveStream::config() const { >@@ -328,7 +334,7 @@ void AudioReceiveStream::ConfigureStream(AudioReceiveStream* stream, > RTC_LOG(LS_INFO) << "AudioReceiveStream::ConfigureStream: " > << new_config.ToString(); > RTC_DCHECK(stream); >- const auto& channel_proxy = stream->channel_proxy_; >+ const auto& channel_receive = stream->channel_receive_; > const auto& old_config = stream->config_; > > // Configuration parameters which cannot be changed. >@@ -342,7 +348,7 @@ void AudioReceiveStream::ConfigureStream(AudioReceiveStream* stream, > old_config.decoder_factory == new_config.decoder_factory); > > if (first_time || old_config.rtp.local_ssrc != new_config.rtp.local_ssrc) { >- channel_proxy->SetLocalSSRC(new_config.rtp.local_ssrc); >+ channel_receive->SetLocalSSRC(new_config.rtp.local_ssrc); > } > > if (!first_time) { >@@ -354,11 +360,11 @@ void AudioReceiveStream::ConfigureStream(AudioReceiveStream* stream, > // using the actual packet size for the configured codec. 
> if (first_time || old_config.rtp.nack.rtp_history_ms != > new_config.rtp.nack.rtp_history_ms) { >- channel_proxy->SetNACKStatus(new_config.rtp.nack.rtp_history_ms != 0, >- new_config.rtp.nack.rtp_history_ms / 20); >+ channel_receive->SetNACKStatus(new_config.rtp.nack.rtp_history_ms != 0, >+ new_config.rtp.nack.rtp_history_ms / 20); > } > if (first_time || old_config.decoder_map != new_config.decoder_map) { >- channel_proxy->SetReceiveCodecs(new_config.decoder_map); >+ channel_receive->SetReceiveCodecs(new_config.decoder_map); > } > > stream->config_ = new_config; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_receive_stream.h b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_receive_stream.h >index e982b04d54b8e7f273f338959d820676a8a94eb6..86bcb1c6a2db425ef46b236e86b426b09e3d7337 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_receive_stream.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_receive_stream.h >@@ -18,7 +18,6 @@ > #include "api/rtp_headers.h" > #include "audio/audio_state.h" > #include "call/audio_receive_stream.h" >-#include "call/rtp_packet_sink_interface.h" > #include "call/syncable.h" > #include "rtc_base/constructormagic.h" > #include "rtc_base/thread_checker.h" >@@ -32,7 +31,7 @@ class RtpStreamReceiverControllerInterface; > class RtpStreamReceiverInterface; > > namespace voe { >-class ChannelReceiveProxy; >+class ChannelReceiveInterface; > } // namespace voe > > namespace internal { >@@ -48,13 +47,14 @@ class AudioReceiveStream final : public webrtc::AudioReceiveStream, > const webrtc::AudioReceiveStream::Config& config, > const rtc::scoped_refptr<webrtc::AudioState>& audio_state, > webrtc::RtcEventLog* event_log); >- // For unit tests, which need to supply a mock channel proxy. 
>- AudioReceiveStream(RtpStreamReceiverControllerInterface* receiver_controller, >- PacketRouter* packet_router, >- const webrtc::AudioReceiveStream::Config& config, >- const rtc::scoped_refptr<webrtc::AudioState>& audio_state, >- webrtc::RtcEventLog* event_log, >- std::unique_ptr<voe::ChannelReceiveProxy> channel_proxy); >+ // For unit tests, which need to supply a mock channel receive. >+ AudioReceiveStream( >+ RtpStreamReceiverControllerInterface* receiver_controller, >+ PacketRouter* packet_router, >+ const webrtc::AudioReceiveStream::Config& config, >+ const rtc::scoped_refptr<webrtc::AudioState>& audio_state, >+ webrtc::RtcEventLog* event_log, >+ std::unique_ptr<voe::ChannelReceiveInterface> channel_receive); > ~AudioReceiveStream() override; > > // webrtc::AudioReceiveStream implementation. >@@ -101,7 +101,7 @@ class AudioReceiveStream final : public webrtc::AudioReceiveStream, > rtc::ThreadChecker module_process_thread_checker_; > webrtc::AudioReceiveStream::Config config_; > rtc::scoped_refptr<webrtc::AudioState> audio_state_; >- std::unique_ptr<voe::ChannelReceiveProxy> channel_proxy_; >+ const std::unique_ptr<voe::ChannelReceiveInterface> channel_receive_; > AudioSendStream* associated_send_stream_ = nullptr; > > bool playing_ RTC_GUARDED_BY(worker_thread_checker_) = false; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_receive_stream_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_receive_stream_unittest.cc >index 97c42c4383f8436342b477036f6dc57f05979433..742281085665152728efc2d1789a83a713654696 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_receive_stream_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_receive_stream_unittest.cc >@@ -13,6 +13,7 @@ > #include <vector> > > #include "api/test/mock_audio_mixer.h" >+#include "api/test/mock_frame_decryptor.h" > #include "audio/audio_receive_stream.h" > #include "audio/conversion.h" > #include 
"audio/mock_voe_channel_proxy.h" >@@ -82,16 +83,16 @@ struct ConfigHelper { > new rtc::RefCountedObject<testing::NiceMock<MockAudioDeviceModule>>(); > audio_state_ = AudioState::Create(config); > >- channel_proxy_ = new testing::StrictMock<MockChannelReceiveProxy>(); >- EXPECT_CALL(*channel_proxy_, SetLocalSSRC(kLocalSsrc)).Times(1); >- EXPECT_CALL(*channel_proxy_, SetNACKStatus(true, 15)).Times(1); >- EXPECT_CALL(*channel_proxy_, >+ channel_receive_ = new testing::StrictMock<MockChannelReceive>(); >+ EXPECT_CALL(*channel_receive_, SetLocalSSRC(kLocalSsrc)).Times(1); >+ EXPECT_CALL(*channel_receive_, SetNACKStatus(true, 15)).Times(1); >+ EXPECT_CALL(*channel_receive_, > RegisterReceiverCongestionControlObjects(&packet_router_)) > .Times(1); >- EXPECT_CALL(*channel_proxy_, ResetReceiverCongestionControlObjects()) >+ EXPECT_CALL(*channel_receive_, ResetReceiverCongestionControlObjects()) > .Times(1); >- EXPECT_CALL(*channel_proxy_, DisassociateSendChannel()).Times(1); >- EXPECT_CALL(*channel_proxy_, SetReceiveCodecs(_)) >+ EXPECT_CALL(*channel_receive_, SetAssociatedSendChannel(nullptr)).Times(1); >+ EXPECT_CALL(*channel_receive_, SetReceiveCodecs(_)) > .WillRepeatedly(Invoke([](const std::map<int, SdpAudioFormat>& codecs) { > EXPECT_THAT(codecs, testing::IsEmpty()); > })); >@@ -113,33 +114,33 @@ struct ConfigHelper { > new internal::AudioReceiveStream( > &rtp_stream_receiver_controller_, &packet_router_, stream_config_, > audio_state_, &event_log_, >- std::unique_ptr<voe::ChannelReceiveProxy>(channel_proxy_))); >+ std::unique_ptr<voe::ChannelReceiveInterface>(channel_receive_))); > } > > AudioReceiveStream::Config& config() { return stream_config_; } > rtc::scoped_refptr<MockAudioMixer> audio_mixer() { return audio_mixer_; } >- MockChannelReceiveProxy* channel_proxy() { return channel_proxy_; } >+ MockChannelReceive* channel_receive() { return channel_receive_; } > > void SetupMockForGetStats() { > using testing::DoAll; > using testing::SetArgPointee; > >- 
ASSERT_TRUE(channel_proxy_); >- EXPECT_CALL(*channel_proxy_, GetRTCPStatistics()) >+ ASSERT_TRUE(channel_receive_); >+ EXPECT_CALL(*channel_receive_, GetRTCPStatistics()) > .WillOnce(Return(kCallStats)); >- EXPECT_CALL(*channel_proxy_, GetDelayEstimate()) >+ EXPECT_CALL(*channel_receive_, GetDelayEstimate()) > .WillOnce(Return(kJitterBufferDelay + kPlayoutBufferDelay)); >- EXPECT_CALL(*channel_proxy_, GetSpeechOutputLevelFullRange()) >+ EXPECT_CALL(*channel_receive_, GetSpeechOutputLevelFullRange()) > .WillOnce(Return(kSpeechOutputLevel)); >- EXPECT_CALL(*channel_proxy_, GetTotalOutputEnergy()) >+ EXPECT_CALL(*channel_receive_, GetTotalOutputEnergy()) > .WillOnce(Return(kTotalOutputEnergy)); >- EXPECT_CALL(*channel_proxy_, GetTotalOutputDuration()) >+ EXPECT_CALL(*channel_receive_, GetTotalOutputDuration()) > .WillOnce(Return(kTotalOutputDuration)); >- EXPECT_CALL(*channel_proxy_, GetNetworkStatistics()) >+ EXPECT_CALL(*channel_receive_, GetNetworkStatistics()) > .WillOnce(Return(kNetworkStats)); >- EXPECT_CALL(*channel_proxy_, GetDecodingCallStatistics()) >+ EXPECT_CALL(*channel_receive_, GetDecodingCallStatistics()) > .WillOnce(Return(kAudioDecodeStats)); >- EXPECT_CALL(*channel_proxy_, GetRecCodec(_)) >+ EXPECT_CALL(*channel_receive_, GetRecCodec(_)) > .WillOnce(DoAll(SetArgPointee<0>(kCodecInst), Return(true))); > } > >@@ -149,7 +150,7 @@ struct ConfigHelper { > rtc::scoped_refptr<AudioState> audio_state_; > rtc::scoped_refptr<MockAudioMixer> audio_mixer_; > AudioReceiveStream::Config stream_config_; >- testing::StrictMock<MockChannelReceiveProxy>* channel_proxy_ = nullptr; >+ testing::StrictMock<MockChannelReceive>* channel_receive_ = nullptr; > RtpStreamReceiverController rtp_stream_receiver_controller_; > MockTransport rtcp_send_transport_; > }; >@@ -216,7 +217,7 @@ TEST(AudioReceiveStreamTest, ConfigToString) { > "{rtp: {remote_ssrc: 1234, local_ssrc: 5678, transport_cc: off, nack: " > "{rtp_history_ms: 0}, extensions: [{uri: " > 
"urn:ietf:params:rtp-hdrext:ssrc-audio-level, id: 3}]}, " >- "rtcp_send_transport: null}", >+ "rtcp_send_transport: null, media_transport: null}", > config.ToString()); > } > >@@ -238,7 +239,7 @@ TEST(AudioReceiveStreamTest, ReceiveRtpPacket) { > ASSERT_TRUE(parsed_packet.Parse(&rtp_packet[0], rtp_packet.size())); > parsed_packet.set_arrival_time_ms((packet_time_us + 500) / 1000); > >- EXPECT_CALL(*helper.channel_proxy(), >+ EXPECT_CALL(*helper.channel_receive(), > OnRtpPacket(testing::Ref(parsed_packet))); > > recv_stream->OnRtpPacket(parsed_packet); >@@ -249,7 +250,7 @@ TEST(AudioReceiveStreamTest, ReceiveRtcpPacket) { > helper.config().rtp.transport_cc = true; > auto recv_stream = helper.CreateAudioReceiveStream(); > std::vector<uint8_t> rtcp_packet = CreateRtcpSenderReport(); >- EXPECT_CALL(*helper.channel_proxy(), >+ EXPECT_CALL(*helper.channel_receive(), > ReceivedRTCPPacket(&rtcp_packet[0], rtcp_packet.size())) > .WillOnce(Return(true)); > EXPECT_TRUE(recv_stream->DeliverRtcp(&rtcp_packet[0], rtcp_packet.size())); >@@ -311,7 +312,7 @@ TEST(AudioReceiveStreamTest, GetStats) { > TEST(AudioReceiveStreamTest, SetGain) { > ConfigHelper helper; > auto recv_stream = helper.CreateAudioReceiveStream(); >- EXPECT_CALL(*helper.channel_proxy(), >+ EXPECT_CALL(*helper.channel_receive(), > SetChannelOutputVolumeScaling(FloatEq(0.765f))); > recv_stream->SetGain(0.765f); > } >@@ -322,10 +323,10 @@ TEST(AudioReceiveStreamTest, StreamsShouldBeAddedToMixerOnceOnStart) { > auto recv_stream1 = helper1.CreateAudioReceiveStream(); > auto recv_stream2 = helper2.CreateAudioReceiveStream(); > >- EXPECT_CALL(*helper1.channel_proxy(), StartPlayout()).Times(1); >- EXPECT_CALL(*helper2.channel_proxy(), StartPlayout()).Times(1); >- EXPECT_CALL(*helper1.channel_proxy(), StopPlayout()).Times(1); >- EXPECT_CALL(*helper2.channel_proxy(), StopPlayout()).Times(1); >+ EXPECT_CALL(*helper1.channel_receive(), StartPlayout()).Times(1); >+ EXPECT_CALL(*helper2.channel_receive(), 
StartPlayout()).Times(1); >+ EXPECT_CALL(*helper1.channel_receive(), StopPlayout()).Times(1); >+ EXPECT_CALL(*helper2.channel_receive(), StopPlayout()).Times(1); > EXPECT_CALL(*helper1.audio_mixer(), AddSource(recv_stream1.get())) > .WillOnce(Return(true)); > EXPECT_CALL(*helper1.audio_mixer(), AddSource(recv_stream2.get())) >@@ -366,12 +367,32 @@ TEST(AudioReceiveStreamTest, ReconfigureWithUpdatedConfig) { > kTransportSequenceNumberId + 1)); > new_config.decoder_map.emplace(1, SdpAudioFormat("foo", 8000, 1)); > >- MockChannelReceiveProxy& channel_proxy = *helper.channel_proxy(); >- EXPECT_CALL(channel_proxy, SetLocalSSRC(kLocalSsrc + 1)).Times(1); >- EXPECT_CALL(channel_proxy, SetNACKStatus(true, 15 + 1)).Times(1); >- EXPECT_CALL(channel_proxy, SetReceiveCodecs(new_config.decoder_map)); >+ MockChannelReceive& channel_receive = *helper.channel_receive(); >+ EXPECT_CALL(channel_receive, SetLocalSSRC(kLocalSsrc + 1)).Times(1); >+ EXPECT_CALL(channel_receive, SetNACKStatus(true, 15 + 1)).Times(1); >+ EXPECT_CALL(channel_receive, SetReceiveCodecs(new_config.decoder_map)); > > recv_stream->Reconfigure(new_config); > } >+ >+TEST(AudioReceiveStreamTest, ReconfigureWithFrameDecryptor) { >+ ConfigHelper helper; >+ auto recv_stream = helper.CreateAudioReceiveStream(); >+ >+ auto new_config_0 = helper.config(); >+ rtc::scoped_refptr<FrameDecryptorInterface> mock_frame_decryptor_0( >+ new rtc::RefCountedObject<MockFrameDecryptor>()); >+ new_config_0.frame_decryptor = mock_frame_decryptor_0; >+ >+ recv_stream->Reconfigure(new_config_0); >+ >+ auto new_config_1 = helper.config(); >+ rtc::scoped_refptr<FrameDecryptorInterface> mock_frame_decryptor_1( >+ new rtc::RefCountedObject<MockFrameDecryptor>()); >+ new_config_1.frame_decryptor = mock_frame_decryptor_1; >+ new_config_1.crypto_options.sframe.require_frame_encryption = true; >+ recv_stream->Reconfigure(new_config_1); >+} >+ > } // namespace test > } // namespace webrtc >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_send_stream.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_send_stream.cc >index e82264a83cb1cb285e83e83adc40955c605e6cb5..75e6efb9e800fd0918ccbb5be9f1af7da8891b60 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_send_stream.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_send_stream.cc >@@ -15,12 +15,22 @@ > #include <vector> > > #include "absl/memory/memory.h" >- >+#include "api/audio_codecs/audio_encoder.h" >+#include "api/audio_codecs/audio_encoder_factory.h" >+#include "api/audio_codecs/audio_format.h" >+#include "api/call/transport.h" >+#include "api/crypto/frameencryptorinterface.h" > #include "audio/audio_state.h" >-#include "audio/channel_send_proxy.h" >+#include "audio/channel_send.h" > #include "audio/conversion.h" >+#include "call/rtp_config.h" > #include "call/rtp_transport_controller_send_interface.h" >+#include "common_audio/vad/include/vad.h" >+#include "logging/rtc_event_log/events/rtc_event_audio_send_stream_config.h" >+#include "logging/rtc_event_log/rtc_event_log.h" >+#include "logging/rtc_event_log/rtc_stream_config.h" > #include "modules/audio_coding/codecs/cng/audio_encoder_cng.h" >+#include "modules/audio_processing/include/audio_processing.h" > #include "rtc_base/checks.h" > #include "rtc_base/event.h" > #include "rtc_base/function_view.h" >@@ -38,130 +48,132 @@ constexpr size_t kPacketLossTrackerMaxWindowSizeMs = 15000; > constexpr size_t kPacketLossRateMinNumAckedPackets = 50; > constexpr size_t kRecoverablePacketLossRateMinNumAckedPairs = 40; > >-void CallEncoder(const std::unique_ptr<voe::ChannelSendProxy>& channel_proxy, >+void CallEncoder(const std::unique_ptr<voe::ChannelSendInterface>& channel_send, > rtc::FunctionView<void(AudioEncoder*)> lambda) { >- channel_proxy->ModifyEncoder([&](std::unique_ptr<AudioEncoder>* encoder_ptr) { >+ channel_send->ModifyEncoder([&](std::unique_ptr<AudioEncoder>* encoder_ptr) { > 
RTC_DCHECK(encoder_ptr); > lambda(encoder_ptr->get()); > }); > } > >-std::unique_ptr<voe::ChannelSendProxy> CreateChannelAndProxy( >- rtc::TaskQueue* worker_queue, >- ProcessThread* module_process_thread, >- RtcpRttStats* rtcp_rtt_stats, >- RtcEventLog* event_log, >- FrameEncryptorInterface* frame_encryptor) { >- return absl::make_unique<voe::ChannelSendProxy>( >- absl::make_unique<voe::ChannelSend>(worker_queue, module_process_thread, >- rtcp_rtt_stats, event_log, >- frame_encryptor)); >-} >-} // namespace >- >-// Helper class to track the actively sending lifetime of this stream. >-class AudioSendStream::TimedTransport : public Transport { >- public: >- TimedTransport(Transport* transport, TimeInterval* time_interval) >- : transport_(transport), lifetime_(time_interval) {} >- bool SendRtp(const uint8_t* packet, >- size_t length, >- const PacketOptions& options) { >- if (lifetime_) { >- lifetime_->Extend(); >+void UpdateEventLogStreamConfig(RtcEventLog* event_log, >+ const AudioSendStream::Config& config, >+ const AudioSendStream::Config* old_config) { >+ using SendCodecSpec = AudioSendStream::Config::SendCodecSpec; >+ // Only update if any of the things we log have changed. 
>+ auto payload_types_equal = [](const absl::optional<SendCodecSpec>& a, >+ const absl::optional<SendCodecSpec>& b) { >+ if (a.has_value() && b.has_value()) { >+ return a->format.name == b->format.name && >+ a->payload_type == b->payload_type; > } >- return transport_->SendRtp(packet, length, options); >+ return !a.has_value() && !b.has_value(); >+ }; >+ >+ if (old_config && config.rtp.ssrc == old_config->rtp.ssrc && >+ config.rtp.extensions == old_config->rtp.extensions && >+ payload_types_equal(config.send_codec_spec, >+ old_config->send_codec_spec)) { >+ return; > } >- bool SendRtcp(const uint8_t* packet, size_t length) { >- return transport_->SendRtcp(packet, length); >+ >+ auto rtclog_config = absl::make_unique<rtclog::StreamConfig>(); >+ rtclog_config->local_ssrc = config.rtp.ssrc; >+ rtclog_config->rtp_extensions = config.rtp.extensions; >+ if (config.send_codec_spec) { >+ rtclog_config->codecs.emplace_back(config.send_codec_spec->format.name, >+ config.send_codec_spec->payload_type, 0); > } >- ~TimedTransport() {} >+ event_log->Log(absl::make_unique<RtcEventAudioSendStreamConfig>( >+ std::move(rtclog_config))); >+} > >- private: >- Transport* transport_; >- TimeInterval* lifetime_; >-}; >+} // namespace > > AudioSendStream::AudioSendStream( > const webrtc::AudioSendStream::Config& config, > const rtc::scoped_refptr<webrtc::AudioState>& audio_state, > rtc::TaskQueue* worker_queue, > ProcessThread* module_process_thread, >- RtpTransportControllerSendInterface* transport, >- BitrateAllocator* bitrate_allocator, >+ RtpTransportControllerSendInterface* rtp_transport, >+ BitrateAllocatorInterface* bitrate_allocator, > RtcEventLog* event_log, > RtcpRttStats* rtcp_rtt_stats, >- const absl::optional<RtpState>& suspended_rtp_state, >- TimeInterval* overall_call_lifetime) >+ const absl::optional<RtpState>& suspended_rtp_state) > : AudioSendStream(config, > audio_state, > worker_queue, >- transport, >+ rtp_transport, > bitrate_allocator, > event_log, > rtcp_rtt_stats, 
> suspended_rtp_state, >- overall_call_lifetime, >- CreateChannelAndProxy(worker_queue, >- module_process_thread, >- rtcp_rtt_stats, >- event_log, >- config.frame_encryptor)) {} >+ voe::CreateChannelSend(worker_queue, >+ module_process_thread, >+ config.media_transport, >+ config.send_transport, >+ rtcp_rtt_stats, >+ event_log, >+ config.frame_encryptor, >+ config.crypto_options, >+ config.rtp.extmap_allow_mixed, >+ config.rtcp_report_interval_ms)) {} > > AudioSendStream::AudioSendStream( > const webrtc::AudioSendStream::Config& config, > const rtc::scoped_refptr<webrtc::AudioState>& audio_state, > rtc::TaskQueue* worker_queue, >- RtpTransportControllerSendInterface* transport, >- BitrateAllocator* bitrate_allocator, >+ RtpTransportControllerSendInterface* rtp_transport, >+ BitrateAllocatorInterface* bitrate_allocator, > RtcEventLog* event_log, > RtcpRttStats* rtcp_rtt_stats, > const absl::optional<RtpState>& suspended_rtp_state, >- TimeInterval* overall_call_lifetime, >- std::unique_ptr<voe::ChannelSendProxy> channel_proxy) >+ std::unique_ptr<voe::ChannelSendInterface> channel_send) > : worker_queue_(worker_queue), >- config_(Config(nullptr)), >+ config_(Config(/*send_transport=*/nullptr, >+ /*media_transport=*/nullptr)), > audio_state_(audio_state), >- channel_proxy_(std::move(channel_proxy)), >+ channel_send_(std::move(channel_send)), > event_log_(event_log), > bitrate_allocator_(bitrate_allocator), >- transport_(transport), >+ rtp_transport_(rtp_transport), > packet_loss_tracker_(kPacketLossTrackerMaxWindowSizeMs, > kPacketLossRateMinNumAckedPackets, > kRecoverablePacketLossRateMinNumAckedPairs), > rtp_rtcp_module_(nullptr), >- suspended_rtp_state_(suspended_rtp_state), >- overall_call_lifetime_(overall_call_lifetime) { >+ suspended_rtp_state_(suspended_rtp_state) { > RTC_LOG(LS_INFO) << "AudioSendStream: " << config.rtp.ssrc; > RTC_DCHECK(worker_queue_); > RTC_DCHECK(audio_state_); >- RTC_DCHECK(channel_proxy_); >+ RTC_DCHECK(channel_send_); > 
RTC_DCHECK(bitrate_allocator_); >- RTC_DCHECK(transport); >- RTC_DCHECK(overall_call_lifetime_); >+ // TODO(nisse): Eventually, we should have only media_transport. But for the >+ // time being, we can have either. When media transport is injected, there >+ // should be no rtp_transport, and below check should be strengthened to XOR >+ // (either rtp_transport or media_transport but not both). >+ RTC_DCHECK(rtp_transport || config.media_transport); > >- channel_proxy_->SetRTCPStatus(true); >- rtp_rtcp_module_ = channel_proxy_->GetRtpRtcp(); >+ rtp_rtcp_module_ = channel_send_->GetRtpRtcp(); > RTC_DCHECK(rtp_rtcp_module_); > > ConfigureStream(this, config, true); > > pacer_thread_checker_.DetachFromThread(); >- // Signal congestion controller this object is ready for OnPacket* callbacks. >- transport_->RegisterPacketFeedbackObserver(this); >+ if (rtp_transport_) { >+ // Signal congestion controller this object is ready for OnPacket* >+ // callbacks. >+ rtp_transport_->RegisterPacketFeedbackObserver(this); >+ } > } > > AudioSendStream::~AudioSendStream() { > RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); > RTC_LOG(LS_INFO) << "~AudioSendStream: " << config_.rtp.ssrc; > RTC_DCHECK(!sending_); >- transport_->DeRegisterPacketFeedbackObserver(this); >- channel_proxy_->RegisterTransport(nullptr); >- channel_proxy_->ResetSenderCongestionControlObjects(); >- // Lifetime can only be updated after deregistering >- // |timed_send_transport_adapter_| in the underlying channel object to avoid >- // data races in |active_lifetime_|. 
>- overall_call_lifetime_->Extend(active_lifetime_); >+ if (rtp_transport_) { >+ rtp_transport_->DeRegisterPacketFeedbackObserver(this); >+ channel_send_->ResetSenderCongestionControlObjects(); >+ } > } > > const webrtc::AudioSendStream::Config& AudioSendStream::GetConfig() const { >@@ -196,51 +208,42 @@ void AudioSendStream::ConfigureStream( > bool first_time) { > RTC_LOG(LS_INFO) << "AudioSendStream::ConfigureStream: " > << new_config.ToString(); >- const auto& channel_proxy = stream->channel_proxy_; >+ UpdateEventLogStreamConfig(stream->event_log_, new_config, >+ first_time ? nullptr : &stream->config_); >+ >+ const auto& channel_send = stream->channel_send_; > const auto& old_config = stream->config_; > >+ // Configuration parameters which cannot be changed. >+ RTC_DCHECK(first_time || >+ old_config.send_transport == new_config.send_transport); >+ > if (first_time || old_config.rtp.ssrc != new_config.rtp.ssrc) { >- channel_proxy->SetLocalSSRC(new_config.rtp.ssrc); >+ channel_send->SetLocalSSRC(new_config.rtp.ssrc); > if (stream->suspended_rtp_state_) { > stream->rtp_rtcp_module_->SetRtpState(*stream->suspended_rtp_state_); > } > } > if (first_time || old_config.rtp.c_name != new_config.rtp.c_name) { >- channel_proxy->SetRTCP_CNAME(new_config.rtp.c_name); >- } >- // TODO(solenberg): Config NACK history window (which is a packet count), >- // using the actual packet size for the configured codec. 
>- if (first_time || old_config.rtp.nack.rtp_history_ms != >- new_config.rtp.nack.rtp_history_ms) { >- channel_proxy->SetNACKStatus(new_config.rtp.nack.rtp_history_ms != 0, >- new_config.rtp.nack.rtp_history_ms / 20); >- } >- >- if (first_time || new_config.send_transport != old_config.send_transport) { >- if (old_config.send_transport) { >- channel_proxy->RegisterTransport(nullptr); >- } >- if (new_config.send_transport) { >- stream->timed_send_transport_adapter_.reset(new TimedTransport( >- new_config.send_transport, &stream->active_lifetime_)); >- } else { >- stream->timed_send_transport_adapter_.reset(nullptr); >- } >- channel_proxy->RegisterTransport( >- stream->timed_send_transport_adapter_.get()); >+ channel_send->SetRTCP_CNAME(new_config.rtp.c_name); > } > > // Enable the frame encryptor if a new frame encryptor has been provided. > if (first_time || new_config.frame_encryptor != old_config.frame_encryptor) { >- channel_proxy->SetFrameEncryptor(new_config.frame_encryptor); >+ channel_send->SetFrameEncryptor(new_config.frame_encryptor); >+ } >+ >+ if (first_time || >+ new_config.rtp.extmap_allow_mixed != old_config.rtp.extmap_allow_mixed) { >+ channel_send->SetExtmapAllowMixed(new_config.rtp.extmap_allow_mixed); > } > > const ExtensionIds old_ids = FindExtensionIds(old_config.rtp.extensions); > const ExtensionIds new_ids = FindExtensionIds(new_config.rtp.extensions); > // Audio level indication > if (first_time || new_ids.audio_level != old_ids.audio_level) { >- channel_proxy->SetSendAudioLevelIndicationStatus(new_ids.audio_level != 0, >- new_ids.audio_level); >+ channel_send->SetSendAudioLevelIndicationStatus(new_ids.audio_level != 0, >+ new_ids.audio_level); > } > bool transport_seq_num_id_changed = > new_ids.transport_sequence_number != old_ids.transport_sequence_number; >@@ -248,7 +251,7 @@ void AudioSendStream::ConfigureStream( > (transport_seq_num_id_changed && > !webrtc::field_trial::IsEnabled("WebRTC-Audio-ForceNoTWCC"))) { > if (!first_time) { >- 
channel_proxy->ResetSenderCongestionControlObjects(); >+ channel_send->ResetSenderCongestionControlObjects(); > } > > RtcpBandwidthObserver* bandwidth_observer = nullptr; >@@ -256,24 +259,26 @@ void AudioSendStream::ConfigureStream( > new_ids.transport_sequence_number != 0 && > !webrtc::field_trial::IsEnabled("WebRTC-Audio-ForceNoTWCC"); > if (has_transport_sequence_number) { >- channel_proxy->EnableSendTransportSequenceNumber( >+ channel_send->EnableSendTransportSequenceNumber( > new_ids.transport_sequence_number); > // Probing in application limited region is only used in combination with > // send side congestion control, wich depends on feedback packets which > // requires transport sequence numbers to be enabled. >- stream->transport_->EnablePeriodicAlrProbing(true); >- bandwidth_observer = stream->transport_->GetBandwidthObserver(); >+ if (stream->rtp_transport_) { >+ stream->rtp_transport_->EnablePeriodicAlrProbing(true); >+ bandwidth_observer = stream->rtp_transport_->GetBandwidthObserver(); >+ } >+ } >+ if (stream->rtp_transport_) { >+ channel_send->RegisterSenderCongestionControlObjects( >+ stream->rtp_transport_, bandwidth_observer); > } >- >- channel_proxy->RegisterSenderCongestionControlObjects(stream->transport_, >- bandwidth_observer); > } >- > // MID RTP header extension. 
> if ((first_time || new_ids.mid != old_ids.mid || > new_config.rtp.mid != old_config.rtp.mid) && > new_ids.mid != 0 && !new_config.rtp.mid.empty()) { >- channel_proxy->SetMid(new_config.rtp.mid, new_ids.mid); >+ channel_send->SetMid(new_config.rtp.mid, new_ids.mid); > } > > if (!ReconfigureSendCodec(stream, new_config)) { >@@ -296,16 +301,20 @@ void AudioSendStream::Start() { > FindExtensionIds(config_.rtp.extensions).transport_sequence_number != 0 && > !webrtc::field_trial::IsEnabled("WebRTC-Audio-ForceNoTWCC"); > if (config_.min_bitrate_bps != -1 && config_.max_bitrate_bps != -1 && >+ !config_.has_dscp && > (has_transport_sequence_number || > !webrtc::field_trial::IsEnabled("WebRTC-Audio-SendSideBwe") || > webrtc::field_trial::IsEnabled("WebRTC-Audio-ABWENoTWCC"))) { > // Audio BWE is enabled. >- transport_->packet_sender()->SetAccountForAudioPackets(true); >+ rtp_transport_->packet_sender()->SetAccountForAudioPackets(true); >+ rtp_rtcp_module_->SetAsPartOfAllocation(true); > ConfigureBitrateObserver(config_.min_bitrate_bps, config_.max_bitrate_bps, > config_.bitrate_priority, > has_transport_sequence_number); >+ } else { >+ rtp_rtcp_module_->SetAsPartOfAllocation(false); > } >- channel_proxy_->StartSend(); >+ channel_send_->StartSend(); > sending_ = true; > audio_state()->AddSendingStream(this, encoder_sample_rate_hz_, > encoder_num_channels_); >@@ -318,14 +327,14 @@ void AudioSendStream::Stop() { > } > > RemoveBitrateObserver(); >- channel_proxy_->StopSend(); >+ channel_send_->StopSend(); > sending_ = false; > audio_state()->RemoveSendingStream(this); > } > > void AudioSendStream::SendAudioData(std::unique_ptr<AudioFrame> audio_frame) { > RTC_CHECK_RUNS_SERIALIZED(&audio_capture_race_checker_); >- channel_proxy_->ProcessAndEncodeAudio(std::move(audio_frame)); >+ channel_send_->ProcessAndEncodeAudio(std::move(audio_frame)); > } > > bool AudioSendStream::SendTelephoneEvent(int payload_type, >@@ -333,14 +342,14 @@ bool AudioSendStream::SendTelephoneEvent(int 
payload_type, > int event, > int duration_ms) { > RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- return channel_proxy_->SetSendTelephoneEventPayloadType(payload_type, >- payload_frequency) && >- channel_proxy_->SendTelephoneEventOutband(event, duration_ms); >+ return channel_send_->SetSendTelephoneEventPayloadType(payload_type, >+ payload_frequency) && >+ channel_send_->SendTelephoneEventOutband(event, duration_ms); > } > > void AudioSendStream::SetMuted(bool muted) { > RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- channel_proxy_->SetInputMute(muted); >+ channel_send_->SetInputMute(muted); > } > > webrtc::AudioSendStream::Stats AudioSendStream::GetStats() const { >@@ -352,8 +361,9 @@ webrtc::AudioSendStream::Stats AudioSendStream::GetStats( > RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); > webrtc::AudioSendStream::Stats stats; > stats.local_ssrc = config_.rtp.ssrc; >+ stats.target_bitrate_bps = channel_send_->GetBitrate(); > >- webrtc::CallSendStatistics call_stats = channel_proxy_->GetRTCPStatistics(); >+ webrtc::CallSendStatistics call_stats = channel_send_->GetRTCPStatistics(); > stats.bytes_sent = call_stats.bytesSent; > stats.packets_sent = call_stats.packetsSent; > // RTT isn't known until a RTCP report is received. Until then, VoiceEngine >@@ -367,7 +377,7 @@ webrtc::AudioSendStream::Stats AudioSendStream::GetStats( > stats.codec_payload_type = spec.payload_type; > > // Get data from the last remote RTCP report. >- for (const auto& block : channel_proxy_->GetRemoteRTCPReportBlocks()) { >+ for (const auto& block : channel_send_->GetRemoteRTCPReportBlocks()) { > // Lookup report for send ssrc only. 
> if (block.source_SSRC == stats.local_ssrc) { > stats.packets_lost = block.cumulative_num_packets_lost; >@@ -389,7 +399,7 @@ webrtc::AudioSendStream::Stats AudioSendStream::GetStats( > stats.total_input_duration = input_stats.total_duration; > > stats.typing_noise_detected = audio_state()->typing_noise_detected(); >- stats.ana_statistics = channel_proxy_->GetANAStatistics(); >+ stats.ana_statistics = channel_send_->GetANAStatistics(); > RTC_DCHECK(audio_state_->audio_processing()); > stats.apm_statistics = > audio_state_->audio_processing()->GetStatistics(has_remote_tracks); >@@ -406,27 +416,24 @@ bool AudioSendStream::DeliverRtcp(const uint8_t* packet, size_t length) { > // calls on the worker thread. We should move towards always using a network > // thread. Then this check can be enabled. > // RTC_DCHECK(!worker_thread_checker_.CalledOnValidThread()); >- return channel_proxy_->ReceivedRTCPPacket(packet, length); >+ return channel_send_->ReceivedRTCPPacket(packet, length); > } > >-uint32_t AudioSendStream::OnBitrateUpdated(uint32_t bitrate_bps, >- uint8_t fraction_loss, >- int64_t rtt, >- int64_t bwe_period_ms) { >+uint32_t AudioSendStream::OnBitrateUpdated(BitrateAllocationUpdate update) { > // A send stream may be allocated a bitrate of zero if the allocator decides > // to disable it. For now we ignore this decision and keep sending on min > // bitrate. >- if (bitrate_bps == 0) { >- bitrate_bps = config_.min_bitrate_bps; >+ if (update.target_bitrate.IsZero()) { >+ update.target_bitrate = DataRate::bps(config_.min_bitrate_bps); > } >- RTC_DCHECK_GE(bitrate_bps, static_cast<uint32_t>(config_.min_bitrate_bps)); >+ RTC_DCHECK_GE(update.target_bitrate.bps<int>(), config_.min_bitrate_bps); > // The bitrate allocator might allocate an higher than max configured bitrate > // if there is room, to allow for, as example, extra FEC. Ignore that for now. 
>- const uint32_t max_bitrate_bps = config_.max_bitrate_bps; >- if (bitrate_bps > max_bitrate_bps) >- bitrate_bps = max_bitrate_bps; >+ const DataRate max_bitrate = DataRate::bps(config_.max_bitrate_bps); >+ if (update.target_bitrate > max_bitrate) >+ update.target_bitrate = max_bitrate; > >- channel_proxy_->SetBitrate(bitrate_bps, bwe_period_ms); >+ channel_send_->OnBitrateAllocation(update); > > // The amount of audio protection is not exposed by the encoder, hence > // always returning 0. >@@ -460,25 +467,24 @@ void AudioSendStream::OnPacketFeedbackVector( > // the previously sent value is no longer relevant. This will be taken care > // of with some refactoring which is now being done. > if (plr) { >- channel_proxy_->OnTwccBasedUplinkPacketLossRate(*plr); >+ channel_send_->OnTwccBasedUplinkPacketLossRate(*plr); > } > if (rplr) { >- channel_proxy_->OnRecoverableUplinkPacketLossRate(*rplr); >+ channel_send_->OnRecoverableUplinkPacketLossRate(*rplr); > } > } > > void AudioSendStream::SetTransportOverhead(int transport_overhead_per_packet) { > RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- channel_proxy_->SetTransportOverhead(transport_overhead_per_packet); >+ channel_send_->SetTransportOverhead(transport_overhead_per_packet); > } > > RtpState AudioSendStream::GetRtpState() const { > return rtp_rtcp_module_->GetRtpState(); > } > >-const voe::ChannelSendProxy& AudioSendStream::GetChannelProxy() const { >- RTC_DCHECK(channel_proxy_.get()); >- return *channel_proxy_.get(); >+const voe::ChannelSendInterface* AudioSendStream::GetChannel() const { >+ return channel_send_.get(); > } > > internal::AudioState* AudioSendStream::audio_state() { >@@ -549,12 +555,12 @@ bool AudioSendStream::SetupSendCodec(AudioSendStream* stream, > > // Wrap the encoder in a an AudioEncoderCNG, if VAD is enabled. 
> if (spec.cng_payload_type) { >- AudioEncoderCng::Config cng_config; >+ AudioEncoderCngConfig cng_config; > cng_config.num_channels = encoder->NumChannels(); > cng_config.payload_type = *spec.cng_payload_type; > cng_config.speech_encoder = std::move(encoder); > cng_config.vad_mode = Vad::kVadNormal; >- encoder.reset(new AudioEncoderCng(std::move(cng_config))); >+ encoder = CreateComfortNoiseEncoder(std::move(cng_config)); > > stream->RegisterCngPayloadType( > *spec.cng_payload_type, >@@ -563,8 +569,8 @@ bool AudioSendStream::SetupSendCodec(AudioSendStream* stream, > > stream->StoreEncoderProperties(encoder->SampleRateHz(), > encoder->NumChannels()); >- stream->channel_proxy_->SetEncoder(new_config.send_codec_spec->payload_type, >- std::move(encoder)); >+ stream->channel_send_->SetEncoder(new_config.send_codec_spec->payload_type, >+ std::move(encoder)); > return true; > } > >@@ -610,7 +616,7 @@ bool AudioSendStream::ReconfigureSendCodec(AudioSendStream* stream, > if (!do_not_update_target_bitrate && new_target_bitrate_bps && > new_target_bitrate_bps != > old_config.send_codec_spec->target_bitrate_bps) { >- CallEncoder(stream->channel_proxy_, [&](AudioEncoder* encoder) { >+ CallEncoder(stream->channel_send_, [&](AudioEncoder* encoder) { > encoder->OnReceivedTargetAudioBitrate(*new_target_bitrate_bps); > }); > } >@@ -628,7 +634,7 @@ void AudioSendStream::ReconfigureANA(AudioSendStream* stream, > return; > } > if (new_config.audio_network_adaptor_config) { >- CallEncoder(stream->channel_proxy_, [&](AudioEncoder* encoder) { >+ CallEncoder(stream->channel_send_, [&](AudioEncoder* encoder) { > if (encoder->EnableAudioNetworkAdaptor( > *new_config.audio_network_adaptor_config, stream->event_log_)) { > RTC_DLOG(LS_INFO) << "Audio network adaptor enabled on SSRC " >@@ -638,7 +644,7 @@ void AudioSendStream::ReconfigureANA(AudioSendStream* stream, > } > }); > } else { >- CallEncoder(stream->channel_proxy_, [&](AudioEncoder* encoder) { >+ CallEncoder(stream->channel_send_, 
[&](AudioEncoder* encoder) { > encoder->DisableAudioNetworkAdaptor(); > }); > RTC_DLOG(LS_INFO) << "Audio network adaptor disabled on SSRC " >@@ -662,7 +668,7 @@ void AudioSendStream::ReconfigureCNG(AudioSendStream* stream, > } > > // Wrap or unwrap the encoder in an AudioEncoderCNG. >- stream->channel_proxy_->ModifyEncoder( >+ stream->channel_send_->ModifyEncoder( > [&](std::unique_ptr<AudioEncoder>* encoder_ptr) { > std::unique_ptr<AudioEncoder> old_encoder(std::move(*encoder_ptr)); > auto sub_encoders = old_encoder->ReclaimContainedEncoders(); >@@ -676,12 +682,12 @@ void AudioSendStream::ReconfigureCNG(AudioSendStream* stream, > old_encoder = std::move(tmp); > } > if (new_config.send_codec_spec->cng_payload_type) { >- AudioEncoderCng::Config config; >+ AudioEncoderCngConfig config; > config.speech_encoder = std::move(old_encoder); > config.num_channels = config.speech_encoder->NumChannels(); > config.payload_type = *new_config.send_codec_spec->cng_payload_type; > config.vad_mode = Vad::kVadNormal; >- encoder_ptr->reset(new AudioEncoderCng(std::move(config))); >+ *encoder_ptr = CreateComfortNoiseEncoder(std::move(config)); > } else { > *encoder_ptr = std::move(old_encoder); > } >@@ -708,13 +714,18 @@ void AudioSendStream::ReconfigureBitrateObserver( > > bool has_transport_sequence_number = new_transport_seq_num_id != 0; > if (new_config.min_bitrate_bps != -1 && new_config.max_bitrate_bps != -1 && >+ !new_config.has_dscp && > (has_transport_sequence_number || > !webrtc::field_trial::IsEnabled("WebRTC-Audio-SendSideBwe"))) { >+ stream->rtp_transport_->packet_sender()->SetAccountForAudioPackets(true); > stream->ConfigureBitrateObserver( > new_config.min_bitrate_bps, new_config.max_bitrate_bps, > new_config.bitrate_priority, has_transport_sequence_number); >+ stream->rtp_rtcp_module_->SetAsPartOfAllocation(true); > } else { >+ stream->rtp_transport_->packet_sender()->SetAccountForAudioPackets(false); > stream->RemoveBitrateObserver(); >+ 
stream->rtp_rtcp_module_->SetAsPartOfAllocation(false); > } > } > >@@ -724,7 +735,7 @@ void AudioSendStream::ConfigureBitrateObserver(int min_bitrate_bps, > bool has_packet_feedback) { > RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); > RTC_DCHECK_GE(max_bitrate_bps, min_bitrate_bps); >- rtc::Event thread_sync_event(false /* manual_reset */, false); >+ rtc::Event thread_sync_event; > worker_queue_->PostTask([&] { > // We may get a callback immediately as the observer is registered, so make > // sure the bitrate limits in config_ are up-to-date. >@@ -744,7 +755,7 @@ void AudioSendStream::ConfigureBitrateObserver(int min_bitrate_bps, > > void AudioSendStream::RemoveBitrateObserver() { > RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- rtc::Event thread_sync_event(false /* manual_reset */, false); >+ rtc::Event thread_sync_event; > worker_queue_->PostTask([this, &thread_sync_event] { > bitrate_allocator_->RemoveObserver(this); > thread_sync_event.Set(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_send_stream.h b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_send_stream.h >index 1ea676b65db745221861aa51d4309498c6aac91a..bf94901ef05750f3068dee903a6d0f5cbef6df5c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_send_stream.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_send_stream.h >@@ -14,7 +14,7 @@ > #include <memory> > #include <vector> > >-#include "audio/time_interval.h" >+#include "audio/channel_send.h" > #include "audio/transport_feedback_packet_loss_tracker.h" > #include "call/audio_send_stream.h" > #include "call/audio_state.h" >@@ -30,10 +30,6 @@ class RtcpBandwidthObserver; > class RtcpRttStats; > class RtpTransportControllerSendInterface; > >-namespace voe { >-class ChannelSendProxy; >-} // namespace voe >- > namespace internal { > class AudioState; > >@@ -45,23 +41,21 @@ class AudioSendStream final : public webrtc::AudioSendStream, > const 
rtc::scoped_refptr<webrtc::AudioState>& audio_state, > rtc::TaskQueue* worker_queue, > ProcessThread* module_process_thread, >- RtpTransportControllerSendInterface* transport, >- BitrateAllocator* bitrate_allocator, >+ RtpTransportControllerSendInterface* rtp_transport, >+ BitrateAllocatorInterface* bitrate_allocator, > RtcEventLog* event_log, > RtcpRttStats* rtcp_rtt_stats, >- const absl::optional<RtpState>& suspended_rtp_state, >- TimeInterval* overall_call_lifetime); >- // For unit tests, which need to supply a mock channel proxy. >+ const absl::optional<RtpState>& suspended_rtp_state); >+ // For unit tests, which need to supply a mock ChannelSend. > AudioSendStream(const webrtc::AudioSendStream::Config& config, > const rtc::scoped_refptr<webrtc::AudioState>& audio_state, > rtc::TaskQueue* worker_queue, >- RtpTransportControllerSendInterface* transport, >- BitrateAllocator* bitrate_allocator, >+ RtpTransportControllerSendInterface* rtp_transport, >+ BitrateAllocatorInterface* bitrate_allocator, > RtcEventLog* event_log, > RtcpRttStats* rtcp_rtt_stats, > const absl::optional<RtpState>& suspended_rtp_state, >- TimeInterval* overall_call_lifetime, >- std::unique_ptr<voe::ChannelSendProxy> channel_proxy); >+ std::unique_ptr<voe::ChannelSendInterface> channel_send); > ~AudioSendStream() override; > > // webrtc::AudioSendStream implementation. >@@ -83,10 +77,7 @@ class AudioSendStream final : public webrtc::AudioSendStream, > bool DeliverRtcp(const uint8_t* packet, size_t length); > > // Implements BitrateAllocatorObserver. >- uint32_t OnBitrateUpdated(uint32_t bitrate_bps, >- uint8_t fraction_loss, >- int64_t rtt, >- int64_t bwe_period_ms) override; >+ uint32_t OnBitrateUpdated(BitrateAllocationUpdate update) override; > > // From PacketFeedbackObserver. 
> void OnPacketAdded(uint32_t ssrc, uint16_t seq_num) override; >@@ -96,7 +87,7 @@ class AudioSendStream final : public webrtc::AudioSendStream, > void SetTransportOverhead(int transport_overhead_per_packet); > > RtpState GetRtpState() const; >- const voe::ChannelSendProxy& GetChannelProxy() const; >+ const voe::ChannelSendInterface* GetChannel() const; > > private: > class TimedTransport; >@@ -133,15 +124,15 @@ class AudioSendStream final : public webrtc::AudioSendStream, > rtc::TaskQueue* worker_queue_; > webrtc::AudioSendStream::Config config_; > rtc::scoped_refptr<webrtc::AudioState> audio_state_; >- std::unique_ptr<voe::ChannelSendProxy> channel_proxy_; >+ const std::unique_ptr<voe::ChannelSendInterface> channel_send_; > RtcEventLog* const event_log_; > > int encoder_sample_rate_hz_ = 0; > size_t encoder_num_channels_ = 0; > bool sending_ = false; > >- BitrateAllocator* const bitrate_allocator_; >- RtpTransportControllerSendInterface* const transport_; >+ BitrateAllocatorInterface* const bitrate_allocator_; >+ RtpTransportControllerSendInterface* const rtp_transport_; > > rtc::CriticalSection packet_loss_tracker_cs_; > TransportFeedbackPacketLossTracker packet_loss_tracker_ >@@ -150,10 +141,6 @@ class AudioSendStream final : public webrtc::AudioSendStream, > RtpRtcp* rtp_rtcp_module_; > absl::optional<RtpState> const suspended_rtp_state_; > >- std::unique_ptr<TimedTransport> timed_send_transport_adapter_; >- TimeInterval active_lifetime_; >- TimeInterval* overall_call_lifetime_ = nullptr; >- > // RFC 5285: Each distinct extension MUST have a unique ID. The value 0 is > // reserved for padding and MUST NOT be used as a local identifier. > // So it should be safe to use 0 here to indicate "not configured". 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_send_stream_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_send_stream_unittest.cc >index 0a954f86d629c5595d2014d74c1378268de1e6e0..e400ada0e5af409b42ac68b4a668bf1a34b41da5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_send_stream_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_send_stream_unittest.cc >@@ -13,7 +13,7 @@ > #include <vector> > > #include "absl/memory/memory.h" >-#include "api/units/time_delta.h" >+#include "api/test/mock_frame_encryptor.h" > #include "audio/audio_send_stream.h" > #include "audio/audio_state.h" > #include "audio/conversion.h" >@@ -27,12 +27,10 @@ > #include "modules/rtp_rtcp/mocks/mock_rtcp_bandwidth_observer.h" > #include "modules/rtp_rtcp/mocks/mock_rtcp_rtt_stats.h" > #include "modules/rtp_rtcp/mocks/mock_rtp_rtcp.h" >-#include "rtc_base/fakeclock.h" > #include "rtc_base/task_queue.h" > #include "test/gtest.h" > #include "test/mock_audio_encoder.h" > #include "test/mock_audio_encoder_factory.h" >-#include "test/mock_transport.h" > > namespace webrtc { > namespace test { >@@ -41,6 +39,7 @@ namespace { > using testing::_; > using testing::Eq; > using testing::Ne; >+using testing::Field; > using testing::Invoke; > using testing::Return; > using testing::StrEq; >@@ -128,7 +127,7 @@ rtc::scoped_refptr<MockAudioEncoderFactory> SetupEncoderFactoryMock() { > > struct ConfigHelper { > ConfigHelper(bool audio_bwe_enabled, bool expect_set_encoder_call) >- : stream_config_(nullptr), >+ : stream_config_(/*send_transport=*/nullptr, /*media_transport=*/nullptr), > audio_processing_(new rtc::RefCountedObject<MockAudioProcessing>()), > bitrate_allocator_(&limit_observer_), > worker_queue_("ConfigHelper_worker_queue"), >@@ -142,7 +141,7 @@ struct ConfigHelper { > new rtc::RefCountedObject<MockAudioDeviceModule>(); > audio_state_ = AudioState::Create(config); > >- SetupDefaultChannelProxy(audio_bwe_enabled); >+ 
SetupDefaultChannelSend(audio_bwe_enabled); > SetupMockForSetupSendCodec(expect_set_encoder_call); > > // Use ISAC as default codec so as to prevent unnecessary |channel_proxy_| >@@ -150,7 +149,6 @@ struct ConfigHelper { > stream_config_.send_codec_spec = > AudioSendStream::Config::SendCodecSpec(kIsacPayloadType, kIsacFormat); > stream_config_.rtp.ssrc = kSsrc; >- stream_config_.rtp.nack.rtp_history_ms = 200; > stream_config_.rtp.c_name = kCName; > stream_config_.rtp.extensions.push_back( > RtpExtension(RtpExtension::kAudioLevelUri, kAudioLevelId)); >@@ -167,8 +165,7 @@ struct ConfigHelper { > new internal::AudioSendStream( > stream_config_, audio_state_, &worker_queue_, &rtp_transport_, > &bitrate_allocator_, &event_log_, &rtcp_rtt_stats_, absl::nullopt, >- &active_lifetime_, >- std::unique_ptr<voe::ChannelSendProxy>(channel_proxy_))); >+ std::unique_ptr<voe::ChannelSendInterface>(channel_send_))); > } > > AudioSendStream::Config& config() { return stream_config_; } >@@ -176,9 +173,8 @@ struct ConfigHelper { > return *static_cast<MockAudioEncoderFactory*>( > stream_config_.encoder_factory.get()); > } >- MockChannelSendProxy* channel_proxy() { return channel_proxy_; } >+ MockChannelSend* channel_send() { return channel_send_; } > RtpTransportControllerSendInterface* transport() { return &rtp_transport_; } >- TimeInterval* active_lifetime() { return &active_lifetime_; } > > static void AddBweToConfig(AudioSendStream::Config* config) { > config->rtp.extensions.push_back(RtpExtension( >@@ -186,47 +182,40 @@ struct ConfigHelper { > config->send_codec_spec->transport_cc_enabled = true; > } > >- void SetupDefaultChannelProxy(bool audio_bwe_enabled) { >- EXPECT_TRUE(channel_proxy_ == nullptr); >- channel_proxy_ = new testing::StrictMock<MockChannelSendProxy>(); >- EXPECT_CALL(*channel_proxy_, GetRtpRtcp()).WillRepeatedly(Invoke([this]() { >+ void SetupDefaultChannelSend(bool audio_bwe_enabled) { >+ EXPECT_TRUE(channel_send_ == nullptr); >+ channel_send_ = new 
testing::StrictMock<MockChannelSend>(); >+ EXPECT_CALL(*channel_send_, GetRtpRtcp()).WillRepeatedly(Invoke([this]() { > return &this->rtp_rtcp_; > })); >- EXPECT_CALL(*channel_proxy_, SetRTCPStatus(true)).Times(1); >- EXPECT_CALL(*channel_proxy_, SetLocalSSRC(kSsrc)).Times(1); >- EXPECT_CALL(*channel_proxy_, SetRTCP_CNAME(StrEq(kCName))).Times(1); >- EXPECT_CALL(*channel_proxy_, SetNACKStatus(true, 10)).Times(1); >- EXPECT_CALL(*channel_proxy_, SetFrameEncryptor(nullptr)).Times(1); >- EXPECT_CALL(*channel_proxy_, >+ EXPECT_CALL(*channel_send_, SetLocalSSRC(kSsrc)).Times(1); >+ EXPECT_CALL(*channel_send_, SetRTCP_CNAME(StrEq(kCName))).Times(1); >+ EXPECT_CALL(*channel_send_, SetFrameEncryptor(_)).Times(1); >+ EXPECT_CALL(*channel_send_, SetExtmapAllowMixed(false)).Times(1); >+ EXPECT_CALL(*channel_send_, > SetSendAudioLevelIndicationStatus(true, kAudioLevelId)) > .Times(1); > EXPECT_CALL(rtp_transport_, GetBandwidthObserver()) > .WillRepeatedly(Return(&bandwidth_observer_)); > if (audio_bwe_enabled) { >- EXPECT_CALL(*channel_proxy_, >+ EXPECT_CALL(*channel_send_, > EnableSendTransportSequenceNumber(kTransportSequenceNumberId)) > .Times(1); >- EXPECT_CALL(*channel_proxy_, >+ EXPECT_CALL(*channel_send_, > RegisterSenderCongestionControlObjects( > &rtp_transport_, Eq(&bandwidth_observer_))) > .Times(1); > } else { >- EXPECT_CALL(*channel_proxy_, RegisterSenderCongestionControlObjects( >- &rtp_transport_, Eq(nullptr))) >+ EXPECT_CALL(*channel_send_, RegisterSenderCongestionControlObjects( >+ &rtp_transport_, Eq(nullptr))) > .Times(1); > } >- EXPECT_CALL(*channel_proxy_, ResetSenderCongestionControlObjects()) >- .Times(1); >- { >- ::testing::InSequence unregister_on_destruction; >- EXPECT_CALL(*channel_proxy_, RegisterTransport(_)).Times(1); >- EXPECT_CALL(*channel_proxy_, RegisterTransport(nullptr)).Times(1); >- } >+ EXPECT_CALL(*channel_send_, ResetSenderCongestionControlObjects()).Times(1); > } > > void SetupMockForSetupSendCodec(bool expect_set_encoder_call) { > if 
(expect_set_encoder_call) { >- EXPECT_CALL(*channel_proxy_, SetEncoderForMock(_, _)) >+ EXPECT_CALL(*channel_send_, SetEncoderForMock(_, _)) > .WillOnce(Invoke( > [this](int payload_type, std::unique_ptr<AudioEncoder>* encoder) { > this->audio_encoder_ = std::move(*encoder); >@@ -237,7 +226,7 @@ struct ConfigHelper { > > void SetupMockForModifyEncoder() { > // Let ModifyEncoder to invoke mock audio encoder. >- EXPECT_CALL(*channel_proxy_, ModifyEncoder(_)) >+ EXPECT_CALL(*channel_send_, ModifyEncoder(_)) > .WillRepeatedly(Invoke( > [this](rtc::FunctionView<void(std::unique_ptr<AudioEncoder>*)> > modifier) { >@@ -247,13 +236,13 @@ struct ConfigHelper { > } > > void SetupMockForSendTelephoneEvent() { >- EXPECT_TRUE(channel_proxy_); >- EXPECT_CALL(*channel_proxy_, SetSendTelephoneEventPayloadType( >- kTelephoneEventPayloadType, >- kTelephoneEventPayloadFrequency)) >+ EXPECT_TRUE(channel_send_); >+ EXPECT_CALL(*channel_send_, SetSendTelephoneEventPayloadType( >+ kTelephoneEventPayloadType, >+ kTelephoneEventPayloadFrequency)) > .WillOnce(Return(true)); > EXPECT_CALL( >- *channel_proxy_, >+ *channel_send_, > SendTelephoneEventOutband(kTelephoneEventCode, kTelephoneEventDuration)) > .WillOnce(Return(true)); > } >@@ -271,13 +260,14 @@ struct ConfigHelper { > block.fraction_lost = 0; > report_blocks.push_back(block); // Duplicate SSRC, bad fraction_lost. 
> >- EXPECT_TRUE(channel_proxy_); >- EXPECT_CALL(*channel_proxy_, GetRTCPStatistics()) >+ EXPECT_TRUE(channel_send_); >+ EXPECT_CALL(*channel_send_, GetRTCPStatistics()) > .WillRepeatedly(Return(kCallStats)); >- EXPECT_CALL(*channel_proxy_, GetRemoteRTCPReportBlocks()) >+ EXPECT_CALL(*channel_send_, GetRemoteRTCPReportBlocks()) > .WillRepeatedly(Return(report_blocks)); >- EXPECT_CALL(*channel_proxy_, GetANAStatistics()) >+ EXPECT_CALL(*channel_send_, GetANAStatistics()) > .WillRepeatedly(Return(ANAStats())); >+ EXPECT_CALL(*channel_send_, GetBitrate()).WillRepeatedly(Return(0)); > > audio_processing_stats_.echo_return_loss = kEchoReturnLoss; > audio_processing_stats_.echo_return_loss_enhancement = >@@ -297,10 +287,9 @@ struct ConfigHelper { > private: > rtc::scoped_refptr<AudioState> audio_state_; > AudioSendStream::Config stream_config_; >- testing::StrictMock<MockChannelSendProxy>* channel_proxy_ = nullptr; >+ testing::StrictMock<MockChannelSend>* channel_send_ = nullptr; > rtc::scoped_refptr<MockAudioProcessing> audio_processing_; > AudioProcessingStats audio_processing_stats_; >- TimeInterval active_lifetime_; > testing::StrictMock<MockRtcpBandwidthObserver> bandwidth_observer_; > testing::NiceMock<MockRtcEventLog> event_log_; > testing::NiceMock<MockRtpTransportControllerSend> rtp_transport_; >@@ -316,7 +305,8 @@ struct ConfigHelper { > } // namespace > > TEST(AudioSendStreamTest, ConfigToString) { >- AudioSendStream::Config config(nullptr); >+ AudioSendStream::Config config(/*send_transport=*/nullptr, >+ /*media_transport=*/nullptr); > config.rtp.ssrc = kSsrc; > config.rtp.c_name = kCName; > config.min_bitrate_bps = 12000; >@@ -327,12 +317,15 @@ TEST(AudioSendStreamTest, ConfigToString) { > config.send_codec_spec->transport_cc_enabled = false; > config.send_codec_spec->cng_payload_type = 42; > config.encoder_factory = MockAudioEncoderFactory::CreateUnusedFactory(); >+ config.rtp.extmap_allow_mixed = true; > config.rtp.extensions.push_back( > 
RtpExtension(RtpExtension::kAudioLevelUri, kAudioLevelId)); >+ config.rtcp_report_interval_ms = 2500; > EXPECT_EQ( >- "{rtp: {ssrc: 1234, extensions: [{uri: " >- "urn:ietf:params:rtp-hdrext:ssrc-audio-level, id: 2}], nack: " >- "{rtp_history_ms: 0}, c_name: foo_name}, send_transport: null, " >+ "{rtp: {ssrc: 1234, extmap-allow-mixed: true, extensions: [{uri: " >+ "urn:ietf:params:rtp-hdrext:ssrc-audio-level, id: 2}], " >+ "c_name: foo_name}, rtcp_report_interval_ms: 2500, " >+ "send_transport: null, media_transport: null, " > "min_bitrate_bps: 12000, max_bitrate_bps: 34000, " > "send_codec_spec: {nack_enabled: true, transport_cc_enabled: false, " > "cng_payload_type: 42, payload_type: 103, " >@@ -358,7 +351,7 @@ TEST(AudioSendStreamTest, SendTelephoneEvent) { > TEST(AudioSendStreamTest, SetMuted) { > ConfigHelper helper(false, true); > auto send_stream = helper.CreateAudioSendStream(); >- EXPECT_CALL(*helper.channel_proxy(), SetInputMute(true)); >+ EXPECT_CALL(*helper.channel_send(), SetInputMute(true)); > send_stream->SetMuted(true); > } > >@@ -448,7 +441,7 @@ TEST(AudioSendStreamTest, SendCodecCanApplyVad) { > helper.config().send_codec_spec->cng_payload_type = 105; > using ::testing::Invoke; > std::unique_ptr<AudioEncoder> stolen_encoder; >- EXPECT_CALL(*helper.channel_proxy(), SetEncoderForMock(_, _)) >+ EXPECT_CALL(*helper.channel_send(), SetEncoderForMock(_, _)) > .WillOnce( > Invoke([&stolen_encoder](int payload_type, > std::unique_ptr<AudioEncoder>* encoder) { >@@ -468,17 +461,31 @@ TEST(AudioSendStreamTest, SendCodecCanApplyVad) { > TEST(AudioSendStreamTest, DoesNotPassHigherBitrateThanMaxBitrate) { > ConfigHelper helper(false, true); > auto send_stream = helper.CreateAudioSendStream(); >- EXPECT_CALL(*helper.channel_proxy(), >- SetBitrate(helper.config().max_bitrate_bps, _)); >- send_stream->OnBitrateUpdated(helper.config().max_bitrate_bps + 5000, 0.0, 50, >- 6000); >+ EXPECT_CALL(*helper.channel_send(), >+ OnBitrateAllocation( >+ 
Field(&BitrateAllocationUpdate::target_bitrate, >+ Eq(DataRate::bps(helper.config().max_bitrate_bps))))); >+ BitrateAllocationUpdate update; >+ update.target_bitrate = DataRate::bps(helper.config().max_bitrate_bps + 5000); >+ update.packet_loss_ratio = 0; >+ update.round_trip_time = TimeDelta::ms(50); >+ update.bwe_period = TimeDelta::ms(6000); >+ send_stream->OnBitrateUpdated(update); > } > > TEST(AudioSendStreamTest, ProbingIntervalOnBitrateUpdated) { > ConfigHelper helper(false, true); > auto send_stream = helper.CreateAudioSendStream(); >- EXPECT_CALL(*helper.channel_proxy(), SetBitrate(_, 5000)); >- send_stream->OnBitrateUpdated(50000, 0.0, 50, 5000); >+ >+ EXPECT_CALL(*helper.channel_send(), >+ OnBitrateAllocation(Field(&BitrateAllocationUpdate::bwe_period, >+ Eq(TimeDelta::ms(5000))))); >+ BitrateAllocationUpdate update; >+ update.target_bitrate = DataRate::bps(helper.config().max_bitrate_bps + 5000); >+ update.packet_loss_ratio = 0; >+ update.round_trip_time = TimeDelta::ms(50); >+ update.bwe_period = TimeDelta::ms(5000); >+ send_stream->OnBitrateUpdated(update); > } > > // Test that AudioSendStream doesn't recreate the encoder unnecessarily. >@@ -489,7 +496,7 @@ TEST(AudioSendStreamTest, DontRecreateEncoder) { > // to be correct, it's instead set-up manually here. Otherwise a simple change > // to ConfigHelper (say to WillRepeatedly) would silently make this test > // useless. 
>- EXPECT_CALL(*helper.channel_proxy(), SetEncoderForMock(_, _)) >+ EXPECT_CALL(*helper.channel_send(), SetEncoderForMock(_, _)) > .WillOnce(Return(true)); > > helper.config().send_codec_spec = >@@ -504,46 +511,44 @@ TEST(AudioSendStreamTest, ReconfigureTransportCcResetsFirst) { > auto send_stream = helper.CreateAudioSendStream(); > auto new_config = helper.config(); > ConfigHelper::AddBweToConfig(&new_config); >- EXPECT_CALL(*helper.channel_proxy(), >+ EXPECT_CALL(*helper.channel_send(), > EnableSendTransportSequenceNumber(kTransportSequenceNumberId)) > .Times(1); > { > ::testing::InSequence seq; >- EXPECT_CALL(*helper.channel_proxy(), ResetSenderCongestionControlObjects()) >+ EXPECT_CALL(*helper.channel_send(), ResetSenderCongestionControlObjects()) > .Times(1); >- EXPECT_CALL(*helper.channel_proxy(), RegisterSenderCongestionControlObjects( >- helper.transport(), Ne(nullptr))) >+ EXPECT_CALL(*helper.channel_send(), RegisterSenderCongestionControlObjects( >+ helper.transport(), Ne(nullptr))) > .Times(1); > } > send_stream->Reconfigure(new_config); > } > >-// Checks that AudioSendStream logs the times at which RTP packets are sent >-// through its interface. >-TEST(AudioSendStreamTest, UpdateLifetime) { >+// Validates that reconfiguring the AudioSendStream with a Frame encryptor >+// correctly reconfigures on the object without crashing. 
>+TEST(AudioSendStreamTest, ReconfigureWithFrameEncryptor) { > ConfigHelper helper(false, true); >+ auto send_stream = helper.CreateAudioSendStream(); >+ auto new_config = helper.config(); > >- MockTransport mock_transport; >- helper.config().send_transport = &mock_transport; >+ rtc::scoped_refptr<FrameEncryptorInterface> mock_frame_encryptor_0( >+ new rtc::RefCountedObject<MockFrameEncryptor>()); >+ new_config.frame_encryptor = mock_frame_encryptor_0; >+ EXPECT_CALL(*helper.channel_send(), SetFrameEncryptor(Ne(nullptr))).Times(1); >+ send_stream->Reconfigure(new_config); > >- Transport* registered_transport; >- ON_CALL(*helper.channel_proxy(), RegisterTransport(_)) >- .WillByDefault(Invoke([®istered_transport](Transport* transport) { >- registered_transport = transport; >- })); >+ // Not updating the frame encryptor shouldn't force it to reconfigure. >+ EXPECT_CALL(*helper.channel_send(), SetFrameEncryptor(_)).Times(0); >+ send_stream->Reconfigure(new_config); > >- rtc::ScopedFakeClock fake_clock; >- constexpr int64_t kTimeBetweenSendRtpCallsMs = 100; >- { >- auto send_stream = helper.CreateAudioSendStream(); >- EXPECT_CALL(mock_transport, SendRtp(_, _, _)).Times(2); >- const PacketOptions options; >- registered_transport->SendRtp(nullptr, 0, options); >- fake_clock.AdvanceTime(TimeDelta::ms(kTimeBetweenSendRtpCallsMs)); >- registered_transport->SendRtp(nullptr, 0, options); >- } >- EXPECT_TRUE(!helper.active_lifetime()->Empty()); >- EXPECT_EQ(helper.active_lifetime()->Length(), kTimeBetweenSendRtpCallsMs); >+ // Updating frame encryptor to a new object should force a call to the proxy. 
>+  rtc::scoped_refptr<FrameEncryptorInterface> mock_frame_encryptor_1(
>+      new rtc::RefCountedObject<MockFrameEncryptor>());
>+  new_config.frame_encryptor = mock_frame_encryptor_1;
>+  new_config.crypto_options.sframe.require_frame_encryption = true;
>+  EXPECT_CALL(*helper.channel_send(), SetFrameEncryptor(Ne(nullptr))).Times(1);
>+  send_stream->Reconfigure(new_config);
> }
> }  // namespace test
> }  // namespace webrtc
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_transport_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_transport_impl.cc
>index 94ef6cb68a775d57d7b38f3db3997ac59902ca20..539456e2f49ae6f5cce0a2eef7a52d379ba728d5 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_transport_impl.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/audio_transport_impl.cc
>@@ -17,7 +17,7 @@
> #include "audio/remix_resample.h"
> #include "audio/utility/audio_frame_operations.h"
> #include "call/audio_send_stream.h"
>-#include "rtc_base/logging.h"
>+#include "rtc_base/checks.h"
> 
> namespace webrtc {
> 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_receive.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_receive.cc
>index e9f7503474f5f543fce4bd1d47cd54064aac19e6..483147fa9eed00e4e60fb1b4bdb0531560f911da 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_receive.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_receive.cc
>@@ -18,14 +18,20 @@
> #include <vector>
> 
> #include "absl/memory/memory.h"
>+#include "audio/audio_level.h"
> #include "audio/channel_send.h"
> #include "audio/utility/audio_frame_operations.h"
> #include "logging/rtc_event_log/events/rtc_event_audio_playout.h"
> #include "logging/rtc_event_log/rtc_event_log.h"
> #include "modules/audio_coding/audio_network_adaptor/include/audio_network_adaptor_config.h"
>+#include "modules/audio_coding/include/audio_coding_module.h"
> #include "modules/audio_device/include/audio_device.h"
> #include "modules/pacing/packet_router.h"
> #include "modules/rtp_rtcp/include/receive_statistics.h"
>+#include "modules/rtp_rtcp/include/remote_ntp_time_estimator.h"
>+#include "modules/rtp_rtcp/include/rtp_rtcp.h"
>+#include "modules/rtp_rtcp/source/contributing_sources.h"
>+#include "modules/rtp_rtcp/source/rtp_header_extensions.h"
> #include "modules/rtp_rtcp/source/rtp_packet_received.h"
> #include "modules/utility/include/process_thread.h"
> #include "rtc_base/checks.h"
>@@ -33,6 +39,8 @@
> #include "rtc_base/format_macros.h"
> #include "rtc_base/location.h"
> #include "rtc_base/logging.h"
>+#include "rtc_base/numerics/safe_minmax.h"
>+#include "rtc_base/race_checker.h"
> #include "rtc_base/thread_checker.h"
> #include "rtc_base/timeutils.h"
> #include "system_wrappers/include/metrics.h"
>@@ -50,13 +58,228 @@ constexpr int64_t kMinRetransmissionWindowMs = 30;
> constexpr int kVoiceEngineMinMinPlayoutDelayMs = 0;
> constexpr int kVoiceEngineMaxMinPlayoutDelayMs = 10000;
> 
>-}  // namespace
>+webrtc::FrameType WebrtcFrameTypeForMediaTransportFrameType(
>+    MediaTransportEncodedAudioFrame::FrameType frame_type) {
>+  switch (frame_type) {
>+    case MediaTransportEncodedAudioFrame::FrameType::kSpeech:
>+      return kAudioFrameSpeech;
>+      break;
>+
>+    case MediaTransportEncodedAudioFrame::FrameType::
>+        kDiscountinuousTransmission:
>+      return kAudioFrameCN;
>+      break;
>+  }
>+}
>+
>+WebRtcRTPHeader CreateWebrtcRTPHeaderForMediaTransportFrame(
>+    const MediaTransportEncodedAudioFrame& frame,
>+    uint64_t channel_id) {
>+  webrtc::WebRtcRTPHeader webrtc_header = {};
>+  webrtc_header.header.payloadType = frame.payload_type();
>+  webrtc_header.header.payload_type_frequency = frame.sampling_rate_hz();
>+  webrtc_header.header.timestamp = frame.starting_sample_index();
>+  webrtc_header.header.sequenceNumber = frame.sequence_number();
>+
>+  webrtc_header.frameType =
>+      WebrtcFrameTypeForMediaTransportFrameType(frame.frame_type());
>+
>+  webrtc_header.header.ssrc = static_cast<uint32_t>(channel_id);
>+
>+  // The rest are initialized by the RTPHeader constructor.
>+  return webrtc_header;
>+}
>+
>+class ChannelReceive : public ChannelReceiveInterface,
>+                       public MediaTransportAudioSinkInterface {
>+ public:
>+  // Used for receive streams.
>+  ChannelReceive(ProcessThread* module_process_thread,
>+                 AudioDeviceModule* audio_device_module,
>+                 MediaTransportInterface* media_transport,
>+                 Transport* rtcp_send_transport,
>+                 RtcEventLog* rtc_event_log,
>+                 uint32_t remote_ssrc,
>+                 size_t jitter_buffer_max_packets,
>+                 bool jitter_buffer_fast_playout,
>+                 int jitter_buffer_min_delay_ms,
>+                 rtc::scoped_refptr<AudioDecoderFactory> decoder_factory,
>+                 absl::optional<AudioCodecPairId> codec_pair_id,
>+                 rtc::scoped_refptr<FrameDecryptorInterface> frame_decryptor,
>+                 const webrtc::CryptoOptions& crypto_options);
>+  ~ChannelReceive() override;
>+
>+  void SetSink(AudioSinkInterface* sink) override;
>+
>+  void SetReceiveCodecs(const std::map<int, SdpAudioFormat>& codecs) override;
>+
>+  // API methods
>+
>+  void StartPlayout() override;
>+  void StopPlayout() override;
>+
>+  // Codecs
>+  bool GetRecCodec(CodecInst* codec) const override;
>+
>+  bool ReceivedRTCPPacket(const uint8_t* data, size_t length) override;
>+
>+  // RtpPacketSinkInterface.
>+  void OnRtpPacket(const RtpPacketReceived& packet) override;
>+
>+  // Muting, Volume and Level.
>+  void SetChannelOutputVolumeScaling(float scaling) override;
>+  int GetSpeechOutputLevelFullRange() const override;
>+  // See description of "totalAudioEnergy" in the WebRTC stats spec:
>+  // https://w3c.github.io/webrtc-stats/#dom-rtcmediastreamtrackstats-totalaudioenergy
>+  double GetTotalOutputEnergy() const override;
>+  double GetTotalOutputDuration() const override;
>+
>+  // Stats.
>+  NetworkStatistics GetNetworkStatistics() const override;
>+  AudioDecodingCallStats GetDecodingCallStatistics() const override;
>+
>+  // Audio+Video Sync.
>+  uint32_t GetDelayEstimate() const override;
>+  void SetMinimumPlayoutDelay(int delayMs) override;
>+  uint32_t GetPlayoutTimestamp() const override;
>+
>+  // Produces the transport-related timestamps; current_delay_ms is left unset.
>+  absl::optional<Syncable::Info> GetSyncInfo() const override;
>+
>+  // RTP+RTCP
>+  void SetLocalSSRC(unsigned int ssrc) override;
>+
>+  void RegisterReceiverCongestionControlObjects(
>+      PacketRouter* packet_router) override;
>+  void ResetReceiverCongestionControlObjects() override;
>+
>+  CallReceiveStatistics GetRTCPStatistics() const override;
>+  void SetNACKStatus(bool enable, int maxNumberOfPackets) override;
>+
>+  AudioMixer::Source::AudioFrameInfo GetAudioFrameWithInfo(
>+      int sample_rate_hz,
>+      AudioFrame* audio_frame) override;
>+
>+  int PreferredSampleRate() const override;
>+
>+  // Associate to a send channel.
>+  // Used for obtaining RTT for a receive-only channel.
>+  void SetAssociatedSendChannel(const ChannelSendInterface* channel) override;
>+
>+  std::vector<RtpSource> GetSources() const override;
>+
>+ private:
>+  bool ReceivePacket(const uint8_t* packet,
>+                     size_t packet_length,
>+                     const RTPHeader& header);
>+  int ResendPackets(const uint16_t* sequence_numbers, int length);
>+  void UpdatePlayoutTimestamp(bool rtcp);
>+
>+  int GetRtpTimestampRateHz() const;
>+  int64_t GetRTT() const;
>+
>+  // MediaTransportAudioSinkInterface override;
>+  void OnData(uint64_t channel_id,
>+              MediaTransportEncodedAudioFrame frame) override;
>+
>+  int32_t OnReceivedPayloadData(const uint8_t* payloadData,
>+                                size_t payloadSize,
>+                                const WebRtcRTPHeader* rtpHeader);
>+
>+  bool Playing() const {
>+    rtc::CritScope lock(&playing_lock_);
>+    return playing_;
>+  }
>+
>+  // Thread checkers document and lock usage of some methods to specific threads
>+  // we know about. The goal is to eventually split up voe::ChannelReceive into
>+  // parts with single-threaded semantics, and thereby reduce the need for
>+  // locks.
>+  rtc::ThreadChecker worker_thread_checker_;
>+  rtc::ThreadChecker module_process_thread_checker_;
>+  // Methods accessed from audio and video threads are checked for sequential-
>+  // only access. We don't necessarily own and control these threads, so thread
>+  // checkers cannot be used. E.g. Chromium may transfer "ownership" from one
>+  // audio thread to another, but access is still sequential.
>+  rtc::RaceChecker audio_thread_race_checker_;
>+  rtc::RaceChecker video_capture_thread_race_checker_;
>+  rtc::CriticalSection _callbackCritSect;
>+  rtc::CriticalSection volume_settings_critsect_;
>+
>+  rtc::CriticalSection playing_lock_;
>+  bool playing_ RTC_GUARDED_BY(&playing_lock_) = false;
>+
>+  RtcEventLog* const event_log_;
>+
>+  // Indexed by payload type.
>+  std::map<uint8_t, int> payload_type_frequencies_;
>+
>+  std::unique_ptr<ReceiveStatistics> rtp_receive_statistics_;
>+  std::unique_ptr<RtpRtcp> _rtpRtcpModule;
>+  const uint32_t remote_ssrc_;
>+
>+  // Info for GetSources and GetSyncInfo is updated on network or worker thread,
>+  // queried on the worker thread.
>+  rtc::CriticalSection rtp_sources_lock_;
>+  ContributingSources contributing_sources_ RTC_GUARDED_BY(&rtp_sources_lock_);
>+  absl::optional<uint32_t> last_received_rtp_timestamp_
>+      RTC_GUARDED_BY(&rtp_sources_lock_);
>+  absl::optional<int64_t> last_received_rtp_system_time_ms_
>+      RTC_GUARDED_BY(&rtp_sources_lock_);
>+  absl::optional<uint8_t> last_received_rtp_audio_level_
>+      RTC_GUARDED_BY(&rtp_sources_lock_);
>+
>+  std::unique_ptr<AudioCodingModule> audio_coding_;
>+  AudioSinkInterface* audio_sink_ = nullptr;
>+  AudioLevel _outputAudioLevel;
>+
>+  RemoteNtpTimeEstimator ntp_estimator_ RTC_GUARDED_BY(ts_stats_lock_);
>+
>+  // Timestamp of the audio pulled from NetEq.
>+  absl::optional<uint32_t> jitter_buffer_playout_timestamp_;
>+
>+  rtc::CriticalSection video_sync_lock_;
>+  uint32_t playout_timestamp_rtp_ RTC_GUARDED_BY(video_sync_lock_);
>+  uint32_t playout_delay_ms_ RTC_GUARDED_BY(video_sync_lock_);
>+
>+  rtc::CriticalSection ts_stats_lock_;
>+
>+  std::unique_ptr<rtc::TimestampWrapAroundHandler> rtp_ts_wraparound_handler_;
>+  // The rtp timestamp of the first played out audio frame.
>+  int64_t capture_start_rtp_time_stamp_;
>+  // The capture ntp time (in local timebase) of the first played out audio
>+  // frame.
>+  int64_t capture_start_ntp_time_ms_ RTC_GUARDED_BY(ts_stats_lock_);
>+
>+  // uses
>+  ProcessThread* _moduleProcessThreadPtr;
>+  AudioDeviceModule* _audioDeviceModulePtr;
>+  float _outputGain RTC_GUARDED_BY(volume_settings_critsect_);
>+
>+  // An associated send channel.
>+  rtc::CriticalSection assoc_send_channel_lock_;
>+  const ChannelSendInterface* associated_send_channel_
>+      RTC_GUARDED_BY(assoc_send_channel_lock_);
>+
>+  PacketRouter* packet_router_ = nullptr;
>+
>+  rtc::ThreadChecker construction_thread_;
>+
>+  MediaTransportInterface* const media_transport_;
>+
>+  // E2EE Audio Frame Decryption
>+  rtc::scoped_refptr<FrameDecryptorInterface> frame_decryptor_;
>+  webrtc::CryptoOptions crypto_options_;
>+};
> 
> int32_t ChannelReceive::OnReceivedPayloadData(
>     const uint8_t* payloadData,
>     size_t payloadSize,
>     const WebRtcRTPHeader* rtpHeader) {
>-  if (!channel_state_.Get().playing) {
>+  // We should not be receiving any RTP packets if media_transport is set.
>+  RTC_CHECK(!media_transport_);
>+
>+  if (!Playing()) {
>     // Avoid inserting into NetEQ when we are not playing. Count the
>     // packet as discarded.
>     return 0;
>@@ -82,14 +305,35 @@ int32_t ChannelReceive::OnReceivedPayloadData(
>   return 0;
> }
> 
>+// MediaTransportAudioSinkInterface override.
>+void ChannelReceive::OnData(uint64_t channel_id,
>+                            MediaTransportEncodedAudioFrame frame) {
>+  RTC_CHECK(media_transport_);
>+
>+  if (!Playing()) {
>+    // Avoid inserting into NetEQ when we are not playing. Count the
>+    // packet as discarded.
>+    return;
>+  }
>+
>+  // Send encoded audio frame to Decoder / NetEq.
>+  if (audio_coding_->IncomingPacket(
>+          frame.encoded_data().data(), frame.encoded_data().size(),
>+          CreateWebrtcRTPHeaderForMediaTransportFrame(frame, channel_id)) !=
>+      0) {
>+    RTC_DLOG(LS_ERROR) << "ChannelReceive::OnData: unable to "
>+                          "push data to the ACM";
>+  }
>+}
>+
> AudioMixer::Source::AudioFrameInfo ChannelReceive::GetAudioFrameWithInfo(
>     int sample_rate_hz,
>     AudioFrame* audio_frame) {
>+  RTC_DCHECK_RUNS_SERIALIZED(&audio_thread_race_checker_);
>   audio_frame->sample_rate_hz_ = sample_rate_hz;
> 
>-  unsigned int ssrc;
>-  RTC_CHECK_EQ(GetRemoteSSRC(ssrc), 0);
>-  event_log_->Log(absl::make_unique<RtcEventAudioPlayout>(ssrc));
>+  event_log_->Log(absl::make_unique<RtcEventAudioPlayout>(remote_ssrc_));
>+
>   // Get 10ms raw PCM data from the ACM (mixer limits output frequency)
>   bool muted;
>   if (audio_coding_->PlayoutData10Ms(audio_frame->sample_rate_hz_, audio_frame,
>@@ -191,6 +435,7 @@ AudioMixer::Source::AudioFrameInfo ChannelReceive::GetAudioFrameWithInfo(
> }
> 
> int ChannelReceive::PreferredSampleRate() const {
>+  RTC_DCHECK_RUNS_SERIALIZED(&audio_thread_race_checker_);
>   // Return the bigger of playout and receive frequency in the ACM.
>   return std::max(audio_coding_->ReceiveFrequency(),
>                   audio_coding_->PlayoutFrequency());
>@@ -199,14 +444,17 @@ int ChannelReceive::PreferredSampleRate() const {
> ChannelReceive::ChannelReceive(
>     ProcessThread* module_process_thread,
>     AudioDeviceModule* audio_device_module,
>+    MediaTransportInterface* media_transport,
>     Transport* rtcp_send_transport,
>     RtcEventLog* rtc_event_log,
>     uint32_t remote_ssrc,
>     size_t jitter_buffer_max_packets,
>     bool jitter_buffer_fast_playout,
>+    int jitter_buffer_min_delay_ms,
>     rtc::scoped_refptr<AudioDecoderFactory> decoder_factory,
>     absl::optional<AudioCodecPairId> codec_pair_id,
>-    FrameDecryptorInterface* frame_decryptor)
>+    rtc::scoped_refptr<FrameDecryptorInterface> frame_decryptor,
>+    const webrtc::CryptoOptions& crypto_options)
>     : event_log_(rtc_event_log),
>       rtp_receive_statistics_(
>           ReceiveStatistics::Create(Clock::GetRealTimeClock())),
>@@ -222,7 +470,12 @@ ChannelReceive::ChannelReceive(
>       _audioDeviceModulePtr(audio_device_module),
>       _outputGain(1.0f),
>       associated_send_channel_(nullptr),
>-      frame_decryptor_(frame_decryptor) {
>+      media_transport_(media_transport),
>+      frame_decryptor_(frame_decryptor),
>+      crypto_options_(crypto_options) {
>+  // TODO(nisse): Use _moduleProcessThreadPtr instead?
>+  module_process_thread_checker_.DetachFromThread();
>+
>   RTC_DCHECK(module_process_thread);
>   RTC_DCHECK(audio_device_module);
>   AudioCodingModule::Config acm_config;
>@@ -230,6 +483,7 @@ ChannelReceive::ChannelReceive(
>   acm_config.neteq_config.codec_pair_id = codec_pair_id;
>   acm_config.neteq_config.max_packets_in_buffer = jitter_buffer_max_packets;
>   acm_config.neteq_config.enable_fast_accelerate = jitter_buffer_fast_playout;
>+  acm_config.neteq_config.min_delay_ms = jitter_buffer_min_delay_ms;
>   acm_config.neteq_config.enable_muted_state = true;
>   audio_coding_.reset(AudioCodingModule::Create(acm_config));
> 
>@@ -238,9 +492,7 @@
>   rtp_receive_statistics_->EnableRetransmitDetection(remote_ssrc_, true);
>   RtpRtcp::Configuration configuration;
>   configuration.audio = true;
>-  // TODO(nisse): Also set receiver_only = true, but that seems to break RTT
>-  // estimation, resulting in test failures for
>-  // PeerConnectionIntegrationTest.GetCaptureStartNtpTimeWithOldStatsApi
>+  configuration.receiver_only = true;
>   configuration.outgoing_transport = rtcp_send_transport;
>   configuration.receive_statistics = rtp_receive_statistics_.get();
> 
>@@ -249,26 +501,9 @@
>   _rtpRtcpModule.reset(RtpRtcp::CreateRtpRtcp(configuration));
>   _rtpRtcpModule->SetSendingMediaStatus(false);
>   _rtpRtcpModule->SetRemoteSSRC(remote_ssrc_);
>-  Init();
>-}
>-
>-ChannelReceive::~ChannelReceive() {
>-  Terminate();
>-  RTC_DCHECK(!channel_state_.Get().playing);
>-}
>-
>-void ChannelReceive::Init() {
>-  channel_state_.Reset();
> 
>-  // --- Add modules to process thread (for periodic schedulation)
>   _moduleProcessThreadPtr->RegisterModule(_rtpRtcpModule.get(), RTC_FROM_HERE);
> 
>-  // --- ACM initialization
>-  int error = audio_coding_->InitializeReceiver();
>-  RTC_DCHECK_EQ(0, error);
>-
>-  // --- RTP/RTCP module initialization
>-
>   // Ensure that RTCP is enabled by default for the created channel.
>   // Note that, the module will keep generating RTCP until it is explicitly
>   // disabled by the user.
>@@ -276,60 +511,54 @@ void ChannelReceive::Init() {
>   // be transmitted since the Transport object will then be invalid.
>   // RTCP is enabled by default.
>   _rtpRtcpModule->SetRTCPStatus(RtcpMode::kCompound);
>+
>+  if (media_transport_) {
>+    media_transport_->SetReceiveAudioSink(this);
>+  }
> }
> 
>-void ChannelReceive::Terminate() {
>+ChannelReceive::~ChannelReceive() {
>   RTC_DCHECK(construction_thread_.CalledOnValidThread());
>-  // Must be called on the same thread as Init().
>-  rtp_receive_statistics_->RegisterRtcpStatisticsCallback(NULL);
>+
>+  if (media_transport_) {
>+    media_transport_->SetReceiveAudioSink(nullptr);
>+  }
> 
>   StopPlayout();
> 
>-  // The order to safely shutdown modules in a channel is:
>-  // 1. De-register callbacks in modules
>-  // 2. De-register modules in process thread
>-  // 3. Destroy modules
>   int error = audio_coding_->RegisterTransportCallback(NULL);
>   RTC_DCHECK_EQ(0, error);
> 
>-  // De-register modules in process thread
>   if (_moduleProcessThreadPtr)
>     _moduleProcessThreadPtr->DeRegisterModule(_rtpRtcpModule.get());
>-
>-  // End of modules shutdown
> }
> 
> void ChannelReceive::SetSink(AudioSinkInterface* sink) {
>+  RTC_DCHECK(worker_thread_checker_.CalledOnValidThread());
>   rtc::CritScope cs(&_callbackCritSect);
>   audio_sink_ = sink;
> }
> 
>-int32_t ChannelReceive::StartPlayout() {
>-  if (channel_state_.Get().playing) {
>-    return 0;
>-  }
>-
>-  channel_state_.SetPlaying(true);
>-
>-  return 0;
>+void ChannelReceive::StartPlayout() {
>+  RTC_DCHECK(worker_thread_checker_.CalledOnValidThread());
>+  rtc::CritScope lock(&playing_lock_);
>+  playing_ = true;
> }
> 
>-int32_t ChannelReceive::StopPlayout() {
>-  if (!channel_state_.Get().playing) {
>-    return 0;
>-  }
>-
>-  channel_state_.SetPlaying(false);
>+void ChannelReceive::StopPlayout() {
>+  RTC_DCHECK(worker_thread_checker_.CalledOnValidThread());
>+  rtc::CritScope lock(&playing_lock_);
>+  playing_ = false;
>   _outputAudioLevel.Clear();
>-
>-  return 0;
> }
> 
>-int32_t ChannelReceive::GetRecCodec(CodecInst& codec) {
>-  return (audio_coding_->ReceiveCodec(&codec));
>+bool ChannelReceive::GetRecCodec(CodecInst* codec) const {
>+  RTC_DCHECK(worker_thread_checker_.CalledOnValidThread());
>+  return (audio_coding_->ReceiveCodec(codec) == 0);
> }
> 
> std::vector<webrtc::RtpSource> ChannelReceive::GetSources() const {
>+  RTC_DCHECK(worker_thread_checker_.CalledOnValidThread());
>   int64_t now_ms = rtc::TimeMillis();
>   std::vector<RtpSource> sources;
>   {
>@@ -347,6 +576,7 @@ std::vector<webrtc::RtpSource> ChannelReceive::GetSources() const {
> 
> void ChannelReceive::SetReceiveCodecs(
>     const std::map<int, SdpAudioFormat>& codecs) {
>+  RTC_DCHECK(worker_thread_checker_.CalledOnValidThread());
>   for (const auto& kv : codecs) {
>     RTC_DCHECK_GE(kv.second.clockrate_hz, 1000);
>     payload_type_frequencies_[kv.first] = kv.second.clockrate_hz;
>@@ -354,7 +584,7 @@ void ChannelReceive::SetReceiveCodecs(
>   audio_coding_->SetReceiveCodecs(codecs);
> }
> 
>-// TODO(nisse): Move receive logic up to AudioReceiveStream.
>+// May be called on either worker thread or network thread.
> void ChannelReceive::OnRtpPacket(const RtpPacketReceived& packet) {
>   int64_t now_ms = rtc::TimeMillis();
>   uint8_t audio_level;
>@@ -369,7 +599,9 @@ void ChannelReceive::OnRtpPacket(const RtpPacketReceived& packet) {
>     if (has_audio_level)
>       last_received_rtp_audio_level_ = audio_level;
>     std::vector<uint32_t> csrcs = packet.Csrcs();
>-    contributing_sources_.Update(now_ms, csrcs);
>+    contributing_sources_.Update(
>+        now_ms, csrcs,
>+        has_audio_level ? absl::optional<uint8_t>(audio_level) : absl::nullopt);
>   }
> 
>   // Store playout timestamp for the received RTP packet
>@@ -429,6 +661,10 @@ bool ChannelReceive::ReceivePacket(const uint8_t* packet,
>       // Update the final payload.
>       payload = decrypted_audio_payload.data();
>       payload_data_length = decrypted_audio_payload.size();
>+    } else if (crypto_options_.sframe.require_frame_encryption) {
>+      RTC_DLOG(LS_ERROR)
>+          << "FrameDecryptor required but not set, dropping packet";
>+      payload_data_length = 0;
>     }
> 
>   if (payload_data_length == 0) {
>@@ -439,7 +675,9 @@ bool ChannelReceive::ReceivePacket(const uint8_t* packet,
>                                &webrtc_rtp_header);
> }
> 
>-int32_t ChannelReceive::ReceivedRTCPPacket(const uint8_t* data, size_t length) {
>+// May be called on either worker thread or network thread.
>+// TODO(nisse): Drop always-true return value.
>+bool ChannelReceive::ReceivedRTCPPacket(const uint8_t* data, size_t length) {
>   // Store playout timestamp for the received RTCP packet
>   UpdatePlayoutTimestamp(true);
> 
>@@ -449,7 +687,7 @@ int32_t ChannelReceive::ReceivedRTCPPacket(const uint8_t* data, size_t length) {
>   int64_t rtt = GetRTT();
>   if (rtt == 0) {
>     // Waiting for valid RTT.
>-    return 0;
>+    return true;
>   }
> 
>   int64_t nack_window_ms = rtt;
>@@ -465,46 +703,45 @@ int32_t ChannelReceive::ReceivedRTCPPacket(const uint8_t* data, size_t length) {
>   if (0 != _rtpRtcpModule->RemoteNTP(&ntp_secs, &ntp_frac, NULL, NULL,
>                                      &rtp_timestamp)) {
>     // Waiting for RTCP.
>-    return 0;
>+    return true;
>   }
> 
>   {
>     rtc::CritScope lock(&ts_stats_lock_);
>     ntp_estimator_.UpdateRtcpTimestamp(rtt, ntp_secs, ntp_frac, rtp_timestamp);
>   }
>-  return 0;
>+  return true;
> }
> 
> int ChannelReceive::GetSpeechOutputLevelFullRange() const {
>+  RTC_DCHECK(worker_thread_checker_.CalledOnValidThread());
>   return _outputAudioLevel.LevelFullRange();
> }
> 
> double ChannelReceive::GetTotalOutputEnergy() const {
>+  RTC_DCHECK(worker_thread_checker_.CalledOnValidThread());
>   return _outputAudioLevel.TotalEnergy();
> }
> 
> double ChannelReceive::GetTotalOutputDuration() const {
>+  RTC_DCHECK(worker_thread_checker_.CalledOnValidThread());
>   return _outputAudioLevel.TotalDuration();
> }
> 
> void ChannelReceive::SetChannelOutputVolumeScaling(float scaling) {
>+  RTC_DCHECK(worker_thread_checker_.CalledOnValidThread());
>   rtc::CritScope cs(&volume_settings_critsect_);
>   _outputGain = scaling;
> }
> 
>-int ChannelReceive::SetLocalSSRC(unsigned int ssrc) {
>+void ChannelReceive::SetLocalSSRC(uint32_t ssrc) {
>+  RTC_DCHECK(worker_thread_checker_.CalledOnValidThread());
>   _rtpRtcpModule->SetSSRC(ssrc);
>-  return 0;
>-}
>-
>-// TODO(nisse): Pass ssrc in return value instead.
>-int ChannelReceive::GetRemoteSSRC(unsigned int& ssrc) {
>-  ssrc = remote_ssrc_;
>-  return 0;
> }
> 
> void ChannelReceive::RegisterReceiverCongestionControlObjects(
>     PacketRouter* packet_router) {
>+  RTC_DCHECK(worker_thread_checker_.CalledOnValidThread());
>   RTC_DCHECK(packet_router);
>   RTC_DCHECK(!packet_router_);
>   constexpr bool remb_candidate = false;
>@@ -513,13 +750,16 @@ void ChannelReceive::RegisterReceiverCongestionControlObjects(
> }
> 
> void ChannelReceive::ResetReceiverCongestionControlObjects() {
>+  RTC_DCHECK(worker_thread_checker_.CalledOnValidThread());
>   RTC_DCHECK(packet_router_);
>   packet_router_->RemoveReceiveRtpModule(_rtpRtcpModule.get());
>   packet_router_ = nullptr;
> }
> 
>-int ChannelReceive::GetRTPStatistics(CallReceiveStatistics& stats) {
>+CallReceiveStatistics ChannelReceive::GetRTCPStatistics() const {
>+  RTC_DCHECK(worker_thread_checker_.CalledOnValidThread());
>   // --- RtcpStatistics
>+  CallReceiveStatistics stats;
> 
>   // The jitter statistics is updated for each received RTP packet and is
>   // based on received packets.
>@@ -556,14 +796,15 @@ int ChannelReceive::GetRTPStatistics(CallReceiveStatistics& stats) {
>     rtc::CritScope lock(&ts_stats_lock_);
>     stats.capture_start_ntp_time_ms_ = capture_start_ntp_time_ms_;
>   }
>-  return 0;
>+  return stats;
> }
> 
>-void ChannelReceive::SetNACKStatus(bool enable, int maxNumberOfPackets) {
>+void ChannelReceive::SetNACKStatus(bool enable, int max_packets) {
>+  RTC_DCHECK(worker_thread_checker_.CalledOnValidThread());
>   // None of these functions can fail.
>-  rtp_receive_statistics_->SetMaxReorderingThreshold(maxNumberOfPackets);
>+  rtp_receive_statistics_->SetMaxReorderingThreshold(max_packets);
>   if (enable)
>-    audio_coding_->EnableNack(maxNumberOfPackets);
>+    audio_coding_->EnableNack(max_packets);
>   else
>     audio_coding_->DisableNack();
> }
>@@ -574,54 +815,61 @@ int ChannelReceive::ResendPackets(const uint16_t* sequence_numbers,
>   return _rtpRtcpModule->SendNACK(sequence_numbers, length);
> }
> 
>-void ChannelReceive::SetAssociatedSendChannel(ChannelSend* channel) {
>+void ChannelReceive::SetAssociatedSendChannel(
>+    const ChannelSendInterface* channel) {
>+  RTC_DCHECK(worker_thread_checker_.CalledOnValidThread());
>   rtc::CritScope lock(&assoc_send_channel_lock_);
>   associated_send_channel_ = channel;
> }
> 
>-int ChannelReceive::GetNetworkStatistics(NetworkStatistics& stats) {
>-  return audio_coding_->GetNetworkStatistics(&stats);
>+NetworkStatistics ChannelReceive::GetNetworkStatistics() const {
>+  RTC_DCHECK(worker_thread_checker_.CalledOnValidThread());
>+  NetworkStatistics stats;
>+  int error = audio_coding_->GetNetworkStatistics(&stats);
>+  RTC_DCHECK_EQ(0, error);
>+  return stats;
> }
> 
>-void ChannelReceive::GetDecodingCallStatistics(
>-    AudioDecodingCallStats* stats) const {
>-  audio_coding_->GetDecodingCallStatistics(stats);
>+AudioDecodingCallStats ChannelReceive::GetDecodingCallStatistics() const {
>+  RTC_DCHECK(worker_thread_checker_.CalledOnValidThread());
>+  AudioDecodingCallStats stats;
>+  audio_coding_->GetDecodingCallStatistics(&stats);
>+  return stats;
> }
> 
> uint32_t ChannelReceive::GetDelayEstimate() const {
>+  RTC_DCHECK(worker_thread_checker_.CalledOnValidThread() ||
>+             module_process_thread_checker_.CalledOnValidThread());
>   rtc::CritScope lock(&video_sync_lock_);
>   return audio_coding_->FilteredCurrentDelayMs() + playout_delay_ms_;
> }
> 
>-int ChannelReceive::SetMinimumPlayoutDelay(int delayMs) {
>-  if ((delayMs < kVoiceEngineMinMinPlayoutDelayMs) ||
>-      (delayMs > kVoiceEngineMaxMinPlayoutDelayMs)) {
>+void ChannelReceive::SetMinimumPlayoutDelay(int delay_ms) {
>+  RTC_DCHECK(module_process_thread_checker_.CalledOnValidThread());
>+  // Limit to range accepted by both VoE and ACM, so we're at least getting as
>+  // close as possible, instead of failing.
>+  delay_ms = rtc::SafeClamp(delay_ms, 0, 10000);
>+  if ((delay_ms < kVoiceEngineMinMinPlayoutDelayMs) ||
>+      (delay_ms > kVoiceEngineMaxMinPlayoutDelayMs)) {
>     RTC_DLOG(LS_ERROR) << "SetMinimumPlayoutDelay() invalid min delay";
>-    return -1;
>+    return;
>   }
>-  if (audio_coding_->SetMinimumPlayoutDelay(delayMs) != 0) {
>+  if (audio_coding_->SetMinimumPlayoutDelay(delay_ms) != 0) {
>     RTC_DLOG(LS_ERROR)
>         << "SetMinimumPlayoutDelay() failed to set min playout delay";
>-    return -1;
>   }
>-  return 0;
> }
> 
>-int ChannelReceive::GetPlayoutTimestamp(unsigned int& timestamp) {
>-  uint32_t playout_timestamp_rtp = 0;
>+uint32_t ChannelReceive::GetPlayoutTimestamp() const {
>+  RTC_DCHECK_RUNS_SERIALIZED(&video_capture_thread_race_checker_);
>   {
>     rtc::CritScope lock(&video_sync_lock_);
>-    playout_timestamp_rtp = playout_timestamp_rtp_;
>-  }
>-  if (playout_timestamp_rtp == 0) {
>-    RTC_DLOG(LS_ERROR) << "GetPlayoutTimestamp() failed to retrieve timestamp";
>-    return -1;
>+    return playout_timestamp_rtp_;
>   }
>-  timestamp = playout_timestamp_rtp;
>-  return 0;
> }
> 
> absl::optional<Syncable::Info> ChannelReceive::GetSyncInfo() const {
>+  RTC_DCHECK(module_process_thread_checker_.CalledOnValidThread());
>   Syncable::Info info;
>   if (_rtpRtcpModule->RemoteNTP(&info.capture_time_ntp_secs,
>                                 &info.capture_time_ntp_frac, nullptr, nullptr,
>@@ -683,6 +931,14 @@ int ChannelReceive::GetRtpTimestampRateHz() const {
> }
> 
> int64_t ChannelReceive::GetRTT() const {
>+  if (media_transport_) {
>+    auto target_rate = media_transport_->GetLatestTargetTransferRate();
>+    if (target_rate.has_value()) {
>+      return target_rate->network_estimate.round_trip_time.ms();
>+    }
>+
>+    return 0;
>+  }
>   RtcpMode method = _rtpRtcpModule->RTCP();
>   if (method == RtcpMode::kOff) {
>     return 0;
>@@ -705,6 +961,8 @@ int64_t ChannelReceive::GetRTT() const {
>   int64_t avg_rtt = 0;
>   int64_t max_rtt = 0;
>   int64_t min_rtt = 0;
>+  // TODO(nisse): This method computes RTT based on sender reports, even though
>+  // a receive stream is not supposed to do that.
>   if (_rtpRtcpModule->RTT(remote_ssrc_, &rtt, &avg_rtt, &min_rtt, &max_rtt) !=
>       0) {
>     return 0;
>@@ -712,5 +970,29 @@
>   return rtt;
> }
> 
>+}  // namespace
>+
>+std::unique_ptr<ChannelReceiveInterface> CreateChannelReceive(
>+    ProcessThread* module_process_thread,
>+    AudioDeviceModule* audio_device_module,
>+    MediaTransportInterface* media_transport,
>+    Transport* rtcp_send_transport,
>+    RtcEventLog* rtc_event_log,
>+    uint32_t remote_ssrc,
>+    size_t jitter_buffer_max_packets,
>+    bool jitter_buffer_fast_playout,
>+    int jitter_buffer_min_delay_ms,
>+    rtc::scoped_refptr<AudioDecoderFactory> decoder_factory,
>+    absl::optional<AudioCodecPairId> codec_pair_id,
>+    rtc::scoped_refptr<FrameDecryptorInterface> frame_decryptor,
>+    const webrtc::CryptoOptions& crypto_options) {
>+  return absl::make_unique<ChannelReceive>(
>+      module_process_thread, audio_device_module, media_transport,
>+      rtcp_send_transport, rtc_event_log, remote_ssrc,
>+      jitter_buffer_max_packets, jitter_buffer_fast_playout,
>+      jitter_buffer_min_delay_ms, decoder_factory, codec_pair_id,
>+      frame_decryptor, crypto_options);
>+}
>+
> }  // namespace voe
> }  // namespace webrtc
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_receive.h b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_receive.h
>index 82eb4dff0c4b55749a3fe69665a137dd4f33b397..90276234a137a16c29ab14396d08173a6bbd1701 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_receive.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_receive.h
>@@ -17,22 +17,18 @@
> 
> #include "absl/types/optional.h"
> #include "api/audio/audio_mixer.h"
>+#include "api/audio_codecs/audio_decoder_factory.h"
> #include "api/call/audio_sink.h"
> #include "api/call/transport.h"
>+#include "api/crypto/cryptooptions.h"
>+#include "api/media_transport_interface.h"
> #include "api/rtpreceiverinterface.h"
>-#include "audio/audio_level.h"
>+#include "call/rtp_packet_sink_interface.h"
> #include "call/syncable.h"
>-#include "common_types.h"  // NOLINT(build/include)
> #include "modules/audio_coding/include/audio_coding_module.h"
>-#include "modules/rtp_rtcp/include/remote_ntp_time_estimator.h"
>-#include "modules/rtp_rtcp/include/rtp_header_parser.h"
>-#include "modules/rtp_rtcp/include/rtp_rtcp.h"
>-#include "modules/rtp_rtcp/source/contributing_sources.h"
>-#include "rtc_base/criticalsection.h"
>-#include "rtc_base/thread_checker.h"
> 
> // TODO(solenberg, nisse): This file contains a few NOLINT marks, to silence
>-// warnings about use of unsigned short, and non-const reference arguments.
>+// warnings about use of unsigned short.
> // These need cleanup, in a separate cl.
> 
> namespace rtc {
>@@ -66,201 +62,85 @@ struct CallReceiveStatistics {
> 
> namespace voe {
> 
>-class ChannelSend;
>+class ChannelSendInterface;
> 
>-// Helper class to simplify locking scheme for members that are accessed from
>-// multiple threads.
>-// Example: a member can be set on thread T1 and read by an internal audio
>-// thread T2. Accessing the member via this class ensures that we are
>-// safe and also avoid TSan v2 warnings.
>-class ChannelReceiveState {
>- public:
>-  struct State {
>-    bool playing = false;
>-  };
>-
>-  ChannelReceiveState() {}
>-  virtual ~ChannelReceiveState() {}
>-
>-  void Reset() {
>-    rtc::CritScope lock(&lock_);
>-    state_ = State();
>-  }
>-
>-  State Get() const {
>-    rtc::CritScope lock(&lock_);
>-    return state_;
>-  }
>-
>-  void SetPlaying(bool enable) {
>-    rtc::CritScope lock(&lock_);
>-    state_.playing = enable;
>-  }
>+// Interface class needed for AudioReceiveStream tests that use a
>+// MockChannelReceive.
> 
>- private:
>-  rtc::CriticalSection lock_;
>-  State state_;
>-};
>-
>-class ChannelReceive : public RtpData {
>+class ChannelReceiveInterface : public RtpPacketSinkInterface {
>  public:
>-  // Used for receive streams.
>-  ChannelReceive(ProcessThread* module_process_thread,
>-                 AudioDeviceModule* audio_device_module,
>-                 Transport* rtcp_send_transport,
>-                 RtcEventLog* rtc_event_log,
>-                 uint32_t remote_ssrc,
>-                 size_t jitter_buffer_max_packets,
>-                 bool jitter_buffer_fast_playout,
>-                 rtc::scoped_refptr<AudioDecoderFactory> decoder_factory,
>-                 absl::optional<AudioCodecPairId> codec_pair_id,
>-                 FrameDecryptorInterface* frame_decryptor);
>-  virtual ~ChannelReceive();
>+  virtual ~ChannelReceiveInterface() = default;
> 
>-  void SetSink(AudioSinkInterface* sink);
>+  virtual void SetSink(AudioSinkInterface* sink) = 0;
> 
>-  void SetReceiveCodecs(const std::map<int, SdpAudioFormat>& codecs);
>+  virtual void SetReceiveCodecs(
>+      const std::map<int, SdpAudioFormat>& codecs) = 0;
> 
>-  // API methods
>+  virtual void StartPlayout() = 0;
>+  virtual void StopPlayout() = 0;
> 
>-  // VoEBase
>-  int32_t StartPlayout();
>-  int32_t StopPlayout();
>+  virtual bool GetRecCodec(CodecInst* codec) const = 0;
> 
>-  // Codecs
>-  int32_t GetRecCodec(CodecInst& codec);  // NOLINT
>+  virtual bool ReceivedRTCPPacket(const uint8_t* data, size_t length) = 0;
> 
>-  // TODO(nisse, solenberg): Delete when VoENetwork is deleted.
>-  int32_t ReceivedRTCPPacket(const uint8_t* data, size_t length);
>-  void OnRtpPacket(const RtpPacketReceived& packet);
>-
>-  // Muting, Volume and Level.
>-  void SetChannelOutputVolumeScaling(float scaling);
>-  int GetSpeechOutputLevelFullRange() const;
>+  virtual void SetChannelOutputVolumeScaling(float scaling) = 0;
>+  virtual int GetSpeechOutputLevelFullRange() const = 0;
>   // See description of "totalAudioEnergy" in the WebRTC stats spec:
>   // https://w3c.github.io/webrtc-stats/#dom-rtcmediastreamtrackstats-totalaudioenergy
>-  double GetTotalOutputEnergy() const;
>-  double GetTotalOutputDuration() const;
>+  virtual double GetTotalOutputEnergy() const = 0;
>+  virtual double GetTotalOutputDuration() const = 0;
> 
>   // Stats.
>-  int GetNetworkStatistics(NetworkStatistics& stats);  // NOLINT
>-  void GetDecodingCallStatistics(AudioDecodingCallStats* stats) const;
>+  virtual NetworkStatistics GetNetworkStatistics() const = 0;
>+  virtual AudioDecodingCallStats GetDecodingCallStatistics() const = 0;
> 
>   // Audio+Video Sync.
>-  uint32_t GetDelayEstimate() const;
>-  int SetMinimumPlayoutDelay(int delayMs);
>-  int GetPlayoutTimestamp(unsigned int& timestamp);  // NOLINT
>+  virtual uint32_t GetDelayEstimate() const = 0;
>+  virtual void SetMinimumPlayoutDelay(int delay_ms) = 0;
>+  virtual uint32_t GetPlayoutTimestamp() const = 0;
> 
>   // Produces the transport-related timestamps; current_delay_ms is left unset.
>-  absl::optional<Syncable::Info> GetSyncInfo() const;
>+  virtual absl::optional<Syncable::Info> GetSyncInfo() const = 0;
> 
>   // RTP+RTCP
>-  int SetLocalSSRC(unsigned int ssrc);
>-
>-  void RegisterReceiverCongestionControlObjects(PacketRouter* packet_router);
>-  void ResetReceiverCongestionControlObjects();
>+  virtual void SetLocalSSRC(uint32_t ssrc) = 0;
> 
>-  int GetRTPStatistics(CallReceiveStatistics& stats);  // NOLINT
>-  void SetNACKStatus(bool enable, int maxNumberOfPackets);
>+  virtual void RegisterReceiverCongestionControlObjects(
>+      PacketRouter* packet_router) = 0;
>+  virtual void ResetReceiverCongestionControlObjects() = 0;
> 
>-  // From RtpData in the RTP/RTCP module
>-  int32_t OnReceivedPayloadData(const uint8_t* payloadData,
>-                                size_t payloadSize,
>-                                const WebRtcRTPHeader* rtpHeader) override;
>+  virtual CallReceiveStatistics GetRTCPStatistics() const = 0;
>+  virtual void SetNACKStatus(bool enable, int max_packets) = 0;
> 
>-  // From AudioMixer::Source.
>-  AudioMixer::Source::AudioFrameInfo GetAudioFrameWithInfo(
>+  virtual AudioMixer::Source::AudioFrameInfo GetAudioFrameWithInfo(
>       int sample_rate_hz,
>-      AudioFrame* audio_frame);
>+      AudioFrame* audio_frame) = 0;
> 
>-  int PreferredSampleRate() const;
>+  virtual int PreferredSampleRate() const = 0;
> 
>   // Associate to a send channel.
>   // Used for obtaining RTT for a receive-only channel.
>- void SetAssociatedSendChannel(ChannelSend* channel); >- >- std::vector<RtpSource> GetSources() const; >- >- private: >- void Init(); >- void Terminate(); >- >- int GetRemoteSSRC(unsigned int& ssrc); // NOLINT >- >- bool ReceivePacket(const uint8_t* packet, >- size_t packet_length, >- const RTPHeader& header); >- int ResendPackets(const uint16_t* sequence_numbers, int length); >- void UpdatePlayoutTimestamp(bool rtcp); >- >- int GetRtpTimestampRateHz() const; >- int64_t GetRTT() const; >- >- rtc::CriticalSection _callbackCritSect; >- rtc::CriticalSection volume_settings_critsect_; >- >- ChannelReceiveState channel_state_; >- >- RtcEventLog* const event_log_; >+ virtual void SetAssociatedSendChannel( >+ const ChannelSendInterface* channel) = 0; > >- // Indexed by payload type. >- std::map<uint8_t, int> payload_type_frequencies_; >- >- std::unique_ptr<ReceiveStatistics> rtp_receive_statistics_; >- std::unique_ptr<RtpRtcp> _rtpRtcpModule; >- const uint32_t remote_ssrc_; >- >- // Info for GetSources and GetSyncInfo is updated on network or worker thread, >- // queried on the worker thread. >- rtc::CriticalSection rtp_sources_lock_; >- ContributingSources contributing_sources_ RTC_GUARDED_BY(&rtp_sources_lock_); >- absl::optional<uint32_t> last_received_rtp_timestamp_ >- RTC_GUARDED_BY(&rtp_sources_lock_); >- absl::optional<int64_t> last_received_rtp_system_time_ms_ >- RTC_GUARDED_BY(&rtp_sources_lock_); >- absl::optional<uint8_t> last_received_rtp_audio_level_ >- RTC_GUARDED_BY(&rtp_sources_lock_); >- >- std::unique_ptr<AudioCodingModule> audio_coding_; >- AudioSinkInterface* audio_sink_ = nullptr; >- AudioLevel _outputAudioLevel; >- >- RemoteNtpTimeEstimator ntp_estimator_ RTC_GUARDED_BY(ts_stats_lock_); >- >- // Timestamp of the audio pulled from NetEq. 
>- absl::optional<uint32_t> jitter_buffer_playout_timestamp_; >- >- rtc::CriticalSection video_sync_lock_; >- uint32_t playout_timestamp_rtp_ RTC_GUARDED_BY(video_sync_lock_); >- uint32_t playout_delay_ms_ RTC_GUARDED_BY(video_sync_lock_); >- >- rtc::CriticalSection ts_stats_lock_; >- >- std::unique_ptr<rtc::TimestampWrapAroundHandler> rtp_ts_wraparound_handler_; >- // The rtp timestamp of the first played out audio frame. >- int64_t capture_start_rtp_time_stamp_; >- // The capture ntp time (in local timebase) of the first played out audio >- // frame. >- int64_t capture_start_ntp_time_ms_ RTC_GUARDED_BY(ts_stats_lock_); >- >- // uses >- ProcessThread* _moduleProcessThreadPtr; >- AudioDeviceModule* _audioDeviceModulePtr; >- float _outputGain RTC_GUARDED_BY(volume_settings_critsect_); >- >- // An associated send channel. >- rtc::CriticalSection assoc_send_channel_lock_; >- ChannelSend* associated_send_channel_ >- RTC_GUARDED_BY(assoc_send_channel_lock_); >- >- PacketRouter* packet_router_ = nullptr; >- >- rtc::ThreadChecker construction_thread_; >- >- // E2EE Audio Frame Decryption >- FrameDecryptorInterface* frame_decryptor_ = nullptr; >+ virtual std::vector<RtpSource> GetSources() const = 0; > }; > >+std::unique_ptr<ChannelReceiveInterface> CreateChannelReceive( >+ ProcessThread* module_process_thread, >+ AudioDeviceModule* audio_device_module, >+ MediaTransportInterface* media_transport, >+ Transport* rtcp_send_transport, >+ RtcEventLog* rtc_event_log, >+ uint32_t remote_ssrc, >+ size_t jitter_buffer_max_packets, >+ bool jitter_buffer_fast_playout, >+ int jitter_buffer_min_delay_ms, >+ rtc::scoped_refptr<AudioDecoderFactory> decoder_factory, >+ absl::optional<AudioCodecPairId> codec_pair_id, >+ rtc::scoped_refptr<FrameDecryptorInterface> frame_decryptor, >+ const webrtc::CryptoOptions& crypto_options); >+ > } // namespace voe > } // namespace webrtc > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_receive_proxy.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_receive_proxy.cc >deleted file mode 100644 >index b1c1c458326ac9ad92f2e59383f04137c89c6295..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_receive_proxy.cc >+++ /dev/null >@@ -1,197 +0,0 @@ >-/* >- * Copyright (c) 2015 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include "audio/channel_receive_proxy.h" >- >-#include <utility> >- >-#include "api/call/audio_sink.h" >-#include "audio/channel_send_proxy.h" >-#include "call/rtp_transport_controller_send_interface.h" >-#include "rtc_base/checks.h" >-#include "rtc_base/logging.h" >-#include "rtc_base/numerics/safe_minmax.h" >- >-namespace webrtc { >-namespace voe { >-ChannelReceiveProxy::ChannelReceiveProxy() {} >- >-ChannelReceiveProxy::ChannelReceiveProxy( >- std::unique_ptr<ChannelReceive> channel) >- : channel_(std::move(channel)) { >- RTC_DCHECK(channel_); >- module_process_thread_checker_.DetachFromThread(); >-} >- >-ChannelReceiveProxy::~ChannelReceiveProxy() {} >- >-void ChannelReceiveProxy::SetLocalSSRC(uint32_t ssrc) { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- int error = channel_->SetLocalSSRC(ssrc); >- RTC_DCHECK_EQ(0, error); >-} >- >-void ChannelReceiveProxy::SetNACKStatus(bool enable, int max_packets) { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- channel_->SetNACKStatus(enable, max_packets); >-} >- >-CallReceiveStatistics ChannelReceiveProxy::GetRTCPStatistics() const { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- CallReceiveStatistics stats = {0}; >- int error = 
channel_->GetRTPStatistics(stats); >- RTC_DCHECK_EQ(0, error); >- return stats; >-} >- >-bool ChannelReceiveProxy::ReceivedRTCPPacket(const uint8_t* packet, >- size_t length) { >- // May be called on either worker thread or network thread. >- return channel_->ReceivedRTCPPacket(packet, length) == 0; >-} >- >-void ChannelReceiveProxy::RegisterReceiverCongestionControlObjects( >- PacketRouter* packet_router) { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- channel_->RegisterReceiverCongestionControlObjects(packet_router); >-} >- >-void ChannelReceiveProxy::ResetReceiverCongestionControlObjects() { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- channel_->ResetReceiverCongestionControlObjects(); >-} >- >-NetworkStatistics ChannelReceiveProxy::GetNetworkStatistics() const { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- NetworkStatistics stats = {0}; >- int error = channel_->GetNetworkStatistics(stats); >- RTC_DCHECK_EQ(0, error); >- return stats; >-} >- >-AudioDecodingCallStats ChannelReceiveProxy::GetDecodingCallStatistics() const { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- AudioDecodingCallStats stats; >- channel_->GetDecodingCallStatistics(&stats); >- return stats; >-} >- >-int ChannelReceiveProxy::GetSpeechOutputLevelFullRange() const { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- return channel_->GetSpeechOutputLevelFullRange(); >-} >- >-double ChannelReceiveProxy::GetTotalOutputEnergy() const { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- return channel_->GetTotalOutputEnergy(); >-} >- >-double ChannelReceiveProxy::GetTotalOutputDuration() const { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- return channel_->GetTotalOutputDuration(); >-} >- >-uint32_t ChannelReceiveProxy::GetDelayEstimate() const { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread() || >- module_process_thread_checker_.CalledOnValidThread()); >- return 
channel_->GetDelayEstimate(); >-} >- >-void ChannelReceiveProxy::SetReceiveCodecs( >- const std::map<int, SdpAudioFormat>& codecs) { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- channel_->SetReceiveCodecs(codecs); >-} >- >-void ChannelReceiveProxy::SetSink(AudioSinkInterface* sink) { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- channel_->SetSink(sink); >-} >- >-void ChannelReceiveProxy::OnRtpPacket(const RtpPacketReceived& packet) { >- // May be called on either worker thread or network thread. >- channel_->OnRtpPacket(packet); >-} >- >-void ChannelReceiveProxy::SetChannelOutputVolumeScaling(float scaling) { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- channel_->SetChannelOutputVolumeScaling(scaling); >-} >- >-AudioMixer::Source::AudioFrameInfo ChannelReceiveProxy::GetAudioFrameWithInfo( >- int sample_rate_hz, >- AudioFrame* audio_frame) { >- RTC_DCHECK_RUNS_SERIALIZED(&audio_thread_race_checker_); >- return channel_->GetAudioFrameWithInfo(sample_rate_hz, audio_frame); >-} >- >-int ChannelReceiveProxy::PreferredSampleRate() const { >- RTC_DCHECK_RUNS_SERIALIZED(&audio_thread_race_checker_); >- return channel_->PreferredSampleRate(); >-} >- >-void ChannelReceiveProxy::AssociateSendChannel( >- const ChannelSendProxy& send_channel_proxy) { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- channel_->SetAssociatedSendChannel(send_channel_proxy.GetChannel()); >-} >- >-void ChannelReceiveProxy::DisassociateSendChannel() { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- channel_->SetAssociatedSendChannel(nullptr); >-} >- >-absl::optional<Syncable::Info> ChannelReceiveProxy::GetSyncInfo() const { >- RTC_DCHECK(module_process_thread_checker_.CalledOnValidThread()); >- return channel_->GetSyncInfo(); >-} >- >-uint32_t ChannelReceiveProxy::GetPlayoutTimestamp() const { >- RTC_DCHECK_RUNS_SERIALIZED(&video_capture_thread_race_checker_); >- unsigned int timestamp = 0; >- int error = 
channel_->GetPlayoutTimestamp(timestamp); >- RTC_DCHECK(!error || timestamp == 0); >- return timestamp; >-} >- >-void ChannelReceiveProxy::SetMinimumPlayoutDelay(int delay_ms) { >- RTC_DCHECK(module_process_thread_checker_.CalledOnValidThread()); >- // Limit to range accepted by both VoE and ACM, so we're at least getting as >- // close as possible, instead of failing. >- delay_ms = rtc::SafeClamp(delay_ms, 0, 10000); >- int error = channel_->SetMinimumPlayoutDelay(delay_ms); >- if (0 != error) { >- RTC_LOG(LS_WARNING) << "Error setting minimum playout delay."; >- } >-} >- >-bool ChannelReceiveProxy::GetRecCodec(CodecInst* codec_inst) const { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- return channel_->GetRecCodec(*codec_inst) == 0; >-} >- >-std::vector<RtpSource> ChannelReceiveProxy::GetSources() const { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- return channel_->GetSources(); >-} >- >-void ChannelReceiveProxy::StartPlayout() { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- int error = channel_->StartPlayout(); >- RTC_DCHECK_EQ(0, error); >-} >- >-void ChannelReceiveProxy::StopPlayout() { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- int error = channel_->StopPlayout(); >- RTC_DCHECK_EQ(0, error); >-} >-} // namespace voe >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_receive_proxy.h b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_receive_proxy.h >deleted file mode 100644 >index 8ebacc300dcfe3a725e9c940903fd08b9dfb7557..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_receive_proxy.h >+++ /dev/null >@@ -1,109 +0,0 @@ >-/* >- * Copyright (c) 2015 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. 
An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef AUDIO_CHANNEL_RECEIVE_PROXY_H_ >-#define AUDIO_CHANNEL_RECEIVE_PROXY_H_ >- >-#include <map> >-#include <memory> >-#include <vector> >- >-#include "api/audio/audio_mixer.h" >-#include "api/rtpreceiverinterface.h" >-#include "audio/channel_receive.h" >-#include "call/rtp_packet_sink_interface.h" >-#include "rtc_base/constructormagic.h" >-#include "rtc_base/race_checker.h" >-#include "rtc_base/thread_checker.h" >- >-namespace webrtc { >- >-class AudioSinkInterface; >-class PacketRouter; >-class RtpPacketReceived; >-class Transport; >- >-namespace voe { >- >-class ChannelSendProxy; >- >-// This class provides the "view" of a voe::Channel that we need to implement >-// webrtc::AudioReceiveStream. It serves two purposes: >-// 1. Allow mocking just the interfaces used, instead of the entire >-// voe::Channel class. >-// 2. Provide a refined interface for the stream classes, including assumptions >-// on return values and input adaptation. 
>-class ChannelReceiveProxy : public RtpPacketSinkInterface { >- public: >- ChannelReceiveProxy(); >- explicit ChannelReceiveProxy(std::unique_ptr<ChannelReceive> channel); >- virtual ~ChannelReceiveProxy(); >- >- // Shared with ChannelSendProxy >- virtual void SetLocalSSRC(uint32_t ssrc); >- virtual void SetNACKStatus(bool enable, int max_packets); >- virtual CallReceiveStatistics GetRTCPStatistics() const; >- virtual bool ReceivedRTCPPacket(const uint8_t* packet, size_t length); >- >- virtual void RegisterReceiverCongestionControlObjects( >- PacketRouter* packet_router); >- virtual void ResetReceiverCongestionControlObjects(); >- virtual NetworkStatistics GetNetworkStatistics() const; >- virtual AudioDecodingCallStats GetDecodingCallStatistics() const; >- virtual int GetSpeechOutputLevelFullRange() const; >- // See description of "totalAudioEnergy" in the WebRTC stats spec: >- // https://w3c.github.io/webrtc-stats/#dom-rtcmediastreamtrackstats-totalaudioenergy >- virtual double GetTotalOutputEnergy() const; >- virtual double GetTotalOutputDuration() const; >- virtual uint32_t GetDelayEstimate() const; >- virtual void SetReceiveCodecs(const std::map<int, SdpAudioFormat>& codecs); >- virtual void SetSink(AudioSinkInterface* sink); >- >- // Implements RtpPacketSinkInterface >- void OnRtpPacket(const RtpPacketReceived& packet) override; >- >- virtual void SetChannelOutputVolumeScaling(float scaling); >- virtual AudioMixer::Source::AudioFrameInfo GetAudioFrameWithInfo( >- int sample_rate_hz, >- AudioFrame* audio_frame); >- virtual int PreferredSampleRate() const; >- virtual void AssociateSendChannel(const ChannelSendProxy& send_channel_proxy); >- virtual void DisassociateSendChannel(); >- >- // Produces the transport-related timestamps; current_delay_ms is left unset. 
>- absl::optional<Syncable::Info> GetSyncInfo() const; >- virtual uint32_t GetPlayoutTimestamp() const; >- virtual void SetMinimumPlayoutDelay(int delay_ms); >- virtual bool GetRecCodec(CodecInst* codec_inst) const; >- virtual std::vector<webrtc::RtpSource> GetSources() const; >- virtual void StartPlayout(); >- virtual void StopPlayout(); >- >- private: >- // Thread checkers document and lock usage of some methods on voe::Channel to >- // specific threads we know about. The goal is to eventually split up >- // voe::Channel into parts with single-threaded semantics, and thereby reduce >- // the need for locks. >- rtc::ThreadChecker worker_thread_checker_; >- rtc::ThreadChecker module_process_thread_checker_; >- // Methods accessed from audio and video threads are checked for sequential- >- // only access. We don't necessarily own and control these threads, so thread >- // checkers cannot be used. E.g. Chromium may transfer "ownership" from one >- // audio thread to another, but access is still sequential. 
>- rtc::RaceChecker audio_thread_race_checker_; >- rtc::RaceChecker video_capture_thread_race_checker_; >- std::unique_ptr<ChannelReceive> channel_; >- >- RTC_DISALLOW_COPY_AND_ASSIGN(ChannelReceiveProxy); >-}; >-} // namespace voe >-} // namespace webrtc >- >-#endif // AUDIO_CHANNEL_RECEIVE_PROXY_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_send.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_send.cc >index d3748b39909a493e32ad2196df926644aa852b98..c458fe465b07080d4f12632b81f817b62a3d1444 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_send.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_send.cc >@@ -19,19 +19,25 @@ > > #include "absl/memory/memory.h" > #include "api/array_view.h" >+#include "api/call/transport.h" > #include "api/crypto/frameencryptorinterface.h" > #include "audio/utility/audio_frame_operations.h" > #include "call/rtp_transport_controller_send_interface.h" > #include "logging/rtc_event_log/events/rtc_event_audio_playout.h" > #include "logging/rtc_event_log/rtc_event_log.h" > #include "modules/audio_coding/audio_network_adaptor/include/audio_network_adaptor_config.h" >+#include "modules/audio_coding/include/audio_coding_module.h" >+#include "modules/audio_processing/rms_level.h" > #include "modules/pacing/packet_router.h" > #include "modules/utility/include/process_thread.h" > #include "rtc_base/checks.h" > #include "rtc_base/criticalsection.h" >+#include "rtc_base/event.h" > #include "rtc_base/format_macros.h" > #include "rtc_base/location.h" > #include "rtc_base/logging.h" >+#include "rtc_base/numerics/safe_conversions.h" >+#include "rtc_base/race_checker.h" > #include "rtc_base/rate_limiter.h" > #include "rtc_base/task_queue.h" > #include "rtc_base/thread_checker.h" >@@ -47,7 +53,259 @@ namespace { > constexpr int64_t kMaxRetransmissionWindowMs = 1000; > constexpr int64_t kMinRetransmissionWindowMs = 30; > >-} // namespace 
>+MediaTransportEncodedAudioFrame::FrameType >+MediaTransportFrameTypeForWebrtcFrameType(webrtc::FrameType frame_type) { >+ switch (frame_type) { >+ case kAudioFrameSpeech: >+ return MediaTransportEncodedAudioFrame::FrameType::kSpeech; >+ break; >+ >+ case kAudioFrameCN: >+ return MediaTransportEncodedAudioFrame::FrameType:: >+ kDiscontinuousTransmission; >+ break; >+ >+ default: >+ RTC_CHECK(false) << "Unexpected frame type=" << frame_type; >+ break; >+ } >+} >+ >+class RtpPacketSenderProxy; >+class TransportFeedbackProxy; >+class TransportSequenceNumberProxy; >+class VoERtcpObserver; >+ >+class ChannelSend >+ : public ChannelSendInterface, >+ public Transport, >+ public OverheadObserver, >+ public AudioPacketizationCallback, // receive encoded packets from the >+ // ACM >+ public TargetTransferRateObserver { >+ public: >+ // TODO(nisse): Make OnUplinkPacketLossRate public, and delete friend >+ // declaration. >+ friend class VoERtcpObserver; >+ >+ ChannelSend(rtc::TaskQueue* encoder_queue, >+ ProcessThread* module_process_thread, >+ MediaTransportInterface* media_transport, >+ Transport* rtp_transport, >+ RtcpRttStats* rtcp_rtt_stats, >+ RtcEventLog* rtc_event_log, >+ FrameEncryptorInterface* frame_encryptor, >+ const webrtc::CryptoOptions& crypto_options, >+ bool extmap_allow_mixed, >+ int rtcp_report_interval_ms); >+ >+ ~ChannelSend() override; >+ >+ // Send using this encoder, with this payload type. >+ bool SetEncoder(int payload_type, >+ std::unique_ptr<AudioEncoder> encoder) override; >+ void ModifyEncoder(rtc::FunctionView<void(std::unique_ptr<AudioEncoder>*)> >+ modifier) override; >+ >+ // API methods >+ void StartSend() override; >+ void StopSend() override; >+ >+ // Codecs >+ void OnBitrateAllocation(BitrateAllocationUpdate update) override; >+ int GetBitrate() const override; >+ >+ // Network >+ bool ReceivedRTCPPacket(const uint8_t* data, size_t length) override; >+ >+ // Muting, Volume and Level. 
>+ void SetInputMute(bool enable) override; >+ >+ // Stats. >+ ANAStats GetANAStatistics() const override; >+ >+ // Used by AudioSendStream. >+ RtpRtcp* GetRtpRtcp() const override; >+ >+ // DTMF. >+ bool SendTelephoneEventOutband(int event, int duration_ms) override; >+ bool SetSendTelephoneEventPayloadType(int payload_type, >+ int payload_frequency) override; >+ >+ // RTP+RTCP >+ void SetLocalSSRC(uint32_t ssrc) override; >+ void SetMid(const std::string& mid, int extension_id) override; >+ void SetExtmapAllowMixed(bool extmap_allow_mixed) override; >+ void SetSendAudioLevelIndicationStatus(bool enable, int id) override; >+ void EnableSendTransportSequenceNumber(int id) override; >+ >+ void RegisterSenderCongestionControlObjects( >+ RtpTransportControllerSendInterface* transport, >+ RtcpBandwidthObserver* bandwidth_observer) override; >+ void ResetSenderCongestionControlObjects() override; >+ void SetRTCP_CNAME(absl::string_view c_name) override; >+ std::vector<ReportBlock> GetRemoteRTCPReportBlocks() const override; >+ CallSendStatistics GetRTCPStatistics() const override; >+ >+ // ProcessAndEncodeAudio() posts a task on the shared encoder task queue, >+ // which in turn calls (on the queue) ProcessAndEncodeAudioOnTaskQueue() where >+ // the actual processing of the audio takes place. The processing mainly >+ // consists of encoding and preparing the result for sending by adding it to a >+ // send queue. >+ // The main reason for using a task queue here is to release the native, >+ // OS-specific, audio capture thread as soon as possible to ensure that it >+ // can go back to sleep and be prepared to deliver an new captured audio >+ // packet. >+ void ProcessAndEncodeAudio(std::unique_ptr<AudioFrame> audio_frame) override; >+ >+ void SetTransportOverhead(size_t transport_overhead_per_packet) override; >+ >+ // The existence of this function alongside OnUplinkPacketLossRate is >+ // a compromise. 
We want the encoder to be agnostic of the PLR source, but >+ // we also don't want it to receive conflicting information from TWCC and >+ // from RTCP-XR. >+ void OnTwccBasedUplinkPacketLossRate(float packet_loss_rate) override; >+ >+ void OnRecoverableUplinkPacketLossRate( >+ float recoverable_packet_loss_rate) override; >+ >+ int64_t GetRTT() const override; >+ >+ // E2EE Custom Audio Frame Encryption >+ void SetFrameEncryptor( >+ rtc::scoped_refptr<FrameEncryptorInterface> frame_encryptor) override; >+ >+ private: >+ class ProcessAndEncodeAudioTask; >+ >+ // From AudioPacketizationCallback in the ACM >+ int32_t SendData(FrameType frameType, >+ uint8_t payloadType, >+ uint32_t timeStamp, >+ const uint8_t* payloadData, >+ size_t payloadSize, >+ const RTPFragmentationHeader* fragmentation) override; >+ >+ // From Transport (called by the RTP/RTCP module) >+ bool SendRtp(const uint8_t* data, >+ size_t len, >+ const PacketOptions& packet_options) override; >+ bool SendRtcp(const uint8_t* data, size_t len) override; >+ >+ // From OverheadObserver in the RTP/RTCP module >+ void OnOverheadChanged(size_t overhead_bytes_per_packet) override; >+ >+ void OnUplinkPacketLossRate(float packet_loss_rate); >+ bool InputMute() const; >+ >+ int SetSendRtpHeaderExtension(bool enable, RTPExtensionType type, int id); >+ >+ void UpdateOverheadForEncoder() >+ RTC_EXCLUSIVE_LOCKS_REQUIRED(overhead_per_packet_lock_); >+ >+ int32_t SendRtpAudio(FrameType frameType, >+ uint8_t payloadType, >+ uint32_t timeStamp, >+ rtc::ArrayView<const uint8_t> payload, >+ const RTPFragmentationHeader* fragmentation); >+ >+ int32_t SendMediaTransportAudio(FrameType frameType, >+ uint8_t payloadType, >+ uint32_t timeStamp, >+ rtc::ArrayView<const uint8_t> payload, >+ const RTPFragmentationHeader* fragmentation); >+ >+ // Return media transport or nullptr if using RTP. 
>+ MediaTransportInterface* media_transport() { return media_transport_; } >+ >+ // Called on the encoder task queue when a new input audio frame is ready >+ // for encoding. >+ void ProcessAndEncodeAudioOnTaskQueue(AudioFrame* audio_input); >+ >+ void OnReceivedRtt(int64_t rtt_ms); >+ >+ void OnTargetTransferRate(TargetTransferRate) override; >+ >+ // Thread checkers document and lock usage of some methods on voe::Channel to >+ // specific threads we know about. The goal is to eventually split up >+ // voe::Channel into parts with single-threaded semantics, and thereby reduce >+ // the need for locks. >+ rtc::ThreadChecker worker_thread_checker_; >+ rtc::ThreadChecker module_process_thread_checker_; >+ // Methods accessed from audio and video threads are checked for sequential- >+ // only access. We don't necessarily own and control these threads, so thread >+ // checkers cannot be used. E.g. Chromium may transfer "ownership" from one >+ // audio thread to another, but access is still sequential. >+ rtc::RaceChecker audio_thread_race_checker_; >+ >+ rtc::CriticalSection _callbackCritSect; >+ rtc::CriticalSection volume_settings_critsect_; >+ >+ bool sending_ RTC_GUARDED_BY(&worker_thread_checker_) = false; >+ >+ RtcEventLog* const event_log_; >+ >+ std::unique_ptr<RtpRtcp> _rtpRtcpModule; >+ >+ std::unique_ptr<AudioCodingModule> audio_coding_; >+ uint32_t _timeStamp RTC_GUARDED_BY(encoder_queue_); >+ >+ uint16_t send_sequence_number_; >+ >+ // uses >+ ProcessThread* const _moduleProcessThreadPtr; >+ Transport* const _transportPtr; // WebRtc socket or external transport >+ RmsLevel rms_level_ RTC_GUARDED_BY(encoder_queue_); >+ bool input_mute_ RTC_GUARDED_BY(volume_settings_critsect_); >+ bool previous_frame_muted_ RTC_GUARDED_BY(encoder_queue_); >+ // VoeRTP_RTCP >+ // TODO(henrika): can today be accessed on the main thread and on the >+ // task queue; hence potential race. 
>+ bool _includeAudioLevelIndication; >+ size_t transport_overhead_per_packet_ >+ RTC_GUARDED_BY(overhead_per_packet_lock_); >+ size_t rtp_overhead_per_packet_ RTC_GUARDED_BY(overhead_per_packet_lock_); >+ rtc::CriticalSection overhead_per_packet_lock_; >+ // RtcpBandwidthObserver >+ const std::unique_ptr<VoERtcpObserver> rtcp_observer_; >+ >+ PacketRouter* packet_router_ RTC_GUARDED_BY(&worker_thread_checker_) = >+ nullptr; >+ const std::unique_ptr<TransportFeedbackProxy> feedback_observer_proxy_; >+ const std::unique_ptr<TransportSequenceNumberProxy> seq_num_allocator_proxy_; >+ const std::unique_ptr<RtpPacketSenderProxy> rtp_packet_sender_proxy_; >+ const std::unique_ptr<RateLimiter> retransmission_rate_limiter_; >+ >+ rtc::ThreadChecker construction_thread_; >+ >+ const bool use_twcc_plr_for_ana_; >+ >+ rtc::CriticalSection encoder_queue_lock_; >+ bool encoder_queue_is_active_ RTC_GUARDED_BY(encoder_queue_lock_) = false; >+ rtc::TaskQueue* const encoder_queue_ = nullptr; >+ >+ MediaTransportInterface* const media_transport_; >+ int media_transport_sequence_number_ RTC_GUARDED_BY(encoder_queue_) = 0; >+ >+ rtc::CriticalSection media_transport_lock_; >+ // Currently set by SetLocalSSRC. >+ uint64_t media_transport_channel_id_ RTC_GUARDED_BY(&media_transport_lock_) = >+ 0; >+ // Cache payload type and sampling frequency from most recent call to >+ // SetEncoder. Needed to set MediaTransportEncodedAudioFrame metadata, and >+ // invalidate on encoder change. 
>+ int media_transport_payload_type_ RTC_GUARDED_BY(&media_transport_lock_); >+ int media_transport_sampling_frequency_ >+ RTC_GUARDED_BY(&media_transport_lock_); >+ >+ // E2EE Audio Frame Encryption >+ rtc::scoped_refptr<FrameEncryptorInterface> frame_encryptor_; >+ // E2EE Frame Encryption Options >+ const webrtc::CryptoOptions crypto_options_; >+ >+ rtc::CriticalSection bitrate_crit_section_; >+ int configured_bitrate_bps_ RTC_GUARDED_BY(bitrate_crit_section_) = 0; >+}; > > const int kTelephoneEventAttenuationdB = 10; > >@@ -254,6 +512,23 @@ int32_t ChannelSend::SendData(FrameType frameType, > size_t payloadSize, > const RTPFragmentationHeader* fragmentation) { > RTC_DCHECK_RUN_ON(encoder_queue_); >+ rtc::ArrayView<const uint8_t> payload(payloadData, payloadSize); >+ >+ if (media_transport() != nullptr) { >+ return SendMediaTransportAudio(frameType, payloadType, timeStamp, payload, >+ fragmentation); >+ } else { >+ return SendRtpAudio(frameType, payloadType, timeStamp, payload, >+ fragmentation); >+ } >+} >+ >+int32_t ChannelSend::SendRtpAudio(FrameType frameType, >+ uint8_t payloadType, >+ uint32_t timeStamp, >+ rtc::ArrayView<const uint8_t> payload, >+ const RTPFragmentationHeader* fragmentation) { >+ RTC_DCHECK_RUN_ON(encoder_queue_); > if (_includeAudioLevelIndication) { > // Store current audio level in the RTP/RTCP module. > // The level will be used in combination with voice-activity state >@@ -268,16 +543,15 @@ int32_t ChannelSend::SendData(FrameType frameType, > // TODO(benwright@webrtc.org) - Allocate enough to always encrypt inline. > // Allocate a buffer to hold the maximum possible encrypted payload. > size_t max_ciphertext_size = frame_encryptor_->GetMaxCiphertextByteSize( >- cricket::MEDIA_TYPE_AUDIO, payloadSize); >+ cricket::MEDIA_TYPE_AUDIO, payload.size()); > encrypted_audio_payload.SetSize(max_ciphertext_size); > > // Encrypt the audio payload into the buffer. 
> size_t bytes_written = 0; > int encrypt_status = frame_encryptor_->Encrypt( > cricket::MEDIA_TYPE_AUDIO, _rtpRtcpModule->SSRC(), >- /*additional_data=*/nullptr, >- rtc::ArrayView<const uint8_t>(payloadData, payloadSize), >- encrypted_audio_payload, &bytes_written); >+ /*additional_data=*/nullptr, payload, encrypted_audio_payload, >+ &bytes_written); > if (encrypt_status != 0) { > RTC_DLOG(LS_ERROR) << "Channel::SendData() failed encrypt audio payload: " > << encrypt_status; >@@ -286,19 +560,23 @@ int32_t ChannelSend::SendData(FrameType frameType, > // Resize the buffer to the exact number of bytes actually used. > encrypted_audio_payload.SetSize(bytes_written); > // Rewrite the payloadData and size to the new encrypted payload. >- payloadData = encrypted_audio_payload.data(); >- payloadSize = encrypted_audio_payload.size(); >+ payload = encrypted_audio_payload; >+ } else if (crypto_options_.sframe.require_frame_encryption) { >+ RTC_DLOG(LS_ERROR) << "Channel::SendData() failed sending audio payload: " >+ << "A frame encryptor is required but one is not set."; >+ return -1; > } > > // Push data from ACM to RTP/RTCP-module to deliver audio frame for > // packetization. > // This call will trigger Transport::SendPacket() from the RTP/RTCP module. >- if (!_rtpRtcpModule->SendOutgoingData( >- (FrameType&)frameType, payloadType, timeStamp, >- // Leaving the time when this frame was >- // received from the capture device as >- // undefined for voice for now. >- -1, payloadData, payloadSize, fragmentation, nullptr, nullptr)) { >+ if (!_rtpRtcpModule->SendOutgoingData((FrameType&)frameType, payloadType, >+ timeStamp, >+ // Leaving the time when this frame was >+ // received from the capture device as >+ // undefined for voice for now. 
>+ -1, payload.data(), payload.size(), >+ fragmentation, nullptr, nullptr)) { > RTC_DLOG(LS_ERROR) > << "ChannelSend::SendData() failed to send data to RTP/RTCP module"; > return -1; >@@ -307,9 +585,68 @@ int32_t ChannelSend::SendData(FrameType frameType, > return 0; > } > >+int32_t ChannelSend::SendMediaTransportAudio( >+ FrameType frameType, >+ uint8_t payloadType, >+ uint32_t timeStamp, >+ rtc::ArrayView<const uint8_t> payload, >+ const RTPFragmentationHeader* fragmentation) { >+ RTC_DCHECK_RUN_ON(encoder_queue_); >+ // TODO(nisse): Use null _transportPtr for MediaTransport. >+ // RTC_DCHECK(_transportPtr == nullptr); >+ uint64_t channel_id; >+ int sampling_rate_hz; >+ { >+ rtc::CritScope cs(&media_transport_lock_); >+ if (media_transport_payload_type_ != payloadType) { >+ // Payload type is being changed, media_transport_sampling_frequency_, >+ // no longer current. >+ return -1; >+ } >+ sampling_rate_hz = media_transport_sampling_frequency_; >+ channel_id = media_transport_channel_id_; >+ } >+ const MediaTransportEncodedAudioFrame frame( >+ /*sampling_rate_hz=*/sampling_rate_hz, >+ >+ // TODO(nisse): Timestamp and sample index are the same for all supported >+ // audio codecs except G722. Refactor audio coding module to only use >+ // sample index, and leave translation to RTP time, when needed, for >+ // RTP-specific code. >+ /*starting_sample_index=*/timeStamp, >+ >+ // Sample count isn't conveniently available from the AudioCodingModule, >+ // and needs some refactoring to wire up in a good way. For now, left as >+ // zero. >+ /*sample_count=*/0, >+ >+ /*sequence_number=*/media_transport_sequence_number_, >+ MediaTransportFrameTypeForWebrtcFrameType(frameType), payloadType, >+ std::vector<uint8_t>(payload.begin(), payload.end())); >+ >+ // TODO(nisse): Introduce a MediaTransportSender object bound to a specific >+ // channel id. 
>+ RTCError rtc_error = >+ media_transport()->SendAudioFrame(channel_id, std::move(frame)); >+ >+ if (!rtc_error.ok()) { >+ RTC_LOG(LS_ERROR) << "Failed to send frame, rtc_error=" >+ << ToString(rtc_error.type()) << ", " >+ << rtc_error.message(); >+ return -1; >+ } >+ >+ ++media_transport_sequence_number_; >+ >+ return 0; >+} >+ > bool ChannelSend::SendRtp(const uint8_t* data, > size_t len, > const PacketOptions& options) { >+ // We should not be sending RTP packets if media transport is available. >+ RTC_CHECK(!media_transport()); >+ > rtc::CritScope cs(&_callbackCritSect); > > if (_transportPtr == NULL) { >@@ -343,23 +680,22 @@ bool ChannelSend::SendRtcp(const uint8_t* data, size_t len) { > return true; > } > >-int ChannelSend::PreferredSampleRate() const { >- // Return the bigger of playout and receive frequency in the ACM. >- return std::max(audio_coding_->ReceiveFrequency(), >- audio_coding_->PlayoutFrequency()); >-} >- > ChannelSend::ChannelSend(rtc::TaskQueue* encoder_queue, > ProcessThread* module_process_thread, >+ MediaTransportInterface* media_transport, >+ Transport* rtp_transport, > RtcpRttStats* rtcp_rtt_stats, > RtcEventLog* rtc_event_log, >- FrameEncryptorInterface* frame_encryptor) >+ FrameEncryptorInterface* frame_encryptor, >+ const webrtc::CryptoOptions& crypto_options, >+ bool extmap_allow_mixed, >+ int rtcp_report_interval_ms) > : event_log_(rtc_event_log), > _timeStamp(0), // This is just an offset, RTP module will add it's own > // random offset > send_sequence_number_(0), > _moduleProcessThreadPtr(module_process_thread), >- _transportPtr(NULL), >+ _transportPtr(rtp_transport), > input_mute_(false), > previous_frame_muted_(false), > _includeAudioLevelIndication(false), >@@ -374,49 +710,56 @@ ChannelSend::ChannelSend(rtc::TaskQueue* encoder_queue, > use_twcc_plr_for_ana_( > webrtc::field_trial::FindFullName("UseTwccPlrForAna") == "Enabled"), > encoder_queue_(encoder_queue), >- frame_encryptor_(frame_encryptor) { >+ 
media_transport_(media_transport), >+ frame_encryptor_(frame_encryptor), >+ crypto_options_(crypto_options) { > RTC_DCHECK(module_process_thread); > RTC_DCHECK(encoder_queue); >+ module_process_thread_checker_.DetachFromThread(); >+ > audio_coding_.reset(AudioCodingModule::Create(AudioCodingModule::Config())); > > RtpRtcp::Configuration configuration; >+ >+ // We gradually remove codepaths that depend on RTP when using media >+ // transport. All of this logic should be moved to the future >+ // RTPMediaTransport. In this case it means that overhead and bandwidth >+ // observers should not be called when using media transport. >+ if (!media_transport_) { >+ configuration.overhead_observer = this; >+ configuration.bandwidth_callback = rtcp_observer_.get(); >+ configuration.transport_feedback_callback = feedback_observer_proxy_.get(); >+ } >+ > configuration.audio = true; > configuration.outgoing_transport = this; >- configuration.overhead_observer = this; >- configuration.bandwidth_callback = rtcp_observer_.get(); > > configuration.paced_sender = rtp_packet_sender_proxy_.get(); > configuration.transport_sequence_number_allocator = > seq_num_allocator_proxy_.get(); >- configuration.transport_feedback_callback = feedback_observer_proxy_.get(); > > configuration.event_log = event_log_; > configuration.rtt_stats = rtcp_rtt_stats; > configuration.retransmission_rate_limiter = > retransmission_rate_limiter_.get(); >+ configuration.extmap_allow_mixed = extmap_allow_mixed; >+ configuration.rtcp_report_interval_ms = rtcp_report_interval_ms; > > _rtpRtcpModule.reset(RtpRtcp::CreateRtpRtcp(configuration)); > _rtpRtcpModule->SetSendingMediaStatus(false); >- Init(); >-} > >-ChannelSend::~ChannelSend() { >- Terminate(); >- RTC_DCHECK(!channel_state_.Get().sending); >-} >- >-void ChannelSend::Init() { >- channel_state_.Reset(); >+ // We want to invoke the 'TargetRateObserver' and |OnOverheadChanged| >+ // callbacks after the audio_coding_ is fully initialized. 
>+ if (media_transport_) { >+ RTC_DLOG(LS_INFO) << "Setting media_transport_ rate observers."; >+ media_transport_->AddTargetTransferRateObserver(this); >+ OnOverheadChanged(media_transport_->GetAudioPacketOverhead()); >+ } else { >+ RTC_DLOG(LS_INFO) << "Not setting media_transport_ rate observers."; >+ } > >- // --- Add modules to process thread (for periodic schedulation) > _moduleProcessThreadPtr->RegisterModule(_rtpRtcpModule.get(), RTC_FROM_HERE); > >- // --- ACM initialization >- int error = audio_coding_->InitializeReceiver(); >- RTC_DCHECK_EQ(0, error); >- >- // --- RTP/RTCP module initialization >- > // Ensure that RTCP is enabled by default for the created channel. > // Note that, the module will keep generating RTCP until it is explicitly > // disabled by the user. >@@ -425,36 +768,30 @@ void ChannelSend::Init() { > // RTCP is enabled by default. > _rtpRtcpModule->SetRTCPStatus(RtcpMode::kCompound); > >- // --- Register all permanent callbacks >- error = audio_coding_->RegisterTransportCallback(this); >+ int error = audio_coding_->RegisterTransportCallback(this); > RTC_DCHECK_EQ(0, error); > } > >-void ChannelSend::Terminate() { >+ChannelSend::~ChannelSend() { > RTC_DCHECK(construction_thread_.CalledOnValidThread()); >- // Must be called on the same thread as Init(). >+ >+ if (media_transport_) { >+ media_transport_->RemoveTargetTransferRateObserver(this); >+ } > > StopSend(); > >- // The order to safely shutdown modules in a channel is: >- // 1. De-register callbacks in modules >- // 2. De-register modules in process thread >- // 3. 
Destroy modules > int error = audio_coding_->RegisterTransportCallback(NULL); > RTC_DCHECK_EQ(0, error); > >- // De-register modules in process thread > if (_moduleProcessThreadPtr) > _moduleProcessThreadPtr->DeRegisterModule(_rtpRtcpModule.get()); >- >- // End of modules shutdown > } > >-int32_t ChannelSend::StartSend() { >- if (channel_state_.Get().sending) { >- return 0; >- } >- channel_state_.SetSending(true); >+void ChannelSend::StartSend() { >+ RTC_DCHECK_RUN_ON(&worker_thread_checker_); >+ RTC_DCHECK(!sending_); >+ sending_ = true; > > // Resume the previous sequence number which was reset by StopSend(). This > // needs to be done before |sending| is set to true on the RTP/RTCP module. >@@ -462,26 +799,21 @@ int32_t ChannelSend::StartSend() { > _rtpRtcpModule->SetSequenceNumber(send_sequence_number_); > } > _rtpRtcpModule->SetSendingMediaStatus(true); >- if (_rtpRtcpModule->SetSendingStatus(true) != 0) { >- RTC_DLOG(LS_ERROR) << "StartSend() RTP/RTCP failed to start sending"; >- _rtpRtcpModule->SetSendingMediaStatus(false); >- rtc::CritScope cs(&_callbackCritSect); >- channel_state_.SetSending(false); >- return -1; >- } >+ int ret = _rtpRtcpModule->SetSendingStatus(true); >+ RTC_DCHECK_EQ(0, ret); > { > // It is now OK to start posting tasks to the encoder task queue. > rtc::CritScope cs(&encoder_queue_lock_); > encoder_queue_is_active_ = true; > } >- return 0; > } > > void ChannelSend::StopSend() { >- if (!channel_state_.Get().sending) { >+ RTC_DCHECK_RUN_ON(&worker_thread_checker_); >+ if (!sending_) { > return; > } >- channel_state_.SetSending(false); >+ sending_ = false; > > // Post a task to the encoder thread which sets an event when the task is > // executed. We know that no more encoding tasks will be added to the task >@@ -491,7 +823,7 @@ void ChannelSend::StopSend() { > // to acccess and invalid channel object. 
> RTC_DCHECK(encoder_queue_); > >- rtc::Event flush(false, false); >+ rtc::Event flush; > { > // Clear |encoder_queue_is_active_| under lock to prevent any other tasks > // than this final "flush task" to be posted on the queue. >@@ -519,6 +851,7 @@ void ChannelSend::StopSend() { > > bool ChannelSend::SetEncoder(int payload_type, > std::unique_ptr<AudioEncoder> encoder) { >+ RTC_DCHECK_RUN_ON(&worker_thread_checker_); > RTC_DCHECK_GE(payload_type, 0); > RTC_DCHECK_LE(payload_type, 127); > // TODO(ossu): Make CodecInsts up, for now: one for the RTP/RTCP module and >@@ -549,25 +882,48 @@ bool ChannelSend::SetEncoder(int payload_type, > } > } > >+ if (media_transport_) { >+ rtc::CritScope cs(&media_transport_lock_); >+ media_transport_payload_type_ = payload_type; >+ // TODO(nisse): Currently broken for G722, since timestamps passed through >+ // encoder use RTP clock rather than sample count, and they differ for G722. >+ media_transport_sampling_frequency_ = encoder->RtpTimestampRateHz(); >+ } > audio_coding_->SetEncoder(std::move(encoder)); > return true; > } > > void ChannelSend::ModifyEncoder( > rtc::FunctionView<void(std::unique_ptr<AudioEncoder>*)> modifier) { >+ RTC_DCHECK_RUN_ON(&worker_thread_checker_); > audio_coding_->ModifyEncoder(modifier); > } > >-void ChannelSend::SetBitRate(int bitrate_bps, int64_t probing_interval_ms) { >+void ChannelSend::OnBitrateAllocation(BitrateAllocationUpdate update) { >+ // This method can be called on the worker thread, module process thread >+ // or on a TaskQueue via VideoSendStreamImpl::OnEncoderConfigurationChanged. >+ // TODO(solenberg): Figure out a good way to check this or enforce calling >+ // rules. 
>+ // RTC_DCHECK(worker_thread_checker_.CalledOnValidThread() || >+ // module_process_thread_checker_.CalledOnValidThread()); >+ rtc::CritScope lock(&bitrate_crit_section_); >+ > audio_coding_->ModifyEncoder([&](std::unique_ptr<AudioEncoder>* encoder) { > if (*encoder) { >- (*encoder)->OnReceivedUplinkBandwidth(bitrate_bps, probing_interval_ms); >+ (*encoder)->OnReceivedUplinkAllocation(update); > } > }); >- retransmission_rate_limiter_->SetMaxRate(bitrate_bps); >+ retransmission_rate_limiter_->SetMaxRate(update.target_bitrate.bps()); >+ configured_bitrate_bps_ = update.target_bitrate.bps(); >+} >+ >+int ChannelSend::GetBitrate() const { >+ rtc::CritScope lock(&bitrate_crit_section_); >+ return configured_bitrate_bps_; > } > > void ChannelSend::OnTwccBasedUplinkPacketLossRate(float packet_loss_rate) { >+ RTC_DCHECK_RUN_ON(&worker_thread_checker_); > if (!use_twcc_plr_for_ana_) > return; > audio_coding_->ModifyEncoder([&](std::unique_ptr<AudioEncoder>* encoder) { >@@ -579,6 +935,7 @@ void ChannelSend::OnTwccBasedUplinkPacketLossRate(float packet_loss_rate) { > > void ChannelSend::OnRecoverableUplinkPacketLossRate( > float recoverable_packet_loss_rate) { >+ RTC_DCHECK_RUN_ON(&worker_thread_checker_); > audio_coding_->ModifyEncoder([&](std::unique_ptr<AudioEncoder>* encoder) { > if (*encoder) { > (*encoder)->OnReceivedUplinkRecoverablePacketLossFraction( >@@ -597,47 +954,22 @@ void ChannelSend::OnUplinkPacketLossRate(float packet_loss_rate) { > }); > } > >-bool ChannelSend::EnableAudioNetworkAdaptor(const std::string& config_string) { >- bool success = false; >- audio_coding_->ModifyEncoder([&](std::unique_ptr<AudioEncoder>* encoder) { >- if (*encoder) { >- success = >- (*encoder)->EnableAudioNetworkAdaptor(config_string, event_log_); >- } >- }); >- return success; >-} >- >-void ChannelSend::DisableAudioNetworkAdaptor() { >- audio_coding_->ModifyEncoder([&](std::unique_ptr<AudioEncoder>* encoder) { >- if (*encoder) >- (*encoder)->DisableAudioNetworkAdaptor(); >- }); 
>-} >- >-void ChannelSend::SetReceiverFrameLengthRange(int min_frame_length_ms, >- int max_frame_length_ms) { >- audio_coding_->ModifyEncoder([&](std::unique_ptr<AudioEncoder>* encoder) { >- if (*encoder) { >- (*encoder)->SetReceiverFrameLengthRange(min_frame_length_ms, >- max_frame_length_ms); >- } >- }); >-} >- >-void ChannelSend::RegisterTransport(Transport* transport) { >- rtc::CritScope cs(&_callbackCritSect); >- _transportPtr = transport; >-} >+// TODO(nisse): Delete always-true return value. >+bool ChannelSend::ReceivedRTCPPacket(const uint8_t* data, size_t length) { >+ // May be called on either worker thread or network thread. >+ if (media_transport_) { >+ // Ignore RTCP packets while media transport is used. >+ // Those packets should not arrive, but we are seeing occasional packets. >+ return 0; >+ } > >-int32_t ChannelSend::ReceivedRTCPPacket(const uint8_t* data, size_t length) { > // Deliver RTCP packet to RTP/RTCP module for parsing > _rtpRtcpModule->IncomingRtcpPacket(data, length); > > int64_t rtt = GetRTT(); > if (rtt == 0) { > // Waiting for valid RTT. >- return 0; >+ return true; > } > > int64_t nack_window_ms = rtt; >@@ -648,16 +980,12 @@ int32_t ChannelSend::ReceivedRTCPPacket(const uint8_t* data, size_t length) { > } > retransmission_rate_limiter_->SetWindowSize(nack_window_ms); > >- // Invoke audio encoders OnReceivedRtt(). 
>- audio_coding_->ModifyEncoder([&](std::unique_ptr<AudioEncoder>* encoder) { >- if (*encoder) >- (*encoder)->OnReceivedRtt(rtt); >- }); >- >- return 0; >+ OnReceivedRtt(rtt); >+ return true; > } > > void ChannelSend::SetInputMute(bool enable) { >+ RTC_DCHECK_RUN_ON(&worker_thread_checker_); > rtc::CritScope cs(&volume_settings_critsect_); > input_mute_ = enable; > } >@@ -667,24 +995,26 @@ bool ChannelSend::InputMute() const { > return input_mute_; > } > >-int ChannelSend::SendTelephoneEventOutband(int event, int duration_ms) { >+bool ChannelSend::SendTelephoneEventOutband(int event, int duration_ms) { >+ RTC_DCHECK_RUN_ON(&worker_thread_checker_); > RTC_DCHECK_LE(0, event); > RTC_DCHECK_GE(255, event); > RTC_DCHECK_LE(0, duration_ms); > RTC_DCHECK_GE(65535, duration_ms); >- if (!Sending()) { >- return -1; >+ if (!sending_) { >+ return false; > } > if (_rtpRtcpModule->SendTelephoneEventOutband( > event, duration_ms, kTelephoneEventAttenuationdB) != 0) { > RTC_DLOG(LS_ERROR) << "SendTelephoneEventOutband() failed to send event"; >- return -1; >+ return false; > } >- return 0; >+ return true; > } > >-int ChannelSend::SetSendTelephoneEventPayloadType(int payload_type, >- int payload_frequency) { >+bool ChannelSend::SetSendTelephoneEventPayloadType(int payload_type, >+ int payload_frequency) { >+ RTC_DCHECK_RUN_ON(&worker_thread_checker_); > RTC_DCHECK_LE(0, payload_type); > RTC_DCHECK_GE(127, payload_type); > CodecInst codec = {0}; >@@ -697,34 +1027,44 @@ int ChannelSend::SetSendTelephoneEventPayloadType(int payload_type, > RTC_DLOG(LS_ERROR) > << "SetSendTelephoneEventPayloadType() failed to register " > "send payload type"; >- return -1; >+ return false; > } > } >- return 0; >+ return true; > } > >-int ChannelSend::SetLocalSSRC(unsigned int ssrc) { >- if (channel_state_.Get().sending) { >- RTC_DLOG(LS_ERROR) << "SetLocalSSRC() already sending"; >- return -1; >+void ChannelSend::SetLocalSSRC(uint32_t ssrc) { >+ RTC_DCHECK_RUN_ON(&worker_thread_checker_); >+ 
RTC_DCHECK(!sending_); >+ >+ if (media_transport_) { >+ rtc::CritScope cs(&media_transport_lock_); >+ media_transport_channel_id_ = ssrc; > } > _rtpRtcpModule->SetSSRC(ssrc); >- return 0; > } > > void ChannelSend::SetMid(const std::string& mid, int extension_id) { >+ RTC_DCHECK_RUN_ON(&worker_thread_checker_); > int ret = SetSendRtpHeaderExtension(true, kRtpExtensionMid, extension_id); > RTC_DCHECK_EQ(0, ret); > _rtpRtcpModule->SetMid(mid); > } > >-int ChannelSend::SetSendAudioLevelIndicationStatus(bool enable, >- unsigned char id) { >+void ChannelSend::SetExtmapAllowMixed(bool extmap_allow_mixed) { >+ RTC_DCHECK_RUN_ON(&worker_thread_checker_); >+ _rtpRtcpModule->SetExtmapAllowMixed(extmap_allow_mixed); >+} >+ >+void ChannelSend::SetSendAudioLevelIndicationStatus(bool enable, int id) { >+ RTC_DCHECK_RUN_ON(&worker_thread_checker_); > _includeAudioLevelIndication = enable; >- return SetSendRtpHeaderExtension(enable, kRtpExtensionAudioLevel, id); >+ int ret = SetSendRtpHeaderExtension(enable, kRtpExtensionAudioLevel, id); >+ RTC_DCHECK_EQ(0, ret); > } > > void ChannelSend::EnableSendTransportSequenceNumber(int id) { >+ RTC_DCHECK_RUN_ON(&worker_thread_checker_); > int ret = > SetSendRtpHeaderExtension(true, kRtpExtensionTransportSequenceNumber, id); > RTC_DCHECK_EQ(0, ret); >@@ -733,6 +1073,7 @@ void ChannelSend::EnableSendTransportSequenceNumber(int id) { > void ChannelSend::RegisterSenderCongestionControlObjects( > RtpTransportControllerSendInterface* transport, > RtcpBandwidthObserver* bandwidth_observer) { >+ RTC_DCHECK_RUN_ON(&worker_thread_checker_); > RtpPacketSender* rtp_packet_sender = transport->packet_sender(); > TransportFeedbackObserver* transport_feedback_observer = > transport->transport_feedback_observer(); >@@ -754,6 +1095,7 @@ void ChannelSend::RegisterSenderCongestionControlObjects( > } > > void ChannelSend::ResetSenderCongestionControlObjects() { >+ RTC_DCHECK_RUN_ON(&worker_thread_checker_); > RTC_DCHECK(packet_router_); > 
_rtpRtcpModule->SetStorePacketsStatus(false, 600); > rtcp_observer_->SetBandwidthObserver(nullptr); >@@ -764,35 +1106,25 @@ void ChannelSend::ResetSenderCongestionControlObjects() { > rtp_packet_sender_proxy_->SetPacketSender(nullptr); > } > >-void ChannelSend::SetRTCPStatus(bool enable) { >- _rtpRtcpModule->SetRTCPStatus(enable ? RtcpMode::kCompound : RtcpMode::kOff); >-} >- >-int ChannelSend::SetRTCP_CNAME(const char cName[256]) { >- if (_rtpRtcpModule->SetCNAME(cName) != 0) { >- RTC_DLOG(LS_ERROR) << "SetRTCP_CNAME() failed to set RTCP CNAME"; >- return -1; >- } >- return 0; >+void ChannelSend::SetRTCP_CNAME(absl::string_view c_name) { >+ RTC_DCHECK_RUN_ON(&worker_thread_checker_); >+ // Note: SetCNAME() accepts a c string of length at most 255. >+ const std::string c_name_limited(c_name.substr(0, 255)); >+ int ret = _rtpRtcpModule->SetCNAME(c_name_limited.c_str()) != 0; >+ RTC_DCHECK_EQ(0, ret) << "SetRTCP_CNAME() failed to set RTCP CNAME"; > } > >-int ChannelSend::GetRemoteRTCPReportBlocks( >- std::vector<ReportBlock>* report_blocks) { >- if (report_blocks == NULL) { >- RTC_DLOG(LS_ERROR) << "GetRemoteRTCPReportBlock()s invalid report_blocks."; >- return -1; >- } >- >+std::vector<ReportBlock> ChannelSend::GetRemoteRTCPReportBlocks() const { >+ RTC_DCHECK_RUN_ON(&worker_thread_checker_); > // Get the report blocks from the latest received RTCP Sender or Receiver > // Report. Each element in the vector contains the sender's SSRC and a > // report block according to RFC 3550. 
> std::vector<RTCPReportBlock> rtcp_report_blocks; >- if (_rtpRtcpModule->RemoteRTCPStat(&rtcp_report_blocks) != 0) { >- return -1; >- } > >- if (rtcp_report_blocks.empty()) >- return 0; >+ int ret = _rtpRtcpModule->RemoteRTCPStat(&rtcp_report_blocks); >+ RTC_DCHECK_EQ(0, ret); >+ >+ std::vector<ReportBlock> report_blocks; > > std::vector<RTCPReportBlock>::const_iterator it = rtcp_report_blocks.begin(); > for (; it != rtcp_report_blocks.end(); ++it) { >@@ -806,19 +1138,16 @@ int ChannelSend::GetRemoteRTCPReportBlocks( > report_block.interarrival_jitter = it->jitter; > report_block.last_SR_timestamp = it->last_sender_report_timestamp; > report_block.delay_since_last_SR = it->delay_since_last_sender_report; >- report_blocks->push_back(report_block); >+ report_blocks.push_back(report_block); > } >- return 0; >+ return report_blocks; > } > >-int ChannelSend::GetRTPStatistics(CallSendStatistics& stats) { >- // --- RtcpStatistics >- >- // --- RTT >+CallSendStatistics ChannelSend::GetRTCPStatistics() const { >+ RTC_DCHECK_RUN_ON(&worker_thread_checker_); >+ CallSendStatistics stats = {0}; > stats.rttMs = GetRTT(); > >- // --- Data counters >- > size_t bytesSent(0); > uint32_t packetsSent(0); > >@@ -831,24 +1160,12 @@ int ChannelSend::GetRTPStatistics(CallSendStatistics& stats) { > stats.bytesSent = bytesSent; > stats.packetsSent = packetsSent; > >- return 0; >-} >- >-void ChannelSend::SetNACKStatus(bool enable, int maxNumberOfPackets) { >- // None of these functions can fail. >- if (enable) >- audio_coding_->EnableNack(maxNumberOfPackets); >- else >- audio_coding_->DisableNack(); >-} >- >-// Called when we are missing one or more packets. 
>-int ChannelSend::ResendPackets(const uint16_t* sequence_numbers, int length) { >- return _rtpRtcpModule->SendNACK(sequence_numbers, length); >+ return stats; > } > > void ChannelSend::ProcessAndEncodeAudio( > std::unique_ptr<AudioFrame> audio_frame) { >+ RTC_DCHECK_RUNS_SERIALIZED(&audio_thread_race_checker_); > // Avoid posting any new tasks if sending was already stopped in StopSend(). > rtc::CritScope cs(&encoder_queue_lock_); > if (!encoder_queue_is_active_) { >@@ -914,6 +1231,7 @@ void ChannelSend::UpdateOverheadForEncoder() { > } > > void ChannelSend::SetTransportOverhead(size_t transport_overhead_per_packet) { >+ RTC_DCHECK_RUN_ON(&worker_thread_checker_); > rtc::CritScope cs(&overhead_per_packet_lock_); > transport_overhead_per_packet_ = transport_overhead_per_packet; > UpdateOverheadForEncoder(); >@@ -927,36 +1245,42 @@ void ChannelSend::OnOverheadChanged(size_t overhead_bytes_per_packet) { > } > > ANAStats ChannelSend::GetANAStatistics() const { >+ RTC_DCHECK_RUN_ON(&worker_thread_checker_); > return audio_coding_->GetANAStats(); > } > > RtpRtcp* ChannelSend::GetRtpRtcp() const { >+ RTC_DCHECK(module_process_thread_checker_.CalledOnValidThread()); > return _rtpRtcpModule.get(); > } > > int ChannelSend::SetSendRtpHeaderExtension(bool enable, > RTPExtensionType type, >- unsigned char id) { >+ int id) { > int error = 0; > _rtpRtcpModule->DeregisterSendRtpHeaderExtension(type); > if (enable) { >- error = _rtpRtcpModule->RegisterSendRtpHeaderExtension(type, id); >+ // TODO(nisse): RtpRtcp::RegisterSendRtpHeaderExtension to take an int >+ // argument. Currently it wants an uint8_t. >+ error = _rtpRtcpModule->RegisterSendRtpHeaderExtension( >+ type, rtc::dchecked_cast<uint8_t>(id)); > } > return error; > } > >-int ChannelSend::GetRtpTimestampRateHz() const { >- const auto format = audio_coding_->ReceiveFormat(); >- // Default to the playout frequency if we've not gotten any packets yet. 
>- // TODO(ossu): Zero clockrate can only happen if we've added an external >- // decoder for a format we don't support internally. Remove once that way of >- // adding decoders is gone! >- return (format && format->clockrate_hz != 0) >- ? format->clockrate_hz >- : audio_coding_->PlayoutFrequency(); >-} >- > int64_t ChannelSend::GetRTT() const { >+ if (media_transport_) { >+ // GetRTT is generally used in the RTCP codepath, where media transport is >+ // not present and so it shouldn't be needed. But it's also invoked in >+ // 'GetStats' method, and for now returning media transport RTT here gives >+ // us "free" rtt stats for media transport. >+ auto target_rate = media_transport_->GetLatestTargetTransferRate(); >+ if (target_rate.has_value()) { >+ return target_rate.value().network_estimate.round_trip_time.ms(); >+ } >+ >+ return 0; >+ } > RtcpMode method = _rtpRtcpModule->RTCP(); > if (method == RtcpMode::kOff) { > return 0; >@@ -981,16 +1305,52 @@ int64_t ChannelSend::GetRTT() const { > return rtt; > } > >-void ChannelSend::SetFrameEncryptor(FrameEncryptorInterface* frame_encryptor) { >+void ChannelSend::SetFrameEncryptor( >+ rtc::scoped_refptr<FrameEncryptorInterface> frame_encryptor) { >+ RTC_DCHECK_RUN_ON(&worker_thread_checker_); > rtc::CritScope cs(&encoder_queue_lock_); > if (encoder_queue_is_active_) { > encoder_queue_->PostTask([this, frame_encryptor]() { >- this->frame_encryptor_ = frame_encryptor; >+ this->frame_encryptor_ = std::move(frame_encryptor); > }); > } else { >- frame_encryptor_ = frame_encryptor; >+ frame_encryptor_ = std::move(frame_encryptor); > } > } > >+void ChannelSend::OnTargetTransferRate(TargetTransferRate rate) { >+ RTC_DCHECK(media_transport_); >+ OnReceivedRtt(rate.network_estimate.round_trip_time.ms()); >+} >+ >+void ChannelSend::OnReceivedRtt(int64_t rtt_ms) { >+ // Invoke audio encoders OnReceivedRtt(). 
>+ audio_coding_->ModifyEncoder( >+ [rtt_ms](std::unique_ptr<AudioEncoder>* encoder) { >+ if (*encoder) { >+ (*encoder)->OnReceivedRtt(rtt_ms); >+ } >+ }); >+} >+ >+} // namespace >+ >+std::unique_ptr<ChannelSendInterface> CreateChannelSend( >+ rtc::TaskQueue* encoder_queue, >+ ProcessThread* module_process_thread, >+ MediaTransportInterface* media_transport, >+ Transport* rtp_transport, >+ RtcpRttStats* rtcp_rtt_stats, >+ RtcEventLog* rtc_event_log, >+ FrameEncryptorInterface* frame_encryptor, >+ const webrtc::CryptoOptions& crypto_options, >+ bool extmap_allow_mixed, >+ int rtcp_report_interval_ms) { >+ return absl::make_unique<ChannelSend>( >+ encoder_queue, module_process_thread, media_transport, rtp_transport, >+ rtcp_rtt_stats, rtc_event_log, frame_encryptor, crypto_options, >+ extmap_allow_mixed, rtcp_report_interval_ms); >+} >+ > } // namespace voe > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_send.h b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_send.h >index ef92f8e84eb60f08fbb27209fa796a5628e3d82a..083e9a60293067bf60fe2ab8675658ebc648b11b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_send.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_send.h >@@ -11,42 +11,26 @@ > #ifndef AUDIO_CHANNEL_SEND_H_ > #define AUDIO_CHANNEL_SEND_H_ > >-#include <map> > #include <memory> > #include <string> > #include <vector> > > #include "api/audio/audio_frame.h" > #include "api/audio_codecs/audio_encoder.h" >-#include "api/call/transport.h" >-#include "common_types.h" // NOLINT(build/include) >-#include "modules/audio_coding/include/audio_coding_module.h" >-#include "modules/audio_processing/rms_level.h" >+#include "api/crypto/cryptooptions.h" >+#include "api/media_transport_interface.h" > #include "modules/rtp_rtcp/include/rtp_rtcp.h" >-#include "rtc_base/criticalsection.h" >+#include "rtc_base/function_view.h" > #include "rtc_base/task_queue.h" >-#include 
"rtc_base/thread_checker.h" >- >-// TODO(solenberg, nisse): This file contains a few NOLINT marks, to silence >-// warnings about use of unsigned short, and non-const reference arguments. >-// These need cleanup, in a separate cl. >- >-namespace rtc { >-class TimestampWrapAroundHandler; >-} > > namespace webrtc { > > class FrameEncryptorInterface; >-class PacketRouter; > class ProcessThread; >-class RateLimiter; > class RtcEventLog; > class RtpRtcp; > class RtpTransportControllerSendInterface; > >-struct SenderInfo; >- > struct CallSendStatistics { > int64_t rttMs; > size_t bytesSent; >@@ -67,239 +51,77 @@ struct ReportBlock { > > namespace voe { > >-class RtpPacketSenderProxy; >-class TransportFeedbackProxy; >-class TransportSequenceNumberProxy; >-class VoERtcpObserver; >- >-// Helper class to simplify locking scheme for members that are accessed from >-// multiple threads. >-// Example: a member can be set on thread T1 and read by an internal audio >-// thread T2. Accessing the member via this class ensures that we are >-// safe and also avoid TSan v2 warnings. >-class ChannelSendState { >- public: >- struct State { >- bool sending = false; >- }; >- >- ChannelSendState() {} >- virtual ~ChannelSendState() {} >- >- void Reset() { >- rtc::CritScope lock(&lock_); >- state_ = State(); >- } >- >- State Get() const { >- rtc::CritScope lock(&lock_); >- return state_; >- } >- >- void SetSending(bool enable) { >- rtc::CritScope lock(&lock_); >- state_.sending = enable; >- } >- >- private: >- rtc::CriticalSection lock_; >- State state_; >-}; >- >-class ChannelSend >- : public Transport, >- public AudioPacketizationCallback, // receive encoded packets from the >- // ACM >- public OverheadObserver { >+class ChannelSendInterface { > public: >- // TODO(nisse): Make OnUplinkPacketLossRate public, and delete friend >- // declaration. 
>- friend class VoERtcpObserver; >- >- ChannelSend(rtc::TaskQueue* encoder_queue, >- ProcessThread* module_process_thread, >- RtcpRttStats* rtcp_rtt_stats, >- RtcEventLog* rtc_event_log, >- FrameEncryptorInterface* frame_encryptor); >- >- virtual ~ChannelSend(); >+ virtual ~ChannelSendInterface() = default; > >- // Send using this encoder, with this payload type. >- bool SetEncoder(int payload_type, std::unique_ptr<AudioEncoder> encoder); >- void ModifyEncoder( >- rtc::FunctionView<void(std::unique_ptr<AudioEncoder>*)> modifier); >+ virtual bool ReceivedRTCPPacket(const uint8_t* packet, size_t length) = 0; > >- // API methods >+ virtual CallSendStatistics GetRTCPStatistics() const = 0; > >- // VoEBase >- int32_t StartSend(); >- void StopSend(); >+ virtual bool SetEncoder(int payload_type, >+ std::unique_ptr<AudioEncoder> encoder) = 0; >+ virtual void ModifyEncoder( >+ rtc::FunctionView<void(std::unique_ptr<AudioEncoder>*)> modifier) = 0; > >- // Codecs >- void SetBitRate(int bitrate_bps, int64_t probing_interval_ms); >- bool EnableAudioNetworkAdaptor(const std::string& config_string); >- void DisableAudioNetworkAdaptor(); >- >- // TODO(nisse): Modifies decoder, but not used? >- void SetReceiverFrameLengthRange(int min_frame_length_ms, >- int max_frame_length_ms); >- >- // Network >- void RegisterTransport(Transport* transport); >- // TODO(nisse, solenberg): Delete when VoENetwork is deleted. >- int32_t ReceivedRTCPPacket(const uint8_t* data, size_t length); >- >- // Muting, Volume and Level. >- void SetInputMute(bool enable); >- >- // Stats. >- ANAStats GetANAStatistics() const; >- >- // Used by AudioSendStream. >- RtpRtcp* GetRtpRtcp() const; >- >- // DTMF. 
>- int SendTelephoneEventOutband(int event, int duration_ms); >- int SetSendTelephoneEventPayloadType(int payload_type, int payload_frequency); >- >- // RTP+RTCP >- int SetLocalSSRC(unsigned int ssrc); >- >- void SetMid(const std::string& mid, int extension_id); >- int SetSendAudioLevelIndicationStatus(bool enable, unsigned char id); >- void EnableSendTransportSequenceNumber(int id); >- >- void RegisterSenderCongestionControlObjects( >+ virtual void SetLocalSSRC(uint32_t ssrc) = 0; >+ virtual void SetMid(const std::string& mid, int extension_id) = 0; >+ virtual void SetRTCP_CNAME(absl::string_view c_name) = 0; >+ virtual void SetExtmapAllowMixed(bool extmap_allow_mixed) = 0; >+ virtual void SetSendAudioLevelIndicationStatus(bool enable, int id) = 0; >+ virtual void EnableSendTransportSequenceNumber(int id) = 0; >+ virtual void RegisterSenderCongestionControlObjects( > RtpTransportControllerSendInterface* transport, >- RtcpBandwidthObserver* bandwidth_observer); >- void ResetSenderCongestionControlObjects(); >- void SetRTCPStatus(bool enable); >- int SetRTCP_CNAME(const char cName[256]); >- int GetRemoteRTCPReportBlocks(std::vector<ReportBlock>* report_blocks); >- int GetRTPStatistics(CallSendStatistics& stats); // NOLINT >- void SetNACKStatus(bool enable, int maxNumberOfPackets); >- >- // From AudioPacketizationCallback in the ACM >- int32_t SendData(FrameType frameType, >- uint8_t payloadType, >- uint32_t timeStamp, >- const uint8_t* payloadData, >- size_t payloadSize, >- const RTPFragmentationHeader* fragmentation) override; >- >- // From Transport (called by the RTP/RTCP module) >- bool SendRtp(const uint8_t* data, >- size_t len, >- const PacketOptions& packet_options) override; >- bool SendRtcp(const uint8_t* data, size_t len) override; >- >- int PreferredSampleRate() const; >- >- bool Sending() const { return channel_state_.Get().sending; } >- RtpRtcp* RtpRtcpModulePtr() const { return _rtpRtcpModule.get(); } >- >- // ProcessAndEncodeAudio() posts a task on 
the shared encoder task queue, >- // which in turn calls (on the queue) ProcessAndEncodeAudioOnTaskQueue() where >- // the actual processing of the audio takes place. The processing mainly >- // consists of encoding and preparing the result for sending by adding it to a >- // send queue. >- // The main reason for using a task queue here is to release the native, >- // OS-specific, audio capture thread as soon as possible to ensure that it >- // can go back to sleep and be prepared to deliver an new captured audio >- // packet. >- void ProcessAndEncodeAudio(std::unique_ptr<AudioFrame> audio_frame); >- >- void SetTransportOverhead(size_t transport_overhead_per_packet); >- >- // From OverheadObserver in the RTP/RTCP module >- void OnOverheadChanged(size_t overhead_bytes_per_packet) override; >- >- // The existence of this function alongside OnUplinkPacketLossRate is >- // a compromise. We want the encoder to be agnostic of the PLR source, but >- // we also don't want it to receive conflicting information from TWCC and >- // from RTCP-XR. >- void OnTwccBasedUplinkPacketLossRate(float packet_loss_rate); >- >- void OnRecoverableUplinkPacketLossRate(float recoverable_packet_loss_rate); >- >- int64_t GetRTT() const; >- >- // E2EE Custom Audio Frame Encryption >- void SetFrameEncryptor(FrameEncryptorInterface* frame_encryptor); >- >- private: >- class ProcessAndEncodeAudioTask; >- >- void Init(); >- void Terminate(); >- >- void OnUplinkPacketLossRate(float packet_loss_rate); >- bool InputMute() const; >- >- int ResendPackets(const uint16_t* sequence_numbers, int length); >- >- int SetSendRtpHeaderExtension(bool enable, >- RTPExtensionType type, >- unsigned char id); >- >- void UpdateOverheadForEncoder() >- RTC_EXCLUSIVE_LOCKS_REQUIRED(overhead_per_packet_lock_); >- >- int GetRtpTimestampRateHz() const; >- >- // Called on the encoder task queue when a new input audio frame is ready >- // for encoding. 
>- void ProcessAndEncodeAudioOnTaskQueue(AudioFrame* audio_input); >- >- rtc::CriticalSection _callbackCritSect; >- rtc::CriticalSection volume_settings_critsect_; >- >- ChannelSendState channel_state_; >- >- RtcEventLog* const event_log_; >- >- std::unique_ptr<RtpRtcp> _rtpRtcpModule; >- >- std::unique_ptr<AudioCodingModule> audio_coding_; >- uint32_t _timeStamp RTC_GUARDED_BY(encoder_queue_); >- >- uint16_t send_sequence_number_; >- >- // uses >- ProcessThread* _moduleProcessThreadPtr; >- Transport* _transportPtr; // WebRtc socket or external transport >- RmsLevel rms_level_ RTC_GUARDED_BY(encoder_queue_); >- bool input_mute_ RTC_GUARDED_BY(volume_settings_critsect_); >- bool previous_frame_muted_ RTC_GUARDED_BY(encoder_queue_); >- // VoeRTP_RTCP >- // TODO(henrika): can today be accessed on the main thread and on the >- // task queue; hence potential race. >- bool _includeAudioLevelIndication; >- size_t transport_overhead_per_packet_ >- RTC_GUARDED_BY(overhead_per_packet_lock_); >- size_t rtp_overhead_per_packet_ RTC_GUARDED_BY(overhead_per_packet_lock_); >- rtc::CriticalSection overhead_per_packet_lock_; >- // RtcpBandwidthObserver >- std::unique_ptr<VoERtcpObserver> rtcp_observer_; >- >- PacketRouter* packet_router_ = nullptr; >- std::unique_ptr<TransportFeedbackProxy> feedback_observer_proxy_; >- std::unique_ptr<TransportSequenceNumberProxy> seq_num_allocator_proxy_; >- std::unique_ptr<RtpPacketSenderProxy> rtp_packet_sender_proxy_; >- std::unique_ptr<RateLimiter> retransmission_rate_limiter_; >- >- rtc::ThreadChecker construction_thread_; >- >- const bool use_twcc_plr_for_ana_; >- >- rtc::CriticalSection encoder_queue_lock_; >- bool encoder_queue_is_active_ RTC_GUARDED_BY(encoder_queue_lock_) = false; >- rtc::TaskQueue* encoder_queue_ = nullptr; >- >- // E2EE Audio Frame Encryption >- FrameEncryptorInterface* frame_encryptor_ = nullptr; >+ RtcpBandwidthObserver* bandwidth_observer) = 0; >+ virtual void ResetSenderCongestionControlObjects() = 0; >+ virtual 
std::vector<ReportBlock> GetRemoteRTCPReportBlocks() const = 0; >+ virtual ANAStats GetANAStatistics() const = 0; >+ virtual bool SetSendTelephoneEventPayloadType(int payload_type, >+ int payload_frequency) = 0; >+ virtual bool SendTelephoneEventOutband(int event, int duration_ms) = 0; >+ virtual void OnBitrateAllocation(BitrateAllocationUpdate update) = 0; >+ virtual int GetBitrate() const = 0; >+ virtual void SetInputMute(bool muted) = 0; >+ >+ virtual void ProcessAndEncodeAudio( >+ std::unique_ptr<AudioFrame> audio_frame) = 0; >+ virtual void SetTransportOverhead(size_t transport_overhead_per_packet) = 0; >+ virtual RtpRtcp* GetRtpRtcp() const = 0; >+ >+ virtual void OnTwccBasedUplinkPacketLossRate(float packet_loss_rate) = 0; >+ virtual void OnRecoverableUplinkPacketLossRate( >+ float recoverable_packet_loss_rate) = 0; >+ // In RTP we currently rely on RTCP packets (|ReceivedRTCPPacket|) to inform >+ // about RTT. >+ // In media transport we rely on the TargetTransferRateObserver instead. >+ // In other words, if you are using RTP, you should expect >+ // |ReceivedRTCPPacket| to be called, if you are using media transport, >+ // |OnTargetTransferRate| will be called. >+ // >+ // In future, RTP media will move to the media transport implementation and >+ // these conditions will be removed. >+ // Returns the RTT in milliseconds. 
>+ virtual int64_t GetRTT() const = 0; >+ virtual void StartSend() = 0; >+ virtual void StopSend() = 0; >+ >+ // E2EE Custom Audio Frame Encryption (Optional) >+ virtual void SetFrameEncryptor( >+ rtc::scoped_refptr<FrameEncryptorInterface> frame_encryptor) = 0; > }; > >+std::unique_ptr<ChannelSendInterface> CreateChannelSend( >+ rtc::TaskQueue* encoder_queue, >+ ProcessThread* module_process_thread, >+ MediaTransportInterface* media_transport, >+ Transport* rtp_transport, >+ RtcpRttStats* rtcp_rtt_stats, >+ RtcEventLog* rtc_event_log, >+ FrameEncryptorInterface* frame_encryptor, >+ const webrtc::CryptoOptions& crypto_options, >+ bool extmap_allow_mixed, >+ int rtcp_report_interval_ms); >+ > } // namespace voe > } // namespace webrtc > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_send_proxy.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_send_proxy.cc >deleted file mode 100644 >index 8091bdc671beeb89350a6a83aac294f126039884..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_send_proxy.cc >+++ /dev/null >@@ -1,207 +0,0 @@ >-/* >- * Copyright (c) 2015 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include "audio/channel_send_proxy.h" >- >-#include <utility> >- >-#include "api/call/audio_sink.h" >-#include "call/rtp_transport_controller_send_interface.h" >-#include "rtc_base/checks.h" >-#include "rtc_base/logging.h" >-#include "rtc_base/numerics/safe_minmax.h" >- >-namespace webrtc { >-namespace voe { >-ChannelSendProxy::ChannelSendProxy() {} >- >-ChannelSendProxy::ChannelSendProxy(std::unique_ptr<ChannelSend> channel) >- : channel_(std::move(channel)) { >- RTC_DCHECK(channel_); >- module_process_thread_checker_.DetachFromThread(); >-} >- >-ChannelSendProxy::~ChannelSendProxy() {} >- >-void ChannelSendProxy::SetLocalSSRC(uint32_t ssrc) { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- int error = channel_->SetLocalSSRC(ssrc); >- RTC_DCHECK_EQ(0, error); >-} >- >-void ChannelSendProxy::SetNACKStatus(bool enable, int max_packets) { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- channel_->SetNACKStatus(enable, max_packets); >-} >- >-CallSendStatistics ChannelSendProxy::GetRTCPStatistics() const { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- CallSendStatistics stats = {0}; >- int error = channel_->GetRTPStatistics(stats); >- RTC_DCHECK_EQ(0, error); >- return stats; >-} >- >-void ChannelSendProxy::RegisterTransport(Transport* transport) { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- channel_->RegisterTransport(transport); >-} >- >-bool ChannelSendProxy::ReceivedRTCPPacket(const uint8_t* packet, >- size_t length) { >- // May be called on either worker thread or network thread. 
>- return channel_->ReceivedRTCPPacket(packet, length) == 0; >-} >- >-bool ChannelSendProxy::SetEncoder(int payload_type, >- std::unique_ptr<AudioEncoder> encoder) { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- return channel_->SetEncoder(payload_type, std::move(encoder)); >-} >- >-void ChannelSendProxy::ModifyEncoder( >- rtc::FunctionView<void(std::unique_ptr<AudioEncoder>*)> modifier) { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- channel_->ModifyEncoder(modifier); >-} >- >-void ChannelSendProxy::SetRTCPStatus(bool enable) { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- channel_->SetRTCPStatus(enable); >-} >- >-void ChannelSendProxy::SetMid(const std::string& mid, int extension_id) { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- channel_->SetMid(mid, extension_id); >-} >- >-void ChannelSendProxy::SetRTCP_CNAME(const std::string& c_name) { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- // Note: VoERTP_RTCP::SetRTCP_CNAME() accepts a char[256] array. 
>- std::string c_name_limited = c_name.substr(0, 255); >- int error = channel_->SetRTCP_CNAME(c_name_limited.c_str()); >- RTC_DCHECK_EQ(0, error); >-} >- >-void ChannelSendProxy::SetSendAudioLevelIndicationStatus(bool enable, int id) { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- int error = channel_->SetSendAudioLevelIndicationStatus(enable, id); >- RTC_DCHECK_EQ(0, error); >-} >- >-void ChannelSendProxy::EnableSendTransportSequenceNumber(int id) { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- channel_->EnableSendTransportSequenceNumber(id); >-} >- >-void ChannelSendProxy::RegisterSenderCongestionControlObjects( >- RtpTransportControllerSendInterface* transport, >- RtcpBandwidthObserver* bandwidth_observer) { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- channel_->RegisterSenderCongestionControlObjects(transport, >- bandwidth_observer); >-} >- >-void ChannelSendProxy::ResetSenderCongestionControlObjects() { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- channel_->ResetSenderCongestionControlObjects(); >-} >- >-std::vector<ReportBlock> ChannelSendProxy::GetRemoteRTCPReportBlocks() const { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- std::vector<webrtc::ReportBlock> blocks; >- int error = channel_->GetRemoteRTCPReportBlocks(&blocks); >- RTC_DCHECK_EQ(0, error); >- return blocks; >-} >- >-ANAStats ChannelSendProxy::GetANAStatistics() const { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- return channel_->GetANAStatistics(); >-} >- >-bool ChannelSendProxy::SetSendTelephoneEventPayloadType(int payload_type, >- int payload_frequency) { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- return channel_->SetSendTelephoneEventPayloadType(payload_type, >- payload_frequency) == 0; >-} >- >-bool ChannelSendProxy::SendTelephoneEventOutband(int event, int duration_ms) { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- return 
channel_->SendTelephoneEventOutband(event, duration_ms) == 0; >-} >- >-void ChannelSendProxy::SetBitrate(int bitrate_bps, >- int64_t probing_interval_ms) { >- // This method can be called on the worker thread, module process thread >- // or on a TaskQueue via VideoSendStreamImpl::OnEncoderConfigurationChanged. >- // TODO(solenberg): Figure out a good way to check this or enforce calling >- // rules. >- // RTC_DCHECK(worker_thread_checker_.CalledOnValidThread() || >- // module_process_thread_checker_.CalledOnValidThread()); >- channel_->SetBitRate(bitrate_bps, probing_interval_ms); >-} >- >-void ChannelSendProxy::SetInputMute(bool muted) { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- channel_->SetInputMute(muted); >-} >- >-void ChannelSendProxy::ProcessAndEncodeAudio( >- std::unique_ptr<AudioFrame> audio_frame) { >- RTC_DCHECK_RUNS_SERIALIZED(&audio_thread_race_checker_); >- return channel_->ProcessAndEncodeAudio(std::move(audio_frame)); >-} >- >-void ChannelSendProxy::SetTransportOverhead(int transport_overhead_per_packet) { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- channel_->SetTransportOverhead(transport_overhead_per_packet); >-} >- >-RtpRtcp* ChannelSendProxy::GetRtpRtcp() const { >- RTC_DCHECK(module_process_thread_checker_.CalledOnValidThread()); >- return channel_->GetRtpRtcp(); >-} >- >-void ChannelSendProxy::OnTwccBasedUplinkPacketLossRate(float packet_loss_rate) { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- channel_->OnTwccBasedUplinkPacketLossRate(packet_loss_rate); >-} >- >-void ChannelSendProxy::OnRecoverableUplinkPacketLossRate( >- float recoverable_packet_loss_rate) { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- channel_->OnRecoverableUplinkPacketLossRate(recoverable_packet_loss_rate); >-} >- >-void ChannelSendProxy::StartSend() { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- int error = channel_->StartSend(); >- RTC_DCHECK_EQ(0, error); >-} >- >-void 
ChannelSendProxy::StopSend() { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- channel_->StopSend(); >-} >- >-ChannelSend* ChannelSendProxy::GetChannel() const { >- return channel_.get(); >-} >- >-void ChannelSendProxy::SetFrameEncryptor( >- FrameEncryptorInterface* frame_encryptor) { >- RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- channel_->SetFrameEncryptor(frame_encryptor); >-} >- >-} // namespace voe >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_send_proxy.h b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_send_proxy.h >deleted file mode 100644 >index 1b8b4a02ce5eecd8d69c886e52081efedeb9ab56..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/channel_send_proxy.h >+++ /dev/null >@@ -1,111 +0,0 @@ >-/* >- * Copyright (c) 2015 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef AUDIO_CHANNEL_SEND_PROXY_H_ >-#define AUDIO_CHANNEL_SEND_PROXY_H_ >- >-#include <memory> >-#include <string> >-#include <vector> >- >-#include "api/audio_codecs/audio_encoder.h" >-#include "audio/channel_send.h" >-#include "rtc_base/constructormagic.h" >-#include "rtc_base/race_checker.h" >-#include "rtc_base/thread_checker.h" >- >-namespace webrtc { >- >-class FrameEncryptorInterface; >-class RtcpBandwidthObserver; >-class RtpRtcp; >-class RtpTransportControllerSendInterface; >-class Transport; >- >-namespace voe { >- >-// This class provides the "view" of a voe::Channel that we need to implement >-// webrtc::AudioSendStream. It serves two purposes: >-// 1. 
Allow mocking just the interfaces used, instead of the entire >-// voe::Channel class. >-// 2. Provide a refined interface for the stream classes, including assumptions >-// on return values and input adaptation. >-class ChannelSendProxy { >- public: >- ChannelSendProxy(); >- explicit ChannelSendProxy(std::unique_ptr<ChannelSend> channel); >- virtual ~ChannelSendProxy(); >- >- // Shared with ChannelReceiveProxy >- virtual void SetLocalSSRC(uint32_t ssrc); >- virtual void SetNACKStatus(bool enable, int max_packets); >- virtual CallSendStatistics GetRTCPStatistics() const; >- virtual void RegisterTransport(Transport* transport); >- virtual bool ReceivedRTCPPacket(const uint8_t* packet, size_t length); >- >- virtual bool SetEncoder(int payload_type, >- std::unique_ptr<AudioEncoder> encoder); >- virtual void ModifyEncoder( >- rtc::FunctionView<void(std::unique_ptr<AudioEncoder>*)> modifier); >- >- virtual void SetRTCPStatus(bool enable); >- virtual void SetMid(const std::string& mid, int extension_id); >- virtual void SetRTCP_CNAME(const std::string& c_name); >- virtual void SetSendAudioLevelIndicationStatus(bool enable, int id); >- virtual void EnableSendTransportSequenceNumber(int id); >- virtual void RegisterSenderCongestionControlObjects( >- RtpTransportControllerSendInterface* transport, >- RtcpBandwidthObserver* bandwidth_observer); >- virtual void ResetSenderCongestionControlObjects(); >- virtual std::vector<ReportBlock> GetRemoteRTCPReportBlocks() const; >- virtual ANAStats GetANAStatistics() const; >- virtual bool SetSendTelephoneEventPayloadType(int payload_type, >- int payload_frequency); >- virtual bool SendTelephoneEventOutband(int event, int duration_ms); >- virtual void SetBitrate(int bitrate_bps, int64_t probing_interval_ms); >- virtual void SetInputMute(bool muted); >- >- virtual void ProcessAndEncodeAudio(std::unique_ptr<AudioFrame> audio_frame); >- virtual void SetTransportOverhead(int transport_overhead_per_packet); >- virtual RtpRtcp* GetRtpRtcp() 
const; >- >- virtual void OnTwccBasedUplinkPacketLossRate(float packet_loss_rate); >- virtual void OnRecoverableUplinkPacketLossRate( >- float recoverable_packet_loss_rate); >- virtual void StartSend(); >- virtual void StopSend(); >- >- // Needed by ChannelReceiveProxy::AssociateSendChannel. >- virtual ChannelSend* GetChannel() const; >- >- // E2EE Custom Audio Frame Encryption (Optional) >- virtual void SetFrameEncryptor(FrameEncryptorInterface* frame_encryptor); >- >- private: >- // Thread checkers document and lock usage of some methods on voe::Channel to >- // specific threads we know about. The goal is to eventually split up >- // voe::Channel into parts with single-threaded semantics, and thereby reduce >- // the need for locks. >- rtc::ThreadChecker worker_thread_checker_; >- rtc::ThreadChecker module_process_thread_checker_; >- // Methods accessed from audio and video threads are checked for sequential- >- // only access. We don't necessarily own and control these threads, so thread >- // checkers cannot be used. E.g. Chromium may transfer "ownership" from one >- // audio thread to another, but access is still sequential. 
>- rtc::RaceChecker audio_thread_race_checker_; >- rtc::RaceChecker video_capture_thread_race_checker_; >- std::unique_ptr<ChannelSend> channel_; >- >- RTC_DISALLOW_COPY_AND_ASSIGN(ChannelSendProxy); >-}; >-} // namespace voe >-} // namespace webrtc >- >-#endif // AUDIO_CHANNEL_SEND_PROXY_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/mock_voe_channel_proxy.h b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/mock_voe_channel_proxy.h >index 910858fe130cac8d94f84cb0c59b8b1aff5e2f68..eee25c5a2d532cb245d99cf4dd4873e974392401 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/mock_voe_channel_proxy.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/mock_voe_channel_proxy.h >@@ -16,15 +16,16 @@ > #include <string> > #include <vector> > >-#include "audio/channel_receive_proxy.h" >-#include "audio/channel_send_proxy.h" >+#include "api/test/mock_frame_encryptor.h" >+#include "audio/channel_receive.h" >+#include "audio/channel_send.h" > #include "modules/rtp_rtcp/source/rtp_packet_received.h" > #include "test/gmock.h" > > namespace webrtc { > namespace test { > >-class MockChannelReceiveProxy : public voe::ChannelReceiveProxy { >+class MockChannelReceive : public voe::ChannelReceiveInterface { > public: > MOCK_METHOD1(SetLocalSSRC, void(uint32_t ssrc)); > MOCK_METHOD2(SetNACKStatus, void(bool enable, int max_packets)); >@@ -46,10 +47,10 @@ class MockChannelReceiveProxy : public voe::ChannelReceiveProxy { > AudioMixer::Source::AudioFrameInfo(int sample_rate_hz, > AudioFrame* audio_frame)); > MOCK_CONST_METHOD0(PreferredSampleRate, int()); >- MOCK_METHOD1(AssociateSendChannel, >- void(const voe::ChannelSendProxy& send_channel_proxy)); >- MOCK_METHOD0(DisassociateSendChannel, void()); >+ MOCK_METHOD1(SetAssociatedSendChannel, >+ void(const voe::ChannelSendInterface* send_channel)); > MOCK_CONST_METHOD0(GetPlayoutTimestamp, uint32_t()); >+ MOCK_CONST_METHOD0(GetSyncInfo, absl::optional<Syncable::Info>()); > MOCK_METHOD1(SetMinimumPlayoutDelay, 
void(int delay_ms)); > MOCK_CONST_METHOD1(GetRecCodec, bool(CodecInst* codec_inst)); > MOCK_METHOD1(SetReceiveCodecs, >@@ -59,7 +60,7 @@ class MockChannelReceiveProxy : public voe::ChannelReceiveProxy { > MOCK_METHOD0(StopPlayout, void()); > }; > >-class MockChannelSendProxy : public voe::ChannelSendProxy { >+class MockChannelSend : public voe::ChannelSendInterface { > public: > // GMock doesn't like move-only types, like std::unique_ptr. > virtual bool SetEncoder(int payload_type, >@@ -71,10 +72,10 @@ class MockChannelSendProxy : public voe::ChannelSendProxy { > MOCK_METHOD1( > ModifyEncoder, > void(rtc::FunctionView<void(std::unique_ptr<AudioEncoder>*)> modifier)); >- MOCK_METHOD1(SetRTCPStatus, void(bool enable)); >+ MOCK_METHOD2(SetMid, void(const std::string& mid, int extension_id)); > MOCK_METHOD1(SetLocalSSRC, void(uint32_t ssrc)); >- MOCK_METHOD1(SetRTCP_CNAME, void(const std::string& c_name)); >- MOCK_METHOD2(SetNACKStatus, void(bool enable, int max_packets)); >+ MOCK_METHOD1(SetRTCP_CNAME, void(absl::string_view c_name)); >+ MOCK_METHOD1(SetExtmapAllowMixed, void(bool extmap_allow_mixed)); > MOCK_METHOD2(SetSendAudioLevelIndicationStatus, void(bool enable, int id)); > MOCK_METHOD1(EnableSendTransportSequenceNumber, void(int id)); > MOCK_METHOD2(RegisterSenderCongestionControlObjects, >@@ -87,9 +88,8 @@ class MockChannelSendProxy : public voe::ChannelSendProxy { > MOCK_METHOD2(SetSendTelephoneEventPayloadType, > bool(int payload_type, int payload_frequency)); > MOCK_METHOD2(SendTelephoneEventOutband, bool(int event, int duration_ms)); >- MOCK_METHOD2(SetBitrate, void(int bitrate_bps, int64_t probing_interval_ms)); >+ MOCK_METHOD1(OnBitrateAllocation, void(BitrateAllocationUpdate update)); > MOCK_METHOD1(SetInputMute, void(bool muted)); >- MOCK_METHOD1(RegisterTransport, void(Transport* transport)); > MOCK_METHOD2(ReceivedRTCPPacket, bool(const uint8_t* packet, size_t length)); > // GMock doesn't like move-only types, like std::unique_ptr. 
> virtual void ProcessAndEncodeAudio(std::unique_ptr<AudioFrame> audio_frame) { >@@ -97,15 +97,19 @@ class MockChannelSendProxy : public voe::ChannelSendProxy { > } > MOCK_METHOD1(ProcessAndEncodeAudioForMock, > void(std::unique_ptr<AudioFrame>* audio_frame)); >- MOCK_METHOD1(SetTransportOverhead, void(int transport_overhead_per_packet)); >+ MOCK_METHOD1(SetTransportOverhead, >+ void(size_t transport_overhead_per_packet)); > MOCK_CONST_METHOD0(GetRtpRtcp, RtpRtcp*()); >+ MOCK_CONST_METHOD0(GetBitrate, int()); > MOCK_METHOD1(OnTwccBasedUplinkPacketLossRate, void(float packet_loss_rate)); > MOCK_METHOD1(OnRecoverableUplinkPacketLossRate, > void(float recoverable_packet_loss_rate)); >+ MOCK_CONST_METHOD0(GetRTT, int64_t()); > MOCK_METHOD0(StartSend, void()); > MOCK_METHOD0(StopSend, void()); >- MOCK_METHOD1(SetFrameEncryptor, >- void(FrameEncryptorInterface* frame_encryptor)); >+ MOCK_METHOD1( >+ SetFrameEncryptor, >+ void(rtc::scoped_refptr<FrameEncryptorInterface> frame_encryptor)); > }; > } // namespace test > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/null_audio_poller.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/null_audio_poller.cc >index c22b3d879115811ce6d0d1d274db1bfd3718c27d..d2b1199ed6dd27d58d3d7e486728e49f3c318c5d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/null_audio_poller.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/null_audio_poller.cc >@@ -9,8 +9,13 @@ > */ > > #include "audio/null_audio_poller.h" >-#include "rtc_base/logging.h" >+ >+#include <stddef.h> >+ >+#include "rtc_base/checks.h" >+#include "rtc_base/location.h" > #include "rtc_base/thread.h" >+#include "rtc_base/timeutils.h" > > namespace webrtc { > namespace internal { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/null_audio_poller.h b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/null_audio_poller.h >index afb6edbaf16132f787088acc6f366bd1cb624708..f91eb7d6c0782448cc9a7a6943b5542bf3b7c11e 100644 
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/null_audio_poller.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/null_audio_poller.h >@@ -11,8 +11,11 @@ > #ifndef AUDIO_NULL_AUDIO_POLLER_H_ > #define AUDIO_NULL_AUDIO_POLLER_H_ > >+#include <stdint.h> >+ > #include "modules/audio_device/include/audio_device_defines.h" > #include "rtc_base/messagehandler.h" >+#include "rtc_base/messagequeue.h" > #include "rtc_base/thread_checker.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/remix_resample.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/remix_resample.cc >index 97222e947a4e62f345c7304d7b692af9c0da5b64..b1158da213f966b44340d6220fc9e04f662a7ebc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/remix_resample.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/remix_resample.cc >@@ -13,10 +13,7 @@ > #include "api/audio/audio_frame.h" > #include "audio/utility/audio_frame_operations.h" > #include "common_audio/resampler/include/push_resampler.h" >-#include "common_audio/signal_processing/include/signal_processing_library.h" >-#include "common_types.h" // NOLINT(build/include) > #include "rtc_base/checks.h" >-#include "rtc_base/logging.h" > > namespace webrtc { > namespace voe { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/audio_bwe_integration_test.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/audio_bwe_integration_test.cc >index 8ec2a7cf4a574e153caa143ee647f4133fc9ee92..9fa95b17b8e46d94eebd641c37c02261ccf80d47 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/audio_bwe_integration_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/audio_bwe_integration_test.cc >@@ -123,8 +123,8 @@ class NoBandwidthDropAfterDtx : public AudioBweTest { > return test::ResourcePath("voice_engine/audio_dtx16", "wav"); > } > >- DefaultNetworkSimulationConfig GetNetworkPipeConfig() override { >- DefaultNetworkSimulationConfig pipe_config; >+ 
BuiltInNetworkBehaviorConfig GetNetworkPipeConfig() override { >+ BuiltInNetworkBehaviorConfig pipe_config; > pipe_config.link_capacity_kbps = 50; > pipe_config.queue_length_packets = 1500; > pipe_config.queue_delay_ms = 300; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/audio_bwe_integration_test.h b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/audio_bwe_integration_test.h >index c70ad1c91b93f771b580f460aff631e2a6cd6b1a..4de2cab31613bec35968111a10dc55e3b995a7d3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/audio_bwe_integration_test.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/audio_bwe_integration_test.h >@@ -27,7 +27,7 @@ class AudioBweTest : public test::EndToEndTest { > protected: > virtual std::string AudioInputFile() = 0; > >- virtual DefaultNetworkSimulationConfig GetNetworkPipeConfig() = 0; >+ virtual BuiltInNetworkBehaviorConfig GetNetworkPipeConfig() = 0; > > size_t GetNumVideoStreams() const override; > size_t GetNumAudioStreams() const override; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/audio_end_to_end_test.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/audio_end_to_end_test.cc >index 02abe734bbfe1f2ec688b402ca122e2e5d48d596..5e187ad8f8980912935ad6a0bfa90cc09b332e45 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/audio_end_to_end_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/audio_end_to_end_test.cc >@@ -28,8 +28,8 @@ constexpr int kSampleRate = 48000; > AudioEndToEndTest::AudioEndToEndTest() > : EndToEndTest(CallTest::kDefaultTimeoutMs) {} > >-DefaultNetworkSimulationConfig AudioEndToEndTest::GetNetworkPipeConfig() const { >- return DefaultNetworkSimulationConfig(); >+BuiltInNetworkBehaviorConfig AudioEndToEndTest::GetNetworkPipeConfig() const { >+ return BuiltInNetworkBehaviorConfig(); > } > > size_t AudioEndToEndTest::GetNumVideoStreams() const { >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/audio_end_to_end_test.h b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/audio_end_to_end_test.h >index 9dda834d22b4ed6e5ee2a0f2f48723cf4c00c002..ba1e0c7d6ef8eec770f3bcdd813bfb22745a4802 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/audio_end_to_end_test.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/audio_end_to_end_test.h >@@ -29,7 +29,7 @@ class AudioEndToEndTest : public test::EndToEndTest { > const AudioSendStream* send_stream() const { return send_stream_; } > const AudioReceiveStream* receive_stream() const { return receive_stream_; } > >- virtual DefaultNetworkSimulationConfig GetNetworkPipeConfig() const; >+ virtual BuiltInNetworkBehaviorConfig GetNetworkPipeConfig() const; > > size_t GetNumVideoStreams() const override; > size_t GetNumAudioStreams() const override; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/audio_stats_test.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/audio_stats_test.cc >index 019679e0782f86b0b76a4b9031eebc793a3e2349..556a16dfa6d505633f9972d09652bcc17f019bad 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/audio_stats_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/audio_stats_test.cc >@@ -32,8 +32,8 @@ class NoLossTest : public AudioEndToEndTest { > > NoLossTest() = default; > >- DefaultNetworkSimulationConfig GetNetworkPipeConfig() const override { >- DefaultNetworkSimulationConfig pipe_config; >+ BuiltInNetworkBehaviorConfig GetNetworkPipeConfig() const override { >+ BuiltInNetworkBehaviorConfig pipe_config; > pipe_config.queue_delay_ms = kRttMs / 2; > return pipe_config; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/low_bandwidth_audio_test.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/low_bandwidth_audio_test.cc >index 16a3b91e877cdd88d6f93f8ed6f22b9bc92ab437..857f983d57775a1a47b10f80980d68a9e8de25df 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/low_bandwidth_audio_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/low_bandwidth_audio_test.cc >@@ -14,14 +14,15 @@ > #include "system_wrappers/include/sleep.h" > #include "test/testsupport/fileutils.h" > >-DEFINE_int(sample_rate_hz, >- 16000, >- "Sample rate (Hz) of the produced audio files."); >+WEBRTC_DEFINE_int(sample_rate_hz, >+ 16000, >+ "Sample rate (Hz) of the produced audio files."); > >-DEFINE_bool(quick, >- false, >- "Don't do the full audio recording. " >- "Used to quickly check that the test runs without crashing."); >+WEBRTC_DEFINE_bool( >+ quick, >+ false, >+ "Don't do the full audio recording. " >+ "Used to quickly check that the test runs without crashing."); > > namespace webrtc { > namespace test { >@@ -89,8 +90,8 @@ class Mobile2GNetworkTest : public AudioQualityTest { > {{"maxaveragebitrate", "6000"}, {"ptime", "60"}, {"stereo", "1"}}}); > } > >- DefaultNetworkSimulationConfig GetNetworkPipeConfig() const override { >- DefaultNetworkSimulationConfig pipe_config; >+ BuiltInNetworkBehaviorConfig GetNetworkPipeConfig() const override { >+ BuiltInNetworkBehaviorConfig pipe_config; > pipe_config.link_capacity_kbps = 12; > pipe_config.queue_length_packets = 1500; > pipe_config.queue_delay_ms = 400; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/media_transport_test.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/media_transport_test.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..b4af4a08819138bf696926133f04e1edee3a6020 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/test/media_transport_test.cc >@@ -0,0 +1,143 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. 
An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "api/audio_codecs/audio_decoder_factory_template.h" >+#include "api/audio_codecs/audio_encoder_factory_template.h" >+#include "api/audio_codecs/opus/audio_decoder_opus.h" >+#include "api/audio_codecs/opus/audio_encoder_opus.h" >+#include "api/test/loopback_media_transport.h" >+#include "api/test/mock_audio_mixer.h" >+#include "audio/audio_receive_stream.h" >+#include "audio/audio_send_stream.h" >+#include "call/test/mock_bitrate_allocator.h" >+#include "logging/rtc_event_log/rtc_event_log.h" >+#include "modules/audio_device/include/test_audio_device.h" >+#include "modules/audio_mixer/audio_mixer_impl.h" >+#include "modules/audio_processing/include/mock_audio_processing.h" >+#include "modules/utility/include/process_thread.h" >+#include "rtc_base/task_queue.h" >+#include "rtc_base/timeutils.h" >+#include "test/gtest.h" >+#include "test/mock_transport.h" >+ >+namespace webrtc { >+namespace test { >+ >+namespace { >+constexpr int kPayloadTypeOpus = 17; >+constexpr int kSamplingFrequency = 48000; >+constexpr int kNumChannels = 2; >+constexpr int kWantedSamples = 3000; >+constexpr int kTestTimeoutMs = 2 * rtc::kNumMillisecsPerSec; >+ >+class TestRenderer : public TestAudioDeviceModule::Renderer { >+ public: >+ TestRenderer(int sampling_frequency, int num_channels, size_t wanted_samples) >+ : sampling_frequency_(sampling_frequency), >+ num_channels_(num_channels), >+ wanted_samples_(wanted_samples) {} >+ ~TestRenderer() override = default; >+ >+ int SamplingFrequency() const override { return sampling_frequency_; } >+ int NumChannels() const override { return num_channels_; } >+ >+ bool Render(rtc::ArrayView<const int16_t> data) override { >+ if (data.size() >= wanted_samples_) { >+ return false; >+ } >+ wanted_samples_ -= data.size(); >+ return true; >+ } >+ >+ 
private: >+ const int sampling_frequency_; >+ const int num_channels_; >+ size_t wanted_samples_; >+}; >+ >+} // namespace >+ >+TEST(AudioWithMediaTransport, DeliversAudio) { >+ std::unique_ptr<rtc::Thread> transport_thread = rtc::Thread::Create(); >+ transport_thread->Start(); >+ MediaTransportPair transport_pair(transport_thread.get()); >+ MockTransport rtcp_send_transport; >+ MockTransport send_transport; >+ std::unique_ptr<RtcEventLog> null_event_log = RtcEventLog::CreateNull(); >+ MockBitrateAllocator bitrate_allocator; >+ >+ rtc::scoped_refptr<TestAudioDeviceModule> audio_device = >+ TestAudioDeviceModule::CreateTestAudioDeviceModule( >+ TestAudioDeviceModule::CreatePulsedNoiseCapturer( >+ /* max_amplitude= */ 10000, kSamplingFrequency, kNumChannels), >+ absl::make_unique<TestRenderer>(kSamplingFrequency, kNumChannels, >+ kWantedSamples)); >+ >+ AudioState::Config audio_config; >+ audio_config.audio_mixer = AudioMixerImpl::Create(); >+ // TODO(nisse): Is a mock AudioProcessing enough? >+ audio_config.audio_processing = >+ new rtc::RefCountedObject<MockAudioProcessing>(); >+ audio_config.audio_device_module = audio_device; >+ rtc::scoped_refptr<AudioState> audio_state = AudioState::Create(audio_config); >+ >+ // TODO(nisse): Use some lossless codec? >+ const SdpAudioFormat audio_format("opus", kSamplingFrequency, kNumChannels); >+ >+ // Setup receive stream; >+ webrtc::AudioReceiveStream::Config receive_config; >+ // TODO(nisse): Update AudioReceiveStream to not require rtcp_send_transport >+ // when a MediaTransport is provided. 
>+ receive_config.rtcp_send_transport = &rtcp_send_transport; >+ receive_config.media_transport = transport_pair.first(); >+ receive_config.decoder_map.emplace(kPayloadTypeOpus, audio_format); >+ receive_config.decoder_factory = >+ CreateAudioDecoderFactory<AudioDecoderOpus>(); >+ >+ std::unique_ptr<ProcessThread> receive_process_thread = >+ ProcessThread::Create("audio recv thread"); >+ >+ webrtc::internal::AudioReceiveStream receive_stream( >+ /*rtp_stream_receiver_controller=*/nullptr, >+ /*packet_router=*/nullptr, receive_process_thread.get(), receive_config, >+ audio_state, null_event_log.get()); >+ >+ // TODO(nisse): Update AudioSendStream to not require send_transport when a >+ // MediaTransport is provided. >+ AudioSendStream::Config send_config(&send_transport, transport_pair.second()); >+ send_config.send_codec_spec = >+ AudioSendStream::Config::SendCodecSpec(kPayloadTypeOpus, audio_format); >+ send_config.encoder_factory = CreateAudioEncoderFactory<AudioEncoderOpus>(); >+ rtc::TaskQueue send_tq("audio send queue"); >+ std::unique_ptr<ProcessThread> send_process_thread = >+ ProcessThread::Create("audio send thread"); >+ webrtc::internal::AudioSendStream send_stream( >+ send_config, audio_state, &send_tq, send_process_thread.get(), >+ /*transport=*/nullptr, &bitrate_allocator, null_event_log.get(), >+ /*rtcp_rtt_stats=*/nullptr, absl::optional<RtpState>()); >+ >+ audio_device->Init(); // Starts thread. 
>+ audio_device->RegisterAudioCallback(audio_state->audio_transport()); >+ >+ receive_stream.Start(); >+ send_stream.Start(); >+ audio_device->StartPlayout(); >+ audio_device->StartRecording(); >+ >+ EXPECT_TRUE(audio_device->WaitForPlayoutEnd(kTestTimeoutMs)); >+ >+ audio_device->StopRecording(); >+ audio_device->StopPlayout(); >+ receive_stream.Stop(); >+ send_stream.Stop(); >+} >+ >+} // namespace test >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/time_interval.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/time_interval.cc >deleted file mode 100644 >index cc103408a3a2467e5a0a02f88e390d74813c6c1d..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/time_interval.cc >+++ /dev/null >@@ -1,56 +0,0 @@ >-/* >- * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include "audio/time_interval.h" >-#include "rtc_base/checks.h" >-#include "rtc_base/timeutils.h" >- >-namespace webrtc { >- >-TimeInterval::TimeInterval() = default; >-TimeInterval::~TimeInterval() = default; >- >-void TimeInterval::Extend() { >- Extend(rtc::TimeMillis()); >-} >- >-void TimeInterval::Extend(int64_t time) { >- if (!interval_) { >- interval_.emplace(time, time); >- } else { >- if (time < interval_->first) { >- interval_->first = time; >- } >- if (time > interval_->last) { >- interval_->last = time; >- } >- } >-} >- >-void TimeInterval::Extend(const TimeInterval& other_interval) { >- if (!other_interval.Empty()) { >- Extend(other_interval.interval_->first); >- Extend(other_interval.interval_->last); >- } >-} >- >-bool TimeInterval::Empty() const { >- return !interval_; >-} >- >-int64_t TimeInterval::Length() const { >- RTC_DCHECK(interval_); >- return interval_->last - interval_->first; >-} >- >-TimeInterval::Interval::Interval(int64_t first, int64_t last) >- : first(first), last(last) {} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/time_interval.h b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/time_interval.h >deleted file mode 100644 >index 79fe29d9d569a1056d59131d5876e2c90b96d499..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/time_interval.h >+++ /dev/null >@@ -1,65 +0,0 @@ >-/* >- * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#ifndef AUDIO_TIME_INTERVAL_H_ >-#define AUDIO_TIME_INTERVAL_H_ >- >-#include <stdint.h> >- >-#include "absl/types/optional.h" >- >-namespace webrtc { >- >-// This class logs the first and last time its Extend() function is called. >-// >-// This class is not thread-safe; Extend() calls should only be made by a >-// single thread at a time, such as within a lock or destructor. >-// >-// Example usage: >-// // let x < y < z < u < v >-// rtc::TimeInterval interval; >-// ... // interval.Extend(); // at time x >-// ... >-// interval.Extend(); // at time y >-// ... >-// interval.Extend(); // at time u >-// ... >-// interval.Extend(z); // at time v >-// ... >-// if (!interval.Empty()) { >-// int64_t active_time = interval.Length(); // returns (u - x) >-// } >-class TimeInterval { >- public: >- TimeInterval(); >- ~TimeInterval(); >- // Extend the interval with the current time. >- void Extend(); >- // Extend the interval with a given time. >- void Extend(int64_t time); >- // Take the convex hull with another interval. >- void Extend(const TimeInterval& other_interval); >- // True iff Extend has never been called. >- bool Empty() const; >- // Returns the time between the first and the last tick, in milliseconds. >- int64_t Length() const; >- >- private: >- struct Interval { >- Interval(int64_t first, int64_t last); >- >- int64_t first, last; >- }; >- absl::optional<Interval> interval_; >-}; >- >-} // namespace webrtc >- >-#endif // AUDIO_TIME_INTERVAL_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/time_interval_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/time_interval_unittest.cc >deleted file mode 100644 >index deff6e363da2161dbdee680719872fb296526e97..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/time_interval_unittest.cc >+++ /dev/null >@@ -1,48 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC Project Authors. All rights reserved. 
>- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include "audio/time_interval.h" >-#include "api/units/time_delta.h" >-#include "rtc_base/fakeclock.h" >-#include "test/gtest.h" >- >-namespace webrtc { >- >-TEST(TimeIntervalTest, TimeInMs) { >- rtc::ScopedFakeClock fake_clock; >- TimeInterval interval; >- interval.Extend(); >- fake_clock.AdvanceTime(TimeDelta::ms(100)); >- interval.Extend(); >- EXPECT_EQ(interval.Length(), 100); >-} >- >-TEST(TimeIntervalTest, Empty) { >- TimeInterval interval; >- EXPECT_TRUE(interval.Empty()); >- interval.Extend(); >- EXPECT_FALSE(interval.Empty()); >- interval.Extend(200); >- EXPECT_FALSE(interval.Empty()); >-} >- >-TEST(TimeIntervalTest, MonotoneIncreasing) { >- const size_t point_count = 7; >- const int64_t interval_points[] = {3, 2, 5, 0, 4, 1, 6}; >- const int64_t interval_differences[] = {0, 1, 3, 5, 5, 5, 6}; >- TimeInterval interval; >- EXPECT_TRUE(interval.Empty()); >- for (size_t i = 0; i < point_count; ++i) { >- interval.Extend(interval_points[i]); >- EXPECT_EQ(interval_differences[i], interval.Length()); >- } >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/transport_feedback_packet_loss_tracker.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/transport_feedback_packet_loss_tracker.cc >index c7acd766c4485925b4da7a9c028137d35def9fca..f41439b494fcd38b3a3a1675b63386fafaa4de8b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/transport_feedback_packet_loss_tracker.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/transport_feedback_packet_loss_tracker.cc >@@ -10,11 +10,11 @@ > > #include "audio/transport_feedback_packet_loss_tracker.h" > >+#include <iterator> 
> #include <limits> > #include <utility> > > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" >-#include "modules/rtp_rtcp/source/rtcp_packet/transport_feedback.h" > #include "rtc_base/checks.h" > #include "rtc_base/numerics/mod_ops.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/utility/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/utility/BUILD.gn >index 76c09a51a380e1c84655afa096e84e1014f7c7eb..11a65bdd4638ad5bfbef56bf9e9341f2ecff3b86 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/utility/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/utility/BUILD.gn >@@ -21,7 +21,6 @@ rtc_static_library("audio_frame_operations") { > ] > > deps = [ >- "../..:webrtc_common", > "../../api/audio:audio_frame_api", > "../../rtc_base:checks", > "../../rtc_base:rtc_base_approved", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/utility/audio_frame_operations.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/utility/audio_frame_operations.cc >index fb1f3b07711b79dd4bc059bc1921df28fea26bed..1a8232b02b3ae4bed9b2dcf21e8c1361eb1cd801 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/utility/audio_frame_operations.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/utility/audio_frame_operations.cc >@@ -12,6 +12,7 @@ > > #include <string.h> > #include <algorithm> >+#include <cstdint> > > #include "rtc_base/checks.h" > #include "rtc_base/numerics/safe_conversions.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/utility/audio_frame_operations.h b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/utility/audio_frame_operations.h >index 599352356f62f39346df89e0297c6c21152a6719..c1445b691098e39934affdaf9a839d142eaad523 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/utility/audio_frame_operations.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/utility/audio_frame_operations.h >@@ -12,6 +12,7 @@ > #define AUDIO_UTILITY_AUDIO_FRAME_OPERATIONS_H_ > > #include 
<stddef.h> >+#include <stdint.h> > > #include "api/audio/audio_frame.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/build_overrides/OWNERS b/Source/ThirdParty/libwebrtc/Source/webrtc/build_overrides/OWNERS >index 524e2676ff52bfef38930036abfef85b26072bf4..5465ed8e6a869e1e742474024df997b40ac525bd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/build_overrides/OWNERS >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/build_overrides/OWNERS >@@ -1 +1,2 @@ >+mbonadei@webrtc.org > phoglund@webrtc.org >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/build_overrides/build.gni b/Source/ThirdParty/libwebrtc/Source/webrtc/build_overrides/build.gni >index 81cb3e73ae2329f92de21978552a8778ab82f214..669044db81cb2185be374b0b10027093c5683c35 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/build_overrides/build.gni >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/build_overrides/build.gni >@@ -12,8 +12,8 @@ enable_java_templates = true > # Some non-Chromium builds don't use Chromium's third_party/binutils. > linux_use_bundled_binutils_override = true > >-# Variable that can be used to support multiple build scenarios, like having >-# Chromium specific targets in a client project's GN file etc. >+# Don't set this variable to true when building standalone WebRTC; it is >+# only needed to support both WebRTC standalone and Chromium builds. > build_with_chromium = false > > # Use our own suppressions files.
>@@ -42,7 +42,7 @@ if (host_os == "mac") { > [ target_os ], > "value") > assert(_result != 2, >- "Do not allow building targets with the default" + >+ "Do not allow building targets with the default " + > "hermetic toolchain if the minimum OS version is not met.") > use_system_xcode = _result == 0 > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/call/BUILD.gn >index 611d9e8adb13c7b529108fc7cfd314b08616b0dd..f6ba4ec8490829f44af38e25458d0b69889eb014 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/BUILD.gn >@@ -38,11 +38,14 @@ rtc_source_set("call_interfaces") { > "../api/audio_codecs:audio_codecs_api", > "../api/transport:network_control", > "../modules/audio_device:audio_device", >+ "../modules/audio_processing:api", > "../modules/audio_processing:audio_processing", > "../modules/audio_processing:audio_processing_statistics", > "../rtc_base:audio_format_to_string", >+ "../rtc_base:checks", > "../rtc_base:rtc_base", > "../rtc_base:rtc_base_approved", >+ "../rtc_base/network:sent_packet", > "//third_party/abseil-cpp/absl/types:optional", > ] > } >@@ -161,6 +164,9 @@ rtc_source_set("bitrate_allocator") { > "bitrate_allocator.h", > ] > deps = [ >+ "../api:bitrate_allocation", >+ "../api/units:data_rate", >+ "../api/units:time_delta", > "../modules/bitrate_controller", > "../rtc_base:checks", > "../rtc_base:rtc_base_approved", >@@ -193,7 +199,6 @@ rtc_static_library("call") { > ":rtp_sender", > ":simulated_network", > ":video_stream_api", >- "..:webrtc_common", > "../api:callfactory_api", > "../api:simulated_network_api", > "../api:transport_api", >@@ -217,6 +222,7 @@ rtc_static_library("call") { > "../rtc_base:rtc_task_queue", > "../rtc_base:safe_minmax", > "../rtc_base:sequenced_task_checker", >+ "../rtc_base/experiments:field_trial_parser", > "../rtc_base/synchronization:rw_lock_wrapper", > "../system_wrappers", > 
"../system_wrappers:field_trial", >@@ -236,7 +242,6 @@ rtc_source_set("video_stream_api") { > ] > deps = [ > ":rtp_interfaces", >- "../:webrtc_common", > "../api:libjingle_peerconnection_api", > "../api:transport_api", > "../api/video:video_frame", >@@ -257,6 +262,9 @@ rtc_source_set("simulated_network") { > ] > deps = [ > "../api:simulated_network_api", >+ "../api/units:data_rate", >+ "../api/units:data_size", >+ "../api/units:time_delta", > "../rtc_base:rtc_base_approved", > "//third_party/abseil-cpp/absl/memory", > "//third_party/abseil-cpp/absl/types:optional", >@@ -283,10 +291,10 @@ rtc_source_set("fake_network") { > ":call_interfaces", > ":simulated_network", > ":simulated_packet_receiver", >- "..:webrtc_common", > "../api:simulated_network_api", > "../api:transport_api", > "../modules:module_api", >+ "../modules/utility", > "../rtc_base:rtc_base_approved", > "../rtc_base:sequenced_task_checker", > "../system_wrappers", >@@ -322,8 +330,9 @@ if (rtc_include_tests) { > ":rtp_receiver", > ":rtp_sender", > ":simulated_network", >- "..:webrtc_common", > "../api:array_view", >+ "../api:fake_media_transport", >+ "../api:fake_media_transport", > "../api:libjingle_peerconnection_api", > "../api:mock_audio_mixer", > "../api/audio_codecs:builtin_audio_decoder_factory", >@@ -359,6 +368,7 @@ if (rtc_include_tests) { > "//testing/gmock", > "//testing/gtest", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/types:optional", > ] > if (!build_with_chromium && is_clang) { > # Suppress warnings from the Chromium Clang plugin (bugs.webrtc.org/163). 
>@@ -378,12 +388,13 @@ if (rtc_include_tests) { > ":call_interfaces", > ":simulated_network", > ":video_stream_api", >- "..:webrtc_common", > "../api:simulated_network_api", > "../api/audio_codecs:builtin_audio_encoder_factory", >+ "../api/video:builtin_video_bitrate_allocator_factory", > "../api/video:video_bitrate_allocation", > "../api/video_codecs:video_codecs_api", > "../logging:rtc_event_log_api", >+ "../logging:rtc_event_log_impl_output", > "../modules/audio_coding", > "../modules/audio_device", > "../modules/audio_device:audio_device_impl", >@@ -426,6 +437,7 @@ if (rtc_include_tests) { > "../modules/pacing", > "../rtc_base:rate_limiter", > "../rtc_base:rtc_base", >+ "../rtc_base/network:sent_packet", > "../test:test_support", > ] > } >@@ -437,7 +449,7 @@ if (rtc_include_tests) { > ] > deps = [ > ":bitrate_allocator", >- "//test:test_support", >+ "../test:test_support", > ] > } > rtc_source_set("mock_call_interfaces") { >@@ -448,11 +460,14 @@ if (rtc_include_tests) { > ] > deps = [ > ":call_interfaces", >- "//test:test_support", >+ "../test:test_support", > ] > } > > rtc_test("fake_network_unittests") { >+ sources = [ >+ "test/fake_network_pipe_unittest.cc", >+ ] > deps = [ > ":call_interfaces", > ":fake_network", >@@ -462,10 +477,8 @@ if (rtc_include_tests) { > "../system_wrappers", > "../test:test_common", > "../test:test_main", >+ "../test:test_support", > "//testing/gtest", > ] >- sources = [ >- "test/fake_network_pipe_unittest.cc", >- ] > } > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/audio_receive_stream.h b/Source/ThirdParty/libwebrtc/Source/webrtc/call/audio_receive_stream.h >index 9c890b1cda375853141f84bb6dd14a799aabb1f6..36cc05939674bef1b125b51a4d311efbe28443e7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/audio_receive_stream.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/audio_receive_stream.h >@@ -19,10 +19,11 @@ > #include "absl/types/optional.h" > #include "api/audio_codecs/audio_decoder_factory.h" > 
#include "api/call/transport.h" >+#include "api/crypto/cryptooptions.h" >+#include "api/media_transport_interface.h" > #include "api/rtpparameters.h" > #include "api/rtpreceiverinterface.h" > #include "call/rtp_config.h" >-#include "common_types.h" // NOLINT(build/include) > #include "rtc_base/scoped_ref_ptr.h" > > namespace webrtc { >@@ -62,6 +63,7 @@ class AudioReceiveStream { > float secondary_discarded_rate = 0.0f; > float accelerate_rate = 0.0f; > float preemptive_expand_rate = 0.0f; >+ uint64_t delayed_packet_outage_samples = 0; > int32_t decoding_calls_to_silence_generator = 0; > int32_t decoding_calls_to_neteq = 0; > int32_t decoding_normal = 0; >@@ -70,6 +72,7 @@ class AudioReceiveStream { > int32_t decoding_plc_cng = 0; > int32_t decoding_muted_output = 0; > int64_t capture_start_ntp_time_ms = 0; >+ uint64_t jitter_buffer_flushes = 0; > }; > > struct Config { >@@ -106,9 +109,12 @@ class AudioReceiveStream { > > Transport* rtcp_send_transport = nullptr; > >+ MediaTransportInterface* media_transport = nullptr; >+ > // NetEq settings. > size_t jitter_buffer_max_packets = 50; > bool jitter_buffer_fast_accelerate = false; >+ int jitter_buffer_min_delay_ms = 0; > > // Identifier for an A/V synchronization group. Empty string to disable. > // TODO(pbos): Synchronize streams in a sync group, not just one video >@@ -122,6 +128,9 @@ class AudioReceiveStream { > > absl::optional<AudioCodecPairId> codec_pair_id; > >+ // Per PeerConnection crypto options. >+ webrtc::CryptoOptions crypto_options; >+ > // An optional custom frame decryptor that allows the entire frame to be > // decrypted in whatever way the caller choses. This is not required by > // default. 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/audio_send_stream.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/audio_send_stream.cc >index e2cf20cf84480e6f47ec7d33470205c30b45652c..303b49cbfb319667218785d824067b77aa8e9579 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/audio_send_stream.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/audio_send_stream.cc >@@ -9,6 +9,9 @@ > */ > > #include "call/audio_send_stream.h" >+ >+#include <stddef.h> >+ > #include "rtc_base/stringencode.h" > #include "rtc_base/strings/audio_format_to_string.h" > #include "rtc_base/strings/string_builder.h" >@@ -18,8 +21,12 @@ namespace webrtc { > AudioSendStream::Stats::Stats() = default; > AudioSendStream::Stats::~Stats() = default; > >+AudioSendStream::Config::Config(Transport* send_transport, >+ MediaTransportInterface* media_transport) >+ : send_transport(send_transport), media_transport(media_transport) {} >+ > AudioSendStream::Config::Config(Transport* send_transport) >- : send_transport(send_transport) {} >+ : Config(send_transport, nullptr) {} > > AudioSendStream::Config::~Config() = default; > >@@ -27,7 +34,9 @@ std::string AudioSendStream::Config::ToString() const { > char buf[1024]; > rtc::SimpleStringBuilder ss(buf); > ss << "{rtp: " << rtp.ToString(); >+ ss << ", rtcp_report_interval_ms: " << rtcp_report_interval_ms; > ss << ", send_transport: " << (send_transport ? "(Transport)" : "null"); >+ ss << ", media_transport: " << (media_transport ? "(Transport)" : "null"); > ss << ", min_bitrate_bps: " << min_bitrate_bps; > ss << ", max_bitrate_bps: " << max_bitrate_bps; > ss << ", send_codec_spec: " >@@ -44,6 +53,7 @@ std::string AudioSendStream::Config::Rtp::ToString() const { > char buf[1024]; > rtc::SimpleStringBuilder ss(buf); > ss << "{ssrc: " << ssrc; >+ ss << ", extmap-allow-mixed: " << (extmap_allow_mixed ? 
"true" : "false"); > ss << ", extensions: ["; > for (size_t i = 0; i < extensions.size(); ++i) { > ss << extensions[i].ToString(); >@@ -52,7 +62,6 @@ std::string AudioSendStream::Config::Rtp::ToString() const { > } > } > ss << ']'; >- ss << ", nack: " << nack.ToString(); > ss << ", c_name: " << c_name; > ss << '}'; > return ss.str(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/audio_send_stream.h b/Source/ThirdParty/libwebrtc/Source/webrtc/call/audio_send_stream.h >index 61e25312229e90015ae476faec28ffe3a361ed0a..c996dab83d70005505229e244af493e7e8d2f82f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/audio_send_stream.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/audio_send_stream.h >@@ -21,7 +21,9 @@ > #include "api/audio_codecs/audio_encoder_factory.h" > #include "api/audio_codecs/audio_format.h" > #include "api/call/transport.h" >+#include "api/crypto/cryptooptions.h" > #include "api/crypto/frameencryptorinterface.h" >+#include "api/media_transport_interface.h" > #include "api/rtpparameters.h" > #include "call/rtp_config.h" > #include "modules/audio_processing/include/audio_processing_statistics.h" >@@ -57,10 +59,13 @@ class AudioSendStream { > > ANAStats ana_statistics; > AudioProcessingStats apm_statistics; >+ >+ int64_t target_bitrate_bps = 0; > }; > > struct Config { > Config() = delete; >+ Config(Transport* send_transport, MediaTransportInterface* media_transport); > explicit Config(Transport* send_transport); > ~Config(); > std::string ToString() const; >@@ -78,20 +83,25 @@ class AudioSendStream { > // included in the list of extensions. > std::string mid; > >+ // Corresponds to the SDP attribute extmap-allow-mixed. >+ bool extmap_allow_mixed = false; >+ > // RTP header extensions used for the sent stream. > std::vector<RtpExtension> extensions; > >- // See NackConfig for description. >- NackConfig nack; >- > // RTCP CNAME, see RFC 3550. 
> std::string c_name; > } rtp; > >+ // Time interval between RTCP report for audio >+ int rtcp_report_interval_ms = 5000; >+ > // Transport for outgoing packets. The transport is expected to exist for > // the entire life of the AudioSendStream and is owned by the API client. > Transport* send_transport = nullptr; > >+ MediaTransportInterface* media_transport = nullptr; >+ > // Bitrate limits used for variable audio bitrate streams. Set both to -1 to > // disable audio bitrate adaptation. > // Note: This is still an experimental feature and not ready for real usage. >@@ -99,6 +109,7 @@ class AudioSendStream { > int max_bitrate_bps = -1; > > double bitrate_priority = 1.0; >+ bool has_dscp = false; > > // Defines whether to turn on audio network adaptor, and defines its config > // string. >@@ -130,6 +141,9 @@ class AudioSendStream { > // Track ID as specified during track creation. > std::string track_id; > >+ // Per PeerConnection crypto options. >+ webrtc::CryptoOptions crypto_options; >+ > // An optional custom frame encryptor that allows the entire frame to be > // encryptor in whatever way the caller choses. This is not required by > // default. 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/bitrate_allocator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/bitrate_allocator.cc >index acb19ae6fb2bff0bcda025f38e95670fcaf0e28a..660981c30e399edeb069a151640c6d35788edbf9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/bitrate_allocator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/bitrate_allocator.cc >@@ -16,6 +16,8 @@ > #include <memory> > #include <utility> > >+#include "api/units/data_rate.h" >+#include "api/units/time_delta.h" > #include "modules/bitrate_controller/include/bitrate_controller.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" >@@ -50,10 +52,12 @@ double MediaRatio(uint32_t allocated_bitrate, uint32_t protection_bitrate) { > > BitrateAllocator::BitrateAllocator(LimitObserver* limit_observer) > : limit_observer_(limit_observer), >- last_bitrate_bps_(0), >+ last_target_bps_(0), >+ last_link_capacity_bps_(0), > last_non_zero_bitrate_bps_(kDefaultBitrateBps), > last_fraction_loss_(0), > last_rtt_(0), >+ last_bwe_period_ms_(1000), > num_pause_events_(0), > clock_(Clock::GetRealTimeClock()), > last_bwe_log_time_(0), >@@ -88,11 +92,13 @@ uint8_t BitrateAllocator::GetTransmissionMaxBitrateMultiplier() { > } > > void BitrateAllocator::OnNetworkChanged(uint32_t target_bitrate_bps, >+ uint32_t link_capacity_bps, > uint8_t fraction_loss, > int64_t rtt, > int64_t bwe_period_ms) { > RTC_DCHECK_CALLED_SEQUENTIALLY(&sequenced_checker_); >- last_bitrate_bps_ = target_bitrate_bps; >+ last_target_bps_ = target_bitrate_bps; >+ last_link_capacity_bps_ = link_capacity_bps; > last_non_zero_bitrate_bps_ = > target_bitrate_bps > 0 ? 
target_bitrate_bps : last_non_zero_bitrate_bps_; > last_fraction_loss_ = fraction_loss; >@@ -107,11 +113,18 @@ void BitrateAllocator::OnNetworkChanged(uint32_t target_bitrate_bps, > } > > ObserverAllocation allocation = AllocateBitrates(target_bitrate_bps); >+ ObserverAllocation bandwidth_allocation = AllocateBitrates(link_capacity_bps); > > for (auto& config : bitrate_observer_configs_) { > uint32_t allocated_bitrate = allocation[config.observer]; >- uint32_t protection_bitrate = config.observer->OnBitrateUpdated( >- allocated_bitrate, last_fraction_loss_, last_rtt_, last_bwe_period_ms_); >+ uint32_t allocated_bandwidth = bandwidth_allocation[config.observer]; >+ BitrateAllocationUpdate update; >+ update.target_bitrate = DataRate::bps(allocated_bitrate); >+ update.link_capacity = DataRate::bps(allocated_bandwidth); >+ update.packet_loss_ratio = last_fraction_loss_ / 256.0; >+ update.round_trip_time = TimeDelta::ms(last_rtt_); >+ update.bwe_period = TimeDelta::ms(last_bwe_period_ms_); >+ uint32_t protection_bitrate = config.observer->OnBitrateUpdated(update); > > if (allocated_bitrate == 0 && config.allocated_bitrate_bps > 0) { > if (target_bitrate_bps > 0) >@@ -163,15 +176,22 @@ void BitrateAllocator::AddObserver(BitrateAllocatorObserver* observer, > config.bitrate_priority, config.has_packet_feedback)); > } > >- ObserverAllocation allocation; >- if (last_bitrate_bps_ > 0) { >+ if (last_target_bps_ > 0) { > // Calculate a new allocation and update all observers. 
>- allocation = AllocateBitrates(last_bitrate_bps_); >+ >+ ObserverAllocation allocation = AllocateBitrates(last_target_bps_); >+ ObserverAllocation bandwidth_allocation = >+ AllocateBitrates(last_link_capacity_bps_); > for (auto& config : bitrate_observer_configs_) { > uint32_t allocated_bitrate = allocation[config.observer]; >- uint32_t protection_bitrate = config.observer->OnBitrateUpdated( >- allocated_bitrate, last_fraction_loss_, last_rtt_, >- last_bwe_period_ms_); >+ uint32_t bandwidth = bandwidth_allocation[config.observer]; >+ BitrateAllocationUpdate update; >+ update.target_bitrate = DataRate::bps(allocated_bitrate); >+ update.link_capacity = DataRate::bps(bandwidth); >+ update.packet_loss_ratio = last_fraction_loss_ / 256.0; >+ update.round_trip_time = TimeDelta::ms(last_rtt_); >+ update.bwe_period = TimeDelta::ms(last_bwe_period_ms_); >+ uint32_t protection_bitrate = config.observer->OnBitrateUpdated(update); > config.allocated_bitrate_bps = allocated_bitrate; > if (allocated_bitrate > 0) > config.media_ratio = MediaRatio(allocated_bitrate, protection_bitrate); >@@ -180,9 +200,14 @@ void BitrateAllocator::AddObserver(BitrateAllocatorObserver* observer, > // Currently, an encoder is not allowed to produce frames. > // But we still have to return the initial config bitrate + let the > // observer know that it can not produce frames. 
>- allocation = AllocateBitrates(last_non_zero_bitrate_bps_); >- observer->OnBitrateUpdated(0, last_fraction_loss_, last_rtt_, >- last_bwe_period_ms_); >+ >+ BitrateAllocationUpdate update; >+ update.target_bitrate = DataRate::Zero(); >+ update.link_capacity = DataRate::Zero(); >+ update.packet_loss_ratio = last_fraction_loss_ / 256.0; >+ update.round_trip_time = TimeDelta::ms(last_rtt_); >+ update.bwe_period = TimeDelta::ms(last_bwe_period_ms_); >+ observer->OnBitrateUpdated(update); > } > UpdateAllocationLimits(); > } >@@ -248,7 +273,8 @@ void BitrateAllocator::RemoveObserver(BitrateAllocatorObserver* observer) { > UpdateAllocationLimits(); > } > >-int BitrateAllocator::GetStartBitrate(BitrateAllocatorObserver* observer) { >+int BitrateAllocator::GetStartBitrate( >+ BitrateAllocatorObserver* observer) const { > RTC_DCHECK_CALLED_SEQUENTIALLY(&sequenced_checker_); > const auto& it = FindObserverConfig(observer); > if (it == bitrate_observer_configs_.end()) { >@@ -272,6 +298,17 @@ void BitrateAllocator::SetBitrateAllocationStrategy( > bitrate_allocation_strategy_ = std::move(bitrate_allocation_strategy); > } > >+BitrateAllocator::ObserverConfigs::const_iterator >+BitrateAllocator::FindObserverConfig( >+ const BitrateAllocatorObserver* observer) const { >+ for (auto it = bitrate_observer_configs_.begin(); >+ it != bitrate_observer_configs_.end(); ++it) { >+ if (it->observer == observer) >+ return it; >+ } >+ return bitrate_observer_configs_.end(); >+} >+ > BitrateAllocator::ObserverConfigs::iterator > BitrateAllocator::FindObserverConfig(const BitrateAllocatorObserver* observer) { > for (auto it = bitrate_observer_configs_.begin(); >@@ -283,19 +320,19 @@ BitrateAllocator::FindObserverConfig(const BitrateAllocatorObserver* observer) { > } > > BitrateAllocator::ObserverAllocation BitrateAllocator::AllocateBitrates( >- uint32_t bitrate) { >+ uint32_t bitrate) const { > if (bitrate_observer_configs_.empty()) > return ObserverAllocation(); > > if 
(bitrate_allocation_strategy_ != nullptr) { >- std::vector<const rtc::BitrateAllocationStrategy::TrackConfig*> >- track_configs(bitrate_observer_configs_.size()); >- int i = 0; >- for (const auto& c : bitrate_observer_configs_) { >- track_configs[i++] = &c; >- } >+ // Note: This intentionally causes slicing, we only copy the fields in >+ // ObserverConfig that are inherited from TrackConfig. >+ std::vector<rtc::BitrateAllocationStrategy::TrackConfig> track_configs( >+ bitrate_observer_configs_.begin(), bitrate_observer_configs_.end()); >+ > std::vector<uint32_t> track_allocations = >- bitrate_allocation_strategy_->AllocateBitrates(bitrate, track_configs); >+ bitrate_allocation_strategy_->AllocateBitrates( >+ bitrate, std::move(track_configs)); > // The strategy should return allocation for all tracks. > RTC_CHECK(track_allocations.size() == bitrate_observer_configs_.size()); > ObserverAllocation allocation; >@@ -331,7 +368,8 @@ BitrateAllocator::ObserverAllocation BitrateAllocator::AllocateBitrates( > return MaxRateAllocation(bitrate, sum_max_bitrates); > } > >-BitrateAllocator::ObserverAllocation BitrateAllocator::ZeroRateAllocation() { >+BitrateAllocator::ObserverAllocation BitrateAllocator::ZeroRateAllocation() >+ const { > ObserverAllocation allocation; > for (const auto& observer_config : bitrate_observer_configs_) > allocation[observer_config.observer] = 0; >@@ -339,7 +377,7 @@ BitrateAllocator::ObserverAllocation BitrateAllocator::ZeroRateAllocation() { > } > > BitrateAllocator::ObserverAllocation BitrateAllocator::LowRateAllocation( >- uint32_t bitrate) { >+ uint32_t bitrate) const { > ObserverAllocation allocation; > // Start by allocating bitrate to observers enforcing a min bitrate, hence > // remaining_bitrate might turn negative. >@@ -400,7 +438,7 @@ BitrateAllocator::ObserverAllocation BitrateAllocator::LowRateAllocation( > // min_bitrate_bps values, until one of the observers hits its max_bitrate_bps. 
> BitrateAllocator::ObserverAllocation BitrateAllocator::NormalRateAllocation( > uint32_t bitrate, >- uint32_t sum_min_bitrates) { >+ uint32_t sum_min_bitrates) const { > ObserverAllocation allocation; > ObserverAllocation observers_capacities; > for (const auto& observer_config : bitrate_observer_configs_) { >@@ -420,7 +458,7 @@ BitrateAllocator::ObserverAllocation BitrateAllocator::NormalRateAllocation( > > BitrateAllocator::ObserverAllocation BitrateAllocator::MaxRateAllocation( > uint32_t bitrate, >- uint32_t sum_max_bitrates) { >+ uint32_t sum_max_bitrates) const { > ObserverAllocation allocation; > > for (const auto& observer_config : bitrate_observer_configs_) { >@@ -457,10 +495,11 @@ uint32_t BitrateAllocator::ObserverConfig::MinBitrateWithHysteresis() const { > return min_bitrate; > } > >-void BitrateAllocator::DistributeBitrateEvenly(uint32_t bitrate, >- bool include_zero_allocations, >- int max_multiplier, >- ObserverAllocation* allocation) { >+void BitrateAllocator::DistributeBitrateEvenly( >+ uint32_t bitrate, >+ bool include_zero_allocations, >+ int max_multiplier, >+ ObserverAllocation* allocation) const { > RTC_DCHECK_EQ(allocation->size(), bitrate_observer_configs_.size()); > > ObserverSortingMap list_max_bitrates; >@@ -491,8 +530,9 @@ void BitrateAllocator::DistributeBitrateEvenly(uint32_t bitrate, > } > } > >-bool BitrateAllocator::EnoughBitrateForAllObservers(uint32_t bitrate, >- uint32_t sum_min_bitrates) { >+bool BitrateAllocator::EnoughBitrateForAllObservers( >+ uint32_t bitrate, >+ uint32_t sum_min_bitrates) const { > if (bitrate < sum_min_bitrates) > return false; > >@@ -511,7 +551,7 @@ bool BitrateAllocator::EnoughBitrateForAllObservers(uint32_t bitrate, > void BitrateAllocator::DistributeBitrateRelatively( > uint32_t remaining_bitrate, > const ObserverAllocation& observers_capacities, >- ObserverAllocation* allocation) { >+ ObserverAllocation* allocation) const { > RTC_DCHECK_EQ(allocation->size(), bitrate_observer_configs_.size()); > 
RTC_DCHECK_EQ(observers_capacities.size(), bitrate_observer_configs_.size()); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/bitrate_allocator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/call/bitrate_allocator.h >index 47df7e01512eaa327d7016022eb185fe25148582..059f77b967179ab02354e001026cd5e43149d66c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/bitrate_allocator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/bitrate_allocator.h >@@ -19,6 +19,7 @@ > #include <utility> > #include <vector> > >+#include "api/call/bitrate_allocation.h" > #include "rtc_base/bitrateallocationstrategy.h" > #include "rtc_base/sequenced_task_checker.h" > >@@ -34,10 +35,7 @@ class BitrateAllocatorObserver { > public: > // Returns the amount of protection used by the BitrateAllocatorObserver > // implementation, as bitrate in bps. >- virtual uint32_t OnBitrateUpdated(uint32_t bitrate_bps, >- uint8_t fraction_loss, >- int64_t rtt, >- int64_t bwe_period_ms) = 0; >+ virtual uint32_t OnBitrateUpdated(BitrateAllocationUpdate update) = 0; > > protected: > virtual ~BitrateAllocatorObserver() {} >@@ -73,7 +71,7 @@ class BitrateAllocatorInterface { > virtual void AddObserver(BitrateAllocatorObserver* observer, > MediaStreamAllocationConfig config) = 0; > virtual void RemoveObserver(BitrateAllocatorObserver* observer) = 0; >- virtual int GetStartBitrate(BitrateAllocatorObserver* observer) = 0; >+ virtual int GetStartBitrate(BitrateAllocatorObserver* observer) const = 0; > > protected: > virtual ~BitrateAllocatorInterface() = default; >@@ -104,6 +102,7 @@ class BitrateAllocator : public BitrateAllocatorInterface { > > // Allocate target_bitrate across the registered BitrateAllocatorObservers. 
> void OnNetworkChanged(uint32_t target_bitrate_bps, >+ uint32_t link_capacity_bps, > uint8_t fraction_loss, > int64_t rtt, > int64_t bwe_period_ms); >@@ -120,7 +119,7 @@ class BitrateAllocator : public BitrateAllocatorInterface { > > // Returns initial bitrate allocated for |observer|. If |observer| is not in > // the list of added observers, a best guess is returned. >- int GetStartBitrate(BitrateAllocatorObserver* observer) override; >+ int GetStartBitrate(BitrateAllocatorObserver* observer) const override; > > // Sets external allocation strategy. If strategy is not set default WebRTC > // allocation mechanism will be used. The strategy may be changed during call. >@@ -171,31 +170,34 @@ class BitrateAllocator : public BitrateAllocatorInterface { > void UpdateAllocationLimits() RTC_RUN_ON(&sequenced_checker_); > > typedef std::vector<ObserverConfig> ObserverConfigs; >+ ObserverConfigs::const_iterator FindObserverConfig( >+ const BitrateAllocatorObserver* observer) const >+ RTC_RUN_ON(&sequenced_checker_); > ObserverConfigs::iterator FindObserverConfig( > const BitrateAllocatorObserver* observer) RTC_RUN_ON(&sequenced_checker_); > > typedef std::multimap<uint32_t, const ObserverConfig*> ObserverSortingMap; > typedef std::map<BitrateAllocatorObserver*, int> ObserverAllocation; > >- ObserverAllocation AllocateBitrates(uint32_t bitrate) >+ ObserverAllocation AllocateBitrates(uint32_t bitrate) const > RTC_RUN_ON(&sequenced_checker_); > > // Allocates zero bitrate to all observers. >- ObserverAllocation ZeroRateAllocation() RTC_RUN_ON(&sequenced_checker_); >+ ObserverAllocation ZeroRateAllocation() const RTC_RUN_ON(&sequenced_checker_); > // Allocates bitrate to observers when there isn't enough to allocate the > // minimum to all observers. 
>- ObserverAllocation LowRateAllocation(uint32_t bitrate) >+ ObserverAllocation LowRateAllocation(uint32_t bitrate) const > RTC_RUN_ON(&sequenced_checker_); > // Allocates bitrate to all observers when the available bandwidth is enough > // to allocate the minimum to all observers but not enough to allocate the > // max bitrate of each observer. > ObserverAllocation NormalRateAllocation(uint32_t bitrate, >- uint32_t sum_min_bitrates) >+ uint32_t sum_min_bitrates) const > RTC_RUN_ON(&sequenced_checker_); > // Allocates bitrate to observers when there is enough available bandwidth > // for all observers to be allocated their max bitrate. > ObserverAllocation MaxRateAllocation(uint32_t bitrate, >- uint32_t sum_max_bitrates) >+ uint32_t sum_max_bitrates) const > RTC_RUN_ON(&sequenced_checker_); > > // Splits |bitrate| evenly to observers already in |allocation|. >@@ -205,9 +207,10 @@ class BitrateAllocator : public BitrateAllocatorInterface { > void DistributeBitrateEvenly(uint32_t bitrate, > bool include_zero_allocations, > int max_multiplier, >- ObserverAllocation* allocation) >+ ObserverAllocation* allocation) const > RTC_RUN_ON(&sequenced_checker_); >- bool EnoughBitrateForAllObservers(uint32_t bitrate, uint32_t sum_min_bitrates) >+ bool EnoughBitrateForAllObservers(uint32_t bitrate, >+ uint32_t sum_min_bitrates) const > RTC_RUN_ON(&sequenced_checker_); > > // From the available |bitrate|, each observer will be allocated a >@@ -219,7 +222,7 @@ class BitrateAllocator : public BitrateAllocatorInterface { > void DistributeBitrateRelatively( > uint32_t bitrate, > const ObserverAllocation& observers_capacities, >- ObserverAllocation* allocation) RTC_RUN_ON(&sequenced_checker_); >+ ObserverAllocation* allocation) const RTC_RUN_ON(&sequenced_checker_); > > // Allow packets to be transmitted in up to 2 times max video bitrate if the > // bandwidth estimate allows it. 
>@@ -232,7 +235,8 @@ class BitrateAllocator : public BitrateAllocatorInterface { > LimitObserver* const limit_observer_ RTC_GUARDED_BY(&sequenced_checker_); > // Stored in a list to keep track of the insertion order. > ObserverConfigs bitrate_observer_configs_ RTC_GUARDED_BY(&sequenced_checker_); >- uint32_t last_bitrate_bps_ RTC_GUARDED_BY(&sequenced_checker_); >+ uint32_t last_target_bps_ RTC_GUARDED_BY(&sequenced_checker_); >+ uint32_t last_link_capacity_bps_ RTC_GUARDED_BY(&sequenced_checker_); > uint32_t last_non_zero_bitrate_bps_ RTC_GUARDED_BY(&sequenced_checker_); > uint8_t last_fraction_loss_ RTC_GUARDED_BY(&sequenced_checker_); > int64_t last_rtt_ RTC_GUARDED_BY(&sequenced_checker_); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/bitrate_allocator_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/bitrate_allocator_unittest.cc >index 2961fd409ce98923487a2c3dc64b93da964724ad..0771f798edcd8b19d379759bc8e44b08e23c36f2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/bitrate_allocator_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/bitrate_allocator_unittest.cc >@@ -59,15 +59,13 @@ class TestBitrateObserver : public BitrateAllocatorObserver { > protection_ratio_ = protection_ratio; > } > >- uint32_t OnBitrateUpdated(uint32_t bitrate_bps, >- uint8_t fraction_loss, >- int64_t rtt, >- int64_t probing_interval_ms) override { >- last_bitrate_bps_ = bitrate_bps; >- last_fraction_loss_ = fraction_loss; >- last_rtt_ms_ = rtt; >- last_probing_interval_ms_ = probing_interval_ms; >- return bitrate_bps * protection_ratio_; >+ uint32_t OnBitrateUpdated(BitrateAllocationUpdate update) override { >+ last_bitrate_bps_ = update.target_bitrate.bps(); >+ last_fraction_loss_ = >+ rtc::dchecked_cast<uint8_t>(update.packet_loss_ratio * 256); >+ last_rtt_ms_ = update.round_trip_time.ms(); >+ last_probing_interval_ms_ = update.bwe_period.ms(); >+ return update.target_bitrate.bps() * protection_ratio_; > } > uint32_t 
last_bitrate_bps_; > uint8_t last_fraction_loss_; >@@ -76,6 +74,18 @@ class TestBitrateObserver : public BitrateAllocatorObserver { > double protection_ratio_; > }; > >+class BitrateAllocatorForTest : public BitrateAllocator { >+ public: >+ using BitrateAllocator::BitrateAllocator; >+ void OnNetworkChanged(uint32_t target_bitrate_bps, >+ uint8_t fraction_loss, >+ int64_t rtt, >+ int64_t bwe_period_ms) { >+ BitrateAllocator::OnNetworkChanged(target_bitrate_bps, target_bitrate_bps, >+ fraction_loss, rtt, bwe_period_ms); >+ } >+}; >+ > namespace { > constexpr int64_t kDefaultProbingIntervalMs = 3000; > const double kDefaultBitratePriority = 1.0; >@@ -83,7 +93,8 @@ const double kDefaultBitratePriority = 1.0; > > class BitrateAllocatorTest : public ::testing::Test { > protected: >- BitrateAllocatorTest() : allocator_(new BitrateAllocator(&limit_observer_)) { >+ BitrateAllocatorTest() >+ : allocator_(new BitrateAllocatorForTest(&limit_observer_)) { > allocator_->OnNetworkChanged(300000u, 0, 0, kDefaultProbingIntervalMs); > } > ~BitrateAllocatorTest() {} >@@ -100,7 +111,7 @@ class BitrateAllocatorTest : public ::testing::Test { > } > > NiceMock<MockLimitObserver> limit_observer_; >- std::unique_ptr<BitrateAllocator> allocator_; >+ std::unique_ptr<BitrateAllocatorForTest> allocator_; > }; > > TEST_F(BitrateAllocatorTest, UpdatingBitrateObserver) { >@@ -216,7 +227,7 @@ TEST_F(BitrateAllocatorTest, RemoveObserverTriggersLimitObserver) { > class BitrateAllocatorTestNoEnforceMin : public ::testing::Test { > protected: > BitrateAllocatorTestNoEnforceMin() >- : allocator_(new BitrateAllocator(&limit_observer_)) { >+ : allocator_(new BitrateAllocatorForTest(&limit_observer_)) { > allocator_->OnNetworkChanged(300000u, 0, 0, kDefaultProbingIntervalMs); > } > ~BitrateAllocatorTestNoEnforceMin() {} >@@ -232,7 +243,7 @@ class BitrateAllocatorTestNoEnforceMin : public ::testing::Test { > enforce_min_bitrate, track_id, bitrate_priority, false}); > } > NiceMock<MockLimitObserver> 
limit_observer_; >- std::unique_ptr<BitrateAllocator> allocator_; >+ std::unique_ptr<BitrateAllocatorForTest> allocator_; > }; > > // The following three tests verify enforcing a minimum bitrate works as >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/bitrate_estimator_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/bitrate_estimator_tests.cc >index 766e38c9788cc409ed479898ca1b731c33c47734..0862827d3a24a396d30dc774247c5a6c21bbe227 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/bitrate_estimator_tests.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/bitrate_estimator_tests.cc >@@ -46,8 +46,6 @@ class LogObserver { > private: > class Callback : public rtc::LogSink { > public: >- Callback() : done_(false, false) {} >- > void OnLogMessage(const std::string& message) override { > rtc::CritScope lock(&crit_sect_); > // Ignore log lines that are due to missing AST extensions, these are >@@ -111,14 +109,14 @@ class BitrateEstimatorTest : public test::CallTest { > &task_queue_, > absl::make_unique<FakeNetworkPipe>( > Clock::GetRealTimeClock(), absl::make_unique<SimulatedNetwork>( >- DefaultNetworkSimulationConfig())), >+ BuiltInNetworkBehaviorConfig())), > sender_call_.get(), payload_type_map_)); > send_transport_->SetReceiver(receiver_call_->Receiver()); > receive_transport_.reset(new test::DirectTransport( > &task_queue_, > absl::make_unique<FakeNetworkPipe>( > Clock::GetRealTimeClock(), absl::make_unique<SimulatedNetwork>( >- DefaultNetworkSimulationConfig())), >+ BuiltInNetworkBehaviorConfig())), > receiver_call_.get(), payload_type_map_)); > receive_transport_->SetReceiver(sender_call_->Receiver()); > >@@ -126,6 +124,8 @@ class BitrateEstimatorTest : public test::CallTest { > video_send_config.rtp.ssrcs.push_back(kVideoSendSsrcs[0]); > video_send_config.encoder_settings.encoder_factory = > &fake_encoder_factory_; >+ video_send_config.encoder_settings.bitrate_allocator_factory = >+ bitrate_allocator_factory_.get(); > 
video_send_config.rtp.payload_name = "FAKE"; > video_send_config.rtp.payload_type = kFakeVideoSendPayloadType; > SetVideoSendConfig(video_send_config); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/call.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/call.cc >index 635874b2d624a652c963af0c9c88701ecf4ccaa3..95f64e92499058d25f2295e599636c9ba9ef7a60 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/call.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/call.cc >@@ -22,7 +22,6 @@ > #include "audio/audio_receive_stream.h" > #include "audio/audio_send_stream.h" > #include "audio/audio_state.h" >-#include "audio/time_interval.h" > #include "call/bitrate_allocator.h" > #include "call/call.h" > #include "call/flexfec_receive_stream_impl.h" >@@ -30,7 +29,6 @@ > #include "call/rtp_stream_receiver_controller.h" > #include "call/rtp_transport_controller_send.h" > #include "logging/rtc_event_log/events/rtc_event_audio_receive_stream_config.h" >-#include "logging/rtc_event_log/events/rtc_event_audio_send_stream_config.h" > #include "logging/rtc_event_log/events/rtc_event_rtcp_packet_incoming.h" > #include "logging/rtc_event_log/events/rtc_event_rtp_packet_incoming.h" > #include "logging/rtc_event_log/events/rtc_event_video_receive_stream_config.h" >@@ -56,6 +54,7 @@ > #include "rtc_base/synchronization/rw_lock_wrapper.h" > #include "rtc_base/task_queue.h" > #include "rtc_base/thread_annotations.h" >+#include "rtc_base/timeutils.h" > #include "rtc_base/trace_event.h" > #include "system_wrappers/include/clock.h" > #include "system_wrappers/include/cpu_info.h" >@@ -146,18 +145,6 @@ std::unique_ptr<rtclog::StreamConfig> CreateRtcLogStreamConfig( > return rtclog_config; > } > >-std::unique_ptr<rtclog::StreamConfig> CreateRtcLogStreamConfig( >- const AudioSendStream::Config& config) { >- auto rtclog_config = absl::make_unique<rtclog::StreamConfig>(); >- rtclog_config->local_ssrc = config.rtp.ssrc; >- rtclog_config->rtp_extensions = 
config.rtp.extensions; >- if (config.send_codec_spec) { >- rtclog_config->codecs.emplace_back(config.send_codec_spec->format.name, >- config.send_codec_spec->payload_type, 0); >- } >- return rtclog_config; >-} >- > } // namespace > > namespace internal { >@@ -236,6 +223,15 @@ class Call final : public webrtc::Call, > uint32_t allocated_without_feedback_bps, > bool has_packet_feedback) override; > >+ // This method is invoked when the media transport is created and when the >+ // media transport is being destructed. >+ // We only allow one media transport per connection. >+ // >+ // It should be called with non-null argument at most once, and if it was >+ // called with non-null argument, it has to be called with a null argument >+ // at least once after that. >+ void MediaTransportChange(MediaTransportInterface* media_transport) override; >+ > private: > DeliveryStatus DeliverRtcp(MediaType media_type, > const uint8_t* packet, >@@ -256,6 +252,10 @@ class Call final : public webrtc::Call, > void UpdateHistograms(); > void UpdateAggregateNetworkState(); > >+ // If |media_transport| is not null, it registers the rate observer for the >+ // media transport. >+ void RegisterRateObserver() RTC_LOCKS_EXCLUDED(target_observer_crit_); >+ > Clock* const clock_; > > const int num_cpu_cores_; >@@ -346,7 +346,6 @@ class Call final : public webrtc::Call, > absl::optional<int64_t> last_received_rtp_audio_ms_; > absl::optional<int64_t> first_received_rtp_video_ms_; > absl::optional<int64_t> last_received_rtp_video_ms_; >- TimeInterval sent_rtp_audio_timer_ms_; > > rtc::CriticalSection last_bandwidth_bps_crit_; > uint32_t last_bandwidth_bps_ RTC_GUARDED_BY(&last_bandwidth_bps_crit_); >@@ -374,6 +373,15 @@ class Call final : public webrtc::Call, > // Declared last since it will issue callbacks from a task queue. Declaring it > // last ensures that it is destroyed first and any running tasks are finished. 
> std::unique_ptr<RtpTransportControllerSendInterface> transport_send_; >+ >+ // This is a precaution, since |MediaTransportChange| is not guaranteed to be >+ // invoked on a particular thread. >+ rtc::CriticalSection target_observer_crit_; >+ bool is_target_rate_observer_registered_ >+ RTC_GUARDED_BY(&target_observer_crit_) = false; >+ MediaTransportInterface* media_transport_ >+ RTC_GUARDED_BY(&target_observer_crit_) = nullptr; >+ > RTC_DISALLOW_COPY_AND_ASSIGN(Call); > }; > } // namespace internal >@@ -444,7 +452,6 @@ Call::Call(const Call::Config& config, > video_send_delay_stats_(new SendDelayStats(clock_)), > start_ms_(clock_->TimeInMilliseconds()) { > RTC_DCHECK(config.event_log != nullptr); >- transport_send->RegisterTargetTransferRateObserver(this); > transport_send_ = std::move(transport_send); > transport_send_ptr_ = transport_send_.get(); > >@@ -486,6 +493,43 @@ Call::~Call() { > UpdateHistograms(); > } > >+void Call::RegisterRateObserver() { >+ rtc::CritScope lock(&target_observer_crit_); >+ >+ if (is_target_rate_observer_registered_) { >+ return; >+ } >+ >+ is_target_rate_observer_registered_ = true; >+ >+ if (media_transport_) { >+ media_transport_->AddTargetTransferRateObserver(this); >+ } else { >+ transport_send_ptr_->RegisterTargetTransferRateObserver(this); >+ } >+} >+ >+void Call::MediaTransportChange(MediaTransportInterface* media_transport) { >+ rtc::CritScope lock(&target_observer_crit_); >+ >+ if (is_target_rate_observer_registered_) { >+ // Only used to unregister rate observer from media transport. Registration >+ // happens when the stream is created. 
>+ if (!media_transport && media_transport_) { >+ media_transport_->RemoveTargetTransferRateObserver(this); >+ media_transport_ = nullptr; >+ is_target_rate_observer_registered_ = false; >+ } >+ } else if (media_transport) { >+ RTC_DCHECK(media_transport_ == nullptr || >+ media_transport_ == media_transport) >+ << "media_transport_=" << (media_transport_ != nullptr) >+ << ", (media_transport_==media_transport)=" >+ << (media_transport_ == media_transport); >+ media_transport_ = media_transport; >+ } >+} >+ > void Call::UpdateHistograms() { > RTC_HISTOGRAM_COUNTS_100000( > "WebRTC.Call.LifetimeInSeconds", >@@ -495,11 +539,6 @@ void Call::UpdateHistograms() { > void Call::UpdateSendHistograms(int64_t first_sent_packet_ms) { > if (first_sent_packet_ms == -1) > return; >- if (!sent_rtp_audio_timer_ms_.Empty()) { >- RTC_HISTOGRAM_COUNTS_100000( >- "WebRTC.Call.TimeSendingAudioRtpPacketsInSeconds", >- sent_rtp_audio_timer_ms_.Length() / 1000); >- } > int64_t elapsed_sec = > (clock_->TimeInMilliseconds() - first_sent_packet_ms) / 1000; > if (elapsed_sec < metrics::kMinRunTimeInSeconds) >@@ -578,9 +617,16 @@ webrtc::AudioSendStream* Call::CreateAudioSendStream( > const webrtc::AudioSendStream::Config& config) { > TRACE_EVENT0("webrtc", "Call::CreateAudioSendStream"); > RTC_DCHECK_CALLED_SEQUENTIALLY(&configuration_sequence_checker_); >- event_log_->Log(absl::make_unique<RtcEventAudioSendStreamConfig>( >- CreateRtcLogStreamConfig(config))); > >+ { >+ rtc::CritScope lock(&target_observer_crit_); >+ RTC_DCHECK(media_transport_ == config.media_transport); >+ } >+ >+ RegisterRateObserver(); >+ >+ // Stream config is logged in AudioSendStream::ConfigureStream, as it may >+ // change during the stream's lifetime. 
> absl::optional<RtpState> suspended_rtp_state; > { > const auto& iter = suspended_audio_send_ssrcs_.find(config.rtp.ssrc); >@@ -596,7 +642,7 @@ webrtc::AudioSendStream* Call::CreateAudioSendStream( > config, config_.audio_state, transport_send_ptr_->GetWorkerQueue(), > module_process_thread_.get(), transport_send_ptr_, > bitrate_allocator_.get(), event_log_, call_stats_.get(), >- suspended_rtp_state, &sent_rtp_audio_timer_ms_); >+ suspended_rtp_state); > { > WriteLockScoped write_lock(*send_crit_); > RTC_DCHECK(audio_send_ssrcs_.find(config.rtp.ssrc) == >@@ -708,6 +754,8 @@ webrtc::VideoSendStream* Call::CreateVideoSendStream( > TRACE_EVENT0("webrtc", "Call::CreateVideoSendStream"); > RTC_DCHECK_CALLED_SEQUENTIALLY(&configuration_sequence_checker_); > >+ RegisterRateObserver(); >+ > video_send_delay_stats_->AddSsrcs(config); > for (size_t ssrc_index = 0; ssrc_index < config.rtp.ssrcs.size(); > ++ssrc_index) { >@@ -1044,6 +1092,18 @@ void Call::OnTargetTransferRate(TargetTransferRate msg) { >+ // TODO(bugs.webrtc.org/9719) >+ // Call::OnTargetTransferRate requires that it is invoked >+ // from the worker queue (because bitrate_allocator_ requires it). Media >+ // transport does not guarantee the callback on the worker queue. >+ // When the threading model for MediaTransportInterface is updated, reconsider >+ // changing this implementation. >+ if (!transport_send_ptr_->GetWorkerQueue()->IsCurrent()) { >+ transport_send_ptr_->GetWorkerQueue()->PostTask( >+ [this, msg] { this->OnTargetTransferRate(msg); }); >+ return; >+ } >+ > uint32_t target_bitrate_bps = msg.target_rate.bps(); > int loss_ratio_255 = msg.network_estimate.loss_rate_ratio * 255; > uint8_t fraction_loss = >@@ -1057,8 +1117,9 @@ void Call::OnTargetTransferRate(TargetTransferRate msg) { > } > // For controlling the rate of feedback messages. 
> receive_side_cc_.OnBitrateChanged(target_bitrate_bps); >- bitrate_allocator_->OnNetworkChanged(target_bitrate_bps, fraction_loss, >- rtt_ms, probing_interval_ms); >+ bitrate_allocator_->OnNetworkChanged(target_bitrate_bps, bandwidth_bps, >+ fraction_loss, rtt_ms, >+ probing_interval_ms); > > // Ignore updates if bitrate is zero (the aggregate network state is down). > if (target_bitrate_bps == 0) { >@@ -1215,8 +1276,10 @@ PacketReceiver::DeliveryStatus Call::DeliverRtp(MediaType media_type, > > if (packet_time_us != -1) { > if (receive_time_calculator_) { >+ // Repair packet_time_us for clock resets by comparing a new read of >+ // the same clock (TimeUTCMicros) to a monotonic clock reading. > packet_time_us = receive_time_calculator_->ReconcileReceiveTimes( >- packet_time_us, clock_->TimeInMicroseconds()); >+ packet_time_us, rtc::TimeUTCMicros(), clock_->TimeInMicroseconds()); > } > parsed_packet.set_arrival_time_ms((packet_time_us + 500) / 1000); > } else { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/call.h b/Source/ThirdParty/libwebrtc/Source/webrtc/call/call.h >index 4167296d756c547f434efd6c553ae6971609b3e4..c0e31c0902b3c96dcd0107c7a83e8c5c8f6c814c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/call.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/call.h >@@ -24,11 +24,10 @@ > #include "call/rtp_transport_controller_send_interface.h" > #include "call/video_receive_stream.h" > #include "call/video_send_stream.h" >-#include "common_types.h" // NOLINT(build/include) > #include "rtc_base/bitrateallocationstrategy.h" > #include "rtc_base/copyonwritebuffer.h" >+#include "rtc_base/network/sent_packet.h" > #include "rtc_base/networkroute.h" >-#include "rtc_base/socket.h" > > namespace webrtc { > >@@ -58,6 +57,11 @@ class Call { > > virtual AudioSendStream* CreateAudioSendStream( > const AudioSendStream::Config& config) = 0; >+ >+ // Gets called when media transport is created or removed. 
>+ virtual void MediaTransportChange( >+ MediaTransportInterface* media_transport_interface) = 0; >+ > virtual void DestroyAudioSendStream(AudioSendStream* send_stream) = 0; > > virtual AudioReceiveStream* CreateAudioReceiveStream( >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/call_config.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/call_config.cc >index d3425aa57997a96ef29702b08e130ec4a2b95a8c..b149c889eae028b9dd3983c6e03491516f25d66a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/call_config.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/call_config.cc >@@ -10,6 +10,8 @@ > > #include "call/call_config.h" > >+#include "rtc_base/checks.h" >+ > namespace webrtc { > > CallConfig::CallConfig(RtcEventLog* event_log) : event_log(event_log) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/call_perf_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/call_perf_tests.cc >index 8fcd566f81a8241de0ad2fa6b1a5f64e20971afb..72aa78c5f7770eac1334163917e2fc70cc9c0bee 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/call_perf_tests.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/call_perf_tests.cc >@@ -16,6 +16,7 @@ > #include "absl/memory/memory.h" > #include "api/audio_codecs/builtin_audio_encoder_factory.h" > #include "api/test/simulated_network.h" >+#include "api/video/builtin_video_bitrate_allocator_factory.h" > #include "api/video/video_bitrate_allocation.h" > #include "api/video_codecs/video_encoder_config.h" > #include "call/call.h" >@@ -64,7 +65,7 @@ class CallPerfTest : public test::CallTest { > > void TestMinTransmitBitrate(bool pad_to_min_bitrate); > >- void TestCaptureNtpTime(const DefaultNetworkSimulationConfig& net_config, >+ void TestCaptureNtpTime(const BuiltInNetworkBehaviorConfig& net_config, > int threshold_ms, > int start_time_ms, > int run_time_ms); >@@ -152,7 +153,7 @@ void CallPerfTest::TestAudioVideoSync(FecMode fec, > const uint32_t kAudioSendSsrc = 1234; > const uint32_t 
kAudioRecvSsrc = 5678; > >- DefaultNetworkSimulationConfig audio_net_config; >+ BuiltInNetworkBehaviorConfig audio_net_config; > audio_net_config.queue_delay_ms = 500; > audio_net_config.loss_percent = 5; > >@@ -217,7 +218,7 @@ void CallPerfTest::TestAudioVideoSync(FecMode fec, > test::PacketTransport::kSender, video_pt_map, > absl::make_unique<FakeNetworkPipe>( > Clock::GetRealTimeClock(), absl::make_unique<SimulatedNetwork>( >- DefaultNetworkSimulationConfig()))); >+ BuiltInNetworkBehaviorConfig()))); > video_send_transport->SetReceiver(receiver_call_->Receiver()); > > receive_transport = absl::make_unique<test::PacketTransport>( >@@ -225,13 +226,14 @@ void CallPerfTest::TestAudioVideoSync(FecMode fec, > test::PacketTransport::kReceiver, payload_type_map_, > absl::make_unique<FakeNetworkPipe>( > Clock::GetRealTimeClock(), absl::make_unique<SimulatedNetwork>( >- DefaultNetworkSimulationConfig()))); >+ BuiltInNetworkBehaviorConfig()))); > receive_transport->SetReceiver(sender_call_->Receiver()); > > CreateSendConfig(1, 0, 0, video_send_transport.get()); > CreateMatchingReceiveConfigs(receive_transport.get()); > >- AudioSendStream::Config audio_send_config(audio_send_transport.get()); >+ AudioSendStream::Config audio_send_config(audio_send_transport.get(), >+ /*media_transport=*/nullptr); > audio_send_config.rtp.ssrc = kAudioSendSsrc; > audio_send_config.send_codec_spec = AudioSendStream::Config::SendCodecSpec( > kAudioSendPayloadType, {"ISAC", 16000, 1}); >@@ -337,14 +339,14 @@ TEST_F(CallPerfTest, PlaysOutAudioAndVideoInSyncWithVideoFasterThanAudioDrift) { > } > > void CallPerfTest::TestCaptureNtpTime( >- const DefaultNetworkSimulationConfig& net_config, >+ const BuiltInNetworkBehaviorConfig& net_config, > int threshold_ms, > int start_time_ms, > int run_time_ms) { > class CaptureNtpTimeObserver : public test::EndToEndTest, > public rtc::VideoSinkInterface<VideoFrame> { > public: >- CaptureNtpTimeObserver(const DefaultNetworkSimulationConfig& net_config, >+ 
CaptureNtpTimeObserver(const BuiltInNetworkBehaviorConfig& net_config, > int threshold_ms, > int start_time_ms, > int run_time_ms) >@@ -461,7 +463,7 @@ void CallPerfTest::TestCaptureNtpTime( > } > > rtc::CriticalSection crit_; >- const DefaultNetworkSimulationConfig net_config_; >+ const BuiltInNetworkBehaviorConfig net_config_; > Clock* const clock_; > int threshold_ms_; > int start_time_ms_; >@@ -478,10 +480,10 @@ void CallPerfTest::TestCaptureNtpTime( > RunBaseTest(&test); > } > >-// Flaky tests, disabled on Mac due to webrtc:8291. >-#if !(defined(WEBRTC_MAC)) >+// Flaky tests, disabled on Mac and Windows due to webrtc:8291. >+#if !(defined(WEBRTC_MAC) || defined(WEBRTC_WIN)) > TEST_F(CallPerfTest, CaptureNtpTimeWithNetworkDelay) { >- DefaultNetworkSimulationConfig net_config; >+ BuiltInNetworkBehaviorConfig net_config; > net_config.queue_delay_ms = 100; > // TODO(wu): lower the threshold as the calculation/estimatation becomes more > // accurate. >@@ -492,7 +494,7 @@ TEST_F(CallPerfTest, CaptureNtpTimeWithNetworkDelay) { > } > > TEST_F(CallPerfTest, CaptureNtpTimeWithNetworkJitter) { >- DefaultNetworkSimulationConfig net_config; >+ BuiltInNetworkBehaviorConfig net_config; > net_config.queue_delay_ms = 100; > net_config.delay_standard_deviation_ms = 10; > // TODO(wu): lower the threshold as the calculation/estimatation becomes more >@@ -724,12 +726,13 @@ TEST_F(CallPerfTest, MAYBE_KeepsHighBitrateWhenReconfiguringSender) { > BitrateObserver() > : EndToEndTest(kDefaultTimeoutMs), > FakeEncoder(Clock::GetRealTimeClock()), >- time_to_reconfigure_(false, false), > encoder_inits_(0), > last_set_bitrate_kbps_(0), > send_stream_(nullptr), > frame_generator_(nullptr), >- encoder_factory_(this) {} >+ encoder_factory_(this), >+ bitrate_allocator_factory_( >+ CreateBuiltinVideoBitrateAllocatorFactory()) {} > > int32_t InitEncode(const VideoCodec* config, > int32_t number_of_cores, >@@ -767,8 +770,9 @@ TEST_F(CallPerfTest, MAYBE_KeepsHighBitrateWhenReconfiguringSender) { > 
return FakeEncoder::SetRateAllocation(rate_allocation, framerate); > } > >- void ModifySenderCallConfig(Call::Config* config) override { >- config->bitrate_config.start_bitrate_bps = kInitialBitrateKbps * 1000; >+ void ModifySenderBitrateConfig( >+ BitrateConstraints* bitrate_config) override { >+ bitrate_config->start_bitrate_bps = kInitialBitrateKbps * 1000; > } > > void ModifyVideoConfigs( >@@ -776,6 +780,8 @@ TEST_F(CallPerfTest, MAYBE_KeepsHighBitrateWhenReconfiguringSender) { > std::vector<VideoReceiveStream::Config>* receive_configs, > VideoEncoderConfig* encoder_config) override { > send_config->encoder_settings.encoder_factory = &encoder_factory_; >+ send_config->encoder_settings.bitrate_allocator_factory = >+ bitrate_allocator_factory_.get(); > encoder_config->max_bitrate_bps = 2 * kReconfigureThresholdKbps * 1000; > encoder_config->video_stream_factory = > new rtc::RefCountedObject<VideoStreamFactory>(); >@@ -811,6 +817,7 @@ TEST_F(CallPerfTest, MAYBE_KeepsHighBitrateWhenReconfiguringSender) { > VideoSendStream* send_stream_; > test::FrameGeneratorCapturer* frame_generator_; > test::VideoEncoderProxyFactory encoder_factory_; >+ std::unique_ptr<VideoBitrateAllocatorFactory> bitrate_allocator_factory_; > VideoEncoderConfig encoder_config_; > } test; > >@@ -865,8 +872,8 @@ void CallPerfTest::TestMinAudioVideoBitrate( > max_bwe_(max_bwe) {} > > protected: >- DefaultNetworkSimulationConfig GetFakeNetworkPipeConfig() { >- DefaultNetworkSimulationConfig pipe_config; >+ BuiltInNetworkBehaviorConfig GetFakeNetworkPipeConfig() { >+ BuiltInNetworkBehaviorConfig pipe_config; > pipe_config.link_capacity_kbps = test_bitrate_from_; > return pipe_config; > } >@@ -903,7 +910,7 @@ void CallPerfTest::TestMinAudioVideoBitrate( > ? 
test_bitrate <= test_bitrate_to_ > : test_bitrate >= test_bitrate_to_; > test_bitrate += test_bitrate_step_) { >- DefaultNetworkSimulationConfig pipe_config; >+ BuiltInNetworkBehaviorConfig pipe_config; > pipe_config.link_capacity_kbps = test_bitrate; > send_simulated_network_->SetConfig(pipe_config); > receive_simulated_network_->SetConfig(pipe_config); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/call_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/call_unittest.cc >index 8ae04008f9ca3ba84c2973ff2cadb1301bb18b64..770f255313cc1d47b8989c43c31008605e20e11d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/call_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/call_unittest.cc >@@ -15,6 +15,7 @@ > > #include "absl/memory/memory.h" > #include "api/audio_codecs/builtin_audio_decoder_factory.h" >+#include "api/test/fake_media_transport.h" > #include "api/test/mock_audio_mixer.h" > #include "audio/audio_receive_stream.h" > #include "audio/audio_send_stream.h" >@@ -62,7 +63,8 @@ TEST(CallTest, ConstructDestruct) { > > TEST(CallTest, CreateDestroy_AudioSendStream) { > CallHelper call; >- AudioSendStream::Config config(nullptr); >+ AudioSendStream::Config config(/*send_transport=*/nullptr, >+ /*media_transport=*/nullptr); > config.rtp.ssrc = 42; > AudioSendStream* stream = call->CreateAudioSendStream(config); > EXPECT_NE(stream, nullptr); >@@ -84,7 +86,8 @@ TEST(CallTest, CreateDestroy_AudioReceiveStream) { > > TEST(CallTest, CreateDestroy_AudioSendStreams) { > CallHelper call; >- AudioSendStream::Config config(nullptr); >+ AudioSendStream::Config config(/*send_transport=*/nullptr, >+ /*media_transport=*/nullptr); > std::list<AudioSendStream*> streams; > for (int i = 0; i < 2; ++i) { > for (uint32_t ssrc = 0; ssrc < 1234567; ssrc += 34567) { >@@ -142,7 +145,8 @@ TEST(CallTest, CreateDestroy_AssociateAudioSendReceiveStreams_RecvFirst) { > AudioReceiveStream* recv_stream = call->CreateAudioReceiveStream(recv_config); > 
EXPECT_NE(recv_stream, nullptr); > >- AudioSendStream::Config send_config(nullptr); >+ AudioSendStream::Config send_config(/*send_transport=*/nullptr, >+ /*media_transport=*/nullptr); > send_config.rtp.ssrc = 777; > AudioSendStream* send_stream = call->CreateAudioSendStream(send_config); > EXPECT_NE(send_stream, nullptr); >@@ -160,7 +164,8 @@ TEST(CallTest, CreateDestroy_AssociateAudioSendReceiveStreams_RecvFirst) { > > TEST(CallTest, CreateDestroy_AssociateAudioSendReceiveStreams_SendFirst) { > CallHelper call; >- AudioSendStream::Config send_config(nullptr); >+ AudioSendStream::Config send_config(/*send_transport=*/nullptr, >+ /*media_transport=*/nullptr); > send_config.rtp.ssrc = 777; > AudioSendStream* send_stream = call->CreateAudioSendStream(send_config); > EXPECT_NE(send_stream, nullptr); >@@ -263,7 +268,8 @@ TEST(CallTest, RecreatingAudioStreamWithSameSsrcReusesRtpState) { > CallHelper call; > > auto create_stream_and_get_rtp_state = [&](uint32_t ssrc) { >- AudioSendStream::Config config(nullptr); >+ AudioSendStream::Config config(/*send_transport=*/nullptr, >+ /*media_transport=*/nullptr); > config.rtp.ssrc = ssrc; > AudioSendStream* stream = call->CreateAudioSendStream(config); > const RtpState rtp_state = >@@ -284,4 +290,27 @@ TEST(CallTest, RecreatingAudioStreamWithSameSsrcReusesRtpState) { > EXPECT_EQ(rtp_state1.media_has_been_sent, rtp_state2.media_has_been_sent); > } > >+TEST(CallTest, RegisterMediaTransportBitrateCallbacksInCreateStream) { >+ CallHelper call; >+ MediaTransportSettings settings; >+ webrtc::FakeMediaTransport fake_media_transport(settings); >+ >+ EXPECT_EQ(0, fake_media_transport.target_rate_observers_size()); >+ AudioSendStream::Config config(/*send_transport=*/nullptr, >+ /*media_transport=*/&fake_media_transport); >+ >+ call->MediaTransportChange(&fake_media_transport); >+ AudioSendStream* stream = call->CreateAudioSendStream(config); >+ >+ // We get 2 subscribers: one subscriber from call.cc, and one from >+ // ChannelSend. 
>+ EXPECT_EQ(2, fake_media_transport.target_rate_observers_size()); >+ >+ call->DestroyAudioSendStream(stream); >+ EXPECT_EQ(1, fake_media_transport.target_rate_observers_size()); >+ >+ call->MediaTransportChange(nullptr); >+ EXPECT_EQ(0, fake_media_transport.target_rate_observers_size()); >+} >+ > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/callfactory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/callfactory.cc >index fa7d0fac6696376cdcc39ce9eafd791aa4f5ef16..ab057be028bb5fc4fea7cb3680fc41a23e358433 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/callfactory.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/callfactory.cc >@@ -30,7 +30,7 @@ bool ParseConfigParam(std::string exp_name, int* field) { > return (sscanf(group.c_str(), "%d", field) == 1); > } > >-absl::optional<webrtc::DefaultNetworkSimulationConfig> ParseDegradationConfig( >+absl::optional<webrtc::BuiltInNetworkBehaviorConfig> ParseDegradationConfig( > bool send) { > std::string exp_prefix = "WebRTCFakeNetwork"; > if (send) { >@@ -39,7 +39,7 @@ absl::optional<webrtc::DefaultNetworkSimulationConfig> ParseDegradationConfig( > exp_prefix += "Receive"; > } > >- webrtc::DefaultNetworkSimulationConfig config; >+ webrtc::BuiltInNetworkBehaviorConfig config; > bool configured = false; > configured |= > ParseConfigParam(exp_prefix + "DelayMs", &config.queue_delay_ms); >@@ -63,15 +63,15 @@ absl::optional<webrtc::DefaultNetworkSimulationConfig> ParseDegradationConfig( > configured |= ParseConfigParam(exp_prefix + "AvgBurstLossLength", > &config.avg_burst_loss_length); > return configured >- ? absl::optional<webrtc::DefaultNetworkSimulationConfig>(config) >+ ? 
absl::optional<webrtc::BuiltInNetworkBehaviorConfig>(config) > : absl::nullopt; > } > } // namespace > > Call* CallFactory::CreateCall(const Call::Config& config) { >- absl::optional<webrtc::DefaultNetworkSimulationConfig> >- send_degradation_config = ParseDegradationConfig(true); >- absl::optional<webrtc::DefaultNetworkSimulationConfig> >+ absl::optional<webrtc::BuiltInNetworkBehaviorConfig> send_degradation_config = >+ ParseDegradationConfig(true); >+ absl::optional<webrtc::BuiltInNetworkBehaviorConfig> > receive_degradation_config = ParseDegradationConfig(false); > > if (send_degradation_config || receive_degradation_config) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/degraded_call.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/degraded_call.cc >index 8181310e78ccbcb7752e29fa32e431f01d540050..a7ef41d490579d27581f6e7556c29a3a6327cb3c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/degraded_call.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/degraded_call.cc >@@ -17,8 +17,8 @@ > namespace webrtc { > DegradedCall::DegradedCall( > std::unique_ptr<Call> call, >- absl::optional<DefaultNetworkSimulationConfig> send_config, >- absl::optional<DefaultNetworkSimulationConfig> receive_config) >+ absl::optional<BuiltInNetworkBehaviorConfig> send_config, >+ absl::optional<BuiltInNetworkBehaviorConfig> receive_config) > : clock_(Clock::GetRealTimeClock()), > call_(std::move(call)), > send_config_(send_config), >@@ -183,6 +183,8 @@ bool DegradedCall::SendRtp(const uint8_t* packet, > rtc::SentPacket sent_packet; > sent_packet.packet_id = options.packet_id; > sent_packet.send_time_ms = clock_->TimeInMilliseconds(); >+ sent_packet.info.included_in_feedback = options.included_in_feedback; >+ sent_packet.info.included_in_allocation = options.included_in_allocation; > sent_packet.info.packet_size_bytes = length; > sent_packet.info.packet_type = rtc::PacketType::kData; > call_->OnSentPacket(sent_packet); >@@ -213,4 +215,10 @@ 
PacketReceiver::DeliveryStatus DegradedCall::DeliverPacket( > return status; > } > >+void DegradedCall::MediaTransportChange( >+ MediaTransportInterface* media_transport) { >+ // TODO(bugs.webrtc.org/9719) We should add support for media transport here >+ // at some point. >+} >+ > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/degraded_call.h b/Source/ThirdParty/libwebrtc/Source/webrtc/call/degraded_call.h >index 3c0b80df86ad35ae9d22fa239417af3bf9d69976..d78b1d1026cfe8b1e1658db98c47429936e238c0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/degraded_call.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/degraded_call.h >@@ -28,8 +28,8 @@ class DegradedCall : public Call, private Transport, private PacketReceiver { > public: > explicit DegradedCall( > std::unique_ptr<Call> call, >- absl::optional<DefaultNetworkSimulationConfig> send_config, >- absl::optional<DefaultNetworkSimulationConfig> receive_config); >+ absl::optional<BuiltInNetworkBehaviorConfig> send_config, >+ absl::optional<BuiltInNetworkBehaviorConfig> receive_config); > ~DegradedCall() override; > > // Implements Call. 
>@@ -91,13 +91,14 @@ class DegradedCall : public Call, private Transport, private PacketReceiver { > Clock* const clock_; > const std::unique_ptr<Call> call_; > >- const absl::optional<DefaultNetworkSimulationConfig> send_config_; >+ void MediaTransportChange(MediaTransportInterface* media_transport) override; >+ const absl::optional<BuiltInNetworkBehaviorConfig> send_config_; > const std::unique_ptr<ProcessThread> send_process_thread_; > SimulatedNetwork* send_simulated_network_; > std::unique_ptr<FakeNetworkPipe> send_pipe_; > size_t num_send_streams_; > >- const absl::optional<DefaultNetworkSimulationConfig> receive_config_; >+ const absl::optional<BuiltInNetworkBehaviorConfig> receive_config_; > SimulatedNetwork* receive_simulated_network_; > std::unique_ptr<FakeNetworkPipe> receive_pipe_; > }; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/fake_network_pipe.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/fake_network_pipe.cc >index e702c1b6f5c7ef8b7385cacddd0c9481d24a0717..3bd2e8f48b16a22c6143f5b676ed8ef92b9b8d47 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/fake_network_pipe.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/fake_network_pipe.cc >@@ -18,6 +18,7 @@ > #include "call/call.h" > #include "call/fake_network_pipe.h" > #include "call/simulated_network.h" >+#include "modules/utility/include/process_thread.h" > #include "rtc_base/logging.h" > #include "system_wrappers/include/clock.h" > >@@ -90,7 +91,6 @@ FakeNetworkPipe::FakeNetworkPipe( > dropped_packets_(0), > sent_packets_(0), > total_packet_delay_us_(0), >- next_process_time_us_(clock_->TimeInMicroseconds()), > last_log_time_us_(clock_->TimeInMicroseconds()) {} > > FakeNetworkPipe::FakeNetworkPipe( >@@ -105,7 +105,6 @@ FakeNetworkPipe::FakeNetworkPipe( > dropped_packets_(0), > sent_packets_(0), > total_packet_delay_us_(0), >- next_process_time_us_(clock_->TimeInMicroseconds()), > last_log_time_us_(clock_->TimeInMicroseconds()) {} > > 
FakeNetworkPipe::~FakeNetworkPipe() = default; >@@ -154,8 +153,8 @@ bool FakeNetworkPipe::EnqueuePacket(rtc::CopyOnWriteBuffer packet, > bool is_rtcp, > MediaType media_type, > absl::optional<int64_t> packet_time_us) { >- int64_t time_now_us = clock_->TimeInMicroseconds(); > rtc::CritScope crit(&process_lock_); >+ int64_t time_now_us = clock_->TimeInMicroseconds(); > size_t packet_size = packet.size(); > NetworkPacket net_packet(std::move(packet), time_now_us, time_now_us, options, > is_rtcp, media_type, packet_time_us); >@@ -169,6 +168,12 @@ bool FakeNetworkPipe::EnqueuePacket(rtc::CopyOnWriteBuffer packet, > packets_in_flight_.pop_back(); > ++dropped_packets_; > } >+ if (network_behavior_->NextDeliveryTimeUs()) { >+ rtc::CritScope crit(&process_thread_lock_); >+ if (process_thread_) >+ process_thread_->WakeUp(nullptr); >+ } >+ > return sent; > } > >@@ -201,10 +206,11 @@ size_t FakeNetworkPipe::SentPackets() { > } > > void FakeNetworkPipe::Process() { >- int64_t time_now_us = clock_->TimeInMicroseconds(); >+ int64_t time_now_us; > std::queue<NetworkPacket> packets_to_deliver; > { > rtc::CritScope crit(&process_lock_); >+ time_now_us = clock_->TimeInMicroseconds(); > if (time_now_us - last_log_time_us_ > kLogIntervalMs * 1000) { > int64_t queueing_delay_us = 0; > if (!packets_in_flight_.empty()) >@@ -250,9 +256,11 @@ void FakeNetworkPipe::Process() { > // arrived, due to NetworkProcess being called too late. For stats, use > // the time it should have been on the link. > total_packet_delay_us_ += added_delay_us; >+ ++sent_packets_; >+ } else { >+ ++dropped_packets_; > } > } >- sent_packets_ += packets_to_deliver.size(); > } > > rtc::CritScope crit(&config_lock_); >@@ -261,10 +269,6 @@ void FakeNetworkPipe::Process() { > packets_to_deliver.pop(); > DeliverNetworkPacket(&packet); > } >- absl::optional<int64_t> delivery_us = network_behavior_->NextDeliveryTimeUs(); >- next_process_time_us_ = delivery_us >- ? 
*delivery_us >- : time_now_us + kDefaultProcessIntervalMs * 1000; > } > > void FakeNetworkPipe::DeliverNetworkPacket(NetworkPacket* packet) { >@@ -291,8 +295,17 @@ void FakeNetworkPipe::DeliverNetworkPacket(NetworkPacket* packet) { > > int64_t FakeNetworkPipe::TimeUntilNextProcess() { > rtc::CritScope crit(&process_lock_); >- int64_t delay_us = next_process_time_us_ - clock_->TimeInMicroseconds(); >- return std::max<int64_t>((delay_us + 500) / 1000, 0); >+ absl::optional<int64_t> delivery_us = network_behavior_->NextDeliveryTimeUs(); >+ if (delivery_us) { >+ int64_t delay_us = *delivery_us - clock_->TimeInMicroseconds(); >+ return std::max<int64_t>((delay_us + 500) / 1000, 0); >+ } >+ return kDefaultProcessIntervalMs; >+} >+ >+void FakeNetworkPipe::ProcessThreadAttached(ProcessThread* process_thread) { >+ rtc::CritScope cs(&process_thread_lock_); >+ process_thread_ = process_thread; > } > > bool FakeNetworkPipe::HasTransport() const { >@@ -316,31 +329,8 @@ void FakeNetworkPipe::ResetStats() { > total_packet_delay_us_ = 0; > } > >-void FakeNetworkPipe::AddToPacketDropCount() { >- rtc::CritScope crit(&process_lock_); >- ++dropped_packets_; >-} >- >-void FakeNetworkPipe::AddToPacketSentCount(int count) { >- rtc::CritScope crit(&process_lock_); >- sent_packets_ += count; >-} >- >-void FakeNetworkPipe::AddToTotalDelay(int delay_us) { >- rtc::CritScope crit(&process_lock_); >- total_packet_delay_us_ += delay_us; >-} >- > int64_t FakeNetworkPipe::GetTimeInMicroseconds() const { > return clock_->TimeInMicroseconds(); > } > >-bool FakeNetworkPipe::ShouldProcess(int64_t time_now_us) const { >- return time_now_us >= next_process_time_us_; >-} >- >-void FakeNetworkPipe::SetTimeToNextProcess(int64_t skip_us) { >- next_process_time_us_ += skip_us; >-} >- > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/fake_network_pipe.h b/Source/ThirdParty/libwebrtc/Source/webrtc/call/fake_network_pipe.h >index 
f85e82efc5ee46edc40a9d10e5dc9a2642f0f966..b1b4cee14d0e61d101927c94e9b7b4e5818fb718 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/fake_network_pipe.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/fake_network_pipe.h >@@ -23,7 +23,6 @@ > #include "api/test/simulated_network.h" > #include "call/call.h" > #include "call/simulated_packet_receiver.h" >-#include "common_types.h" // NOLINT(build/include) > #include "rtc_base/constructormagic.h" > #include "rtc_base/criticalsection.h" > #include "rtc_base/thread_annotations.h" >@@ -140,6 +139,7 @@ class FakeNetworkPipe : public webrtc::SimulatedPacketReceiverInterface, > // packets ready to be delivered. > void Process() override; > int64_t TimeUntilNextProcess() override; >+ void ProcessThreadAttached(ProcessThread* process_thread) override; > > // Get statistics. > float PercentageLoss(); >@@ -150,9 +150,6 @@ class FakeNetworkPipe : public webrtc::SimulatedPacketReceiverInterface, > > protected: > void DeliverPacketWithLock(NetworkPacket* packet); >- void AddToPacketDropCount(); >- void AddToPacketSentCount(int count); >- void AddToTotalDelay(int delay_us); > int64_t GetTimeInMicroseconds() const; > bool ShouldProcess(int64_t time_now_us) const; > void SetTimeToNextProcess(int64_t skip_us); >@@ -197,6 +194,9 @@ class FakeNetworkPipe : public webrtc::SimulatedPacketReceiverInterface, > // processes, such as the packet queues. > rtc::CriticalSection process_lock_; > >+ rtc::CriticalSection process_thread_lock_; >+ ProcessThread* process_thread_ RTC_GUARDED_BY(process_thread_lock_) = nullptr; >+ > // Packets are added at the back of the deque, this makes the deque ordered > // by increasing send time. 
The common case when removing packets from the > // deque is removing early packets, which will be close to the front of the >@@ -210,9 +210,6 @@ class FakeNetworkPipe : public webrtc::SimulatedPacketReceiverInterface, > size_t dropped_packets_ RTC_GUARDED_BY(process_lock_); > size_t sent_packets_ RTC_GUARDED_BY(process_lock_); > int64_t total_packet_delay_us_ RTC_GUARDED_BY(process_lock_); >- >- int64_t next_process_time_us_; >- > int64_t last_log_time_us_; > > RTC_DISALLOW_COPY_AND_ASSIGN(FakeNetworkPipe); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/flexfec_receive_stream.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/flexfec_receive_stream.cc >index ab138368ba0ea998166c6a9f92226e6d84d298da..ab6dde37b46d769cdd9cb6e31d1cca355ff85da1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/flexfec_receive_stream.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/flexfec_receive_stream.cc >@@ -10,6 +10,8 @@ > > #include "call/flexfec_receive_stream.h" > >+#include "rtc_base/checks.h" >+ > namespace webrtc { > > FlexfecReceiveStream::Config::Config(Transport* rtcp_send_transport) >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/flexfec_receive_stream.h b/Source/ThirdParty/libwebrtc/Source/webrtc/call/flexfec_receive_stream.h >index ccc301430e844a388bdca8179f22b480081cd7d3..949ad742aa03f66d721ab50b875e1bcbaf5b3a29 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/flexfec_receive_stream.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/flexfec_receive_stream.h >@@ -12,7 +12,6 @@ > #define CALL_FLEXFEC_RECEIVE_STREAM_H_ > > #include <stdint.h> >- > #include <string> > #include <vector> > >@@ -20,7 +19,6 @@ > #include "api/rtp_headers.h" > #include "api/rtpparameters.h" > #include "call/rtp_packet_sink_interface.h" >-#include "common_types.h" // NOLINT(build/include) > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/packet_receiver.h 
b/Source/ThirdParty/libwebrtc/Source/webrtc/call/packet_receiver.h >index f05e409d79d136c98c082d509cf906d37d756a06..02a01550def65234a89417668d784c555cc6b03c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/packet_receiver.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/packet_receiver.h >@@ -16,7 +16,6 @@ > #include <vector> > > #include "api/mediatypes.h" >-#include "common_types.h" // NOLINT(build/include) > #include "rtc_base/copyonwritebuffer.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rampup_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rampup_tests.cc >index 8eb451ac04f73a7b85aebc82effe40b7b156999b..9735421f30ec51cb0ebc5f61e56fced0525de60b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rampup_tests.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rampup_tests.cc >@@ -11,7 +11,9 @@ > #include "call/rampup_tests.h" > > #include "call/fake_network_pipe.h" >+#include "logging/rtc_event_log/output/rtc_event_log_output_file.h" > #include "rtc_base/checks.h" >+#include "rtc_base/flags.h" > #include "rtc_base/logging.h" > #include "rtc_base/platform_thread.h" > #include "rtc_base/stringencode.h" >@@ -38,6 +40,10 @@ std::vector<uint32_t> GenerateSsrcs(size_t num_streams, uint32_t ssrc_offset) { > } > } // namespace > >+WEBRTC_DEFINE_string(ramp_dump_name, >+ "", >+ "Filename for dumped received RTP stream."); >+ > RampUpTester::RampUpTester(size_t num_video_streams, > size_t num_audio_streams, > size_t num_flexfec_streams, >@@ -48,7 +54,6 @@ RampUpTester::RampUpTester(size_t num_video_streams, > bool red, > bool report_perf_stats) > : EndToEndTest(test::CallTest::kLongTimeoutMs), >- stop_event_(false, false), > clock_(Clock::GetRealTimeClock()), > num_video_streams_(num_video_streams), > num_audio_streams_(num_audio_streams), >@@ -79,11 +84,12 @@ RampUpTester::RampUpTester(size_t num_video_streams, > > RampUpTester::~RampUpTester() {} > >-void 
RampUpTester::ModifySenderCallConfig(Call::Config* config) { >+void RampUpTester::ModifySenderBitrateConfig( >+ BitrateConstraints* bitrate_config) { > if (start_bitrate_bps_ != 0) { >- config->bitrate_config.start_bitrate_bps = start_bitrate_bps_; >+ bitrate_config->start_bitrate_bps = start_bitrate_bps_; > } >- config->bitrate_config.min_bitrate_bps = 10000; >+ bitrate_config->min_bitrate_bps = 10000; > } > > void RampUpTester::OnVideoStreamsCreated( >@@ -448,8 +454,9 @@ void RampUpDownUpTester::PollStats() { > } while (!stop_event_.Wait(kPollIntervalMs)); > } > >-void RampUpDownUpTester::ModifyReceiverCallConfig(Call::Config* config) { >- config->bitrate_config.min_bitrate_bps = 10000; >+void RampUpDownUpTester::ModifyReceiverBitrateConfig( >+ BitrateConstraints* bitrate_config) { >+ bitrate_config->min_bitrate_bps = 10000; > } > > std::string RampUpDownUpTester::GetModifierString() const { >@@ -566,7 +573,23 @@ void RampUpDownUpTester::EvolveTestState(int bitrate_bps, bool suspended) { > > class RampUpTest : public test::CallTest { > public: >- RampUpTest() {} >+ RampUpTest() { >+ std::string dump_name(FLAG_ramp_dump_name); >+ if (!dump_name.empty()) { >+ send_event_log_ = RtcEventLog::Create(RtcEventLog::EncodingType::Legacy); >+ recv_event_log_ = RtcEventLog::Create(RtcEventLog::EncodingType::Legacy); >+ bool event_log_started = >+ send_event_log_->StartLogging( >+ absl::make_unique<RtcEventLogOutputFile>( >+ dump_name + ".send.rtc.dat", RtcEventLog::kUnlimitedOutput), >+ RtcEventLog::kImmediateOutput) && >+ recv_event_log_->StartLogging( >+ absl::make_unique<RtcEventLogOutputFile>( >+ dump_name + ".recv.rtc.dat", RtcEventLog::kUnlimitedOutput), >+ RtcEventLog::kImmediateOutput); >+ RTC_DCHECK(event_log_started); >+ } >+ } > }; > > static const uint32_t kStartBitrateBps = 60000; >@@ -667,13 +690,7 @@ TEST_F(RampUpTest, TransportSequenceNumberSimulcastRedRtx) { > RunBaseTest(&test); > } > >-// TODO(bugs.webrtc.org/8878) >-#if defined(WEBRTC_MAC) >-#define 
MAYBE_AudioTransportSequenceNumber DISABLED_AudioTransportSequenceNumber >-#else >-#define MAYBE_AudioTransportSequenceNumber AudioTransportSequenceNumber >-#endif >-TEST_F(RampUpTest, MAYBE_AudioTransportSequenceNumber) { >+TEST_F(RampUpTest, AudioTransportSequenceNumber) { > RampUpTester test(0, 1, 0, 300000, 10000, > RtpExtension::kTransportSequenceNumberUri, false, false, > false); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rampup_tests.h b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rampup_tests.h >index 6cc65ce2a2243488ec91a5de3c337ae315f90997..b7d4af5f5a8898fafa0b6b3a1345706cb82c9759 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rampup_tests.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rampup_tests.h >@@ -67,7 +67,7 @@ class RampUpTester : public test::EndToEndTest { > > rtc::Event stop_event_; > Clock* const clock_; >- DefaultNetworkSimulationConfig forward_transport_config_; >+ BuiltInNetworkBehaviorConfig forward_transport_config_; > const size_t num_video_streams_; > const size_t num_audio_streams_; > const size_t num_flexfec_streams_; >@@ -83,7 +83,7 @@ class RampUpTester : public test::EndToEndTest { > typedef std::map<uint32_t, uint32_t> SsrcMap; > class VideoStreamFactory; > >- void ModifySenderCallConfig(Call::Config* config) override; >+ void ModifySenderBitrateConfig(BitrateConstraints* bitrate_config) override; > void OnVideoStreamsCreated( > VideoSendStream* send_stream, > const std::vector<VideoReceiveStream*>& receive_streams) override; >@@ -142,7 +142,7 @@ class RampUpDownUpTester : public RampUpTester { > kTransitionToNextState, > }; > >- void ModifyReceiverCallConfig(Call::Config* config); >+ void ModifyReceiverBitrateConfig(BitrateConstraints* bitrate_config) override; > > std::string GetModifierString() const; > int GetExpectedHighBitrate() const; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/receive_time_calculator.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/call/receive_time_calculator.cc >index 16c6a43cc3ffc05492916a194604af5dc7fc1ff2..b7056732e8b65a8822c91ac5fd56d51c9416d38d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/receive_time_calculator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/receive_time_calculator.cc >@@ -9,59 +9,112 @@ > */ > > #include "call/receive_time_calculator.h" >+ >+#include <string> >+ > #include "absl/memory/memory.h" >+#include "rtc_base/experiments/field_trial_parser.h" > #include "rtc_base/logging.h" >+#include "rtc_base/numerics/safe_minmax.h" > #include "system_wrappers/include/field_trial.h" > > namespace webrtc { > namespace { >-using ::webrtc::field_trial::FindFullName; > using ::webrtc::field_trial::IsEnabled; > >-const char kBweReceiveTimeCorrection[] = "WebRTC-BweReceiveTimeCorrection"; >+const char kBweReceiveTimeCorrection[] = "WebRTC-Bwe-ReceiveTimeFix"; > } // namespace > >-ReceiveTimeCalculator::ReceiveTimeCalculator(int64_t min_delta_ms, >- int64_t max_delta_diff_ms) >- : min_delta_us_(min_delta_ms * 1000), >- max_delta_diff_us_(max_delta_diff_ms * 1000) {} >+ReceiveTimeCalculatorConfig::ReceiveTimeCalculatorConfig() >+ : max_packet_time_repair("maxrep", TimeDelta::ms(2000)), >+ stall_threshold("stall", TimeDelta::ms(5)), >+ tolerance("tol", TimeDelta::ms(1)), >+ max_stall("maxstall", TimeDelta::seconds(5)) { >+ std::string trial_string = >+ field_trial::FindFullName(kBweReceiveTimeCorrection); >+ ParseFieldTrial( >+ {&max_packet_time_repair, &stall_threshold, &tolerance, &max_stall}, >+ trial_string); >+} >+ReceiveTimeCalculatorConfig::ReceiveTimeCalculatorConfig( >+ const ReceiveTimeCalculatorConfig&) = default; >+ReceiveTimeCalculatorConfig::~ReceiveTimeCalculatorConfig() = default; >+ >+ReceiveTimeCalculator::ReceiveTimeCalculator() >+ : config_(ReceiveTimeCalculatorConfig()) {} > > std::unique_ptr<ReceiveTimeCalculator> > ReceiveTimeCalculator::CreateFromFieldTrial() { > if 
(!IsEnabled(kBweReceiveTimeCorrection)) > return nullptr; >- int min, max; >- if (sscanf(FindFullName(kBweReceiveTimeCorrection).c_str(), "Enabled,%d,%d", >- &min, &max) != 2) { >- RTC_LOG(LS_WARNING) << "Invalid number of parameters provided."; >- return nullptr; >- } >- return absl::make_unique<ReceiveTimeCalculator>(min, max); >+ return absl::make_unique<ReceiveTimeCalculator>(); > } > >-int64_t ReceiveTimeCalculator::ReconcileReceiveTimes(int64_t packet_time_us_, >- int64_t safe_time_us_) { >- if (!receive_time_offset_us_) { >- receive_time_offset_us_ = safe_time_us_ - packet_time_us_; >- } else { >- int64_t safe_delta_us = safe_time_us_ - last_safe_time_us_; >- int64_t packet_delta_us_ = packet_time_us_ - last_packet_time_us_; >- int64_t delta_diff = packet_delta_us_ - safe_delta_us; >- // Packet time should not decrease significantly, a large decrease indicates >- // a reset of the packet time clock and we should reset the offest >- // parameter. The safe reference time can increase in large jumps if the >- // thread measuring it is backgrounded for longer periods. But if the packet >- // time increases significantly more than the safe time, it indicates a >- // clock reset and we should reset the offset. 
>- if (packet_delta_us_ < min_delta_us_ || delta_diff > max_delta_diff_us_) { >- RTC_LOG(LS_WARNING) << "Received a clock jump of " << delta_diff >- << " resetting offset"; >- receive_time_offset_us_ = safe_time_us_ - packet_time_us_; >+int64_t ReceiveTimeCalculator::ReconcileReceiveTimes(int64_t packet_time_us, >+ int64_t system_time_us, >+ int64_t safe_time_us) { >+ int64_t stall_time_us = system_time_us - packet_time_us; >+ if (total_system_time_passed_us_ < config_.stall_threshold->us()) { >+ stall_time_us = rtc::SafeMin(stall_time_us, config_.max_stall->us()); >+ } >+ int64_t corrected_time_us = safe_time_us - stall_time_us; >+ >+ if (last_packet_time_us_ == -1 && stall_time_us < 0) { >+ static_clock_offset_us_ = stall_time_us; >+ corrected_time_us += static_clock_offset_us_; >+ } else if (last_packet_time_us_ > 0) { >+ // All repairs depend on variables being initialized. >+ int64_t packet_time_delta_us = packet_time_us - last_packet_time_us_; >+ int64_t system_time_delta_us = system_time_us - last_system_time_us_; >+ int64_t safe_time_delta_us = safe_time_us - last_safe_time_us_; >+ >+ // Repair backwards clock resets during initial stall. In this case, the >+ // reset is observed only in packet time but never in system time. >+ if (system_time_delta_us < 0) >+ total_system_time_passed_us_ += config_.stall_threshold->us(); >+ else >+ total_system_time_passed_us_ += system_time_delta_us; >+ if (packet_time_delta_us < 0 && >+ total_system_time_passed_us_ < config_.stall_threshold->us()) { >+ static_clock_offset_us_ -= packet_time_delta_us; >+ } >+ corrected_time_us += static_clock_offset_us_; >+ >+ // Detect resets in between clock readings in socket and app. >+ bool forward_clock_reset = >+ corrected_time_us + config_.tolerance->us() < last_corrected_time_us_; >+ bool obvious_backward_clock_reset = system_time_us < packet_time_us; >+ >+ // Harder case with backward clock reset during stall, the reset being >+ // smaller than the stall. 
Compensate throughout the stall. >+ bool small_backward_clock_reset = >+ !obvious_backward_clock_reset && >+ safe_time_delta_us > system_time_delta_us + config_.tolerance->us(); >+ bool stall_start = >+ packet_time_delta_us >= 0 && >+ system_time_delta_us > packet_time_delta_us + config_.tolerance->us(); >+ bool stall_is_over = safe_time_delta_us > config_.stall_threshold->us(); >+ bool packet_time_caught_up = >+ packet_time_delta_us < 0 && system_time_delta_us >= 0; >+ if (stall_start && small_backward_clock_reset) >+ small_reset_during_stall_ = true; >+ else if (stall_is_over || packet_time_caught_up) >+ small_reset_during_stall_ = false; >+ >+ // If resets are detected, advance time by (capped) packet time increase. >+ if (forward_clock_reset || obvious_backward_clock_reset || >+ small_reset_during_stall_) { >+ corrected_time_us = last_corrected_time_us_ + >+ rtc::SafeClamp(packet_time_delta_us, 0, >+ config_.max_packet_time_repair->us()); > } > } >- last_packet_time_us_ = packet_time_us_; >- last_safe_time_us_ = safe_time_us_; >- return packet_time_us_ + *receive_time_offset_us_; >+ >+ last_corrected_time_us_ = corrected_time_us; >+ last_packet_time_us_ = packet_time_us; >+ last_system_time_us_ = system_time_us; >+ last_safe_time_us_ = safe_time_us; >+ return corrected_time_us; > } >+ > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/receive_time_calculator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/call/receive_time_calculator.h >index a8217cd101259f8b381adb082279d8375dc21f14..269d4ef91009084d20dc04f00102426080c4267c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/receive_time_calculator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/receive_time_calculator.h >@@ -10,13 +10,24 @@ > #ifndef CALL_RECEIVE_TIME_CALCULATOR_H_ > #define CALL_RECEIVE_TIME_CALCULATOR_H_ > >-#include <stdint.h> > #include <memory> > >-#include "absl/types/optional.h" >+#include "rtc_base/experiments/field_trial_units.h" > > 
namespace webrtc { > >+struct ReceiveTimeCalculatorConfig { >+ ReceiveTimeCalculatorConfig(); >+ ReceiveTimeCalculatorConfig(const ReceiveTimeCalculatorConfig&); >+ ReceiveTimeCalculatorConfig& operator=(const ReceiveTimeCalculatorConfig&) = >+ default; >+ ~ReceiveTimeCalculatorConfig(); >+ FieldTrialParameter<TimeDelta> max_packet_time_repair; >+ FieldTrialParameter<TimeDelta> stall_threshold; >+ FieldTrialParameter<TimeDelta> tolerance; >+ FieldTrialParameter<TimeDelta> max_stall; >+}; >+ > // The receive time calculator serves the purpose of combining packet time > // stamps with a safely incremental clock. This assumes that the packet time > // stamps are based on lower layer timestamps that have more accurate time >@@ -28,20 +39,20 @@ namespace webrtc { > class ReceiveTimeCalculator { > public: > static std::unique_ptr<ReceiveTimeCalculator> CreateFromFieldTrial(); >- // The min delta is used to correct for jumps backwards in time, to allow some >- // packet reordering a small negative value is appropriate to use. The max >- // delta difference is used as margin when detecting when packet time >- // increases more than the safe clock. This should be larger than the largest >- // expected sysmtem induced delay in the safe clock timestamp. 
>- ReceiveTimeCalculator(int64_t min_delta_ms, int64_t max_delta_diff_ms); >- int64_t ReconcileReceiveTimes(int64_t packet_time_us_, int64_t safe_time_us_); >+ ReceiveTimeCalculator(); >+ int64_t ReconcileReceiveTimes(int64_t packet_time_us_, >+ int64_t system_time_us_, >+ int64_t safe_time_us_); > > private: >- const int64_t min_delta_us_; >- const int64_t max_delta_diff_us_; >- absl::optional<int64_t> receive_time_offset_us_; >- int64_t last_packet_time_us_ = 0; >- int64_t last_safe_time_us_ = 0; >+ int64_t last_corrected_time_us_ = -1; >+ int64_t last_packet_time_us_ = -1; >+ int64_t last_system_time_us_ = -1; >+ int64_t last_safe_time_us_ = -1; >+ int64_t total_system_time_passed_us_ = 0; >+ int64_t static_clock_offset_us_ = 0; >+ int64_t small_reset_during_stall_ = false; >+ ReceiveTimeCalculatorConfig config_; > }; > } // namespace webrtc > #endif // CALL_RECEIVE_TIME_CALCULATOR_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/receive_time_calculator_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/receive_time_calculator_unittest.cc >index 92d5a279eae11242b786b77c15fe425edfa29ba2..38ef54ece794112cc9dad815c24b49513c70d3aa 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/receive_time_calculator_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/receive_time_calculator_unittest.cc >@@ -10,65 +10,237 @@ > > #include "call/receive_time_calculator.h" > >+#include <algorithm> >+#include <iostream> >+#include <tuple> >+#include <vector> >+ >+#include "absl/memory/memory.h" >+#include "absl/types/optional.h" >+#include "rtc_base/random.h" >+#include "rtc_base/timeutils.h" > #include "test/gtest.h" > > namespace webrtc { > namespace test { > namespace { > >-int64_t ReconcileMs(ReceiveTimeCalculator* calc, >- int64_t packet_time_ms, >- int64_t safe_time_ms) { >- return calc->ReconcileReceiveTimes(packet_time_ms * 1000, >- safe_time_ms * 1000) / >- 1000; >-} >-} // namespace >+class EmulatedClock { >+ public: >+ 
explicit EmulatedClock(int seed, float drift = 0.0f) >+ : random_(seed), clock_us_(random_.Rand<uint32_t>()), drift_(drift) {} >+ virtual ~EmulatedClock() = default; >+ int64_t GetClockUs() const { return clock_us_; } > >-TEST(ReceiveTimeCalculatorTest, UsesSmallerIncrements) { >- int64_t kMinDeltaMs = -20; >- int64_t kMaxDeltaDiffMs = 100; >- ReceiveTimeCalculator calc(kMinDeltaMs, kMaxDeltaDiffMs); >- // Initialize offset. >- ReconcileMs(&calc, 10000, 20000); >- >- EXPECT_EQ(ReconcileMs(&calc, 10010, 20100), 20010); >- EXPECT_EQ(ReconcileMs(&calc, 10020, 20100), 20020); >- EXPECT_EQ(ReconcileMs(&calc, 10030, 20100), 20030); >- >- EXPECT_EQ(ReconcileMs(&calc, 10110, 20200), 20110); >- EXPECT_EQ(ReconcileMs(&calc, 10120, 20200), 20120); >- EXPECT_EQ(ReconcileMs(&calc, 10130, 20200), 20130); >- >- // Small jumps backwards are let trough, they might be due to reordering. >- EXPECT_EQ(ReconcileMs(&calc, 10120, 20200), 20120); >- // The safe clock might be smaller than the packet clock. >- EXPECT_EQ(ReconcileMs(&calc, 10210, 20200), 20210); >- EXPECT_EQ(ReconcileMs(&calc, 10240, 20200), 20240); >-} >+ protected: >+ int64_t UpdateClock(int64_t time_us) { >+ if (!last_query_us_) >+ last_query_us_ = time_us; >+ int64_t skip_us = time_us - *last_query_us_; >+ accumulated_drift_us_ += skip_us * drift_; >+ int64_t drift_correction_us = static_cast<int64_t>(accumulated_drift_us_); >+ accumulated_drift_us_ -= drift_correction_us; >+ clock_us_ += skip_us + drift_correction_us; >+ last_query_us_ = time_us; >+ return skip_us; >+ } >+ Random random_; > >-TEST(ReceiveTimeCalculatorTest, CorrectsJumps) { >- int64_t kMinDeltaMs = -20; >- int64_t kMaxDeltaDiffMs = 100; >- ReceiveTimeCalculator calc(kMinDeltaMs, kMaxDeltaDiffMs); >- // Initialize offset. >- ReconcileMs(&calc, 10000, 20000); >- >- EXPECT_EQ(ReconcileMs(&calc, 10010, 20100), 20010); >- EXPECT_EQ(ReconcileMs(&calc, 10020, 20100), 20020); >- EXPECT_EQ(ReconcileMs(&calc, 10030, 20100), 20030); >- >- // Jump forward in time. 
>- EXPECT_EQ(ReconcileMs(&calc, 10240, 20200), 20200); >- EXPECT_EQ(ReconcileMs(&calc, 10250, 20200), 20210); >- EXPECT_EQ(ReconcileMs(&calc, 10260, 20200), 20220); >- >- // Jump backward in time. >- EXPECT_EQ(ReconcileMs(&calc, 10230, 20300), 20300); >- EXPECT_EQ(ReconcileMs(&calc, 10240, 20300), 20310); >- EXPECT_EQ(ReconcileMs(&calc, 10250, 20300), 20320); >-} >+ private: >+ int64_t clock_us_; >+ absl::optional<int64_t> last_query_us_; >+ float drift_; >+ float accumulated_drift_us_ = 0; >+}; > >-} // namespace test >+class EmulatedMonotoneousClock : public EmulatedClock { >+ public: >+ explicit EmulatedMonotoneousClock(int seed) : EmulatedClock(seed) {} >+ ~EmulatedMonotoneousClock() = default; >+ >+ int64_t Query(int64_t time_us) { >+ int64_t skip_us = UpdateClock(time_us); >+ >+ // In a stall >+ if (stall_recovery_time_us_ > 0) { >+ if (GetClockUs() > stall_recovery_time_us_) { >+ stall_recovery_time_us_ = 0; >+ return GetClockUs(); >+ } else { >+ return stall_recovery_time_us_; >+ } >+ } >+ >+ // Check if we enter a stall >+ for (int k = 0; k < skip_us; ++k) { >+ if (random_.Rand<double>() < kChanceOfStallPerUs) { >+ int64_t stall_duration_us = >+ static_cast<int64_t>(random_.Rand<float>() * kMaxStallDurationUs); >+ stall_recovery_time_us_ = GetClockUs() + stall_duration_us; >+ return stall_recovery_time_us_; >+ } >+ } >+ return GetClockUs(); >+ } >+ >+ void ForceStallUs() { >+ int64_t stall_duration_us = >+ static_cast<int64_t>(random_.Rand<float>() * kMaxStallDurationUs); >+ stall_recovery_time_us_ = GetClockUs() + stall_duration_us; >+ } >+ >+ bool Stalled() const { return stall_recovery_time_us_ > 0; } >+ >+ int64_t GetRemainingStall(int64_t time_us) const { >+ return stall_recovery_time_us_ > 0 ? 
stall_recovery_time_us_ - GetClockUs() >+ : 0; >+ } >+ >+ const int64_t kMaxStallDurationUs = rtc::kNumMicrosecsPerSec; >+ >+ private: >+ const float kChanceOfStallPerUs = 5e-6f; >+ int64_t stall_recovery_time_us_ = 0; >+}; > >+class EmulatedNonMonotoneousClock : public EmulatedClock { >+ public: >+ EmulatedNonMonotoneousClock(int seed, int64_t duration_us, float drift = 0) >+ : EmulatedClock(seed, drift) { >+ Pregenerate(duration_us); >+ } >+ ~EmulatedNonMonotoneousClock() = default; >+ >+ void Pregenerate(int64_t duration_us) { >+ int64_t time_since_reset_us = kMinTimeBetweenResetsUs; >+ int64_t clock_offset_us = 0; >+ for (int64_t time_us = 0; time_us < duration_us; time_us += kResolutionUs) { >+ int64_t skip_us = UpdateClock(time_us); >+ time_since_reset_us += skip_us; >+ int64_t reset_us = 0; >+ if (time_since_reset_us >= kMinTimeBetweenResetsUs) { >+ for (int k = 0; k < skip_us; ++k) { >+ if (random_.Rand<double>() < kChanceOfResetPerUs) { >+ reset_us = static_cast<int64_t>(2 * random_.Rand<float>() * >+ kMaxAbsResetUs) - >+ kMaxAbsResetUs; >+ clock_offset_us += reset_us; >+ time_since_reset_us = 0; >+ break; >+ } >+ } >+ } >+ pregenerated_clock_.emplace_back(GetClockUs() + clock_offset_us); >+ resets_us_.emplace_back(reset_us); >+ } >+ } >+ >+ int64_t Query(int64_t time_us) { >+ size_t ixStart = >+ (last_reset_query_time_us_ + (kResolutionUs >> 1)) / kResolutionUs + 1; >+ size_t ixEnd = (time_us + (kResolutionUs >> 1)) / kResolutionUs; >+ if (ixEnd >= pregenerated_clock_.size()) >+ return -1; >+ last_reset_size_us_ = 0; >+ for (size_t ix = ixStart; ix <= ixEnd; ++ix) { >+ if (resets_us_[ix] != 0) { >+ last_reset_size_us_ = resets_us_[ix]; >+ } >+ } >+ last_reset_query_time_us_ = time_us; >+ return pregenerated_clock_[ixEnd]; >+ } >+ >+ bool WasReset() const { return last_reset_size_us_ != 0; } >+ bool WasNegativeReset() const { return last_reset_size_us_ < 0; } >+ int64_t GetLastResetUs() const { return last_reset_size_us_; } >+ >+ private: >+ const float 
kChanceOfResetPerUs = 1e-6f; >+ const int64_t kMaxAbsResetUs = rtc::kNumMicrosecsPerSec; >+ const int64_t kMinTimeBetweenResetsUs = 3 * rtc::kNumMicrosecsPerSec; >+ const int64_t kResolutionUs = rtc::kNumMicrosecsPerMillisec; >+ int64_t last_reset_query_time_us_ = 0; >+ int64_t last_reset_size_us_ = 0; >+ std::vector<int64_t> pregenerated_clock_; >+ std::vector<int64_t> resets_us_; >+}; >+ >+TEST(ClockRepair, NoClockDrift) { >+ const int kSeeds = 10; >+ const int kFirstSeed = 1; >+ const int64_t kRuntimeUs = 10 * rtc::kNumMicrosecsPerSec; >+ const float kDrift = 0.0f; >+ const int64_t kMaxPacketInterarrivalUs = 50 * rtc::kNumMicrosecsPerMillisec; >+ for (int seed = kFirstSeed; seed < kSeeds + kFirstSeed; ++seed) { >+ EmulatedMonotoneousClock monotone_clock(seed); >+ EmulatedNonMonotoneousClock non_monotone_clock( >+ seed + 1, kRuntimeUs + rtc::kNumMicrosecsPerSec, kDrift); >+ ReceiveTimeCalculator reception_time_tracker; >+ int64_t corrected_clock_0 = 0; >+ int64_t reset_during_stall_tol_us = 0; >+ bool initial_clock_stall = true; >+ int64_t accumulated_upper_bound_tolerance_us = 0; >+ int64_t accumulated_lower_bound_tolerance_us = 0; >+ Random random(1); >+ monotone_clock.ForceStallUs(); >+ int64_t last_time_us = 0; >+ bool add_tolerance_on_next_packet = false; >+ int64_t monotone_noise_us = 1000; >+ >+ for (int64_t time_us = 0; time_us < kRuntimeUs; >+ time_us += static_cast<int64_t>(random.Rand<float>() * >+ kMaxPacketInterarrivalUs)) { >+ int64_t socket_time_us = non_monotone_clock.Query(time_us); >+ int64_t monotone_us = monotone_clock.Query(time_us) + >+ 2 * random.Rand<float>() * monotone_noise_us - >+ monotone_noise_us; >+ int64_t system_time_us = non_monotone_clock.Query( >+ time_us + monotone_clock.GetRemainingStall(time_us)); >+ >+ int64_t corrected_clock_us = reception_time_tracker.ReconcileReceiveTimes( >+ socket_time_us, system_time_us, monotone_us); >+ if (time_us == 0) >+ corrected_clock_0 = corrected_clock_us; >+ >+ if 
(add_tolerance_on_next_packet)
>+ accumulated_lower_bound_tolerance_us -= (time_us - last_time_us);
>+
>+ // Perfect repair cannot be achieved if the non-monotone clock resets during
>+ // a monotone clock stall.
>+ add_tolerance_on_next_packet = false;
>+ if (monotone_clock.Stalled() && non_monotone_clock.WasReset()) {
>+ reset_during_stall_tol_us =
>+ std::max(reset_during_stall_tol_us, time_us - last_time_us);
>+ if (non_monotone_clock.WasNegativeReset()) {
>+ add_tolerance_on_next_packet = true;
>+ }
>+ if (initial_clock_stall && !non_monotone_clock.WasNegativeReset()) {
>+ // Positive resets during an initial clock stall cannot be repaired
>+ // and the error will propagate through the rest of the trace.
>+ accumulated_upper_bound_tolerance_us +=
>+ std::abs(non_monotone_clock.GetLastResetUs());
>+ }
>+ } else {
>+ reset_during_stall_tol_us = 0;
>+ initial_clock_stall = false;
>+ }
>+ int64_t err = corrected_clock_us - corrected_clock_0 - time_us;
>+
>+ // Resets during stalls may lead to small errors temporarily. 
>+ int64_t lower_tol_us = accumulated_lower_bound_tolerance_us - >+ reset_during_stall_tol_us - monotone_noise_us - >+ 2 * rtc::kNumMicrosecsPerMillisec; >+ EXPECT_GE(err, lower_tol_us); >+ int64_t upper_tol_us = accumulated_upper_bound_tolerance_us + >+ monotone_noise_us + >+ 2 * rtc::kNumMicrosecsPerMillisec; >+ EXPECT_LE(err, upper_tol_us); >+ >+ last_time_us = time_us; >+ } >+ } >+} >+} // namespace >+} // namespace test > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtcp_demuxer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtcp_demuxer.cc >index d441c05b3db85bc3a6e99b84d498717c4e7e8686..a06d2a4604c403807c0a28a6df68258321438bd2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtcp_demuxer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtcp_demuxer.cc >@@ -13,7 +13,6 @@ > #include "api/rtp_headers.h" > #include "call/rtcp_packet_sink_interface.h" > #include "call/rtp_rtcp_demuxer_helper.h" >-#include "common_types.h" // NOLINT(build/include) > #include "rtc_base/checks.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtcp_demuxer_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtcp_demuxer_unittest.cc >index 808364acc5e8434674bc1b29f405ce7f72ace76a..6ceb7ab5d9d69e937f1468a08d2050b17b801059 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtcp_demuxer_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtcp_demuxer_unittest.cc >@@ -16,7 +16,6 @@ > #include "absl/memory/memory.h" > #include "api/rtp_headers.h" > #include "call/rtcp_packet_sink_interface.h" >-#include "common_types.h" // NOLINT(build/include) > #include "modules/rtp_rtcp/source/rtcp_packet/bye.h" > #include "rtc_base/arraysize.h" > #include "rtc_base/checks.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_config.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_config.cc >index 
1445c2552e492948e626998d7e0bc8fabf7914b0..cce78ad64c7d91130ff22e9b40543c32d9463793 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_config.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_config.cc >@@ -10,6 +10,9 @@ > > #include "call/rtp_config.h" > >+#include <cstdint> >+ >+#include "api/array_view.h" > #include "rtc_base/strings/string_builder.h" > > namespace webrtc { >@@ -60,6 +63,7 @@ std::string RtpConfig::ToString() const { > << (rtcp_mode == RtcpMode::kCompound ? "RtcpMode::kCompound" > : "RtcpMode::kReducedSize"); > ss << ", max_packet_size: " << max_packet_size; >+ ss << ", extmap-allow-mixed: " << (extmap_allow_mixed ? "true" : "false"); > ss << ", extensions: ["; > for (size_t i = 0; i < extensions.size(); ++i) { > ss << extensions[i].ToString(); >@@ -108,18 +112,4 @@ std::string RtpConfig::Rtx::ToString() const { > ss << '}'; > return ss.str(); > } >- >-RtcpConfig::RtcpConfig() = default; >-RtcpConfig::RtcpConfig(const RtcpConfig&) = default; >-RtcpConfig::~RtcpConfig() = default; >- >-std::string RtcpConfig::ToString() const { >- char buf[1024]; >- rtc::SimpleStringBuilder ss(buf); >- ss << "{video_report_interval_ms: " << video_report_interval_ms; >- ss << ", audio_report_interval_ms: " << audio_report_interval_ms; >- ss << '}'; >- return ss.str(); >-} >- > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_config.h b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_config.h >index 8f8d03a3a8242a6b19f5c5f65bd1c2b6386b6a05..97ae951765c2bd23e452a1fd140e115a35869bdd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_config.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_config.h >@@ -11,6 +11,8 @@ > #ifndef CALL_RTP_CONFIG_H_ > #define CALL_RTP_CONFIG_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <string> > #include <vector> > >@@ -74,6 +76,9 @@ struct RtpConfig { > // Max RTP packet size delivered to send transport from VideoEngine. 
> size_t max_packet_size = kDefaultMaxPacketSize; > >+ // Corresponds to the SDP attribute extmap-allow-mixed. >+ bool extmap_allow_mixed = false; >+ > // RTP header extensions to use for this send stream. > std::vector<RtpExtension> extensions; > >@@ -129,17 +134,5 @@ struct RtpConfig { > // RTCP CNAME, see RFC 3550. > std::string c_name; > }; >- >-struct RtcpConfig { >- RtcpConfig(); >- RtcpConfig(const RtcpConfig&); >- ~RtcpConfig(); >- std::string ToString() const; >- >- // Time interval between RTCP report for video >- int64_t video_report_interval_ms = 1000; >- // Time interval between RTCP report for audio >- int64_t audio_report_interval_ms = 5000; >-}; > } // namespace webrtc > #endif // CALL_RTP_CONFIG_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_demuxer_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_demuxer_unittest.cc >index b5ede3cd03a0ce5f7dfda815ac89100271b6d536..3b8200e7819889a5163c9ccdd08a7cfa76127fd2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_demuxer_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_demuxer_unittest.cc >@@ -17,7 +17,6 @@ > #include "absl/memory/memory.h" > #include "call/ssrc_binding_observer.h" > #include "call/test/mock_rtp_packet_sink_interface.h" >-#include "common_types.h" // NOLINT(build/include) > #include "modules/rtp_rtcp/include/rtp_header_extension_map.h" > #include "modules/rtp_rtcp/source/rtp_header_extensions.h" > #include "modules/rtp_rtcp/source/rtp_packet_received.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_payload_params.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_payload_params.cc >index ffbda1a2dbe3a328669184e62af62c97bdaeb836..2a32eef78332e8033edbe8fddeb11b8808b34dbb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_payload_params.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_payload_params.cc >@@ -144,6 +144,8 @@ RTPVideoHeader 
RtpPayloadParams::GetRtpVideoHeader( > rtp_video_header.rotation = image.rotation_; > rtp_video_header.content_type = image.content_type_; > rtp_video_header.playout_delay = image.playout_delay_; >+ rtp_video_header.width = image._encodedWidth; >+ rtp_video_header.height = image._encodedHeight; > > SetVideoTiming(image, &rtp_video_header.video_timing); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_payload_params.h b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_payload_params.h >index f6829cace5c629e55a2ec9cd5a471cd49a55de89..b91bfe4b2c5db00e02f39c19b92a705646aa0d4e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_payload_params.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_payload_params.h >@@ -16,7 +16,6 @@ > > #include "api/video_codecs/video_encoder.h" > #include "call/rtp_config.h" >-#include "common_types.h" // NOLINT(build/include) > #include "modules/rtp_rtcp/source/rtp_generic_frame_descriptor.h" > #include "modules/rtp_rtcp/source/rtp_video_header.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_payload_params_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_payload_params_unittest.cc >index a3cb379d18a655548682229137e3e8fdbf551b50..9a519ba0a7f730fc58b80e69bf4a13efbb11d73e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_payload_params_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_payload_params_unittest.cc >@@ -309,9 +309,13 @@ class RtpPayloadParamsVp8ToGenericTest : public ::testing::Test { > int64_t shared_frame_id, > FrameType frame_type, > LayerSync layer_sync, >- const std::set<int64_t>& expected_deps) { >+ const std::set<int64_t>& expected_deps, >+ uint16_t width = 0, >+ uint16_t height = 0) { > EncodedImage encoded_image; > encoded_image._frameType = frame_type; >+ encoded_image._encodedWidth = width; >+ encoded_image._encodedHeight = height; > > CodecSpecificInfo codec_info{}; > codec_info.codecType = 
kVideoCodecVP8; >@@ -330,6 +334,9 @@ class RtpPayloadParamsVp8ToGenericTest : public ::testing::Test { > std::set<int64_t> actual_deps(header.generic->dependencies.begin(), > header.generic->dependencies.end()); > EXPECT_EQ(expected_deps, actual_deps); >+ >+ EXPECT_EQ(header.width, width); >+ EXPECT_EQ(header.height, height); > } > > protected: >@@ -339,13 +346,13 @@ class RtpPayloadParamsVp8ToGenericTest : public ::testing::Test { > }; > > TEST_F(RtpPayloadParamsVp8ToGenericTest, Keyframe) { >- ConvertAndCheck(0, 0, kVideoFrameKey, kNoSync, {}); >+ ConvertAndCheck(0, 0, kVideoFrameKey, kNoSync, {}, 480, 360); > ConvertAndCheck(0, 1, kVideoFrameDelta, kNoSync, {0}); >- ConvertAndCheck(0, 2, kVideoFrameKey, kNoSync, {}); >+ ConvertAndCheck(0, 2, kVideoFrameKey, kNoSync, {}, 480, 360); > } > > TEST_F(RtpPayloadParamsVp8ToGenericTest, TooHighTemporalIndex) { >- ConvertAndCheck(0, 0, kVideoFrameKey, kNoSync, {}); >+ ConvertAndCheck(0, 0, kVideoFrameKey, kNoSync, {}, 480, 360); > > EncodedImage encoded_image; > encoded_image._frameType = kVideoFrameDelta; >@@ -362,7 +369,7 @@ TEST_F(RtpPayloadParamsVp8ToGenericTest, TooHighTemporalIndex) { > > TEST_F(RtpPayloadParamsVp8ToGenericTest, LayerSync) { > // 02120212 pattern >- ConvertAndCheck(0, 0, kVideoFrameKey, kNoSync, {}); >+ ConvertAndCheck(0, 0, kVideoFrameKey, kNoSync, {}, 480, 360); > ConvertAndCheck(2, 1, kVideoFrameDelta, kNoSync, {0}); > ConvertAndCheck(1, 2, kVideoFrameDelta, kNoSync, {0}); > ConvertAndCheck(2, 3, kVideoFrameDelta, kNoSync, {0, 1, 2}); >@@ -375,7 +382,7 @@ TEST_F(RtpPayloadParamsVp8ToGenericTest, LayerSync) { > > TEST_F(RtpPayloadParamsVp8ToGenericTest, FrameIdGaps) { > // 0101 pattern >- ConvertAndCheck(0, 0, kVideoFrameKey, kNoSync, {}); >+ ConvertAndCheck(0, 0, kVideoFrameKey, kNoSync, {}, 480, 360); > ConvertAndCheck(1, 1, kVideoFrameDelta, kNoSync, {0}); > > ConvertAndCheck(0, 5, kVideoFrameDelta, kNoSync, {0}); >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_transport_controller_send.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_transport_controller_send.cc >index 54aef21fd9eafc9133c363e556ebf232c6ed4f37..5265c95afb8104128d340f7269d9519d6d77727a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_transport_controller_send.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_transport_controller_send.cc >@@ -26,11 +26,6 @@ static const size_t kMaxOverheadBytes = 500; > const char kTaskQueueExperiment[] = "WebRTC-TaskQueueCongestionControl"; > using TaskQueueController = webrtc::webrtc_cc::SendSideCongestionController; > >-bool TaskQueueExperimentEnabled() { >- std::string trial = webrtc::field_trial::FindFullName(kTaskQueueExperiment); >- return trial.find("Enable") == 0; >-} >- > std::unique_ptr<SendSideCongestionControllerInterface> CreateController( > Clock* clock, > rtc::TaskQueue* task_queue, >@@ -70,9 +65,9 @@ RtpTransportControllerSend::RtpTransportControllerSend( > retransmission_rate_limiter_(clock, kRetransmitWindowSizeMs), > task_queue_("rtp_send_controller") { > // Created after task_queue to be able to post to the task queue internally. 
>- send_side_cc_ = >- CreateController(clock, &task_queue_, event_log, &pacer_, bitrate_config, >- TaskQueueExperimentEnabled(), controller_factory); >+ send_side_cc_ = CreateController( >+ clock, &task_queue_, event_log, &pacer_, bitrate_config, >+ !field_trial::IsDisabled(kTaskQueueExperiment), controller_factory); > > process_thread_->RegisterModule(&pacer_, RTC_FROM_HERE); > process_thread_->RegisterModule(send_side_cc_.get(), RTC_FROM_HERE); >@@ -90,18 +85,20 @@ RtpVideoSenderInterface* RtpTransportControllerSend::CreateRtpVideoSender( > std::map<uint32_t, RtpState> suspended_ssrcs, > const std::map<uint32_t, RtpPayloadState>& states, > const RtpConfig& rtp_config, >- const RtcpConfig& rtcp_config, >+ int rtcp_report_interval_ms, > Transport* send_transport, > const RtpSenderObservers& observers, > RtcEventLog* event_log, >- std::unique_ptr<FecController> fec_controller) { >+ std::unique_ptr<FecController> fec_controller, >+ const RtpSenderFrameEncryptionConfig& frame_encryption_config) { > video_rtp_senders_.push_back(absl::make_unique<RtpVideoSender>( >- ssrcs, suspended_ssrcs, states, rtp_config, rtcp_config, send_transport, >- observers, >+ ssrcs, suspended_ssrcs, states, rtp_config, rtcp_report_interval_ms, >+ send_transport, observers, > // TODO(holmer): Remove this circular dependency by injecting > // the parts of RtpTransportControllerSendInterface that are really used. 
>- this, event_log, &retransmission_rate_limiter_, >- std::move(fec_controller))); >+ this, event_log, &retransmission_rate_limiter_, std::move(fec_controller), >+ frame_encryption_config.frame_encryptor, >+ frame_encryption_config.crypto_options)); > return video_rtp_senders_.back().get(); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_transport_controller_send.h b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_transport_controller_send.h >index 6aaf29d6bc37b83323c78aa3e45301261f023416..f1c46cc07855ddef7e412c6680d19c5b68c29de4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_transport_controller_send.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_transport_controller_send.h >@@ -20,7 +20,6 @@ > #include "call/rtp_bitrate_configurator.h" > #include "call/rtp_transport_controller_send_interface.h" > #include "call/rtp_video_sender.h" >-#include "common_types.h" // NOLINT(build/include) > #include "modules/congestion_controller/include/send_side_congestion_controller_interface.h" > #include "modules/pacing/packet_router.h" > #include "modules/utility/include/process_thread.h" >@@ -29,7 +28,9 @@ > #include "rtc_base/task_queue.h" > > namespace webrtc { >+ > class Clock; >+class FrameEncryptorInterface; > class RtcEventLog; > > // TODO(nisse): When we get the underlying transports here, we should >@@ -52,11 +53,12 @@ class RtpTransportControllerSend final > const std::map<uint32_t, RtpPayloadState>& > states, // move states into RtpTransportControllerSend > const RtpConfig& rtp_config, >- const RtcpConfig& rtcp_config, >+ int rtcp_report_interval_ms, > Transport* send_transport, > const RtpSenderObservers& observers, > RtcEventLog* event_log, >- std::unique_ptr<FecController> fec_controller) override; >+ std::unique_ptr<FecController> fec_controller, >+ const RtpSenderFrameEncryptionConfig& frame_encryption_config) override; > void DestroyRtpVideoSender( > RtpVideoSenderInterface* rtp_video_sender) override; > 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_transport_controller_send_interface.h b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_transport_controller_send_interface.h >index 5c51c54fe13cf102fcc5afdd02c1a5fdfdb697b7..9868585cdf8113928a1ad63dd65eecc971cc2f7c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_transport_controller_send_interface.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_transport_controller_send_interface.h >@@ -20,10 +20,12 @@ > > #include "absl/types/optional.h" > #include "api/bitrate_constraints.h" >+#include "api/crypto/cryptooptions.h" > #include "api/fec_controller.h" > #include "api/transport/bitrate_settings.h" > #include "call/rtp_config.h" > #include "logging/rtc_event_log/rtc_event_log.h" >+#include "modules/rtp_rtcp/include/rtcp_statistics.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > > namespace rtc { >@@ -35,6 +37,7 @@ namespace webrtc { > > class CallStats; > class CallStatsObserver; >+class FrameEncryptorInterface; > class TargetTransferRateObserver; > class Transport; > class Module; >@@ -62,6 +65,11 @@ struct RtpSenderObservers { > SendPacketObserver* send_packet_observer; > }; > >+struct RtpSenderFrameEncryptionConfig { >+ FrameEncryptorInterface* frame_encryptor = nullptr; >+ CryptoOptions crypto_options; >+}; >+ > // An RtpTransportController should own everything related to the RTP > // transport to/from a remote endpoint. We should have separate > // interfaces for send and receive side, even if they are implemented >@@ -97,11 +105,12 @@ class RtpTransportControllerSendInterface { > // TODO(holmer): Move states into RtpTransportControllerSend. 
> const std::map<uint32_t, RtpPayloadState>& states, > const RtpConfig& rtp_config, >- const RtcpConfig& rtcp_config, >+ int rtcp_report_interval_ms, > Transport* send_transport, > const RtpSenderObservers& observers, > RtcEventLog* event_log, >- std::unique_ptr<FecController> fec_controller) = 0; >+ std::unique_ptr<FecController> fec_controller, >+ const RtpSenderFrameEncryptionConfig& frame_encryption_config) = 0; > virtual void DestroyRtpVideoSender( > RtpVideoSenderInterface* rtp_video_sender) = 0; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_video_sender.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_video_sender.cc >index 10d7e7daeda6d321756caf992845eda8d7cca522..ba1b992e95025aca58bc6cc2cfa1190ceb8381dc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_video_sender.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_video_sender.cc >@@ -40,8 +40,8 @@ static const size_t kPathMTU = 1500; > > std::vector<std::unique_ptr<RtpRtcp>> CreateRtpRtcpModules( > const std::vector<uint32_t>& ssrcs, >- const std::vector<uint32_t>& protected_media_ssrcs, >- const RtcpConfig& rtcp_config, >+ const RtpConfig& rtp_config, >+ int rtcp_report_interval_ms, > Transport* send_transport, > RtcpIntraFrameObserver* intra_frame_callback, > RtcpBandwidthObserver* bandwidth_callback, >@@ -56,8 +56,11 @@ std::vector<std::unique_ptr<RtpRtcp>> CreateRtpRtcpModules( > RtcEventLog* event_log, > RateLimiter* retransmission_rate_limiter, > OverheadObserver* overhead_observer, >- RtpKeepAliveConfig keepalive_config) { >+ RtpKeepAliveConfig keepalive_config, >+ FrameEncryptorInterface* frame_encryptor, >+ const CryptoOptions& crypto_options) { > RTC_DCHECK_GT(ssrcs.size(), 0); >+ > RtpRtcp::Configuration configuration; > configuration.audio = false; > configuration.receiver_only = false; >@@ -79,12 +82,15 @@ std::vector<std::unique_ptr<RtpRtcp>> CreateRtpRtcpModules( > configuration.retransmission_rate_limiter = retransmission_rate_limiter; 
> configuration.overhead_observer = overhead_observer; > configuration.keepalive_config = keepalive_config; >- configuration.rtcp_interval_config.video_interval_ms = >- rtcp_config.video_report_interval_ms; >- configuration.rtcp_interval_config.audio_interval_ms = >- rtcp_config.audio_report_interval_ms; >+ configuration.frame_encryptor = frame_encryptor; >+ configuration.require_frame_encryption = >+ crypto_options.sframe.require_frame_encryption; >+ configuration.extmap_allow_mixed = rtp_config.extmap_allow_mixed; >+ configuration.rtcp_report_interval_ms = rtcp_report_interval_ms; >+ > std::vector<std::unique_ptr<RtpRtcp>> modules; >- const std::vector<uint32_t>& flexfec_protected_ssrcs = protected_media_ssrcs; >+ const std::vector<uint32_t>& flexfec_protected_ssrcs = >+ rtp_config.flexfec.protected_media_ssrcs; > for (uint32_t ssrc : ssrcs) { > bool enable_flexfec = flexfec_sender != nullptr && > std::find(flexfec_protected_ssrcs.begin(), >@@ -177,13 +183,15 @@ RtpVideoSender::RtpVideoSender( > std::map<uint32_t, RtpState> suspended_ssrcs, > const std::map<uint32_t, RtpPayloadState>& states, > const RtpConfig& rtp_config, >- const RtcpConfig& rtcp_config, >+ int rtcp_report_interval_ms, > Transport* send_transport, > const RtpSenderObservers& observers, > RtpTransportControllerSendInterface* transport, > RtcEventLog* event_log, > RateLimiter* retransmission_limiter, >- std::unique_ptr<FecController> fec_controller) >+ std::unique_ptr<FecController> fec_controller, >+ FrameEncryptorInterface* frame_encryptor, >+ const CryptoOptions& crypto_options) > : send_side_bwe_with_overhead_( > webrtc::field_trial::IsEnabled("WebRTC-SendSideBwe-WithOverhead")), > active_(false), >@@ -191,25 +199,26 @@ RtpVideoSender::RtpVideoSender( > suspended_ssrcs_(std::move(suspended_ssrcs)), > flexfec_sender_(MaybeCreateFlexfecSender(rtp_config, suspended_ssrcs_)), > fec_controller_(std::move(fec_controller)), >- rtp_modules_( >- CreateRtpRtcpModules(ssrcs, >- 
rtp_config.flexfec.protected_media_ssrcs, >- rtcp_config, >- send_transport, >- observers.intra_frame_callback, >- transport->GetBandwidthObserver(), >- transport, >- observers.rtcp_rtt_stats, >- flexfec_sender_.get(), >- observers.bitrate_observer, >- observers.frame_count_observer, >- observers.rtcp_type_observer, >- observers.send_delay_observer, >- observers.send_packet_observer, >- event_log, >- retransmission_limiter, >- this, >- transport->keepalive_config())), >+ rtp_modules_(CreateRtpRtcpModules(ssrcs, >+ rtp_config, >+ rtcp_report_interval_ms, >+ send_transport, >+ observers.intra_frame_callback, >+ transport->GetBandwidthObserver(), >+ transport, >+ observers.rtcp_rtt_stats, >+ flexfec_sender_.get(), >+ observers.bitrate_observer, >+ observers.frame_count_observer, >+ observers.rtcp_type_observer, >+ observers.send_delay_observer, >+ observers.send_packet_observer, >+ event_log, >+ retransmission_limiter, >+ this, >+ transport->keepalive_config(), >+ frame_encryptor, >+ crypto_options)), > rtp_config_(rtp_config), > transport_(transport), > transport_overhead_bytes_per_packet_(0), >@@ -246,9 +255,6 @@ RtpVideoSender::RtpVideoSender( > for (size_t i = 0; i < rtp_config_.extensions.size(); ++i) { > const std::string& extension = rtp_config_.extensions[i].uri; > int id = rtp_config_.extensions[i].id; >- // One-byte-extension local identifiers are in the range 1-14 inclusive. 
>- RTC_DCHECK_GE(id, 1); >- RTC_DCHECK_LE(id, 14); > RTC_DCHECK(RtpExtension::IsSupportedForVideo(extension)); > for (auto& rtp_rtcp : rtp_modules_) { > RTC_CHECK(rtp_rtcp->RegisterRtpHeaderExtension(extension, id)); >@@ -592,12 +598,14 @@ void RtpVideoSender::OnBitrateUpdated(uint32_t bitrate_bps, > rtc::CritScope lock(&crit_); > uint32_t payload_bitrate_bps = bitrate_bps; > if (send_side_bwe_with_overhead_) { >- payload_bitrate_bps -= CalculateOverheadRateBps( >+ uint32_t overhead_bps = CalculateOverheadRateBps( > CalculatePacketRate( > bitrate_bps, > rtp_config_.max_packet_size + transport_overhead_bytes_per_packet_), > overhead_bytes_per_packet_ + transport_overhead_bytes_per_packet_, > bitrate_bps); >+ RTC_DCHECK_LE(overhead_bps, bitrate_bps); >+ payload_bitrate_bps = bitrate_bps - overhead_bps; > } > > // Get the encoder target rate. It is the estimated network rate - >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_video_sender.h b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_video_sender.h >index db329be00b04470fb3917f420df485c3d5b8b290..adac46bcd813084654b36cf742117cf52f1a7cfa 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_video_sender.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_video_sender.h >@@ -23,7 +23,6 @@ > #include "call/rtp_payload_params.h" > #include "call/rtp_transport_controller_send_interface.h" > #include "call/rtp_video_sender_interface.h" >-#include "common_types.h" // NOLINT(build/include) > #include "logging/rtc_event_log/rtc_event_log.h" > #include "modules/rtp_rtcp/include/flexfec_sender.h" > #include "modules/rtp_rtcp/source/rtp_video_header.h" >@@ -36,6 +35,7 @@ > > namespace webrtc { > >+class FrameEncryptorInterface; > class RTPFragmentationHeader; > class RtpRtcp; > class RtpTransportControllerSendInterface; >@@ -53,13 +53,15 @@ class RtpVideoSender : public RtpVideoSenderInterface, > std::map<uint32_t, RtpState> suspended_ssrcs, > const std::map<uint32_t, RtpPayloadState>& 
states, > const RtpConfig& rtp_config, >- const RtcpConfig& rtcp_config, >+ int rtcp_report_interval_ms, > Transport* send_transport, > const RtpSenderObservers& observers, > RtpTransportControllerSendInterface* transport, > RtcEventLog* event_log, > RateLimiter* retransmission_limiter, // move inside RtpTransport >- std::unique_ptr<FecController> fec_controller); >+ std::unique_ptr<FecController> fec_controller, >+ FrameEncryptorInterface* frame_encryptor, >+ const CryptoOptions& crypto_options); // move inside RtpTransport > ~RtpVideoSender() override; > > // RegisterProcessThread register |module_process_thread| with those objects >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_video_sender_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_video_sender_unittest.cc >index e91bfc4aeb14f268e309eafc0405269d293cc293..8ab89d6104c465abffc9867c4b8d34cccfaabc94 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_video_sender_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/rtp_video_sender_unittest.cc >@@ -102,12 +102,13 @@ class RtpVideoSenderTestFixture { > std::map<uint32_t, RtpState> suspended_ssrcs; > router_ = absl::make_unique<RtpVideoSender>( > config_.rtp.ssrcs, suspended_ssrcs, suspended_payload_states, >- config_.rtp, config_.rtcp, &transport_, >+ config_.rtp, config_.rtcp_report_interval_ms, &transport_, > CreateObservers(&call_stats_, &encoder_feedback_, &stats_proxy_, > &stats_proxy_, &stats_proxy_, &stats_proxy_, > &stats_proxy_, &stats_proxy_, &send_delay_stats_), > &transport_controller_, &event_log_, &retransmission_rate_limiter_, >- absl::make_unique<FecControllerDefault>(&clock_)); >+ absl::make_unique<FecControllerDefault>(&clock_), nullptr, >+ CryptoOptions{}); > } > > RtpVideoSender* router() { return router_.get(); } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/simulated_network.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/simulated_network.cc >index 
ee58dce718248aa106bc3a817a34a921c8929fa6..77be14821c2aef1eb77f5ffa66f17fbe4640b2cd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/simulated_network.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/simulated_network.cc >@@ -13,6 +13,7 @@ > #include <algorithm> > #include <cmath> > #include <utility> >+#include "api/units/data_rate.h" > > namespace webrtc { > >@@ -26,6 +27,9 @@ SimulatedNetwork::~SimulatedNetwork() = default; > > void SimulatedNetwork::SetConfig(const SimulatedNetwork::Config& config) { > rtc::CritScope crit(&config_lock_); >+ if (config_.link_capacity_kbps != config.link_capacity_kbps) { >+ reset_capacity_delay_error_ = true; >+ } > config_ = config; // Shallow copy of the struct. > double prob_loss = config.loss_percent / 100.0; > if (config_.avg_burst_loss_length == -1) { >@@ -64,42 +68,45 @@ bool SimulatedNetwork::EnqueuePacket(PacketInFlightInfo packet) { > // Too many packet on the link, drop this one. > return false; > } >- >- // Delay introduced by the link capacity. >- int64_t capacity_delay_ms = 0; >- if (config.link_capacity_kbps > 0) { >- // Using bytes per millisecond to avoid losing precision. >- const int64_t bytes_per_millisecond = config.link_capacity_kbps / 8; >- // To round to the closest millisecond we add half a milliseconds worth of >- // bytes to the delay calculation. 
>- capacity_delay_ms = (packet.size + capacity_delay_error_bytes_ + >- bytes_per_millisecond / 2) / >- bytes_per_millisecond; >- capacity_delay_error_bytes_ += >- packet.size - capacity_delay_ms * bytes_per_millisecond; >- } > int64_t network_start_time_us = packet.send_time_us; >- > { > rtc::CritScope crit(&config_lock_); >+ if (reset_capacity_delay_error_) { >+ capacity_delay_error_bytes_ = 0; >+ reset_capacity_delay_error_ = false; >+ } > if (pause_transmission_until_us_) { > network_start_time_us = > std::max(network_start_time_us, *pause_transmission_until_us_); > pause_transmission_until_us_.reset(); > } > } >+ >+ // Delay introduced by the link capacity. >+ TimeDelta capacity_delay = TimeDelta::Zero(); >+ if (config.link_capacity_kbps > 0) { >+ const DataRate link_capacity = DataRate::kbps(config.link_capacity_kbps); >+ int64_t compensated_size = >+ static_cast<int64_t>(packet.size) + capacity_delay_error_bytes_; >+ capacity_delay = DataSize::bytes(compensated_size) / link_capacity; >+ >+ capacity_delay_error_bytes_ += >+ packet.size - (capacity_delay * link_capacity).bytes(); >+ } >+ > // Check if there already are packets on the link and change network start > // time forward if there is. 
> if (!capacity_link_.empty() && > network_start_time_us < capacity_link_.back().arrival_time_us) > network_start_time_us = capacity_link_.back().arrival_time_us; > >- int64_t arrival_time_us = network_start_time_us + capacity_delay_ms * 1000; >+ int64_t arrival_time_us = network_start_time_us + capacity_delay.us(); > capacity_link_.push({packet, arrival_time_us}); > return true; > } > > absl::optional<int64_t> SimulatedNetwork::NextDeliveryTimeUs() const { >+ rtc::CritScope crit(&process_lock_); > if (!delay_link_.empty()) > return delay_link_.begin()->arrival_time_us; > return absl::nullopt; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/simulated_network.h b/Source/ThirdParty/libwebrtc/Source/webrtc/call/simulated_network.h >index 0f8453283a88dd99deedf7c8ddeacc26d292d8ea..e978027092c7793dd3ce430d9bb73ef3775d2ede 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/simulated_network.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/simulated_network.h >@@ -28,7 +28,7 @@ namespace webrtc { > // capacity introduced delay. > class SimulatedNetwork : public NetworkBehaviorInterface { > public: >- using Config = DefaultNetworkSimulationConfig; >+ using Config = BuiltInNetworkBehaviorConfig; > explicit SimulatedNetwork(Config config, uint64_t random_seed = 1); > ~SimulatedNetwork() override; > >@@ -49,6 +49,7 @@ class SimulatedNetwork : public NetworkBehaviorInterface { > int64_t arrival_time_us; > }; > rtc::CriticalSection config_lock_; >+ bool reset_capacity_delay_error_ RTC_GUARDED_BY(config_lock_) = false; > > // |process_lock| guards the data structures involved in delay and loss > // processes, such as the packet queues. >@@ -56,7 +57,7 @@ class SimulatedNetwork : public NetworkBehaviorInterface { > std::queue<PacketInfo> capacity_link_ RTC_GUARDED_BY(process_lock_); > Random random_; > >- std::deque<PacketInfo> delay_link_; >+ std::deque<PacketInfo> delay_link_ RTC_GUARDED_BY(process_lock_); > > // Link configuration. 
> Config config_ RTC_GUARDED_BY(config_lock_); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/test/fake_network_pipe_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/test/fake_network_pipe_unittest.cc >index 09e585241a407677af37e9a7c266fdaeb1624074..de5aec71352ccc7ae04881ca61bae3b78c2a6d2c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/test/fake_network_pipe_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/test/fake_network_pipe_unittest.cc >@@ -73,7 +73,7 @@ class FakeNetworkPipeTest : public ::testing::Test { > > // Test the capacity link and verify we get as many packets as we expect. > TEST_F(FakeNetworkPipeTest, CapacityTest) { >- DefaultNetworkSimulationConfig config; >+ BuiltInNetworkBehaviorConfig config; > config.queue_length_packets = 20; > config.link_capacity_kbps = 80; > MockReceiver receiver; >@@ -113,7 +113,7 @@ TEST_F(FakeNetworkPipeTest, CapacityTest) { > > // Test the extra network delay. > TEST_F(FakeNetworkPipeTest, ExtraDelayTest) { >- DefaultNetworkSimulationConfig config; >+ BuiltInNetworkBehaviorConfig config; > config.queue_length_packets = 20; > config.queue_delay_ms = 100; > config.link_capacity_kbps = 80; >@@ -149,7 +149,7 @@ TEST_F(FakeNetworkPipeTest, ExtraDelayTest) { > // Test the number of buffers and packets are dropped when sending too many > // packets too quickly. > TEST_F(FakeNetworkPipeTest, QueueLengthTest) { >- DefaultNetworkSimulationConfig config; >+ BuiltInNetworkBehaviorConfig config; > config.queue_length_packets = 2; > config.link_capacity_kbps = 80; > MockReceiver receiver; >@@ -173,7 +173,7 @@ TEST_F(FakeNetworkPipeTest, QueueLengthTest) { > > // Test we get statistics as expected. 
> TEST_F(FakeNetworkPipeTest, StatisticsTest) { >- DefaultNetworkSimulationConfig config; >+ BuiltInNetworkBehaviorConfig config; > config.queue_length_packets = 2; > config.queue_delay_ms = 20; > config.link_capacity_kbps = 80; >@@ -205,7 +205,7 @@ TEST_F(FakeNetworkPipeTest, StatisticsTest) { > // Change the link capacity half-way through the test and verify that the > // delivery times change accordingly. > TEST_F(FakeNetworkPipeTest, ChangingCapacityWithEmptyPipeTest) { >- DefaultNetworkSimulationConfig config; >+ BuiltInNetworkBehaviorConfig config; > config.queue_length_packets = 20; > config.link_capacity_kbps = 80; > MockReceiver receiver; >@@ -266,7 +266,7 @@ TEST_F(FakeNetworkPipeTest, ChangingCapacityWithEmptyPipeTest) { > // Change the link capacity half-way through the test and verify that the > // delivery times change accordingly. > TEST_F(FakeNetworkPipeTest, ChangingCapacityWithPacketsInPipeTest) { >- DefaultNetworkSimulationConfig config; >+ BuiltInNetworkBehaviorConfig config; > config.queue_length_packets = 20; > config.link_capacity_kbps = 80; > MockReceiver receiver; >@@ -321,7 +321,7 @@ TEST_F(FakeNetworkPipeTest, ChangingCapacityWithPacketsInPipeTest) { > > // At first disallow reordering and then allow reordering. 
> TEST_F(FakeNetworkPipeTest, DisallowReorderingThenAllowReordering) { >- DefaultNetworkSimulationConfig config; >+ BuiltInNetworkBehaviorConfig config; > config.queue_length_packets = 1000; > config.link_capacity_kbps = 800; > config.queue_delay_ms = 100; >@@ -374,7 +374,7 @@ TEST_F(FakeNetworkPipeTest, BurstLoss) { > const int kNumPackets = 10000; > const int kPacketSize = 10; > >- DefaultNetworkSimulationConfig config; >+ BuiltInNetworkBehaviorConfig config; > config.queue_length_packets = kNumPackets; > config.loss_percent = kLossPercent; > config.avg_burst_loss_length = kAvgBurstLength; >@@ -409,7 +409,7 @@ TEST_F(FakeNetworkPipeTest, BurstLoss) { > } > > TEST_F(FakeNetworkPipeTest, SetReceiver) { >- DefaultNetworkSimulationConfig config; >+ BuiltInNetworkBehaviorConfig config; > config.link_capacity_kbps = 800; > MockReceiver receiver; > auto simulated_network = absl::make_unique<SimulatedNetwork>(config); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/test/mock_bitrate_allocator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/call/test/mock_bitrate_allocator.h >index 714c02541f172ad5d5cd694a8092ca143b9544c8..f00ed79c59fb78289f8d904d89afe865b59b9e58 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/test/mock_bitrate_allocator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/test/mock_bitrate_allocator.h >@@ -21,7 +21,7 @@ class MockBitrateAllocator : public BitrateAllocatorInterface { > MOCK_METHOD2(AddObserver, > void(BitrateAllocatorObserver*, MediaStreamAllocationConfig)); > MOCK_METHOD1(RemoveObserver, void(BitrateAllocatorObserver*)); >- MOCK_METHOD1(GetStartBitrate, int(BitrateAllocatorObserver*)); >+ MOCK_CONST_METHOD1(GetStartBitrate, int(BitrateAllocatorObserver*)); > }; > } // namespace webrtc > #endif // CALL_TEST_MOCK_BITRATE_ALLOCATOR_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/test/mock_rtp_transport_controller_send.h 
b/Source/ThirdParty/libwebrtc/Source/webrtc/call/test/mock_rtp_transport_controller_send.h >index 37587b2b76b9be19dfa68bc4ae73a458ecbb4e19..4d34d06e0567b99ef40a09357fdec4654134af24 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/test/mock_rtp_transport_controller_send.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/test/mock_rtp_transport_controller_send.h >@@ -17,12 +17,14 @@ > #include <vector> > > #include "api/bitrate_constraints.h" >+#include "api/crypto/cryptooptions.h" >+#include "api/crypto/frameencryptorinterface.h" > #include "call/rtp_transport_controller_send_interface.h" > #include "modules/congestion_controller/include/network_changed_observer.h" > #include "modules/pacing/packet_router.h" >+#include "rtc_base/network/sent_packet.h" > #include "rtc_base/networkroute.h" > #include "rtc_base/rate_limiter.h" >-#include "rtc_base/socket.h" > #include "test/gmock.h" > > namespace webrtc { >@@ -30,17 +32,18 @@ namespace webrtc { > class MockRtpTransportControllerSend > : public RtpTransportControllerSendInterface { > public: >- MOCK_METHOD9( >+ MOCK_METHOD10( > CreateRtpVideoSender, > RtpVideoSenderInterface*(const std::vector<uint32_t>&, > std::map<uint32_t, RtpState>, > const std::map<uint32_t, RtpPayloadState>&, > const RtpConfig&, >- const RtcpConfig&, >+ int rtcp_report_interval_ms, > Transport*, > const RtpSenderObservers&, > RtcEventLog*, >- std::unique_ptr<FecController>)); >+ std::unique_ptr<FecController>, >+ const RtpSenderFrameEncryptionConfig&)); > MOCK_METHOD1(DestroyRtpVideoSender, void(RtpVideoSenderInterface*)); > MOCK_METHOD0(GetWorkerQueue, rtc::TaskQueue*()); > MOCK_METHOD0(packet_router, PacketRouter*()); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/video_receive_stream.h b/Source/ThirdParty/libwebrtc/Source/webrtc/call/video_receive_stream.h >index 49efe4fceaa5ee4647192b08a7f3e05ceaca22a9..2cd5f1631a09b15a0195315b7eb88c3f16fc7179 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/call/video_receive_stream.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/video_receive_stream.h >@@ -17,6 +17,7 @@ > #include <vector> > > #include "api/call/transport.h" >+#include "api/crypto/cryptooptions.h" > #include "api/rtp_headers.h" > #include "api/rtpparameters.h" > #include "api/rtpreceiverinterface.h" >@@ -25,12 +26,12 @@ > #include "api/video/video_timing.h" > #include "api/video_codecs/sdp_video_format.h" > #include "call/rtp_config.h" >-#include "common_types.h" // NOLINT(build/include) >-#include "common_video/include/frame_callback.h" >+#include "modules/rtp_rtcp/include/rtcp_statistics.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > > namespace webrtc { > >+class FrameDecryptorInterface; > class RtpPacketSinkInterface; > class VideoDecoderFactory; > >@@ -215,6 +216,14 @@ class VideoReceiveStream { > // TODO(nisse): Used with VideoDecoderFactory::LegacyCreateVideoDecoder. > // Delete when that method is retired. > std::string stream_id; >+ >+ // An optional custom frame decryptor that allows the entire frame to be >+ // decrypted in whatever way the caller chooses. This is not required by >+ // default. >+ rtc::scoped_refptr<webrtc::FrameDecryptorInterface> frame_decryptor; >+ >+ // Per PeerConnection cryptography options. >+ CryptoOptions crypto_options; > }; > > // Starts stream activity. 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/video_send_stream.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/call/video_send_stream.cc >index 6b93fe28238a4b8e35da230fdae455c68dc513e1..c8ba308fa54236a2bf05733a4146b119c4d7371e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/video_send_stream.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/video_send_stream.cc >@@ -9,6 +9,10 @@ > */ > > #include "call/video_send_stream.h" >+ >+#include <utility> >+ >+#include "api/crypto/frameencryptorinterface.h" > #include "rtc_base/strings/string_builder.h" > > namespace webrtc { >@@ -75,11 +79,9 @@ std::string VideoSendStream::Config::ToString() const { > ss << "{encoder_settings: { experiment_cpu_load_estimator: " > << (encoder_settings.experiment_cpu_load_estimator ? "on" : "off") << "}}"; > ss << ", rtp: " << rtp.ToString(); >- ss << ", rtcp: " << rtcp.ToString(); >+ ss << ", rtcp_report_interval_ms: " << rtcp_report_interval_ms; > ss << ", pre_encode_callback: " > << (pre_encode_callback ? "(VideoSinkInterface)" : "nullptr"); >- ss << ", post_encode_callback: " >- << (post_encode_callback ? 
"(EncodedFrameObserver)" : "nullptr"); > ss << ", render_delay_ms: " << render_delay_ms; > ss << ", target_delay_ms: " << target_delay_ms; > ss << ", suspend_below_min_bitrate: " >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/call/video_send_stream.h b/Source/ThirdParty/libwebrtc/Source/webrtc/call/video_send_stream.h >index 3ff624d230bd99811cc40ef1d8350b7720ab192a..4d3d9c0c574cb7f76c6e9f9b95140284ef6ebe42 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/call/video_send_stream.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/call/video_send_stream.h >@@ -11,24 +11,29 @@ > #ifndef CALL_VIDEO_SEND_STREAM_H_ > #define CALL_VIDEO_SEND_STREAM_H_ > >+#include <stdint.h> > #include <map> > #include <string> >-#include <utility> > #include <vector> > >+#include "absl/types/optional.h" > #include "api/call/transport.h" >+#include "api/crypto/cryptooptions.h" >+#include "api/rtpparameters.h" >+#include "api/video/video_content_type.h" >+#include "api/video/video_frame.h" > #include "api/video/video_sink_interface.h" > #include "api/video/video_source_interface.h" > #include "api/video/video_stream_encoder_settings.h" > #include "api/video_codecs/video_encoder_config.h" >-#include "api/video_codecs/video_encoder_factory.h" > #include "call/rtp_config.h" >-#include "common_types.h" // NOLINT(build/include) >-#include "common_video/include/frame_callback.h" >+#include "modules/rtp_rtcp/include/rtcp_statistics.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > > namespace webrtc { > >+class FrameEncryptorInterface; >+ > class VideoSendStream { > public: > struct StreamStats { >@@ -108,7 +113,8 @@ class VideoSendStream { > > RtpConfig rtp; > >- RtcpConfig rtcp; >+ // Time interval between RTCP reports for video. >+ int rtcp_report_interval_ms = 1000; > > // Transport for outgoing packets. > Transport* send_transport = nullptr; >@@ -117,12 +123,6 @@ class VideoSendStream { > // effects, snapshots etc. 'nullptr' disables the callback. 
> rtc::VideoSinkInterface<VideoFrame>* pre_encode_callback = nullptr; > >- // Called for each encoded frame, e.g. used for file storage. 'nullptr' >- // disables the callback. Also measures timing and passes the time >- // spent on encoding. This timing will not fire if encoding takes longer >- // than the measuring window, since the sample data will have been dropped. >- EncodedFrameObserver* post_encode_callback = nullptr; >- > // Expected delay needed by the renderer, i.e. the frame will be delivered > // this many milliseconds, if possible, earlier than expected render time. > // Only valid if |local_renderer| is set. >@@ -143,6 +143,14 @@ class VideoSendStream { > // Track ID as specified during track creation. > std::string track_id; > >+ // An optional custom frame encryptor that allows the entire frame to be >+ // encrypted in whatever way the caller chooses. This is not required by >+ // default. >+ rtc::scoped_refptr<webrtc::FrameEncryptorInterface> frame_encryptor; >+ >+ // Per PeerConnection cryptography options. >+ CryptoOptions crypto_options; >+ > private: > // Access to the copy constructor is private to force use of the Copy() > // method for those exceptional cases where we do use it. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/codereview.settings b/Source/ThirdParty/libwebrtc/Source/webrtc/codereview.settings >index d8cf6125f4e3a522be58dd85cbc788ba82c6ac5c..1724eddc072defe1f41ff0ffde116c30446387d6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/codereview.settings >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/codereview.settings >@@ -1,5 +1,4 @@ > # This file is used by git-cl to get repository specific information. 
>-CC_LIST: webrtc-reviews@webrtc.org > CODE_REVIEW_SERVER: codereview.webrtc.org > GERRIT_HOST: True > PROJECT: webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/BUILD.gn >index abcfe9ad1683fbbb58707a7db652b39d3e9c2e0b..74b0c6029fde6877d9db30edeb915fa431f7a867 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/BUILD.gn >@@ -47,10 +47,10 @@ rtc_static_library("common_audio") { > deps = [ > ":common_audio_c", > ":sinc_resampler", >- "..:webrtc_common", > "../rtc_base:checks", > "../rtc_base:gtest_prod", > "../rtc_base:rtc_base_approved", >+ "../rtc_base:sanitizer", > "../rtc_base/memory:aligned_array", > "../rtc_base/memory:aligned_malloc", > "../rtc_base/system:arch", >@@ -181,7 +181,6 @@ rtc_source_set("common_audio_c") { > deps = [ > ":common_audio_c_arm_asm", > ":common_audio_cc", >- "..:webrtc_common", > "../rtc_base:checks", > "../rtc_base:compile_assert_c", > "../rtc_base:rtc_base_approved", >@@ -201,7 +200,6 @@ rtc_source_set("common_audio_cc") { > ] > > deps = [ >- "..:webrtc_common", > "../rtc_base:rtc_base_approved", > "../system_wrappers", > ] >@@ -212,7 +210,6 @@ rtc_source_set("sinc_resampler") { > "resampler/sinc_resampler.h", > ] > deps = [ >- "..:webrtc_common", > "../rtc_base:gtest_prod", > "../rtc_base:rtc_base_approved", > "../rtc_base/memory:aligned_malloc", >@@ -381,7 +378,6 @@ if (rtc_include_tests) { > ":fir_filter", > ":fir_filter_factory", > ":sinc_resampler", >- "..:webrtc_common", > "../rtc_base:checks", > "../rtc_base:rtc_base_approved", > "../rtc_base:rtc_base_tests_utils", >@@ -389,6 +385,7 @@ if (rtc_include_tests) { > "../system_wrappers:cpu_features_api", > "../test:fileutils", > "../test:test_main", >+ "../test:test_support", > "//testing/gtest", > ] > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/audio_converter.h 
b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/audio_converter.h >index 769d724e2fb69e0b432933e770caf29cb50c0ff2..24a5e72cee9040890a97e26bb0e023bc1f43869c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/audio_converter.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/audio_converter.h >@@ -11,6 +11,7 @@ > #ifndef COMMON_AUDIO_AUDIO_CONVERTER_H_ > #define COMMON_AUDIO_AUDIO_CONVERTER_H_ > >+#include <stddef.h> > #include <memory> > > #include "rtc_base/constructormagic.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/channel_buffer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/channel_buffer.cc >index 38d231e09d06471f99bd28ff2668f014dfa8e242..b9b8c25e37506ed3899f84f0a9f9dce7c9f01b1c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/channel_buffer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/channel_buffer.cc >@@ -10,6 +10,9 @@ > > #include "common_audio/channel_buffer.h" > >+#include <cstdint> >+ >+#include "common_audio/include/audio_util.h" > #include "rtc_base/checks.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/fir_filter_c.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/fir_filter_c.cc >index 3418434672ace3e21ab2af0e4c48f773d8faddf1..b6ec27ad26bda6fcb9b0c9248e8b37af3c24ec4d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/fir_filter_c.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/fir_filter_c.cc >@@ -11,11 +11,8 @@ > #include "common_audio/fir_filter_c.h" > > #include <string.h> >- > #include <memory> > >-#include "common_audio/fir_filter_neon.h" >-#include "common_audio/fir_filter_sse.h" > #include "rtc_base/checks.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/fir_filter_factory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/fir_filter_factory.cc >index 
3243b27ef9ad2df624c5ac7ed7ddfc9b77b6512a..19528e312ec54cc6175c358ec3ac79cf24de87bf 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/fir_filter_factory.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/fir_filter_factory.cc >@@ -13,12 +13,12 @@ > #include "common_audio/fir_filter_c.h" > #include "rtc_base/checks.h" > #include "rtc_base/system/arch.h" >-#include "system_wrappers/include/cpu_features_wrapper.h" > > #if defined(WEBRTC_HAS_NEON) > #include "common_audio/fir_filter_neon.h" > #elif defined(WEBRTC_ARCH_X86_FAMILY) > #include "common_audio/fir_filter_sse.h" >+#include "system_wrappers/include/cpu_features_wrapper.h" // kSSE2, WebRtc_G... > #endif > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/fir_filter_sse.h b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/fir_filter_sse.h >index 7707f933d9ee6b7b1305c5694c7bdda6530e01ca..b768a37aa1544b08a6cea3ec13778fc23346d4b5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/fir_filter_sse.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/fir_filter_sse.h >@@ -11,6 +11,7 @@ > #ifndef COMMON_AUDIO_FIR_FILTER_SSE_H_ > #define COMMON_AUDIO_FIR_FILTER_SSE_H_ > >+#include <stddef.h> > #include <memory> > > #include "common_audio/fir_filter.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/include/audio_util.h b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/include/audio_util.h >index de242a49be8b663484a8b9eb4e5faecb03827af4..bca5718044c90e86191a2218eebcca522efc1612 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/include/audio_util.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/include/audio_util.h >@@ -11,6 +11,7 @@ > #ifndef COMMON_AUDIO_INCLUDE_AUDIO_UTIL_H_ > #define COMMON_AUDIO_INCLUDE_AUDIO_UTIL_H_ > >+#include <stdint.h> > #include <algorithm> > #include <cmath> > #include <cstring> >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/real_fourier.h b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/real_fourier.h >index a3e4dd16a82c21349efa60828d97aa9c5782f815..4d0d8bf38ee90b62a681bd3c9405802d8f74fdf0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/real_fourier.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/real_fourier.h >@@ -11,6 +11,7 @@ > #ifndef COMMON_AUDIO_REAL_FOURIER_H_ > #define COMMON_AUDIO_REAL_FOURIER_H_ > >+#include <stddef.h> > #include <complex> > #include <memory> > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/real_fourier_ooura.h b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/real_fourier_ooura.h >index bb8eef96dff6ef103d39491fbd09d6db1ea1d072..b36c84f10b6b238212e637ab505b3d584e230ad3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/real_fourier_ooura.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/real_fourier_ooura.h >@@ -11,6 +11,7 @@ > #ifndef COMMON_AUDIO_REAL_FOURIER_OOURA_H_ > #define COMMON_AUDIO_REAL_FOURIER_OOURA_H_ > >+#include <stddef.h> > #include <complex> > #include <memory> > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/resampler/push_resampler.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/resampler/push_resampler.cc >index 318d97b72c28ad50f3b0337aaf3b3fd855a542c8..9b89867e5a0dc930335276ee0983c2a7396135ca 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/resampler/push_resampler.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/resampler/push_resampler.cc >@@ -10,12 +10,12 @@ > > #include "common_audio/resampler/include/push_resampler.h" > >+#include <stdint.h> > #include <string.h> > > #include "absl/container/inlined_vector.h" > #include "absl/memory/memory.h" > #include "common_audio/include/audio_util.h" >-#include "common_audio/resampler/include/resampler.h" > #include "common_audio/resampler/push_sinc_resampler.h" > 
#include "rtc_base/checks.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/resampler/push_sinc_resampler.h b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/resampler/push_sinc_resampler.h >index 1ffe73f79ba2806daa93268a1f8597dc1162a4b9..db9cdc1fd4b33e8a562bcb3226210be7ad428f5e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/resampler/push_sinc_resampler.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/resampler/push_sinc_resampler.h >@@ -11,6 +11,8 @@ > #ifndef COMMON_AUDIO_RESAMPLER_PUSH_SINC_RESAMPLER_H_ > #define COMMON_AUDIO_RESAMPLER_PUSH_SINC_RESAMPLER_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <memory> > > #include "common_audio/resampler/sinc_resampler.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/resampler/resampler.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/resampler/resampler.cc >index aa3a4bac2aa2bc23447ce28f57662cd96b3ed725..e4d2aa2b61559ed4509970f6e5549a27cff00578 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/resampler/resampler.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/resampler/resampler.cc >@@ -12,6 +12,7 @@ > * A wrapper for resampling a numerous amount of sampling combinations. 
> */ > >+#include <stdint.h> > #include <stdlib.h> > #include <string.h> > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/resampler/sinc_resampler.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/resampler/sinc_resampler.cc >index 5aa2061ab75ec4db122e09967566d107daf6bc41..46015140303112d3e2c7bd5d917d967d916a2402 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/resampler/sinc_resampler.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/resampler/sinc_resampler.cc >@@ -88,13 +88,13 @@ > #include "common_audio/resampler/sinc_resampler.h" > > #include <math.h> >+#include <stdint.h> > #include <string.h> >- > #include <limits> > > #include "rtc_base/checks.h" > #include "rtc_base/system/arch.h" >-#include "system_wrappers/include/cpu_features_wrapper.h" >+#include "system_wrappers/include/cpu_features_wrapper.h" // kSSE2, WebRtc_G... > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/resampler/sinc_resampler.h b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/resampler/sinc_resampler.h >index 8a833ce88c99a32f80bdffd82561862b5423a569..0be4318e33922070457de0595b879578ddddb72c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/resampler/sinc_resampler.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/resampler/sinc_resampler.h >@@ -14,6 +14,7 @@ > #ifndef COMMON_AUDIO_RESAMPLER_SINC_RESAMPLER_H_ > #define COMMON_AUDIO_RESAMPLER_SINC_RESAMPLER_H_ > >+#include <stddef.h> > #include <memory> > > #include "rtc_base/constructormagic.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/resampler/sinc_resampler_sse.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/resampler/sinc_resampler_sse.cc >index 3906a797528a425d03621ea170614fdd0e1bb927..f6a24d0a0e0d34174732e68a91f83204605fab22 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/resampler/sinc_resampler_sse.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/resampler/sinc_resampler_sse.cc >@@ -11,10 +11,12 @@ > // Modified from the Chromium original: > // src/media/base/simd/sinc_resampler_sse.cc > >-#include "common_audio/resampler/sinc_resampler.h" >- >+#include <stddef.h> >+#include <stdint.h> > #include <xmmintrin.h> > >+#include "common_audio/resampler/sinc_resampler.h" >+ > namespace webrtc { > > float SincResampler::Convolve_SSE(const float* input_ptr, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/smoothing_filter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/smoothing_filter.cc >index d426bda2501343cc721a5b2c0f52fb66c035290e..0d5aaa485415904eb3ccdb023fa2b37b81fd875a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/smoothing_filter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/smoothing_filter.cc >@@ -12,6 +12,7 @@ > > #include <cmath> > >+#include "rtc_base/checks.h" > #include "rtc_base/timeutils.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/smoothing_filter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/smoothing_filter.h >index cff746953aa80ad90cdd5d29e3d915171b35bf23..c467d85426956f888cf025cbcda59b8e40e0d011 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/smoothing_filter.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/smoothing_filter.h >@@ -11,9 +11,10 @@ > #ifndef COMMON_AUDIO_SMOOTHING_FILTER_H_ > #define COMMON_AUDIO_SMOOTHING_FILTER_H_ > >+#include <stdint.h> >+ > #include "absl/types/optional.h" > #include "rtc_base/constructormagic.h" >-#include "system_wrappers/include/clock.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/vad/vad.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/vad/vad.cc >index 1cb332a61d887fd24db15aea5b8c1ea7e8915a12..987ed526c00a9793ca97fca30b3248be66b06157 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/vad/vad.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/vad/vad.cc >@@ -12,6 +12,7 @@ > > #include <memory> > >+#include "common_audio/vad/include/webrtc_vad.h" > #include "rtc_base/checks.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/wav_file.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/wav_file.cc >index 0209f52d2c73592eaa63da25f2bd91d3e97bee85..90a89281e5cd48dc2e32c20553c00f1d1f8785be 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/wav_file.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/wav_file.cc >@@ -10,35 +10,45 @@ > > #include "common_audio/wav_file.h" > >+#include <errno.h> > #include <algorithm> > #include <cstdio> >-#include <limits> >+#include <type_traits> > > #include "common_audio/include/audio_util.h" > #include "common_audio/wav_header.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" >-#include "rtc_base/numerics/safe_conversions.h" > #include "rtc_base/system/arch.h" > > namespace webrtc { >+namespace { > > // We write 16-bit PCM WAV files. >-static const WavFormat kWavFormat = kWavFormatPcm; >-static const size_t kBytesPerSample = 2; >+constexpr WavFormat kWavFormat = kWavFormatPcm; >+static_assert(std::is_trivially_destructible<WavFormat>::value, ""); >+constexpr size_t kBytesPerSample = 2; > > // Doesn't take ownership of the file handle and won't close it. 
> class ReadableWavFile : public ReadableWav { > public: > explicit ReadableWavFile(FILE* file) : file_(file) {} >+ ReadableWavFile(const ReadableWavFile&) = delete; >+ ReadableWavFile& operator=(const ReadableWavFile&) = delete; > size_t Read(void* buf, size_t num_bytes) override { > return fread(buf, 1, num_bytes, file_); > } >+ bool Eof() const override { return feof(file_) != 0; } >+ bool SeekForward(uint32_t num_bytes) override { >+ return fseek(file_, num_bytes, SEEK_CUR) == 0; >+ } > > private: > FILE* file_; > }; > >+} // namespace >+ > WavReader::WavReader(const std::string& filename) > : WavReader(rtc::OpenPlatformFileReadOnly(filename)) {} > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/wav_header.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/wav_header.cc >index 8fc5fef5085230668379dec0f29476430552c673..e49b7489b2efa3eaed6abae6d6a7243100dcef36 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/wav_header.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/wav_header.cc >@@ -14,13 +14,13 @@ > > #include "common_audio/wav_header.h" > >-#include <algorithm> > #include <cstring> > #include <limits> > #include <string> > >-#include "common_audio/include/audio_util.h" > #include "rtc_base/checks.h" >+#include "rtc_base/logging.h" >+#include "rtc_base/sanitizer.h" > #include "rtc_base/system/arch.h" > > namespace webrtc { >@@ -32,6 +32,11 @@ struct ChunkHeader { > }; > static_assert(sizeof(ChunkHeader) == 8, "ChunkHeader size"); > >+struct RiffHeader { >+ ChunkHeader header; >+ uint32_t Format; >+}; >+ > // We can't nest this definition in WavHeader, because VS2013 gives an error > // on sizeof(WavHeader::fmt): "error C2070: 'unknown': illegal sizeof operand". > struct FmtSubchunk { >@@ -46,11 +51,12 @@ struct FmtSubchunk { > static_assert(sizeof(FmtSubchunk) == 24, "FmtSubchunk size"); > const uint32_t kFmtSubchunkSize = sizeof(FmtSubchunk) - sizeof(ChunkHeader); > >+// Simple wav header. 
It does not include chunks that are not essential to read >+// audio samples. > struct WavHeader { >- struct { >- ChunkHeader header; >- uint32_t Format; >- } riff; >+ WavHeader(const WavHeader&) = default; >+ WavHeader& operator=(const WavHeader&) = default; >+ RiffHeader riff; > FmtSubchunk fmt; > struct { > ChunkHeader header; >@@ -58,6 +64,87 @@ struct WavHeader { > }; > static_assert(sizeof(WavHeader) == kWavHeaderSize, "no padding in header"); > >+#ifdef WEBRTC_ARCH_LITTLE_ENDIAN >+static inline void WriteLE16(uint16_t* f, uint16_t x) { >+ *f = x; >+} >+static inline void WriteLE32(uint32_t* f, uint32_t x) { >+ *f = x; >+} >+static inline void WriteFourCC(uint32_t* f, char a, char b, char c, char d) { >+ *f = static_cast<uint32_t>(a) | static_cast<uint32_t>(b) << 8 | >+ static_cast<uint32_t>(c) << 16 | static_cast<uint32_t>(d) << 24; >+} >+ >+static inline uint16_t ReadLE16(uint16_t x) { >+ return x; >+} >+static inline uint32_t ReadLE32(uint32_t x) { >+ return x; >+} >+static inline std::string ReadFourCC(uint32_t x) { >+ return std::string(reinterpret_cast<char*>(&x), 4); >+} >+#else >+#error "Write be-to-le conversion functions" >+#endif >+ >+static inline uint32_t RiffChunkSize(size_t bytes_in_payload) { >+ return static_cast<uint32_t>(bytes_in_payload + kWavHeaderSize - >+ sizeof(ChunkHeader)); >+} >+ >+static inline uint32_t ByteRate(size_t num_channels, >+ int sample_rate, >+ size_t bytes_per_sample) { >+ return static_cast<uint32_t>(num_channels * sample_rate * bytes_per_sample); >+} >+ >+static inline uint16_t BlockAlign(size_t num_channels, >+ size_t bytes_per_sample) { >+ return static_cast<uint16_t>(num_channels * bytes_per_sample); >+} >+ >+// Finds a chunk having the sought ID. If found, then |readable| points to the >+// first byte of the sought chunk data. If not found, the end of the file is >+// reached. 
>+void FindWaveChunk(ChunkHeader* chunk_header, >+ ReadableWav* readable, >+ const std::string sought_chunk_id) { >+ RTC_DCHECK_EQ(sought_chunk_id.size(), 4); >+ while (!readable->Eof()) { >+ if (readable->Read(chunk_header, sizeof(*chunk_header)) != >+ sizeof(*chunk_header)) >+ return; // EOF. >+ if (ReadFourCC(chunk_header->ID) == sought_chunk_id) >+ return; // Sought chunk found. >+ // Ignore current chunk by skipping its payload. >+ if (!readable->SeekForward(chunk_header->Size)) >+ return; // EOF or error. >+ } >+ return; // EOF. >+} >+ >+bool ReadFmtChunkData(FmtSubchunk* fmt_subchunk, ReadableWav* readable) { >+ // Reads "fmt " chunk payload. >+ if (readable->Read(&(fmt_subchunk->AudioFormat), kFmtSubchunkSize) != >+ kFmtSubchunkSize) >+ return false; >+ const uint32_t fmt_size = ReadLE32(fmt_subchunk->header.Size); >+ if (fmt_size != kFmtSubchunkSize) { >+ // There is an optional two-byte extension field permitted to be present >+ // with PCM, but which must be zero. >+ int16_t ext_size; >+ if (kFmtSubchunkSize + sizeof(ext_size) != fmt_size) >+ return false; >+ if (readable->Read(&ext_size, sizeof(ext_size)) != sizeof(ext_size)) >+ return false; >+ if (ext_size != 0) >+ return false; >+ } >+ return true; >+} >+ > } // namespace > > bool CheckWavParameters(size_t num_channels, >@@ -112,47 +199,6 @@ bool CheckWavParameters(size_t num_channels, > return true; > } > >-#ifdef WEBRTC_ARCH_LITTLE_ENDIAN >-static inline void WriteLE16(uint16_t* f, uint16_t x) { >- *f = x; >-} >-static inline void WriteLE32(uint32_t* f, uint32_t x) { >- *f = x; >-} >-static inline void WriteFourCC(uint32_t* f, char a, char b, char c, char d) { >- *f = static_cast<uint32_t>(a) | static_cast<uint32_t>(b) << 8 | >- static_cast<uint32_t>(c) << 16 | static_cast<uint32_t>(d) << 24; >-} >- >-static inline uint16_t ReadLE16(uint16_t x) { >- return x; >-} >-static inline uint32_t ReadLE32(uint32_t x) { >- return x; >-} >-static inline std::string ReadFourCC(uint32_t x) { >- return 
std::string(reinterpret_cast<char*>(&x), 4); >-} >-#else >-#error "Write be-to-le conversion functions" >-#endif >- >-static inline uint32_t RiffChunkSize(size_t bytes_in_payload) { >- return static_cast<uint32_t>(bytes_in_payload + kWavHeaderSize - >- sizeof(ChunkHeader)); >-} >- >-static inline uint32_t ByteRate(size_t num_channels, >- int sample_rate, >- size_t bytes_per_sample) { >- return static_cast<uint32_t>(num_channels * sample_rate * bytes_per_sample); >-} >- >-static inline uint16_t BlockAlign(size_t num_channels, >- size_t bytes_per_sample) { >- return static_cast<uint16_t>(num_channels * bytes_per_sample); >-} >- > void WriteWavHeader(uint8_t* buf, > size_t num_channels, > int sample_rate, >@@ -162,7 +208,7 @@ void WriteWavHeader(uint8_t* buf, > RTC_CHECK(CheckWavParameters(num_channels, sample_rate, format, > bytes_per_sample, num_samples)); > >- WavHeader header; >+ auto header = rtc::MsanUninitialized<WavHeader>({}); > const size_t bytes_in_payload = bytes_per_sample * num_samples; > > WriteFourCC(&header.riff.header.ID, 'R', 'I', 'F', 'F'); >@@ -194,25 +240,38 @@ bool ReadWavHeader(ReadableWav* readable, > WavFormat* format, > size_t* bytes_per_sample, > size_t* num_samples) { >- WavHeader header; >- if (readable->Read(&header, kWavHeaderSize - sizeof(header.data)) != >- kWavHeaderSize - sizeof(header.data)) >+ auto header = rtc::MsanUninitialized<WavHeader>({}); >+ >+ // Read RIFF chunk. >+ if (readable->Read(&header.riff, sizeof(header.riff)) != sizeof(header.riff)) >+ return false; >+ if (ReadFourCC(header.riff.header.ID) != "RIFF") >+ return false; >+ if (ReadFourCC(header.riff.Format) != "WAVE") > return false; > >- const uint32_t fmt_size = ReadLE32(header.fmt.header.Size); >- if (fmt_size != kFmtSubchunkSize) { >- // There is an optional two-byte extension field permitted to be present >- // with PCM, but which must be zero. 
>- int16_t ext_size; >- if (kFmtSubchunkSize + sizeof(ext_size) != fmt_size) >- return false; >- if (readable->Read(&ext_size, sizeof(ext_size)) != sizeof(ext_size)) >- return false; >- if (ext_size != 0) >- return false; >+ // Find "fmt " and "data" chunks. While the official Wave file specification >+ // does not put requirements on the chunk order, it is uncommon to find the >+ // "data" chunk before the "fmt " one. The code below fails if this is not the >+ // case. >+ FindWaveChunk(&header.fmt.header, readable, "fmt "); >+ if (ReadFourCC(header.fmt.header.ID) != "fmt ") { >+ RTC_LOG(LS_ERROR) << "Cannot find 'fmt ' chunk."; >+ return false; >+ } >+ if (!ReadFmtChunkData(&header.fmt, readable)) { >+ RTC_LOG(LS_ERROR) << "Cannot read 'fmt ' chunk."; >+ return false; >+ } >+ if (readable->Eof()) { >+ RTC_LOG(LS_ERROR) << "'fmt ' chunk placed after 'data' chunk."; >+ return false; > } >- if (readable->Read(&header.data, sizeof(header.data)) != sizeof(header.data)) >+ FindWaveChunk(&header.data.header, readable, "data"); >+ if (ReadFourCC(header.data.header.ID) != "data") { >+ RTC_LOG(LS_ERROR) << "Cannot find 'data' chunk."; > return false; >+ } > > // Parse needed fields. > *format = static_cast<WavFormat>(ReadLE16(header.fmt.AudioFormat)); >@@ -224,16 +283,6 @@ bool ReadWavHeader(ReadableWav* readable, > return false; > *num_samples = bytes_in_payload / *bytes_per_sample; > >- // Sanity check remaining fields. 
>- if (ReadFourCC(header.riff.header.ID) != "RIFF") >- return false; >- if (ReadFourCC(header.riff.Format) != "WAVE") >- return false; >- if (ReadFourCC(header.fmt.header.ID) != "fmt ") >- return false; >- if (ReadFourCC(header.data.header.ID) != "data") >- return false; >- > if (ReadLE32(header.riff.header.Size) < RiffChunkSize(bytes_in_payload)) > return false; > if (ReadLE32(header.fmt.ByteRate) != >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/wav_header.h b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/wav_header.h >index 872d3abce5ca4cdc940ca91440441e7237bcbe0b..a519ba556f54f0365494beb637d8d6b7ba249670 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/wav_header.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/wav_header.h >@@ -21,8 +21,11 @@ static const size_t kWavHeaderSize = 44; > class ReadableWav { > public: > // Returns the number of bytes read. >- size_t virtual Read(void* buf, size_t num_bytes) = 0; >- virtual ~ReadableWav() {} >+ virtual size_t Read(void* buf, size_t num_bytes) = 0; >+ // Returns true if the end-of-file has been reached. >+ virtual bool Eof() const = 0; >+ virtual bool SeekForward(uint32_t num_bytes) = 0; >+ virtual ~ReadableWav() = default; > }; > > enum WavFormat { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/wav_header_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/wav_header_unittest.cc >index b7169b53c8f1b1a1c1c4fd24bcde860b410817ee..2e2eda5a6f2d08f23cfba5dcd49f9c3fc5a7a0dd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/wav_header_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/wav_header_unittest.cc >@@ -19,12 +19,6 @@ namespace webrtc { > // Doesn't take ownership of the buffer. 
> class ReadableWavBuffer : public ReadableWav { > public: >- ReadableWavBuffer(const uint8_t* buf, size_t size) >- : buf_(buf), >- size_(size), >- pos_(0), >- buf_exhausted_(false), >- check_read_size_(true) {} > ReadableWavBuffer(const uint8_t* buf, size_t size, bool check_read_size) > : buf_(buf), > size_(size), >@@ -57,6 +51,27 @@ class ReadableWavBuffer : public ReadableWav { > return num_bytes; > } > >+ bool Eof() const override { return pos_ == size_; } >+ >+ bool SeekForward(uint32_t num_bytes) override { >+ // Verify we don't try to read outside of a properly sized header. >+ if (size_ >= kWavHeaderSize) >+ EXPECT_GE(size_, pos_ + num_bytes); >+ EXPECT_FALSE(buf_exhausted_); >+ >+ const size_t bytes_remaining = size_ - pos_; >+ if (num_bytes > bytes_remaining) { >+ // Error: cannot seek beyond EOF. >+ return false; >+ } >+ if (num_bytes == bytes_remaining) { >+ // There should not be another read attempt after this point. >+ buf_exhausted_ = true; >+ } >+ pos_ += num_bytes; >+ return true; >+ } >+ > private: > const uint8_t* buf_; > const size_t size_; >@@ -103,7 +118,7 @@ TEST(WavHeaderTest, ReadWavHeaderWithErrors) { > // invalid field is indicated in the array name, and in the comments with > // *BAD*. > { >- static const uint8_t kBadRiffID[] = { >+ constexpr uint8_t kBadRiffID[] = { > // clang-format off > // clang formatting doesn't respect inline comments. 
> 'R', 'i', 'f', 'f', // *BAD* >@@ -121,12 +136,13 @@ TEST(WavHeaderTest, ReadWavHeaderWithErrors) { > 0x99, 0xd0, 0x5b, 0x07, // size of payload: 123457689 > // clang-format on > }; >- ReadableWavBuffer r(kBadRiffID, sizeof(kBadRiffID)); >+ ReadableWavBuffer r(kBadRiffID, sizeof(kBadRiffID), >+ /*check_read_size=*/false); > EXPECT_FALSE(ReadWavHeader(&r, &num_channels, &sample_rate, &format, > &bytes_per_sample, &num_samples)); > } > { >- static const uint8_t kBadBitsPerSample[] = { >+ constexpr uint8_t kBadBitsPerSample[] = { > // clang-format off > // clang formatting doesn't respect inline comments. > 'R', 'I', 'F', 'F', >@@ -144,12 +160,13 @@ TEST(WavHeaderTest, ReadWavHeaderWithErrors) { > 0x99, 0xd0, 0x5b, 0x07, // size of payload: 123457689 > // clang-format on > }; >- ReadableWavBuffer r(kBadBitsPerSample, sizeof(kBadBitsPerSample)); >+ ReadableWavBuffer r(kBadBitsPerSample, sizeof(kBadBitsPerSample), >+ /*check_read_size=*/true); > EXPECT_FALSE(ReadWavHeader(&r, &num_channels, &sample_rate, &format, > &bytes_per_sample, &num_samples)); > } > { >- static const uint8_t kBadByteRate[] = { >+ constexpr uint8_t kBadByteRate[] = { > // clang-format off > // clang formatting doesn't respect inline comments. > 'R', 'I', 'F', 'F', >@@ -167,12 +184,13 @@ TEST(WavHeaderTest, ReadWavHeaderWithErrors) { > 0x99, 0xd0, 0x5b, 0x07, // size of payload: 123457689 > // clang-format on > }; >- ReadableWavBuffer r(kBadByteRate, sizeof(kBadByteRate)); >+ ReadableWavBuffer r(kBadByteRate, sizeof(kBadByteRate), >+ /*check_read_size=*/true); > EXPECT_FALSE(ReadWavHeader(&r, &num_channels, &sample_rate, &format, > &bytes_per_sample, &num_samples)); > } > { >- static const uint8_t kBadFmtHeaderSize[] = { >+ constexpr uint8_t kBadFmtHeaderSize[] = { > // clang-format off > // clang formatting doesn't respect inline comments. 
> 'R', 'I', 'F', 'F', >@@ -191,12 +209,13 @@ TEST(WavHeaderTest, ReadWavHeaderWithErrors) { > 0x99, 0xd0, 0x5b, 0x07, // size of payload: 123457689 > // clang-format on > }; >- ReadableWavBuffer r(kBadFmtHeaderSize, sizeof(kBadFmtHeaderSize), false); >+ ReadableWavBuffer r(kBadFmtHeaderSize, sizeof(kBadFmtHeaderSize), >+ /*check_read_size=*/false); > EXPECT_FALSE(ReadWavHeader(&r, &num_channels, &sample_rate, &format, > &bytes_per_sample, &num_samples)); > } > { >- static const uint8_t kNonZeroExtensionField[] = { >+ constexpr uint8_t kNonZeroExtensionField[] = { > // clang-format off > // clang formatting doesn't respect inline comments. > 'R', 'I', 'F', 'F', >@@ -216,12 +235,12 @@ TEST(WavHeaderTest, ReadWavHeaderWithErrors) { > // clang-format on > }; > ReadableWavBuffer r(kNonZeroExtensionField, sizeof(kNonZeroExtensionField), >- false); >+ /*check_read_size=*/false); > EXPECT_FALSE(ReadWavHeader(&r, &num_channels, &sample_rate, &format, > &bytes_per_sample, &num_samples)); > } > { >- static const uint8_t kMissingDataChunk[] = { >+ constexpr uint8_t kMissingDataChunk[] = { > // clang-format off > // clang formatting doesn't respect inline comments. > 'R', 'I', 'F', 'F', >@@ -237,12 +256,13 @@ TEST(WavHeaderTest, ReadWavHeaderWithErrors) { > 8, 0, // bits per sample: 1 * 8 > // clang-format on > }; >- ReadableWavBuffer r(kMissingDataChunk, sizeof(kMissingDataChunk)); >+ ReadableWavBuffer r(kMissingDataChunk, sizeof(kMissingDataChunk), >+ /*check_read_size=*/true); > EXPECT_FALSE(ReadWavHeader(&r, &num_channels, &sample_rate, &format, > &bytes_per_sample, &num_samples)); > } > { >- static const uint8_t kMissingFmtAndDataChunks[] = { >+ constexpr uint8_t kMissingFmtAndDataChunks[] = { > // clang-format off > // clang formatting doesn't respect inline comments. 
> 'R', 'I', 'F', 'F', >@@ -251,7 +271,8 @@ TEST(WavHeaderTest, ReadWavHeaderWithErrors) { > // clang-format on > }; > ReadableWavBuffer r(kMissingFmtAndDataChunks, >- sizeof(kMissingFmtAndDataChunks)); >+ sizeof(kMissingFmtAndDataChunks), >+ /*check_read_size=*/true); > EXPECT_FALSE(ReadWavHeader(&r, &num_channels, &sample_rate, &format, > &bytes_per_sample, &num_samples)); > } >@@ -259,11 +280,11 @@ TEST(WavHeaderTest, ReadWavHeaderWithErrors) { > > // Try writing and reading a valid WAV header and make sure it looks OK. > TEST(WavHeaderTest, WriteAndReadWavHeader) { >- static const int kSize = 4 + kWavHeaderSize + 4; >+ constexpr int kSize = 4 + kWavHeaderSize + 4; > uint8_t buf[kSize]; > memset(buf, 0xa4, sizeof(buf)); > WriteWavHeader(buf + 4, 17, 12345, kWavFormatALaw, 1, 123457689); >- static const uint8_t kExpectedBuf[] = { >+ constexpr uint8_t kExpectedBuf[] = { > // clang-format off > // clang formatting doesn't respect inline comments. > 0xa4, 0xa4, 0xa4, 0xa4, // untouched bytes before header >@@ -291,7 +312,8 @@ TEST(WavHeaderTest, WriteAndReadWavHeader) { > WavFormat format = kWavFormatPcm; > size_t bytes_per_sample = 0; > size_t num_samples = 0; >- ReadableWavBuffer r(buf + 4, sizeof(buf) - 8); >+ ReadableWavBuffer r(buf + 4, sizeof(buf) - 8, >+ /*check_read_size=*/true); > EXPECT_TRUE(ReadWavHeader(&r, &num_channels, &sample_rate, &format, > &bytes_per_sample, &num_samples)); > EXPECT_EQ(17u, num_channels); >@@ -303,24 +325,67 @@ TEST(WavHeaderTest, WriteAndReadWavHeader) { > > // Try reading an atypical but valid WAV header and make sure it's parsed OK. > TEST(WavHeaderTest, ReadAtypicalWavHeader) { >- static const uint8_t kBuf[] = { >+ constexpr uint8_t kBuf[] = { > // clang-format off > // clang formatting doesn't respect inline comments. > 'R', 'I', 'F', 'F', >- 0x3d, 0xd1, 0x5b, 0x07, // size of whole file - 8 + an extra 128 bytes of >- // "metadata": 123457689 + 44 - 8 + 128. 
(atypical) >+ 0xbf, 0xd0, 0x5b, 0x07, // Size of whole file - 8 + extra 2 bytes of zero >+ // extension: 123457689 + 44 - 8 + 2 (atypical). > 'W', 'A', 'V', 'E', > 'f', 'm', 't', ' ', >- 18, 0, 0, 0, // size of fmt block (with an atypical extension size field) >- 6, 0, // format: A-law (6) >- 17, 0, // channels: 17 >- 0x39, 0x30, 0, 0, // sample rate: 12345 >- 0xc9, 0x33, 0x03, 0, // byte rate: 1 * 17 * 12345 >- 17, 0, // block align: NumChannels * BytesPerSample >- 8, 0, // bits per sample: 1 * 8 >- 0, 0, // zero extension size field (atypical) >+ 18, 0, 0, 0, // Size of fmt block (with an atypical extension >+ // size field). >+ 6, 0, // Format: A-law (6). >+ 17, 0, // Channels: 17. >+ 0x39, 0x30, 0, 0, // Sample rate: 12345. >+ 0xc9, 0x33, 0x03, 0, // Byte rate: 1 * 17 * 12345. >+ 17, 0, // Block align: NumChannels * BytesPerSample. >+ 8, 0, // Bits per sample: 1 * 8. >+ 0, 0, // Zero extension size field (atypical). > 'd', 'a', 't', 'a', >- 0x99, 0xd0, 0x5b, 0x07, // size of payload: 123457689 >+ 0x99, 0xd0, 0x5b, 0x07, // Size of payload: 123457689. >+ // clang-format on >+ }; >+ >+ size_t num_channels = 0; >+ int sample_rate = 0; >+ WavFormat format = kWavFormatPcm; >+ size_t bytes_per_sample = 0; >+ size_t num_samples = 0; >+ ReadableWavBuffer r(kBuf, sizeof(kBuf), /*check_read_size=*/true); >+ EXPECT_TRUE(ReadWavHeader(&r, &num_channels, &sample_rate, &format, >+ &bytes_per_sample, &num_samples)); >+ EXPECT_EQ(17u, num_channels); >+ EXPECT_EQ(12345, sample_rate); >+ EXPECT_EQ(kWavFormatALaw, format); >+ EXPECT_EQ(1u, bytes_per_sample); >+ EXPECT_EQ(123457689u, num_samples); >+} >+ >+// Try reading a valid WAV header which contains an optional chunk and make sure >+// it's parsed OK. >+TEST(WavHeaderTest, ReadWavHeaderWithOptionalChunk) { >+ constexpr uint8_t kBuf[] = { >+ // clang-format off >+ // clang formatting doesn't respect inline comments. 
>+ 'R', 'I', 'F', 'F', >+ 0xcd, 0xd0, 0x5b, 0x07, // Size of whole file - 8 + an extra 16 bytes of >+ // "metadata" (8 bytes header, 16 bytes payload): >+ // 123457689 + 44 - 8 + 16. >+ 'W', 'A', 'V', 'E', >+ 'f', 'm', 't', ' ', >+ 16, 0, 0, 0, // Size of fmt block. >+ 6, 0, // Format: A-law (6). >+ 17, 0, // Channels: 17. >+ 0x39, 0x30, 0, 0, // Sample rate: 12345. >+ 0xc9, 0x33, 0x03, 0, // Byte rate: 1 * 17 * 12345. >+ 17, 0, // Block align: NumChannels * BytesPerSample. >+ 8, 0, // Bits per sample: 1 * 8. >+ 'L', 'I', 'S', 'T', // Metadata chunk ID. >+ 16, 0, 0, 0, // Metadata chunk payload size. >+ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, // Metadata (16 bytes). >+ 'd', 'a', 't', 'a', >+ 0x99, 0xd0, 0x5b, 0x07, // Size of payload: 123457689. > // clang-format on >+ }; > >@@ -329,7 +394,7 @@ TEST(WavHeaderTest, ReadAtypicalWavHeader) { > WavFormat format = kWavFormatPcm; > size_t bytes_per_sample = 0; > size_t num_samples = 0; >- ReadableWavBuffer r(kBuf, sizeof(kBuf)); >+ ReadableWavBuffer r(kBuf, sizeof(kBuf), /*check_read_size=*/true); > EXPECT_TRUE(ReadWavHeader(&r, &num_channels, &sample_rate, &format, > &bytes_per_sample, &num_samples)); > EXPECT_EQ(17u, num_channels); >@@ -339,4 +404,37 @@ TEST(WavHeaderTest, ReadAtypicalWavHeader) { > EXPECT_EQ(123457689u, num_samples); > } > >+// Try reading an invalid WAV header which has the data chunk before the >+// format one and make sure it's not parsed. >+TEST(WavHeaderTest, ReadWavHeaderWithDataBeforeFormat) { >+ constexpr uint8_t kBuf[] = { >+ // clang-format off >+ // clang formatting doesn't respect inline comments. >+ 'R', 'I', 'F', 'F', >+ 52, 0, 0, 0, // Size of whole file - 8: 16 + 44 - 8. >+ 'W', 'A', 'V', 'E', >+ 'd', 'a', 't', 'a', >+ 16, 0, 0, 0, // Data chunk payload size. >+ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, // Data 16 bytes. >+ 'f', 'm', 't', ' ', >+ 16, 0, 0, 0, // Size of fmt block. >+ 6, 0, // Format: A-law (6). >+ 1, 0, // Channels: 1. 
>+ 60, 0, 0, 0, // Sample rate: 60. >+ 60, 0, 0, 0, // Byte rate: 1 * 1 * 60. >+ 1, 0, // Block align: NumChannels * BytesPerSample. >+ 8, 0, // Bits per sample: 1 * 8. >+ // clang-format on >+ }; >+ >+ size_t num_channels = 0; >+ int sample_rate = 0; >+ WavFormat format = kWavFormatPcm; >+ size_t bytes_per_sample = 0; >+ size_t num_samples = 0; >+ ReadableWavBuffer r(kBuf, sizeof(kBuf), /*check_read_size=*/false); >+ EXPECT_FALSE(ReadWavHeader(&r, &num_channels, &sample_rate, &format, >+ &bytes_per_sample, &num_samples)); >+} >+ > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_types.h b/Source/ThirdParty/libwebrtc/Source/webrtc/common_types.h >index 99c4064a3cf660d1326ad590e4f68fa34443ed0a..848b899a0dd1f489c526d0b25451a5504c2b2b38 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_types.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_types.h >@@ -11,16 +11,15 @@ > #ifndef COMMON_TYPES_H_ > #define COMMON_TYPES_H_ > >-#include <stddef.h> >-#include <string.h> >-#include <string> >-#include <vector> >+#include <stddef.h> // For size_t >+#include <cstdint> > >-#include "api/array_view.h" >+#include "absl/strings/match.h" > // TODO(sprang): Remove this include when all usage includes it directly. > #include "api/video/video_bitrate_allocation.h" >+// TODO(bugs.webrtc.org/7660): Delete include once downstream code is updated. >+#include "api/video/video_codec_type.h" > #include "rtc_base/checks.h" >-#include "rtc_base/deprecation.h" > > #if defined(_MSC_VER) > // Disable "new behavior: elements of array will be default initialized" >@@ -30,16 +29,6 @@ > > #define RTP_PAYLOAD_NAME_SIZE 32u > >-#if defined(WEBRTC_WIN) || defined(WIN32) >-// Compares two strings without regard to case. >-#define STR_CASE_CMP(s1, s2) ::_stricmp(s1, s2) >-// Compares characters of two strings without regard to case. 
>-#define STR_NCASE_CMP(s1, s2, n) ::_strnicmp(s1, s2, n) >-#else >-#define STR_CASE_CMP(s1, s2) ::strcasecmp(s1, s2) >-#define STR_NCASE_CMP(s1, s2, n) ::strncasecmp(s1, s2, n) >-#endif >- > namespace webrtc { > > enum FrameType { >@@ -50,29 +39,6 @@ enum FrameType { > kVideoFrameDelta = 4, > }; > >-// Statistics for an RTCP channel >-struct RtcpStatistics { >- RtcpStatistics() >- : fraction_lost(0), >- packets_lost(0), >- extended_highest_sequence_number(0), >- jitter(0) {} >- >- uint8_t fraction_lost; >- int32_t packets_lost; // Defined as a 24 bit signed integer in RTCP >- uint32_t extended_highest_sequence_number; >- uint32_t jitter; >-}; >- >-class RtcpStatisticsCallback { >- public: >- virtual ~RtcpStatisticsCallback() {} >- >- virtual void StatisticsUpdated(const RtcpStatistics& statistics, >- uint32_t ssrc) = 0; >- virtual void CNameChanged(const char* cname, uint32_t ssrc) = 0; >-}; >- > // Statistics for RTCP packet types. > struct RtcpPacketTypeCounter { > RtcpPacketTypeCounter() >@@ -207,7 +173,7 @@ struct CodecInst { > > bool operator==(const CodecInst& other) const { > return pltype == other.pltype && >- (STR_CASE_CMP(plname, other.plname) == 0) && >+ absl::EqualsIgnoreCase(plname, other.plname) && > plfreq == other.plfreq && pacsize == other.pacsize && > channels == other.channels && rate == other.rate; > } >@@ -218,80 +184,6 @@ struct CodecInst { > // RTP > enum { kRtpCsrcSize = 15 }; // RFC 3550 page 13 > >-// NETEQ statistics. >-struct NetworkStatistics { >- // current jitter buffer size in ms >- uint16_t currentBufferSize; >- // preferred (optimal) buffer size in ms >- uint16_t preferredBufferSize; >- // adding extra delay due to "peaky jitter" >- bool jitterPeaksFound; >- // Stats below correspond to similarly-named fields in the WebRTC stats spec. 
>- // https://w3c.github.io/webrtc-stats/#dom-rtcmediastreamtrackstats >- uint64_t totalSamplesReceived; >- uint64_t concealedSamples; >- uint64_t concealmentEvents; >- uint64_t jitterBufferDelayMs; >- // Stats below DO NOT correspond directly to anything in the WebRTC stats >- // Loss rate (network + late); fraction between 0 and 1, scaled to Q14. >- uint16_t currentPacketLossRate; >- // Late loss rate; fraction between 0 and 1, scaled to Q14. >- union { >- RTC_DEPRECATED uint16_t currentDiscardRate; >- }; >- // fraction (of original stream) of synthesized audio inserted through >- // expansion (in Q14) >- uint16_t currentExpandRate; >- // fraction (of original stream) of synthesized speech inserted through >- // expansion (in Q14) >- uint16_t currentSpeechExpandRate; >- // fraction of synthesized speech inserted through pre-emptive expansion >- // (in Q14) >- uint16_t currentPreemptiveRate; >- // fraction of data removed through acceleration (in Q14) >- uint16_t currentAccelerateRate; >- // fraction of data coming from secondary decoding (in Q14) >- uint16_t currentSecondaryDecodedRate; >- // Fraction of secondary data, including FEC and RED, that is discarded (in >- // Q14). Discarding of secondary data can be caused by the reception of the >- // primary data, obsoleting the secondary data. It can also be caused by early >- // or late arrival of secondary data. 
>- uint16_t currentSecondaryDiscardedRate; >- // clock-drift in parts-per-million (negative or positive) >- int32_t clockDriftPPM; >- // average packet waiting time in the jitter buffer (ms) >- int meanWaitingTimeMs; >- // median packet waiting time in the jitter buffer (ms) >- int medianWaitingTimeMs; >- // min packet waiting time in the jitter buffer (ms) >- int minWaitingTimeMs; >- // max packet waiting time in the jitter buffer (ms) >- int maxWaitingTimeMs; >- // added samples in off mode due to packet loss >- size_t addedSamples; >-}; >- >-// Statistics for calls to AudioCodingModule::PlayoutData10Ms(). >-struct AudioDecodingCallStats { >- AudioDecodingCallStats() >- : calls_to_silence_generator(0), >- calls_to_neteq(0), >- decoded_normal(0), >- decoded_plc(0), >- decoded_cng(0), >- decoded_plc_cng(0), >- decoded_muted_output(0) {} >- >- int calls_to_silence_generator; // Number of calls where silence generated, >- // and NetEq was disengaged from decoding. >- int calls_to_neteq; // Number of calls to NetEq. >- int decoded_normal; // Number of calls where audio RTP packet decoded. >- int decoded_plc; // Number of calls resulted in PLC. >- int decoded_cng; // Number of calls where comfort noise generated due to DTX. >- int decoded_plc_cng; // Number of calls resulted where PLC faded to CNG. >- int decoded_muted_output; // Number of calls returning a muted state output. >-}; >- > // ================================================================== > // Video specific types > // ================================================================== >@@ -330,18 +222,6 @@ enum Profile { > > } // namespace H264 > >-// Video codec types >-enum VideoCodecType { >- // There are various memset(..., 0, ...) calls in the code that rely on >- // kVideoCodecGeneric being zero. 
>- kVideoCodecGeneric = 0, >- kVideoCodecVP8, >- kVideoCodecVP9, >- kVideoCodecH264, >- kVideoCodecI420, >- kVideoCodecMultiplex, >-}; >- > struct SpatialLayer { > bool operator==(const SpatialLayer& other) const; > bool operator!=(const SpatialLayer& other) const { return !(*this == other); } >@@ -361,9 +241,6 @@ struct SpatialLayer { > // settings such as resolution. > typedef SpatialLayer SimulcastStream; > >-// TODO(sprang): Remove this when downstream projects have been updated. >-using BitrateAllocation = VideoBitrateAllocation; >- > // Bandwidth over-use detector options. These are used to drive > // experimentation with bandwidth estimation parameters. > // See modules/remote_bitrate_estimator/overuse_detector.h >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/BUILD.gn >index 43780c113d507fb4b6682f9fef83be530baf7597..125bcfc558337e0e23c71e8929f63d088f9d4697 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/BUILD.gn >@@ -26,7 +26,6 @@ rtc_static_library("common_video") { > "h264/sps_vui_rewriter.h", > "i420_buffer_pool.cc", > "include/bitrate_adjuster.h", >- "include/frame_callback.h", > "include/i420_buffer_pool.h", > "include/incoming_video_stream.h", > "include/video_frame.h", >@@ -34,7 +33,6 @@ rtc_static_library("common_video") { > "incoming_video_stream.cc", > "libyuv/include/webrtc_libyuv.h", > "libyuv/webrtc_libyuv.cc", >- "video_frame.cc", > "video_frame_buffer.cc", > "video_render_frames.cc", > "video_render_frames.h", >@@ -42,6 +40,7 @@ rtc_static_library("common_video") { > > deps = [ > "..:webrtc_common", >+ "../api/video:encoded_image", > "../api/video:video_bitrate_allocation", > "../api/video:video_bitrate_allocator", > "../api/video:video_frame", >@@ -96,6 +95,7 @@ if (rtc_include_tests) { > "../rtc_base:rtc_base_tests_utils", > "../test:fileutils", > 
"../test:test_main", >+ "../test:test_support", > "../test:video_test_common", > "//testing/gtest", > "//third_party/libyuv", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/bitrate_adjuster.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/bitrate_adjuster.cc >index 163c4b1981474b392bb8707206f7ad5785f969ea..ac21f2bd20387a6731727b48010ebc205ae12de2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/bitrate_adjuster.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/bitrate_adjuster.cc >@@ -13,7 +13,6 @@ > #include <algorithm> > #include <cmath> > >-#include "rtc_base/checks.h" > #include "rtc_base/logging.h" > #include "rtc_base/timeutils.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/h264_bitstream_parser.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/h264_bitstream_parser.cc >index 2e63b9d4a16ef5e61a3cfcb1f5eb04111c9c9459..f3d2f8a6bf804de31433cd2d7cb54aa00a8b53aa 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/h264_bitstream_parser.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/h264_bitstream_parser.cc >@@ -9,13 +9,12 @@ > */ > #include "common_video/h264/h264_bitstream_parser.h" > >-#include <memory> >+#include <stdlib.h> >+#include <cstdint> > #include <vector> > >-#include "rtc_base/bitbuffer.h" >-#include "rtc_base/checks.h" >- > #include "common_video/h264/h264_common.h" >+#include "rtc_base/bitbuffer.h" > #include "rtc_base/logging.h" > > namespace { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/h264_bitstream_parser.h b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/h264_bitstream_parser.h >index b3fac7bb215726eff82ff33aa1e2374aceee3232..962c9c16ad2f46fbad7e47d3c264f5866a21ba28 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/h264_bitstream_parser.h >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/h264_bitstream_parser.h >@@ -17,10 +17,6 @@ > #include "common_video/h264/pps_parser.h" > #include "common_video/h264/sps_parser.h" > >-namespace rtc { >-class BitBufferWriter; >-} >- > namespace webrtc { > > // Stateful H264 bitstream parser (due to SPS/PPS). Used to parse out QP values >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/h264_common.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/h264_common.cc >index 60ed94d40311675865ef844cf88465686eeb0f51..5e58ba62e91308534fdfbbeaf9427b7df0696caf 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/h264_common.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/h264_common.cc >@@ -10,6 +10,8 @@ > > #include "common_video/h264/h264_common.h" > >+#include <cstdint> >+ > namespace webrtc { > namespace H264 { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/h264_common.h b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/h264_common.h >index a58a9f55a59cdda46079222f7a2ba5682ec52341..027833b7cd7c26fe8d0f88c910f1db8894dd6444 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/h264_common.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/h264_common.h >@@ -11,7 +11,8 @@ > #ifndef COMMON_VIDEO_H264_H264_COMMON_H_ > #define COMMON_VIDEO_H264_H264_COMMON_H_ > >-#include <memory> >+#include <stddef.h> >+#include <stdint.h> > #include <vector> > > #include "rtc_base/buffer.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/pps_parser.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/pps_parser.cc >index 5bc29f359257d1cdf14deac3b59ceada544698cd..464f6081e31e0c5b653b467ef97b31b8e08f6f02 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/pps_parser.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/pps_parser.cc >@@ -10,12 +10,12 
@@ > > #include "common_video/h264/pps_parser.h" > >-#include <memory> >+#include <cstdint> > #include <vector> > > #include "common_video/h264/h264_common.h" > #include "rtc_base/bitbuffer.h" >-#include "rtc_base/logging.h" >+#include "rtc_base/checks.h" > > #define RETURN_EMPTY_ON_FAIL(x) \ > if (!(x)) { \ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/sps_parser.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/sps_parser.cc >index b6799a3be40840461afda20a140bc2ef9b64e764..c6f6d47466b15cbf13cdf9f8cff9bc580d06fde8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/sps_parser.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/sps_parser.cc >@@ -10,12 +10,11 @@ > > #include "common_video/h264/sps_parser.h" > >-#include <memory> >+#include <cstdint> > #include <vector> > > #include "common_video/h264/h264_common.h" > #include "rtc_base/bitbuffer.h" >-#include "rtc_base/logging.h" > > namespace { > typedef absl::optional<webrtc::SpsParser::SpsState> OptionalSps; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/sps_vui_rewriter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/sps_vui_rewriter.cc >index 749b62e6d21d313af0407436c0dcb09d7052d9e0..3eab11f188cebb0ff8c83b1593d8bc35b522d82d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/sps_vui_rewriter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/sps_vui_rewriter.cc >@@ -11,18 +11,17 @@ > > #include "common_video/h264/sps_vui_rewriter.h" > >-#include <algorithm> >-#include <memory> >+#include <string.h> >+#include <cstdint> > #include <vector> > >+#include "common_video/h264/h264_common.h" >+#include "common_video/h264/sps_parser.h" > #include "rtc_base/bitbuffer.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" > #include "rtc_base/numerics/safe_minmax.h" > >-#include "common_video/h264/h264_common.h" >-#include 
"common_video/h264/sps_parser.h" >- > namespace webrtc { > > // The maximum expected growth from adding a VUI to the SPS. It's actually >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/sps_vui_rewriter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/sps_vui_rewriter.h >index 233051d2d61dfb6f85799c30badcd69b86d9d068..20d1dd0cdbfef688007b9c2362846fcae8714d07 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/sps_vui_rewriter.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/h264/sps_vui_rewriter.h >@@ -12,14 +12,13 @@ > #ifndef COMMON_VIDEO_H264_SPS_VUI_REWRITER_H_ > #define COMMON_VIDEO_H264_SPS_VUI_REWRITER_H_ > >+#include <stddef.h> >+#include <stdint.h> >+ > #include "absl/types/optional.h" > #include "common_video/h264/sps_parser.h" > #include "rtc_base/buffer.h" > >-namespace rtc { >-class BitBuffer; >-} >- > namespace webrtc { > > // A class that can parse an SPS block of a NAL unit and if necessary >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/i420_buffer_pool.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/i420_buffer_pool.cc >index c403eeb4fcba14a499f2b091040cd3158817ddb5..2e6cdc83b51b6bfb4f889f84851889ae1fcb08ed 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/i420_buffer_pool.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/i420_buffer_pool.cc >@@ -10,6 +10,8 @@ > > #include "common_video/include/i420_buffer_pool.h" > >+#include <limits> >+ > #include "rtc_base/checks.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/include/bitrate_adjuster.h b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/include/bitrate_adjuster.h >index ee312e4f9a452e759be39dd5a569df1c5c9dfd57..bc2c6bb84936d39b92957f0a482cc6996ed11ed0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/include/bitrate_adjuster.h >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/include/bitrate_adjuster.h >@@ -11,10 +11,13 @@ > #ifndef COMMON_VIDEO_INCLUDE_BITRATE_ADJUSTER_H_ > #define COMMON_VIDEO_INCLUDE_BITRATE_ADJUSTER_H_ > >-#include <functional> >+#include <stddef.h> >+#include <stdint.h> > >+#include "absl/types/optional.h" > #include "rtc_base/criticalsection.h" > #include "rtc_base/rate_statistics.h" >+#include "rtc_base/thread_annotations.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/include/frame_callback.h b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/include/frame_callback.h >deleted file mode 100644 >index a3883c1d8d2f688a3673b0ae362e7ef8bacb3945..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/include/frame_callback.h >+++ /dev/null >@@ -1,59 +0,0 @@ >-/* >- * Copyright (c) 2013 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#ifndef COMMON_VIDEO_INCLUDE_FRAME_CALLBACK_H_ >-#define COMMON_VIDEO_INCLUDE_FRAME_CALLBACK_H_ >- >-#include <stddef.h> >-#include <stdint.h> >- >-#include "common_types.h" // NOLINT(build/include) >- >-namespace webrtc { >- >-class VideoFrame; >- >-struct EncodedFrame { >- public: >- EncodedFrame() >- : data_(nullptr), >- length_(0), >- frame_type_(kEmptyFrame), >- stream_id_(0), >- timestamp_(0) {} >- EncodedFrame(const uint8_t* data, >- size_t length, >- FrameType frame_type, >- size_t stream_id, >- uint32_t timestamp) >- : data_(data), >- length_(length), >- frame_type_(frame_type), >- stream_id_(stream_id), >- timestamp_(timestamp) {} >- >- const uint8_t* data_; >- const size_t length_; >- const FrameType frame_type_; >- const size_t stream_id_; >- const uint32_t timestamp_; >-}; >- >-class EncodedFrameObserver { >- public: >- virtual void EncodedFrameCallback(const EncodedFrame& encoded_frame) = 0; >- >- protected: >- virtual ~EncodedFrameObserver() {} >-}; >- >-} // namespace webrtc >- >-#endif // COMMON_VIDEO_INCLUDE_FRAME_CALLBACK_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/include/i420_buffer_pool.h b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/include/i420_buffer_pool.h >index 79f7eec23af08fd0ea656874066659fd00286d87..2dcee195fad1e73b28b2cb0ec41e784f6374258a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/include/i420_buffer_pool.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/include/i420_buffer_pool.h >@@ -11,12 +11,13 @@ > #ifndef COMMON_VIDEO_INCLUDE_I420_BUFFER_POOL_H_ > #define COMMON_VIDEO_INCLUDE_I420_BUFFER_POOL_H_ > >-#include <limits> >+#include <stddef.h> > #include <list> > > #include "api/video/i420_buffer.h" > #include "rtc_base/race_checker.h" > #include "rtc_base/refcountedobject.h" >+#include "rtc_base/scoped_ref_ptr.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/include/incoming_video_stream.h 
b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/include/incoming_video_stream.h >index 8063061d31327228022d9e9b544d5b4477d3a00f..fe8d10e216a0dfbd08a68a8be9d0a6ef19759ff8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/include/incoming_video_stream.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/include/incoming_video_stream.h >@@ -11,10 +11,14 @@ > #ifndef COMMON_VIDEO_INCLUDE_INCOMING_VIDEO_STREAM_H_ > #define COMMON_VIDEO_INCLUDE_INCOMING_VIDEO_STREAM_H_ > >+#include <stdint.h> >+ >+#include "api/video/video_frame.h" > #include "api/video/video_sink_interface.h" > #include "common_video/video_render_frames.h" > #include "rtc_base/race_checker.h" > #include "rtc_base/task_queue.h" >+#include "rtc_base/thread_checker.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/include/video_frame.h b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/include/video_frame.h >index 565870e4dc36471824a86bf2ce44e0c1666767b6..ba280f2a8cdcbf86c8dbb0528ae82a973f25f07e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/include/video_frame.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/include/video_frame.h >@@ -11,91 +11,7 @@ > #ifndef COMMON_VIDEO_INCLUDE_VIDEO_FRAME_H_ > #define COMMON_VIDEO_INCLUDE_VIDEO_FRAME_H_ > >-// TODO(nisse): This header file should eventually be deleted. The >-// EncodedImage class stays in this file until we have figured out how >-// to refactor and clean up related interfaces, at which point it >-// should be moved to somewhere under api/. >- >-#include "absl/types/optional.h" >-#include "api/video/video_content_type.h" >-#include "api/video/video_rotation.h" >-#include "api/video/video_timing.h" >-#include "common_types.h" // NOLINT(build/include) >- >-namespace webrtc { >- >-// TODO(pbos): Rename EncodedFrame and reformat this class' members. 
>-class EncodedImage { >- public: >- static const size_t kBufferPaddingBytesH264; >- >- // Some decoders require encoded image buffers to be padded with a small >- // number of additional bytes (due to over-reading byte readers). >- static size_t GetBufferPaddingBytes(VideoCodecType codec_type); >- >- EncodedImage(); >- EncodedImage(const EncodedImage&); >- EncodedImage(uint8_t* buffer, size_t length, size_t size); >- >- // TODO(nisse): Change style to timestamp(), set_timestamp(), for consistency >- // with the VideoFrame class. >- // Set frame timestamp (90kHz). >- void SetTimestamp(uint32_t timestamp) { timestamp_rtp_ = timestamp; } >- >- // Get frame timestamp (90kHz). >- uint32_t Timestamp() const { return timestamp_rtp_; } >- >- void SetEncodeTime(int64_t encode_start_ms, int64_t encode_finish_ms); >- >- absl::optional<int> SpatialIndex() const { >- if (spatial_index_ < 0) >- return absl::nullopt; >- return spatial_index_; >- } >- void SetSpatialIndex(absl::optional<int> spatial_index) { >- RTC_DCHECK_GE(spatial_index.value_or(0), 0); >- RTC_DCHECK_LT(spatial_index.value_or(0), kMaxSpatialLayers); >- spatial_index_ = spatial_index.value_or(-1); >- } >- >- uint32_t _encodedWidth = 0; >- uint32_t _encodedHeight = 0; >- // NTP time of the capture time in local timebase in milliseconds. >- int64_t ntp_time_ms_ = 0; >- int64_t capture_time_ms_ = 0; >- FrameType _frameType = kVideoFrameDelta; >- uint8_t* _buffer; >- size_t _length; >- size_t _size; >- VideoRotation rotation_ = kVideoRotation_0; >- VideoContentType content_type_ = VideoContentType::UNSPECIFIED; >- bool _completeFrame = false; >- int qp_ = -1; // Quantizer value. >- >- // When an application indicates non-zero values here, it is taken as an >- // indication that all future frames will be constrained with those limits >- // until the application indicates a change again. 
>- PlayoutDelay playout_delay_ = {-1, -1}; >- >- struct Timing { >- uint8_t flags = VideoSendTiming::kInvalid; >- int64_t encode_start_ms = 0; >- int64_t encode_finish_ms = 0; >- int64_t packetization_finish_ms = 0; >- int64_t pacer_exit_ms = 0; >- int64_t network_timestamp_ms = 0; >- int64_t network2_timestamp_ms = 0; >- int64_t receive_start_ms = 0; >- int64_t receive_finish_ms = 0; >- } timing_; >- >- private: >- uint32_t timestamp_rtp_ = 0; >- // -1 means not set. Use a plain int rather than optional, to keep this class >- // copyable with memcpy. >- int spatial_index_ = -1; >-}; >- >-} // namespace webrtc >+// TODO(nisse): Delete this file, after downstream code is updated. >+#include "api/video/encoded_image.h" > > #endif // COMMON_VIDEO_INCLUDE_VIDEO_FRAME_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/include/video_frame_buffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/include/video_frame_buffer.h >index 11bb8125316aac912d1ff859aaadd730a821cf8a..f813851b9a9ea71ad577e490ffd66c1e717868a0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/include/video_frame_buffer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/include/video_frame_buffer.h >@@ -11,51 +11,15 @@ > #ifndef COMMON_VIDEO_INCLUDE_VIDEO_FRAME_BUFFER_H_ > #define COMMON_VIDEO_INCLUDE_VIDEO_FRAME_BUFFER_H_ > >-#include <memory> >+#include <stdint.h> > > #include "api/video/video_frame_buffer.h" > #include "rtc_base/callback.h" >+#include "rtc_base/refcountedobject.h" > #include "rtc_base/scoped_ref_ptr.h" > > namespace webrtc { > >-// Deprecated. Please use WrapI420Buffer(...) instead. 
>-class WrappedI420Buffer : public I420BufferInterface { >- public: >- WrappedI420Buffer(int width, >- int height, >- const uint8_t* y_plane, >- int y_stride, >- const uint8_t* u_plane, >- int u_stride, >- const uint8_t* v_plane, >- int v_stride, >- const rtc::Callback0<void>& no_longer_used); >- int width() const override; >- int height() const override; >- >- const uint8_t* DataY() const override; >- const uint8_t* DataU() const override; >- const uint8_t* DataV() const override; >- int StrideY() const override; >- int StrideU() const override; >- int StrideV() const override; >- >- private: >- friend class rtc::RefCountedObject<WrappedI420Buffer>; >- ~WrappedI420Buffer() override; >- >- const int width_; >- const int height_; >- const uint8_t* const y_plane_; >- const uint8_t* const u_plane_; >- const uint8_t* const v_plane_; >- const int y_stride_; >- const int u_stride_; >- const int v_stride_; >- rtc::Callback0<void> no_longer_used_cb_; >-}; >- > rtc::scoped_refptr<I420BufferInterface> WrapI420Buffer( > int width, > int height, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/incoming_video_stream.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/incoming_video_stream.cc >index efca514556698f6f83bd1137e78063442c094939..c8b12e747ff6c8c979a2d0599b3608e12f679e33 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/incoming_video_stream.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/incoming_video_stream.cc >@@ -11,9 +11,11 @@ > #include "common_video/include/incoming_video_stream.h" > > #include <memory> >+#include <utility> > >+#include "absl/types/optional.h" > #include "common_video/video_render_frames.h" >-#include "rtc_base/timeutils.h" >+#include "rtc_base/checks.h" > #include "rtc_base/trace_event.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/libyuv/include/webrtc_libyuv.h 
b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/libyuv/include/webrtc_libyuv.h >index 4cbfec20187e2204027270d802891fd1c7f39ee0..d3248340d875fea8175183572bbb23e9b6deb7ec 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/libyuv/include/webrtc_libyuv.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/libyuv/include/webrtc_libyuv.h >@@ -15,11 +15,14 @@ > #ifndef COMMON_VIDEO_LIBYUV_INCLUDE_WEBRTC_LIBYUV_H_ > #define COMMON_VIDEO_LIBYUV_INCLUDE_WEBRTC_LIBYUV_H_ > >+#include <stdint.h> > #include <stdio.h> > #include <vector> > > #include "api/video/video_frame.h" >-#include "common_types.h" // NOLINT(build/include) // VideoTypes. >+#include "api/video/video_frame_buffer.h" >+#include "common_types.h" // NOLINT(build/include) >+#include "rtc_base/scoped_ref_ptr.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/libyuv/webrtc_libyuv.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/libyuv/webrtc_libyuv.cc >index 6bdfabd5e9c76f9ec72898ab20e6b0f902175f6b..6e3fb3ed35a4a403aa5235e9781e95b102341080 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/libyuv/webrtc_libyuv.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/libyuv/webrtc_libyuv.cc >@@ -124,29 +124,6 @@ int ExtractBuffer(const VideoFrame& input_frame, size_t size, uint8_t* buffer) { > buffer); > } > >-int ConvertNV12ToRGB565(const uint8_t* src_frame, >- uint8_t* dst_frame, >- int width, >- int height) { >- int abs_height = (height < 0) ? 
-height : height; >- const uint8_t* yplane = src_frame; >- const uint8_t* uvInterlaced = src_frame + (width * abs_height); >- >- return libyuv::NV12ToRGB565(yplane, width, uvInterlaced, (width + 1) >> 1, >- dst_frame, width, width, height); >-} >- >-int ConvertRGB24ToARGB(const uint8_t* src_frame, >- uint8_t* dst_frame, >- int width, >- int height, >- int dst_stride) { >- if (dst_stride == 0) >- dst_stride = width; >- return libyuv::RGB24ToARGB(src_frame, width, dst_frame, dst_stride, width, >- height); >-} >- > int ConvertVideoType(VideoType video_type) { > switch (video_type) { > case VideoType::kUnknown: >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/video_frame.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/video_frame.cc >deleted file mode 100644 >index f4618146711183cfaa37ee257dd2320acbf7fa55..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/video_frame.cc >+++ /dev/null >@@ -1,47 +0,0 @@ >-/* >- * Copyright (c) 2012 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include "common_video/include/video_frame.h" >- >-#include <string.h> >- >-#include <algorithm> // swap >- >-#include "rtc_base/bind.h" >-#include "rtc_base/checks.h" >- >-namespace webrtc { >- >-// FFmpeg's decoder, used by H264DecoderImpl, requires up to 8 bytes padding due >-// to optimized bitstream readers. See avcodec_decode_video2. 
>-const size_t EncodedImage::kBufferPaddingBytesH264 = 8; >- >-size_t EncodedImage::GetBufferPaddingBytes(VideoCodecType codec_type) { >- switch (codec_type) { >- case kVideoCodecH264: >- return kBufferPaddingBytesH264; >- default: >- return 0; >- } >-} >- >-EncodedImage::EncodedImage() : EncodedImage(nullptr, 0, 0) {} >- >-EncodedImage::EncodedImage(const EncodedImage&) = default; >- >-EncodedImage::EncodedImage(uint8_t* buffer, size_t length, size_t size) >- : _buffer(buffer), _length(length), _size(size) {} >- >-void EncodedImage::SetEncodeTime(int64_t encode_start_ms, >- int64_t encode_finish_ms) { >- timing_.encode_start_ms = encode_start_ms; >- timing_.encode_finish_ms = encode_finish_ms; >-} >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/video_frame_buffer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/video_frame_buffer.cc >index be0b4927bbefb4cc0d73f1342382b85c0d99b787..5f89b4acb96d947da246c4ce42f913fc558970f2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/video_frame_buffer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/video_frame_buffer.cc >@@ -203,57 +203,6 @@ rtc::scoped_refptr<I420BufferInterface> I010BufferBase::ToI420() { > > } // namespace > >-WrappedI420Buffer::WrappedI420Buffer(int width, >- int height, >- const uint8_t* y_plane, >- int y_stride, >- const uint8_t* u_plane, >- int u_stride, >- const uint8_t* v_plane, >- int v_stride, >- const rtc::Callback0<void>& no_longer_used) >- : width_(width), >- height_(height), >- y_plane_(y_plane), >- u_plane_(u_plane), >- v_plane_(v_plane), >- y_stride_(y_stride), >- u_stride_(u_stride), >- v_stride_(v_stride), >- no_longer_used_cb_(no_longer_used) {} >- >-WrappedI420Buffer::~WrappedI420Buffer() { >- no_longer_used_cb_(); >-} >- >-int WrappedI420Buffer::width() const { >- return width_; >-} >- >-int WrappedI420Buffer::height() const { >- return height_; >-} >- >-const uint8_t* WrappedI420Buffer::DataY() 
const { >- return y_plane_; >-} >-const uint8_t* WrappedI420Buffer::DataU() const { >- return u_plane_; >-} >-const uint8_t* WrappedI420Buffer::DataV() const { >- return v_plane_; >-} >- >-int WrappedI420Buffer::StrideY() const { >- return y_stride_; >-} >-int WrappedI420Buffer::StrideU() const { >- return u_stride_; >-} >-int WrappedI420Buffer::StrideV() const { >- return v_stride_; >-} >- > rtc::scoped_refptr<I420BufferInterface> WrapI420Buffer( > int width, > int height, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/video_render_frames.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/video_render_frames.cc >index 5a0b0b00712d5b44008b363f23eb498fa487949f..8152625f5b2ef4d5cf644e811f739380e3c7e283 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/video_render_frames.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/video_render_frames.cc >@@ -10,8 +10,10 @@ > > #include "common_video/video_render_frames.h" > >+#include <type_traits> > #include <utility> > >+#include "rtc_base/checks.h" > #include "rtc_base/logging.h" > #include "rtc_base/timeutils.h" > #include "system_wrappers/include/metrics.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/video_render_frames.h b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/video_render_frames.h >index 2226577dc3ef1ea5d5d211949514cc957f7ff0fc..2c4bdd9177dcd025f6508e2dd911c44ae5057c42 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/video_render_frames.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/video_render_frames.h >@@ -11,8 +11,8 @@ > #ifndef COMMON_VIDEO_VIDEO_RENDER_FRAMES_H_ > #define COMMON_VIDEO_VIDEO_RENDER_FRAMES_H_ > >+#include <stddef.h> > #include <stdint.h> >- > #include <list> > > #include "absl/types/optional.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/data/voice_engine/stereo_rtp_files/rtpplay.exe 
b/Source/ThirdParty/libwebrtc/Source/webrtc/data/voice_engine/stereo_rtp_files/rtpplay.exe >new file mode 100644 >index 0000000000000000000000000000000000000000..6f938c8fb220f73406e23295817e71c5fde58d4f >GIT binary patch >literal 87040 >zcmeFadwf*Y)i-|TmQ0dK&Oj0f5FtwN7F0AM%79D=!#z03kU)^2^%AG?QVQoF3W13y >zshn&_d8+pERI3!*+CF_++j_yv6A}^=z*Z4_Y%E?7sqFEfM!_V&$h_aR&zVWEKK=dP >zf8Iad&x=mxoVC|pd+oK?UVH7eFH^B1DA^@Ra!Rt4NJ!ESJkvkFG}e6hSAUW5C9jk- >z;n|G8p1Z@g_^;<~-MTdV-L*xl*L?4`HQ%|j=uUO5QgqX;MQhZ%iqyMqxphs^vhUti >zdPR0-@w9(M_Wz~*pOZi^AW4gDZfVX3O*50F9+Gk#SvD75ZITou?Z7{8@&g#OL%bu& >zi8uO}{Fd4sl9Xn`@PvQJSY&+@4g3>dTOHEv^b&DM1?l*H64+Kz1Wx^X#v$?S)Z0Hl >ztVaHShW>v-E>&(_uLvyEhZs7AkHvQ~bCD$7c<C+QQNBX}l7knxSJ4Z;hyC+QQv0RK >z9cu-MXx(Vl<*IMNTVoL<Jn$~+eDy!!1uy;XFuVxx+`ymyPk8=IZykmQnF@NdzXfm2 >z+BG*5gH-O57T%n1!@Kp4?-3&5LH1G(zUTfEUJ%g#_wWA#2^ik`o2HkSl+3^Gx<y-K >zGe(q2+b+YOjzjkCrTDIDkhs<_No*f)xJ;5(aA~3>@tg68gbQIYYQw$WGJ2^4*7j-0 >z(jdhqz9~tu8l;xCKY1=do!cm(^TwB@;4`m;!vC2iNu4h~1?gq&Q7W{VRG_prbH^D# >z_}>zVgh;DwaPzSs8ss(jo7Jvn?)aJzo(DoOn3zhr_^10N$$IR^;|(dYWB70JzMYV? 
>zL22#3?TAy}dy%)(Dsat(R>q$KF{@n}6GVZ(`HBiO@YhOw8i}2$EMC476s)95N)pn# >z<u*2Ivc#PCb#AR@$5_A}3_p@zCWZToz~Z%wBuVS?G}!o@PROe+iE~go=yx8x<be@> >z>HR(Lchq%C&wuxhKe+yD+7FCbk_0-T8dm3SIk>Tob#@%euGe-f4M0xH5_Y^n>i9UD >z9q2fmEx+&sf;R11iNC?BdabR7kmN@?v=42}8C^_ZeiHR*X>ZI5J4b<`KsDH??-3a# >zY1Q@MXe;rpb+wnam+lYMEYkzW^lHv3+|j_GUU8@aB;w<!7H(F3sEXeR5!7lkAE0Ja >z(@Wc98L<A^%m}^95AK7YwZ=n|{6IIJSlywHkFuGzoEVW`c$nC$l3!rNbtMZNGHWK7 >zR?=UZ1pW-yOBds-3S|fRn`fw~uEF;Ti^G}3yc5i8U4`}9%VwSxOg#i)G<A~z>e*cT >zB=TZ{{L~5X^1{9<w!~S<kK!fRKi3vf(sx`8rVQ7PB76l|jk}8PrNq`HHtoaBmEp^% >zj3<{;83o}gdU+TxwWSAz@awg!B_*reB_X$V+Pm>{cDmzeHtT6>r|JY*{(Pz({~c^K >zWYy9xgmVl72m3MA-<Q7yB3iSTz1HzzHfs_f7fEqlL4D8tqIIRx{iW@CJ^^EYri{l^ >zmD6T!B>^{%XJ`~x5$Lf%Tz3`IgCMWx69UVozZX43)CGI>+D=m^n)llzsWq*Gr)h^A >z+Mr!-=>R~R*-T{RmS^$8YMfR4#p9Bs2Y5YOVsHA8q*=*-2T<!=8{4NH-n^yyC|lyL >z<Ua%~7+wk)MFU+ye)}q_v?p5KrC0Zs?$ImWZIBY`y9u?I4~`Z|E$wpi?;&;;bn4Z| >zSOq!?FDY}E?$-nD2LtV`)koj;{P!bZ=DoUo?*;bpvY!LD`k2G62m0v!a=e?U@7}qu >zH+$tMKP4ZW?R>w3=ZmxxNhNH)9F<9^Zza@ytm05*hy{+Zpu37y@JdueJG41yimWPH >z(O1duhz}{*a%%G@s)*sqF$3I+%1xgPCGZFN$aBC_U#+h8)+{S+k2ZE`dlFy%e&EfL >zlkCm=2RdSQ)^q%qY_ImA<M>lItZMM^4-1hI-3bg(J>PwiV@PH9Q{xPL;SjATK&YbV >z<)!;s#7tznIu1Fa&WQQHL$tFVIMHCUgt~o{5UR0|z1BuVp;E}Ki)QZ_=r?IonI>V@ >zA<7x};(wsq9q5vp8c+CV%hJZoraQ$$BG9Kb9+Q;O5HqXj(=%Dc34NiHgjvn?pu0f| >zh210lk`Q7EoH(t^TiSlZs_WT5V9oHaSXm(K${=h?v@y<9ceoYut>I_jb&@sdMITvF >z!~YC-q80{nVjI4V1==iWV@cDj2q-j8)EWn+P3JT<4nFw<o79+hFfiCVGfquE`n^J- >zEyfsDF<8mcgW(^7wrO+s3i{q?^<dSq@B;vc_}ck`Kfk&S4A#O76N8aig1-}`?ck6# >zx|!<f4kT=PV6Z1}0xBf&TOj5h7xn9Q81q;bfZFN$%@@ec49S`?OK$#5JmkyGuZV{n >zrBpK<N-*|;cn!<Ve-#gIx%mO{;E<c4t;maqU*RG4JpQo!)nmwZ6#hFz(3{;m)-Q4Y >z&oF|K$%s8N4CB;KhcLzy1{n)O4Y#41Nh&{|LzQ)JC%z%#<w^L$>r6qv1XgQ}GL}t+ >zR`O+JWsNI(Tx1HU+Q#|YC?&|ZQDG2=k>BGY`z{hQ*r+O=dlGrV&!EtnWqedFmFHo- >ze!1m65cADckX?T87oZibPOxlxtqNZVc9`?QAU^|z1wO{45k#wJ5VWZMi7jpBWS}L~ >zDP}Ih2Tr&HC7F%;X*p%~vYbl(pq25Hms1%fbxQbez~z5Oy9fo#4Sz`QOO^l!g|rba >z3wfBGdG&lMt3fh3%O-JE@>i*i1#@G=U!+RCTa@Ysfkv}18d_mCD}Y_v|1!NbNDcM| 
>zJO42iJgECx5{<cG+h`Phpu#M=oO0H0yBN4ZzMRm`)H`~Dd@3I6w-cDOU2gsX#Ma;i >zTUP_b_4AR+ss~Hk!BxHEZ}mMcg4?C-Gw@Z#2R^0T)A1awIMLv&Y;cCjYBNWWr-3kp >zoEvr%8;dPAUU`Yw@e1lJw~8MF0fCiISm#?<KeDhyqh(d{<|Niw!jdek)r7T;q!{E8 >zU`zt*eenFGL+V8hHne?JLuP}-kpJFf{_`k*`(jfhQz-vjt<$NUvC9v<1l?(nwmmzR >z$n7zQ(H<8KXykS^+`=nI0KK%oA;A|7KxgJ^XS^G~<e#UZr`|P^{LLZjbMGAV?WV59 >z$Vi@)2iB*iAs7uj7UbiuwOUwpYr7M%6JR%72Cl+a;!pHZ%2LJueVIUtM(MSZPeUuz >z2BR(X7K~OuR>i*nMCgeksQeyOY8q?}usX-8_&!rf(fLH;zabG;w2H4hMuXU~da_x( >z8ZY(eU0S18Qs51p&;xOid)FQ+EF1tkV7cP4O1>Vp14>Wqhv$<3PYbuwEx!O8$138L >zJdV){<1wXuAksW72bHnf2B)Ov5$SR#DNz;Q4ZM22++Nyl8fxMD4N1t>;12PTDej%6 >z?Z$QWq$N*NK8?HUbq8Anr|O4z7eveLLC!3q5P<wbZdG&<9E88l7A!*qS>aoWt-mZp >zt>hOR<Z_~I$mFV?`kpkq)NmO&&3tn)<)C5Vpz1=)7%RISQZ!_%;$t2~RyMedXNrR6 >z+6Q{b6v3Z^#w3g@8!jskasN<uC0EEa8_@S*JZo9CcZ*$=u?XW-tv(m^f~p8UShWW! >z%<aqEm_vV6nfaZ*+0MrF9aKhLXK8y~XQ-y_8PtKD^ECcKu-7zgaErlGZn!BrcU-WB >z|Mt&99hx?H1$<<~&C!}0tD;NFD)~cZ5*vql^{l$y^0!g{okT4~a}+Bt@CCeT8q3v3 >z8*X-5WcWmDbW7?Ol$pb_e+PU$OFxIc;dp;SYIPh`uheD|NK(f)6!AOqsK&iXEZqS& >zAmoKlw#dV#5*ka}PG4GMo}?^okfP-Ejq{qauW&P)ZLX75yV*jgZHc>SlUdKq#9U|7 >zUE(<jx*+@*$`4l-pyynLKm6B!5+X2iNS<E3#!0STjhj_?z`>F{v$g@RKDR)xDNI$n >zFk0>m3Ld_P#xx!0WaW?O)AaFtGvLv=9?@XQxeECdCPCHBS5d}U=450qctV{>)#u4( >zon)=USt;fd*~t==xok8lSW+>l=dg6<cbCAKp6i4e%xIeHq!vu%G;I<O8%Kv}P80qi >z)$?wu=gvQhda`cgMmB*t^mLX_^3-!pk+X7)Y-*ae0loSPTe*{kyta@#sW6#|a%a=+ >zLXa?wiwY`Pd0sFYLKE<>srp`mB|>?%v5TSFmb0=|n@Or7$&a>chip(F(SKM)9x1Qt >z#+<>nZE2t!W?UKHy4*Xx&a3A5f*v)aWq+gWTkTXbbsHmB;4pQ2%=omKeW*!79bekM >zeI+uYf%0#hi|;*p_AT@9-M*a=I$vB)iNnatEa>r#*ksp72zp<byi|(L?1c;UGEw7? >zgY{@29&8}*Acz;L*S?nh>fKtSM^fD^;EBCT4VA~+$fS!jno2$cAj_tz=$U#xefy>o >z3!Bn(@!dV!E9v*b2leQ)-TE32yzr3I$Yhzm*A=&4+vW5*)_loweY@1#v{U}IH|dJz >zOgM}cqSz%cW)s+A&FOnx9c5(T$*tqf&$1TyI@a_hvJ96W`QOtT<C2mA2caUa+l+K( >z-%gBF@rlGLk3ok}K00WHM_&lpQybAFn%acb<u%x#{w)WCXqqa;1yg9)OZQarOw2M! 
>z3c0$Mg`9d27Ra)LnC6H01!(+QG|*_l=?I9t+^IYGe3&5C2}C5h!~?b7)~%3tQx^xx >zon`}V56>@`Vm4qeYoiv3ZY=h`2!)Z3|MG@qZPc>SSBmqHJOrd>2Cx9r%p?${I*WVe >z6g$b;W)nhDJi=2E?dJ|g9}&n^(FckPK&cX`1wmNYNB#jG`&#-NFX9z=oO%8Qu*~Q1 >zGCT*vC#ctmK*-eTqSugJzrCejSzdb3SPT-LV2GawRH*KZ=+a_0LQWyXlg62y0nl7+ >zy<1X>LTtXXG7)scKp459a5g+jlkR)h2)gCI37efQ{hNQM%_LdCjrtM|RodRvRp3Bm >z2xhm(MG4>>3QaOZd!ePGdHz<<qY6?ggUs{j9rVDQkFLgdVjsqLbF5+kFPa;;tWYHB >z9`V&$k+;RnjA#j9lsg<h&ky&dAI=^PX4H5|-Ei|}G#Qjf<F++9%98nVa}(H!VSE5j >zFWrcS6T1AG;t{&+qNZH)8Ox_{-!8eO6^PJlS)*)vVGHF*2rQHW3b&x4&!#|4PZa1I >zy<#!-ih0OMm<bm_&75Iq!`V4WKPV@CBj-rUnT?!;nQ##lIUz%>>x31mQd|+AFv9R- >zjJrbo=8wqPiED!i^~MlifH$Z~yW(Lr9+t^4&59fIwV5l?k`jCb`sN!m(8Z$FC!!UD >zRh1#$QHa-ID10N*wh=;**8>C58T1E_@Mq2(7KI=fa<)G+^??QD(z24NTkwS**732u >zhW{cHEDUZV5F(tlHIB~UGXCONN!mupET5(+CDiC0hwW7?-@?NS5FH1q_=7-Y*#riT >zL(FPQvMFm#$A|W1j(0-*4qzZ{8&P8WO7bbuCJ1e(M=^v6nH?dES?AeFh;5Vvi%WAZ >z8pi5gGL=*xt&d_An59sO+avfBx#ki~QC-b=v?l^yqZE1>xw<3?lL$5;9047E{vRMZ >zNISzsmyC4t!M2SEQ9|Tu%wI}SRAeR*w5jDR&F7-Blx$zRk`dy6LJz7nWR~trQ!=(w >zieXQrv(dh4cjHJVGbf8&eTn*MWIe3KXyqSFroq$~a4U5fU7ZiVv=xlBx~FHVqnh_9 >zd9Cwp(-+#*Os(Cn?XoxTQIDuoLG2r5UMtERX-eNW3S)(=3z4I*>S^#x^wCWARk+nL >zq8-7R+r-sJlQXX&7AXoR&fF$vYOz^E3_OH#qd*bTPMAJktO6k64rL*)=i$gbwo4Rp >zeO<x@WTPF<hy9`f2uW^s!J;IqFSiW9f(~WN)EWzca+rWqNz^c^ffE>CtNSP{cFeHN >zZB1)#SDZ6us$a3{JRLJTz~e@kz0UF}L&@ICsv!TNsY%fD`H1*2X#TTf(4y?^5`lys >znU6^X2vQg@z^#CVQ9>abH3wZo1li(x1vk=r0)3=WT42zd2*d?B(0xd2?30uQTH^^x >zE!SqQM`KCKI8l>I%pSuJ17Moid#@y|_L}Tf@e*Rh<|1Apz*(Ri^FSh0vfNySk|;h@ >z9Z#f3t_46xI--5rJx)pS%+;romdB0~J64fP;D9aF;*oo8m#F7ZT+kN8gd_aNRPm=v >zMe)WYD{(Yi{BT@uGtu5pp}i&0@>n)8R>@niE@*LS@))KjsXnIN>$Is2LWdfm+AL4g >zMerq&ZY)i*REXA{!{UOEtg%TktTp*a7EnA?far$Pu|zSQgpq&<%p$)cVTkss;$I;Q >zYVo5r7D!OriUMjo8->?qBV4S=^t8~U!>5S7EX0TePwW{yl2R$fLs~XQ^kTWqi^-q2 >zk_SYHl|s&UCaXYF&C`TX$6Fjzqh6k>%`dM=O4VdgD72uMRT-{Kp=}aqc|D@JzYnme >zSt30)CRMlA%q<m169PeQ3mk>#<E`R<5&=rLS#;zYNrmM=9n&I>N2jaPl3b28GoZQG >zq)POdjLB^kNA>KDXkJs6zkz7cpQp<$KNC;(>FOjQ1=ulICsHdkA;ptLbCM@ZieRk% 
>zXQXOj!ql>1TvQwtI{!e5iyvGr#3Q#=AJy}T3kW)L&2%-bJ8*Qmk&C~xjO?@2Cs`s@ >z35K4cpyXp9EQJ4{A;KespHRt9A$W!{%uIg~T}KMBcWUH6yaQj}$S7kS7@ZuwWxnI< >zGFf-&{(&#xImePu<{V&X-3C6VTYJrBIJ8cCi2nziH8T5OnvMs`FWm#nX{5<*`yG4z >zt={JS>Iv2XqEjD%bl9%sTzuzNdZlUj(?o*ikA%?i5r~dOfTdPXhsSmf7Rfw#V|^^W >z78SnoCv3EygUE9_rhDZ+k!8(6YjSP+SYn{_f-<y|T>Sq+b%Xq7QAoZh^ve6#wJizd >zCP=f|{V&IhS&`E=#mUl)4Qv#<*NIG*kiV>4z<S_ri>!qhYDQW^jd$2kL%9OTu3pR1 >zz{zw}1S>hHRvMnJp(j*%6^y<)Uv`Ry+Lll7@d$Zf!hxAwr(tJlVh}fd@X~}aZJVe= >z;SQsS9cS007zXVh#DG#uA<EC`p@wt%(%>B(4DomAd7f`3iFZv0d8Y8^o4QtG;9D9d >z7oL0mF8txK;T<V;9JrL!k>ZbE{|ecySYrtgsii1lR<0KDzrI60_eZH*9U>PkKK4+q >zrmn3ZPvr@{JWD6vLRO&*JJ7s%5;()3wXV0cnI}M6#M9hz%TvIM2Hpzt7bvdDs^5}# >zR`jwCJ^v@9H(0%>?yHI>S1`6e1|WaAmnx=?4G*SR9G|LKF|i284{_Zj+T?Lb+dFVc >zs{X&E(l8`e_vx-DL6wZC>nS|w`A^ep^$93u{_hBoP5ed!Jji_}N#Ot>m}Cx7hhT8^ >z>LZD%gq)Zvat8oBQX&YC%O|RQ+-6B~xTsRPV>l`Tv?5UOuXFz#x%qRO@aF*jus+9B >zl7zn2*>pw(iZT+8rrSlJAsLa2Axe*b)JAZuRk!>H)l<WNgwYEsimvUc!4!$F5dj+y >z>lBs%xT5by2+Oo9;{6D9GrX@@uC+U~L-B!LN0+bb?rVcp{1VJMT#_F=?M7;c$G7|L >z3)o+pVzl+cw|h-y)3G?>Ck>KujQxcjN_^j)_`Wl7xpvH%KoG$gSh2jes_Yv?D|s(M >zKqk?Bwab0`?!K74)pSfyJhUcV>vDqJ=j>33jnxaZb|>cR3)JJ+uSgQ@r+F<q$-dNP >z-iVf$)bm;Xvz73+PuUfhcFLxDdR!s_6phT}vmZk&LD}COy^9tO?4S7e23uGvf>Fgh >z^6|GZJ)jOx?p(S)VFx;Zm<@_KG&{w55ichOmr=m{d%$LWdTC3)X=O#{B%Z#6<onra >zWvYcOvV{07;Dffz4zQk*zwz;)jbiefI*DS_XY{wFUwv(;&@c%c-kk>-=PM3OL`7O> >zSc;IWgZ%h^pPc|Z6TStN<98v!=jwv)C|Th<3G&u9w6!(21dELn3BtrW6IL;e!k?nb >zwc*DJx9u%djN+;!hCujNfQ9&T0P1Pso#N$bdPxueUA+8^(5t<=UAq>O0nACrTk&cb >z#fQ9NVFqa?ZRuVj-&<Hl%9ncg414!r^mA7w^78<L9kc8Ue+Bvp0b=cmkPENzW3`RQ >zJTRS}2>Hmh^i+VSr)JQT8&A}RQ19(dJd5&R;C?R#6pUYq{GoCI|IrY9h(EwLnecDn >z3nJax<BCHNR1U?;=vh+I68XeNC#ApSL6R|w?Q*m`Tz){c_7l3_sr%g&gUX)_$i~Cl >z=&N-9FccUjghc1Nlksn~83L`lO8c7<q$_KMd4f%sdi-uGTn~EmtEnLnF51tsuLeDR >z3zh;Ted|{IW%&eY*{`_u`{><pZJz<ac7jHK@(3U_oT1x0dTEJeCNV0h7qh9vn4@3o >zc4!~R5q{lH39Nevl@~%fTDOz+TtZYd?kVYx25ow-b~xTMC)Wup17##CYh||2vEPv| >zWU%Xmk;d|aL~Nl`U*yz7ZmbpPwQhZdGm47mLE+8O)3j{}Ue>vr4oiIqie#}4N2k^v 
>z_wAK`unpw<CA-}GFdj<$arQ;Ym&}W@wNE_0FV_hv5za+Z1QaXo9Zucj6)!z=yn;RE >z);^5;vIURp5Qt{;-H6uF-%6)IbU>h@-b+UeLcSI(RTC_5BjTi&A(D-PlgRp<5rf7B >z6vHZJUzU`zx3y24Hm@iPj^|Rh6ILzAKZ8qGYb&*4t*9%uu)vD!EF#IM?xyYpENj!p >zRBvtOQUE12lYfnIcRRf#+bLJ*kE7l71XN=ZAv^lmIikf6%FP}d+8-{-zLH$um$LRb >z#KE$cBfCTPWfPgTPBxmFqZrR-%0h?icsrFSBQcv27Yw{P@l6L}rfXdwXv}2!g3M@N >zzC|Y8aBUM*M8bT>Xh(M{?JO!C4hI~=x0$14pQDp^oua;t8Rgq0xi*otCd3`^a?t<b >z2R=y^aJ)o%59A-4$P=4fR5$@yR#OLkBMkJq2nJH7+mVUp@HbQDpHWX}IViWB08uy& >z(du~g2b9I<DkK4<Z=)|G%>?o%;dR5-2<@3j`M+7up}K4n68OTA-qdZ<`=&q_nhk-$ >z=F66Y?2NgjFPg(~z*ilYTOwdh+u$y0ycZJ5E%_3Lc{1C@-bPh_C-_kEZBwZfWI_b= >z7~+Buiy@f%0k!4d;MSD9%{QCKm$AQ<ylo1REpjKy5kZgZa~OrdY$2z^S8`@qi2vFo >z0t-KnG<5cmfNnx}78LnqrVz4`pWlyXLL6h8DLA3+5@IcJ)oJ1e!F6<2+ACqkFEPfz >zSlP*(?1GvjOb_;bLg&_Do;O9vnq^-}l^UxV9^Y&0$lw1@)P_{-9roIg_Wp>fn9II& >z9M`(zzHW5`J5_QLRX8F_b{p3w7P%Q}S&GPj2M;9&U;*af@ooMh7X2?WMF$+nfbRr^ >z!PG2%*i-TvnVl?p>v44Y<~;0P_Z-CkH+tin2`{VUjf(`&x=#$fhIdFz4G>e(LWw$K >zkbwqIV8J+SK+`GG%Qc8!#ra~8s_P7rTc7l|u*(R@a`RUJ0fo8yJM2m+`X@4f7>9+j >zWkLajbwuzcgDhmWFqLQ9Cd}j9Hq++p!&uU9w<#C1OpAisC-M)^y@D3%U}9Mn?3-!D >zw8b!i!|k8U-W+bDGb|fj<zy?I`f@BrxEcPIyY<=5XeG=vY#!}mifbF)PP=-+ziMgU >zo9ohp{5@K|pg=cl9kyH5p{pXg+tlemXKeb2Ocz#z*a5Eh3zLwo1>#Q4z#J>C*CHMj >z7Yit+-(b3Gb)DEm0AXhY`EO{T{-&(SFNSAD`n8ix!Wqxn^q0k$oEBZ>&_0SMjVR5S >zHo&&j`p&V2g4Y~-Fw`UmZHx@QUCQ}vmsWH}NL{<d>->87cmY+%kP%xhEXI6N?p;l% >z=HOesfpuu@?i4AsT)W%ncPfL(jzQVUb{R9aQ4<?enc37GY=rOc>n`ZAL0$<gGaC+C >zYc9C^q?xXd@VN+Ib?u-6X3`(I`BmyGG@4-;JdPa?Yp0&Yt2xH4$)M5dpA^bso+2jf >zfcBYFr@@T9?jY5Q*4p&3(ORcom}!lZvmARIVocOJPB0`uJ1}J8Z-_=si?<BaPB#bn >zZ)k3TIl78r7BVA!G!emtG0qqht?^g!&(L#dp(~??m%?n<Vs*@1lH96w9qMrw7lGBK >zd?l7~*_>jJ)>VYH_hr?UZ0gkz2CV2J_5<MnIg1lZ-0UuQbmi|oEuU_@ggMwgHdyj@ >z6&qEH0qD(=Zmj1bVBEvEoPex*_)qawp>@VvK5aa}t}KT3Jh!5b9DJd(Fm4s(jA^^# >zLncl;3AK9BR4c8+K}|3gy5)O~RxsVMe~?tCXdm0i>t<JLN8?(L)A51tHTnMCNGeUV >z^y`grWdY&DeZuKllg3uNbv2%n<!YxsRak-5PPX2yZ*n7@t#|61oFbQY%M@rMY%Z1{ >zbUW)T>CgtfYr-0LvR)J10Vf?6nHrSs!e$B4!omVr8js1z@XntnRySR&I*t`kQ(xc^ 
>zp{1ZcNtcK00M;n;5TB&gOEC;2<3#9qShpD7wb`aP>!iwar*BYAXWg;;NH7QB8u?DG >zy%2J(<V6^oX!d$g8MzEzK!Eye7iP6Tp{|LrO30`b8B80ywnStyFc_Gb@)3UtB42hg >z-;Gv2(yC>M%Q(7A4v<}b>`RPXFf85rs<`QC3OjDDPm{NaX3q-ob16H<Nd+?^DB^uo >zkZ~Ie4Kwo|6qku!N~3ZGp5Hco5DW(t2J%HiafG;Q_&mYmAi(ECq{2}9zF0%UvuMAX >z&@#Re(m-Vqo5Ofp%X(-~g{idWO-Xx4Lx$*_h{$c-N^=bQKsUw4#0u|6IjsRt{L9~~ >zY0-I!8)&u9I4RO;>W9$|ducF5V{g70dovavwhL9Xn~7s#_|B|3SGnZtsFIV2Zj=*Z >zw<_C9`Lxd*+UJhrzl<9>ae7aYfgaqj%Hnw$_`s9|g|;3V*oAgYD0TIEdUWyVc)dQ) >z&eA(RaVX__x<1djP?@67bL;aw3zZA>d0u^9-a=)(KCeKZSGZ8g*XI@K^NJTL9(~@0 >z`n<`}%~KYtyLFphh7rMy8X__-zO};+C2LFinZ1guUY|1>>!*~Ol1)8y12vByum2A- >z1u_%$eG#lBu4sqMqIV}SJr?igcHR32z9}fmiW(wsh`gM=B<$9)(>s;P%?Fj@))h89 >zPQv4u))jU<PQYVM>k0=R$DrpmA5^h%+ZX|hpB<fTGu+YHcEb^!?VvFccFg&-L%jse >z3$JN!(<e6kYdRdkKgA&5TloiqvKKb@tCI$HVc#-#@fha-wcj64C?op)Undk#zyC}^ >zNuOUFH(s1Se?D7W?3`cHVaXtbrWmR#L=nA#8WjQ2kFs0sTO3$ynl-x|ffgiTkMc*C >zS-^{1XWO=v+mKw;I@`Xb+>WPFt+O4d&z5pTlbv`^TcEy)O<0i5^+}o6ETf&g6mF>S >zE?DpW-S1U*bG!N#s_z=tuI;u7MX}ldi{`YcR!cZHUfF7E9W{NQI=QvX)>>o3(|N6B >z_SPCZo(fva9IZ7DJmqMgJK-u}>WMs!JuL|la2R9db(PJ%PCZ__r*(H}d+V-}-3-yw >zlwE3cFXR74BMR%ATf=YroSrCLmQ@44@-5*SBU)w|1!`wf5{j8D<10kAUSFMe9Rk(f >z;IbON-OON&58>rUR_<zV$pOdvv_pX(w?N${L5m)G7`8s`#=4g>1IrTX6iB7Cb;R^h >z>Lm6i_6ynlvm?_bMW!W-Ou4N<Zkt~6f$|BVI$4L|E~AB#H!K$vG>IDG+L>G2h-w%6 >zTNm2I(^x#&#gi9LqQMBk;Wyrbz9Aoi1+d)C*@%DBp#`-+73W85)bPKfV?a1!WdNgQ >zx9^>GBkI|ddi{zrtjr>l@146p2Zj0`=W<x`fLp{EL+m|=a1*g>!>yNlddi(^=lG@i >zo^p73_&X9|i*hX3OVTa3Z9|B{am#HwB4XEf-g4V^#8EE3cy@XD`taqfD=~E?#&QU@ >zI+#%N5~??$dJ@>i^#LN1T324(svE9HsF3=iId4WY_0Pki-RQ6n_CNdyGKUY~55EeY >z04ysQ?DvcG8I<lh2kBo?`Y0>C+TB0f<3#>zDSalTkH#O~&&>Q4*hg7A(ou)`28Qz6 >z`peyJb)~f)T<#XfB^+1}-jA~>ARv+9U&a@}_s8G>4fBd2HB(F{Vl=bqpX+fd;}U8d >zb%XkpLflwB1*W;&Y^YkN$9NcaMt)(1H(Bhee^uZh3hbpdR#9L!?GsKCyD(WAwpmbV >zuR@v6DlMH#dn`4X6JmWgRI}qo643>~F70Q|=k_782oA^LGny^O4mKsT!4YIq!izFl >zCx)rm+@>=W65IGDtA6(uJ)c_IT;>uaIh`IV-Ggq0Ev<U>3B3Zx2e3Ai0sCw<&vmFl >z_>Uxhjl_R9wzHu5M7=Yx6y85uG-8mjJ|V1E2&*!Qg%!{uR#DhrCa@|AYvOonYQicR 
>zhBaGgRU!X1!c?ZfMu)2jeL10Dia$J#RcRD06t9qPN+HsO;z86=@HA4;@$ef`$)m&9 >z6P+iB&do&US(6Srsn9UWejT#Ya(^5?jHy;0hRxt|Gez`Bwd9t5NKR`kMkE(g+iy{? >zV`N|h7dqBDv9gaWPQ}Hp(i{0QFvQ;^_Qdg`S^e`oPPzGMP)n#hp&m=9MG3Vqp%$3k >z3d(SkP=;cfmo@<nv4}tswq{lzGir>rVvnrHr+U>eoVFherm!$QgBF|7S@uy?pQWn) >zma4jEix3$tW}+PPi^@VNap4=G6lH?gQ;_XbPoM_X$1<u14G+$d=vZ^CIHp$&LKS15 >z5l`4+r;Zex5g?@H=mehF+kg{$*t&oo)n`*&2$xozz~*JxSXxbmnh8g)F%R-VP#Qeu >zyTD^V@%RPtILYLZjkhA*-^JFA8XPx@3r$1}J1({zp`(O2TF%lAxHECld9m$wT7S9I >zu1uC+DEB&`^vZb%DpdiWV)GkQ{TO4Pl|wdMTDQlDT238Ur0Fk;ckc+Je_{Nu*EXbK >z4|c<O;f(-<zfZlc@qQsol#=EU8^kCd#v_a$BJ=353#}Yxl;M8Lu-ME1?{#!IjgtPI >zl72)<mzznx<Jf+H6e&jGqk8NytoPM8v6(o_mm~JFtP!xR8fX8mhyZ*?eqlqJ<Fv`& >z=<s+V`$r=COCtMskU<ZTUs#pqc)woTggT54PavG#g!2mFJa6H&rOR!(<@0>|1l<hS >zierM6)YvGR4O{3${f0QjW^SC)+u)R65RC+(PJE-k!>fdwj!ICzrd#9m<-;6mE@)m$ >zi@b4!ivgP?5_Tap5{QfEfqfHu`nwzo^*{<)H@zV!RHP`-!o?=R6}#Te*12OUS`+jp >z{;j{;sa(hwd3;Me3XC)qx7?F8*CXn(czBL-<iJcIDEbr`e#@L+Fwg@Ffjy7?fw)Z> >z-CqvtKY~tb<pKLDIQ=lkL(?c4pJ`psi)obD+wG#U`bJaZqF5xs@?Ma8MVF#6LnN0W >zd8DJ$OtuC27sCBWC7<__ZGu<`7#-e3Jz@>s!dK%D&utOSn3`kHi~x#oGxl@fp<uC} >zP5>rB^biI}_z+sxyE~o+0D8%7n6$q{$i95ari`iY2^@1v3GqhHwD!O^{0gg_{ZL0% >zf$<?5BB7Nwu@8bD=z<uYq;T2SJE#bsiwdJP75e5=kU;aD@JbxB!m?e?OSDNK4}Z76 >zTY&|@ne<n%O%HTVImP1x$U!{T;Y0|(W2hLGFLF+|!_G`4u5mcS$^B0XdGO5{*!4$H >zt0wzWwRrBAsKpejMQW5w)?({co;6g!HbMVtL7yrC`YwLx2|<4w0TTN-M-Q>)apcR0 >z-fb=|!bFU~CtK`PzX;W@Wp#0}qdvrc)tN{%wF3qjjHMZEmxs@W)|tUr%f|4Cz#@lh >z`#zM`<5;kE&3$mnUcrkMixb*w&6O-)3VH{x7t1cbk57G?BO++8f&_ESeErntIZ8Ud >z0M0>QJHTnT0Dlw(H~GHCB?oGrHuFhHn^qv7!Z)via^b+%5o#Ny03&!6eC<R&y$cUS >zIPVSG%tPNmp!Xq!1=`HTRYc)(mQN3eYYf8p<A|z?-SvW>HN@8EIqPK1ryCqNWOdKa >zNz8eA-5`P~LWIBHA+*g5S6g#y48z_Yzmr$^AxIUq95(L`@_Q@*<hhMlhCikY-XX9a >z7eU@dMtUZG>3KMG>(*^?EKQRYuws<d&ck5$z*^Aqti=lsiRZv|bPH!#jAiKdCeYV` >zfG(Wo?X>SgJDE^UUPHTFGK}L)!Ul9B_M;cD9bhZ_h{Y0zpy~}cOV@cF)u|B)P)>}D >zu|%%b&J>3RN6({8dui*fo1e~w7hc!dAMn_f{Qgdx;_Y$mxVj7r^$_Cj1d@jDMK5jl >zowIrp4k~9gr16!=N`c*e=i}=DL$e5#f|c~LyvLteLfhiw&tb`09aFjoL0`|~3eb9- 
>zA_)Q@;(UWyzBR_xxh?vc4X1b<lZINOo;gLn{w{mcriS{vii8DTh;D`X)<eWAT>U=3 >zc{HdLXYBx9Z-ECba3M|{g!oM+)PlKvXXKyLxe0SP680Azz~aIs>Yl31-KY#5j(&tP >zfcaGW1^LS1Nl5wU`lRv6qCPd4MA8;-Y@(g7WE&%~O{ice`jTXf*hwRWk%_3ZO`w)E >z6^p7TVDHhe6z59u@*gNrRG41_TxLx*sa&m#+=c!?HripE|9r%xy%-nUB`E`rXdpYH >zrZ-i1BtJ4$c$z+x(lH*Mq~_jSs7%HV53r(KVSQh2eUCFP#ZwwmCRh-=QD`zf)?_+& >zCz)@<vy=43aP4&93x(N~bEBv(LL2`0M4YTIlb((!=`TkJY0S%_)+2>Vj(Rq!lJ?Sr >zs3<wULP$CDl{{lyPauvg5#wC*!EY3p5AB8wX&t~E6I0e$A8<t8)f)FIBj`^q{TWSv >z5Dg^Hd(aru6BraFd+{J7jy<r0#4yGxiszDN*%zhFvz|#2yu;#*+duGkOk1E;%WxKN >z5c?J4*j$O-dgozU%KE4)UB*|@b8pi}biz_QeU~~?2q69CAZQq_m#HY6<uG|Q=CL!& >zcpDm)HspyGDT2g0ZD#9$zouI~30jjvqc7uAF<{bEtJkmQV4ylIqrx@(i9ex0idu+a >z)NFfNONK-p*pr4ZBZ7Y)zzZtr3^rtk_-lq3E@-2@>R@V|b(eJdPGd%f<6~IgLpXAe >zg;5b-fJkK~Hm(q<LBy<9*&L07#-CA`u%bfGu(uy&QDoegFRVf~dhJ}|E#^QC5CbaH >zPTJR!+jSh32VTGPQ5818uhQOioT(sdv;cxZ?~Uy{3k97W*<K+S<|XdZL^Q93%m-S* >z*XWhCCn=@ETk;0BN27+p8c?s|U!z#E1j4w9V}gwcE8iVz7+b^t`iZDLstbva#7?lL >zOu7+=)eV-s52UKd#_@@4wGc;iI_EpO#g-KQpC&!aSX>^e;XgJVJjbA~F1{8M(-Zx( >zZAEH6O}bSXa?Nt|5r6m5MHnKSn`)b;u0b?6i40?G!0mITM3xKVUmN>O)ZaTB8=8d3 >z-WTslk+~HwSv8?_Pl$z35!!IJB;|>HwOFQrO4gY?$1n;?Y(OlwnTp>SeVjNjMz^NT >zosp?#V}{l3$S&`8WR@F_EspZ&qm;~EE7{w0G65ou8BmZOoH|_d`cP7|g^<v6B<?R! 
>z1jHSU60GDMU#BvPx`IpFo4^sM>cNs~EIuIaIm(7Dc_uCSzI7y-z+g#$g7DVV4e~&v >zIJ#+5im<B?Q)z-tB@w+$r|S)P&LbU44wlWa-zdaSfWO*#T4w?4*1Bos{*Vy1m<A3B >zmqw}Q(WOac>D(W6T5lyL<W<bsLIYHgS3W~~s@(iIq1AOlm4^I)VY_lYs32;ZwD~Xs >z$p)k{s{VBu6;>G!{p&J%hm{e#4t+nl&ku(|5L3m()&n8nydG@};D17(uC~7E=8?w} >zXx@+~x+qU9Z)4(aSFmMv-E{1Xx&{V!lb`mPjakN5Qx0*;CcR!;G}4LLDHizi^-^p& >z)Hh(h;kBtQO2G=PH5?9&W9T8S>DP~j>xDdPv<(Q}oKKC>h}rUOPy;e!F75S%v2+-Z >zIaH$D7Q#9Hm+4G4>oY@v6b(X<5KD*M<)aeyY=gHR8`H*OjDvfB33>xW9puNxfh;n> >zVh$hQtOG@A27dxdfSFr;im#Q#(Z=(@fm=<FhVp#5>Ihi#Meb;Mp3i|pa#*;^MUE@k >zT90F`d+tSR-8B%V?g*Mp)|i1?U!UUbFVD5BnIsJ_)i}8jDeRuGv2~cSx=9XlGq$)& >zi3$i1FApB3ZdVR<-gLV!;92XUlT0|8;t@fgcx6K-?IkSxc$h7M6&Wnaooyi?xh9#8 >zVaAq=Jsl-j3t!}Rz+wmD7h!NDV}oscI4wDCXC}%sR%1)V`V8hsIOONAHgGzp7hoKU >z))EO-G<K0u!Fsl&%_TMynTlWV`p{BzMdM0%ATE(eDdz(N+_voBRA|!7BAV9_&E_GR >z`{t<`KGh8$zTfb6ry+DT(hHBQA8Xc_B9k*>`P%5m)uo@ZnL{0Unh;LoCDa6c|I`9+ >zof3jU3uK8F5c=p1P)C15>q)fiQQO(8Vr4ezJ8vf!C)qDeJCe?%?XSRSn0{Dg4a}l2 >zgRQo*8Ee#x(*12v94x%gxLeJA$%c_1&O`)|HVg-nF@DB(2#h)4SzsEDG^7)e5+yrw >zO^I4GL%~s<o793CC91vErX4=hX1Wpa2nx{l+Gi-((6v|cgX$H)%ffR639tK&iJ*a{ >z5#hz+HVfgV*(Q;fmbI_HJEAy%)Z;2d|6Py%3r`JaYO$_{jXieOy-Jv<<aiI`K<Af} >z){SSNQPF?u_41MR=#<5g4Wp#S$<Q&n3m~f^54H=P>%dn6YfJg4=Jv)M(G5kE9=&4` >zbp~ssE>hC!n--0f>|q)!sY#Ud7lrib;3|39qefA8rDy}zPX9_Po*;qM0nu$clX|d2 >z81X_}HUf*@ao7>ft7_UsFyF2<w9{2<FNmxMpYOw|Cf}LAMxz8R(d1*$BZG~hEo9Zr >zi}h{u=p^C7!NghOA*}H3m41nDC?K}z7zN>o7!&na4_qQL56V@T8;a2Uxgog?v$S8k >z3E%A(litVC&WVcnJQ0cf0P)WU0El7QkDg7@?|HuNHRZIAXs_e-mOUHK!4Zs1jCm#d >zV9^XY#A`?lz@xa*QVjUG;^KRUQ6B_!aW=3UyJ=Xrt<<<in9+POrXg3+uw3J7mYF80 >z6Lu4sFsv$0U*9^S2o0bPmehIjC`9ZGS6q#JdaLSuS{SY+rs28_hWjF%!-ZE0djOsa >zX~~PuDT69r(KA}-_!36CW#ufxf3}??SA*`N5Z3Nphyi<`n=w3@hEBQ6*NqK@gJA3o >z?UW`}<C}3k6iTr9F52Ms4G5`{gJ)|TEE%A9+81kXFb}P;GbN|sMGjgfc)O{Wu$sO; >zk?Y@)3+<4wjByN6Tqs1`2Ahek5(xj1XW)nY5jB=uZid>B(JnG>4sipM5!_G{_hUJh >zLhgfGMkE4*OAN~*mZ26c)5!!*YlW{|J-7wuCI^>bl}DJqOa{{j>EM)ih;K=fQr{BH >z2)k4?Wa&s}5x4VTq@^aBz}{zpJoyiG-XJX*<K8b*;3ZI89Ixq5?#n0LO=@?zNbyVI 
>zXKsP=t`f?aXWB0c3_;g$S|0<y$Fa-Lc4HI6F1DBIjR;@|W}>7*EVuC-BYj~6M=EV) >z@POMB2qE<uBU{JLj+`D2`%Gjp`x)tu74!-5C(P|P<`_RyJ7ZI2BD@K<UxAovO9si` >zxZw=>8)EY=mQUrj@=-ya{e@_4%TG3Ug$&ovI&8ISkWAbUX=kgLtt-okeB**)_QtJD >zVQ0wEHF2LC!kye8SzY2u6G(-@-^dH`zd^XQjy@Wr{z8u>eZ&0)9DSyVgOQ<Ob=5jM >zO;%{*=cWrTVg~U1FhrYE444|-BiutP6oZJqoBZ0bNm5Sp1U6A=bb+k;xoOz2;mwJI >z4df$%U|y8U#b&QeZHAb;NJ6sMypSx%dv-aNOSu*!l!cO%A>wn}sI-=L+T4bz5WQgY >zJ&JbE$F?Lmz{jC_Bk@h@cT+SkBKG_k%k?yx&c?B5j!7=@;|s($UHV5HEF5Gb`@7wW >zBeJImBaVviNPl+`z5BQ1%*EsYM?#&z-n<_t&VpfbmuvXb?}^4k$k-g<VSKc%X9aIT >ze$Fq^HG55|u1(WLq^QUBEUf&N@AzrmYiTjA9W2Hi<>sd-xId?OI#Y^g&;k_oayClO >zE2fR*p5mxS$4b9%x7^Z%1iE#A#+=fFJ4Fo&iyLROj+$HA-oL9zab?r}r~SLo6|$uw >zdP0ODTa38?L_k(9$Bu`r-MWk&SiZvIja{s;IBEu!=ty8WR+QSER>Y?dh2y+-WsRsW >zfA3vUU2S7AuFAQ!p`Smq9T8O$MZ;J8DSCdEo=@<f(DM#FmmX{*9&tYUQ)nXEBSV|{ >z%=9vZPDVH75P!+n9K0gfVx%>lpZFk=D5q@=r|p}%wXVgm!9%g7miCQ>GcQK<H<{tJ >z?dz{ANAK7=hax+>jJyE|+<zey@xvY$C847xwo?R8C}D{2{slFP8I@q2))k1`s5@f} >z?c~2fvITsAfG8OE*<1l>oAa)*w74KvQD)Mji3(e|9-HNA$s>E4re6Ei_so5A!Y>yd >zf{W9Z6Bw(YaJZ|?zPty!E=0fA*JAj?7zh_RlMKoS*jH<a$%LAVb*Pc`O@}w1sn_-v >z)$5^=My7D!la4qUPS{$EN4T_aHa4XNIV^K3c5AJ#rJ?RXG~Mum`83|<CLFEy(*3Rb >zez6fkV!JG(xiP1H0KThDO@lSHsaeT#kW7A01pBlx)0rIN<48CdFO#H0?uBFoX^7b0 >zY7<0TG1WyrO9aou65*^7l_s|>8db^nd?xA-m6&V_#EoD}?R-UyMqUG(EWS~7vl2-O >zJxP>6e#oC`^t9Hwx73{x-Yn~eBFuAdDyxZBpQ&PUCI9*zQJ(E?w{2SgJh(jPU>LwP >z45zcz+-3aFSURBkh)v}PzF{KO+8O%~Jk+y!NjCyFi(V((h}aJ)q3ODjl1<wTfvnUv >zV;=5Z#_zDmh-u+!w#*Z)ID=5xGItGKyT|V_d5G2|Y|D~tj7_nD{F8_0$BvlRPUB)x >z1G()w$m~^9W@17%&zz8%3>h$UtrYr#8<ZS-F-~H{bG>{Y1l5oUbC|wk^#w5e@RzWQ >zey`}tZd^1<Cy)!TBX`deR)7%V9}70IlOZ`Si?JN^>Jw@8EDdfnCdNHjTdl`+bL8~u >z&<4`wUU~_N@d)ay?6WACV=F0XKJ`RjzUZL2E&DghESn&#FuIXgbG>||gsi$UiFW-I >zi4*lfegSqu!xt5A;V*%}^M^q~Jbl<(_$x~PN_b6fp&q7B?6G`ecU8C-2H13p>qQvR >zI$PY-bqFeIUF(YQ0vsiTM_c82IIxWi7-*YfEDfd=hq5QBa_i`MxZN{yYLeVSu{=0R >zv>7T<?S_MO2zR=|S(lst3&G_6z|n|WL+6}vmu6(gAlS31bB(c-1Cswn4n{dTBdRR7 >zbq?Vx`{lOR2)i;-jcvZ}M8#l?JiF<2N0nT;?En=GN|Y445frfN>$U!-e?z-t%L93I 
>zky)6QUyg@iOl|>3;8SiMjU1@X^y;I^M2ZKQ$IP+K=xvY&Nt629MBD*XFdq9DlEOs^ >zAP9lA6+r-cpo?mXB?-)nBqN8;g5w5Mx$QoxF>XUWRUqI0eVFY;@uL;r^-C`_n4iqm >z=Jy`~i{bO)o#sbJI(?qP=kV=*=^)Z};`3L0ehAp&JIv2hclo9N!28eec?fu;@jVHj >zQhc=6ZLKm|Z(i&>U{rhE-kOd|&x?h`o3nK^T5w)0Al}>-<}`Z4T@Q^JqRHC#VWVQ8 >z&(R)rMyN`3nqwa>3N_UtCgzqOe_+u)4XJ3a7g=;a#am?7RqBYyX4{o&miF;U?RB@- >zV>hNoW|a!)6$JfCd)-OU36WWs3+Pn@Jq~UOnh}|Gwd&M9{$ie0BE?$~mlsivNn&9c >z(k{dUqC%2^U=D7~GcZ3H{HY%3)vMn%=HRA7%!esL+1GS9u^C4yvAqP7-XTw$UxhIP >zHwBwH=#C5d0Xu={saeJobiQn(jK3>3XW;L0nvXjNy3w4NW4wbX(LlEw&sbRq6yU<C >z0vrNi6}@x|>buz3Zp<F&rsWU?qU;qI)dg%J-4}JyKsTui!q3AK>X9O!1<`FN^)9*j >zE))e%ELM;2Xy61EGlNwP?gng~L>QBngJ^CMMp<Xq=D|e(CE2|gdA~uNNahYBi5pyT >zoi9Ws#1swuB{?Sva&4|4ZVKr=cmSr5KEYG6fIXPE)(lk>x2gRFaaEBI`+MTC-yq!* >zNJdYfi-c&LYY8G9USV~YF;W{Gp=Ps+F72dkT>C&Tde^tx3)*7!ZT7<CMV+KIc3}(U >z5Hos3yMYrW$wtzv+sUzQ>_b?L`ZKC*<y=ZRRWu+F+P91`=G1RnKN%f4|AEV~4?u5m >zq9L|CPJgSSQTjqP+i)MkRU{rvq&In4b-Nb8o=R5HR<CCo4j8R(vy#gB*dBb|x6~mg >zQPBsTZP_nBgiAUTFo@L3FtL^eOp1F2g~+Iy*(kA|c9+EF0R+yWpRpNu{GWx5-XesZ >z4q;b!;neiFJp=FP6<wAP&Ko8^q12F<B|h9oQii)^+J&u1T^d7|#Du@SrT&O^w#Z`B >zn2>Ei$hN?kY)TdVvb{3eVn=0z{D!GXLFTkRN+7ExMvznjSh3ptlcWG@Pa+w5Rf+G0 >zBE<C{rGc!9Z+sJi!O1ktcX;cBd6Cnru$3jiN#F3grS51mjz(UNR}$@R?Zfmx+4djq >z7{nRa3GPQ+L__4N!}i?Mm#A;FrcVzX-LyM0t4`@ef+4OW`SyHg0!LeM^GEOIca5KX >zn{t^NXc~cvcCuN!jnKD=O_?otdx>}^&OS^Ba|aH~ZOwweiTmV7cWH-hzG|)%Bm%oF >z`C{*yskM;RJx@~$Z5(m-C<$HE8h1-do|OYLTFL>15#8ZXLUG0HVzm*mw1la0BKRfz >zI^1wEY^_G?dN=7bsClIcT#AgrFnK95ZZd+%!96Xe($}yqfN&mmAmY~g+@{kJBsC&n >z*uDzRJKU>Yv6`HyD3G_@h01FQn|h8|S=LUMVAhQG$iW|T<$X&aVLYuR>}rEpX4Xzm >zQOm7cj1aqV;S`iwdRDQ5>}Vo;dOZ<au4cB;@DaNKiUdE=)kemg9L6>=&WuFpLw?|6 >zz_c^P@<Z>@1ESz9Q-F@rHfC$5ui1<>1q_TwjEl9?-yz7ofP{X78zYPY?R1&^z^^IS >z=@s%rv<??t1f@LUQ;*0GK7?27dc+0ixbqwLWy5`9-9N1g0)S1C{MR$+4geN7@>7Si >zI2RLUxFOi+Nf$N^VVMXQIg7^tVfhN4jRSGM>O-o-7kC$!Dv2TU1PwHi#soImB5Xm= >zM&=GwAn_7Dn@y62E4C8IFzI&MKI^RvrL2v}iE)(=^kQVC@=34M>~e%l_Rx9=DunHq >zMOXpFrG<e)#1nAVS=0euxmffRemPb<v2ca`Y;LU5sM)y6ceA^R*x0;I=+=1@0#dTg 
>zjT{Y_GouGD!(y(dD#Q~<=+$e5Bt=}<+8ZbKvC#F<1%DMeY~xuDl-YE@&G})p!re`G >zkPHQfE5lpYAh`A&2u4JPO+0~w#Y!(6ds<)hR0d7FYm#^&F%zcSBuBH|{*DhFQO^R$ >zYg;n4T?N{pW23PHV-l{SU1j94-QD&C5d0>_LK{%5b>1U1=j%6@Yl~C_)o&h|_<C4S >zU37Vfb-$QjxXhYI%W(=|KC(q=mA6RE0HD6<#$?WDsGxtgh{#i0Q=1?n50oH(72#e; >z?)pxXcg$_E=RCyQK(<z_9x82kqBtIzOIjZ*==THX0+6iYKPI&3B3OK^GcE)spt#Tn >zNcuv$t~KaR#wXAYwEVA~N+{RRRg4=;iM<aJ(fN5kH&kl5yIf!HUbHDML6_*xcdpG& >zNH_?%0M{HCcHPaSEnb|A+?x;+XR|$m{H=eZJwp`7a`BZ28d!5)n%j#FM%sp;gfL;~ >z7KGL^aJ%QCm3JfJh*-n&Dz+#H9S$uO3lbRO%Wou03WHl7OwElz#{&|<jR<A@5{d|c >z+=3ZcI`jHME7y5IqBY>&;>=2v2NUHhkqvy+yoiYa?Jc1-zxH%QJ`T&;sm#Xo$gU}< >zf-ACX+7@reXU+xlTlO@5Xr4R4{0d{Sd40U<)m!{<gY=de;_npvh1~Q<ucQa8?nJM~ >zgV<Q^iUta3xlwPK15oSuxio5Zz(7S#Pg32j<LT>a#J*ZP`y*n;qZH7Ql3a{}C=R`5 >z4C(}<ft=A60%7i(W`c}AVj>R8)i?tJLk+oi2L`aC0z2~KrF(W3p^C6l$QU_YBtKAw >zGHF6|YP0;%R6MiSA{~bi?06Hit=1pHH_}dyy?cM6I68TqU)q7sPw>gO+b=zh?-0Jf >zi_c5=^x*S%e2xMB<Mn=NH9q&?b00n*-{Y5RH~6KC?#0vxpYizo6rX=I`lUHo`AUBF >zBku?Jd<r^<7e4e9+CzK87P}5LqxVO|yFE6`dT+*iq;a4`ZoUMNEp-Di1i2F0(GPaO >zyO&}G^zufL`I05_Y`(|bsVldLY0Vp^u9+Ls1BJNsxDZ>v$cwMoN54F<Pin+D`F-T7 >z#<e%>w8Xm)VidL@_97Y8L8sh8_ckEG4;javvVc(z;0~G}V33ME2xz2pA82pJ^?dP! >zm`<u$5Im^G=-wx++Bx<hAPSIUFG9Jo>yy-<AnFx^!&3)iGb!~AuUWMzc)@OGbOQ0N >z9sN9ZjtPRrNP#i|id=h@Iw7(NGq*8Xx82Osbo6s<NiZpWjHk%8rD`E?u24q-2Zb~p >zO<)#bBD^XPE*C_uN)kaSAOi9x&NhK_wcPR(Jj_K(D);>W^aOU(IBbl<gBXXi@j&Br >zG!)xX{SoPE=^g`92$A|n+dvOhOexR;g_yG!vLN)h7c();0%Kc|ZJ>uTD;Fajb2Qxj >zCDPu)9ixG_lC#Cz@r=%3P90Z)A~;qqP35RZmU_MVbtPNg8F)RJ=_>h!!0V}87pAZU >z{e47#UlKJ7&q>T^%K~T`22-u^U2OW-8hddWqgcz&!lj6q7#icXH=M13kAm2uvK!tp >zue5&9<M1_N$Rb?%fiYka_*J~U2gVByJ-l+fN2Xt*x)K##MjrHD!ud^j+Z7x{S_1=H >zDBSO>?vfu|NSP}JODejs$1OGkZx;8_=zK6IYQQlI#I0B^vPfO-?v@}DB<<+w=%a*! 
>zU;l~y2y%yXky~{@4J<)V6Qn8%O}u){7cg(4EcHDt3yB7N!XsplL82$nPERo!_Rh-I >zS_V_JMP%z1@*-?lrDO*ETZ^7()xzZ8Dn@JMl6oQB(PkJcV}ex-j=#}F1|H23fq=OT >z-$>T1@ra~cNh02eUG;;w-V-cuhG5Cl!Szt{Ay6EEgRyC*^B48B_F}|Np%VXN9v9`A >zO*sw#4Tt;G&tiM<WGQa?4x#QYagiGPTQG9i7LAgMRE02usU)(E=^_!^D#WJ4!YY1L >zOgOO{qUl4@el%5KkoO88{KUe69-^sSM$Y@Di-el|^i`oI?N9}}A{O(~cKGwYK>H?a >zJc*Z7w__+U)93<5s7@i8A2*~bS)@PJytQHgz7CwfO~~3ORAa71a-at+U@VTD{vMVJ >zfwCDP8gZSdQ5Yd#I|5&Or1pkiyEAUr?(DPQ;}Rne;tjTncGKRf#TGZv8!>|6miih@ >zMWA|Ef`{?*RkUwDGSK6vlFqe;w1z4C3KJ(a?44)63r2550T)p<`!1q|UFsVOVkW46 >z5iYBu+Lfr6TR8zv1b8dXO&k(S?I;x;CvJ>LvNMJEm<b@NA!dL*D9{`z%W6=fzWs_` >z{Ve8Fdd2UIau#?+3%r6LfET|X^xI0l@&f9)XqjiU#%H8Wm(sDyXQPj}=A&~(1HbiE >zJkuz*7WlV}ot+3gTZY$-U$DUMw7~B$s%`!gdJ^!^0a(SeP(rP-T~e-Pfp*l&DEKG- >zi!3%p96~J2pb}A%X(eQ8w^C_cFHY2t#GW9KIkuTySPX9*qJkg)sPI(0v3r4J*~O8N >zRXr0f3_@118<7IxvZ%n!j0N#-F%{$wn=xuka1w!ItrAWFAF+7hQz+@b$Q_buF&)G1 >zsx*_RHE{^l;73!$evtw=uGsX4Q%GJ(fxC7W;`FrgG}k9~2Cg|I7RrY3M+kf%{x>1w >zr$Y)X#G`}=KaTcFh!YMt)+lBMG|}MY9i(oa*aI+Va@%h0WO(y5Wu11iV)Hq2^B<t% >zS`0txb2}N!*tv4^A!uT(QoP+FH;)(T4&}nwwE}df@?9;KCO1ccwZKHyVlG+y!d7e? >zP$C;_x5+K%;3<6NWo6Pb9u~#B%VQa4YFNz+PZKFik}02~knjuvn{L8l?*ar;cLI5q >zNH{l%yo-?M2w1j-OqBrgS|HC82`Ab`4=ay7O2~@^?2rkI-De`t2l90yp(lwPCgi08 >z_Gb&Z+C)~=wDQ;jN?s<CAF-0@XD)Fi0`4{T`Xi@qQ3jSMIZKrBOO(r&D6?Y)lwpO) >zaI=+xY-@SAPQb3SU|&F#xWF`glYm}hK@SS3BYdlXPPCw}0y<P+#aRW;iv1d?;jqZ? 
>z<xU|1aUiTbe7k_XZ^0shv^~69KzEx^<4_nd}m&sm@qLq**~MP)8QNcc0mGSTpd >z*9lB!VNNLz-y;C`T7U^eZIb}qW`S@QC-z`t+$L=-?JsY2G`Gtw4RDPx^Dt+@(BW-| >zN)eZ&NtbuCN13R7lB38Al|18e?USH_yA#eeE@1O%?uG?We*wcTI2J5mvvIAS0Amj; >zf&E@O*^%5|Fhp~j{7@5WFE_u1hR}9>q@5Zg-=B#YF4i&f@AFI3G{1B$K1=Ya!RO6p >zIQRJc1D`&8PU7Qi@gq#`msTOY9-q7LS%=Rie13q>gZTUipP%FND|~*B&-3`S;nR)J >z>-cD==4?jydDTW-?AA_Y$@e#-dw`3dfs6Nn@i9K1;nR=L8GO=Eat=PD@HrQs$@ome >zXEr|CsYRPFmzyc}tnGSPJLQt^zsjT*1GVMAxEY_j@VN({W_%vPM?1A-^Ek@_e6m<> >zrXx=a<>p856&oN@pIj$5KZb{<PoBlDcO@_OAzs-0B4#5Y7vcUM5`K{-{32E+gbz4` >zFW}hXL+E8#k`wroE%+1*ex(J!+Jeup;C>4}$AZtd;EOGI(1I_u;L9!eN(+9Y1;53D >zhb{OW7JRh@S1kB?3%=2WM<Nm(N1`g+*K9tb7EN2sM^rpQ8|Ewoa>dQ;O^*yA{A38> >zu_1&fhY+3`LU?)z;des_&kP|vJA|-f2;r3>g!UwYytAxO@FnkDIG*8`H6eC+C+dNl >zRnQVhJ};SEJd}(pppwZKn%xHdQ0wZW;RGjFu|EqxR>XpN^-a6(NXLAkub$2K*6S5t >z0$70e6L`ml%!+=z<42OP^(LP>F`!OBPt`N5FyQB%0@xlMI74@RhQiHQb_h1m<vW%9 >z(+IR`(AU?9I~R@dH6dJ{%d&|;_<mqj0gD1PJ-`whx2-+(VJ9ggov;wPnz~oBsnHEg >zN&X?7VzOc@Ve%yf?rV5}!(zc3iUjca6cF`BaYXbN!(n0S69UXqU>a{5%pX=-mjVJ7 >zCd(plZ%Ba!F5C{#U6=w=et2tu(I{dvMKHShCCJYk4hvI_1UM@NCYnM9n5})227zV{ >zmjdCE3{2e26cH+~$OL|cA_Nlz!?9!{`r$B`kbw!_n*^IOFoDk{fm9aBzywoxc&Kh( >z6R0PFR2FW6pt98AF#MDj!M-yLCS+iO7bd|}KEg{FEt*bY{w2}=(Yj7y?I~P`P2}rw >z7pyGr+%Lc2tnSA&Xrky7U}I^}K5;K~0isbEg4WOs4zV*4nVVs1uy!wkw_}h-vcs{L >zopf7<vqBK+G)7}93c|2e4Yo?0(rVfys(fZ5HU%fg9m^lXeQj8mZvunj?p)se8@O+* >z*Svcd_o<viBJ$w$4IRsbuy#C4L*P{E0={mgoXsu9d59gj1G61R49u>9+Z6X#v9b$; >z{HGv>Jos%7!q?YdSYz_e78X}7<ChhhwE0a`rjdm+f;`tO#T*p<7Y>U*YhAq}pocSR >zDh^nkC~e0zUV*-yL|MUQQPCF#<BV>zPoGI#s#DD`OfyeFMpDCMh~nB`36^xAsrd~# >z{F6pH8g}SuWX3M@rHBpTHviehAd-XMx+5?fjcMq@Vr=Hk!0)KwR!fkg+b!wn0j<a? 
>z=g@W=TE_rt6@QfS^rRC5F*@TW@1$+{+94;RQmQxH3$sCJE5uO?H+x<C(5s#FBIO9B >zfRf)zL7=lpJ5_{~&nX4liq5xEK+QHE@BxcfjtSR3wj;(5xcv$fzhCS*!I2#s@x*_O >zUn9U>?*8J~op37nrj4|fqOT#H&gnK}VTy$_b%2Rq<eG)aQ!Sf@EsX8@TzBj;R3w}Q >zvNilSXl%481cdF?Z6m0t)2!#0_+8aYYxvUVP3Z=X(bT&JQpW}WSn8Abl|=J5S)(Bb >zewOviZ~I}^*!jQ@=ZjLx&njiy*`<8;-0)KR&<_7QrL+S-JVBK5-V4dPRH81E&Msxk >zS*5i8@0DWIg)b6C+?y(55sJ9%>>?%&D?-8LYX84Q#kyZA1u1c}{uRS+)<1vP&H7_W >znJ!4P)bh+PQ)+4467eloaX7(Nu8xmvu-~8-;P?0_y@s^x$qdoaWAmx?riu1P_uq>* >zG&>Xq9S1>J=)FvO{Hr0lA^uZ(K{v{#tfQe(`a)-jYXrs6k&YV*@hc_wqWiR~)0nK? >z12cG880Qs>_|1R{q6NhD_Yo-;ooW6Mb&!3Y+JJr^u99B^e3njzRBmAa2l-6osfV?# >z6+7_OLc@iYs*}TZD!_hL0p4T*N1sZG1&bcbK|QmQ=lD@PcpMoG^af4Pf%Ssx9J=MB >zh(8Bo2y_H_dhx5!=!M*u2YzH$R&3xWIS2v?NPH4O$(&fz>#f_iA}Pq%f?B8bd;w^Q >zyQ_uG!&n0oRQnQ{4*|gC9dlUlLaeuptm4-Yx2BOryOdpMSyN%IM&eEaku3ZhVX`cW >z4iij8SMmPsAVKRYTGwMQQjiwo0FL{qxK-9NlW1c4UEs&s9zh0iP#8Z}wrbm3c*BMt >z{AL1dbggkOJAhxktjDn$jK_PmJy^boH~52L8)zc7RmJ~yH!Q9bXL&0dXc0Tey8#ft >z(D`d3KneVD@q{1Gtl?G9Cc9#~SdYd9f`MazPdDLoBEf{)%CVH4p`E^3e(;_IRv>Wv >zx8Vldo%Kg6KkmSRA$l$KI)Z+?0qqZwLxrl)F32*0CUN&-V{s<G9c;-j^wLHqx^5M_ >zb)h;$af?X;@k+8~yC5nAs~Rp9fxSDDX-8lOvHoxbA!Kawrbfq+uJ1MY#V>-|Aq%MF >ziH^gW^;ri@F;()mb!2x3#ZR6-Hj+dUc#L9w^n*<QV_(K?dM8BvI}U+XCw@a++l%mQ >zANINgPM|_Kl7@q6_v0zYtnWxv3o3f62$zi#9A0YV)Hl6G?e&^yuRapgFP^b%DQc;n >zM}xt;$-y8Wj79_R2Kg1|f=ftDw;Q+{hyvS{e0&OqP+wBL$zMsudCa}B<usI{OK;i+ >zy_0voQMzCI(5aoo-KE89py5hhLvX!352fc|IKCcbqM2@oBv~jAi;C#$-@}Ro`m+27 >z{Ft3GB0Lp_55Ikqo&?G}d$dDE+F+47KioyV&S8=)vPim3k`(Y@p(2yyV_C`Ef@GN> >z*#lc?mLo`daiu3%EZ~2BQm|NH(c1xU5G?LSQ4CAAg2ksX2OY-Z0zvQYv*}^To{cLq >zZZ)YDf?5_T+#skKjVx4XlKRKlq?QU&lg}oFn>g&KtsSE>&o}Ae6qukl)}(g{e%-PP >zho8iX^IZ5QaN?2zyg`tB2PV*D**dX7ZAbGdY71q<%%^~*D40Z5@m)_q0BlhnE6WS= >z=fq1Wk1Z^$<WGy2@;ppOtN71SXz&RiOrc#^$(vGWldHIrLYorgw^?XaypEn|MLds{ >zqv_=4AEI_M`@n_Vd_NxP=Uc8ZvtrhcN!MgEYY|(BQw<ZuOL-A~2D6HfOrd$PmXbot >ztK@?}r<xENs%40mMW8l0$ln()AyAuA$=^(&U0KDuQfOD>H&s$-Gb;ISQ)swA_wf|k >zoFLztLYrU7wG`UoD!wj-77X(5rqGrmUY$Z)Ud69Vp{)$^*(tOeEBVwE+AUT5!W3E< 
>zzn`2!y8~lb3T<^2x24dOAV2Xq$$)CQzLFmmFC>tSRs8Sb1p;Y`K)Hi_ZxXY$8Sj<+ >zr6ls!EqJfuzfIvkAn^Yed+!1tRdwx;pLrw<$-oRSBmn{rm}npb0zr8MaKbPl7y^?_ >zk`X0hNG43ikc4?ePy(R?mdRmS>Z7$qTPfOWTl=fGwF+nz2oii2!73UA#79q<s6j9Z >z5SjmX?Q_m#cxbiv-_P&f-*u9;&wi}^eyzRt+H3zJg1?2~H%IUvVEA`O@E>CMt_c3a >z4FA>${-X^4<_P{{41Y!h{|SaaE`t9g!yg{Oe~RJ9Mev_y_~##%YJbZPhJT8gDZkG% >z{P&m{CDXi{;lCo`TZP@E!+`_b%oqo}wpmB0dn{lS&2}-EBs`EN{7ei2Jf~=s&7yy> >zsG?Ce%2+vQl=Tu?kx)fCYW7LI8DoMM_d{*W3XvuAX1d%7E0b)ckML5g-^X1PWg3)W >z!;-C_!GW@o3lmFU2U~byPH$g~VXx`@EF7&POQQ2GC({1Sk1|z(Cs;yW`E8Jm67uSA >zgRGa3H+&mpwS>Io+aTk88DzJw7(T|jhxT)E6|^Zk&S%NmT^jr=m5{U;Nk13-3zZ_t >zo#k60gjI|zB477*DLkvNH4;)>^N>vs#84n6bMuf5hjbesK?v{&)k;@og<nbMFPJdr >zsM(eYy^$(Ru45!*BIpdZ0WeUce5+xN1m?;sV_+_Ul|;KO#}N|9az@i(jOmeyoPi=$ >zJ6`k*$?>!d?jtlYmiM&}c0-l%J4vY&i8OlKcycY)LR(R^!62o-P_VOnY9vJA2rV3U >z(aC)JK``HnI4xuue2@&Py;dhtIc%Ewcm)D`Br32tF)&x099-ZMpL91jE0Il2{(Dk5 >z-)*IWlMwhG8GkVuRXzSzm0igAIdcl!lL8`#ZiG`GFBNVi6swpTjE7`etU3Ux;NZuy >z3m>u7zBXjy?@7Tc<qWq}P}28S6>=ojnsE9*`n=hN4<8aqB6A8$7I8z(E9Gq|n16tW >zIn((oWRlNMDHPH~3n?6iGgQLaCgZ3@3&uYs6G{@Dglo}_8gRbR#a(TYUD#fOqU;Rj >zVj!7~L@W-daI1!>gt;hMBqKSuG=h5<a4|SYvhFUw_sRPVkhXj7X9$GEd$T~o@PR<4 >zmf~Klc&l5OECvGpw1ub$(0JH7#lv!Ml0TMUy4cx5CQ(uy(k;q3A_;kXquw`?^$;iR >zI@yI-tk%%on8)@%p?bt;(>yE{Al?{-x0<D*{eC#}Z)fe^q?}`#MZI>-m%)+_a9%(v >zdDWe)r^J^Ed1BNdCmyj>G-L!zz6d3Q2MQ8~gWA|o4EPRBIazHG1D)Nn_6x8GG%HCs >zGG15Yd`GHUi9~Ew+F24yrM=7gNZ1`eW!VZr@a_ksjjkY;&x1_FJm^`Gl5Hdb>$EaD >z_=5U|9D2d|c4GK*C7^AhD<I?7c0%w)kt@+IBu6arlM7u2yAU6-FdMC|p?2Z?R}n8M >zMXn^f&>peKPb+jm@Q@^v1c92zTV2>MYn3rvDw_*JMTW32_1AW~R6(1v$Ti$9P@e<? 
>z9;}o>kj}0VcHvXzOT;qAT3yL@;k}5JGHAmdAzGdowuqG<SLniF*way1xKHRRwhP2W >zBnNLBSmY|P3qOjo%GVdVAo1^>C@b?|tILjaR#@p?w8b*d{DKD$77Fjv(mr6$1Oa|Y >z8<zQ3{rh%bTJ$#8NW0L|D;FCCvA9Org+E0skQ1j0W076>WyCIDTj;vRE(9YMX7HpM >zA^SaIK^2x}7w(K$;F)*VSje7^SWxMYgIwW=g)Otlm2MZNMJ)2C6uQRSh3g|0=4n<} >zhFwUASlDJ3xhB|!n21Hbsn9jiE}UhIKGbl`bFHpSj6D$xTV9cCl3h3yvB+Og=$dR7 >z_CzeqR;w$^E<6>nuoV}%rq~5ZMiBzdzopQ11L`P(VP0Z&-DnpYBNnzhid<81moQ?H >zUs>pyW*2UYSeTtw*L1sZbHu_{Q{<Xq7p6xn^6Lv-*>)i<Vqx}LT{G=MV#LC>s>n49 >z<3q$Ee@&rlwq5uVvzyp&{7o1Stgf5v!Uqw%`-$o-a+&PH>k&JiD9=Jy4&H~3*!edz >zzjN%u;}N_2w=lnE$R3Z_@kE6dy5=JP5j#JlL|qoUP#v+mpQzCySFT+siP-UvGQace >zLQcfa&!|#Y9{PX8?tY?7i(K>V!mx-P|0MH!GwMHL=cn1r>dLnZr_ae{a6eJ0MXm*Q >z;r)mmPn2q*s{r!BB6fa8t-2Q4g}+4X?%&P)E<zub?Pv(V;4|7ToMKkus9+vtb&atL >z??hQ)QS7?bF6@g~DT4W93SF2m|1)Bx44PA{u2l4`C@b4_MXu}Y0v}~XW7qX|VU=u! >z-hB`4h7gZ1?n|A0pJf(ubqa-=pI`tW7vYe&VAlY#WH(m9m;zpv5HauwW`+nC;kP29 >z7KpgN0_qrNVG<&iyTWjW2p6GAMATnxaFOO-CS#s$Zw2RJC>aRZ^o+XFUVFsu(v=E} >zWmC|OAySXIC}Ob)qYAdBrbaC0VN@wB6|R#lTq=L7!R{JVD%3G6{k8PO#wEiX<85YB >z4jstYN}ubQfEUM!igRVs&DL?h6iX_BpF<0dWO_y>p4ULm6ustx7nbt00jyk1`hYow >zpF>Ob0xZJ8RL<um2lDk;;0CZp&x2qpEI(4PmT1mXZpu$<)^D<nZO+qf${*Kk9k|Js >z-khi3ls~@NI(U;Uqd8BrDStw<DQ=T_BJuM}d^;pp=vYW$2IDj2C*e$iTTHJP_&$R; >z9Fa#m>rf$BW*Nx}!hm+84$#{`MCN)8bOLsgPQc**2OGe30T*lxVvo?Jz!6m;o1SrA >zZxs?MK@%`3V5p<-5N768!lWGo3UhBoRG6T9Kz*aE=#h0V9s{BSHJmEH+^muAuzJTs >z2Hctv>^=gcsa;B|t3R)_3OoYB9t$x|m9;QdVG?Wx;!Q+T5E}N5D=$F<U4sL7N#2Jj >zTbL5ihPcI7l4uyfsgjS@dAR4yN~wJ-K4P%!kPo6IAd&Co=J=|B$Qd1K1TXlIt{;dz >z=s7s_S(8dQebRxftS($i8xn<%@wa6x?RvtD#lAnsvJtvbyg)mV{O+QpvP%$>Uj_L^ >zH@#o<%b3MPF$nCYws;qwm8K7}I#H)lDIQU^zEK<I8l-m6T%z5iXKfv5S%-oUsZ|1T >zTpY?a2)=%~0G&pvG?d|eEJ}JUvzKGH{^1S(qRdYwPlyQ<%4&n;GRWM=F)FPGNbVt> >zW$wAO^v2=&2+t@KpLSGLK-V;L6tMZ0_P+zm4C)ZB?vpUTM7~P&;^umh-4!lH_Cbd~ >zP!9VeC?Q-?&ujk`u!PfhfUfMhTfDy?l>F@v>7tvF!eqizU3<Z?=#4>J^L>bSkH8NJ >z#y7myCiWBvl*WT%Sc6G1h;?EDThq+YjHtrI*=FHZ43(i&*4!BPK_q!!A}y3W(X$F^ >z#1bP?RyI0-hK#C02v($Lu#`7LrW8FqaTezZLzr$rddr(Vt5yDkoC$qd5>G?>xc+@O 
>zVzrghNoyNX`%xvm1|>~ax((uceKcyZmTIQ5o{C7T)=x+EGXV#SfGqYqyqBR(g4Pd# >zw~U9E>LC-(<jwsu3=~|j<Sg#tuv@k)t7u}l7e^}=v3;OU++wEj)C(OMVRS;x2&e~U >ziW3T#M%B+j*i))jA>3fWdAez&#F4mIzI2ZlaiPgwgCJ2sQj4S0EtdHE;2ZTtj|b`- >zD7oulK@5x#1d|lBoX__4HV$ppY>LCJeR_Y$ValCjhUJ<NWs<VB3AT%(Y7^K0FuirM >z3Z063#bSeF5(AL5OY_-`jL_<hdID0!x*%0?M0Vr)eQQAQvmJMAs`NsSJg_;f{ZH85 >zZSFTS)jQCP0hq)qY1*-E!P$d<B|N_Mg&HhksvduA={(&FiT=Is`_J3KnMJ<t3p`;W >zZm$pU8O{wujyW!gsh2B>2}5yM!WnRk!{H<Yzd?8%PB)Ow|LCXH#AENizlJ2;Vh0<6 >z(@(8ZD*p5b6Bh$$eiEEwfWyBxaan**erMvI2K@22xR`>k3UD0y0>CcdEd_iAyk>v{ >zFaU55a7qAeaBBtLTEJ=Oha>#!0DOVLzd8wHtZS;kAE8wTZm=#;qiikMxK8r#%`6f; >zvi%gYk9f!wZj~KIM0prnB+QZRRiXzz2cm4S^E*zq520irdYivl@82`6P#7%RA0~T} >z=>_g+Ly@x%wLndR9g&Si&0vYt{zu9090(ta(>+Z0PJ=<JHr1PWr1;12Xy1^K5MUwo >z+WDRrd=yHni1XM|7BJ1Yg~H$7V`M~o8N)U3cnJ3@A<mLQoEpJmOW&V&<D{Nl($pd_ >z|HU|waVhv9OUho5Kx)b3MH;ya4E)OE+z6WgVx{grO1f&Vl=waHaY#JOX@8A;p#1Cg >zeLkWB{=J0uxD;>>*e9+miH#E0Iw>%dMou@~3U`ea9)4S@?lNGw(uB>jxj^!Pf&9u~ >zCrH7jiAdrFG!tIN?nfnpM1PTWNU}^$-w*5V)7Z6Pr4q?KcE(DDkNb-?b)3dER)MHz >zYI8-xE0PoOMN-M-B(V_vyJNp&9P6ckFsmaV4U@@hu)3}h9{oG=&Ni0#g5E+VACTIX >zjqv)~AFY@LpNJ|q!(X#?n!Y`aP|<|v7<N{o0kGx}$zW7vUDSx#>;Y1AHVwp+hb4{h >zzJ%~jAA)O$K=&o69q)ndJjk_R1K6gGU!k41FGmts?=kY~;D(uW^xWG{Yc&XlY(QgY >zVgcthCpMj`@b+suHPpK-)Cf-dC*MMz(IFVSN$@Xg5R~|`*d_cX!Hwj=9#~uIT!qkZ >z2mXHQC%B7hz&0njqM}H~0}iquV2d_=Xe6qMxcT!a4r|2OCpxn-{w+emCYm<=FkbB) >z$h!1TnN=smeEPKMXCQ&R7olP$H26W*F;&MDKFvI_&`@>RWm*s9a1aYGf<-m$2YHBR >zzG)9qn%vp`2yP+IP0@F!r0@sm8ME%n6s_t2uTEi_Hy1D7tKmv8E#P2(Ns0l_k|o8z >z?9;>_@-B&Q^WGAFXmwood)-I;@6kQJ_?KaNx4X^%F6Zx7@a;QIC}sG}>U#6Un}^<f >z`^|T<zjDXlrOrO%j!8c=;w#h1f85sHf|kK~`$?gI<S0Xo-DmI=(;15QvREe~)^sW4 >zZw&Z~fUd|u8G>4TMF#pw8TeMI=lApPd~FWyLl(ZcN*1yZ@5FDD1qwZq4HjC{Ipdl^ >zP3KH&Z~+J6Hu-~xPvWC{eDC2CEfKfaJ?F3C_qOnRe}1^9MfZn;htK!9N&fa`9Xc#9 >zw^J?c-E1zE=rodZxc>a}9pP|ed|So?|7qgdiIo%2p<-fQtKBjA#vHCKt({movE%{D >zql9ujjQn26HiSgO?juaP0^}Y%p}>7dKDl5Gx<#PuEObpw*HZ=U2574SWxZZqTgGe9 >zx2-sP?B%w`vs>Tnp%dE{G>Rt_<ap##M_B52o+P`mQC^sB$5m~+u4>CsKCTEM?y9zi 
>ztJ;pbn(dA>vQ3Mm3`G(T!;x7}qL@M|XxXKDWbX1`Q}`ybYPt#ZGU!NR3w?Vi8`Q8` >z5%(F0w}vhRL%4)hC;&GlF&2H^M4}Z))Wtw3I`*gma6(Dw<_40#^iK0A)uEu_KGmE4 >zy$5so^Gy(a0Q~7|Phus4$5XqHK(rRUSSiNCVrDYrM03z_@%_e&Z70E!dQsj_4LW+a >zzT1c;2t-zBKaAO~ok|?1-EVG%RIc7@-;Cc&e9-T%=w<lf)R`!gaH@%^bD5e4)wuwQ >z@P!fCXmQ>}`cFjt)1-e-)E_7P8%*E!p)-697-w#v*n|c;dE2KE*TX@_SxV2=X3JMi >z1z%|q^YO;Bn=D^Z5<nJ%QCoakD(E0`yUWzC$rpP(jV!SXQqV>FAz<GI)Anpr+lnrh >zYP`jP!H_Rb(NwL!TQl3_jcprueA?C)J`VTY*;A^th9sP}w0A$w+P(GoiT96emDZL1 >zea|yOnnWf*!i8=#?JD}i5JKT64M}{~(4xmPI;3nI+0=*^xf1usX5><VmS{1W9>N@@ >zp|=XO!LsN1Rwh@1oP<AMF59GvvM~$qU~~sIaQ){w-yuJEcU})A<~OTm#x>%WdnUw1 >zhcrsu59@+1Sn_=Kd%ly9^^El+>RiezTcdqMX=ZOO$5riIYpQzAbrU22@4RTPdhYdX >zEUt;`oj+fNhnt$J0*%l>EK?O2lGpikZ7a0zmwmMLealB|2Cm9}#WM)<l)(Oli;Y<J >zwQoSaVe4bk8HAnK!bjzykvKV!gG><~6D>g_>Ji+#WgT1JwRF5+*10v*j|JbkmEo~@ >zbQN&lw{#+mP;nKnsfvH4Dn3sMPy=Fp+P?IJrc~iM=2Wn_;fXTKS@BhrF_apA<9SRw >z2<6=qAss%UxISNU$*^+4LwdMwMsmVQ<Um-trT|sBE^lL3qqeEhfL(?Cstj}Yv0%Yb >zCJMKE)1Vwt!(M0zg%amCE9AORnW-+4P!|x?Uvd;Ba+1BhN%w;K>cUly^;?`s7GQoC >z+d3|p&^SOh6~J)T6Xz<|J+R0BfdX1c$pXEFRpVZC<-JzH!a4^$0H4C5y#iYGvLod* >z*-?J?MDHpm%Jl)FuSp~cXcanoW@p=Ys!?jwXq*0NYM&0A3)l~|CQJK`{%J7~&_`3a >zd=ZJIk>L&@+~k1OSQW6Q{1bd6Z~Fu2*ny!BL8V9$4oOMf2?CnqbMZ7|JhrvEPvgmF >zJ{F%?elL5jmf=ym6%k{*Kft3I)PZ98^Y8{^E1BSHZ{&Sjxk70}11F-juYlBQx-ieH >zZ@RGDHz?G9QFgZ*4-AL&P3yv$-qX$c8F-dYu}B^zS=#*j4j>r#&LrQ>+$KvGI(FJx >zaENQuD9I!%zNfjs(DcFaO$FUe=W>06`Np2^;~T?kN258m1}xf85^Oex*GcS%ak#f@ >z@u}y9v5|%Qz(L#|$Np9!E~j8J_)15UfdP#1W%~6ON#vDp8rx#%x8q@y7%n<*!$%@l >z`7>7`RT+`W-cnzG@nW?9--!Ny|E>J5LtV3Y)zs{f;#_Y!)Oos%|A}1tFMg%l`~+!e >zkdt<)!LzB&?$rHw3qwfL{rHE_Z0uT>xL=iNZZ7HCG{lU}_5NhaMfdN<D<P1qG!WJc >zef>oDd8TGPUeN7OHTGmTY8#W$-qZiapHp?)U70pv{trb0_&Dw)hLRV0htZJg8v?N= >zdTTSta6v;fIthBjHi{1k&%|>XTsxhSAST>tL{v~W9P&<VF6i16Z-y>Tw)1td&_;`) >zfo>Oev5^X)g)jhM?2c`=P=7{DO(%^_uUIJIt2N!nsDMI<E#18SqJZV_nttN*I)kvB >zg&sDjHYOfvzdD->s<W{{6ZVeg_p_#qB~m`*l1QdaLl(gzupbNy$1fQxsG!k6u%okE >z%busFzS9~*b8hjo*u;pTdG{zNaj73u+m}VCqxHI5(R!==;d!)&7hdi&gyQ`nB?xqC 
>z|3h(Q_hP2^Gu(r&%C=pElmLt`QSa-5?^o;t;}Ht3KP|?*fA%W90K(L8sp^B{Aoy^3 >z${*bo#>ib^*d<*75?mX*F6j#oNA-o5uIvk=rM~c#*p_$U1Bodeo7hCtPFyEaqn8kO >z?PG|$-oOXaR$3&)cjVR$(f6thx~@tI$@Pw>58C)4)`Zc<Unb2}+PK&W(8j3~pp9RP >za57mqVBW0?3_*x!>jPw4YVT*Ly+0gE6kQiWdTj)C9=-01!Uw7Sfz<RzUE1^)ZV;P( >zScS%K4B<-d2qYD)-@w`{f<p6;iEjRsGciT2Xo2j7dqj^|C@z*jNsoQqS*SUrI-be_ >z#zB|4`#u-rx?YNlI>*l=onsJl^u`gtaa91XO(QEMhARI_BcuXbq};C7b{~_An%)~w >zrBh-3v;Klti|;|-Ro}yUOln_`slpDNzvZH2fjA<)rcLfOm{-20&pd{na?392GfzOL >z|7)brfSw~+X_(>B+=Gorv2nmP(f&aX7DXIMG}&+IQtY>UiSJp!8IW%sFQCvN4221< >zgDjIk$q8H;{3uZH8hTxCi!^7n;95OZ5d?ODxyIzv!UMd8+jpVGl5{}`SF0E#Qu=R& >zNv@iXTIhvzUo4_iK&l@I|51NZY%2M3Q%N^N_zX#+P%Q=$I*3r48oLy}q^8C%A+XYa >z0k?6R5VrrqLfwXsp+U-VTi()=eZ-f%9)~2zmu}-95yr-@wQBxtI>x~XOXx93Ow(=t >z8SEi7P50v;ktURh=DTh@Z+UHFTU8**8^@Si+8Md>MpzDPLltER0&i}|$A7W^Iz8Wy >z2_{vw1p2klhh?^}NB3hBG<;$Za!^Y4@ryu^AFx>kV-M@USg6E9<`IeC8AfOjdjdh5 >z2k`tn6J`3GgHm(xqHY6CPEx!+6`qF3f}l4C$%?c1Zf&(Z4_7=0k+f>bCFSF9e4g_T >z#Wv4{3`l{Xf<3DU9U+Nzhe=G_Xc7~b%)cWpqTchj9ON&ozrdEme@0a+LVuv5hR8K~ >zMLAs1(Zu%GyN`F3L6&>tASyn&^4lLr0-74na9#^kg>M{R@LJUl44r|TSia=7D)9J; >zA0_)J$+<O0b8%e=e20)T2O)D9VIf3CK*e#PTTyUpW3arzsQfl|gU~+H_ZF&pgh6M} >zyxJO%ZpCzR1LzI9a0=)?f^*SM=5!EFjRV_O9Bo^1@Yp#*4S{H8CM?2_Ze<cg`orek >z6J&(gB-|)cfFw2IdI2QG;mvm>2Uj>x1aj2q;2$9in)V?mV9GB<<0Nrw2GS;1U%5o* >zAoQzC+C9}tD_W#)?<EyO)=&8_<obTkk~4D0NetHXwDD2!%2Iz4ui<{wUWP`o^9VUA >zNE%;`nhJ`~!5v*JNT^|59-D4c2`ov#-^K-!bELZ<R^6r>ne#&3Mv^-v#oqNbu^&fA >z-`VjsvG0YWJn~3%Bi<b%BA&5QKB4xehHprM=kN^%s8z(;5ca6US-4+V1*zH)argeH >zts&DB8k~N|2L|GbKwu$CJRg}`s0pR*plr9Ly<gU|l|4p|y%jJ-h#_!7iWtJXqqd4q >z`LxE+`rCw&tu=<$;G6gS`^LDW7b7$QtJbTHa(_WP#(lAUXp`8=c$XFZzh!`X*0T4% >z<p6gRdMbvE?+3W=Ho#><p#0Yia5s=R{<g;dnFCzhv~L*TICA`V4sfHsCU!Q!8NMd= >zO9nWit?&l&w;bQ@{q$RmZ<On=9pC1PAw-RDPhwO{*-PWwQ_$7ykBo0vBk}q$NF62P >zaKO`sN**sgusss2Ps6gT1n?w7MrcV7R^B9H5_?odkP)qM(u&*aG`5<+RRDnm_!bEn >zXloGJP50&2VJozEY=I_(ckOYZ3XgBo0|8?&P2<d+-uDh*^S2{vk0*N6kL=B?p=oCy >zo|ih?HZuh?KqdC!nUGU)uLk$lL4eXmWgOOgNq#};)%pcB$U5jqAajZ4T_x?Ihel(D 
>zeqX_~2`uN3Ucn|!%nwX<1;kW94*6Lo<l+_$e^}hjgG|t_)+X`wYOqZN6u<(_fNWN! >za`P5f&BB|XQ&1`!+t#VN+syt8X=`JiqumahE0d@o;l;Zs!H`s7*smVWu|1#mff(fB >zGX{^>Qz5or1eE}@0%qEVX#Wgm-LrVp1P0vmjZy3lhpP|j3&2GIp?l4R{ZxlUrZ4`L >zpyAD+p)hFp2}B*hF3fSNz5Iba=k&Ov0`!6oFYYjQKsFRMRR;2n9qFXgZ6>jG{_v2s >zF)tFm&UdH#I<N@K<Bx$rQ}f5V+a!7}<~btkY{KbJ*cftRE8MXed4jg{Pg7y4&V)cT >zF*7@>NT=s|cG!j8vIk(o{n)$5PyEu8ilod#wChS}y|H$Y?!}Ue5N+6IIT8tPQD}N_ >z5w1!!UB!RHC=ip2m{-H4URl)hC}S%@T|iJi8YrqMznz_PqW!4e0bdDrkB?%PBOf~k >z!*H)s_e|#cPf$plze_<cX59#{q2bKqeQ0HP-|=Al7qnfh%3tJFhCV^Y5|9Cke;tU^ >zqx+SaVaw6|X2m42{$u&Fj&$sYMw$Q!LYy!HGrh!+{@%v6wF%TH3XbA#n@?;MKu{;4 >z;dVp_A~s(9Uun~Gu*~Xf(>H!oo4yOlM}tPQX6+ek&s))I(=Th)Boi#URma2p?`_q2 >z-?3GX_~<*f>QfLujH&d*)_<{8|L!XOzfG&Ii*D6EbPfI;YSf3ax>VpGpmvRY0$P76 >zzou<Zptg-CtI;W*L8lPgHblZz`S)bfZbnE+P5d~T_z1a)(|QIxlh_!s9934enw67C >zND9WetF?2+Tq(BnH_wCN?GVQre3IPYEgreSqp{qAP7&4QX}_A?d%>19)~tB{G=4fT >zIzz*;cM8)Vr!kY(f3erskFj3zbgu!qp)2e^gDd8*A<prbU4(!dI|E=MZEr!phIsHT >zcsd2Q$4zr);hs9T<}AJM_bcl1xbdR9_d)1_1z&_lrP44E+(Im$xc8z|o95u&%VP1~ >z%O1Rd$?m<Z250BntikWP4~R{18U#xo*PLB{@i7#?&y=<M(KIAO4yh?lfskHq9xxV` >z*?ZQ04)<}FxGP2Xbjdw%!vj_%0SuShNki9yq%)9hWX6W2ux%<`9(@`0Nl#i)*@3ty >z)2zQhDPhkjrt)XRajjH1{|wMjIs>}yrMxLb;nOS+eSD^v4Ouos89G+sG<lNF8!mUt >zdA83D@fPSoD9|y`+o9`FVo6<ok!yA^p2v2NY+9wjF`g&|a=!H0_;Led*?be4{-~Kb >z^EbYHS9gThnf78E^%%(S{SRbk5u&GH!Cn_F7aDiebobJZIc9gdlZ0Hgg_8DssA1I@ >ze+1}svufyJq!lk}mS}a)OonvS{mln)jvzDkj>b#Exr?xMNVSU)x)8!AQV5;E3xPBA >z@F5DJGy9BJpM6FY>pqkfUT7d_7?SPt!^IeKm~bi-m{=EJ3Po}cMHfdZm4vUq2@*Eb >z`GcYB$qmWXJyWlUKbHNf4|JCa?}5QqG@z_DwE6yu^Bi^|72%5Sd}=UzLGU{$uuwAO >zfz(s#8nx*?i?E|H%wB2}PQnVKG6Z2lmQ^ob--{0YHs+QH&#H@a@@Q*xq~-LJ2HY~v >zdT5+oC{O1@Xk<c=kz7X6FohynzndIz2&RX;0F0R!dqmga=v0dZ62wF81%kthg{e2t >zC_gu4AReSDPSF?Xo?VbKxR7iNQ8vSw&2X#p0gC&fLPT48Bf90jlpSOMD;2_%^WCUL >zvYU`TdOs0b=ys!wtwQ4zv;krto+>mU(O?s^^tw0<3Tx|-YAXul_ec$Htb)@BcTkum >z9tr&^EN#LJcovHW@t{DM&}DQvVz$|`r4XAl>J;IxK-wcfY7{+Qi_;s<$Z@x}@NX44 >z^@mKHa2=fSRtm!N&rHGDDRalc6ZkZNhH$|{N~Jk#H?j#~Am}Q(jV~ho{#nnHjc((g 
>zm_d9<uYzR~{8R7(o>ieI3yEJ6e2~mV!Y|o2vWD{OyT&y`wk847s1IhX2|hx&xJ@9; >z#CL!mSO2WGQUn!`Br^6|Jn8omg5{5hMdeaD_aPxR;RncxSWu;csa7hz6{00h+<@20 >z@NR`wC|gRtcaW$Fd&icBhf<CfZ=;-|Mv}xlhrK$w3=mgp6=(`VLZuOnl{u>L2n}Rn >z-(gjHfXtxtQ>bOZ-nM>U4YJ~V4Z#YZPNtMQ-(@N3fWaw{Hhd5fYVjZ=S`NH%q5M{L >z$^+083o6Phn7RCd!rjk1D9<}Qlp-Rc&2phJqFzJ#1W}I(gbQJ@LV*#^`h64<dl$fY >zmJ)FtOQA~m0jw9dxDteNuxTz9jw6+5v+|gXAsH7#ll`*{(^1df*zi!&2tUE$et0N7 >zU(N=3vLkxZHyp^?@KAaeIXu(|mELU-?xK*{5I}u_%7@y&@Gzo6FQ+aSWv>u6MA-uo >z50eTjMXQ!A%&ZW~Wi!HKW39$66rzCX`E>Dx*J=F7*O2uPp3}u`Si#8{@bC^02KgaP >zGbhgi>voIUk4qKDc@>JiAKtpgxme_ete8#=tJsws`W9tGvHb`>lvucld_3Y~9CG@v >zW{14e`F9F@28O0lFf`Q|0t3cFzdch7(B_{dAgP>p4V^`7e;)enolpWpcR=L_90gjT >zaEN8^NMPh{SUNYnZQ{0(FYHebrQlJJ$cBLx>~d5onrL_cg%Z+Z5Pw+=E5K~Fk@fbY >zQ1JX6)WFaaP;YAqvlRMgksU-iZ=;B;!l~P+SQ#RCG|7O_1`v+iMo_{&4GT0oh7#|a >z;%8qyKa`AnPAZG=svdNO!XMxtDcepa)J42=8j>!w_99ejKVD0z(CmW2WEGaNcxjY~ >zcP=BWZG?r(`TWJm#eDuu<Whd$b$l%Uk8Ps}_@S&6ledIpNP|I=?FnEFPKP~Zy+I$| >z$kPT?s5nd?;x>Os8CLP}VCyh?yJq(Heu2g^V~p>buqxs(#;e9eD>)><VQ|EU20qB~ >zGkyr<p;U|c_{FXG24c7GV_6m@!Z|lnh7MkU3wzQL;(#b#RWlbIFRo7f95u-g#7i!r >zxiiQ3QW0Am#BBXutY3yNCg)UNI!CbNkHf>~H~C^-q;aHAE}s;GoX#it1duM_SR5t{ >zp-B<PdDbv%?Wm&=W-A<5O69`GH%-Uw4oV~jLLHMA&qxSq#kWn&o>ylTc7m25F&IP1 >zD$+)Xz$(EbBTdtn`1DPy6=U$0SKNNZ7(b^B&G*mRgaQGP)#tuMbYDX5WBJVHfgmg1 >zmFcp9c;o<=FZkh`;=^26$wSxNU)1-}gWp`wD!)p}<3LtX1))IRFQE4KcJMeVM|G#c >z4JYn>2xD;V3Jvj7$$b>L4~ud)dU33c02tOksP?B&D^~5s$*Wa(=oYaB9FST-8|uXQ >z6t#dia07C2`>Qac1+WNJBy05Y6rmyf88~7?D1!4C-*`#_wFO!rptg9)qm<H5q?D!- >zno-W@IJ(%=6V)v4z0~t1*?5RgV2_viW3d>EZXYxZUZs6h`O!dhR46LCGo;%%7GrdD >z6WOv&EC6aF589<RLRNAck(&rYT7s8m{i~H@aFh~P)v6f7`8joHf!s`P1Y;QCOdy;k >zzUvvz5{8o&8b*$*m3aM>nne}gpVoz((ywIAB-%_fl8s#~6uvA)dUy=3G}e+N1m_8= >zp9wgnT<oM3$u@EzVkSME&7z`Nc%A%F`nMfql}kdbc9PvTntW(pu?oK>vp5D5QeXf% >zkspjgC58T3WaNAv|EvzC874e74sR8cfsK9K3?$_7ZVKa5Z2pWShma;XPm!w<R7Y&W >ztqwTR%kjQYuu)+b#>rV>DQaW4-rhzI?EFyj3YLfs6>Q!O87a8eC^%e^9YVn%5XC}> >zWmnUj=sOj>UF}i_M!p3UvrQOlV`KM5A}Cp?P1Jg%<fl<s6eIY{c_9=RDC7JusdMM4 
>zj@yJCm2wK(LzCn<39l!Fw-J6pt|+NkUKQvUQZMwk_TnArNFt6!mE=;K?JedFBF9-^ >zC?);YG7LZ{OyQ3pXOL#Cb0n3QuM3zh%<J}#fJQVdmeFhAhW*p*IPslvFvqPq!>(%5 >zoDQ-->t)mlm~rXmLnlZV?5ze0dU$RY7@T&myGgf!ct1#n;BJyU|2G&iR5<t5ZJ?w1 >zJl%#b@o9&EU@ikZyt{Q9o`cpetn^N9Z-*`hg!@hY9>}()5g?qh@ul}o+!?^+V|bVc >zU;+FbPzQL9h&Hez`0-)PMNE{5Y`cMMiQH@d1r*tKBiVvv=NMO}Y`dOpX&wlzl&xsd >z`Qm)Cx*buhgNC_;tT2JH<DNsVxX>J!3}j-Sr@)jDF<l1}gn>wINiZcvOmVE&Gcui@ >z@<=htoSv$u+X*W{oJp)>Rgm#^K@l)8cf7!UK}DzeO5Zf09P3@@P1j(&8$b-fxGf6{ >z3^)+K3|`0HB3}Cl)nj28C*wGo*LJ@%dX+IW1VWeTN?^nhE_jqfnx0twf-e`bGlDs0 >z$515KDnMciC(86F+pETMiSTSqv#bc8^NnncIZ<Hd7NCakQ1tlYSSbDNdhk`mkh3V{ >z_;IXSFm9N9dhu9b2*N{y!pOp_S<Z-AP|Rb5{$<OjQh~(AGp|yC6iPn)JqoWGEjiV{ >zzZ0VB-BU?k{bW*<p?-qtEY?LSE|u^qG>b#SG0%#@V?X`=$B-@;S5z84h|EW1v9R^X >z`0K>AEiM&oAz<yDNM}ZN4TV%kjMnlKx;w(iMo14!T8Bt4hGgx=4ubEjZo_6c(vg1; >z=JOyJrVBbk;1dv8Z*c#RUxlZ`cyKU+oQr`_D_lvTGkx5e#QI<ae>}Ua*E5mdKf|oR >zQztg1za<tn?$1Ia__AJq%lYgJcMXbi4aH_(X!sltG5IYWV4=|9IkAGCbz4+EmX?iP >zBwDUw8c+SRo@P~d8x}rPNk}THPG5v|c^pvDieNm80YPAoNbS=I^!O2tp58PU2G5US >z0#a62aH%>pm`Bnf`9l%L{l9%08j`Z7t~td999DirN7AlQt{$>>x6|TR$b1vQl|q2v >zLI}RQ@D>Y=&rP!mpRc0oE}dH>d`Y_27)lhSFK~kax?n+1U_dV=tYJksENn}WrT}3h >z>4?zhb2F`cDMaIypAkb{Xe#7$XTok8Vw+MZ1ev{F7$)av2A|kE_h)LVSQ2y8U&<Rj >z3`p5CNZI4W>_J2%4g9+7g0t&B#_j|ww7-k1@~&LH!n>RwN$J3<931FI?>P{=Up8fr >z@x}aUj5!qZCnIyf4x?tqK@gw)eW%G$ZI-Ro@be}7oEGLH*H#-_seE`XwBb?|FQ%Xq >zr>RMg=7-XVjU@xEN(#5pIz>c*l!w0(id^Af-YOP$B80BFDdQF1ezZB0iCqigEB5c3 >z+aipZfXZlk6vh@o$Mg@A9&2$c$|S3;b#6*Bt%`&AT7_RBB>x(OgeS^5RE;Yc_G)Vo >zHwB7>QK$yt)eKy&T#}*=JVfEF-}e+$Og_Rp@QFDHZG!c+H(;dUL^kh>HcM{;Kf~;B >z0gLZoBai%*`lhr`hTXoFoih<j_vsgs(Mlv!AR+4z7^QJbKH?lG1s`>vNp$MjL_Yb0 >zJ4sW$`!ATduow%J8Zh?e8gjA!#<w7x!(F)s)y`>SFyyKHdy@Eb*~i@p{GP&{G^B$! 
>zb+G~u(&QS_kB8o7z1rlT)r8A-oOdd}(7?C&_w0fg28P^y8rxP#1$m?-770h8L*=4> >z#kJ-dslB7EVD!i}@<^lPDNu@yM#&rHOT0D-_dss4f43UZ<K3QKxRfSQQhnF>FU)LE >z<%bgS^$l)C<3zZvxmrxR92W6o;IUL6((nuQ=|{r3dh}1-3r)9VWAFONU8e%KYnd#| >zsJ%p5rDL07gV;gj^*q@QSc8n>;lWqdG8qsqMf#gX!jenzSY6@5CRX-7Ew)C-f<4WD >zG20y<wB~m0Rndb^crt5OCE^dMbNqYN{BVqT4PS<as-^_*fGswe8~RJgid=|2b;tP6 >z#Wj$)0=CjSr*6=}!XwSAj8(5k-!KT6ppTQqFk1SB`aw^zh(UQ3`H+!jRqQ#XR2>eg >z=j}P6N^jkBLYdwgRR8eBekdx{K_mtvGs+&N+nxR%1$a&NF^@DOP0BvTHsZ(PMvGE+ >zKRrhf8VNhec`z_(Qpej<8inqs<cyw)VntgP<uBmn1ubNk`NO$u@TO=pqFl=^%sWtD >z;Dcu^8|_PBNn%=95hpY_(j*J!W{}*Hm|GHdPvmx~>SQ|DFefxc`C_lChw1nhayT6H >zQ{9KgchlJt=NkmN!-rFt<h7&ypN%w|Lj80v=+nP2oBS8FYx?;^TL12|xbCjWKHd1b >z|MT47Jn$9ZC}_88PmeNaXH^;tN~KYe7u2r9@gd$5P5;}T50p?&s)O1etM>2(Nh<Xt >ztSWFjYyX}T{cuYQ7(uluXgCy9H{tnFQq{Y0jZU>!wMS)1R>e?LLznb78pFQp<SsVR >zZU((T2k`f?p_qEoi1jOa^#d_O4+buFYj-gN*g412EQ@!=z?IO8<>UdH8c`p@t0-@b >zPnfMr*6%CE(C6HYxP;wwI!sqtymgSNPpfuFpLbX2UckGpHFRzbmlE9z3l)VznPkKb >zwOp)d7fQyd&AF<X8Q#G&)4c<6%v>bUnI3r?1#c?IXcLXTd*BUXR_v0ordTkJqKi7u >zrx5uH0n4;qKx0bC06QYGIo*SF1tu_nY_V2Q%}puh4`TER;_j{CzQ9n{F$d74-xq#N >zDChXr$M~U}$h?Kj=V5MdML;e!s6ysYB%H_Ei0DG*&!s9Y6nZ}nhqqBEunv~0_M24T >z{#bqhxnNxU{DI)@sy!c|nWLuIKlZ*T>g_DGmAV&pL+&QRpiCDB$4@O0UZJHe-ko(9 >z;6+);p%^2%&epvE`D?-u$vDxaD&YrEdJBcSV2Aj(k%>~!8bhJepcc~1!)%K!rJt_f >z_aw}0wbP!30f^&78m<$~1HwCeou$HtWUNyuadOA13YcT%hLoavHa2D<*0VUhC{^M( >zV__`LE^@JTEz0OZ!vcwZC=|1c;;jnUG_Ej^w1gB#el*O_d@?KZV3DwX?*j;~NEm>9 >zIeOk6I}i&8RzdcVeAwr_V;0e)!o?9pYO9gD?F%r8`Dc-eN^gxJ&CEeLTnqmp-yS|e >zBL_(rwh4}TtVn$~h%Zs!HBeY|T{zs;q(<5l-WZ_(hrhy1Sot?GD<KnlzKK>VptJEW >zg(M`wKKmiE_vwUq^6)5mxCj)RprcqsWO=|M!9>{jf`B^ZQpy&axpq*IwEq-}?pZsH >zn7U_gS9i4-y}5*}2sdE&`S9&Bx->kx_y)YSL3p^Bf?@JjIpE)3j%fmlcY2q_02+ox >z6`Igx=@osYbhVhgBfDA^zSu6yR|?<Yu2v<}D&>WgxMeL4yF`zc8E}>c4<Jl^$&|=M >zXAIUn&c{(1&Uz@q^B+-CwO~SrTR(DpiQJwfw?`$nR=u~<|6#v;?|^*ouzc?roUD7( >zLKE#M8n#iYWyJxql8o&%wDXYK&F_aPv`)ebHSBl|VWMZOa89OZ+9;Ys6wP!onozME >zhan$FEr=sIG@1nodK)q*?G#)l|Be{$^~@z?+;x&dJ%q&HPNANrPz4J0L$D<vV;J>V 
>z1Hf_t?|<OJPe@^Ej++DND?MVv0M{4YhBKM4q2v7I{HFtjk0HQBYBjVZO6GW9lDD6@ >zC{|{*by@UEZ|wS4n~=|KtPM|7`mRNV3J2IOb&p-xjA3=L6}QO@E^H_Z55kB`UETZ1 >z3sU+zrF772q_l_SUree0lYR@m0~UG{7J8Fug*Q5sW()f?Vd6>Agl0<mP!@j}#0YFd >z*n?rtA>+;@1O~xDiB5V*+<83(Jqn6&4BS&ug@FE~7i+qg@Nx-n0^vO);zd=!aH@b& >zp;Q)ymO6&0gN8+~jkeiUfIwI!T#H%NxrD;CQMk8@;fkf{-?w|m4va`>n8Ir8-k@^} >z8^<7ltlKJu)zH%swC-73tm-hklCXs~FoZkcz!w+;n`k>iC?ZoyN?;3Zc?8Oi=$^et >zExu04XJL_P*3!XEcWWlOTHeB=(|Bk0W?TS163m4tl;6-kd%$uK5B=ik0<7)_1Giu; >zd5|wi#(UXhtk=U>6}UTzFF1&`>|5Y@AI=bt1a39}fwqEhMYiA|C|5kFz{0tCJRDk0 >z$;lxT_Q?^LF>vz~hMN+wP6IBT^W+$r6W-;<E1C!Qn=FL7f6iP*DH;!O{CkQDg|mxT >zjY$nl;f?2qvNkqD>8r8{RWvc<_`hIEcqOQxDdBnuOPdmQ>zNIb4;vgZ!A4CslMqkS >zs=2f~UIO~A2yZtcDTOBN`?b>niWN6{()>b<)fJE9$<Nr>J*CU?mcoly01TWj(P_Z| >z>8zpptf2b*(FvA1en1`uYg}>xlVXR43;UAL)F~5|gZ|wp<h3Yda$@9uNH2_7BAWG_ >zU_`^iwTHk^3PHHnh`QWFm5diNA)yb+!5g_qY@p--UV*^+{EhRU4-{<JYXav^<?`+9 >zc$?Tq_}Bb<)P*Ib_9C0LIK0y6;=(I?!z+_fF5#7i@JfAnr8c}$6JDtfPZHKKa%n+? >z7=7Kp2i)^MjO_3oaP;yo@%3uq+%-tdLr4p4`o;)V=*>;{pzvaX<|1rdm`z~4rz1zI >zF6s;(ZP=3%_87w+jTNl;!Y~#~W`yqb?|TYohFApPjpV1W<$}$Vr;&T?mjnx*3gCg- >z`D*2kdtkAP6k1`tDg2`hX%Mfm`}ZB)(IIvhbX#Fw5%msK$xf`?oNK5`JX9qYj<G5c >zYXb8Ru7sg1SPI!}F4sd6>fwGIKam;rK=}l3Q|0mi$`(nMB4q>w%k_jV^Khj%1;YLL >z8>p^=Hq7~X;|s&}X;v(I*pQ|VFsJ?Ixm?pd7S20|_sqnuUZV3NrIDvJ`YDZeF^yfh >zYNdC0SFTp+HGpy$Zw)u3m4-bgVW1e=ezS#(oXdfhu?Na}0*z+_zD}WR6mq!(X~x{0 >z=o}l1;2)>p@1x**-e*N1OaqcwfkFmphXl5e2i}ZE)myoozY#S{shdjP|481SB=0*# >z@7bO-44~7<?G18!hTI+z-H7p|05iQ+kNYOjRq+PMJucMS1a`KMSL~A)N9F=u*asor >zu2|s_oakWm<;!~b#<RSy6V1$mv8I<V(()@f{-l*ISw?D`GE8?i&1u;dEmATD%QEL_ >zO3B|TB}1f?*v4m<XyAM;IU8<*a}PP6J<8%@)sW~+CCAC+I0B0Bj_64Dd2nmSJd;Fk >zl(3Gd>6)Wy(rwHm16(`1a<F`3cf^DFMdFS(Y6UM+8~8X?K~f;!go;!1vr(W=WP0Nm >zSyM_>&=Y#EvCFJgc!y)cr#ju5Pg{s*iCRJ#v_B6G2pNTL!wax^B2%~F8TuN%*L9hb >z^vKT-D9N``lBZLWQ^X`oA%sj%=uvSG5{`#(mJtpo;!pr^{3$s;NRD@t<A-fx;bY4l >z-1I$YX9i&@);pL?LET=D`&8hHCfl&DN0=<G1!y0$peMVa(<{FKW~aBUC_78QT}=5n >zi&P^*<m1=S@iuj~&4AQDLy^Bokv~k4SBa6cNKe`1NQcSMwP5x`V2Do2ANm9~dQk)Z 
>z)z}zKbRMMO-lgF7Q*en=UNG;|?4$|1F#z>YvJ5!eh?KHEv=}r8v;V_<7DFfxB|OZI >z4MkiQG6_F}Z3J5zO3%KBO1_68>!!#~P-MS*k9n0tst`kBlJZhegDUT+%X=^iaKQ!g >zq^FVKk~YmAV8eA`+O4HY7?L_5#Y~e<yTB-G%=L>UsZlM|G00-35(lL)=$<v;rs`_C >zu<Adha3}jusdPWMfI|1$@UACqhP1y<ia_<i{PJ^^>g`sHRaLDygB6?&v%^jrTAyR% >zYeCO*<R<+57TP8*yA1<wRjX+*Lii1EgbTyMVT@WtG<1gad}$iGnJ*^$_Dru90}wdY >z`AHZ&{sAV<e&Dmt&P2iO0tMn6ZbHFbq=H-dE-N@b0muPJa{U9?Ocs}hOOqFe=CK6! >z@&@Eh&8sm=_h1H5;Y8gkJlUU3lPIK+amQ$c1{yih*-l|sQrJ0AgrQQ{$3Y<Vhs-D4 >z##b*!cDu2@kp}!9!V(QUvg9GTupZHlSF!0+B*ieyW40~>+lwYN4x;#Jy$7*_*P-h+ >zXfZ5n!VPNS3wEu{faH){oWI3DEHZCK&Jx6X3N&v)`~y{hmR}79TW$J0HLGzgf3p@3 >z*8^ewG$O3oIpJ^+h(h8njLsz(d(^SsdNx}LuYbi-jux<fI?egMh5R+hYaV1+;&i1q >zT(1php%BAv8h$aW3&nEkd?^z1&f6H`v6Db2$9i#I6B5wjp@pruY<rjudRpDHx9agA >z--b<OJcCCNr=lzR_bbulrUptnpu;^#|2Z7aWdkFe5?po;kKSG|ai`AXiCTQ`mds>F >z1)yJliW2ROX&*;HYNrMYKJxEZv%7j93A|VXco)jwC~&$BvB0?+frt0pSnfX^=#FQl >zMqxG#4g2&&ZeVq(!fQZzY-isf<R_vG#rX}=Q3naxWG?BIraYRjw1K=d*4>1Fbv#Nf >zurf2?JHg{Qrc5zB!H$z;G#9KRw8Da3u>wc}3|(Tj;)+X$%9j}SoeldwVmZN}BM4SD >zsTqV<6M)aw&tLHa7>PPv>6^p+%w!d&7IrK|l8(TzSSCVyB_gy}Btk1uhaf^rT`@Ss >zs$&(J#3&-xV_DyYbgVM;zO%w7pX0vtYv`YmlF@DWD;Y_G9U=h-=}{VGQ2TK~yt0;u >zO;~N9Fo{=B-Cm4-fB9%|_BpZ@c(f224Ztb@kytZWEmk8;oYi68m}7wVjk_Bmgp^t< >zk>e0milam<3iMdVFGA<Y1}c^kHz%RO;et!4$n%t`?GK^WcibuWXC?t>2leMI)SoSX >zr_v_UfF6b<K{u>{Mvt9GcbKp`B%+{^Bn)BuR50T*047-_W{700g`EGO@Lr_wzI=;? 
>z=f5Y7rjL>27@q^j&&ctJ=qQqiq#7__;=!5Z?0lnV1S5&k$VU<R7z{=Djp(Br96=zs >z27~7V3o9cL!71=Lc22Tg_iR3BK~GvKEkcEZog!hW!CX&-<~W?GP(;a`FcC8pY3UGA >z7%bU?08NDbWi&nQ)6gMW5|KfTnotHtCdhTc3f!yYIetjj98iu}YNUkmWE=J#d|C{j >z7{C!}a#j~MYqQG?o>x(ZZ2a@@6If_ivuV3btkC{Uxm-`VTuiy#eON3slJ3DO{8l4Y >zNN7N}q+pm=XCJ#O88NZd4SFEa{K7KUl@09{6b#%Gou5*WTPes*6yypqNd6dt0l7F> >zMe<=fK}gYEX>U1Z4#wwy3rJxFDk^CJd_PCNe@(t8ioV(WIuQ@0;)vopmV#tFHja+S >zmvs8~YKz1HL3o{|#n`?Rq*eM6zok<f`W2d|?%5nYgyUdD1Cd5ju_n_60Y8dKiu5Lm >zw2dO&OOft6#7cohY8NA2C`Af3R38Nq73nQvq?8@qhMN$xoIV=HF;a9&iOZA{mn<c& >zN84TxDxalq^#oO;Hqh9?Pi1Y-LFqX~={ZU187HPklBv*@8>7Y=J{S2BKy6YgQEndc >zGHedKoF*^bIB2EbWD&Y+joC*U`lTNcMPl34DhvrA=9qkjE)xq2a?lF|l55&1C>lWi >zp@x%f;i+IvWlhNt!iY<DD07@(6$wwVsVNykjX&39H+Fb_E0+#sxtC4lG(Zcp7@m>U >zbebH*WyK<{5se{@zg5%Ti!zGt@wifn6#(5m=rYHsk<HgAo3kmKMks<_%BC5@%~&zK >zPVP<QZXx&1gJLn*(B^vi+cf+=MqHgqbiPe!TL|rkg!ZzC#>8N2ATMA=ZvYq7Enp4p >z?v^Rw-iDD()JmfBIC*}NJpYV5Z<gY)^mKnd)}qgD)O!-0CkUsFaP||9Q^G0f(fx6V >z>I;g<iz$aT321?;2USukJ(rOjN=(m^ov?C%Lk^pWl0k_3Ej<v2?K_4tTHLp%$;9yU >z*L0cH)Dni^?8DhhQJ<u!KmD6n1ZJ)Ogw@|;1?jRsl;^(((<kX>vrNb1`n~FH{ylyK >zMzk`0x~C`owqE}|5FMWAoKE4TnGv1}itvaW-q}FEzE|)3`#zEg+a40|#t55b*eM}# >z;Bz1e4ShyCmXV}gIclTVhbgPeq#lIiuA<;?r{E`2@Ux}h(MaQfO3Rovgb^SN2Vo>j >z7(`@%g6_&;72ilmhY4v5A$7kYwSN(b&oQzzV9JV7cxTXTra=5k%wA$dEDWB)m^2q* >zyhmaDIXa9vDed=xP@^WWZY-Po&>oEx5X9=9P~k+w7Z36gtQbt>D-I_sfKCJ@C`1iL >z;`Twb-ZRJOy<>S5^*iM39g1)%MK}$LFh`1T2=CSed}kr=Gz*^>;SW>%S)^C5+rX0n >zcUl<+r~s6a(z}{DYmMG|v;mFR1uineXPN4zIDSNNR8btqUS|aZJT+zBpt3=NUMB|m >zd=L$wOgl-=L?=t93R^(HGkMh*N!6uvtqh}?AYMhl=+nt#{H1s>hQ#t~wS9xp7|1Dd >zzCdw4NO8KvICXzqYs_x!^~8^D1S^ZS{Uka&xY}@U)uZm-Pv;Iuk;R}ggS|{ZiYR45 >z_kOx%fif%T=*nS*Rz!h+NP)jZfe#P^2a63Z_<~gJ-Vd=c7>PGRC0?}-DmRq%67m*n >zfwxoS?d{jZ20|Vv2DTRKAW{(d7rZKd6@<XR=s%eUVH*DPCLc;cz#bx!ff66b0!E3C >z?%q$kuC%1Y(@R7SE@!=!jm`xEQN6e&t=srKkPs7ciAN?Z9aZ#lxfd6gJ9N+9tyeX6 >zuwm&wN~50AH~@+;SxO_`n>NxGE{NIegKp^%q`%0j#jHEftkvFOY@)&NjS(2jA3=KF >z#Sj`UI?zbdiz;MppE5!>SxOSFP3yaIj0!KdQ1wl#jJVJNc_+M*H}&v6D24DlJf6~4 
>z`Om9-cl$rd^IyzEajf`AEDl6HEmHinOjr%_WCUz5-4BQCy$;%h^nD}*arG2)QfL|( >zZX5?f7xWA`qgFwd;0RU{Hz$F~x&vFdgmJrs-k%KJV}XuiPz)ij`diLsV_E#e-I!h2 >zr7U@;j(Xb#@#PE<JlKh-T0bM4>Kg>2i5Ad$KF+_Vhu?#yMHW3SjWZF&40+=-S;9cM >z(qjwos(aQfJ`xBSp6ksd!;`6VnVQGc1x&RvwV0{5Fm(x2?_g>rQ=Lq$VQM{7y-Z!j >z)HO_PV(JE_Zer?Yrfy;C15AC0sSh*tQKmk|)F+twBvYSa>eEc!!PMuOx|^x{n0kPz >zuQBx?Q{Q6h5vCqx>ItTP#8iQ)9Zc<H>RG1tFtwMd+z+Uf)J)YdRm)U8Qw>Z_Vrnu| >zjZAeiwT7wnO!YE#6;szRwTY=4n7WCno0+<WsShyqA*Mde)JLJd_F6b~2Qxj-)ZNSu >z9q{|#|Edz$vcQD=rYwRQqwzphrz|72GYB>A>`qd(dr3|EE2$m(p(>t#gVZMvLscF4 >zJ5=S|HqvXs=*lUkeF)W%CO{q9(+-t;^c1N{Sx^;cETo@FF>?D@jAtn>mGy0?$~7#U >z%~0j>e@>mdro6V&XiiTzW*E(N^{d_0%bi~1r0LVA8|T;7yQ@78Si9@%+~wZtx>^Lw >zaeB!2kl<l_BfmrwH}@G6R}Y}y%<FPEx|&575~jRk;x->KamnwRxToGTanBpc-xRp? >z0ycw>_>v#nM3HS$=2y-zy1n)FHRY=(RE%#hPMUC|adKwn<cXQnCr-*VPRg7ydCClE >z%Dp#CnPjxAsyC*{e&sLLIqcW_+tY^%mKksQZeRV^;OlR?>vUmj&;IV&@>h;e{-)jX >zANOPz{A0HK74Z@!fBkNf@xJR<4)6c3?{|&=d;IT9;D2HXlz7UQJ7ySDJ;vKI8p@y) >zlTvT2ch^-_*Eq;fWt4TxjLRI$t80zk>J`kzVRT@qluaIEO<hHK4Ms+9nX%I0@mANe >zA#x(w+&0ncttpG+?s8Xq9o%SR>ZA!-Rfsk%HLEfkFxKdum3m{1k1Hy+8ljl2oUtBO >zKBLFuu9%fNt!9-mwZiL}l{&o!c8-;^Qg85aaa?N6DsF-lnizuQjZkZR;{ir@{jAi< >zmBwYWQY(#>vr?<#I5!7vY9(haCc8Sf5resdwACJCb!|g=O?4%nOIbF<F{5fm{fuJI >zj0Pv?UdE+*W?&9r&PHdQ$BT$PaU9Dv8M);SFMXG-t}S2TNE^#Va2@594tE^qsjIlt >z;pJ?LmKIq`Y?hq4bBnlT)wPveMO|&Jqr%J0w=OLxSx`LRoMS6qT3oz<<E|Yo4fLoX >zZqkHIE*0Cv-g38hhLJ^XtMj=l#0pGHtsEQ2Rk;zFaampEYL3klD||InL2hZHm^1-@ >zY&M&S`C`%(<D?sA+%R><4U>(|<(|u?irV^&+2nVEw_&Nz<ESvs#MHumoO#9)^AfZX >zS4OSlE?;GR%@QQjTUS@(S<)9FM*&pVdL3>Q*}oi!_rID7%0ivbi@f}c*<j^ZTj#B= >zlBe6tO7R9`=43YIW?fQ>G{JftHK^FMu``TzTaLx;uB)v>Z>x;s7S<V^AT(+mZetZ1 >zU=_8;IM7Bkw^JjUTwVVSRhD(*jLa*ljJ$a!h^;SrEK<Ng6RUNRZT>9_7B4j~Dk`#= >zi@A&p)}rg`iI6egk?Ogg+JE_qW!1}lbw0FgYVWdZMh3Z8);T=2*L#iSH8pj28Ov*p >z<?iLa6^>f(rO2rj-vt@rx*aQh)our3zdT4^t=myix4gFcZU=g*lvI`&G0LmDv-nY( >zl!5A+8prbT8rg3~2F8&~QhT+yjS?D1b%>f{UCvie^=Pj1)l?eM9E=|7helt$(Oz9^ >zL)XTTL8z>gi<OM6XZ#9%4)<!S!;N}gQC?f&$fXz^wH2!=IBEx=Ff)+(I%7qhkGcy2 
>zy4(+9C6{mTA=daZ(66<XD^_q$cV)GE>9PqGGmH!B$}1OCFLRf>R~tvqGOk!rSyO`% >zLPRuHm1CT*<YY5s_DC)!+iK4|hqu77B4y=DclD(X$b&<2sc*Q{smMXCqaxy3Ew@E5 >z)JA^$CY!jUfU^JtQR*B4TB;r&QrH(K7)CPp{irR^P0ooxpNptFit?U`%2{3Sn9WsJ >zqSMze#9+*=T18e9D(dSwZ~df8jgv1mPP)3M$yc+UZ(EA@iA&6B%gQ|tLMSNr+&MNn >zya)znfQGAL$PVV3a*t<%x4t5`rW|AObw+dU(!3no(m6Rci!D0Hv@~C>r+RrUI>geY >zRpoBuxUmi1@?|v+tJgjDwv0(-##ytBnbAHXCE~5CFJ4^_JZcD;#_O&#M$#8?x@sc5 >z%e`OgQFgkjN6g-mM@p1&&6>U#BTiTKxO8c?$A+qdZ^!((X%Xb{#z<PmUXjs=QzU<v >z1ry2THwF`lEV>ZCd$E^SR=T4@FqgBjv(n}Bc#RcQ-9|!j;G0%kXQVbHSz|;+XRSh~ >zuN=#b9`mnx91|V)<sAj)1#C5b6$e;RKf*GZ+jLD1OtcmJ-}m=0lD2f|QoI~q=~(7l >zo>y0QXC$$M3rrFH<r+DUHEiuI<`DyZ#hu1Q4H&I4!;R*gr6x9pU+z~l_ouVxtNh|% >zPb1akKPBwH#>?fX-<!WLf$vM;`x5vslmG;Eaoimki6kK7|Ja{N+&|@pD3$7%SdB(w >z7&>IgKxIsVa@5%IY4l^Cn3%DNi36j5y7AX4$Lq!cVaSMNr7}jROiUX$WSoR@?ZB9r >z#A{QBz)Jp&A1CS)VvNcm$?DjFl7kZ8>ynaUhs4O3Lx4CbVt@uNz=CZbis*c8QZ!mb >zPeHO@U+m0i95=)$x%}&&EyV>^ZbC{5+xwlHmcuOoR07rlf`Dy+KLc6-M*yb){l?~S >zBLL$7vjO=4JD>`%8SpURcYx;ruK|t$Isxi&Ib0$j8&Cu&2doC%3wRjtd%&}RmjG`A >zP62uWhIB~D2FwN&0`3HC2K*ZE6ks>tZ-9e<Noc&Zgm#0tmA6)^e6<y9`C98+vCQG7 >zWwpmw;bdD@w6TLl5Z2#y?$w;N-0Nh^Hbzx(#dUQ?tj1PjUFfJ~=pGJhv=!AJ53PnP >z9ktb170tn33hjOwrQorsL0DMzI%>;}UU#|238aNs3p#4+e9N8270f-7I?-rchUL(m >zGN~7FbksIfV?Tga&k;MeMC>z`S5y#jSW#E$h@e!Kd&^n6B+@aG<w{3Gb%le=@p>IA >z>b+PTc<ZpPUQyQoqSEI^ioH>#L234Yx<*EsFO(Z{=8y_PE?s$b&*iw68*8dTwPKIQ >zi2VW1vI_e_Vrq>JsuRxauJd>@#NdEid*@Y)81Ym%tFeH_<`L!1;X%gA>tVvt3O3=4 >z^CvDsenI}$(URREow87LKDNdYs{@;m*pcFLscI-mHAtz0l|+4cg~M3msPY<FEwaqb >zm$m>r4#%AwmZy~&b+t7hugSM(wFl+ZmsU!-hnwTByA#c#zPjGQ6;Q*8lthukgB`V~ >zNLd=nYiQeTwb8K(g@lHLqNv7#45eFMNtMqnlC#2^APg(2QBTx%#P%u`2MPnLe7BF) >zU!-kgSG5K%^*--JY-Hgh6}jwLDs(h=xlJIIY-7+!Z2?DB?rP-MS6kkI#XhYRxjC#! >zN*<*hkV=$|<ZYSHvzoQ*@~Ad5AIp^}BR5y>TP#o1=vH$jwRhIm-Bl|kaE1|Ev><t< >zpMrj}q3_GARAg2&%b3V7g40hEpBO(NNvYH-6Lhd0r0lO$8kEYx_@iH}CO$rXK%7!J >zP&q_5RH@WU?#vNakGG`N`f6&%a$Fi$%lWt({K3F+2?>gX#0315T!KoS&_5D0*|Fc? 
>z1l=&5W_UtEzl0>R7(&VbQpBG=PN&mp;t~>K6AbYK6A}g`X#3o^Y@rQ|%m31l3i0o9
>zm#dlcuod!6e;c?f|1QN}uju<@Rq!p8=n#IO==)PDzTKbf-$<qWzxs!^HxctGARRCs
>zkO7zgm<Y%OOae>>tdo8dqRK0EIhUHAIiE|#e|lyCmx}-NOdFSq|Mbi}E*1avIL#mv
>zlyWXLhf7^SCMY<un8T&6CKHs!ELeq71^TI<rhi<#cECWLe$e0{hM@_GNyCPZNFF)r
>z8sq3O*QTUicYWH}ap~hTCQQtnG&yU^4L44mHho6+%vrN<GL<i@sB}~<cUHUZtXWZ8
>zSHIHj@%kF>TDAJ_+m@8wzVwc-cfV&%<Jxsi>wn<iu<_m>ZVEK>n}a|4@s^+5cmD$q
>z{`6-L{rneOAAaPQzj}1rueU$;_;3E>iQoS2_fP)e|NQZ(|NPURpZ?1;&+gc{>$&Iu
>z`ofF5_v~%i_tO5>124bw>T9pR@wbC-9y<Ki-`{@c$h+^g9ew}U@e?1M{P3fXKlz8y
>zek#=Q>1UsJp8n#@*)O}gd(NG|(2Gadx&LbZbLN=mT5{**&A&N+LBYaB*1{rNF)s1k
>zviMdp|6lL^-y;9Zmvi{<+nuPbt93**x9_aQaS*q_ytH7^LQ8Q@(PGjsEGe)Q&4(N^
>z^U~t^1s3|67cE$jQ!HwW7h5fyqjqVDjk7E)A>F(=(9Ofi)zaJ|F1H9os$=O~3pdwN
>z!m&`I1D8XLisXvGDMA)YR0^W-ql1(EF6Ab0vuLhTUS)sgqpTe6W57QE0-zmm3J?N(
>z3itwW2Jj`I3(yUqU7mCJUI2Urh@Fzd=>Q`EM!@C2YhgDIFc~ltU;!)ulmM0h?f_H*
>zoPZiYJ-`cC1!w{U06zvi2>3bRmw@en-vOQmJP&vY@CM)z;BCNrfMbA@fKLFY0G|QA
>z0CWM)1LU7V*5LctWULne^y|d;BH`g}^QVzdlq3BzxU`f8%&=t2;3jf2Fi_1%Eto;W
>z){2R#l@n=H${BA=or7a9d=_)Q~jE2yijEMLtP`y3wn+8vd(qOREKb2F=4cQv`#
>z%Dp}}(_o7fW_ToD<bwk)2Q<RrY`$9V=JHw}&OW$Y$1*q56qLIwoLmkl&Kgj6tGSzf
>z=ma<WYF2YOzUA1P<!lZRU~E0H2#W`CUC|&~%ym?V8iYzAA^<l9B?m;|P%sn*2j8Hg
>z8aQ}DQ#PY8rl6dr17-no02aW_fQ5h}Kq=rhz#RYwfE8m#N&}aY0cb$EHl&Q@G8$62
>zh73ST2K3<W$w)CaaL{CM&}1MUbjwtnnBRgkFphm-hoAJeJ9LjKO;G7~AjZVL9CuzL
>z=}Zr6C7rcJC+Sko4i<GyKYb`c)ES?DVVI~>zxeWqA5Ag+@-Ox2PyFGfN2*PK|IOdR
>z^RHR8_ps^2=JuwS=k2=xiJTuDJNL+i(AwV>np=M|@HhYG(P1BM&rQ_&*5B)EcsZ}&
>z<FZ*Vm1Xn4yzkbnKQlHfi|*K$>f^*;Q+VA&e=>0dk4b>KtMyhHXBe?f3oe7&<;MCt
>z*hotf52Hwllt**$RFv0#J50p(6v8KXUV?vDJd;WCBG<G|%wVVVMg6P`ht~n<_xU>Z
>zfkLkn?T+B|^+nlVXPAZ4*(2g~_DE*Qy;E`*BO|p**2?^88S@wBa_OXGEX-NRrIV7u
>z&F2<!xj;>i=rUk~*2?T;ofH<*9?^xRH0}<si|E8OQ5*axc)vNm?QyowpCPTz*aFyC
>z4^9=Mv?8WMOP|}3Z7i5$wi(5RAsxw9*H(M0!M=gRVvftf^3u&V2gEZ@tVYI*i#~kO
>zI2=awj?Bpzj6`J$&fuY&5m6?jW@c3aDe5;Mj>A94-RW@GI%={ePrz9!hf`83ma{qH
>zSq?`Bs5sisqW!m=WwmtbTSM#J<?iwoSdd}qCo;y!!7h2r#K~|uUm)SoGLJ25iBX`U
>zZUwe2D#i56t5RbF+vNBbm~xz#g;|e%kg7|<zB1mc;;aHc=4Rr_#BUK6rU*sa^uQ8* >zsmFmM^5qSg%$bVG(zoYf^P&smoKre~?mJdzr^mDvAO%FZj@l~`cT>a=zr9NHu^K>E >z1L6VT>t{#IgYY!~NW3bfOhB`loblVTMeb)0K^oh`fJXt30iFOn33v+dG~hYFUcf7W >zw*c=0J_39SI1Ts`a1QVlKmlnPF@XMn0f0e(p@3n4k$_ad1i%!)48UxF8IT891Xv6x >z2e<$pKm*{K<ZDLZk13S$YXI2*JPyUJ0{jf{7r<M9Gk`(JSSDa0U<H8Y$ZwcGrRkME >z-(g<<uG8bcXOb4F3gp>}VGLR2Ot?@x7f?)vbJ_3VPSa&JeMb4L@FA3U^m%B?7ysPP >zYnS;6_I|w|it+JpaF^3zoF%41gELwVdJS6l<!7L2wDCy{3E(BjT$Kslt+C><I1b@2 >zbIY6zH;HS2F-ctRKJ7C1yPW0Tn{YDCzI4#cd|ll(t@4s|WPZD_(pFT}_9gyXda9`K >zx@j1e6GSmbn=u*@KY{Gb4vihFDnRaG({};(?qru$H4$B<8~hR(vsYDGii&7|jmoLd >zxzbZ{r8BnhU@3>thmwbxDn|^#S?RqhN{)Luah+dU8GRv0B~o0l6pS0z;-X7yZG~9b >z5!WkIaJm00F+_d~ioY+`Ip9o?F~x*u%+8y8MPWyZ!iEW$Lh#@#`&-4zkyuV79o)x3 >zyTEFdQI^$#`ewVoj#V_5gL4U6pG$qUcY#R~{FI}|{1c2wL%afSsYR~kJdx|}>cL$F >zEfQu!xtq#{m{q=xRcCJTq9QpnS3{qBS#wMi8%oA(G_8vHpk3yAD~z*e8Nbdi;{qE? >zOf*%*BlJyCUB*NdoigeaX3)wrDION298iQAXjzS%RcbKA(8T)pKXb>79o@Yo<G)a1 >zmo(FyqN1F|auTA7CdcgtyHvTQMvV1aM^ov&B-0LWB{Lgm&N6~qbA@O?8TT2`REeL8 >zl|iLfEHgrHxGX4?k9k3k&6XCG%1a9HYCJ?$UK*h;Vce1;q7fovcBR)VS7zGWVsmLO >z_50GC1#3h@k);@hNEHEF_7z=)m;$aG6|O{uR}^VI^>^9#w`uE<e1A>sW2xI?8n_~9 >z-xPRMmKv_`_^leqH-#=&Csp!y$?9b_8!6YXYowPX6cs1OyDT!nj2kz{Y+mK0)6MUH >z-<QDu!zFNsnR^T{5fXIu^|TxI{qKKM30Rx3R*`-OexTxJu2*qs_{zV(L+=DU5oPxQ >z^gE)=GA?~PV1FK<@vFE5zy!cdKrWyNPzksjuo>_eU>D#3;0QnfoCoMPsJIkBCSVSr >z7*GXh0Bi#M0`N584Zuf$F96DoDlQ3-0>}iI09Jq#unO=az!QLdfVTlB0iOY4?}b0W >zIKT|R0>CYRWq^9XTEH&=zXR+58~_{x{BO%^9rg%40PI^dVb7*%3ZOp#`)*CxoM{>e >z!2V6sU;wsJo3MA&lnB5cPZO2Zf7Smfl=($~_J?BGUvAE~6fLwYz~0~71q*QFW05&u >z(&J87k)*}D5jAxlhq%~4n#|x2@j8fCgx2kIL3Q+*EX4C+#>Kd5tafbJ$&>CCTUJ%G >zeMPXzz|a5J-j#<}QEd4}@*rRgVUwU?l_fxcTDsQWfFzP=q5;F^0FOYzW+0IyEV9U& >zhX@)31)dmo5DXZ2i0mT!<^URyO+f@7Fo>W50wT=mn}qSbasHqA-rVoIx!u*>)m6Vb >z=XXw3pVNWWv-Z(Ru;Fh-C&dr#5B&^mm#zpVz=Eb<w|}%W+$VXUACwZGT6Rs0-vECx >zGTT4RKkvV<#+8-_nwbsjbA5BN)3VCX_Dciaec;-Jl&tK)B5he+fvH+Dq+0!cKh1At >z`SpnzsX6HZ2>gtUe=IG<sl4pP*u+OY{r$pfY3uwFfFv$0C9~{aW!X99H-nzNl%4tr 
>zdcUWoi;aHk%kS}DCEdzv2}A<Lr1ejM0H1`Ep#$P$%8vNmdKn&+`Zw_LD;poEw+pN= >zmtPV6==`o}!~7t!va8GXVn_P7?wFp@HzhqW5GDrA&d%|N*6{n%-cOn-+xD|-{POs$ >zu6{0J09(HxAv!T7v$W-bek7!&3{U$5$@yKzPvE>nKMAnIsO+@ljBbNc)1n8$Gb`hN >z$}#$bBabqc;8^DrNP&Men}&iwM3pZbf$i+{E`jMf?7Re$m@zafBRwrTdsuqeyt}mA >zKRwBRU;`n0r6r|@ODATfCI28{P*<rqKf)q8J*ylt1GdW`gDn^qKn<P+!at%-&w>lf >z_qq%n7?}H~K59KQqpSs82Mx>4NlAZHW7&RLtz~<Gq%6O^<sc?!JVMql^>bUxfQW^F >z!O{jpa#(7{2!EIpL26S5q;<#$NWzG$(q~J70rD&Tm-^Uw6$9(B{<^&Xjf8y!UkTqH >zVG!#Ji_~$j{crH~S8mH+caSzR4c0kfh)c&17<iCRg#Wzs`7ed#|E10jdKR)y1D^_k >z(7^CAn_XUB0gk~#>+$HH%F82G;^6qeh2Xc9RucGI$%rw!8W1W1PbMK{+E_gO+}zQ& >z9@XqsuS&I+O~Ezz|Ihyl1Ypo&V1SqEDpsiVfwkT`Y7MYkIoNsKnc{rxeCix@zHx3k >zB~G*(=MHhlxZB-U9`fS6ByW{>&pYH9#w@my9c8Ck1>Q`2D~8C~@(XFG%IXQFRh-IJ >zZ>oH?L2Xk9)P2=hKdax>-SKs7k>|-<<X!R+sYWB{P&$FWM+@i|^ebA?Y-i?~N6j#n >z$+z>v{CmDg>=Fk>sB9%Yxl$gNmt+IgNBvzrp{dT&p{OQ$8g)P~pcFI!O-7s19&`@f >zLJ!c>I1+Q*9#6oV&360~K1Xa4(?o$dDlUtMqOR;N2gna(FMN!*6l-J?Re*Qm{rDI@ >zhkwNbNhTRda>-_LfV7|;=?ioq9Zjdv<#Z$cobIEi=p}m39L~luqanyYV7OUbR+V%s >z-Cif?etL)gR$tOd)_v=7yPln7kGEr-G0p;KpL5(<>YR4Qx=UT;j&v8e=iM&uV0VXm >zz&+xgb}M^Ncy+x7UQ3U7(u?(;_qut#Jb18U^ya_v$7O`flH=t2a;L1Us;g&Ik-DkE >zbzA+Ko~9S+gZhO2NgGy})!cg4I%8fiZ<)cY9y3`q>%;o9S6D7v&OTxr*g<xQU1!Ct >zGOxj-crJgRui)GGK7NHqi9d@x@rh_FZNU11<SJUFsbT7AltMO<{$x8zra>qK)k2L> >zOVk;?hw{-<RDhZTdIxY3{sGWCPOg(+TAkLVCbeiReUZ+f3+M{^8QnvV(C_FKdY3+= >zRm`W&XH04;bCUU{xxy?k8_9)opNv$uRXzQ-u5E=fhjnKY*bH`@89+cmfOA`}`Lld5 >zpT$q|bNm6XBx;Co5h+YziE-j%5iA`UEuWV!$xOLP-jT!9>uMdy-*@UqRic{f)|zQY >zpV8O!J&i1B#aex>!PYeE9c!MIZ|wmMsAAW&+t|wPYky>?JKs82o!3Ac3>SC|%|q+Z >zcC;TA;$`?#{5`&dgGhCfNP3Z1L7wN5#iW2tBeO|9`H-vwtd5f(Nfr7OZARMy4&7*P >zkfSy5Puh>s6ZAU0Lo1t(8E+<=-<c21nk<S9Ws_JQdynO_b?gZHntji%z$h8~NnW3~ >z;6^*cSP#Te<X~61e{u8NOKvR4Pk(o~8|+p0p7NT3cGd^!F+IoY<ZbfqdY(DLtjgD$ >zo6I8flv!-vHSe=5K9bMl`TR4!n_uU@a1PitlFj5Ec~JILLtwP?)e^NyZC7Eswl=lW >z2duBHv(`iFXX{t%Pj;wX!){<l*~D&V=h&m|N%l;8iM`TZXK%1~+27cg?FY8w#5p}c >z-i@R{%9BwCwE}5zQ4cf%O+&w-$8kqI1<%2|aVOH+>}rlOXPSN38z7&P_*9U^bG(s= 
>z7CXgWk)S*26?&U~#hPg4T87=lPP9|(jdrG&1A1Zffq*Ia>7sh5UMdA-brAI6Rh6Sg >zsj+H;nxfuNGt^A=u9~MlP>a+uwNkBCu`)*rwahl0K~5#7CyZ?->Ij^427Qkj!OJAF >z@!Pl+QKUWji9A64aVU8}>e8kZ(}SoO)y5rgBJPP_$Mf-*cs4mlB#o!tX+B*==g@U@ >z3x(H7nNcj3z0EfAUHl@C72QOZm>_0}MdEqUTcn9xv0UsIUx-2(uM$-!Wn>x<18*2# >zl6xeMcA?X0Rl1ihW|5$uh?`t;o44aJJdP*uB%aK>^Pap9Pvrx7I?v?UFrs7lcs?1h >zo6ht2Y#zZ6=&O3H`+>XK-Rt)C#&~(&N^hHY!n^AEv+`?juY5L^En%m{P0>JVV3qd3 >z*gI7Zd#zo~$#g2Xi$H^p$uqKoYNc*DmE8Jn3)gWwx~EmK3f0ZD3v9kp|EjB5DOQfP >z+d64IV;4Bvom0*wC)u6i&H`L3qB@8JGml0y(F$}DT|<xI+StQMxF4R2ui@&XHmOVM >zkw&B`X+c^OOc>zq0P?XU9yF^H=|Xydex;B!GKdTTJ<B1Z$Q<%MSwNPM6<|R=CVwNJ >zk}c#jvWx5^m&g^+xtrt`xl4W{y_k__7!vwA3G{F=yCo{h>XOI^oPj^XRiWPA{7t@* >zU*a{uc5Rjm)Gk%17x=*%3k`_lhxiO!f;W&;q$+JqvuQJP3(KXeX^`2$9A|DYubUm% >zXm*}e1iaVq%A%h*Ca#Hba*F&8?9>(cqr53^$-DA5!0=D13RuPJs<x`D>ZwMmscNBG >zE36p2$;?shRji6vp-!s2s*tYdyyC#k3k}0xEO-G;MAOk6v;^%yMd&Ir@RK+SWUxCP >zif!~~l#Bb3iDW7;&}=e~EF){lHgb$qW2<-}zaXxN#<GRHtqgrn1?lVhmab#fw^}-; >z)5ejGv7_`JOYlm32wg_;xIb9HP!djRkO(q~zrhRG0oIat1h%-&OZd<HK2H=~#7Hqu >zED+nA6HYI8m^;>e$NdkduG_)w=T36xyUW~9+*@u>Z?pH%*U_W!_BY_}z3x4?j`y-R >z%3BEj!<XKCzJ#ylTX=*x%Nm1>EB+R5ELw?fa*db)wqT1mEQ{qfH9{AvwR*SyMJHRm >ztRdD|VCiqH+ZMKeu%2{gfakK#+Gy>w8rZS+Ksy)MXp{YS(Cx-fbLWEF&->ij;~a3l >zaE^iXJ>?AWhI(&%#98jX0$xI89EP95%`wHXcsibm^KmSBhh&i{B$RR*Llc3ud(i=O >zJ+Ss6dXiQ!pD^p1O~GmnF^8Me%n9aS&4p$|Hi%`iRjiP0WuLRl>^?gL9#IwU^5=MG >zVA-quYj%O%VQoNzyMx{i0d1Yljq~7p0X&tR5NEBk!Fk|>xYb<~xP6>E-JR*K0R5?g >zssrED$1Sh~PCACq<L%(3^d?vFFp$dg<Ogz{jyBt{R0duQ+s%S`Exte;0blGlIZ|1& >zq6$_uRk(^$7Rc8wbxa)r$sDWBsM@-@-m7!Ho%(NjlOEw5_0D*UbQ_!6&)F{mpRWZr >zziS&e45I+r7;m?9+B##MiOv>hhl5-OKFc2W7q`Sc?VST(&>!dCs;}y=2CJc9L58ag >zv=;ROFKQC4BASY|YO5Nj>si$5iB8}Y^b;<|v%q5hOzsd2NGH<H^lN&S-lo6Onr67U >z*F0oiHGeWIvk*2J?7~;<Bx}cG`6#gR`^0f^S`=AxtYy|J>y~xTs$ti$V{Cs;0+1RZ >z0iU40Xd+q)B)G6e2N5S;5Xs_Y(MR+b>0%hj?>I3T{J}i&o|rEdix0&bv0iKzJHX-` >z7GH{!fZ#=ORooD_#RCx}s{)BMLd{Tb)DN`rFJQqQ0~^p87vQz{-|<)2=Ltb<u=}e? 
>z2>3qZ=@j}6y+CV$v}`oDnN7fJNdQ^s&U*uE4+hR24l*zi`1&o7gun8Id^s=R>wt^4 >zgS;H%NBIeUmS5nPLF<b7J#L6f!U(HiWJCWRL%}E%Jg}!wLugSH#0p8UtTE_0l!UsX >zo~SPxh=!tU^cotEa)CMCL34q5&!Y3_BDw+gs{|M%2nXX3ko{UX97o{BxEXGVL+miS >zmK|=#*$LpY=6UOV-ECOGm<m7HP{(@Ij*k)J0k^4QIv_V&%mwTgilu;FfhYw0Hj1r) >z;BK&Nhr!P-0u;}R^MK=JV8t7vSd@tS!jM5SScb?jSxbh4N83;~k&!Y=B56tqUT2AW >z-!;4-FW3w5!n|5uxEJBK0|MltfJ1DnofTunSqWB>m26G6r`ikcB0JcLbavY%c7oF# >zcr%CvgVcnvS}dGJu!gJ&iv-^av2qK#lz#|V7jnP-;VMEkR83SQXb1v-`lQ;Zw}St^ >zTkqG0^-*1<PwKP!yuPR}gU@(F7wZyzAAIZ}E7%GFPrH^CZbeuPttM8a6=fmI1fOyv >z*-Cbh-DE#GOpcNwa*~_{O}$7ilN+R%l#u(xpg}YkW<CvR6B<dQC<0GeQk%8|#)+c| >zG>ImIKhcx+p{d|8rqfKYJtOHDkgCaaD)<I@bT*w!=hKCBDg6*;PK9(oJxq&$pU%?r >U)W|HGkE0-vln?~y{o&_70meto0{{R3 > >literal 0 >HcmV?d00001 > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/BUILD.gn >index 43d1dfe9f71298d395ea2d976318f251cfb2e077..4ed07af9830a5293819af7f7d0c4fb28caefa102 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/BUILD.gn >@@ -60,6 +60,32 @@ group("examples") { > } > } > >+rtc_source_set("read_auth_file") { >+ testonly = true >+ sources = [ >+ "turnserver/read_auth_file.cc", >+ "turnserver/read_auth_file.h", >+ ] >+ deps = [ >+ "../rtc_base:rtc_base", >+ ] >+} >+ >+if (rtc_include_tests) { >+ rtc_test("examples_unittests") { >+ testonly = true >+ sources = [ >+ "turnserver/read_auth_file_unittest.cc", >+ ] >+ deps = [ >+ ":read_auth_file", >+ "../test:test_main", >+ "//test:test_support", >+ "//testing/gtest", >+ ] >+ } >+} >+ > if (is_android) { > rtc_android_apk("AppRTCMobile") { > testonly = true >@@ -166,15 +192,12 @@ if (is_android) { > "../build/android/adb_reverse_forwarder.py", > "../examples/androidtests/video_quality_loopback_test.py", > "../resources/reference_video_640x360_30fps.y4m", >- "../rtc_tools/barcode_tools/barcode_decoder.py", >- 
"../rtc_tools/barcode_tools/helper_functions.py", > "../rtc_tools/compare_videos.py", > "../rtc_tools/testing/prebuilt_apprtc.zip", > "../rtc_tools/testing/golang/linux/go.tar.gz", > "../rtc_tools/testing/build_apprtc.py", > "../rtc_tools/testing/utils.py", > "../tools_webrtc/video_quality_toolchain/linux/ffmpeg", >- "../tools_webrtc/video_quality_toolchain/linux/zxing", > "${root_out_dir}/frame_analyzer_host", > ] > } >@@ -480,8 +503,10 @@ if (is_ios || (is_mac && target_cpu != "x86")) { > "../api:libjingle_peerconnection_api", > "../api/audio_codecs:builtin_audio_decoder_factory", > "../api/audio_codecs:builtin_audio_encoder_factory", >+ "../api/video:builtin_video_bitrate_allocator_factory", > "../logging:rtc_event_log_impl_base", > "../media:rtc_audio_video", >+ "../modules/audio_processing:api", > "../modules/audio_processing:audio_processing", > "../pc:libjingle_peerconnection", > "../rtc_base:rtc_base", >@@ -625,12 +650,13 @@ if (is_ios || (is_mac && target_cpu != "x86")) { > rtc_ios_xctest_test("apprtcmobile_tests") { > info_plist = "objc/AppRTCMobile/ios/Info.plist" > sources = [ >- "objc/AppRTCMobile/ios/main.m", >+ "objc/AppRTCMobile/tests/main.mm", > ] > deps = [ > ":AppRTCMobile_lib", > ":apprtcmobile_test_sources", > "../sdk:framework_objc", >+ "//test:test_support", > ] > ldflags = [ "-all_load" ] > } >@@ -655,11 +681,14 @@ if (is_linux || is_win) { > suppressed_configs += [ "//build/config/clang:find_bad_constructs" ] > } > deps = [ >+ "../api:create_peerconnection_factory", > "../api:libjingle_peerconnection_api", > "../api/video:video_frame_i420", > "../rtc_base:checks", > "../rtc_base:stringutils", > "../rtc_base/third_party/sigslot", >+ "../system_wrappers:field_trial", >+ "../test:field_trial", > ] > if (is_win) { > sources += [ >@@ -696,6 +725,7 @@ if (is_linux || is_win) { > "../api/video_codecs:builtin_video_encoder_factory", > "../media:rtc_audio_video", > "../modules/audio_device:audio_device", >+ "../modules/audio_processing:api", > 
"../modules/audio_processing:audio_processing", > "../modules/video_capture:video_capture_module", > "../pc:libjingle_peerconnection", >@@ -718,10 +748,11 @@ if (is_linux || is_win) { > "peerconnection/server/utils.h", > ] > deps = [ >- "..:webrtc_common", > "../rtc_base:rtc_base_approved", > "../rtc_base:stringutils", > "../rtc_tools:command_line_parser", >+ "../system_wrappers:field_trial", >+ "../test:field_trial", > ] > if (!build_with_chromium && is_clang) { > # Suppress warnings from the Chromium Clang plugin (bugs.webrtc.org/163). >@@ -750,6 +781,7 @@ if (is_linux || is_win) { > "turnserver/turnserver_main.cc", > ] > deps = [ >+ ":read_auth_file", > "../p2p:rtc_p2p", > "../pc:rtc_pc", > "../rtc_base:rtc_base", >@@ -807,6 +839,7 @@ if (is_win || is_android) { > configs += [ "//build/config/win:windowed" ] > } > deps = [ >+ "../api:create_peerconnection_factory", > "../api:libjingle_peerconnection_api", > "../api/audio_codecs:builtin_audio_decoder_factory", > "../api/audio_codecs:builtin_audio_encoder_factory", >@@ -816,6 +849,7 @@ if (is_win || is_android) { > "../media:rtc_media", > "../media:rtc_media_base", > "../modules/audio_device:audio_device", >+ "../modules/audio_processing:api", > "../modules/audio_processing:audio_processing", > "../modules/video_capture:video_capture_module", > "../pc:libjingle_peerconnection", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/DEPS b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/DEPS >index 6a9a9f09cffef37ffc32156a3bc5bdc9b63aa2a8..5b378455900b537142e9a81ea769a6a1f27e7bba 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/DEPS >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/DEPS >@@ -8,5 +8,6 @@ include_rules = [ > "+p2p", > "+pc", > "+sdk/objc", >+ "+system_wrappers/include", > "+third_party/libyuv", > ] >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/aarproject/OWNERS b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/aarproject/OWNERS >new file mode 
100644 >index 0000000000000000000000000000000000000000..3c4e54174e60f6590dcca98a9806777cc1f9c118 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/aarproject/OWNERS >@@ -0,0 +1 @@ >+sakal@webrtc.org >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidapp/AndroidManifest.xml b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidapp/AndroidManifest.xml >index d6e0ff0417444db9d19c20f27c60cd8f46f9fe10..576449bb14e9edefe405dcb47ae5a123c419661d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidapp/AndroidManifest.xml >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidapp/AndroidManifest.xml >@@ -18,7 +18,8 @@ > <uses-permission android:name="android.permission.INTERNET" /> > <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/> > <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" /> >- <uses-permission android:name="android.permission.CAPTURE_VIDEO_OUTPUT" /> >+ <uses-permission tools:ignore="ProtectedPermissions" >+ android:name="android.permission.CAPTURE_VIDEO_OUTPUT" /> > > <!-- This is a test application that should always be debuggable. 
--> > <application android:label="@string/app_name" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidapp/res/layout/activity_connect.xml b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidapp/res/layout/activity_connect.xml >index 83707f72ed07f7cbd71445ea4ce4c378426bd963..3c591896444a825d9c0dd88a275a03e47a778151 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidapp/res/layout/activity_connect.xml >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidapp/res/layout/activity_connect.xml >@@ -1,6 +1,7 @@ > <?xml version="1.0" encoding="utf-8"?> > <LinearLayout > xmlns:android="http://schemas.android.com/apk/res/android" >+ xmlns:tools="http://schemas.android.com/tools" > android:layout_margin="16dp" > android:layout_width="match_parent" > android:layout_height="match_parent" >@@ -21,7 +22,9 @@ > android:gravity="center" > android:layout_marginBottom="8dp"> > >+ <!-- TODO(crbug.com/900912): Fix and remove lint ignore --> > <EditText >+ tools:ignore="LabelFor" > android:id="@+id/room_edittext" > android:layout_width="0dp" > android:layout_height="wrap_content" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidapp/src/org/appspot/apprtc/CallActivity.java b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidapp/src/org/appspot/apprtc/CallActivity.java >index ccc2c3bd93d8070e4363671e2734785074bb9eac..43961506ab93b32d5eb37c72c7356aa6facbb259 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidapp/src/org/appspot/apprtc/CallActivity.java >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidapp/src/org/appspot/apprtc/CallActivity.java >@@ -171,7 +171,7 @@ public class CallActivity extends Activity implements AppRTCClient.SignalingEven > private RoomConnectionParameters roomConnectionParameters; > @Nullable > private PeerConnectionParameters peerConnectionParameters; >- private boolean iceConnected; >+ private boolean connected; > private boolean 
isError; > private boolean callControlFragmentVisible = true; > private long callStartedTimeMs; >@@ -203,7 +203,7 @@ public class CallActivity extends Activity implements AppRTCClient.SignalingEven > getWindow().getDecorView().setSystemUiVisibility(getSystemUiVisibility()); > setContentView(R.layout.activity_call); > >- iceConnected = false; >+ connected = false; > signalingParameters = null; > > // Create UI controls. >@@ -553,7 +553,7 @@ public class CallActivity extends Activity implements AppRTCClient.SignalingEven > > // Helper functions. > private void toggleCallControlFragmentVisibility() { >- if (!iceConnected || !callFragment.isAdded()) { >+ if (!connected || !callFragment.isAdded()) { > return; > } > // Show/hide call control fragment >@@ -649,7 +649,7 @@ public class CallActivity extends Activity implements AppRTCClient.SignalingEven > audioManager.stop(); > audioManager = null; > } >- if (iceConnected && !isError) { >+ if (connected && !isError) { > setResult(RESULT_OK); > } else { > setResult(RESULT_CANCELED); >@@ -911,8 +911,6 @@ public class CallActivity extends Activity implements AppRTCClient.SignalingEven > @Override > public void run() { > logAndToast("ICE connected, delay=" + delta + "ms"); >- iceConnected = true; >- callConnected(); > } > }); > } >@@ -923,7 +921,30 @@ public class CallActivity extends Activity implements AppRTCClient.SignalingEven > @Override > public void run() { > logAndToast("ICE disconnected"); >- iceConnected = false; >+ } >+ }); >+ } >+ >+ @Override >+ public void onConnected() { >+ final long delta = System.currentTimeMillis() - callStartedTimeMs; >+ runOnUiThread(new Runnable() { >+ @Override >+ public void run() { >+ logAndToast("DTLS connected, delay=" + delta + "ms"); >+ connected = true; >+ callConnected(); >+ } >+ }); >+ } >+ >+ @Override >+ public void onDisconnected() { >+ runOnUiThread(new Runnable() { >+ @Override >+ public void run() { >+ logAndToast("DTLS disconnected"); >+ connected = false; > disconnect(); 
> } > }); >@@ -937,7 +958,7 @@ public class CallActivity extends Activity implements AppRTCClient.SignalingEven > runOnUiThread(new Runnable() { > @Override > public void run() { >- if (!isError && iceConnected) { >+ if (!isError && connected) { > hudFragment.updateEncoderStatistics(reports); > } > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidapp/src/org/appspot/apprtc/PeerConnectionClient.java b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidapp/src/org/appspot/apprtc/PeerConnectionClient.java >index 94135ded4d648713ce54a52ef39ed92aff709122..f85a788f3d16c6709549d4c8c3efe98f9c8ff2e4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidapp/src/org/appspot/apprtc/PeerConnectionClient.java >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidapp/src/org/appspot/apprtc/PeerConnectionClient.java >@@ -52,6 +52,7 @@ import org.webrtc.MediaStreamTrack; > import org.webrtc.MediaStreamTrack.MediaType; > import org.webrtc.PeerConnection; > import org.webrtc.PeerConnection.IceConnectionState; >+import org.webrtc.PeerConnection.PeerConnectionState; > import org.webrtc.PeerConnectionFactory; > import org.webrtc.RtpParameters; > import org.webrtc.RtpReceiver; >@@ -293,11 +294,23 @@ public class PeerConnectionClient { > void onIceConnected(); > > /** >- * Callback fired once connection is closed (IceConnectionState is >+ * Callback fired once connection is disconnected (IceConnectionState is > * DISCONNECTED). > */ > void onIceDisconnected(); > >+ /** >+ * Callback fired once DTLS connection is established (PeerConnectionState >+ * is CONNECTED). >+ */ >+ void onConnected(); >+ >+ /** >+ * Callback fired once DTLS connection is disconnected (PeerConnectionState >+ * is DISCONNECTED). >+ */ >+ void onDisconnected(); >+ > /** > * Callback fired once peer connection is closed. 
> */ >@@ -1263,6 +1276,20 @@ public class PeerConnectionClient { > }); > } > >+ @Override >+ public void onConnectionChange(final PeerConnection.PeerConnectionState newState) { >+ executor.execute(() -> { >+ Log.d(TAG, "PeerConnectionState: " + newState); >+ if (newState == PeerConnectionState.CONNECTED) { >+ events.onConnected(); >+ } else if (newState == PeerConnectionState.DISCONNECTED) { >+ events.onDisconnected(); >+ } else if (newState == PeerConnectionState.FAILED) { >+ reportError("DTLS connection failed."); >+ } >+ }); >+ } >+ > @Override > public void onIceGatheringChange(PeerConnection.IceGatheringState newState) { > Log.d(TAG, "IceGatheringState: " + newState); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidjunit/OWNERS b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidjunit/OWNERS >new file mode 100644 >index 0000000000000000000000000000000000000000..3c4e54174e60f6590dcca98a9806777cc1f9c118 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidjunit/OWNERS >@@ -0,0 +1 @@ >+sakal@webrtc.org >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidjunit/src/org/appspot/apprtc/TCPChannelClientTest.java b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidjunit/src/org/appspot/apprtc/TCPChannelClientTest.java >index 8c5f38ccb31b16afe5ba3ee8b28447f7f6f7953f..a8806ec5f7d67c16b09fc37bd61cd4d3d3a9ccd3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidjunit/src/org/appspot/apprtc/TCPChannelClientTest.java >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidjunit/src/org/appspot/apprtc/TCPChannelClientTest.java >@@ -38,10 +38,10 @@ public class TCPChannelClientTest { > * How long we wait before trying to connect to the server. Note: was > * previously only 10, which was too short (tests were flaky). 
> */ >- private static final int SERVER_WAIT = 100; >- private static final int CONNECT_TIMEOUT = 100; >- private static final int SEND_TIMEOUT = 100; >- private static final int DISCONNECT_TIMEOUT = 100; >+ private static final int SERVER_WAIT = 300; >+ private static final int CONNECT_TIMEOUT = 1000; >+ private static final int SEND_TIMEOUT = 1000; >+ private static final int DISCONNECT_TIMEOUT = 1000; > private static final int TERMINATION_TIMEOUT = 1000; > private static final String TEST_MESSAGE_SERVER = "Hello, Server!"; > private static final String TEST_MESSAGE_CLIENT = "Hello, Client!"; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidnativeapi/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidnativeapi/BUILD.gn >index 5568eb11dde3ebf3b951b02a62aa6855132865e1..7928640cc6f5caaeaea9c6a7156cd0d6613f9c74 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidnativeapi/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidnativeapi/BUILD.gn >@@ -49,10 +49,12 @@ if (is_android) { > "//api:libjingle_peerconnection_api", > "//api/audio_codecs:builtin_audio_decoder_factory", > "//api/audio_codecs:builtin_audio_encoder_factory", >+ "//api/video:builtin_video_bitrate_allocator_factory", > "//logging:rtc_event_log_impl_base", > "//media:rtc_audio_video", > "//media:rtc_internal_video_codecs", > "//modules/audio_processing", >+ "//modules/audio_processing:api", > "//modules/utility:utility", > "//pc:libjingle_peerconnection", > "//rtc_base:rtc_base", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidnativeapi/jni/androidcallclient.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidnativeapi/jni/androidcallclient.cc >index 005f369a6d104622dddb0bdbef8071a812f21233..a1394cd01c523dc6904830f3ac9af9876c49d5ad 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidnativeapi/jni/androidcallclient.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidnativeapi/jni/androidcallclient.cc >@@ -16,6 +16,7 @@ > #include "api/audio_codecs/builtin_audio_decoder_factory.h" > #include "api/audio_codecs/builtin_audio_encoder_factory.h" > #include "api/peerconnectioninterface.h" >+#include "api/video/builtin_video_bitrate_allocator_factory.h" > #include "examples/androidnativeapi/generated_jni/jni/CallClient_jni.h" > #include "media/engine/internaldecoderfactory.h" > #include "media/engine/internalencoderfactory.h" >@@ -99,7 +100,8 @@ void AndroidCallClient::Call(JNIEnv* env, > remote_sink_ = webrtc::JavaToNativeVideoSink(env, remote_sink.obj()); > > video_source_ = webrtc::CreateJavaVideoSource(env, signaling_thread_.get(), >- false /* is_screencast */); >+ /* is_screencast= */ false, >+ /* align_timestamps= */ true); > > CreatePeerConnection(); > Connect(); >@@ -159,6 +161,7 @@ void AndroidCallClient::CreatePeerConnectionFactory() { > webrtc::CreateBuiltinAudioDecoderFactory(), > absl::make_unique<webrtc::InternalEncoderFactory>(), > absl::make_unique<webrtc::InternalDecoderFactory>(), >+ webrtc::CreateBuiltinVideoBitrateAllocatorFactory(), > nullptr /* audio_mixer */, webrtc::AudioProcessingBuilder().Create()); > RTC_LOG(LS_INFO) << "Media engine created: " << media_engine.get(); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidtests/AndroidManifest.xml b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidtests/AndroidManifest.xml >index 7cd52bc531d74b9e861f0b5c3062c95a3a4564ae..dae2e980a6a60a7f34feab8f93f771f4fdc1e904 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidtests/AndroidManifest.xml >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidtests/AndroidManifest.xml >@@ -14,7 +14,7 @@ > package="org.appspot.apprtc.test"> > > <uses-permission android:name="android.permission.RUN_INSTRUMENTATION" /> >- <uses-sdk android:minSdkVersion="13" android:targetSdkVersion="21" /> >+ <uses-sdk 
android:minSdkVersion="16" android:targetSdkVersion="21" /> > > <application> > <uses-library android:name="android.test.runner" /> >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidtests/src/org/appspot/apprtc/test/PeerConnectionClientTest.java b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidtests/src/org/appspot/apprtc/test/PeerConnectionClientTest.java >index e2674017e456e059761baf502fa17905269a010b..c165f59b033d53fdd092af733a180cc7da465efc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidtests/src/org/appspot/apprtc/test/PeerConnectionClientTest.java >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidtests/src/org/appspot/apprtc/test/PeerConnectionClientTest.java >@@ -183,6 +183,16 @@ public class PeerConnectionClientTest implements PeerConnectionEvents { > } > } > >+ @Override >+ public void onConnected() { >+ Log.d(TAG, "DTLS Connected"); >+ } >+ >+ @Override >+ public void onDisconnected() { >+ Log.d(TAG, "DTLS Disconnected"); >+ } >+ > @Override > public void onPeerConnectionClosed() { > Log.d(TAG, "PeerConnection closed"); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidtests/video_quality_loopback_test.py b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidtests/video_quality_loopback_test.py >index 19dd37c3d36bfc84ecc9824aaa6607f73c21edd3..fc198a826f7962444b6485868b4b20a3d5d6e297 100755 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidtests/video_quality_loopback_test.py >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/androidtests/video_quality_loopback_test.py >@@ -190,20 +190,13 @@ def RunTest(android_device, adb_path, build_dir, temp_dir, num_retries, > # Run comparison script. 
> compare_script = os.path.join(SRC_DIR, 'rtc_tools', 'compare_videos.py') > frame_analyzer = os.path.join(build_dir, 'frame_analyzer_host') >- zxing_path = os.path.join(TOOLCHAIN_DIR, 'zxing') >- stats_file_ref = os.path.join(temp_dir, 'stats_ref.txt') >- stats_file_test = os.path.join(temp_dir, 'stats_test.txt') > > args = [ > '--ref_video', reference_video_yuv, > '--test_video', test_video_yuv, > '--yuv_frame_width', '640', > '--yuv_frame_height', '360', >- '--stats_file_ref', stats_file_ref, >- '--stats_file_test', stats_file_test, > '--frame_analyzer', frame_analyzer, >- '--ffmpeg_path', ffmpeg_path, >- '--zxing_path', zxing_path, > ] > if chartjson_result_file: > args.extend(['--chartjson_result_file', chartjson_result_file]) >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objc/AppRTCMobile/ARDAppClient.m b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objc/AppRTCMobile/ARDAppClient.m >index 9a0d3695f91615469c990d1122d1aeecdbff60e5..8b3d105797e2fff6861aba5dea3e10ee99ca111a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objc/AppRTCMobile/ARDAppClient.m >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objc/AppRTCMobile/ARDAppClient.m >@@ -400,10 +400,15 @@ - (void)peerConnection:(RTCPeerConnection *)peerConnection > didChangeIceConnectionState:(RTCIceConnectionState)newState { > RTCLog(@"ICE state changed: %ld", (long)newState); > dispatch_async(dispatch_get_main_queue(), ^{ >- [_delegate appClient:self didChangeConnectionState:newState]; >+ [self.delegate appClient:self didChangeConnectionState:newState]; > }); > } > >+- (void)peerConnection:(RTCPeerConnection *)peerConnection >+ didChangeConnectionState:(RTCPeerConnectionState)newState { >+ RTCLog(@"ICE+DTLS state changed: %ld", (long)newState); >+} >+ > - (void)peerConnection:(RTCPeerConnection *)peerConnection > didChangeIceGatheringState:(RTCIceGatheringState)newState { > RTCLog(@"ICE gathering state changed: %ld", (long)newState); >@@ -450,16 +455,16 @@ - 
(void)peerConnection:(RTCPeerConnection *)peerConnection > [[NSError alloc] initWithDomain:kARDAppClientErrorDomain > code:kARDAppClientErrorCreateSDP > userInfo:userInfo]; >- [_delegate appClient:self didError:sdpError]; >+ [self.delegate appClient:self didError:sdpError]; > return; > } > __weak ARDAppClient *weakSelf = self; >- [_peerConnection setLocalDescription:sdp >- completionHandler:^(NSError *error) { >- ARDAppClient *strongSelf = weakSelf; >- [strongSelf peerConnection:strongSelf.peerConnection >- didSetSessionDescriptionWithError:error]; >- }]; >+ [self.peerConnection setLocalDescription:sdp >+ completionHandler:^(NSError *error) { >+ ARDAppClient *strongSelf = weakSelf; >+ [strongSelf peerConnection:strongSelf.peerConnection >+ didSetSessionDescriptionWithError:error]; >+ }]; > ARDSessionDescriptionMessage *message = > [[ARDSessionDescriptionMessage alloc] initWithDescription:sdp]; > [self sendSignalingMessage:message]; >@@ -480,22 +485,21 @@ - (void)peerConnection:(RTCPeerConnection *)peerConnection > [[NSError alloc] initWithDomain:kARDAppClientErrorDomain > code:kARDAppClientErrorSetSDP > userInfo:userInfo]; >- [_delegate appClient:self didError:sdpError]; >+ [self.delegate appClient:self didError:sdpError]; > return; > } > // If we're answering and we've just set the remote offer we need to create > // an answer and set the local description. 
>- if (!_isInitiator && !_peerConnection.localDescription) { >+ if (!self.isInitiator && !self.peerConnection.localDescription) { > RTCMediaConstraints *constraints = [self defaultAnswerConstraints]; > __weak ARDAppClient *weakSelf = self; >- [_peerConnection answerForConstraints:constraints >- completionHandler:^(RTCSessionDescription *sdp, >- NSError *error) { >- ARDAppClient *strongSelf = weakSelf; >- [strongSelf peerConnection:strongSelf.peerConnection >- didCreateSessionDescription:sdp >- error:error]; >- }]; >+ [self.peerConnection answerForConstraints:constraints >+ completionHandler:^(RTCSessionDescription *sdp, NSError *error) { >+ ARDAppClient *strongSelf = weakSelf; >+ [strongSelf peerConnection:strongSelf.peerConnection >+ didCreateSessionDescription:sdp >+ error:error]; >+ }]; > } > }); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objc/AppRTCMobile/ARDStatsBuilder.m b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objc/AppRTCMobile/ARDStatsBuilder.m >index cfe2ec5cd828569007f239b1debd7738ce1dbd9e..f9a1920399211bf7a63c0adfa77f321c7eda4393 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objc/AppRTCMobile/ARDStatsBuilder.m >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objc/AppRTCMobile/ARDStatsBuilder.m >@@ -167,23 +167,43 @@ - (int)calculateAvgQP { > return deltaFramesEncoded != 0 ? 
deltaQPSum / deltaFramesEncoded : 0; > } > >+- (void)updateBweStatOfKey:(NSString *)key value:(NSString *)value { >+ if ([key isEqualToString:@"googAvailableSendBandwidth"]) { >+ _availableSendBw = [ARDBitrateTracker bitrateStringForBitrate:value.doubleValue]; >+ } else if ([key isEqualToString:@"googAvailableReceiveBandwidth"]) { >+ _availableRecvBw = [ARDBitrateTracker bitrateStringForBitrate:value.doubleValue]; >+ } else if ([key isEqualToString:@"googActualEncBitrate"]) { >+ _actualEncBitrate = [ARDBitrateTracker bitrateStringForBitrate:value.doubleValue]; >+ } else if ([key isEqualToString:@"googTargetEncBitrate"]) { >+ _targetEncBitrate = [ARDBitrateTracker bitrateStringForBitrate:value.doubleValue]; >+ } >+} >+ > - (void)parseBweStatsReport:(RTCLegacyStatsReport *)statsReport { >- [statsReport.values enumerateKeysAndObjectsUsingBlock:^( >- NSString *key, NSString *value, BOOL *stop) { >- if ([key isEqualToString:@"googAvailableSendBandwidth"]) { >- _availableSendBw = >- [ARDBitrateTracker bitrateStringForBitrate:value.doubleValue]; >- } else if ([key isEqualToString:@"googAvailableReceiveBandwidth"]) { >- _availableRecvBw = >- [ARDBitrateTracker bitrateStringForBitrate:value.doubleValue]; >- } else if ([key isEqualToString:@"googActualEncBitrate"]) { >- _actualEncBitrate = >- [ARDBitrateTracker bitrateStringForBitrate:value.doubleValue]; >- } else if ([key isEqualToString:@"googTargetEncBitrate"]) { >- _targetEncBitrate = >- [ARDBitrateTracker bitrateStringForBitrate:value.doubleValue]; >- } >- }]; >+ [statsReport.values >+ enumerateKeysAndObjectsUsingBlock:^(NSString *key, NSString *value, BOOL *stop) { >+ [self updateBweStatOfKey:key value:value]; >+ }]; >+} >+ >+- (void)updateConnectionStatOfKey:(NSString *)key value:(NSString *)value { >+ if ([key isEqualToString:@"googRtt"]) { >+ _connRtt = value; >+ } else if ([key isEqualToString:@"googLocalCandidateType"]) { >+ _localCandType = value; >+ } else if ([key isEqualToString:@"googRemoteCandidateType"]) { 
>+ _remoteCandType = value; >+ } else if ([key isEqualToString:@"googTransportType"]) { >+ _transportType = value; >+ } else if ([key isEqualToString:@"bytesReceived"]) { >+ NSInteger byteCount = value.integerValue; >+ [_connRecvBitrateTracker updateBitrateWithCurrentByteCount:byteCount]; >+ _connRecvBitrate = _connRecvBitrateTracker.bitrateString; >+ } else if ([key isEqualToString:@"bytesSent"]) { >+ NSInteger byteCount = value.integerValue; >+ [_connSendBitrateTracker updateBitrateWithCurrentByteCount:byteCount]; >+ _connSendBitrate = _connSendBitrateTracker.bitrateString; >+ } > } > > - (void)parseConnectionStatsReport:(RTCLegacyStatsReport *)statsReport { >@@ -191,26 +211,10 @@ - (void)parseConnectionStatsReport:(RTCLegacyStatsReport *)statsReport { > if (![activeConnection isEqualToString:@"true"]) { > return; > } >- [statsReport.values enumerateKeysAndObjectsUsingBlock:^( >- NSString *key, NSString *value, BOOL *stop) { >- if ([key isEqualToString:@"googRtt"]) { >- _connRtt = value; >- } else if ([key isEqualToString:@"googLocalCandidateType"]) { >- _localCandType = value; >- } else if ([key isEqualToString:@"googRemoteCandidateType"]) { >- _remoteCandType = value; >- } else if ([key isEqualToString:@"googTransportType"]) { >- _transportType = value; >- } else if ([key isEqualToString:@"bytesReceived"]) { >- NSInteger byteCount = value.integerValue; >- [_connRecvBitrateTracker updateBitrateWithCurrentByteCount:byteCount]; >- _connRecvBitrate = _connRecvBitrateTracker.bitrateString; >- } else if ([key isEqualToString:@"bytesSent"]) { >- NSInteger byteCount = value.integerValue; >- [_connSendBitrateTracker updateBitrateWithCurrentByteCount:byteCount]; >- _connSendBitrate = _connSendBitrateTracker.bitrateString; >- } >- }]; >+ [statsReport.values >+ enumerateKeysAndObjectsUsingBlock:^(NSString *key, NSString *value, BOOL *stop) { >+ [self updateConnectionStatOfKey:key value:value]; >+ }]; > } > > - (void)parseSendSsrcStatsReport:(RTCLegacyStatsReport 
*)statsReport { >@@ -224,50 +228,58 @@ - (void)parseSendSsrcStatsReport:(RTCLegacyStatsReport *)statsReport { > } > } > >+- (void)updateAudioSendStatOfKey:(NSString *)key value:(NSString *)value { >+ if ([key isEqualToString:@"googCodecName"]) { >+ _audioSendCodec = value; >+ } else if ([key isEqualToString:@"bytesSent"]) { >+ NSInteger byteCount = value.integerValue; >+ [_audioSendBitrateTracker updateBitrateWithCurrentByteCount:byteCount]; >+ _audioSendBitrate = _audioSendBitrateTracker.bitrateString; >+ } >+} >+ > - (void)parseAudioSendStatsReport:(RTCLegacyStatsReport *)statsReport { >- [statsReport.values enumerateKeysAndObjectsUsingBlock:^( >- NSString *key, NSString *value, BOOL *stop) { >- if ([key isEqualToString:@"googCodecName"]) { >- _audioSendCodec = value; >- } else if ([key isEqualToString:@"bytesSent"]) { >- NSInteger byteCount = value.integerValue; >- [_audioSendBitrateTracker updateBitrateWithCurrentByteCount:byteCount]; >- _audioSendBitrate = _audioSendBitrateTracker.bitrateString; >- } >- }]; >+ [statsReport.values >+ enumerateKeysAndObjectsUsingBlock:^(NSString *key, NSString *value, BOOL *stop) { >+ [self updateAudioSendStatOfKey:key value:value]; >+ }]; >+} >+ >+- (void)updateVideoSendStatOfKey:(NSString *)key value:(NSString *)value { >+ if ([key isEqualToString:@"googCodecName"]) { >+ _videoSendCodec = value; >+ } else if ([key isEqualToString:@"googFrameHeightInput"]) { >+ _videoInputHeight = value; >+ } else if ([key isEqualToString:@"googFrameWidthInput"]) { >+ _videoInputWidth = value; >+ } else if ([key isEqualToString:@"googFrameRateInput"]) { >+ _videoInputFps = value; >+ } else if ([key isEqualToString:@"googFrameHeightSent"]) { >+ _videoSendHeight = value; >+ } else if ([key isEqualToString:@"googFrameWidthSent"]) { >+ _videoSendWidth = value; >+ } else if ([key isEqualToString:@"googFrameRateSent"]) { >+ _videoSendFps = value; >+ } else if ([key isEqualToString:@"googAvgEncodeMs"]) { >+ _videoEncodeMs = value; >+ } else if ([key 
isEqualToString:@"bytesSent"]) { >+ NSInteger byteCount = value.integerValue; >+ [_videoSendBitrateTracker updateBitrateWithCurrentByteCount:byteCount]; >+ _videoSendBitrate = _videoSendBitrateTracker.bitrateString; >+ } else if ([key isEqualToString:@"qpSum"]) { >+ _oldVideoQPSum = _videoQPSum; >+ _videoQPSum = value.integerValue; >+ } else if ([key isEqualToString:@"framesEncoded"]) { >+ _oldFramesEncoded = _framesEncoded; >+ _framesEncoded = value.integerValue; >+ } > } > > - (void)parseVideoSendStatsReport:(RTCLegacyStatsReport *)statsReport { >- [statsReport.values enumerateKeysAndObjectsUsingBlock:^( >- NSString *key, NSString *value, BOOL *stop) { >- if ([key isEqualToString:@"googCodecName"]) { >- _videoSendCodec = value; >- } else if ([key isEqualToString:@"googFrameHeightInput"]) { >- _videoInputHeight = value; >- } else if ([key isEqualToString:@"googFrameWidthInput"]) { >- _videoInputWidth = value; >- } else if ([key isEqualToString:@"googFrameRateInput"]) { >- _videoInputFps = value; >- } else if ([key isEqualToString:@"googFrameHeightSent"]) { >- _videoSendHeight = value; >- } else if ([key isEqualToString:@"googFrameWidthSent"]) { >- _videoSendWidth = value; >- } else if ([key isEqualToString:@"googFrameRateSent"]) { >- _videoSendFps = value; >- } else if ([key isEqualToString:@"googAvgEncodeMs"]) { >- _videoEncodeMs = value; >- } else if ([key isEqualToString:@"bytesSent"]) { >- NSInteger byteCount = value.integerValue; >- [_videoSendBitrateTracker updateBitrateWithCurrentByteCount:byteCount]; >- _videoSendBitrate = _videoSendBitrateTracker.bitrateString; >- } else if ([key isEqualToString:@"qpSum"]) { >- _oldVideoQPSum = _videoQPSum; >- _videoQPSum = value.integerValue; >- } else if ([key isEqualToString:@"framesEncoded"]) { >- _oldFramesEncoded = _framesEncoded; >- _framesEncoded = value.integerValue; >- } >- }]; >+ [statsReport.values >+ enumerateKeysAndObjectsUsingBlock:^(NSString *key, NSString *value, BOOL *stop) { >+ [self 
updateVideoSendStatOfKey:key value:value]; >+ }]; > } > > - (void)parseRecvSsrcStatsReport:(RTCLegacyStatsReport *)statsReport { >@@ -281,44 +293,52 @@ - (void)parseRecvSsrcStatsReport:(RTCLegacyStatsReport *)statsReport { > } > } > >+- (void)updateAudioRecvStatOfKey:(NSString *)key value:(NSString *)value { >+ if ([key isEqualToString:@"googCodecName"]) { >+ _audioRecvCodec = value; >+ } else if ([key isEqualToString:@"bytesReceived"]) { >+ NSInteger byteCount = value.integerValue; >+ [_audioRecvBitrateTracker updateBitrateWithCurrentByteCount:byteCount]; >+ _audioRecvBitrate = _audioRecvBitrateTracker.bitrateString; >+ } else if ([key isEqualToString:@"googSpeechExpandRate"]) { >+ _audioExpandRate = value; >+ } else if ([key isEqualToString:@"googCurrentDelayMs"]) { >+ _audioCurrentDelay = value; >+ } >+} >+ > - (void)parseAudioRecvStatsReport:(RTCLegacyStatsReport *)statsReport { >- [statsReport.values enumerateKeysAndObjectsUsingBlock:^( >- NSString *key, NSString *value, BOOL *stop) { >- if ([key isEqualToString:@"googCodecName"]) { >- _audioRecvCodec = value; >- } else if ([key isEqualToString:@"bytesReceived"]) { >- NSInteger byteCount = value.integerValue; >- [_audioRecvBitrateTracker updateBitrateWithCurrentByteCount:byteCount]; >- _audioRecvBitrate = _audioRecvBitrateTracker.bitrateString; >- } else if ([key isEqualToString:@"googSpeechExpandRate"]) { >- _audioExpandRate = value; >- } else if ([key isEqualToString:@"googCurrentDelayMs"]) { >- _audioCurrentDelay = value; >- } >- }]; >+ [statsReport.values >+ enumerateKeysAndObjectsUsingBlock:^(NSString *key, NSString *value, BOOL *stop) { >+ [self updateAudioRecvStatOfKey:key value:value]; >+ }]; >+} >+ >+- (void)updateVideoRecvStatOfKey:(NSString *)key value:(NSString *)value { >+ if ([key isEqualToString:@"googFrameHeightReceived"]) { >+ _videoRecvHeight = value; >+ } else if ([key isEqualToString:@"googFrameWidthReceived"]) { >+ _videoRecvWidth = value; >+ } else if ([key 
isEqualToString:@"googFrameRateReceived"]) { >+ _videoRecvFps = value; >+ } else if ([key isEqualToString:@"googFrameRateDecoded"]) { >+ _videoDecodedFps = value; >+ } else if ([key isEqualToString:@"googFrameRateOutput"]) { >+ _videoOutputFps = value; >+ } else if ([key isEqualToString:@"googDecodeMs"]) { >+ _videoDecodeMs = value; >+ } else if ([key isEqualToString:@"bytesReceived"]) { >+ NSInteger byteCount = value.integerValue; >+ [_videoRecvBitrateTracker updateBitrateWithCurrentByteCount:byteCount]; >+ _videoRecvBitrate = _videoRecvBitrateTracker.bitrateString; >+ } > } > > - (void)parseVideoRecvStatsReport:(RTCLegacyStatsReport *)statsReport { >- [statsReport.values enumerateKeysAndObjectsUsingBlock:^( >- NSString *key, NSString *value, BOOL *stop) { >- if ([key isEqualToString:@"googFrameHeightReceived"]) { >- _videoRecvHeight = value; >- } else if ([key isEqualToString:@"googFrameWidthReceived"]) { >- _videoRecvWidth = value; >- } else if ([key isEqualToString:@"googFrameRateReceived"]) { >- _videoRecvFps = value; >- } else if ([key isEqualToString:@"googFrameRateDecoded"]) { >- _videoDecodedFps = value; >- } else if ([key isEqualToString:@"googFrameRateOutput"]) { >- _videoOutputFps = value; >- } else if ([key isEqualToString:@"googDecodeMs"]) { >- _videoDecodeMs = value; >- } else if ([key isEqualToString:@"bytesReceived"]) { >- NSInteger byteCount = value.integerValue; >- [_videoRecvBitrateTracker updateBitrateWithCurrentByteCount:byteCount]; >- _videoRecvBitrate = _videoRecvBitrateTracker.bitrateString; >- } >- }]; >+ [statsReport.values >+ enumerateKeysAndObjectsUsingBlock:^(NSString *key, NSString *value, BOOL *stop) { >+ [self updateVideoRecvStatOfKey:key value:value]; >+ }]; > } > > @end >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objc/AppRTCMobile/ios/ARDMainViewController.m b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objc/AppRTCMobile/ios/ARDMainViewController.m >index 
fae3f7b8aa7267c6db45bef262ed9b3560c6b20d..63b3dd76cad025d9bc985110adc523314428ae07 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objc/AppRTCMobile/ios/ARDMainViewController.m >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objc/AppRTCMobile/ios/ARDMainViewController.m >@@ -32,14 +32,17 @@ @interface ARDMainViewController () < > ARDMainViewDelegate, > ARDVideoCallViewControllerDelegate, > RTCAudioSessionDelegate> >+@property(nonatomic, strong) ARDMainView *mainView; >+@property(nonatomic, strong) AVAudioPlayer *audioPlayer; > @end > > @implementation ARDMainViewController { >- ARDMainView *_mainView; >- AVAudioPlayer *_audioPlayer; > BOOL _useManualAudio; > } > >+@synthesize mainView = _mainView; >+@synthesize audioPlayer = _audioPlayer; >+ > - (void)viewDidLoad { > [super viewDidLoad]; > if ([[[NSProcessInfo processInfo] arguments] containsObject:loopbackLaunchProcessArgument]) { >@@ -165,13 +168,13 @@ - (void)audioSessionDidStartPlayOrRecord:(RTCAudioSession *)session { > // Stop playback on main queue and then configure WebRTC. 
> [RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeMain > block:^{ >- if (_mainView.isAudioLoopPlaying) { >- RTCLog(@"Stopping audio loop due to WebRTC start."); >- [_audioPlayer stop]; >- } >- RTCLog(@"Setting isAudioEnabled to YES."); >- session.isAudioEnabled = YES; >- }]; >+ if (self.mainView.isAudioLoopPlaying) { >+ RTCLog(@"Stopping audio loop due to WebRTC start."); >+ [self.audioPlayer stop]; >+ } >+ RTCLog(@"Setting isAudioEnabled to YES."); >+ session.isAudioEnabled = YES; >+ }]; > } > > - (void)audioSessionDidStopPlayOrRecord:(RTCAudioSession *)session { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objc/AppRTCMobile/ios/ARDVideoCallViewController.m b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objc/AppRTCMobile/ios/ARDVideoCallViewController.m >index 2827bf94423cb023e45ea984f5e77687ae764a0a..f7f4a877ef0aa1ed131a71229b4aff207dbacc2a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objc/AppRTCMobile/ios/ARDVideoCallViewController.m >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objc/AppRTCMobile/ios/ARDVideoCallViewController.m >@@ -27,6 +27,7 @@ @interface ARDVideoCallViewController () <ARDAppClientDelegate, > RTCAudioSessionDelegate> > @property(nonatomic, strong) RTCVideoTrack *remoteVideoTrack; > @property(nonatomic, readonly) ARDVideoCallView *videoCallView; >+@property(nonatomic, assign) AVAudioSessionPortOverride portOverride; > @end > > @implementation ARDVideoCallViewController { >@@ -34,12 +35,12 @@ @implementation ARDVideoCallViewController { > RTCVideoTrack *_remoteVideoTrack; > ARDCaptureController *_captureController; > ARDFileCaptureController *_fileCaptureController NS_AVAILABLE_IOS(10); >- AVAudioSessionPortOverride _portOverride; > } > > @synthesize videoCallView = _videoCallView; > @synthesize remoteVideoTrack = _remoteVideoTrack; > @synthesize delegate = _delegate; >+@synthesize portOverride = _portOverride; > > - (instancetype)initForRoom:(NSString *)room > 
isLoopback:(BOOL)isLoopback >@@ -168,7 +169,7 @@ - (void)videoCallViewDidChangeRoute:(ARDVideoCallView *)view { > [session lockForConfiguration]; > NSError *error = nil; > if ([session overrideOutputAudioPort:override error:&error]) { >- _portOverride = override; >+ self.portOverride = override; > } else { > RTCLogError(@"Error overriding output port: %@", > error.localizedDescription); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objc/AppRTCMobile/mac/APPRTCViewController.m b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objc/AppRTCMobile/mac/APPRTCViewController.m >index 9f5ed40b1b41c28ea1cf1ed73af365d5461b7bc2..a972a20e4ea898964b1ebe0222a8d2c440bbbfad 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objc/AppRTCMobile/mac/APPRTCViewController.m >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objc/AppRTCMobile/mac/APPRTCViewController.m >@@ -39,6 +39,7 @@ @interface APPRTCMainView : NSView > @property(nonatomic, weak) id<APPRTCMainViewDelegate> delegate; > @property(nonatomic, readonly) NSView<RTCVideoRenderer>* localVideoView; > @property(nonatomic, readonly) NSView<RTCVideoRenderer>* remoteVideoView; >+@property(nonatomic, readonly) NSTextView* logView; > > - (void)displayLogMessage:(NSString*)message; > >@@ -52,7 +53,6 @@ @implementation APPRTCMainView { > NSButton* _connectButton; > NSButton* _loopbackButton; > NSTextField* _roomField; >- NSTextView* _logView; > CGSize _localVideoSize; > CGSize _remoteVideoSize; > } >@@ -60,14 +60,13 @@ @implementation APPRTCMainView { > @synthesize delegate = _delegate; > @synthesize localVideoView = _localVideoView; > @synthesize remoteVideoView = _remoteVideoView; >- >+@synthesize logView = _logView; > > - (void)displayLogMessage:(NSString *)message { > dispatch_async(dispatch_get_main_queue(), ^{ >- _logView.string = >- [NSString stringWithFormat:@"%@%@\n", _logView.string, message]; >- NSRange range = NSMakeRange(_logView.string.length, 0); >- [_logView 
scrollRangeToVisible:range]; >+ self.logView.string = [NSString stringWithFormat:@"%@%@\n", self.logView.string, message]; >+ NSRange range = NSMakeRange(self.logView.string.length, 0); >+ [self.logView scrollRangeToVisible:range]; > }); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objc/AppRTCMobile/tests/main.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objc/AppRTCMobile/tests/main.mm >new file mode 100644 >index 0000000000000000000000000000000000000000..3625ffd7bf81ca83ee266bfa199fa9ef3048035d >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objc/AppRTCMobile/tests/main.mm >@@ -0,0 +1,21 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#import <UIKit/UIKit.h> >+ >+#include "test/ios/coverage_util_ios.h" >+ >+int main(int argc, char* argv[]) { >+ rtc::test::ConfigureCoverageReportPath(); >+ >+ @autoreleasepool { >+ return UIApplicationMain(argc, argv, nil, nil); >+ } >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objcnativeapi/objc/objccallclient.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objcnativeapi/objc/objccallclient.mm >index c384da3ef8cbbb776dd05fc54055c2099db111cf..c7b2af42123e495d7e72ae66cecfd157e32381d2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objcnativeapi/objc/objccallclient.mm >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/objcnativeapi/objc/objccallclient.mm >@@ -21,6 +21,7 @@ > #include "api/audio_codecs/builtin_audio_decoder_factory.h" > #include "api/audio_codecs/builtin_audio_encoder_factory.h" > #include "api/peerconnectioninterface.h" >+#include "api/video/builtin_video_bitrate_allocator_factory.h" > #include "media/engine/webrtcmediaengine.h" > #include "modules/audio_processing/include/audio_processing.h" > #include "sdk/objc/native/api/video_capturer.h" >@@ -116,12 +117,16 @@ void ObjCCallClient::CreatePeerConnectionFactory() { > std::unique_ptr<webrtc::VideoEncoderFactory> videoEncoderFactory = > webrtc::ObjCToNativeVideoEncoderFactory([[RTCDefaultVideoEncoderFactory alloc] init]); > >+ std::unique_ptr<webrtc::VideoBitrateAllocatorFactory> videoBitrateAllocatorFactory = >+ webrtc::CreateBuiltinVideoBitrateAllocatorFactory(); >+ > std::unique_ptr<cricket::MediaEngineInterface> media_engine = > cricket::WebRtcMediaEngineFactory::Create(nullptr /* adm */, > webrtc::CreateBuiltinAudioEncoderFactory(), > webrtc::CreateBuiltinAudioDecoderFactory(), > std::move(videoEncoderFactory), > std::move(videoDecoderFactory), >+ std::move(videoBitrateAllocatorFactory), > nullptr /* audio_mixer */, > webrtc::AudioProcessingBuilder().Create()); > RTC_LOG(LS_INFO) << "Media engine created: " << media_engine.get(); 
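The objccallclient.mm hunk above threads a new `VideoBitrateAllocatorFactory` into `WebRtcMediaEngineFactory::Create` via `std::move`, handing the engine sole ownership of the factory. A minimal sketch of this ownership-transfer pattern, using hypothetical `BitrateAllocatorFactory`/`MediaEngine` stand-ins rather than the real WebRTC types:

```cpp
#include <cassert>
#include <memory>
#include <utility>

// Hypothetical stand-in for a bitrate allocator factory (toy policy).
struct BitrateAllocatorFactory {
  int Allocate(int link_kbps) const { return link_kbps / 2; }
};

struct MediaEngine {
  // The engine takes sole ownership of the factory it is constructed with.
  explicit MediaEngine(std::unique_ptr<BitrateAllocatorFactory> f)
      : factory_(std::move(f)) {}
  int AllocateBitrate(int link_kbps) const { return factory_->Allocate(link_kbps); }

 private:
  std::unique_ptr<BitrateAllocatorFactory> factory_;
};

// Mirrors the Create(..., std::move(videoBitrateAllocatorFactory), ...) call:
// the caller's unique_ptr is left null after the move.
std::unique_ptr<MediaEngine> CreateEngine(
    std::unique_ptr<BitrateAllocatorFactory> factory) {
  return std::make_unique<MediaEngine>(std::move(factory));
}
```

After `CreateEngine(std::move(f))` the caller's `f` is null; the factory's lifetime is now tied to the engine, which is why the diff constructs the factory immediately before the `Create` call and never touches it again.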
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/conductor.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/conductor.cc >index c25259ce34df35ca4c384568f8ef9248fca7d02a..a781c4a1874aa32b0d9cf8307bd550aa0966888d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/conductor.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/conductor.cc >@@ -16,6 +16,7 @@ > > #include "api/audio_codecs/builtin_audio_decoder_factory.h" > #include "api/audio_codecs/builtin_audio_encoder_factory.h" >+#include "api/create_peerconnection_factory.h" > #include "api/video_codecs/builtin_video_decoder_factory.h" > #include "api/video_codecs/builtin_video_encoder_factory.h" > #include "examples/peerconnection/client/defaults.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/flagdefs.h b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/flagdefs.h >index 564e0e95f90c7c55fc9325f93fd02c7e9865524f..51f9c9a4255a4aae3faf480f655b0a775acf41e1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/flagdefs.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/flagdefs.h >@@ -19,20 +19,29 @@ extern const uint16_t kDefaultServerPort; // From defaults.[h|cc] > // header file so that they can be shared across the different main.cc's > // for each platform. 
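The flagdefs.h hunk below renames `DEFINE_bool`/`DEFINE_string`/`DEFINE_int` to `WEBRTC_DEFINE_*`, prefixing the macros so they cannot collide with identically named macros from other flag libraries (e.g. gflags) in the same translation unit. A toy illustration of such a prefixed flag macro (not the real `rtc_base/flags` implementation):

```cpp
#include <cassert>
#include <string>

// Toy prefixed flag macros; each expands to a uniquely named FLAG_ global.
// The TOY_ prefix keeps them from clashing with another library's DEFINE_bool.
#define TOY_DEFINE_bool(name, default_value, help) \
  bool FLAG_##name = (default_value);
#define TOY_DEFINE_string(name, default_value, help) \
  std::string FLAG_##name = (default_value);

TOY_DEFINE_bool(autoconnect, false, "Connect without user intervention.")
TOY_DEFINE_string(server, "localhost", "The server to connect to.")
```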
> >-DEFINE_bool(help, false, "Prints this message"); >-DEFINE_bool(autoconnect, >- false, >- "Connect to the server without user " >- "intervention."); >-DEFINE_string(server, "localhost", "The server to connect to."); >-DEFINE_int(port, >- kDefaultServerPort, >- "The port on which the server is listening."); >-DEFINE_bool( >+WEBRTC_DEFINE_bool(help, false, "Prints this message"); >+WEBRTC_DEFINE_bool(autoconnect, >+ false, >+ "Connect to the server without user " >+ "intervention."); >+WEBRTC_DEFINE_string(server, "localhost", "The server to connect to."); >+WEBRTC_DEFINE_int(port, >+ kDefaultServerPort, >+ "The port on which the server is listening."); >+WEBRTC_DEFINE_bool( > autocall, > false, > "Call the first available other client on " > "the server without user intervention. Note: this flag should only be set " > "to true on one of the two clients."); > >+WEBRTC_DEFINE_string( >+ force_fieldtrials, >+ "", >+ "Field trials control experimental features. This flag specifies the field " >+ "trials in effect. E.g. running with " >+ "--force_fieldtrials=WebRTC-FooFeature/Enabled/ " >+ "will assign the group Enabled to field trial WebRTC-FooFeature. 
Multiple " >+ "trials are separated by \"/\""); >+ > #endif // EXAMPLES_PEERCONNECTION_CLIENT_FLAGDEFS_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/linux/main.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/linux/main.cc >index 50179c4ba5510cf1ed7a3c2eda6fa00af7dc7b18..6c34683d2ea91d61051e78de09e7fdd2e9e38401 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/linux/main.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/linux/main.cc >@@ -17,6 +17,8 @@ > > #include "rtc_base/ssladapter.h" > #include "rtc_base/thread.h" >+#include "system_wrappers/include/field_trial.h" >+#include "test/field_trial.h" > > class CustomSocketServer : public rtc::PhysicalSocketServer { > public: >@@ -75,6 +77,11 @@ int main(int argc, char* argv[]) { > return 0; > } > >+ webrtc::test::ValidateFieldTrialsStringOrDie(FLAG_force_fieldtrials); >+ // InitFieldTrialsFromString stores the char*, so the char array must outlive >+ // the application. >+ webrtc::field_trial::InitFieldTrialsFromString(FLAG_force_fieldtrials); >+ > // Abort if the user specifies a port that is outside the allowed > // range [1, 65535]. 
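The `force_fieldtrials` string added above encodes trial/group pairs as consecutive `Name/Group/` segments, e.g. `WebRTC-FooFeature/Enabled/`. An illustrative parser for that format (a sketch only; the real `webrtc::field_trial` code additionally validates the string and retains the raw `char*`, which is why the comment in the hunk warns the array must outlive the application):

```cpp
#include <cassert>
#include <map>
#include <string>

// Split "Trial1/Group1/Trial2/Group2/" into {Trial1: Group1, Trial2: Group2}.
// Returns an empty map on malformed input (odd or unterminated segments).
std::map<std::string, std::string> ParseFieldTrials(const std::string& s) {
  std::map<std::string, std::string> trials;
  std::string trial;
  size_t pos = 0;
  while (pos < s.size()) {
    size_t slash = s.find('/', pos);
    if (slash == std::string::npos) return {};  // unterminated segment
    std::string segment = s.substr(pos, slash - pos);
    if (trial.empty()) {
      trial = segment;          // first of a pair: trial name
    } else {
      trials[trial] = segment;  // second of a pair: group name
      trial.clear();
    }
    pos = slash + 1;
  }
  if (!trial.empty()) return {};  // trial name without a group
  return trials;
}
```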
> if ((FLAG_port < 1) || (FLAG_port > 65535)) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/linux/main_wnd.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/linux/main_wnd.cc >index 52b0d88523c3ca823dc98707412538db0df9aae5..2f7777d1eea1e6e38d1bb90fe9f5f6023862291c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/linux/main_wnd.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/linux/main_wnd.cc >@@ -18,11 +18,8 @@ > #include "examples/peerconnection/client/defaults.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" >-#include "rtc_base/stringutils.h" > #include "third_party/libyuv/include/libyuv/convert_from.h" > >-using rtc::sprintfn; >- > namespace { > > // >@@ -150,7 +147,7 @@ GtkMainWnd::GtkMainWnd(const char* server, > autoconnect_(autoconnect), > autocall_(autocall) { > char buffer[10]; >- sprintfn(buffer, sizeof(buffer), "%i", port); >+ snprintf(buffer, sizeof(buffer), "%i", port); > port_ = buffer; > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/main.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/main.cc >index 25dde1ef6585d82578a79f45452bea4b6d616fef..f92e606c338119e22e43b11357ffb8bac230aa6c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/main.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/main.cc >@@ -16,6 +16,8 @@ > #include "rtc_base/ssladapter.h" > #include "rtc_base/win32socketinit.h" > #include "rtc_base/win32socketserver.h" >+#include "system_wrappers/include/field_trial.h" >+#include "test/field_trial.h" > > int PASCAL wWinMain(HINSTANCE instance, > HINSTANCE prev_instance, >@@ -36,6 +38,11 @@ int PASCAL wWinMain(HINSTANCE instance, > return 0; > } > >+ webrtc::test::ValidateFieldTrialsStringOrDie(FLAG_force_fieldtrials); >+ // 
InitFieldTrialsFromString stores the char*, so the char array must outlive >+ // the application. >+ webrtc::field_trial::InitFieldTrialsFromString(FLAG_force_fieldtrials); >+ > // Abort if the user specifies a port that is outside the allowed > // range [1, 65535]. > if ((FLAG_port < 1) || (FLAG_port > 65535)) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/main_wnd.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/main_wnd.cc >index 3ad248843a815b0a9005682e4ff01a3d04c9899c..8edd1e76f350314de0304035a193cd2101a79388 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/main_wnd.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/main_wnd.cc >@@ -17,14 +17,11 @@ > #include "rtc_base/arraysize.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" >-#include "rtc_base/stringutils.h" > #include "third_party/libyuv/include/libyuv/convert_argb.h" > > ATOM MainWnd::wnd_class_ = 0; > const wchar_t MainWnd::kClassName[] = L"WebRTC_MainWnd"; > >-using rtc::sprintfn; >- > namespace { > > const char kConnecting[] = "Connecting... 
"; >@@ -86,8 +83,8 @@ MainWnd::MainWnd(const char* server, > server_(server), > auto_connect_(auto_connect), > auto_call_(auto_call) { >- char buffer[10] = {0}; >- sprintfn(buffer, sizeof(buffer), "%i", port); >+ char buffer[10]; >+ snprintf(buffer, sizeof(buffer), "%i", port); > port_ = buffer; > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/peer_connection_client.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/peer_connection_client.cc >index bb79d167911198dc556139d9460bd731b37cc778..f173e42793a25aed8c772fec4c5721b1c86d828e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/peer_connection_client.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/client/peer_connection_client.cc >@@ -14,14 +14,11 @@ > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" > #include "rtc_base/nethelpers.h" >-#include "rtc_base/stringutils.h" > > #ifdef WIN32 > #include "rtc_base/win32socketserver.h" > #endif > >-using rtc::sprintfn; >- > namespace { > > // This is our magical hangup signal. 
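The removal of `rtc::sprintfn` in favor of plain `snprintf` in these hunks relies on two standard guarantees: output is truncated to the buffer size and always NUL-terminated (making the old `= {0}` initializer unnecessary), and `size_t` values must be printed with `%zu` rather than `%i` (the Content-Length fix in the following hunk). A self-contained illustration of both points:

```cpp
#include <cassert>
#include <cstddef>
#include <cstdio>
#include <cstring>

// Format a port number into a small buffer, as the MainWnd constructors do.
// std::snprintf always NUL-terminates and truncates to the buffer size.
void FormatPort(char* buf, size_t size, int port) {
  std::snprintf(buf, size, "%i", port);
}

// size_t must be printed with %zu; passing a size_t to "%i" (as the old
// Content-Length line did with message.length()) is undefined behavior.
void FormatContentLength(char* buf, size_t size, size_t length) {
  std::snprintf(buf, size, "Content-Length: %zu\r\n", length);
}
```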
>@@ -136,7 +133,7 @@ void PeerConnectionClient::DoConnect() { > hanging_get_.reset(CreateClientSocket(server_address_.ipaddr().family())); > InitSocketSignals(); > char buffer[1024]; >- sprintfn(buffer, sizeof(buffer), "GET /sign_in?%s HTTP/1.0\r\n\r\n", >+ snprintf(buffer, sizeof(buffer), "GET /sign_in?%s HTTP/1.0\r\n\r\n", > client_name_.c_str()); > onconnect_data_ = buffer; > >@@ -158,9 +155,9 @@ bool PeerConnectionClient::SendToPeer(int peer_id, const std::string& message) { > return false; > > char headers[1024]; >- sprintfn(headers, sizeof(headers), >+ snprintf(headers, sizeof(headers), > "POST /message?peer_id=%i&to=%i HTTP/1.0\r\n" >- "Content-Length: %i\r\n" >+ "Content-Length: %zu\r\n" > "Content-Type: text/plain\r\n" > "\r\n", > my_id_, peer_id, message.length()); >@@ -190,7 +187,7 @@ bool PeerConnectionClient::SignOut() { > > if (my_id_ != -1) { > char buffer[1024]; >- sprintfn(buffer, sizeof(buffer), >+ snprintf(buffer, sizeof(buffer), > "GET /sign_out?peer_id=%i HTTP/1.0\r\n\r\n", my_id_); > onconnect_data_ = buffer; > return ConnectControlSocket(); >@@ -237,7 +234,7 @@ void PeerConnectionClient::OnConnect(rtc::AsyncSocket* socket) { > > void PeerConnectionClient::OnHangingGetConnect(rtc::AsyncSocket* socket) { > char buffer[1024]; >- sprintfn(buffer, sizeof(buffer), "GET /wait?peer_id=%i HTTP/1.0\r\n\r\n", >+ snprintf(buffer, sizeof(buffer), "GET /wait?peer_id=%i HTTP/1.0\r\n\r\n", > my_id_); > int len = static_cast<int>(strlen(buffer)); > int sent = socket->Send(buffer, len); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/server/main.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/server/main.cc >index 075b6ea2d3f7a06f7186876b37155e2e5e4dd542..5214ca5ca551154cf92a1955fece100d1a417189 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/server/main.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/server/main.cc >@@ -18,7 +18,19 @@ > #include 
"examples/peerconnection/server/data_socket.h" > #include "examples/peerconnection/server/peer_channel.h" > #include "examples/peerconnection/server/utils.h" >+#include "rtc_base/flags.h" > #include "rtc_tools/simple_command_line_parser.h" >+#include "system_wrappers/include/field_trial.h" >+#include "test/field_trial.h" >+ >+WEBRTC_DEFINE_string( >+ force_fieldtrials, >+ "", >+ "Field trials control experimental features. This flag specifies the field " >+ "trials in effect. E.g. running with " >+ "--force_fieldtrials=WebRTC-FooFeature/Enabled/ " >+ "will assign the group Enabled to field trial WebRTC-FooFeature. Multiple " >+ "trials are separated by \"/\""); > > static const size_t kMaxConnections = (FD_SETSIZE - 2); > >@@ -62,6 +74,11 @@ int main(int argc, char* argv[]) { > return 0; > } > >+ webrtc::test::ValidateFieldTrialsStringOrDie(FLAG_force_fieldtrials); >+ // InitFieldTrialsFromString stores the char*, so the char array must outlive >+ // the application. >+ webrtc::field_trial::InitFieldTrialsFromString(FLAG_force_fieldtrials); >+ > int port = strtol((parser.GetFlag("port")).c_str(), NULL, 10); > > // Abort if the user specifies a port that is outside the allowed >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/server/peer_channel.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/server/peer_channel.cc >index 6ef08d7b21d73b6f2299adb69ce72b2faf2ef850..b23b7e04e21d6bbc0b9ce5d22075e0735fdf17f2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/server/peer_channel.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/peerconnection/server/peer_channel.cc >@@ -19,9 +19,6 @@ > #include "examples/peerconnection/server/data_socket.h" > #include "examples/peerconnection/server/utils.h" > #include "rtc_base/stringencode.h" >-#include "rtc_base/stringutils.h" >- >-using rtc::sprintfn; > > // Set to the peer id of the originator when messages are being > // exchanged between 
peers, but set to the id of the receiving peer >@@ -98,7 +95,7 @@ std::string ChannelMember::GetEntry() const { > > // name, 11-digit int, 1-digit bool, newline, null > char entry[kMaxNameLength + 15]; >- sprintfn(entry, sizeof(entry), "%s,%d,%d\n", >+ snprintf(entry, sizeof(entry), "%s,%d,%d\n", > name_.substr(0, kMaxNameLength).c_str(), id_, connected_); > return entry; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/stunprober/main.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/stunprober/main.cc >index 45a76f523743e6fafa1be85f449badfa3c7f6855..6da96e469982f05e73baab2b348a2124f54a4707 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/stunprober/main.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/stunprober/main.cc >@@ -14,7 +14,7 @@ > > #include <map> > #include <memory> >-#include <sstream> // no-presubmit-check TODO(webrtc:8982) >+#include <sstream> > > #include "p2p/base/basicpacketsocketfactory.h" > #include "p2p/stunprober/stunprober.h" >@@ -33,16 +33,21 @@ > using stunprober::StunProber; > using stunprober::AsyncCallback; > >-DEFINE_bool(help, false, "Prints this message"); >-DEFINE_int(interval, 10, "Interval of consecutive stun pings in milliseconds"); >-DEFINE_bool(shared_socket, false, "Share socket mode for different remote IPs"); >-DEFINE_int(pings_per_ip, >- 10, >- "Number of consecutive stun pings to send for each IP"); >-DEFINE_int(timeout, >- 1000, >- "Milliseconds of wait after the last ping sent before exiting"); >-DEFINE_string( >+WEBRTC_DEFINE_bool(help, false, "Prints this message"); >+WEBRTC_DEFINE_int(interval, >+ 10, >+ "Interval of consecutive stun pings in milliseconds"); >+WEBRTC_DEFINE_bool(shared_socket, >+ false, >+ "Share socket mode for different remote IPs"); >+WEBRTC_DEFINE_int(pings_per_ip, >+ 10, >+ "Number of consecutive stun pings to send for each IP"); >+WEBRTC_DEFINE_int( >+ timeout, >+ 1000, >+ "Milliseconds of wait after the last ping sent before exiting"); 
>+WEBRTC_DEFINE_string( > servers, > "stun.l.google.com:19302,stun1.l.google.com:19302,stun2.l.google.com:19302", > "Comma separated STUN server addresses with ports"); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/turnserver/read_auth_file.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/turnserver/read_auth_file.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..de5465e82fe8339b01d527cf4642e03a06bf662a >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/turnserver/read_auth_file.cc >@@ -0,0 +1,32 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "examples/turnserver/read_auth_file.h" >+#include "rtc_base/stringencode.h" >+ >+namespace webrtc_examples { >+ >+std::map<std::string, std::string> ReadAuthFile(std::istream* s) { >+ std::map<std::string, std::string> name_to_key; >+ for (std::string line; std::getline(*s, line);) { >+ const size_t sep = line.find('='); >+ if (sep == std::string::npos) >+ continue; >+ char buf[32]; >+ size_t len = rtc::hex_decode(buf, sizeof(buf), line.data() + sep + 1, >+ line.size() - sep - 1); >+ if (len > 0) { >+ name_to_key.emplace(line.substr(0, sep), std::string(buf, len)); >+ } >+ } >+ return name_to_key; >+} >+ >+} // namespace webrtc_examples >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/turnserver/read_auth_file.h b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/turnserver/read_auth_file.h >new file mode 100644 >index 0000000000000000000000000000000000000000..1c139c99240cae9d1a1f7ba483e595c331f1c0a1 >--- /dev/null >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/turnserver/read_auth_file.h >@@ -0,0 +1,24 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef EXAMPLES_TURNSERVER_READ_AUTH_FILE_H_ >+#define EXAMPLES_TURNSERVER_READ_AUTH_FILE_H_ >+ >+#include <istream> >+#include <map> >+#include <string> >+ >+namespace webrtc_examples { >+ >+std::map<std::string, std::string> ReadAuthFile(std::istream* s); >+ >+} // namespace webrtc_examples >+ >+#endif // EXAMPLES_TURNSERVER_READ_AUTH_FILE_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/turnserver/read_auth_file_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/turnserver/read_auth_file_unittest.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..4a6f332c27a57669f4f2b45592ece3f32ac6bc22 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/turnserver/read_auth_file_unittest.cc >@@ -0,0 +1,44 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include <sstream> >+ >+#include "examples/turnserver/read_auth_file.h" >+#include "test/gtest.h" >+ >+namespace webrtc_examples { >+ >+TEST(ReadAuthFile, HandlesEmptyFile) { >+ std::istringstream empty; >+ auto map = ReadAuthFile(&empty); >+ EXPECT_TRUE(map.empty()); >+} >+ >+TEST(ReadAuthFile, RecognizesValidUser) { >+ std::istringstream file("foo=deadbeaf\n"); >+ auto map = ReadAuthFile(&file); >+ ASSERT_NE(map.find("foo"), map.end()); >+ EXPECT_EQ(map["foo"], "\xde\xad\xbe\xaf"); >+} >+ >+TEST(ReadAuthFile, EmptyValueForInvalidHex) { >+ std::istringstream file( >+ "foo=deadbeaf\n" >+ "bar=xxxxinvalidhex\n" >+ "baz=cafe\n"); >+ auto map = ReadAuthFile(&file); >+ ASSERT_NE(map.find("foo"), map.end()); >+ EXPECT_EQ(map["foo"], "\xde\xad\xbe\xaf"); >+ EXPECT_EQ(map.find("bar"), map.end()); >+ ASSERT_NE(map.find("baz"), map.end()); >+ EXPECT_EQ(map["baz"], "\xca\xfe"); >+} >+ >+} // namespace webrtc_examples >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/turnserver/turnserver_main.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/turnserver/turnserver_main.cc >index edc0b699b4c7628d35451e98f0a8e9f943f3736c..850fc2d9c89a215b9e12e9880585ac0419b94f47 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/turnserver/turnserver_main.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/turnserver/turnserver_main.cc >@@ -8,39 +8,44 @@ > * be found in the AUTHORS file in the root of the source tree. 
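The new `ReadAuthFile` helper above parses `<username>=<HA1-hex>` lines into an in-memory map, replacing `rtc::OptionsFile`. A standalone sketch of the same parse-and-decode flow, with a small stand-in for `rtc::hex_decode` (lines with no `=` or invalid hex are skipped, matching the unit tests above):

```cpp
#include <cassert>
#include <istream>
#include <map>
#include <sstream>
#include <string>

// Stand-in for rtc::hex_decode: returns "" on any non-hex or odd-length input.
std::string HexDecode(const std::string& hex) {
  auto nibble = [](char c) -> int {
    if (c >= '0' && c <= '9') return c - '0';
    if (c >= 'a' && c <= 'f') return c - 'a' + 10;
    if (c >= 'A' && c <= 'F') return c - 'A' + 10;
    return -1;
  };
  if (hex.size() % 2 != 0) return "";
  std::string out;
  for (size_t i = 0; i < hex.size(); i += 2) {
    int hi = nibble(hex[i]), lo = nibble(hex[i + 1]);
    if (hi < 0 || lo < 0) return "";
    out.push_back(static_cast<char>((hi << 4) | lo));
  }
  return out;
}

// Same shape as webrtc_examples::ReadAuthFile: one "name=hexkey" per line.
std::map<std::string, std::string> ReadAuthStream(std::istream* s) {
  std::map<std::string, std::string> name_to_key;
  for (std::string line; std::getline(*s, line);) {
    const size_t sep = line.find('=');
    if (sep == std::string::npos) continue;
    std::string key = HexDecode(line.substr(sep + 1));
    if (!key.empty()) name_to_key.emplace(line.substr(0, sep), key);
  }
  return name_to_key;
}
```

Keeping the decoded map in memory lets `TurnFileAuth::GetKey` become a plain map lookup, as the turnserver_main.cc hunk below shows.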
> */ > >-#include <iostream> // NOLINT >+#include <fstream> >+#include <iostream> >+#include <map> >+#include <string> > >+#include "examples/turnserver/read_auth_file.h" > #include "p2p/base/basicpacketsocketfactory.h" > #include "p2p/base/turnserver.h" > #include "rtc_base/asyncudpsocket.h" >-#include "rtc_base/optionsfile.h" > #include "rtc_base/stringencode.h" > #include "rtc_base/thread.h" > >-static const char kSoftware[] = "libjingle TurnServer"; >+namespace { >+const char kSoftware[] = "libjingle TurnServer"; > > class TurnFileAuth : public cricket::TurnAuthInterface { > public: >- explicit TurnFileAuth(const std::string& path) : file_(path) { file_.Load(); } >+ explicit TurnFileAuth(std::map<std::string, std::string> name_to_key) >+ : name_to_key_(std::move(name_to_key)) {} >+ > virtual bool GetKey(const std::string& username, > const std::string& realm, > std::string* key) { > // File is stored as lines of <username>=<HA1>. > // Generate HA1 via "echo -n "<username>:<realm>:<password>" | md5sum" >- std::string hex; >- bool ret = file_.GetStringValue(username, &hex); >- if (ret) { >- char buf[32]; >- size_t len = rtc::hex_decode(buf, sizeof(buf), hex); >- *key = std::string(buf, len); >- } >- return ret; >+ auto it = name_to_key_.find(username); >+ if (it == name_to_key_.end()) >+ return false; >+ *key = it->second; >+ return true; > } > > private: >- rtc::OptionsFile file_; >+ const std::map<std::string, std::string> name_to_key_; > }; > >+} // namespace >+ > int main(int argc, char* argv[]) { > if (argc != 5) { > std::cerr << "usage: turnserver int-addr ext-ip realm auth-file" >@@ -70,7 +75,11 @@ int main(int argc, char* argv[]) { > } > > cricket::TurnServer server(main); >- TurnFileAuth auth(argv[4]); >+ std::fstream auth_file(argv[4], std::fstream::in); >+ >+ TurnFileAuth auth(auth_file.is_open() >+ ? 
webrtc_examples::ReadAuthFile(&auth_file) >+ : std::map<std::string, std::string>()); > server.set_realm(argv[3]); > server.set_software(kSoftware); > server.set_auth_hook(&auth); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/unityplugin/simple_peer_connection.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/unityplugin/simple_peer_connection.cc >index 36e1937ead7d92c4e11256d1768657e3b7fc265e..bf6a6827b03071cf6648b7a1f2948bf48156514b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/examples/unityplugin/simple_peer_connection.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/examples/unityplugin/simple_peer_connection.cc >@@ -15,6 +15,7 @@ > #include "absl/memory/memory.h" > #include "api/audio_codecs/builtin_audio_decoder_factory.h" > #include "api/audio_codecs/builtin_audio_encoder_factory.h" >+#include "api/create_peerconnection_factory.h" > #include "api/videosourceproxy.h" > #include "media/engine/internaldecoderfactory.h" > #include "media/engine/internalencoderfactory.h" >@@ -448,7 +449,8 @@ void SimplePeerConnection::AddStreams(bool audio_only) { > > rtc::scoped_refptr<webrtc::jni::AndroidVideoTrackSource> source( > new rtc::RefCountedObject<webrtc::jni::AndroidVideoTrackSource>( >- g_signaling_thread.get(), env, false)); >+ g_signaling_thread.get(), env, /* is_screencast= */ false, >+ /* align_timestamps= */ true)); > rtc::scoped_refptr<webrtc::VideoTrackSourceProxy> proxy_source = > webrtc::VideoTrackSourceProxy::Create(g_signaling_thread.get(), > g_worker_thread.get(), source); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/infra/config/cq.cfg b/Source/ThirdParty/libwebrtc/Source/webrtc/infra/config/cq.cfg >index c7b5ef923b8ade4266773b1d3908cecf51bc4743..1c0f0228f811d4dc174b383d906f4c3b05cf34ca 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/infra/config/cq.cfg >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/infra/config/cq.cfg >@@ -36,7 +36,6 @@ verifiers { > builders { name: "ios_arm64_rel" } > 
builders { name: "ios_dbg" } > builders { name: "ios_rel" } >- builders { name: "ios32_sim_ios9_dbg" } > builders { name: "ios64_sim_ios9_dbg" } > builders { name: "ios64_sim_ios10_dbg" } > builders { name: "ios_api_framework" } >@@ -63,7 +62,12 @@ verifiers { > builders { name: "mac_chromium_compile" } > builders { name: "mac_compile_dbg" } > builders { name: "mac_rel" } >- builders { name: "presubmit" } >+ builders { >+ name: "presubmit" >+ # Presubmit builder should be re-run every time CQ is triggered >+ # for last minute lint, OWNERS, etc checks. >+ disable_reuse: true >+ } > builders { name: "win_asan" } > builders { name: "win_chromium_compile" } > builders { name: "win_clang_dbg" } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/BUILD.gn >index 3871193badcd7f681077ed70298801b0df51f03d..575952c315f9c3182054c9987237caa39cf5d2b5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/BUILD.gn >@@ -42,6 +42,7 @@ rtc_source_set("rtc_event_log_api") { > "../rtc_base:ptr_util", > "../rtc_base:rtc_base_approved", > "../rtc_base:rtc_task_queue", >+ "//third_party/abseil-cpp/absl/memory", > ] > } > >@@ -53,7 +54,6 @@ rtc_source_set("rtc_stream_config") { > > deps = [ > ":rtc_event_log_api", >- "..:webrtc_common", > "../api:libjingle_peerconnection_api", > ] > } >@@ -87,6 +87,7 @@ rtc_source_set("rtc_event_audio") { > ":rtc_event_log_api", > ":rtc_stream_config", > "../modules/audio_coding:audio_network_adaptor_config", >+ "../rtc_base:checks", > "../rtc_base:ptr_util", > "//third_party/abseil-cpp/absl/memory", > ] >@@ -130,6 +131,7 @@ rtc_source_set("rtc_event_rtp_rtcp") { > ":rtc_event_log_api", > "../api:array_view", > "../modules/rtp_rtcp:rtp_rtcp_format", >+ "../rtc_base:checks", > "../rtc_base:ptr_util", > "../rtc_base:rtc_base_approved", > "//third_party/abseil-cpp/absl/memory", >@@ -147,40 +149,62 @@ 
rtc_source_set("rtc_event_video") { > deps = [ > ":rtc_event_log_api", > ":rtc_stream_config", >+ "../rtc_base:checks", > "../rtc_base:ptr_util", > "//third_party/abseil-cpp/absl/memory", > ] > } > >+# TODO(eladalon): Break down into (1) encoder and (2) decoder; we don't need >+# the decoder code in the WebRTC library, only in unit tests and tools. > rtc_static_library("rtc_event_log_impl_encoder") { > visibility = [ "*" ] > sources = [ >- "rtc_event_log/encoder/rtc_event_log_encoder_legacy.cc", >- "rtc_event_log/encoder/rtc_event_log_encoder_legacy.h", >+ "rtc_event_log/encoder/blob_encoding.cc", >+ "rtc_event_log/encoder/blob_encoding.h", >+ "rtc_event_log/encoder/delta_encoding.cc", >+ "rtc_event_log/encoder/delta_encoding.h", >+ "rtc_event_log/encoder/rtc_event_log_encoder_common.cc", >+ "rtc_event_log/encoder/rtc_event_log_encoder_common.h", >+ "rtc_event_log/encoder/varint.cc", >+ "rtc_event_log/encoder/varint.h", > ] > > defines = [] > > deps = [ >- ":ice_log", >- ":rtc_event_audio", >- ":rtc_event_bwe", >- ":rtc_event_log_api", >- ":rtc_event_log_impl_output", >- ":rtc_event_pacing", >- ":rtc_event_rtp_rtcp", >- ":rtc_event_video", >- ":rtc_stream_config", >- "../modules/audio_coding:audio_network_adaptor", >- "../modules/remote_bitrate_estimator:remote_bitrate_estimator", >- "../modules/rtp_rtcp:rtp_rtcp_format", > "../rtc_base:checks", > "../rtc_base:rtc_base_approved", >+ "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings:strings", >+ "//third_party/abseil-cpp/absl/types:optional", > ] > > if (rtc_enable_protobuf) { > defines += [ "ENABLE_RTC_EVENT_LOG" ] >- deps += [ ":rtc_event_log_proto" ] >+ deps += [ >+ ":ice_log", >+ ":rtc_event_audio", >+ ":rtc_event_bwe", >+ ":rtc_event_log2_proto", >+ ":rtc_event_log_api", >+ ":rtc_event_log_impl_output", >+ ":rtc_event_log_proto", >+ ":rtc_event_pacing", >+ ":rtc_event_rtp_rtcp", >+ ":rtc_event_video", >+ ":rtc_stream_config", >+ "../api:array_view", >+ 
"../modules/audio_coding:audio_network_adaptor", >+ "../modules/remote_bitrate_estimator:remote_bitrate_estimator", >+ "../modules/rtp_rtcp:rtp_rtcp_format", >+ ] >+ sources += [ >+ "rtc_event_log/encoder/rtc_event_log_encoder_legacy.cc", >+ "rtc_event_log/encoder/rtc_event_log_encoder_legacy.h", >+ "rtc_event_log/encoder/rtc_event_log_encoder_new_format.cc", >+ "rtc_event_log/encoder/rtc_event_log_encoder_new_format.h", >+ ] > } > } > >@@ -211,8 +235,7 @@ rtc_static_library("rtc_event_log_impl_base") { > deps = [ > ":ice_log", > ":rtc_event_log_api", >- ":rtc_event_log_impl_encoder", >- ":rtc_event_log_impl_output", >+ "../api:libjingle_logging_api", > "../rtc_base:checks", > "../rtc_base:rtc_base_approved", > "../rtc_base:rtc_task_queue", >@@ -223,7 +246,7 @@ rtc_static_library("rtc_event_log_impl_base") { > > if (rtc_enable_protobuf) { > defines += [ "ENABLE_RTC_EVENT_LOG" ] >- deps += [ ":rtc_event_log_proto" ] >+ deps += [ ":rtc_event_log_impl_encoder" ] > } > } > >@@ -246,6 +269,7 @@ rtc_source_set("fake_rtc_event_log") { > > if (rtc_enable_protobuf) { > proto_library("rtc_event_log_proto") { >+ visibility = [ "*" ] > sources = [ > "rtc_event_log/rtc_event_log.proto", > ] >@@ -253,6 +277,7 @@ if (rtc_enable_protobuf) { > } > > proto_library("rtc_event_log2_proto") { >+ visibility = [ "*" ] > sources = [ > "rtc_event_log/rtc_event_log2.proto", > ] >@@ -262,8 +287,6 @@ if (rtc_enable_protobuf) { > rtc_static_library("rtc_event_log_parser") { > visibility = [ "*" ] > sources = [ >- "rtc_event_log/rtc_event_log_parser.cc", >- "rtc_event_log/rtc_event_log_parser.h", > "rtc_event_log/rtc_event_log_parser_new.cc", > "rtc_event_log/rtc_event_log_parser_new.h", > "rtc_event_log/rtc_event_processor.h", >@@ -274,13 +297,13 @@ if (rtc_enable_protobuf) { > ":rtc_event_bwe", > ":rtc_event_log2_proto", > ":rtc_event_log_api", >+ ":rtc_event_log_impl_encoder", > ":rtc_event_log_proto", > ":rtc_stream_config", >- "..:webrtc_common", > "../api:libjingle_peerconnection_api", > 
"../call:video_stream_api", > "../modules/audio_coding:audio_network_adaptor", >- "../modules/congestion_controller:transport_feedback", >+ "../modules/congestion_controller/rtp:transport_feedback", > "../modules/remote_bitrate_estimator:remote_bitrate_estimator", > "../modules/rtp_rtcp", > "../modules/rtp_rtcp:rtp_rtcp_format", >@@ -298,6 +321,9 @@ if (rtc_enable_protobuf) { > assert(rtc_enable_protobuf) > defines = [ "ENABLE_RTC_EVENT_LOG" ] > sources = [ >+ "rtc_event_log/encoder/blob_encoding_unittest.cc", >+ "rtc_event_log/encoder/delta_encoding_unittest.cc", >+ "rtc_event_log/encoder/rtc_event_log_encoder_common_unittest.cc", > "rtc_event_log/encoder/rtc_event_log_encoder_unittest.cc", > "rtc_event_log/output/rtc_event_log_output_file_unittest.cc", > "rtc_event_log/rtc_event_log_unittest.cc", >@@ -309,6 +335,7 @@ if (rtc_enable_protobuf) { > ":ice_log", > ":rtc_event_audio", > ":rtc_event_bwe", >+ ":rtc_event_log2_proto", > ":rtc_event_log_api", > ":rtc_event_log_impl_base", > ":rtc_event_log_impl_encoder", >@@ -332,6 +359,7 @@ if (rtc_enable_protobuf) { > "../test:test_support", > "//testing/gtest", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/types:optional", > ] > if (!build_with_chromium && is_clang) { > # Suppress warnings from the Chromium Clang plugin (bugs.webrtc.org/163). >@@ -362,49 +390,6 @@ if (rtc_enable_protobuf) { > } > } > } >- >- if (rtc_include_tests) { >- rtc_executable("rtc_event_log2text") { >- testonly = true >- sources = [ >- "rtc_event_log/rtc_event_log2text.cc", >- ] >- deps = [ >- ":rtc_event_log_api", >- ":rtc_event_log_parser", >- "../:webrtc_common", >- "../call:video_stream_api", >- "../modules/rtp_rtcp:rtp_rtcp_format", >- "../rtc_base:checks", >- "../rtc_base:protobuf_utils", >- "../rtc_base:rtc_base_approved", >- "../rtc_base:stringutils", >- >- # TODO(kwiberg): Remove this dependency. 
>- "../api/audio_codecs:audio_codecs_api", >- "../modules/audio_coding:audio_network_adaptor_config", >- "../modules/rtp_rtcp", >- ] >- if (!build_with_chromium && is_clang) { >- # Suppress warnings from the Chromium Clang plugin (bugs.webrtc.org/163). >- suppressed_configs += [ "//build/config/clang:find_bad_constructs" ] >- } >- } >- } >- if (rtc_include_tests) { >- rtc_executable("rtc_event_log2stats") { >- testonly = true >- sources = [ >- "rtc_event_log/rtc_event_log2stats.cc", >- ] >- deps = [ >- ":rtc_event_log_api", >- ":rtc_event_log_proto", >- "../rtc_base:checks", >- "../rtc_base:rtc_base_approved", >- ] >- } >- } > } > > rtc_source_set("ice_log") { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/blob_encoding.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/blob_encoding.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..ee137f13fc1a402f6d808387a76a409494585fe8 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/blob_encoding.cc >@@ -0,0 +1,110 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "logging/rtc_event_log/encoder/blob_encoding.h" >+ >+#include <algorithm> >+ >+#include "logging/rtc_event_log/encoder/varint.h" >+#include "rtc_base/logging.h" >+ >+namespace webrtc { >+ >+std::string EncodeBlobs(const std::vector<std::string>& blobs) { >+ RTC_DCHECK(!blobs.empty()); >+ >+ size_t result_length_bound = kMaxVarIntLengthBytes * blobs.size(); >+ for (const auto& blob : blobs) { >+ // Providing an input so long that it would cause a wrap-around is an error. >+ RTC_DCHECK_GE(result_length_bound + blob.length(), result_length_bound); >+ result_length_bound += blob.length(); >+ } >+ >+ std::string result; >+ result.reserve(result_length_bound); >+ >+ // First, encode all of the lengths. >+ for (absl::string_view blob : blobs) { >+ result += EncodeVarInt(blob.length()); >+ } >+ >+ // Second, encode the actual blobs. >+ for (absl::string_view blob : blobs) { >+ result.append(blob.data(), blob.length()); >+ } >+ >+ RTC_DCHECK_LE(result.size(), result_length_bound); >+ return result; >+} >+ >+std::vector<absl::string_view> DecodeBlobs(absl::string_view encoded_blobs, >+ size_t num_of_blobs) { >+ if (encoded_blobs.empty()) { >+ RTC_LOG(LS_WARNING) << "Corrupt input; empty input."; >+ return std::vector<absl::string_view>(); >+ } >+ >+ if (num_of_blobs == 0u) { >+ RTC_LOG(LS_WARNING) >+ << "Corrupt input; number of blobs must be greater than 0."; >+ return std::vector<absl::string_view>(); >+ } >+ >+ size_t read_idx = 0; >+ >+ // Read the lengths of all blobs. 
>+ std::vector<uint64_t> lengths(num_of_blobs); >+ for (size_t i = 0; i < num_of_blobs; ++i) { >+ if (read_idx >= encoded_blobs.length()) { >+ RTC_DCHECK_EQ(read_idx, encoded_blobs.length()); >+ RTC_LOG(LS_WARNING) << "Corrupt input; excessive number of blobs."; >+ return std::vector<absl::string_view>(); >+ } >+ >+ const size_t read_bytes = >+ DecodeVarInt(encoded_blobs.substr(read_idx), &lengths[i]); >+ if (read_bytes == 0) { >+ RTC_LOG(LS_WARNING) << "Corrupt input; varint decoding failed."; >+ return std::vector<absl::string_view>(); >+ } >+ >+ read_idx += read_bytes; >+ >+ // Note: It might be that read_idx == encoded_blobs.length(), if this >+ // is the last iteration, and all of the blobs are the empty string. >+ RTC_DCHECK_LE(read_idx, encoded_blobs.length()); >+ } >+ >+ // Read the blobs themselves. >+ std::vector<absl::string_view> blobs(num_of_blobs); >+ for (size_t i = 0; i < num_of_blobs; ++i) { >+ if (read_idx + lengths[i] < read_idx) { // Wrap-around detection. >+ RTC_LOG(LS_WARNING) << "Corrupt input; unreasonably large blob sequence."; >+ return std::vector<absl::string_view>(); >+ } >+ >+ if (read_idx + lengths[i] > encoded_blobs.length()) { >+ RTC_LOG(LS_WARNING) << "Corrupt input; blob sizes exceed input size."; >+ return std::vector<absl::string_view>(); >+ } >+ >+ blobs[i] = encoded_blobs.substr(read_idx, lengths[i]); >+ read_idx += lengths[i]; >+ } >+ >+ if (read_idx != encoded_blobs.length()) { >+ RTC_LOG(LS_WARNING) << "Corrupt input; unrecognized trailer."; >+ return std::vector<absl::string_view>(); >+ } >+ >+ return blobs; >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/blob_encoding.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/blob_encoding.h >new file mode 100644 >index 0000000000000000000000000000000000000000..32a10d277f7fd6ae17030a774a03e8b71d960d8b >--- /dev/null >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/blob_encoding.h >@@ -0,0 +1,51 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef LOGGING_RTC_EVENT_LOG_ENCODER_BLOB_ENCODING_H_ >+#define LOGGING_RTC_EVENT_LOG_ENCODER_BLOB_ENCODING_H_ >+ >+#include <string> >+#include <vector> >+ >+#include "absl/strings/string_view.h" >+ >+namespace webrtc { >+ >+// Encode/decode a sequence of strings, whose length is not known to be >+// discernable from the blob itself (i.e. without being transmitted OOB), >+// in a way that would allow us to separate them again on the decoding side. >+// The number of blobs is assumed to be transmitted OOB. For example, if >+// multiple sequences of different blobs are sent, but all sequences contain >+// the same number of blobs, it is beneficial to not encode the number of blobs. >+// >+// EncodeBlobs() must be given a non-empty vector. The blobs themselves may >+// be equal to "", though. >+// EncodeBlobs() may not fail. >+// EncodeBlobs() never returns the empty string. >+// >+// Calling DecodeBlobs() on an empty string, or with |num_of_blobs| set to 0, >+// is an error. >+// DecodeBlobs() returns an empty vector if it fails, e.g. due to a mismatch >+// between |num_of_blobs| and |encoded_blobs|, which can happen if >+// |encoded_blobs| is corrupted. >+// When successful, DecodeBlobs() returns a vector of string_view objects, >+// which refer to the original input (|encoded_blobs|), and therefore may >+// not outlive it. 
>+// >+// Note that the returned std::string might have been reserved for significantly >+// more memory than it ends up using. If the caller to EncodeBlobs() intends >+// to store the result long-term, he should consider shrink_to_fit()-ing it. >+std::string EncodeBlobs(const std::vector<std::string>& blobs); >+std::vector<absl::string_view> DecodeBlobs(absl::string_view encoded_blobs, >+ size_t num_of_blobs); >+ >+} // namespace webrtc >+ >+#endif // LOGGING_RTC_EVENT_LOG_ENCODER_BLOB_ENCODING_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/blob_encoding_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/blob_encoding_unittest.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..1d08e0feb8707f7610738a9b9e2dcdbf5841ae73 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/blob_encoding_unittest.cc >@@ -0,0 +1,152 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "logging/rtc_event_log/encoder/blob_encoding.h" >+ >+#include <string> >+#include <vector> >+ >+#include "logging/rtc_event_log/encoder/varint.h" >+#include "rtc_base/checks.h" >+#include "test/gtest.h" >+ >+using CharT = std::string::value_type; >+ >+namespace webrtc { >+ >+namespace { >+ >+void TestEncodingAndDecoding(const std::vector<std::string>& blobs) { >+ RTC_DCHECK(!blobs.empty()); >+ >+ const std::string encoded = EncodeBlobs(blobs); >+ ASSERT_FALSE(encoded.empty()); >+ >+ const std::vector<absl::string_view> decoded = >+ DecodeBlobs(encoded, blobs.size()); >+ >+ ASSERT_EQ(decoded.size(), blobs.size()); >+ for (size_t i = 0; i < decoded.size(); ++i) { >+ ASSERT_EQ(decoded[i], blobs[i]); >+ } >+} >+ >+void TestGracefulErrorHandling(absl::string_view encoded_blobs, >+ size_t num_of_blobs) { >+ const std::vector<absl::string_view> decoded = >+ DecodeBlobs(encoded_blobs, num_of_blobs); >+ EXPECT_TRUE(decoded.empty()); >+} >+ >+} // namespace >+ >+TEST(BlobEncoding, EmptyBlob) { >+ TestEncodingAndDecoding({""}); >+} >+ >+TEST(BlobEncoding, SingleCharacterBlob) { >+ TestEncodingAndDecoding({"a"}); >+} >+ >+TEST(BlobEncoding, LongBlob) { >+ std::string blob = ""; >+ for (size_t i = 0; i < 100000; ++i) { >+ blob += std::to_string(i + 1) + " Mississippi\n"; >+ } >+ TestEncodingAndDecoding({blob}); >+} >+ >+TEST(BlobEncoding, BlobsOfVariousLengths) { >+ constexpr size_t kJump = 0xf032d; // Arbitrary. >+ constexpr size_t kMax = 0xffffff; // Arbitrary. 
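[Editor's note] The blob-encoding contract exercised by these tests (varint-encoded lengths first, then the raw payloads concatenated, with the blob count carried out-of-band) can be sketched outside the WebRTC tree. This is a simplified, hypothetical re-implementation for illustration only; `EncodeVarIntSketch`, `EncodeBlobsSketch`, and `DecodeBlobsSketch` are local stand-ins, not the patch's actual functions.

```cpp
#include <cassert>
#include <cstdint>
#include <string>
#include <vector>

// Minimal LEB128-style varint, standing in for the length prefixes.
static std::string EncodeVarIntSketch(uint64_t value) {
  std::string out;
  do {
    uint8_t byte = value & 0x7f;
    value >>= 7;
    if (value != 0)
      byte |= 0x80;  // Continuation bit: more bytes follow.
    out.push_back(static_cast<char>(byte));
  } while (value != 0);
  return out;
}

// All lengths first, then all payloads; the blob count travels out-of-band.
static std::string EncodeBlobsSketch(const std::vector<std::string>& blobs) {
  std::string result;
  for (const auto& blob : blobs)
    result += EncodeVarIntSketch(blob.size());
  for (const auto& blob : blobs)
    result += blob;
  return result;
}

// Inverse of the above; returns an empty vector on corrupt input,
// mirroring the graceful-error behavior the unit tests check for.
static std::vector<std::string> DecodeBlobsSketch(const std::string& encoded,
                                                  size_t num_of_blobs) {
  std::vector<uint64_t> lengths;
  size_t idx = 0;
  for (size_t i = 0; i < num_of_blobs; ++i) {
    uint64_t value = 0;
    int shift = 0;
    while (true) {
      if (idx >= encoded.size())
        return {};  // Corrupt input; ran out of bytes mid-varint.
      const uint8_t byte = static_cast<uint8_t>(encoded[idx++]);
      value |= static_cast<uint64_t>(byte & 0x7f) << shift;
      shift += 7;
      if (!(byte & 0x80))
        break;
    }
    lengths.push_back(value);
  }
  std::vector<std::string> blobs;
  for (uint64_t len : lengths) {
    if (idx + len > encoded.size())
      return {};  // Corrupt input; blob sizes exceed input size.
    blobs.push_back(encoded.substr(idx, len));
    idx += len;
  }
  return blobs;
}
```

Round-tripping a mix of empty and non-empty blobs reproduces the input, and a two-character blob encodes with a leading 0x02 length byte, matching the assertions in the unit tests above.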
>+ >+ std::string blob; >+ blob.reserve(kMax); >+ >+ for (size_t i = 0; i < kMax; i += kJump) { >+ blob.append(kJump, 'x'); >+ TestEncodingAndDecoding({blob}); >+ } >+} >+ >+TEST(BlobEncoding, MultipleBlobs) { >+ std::vector<std::string> blobs; >+ for (size_t i = 0; i < 100000; ++i) { >+ blobs.push_back(std::to_string(i + 1) + " Mississippi\n"); >+ } >+ TestEncodingAndDecoding(blobs); >+} >+ >+TEST(BlobEncoding, DecodeBlobsHandlesErrorsGracefullyEmptyInput) { >+ TestGracefulErrorHandling("", 1); >+} >+ >+TEST(BlobEncoding, DecodeBlobsHandlesErrorsGracefullyZeroBlobs) { >+ const std::string encoded = EncodeBlobs({"a"}); >+ ASSERT_FALSE(encoded.empty()); >+ TestGracefulErrorHandling(encoded, 0); >+} >+ >+TEST(BlobEncoding, DecodeBlobsHandlesErrorsGracefullyBlobLengthTooSmall) { >+ std::string encoded = EncodeBlobs({"ab"}); >+ ASSERT_FALSE(encoded.empty()); >+ ASSERT_EQ(encoded[0], 0x02); >+ encoded[0] = 0x01; >+ TestGracefulErrorHandling(encoded, 1); >+} >+ >+TEST(BlobEncoding, DecodeBlobsHandlesErrorsGracefullyBlobLengthTooLarge) { >+ std::string encoded = EncodeBlobs({"a"}); >+ ASSERT_FALSE(encoded.empty()); >+ ASSERT_EQ(encoded[0], 0x01); >+ encoded[0] = 0x02; >+ TestGracefulErrorHandling(encoded, 1); >+} >+ >+TEST(BlobEncoding, >+ DecodeBlobsHandlesErrorsGracefullyNumberOfBlobsIncorrectlyHigh) { >+ const std::vector<std::string> blobs = {"a", "b"}; >+ const std::string encoded = EncodeBlobs(blobs); >+ // Test focus - two empty strings encoded, but DecodeBlobs() told way more >+ // blobs are in the strings than could be expected. >+ TestGracefulErrorHandling(encoded, 1000); >+ >+ // Test sanity - show that DecodeBlobs() would have worked if it got the >+ // correct input. 
>+ TestEncodingAndDecoding(blobs); >+} >+ >+TEST(BlobEncoding, DecodeBlobsHandlesErrorsGracefullyDefectiveVarInt) { >+ std::string defective_varint; >+ for (size_t i = 0; i < kMaxVarIntLengthBytes; ++i) { >+ ASSERT_LE(kMaxVarIntLengthBytes, 0xffu); >+ defective_varint += static_cast<CharT>(static_cast<size_t>(0x80u) | i); >+ } >+ defective_varint += 0x01u; >+ >+ const std::string defective_encoded = defective_varint + "whatever"; >+ >+ TestGracefulErrorHandling(defective_encoded, 1); >+} >+ >+TEST(BlobEncoding, DecodeBlobsHandlesErrorsGracefullyLengthSumWrapAround) { >+ std::string max_size_varint; >+ for (size_t i = 0; i < kMaxVarIntLengthBytes - 1; ++i) { >+ max_size_varint += 0xffu; >+ } >+ max_size_varint += 0x7fu; >+ >+ const std::string defective_encoded = >+ max_size_varint + max_size_varint + "whatever"; >+ >+ TestGracefulErrorHandling(defective_encoded, 2); >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/delta_encoding.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/delta_encoding.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..5e334b98d0009badf43daa4505a3d0e3fa99f338 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/delta_encoding.cc >@@ -0,0 +1,974 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "logging/rtc_event_log/encoder/delta_encoding.h" >+ >+#include <algorithm> >+#include <limits> >+#include <memory> >+#include <utility> >+ >+#include "absl/memory/memory.h" >+#include "logging/rtc_event_log/encoder/varint.h" >+#include "rtc_base/bitbuffer.h" >+#include "rtc_base/checks.h" >+#include "rtc_base/constructormagic.h" >+#include "rtc_base/logging.h" >+#include "rtc_base/numerics/safe_conversions.h" >+ >+namespace webrtc { >+namespace { >+ >+// TODO(eladalon): Only build the decoder in tools and unit tests. >+ >+bool g_force_unsigned_for_testing = false; >+bool g_force_signed_for_testing = false; >+ >+size_t BitsToBytes(size_t bits) { >+ return (bits / 8) + (bits % 8 > 0 ? 1 : 0); >+} >+ >+// TODO(eladalon): Replace by something more efficient. >+uint64_t UnsignedBitWidth(uint64_t input, bool zero_val_as_zero_width = false) { >+ if (zero_val_as_zero_width && input == 0) { >+ return 0; >+ } >+ >+ uint64_t width = 0; >+ do { // input == 0 -> width == 1 >+ width += 1; >+ input >>= 1; >+ } while (input != 0); >+ return width; >+} >+ >+uint64_t SignedBitWidth(uint64_t max_pos_magnitude, >+ uint64_t max_neg_magnitude) { >+ const uint64_t bitwidth_pos = UnsignedBitWidth(max_pos_magnitude, true); >+ const uint64_t bitwidth_neg = >+ (max_neg_magnitude > 0) ? UnsignedBitWidth(max_neg_magnitude - 1, true) >+ : 0; >+ return 1 + std::max(bitwidth_pos, bitwidth_neg); >+} >+ >+// Return the maximum integer of a given bit width. >+// Examples: >+// MaxUnsignedValueOfBitWidth(1) = 0x01 >+// MaxUnsignedValueOfBitWidth(6) = 0x3f >+// MaxUnsignedValueOfBitWidth(8) = 0xff >+// MaxUnsignedValueOfBitWidth(32) = 0xffffffff >+uint64_t MaxUnsignedValueOfBitWidth(uint64_t bit_width) { >+ RTC_DCHECK_GE(bit_width, 1); >+ RTC_DCHECK_LE(bit_width, 64); >+ return (bit_width == 64) ? 
std::numeric_limits<uint64_t>::max() >+ : ((static_cast<uint64_t>(1) << bit_width) - 1); >+} >+ >+// Computes the delta between |previous| and |current|, under the assumption >+// that wrap-around occurs after |width| is exceeded. >+uint64_t UnsignedDelta(uint64_t previous, uint64_t current, uint64_t bit_mask) { >+ return (current - previous) & bit_mask; >+} >+ >+// Determines the encoding type (e.g. fixed-size encoding). >+// Given an encoding type, may also distinguish between some variants of it >+// (e.g. which fields of the fixed-size encoding are explicitly mentioned by >+// the header, and which are implicitly assumed to hold certain default values). >+enum class EncodingType { >+ kFixedSizeUnsignedDeltasNoEarlyWrapNoOpt = 0, >+ kFixedSizeSignedDeltasEarlyWrapAndOptSupported = 1, >+ kReserved1 = 2, >+ kReserved2 = 3, >+ kNumberOfEncodingTypes // Keep last >+}; >+ >+// The width of each field in the encoding header. Note that this is the >+// width in case the field exists; not all fields occur in all encoding types. >+constexpr size_t kBitsInHeaderForEncodingType = 2; >+constexpr size_t kBitsInHeaderForDeltaWidthBits = 6; >+constexpr size_t kBitsInHeaderForSignedDeltas = 1; >+constexpr size_t kBitsInHeaderForValuesOptional = 1; >+constexpr size_t kBitsInHeaderForValueWidthBits = 6; >+ >+static_assert(static_cast<size_t>(EncodingType::kNumberOfEncodingTypes) <= >+ 1 << kBitsInHeaderForEncodingType, >+ "Not all encoding types fit."); >+ >+// Default values for when the encoding header does not specify explicitly. >+constexpr bool kDefaultSignedDeltas = false; >+constexpr bool kDefaultValuesOptional = false; >+constexpr uint64_t kDefaultValueWidthBits = 64; >+ >+// Wrap BitBufferWriter and extend its functionality by (1) keeping track of >+// the number of bits written and (2) owning its buffer. 
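[Editor's note] The wrap-around arithmetic above is the core of the fixed-length scheme: with an n-bit mask, the unsigned delta from `previous` to `current` always reconstructs `current` when added back modulo 2^n, no matter which value is larger. A small self-contained check of that property, with local stand-ins mirroring `MaxUnsignedValueOfBitWidth` and `UnsignedDelta` from the patch:

```cpp
#include <cassert>
#include <cstdint>
#include <limits>

// Maximum integer representable in the given bit width (1..64).
static uint64_t MaxValueOfBitWidth(uint64_t bit_width) {
  return (bit_width == 64) ? std::numeric_limits<uint64_t>::max()
                           : ((static_cast<uint64_t>(1) << bit_width) - 1);
}

// Delta under wrap-around at the masked width, as in the encoder:
// unsigned subtraction wraps, and the mask confines it to the width.
static uint64_t WrappedDelta(uint64_t previous, uint64_t current,
                             uint64_t bit_mask) {
  return (current - previous) & bit_mask;
}
```

For example, at width 8 the delta from 250 to 5 is 11, and (250 + 11) mod 256 recovers 5, which is why the decoder never needs to know whether the sequence wrapped.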
>+class BitWriter final { >+ public: >+ explicit BitWriter(size_t byte_count) >+ : buffer_(byte_count, '\0'), >+ bit_writer_(reinterpret_cast<uint8_t*>(&buffer_[0]), buffer_.size()), >+ written_bits_(0), >+ valid_(true) { >+ RTC_DCHECK_GT(byte_count, 0); >+ } >+ >+ void WriteBits(uint64_t val, size_t bit_count) { >+ RTC_DCHECK(valid_); >+ const bool success = bit_writer_.WriteBits(val, bit_count); >+ RTC_DCHECK(success); >+ written_bits_ += bit_count; >+ } >+ >+ void WriteBits(const std::string& input) { >+ RTC_DCHECK(valid_); >+ for (std::string::value_type c : input) { >+ WriteBits(c, 8 * sizeof(std::string::value_type)); >+ } >+ } >+ >+ // Returns everything that was written so far. >+ // Nothing more may be written after this is called. >+ std::string GetString() { >+ RTC_DCHECK(valid_); >+ valid_ = false; >+ >+ buffer_.resize(BitsToBytes(written_bits_)); >+ written_bits_ = 0; >+ >+ std::string result; >+ std::swap(buffer_, result); >+ return result; >+ } >+ >+ private: >+ std::string buffer_; >+ rtc::BitBufferWriter bit_writer_; >+ // Note: Counting bits instead of bytes wraps around earlier than it has to, >+ // which means the maximum length is lower than it could be. We don't expect >+ // to go anywhere near the limit, though, so this is good enough. >+ size_t written_bits_; >+ bool valid_; >+ >+ RTC_DISALLOW_COPY_AND_ASSIGN(BitWriter); >+}; >+ >+// Parameters for fixed-size delta-encoding/decoding. >+// These are tailored for the sequence which will be encoded (e.g. widths). 
>+class FixedLengthEncodingParameters final { >+ public: >+ static bool ValidParameters(uint64_t delta_width_bits, >+ bool signed_deltas, >+ bool values_optional, >+ uint64_t value_width_bits) { >+ return (1 <= delta_width_bits && delta_width_bits <= 64 && >+ 1 <= value_width_bits && value_width_bits <= 64 && >+ delta_width_bits <= value_width_bits); >+ } >+ >+ FixedLengthEncodingParameters(uint64_t delta_width_bits, >+ bool signed_deltas, >+ bool values_optional, >+ uint64_t value_width_bits) >+ : delta_width_bits_(delta_width_bits), >+ signed_deltas_(signed_deltas), >+ values_optional_(values_optional), >+ value_width_bits_(value_width_bits), >+ delta_mask_(MaxUnsignedValueOfBitWidth(delta_width_bits_)), >+ value_mask_(MaxUnsignedValueOfBitWidth(value_width_bits_)) { >+ RTC_DCHECK(ValidParameters(delta_width_bits, signed_deltas, values_optional, >+ value_width_bits)); >+ } >+ >+ // Number of bits necessary to hold the widest(*) of the deltas between the >+ // values in the sequence. >+ // (*) - Widest might not be the largest, if signed deltas are used. >+ uint64_t delta_width_bits() const { return delta_width_bits_; } >+ >+ // Whether deltas are signed. >+ bool signed_deltas() const { return signed_deltas_; } >+ >+ // Whether the values of the sequence are optional. That is, it may be >+ // that some of them do not have a value (not even a sentinel value indicating >+ // invalidity). >+ bool values_optional() const { return values_optional_; } >+ >+ // Number of bits necessary to hold the largest value in the sequence. >+ uint64_t value_width_bits() const { return value_width_bits_; } >+ >+ // Masks where only the bits relevant to the deltas/values are turned on. 
>+ uint64_t delta_mask() const { return delta_mask_; } >+ uint64_t value_mask() const { return value_mask_; } >+ >+ void SetSignedDeltas(bool signed_deltas) { signed_deltas_ = signed_deltas; } >+ void SetDeltaWidthBits(uint64_t delta_width_bits) { >+ delta_width_bits_ = delta_width_bits; >+ delta_mask_ = MaxUnsignedValueOfBitWidth(delta_width_bits); >+ } >+ >+ private: >+ uint64_t delta_width_bits_; // Normally const, but mutable in tests. >+ bool signed_deltas_; // Normally const, but mutable in tests. >+ const bool values_optional_; >+ const uint64_t value_width_bits_; >+ >+ uint64_t delta_mask_; // Normally const, but mutable in tests. >+ const uint64_t value_mask_; >+}; >+ >+// Performs delta-encoding of a single (non-empty) sequence of values, using >+// an encoding where all deltas are encoded using the same number of bits. >+// (With the exception of optional elements; those are encoded as a bit vector >+// with one bit per element, plus a fixed number of bits for every element that >+// has a value.) >+class FixedLengthDeltaEncoder final { >+ public: >+ // See webrtc::EncodeDeltas() for general details. >+ // This function returns a bit pattern that would allow the decoder to >+ // determine whether it was produced by FixedLengthDeltaEncoder, and can >+ // therefore be decoded by FixedLengthDeltaDecoder, or whether it was produced >+ // by a different encoder. >+ static std::string EncodeDeltas( >+ absl::optional<uint64_t> base, >+ const std::vector<absl::optional<uint64_t>>& values); >+ >+ private: >+ // Calculate min/max values of unsigned/signed deltas, given the bit width >+ // of all the values in the series. >+ static void CalculateMinAndMaxDeltas( >+ absl::optional<uint64_t> base, >+ const std::vector<absl::optional<uint64_t>>& values, >+ uint64_t bit_width, >+ uint64_t* max_unsigned_delta, >+ uint64_t* max_pos_signed_delta, >+ uint64_t* min_neg_signed_delta); >+ >+ // No effect outside of unit tests. 
>+ // In unit tests, may lead to forcing signed/unsigned deltas, etc. >+ static void ConsiderTestOverrides(FixedLengthEncodingParameters* params, >+ uint64_t delta_width_bits_signed, >+ uint64_t delta_width_bits_unsigned); >+ >+ // FixedLengthDeltaEncoder objects are to be created by EncodeDeltas() and >+ // released by it before it returns. They're mostly a convenient way to >+ // avoid having to pass a lot of state between different functions. >+ // Therefore, it was deemed acceptable to let them have a reference to >+ // |values|, whose lifetime must exceed the lifetime of |this|. >+ FixedLengthDeltaEncoder(const FixedLengthEncodingParameters& params, >+ absl::optional<uint64_t> base, >+ const std::vector<absl::optional<uint64_t>>& values, >+ size_t existent_values_count); >+ >+ // Perform delta-encoding using the parameters given to the ctor on the >+ // sequence of values given to the ctor. >+ std::string Encode(); >+ >+ // Exact lengths. >+ size_t OutputLengthBytes(size_t existent_values_count) const; >+ size_t HeaderLengthBits() const; >+ size_t EncodedDeltasLengthBits(size_t existent_values_count) const; >+ >+ // Encode the compression parameters into the stream. >+ void EncodeHeader(); >+ >+ // Encode a given delta into the stream. >+ void EncodeDelta(uint64_t previous, uint64_t current); >+ void EncodeUnsignedDelta(uint64_t previous, uint64_t current); >+ void EncodeSignedDelta(uint64_t previous, uint64_t current); >+ >+ // The parameters according to which encoding will be done (width of >+ // fields, whether signed deltas should be used, etc.) >+ const FixedLengthEncodingParameters params_; >+ >+ // The encoding scheme assumes that at least one value is transmitted OOB, >+ // so that the first value can be encoded as a delta from that OOB value, >+ // which is |base_|. >+ const absl::optional<uint64_t> base_; >+ >+ // The values to be encoded. >+ // Note: This is a non-owning reference. See comment above ctor for details. 
>+ const std::vector<absl::optional<uint64_t>>& values_; >+ >+ // Buffer into which encoded values will be written. >+ // This is created dynamically as a way to enforce that the rest of the >+ // ctor has finished running when this is constructed, so that the lower >+ // bound on the buffer size would be guaranteed correct. >+ std::unique_ptr<BitWriter> writer_; >+ >+ RTC_DISALLOW_COPY_AND_ASSIGN(FixedLengthDeltaEncoder); >+}; >+ >+// TODO(eladalon): Reduce the number of passes. >+std::string FixedLengthDeltaEncoder::EncodeDeltas( >+ absl::optional<uint64_t> base, >+ const std::vector<absl::optional<uint64_t>>& values) { >+ RTC_DCHECK(!values.empty()); >+ >+ // As a special case, if all of the elements are identical to the base, >+ // (including, for optional fields, about their existence/non-existence), >+ // the empty string is used to signal that. >+ if (std::all_of( >+ values.cbegin(), values.cend(), >+ [base](absl::optional<uint64_t> val) { return val == base; })) { >+ return std::string(); >+ } >+ >+ bool non_decreasing = true; >+ uint64_t max_value_including_base = base.value_or(0u); >+ size_t existent_values_count = 0; >+ { >+ uint64_t previous = base.value_or(0u); >+ for (size_t i = 0; i < values.size(); ++i) { >+ if (!values[i].has_value()) { >+ continue; >+ } >+ ++existent_values_count; >+ non_decreasing &= (previous <= values[i].value()); >+ max_value_including_base = >+ std::max(max_value_including_base, values[i].value()); >+ previous = values[i].value(); >+ } >+ } >+ >+ // If the sequence is non-decreasing, it may be assumed to have width = 64; >+ // there's no reason to encode the actual max width in the encoding header. >+ const uint64_t value_width_bits = >+ non_decreasing ?
64 : UnsignedBitWidth(max_value_including_base); >+ >+ uint64_t max_unsigned_delta; >+ uint64_t max_pos_signed_delta; >+ uint64_t min_neg_signed_delta; >+ CalculateMinAndMaxDeltas(base, values, value_width_bits, &max_unsigned_delta, >+ &max_pos_signed_delta, &min_neg_signed_delta); >+ >+ const uint64_t delta_width_bits_unsigned = >+ UnsignedBitWidth(max_unsigned_delta); >+ const uint64_t delta_width_bits_signed = >+ SignedBitWidth(max_pos_signed_delta, min_neg_signed_delta); >+ >+ // Note: Preference for unsigned if the two have the same width (efficiency). >+ const bool signed_deltas = >+ delta_width_bits_signed < delta_width_bits_unsigned; >+ const uint64_t delta_width_bits = >+ signed_deltas ? delta_width_bits_signed : delta_width_bits_unsigned; >+ >+ const bool values_optional = >+ !base.has_value() || (existent_values_count < values.size()); >+ >+ FixedLengthEncodingParameters params(delta_width_bits, signed_deltas, >+ values_optional, value_width_bits); >+ >+ // No effect in production. 
>+ ConsiderTestOverrides(&params, delta_width_bits_signed,
>+ delta_width_bits_unsigned);
>+
>+ FixedLengthDeltaEncoder encoder(params, base, values, existent_values_count);
>+ return encoder.Encode();
>+}
>+
>+void FixedLengthDeltaEncoder::CalculateMinAndMaxDeltas(
>+ absl::optional<uint64_t> base,
>+ const std::vector<absl::optional<uint64_t>>& values,
>+ uint64_t bit_width,
>+ uint64_t* max_unsigned_delta_out,
>+ uint64_t* max_pos_signed_delta_out,
>+ uint64_t* min_neg_signed_delta_out) {
>+ RTC_DCHECK(!values.empty());
>+ RTC_DCHECK(max_unsigned_delta_out);
>+ RTC_DCHECK(max_pos_signed_delta_out);
>+ RTC_DCHECK(min_neg_signed_delta_out);
>+
>+ const uint64_t bit_mask = MaxUnsignedValueOfBitWidth(bit_width);
>+
>+ uint64_t max_unsigned_delta = 0;
>+ uint64_t max_pos_signed_delta = 0;
>+ uint64_t min_neg_signed_delta = 0;
>+
>+ absl::optional<uint64_t> prev = base;
>+ for (size_t i = 0; i < values.size(); ++i) {
>+ if (!values[i].has_value()) {
>+ continue;
>+ }
>+
>+ if (!prev.has_value()) {
>+ // If the base is non-existent, the first existent value is encoded as
>+ // a varint, rather than as a delta. 
>+ RTC_DCHECK(!base.has_value()); >+ prev = values[i]; >+ continue; >+ } >+ >+ const uint64_t current = values[i].value(); >+ >+ const uint64_t forward_delta = UnsignedDelta(*prev, current, bit_mask); >+ const uint64_t backward_delta = UnsignedDelta(current, *prev, bit_mask); >+ >+ max_unsigned_delta = std::max(max_unsigned_delta, forward_delta); >+ >+ if (forward_delta < backward_delta) { >+ max_pos_signed_delta = std::max(max_pos_signed_delta, forward_delta); >+ } else { >+ min_neg_signed_delta = std::max(min_neg_signed_delta, backward_delta); >+ } >+ >+ prev = current; >+ } >+ >+ *max_unsigned_delta_out = max_unsigned_delta; >+ *max_pos_signed_delta_out = max_pos_signed_delta; >+ *min_neg_signed_delta_out = min_neg_signed_delta; >+} >+ >+void FixedLengthDeltaEncoder::ConsiderTestOverrides( >+ FixedLengthEncodingParameters* params, >+ uint64_t delta_width_bits_signed, >+ uint64_t delta_width_bits_unsigned) { >+ if (g_force_unsigned_for_testing) { >+ params->SetDeltaWidthBits(delta_width_bits_unsigned); >+ params->SetSignedDeltas(false); >+ } else if (g_force_signed_for_testing) { >+ params->SetDeltaWidthBits(delta_width_bits_signed); >+ params->SetSignedDeltas(true); >+ } else { >+ // Unchanged. >+ } >+} >+ >+FixedLengthDeltaEncoder::FixedLengthDeltaEncoder( >+ const FixedLengthEncodingParameters& params, >+ absl::optional<uint64_t> base, >+ const std::vector<absl::optional<uint64_t>>& values, >+ size_t existent_values_count) >+ : params_(params), base_(base), values_(values) { >+ RTC_DCHECK(!values_.empty()); >+ writer_ = >+ absl::make_unique<BitWriter>(OutputLengthBytes(existent_values_count)); >+} >+ >+std::string FixedLengthDeltaEncoder::Encode() { >+ EncodeHeader(); >+ >+ if (params_.values_optional()) { >+ // Encode which values exist and which don't. >+ for (absl::optional<uint64_t> value : values_) { >+ writer_->WriteBits(value.has_value() ? 
1u : 0u, 1); >+ } >+ } >+ >+ absl::optional<uint64_t> previous = base_; >+ for (absl::optional<uint64_t> value : values_) { >+ if (!value.has_value()) { >+ RTC_DCHECK(params_.values_optional()); >+ continue; >+ } >+ >+ if (!previous.has_value()) { >+ // If the base is non-existent, the first existent value is encoded as >+ // a varint, rather than as a delta. >+ RTC_DCHECK(!base_.has_value()); >+ writer_->WriteBits(EncodeVarInt(value.value())); >+ } else { >+ EncodeDelta(previous.value(), value.value()); >+ } >+ >+ previous = value; >+ } >+ >+ return writer_->GetString(); >+} >+ >+size_t FixedLengthDeltaEncoder::OutputLengthBytes( >+ size_t existent_values_count) const { >+ return BitsToBytes(HeaderLengthBits() + >+ EncodedDeltasLengthBits(existent_values_count)); >+} >+ >+size_t FixedLengthDeltaEncoder::HeaderLengthBits() const { >+ if (params_.signed_deltas() == kDefaultSignedDeltas && >+ params_.values_optional() == kDefaultValuesOptional && >+ params_.value_width_bits() == kDefaultValueWidthBits) { >+ return kBitsInHeaderForEncodingType + kBitsInHeaderForDeltaWidthBits; >+ } else { >+ return kBitsInHeaderForEncodingType + kBitsInHeaderForDeltaWidthBits + >+ kBitsInHeaderForSignedDeltas + kBitsInHeaderForValuesOptional + >+ kBitsInHeaderForValueWidthBits; >+ } >+} >+ >+size_t FixedLengthDeltaEncoder::EncodedDeltasLengthBits( >+ size_t existent_values_count) const { >+ if (!params_.values_optional()) { >+ return values_.size() * params_.delta_width_bits(); >+ } else { >+ RTC_DCHECK_EQ(std::count_if(values_.begin(), values_.end(), >+ [](absl::optional<uint64_t> val) { >+ return val.has_value(); >+ }), >+ existent_values_count); >+ // One bit for each delta, to indicate if the value exists, and delta_width >+ // for each existent value, to indicate the delta itself. >+ // If base_ is non-existent, the first value (if any) is encoded as a varint >+ // rather than as a delta. 
>+ const size_t existence_bitmap_size_bits = 1 * values_.size();
>+ const bool first_value_is_varint =
>+ !base_.has_value() && existent_values_count >= 1;
>+ const size_t first_value_varint_size_bits = 8 * kMaxVarIntLengthBytes;
>+ const size_t deltas_count = existent_values_count - first_value_is_varint;
>+ const size_t deltas_size_bits = deltas_count * params_.delta_width_bits();
>+ return existence_bitmap_size_bits + first_value_varint_size_bits +
>+ deltas_size_bits;
>+ }
>+}
>+
>+void FixedLengthDeltaEncoder::EncodeHeader() {
>+ RTC_DCHECK(writer_);
>+
>+ const EncodingType encoding_type =
>+ (params_.value_width_bits() == kDefaultValueWidthBits &&
>+ params_.signed_deltas() == kDefaultSignedDeltas &&
>+ params_.values_optional() == kDefaultValuesOptional)
>+ ? EncodingType::kFixedSizeUnsignedDeltasNoEarlyWrapNoOpt
>+ : EncodingType::kFixedSizeSignedDeltasEarlyWrapAndOptSupported;
>+
>+ writer_->WriteBits(static_cast<uint64_t>(encoding_type),
>+ kBitsInHeaderForEncodingType);
>+
>+ // Note: Since it's meaningless for a field to be of width 0, when it comes
>+ // to fields that relate widths, we encode width 1 as 0, width 2 as 1,
>+ // and so on.
>+
>+ writer_->WriteBits(params_.delta_width_bits() - 1,
>+ kBitsInHeaderForDeltaWidthBits);
>+
>+ if (encoding_type == EncodingType::kFixedSizeUnsignedDeltasNoEarlyWrapNoOpt) {
>+ return;
>+ }
>+
>+ writer_->WriteBits(static_cast<uint64_t>(params_.signed_deltas()),
>+ kBitsInHeaderForSignedDeltas);
>+ writer_->WriteBits(static_cast<uint64_t>(params_.values_optional()),
>+ kBitsInHeaderForValuesOptional);
>+ writer_->WriteBits(params_.value_width_bits() - 1,
>+ kBitsInHeaderForValueWidthBits);
>+}
>+
>+void FixedLengthDeltaEncoder::EncodeDelta(uint64_t previous, uint64_t current) {
>+ if (params_.signed_deltas()) {
>+ EncodeSignedDelta(previous, current);
>+ } else {
>+ EncodeUnsignedDelta(previous, current);
>+ }
>+}
>+
>+void FixedLengthDeltaEncoder::EncodeUnsignedDelta(uint64_t previous,
>+ uint64_t current) {
>+ 
RTC_DCHECK(writer_);
>+ const uint64_t delta = UnsignedDelta(previous, current, params_.value_mask());
>+ writer_->WriteBits(delta, params_.delta_width_bits());
>+}
>+
>+void FixedLengthDeltaEncoder::EncodeSignedDelta(uint64_t previous,
>+ uint64_t current) {
>+ RTC_DCHECK(writer_);
>+
>+ const uint64_t forward_delta =
>+ UnsignedDelta(previous, current, params_.value_mask());
>+ const uint64_t backward_delta =
>+ UnsignedDelta(current, previous, params_.value_mask());
>+
>+ uint64_t delta;
>+ if (forward_delta <= backward_delta) {
>+ delta = forward_delta;
>+ } else {
>+ // Compute the unsigned representation of a negative delta.
>+ // This is the two's complement representation of this negative value,
>+ // when deltas are of width params_.delta_mask().
>+ RTC_DCHECK_GE(params_.delta_mask(), backward_delta);
>+ RTC_DCHECK_LT(params_.delta_mask() - backward_delta, params_.delta_mask());
>+ delta = params_.delta_mask() - backward_delta + 1;
>+ RTC_DCHECK_LE(delta, params_.delta_mask());
>+ }
>+
>+ writer_->WriteBits(delta, params_.delta_width_bits());
>+}
>+
>+// Perform decoding of a delta-encoded stream, extracting the original
>+// sequence of values.
>+class FixedLengthDeltaDecoder final {
>+ public:
>+ // Checks whether FixedLengthDeltaDecoder is a suitable decoder for this
>+ // bitstream. Note that this does NOT imply that the stream is valid, and will
>+ // be decoded successfully. It DOES imply that all other decoder classes
>+ // will fail to decode this input, though.
>+ static bool IsSuitableDecoderFor(const std::string& input);
>+
>+ // Assuming that |input| is the result of fixed-size delta-encoding
>+ // that took place with the same value of |base| and over |num_of_deltas|
>+ // original values, this will return the sequence of original values.
>+ // If an error occurs (can happen if |input| is corrupt), an empty
>+ // vector will be returned. 
>+ static std::vector<absl::optional<uint64_t>> DecodeDeltas( >+ const std::string& input, >+ absl::optional<uint64_t> base, >+ size_t num_of_deltas); >+ >+ private: >+ // Reads the encoding header in |input| and returns a FixedLengthDeltaDecoder >+ // with the corresponding configuration, that can be used to decode the >+ // values in |input|. >+ // If the encoding header is corrupt (contains an illegal configuration), >+ // nullptr will be returned. >+ // When a valid FixedLengthDeltaDecoder is returned, this does not mean that >+ // the entire stream is free of error. Rather, only the encoding header is >+ // examined and guaranteed. >+ static std::unique_ptr<FixedLengthDeltaDecoder> Create( >+ const std::string& input, >+ absl::optional<uint64_t> base, >+ size_t num_of_deltas); >+ >+ // FixedLengthDeltaDecoder objects are to be created by DecodeDeltas() and >+ // released by it before it returns. They're mostly a convenient way to >+ // avoid having to pass a lot of state between different functions. >+ // Therefore, it was deemed acceptable that |reader| does not own the buffer >+ // it reads, meaning the lifetime of |this| must not exceed the lifetime >+ // of |reader|'s underlying buffer. >+ FixedLengthDeltaDecoder(std::unique_ptr<rtc::BitBuffer> reader, >+ const FixedLengthEncodingParameters& params, >+ absl::optional<uint64_t> base, >+ size_t num_of_deltas); >+ >+ // Perform the decoding using the parameters given to the ctor. >+ std::vector<absl::optional<uint64_t>> Decode(); >+ >+ // Decode a varint and write it to |output|. Return value indicates success >+ // or failure. In case of failure, no guarantees are made about the contents >+ // of |output| or the results of additional reads. >+ bool ParseVarInt(uint64_t* output); >+ >+ // Attempt to parse a delta from the input reader. >+ // Returns true/false for success/failure. >+ // Writes the delta into |delta| if successful. 
>+ bool ParseDelta(uint64_t* delta);
>+
>+ // Add |delta| to |base| to produce the next value in a sequence.
>+ // The delta is applied as signed/unsigned depending on the parameters
>+ // given to the ctor. Wrap-around is taken into account according to the
>+ // values' width, as specified by the aforementioned encoding parameters.
>+ uint64_t ApplyDelta(uint64_t base, uint64_t delta) const;
>+
>+ // Helpers for ApplyDelta().
>+ uint64_t ApplyUnsignedDelta(uint64_t base, uint64_t delta) const;
>+ uint64_t ApplySignedDelta(uint64_t base, uint64_t delta) const;
>+
>+ // Reader of the input stream to be decoded. Does not own that buffer.
>+ // See comment above ctor for details.
>+ const std::unique_ptr<rtc::BitBuffer> reader_;
>+
>+ // The parameters according to which decoding will be done (width of
>+ // fields, whether signed deltas should be used, etc.)
>+ const FixedLengthEncodingParameters params_;
>+
>+ // The encoding scheme assumes that at least one value is transmitted OOB,
>+ // so that the first value can be encoded as a delta from that OOB value,
>+ // which is |base_|.
>+ const absl::optional<uint64_t> base_;
>+
>+ // The number of values to be decoded. 
>+ const size_t num_of_deltas_; >+ >+ RTC_DISALLOW_COPY_AND_ASSIGN(FixedLengthDeltaDecoder); >+}; >+ >+bool FixedLengthDeltaDecoder::IsSuitableDecoderFor(const std::string& input) { >+ if (input.length() < kBitsInHeaderForEncodingType) { >+ return false; >+ } >+ >+ rtc::BitBuffer reader(reinterpret_cast<const uint8_t*>(&input[0]), >+ kBitsInHeaderForEncodingType); >+ >+ uint32_t encoding_type_bits; >+ const bool result = >+ reader.ReadBits(&encoding_type_bits, kBitsInHeaderForEncodingType); >+ RTC_DCHECK(result); >+ >+ const auto encoding_type = static_cast<EncodingType>(encoding_type_bits); >+ return encoding_type == >+ EncodingType::kFixedSizeUnsignedDeltasNoEarlyWrapNoOpt || >+ encoding_type == >+ EncodingType::kFixedSizeSignedDeltasEarlyWrapAndOptSupported; >+} >+ >+std::vector<absl::optional<uint64_t>> FixedLengthDeltaDecoder::DecodeDeltas( >+ const std::string& input, >+ absl::optional<uint64_t> base, >+ size_t num_of_deltas) { >+ auto decoder = FixedLengthDeltaDecoder::Create(input, base, num_of_deltas); >+ if (!decoder) { >+ return std::vector<absl::optional<uint64_t>>(); >+ } >+ >+ return decoder->Decode(); >+} >+ >+std::unique_ptr<FixedLengthDeltaDecoder> FixedLengthDeltaDecoder::Create( >+ const std::string& input, >+ absl::optional<uint64_t> base, >+ size_t num_of_deltas) { >+ if (input.length() < kBitsInHeaderForEncodingType) { >+ return nullptr; >+ } >+ >+ auto reader = absl::make_unique<rtc::BitBuffer>( >+ reinterpret_cast<const uint8_t*>(&input[0]), input.length()); >+ >+ // Encoding type >+ uint32_t encoding_type_bits; >+ const bool result = >+ reader->ReadBits(&encoding_type_bits, kBitsInHeaderForEncodingType); >+ RTC_DCHECK(result); >+ const EncodingType encoding = static_cast<EncodingType>(encoding_type_bits); >+ if (encoding != EncodingType::kFixedSizeUnsignedDeltasNoEarlyWrapNoOpt && >+ encoding != >+ EncodingType::kFixedSizeSignedDeltasEarlyWrapAndOptSupported) { >+ RTC_LOG(LS_WARNING) << "Unrecognized encoding type."; >+ return nullptr; >+ } 
>+ >+ uint32_t read_buffer; >+ >+ // delta_width_bits >+ if (!reader->ReadBits(&read_buffer, kBitsInHeaderForDeltaWidthBits)) { >+ return nullptr; >+ } >+ RTC_DCHECK_LE(read_buffer, 64 - 1); // See encoding for -1's rationale. >+ const uint64_t delta_width_bits = >+ read_buffer + 1; // See encoding for +1's rationale. >+ >+ // signed_deltas, values_optional, value_width_bits >+ bool signed_deltas; >+ bool values_optional; >+ uint64_t value_width_bits; >+ if (encoding == EncodingType::kFixedSizeUnsignedDeltasNoEarlyWrapNoOpt) { >+ signed_deltas = kDefaultSignedDeltas; >+ values_optional = kDefaultValuesOptional; >+ value_width_bits = kDefaultValueWidthBits; >+ } else { >+ // signed_deltas >+ if (!reader->ReadBits(&read_buffer, kBitsInHeaderForSignedDeltas)) { >+ return nullptr; >+ } >+ signed_deltas = rtc::dchecked_cast<bool>(read_buffer); >+ >+ // values_optional >+ if (!reader->ReadBits(&read_buffer, kBitsInHeaderForValuesOptional)) { >+ return nullptr; >+ } >+ RTC_DCHECK_LE(read_buffer, 1); >+ values_optional = rtc::dchecked_cast<bool>(read_buffer); >+ >+ // value_width_bits >+ if (!reader->ReadBits(&read_buffer, kBitsInHeaderForValueWidthBits)) { >+ return nullptr; >+ } >+ RTC_DCHECK_LE(read_buffer, 64 - 1); // See encoding for -1's rationale. >+ value_width_bits = read_buffer + 1; // See encoding for +1's rationale. >+ } >+ >+ // Note: Because of the way the parameters are read, it is not possible >+ // for illegal values to be read. We check nevertheless, in case the code >+ // changes in the future in a way that breaks this promise. 
>+ if (!FixedLengthEncodingParameters::ValidParameters( >+ delta_width_bits, signed_deltas, values_optional, value_width_bits)) { >+ RTC_LOG(LS_WARNING) << "Corrupt log; illegal encoding parameters."; >+ return nullptr; >+ } >+ >+ FixedLengthEncodingParameters params(delta_width_bits, signed_deltas, >+ values_optional, value_width_bits); >+ return absl::WrapUnique(new FixedLengthDeltaDecoder(std::move(reader), params, >+ base, num_of_deltas)); >+} >+ >+FixedLengthDeltaDecoder::FixedLengthDeltaDecoder( >+ std::unique_ptr<rtc::BitBuffer> reader, >+ const FixedLengthEncodingParameters& params, >+ absl::optional<uint64_t> base, >+ size_t num_of_deltas) >+ : reader_(std::move(reader)), >+ params_(params), >+ base_(base), >+ num_of_deltas_(num_of_deltas) { >+ RTC_DCHECK(reader_); >+} >+ >+std::vector<absl::optional<uint64_t>> FixedLengthDeltaDecoder::Decode() { >+ RTC_DCHECK(reader_); >+ >+ std::vector<bool> existing_values(num_of_deltas_); >+ if (params_.values_optional()) { >+ for (size_t i = 0; i < num_of_deltas_; ++i) { >+ uint32_t exists; >+ if (!reader_->ReadBits(&exists, 1u)) { >+ RTC_LOG(LS_WARNING) << "Failed to read existence-indicating bit."; >+ return std::vector<absl::optional<uint64_t>>(); >+ } >+ RTC_DCHECK_LE(exists, 1u); >+ existing_values[i] = (exists == 1); >+ } >+ } else { >+ std::fill(existing_values.begin(), existing_values.end(), true); >+ } >+ >+ absl::optional<uint64_t> previous = base_; >+ std::vector<absl::optional<uint64_t>> values(num_of_deltas_); >+ >+ for (size_t i = 0; i < num_of_deltas_; ++i) { >+ if (!existing_values[i]) { >+ RTC_DCHECK(params_.values_optional()); >+ continue; >+ } >+ >+ if (!previous) { >+ // If the base is non-existent, the first existent value is encoded as >+ // a varint, rather than as a delta. 
>+ RTC_DCHECK(!base_.has_value()); >+ uint64_t first_value; >+ if (!ParseVarInt(&first_value)) { >+ RTC_LOG(LS_WARNING) << "Failed to read first value."; >+ return std::vector<absl::optional<uint64_t>>(); >+ } >+ values[i] = first_value; >+ } else { >+ uint64_t delta; >+ if (!ParseDelta(&delta)) { >+ return std::vector<absl::optional<uint64_t>>(); >+ } >+ values[i] = ApplyDelta(previous.value(), delta); >+ } >+ >+ previous = values[i]; >+ } >+ >+ return values; >+} >+ >+bool FixedLengthDeltaDecoder::ParseVarInt(uint64_t* output) { >+ RTC_DCHECK(reader_); >+ return DecodeVarInt(reader_.get(), output) != 0; >+} >+ >+bool FixedLengthDeltaDecoder::ParseDelta(uint64_t* delta) { >+ RTC_DCHECK(reader_); >+ >+ // BitBuffer and BitBufferWriter read/write higher bits before lower bits. >+ >+ const size_t lower_bit_count = >+ std::min<uint64_t>(params_.delta_width_bits(), 32u); >+ const size_t higher_bit_count = (params_.delta_width_bits() <= 32u) >+ ? 0 >+ : params_.delta_width_bits() - 32u; >+ >+ uint32_t lower_bits; >+ uint32_t higher_bits; >+ >+ if (higher_bit_count > 0) { >+ if (!reader_->ReadBits(&higher_bits, higher_bit_count)) { >+ RTC_LOG(LS_WARNING) << "Failed to read higher half of delta."; >+ return false; >+ } >+ } else { >+ higher_bits = 0; >+ } >+ >+ if (!reader_->ReadBits(&lower_bits, lower_bit_count)) { >+ RTC_LOG(LS_WARNING) << "Failed to read lower half of delta."; >+ return false; >+ } >+ >+ const uint64_t lower_bits_64 = static_cast<uint64_t>(lower_bits); >+ const uint64_t higher_bits_64 = static_cast<uint64_t>(higher_bits); >+ >+ *delta = (higher_bits_64 << 32) | lower_bits_64; >+ return true; >+} >+ >+uint64_t FixedLengthDeltaDecoder::ApplyDelta(uint64_t base, >+ uint64_t delta) const { >+ RTC_DCHECK_LE(base, MaxUnsignedValueOfBitWidth(params_.value_width_bits())); >+ RTC_DCHECK_LE(delta, MaxUnsignedValueOfBitWidth(params_.delta_width_bits())); >+ return params_.signed_deltas() ? 
ApplySignedDelta(base, delta) >+ : ApplyUnsignedDelta(base, delta); >+} >+ >+uint64_t FixedLengthDeltaDecoder::ApplyUnsignedDelta(uint64_t base, >+ uint64_t delta) const { >+ // Note: May still be used if signed deltas used. >+ RTC_DCHECK_LE(base, MaxUnsignedValueOfBitWidth(params_.value_width_bits())); >+ RTC_DCHECK_LE(delta, MaxUnsignedValueOfBitWidth(params_.delta_width_bits())); >+ return (base + delta) & params_.value_mask(); >+} >+ >+uint64_t FixedLengthDeltaDecoder::ApplySignedDelta(uint64_t base, >+ uint64_t delta) const { >+ RTC_DCHECK(params_.signed_deltas()); >+ RTC_DCHECK_LE(base, MaxUnsignedValueOfBitWidth(params_.value_width_bits())); >+ RTC_DCHECK_LE(delta, MaxUnsignedValueOfBitWidth(params_.delta_width_bits())); >+ >+ const uint64_t top_bit = static_cast<uint64_t>(1) >+ << (params_.delta_width_bits() - 1); >+ >+ const bool positive_delta = ((delta & top_bit) == 0); >+ if (positive_delta) { >+ return ApplyUnsignedDelta(base, delta); >+ } >+ >+ const uint64_t delta_abs = (~delta & params_.delta_mask()) + 1; >+ return (base - delta_abs) & params_.value_mask(); >+} >+ >+} // namespace >+ >+std::string EncodeDeltas(absl::optional<uint64_t> base, >+ const std::vector<absl::optional<uint64_t>>& values) { >+ // TODO(eladalon): Support additional encodings. >+ return FixedLengthDeltaEncoder::EncodeDeltas(base, values); >+} >+ >+std::vector<absl::optional<uint64_t>> DecodeDeltas( >+ const std::string& input, >+ absl::optional<uint64_t> base, >+ size_t num_of_deltas) { >+ RTC_DCHECK_GT(num_of_deltas, 0); // Allows empty vector to indicate error. >+ >+ // The empty string is a special case indicating that all values were equal >+ // to the base. 
>+ if (input.empty()) { >+ std::vector<absl::optional<uint64_t>> result(num_of_deltas); >+ std::fill(result.begin(), result.end(), base); >+ return result; >+ } >+ >+ if (FixedLengthDeltaDecoder::IsSuitableDecoderFor(input)) { >+ return FixedLengthDeltaDecoder::DecodeDeltas(input, base, num_of_deltas); >+ } >+ >+ RTC_LOG(LS_WARNING) << "Could not decode delta-encoded stream."; >+ return std::vector<absl::optional<uint64_t>>(); >+} >+ >+void SetFixedLengthEncoderDeltaSignednessForTesting(bool signedness) { >+ g_force_unsigned_for_testing = !signedness; >+ g_force_signed_for_testing = signedness; >+} >+ >+void UnsetFixedLengthEncoderDeltaSignednessForTesting() { >+ g_force_unsigned_for_testing = false; >+ g_force_signed_for_testing = false; >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/delta_encoding.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/delta_encoding.h >new file mode 100644 >index 0000000000000000000000000000000000000000..9e5e8b7f43cdd4b9d4290fbd4fc860df566d51e1 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/delta_encoding.h >@@ -0,0 +1,45 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef LOGGING_RTC_EVENT_LOG_ENCODER_DELTA_ENCODING_H_ >+#define LOGGING_RTC_EVENT_LOG_ENCODER_DELTA_ENCODING_H_ >+ >+#include <string> >+#include <vector> >+ >+#include "absl/types/optional.h" >+ >+namespace webrtc { >+ >+// Encode |values| as a sequence of deltas following on |base| and return it. 
>+// If all of the values were equal to the base, an empty string will be >+// returned; this is a valid encoding of that edge case. >+// |base| is not guaranteed to be written into |output|, and must therefore >+// be provided separately to the decoder. >+// This function never fails. >+// TODO(eladalon): Split into optional and non-optional variants (efficiency). >+std::string EncodeDeltas(absl::optional<uint64_t> base, >+ const std::vector<absl::optional<uint64_t>>& values); >+ >+// EncodeDeltas() and DecodeDeltas() are inverse operations; >+// invoking DecodeDeltas() over the output of EncodeDeltas(), will return >+// the input originally given to EncodeDeltas(). >+// |num_of_deltas| must be greater than zero. If input is not a valid encoding >+// of |num_of_deltas| elements based on |base|, the function returns an empty >+// vector, which signals an error. >+// TODO(eladalon): Split into optional and non-optional variants (efficiency). >+std::vector<absl::optional<uint64_t>> DecodeDeltas( >+ const std::string& input, >+ absl::optional<uint64_t> base, >+ size_t num_of_deltas); >+ >+} // namespace webrtc >+ >+#endif // LOGGING_RTC_EVENT_LOG_ENCODER_DELTA_ENCODING_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/delta_encoding_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/delta_encoding_unittest.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..ca44010edac3005af51f90cddf593824f455fa14 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/delta_encoding_unittest.cc >@@ -0,0 +1,693 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. 
All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "logging/rtc_event_log/encoder/delta_encoding.h" >+ >+#include <algorithm> >+#include <limits> >+#include <numeric> >+#include <string> >+#include <tuple> >+#include <vector> >+ >+#include "absl/types/optional.h" >+#include "rtc_base/arraysize.h" >+#include "rtc_base/checks.h" >+#include "rtc_base/random.h" >+#include "test/gtest.h" >+ >+namespace webrtc { >+ >+void SetFixedLengthEncoderDeltaSignednessForTesting(bool signedness); >+void UnsetFixedLengthEncoderDeltaSignednessForTesting(); >+ >+namespace { >+ >+enum class DeltaSignedness { kNoOverride, kForceUnsigned, kForceSigned }; >+ >+void MaybeSetSignedness(DeltaSignedness signedness) { >+ switch (signedness) { >+ case DeltaSignedness::kNoOverride: >+ UnsetFixedLengthEncoderDeltaSignednessForTesting(); >+ return; >+ case DeltaSignedness::kForceUnsigned: >+ SetFixedLengthEncoderDeltaSignednessForTesting(false); >+ return; >+ case DeltaSignedness::kForceSigned: >+ SetFixedLengthEncoderDeltaSignednessForTesting(true); >+ return; >+ } >+ RTC_NOTREACHED(); >+} >+ >+uint64_t RandomWithMaxBitWidth(Random* prng, uint64_t max_width) { >+ RTC_DCHECK_GE(max_width, 1u); >+ RTC_DCHECK_LE(max_width, 64u); >+ >+ const uint64_t low = prng->Rand(std::numeric_limits<uint32_t>::max()); >+ const uint64_t high = >+ max_width > 32u ? prng->Rand(std::numeric_limits<uint32_t>::max()) : 0u; >+ >+ const uint64_t random_before_mask = (high << 32) | low; >+ >+ if (max_width < 64) { >+ return random_before_mask & ((static_cast<uint64_t>(1) << max_width) - 1); >+ } else { >+ return random_before_mask; >+ } >+} >+ >+// Encodes |values| based on |base|, then decodes the result and makes sure >+// that it is equal to the original input. >+// If |encoded_string| is non-null, the encoded result will also be written >+// into it. 
>+void TestEncodingAndDecoding( >+ absl::optional<uint64_t> base, >+ const std::vector<absl::optional<uint64_t>>& values, >+ std::string* encoded_string = nullptr) { >+ const std::string encoded = EncodeDeltas(base, values); >+ if (encoded_string) { >+ *encoded_string = encoded; >+ } >+ >+ const std::vector<absl::optional<uint64_t>> decoded = >+ DecodeDeltas(encoded, base, values.size()); >+ >+ EXPECT_EQ(decoded, values); >+} >+ >+std::vector<absl::optional<uint64_t>> CreateSequenceByFirstValue( >+ uint64_t first, >+ size_t sequence_length) { >+ std::vector<absl::optional<uint64_t>> sequence(sequence_length); >+ std::iota(sequence.begin(), sequence.end(), first); >+ return sequence; >+} >+ >+std::vector<absl::optional<uint64_t>> CreateSequenceByLastValue( >+ uint64_t last, >+ size_t num_values) { >+ const uint64_t first = last - num_values + 1; >+ std::vector<absl::optional<uint64_t>> result(num_values); >+ std::iota(result.begin(), result.end(), first); >+ return result; >+} >+ >+// If |sequence_length| is greater than the number of deltas, the sequence of >+// deltas will wrap around. 
>+std::vector<absl::optional<uint64_t>> CreateSequenceByOptionalDeltas( >+ uint64_t first, >+ const std::vector<absl::optional<uint64_t>>& deltas, >+ size_t sequence_length) { >+ RTC_DCHECK_GE(sequence_length, 1); >+ >+ std::vector<absl::optional<uint64_t>> sequence(sequence_length); >+ >+ uint64_t previous = first; >+ for (size_t i = 0, next_delta_index = 0; i < sequence.size(); ++i) { >+ if (deltas[next_delta_index].has_value()) { >+ sequence[i] = >+ absl::optional<uint64_t>(previous + deltas[next_delta_index].value()); >+ previous = sequence[i].value(); >+ } >+ next_delta_index = (next_delta_index + 1) % deltas.size(); >+ } >+ >+ return sequence; >+} >+ >+size_t EncodingLengthUpperBound(size_t delta_max_bit_width, >+ size_t num_of_deltas, >+ DeltaSignedness signedness_override) { >+ absl::optional<size_t> smallest_header_size_bytes; >+ switch (signedness_override) { >+ case DeltaSignedness::kNoOverride: >+ case DeltaSignedness::kForceUnsigned: >+ smallest_header_size_bytes = 1; >+ break; >+ case DeltaSignedness::kForceSigned: >+ smallest_header_size_bytes = 2; >+ break; >+ } >+ RTC_DCHECK(smallest_header_size_bytes); >+ >+ return delta_max_bit_width * num_of_deltas + *smallest_header_size_bytes; >+} >+ >+// If |sequence_length| is greater than the number of deltas, the sequence of >+// deltas will wrap around. >+std::vector<absl::optional<uint64_t>> CreateSequenceByDeltas( >+ uint64_t first, >+ const std::vector<uint64_t>& deltas, >+ size_t sequence_length) { >+ RTC_DCHECK(!deltas.empty()); >+ std::vector<absl::optional<uint64_t>> optional_deltas(deltas.size()); >+ for (size_t i = 0; i < deltas.size(); ++i) { >+ optional_deltas[i] = absl::optional<uint64_t>(deltas[i]); >+ } >+ return CreateSequenceByOptionalDeltas(first, optional_deltas, >+ sequence_length); >+} >+ >+// Tests of the delta encoding, parameterized by the number of values >+// in the sequence created by the test. 
>+class DeltaEncodingTest
>+ : public ::testing::TestWithParam<
>+ std::tuple<DeltaSignedness, size_t, bool, uint64_t>> {
>+ public:
>+ DeltaEncodingTest()
>+ : signedness_(std::get<0>(GetParam())),
>+ num_of_values_(std::get<1>(GetParam())),
>+ optional_values_(std::get<2>(GetParam())),
>+ partial_random_seed_(std::get<3>(GetParam())) {
>+ MaybeSetSignedness(signedness_);
>+ }
>+
>+ ~DeltaEncodingTest() override = default;
>+
>+ // Running with the same seed for all variants would make all tests start
>+ // with the same sequence; avoid this by making the seed different.
>+ uint64_t Seed() const {
>+ // Multiply everything by different primes to produce unique results.
>+ return 2 * static_cast<uint64_t>(signedness_) + 3 * num_of_values_ +
>+ 5 * optional_values_ + 7 * partial_random_seed_;
>+ }
>+
>+ const DeltaSignedness signedness_;
>+ const uint64_t num_of_values_;
>+ const bool optional_values_;
>+ const uint64_t partial_random_seed_; // Explained where it's used.
>+};
>+
>+TEST_P(DeltaEncodingTest, AllValuesEqualToExistentBaseValue) {
>+ const absl::optional<uint64_t> base(3432);
>+ std::vector<absl::optional<uint64_t>> values(num_of_values_);
>+ std::fill(values.begin(), values.end(), base);
>+ std::string encoded;
>+ TestEncodingAndDecoding(base, values, &encoded);
>+
>+ // Additional requirement - the encoding should be efficient in this
>+ // case - the empty string will be used.
>+ EXPECT_TRUE(encoded.empty());
>+}
>+
>+TEST_P(DeltaEncodingTest, AllValuesEqualToNonExistentBaseValue) {
>+ if (!optional_values_) {
>+ return; // Test irrelevant for this case.
>+ }
>+
>+ const absl::optional<uint64_t> base;
>+ std::vector<absl::optional<uint64_t>> values(num_of_values_);
>+ std::fill(values.begin(), values.end(), base);
>+ std::string encoded;
>+ TestEncodingAndDecoding(base, values, &encoded);
>+
>+ // Additional requirement - the encoding should be efficient in this
>+ // case - the empty string will be used. 
>+ EXPECT_TRUE(encoded.empty()); >+} >+ >+TEST_P(DeltaEncodingTest, BaseNonExistentButSomeOtherValuesExist) { >+ if (!optional_values_) { >+ return; // Test irrelevant for this case. >+ } >+ >+ const absl::optional<uint64_t> base; >+ std::vector<absl::optional<uint64_t>> values(num_of_values_); >+ >+ Random prng(Seed()); >+ >+ const uint64_t max_bit_width = 1 + prng.Rand(63); // [1, 64] >+ >+ for (size_t i = 0; i < values.size();) { >+ // Leave a random number of values as non-existent. >+ const size_t non_existent_count = prng.Rand(values.size() - i - 1); >+ i += non_existent_count; >+ >+ // Assign random values to a random number of values. (At least one, to >+ // prevent this iteration of the outer loop from being a no-op.) >+ const size_t existent_count = >+ std::max<size_t>(prng.Rand(values.size() - i - 1), 1); >+ for (size_t j = 0; j < existent_count; ++j) { >+ values[i + j] = RandomWithMaxBitWidth(&prng, max_bit_width); >+ } >+ i += existent_count; >+ } >+ >+ TestEncodingAndDecoding(base, values); >+} >+ >+TEST_P(DeltaEncodingTest, MinDeltaNoWrapAround) { >+ const absl::optional<uint64_t> base(3432); >+ >+ auto values = CreateSequenceByFirstValue(base.value() + 1, num_of_values_); >+ ASSERT_GT(values[values.size() - 1], base) << "Sanity; must not wrap around"; >+ >+ if (optional_values_) { >+ // Arbitrarily make one of the values non-existent, to force >+ // optional-supporting encoding. >+ values[0] = absl::optional<uint64_t>(); >+ } >+ >+ TestEncodingAndDecoding(base, values); >+} >+ >+TEST_P(DeltaEncodingTest, BigDeltaNoWrapAround) { >+ const uint64_t kBigDelta = 132828; >+ const absl::optional<uint64_t> base(3432); >+ >+ auto values = >+ CreateSequenceByFirstValue(base.value() + kBigDelta, num_of_values_); >+ ASSERT_GT(values[values.size() - 1], base) << "Sanity; must not wrap around"; >+ >+ if (optional_values_) { >+ // Arbitrarily make one of the values non-existent, to force >+ // optional-supporting encoding. 
>+ values[0] = absl::optional<uint64_t>(); >+ } >+ >+ TestEncodingAndDecoding(base, values); >+} >+ >+TEST_P(DeltaEncodingTest, MaxDeltaNoWrapAround) { >+ const absl::optional<uint64_t> base(3432); >+ >+ auto values = CreateSequenceByLastValue(std::numeric_limits<uint64_t>::max(), >+ num_of_values_); >+ ASSERT_GT(values[values.size() - 1], base) << "Sanity; must not wrap around"; >+ >+ if (optional_values_) { >+ // Arbitrarily make one of the values non-existent, to force >+ // optional-supporting encoding. >+ values[0] = absl::optional<uint64_t>(); >+ } >+ >+ TestEncodingAndDecoding(base, values); >+} >+ >+TEST_P(DeltaEncodingTest, SmallDeltaWithWrapAroundComparedToBase) { >+ if (optional_values_ && num_of_values_ == 1) { >+ return; // Inapplicable >+ } >+ >+ const absl::optional<uint64_t> base(std::numeric_limits<uint64_t>::max()); >+ >+ auto values = CreateSequenceByDeltas(*base, {1, 10, 3}, num_of_values_); >+ ASSERT_LT(values[0], base) << "Sanity; must wrap around"; >+ >+ if (optional_values_) { >+ // Arbitrarily make one of the values non-existent, to force >+ // optional-supporting encoding. >+ values[1] = absl::optional<uint64_t>(); >+ } >+ >+ TestEncodingAndDecoding(base, values); >+} >+ >+TEST_P(DeltaEncodingTest, SmallDeltaWithWrapAroundInValueSequence) { >+ if (num_of_values_ == 1 || (optional_values_ && num_of_values_ < 3)) { >+ return; // Inapplicable. >+ } >+ >+ const absl::optional<uint64_t> base(std::numeric_limits<uint64_t>::max() - 2); >+ >+ auto values = CreateSequenceByDeltas(*base, {1, 10, 3}, num_of_values_); >+ ASSERT_LT(values[values.size() - 1], values[0]) << "Sanity; must wrap around"; >+ >+ if (optional_values_) { >+ // Arbitrarily make one of the values non-existent, to force >+ // optional-supporting encoding. >+ RTC_DCHECK_GT(values.size() - 1, 1u); // Wrap around not cancelled. 
>+ values[1] = absl::optional<uint64_t>(); >+ } >+ >+ TestEncodingAndDecoding(base, values); >+} >+ >+// Suppress "integral constant overflow" warning; this is the test's focus. >+#ifdef _MSC_VER >+#pragma warning(push) >+#pragma warning(disable : 4307) >+#endif >+TEST_P(DeltaEncodingTest, BigDeltaWithWrapAroundComparedToBase) { >+ if (optional_values_ && num_of_values_ == 1) { >+ return; // Inapplicable >+ } >+ >+ const uint64_t kBigDelta = 132828; >+ const absl::optional<uint64_t> base(std::numeric_limits<uint64_t>::max() - >+ kBigDelta + 3); >+ >+ auto values = >+ CreateSequenceByFirstValue(base.value() + kBigDelta, num_of_values_); >+ ASSERT_LT(values[0], base.value()) << "Sanity; must wrap around"; >+ >+ if (optional_values_) { >+ // Arbitrarily make one of the values non-existent, to force >+ // optional-supporting encoding. >+ values[1] = absl::optional<uint64_t>(); >+ } >+ >+ TestEncodingAndDecoding(base, values); >+} >+ >+TEST_P(DeltaEncodingTest, BigDeltaWithWrapAroundInValueSequence) { >+ if (num_of_values_ == 1 || (optional_values_ && num_of_values_ < 3)) { >+ return; // Inapplicable. >+ } >+ >+ const uint64_t kBigDelta = 132828; >+ const absl::optional<uint64_t> base(std::numeric_limits<uint64_t>::max() - >+ kBigDelta + 3); >+ >+ auto values = CreateSequenceByFirstValue(std::numeric_limits<uint64_t>::max(), >+ num_of_values_); >+ ASSERT_LT(values[values.size() - 1], values[0]) << "Sanity; must wrap around"; >+ >+ if (optional_values_) { >+ // Arbitrarily make one of the values non-existent, to force >+ // optional-supporting encoding. >+ RTC_DCHECK_GT(values.size() - 1, 1u); // Wrap around not cancelled. 
>+ values[1] = absl::optional<uint64_t>(); >+ } >+ >+ TestEncodingAndDecoding(base, values); >+} >+#ifdef _MSC_VER >+#pragma warning(pop) >+#endif >+ >+TEST_P(DeltaEncodingTest, MaxDeltaWithWrapAroundComparedToBase) { >+ if (optional_values_ && num_of_values_ == 1) { >+ return; // Inapplicable >+ } >+ >+ const absl::optional<uint64_t> base(3432); >+ auto values = CreateSequenceByFirstValue(*base - 1, num_of_values_); >+ >+ if (optional_values_) { >+ // Arbitrarily make one of the values non-existent, to force >+ // optional-supporting encoding. >+ values[1] = absl::optional<uint64_t>(); >+ } >+ >+ TestEncodingAndDecoding(base, values); >+} >+ >+TEST_P(DeltaEncodingTest, MaxDeltaWithWrapAroundInValueSequence) { >+ if (num_of_values_ == 1 || (optional_values_ && num_of_values_ < 3)) { >+ return; // Inapplicable. >+ } >+ >+ const absl::optional<uint64_t> base(3432); >+ >+ auto values = CreateSequenceByDeltas( >+ *base, {0, std::numeric_limits<uint64_t>::max(), 3}, num_of_values_); >+ // Wraps around continuously by virtue of being max(); will not ASSERT. >+ >+ if (optional_values_) { >+ // Arbitrarily make one of the values non-existent, to force >+ // optional-supporting encoding. >+ RTC_DCHECK_GT(values.size() - 1, 1u); // Wrap around not cancelled. >+ values[1] = absl::optional<uint64_t>(); >+ } >+ >+ TestEncodingAndDecoding(base, values); >+} >+ >+// If num_of_values_ == 1, a zero delta will yield an empty string; that's >+// already covered by AllValuesEqualToExistentBaseValue, but it doesn't hurt to >+// test again. For all other cases, we have a new test. >+TEST_P(DeltaEncodingTest, ZeroDelta) { >+ const absl::optional<uint64_t> base(3432); >+ >+ // Arbitrary sequence of deltas with intentional zero deltas, as well as >+ // consecutive zeros. 
>+  const std::vector<uint64_t> deltas = {0, 312, 11, 1, 1, 0, 0, 12,
>+                                        400321, 3, 3, 12, 5, 0, 6};
>+  auto values = CreateSequenceByDeltas(base.value(), deltas, num_of_values_);
>+
>+  if (optional_values_) {
>+    // Arbitrarily make one of the values non-existent, to force
>+    // optional-supporting encoding.
>+    values[0] = absl::optional<uint64_t>();
>+  }
>+
>+  TestEncodingAndDecoding(base, values);
>+}
>+
>+INSTANTIATE_TEST_CASE_P(
>+    SignednessOverrideAndNumberOfValuesInSequence,
>+    DeltaEncodingTest,
>+    ::testing::Combine(::testing::Values(DeltaSignedness::kNoOverride,
>+                                         DeltaSignedness::kForceUnsigned,
>+                                         DeltaSignedness::kForceSigned),
>+                       ::testing::Values(1, 2, 100, 10000),
>+                       ::testing::Bool(),
>+                       ::testing::Values(10, 20, 30)));
>+
>+// Tests over the quality of the compression (as opposed to its correctness).
>+// Not to be confused with tests of runtime efficiency.
>+class DeltaEncodingCompressionQualityTest
>+    : public ::testing::TestWithParam<
>+          std::tuple<DeltaSignedness, uint64_t, uint64_t, uint64_t>> {
>+ public:
>+  DeltaEncodingCompressionQualityTest()
>+      : signedness_(std::get<0>(GetParam())),
>+        delta_max_bit_width_(std::get<1>(GetParam())),
>+        num_of_values_(std::get<2>(GetParam())),
>+        partial_random_seed_(std::get<3>(GetParam())) {
>+    MaybeSetSignedness(signedness_);
>+  }
>+
>+  ~DeltaEncodingCompressionQualityTest() override = default;
>+
>+  // Running with the same seed for all variants would make all tests start
>+  // with the same sequence; avoid this by making the seed different.
>+  uint64_t Seed() const {
>+    // Multiply each field by a different prime to produce unique results.
>+    return 2 * static_cast<uint64_t>(signedness_) + 3 * delta_max_bit_width_ +
>+           7 * num_of_values_ + 11 * partial_random_seed_;
>+  }
>+
>+  const DeltaSignedness signedness_;
>+  const uint64_t delta_max_bit_width_;
>+  const uint64_t num_of_values_;
>+  const uint64_t partial_random_seed_;  // Explained where it's used.
>+};
>+
>+// If no wrap-around occurs in the stream, the width of the values does not
>+// matter to compression performance; only the deltas matter.
>+TEST_P(DeltaEncodingCompressionQualityTest,
>+       BaseDoesNotAffectEfficiencyIfNoWrapAround) {
>+  // 1. Bases which will not produce a wrap-around.
>+  // 2. The last base - 0xffffffffffffffff - does cause a wrap-around, but
>+  //    that still works, because the width is 64 anyway, and does not
>+  //    need to be conveyed explicitly in the encoding header.
>+  const uint64_t bases[] = {0, 0x55, 0xffffffff,
>+                            std::numeric_limits<uint64_t>::max()};
>+  const size_t kIntendedWrapAroundBaseIndex = arraysize(bases) - 1;
>+
>+  std::vector<uint64_t> deltas(num_of_values_);
>+
>+  // Allows us to make sure that the deltas do not produce a wrap-around.
>+  uint64_t last_element[arraysize(bases)];
>+  memcpy(last_element, bases, sizeof(bases));
>+
>+  // Avoid empty |deltas| due to first element causing wrap-around.
>+  deltas[0] = 1;
>+  for (size_t i = 0; i < arraysize(last_element); ++i) {
>+    last_element[i] += 1;
>+  }
>+
>+  Random prng(Seed());
>+
>+  for (size_t i = 1; i < deltas.size(); ++i) {
>+    const uint64_t delta = RandomWithMaxBitWidth(&prng, delta_max_bit_width_);
>+
>+    bool wrap_around = false;
>+    for (size_t j = 0; j < arraysize(last_element); ++j) {
>+      if (j == kIntendedWrapAroundBaseIndex) {
>+        continue;
>+      }
>+
>+      last_element[j] += delta;
>+      if (last_element[j] < bases[j]) {
>+        wrap_around = true;
>+        break;
>+      }
>+    }
>+
>+    if (wrap_around) {
>+      deltas.resize(i);
>+      break;
>+    }
>+
>+    deltas[i] = delta;
>+  }
>+
>+  std::string encodings[arraysize(bases)];
>+
>+  for (size_t i = 0; i < arraysize(bases); ++i) {
>+    const auto values =
>+        CreateSequenceByDeltas(bases[i], deltas, num_of_values_);
>+    // Produce the encoding and write it to encodings[i].
>+    // By using TestEncodingAndDecoding() to do this, we also sanity-test
>+    // the encoding/decoding, though that is not the test's focus.
>+    TestEncodingAndDecoding(bases[i], values, &encodings[i]);
>+    EXPECT_LE(encodings[i].length(),
>+              EncodingLengthUpperBound(delta_max_bit_width_, num_of_values_,
>+                                       signedness_));
>+  }
>+
>+  // Test focus - all of the encodings should be the same, as they are based
>+  // on the same delta sequence, and do not contain a wrap-around.
>+  for (size_t i = 1; i < arraysize(encodings); ++i) {
>+    EXPECT_EQ(encodings[i], encodings[0]);
>+  }
>+}
>+
>+INSTANTIATE_TEST_CASE_P(
>+    SignednessOverrideAndDeltaMaxBitWidthAndNumberOfValuesInSequence,
>+    DeltaEncodingCompressionQualityTest,
>+    ::testing::Combine(
>+        ::testing::Values(DeltaSignedness::kNoOverride,
>+                          DeltaSignedness::kForceUnsigned,
>+                          DeltaSignedness::kForceSigned),
>+        ::testing::Values(1, 4, 8, 15, 16, 17, 31, 32, 33, 63, 64),
>+        ::testing::Values(1, 2, 100, 10000),
>+        ::testing::Values(11, 12, 13)));
>+
>+// Similar to DeltaEncodingTest, but instead of semi-surgically producing
>+// specific cases, produces a large number of semi-realistic inputs.
>+class DeltaEncodingFuzzerLikeTest
>+    : public ::testing::TestWithParam<
>+          std::tuple<DeltaSignedness, uint64_t, uint64_t, bool, uint64_t>> {
>+ public:
>+  DeltaEncodingFuzzerLikeTest()
>+      : signedness_(std::get<0>(GetParam())),
>+        delta_max_bit_width_(std::get<1>(GetParam())),
>+        num_of_values_(std::get<2>(GetParam())),
>+        optional_values_(std::get<3>(GetParam())),
>+        partial_random_seed_(std::get<4>(GetParam())) {
>+    MaybeSetSignedness(signedness_);
>+  }
>+
>+  ~DeltaEncodingFuzzerLikeTest() override = default;
>+
>+  // Running with the same seed for all variants would make all tests start
>+  // with the same sequence; avoid this by making the seed different.
>+  uint64_t Seed() const {
>+    // Multiply each field by a different prime to produce unique results.
>+    return 2 * static_cast<uint64_t>(signedness_) + 3 * delta_max_bit_width_ +
>+           7 * num_of_values_ + 11 * static_cast<uint64_t>(optional_values_) +
>+           13 * partial_random_seed_;
>+  }
>+
>+  const DeltaSignedness signedness_;
>+  const uint64_t delta_max_bit_width_;
>+  const uint64_t num_of_values_;
>+  const bool optional_values_;
>+  const uint64_t partial_random_seed_;  // Explained where it's used.
>+};
>+
>+TEST_P(DeltaEncodingFuzzerLikeTest, Test) {
>+  const absl::optional<uint64_t> base(3432);
>+
>+  Random prng(Seed());
>+  std::vector<absl::optional<uint64_t>> deltas(num_of_values_);
>+  for (size_t i = 0; i < deltas.size(); ++i) {
>+    if (!optional_values_ || prng.Rand<bool>()) {
>+      deltas[i] = RandomWithMaxBitWidth(&prng, delta_max_bit_width_);
>+    }
>+  }
>+  const auto values =
>+      CreateSequenceByOptionalDeltas(base.value(), deltas, num_of_values_);
>+
>+  TestEncodingAndDecoding(base, values);
>+}
>+
>+INSTANTIATE_TEST_CASE_P(
>+    SignednessOverrideAndDeltaMaxBitWidthAndNumberOfValuesInSequence,
>+    DeltaEncodingFuzzerLikeTest,
>+    ::testing::Combine(
>+        ::testing::Values(DeltaSignedness::kNoOverride,
>+                          DeltaSignedness::kForceUnsigned,
>+                          DeltaSignedness::kForceSigned),
>+        ::testing::Values(1, 4, 8, 15, 16, 17, 31, 32, 33, 63, 64),
>+        ::testing::Values(1, 2, 100, 10000),
>+        ::testing::Bool(),
>+        ::testing::Values(21, 22, 23)));
>+
>+class DeltaEncodingSpecificEdgeCasesTest
>+    : public ::testing::TestWithParam<
>+          std::tuple<DeltaSignedness, uint64_t, bool>> {
>+ public:
>+  DeltaEncodingSpecificEdgeCasesTest() {
>+    UnsetFixedLengthEncoderDeltaSignednessForTesting();
>+  }
>+
>+  ~DeltaEncodingSpecificEdgeCasesTest() override = default;
>+};
>+
>+// This case is special because it produces identical forward/backward deltas.
>+TEST_F(DeltaEncodingSpecificEdgeCasesTest, SignedDeltaWithOnlyTopBitOn) { >+ MaybeSetSignedness(DeltaSignedness::kForceSigned); >+ >+ const absl::optional<uint64_t> base(3432); >+ >+ const uint64_t delta = static_cast<uint64_t>(1) << 63; >+ const std::vector<absl::optional<uint64_t>> values = {base.value() + delta}; >+ >+ TestEncodingAndDecoding(base, values); >+} >+ >+TEST_F(DeltaEncodingSpecificEdgeCasesTest, MaximumUnsignedDelta) { >+ MaybeSetSignedness(DeltaSignedness::kForceUnsigned); >+ >+ const absl::optional<uint64_t> base((static_cast<uint64_t>(1) << 63) + 0x123); >+ >+ const std::vector<absl::optional<uint64_t>> values = {base.value() - 1}; >+ >+ TestEncodingAndDecoding(base, values); >+} >+ >+// Check that, if all deltas are set to -1, things still work. >+TEST_P(DeltaEncodingSpecificEdgeCasesTest, ReverseSequence) { >+ MaybeSetSignedness(std::get<0>(GetParam())); >+ const uint64_t width = std::get<1>(GetParam()); >+ const bool wrap_around = std::get<2>(GetParam()); >+ >+ const uint64_t value_mask = (width == 64) >+ ? std::numeric_limits<uint64_t>::max() >+ : ((static_cast<uint64_t>(1) << width) - 1); >+ >+ const uint64_t base = wrap_around ? 
1u : (0xf82d3 & value_mask); >+ const std::vector<absl::optional<uint64_t>> values = { >+ (base - 1u) & value_mask, (base - 2u) & value_mask, >+ (base - 3u) & value_mask}; >+ >+ TestEncodingAndDecoding(base, values); >+} >+ >+INSTANTIATE_TEST_CASE_P( >+ _, >+ DeltaEncodingSpecificEdgeCasesTest, >+ ::testing::Combine( >+ ::testing::Values(DeltaSignedness::kNoOverride, >+ DeltaSignedness::kForceUnsigned, >+ DeltaSignedness::kForceSigned), >+ ::testing::Values(1, 4, 8, 15, 16, 17, 31, 32, 33, 63, 64), >+ ::testing::Bool())); >+ >+} // namespace >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder.h >index beb711c9dd82f72b95f5dc85f102e1648116f8bb..6ce750f67e760a1dc27ead8f6d8fbeaa2e36ee86 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder.h >@@ -22,7 +22,8 @@ class RtcEventLogEncoder { > public: > virtual ~RtcEventLogEncoder() = default; > >- virtual std::string EncodeLogStart(int64_t timestamp_us) = 0; >+ virtual std::string EncodeLogStart(int64_t timestamp_us, >+ int64_t utc_time_us) = 0; > virtual std::string EncodeLogEnd(int64_t timestamp_us) = 0; > > virtual std::string EncodeBatch( >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_common.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_common.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..7aea47611d6c3da6c55735b296f7b19ec9bb43ca >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_common.cc >@@ -0,0 +1,40 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. 
>+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "logging/rtc_event_log/encoder/rtc_event_log_encoder_common.h" >+ >+#include "rtc_base/checks.h" >+ >+namespace webrtc { >+namespace { >+// We use 0x3fff because that gives decent precision (compared to the underlying >+// measurement producing the packet loss fraction) on the one hand, while >+// allowing us to use no more than 2 bytes in varint form on the other hand. >+// (We might also fixed-size encode using at most 14 bits.) >+constexpr uint32_t kPacketLossFractionRange = (1 << 14) - 1; // 0x3fff >+constexpr float kPacketLossFractionRangeFloat = >+ static_cast<float>(kPacketLossFractionRange); >+} // namespace >+ >+uint32_t ConvertPacketLossFractionToProtoFormat(float packet_loss_fraction) { >+ RTC_DCHECK_GE(packet_loss_fraction, 0); >+ RTC_DCHECK_LE(packet_loss_fraction, 1); >+ return static_cast<uint32_t>(packet_loss_fraction * kPacketLossFractionRange); >+} >+ >+bool ParsePacketLossFractionFromProtoFormat(uint32_t proto_packet_loss_fraction, >+ float* output) { >+ if (proto_packet_loss_fraction >= kPacketLossFractionRange) { >+ return false; >+ } >+ *output = proto_packet_loss_fraction / kPacketLossFractionRangeFloat; >+ return true; >+} >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_common.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_common.h >new file mode 100644 >index 0000000000000000000000000000000000000000..429f8ed2ad73fc5a9eaaa76fcda120f5cb52c699 >--- /dev/null >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_common.h
>@@ -0,0 +1,93 @@
>+/*
>+ *  Copyright (c) 2018 The WebRTC project authors. All Rights Reserved.
>+ *
>+ *  Use of this source code is governed by a BSD-style license
>+ *  that can be found in the LICENSE file in the root of the source
>+ *  tree. An additional intellectual property rights grant can be found
>+ *  in the file PATENTS. All contributing project authors may
>+ *  be found in the AUTHORS file in the root of the source tree.
>+ */
>+
>+#ifndef LOGGING_RTC_EVENT_LOG_ENCODER_RTC_EVENT_LOG_ENCODER_COMMON_H_
>+#define LOGGING_RTC_EVENT_LOG_ENCODER_RTC_EVENT_LOG_ENCODER_COMMON_H_
>+
>+#include <stdint.h>
>+
>+#include <limits>
>+#include <type_traits>
>+
>+namespace webrtc {
>+
>+// Convert between the packet loss fraction (a floating point number in
>+// the range [0.0, 1.0]) and a uint32_t with up to a fixed number of bits.
>+// The latter can be more efficiently stored in a protobuf and/or delta-encoded.
>+uint32_t ConvertPacketLossFractionToProtoFormat(float packet_loss_fraction);
>+bool ParsePacketLossFractionFromProtoFormat(uint32_t proto_packet_loss_fraction,
>+                                            float* output);
>+
>+}  // namespace webrtc
>+
>+namespace webrtc_event_logging {
>+
>+// Produce an unsigned representation of a signed integer. On two's complement
>+// machines, this is equivalent to:
>+//   static_cast<uint64_t>(static_cast<typename std::make_unsigned<T>::type>(y))
>+template <typename T>
>+uint64_t ToUnsigned(T y) {
>+  static_assert(std::is_integral<T>::value, "");
>+  static_assert(std::is_signed<T>::value, "");
>+
>+  // Note that a signed integer whose width is N bits, has N-1 digits.
>+ static_assert(std::numeric_limits<T>::digits < 64, ""); >+ >+ constexpr T MIN_T = std::numeric_limits<T>::min(); >+ constexpr T MAX_T = std::numeric_limits<T>::max(); >+ >+ static_assert(MAX_T + MIN_T + 1 >= 0, "MAX_T >= abs(MIN_T) - 1"); >+ >+ if (y >= 0) { >+ return static_cast<uint64_t>(y); >+ } else { >+ // y is in the range [MIN_T, -1], so (y - MIN_T) is in the >+ // range [0, abs(MIN_T) - 1]. This is representable in a T >+ // because MAX_T >= abs(MIN_T) - 1, as per the static_assert above. >+ return static_cast<uint64_t>(MAX_T) + 1 + static_cast<uint64_t>(y - MIN_T); >+ } >+} >+ >+// Assuming x = ToUnsigned(y), return |y|. >+// Note: static_cast<T>(x) would work on most platforms and compilers, but >+// involves undefined behavior. This function is well-defined, and can be >+// optimized to a noop for 64 bit types, or a few arithmetic >+// instructions and a single conditional jump for narrower types. >+template <typename T> >+bool ToSigned(uint64_t x, T* y) { >+ static_assert(std::is_integral<T>::value, ""); >+ static_assert(std::is_signed<T>::value, ""); >+ >+ // Note that a signed integer whose width is N bits, has N-1 digits. >+ static_assert(std::numeric_limits<T>::digits < 64, ""); >+ >+ constexpr T MIN_T = std::numeric_limits<T>::min(); >+ constexpr T MAX_T = std::numeric_limits<T>::max(); >+ >+ using UNSIGNED_T = typename std::make_unsigned<T>::type; >+ constexpr auto MAX_UNSIGNED_T = std::numeric_limits<UNSIGNED_T>::max(); >+ if (x > static_cast<uint64_t>(MAX_UNSIGNED_T)) { >+ return false; // |x| cannot be represented using a T. >+ } >+ >+ if (x <= static_cast<uint64_t>(MAX_T)) { >+ // The original value was positive, so it is safe to just static_cast. 
>+ *y = static_cast<T>(x); >+ } else { // x > static_cast<uint64_t>(MAX_T) >+ const uint64_t neg_x = x - static_cast<uint64_t>(MAX_T) - 1; >+ *y = static_cast<T>(neg_x) + MIN_T; >+ } >+ >+ return true; >+} >+ >+} // namespace webrtc_event_logging >+ >+#endif // LOGGING_RTC_EVENT_LOG_ENCODER_RTC_EVENT_LOG_ENCODER_COMMON_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_common_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_common_unittest.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..a7d94406ae0d2f6e1cb1e656a71f8b471e83fcdb >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_common_unittest.cc >@@ -0,0 +1,83 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "logging/rtc_event_log/encoder/rtc_event_log_encoder_common.h" >+ >+#include <limits> >+#include <type_traits> >+#include <vector> >+ >+#include "test/gtest.h" >+ >+namespace webrtc_event_logging { >+namespace { >+ >+template <typename T> >+class SignednessConversionTest : public testing::Test { >+ public: >+ static_assert(std::is_integral<T>::value, ""); >+ static_assert(std::is_signed<T>::value, ""); >+}; >+ >+TYPED_TEST_CASE_P(SignednessConversionTest); >+ >+TYPED_TEST_P(SignednessConversionTest, CorrectlyConvertsLegalValues) { >+ using T = TypeParam; >+ std::vector<T> legal_values = {std::numeric_limits<T>::min(), >+ std::numeric_limits<T>::min() + 1, >+ -1, >+ 0, >+ 1, >+ std::numeric_limits<T>::max() - 1, >+ std::numeric_limits<T>::max()}; >+ for (T val : legal_values) { >+ const auto unsigned_val = ToUnsigned(val); >+ T signed_val; >+ ASSERT_TRUE(ToSigned<T>(unsigned_val, &signed_val)) >+ << "Failed on " << static_cast<uint64_t>(unsigned_val) << "."; >+ EXPECT_EQ(val, signed_val) >+ << "Failed on " << static_cast<uint64_t>(unsigned_val) << "."; >+ } >+} >+ >+TYPED_TEST_P(SignednessConversionTest, FailsOnConvertingIllegalValues) { >+ using T = TypeParam; >+ >+ // Note that a signed integer whose width is N bits, has N-1 digits. >+ constexpr bool width_is_64 = std::numeric_limits<T>::digits == 63; >+ >+ if (width_is_64) { >+ return; // Test irrelevant; illegal values do not exist. 
>+ } >+ >+ const uint64_t max_legal_value = ToUnsigned(static_cast<T>(-1)); >+ >+ const std::vector<uint64_t> illegal_values = { >+ max_legal_value + 1u, max_legal_value + 2u, >+ std::numeric_limits<uint64_t>::max() - 1u, >+ std::numeric_limits<uint64_t>::max()}; >+ >+ for (uint64_t unsigned_val : illegal_values) { >+ T signed_val; >+ EXPECT_FALSE(ToSigned<T>(unsigned_val, &signed_val)) >+ << "Failed on " << static_cast<uint64_t>(unsigned_val) << "."; >+ } >+} >+ >+REGISTER_TYPED_TEST_CASE_P(SignednessConversionTest, >+ CorrectlyConvertsLegalValues, >+ FailsOnConvertingIllegalValues); >+ >+using Types = ::testing::Types<int8_t, int16_t, int32_t, int64_t>; >+ >+INSTANTIATE_TYPED_TEST_CASE_P(_, SignednessConversionTest, Types); >+ >+} // namespace >+} // namespace webrtc_event_logging >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_legacy.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_legacy.cc >index 7ef8d677c609072e106b04e7251d6b2a48cee8d0..29636ede922f2d44928e3730d1e25e6c2ab7f6b1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_legacy.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_legacy.cc >@@ -47,7 +47,6 @@ > #include "rtc_base/ignore_wundef.h" > #include "rtc_base/logging.h" > >-#ifdef ENABLE_RTC_EVENT_LOG > > // *.pb.h files are generated at build-time by the protobuf compiler. 
> RTC_PUSH_IGNORING_WUNDEF() >@@ -223,7 +222,8 @@ ConvertIceCandidatePairEventType(IceCandidatePairEventType type) { > > } // namespace > >-std::string RtcEventLogEncoderLegacy::EncodeLogStart(int64_t timestamp_us) { >+std::string RtcEventLogEncoderLegacy::EncodeLogStart(int64_t timestamp_us, >+ int64_t utc_time_us) { > rtclog::Event rtclog_event; > rtclog_event.set_timestamp_us(timestamp_us); > rtclog_event.set_type(rtclog::Event::LOG_START); >@@ -357,37 +357,37 @@ std::string RtcEventLogEncoderLegacy::Encode(const RtcEvent& event) { > std::string RtcEventLogEncoderLegacy::EncodeAlrState( > const RtcEventAlrState& event) { > rtclog::Event rtclog_event; >- rtclog_event.set_timestamp_us(event.timestamp_us_); >+ rtclog_event.set_timestamp_us(event.timestamp_us()); > rtclog_event.set_type(rtclog::Event::ALR_STATE_EVENT); > > auto* alr_state = rtclog_event.mutable_alr_state(); >- alr_state->set_in_alr(event.in_alr_); >+ alr_state->set_in_alr(event.in_alr()); > return Serialize(&rtclog_event); > } > > std::string RtcEventLogEncoderLegacy::EncodeAudioNetworkAdaptation( > const RtcEventAudioNetworkAdaptation& event) { > rtclog::Event rtclog_event; >- rtclog_event.set_timestamp_us(event.timestamp_us_); >+ rtclog_event.set_timestamp_us(event.timestamp_us()); > rtclog_event.set_type(rtclog::Event::AUDIO_NETWORK_ADAPTATION_EVENT); > > auto* audio_network_adaptation = > rtclog_event.mutable_audio_network_adaptation(); >- if (event.config_->bitrate_bps) >- audio_network_adaptation->set_bitrate_bps(*event.config_->bitrate_bps); >- if (event.config_->frame_length_ms) >+ if (event.config().bitrate_bps) >+ audio_network_adaptation->set_bitrate_bps(*event.config().bitrate_bps); >+ if (event.config().frame_length_ms) > audio_network_adaptation->set_frame_length_ms( >- *event.config_->frame_length_ms); >- if (event.config_->uplink_packet_loss_fraction) { >+ *event.config().frame_length_ms); >+ if (event.config().uplink_packet_loss_fraction) { > 
audio_network_adaptation->set_uplink_packet_loss_fraction( >- *event.config_->uplink_packet_loss_fraction); >+ *event.config().uplink_packet_loss_fraction); > } >- if (event.config_->enable_fec) >- audio_network_adaptation->set_enable_fec(*event.config_->enable_fec); >- if (event.config_->enable_dtx) >- audio_network_adaptation->set_enable_dtx(*event.config_->enable_dtx); >- if (event.config_->num_channels) >- audio_network_adaptation->set_num_channels(*event.config_->num_channels); >+ if (event.config().enable_fec) >+ audio_network_adaptation->set_enable_fec(*event.config().enable_fec); >+ if (event.config().enable_dtx) >+ audio_network_adaptation->set_enable_dtx(*event.config().enable_dtx); >+ if (event.config().num_channels) >+ audio_network_adaptation->set_num_channels(*event.config().num_channels); > > return Serialize(&rtclog_event); > } >@@ -395,11 +395,11 @@ std::string RtcEventLogEncoderLegacy::EncodeAudioNetworkAdaptation( > std::string RtcEventLogEncoderLegacy::EncodeAudioPlayout( > const RtcEventAudioPlayout& event) { > rtclog::Event rtclog_event; >- rtclog_event.set_timestamp_us(event.timestamp_us_); >+ rtclog_event.set_timestamp_us(event.timestamp_us()); > rtclog_event.set_type(rtclog::Event::AUDIO_PLAYOUT_EVENT); > > auto* playout_event = rtclog_event.mutable_audio_playout_event(); >- playout_event->set_local_ssrc(event.ssrc_); >+ playout_event->set_local_ssrc(event.ssrc()); > > return Serialize(&rtclog_event); > } >@@ -407,15 +407,15 @@ std::string RtcEventLogEncoderLegacy::EncodeAudioPlayout( > std::string RtcEventLogEncoderLegacy::EncodeAudioReceiveStreamConfig( > const RtcEventAudioReceiveStreamConfig& event) { > rtclog::Event rtclog_event; >- rtclog_event.set_timestamp_us(event.timestamp_us_); >+ rtclog_event.set_timestamp_us(event.timestamp_us()); > rtclog_event.set_type(rtclog::Event::AUDIO_RECEIVER_CONFIG_EVENT); > > rtclog::AudioReceiveConfig* receiver_config = > rtclog_event.mutable_audio_receiver_config(); >- 
receiver_config->set_remote_ssrc(event.config_->remote_ssrc); >- receiver_config->set_local_ssrc(event.config_->local_ssrc); >+ receiver_config->set_remote_ssrc(event.config().remote_ssrc); >+ receiver_config->set_local_ssrc(event.config().local_ssrc); > >- for (const auto& e : event.config_->rtp_extensions) { >+ for (const auto& e : event.config().rtp_extensions) { > rtclog::RtpHeaderExtension* extension = > receiver_config->add_header_extensions(); > extension->set_name(e.uri); >@@ -428,15 +428,15 @@ std::string RtcEventLogEncoderLegacy::EncodeAudioReceiveStreamConfig( > std::string RtcEventLogEncoderLegacy::EncodeAudioSendStreamConfig( > const RtcEventAudioSendStreamConfig& event) { > rtclog::Event rtclog_event; >- rtclog_event.set_timestamp_us(event.timestamp_us_); >+ rtclog_event.set_timestamp_us(event.timestamp_us()); > rtclog_event.set_type(rtclog::Event::AUDIO_SENDER_CONFIG_EVENT); > > rtclog::AudioSendConfig* sender_config = > rtclog_event.mutable_audio_sender_config(); > >- sender_config->set_ssrc(event.config_->local_ssrc); >+ sender_config->set_ssrc(event.config().local_ssrc); > >- for (const auto& e : event.config_->rtp_extensions) { >+ for (const auto& e : event.config().rtp_extensions) { > rtclog::RtpHeaderExtension* extension = > sender_config->add_header_extensions(); > extension->set_name(e.uri); >@@ -449,12 +449,12 @@ std::string RtcEventLogEncoderLegacy::EncodeAudioSendStreamConfig( > std::string RtcEventLogEncoderLegacy::EncodeBweUpdateDelayBased( > const RtcEventBweUpdateDelayBased& event) { > rtclog::Event rtclog_event; >- rtclog_event.set_timestamp_us(event.timestamp_us_); >+ rtclog_event.set_timestamp_us(event.timestamp_us()); > rtclog_event.set_type(rtclog::Event::DELAY_BASED_BWE_UPDATE); > > auto* bwe_event = rtclog_event.mutable_delay_based_bwe_update(); >- bwe_event->set_bitrate_bps(event.bitrate_bps_); >- bwe_event->set_detector_state(ConvertDetectorState(event.detector_state_)); >+ bwe_event->set_bitrate_bps(event.bitrate_bps()); >+ 
bwe_event->set_detector_state(ConvertDetectorState(event.detector_state())); > > return Serialize(&rtclog_event); > } >@@ -462,13 +462,13 @@ std::string RtcEventLogEncoderLegacy::EncodeBweUpdateDelayBased( > std::string RtcEventLogEncoderLegacy::EncodeBweUpdateLossBased( > const RtcEventBweUpdateLossBased& event) { > rtclog::Event rtclog_event; >- rtclog_event.set_timestamp_us(event.timestamp_us_); >+ rtclog_event.set_timestamp_us(event.timestamp_us()); > rtclog_event.set_type(rtclog::Event::LOSS_BASED_BWE_UPDATE); > > auto* bwe_event = rtclog_event.mutable_loss_based_bwe_update(); >- bwe_event->set_bitrate_bps(event.bitrate_bps_); >- bwe_event->set_fraction_loss(event.fraction_loss_); >- bwe_event->set_total_packets(event.total_packets_); >+ bwe_event->set_bitrate_bps(event.bitrate_bps()); >+ bwe_event->set_fraction_loss(event.fraction_loss()); >+ bwe_event->set_total_packets(event.total_packets()); > > return Serialize(&rtclog_event); > } >@@ -476,15 +476,15 @@ std::string RtcEventLogEncoderLegacy::EncodeBweUpdateLossBased( > std::string RtcEventLogEncoderLegacy::EncodeIceCandidatePairConfig( > const RtcEventIceCandidatePairConfig& event) { > rtclog::Event encoded_rtc_event; >- encoded_rtc_event.set_timestamp_us(event.timestamp_us_); >+ encoded_rtc_event.set_timestamp_us(event.timestamp_us()); > encoded_rtc_event.set_type(rtclog::Event::ICE_CANDIDATE_PAIR_CONFIG); > > auto* encoded_ice_event = > encoded_rtc_event.mutable_ice_candidate_pair_config(); > encoded_ice_event->set_config_type( >- ConvertIceCandidatePairConfigType(event.type_)); >- encoded_ice_event->set_candidate_pair_id(event.candidate_pair_id_); >- const auto& desc = event.candidate_pair_desc_; >+ ConvertIceCandidatePairConfigType(event.type())); >+ encoded_ice_event->set_candidate_pair_id(event.candidate_pair_id()); >+ const auto& desc = event.candidate_pair_desc(); > encoded_ice_event->set_local_candidate_type( > ConvertIceCandidateType(desc.local_candidate_type)); > 
encoded_ice_event->set_local_relay_protocol( >@@ -505,28 +505,28 @@ std::string RtcEventLogEncoderLegacy::EncodeIceCandidatePairConfig( > std::string RtcEventLogEncoderLegacy::EncodeIceCandidatePairEvent( > const RtcEventIceCandidatePair& event) { > rtclog::Event encoded_rtc_event; >- encoded_rtc_event.set_timestamp_us(event.timestamp_us_); >+ encoded_rtc_event.set_timestamp_us(event.timestamp_us()); > encoded_rtc_event.set_type(rtclog::Event::ICE_CANDIDATE_PAIR_EVENT); > > auto* encoded_ice_event = > encoded_rtc_event.mutable_ice_candidate_pair_event(); > encoded_ice_event->set_event_type( >- ConvertIceCandidatePairEventType(event.type_)); >- encoded_ice_event->set_candidate_pair_id(event.candidate_pair_id_); >+ ConvertIceCandidatePairEventType(event.type())); >+ encoded_ice_event->set_candidate_pair_id(event.candidate_pair_id()); > return Serialize(&encoded_rtc_event); > } > > std::string RtcEventLogEncoderLegacy::EncodeProbeClusterCreated( > const RtcEventProbeClusterCreated& event) { > rtclog::Event rtclog_event; >- rtclog_event.set_timestamp_us(event.timestamp_us_); >+ rtclog_event.set_timestamp_us(event.timestamp_us()); > rtclog_event.set_type(rtclog::Event::BWE_PROBE_CLUSTER_CREATED_EVENT); > > auto* probe_cluster = rtclog_event.mutable_probe_cluster(); >- probe_cluster->set_id(event.id_); >- probe_cluster->set_bitrate_bps(event.bitrate_bps_); >- probe_cluster->set_min_packets(event.min_probes_); >- probe_cluster->set_min_bytes(event.min_bytes_); >+ probe_cluster->set_id(event.id()); >+ probe_cluster->set_bitrate_bps(event.bitrate_bps()); >+ probe_cluster->set_min_packets(event.min_probes()); >+ probe_cluster->set_min_bytes(event.min_bytes()); > > return Serialize(&rtclog_event); > } >@@ -534,12 +534,12 @@ std::string RtcEventLogEncoderLegacy::EncodeProbeClusterCreated( > std::string RtcEventLogEncoderLegacy::EncodeProbeResultFailure( > const RtcEventProbeResultFailure& event) { > rtclog::Event rtclog_event; >- 
rtclog_event.set_timestamp_us(event.timestamp_us_); >+ rtclog_event.set_timestamp_us(event.timestamp_us()); > rtclog_event.set_type(rtclog::Event::BWE_PROBE_RESULT_EVENT); > > auto* probe_result = rtclog_event.mutable_probe_result(); >- probe_result->set_id(event.id_); >- probe_result->set_result(ConvertProbeResultType(event.failure_reason_)); >+ probe_result->set_id(event.id()); >+ probe_result->set_result(ConvertProbeResultType(event.failure_reason())); > > return Serialize(&rtclog_event); > } >@@ -547,70 +547,71 @@ std::string RtcEventLogEncoderLegacy::EncodeProbeResultFailure( > std::string RtcEventLogEncoderLegacy::EncodeProbeResultSuccess( > const RtcEventProbeResultSuccess& event) { > rtclog::Event rtclog_event; >- rtclog_event.set_timestamp_us(event.timestamp_us_); >+ rtclog_event.set_timestamp_us(event.timestamp_us()); > rtclog_event.set_type(rtclog::Event::BWE_PROBE_RESULT_EVENT); > > auto* probe_result = rtclog_event.mutable_probe_result(); >- probe_result->set_id(event.id_); >+ probe_result->set_id(event.id()); > probe_result->set_result(rtclog::BweProbeResult::SUCCESS); >- probe_result->set_bitrate_bps(event.bitrate_bps_); >+ probe_result->set_bitrate_bps(event.bitrate_bps()); > > return Serialize(&rtclog_event); > } > > std::string RtcEventLogEncoderLegacy::EncodeRtcpPacketIncoming( > const RtcEventRtcpPacketIncoming& event) { >- return EncodeRtcpPacket(event.timestamp_us_, event.packet_, true); >+ return EncodeRtcpPacket(event.timestamp_us(), event.packet(), true); > } > > std::string RtcEventLogEncoderLegacy::EncodeRtcpPacketOutgoing( > const RtcEventRtcpPacketOutgoing& event) { >- return EncodeRtcpPacket(event.timestamp_us_, event.packet_, false); >+ return EncodeRtcpPacket(event.timestamp_us(), event.packet(), false); > } > > std::string RtcEventLogEncoderLegacy::EncodeRtpPacketIncoming( > const RtcEventRtpPacketIncoming& event) { >- return EncodeRtpPacket(event.timestamp_us_, event.header_, >- event.packet_length_, PacedPacketInfo::kNotAProbe, >+ 
return EncodeRtpPacket(event.timestamp_us(), event.header(), >+ event.packet_length(), PacedPacketInfo::kNotAProbe, > true); > } > > std::string RtcEventLogEncoderLegacy::EncodeRtpPacketOutgoing( > const RtcEventRtpPacketOutgoing& event) { >- return EncodeRtpPacket(event.timestamp_us_, event.header_, >- event.packet_length_, event.probe_cluster_id_, false); >+ return EncodeRtpPacket(event.timestamp_us(), event.header(), >+ event.packet_length(), event.probe_cluster_id(), >+ false); > } > > std::string RtcEventLogEncoderLegacy::EncodeVideoReceiveStreamConfig( > const RtcEventVideoReceiveStreamConfig& event) { > rtclog::Event rtclog_event; >- rtclog_event.set_timestamp_us(event.timestamp_us_); >+ rtclog_event.set_timestamp_us(event.timestamp_us()); > rtclog_event.set_type(rtclog::Event::VIDEO_RECEIVER_CONFIG_EVENT); > > rtclog::VideoReceiveConfig* receiver_config = > rtclog_event.mutable_video_receiver_config(); >- receiver_config->set_remote_ssrc(event.config_->remote_ssrc); >- receiver_config->set_local_ssrc(event.config_->local_ssrc); >+ receiver_config->set_remote_ssrc(event.config().remote_ssrc); >+ receiver_config->set_local_ssrc(event.config().local_ssrc); > > // TODO(perkj): Add field for rsid. 
>- receiver_config->set_rtcp_mode(ConvertRtcpMode(event.config_->rtcp_mode)); >- receiver_config->set_remb(event.config_->remb); >+ receiver_config->set_rtcp_mode(ConvertRtcpMode(event.config().rtcp_mode)); >+ receiver_config->set_remb(event.config().remb); > >- for (const auto& e : event.config_->rtp_extensions) { >+ for (const auto& e : event.config().rtp_extensions) { > rtclog::RtpHeaderExtension* extension = > receiver_config->add_header_extensions(); > extension->set_name(e.uri); > extension->set_id(e.id); > } > >- for (const auto& d : event.config_->codecs) { >+ for (const auto& d : event.config().codecs) { > rtclog::DecoderConfig* decoder = receiver_config->add_decoders(); > decoder->set_name(d.payload_name); > decoder->set_payload_type(d.payload_type); > if (d.rtx_payload_type != 0) { > rtclog::RtxMap* rtx = receiver_config->add_rtx_map(); > rtx->set_payload_type(d.payload_type); >- rtx->mutable_config()->set_rtx_ssrc(event.config_->rtx_ssrc); >+ rtx->mutable_config()->set_rtx_ssrc(event.config().rtx_ssrc); > rtx->mutable_config()->set_rtx_payload_type(d.rtx_payload_type); > } > } >@@ -621,19 +622,19 @@ std::string RtcEventLogEncoderLegacy::EncodeVideoReceiveStreamConfig( > std::string RtcEventLogEncoderLegacy::EncodeVideoSendStreamConfig( > const RtcEventVideoSendStreamConfig& event) { > rtclog::Event rtclog_event; >- rtclog_event.set_timestamp_us(event.timestamp_us_); >+ rtclog_event.set_timestamp_us(event.timestamp_us()); > rtclog_event.set_type(rtclog::Event::VIDEO_SENDER_CONFIG_EVENT); > > rtclog::VideoSendConfig* sender_config = > rtclog_event.mutable_video_sender_config(); > > // TODO(perkj): rtclog::VideoSendConfig should only contain one SSRC. 
>- sender_config->add_ssrcs(event.config_->local_ssrc); >- if (event.config_->rtx_ssrc != 0) { >- sender_config->add_rtx_ssrcs(event.config_->rtx_ssrc); >+ sender_config->add_ssrcs(event.config().local_ssrc); >+ if (event.config().rtx_ssrc != 0) { >+ sender_config->add_rtx_ssrcs(event.config().rtx_ssrc); > } > >- for (const auto& e : event.config_->rtp_extensions) { >+ for (const auto& e : event.config().rtp_extensions) { > rtclog::RtpHeaderExtension* extension = > sender_config->add_header_extensions(); > extension->set_name(e.uri); >@@ -642,13 +643,13 @@ std::string RtcEventLogEncoderLegacy::EncodeVideoSendStreamConfig( > > // TODO(perkj): rtclog::VideoSendConfig should contain many possible codec > // configurations. >- for (const auto& codec : event.config_->codecs) { >+ for (const auto& codec : event.config().codecs) { > sender_config->set_rtx_payload_type(codec.rtx_payload_type); > rtclog::EncoderConfig* encoder = sender_config->mutable_encoder(); > encoder->set_name(codec.payload_name); > encoder->set_payload_type(codec.payload_type); > >- if (event.config_->codecs.size() > 1) { >+ if (event.config().codecs.size() > 1) { > RTC_LOG(WARNING) > << "LogVideoSendStreamConfig currently only supports one " > << "codec. 
Logging codec :" << codec.payload_name; >@@ -756,5 +757,3 @@ std::string RtcEventLogEncoderLegacy::Serialize(rtclog::Event* event) { > } > > } // namespace webrtc >- >-#endif // ENABLE_RTC_EVENT_LOG >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_legacy.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_legacy.h >index 87db039e3aff3f16eb9caeac5722e11567da55d0..3105dc1e68c6b938ea890c7d261b9f2766702bf7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_legacy.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_legacy.h >@@ -18,8 +18,6 @@ > #include "logging/rtc_event_log/encoder/rtc_event_log_encoder.h" > #include "rtc_base/buffer.h" > >-#if defined(ENABLE_RTC_EVENT_LOG) >- > namespace webrtc { > > namespace rtclog { >@@ -52,7 +50,8 @@ class RtcEventLogEncoderLegacy final : public RtcEventLogEncoder { > public: > ~RtcEventLogEncoderLegacy() override = default; > >- std::string EncodeLogStart(int64_t timestamp_us) override; >+ std::string EncodeLogStart(int64_t timestamp_us, >+ int64_t utc_time_us) override; > std::string EncodeLogEnd(int64_t timestamp_us) override; > > std::string EncodeBatch( >@@ -105,6 +104,4 @@ class RtcEventLogEncoderLegacy final : public RtcEventLogEncoder { > > } // namespace webrtc > >-#endif // ENABLE_RTC_EVENT_LOG >- > #endif // LOGGING_RTC_EVENT_LOG_ENCODER_RTC_EVENT_LOG_ENCODER_LEGACY_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_new_format.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_new_format.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..79dc379f48b41a460d5f462172c214f1227598c2 >--- /dev/null >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_new_format.cc >@@ -0,0 +1,1336 @@ >+/* >+ * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "logging/rtc_event_log/encoder/rtc_event_log_encoder_new_format.h" >+ >+#include "absl/types/optional.h" >+#include "api/array_view.h" >+#include "logging/rtc_event_log/encoder/blob_encoding.h" >+#include "logging/rtc_event_log/encoder/delta_encoding.h" >+#include "logging/rtc_event_log/encoder/rtc_event_log_encoder_common.h" >+#include "logging/rtc_event_log/events/rtc_event_alr_state.h" >+#include "logging/rtc_event_log/events/rtc_event_audio_network_adaptation.h" >+#include "logging/rtc_event_log/events/rtc_event_audio_playout.h" >+#include "logging/rtc_event_log/events/rtc_event_audio_receive_stream_config.h" >+#include "logging/rtc_event_log/events/rtc_event_audio_send_stream_config.h" >+#include "logging/rtc_event_log/events/rtc_event_bwe_update_delay_based.h" >+#include "logging/rtc_event_log/events/rtc_event_bwe_update_loss_based.h" >+#include "logging/rtc_event_log/events/rtc_event_ice_candidate_pair.h" >+#include "logging/rtc_event_log/events/rtc_event_ice_candidate_pair_config.h" >+#include "logging/rtc_event_log/events/rtc_event_probe_cluster_created.h" >+#include "logging/rtc_event_log/events/rtc_event_probe_result_failure.h" >+#include "logging/rtc_event_log/events/rtc_event_probe_result_success.h" >+#include "logging/rtc_event_log/events/rtc_event_rtcp_packet_incoming.h" >+#include "logging/rtc_event_log/events/rtc_event_rtcp_packet_outgoing.h" >+#include 
"logging/rtc_event_log/events/rtc_event_rtp_packet_incoming.h" >+#include "logging/rtc_event_log/events/rtc_event_rtp_packet_outgoing.h" >+#include "logging/rtc_event_log/events/rtc_event_video_receive_stream_config.h" >+#include "logging/rtc_event_log/events/rtc_event_video_send_stream_config.h" >+#include "logging/rtc_event_log/rtc_stream_config.h" >+#include "modules/audio_coding/audio_network_adaptor/include/audio_network_adaptor_config.h" >+#include "modules/remote_bitrate_estimator/include/bwe_defines.h" >+#include "modules/rtp_rtcp/include/rtp_cvo.h" >+#include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" >+#include "modules/rtp_rtcp/source/rtcp_packet/app.h" >+#include "modules/rtp_rtcp/source/rtcp_packet/bye.h" >+#include "modules/rtp_rtcp/source/rtcp_packet/common_header.h" >+#include "modules/rtp_rtcp/source/rtcp_packet/extended_jitter_report.h" >+#include "modules/rtp_rtcp/source/rtcp_packet/extended_reports.h" >+#include "modules/rtp_rtcp/source/rtcp_packet/psfb.h" >+#include "modules/rtp_rtcp/source/rtcp_packet/receiver_report.h" >+#include "modules/rtp_rtcp/source/rtcp_packet/rtpfb.h" >+#include "modules/rtp_rtcp/source/rtcp_packet/sdes.h" >+#include "modules/rtp_rtcp/source/rtcp_packet/sender_report.h" >+#include "modules/rtp_rtcp/source/rtp_header_extensions.h" >+#include "modules/rtp_rtcp/source/rtp_packet.h" >+#include "rtc_base/checks.h" >+#include "rtc_base/ignore_wundef.h" >+#include "rtc_base/logging.h" >+ >+// *.pb.h files are generated at build-time by the protobuf compiler. 
>+RTC_PUSH_IGNORING_WUNDEF() >+#ifdef WEBRTC_ANDROID_PLATFORM_BUILD >+#include "external/webrtc/webrtc/logging/rtc_event_log/rtc_event_log2.pb.h" >+#else >+#include "logging/rtc_event_log/rtc_event_log2.pb.h" >+#endif >+RTC_POP_IGNORING_WUNDEF() >+ >+using webrtc_event_logging::ToUnsigned; >+ >+namespace webrtc { >+ >+namespace { >+rtclog2::DelayBasedBweUpdates::DetectorState ConvertToProtoFormat( >+ BandwidthUsage state) { >+ switch (state) { >+ case BandwidthUsage::kBwNormal: >+ return rtclog2::DelayBasedBweUpdates::BWE_NORMAL; >+ case BandwidthUsage::kBwUnderusing: >+ return rtclog2::DelayBasedBweUpdates::BWE_UNDERUSING; >+ case BandwidthUsage::kBwOverusing: >+ return rtclog2::DelayBasedBweUpdates::BWE_OVERUSING; >+ case BandwidthUsage::kLast: >+ RTC_NOTREACHED(); >+ } >+ RTC_NOTREACHED(); >+ return rtclog2::DelayBasedBweUpdates::BWE_UNKNOWN_STATE; >+} >+ >+rtclog2::BweProbeResultFailure::FailureReason ConvertToProtoFormat( >+ ProbeFailureReason failure_reason) { >+ switch (failure_reason) { >+ case ProbeFailureReason::kInvalidSendReceiveInterval: >+ return rtclog2::BweProbeResultFailure::INVALID_SEND_RECEIVE_INTERVAL; >+ case ProbeFailureReason::kInvalidSendReceiveRatio: >+ return rtclog2::BweProbeResultFailure::INVALID_SEND_RECEIVE_RATIO; >+ case ProbeFailureReason::kTimeout: >+ return rtclog2::BweProbeResultFailure::TIMEOUT; >+ case ProbeFailureReason::kLast: >+ RTC_NOTREACHED(); >+ } >+ RTC_NOTREACHED(); >+ return rtclog2::BweProbeResultFailure::UNKNOWN; >+} >+ >+// Returns true if there are recognized extensions that we should log >+// and false if there are no extensions or all extensions are types we don't >+// log. The protobuf representation of the header configs is written to >+// |proto_config|. 
>+bool ConvertToProtoFormat(const std::vector<RtpExtension>& extensions, >+ rtclog2::RtpHeaderExtensionConfig* proto_config) { >+ size_t unknown_extensions = 0; >+ for (auto& extension : extensions) { >+ if (extension.uri == RtpExtension::kAudioLevelUri) { >+ proto_config->set_audio_level_id(extension.id); >+ } else if (extension.uri == RtpExtension::kTimestampOffsetUri) { >+ proto_config->set_transmission_time_offset_id(extension.id); >+ } else if (extension.uri == RtpExtension::kAbsSendTimeUri) { >+ proto_config->set_absolute_send_time_id(extension.id); >+ } else if (extension.uri == RtpExtension::kTransportSequenceNumberUri) { >+ proto_config->set_transport_sequence_number_id(extension.id); >+ } else if (extension.uri == RtpExtension::kVideoRotationUri) { >+ proto_config->set_video_rotation_id(extension.id); >+ } else { >+ ++unknown_extensions; >+ } >+ } >+ return unknown_extensions < extensions.size(); >+} >+ >+rtclog2::IceCandidatePairConfig::IceCandidatePairConfigType >+ConvertToProtoFormat(IceCandidatePairConfigType type) { >+ switch (type) { >+ case IceCandidatePairConfigType::kAdded: >+ return rtclog2::IceCandidatePairConfig::ADDED; >+ case IceCandidatePairConfigType::kUpdated: >+ return rtclog2::IceCandidatePairConfig::UPDATED; >+ case IceCandidatePairConfigType::kDestroyed: >+ return rtclog2::IceCandidatePairConfig::DESTROYED; >+ case IceCandidatePairConfigType::kSelected: >+ return rtclog2::IceCandidatePairConfig::SELECTED; >+ case IceCandidatePairConfigType::kNumValues: >+ RTC_NOTREACHED(); >+ } >+ RTC_NOTREACHED(); >+ return rtclog2::IceCandidatePairConfig::UNKNOWN_CONFIG_TYPE; >+} >+ >+rtclog2::IceCandidatePairConfig::IceCandidateType ConvertToProtoFormat( >+ IceCandidateType type) { >+ switch (type) { >+ case IceCandidateType::kUnknown: >+ return rtclog2::IceCandidatePairConfig::UNKNOWN_CANDIDATE_TYPE; >+ case IceCandidateType::kLocal: >+ return rtclog2::IceCandidatePairConfig::LOCAL; >+ case IceCandidateType::kStun: >+ return 
rtclog2::IceCandidatePairConfig::STUN; >+ case IceCandidateType::kPrflx: >+ return rtclog2::IceCandidatePairConfig::PRFLX; >+ case IceCandidateType::kRelay: >+ return rtclog2::IceCandidatePairConfig::RELAY; >+ case IceCandidateType::kNumValues: >+ RTC_NOTREACHED(); >+ } >+ RTC_NOTREACHED(); >+ return rtclog2::IceCandidatePairConfig::UNKNOWN_CANDIDATE_TYPE; >+} >+ >+rtclog2::IceCandidatePairConfig::Protocol ConvertToProtoFormat( >+ IceCandidatePairProtocol protocol) { >+ switch (protocol) { >+ case IceCandidatePairProtocol::kUnknown: >+ return rtclog2::IceCandidatePairConfig::UNKNOWN_PROTOCOL; >+ case IceCandidatePairProtocol::kUdp: >+ return rtclog2::IceCandidatePairConfig::UDP; >+ case IceCandidatePairProtocol::kTcp: >+ return rtclog2::IceCandidatePairConfig::TCP; >+ case IceCandidatePairProtocol::kSsltcp: >+ return rtclog2::IceCandidatePairConfig::SSLTCP; >+ case IceCandidatePairProtocol::kTls: >+ return rtclog2::IceCandidatePairConfig::TLS; >+ case IceCandidatePairProtocol::kNumValues: >+ RTC_NOTREACHED(); >+ } >+ RTC_NOTREACHED(); >+ return rtclog2::IceCandidatePairConfig::UNKNOWN_PROTOCOL; >+} >+ >+rtclog2::IceCandidatePairConfig::AddressFamily ConvertToProtoFormat( >+ IceCandidatePairAddressFamily address_family) { >+ switch (address_family) { >+ case IceCandidatePairAddressFamily::kUnknown: >+ return rtclog2::IceCandidatePairConfig::UNKNOWN_ADDRESS_FAMILY; >+ case IceCandidatePairAddressFamily::kIpv4: >+ return rtclog2::IceCandidatePairConfig::IPV4; >+ case IceCandidatePairAddressFamily::kIpv6: >+ return rtclog2::IceCandidatePairConfig::IPV6; >+ case IceCandidatePairAddressFamily::kNumValues: >+ RTC_NOTREACHED(); >+ } >+ RTC_NOTREACHED(); >+ return rtclog2::IceCandidatePairConfig::UNKNOWN_ADDRESS_FAMILY; >+} >+ >+rtclog2::IceCandidatePairConfig::NetworkType ConvertToProtoFormat( >+ IceCandidateNetworkType network_type) { >+ switch (network_type) { >+ case IceCandidateNetworkType::kUnknown: >+ return rtclog2::IceCandidatePairConfig::UNKNOWN_NETWORK_TYPE; >+ 
case IceCandidateNetworkType::kEthernet: >+ return rtclog2::IceCandidatePairConfig::ETHERNET; >+ case IceCandidateNetworkType::kLoopback: >+ return rtclog2::IceCandidatePairConfig::LOOPBACK; >+ case IceCandidateNetworkType::kWifi: >+ return rtclog2::IceCandidatePairConfig::WIFI; >+ case IceCandidateNetworkType::kVpn: >+ return rtclog2::IceCandidatePairConfig::VPN; >+ case IceCandidateNetworkType::kCellular: >+ return rtclog2::IceCandidatePairConfig::CELLULAR; >+ case IceCandidateNetworkType::kNumValues: >+ RTC_NOTREACHED(); >+ } >+ RTC_NOTREACHED(); >+ return rtclog2::IceCandidatePairConfig::UNKNOWN_NETWORK_TYPE; >+} >+ >+rtclog2::IceCandidatePairEvent::IceCandidatePairEventType ConvertToProtoFormat( >+ IceCandidatePairEventType type) { >+ switch (type) { >+ case IceCandidatePairEventType::kCheckSent: >+ return rtclog2::IceCandidatePairEvent::CHECK_SENT; >+ case IceCandidatePairEventType::kCheckReceived: >+ return rtclog2::IceCandidatePairEvent::CHECK_RECEIVED; >+ case IceCandidatePairEventType::kCheckResponseSent: >+ return rtclog2::IceCandidatePairEvent::CHECK_RESPONSE_SENT; >+ case IceCandidatePairEventType::kCheckResponseReceived: >+ return rtclog2::IceCandidatePairEvent::CHECK_RESPONSE_RECEIVED; >+ case IceCandidatePairEventType::kNumValues: >+ RTC_NOTREACHED(); >+ } >+ RTC_NOTREACHED(); >+ return rtclog2::IceCandidatePairEvent::UNKNOWN_CHECK_TYPE; >+} >+ >+// Copies all RTCP blocks except APP, SDES and unknown from |packet| to >+// |buffer|. |buffer| must have space for |IP_PACKET_SIZE| bytes. |packet| must >+// be at most |IP_PACKET_SIZE| bytes long. 
>+size_t RemoveNonWhitelistedRtcpBlocks(const rtc::Buffer& packet, >+ uint8_t* buffer) { >+ RTC_DCHECK(packet.size() <= IP_PACKET_SIZE); >+ RTC_DCHECK(buffer != nullptr); >+ rtcp::CommonHeader header; >+ const uint8_t* block_begin = packet.data(); >+ const uint8_t* packet_end = packet.data() + packet.size(); >+ size_t buffer_length = 0; >+ while (block_begin < packet_end) { >+ if (!header.Parse(block_begin, packet_end - block_begin)) { >+ break; // Incorrect message header. >+ } >+ const uint8_t* next_block = header.NextPacket(); >+ RTC_DCHECK_GT(next_block, block_begin); >+ RTC_DCHECK_LE(next_block, packet_end); >+ size_t block_size = next_block - block_begin; >+ switch (header.type()) { >+ case rtcp::Bye::kPacketType: >+ case rtcp::ExtendedJitterReport::kPacketType: >+ case rtcp::ExtendedReports::kPacketType: >+ case rtcp::Psfb::kPacketType: >+ case rtcp::ReceiverReport::kPacketType: >+ case rtcp::Rtpfb::kPacketType: >+ case rtcp::SenderReport::kPacketType: >+ // We log sender reports, receiver reports, bye messages >+ // inter-arrival jitter, third-party loss reports, payload-specific >+ // feedback and extended reports. >+ // TODO(terelius): As an optimization, don't copy anything if all blocks >+ // in the packet are whitelisted types. >+ memcpy(buffer + buffer_length, block_begin, block_size); >+ buffer_length += block_size; >+ break; >+ case rtcp::App::kPacketType: >+ case rtcp::Sdes::kPacketType: >+ default: >+ // We don't log sender descriptions, application defined messages >+ // or message blocks of unknown type. 
>+ break; >+ } >+ >+ block_begin += block_size; >+ } >+ return buffer_length; >+} >+ >+template <typename EventType, typename ProtoType> >+void EncodeRtcpPacket(rtc::ArrayView<const EventType*> batch, >+ ProtoType* proto_batch) { >+ if (batch.size() == 0) { >+ return; >+ } >+ >+ // Base event >+ const EventType* const base_event = batch[0]; >+ proto_batch->set_timestamp_ms(base_event->timestamp_ms()); >+ { >+ uint8_t buffer[IP_PACKET_SIZE]; >+ size_t buffer_length = >+ RemoveNonWhitelistedRtcpBlocks(base_event->packet(), buffer); >+ proto_batch->set_raw_packet(buffer, buffer_length); >+ } >+ >+ if (batch.size() == 1) { >+ return; >+ } >+ >+ // Delta encoding >+ proto_batch->set_number_of_deltas(batch.size() - 1); >+ std::vector<absl::optional<uint64_t>> values(batch.size() - 1); >+ std::string encoded_deltas; >+ >+ // timestamp_ms >+ for (size_t i = 0; i < values.size(); ++i) { >+ const EventType* event = batch[i + 1]; >+ values[i] = ToUnsigned(event->timestamp_ms()); >+ } >+ encoded_deltas = EncodeDeltas(ToUnsigned(base_event->timestamp_ms()), values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_timestamp_ms_deltas(encoded_deltas); >+ } >+ >+ // raw_packet >+ std::vector<std::string> scrubed_packets(batch.size() - 1); >+ for (size_t i = 0; i < scrubed_packets.size(); ++i) { >+ const EventType* event = batch[i + 1]; >+ scrubed_packets[i].resize(event->packet().size()); >+ static_assert(sizeof(std::string::value_type) == sizeof(uint8_t), ""); >+ const size_t buffer_length = RemoveNonWhitelistedRtcpBlocks( >+ event->packet(), reinterpret_cast<uint8_t*>(&scrubed_packets[i][0])); >+ if (buffer_length < event->packet().size()) { >+ scrubed_packets[i].resize(buffer_length); >+ } >+ } >+ proto_batch->set_raw_packet_blobs(EncodeBlobs(scrubed_packets)); >+} >+ >+template <typename EventType, typename ProtoType> >+void EncodeRtpPacket(const std::vector<const EventType*>& batch, >+ ProtoType* proto_batch) { >+ if (batch.size() == 0) { >+ return; >+ } >+ >+ // Base 
event >+ const EventType* const base_event = batch[0]; >+ proto_batch->set_timestamp_ms(base_event->timestamp_ms()); >+ proto_batch->set_marker(base_event->header().Marker()); >+ // TODO(terelius): Is payload type needed? >+ proto_batch->set_payload_type(base_event->header().PayloadType()); >+ proto_batch->set_sequence_number(base_event->header().SequenceNumber()); >+ proto_batch->set_rtp_timestamp(base_event->header().Timestamp()); >+ proto_batch->set_ssrc(base_event->header().Ssrc()); >+ proto_batch->set_payload_size(base_event->payload_length()); >+ proto_batch->set_header_size(base_event->header_length()); >+ proto_batch->set_padding_size(base_event->padding_length()); >+ >+ // Add header extensions (base event). >+ absl::optional<uint64_t> base_transport_sequence_number; >+ { >+ uint16_t seqnum; >+ if (base_event->header().template GetExtension<TransportSequenceNumber>( >+ &seqnum)) { >+ proto_batch->set_transport_sequence_number(seqnum); >+ base_transport_sequence_number = seqnum; >+ } >+ } >+ >+ absl::optional<uint64_t> unsigned_base_transmission_time_offset; >+ { >+ int32_t offset; >+ if (base_event->header().template GetExtension<TransmissionOffset>( >+ &offset)) { >+ proto_batch->set_transmission_time_offset(offset); >+ unsigned_base_transmission_time_offset = ToUnsigned(offset); >+ } >+ } >+ >+ absl::optional<uint64_t> base_absolute_send_time; >+ { >+ uint32_t sendtime; >+ if (base_event->header().template GetExtension<AbsoluteSendTime>( >+ &sendtime)) { >+ proto_batch->set_absolute_send_time(sendtime); >+ base_absolute_send_time = sendtime; >+ } >+ } >+ >+ absl::optional<uint64_t> base_video_rotation; >+ { >+ VideoRotation video_rotation; >+ if (base_event->header().template GetExtension<VideoOrientation>( >+ &video_rotation)) { >+ proto_batch->set_video_rotation( >+ ConvertVideoRotationToCVOByte(video_rotation)); >+ base_video_rotation = ConvertVideoRotationToCVOByte(video_rotation); >+ } >+ } >+ >+ absl::optional<uint64_t> base_audio_level; >+ 
absl::optional<uint64_t> base_voice_activity; >+ { >+ bool voice_activity; >+ uint8_t audio_level; >+ if (base_event->header().template GetExtension<AudioLevel>(&voice_activity, >+ &audio_level)) { >+ RTC_DCHECK_LE(audio_level, 0x7Fu); >+ base_audio_level = audio_level; >+ proto_batch->set_audio_level(audio_level); >+ >+ base_voice_activity = voice_activity; >+ proto_batch->set_voice_activity(voice_activity); >+ } >+ } >+ >+ if (batch.size() == 1) { >+ return; >+ } >+ >+ // Delta encoding >+ proto_batch->set_number_of_deltas(batch.size() - 1); >+ std::vector<absl::optional<uint64_t>> values(batch.size() - 1); >+ std::string encoded_deltas; >+ >+ // timestamp_ms (event) >+ for (size_t i = 0; i < values.size(); ++i) { >+ const EventType* event = batch[i + 1]; >+ values[i] = ToUnsigned(event->timestamp_ms()); >+ } >+ encoded_deltas = EncodeDeltas(ToUnsigned(base_event->timestamp_ms()), values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_timestamp_ms_deltas(encoded_deltas); >+ } >+ >+ // marker (RTP base) >+ for (size_t i = 0; i < values.size(); ++i) { >+ const EventType* event = batch[i + 1]; >+ values[i] = event->header().Marker(); >+ } >+ encoded_deltas = EncodeDeltas(base_event->header().Marker(), values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_marker_deltas(encoded_deltas); >+ } >+ >+ // payload_type (RTP base) >+ for (size_t i = 0; i < values.size(); ++i) { >+ const EventType* event = batch[i + 1]; >+ values[i] = event->header().PayloadType(); >+ } >+ encoded_deltas = EncodeDeltas(base_event->header().PayloadType(), values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_payload_type_deltas(encoded_deltas); >+ } >+ >+ // sequence_number (RTP base) >+ for (size_t i = 0; i < values.size(); ++i) { >+ const EventType* event = batch[i + 1]; >+ values[i] = event->header().SequenceNumber(); >+ } >+ encoded_deltas = EncodeDeltas(base_event->header().SequenceNumber(), values); >+ if (!encoded_deltas.empty()) { >+ 
proto_batch->set_sequence_number_deltas(encoded_deltas); >+ } >+ >+ // rtp_timestamp (RTP base) >+ for (size_t i = 0; i < values.size(); ++i) { >+ const EventType* event = batch[i + 1]; >+ values[i] = event->header().Timestamp(); >+ } >+ encoded_deltas = EncodeDeltas(base_event->header().Timestamp(), values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_rtp_timestamp_deltas(encoded_deltas); >+ } >+ >+ // ssrc (RTP base) >+ for (size_t i = 0; i < values.size(); ++i) { >+ const EventType* event = batch[i + 1]; >+ values[i] = event->header().Ssrc(); >+ } >+ encoded_deltas = EncodeDeltas(base_event->header().Ssrc(), values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_ssrc_deltas(encoded_deltas); >+ } >+ >+ // payload_size (RTP base) >+ for (size_t i = 0; i < values.size(); ++i) { >+ const EventType* event = batch[i + 1]; >+ values[i] = event->payload_length(); >+ } >+ encoded_deltas = EncodeDeltas(base_event->payload_length(), values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_payload_size_deltas(encoded_deltas); >+ } >+ >+ // header_size (RTP base) >+ for (size_t i = 0; i < values.size(); ++i) { >+ const EventType* event = batch[i + 1]; >+ values[i] = event->header_length(); >+ } >+ encoded_deltas = EncodeDeltas(base_event->header_length(), values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_header_size_deltas(encoded_deltas); >+ } >+ >+ // padding_size (RTP base) >+ for (size_t i = 0; i < values.size(); ++i) { >+ const EventType* event = batch[i + 1]; >+ values[i] = event->padding_length(); >+ } >+ encoded_deltas = EncodeDeltas(base_event->padding_length(), values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_padding_size_deltas(encoded_deltas); >+ } >+ >+ // transport_sequence_number (RTP extension) >+ for (size_t i = 0; i < values.size(); ++i) { >+ const EventType* event = batch[i + 1]; >+ uint16_t seqnum; >+ if (event->header().template GetExtension<TransportSequenceNumber>( >+ &seqnum)) { >+ values[i] = seqnum; 
>+ } else { >+ values[i].reset(); >+ } >+ } >+ encoded_deltas = EncodeDeltas(base_transport_sequence_number, values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_transport_sequence_number_deltas(encoded_deltas); >+ } >+ >+ // transmission_time_offset (RTP extension) >+ for (size_t i = 0; i < values.size(); ++i) { >+ const EventType* event = batch[i + 1]; >+ int32_t offset; >+ if (event->header().template GetExtension<TransmissionOffset>(&offset)) { >+ values[i] = ToUnsigned(offset); >+ } else { >+ values[i].reset(); >+ } >+ } >+ encoded_deltas = EncodeDeltas(unsigned_base_transmission_time_offset, values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_transmission_time_offset_deltas(encoded_deltas); >+ } >+ >+ // absolute_send_time (RTP extension) >+ for (size_t i = 0; i < values.size(); ++i) { >+ const EventType* event = batch[i + 1]; >+ uint32_t sendtime; >+ if (event->header().template GetExtension<AbsoluteSendTime>(&sendtime)) { >+ values[i] = sendtime; >+ } else { >+ values[i].reset(); >+ } >+ } >+ encoded_deltas = EncodeDeltas(base_absolute_send_time, values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_absolute_send_time_deltas(encoded_deltas); >+ } >+ >+ // video_rotation (RTP extension) >+ for (size_t i = 0; i < values.size(); ++i) { >+ const EventType* event = batch[i + 1]; >+ VideoRotation video_rotation; >+ if (event->header().template GetExtension<VideoOrientation>( >+ &video_rotation)) { >+ values[i] = ConvertVideoRotationToCVOByte(video_rotation); >+ } else { >+ values[i].reset(); >+ } >+ } >+ encoded_deltas = EncodeDeltas(base_video_rotation, values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_video_rotation_deltas(encoded_deltas); >+ } >+ >+ // audio_level (RTP extension) >+ for (size_t i = 0; i < values.size(); ++i) { >+ const EventType* event = batch[i + 1]; >+ bool voice_activity; >+ uint8_t audio_level; >+ if (event->header().template GetExtension<AudioLevel>(&voice_activity, >+ &audio_level)) { >+ 
RTC_DCHECK_LE(audio_level, 0x7Fu); >+ values[i] = audio_level; >+ } else { >+ values[i].reset(); >+ } >+ } >+ encoded_deltas = EncodeDeltas(base_audio_level, values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_audio_level_deltas(encoded_deltas); >+ } >+ >+ // voice_activity (RTP extension) >+ for (size_t i = 0; i < values.size(); ++i) { >+ const EventType* event = batch[i + 1]; >+ bool voice_activity; >+ uint8_t audio_level; >+ if (event->header().template GetExtension<AudioLevel>(&voice_activity, >+ &audio_level)) { >+ RTC_DCHECK_LE(audio_level, 0x7Fu); >+ values[i] = voice_activity; >+ } else { >+ values[i].reset(); >+ } >+ } >+ encoded_deltas = EncodeDeltas(base_voice_activity, values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_voice_activity_deltas(encoded_deltas); >+ } >+} >+} // namespace >+ >+std::string RtcEventLogEncoderNewFormat::EncodeLogStart(int64_t timestamp_us, >+ int64_t utc_time_us) { >+ rtclog2::EventStream event_stream; >+ rtclog2::BeginLogEvent* proto_batch = event_stream.add_begin_log_events(); >+ proto_batch->set_timestamp_ms(timestamp_us / 1000); >+ proto_batch->set_version(2); >+ proto_batch->set_utc_time_ms(utc_time_us / 1000); >+ return event_stream.SerializeAsString(); >+} >+ >+std::string RtcEventLogEncoderNewFormat::EncodeLogEnd(int64_t timestamp_us) { >+ rtclog2::EventStream event_stream; >+ rtclog2::EndLogEvent* proto_batch = event_stream.add_end_log_events(); >+ proto_batch->set_timestamp_ms(timestamp_us / 1000); >+ return event_stream.SerializeAsString(); >+} >+ >+std::string RtcEventLogEncoderNewFormat::EncodeBatch( >+ std::deque<std::unique_ptr<RtcEvent>>::const_iterator begin, >+ std::deque<std::unique_ptr<RtcEvent>>::const_iterator end) { >+ rtclog2::EventStream event_stream; >+ std::string encoded_output; >+ >+ { >+ std::vector<const RtcEventAlrState*> alr_state_events; >+ std::vector<const RtcEventAudioNetworkAdaptation*> >+ audio_network_adaptation_events; >+ std::vector<const RtcEventAudioPlayout*> 
audio_playout_events; >+ std::vector<const RtcEventAudioReceiveStreamConfig*> >+ audio_recv_stream_configs; >+ std::vector<const RtcEventAudioSendStreamConfig*> audio_send_stream_configs; >+ std::vector<const RtcEventBweUpdateDelayBased*> bwe_delay_based_updates; >+ std::vector<const RtcEventBweUpdateLossBased*> bwe_loss_based_updates; >+ std::vector<const RtcEventProbeClusterCreated*> >+ probe_cluster_created_events; >+ std::vector<const RtcEventProbeResultFailure*> probe_result_failure_events; >+ std::vector<const RtcEventProbeResultSuccess*> probe_result_success_events; >+ std::vector<const RtcEventRtcpPacketIncoming*> incoming_rtcp_packets; >+ std::vector<const RtcEventRtcpPacketOutgoing*> outgoing_rtcp_packets; >+ std::map<uint32_t /* SSRC */, std::vector<const RtcEventRtpPacketIncoming*>> >+ incoming_rtp_packets; >+ std::map<uint32_t /* SSRC */, std::vector<const RtcEventRtpPacketOutgoing*>> >+ outgoing_rtp_packets; >+ std::vector<const RtcEventVideoReceiveStreamConfig*> >+ video_recv_stream_configs; >+ std::vector<const RtcEventVideoSendStreamConfig*> video_send_stream_configs; >+ std::vector<const RtcEventIceCandidatePairConfig*> ice_candidate_configs; >+ std::vector<const RtcEventIceCandidatePair*> ice_candidate_events; >+ >+ for (auto it = begin; it != end; ++it) { >+ switch ((*it)->GetType()) { >+ case RtcEvent::Type::AlrStateEvent: { >+ auto* rtc_event = >+ static_cast<const RtcEventAlrState* const>(it->get()); >+ alr_state_events.push_back(rtc_event); >+ break; >+ } >+ case RtcEvent::Type::AudioNetworkAdaptation: { >+ auto* rtc_event = >+ static_cast<const RtcEventAudioNetworkAdaptation* const>( >+ it->get()); >+ audio_network_adaptation_events.push_back(rtc_event); >+ break; >+ } >+ case RtcEvent::Type::AudioPlayout: { >+ auto* rtc_event = >+ static_cast<const RtcEventAudioPlayout* const>(it->get()); >+ audio_playout_events.push_back(rtc_event); >+ break; >+ } >+ case RtcEvent::Type::AudioReceiveStreamConfig: { >+ auto* rtc_event = >+ 
static_cast<const RtcEventAudioReceiveStreamConfig* const>( >+ it->get()); >+ audio_recv_stream_configs.push_back(rtc_event); >+ break; >+ } >+ case RtcEvent::Type::AudioSendStreamConfig: { >+ auto* rtc_event = >+ static_cast<const RtcEventAudioSendStreamConfig* const>( >+ it->get()); >+ audio_send_stream_configs.push_back(rtc_event); >+ break; >+ } >+ case RtcEvent::Type::BweUpdateDelayBased: { >+ auto* rtc_event = >+ static_cast<const RtcEventBweUpdateDelayBased* const>(it->get()); >+ bwe_delay_based_updates.push_back(rtc_event); >+ break; >+ } >+ case RtcEvent::Type::BweUpdateLossBased: { >+ auto* rtc_event = >+ static_cast<const RtcEventBweUpdateLossBased* const>(it->get()); >+ bwe_loss_based_updates.push_back(rtc_event); >+ break; >+ } >+ case RtcEvent::Type::ProbeClusterCreated: { >+ auto* rtc_event = >+ static_cast<const RtcEventProbeClusterCreated* const>(it->get()); >+ probe_cluster_created_events.push_back(rtc_event); >+ break; >+ } >+ case RtcEvent::Type::ProbeResultFailure: { >+ auto* rtc_event = >+ static_cast<const RtcEventProbeResultFailure* const>(it->get()); >+ probe_result_failure_events.push_back(rtc_event); >+ break; >+ } >+ case RtcEvent::Type::ProbeResultSuccess: { >+ auto* rtc_event = >+ static_cast<const RtcEventProbeResultSuccess* const>(it->get()); >+ probe_result_success_events.push_back(rtc_event); >+ break; >+ } >+ case RtcEvent::Type::RtcpPacketIncoming: { >+ auto* rtc_event = >+ static_cast<const RtcEventRtcpPacketIncoming* const>(it->get()); >+ incoming_rtcp_packets.push_back(rtc_event); >+ break; >+ } >+ case RtcEvent::Type::RtcpPacketOutgoing: { >+ auto* rtc_event = >+ static_cast<const RtcEventRtcpPacketOutgoing* const>(it->get()); >+ outgoing_rtcp_packets.push_back(rtc_event); >+ break; >+ } >+ case RtcEvent::Type::RtpPacketIncoming: { >+ auto* rtc_event = >+ static_cast<const RtcEventRtpPacketIncoming* const>(it->get()); >+ auto& v = incoming_rtp_packets[rtc_event->header().Ssrc()]; >+ v.emplace_back(rtc_event); >+ break; >+ } 
>+ case RtcEvent::Type::RtpPacketOutgoing: { >+ auto* rtc_event = >+ static_cast<const RtcEventRtpPacketOutgoing* const>(it->get()); >+ auto& v = outgoing_rtp_packets[rtc_event->header().Ssrc()]; >+ v.emplace_back(rtc_event); >+ break; >+ } >+ case RtcEvent::Type::VideoReceiveStreamConfig: { >+ auto* rtc_event = >+ static_cast<const RtcEventVideoReceiveStreamConfig* const>( >+ it->get()); >+ video_recv_stream_configs.push_back(rtc_event); >+ break; >+ } >+ case RtcEvent::Type::VideoSendStreamConfig: { >+ auto* rtc_event = >+ static_cast<const RtcEventVideoSendStreamConfig* const>( >+ it->get()); >+ video_send_stream_configs.push_back(rtc_event); >+ break; >+ } >+ case RtcEvent::Type::IceCandidatePairConfig: { >+ auto* rtc_event = >+ static_cast<const RtcEventIceCandidatePairConfig* const>( >+ it->get()); >+ ice_candidate_configs.push_back(rtc_event); >+ break; >+ } >+ case RtcEvent::Type::IceCandidatePairEvent: { >+ auto* rtc_event = >+ static_cast<const RtcEventIceCandidatePair* const>(it->get()); >+ ice_candidate_events.push_back(rtc_event); >+ break; >+ } >+ } >+ } >+ >+ EncodeAlrState(alr_state_events, &event_stream); >+ EncodeAudioNetworkAdaptation(audio_network_adaptation_events, >+ &event_stream); >+ EncodeAudioPlayout(audio_playout_events, &event_stream); >+ EncodeAudioRecvStreamConfig(audio_recv_stream_configs, &event_stream); >+ EncodeAudioSendStreamConfig(audio_send_stream_configs, &event_stream); >+ EncodeBweUpdateDelayBased(bwe_delay_based_updates, &event_stream); >+ EncodeBweUpdateLossBased(bwe_loss_based_updates, &event_stream); >+ EncodeProbeClusterCreated(probe_cluster_created_events, &event_stream); >+ EncodeProbeResultFailure(probe_result_failure_events, &event_stream); >+ EncodeProbeResultSuccess(probe_result_success_events, &event_stream); >+ EncodeRtcpPacketIncoming(incoming_rtcp_packets, &event_stream); >+ EncodeRtcpPacketOutgoing(outgoing_rtcp_packets, &event_stream); >+ EncodeRtpPacketIncoming(incoming_rtp_packets, &event_stream); >+ 
EncodeRtpPacketOutgoing(outgoing_rtp_packets, &event_stream); >+ EncodeVideoRecvStreamConfig(video_recv_stream_configs, &event_stream); >+ EncodeVideoSendStreamConfig(video_send_stream_configs, &event_stream); >+ EncodeIceCandidatePairConfig(ice_candidate_configs, &event_stream); >+ EncodeIceCandidatePairEvent(ice_candidate_events, &event_stream); >+ } // Deallocate the temporary vectors. >+ >+ return event_stream.SerializeAsString(); >+} >+ >+void RtcEventLogEncoderNewFormat::EncodeAlrState( >+ rtc::ArrayView<const RtcEventAlrState*> batch, >+ rtclog2::EventStream* event_stream) { >+ for (const RtcEventAlrState* base_event : batch) { >+ rtclog2::AlrState* proto_batch = event_stream->add_alr_states(); >+ proto_batch->set_timestamp_ms(base_event->timestamp_ms()); >+ proto_batch->set_in_alr(base_event->in_alr()); >+ } >+ // TODO(terelius): Should we delta-compress this event type? >+} >+ >+void RtcEventLogEncoderNewFormat::EncodeAudioNetworkAdaptation( >+ rtc::ArrayView<const RtcEventAudioNetworkAdaptation*> batch, >+ rtclog2::EventStream* event_stream) { >+ if (batch.size() == 0) >+ return; >+ >+ // Base event >+ const RtcEventAudioNetworkAdaptation* const base_event = batch[0]; >+ rtclog2::AudioNetworkAdaptations* proto_batch = >+ event_stream->add_audio_network_adaptations(); >+ proto_batch->set_timestamp_ms(base_event->timestamp_ms()); >+ if (base_event->config().bitrate_bps.has_value()) >+ proto_batch->set_bitrate_bps(base_event->config().bitrate_bps.value()); >+ if (base_event->config().frame_length_ms.has_value()) { >+ proto_batch->set_frame_length_ms( >+ base_event->config().frame_length_ms.value()); >+ } >+ absl::optional<uint64_t> base_uplink_packet_loss_fraction; >+ if (base_event->config().uplink_packet_loss_fraction.has_value()) { >+ base_uplink_packet_loss_fraction = ConvertPacketLossFractionToProtoFormat( >+ base_event->config().uplink_packet_loss_fraction.value()); >+ proto_batch->set_uplink_packet_loss_fraction( >+ 
base_uplink_packet_loss_fraction.value()); >+ } >+ if (base_event->config().enable_fec.has_value()) >+ proto_batch->set_enable_fec(base_event->config().enable_fec.value()); >+ if (base_event->config().enable_dtx.has_value()) >+ proto_batch->set_enable_dtx(base_event->config().enable_dtx.value()); >+ // Note that |num_channels_deltas| encodes N as N-1, to keep deltas smaller, >+ // but there's no reason to do the same for the base event's value, since >+ // no bits will be spared. >+ if (base_event->config().num_channels.has_value()) >+ proto_batch->set_num_channels(base_event->config().num_channels.value()); >+ >+ if (batch.size() == 1) >+ return; >+ >+ // Delta encoding >+ proto_batch->set_number_of_deltas(batch.size() - 1); >+ std::vector<absl::optional<uint64_t>> values(batch.size() - 1); >+ std::string encoded_deltas; >+ >+ // timestamp_ms >+ for (size_t i = 0; i < values.size(); ++i) { >+ const RtcEventAudioNetworkAdaptation* event = batch[i + 1]; >+ values[i] = ToUnsigned(event->timestamp_ms()); >+ } >+ encoded_deltas = EncodeDeltas(ToUnsigned(base_event->timestamp_ms()), values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_timestamp_ms_deltas(encoded_deltas); >+ } >+ >+ // bitrate_bps >+ for (size_t i = 0; i < values.size(); ++i) { >+ const RtcEventAudioNetworkAdaptation* event = batch[i + 1]; >+ if (event->config().bitrate_bps.has_value()) { >+ values[i] = ToUnsigned(event->config().bitrate_bps.value()); >+ } else { >+ values[i].reset(); >+ } >+ } >+ const absl::optional<uint64_t> unsigned_base_bitrate_bps = >+ base_event->config().bitrate_bps.has_value() >+ ? 
ToUnsigned(base_event->config().bitrate_bps.value()) >+ : absl::optional<uint64_t>(); >+ encoded_deltas = EncodeDeltas(unsigned_base_bitrate_bps, values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_bitrate_bps_deltas(encoded_deltas); >+ } >+ >+ // frame_length_ms >+ for (size_t i = 0; i < values.size(); ++i) { >+ const RtcEventAudioNetworkAdaptation* event = batch[i + 1]; >+ if (event->config().frame_length_ms.has_value()) { >+ values[i] = ToUnsigned(event->config().frame_length_ms.value()); >+ } else { >+ values[i].reset(); >+ } >+ } >+ const absl::optional<uint64_t> unsigned_base_frame_length_ms = >+ base_event->config().frame_length_ms.has_value() >+ ? ToUnsigned(base_event->config().frame_length_ms.value()) >+ : absl::optional<uint64_t>(); >+ encoded_deltas = EncodeDeltas(unsigned_base_frame_length_ms, values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_frame_length_ms_deltas(encoded_deltas); >+ } >+ >+ // uplink_packet_loss_fraction >+ for (size_t i = 0; i < values.size(); ++i) { >+ const RtcEventAudioNetworkAdaptation* event = batch[i + 1]; >+ if (event->config().uplink_packet_loss_fraction.has_value()) { >+ values[i] = ConvertPacketLossFractionToProtoFormat( >+ event->config().uplink_packet_loss_fraction.value()); >+ } else { >+ values[i].reset(); >+ } >+ } >+ encoded_deltas = EncodeDeltas(base_uplink_packet_loss_fraction, values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_uplink_packet_loss_fraction_deltas(encoded_deltas); >+ } >+ >+ // enable_fec >+ for (size_t i = 0; i < values.size(); ++i) { >+ const RtcEventAudioNetworkAdaptation* event = batch[i + 1]; >+ values[i] = event->config().enable_fec; >+ } >+ encoded_deltas = EncodeDeltas(base_event->config().enable_fec, values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_enable_fec_deltas(encoded_deltas); >+ } >+ >+ // enable_dtx >+ for (size_t i = 0; i < values.size(); ++i) { >+ const RtcEventAudioNetworkAdaptation* event = batch[i + 1]; >+ values[i] = 
event->config().enable_dtx; >+ } >+ encoded_deltas = EncodeDeltas(base_event->config().enable_dtx, values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_enable_dtx_deltas(encoded_deltas); >+ } >+ >+ // num_channels >+ for (size_t i = 0; i < values.size(); ++i) { >+ const RtcEventAudioNetworkAdaptation* event = batch[i + 1]; >+ const absl::optional<size_t> num_channels = event->config().num_channels; >+ if (num_channels.has_value()) { >+ // Since the number of channels is always greater than 0, we can encode >+ // N channels as N-1, thereby making sure that we get smaller deltas. >+ // That is, a toggle of 1->2->1 can be encoded as deltas vector (1, 1), >+ // rather than as (1, 3) or (1, -1), either of which would require two >+ // bits per delta. >+ RTC_DCHECK_GT(num_channels.value(), 0u); >+ values[i] = num_channels.value() - 1; >+ } else { >+ values[i].reset(); >+ } >+ } >+ // In the base event, N channels encoded as N channels, but for delta >+ // compression purposes, also shifted down by 1. 
>+ absl::optional<size_t> shifted_base_num_channels; >+ if (base_event->config().num_channels.has_value()) { >+ RTC_DCHECK_GT(base_event->config().num_channels.value(), 0u); >+ shifted_base_num_channels = base_event->config().num_channels.value() - 1; >+ } >+ encoded_deltas = EncodeDeltas(shifted_base_num_channels, values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_num_channels_deltas(encoded_deltas); >+ } >+} >+ >+void RtcEventLogEncoderNewFormat::EncodeAudioPlayout( >+ rtc::ArrayView<const RtcEventAudioPlayout*> batch, >+ rtclog2::EventStream* event_stream) { >+ if (batch.size() == 0) >+ return; >+ >+ // Base event >+ const RtcEventAudioPlayout* const base_event = batch[0]; >+ rtclog2::AudioPlayoutEvents* proto_batch = >+ event_stream->add_audio_playout_events(); >+ proto_batch->set_timestamp_ms(base_event->timestamp_ms()); >+ proto_batch->set_local_ssrc(base_event->ssrc()); >+ >+ if (batch.size() == 1) >+ return; >+ >+ // Delta encoding >+ proto_batch->set_number_of_deltas(batch.size() - 1); >+ std::vector<absl::optional<uint64_t>> values(batch.size() - 1); >+ std::string encoded_deltas; >+ >+ // timestamp_ms >+ for (size_t i = 0; i < values.size(); ++i) { >+ const RtcEventAudioPlayout* event = batch[i + 1]; >+ values[i] = ToUnsigned(event->timestamp_ms()); >+ } >+ encoded_deltas = EncodeDeltas(ToUnsigned(base_event->timestamp_ms()), values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_timestamp_ms_deltas(encoded_deltas); >+ } >+ >+ // local_ssrc >+ for (size_t i = 0; i < values.size(); ++i) { >+ const RtcEventAudioPlayout* event = batch[i + 1]; >+ values[i] = event->ssrc(); >+ } >+ encoded_deltas = EncodeDeltas(base_event->ssrc(), values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_local_ssrc_deltas(encoded_deltas); >+ } >+} >+ >+void RtcEventLogEncoderNewFormat::EncodeAudioRecvStreamConfig( >+ rtc::ArrayView<const RtcEventAudioReceiveStreamConfig*> batch, >+ rtclog2::EventStream* event_stream) { >+ for (const 
RtcEventAudioReceiveStreamConfig* base_event : batch) { >+ rtclog2::AudioRecvStreamConfig* proto_batch = >+ event_stream->add_audio_recv_stream_configs(); >+ proto_batch->set_timestamp_ms(base_event->timestamp_ms()); >+ proto_batch->set_remote_ssrc(base_event->config().remote_ssrc); >+ proto_batch->set_local_ssrc(base_event->config().local_ssrc); >+ >+ rtclog2::RtpHeaderExtensionConfig* proto_config = >+ proto_batch->mutable_header_extensions(); >+ bool has_recognized_extensions = >+ ConvertToProtoFormat(base_event->config().rtp_extensions, proto_config); >+ if (!has_recognized_extensions) >+ proto_batch->clear_header_extensions(); >+ } >+} >+ >+void RtcEventLogEncoderNewFormat::EncodeAudioSendStreamConfig( >+ rtc::ArrayView<const RtcEventAudioSendStreamConfig*> batch, >+ rtclog2::EventStream* event_stream) { >+ for (const RtcEventAudioSendStreamConfig* base_event : batch) { >+ rtclog2::AudioSendStreamConfig* proto_batch = >+ event_stream->add_audio_send_stream_configs(); >+ proto_batch->set_timestamp_ms(base_event->timestamp_ms()); >+ proto_batch->set_ssrc(base_event->config().local_ssrc); >+ >+ rtclog2::RtpHeaderExtensionConfig* proto_config = >+ proto_batch->mutable_header_extensions(); >+ bool has_recognized_extensions = >+ ConvertToProtoFormat(base_event->config().rtp_extensions, proto_config); >+ if (!has_recognized_extensions) >+ proto_batch->clear_header_extensions(); >+ } >+} >+ >+void RtcEventLogEncoderNewFormat::EncodeBweUpdateDelayBased( >+ rtc::ArrayView<const RtcEventBweUpdateDelayBased*> batch, >+ rtclog2::EventStream* event_stream) { >+ if (batch.size() == 0) >+ return; >+ >+ // Base event >+ const RtcEventBweUpdateDelayBased* const base_event = batch[0]; >+ rtclog2::DelayBasedBweUpdates* proto_batch = >+ event_stream->add_delay_based_bwe_updates(); >+ proto_batch->set_timestamp_ms(base_event->timestamp_ms()); >+ proto_batch->set_bitrate_bps(base_event->bitrate_bps()); >+ proto_batch->set_detector_state( >+ 
ConvertToProtoFormat(base_event->detector_state())); >+ >+ if (batch.size() == 1) >+ return; >+ >+ // Delta encoding >+ proto_batch->set_number_of_deltas(batch.size() - 1); >+ std::vector<absl::optional<uint64_t>> values(batch.size() - 1); >+ std::string encoded_deltas; >+ >+ // timestamp_ms >+ for (size_t i = 0; i < values.size(); ++i) { >+ const RtcEventBweUpdateDelayBased* event = batch[i + 1]; >+ values[i] = ToUnsigned(event->timestamp_ms()); >+ } >+ encoded_deltas = EncodeDeltas(ToUnsigned(base_event->timestamp_ms()), values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_timestamp_ms_deltas(encoded_deltas); >+ } >+ >+ // bitrate_bps >+ for (size_t i = 0; i < values.size(); ++i) { >+ const RtcEventBweUpdateDelayBased* event = batch[i + 1]; >+ values[i] = event->bitrate_bps(); >+ } >+ encoded_deltas = EncodeDeltas(base_event->bitrate_bps(), values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_bitrate_bps_deltas(encoded_deltas); >+ } >+ >+ // detector_state >+ for (size_t i = 0; i < values.size(); ++i) { >+ const RtcEventBweUpdateDelayBased* event = batch[i + 1]; >+ values[i] = >+ static_cast<uint64_t>(ConvertToProtoFormat(event->detector_state())); >+ } >+ encoded_deltas = EncodeDeltas( >+ static_cast<uint64_t>(ConvertToProtoFormat(base_event->detector_state())), >+ values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_detector_state_deltas(encoded_deltas); >+ } >+} >+ >+void RtcEventLogEncoderNewFormat::EncodeBweUpdateLossBased( >+ rtc::ArrayView<const RtcEventBweUpdateLossBased*> batch, >+ rtclog2::EventStream* event_stream) { >+ if (batch.size() == 0) >+ return; >+ >+ // Base event >+ const RtcEventBweUpdateLossBased* const base_event = batch[0]; >+ rtclog2::LossBasedBweUpdates* proto_batch = >+ event_stream->add_loss_based_bwe_updates(); >+ proto_batch->set_timestamp_ms(base_event->timestamp_ms()); >+ proto_batch->set_bitrate_bps(base_event->bitrate_bps()); >+ proto_batch->set_fraction_loss(base_event->fraction_loss()); >+ 
proto_batch->set_total_packets(base_event->total_packets()); >+ >+ if (batch.size() == 1) >+ return; >+ >+ // Delta encoding >+ proto_batch->set_number_of_deltas(batch.size() - 1); >+ std::vector<absl::optional<uint64_t>> values(batch.size() - 1); >+ std::string encoded_deltas; >+ >+ // timestamp_ms >+ for (size_t i = 0; i < values.size(); ++i) { >+ const RtcEventBweUpdateLossBased* event = batch[i + 1]; >+ values[i] = ToUnsigned(event->timestamp_ms()); >+ } >+ encoded_deltas = EncodeDeltas(ToUnsigned(base_event->timestamp_ms()), values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_timestamp_ms_deltas(encoded_deltas); >+ } >+ >+ // bitrate_bps >+ for (size_t i = 0; i < values.size(); ++i) { >+ const RtcEventBweUpdateLossBased* event = batch[i + 1]; >+ values[i] = event->bitrate_bps(); >+ } >+ encoded_deltas = EncodeDeltas(base_event->bitrate_bps(), values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_bitrate_bps_deltas(encoded_deltas); >+ } >+ >+ // fraction_loss >+ for (size_t i = 0; i < values.size(); ++i) { >+ const RtcEventBweUpdateLossBased* event = batch[i + 1]; >+ values[i] = event->fraction_loss(); >+ } >+ encoded_deltas = EncodeDeltas(base_event->fraction_loss(), values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_fraction_loss_deltas(encoded_deltas); >+ } >+ >+ // total_packets >+ for (size_t i = 0; i < values.size(); ++i) { >+ const RtcEventBweUpdateLossBased* event = batch[i + 1]; >+ values[i] = event->total_packets(); >+ } >+ encoded_deltas = EncodeDeltas(base_event->total_packets(), values); >+ if (!encoded_deltas.empty()) { >+ proto_batch->set_total_packets_deltas(encoded_deltas); >+ } >+} >+ >+void RtcEventLogEncoderNewFormat::EncodeProbeClusterCreated( >+ rtc::ArrayView<const RtcEventProbeClusterCreated*> batch, >+ rtclog2::EventStream* event_stream) { >+ for (const RtcEventProbeClusterCreated* base_event : batch) { >+ rtclog2::BweProbeCluster* proto_batch = event_stream->add_probe_clusters(); >+ 
proto_batch->set_timestamp_ms(base_event->timestamp_ms()); >+ proto_batch->set_id(base_event->id()); >+ proto_batch->set_bitrate_bps(base_event->bitrate_bps()); >+ proto_batch->set_min_packets(base_event->min_probes()); >+ proto_batch->set_min_bytes(base_event->min_bytes()); >+ } >+} >+ >+void RtcEventLogEncoderNewFormat::EncodeProbeResultFailure( >+ rtc::ArrayView<const RtcEventProbeResultFailure*> batch, >+ rtclog2::EventStream* event_stream) { >+ for (const RtcEventProbeResultFailure* base_event : batch) { >+ rtclog2::BweProbeResultFailure* proto_batch = >+ event_stream->add_probe_failure(); >+ proto_batch->set_timestamp_ms(base_event->timestamp_ms()); >+ proto_batch->set_id(base_event->id()); >+ proto_batch->set_failure( >+ ConvertToProtoFormat(base_event->failure_reason())); >+ } >+ // TODO(terelius): Should we delta-compress this event type? >+} >+ >+void RtcEventLogEncoderNewFormat::EncodeProbeResultSuccess( >+ rtc::ArrayView<const RtcEventProbeResultSuccess*> batch, >+ rtclog2::EventStream* event_stream) { >+ for (const RtcEventProbeResultSuccess* base_event : batch) { >+ rtclog2::BweProbeResultSuccess* proto_batch = >+ event_stream->add_probe_success(); >+ proto_batch->set_timestamp_ms(base_event->timestamp_ms()); >+ proto_batch->set_id(base_event->id()); >+ proto_batch->set_bitrate_bps(base_event->bitrate_bps()); >+ } >+ // TODO(terelius): Should we delta-compress this event type? 
>+} >+ >+void RtcEventLogEncoderNewFormat::EncodeRtcpPacketIncoming( >+ rtc::ArrayView<const RtcEventRtcpPacketIncoming*> batch, >+ rtclog2::EventStream* event_stream) { >+ if (batch.empty()) { >+ return; >+ } >+ EncodeRtcpPacket(batch, event_stream->add_incoming_rtcp_packets()); >+} >+ >+void RtcEventLogEncoderNewFormat::EncodeRtcpPacketOutgoing( >+ rtc::ArrayView<const RtcEventRtcpPacketOutgoing*> batch, >+ rtclog2::EventStream* event_stream) { >+ if (batch.empty()) { >+ return; >+ } >+ EncodeRtcpPacket(batch, event_stream->add_outgoing_rtcp_packets()); >+} >+ >+void RtcEventLogEncoderNewFormat::EncodeRtpPacketIncoming( >+ const std::map<uint32_t, std::vector<const RtcEventRtpPacketIncoming*>>& >+ batch, >+ rtclog2::EventStream* event_stream) { >+ for (auto it : batch) { >+ RTC_DCHECK(!it.second.empty()); >+ EncodeRtpPacket(it.second, event_stream->add_incoming_rtp_packets()); >+ } >+} >+ >+void RtcEventLogEncoderNewFormat::EncodeRtpPacketOutgoing( >+ const std::map<uint32_t, std::vector<const RtcEventRtpPacketOutgoing*>>& >+ batch, >+ rtclog2::EventStream* event_stream) { >+ for (auto it : batch) { >+ RTC_DCHECK(!it.second.empty()); >+ EncodeRtpPacket(it.second, event_stream->add_outgoing_rtp_packets()); >+ } >+} >+ >+void RtcEventLogEncoderNewFormat::EncodeVideoRecvStreamConfig( >+ rtc::ArrayView<const RtcEventVideoReceiveStreamConfig*> batch, >+ rtclog2::EventStream* event_stream) { >+ for (const RtcEventVideoReceiveStreamConfig* base_event : batch) { >+ rtclog2::VideoRecvStreamConfig* proto_batch = >+ event_stream->add_video_recv_stream_configs(); >+ proto_batch->set_timestamp_ms(base_event->timestamp_ms()); >+ proto_batch->set_remote_ssrc(base_event->config().remote_ssrc); >+ proto_batch->set_local_ssrc(base_event->config().local_ssrc); >+ proto_batch->set_rtx_ssrc(base_event->config().rtx_ssrc); >+ >+ rtclog2::RtpHeaderExtensionConfig* proto_config = >+ proto_batch->mutable_header_extensions(); >+ bool has_recognized_extensions = >+ 
ConvertToProtoFormat(base_event->config().rtp_extensions, proto_config); >+ if (!has_recognized_extensions) >+ proto_batch->clear_header_extensions(); >+ } >+} >+ >+void RtcEventLogEncoderNewFormat::EncodeVideoSendStreamConfig( >+ rtc::ArrayView<const RtcEventVideoSendStreamConfig*> batch, >+ rtclog2::EventStream* event_stream) { >+ for (const RtcEventVideoSendStreamConfig* base_event : batch) { >+ rtclog2::VideoSendStreamConfig* proto_batch = >+ event_stream->add_video_send_stream_configs(); >+ proto_batch->set_timestamp_ms(base_event->timestamp_ms()); >+ proto_batch->set_ssrc(base_event->config().local_ssrc); >+ proto_batch->set_rtx_ssrc(base_event->config().rtx_ssrc); >+ >+ rtclog2::RtpHeaderExtensionConfig* proto_config = >+ proto_batch->mutable_header_extensions(); >+ bool has_recognized_extensions = >+ ConvertToProtoFormat(base_event->config().rtp_extensions, proto_config); >+ if (!has_recognized_extensions) >+ proto_batch->clear_header_extensions(); >+ } >+} >+ >+void RtcEventLogEncoderNewFormat::EncodeIceCandidatePairConfig( >+ rtc::ArrayView<const RtcEventIceCandidatePairConfig*> batch, >+ rtclog2::EventStream* event_stream) { >+ for (const RtcEventIceCandidatePairConfig* base_event : batch) { >+ rtclog2::IceCandidatePairConfig* proto_batch = >+ event_stream->add_ice_candidate_configs(); >+ >+ proto_batch->set_timestamp_ms(base_event->timestamp_ms()); >+ proto_batch->set_config_type(ConvertToProtoFormat(base_event->type())); >+ proto_batch->set_candidate_pair_id(base_event->candidate_pair_id()); >+ const auto& desc = base_event->candidate_pair_desc(); >+ proto_batch->set_local_candidate_type( >+ ConvertToProtoFormat(desc.local_candidate_type)); >+ proto_batch->set_local_relay_protocol( >+ ConvertToProtoFormat(desc.local_relay_protocol)); >+ proto_batch->set_local_network_type( >+ ConvertToProtoFormat(desc.local_network_type)); >+ proto_batch->set_local_address_family( >+ ConvertToProtoFormat(desc.local_address_family)); >+ 
proto_batch->set_remote_candidate_type( >+ ConvertToProtoFormat(desc.remote_candidate_type)); >+ proto_batch->set_remote_address_family( >+ ConvertToProtoFormat(desc.remote_address_family)); >+ proto_batch->set_candidate_pair_protocol( >+ ConvertToProtoFormat(desc.candidate_pair_protocol)); >+ } >+ // TODO(terelius): Should we delta-compress this event type? >+} >+ >+void RtcEventLogEncoderNewFormat::EncodeIceCandidatePairEvent( >+ rtc::ArrayView<const RtcEventIceCandidatePair*> batch, >+ rtclog2::EventStream* event_stream) { >+ for (const RtcEventIceCandidatePair* base_event : batch) { >+ rtclog2::IceCandidatePairEvent* proto_batch = >+ event_stream->add_ice_candidate_events(); >+ >+ proto_batch->set_timestamp_ms(base_event->timestamp_ms()); >+ >+ proto_batch->set_event_type(ConvertToProtoFormat(base_event->type())); >+ proto_batch->set_candidate_pair_id(base_event->candidate_pair_id()); >+ } >+ // TODO(terelius): Should we delta-compress this event type? >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_new_format.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_new_format.h >new file mode 100644 >index 0000000000000000000000000000000000000000..7dfa490114b4eb883bc7896151498e98110a8806 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_new_format.h >@@ -0,0 +1,127 @@ >+/* >+ * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#ifndef LOGGING_RTC_EVENT_LOG_ENCODER_RTC_EVENT_LOG_ENCODER_NEW_FORMAT_H_ >+#define LOGGING_RTC_EVENT_LOG_ENCODER_RTC_EVENT_LOG_ENCODER_NEW_FORMAT_H_ >+ >+#include <deque> >+#include <map> >+#include <memory> >+#include <string> >+#include <vector> >+ >+#include "api/array_view.h" >+#include "logging/rtc_event_log/encoder/rtc_event_log_encoder.h" >+ >+namespace webrtc { >+ >+namespace rtclog2 { >+class EventStream; // Auto-generated from protobuf. >+} // namespace rtclog2 >+ >+class RtcEventAlrState; >+class RtcEventAudioNetworkAdaptation; >+class RtcEventAudioPlayout; >+class RtcEventAudioReceiveStreamConfig; >+class RtcEventAudioSendStreamConfig; >+class RtcEventBweUpdateDelayBased; >+class RtcEventBweUpdateLossBased; >+class RtcEventLoggingStarted; >+class RtcEventLoggingStopped; >+class RtcEventProbeClusterCreated; >+class RtcEventProbeResultFailure; >+class RtcEventProbeResultSuccess; >+class RtcEventRtcpPacketIncoming; >+class RtcEventRtcpPacketOutgoing; >+class RtcEventRtpPacketIncoming; >+class RtcEventRtpPacketOutgoing; >+class RtcEventVideoReceiveStreamConfig; >+class RtcEventVideoSendStreamConfig; >+class RtcEventIceCandidatePairConfig; >+class RtcEventIceCandidatePair; >+class RtpPacket; >+ >+class RtcEventLogEncoderNewFormat final : public RtcEventLogEncoder { >+ public: >+ ~RtcEventLogEncoderNewFormat() override = default; >+ >+ std::string EncodeBatch( >+ std::deque<std::unique_ptr<RtcEvent>>::const_iterator begin, >+ std::deque<std::unique_ptr<RtcEvent>>::const_iterator end) override; >+ >+ std::string EncodeLogStart(int64_t timestamp_us, >+ int64_t utc_time_us) override; >+ std::string EncodeLogEnd(int64_t timestamp_us) override; >+ >+ private: >+ // Encoding entry-point for the various RtcEvent subclasses. 
>+ void EncodeAlrState(rtc::ArrayView<const RtcEventAlrState*> batch, >+ rtclog2::EventStream* event_stream); >+ void EncodeAudioNetworkAdaptation( >+ rtc::ArrayView<const RtcEventAudioNetworkAdaptation*> batch, >+ rtclog2::EventStream* event_stream); >+ void EncodeAudioPlayout(rtc::ArrayView<const RtcEventAudioPlayout*> batch, >+ rtclog2::EventStream* event_stream); >+ void EncodeAudioRecvStreamConfig( >+ rtc::ArrayView<const RtcEventAudioReceiveStreamConfig*> batch, >+ rtclog2::EventStream* event_stream); >+ void EncodeAudioSendStreamConfig( >+ rtc::ArrayView<const RtcEventAudioSendStreamConfig*> batch, >+ rtclog2::EventStream* event_stream); >+ void EncodeBweUpdateDelayBased( >+ rtc::ArrayView<const RtcEventBweUpdateDelayBased*> batch, >+ rtclog2::EventStream* event_stream); >+ void EncodeBweUpdateLossBased( >+ rtc::ArrayView<const RtcEventBweUpdateLossBased*> batch, >+ rtclog2::EventStream* event_stream); >+ void EncodeLoggingStarted(rtc::ArrayView<const RtcEventLoggingStarted*> batch, >+ rtclog2::EventStream* event_stream); >+ void EncodeLoggingStopped(rtc::ArrayView<const RtcEventLoggingStopped*> batch, >+ rtclog2::EventStream* event_stream); >+ void EncodeProbeClusterCreated( >+ rtc::ArrayView<const RtcEventProbeClusterCreated*> batch, >+ rtclog2::EventStream* event_stream); >+ void EncodeProbeResultFailure( >+ rtc::ArrayView<const RtcEventProbeResultFailure*> batch, >+ rtclog2::EventStream* event_stream); >+ void EncodeProbeResultSuccess( >+ rtc::ArrayView<const RtcEventProbeResultSuccess*> batch, >+ rtclog2::EventStream* event_stream); >+ void EncodeRtcpPacketIncoming( >+ rtc::ArrayView<const RtcEventRtcpPacketIncoming*> batch, >+ rtclog2::EventStream* event_stream); >+ void EncodeRtcpPacketOutgoing( >+ rtc::ArrayView<const RtcEventRtcpPacketOutgoing*> batch, >+ rtclog2::EventStream* event_stream); >+ void EncodeRtpPacketIncoming( >+ const std::map<uint32_t, std::vector<const RtcEventRtpPacketIncoming*>>& >+ batch, >+ rtclog2::EventStream* event_stream); 
>+ void EncodeRtpPacketOutgoing( >+ const std::map<uint32_t, std::vector<const RtcEventRtpPacketOutgoing*>>& >+ batch, >+ rtclog2::EventStream* event_stream); >+ void EncodeVideoRecvStreamConfig( >+ rtc::ArrayView<const RtcEventVideoReceiveStreamConfig*> batch, >+ rtclog2::EventStream* event_stream); >+ void EncodeVideoSendStreamConfig( >+ rtc::ArrayView<const RtcEventVideoSendStreamConfig*> batch, >+ rtclog2::EventStream* event_stream); >+ void EncodeIceCandidatePairConfig( >+ rtc::ArrayView<const RtcEventIceCandidatePairConfig*> batch, >+ rtclog2::EventStream* event_stream); >+ void EncodeIceCandidatePairEvent( >+ rtc::ArrayView<const RtcEventIceCandidatePair*> batch, >+ rtclog2::EventStream* event_stream); >+}; >+ >+} // namespace webrtc >+ >+#endif // LOGGING_RTC_EVENT_LOG_ENCODER_RTC_EVENT_LOG_ENCODER_NEW_FORMAT_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_unittest.cc >index 990fa6080bb94613d543474460b87f8a3c46c08e..856bec66e7ccc081c36407a8c2641d15da47cab4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/rtc_event_log_encoder_unittest.cc >@@ -11,9 +11,11 @@ > #include <deque> > #include <limits> > #include <string> >+#include <tuple> > > #include "absl/memory/memory.h" > #include "logging/rtc_event_log/encoder/rtc_event_log_encoder_legacy.h" >+#include "logging/rtc_event_log/encoder/rtc_event_log_encoder_new_format.h" > #include "logging/rtc_event_log/events/rtc_event_alr_state.h" > #include "logging/rtc_event_log/events/rtc_event_audio_network_adaptation.h" > #include "logging/rtc_event_log/events/rtc_event_audio_playout.h" >@@ -39,156 +41,357 @@ > #include "test/gtest.h" > > namespace webrtc { >- >- >-class RtcEventLogEncoderTest : public 
testing::TestWithParam<int> { >+class RtcEventLogEncoderTest >+ : public testing::TestWithParam<std::tuple<int, bool, size_t, bool>> { > protected: > RtcEventLogEncoderTest() >- : encoder_(new RtcEventLogEncoderLegacy), >- seed_(GetParam()), >+ : seed_(std::get<0>(GetParam())), > prng_(seed_), >- gen_(seed_ * 880001UL) {} >+ new_encoding_(std::get<1>(GetParam())), >+ event_count_(std::get<2>(GetParam())), >+ force_repeated_fields_(std::get<3>(GetParam())), >+ gen_(seed_ * 880001UL), >+ verifier_(new_encoding_ ? RtcEventLog::EncodingType::NewFormat >+ : RtcEventLog::EncodingType::Legacy) { >+ if (new_encoding_) >+ encoder_ = absl::make_unique<RtcEventLogEncoderNewFormat>(); >+ else >+ encoder_ = absl::make_unique<RtcEventLogEncoderLegacy>(); >+ } > ~RtcEventLogEncoderTest() override = default; > > // ANA events have some optional fields, so we want to make sure that we get > // correct behavior both when all of the values are there, as well as when > // only some. > void TestRtcEventAudioNetworkAdaptation( >- std::unique_ptr<AudioEncoderRuntimeConfig> runtime_config); >+ const std::vector<std::unique_ptr<RtcEventAudioNetworkAdaptation>>&); >+ >+ template <typename EventType> >+ std::unique_ptr<EventType> NewRtpPacket( >+ uint32_t ssrc, >+ const RtpHeaderExtensionMap& extension_map); >+ >+ template <typename ParsedType> >+ const std::vector<ParsedType>* GetRtpPacketsBySsrc( >+ const ParsedRtcEventLogNew* parsed_log, >+ uint32_t ssrc); >+ >+ template <typename EventType, typename ParsedType> >+ void TestRtpPackets(); > > std::deque<std::unique_ptr<RtcEvent>> history_; >- // TODO(eladalon): Once we have more than one possible encoder, parameterize >- // encoder selection. 
> std::unique_ptr<RtcEventLogEncoder> encoder_; > ParsedRtcEventLogNew parsed_log_; > const uint64_t seed_; > Random prng_; >+ const bool new_encoding_; >+ const size_t event_count_; >+ const bool force_repeated_fields_; > test::EventGenerator gen_; >+ test::EventVerifier verifier_; > }; > >-TEST_P(RtcEventLogEncoderTest, RtcEventAlrState) { >- std::unique_ptr<RtcEventAlrState> event = gen_.NewAlrState(); >- history_.push_back(event->Copy()); >+void RtcEventLogEncoderTest::TestRtcEventAudioNetworkAdaptation( >+ const std::vector<std::unique_ptr<RtcEventAudioNetworkAdaptation>>& >+ events) { >+ ASSERT_TRUE(history_.empty()) << "Function should be called once per test."; >+ >+ for (auto& event : events) { >+ history_.push_back(event->Copy()); >+ } > > std::string encoded = encoder_->EncodeBatch(history_.begin(), history_.end()); > ASSERT_TRUE(parsed_log_.ParseString(encoded)); >- const auto& alr_state_events = parsed_log_.alr_state_events(); >+ const auto& ana_configs = parsed_log_.audio_network_adaptation_events(); > >- ASSERT_EQ(alr_state_events.size(), 1u); >- EXPECT_TRUE(test::VerifyLoggedAlrStateEvent(*event, alr_state_events[0])); >+ ASSERT_EQ(ana_configs.size(), events.size()); >+ for (size_t i = 0; i < events.size(); ++i) { >+ verifier_.VerifyLoggedAudioNetworkAdaptationEvent(*events[i], >+ ana_configs[i]); >+ } > } > >-void RtcEventLogEncoderTest::TestRtcEventAudioNetworkAdaptation( >- std::unique_ptr<AudioEncoderRuntimeConfig> runtime_config) { >- // This function is called repeatedly. Clear state between calls. 
>- history_.clear(); >- auto original_runtime_config = *runtime_config; >- auto event = absl::make_unique<RtcEventAudioNetworkAdaptation>( >- std::move(runtime_config)); >- const int64_t timestamp_us = event->timestamp_us_; >- history_.push_back(std::move(event)); >+template <> >+std::unique_ptr<RtcEventRtpPacketIncoming> RtcEventLogEncoderTest::NewRtpPacket( >+ uint32_t ssrc, >+ const RtpHeaderExtensionMap& extension_map) { >+ return gen_.NewRtpPacketIncoming(ssrc, extension_map, false); >+} >+ >+template <> >+std::unique_ptr<RtcEventRtpPacketOutgoing> RtcEventLogEncoderTest::NewRtpPacket( >+ uint32_t ssrc, >+ const RtpHeaderExtensionMap& extension_map) { >+ return gen_.NewRtpPacketOutgoing(ssrc, extension_map, false); >+} >+ >+template <> >+const std::vector<LoggedRtpPacketIncoming>* >+RtcEventLogEncoderTest::GetRtpPacketsBySsrc( >+ const ParsedRtcEventLogNew* parsed_log, >+ uint32_t ssrc) { >+ const auto& incoming_streams = parsed_log->incoming_rtp_packets_by_ssrc(); >+ for (const auto& stream : incoming_streams) { >+ if (stream.ssrc == ssrc) { >+ return &stream.incoming_packets; >+ } >+ } >+ return nullptr; >+} >+ >+template <> >+const std::vector<LoggedRtpPacketOutgoing>* >+RtcEventLogEncoderTest::GetRtpPacketsBySsrc( >+ const ParsedRtcEventLogNew* parsed_log, >+ uint32_t ssrc) { >+ const auto& outgoing_streams = parsed_log->outgoing_rtp_packets_by_ssrc(); >+ for (const auto& stream : outgoing_streams) { >+ if (stream.ssrc == ssrc) { >+ return &stream.outgoing_packets; >+ } >+ } >+ return nullptr; >+} >+ >+template <typename EventType, typename ParsedType> >+void RtcEventLogEncoderTest::TestRtpPackets() { >+ // SSRCs will be randomly assigned out of this small pool, significant only >+ // in that it also covers such edge cases as SSRC = 0 and SSRC = 0xffffffff. >+ // The pool is intentionally small, so as to produce collisions. 
>+ const std::vector<uint32_t> kSsrcPool = {0x00000000, 0x12345678, 0xabcdef01, >+ 0xffffffff, 0x20171024, 0x19840730, >+ 0x19831230}; >+ >+ // TODO(terelius): Test extensions for legacy encoding, too. >+ RtpHeaderExtensionMap extension_map; >+ if (new_encoding_) { >+ extension_map = gen_.NewRtpHeaderExtensionMap(true); >+ } >+ >+ // Simulate |event_count_| RTP packets, with SSRCs assigned randomly >+ // out of the small pool above. >+ std::map<uint32_t, std::vector<std::unique_ptr<EventType>>> events_by_ssrc; >+ for (size_t i = 0; i < event_count_; ++i) { >+ const uint32_t ssrc = kSsrcPool[prng_.Rand(kSsrcPool.size() - 1)]; >+ std::unique_ptr<EventType> event = >+ (events_by_ssrc[ssrc].empty() || !force_repeated_fields_) >+ ? NewRtpPacket<EventType>(ssrc, extension_map) >+ : events_by_ssrc[ssrc][0]->Copy(); >+ history_.push_back(event->Copy()); >+ events_by_ssrc[ssrc].emplace_back(std::move(event)); >+ } > >+ // Encode and parse. > std::string encoded = encoder_->EncodeBatch(history_.begin(), history_.end()); > ASSERT_TRUE(parsed_log_.ParseString(encoded)); >- const auto& ana_configs = parsed_log_.audio_network_adaptation_events(); >- ASSERT_EQ(ana_configs.size(), 1u); > >- EXPECT_EQ(ana_configs[0].timestamp_us, timestamp_us); >- EXPECT_EQ(ana_configs[0].config, original_runtime_config); >+ // For each SSRC, make sure the RTP packets associated with it have been >+ // correctly encoded and parsed. 
>+ for (auto it = events_by_ssrc.begin(); it != events_by_ssrc.end(); ++it) { >+ const uint32_t ssrc = it->first; >+ const auto& original_packets = it->second; >+ const std::vector<ParsedType>* parsed_rtp_packets = >+ GetRtpPacketsBySsrc<ParsedType>(&parsed_log_, ssrc); >+ ASSERT_NE(parsed_rtp_packets, nullptr); >+ ASSERT_EQ(original_packets.size(), parsed_rtp_packets->size()); >+ for (size_t i = 0; i < original_packets.size(); ++i) { >+ verifier_.VerifyLoggedRtpPacket<EventType, ParsedType>( >+ *original_packets[i], (*parsed_rtp_packets)[i]); >+ } >+ } >+} >+ >+TEST_P(RtcEventLogEncoderTest, RtcEventAlrState) { >+ std::vector<std::unique_ptr<RtcEventAlrState>> events(event_count_); >+ for (size_t i = 0; i < event_count_; ++i) { >+ events[i] = (i == 0 || !force_repeated_fields_) ? gen_.NewAlrState() >+ : events[0]->Copy(); >+ history_.push_back(events[i]->Copy()); >+ } >+ >+ std::string encoded = encoder_->EncodeBatch(history_.begin(), history_.end()); >+ ASSERT_TRUE(parsed_log_.ParseString(encoded)); >+ const auto& alr_state_events = parsed_log_.alr_state_events(); >+ >+ ASSERT_EQ(alr_state_events.size(), event_count_); >+ for (size_t i = 0; i < event_count_; ++i) { >+ verifier_.VerifyLoggedAlrStateEvent(*events[i], alr_state_events[i]); >+ } > } > > TEST_P(RtcEventLogEncoderTest, RtcEventAudioNetworkAdaptationBitrate) { >- auto runtime_config = absl::make_unique<AudioEncoderRuntimeConfig>(); >- const int bitrate_bps = rtc::checked_cast<int>( >- prng_.Rand(0, std::numeric_limits<int32_t>::max())); >- runtime_config->bitrate_bps = bitrate_bps; >- TestRtcEventAudioNetworkAdaptation(std::move(runtime_config)); >+ std::vector<std::unique_ptr<RtcEventAudioNetworkAdaptation>> events( >+ event_count_); >+ for (size_t i = 0; i < event_count_; ++i) { >+ if (i == 0 || !force_repeated_fields_) { >+ auto runtime_config = absl::make_unique<AudioEncoderRuntimeConfig>(); >+ const int bitrate_bps = rtc::checked_cast<int>( >+ prng_.Rand(0, std::numeric_limits<int32_t>::max())); >+ 
runtime_config->bitrate_bps = bitrate_bps; >+ events[i] = absl::make_unique<RtcEventAudioNetworkAdaptation>( >+ std::move(runtime_config)); >+ } else { >+ events[i] = events[0]->Copy(); >+ } >+ } >+ TestRtcEventAudioNetworkAdaptation(events); > } > > TEST_P(RtcEventLogEncoderTest, RtcEventAudioNetworkAdaptationFrameLength) { >- auto runtime_config = absl::make_unique<AudioEncoderRuntimeConfig>(); >- const int frame_length_ms = prng_.Rand(1, 1000); >- runtime_config->frame_length_ms = frame_length_ms; >- TestRtcEventAudioNetworkAdaptation(std::move(runtime_config)); >+ std::vector<std::unique_ptr<RtcEventAudioNetworkAdaptation>> events( >+ event_count_); >+ for (size_t i = 0; i < event_count_; ++i) { >+ if (i == 0 || !force_repeated_fields_) { >+ auto runtime_config = absl::make_unique<AudioEncoderRuntimeConfig>(); >+ const int frame_length_ms = prng_.Rand(1, 1000); >+ runtime_config->frame_length_ms = frame_length_ms; >+ events[i] = absl::make_unique<RtcEventAudioNetworkAdaptation>( >+ std::move(runtime_config)); >+ } else { >+ events[i] = events[0]->Copy(); >+ } >+ } >+ TestRtcEventAudioNetworkAdaptation(events); > } > > TEST_P(RtcEventLogEncoderTest, RtcEventAudioNetworkAdaptationPacketLoss) { >- // To simplify the test, we just check powers of two. >- const float plr = std::pow(0.5f, prng_.Rand(1, 8)); >- auto runtime_config = absl::make_unique<AudioEncoderRuntimeConfig>(); >- runtime_config->uplink_packet_loss_fraction = plr; >- TestRtcEventAudioNetworkAdaptation(std::move(runtime_config)); >+ std::vector<std::unique_ptr<RtcEventAudioNetworkAdaptation>> events( >+ event_count_); >+ for (size_t i = 0; i < event_count_; ++i) { >+ if (i == 0 || !force_repeated_fields_) { >+ // To simplify the test, we just check powers of two. 
>+ const float plr = std::pow(0.5f, prng_.Rand(1, 8)); >+ auto runtime_config = absl::make_unique<AudioEncoderRuntimeConfig>(); >+ runtime_config->uplink_packet_loss_fraction = plr; >+ events[i] = absl::make_unique<RtcEventAudioNetworkAdaptation>( >+ std::move(runtime_config)); >+ } else { >+ events[i] = events[0]->Copy(); >+ } >+ } >+ TestRtcEventAudioNetworkAdaptation(events); > } > > TEST_P(RtcEventLogEncoderTest, RtcEventAudioNetworkAdaptationFec) { >- // The test might be trivially passing for one of the two boolean values, so >- // for safety's sake, we test both. >- for (bool fec_enabled : {false, true}) { >- auto runtime_config = absl::make_unique<AudioEncoderRuntimeConfig>(); >- runtime_config->enable_fec = fec_enabled; >- TestRtcEventAudioNetworkAdaptation(std::move(runtime_config)); >+ std::vector<std::unique_ptr<RtcEventAudioNetworkAdaptation>> events( >+ event_count_); >+ for (size_t i = 0; i < event_count_; ++i) { >+ if (i == 0 || !force_repeated_fields_) { >+ auto runtime_config = absl::make_unique<AudioEncoderRuntimeConfig>(); >+ runtime_config->enable_fec = prng_.Rand<bool>(); >+ events[i] = absl::make_unique<RtcEventAudioNetworkAdaptation>( >+ std::move(runtime_config)); >+ } else { >+ events[i] = events[0]->Copy(); >+ } > } >+ TestRtcEventAudioNetworkAdaptation(events); > } > > TEST_P(RtcEventLogEncoderTest, RtcEventAudioNetworkAdaptationDtx) { >- // The test might be trivially passing for one of the two boolean values, so >- // for safety's sake, we test both. 
>- for (bool dtx_enabled : {false, true}) { >- auto runtime_config = absl::make_unique<AudioEncoderRuntimeConfig>(); >- runtime_config->enable_dtx = dtx_enabled; >- TestRtcEventAudioNetworkAdaptation(std::move(runtime_config)); >+ std::vector<std::unique_ptr<RtcEventAudioNetworkAdaptation>> events( >+ event_count_); >+ for (size_t i = 0; i < event_count_; ++i) { >+ if (i == 0 || !force_repeated_fields_) { >+ auto runtime_config = absl::make_unique<AudioEncoderRuntimeConfig>(); >+ runtime_config->enable_dtx = prng_.Rand<bool>(); >+ events[i] = absl::make_unique<RtcEventAudioNetworkAdaptation>( >+ std::move(runtime_config)); >+ } else { >+ events[i] = events[0]->Copy(); >+ } > } >+ TestRtcEventAudioNetworkAdaptation(events); > } > > TEST_P(RtcEventLogEncoderTest, RtcEventAudioNetworkAdaptationChannels) { >- // The test might be trivially passing for one of the two possible values, so >- // for safety's sake, we test both. >- for (size_t channels : {1, 2}) { >- auto runtime_config = absl::make_unique<AudioEncoderRuntimeConfig>(); >- runtime_config->num_channels = channels; >- TestRtcEventAudioNetworkAdaptation(std::move(runtime_config)); >+ std::vector<std::unique_ptr<RtcEventAudioNetworkAdaptation>> events( >+ event_count_); >+ for (size_t i = 0; i < event_count_; ++i) { >+ if (i == 0 || !force_repeated_fields_) { >+ auto runtime_config = absl::make_unique<AudioEncoderRuntimeConfig>(); >+ runtime_config->num_channels = prng_.Rand(1, 2); >+ events[i] = absl::make_unique<RtcEventAudioNetworkAdaptation>( >+ std::move(runtime_config)); >+ } else { >+ events[i] = events[0]->Copy(); >+ } > } >+ TestRtcEventAudioNetworkAdaptation(events); > } > > TEST_P(RtcEventLogEncoderTest, RtcEventAudioNetworkAdaptationAll) { >- const int bitrate_bps = rtc::checked_cast<int>( >- prng_.Rand(0, std::numeric_limits<int32_t>::max())); >- const int frame_length_ms = prng_.Rand(1, 1000); >- const float plr = std::pow(0.5f, prng_.Rand(1, 8)); >- for (bool fec_enabled : {false, true}) { >- for 
(bool dtx_enabled : {false, true}) { >- for (size_t channels : {1, 2}) { >- auto runtime_config = absl::make_unique<AudioEncoderRuntimeConfig>(); >- runtime_config->bitrate_bps = bitrate_bps; >- runtime_config->frame_length_ms = frame_length_ms; >- runtime_config->uplink_packet_loss_fraction = plr; >- runtime_config->enable_fec = fec_enabled; >- runtime_config->enable_dtx = dtx_enabled; >- runtime_config->num_channels = channels; >- >- TestRtcEventAudioNetworkAdaptation(std::move(runtime_config)); >- } >+ std::vector<std::unique_ptr<RtcEventAudioNetworkAdaptation>> events( >+ event_count_); >+ for (size_t i = 0; i < event_count_; ++i) { >+ if (i == 0 || !force_repeated_fields_) { >+ auto runtime_config = absl::make_unique<AudioEncoderRuntimeConfig>(); >+ runtime_config->bitrate_bps = rtc::checked_cast<int>( >+ prng_.Rand(0, std::numeric_limits<int32_t>::max())); >+ runtime_config->frame_length_ms = prng_.Rand(1, 1000); >+ runtime_config->uplink_packet_loss_fraction = >+ std::pow(0.5f, prng_.Rand(1, 8)); >+ runtime_config->enable_fec = prng_.Rand<bool>(); >+ runtime_config->enable_dtx = prng_.Rand<bool>(); >+ runtime_config->num_channels = prng_.Rand(1, 2); >+ events[i] = absl::make_unique<RtcEventAudioNetworkAdaptation>( >+ std::move(runtime_config)); >+ } else { >+ events[i] = events[0]->Copy(); > } > } >+ TestRtcEventAudioNetworkAdaptation(events); > } > > TEST_P(RtcEventLogEncoderTest, RtcEventAudioPlayout) { >- uint32_t ssrc = prng_.Rand<uint32_t>(); >- std::unique_ptr<RtcEventAudioPlayout> event = gen_.NewAudioPlayout(ssrc); >- history_.push_back(event->Copy()); >+ // SSRCs will be randomly assigned out of this small pool, significant only >+ // in that it also covers such edge cases as SSRC = 0 and SSRC = 0xffffffff. >+ // The pool is intentionally small, so as to produce collisions. 
>+ const std::vector<uint32_t> kSsrcPool = {0x00000000, 0x12345678, 0xabcdef01, >+ 0xffffffff, 0x20171024, 0x19840730, >+ 0x19831230}; >+ >+ std::map<uint32_t, std::vector<std::unique_ptr<RtcEventAudioPlayout>>> >+ original_events_by_ssrc; >+ for (size_t i = 0; i < event_count_; ++i) { >+ const uint32_t ssrc = kSsrcPool[prng_.Rand(kSsrcPool.size() - 1)]; >+ std::unique_ptr<RtcEventAudioPlayout> event = >+ (original_events_by_ssrc[ssrc].empty() || !force_repeated_fields_) >+ ? gen_.NewAudioPlayout(ssrc) >+ : original_events_by_ssrc[ssrc][0]->Copy(); >+ history_.push_back(event->Copy()); >+ original_events_by_ssrc[ssrc].push_back(std::move(event)); >+ } > > std::string encoded = encoder_->EncodeBatch(history_.begin(), history_.end()); > ASSERT_TRUE(parsed_log_.ParseString(encoded)); > >- const auto& playout_events = parsed_log_.audio_playout_events(); >- ASSERT_EQ(playout_events.size(), 1u); >- const auto playout_stream = playout_events.find(ssrc); >- ASSERT_TRUE(playout_stream != playout_events.end()); >- ASSERT_EQ(playout_stream->second.size(), 1u); >- LoggedAudioPlayoutEvent playout_event = playout_stream->second[0]; >- EXPECT_TRUE(test::VerifyLoggedAudioPlayoutEvent(*event, playout_event)); >+ const auto& parsed_playout_events_by_ssrc = >+ parsed_log_.audio_playout_events(); >+ >+ // Same number of distinct SSRCs. >+ ASSERT_EQ(parsed_playout_events_by_ssrc.size(), >+ original_events_by_ssrc.size()); >+ >+ for (auto& original_event_it : original_events_by_ssrc) { >+ const uint32_t ssrc = original_event_it.first; >+ const auto& original_playout_events = original_event_it.second; >+ >+ const auto& parsed_event_it = parsed_playout_events_by_ssrc.find(ssrc); >+ ASSERT_TRUE(parsed_event_it != parsed_playout_events_by_ssrc.end()); >+ const auto& parsed_playout_events = parsed_event_it->second; >+ >+ // Same number of playout events for the SSRC under examination. 
>+ ASSERT_EQ(original_playout_events.size(), parsed_playout_events.size()); >+ >+ for (size_t i = 0; i < original_playout_events.size(); ++i) { >+ verifier_.VerifyLoggedAudioPlayoutEvent(*original_playout_events[i], >+ parsed_playout_events[i]); >+ } >+ } > } > >+// TODO(eladalon/terelius): Test with multiple events in the batch. > TEST_P(RtcEventLogEncoderTest, RtcEventAudioReceiveStreamConfig) { > uint32_t ssrc = prng_.Rand<uint32_t>(); > RtpHeaderExtensionMap extensions = gen_.NewRtpHeaderExtensionMap(); >@@ -201,9 +404,10 @@ TEST_P(RtcEventLogEncoderTest, RtcEventAudioReceiveStreamConfig) { > const auto& audio_recv_configs = parsed_log_.audio_recv_configs(); > > ASSERT_EQ(audio_recv_configs.size(), 1u); >- EXPECT_TRUE(test::VerifyLoggedAudioRecvConfig(*event, audio_recv_configs[0])); >+ verifier_.VerifyLoggedAudioRecvConfig(*event, audio_recv_configs[0]); > } > >+// TODO(eladalon/terelius): Test with multiple events in the batch. > TEST_P(RtcEventLogEncoderTest, RtcEventAudioSendStreamConfig) { > uint32_t ssrc = prng_.Rand<uint32_t>(); > RtpHeaderExtensionMap extensions = gen_.NewRtpHeaderExtensionMap(); >@@ -216,37 +420,51 @@ TEST_P(RtcEventLogEncoderTest, RtcEventAudioSendStreamConfig) { > const auto& audio_send_configs = parsed_log_.audio_send_configs(); > > ASSERT_EQ(audio_send_configs.size(), 1u); >- EXPECT_TRUE(test::VerifyLoggedAudioSendConfig(*event, audio_send_configs[0])); >+ verifier_.VerifyLoggedAudioSendConfig(*event, audio_send_configs[0]); > } > > TEST_P(RtcEventLogEncoderTest, RtcEventBweUpdateDelayBased) { >- std::unique_ptr<RtcEventBweUpdateDelayBased> event = >- gen_.NewBweUpdateDelayBased(); >- history_.push_back(event->Copy()); >+ std::vector<std::unique_ptr<RtcEventBweUpdateDelayBased>> events( >+ event_count_); >+ for (size_t i = 0; i < event_count_; ++i) { >+ events[i] = (i == 0 || !force_repeated_fields_) >+ ? 
gen_.NewBweUpdateDelayBased() >+ : events[0]->Copy(); >+ history_.push_back(events[i]->Copy()); >+ } > > std::string encoded = encoder_->EncodeBatch(history_.begin(), history_.end()); > ASSERT_TRUE(parsed_log_.ParseString(encoded)); >+ > const auto& bwe_delay_updates = parsed_log_.bwe_delay_updates(); >+ ASSERT_EQ(bwe_delay_updates.size(), event_count_); > >- ASSERT_EQ(bwe_delay_updates.size(), 1u); >- EXPECT_TRUE( >- test::VerifyLoggedBweDelayBasedUpdate(*event, bwe_delay_updates[0])); >+ for (size_t i = 0; i < event_count_; ++i) { >+ verifier_.VerifyLoggedBweDelayBasedUpdate(*events[i], bwe_delay_updates[i]); >+ } > } > > TEST_P(RtcEventLogEncoderTest, RtcEventBweUpdateLossBased) { >- std::unique_ptr<RtcEventBweUpdateLossBased> event = >- gen_.NewBweUpdateLossBased(); >- history_.push_back(event->Copy()); >+ std::vector<std::unique_ptr<RtcEventBweUpdateLossBased>> events(event_count_); >+ for (size_t i = 0; i < event_count_; ++i) { >+ events[i] = (i == 0 || !force_repeated_fields_) >+ ? gen_.NewBweUpdateLossBased() >+ : events[0]->Copy(); >+ history_.push_back(events[i]->Copy()); >+ } > > std::string encoded = encoder_->EncodeBatch(history_.begin(), history_.end()); > ASSERT_TRUE(parsed_log_.ParseString(encoded)); >+ > const auto& bwe_loss_updates = parsed_log_.bwe_loss_updates(); >+ ASSERT_EQ(bwe_loss_updates.size(), event_count_); > >- ASSERT_EQ(bwe_loss_updates.size(), 1u); >- EXPECT_TRUE( >- test::VerifyLoggedBweLossBasedUpdate(*event, bwe_loss_updates[0])); >+ for (size_t i = 0; i < event_count_; ++i) { >+ verifier_.VerifyLoggedBweLossBasedUpdate(*events[i], bwe_loss_updates[i]); >+ } > } > >+// TODO(eladalon/terelius): Test with multiple events in the batch. 
> TEST_P(RtcEventLogEncoderTest, RtcEventIceCandidatePairConfig) { > std::unique_ptr<RtcEventIceCandidatePairConfig> event = > gen_.NewIceCandidatePairConfig(); >@@ -258,10 +476,11 @@ TEST_P(RtcEventLogEncoderTest, RtcEventIceCandidatePairConfig) { > parsed_log_.ice_candidate_pair_configs(); > > ASSERT_EQ(ice_candidate_pair_configs.size(), 1u); >- EXPECT_TRUE(test::VerifyLoggedIceCandidatePairConfig( >- *event, ice_candidate_pair_configs[0])); >+ verifier_.VerifyLoggedIceCandidatePairConfig(*event, >+ ice_candidate_pair_configs[0]); > } > >+// TODO(eladalon/terelius): Test with multiple events in the batch. > TEST_P(RtcEventLogEncoderTest, RtcEventIceCandidatePair) { > std::unique_ptr<RtcEventIceCandidatePair> event = gen_.NewIceCandidatePair(); > history_.push_back(event->Copy()); >@@ -272,18 +491,21 @@ TEST_P(RtcEventLogEncoderTest, RtcEventIceCandidatePair) { > parsed_log_.ice_candidate_pair_events(); > > ASSERT_EQ(ice_candidate_pair_events.size(), 1u); >- EXPECT_TRUE(test::VerifyLoggedIceCandidatePairEvent( >- *event, ice_candidate_pair_events[0])); >+ verifier_.VerifyLoggedIceCandidatePairEvent(*event, >+ ice_candidate_pair_events[0]); > } > > TEST_P(RtcEventLogEncoderTest, RtcEventLoggingStarted) { > const int64_t timestamp_us = rtc::TimeMicros(); >+ const int64_t utc_time_us = rtc::TimeUTCMicros(); > >- ASSERT_TRUE(parsed_log_.ParseString(encoder_->EncodeLogStart(timestamp_us))); >+ ASSERT_TRUE(parsed_log_.ParseString( >+ encoder_->EncodeLogStart(timestamp_us, utc_time_us))); > const auto& start_log_events = parsed_log_.start_log_events(); > > ASSERT_EQ(start_log_events.size(), 1u); >- EXPECT_EQ(start_log_events[0].timestamp_us, timestamp_us); >+ verifier_.VerifyLoggedStartEvent(timestamp_us, utc_time_us, >+ start_log_events[0]); > } > > TEST_P(RtcEventLogEncoderTest, RtcEventLoggingStopped) { >@@ -293,9 +515,10 @@ TEST_P(RtcEventLogEncoderTest, RtcEventLoggingStopped) { > const auto& stop_log_events = parsed_log_.stop_log_events(); > > 
ASSERT_EQ(stop_log_events.size(), 1u); >- EXPECT_EQ(stop_log_events[0].timestamp_us, timestamp_us); >+ verifier_.VerifyLoggedStopEvent(timestamp_us, stop_log_events[0]); > } > >+// TODO(eladalon/terelius): Test with multiple events in the batch. > TEST_P(RtcEventLogEncoderTest, RtcEventProbeClusterCreated) { > std::unique_ptr<RtcEventProbeClusterCreated> event = > gen_.NewProbeClusterCreated(); >@@ -307,10 +530,11 @@ TEST_P(RtcEventLogEncoderTest, RtcEventProbeClusterCreated) { > parsed_log_.bwe_probe_cluster_created_events(); > > ASSERT_EQ(bwe_probe_cluster_created_events.size(), 1u); >- EXPECT_TRUE(test::VerifyLoggedBweProbeClusterCreatedEvent( >- *event, bwe_probe_cluster_created_events[0])); >+ verifier_.VerifyLoggedBweProbeClusterCreatedEvent( >+ *event, bwe_probe_cluster_created_events[0]); > } > >+// TODO(eladalon/terelius): Test with multiple events in the batch. > TEST_P(RtcEventLogEncoderTest, RtcEventProbeResultFailure) { > std::unique_ptr<RtcEventProbeResultFailure> event = > gen_.NewProbeResultFailure(); >@@ -321,10 +545,11 @@ TEST_P(RtcEventLogEncoderTest, RtcEventProbeResultFailure) { > const auto& bwe_probe_failure_events = parsed_log_.bwe_probe_failure_events(); > > ASSERT_EQ(bwe_probe_failure_events.size(), 1u); >- EXPECT_TRUE(test::VerifyLoggedBweProbeFailureEvent( >- *event, bwe_probe_failure_events[0])); >+ verifier_.VerifyLoggedBweProbeFailureEvent(*event, >+ bwe_probe_failure_events[0]); > } > >+// TODO(eladalon/terelius): Test with multiple events in the batch. 
> TEST_P(RtcEventLogEncoderTest, RtcEventProbeResultSuccess) { > std::unique_ptr<RtcEventProbeResultSuccess> event = > gen_.NewProbeResultSuccess(); >@@ -335,78 +560,67 @@ TEST_P(RtcEventLogEncoderTest, RtcEventProbeResultSuccess) { > const auto& bwe_probe_success_events = parsed_log_.bwe_probe_success_events(); > > ASSERT_EQ(bwe_probe_success_events.size(), 1u); >- EXPECT_TRUE(test::VerifyLoggedBweProbeSuccessEvent( >- *event, bwe_probe_success_events[0])); >+ verifier_.VerifyLoggedBweProbeSuccessEvent(*event, >+ bwe_probe_success_events[0]); > } > > TEST_P(RtcEventLogEncoderTest, RtcEventRtcpPacketIncoming) { >- std::unique_ptr<RtcEventRtcpPacketIncoming> event = >- gen_.NewRtcpPacketIncoming(); >- history_.push_back(event->Copy()); >+ if (!new_encoding_ && force_repeated_fields_) { >+ // The old encoding does not work with duplicated packets. Since the legacy >+ // encoding is being phased out, we will not fix this. >+ return; >+ } >+ >+ std::vector<std::unique_ptr<RtcEventRtcpPacketIncoming>> events(event_count_); >+ for (size_t i = 0; i < event_count_; ++i) { >+ events[i] = (i == 0 || !force_repeated_fields_) >+ ? 
gen_.NewRtcpPacketIncoming() >+ : events[0]->Copy(); >+ history_.push_back(events[i]->Copy()); >+ } > > std::string encoded = encoder_->EncodeBatch(history_.begin(), history_.end()); > ASSERT_TRUE(parsed_log_.ParseString(encoded)); >+ > const auto& incoming_rtcp_packets = parsed_log_.incoming_rtcp_packets(); >+ ASSERT_EQ(incoming_rtcp_packets.size(), event_count_); > >- ASSERT_EQ(incoming_rtcp_packets.size(), 1u); >- EXPECT_TRUE( >- test::VerifyLoggedRtcpPacketIncoming(*event, incoming_rtcp_packets[0])); >+ for (size_t i = 0; i < event_count_; ++i) { >+ verifier_.VerifyLoggedRtcpPacketIncoming(*events[i], >+ incoming_rtcp_packets[i]); >+ } > } > > TEST_P(RtcEventLogEncoderTest, RtcEventRtcpPacketOutgoing) { >- std::unique_ptr<RtcEventRtcpPacketOutgoing> event = >- gen_.NewRtcpPacketOutgoing(); >- history_.push_back(event->Copy()); >+ std::vector<std::unique_ptr<RtcEventRtcpPacketOutgoing>> events(event_count_); >+ for (size_t i = 0; i < event_count_; ++i) { >+ events[i] = (i == 0 || !force_repeated_fields_) >+ ? gen_.NewRtcpPacketOutgoing() >+ : events[0]->Copy(); >+ history_.push_back(events[i]->Copy()); >+ } > > std::string encoded = encoder_->EncodeBatch(history_.begin(), history_.end()); > ASSERT_TRUE(parsed_log_.ParseString(encoded)); >+ > const auto& outgoing_rtcp_packets = parsed_log_.outgoing_rtcp_packets(); >+ ASSERT_EQ(outgoing_rtcp_packets.size(), event_count_); > >- ASSERT_EQ(outgoing_rtcp_packets.size(), 1u); >- EXPECT_TRUE( >- test::VerifyLoggedRtcpPacketOutgoing(*event, outgoing_rtcp_packets[0])); >+ for (size_t i = 0; i < event_count_; ++i) { >+ verifier_.VerifyLoggedRtcpPacketOutgoing(*events[i], >+ outgoing_rtcp_packets[i]); >+ } > } > > TEST_P(RtcEventLogEncoderTest, RtcEventRtpPacketIncoming) { >- uint32_t ssrc = prng_.Rand<uint32_t>(); >- RtpHeaderExtensionMap extension_map; // TODO(terelius): Test extensions too. 
>- std::unique_ptr<RtcEventRtpPacketIncoming> event = >- gen_.NewRtpPacketIncoming(ssrc, extension_map); >- history_.push_back(event->Copy()); >- >- std::string encoded = encoder_->EncodeBatch(history_.begin(), history_.end()); >- ASSERT_TRUE(parsed_log_.ParseString(encoded)); >- const auto& incoming_rtp_packets_by_ssrc = >- parsed_log_.incoming_rtp_packets_by_ssrc(); >- >- ASSERT_EQ(incoming_rtp_packets_by_ssrc.size(), 1u); >- const auto& stream = incoming_rtp_packets_by_ssrc[0]; >- EXPECT_EQ(stream.ssrc, ssrc); >- ASSERT_EQ(stream.incoming_packets.size(), 1u); >- EXPECT_TRUE( >- test::VerifyLoggedRtpPacketIncoming(*event, stream.incoming_packets[0])); >+ TestRtpPackets<RtcEventRtpPacketIncoming, LoggedRtpPacketIncoming>(); > } > > TEST_P(RtcEventLogEncoderTest, RtcEventRtpPacketOutgoing) { >- uint32_t ssrc = prng_.Rand<uint32_t>(); >- RtpHeaderExtensionMap extension_map; // TODO(terelius): Test extensions too. >- std::unique_ptr<RtcEventRtpPacketOutgoing> event = >- gen_.NewRtpPacketOutgoing(ssrc, extension_map); >- history_.push_back(event->Copy()); >- >- std::string encoded = encoder_->EncodeBatch(history_.begin(), history_.end()); >- ASSERT_TRUE(parsed_log_.ParseString(encoded)); >- const auto& outgoing_rtp_packets_by_ssrc = >- parsed_log_.outgoing_rtp_packets_by_ssrc(); >- >- ASSERT_EQ(outgoing_rtp_packets_by_ssrc.size(), 1u); >- const auto& stream = outgoing_rtp_packets_by_ssrc[0]; >- EXPECT_EQ(stream.ssrc, ssrc); >- ASSERT_EQ(stream.outgoing_packets.size(), 1u); >- EXPECT_TRUE( >- test::VerifyLoggedRtpPacketOutgoing(*event, stream.outgoing_packets[0])); >+ TestRtpPackets<RtcEventRtpPacketOutgoing, LoggedRtpPacketOutgoing>(); > } > >+// TODO(eladalon/terelius): Test with multiple events in the batch. 
> TEST_P(RtcEventLogEncoderTest, RtcEventVideoReceiveStreamConfig) { > uint32_t ssrc = prng_.Rand<uint32_t>(); > RtpHeaderExtensionMap extensions = gen_.NewRtpHeaderExtensionMap(); >@@ -419,9 +633,10 @@ TEST_P(RtcEventLogEncoderTest, RtcEventVideoReceiveStreamConfig) { > const auto& video_recv_configs = parsed_log_.video_recv_configs(); > > ASSERT_EQ(video_recv_configs.size(), 1u); >- EXPECT_TRUE(test::VerifyLoggedVideoRecvConfig(*event, video_recv_configs[0])); >+ verifier_.VerifyLoggedVideoRecvConfig(*event, video_recv_configs[0]); > } > >+// TODO(eladalon/terelius): Test with multiple events in the batch. > TEST_P(RtcEventLogEncoderTest, RtcEventVideoSendStreamConfig) { > uint32_t ssrc = prng_.Rand<uint32_t>(); > RtpHeaderExtensionMap extensions = gen_.NewRtpHeaderExtensionMap(); >@@ -434,11 +649,15 @@ TEST_P(RtcEventLogEncoderTest, RtcEventVideoSendStreamConfig) { > const auto& video_send_configs = parsed_log_.video_send_configs(); > > ASSERT_EQ(video_send_configs.size(), 1u); >- EXPECT_TRUE(test::VerifyLoggedVideoSendConfig(*event, video_send_configs[0])); >+ verifier_.VerifyLoggedVideoSendConfig(*event, video_send_configs[0]); > } > >-INSTANTIATE_TEST_CASE_P(RandomSeeds, >- RtcEventLogEncoderTest, >- ::testing::Values(1, 2, 3, 4, 5)); >+INSTANTIATE_TEST_CASE_P( >+ RandomSeeds, >+ RtcEventLogEncoderTest, >+ ::testing::Combine(/* Random seed: */ ::testing::Values(1, 2, 3, 4, 5), >+ /* Encoding: */ ::testing::Bool(), >+ /* Event count: */ ::testing::Values(1, 2, 10, 100), >+ /* Repeated fields: */ ::testing::Bool())); > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/varint.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/varint.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..41e29e8480bd5aa11ca2e1b64057bda86dcf43ba >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/varint.cc >@@ -0,0 +1,80 @@ >+/* >+ * 
Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "logging/rtc_event_log/encoder/varint.h" >+ >+#include "rtc_base/checks.h" >+ >+// TODO(eladalon): Add unit tests. >+ >+namespace webrtc { >+ >+const size_t kMaxVarIntLengthBytes = 10; // ceil(64 / 7.0) is 10. >+ >+std::string EncodeVarInt(uint64_t input) { >+ std::string output; >+ output.reserve(kMaxVarIntLengthBytes); >+ >+ do { >+ uint8_t byte = static_cast<uint8_t>(input & 0x7f); >+ input >>= 7; >+ if (input > 0) { >+ byte |= 0x80; >+ } >+ output += byte; >+ } while (input > 0); >+ >+ RTC_DCHECK_GE(output.size(), 1u); >+ RTC_DCHECK_LE(output.size(), kMaxVarIntLengthBytes); >+ >+ return output; >+} >+ >+// There is some code duplication between the flavors of this function. >+// For performance's sake, it's best to just keep it. >+size_t DecodeVarInt(absl::string_view input, uint64_t* output) { >+ RTC_DCHECK(output); >+ >+ uint64_t decoded = 0; >+ for (size_t i = 0; i < input.length() && i < kMaxVarIntLengthBytes; ++i) { >+ decoded += (static_cast<uint64_t>(input[i] & 0x7f) >+ << static_cast<uint64_t>(7 * i)); >+ if (!(input[i] & 0x80)) { >+ *output = decoded; >+ return i + 1; >+ } >+ } >+ >+ return 0; >+} >+ >+// There is some code duplication between the flavors of this function. >+// For performance's sake, it's best to just keep it. 
>+size_t DecodeVarInt(rtc::BitBuffer* input, uint64_t* output) { >+ RTC_DCHECK(output); >+ >+ uint64_t decoded = 0; >+ for (size_t i = 0; i < kMaxVarIntLengthBytes; ++i) { >+ uint8_t byte; >+ if (!input->ReadUInt8(&byte)) { >+ return 0; >+ } >+ decoded += >+ (static_cast<uint64_t>(byte & 0x7f) << static_cast<uint64_t>(7 * i)); >+ if (!(byte & 0x80)) { >+ *output = decoded; >+ return i + 1; >+ } >+ } >+ >+ return 0; >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/varint.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/varint.h >new file mode 100644 >index 0000000000000000000000000000000000000000..95494ae753fc2587e7cbb0e3facc94ecf13682bf >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/encoder/varint.h >@@ -0,0 +1,45 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef LOGGING_RTC_EVENT_LOG_ENCODER_VARINT_H_ >+#define LOGGING_RTC_EVENT_LOG_ENCODER_VARINT_H_ >+ >+#include <string> >+ >+#include "absl/strings/string_view.h" >+#include "rtc_base/bitbuffer.h" >+ >+namespace webrtc { >+ >+extern const size_t kMaxVarIntLengthBytes; >+ >+// Encode a given uint64_t as a varint. From least to most significant, >+// each batch of seven bits are put into the lower bits of a byte, and the last >+// remaining bit in that byte (the highest one) marks whether additional bytes >+// follow (which happens if and only if there are other bits in |input| which >+// are non-zero). >+// Notes: If input == 0, one byte is used. 
If input is uint64_t::max, exactly >+// kMaxVarIntLengthBytes are used. >+std::string EncodeVarInt(uint64_t input); >+ >+// Inverse of EncodeVarInt(). >+// If decoding is successful, a non-zero number is returned, indicating the >+// number of bytes read from |input|, and the decoded varint is written >+// into |output|. >+// If not successful, 0 is returned, and |output| is not modified. >+size_t DecodeVarInt(absl::string_view input, uint64_t* output); >+ >+// Same as other version, but uses a rtc::BitBuffer for input. >+// Some bits may be consumed even if a varint fails to be read. >+size_t DecodeVarInt(rtc::BitBuffer* input, uint64_t* output); >+ >+} // namespace webrtc >+ >+#endif // LOGGING_RTC_EVENT_LOG_ENCODER_VARINT_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event.h >index 21d6e48e0f52904500492f3aeaa9ec6d3b967f47..7ae63758b158ab306e11f0550c898d8dc58d7ae3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event.h >@@ -57,12 +57,13 @@ class RtcEvent { > > virtual bool IsConfigEvent() const = 0; > >- virtual std::unique_ptr<RtcEvent> Copy() const = 0; >- >- const int64_t timestamp_us_; >+ int64_t timestamp_ms() const { return timestamp_us_ / 1000; } >+ int64_t timestamp_us() const { return timestamp_us_; } > > protected: > explicit RtcEvent(int64_t timestamp_us) : timestamp_us_(timestamp_us) {} >+ >+ const int64_t timestamp_us_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_alr_state.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_alr_state.cc >index c1f1b39532033010b5c6c81b4f48656c1685d66c..8e5c2ca82f71cc9fdeff8bb5d756ba64da72e616 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_alr_state.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_alr_state.cc >@@ -28,8 +28,8 @@ bool RtcEventAlrState::IsConfigEvent() const { > return false; > } > >-std::unique_ptr<RtcEvent> RtcEventAlrState::Copy() const { >- return absl::WrapUnique<RtcEvent>(new RtcEventAlrState(*this)); >+std::unique_ptr<RtcEventAlrState> RtcEventAlrState::Copy() const { >+ return absl::WrapUnique<RtcEventAlrState>(new RtcEventAlrState(*this)); > } > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_alr_state.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_alr_state.h >index e737c915e1d561118fc0eb46470e8dae6ff68c73..9769585d3d6826477d2e8a7567618047a3158808 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_alr_state.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_alr_state.h >@@ -26,12 +26,14 @@ class RtcEventAlrState final : public RtcEvent { > > bool IsConfigEvent() const override; > >- std::unique_ptr<RtcEvent> Copy() const override; >+ std::unique_ptr<RtcEventAlrState> Copy() const; > >- const bool in_alr_; >+ bool in_alr() const { return in_alr_; } > > private: > RtcEventAlrState(const RtcEventAlrState& other); >+ >+ const bool in_alr_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_network_adaptation.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_network_adaptation.cc >index dcf2d7de59224af2d64c111cebd6fb445ad5279a..571b9a93d3eb7dee9e62c037ed9684988d3e48f8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_network_adaptation.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_network_adaptation.cc >@@ -14,12 +14,15 @@ > > #include "absl/memory/memory.h" > #include "modules/audio_coding/audio_network_adaptor/include/audio_network_adaptor_config.h" >+#include "rtc_base/checks.h" > > namespace webrtc { > > RtcEventAudioNetworkAdaptation::RtcEventAudioNetworkAdaptation( > std::unique_ptr<AudioEncoderRuntimeConfig> config) >- : config_(std::move(config)) {} >+ : config_(std::move(config)) { >+ RTC_DCHECK(config_); >+} > > RtcEventAudioNetworkAdaptation::RtcEventAudioNetworkAdaptation( > const RtcEventAudioNetworkAdaptation& other) >@@ -36,7 +39,8 @@ bool RtcEventAudioNetworkAdaptation::IsConfigEvent() const { > return false; > } > >-std::unique_ptr<RtcEvent> RtcEventAudioNetworkAdaptation::Copy() const { >+std::unique_ptr<RtcEventAudioNetworkAdaptation> >+RtcEventAudioNetworkAdaptation::Copy() const { > return absl::WrapUnique(new RtcEventAudioNetworkAdaptation(*this)); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_network_adaptation.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_network_adaptation.h >index 65fbad8621d03ea7d6f2ba26cc5f7592255a1624..ec6ca1125c40a3e6ae2f95e7571927b451adb5ea 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_network_adaptation.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_network_adaptation.h >@@ -29,12 +29,14 @@ class RtcEventAudioNetworkAdaptation final : public RtcEvent { > > bool IsConfigEvent() const override; > >- std::unique_ptr<RtcEvent> Copy() const override; >+ std::unique_ptr<RtcEventAudioNetworkAdaptation> Copy() const; > >- const std::unique_ptr<const AudioEncoderRuntimeConfig> config_; >+ const AudioEncoderRuntimeConfig& config() const { return *config_; } > > private: > RtcEventAudioNetworkAdaptation(const 
RtcEventAudioNetworkAdaptation& other); >+ >+ const std::unique_ptr<const AudioEncoderRuntimeConfig> config_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_playout.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_playout.cc >index a9643ba868c038d81d0a3557239f9d8f2a0e3fc8..6c4aa98d3e238cac3253df2533da0c1063f22288 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_playout.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_playout.cc >@@ -27,8 +27,9 @@ bool RtcEventAudioPlayout::IsConfigEvent() const { > return false; > } > >-std::unique_ptr<RtcEvent> RtcEventAudioPlayout::Copy() const { >- return absl::WrapUnique<RtcEvent>(new RtcEventAudioPlayout(*this)); >+std::unique_ptr<RtcEventAudioPlayout> RtcEventAudioPlayout::Copy() const { >+ return absl::WrapUnique<RtcEventAudioPlayout>( >+ new RtcEventAudioPlayout(*this)); > } > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_playout.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_playout.h >index 3080f5b9f0e72eb1add3a9d216e65ced7ecf5363..4b40f5c02728aa6b4f203ddc01f6c97335d5df8f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_playout.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_playout.h >@@ -11,6 +11,7 @@ > #ifndef LOGGING_RTC_EVENT_LOG_EVENTS_RTC_EVENT_AUDIO_PLAYOUT_H_ > #define LOGGING_RTC_EVENT_LOG_EVENTS_RTC_EVENT_AUDIO_PLAYOUT_H_ > >+#include <stdint.h> > #include <memory> > > #include "logging/rtc_event_log/events/rtc_event.h" >@@ -26,12 +27,14 @@ class RtcEventAudioPlayout final : public RtcEvent { > > bool IsConfigEvent() const override; > >- std::unique_ptr<RtcEvent> 
Copy() const override; >+ std::unique_ptr<RtcEventAudioPlayout> Copy() const; > >- const uint32_t ssrc_; >+ uint32_t ssrc() const { return ssrc_; } > > private: > RtcEventAudioPlayout(const RtcEventAudioPlayout& other); >+ >+ const uint32_t ssrc_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_receive_stream_config.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_receive_stream_config.cc >index fee7c3c766c3d1da48beb88c07e50e03a0ed0555..fdef393d546c1a891662a2554d2d0779a91775e9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_receive_stream_config.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_receive_stream_config.cc >@@ -14,12 +14,15 @@ > > #include "absl/memory/memory.h" > #include "logging/rtc_event_log/rtc_stream_config.h" >+#include "rtc_base/checks.h" > > namespace webrtc { > > RtcEventAudioReceiveStreamConfig::RtcEventAudioReceiveStreamConfig( > std::unique_ptr<rtclog::StreamConfig> config) >- : config_(std::move(config)) {} >+ : config_(std::move(config)) { >+ RTC_DCHECK(config_); >+} > > RtcEventAudioReceiveStreamConfig::RtcEventAudioReceiveStreamConfig( > const RtcEventAudioReceiveStreamConfig& other) >@@ -36,9 +39,9 @@ bool RtcEventAudioReceiveStreamConfig::IsConfigEvent() const { > return true; > } > >-std::unique_ptr<RtcEvent> RtcEventAudioReceiveStreamConfig::Copy() const { >- auto config_copy = absl::make_unique<rtclog::StreamConfig>(*config_); >- return absl::WrapUnique<RtcEvent>( >+std::unique_ptr<RtcEventAudioReceiveStreamConfig> >+RtcEventAudioReceiveStreamConfig::Copy() const { >+ return absl::WrapUnique<RtcEventAudioReceiveStreamConfig>( > new RtcEventAudioReceiveStreamConfig(*this)); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_receive_stream_config.h 
b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_receive_stream_config.h >index 275765031436209d299fe7add7561dd3c3418b55..2b73f63f000dff06f94f39bba0cc3b3a76348fa2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_receive_stream_config.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_receive_stream_config.h >@@ -31,13 +31,15 @@ class RtcEventAudioReceiveStreamConfig final : public RtcEvent { > > bool IsConfigEvent() const override; > >- std::unique_ptr<RtcEvent> Copy() const override; >+ std::unique_ptr<RtcEventAudioReceiveStreamConfig> Copy() const; > >- const std::unique_ptr<const rtclog::StreamConfig> config_; >+ const rtclog::StreamConfig& config() const { return *config_; } > > private: > RtcEventAudioReceiveStreamConfig( > const RtcEventAudioReceiveStreamConfig& other); >+ >+ const std::unique_ptr<const rtclog::StreamConfig> config_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_send_stream_config.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_send_stream_config.cc >index 47939c7bdc32ef51af0b9092d0499e5ba88e4944..f1a85bff6946500446fe8e318b9aa0fae1c3c431 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_send_stream_config.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_send_stream_config.cc >@@ -14,12 +14,15 @@ > > #include "absl/memory/memory.h" > #include "logging/rtc_event_log/rtc_stream_config.h" >+#include "rtc_base/checks.h" > > namespace webrtc { > > RtcEventAudioSendStreamConfig::RtcEventAudioSendStreamConfig( > std::unique_ptr<rtclog::StreamConfig> config) >- : config_(std::move(config)) {} >+ : config_(std::move(config)) { >+ RTC_DCHECK(config_); >+} > > 
RtcEventAudioSendStreamConfig::RtcEventAudioSendStreamConfig( > const RtcEventAudioSendStreamConfig& other) >@@ -36,9 +39,10 @@ bool RtcEventAudioSendStreamConfig::IsConfigEvent() const { > return true; > } > >-std::unique_ptr<RtcEvent> RtcEventAudioSendStreamConfig::Copy() const { >- auto config_copy = absl::make_unique<rtclog::StreamConfig>(*config_); >- return absl::WrapUnique<RtcEvent>(new RtcEventAudioSendStreamConfig(*this)); >+std::unique_ptr<RtcEventAudioSendStreamConfig> >+RtcEventAudioSendStreamConfig::Copy() const { >+ return absl::WrapUnique<RtcEventAudioSendStreamConfig>( >+ new RtcEventAudioSendStreamConfig(*this)); > } > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_send_stream_config.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_send_stream_config.h >index b248a61915019e97a6e758977ba602a95c0dfc31..c0efa950901bbedc205de431d5de1a68d626dafb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_send_stream_config.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_audio_send_stream_config.h >@@ -31,12 +31,14 @@ class RtcEventAudioSendStreamConfig final : public RtcEvent { > > bool IsConfigEvent() const override; > >- std::unique_ptr<RtcEvent> Copy() const override; >+ std::unique_ptr<RtcEventAudioSendStreamConfig> Copy() const; > >- const std::unique_ptr<const rtclog::StreamConfig> config_; >+ const rtclog::StreamConfig& config() const { return *config_; } > > private: > RtcEventAudioSendStreamConfig(const RtcEventAudioSendStreamConfig& other); >+ >+ const std::unique_ptr<const rtclog::StreamConfig> config_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_bwe_update_delay_based.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_bwe_update_delay_based.cc >index 8c17971260520bbaf32a23610de7d11aa56fd359..dcc87421f8737d7e33fc6de2cf6e306dcf1f2be3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_bwe_update_delay_based.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_bwe_update_delay_based.cc >@@ -36,8 +36,10 @@ bool RtcEventBweUpdateDelayBased::IsConfigEvent() const { > return false; > } > >-std::unique_ptr<RtcEvent> RtcEventBweUpdateDelayBased::Copy() const { >- return absl::WrapUnique<RtcEvent>(new RtcEventBweUpdateDelayBased(*this)); >+std::unique_ptr<RtcEventBweUpdateDelayBased> RtcEventBweUpdateDelayBased::Copy() >+ const { >+ return absl::WrapUnique<RtcEventBweUpdateDelayBased>( >+ new RtcEventBweUpdateDelayBased(*this)); > } > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_bwe_update_delay_based.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_bwe_update_delay_based.h >index 3f47dc7f97045d279eda22bd9bb8a4fb621d983b..a97d49f5a54e47bc0982bf9455b1d3dd4141c93a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_bwe_update_delay_based.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_bwe_update_delay_based.h >@@ -11,6 +11,7 @@ > #ifndef LOGGING_RTC_EVENT_LOG_EVENTS_RTC_EVENT_BWE_UPDATE_DELAY_BASED_H_ > #define LOGGING_RTC_EVENT_LOG_EVENTS_RTC_EVENT_BWE_UPDATE_DELAY_BASED_H_ > >+#include <stdint.h> > #include <memory> > > #include "logging/rtc_event_log/events/rtc_event.h" >@@ -29,13 +30,16 @@ class RtcEventBweUpdateDelayBased final : public RtcEvent { > > bool IsConfigEvent() const override; > >- std::unique_ptr<RtcEvent> Copy() const override; >+ std::unique_ptr<RtcEventBweUpdateDelayBased> Copy() const; > >- const int32_t bitrate_bps_; 
>- const BandwidthUsage detector_state_; >+ int32_t bitrate_bps() const { return bitrate_bps_; } >+ BandwidthUsage detector_state() const { return detector_state_; } > > private: > RtcEventBweUpdateDelayBased(const RtcEventBweUpdateDelayBased& other); >+ >+ const int32_t bitrate_bps_; >+ const BandwidthUsage detector_state_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_bwe_update_loss_based.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_bwe_update_loss_based.cc >index b0e72ddb0aa794fd67333b06524713a18cdf698c..8453238cf903b38e914529131108eb8a5852e995 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_bwe_update_loss_based.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_bwe_update_loss_based.cc >@@ -38,8 +38,10 @@ bool RtcEventBweUpdateLossBased::IsConfigEvent() const { > return false; > } > >-std::unique_ptr<RtcEvent> RtcEventBweUpdateLossBased::Copy() const { >- return absl::WrapUnique<RtcEvent>(new RtcEventBweUpdateLossBased(*this)); >+std::unique_ptr<RtcEventBweUpdateLossBased> RtcEventBweUpdateLossBased::Copy() >+ const { >+ return absl::WrapUnique<RtcEventBweUpdateLossBased>( >+ new RtcEventBweUpdateLossBased(*this)); > } > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_bwe_update_loss_based.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_bwe_update_loss_based.h >index 2d4ffda1b1ea6ed66d01800de749d6a1d42887d8..b4f88775fb72eaa9618aaa919ec136cc5d83fea8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_bwe_update_loss_based.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_bwe_update_loss_based.h >@@ -11,6 +11,7 @@ > #ifndef 
LOGGING_RTC_EVENT_LOG_EVENTS_RTC_EVENT_BWE_UPDATE_LOSS_BASED_H_ > #define LOGGING_RTC_EVENT_LOG_EVENTS_RTC_EVENT_BWE_UPDATE_LOSS_BASED_H_ > >+#include <stdint.h> > #include <memory> > > #include "logging/rtc_event_log/events/rtc_event.h" >@@ -28,14 +29,18 @@ class RtcEventBweUpdateLossBased final : public RtcEvent { > > bool IsConfigEvent() const override; > >- std::unique_ptr<RtcEvent> Copy() const override; >+ std::unique_ptr<RtcEventBweUpdateLossBased> Copy() const; > >- const int32_t bitrate_bps_; >- const uint8_t fraction_loss_; >- const int32_t total_packets_; >+ int32_t bitrate_bps() const { return bitrate_bps_; } >+ uint8_t fraction_loss() const { return fraction_loss_; } >+ int32_t total_packets() const { return total_packets_; } > > private: > RtcEventBweUpdateLossBased(const RtcEventBweUpdateLossBased& other); >+ >+ const int32_t bitrate_bps_; >+ const uint8_t fraction_loss_; >+ const int32_t total_packets_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_ice_candidate_pair.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_ice_candidate_pair.cc >index fa32de68d5eab7c1a2ff7e5736753b0150b26a01..9c31737ea5ee611351807e1b04057d0c5627ec62 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_ice_candidate_pair.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_ice_candidate_pair.cc >@@ -35,8 +35,10 @@ bool RtcEventIceCandidatePair::IsConfigEvent() const { > return false; > } > >-std::unique_ptr<RtcEvent> RtcEventIceCandidatePair::Copy() const { >- return absl::WrapUnique<RtcEvent>(new RtcEventIceCandidatePair(*this)); >+std::unique_ptr<RtcEventIceCandidatePair> RtcEventIceCandidatePair::Copy() >+ const { >+ return absl::WrapUnique<RtcEventIceCandidatePair>( >+ new RtcEventIceCandidatePair(*this)); > } > > } // namespace webrtc >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_ice_candidate_pair.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_ice_candidate_pair.h >index f49a0d1792cb431befc37badc6b1693dfab3ab11..7414fec79b1e8b062d204d5786746df1473ca5c9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_ice_candidate_pair.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_ice_candidate_pair.h >@@ -36,13 +36,16 @@ class RtcEventIceCandidatePair final : public RtcEvent { > > bool IsConfigEvent() const override; > >- std::unique_ptr<RtcEvent> Copy() const override; >+ std::unique_ptr<RtcEventIceCandidatePair> Copy() const; > >- const IceCandidatePairEventType type_; >- const uint32_t candidate_pair_id_; >+ IceCandidatePairEventType type() const { return type_; } >+ uint32_t candidate_pair_id() const { return candidate_pair_id_; } > > private: > RtcEventIceCandidatePair(const RtcEventIceCandidatePair& other); >+ >+ const IceCandidatePairEventType type_; >+ const uint32_t candidate_pair_id_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_ice_candidate_pair_config.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_ice_candidate_pair_config.cc >index f89de7740c7be98ebd62f8f0a6eef3c25d087dfb..fbb8a73dfbea59f66ff3bf56e6454c74454f55d6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_ice_candidate_pair_config.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_ice_candidate_pair_config.cc >@@ -64,8 +64,10 @@ bool RtcEventIceCandidatePairConfig::IsConfigEvent() const { > return false; > } > >-std::unique_ptr<RtcEvent> RtcEventIceCandidatePairConfig::Copy() const { >- return absl::WrapUnique<RtcEvent>(new RtcEventIceCandidatePairConfig(*this)); 
>+std::unique_ptr<RtcEventIceCandidatePairConfig> >+RtcEventIceCandidatePairConfig::Copy() const { >+ return absl::WrapUnique<RtcEventIceCandidatePairConfig>( >+ new RtcEventIceCandidatePairConfig(*this)); > } > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_ice_candidate_pair_config.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_ice_candidate_pair_config.h >index 1f6eb8ceb0fffe8e87d1fe489e26e23ef2bb96ec..40bdbaa771e81c0cfb6a9d0132b7cdef42b9cd70 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_ice_candidate_pair_config.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_ice_candidate_pair_config.h >@@ -92,14 +92,20 @@ class RtcEventIceCandidatePairConfig final : public RtcEvent { > > bool IsConfigEvent() const override; > >- std::unique_ptr<RtcEvent> Copy() const override; >+ std::unique_ptr<RtcEventIceCandidatePairConfig> Copy() const; > >- const IceCandidatePairConfigType type_; >- const uint32_t candidate_pair_id_; >- const IceCandidatePairDescription candidate_pair_desc_; >+ IceCandidatePairConfigType type() const { return type_; } >+ uint32_t candidate_pair_id() const { return candidate_pair_id_; } >+ const IceCandidatePairDescription& candidate_pair_desc() const { >+ return candidate_pair_desc_; >+ } > > private: > RtcEventIceCandidatePairConfig(const RtcEventIceCandidatePairConfig& other); >+ >+ const IceCandidatePairConfigType type_; >+ const uint32_t candidate_pair_id_; >+ const IceCandidatePairDescription candidate_pair_desc_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_probe_cluster_created.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_probe_cluster_created.cc >index 
a60765dbd0f1beff9bdbbb44f134d24cff8f4e5e..c11a6ce78eb56e0bf9363b4cf8a1a6b2aacdfe4e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_probe_cluster_created.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_probe_cluster_created.cc >@@ -39,8 +39,10 @@ bool RtcEventProbeClusterCreated::IsConfigEvent() const { > return false; > } > >-std::unique_ptr<RtcEvent> RtcEventProbeClusterCreated::Copy() const { >- return absl::WrapUnique<RtcEvent>(new RtcEventProbeClusterCreated(*this)); >+std::unique_ptr<RtcEventProbeClusterCreated> RtcEventProbeClusterCreated::Copy() >+ const { >+ return absl::WrapUnique<RtcEventProbeClusterCreated>( >+ new RtcEventProbeClusterCreated(*this)); > } > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_probe_cluster_created.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_probe_cluster_created.h >index 3e8c21f5c9219839673334eb0816dca1251450f0..ad757edd530e0cf85a15e18e44c25cf15d401865 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_probe_cluster_created.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_probe_cluster_created.h >@@ -11,6 +11,7 @@ > #ifndef LOGGING_RTC_EVENT_LOG_EVENTS_RTC_EVENT_PROBE_CLUSTER_CREATED_H_ > #define LOGGING_RTC_EVENT_LOG_EVENTS_RTC_EVENT_PROBE_CLUSTER_CREATED_H_ > >+#include <stdint.h> > #include <memory> > > #include "logging/rtc_event_log/events/rtc_event.h" >@@ -29,15 +30,20 @@ class RtcEventProbeClusterCreated final : public RtcEvent { > > bool IsConfigEvent() const override; > >- std::unique_ptr<RtcEvent> Copy() const override; >+ std::unique_ptr<RtcEventProbeClusterCreated> Copy() const; >+ >+ int32_t id() const { return id_; } >+ int32_t bitrate_bps() const { return bitrate_bps_; } >+ uint32_t min_probes() const { return min_probes_; } >+ 
uint32_t min_bytes() const { return min_bytes_; } >+ >+ private: >+ RtcEventProbeClusterCreated(const RtcEventProbeClusterCreated& other); > > const int32_t id_; > const int32_t bitrate_bps_; > const uint32_t min_probes_; > const uint32_t min_bytes_; >- >- private: >- RtcEventProbeClusterCreated(const RtcEventProbeClusterCreated& other); > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_probe_result_failure.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_probe_result_failure.cc >index aa6091ca775b54b92db908a71e011546565f28b6..295003ae8d049114114e2494f48e7058b1b7b428 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_probe_result_failure.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_probe_result_failure.cc >@@ -33,8 +33,10 @@ bool RtcEventProbeResultFailure::IsConfigEvent() const { > return false; > } > >-std::unique_ptr<RtcEvent> RtcEventProbeResultFailure::Copy() const { >- return absl::WrapUnique<RtcEvent>(new RtcEventProbeResultFailure(*this)); >+std::unique_ptr<RtcEventProbeResultFailure> RtcEventProbeResultFailure::Copy() >+ const { >+ return absl::WrapUnique<RtcEventProbeResultFailure>( >+ new RtcEventProbeResultFailure(*this)); > } > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_probe_result_failure.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_probe_result_failure.h >index 181c6940370ddd0636c99e21e552b965c34f2898..0c40f0cd1b11eb9309fc6a6acf7f9c5bdb67ecce 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_probe_result_failure.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_probe_result_failure.h >@@ -11,6 +11,7 @@ > #ifndef 
LOGGING_RTC_EVENT_LOG_EVENTS_RTC_EVENT_PROBE_RESULT_FAILURE_H_ > #define LOGGING_RTC_EVENT_LOG_EVENTS_RTC_EVENT_PROBE_RESULT_FAILURE_H_ > >+#include <stdint.h> > #include <memory> > > #include "logging/rtc_event_log/events/rtc_event.h" >@@ -33,13 +34,16 @@ class RtcEventProbeResultFailure final : public RtcEvent { > > bool IsConfigEvent() const override; > >- std::unique_ptr<RtcEvent> Copy() const override; >+ std::unique_ptr<RtcEventProbeResultFailure> Copy() const; > >- const int32_t id_; >- const ProbeFailureReason failure_reason_; >+ int32_t id() const { return id_; } >+ ProbeFailureReason failure_reason() const { return failure_reason_; } > > private: > RtcEventProbeResultFailure(const RtcEventProbeResultFailure& other); >+ >+ const int32_t id_; >+ const ProbeFailureReason failure_reason_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_probe_result_success.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_probe_result_success.cc >index 55907295debd613d93508dbe75b6d10e6cc3a2cc..d5f9e2f780ca977566f41c4c3334a5e1442d68dc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_probe_result_success.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_probe_result_success.cc >@@ -32,8 +32,10 @@ bool RtcEventProbeResultSuccess::IsConfigEvent() const { > return false; > } > >-std::unique_ptr<RtcEvent> RtcEventProbeResultSuccess::Copy() const { >- return absl::WrapUnique<RtcEvent>(new RtcEventProbeResultSuccess(*this)); >+std::unique_ptr<RtcEventProbeResultSuccess> RtcEventProbeResultSuccess::Copy() >+ const { >+ return absl::WrapUnique<RtcEventProbeResultSuccess>( >+ new RtcEventProbeResultSuccess(*this)); > } > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_probe_result_success.h 
b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_probe_result_success.h >index 15ff1833bf04d1c62ada136ed3803dbfdcbd59a9..a08dadd212ba823f058fd2925328db60036ac3e0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_probe_result_success.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_probe_result_success.h >@@ -11,6 +11,7 @@ > #ifndef LOGGING_RTC_EVENT_LOG_EVENTS_RTC_EVENT_PROBE_RESULT_SUCCESS_H_ > #define LOGGING_RTC_EVENT_LOG_EVENTS_RTC_EVENT_PROBE_RESULT_SUCCESS_H_ > >+#include <stdint.h> > #include <memory> > > #include "logging/rtc_event_log/events/rtc_event.h" >@@ -26,13 +27,16 @@ class RtcEventProbeResultSuccess final : public RtcEvent { > > bool IsConfigEvent() const override; > >- std::unique_ptr<RtcEvent> Copy() const override; >+ std::unique_ptr<RtcEventProbeResultSuccess> Copy() const; > >- const int32_t id_; >- const int32_t bitrate_bps_; >+ int32_t id() const { return id_; } >+ int32_t bitrate_bps() const { return bitrate_bps_; } > > private: > RtcEventProbeResultSuccess(const RtcEventProbeResultSuccess& other); >+ >+ const int32_t id_; >+ const int32_t bitrate_bps_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtcp_packet_incoming.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtcp_packet_incoming.cc >index 119938c7f71884809d5482b0008411c160cd76e3..45a418f1f69c2bc786086cfde9909fa71bf4282f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtcp_packet_incoming.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtcp_packet_incoming.cc >@@ -33,8 +33,10 @@ bool RtcEventRtcpPacketIncoming::IsConfigEvent() const { > return false; > } > >-std::unique_ptr<RtcEvent> RtcEventRtcpPacketIncoming::Copy() const { >- return 
absl::WrapUnique<RtcEvent>(new RtcEventRtcpPacketIncoming(*this)); >+std::unique_ptr<RtcEventRtcpPacketIncoming> RtcEventRtcpPacketIncoming::Copy() >+ const { >+ return absl::WrapUnique<RtcEventRtcpPacketIncoming>( >+ new RtcEventRtcpPacketIncoming(*this)); > } > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtcp_packet_incoming.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtcp_packet_incoming.h >index 6d904953a48a8876c2370ebd7e08927233c2677e..8394fe005cd56382acf63a12e390938e06da4f18 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtcp_packet_incoming.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtcp_packet_incoming.h >@@ -11,6 +11,7 @@ > #ifndef LOGGING_RTC_EVENT_LOG_EVENTS_RTC_EVENT_RTCP_PACKET_INCOMING_H_ > #define LOGGING_RTC_EVENT_LOG_EVENTS_RTC_EVENT_RTCP_PACKET_INCOMING_H_ > >+#include <stdint.h> > #include <memory> > > #include "api/array_view.h" >@@ -28,12 +29,14 @@ class RtcEventRtcpPacketIncoming final : public RtcEvent { > > bool IsConfigEvent() const override; > >- std::unique_ptr<RtcEvent> Copy() const override; >+ std::unique_ptr<RtcEventRtcpPacketIncoming> Copy() const; > >- rtc::Buffer packet_; >+ const rtc::Buffer& packet() const { return packet_; } > > private: > RtcEventRtcpPacketIncoming(const RtcEventRtcpPacketIncoming& other); >+ >+ rtc::Buffer packet_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtcp_packet_outgoing.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtcp_packet_outgoing.cc >index 4085ac0f31362aef9d5e57f2129095de93e6a744..b583e5614a2040ce8ea724cf0ed96a8eb3fe2449 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtcp_packet_outgoing.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtcp_packet_outgoing.cc >@@ -33,8 +33,10 @@ bool RtcEventRtcpPacketOutgoing::IsConfigEvent() const { > return false; > } > >-std::unique_ptr<RtcEvent> RtcEventRtcpPacketOutgoing::Copy() const { >- return absl::WrapUnique<RtcEvent>(new RtcEventRtcpPacketOutgoing(*this)); >+std::unique_ptr<RtcEventRtcpPacketOutgoing> RtcEventRtcpPacketOutgoing::Copy() >+ const { >+ return absl::WrapUnique<RtcEventRtcpPacketOutgoing>( >+ new RtcEventRtcpPacketOutgoing(*this)); > } > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtcp_packet_outgoing.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtcp_packet_outgoing.h >index 3aa9e7199ad633c1e7661b492613abdf6008e8de..b47b85d0b3635d18b071e465d6b864ffec09099f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtcp_packet_outgoing.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtcp_packet_outgoing.h >@@ -11,6 +11,7 @@ > #ifndef LOGGING_RTC_EVENT_LOG_EVENTS_RTC_EVENT_RTCP_PACKET_OUTGOING_H_ > #define LOGGING_RTC_EVENT_LOG_EVENTS_RTC_EVENT_RTCP_PACKET_OUTGOING_H_ > >+#include <stdint.h> > #include <memory> > > #include "api/array_view.h" >@@ -28,12 +29,14 @@ class RtcEventRtcpPacketOutgoing final : public RtcEvent { > > bool IsConfigEvent() const override; > >- std::unique_ptr<RtcEvent> Copy() const override; >+ std::unique_ptr<RtcEventRtcpPacketOutgoing> Copy() const; > >- rtc::Buffer packet_; >+ const rtc::Buffer& packet() const { return packet_; } > > private: > RtcEventRtcpPacketOutgoing(const RtcEventRtcpPacketOutgoing& other); >+ >+ rtc::Buffer packet_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtp_packet_incoming.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtp_packet_incoming.cc >index 8e15d021751b685cf9a5ccead4b2178e818792a9..898c0aaf8ee64f15b0f4ac6e8799346dddbcb428 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtp_packet_incoming.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtp_packet_incoming.cc >@@ -12,18 +12,26 @@ > > #include "absl/memory/memory.h" > #include "modules/rtp_rtcp/source/rtp_packet_received.h" >+#include "rtc_base/checks.h" > > namespace webrtc { > > RtcEventRtpPacketIncoming::RtcEventRtpPacketIncoming( > const RtpPacketReceived& packet) >- : packet_length_(packet.size()) { >+ : payload_length_(packet.payload_size()), >+ header_length_(packet.headers_size()), >+ padding_length_(packet.padding_size()) { > header_.CopyHeaderFrom(packet); >+ RTC_DCHECK_EQ(packet.size(), >+ payload_length_ + header_length_ + padding_length_); > } > > RtcEventRtpPacketIncoming::RtcEventRtpPacketIncoming( > const RtcEventRtpPacketIncoming& other) >- : RtcEvent(other.timestamp_us_), packet_length_(other.packet_length_) { >+ : RtcEvent(other.timestamp_us_), >+ payload_length_(other.payload_length_), >+ header_length_(other.header_length_), >+ padding_length_(other.padding_length_) { > header_.CopyHeaderFrom(other.header_); > } > >@@ -37,8 +45,10 @@ bool RtcEventRtpPacketIncoming::IsConfigEvent() const { > return false; > } > >-std::unique_ptr<RtcEvent> RtcEventRtpPacketIncoming::Copy() const { >- return absl::WrapUnique<RtcEvent>(new RtcEventRtpPacketIncoming(*this)); >+std::unique_ptr<RtcEventRtpPacketIncoming> RtcEventRtpPacketIncoming::Copy() >+ const { >+ return absl::WrapUnique<RtcEventRtpPacketIncoming>( >+ new RtcEventRtpPacketIncoming(*this)); > } > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtp_packet_incoming.h 
b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtp_packet_incoming.h >index 6b141b6d57622e6d29b19a61749fb6cb336139c5..1e357351b3cf26da031c8f1b98f95f2389faff75 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtp_packet_incoming.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtp_packet_incoming.h >@@ -29,13 +29,24 @@ class RtcEventRtpPacketIncoming final : public RtcEvent { > > bool IsConfigEvent() const override; > >- std::unique_ptr<RtcEvent> Copy() const override; >+ std::unique_ptr<RtcEventRtpPacketIncoming> Copy() const; > >- RtpPacket header_; // Only the packet's header will be stored here. >- const size_t packet_length_; // Length before stripping away all but header. >+ size_t packet_length() const { >+ return payload_length_ + header_length_ + padding_length_; >+ } >+ >+ const RtpPacket& header() const { return header_; } >+ size_t payload_length() const { return payload_length_; } >+ size_t header_length() const { return header_length_; } >+ size_t padding_length() const { return padding_length_; } > > private: > RtcEventRtpPacketIncoming(const RtcEventRtpPacketIncoming& other); >+ >+ RtpPacket header_; // Only the packet's header will be stored here. >+ const size_t payload_length_; // Media payload, excluding header and padding. >+ const size_t header_length_; // RTP header. >+ const size_t padding_length_; // RTP padding. 
> }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtp_packet_outgoing.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtp_packet_outgoing.cc >index 103440ad6e9661daded0fe1feac25a022df0da3f..050474edda7c02c0cff4f34730b0ad6a7f4d9f7d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtp_packet_outgoing.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtp_packet_outgoing.cc >@@ -12,20 +12,28 @@ > > #include "absl/memory/memory.h" > #include "modules/rtp_rtcp/source/rtp_packet_to_send.h" >+#include "rtc_base/checks.h" > > namespace webrtc { > > RtcEventRtpPacketOutgoing::RtcEventRtpPacketOutgoing( > const RtpPacketToSend& packet, > int probe_cluster_id) >- : packet_length_(packet.size()), probe_cluster_id_(probe_cluster_id) { >+ : payload_length_(packet.payload_size()), >+ header_length_(packet.headers_size()), >+ padding_length_(packet.padding_size()), >+ probe_cluster_id_(probe_cluster_id) { > header_.CopyHeaderFrom(packet); >+ RTC_DCHECK_EQ(packet.size(), >+ payload_length_ + header_length_ + padding_length_); > } > > RtcEventRtpPacketOutgoing::RtcEventRtpPacketOutgoing( > const RtcEventRtpPacketOutgoing& other) > : RtcEvent(other.timestamp_us_), >- packet_length_(other.packet_length_), >+ payload_length_(other.payload_length_), >+ header_length_(other.header_length_), >+ padding_length_(other.padding_length_), > probe_cluster_id_(other.probe_cluster_id_) { > header_.CopyHeaderFrom(other.header_); > } >@@ -40,8 +48,10 @@ bool RtcEventRtpPacketOutgoing::IsConfigEvent() const { > return false; > } > >-std::unique_ptr<RtcEvent> RtcEventRtpPacketOutgoing::Copy() const { >- return absl::WrapUnique<RtcEvent>(new RtcEventRtpPacketOutgoing(*this)); >+std::unique_ptr<RtcEventRtpPacketOutgoing> RtcEventRtpPacketOutgoing::Copy() >+ const { >+ return 
absl::WrapUnique<RtcEventRtpPacketOutgoing>( >+ new RtcEventRtpPacketOutgoing(*this)); > } > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtp_packet_outgoing.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtp_packet_outgoing.h >index 171a21c420e41676f3f33050cacc1c6e2d61c455..ebddc14231f3af193f2d0223d337966f017c5048 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtp_packet_outgoing.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_rtp_packet_outgoing.h >@@ -30,14 +30,27 @@ class RtcEventRtpPacketOutgoing final : public RtcEvent { > > bool IsConfigEvent() const override; > >- std::unique_ptr<RtcEvent> Copy() const override; >+ std::unique_ptr<RtcEventRtpPacketOutgoing> Copy() const; > >- RtpPacket header_; // Only the packet's header will be stored here. >- const size_t packet_length_; // Length before stripping away all but header. >- const int probe_cluster_id_; >+ size_t packet_length() const { >+ return payload_length_ + header_length_ + padding_length_; >+ } >+ >+ const RtpPacket& header() const { return header_; } >+ size_t payload_length() const { return payload_length_; } >+ size_t header_length() const { return header_length_; } >+ size_t padding_length() const { return padding_length_; } >+ int probe_cluster_id() const { return probe_cluster_id_; } > > private: > RtcEventRtpPacketOutgoing(const RtcEventRtpPacketOutgoing& other); >+ >+ RtpPacket header_; // Only the packet's header will be stored here. >+ const size_t payload_length_; // Media payload, excluding header and padding. >+ const size_t header_length_; // RTP header. >+ const size_t padding_length_; // RTP padding. >+ // TODO(eladalon): Delete |probe_cluster_id_| along with legacy encoding. 
>+ const int probe_cluster_id_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_video_receive_stream_config.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_video_receive_stream_config.cc >index 7f4dd9b7b807c4b12fd389bd426d4fdeb6c58640..5dec97bf83772e912b1986b02221c55d9e88e2a3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_video_receive_stream_config.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_video_receive_stream_config.cc >@@ -13,12 +13,15 @@ > #include <utility> > > #include "absl/memory/memory.h" >+#include "rtc_base/checks.h" > > namespace webrtc { > > RtcEventVideoReceiveStreamConfig::RtcEventVideoReceiveStreamConfig( > std::unique_ptr<rtclog::StreamConfig> config) >- : config_(std::move(config)) {} >+ : config_(std::move(config)) { >+ RTC_DCHECK(config_); >+} > > RtcEventVideoReceiveStreamConfig::RtcEventVideoReceiveStreamConfig( > const RtcEventVideoReceiveStreamConfig& other) >@@ -35,8 +38,9 @@ bool RtcEventVideoReceiveStreamConfig::IsConfigEvent() const { > return true; > } > >-std::unique_ptr<RtcEvent> RtcEventVideoReceiveStreamConfig::Copy() const { >- return absl::WrapUnique<RtcEvent>( >+std::unique_ptr<RtcEventVideoReceiveStreamConfig> >+RtcEventVideoReceiveStreamConfig::Copy() const { >+ return absl::WrapUnique<RtcEventVideoReceiveStreamConfig>( > new RtcEventVideoReceiveStreamConfig(*this)); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_video_receive_stream_config.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_video_receive_stream_config.h >index c6263c3fdbe22c7ccb691d7be882550235ff4482..801ba7ddd44d81468e4154f61c271a625e12babc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_video_receive_stream_config.h 
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_video_receive_stream_config.h >@@ -28,13 +28,15 @@ class RtcEventVideoReceiveStreamConfig final : public RtcEvent { > > bool IsConfigEvent() const override; > >- std::unique_ptr<RtcEvent> Copy() const override; >+ std::unique_ptr<RtcEventVideoReceiveStreamConfig> Copy() const; > >- const std::unique_ptr<const rtclog::StreamConfig> config_; >+ const rtclog::StreamConfig& config() const { return *config_; } > > private: > RtcEventVideoReceiveStreamConfig( > const RtcEventVideoReceiveStreamConfig& other); >+ >+ const std::unique_ptr<const rtclog::StreamConfig> config_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_video_send_stream_config.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_video_send_stream_config.cc >index b4b78a8a52a75c13ccec6f714a13e30535465bbd..dc4b167438cba793e1aec888adc16472cbf89c5f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_video_send_stream_config.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_video_send_stream_config.cc >@@ -35,8 +35,10 @@ bool RtcEventVideoSendStreamConfig::IsConfigEvent() const { > return true; > } > >-std::unique_ptr<RtcEvent> RtcEventVideoSendStreamConfig::Copy() const { >- return absl::WrapUnique<RtcEvent>(new RtcEventVideoSendStreamConfig(*this)); >+std::unique_ptr<RtcEventVideoSendStreamConfig> >+RtcEventVideoSendStreamConfig::Copy() const { >+ return absl::WrapUnique<RtcEventVideoSendStreamConfig>( >+ new RtcEventVideoSendStreamConfig(*this)); > } > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_video_send_stream_config.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_video_send_stream_config.h >index 
3472f8855be669dd13ca68eb651e18853a53eb0f..fe274c8c3a55fd350f83ad500cea482062b296cb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_video_send_stream_config.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/events/rtc_event_video_send_stream_config.h >@@ -28,12 +28,14 @@ class RtcEventVideoSendStreamConfig final : public RtcEvent { > > bool IsConfigEvent() const override; > >- std::unique_ptr<RtcEvent> Copy() const override; >+ std::unique_ptr<RtcEventVideoSendStreamConfig> Copy() const; > >- const std::unique_ptr<const rtclog::StreamConfig> config_; >+ const rtclog::StreamConfig& config() const { return *config_; } > > private: > RtcEventVideoSendStreamConfig(const RtcEventVideoSendStreamConfig& other); >+ >+ const std::unique_ptr<const rtclog::StreamConfig> config_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log.cc >index 73a598d6b1e32a8c418a23328165dd8e2b12c8ee..b48fe9234fd74fc72a8911ff4baac55a71604bda 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log.cc >@@ -10,8 +10,14 @@ > > #include "logging/rtc_event_log/rtc_event_log.h" > >+#include "absl/memory/memory.h" >+ > namespace webrtc { > >+std::unique_ptr<RtcEventLog> RtcEventLog::CreateNull() { >+ return absl::make_unique<RtcEventLogNullImpl>(); >+} >+ > bool RtcEventLogNullImpl::StartLogging( > std::unique_ptr<RtcEventLogOutput> output, > int64_t output_period_ms) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log.h >index 0c7140690241d808740a1940e749189e9e64c5c2..7d900c00686f860728c9202004af81bfa1df88ca 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log.h >@@ -11,9 +11,8 @@ > #ifndef LOGGING_RTC_EVENT_LOG_RTC_EVENT_LOG_H_ > #define LOGGING_RTC_EVENT_LOG_RTC_EVENT_LOG_H_ > >+#include <stdint.h> > #include <memory> >-#include <string> >-#include <utility> > > #include "api/rtceventlogoutput.h" > #include "logging/rtc_event_log/events/rtc_event.h" >@@ -29,10 +28,9 @@ class RtcEventLog { > enum : size_t { kUnlimitedOutput = 0 }; > enum : int64_t { kImmediateOutput = 0 }; > >- // TODO(eladalon): Two stages are upcoming. >- // 1. Extend this to actually support the new encoding. >- // 2. Get rid of the legacy encoding, allowing us to get rid of this enum. >- enum class EncodingType { Legacy }; >+ // TODO(eladalon): Get rid of the legacy encoding and this enum once all >+ // clients have migrated to the new format. >+ enum class EncodingType { Legacy, NewFormat }; > > virtual ~RtcEventLog() {} > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log2.proto b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log2.proto >index 4cac4553ced4347ba3a27d3bb08af4152d371725..995853b565cc1e18d30ed8bacd180af5d75ad5cd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log2.proto >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log2.proto >@@ -10,20 +10,17 @@ package webrtc.rtclog2; > // single EventStream object containing the same events. Hence, it is not > // necessary to wait for the entire log to be complete before beginning to > // write it to a file. >+// Note: For all X_deltas fields, we rely on the default value being an >+// empty string. > message EventStream { > // Deprecated - Maintained for compatibility with the old event log. 
>- // TODO(terelius): Maybe we can remove this and instead check the stream for >- // presence of a version field. That requires a custom protobuf parser, but we >- // have that already anyway. > repeated Event stream = 1 [deprecated = true]; >- // required - The version number must be 2 for this version of the event log. >- optional uint32 version = 2; >- repeated IncomingRtpPackets incoming_rtp_packets = 3; >- repeated OutgoingRtpPackets outgoing_rtp_packets = 4; >- repeated IncomingRtcpPackets incoming_rtcp_packets = 5; >- repeated OutgoingRtcpPackets outgoing_rtcp_packets = 6; >- repeated AudioPlayoutEvents audio_playout_events = 7; >- // The field tags 8-15 are reserved for the most common events >+ repeated IncomingRtpPackets incoming_rtp_packets = 2; >+ repeated OutgoingRtpPackets outgoing_rtp_packets = 3; >+ repeated IncomingRtcpPackets incoming_rtcp_packets = 4; >+ repeated OutgoingRtcpPackets outgoing_rtcp_packets = 5; >+ repeated AudioPlayoutEvents audio_playout_events = 6; >+ // The field tags 7-15 are reserved for the most common events. > repeated BeginLogEvent begin_log_events = 16; > repeated EndLogEvent end_log_events = 17; > repeated LossBasedBweUpdates loss_based_bwe_updates = 18; >@@ -32,6 +29,9 @@ message EventStream { > repeated BweProbeCluster probe_clusters = 21; > repeated BweProbeResultSuccess probe_success = 22; > repeated BweProbeResultFailure probe_failure = 23; >+ repeated AlrState alr_states = 24; >+ repeated IceCandidatePairConfig ice_candidate_configs = 25; >+ repeated IceCandidatePairEvent ice_candidate_events = 26; > > repeated AudioRecvStreamConfig audio_recv_stream_configs = 101; > repeated AudioSendStreamConfig audio_send_stream_configs = 102; >@@ -45,136 +45,206 @@ message Event { > } > > message IncomingRtpPackets { >+ // required > optional int64 timestamp_ms = 1; > >- // RTP marker bit, used to label boundaries within e.g. video frames. >+ // required - RTP marker bit, used to label boundaries between video frames. 
> optional bool marker = 2; > >- // RTP payload type. >+ // required - RTP payload type. > optional uint32 payload_type = 3; > >- // RTP sequence number. >+ // required - RTP sequence number. > optional uint32 sequence_number = 4; > >- // RTP monotonic clock timestamp (not actual time). >+ // required - RTP monotonic clock timestamp (not actual time). > optional fixed32 rtp_timestamp = 5; > >- // Synchronization source of this packet's RTP stream. >+ // required - Synchronization source of this packet's RTP stream. > optional fixed32 ssrc = 6; > > // TODO(terelius/dinor): Add CSRCs. Field number 7 reserved for this purpose. > >- // required - The size of the packet including both payload and header. >- optional uint32 packet_size = 8; >+ // required - The size (in bytes) of the media payload, not including >+ // RTP header or padding. The packet size is the sum of payload, header and >+ // padding. >+ optional uint32 payload_size = 8; >+ >+ // required - The size (in bytes) of the RTP header. >+ optional uint32 header_size = 9; >+ >+ // required - The size (in bytes) of the padding. >+ optional uint32 padding_size = 10; >+ >+ // optional - required if the batch contains delta encoded events. >+ optional uint32 number_of_deltas = 11; >+ >+ // Field numbers 12-14 reserved for future use. > > // Optional header extensions. >- optional int32 transmission_time_offset = 9; >- optional uint32 absolute_send_time = 10; >- optional uint32 transport_sequence_number = 11; >- optional uint32 audio_level = 12; >- // TODO(terelius): Add header extensions like video rotation, playout delay? >- >- // Delta encodings >- optional bytes timestamp_deltas_ms = 101; >+ optional uint32 transport_sequence_number = 15; >+ optional int32 transmission_time_offset = 16; >+ optional uint32 absolute_send_time = 17; >+ optional uint32 video_rotation = 18; >+ // |audio_level| and |voice_activity| are always used in conjunction. 
>+ optional uint32 audio_level = 19; >+ optional bool voice_activity = 20; >+ // TODO(terelius): Add other header extensions like playout delay? >+ >+ // Delta encodings. >+ optional bytes timestamp_ms_deltas = 101; > optional bytes marker_deltas = 102; > optional bytes payload_type_deltas = 103; > optional bytes sequence_number_deltas = 104; > optional bytes rtp_timestamp_deltas = 105; >+ // Field number 107 reserved for CSRC. > optional bytes ssrc_deltas = 106; >- optional bytes packet_size_deltas = 107; >- optional bytes transmission_time_offset_deltas = 108; >- optional bytes absolute_send_time_deltas = 109; >- optional bytes transport_sequence_number_deltas = 110; >- optional bytes audio_level_deltas = 111; >+ optional bytes payload_size_deltas = 108; >+ optional bytes header_size_deltas = 109; >+ optional bytes padding_size_deltas = 110; >+ // Field number 111-114 reserved for future use. >+ optional bytes transport_sequence_number_deltas = 115; >+ optional bytes transmission_time_offset_deltas = 116; >+ optional bytes absolute_send_time_deltas = 117; >+ optional bytes video_rotation_deltas = 118; >+ // |audio_level| and |voice_activity| are always used in conjunction. >+ optional bytes audio_level_deltas = 119; >+ optional bytes voice_activity_deltas = 120; > } > > message OutgoingRtpPackets { >+ // required > optional int64 timestamp_ms = 1; > >- // RTP marker bit, used to label boundaries within e.g. video frames. >+ // required - RTP marker bit, used to label boundaries between video frames. > optional bool marker = 2; > >+ // required - RTP payload type. > optional uint32 payload_type = 3; > >- // RTP sequence number. >+ // required - RTP sequence number. > optional uint32 sequence_number = 4; > >- // RTP monotonic clock timestamp (not actual time). >+ // required - RTP monotonic clock timestamp (not actual time). > optional fixed32 rtp_timestamp = 5; > >- // Synchronization source of this packet's RTP stream. 
>+ // required - Synchronization source of this packet's RTP stream. > optional fixed32 ssrc = 6; > > // TODO(terelius/dinor): Add CSRCs. Field number 7 reserved for this purpose. > >- // required - The size of the packet including both payload and header. >- optional uint32 packet_size = 8; >+ // required - The size (in bytes) of the media payload, not including >+ // RTP header or padding. The packet size is the sum of payload, header and >+ // padding. >+ optional uint32 payload_size = 8; >+ >+ // required - The size (in bytes) of the RTP header. >+ optional uint32 header_size = 9; >+ >+ // required - The size (in bytes) of the padding. >+ optional uint32 padding_size = 10; >+ >+ // optional - required if the batch contains delta encoded events. >+ optional uint32 number_of_deltas = 11; >+ >+ // Field numbers 12-14 reserved for future use. > > // Optional header extensions. >- optional int32 transmission_time_offset = 9; >- optional uint32 absolute_send_time = 10; >- optional uint32 transport_sequence_number = 11; >- optional uint32 audio_level = 12; >- // TODO(terelius): Add header extensions like video rotation, playout delay? >- >- // Delta encodings >- optional bytes timestamp_deltas_ms = 101; >+ optional uint32 transport_sequence_number = 15; >+ optional int32 transmission_time_offset = 16; >+ optional uint32 absolute_send_time = 17; >+ optional uint32 video_rotation = 18; >+ // |audio_level| and |voice_activity| are always used in conjunction. >+ optional uint32 audio_level = 19; >+ optional bool voice_activity = 20; >+ // TODO(terelius): Add other header extensions like playout delay? >+ >+ // Delta encodings. 
>+ optional bytes timestamp_ms_deltas = 101; > optional bytes marker_deltas = 102; > optional bytes payload_type_deltas = 103; > optional bytes sequence_number_deltas = 104; > optional bytes rtp_timestamp_deltas = 105; > optional bytes ssrc_deltas = 106; >- optional bytes packet_size_deltas = 107; >- optional bytes probe_cluster_id_deltas = 108; >- optional bytes transmission_time_offset_deltas = 109; >- optional bytes absolute_send_time_deltas = 110; >- optional bytes transport_sequence_number_deltas = 111; >+ // Field number 107 reserved for CSRC. >+ optional bytes payload_size_deltas = 108; >+ optional bytes header_size_deltas = 109; >+ optional bytes padding_size_deltas = 110; >+ // Field number 111-114 reserved for future use. >+ optional bytes transport_sequence_number_deltas = 115; >+ optional bytes transmission_time_offset_deltas = 116; >+ optional bytes absolute_send_time_deltas = 117; >+ optional bytes video_rotation_deltas = 118; >+ // |audio_level| and |voice_activity| are always used in conjunction. >+ optional bytes audio_level_deltas = 119; >+ optional bytes voice_activity_deltas = 120; > } > > message IncomingRtcpPackets { >+ // required > optional int64 timestamp_ms = 1; > > // required - The whole packet including both payload and header. > optional bytes raw_packet = 2; > // TODO(terelius): Feasible to log parsed RTCP instead? > >- // Delta encodings >- optional bytes timestamp_deltas_ms = 101; >- optional bytes raw_packet_deltas = 102; >+ // optional - required if the batch contains delta encoded events. >+ optional uint32 number_of_deltas = 3; >+ >+ // Delta/blob encodings. >+ optional bytes timestamp_ms_deltas = 101; >+ optional bytes raw_packet_blobs = 102; > } > > message OutgoingRtcpPackets { >+ // required > optional int64 timestamp_ms = 1; > > // required - The whole packet including both payload and header. > optional bytes raw_packet = 2; > // TODO(terelius): Feasible to log parsed RTCP instead? 
> >- // Delta encodings >- optional bytes timestamp_deltas_ms = 101; >- optional bytes raw_packet_deltas = 102; >+ // optional - required if the batch contains delta encoded events. >+ optional uint32 number_of_deltas = 3; >+ >+ // Delta/blob encodings. >+ optional bytes timestamp_ms_deltas = 101; >+ optional bytes raw_packet_blobs = 102; > } > > message AudioPlayoutEvents { >+ // required > optional int64 timestamp_ms = 1; > > // required - The SSRC of the audio stream associated with the playout event. > optional uint32 local_ssrc = 2; > >- // Delta encodings >- optional bytes timestamp_deltas_ms = 101; >+ // optional - required if the batch contains delta encoded events. >+ optional uint32 number_of_deltas = 3; >+ >+ // Delta encodings. >+ optional bytes timestamp_ms_deltas = 101; > optional bytes local_ssrc_deltas = 102; > } > > message BeginLogEvent { >+ // required > optional int64 timestamp_ms = 1; >+ >+ // required >+ optional uint32 version = 2; >+ >+ // required >+ optional int64 utc_time_ms = 3; > } > > message EndLogEvent { >+ // required > optional int64 timestamp_ms = 1; > } > > message LossBasedBweUpdates { >+ // required > optional int64 timestamp_ms = 1; > > // TODO(terelius): Update log interface to unsigned. >@@ -191,29 +261,37 @@ message LossBasedBweUpdates { > // required - Total number of packets that the BWE update is based on. > optional uint32 total_packets = 4; > >- // Delta encodings >- optional bytes timestamp_deltas_ms = 101; >- optional bytes bitrate_deltas_bps = 102; >+ // optional - required if the batch contains delta encoded events. >+ optional uint32 number_of_deltas = 5; >+ >+ // Delta encodings. >+ optional bytes timestamp_ms_deltas = 101; >+ optional bytes bitrate_bps_deltas = 102; > optional bytes fraction_loss_deltas = 103; > optional bytes total_packets_deltas = 104; > } > > message DelayBasedBweUpdates { >+ // required > optional int64 timestamp_ms = 1; > > // required - Bandwidth estimate (in bps) after the update. 
> optional uint32 bitrate_bps = 2; > > enum DetectorState { >- BWE_NORMAL = 0; >- BWE_UNDERUSING = 1; >- BWE_OVERUSING = 2; >+ BWE_UNKNOWN_STATE = 0; >+ BWE_NORMAL = 1; >+ BWE_UNDERUSING = 2; >+ BWE_OVERUSING = 3; > } > optional DetectorState detector_state = 3; > >- // Delta encodings >- optional bytes timestamp_deltas_ms = 101; >- optional bytes bitrate_deltas_bps = 102; >+ // optional - required if the batch contains delta encoded events. >+ optional uint32 number_of_deltas = 4; >+ >+ // Delta encodings. >+ optional bytes timestamp_ms_deltas = 101; >+ optional bytes bitrate_bps_deltas = 102; > optional bytes detector_state_deltas = 103; > } > >@@ -221,15 +299,17 @@ message DelayBasedBweUpdates { > message RtpHeaderExtensionConfig { > // Optional IDs for the header extensions. Each ID is a 4-bit number that is > // only set if that extension is configured. >- // TODO(terelius): Can we skip transmission_time_offset? When is it used? >+ // TODO: Can we skip audio level? > optional int32 transmission_time_offset_id = 1; > optional int32 absolute_send_time_id = 2; > optional int32 transport_sequence_number_id = 3; >- optional int32 audio_level_id = 4; >- // TODO(terelius): Add video_rotation and playout delay? >+ optional int32 video_rotation_id = 4; >+ optional int32 audio_level_id = 5; >+ // TODO(terelius): Add other header extensions like playout delay? > } > > message VideoRecvStreamConfig { >+ // required > optional int64 timestamp_ms = 1; > > // required - Synchronization source (stream identifier) to be received. >@@ -238,42 +318,38 @@ message VideoRecvStreamConfig { > // required - Sender SSRC used for sending RTCP (such as receiver reports). > optional uint32 local_ssrc = 3; > >- // required if RTX is configured >+ // optional - required if RTX is configured. SSRC for the RTX stream. > optional uint32 rtx_ssrc = 4; > >- // optional - RTP source stream ID >- optional bytes rsid = 5; >- > // IDs for the header extension we care about. 
Only required if there are > // header extensions configured. >- optional RtpHeaderExtensionConfig header_extensions = 6; >+ optional RtpHeaderExtensionConfig header_extensions = 5; > > // TODO(terelius): Do we need codec-payload mapping? If so and rtx_ssrc is > // used, we also need a map between RTP payload type and RTX payload type. > } > > message VideoSendStreamConfig { >+ // required > optional int64 timestamp_ms = 1; > >- // Synchronization source (stream identifier) for outgoing stream. >- // One stream can have several ssrcs for e.g. simulcast. >+ // required - Synchronization source (stream identifier) for outgoing stream. >+ // When using simulcast, a separate config should be logged for each stream. > optional uint32 ssrc = 2; > >- // SSRC for the RTX stream >+ // optional - required if RTX is configured. SSRC for the RTX stream. > optional uint32 rtx_ssrc = 3; > >- // RTP source stream ID >- optional bytes rsid = 4; >- > // IDs for the header extension we care about. Only required if there are > // header extensions configured. >- optional RtpHeaderExtensionConfig header_extensions = 5; >+ optional RtpHeaderExtensionConfig header_extensions = 4; > > // TODO(terelius): Do we need codec-payload mapping? If so and rtx_ssrc is > // used, we also need a map between RTP payload type and RTX payload type. > } > > message AudioRecvStreamConfig { >+ // required > optional int64 timestamp_ms = 1; > > // required - Synchronization source (stream identifier) to be received. >@@ -284,38 +360,33 @@ message AudioRecvStreamConfig { > > // Field number 4 reserved for RTX SSRC. > >- // optional - RTP source stream ID >- optional bytes rsid = 5; >- > // IDs for the header extension we care about. Only required if there are > // header extensions configured. >- optional RtpHeaderExtensionConfig header_extensions = 6; >+ optional RtpHeaderExtensionConfig header_extensions = 5; > > // TODO(terelius): Do we need codec-payload mapping? 
If so and rtx_ssrc is > // used, we also need a map between RTP payload type and RTX payload type. > } > > message AudioSendStreamConfig { >+ // required > optional int64 timestamp_ms = 1; > >- // Synchronization source (stream identifier) for outgoing stream. >- // One stream can have several ssrcs for e.g. simulcast. >+ // required - Synchronization source (stream identifier) for outgoing stream. > optional uint32 ssrc = 2; > >- // Field number 3 reserved for RTX SSRC >- >- // RTP source stream ID >- optional bytes rsid = 4; >+ // Field number 3 reserved for RTX SSRC. > > // IDs for the header extension we care about. Only required if there are > // header extensions configured. >- optional RtpHeaderExtensionConfig header_extensions = 5; >+ optional RtpHeaderExtensionConfig header_extensions = 4; > > // TODO(terelius): Do we need codec-payload mapping? If so and rtx_ssrc is > // used, we also need a map between RTP payload type and RTX payload type. > } > > message AudioNetworkAdaptations { >+ // required > optional int64 timestamp_ms = 1; > > // Bit rate that the audio encoder is operating at. >@@ -328,7 +399,12 @@ message AudioNetworkAdaptations { > > // Packet loss fraction that the encoder's forward error correction (FEC) is > // optimized for. >- optional float uplink_packet_loss_fraction = 4; >+ // Instead of encoding a float, we encode a value between 0 and 16383, which >+ // if divided by 16383, will give a value close to the original float. >+ // The value 16383 (2^14 - 1) was chosen so that it would give good precision >+ // on the one hand, and would be encodable with two bytes in varint form >+ // on the other hand. >+ optional uint32 uplink_packet_loss_fraction = 4; > > // Whether forward error correction (FEC) is turned on or off. > optional bool enable_fec = 5; >@@ -339,10 +415,13 @@ message AudioNetworkAdaptations { > // Number of audio channels that each encoded packet consists of. 
> optional uint32 num_channels = 7; > >- // Delta encodings >- optional bytes timestamp_deltas_ms = 101; >- optional bytes bitrate_deltas_bps = 102; >- optional bytes frame_length_deltas_ms = 103; >+ // optional - required if the batch contains delta encoded events. >+ optional uint32 number_of_deltas = 8; >+ >+ // Delta encodings. >+ optional bytes timestamp_ms_deltas = 101; >+ optional bytes bitrate_bps_deltas = 102; >+ optional bytes frame_length_ms_deltas = 103; > optional bytes uplink_packet_loss_fraction_deltas = 104; > optional bytes enable_fec_deltas = 105; > optional bytes enable_dtx_deltas = 106; >@@ -350,6 +429,7 @@ message AudioNetworkAdaptations { > } > > message BweProbeCluster { >+ // required > optional int64 timestamp_ms = 1; > > // required - The id of this probe cluster. >@@ -366,6 +446,7 @@ message BweProbeCluster { > } > > message BweProbeResultSuccess { >+ // required > optional int64 timestamp_ms = 1; > > // required - The id of this probe cluster. >@@ -376,6 +457,7 @@ message BweProbeResultSuccess { > } > > message BweProbeResultFailure { >+ // required > optional int64 timestamp_ms = 1; > > // required - The id of this probe cluster. >@@ -391,3 +473,101 @@ message BweProbeResultFailure { > // required > optional FailureReason failure = 3; > } >+ >+message AlrState { >+ // required >+ optional int64 timestamp_ms = 1; >+ >+ // required - True if the send rate is application limited. 
>+ optional bool in_alr = 2; >+} >+ >+message IceCandidatePairConfig { >+ enum IceCandidatePairConfigType { >+ UNKNOWN_CONFIG_TYPE = 0; >+ ADDED = 1; >+ UPDATED = 2; >+ DESTROYED = 3; >+ SELECTED = 4; >+ } >+ >+ enum IceCandidateType { >+ UNKNOWN_CANDIDATE_TYPE = 0; >+ LOCAL = 1; >+ STUN = 2; >+ PRFLX = 3; >+ RELAY = 4; >+ } >+ >+ enum Protocol { >+ UNKNOWN_PROTOCOL = 0; >+ UDP = 1; >+ TCP = 2; >+ SSLTCP = 3; >+ TLS = 4; >+ } >+ >+ enum AddressFamily { >+ UNKNOWN_ADDRESS_FAMILY = 0; >+ IPV4 = 1; >+ IPV6 = 2; >+ } >+ >+ enum NetworkType { >+ UNKNOWN_NETWORK_TYPE = 0; >+ ETHERNET = 1; >+ WIFI = 2; >+ CELLULAR = 3; >+ VPN = 4; >+ LOOPBACK = 5; >+ } >+ >+ // required >+ optional int64 timestamp_ms = 1; >+ >+ // required >+ optional IceCandidatePairConfigType config_type = 2; >+ >+ // required >+ optional uint32 candidate_pair_id = 3; >+ >+ // required >+ optional IceCandidateType local_candidate_type = 4; >+ >+ // required >+ optional Protocol local_relay_protocol = 5; >+ >+ // required >+ optional NetworkType local_network_type = 6; >+ >+ // required >+ optional AddressFamily local_address_family = 7; >+ >+ // required >+ optional IceCandidateType remote_candidate_type = 8; >+ >+ // required >+ optional AddressFamily remote_address_family = 9; >+ >+ // required >+ optional Protocol candidate_pair_protocol = 10; >+} >+ >+message IceCandidatePairEvent { >+ enum IceCandidatePairEventType { >+ UNKNOWN_CHECK_TYPE = 0; >+ CHECK_SENT = 1; >+ CHECK_RECEIVED = 2; >+ CHECK_RESPONSE_SENT = 3; >+ CHECK_RESPONSE_RECEIVED = 4; >+ } >+ >+ // required >+ optional int64 timestamp_ms = 1; >+ >+ // required >+ optional IceCandidatePairEventType event_type = 2; >+ >+ // required >+ optional uint32 candidate_pair_id = 3; >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log2rtp_dump.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log2rtp_dump.cc >index 
a3d3456a7bdeab7e2bcd9d32a1bfac2c21f9d378..da7df64faddf61a13e41d12fbd58e72f456a0feb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log2rtp_dump.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log2rtp_dump.cc >@@ -32,31 +32,32 @@ namespace { > > using MediaType = webrtc::ParsedRtcEventLogNew::MediaType; > >-DEFINE_bool( >+WEBRTC_DEFINE_bool( > audio, > true, > "Use --noaudio to exclude audio packets from the converted RTPdump file."); >-DEFINE_bool( >+WEBRTC_DEFINE_bool( > video, > true, > "Use --novideo to exclude video packets from the converted RTPdump file."); >-DEFINE_bool( >+WEBRTC_DEFINE_bool( > data, > true, > "Use --nodata to exclude data packets from the converted RTPdump file."); >-DEFINE_bool( >+WEBRTC_DEFINE_bool( > rtp, > true, > "Use --nortp to exclude RTP packets from the converted RTPdump file."); >-DEFINE_bool( >+WEBRTC_DEFINE_bool( > rtcp, > true, > "Use --nortcp to exclude RTCP packets from the converted RTPdump file."); >-DEFINE_string(ssrc, >- "", >- "Store only packets with this SSRC (decimal or hex, the latter " >- "starting with 0x)."); >-DEFINE_bool(help, false, "Prints this message."); >+WEBRTC_DEFINE_string( >+ ssrc, >+ "", >+ "Store only packets with this SSRC (decimal or hex, the latter " >+ "starting with 0x)."); >+WEBRTC_DEFINE_bool(help, false, "Prints this message."); > > // Parses the input string for a valid SSRC. If a valid SSRC is found, it is > // written to the output variable |ssrc|, and true is returned. Otherwise, >@@ -97,9 +98,10 @@ bool ShouldSkipStream(MediaType media_type, > // Convert a LoggedRtpPacketIncoming to a test::RtpPacket. Header extension IDs > // are allocated according to the provided extension map. This might not match > // the extension map used in the actual call. 
>-void ConvertRtpPacket(const webrtc::LoggedRtpPacketIncoming& incoming, >- const webrtc::RtpHeaderExtensionMap default_extension_map, >- webrtc::test::RtpPacket* packet) { >+void ConvertRtpPacket( >+ const webrtc::LoggedRtpPacketIncoming& incoming, >+ const webrtc::RtpHeaderExtensionMap& default_extension_map, >+ webrtc::test::RtpPacket* packet) { > webrtc::RtpPacket reconstructed_packet(&default_extension_map); > > reconstructed_packet.SetMarker(incoming.rtp.header.markerBit); >@@ -196,8 +198,6 @@ int main(int argc, char* argv[]) { > return -1; > } > >- std::cout << "Found " << parsed_stream.GetNumberOfEvents() >- << " events in the input file." << std::endl; > int rtp_counter = 0, rtcp_counter = 0; > bool header_only = false; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log2stats.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log2stats.cc >deleted file mode 100644 >index 928c829f49f13c0c51ac90855c56f4833d397ca3..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log2stats.cc >+++ /dev/null >@@ -1,265 +0,0 @@ >-/* >- * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include <inttypes.h> >-#include <stdio.h> >- >-#include <fstream> >-#include <iostream> >-#include <map> >-#include <string> >-#include <tuple> >-#include <utility> >-#include <vector> >- >-#include "logging/rtc_event_log/rtc_event_log.h" >-#include "rtc_base/checks.h" >-#include "rtc_base/flags.h" >-#include "rtc_base/ignore_wundef.h" >-#include "rtc_base/logging.h" >- >-// Files generated at build-time by the protobuf compiler. >-RTC_PUSH_IGNORING_WUNDEF() >-#ifdef WEBRTC_ANDROID_PLATFORM_BUILD >-#include "external/webrtc/webrtc/logging/rtc_event_log/rtc_event_log.pb.h" >-#else >-#include "logging/rtc_event_log/rtc_event_log.pb.h" >-#endif >-RTC_POP_IGNORING_WUNDEF() >- >-namespace { >- >-DEFINE_bool(help, false, "Prints this message."); >- >-struct Stats { >- int count = 0; >- size_t total_size = 0; >-}; >- >-// We are duplicating some parts of the parser here because we want to get >-// access to raw protobuf events. >-std::pair<uint64_t, bool> ParseVarInt(std::istream& stream) { >- uint64_t varint = 0; >- for (size_t bytes_read = 0; bytes_read < 10; ++bytes_read) { >- // The most significant bit of each byte is 0 if it is the last byte in >- // the varint and 1 otherwise. Thus, we take the 7 least significant bits >- // of each byte and shift them 7 bits for each byte read previously to get >- // the (unsigned) integer. 
>- int byte = stream.get(); >- if (stream.eof()) { >- return std::make_pair(varint, false); >- } >- RTC_DCHECK_GE(byte, 0); >- RTC_DCHECK_LE(byte, 255); >- varint |= static_cast<uint64_t>(byte & 0x7F) << (7 * bytes_read); >- if ((byte & 0x80) == 0) { >- return std::make_pair(varint, true); >- } >- } >- return std::make_pair(varint, false); >-} >- >-bool ParseEvents(const std::string& filename, >- std::vector<webrtc::rtclog::Event>* events) { >- std::ifstream stream(filename, std::ios_base::in | std::ios_base::binary); >- if (!stream.good() || !stream.is_open()) { >- RTC_LOG(LS_WARNING) << "Could not open file for reading."; >- return false; >- } >- >- const size_t kMaxEventSize = (1u << 16) - 1; >- std::vector<char> tmp_buffer(kMaxEventSize); >- uint64_t tag; >- uint64_t message_length; >- bool success; >- >- RTC_DCHECK(stream.good()); >- >- while (1) { >- // Check whether we have reached end of file. >- stream.peek(); >- if (stream.eof()) { >- return true; >- } >- >- // Read the next message tag. The tag number is defined as >- // (fieldnumber << 3) | wire_type. In our case, the field number is >- // supposed to be 1 and the wire type for an length-delimited field is 2. >- const uint64_t kExpectedTag = (1 << 3) | 2; >- std::tie(tag, success) = ParseVarInt(stream); >- if (!success) { >- RTC_LOG(LS_WARNING) >- << "Missing field tag from beginning of protobuf event."; >- return false; >- } else if (tag != kExpectedTag) { >- RTC_LOG(LS_WARNING) >- << "Unexpected field tag at beginning of protobuf event."; >- return false; >- } >- >- // Read the length field. >- std::tie(message_length, success) = ParseVarInt(stream); >- if (!success) { >- RTC_LOG(LS_WARNING) << "Missing message length after protobuf field tag."; >- return false; >- } else if (message_length > kMaxEventSize) { >- RTC_LOG(LS_WARNING) << "Protobuf message length is too large."; >- return false; >- } >- >- // Read the next protobuf event to a temporary char buffer. 
>- stream.read(tmp_buffer.data(), message_length); >- if (stream.gcount() != static_cast<int>(message_length)) { >- RTC_LOG(LS_WARNING) << "Failed to read protobuf message from file."; >- return false; >- } >- >- // Parse the protobuf event from the buffer. >- webrtc::rtclog::Event event; >- if (!event.ParseFromArray(tmp_buffer.data(), message_length)) { >- RTC_LOG(LS_WARNING) << "Failed to parse protobuf message."; >- return false; >- } >- events->push_back(event); >- } >-} >- >-// TODO(terelius): Should this be placed in some utility file instead? >-std::string EventTypeToString(webrtc::rtclog::Event::EventType event_type) { >- switch (event_type) { >- case webrtc::rtclog::Event::UNKNOWN_EVENT: >- return "UNKNOWN_EVENT"; >- case webrtc::rtclog::Event::LOG_START: >- return "LOG_START"; >- case webrtc::rtclog::Event::LOG_END: >- return "LOG_END"; >- case webrtc::rtclog::Event::RTP_EVENT: >- return "RTP_EVENT"; >- case webrtc::rtclog::Event::RTCP_EVENT: >- return "RTCP_EVENT"; >- case webrtc::rtclog::Event::AUDIO_PLAYOUT_EVENT: >- return "AUDIO_PLAYOUT_EVENT"; >- case webrtc::rtclog::Event::LOSS_BASED_BWE_UPDATE: >- return "LOSS_BASED_BWE_UPDATE"; >- case webrtc::rtclog::Event::DELAY_BASED_BWE_UPDATE: >- return "DELAY_BASED_BWE_UPDATE"; >- case webrtc::rtclog::Event::VIDEO_RECEIVER_CONFIG_EVENT: >- return "VIDEO_RECV_CONFIG"; >- case webrtc::rtclog::Event::VIDEO_SENDER_CONFIG_EVENT: >- return "VIDEO_SEND_CONFIG"; >- case webrtc::rtclog::Event::AUDIO_RECEIVER_CONFIG_EVENT: >- return "AUDIO_RECV_CONFIG"; >- case webrtc::rtclog::Event::AUDIO_SENDER_CONFIG_EVENT: >- return "AUDIO_SEND_CONFIG"; >- case webrtc::rtclog::Event::AUDIO_NETWORK_ADAPTATION_EVENT: >- return "AUDIO_NETWORK_ADAPTATION"; >- case webrtc::rtclog::Event::BWE_PROBE_CLUSTER_CREATED_EVENT: >- return "BWE_PROBE_CREATED"; >- case webrtc::rtclog::Event::BWE_PROBE_RESULT_EVENT: >- return "BWE_PROBE_RESULT"; >- case webrtc::rtclog::Event::ALR_STATE_EVENT: >- return "ALR_STATE_EVENT"; >- case 
webrtc::rtclog::Event::ICE_CANDIDATE_PAIR_CONFIG: >- return "ICE_CANDIDATE_PAIR_CONFIG"; >- case webrtc::rtclog::Event::ICE_CANDIDATE_PAIR_EVENT: >- return "ICE_CANDIDATE_PAIR_EVENT"; >- } >- RTC_NOTREACHED(); >- return "UNKNOWN_EVENT"; >-} >- >-} // namespace >- >-// This utility will print basic information about each packet to stdout. >-// Note that parser will assert if the protobuf event is missing some required >-// fields and we attempt to access them. We don't handle this at the moment. >-int main(int argc, char* argv[]) { >- std::string program_name = argv[0]; >- std::string usage = >- "Tool for file usage statistics from an RtcEventLog.\n" >- "Run " + >- program_name + >- " --help for usage.\n" >- "Example usage:\n" + >- program_name + " input.rel\n"; >- if (rtc::FlagList::SetFlagsFromCommandLine(&argc, argv, true) || FLAG_help || >- argc != 2) { >- std::cout << usage; >- if (FLAG_help) { >- rtc::FlagList::Print(nullptr, false); >- return 0; >- } >- return 1; >- } >- std::string file_name = argv[1]; >- >- std::vector<webrtc::rtclog::Event> events; >- if (!ParseEvents(file_name, &events)) { >- RTC_LOG(LS_ERROR) << "Failed to parse event log."; >- return -1; >- } >- >- // Get file size >- FILE* file = fopen(file_name.c_str(), "rb"); >- fseek(file, 0L, SEEK_END); >- int64_t file_size = ftell(file); >- fclose(file); >- >- // We are deliberately using low level protobuf functions to get the stats >- // since the convenience functions in the parser would CHECK that the events >- // are well formed. >- std::map<webrtc::rtclog::Event::EventType, Stats> stats; >- int malformed_events = 0; >- size_t malformed_event_size = 0; >- size_t accumulated_event_size = 0; >- for (const webrtc::rtclog::Event& event : events) { >- size_t serialized_size = event.ByteSizeLong(); >- // When the event is written on the disk, it is part of an EventStream >- // object. 
The event stream will prepend a 1 byte field number/wire type, >- // and a varint encoding (base 128) of the event length. >- serialized_size = >- 1 + (1 + (serialized_size > 127) + (serialized_size > 16383)) + >- serialized_size; >- >- if (event.has_type() && event.has_timestamp_us()) { >- stats[event.type()].count++; >- stats[event.type()].total_size += serialized_size; >- } else { >- // The event is missing the type or the timestamp field. >- malformed_events++; >- malformed_event_size += serialized_size; >- } >- accumulated_event_size += serialized_size; >- } >- >- printf("Type \tCount\tTotal size\tAverage size\tPercent\n"); >- printf( >- "-----------------------------------------------------------------------" >- "\n"); >- for (const auto it : stats) { >- printf("%-22s\t%5d\t%10zu\t%12.2lf\t%7.2lf\n", >- EventTypeToString(it.first).c_str(), it.second.count, >- it.second.total_size, >- static_cast<double>(it.second.total_size) / it.second.count, >- static_cast<double>(it.second.total_size) / file_size * 100); >- } >- if (malformed_events != 0) { >- printf("%-22s\t%5d\t%10zu\t%12.2lf\t%7.2lf\n", "MALFORMED", >- malformed_events, malformed_event_size, >- static_cast<double>(malformed_event_size) / malformed_events, >- static_cast<double>(malformed_event_size) / file_size * 100); >- } >- if (file_size - accumulated_event_size != 0) { >- printf("WARNING: %" PRId64 " bytes not accounted for\n", >- file_size - accumulated_event_size); >- } >- >- return 0; >-} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log2text.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log2text.cc >deleted file mode 100644 >index 67cb6ecdf4b5a5cbbfbd38693c520d2dde381b6f..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log2text.cc >+++ /dev/null >@@ -1,881 +0,0 @@ >-/* >- * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. 
>- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include <string.h> >- >-#include <iomanip> // setfill, setw >-#include <iostream> >-#include <map> >-#include <string> >-#include <utility> // pair >- >-#include "common_types.h" // NOLINT(build/include) >-#include "logging/rtc_event_log/rtc_event_log_parser_new.h" >-#include "modules/audio_coding/audio_network_adaptor/include/audio_network_adaptor_config.h" >-#include "modules/rtp_rtcp/source/rtcp_packet/bye.h" >-#include "modules/rtp_rtcp/source/rtcp_packet/common_header.h" >-#include "modules/rtp_rtcp/source/rtcp_packet/extended_reports.h" >-#include "modules/rtp_rtcp/source/rtcp_packet/fir.h" >-#include "modules/rtp_rtcp/source/rtcp_packet/nack.h" >-#include "modules/rtp_rtcp/source/rtcp_packet/pli.h" >-#include "modules/rtp_rtcp/source/rtcp_packet/rapid_resync_request.h" >-#include "modules/rtp_rtcp/source/rtcp_packet/receiver_report.h" >-#include "modules/rtp_rtcp/source/rtcp_packet/remb.h" >-#include "modules/rtp_rtcp/source/rtcp_packet/sdes.h" >-#include "modules/rtp_rtcp/source/rtcp_packet/sender_report.h" >-#include "modules/rtp_rtcp/source/rtcp_packet/tmmbn.h" >-#include "modules/rtp_rtcp/source/rtcp_packet/tmmbr.h" >-#include "modules/rtp_rtcp/source/rtcp_packet/transport_feedback.h" >-#include "modules/rtp_rtcp/source/rtp_header_extensions.h" >-#include "modules/rtp_rtcp/source/rtp_utility.h" >-#include "rtc_base/checks.h" >-#include "rtc_base/flags.h" >-#include "rtc_base/logging.h" >-#include "rtc_base/strings/string_builder.h" >- >-namespace { >- >-DEFINE_bool(unknown, true, "Use --nounknown to exclude unknown events."); >-DEFINE_bool(startstop, true, "Use --nostartstop to exclude start/stop events."); 
>-DEFINE_bool(config, true, "Use --noconfig to exclude stream configurations."); >-DEFINE_bool(bwe, true, "Use --nobwe to exclude BWE events."); >-DEFINE_bool(incoming, true, "Use --noincoming to exclude incoming packets."); >-DEFINE_bool(outgoing, true, "Use --nooutgoing to exclude packets."); >-// TODO(terelius): Note that the media type doesn't work with outgoing packets. >-DEFINE_bool(audio, true, "Use --noaudio to exclude audio packets."); >-// TODO(terelius): Note that the media type doesn't work with outgoing packets. >-DEFINE_bool(video, true, "Use --novideo to exclude video packets."); >-// TODO(terelius): Note that the media type doesn't work with outgoing packets. >-DEFINE_bool(data, true, "Use --nodata to exclude data packets."); >-DEFINE_bool(rtp, true, "Use --nortp to exclude RTP packets."); >-DEFINE_bool(rtcp, true, "Use --nortcp to exclude RTCP packets."); >-DEFINE_bool(playout, true, "Use --noplayout to exclude audio playout events."); >-DEFINE_bool(ana, true, "Use --noana to exclude ANA events."); >-DEFINE_bool(probe, true, "Use --noprobe to exclude probe events."); >-DEFINE_bool(ice, true, "Use --noice to exclude ICE events."); >- >-DEFINE_bool(print_full_packets, >- false, >- "Print the full RTP headers and RTCP packets in hex."); >- >-// TODO(terelius): Allow a list of SSRCs. >-DEFINE_string(ssrc, >- "", >- "Print only packets with this SSRC (decimal or hex, the latter " >- "starting with 0x)."); >-DEFINE_bool(help, false, "Prints this message."); >- >-using MediaType = webrtc::ParsedRtcEventLogNew::MediaType; >- >-static uint32_t filtered_ssrc = 0; >- >-// Parses the input string for a valid SSRC. If a valid SSRC is found, it is >-// written to the static global variable |filtered_ssrc|, and true is returned. >-// Otherwise, false is returned. >-// The empty string must be validated as true, because it is the default value >-// of the command-line flag. In this case, no value is written to the output >-// variable. 
>-bool ParseSsrc(std::string str) { >- // If the input string starts with 0x or 0X it indicates a hexadecimal number. >- auto read_mode = std::dec; >- if (str.size() > 2 && >- (str.substr(0, 2) == "0x" || str.substr(0, 2) == "0X")) { >- read_mode = std::hex; >- str = str.substr(2); >- } >- std::stringstream ss(str); >- ss >> read_mode >> filtered_ssrc; >- return str.empty() || (!ss.fail() && ss.eof()); >-} >- >-bool ExcludePacket(webrtc::PacketDirection direction, >- MediaType media_type, >- uint32_t packet_ssrc) { >- if (!FLAG_outgoing && direction == webrtc::kOutgoingPacket) >- return true; >- if (!FLAG_incoming && direction == webrtc::kIncomingPacket) >- return true; >- if (!FLAG_audio && media_type == MediaType::AUDIO) >- return true; >- if (!FLAG_video && media_type == MediaType::VIDEO) >- return true; >- if (!FLAG_data && media_type == MediaType::DATA) >- return true; >- if (strlen(FLAG_ssrc) > 0 && packet_ssrc != filtered_ssrc) >- return true; >- return false; >-} >- >-const char* StreamInfo(webrtc::PacketDirection direction, >- MediaType media_type) { >- if (direction == webrtc::kOutgoingPacket) { >- if (media_type == MediaType::AUDIO) >- return "(out,audio)"; >- else if (media_type == MediaType::VIDEO) >- return "(out,video)"; >- else if (media_type == MediaType::DATA) >- return "(out,data)"; >- else >- return "(out)"; >- } >- if (direction == webrtc::kIncomingPacket) { >- if (media_type == MediaType::AUDIO) >- return "(in,audio)"; >- else if (media_type == MediaType::VIDEO) >- return "(in,video)"; >- else if (media_type == MediaType::DATA) >- return "(in,data)"; >- else >- return "(in)"; >- } >- return "(unknown)"; >-} >- >-// Return default values for header extensions, to use on streams without stored >-// mapping data. Currently this only applies to audio streams, since the mapping >-// is not stored in the event log. >-// TODO(ivoc): Remove this once this mapping is stored in the event log for >-// audio streams. 
Tracking bug: webrtc:6399 >-webrtc::RtpHeaderExtensionMap GetDefaultHeaderExtensionMap() { >- webrtc::RtpHeaderExtensionMap default_map; >- default_map.Register<webrtc::AudioLevel>( >- webrtc::RtpExtension::kAudioLevelDefaultId); >- default_map.Register<webrtc::TransmissionOffset>( >- webrtc::RtpExtension::kTimestampOffsetDefaultId); >- default_map.Register<webrtc::AbsoluteSendTime>( >- webrtc::RtpExtension::kAbsSendTimeDefaultId); >- default_map.Register<webrtc::VideoOrientation>( >- webrtc::RtpExtension::kVideoRotationDefaultId); >- default_map.Register<webrtc::VideoContentTypeExtension>( >- webrtc::RtpExtension::kVideoContentTypeDefaultId); >- default_map.Register<webrtc::VideoTimingExtension>( >- webrtc::RtpExtension::kVideoTimingDefaultId); >- default_map.Register<webrtc::FrameMarkingExtension>( >- webrtc::RtpExtension::kFrameMarkingDefaultId); >- default_map.Register<webrtc::TransportSequenceNumber>( >- webrtc::RtpExtension::kTransportSequenceNumberDefaultId); >- default_map.Register<webrtc::PlayoutDelayLimits>( >- webrtc::RtpExtension::kPlayoutDelayDefaultId); >- return default_map; >-} >- >-void PrintSenderReport(const webrtc::ParsedRtcEventLogNew& parsed_stream, >- const webrtc::rtcp::CommonHeader& rtcp_block, >- uint64_t log_timestamp, >- webrtc::PacketDirection direction) { >- webrtc::rtcp::SenderReport sr; >- if (!sr.Parse(rtcp_block)) >- return; >- MediaType media_type = >- parsed_stream.GetMediaType(sr.sender_ssrc(), direction); >- if (ExcludePacket(direction, media_type, sr.sender_ssrc())) >- return; >- std::cout << log_timestamp << "\t" >- << "RTCP_SR" << StreamInfo(direction, media_type) >- << "\tssrc=" << sr.sender_ssrc() >- << "\ttimestamp=" << sr.rtp_timestamp() << std::endl; >-} >- >-void PrintReceiverReport(const webrtc::ParsedRtcEventLogNew& parsed_stream, >- const webrtc::rtcp::CommonHeader& rtcp_block, >- uint64_t log_timestamp, >- webrtc::PacketDirection direction) { >- webrtc::rtcp::ReceiverReport rr; >- if (!rr.Parse(rtcp_block)) >- 
return; >- MediaType media_type = >- parsed_stream.GetMediaType(rr.sender_ssrc(), direction); >- if (ExcludePacket(direction, media_type, rr.sender_ssrc())) >- return; >- std::cout << log_timestamp << "\t" >- << "RTCP_RR" << StreamInfo(direction, media_type) >- << "\tssrc=" << rr.sender_ssrc() << std::endl; >-} >- >-void PrintXr(const webrtc::ParsedRtcEventLogNew& parsed_stream, >- const webrtc::rtcp::CommonHeader& rtcp_block, >- uint64_t log_timestamp, >- webrtc::PacketDirection direction) { >- webrtc::rtcp::ExtendedReports xr; >- if (!xr.Parse(rtcp_block)) >- return; >- MediaType media_type = >- parsed_stream.GetMediaType(xr.sender_ssrc(), direction); >- if (ExcludePacket(direction, media_type, xr.sender_ssrc())) >- return; >- std::cout << log_timestamp << "\t" >- << "RTCP_XR" << StreamInfo(direction, media_type) >- << "\tssrc=" << xr.sender_ssrc() << std::endl; >-} >- >-void PrintSdes(const webrtc::rtcp::CommonHeader& rtcp_block, >- uint64_t log_timestamp, >- webrtc::PacketDirection direction) { >- std::cout << log_timestamp << "\t" >- << "RTCP_SDES" << StreamInfo(direction, MediaType::ANY) >- << std::endl; >- RTC_NOTREACHED() << "SDES should have been redacted when writing the log"; >-} >- >-void PrintBye(const webrtc::ParsedRtcEventLogNew& parsed_stream, >- const webrtc::rtcp::CommonHeader& rtcp_block, >- uint64_t log_timestamp, >- webrtc::PacketDirection direction) { >- webrtc::rtcp::Bye bye; >- if (!bye.Parse(rtcp_block)) >- return; >- MediaType media_type = >- parsed_stream.GetMediaType(bye.sender_ssrc(), direction); >- if (ExcludePacket(direction, media_type, bye.sender_ssrc())) >- return; >- std::cout << log_timestamp << "\t" >- << "RTCP_BYE" << StreamInfo(direction, media_type) >- << "\tssrc=" << bye.sender_ssrc() << std::endl; >-} >- >-void PrintRtpFeedback(const webrtc::ParsedRtcEventLogNew& parsed_stream, >- const webrtc::rtcp::CommonHeader& rtcp_block, >- uint64_t log_timestamp, >- webrtc::PacketDirection direction) { >- switch (rtcp_block.fmt()) { 
>- case webrtc::rtcp::Nack::kFeedbackMessageType: { >- webrtc::rtcp::Nack nack; >- if (!nack.Parse(rtcp_block)) >- return; >- MediaType media_type = >- parsed_stream.GetMediaType(nack.sender_ssrc(), direction); >- if (ExcludePacket(direction, media_type, nack.sender_ssrc())) >- return; >- std::cout << log_timestamp << "\t" >- << "RTCP_NACK" << StreamInfo(direction, media_type) >- << "\tssrc=" << nack.sender_ssrc() << std::endl; >- break; >- } >- case webrtc::rtcp::Tmmbr::kFeedbackMessageType: { >- webrtc::rtcp::Tmmbr tmmbr; >- if (!tmmbr.Parse(rtcp_block)) >- return; >- MediaType media_type = >- parsed_stream.GetMediaType(tmmbr.sender_ssrc(), direction); >- if (ExcludePacket(direction, media_type, tmmbr.sender_ssrc())) >- return; >- std::cout << log_timestamp << "\t" >- << "RTCP_TMMBR" << StreamInfo(direction, media_type) >- << "\tssrc=" << tmmbr.sender_ssrc() << std::endl; >- break; >- } >- case webrtc::rtcp::Tmmbn::kFeedbackMessageType: { >- webrtc::rtcp::Tmmbn tmmbn; >- if (!tmmbn.Parse(rtcp_block)) >- return; >- MediaType media_type = >- parsed_stream.GetMediaType(tmmbn.sender_ssrc(), direction); >- if (ExcludePacket(direction, media_type, tmmbn.sender_ssrc())) >- return; >- std::cout << log_timestamp << "\t" >- << "RTCP_TMMBN" << StreamInfo(direction, media_type) >- << "\tssrc=" << tmmbn.sender_ssrc() << std::endl; >- break; >- } >- case webrtc::rtcp::RapidResyncRequest::kFeedbackMessageType: { >- webrtc::rtcp::RapidResyncRequest sr_req; >- if (!sr_req.Parse(rtcp_block)) >- return; >- MediaType media_type = >- parsed_stream.GetMediaType(sr_req.sender_ssrc(), direction); >- if (ExcludePacket(direction, media_type, sr_req.sender_ssrc())) >- return; >- std::cout << log_timestamp << "\t" >- << "RTCP_SRREQ" << StreamInfo(direction, media_type) >- << "\tssrc=" << sr_req.sender_ssrc() << std::endl; >- break; >- } >- case webrtc::rtcp::TransportFeedback::kFeedbackMessageType: { >- webrtc::rtcp::TransportFeedback transport_feedback; >- if 
(!transport_feedback.Parse(rtcp_block)) >- return; >- MediaType media_type = parsed_stream.GetMediaType( >- transport_feedback.sender_ssrc(), direction); >- if (ExcludePacket(direction, media_type, >- transport_feedback.sender_ssrc())) >- return; >- std::cout << log_timestamp << "\t" >- << "RTCP_NEWFB" << StreamInfo(direction, media_type) >- << "\tsender_ssrc=" << transport_feedback.sender_ssrc() >- << "\tmedia_ssrc=" << transport_feedback.media_ssrc() >- << std::endl; >- break; >- } >- default: >- std::cout << log_timestamp << "\t" >- << "RTCP_RTPFB(UNKNOWN)" << std::endl; >- break; >- } >-} >- >-void PrintPsFeedback(const webrtc::ParsedRtcEventLogNew& parsed_stream, >- const webrtc::rtcp::CommonHeader& rtcp_block, >- uint64_t log_timestamp, >- webrtc::PacketDirection direction) { >- switch (rtcp_block.fmt()) { >- case webrtc::rtcp::Pli::kFeedbackMessageType: { >- webrtc::rtcp::Pli pli; >- if (!pli.Parse(rtcp_block)) >- return; >- MediaType media_type = >- parsed_stream.GetMediaType(pli.sender_ssrc(), direction); >- if (ExcludePacket(direction, media_type, pli.sender_ssrc())) >- return; >- std::cout << log_timestamp << "\t" >- << "RTCP_PLI" << StreamInfo(direction, media_type) >- << "\tssrc=" << pli.sender_ssrc() << std::endl; >- break; >- } >- case webrtc::rtcp::Fir::kFeedbackMessageType: { >- webrtc::rtcp::Fir fir; >- if (!fir.Parse(rtcp_block)) >- return; >- MediaType media_type = >- parsed_stream.GetMediaType(fir.sender_ssrc(), direction); >- if (ExcludePacket(direction, media_type, fir.sender_ssrc())) >- return; >- std::cout << log_timestamp << "\t" >- << "RTCP_FIR" << StreamInfo(direction, media_type) >- << "\tssrc=" << fir.sender_ssrc() << std::endl; >- break; >- } >- case webrtc::rtcp::Remb::kFeedbackMessageType: { >- webrtc::rtcp::Remb remb; >- if (!remb.Parse(rtcp_block)) >- return; >- MediaType media_type = >- parsed_stream.GetMediaType(remb.sender_ssrc(), direction); >- if (ExcludePacket(direction, media_type, remb.sender_ssrc())) >- return; >- 
std::cout << log_timestamp << "\t" >- << "RTCP_REMB" << StreamInfo(direction, media_type) >- << "\tssrc=" << remb.sender_ssrc() << std::endl; >- break; >- } >- default: >- std::cout << log_timestamp << "\t" >- << "RTCP_PSFB(UNKNOWN)" << std::endl; >- break; >- } >-} >- >-enum class InputSource { >- STDIN, >- FILE, >-}; >- >-void PrintUsageGuide(const std::string& program_name) { >- std::cout >- << "Tool for printing packet information from an RtcEventLog as text.\n" >- << "* Run " + program_name + " --help for usage.\n" >- << "* Example usage for parsing a file:\n" >- << " " << program_name + " input.rel\n" >- << "* Example usage for parsing the stdin:\n" >- << " " << program_name + "\n"; >-} >- >-// TODO(eladalon): Return a stream or file descriptor instead. >-InputSource ParseCommandLineFlags(int argc, char* argv[]) { >- if (rtc::FlagList::SetFlagsFromCommandLine(&argc, argv, true) != 0) { >- PrintUsageGuide(argv[0]); >- exit(-1); >- } >- >- if (FLAG_help) { >- PrintUsageGuide(argv[0]); >- std::cout << std::endl; >- rtc::FlagList::Print(nullptr, false); >- exit(0); >- } >- >- switch (argc) { >- case 1: >- return InputSource::STDIN; >- case 2: >- return InputSource::FILE; >- default: >- PrintUsageGuide(argv[0]); >- exit(-1); >- } >-} >- >-} // namespace >- >-// This utility will print basic information about each packet to stdout. >-// Note that parser will assert if the protobuf event is missing some required >-// fields and we attempt to access them. We don't handle this at the moment. 
>-int main(int argc, char* argv[]) { >- InputSource input_source = ParseCommandLineFlags(argc, argv); >- >- if (strlen(FLAG_ssrc) > 0) >- RTC_CHECK(ParseSsrc(FLAG_ssrc)) << "Flag verification has failed."; >- >- webrtc::RtpHeaderExtensionMap default_map = GetDefaultHeaderExtensionMap(); >- bool default_map_used = false; >- >- webrtc::ParsedRtcEventLogNew parsed_stream; >- >- switch (input_source) { >- case InputSource::STDIN: { >- if (!parsed_stream.ParseStream(std::cin)) { >- std::cerr << "Error while parsing input stream." << std::endl; >- return -1; >- } >- break; >- } >- case InputSource::FILE: { >- if (!parsed_stream.ParseFile(argv[1])) { >- std::cerr << "Error while parsing input file: " << argv[1] << std::endl; >- return -1; >- } >- break; >- } >- default: { RTC_NOTREACHED() << "Unsupported input source."; } >- } >- >- for (size_t i = 0; i < parsed_stream.GetNumberOfEvents(); i++) { >- bool event_recognized = false; >- switch (parsed_stream.GetEventType(i)) { >- case webrtc::ParsedRtcEventLogNew::EventType::UNKNOWN_EVENT: { >- if (FLAG_unknown) { >- std::cout << parsed_stream.GetTimestamp(i) << "\tUNKNOWN_EVENT" >- << std::endl; >- } >- event_recognized = true; >- break; >- } >- >- case webrtc::ParsedRtcEventLogNew::EventType::LOG_START: { >- if (FLAG_startstop) { >- std::cout << parsed_stream.GetTimestamp(i) << "\tLOG_START" >- << std::endl; >- } >- event_recognized = true; >- break; >- } >- >- case webrtc::ParsedRtcEventLogNew::EventType::LOG_END: { >- if (FLAG_startstop) { >- std::cout << parsed_stream.GetTimestamp(i) << "\tLOG_END" >- << std::endl; >- } >- event_recognized = true; >- break; >- } >- >- case webrtc::ParsedRtcEventLogNew::EventType::RTP_EVENT: { >- if (FLAG_rtp) { >- size_t header_length; >- size_t total_length; >- uint8_t header[IP_PACKET_SIZE]; >- webrtc::PacketDirection direction; >- const webrtc::RtpHeaderExtensionMap* extension_map = >- parsed_stream.GetRtpHeader(i, &direction, header, &header_length, >- &total_length, nullptr); >- >- 
if (extension_map == nullptr) { >- extension_map = &default_map; >- if (!default_map_used) >- RTC_LOG(LS_WARNING) << "Using default header extension map"; >- default_map_used = true; >- } >- >- // Parse header to get SSRC and RTP time. >- webrtc::RtpUtility::RtpHeaderParser rtp_parser(header, header_length); >- webrtc::RTPHeader parsed_header; >- rtp_parser.Parse(&parsed_header, extension_map); >- MediaType media_type = >- parsed_stream.GetMediaType(parsed_header.ssrc, direction); >- >- if (ExcludePacket(direction, media_type, parsed_header.ssrc)) { >- event_recognized = true; >- break; >- } >- >- std::cout << parsed_stream.GetTimestamp(i) << "\tRTP" >- << StreamInfo(direction, media_type) >- << "\tssrc=" << parsed_header.ssrc >- << "\ttimestamp=" << parsed_header.timestamp; >- if (parsed_header.extension.hasAbsoluteSendTime) { >- std::cout << "\tAbsSendTime=" >- << parsed_header.extension.absoluteSendTime; >- } >- if (parsed_header.extension.hasVideoContentType) { >- std::cout << "\tContentType=" >- << static_cast<int>( >- parsed_header.extension.videoContentType); >- } >- if (parsed_header.extension.hasVideoRotation) { >- std::cout << "\tRotation=" >- << static_cast<int>( >- parsed_header.extension.videoRotation); >- } >- if (parsed_header.extension.hasTransportSequenceNumber) { >- std::cout << "\tTransportSeq=" >- << parsed_header.extension.transportSequenceNumber; >- } >- if (parsed_header.extension.hasTransmissionTimeOffset) { >- std::cout << "\tTransmTimeOffset=" >- << parsed_header.extension.transmissionTimeOffset; >- } >- if (parsed_header.extension.hasAudioLevel) { >- std::cout << "\tAudioLevel=" >- << static_cast<int>(parsed_header.extension.audioLevel); >- } >- std::cout << std::endl; >- if (FLAG_print_full_packets) { >- // TODO(terelius): Rewrite this file to use printf instead of cout. 
>- std::cout << "\t\t" << std::hex; >- char prev_fill = std::cout.fill('0'); >- for (size_t i = 0; i < header_length; i++) { >- std::cout << std::setw(2) << static_cast<unsigned>(header[i]); >- if (i % 4 == 3) >- std::cout << " "; // Separator between 32-bit words. >- } >- std::cout.fill(prev_fill); >- std::cout << std::dec << std::endl; >- } >- } >- event_recognized = true; >- break; >- } >- >- case webrtc::ParsedRtcEventLogNew::EventType::RTCP_EVENT: { >- if (FLAG_rtcp) { >- size_t length; >- uint8_t packet[IP_PACKET_SIZE]; >- webrtc::PacketDirection direction; >- parsed_stream.GetRtcpPacket(i, &direction, packet, &length); >- >- webrtc::rtcp::CommonHeader rtcp_block; >- const uint8_t* packet_end = packet + length; >- for (const uint8_t* next_block = packet; next_block != packet_end; >- next_block = rtcp_block.NextPacket()) { >- ptrdiff_t remaining_blocks_size = packet_end - next_block; >- RTC_DCHECK_GT(remaining_blocks_size, 0); >- if (!rtcp_block.Parse(next_block, remaining_blocks_size)) { >- RTC_LOG(LS_WARNING) << "Failed to parse RTCP"; >- break; >- } >- >- uint64_t log_timestamp = parsed_stream.GetTimestamp(i); >- switch (rtcp_block.type()) { >- case webrtc::rtcp::SenderReport::kPacketType: >- PrintSenderReport(parsed_stream, rtcp_block, log_timestamp, >- direction); >- break; >- case webrtc::rtcp::ReceiverReport::kPacketType: >- PrintReceiverReport(parsed_stream, rtcp_block, log_timestamp, >- direction); >- break; >- case webrtc::rtcp::Sdes::kPacketType: >- PrintSdes(rtcp_block, log_timestamp, direction); >- break; >- case webrtc::rtcp::ExtendedReports::kPacketType: >- PrintXr(parsed_stream, rtcp_block, log_timestamp, direction); >- break; >- case webrtc::rtcp::Bye::kPacketType: >- PrintBye(parsed_stream, rtcp_block, log_timestamp, direction); >- break; >- case webrtc::rtcp::Rtpfb::kPacketType: >- PrintRtpFeedback(parsed_stream, rtcp_block, log_timestamp, >- direction); >- break; >- case webrtc::rtcp::Psfb::kPacketType: >- PrintPsFeedback(parsed_stream, 
rtcp_block, log_timestamp, >- direction); >- break; >- default: >- break; >- } >- if (FLAG_print_full_packets) { >- std::cout << "\t\t" << std::hex; >- char prev_fill = std::cout.fill('0'); >- for (const uint8_t* p = next_block; p < rtcp_block.NextPacket(); >- p++) { >- std::cout << std::setw(2) << static_cast<unsigned>(*p); >- ptrdiff_t chars_printed = p - next_block; >- if (chars_printed % 4 == 3) >- std::cout << " "; // Separator between 32-bit words. >- } >- std::cout.fill(prev_fill); >- std::cout << std::dec << std::endl; >- } >- } >- } >- event_recognized = true; >- break; >- } >- >- case webrtc::ParsedRtcEventLogNew::EventType::AUDIO_PLAYOUT_EVENT: { >- if (FLAG_playout) { >- auto audio_playout = parsed_stream.GetAudioPlayout(i); >- std::cout << audio_playout.log_time_us() << "\tAUDIO_PLAYOUT" >- << "\tssrc=" << audio_playout.ssrc << std::endl; >- } >- event_recognized = true; >- break; >- } >- >- case webrtc::ParsedRtcEventLogNew::EventType::LOSS_BASED_BWE_UPDATE: { >- if (FLAG_bwe) { >- auto bwe_update = parsed_stream.GetLossBasedBweUpdate(i); >- std::cout << bwe_update.log_time_us() << "\tBWE(LOSS_BASED)" >- << "\tbitrate_bps=" << bwe_update.bitrate_bps >- << "\tfraction_lost=" >- << static_cast<unsigned>(bwe_update.fraction_lost) >- << "\texpected_packets=" << bwe_update.expected_packets >- << std::endl; >- } >- event_recognized = true; >- break; >- } >- >- case webrtc::ParsedRtcEventLogNew::EventType::DELAY_BASED_BWE_UPDATE: { >- if (FLAG_bwe) { >- auto bwe_update = parsed_stream.GetDelayBasedBweUpdate(i); >- std::cout << bwe_update.log_time_us() << "\tBWE(DELAY_BASED)" >- << "\tbitrate_bps=" << bwe_update.bitrate_bps >- << "\tdetector_state=" >- << static_cast<int>(bwe_update.detector_state) << std::endl; >- } >- event_recognized = true; >- break; >- } >- >- case webrtc::ParsedRtcEventLogNew::EventType:: >- VIDEO_RECEIVER_CONFIG_EVENT: { >- if (FLAG_config && FLAG_video && FLAG_incoming) { >- webrtc::rtclog::StreamConfig config = >- 
parsed_stream.GetVideoReceiveConfig(i); >- std::cout << parsed_stream.GetTimestamp(i) << "\tVIDEO_RECV_CONFIG" >- << "\tssrc=" << config.remote_ssrc >- << "\tfeedback_ssrc=" << config.local_ssrc; >- std::cout << "\textensions={"; >- for (const auto& extension : config.rtp_extensions) { >- std::cout << extension.ToString() << ","; >- } >- std::cout << "}"; >- std::cout << "\tcodecs={"; >- for (const auto& codec : config.codecs) { >- std::cout << "{name: " << codec.payload_name >- << ", payload_type: " << codec.payload_type >- << ", rtx_payload_type: " << codec.rtx_payload_type >- << "}"; >- } >- std::cout << "}" << std::endl; >- } >- event_recognized = true; >- break; >- } >- >- case webrtc::ParsedRtcEventLogNew::EventType::VIDEO_SENDER_CONFIG_EVENT: { >- if (FLAG_config && FLAG_video && FLAG_outgoing) { >- std::vector<webrtc::rtclog::StreamConfig> configs = >- parsed_stream.GetVideoSendConfig(i); >- for (const auto& config : configs) { >- std::cout << parsed_stream.GetTimestamp(i) << "\tVIDEO_SEND_CONFIG"; >- std::cout << "\tssrcs=" << config.local_ssrc; >- std::cout << "\trtx_ssrcs=" << config.rtx_ssrc; >- std::cout << "\textensions={"; >- for (const auto& extension : config.rtp_extensions) { >- std::cout << extension.ToString() << ","; >- } >- std::cout << "}"; >- std::cout << "\tcodecs={"; >- for (const auto& codec : config.codecs) { >- std::cout << "{name: " << codec.payload_name >- << ", payload_type: " << codec.payload_type >- << ", rtx_payload_type: " << codec.rtx_payload_type >- << "}"; >- } >- std::cout << "}" << std::endl; >- } >- } >- event_recognized = true; >- break; >- } >- >- case webrtc::ParsedRtcEventLogNew::EventType:: >- AUDIO_RECEIVER_CONFIG_EVENT: { >- if (FLAG_config && FLAG_audio && FLAG_incoming) { >- webrtc::rtclog::StreamConfig config = >- parsed_stream.GetAudioReceiveConfig(i); >- std::cout << parsed_stream.GetTimestamp(i) << "\tAUDIO_RECV_CONFIG" >- << "\tssrc=" << config.remote_ssrc >- << "\tfeedback_ssrc=" << config.local_ssrc; >- 
std::cout << "\textensions={"; >- for (const auto& extension : config.rtp_extensions) { >- std::cout << extension.ToString() << ","; >- } >- std::cout << "}"; >- std::cout << "\tcodecs={"; >- for (const auto& codec : config.codecs) { >- std::cout << "{name: " << codec.payload_name >- << ", payload_type: " << codec.payload_type >- << ", rtx_payload_type: " << codec.rtx_payload_type >- << "}"; >- } >- std::cout << "}" << std::endl; >- } >- event_recognized = true; >- break; >- } >- >- case webrtc::ParsedRtcEventLogNew::EventType::AUDIO_SENDER_CONFIG_EVENT: { >- if (FLAG_config && FLAG_audio && FLAG_outgoing) { >- webrtc::rtclog::StreamConfig config = >- parsed_stream.GetAudioSendConfig(i); >- std::cout << parsed_stream.GetTimestamp(i) << "\tAUDIO_SEND_CONFIG" >- << "\tssrc=" << config.local_ssrc; >- std::cout << "\textensions={"; >- for (const auto& extension : config.rtp_extensions) { >- std::cout << extension.ToString() << ","; >- } >- std::cout << "}"; >- std::cout << "\tcodecs={"; >- for (const auto& codec : config.codecs) { >- std::cout << "{name: " << codec.payload_name >- << ", payload_type: " << codec.payload_type >- << ", rtx_payload_type: " << codec.rtx_payload_type >- << "}"; >- } >- std::cout << "}" << std::endl; >- } >- event_recognized = true; >- break; >- } >- >- case webrtc::ParsedRtcEventLogNew::EventType:: >- AUDIO_NETWORK_ADAPTATION_EVENT: { >- if (FLAG_ana) { >- auto ana_event = parsed_stream.GetAudioNetworkAdaptation(i); >- char buffer[300]; >- rtc::SimpleStringBuilder builder(buffer); >- builder << parsed_stream.GetTimestamp(i) << "\tANA_UPDATE"; >- if (ana_event.config.bitrate_bps) { >- builder << "\tbitrate_bps=" << *ana_event.config.bitrate_bps; >- } >- if (ana_event.config.frame_length_ms) { >- builder << "\tframe_length_ms=" >- << *ana_event.config.frame_length_ms; >- } >- if (ana_event.config.uplink_packet_loss_fraction) { >- builder << "\tuplink_packet_loss_fraction=" >- << *ana_event.config.uplink_packet_loss_fraction; >- } >- if 
(ana_event.config.enable_fec) { >- builder << "\tenable_fec=" << *ana_event.config.enable_fec; >- } >- if (ana_event.config.enable_dtx) { >- builder << "\tenable_dtx=" << *ana_event.config.enable_dtx; >- } >- if (ana_event.config.num_channels) { >- builder << "\tnum_channels=" << *ana_event.config.num_channels; >- } >- std::cout << builder.str() << std::endl; >- } >- event_recognized = true; >- break; >- } >- >- case webrtc::ParsedRtcEventLogNew::EventType:: >- BWE_PROBE_CLUSTER_CREATED_EVENT: { >- if (FLAG_probe) { >- auto probe_event = parsed_stream.GetBweProbeClusterCreated(i); >- std::cout << parsed_stream.GetTimestamp(i) << "\tPROBE_CREATED(" >- << probe_event.id << ")" >- << "\tbitrate_bps=" << probe_event.bitrate_bps >- << "\tmin_packets=" << probe_event.min_packets >- << "\tmin_bytes=" << probe_event.min_bytes << std::endl; >- } >- event_recognized = true; >- break; >- } >- >- case webrtc::ParsedRtcEventLogNew::EventType::BWE_PROBE_FAILURE_EVENT: { >- if (FLAG_probe) { >- webrtc::LoggedBweProbeFailureEvent probe_result = >- parsed_stream.GetBweProbeFailure(i); >- std::cout << parsed_stream.GetTimestamp(i) << "\tPROBE_FAILURE(" >- << probe_result.id << ")" >- << "\tfailure_reason=" >- << static_cast<int>(probe_result.failure_reason) >- << std::endl; >- } >- event_recognized = true; >- break; >- } >- >- case webrtc::ParsedRtcEventLogNew::EventType::BWE_PROBE_SUCCESS_EVENT: { >- if (FLAG_probe) { >- webrtc::LoggedBweProbeSuccessEvent probe_result = >- parsed_stream.GetBweProbeSuccess(i); >- std::cout << parsed_stream.GetTimestamp(i) << "\tPROBE_SUCCESS(" >- << probe_result.id << ")" >- << "\tbitrate_bps=" << probe_result.bitrate_bps >- << std::endl; >- } >- event_recognized = true; >- break; >- } >- >- case webrtc::ParsedRtcEventLogNew::EventType::ALR_STATE_EVENT: { >- if (FLAG_bwe) { >- webrtc::LoggedAlrStateEvent alr_state = parsed_stream.GetAlrState(i); >- std::cout << parsed_stream.GetTimestamp(i) << "\tALR_STATE" >- << "\tin_alr=" << alr_state.in_alr << 
std::endl; >- } >- event_recognized = true; >- break; >- } >- >- case webrtc::ParsedRtcEventLogNew::EventType::ICE_CANDIDATE_PAIR_CONFIG: { >- if (FLAG_ice) { >- webrtc::LoggedIceCandidatePairConfig ice_cp_config = >- parsed_stream.GetIceCandidatePairConfig(i); >- // TODO(qingsi): convert the numeric representation of states to text >- std::cout << parsed_stream.GetTimestamp(i) >- << "\tICE_CANDIDATE_PAIR_CONFIG" >- << "\ttype=" << static_cast<int>(ice_cp_config.type) >- << std::endl; >- } >- event_recognized = true; >- break; >- } >- >- case webrtc::ParsedRtcEventLogNew::EventType::ICE_CANDIDATE_PAIR_EVENT: { >- if (FLAG_ice) { >- webrtc::LoggedIceCandidatePairEvent ice_cp_event = >- parsed_stream.GetIceCandidatePairEvent(i); >- // TODO(qingsi): convert the numeric representation of states to text >- std::cout << parsed_stream.GetTimestamp(i) >- << "\tICE_CANDIDATE_PAIR_EVENT" >- << "\ttype=" << static_cast<int>(ice_cp_event.type) >- << std::endl; >- } >- event_recognized = true; >- break; >- } >- } >- >- if (!event_recognized) { >- std::cout << "Unrecognized event (" >- << static_cast<int>(parsed_stream.GetEventType(i)) << ")" >- << std::endl; >- } >- } >- return 0; >-} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_impl.cc >index 554a0283fd6f9737aa04e2e2a84f1d8888442146..c022a3d4185cf143b94e0e281bc6fe725cee3672 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_impl.cc >@@ -18,8 +18,9 @@ > #include <vector> > > #include "absl/memory/memory.h" >+#include "api/rtceventlogoutput.h" > #include "logging/rtc_event_log/encoder/rtc_event_log_encoder_legacy.h" >-#include "logging/rtc_event_log/output/rtc_event_log_output_file.h" >+#include "logging/rtc_event_log/encoder/rtc_event_log_encoder_new_format.h" > 
#include "rtc_base/checks.h" > #include "rtc_base/constructormagic.h" > #include "rtc_base/event.h" >@@ -64,7 +65,11 @@ std::unique_ptr<RtcEventLogEncoder> CreateEncoder( > RtcEventLog::EncodingType type) { > switch (type) { > case RtcEventLog::EncodingType::Legacy: >+ RTC_LOG(LS_INFO) << "Creating legacy encoder for RTC event log."; > return absl::make_unique<RtcEventLogEncoderLegacy>(); >+ case RtcEventLog::EncodingType::NewFormat: >+ RTC_LOG(LS_INFO) << "Creating new format encoder for RTC event log."; >+ return absl::make_unique<RtcEventLogEncoderNewFormat>(); > default: > RTC_LOG(LS_ERROR) << "Unknown RtcEventLog encoder type (" << int(type) > << ")"; >@@ -168,20 +173,19 @@ bool RtcEventLogImpl::StartLogging(std::unique_ptr<RtcEventLogOutput> output, > return false; > } > >- // TODO(terelius): The mapping between log timestamps and UTC should be stored >- // in the event_log START event. > const int64_t timestamp_us = rtc::TimeMicros(); > const int64_t utc_time_us = rtc::TimeUTCMicros(); > RTC_LOG(LS_INFO) << "Starting WebRTC event log. (Timestamp, UTC) = " > << "(" << timestamp_us << ", " << utc_time_us << ")."; > > // Binding to |this| is safe because |this| outlives the |task_queue_|. >- auto start = [this, timestamp_us](std::unique_ptr<RtcEventLogOutput> output) { >+ auto start = [this, timestamp_us, >+ utc_time_us](std::unique_ptr<RtcEventLogOutput> output) { > RTC_DCHECK_RUN_ON(task_queue_.get()); > RTC_DCHECK(output->IsActive()); > event_output_ = std::move(output); > num_config_events_written_ = 0; >- WriteToOutput(event_encoder_->EncodeLogStart(timestamp_us)); >+ WriteToOutput(event_encoder_->EncodeLogStart(timestamp_us, utc_time_us)); > LogEventsFromMemoryToOutput(); > }; > >@@ -197,7 +201,7 @@ void RtcEventLogImpl::StopLogging() { > > RTC_LOG(LS_INFO) << "Stopping WebRTC event log."; > >- rtc::Event output_stopped(true, false); >+ rtc::Event output_stopped; > > // Binding to |this| is safe because |this| outlives the |task_queue_|. 
> task_queue_->PostTask([this, &output_stopped]() { >@@ -372,8 +376,4 @@ std::unique_ptr<RtcEventLog> RtcEventLog::Create( > #endif // ENABLE_RTC_EVENT_LOG > } > >-std::unique_ptr<RtcEventLog> RtcEventLog::CreateNull() { >- return std::unique_ptr<RtcEventLog>(new RtcEventLogNullImpl()); >-} >- > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_parser.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_parser.cc >deleted file mode 100644 >index 802af189aad59270668f57c335d22c31d46b35ed..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_parser.cc >+++ /dev/null >@@ -1,851 +0,0 @@ >-/* >- * Copyright (c) 2016 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include "logging/rtc_event_log/rtc_event_log_parser.h" >- >-#include <stdint.h> >-#include <string.h> >- >-#include <algorithm> >-#include <fstream> >-#include <istream> >-#include <map> >-#include <utility> >- >-#include "logging/rtc_event_log/rtc_event_log.h" >-#include "modules/audio_coding/audio_network_adaptor/include/audio_network_adaptor.h" >-#include "modules/remote_bitrate_estimator/include/bwe_defines.h" >-#include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" >-#include "rtc_base/checks.h" >-#include "rtc_base/logging.h" >-#include "rtc_base/protobuf_utils.h" >- >-namespace webrtc { >- >-namespace { >-RtcpMode GetRuntimeRtcpMode(rtclog::VideoReceiveConfig::RtcpMode rtcp_mode) { >- switch (rtcp_mode) { >- case rtclog::VideoReceiveConfig::RTCP_COMPOUND: >- return RtcpMode::kCompound; >- case rtclog::VideoReceiveConfig::RTCP_REDUCEDSIZE: >- return RtcpMode::kReducedSize; >- } >- RTC_NOTREACHED(); >- return RtcpMode::kOff; >-} >- >-ParsedRtcEventLog::EventType GetRuntimeEventType( >- rtclog::Event::EventType event_type) { >- switch (event_type) { >- case rtclog::Event::UNKNOWN_EVENT: >- return ParsedRtcEventLog::EventType::UNKNOWN_EVENT; >- case rtclog::Event::LOG_START: >- return ParsedRtcEventLog::EventType::LOG_START; >- case rtclog::Event::LOG_END: >- return ParsedRtcEventLog::EventType::LOG_END; >- case rtclog::Event::RTP_EVENT: >- return ParsedRtcEventLog::EventType::RTP_EVENT; >- case rtclog::Event::RTCP_EVENT: >- return ParsedRtcEventLog::EventType::RTCP_EVENT; >- case rtclog::Event::AUDIO_PLAYOUT_EVENT: >- return ParsedRtcEventLog::EventType::AUDIO_PLAYOUT_EVENT; >- case rtclog::Event::LOSS_BASED_BWE_UPDATE: >- return ParsedRtcEventLog::EventType::LOSS_BASED_BWE_UPDATE; >- case rtclog::Event::DELAY_BASED_BWE_UPDATE: >- return ParsedRtcEventLog::EventType::DELAY_BASED_BWE_UPDATE; >- case rtclog::Event::VIDEO_RECEIVER_CONFIG_EVENT: >- return ParsedRtcEventLog::EventType::VIDEO_RECEIVER_CONFIG_EVENT; >- case 
rtclog::Event::VIDEO_SENDER_CONFIG_EVENT: >- return ParsedRtcEventLog::EventType::VIDEO_SENDER_CONFIG_EVENT; >- case rtclog::Event::AUDIO_RECEIVER_CONFIG_EVENT: >- return ParsedRtcEventLog::EventType::AUDIO_RECEIVER_CONFIG_EVENT; >- case rtclog::Event::AUDIO_SENDER_CONFIG_EVENT: >- return ParsedRtcEventLog::EventType::AUDIO_SENDER_CONFIG_EVENT; >- case rtclog::Event::AUDIO_NETWORK_ADAPTATION_EVENT: >- return ParsedRtcEventLog::EventType::AUDIO_NETWORK_ADAPTATION_EVENT; >- case rtclog::Event::BWE_PROBE_CLUSTER_CREATED_EVENT: >- return ParsedRtcEventLog::EventType::BWE_PROBE_CLUSTER_CREATED_EVENT; >- case rtclog::Event::BWE_PROBE_RESULT_EVENT: >- return ParsedRtcEventLog::EventType::BWE_PROBE_RESULT_EVENT; >- case rtclog::Event::ALR_STATE_EVENT: >- return ParsedRtcEventLog::EventType::ALR_STATE_EVENT; >- case rtclog::Event::ICE_CANDIDATE_PAIR_CONFIG: >- return ParsedRtcEventLog::EventType::ICE_CANDIDATE_PAIR_CONFIG; >- case rtclog::Event::ICE_CANDIDATE_PAIR_EVENT: >- return ParsedRtcEventLog::EventType::ICE_CANDIDATE_PAIR_EVENT; >- } >- return ParsedRtcEventLog::EventType::UNKNOWN_EVENT; >-} >- >-BandwidthUsage GetRuntimeDetectorState( >- rtclog::DelayBasedBweUpdate::DetectorState detector_state) { >- switch (detector_state) { >- case rtclog::DelayBasedBweUpdate::BWE_NORMAL: >- return BandwidthUsage::kBwNormal; >- case rtclog::DelayBasedBweUpdate::BWE_UNDERUSING: >- return BandwidthUsage::kBwUnderusing; >- case rtclog::DelayBasedBweUpdate::BWE_OVERUSING: >- return BandwidthUsage::kBwOverusing; >- } >- RTC_NOTREACHED(); >- return BandwidthUsage::kBwNormal; >-} >- >-IceCandidatePairConfigType GetRuntimeIceCandidatePairConfigType( >- rtclog::IceCandidatePairConfig::IceCandidatePairConfigType type) { >- switch (type) { >- case rtclog::IceCandidatePairConfig::ADDED: >- return IceCandidatePairConfigType::kAdded; >- case rtclog::IceCandidatePairConfig::UPDATED: >- return IceCandidatePairConfigType::kUpdated; >- case rtclog::IceCandidatePairConfig::DESTROYED: >- return 
IceCandidatePairConfigType::kDestroyed; >- case rtclog::IceCandidatePairConfig::SELECTED: >- return IceCandidatePairConfigType::kSelected; >- } >- RTC_NOTREACHED(); >- return IceCandidatePairConfigType::kAdded; >-} >- >-IceCandidateType GetRuntimeIceCandidateType( >- rtclog::IceCandidatePairConfig::IceCandidateType type) { >- switch (type) { >- case rtclog::IceCandidatePairConfig::LOCAL: >- return IceCandidateType::kLocal; >- case rtclog::IceCandidatePairConfig::STUN: >- return IceCandidateType::kStun; >- case rtclog::IceCandidatePairConfig::PRFLX: >- return IceCandidateType::kPrflx; >- case rtclog::IceCandidatePairConfig::RELAY: >- return IceCandidateType::kRelay; >- case rtclog::IceCandidatePairConfig::UNKNOWN_CANDIDATE_TYPE: >- return IceCandidateType::kUnknown; >- } >- RTC_NOTREACHED(); >- return IceCandidateType::kUnknown; >-} >- >-IceCandidatePairProtocol GetRuntimeIceCandidatePairProtocol( >- rtclog::IceCandidatePairConfig::Protocol protocol) { >- switch (protocol) { >- case rtclog::IceCandidatePairConfig::UDP: >- return IceCandidatePairProtocol::kUdp; >- case rtclog::IceCandidatePairConfig::TCP: >- return IceCandidatePairProtocol::kTcp; >- case rtclog::IceCandidatePairConfig::SSLTCP: >- return IceCandidatePairProtocol::kSsltcp; >- case rtclog::IceCandidatePairConfig::TLS: >- return IceCandidatePairProtocol::kTls; >- case rtclog::IceCandidatePairConfig::UNKNOWN_PROTOCOL: >- return IceCandidatePairProtocol::kUnknown; >- } >- RTC_NOTREACHED(); >- return IceCandidatePairProtocol::kUnknown; >-} >- >-IceCandidatePairAddressFamily GetRuntimeIceCandidatePairAddressFamily( >- rtclog::IceCandidatePairConfig::AddressFamily address_family) { >- switch (address_family) { >- case rtclog::IceCandidatePairConfig::IPV4: >- return IceCandidatePairAddressFamily::kIpv4; >- case rtclog::IceCandidatePairConfig::IPV6: >- return IceCandidatePairAddressFamily::kIpv6; >- case rtclog::IceCandidatePairConfig::UNKNOWN_ADDRESS_FAMILY: >- return IceCandidatePairAddressFamily::kUnknown; 
>- } >- RTC_NOTREACHED(); >- return IceCandidatePairAddressFamily::kUnknown; >-} >- >-IceCandidateNetworkType GetRuntimeIceCandidateNetworkType( >- rtclog::IceCandidatePairConfig::NetworkType network_type) { >- switch (network_type) { >- case rtclog::IceCandidatePairConfig::ETHERNET: >- return IceCandidateNetworkType::kEthernet; >- case rtclog::IceCandidatePairConfig::LOOPBACK: >- return IceCandidateNetworkType::kLoopback; >- case rtclog::IceCandidatePairConfig::WIFI: >- return IceCandidateNetworkType::kWifi; >- case rtclog::IceCandidatePairConfig::VPN: >- return IceCandidateNetworkType::kVpn; >- case rtclog::IceCandidatePairConfig::CELLULAR: >- return IceCandidateNetworkType::kCellular; >- case rtclog::IceCandidatePairConfig::UNKNOWN_NETWORK_TYPE: >- return IceCandidateNetworkType::kUnknown; >- } >- RTC_NOTREACHED(); >- return IceCandidateNetworkType::kUnknown; >-} >- >-IceCandidatePairEventType GetRuntimeIceCandidatePairEventType( >- rtclog::IceCandidatePairEvent::IceCandidatePairEventType type) { >- switch (type) { >- case rtclog::IceCandidatePairEvent::CHECK_SENT: >- return IceCandidatePairEventType::kCheckSent; >- case rtclog::IceCandidatePairEvent::CHECK_RECEIVED: >- return IceCandidatePairEventType::kCheckReceived; >- case rtclog::IceCandidatePairEvent::CHECK_RESPONSE_SENT: >- return IceCandidatePairEventType::kCheckResponseSent; >- case rtclog::IceCandidatePairEvent::CHECK_RESPONSE_RECEIVED: >- return IceCandidatePairEventType::kCheckResponseReceived; >- } >- RTC_NOTREACHED(); >- return IceCandidatePairEventType::kCheckSent; >-} >- >-std::pair<uint64_t, bool> ParseVarInt(std::istream& stream) { >- uint64_t varint = 0; >- for (size_t bytes_read = 0; bytes_read < 10; ++bytes_read) { >- // The most significant bit of each byte is 0 if it is the last byte in >- // the varint and 1 otherwise. Thus, we take the 7 least significant bits >- // of each byte and shift them 7 bits for each byte read previously to get >- // the (unsigned) integer. 
>- int byte = stream.get(); >- if (stream.eof()) { >- return std::make_pair(varint, false); >- } >- RTC_DCHECK_GE(byte, 0); >- RTC_DCHECK_LE(byte, 255); >- varint |= static_cast<uint64_t>(byte & 0x7F) << (7 * bytes_read); >- if ((byte & 0x80) == 0) { >- return std::make_pair(varint, true); >- } >- } >- return std::make_pair(varint, false); >-} >- >-void GetHeaderExtensions(std::vector<RtpExtension>* header_extensions, >- const RepeatedPtrField<rtclog::RtpHeaderExtension>& >- proto_header_extensions) { >- header_extensions->clear(); >- for (auto& p : proto_header_extensions) { >- RTC_CHECK(p.has_name()); >- RTC_CHECK(p.has_id()); >- const std::string& name = p.name(); >- int id = p.id(); >- header_extensions->push_back(RtpExtension(name, id)); >- } >-} >- >-} // namespace >- >-ParsedRtcEventLog::ParsedRtcEventLog() = default; >-ParsedRtcEventLog::~ParsedRtcEventLog() = default; >- >-ParsedRtcEventLog::BweProbeResultEvent::BweProbeResultEvent() = default; >-ParsedRtcEventLog::BweProbeResultEvent::BweProbeResultEvent( >- const BweProbeResultEvent&) = default; >- >-bool ParsedRtcEventLog::ParseFile(const std::string& filename) { >- std::ifstream file(filename, std::ios_base::in | std::ios_base::binary); >- if (!file.good() || !file.is_open()) { >- RTC_LOG(LS_WARNING) << "Could not open file for reading."; >- return false; >- } >- >- return ParseStream(file); >-} >- >-bool ParsedRtcEventLog::ParseString(const std::string& s) { >- std::istringstream stream(s, std::ios_base::in | std::ios_base::binary); >- return ParseStream(stream); >-} >- >-bool ParsedRtcEventLog::ParseStream(std::istream& stream) { >- events_.clear(); >- const size_t kMaxEventSize = (1u << 16) - 1; >- std::vector<char> tmp_buffer(kMaxEventSize); >- uint64_t tag; >- uint64_t message_length; >- bool success; >- >- RTC_DCHECK(stream.good()); >- >- while (1) { >- // Check whether we have reached end of file. 
>- stream.peek();
>- if (stream.eof()) {
>- // Process all extensions maps for faster look-up later.
>- for (auto& event_stream : streams_) {
>- rtp_extensions_maps_[StreamId(event_stream.ssrc,
>- event_stream.direction)] =
>- &event_stream.rtp_extensions_map;
>- }
>- return true;
>- }
>-
>- // Read the next message tag. The tag number is defined as
>- // (fieldnumber << 3) | wire_type. In our case, the field number is
>- // supposed to be 1 and the wire type for a
>- // length-delimited field is 2.
>- const uint64_t kExpectedTag = (1 << 3) | 2;
>- std::tie(tag, success) = ParseVarInt(stream);
>- if (!success) {
>- RTC_LOG(LS_WARNING)
>- << "Missing field tag from beginning of protobuf event.";
>- return false;
>- } else if (tag != kExpectedTag) {
>- RTC_LOG(LS_WARNING)
>- << "Unexpected field tag at beginning of protobuf event.";
>- return false;
>- }
>-
>- // Read the length field.
>- std::tie(message_length, success) = ParseVarInt(stream);
>- if (!success) {
>- RTC_LOG(LS_WARNING) << "Missing message length after protobuf field tag.";
>- return false;
>- } else if (message_length > kMaxEventSize) {
>- RTC_LOG(LS_WARNING) << "Protobuf message length is too large.";
>- return false;
>- }
>-
>- // Read the next protobuf event to a temporary char buffer.
>- stream.read(tmp_buffer.data(), message_length);
>- if (stream.gcount() != static_cast<int>(message_length)) {
>- RTC_LOG(LS_WARNING) << "Failed to read protobuf message from file.";
>- return false;
>- }
>-
>- // Parse the protobuf event from the buffer.
>- rtclog::Event event; >- if (!event.ParseFromArray(tmp_buffer.data(), message_length)) { >- RTC_LOG(LS_WARNING) << "Failed to parse protobuf message."; >- return false; >- } >- >- EventType type = GetRuntimeEventType(event.type()); >- switch (type) { >- case VIDEO_RECEIVER_CONFIG_EVENT: { >- rtclog::StreamConfig config = GetVideoReceiveConfig(event); >- streams_.emplace_back(config.remote_ssrc, MediaType::VIDEO, >- kIncomingPacket, >- RtpHeaderExtensionMap(config.rtp_extensions)); >- streams_.emplace_back(config.local_ssrc, MediaType::VIDEO, >- kOutgoingPacket, >- RtpHeaderExtensionMap(config.rtp_extensions)); >- break; >- } >- case VIDEO_SENDER_CONFIG_EVENT: { >- std::vector<rtclog::StreamConfig> configs = GetVideoSendConfig(event); >- for (size_t i = 0; i < configs.size(); i++) { >- streams_.emplace_back( >- configs[i].local_ssrc, MediaType::VIDEO, kOutgoingPacket, >- RtpHeaderExtensionMap(configs[i].rtp_extensions)); >- >- streams_.emplace_back( >- configs[i].rtx_ssrc, MediaType::VIDEO, kOutgoingPacket, >- RtpHeaderExtensionMap(configs[i].rtp_extensions)); >- } >- break; >- } >- case AUDIO_RECEIVER_CONFIG_EVENT: { >- rtclog::StreamConfig config = GetAudioReceiveConfig(event); >- streams_.emplace_back(config.remote_ssrc, MediaType::AUDIO, >- kIncomingPacket, >- RtpHeaderExtensionMap(config.rtp_extensions)); >- streams_.emplace_back(config.local_ssrc, MediaType::AUDIO, >- kOutgoingPacket, >- RtpHeaderExtensionMap(config.rtp_extensions)); >- break; >- } >- case AUDIO_SENDER_CONFIG_EVENT: { >- rtclog::StreamConfig config = GetAudioSendConfig(event); >- streams_.emplace_back(config.local_ssrc, MediaType::AUDIO, >- kOutgoingPacket, >- RtpHeaderExtensionMap(config.rtp_extensions)); >- break; >- } >- default: >- break; >- } >- >- events_.push_back(event); >- } >-} >- >-size_t ParsedRtcEventLog::GetNumberOfEvents() const { >- return events_.size(); >-} >- >-int64_t ParsedRtcEventLog::GetTimestamp(size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- 
const rtclog::Event& event = events_[index]; >- RTC_CHECK(event.has_timestamp_us()); >- return event.timestamp_us(); >-} >- >-ParsedRtcEventLog::EventType ParsedRtcEventLog::GetEventType( >- size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& event = events_[index]; >- RTC_CHECK(event.has_type()); >- return GetRuntimeEventType(event.type()); >-} >- >-// The header must have space for at least IP_PACKET_SIZE bytes. >-webrtc::RtpHeaderExtensionMap* ParsedRtcEventLog::GetRtpHeader( >- size_t index, >- PacketDirection* incoming, >- uint8_t* header, >- size_t* header_length, >- size_t* total_length, >- int* probe_cluster_id) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& event = events_[index]; >- RTC_CHECK(event.has_type()); >- RTC_CHECK_EQ(event.type(), rtclog::Event::RTP_EVENT); >- RTC_CHECK(event.has_rtp_packet()); >- const rtclog::RtpPacket& rtp_packet = event.rtp_packet(); >- // Get direction of packet. >- RTC_CHECK(rtp_packet.has_incoming()); >- if (incoming != nullptr) { >- *incoming = rtp_packet.incoming() ? kIncomingPacket : kOutgoingPacket; >- } >- // Get packet length. >- RTC_CHECK(rtp_packet.has_packet_length()); >- if (total_length != nullptr) { >- *total_length = rtp_packet.packet_length(); >- } >- // Get header length. >- RTC_CHECK(rtp_packet.has_header()); >- if (header_length != nullptr) { >- *header_length = rtp_packet.header().size(); >- } >- // Get header contents. >- if (header != nullptr) { >- const size_t kMinRtpHeaderSize = 12; >- RTC_CHECK_GE(rtp_packet.header().size(), kMinRtpHeaderSize); >- RTC_CHECK_LE(rtp_packet.header().size(), >- static_cast<size_t>(IP_PACKET_SIZE)); >- memcpy(header, rtp_packet.header().data(), rtp_packet.header().size()); >- uint32_t ssrc = ByteReader<uint32_t>::ReadBigEndian(header + 8); >- StreamId stream_id( >- ssrc, rtp_packet.incoming() ? 
kIncomingPacket : kOutgoingPacket); >- auto it = rtp_extensions_maps_.find(stream_id); >- if (it != rtp_extensions_maps_.end()) { >- return it->second; >- } >- } >- if (probe_cluster_id != nullptr) { >- if (rtp_packet.has_probe_cluster_id()) { >- *probe_cluster_id = rtp_packet.probe_cluster_id(); >- RTC_CHECK_NE(*probe_cluster_id, PacedPacketInfo::kNotAProbe); >- } else { >- *probe_cluster_id = PacedPacketInfo::kNotAProbe; >- } >- } >- return nullptr; >-} >- >-// The packet must have space for at least IP_PACKET_SIZE bytes. >-void ParsedRtcEventLog::GetRtcpPacket(size_t index, >- PacketDirection* incoming, >- uint8_t* packet, >- size_t* length) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& event = events_[index]; >- RTC_CHECK(event.has_type()); >- RTC_CHECK_EQ(event.type(), rtclog::Event::RTCP_EVENT); >- RTC_CHECK(event.has_rtcp_packet()); >- const rtclog::RtcpPacket& rtcp_packet = event.rtcp_packet(); >- // Get direction of packet. >- RTC_CHECK(rtcp_packet.has_incoming()); >- if (incoming != nullptr) { >- *incoming = rtcp_packet.incoming() ? kIncomingPacket : kOutgoingPacket; >- } >- // Get packet length. >- RTC_CHECK(rtcp_packet.has_packet_data()); >- if (length != nullptr) { >- *length = rtcp_packet.packet_data().size(); >- } >- // Get packet contents. 
>- if (packet != nullptr) { >- RTC_CHECK_LE(rtcp_packet.packet_data().size(), >- static_cast<unsigned>(IP_PACKET_SIZE)); >- memcpy(packet, rtcp_packet.packet_data().data(), >- rtcp_packet.packet_data().size()); >- } >-} >- >-rtclog::StreamConfig ParsedRtcEventLog::GetVideoReceiveConfig( >- size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- return GetVideoReceiveConfig(events_[index]); >-} >- >-rtclog::StreamConfig ParsedRtcEventLog::GetVideoReceiveConfig( >- const rtclog::Event& event) const { >- rtclog::StreamConfig config; >- RTC_CHECK(event.has_type()); >- RTC_CHECK_EQ(event.type(), rtclog::Event::VIDEO_RECEIVER_CONFIG_EVENT); >- RTC_CHECK(event.has_video_receiver_config()); >- const rtclog::VideoReceiveConfig& receiver_config = >- event.video_receiver_config(); >- // Get SSRCs. >- RTC_CHECK(receiver_config.has_remote_ssrc()); >- config.remote_ssrc = receiver_config.remote_ssrc(); >- RTC_CHECK(receiver_config.has_local_ssrc()); >- config.local_ssrc = receiver_config.local_ssrc(); >- config.rtx_ssrc = 0; >- // Get RTCP settings. >- RTC_CHECK(receiver_config.has_rtcp_mode()); >- config.rtcp_mode = GetRuntimeRtcpMode(receiver_config.rtcp_mode()); >- RTC_CHECK(receiver_config.has_remb()); >- config.remb = receiver_config.remb(); >- >- // Get RTX map. >- std::map<uint32_t, const rtclog::RtxConfig> rtx_map; >- for (int i = 0; i < receiver_config.rtx_map_size(); i++) { >- const rtclog::RtxMap& map = receiver_config.rtx_map(i); >- RTC_CHECK(map.has_payload_type()); >- RTC_CHECK(map.has_config()); >- RTC_CHECK(map.config().has_rtx_ssrc()); >- RTC_CHECK(map.config().has_rtx_payload_type()); >- rtx_map.insert(std::make_pair(map.payload_type(), map.config())); >- } >- >- // Get header extensions. >- GetHeaderExtensions(&config.rtp_extensions, >- receiver_config.header_extensions()); >- // Get decoders. 
>- config.codecs.clear(); >- for (int i = 0; i < receiver_config.decoders_size(); i++) { >- RTC_CHECK(receiver_config.decoders(i).has_name()); >- RTC_CHECK(receiver_config.decoders(i).has_payload_type()); >- int rtx_payload_type = 0; >- auto rtx_it = rtx_map.find(receiver_config.decoders(i).payload_type()); >- if (rtx_it != rtx_map.end()) { >- rtx_payload_type = rtx_it->second.rtx_payload_type(); >- if (config.rtx_ssrc != 0 && >- config.rtx_ssrc != rtx_it->second.rtx_ssrc()) { >- RTC_LOG(LS_WARNING) >- << "RtcEventLog protobuf contained different SSRCs for " >- "different received RTX payload types. Will only use " >- "rtx_ssrc = " >- << config.rtx_ssrc << "."; >- } else { >- config.rtx_ssrc = rtx_it->second.rtx_ssrc(); >- } >- } >- config.codecs.emplace_back(receiver_config.decoders(i).name(), >- receiver_config.decoders(i).payload_type(), >- rtx_payload_type); >- } >- return config; >-} >- >-std::vector<rtclog::StreamConfig> ParsedRtcEventLog::GetVideoSendConfig( >- size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- return GetVideoSendConfig(events_[index]); >-} >- >-std::vector<rtclog::StreamConfig> ParsedRtcEventLog::GetVideoSendConfig( >- const rtclog::Event& event) const { >- std::vector<rtclog::StreamConfig> configs; >- RTC_CHECK(event.has_type()); >- RTC_CHECK_EQ(event.type(), rtclog::Event::VIDEO_SENDER_CONFIG_EVENT); >- RTC_CHECK(event.has_video_sender_config()); >- const rtclog::VideoSendConfig& sender_config = event.video_sender_config(); >- if (sender_config.rtx_ssrcs_size() > 0 && >- sender_config.ssrcs_size() != sender_config.rtx_ssrcs_size()) { >- RTC_LOG(WARNING) >- << "VideoSendConfig is configured for RTX but the number of " >- "SSRCs doesn't match the number of RTX SSRCs."; >- } >- configs.resize(sender_config.ssrcs_size()); >- for (int i = 0; i < sender_config.ssrcs_size(); i++) { >- // Get SSRCs. 
>- configs[i].local_ssrc = sender_config.ssrcs(i); >- if (sender_config.rtx_ssrcs_size() > 0 && >- i < sender_config.rtx_ssrcs_size()) { >- RTC_CHECK(sender_config.has_rtx_payload_type()); >- configs[i].rtx_ssrc = sender_config.rtx_ssrcs(i); >- } >- // Get header extensions. >- GetHeaderExtensions(&configs[i].rtp_extensions, >- sender_config.header_extensions()); >- >- // Get the codec. >- RTC_CHECK(sender_config.has_encoder()); >- RTC_CHECK(sender_config.encoder().has_name()); >- RTC_CHECK(sender_config.encoder().has_payload_type()); >- configs[i].codecs.emplace_back( >- sender_config.encoder().name(), sender_config.encoder().payload_type(), >- sender_config.has_rtx_payload_type() ? sender_config.rtx_payload_type() >- : 0); >- } >- return configs; >-} >- >-rtclog::StreamConfig ParsedRtcEventLog::GetAudioReceiveConfig( >- size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- return GetAudioReceiveConfig(events_[index]); >-} >- >-rtclog::StreamConfig ParsedRtcEventLog::GetAudioReceiveConfig( >- const rtclog::Event& event) const { >- rtclog::StreamConfig config; >- RTC_CHECK(event.has_type()); >- RTC_CHECK_EQ(event.type(), rtclog::Event::AUDIO_RECEIVER_CONFIG_EVENT); >- RTC_CHECK(event.has_audio_receiver_config()); >- const rtclog::AudioReceiveConfig& receiver_config = >- event.audio_receiver_config(); >- // Get SSRCs. >- RTC_CHECK(receiver_config.has_remote_ssrc()); >- config.remote_ssrc = receiver_config.remote_ssrc(); >- RTC_CHECK(receiver_config.has_local_ssrc()); >- config.local_ssrc = receiver_config.local_ssrc(); >- // Get header extensions. 
>- GetHeaderExtensions(&config.rtp_extensions, >- receiver_config.header_extensions()); >- return config; >-} >- >-rtclog::StreamConfig ParsedRtcEventLog::GetAudioSendConfig(size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- return GetAudioSendConfig(events_[index]); >-} >- >-rtclog::StreamConfig ParsedRtcEventLog::GetAudioSendConfig( >- const rtclog::Event& event) const { >- rtclog::StreamConfig config; >- RTC_CHECK(event.has_type()); >- RTC_CHECK_EQ(event.type(), rtclog::Event::AUDIO_SENDER_CONFIG_EVENT); >- RTC_CHECK(event.has_audio_sender_config()); >- const rtclog::AudioSendConfig& sender_config = event.audio_sender_config(); >- // Get SSRCs. >- RTC_CHECK(sender_config.has_ssrc()); >- config.local_ssrc = sender_config.ssrc(); >- // Get header extensions. >- GetHeaderExtensions(&config.rtp_extensions, >- sender_config.header_extensions()); >- return config; >-} >- >-void ParsedRtcEventLog::GetAudioPlayout(size_t index, uint32_t* ssrc) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& event = events_[index]; >- RTC_CHECK(event.has_type()); >- RTC_CHECK_EQ(event.type(), rtclog::Event::AUDIO_PLAYOUT_EVENT); >- RTC_CHECK(event.has_audio_playout_event()); >- const rtclog::AudioPlayoutEvent& loss_event = event.audio_playout_event(); >- RTC_CHECK(loss_event.has_local_ssrc()); >- if (ssrc != nullptr) { >- *ssrc = loss_event.local_ssrc(); >- } >-} >- >-void ParsedRtcEventLog::GetLossBasedBweUpdate(size_t index, >- int32_t* bitrate_bps, >- uint8_t* fraction_loss, >- int32_t* total_packets) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& event = events_[index]; >- RTC_CHECK(event.has_type()); >- RTC_CHECK_EQ(event.type(), rtclog::Event::LOSS_BASED_BWE_UPDATE); >- RTC_CHECK(event.has_loss_based_bwe_update()); >- const rtclog::LossBasedBweUpdate& loss_event = event.loss_based_bwe_update(); >- RTC_CHECK(loss_event.has_bitrate_bps()); >- if (bitrate_bps != nullptr) { >- *bitrate_bps = 
loss_event.bitrate_bps(); >- } >- RTC_CHECK(loss_event.has_fraction_loss()); >- if (fraction_loss != nullptr) { >- *fraction_loss = loss_event.fraction_loss(); >- } >- RTC_CHECK(loss_event.has_total_packets()); >- if (total_packets != nullptr) { >- *total_packets = loss_event.total_packets(); >- } >-} >- >-ParsedRtcEventLog::BweDelayBasedUpdate >-ParsedRtcEventLog::GetDelayBasedBweUpdate(size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& event = events_[index]; >- RTC_CHECK(event.has_type()); >- RTC_CHECK_EQ(event.type(), rtclog::Event::DELAY_BASED_BWE_UPDATE); >- RTC_CHECK(event.has_delay_based_bwe_update()); >- const rtclog::DelayBasedBweUpdate& delay_event = >- event.delay_based_bwe_update(); >- >- BweDelayBasedUpdate res; >- res.timestamp = GetTimestamp(index); >- RTC_CHECK(delay_event.has_bitrate_bps()); >- res.bitrate_bps = delay_event.bitrate_bps(); >- RTC_CHECK(delay_event.has_detector_state()); >- res.detector_state = GetRuntimeDetectorState(delay_event.detector_state()); >- return res; >-} >- >-void ParsedRtcEventLog::GetAudioNetworkAdaptation( >- size_t index, >- AudioEncoderRuntimeConfig* config) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& event = events_[index]; >- RTC_CHECK(event.has_type()); >- RTC_CHECK_EQ(event.type(), rtclog::Event::AUDIO_NETWORK_ADAPTATION_EVENT); >- RTC_CHECK(event.has_audio_network_adaptation()); >- const rtclog::AudioNetworkAdaptation& ana_event = >- event.audio_network_adaptation(); >- if (ana_event.has_bitrate_bps()) >- config->bitrate_bps = ana_event.bitrate_bps(); >- if (ana_event.has_enable_fec()) >- config->enable_fec = ana_event.enable_fec(); >- if (ana_event.has_enable_dtx()) >- config->enable_dtx = ana_event.enable_dtx(); >- if (ana_event.has_frame_length_ms()) >- config->frame_length_ms = ana_event.frame_length_ms(); >- if (ana_event.has_num_channels()) >- config->num_channels = ana_event.num_channels(); >- if 
(ana_event.has_uplink_packet_loss_fraction()) >- config->uplink_packet_loss_fraction = >- ana_event.uplink_packet_loss_fraction(); >-} >- >-ParsedRtcEventLog::BweProbeClusterCreatedEvent >-ParsedRtcEventLog::GetBweProbeClusterCreated(size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& event = events_[index]; >- RTC_CHECK(event.has_type()); >- RTC_CHECK_EQ(event.type(), rtclog::Event::BWE_PROBE_CLUSTER_CREATED_EVENT); >- RTC_CHECK(event.has_probe_cluster()); >- const rtclog::BweProbeCluster& pcc_event = event.probe_cluster(); >- BweProbeClusterCreatedEvent res; >- res.timestamp = GetTimestamp(index); >- RTC_CHECK(pcc_event.has_id()); >- res.id = pcc_event.id(); >- RTC_CHECK(pcc_event.has_bitrate_bps()); >- res.bitrate_bps = pcc_event.bitrate_bps(); >- RTC_CHECK(pcc_event.has_min_packets()); >- res.min_packets = pcc_event.min_packets(); >- RTC_CHECK(pcc_event.has_min_bytes()); >- res.min_bytes = pcc_event.min_bytes(); >- return res; >-} >- >-ParsedRtcEventLog::BweProbeResultEvent ParsedRtcEventLog::GetBweProbeResult( >- size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& event = events_[index]; >- RTC_CHECK(event.has_type()); >- RTC_CHECK_EQ(event.type(), rtclog::Event::BWE_PROBE_RESULT_EVENT); >- RTC_CHECK(event.has_probe_result()); >- const rtclog::BweProbeResult& pr_event = event.probe_result(); >- BweProbeResultEvent res; >- res.timestamp = GetTimestamp(index); >- RTC_CHECK(pr_event.has_id()); >- res.id = pr_event.id(); >- >- RTC_CHECK(pr_event.has_result()); >- if (pr_event.result() == rtclog::BweProbeResult::SUCCESS) { >- RTC_CHECK(pr_event.has_bitrate_bps()); >- res.bitrate_bps = pr_event.bitrate_bps(); >- } else if (pr_event.result() == >- rtclog::BweProbeResult::INVALID_SEND_RECEIVE_INTERVAL) { >- res.failure_reason = ProbeFailureReason::kInvalidSendReceiveInterval; >- } else if (pr_event.result() == >- rtclog::BweProbeResult::INVALID_SEND_RECEIVE_RATIO) { >- res.failure_reason = 
ProbeFailureReason::kInvalidSendReceiveRatio; >- } else if (pr_event.result() == rtclog::BweProbeResult::TIMEOUT) { >- res.failure_reason = ProbeFailureReason::kTimeout; >- } else { >- RTC_NOTREACHED(); >- } >- >- return res; >-} >- >-ParsedRtcEventLog::AlrStateEvent ParsedRtcEventLog::GetAlrState( >- size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& event = events_[index]; >- RTC_CHECK(event.has_type()); >- RTC_CHECK_EQ(event.type(), rtclog::Event::ALR_STATE_EVENT); >- RTC_CHECK(event.has_alr_state()); >- const rtclog::AlrState& alr_event = event.alr_state(); >- AlrStateEvent res; >- res.timestamp = GetTimestamp(index); >- RTC_CHECK(alr_event.has_in_alr()); >- res.in_alr = alr_event.in_alr(); >- >- return res; >-} >- >-ParsedRtcEventLog::IceCandidatePairConfig >-ParsedRtcEventLog::GetIceCandidatePairConfig(size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& rtc_event = events_[index]; >- RTC_CHECK(rtc_event.has_type()); >- RTC_CHECK_EQ(rtc_event.type(), rtclog::Event::ICE_CANDIDATE_PAIR_CONFIG); >- IceCandidatePairConfig res; >- const rtclog::IceCandidatePairConfig& config = >- rtc_event.ice_candidate_pair_config(); >- res.timestamp = GetTimestamp(index); >- RTC_CHECK(config.has_config_type()); >- res.type = GetRuntimeIceCandidatePairConfigType(config.config_type()); >- RTC_CHECK(config.has_candidate_pair_id()); >- res.candidate_pair_id = config.candidate_pair_id(); >- RTC_CHECK(config.has_local_candidate_type()); >- res.local_candidate_type = >- GetRuntimeIceCandidateType(config.local_candidate_type()); >- RTC_CHECK(config.has_local_relay_protocol()); >- res.local_relay_protocol = >- GetRuntimeIceCandidatePairProtocol(config.local_relay_protocol()); >- RTC_CHECK(config.has_local_network_type()); >- res.local_network_type = >- GetRuntimeIceCandidateNetworkType(config.local_network_type()); >- RTC_CHECK(config.has_local_address_family()); >- res.local_address_family = >- 
GetRuntimeIceCandidatePairAddressFamily(config.local_address_family()); >- RTC_CHECK(config.has_remote_candidate_type()); >- res.remote_candidate_type = >- GetRuntimeIceCandidateType(config.remote_candidate_type()); >- RTC_CHECK(config.has_remote_address_family()); >- res.remote_address_family = >- GetRuntimeIceCandidatePairAddressFamily(config.remote_address_family()); >- RTC_CHECK(config.has_candidate_pair_protocol()); >- res.candidate_pair_protocol = >- GetRuntimeIceCandidatePairProtocol(config.candidate_pair_protocol()); >- return res; >-} >- >-ParsedRtcEventLog::IceCandidatePairEvent >-ParsedRtcEventLog::GetIceCandidatePairEvent(size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& rtc_event = events_[index]; >- RTC_CHECK(rtc_event.has_type()); >- RTC_CHECK_EQ(rtc_event.type(), rtclog::Event::ICE_CANDIDATE_PAIR_EVENT); >- IceCandidatePairEvent res; >- const rtclog::IceCandidatePairEvent& event = >- rtc_event.ice_candidate_pair_event(); >- res.timestamp = GetTimestamp(index); >- RTC_CHECK(event.has_event_type()); >- res.type = GetRuntimeIceCandidatePairEventType(event.event_type()); >- RTC_CHECK(event.has_candidate_pair_id()); >- res.candidate_pair_id = event.candidate_pair_id(); >- return res; >-} >- >-// Returns the MediaType for registered SSRCs. Search from the end to use last >-// registered types first. 
>-ParsedRtcEventLog::MediaType ParsedRtcEventLog::GetMediaType( >- uint32_t ssrc, >- PacketDirection direction) const { >- for (auto rit = streams_.rbegin(); rit != streams_.rend(); ++rit) { >- if (rit->ssrc == ssrc && rit->direction == direction) >- return rit->media_type; >- } >- return MediaType::ANY; >-} >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_parser.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_parser.h >deleted file mode 100644 >index c93ec6da7c361ababadc5f2634410f028acb8b68..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_parser.h >+++ /dev/null >@@ -1,259 +0,0 @@ >-/* >- * Copyright (c) 2016 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >-#ifndef LOGGING_RTC_EVENT_LOG_RTC_EVENT_LOG_PARSER_H_ >-#define LOGGING_RTC_EVENT_LOG_RTC_EVENT_LOG_PARSER_H_ >- >-#include <map> >-#include <sstream> // no-presubmit-check TODO(webrtc:8982) >-#include <string> >-#include <utility> // pair >-#include <vector> >- >-#include "call/video_receive_stream.h" >-#include "call/video_send_stream.h" >-#include "logging/rtc_event_log/events/rtc_event_ice_candidate_pair.h" >-#include "logging/rtc_event_log/events/rtc_event_ice_candidate_pair_config.h" >-#include "logging/rtc_event_log/events/rtc_event_probe_result_failure.h" >-#include "logging/rtc_event_log/rtc_event_log.h" >-#include "logging/rtc_event_log/rtc_stream_config.h" >-#include "modules/rtp_rtcp/include/rtp_header_extension_map.h" >-#include "modules/rtp_rtcp/source/byte_io.h" >-#include "rtc_base/ignore_wundef.h" >- >-// Files generated at build-time by the protobuf compiler. >-RTC_PUSH_IGNORING_WUNDEF() >-#ifdef WEBRTC_ANDROID_PLATFORM_BUILD >-#include "external/webrtc/webrtc/logging/rtc_event_log/rtc_event_log.pb.h" >-#else >-#include "logging/rtc_event_log/rtc_event_log.pb.h" >-#endif >-RTC_POP_IGNORING_WUNDEF() >- >-namespace webrtc { >- >-enum class BandwidthUsage; >-enum class MediaType; >- >-struct AudioEncoderRuntimeConfig; >- >-class ParsedRtcEventLog { >- friend class RtcEventLogTestHelper; >- >- public: >- ParsedRtcEventLog(); >- ~ParsedRtcEventLog(); >- >- struct BweProbeClusterCreatedEvent { >- uint64_t timestamp; >- uint32_t id; >- uint64_t bitrate_bps; >- uint32_t min_packets; >- uint32_t min_bytes; >- }; >- >- struct BweProbeResultEvent { >- BweProbeResultEvent(); >- BweProbeResultEvent(const BweProbeResultEvent&); >- >- uint64_t timestamp; >- uint32_t id; >- absl::optional<uint64_t> bitrate_bps; >- absl::optional<ProbeFailureReason> failure_reason; >- }; >- >- struct BweDelayBasedUpdate { >- uint64_t timestamp; >- int32_t bitrate_bps; >- BandwidthUsage detector_state; >- }; >- >- struct AlrStateEvent { >- uint64_t timestamp; >- bool 
in_alr; >- }; >- >- struct IceCandidatePairConfig { >- uint64_t timestamp; >- IceCandidatePairConfigType type; >- uint32_t candidate_pair_id; >- IceCandidateType local_candidate_type; >- IceCandidatePairProtocol local_relay_protocol; >- IceCandidateNetworkType local_network_type; >- IceCandidatePairAddressFamily local_address_family; >- IceCandidateType remote_candidate_type; >- IceCandidatePairAddressFamily remote_address_family; >- IceCandidatePairProtocol candidate_pair_protocol; >- }; >- >- struct IceCandidatePairEvent { >- uint64_t timestamp; >- IceCandidatePairEventType type; >- uint32_t candidate_pair_id; >- }; >- >- enum EventType { >- UNKNOWN_EVENT = 0, >- LOG_START = 1, >- LOG_END = 2, >- RTP_EVENT = 3, >- RTCP_EVENT = 4, >- AUDIO_PLAYOUT_EVENT = 5, >- LOSS_BASED_BWE_UPDATE = 6, >- DELAY_BASED_BWE_UPDATE = 7, >- VIDEO_RECEIVER_CONFIG_EVENT = 8, >- VIDEO_SENDER_CONFIG_EVENT = 9, >- AUDIO_RECEIVER_CONFIG_EVENT = 10, >- AUDIO_SENDER_CONFIG_EVENT = 11, >- AUDIO_NETWORK_ADAPTATION_EVENT = 16, >- BWE_PROBE_CLUSTER_CREATED_EVENT = 17, >- BWE_PROBE_RESULT_EVENT = 18, >- ALR_STATE_EVENT = 19, >- ICE_CANDIDATE_PAIR_CONFIG = 20, >- ICE_CANDIDATE_PAIR_EVENT = 21, >- }; >- >- enum class MediaType { ANY, AUDIO, VIDEO, DATA }; >- >- // Reads an RtcEventLog file and returns true if parsing was successful. >- bool ParseFile(const std::string& file_name); >- >- // Reads an RtcEventLog from a string and returns true if successful. >- bool ParseString(const std::string& s); >- >- // Reads an RtcEventLog from an istream and returns true if successful. >- bool ParseStream(std::istream& stream); >- >- // Returns the number of events in an EventStream. >- size_t GetNumberOfEvents() const; >- >- // Reads the arrival timestamp (in microseconds) from a rtclog::Event. >- int64_t GetTimestamp(size_t index) const; >- >- // Reads the event type of the rtclog::Event at |index|. 
>- EventType GetEventType(size_t index) const;
>-
>- // Reads the header, direction, header length and packet length from the RTP
>- // event at |index|, and stores the values in the corresponding output
>- // parameters. Each output parameter can be set to nullptr if that value
>- // isn't needed.
>- // NB: The header must have space for at least IP_PACKET_SIZE bytes.
>- // Returns: a pointer to a header extensions map acquired from parsing
>- // corresponding Audio/Video Sender/Receiver config events.
>- // Warning: if the same SSRC is reused by both video and audio streams during
>- // a call, extensions maps may be incorrect (the last one would be returned).
>- webrtc::RtpHeaderExtensionMap* GetRtpHeader(size_t index,
>- PacketDirection* incoming,
>- uint8_t* header,
>- size_t* header_length,
>- size_t* total_length,
>- int* probe_cluster_id) const;
>-
>- // Reads packet, direction and packet length from the RTCP event at |index|,
>- // and stores the values in the corresponding output parameters.
>- // Each output parameter can be set to nullptr if that value isn't needed.
>- // NB: The packet must have space for at least IP_PACKET_SIZE bytes.
>- void GetRtcpPacket(size_t index,
>- PacketDirection* incoming,
>- uint8_t* packet,
>- size_t* length) const;
>-
>- // Reads a video receive config event to a StreamConfig struct.
>- // Only the fields that are stored in the protobuf will be written.
>- rtclog::StreamConfig GetVideoReceiveConfig(size_t index) const;
>-
>- // Reads a video send config event to a StreamConfig struct. If the proto
>- // contains multiple SSRCs and RTX SSRCs (this used to be the case for
>- // simulcast streams) then we return one StreamConfig per SSRC,RTX_SSRC pair.
>- // Only the fields that are stored in the protobuf will be written.
>- std::vector<rtclog::StreamConfig> GetVideoSendConfig(size_t index) const;
>-
>- // Reads an audio receive config event to a StreamConfig struct.
>- // Only the fields that are stored in the protobuf will be written.
>- rtclog::StreamConfig GetAudioReceiveConfig(size_t index) const;
>-
>- // Reads an audio send config event to a StreamConfig struct.
>- // Only the fields that are stored in the protobuf will be written.
>- rtclog::StreamConfig GetAudioSendConfig(size_t index) const;
>-
>- // Reads the SSRC from the audio playout event at |index|. The SSRC is stored
>- // in the output parameter ssrc. The output parameter can be set to nullptr
>- // and in that case the function only asserts that the event is well formed.
>- void GetAudioPlayout(size_t index, uint32_t* ssrc) const;
>-
>- // Reads bitrate, fraction loss (as defined in RFC 1889) and total number of
>- // expected packets from the loss based BWE event at |index| and stores the
>- // values in the corresponding output parameters. Each output parameter can
>- // be set to nullptr if that value isn't needed.
>- void GetLossBasedBweUpdate(size_t index,
>- int32_t* bitrate_bps,
>- uint8_t* fraction_loss,
>- int32_t* total_packets) const;
>-
>- // Reads bitrate and detector_state from the delay based BWE event at |index|
>- // and stores the values in the corresponding output parameters. Each output
>- // parameter can be set to nullptr if that value isn't needed.
>- BweDelayBasedUpdate GetDelayBasedBweUpdate(size_t index) const;
>-
>- // Reads an audio network adaptation event to a (non-NULL)
>- // AudioEncoderRuntimeConfig struct. Only the fields that are
>- // stored in the protobuf will be written.
>- void GetAudioNetworkAdaptation(size_t index,
>- AudioEncoderRuntimeConfig* config) const;
>-
>- BweProbeClusterCreatedEvent GetBweProbeClusterCreated(size_t index) const;
>-
>- BweProbeResultEvent GetBweProbeResult(size_t index) const;
>-
>- MediaType GetMediaType(uint32_t ssrc, PacketDirection direction) const;
>-
>- AlrStateEvent GetAlrState(size_t index) const;
>-
>- IceCandidatePairConfig GetIceCandidatePairConfig(size_t index) const;
>- IceCandidatePairEvent GetIceCandidatePairEvent(size_t index) const;
>-
>- private:
>- rtclog::StreamConfig GetVideoReceiveConfig(const rtclog::Event& event) const;
>- std::vector<rtclog::StreamConfig> GetVideoSendConfig(
>- const rtclog::Event& event) const;
>- rtclog::StreamConfig GetAudioReceiveConfig(const rtclog::Event& event) const;
>- rtclog::StreamConfig GetAudioSendConfig(const rtclog::Event& event) const;
>-
>- std::vector<rtclog::Event> events_;
>-
>- struct Stream {
>- Stream(uint32_t ssrc,
>- MediaType media_type,
>- webrtc::PacketDirection direction,
>- webrtc::RtpHeaderExtensionMap map)
>- : ssrc(ssrc),
>- media_type(media_type),
>- direction(direction),
>- rtp_extensions_map(map) {}
>- uint32_t ssrc;
>- MediaType media_type;
>- webrtc::PacketDirection direction;
>- webrtc::RtpHeaderExtensionMap rtp_extensions_map;
>- };
>-
>- // All configured streams found in the event log.
>- std::vector<Stream> streams_;
>-
>- // Used to look up the configured extensions map for a given stream, which
>- // is needed to parse a header.
>- typedef std::pair<uint32_t, webrtc::PacketDirection> StreamId; >- std::map<StreamId, webrtc::RtpHeaderExtensionMap*> rtp_extensions_maps_; >-}; >- >-} // namespace webrtc >- >-#endif // LOGGING_RTC_EVENT_LOG_RTC_EVENT_LOG_PARSER_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_parser_new.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_parser_new.cc >index 5fa69ed637f65eac09129b46f82b8595f4138457..7278ea18b1d33c45c3d8e77e00e12bfcee98e399 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_parser_new.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_parser_new.cc >@@ -24,21 +24,30 @@ > #include "absl/types/optional.h" > #include "api/rtp_headers.h" > #include "api/rtpparameters.h" >+#include "logging/rtc_event_log/encoder/blob_encoding.h" >+#include "logging/rtc_event_log/encoder/delta_encoding.h" >+#include "logging/rtc_event_log/encoder/rtc_event_log_encoder_common.h" > #include "logging/rtc_event_log/rtc_event_log.h" > #include "modules/audio_coding/audio_network_adaptor/include/audio_network_adaptor.h" >-#include "modules/congestion_controller/transport_feedback_adapter.h" >+#include "modules/congestion_controller/rtp/transport_feedback_adapter.h" > #include "modules/remote_bitrate_estimator/include/bwe_defines.h" >+#include "modules/rtp_rtcp/include/rtp_cvo.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > #include "modules/rtp_rtcp/source/byte_io.h" > #include "modules/rtp_rtcp/source/rtp_header_extensions.h" > #include "modules/rtp_rtcp/source/rtp_utility.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" >+#include "rtc_base/numerics/safe_conversions.h" > #include "rtc_base/protobuf_utils.h" > >+using webrtc_event_logging::ToSigned; >+using webrtc_event_logging::ToUnsigned; >+ > namespace webrtc { > > namespace { >+// Conversion functions for legacy wire format. 
> RtcpMode GetRuntimeRtcpMode(rtclog::VideoReceiveConfig::RtcpMode rtcp_mode) { > switch (rtcp_mode) { > case rtclog::VideoReceiveConfig::RTCP_COMPOUND: >@@ -50,52 +59,6 @@ RtcpMode GetRuntimeRtcpMode(rtclog::VideoReceiveConfig::RtcpMode rtcp_mode) { > return RtcpMode::kOff; > } > >-ParsedRtcEventLogNew::EventType GetRuntimeEventType( >- rtclog::Event::EventType event_type) { >- switch (event_type) { >- case rtclog::Event::UNKNOWN_EVENT: >- return ParsedRtcEventLogNew::EventType::UNKNOWN_EVENT; >- case rtclog::Event::LOG_START: >- return ParsedRtcEventLogNew::EventType::LOG_START; >- case rtclog::Event::LOG_END: >- return ParsedRtcEventLogNew::EventType::LOG_END; >- case rtclog::Event::RTP_EVENT: >- return ParsedRtcEventLogNew::EventType::RTP_EVENT; >- case rtclog::Event::RTCP_EVENT: >- return ParsedRtcEventLogNew::EventType::RTCP_EVENT; >- case rtclog::Event::AUDIO_PLAYOUT_EVENT: >- return ParsedRtcEventLogNew::EventType::AUDIO_PLAYOUT_EVENT; >- case rtclog::Event::LOSS_BASED_BWE_UPDATE: >- return ParsedRtcEventLogNew::EventType::LOSS_BASED_BWE_UPDATE; >- case rtclog::Event::DELAY_BASED_BWE_UPDATE: >- return ParsedRtcEventLogNew::EventType::DELAY_BASED_BWE_UPDATE; >- case rtclog::Event::VIDEO_RECEIVER_CONFIG_EVENT: >- return ParsedRtcEventLogNew::EventType::VIDEO_RECEIVER_CONFIG_EVENT; >- case rtclog::Event::VIDEO_SENDER_CONFIG_EVENT: >- return ParsedRtcEventLogNew::EventType::VIDEO_SENDER_CONFIG_EVENT; >- case rtclog::Event::AUDIO_RECEIVER_CONFIG_EVENT: >- return ParsedRtcEventLogNew::EventType::AUDIO_RECEIVER_CONFIG_EVENT; >- case rtclog::Event::AUDIO_SENDER_CONFIG_EVENT: >- return ParsedRtcEventLogNew::EventType::AUDIO_SENDER_CONFIG_EVENT; >- case rtclog::Event::AUDIO_NETWORK_ADAPTATION_EVENT: >- return ParsedRtcEventLogNew::EventType::AUDIO_NETWORK_ADAPTATION_EVENT; >- case rtclog::Event::BWE_PROBE_CLUSTER_CREATED_EVENT: >- return ParsedRtcEventLogNew::EventType::BWE_PROBE_CLUSTER_CREATED_EVENT; >- case rtclog::Event::BWE_PROBE_RESULT_EVENT: >- // Probe 
successes and failures are currently stored in the same proto >- // message, we are moving towards separate messages. Probe results >- // therefore need special treatment in the parser. >- return ParsedRtcEventLogNew::EventType::UNKNOWN_EVENT; >- case rtclog::Event::ALR_STATE_EVENT: >- return ParsedRtcEventLogNew::EventType::ALR_STATE_EVENT; >- case rtclog::Event::ICE_CANDIDATE_PAIR_CONFIG: >- return ParsedRtcEventLogNew::EventType::ICE_CANDIDATE_PAIR_CONFIG; >- case rtclog::Event::ICE_CANDIDATE_PAIR_EVENT: >- return ParsedRtcEventLogNew::EventType::ICE_CANDIDATE_PAIR_EVENT; >- } >- return ParsedRtcEventLogNew::EventType::UNKNOWN_EVENT; >-} >- > BandwidthUsage GetRuntimeDetectorState( > rtclog::DelayBasedBweUpdate::DetectorState detector_state) { > switch (detector_state) { >@@ -212,6 +175,175 @@ IceCandidatePairEventType GetRuntimeIceCandidatePairEventType( > return IceCandidatePairEventType::kCheckSent; > } > >+// Conversion functions for version 2 of the wire format. >+BandwidthUsage GetRuntimeDetectorState( >+ rtclog2::DelayBasedBweUpdates::DetectorState detector_state) { >+ switch (detector_state) { >+ case rtclog2::DelayBasedBweUpdates::BWE_NORMAL: >+ return BandwidthUsage::kBwNormal; >+ case rtclog2::DelayBasedBweUpdates::BWE_UNDERUSING: >+ return BandwidthUsage::kBwUnderusing; >+ case rtclog2::DelayBasedBweUpdates::BWE_OVERUSING: >+ return BandwidthUsage::kBwOverusing; >+ case rtclog2::DelayBasedBweUpdates::BWE_UNKNOWN_STATE: >+ break; >+ } >+ RTC_NOTREACHED(); >+ return BandwidthUsage::kBwNormal; >+} >+ >+ProbeFailureReason GetRuntimeProbeFailureReason( >+ rtclog2::BweProbeResultFailure::FailureReason failure) { >+ switch (failure) { >+ case rtclog2::BweProbeResultFailure::INVALID_SEND_RECEIVE_INTERVAL: >+ return ProbeFailureReason::kInvalidSendReceiveInterval; >+ case rtclog2::BweProbeResultFailure::INVALID_SEND_RECEIVE_RATIO: >+ return ProbeFailureReason::kInvalidSendReceiveRatio; >+ case rtclog2::BweProbeResultFailure::TIMEOUT: >+ return 
ProbeFailureReason::kTimeout; >+ case rtclog2::BweProbeResultFailure::UNKNOWN: >+ break; >+ } >+ RTC_NOTREACHED(); >+ return ProbeFailureReason::kTimeout; >+} >+ >+IceCandidatePairConfigType GetRuntimeIceCandidatePairConfigType( >+ rtclog2::IceCandidatePairConfig::IceCandidatePairConfigType type) { >+ switch (type) { >+ case rtclog2::IceCandidatePairConfig::ADDED: >+ return IceCandidatePairConfigType::kAdded; >+ case rtclog2::IceCandidatePairConfig::UPDATED: >+ return IceCandidatePairConfigType::kUpdated; >+ case rtclog2::IceCandidatePairConfig::DESTROYED: >+ return IceCandidatePairConfigType::kDestroyed; >+ case rtclog2::IceCandidatePairConfig::SELECTED: >+ return IceCandidatePairConfigType::kSelected; >+ case rtclog2::IceCandidatePairConfig::UNKNOWN_CONFIG_TYPE: >+ break; >+ } >+ RTC_NOTREACHED(); >+ return IceCandidatePairConfigType::kAdded; >+} >+ >+IceCandidateType GetRuntimeIceCandidateType( >+ rtclog2::IceCandidatePairConfig::IceCandidateType type) { >+ switch (type) { >+ case rtclog2::IceCandidatePairConfig::LOCAL: >+ return IceCandidateType::kLocal; >+ case rtclog2::IceCandidatePairConfig::STUN: >+ return IceCandidateType::kStun; >+ case rtclog2::IceCandidatePairConfig::PRFLX: >+ return IceCandidateType::kPrflx; >+ case rtclog2::IceCandidatePairConfig::RELAY: >+ return IceCandidateType::kRelay; >+ case rtclog2::IceCandidatePairConfig::UNKNOWN_CANDIDATE_TYPE: >+ return IceCandidateType::kUnknown; >+ } >+ RTC_NOTREACHED(); >+ return IceCandidateType::kUnknown; >+} >+ >+IceCandidatePairProtocol GetRuntimeIceCandidatePairProtocol( >+ rtclog2::IceCandidatePairConfig::Protocol protocol) { >+ switch (protocol) { >+ case rtclog2::IceCandidatePairConfig::UDP: >+ return IceCandidatePairProtocol::kUdp; >+ case rtclog2::IceCandidatePairConfig::TCP: >+ return IceCandidatePairProtocol::kTcp; >+ case rtclog2::IceCandidatePairConfig::SSLTCP: >+ return IceCandidatePairProtocol::kSsltcp; >+ case rtclog2::IceCandidatePairConfig::TLS: >+ return IceCandidatePairProtocol::kTls; 
>+ case rtclog2::IceCandidatePairConfig::UNKNOWN_PROTOCOL: >+ return IceCandidatePairProtocol::kUnknown; >+ } >+ RTC_NOTREACHED(); >+ return IceCandidatePairProtocol::kUnknown; >+} >+ >+IceCandidatePairAddressFamily GetRuntimeIceCandidatePairAddressFamily( >+ rtclog2::IceCandidatePairConfig::AddressFamily address_family) { >+ switch (address_family) { >+ case rtclog2::IceCandidatePairConfig::IPV4: >+ return IceCandidatePairAddressFamily::kIpv4; >+ case rtclog2::IceCandidatePairConfig::IPV6: >+ return IceCandidatePairAddressFamily::kIpv6; >+ case rtclog2::IceCandidatePairConfig::UNKNOWN_ADDRESS_FAMILY: >+ return IceCandidatePairAddressFamily::kUnknown; >+ } >+ RTC_NOTREACHED(); >+ return IceCandidatePairAddressFamily::kUnknown; >+} >+ >+IceCandidateNetworkType GetRuntimeIceCandidateNetworkType( >+ rtclog2::IceCandidatePairConfig::NetworkType network_type) { >+ switch (network_type) { >+ case rtclog2::IceCandidatePairConfig::ETHERNET: >+ return IceCandidateNetworkType::kEthernet; >+ case rtclog2::IceCandidatePairConfig::LOOPBACK: >+ return IceCandidateNetworkType::kLoopback; >+ case rtclog2::IceCandidatePairConfig::WIFI: >+ return IceCandidateNetworkType::kWifi; >+ case rtclog2::IceCandidatePairConfig::VPN: >+ return IceCandidateNetworkType::kVpn; >+ case rtclog2::IceCandidatePairConfig::CELLULAR: >+ return IceCandidateNetworkType::kCellular; >+ case rtclog2::IceCandidatePairConfig::UNKNOWN_NETWORK_TYPE: >+ return IceCandidateNetworkType::kUnknown; >+ } >+ RTC_NOTREACHED(); >+ return IceCandidateNetworkType::kUnknown; >+} >+ >+IceCandidatePairEventType GetRuntimeIceCandidatePairEventType( >+ rtclog2::IceCandidatePairEvent::IceCandidatePairEventType type) { >+ switch (type) { >+ case rtclog2::IceCandidatePairEvent::CHECK_SENT: >+ return IceCandidatePairEventType::kCheckSent; >+ case rtclog2::IceCandidatePairEvent::CHECK_RECEIVED: >+ return IceCandidatePairEventType::kCheckReceived; >+ case rtclog2::IceCandidatePairEvent::CHECK_RESPONSE_SENT: >+ return 
IceCandidatePairEventType::kCheckResponseSent; >+ case rtclog2::IceCandidatePairEvent::CHECK_RESPONSE_RECEIVED: >+ return IceCandidatePairEventType::kCheckResponseReceived; >+ case rtclog2::IceCandidatePairEvent::UNKNOWN_CHECK_TYPE: >+ break; >+ } >+ RTC_NOTREACHED(); >+ return IceCandidatePairEventType::kCheckSent; >+} >+ >+std::vector<RtpExtension> GetRuntimeRtpHeaderExtensionConfig( >+ const rtclog2::RtpHeaderExtensionConfig& proto_header_extensions) { >+ std::vector<RtpExtension> rtp_extensions; >+ if (proto_header_extensions.has_transmission_time_offset_id()) { >+ rtp_extensions.emplace_back( >+ RtpExtension::kTimestampOffsetUri, >+ proto_header_extensions.transmission_time_offset_id()); >+ } >+ if (proto_header_extensions.has_absolute_send_time_id()) { >+ rtp_extensions.emplace_back( >+ RtpExtension::kAbsSendTimeUri, >+ proto_header_extensions.absolute_send_time_id()); >+ } >+ if (proto_header_extensions.has_transport_sequence_number_id()) { >+ rtp_extensions.emplace_back( >+ RtpExtension::kTransportSequenceNumberUri, >+ proto_header_extensions.transport_sequence_number_id()); >+ } >+ if (proto_header_extensions.has_audio_level_id()) { >+ rtp_extensions.emplace_back(RtpExtension::kAudioLevelUri, >+ proto_header_extensions.audio_level_id()); >+ } >+ if (proto_header_extensions.has_video_rotation_id()) { >+ rtp_extensions.emplace_back(RtpExtension::kVideoRotationUri, >+ proto_header_extensions.video_rotation_id()); >+ } >+ return rtp_extensions; >+} >+// End of conversion functions. >+ > // Reads a VarInt from |stream| and returns it. Also writes the read bytes to > // |buffer| starting |bytes_written| bytes into the buffer. |bytes_written| is > // incremented for each written byte. 
>@@ -271,15 +403,322 @@ void SortPacketFeedbackVectorWithLoss(std::vector<PacketFeedback>* vec) { > std::sort(vec->begin(), vec->end(), LossHandlingPacketFeedbackComparator()); > } > >+template <typename ProtoType, typename LoggedType> >+void StoreRtpPackets( >+ const ProtoType& proto, >+ std::map<uint32_t, std::vector<LoggedType>>* rtp_packets_map) { >+ RTC_CHECK(proto.has_timestamp_ms()); >+ RTC_CHECK(proto.has_marker()); >+ RTC_CHECK(proto.has_payload_type()); >+ RTC_CHECK(proto.has_sequence_number()); >+ RTC_CHECK(proto.has_rtp_timestamp()); >+ RTC_CHECK(proto.has_ssrc()); >+ RTC_CHECK(proto.has_payload_size()); >+ RTC_CHECK(proto.has_header_size()); >+ RTC_CHECK(proto.has_padding_size()); >+ >+ // Base event >+ { >+ RTPHeader header; >+ header.markerBit = rtc::checked_cast<bool>(proto.marker()); >+ header.payloadType = rtc::checked_cast<uint8_t>(proto.payload_type()); >+ header.sequenceNumber = >+ rtc::checked_cast<uint16_t>(proto.sequence_number()); >+ header.timestamp = rtc::checked_cast<uint32_t>(proto.rtp_timestamp()); >+ header.ssrc = rtc::checked_cast<uint32_t>(proto.ssrc()); >+ header.numCSRCs = 0; // TODO(terelius): Implement CSRC. >+ header.paddingLength = rtc::checked_cast<size_t>(proto.padding_size()); >+ header.headerLength = rtc::checked_cast<size_t>(proto.header_size()); >+ // TODO(terelius): Should we implement payload_type_frequency? 
>+ if (proto.has_transport_sequence_number()) { >+ header.extension.hasTransportSequenceNumber = true; >+ header.extension.transportSequenceNumber = >+ rtc::checked_cast<uint16_t>(proto.transport_sequence_number()); >+ } >+ if (proto.has_transmission_time_offset()) { >+ header.extension.hasTransmissionTimeOffset = true; >+ header.extension.transmissionTimeOffset = >+ rtc::checked_cast<int32_t>(proto.transmission_time_offset()); >+ } >+ if (proto.has_absolute_send_time()) { >+ header.extension.hasAbsoluteSendTime = true; >+ header.extension.absoluteSendTime = >+ rtc::checked_cast<uint32_t>(proto.absolute_send_time()); >+ } >+ if (proto.has_video_rotation()) { >+ header.extension.hasVideoRotation = true; >+ header.extension.videoRotation = ConvertCVOByteToVideoRotation( >+ rtc::checked_cast<uint8_t>(proto.video_rotation())); >+ } >+ if (proto.has_audio_level()) { >+ RTC_CHECK(proto.has_voice_activity()); >+ header.extension.hasAudioLevel = true; >+ header.extension.voiceActivity = >+ rtc::checked_cast<bool>(proto.voice_activity()); >+ const uint8_t audio_level = >+ rtc::checked_cast<uint8_t>(proto.audio_level()); >+ RTC_CHECK_LE(audio_level, 0x7Fu); >+ header.extension.audioLevel = audio_level; >+ } else { >+ RTC_CHECK(!proto.has_voice_activity()); >+ } >+ (*rtp_packets_map)[header.ssrc].emplace_back( >+ proto.timestamp_ms() * 1000, header, proto.header_size(), >+ proto.payload_size() + header.headerLength + header.paddingLength); >+ } >+ >+ const size_t number_of_deltas = >+ proto.has_number_of_deltas() ? 
proto.number_of_deltas() : 0u; >+ if (number_of_deltas == 0) { >+ return; >+ } >+ >+ // timestamp_ms (event) >+ std::vector<absl::optional<uint64_t>> timestamp_ms_values = >+ DecodeDeltas(proto.timestamp_ms_deltas(), >+ ToUnsigned(proto.timestamp_ms()), number_of_deltas); >+ RTC_CHECK_EQ(timestamp_ms_values.size(), number_of_deltas); >+ >+ // marker (RTP base) >+ std::vector<absl::optional<uint64_t>> marker_values = >+ DecodeDeltas(proto.marker_deltas(), proto.marker(), number_of_deltas); >+ RTC_CHECK_EQ(marker_values.size(), number_of_deltas); >+ >+ // payload_type (RTP base) >+ std::vector<absl::optional<uint64_t>> payload_type_values = DecodeDeltas( >+ proto.payload_type_deltas(), proto.payload_type(), number_of_deltas); >+ RTC_CHECK_EQ(payload_type_values.size(), number_of_deltas); >+ >+ // sequence_number (RTP base) >+ std::vector<absl::optional<uint64_t>> sequence_number_values = >+ DecodeDeltas(proto.sequence_number_deltas(), proto.sequence_number(), >+ number_of_deltas); >+ RTC_CHECK_EQ(sequence_number_values.size(), number_of_deltas); >+ >+ // rtp_timestamp (RTP base) >+ std::vector<absl::optional<uint64_t>> rtp_timestamp_values = DecodeDeltas( >+ proto.rtp_timestamp_deltas(), proto.rtp_timestamp(), number_of_deltas); >+ RTC_CHECK_EQ(rtp_timestamp_values.size(), number_of_deltas); >+ >+ // ssrc (RTP base) >+ std::vector<absl::optional<uint64_t>> ssrc_values = >+ DecodeDeltas(proto.ssrc_deltas(), proto.ssrc(), number_of_deltas); >+ RTC_CHECK_EQ(ssrc_values.size(), number_of_deltas); >+ >+ // payload_size (RTP base) >+ std::vector<absl::optional<uint64_t>> payload_size_values = DecodeDeltas( >+ proto.payload_size_deltas(), proto.payload_size(), number_of_deltas); >+ RTC_CHECK_EQ(payload_size_values.size(), number_of_deltas); >+ >+ // header_size (RTP base) >+ std::vector<absl::optional<uint64_t>> header_size_values = DecodeDeltas( >+ proto.header_size_deltas(), proto.header_size(), number_of_deltas); >+ RTC_CHECK_EQ(header_size_values.size(), 
number_of_deltas); >+ >+ // padding_size (RTP base) >+ std::vector<absl::optional<uint64_t>> padding_size_values = DecodeDeltas( >+ proto.padding_size_deltas(), proto.padding_size(), number_of_deltas); >+ RTC_CHECK_EQ(padding_size_values.size(), number_of_deltas); >+ >+ // transport_sequence_number (RTP extension) >+ std::vector<absl::optional<uint64_t>> transport_sequence_number_values; >+ { >+ const absl::optional<uint64_t> base_transport_sequence_number = >+ proto.has_transport_sequence_number() >+ ? proto.transport_sequence_number() >+ : absl::optional<uint64_t>(); >+ transport_sequence_number_values = >+ DecodeDeltas(proto.transport_sequence_number_deltas(), >+ base_transport_sequence_number, number_of_deltas); >+ RTC_CHECK_EQ(transport_sequence_number_values.size(), number_of_deltas); >+ } >+ >+ // transmission_time_offset (RTP extension) >+ std::vector<absl::optional<uint64_t>> transmission_time_offset_values; >+ { >+ const absl::optional<uint64_t> unsigned_base_transmission_time_offset = >+ proto.has_transmission_time_offset() >+ ? ToUnsigned(proto.transmission_time_offset()) >+ : absl::optional<uint64_t>(); >+ transmission_time_offset_values = >+ DecodeDeltas(proto.transmission_time_offset_deltas(), >+ unsigned_base_transmission_time_offset, number_of_deltas); >+ RTC_CHECK_EQ(transmission_time_offset_values.size(), number_of_deltas); >+ } >+ >+ // absolute_send_time (RTP extension) >+ std::vector<absl::optional<uint64_t>> absolute_send_time_values; >+ { >+ const absl::optional<uint64_t> base_absolute_send_time = >+ proto.has_absolute_send_time() ? 
proto.absolute_send_time() >+ : absl::optional<uint64_t>(); >+ absolute_send_time_values = >+ DecodeDeltas(proto.absolute_send_time_deltas(), base_absolute_send_time, >+ number_of_deltas); >+ RTC_CHECK_EQ(absolute_send_time_values.size(), number_of_deltas); >+ } >+ >+ // video_rotation (RTP extension) >+ std::vector<absl::optional<uint64_t>> video_rotation_values; >+ { >+ const absl::optional<uint64_t> base_video_rotation = >+ proto.has_video_rotation() ? proto.video_rotation() >+ : absl::optional<uint64_t>(); >+ video_rotation_values = DecodeDeltas(proto.video_rotation_deltas(), >+ base_video_rotation, number_of_deltas); >+ RTC_CHECK_EQ(video_rotation_values.size(), number_of_deltas); >+ } >+ >+ // audio_level (RTP extension) >+ std::vector<absl::optional<uint64_t>> audio_level_values; >+ { >+ const absl::optional<uint64_t> base_audio_level = >+ proto.has_audio_level() ? proto.audio_level() >+ : absl::optional<uint64_t>(); >+ audio_level_values = DecodeDeltas(proto.audio_level_deltas(), >+ base_audio_level, number_of_deltas); >+ RTC_CHECK_EQ(audio_level_values.size(), number_of_deltas); >+ } >+ >+ // voice_activity (RTP extension) >+ std::vector<absl::optional<uint64_t>> voice_activity_values; >+ { >+ const absl::optional<uint64_t> base_voice_activity = >+ proto.has_voice_activity() ? 
proto.voice_activity() >+ : absl::optional<uint64_t>(); >+ voice_activity_values = DecodeDeltas(proto.voice_activity_deltas(), >+ base_voice_activity, number_of_deltas); >+ RTC_CHECK_EQ(voice_activity_values.size(), number_of_deltas); >+ } >+ >+ // Delta decoding >+ for (size_t i = 0; i < number_of_deltas; ++i) { >+ RTC_CHECK(timestamp_ms_values[i].has_value()); >+ RTC_CHECK(marker_values[i].has_value()); >+ RTC_CHECK(payload_type_values[i].has_value()); >+ RTC_CHECK(sequence_number_values[i].has_value()); >+ RTC_CHECK(rtp_timestamp_values[i].has_value()); >+ RTC_CHECK(ssrc_values[i].has_value()); >+ RTC_CHECK(payload_size_values[i].has_value()); >+ RTC_CHECK(header_size_values[i].has_value()); >+ RTC_CHECK(padding_size_values[i].has_value()); >+ >+ int64_t timestamp_ms; >+ RTC_CHECK(ToSigned(timestamp_ms_values[i].value(), &timestamp_ms)); >+ >+ RTPHeader header; >+ header.markerBit = rtc::checked_cast<bool>(*marker_values[i]); >+ header.payloadType = rtc::checked_cast<uint8_t>(*payload_type_values[i]); >+ header.sequenceNumber = >+ rtc::checked_cast<uint16_t>(*sequence_number_values[i]); >+ header.timestamp = rtc::checked_cast<uint32_t>(*rtp_timestamp_values[i]); >+ header.ssrc = rtc::checked_cast<uint32_t>(*ssrc_values[i]); >+ header.numCSRCs = 0; // TODO(terelius): Implement CSRC. >+ header.paddingLength = rtc::checked_cast<size_t>(*padding_size_values[i]); >+ header.headerLength = rtc::checked_cast<size_t>(*header_size_values[i]); >+ // TODO(terelius): Should we implement payload_type_frequency? 
>+ if (transport_sequence_number_values.size() > i && >+ transport_sequence_number_values[i].has_value()) { >+ header.extension.hasTransportSequenceNumber = true; >+ header.extension.transportSequenceNumber = rtc::checked_cast<uint16_t>( >+ transport_sequence_number_values[i].value()); >+ } >+ if (transmission_time_offset_values.size() > i && >+ transmission_time_offset_values[i].has_value()) { >+ header.extension.hasTransmissionTimeOffset = true; >+ int32_t transmission_time_offset; >+ RTC_CHECK(ToSigned(transmission_time_offset_values[i].value(), >+ &transmission_time_offset)); >+ header.extension.transmissionTimeOffset = transmission_time_offset; >+ } >+ if (absolute_send_time_values.size() > i && >+ absolute_send_time_values[i].has_value()) { >+ header.extension.hasAbsoluteSendTime = true; >+ header.extension.absoluteSendTime = >+ rtc::checked_cast<uint32_t>(absolute_send_time_values[i].value()); >+ } >+ if (video_rotation_values.size() > i && >+ video_rotation_values[i].has_value()) { >+ header.extension.hasVideoRotation = true; >+ header.extension.videoRotation = ConvertCVOByteToVideoRotation( >+ rtc::checked_cast<uint8_t>(video_rotation_values[i].value())); >+ } >+ if (audio_level_values.size() > i && audio_level_values[i].has_value()) { >+ RTC_CHECK(voice_activity_values.size() > i && >+ voice_activity_values[i].has_value()); >+ header.extension.hasAudioLevel = true; >+ header.extension.voiceActivity = >+ rtc::checked_cast<bool>(voice_activity_values[i].value()); >+ const uint8_t audio_level = >+ rtc::checked_cast<uint8_t>(audio_level_values[i].value()); >+ RTC_CHECK_LE(audio_level, 0x7Fu); >+ header.extension.audioLevel = audio_level; >+ } else { >+ RTC_CHECK(voice_activity_values.size() <= i || >+ !voice_activity_values[i].has_value()); >+ } >+ (*rtp_packets_map)[header.ssrc].emplace_back( >+ 1000 * timestamp_ms, header, header.headerLength, >+ payload_size_values[i].value() + header.headerLength + >+ header.paddingLength); >+ } >+} >+ >+template 
<typename ProtoType, typename LoggedType> >+void StoreRtcpPackets(const ProtoType& proto, >+ std::vector<LoggedType>* rtcp_packets) { >+ RTC_CHECK(proto.has_timestamp_ms()); >+ RTC_CHECK(proto.has_raw_packet()); >+ >+ // Base event >+ rtcp_packets->emplace_back(proto.timestamp_ms() * 1000, proto.raw_packet()); >+ >+ const size_t number_of_deltas = >+ proto.has_number_of_deltas() ? proto.number_of_deltas() : 0u; >+ if (number_of_deltas == 0) { >+ return; >+ } >+ >+ // timestamp_ms >+ std::vector<absl::optional<uint64_t>> timestamp_ms_values = >+ DecodeDeltas(proto.timestamp_ms_deltas(), >+ ToUnsigned(proto.timestamp_ms()), number_of_deltas); >+ RTC_CHECK_EQ(timestamp_ms_values.size(), number_of_deltas); >+ >+ // raw_packet >+ RTC_CHECK(proto.has_raw_packet_blobs()); >+ std::vector<absl::string_view> raw_packet_values = >+ DecodeBlobs(proto.raw_packet_blobs(), number_of_deltas); >+ RTC_CHECK_EQ(raw_packet_values.size(), number_of_deltas); >+ >+ // Delta decoding >+ for (size_t i = 0; i < number_of_deltas; ++i) { >+ RTC_CHECK(timestamp_ms_values[i].has_value()); >+ int64_t timestamp_ms; >+ RTC_CHECK(ToSigned(timestamp_ms_values[i].value(), &timestamp_ms)); >+ >+ rtcp_packets->emplace_back( >+ 1000 * timestamp_ms, >+ reinterpret_cast<const uint8_t*>(raw_packet_values[i].data()), >+ raw_packet_values[i].size()); >+ } >+} >+ > } // namespace > > LoggedRtcpPacket::LoggedRtcpPacket(uint64_t timestamp_us, > const uint8_t* packet, > size_t total_length) > : timestamp_us(timestamp_us), raw_data(packet, packet + total_length) {} >+LoggedRtcpPacket::LoggedRtcpPacket(uint64_t timestamp_us, >+ const std::string& packet) >+ : timestamp_us(timestamp_us), raw_data(packet.size()) { >+ memcpy(raw_data.data(), packet.data(), packet.size()); >+} > LoggedRtcpPacket::LoggedRtcpPacket(const LoggedRtcpPacket& rhs) = default; > LoggedRtcpPacket::~LoggedRtcpPacket() = default; > >+LoggedVideoSendConfig::LoggedVideoSendConfig() = default; > LoggedVideoSendConfig::LoggedVideoSendConfig( > int64_t 
timestamp_us, > const std::vector<rtclog::StreamConfig>& configs) >@@ -361,7 +800,6 @@ ParsedRtcEventLogNew::ParsedRtcEventLogNew( > } > > void ParsedRtcEventLogNew::Clear() { >- events_.clear(); > default_extension_map_ = GetDefaultHeaderExtensionMap(); > > incoming_rtx_ssrcs_.clear(); >@@ -478,9 +916,8 @@ bool ParsedRtcEventLogNew::ParseStream( > > bool ParsedRtcEventLogNew::ParseStreamInternal( > std::istream& stream) { // no-presubmit-check TODO(webrtc:8982) >- const size_t kMaxEventSize = (1u << 16) - 1; >- const size_t kMaxVarintSize = 10; >- std::vector<char> buffer(kMaxEventSize + 2 * kMaxVarintSize); >+ constexpr uint64_t kMaxEventSize = 10000000; // Sanity check. >+ std::vector<char> buffer(0xFFFF); > > RTC_DCHECK(stream.good()); > >@@ -491,10 +928,12 @@ bool ParsedRtcEventLogNew::ParseStreamInternal( > break; > } > >- // Read the next message tag. The tag number is defined as >- // (fieldnumber << 3) | wire_type. In our case, the field number is >- // supposed to be 1 and the wire type for a length-delimited field is 2. >- const uint64_t kExpectedV1Tag = (1 << 3) | 2; >+ // Read the next message tag. Protobuf defines the message tag as >+ // (field_number << 3) | wire_type. In the legacy encoding, the field number >+ // is supposed to be 1 and the wire type for a length-delimited field is 2. >+ // In the new encoding we still expect the wire type to be 2, but the field >+ // number will be greater than 1. 
>+ constexpr uint64_t kExpectedV1Tag = (1 << 3) | 2; > size_t bytes_written = 0; > absl::optional<uint64_t> tag = > ParseVarInt(stream, buffer.data(), &bytes_written); >@@ -502,9 +941,13 @@ bool ParsedRtcEventLogNew::ParseStreamInternal( > RTC_LOG(LS_WARNING) > << "Missing field tag from beginning of protobuf event."; > return false; >- } else if (*tag != kExpectedV1Tag) { >- RTC_LOG(LS_WARNING) >- << "Unexpected field tag at beginning of protobuf event."; >+ } >+ constexpr uint64_t kWireTypeMask = 0x07; >+ const uint64_t wire_type = *tag & kWireTypeMask; >+ if (wire_type != 2) { >+ RTC_LOG(LS_WARNING) << "Expected field tag with wire type 2 (length " >+ "delimited message). Found wire type " >+ << wire_type; > return false; > } > >@@ -520,6 +963,8 @@ bool ParsedRtcEventLogNew::ParseStreamInternal( > } > > // Read the next protobuf event to a temporary char buffer. >+ if (buffer.size() < bytes_written + *message_length) >+ buffer.resize(bytes_written + *message_length); > stream.read(buffer.data() + bytes_written, *message_length); > if (stream.gcount() != static_cast<int>(*message_length)) { > RTC_LOG(LS_WARNING) << "Failed to read protobuf message from file."; >@@ -527,21 +972,32 @@ bool ParsedRtcEventLogNew::ParseStreamInternal( > } > size_t buffer_size = bytes_written + *message_length; > >- // Parse the protobuf event from the buffer. >- rtclog::EventStream event_stream; >- if (!event_stream.ParseFromArray(buffer.data(), buffer_size)) { >- RTC_LOG(LS_WARNING) << "Failed to parse protobuf message."; >- return false; >- } >+ if (*tag == kExpectedV1Tag) { >+ // Parse the protobuf event from the buffer. 
>+ rtclog::EventStream event_stream; >+ if (!event_stream.ParseFromArray(buffer.data(), buffer_size)) { >+ RTC_LOG(LS_WARNING) >+ << "Failed to parse legacy-format protobuf message."; >+ return false; >+ } > >- RTC_CHECK_EQ(event_stream.stream_size(), 1); >- StoreParsedEvent(event_stream.stream(0)); >- events_.push_back(event_stream.stream(0)); >+ RTC_CHECK_EQ(event_stream.stream_size(), 1); >+ StoreParsedLegacyEvent(event_stream.stream(0)); >+ } else { >+ // Parse the protobuf event from the buffer. >+ rtclog2::EventStream event_stream; >+ if (!event_stream.ParseFromArray(buffer.data(), buffer_size)) { >+ RTC_LOG(LS_WARNING) << "Failed to parse new-format protobuf message."; >+ return false; >+ } >+ StoreParsedNewFormatEvent(event_stream); >+ } > } > return true; > } > >-void ParsedRtcEventLogNew::StoreParsedEvent(const rtclog::Event& event) { >+void ParsedRtcEventLogNew::StoreParsedLegacyEvent(const rtclog::Event& event) { >+ RTC_CHECK(event.has_type()); > if (event.type() != rtclog::Event::VIDEO_RECEIVER_CONFIG_EVENT && > event.type() != rtclog::Event::VIDEO_SENDER_CONFIG_EVENT && > event.type() != rtclog::Event::AUDIO_RECEIVER_CONFIG_EVENT && >@@ -554,55 +1010,63 @@ void ParsedRtcEventLogNew::StoreParsedEvent(const rtclog::Event& event) { > last_timestamp_ = std::max(last_timestamp_, timestamp); > } > >- switch (GetEventType(event)) { >- case ParsedRtcEventLogNew::EventType::VIDEO_RECEIVER_CONFIG_EVENT: { >+ switch (event.type()) { >+ case rtclog::Event::VIDEO_RECEIVER_CONFIG_EVENT: { > rtclog::StreamConfig config = GetVideoReceiveConfig(event); > video_recv_configs_.emplace_back(GetTimestamp(event), config); >- incoming_rtp_extensions_maps_[config.remote_ssrc] = >- RtpHeaderExtensionMap(config.rtp_extensions); >- // TODO(terelius): I don't understand the reason for configuring header >- // extensions for the local SSRC. I think it should be removed, but for >- // now I want to preserve the previous functionality. 
>- incoming_rtp_extensions_maps_[config.local_ssrc] = >- RtpHeaderExtensionMap(config.rtp_extensions); >+ if (!config.rtp_extensions.empty()) { >+ incoming_rtp_extensions_maps_[config.remote_ssrc] = >+ RtpHeaderExtensionMap(config.rtp_extensions); >+ // TODO(terelius): I don't understand the reason for configuring header >+ // extensions for the local SSRC. I think it should be removed, but for >+ // now I want to preserve the previous functionality. >+ incoming_rtp_extensions_maps_[config.local_ssrc] = >+ RtpHeaderExtensionMap(config.rtp_extensions); >+ } > incoming_video_ssrcs_.insert(config.remote_ssrc); > incoming_video_ssrcs_.insert(config.rtx_ssrc); > incoming_rtx_ssrcs_.insert(config.rtx_ssrc); > break; > } >- case ParsedRtcEventLogNew::EventType::VIDEO_SENDER_CONFIG_EVENT: { >+ case rtclog::Event::VIDEO_SENDER_CONFIG_EVENT: { > std::vector<rtclog::StreamConfig> configs = GetVideoSendConfig(event); > video_send_configs_.emplace_back(GetTimestamp(event), configs); > for (const auto& config : configs) { >- outgoing_rtp_extensions_maps_[config.local_ssrc] = >- RtpHeaderExtensionMap(config.rtp_extensions); >- outgoing_rtp_extensions_maps_[config.rtx_ssrc] = >- RtpHeaderExtensionMap(config.rtp_extensions); >+ if (!config.rtp_extensions.empty()) { >+ outgoing_rtp_extensions_maps_[config.local_ssrc] = >+ RtpHeaderExtensionMap(config.rtp_extensions); >+ outgoing_rtp_extensions_maps_[config.rtx_ssrc] = >+ RtpHeaderExtensionMap(config.rtp_extensions); >+ } > outgoing_video_ssrcs_.insert(config.local_ssrc); > outgoing_video_ssrcs_.insert(config.rtx_ssrc); > outgoing_rtx_ssrcs_.insert(config.rtx_ssrc); > } > break; > } >- case ParsedRtcEventLogNew::EventType::AUDIO_RECEIVER_CONFIG_EVENT: { >+ case rtclog::Event::AUDIO_RECEIVER_CONFIG_EVENT: { > rtclog::StreamConfig config = GetAudioReceiveConfig(event); > audio_recv_configs_.emplace_back(GetTimestamp(event), config); >- incoming_rtp_extensions_maps_[config.remote_ssrc] = >- RtpHeaderExtensionMap(config.rtp_extensions); 
>- incoming_rtp_extensions_maps_[config.local_ssrc] = >- RtpHeaderExtensionMap(config.rtp_extensions); >+ if (!config.rtp_extensions.empty()) { >+ incoming_rtp_extensions_maps_[config.remote_ssrc] = >+ RtpHeaderExtensionMap(config.rtp_extensions); >+ incoming_rtp_extensions_maps_[config.local_ssrc] = >+ RtpHeaderExtensionMap(config.rtp_extensions); >+ } > incoming_audio_ssrcs_.insert(config.remote_ssrc); > break; > } >- case ParsedRtcEventLogNew::EventType::AUDIO_SENDER_CONFIG_EVENT: { >+ case rtclog::Event::AUDIO_SENDER_CONFIG_EVENT: { > rtclog::StreamConfig config = GetAudioSendConfig(event); > audio_send_configs_.emplace_back(GetTimestamp(event), config); >- outgoing_rtp_extensions_maps_[config.local_ssrc] = >- RtpHeaderExtensionMap(config.rtp_extensions); >+ if (!config.rtp_extensions.empty()) { >+ outgoing_rtp_extensions_maps_[config.local_ssrc] = >+ RtpHeaderExtensionMap(config.rtp_extensions); >+ } > outgoing_audio_ssrcs_.insert(config.local_ssrc); > break; > } >- case ParsedRtcEventLogNew::EventType::RTP_EVENT: { >+ case rtclog::Event::RTP_EVENT: { > PacketDirection direction; > uint8_t header[IP_PACKET_SIZE]; > size_t header_length; >@@ -645,7 +1109,7 @@ void ParsedRtcEventLogNew::StoreParsedEvent(const rtclog::Event& event) { > } > break; > } >- case ParsedRtcEventLogNew::EventType::RTCP_EVENT: { >+ case rtclog::Event::RTCP_EVENT: { > PacketDirection direction; > uint8_t packet[IP_PACKET_SIZE]; > size_t total_length; >@@ -731,113 +1195,76 @@ void ParsedRtcEventLogNew::StoreParsedEvent(const rtclog::Event& event) { > } > break; > } >- case ParsedRtcEventLogNew::EventType::LOG_START: { >+ case rtclog::Event::LOG_START: { > start_log_events_.push_back(LoggedStartEvent(GetTimestamp(event))); > break; > } >- case ParsedRtcEventLogNew::EventType::LOG_END: { >+ case rtclog::Event::LOG_END: { > stop_log_events_.push_back(LoggedStopEvent(GetTimestamp(event))); > break; > } >- case ParsedRtcEventLogNew::EventType::AUDIO_PLAYOUT_EVENT: { >+ case 
rtclog::Event::AUDIO_PLAYOUT_EVENT: { > LoggedAudioPlayoutEvent playout_event = GetAudioPlayout(event); > audio_playout_events_[playout_event.ssrc].push_back(playout_event); > break; > } >- case ParsedRtcEventLogNew::EventType::LOSS_BASED_BWE_UPDATE: { >+ case rtclog::Event::LOSS_BASED_BWE_UPDATE: { > bwe_loss_updates_.push_back(GetLossBasedBweUpdate(event)); > break; > } >- case ParsedRtcEventLogNew::EventType::DELAY_BASED_BWE_UPDATE: { >+ case rtclog::Event::DELAY_BASED_BWE_UPDATE: { > bwe_delay_updates_.push_back(GetDelayBasedBweUpdate(event)); > break; > } >- case ParsedRtcEventLogNew::EventType::AUDIO_NETWORK_ADAPTATION_EVENT: { >+ case rtclog::Event::AUDIO_NETWORK_ADAPTATION_EVENT: { > LoggedAudioNetworkAdaptationEvent ana_event = > GetAudioNetworkAdaptation(event); > audio_network_adaptation_events_.push_back(ana_event); > break; > } >- case ParsedRtcEventLogNew::EventType::BWE_PROBE_CLUSTER_CREATED_EVENT: { >+ case rtclog::Event::BWE_PROBE_CLUSTER_CREATED_EVENT: { > bwe_probe_cluster_created_events_.push_back( > GetBweProbeClusterCreated(event)); > break; > } >- case ParsedRtcEventLogNew::EventType::BWE_PROBE_FAILURE_EVENT: { >- bwe_probe_failure_events_.push_back(GetBweProbeFailure(event)); >- break; >- } >- case ParsedRtcEventLogNew::EventType::BWE_PROBE_SUCCESS_EVENT: { >- bwe_probe_success_events_.push_back(GetBweProbeSuccess(event)); >+ case rtclog::Event::BWE_PROBE_RESULT_EVENT: { >+ // Probe successes and failures are currently stored in the same proto >+ // message, we are moving towards separate messages. Probe results >+ // therefore need special treatment in the parser. 
>+ RTC_CHECK(event.has_probe_result()); >+ RTC_CHECK(event.probe_result().has_result()); >+ if (event.probe_result().result() == rtclog::BweProbeResult::SUCCESS) { >+ bwe_probe_success_events_.push_back(GetBweProbeSuccess(event)); >+ } else { >+ bwe_probe_failure_events_.push_back(GetBweProbeFailure(event)); >+ } > break; > } >- case ParsedRtcEventLogNew::EventType::ALR_STATE_EVENT: { >+ case rtclog::Event::ALR_STATE_EVENT: { > alr_state_events_.push_back(GetAlrState(event)); > break; > } >- case ParsedRtcEventLogNew::EventType::ICE_CANDIDATE_PAIR_CONFIG: { >+ case rtclog::Event::ICE_CANDIDATE_PAIR_CONFIG: { > ice_candidate_pair_configs_.push_back(GetIceCandidatePairConfig(event)); > break; > } >- case ParsedRtcEventLogNew::EventType::ICE_CANDIDATE_PAIR_EVENT: { >+ case rtclog::Event::ICE_CANDIDATE_PAIR_EVENT: { > ice_candidate_pair_events_.push_back(GetIceCandidatePairEvent(event)); > break; > } >- case ParsedRtcEventLogNew::EventType::UNKNOWN_EVENT: { >+ case rtclog::Event::UNKNOWN_EVENT: { > break; > } > } > } > >-size_t ParsedRtcEventLogNew::GetNumberOfEvents() const { >- return events_.size(); >-} >- >-int64_t ParsedRtcEventLogNew::GetTimestamp(size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& event = events_[index]; >- return GetTimestamp(event); >-} > > int64_t ParsedRtcEventLogNew::GetTimestamp(const rtclog::Event& event) const { > RTC_CHECK(event.has_timestamp_us()); > return event.timestamp_us(); > } > >-ParsedRtcEventLogNew::EventType ParsedRtcEventLogNew::GetEventType( >- size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& event = events_[index]; >- return GetEventType(event); >-} >- >-ParsedRtcEventLogNew::EventType ParsedRtcEventLogNew::GetEventType( >- const rtclog::Event& event) const { >- RTC_CHECK(event.has_type()); >- if (event.type() == rtclog::Event::BWE_PROBE_RESULT_EVENT) { >- RTC_CHECK(event.has_probe_result()); >- RTC_CHECK(event.probe_result().has_result()); >- 
if (event.probe_result().result() == rtclog::BweProbeResult::SUCCESS) >- return ParsedRtcEventLogNew::EventType::BWE_PROBE_SUCCESS_EVENT; >- return ParsedRtcEventLogNew::EventType::BWE_PROBE_FAILURE_EVENT; >- } >- return GetRuntimeEventType(event.type()); >-} >- > // The header must have space for at least IP_PACKET_SIZE bytes. >-const webrtc::RtpHeaderExtensionMap* ParsedRtcEventLogNew::GetRtpHeader( >- size_t index, >- PacketDirection* incoming, >- uint8_t* header, >- size_t* header_length, >- size_t* total_length, >- int* probe_cluster_id) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& event = events_[index]; >- return GetRtpHeader(event, incoming, header, header_length, total_length, >- probe_cluster_id); >-} >- > const webrtc::RtpHeaderExtensionMap* ParsedRtcEventLogNew::GetRtpHeader( > const rtclog::Event& event, > PacketDirection* incoming, >@@ -899,15 +1326,6 @@ const webrtc::RtpHeaderExtensionMap* ParsedRtcEventLogNew::GetRtpHeader( > } > > // The packet must have space for at least IP_PACKET_SIZE bytes. 
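[Illustrative aside, not part of the patch: the accessors being deleted above all follow one pattern — a thin index-based overload that bound-checks and delegates to an event-based accessor. A miniature version of that pattern, with simplified stand-in names, looks like this.]

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <utility>
#include <vector>

struct Event {
  int64_t timestamp_us;
};

// Miniature version of the delegation pattern the patch removes: the
// index-based overload only bound-checks and forwards to the event-based one.
class EventLog {
 public:
  explicit EventLog(std::vector<Event> events) : events_(std::move(events)) {}

  int64_t GetTimestamp(const Event& event) const { return event.timestamp_us; }

  // Index-based convenience overload (the kind of accessor being deleted).
  int64_t GetTimestamp(size_t index) const {
    assert(index < events_.size());
    return GetTimestamp(events_[index]);
  }

 private:
  std::vector<Event> events_;
};
```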
>-void ParsedRtcEventLogNew::GetRtcpPacket(size_t index, >- PacketDirection* incoming, >- uint8_t* packet, >- size_t* length) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& event = events_[index]; >- GetRtcpPacket(event, incoming, packet, length); >-} >- > void ParsedRtcEventLogNew::GetRtcpPacket(const rtclog::Event& event, > PacketDirection* incoming, > uint8_t* packet, >@@ -935,12 +1353,6 @@ void ParsedRtcEventLogNew::GetRtcpPacket(const rtclog::Event& event, > } > } > >-rtclog::StreamConfig ParsedRtcEventLogNew::GetVideoReceiveConfig( >- size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- return GetVideoReceiveConfig(events_[index]); >-} >- > rtclog::StreamConfig ParsedRtcEventLogNew::GetVideoReceiveConfig( > const rtclog::Event& event) const { > rtclog::StreamConfig config; >@@ -1002,12 +1414,6 @@ rtclog::StreamConfig ParsedRtcEventLogNew::GetVideoReceiveConfig( > return config; > } > >-std::vector<rtclog::StreamConfig> ParsedRtcEventLogNew::GetVideoSendConfig( >- size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- return GetVideoSendConfig(events_[index]); >-} >- > std::vector<rtclog::StreamConfig> ParsedRtcEventLogNew::GetVideoSendConfig( > const rtclog::Event& event) const { > std::vector<rtclog::StreamConfig> configs; >@@ -1046,12 +1452,6 @@ std::vector<rtclog::StreamConfig> ParsedRtcEventLogNew::GetVideoSendConfig( > return configs; > } > >-rtclog::StreamConfig ParsedRtcEventLogNew::GetAudioReceiveConfig( >- size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- return GetAudioReceiveConfig(events_[index]); >-} >- > rtclog::StreamConfig ParsedRtcEventLogNew::GetAudioReceiveConfig( > const rtclog::Event& event) const { > rtclog::StreamConfig config; >@@ -1071,12 +1471,6 @@ rtclog::StreamConfig ParsedRtcEventLogNew::GetAudioReceiveConfig( > return config; > } > >-rtclog::StreamConfig ParsedRtcEventLogNew::GetAudioSendConfig( >- size_t index) const { >- RTC_CHECK_LT(index, 
GetNumberOfEvents()); >- return GetAudioSendConfig(events_[index]); >-} >- > rtclog::StreamConfig ParsedRtcEventLogNew::GetAudioSendConfig( > const rtclog::Event& event) const { > rtclog::StreamConfig config; >@@ -1093,13 +1487,6 @@ rtclog::StreamConfig ParsedRtcEventLogNew::GetAudioSendConfig( > return config; > } > >-LoggedAudioPlayoutEvent ParsedRtcEventLogNew::GetAudioPlayout( >- size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& event = events_[index]; >- return GetAudioPlayout(event); >-} >- > LoggedAudioPlayoutEvent ParsedRtcEventLogNew::GetAudioPlayout( > const rtclog::Event& event) const { > RTC_CHECK(event.has_type()); >@@ -1113,13 +1500,6 @@ LoggedAudioPlayoutEvent ParsedRtcEventLogNew::GetAudioPlayout( > return res; > } > >-LoggedBweLossBasedUpdate ParsedRtcEventLogNew::GetLossBasedBweUpdate( >- size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& event = events_[index]; >- return GetLossBasedBweUpdate(event); >-} >- > LoggedBweLossBasedUpdate ParsedRtcEventLogNew::GetLossBasedBweUpdate( > const rtclog::Event& event) const { > RTC_CHECK(event.has_type()); >@@ -1138,13 +1518,6 @@ LoggedBweLossBasedUpdate ParsedRtcEventLogNew::GetLossBasedBweUpdate( > return bwe_update; > } > >-LoggedBweDelayBasedUpdate ParsedRtcEventLogNew::GetDelayBasedBweUpdate( >- size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& event = events_[index]; >- return GetDelayBasedBweUpdate(event); >-} >- > LoggedBweDelayBasedUpdate ParsedRtcEventLogNew::GetDelayBasedBweUpdate( > const rtclog::Event& event) const { > RTC_CHECK(event.has_type()); >@@ -1162,13 +1535,6 @@ LoggedBweDelayBasedUpdate ParsedRtcEventLogNew::GetDelayBasedBweUpdate( > return res; > } > >-LoggedAudioNetworkAdaptationEvent >-ParsedRtcEventLogNew::GetAudioNetworkAdaptation(size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& event = events_[index]; >- return 
GetAudioNetworkAdaptation(event); >-} >- > LoggedAudioNetworkAdaptationEvent > ParsedRtcEventLogNew::GetAudioNetworkAdaptation( > const rtclog::Event& event) const { >@@ -1196,13 +1562,6 @@ ParsedRtcEventLogNew::GetAudioNetworkAdaptation( > return res; > } > >-LoggedBweProbeClusterCreatedEvent >-ParsedRtcEventLogNew::GetBweProbeClusterCreated(size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& event = events_[index]; >- return GetBweProbeClusterCreated(event); >-} >- > LoggedBweProbeClusterCreatedEvent > ParsedRtcEventLogNew::GetBweProbeClusterCreated( > const rtclog::Event& event) const { >@@ -1223,13 +1582,6 @@ ParsedRtcEventLogNew::GetBweProbeClusterCreated( > return res; > } > >-LoggedBweProbeFailureEvent ParsedRtcEventLogNew::GetBweProbeFailure( >- size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& event = events_[index]; >- return GetBweProbeFailure(event); >-} >- > LoggedBweProbeFailureEvent ParsedRtcEventLogNew::GetBweProbeFailure( > const rtclog::Event& event) const { > RTC_CHECK(event.has_type()); >@@ -1260,13 +1612,6 @@ LoggedBweProbeFailureEvent ParsedRtcEventLogNew::GetBweProbeFailure( > return res; > } > >-LoggedBweProbeSuccessEvent ParsedRtcEventLogNew::GetBweProbeSuccess( >- size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& event = events_[index]; >- return GetBweProbeSuccess(event); >-} >- > LoggedBweProbeSuccessEvent ParsedRtcEventLogNew::GetBweProbeSuccess( > const rtclog::Event& event) const { > RTC_CHECK(event.has_type()); >@@ -1286,12 +1631,6 @@ LoggedBweProbeSuccessEvent ParsedRtcEventLogNew::GetBweProbeSuccess( > return res; > } > >-LoggedAlrStateEvent ParsedRtcEventLogNew::GetAlrState(size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& event = events_[index]; >- return GetAlrState(event); >-} >- > LoggedAlrStateEvent ParsedRtcEventLogNew::GetAlrState( > const rtclog::Event& event) 
const { > RTC_CHECK(event.has_type()); >@@ -1306,13 +1645,6 @@ LoggedAlrStateEvent ParsedRtcEventLogNew::GetAlrState( > return res; > } > >-LoggedIceCandidatePairConfig ParsedRtcEventLogNew::GetIceCandidatePairConfig( >- size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& rtc_event = events_[index]; >- return GetIceCandidatePairConfig(rtc_event); >-} >- > LoggedIceCandidatePairConfig ParsedRtcEventLogNew::GetIceCandidatePairConfig( > const rtclog::Event& rtc_event) const { > RTC_CHECK(rtc_event.has_type()); >@@ -1349,13 +1681,6 @@ LoggedIceCandidatePairConfig ParsedRtcEventLogNew::GetIceCandidatePairConfig( > return res; > } > >-LoggedIceCandidatePairEvent ParsedRtcEventLogNew::GetIceCandidatePairEvent( >- size_t index) const { >- RTC_CHECK_LT(index, GetNumberOfEvents()); >- const rtclog::Event& rtc_event = events_[index]; >- return GetIceCandidatePairEvent(rtc_event); >-} >- > LoggedIceCandidatePairEvent ParsedRtcEventLogNew::GetIceCandidatePairEvent( > const rtclog::Event& rtc_event) const { > RTC_CHECK(rtc_event.has_type()); >@@ -1438,7 +1763,8 @@ const std::vector<MatchedSendArrivalTimes> GetNetworkTrace( > clock.AdvanceTimeMicroseconds(time_us - clock.TimeInMicroseconds()); > if (clock.TimeInMicroseconds() >= NextRtcpTime()) { > RTC_DCHECK_EQ(clock.TimeInMicroseconds(), NextRtcpTime()); >- feedback_adapter.OnTransportFeedback(rtcp_iterator->transport_feedback); >+ feedback_adapter.ProcessTransportFeedback( >+ rtcp_iterator->transport_feedback); > std::vector<PacketFeedback> feedback = > feedback_adapter.GetTransportFeedbackVector(); > SortPacketFeedbackVectorWithLoss(&feedback); >@@ -1452,14 +1778,25 @@ const std::vector<MatchedSendArrivalTimes> GetNetworkTrace( > if (clock.TimeInMicroseconds() >= NextRtpTime()) { > RTC_DCHECK_EQ(clock.TimeInMicroseconds(), NextRtpTime()); > const RtpPacketType& rtp_packet = *rtp_iterator->second; >+ rtc::SentPacket sent_packet; >+ sent_packet.send_time_ms = rtp_packet.rtp.log_time_ms(); 
>+ sent_packet.info.packet_size_bytes = rtp_packet.rtp.total_length; > if (rtp_packet.rtp.header.extension.hasTransportSequenceNumber) { > feedback_adapter.AddPacket( > rtp_packet.rtp.header.ssrc, > rtp_packet.rtp.header.extension.transportSequenceNumber, > rtp_packet.rtp.total_length, PacedPacketInfo()); >- feedback_adapter.OnSentPacket( >- rtp_packet.rtp.header.extension.transportSequenceNumber, >- rtp_packet.rtp.log_time_ms()); >+ sent_packet.packet_id = >+ rtp_packet.rtp.header.extension.transportSequenceNumber; >+ sent_packet.info.included_in_feedback = true; >+ sent_packet.info.included_in_allocation = true; >+ feedback_adapter.ProcessSentPacket(sent_packet); >+ } else { >+ sent_packet.info.included_in_feedback = false; >+ // TODO(srte): Make it possible to indicate that all packets are part of >+ // allocation. >+ sent_packet.info.included_in_allocation = false; >+ feedback_adapter.ProcessSentPacket(sent_packet); > } > ++rtp_iterator; > } >@@ -1468,4 +1805,613 @@ const std::vector<MatchedSendArrivalTimes> GetNetworkTrace( > return rtp_rtcp_matched; > } > >+// Helper functions for new format start here >+void ParsedRtcEventLogNew::StoreParsedNewFormatEvent( >+ const rtclog2::EventStream& stream) { >+ RTC_DCHECK_EQ(stream.stream_size(), 0); >+ >+ RTC_DCHECK_EQ( >+ stream.incoming_rtp_packets_size() + stream.outgoing_rtp_packets_size() + >+ stream.incoming_rtcp_packets_size() + >+ stream.outgoing_rtcp_packets_size() + >+ stream.audio_playout_events_size() + stream.begin_log_events_size() + >+ stream.end_log_events_size() + stream.loss_based_bwe_updates_size() + >+ stream.delay_based_bwe_updates_size() + >+ stream.audio_network_adaptations_size() + >+ stream.probe_clusters_size() + stream.probe_success_size() + >+ stream.probe_failure_size() + stream.alr_states_size() + >+ stream.ice_candidate_configs_size() + >+ stream.ice_candidate_events_size() + >+ stream.audio_recv_stream_configs_size() + >+ stream.audio_send_stream_configs_size() + >+ 
stream.video_recv_stream_configs_size() + >+ stream.video_send_stream_configs_size(), >+ 1u); >+ >+ if (stream.incoming_rtp_packets_size() == 1) { >+ StoreIncomingRtpPackets(stream.incoming_rtp_packets(0)); >+ } else if (stream.outgoing_rtp_packets_size() == 1) { >+ StoreOutgoingRtpPackets(stream.outgoing_rtp_packets(0)); >+ } else if (stream.incoming_rtcp_packets_size() == 1) { >+ StoreIncomingRtcpPackets(stream.incoming_rtcp_packets(0)); >+ } else if (stream.outgoing_rtcp_packets_size() == 1) { >+ StoreOutgoingRtcpPackets(stream.outgoing_rtcp_packets(0)); >+ } else if (stream.audio_playout_events_size() == 1) { >+ StoreAudioPlayoutEvent(stream.audio_playout_events(0)); >+ } else if (stream.begin_log_events_size() == 1) { >+ StoreStartEvent(stream.begin_log_events(0)); >+ } else if (stream.end_log_events_size() == 1) { >+ StoreStopEvent(stream.end_log_events(0)); >+ } else if (stream.loss_based_bwe_updates_size() == 1) { >+ StoreBweLossBasedUpdate(stream.loss_based_bwe_updates(0)); >+ } else if (stream.delay_based_bwe_updates_size() == 1) { >+ StoreBweDelayBasedUpdate(stream.delay_based_bwe_updates(0)); >+ } else if (stream.audio_network_adaptations_size() == 1) { >+ StoreAudioNetworkAdaptationEvent(stream.audio_network_adaptations(0)); >+ } else if (stream.probe_clusters_size() == 1) { >+ StoreBweProbeClusterCreated(stream.probe_clusters(0)); >+ } else if (stream.probe_success_size() == 1) { >+ StoreBweProbeSuccessEvent(stream.probe_success(0)); >+ } else if (stream.probe_failure_size() == 1) { >+ StoreBweProbeFailureEvent(stream.probe_failure(0)); >+ } else if (stream.alr_states_size() == 1) { >+ StoreAlrStateEvent(stream.alr_states(0)); >+ } else if (stream.ice_candidate_configs_size() == 1) { >+ StoreIceCandidatePairConfig(stream.ice_candidate_configs(0)); >+ } else if (stream.ice_candidate_events_size() == 1) { >+ StoreIceCandidateEvent(stream.ice_candidate_events(0)); >+ } else if (stream.audio_recv_stream_configs_size() == 1) { >+ 
StoreAudioRecvConfig(stream.audio_recv_stream_configs(0)); >+ } else if (stream.audio_send_stream_configs_size() == 1) { >+ StoreAudioSendConfig(stream.audio_send_stream_configs(0)); >+ } else if (stream.video_recv_stream_configs_size() == 1) { >+ StoreVideoRecvConfig(stream.video_recv_stream_configs(0)); >+ } else if (stream.video_send_stream_configs_size() == 1) { >+ StoreVideoSendConfig(stream.video_send_stream_configs(0)); >+ } else { >+ RTC_NOTREACHED(); >+ } >+} >+ >+void ParsedRtcEventLogNew::StoreAlrStateEvent(const rtclog2::AlrState& proto) { >+ RTC_CHECK(proto.has_timestamp_ms()); >+ RTC_CHECK(proto.has_in_alr()); >+ LoggedAlrStateEvent alr_event; >+ alr_event.timestamp_us = proto.timestamp_ms() * 1000; >+ alr_event.in_alr = proto.in_alr(); >+ >+ alr_state_events_.push_back(alr_event); >+ // TODO(terelius): Should we delta encode this event type? >+} >+ >+void ParsedRtcEventLogNew::StoreAudioPlayoutEvent( >+ const rtclog2::AudioPlayoutEvents& proto) { >+ RTC_CHECK(proto.has_timestamp_ms()); >+ RTC_CHECK(proto.has_local_ssrc()); >+ >+ // Base event >+ auto map_it = audio_playout_events_[proto.local_ssrc()]; >+ audio_playout_events_[proto.local_ssrc()].emplace_back( >+ 1000 * proto.timestamp_ms(), proto.local_ssrc()); >+ >+ const size_t number_of_deltas = >+ proto.has_number_of_deltas() ? 
proto.number_of_deltas() : 0u;
>+  if (number_of_deltas == 0) {
>+    return;
>+  }
>+
>+  // timestamp_ms
>+  std::vector<absl::optional<uint64_t>> timestamp_ms_values =
>+      DecodeDeltas(proto.timestamp_ms_deltas(),
>+                   ToUnsigned(proto.timestamp_ms()), number_of_deltas);
>+  RTC_CHECK_EQ(timestamp_ms_values.size(), number_of_deltas);
>+
>+  // local_ssrc
>+  std::vector<absl::optional<uint64_t>> local_ssrc_values = DecodeDeltas(
>+      proto.local_ssrc_deltas(), proto.local_ssrc(), number_of_deltas);
>+  RTC_CHECK_EQ(local_ssrc_values.size(), number_of_deltas);
>+
>+  // Delta decoding
>+  for (size_t i = 0; i < number_of_deltas; ++i) {
>+    RTC_CHECK(timestamp_ms_values[i].has_value());
>+    RTC_CHECK(local_ssrc_values[i].has_value());
>+    RTC_CHECK_LE(local_ssrc_values[i].value(),
>+                 std::numeric_limits<uint32_t>::max());
>+
>+    int64_t timestamp_ms;
>+    RTC_CHECK(ToSigned(timestamp_ms_values[i].value(), &timestamp_ms));
>+
>+    const uint32_t local_ssrc =
>+        static_cast<uint32_t>(local_ssrc_values[i].value());
>+    audio_playout_events_[local_ssrc].emplace_back(1000 * timestamp_ms,
>+                                                   local_ssrc);
>+  }
>+}
>+
>+void ParsedRtcEventLogNew::StoreIncomingRtpPackets(
>+    const rtclog2::IncomingRtpPackets& proto) {
>+  StoreRtpPackets(proto, &incoming_rtp_packets_map_);
>+}
>+
>+void ParsedRtcEventLogNew::StoreOutgoingRtpPackets(
>+    const rtclog2::OutgoingRtpPackets& proto) {
>+  StoreRtpPackets(proto, &outgoing_rtp_packets_map_);
>+}
>+
>+void ParsedRtcEventLogNew::StoreIncomingRtcpPackets(
>+    const rtclog2::IncomingRtcpPackets& proto) {
>+  StoreRtcpPackets(proto, &incoming_rtcp_packets_);
>+}
>+
>+void ParsedRtcEventLogNew::StoreOutgoingRtcpPackets(
>+    const rtclog2::OutgoingRtcpPackets& proto) {
>+  StoreRtcpPackets(proto, &outgoing_rtcp_packets_);
>+}
>+
>+void ParsedRtcEventLogNew::StoreStartEvent(
>+    const rtclog2::BeginLogEvent& proto) {
>+  RTC_CHECK(proto.has_timestamp_ms());
>+  RTC_CHECK(proto.has_version());
>+  RTC_CHECK(proto.has_utc_time_ms());
>+  RTC_CHECK_EQ(proto.version(), 2);
>+
LoggedStartEvent start_event(proto.timestamp_ms() * 1000, >+ proto.utc_time_ms()); >+ >+ start_log_events_.push_back(start_event); >+} >+ >+void ParsedRtcEventLogNew::StoreStopEvent(const rtclog2::EndLogEvent& proto) { >+ RTC_CHECK(proto.has_timestamp_ms()); >+ LoggedStopEvent stop_event(proto.timestamp_ms() * 1000); >+ >+ stop_log_events_.push_back(stop_event); >+} >+ >+void ParsedRtcEventLogNew::StoreBweLossBasedUpdate( >+ const rtclog2::LossBasedBweUpdates& proto) { >+ RTC_CHECK(proto.has_timestamp_ms()); >+ RTC_CHECK(proto.has_bitrate_bps()); >+ RTC_CHECK(proto.has_fraction_loss()); >+ RTC_CHECK(proto.has_total_packets()); >+ >+ // Base event >+ bwe_loss_updates_.emplace_back(1000 * proto.timestamp_ms(), >+ proto.bitrate_bps(), proto.fraction_loss(), >+ proto.total_packets()); >+ >+ const size_t number_of_deltas = >+ proto.has_number_of_deltas() ? proto.number_of_deltas() : 0u; >+ if (number_of_deltas == 0) { >+ return; >+ } >+ >+ // timestamp_ms >+ std::vector<absl::optional<uint64_t>> timestamp_ms_values = >+ DecodeDeltas(proto.timestamp_ms_deltas(), >+ ToUnsigned(proto.timestamp_ms()), number_of_deltas); >+ RTC_CHECK_EQ(timestamp_ms_values.size(), number_of_deltas); >+ >+ // bitrate_bps >+ std::vector<absl::optional<uint64_t>> bitrate_bps_values = DecodeDeltas( >+ proto.bitrate_bps_deltas(), proto.bitrate_bps(), number_of_deltas); >+ RTC_CHECK_EQ(bitrate_bps_values.size(), number_of_deltas); >+ >+ // fraction_loss >+ std::vector<absl::optional<uint64_t>> fraction_loss_values = DecodeDeltas( >+ proto.fraction_loss_deltas(), proto.fraction_loss(), number_of_deltas); >+ RTC_CHECK_EQ(fraction_loss_values.size(), number_of_deltas); >+ >+ // total_packets >+ std::vector<absl::optional<uint64_t>> total_packets_values = DecodeDeltas( >+ proto.total_packets_deltas(), proto.total_packets(), number_of_deltas); >+ RTC_CHECK_EQ(total_packets_values.size(), number_of_deltas); >+ >+ // Delta decoding >+ for (size_t i = 0; i < number_of_deltas; ++i) { >+ 
RTC_CHECK(timestamp_ms_values[i].has_value());
>+    int64_t timestamp_ms;
>+    RTC_CHECK(ToSigned(timestamp_ms_values[i].value(), &timestamp_ms));
>+
>+    RTC_CHECK(bitrate_bps_values[i].has_value());
>+    RTC_CHECK_LE(bitrate_bps_values[i].value(),
>+                 std::numeric_limits<uint32_t>::max());
>+    const uint32_t bitrate_bps =
>+        static_cast<uint32_t>(bitrate_bps_values[i].value());
>+
>+    RTC_CHECK(fraction_loss_values[i].has_value());
>+    RTC_CHECK_LE(fraction_loss_values[i].value(),
>+                 std::numeric_limits<uint32_t>::max());
>+    const uint32_t fraction_loss =
>+        static_cast<uint32_t>(fraction_loss_values[i].value());
>+
>+    RTC_CHECK(total_packets_values[i].has_value());
>+    RTC_CHECK_LE(total_packets_values[i].value(),
>+                 std::numeric_limits<uint32_t>::max());
>+    const uint32_t total_packets =
>+        static_cast<uint32_t>(total_packets_values[i].value());
>+
>+    bwe_loss_updates_.emplace_back(1000 * timestamp_ms, bitrate_bps,
>+                                   fraction_loss, total_packets);
>+  }
>+}
>+
>+void ParsedRtcEventLogNew::StoreBweDelayBasedUpdate(
>+    const rtclog2::DelayBasedBweUpdates& proto) {
>+  RTC_CHECK(proto.has_timestamp_ms());
>+  RTC_CHECK(proto.has_bitrate_bps());
>+  RTC_CHECK(proto.has_detector_state());
>+
>+  // Base event
>+  const BandwidthUsage base_detector_state =
>+      GetRuntimeDetectorState(proto.detector_state());
>+  bwe_delay_updates_.emplace_back(1000 * proto.timestamp_ms(),
>+                                  proto.bitrate_bps(), base_detector_state);
>+
>+  const size_t number_of_deltas =
>+      proto.has_number_of_deltas() ?
proto.number_of_deltas() : 0u;
>+  if (number_of_deltas == 0) {
>+    return;
>+  }
>+
>+  // timestamp_ms
>+  std::vector<absl::optional<uint64_t>> timestamp_ms_values =
>+      DecodeDeltas(proto.timestamp_ms_deltas(),
>+                   ToUnsigned(proto.timestamp_ms()), number_of_deltas);
>+  RTC_CHECK_EQ(timestamp_ms_values.size(), number_of_deltas);
>+
>+  // bitrate_bps
>+  std::vector<absl::optional<uint64_t>> bitrate_bps_values = DecodeDeltas(
>+      proto.bitrate_bps_deltas(), proto.bitrate_bps(), number_of_deltas);
>+  RTC_CHECK_EQ(bitrate_bps_values.size(), number_of_deltas);
>+
>+  // detector_state
>+  std::vector<absl::optional<uint64_t>> detector_state_values = DecodeDeltas(
>+      proto.detector_state_deltas(),
>+      static_cast<uint64_t>(proto.detector_state()), number_of_deltas);
>+  RTC_CHECK_EQ(detector_state_values.size(), number_of_deltas);
>+
>+  // Delta decoding
>+  for (size_t i = 0; i < number_of_deltas; ++i) {
>+    RTC_CHECK(timestamp_ms_values[i].has_value());
>+    int64_t timestamp_ms;
>+    RTC_CHECK(ToSigned(timestamp_ms_values[i].value(), &timestamp_ms));
>+
>+    RTC_CHECK(bitrate_bps_values[i].has_value());
>+    RTC_CHECK_LE(bitrate_bps_values[i].value(),
>+                 std::numeric_limits<uint32_t>::max());
>+    const uint32_t bitrate_bps =
>+        static_cast<uint32_t>(bitrate_bps_values[i].value());
>+
>+    RTC_CHECK(detector_state_values[i].has_value());
>+    const auto detector_state =
>+        static_cast<rtclog2::DelayBasedBweUpdates::DetectorState>(
>+            detector_state_values[i].value());
>+
>+    bwe_delay_updates_.emplace_back(1000 * timestamp_ms, bitrate_bps,
>+                                    GetRuntimeDetectorState(detector_state));
>+  }
>+}
>+
>+void ParsedRtcEventLogNew::StoreBweProbeClusterCreated(
>+    const rtclog2::BweProbeCluster& proto) {
>+  LoggedBweProbeClusterCreatedEvent probe_cluster;
>+  RTC_CHECK(proto.has_timestamp_ms());
>+  probe_cluster.timestamp_us = proto.timestamp_ms() * 1000;
>+  RTC_CHECK(proto.has_id());
>+  probe_cluster.id = proto.id();
>+  RTC_CHECK(proto.has_bitrate_bps());
>+  probe_cluster.bitrate_bps = proto.bitrate_bps();
>+ RTC_CHECK(proto.has_min_packets()); >+ probe_cluster.min_packets = proto.min_packets(); >+ RTC_CHECK(proto.has_min_bytes()); >+ probe_cluster.min_bytes = proto.min_bytes(); >+ >+ bwe_probe_cluster_created_events_.push_back(probe_cluster); >+ >+ // TODO(terelius): Should we delta encode this event type? >+} >+ >+void ParsedRtcEventLogNew::StoreBweProbeSuccessEvent( >+ const rtclog2::BweProbeResultSuccess& proto) { >+ LoggedBweProbeSuccessEvent probe_result; >+ RTC_CHECK(proto.has_timestamp_ms()); >+ probe_result.timestamp_us = proto.timestamp_ms() * 1000; >+ RTC_CHECK(proto.has_id()); >+ probe_result.id = proto.id(); >+ RTC_CHECK(proto.has_bitrate_bps()); >+ probe_result.bitrate_bps = proto.bitrate_bps(); >+ >+ bwe_probe_success_events_.push_back(probe_result); >+ >+ // TODO(terelius): Should we delta encode this event type? >+} >+ >+void ParsedRtcEventLogNew::StoreBweProbeFailureEvent( >+ const rtclog2::BweProbeResultFailure& proto) { >+ LoggedBweProbeFailureEvent probe_result; >+ RTC_CHECK(proto.has_timestamp_ms()); >+ probe_result.timestamp_us = proto.timestamp_ms() * 1000; >+ RTC_CHECK(proto.has_id()); >+ probe_result.id = proto.id(); >+ RTC_CHECK(proto.has_failure()); >+ probe_result.failure_reason = GetRuntimeProbeFailureReason(proto.failure()); >+ >+ bwe_probe_failure_events_.push_back(probe_result); >+ >+ // TODO(terelius): Should we delta encode this event type? 
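[Illustrative aside, not part of the patch: the base-plus-deltas layout consumed throughout these Store* functions — a base value in the proto, followed by per-event deltas that are cumulatively summed — can be sketched as below. DecodeDeltasSketch is a simplified stand-in for the parser's DecodeDeltas helper, which additionally handles bit-packed encodings and absent (optional) values.]

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Reconstruct a sequence of values from a base value and per-event deltas.
// Each decoded value is the previous value plus the next delta, so the
// sequence is {base + d0, base + d0 + d1, ...}; the base event itself is
// stored separately by the caller.
std::vector<uint64_t> DecodeDeltasSketch(uint64_t base,
                                         const std::vector<uint64_t>& deltas) {
  std::vector<uint64_t> values;
  values.reserve(deltas.size());
  uint64_t previous = base;
  for (uint64_t delta : deltas) {
    previous += delta;  // uint64_t wrap-around stands in for modular encoding.
    values.push_back(previous);
  }
  return values;
}
```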
>+} >+ >+void ParsedRtcEventLogNew::StoreAudioNetworkAdaptationEvent( >+ const rtclog2::AudioNetworkAdaptations& proto) { >+ RTC_CHECK(proto.has_timestamp_ms()); >+ >+ // Base event >+ { >+ AudioEncoderRuntimeConfig runtime_config; >+ if (proto.has_bitrate_bps()) { >+ runtime_config.bitrate_bps = proto.bitrate_bps(); >+ } >+ if (proto.has_frame_length_ms()) { >+ runtime_config.frame_length_ms = proto.frame_length_ms(); >+ } >+ if (proto.has_uplink_packet_loss_fraction()) { >+ float uplink_packet_loss_fraction; >+ RTC_CHECK(ParsePacketLossFractionFromProtoFormat( >+ proto.uplink_packet_loss_fraction(), &uplink_packet_loss_fraction)); >+ runtime_config.uplink_packet_loss_fraction = uplink_packet_loss_fraction; >+ } >+ if (proto.has_enable_fec()) { >+ runtime_config.enable_fec = proto.enable_fec(); >+ } >+ if (proto.has_enable_dtx()) { >+ runtime_config.enable_dtx = proto.enable_dtx(); >+ } >+ if (proto.has_num_channels()) { >+ // Note: Encoding N as N-1 only done for |num_channels_deltas|. >+ runtime_config.num_channels = proto.num_channels(); >+ } >+ audio_network_adaptation_events_.emplace_back(1000 * proto.timestamp_ms(), >+ runtime_config); >+ } >+ >+ const size_t number_of_deltas = >+ proto.has_number_of_deltas() ? proto.number_of_deltas() : 0u; >+ if (number_of_deltas == 0) { >+ return; >+ } >+ >+ // timestamp_ms >+ std::vector<absl::optional<uint64_t>> timestamp_ms_values = >+ DecodeDeltas(proto.timestamp_ms_deltas(), >+ ToUnsigned(proto.timestamp_ms()), number_of_deltas); >+ RTC_CHECK_EQ(timestamp_ms_values.size(), number_of_deltas); >+ >+ // bitrate_bps >+ const absl::optional<uint64_t> unsigned_base_bitrate_bps = >+ proto.has_bitrate_bps() >+ ? 
absl::optional<uint64_t>(ToUnsigned(proto.bitrate_bps())) >+ : absl::optional<uint64_t>(); >+ std::vector<absl::optional<uint64_t>> bitrate_bps_values = DecodeDeltas( >+ proto.bitrate_bps_deltas(), unsigned_base_bitrate_bps, number_of_deltas); >+ RTC_CHECK_EQ(bitrate_bps_values.size(), number_of_deltas); >+ >+ // frame_length_ms >+ const absl::optional<uint64_t> unsigned_base_frame_length_ms = >+ proto.has_frame_length_ms() >+ ? absl::optional<uint64_t>(ToUnsigned(proto.frame_length_ms())) >+ : absl::optional<uint64_t>(); >+ std::vector<absl::optional<uint64_t>> frame_length_ms_values = >+ DecodeDeltas(proto.frame_length_ms_deltas(), >+ unsigned_base_frame_length_ms, number_of_deltas); >+ RTC_CHECK_EQ(frame_length_ms_values.size(), number_of_deltas); >+ >+ // uplink_packet_loss_fraction >+ const absl::optional<uint64_t> uplink_packet_loss_fraction = >+ proto.has_uplink_packet_loss_fraction() >+ ? absl::optional<uint64_t>(proto.uplink_packet_loss_fraction()) >+ : absl::optional<uint64_t>(); >+ std::vector<absl::optional<uint64_t>> uplink_packet_loss_fraction_values = >+ DecodeDeltas(proto.uplink_packet_loss_fraction_deltas(), >+ uplink_packet_loss_fraction, number_of_deltas); >+ RTC_CHECK_EQ(uplink_packet_loss_fraction_values.size(), number_of_deltas); >+ >+ // enable_fec >+ const absl::optional<uint64_t> enable_fec = >+ proto.has_enable_fec() ? absl::optional<uint64_t>(proto.enable_fec()) >+ : absl::optional<uint64_t>(); >+ std::vector<absl::optional<uint64_t>> enable_fec_values = >+ DecodeDeltas(proto.enable_fec_deltas(), enable_fec, number_of_deltas); >+ RTC_CHECK_EQ(enable_fec_values.size(), number_of_deltas); >+ >+ // enable_dtx >+ const absl::optional<uint64_t> enable_dtx = >+ proto.has_enable_dtx() ? 
absl::optional<uint64_t>(proto.enable_dtx())
>+                             : absl::optional<uint64_t>();
>+  std::vector<absl::optional<uint64_t>> enable_dtx_values =
>+      DecodeDeltas(proto.enable_dtx_deltas(), enable_dtx, number_of_deltas);
>+  RTC_CHECK_EQ(enable_dtx_values.size(), number_of_deltas);
>+
>+  // num_channels
>+  // Note: For delta encoding, all num_channel values, including the base,
>+  // were shifted down by one, but in the base event, they were not.
>+  // We likewise shift the base event down by one, to get the same base as
>+  // encoding had, but then shift all of the values (except the base) back up
>+  // to their original value.
>+  absl::optional<uint64_t> shifted_base_num_channels;
>+  if (proto.has_num_channels()) {
>+    shifted_base_num_channels =
>+        absl::optional<uint64_t>(proto.num_channels() - 1);
>+  }
>+  std::vector<absl::optional<uint64_t>> num_channels_values = DecodeDeltas(
>+      proto.num_channels_deltas(), shifted_base_num_channels, number_of_deltas);
>+  for (size_t i = 0; i < num_channels_values.size(); ++i) {
>+    if (num_channels_values[i].has_value()) {
>+      num_channels_values[i] = num_channels_values[i].value() + 1;
>+    }
>+  }
>+  RTC_CHECK_EQ(num_channels_values.size(), number_of_deltas);
>+
>+  // Delta decoding
>+  for (size_t i = 0; i < number_of_deltas; ++i) {
>+    RTC_CHECK(timestamp_ms_values[i].has_value());
>+    int64_t timestamp_ms;
>+    RTC_CHECK(ToSigned(timestamp_ms_values[i].value(), &timestamp_ms));
>+
>+    AudioEncoderRuntimeConfig runtime_config;
>+    if (bitrate_bps_values[i].has_value()) {
>+      int signed_bitrate_bps;
>+      RTC_CHECK(ToSigned(bitrate_bps_values[i].value(), &signed_bitrate_bps));
>+      runtime_config.bitrate_bps = signed_bitrate_bps;
>+    }
>+    if (frame_length_ms_values[i].has_value()) {
>+      int signed_frame_length_ms;
>+      RTC_CHECK(
>+          ToSigned(frame_length_ms_values[i].value(), &signed_frame_length_ms));
>+      runtime_config.frame_length_ms = signed_frame_length_ms;
>+    }
>+    if (uplink_packet_loss_fraction_values[i].has_value()) {
>+      float
uplink_packet_loss_fraction; >+ RTC_CHECK(ParsePacketLossFractionFromProtoFormat( >+ rtc::checked_cast<uint32_t>( >+ uplink_packet_loss_fraction_values[i].value()), >+ &uplink_packet_loss_fraction)); >+ runtime_config.uplink_packet_loss_fraction = uplink_packet_loss_fraction; >+ } >+ if (enable_fec_values[i].has_value()) { >+ runtime_config.enable_fec = >+ rtc::checked_cast<bool>(enable_fec_values[i].value()); >+ } >+ if (enable_dtx_values[i].has_value()) { >+ runtime_config.enable_dtx = >+ rtc::checked_cast<bool>(enable_dtx_values[i].value()); >+ } >+ if (num_channels_values[i].has_value()) { >+ runtime_config.num_channels = >+ rtc::checked_cast<size_t>(num_channels_values[i].value()); >+ } >+ audio_network_adaptation_events_.emplace_back(1000 * timestamp_ms, >+ runtime_config); >+ } >+} >+ >+void ParsedRtcEventLogNew::StoreIceCandidatePairConfig( >+ const rtclog2::IceCandidatePairConfig& proto) { >+ LoggedIceCandidatePairConfig ice_config; >+ RTC_CHECK(proto.has_timestamp_ms()); >+ ice_config.timestamp_us = proto.timestamp_ms() * 1000; >+ >+ RTC_CHECK(proto.has_config_type()); >+ ice_config.type = GetRuntimeIceCandidatePairConfigType(proto.config_type()); >+ RTC_CHECK(proto.has_candidate_pair_id()); >+ ice_config.candidate_pair_id = proto.candidate_pair_id(); >+ RTC_CHECK(proto.has_local_candidate_type()); >+ ice_config.local_candidate_type = >+ GetRuntimeIceCandidateType(proto.local_candidate_type()); >+ RTC_CHECK(proto.has_local_relay_protocol()); >+ ice_config.local_relay_protocol = >+ GetRuntimeIceCandidatePairProtocol(proto.local_relay_protocol()); >+ RTC_CHECK(proto.has_local_network_type()); >+ ice_config.local_network_type = >+ GetRuntimeIceCandidateNetworkType(proto.local_network_type()); >+ RTC_CHECK(proto.has_local_address_family()); >+ ice_config.local_address_family = >+ GetRuntimeIceCandidatePairAddressFamily(proto.local_address_family()); >+ RTC_CHECK(proto.has_remote_candidate_type()); >+ ice_config.remote_candidate_type = >+ 
GetRuntimeIceCandidateType(proto.remote_candidate_type()); >+ RTC_CHECK(proto.has_remote_address_family()); >+ ice_config.remote_address_family = >+ GetRuntimeIceCandidatePairAddressFamily(proto.remote_address_family()); >+ RTC_CHECK(proto.has_candidate_pair_protocol()); >+ ice_config.candidate_pair_protocol = >+ GetRuntimeIceCandidatePairProtocol(proto.candidate_pair_protocol()); >+ >+ ice_candidate_pair_configs_.push_back(ice_config); >+ >+ // TODO(terelius): Should we delta encode this event type? >+} >+ >+void ParsedRtcEventLogNew::StoreIceCandidateEvent( >+ const rtclog2::IceCandidatePairEvent& proto) { >+ LoggedIceCandidatePairEvent ice_event; >+ RTC_CHECK(proto.has_timestamp_ms()); >+ ice_event.timestamp_us = proto.timestamp_ms() * 1000; >+ RTC_CHECK(proto.has_event_type()); >+ ice_event.type = GetRuntimeIceCandidatePairEventType(proto.event_type()); >+ RTC_CHECK(proto.has_candidate_pair_id()); >+ ice_event.candidate_pair_id = proto.candidate_pair_id(); >+ >+ ice_candidate_pair_events_.push_back(ice_event); >+ >+ // TODO(terelius): Should we delta encode this event type? 
>+} >+ >+void ParsedRtcEventLogNew::StoreVideoRecvConfig( >+ const rtclog2::VideoRecvStreamConfig& proto) { >+ LoggedVideoRecvConfig stream; >+ RTC_CHECK(proto.has_timestamp_ms()); >+ stream.timestamp_us = proto.timestamp_ms() * 1000; >+ RTC_CHECK(proto.has_remote_ssrc()); >+ stream.config.remote_ssrc = proto.remote_ssrc(); >+ RTC_CHECK(proto.has_local_ssrc()); >+ stream.config.local_ssrc = proto.local_ssrc(); >+ if (proto.has_rtx_ssrc()) { >+ stream.config.rtx_ssrc = proto.rtx_ssrc(); >+ } >+ if (proto.has_header_extensions()) { >+ stream.config.rtp_extensions = >+ GetRuntimeRtpHeaderExtensionConfig(proto.header_extensions()); >+ } >+ video_recv_configs_.push_back(stream); >+} >+ >+void ParsedRtcEventLogNew::StoreVideoSendConfig( >+ const rtclog2::VideoSendStreamConfig& proto) { >+ LoggedVideoSendConfig stream; >+ RTC_CHECK(proto.has_timestamp_ms()); >+ stream.timestamp_us = proto.timestamp_ms() * 1000; >+ rtclog::StreamConfig config; >+ RTC_CHECK(proto.has_ssrc()); >+ config.local_ssrc = proto.ssrc(); >+ if (proto.has_rtx_ssrc()) { >+ config.rtx_ssrc = proto.rtx_ssrc(); >+ } >+ if (proto.has_header_extensions()) { >+ config.rtp_extensions = >+ GetRuntimeRtpHeaderExtensionConfig(proto.header_extensions()); >+ } >+ stream.configs.push_back(config); >+ video_send_configs_.push_back(stream); >+} >+ >+void ParsedRtcEventLogNew::StoreAudioRecvConfig( >+ const rtclog2::AudioRecvStreamConfig& proto) { >+ LoggedAudioRecvConfig stream; >+ RTC_CHECK(proto.has_timestamp_ms()); >+ stream.timestamp_us = proto.timestamp_ms() * 1000; >+ RTC_CHECK(proto.has_remote_ssrc()); >+ stream.config.remote_ssrc = proto.remote_ssrc(); >+ RTC_CHECK(proto.has_local_ssrc()); >+ stream.config.local_ssrc = proto.local_ssrc(); >+ if (proto.has_header_extensions()) { >+ stream.config.rtp_extensions = >+ GetRuntimeRtpHeaderExtensionConfig(proto.header_extensions()); >+ } >+ audio_recv_configs_.push_back(stream); >+} >+ >+void ParsedRtcEventLogNew::StoreAudioSendConfig( >+ const 
rtclog2::AudioSendStreamConfig& proto) { >+ LoggedAudioSendConfig stream; >+ RTC_CHECK(proto.has_timestamp_ms()); >+ stream.timestamp_us = proto.timestamp_ms() * 1000; >+ RTC_CHECK(proto.has_ssrc()); >+ stream.config.local_ssrc = proto.ssrc(); >+ if (proto.has_header_extensions()) { >+ stream.config.rtp_extensions = >+ GetRuntimeRtpHeaderExtensionConfig(proto.header_extensions()); >+ } >+ audio_send_configs_.push_back(stream); >+} >+ > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_parser_new.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_parser_new.h >index 5ab0c15b92a79e56c4c09cdf0d4b04eef12f0676..50e3b9922de26adc8de8000d0ea6f4ec4cbb5b31 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_parser_new.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_parser_new.h >@@ -25,7 +25,7 @@ > #include "logging/rtc_event_log/events/rtc_event_probe_result_failure.h" > #include "logging/rtc_event_log/rtc_event_log.h" > #include "logging/rtc_event_log/rtc_stream_config.h" >-#include "modules/audio_coding/audio_network_adaptor/include/audio_network_adaptor.h" >+#include "modules/audio_coding/audio_network_adaptor/include/audio_network_adaptor_config.h" > #include "modules/rtp_rtcp/include/rtp_header_extension_map.h" > #include "modules/rtp_rtcp/source/rtcp_packet/common_header.h" > #include "modules/rtp_rtcp/source/rtcp_packet/nack.h" >@@ -39,8 +39,10 @@ > RTC_PUSH_IGNORING_WUNDEF() > #ifdef WEBRTC_ANDROID_PLATFORM_BUILD > #include "external/webrtc/webrtc/logging/rtc_event_log/rtc_event_log.pb.h" >+#include "external/webrtc/webrtc/logging/rtc_event_log/rtc_event_log2.pb.h" > #else > #include "logging/rtc_event_log/rtc_event_log.pb.h" >+#include "logging/rtc_event_log/rtc_event_log2.pb.h" > #endif > RTC_POP_IGNORING_WUNDEF() > >@@ -55,70 +57,136 @@ struct AudioEncoderRuntimeConfig; > // considered to 
outweigh the added memory and runtime overhead incurred by > // adding a vptr. > struct LoggedAlrStateEvent { >- int64_t timestamp_us; >- bool in_alr; >+ LoggedAlrStateEvent() = default; >+ LoggedAlrStateEvent(int64_t timestamp_us, bool in_alr) >+ : timestamp_us(timestamp_us), in_alr(in_alr) {} >+ > int64_t log_time_us() const { return timestamp_us; } > int64_t log_time_ms() const { return timestamp_us / 1000; } >+ >+ int64_t timestamp_us; >+ bool in_alr; > }; > > struct LoggedAudioPlayoutEvent { >- int64_t timestamp_us; >- uint32_t ssrc; >+ LoggedAudioPlayoutEvent() = default; >+ LoggedAudioPlayoutEvent(int64_t timestamp_us, uint32_t ssrc) >+ : timestamp_us(timestamp_us), ssrc(ssrc) {} >+ > int64_t log_time_us() const { return timestamp_us; } > int64_t log_time_ms() const { return timestamp_us / 1000; } >+ >+ int64_t timestamp_us; >+ uint32_t ssrc; > }; > > struct LoggedAudioNetworkAdaptationEvent { >- int64_t timestamp_us; >- AudioEncoderRuntimeConfig config; >+ LoggedAudioNetworkAdaptationEvent() = default; >+ LoggedAudioNetworkAdaptationEvent(int64_t timestamp_us, >+ const AudioEncoderRuntimeConfig& config) >+ : timestamp_us(timestamp_us), config(config) {} >+ > int64_t log_time_us() const { return timestamp_us; } > int64_t log_time_ms() const { return timestamp_us / 1000; } >+ >+ int64_t timestamp_us; >+ AudioEncoderRuntimeConfig config; > }; > > struct LoggedBweDelayBasedUpdate { >+ LoggedBweDelayBasedUpdate() = default; >+ LoggedBweDelayBasedUpdate(int64_t timestamp_us, >+ int32_t bitrate_bps, >+ BandwidthUsage detector_state) >+ : timestamp_us(timestamp_us), >+ bitrate_bps(bitrate_bps), >+ detector_state(detector_state) {} >+ >+ int64_t log_time_us() const { return timestamp_us; } >+ int64_t log_time_ms() const { return timestamp_us / 1000; } >+ > int64_t timestamp_us; > int32_t bitrate_bps; > BandwidthUsage detector_state; >- int64_t log_time_us() const { return timestamp_us; } >- int64_t log_time_ms() const { return timestamp_us / 1000; } > }; > > struct 
LoggedBweLossBasedUpdate { >+ LoggedBweLossBasedUpdate() = default; >+ LoggedBweLossBasedUpdate(int64_t timestamp_us, >+ int32_t bitrate_bps, >+ uint8_t fraction_lost, >+ int32_t expected_packets) >+ : timestamp_us(timestamp_us), >+ bitrate_bps(bitrate_bps), >+ fraction_lost(fraction_lost), >+ expected_packets(expected_packets) {} >+ >+ int64_t log_time_us() const { return timestamp_us; } >+ int64_t log_time_ms() const { return timestamp_us / 1000; } >+ > int64_t timestamp_us; > int32_t bitrate_bps; > uint8_t fraction_lost; > int32_t expected_packets; >- int64_t log_time_us() const { return timestamp_us; } >- int64_t log_time_ms() const { return timestamp_us / 1000; } > }; > > struct LoggedBweProbeClusterCreatedEvent { >+ LoggedBweProbeClusterCreatedEvent() = default; >+ LoggedBweProbeClusterCreatedEvent(int64_t timestamp_us, >+ int32_t id, >+ int32_t bitrate_bps, >+ uint32_t min_packets, >+ uint32_t min_bytes) >+ : timestamp_us(timestamp_us), >+ id(id), >+ bitrate_bps(bitrate_bps), >+ min_packets(min_packets), >+ min_bytes(min_bytes) {} >+ >+ int64_t log_time_us() const { return timestamp_us; } >+ int64_t log_time_ms() const { return timestamp_us / 1000; } >+ > int64_t timestamp_us; > int32_t id; > int32_t bitrate_bps; > uint32_t min_packets; > uint32_t min_bytes; >- int64_t log_time_us() const { return timestamp_us; } >- int64_t log_time_ms() const { return timestamp_us / 1000; } > }; > > struct LoggedBweProbeSuccessEvent { >+ LoggedBweProbeSuccessEvent() = default; >+ LoggedBweProbeSuccessEvent(int64_t timestamp_us, >+ int32_t id, >+ int32_t bitrate_bps) >+ : timestamp_us(timestamp_us), id(id), bitrate_bps(bitrate_bps) {} >+ >+ int64_t log_time_us() const { return timestamp_us; } >+ int64_t log_time_ms() const { return timestamp_us / 1000; } >+ > int64_t timestamp_us; > int32_t id; > int32_t bitrate_bps; >- int64_t log_time_us() const { return timestamp_us; } >- int64_t log_time_ms() const { return timestamp_us / 1000; } > }; > > struct 
LoggedBweProbeFailureEvent { >+ LoggedBweProbeFailureEvent() = default; >+ LoggedBweProbeFailureEvent(int64_t timestamp_us, >+ int32_t id, >+ ProbeFailureReason failure_reason) >+ : timestamp_us(timestamp_us), id(id), failure_reason(failure_reason) {} >+ >+ int64_t log_time_us() const { return timestamp_us; } >+ int64_t log_time_ms() const { return timestamp_us / 1000; } >+ > int64_t timestamp_us; > int32_t id; > ProbeFailureReason failure_reason; >- int64_t log_time_us() const { return timestamp_us; } >- int64_t log_time_ms() const { return timestamp_us / 1000; } > }; > > struct LoggedIceCandidatePairConfig { >+ int64_t log_time_us() const { return timestamp_us; } >+ int64_t log_time_ms() const { return timestamp_us / 1000; } >+ > int64_t timestamp_us; > IceCandidatePairConfigType type; > uint32_t candidate_pair_id; >@@ -129,16 +197,23 @@ struct LoggedIceCandidatePairConfig { > IceCandidateType remote_candidate_type; > IceCandidatePairAddressFamily remote_address_family; > IceCandidatePairProtocol candidate_pair_protocol; >- int64_t log_time_us() const { return timestamp_us; } >- int64_t log_time_ms() const { return timestamp_us / 1000; } > }; > > struct LoggedIceCandidatePairEvent { >+ LoggedIceCandidatePairEvent() = default; >+ LoggedIceCandidatePairEvent(int64_t timestamp_us, >+ IceCandidatePairEventType type, >+ uint32_t candidate_pair_id) >+ : timestamp_us(timestamp_us), >+ type(type), >+ candidate_pair_id(candidate_pair_id) {} >+ >+ int64_t log_time_us() const { return timestamp_us; } >+ int64_t log_time_ms() const { return timestamp_us / 1000; } >+ > int64_t timestamp_us; > IceCandidatePairEventType type; > uint32_t candidate_pair_id; >- int64_t log_time_us() const { return timestamp_us; } >- int64_t log_time_ms() const { return timestamp_us / 1000; } > }; > > struct LoggedRtpPacket { >@@ -150,13 +225,15 @@ struct LoggedRtpPacket { > header(header), > header_length(header_length), > total_length(total_length) {} >+ >+ int64_t log_time_us() const { return 
timestamp_us; } >+ int64_t log_time_ms() const { return timestamp_us / 1000; } >+ > int64_t timestamp_us; > // TODO(terelius): This allocates space for 15 CSRCs even if none are used. > RTPHeader header; > size_t header_length; > size_t total_length; >- int64_t log_time_us() const { return timestamp_us; } >- int64_t log_time_ms() const { return timestamp_us / 1000; } > }; > > struct LoggedRtpPacketIncoming { >@@ -165,9 +242,10 @@ struct LoggedRtpPacketIncoming { > size_t header_length, > size_t total_length) > : rtp(timestamp_us, header, header_length, total_length) {} >- LoggedRtpPacket rtp; > int64_t log_time_us() const { return rtp.timestamp_us; } > int64_t log_time_ms() const { return rtp.timestamp_us / 1000; } >+ >+ LoggedRtpPacket rtp; > }; > > struct LoggedRtpPacketOutgoing { >@@ -176,21 +254,25 @@ struct LoggedRtpPacketOutgoing { > size_t header_length, > size_t total_length) > : rtp(timestamp_us, header, header_length, total_length) {} >- LoggedRtpPacket rtp; > int64_t log_time_us() const { return rtp.timestamp_us; } > int64_t log_time_ms() const { return rtp.timestamp_us / 1000; } >+ >+ LoggedRtpPacket rtp; > }; > > struct LoggedRtcpPacket { > LoggedRtcpPacket(uint64_t timestamp_us, > const uint8_t* packet, > size_t total_length); >+ LoggedRtcpPacket(uint64_t timestamp_us, const std::string& packet); > LoggedRtcpPacket(const LoggedRtcpPacket&); > ~LoggedRtcpPacket(); >- int64_t timestamp_us; >- std::vector<uint8_t> raw_data; >+ > int64_t log_time_us() const { return timestamp_us; } > int64_t log_time_ms() const { return timestamp_us / 1000; } >+ >+ int64_t timestamp_us; >+ std::vector<uint8_t> raw_data; > }; > > struct LoggedRtcpPacketIncoming { >@@ -198,9 +280,13 @@ struct LoggedRtcpPacketIncoming { > const uint8_t* packet, > size_t total_length) > : rtcp(timestamp_us, packet, total_length) {} >- LoggedRtcpPacket rtcp; >+ LoggedRtcpPacketIncoming(uint64_t timestamp_us, const std::string& packet) >+ : rtcp(timestamp_us, packet) {} >+ > int64_t 
log_time_us() const { return rtcp.timestamp_us; } > int64_t log_time_ms() const { return rtcp.timestamp_us / 1000; } >+ >+ LoggedRtcpPacket rtcp; > }; > > struct LoggedRtcpPacketOutgoing { >@@ -208,97 +294,150 @@ struct LoggedRtcpPacketOutgoing { > const uint8_t* packet, > size_t total_length) > : rtcp(timestamp_us, packet, total_length) {} >- LoggedRtcpPacket rtcp; >+ LoggedRtcpPacketOutgoing(uint64_t timestamp_us, const std::string& packet) >+ : rtcp(timestamp_us, packet) {} >+ > int64_t log_time_us() const { return rtcp.timestamp_us; } > int64_t log_time_ms() const { return rtcp.timestamp_us / 1000; } >+ >+ LoggedRtcpPacket rtcp; > }; > > struct LoggedRtcpPacketReceiverReport { >- int64_t timestamp_us; >- rtcp::ReceiverReport rr; >+ LoggedRtcpPacketReceiverReport() = default; >+ LoggedRtcpPacketReceiverReport(int64_t timestamp_us, >+ const rtcp::ReceiverReport& rr) >+ : timestamp_us(timestamp_us), rr(rr) {} >+ > int64_t log_time_us() const { return timestamp_us; } > int64_t log_time_ms() const { return timestamp_us / 1000; } >+ >+ int64_t timestamp_us; >+ rtcp::ReceiverReport rr; > }; > > struct LoggedRtcpPacketSenderReport { >- int64_t timestamp_us; >- rtcp::SenderReport sr; >+ LoggedRtcpPacketSenderReport() = default; >+ LoggedRtcpPacketSenderReport(int64_t timestamp_us, >+ const rtcp::SenderReport& sr) >+ : timestamp_us(timestamp_us), sr(sr) {} >+ > int64_t log_time_us() const { return timestamp_us; } > int64_t log_time_ms() const { return timestamp_us / 1000; } >+ >+ int64_t timestamp_us; >+ rtcp::SenderReport sr; > }; > > struct LoggedRtcpPacketRemb { >- int64_t timestamp_us; >- rtcp::Remb remb; >+ LoggedRtcpPacketRemb() = default; >+ LoggedRtcpPacketRemb(int64_t timestamp_us, const rtcp::Remb& remb) >+ : timestamp_us(timestamp_us), remb(remb) {} >+ > int64_t log_time_us() const { return timestamp_us; } > int64_t log_time_ms() const { return timestamp_us / 1000; } >+ >+ int64_t timestamp_us; >+ rtcp::Remb remb; > }; > > struct LoggedRtcpPacketNack { >- 
int64_t timestamp_us; >- rtcp::Nack nack; >+ LoggedRtcpPacketNack() = default; >+ LoggedRtcpPacketNack(int64_t timestamp_us, const rtcp::Nack& nack) >+ : timestamp_us(timestamp_us), nack(nack) {} >+ > int64_t log_time_us() const { return timestamp_us; } > int64_t log_time_ms() const { return timestamp_us / 1000; } >+ >+ int64_t timestamp_us; >+ rtcp::Nack nack; > }; > > struct LoggedRtcpPacketTransportFeedback { >- int64_t timestamp_us; >- rtcp::TransportFeedback transport_feedback; >+ LoggedRtcpPacketTransportFeedback() = default; >+ LoggedRtcpPacketTransportFeedback( >+ int64_t timestamp_us, >+ const rtcp::TransportFeedback& transport_feedback) >+ : timestamp_us(timestamp_us), transport_feedback(transport_feedback) {} >+ > int64_t log_time_us() const { return timestamp_us; } > int64_t log_time_ms() const { return timestamp_us / 1000; } >+ >+ int64_t timestamp_us; >+ rtcp::TransportFeedback transport_feedback; > }; > > struct LoggedStartEvent { > explicit LoggedStartEvent(int64_t timestamp_us) >- : timestamp_us(timestamp_us) {} >- int64_t timestamp_us; >+ : LoggedStartEvent(timestamp_us, timestamp_us / 1000) {} >+ >+ LoggedStartEvent(int64_t timestamp_us, int64_t utc_start_time_ms) >+ : timestamp_us(timestamp_us), utc_start_time_ms(utc_start_time_ms) {} >+ > int64_t log_time_us() const { return timestamp_us; } > int64_t log_time_ms() const { return timestamp_us / 1000; } >+ >+ int64_t timestamp_us; >+ int64_t utc_start_time_ms; > }; > > struct LoggedStopEvent { > explicit LoggedStopEvent(int64_t timestamp_us) : timestamp_us(timestamp_us) {} >- int64_t timestamp_us; >+ > int64_t log_time_us() const { return timestamp_us; } > int64_t log_time_ms() const { return timestamp_us / 1000; } >+ >+ int64_t timestamp_us; > }; > > struct LoggedAudioRecvConfig { >+ LoggedAudioRecvConfig() = default; > LoggedAudioRecvConfig(int64_t timestamp_us, const rtclog::StreamConfig config) > : timestamp_us(timestamp_us), config(config) {} >- int64_t timestamp_us; >- rtclog::StreamConfig 
config; >+ > int64_t log_time_us() const { return timestamp_us; } > int64_t log_time_ms() const { return timestamp_us / 1000; } >+ >+ int64_t timestamp_us; >+ rtclog::StreamConfig config; > }; > > struct LoggedAudioSendConfig { >+ LoggedAudioSendConfig() = default; > LoggedAudioSendConfig(int64_t timestamp_us, const rtclog::StreamConfig config) > : timestamp_us(timestamp_us), config(config) {} >- int64_t timestamp_us; >- rtclog::StreamConfig config; >+ > int64_t log_time_us() const { return timestamp_us; } > int64_t log_time_ms() const { return timestamp_us / 1000; } >+ >+ int64_t timestamp_us; >+ rtclog::StreamConfig config; > }; > > struct LoggedVideoRecvConfig { >+ LoggedVideoRecvConfig() = default; > LoggedVideoRecvConfig(int64_t timestamp_us, const rtclog::StreamConfig config) > : timestamp_us(timestamp_us), config(config) {} >- int64_t timestamp_us; >- rtclog::StreamConfig config; >+ > int64_t log_time_us() const { return timestamp_us; } > int64_t log_time_ms() const { return timestamp_us / 1000; } >+ >+ int64_t timestamp_us; >+ rtclog::StreamConfig config; > }; > > struct LoggedVideoSendConfig { >+ LoggedVideoSendConfig(); > LoggedVideoSendConfig(int64_t timestamp_us, > const std::vector<rtclog::StreamConfig>& configs); > LoggedVideoSendConfig(const LoggedVideoSendConfig&); > ~LoggedVideoSendConfig(); >- int64_t timestamp_us; >- std::vector<rtclog::StreamConfig> configs; >+ > int64_t log_time_us() const { return timestamp_us; } > int64_t log_time_ms() const { return timestamp_us / 1000; } >+ >+ int64_t timestamp_us; >+ std::vector<rtclog::StreamConfig> configs; > }; > > template <typename T> >@@ -477,30 +616,6 @@ class ParsedRtcEventLogNew { > friend class RtcEventLogTestHelper; > > public: >- ~ParsedRtcEventLogNew(); >- >- enum class EventType { >- UNKNOWN_EVENT = 0, >- LOG_START = 1, >- LOG_END = 2, >- RTP_EVENT = 3, >- RTCP_EVENT = 4, >- AUDIO_PLAYOUT_EVENT = 5, >- LOSS_BASED_BWE_UPDATE = 6, >- DELAY_BASED_BWE_UPDATE = 7, >- VIDEO_RECEIVER_CONFIG_EVENT = 
8, >- VIDEO_SENDER_CONFIG_EVENT = 9, >- AUDIO_RECEIVER_CONFIG_EVENT = 10, >- AUDIO_SENDER_CONFIG_EVENT = 11, >- AUDIO_NETWORK_ADAPTATION_EVENT = 16, >- BWE_PROBE_CLUSTER_CREATED_EVENT = 17, >- BWE_PROBE_FAILURE_EVENT = 18, >- BWE_PROBE_SUCCESS_EVENT = 19, >- ALR_STATE_EVENT = 20, >- ICE_CANDIDATE_PAIR_CONFIG = 21, >- ICE_CANDIDATE_PAIR_EVENT = 22, >- }; >- > enum class MediaType { ANY, AUDIO, VIDEO, DATA }; > enum class UnconfiguredHeaderExtensions { > kDontParse, >@@ -541,6 +656,8 @@ class ParsedRtcEventLogNew { > UnconfiguredHeaderExtensions parse_unconfigured_header_extensions = > UnconfiguredHeaderExtensions::kDontParse); > >+ ~ParsedRtcEventLogNew(); >+ > // Clears previously parsed events and resets the ParsedRtcEventLogNew to an > // empty state. > void Clear(); >@@ -555,111 +672,8 @@ class ParsedRtcEventLogNew { > bool ParseStream( > std::istream& stream); // no-presubmit-check TODO(webrtc:8982) > >- // Returns the number of events in an EventStream. >- size_t GetNumberOfEvents() const; >- >- // Reads the arrival timestamp (in microseconds) from a rtclog::Event. >- int64_t GetTimestamp(size_t index) const; >- int64_t GetTimestamp(const rtclog::Event& event) const; >- >- // Reads the event type of the rtclog::Event at |index|. >- EventType GetEventType(size_t index) const; >- EventType GetEventType(const rtclog::Event& event) const; >- >- // Reads the header, direction, header length and packet length from the RTP >- // event at |index|, and stores the values in the corresponding output >- // parameters. Each output parameter can be set to nullptr if that value >- // isn't needed. >- // NB: The header must have space for at least IP_PACKET_SIZE bytes. >- // Returns: a pointer to a header extensions map acquired from parsing >- // corresponding Audio/Video Sender/Receiver config events. >- // Warning: if the same SSRC is reused by both video and audio streams during >- // call, extensions maps may be incorrect (the last one would be returned). 
>- const webrtc::RtpHeaderExtensionMap* GetRtpHeader( >- size_t index, >- PacketDirection* incoming, >- uint8_t* header, >- size_t* header_length, >- size_t* total_length, >- int* probe_cluster_id) const; >- const webrtc::RtpHeaderExtensionMap* GetRtpHeader( >- const rtclog::Event& event, >- PacketDirection* incoming, >- uint8_t* header, >- size_t* header_length, >- size_t* total_length, >- int* probe_cluster_id) const; >- >- // Reads packet, direction and packet length from the RTCP event at |index|, >- // and stores the values in the corresponding output parameters. >- // Each output parameter can be set to nullptr if that value isn't needed. >- // NB: The packet must have space for at least IP_PACKET_SIZE bytes. >- void GetRtcpPacket(size_t index, >- PacketDirection* incoming, >- uint8_t* packet, >- size_t* length) const; >- void GetRtcpPacket(const rtclog::Event& event, >- PacketDirection* incoming, >- uint8_t* packet, >- size_t* length) const; >- >- // Reads a video receive config event to a StreamConfig struct. >- // Only the fields that are stored in the protobuf will be written. >- rtclog::StreamConfig GetVideoReceiveConfig(size_t index) const; >- >- // Reads a video send config event to a StreamConfig struct. If the proto >- // contains multiple SSRCs and RTX SSRCs (this used to be the case for >- // simulcast streams) then we return one StreamConfig per SSRC,RTX_SSRC pair. >- // Only the fields that are stored in the protobuf will be written. >- std::vector<rtclog::StreamConfig> GetVideoSendConfig(size_t index) const; >- >- // Reads a audio receive config event to a StreamConfig struct. >- // Only the fields that are stored in the protobuf will be written. >- rtclog::StreamConfig GetAudioReceiveConfig(size_t index) const; >- >- // Reads a config event to a StreamConfig struct. >- // Only the fields that are stored in the protobuf will be written. 
>- rtclog::StreamConfig GetAudioSendConfig(size_t index) const; >- >- // Reads the SSRC from the audio playout event at |index|. The SSRC is stored >- // in the output parameter ssrc. The output parameter can be set to nullptr >- // and in that case the function only asserts that the event is well formed. >- LoggedAudioPlayoutEvent GetAudioPlayout(size_t index) const; >- >- // Reads bitrate, fraction loss (as defined in RFC 1889) and total number of >- // expected packets from the loss based BWE event at |index| and stores the >- // values in >- // the corresponding output parameters. Each output parameter can be set to >- // nullptr if that >- // value isn't needed. >- LoggedBweLossBasedUpdate GetLossBasedBweUpdate(size_t index) const; >- >- // Reads bitrate and detector_state from the delay based BWE event at |index| >- // and stores the values in the corresponding output parameters. Each output >- // parameter can be set to nullptr if that >- // value isn't needed. >- LoggedBweDelayBasedUpdate GetDelayBasedBweUpdate(size_t index) const; >- >- // Reads a audio network adaptation event to a (non-NULL) >- // AudioEncoderRuntimeConfig struct. Only the fields that are >- // stored in the protobuf will be written. >- LoggedAudioNetworkAdaptationEvent GetAudioNetworkAdaptation( >- size_t index) const; >- >- LoggedBweProbeClusterCreatedEvent GetBweProbeClusterCreated( >- size_t index) const; >- >- LoggedBweProbeFailureEvent GetBweProbeFailure(size_t index) const; >- LoggedBweProbeSuccessEvent GetBweProbeSuccess(size_t index) const; >- > MediaType GetMediaType(uint32_t ssrc, PacketDirection direction) const; > >- LoggedAlrStateEvent GetAlrState(size_t index) const; >- >- LoggedIceCandidatePairConfig GetIceCandidatePairConfig(size_t index) const; >- >- LoggedIceCandidatePairEvent GetIceCandidatePairEvent(size_t index) const; >- > // Configured SSRCs. 
> const std::set<uint32_t>& incoming_rtx_ssrcs() const { > return incoming_rtx_ssrcs_; >@@ -841,7 +855,36 @@ class ParsedRtcEventLogNew { > bool ParseStreamInternal( > std::istream& stream); // no-presubmit-check TODO(webrtc:8982) > >- void StoreParsedEvent(const rtclog::Event& event); >+ void StoreParsedLegacyEvent(const rtclog::Event& event); >+ >+ // Reads the arrival timestamp (in microseconds) from a rtclog::Event. >+ int64_t GetTimestamp(const rtclog::Event& event) const; >+ >+ // Reads the header, direction, header length and packet length from the RTP >+ // event at |index|, and stores the values in the corresponding output >+ // parameters. Each output parameter can be set to nullptr if that value >+ // isn't needed. >+ // NB: The header must have space for at least IP_PACKET_SIZE bytes. >+ // Returns: a pointer to a header extensions map acquired from parsing >+ // corresponding Audio/Video Sender/Receiver config events. >+ // Warning: if the same SSRC is reused by both video and audio streams during >+ // call, extensions maps may be incorrect (the last one would be returned). >+ const webrtc::RtpHeaderExtensionMap* GetRtpHeader( >+ const rtclog::Event& event, >+ PacketDirection* incoming, >+ uint8_t* header, >+ size_t* header_length, >+ size_t* total_length, >+ int* probe_cluster_id) const; >+ >+ // Reads packet, direction and packet length from the RTCP event at |index|, >+ // and stores the values in the corresponding output parameters. >+ // Each output parameter can be set to nullptr if that value isn't needed. >+ // NB: The packet must have space for at least IP_PACKET_SIZE bytes. 
>+ void GetRtcpPacket(const rtclog::Event& event, >+ PacketDirection* incoming, >+ uint8_t* packet, >+ size_t* length) const; > > rtclog::StreamConfig GetVideoReceiveConfig(const rtclog::Event& event) const; > std::vector<rtclog::StreamConfig> GetVideoSendConfig( >@@ -873,7 +916,31 @@ class ParsedRtcEventLogNew { > LoggedIceCandidatePairEvent GetIceCandidatePairEvent( > const rtclog::Event& event) const; > >- std::vector<rtclog::Event> events_; >+ // Parsing functions for new format. >+ void StoreParsedNewFormatEvent(const rtclog2::EventStream& event); >+ void StoreIncomingRtpPackets(const rtclog2::IncomingRtpPackets& proto); >+ void StoreOutgoingRtpPackets(const rtclog2::OutgoingRtpPackets& proto); >+ void StoreIncomingRtcpPackets(const rtclog2::IncomingRtcpPackets& proto); >+ void StoreOutgoingRtcpPackets(const rtclog2::OutgoingRtcpPackets& proto); >+ void StoreAudioPlayoutEvent(const rtclog2::AudioPlayoutEvents& proto); >+ void StoreStartEvent(const rtclog2::BeginLogEvent& proto); >+ void StoreStopEvent(const rtclog2::EndLogEvent& proto); >+ void StoreBweLossBasedUpdate(const rtclog2::LossBasedBweUpdates& proto); >+ void StoreBweDelayBasedUpdate(const rtclog2::DelayBasedBweUpdates& proto); >+ void StoreAudioNetworkAdaptationEvent( >+ const rtclog2::AudioNetworkAdaptations& proto); >+ void StoreBweProbeClusterCreated(const rtclog2::BweProbeCluster& proto); >+ void StoreBweProbeSuccessEvent(const rtclog2::BweProbeResultSuccess& proto); >+ void StoreBweProbeFailureEvent(const rtclog2::BweProbeResultFailure& proto); >+ void StoreAlrStateEvent(const rtclog2::AlrState& proto); >+ void StoreIceCandidatePairConfig( >+ const rtclog2::IceCandidatePairConfig& proto); >+ void StoreIceCandidateEvent(const rtclog2::IceCandidatePairEvent& proto); >+ void StoreAudioRecvConfig(const rtclog2::AudioRecvStreamConfig& proto); >+ void StoreAudioSendConfig(const rtclog2::AudioSendStreamConfig& proto); >+ void StoreVideoRecvConfig(const rtclog2::VideoRecvStreamConfig& proto); >+ void 
StoreVideoSendConfig(const rtclog2::VideoSendStreamConfig& proto); >+ // End of new parsing functions. > > struct Stream { > Stream(uint32_t ssrc, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_unittest.cc >index e238f84884618ac19073e7304e01ee1746e4bc91..f7b7898993f7c92b9157d9a7918e924d8110cb87 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_unittest.cc >@@ -101,13 +101,16 @@ struct EventCounts { > }; > > class RtcEventLogSession >- : public ::testing::TestWithParam<std::tuple<uint64_t, int64_t>> { >+ : public ::testing::TestWithParam< >+ std::tuple<uint64_t, int64_t, RtcEventLog::EncodingType>> { > public: > RtcEventLogSession() > : seed_(std::get<0>(GetParam())), > prng_(seed_), >+ output_period_ms_(std::get<1>(GetParam())), >+ encoding_type_(std::get<2>(GetParam())), > gen_(seed_ * 880001UL), >- output_period_ms_(std::get<1>(GetParam())) { >+ verifier_(encoding_type_) { > clock_.SetTimeMicros(prng_.Rand<uint32_t>()); > // Find the name of the current test, in order to use it as a temporary > // filename. >@@ -166,12 +169,15 @@ class RtcEventLogSession > std::vector<std::unique_ptr<RtcEventRtcpPacketOutgoing>> outgoing_rtcp_list_; > > int64_t start_time_us_; >+ int64_t utc_start_time_us_; > int64_t stop_time_us_; > > const uint64_t seed_; > Random prng_; >+ const int64_t output_period_ms_; >+ const RtcEventLog::EncodingType encoding_type_; > test::EventGenerator gen_; >- int64_t output_period_ms_; >+ test::EventVerifier verifier_; > rtc::ScopedFakeClock clock_; > std::string temp_filename_; > }; >@@ -284,8 +290,7 @@ void RtcEventLogSession::WriteLog(EventCounts count, > // Maybe always use the ScopedFakeClock, but conditionally SleepMs()? 
> > // The log file will be flushed to disk when the event_log goes out of scope. >- std::unique_ptr<RtcEventLog> event_log( >- RtcEventLog::Create(RtcEventLog::EncodingType::Legacy)); >+ std::unique_ptr<RtcEventLog> event_log(RtcEventLog::Create(encoding_type_)); > > // We can't send or receive packets without configured streams. > RTC_CHECK_GE(count.video_recv_streams, 1); >@@ -306,6 +311,7 @@ void RtcEventLogSession::WriteLog(EventCounts count, > absl::make_unique<RtcEventLogOutputFile>(temp_filename_, 10000000), > output_period_ms_); > start_time_us_ = rtc::TimeMicros(); >+ utc_start_time_us_ = rtc::TimeUTCMicros(); > } > > clock_.AdvanceTimeMicros(prng_.Rand(20) * 1000); >@@ -465,19 +471,18 @@ void RtcEventLogSession::ReadAndVerifyLog() { > // Start and stop events. > auto& parsed_start_log_events = parsed_log.start_log_events(); > ASSERT_EQ(parsed_start_log_events.size(), static_cast<size_t>(1)); >- EXPECT_TRUE( >- test::VerifyLoggedStartEvent(start_time_us_, parsed_start_log_events[0])); >+ verifier_.VerifyLoggedStartEvent(start_time_us_, utc_start_time_us_, >+ parsed_start_log_events[0]); > > auto& parsed_stop_log_events = parsed_log.stop_log_events(); > ASSERT_EQ(parsed_stop_log_events.size(), static_cast<size_t>(1)); >- EXPECT_TRUE( >- test::VerifyLoggedStopEvent(stop_time_us_, parsed_stop_log_events[0])); >+ verifier_.VerifyLoggedStopEvent(stop_time_us_, parsed_stop_log_events[0]); > > auto& parsed_alr_state_events = parsed_log.alr_state_events(); > ASSERT_EQ(parsed_alr_state_events.size(), alr_state_list_.size()); > for (size_t i = 0; i < parsed_alr_state_events.size(); i++) { >- EXPECT_TRUE(test::VerifyLoggedAlrStateEvent(*alr_state_list_[i], >- parsed_alr_state_events[i])); >+ verifier_.VerifyLoggedAlrStateEvent(*alr_state_list_[i], >+ parsed_alr_state_events[i]); > } > > const auto& parsed_audio_playout_map = parsed_log.audio_playout_events(); >@@ -488,8 +493,8 @@ void RtcEventLogSession::ReadAndVerifyLog() { > const auto& audio_playout_stream = 
audio_playout_map_[ssrc]; > ASSERT_EQ(parsed_audio_playout_stream.size(), audio_playout_stream.size()); > for (size_t i = 0; i < parsed_audio_playout_map.size(); i++) { >- EXPECT_TRUE(test::VerifyLoggedAudioPlayoutEvent( >- *audio_playout_stream[i], parsed_audio_playout_stream[i])); >+ verifier_.VerifyLoggedAudioPlayoutEvent(*audio_playout_stream[i], >+ parsed_audio_playout_stream[i]); > } > } > >@@ -498,22 +503,22 @@ void RtcEventLogSession::ReadAndVerifyLog() { > ASSERT_EQ(parsed_audio_network_adaptation_events.size(), > ana_configs_list_.size()); > for (size_t i = 0; i < parsed_audio_network_adaptation_events.size(); i++) { >- EXPECT_TRUE(test::VerifyLoggedAudioNetworkAdaptationEvent( >- *ana_configs_list_[i], parsed_audio_network_adaptation_events[i])); >+ verifier_.VerifyLoggedAudioNetworkAdaptationEvent( >+ *ana_configs_list_[i], parsed_audio_network_adaptation_events[i]); > } > > auto& parsed_bwe_delay_updates = parsed_log.bwe_delay_updates(); > ASSERT_EQ(parsed_bwe_delay_updates.size(), bwe_delay_list_.size()); > for (size_t i = 0; i < parsed_bwe_delay_updates.size(); i++) { >- EXPECT_TRUE(test::VerifyLoggedBweDelayBasedUpdate( >- *bwe_delay_list_[i], parsed_bwe_delay_updates[i])); >+ verifier_.VerifyLoggedBweDelayBasedUpdate(*bwe_delay_list_[i], >+ parsed_bwe_delay_updates[i]); > } > > auto& parsed_bwe_loss_updates = parsed_log.bwe_loss_updates(); > ASSERT_EQ(parsed_bwe_loss_updates.size(), bwe_loss_list_.size()); > for (size_t i = 0; i < parsed_bwe_loss_updates.size(); i++) { >- EXPECT_TRUE(test::VerifyLoggedBweLossBasedUpdate( >- *bwe_loss_list_[i], parsed_bwe_loss_updates[i])); >+ verifier_.VerifyLoggedBweLossBasedUpdate(*bwe_loss_list_[i], >+ parsed_bwe_loss_updates[i]); > } > > auto& parsed_bwe_probe_cluster_created_events = >@@ -521,30 +526,30 @@ void RtcEventLogSession::ReadAndVerifyLog() { > ASSERT_EQ(parsed_bwe_probe_cluster_created_events.size(), > probe_creation_list_.size()); > for (size_t i = 0; i < 
parsed_bwe_probe_cluster_created_events.size(); i++) { >- EXPECT_TRUE(test::VerifyLoggedBweProbeClusterCreatedEvent( >- *probe_creation_list_[i], parsed_bwe_probe_cluster_created_events[i])); >+ verifier_.VerifyLoggedBweProbeClusterCreatedEvent( >+ *probe_creation_list_[i], parsed_bwe_probe_cluster_created_events[i]); > } > > auto& parsed_bwe_probe_failure_events = parsed_log.bwe_probe_failure_events(); > ASSERT_EQ(parsed_bwe_probe_failure_events.size(), probe_failure_list_.size()); > for (size_t i = 0; i < parsed_bwe_probe_failure_events.size(); i++) { >- EXPECT_TRUE(test::VerifyLoggedBweProbeFailureEvent( >- *probe_failure_list_[i], parsed_bwe_probe_failure_events[i])); >+ verifier_.VerifyLoggedBweProbeFailureEvent( >+ *probe_failure_list_[i], parsed_bwe_probe_failure_events[i]); > } > > auto& parsed_bwe_probe_success_events = parsed_log.bwe_probe_success_events(); > ASSERT_EQ(parsed_bwe_probe_success_events.size(), probe_success_list_.size()); > for (size_t i = 0; i < parsed_bwe_probe_success_events.size(); i++) { >- EXPECT_TRUE(test::VerifyLoggedBweProbeSuccessEvent( >- *probe_success_list_[i], parsed_bwe_probe_success_events[i])); >+ verifier_.VerifyLoggedBweProbeSuccessEvent( >+ *probe_success_list_[i], parsed_bwe_probe_success_events[i]); > } > > auto& parsed_ice_candidate_pair_configs = > parsed_log.ice_candidate_pair_configs(); > ASSERT_EQ(parsed_ice_candidate_pair_configs.size(), ice_config_list_.size()); > for (size_t i = 0; i < parsed_ice_candidate_pair_configs.size(); i++) { >- EXPECT_TRUE(test::VerifyLoggedIceCandidatePairConfig( >- *ice_config_list_[i], parsed_ice_candidate_pair_configs[i])); >+ verifier_.VerifyLoggedIceCandidatePairConfig( >+ *ice_config_list_[i], parsed_ice_candidate_pair_configs[i]); > } > > auto& parsed_ice_candidate_pair_events = >@@ -552,8 +557,8 @@ void RtcEventLogSession::ReadAndVerifyLog() { > ASSERT_EQ(parsed_ice_candidate_pair_events.size(), > parsed_ice_candidate_pair_events.size()); > for (size_t i = 0; i < 
parsed_ice_candidate_pair_events.size(); i++) { >- EXPECT_TRUE(test::VerifyLoggedIceCandidatePairEvent( >- *ice_event_list_[i], parsed_ice_candidate_pair_events[i])); >+ verifier_.VerifyLoggedIceCandidatePairEvent( >+ *ice_event_list_[i], parsed_ice_candidate_pair_events[i]); > } > > auto& parsed_incoming_rtp_packets_by_ssrc = >@@ -566,8 +571,8 @@ void RtcEventLogSession::ReadAndVerifyLog() { > const auto& rtp_stream = incoming_rtp_map_[ssrc]; > ASSERT_EQ(parsed_rtp_stream.size(), rtp_stream.size()); > for (size_t i = 0; i < parsed_rtp_stream.size(); i++) { >- EXPECT_TRUE(test::VerifyLoggedRtpPacketIncoming(*rtp_stream[i], >- parsed_rtp_stream[i])); >+ verifier_.VerifyLoggedRtpPacketIncoming(*rtp_stream[i], >+ parsed_rtp_stream[i]); > } > } > >@@ -581,47 +586,47 @@ void RtcEventLogSession::ReadAndVerifyLog() { > const auto& rtp_stream = outgoing_rtp_map_[ssrc]; > ASSERT_EQ(parsed_rtp_stream.size(), rtp_stream.size()); > for (size_t i = 0; i < parsed_rtp_stream.size(); i++) { >- EXPECT_TRUE(test::VerifyLoggedRtpPacketOutgoing(*rtp_stream[i], >- parsed_rtp_stream[i])); >+ verifier_.VerifyLoggedRtpPacketOutgoing(*rtp_stream[i], >+ parsed_rtp_stream[i]); > } > } > > auto& parsed_incoming_rtcp_packets = parsed_log.incoming_rtcp_packets(); > ASSERT_EQ(parsed_incoming_rtcp_packets.size(), incoming_rtcp_list_.size()); > for (size_t i = 0; i < parsed_incoming_rtcp_packets.size(); i++) { >- EXPECT_TRUE(test::VerifyLoggedRtcpPacketIncoming( >- *incoming_rtcp_list_[i], parsed_incoming_rtcp_packets[i])); >+ verifier_.VerifyLoggedRtcpPacketIncoming(*incoming_rtcp_list_[i], >+ parsed_incoming_rtcp_packets[i]); > } > > auto& parsed_outgoing_rtcp_packets = parsed_log.outgoing_rtcp_packets(); > ASSERT_EQ(parsed_outgoing_rtcp_packets.size(), outgoing_rtcp_list_.size()); > for (size_t i = 0; i < parsed_outgoing_rtcp_packets.size(); i++) { >- EXPECT_TRUE(test::VerifyLoggedRtcpPacketOutgoing( >- *outgoing_rtcp_list_[i], parsed_outgoing_rtcp_packets[i])); >+ 
verifier_.VerifyLoggedRtcpPacketOutgoing(*outgoing_rtcp_list_[i], >+ parsed_outgoing_rtcp_packets[i]); > } > auto& parsed_audio_recv_configs = parsed_log.audio_recv_configs(); > ASSERT_EQ(parsed_audio_recv_configs.size(), audio_recv_config_list_.size()); > for (size_t i = 0; i < parsed_audio_recv_configs.size(); i++) { >- EXPECT_TRUE(test::VerifyLoggedAudioRecvConfig( >- *audio_recv_config_list_[i], parsed_audio_recv_configs[i])); >+ verifier_.VerifyLoggedAudioRecvConfig(*audio_recv_config_list_[i], >+ parsed_audio_recv_configs[i]); > } > auto& parsed_audio_send_configs = parsed_log.audio_send_configs(); > ASSERT_EQ(parsed_audio_send_configs.size(), audio_send_config_list_.size()); > for (size_t i = 0; i < parsed_audio_send_configs.size(); i++) { >- EXPECT_TRUE(test::VerifyLoggedAudioSendConfig( >- *audio_send_config_list_[i], parsed_audio_send_configs[i])); >+ verifier_.VerifyLoggedAudioSendConfig(*audio_send_config_list_[i], >+ parsed_audio_send_configs[i]); > } > auto& parsed_video_recv_configs = parsed_log.video_recv_configs(); > ASSERT_EQ(parsed_video_recv_configs.size(), video_recv_config_list_.size()); > for (size_t i = 0; i < parsed_video_recv_configs.size(); i++) { >- EXPECT_TRUE(test::VerifyLoggedVideoRecvConfig( >- *video_recv_config_list_[i], parsed_video_recv_configs[i])); >+ verifier_.VerifyLoggedVideoRecvConfig(*video_recv_config_list_[i], >+ parsed_video_recv_configs[i]); > } > auto& parsed_video_send_configs = parsed_log.video_send_configs(); > ASSERT_EQ(parsed_video_send_configs.size(), video_send_config_list_.size()); > for (size_t i = 0; i < parsed_video_send_configs.size(); i++) { >- EXPECT_TRUE(test::VerifyLoggedVideoSendConfig( >- *video_send_config_list_[i], parsed_video_send_configs[i])); >+ verifier_.VerifyLoggedVideoSendConfig(*video_send_config_list_[i], >+ parsed_video_send_configs[i]); > } > > // Clean up temporary file - can be pretty slow. 
>@@ -680,7 +685,25 @@ TEST_P(RtcEventLogSession, StartLoggingInTheMiddle) { > ReadAndVerifyLog(); > } > >-TEST(RtcEventLogTest, CircularBufferKeepsMostRecentEvents) { >+INSTANTIATE_TEST_CASE_P( >+ RtcEventLogTest, >+ RtcEventLogSession, >+ ::testing::Combine( >+ ::testing::Values(1234567, 7654321), >+ ::testing::Values(RtcEventLog::kImmediateOutput, 1, 5), >+ ::testing::Values(RtcEventLog::EncodingType::Legacy, >+ RtcEventLog::EncodingType::NewFormat))); >+ >+class RtcEventLogCircularBufferTest >+ : public ::testing::TestWithParam<RtcEventLog::EncodingType> { >+ public: >+ RtcEventLogCircularBufferTest() >+ : encoding_type_(GetParam()), verifier_(encoding_type_) {} >+ const RtcEventLog::EncodingType encoding_type_; >+ const test::EventVerifier verifier_; >+}; >+ >+TEST_P(RtcEventLogCircularBufferTest, KeepsMostRecentEvents) { > // TODO(terelius): Maybe make a separate RtcEventLogImplTest that can access > // the size of the cyclic buffer? > constexpr size_t kNumEvents = 20000; >@@ -699,8 +722,7 @@ TEST(RtcEventLogTest, CircularBufferKeepsMostRecentEvents) { > > // When log_dumper goes out of scope, it causes the log file to be flushed > // to disk. 
>- std::unique_ptr<RtcEventLog> log_dumper( >- RtcEventLog::Create(RtcEventLog::EncodingType::Legacy)); >+ std::unique_ptr<RtcEventLog> log_dumper(RtcEventLog::Create(encoding_type_)); > > for (size_t i = 0; i < kNumEvents; i++) { > // The purpose of the test is to verify that the log can handle >@@ -714,6 +736,7 @@ TEST(RtcEventLogTest, CircularBufferKeepsMostRecentEvents) { > fake_clock->AdvanceTimeMicros(10000); > } > int64_t start_time_us = rtc::TimeMicros(); >+ int64_t utc_start_time_us = rtc::TimeUTCMicros(); > log_dumper->StartLogging( > absl::make_unique<RtcEventLogOutputFile>(temp_filename, 10000000), > RtcEventLog::kImmediateOutput); >@@ -727,11 +750,12 @@ TEST(RtcEventLogTest, CircularBufferKeepsMostRecentEvents) { > > const auto& start_log_events = parsed_log.start_log_events(); > ASSERT_EQ(start_log_events.size(), 1u); >- EXPECT_TRUE(test::VerifyLoggedStartEvent(start_time_us, start_log_events[0])); >+ verifier_.VerifyLoggedStartEvent(start_time_us, utc_start_time_us, >+ start_log_events[0]); > > const auto& stop_log_events = parsed_log.stop_log_events(); > ASSERT_EQ(stop_log_events.size(), 1u); >- EXPECT_TRUE(test::VerifyLoggedStopEvent(stop_time_us, stop_log_events[0])); >+ verifier_.VerifyLoggedStopEvent(stop_time_us, stop_log_events[0]); > > const auto& probe_success_events = parsed_log.bwe_probe_success_events(); > // If the following fails, it probably means that kNumEvents isn't larger >@@ -752,19 +776,20 @@ TEST(RtcEventLogTest, CircularBufferKeepsMostRecentEvents) { > fake_clock->SetTimeMicros(first_timestamp_us); > for (size_t i = 1; i < probe_success_events.size(); i++) { > fake_clock->AdvanceTimeMicros(10000); >- ASSERT_TRUE(test::VerifyLoggedBweProbeSuccessEvent( >+ verifier_.VerifyLoggedBweProbeSuccessEvent( > RtcEventProbeResultSuccess(first_id + i, first_bitrate_bps + i * 1000), >- probe_success_events[i])); >+ probe_success_events[i]); > } > } > >+INSTANTIATE_TEST_CASE_P( >+ RtcEventLogTest, >+ RtcEventLogCircularBufferTest, >+ 
::testing::Values(RtcEventLog::EncodingType::Legacy, >+ RtcEventLog::EncodingType::NewFormat)); >+ > // TODO(terelius): Verify parser behavior if the timestamps are not > // monotonically increasing in the log. > >-INSTANTIATE_TEST_CASE_P( >- RtcEventLogTest, >- RtcEventLogSession, >- ::testing::Combine(::testing::Values(1234567, 7654321), >- ::testing::Values(RtcEventLog::kImmediateOutput, 1, 5))); > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_unittest_helper.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_unittest_helper.cc >index 9f6240cbe1e6bf77771df1b137dbe9d37f10b638..e87808d125c3e493c11f15c77a9daf01643b527c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_unittest_helper.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_unittest_helper.cc >@@ -12,12 +12,15 @@ > > #include <string.h> // memcmp > >+#include <algorithm> > #include <limits> > #include <memory> > #include <numeric> >+#include <string> > #include <utility> > #include <vector> > >+#include "absl/types/optional.h" > #include "modules/audio_coding/audio_network_adaptor/include/audio_network_adaptor.h" > #include "modules/remote_bitrate_estimator/include/bwe_defines.h" > #include "modules/rtp_rtcp/include/rtp_cvo.h" >@@ -25,6 +28,7 @@ > #include "modules/rtp_rtcp/source/rtp_packet_received.h" > #include "modules/rtp_rtcp/source/rtp_packet_to_send.h" > #include "rtc_base/checks.h" >+#include "test/gtest.h" > > namespace webrtc { > >@@ -62,6 +66,16 @@ void ShuffleInPlace(Random* prng, rtc::ArrayView<T> array) { > std::swap(array[i], array[other]); > } > } >+ >+absl::optional<int> GetExtensionId(const std::vector<RtpExtension>& extensions, >+ const std::string& uri) { >+ for (const auto& extension : extensions) { >+ if (extension.uri == uri) >+ return extension.id; >+ } >+ return absl::nullopt; >+} >+ > } // namespace > 
> std::unique_ptr<RtcEventAlrState> EventGenerator::NewAlrState() { >@@ -266,7 +280,8 @@ void EventGenerator::RandomizeRtpPacket( > size_t padding_size, > uint32_t ssrc, > const RtpHeaderExtensionMap& extension_map, >- RtpPacket* rtp_packet) { >+ RtpPacket* rtp_packet, >+ bool all_configured_exts) { > constexpr int kMaxPayloadType = 127; > rtp_packet->SetPayloadType(prng_.Rand(kMaxPayloadType)); > rtp_packet->SetMarker(prng_.Rand<bool>()); >@@ -281,16 +296,30 @@ void EventGenerator::RandomizeRtpPacket( > } > rtp_packet->SetCsrcs(csrcs); > >- if (extension_map.IsRegistered(TransmissionOffset::kId)) >+ if (extension_map.IsRegistered(TransmissionOffset::kId) && >+ (all_configured_exts || prng_.Rand<bool>())) { > rtp_packet->SetExtension<TransmissionOffset>(prng_.Rand(0x00ffffff)); >- if (extension_map.IsRegistered(AudioLevel::kId)) >+ } >+ >+ if (extension_map.IsRegistered(AudioLevel::kId) && >+ (all_configured_exts || prng_.Rand<bool>())) { > rtp_packet->SetExtension<AudioLevel>(prng_.Rand<bool>(), prng_.Rand(127)); >- if (extension_map.IsRegistered(AbsoluteSendTime::kId)) >+ } >+ >+ if (extension_map.IsRegistered(AbsoluteSendTime::kId) && >+ (all_configured_exts || prng_.Rand<bool>())) { > rtp_packet->SetExtension<AbsoluteSendTime>(prng_.Rand(0x00ffffff)); >- if (extension_map.IsRegistered(VideoOrientation::kId)) >+ } >+ >+ if (extension_map.IsRegistered(VideoOrientation::kId) && >+ (all_configured_exts || prng_.Rand<bool>())) { > rtp_packet->SetExtension<VideoOrientation>(prng_.Rand(3)); >- if (extension_map.IsRegistered(TransportSequenceNumber::kId)) >+ } >+ >+ if (extension_map.IsRegistered(TransportSequenceNumber::kId) && >+ (all_configured_exts || prng_.Rand<bool>())) { > rtp_packet->SetExtension<TransportSequenceNumber>(prng_.Rand<uint16_t>()); >+ } > > RTC_CHECK_LE(rtp_packet->headers_size() + payload_size, IP_PACKET_SIZE); > >@@ -299,12 +328,13 @@ void EventGenerator::RandomizeRtpPacket( > for (size_t i = 0; i < payload_size; i++) { > payload[i] = 
prng_.Rand<uint8_t>(); > } >- RTC_CHECK(rtp_packet->SetPadding(padding_size, &prng_)); >+ RTC_CHECK(rtp_packet->SetPadding(padding_size)); > } > > std::unique_ptr<RtcEventRtpPacketIncoming> EventGenerator::NewRtpPacketIncoming( > uint32_t ssrc, >- const RtpHeaderExtensionMap& extension_map) { >+ const RtpHeaderExtensionMap& extension_map, >+ bool all_configured_exts) { > constexpr size_t kMaxPaddingLength = 224; > const bool padding = prng_.Rand(0, 9) == 0; // Let padding be 10% probable. > const size_t padding_size = !padding ? 0u : prng_.Rand(0u, kMaxPaddingLength); >@@ -325,14 +355,15 @@ std::unique_ptr<RtcEventRtpPacketIncoming> EventGenerator::NewRtpPacketIncoming( > > RtpPacketReceived rtp_packet(&extension_map); > RandomizeRtpPacket(payload_size, padding_size, ssrc, extension_map, >- &rtp_packet); >+ &rtp_packet, all_configured_exts); > > return absl::make_unique<RtcEventRtpPacketIncoming>(rtp_packet); > } > > std::unique_ptr<RtcEventRtpPacketOutgoing> EventGenerator::NewRtpPacketOutgoing( > uint32_t ssrc, >- const RtpHeaderExtensionMap& extension_map) { >+ const RtpHeaderExtensionMap& extension_map, >+ bool all_configured_exts) { > constexpr size_t kMaxPaddingLength = 224; > const bool padding = prng_.Rand(0, 9) == 0; // Let padding be 10% probable. > const size_t padding_size = !padding ? 
0u : prng_.Rand(0u, kMaxPaddingLength); >@@ -354,33 +385,34 @@ std::unique_ptr<RtcEventRtpPacketOutgoing> EventGenerator::NewRtpPacketOutgoing( > RtpPacketToSend rtp_packet(&extension_map, > kMaxHeaderSize + payload_size + padding_size); > RandomizeRtpPacket(payload_size, padding_size, ssrc, extension_map, >- &rtp_packet); >+ &rtp_packet, all_configured_exts); > > int probe_cluster_id = prng_.Rand(0, 100000); > return absl::make_unique<RtcEventRtpPacketOutgoing>(rtp_packet, > probe_cluster_id); > } > >-RtpHeaderExtensionMap EventGenerator::NewRtpHeaderExtensionMap() { >+RtpHeaderExtensionMap EventGenerator::NewRtpHeaderExtensionMap( >+ bool configure_all) { > RtpHeaderExtensionMap extension_map; > std::vector<int> id(RtpExtension::kOneByteHeaderExtensionMaxId - > RtpExtension::kMinId + 1); > std::iota(id.begin(), id.end(), RtpExtension::kMinId); > ShuffleInPlace(&prng_, rtc::ArrayView<int>(id)); > >- if (prng_.Rand<bool>()) { >+ if (configure_all || prng_.Rand<bool>()) { > extension_map.Register<AudioLevel>(id[0]); > } >- if (prng_.Rand<bool>()) { >+ if (configure_all || prng_.Rand<bool>()) { > extension_map.Register<TransmissionOffset>(id[1]); > } >- if (prng_.Rand<bool>()) { >+ if (configure_all || prng_.Rand<bool>()) { > extension_map.Register<AbsoluteSendTime>(id[2]); > } >- if (prng_.Rand<bool>()) { >+ if (configure_all || prng_.Rand<bool>()) { > extension_map.Register<VideoOrientation>(id[3]); > } >- if (prng_.Rand<bool>()) { >+ if (configure_all || prng_.Rand<bool>()) { > extension_map.Register<TransportSequenceNumber>(id[4]); > } > >@@ -469,396 +501,317 @@ EventGenerator::NewVideoSendStreamConfig( > return absl::make_unique<RtcEventVideoSendStreamConfig>(std::move(config)); > } > >-bool VerifyLoggedAlrStateEvent(const RtcEventAlrState& original_event, >- const LoggedAlrStateEvent& logged_event) { >- if (original_event.timestamp_us_ != logged_event.log_time_us()) >- return false; >- if (original_event.in_alr_ != logged_event.in_alr) >- return false; >- 
return true; >+void EventVerifier::VerifyLoggedAlrStateEvent( >+ const RtcEventAlrState& original_event, >+ const LoggedAlrStateEvent& logged_event) const { >+ EXPECT_EQ(original_event.timestamp_ms(), logged_event.log_time_ms()); >+ EXPECT_EQ(original_event.in_alr(), logged_event.in_alr); > } > >-bool VerifyLoggedAudioPlayoutEvent( >+void EventVerifier::VerifyLoggedAudioPlayoutEvent( > const RtcEventAudioPlayout& original_event, >- const LoggedAudioPlayoutEvent& logged_event) { >- if (original_event.timestamp_us_ != logged_event.log_time_us()) >- return false; >- if (original_event.ssrc_ != logged_event.ssrc) >- return false; >- return true; >+ const LoggedAudioPlayoutEvent& logged_event) const { >+ EXPECT_EQ(original_event.timestamp_ms(), logged_event.log_time_ms()); >+ EXPECT_EQ(original_event.ssrc(), logged_event.ssrc); > } > >-bool VerifyLoggedAudioNetworkAdaptationEvent( >+void EventVerifier::VerifyLoggedAudioNetworkAdaptationEvent( > const RtcEventAudioNetworkAdaptation& original_event, >- const LoggedAudioNetworkAdaptationEvent& logged_event) { >- if (original_event.timestamp_us_ != logged_event.log_time_us()) >- return false; >- >- if (original_event.config_->bitrate_bps != logged_event.config.bitrate_bps) >- return false; >- if (original_event.config_->enable_dtx != logged_event.config.enable_dtx) >- return false; >- if (original_event.config_->enable_fec != logged_event.config.enable_fec) >- return false; >- if (original_event.config_->frame_length_ms != >- logged_event.config.frame_length_ms) >- return false; >- if (original_event.config_->num_channels != logged_event.config.num_channels) >- return false; >- if (original_event.config_->uplink_packet_loss_fraction != >- logged_event.config.uplink_packet_loss_fraction) >- return false; >- >- return true; >-} >- >-bool VerifyLoggedBweDelayBasedUpdate( >+ const LoggedAudioNetworkAdaptationEvent& logged_event) const { >+ EXPECT_EQ(original_event.timestamp_ms(), logged_event.log_time_ms()); >+ >+ 
EXPECT_EQ(original_event.config().bitrate_bps, >+ logged_event.config.bitrate_bps); >+ EXPECT_EQ(original_event.config().enable_dtx, logged_event.config.enable_dtx); >+ EXPECT_EQ(original_event.config().enable_fec, logged_event.config.enable_fec); >+ EXPECT_EQ(original_event.config().frame_length_ms, >+ logged_event.config.frame_length_ms); >+ EXPECT_EQ(original_event.config().num_channels, >+ logged_event.config.num_channels); >+ >+ // uplink_packet_loss_fraction >+ ASSERT_EQ(original_event.config().uplink_packet_loss_fraction.has_value(), >+ logged_event.config.uplink_packet_loss_fraction.has_value()); >+ if (original_event.config().uplink_packet_loss_fraction.has_value()) { >+ const float original = >+ original_event.config().uplink_packet_loss_fraction.value(); >+ const float logged = >+ logged_event.config.uplink_packet_loss_fraction.value(); >+ const float uplink_packet_loss_fraction_delta = std::abs(original - logged); >+ EXPECT_LE(uplink_packet_loss_fraction_delta, 0.0001f); >+ } >+} >+ >+void EventVerifier::VerifyLoggedBweDelayBasedUpdate( > const RtcEventBweUpdateDelayBased& original_event, >- const LoggedBweDelayBasedUpdate& logged_event) { >- if (original_event.timestamp_us_ != logged_event.log_time_us()) >- return false; >- if (original_event.bitrate_bps_ != logged_event.bitrate_bps) >- return false; >- if (original_event.detector_state_ != logged_event.detector_state) >- return false; >- return true; >+ const LoggedBweDelayBasedUpdate& logged_event) const { >+ EXPECT_EQ(original_event.timestamp_ms(), logged_event.log_time_ms()); >+ EXPECT_EQ(original_event.bitrate_bps(), logged_event.bitrate_bps); >+ EXPECT_EQ(original_event.detector_state(), logged_event.detector_state); > } > >-bool VerifyLoggedBweLossBasedUpdate( >+void EventVerifier::VerifyLoggedBweLossBasedUpdate( > const RtcEventBweUpdateLossBased& original_event, >- const LoggedBweLossBasedUpdate& logged_event) { >- if (original_event.timestamp_us_ != logged_event.log_time_us()) >- return 
false; >- if (original_event.bitrate_bps_ != logged_event.bitrate_bps) >- return false; >- if (original_event.fraction_loss_ != logged_event.fraction_lost) >- return false; >- if (original_event.total_packets_ != logged_event.expected_packets) >- return false; >- return true; >-} >- >-bool VerifyLoggedBweProbeClusterCreatedEvent( >+ const LoggedBweLossBasedUpdate& logged_event) const { >+ EXPECT_EQ(original_event.timestamp_ms(), logged_event.log_time_ms()); >+ EXPECT_EQ(original_event.bitrate_bps(), logged_event.bitrate_bps); >+ EXPECT_EQ(original_event.fraction_loss(), logged_event.fraction_lost); >+ EXPECT_EQ(original_event.total_packets(), logged_event.expected_packets); >+} >+ >+void EventVerifier::VerifyLoggedBweProbeClusterCreatedEvent( > const RtcEventProbeClusterCreated& original_event, >- const LoggedBweProbeClusterCreatedEvent& logged_event) { >- if (original_event.timestamp_us_ != logged_event.log_time_us()) >- return false; >- if (original_event.id_ != logged_event.id) >- return false; >- if (original_event.bitrate_bps_ != logged_event.bitrate_bps) >- return false; >- if (original_event.min_probes_ != logged_event.min_packets) >- return false; >- if (original_event.min_bytes_ != logged_event.min_bytes) >- return false; >- >- return true; >-} >- >-bool VerifyLoggedBweProbeFailureEvent( >+ const LoggedBweProbeClusterCreatedEvent& logged_event) const { >+ EXPECT_EQ(original_event.timestamp_ms(), logged_event.log_time_ms()); >+ EXPECT_EQ(original_event.id(), logged_event.id); >+ EXPECT_EQ(original_event.bitrate_bps(), logged_event.bitrate_bps); >+ EXPECT_EQ(original_event.min_probes(), logged_event.min_packets); >+ EXPECT_EQ(original_event.min_bytes(), logged_event.min_bytes); >+} >+ >+void EventVerifier::VerifyLoggedBweProbeFailureEvent( > const RtcEventProbeResultFailure& original_event, >- const LoggedBweProbeFailureEvent& logged_event) { >- if (original_event.timestamp_us_ != logged_event.log_time_us()) >- return false; >- if (original_event.id_ != 
logged_event.id) >- return false; >- if (original_event.failure_reason_ != logged_event.failure_reason) >- return false; >- return true; >+ const LoggedBweProbeFailureEvent& logged_event) const { >+ EXPECT_EQ(original_event.timestamp_ms(), logged_event.log_time_ms()); >+ EXPECT_EQ(original_event.id(), logged_event.id); >+ EXPECT_EQ(original_event.failure_reason(), logged_event.failure_reason); > } > >-bool VerifyLoggedBweProbeSuccessEvent( >+void EventVerifier::VerifyLoggedBweProbeSuccessEvent( > const RtcEventProbeResultSuccess& original_event, >- const LoggedBweProbeSuccessEvent& logged_event) { >- if (original_event.timestamp_us_ != logged_event.log_time_us()) >- return false; >- if (original_event.id_ != logged_event.id) >- return false; >- if (original_event.bitrate_bps_ != logged_event.bitrate_bps) >- return false; >- return true; >+ const LoggedBweProbeSuccessEvent& logged_event) const { >+ EXPECT_EQ(original_event.timestamp_ms(), logged_event.log_time_ms()); >+ EXPECT_EQ(original_event.id(), logged_event.id); >+ EXPECT_EQ(original_event.bitrate_bps(), logged_event.bitrate_bps); > } > >-bool VerifyLoggedIceCandidatePairConfig( >+void EventVerifier::VerifyLoggedIceCandidatePairConfig( > const RtcEventIceCandidatePairConfig& original_event, >- const LoggedIceCandidatePairConfig& logged_event) { >- if (original_event.timestamp_us_ != logged_event.log_time_us()) >- return false; >- >- if (original_event.type_ != logged_event.type) >- return false; >- if (original_event.candidate_pair_id_ != logged_event.candidate_pair_id) >- return false; >- if (original_event.candidate_pair_desc_.local_candidate_type != >- logged_event.local_candidate_type) >- return false; >- if (original_event.candidate_pair_desc_.local_relay_protocol != >- logged_event.local_relay_protocol) >- return false; >- if (original_event.candidate_pair_desc_.local_network_type != >- logged_event.local_network_type) >- return false; >- if (original_event.candidate_pair_desc_.local_address_family != >- 
logged_event.local_address_family) >- return false; >- if (original_event.candidate_pair_desc_.remote_candidate_type != >- logged_event.remote_candidate_type) >- return false; >- if (original_event.candidate_pair_desc_.remote_address_family != >- logged_event.remote_address_family) >- return false; >- if (original_event.candidate_pair_desc_.candidate_pair_protocol != >- logged_event.candidate_pair_protocol) >- return false; >- >- return true; >-} >- >-bool VerifyLoggedIceCandidatePairEvent( >+ const LoggedIceCandidatePairConfig& logged_event) const { >+ EXPECT_EQ(original_event.timestamp_ms(), logged_event.log_time_ms()); >+ >+ EXPECT_EQ(original_event.type(), logged_event.type); >+ EXPECT_EQ(original_event.candidate_pair_id(), logged_event.candidate_pair_id); >+ EXPECT_EQ(original_event.candidate_pair_desc().local_candidate_type, >+ logged_event.local_candidate_type); >+ EXPECT_EQ(original_event.candidate_pair_desc().local_relay_protocol, >+ logged_event.local_relay_protocol); >+ EXPECT_EQ(original_event.candidate_pair_desc().local_network_type, >+ logged_event.local_network_type); >+ EXPECT_EQ(original_event.candidate_pair_desc().local_address_family, >+ logged_event.local_address_family); >+ EXPECT_EQ(original_event.candidate_pair_desc().remote_candidate_type, >+ logged_event.remote_candidate_type); >+ EXPECT_EQ(original_event.candidate_pair_desc().remote_address_family, >+ logged_event.remote_address_family); >+ EXPECT_EQ(original_event.candidate_pair_desc().candidate_pair_protocol, >+ logged_event.candidate_pair_protocol); >+} >+ >+void EventVerifier::VerifyLoggedIceCandidatePairEvent( > const RtcEventIceCandidatePair& original_event, >- const LoggedIceCandidatePairEvent& logged_event) { >- if (original_event.timestamp_us_ != logged_event.log_time_us()) >- return false; >- >- if (original_event.type_ != logged_event.type) >- return false; >- if (original_event.candidate_pair_id_ != logged_event.candidate_pair_id) >- return false; >+ const 
LoggedIceCandidatePairEvent& logged_event) const { >+ EXPECT_EQ(original_event.timestamp_ms(), logged_event.log_time_ms()); > >- return true; >+ EXPECT_EQ(original_event.type(), logged_event.type); >+ EXPECT_EQ(original_event.candidate_pair_id(), logged_event.candidate_pair_id); > } > >-bool VerifyLoggedRtpHeader(const RtpPacket& original_header, >+void VerifyLoggedRtpHeader(const RtpPacket& original_header, > const RTPHeader& logged_header) { > // Standard RTP header. >- if (original_header.Marker() != logged_header.markerBit) >- return false; >- if (original_header.PayloadType() != logged_header.payloadType) >- return false; >- if (original_header.SequenceNumber() != logged_header.sequenceNumber) >- return false; >- if (original_header.Timestamp() != logged_header.timestamp) >- return false; >- if (original_header.Ssrc() != logged_header.ssrc) >- return false; >- if (original_header.Csrcs().size() != logged_header.numCSRCs) >- return false; >- for (size_t i = 0; i < logged_header.numCSRCs; i++) { >- if (original_header.Csrcs()[i] != logged_header.arrOfCSRCs[i]) >- return false; >- } >+ EXPECT_EQ(original_header.Marker(), logged_header.markerBit); >+ EXPECT_EQ(original_header.PayloadType(), logged_header.payloadType); >+ EXPECT_EQ(original_header.SequenceNumber(), logged_header.sequenceNumber); >+ EXPECT_EQ(original_header.Timestamp(), logged_header.timestamp); >+ EXPECT_EQ(original_header.Ssrc(), logged_header.ssrc); > >- if (original_header.headers_size() != logged_header.headerLength) >- return false; >+ EXPECT_EQ(original_header.headers_size(), logged_header.headerLength); > > // TransmissionOffset header extension. 
>- if (original_header.HasExtension<TransmissionOffset>() != >- logged_header.extension.hasTransmissionTimeOffset) >- return false; >+ ASSERT_EQ(original_header.HasExtension<TransmissionOffset>(), >+ logged_header.extension.hasTransmissionTimeOffset); > if (logged_header.extension.hasTransmissionTimeOffset) { > int32_t offset; >- original_header.GetExtension<TransmissionOffset>(&offset); >- if (offset != logged_header.extension.transmissionTimeOffset) >- return false; >+ ASSERT_TRUE(original_header.GetExtension<TransmissionOffset>(&offset)); >+ EXPECT_EQ(offset, logged_header.extension.transmissionTimeOffset); > } > > // AbsoluteSendTime header extension. >- if (original_header.HasExtension<AbsoluteSendTime>() != >- logged_header.extension.hasAbsoluteSendTime) >- return false; >+ ASSERT_EQ(original_header.HasExtension<AbsoluteSendTime>(), >+ logged_header.extension.hasAbsoluteSendTime); > if (logged_header.extension.hasAbsoluteSendTime) { > uint32_t sendtime; >- original_header.GetExtension<AbsoluteSendTime>(&sendtime); >- if (sendtime != logged_header.extension.absoluteSendTime) >- return false; >+ ASSERT_TRUE(original_header.GetExtension<AbsoluteSendTime>(&sendtime)); >+ EXPECT_EQ(sendtime, logged_header.extension.absoluteSendTime); > } > > // TransportSequenceNumber header extension. >- if (original_header.HasExtension<TransportSequenceNumber>() != >- logged_header.extension.hasTransportSequenceNumber) >- return false; >+ ASSERT_EQ(original_header.HasExtension<TransportSequenceNumber>(), >+ logged_header.extension.hasTransportSequenceNumber); > if (logged_header.extension.hasTransportSequenceNumber) { > uint16_t seqnum; >- original_header.GetExtension<TransportSequenceNumber>(&seqnum); >- if (seqnum != logged_header.extension.transportSequenceNumber) >- return false; >+ ASSERT_TRUE(original_header.GetExtension<TransportSequenceNumber>(&seqnum)); >+ EXPECT_EQ(seqnum, logged_header.extension.transportSequenceNumber); > } > > // AudioLevel header extension. 
>- if (original_header.HasExtension<AudioLevel>() != >- logged_header.extension.hasAudioLevel) >- return false; >+ ASSERT_EQ(original_header.HasExtension<AudioLevel>(), >+ logged_header.extension.hasAudioLevel); > if (logged_header.extension.hasAudioLevel) { > bool voice_activity; > uint8_t audio_level; >- original_header.GetExtension<AudioLevel>(&voice_activity, &audio_level); >- if (voice_activity != logged_header.extension.voiceActivity) >- return false; >- if (audio_level != logged_header.extension.audioLevel) >- return false; >+ ASSERT_TRUE(original_header.GetExtension<AudioLevel>(&voice_activity, >+ &audio_level)); >+ EXPECT_EQ(voice_activity, logged_header.extension.voiceActivity); >+ EXPECT_EQ(audio_level, logged_header.extension.audioLevel); > } > > // VideoOrientation header extension. >- if (original_header.HasExtension<VideoOrientation>() != >- logged_header.extension.hasVideoRotation) >- return false; >+ ASSERT_EQ(original_header.HasExtension<VideoOrientation>(), >+ logged_header.extension.hasVideoRotation); > if (logged_header.extension.hasVideoRotation) { > uint8_t rotation; >- original_header.GetExtension<VideoOrientation>(&rotation); >- if (ConvertCVOByteToVideoRotation(rotation) != >- logged_header.extension.videoRotation) >- return false; >+ ASSERT_TRUE(original_header.GetExtension<VideoOrientation>(&rotation)); >+ EXPECT_EQ(ConvertCVOByteToVideoRotation(rotation), >+ logged_header.extension.videoRotation); > } >- >- return true; > } > >-bool VerifyLoggedRtpPacketIncoming( >+void EventVerifier::VerifyLoggedRtpPacketIncoming( > const RtcEventRtpPacketIncoming& original_event, >- const LoggedRtpPacketIncoming& logged_event) { >- if (original_event.timestamp_us_ != logged_event.log_time_us()) >- return false; >- >- if (original_event.header_.headers_size() != logged_event.rtp.header_length) >- return false; >- >- if (original_event.packet_length_ != logged_event.rtp.total_length) >- return false; >- >- if ((original_event.header_.data()[0] & 0x20) 
!= 0 && // has padding >- original_event.packet_length_ - original_event.header_.headers_size() != >- logged_event.rtp.header.paddingLength) { >- // Currently, RTC eventlog encoder-parser can only maintain padding length >- // if packet is full padding. >- // TODO(webrtc:9730): Change the condition to something like >- // original_event.padding_length_ != logged_event.rtp.header.paddingLength. >- return false; >- } >+ const LoggedRtpPacketIncoming& logged_event) const { >+ EXPECT_EQ(original_event.timestamp_ms(), logged_event.log_time_ms()); >+ >+ EXPECT_EQ(original_event.header().headers_size(), >+ logged_event.rtp.header_length); >+ >+ EXPECT_EQ(original_event.packet_length(), logged_event.rtp.total_length); > >- if (!VerifyLoggedRtpHeader(original_event.header_, logged_event.rtp.header)) >- return false; >+ // Currently, RTC eventlog encoder-parser can only maintain padding length >+ // if packet is full padding. >+ EXPECT_EQ(original_event.padding_length(), >+ logged_event.rtp.header.paddingLength); > >- return true; >+ VerifyLoggedRtpHeader(original_event.header(), logged_event.rtp.header); > } > >-bool VerifyLoggedRtpPacketOutgoing( >+void EventVerifier::VerifyLoggedRtpPacketOutgoing( > const RtcEventRtpPacketOutgoing& original_event, >- const LoggedRtpPacketOutgoing& logged_event) { >- if (original_event.timestamp_us_ != logged_event.log_time_us()) >- return false; >- >- if (original_event.header_.headers_size() != logged_event.rtp.header_length) >- return false; >- >- if (original_event.packet_length_ != logged_event.rtp.total_length) >- return false; >- >- if ((original_event.header_.data()[0] & 0x20) != 0 && // has padding >- original_event.packet_length_ - original_event.header_.headers_size() != >- logged_event.rtp.header.paddingLength) { >- // Currently, RTC eventlog encoder-parser can only maintain padding length >- // if packet is full padding. 
>- // TODO(webrtc:9730): Change the condition to something like >- // original_event.padding_length_ != logged_event.rtp.header.paddingLength. >- return false; >- } >+ const LoggedRtpPacketOutgoing& logged_event) const { >+ EXPECT_EQ(original_event.timestamp_ms(), logged_event.log_time_ms()); >+ >+ EXPECT_EQ(original_event.header().headers_size(), >+ logged_event.rtp.header_length); >+ >+ EXPECT_EQ(original_event.packet_length(), logged_event.rtp.total_length); >+ >+ // Currently, RTC eventlog encoder-parser can only maintain padding length >+ // if packet is full padding. >+ EXPECT_EQ(original_event.padding_length(), >+ logged_event.rtp.header.paddingLength); > > // TODO(terelius): Probe cluster ID isn't parsed, used or tested. Unless > // someone has a strong reason to keep it, it'll be removed. > >- if (!VerifyLoggedRtpHeader(original_event.header_, logged_event.rtp.header)) >- return false; >- >- return true; >+ VerifyLoggedRtpHeader(original_event.header(), logged_event.rtp.header); > } > >-bool VerifyLoggedRtcpPacketIncoming( >+void EventVerifier::VerifyLoggedRtcpPacketIncoming( > const RtcEventRtcpPacketIncoming& original_event, >- const LoggedRtcpPacketIncoming& logged_event) { >- if (original_event.timestamp_us_ != logged_event.log_time_us()) >- return false; >- >- if (original_event.packet_.size() != logged_event.rtcp.raw_data.size()) >- return false; >- if (memcmp(original_event.packet_.data(), logged_event.rtcp.raw_data.data(), >- original_event.packet_.size()) != 0) { >- return false; >- } >- return true; >+ const LoggedRtcpPacketIncoming& logged_event) const { >+ EXPECT_EQ(original_event.timestamp_ms(), logged_event.log_time_ms()); >+ >+ ASSERT_EQ(original_event.packet().size(), logged_event.rtcp.raw_data.size()); >+ EXPECT_EQ( >+ memcmp(original_event.packet().data(), logged_event.rtcp.raw_data.data(), >+ original_event.packet().size()), >+ 0); > } > >-bool VerifyLoggedRtcpPacketOutgoing( >+void EventVerifier::VerifyLoggedRtcpPacketOutgoing( > const 
RtcEventRtcpPacketOutgoing& original_event, >- const LoggedRtcpPacketOutgoing& logged_event) { >- if (original_event.timestamp_us_ != logged_event.log_time_us()) >- return false; >- >- if (original_event.packet_.size() != logged_event.rtcp.raw_data.size()) >- return false; >- if (memcmp(original_event.packet_.data(), logged_event.rtcp.raw_data.data(), >- original_event.packet_.size()) != 0) { >- return false; >+ const LoggedRtcpPacketOutgoing& logged_event) const { >+ EXPECT_EQ(original_event.timestamp_ms(), logged_event.log_time_ms()); >+ >+ ASSERT_EQ(original_event.packet().size(), logged_event.rtcp.raw_data.size()); >+ EXPECT_EQ( >+ memcmp(original_event.packet().data(), logged_event.rtcp.raw_data.data(), >+ original_event.packet().size()), >+ 0); >+} >+ >+void EventVerifier::VerifyLoggedStartEvent( >+ int64_t start_time_us, >+ int64_t utc_start_time_us, >+ const LoggedStartEvent& logged_event) const { >+ EXPECT_EQ(start_time_us / 1000, logged_event.log_time_ms()); >+ if (encoding_type_ == RtcEventLog::EncodingType::NewFormat) { >+ EXPECT_EQ(utc_start_time_us / 1000, logged_event.utc_start_time_ms); > } >- return true; > } > >-bool VerifyLoggedStartEvent(int64_t start_time_us, >- const LoggedStartEvent& logged_event) { >- if (start_time_us != logged_event.log_time_us()) >- return false; >- return true; >+void EventVerifier::VerifyLoggedStopEvent( >+ int64_t stop_time_us, >+ const LoggedStopEvent& logged_event) const { >+ EXPECT_EQ(stop_time_us / 1000, logged_event.log_time_ms()); > } > >-bool VerifyLoggedStopEvent(int64_t stop_time_us, >- const LoggedStopEvent& logged_event) { >- if (stop_time_us != logged_event.log_time_us()) >- return false; >- return true; >+void VerifyLoggedStreamConfig(const rtclog::StreamConfig& original_config, >+ const rtclog::StreamConfig& logged_config) { >+ EXPECT_EQ(original_config.local_ssrc, logged_config.local_ssrc); >+ EXPECT_EQ(original_config.remote_ssrc, logged_config.remote_ssrc); >+ EXPECT_EQ(original_config.rtx_ssrc, 
logged_config.rtx_ssrc); >+ >+ EXPECT_EQ(original_config.rtp_extensions.size(), >+ logged_config.rtp_extensions.size()); >+ size_t recognized_extensions = 0; >+ for (size_t i = 0; i < kMaxNumExtensions; i++) { >+ auto original_id = >+ GetExtensionId(original_config.rtp_extensions, kExtensions[i].name); >+ auto logged_id = >+ GetExtensionId(logged_config.rtp_extensions, kExtensions[i].name); >+ EXPECT_EQ(original_id, logged_id) >+ << "IDs for " << kExtensions[i].name << " don't match. Original ID " >+ << original_id.value_or(-1) << ". Parsed ID " << logged_id.value_or(-1) >+ << "."; >+ if (original_id) { >+ recognized_extensions++; >+ } >+ } >+ EXPECT_EQ(recognized_extensions, original_config.rtp_extensions.size()); > } > >-bool VerifyLoggedAudioRecvConfig( >+void EventVerifier::VerifyLoggedAudioRecvConfig( > const RtcEventAudioReceiveStreamConfig& original_event, >- const LoggedAudioRecvConfig& logged_event) { >- if (original_event.timestamp_us_ != logged_event.log_time_us()) >- return false; >- if (*original_event.config_ != logged_event.config) >- return false; >- return true; >+ const LoggedAudioRecvConfig& logged_event) const { >+ EXPECT_EQ(original_event.timestamp_ms(), logged_event.log_time_ms()); >+ VerifyLoggedStreamConfig(original_event.config(), logged_event.config); > } > >-bool VerifyLoggedAudioSendConfig( >+void EventVerifier::VerifyLoggedAudioSendConfig( > const RtcEventAudioSendStreamConfig& original_event, >- const LoggedAudioSendConfig& logged_event) { >- if (original_event.timestamp_us_ != logged_event.log_time_us()) >- return false; >- if (*original_event.config_ != logged_event.config) >- return false; >- return true; >+ const LoggedAudioSendConfig& logged_event) const { >+ EXPECT_EQ(original_event.timestamp_ms(), logged_event.log_time_ms()); >+ VerifyLoggedStreamConfig(original_event.config(), logged_event.config); > } > >-bool VerifyLoggedVideoRecvConfig( >+void EventVerifier::VerifyLoggedVideoRecvConfig( > const 
RtcEventVideoReceiveStreamConfig& original_event, >- const LoggedVideoRecvConfig& logged_event) { >- if (original_event.timestamp_us_ != logged_event.log_time_us()) >- return false; >- if (*original_event.config_ != logged_event.config) >- return false; >- return true; >+ const LoggedVideoRecvConfig& logged_event) const { >+ EXPECT_EQ(original_event.timestamp_ms(), logged_event.log_time_ms()); >+ VerifyLoggedStreamConfig(original_event.config(), logged_event.config); > } > >-bool VerifyLoggedVideoSendConfig( >+void EventVerifier::VerifyLoggedVideoSendConfig( > const RtcEventVideoSendStreamConfig& original_event, >- const LoggedVideoSendConfig& logged_event) { >- if (original_event.timestamp_us_ != logged_event.log_time_us()) >- return false; >+ const LoggedVideoSendConfig& logged_event) const { >+ EXPECT_EQ(original_event.timestamp_ms(), logged_event.log_time_ms()); > // TODO(terelius): In the past, we allowed storing multiple RtcStreamConfigs > // in the same RtcEventVideoSendStreamConfig. Look into whether we should drop > // backwards compatibility in the parser. 
>- if (logged_event.configs.size() != 1) >- return false; >- if (*original_event.config_ != logged_event.configs[0]) >- return false; >- return true; >+ ASSERT_EQ(logged_event.configs.size(), 1u); >+ VerifyLoggedStreamConfig(original_event.config(), logged_event.configs[0]); > } > > } // namespace test >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_unittest_helper.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_unittest_helper.h >index f4d18765287be4dc156633ec3bb48d3c8927d522..81760b75553608affaaae360d82ae0b908f0e8bc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_unittest_helper.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_event_log_unittest_helper.h >@@ -33,6 +33,7 @@ > #include "logging/rtc_event_log/events/rtc_event_video_receive_stream_config.h" > #include "logging/rtc_event_log/events/rtc_event_video_send_stream_config.h" > #include "logging/rtc_event_log/rtc_event_log_parser_new.h" >+#include "logging/rtc_event_log/rtc_stream_config.h" > #include "modules/rtp_rtcp/source/rtcp_packet/receiver_report.h" > #include "modules/rtp_rtcp/source/rtcp_packet/report_block.h" > #include "modules/rtp_rtcp/source/rtcp_packet/sender_report.h" >@@ -70,21 +71,32 @@ class EventGenerator { > > std::unique_ptr<RtcEventRtcpPacketOutgoing> NewRtcpPacketOutgoing(); > >+ // |all_configured_exts| determines whether the RTP packet exhibits all >+ // configured extensions, or a random subset thereof. > void RandomizeRtpPacket(size_t payload_size, > size_t padding_size, > uint32_t ssrc, > const RtpHeaderExtensionMap& extension_map, >- RtpPacket* rtp_packet); >+ RtpPacket* rtp_packet, >+ bool all_configured_exts); > >+ // |all_configured_exts| determines whether the RTP packet exhibits all >+ // configured extensions, or a random subset thereof. 
> std::unique_ptr<RtcEventRtpPacketIncoming> NewRtpPacketIncoming( > uint32_t ssrc, >- const RtpHeaderExtensionMap& extension_map); >+ const RtpHeaderExtensionMap& extension_map, >+ bool all_configured_exts = true); > >+ // |all_configured_exts| determines whether the RTP packet exhibits all >+ // configured extensions, or a random subset thereof. > std::unique_ptr<RtcEventRtpPacketOutgoing> NewRtpPacketOutgoing( > uint32_t ssrc, >- const RtpHeaderExtensionMap& extension_map); >+ const RtpHeaderExtensionMap& extension_map, >+ bool all_configured_exts = true); > >- RtpHeaderExtensionMap NewRtpHeaderExtensionMap(); >+ // |configure_all| determines whether all supported extensions are configured, >+ // or a random subset. >+ RtpHeaderExtensionMap NewRtpHeaderExtensionMap(bool configure_all = false); > > std::unique_ptr<RtcEventAudioReceiveStreamConfig> NewAudioReceiveStreamConfig( > uint32_t ssrc, >@@ -110,80 +122,111 @@ class EventGenerator { > Random prng_; > }; > >-bool VerifyLoggedAlrStateEvent(const RtcEventAlrState& original_event, >- const LoggedAlrStateEvent& logged_event); >- >-bool VerifyLoggedAudioPlayoutEvent(const RtcEventAudioPlayout& original_event, >- const LoggedAudioPlayoutEvent& logged_event); >- >-bool VerifyLoggedAudioNetworkAdaptationEvent( >- const RtcEventAudioNetworkAdaptation& original_event, >- const LoggedAudioNetworkAdaptationEvent& logged_event); >- >-bool VerifyLoggedBweDelayBasedUpdate( >- const RtcEventBweUpdateDelayBased& original_event, >- const LoggedBweDelayBasedUpdate& logged_event); >- >-bool VerifyLoggedBweLossBasedUpdate( >- const RtcEventBweUpdateLossBased& original_event, >- const LoggedBweLossBasedUpdate& logged_event); >- >-bool VerifyLoggedBweProbeClusterCreatedEvent( >- const RtcEventProbeClusterCreated& original_event, >- const LoggedBweProbeClusterCreatedEvent& logged_event); >- >-bool VerifyLoggedBweProbeFailureEvent( >- const RtcEventProbeResultFailure& original_event, >- const LoggedBweProbeFailureEvent& 
logged_event); >- >-bool VerifyLoggedBweProbeSuccessEvent( >- const RtcEventProbeResultSuccess& original_event, >- const LoggedBweProbeSuccessEvent& logged_event); >- >-bool VerifyLoggedIceCandidatePairConfig( >- const RtcEventIceCandidatePairConfig& original_event, >- const LoggedIceCandidatePairConfig& logged_event); >- >-bool VerifyLoggedIceCandidatePairEvent( >- const RtcEventIceCandidatePair& original_event, >- const LoggedIceCandidatePairEvent& logged_event); >- >-bool VerifyLoggedRtpPacketIncoming( >- const RtcEventRtpPacketIncoming& original_event, >- const LoggedRtpPacketIncoming& logged_event); >- >-bool VerifyLoggedRtpPacketOutgoing( >- const RtcEventRtpPacketOutgoing& original_event, >- const LoggedRtpPacketOutgoing& logged_event); >- >-bool VerifyLoggedRtcpPacketIncoming( >- const RtcEventRtcpPacketIncoming& original_event, >- const LoggedRtcpPacketIncoming& logged_event); >- >-bool VerifyLoggedRtcpPacketOutgoing( >- const RtcEventRtcpPacketOutgoing& original_event, >- const LoggedRtcpPacketOutgoing& logged_event); >- >-bool VerifyLoggedStartEvent(int64_t start_time_us, >- const LoggedStartEvent& logged_event); >-bool VerifyLoggedStopEvent(int64_t stop_time_us, >- const LoggedStopEvent& logged_event); >- >-bool VerifyLoggedAudioRecvConfig( >- const RtcEventAudioReceiveStreamConfig& original_event, >- const LoggedAudioRecvConfig& logged_event); >- >-bool VerifyLoggedAudioSendConfig( >- const RtcEventAudioSendStreamConfig& original_event, >- const LoggedAudioSendConfig& logged_event); >- >-bool VerifyLoggedVideoRecvConfig( >- const RtcEventVideoReceiveStreamConfig& original_event, >- const LoggedVideoRecvConfig& logged_event); >+class EventVerifier { >+ public: >+ explicit EventVerifier(RtcEventLog::EncodingType encoding_type) >+ : encoding_type_(encoding_type) {} >+ >+ void VerifyLoggedAlrStateEvent(const RtcEventAlrState& original_event, >+ const LoggedAlrStateEvent& logged_event) const; >+ >+ void VerifyLoggedAudioPlayoutEvent( >+ const 
RtcEventAudioPlayout& original_event, >+ const LoggedAudioPlayoutEvent& logged_event) const; >+ >+ void VerifyLoggedAudioNetworkAdaptationEvent( >+ const RtcEventAudioNetworkAdaptation& original_event, >+ const LoggedAudioNetworkAdaptationEvent& logged_event) const; >+ >+ void VerifyLoggedBweDelayBasedUpdate( >+ const RtcEventBweUpdateDelayBased& original_event, >+ const LoggedBweDelayBasedUpdate& logged_event) const; >+ >+ void VerifyLoggedBweLossBasedUpdate( >+ const RtcEventBweUpdateLossBased& original_event, >+ const LoggedBweLossBasedUpdate& logged_event) const; >+ >+ void VerifyLoggedBweProbeClusterCreatedEvent( >+ const RtcEventProbeClusterCreated& original_event, >+ const LoggedBweProbeClusterCreatedEvent& logged_event) const; >+ >+ void VerifyLoggedBweProbeFailureEvent( >+ const RtcEventProbeResultFailure& original_event, >+ const LoggedBweProbeFailureEvent& logged_event) const; >+ >+ void VerifyLoggedBweProbeSuccessEvent( >+ const RtcEventProbeResultSuccess& original_event, >+ const LoggedBweProbeSuccessEvent& logged_event) const; >+ >+ void VerifyLoggedIceCandidatePairConfig( >+ const RtcEventIceCandidatePairConfig& original_event, >+ const LoggedIceCandidatePairConfig& logged_event) const; >+ >+ void VerifyLoggedIceCandidatePairEvent( >+ const RtcEventIceCandidatePair& original_event, >+ const LoggedIceCandidatePairEvent& logged_event) const; >+ >+ void VerifyLoggedRtpPacketIncoming( >+ const RtcEventRtpPacketIncoming& original_event, >+ const LoggedRtpPacketIncoming& logged_event) const; >+ >+ void VerifyLoggedRtpPacketOutgoing( >+ const RtcEventRtpPacketOutgoing& original_event, >+ const LoggedRtpPacketOutgoing& logged_event) const; >+ >+ template <typename EventType, typename ParsedType> >+ void VerifyLoggedRtpPacket(const EventType& original_event, >+ const ParsedType& logged_event) { >+ static_assert(sizeof(ParsedType) == 0, >+ "You have to use one of the two defined template " >+ "specializations of VerifyLoggedRtpPacket"); >+ } >+ >+ template <> 
>+ void VerifyLoggedRtpPacket(const RtcEventRtpPacketIncoming& original_event, >+ const LoggedRtpPacketIncoming& logged_event) { >+ VerifyLoggedRtpPacketIncoming(original_event, logged_event); >+ } >+ >+ template <> >+ void VerifyLoggedRtpPacket(const RtcEventRtpPacketOutgoing& original_event, >+ const LoggedRtpPacketOutgoing& logged_event) { >+ VerifyLoggedRtpPacketOutgoing(original_event, logged_event); >+ } >+ >+ void VerifyLoggedRtcpPacketIncoming( >+ const RtcEventRtcpPacketIncoming& original_event, >+ const LoggedRtcpPacketIncoming& logged_event) const; >+ >+ void VerifyLoggedRtcpPacketOutgoing( >+ const RtcEventRtcpPacketOutgoing& original_event, >+ const LoggedRtcpPacketOutgoing& logged_event) const; >+ >+ void VerifyLoggedStartEvent(int64_t start_time_us, >+ int64_t utc_start_time_us, >+ const LoggedStartEvent& logged_event) const; >+ void VerifyLoggedStopEvent(int64_t stop_time_us, >+ const LoggedStopEvent& logged_event) const; >+ >+ void VerifyLoggedAudioRecvConfig( >+ const RtcEventAudioReceiveStreamConfig& original_event, >+ const LoggedAudioRecvConfig& logged_event) const; >+ >+ void VerifyLoggedAudioSendConfig( >+ const RtcEventAudioSendStreamConfig& original_event, >+ const LoggedAudioSendConfig& logged_event) const; >+ >+ void VerifyLoggedVideoRecvConfig( >+ const RtcEventVideoReceiveStreamConfig& original_event, >+ const LoggedVideoRecvConfig& logged_event) const; >+ >+ void VerifyLoggedVideoSendConfig( >+ const RtcEventVideoSendStreamConfig& original_event, >+ const LoggedVideoSendConfig& logged_event) const; > >-bool VerifyLoggedVideoSendConfig( >- const RtcEventVideoSendStreamConfig& original_event, >- const LoggedVideoSendConfig& logged_event); >+ private: >+ RtcEventLog::EncodingType encoding_type_; >+}; > > } // namespace test > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_stream_config.h b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_stream_config.h >index 
d95c8359ece829f70527ba418cb2348c9e576e88..a9da4814e6b388481022bafc4b5aab7f8d494df0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_stream_config.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/logging/rtc_event_log/rtc_stream_config.h >@@ -11,12 +11,12 @@ > #ifndef LOGGING_RTC_EVENT_LOG_RTC_STREAM_CONFIG_H_ > #define LOGGING_RTC_EVENT_LOG_RTC_STREAM_CONFIG_H_ > >+#include <stdint.h> > #include <string> > #include <vector> > > #include "api/rtp_headers.h" > #include "api/rtpparameters.h" >-#include "common_types.h" // NOLINT(build/include) > > namespace webrtc { > namespace rtclog { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/media/BUILD.gn >index 858b272f5580dc2a4b16efff37d4087b4ce6ec18..b6f98c67b7278227d17c6676464d9f821369e3b8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/BUILD.gn >@@ -42,6 +42,7 @@ rtc_source_set("rtc_h264_profile_id") { > "..:webrtc_common", > "../rtc_base:rtc_base", > "../rtc_base:rtc_base_approved", >+ "../rtc_base/system:rtc_export", > "//third_party/abseil-cpp/absl/types:optional", > ] > } >@@ -136,7 +137,9 @@ rtc_static_library("rtc_media_base") { > "../modules/audio_processing:audio_processing_statistics", > "../rtc_base:rtc_base", > "../rtc_base:rtc_base_approved", >+ "../rtc_base/system:rtc_export", > "../rtc_base/third_party/sigslot", >+ "//third_party/abseil-cpp/absl/strings", > "//third_party/abseil-cpp/absl/types:optional", > ] > >@@ -159,6 +162,52 @@ rtc_static_library("rtc_constants") { > ] > } > >+rtc_static_library("rtc_simulcast_encoder_adapter") { >+ visibility = [ "*" ] >+ defines = [] >+ libs = [] >+ sources = [ >+ "engine/simulcast_encoder_adapter.cc", >+ "engine/simulcast_encoder_adapter.h", >+ ] >+ deps = [ >+ "../api/video:video_bitrate_allocation", >+ "../api/video:video_frame_i420", >+ "../api/video_codecs:video_codecs_api", >+ 
"../modules/video_coding:video_codec_interface", >+ "../modules/video_coding:video_coding_utility", >+ "../rtc_base:checks", >+ "../rtc_base:rtc_base_approved", >+ "../rtc_base:sequenced_task_checker", >+ "../system_wrappers", >+ "../system_wrappers:field_trial", >+ "//third_party/abseil-cpp/absl/types:optional", >+ "//third_party/libyuv", >+ ] >+ if (!build_with_chromium && is_clang) { >+ # Suppress warnings from the Chromium Clang plugin (bugs.webrtc.org/163). >+ suppressed_configs += [ "//build/config/clang:find_bad_constructs" ] >+ } >+} >+ >+rtc_static_library("rtc_vp8_encoder_simulcast_proxy") { >+ visibility = [ "*" ] >+ defines = [] >+ libs = [] >+ sources = [ >+ "engine/vp8_encoder_simulcast_proxy.cc", >+ "engine/vp8_encoder_simulcast_proxy.h", >+ ] >+ deps = [ >+ ":rtc_simulcast_encoder_adapter", >+ "../api/video_codecs:video_codecs_api", >+ ] >+ if (!build_with_chromium && is_clang) { >+ # Suppress warnings from the Chromium Clang plugin (bugs.webrtc.org/163). >+ suppressed_configs += [ "//build/config/clang:find_bad_constructs" ] >+ } >+} >+ > rtc_static_library("rtc_internal_video_codecs") { > visibility = [ "*" ] > allow_poison = [ "software_video_codecs" ] >@@ -166,8 +215,9 @@ rtc_static_library("rtc_internal_video_codecs") { > libs = [] > deps = [ > ":rtc_h264_profile_id", >+ ":rtc_simulcast_encoder_adapter", >+ ":rtc_vp8_encoder_simulcast_proxy", > "../modules/video_coding:video_codec_interface", >- "../system_wrappers:field_trial", > "//third_party/abseil-cpp/absl/memory", > ] > sources = [ >@@ -183,9 +233,9 @@ rtc_static_library("rtc_internal_video_codecs") { > "engine/scopedvideodecoder.h", > "engine/scopedvideoencoder.cc", > "engine/scopedvideoencoder.h", >- "engine/simulcast_encoder_adapter.cc", >- "engine/simulcast_encoder_adapter.h", >- "engine/vp8_encoder_simulcast_proxy.cc", >+ >+ # TODO(bugs.webrtc.org/7925): stop exporting this header once downstream >+ # targets depend on :rtc_vp8_encoder_simulcast_proxy directly. 
> "engine/vp8_encoder_simulcast_proxy.h", > "engine/webrtcvideodecoderfactory.h", > "engine/webrtcvideoencoderfactory.h", >@@ -202,24 +252,20 @@ rtc_static_library("rtc_internal_video_codecs") { > deps += [ > ":rtc_constants", > ":rtc_media_base", >- "..:webrtc_common", > "../api/video:video_bitrate_allocation", >- "../api/video:video_frame_i420", >+ "../api/video:video_frame", > "../api/video_codecs:rtc_software_fallback_wrappers", > "../api/video_codecs:video_codecs_api", > "../call:call_interfaces", > "../call:video_stream_api", >- "../modules/video_coding:video_coding_utility", > "../modules/video_coding:webrtc_h264", > "../modules/video_coding:webrtc_multiplex", > "../modules/video_coding:webrtc_vp8", > "../modules/video_coding:webrtc_vp9", > "../rtc_base:checks", > "../rtc_base:rtc_base_approved", >- "../rtc_base:sequenced_task_checker", >- "../system_wrappers", >- "//third_party/abseil-cpp/absl/types:optional", >- "//third_party/libyuv", >+ "../rtc_base/system:rtc_export", >+ "//third_party/abseil-cpp/absl/strings", > ] > } > >@@ -232,12 +278,15 @@ rtc_static_library("rtc_audio_video") { > defines = [] > libs = [] > deps = [ >+ "../api/video:video_bitrate_allocator_factory", >+ "../modules/audio_processing:api", > "../modules/audio_processing/aec_dump:aec_dump", > "../modules/video_coding:video_codec_interface", > "../modules/video_coding:video_coding", > "../modules/video_coding:video_coding_utility", > "../rtc_base:audio_format_to_string", > "../rtc_base:checks", >+ "../rtc_base/system:rtc_export", > "../rtc_base/third_party/base64", > "../system_wrappers:field_trial", > "../system_wrappers:metrics", >@@ -305,6 +354,7 @@ rtc_static_library("rtc_audio_video") { > "../api:libjingle_peerconnection_api", > "../api:transport_api", > "../api/audio_codecs:audio_codecs_api", >+ "../api/video:builtin_video_bitrate_allocator_factory", > "../api/video:video_frame", > "../api/video:video_frame_i420", > "../api/video_codecs:rtc_software_fallback_wrappers", >@@ -322,7 
+372,10 @@ rtc_static_library("rtc_audio_video") { > "../rtc_base:rtc_base", > "../rtc_base:rtc_task_queue", > "../rtc_base:stringutils", >+ "../rtc_base/experiments:normalize_simulcast_size_experiment", > "../system_wrappers", >+ "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings", > "//third_party/abseil-cpp/absl/types:optional", > ] > } >@@ -391,24 +444,29 @@ if (rtc_include_tests) { > include_dirs = [] > deps = [ > ":rtc_audio_video", >+ ":rtc_simulcast_encoder_adapter", > "../api:libjingle_peerconnection_api", > "../api/video:video_frame_i420", > "../call:video_stream_api", > "../common_video:common_video", > "../modules/audio_coding:rent_a_codec", >+ "../modules/audio_processing:api", > "../modules/audio_processing:audio_processing", > "../modules/rtp_rtcp:rtp_rtcp_format", > "../modules/video_coding:video_codec_interface", > "../modules/video_coding:video_coding_utility", > "../p2p:rtc_p2p", > "../rtc_base:checks", >+ "../rtc_base:gunit_helpers", > "../rtc_base:rtc_task_queue", > "../rtc_base:stringutils", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings", > ] > sources = [ > "base/fakeframesource.cc", > "base/fakeframesource.h", >+ "base/fakemediaengine.cc", > "base/fakemediaengine.h", > "base/fakenetworkinterface.h", > "base/fakertp.cc", >@@ -450,7 +508,6 @@ if (rtc_include_tests) { > "../call:mock_rtp_interfaces", > "../rtc_base:rtc_base", > "../rtc_base:rtc_base_approved", >- "../rtc_base:rtc_base_tests_utils", > "../rtc_base:rtc_task_queue_for_test", > "../rtc_base/third_party/sigslot", > "../test:test_support", >@@ -486,8 +543,10 @@ if (rtc_include_tests) { > ":rtc_audio_video", > ":rtc_constants", > ":rtc_data", >+ "../api/test/video:function_video_factory", > "../api/units:time_delta", > "../api/video:video_frame_i420", >+ "../modules/audio_processing:api", > "../modules/audio_processing:mocks", > "../modules/rtp_rtcp", > "../modules/video_coding:video_codec_interface", >@@ -495,11 
+554,13 @@ if (rtc_include_tests) { > "../pc:rtc_pc", > "../pc:rtc_pc_base", > "../rtc_base:checks", >+ "../rtc_base:gunit_helpers", >+ "../rtc_base:rtc_base_tests_utils", > "../rtc_base:rtc_task_queue", > "../rtc_base:stringutils", > "../test:field_trial", >- "../test:test_common", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings", > ] > sources = [ > "base/codec_unittest.cc", >@@ -568,13 +629,18 @@ if (rtc_include_tests) { > ":rtc_media", > ":rtc_media_base", > ":rtc_media_tests_utils", >+ ":rtc_simulcast_encoder_adapter", >+ ":rtc_vp8_encoder_simulcast_proxy", > ":rtc_vp9_profile", > "../api:create_simulcast_test_fixture_api", > "../api:libjingle_peerconnection_api", >+ "../api:mock_video_bitrate_allocator", >+ "../api:mock_video_bitrate_allocator_factory", > "../api:mock_video_codec_factory", > "../api:simulcast_test_fixture_api", > "../api/audio_codecs:builtin_audio_decoder_factory", > "../api/audio_codecs:builtin_audio_encoder_factory", >+ "../api/video:builtin_video_bitrate_allocator_factory", > "../api/video:video_bitrate_allocation", > "../api/video:video_frame", > "../api/video_codecs:builtin_video_decoder_factory", >@@ -592,7 +658,6 @@ if (rtc_include_tests) { > "../rtc_base:rtc_base", > "../rtc_base:rtc_base_approved", > "../rtc_base:rtc_base_tests_main", >- "../rtc_base:rtc_base_tests_utils", > "../test:audio_codec_mocks", > "../test:test_support", > "../test:video_test_common", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/OWNERS b/Source/ThirdParty/libwebrtc/Source/webrtc/media/OWNERS >index 634eaf2bfa79e1871a7b0608cf5c0205562b7c52..4544e9f4806138670823db84ba3ebbaa4d47f331 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/OWNERS >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/OWNERS >@@ -4,6 +4,7 @@ sprang@webrtc.org > magjed@webrtc.org > mflodman@webrtc.org > perkj@webrtc.org >+shampson@webrtc.org > solenberg@webrtc.org > > # These are for the common case of adding or renaming 
files. If you're doing >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/codec.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/codec.cc >index caf32121125b1dbec0ca8cf20f96189489315fe8..0a1c7156af80e0599d19e52752b8e2d731a51b3c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/codec.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/codec.cc >@@ -12,13 +12,13 @@ > > #include <algorithm> > >+#include "absl/strings/match.h" > #include "media/base/h264_profile_level_id.h" > #include "media/base/vp9_profile.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" > #include "rtc_base/stringencode.h" > #include "rtc_base/strings/string_builder.h" >-#include "rtc_base/stringutils.h" > > namespace cricket { > >@@ -26,8 +26,8 @@ FeedbackParams::FeedbackParams() = default; > FeedbackParams::~FeedbackParams() = default; > > bool FeedbackParam::operator==(const FeedbackParam& other) const { >- return _stricmp(other.id().c_str(), id().c_str()) == 0 && >- _stricmp(other.param().c_str(), param().c_str()) == 0; >+ return absl::EqualsIgnoreCase(other.id(), id()) && >+ absl::EqualsIgnoreCase(other.param(), param()); > } > > bool FeedbackParams::operator==(const FeedbackParams& other) const { >@@ -97,7 +97,7 @@ bool Codec::Matches(const Codec& codec) const { > const int kMaxStaticPayloadId = 95; > return (id <= kMaxStaticPayloadId || codec.id <= kMaxStaticPayloadId) > ? 
(id == codec.id) >- : (_stricmp(name.c_str(), codec.name.c_str()) == 0); >+ : (absl::EqualsIgnoreCase(name, codec.name)); > } > > bool Codec::GetParam(const std::string& name, std::string* out) const { >@@ -235,7 +235,7 @@ VideoCodec& VideoCodec::operator=(const VideoCodec& c) = default; > VideoCodec& VideoCodec::operator=(VideoCodec&& c) = default; > > void VideoCodec::SetDefaultParameters() { >- if (_stricmp(kH264CodecName, name.c_str()) == 0) { >+ if (absl::EqualsIgnoreCase(kH264CodecName, name)) { > // This default is set for all H.264 codecs created because > // that was the default before packetization mode support was added. > // TODO(hta): Move this to the places that create VideoCodecs from >@@ -268,10 +268,10 @@ static bool IsSameH264PacketizationMode(const CodecParameterMap& ours, > bool VideoCodec::Matches(const VideoCodec& other) const { > if (!Codec::Matches(other)) > return false; >- if (CodecNamesEq(name.c_str(), kH264CodecName)) >+ if (absl::EqualsIgnoreCase(name, kH264CodecName)) > return webrtc::H264::IsSameH264Profile(params, other.params) && > IsSameH264PacketizationMode(params, other.params); >- if (CodecNamesEq(name.c_str(), kVp9CodecName)) >+ if (absl::EqualsIgnoreCase(name, kVp9CodecName)) > return webrtc::IsSameVP9Profile(params, other.params); > return true; > } >@@ -285,16 +285,16 @@ VideoCodec VideoCodec::CreateRtxCodec(int rtx_payload_type, > > VideoCodec::CodecType VideoCodec::GetCodecType() const { > const char* payload_name = name.c_str(); >- if (_stricmp(payload_name, kRedCodecName) == 0) { >+ if (absl::EqualsIgnoreCase(payload_name, kRedCodecName)) { > return CODEC_RED; > } >- if (_stricmp(payload_name, kUlpfecCodecName) == 0) { >+ if (absl::EqualsIgnoreCase(payload_name, kUlpfecCodecName)) { > return CODEC_ULPFEC; > } >- if (_stricmp(payload_name, kFlexfecCodecName) == 0) { >+ if (absl::EqualsIgnoreCase(payload_name, kFlexfecCodecName)) { > return CODEC_FLEXFEC; > } >- if (_stricmp(payload_name, kRtxCodecName) == 0) { >+ if 
(absl::EqualsIgnoreCase(payload_name, kRtxCodecName)) { > return CODEC_RTX; > } > >@@ -362,14 +362,6 @@ bool HasTransportCc(const Codec& codec) { > FeedbackParam(kRtcpFbParamTransportCc, kParamValueEmpty)); > } > >-bool CodecNamesEq(const std::string& name1, const std::string& name2) { >- return CodecNamesEq(name1.c_str(), name2.c_str()); >-} >- >-bool CodecNamesEq(const char* name1, const char* name2) { >- return _stricmp(name1, name2) == 0; >-} >- > const VideoCodec* FindMatchingCodec( > const std::vector<VideoCodec>& supported_codecs, > const VideoCodec& codec) { >@@ -387,12 +379,12 @@ bool IsSameCodec(const std::string& name1, > const std::string& name2, > const CodecParameterMap& params2) { > // If different names (case insensitive), then not same formats. >- if (!CodecNamesEq(name1, name2)) >+ if (!absl::EqualsIgnoreCase(name1, name2)) > return false; > // For every format besides H264 and VP9, comparing names is enough. >- if (CodecNamesEq(name1.c_str(), kH264CodecName)) >+ if (absl::EqualsIgnoreCase(name1, kH264CodecName)) > return webrtc::H264::IsSameH264Profile(params1, params2); >- if (CodecNamesEq(name1.c_str(), kVp9CodecName)) >+ if (absl::EqualsIgnoreCase(name1, kVp9CodecName)) > return webrtc::IsSameVP9Profile(params1, params2); > return true; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/codec.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/codec.h >index db133f6e27f644f51e0d79da829cc253f0013922..0ca6535287beeb29ff5598df1f25cbaf14090d53 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/codec.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/codec.h >@@ -18,8 +18,8 @@ > > #include "api/rtpparameters.h" > #include "api/video_codecs/sdp_video_format.h" >-#include "common_types.h" // NOLINT(build/include) > #include "media/base/mediaconstants.h" >+#include "rtc_base/system/rtc_export.h" > > namespace cricket { > >@@ -62,7 +62,7 @@ class FeedbackParams { > std::vector<FeedbackParam> params_; 
> }; > >-struct Codec { >+struct RTC_EXPORT Codec { > int id; > std::string name; > int clockrate; >@@ -142,7 +142,7 @@ struct AudioCodec : public Codec { > bool operator!=(const AudioCodec& c) const { return !(*this == c); } > }; > >-struct VideoCodec : public Codec { >+struct RTC_EXPORT VideoCodec : public Codec { > // Creates a codec with the given parameters. > VideoCodec(int id, const std::string& name); > // Creates a codec with the given name and empty id. >@@ -215,8 +215,6 @@ const Codec* FindCodecById(const std::vector<Codec>& codecs, int payload_type) { > return nullptr; > } > >-bool CodecNamesEq(const std::string& name1, const std::string& name2); >-bool CodecNamesEq(const char* name1, const char* name2); > bool HasNack(const Codec& codec); > bool HasRemb(const Codec& codec); > bool HasRrtr(const Codec& codec); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/fakemediaengine.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/fakemediaengine.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..b48206fd77772ac9388b0860e29f558850fe339a >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/fakemediaengine.cc >@@ -0,0 +1,598 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "media/base/fakemediaengine.h" >+ >+#include <utility> >+ >+#include "absl/memory/memory.h" >+#include "absl/strings/match.h" >+#include "rtc_base/checks.h" >+ >+namespace cricket { >+ >+FakeVoiceMediaChannel::DtmfInfo::DtmfInfo(uint32_t ssrc, >+ int event_code, >+ int duration) >+ : ssrc(ssrc), event_code(event_code), duration(duration) {} >+ >+FakeVoiceMediaChannel::VoiceChannelAudioSink::VoiceChannelAudioSink( >+ AudioSource* source) >+ : source_(source) { >+ source_->SetSink(this); >+} >+FakeVoiceMediaChannel::VoiceChannelAudioSink::~VoiceChannelAudioSink() { >+ if (source_) { >+ source_->SetSink(nullptr); >+ } >+} >+void FakeVoiceMediaChannel::VoiceChannelAudioSink::OnData( >+ const void* audio_data, >+ int bits_per_sample, >+ int sample_rate, >+ size_t number_of_channels, >+ size_t number_of_frames) {} >+void FakeVoiceMediaChannel::VoiceChannelAudioSink::OnClose() { >+ source_ = nullptr; >+} >+AudioSource* FakeVoiceMediaChannel::VoiceChannelAudioSink::source() const { >+ return source_; >+} >+ >+FakeVoiceMediaChannel::FakeVoiceMediaChannel(FakeVoiceEngine* engine, >+ const AudioOptions& options) >+ : engine_(engine), max_bps_(-1) { >+ output_scalings_[0] = 1.0; // For default channel. 
>+ SetOptions(options); >+} >+FakeVoiceMediaChannel::~FakeVoiceMediaChannel() { >+ if (engine_) { >+ engine_->UnregisterChannel(this); >+ } >+} >+const std::vector<AudioCodec>& FakeVoiceMediaChannel::recv_codecs() const { >+ return recv_codecs_; >+} >+const std::vector<AudioCodec>& FakeVoiceMediaChannel::send_codecs() const { >+ return send_codecs_; >+} >+const std::vector<AudioCodec>& FakeVoiceMediaChannel::codecs() const { >+ return send_codecs(); >+} >+const std::vector<FakeVoiceMediaChannel::DtmfInfo>& >+FakeVoiceMediaChannel::dtmf_info_queue() const { >+ return dtmf_info_queue_; >+} >+const AudioOptions& FakeVoiceMediaChannel::options() const { >+ return options_; >+} >+int FakeVoiceMediaChannel::max_bps() const { >+ return max_bps_; >+} >+bool FakeVoiceMediaChannel::SetSendParameters( >+ const AudioSendParameters& params) { >+ set_send_rtcp_parameters(params.rtcp); >+ return (SetSendCodecs(params.codecs) && >+ SetSendExtmapAllowMixed(params.extmap_allow_mixed) && >+ SetSendRtpHeaderExtensions(params.extensions) && >+ SetMaxSendBandwidth(params.max_bandwidth_bps) && >+ SetOptions(params.options)); >+} >+bool FakeVoiceMediaChannel::SetRecvParameters( >+ const AudioRecvParameters& params) { >+ set_recv_rtcp_parameters(params.rtcp); >+ return (SetRecvCodecs(params.codecs) && >+ SetRecvRtpHeaderExtensions(params.extensions)); >+} >+void FakeVoiceMediaChannel::SetPlayout(bool playout) { >+ set_playout(playout); >+} >+void FakeVoiceMediaChannel::SetSend(bool send) { >+ set_sending(send); >+} >+bool FakeVoiceMediaChannel::SetAudioSend(uint32_t ssrc, >+ bool enable, >+ const AudioOptions* options, >+ AudioSource* source) { >+ if (!SetLocalSource(ssrc, source)) { >+ return false; >+ } >+ if (!RtpHelper<VoiceMediaChannel>::MuteStream(ssrc, !enable)) { >+ return false; >+ } >+ if (enable && options) { >+ return SetOptions(*options); >+ } >+ return true; >+} >+bool FakeVoiceMediaChannel::HasSource(uint32_t ssrc) const { >+ return local_sinks_.find(ssrc) != 
local_sinks_.end(); >+} >+bool FakeVoiceMediaChannel::AddRecvStream(const StreamParams& sp) { >+ if (!RtpHelper<VoiceMediaChannel>::AddRecvStream(sp)) >+ return false; >+ output_scalings_[sp.first_ssrc()] = 1.0; >+ return true; >+} >+bool FakeVoiceMediaChannel::RemoveRecvStream(uint32_t ssrc) { >+ if (!RtpHelper<VoiceMediaChannel>::RemoveRecvStream(ssrc)) >+ return false; >+ output_scalings_.erase(ssrc); >+ return true; >+} >+bool FakeVoiceMediaChannel::CanInsertDtmf() { >+ for (std::vector<AudioCodec>::const_iterator it = send_codecs_.begin(); >+ it != send_codecs_.end(); ++it) { >+ // Find the DTMF telephone event "codec". >+ if (absl::EqualsIgnoreCase(it->name, "telephone-event")) { >+ return true; >+ } >+ } >+ return false; >+} >+bool FakeVoiceMediaChannel::InsertDtmf(uint32_t ssrc, >+ int event_code, >+ int duration) { >+ dtmf_info_queue_.push_back(DtmfInfo(ssrc, event_code, duration)); >+ return true; >+} >+bool FakeVoiceMediaChannel::SetOutputVolume(uint32_t ssrc, double volume) { >+ if (0 == ssrc) { >+ std::map<uint32_t, double>::iterator it; >+ for (it = output_scalings_.begin(); it != output_scalings_.end(); ++it) { >+ it->second = volume; >+ } >+ return true; >+ } else if (output_scalings_.find(ssrc) != output_scalings_.end()) { >+ output_scalings_[ssrc] = volume; >+ return true; >+ } >+ return false; >+} >+bool FakeVoiceMediaChannel::GetOutputVolume(uint32_t ssrc, double* volume) { >+ if (output_scalings_.find(ssrc) == output_scalings_.end()) >+ return false; >+ *volume = output_scalings_[ssrc]; >+ return true; >+} >+bool FakeVoiceMediaChannel::GetStats(VoiceMediaInfo* info) { >+ return false; >+} >+void FakeVoiceMediaChannel::SetRawAudioSink( >+ uint32_t ssrc, >+ std::unique_ptr<webrtc::AudioSinkInterface> sink) { >+ sink_ = std::move(sink); >+} >+std::vector<webrtc::RtpSource> FakeVoiceMediaChannel::GetSources( >+ uint32_t ssrc) const { >+ return std::vector<webrtc::RtpSource>(); >+} >+bool FakeVoiceMediaChannel::SetRecvCodecs( >+ const 
std::vector<AudioCodec>& codecs) { >+ if (fail_set_recv_codecs()) { >+ // Fake the failure in SetRecvCodecs. >+ return false; >+ } >+ recv_codecs_ = codecs; >+ return true; >+} >+bool FakeVoiceMediaChannel::SetSendCodecs( >+ const std::vector<AudioCodec>& codecs) { >+ if (fail_set_send_codecs()) { >+ // Fake the failure in SetSendCodecs. >+ return false; >+ } >+ send_codecs_ = codecs; >+ return true; >+} >+bool FakeVoiceMediaChannel::SetMaxSendBandwidth(int bps) { >+ max_bps_ = bps; >+ return true; >+} >+bool FakeVoiceMediaChannel::SetOptions(const AudioOptions& options) { >+ // Does a "merge" of current options and set options. >+ options_.SetAll(options); >+ return true; >+} >+bool FakeVoiceMediaChannel::SetLocalSource(uint32_t ssrc, AudioSource* source) { >+ auto it = local_sinks_.find(ssrc); >+ if (source) { >+ if (it != local_sinks_.end()) { >+ RTC_CHECK(it->second->source() == source); >+ } else { >+ local_sinks_.insert(std::make_pair( >+ ssrc, absl::make_unique<VoiceChannelAudioSink>(source))); >+ } >+ } else { >+ if (it != local_sinks_.end()) { >+ local_sinks_.erase(it); >+ } >+ } >+ return true; >+} >+ >+bool CompareDtmfInfo(const FakeVoiceMediaChannel::DtmfInfo& info, >+ uint32_t ssrc, >+ int event_code, >+ int duration) { >+ return (info.duration == duration && info.event_code == event_code && >+ info.ssrc == ssrc); >+} >+ >+FakeVideoMediaChannel::FakeVideoMediaChannel(FakeVideoEngine* engine, >+ const VideoOptions& options) >+ : engine_(engine), max_bps_(-1) { >+ SetOptions(options); >+} >+FakeVideoMediaChannel::~FakeVideoMediaChannel() { >+ if (engine_) { >+ engine_->UnregisterChannel(this); >+ } >+} >+const std::vector<VideoCodec>& FakeVideoMediaChannel::recv_codecs() const { >+ return recv_codecs_; >+} >+const std::vector<VideoCodec>& FakeVideoMediaChannel::send_codecs() const { >+ return send_codecs_; >+} >+const std::vector<VideoCodec>& FakeVideoMediaChannel::codecs() const { >+ return send_codecs(); >+} >+bool FakeVideoMediaChannel::rendering() 
const { >+ return playout(); >+} >+const VideoOptions& FakeVideoMediaChannel::options() const { >+ return options_; >+} >+const std::map<uint32_t, rtc::VideoSinkInterface<webrtc::VideoFrame>*>& >+FakeVideoMediaChannel::sinks() const { >+ return sinks_; >+} >+int FakeVideoMediaChannel::max_bps() const { >+ return max_bps_; >+} >+bool FakeVideoMediaChannel::SetSendParameters( >+ const VideoSendParameters& params) { >+ set_send_rtcp_parameters(params.rtcp); >+ return (SetSendCodecs(params.codecs) && >+ SetSendExtmapAllowMixed(params.extmap_allow_mixed) && >+ SetSendRtpHeaderExtensions(params.extensions) && >+ SetMaxSendBandwidth(params.max_bandwidth_bps)); >+} >+bool FakeVideoMediaChannel::SetRecvParameters( >+ const VideoRecvParameters& params) { >+ set_recv_rtcp_parameters(params.rtcp); >+ return (SetRecvCodecs(params.codecs) && >+ SetRecvRtpHeaderExtensions(params.extensions)); >+} >+bool FakeVideoMediaChannel::AddSendStream(const StreamParams& sp) { >+ return RtpHelper<VideoMediaChannel>::AddSendStream(sp); >+} >+bool FakeVideoMediaChannel::RemoveSendStream(uint32_t ssrc) { >+ return RtpHelper<VideoMediaChannel>::RemoveSendStream(ssrc); >+} >+bool FakeVideoMediaChannel::GetSendCodec(VideoCodec* send_codec) { >+ if (send_codecs_.empty()) { >+ return false; >+ } >+ *send_codec = send_codecs_[0]; >+ return true; >+} >+bool FakeVideoMediaChannel::SetSink( >+ uint32_t ssrc, >+ rtc::VideoSinkInterface<webrtc::VideoFrame>* sink) { >+ if (ssrc != 0 && sinks_.find(ssrc) == sinks_.end()) { >+ return false; >+ } >+ if (ssrc != 0) { >+ sinks_[ssrc] = sink; >+ } >+ return true; >+} >+bool FakeVideoMediaChannel::HasSink(uint32_t ssrc) const { >+ return sinks_.find(ssrc) != sinks_.end() && sinks_.at(ssrc) != nullptr; >+} >+bool FakeVideoMediaChannel::SetSend(bool send) { >+ return set_sending(send); >+} >+bool FakeVideoMediaChannel::SetVideoSend( >+ uint32_t ssrc, >+ const VideoOptions* options, >+ rtc::VideoSourceInterface<webrtc::VideoFrame>* source) { >+ if (options) { >+ if 
(!SetOptions(*options)) { >+ return false; >+ } >+ } >+ sources_[ssrc] = source; >+ return true; >+} >+bool FakeVideoMediaChannel::HasSource(uint32_t ssrc) const { >+ return sources_.find(ssrc) != sources_.end() && sources_.at(ssrc) != nullptr; >+} >+bool FakeVideoMediaChannel::AddRecvStream(const StreamParams& sp) { >+ if (!RtpHelper<VideoMediaChannel>::AddRecvStream(sp)) >+ return false; >+ sinks_[sp.first_ssrc()] = NULL; >+ return true; >+} >+bool FakeVideoMediaChannel::RemoveRecvStream(uint32_t ssrc) { >+ if (!RtpHelper<VideoMediaChannel>::RemoveRecvStream(ssrc)) >+ return false; >+ sinks_.erase(ssrc); >+ return true; >+} >+void FakeVideoMediaChannel::FillBitrateInfo(BandwidthEstimationInfo* bwe_info) { >+} >+bool FakeVideoMediaChannel::GetStats(VideoMediaInfo* info) { >+ return false; >+} >+std::vector<webrtc::RtpSource> FakeVideoMediaChannel::GetSources( >+ uint32_t ssrc) const { >+ return {}; >+} >+bool FakeVideoMediaChannel::SetRecvCodecs( >+ const std::vector<VideoCodec>& codecs) { >+ if (fail_set_recv_codecs()) { >+ // Fake the failure in SetRecvCodecs. >+ return false; >+ } >+ recv_codecs_ = codecs; >+ return true; >+} >+bool FakeVideoMediaChannel::SetSendCodecs( >+ const std::vector<VideoCodec>& codecs) { >+ if (fail_set_send_codecs()) { >+ // Fake the failure in SetSendCodecs. 
>+ return false; >+ } >+ send_codecs_ = codecs; >+ >+ return true; >+} >+bool FakeVideoMediaChannel::SetOptions(const VideoOptions& options) { >+ options_ = options; >+ return true; >+} >+bool FakeVideoMediaChannel::SetMaxSendBandwidth(int bps) { >+ max_bps_ = bps; >+ return true; >+} >+ >+FakeDataMediaChannel::FakeDataMediaChannel(void* unused, >+ const DataOptions& options) >+ : send_blocked_(false), max_bps_(-1) {} >+FakeDataMediaChannel::~FakeDataMediaChannel() {} >+const std::vector<DataCodec>& FakeDataMediaChannel::recv_codecs() const { >+ return recv_codecs_; >+} >+const std::vector<DataCodec>& FakeDataMediaChannel::send_codecs() const { >+ return send_codecs_; >+} >+const std::vector<DataCodec>& FakeDataMediaChannel::codecs() const { >+ return send_codecs(); >+} >+int FakeDataMediaChannel::max_bps() const { >+ return max_bps_; >+} >+bool FakeDataMediaChannel::SetSendParameters(const DataSendParameters& params) { >+ set_send_rtcp_parameters(params.rtcp); >+ return (SetSendCodecs(params.codecs) && >+ SetMaxSendBandwidth(params.max_bandwidth_bps)); >+} >+bool FakeDataMediaChannel::SetRecvParameters(const DataRecvParameters& params) { >+ set_recv_rtcp_parameters(params.rtcp); >+ return SetRecvCodecs(params.codecs); >+} >+bool FakeDataMediaChannel::SetSend(bool send) { >+ return set_sending(send); >+} >+bool FakeDataMediaChannel::SetReceive(bool receive) { >+ set_playout(receive); >+ return true; >+} >+bool FakeDataMediaChannel::AddRecvStream(const StreamParams& sp) { >+ if (!RtpHelper<DataMediaChannel>::AddRecvStream(sp)) >+ return false; >+ return true; >+} >+bool FakeDataMediaChannel::RemoveRecvStream(uint32_t ssrc) { >+ if (!RtpHelper<DataMediaChannel>::RemoveRecvStream(ssrc)) >+ return false; >+ return true; >+} >+bool FakeDataMediaChannel::SendData(const SendDataParams& params, >+ const rtc::CopyOnWriteBuffer& payload, >+ SendDataResult* result) { >+ if (send_blocked_) { >+ *result = SDR_BLOCK; >+ return false; >+ } else { >+ last_sent_data_params_ = 
params; >+ last_sent_data_ = std::string(payload.data<char>(), payload.size()); >+ return true; >+ } >+} >+SendDataParams FakeDataMediaChannel::last_sent_data_params() { >+ return last_sent_data_params_; >+} >+std::string FakeDataMediaChannel::last_sent_data() { >+ return last_sent_data_; >+} >+bool FakeDataMediaChannel::is_send_blocked() { >+ return send_blocked_; >+} >+void FakeDataMediaChannel::set_send_blocked(bool blocked) { >+ send_blocked_ = blocked; >+} >+bool FakeDataMediaChannel::SetRecvCodecs(const std::vector<DataCodec>& codecs) { >+ if (fail_set_recv_codecs()) { >+ // Fake the failure in SetRecvCodecs. >+ return false; >+ } >+ recv_codecs_ = codecs; >+ return true; >+} >+bool FakeDataMediaChannel::SetSendCodecs(const std::vector<DataCodec>& codecs) { >+ if (fail_set_send_codecs()) { >+ // Fake the failure in SetSendCodecs. >+ return false; >+ } >+ send_codecs_ = codecs; >+ return true; >+} >+bool FakeDataMediaChannel::SetMaxSendBandwidth(int bps) { >+ max_bps_ = bps; >+ return true; >+} >+ >+FakeVoiceEngine::FakeVoiceEngine() : fail_create_channel_(false) { >+ // Add a fake audio codec. Note that the name must not be "" as there are >+ // sanity checks against that. 
>+ codecs_.push_back(AudioCodec(101, "fake_audio_codec", 0, 0, 1)); >+} >+RtpCapabilities FakeVoiceEngine::GetCapabilities() const { >+ return RtpCapabilities(); >+} >+void FakeVoiceEngine::Init() {} >+rtc::scoped_refptr<webrtc::AudioState> FakeVoiceEngine::GetAudioState() const { >+ return rtc::scoped_refptr<webrtc::AudioState>(); >+} >+VoiceMediaChannel* FakeVoiceEngine::CreateMediaChannel( >+ webrtc::Call* call, >+ const MediaConfig& config, >+ const AudioOptions& options, >+ const webrtc::CryptoOptions& crypto_options) { >+ if (fail_create_channel_) { >+ return nullptr; >+ } >+ >+ FakeVoiceMediaChannel* ch = new FakeVoiceMediaChannel(this, options); >+ channels_.push_back(ch); >+ return ch; >+} >+FakeVoiceMediaChannel* FakeVoiceEngine::GetChannel(size_t index) { >+ return (channels_.size() > index) ? channels_[index] : NULL; >+} >+void FakeVoiceEngine::UnregisterChannel(VoiceMediaChannel* channel) { >+ channels_.erase(std::find(channels_.begin(), channels_.end(), channel)); >+} >+const std::vector<AudioCodec>& FakeVoiceEngine::send_codecs() const { >+ return codecs_; >+} >+const std::vector<AudioCodec>& FakeVoiceEngine::recv_codecs() const { >+ return codecs_; >+} >+void FakeVoiceEngine::SetCodecs(const std::vector<AudioCodec>& codecs) { >+ codecs_ = codecs; >+} >+int FakeVoiceEngine::GetInputLevel() { >+ return 0; >+} >+bool FakeVoiceEngine::StartAecDump(rtc::PlatformFile file, >+ int64_t max_size_bytes) { >+ return false; >+} >+void FakeVoiceEngine::StopAecDump() {} >+bool FakeVoiceEngine::StartRtcEventLog(rtc::PlatformFile file, >+ int64_t max_size_bytes) { >+ return false; >+} >+void FakeVoiceEngine::StopRtcEventLog() {} >+ >+FakeVideoEngine::FakeVideoEngine() >+ : capture_(false), fail_create_channel_(false) { >+ // Add a fake video codec. Note that the name must not be "" as there are >+ // sanity checks against that. 
>+ codecs_.push_back(VideoCodec(0, "fake_video_codec")); >+} >+RtpCapabilities FakeVideoEngine::GetCapabilities() const { >+ return RtpCapabilities(); >+} >+bool FakeVideoEngine::SetOptions(const VideoOptions& options) { >+ options_ = options; >+ return true; >+} >+VideoMediaChannel* FakeVideoEngine::CreateMediaChannel( >+ webrtc::Call* call, >+ const MediaConfig& config, >+ const VideoOptions& options, >+ const webrtc::CryptoOptions& crypto_options) { >+ if (fail_create_channel_) { >+ return nullptr; >+ } >+ >+ FakeVideoMediaChannel* ch = new FakeVideoMediaChannel(this, options); >+ channels_.emplace_back(ch); >+ return ch; >+} >+FakeVideoMediaChannel* FakeVideoEngine::GetChannel(size_t index) { >+ return (channels_.size() > index) ? channels_[index] : nullptr; >+} >+void FakeVideoEngine::UnregisterChannel(VideoMediaChannel* channel) { >+ auto it = std::find(channels_.begin(), channels_.end(), channel); >+ RTC_DCHECK(it != channels_.end()); >+ channels_.erase(it); >+} >+std::vector<VideoCodec> FakeVideoEngine::codecs() const { >+ return codecs_; >+} >+void FakeVideoEngine::SetCodecs(const std::vector<VideoCodec> codecs) { >+ codecs_ = codecs; >+} >+bool FakeVideoEngine::SetCapture(bool capture) { >+ capture_ = capture; >+ return true; >+} >+ >+FakeMediaEngine::FakeMediaEngine() >+ : CompositeMediaEngine(absl::make_unique<FakeVoiceEngine>(), >+ absl::make_unique<FakeVideoEngine>()), >+ voice_(static_cast<FakeVoiceEngine*>(&voice())), >+ video_(static_cast<FakeVideoEngine*>(&video())) {} >+FakeMediaEngine::~FakeMediaEngine() {} >+void FakeMediaEngine::SetAudioCodecs(const std::vector<AudioCodec>& codecs) { >+ voice_->SetCodecs(codecs); >+} >+void FakeMediaEngine::SetVideoCodecs(const std::vector<VideoCodec>& codecs) { >+ video_->SetCodecs(codecs); >+} >+ >+FakeVoiceMediaChannel* FakeMediaEngine::GetVoiceChannel(size_t index) { >+ return voice_->GetChannel(index); >+} >+FakeVideoMediaChannel* FakeMediaEngine::GetVideoChannel(size_t index) { >+ return 
video_->GetChannel(index); >+} >+ >+void FakeMediaEngine::set_fail_create_channel(bool fail) { >+ voice_->fail_create_channel_ = fail; >+ video_->fail_create_channel_ = fail; >+} >+ >+DataMediaChannel* FakeDataEngine::CreateChannel(const MediaConfig& config) { >+ FakeDataMediaChannel* ch = new FakeDataMediaChannel(this, DataOptions()); >+ channels_.push_back(ch); >+ return ch; >+} >+FakeDataMediaChannel* FakeDataEngine::GetChannel(size_t index) { >+ return (channels_.size() > index) ? channels_[index] : NULL; >+} >+void FakeDataEngine::UnregisterChannel(DataMediaChannel* channel) { >+ channels_.erase(std::find(channels_.begin(), channels_.end(), channel)); >+} >+void FakeDataEngine::SetDataCodecs(const std::vector<DataCodec>& data_codecs) { >+ data_codecs_ = data_codecs; >+} >+const std::vector<DataCodec>& FakeDataEngine::data_codecs() { >+ return data_codecs_; >+} >+ >+} // namespace cricket >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/fakemediaengine.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/fakemediaengine.h >index 62bd6ba5ba50f214a69f9f71bb436f3e9f161dea..e3a0638715282cba7a21bb1f75dc1ea66fcd6406 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/fakemediaengine.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/fakemediaengine.h >@@ -17,10 +17,8 @@ > #include <set> > #include <string> > #include <tuple> >-#include <utility> > #include <vector> > >-#include "absl/memory/memory.h" > #include "api/call/audio_sink.h" > #include "media/base/audiosource.h" > #include "media/base/mediaengine.h" >@@ -28,10 +26,8 @@ > #include "media/base/streamparams.h" > #include "media/engine/webrtcvideoengine.h" > #include "modules/audio_processing/include/audio_processing.h" >-#include "rtc_base/checks.h" > #include "rtc_base/copyonwritebuffer.h" > #include "rtc_base/networkroute.h" >-#include "rtc_base/stringutils.h" > > using webrtc::RtpExtension; > >@@ -128,7 +124,7 @@ class RtpHelper : public Base { > } > 
receive_streams_.push_back(sp); > rtp_receive_parameters_[sp.first_ssrc()] = >- CreateRtpParametersWithOneEncoding(); >+ CreateRtpParametersWithEncodings(sp); > return true; > } > virtual bool RemoveRecvStream(uint32_t ssrc) { >@@ -256,6 +252,12 @@ class RtpHelper : public Base { > recv_extensions_ = extensions; > return true; > } >+ bool SetSendExtmapAllowMixed(bool extmap_allow_mixed) { >+ if (Base::ExtmapAllowMixed() != extmap_allow_mixed) { >+ Base::SetExtmapAllowMixed(extmap_allow_mixed); >+ } >+ return true; >+ } > bool SetSendRtpHeaderExtensions(const std::vector<RtpExtension>& extensions) { > send_extensions_ = extensions; > return true; >@@ -267,11 +269,11 @@ class RtpHelper : public Base { > recv_rtcp_parameters_ = params; > } > virtual void OnPacketReceived(rtc::CopyOnWriteBuffer* packet, >- const rtc::PacketTime& packet_time) { >+ int64_t packet_time_us) { > rtp_packets_.push_back(std::string(packet->data<char>(), packet->size())); > } > virtual void OnRtcpReceived(rtc::CopyOnWriteBuffer* packet, >- const rtc::PacketTime& packet_time) { >+ int64_t packet_time_us) { > rtcp_packets_.push_back(std::string(packet->data<char>(), packet->size())); > } > virtual void OnReadyToSend(bool ready) { ready_to_send_ = ready; } >@@ -312,187 +314,72 @@ class RtpHelper : public Base { > class FakeVoiceMediaChannel : public RtpHelper<VoiceMediaChannel> { > public: > struct DtmfInfo { >- DtmfInfo(uint32_t ssrc, int event_code, int duration) >- : ssrc(ssrc), event_code(event_code), duration(duration) {} >+ DtmfInfo(uint32_t ssrc, int event_code, int duration); > uint32_t ssrc; > int event_code; > int duration; > }; > explicit FakeVoiceMediaChannel(FakeVoiceEngine* engine, >- const AudioOptions& options) >- : engine_(engine), max_bps_(-1) { >- output_scalings_[0] = 1.0; // For default channel. 
>- SetOptions(options); >- } >+ const AudioOptions& options); > ~FakeVoiceMediaChannel(); >- const std::vector<AudioCodec>& recv_codecs() const { return recv_codecs_; } >- const std::vector<AudioCodec>& send_codecs() const { return send_codecs_; } >- const std::vector<AudioCodec>& codecs() const { return send_codecs(); } >- const std::vector<DtmfInfo>& dtmf_info_queue() const { >- return dtmf_info_queue_; >- } >- const AudioOptions& options() const { return options_; } >- int max_bps() const { return max_bps_; } >- virtual bool SetSendParameters(const AudioSendParameters& params) { >- set_send_rtcp_parameters(params.rtcp); >- return (SetSendCodecs(params.codecs) && >- SetSendRtpHeaderExtensions(params.extensions) && >- SetMaxSendBandwidth(params.max_bandwidth_bps) && >- SetOptions(params.options)); >- } >- >- virtual bool SetRecvParameters(const AudioRecvParameters& params) { >- set_recv_rtcp_parameters(params.rtcp); >- return (SetRecvCodecs(params.codecs) && >- SetRecvRtpHeaderExtensions(params.extensions)); >- } >- >- virtual void SetPlayout(bool playout) { set_playout(playout); } >- virtual void SetSend(bool send) { set_sending(send); } >- virtual bool SetAudioSend(uint32_t ssrc, >- bool enable, >- const AudioOptions* options, >- AudioSource* source) { >- if (!SetLocalSource(ssrc, source)) { >- return false; >- } >- if (!RtpHelper<VoiceMediaChannel>::MuteStream(ssrc, !enable)) { >- return false; >- } >- if (enable && options) { >- return SetOptions(*options); >- } >- return true; >- } >+ const std::vector<AudioCodec>& recv_codecs() const; >+ const std::vector<AudioCodec>& send_codecs() const; >+ const std::vector<AudioCodec>& codecs() const; >+ const std::vector<DtmfInfo>& dtmf_info_queue() const; >+ const AudioOptions& options() const; >+ int max_bps() const; >+ bool SetSendParameters(const AudioSendParameters& params) override; > >- bool HasSource(uint32_t ssrc) const { >- return local_sinks_.find(ssrc) != local_sinks_.end(); >- } >+ bool 
SetRecvParameters(const AudioRecvParameters& params) override; > >- virtual bool AddRecvStream(const StreamParams& sp) { >- if (!RtpHelper<VoiceMediaChannel>::AddRecvStream(sp)) >- return false; >- output_scalings_[sp.first_ssrc()] = 1.0; >- return true; >- } >- virtual bool RemoveRecvStream(uint32_t ssrc) { >- if (!RtpHelper<VoiceMediaChannel>::RemoveRecvStream(ssrc)) >- return false; >- output_scalings_.erase(ssrc); >- return true; >- } >+ void SetPlayout(bool playout) override; >+ void SetSend(bool send) override; >+ bool SetAudioSend(uint32_t ssrc, >+ bool enable, >+ const AudioOptions* options, >+ AudioSource* source) override; > >- virtual bool CanInsertDtmf() { >- for (std::vector<AudioCodec>::const_iterator it = send_codecs_.begin(); >- it != send_codecs_.end(); ++it) { >- // Find the DTMF telephone event "codec". >- if (_stricmp(it->name.c_str(), "telephone-event") == 0) { >- return true; >- } >- } >- return false; >- } >- virtual bool InsertDtmf(uint32_t ssrc, int event_code, int duration) { >- dtmf_info_queue_.push_back(DtmfInfo(ssrc, event_code, duration)); >- return true; >- } >+ bool HasSource(uint32_t ssrc) const; > >- virtual bool SetOutputVolume(uint32_t ssrc, double volume) { >- if (0 == ssrc) { >- std::map<uint32_t, double>::iterator it; >- for (it = output_scalings_.begin(); it != output_scalings_.end(); ++it) { >- it->second = volume; >- } >- return true; >- } else if (output_scalings_.find(ssrc) != output_scalings_.end()) { >- output_scalings_[ssrc] = volume; >- return true; >- } >- return false; >- } >- bool GetOutputVolume(uint32_t ssrc, double* volume) { >- if (output_scalings_.find(ssrc) == output_scalings_.end()) >- return false; >- *volume = output_scalings_[ssrc]; >- return true; >- } >+ bool AddRecvStream(const StreamParams& sp) override; >+ bool RemoveRecvStream(uint32_t ssrc) override; >+ >+ bool CanInsertDtmf() override; >+ bool InsertDtmf(uint32_t ssrc, int event_code, int duration) override; >+ >+ bool SetOutputVolume(uint32_t 
ssrc, double volume) override; >+ bool GetOutputVolume(uint32_t ssrc, double* volume); > >- virtual bool GetStats(VoiceMediaInfo* info) { return false; } >+ bool GetStats(VoiceMediaInfo* info) override; > >- virtual void SetRawAudioSink( >+ void SetRawAudioSink( > uint32_t ssrc, >- std::unique_ptr<webrtc::AudioSinkInterface> sink) { >- sink_ = std::move(sink); >- } >+ std::unique_ptr<webrtc::AudioSinkInterface> sink) override; > >- virtual std::vector<webrtc::RtpSource> GetSources(uint32_t ssrc) const { >- return std::vector<webrtc::RtpSource>(); >- } >+ std::vector<webrtc::RtpSource> GetSources(uint32_t ssrc) const override; > > private: > class VoiceChannelAudioSink : public AudioSource::Sink { > public: >- explicit VoiceChannelAudioSink(AudioSource* source) : source_(source) { >- source_->SetSink(this); >- } >- virtual ~VoiceChannelAudioSink() { >- if (source_) { >- source_->SetSink(nullptr); >- } >- } >+ explicit VoiceChannelAudioSink(AudioSource* source); >+ ~VoiceChannelAudioSink() override; > void OnData(const void* audio_data, > int bits_per_sample, > int sample_rate, > size_t number_of_channels, >- size_t number_of_frames) override {} >- void OnClose() override { source_ = nullptr; } >- AudioSource* source() const { return source_; } >+ size_t number_of_frames) override; >+ void OnClose() override; >+ AudioSource* source() const; > > private: > AudioSource* source_; > }; > >- bool SetRecvCodecs(const std::vector<AudioCodec>& codecs) { >- if (fail_set_recv_codecs()) { >- // Fake the failure in SetRecvCodecs. >- return false; >- } >- recv_codecs_ = codecs; >- return true; >- } >- bool SetSendCodecs(const std::vector<AudioCodec>& codecs) { >- if (fail_set_send_codecs()) { >- // Fake the failure in SetSendCodecs. 
>- return false; >- } >- send_codecs_ = codecs; >- return true; >- } >- bool SetMaxSendBandwidth(int bps) { >- max_bps_ = bps; >- return true; >- } >- bool SetOptions(const AudioOptions& options) { >- // Does a "merge" of current options and set options. >- options_.SetAll(options); >- return true; >- } >- bool SetLocalSource(uint32_t ssrc, AudioSource* source) { >- auto it = local_sinks_.find(ssrc); >- if (source) { >- if (it != local_sinks_.end()) { >- RTC_CHECK(it->second->source() == source); >- } else { >- local_sinks_.insert(std::make_pair( >- ssrc, absl::make_unique<VoiceChannelAudioSink>(source))); >- } >- } else { >- if (it != local_sinks_.end()) { >- local_sinks_.erase(it); >- } >- } >- return true; >- } >+ bool SetRecvCodecs(const std::vector<AudioCodec>& codecs); >+ bool SetSendCodecs(const std::vector<AudioCodec>& codecs); >+ bool SetMaxSendBandwidth(int bps); >+ bool SetOptions(const AudioOptions& options); >+ bool SetLocalSource(uint32_t ssrc, AudioSource* source); > > FakeVoiceEngine* engine_; > std::vector<AudioCodec> recv_codecs_; >@@ -506,136 +393,55 @@ class FakeVoiceMediaChannel : public RtpHelper<VoiceMediaChannel> { > }; > > // A helper function to compare the FakeVoiceMediaChannel::DtmfInfo. 
>-inline bool CompareDtmfInfo(const FakeVoiceMediaChannel::DtmfInfo& info, >- uint32_t ssrc, >- int event_code, >- int duration) { >- return (info.duration == duration && info.event_code == event_code && >- info.ssrc == ssrc); >-} >+bool CompareDtmfInfo(const FakeVoiceMediaChannel::DtmfInfo& info, >+ uint32_t ssrc, >+ int event_code, >+ int duration); > > class FakeVideoMediaChannel : public RtpHelper<VideoMediaChannel> { > public: >- FakeVideoMediaChannel(FakeVideoEngine* engine, const VideoOptions& options) >- : engine_(engine), max_bps_(-1) { >- SetOptions(options); >- } >+ FakeVideoMediaChannel(FakeVideoEngine* engine, const VideoOptions& options); > > ~FakeVideoMediaChannel(); > >- const std::vector<VideoCodec>& recv_codecs() const { return recv_codecs_; } >- const std::vector<VideoCodec>& send_codecs() const { return send_codecs_; } >- const std::vector<VideoCodec>& codecs() const { return send_codecs(); } >- bool rendering() const { return playout(); } >- const VideoOptions& options() const { return options_; } >+ const std::vector<VideoCodec>& recv_codecs() const; >+ const std::vector<VideoCodec>& send_codecs() const; >+ const std::vector<VideoCodec>& codecs() const; >+ bool rendering() const; >+ const VideoOptions& options() const; > const std::map<uint32_t, rtc::VideoSinkInterface<webrtc::VideoFrame>*>& >- sinks() const { >- return sinks_; >- } >- int max_bps() const { return max_bps_; } >- bool SetSendParameters(const VideoSendParameters& params) override { >- set_send_rtcp_parameters(params.rtcp); >- return (SetSendCodecs(params.codecs) && >- SetSendRtpHeaderExtensions(params.extensions) && >- SetMaxSendBandwidth(params.max_bandwidth_bps)); >- } >- bool SetRecvParameters(const VideoRecvParameters& params) override { >- set_recv_rtcp_parameters(params.rtcp); >- return (SetRecvCodecs(params.codecs) && >- SetRecvRtpHeaderExtensions(params.extensions)); >- } >- bool AddSendStream(const StreamParams& sp) override { >- return 
RtpHelper<VideoMediaChannel>::AddSendStream(sp); >- } >- bool RemoveSendStream(uint32_t ssrc) override { >- return RtpHelper<VideoMediaChannel>::RemoveSendStream(ssrc); >- } >- >- bool GetSendCodec(VideoCodec* send_codec) override { >- if (send_codecs_.empty()) { >- return false; >- } >- *send_codec = send_codecs_[0]; >- return true; >- } >+ sinks() const; >+ int max_bps() const; >+ bool SetSendParameters(const VideoSendParameters& params) override; >+ bool SetRecvParameters(const VideoRecvParameters& params) override; >+ bool AddSendStream(const StreamParams& sp) override; >+ bool RemoveSendStream(uint32_t ssrc) override; >+ >+ bool GetSendCodec(VideoCodec* send_codec) override; > bool SetSink(uint32_t ssrc, >- rtc::VideoSinkInterface<webrtc::VideoFrame>* sink) override { >- if (ssrc != 0 && sinks_.find(ssrc) == sinks_.end()) { >- return false; >- } >- if (ssrc != 0) { >- sinks_[ssrc] = sink; >- } >- return true; >- } >- bool HasSink(uint32_t ssrc) const { >- return sinks_.find(ssrc) != sinks_.end() && sinks_.at(ssrc) != nullptr; >- } >+ rtc::VideoSinkInterface<webrtc::VideoFrame>* sink) override; >+ bool HasSink(uint32_t ssrc) const; > >- bool SetSend(bool send) override { return set_sending(send); } >+ bool SetSend(bool send) override; > bool SetVideoSend( > uint32_t ssrc, > const VideoOptions* options, >- rtc::VideoSourceInterface<webrtc::VideoFrame>* source) override { >- if (options) { >- if (!SetOptions(*options)) { >- return false; >- } >- } >- sources_[ssrc] = source; >- return true; >- } >+ rtc::VideoSourceInterface<webrtc::VideoFrame>* source) override; > >- bool HasSource(uint32_t ssrc) const { >- return sources_.find(ssrc) != sources_.end() && >- sources_.at(ssrc) != nullptr; >- } >- bool AddRecvStream(const StreamParams& sp) override { >- if (!RtpHelper<VideoMediaChannel>::AddRecvStream(sp)) >- return false; >- sinks_[sp.first_ssrc()] = NULL; >- return true; >- } >- bool RemoveRecvStream(uint32_t ssrc) override { >- if 
(!RtpHelper<VideoMediaChannel>::RemoveRecvStream(ssrc)) >- return false; >- sinks_.erase(ssrc); >- return true; >- } >+ bool HasSource(uint32_t ssrc) const; >+ bool AddRecvStream(const StreamParams& sp) override; >+ bool RemoveRecvStream(uint32_t ssrc) override; > >- void FillBitrateInfo(BandwidthEstimationInfo* bwe_info) override {} >- bool GetStats(VideoMediaInfo* info) override { return false; } >+ void FillBitrateInfo(BandwidthEstimationInfo* bwe_info) override; >+ bool GetStats(VideoMediaInfo* info) override; > >- std::vector<webrtc::RtpSource> GetSources(uint32_t ssrc) const override { >- return {}; >- } >+ std::vector<webrtc::RtpSource> GetSources(uint32_t ssrc) const override; > > private: >- bool SetRecvCodecs(const std::vector<VideoCodec>& codecs) { >- if (fail_set_recv_codecs()) { >- // Fake the failure in SetRecvCodecs. >- return false; >- } >- recv_codecs_ = codecs; >- return true; >- } >- bool SetSendCodecs(const std::vector<VideoCodec>& codecs) { >- if (fail_set_send_codecs()) { >- // Fake the failure in SetSendCodecs. 
>- return false; >- } >- send_codecs_ = codecs; >- >- return true; >- } >- bool SetOptions(const VideoOptions& options) { >- options_ = options; >- return true; >- } >- bool SetMaxSendBandwidth(int bps) { >- max_bps_ = bps; >- return true; >- } >+ bool SetRecvCodecs(const std::vector<VideoCodec>& codecs); >+ bool SetSendCodecs(const std::vector<VideoCodec>& codecs); >+ bool SetOptions(const VideoOptions& options); >+ bool SetMaxSendBandwidth(int bps); > > FakeVideoEngine* engine_; > std::vector<VideoCodec> recv_codecs_; >@@ -652,78 +458,33 @@ class DataOptions {}; > > class FakeDataMediaChannel : public RtpHelper<DataMediaChannel> { > public: >- explicit FakeDataMediaChannel(void* unused, const DataOptions& options) >- : send_blocked_(false), max_bps_(-1) {} >- ~FakeDataMediaChannel() {} >- const std::vector<DataCodec>& recv_codecs() const { return recv_codecs_; } >- const std::vector<DataCodec>& send_codecs() const { return send_codecs_; } >- const std::vector<DataCodec>& codecs() const { return send_codecs(); } >- int max_bps() const { return max_bps_; } >- >- virtual bool SetSendParameters(const DataSendParameters& params) { >- set_send_rtcp_parameters(params.rtcp); >- return (SetSendCodecs(params.codecs) && >- SetMaxSendBandwidth(params.max_bandwidth_bps)); >- } >- virtual bool SetRecvParameters(const DataRecvParameters& params) { >- set_recv_rtcp_parameters(params.rtcp); >- return SetRecvCodecs(params.codecs); >- } >- virtual bool SetSend(bool send) { return set_sending(send); } >- virtual bool SetReceive(bool receive) { >- set_playout(receive); >- return true; >- } >- virtual bool AddRecvStream(const StreamParams& sp) { >- if (!RtpHelper<DataMediaChannel>::AddRecvStream(sp)) >- return false; >- return true; >- } >- virtual bool RemoveRecvStream(uint32_t ssrc) { >- if (!RtpHelper<DataMediaChannel>::RemoveRecvStream(ssrc)) >- return false; >- return true; >- } >- >- virtual bool SendData(const SendDataParams& params, >- const rtc::CopyOnWriteBuffer& payload, >- 
SendDataResult* result) { >- if (send_blocked_) { >- *result = SDR_BLOCK; >- return false; >- } else { >- last_sent_data_params_ = params; >- last_sent_data_ = std::string(payload.data<char>(), payload.size()); >- return true; >- } >- } >- >- SendDataParams last_sent_data_params() { return last_sent_data_params_; } >- std::string last_sent_data() { return last_sent_data_; } >- bool is_send_blocked() { return send_blocked_; } >- void set_send_blocked(bool blocked) { send_blocked_ = blocked; } >+ explicit FakeDataMediaChannel(void* unused, const DataOptions& options); >+ ~FakeDataMediaChannel(); >+ const std::vector<DataCodec>& recv_codecs() const; >+ const std::vector<DataCodec>& send_codecs() const; >+ const std::vector<DataCodec>& codecs() const; >+ int max_bps() const; >+ >+ bool SetSendParameters(const DataSendParameters& params) override; >+ bool SetRecvParameters(const DataRecvParameters& params) override; >+ bool SetSend(bool send) override; >+ bool SetReceive(bool receive) override; >+ bool AddRecvStream(const StreamParams& sp) override; >+ bool RemoveRecvStream(uint32_t ssrc) override; >+ >+ bool SendData(const SendDataParams& params, >+ const rtc::CopyOnWriteBuffer& payload, >+ SendDataResult* result) override; >+ >+ SendDataParams last_sent_data_params(); >+ std::string last_sent_data(); >+ bool is_send_blocked(); >+ void set_send_blocked(bool blocked); > > private: >- bool SetRecvCodecs(const std::vector<DataCodec>& codecs) { >- if (fail_set_recv_codecs()) { >- // Fake the failure in SetRecvCodecs. >- return false; >- } >- recv_codecs_ = codecs; >- return true; >- } >- bool SetSendCodecs(const std::vector<DataCodec>& codecs) { >- if (fail_set_send_codecs()) { >- // Fake the failure in SetSendCodecs. 
>- return false; >- } >- send_codecs_ = codecs; >- return true; >- } >- bool SetMaxSendBandwidth(int bps) { >- max_bps_ = bps; >- return true; >- } >+ bool SetRecvCodecs(const std::vector<DataCodec>& codecs); >+ bool SetSendCodecs(const std::vector<DataCodec>& codecs); >+ bool SetMaxSendBandwidth(int bps); > > std::vector<DataCodec> recv_codecs_; > std::vector<DataCodec> send_codecs_; >@@ -733,233 +494,98 @@ class FakeDataMediaChannel : public RtpHelper<DataMediaChannel> { > int max_bps_; > }; > >-// A base class for all of the shared parts between FakeVoiceEngine >-// and FakeVideoEngine. >-class FakeBaseEngine { >- public: >- FakeBaseEngine() : options_changed_(false), fail_create_channel_(false) {} >- void set_fail_create_channel(bool fail) { fail_create_channel_ = fail; } >- >- RtpCapabilities GetCapabilities() const { return capabilities_; } >- void set_rtp_header_extensions(const std::vector<RtpExtension>& extensions) { >- capabilities_.header_extensions = extensions; >- } >- >- void set_rtp_header_extensions( >- const std::vector<cricket::RtpHeaderExtension>& extensions) { >- for (const cricket::RtpHeaderExtension& ext : extensions) { >- RtpExtension webrtc_ext; >- webrtc_ext.uri = ext.uri; >- webrtc_ext.id = ext.id; >- capabilities_.header_extensions.push_back(webrtc_ext); >- } >- } >- >- protected: >- // Flag used by optionsmessagehandler_unittest for checking whether any >- // relevant setting has been updated. >- // TODO(thaloun): Replace with explicit checks of before & after values. >- bool options_changed_; >- bool fail_create_channel_; >- RtpCapabilities capabilities_; >-}; >- >-class FakeVoiceEngine : public FakeBaseEngine { >+class FakeVoiceEngine : public VoiceEngineInterface { > public: >- FakeVoiceEngine() { >- // Add a fake audio codec. Note that the name must not be "" as there are >- // sanity checks against that. 
>- codecs_.push_back(AudioCodec(101, "fake_audio_codec", 0, 0, 1)); >- } >- void Init() {} >- rtc::scoped_refptr<webrtc::AudioState> GetAudioState() const { >- return rtc::scoped_refptr<webrtc::AudioState>(); >- } >- >- VoiceMediaChannel* CreateChannel(webrtc::Call* call, >- const MediaConfig& config, >- const AudioOptions& options) { >- if (fail_create_channel_) { >- return nullptr; >- } >- >- FakeVoiceMediaChannel* ch = new FakeVoiceMediaChannel(this, options); >- channels_.push_back(ch); >- return ch; >- } >- FakeVoiceMediaChannel* GetChannel(size_t index) { >- return (channels_.size() > index) ? channels_[index] : NULL; >- } >- void UnregisterChannel(VoiceMediaChannel* channel) { >- channels_.erase(std::find(channels_.begin(), channels_.end(), channel)); >- } >+ FakeVoiceEngine(); >+ RtpCapabilities GetCapabilities() const override; >+ void Init() override; >+ rtc::scoped_refptr<webrtc::AudioState> GetAudioState() const override; >+ >+ VoiceMediaChannel* CreateMediaChannel( >+ webrtc::Call* call, >+ const MediaConfig& config, >+ const AudioOptions& options, >+ const webrtc::CryptoOptions& crypto_options) override; >+ FakeVoiceMediaChannel* GetChannel(size_t index); >+ void UnregisterChannel(VoiceMediaChannel* channel); > > // TODO(ossu): For proper testing, These should either individually settable > // or the voice engine should reference mockable factories. 
>- const std::vector<AudioCodec>& send_codecs() { return codecs_; } >- const std::vector<AudioCodec>& recv_codecs() { return codecs_; } >- void SetCodecs(const std::vector<AudioCodec>& codecs) { codecs_ = codecs; } >- >- int GetInputLevel() { return 0; } >- >- bool StartAecDump(rtc::PlatformFile file, int64_t max_size_bytes) { >- return false; >- } >- >- void StopAecDump() {} >- >- bool StartRtcEventLog(rtc::PlatformFile file, int64_t max_size_bytes) { >- return false; >- } >- >- void StopRtcEventLog() {} >+ const std::vector<AudioCodec>& send_codecs() const override; >+ const std::vector<AudioCodec>& recv_codecs() const override; >+ void SetCodecs(const std::vector<AudioCodec>& codecs); >+ int GetInputLevel(); >+ bool StartAecDump(rtc::PlatformFile file, int64_t max_size_bytes) override; >+ void StopAecDump() override; >+ bool StartRtcEventLog(rtc::PlatformFile file, int64_t max_size_bytes); >+ void StopRtcEventLog(); > > private: > std::vector<FakeVoiceMediaChannel*> channels_; > std::vector<AudioCodec> codecs_; >+ bool fail_create_channel_; > > friend class FakeMediaEngine; > }; > >-class FakeVideoEngine : public FakeBaseEngine { >+class FakeVideoEngine : public VideoEngineInterface { > public: >- FakeVideoEngine() : capture_(false) { >- // Add a fake video codec. Note that the name must not be "" as there are >- // sanity checks against that. >- codecs_.push_back(VideoCodec(0, "fake_video_codec")); >- } >- >- bool SetOptions(const VideoOptions& options) { >- options_ = options; >- options_changed_ = true; >- return true; >- } >- >- VideoMediaChannel* CreateChannel(webrtc::Call* call, >- const MediaConfig& config, >- const VideoOptions& options) { >- if (fail_create_channel_) { >- return nullptr; >- } >- >- FakeVideoMediaChannel* ch = new FakeVideoMediaChannel(this, options); >- channels_.emplace_back(ch); >- return ch; >- } >- >- FakeVideoMediaChannel* GetChannel(size_t index) { >- return (channels_.size() > index) ? 
channels_[index] : nullptr; >- } >- >- void UnregisterChannel(VideoMediaChannel* channel) { >- auto it = std::find(channels_.begin(), channels_.end(), channel); >- RTC_DCHECK(it != channels_.end()); >- channels_.erase(it); >- } >- >- const std::vector<VideoCodec>& codecs() const { return codecs_; } >- >- void SetCodecs(const std::vector<VideoCodec> codecs) { codecs_ = codecs; } >- >- bool SetCapture(bool capture) { >- capture_ = capture; >- return true; >- } >+ FakeVideoEngine(); >+ RtpCapabilities GetCapabilities() const override; >+ bool SetOptions(const VideoOptions& options); >+ VideoMediaChannel* CreateMediaChannel( >+ webrtc::Call* call, >+ const MediaConfig& config, >+ const VideoOptions& options, >+ const webrtc::CryptoOptions& crypto_options) override; >+ FakeVideoMediaChannel* GetChannel(size_t index); >+ void UnregisterChannel(VideoMediaChannel* channel); >+ std::vector<VideoCodec> codecs() const override; >+ void SetCodecs(const std::vector<VideoCodec> codecs); >+ bool SetCapture(bool capture); > > private: > std::vector<FakeVideoMediaChannel*> channels_; > std::vector<VideoCodec> codecs_; > bool capture_; > VideoOptions options_; >+ bool fail_create_channel_; > > friend class FakeMediaEngine; > }; > >-class FakeMediaEngine >- : public CompositeMediaEngine<FakeVoiceEngine, FakeVideoEngine> { >+class FakeMediaEngine : public CompositeMediaEngine { > public: >- FakeMediaEngine() >- : CompositeMediaEngine<FakeVoiceEngine, FakeVideoEngine>(std::tuple<>(), >- std::tuple<>()) { >- } >+ FakeMediaEngine(); > >- virtual ~FakeMediaEngine() {} >+ ~FakeMediaEngine() override; > >- void SetAudioCodecs(const std::vector<AudioCodec>& codecs) { >- voice().SetCodecs(codecs); >- } >- void SetVideoCodecs(const std::vector<VideoCodec>& codecs) { >- video().SetCodecs(codecs); >- } >+ void SetAudioCodecs(const std::vector<AudioCodec>& codecs); >+ void SetVideoCodecs(const std::vector<VideoCodec>& codecs); > >- void SetAudioRtpHeaderExtensions( >- const 
std::vector<RtpExtension>& extensions) { >- voice().set_rtp_header_extensions(extensions); >- } >- void SetVideoRtpHeaderExtensions( >- const std::vector<RtpExtension>& extensions) { >- video().set_rtp_header_extensions(extensions); >- } >+ FakeVoiceMediaChannel* GetVoiceChannel(size_t index); >+ FakeVideoMediaChannel* GetVideoChannel(size_t index); > >- void SetAudioRtpHeaderExtensions( >- const std::vector<cricket::RtpHeaderExtension>& extensions) { >- voice().set_rtp_header_extensions(extensions); >- } >- void SetVideoRtpHeaderExtensions( >- const std::vector<cricket::RtpHeaderExtension>& extensions) { >- video().set_rtp_header_extensions(extensions); >- } >+ void set_fail_create_channel(bool fail); > >- FakeVoiceMediaChannel* GetVoiceChannel(size_t index) { >- return voice().GetChannel(index); >- } >- FakeVideoMediaChannel* GetVideoChannel(size_t index) { >- return video().GetChannel(index); >- } >- >- bool capture() const { return video().capture_; } >- bool options_changed() const { return video().options_changed_; } >- void clear_options_changed() { video().options_changed_ = false; } >- void set_fail_create_channel(bool fail) { >- voice().set_fail_create_channel(fail); >- video().set_fail_create_channel(fail); >- } >+ private: >+ FakeVoiceEngine* const voice_; >+ FakeVideoEngine* const video_; > }; > > // Have to come afterwards due to declaration order >-inline FakeVoiceMediaChannel::~FakeVoiceMediaChannel() { >- if (engine_) { >- engine_->UnregisterChannel(this); >- } >-} >- >-inline FakeVideoMediaChannel::~FakeVideoMediaChannel() { >- if (engine_) { >- engine_->UnregisterChannel(this); >- } >-} > > class FakeDataEngine : public DataEngineInterface { > public: >- virtual DataMediaChannel* CreateChannel(const MediaConfig& config) { >- FakeDataMediaChannel* ch = new FakeDataMediaChannel(this, DataOptions()); >- channels_.push_back(ch); >- return ch; >- } >+ DataMediaChannel* CreateChannel(const MediaConfig& config) override; > >- FakeDataMediaChannel* 
GetChannel(size_t index) { >- return (channels_.size() > index) ? channels_[index] : NULL; >- } >+ FakeDataMediaChannel* GetChannel(size_t index); > >- void UnregisterChannel(DataMediaChannel* channel) { >- channels_.erase(std::find(channels_.begin(), channels_.end(), channel)); >- } >+ void UnregisterChannel(DataMediaChannel* channel); > >- virtual void SetDataCodecs(const std::vector<DataCodec>& data_codecs) { >- data_codecs_ = data_codecs; >- } >+ void SetDataCodecs(const std::vector<DataCodec>& data_codecs); > >- virtual const std::vector<DataCodec>& data_codecs() { return data_codecs_; } >+ const std::vector<DataCodec>& data_codecs() override; > > private: > std::vector<FakeDataMediaChannel*> channels_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/fakenetworkinterface.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/fakenetworkinterface.h >index 0f14659c4606c9df6a760e6ee3383b2f97ac9a59..dfd340772b693392b15d3544b7c4c6d1d22fce55 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/fakenetworkinterface.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/fakenetworkinterface.h >@@ -169,9 +169,9 @@ class FakeNetworkInterface : public MediaChannel::NetworkInterface, > static_cast<rtc::TypedMessageData<rtc::CopyOnWriteBuffer>*>(msg->pdata); > if (dest_) { > if (msg->message_id == ST_RTP) { >- dest_->OnPacketReceived(&msg_data->data(), rtc::CreatePacketTime(0)); >+ dest_->OnPacketReceived(&msg_data->data(), rtc::TimeMicros()); > } else { >- dest_->OnRtcpReceived(&msg_data->data(), rtc::CreatePacketTime(0)); >+ dest_->OnRtcpReceived(&msg_data->data(), rtc::TimeMicros()); > } > } > delete msg_data; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/fakevideocapturer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/fakevideocapturer.cc >index 94900ba37b6a54b00fa6267b69dbeb5238318457..b5d0189608a92352b8ed450eb3e67528032ffcaa 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/fakevideocapturer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/fakevideocapturer.cc >@@ -11,6 +11,7 @@ > #include "media/base/fakevideocapturer.h" > > #include "rtc_base/arraysize.h" >+#include "rtc_base/timeutils.h" > > namespace cricket { > >@@ -95,7 +96,7 @@ cricket::CaptureState FakeVideoCapturer::Start( > SetCaptureState(cricket::CS_RUNNING); > frame_source_ = absl::make_unique<FakeFrameSource>( > format.width, format.height, >- format.interval / rtc::kNumNanosecsPerMicrosec); >+ format.interval / rtc::kNumNanosecsPerMicrosec, rtc::TimeMicros()); > frame_source_->SetRotation(rotation_); > return cricket::CS_RUNNING; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/h264_profile_level_id.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/h264_profile_level_id.h >index b4ff88369c61489f4f851b95c546c271a9c0fdf8..6a3e7cdb321966f693bf83fb90a61a94c4739b85 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/h264_profile_level_id.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/h264_profile_level_id.h >@@ -16,6 +16,7 @@ > > #include "absl/types/optional.h" > #include "common_types.h" // NOLINT(build/include) >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > namespace H264 { >@@ -61,18 +62,19 @@ absl::optional<ProfileLevelId> ParseProfileLevelId(const char* str); > // contained in an SDP key-value map. A default profile level id will be > // returned if the profile-level-id key is missing. Nothing will be returned if > // the key is present but the string is invalid. 
>-absl::optional<ProfileLevelId> ParseSdpProfileLevelId( >+RTC_EXPORT absl::optional<ProfileLevelId> ParseSdpProfileLevelId( > const CodecParameterMap& params); > > // Given that a decoder supports up to a given frame size (in pixels) at up to a > // given number of frames per second, return the highest H.264 level where it > // can guarantee that it will be able to support all valid encoded streams that > // are within that level. >-absl::optional<Level> SupportedLevel(int max_frame_pixel_count, float max_fps); >+RTC_EXPORT absl::optional<Level> SupportedLevel(int max_frame_pixel_count, >+ float max_fps); > > // Returns canonical string representation as three hex bytes of the profile > // level id, or returns nothing for invalid profile level ids. >-absl::optional<std::string> ProfileLevelIdToString( >+RTC_EXPORT absl::optional<std::string> ProfileLevelIdToString( > const ProfileLevelId& profile_level_id); > > // Generate codec parameters that will be used as answer in an SDP negotiation >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/mediachannel.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/mediachannel.cc >index cba3be30ed1616ce3375acaa0a9d57637698cf9e..0634b2e575f8b2ebaffc2299e12b7e7d0944fb74 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/mediachannel.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/mediachannel.cc >@@ -16,16 +16,19 @@ VideoOptions::VideoOptions() = default; > VideoOptions::~VideoOptions() = default; > > MediaChannel::MediaChannel(const MediaConfig& config) >- : enable_dscp_(config.enable_dscp), network_interface_(NULL) {} >+ : enable_dscp_(config.enable_dscp) {} > >-MediaChannel::MediaChannel() : enable_dscp_(false), network_interface_(NULL) {} >+MediaChannel::MediaChannel() : enable_dscp_(false) {} > > MediaChannel::~MediaChannel() {} > >-void MediaChannel::SetInterface(NetworkInterface* iface) { >+void MediaChannel::SetInterface( >+ NetworkInterface* iface, >+ 
webrtc::MediaTransportInterface* media_transport) { > rtc::CritScope cs(&network_interface_crit_); > network_interface_ = iface; >- SetDscp(enable_dscp_ ? PreferredDscp() : rtc::DSCP_DEFAULT); >+ media_transport_ = media_transport; >+ UpdateDscp(); > } > > int MediaChannel::GetRtpSendTimeExtnId() const { >@@ -84,6 +87,10 @@ std::map<std::string, std::string> AudioSendParameters::ToStringMap() const { > return params; > } > >+cricket::MediaType VoiceMediaChannel::media_type() const { >+ return cricket::MediaType::MEDIA_TYPE_AUDIO; >+} >+ > VideoSendParameters::VideoSendParameters() = default; > VideoSendParameters::~VideoSendParameters() = default; > >@@ -93,11 +100,19 @@ std::map<std::string, std::string> VideoSendParameters::ToStringMap() const { > return params; > } > >+cricket::MediaType VideoMediaChannel::media_type() const { >+ return cricket::MediaType::MEDIA_TYPE_VIDEO; >+} >+ > DataMediaChannel::DataMediaChannel() = default; > DataMediaChannel::DataMediaChannel(const MediaConfig& config) > : MediaChannel(config) {} > DataMediaChannel::~DataMediaChannel() = default; > >+cricket::MediaType DataMediaChannel::media_type() const { >+ return cricket::MediaType::MEDIA_TYPE_DATA; >+} >+ > bool DataMediaChannel::GetStats(DataMediaInfo* info) { > return true; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/mediachannel.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/mediachannel.h >index ff3368cf401d5f78af0779c547cf0b2ed7cb282e..fca6ee4dd9be5a55fb25857028e300a4f8e7a085 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/mediachannel.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/mediachannel.h >@@ -22,6 +22,7 @@ > #include "api/audio_options.h" > #include "api/crypto/framedecryptorinterface.h" > #include "api/crypto/frameencryptorinterface.h" >+#include "api/media_transport_interface.h" > #include "api/rtcerror.h" > #include "api/rtpparameters.h" > #include "api/rtpreceiverinterface.h" >@@ -183,14 +184,22 
@@ class MediaChannel : public sigslot::has_slots<> { > MediaChannel(); > ~MediaChannel() override; > >- // Sets the abstract interface class for sending RTP/RTCP data. >- virtual void SetInterface(NetworkInterface* iface); >+ virtual cricket::MediaType media_type() const = 0; >+ >+ // Sets the abstract interface class for sending RTP/RTCP data and >+ // interface for media transport (experimental). If media transport is >+ // provided, it should be used instead of RTP/RTCP. >+ // TODO(sukhanov): Currently media transport can co-exist with RTP/RTCP, but >+ // in the future we will refactor code to send all frames with media >+ // transport. >+ virtual void SetInterface(NetworkInterface* iface, >+ webrtc::MediaTransportInterface* media_transport); > // Called when a RTP packet is received. > virtual void OnPacketReceived(rtc::CopyOnWriteBuffer* packet, >- const rtc::PacketTime& packet_time) = 0; >+ int64_t packet_time_us) = 0; > // Called when a RTCP packet is received. > virtual void OnRtcpReceived(rtc::CopyOnWriteBuffer* packet, >- const rtc::PacketTime& packet_time) = 0; >+ int64_t packet_time_us) = 0; > // Called when the socket's ability to send has changed. > virtual void OnReadyToSend(bool ready) = 0; > // Called when the network route used for sending packets changed. >@@ -251,14 +260,28 @@ class MediaChannel : public sigslot::has_slots<> { > return network_interface_->SetOption(type, opt, option); > } > >+ webrtc::MediaTransportInterface* media_transport() { >+ return media_transport_; >+ } >+ >+ // Corresponds to the SDP attribute extmap-allow-mixed, see RFC8285. >+ // Set to true if it's allowed to mix one- and two-byte RTP header extensions >+ // in the same stream. The setter and getter must only be called from >+ // worker_thread. 
>+ void SetExtmapAllowMixed(bool extmap_allow_mixed) { >+ extmap_allow_mixed_ = extmap_allow_mixed; >+ } >+ bool ExtmapAllowMixed() const { return extmap_allow_mixed_; } >+ > protected: > virtual rtc::DiffServCodePoint PreferredDscp() const; > > bool DscpEnabled() const { return enable_dscp_; } > >- private: > // This method sets DSCP |value| on both RTP and RTCP channels. >- int SetDscp(rtc::DiffServCodePoint value) { >+ int UpdateDscp() { >+ rtc::DiffServCodePoint value = >+ enable_dscp_ ? PreferredDscp() : rtc::DSCP_DEFAULT; > int ret; > ret = SetOption(NetworkInterface::ST_RTP, rtc::Socket::OPT_DSCP, value); > if (ret == 0) { >@@ -267,6 +290,7 @@ class MediaChannel : public sigslot::has_slots<> { > return ret; > } > >+ private: > bool DoSendPacket(rtc::CopyOnWriteBuffer* packet, > bool rtcp, > const rtc::PacketOptions& options) { >@@ -283,7 +307,9 @@ class MediaChannel : public sigslot::has_slots<> { > // from any MediaEngine threads. This critical section is to protect accessing > // of network_interface_ object. > rtc::CriticalSection network_interface_crit_; >- NetworkInterface* network_interface_; >+ NetworkInterface* network_interface_ = nullptr; >+ webrtc::MediaTransportInterface* media_transport_ = nullptr; >+ bool extmap_allow_mixed_ = false; > }; > > // The stats information is structured as follows: >@@ -402,14 +428,6 @@ struct VoiceSenderInfo : public MediaSenderInfo { > // https://w3c.github.io/webrtc-stats/#dom-rtcmediastreamtrackstats-totalaudioenergy > double total_input_energy = 0.0; > double total_input_duration = 0.0; >- // TODO(bugs.webrtc.org/8572): Remove APM stats from this struct, since they >- // are no longer needed now that we have apm_statistics. 
>- int echo_delay_median_ms = 0; >- int echo_delay_std_ms = 0; >- int echo_return_loss = 0; >- int echo_return_loss_enhancement = 0; >- float residual_echo_likelihood = 0.0f; >- float residual_echo_likelihood_recent_max = 0.0f; > bool typing_noise_detected = false; > webrtc::ANAStats ana_statistics; > webrtc::AudioProcessingStats apm_statistics; >@@ -458,6 +476,10 @@ struct VoiceReceiverInfo : public MediaReceiverInfo { > int decoding_muted_output = 0; > // Estimated capture start time in NTP time in ms. > int64_t capture_start_ntp_time_ms = -1; >+ // Count of the number of buffer flushes. >+ uint64_t jitter_buffer_flushes = 0; >+ // Number of samples expanded due to delayed packets. >+ uint64_t delayed_packet_outage_samples = 0; > }; > > struct VideoSenderInfo : public MediaSenderInfo { >@@ -647,12 +669,14 @@ struct RtpSendParameters : RtpParameters<Codec> { > // This is the value to be sent in the MID RTP header extension (if the header > // extension in included in the list of extensions). > std::string mid; >+ bool extmap_allow_mixed = false; > > protected: > std::map<std::string, std::string> ToStringMap() const override { > auto params = RtpParameters<Codec>::ToStringMap(); > params["max_bandwidth_bps"] = rtc::ToString(max_bandwidth_bps); > params["mid"] = (mid.empty() ? "<not set>" : mid); >+ params["extmap-allow-mixed"] = extmap_allow_mixed ? 
"true" : "false"; > return params; > } > }; >@@ -674,6 +698,8 @@ class VoiceMediaChannel : public MediaChannel { > explicit VoiceMediaChannel(const MediaConfig& config) > : MediaChannel(config) {} > ~VoiceMediaChannel() override {} >+ >+ cricket::MediaType media_type() const override; > virtual bool SetSendParameters(const AudioSendParameters& params) = 0; > virtual bool SetRecvParameters(const AudioRecvParameters& params) = 0; > virtual webrtc::RtpParameters GetRtpSendParameters(uint32_t ssrc) const = 0; >@@ -746,6 +772,7 @@ class VideoMediaChannel : public MediaChannel { > : MediaChannel(config) {} > ~VideoMediaChannel() override {} > >+ cricket::MediaType media_type() const override; > virtual bool SetSendParameters(const VideoSendParameters& params) = 0; > virtual bool SetRecvParameters(const VideoRecvParameters& params) = 0; > virtual webrtc::RtpParameters GetRtpSendParameters(uint32_t ssrc) const = 0; >@@ -857,6 +884,7 @@ class DataMediaChannel : public MediaChannel { > explicit DataMediaChannel(const MediaConfig& config); > ~DataMediaChannel() override; > >+ cricket::MediaType media_type() const override; > virtual bool SetSendParameters(const DataSendParameters& params) = 0; > virtual bool SetRecvParameters(const DataRecvParameters& params) = 0; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/mediaconfig.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/mediaconfig.h >index eda387e319112b321f2fd8e68b68f23d5b0bf089..c8fc5212a0f8cbc5b27386b587ef46053ca5fb79 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/mediaconfig.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/mediaconfig.h >@@ -59,8 +59,17 @@ struct MediaConfig { > // TODO(bugs.webrtc.org/8504): If all goes well, the flag will be removed > // together with the old method of estimation. 
> bool experiment_cpu_load_estimator = false; >+ >+ // Time interval between RTCP report for video >+ int rtcp_report_interval_ms = 1000; > } video; > >+ // Audio-specific config. >+ struct Audio { >+ // Time interval between RTCP report for audio >+ int rtcp_report_interval_ms = 5000; >+ } audio; >+ > bool operator==(const MediaConfig& o) const { > return enable_dscp == o.enable_dscp && > video.enable_cpu_adaptation == o.video.enable_cpu_adaptation && >@@ -71,7 +80,9 @@ struct MediaConfig { > video.periodic_alr_bandwidth_probing == > o.video.periodic_alr_bandwidth_probing && > video.experiment_cpu_load_estimator == >- o.video.experiment_cpu_load_estimator; >+ o.video.experiment_cpu_load_estimator && >+ video.rtcp_report_interval_ms == o.video.rtcp_report_interval_ms && >+ audio.rtcp_report_interval_ms == o.audio.rtcp_report_interval_ms; > } > > bool operator!=(const MediaConfig& o) const { return !(*this == o); } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/mediaconstants.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/mediaconstants.h >index 14c49baf72cbad1c5a92028b479a0b173c0dabcd..c4a3406fd85a23a8aaba3403c319c6e811f5a6d5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/mediaconstants.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/mediaconstants.h >@@ -13,6 +13,8 @@ > > #include <string> > >+#include "rtc_base/system/rtc_export.h" >+ > // This file contains constants related to media. 
> > namespace cricket { >@@ -127,12 +129,12 @@ extern const char kComfortNoiseCodecName[]; > > extern const char kVp8CodecName[]; > extern const char kVp9CodecName[]; >-extern const char kH264CodecName[]; >+RTC_EXPORT extern const char kH264CodecName[]; > > // RFC 6184 RTP Payload Format for H.264 video >-extern const char kH264FmtpProfileLevelId[]; >-extern const char kH264FmtpLevelAsymmetryAllowed[]; >-extern const char kH264FmtpPacketizationMode[]; >+RTC_EXPORT extern const char kH264FmtpProfileLevelId[]; >+RTC_EXPORT extern const char kH264FmtpLevelAsymmetryAllowed[]; >+RTC_EXPORT extern const char kH264FmtpPacketizationMode[]; > extern const char kH264FmtpSpropParameterSets[]; > extern const char kH264ProfileLevelConstrainedBaseline[]; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/mediaengine.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/mediaengine.cc >index 76cac4b438342e0abe226a72765d04629d5823b7..d5198fe64f1901e003df4822939834dcf1bcef6a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/mediaengine.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/mediaengine.cc >@@ -10,6 +10,8 @@ > > #include "media/base/mediaengine.h" > >+#include <utility> >+ > #include "api/video/video_bitrate_allocation.h" > #include "rtc_base/stringencode.h" > >@@ -104,4 +106,33 @@ webrtc::RTCError ValidateRtpParameters( > return webrtc::RTCError::OK(); > } > >+CompositeMediaEngine::CompositeMediaEngine( >+ std::unique_ptr<VoiceEngineInterface> voice_engine, >+ std::unique_ptr<VideoEngineInterface> video_engine) >+ : voice_engine_(std::move(voice_engine)), >+ video_engine_(std::move(video_engine)) {} >+ >+CompositeMediaEngine::~CompositeMediaEngine() = default; >+ >+bool CompositeMediaEngine::Init() { >+ voice().Init(); >+ return true; >+} >+ >+VoiceEngineInterface& CompositeMediaEngine::voice() { >+ return *voice_engine_.get(); >+} >+ >+VideoEngineInterface& CompositeMediaEngine::video() { >+ return 
*video_engine_.get(); >+} >+ >+const VoiceEngineInterface& CompositeMediaEngine::voice() const { >+ return *voice_engine_.get(); >+} >+ >+const VideoEngineInterface& CompositeMediaEngine::video() const { >+ return *video_engine_.get(); >+} >+ > }; // namespace cricket >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/mediaengine.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/mediaengine.h >index 01c6fb65269ea2e726124de9abb7289d56654bdb..01300d48ac370b7a95a6daa788a6f492c46c7a3c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/mediaengine.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/mediaengine.h >@@ -15,13 +15,13 @@ > #include <CoreAudio/CoreAudio.h> > #endif > >+#include <memory> > #include <string> >-#include <tuple> >-#include <utility> > #include <vector> > > #include "api/audio_codecs/audio_decoder_factory.h" > #include "api/audio_codecs/audio_encoder_factory.h" >+#include "api/crypto/cryptooptions.h" > #include "api/rtpparameters.h" > #include "call/audio_state.h" > #include "media/base/codec.h" >@@ -48,37 +48,30 @@ struct RtpCapabilities { > std::vector<webrtc::RtpExtension> header_extensions; > }; > >-// MediaEngineInterface is an abstraction of a media engine which can be >-// subclassed to support different media componentry backends. >-// It supports voice and video operations in the same class to facilitate >-// proper synchronization between both media types. >-class MediaEngineInterface { >+class VoiceEngineInterface { > public: >- virtual ~MediaEngineInterface() {} >+ VoiceEngineInterface() = default; >+ virtual ~VoiceEngineInterface() = default; >+ RTC_DISALLOW_COPY_AND_ASSIGN(VoiceEngineInterface); > > // Initialization > // Starts the engine. >- virtual bool Init() = 0; >+ virtual void Init() = 0; >+ > // TODO(solenberg): Remove once VoE API refactoring is done. 
> virtual rtc::scoped_refptr<webrtc::AudioState> GetAudioState() const = 0; > > // MediaChannel creation > // Creates a voice media channel. Returns NULL on failure. >- virtual VoiceMediaChannel* CreateChannel(webrtc::Call* call, >- const MediaConfig& config, >- const AudioOptions& options) = 0; >- // Creates a video media channel, paired with the specified voice channel. >- // Returns NULL on failure. >- virtual VideoMediaChannel* CreateVideoChannel( >+ virtual VoiceMediaChannel* CreateMediaChannel( > webrtc::Call* call, > const MediaConfig& config, >- const VideoOptions& options) = 0; >+ const AudioOptions& options, >+ const webrtc::CryptoOptions& crypto_options) = 0; > >- virtual const std::vector<AudioCodec>& audio_send_codecs() = 0; >- virtual const std::vector<AudioCodec>& audio_recv_codecs() = 0; >- virtual RtpCapabilities GetAudioCapabilities() = 0; >- virtual std::vector<VideoCodec> video_codecs() = 0; >- virtual RtpCapabilities GetVideoCapabilities() = 0; >+ virtual const std::vector<AudioCodec>& send_codecs() const = 0; >+ virtual const std::vector<AudioCodec>& recv_codecs() const = 0; >+ virtual RtpCapabilities GetCapabilities() const = 0; > > // Starts AEC dump using existing file, a maximum file size in bytes can be > // specified. Logging is stopped just before the size limit is exceeded. >@@ -89,69 +82,66 @@ class MediaEngineInterface { > virtual void StopAecDump() = 0; > }; > >+class VideoEngineInterface { >+ public: >+ VideoEngineInterface() = default; >+ virtual ~VideoEngineInterface() = default; >+ RTC_DISALLOW_COPY_AND_ASSIGN(VideoEngineInterface); >+ >+ // Creates a video media channel, paired with the specified voice channel. >+ // Returns NULL on failure. 
>+ virtual VideoMediaChannel* CreateMediaChannel( >+ webrtc::Call* call, >+ const MediaConfig& config, >+ const VideoOptions& options, >+ const webrtc::CryptoOptions& crypto_options) = 0; >+ >+ virtual std::vector<VideoCodec> codecs() const = 0; >+ virtual RtpCapabilities GetCapabilities() const = 0; >+}; >+ >+// MediaEngineInterface is an abstraction of a media engine which can be >+// subclassed to support different media componentry backends. >+// It supports voice and video operations in the same class to facilitate >+// proper synchronization between both media types. >+class MediaEngineInterface { >+ public: >+ virtual ~MediaEngineInterface() {} >+ >+ // Initialization >+ // Starts the engine. >+ virtual bool Init() = 0; >+ virtual VoiceEngineInterface& voice() = 0; >+ virtual VideoEngineInterface& video() = 0; >+ virtual const VoiceEngineInterface& voice() const = 0; >+ virtual const VideoEngineInterface& video() const = 0; >+}; >+ > // CompositeMediaEngine constructs a MediaEngine from separate > // voice and video engine classes. >-template <class VOICE, class VIDEO> > class CompositeMediaEngine : public MediaEngineInterface { > public: >- template <class... Args1, class... 
Args2> >- CompositeMediaEngine(std::tuple<Args1...> first_args, >- std::tuple<Args2...> second_args) >- : engines_(std::piecewise_construct, >- std::move(first_args), >- std::move(second_args)) {} >- >- virtual ~CompositeMediaEngine() {} >- virtual bool Init() { >- voice().Init(); >- return true; >- } >- >- virtual rtc::scoped_refptr<webrtc::AudioState> GetAudioState() const { >- return voice().GetAudioState(); >- } >- virtual VoiceMediaChannel* CreateChannel(webrtc::Call* call, >- const MediaConfig& config, >- const AudioOptions& options) { >- return voice().CreateChannel(call, config, options); >- } >- virtual VideoMediaChannel* CreateVideoChannel(webrtc::Call* call, >- const MediaConfig& config, >- const VideoOptions& options) { >- return video().CreateChannel(call, config, options); >- } >- >- virtual const std::vector<AudioCodec>& audio_send_codecs() { >- return voice().send_codecs(); >- } >- virtual const std::vector<AudioCodec>& audio_recv_codecs() { >- return voice().recv_codecs(); >- } >- virtual RtpCapabilities GetAudioCapabilities() { >- return voice().GetCapabilities(); >- } >- virtual std::vector<VideoCodec> video_codecs() { return video().codecs(); } >- virtual RtpCapabilities GetVideoCapabilities() { >- return video().GetCapabilities(); >- } >- >- virtual bool StartAecDump(rtc::PlatformFile file, int64_t max_size_bytes) { >- return voice().StartAecDump(file, max_size_bytes); >- } >- >- virtual void StopAecDump() { voice().StopAecDump(); } >- >- protected: >- VOICE& voice() { return engines_.first; } >- VIDEO& video() { return engines_.second; } >- const VOICE& voice() const { return engines_.first; } >- const VIDEO& video() const { return engines_.second; } >+ CompositeMediaEngine(std::unique_ptr<VoiceEngineInterface> audio_engine, >+ std::unique_ptr<VideoEngineInterface> video_engine); >+ ~CompositeMediaEngine() override; >+ bool Init() override; >+ >+ VoiceEngineInterface& voice() override; >+ VideoEngineInterface& video() override; >+ const 
VoiceEngineInterface& voice() const override; >+ const VideoEngineInterface& video() const override; > > private: >- std::pair<VOICE, VIDEO> engines_; >+ std::unique_ptr<VoiceEngineInterface> voice_engine_; >+ std::unique_ptr<VideoEngineInterface> video_engine_; > }; > >-enum DataChannelType { DCT_NONE = 0, DCT_RTP = 1, DCT_SCTP = 2 }; >+enum DataChannelType { >+ DCT_NONE = 0, >+ DCT_RTP = 1, >+ DCT_SCTP = 2, >+ DCT_MEDIA_TRANSPORT = 3 >+}; > > class DataEngineInterface { > public: >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/rtpdataengine.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/rtpdataengine.cc >index fc2b73141edd75e08e6d66b35b64389cfda8f3e6..c3367c0a1f60b494f7a3313d1f552cb7ad6eded2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/rtpdataengine.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/rtpdataengine.cc >@@ -12,6 +12,7 @@ > > #include <map> > >+#include "absl/strings/match.h" > #include "media/base/codec.h" > #include "media/base/mediaconstants.h" > #include "media/base/rtputils.h" >@@ -21,7 +22,6 @@ > #include "rtc_base/helpers.h" > #include "rtc_base/logging.h" > #include "rtc_base/sanitizer.h" >-#include "rtc_base/stringutils.h" > > namespace cricket { > >@@ -47,7 +47,7 @@ DataMediaChannel* RtpDataEngine::CreateChannel(const MediaConfig& config) { > static const DataCodec* FindCodecByName(const std::vector<DataCodec>& codecs, > const std::string& name) { > for (const DataCodec& codec : codecs) { >- if (_stricmp(name.c_str(), codec.name.c_str()) == 0) >+ if (absl::EqualsIgnoreCase(name, codec.name)) > return &codec; > } > return nullptr; >@@ -194,7 +194,7 @@ bool RtpDataMediaChannel::RemoveRecvStream(uint32_t ssrc) { > } > > void RtpDataMediaChannel::OnPacketReceived(rtc::CopyOnWriteBuffer* packet, >- const rtc::PacketTime& packet_time) { >+ int64_t /* packet_time_us */) { > RtpHeader header; > if (!GetRtpHeader(packet->cdata(), packet->size(), &header)) { > return; >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/rtpdataengine.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/rtpdataengine.h >index 966d0c81514157d2e07a3c77d54f63f32e30b6f3..42e919039d013f9cfa86379e3aaaa2dad277c782 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/rtpdataengine.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/rtpdataengine.h >@@ -82,9 +82,9 @@ class RtpDataMediaChannel : public DataMediaChannel { > return true; > } > virtual void OnPacketReceived(rtc::CopyOnWriteBuffer* packet, >- const rtc::PacketTime& packet_time); >+ int64_t packet_time_us); > virtual void OnRtcpReceived(rtc::CopyOnWriteBuffer* packet, >- const rtc::PacketTime& packet_time) {} >+ int64_t packet_time_us) {} > virtual void OnReadyToSend(bool ready) {} > virtual bool SendData(const SendDataParams& params, > const rtc::CopyOnWriteBuffer& payload, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/rtpdataengine_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/rtpdataengine_unittest.cc >index c15c55f5e6274e88d357f9d960de6c16a7e88517..9b8e10b45007cced6698a5e244203e08b6dd45d3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/rtpdataengine_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/rtpdataengine_unittest.cc >@@ -73,7 +73,7 @@ class RtpDataMediaChannelTest : public testing::Test { > cricket::MediaConfig config; > cricket::RtpDataMediaChannel* channel = > static_cast<cricket::RtpDataMediaChannel*>(dme->CreateChannel(config)); >- channel->SetInterface(iface_.get()); >+ channel->SetInterface(iface_.get(), /*media_transport=*/nullptr); > channel->SignalDataReceived.connect(receiver_.get(), > &FakeDataReceiver::OnDataReceived); > return channel; >@@ -317,13 +317,13 @@ TEST_F(RtpDataMediaChannelTest, ReceiveData) { > std::unique_ptr<cricket::RtpDataMediaChannel> dmc(CreateChannel()); > > // SetReceived not called. 
>- dmc->OnPacketReceived(&packet, rtc::PacketTime()); >+ dmc->OnPacketReceived(&packet, /* packet_time_us */ -1); > EXPECT_FALSE(HasReceivedData()); > > dmc->SetReceive(true); > > // Unknown payload id >- dmc->OnPacketReceived(&packet, rtc::PacketTime()); >+ dmc->OnPacketReceived(&packet, /* packet_time_us */ -1); > EXPECT_FALSE(HasReceivedData()); > > cricket::DataCodec codec; >@@ -334,7 +334,7 @@ TEST_F(RtpDataMediaChannelTest, ReceiveData) { > ASSERT_TRUE(dmc->SetRecvParameters(parameters)); > > // Unknown stream >- dmc->OnPacketReceived(&packet, rtc::PacketTime()); >+ dmc->OnPacketReceived(&packet, /* packet_time_us */ -1); > EXPECT_FALSE(HasReceivedData()); > > cricket::StreamParams stream; >@@ -342,7 +342,7 @@ TEST_F(RtpDataMediaChannelTest, ReceiveData) { > ASSERT_TRUE(dmc->AddRecvStream(stream)); > > // Finally works! >- dmc->OnPacketReceived(&packet, rtc::PacketTime()); >+ dmc->OnPacketReceived(&packet, /* packet_time_us */ -1); > EXPECT_TRUE(HasReceivedData()); > EXPECT_EQ("abcde", GetReceivedData()); > EXPECT_EQ(5U, GetReceivedDataLen()); >@@ -355,6 +355,6 @@ TEST_F(RtpDataMediaChannelTest, InvalidRtpPackets) { > std::unique_ptr<cricket::RtpDataMediaChannel> dmc(CreateChannel()); > > // Too short >- dmc->OnPacketReceived(&packet, rtc::PacketTime()); >+ dmc->OnPacketReceived(&packet, /* packet_time_us */ -1); > EXPECT_FALSE(HasReceivedData()); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/rtputils.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/rtputils.h >index 0fd9a59816b6e7e4c1a8ab3c24e954db010d6b28..c5fe14fb3d381788937d15ab0027d1b256c856b5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/rtputils.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/rtputils.h >@@ -12,6 +12,7 @@ > #define MEDIA_BASE_RTPUTILS_H_ > > #include "rtc_base/byteorder.h" >+#include "rtc_base/system/rtc_export.h" > > namespace rtc { > struct PacketTimeUpdateParams; >@@ -68,9 +69,9 @@ bool IsValidRtpRtcpPacketSize(bool 
rtcp, size_t size); > const char* RtpRtcpStringLiteral(bool rtcp); > > // Verifies that a packet has a valid RTP header. >-bool ValidateRtpHeader(const uint8_t* rtp, >- size_t length, >- size_t* header_length); >+bool RTC_EXPORT ValidateRtpHeader(const uint8_t* rtp, >+ size_t length, >+ size_t* header_length); > > // Helper method which updates the absolute send time extension if present. > bool UpdateRtpAbsSendTimeExtension(uint8_t* rtp, >@@ -80,10 +81,11 @@ bool UpdateRtpAbsSendTimeExtension(uint8_t* rtp, > > // Applies specified |options| to the packet. It updates the absolute send time > // extension header if it is present present then updates HMAC. >-bool ApplyPacketOptions(uint8_t* data, >- size_t length, >- const rtc::PacketTimeUpdateParams& packet_time_params, >- uint64_t time_us); >+bool RTC_EXPORT >+ApplyPacketOptions(uint8_t* data, >+ size_t length, >+ const rtc::PacketTimeUpdateParams& packet_time_params, >+ uint64_t time_us); > > } // namespace cricket > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/testutils.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/testutils.cc >index 93c2bc98aef32b870772b35150b2849ef69067f5..512dc91f93c31b33fea3c454e827fc21e99c24d9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/testutils.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/testutils.cc >@@ -10,90 +10,14 @@ > > #include "media/base/testutils.h" > >-#include <math.h> > #include <algorithm> > #include <memory> > > #include "api/video/video_frame.h" > #include "media/base/videocapturer.h" >-#include "rtc_base/bytebuffer.h" >-#include "rtc_base/gunit.h" >-#include "rtc_base/stream.h" >-#include "rtc_base/stringutils.h" >-#include "rtc_base/testutils.h" > > namespace cricket { > >-///////////////////////////////////////////////////////////////////////// >-// Implementation of RawRtpPacket >-///////////////////////////////////////////////////////////////////////// >-void 
RawRtpPacket::WriteToByteBuffer(uint32_t in_ssrc, >- rtc::ByteBufferWriter* buf) const { >- if (!buf) >- return; >- >- buf->WriteUInt8(ver_to_cc); >- buf->WriteUInt8(m_to_pt); >- buf->WriteUInt16(sequence_number); >- buf->WriteUInt32(timestamp); >- buf->WriteUInt32(in_ssrc); >- buf->WriteBytes(payload, sizeof(payload)); >-} >- >-bool RawRtpPacket::ReadFromByteBuffer(rtc::ByteBufferReader* buf) { >- if (!buf) >- return false; >- >- bool ret = true; >- ret &= buf->ReadUInt8(&ver_to_cc); >- ret &= buf->ReadUInt8(&m_to_pt); >- ret &= buf->ReadUInt16(&sequence_number); >- ret &= buf->ReadUInt32(×tamp); >- ret &= buf->ReadUInt32(&ssrc); >- ret &= buf->ReadBytes(payload, sizeof(payload)); >- return ret; >-} >- >-bool RawRtpPacket::SameExceptSeqNumTimestampSsrc(const RawRtpPacket& packet, >- uint16_t seq, >- uint32_t ts, >- uint32_t ssc) const { >- return sequence_number == seq && timestamp == ts && >- ver_to_cc == packet.ver_to_cc && m_to_pt == packet.m_to_pt && >- ssrc == ssc && 0 == memcmp(payload, packet.payload, sizeof(payload)); >-} >- >-///////////////////////////////////////////////////////////////////////// >-// Implementation of RawRtcpPacket >-///////////////////////////////////////////////////////////////////////// >-void RawRtcpPacket::WriteToByteBuffer(rtc::ByteBufferWriter* buf) const { >- if (!buf) >- return; >- >- buf->WriteUInt8(ver_to_count); >- buf->WriteUInt8(type); >- buf->WriteUInt16(length); >- buf->WriteBytes(payload, sizeof(payload)); >-} >- >-bool RawRtcpPacket::ReadFromByteBuffer(rtc::ByteBufferReader* buf) { >- if (!buf) >- return false; >- >- bool ret = true; >- ret &= buf->ReadUInt8(&ver_to_count); >- ret &= buf->ReadUInt8(&type); >- ret &= buf->ReadUInt16(&length); >- ret &= buf->ReadBytes(payload, sizeof(payload)); >- return ret; >-} >- >-bool RawRtcpPacket::EqualsTo(const RawRtcpPacket& packet) const { >- return ver_to_count == packet.ver_to_count && type == packet.type && >- length == packet.length && >- 0 == memcmp(payload, 
packet.payload, sizeof(payload)); >-} >- > // Implementation of VideoCaptureListener. > VideoCapturerListener::VideoCapturerListener(VideoCapturer* capturer) > : capturer_(capturer), >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/testutils.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/testutils.h >index c84696236b4a402fc4c46e341da29836136d3536..6b63ffd180003646caa54af755754b7856b5e835 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/testutils.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/testutils.h >@@ -20,12 +20,6 @@ > #include "rtc_base/arraysize.h" > #include "rtc_base/third_party/sigslot/sigslot.h" > >-namespace rtc { >-class ByteBufferReader; >-class ByteBufferWriter; >-class StreamInterface; >-} // namespace rtc >- > namespace webrtc { > class VideoFrame; > } >@@ -43,40 +37,6 @@ inline std::vector<T> MakeVector(const T a[], size_t s) { > } > #define MAKE_VECTOR(a) cricket::MakeVector(a, arraysize(a)) > >-struct RtpDumpPacket; >-class RtpDumpWriter; >- >-struct RawRtpPacket { >- void WriteToByteBuffer(uint32_t in_ssrc, rtc::ByteBufferWriter* buf) const; >- bool ReadFromByteBuffer(rtc::ByteBufferReader* buf); >- // Check if this packet is the same as the specified packet except the >- // sequence number and timestamp, which should be the same as the specified >- // parameters. 
>- bool SameExceptSeqNumTimestampSsrc(const RawRtpPacket& packet, >- uint16_t seq, >- uint32_t ts, >- uint32_t ssc) const; >- int size() const { return 28; } >- >- uint8_t ver_to_cc; >- uint8_t m_to_pt; >- uint16_t sequence_number; >- uint32_t timestamp; >- uint32_t ssrc; >- char payload[16]; >-}; >- >-struct RawRtcpPacket { >- void WriteToByteBuffer(rtc::ByteBufferWriter* buf) const; >- bool ReadFromByteBuffer(rtc::ByteBufferReader* buf); >- bool EqualsTo(const RawRtcpPacket& packet) const; >- >- uint8_t ver_to_count; >- uint8_t type; >- uint16_t length; >- char payload[16]; >-}; >- > // Test helper for testing VideoCapturer implementations. > class VideoCapturerListener > : public sigslot::has_slots<>, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/turnutils.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/turnutils.h >index 13ed26b7a5ff7cc27569c266a8dd64522cffb623..7aa0651cea3ea559f4beb13b53b422d157f4ea2b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/turnutils.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/turnutils.h >@@ -14,16 +14,18 @@ > #include <cstddef> > #include <cstdint> > >+#include "rtc_base/system/rtc_export.h" >+ > namespace cricket { > > struct PacketOptions; > > // Finds data location within a TURN Channel Message or TURN Send Indication > // message. 
>-bool UnwrapTurnPacket(const uint8_t* packet, >- size_t packet_size, >- size_t* content_position, >- size_t* content_size); >+bool RTC_EXPORT UnwrapTurnPacket(const uint8_t* packet, >+ size_t packet_size, >+ size_t* content_position, >+ size_t* content_size); > > } // namespace cricket > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/videoadapter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/videoadapter.cc >index 840101bcee3403d64bb8f91f75c4d51ebd04b72d..b2ed333614930887663b574f3fe58c05cb621095 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/videoadapter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/videoadapter.cc >@@ -169,8 +169,18 @@ bool VideoAdapter::AdaptFrameResolution(int in_width, > // OnOutputFormatRequest and OnResolutionFramerateRequest. > int max_pixel_count = resolution_request_max_pixel_count_; > >- if (max_pixel_count_) >- max_pixel_count = std::min(max_pixel_count, *max_pixel_count_); >+ // Select target aspect ratio and max pixel count depending on input frame >+ // orientation. >+ absl::optional<std::pair<int, int>> target_aspect_ratio; >+ if (in_width > in_height) { >+ target_aspect_ratio = target_landscape_aspect_ratio_; >+ if (max_landscape_pixel_count_) >+ max_pixel_count = std::min(max_pixel_count, *max_landscape_pixel_count_); >+ } else { >+ target_aspect_ratio = target_portrait_aspect_ratio_; >+ if (max_portrait_pixel_count_) >+ max_pixel_count = std::min(max_pixel_count, *max_portrait_pixel_count_); >+ } > > int target_pixel_count = > std::min(resolution_request_target_pixel_count_, max_pixel_count); >@@ -195,19 +205,14 @@ bool VideoAdapter::AdaptFrameResolution(int in_width, > } > > // Calculate how the input should be cropped. 
>- if (!target_aspect_ratio_ || target_aspect_ratio_->first <= 0 || >- target_aspect_ratio_->second <= 0) { >+ if (!target_aspect_ratio || target_aspect_ratio->first <= 0 || >+ target_aspect_ratio->second <= 0) { > *cropped_width = in_width; > *cropped_height = in_height; > } else { >- // Adjust |target_aspect_ratio_| orientation to match input. >- if ((in_width > in_height) != >- (target_aspect_ratio_->first > target_aspect_ratio_->second)) { >- std::swap(target_aspect_ratio_->first, target_aspect_ratio_->second); >- } > const float requested_aspect = >- target_aspect_ratio_->first / >- static_cast<float>(target_aspect_ratio_->second); >+ target_aspect_ratio->first / >+ static_cast<float>(target_aspect_ratio->second); > *cropped_width = > std::min(in_width, static_cast<int>(in_height * requested_aspect)); > *cropped_height = >@@ -274,9 +279,33 @@ void VideoAdapter::OnOutputFormatRequest( > const absl::optional<std::pair<int, int>>& target_aspect_ratio, > const absl::optional<int>& max_pixel_count, > const absl::optional<int>& max_fps) { >+ absl::optional<std::pair<int, int>> target_landscape_aspect_ratio; >+ absl::optional<std::pair<int, int>> target_portrait_aspect_ratio; >+ if (target_aspect_ratio && target_aspect_ratio->first > 0 && >+ target_aspect_ratio->second > 0) { >+ // Maintain input orientation. 
>+ const int max_side = >+ std::max(target_aspect_ratio->first, target_aspect_ratio->second); >+ const int min_side = >+ std::min(target_aspect_ratio->first, target_aspect_ratio->second); >+ target_landscape_aspect_ratio = std::make_pair(max_side, min_side); >+ target_portrait_aspect_ratio = std::make_pair(min_side, max_side); >+ } >+ OnOutputFormatRequest(target_landscape_aspect_ratio, max_pixel_count, >+ target_portrait_aspect_ratio, max_pixel_count, max_fps); >+} >+ >+void VideoAdapter::OnOutputFormatRequest( >+ const absl::optional<std::pair<int, int>>& target_landscape_aspect_ratio, >+ const absl::optional<int>& max_landscape_pixel_count, >+ const absl::optional<std::pair<int, int>>& target_portrait_aspect_ratio, >+ const absl::optional<int>& max_portrait_pixel_count, >+ const absl::optional<int>& max_fps) { > rtc::CritScope cs(&critical_section_); >- target_aspect_ratio_ = target_aspect_ratio; >- max_pixel_count_ = max_pixel_count; >+ target_landscape_aspect_ratio_ = target_landscape_aspect_ratio; >+ max_landscape_pixel_count_ = max_landscape_pixel_count; >+ target_portrait_aspect_ratio_ = target_portrait_aspect_ratio; >+ max_portrait_pixel_count_ = max_portrait_pixel_count; > max_fps_ = max_fps; > next_frame_timestamp_ns_ = absl::nullopt; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/videoadapter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/videoadapter.h >index bb9ab8026f088acd94329491ca25d4a62ea946fa..1864d5cc0c7bf358d95bbda3c5fbe205407809ad 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/videoadapter.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/videoadapter.h >@@ -68,6 +68,17 @@ class VideoAdapter { > const absl::optional<int>& max_pixel_count, > const absl::optional<int>& max_fps); > >+ // Same as above, but allows setting two different target aspect ratios >+ // depending on incoming frame orientation. This gives more fine-grained >+ // control and can e.g. 
be used to force landscape video to be cropped to >+ // portrait video. >+ void OnOutputFormatRequest( >+ const absl::optional<std::pair<int, int>>& target_landscape_aspect_ratio, >+ const absl::optional<int>& max_landscape_pixel_count, >+ const absl::optional<std::pair<int, int>>& target_portrait_aspect_ratio, >+ const absl::optional<int>& max_portrait_pixel_count, >+ const absl::optional<int>& max_fps); >+ > // Requests the output frame size from |AdaptFrameResolution| to have as close > // as possible to |target_pixel_count| pixels (if set) but no more than > // |max_pixel_count|. >@@ -100,9 +111,14 @@ class VideoAdapter { > // Max number of pixels/fps requested via calls to OnOutputFormatRequest, > // OnResolutionFramerateRequest respectively. > // The adapted output format is the minimum of these. >- absl::optional<std::pair<int, int>> target_aspect_ratio_ >+ absl::optional<std::pair<int, int>> target_landscape_aspect_ratio_ >+ RTC_GUARDED_BY(critical_section_); >+ absl::optional<int> max_landscape_pixel_count_ >+ RTC_GUARDED_BY(critical_section_); >+ absl::optional<std::pair<int, int>> target_portrait_aspect_ratio_ >+ RTC_GUARDED_BY(critical_section_); >+ absl::optional<int> max_portrait_pixel_count_ > RTC_GUARDED_BY(critical_section_); >- absl::optional<int> max_pixel_count_ RTC_GUARDED_BY(critical_section_); > absl::optional<int> max_fps_ RTC_GUARDED_BY(critical_section_); > int resolution_request_target_pixel_count_ RTC_GUARDED_BY(critical_section_); > int resolution_request_max_pixel_count_ RTC_GUARDED_BY(critical_section_); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/videoadapter_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/videoadapter_unittest.cc >index 67a662639e5514e44438d2a5e01686a091f0be58..c600fc2c0730c9916c4a4a9c17b3826a7ae8921b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/videoadapter_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/videoadapter_unittest.cc 
>@@ -18,7 +18,6 @@ > #include "absl/memory/memory.h" > #include "media/base/fakeframesource.h" > #include "media/base/mediachannel.h" >-#include "media/base/testutils.h" > #include "media/base/videoadapter.h" > #include "rtc_base/gunit.h" > #include "rtc_base/logging.h" >@@ -1112,4 +1111,61 @@ TEST_P(VideoAdapterTest, TestAdaptToMax) { > EXPECT_EQ(640, out_width_); > EXPECT_EQ(360, out_height_); > } >+ >+// Test adjusting to 16:9 in landscape, and 9:16 in portrait. >+TEST(VideoAdapterTestMultipleOrientation, TestNormal) { >+ VideoAdapter video_adapter; >+ video_adapter.OnOutputFormatRequest(std::make_pair(640, 360), 640 * 360, >+ std::make_pair(360, 640), 360 * 640, 30); >+ >+ int cropped_width; >+ int cropped_height; >+ int out_width; >+ int out_height; >+ EXPECT_TRUE(video_adapter.AdaptFrameResolution( >+ /* in_width= */ 640, /* in_height= */ 480, /* in_timestamp_ns= */ 0, >+ &cropped_width, &cropped_height, &out_width, &out_height)); >+ EXPECT_EQ(640, cropped_width); >+ EXPECT_EQ(360, cropped_height); >+ EXPECT_EQ(640, out_width); >+ EXPECT_EQ(360, out_height); >+ >+ EXPECT_TRUE(video_adapter.AdaptFrameResolution( >+ /* in_width= */ 480, /* in_height= */ 640, >+ /* in_timestamp_ns= */ rtc::kNumNanosecsPerSec / 30, &cropped_width, >+ &cropped_height, &out_width, &out_height)); >+ EXPECT_EQ(360, cropped_width); >+ EXPECT_EQ(640, cropped_height); >+ EXPECT_EQ(360, out_width); >+ EXPECT_EQ(640, out_height); >+} >+ >+// Force output to be 9:16, even for landscape input. 
>+TEST(VideoAdapterTestMultipleOrientation, TestForcePortrait) { >+ VideoAdapter video_adapter; >+ video_adapter.OnOutputFormatRequest(std::make_pair(360, 640), 640 * 360, >+ std::make_pair(360, 640), 360 * 640, 30); >+ >+ int cropped_width; >+ int cropped_height; >+ int out_width; >+ int out_height; >+ EXPECT_TRUE(video_adapter.AdaptFrameResolution( >+ /* in_width= */ 640, /* in_height= */ 480, /* in_timestamp_ns= */ 0, >+ &cropped_width, &cropped_height, &out_width, &out_height)); >+ EXPECT_EQ(270, cropped_width); >+ EXPECT_EQ(480, cropped_height); >+ EXPECT_EQ(270, out_width); >+ EXPECT_EQ(480, out_height); >+ >+ EXPECT_TRUE(video_adapter.AdaptFrameResolution( >+ /* in_width= */ 480, /* in_height= */ 640, >+ /* in_timestamp_ns= */ rtc::kNumNanosecsPerSec / 30, &cropped_width, >+ &cropped_height, &out_width, &out_height)); >+ EXPECT_EQ(360, cropped_width); >+ EXPECT_EQ(640, cropped_height); >+ EXPECT_EQ(360, out_width); >+ EXPECT_EQ(640, out_height); >+} >+ > } // namespace cricket >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/videocapturer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/videocapturer.h >index 826e38fa5a1acbdb1628e7745a10d5204986fb52..32742062aa928c0217885c3d0aa386ba918e1746 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/videocapturer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/videocapturer.h >@@ -26,6 +26,7 @@ > #include "media/base/videocommon.h" > #include "rtc_base/constructormagic.h" > #include "rtc_base/criticalsection.h" >+#include "rtc_base/system/rtc_export.h" > #include "rtc_base/third_party/sigslot/sigslot.h" > #include "rtc_base/thread_checker.h" > #include "rtc_base/timestampaligner.h" >@@ -72,8 +73,9 @@ enum CaptureState { > // media engine thread). Hence, the VideoCapture subclasses dont need to be > // thread safe. 
> // >-class VideoCapturer : public sigslot::has_slots<>, >- public rtc::VideoSourceInterface<webrtc::VideoFrame> { >+class RTC_EXPORT VideoCapturer >+ : public sigslot::has_slots<>, >+ public rtc::VideoSourceInterface<webrtc::VideoFrame> { > public: > VideoCapturer(); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/videocapturer_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/videocapturer_unittest.cc >index 080f4a08caeab048d6fa1e3beb9dbd08467d9f3f..07344ca7ac85ae2451b497fa49e6740e758bcfd1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/videocapturer_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/videocapturer_unittest.cc >@@ -15,7 +15,6 @@ > > #include "media/base/fakevideocapturer.h" > #include "media/base/fakevideorenderer.h" >-#include "media/base/testutils.h" > #include "media/base/videocapturer.h" > #include "rtc_base/gunit.h" > #include "rtc_base/logging.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/videocommon.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/videocommon.h >index 0ffd989cb6515f0e5e9d22001b87cbf645cc7998..6447cba3b2b69ac53f589a77cbac55024b3ea64c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/videocommon.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/videocommon.h >@@ -17,6 +17,7 @@ > > #include <string> > >+#include "rtc_base/system/rtc_export.h" > #include "rtc_base/timeutils.h" > > namespace cricket { >@@ -149,7 +150,7 @@ struct VideoFormatPod { > uint32_t fourcc; // Color space. FOURCC_ANY means that any color space is OK. > }; > >-struct VideoFormat : VideoFormatPod { >+struct RTC_EXPORT VideoFormat : VideoFormatPod { > static const int64_t kMinimumInterval = > rtc::kNumNanosecsPerSec / 10000; // 10k fps. 
> >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/vp9_profile.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/vp9_profile.cc >index a28a912762a729e62796740aeb84815c3b991c2a..c9ccaa9f9d7dec1f0591935ddf960f9c0fd6c3ca 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/vp9_profile.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/vp9_profile.cc >@@ -15,7 +15,7 @@ > namespace webrtc { > > // Profile information for VP9 video. >-const char kVP9FmtpProfileId[] = "x-google-profile-id"; >+const char kVP9FmtpProfileId[] = "profile-id"; > > std::string VP9ProfileToString(VP9Profile profile) { > switch (profile) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/vp9_profile.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/vp9_profile.h >index 5cc1d166a8789b1829d7aa2d91951ca5efe08258..755c46da2cbdb963b5aedccae4e9596f8564f204 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/vp9_profile.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/base/vp9_profile.h >@@ -16,7 +16,6 @@ > > #include "absl/types/optional.h" > #include "api/video_codecs/sdp_video_format.h" >-#include "common_types.h" // NOLINT(build/include) > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/adm_helpers.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/adm_helpers.h >index c6ea3a209f2c9af3632bc48c39376ad4c95af280..2a35d26b4716140533e96a6af8a7de35df76442c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/adm_helpers.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/adm_helpers.h >@@ -11,8 +11,6 @@ > #ifndef MEDIA_ENGINE_ADM_HELPERS_H_ > #define MEDIA_ENGINE_ADM_HELPERS_H_ > >-#include "common_types.h" // NOLINT(build/include) >- > namespace webrtc { > > class AudioDeviceModule; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/apm_helpers_unittest.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/apm_helpers_unittest.cc >index 464e053dc0a8d16bb6e1d2bff69a993b2d4ef92c..70ac0c1c21e8a535bcdb6a03099d83cd35639c1d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/apm_helpers_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/apm_helpers_unittest.cc >@@ -164,11 +164,4 @@ TEST(ApmHelpersTest, TypingDetectionStatus_EnableDisable) { > EXPECT_EQ(VoiceDetection::kVeryLowLikelihood, vd->likelihood()); > EXPECT_FALSE(vd->is_enabled()); > } >- >-// TODO(solenberg): Move this test to a better place - added here for the sake >-// of duplicating all relevant tests from audio_processing_test.cc. >-TEST(ApmHelpersTest, HighPassFilter_DefaultMode) { >- TestHelper helper; >- EXPECT_FALSE(helper.apm()->high_pass_filter()->is_enabled()); >-} > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/constants.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/constants.cc >index 736685fc79f3684f4df4195eeb4f48503ae45815..fb3ac698e6170278e5b8e81377ed78edb2dad6db 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/constants.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/constants.cc >@@ -14,5 +14,6 @@ namespace cricket { > > const int kMinVideoBitrateBps = 30000; > const int kVideoMtu = 1200; >-const int kVideoRtpBufferSize = 65536; >+const int kVideoRtpSendBufferSize = 65536; >+const int kVideoRtpRecvBufferSize = 262144; > } // namespace cricket >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/constants.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/constants.h >index ea6c0754a65d972ebf8617bdb14aa869a18a6a24..b136060546100194ed6fe51478feb5b1583839e6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/constants.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/constants.h >@@ -14,7 +14,8 @@ > namespace cricket { > > extern const int kVideoMtu; 
>-extern const int kVideoRtpBufferSize; >+extern const int kVideoRtpSendBufferSize; >+extern const int kVideoRtpRecvBufferSize; > > extern const char kH264CodecName[]; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/convert_legacy_video_factory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/convert_legacy_video_factory.cc >index 7da1b0fe2fd7c2b76382e21ffdd12338d15e8868..297a143425b4c1ea41cc051afa43ad9135c3cd33 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/convert_legacy_video_factory.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/convert_legacy_video_factory.cc >@@ -15,6 +15,7 @@ > #include <vector> > > #include "absl/memory/memory.h" >+#include "absl/strings/match.h" > #include "api/video_codecs/video_decoder_factory.h" > #include "api/video_codecs/video_decoder_software_fallback_wrapper.h" > #include "api/video_codecs/video_encoder_factory.h" >@@ -130,7 +131,7 @@ class EncoderAdapter : public webrtc::VideoEncoderFactory { > if (IsFormatSupported(internal_encoder_factory_->GetSupportedFormats(), > format)) { > internal_encoder = >- CodecNamesEq(format.name.c_str(), kVp8CodecName) >+ absl::EqualsIgnoreCase(format.name, kVp8CodecName) > ? absl::make_unique<webrtc::VP8EncoderSimulcastProxy>( > internal_encoder_factory_.get(), format) > : internal_encoder_factory_->CreateVideoEncoder(format); >@@ -141,7 +142,7 @@ class EncoderAdapter : public webrtc::VideoEncoderFactory { > if (IsFormatSupported(external_encoder_factory_->GetSupportedFormats(), > format)) { > external_encoder = >- CodecNamesEq(format.name.c_str(), kVp8CodecName) >+ absl::EqualsIgnoreCase(format.name, kVp8CodecName) > ? 
absl::make_unique<webrtc::SimulcastEncoderAdapter>( > external_encoder_factory_.get(), format) > : external_encoder_factory_->CreateVideoEncoder(format); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/convert_legacy_video_factory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/convert_legacy_video_factory.h >index 5bd3580a6f10425d30fd95d4381368ba687ecbbf..27383ab386e9f3983de4325733dd1e486db65c99 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/convert_legacy_video_factory.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/convert_legacy_video_factory.h >@@ -13,6 +13,8 @@ > > #include <memory> > >+#include "rtc_base/system/rtc_export.h" >+ > namespace webrtc { > class VideoEncoderFactory; > class VideoDecoderFactory; >@@ -27,10 +29,12 @@ class WebRtcVideoDecoderFactory; > // new type of codec factories. The purpose of these functions is to provide an > // easy way for clients to migrate to the API with new factory types. > // TODO(magjed): Remove once old factories are gone, webrtc:7925. 
>-std::unique_ptr<webrtc::VideoEncoderFactory> ConvertVideoEncoderFactory( >+RTC_EXPORT std::unique_ptr<webrtc::VideoEncoderFactory> >+ConvertVideoEncoderFactory( > std::unique_ptr<WebRtcVideoEncoderFactory> external_encoder_factory); > >-std::unique_ptr<webrtc::VideoDecoderFactory> ConvertVideoDecoderFactory( >+RTC_EXPORT std::unique_ptr<webrtc::VideoDecoderFactory> >+ConvertVideoDecoderFactory( > std::unique_ptr<WebRtcVideoDecoderFactory> external_decoder_factory); > > } // namespace cricket >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/fakewebrtccall.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/fakewebrtccall.cc >index ce99b9c1034c09215408948836e4c9443112af42..6c5b8c74b8f08d1829f5852efb3fd45919ea14c2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/fakewebrtccall.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/fakewebrtccall.cc >@@ -123,6 +123,7 @@ FakeVideoSendStream::FakeVideoSendStream( > source_(nullptr), > num_swapped_frames_(0) { > RTC_DCHECK(config.encoder_settings.encoder_factory != nullptr); >+ RTC_DCHECK(config.encoder_settings.bitrate_allocator_factory != nullptr); > ReconfigureVideoEncoder(std::move(encoder_config)); > } > >@@ -643,4 +644,7 @@ void FakeCall::OnSentPacket(const rtc::SentPacket& sent_packet) { > } > } > >+void FakeCall::MediaTransportChange( >+ webrtc::MediaTransportInterface* media_transport_interface) {} >+ > } // namespace cricket >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/fakewebrtccall.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/fakewebrtccall.h >index dbcedb8f4108fe55f1a2181b88a0316c9b20889b..1b6deb0702da7f35e7425ae02f663052915be177 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/fakewebrtccall.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/fakewebrtccall.h >@@ -273,6 +273,9 @@ class FakeCall final : public webrtc::Call, public webrtc::PacketReceiver { > int 
GetNumCreatedReceiveStreams() const; > void SetStats(const webrtc::Call::Stats& stats); > >+ void MediaTransportChange( >+ webrtc::MediaTransportInterface* media_transport_interface) override; >+ > private: > webrtc::AudioSendStream* CreateAudioSendStream( > const webrtc::AudioSendStream::Config& config) override; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/fakewebrtcvideocapturemodule.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/fakewebrtcvideocapturemodule.h >index 47f1393c3f3819e851710bb84bb281a4ebd40d38..7556811f2de01962bfe014e5056324c2d68400fe 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/fakewebrtcvideocapturemodule.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/fakewebrtcvideocapturemodule.h >@@ -14,7 +14,6 @@ > #include <vector> > > #include "api/video/i420_buffer.h" >-#include "media/base/testutils.h" > #include "media/engine/webrtcvideocapturer.h" > #include "rtc_base/task_queue_for_test.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/fakewebrtcvideoengine.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/fakewebrtcvideoengine.cc >index 7081b64933a956f2c06aa5a86284574b10f3db72..690f3e7f831eca760a75c5114434ccde6b914bd1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/fakewebrtcvideoengine.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/fakewebrtcvideoengine.cc >@@ -10,6 +10,7 @@ > > #include "media/engine/fakewebrtcvideoengine.h" > >+#include "absl/strings/match.h" > #include "media/base/codec.h" > #include "media/engine/simulcast_encoder_adapter.h" > #include "media/engine/webrtcvideodecoderfactory.h" >@@ -128,9 +129,7 @@ FakeWebRtcVideoDecoderFactory::decoders() { > // Encoder. 
> FakeWebRtcVideoEncoder::FakeWebRtcVideoEncoder( > FakeWebRtcVideoEncoderFactory* factory) >- : init_encode_event_(false, false), >- num_frames_encoded_(0), >- factory_(factory) {} >+ : num_frames_encoded_(0), factory_(factory) {} > > FakeWebRtcVideoEncoder::~FakeWebRtcVideoEncoder() { > if (factory_) { >@@ -167,11 +166,6 @@ int32_t FakeWebRtcVideoEncoder::Release() { > return WEBRTC_VIDEO_CODEC_OK; > } > >-int32_t FakeWebRtcVideoEncoder::SetChannelParameters(uint32_t packetLoss, >- int64_t rtt) { >- return WEBRTC_VIDEO_CODEC_OK; >-} >- > int32_t FakeWebRtcVideoEncoder::SetRateAllocation( > const webrtc::VideoBitrateAllocation& allocation, > uint32_t framerate) { >@@ -194,8 +188,7 @@ int FakeWebRtcVideoEncoder::GetNumEncodedFrames() { > > // Video encoder factory. > FakeWebRtcVideoEncoderFactory::FakeWebRtcVideoEncoderFactory() >- : created_video_encoder_event_(false, false), >- num_created_encoders_(0), >+ : num_created_encoders_(0), > encoders_have_internal_sources_(false), > vp8_factory_mode_(false) {} > >@@ -218,7 +211,7 @@ FakeWebRtcVideoEncoderFactory::CreateVideoEncoder( > rtc::CritScope lock(&crit_); > std::unique_ptr<webrtc::VideoEncoder> encoder; > if (IsFormatSupported(formats_, format)) { >- if (CodecNamesEq(format.name.c_str(), kVp8CodecName) && >+ if (absl::EqualsIgnoreCase(format.name, kVp8CodecName) && > !vp8_factory_mode_) { > // The simulcast adapter will ask this factory for multiple VP8 > // encoders. 
Enter vp8_factory_mode so that we now create these encoders >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/fakewebrtcvideoengine.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/fakewebrtcvideoengine.h >index f4287a31d26026fdee8931be9347db489846ae23..6ed92a789b57e2dd790980d7a9c8962ba247f098 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/fakewebrtcvideoengine.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/fakewebrtcvideoengine.h >@@ -86,7 +86,6 @@ class FakeWebRtcVideoEncoder : public webrtc::VideoEncoder { > int32_t RegisterEncodeCompleteCallback( > webrtc::EncodedImageCallback* callback) override; > int32_t Release() override; >- int32_t SetChannelParameters(uint32_t packetLoss, int64_t rtt) override; > int32_t SetRateAllocation(const webrtc::VideoBitrateAllocation& allocation, > uint32_t framerate) override; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/internaldecoderfactory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/internaldecoderfactory.cc >index df74773c86db26ee374841247aa10bb400b197ac..722413338e51198d2fc616af59e99f9f760f7bc7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/internaldecoderfactory.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/internaldecoderfactory.cc >@@ -10,6 +10,7 @@ > > #include "media/engine/internaldecoderfactory.h" > >+#include "absl/strings/match.h" > #include "api/video_codecs/sdp_video_format.h" > #include "media/base/mediaconstants.h" > #include "modules/video_coding/codecs/h264/include/h264.h" >@@ -55,11 +56,11 @@ std::unique_ptr<VideoDecoder> InternalDecoderFactory::CreateVideoDecoder( > return nullptr; > } > >- if (cricket::CodecNamesEq(format.name, cricket::kVp8CodecName)) >+ if (absl::EqualsIgnoreCase(format.name, cricket::kVp8CodecName)) > return VP8Decoder::Create(); >- if (cricket::CodecNamesEq(format.name, cricket::kVp9CodecName)) >+ if (absl::EqualsIgnoreCase(format.name, 
cricket::kVp9CodecName)) > return VP9Decoder::Create(); >- if (cricket::CodecNamesEq(format.name, cricket::kH264CodecName)) >+ if (absl::EqualsIgnoreCase(format.name, cricket::kH264CodecName)) > return H264Decoder::Create(); > > RTC_NOTREACHED(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/internalencoderfactory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/internalencoderfactory.cc >index e6c3c2e72fd632eac71eb38dfab61327a18b82df..e81b73d5e3d6cf15ac38cf7d35535d15d4220c87 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/internalencoderfactory.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/internalencoderfactory.cc >@@ -12,6 +12,7 @@ > > #include <utility> > >+#include "absl/strings/match.h" > #include "api/video_codecs/sdp_video_format.h" > #include "modules/video_coding/codecs/h264/include/h264.h" > #include "modules/video_coding/codecs/vp8/include/vp8.h" >@@ -41,11 +42,11 @@ VideoEncoderFactory::CodecInfo InternalEncoderFactory::QueryVideoEncoder( > > std::unique_ptr<VideoEncoder> InternalEncoderFactory::CreateVideoEncoder( > const SdpVideoFormat& format) { >- if (cricket::CodecNamesEq(format.name, cricket::kVp8CodecName)) >+ if (absl::EqualsIgnoreCase(format.name, cricket::kVp8CodecName)) > return VP8Encoder::Create(); >- if (cricket::CodecNamesEq(format.name, cricket::kVp9CodecName)) >+ if (absl::EqualsIgnoreCase(format.name, cricket::kVp9CodecName)) > return VP9Encoder::Create(cricket::VideoCodec(format)); >- if (cricket::CodecNamesEq(format.name, cricket::kH264CodecName)) >+ if (absl::EqualsIgnoreCase(format.name, cricket::kH264CodecName)) > return H264Encoder::Create(cricket::VideoCodec(format)); > RTC_LOG(LS_ERROR) << "Trying to created encoder of unsupported format " > << format.name; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/multiplexcodecfactory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/multiplexcodecfactory.cc >index 
236a2e81acc722d848554a87adf4840c24d75f94..b3913243edeb05d163dd3a2ab873fc00f200cf49 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/multiplexcodecfactory.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/multiplexcodecfactory.cc >@@ -12,6 +12,7 @@ > > #include <utility> > >+#include "absl/strings/match.h" > #include "api/video_codecs/sdp_video_format.h" > #include "media/base/codec.h" > #include "media/base/mediaconstants.h" >@@ -22,8 +23,8 @@ > namespace { > > bool IsMultiplexCodec(const cricket::VideoCodec& codec) { >- return cricket::CodecNamesEq(codec.name.c_str(), >- cricket::kMultiplexCodecName); >+ return absl::EqualsIgnoreCase(codec.name.c_str(), >+ cricket::kMultiplexCodecName); > } > > } // anonymous namespace >@@ -42,7 +43,7 @@ std::vector<SdpVideoFormat> MultiplexEncoderFactory::GetSupportedFormats() > const { > std::vector<SdpVideoFormat> formats = factory_->GetSupportedFormats(); > for (const auto& format : formats) { >- if (cricket::CodecNamesEq(format.name, kMultiplexAssociatedCodecName)) { >+ if (absl::EqualsIgnoreCase(format.name, kMultiplexAssociatedCodecName)) { > SdpVideoFormat multiplex_format = format; > multiplex_format.parameters[cricket::kCodecParamAssociatedCodecName] = > format.name; >@@ -88,7 +89,7 @@ std::vector<SdpVideoFormat> MultiplexDecoderFactory::GetSupportedFormats() > const { > std::vector<SdpVideoFormat> formats = factory_->GetSupportedFormats(); > for (const auto& format : formats) { >- if (cricket::CodecNamesEq(format.name, kMultiplexAssociatedCodecName)) { >+ if (absl::EqualsIgnoreCase(format.name, kMultiplexAssociatedCodecName)) { > SdpVideoFormat multiplex_format = format; > multiplex_format.parameters[cricket::kCodecParamAssociatedCodecName] = > format.name; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/multiplexcodecfactory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/multiplexcodecfactory.h >index 
7a851c8c1ecfc4203ca83fae91ac86eac77fe67a..ae099a4ef3e78b1efadfb192d76903015dcb5349 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/multiplexcodecfactory.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/multiplexcodecfactory.h >@@ -16,6 +16,7 @@ > > #include "api/video_codecs/video_decoder_factory.h" > #include "api/video_codecs/video_encoder_factory.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > // Multiplex codec is a completely modular/optional codec that allows users to >@@ -36,7 +37,7 @@ namespace webrtc { > // and decoder instance(s) using these factories. > // - Use Multiplex*coderFactory classes in CreatePeerConnectionFactory() calls. > // - Select "multiplex" codec in SDP negotiation. >-class MultiplexEncoderFactory : public VideoEncoderFactory { >+class RTC_EXPORT MultiplexEncoderFactory : public VideoEncoderFactory { > public: > // |supports_augmenting_data| defines if the encoder would support augmenting > // data. If set, the encoder expects to receive video frame buffers of type >@@ -54,7 +55,7 @@ class MultiplexEncoderFactory : public VideoEncoderFactory { > const bool supports_augmenting_data_; > }; > >-class MultiplexDecoderFactory : public VideoDecoderFactory { >+class RTC_EXPORT MultiplexDecoderFactory : public VideoDecoderFactory { > public: > // |supports_augmenting_data| defines if the decoder would support augmenting > // data. 
If set, the decoder is expected to output video frame buffers of type >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/nullwebrtcvideoengine.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/nullwebrtcvideoengine.h >index 9af0f9b130c6c2e4564534e9022de07e757fc8c1..62326dd4ddd43b39698edd122de0cc1addef5a9c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/nullwebrtcvideoengine.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/nullwebrtcvideoengine.h >@@ -30,15 +30,19 @@ class WebRtcVideoEncoderFactory; > > // Video engine implementation that does nothing and can be used in > // CompositeMediaEngine. >-class NullWebRtcVideoEngine { >+class NullWebRtcVideoEngine : public VideoEngineInterface { > public: >- std::vector<VideoCodec> codecs() const { return std::vector<VideoCodec>(); } >+ std::vector<VideoCodec> codecs() const override { >+ return std::vector<VideoCodec>(); >+ } > >- RtpCapabilities GetCapabilities() const { return RtpCapabilities(); } >+ RtpCapabilities GetCapabilities() const override { return RtpCapabilities(); } > >- VideoMediaChannel* CreateChannel(webrtc::Call* call, >- const MediaConfig& config, >- const VideoOptions& options) { >+ VideoMediaChannel* CreateMediaChannel( >+ webrtc::Call* call, >+ const MediaConfig& config, >+ const VideoOptions& options, >+ const webrtc::CryptoOptions& crypto_options) override { > return nullptr; > } > }; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/nullwebrtcvideoengine_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/nullwebrtcvideoengine_unittest.cc >index 6c740b8c8c372ab8a290c31bdcae575c855b2adf..e718d1bb697e21822728fa8c1f45ecc5e5d86334 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/nullwebrtcvideoengine_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/nullwebrtcvideoengine_unittest.cc >@@ -9,6 +9,7 @@ > */ > > #include "media/engine/nullwebrtcvideoengine.h" 
>+#include "absl/memory/memory.h" > #include "media/engine/webrtcvoiceengine.h" > #include "modules/audio_device/include/mock_audio_device.h" > #include "modules/audio_processing/include/audio_processing.h" >@@ -18,8 +19,7 @@ > > namespace cricket { > >-class WebRtcMediaEngineNullVideo >- : public CompositeMediaEngine<WebRtcVoiceEngine, NullWebRtcVideoEngine> { >+class WebRtcMediaEngineNullVideo : public CompositeMediaEngine { > public: > WebRtcMediaEngineNullVideo( > webrtc::AudioDeviceModule* adm, >@@ -27,13 +27,13 @@ class WebRtcMediaEngineNullVideo > audio_encoder_factory, > const rtc::scoped_refptr<webrtc::AudioDecoderFactory>& > audio_decoder_factory) >- : CompositeMediaEngine<WebRtcVoiceEngine, NullWebRtcVideoEngine>( >- std::forward_as_tuple(adm, >- audio_encoder_factory, >- audio_decoder_factory, >- nullptr, >- webrtc::AudioProcessingBuilder().Create()), >- std::forward_as_tuple()) {} >+ : CompositeMediaEngine(absl::make_unique<WebRtcVoiceEngine>( >+ adm, >+ audio_encoder_factory, >+ audio_decoder_factory, >+ nullptr, >+ webrtc::AudioProcessingBuilder().Create()), >+ absl::make_unique<NullWebRtcVideoEngine>()) {} > }; > > // Simple test to check if NullWebRtcVideoEngine implements the methods >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/payload_type_mapper.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/payload_type_mapper.cc >index 5d2fb6757c445ad212517900e894dc8409df5655..ae4e309ef4900d3b917f89ad0618fdced27219a6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/payload_type_mapper.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/payload_type_mapper.cc >@@ -12,8 +12,8 @@ > > #include <utility> > >+#include "absl/strings/ascii.h" > #include "api/audio_codecs/audio_format.h" >-#include "common_types.h" // NOLINT(build/include) > #include "media/base/mediaconstants.h" > > namespace cricket { >@@ -140,7 +140,8 @@ bool PayloadTypeMapper::SdpAudioFormatOrdering::operator()( > const 
webrtc::SdpAudioFormat& b) const { > if (a.clockrate_hz == b.clockrate_hz) { > if (a.num_channels == b.num_channels) { >- int name_cmp = STR_CASE_CMP(a.name.c_str(), b.name.c_str()); >+ int name_cmp = >+ absl::AsciiStrToLower(a.name).compare(absl::AsciiStrToLower(b.name)); > if (name_cmp == 0) > return a.parameters < b.parameters; > return name_cmp < 0; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/scopedvideoencoder.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/scopedvideoencoder.cc >index 026654e07204d044b69f5cb89666f78b92ab31c6..db1148bf30fe2009304b57a7972bb8ff1ccf6350 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/scopedvideoencoder.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/scopedvideoencoder.cc >@@ -33,14 +33,10 @@ class ScopedVideoEncoder : public webrtc::VideoEncoder { > int32_t Encode(const webrtc::VideoFrame& frame, > const webrtc::CodecSpecificInfo* codec_specific_info, > const std::vector<webrtc::FrameType>* frame_types) override; >- int32_t SetChannelParameters(uint32_t packet_loss, int64_t rtt) override; > int32_t SetRates(uint32_t bitrate, uint32_t framerate) override; > int32_t SetRateAllocation(const webrtc::VideoBitrateAllocation& allocation, > uint32_t framerate) override; >- ScalingSettings GetScalingSettings() const override; >- bool SupportsNativeHandle() const override; >- const char* ImplementationName() const override; >- >+ EncoderInfo GetEncoderInfo() const override; > ~ScopedVideoEncoder() override; > > private: >@@ -75,11 +71,6 @@ int32_t ScopedVideoEncoder::Encode( > return encoder_->Encode(frame, codec_specific_info, frame_types); > } > >-int32_t ScopedVideoEncoder::SetChannelParameters(uint32_t packet_loss, >- int64_t rtt) { >- return encoder_->SetChannelParameters(packet_loss, rtt); >-} >- > int32_t ScopedVideoEncoder::SetRates(uint32_t bitrate, uint32_t framerate) { > return encoder_->SetRates(bitrate, framerate); > } >@@ -90,17 +81,8 @@ int32_t 
ScopedVideoEncoder::SetRateAllocation( > return encoder_->SetRateAllocation(allocation, framerate); > } > >-webrtc::VideoEncoder::ScalingSettings ScopedVideoEncoder::GetScalingSettings() >- const { >- return encoder_->GetScalingSettings(); >-} >- >-bool ScopedVideoEncoder::SupportsNativeHandle() const { >- return encoder_->SupportsNativeHandle(); >-} >- >-const char* ScopedVideoEncoder::ImplementationName() const { >- return encoder_->ImplementationName(); >+webrtc::VideoEncoder::EncoderInfo ScopedVideoEncoder::GetEncoderInfo() const { >+ return encoder_->GetEncoderInfo(); > } > > ScopedVideoEncoder::~ScopedVideoEncoder() { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/simulcast.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/simulcast.cc >index c42e1dcae290840a9b398cc830d12c81cf6161c6..de3d8df097f1505d01f64ce30fe7a6e022dc0fe4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/simulcast.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/simulcast.cc >@@ -18,6 +18,7 @@ > #include "media/engine/simulcast.h" > #include "modules/video_coding/utility/simulcast_rate_allocator.h" > #include "rtc_base/arraysize.h" >+#include "rtc_base/experiments/normalize_simulcast_size_experiment.h" > #include "rtc_base/logging.h" > #include "system_wrappers/include/field_trial.h" > >@@ -140,7 +141,13 @@ int FindSimulcastFormatIndex(int width, int height, size_t max_layers) { > // Simulcast stream width and height must both be dividable by > // |2 ^ (simulcast_layers - 1)|. 
> int NormalizeSimulcastSize(int size, size_t simulcast_layers) { >- const int base2_exponent = static_cast<int>(simulcast_layers) - 1; >+ int base2_exponent = static_cast<int>(simulcast_layers) - 1; >+ const absl::optional<int> experimental_base2_exponent = >+ webrtc::NormalizeSimulcastSizeExperiment::GetBase2Exponent(); >+ if (experimental_base2_exponent && >+ (size > (1 << *experimental_base2_exponent))) { >+ base2_exponent = *experimental_base2_exponent; >+ } > return ((size >> base2_exponent) << base2_exponent); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/simulcast_encoder_adapter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/simulcast_encoder_adapter.cc >index 30da68ec85ca8b7fa0de71a71c65b8cd4d1725c8..6241ff9d36d691d039cdfec6c6c6666122d5967d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/simulcast_encoder_adapter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/simulcast_encoder_adapter.cc >@@ -17,9 +17,9 @@ > #include "api/video/i420_buffer.h" > #include "api/video/video_bitrate_allocation.h" > #include "api/video_codecs/video_encoder_factory.h" >-#include "media/engine/scopedvideoencoder.h" > #include "modules/video_coding/utility/simulcast_rate_allocator.h" > #include "rtc_base/checks.h" >+#include "rtc_base/logging.h" > #include "system_wrappers/include/clock.h" > #include "system_wrappers/include/field_trial.h" > #include "third_party/libyuv/include/libyuv/scale.h" >@@ -127,9 +127,9 @@ SimulcastEncoderAdapter::SimulcastEncoderAdapter(VideoEncoderFactory* factory, > factory_(factory), > video_format_(format), > encoded_complete_callback_(nullptr), >- implementation_name_("SimulcastEncoderAdapter"), > experimental_boosted_screenshare_qp_(GetScreenshareBoostedQpValue()) { > RTC_DCHECK(factory_); >+ encoder_info_.implementation_name = "SimulcastEncoderAdapter"; > > // The adapter is typically created on the worker thread, but operated on > // the encoder task queue. 
>@@ -203,7 +203,8 @@ int SimulcastEncoderAdapter::InitEncode(const VideoCodec* inst, > start_bitrates.push_back(stream_bitrate); > } > >- std::string implementation_name; >+ encoder_info_.supports_native_handle = true; >+ encoder_info_.scaling_settings.thresholds = absl::nullopt; > // Create |number_of_streams| of encoder instances and init them. > for (int i = 0; i < number_of_streams; ++i) { > VideoCodec stream_codec; >@@ -212,6 +213,7 @@ int SimulcastEncoderAdapter::InitEncode(const VideoCodec* inst, > if (!doing_simulcast) { > stream_codec = codec_; > stream_codec.numberOfSimulcastStreams = 1; >+ > } else { > // Cap start bitrate to the min bitrate in order to avoid strange codec > // behavior. Since sending will be false, this should not matter. >@@ -247,6 +249,7 @@ int SimulcastEncoderAdapter::InitEncode(const VideoCodec* inst, > Release(); > return ret; > } >+ > std::unique_ptr<EncodedImageCallback> callback( > new AdapterEncodedImageCallback(this, i)); > encoder->RegisterEncodeCompleteCallback(callback.get()); >@@ -254,17 +257,45 @@ int SimulcastEncoderAdapter::InitEncode(const VideoCodec* inst, > stream_codec.width, stream_codec.height, > send_stream); > >- if (i != 0) { >- implementation_name += ", "; >+ if (!doing_simulcast) { >+ // Without simulcast, just pass through the encoder info from the one >+ // active encoder. >+ encoder_info_ = streaminfos_[0].encoder->GetEncoderInfo(); >+ } else { >+ const EncoderInfo encoder_impl_info = >+ streaminfos_[i].encoder->GetEncoderInfo(); >+ >+ if (i == 0) { >+ // Quality scaling not enabled for simulcast. >+ encoder_info_.scaling_settings = VideoEncoder::ScalingSettings::kOff; >+ >+ // Encoder name indicates names of all sub-encoders. 
>+ encoder_info_.implementation_name = "SimulcastEncoderAdapter ("; >+ encoder_info_.implementation_name += >+ encoder_impl_info.implementation_name; >+ >+ encoder_info_.supports_native_handle = >+ encoder_impl_info.supports_native_handle; >+ encoder_info_.has_trusted_rate_controller = >+ encoder_impl_info.has_trusted_rate_controller; >+ } else { >+ encoder_info_.implementation_name += ", "; >+ encoder_info_.implementation_name += >+ encoder_impl_info.implementation_name; >+ >+ // Native handle supported only if all encoders supports it. >+ encoder_info_.supports_native_handle &= >+ encoder_impl_info.supports_native_handle; >+ >+ // Trusted rate controller only if all encoders have it. >+ encoder_info_.has_trusted_rate_controller &= >+ encoder_impl_info.has_trusted_rate_controller; >+ } > } >- implementation_name += streaminfos_[i].encoder->ImplementationName(); > } > > if (doing_simulcast) { >- implementation_name_ = >- "SimulcastEncoderAdapter (" + implementation_name + ")"; >- } else { >- implementation_name_ = implementation_name; >+ encoder_info_.implementation_name += ")"; > } > > // To save memory, don't store encoders that we don't use. 
>@@ -377,15 +408,6 @@ int SimulcastEncoderAdapter::RegisterEncodeCompleteCallback( > return WEBRTC_VIDEO_CODEC_OK; > } > >-int SimulcastEncoderAdapter::SetChannelParameters(uint32_t packet_loss, >- int64_t rtt) { >- RTC_DCHECK_CALLED_SEQUENTIALLY(&encoder_queue_); >- for (size_t stream_idx = 0; stream_idx < streaminfos_.size(); ++stream_idx) { >- streaminfos_[stream_idx].encoder->SetChannelParameters(packet_loss, rtt); >- } >- return WEBRTC_VIDEO_CODEC_OK; >-} >- > int SimulcastEncoderAdapter::SetRateAllocation( > const VideoBitrateAllocation& bitrate, > uint32_t new_framerate) { >@@ -452,7 +474,6 @@ EncodedImageCallback::Result SimulcastEncoderAdapter::OnEncodedImage( > const RTPFragmentationHeader* fragmentation) { > EncodedImage stream_image(encodedImage); > CodecSpecificInfo stream_codec_specific = *codecSpecificInfo; >- stream_codec_specific.codec_name = implementation_name_.c_str(); > > stream_image.SetSpatialIndex(stream_idx); > >@@ -517,32 +538,8 @@ void SimulcastEncoderAdapter::DestroyStoredEncoders() { > } > } > >-bool SimulcastEncoderAdapter::SupportsNativeHandle() const { >- RTC_DCHECK_CALLED_SEQUENTIALLY(&encoder_queue_); >- // We should not be calling this method before streaminfos_ are configured. >- RTC_DCHECK(!streaminfos_.empty()); >- for (const auto& streaminfo : streaminfos_) { >- if (!streaminfo.encoder->SupportsNativeHandle()) { >- return false; >- } >- } >- return true; >-} >- >-VideoEncoder::ScalingSettings SimulcastEncoderAdapter::GetScalingSettings() >- const { >- // TODO(brandtr): Investigate why the sequence checker below fails on mac. >- // RTC_DCHECK_CALLED_SEQUENTIALLY(&encoder_queue_); >- // Turn off quality scaling for simulcast. 
>- if (!Initialized() || NumberOfStreams(codec_) != 1) { >- return VideoEncoder::ScalingSettings::kOff; >- } >- return streaminfos_[0].encoder->GetScalingSettings(); >-} >- >-const char* SimulcastEncoderAdapter::ImplementationName() const { >- RTC_DCHECK_CALLED_SEQUENTIALLY(&encoder_queue_); >- return implementation_name_.c_str(); >+VideoEncoder::EncoderInfo SimulcastEncoderAdapter::GetEncoderInfo() const { >+ return encoder_info_; > } > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/simulcast_encoder_adapter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/simulcast_encoder_adapter.h >index 52643aa79b16270019f596e789daae99591a94b5..783c342d6b9fc4a289be7758af93adc64ddd7c3d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/simulcast_encoder_adapter.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/simulcast_encoder_adapter.h >@@ -19,7 +19,7 @@ > #include <vector> > > #include "absl/types/optional.h" >-#include "media/engine/webrtcvideoencoderfactory.h" >+#include "api/video_codecs/sdp_video_format.h" > #include "modules/video_coding/include/video_codec_interface.h" > #include "rtc_base/atomicops.h" > #include "rtc_base/sequenced_task_checker.h" >@@ -48,7 +48,6 @@ class SimulcastEncoderAdapter : public VideoEncoder { > const CodecSpecificInfo* codec_specific_info, > const std::vector<FrameType>* frame_types) override; > int RegisterEncodeCompleteCallback(EncodedImageCallback* callback) override; >- int SetChannelParameters(uint32_t packet_loss, int64_t rtt) override; > int SetRateAllocation(const VideoBitrateAllocation& bitrate, > uint32_t new_framerate) override; > >@@ -61,10 +60,7 @@ class SimulcastEncoderAdapter : public VideoEncoder { > const CodecSpecificInfo* codec_specific_info, > const RTPFragmentationHeader* fragmentation); > >- VideoEncoder::ScalingSettings GetScalingSettings() const override; >- >- bool SupportsNativeHandle() const override; >- const char* 
ImplementationName() const override; >+ EncoderInfo GetEncoderInfo() const override; > > private: > struct StreamInfo { >@@ -104,7 +100,7 @@ class SimulcastEncoderAdapter : public VideoEncoder { > VideoCodec codec_; > std::vector<StreamInfo> streaminfos_; > EncodedImageCallback* encoded_complete_callback_; >- std::string implementation_name_; >+ EncoderInfo encoder_info_; > > // Used for checking the single-threaded access of the encoder interface. > rtc::SequencedTaskChecker encoder_queue_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/simulcast_encoder_adapter_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/simulcast_encoder_adapter_unittest.cc >index 5b7d185b8cc6ebaa2c244880ec34e05e60fb8fc3..59541755fa51dc377b6e3e63710b433ebc334bfc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/simulcast_encoder_adapter_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/simulcast_encoder_adapter_unittest.cc >@@ -15,16 +15,17 @@ > #include "absl/memory/memory.h" > #include "api/test/create_simulcast_test_fixture.h" > #include "api/test/simulcast_test_fixture.h" >+#include "api/test/video/function_video_decoder_factory.h" >+#include "api/test/video/function_video_encoder_factory.h" > #include "api/video_codecs/sdp_video_format.h" > #include "api/video_codecs/video_encoder_factory.h" > #include "common_video/include/video_frame_buffer.h" >+#include "media/base/mediaconstants.h" > #include "media/engine/internalencoderfactory.h" > #include "media/engine/simulcast_encoder_adapter.h" > #include "modules/video_coding/codecs/vp8/include/vp8.h" > #include "modules/video_coding/include/video_codec_interface.h" > #include "modules/video_coding/utility/simulcast_test_fixture_impl.h" >-#include "test/function_video_decoder_factory.h" >-#include "test/function_video_encoder_factory.h" > #include "test/gmock.h" > #include "test/gtest.h" > >@@ -55,7 +56,6 @@ std::unique_ptr<SimulcastTestFixture> 
CreateSpecificSimulcastTestFixture( > std::move(decoder_factory), > SdpVideoFormat(cricket::kVp8CodecName)); > } >- > } // namespace > > TEST(SimulcastEncoderAdapterSimulcastTest, TestKeyFrameRequestsOnAllStreams) { >@@ -144,6 +144,12 @@ TEST(SimulcastEncoderAdapterSimulcastTest, > fixture->TestSpatioTemporalLayers321PatternEncoder(); > } > >+TEST(SimulcastEncoderAdapterSimulcastTest, TestDecodeWidthHeightSet) { >+ InternalEncoderFactory internal_encoder_factory; >+ auto fixture = CreateSpecificSimulcastTestFixture(&internal_encoder_factory); >+ fixture->TestDecodeWidthHeightSet(); >+} >+ > class MockVideoEncoder; > > class MockVideoEncoderFactory : public VideoEncoderFactory { >@@ -170,7 +176,9 @@ class MockVideoEncoderFactory : public VideoEncoderFactory { > class MockVideoEncoder : public VideoEncoder { > public: > explicit MockVideoEncoder(MockVideoEncoderFactory* factory) >- : factory_(factory), callback_(nullptr) {} >+ : factory_(factory), >+ scaling_settings_(VideoEncoder::ScalingSettings::kOff), >+ callback_(nullptr) {} > > // TODO(nisse): Valid overrides commented out, because the gmock > // methods don't use any override declarations, and we want to avoid >@@ -203,10 +211,13 @@ class MockVideoEncoder : public VideoEncoder { > return 0; > } > >- MOCK_METHOD2(SetChannelParameters, int32_t(uint32_t packetLoss, int64_t rtt)); >- >- bool SupportsNativeHandle() const /* override */ { >- return supports_native_handle_; >+ EncoderInfo GetEncoderInfo() const override { >+ EncoderInfo info; >+ info.supports_native_handle = supports_native_handle_; >+ info.implementation_name = implementation_name_; >+ info.scaling_settings = scaling_settings_; >+ info.has_trusted_rate_controller = has_trusted_rate_controller_; >+ return info; > } > > virtual ~MockVideoEncoder() { factory_->DestroyVideoEncoder(this); } >@@ -228,17 +239,30 @@ class MockVideoEncoder : public VideoEncoder { > supports_native_handle_ = enabled; > } > >+ void set_implementation_name(const std::string& 
name) { >+ implementation_name_ = name; >+ } >+ > void set_init_encode_return_value(int32_t value) { > init_encode_return_value_ = value; > } > >- VideoBitrateAllocation last_set_bitrate() const { return last_set_bitrate_; } >+ void set_scaling_settings(const VideoEncoder::ScalingSettings& settings) { >+ scaling_settings_ = settings; >+ } > >- MOCK_CONST_METHOD0(ImplementationName, const char*()); >+ void set_has_trusted_rate_controller(bool trusted) { >+ has_trusted_rate_controller_ = trusted; >+ } >+ >+ VideoBitrateAllocation last_set_bitrate() const { return last_set_bitrate_; } > > private: > MockVideoEncoderFactory* const factory_; > bool supports_native_handle_ = false; >+ std::string implementation_name_ = "unknown"; >+ VideoEncoder::ScalingSettings scaling_settings_; >+ bool has_trusted_rate_controller_ = false; > int32_t init_encode_return_value_ = 0; > VideoBitrateAllocation last_set_bitrate_; > >@@ -260,7 +284,7 @@ std::unique_ptr<VideoEncoder> MockVideoEncoderFactory::CreateVideoEncoder( > const char* encoder_name = encoder_names_.empty() > ? 
"codec_implementation_name" > : encoder_names_[encoders_.size()]; >- ON_CALL(*encoder, ImplementationName()).WillByDefault(Return(encoder_name)); >+ encoder->set_implementation_name(encoder_name); > encoders_.push_back(encoder.get()); > return encoder; > } >@@ -302,15 +326,6 @@ class TestSimulcastEncoderAdapterFakeHelper { > return new SimulcastEncoderAdapter(factory_.get(), SdpVideoFormat("VP8")); > } > >- void ExpectCallSetChannelParameters(uint32_t packetLoss, int64_t rtt) { >- EXPECT_TRUE(!factory_->encoders().empty()); >- for (size_t i = 0; i < factory_->encoders().size(); ++i) { >- EXPECT_CALL(*factory_->encoders()[i], >- SetChannelParameters(packetLoss, rtt)) >- .Times(1); >- } >- } >- > MockVideoEncoderFactory* factory() { return factory_.get(); } > > private: >@@ -459,14 +474,6 @@ TEST_F(TestSimulcastEncoderAdapterFake, Reinit) { > EXPECT_EQ(0, adapter_->InitEncode(&codec_, 1, 1200)); > } > >-TEST_F(TestSimulcastEncoderAdapterFake, SetChannelParameters) { >- SetupCodec(); >- const uint32_t packetLoss = 5; >- const int64_t rtt = 30; >- helper_->ExpectCallSetChannelParameters(packetLoss, rtt); >- adapter_->SetChannelParameters(packetLoss, rtt); >-} >- > TEST_F(TestSimulcastEncoderAdapterFake, EncodedCallbackForDifferentEncoders) { > SetupCodec(); > >@@ -728,9 +735,11 @@ TEST_F(TestSimulcastEncoderAdapterFake, SupportsNativeHandleForSingleStreams) { > adapter_->RegisterEncodeCompleteCallback(this); > ASSERT_EQ(1u, helper_->factory()->encoders().size()); > helper_->factory()->encoders()[0]->set_supports_native_handle(true); >- EXPECT_TRUE(adapter_->SupportsNativeHandle()); >+ EXPECT_EQ(0, adapter_->InitEncode(&codec_, 1, 1200)); >+ EXPECT_TRUE(adapter_->GetEncoderInfo().supports_native_handle); > helper_->factory()->encoders()[0]->set_supports_native_handle(false); >- EXPECT_FALSE(adapter_->SupportsNativeHandle()); >+ EXPECT_EQ(0, adapter_->InitEncode(&codec_, 1, 1200)); >+ EXPECT_FALSE(adapter_->GetEncoderInfo().supports_native_handle); > } > > 
TEST_F(TestSimulcastEncoderAdapterFake, SetRatesUnderMinBitrate) { >@@ -763,7 +772,8 @@ TEST_F(TestSimulcastEncoderAdapterFake, SetRatesUnderMinBitrate) { > } > > TEST_F(TestSimulcastEncoderAdapterFake, SupportsImplementationName) { >- EXPECT_STREQ("SimulcastEncoderAdapter", adapter_->ImplementationName()); >+ EXPECT_EQ("SimulcastEncoderAdapter", >+ adapter_->GetEncoderInfo().implementation_name); > SimulcastTestFixtureImpl::DefaultSettings( > &codec_, static_cast<const int*>(kTestTemporalLayerProfile), > kVideoCodecVP8); >@@ -773,8 +783,8 @@ TEST_F(TestSimulcastEncoderAdapterFake, SupportsImplementationName) { > encoder_names.push_back("codec3"); > helper_->factory()->SetEncoderNames(encoder_names); > EXPECT_EQ(0, adapter_->InitEncode(&codec_, 1, 1200)); >- EXPECT_STREQ("SimulcastEncoderAdapter (codec1, codec2, codec3)", >- adapter_->ImplementationName()); >+ EXPECT_EQ("SimulcastEncoderAdapter (codec1, codec2, codec3)", >+ adapter_->GetEncoderInfo().implementation_name); > > // Single streams should not expose "SimulcastEncoderAdapter" in name. > EXPECT_EQ(0, adapter_->Release()); >@@ -782,7 +792,7 @@ TEST_F(TestSimulcastEncoderAdapterFake, SupportsImplementationName) { > EXPECT_EQ(0, adapter_->InitEncode(&codec_, 1, 1200)); > adapter_->RegisterEncodeCompleteCallback(this); > ASSERT_EQ(1u, helper_->factory()->encoders().size()); >- EXPECT_STREQ("codec1", adapter_->ImplementationName()); >+ EXPECT_EQ("codec1", adapter_->GetEncoderInfo().implementation_name); > } > > TEST_F(TestSimulcastEncoderAdapterFake, >@@ -798,10 +808,11 @@ TEST_F(TestSimulcastEncoderAdapterFake, > encoder->set_supports_native_handle(true); > // If one encoder doesn't support it, then overall support is disabled. > helper_->factory()->encoders()[0]->set_supports_native_handle(false); >- EXPECT_FALSE(adapter_->SupportsNativeHandle()); >+ EXPECT_FALSE(adapter_->GetEncoderInfo().supports_native_handle); > // Once all do, then the adapter claims support. 
> helper_->factory()->encoders()[0]->set_supports_native_handle(true); >- EXPECT_TRUE(adapter_->SupportsNativeHandle()); >+ EXPECT_EQ(0, adapter_->InitEncode(&codec_, 1, 1200)); >+ EXPECT_TRUE(adapter_->GetEncoderInfo().supports_native_handle); > } > > // TODO(nisse): Reuse definition in webrtc/test/fake_texture_handle.h. >@@ -837,7 +848,8 @@ TEST_F(TestSimulcastEncoderAdapterFake, > ASSERT_EQ(3u, helper_->factory()->encoders().size()); > for (MockVideoEncoder* encoder : helper_->factory()->encoders()) > encoder->set_supports_native_handle(true); >- EXPECT_TRUE(adapter_->SupportsNativeHandle()); >+ EXPECT_EQ(0, adapter_->InitEncode(&codec_, 1, 1200)); >+ EXPECT_TRUE(adapter_->GetEncoderInfo().supports_native_handle); > > rtc::scoped_refptr<VideoFrameBuffer> buffer( > new rtc::RefCountedObject<FakeNativeBufferNoI420>(1280, 720)); >@@ -948,5 +960,50 @@ TEST_F(TestSimulcastEncoderAdapterFake, ActivatesCorrectStreamsInInitEncode) { > frame_types.resize(3, kVideoFrameKey); > EXPECT_EQ(0, adapter_->Encode(input_frame, nullptr, &frame_types)); > } >+ >+TEST_F(TestSimulcastEncoderAdapterFake, TrustedRateControl) { >+ // Set up common settings for three streams. >+ SimulcastTestFixtureImpl::DefaultSettings( >+ &codec_, static_cast<const int*>(kTestTemporalLayerProfile), >+ kVideoCodecVP8); >+ rate_allocator_.reset(new SimulcastRateAllocator(codec_)); >+ adapter_->RegisterEncodeCompleteCallback(this); >+ >+ // Only enough start bitrate for the lowest stream. >+ ASSERT_EQ(3u, codec_.numberOfSimulcastStreams); >+ codec_.startBitrate = codec_.simulcastStream[0].targetBitrate + >+ codec_.simulcastStream[1].minBitrate - 1; >+ >+ // Input data. >+ rtc::scoped_refptr<VideoFrameBuffer> buffer(I420Buffer::Create(1280, 720)); >+ VideoFrame input_frame(buffer, 100, 1000, kVideoRotation_180); >+ >+ // No encoder trusted, so simulcast adapter should not be either. 
>+ EXPECT_EQ(0, adapter_->InitEncode(&codec_, 1, 1200)); >+ EXPECT_FALSE(adapter_->GetEncoderInfo().has_trusted_rate_controller); >+ >+ // Encode with three streams. >+ std::vector<MockVideoEncoder*> original_encoders = >+ helper_->factory()->encoders(); >+ >+ // All encoders are trusted, so simulcast adapter should be too. >+ original_encoders[0]->set_has_trusted_rate_controller(true); >+ original_encoders[1]->set_has_trusted_rate_controller(true); >+ original_encoders[2]->set_has_trusted_rate_controller(true); >+ EXPECT_EQ(0, adapter_->InitEncode(&codec_, 1, 1200)); >+ EXPECT_TRUE(adapter_->GetEncoderInfo().has_trusted_rate_controller); >+ >+ // One encoder not trusted, so simulcast adapter should not be either. >+ original_encoders[2]->set_has_trusted_rate_controller(false); >+ EXPECT_EQ(0, adapter_->InitEncode(&codec_, 1, 1200)); >+ EXPECT_FALSE(adapter_->GetEncoderInfo().has_trusted_rate_controller); >+ >+ // No encoder trusted, so simulcast adapter should not be either. >+ original_encoders[0]->set_has_trusted_rate_controller(false); >+ original_encoders[1]->set_has_trusted_rate_controller(false); >+ EXPECT_EQ(0, adapter_->InitEncode(&codec_, 1, 1200)); >+ EXPECT_FALSE(adapter_->GetEncoderInfo().has_trusted_rate_controller); >+} >+ > } // namespace test > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/simulcast_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/simulcast_unittest.cc >index f41636869789931739d6f93ab08e16bb598660fc..0c278160689dfe3639cf8d0914991024d5c2761b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/simulcast_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/simulcast_unittest.cc >@@ -179,7 +179,7 @@ TEST(SimulcastTest, GetConfigWithNormalizedResolution) { > kMaxLayers, 640 + 1, 360 + 1, kMaxBitrateBps, kBitratePriority, kQpMax, > kMaxFps, !kScreenshare); > >- // Must be dividable by |2 ^ (num_layers - 1)|. 
>+ // Must be divisible by |2 ^ (num_layers - 1)|. > EXPECT_EQ(kMaxLayers, streams.size()); > EXPECT_EQ(320u, streams[0].width); > EXPECT_EQ(180u, streams[0].height); >@@ -187,6 +187,40 @@ TEST(SimulcastTest, GetConfigWithNormalizedResolution) { > EXPECT_EQ(360u, streams[1].height); > } > >+TEST(SimulcastTest, GetConfigWithNormalizedResolutionDivisibleBy4) { >+ test::ScopedFieldTrials field_trials( >+ "WebRTC-NormalizeSimulcastResolution/Enabled-2/"); >+ >+ const size_t kMaxLayers = 2; >+ std::vector<VideoStream> streams = cricket::GetSimulcastConfig( >+ kMaxLayers, 709, 501, kMaxBitrateBps, kBitratePriority, kQpMax, kMaxFps, >+ !kScreenshare); >+ >+ // Must be divisible by |2 ^ 2|. >+ EXPECT_EQ(kMaxLayers, streams.size()); >+ EXPECT_EQ(354u, streams[0].width); >+ EXPECT_EQ(250u, streams[0].height); >+ EXPECT_EQ(708u, streams[1].width); >+ EXPECT_EQ(500u, streams[1].height); >+} >+ >+TEST(SimulcastTest, GetConfigWithNormalizedResolutionDivisibleBy8) { >+ test::ScopedFieldTrials field_trials( >+ "WebRTC-NormalizeSimulcastResolution/Enabled-3/"); >+ >+ const size_t kMaxLayers = 2; >+ std::vector<VideoStream> streams = cricket::GetSimulcastConfig( >+ kMaxLayers, 709, 501, kMaxBitrateBps, kBitratePriority, kQpMax, kMaxFps, >+ !kScreenshare); >+ >+ // Must be divisible by |2 ^ 3|. 
>+ EXPECT_EQ(kMaxLayers, streams.size()); >+ EXPECT_EQ(352u, streams[0].width); >+ EXPECT_EQ(248u, streams[0].height); >+ EXPECT_EQ(704u, streams[1].width); >+ EXPECT_EQ(496u, streams[1].height); >+} >+ > TEST(SimulcastTest, GetConfigForScreenshare) { > const size_t kMaxLayers = 3; > std::vector<VideoStream> streams = cricket::GetSimulcastConfig( >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/vp8_encoder_simulcast_proxy.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/vp8_encoder_simulcast_proxy.cc >index 4d858630b5824d704ee89c1b80416470db9a2e86..1a2f57725cec25aee430b468dc1592785b932050 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/vp8_encoder_simulcast_proxy.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/vp8_encoder_simulcast_proxy.cc >@@ -10,9 +10,7 @@ > > #include "media/engine/vp8_encoder_simulcast_proxy.h" > >-#include "media/engine/scopedvideoencoder.h" > #include "media/engine/simulcast_encoder_adapter.h" >-#include "rtc_base/checks.h" > > namespace webrtc { > VP8EncoderSimulcastProxy::VP8EncoderSimulcastProxy(VideoEncoderFactory* factory, >@@ -57,28 +55,14 @@ int VP8EncoderSimulcastProxy::RegisterEncodeCompleteCallback( > return encoder_->RegisterEncodeCompleteCallback(callback); > } > >-int VP8EncoderSimulcastProxy::SetChannelParameters(uint32_t packet_loss, >- int64_t rtt) { >- return encoder_->SetChannelParameters(packet_loss, rtt); >-} >- > int VP8EncoderSimulcastProxy::SetRateAllocation( > const VideoBitrateAllocation& bitrate, > uint32_t new_framerate) { > return encoder_->SetRateAllocation(bitrate, new_framerate); > } > >-VideoEncoder::ScalingSettings VP8EncoderSimulcastProxy::GetScalingSettings() >- const { >- return encoder_->GetScalingSettings(); >-} >- >-bool VP8EncoderSimulcastProxy::SupportsNativeHandle() const { >- return encoder_->SupportsNativeHandle(); >-} >- >-const char* VP8EncoderSimulcastProxy::ImplementationName() const { >- return encoder_->ImplementationName(); 
>+VideoEncoder::EncoderInfo VP8EncoderSimulcastProxy::GetEncoderInfo() const { >+ return encoder_->GetEncoderInfo(); > } > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/vp8_encoder_simulcast_proxy.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/vp8_encoder_simulcast_proxy.h >index 685bcbed6ef846d79afbd3f8339890808f652a00..d04624f26c5d861507e68a682a22cefce8066c41 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/vp8_encoder_simulcast_proxy.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/vp8_encoder_simulcast_proxy.h >@@ -16,8 +16,8 @@ > #include <vector> > > #include "api/video_codecs/sdp_video_format.h" >+#include "api/video_codecs/video_encoder.h" > #include "api/video_codecs/video_encoder_factory.h" >-#include "modules/video_coding/codecs/vp8/include/vp8.h" > > namespace webrtc { > >@@ -42,14 +42,9 @@ class VP8EncoderSimulcastProxy : public VideoEncoder { > const CodecSpecificInfo* codec_specific_info, > const std::vector<FrameType>* frame_types) override; > int RegisterEncodeCompleteCallback(EncodedImageCallback* callback) override; >- int SetChannelParameters(uint32_t packet_loss, int64_t rtt) override; > int SetRateAllocation(const VideoBitrateAllocation& bitrate, > uint32_t new_framerate) override; >- >- VideoEncoder::ScalingSettings GetScalingSettings() const override; >- >- bool SupportsNativeHandle() const override; >- const char* ImplementationName() const override; >+ EncoderInfo GetEncoderInfo() const override; > > private: > VideoEncoderFactory* const factory_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/vp8_encoder_simulcast_proxy_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/vp8_encoder_simulcast_proxy_unittest.cc >index d1dc9a31df66bc5933faf0b8b750456d56a9bc73..d1b26bbe93e6913a7a9b27971e2c083971eb5552 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/vp8_encoder_simulcast_proxy_unittest.cc 
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/vp8_encoder_simulcast_proxy_unittest.cc >@@ -10,12 +10,11 @@ > */ > > #include "media/engine/vp8_encoder_simulcast_proxy.h" >- > #include <string> > > #include "api/test/mock_video_encoder_factory.h" >+#include "api/video_codecs/vp8_temporal_layers.h" > #include "media/engine/webrtcvideoencoderfactory.h" >-#include "modules/video_coding/codecs/vp8/include/vp8_temporal_layers.h" > #include "modules/video_coding/include/video_codec_interface.h" > #include "test/gmock.h" > #include "test/gtest.h" >@@ -52,9 +51,7 @@ class MockEncoder : public VideoEncoder { > const CodecSpecificInfo* codecSpecificInfo, > const std::vector<FrameType>* frame_types) /* override */); > >- MOCK_METHOD2(SetChannelParameters, int32_t(uint32_t packetLoss, int64_t rtt)); >- >- MOCK_CONST_METHOD0(ImplementationName, const char*()); >+ MOCK_CONST_METHOD0(GetEncoderInfo, VideoEncoder::EncoderInfo(void)); > }; > > TEST(VP8EncoderSimulcastProxy, ChoosesCorrectImplementation) { >@@ -94,8 +91,10 @@ TEST(VP8EncoderSimulcastProxy, ChoosesCorrectImplementation) { > > EXPECT_CALL(*mock_encoder, InitEncode(_, _, _)) > .WillOnce(Return(WEBRTC_VIDEO_CODEC_OK)); >- EXPECT_CALL(*mock_encoder, ImplementationName()) >- .WillRepeatedly(Return(kImplementationName.c_str())); >+ VideoEncoder::EncoderInfo encoder_info; >+ encoder_info.implementation_name = kImplementationName; >+ EXPECT_CALL(*mock_encoder, GetEncoderInfo()) >+ .WillRepeatedly(Return(encoder_info)); > > EXPECT_CALL(simulcast_factory, CreateVideoEncoderProxy(_)) > .Times(1) >@@ -105,7 +104,8 @@ TEST(VP8EncoderSimulcastProxy, ChoosesCorrectImplementation) { > SdpVideoFormat("VP8")); > EXPECT_EQ(WEBRTC_VIDEO_CODEC_OK, > simulcast_enabled_proxy.InitEncode(&codec_settings, 4, 1200)); >- EXPECT_EQ(kImplementationName, simulcast_enabled_proxy.ImplementationName()); >+ EXPECT_EQ(kImplementationName, >+ simulcast_enabled_proxy.GetEncoderInfo().implementation_name); > > NiceMock<MockEncoder>* 
mock_encoder1 = new NiceMock<MockEncoder>(); > NiceMock<MockEncoder>* mock_encoder2 = new NiceMock<MockEncoder>(); >@@ -116,23 +116,23 @@ TEST(VP8EncoderSimulcastProxy, ChoosesCorrectImplementation) { > EXPECT_CALL(*mock_encoder1, InitEncode(_, _, _)) > .WillOnce( > Return(WEBRTC_VIDEO_CODEC_ERR_SIMULCAST_PARAMETERS_NOT_SUPPORTED)); >- EXPECT_CALL(*mock_encoder1, ImplementationName()) >- .WillRepeatedly(Return(kImplementationName.c_str())); >+ EXPECT_CALL(*mock_encoder1, GetEncoderInfo()) >+ .WillRepeatedly(Return(encoder_info)); > > EXPECT_CALL(*mock_encoder2, InitEncode(_, _, _)) > .WillOnce(Return(WEBRTC_VIDEO_CODEC_OK)); >- EXPECT_CALL(*mock_encoder2, ImplementationName()) >- .WillRepeatedly(Return(kImplementationName.c_str())); >+ EXPECT_CALL(*mock_encoder2, GetEncoderInfo()) >+ .WillRepeatedly(Return(encoder_info)); > > EXPECT_CALL(*mock_encoder3, InitEncode(_, _, _)) > .WillOnce(Return(WEBRTC_VIDEO_CODEC_OK)); >- EXPECT_CALL(*mock_encoder3, ImplementationName()) >- .WillRepeatedly(Return(kImplementationName.c_str())); >+ EXPECT_CALL(*mock_encoder3, GetEncoderInfo()) >+ .WillRepeatedly(Return(encoder_info)); > > EXPECT_CALL(*mock_encoder4, InitEncode(_, _, _)) > .WillOnce(Return(WEBRTC_VIDEO_CODEC_OK)); >- EXPECT_CALL(*mock_encoder4, ImplementationName()) >- .WillRepeatedly(Return(kImplementationName.c_str())); >+ EXPECT_CALL(*mock_encoder4, GetEncoderInfo()) >+ .WillRepeatedly(Return(encoder_info)); > > EXPECT_CALL(nonsimulcast_factory, CreateVideoEncoderProxy(_)) > .Times(4) >@@ -146,12 +146,38 @@ TEST(VP8EncoderSimulcastProxy, ChoosesCorrectImplementation) { > EXPECT_EQ(WEBRTC_VIDEO_CODEC_OK, > simulcast_disabled_proxy.InitEncode(&codec_settings, 4, 1200)); > EXPECT_EQ(kSimulcastAdaptedImplementationName, >- simulcast_disabled_proxy.ImplementationName()); >+ simulcast_disabled_proxy.GetEncoderInfo().implementation_name); > > // Cleanup. 
> simulcast_enabled_proxy.Release(); > simulcast_disabled_proxy.Release(); > } > >+TEST(VP8EncoderSimulcastProxy, ForwardsTrustedSetting) { >+ NiceMock<MockEncoder>* mock_encoder = new NiceMock<MockEncoder>(); >+ NiceMock<MockVideoEncoderFactory> simulcast_factory; >+ >+ EXPECT_CALL(*mock_encoder, InitEncode(_, _, _)) >+ .WillOnce(Return(WEBRTC_VIDEO_CODEC_OK)); >+ >+ EXPECT_CALL(simulcast_factory, CreateVideoEncoderProxy(_)) >+ .Times(1) >+ .WillOnce(Return(mock_encoder)); >+ >+ VP8EncoderSimulcastProxy simulcast_enabled_proxy(&simulcast_factory, >+ SdpVideoFormat("VP8")); >+ VideoCodec codec_settings; >+ webrtc::test::CodecSettings(kVideoCodecVP8, &codec_settings); >+ EXPECT_EQ(WEBRTC_VIDEO_CODEC_OK, >+ simulcast_enabled_proxy.InitEncode(&codec_settings, 4, 1200)); >+ >+ VideoEncoder::EncoderInfo info; >+ info.has_trusted_rate_controller = true; >+ EXPECT_CALL(*mock_encoder, GetEncoderInfo()).WillRepeatedly(Return(info)); >+ >+ EXPECT_TRUE( >+ simulcast_enabled_proxy.GetEncoderInfo().has_trusted_rate_controller); >+} >+ > } // namespace testing > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcmediaengine.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcmediaengine.cc >index e60c592a3fabeed12eb58fed9a6b6ecd67fdf035..bc7d3c3e124ffefea36b021e5bc635a3602eb629 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcmediaengine.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcmediaengine.cc >@@ -11,10 +11,10 @@ > #include "media/engine/webrtcmediaengine.h" > > #include <algorithm> >-#include <memory> >-#include <tuple> > #include <utility> > >+#include "absl/memory/memory.h" >+#include "api/video/builtin_video_bitrate_allocator_factory.h" > #include "api/video_codecs/video_decoder_factory.h" > #include "api/video_codecs/video_encoder_factory.h" > #include "media/engine/webrtcvoiceengine.h" >@@ -38,23 +38,24 @@ MediaEngineInterface* CreateWebRtcMediaEngine( > 
audio_decoder_factory, > WebRtcVideoEncoderFactory* video_encoder_factory, > WebRtcVideoDecoderFactory* video_decoder_factory, >+ std::unique_ptr<webrtc::VideoBitrateAllocatorFactory> >+ video_bitrate_allocator_factory, > rtc::scoped_refptr<webrtc::AudioMixer> audio_mixer, > rtc::scoped_refptr<webrtc::AudioProcessing> audio_processing) { >+ std::unique_ptr<VideoEngineInterface> video_engine; > #ifdef HAVE_WEBRTC_VIDEO >- typedef WebRtcVideoEngine VideoEngine; >- std::tuple<std::unique_ptr<WebRtcVideoEncoderFactory>, >- std::unique_ptr<WebRtcVideoDecoderFactory>> >- video_args( >- (std::unique_ptr<WebRtcVideoEncoderFactory>(video_encoder_factory)), >- (std::unique_ptr<WebRtcVideoDecoderFactory>(video_decoder_factory))); >+ video_engine = absl::make_unique<WebRtcVideoEngine>( >+ std::unique_ptr<WebRtcVideoEncoderFactory>(video_encoder_factory), >+ std::unique_ptr<WebRtcVideoDecoderFactory>(video_decoder_factory), >+ std::move(video_bitrate_allocator_factory)); > #else >- typedef NullWebRtcVideoEngine VideoEngine; >- std::tuple<> video_args; >+ video_engine = absl::make_unique<NullWebRtcVideoEngine>(); > #endif >- return new CompositeMediaEngine<WebRtcVoiceEngine, VideoEngine>( >- std::forward_as_tuple(adm, audio_encoder_factory, audio_decoder_factory, >- audio_mixer, audio_processing), >- std::move(video_args)); >+ return new CompositeMediaEngine( >+ absl::make_unique<WebRtcVoiceEngine>(adm, audio_encoder_factory, >+ audio_decoder_factory, audio_mixer, >+ audio_processing), >+ std::move(video_engine)); > } > > } // namespace >@@ -67,10 +68,43 @@ MediaEngineInterface* WebRtcMediaEngineFactory::Create( > audio_decoder_factory, > WebRtcVideoEncoderFactory* video_encoder_factory, > WebRtcVideoDecoderFactory* video_decoder_factory) { >- return CreateWebRtcMediaEngine(adm, audio_encoder_factory, >- audio_decoder_factory, video_encoder_factory, >- video_decoder_factory, nullptr, >- webrtc::AudioProcessingBuilder().Create()); >+ return WebRtcMediaEngineFactory::Create( >+ 
adm, audio_encoder_factory, audio_decoder_factory, video_encoder_factory, >+ video_decoder_factory, >+ webrtc::CreateBuiltinVideoBitrateAllocatorFactory()); >+} >+ >+MediaEngineInterface* WebRtcMediaEngineFactory::Create( >+ webrtc::AudioDeviceModule* adm, >+ const rtc::scoped_refptr<webrtc::AudioEncoderFactory>& >+ audio_encoder_factory, >+ const rtc::scoped_refptr<webrtc::AudioDecoderFactory>& >+ audio_decoder_factory, >+ WebRtcVideoEncoderFactory* video_encoder_factory, >+ WebRtcVideoDecoderFactory* video_decoder_factory, >+ std::unique_ptr<webrtc::VideoBitrateAllocatorFactory> >+ video_bitrate_allocator_factory) { >+ return CreateWebRtcMediaEngine( >+ adm, audio_encoder_factory, audio_decoder_factory, video_encoder_factory, >+ video_decoder_factory, std::move(video_bitrate_allocator_factory), >+ nullptr, webrtc::AudioProcessingBuilder().Create()); >+} >+ >+MediaEngineInterface* WebRtcMediaEngineFactory::Create( >+ webrtc::AudioDeviceModule* adm, >+ const rtc::scoped_refptr<webrtc::AudioEncoderFactory>& >+ audio_encoder_factory, >+ const rtc::scoped_refptr<webrtc::AudioDecoderFactory>& >+ audio_decoder_factory, >+ WebRtcVideoEncoderFactory* video_encoder_factory, >+ WebRtcVideoDecoderFactory* video_decoder_factory, >+ rtc::scoped_refptr<webrtc::AudioMixer> audio_mixer, >+ rtc::scoped_refptr<webrtc::AudioProcessing> audio_processing) { >+ return WebRtcMediaEngineFactory::Create( >+ adm, audio_encoder_factory, audio_decoder_factory, video_encoder_factory, >+ video_decoder_factory, >+ webrtc::CreateBuiltinVideoBitrateAllocatorFactory(), audio_mixer, >+ audio_processing); > } > > MediaEngineInterface* WebRtcMediaEngineFactory::Create( >@@ -81,11 +115,14 @@ MediaEngineInterface* WebRtcMediaEngineFactory::Create( > audio_decoder_factory, > WebRtcVideoEncoderFactory* video_encoder_factory, > WebRtcVideoDecoderFactory* video_decoder_factory, >+ std::unique_ptr<webrtc::VideoBitrateAllocatorFactory> >+ video_bitrate_allocator_factory, > 
rtc::scoped_refptr<webrtc::AudioMixer> audio_mixer, > rtc::scoped_refptr<webrtc::AudioProcessing> audio_processing) { > return CreateWebRtcMediaEngine( > adm, audio_encoder_factory, audio_decoder_factory, video_encoder_factory, >- video_decoder_factory, audio_mixer, audio_processing); >+ video_decoder_factory, std::move(video_bitrate_allocator_factory), >+ audio_mixer, audio_processing); > } > #endif > >@@ -97,22 +134,35 @@ std::unique_ptr<MediaEngineInterface> WebRtcMediaEngineFactory::Create( > std::unique_ptr<webrtc::VideoDecoderFactory> video_decoder_factory, > rtc::scoped_refptr<webrtc::AudioMixer> audio_mixer, > rtc::scoped_refptr<webrtc::AudioProcessing> audio_processing) { >+ return WebRtcMediaEngineFactory::Create( >+ adm, audio_encoder_factory, audio_decoder_factory, >+ std::move(video_encoder_factory), std::move(video_decoder_factory), >+ webrtc::CreateBuiltinVideoBitrateAllocatorFactory(), audio_mixer, >+ audio_processing); >+} >+ >+std::unique_ptr<MediaEngineInterface> WebRtcMediaEngineFactory::Create( >+ rtc::scoped_refptr<webrtc::AudioDeviceModule> adm, >+ rtc::scoped_refptr<webrtc::AudioEncoderFactory> audio_encoder_factory, >+ rtc::scoped_refptr<webrtc::AudioDecoderFactory> audio_decoder_factory, >+ std::unique_ptr<webrtc::VideoEncoderFactory> video_encoder_factory, >+ std::unique_ptr<webrtc::VideoDecoderFactory> video_decoder_factory, >+ std::unique_ptr<webrtc::VideoBitrateAllocatorFactory> >+ video_bitrate_allocator_factory, >+ rtc::scoped_refptr<webrtc::AudioMixer> audio_mixer, >+ rtc::scoped_refptr<webrtc::AudioProcessing> audio_processing) { > #ifdef HAVE_WEBRTC_VIDEO >- typedef WebRtcVideoEngine VideoEngine; >- std::tuple<std::unique_ptr<webrtc::VideoEncoderFactory>, >- std::unique_ptr<webrtc::VideoDecoderFactory>> >- video_args(std::move(video_encoder_factory), >- std::move(video_decoder_factory)); >+ auto video_engine = absl::make_unique<WebRtcVideoEngine>( >+ std::move(video_encoder_factory), std::move(video_decoder_factory), >+ 
          std::move(video_bitrate_allocator_factory));
> #else
>-  typedef NullWebRtcVideoEngine VideoEngine;
>-  std::tuple<> video_args;
>+  auto video_engine = absl::make_unique<NullWebRtcVideoEngine>();
> #endif
>-  return std::unique_ptr<MediaEngineInterface>(
>-      new CompositeMediaEngine<WebRtcVoiceEngine, VideoEngine>(
>-          std::forward_as_tuple(adm, audio_encoder_factory,
>-                                audio_decoder_factory, audio_mixer,
>-                                audio_processing),
>-          std::move(video_args)));
>+  return std::unique_ptr<MediaEngineInterface>(new CompositeMediaEngine(
>+      absl::make_unique<WebRtcVoiceEngine>(adm, audio_encoder_factory,
>+                                           audio_decoder_factory, audio_mixer,
>+                                           audio_processing),
>+      std::move(video_engine)));
> }
> 
> namespace {
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcmediaengine.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcmediaengine.h
>index d82eccf9dc347faf96f8631fdf264b9022173af9..fbf4466732715f33278d85fd3a56cbf42dc3b2fb 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcmediaengine.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcmediaengine.h
>@@ -25,6 +25,7 @@ class AudioMixer;
> class AudioProcessing;
> class VideoDecoderFactory;
> class VideoEncoderFactory;
>+class VideoBitrateAllocatorFactory;
> }  // namespace webrtc
> namespace cricket {
> class WebRtcVideoDecoderFactory;
>@@ -50,6 +51,29 @@ class WebRtcMediaEngineFactory {
>           audio_decoder_factory,
>       WebRtcVideoEncoderFactory* video_encoder_factory,
>       WebRtcVideoDecoderFactory* video_decoder_factory);
>+
>+  static MediaEngineInterface* Create(
>+      webrtc::AudioDeviceModule* adm,
>+      const rtc::scoped_refptr<webrtc::AudioEncoderFactory>&
>+          audio_encoder_factory,
>+      const rtc::scoped_refptr<webrtc::AudioDecoderFactory>&
>+          audio_decoder_factory,
>+      WebRtcVideoEncoderFactory* video_encoder_factory,
>+      WebRtcVideoDecoderFactory* video_decoder_factory,
>+      rtc::scoped_refptr<webrtc::AudioMixer> audio_mixer,
>+      rtc::scoped_refptr<webrtc::AudioProcessing> apm);
>+
>+  static MediaEngineInterface* Create(
>+      webrtc::AudioDeviceModule* adm,
>+      const rtc::scoped_refptr<webrtc::AudioEncoderFactory>&
>+          audio_encoder_factory,
>+      const rtc::scoped_refptr<webrtc::AudioDecoderFactory>&
>+          audio_decoder_factory,
>+      WebRtcVideoEncoderFactory* video_encoder_factory,
>+      WebRtcVideoDecoderFactory* video_decoder_factory,
>+      std::unique_ptr<webrtc::VideoBitrateAllocatorFactory>
>+          video_bitrate_allocator_factory);
>+
>   static MediaEngineInterface* Create(
>       webrtc::AudioDeviceModule* adm,
>       const rtc::scoped_refptr<webrtc::AudioEncoderFactory>&
>@@ -58,6 +82,8 @@ class WebRtcMediaEngineFactory {
>           audio_decoder_factory,
>       WebRtcVideoEncoderFactory* video_encoder_factory,
>       WebRtcVideoDecoderFactory* video_decoder_factory,
>+      std::unique_ptr<webrtc::VideoBitrateAllocatorFactory>
>+          video_bitrate_allocator_factory,
>       rtc::scoped_refptr<webrtc::AudioMixer> audio_mixer,
>       rtc::scoped_refptr<webrtc::AudioProcessing> apm);
> #endif
>@@ -73,6 +99,17 @@ class WebRtcMediaEngineFactory {
>       std::unique_ptr<webrtc::VideoDecoderFactory> video_decoder_factory,
>       rtc::scoped_refptr<webrtc::AudioMixer> audio_mixer,
>       rtc::scoped_refptr<webrtc::AudioProcessing> audio_processing);
>+
>+  static std::unique_ptr<MediaEngineInterface> Create(
>+      rtc::scoped_refptr<webrtc::AudioDeviceModule> adm,
>+      rtc::scoped_refptr<webrtc::AudioEncoderFactory> audio_encoder_factory,
>+      rtc::scoped_refptr<webrtc::AudioDecoderFactory> audio_decoder_factory,
>+      std::unique_ptr<webrtc::VideoEncoderFactory> video_encoder_factory,
>+      std::unique_ptr<webrtc::VideoDecoderFactory> video_decoder_factory,
>+      std::unique_ptr<webrtc::VideoBitrateAllocatorFactory>
>+          video_bitrate_allocator_factory,
>+      rtc::scoped_refptr<webrtc::AudioMixer> audio_mixer,
>+      rtc::scoped_refptr<webrtc::AudioProcessing> audio_processing);
> };
> 
> // Verify that extension IDs are within 1-byte extension range and are not
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcmediaengine_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcmediaengine_unittest.cc
>index 4fa9a6b5ee35d89cbfc6854919d1bf44ad539640..eec7dc39c4881c369d4be642c9f6c3cdb8310044 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcmediaengine_unittest.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcmediaengine_unittest.cc
>@@ -12,6 +12,7 @@
> 
> #include "api/audio_codecs/builtin_audio_decoder_factory.h"
> #include "api/audio_codecs/builtin_audio_encoder_factory.h"
>+#include "api/video/builtin_video_bitrate_allocator_factory.h"
> #include "api/video_codecs/builtin_video_decoder_factory.h"
> #include "api/video_codecs/builtin_video_encoder_factory.h"
> #include "media/engine/webrtcmediaengine.h"
>@@ -249,4 +250,15 @@ TEST(WebRtcMediaEngineFactoryTest, CreateWithBuiltinDecoders) {
>   EXPECT_TRUE(engine);
> }
> 
>+TEST(WebRtcMediaEngineFactoryTest, CreateWithVideoBitrateFactory) {
>+  std::unique_ptr<MediaEngineInterface> engine(WebRtcMediaEngineFactory::Create(
>+      nullptr /* adm */, webrtc::CreateBuiltinAudioEncoderFactory(),
>+      webrtc::CreateBuiltinAudioDecoderFactory(),
>+      webrtc::CreateBuiltinVideoEncoderFactory(),
>+      webrtc::CreateBuiltinVideoDecoderFactory(),
>+      webrtc::CreateBuiltinVideoBitrateAllocatorFactory(),
>+      nullptr /* audio_mixer */, webrtc::AudioProcessingBuilder().Create()));
>+  EXPECT_TRUE(engine);
>+}
>+
> }  // namespace cricket
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvideodecoderfactory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvideodecoderfactory.h
>index 016cb09da000f474fadd65622934dd6308a950e0..f00d9a69e0e266d589d2ff9472bd54a4f3ee279d 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvideodecoderfactory.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvideodecoderfactory.h
>@@ -13,9 +13,10 @@
> 
> #include <string>
> 
>-#include "common_types.h"  // NOLINT(build/include)
>+#include "api/video/video_codec_type.h"
> #include "media/base/codec.h"
> #include "rtc_base/refcount.h"
>+#include "rtc_base/system/rtc_export.h"
> 
> namespace webrtc {
> class VideoDecoder;
>@@ -29,7 +30,7 @@ struct VideoDecoderParams {
> 
> // Deprecated. Use webrtc::VideoDecoderFactory instead.
> // https://bugs.chromium.org/p/webrtc/issues/detail?id=7925
>-class WebRtcVideoDecoderFactory {
>+class RTC_EXPORT WebRtcVideoDecoderFactory {
>  public:
>   // Caller takes the ownership of the returned object and it should be released
>   // by calling DestroyVideoDecoder().
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvideoencoderfactory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvideoencoderfactory.h
>index 97ac03b473f8f0c78c84ee865036560cad619213..d39e0fd5f95ae4540801503f2d558fd80f20b775 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvideoencoderfactory.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvideoencoderfactory.h
>@@ -14,8 +14,9 @@
> #include <string>
> #include <vector>
> 
>-#include "common_types.h"  // NOLINT(build/include)
>+#include "api/video/video_codec_type.h"
> #include "media/base/codec.h"
>+#include "rtc_base/system/rtc_export.h"
> 
> namespace webrtc {
> class VideoEncoder;
>@@ -25,7 +26,7 @@ namespace cricket {
> 
> // Deprecated. Use webrtc::VideoEncoderFactory instead.
> // https://bugs.chromium.org/p/webrtc/issues/detail?id=7925
>-class WebRtcVideoEncoderFactory {
>+class RTC_EXPORT WebRtcVideoEncoderFactory {
>  public:
>   virtual ~WebRtcVideoEncoderFactory() {}
> 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvideoengine.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvideoengine.cc
>index 40b0c45a3d832c1166ac3744efad65e7fe036d01..a75db73697069b937fa88b617c3e3d9a4f4b48f7 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvideoengine.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvideoengine.cc
>@@ -16,6 +16,7 @@
> #include <string>
> #include <utility>
> 
>+#include "absl/strings/match.h"
> #include "api/video_codecs/sdp_video_format.h"
> #include "api/video_codecs/video_decoder_factory.h"
> #include "api/video_codecs/video_encoder.h"
>@@ -111,8 +112,8 @@ std::vector<VideoCodec> AssignPayloadTypesAndDefaultCodecs(
>   }
> 
>   // Add associated RTX codec for non-FEC codecs.
>-  if (!CodecNamesEq(codec.name, kUlpfecCodecName) &&
>-      !CodecNamesEq(codec.name, kFlexfecCodecName)) {
>+  if (!absl::EqualsIgnoreCase(codec.name, kUlpfecCodecName) &&
>+      !absl::EqualsIgnoreCase(codec.name, kFlexfecCodecName)) {
>     output_codecs.push_back(
>         VideoCodec::CreateRtxCodec(payload_type, codec.id));
> 
>@@ -147,8 +148,8 @@ int GetMaxFramerate(const webrtc::VideoEncoderConfig& encoder_config,
> }
> 
> bool IsTemporalLayersSupported(const std::string& codec_name) {
>-  return CodecNamesEq(codec_name, kVp8CodecName) ||
>-         CodecNamesEq(codec_name, kVp9CodecName);
>+  return absl::EqualsIgnoreCase(codec_name, kVp8CodecName) ||
>+         absl::EqualsIgnoreCase(codec_name, kVp9CodecName);
> }
> 
> static std::string CodecVectorToString(const std::vector<VideoCodec>& codecs) {
>@@ -220,9 +221,9 @@ static bool ValidateStreamParams(const StreamParams& sp) {
> // Returns true if the given codec is disallowed from doing simulcast.
> bool IsCodecBlacklistedForSimulcast(const std::string& codec_name) {
>   return webrtc::field_trial::IsEnabled("WebRTC-H264Simulcast")
>-             ? CodecNamesEq(codec_name, kVp9CodecName)
>-             : CodecNamesEq(codec_name, kH264CodecName) ||
>-                   CodecNamesEq(codec_name, kVp9CodecName);
>+             ? absl::EqualsIgnoreCase(codec_name, kVp9CodecName)
>+             : absl::EqualsIgnoreCase(codec_name, kH264CodecName) ||
>+                   absl::EqualsIgnoreCase(codec_name, kVp9CodecName);
> }
> 
> // The selected thresholds for QVGA and VGA corresponded to a QP around 10.
>@@ -337,14 +338,14 @@ WebRtcVideoChannel::WebRtcVideoSendStream::ConfigureVideoEncoderSettings(
>     denoising = parameters_.options.video_noise_reduction.value_or(false);
>   }
> 
>-  if (CodecNamesEq(codec.name, kH264CodecName)) {
>+  if (absl::EqualsIgnoreCase(codec.name, kH264CodecName)) {
>     webrtc::VideoCodecH264 h264_settings =
>         webrtc::VideoEncoder::GetDefaultH264Settings();
>     h264_settings.frameDroppingOn = frame_dropping;
>     return new rtc::RefCountedObject<
>         webrtc::VideoEncoderConfig::H264EncoderSpecificSettings>(h264_settings);
>   }
>-  if (CodecNamesEq(codec.name, kVp8CodecName)) {
>+  if (absl::EqualsIgnoreCase(codec.name, kVp8CodecName)) {
>     webrtc::VideoCodecVP8 vp8_settings =
>         webrtc::VideoEncoder::GetDefaultVp8Settings();
>     vp8_settings.automaticResizeOn = automatic_resize;
>@@ -354,7 +355,7 @@ WebRtcVideoChannel::WebRtcVideoSendStream::ConfigureVideoEncoderSettings(
>     return new rtc::RefCountedObject<
>         webrtc::VideoEncoderConfig::Vp8EncoderSpecificSettings>(vp8_settings);
>   }
>-  if (CodecNamesEq(codec.name, kVp9CodecName)) {
>+  if (absl::EqualsIgnoreCase(codec.name, kVp9CodecName)) {
>     webrtc::VideoCodecVP9 vp9_settings =
>         webrtc::VideoEncoder::GetDefaultVp9Settings();
>     const size_t default_num_spatial_layers =
>@@ -436,20 +437,26 @@ void DefaultUnsignalledSsrcHandler::SetDefaultSink(
> #if defined(USE_BUILTIN_SW_CODECS)
> WebRtcVideoEngine::WebRtcVideoEngine(
>     std::unique_ptr<WebRtcVideoEncoderFactory> external_video_encoder_factory,
>-    std::unique_ptr<WebRtcVideoDecoderFactory> external_video_decoder_factory)
>+    std::unique_ptr<WebRtcVideoDecoderFactory> external_video_decoder_factory,
>+    std::unique_ptr<webrtc::VideoBitrateAllocatorFactory>
>+        video_bitrate_allocator_factory)
>     : decoder_factory_(ConvertVideoDecoderFactory(
>           std::move(external_video_decoder_factory))),
>       encoder_factory_(ConvertVideoEncoderFactory(
>-          std::move(external_video_encoder_factory))) {
>+          std::move(external_video_encoder_factory))),
>+      bitrate_allocator_factory_(std::move(video_bitrate_allocator_factory)) {
>   RTC_LOG(LS_INFO) << "WebRtcVideoEngine::WebRtcVideoEngine()";
> }
> #endif
> 
> WebRtcVideoEngine::WebRtcVideoEngine(
>     std::unique_ptr<webrtc::VideoEncoderFactory> video_encoder_factory,
>-    std::unique_ptr<webrtc::VideoDecoderFactory> video_decoder_factory)
>+    std::unique_ptr<webrtc::VideoDecoderFactory> video_decoder_factory,
>+    std::unique_ptr<webrtc::VideoBitrateAllocatorFactory>
>+        video_bitrate_allocator_factory)
>     : decoder_factory_(std::move(video_decoder_factory)),
>-      encoder_factory_(std::move(video_encoder_factory)) {
>+      encoder_factory_(std::move(video_encoder_factory)),
>+      bitrate_allocator_factory_(std::move(video_bitrate_allocator_factory)) {
>   RTC_LOG(LS_INFO) << "WebRtcVideoEngine::WebRtcVideoEngine()";
> }
> 
>@@ -457,15 +464,16 @@ WebRtcVideoEngine::~WebRtcVideoEngine() {
>   RTC_LOG(LS_INFO) << "WebRtcVideoEngine::~WebRtcVideoEngine";
> }
> 
>-WebRtcVideoChannel* WebRtcVideoEngine::CreateChannel(
>+VideoMediaChannel* WebRtcVideoEngine::CreateMediaChannel(
>     webrtc::Call* call,
>     const MediaConfig& config,
>-    const VideoOptions& options) {
>-  RTC_LOG(LS_INFO) << "CreateChannel. Options: " << options.ToString();
>-  return new WebRtcVideoChannel(call, config, options, encoder_factory_.get(),
>-                                decoder_factory_.get());
>+    const VideoOptions& options,
>+    const webrtc::CryptoOptions& crypto_options) {
>+  RTC_LOG(LS_INFO) << "CreateMediaChannel. Options: " << options.ToString();
>+  return new WebRtcVideoChannel(call, config, options, crypto_options,
>+                                encoder_factory_.get(), decoder_factory_.get(),
>+                                bitrate_allocator_factory_.get());
> }
>-
> std::vector<VideoCodec> WebRtcVideoEngine::codecs() const {
>   return AssignPayloadTypesAndDefaultCodecs(encoder_factory_.get());
> }
>@@ -496,6 +504,11 @@ RtpCapabilities WebRtcVideoEngine::GetCapabilities() const {
>   capabilities.header_extensions.push_back(
>       webrtc::RtpExtension(webrtc::RtpExtension::kFrameMarkingUri,
>                            webrtc::RtpExtension::kFrameMarkingDefaultId));
>+  if (webrtc::field_trial::IsEnabled("WebRTC-GenericDescriptorAdvertised")) {
>+    capabilities.header_extensions.push_back(webrtc::RtpExtension(
>+        webrtc::RtpExtension::kGenericFrameDescriptorUri,
>+        webrtc::RtpExtension::kGenericFrameDescriptorDefaultId));
>+  }
>   // TODO(bugs.webrtc.org/4050): Add MID header extension as capability once MID
>   // demuxing is completed.
>   // capabilities.header_extensions.push_back(webrtc::RtpExtension(
>@@ -507,16 +520,23 @@ WebRtcVideoChannel::WebRtcVideoChannel(
>     webrtc::Call* call,
>     const MediaConfig& config,
>     const VideoOptions& options,
>+    const webrtc::CryptoOptions& crypto_options,
>     webrtc::VideoEncoderFactory* encoder_factory,
>-    webrtc::VideoDecoderFactory* decoder_factory)
>+    webrtc::VideoDecoderFactory* decoder_factory,
>+    webrtc::VideoBitrateAllocatorFactory* bitrate_allocator_factory)
>     : VideoMediaChannel(config),
>       call_(call),
>       unsignalled_ssrc_handler_(&default_unsignalled_ssrc_handler_),
>       video_config_(config.video),
>       encoder_factory_(encoder_factory),
>       decoder_factory_(decoder_factory),
>+      bitrate_allocator_factory_(bitrate_allocator_factory),
>+      preferred_dscp_(rtc::DSCP_DEFAULT),
>       default_send_options_(options),
>-      last_stats_log_ms_(-1) {
>+      last_stats_log_ms_(-1),
>+      discard_unknown_ssrc_packets_(webrtc::field_trial::IsEnabled(
>+          "WebRTC-Video-DiscardPacketsWithUnknownSsrc")),
>+      crypto_options_(crypto_options) {
>   RTC_DCHECK(thread_checker_.CalledOnValidThread());
> 
>   rtcp_receiver_report_ssrc_ = kDefaultRtcpReceiverReportSsrc;
>@@ -612,6 +632,9 @@ bool WebRtcVideoChannel::GetChangedSendParameters(
>   changed_params->codec = selected_send_codec;
> 
>   // Handle RTP header extensions.
>+  if (params.extmap_allow_mixed != ExtmapAllowMixed()) {
>+    changed_params->extmap_allow_mixed = params.extmap_allow_mixed;
>+  }
>   std::vector<webrtc::RtpExtension> filtered_extensions = FilterRtpExtensions(
>       params.extensions, webrtc::RtpExtension::IsSupportedForVideo, true);
>   if (!send_rtp_extensions_ || (*send_rtp_extensions_ != filtered_extensions)) {
>@@ -649,7 +672,7 @@ bool WebRtcVideoChannel::GetChangedSendParameters(
> }
> 
> rtc::DiffServCodePoint WebRtcVideoChannel::PreferredDscp() const {
>-  return rtc::DSCP_AF41;
>+  return preferred_dscp_;
> }
> 
> bool WebRtcVideoChannel::SetSendParameters(const VideoSendParameters& params) {
>@@ -666,6 +689,9 @@ bool WebRtcVideoChannel::SetSendParameters(const VideoSendParameters& params) {
>     RTC_LOG(LS_INFO) << "Using codec: " << codec_settings.codec.ToString();
>   }
> 
>+  if (changed_params.extmap_allow_mixed) {
>+    SetExtmapAllowMixed(*changed_params.extmap_allow_mixed);
>+  }
>   if (changed_params.rtp_header_extensions) {
>     send_rtp_extensions_ = changed_params.rtp_header_extensions;
>   }
>@@ -764,11 +790,34 @@ webrtc::RTCError WebRtcVideoChannel::SetRtpSendParameters(
>   // different order (which should change the send codec).
>   webrtc::RtpParameters current_parameters = GetRtpSendParameters(ssrc);
>   if (current_parameters.codecs != parameters.codecs) {
>-    RTC_LOG(LS_ERROR) << "Using SetParameters to change the set of codecs "
>-                      << "is not currently supported.";
>+    RTC_DLOG(LS_ERROR) << "Using SetParameters to change the set of codecs "
>+                       << "is not currently supported.";
>     return webrtc::RTCError(webrtc::RTCErrorType::INTERNAL_ERROR);
>   }
> 
>+  if (!parameters.encodings.empty()) {
>+    const auto& priority = parameters.encodings[0].network_priority;
>+    rtc::DiffServCodePoint new_dscp = rtc::DSCP_DEFAULT;
>+    if (priority == 0.5 * webrtc::kDefaultBitratePriority) {
>+      new_dscp = rtc::DSCP_CS1;
>+    } else if (priority == webrtc::kDefaultBitratePriority) {
>+      new_dscp = rtc::DSCP_DEFAULT;
>+    } else if (priority == 2.0 * webrtc::kDefaultBitratePriority) {
>+      new_dscp = rtc::DSCP_AF42;
>+    } else if (priority == 4.0 * webrtc::kDefaultBitratePriority) {
>+      new_dscp = rtc::DSCP_AF41;
>+    } else {
>+      RTC_LOG(LS_WARNING) << "Received invalid send network priority: "
>+                          << priority;
>+      return webrtc::RTCError(webrtc::RTCErrorType::INVALID_RANGE);
>+    }
>+
>+    if (new_dscp != preferred_dscp_) {
>+      preferred_dscp_ = new_dscp;
>+      MediaChannel::UpdateDscp();
>+    }
>+  }
>+
>   return it->second->SetRtpParameters(parameters);
> }
> 
>@@ -832,8 +881,8 @@ bool WebRtcVideoChannel::SetRtpReceiveParameters(
> 
>   webrtc::RtpParameters current_parameters = GetRtpReceiveParameters(ssrc);
>   if (current_parameters != parameters) {
>-    RTC_LOG(LS_ERROR) << "Changing the RTP receive parameters is currently "
>-                      << "unsupported.";
>+    RTC_DLOG(LS_ERROR) << "Changing the RTP receive parameters is currently "
>+                       << "unsupported.";
>     return false;
>   }
>   return true;
>@@ -949,7 +998,7 @@ bool WebRtcVideoChannel::SetSend(bool send) {
>   TRACE_EVENT0("webrtc", "WebRtcVideoChannel::SetSend");
>   RTC_LOG(LS_VERBOSE) << "SetSend: " << (send ? "true" : "false");
>   if (send && !send_codec_) {
>-    RTC_LOG(LS_ERROR) << "SetSend(true) called before setting codec.";
>+    RTC_DLOG(LS_ERROR) << "SetSend(true) called before setting codec.";
>     return false;
>   }
>   {
>@@ -1028,6 +1077,11 @@ bool WebRtcVideoChannel::AddSendStream(const StreamParams& sp) {
>   config.encoder_settings.experiment_cpu_load_estimator =
>       video_config_.experiment_cpu_load_estimator;
>   config.encoder_settings.encoder_factory = encoder_factory_;
>+  config.encoder_settings.bitrate_allocator_factory =
>+      bitrate_allocator_factory_;
>+  config.crypto_options = crypto_options_;
>+  config.rtp.extmap_allow_mixed = ExtmapAllowMixed();
>+  config.rtcp_report_interval_ms = video_config_.rtcp_report_interval_ms;
> 
>   WebRtcVideoSendStream* stream = new WebRtcVideoSendStream(
>       call_, sp, std::move(config), default_send_options_,
>@@ -1144,6 +1198,7 @@ bool WebRtcVideoChannel::AddRecvStream(const StreamParams& sp,
>   webrtc::FlexfecReceiveStream::Config flexfec_config(this);
>   ConfigureReceiverRtp(&config, &flexfec_config, sp);
> 
>+  config.crypto_options = crypto_options_;
>   // TODO(nisse): Rename config variable to avoid negation.
>   config.disable_prerenderer_smoothing =
>       !video_config_.enable_prerenderer_smoothing;
>@@ -1332,10 +1387,10 @@ void WebRtcVideoChannel::FillSendAndReceiveCodecStats(
>   }
> }
> 
> void WebRtcVideoChannel::OnPacketReceived(rtc::CopyOnWriteBuffer* packet,
>-                                          const rtc::PacketTime& packet_time) {
>+                                          int64_t packet_time_us) {
>   const webrtc::PacketReceiver::DeliveryStatus delivery_result =
>       call_->Receiver()->DeliverPacket(webrtc::MediaType::VIDEO, *packet,
>-                                       packet_time.timestamp);
>+                                       packet_time_us);
>   switch (delivery_result) {
>     case webrtc::PacketReceiver::DELIVERY_OK:
>       return;
>@@ -1345,6 +1400,10 @@ void WebRtcVideoChannel::OnPacketReceived(rtc::CopyOnWriteBuffer* packet,
>       break;
>   }
> 
>+  if (discard_unknown_ssrc_packets_) {
>+    return;
>+  }
>+
>   uint32_t ssrc = 0;
>   if (!GetRtpSsrc(packet->cdata(), packet->size(), &ssrc)) {
>     return;
>@@ -1379,7 +1438,7 @@ void WebRtcVideoChannel::OnPacketReceived(rtc::CopyOnWriteBuffer* packet,
>   }
> 
>   if (call_->Receiver()->DeliverPacket(webrtc::MediaType::VIDEO, *packet,
>-                                       packet_time.timestamp) !=
>+                                       packet_time_us) !=
>       webrtc::PacketReceiver::DELIVERY_OK) {
>     RTC_LOG(LS_WARNING) << "Failed to deliver RTP packet on re-delivery.";
>     return;
>@@ -1387,13 +1446,13 @@ void WebRtcVideoChannel::OnPacketReceived(rtc::CopyOnWriteBuffer* packet,
> }
> 
> void WebRtcVideoChannel::OnRtcpReceived(rtc::CopyOnWriteBuffer* packet,
>-                                        const rtc::PacketTime& packet_time) {
>+                                        int64_t packet_time_us) {
>   // TODO(pbos): Check webrtc::PacketReceiver::DELIVERY_OK once we deliver
>   // for both audio and video on the same path. Since BundleFilter doesn't
>   // filter RTCP anymore incoming RTCP packets could've been going to audio (so
>   // logging failures spam the log).
>   call_->Receiver()->DeliverPacket(webrtc::MediaType::VIDEO, *packet,
>-                                   packet_time.timestamp);
>+                                   packet_time_us);
> }
> 
> void WebRtcVideoChannel::OnReadyToSend(bool ready) {
>@@ -1412,31 +1471,46 @@ void WebRtcVideoChannel::OnNetworkRouteChanged(
>       network_route.packet_overhead);
> }
> 
>-void WebRtcVideoChannel::SetInterface(NetworkInterface* iface) {
>-  MediaChannel::SetInterface(iface);
>+void WebRtcVideoChannel::SetInterface(
>+    NetworkInterface* iface,
>+    webrtc::MediaTransportInterface* media_transport) {
>+  // TODO(sukhanov): Video is not currently supported with media transport.
>+  RTC_CHECK(media_transport == nullptr);
>+
>+  MediaChannel::SetInterface(iface, media_transport);
>   // Set the RTP recv/send buffer to a bigger size.
> 
>-  // The group here can be either a positive integer with an explicit size, in
>-  // which case that is used as size. All other values shall result in the
>-  // default value being used.
>-  const std::string group_name =
>-      webrtc::field_trial::FindFullName("WebRTC-IncreasedReceivebuffers");
>-  int recv_buffer_size = kVideoRtpBufferSize;
>-  if (!group_name.empty() &&
>-      (sscanf(group_name.c_str(), "%d", &recv_buffer_size) != 1 ||
>-       recv_buffer_size <= 0)) {
>-    RTC_LOG(LS_WARNING) << "Invalid receive buffer size: " << group_name;
>-    recv_buffer_size = kVideoRtpBufferSize;
>-  }
>   MediaChannel::SetOption(NetworkInterface::ST_RTP, rtc::Socket::OPT_RCVBUF,
>-                          recv_buffer_size);
>+                          kVideoRtpRecvBufferSize);
> 
>   // Speculative change to increase the outbound socket buffer size.
>   // In b/15152257, we are seeing a significant number of packets discarded
>   // due to lack of socket buffer space, although it's not yet clear what the
>   // ideal value should be.
>   MediaChannel::SetOption(NetworkInterface::ST_RTP, rtc::Socket::OPT_SNDBUF,
>-                          kVideoRtpBufferSize);
>+                          kVideoRtpSendBufferSize);
>+}
>+
>+void WebRtcVideoChannel::SetFrameDecryptor(
>+    uint32_t ssrc,
>+    rtc::scoped_refptr<webrtc::FrameDecryptorInterface> frame_decryptor) {
>+  rtc::CritScope stream_lock(&stream_crit_);
>+  auto matching_stream = receive_streams_.find(ssrc);
>+  if (matching_stream != receive_streams_.end()) {
>+    matching_stream->second->SetFrameDecryptor(frame_decryptor);
>+  }
>+}
>+
>+void WebRtcVideoChannel::SetFrameEncryptor(
>+    uint32_t ssrc,
>+    rtc::scoped_refptr<webrtc::FrameEncryptorInterface> frame_encryptor) {
>+  rtc::CritScope stream_lock(&stream_crit_);
>+  auto matching_stream = send_streams_.find(ssrc);
>+  if (matching_stream != send_streams_.end()) {
>+    matching_stream->second->SetFrameEncryptor(frame_encryptor);
>+  } else {
>+    RTC_LOG(LS_ERROR) << "No stream found to attach frame encryptor";
>+  }
> }
> 
> absl::optional<uint32_t> WebRtcVideoChannel::GetDefaultReceiveStreamSsrc() {
>@@ -1474,6 +1548,10 @@ bool WebRtcVideoChannel::SendRtp(const uint8_t* data,
>   if (DscpEnabled()) {
>     rtc_options.dscp = PreferredDscp();
>   }
>+  rtc_options.info_signaled_after_sent.included_in_feedback =
>+      options.included_in_feedback;
>+  rtc_options.info_signaled_after_sent.included_in_allocation =
>+      options.included_in_allocation;
>   return MediaChannel::SendPacket(&packet, rtc_options);
> }
> 
>@@ -1483,6 +1561,7 @@ bool WebRtcVideoChannel::SendRtcp(const uint8_t* data, size_t len) {
>   if (DscpEnabled()) {
>     rtc_options.dscp = PreferredDscp();
>   }
>+
>   return MediaChannel::SendRtcp(&packet, rtc_options);
> }
> 
>@@ -1569,7 +1648,6 @@ WebRtcVideoChannel::WebRtcVideoSendStream::WebRtcVideoSendStream(
>            ? webrtc::RtcpMode::kReducedSize
>            : webrtc::RtcpMode::kCompound;
>   parameters_.config.rtp.mid = send_params.mid;
>-
>   rtp_parameters_.rtcp.reduced_size = send_params.rtcp.reduced_size;
> 
>   if (codec_settings) {
>@@ -1699,6 +1777,10 @@ void WebRtcVideoChannel::WebRtcVideoSendStream::SetSendParameters(
>         parameters_.config.rtp.rtcp_mode == webrtc::RtcpMode::kReducedSize;
>     recreate_stream = true;
>   }
>+  if (params.extmap_allow_mixed) {
>+    parameters_.config.rtp.extmap_allow_mixed = *params.extmap_allow_mixed;
>+    recreate_stream = true;
>+  }
>   if (params.rtp_header_extensions) {
>     parameters_.config.rtp.extensions = *params.rtp_header_extensions;
>     rtp_parameters_.header_extensions = *params.rtp_header_extensions;
>@@ -1799,6 +1881,15 @@ WebRtcVideoChannel::WebRtcVideoSendStream::GetRtpParameters() const {
>   return rtp_parameters_;
> }
> 
>+void WebRtcVideoChannel::WebRtcVideoSendStream::SetFrameEncryptor(
>+    rtc::scoped_refptr<webrtc::FrameEncryptorInterface> frame_encryptor) {
>+  RTC_DCHECK_RUN_ON(&thread_checker_);
>+  parameters_.config.frame_encryptor = frame_encryptor;
>+  if (stream_) {
>+    RecreateWebRtcStream();
>+  }
>+}
>+
> void WebRtcVideoChannel::WebRtcVideoSendStream::UpdateSendState() {
>   RTC_DCHECK_RUN_ON(&thread_checker_);
>   if (sending_) {
>@@ -2161,26 +2252,20 @@ WebRtcVideoChannel::WebRtcVideoReceiveStream::GetSources() {
>   return stream_->GetSources();
> }
> 
>-absl::optional<uint32_t>
>-WebRtcVideoChannel::WebRtcVideoReceiveStream::GetFirstPrimarySsrc() const {
>+webrtc::RtpParameters
>+WebRtcVideoChannel::WebRtcVideoReceiveStream::GetRtpParameters() const {
>+  webrtc::RtpParameters rtp_parameters;
>+
>   std::vector<uint32_t> primary_ssrcs;
>   stream_params_.GetPrimarySsrcs(&primary_ssrcs);
>-
>-  if (primary_ssrcs.empty()) {
>-    RTC_LOG(LS_WARNING)
>-        << "Empty primary ssrcs vector, returning empty optional";
>-    return absl::nullopt;
>-  } else {
>-    return primary_ssrcs[0];
>+  for (uint32_t ssrc : primary_ssrcs) {
>+    rtp_parameters.encodings.emplace_back();
>+    rtp_parameters.encodings.back().ssrc = ssrc;
>   }
>-}
> 
>-webrtc::RtpParameters
>-WebRtcVideoChannel::WebRtcVideoReceiveStream::GetRtpParameters() const {
>-  webrtc::RtpParameters rtp_parameters;
>-  rtp_parameters.encodings.emplace_back();
>-  rtp_parameters.encodings[0].ssrc = GetFirstPrimarySsrc();
>   rtp_parameters.header_extensions = config_.rtp.extensions;
>+  rtp_parameters.rtcp.reduced_size =
>+      config_.rtp.rtcp_mode == webrtc::RtcpMode::kReducedSize;
> 
>   return rtp_parameters;
> }
>@@ -2230,9 +2315,9 @@ void WebRtcVideoChannel::WebRtcVideoReceiveStream::SetLocalSsrc(
>   // right now this can't be done due to unittests depending on receiving what
>   // they are sending from the same MediaChannel.
>   if (local_ssrc == config_.rtp.local_ssrc) {
>-    RTC_LOG(LS_INFO) << "Ignoring call to SetLocalSsrc because parameters are "
>-                        "unchanged; local_ssrc="
>-                     << local_ssrc;
>+    RTC_DLOG(LS_INFO) << "Ignoring call to SetLocalSsrc because parameters are "
>+                         "unchanged; local_ssrc="
>+                      << local_ssrc;
>     return;
>   }
> 
>@@ -2372,6 +2457,14 @@ bool WebRtcVideoChannel::WebRtcVideoReceiveStream::IsDefaultStream() const {
>   return default_stream_;
> }
> 
>+void WebRtcVideoChannel::WebRtcVideoReceiveStream::SetFrameDecryptor(
>+    rtc::scoped_refptr<webrtc::FrameDecryptorInterface> frame_decryptor) {
>+  config_.frame_decryptor = frame_decryptor;
>+  if (stream_) {
>+    RecreateWebRtcVideoStream();
>+  }
>+}
>+
> void WebRtcVideoChannel::WebRtcVideoReceiveStream::SetSink(
>     rtc::VideoSinkInterface<webrtc::VideoFrame>* sink) {
>   rtc::CritScope crit(&sink_lock_);
>@@ -2607,10 +2700,11 @@ std::vector<webrtc::VideoStream> EncoderStreamFactory::CreateEncoderStreams(
>   std::vector<webrtc::VideoStream> layers;
> 
>   if (encoder_config.number_of_streams > 1 ||
>-      ((CodecNamesEq(codec_name_, kVp8CodecName) ||
>-        CodecNamesEq(codec_name_, kH264CodecName)) &&
>+      ((absl::EqualsIgnoreCase(codec_name_, kVp8CodecName) ||
>+        absl::EqualsIgnoreCase(codec_name_, kH264CodecName)) &&
>        is_screenshare_ && screenshare_config_explicitly_enabled_)) {
>-    bool temporal_layers_supported = CodecNamesEq(codec_name_, kVp8CodecName);
>+    bool temporal_layers_supported =
>+        absl::EqualsIgnoreCase(codec_name_, kVp8CodecName);
>     layers = GetSimulcastConfig(encoder_config.number_of_streams, width, height,
>                                 0 /*not used*/, encoder_config.bitrate_priority,
>                                 max_qp_, 0 /*not_used*/, is_screenshare_,
>@@ -2703,7 +2797,7 @@ std::vector<webrtc::VideoStream> EncoderStreamFactory::CreateEncoderStreams(
>   layer.max_qp = max_qp_;
>   layer.bitrate_priority = encoder_config.bitrate_priority;
> 
>-  if (CodecNamesEq(codec_name_, kVp9CodecName)) {
>+  if (absl::EqualsIgnoreCase(codec_name_, kVp9CodecName)) {
>     RTC_DCHECK(encoder_config.encoder_specific_settings);
>     // Use VP9 SVC layering from codec settings which might be initialized
>     // though field trial in ConfigureVideoEncoderSettings.
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvideoengine.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvideoengine.h
>index 8a1e6ad4d1703648edcf4570df47a33917d3c61a..3b79ad40b8a2d673fb87f1264ca8b55d724a9f53 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvideoengine.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvideoengine.h
>@@ -19,6 +19,7 @@
> 
> #include "absl/types/optional.h"
> #include "api/call/transport.h"
>+#include "api/video/video_bitrate_allocator_factory.h"
> #include "api/video/video_frame.h"
> #include "api/video/video_sink_interface.h"
> #include "api/video/video_source_interface.h"
>@@ -78,43 +79,53 @@ class DefaultUnsignalledSsrcHandler : public UnsignalledSsrcHandler {
> };
> 
> // WebRtcVideoEngine is used for the new native WebRTC Video API (webrtc:1667).
>-class WebRtcVideoEngine {
>+class WebRtcVideoEngine : public VideoEngineInterface {
>  public:
> #if defined(USE_BUILTIN_SW_CODECS)
>   // Internal SW video codecs will be added on top of the external codecs.
>   WebRtcVideoEngine(
>       std::unique_ptr<WebRtcVideoEncoderFactory> external_video_encoder_factory,
>-      std::unique_ptr<WebRtcVideoDecoderFactory>
>-          external_video_decoder_factory);
>+      std::unique_ptr<WebRtcVideoDecoderFactory> external_video_decoder_factory,
>+      std::unique_ptr<webrtc::VideoBitrateAllocatorFactory>
>+          video_bitrate_allocator_factory);
> #endif
> 
>   // These video codec factories represents all video codecs, i.e. both software
>   // and external hardware codecs.
>   WebRtcVideoEngine(
>       std::unique_ptr<webrtc::VideoEncoderFactory> video_encoder_factory,
>-      std::unique_ptr<webrtc::VideoDecoderFactory> video_decoder_factory);
>+      std::unique_ptr<webrtc::VideoDecoderFactory> video_decoder_factory,
>+      std::unique_ptr<webrtc::VideoBitrateAllocatorFactory>
>+          video_bitrate_allocator_factory);
> 
>-  virtual ~WebRtcVideoEngine();
>+  ~WebRtcVideoEngine() override;
> 
>-  WebRtcVideoChannel* CreateChannel(webrtc::Call* call,
>-                                    const MediaConfig& config,
>-                                    const VideoOptions& options);
>+  VideoMediaChannel* CreateMediaChannel(
>+      webrtc::Call* call,
>+      const MediaConfig& config,
>+      const VideoOptions& options,
>+      const webrtc::CryptoOptions& crypto_options) override;
> 
>-  std::vector<VideoCodec> codecs() const;
>-  RtpCapabilities GetCapabilities() const;
>+  std::vector<VideoCodec> codecs() const override;
>+  RtpCapabilities GetCapabilities() const override;
> 
>  private:
>   const std::unique_ptr<webrtc::VideoDecoderFactory> decoder_factory_;
>   const std::unique_ptr<webrtc::VideoEncoderFactory> encoder_factory_;
>+  const std::unique_ptr<webrtc::VideoBitrateAllocatorFactory>
>+      bitrate_allocator_factory_;
> };
> 
> class WebRtcVideoChannel : public VideoMediaChannel, public webrtc::Transport {
>  public:
>-  WebRtcVideoChannel(webrtc::Call* call,
>-                     const MediaConfig& config,
>-                     const VideoOptions& options,
>-                     webrtc::VideoEncoderFactory* encoder_factory,
>-                     webrtc::VideoDecoderFactory* decoder_factory);
>+  WebRtcVideoChannel(
>+      webrtc::Call* call,
>+      const MediaConfig& config,
>+      const VideoOptions& options,
>+      const webrtc::CryptoOptions& crypto_options,
>+      webrtc::VideoEncoderFactory* encoder_factory,
>+      webrtc::VideoDecoderFactory* decoder_factory,
>+      webrtc::VideoBitrateAllocatorFactory* bitrate_allocator_factory);
>   ~WebRtcVideoChannel() override;
> 
>   // VideoMediaChannel implementation
>@@ -147,13 +158,28 @@ class WebRtcVideoChannel : public VideoMediaChannel, public webrtc::Transport {
>   bool GetStats(VideoMediaInfo* info) override;
> 
>   void OnPacketReceived(rtc::CopyOnWriteBuffer* packet,
>-                        const rtc::PacketTime& packet_time) override;
>+                        int64_t packet_time_us) override;
>   void OnRtcpReceived(rtc::CopyOnWriteBuffer* packet,
>-                      const rtc::PacketTime& packet_time) override;
>+                      int64_t packet_time_us) override;
>   void OnReadyToSend(bool ready) override;
>   void OnNetworkRouteChanged(const std::string& transport_name,
>                              const rtc::NetworkRoute& network_route) override;
>-  void SetInterface(NetworkInterface* iface) override;
>+  void SetInterface(NetworkInterface* iface,
>+                    webrtc::MediaTransportInterface* media_transport) override;
>+
>+  // E2E Encrypted Video Frame API
>+  // Set a frame decryptor to a particular ssrc that will intercept all
>+  // incoming video frames and attempt to decrypt them before forwarding the
>+  // result.
>+  void SetFrameDecryptor(uint32_t ssrc,
>+                         rtc::scoped_refptr<webrtc::FrameDecryptorInterface>
>+                             frame_decryptor) override;
>+  // Set a frame encryptor to a particular ssrc that will intercept all
>+  // outgoing video frames and attempt to encrypt them and forward the result
>+  // to the packetizer.
>+  void SetFrameEncryptor(uint32_t ssrc,
>+                         rtc::scoped_refptr<webrtc::FrameEncryptorInterface>
>+                             frame_encryptor) override;
> 
>   // Implemented for VideoMediaChannelTest.
>   bool sending() const { return sending_; }
>@@ -201,6 +227,7 @@ class WebRtcVideoChannel : public VideoMediaChannel, public webrtc::Transport {
>     absl::optional<VideoCodecSettings> codec;
>     absl::optional<std::vector<webrtc::RtpExtension>> rtp_header_extensions;
>     absl::optional<std::string> mid;
>+    absl::optional<bool> extmap_allow_mixed;
>     absl::optional<int> max_bandwidth_bps;
>     absl::optional<bool> conference_mode;
>     absl::optional<webrtc::RtcpMode> rtcp_mode;
>@@ -257,6 +284,9 @@ class WebRtcVideoChannel : public VideoMediaChannel, public webrtc::Transport {
>     webrtc::RTCError SetRtpParameters(const webrtc::RtpParameters& parameters);
>     webrtc::RtpParameters GetRtpParameters() const;
> 
>+    void SetFrameEncryptor(
>+        rtc::scoped_refptr<webrtc::FrameEncryptorInterface> frame_encryptor);
>+
>     // Implements rtc::VideoSourceInterface<webrtc::VideoFrame>.
>     // WebRtcVideoSendStream acts as a source to the webrtc::VideoSendStream
>     // in |stream_|. This is done to proxy VideoSinkWants from the encoder to
>@@ -371,6 +401,9 @@ class WebRtcVideoChannel : public VideoMediaChannel, public webrtc::Transport {
>     void OnFrame(const webrtc::VideoFrame& frame) override;
>     bool IsDefaultStream() const;
> 
>+    void SetFrameDecryptor(
>+        rtc::scoped_refptr<webrtc::FrameDecryptorInterface> frame_decryptor);
>+
>     void SetSink(rtc::VideoSinkInterface<webrtc::VideoFrame>* sink);
> 
>     VideoReceiverInfo GetVideoReceiverInfo(bool log_stats);
>@@ -387,8 +420,6 @@ class WebRtcVideoChannel : public VideoMediaChannel, public webrtc::Transport {
> 
>     std::string GetCodecNameFromPayloadType(int payload_type);
> 
>-    absl::optional<uint32_t> GetFirstPrimarySsrc() const;
>-
>     webrtc::Call* const call_;
>     const StreamParams stream_params_;
> 
>@@ -466,6 +497,7 @@ class WebRtcVideoChannel : public VideoMediaChannel, public webrtc::Transport {
> 
>   webrtc::VideoEncoderFactory* const encoder_factory_;
>   webrtc::VideoDecoderFactory* const decoder_factory_;
>+  webrtc::VideoBitrateAllocatorFactory* const bitrate_allocator_factory_;
>   std::vector<VideoCodecSettings> recv_codecs_;
>   std::vector<webrtc::RtpExtension> recv_rtp_extensions_;
>   // See reason for keeping track of the FlexFEC payload type separately in
>@@ -475,14 +507,19 @@ class WebRtcVideoChannel : public VideoMediaChannel, public webrtc::Transport {
>   // TODO(deadbeef): Don't duplicate information between
>   // send_params/recv_params, rtp_extensions, options, etc.
>   VideoSendParameters send_params_;
>+  rtc::DiffServCodePoint preferred_dscp_;
>   VideoOptions default_send_options_;
>   VideoRecvParameters recv_params_;
>   int64_t last_stats_log_ms_;
>+  const bool discard_unknown_ssrc_packets_;
>   // This is a stream param that comes from the remote description, but wasn't
>   // signaled with any a=ssrc lines. It holds information that was signaled
>   // before the unsignaled receive stream is created when the first packet is
>   // received.
>   StreamParams unsignaled_stream_params_;
>+  // Per peer connection crypto options that last for the lifetime of the peer
>+  // connection.
>+  const webrtc::CryptoOptions crypto_options_;
> };
> 
> class EncoderStreamFactory
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvideoengine_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvideoengine_unittest.cc
>index e87f63ff4a65f2a14546ab1af79a540a4d5ba4c0..ab04fa6af3e5a6651154905b40887029a9165e9d 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvideoengine_unittest.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvideoengine_unittest.cc
>@@ -14,10 +14,14 @@
> #include <utility>
> #include <vector>
> 
>+#include "absl/strings/match.h"
> #include "api/rtpparameters.h"
>+#include "api/test/mock_video_bitrate_allocator.h"
>+#include "api/test/mock_video_bitrate_allocator_factory.h"
> #include "api/test/mock_video_decoder_factory.h"
> #include "api/test/mock_video_encoder_factory.h"
> #include "api/units/time_delta.h"
>+#include "api/video/builtin_video_bitrate_allocator_factory.h"
> #include "api/video/video_bitrate_allocation.h"
> #include "api/video_codecs/builtin_video_decoder_factory.h"
> #include "api/video_codecs/builtin_video_encoder_factory.h"
>@@ -101,7 +105,7 @@ bool HasRtxCodec(const std::vector<cricket::VideoCodec>& codecs,
>                  int payload_type) {
>   for (const cricket::VideoCodec& codec : codecs) {
>     int associated_payload_type;
>-    if (cricket::CodecNamesEq(codec.name.c_str(), "rtx") &&
>+    if (absl::EqualsIgnoreCase(codec.name.c_str(), "rtx") &&
>         codec.GetParam(cricket::kCodecParamAssociatedPayloadType,
>                        &associated_payload_type) &&
>         associated_payload_type == payload_type) {
>@@ -213,7 +217,8 @@ class WebRtcVideoEngineTest : public ::testing::Test {
>         engine_(std::unique_ptr<cricket::FakeWebRtcVideoEncoderFactory>(
>                     encoder_factory_),
>                 std::unique_ptr<cricket::FakeWebRtcVideoDecoderFactory>(
>-                    decoder_factory_),
>+                webrtc::CreateBuiltinVideoBitrateAllocatorFactory()) {
>     // Ensure fake clock doesn't return 0, which will
>+ const webrtc::CryptoOptions crypto_options_; > }; > > class EncoderStreamFactory >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvideoengine_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvideoengine_unittest.cc >index e87f63ff4a65f2a14546ab1af79a540a4d5ba4c0..ab04fa6af3e5a6651154905b40887029a9165e9d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvideoengine_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvideoengine_unittest.cc >@@ -14,10 +14,14 @@ > #include <utility> > #include <vector> > >+#include "absl/strings/match.h" > #include "api/rtpparameters.h" >+#include "api/test/mock_video_bitrate_allocator.h" >+#include "api/test/mock_video_bitrate_allocator_factory.h" > #include "api/test/mock_video_decoder_factory.h" > #include "api/test/mock_video_encoder_factory.h" > #include "api/units/time_delta.h" >+#include "api/video/builtin_video_bitrate_allocator_factory.h" > #include "api/video/video_bitrate_allocation.h" > #include "api/video_codecs/builtin_video_decoder_factory.h" > #include "api/video_codecs/builtin_video_encoder_factory.h" >@@ -101,7 +105,7 @@ bool HasRtxCodec(const std::vector<cricket::VideoCodec>& codecs, > int payload_type) { > for (const cricket::VideoCodec& codec : codecs) { > int associated_payload_type; >- if (cricket::CodecNamesEq(codec.name.c_str(), "rtx") && >+ if (absl::EqualsIgnoreCase(codec.name.c_str(), "rtx") && > codec.GetParam(cricket::kCodecParamAssociatedPayloadType, > &associated_payload_type) && > associated_payload_type == payload_type) { >@@ -213,7 +217,8 @@ class WebRtcVideoEngineTest : public ::testing::Test { > engine_(std::unique_ptr<cricket::FakeWebRtcVideoEncoderFactory>( > encoder_factory_), > std::unique_ptr<cricket::FakeWebRtcVideoDecoderFactory>( >- decoder_factory_)) { >+ decoder_factory_), >+ webrtc::CreateBuiltinVideoBitrateAllocatorFactory()) { > // Ensure fake clock doesn't return 0, which will 
cause some initializations > // fail inside RTP senders. > fake_clock_.AdvanceTimeMicros(1); >@@ -318,6 +323,25 @@ TEST_F(WebRtcVideoEngineTest, SupportsVideoRotationHeaderExtension) { > FAIL() << "Video Rotation extension not in header-extension list."; > } > >+class WebRtcVideoEngineTestWithGenericDescriptor >+ : public WebRtcVideoEngineTest { >+ public: >+ WebRtcVideoEngineTestWithGenericDescriptor() >+ : WebRtcVideoEngineTest("WebRTC-GenericDescriptorAdvertised/Enabled/") {} >+}; >+ >+TEST_F(WebRtcVideoEngineTestWithGenericDescriptor, AdvertiseGenericDescriptor) { >+ RtpCapabilities capabilities = engine_.GetCapabilities(); >+ ASSERT_FALSE(capabilities.header_extensions.empty()); >+ for (const RtpExtension& extension : capabilities.header_extensions) { >+ if (extension.uri == RtpExtension::kGenericFrameDescriptorUri) { >+ EXPECT_EQ(RtpExtension::kGenericFrameDescriptorDefaultId, extension.id); >+ return; >+ } >+ } >+ FAIL() << "Generic descriptor extension not in header-extension list."; >+} >+ > TEST_F(WebRtcVideoEngineTest, CVOSetHeaderExtensionBeforeCapturer) { > // Allocate the source first to prevent early destruction before channel's > // dtor is called. 
>@@ -433,8 +457,8 @@ TEST_F(WebRtcVideoEngineTest, CVOSetHeaderExtensionAfterCapturer) { > TEST_F(WebRtcVideoEngineTest, SetSendFailsBeforeSettingCodecs) { > encoder_factory_->AddSupportedVideoCodecType("VP8"); > >- std::unique_ptr<VideoMediaChannel> channel( >- engine_.CreateChannel(call_.get(), GetMediaConfig(), VideoOptions())); >+ std::unique_ptr<VideoMediaChannel> channel(engine_.CreateMediaChannel( >+ call_.get(), GetMediaConfig(), VideoOptions(), webrtc::CryptoOptions())); > > EXPECT_TRUE(channel->AddSendStream(StreamParams::CreateLegacy(123))); > >@@ -447,8 +471,8 @@ TEST_F(WebRtcVideoEngineTest, SetSendFailsBeforeSettingCodecs) { > TEST_F(WebRtcVideoEngineTest, GetStatsWithoutSendCodecsSetDoesNotCrash) { > encoder_factory_->AddSupportedVideoCodecType("VP8"); > >- std::unique_ptr<VideoMediaChannel> channel( >- engine_.CreateChannel(call_.get(), GetMediaConfig(), VideoOptions())); >+ std::unique_ptr<VideoMediaChannel> channel(engine_.CreateMediaChannel( >+ call_.get(), GetMediaConfig(), VideoOptions(), webrtc::CryptoOptions())); > EXPECT_TRUE(channel->AddSendStream(StreamParams::CreateLegacy(123))); > VideoMediaInfo info; > channel->GetStats(&info); >@@ -530,7 +554,7 @@ TEST_F(WebRtcVideoEngineTest, RtxCodecAddedForH264Codec) { > codecs, FindMatchingCodec(codecs, cricket::VideoCodec(h264_high))->id)); > } > >-#if !defined(RTC_DISABLE_VP9) >+#if defined(RTC_ENABLE_VP9) > TEST_F(WebRtcVideoEngineTest, CanConstructDecoderForVp9EncoderFactory) { > encoder_factory_->AddSupportedVideoCodecType("VP9"); > >@@ -540,7 +564,7 @@ TEST_F(WebRtcVideoEngineTest, CanConstructDecoderForVp9EncoderFactory) { > EXPECT_TRUE( > channel->AddRecvStream(cricket::StreamParams::CreateLegacy(kSsrc))); > } >-#endif // !defined(RTC_DISABLE_VP9) >+#endif // defined(RTC_ENABLE_VP9) > > TEST_F(WebRtcVideoEngineTest, PropagatesInputFrameTimestamp) { > encoder_factory_->AddSupportedVideoCodecType("VP8"); >@@ -631,11 +655,11 @@ size_t WebRtcVideoEngineTest::GetEngineCodecIndex( > const 
std::vector<cricket::VideoCodec> codecs = engine_.codecs(); > for (size_t i = 0; i < codecs.size(); ++i) { > const cricket::VideoCodec engine_codec = codecs[i]; >- if (!CodecNamesEq(name, engine_codec.name)) >+ if (!absl::EqualsIgnoreCase(name, engine_codec.name)) > continue; > // The tests only use H264 Constrained Baseline. Make sure we don't return > // an internal H264 codec from the engine with a different H264 profile. >- if (CodecNamesEq(name.c_str(), kH264CodecName)) { >+ if (absl::EqualsIgnoreCase(name.c_str(), kH264CodecName)) { > const absl::optional<webrtc::H264::ProfileLevelId> profile_level_id = > webrtc::H264::ParseSdpProfileLevelId(engine_codec.params); > if (profile_level_id->profile != >@@ -657,8 +681,8 @@ cricket::VideoCodec WebRtcVideoEngineTest::GetEngineCodec( > > VideoMediaChannel* > WebRtcVideoEngineTest::SetSendParamsWithAllSupportedCodecs() { >- VideoMediaChannel* channel = >- engine_.CreateChannel(call_.get(), GetMediaConfig(), VideoOptions()); >+ VideoMediaChannel* channel = engine_.CreateMediaChannel( >+ call_.get(), GetMediaConfig(), VideoOptions(), webrtc::CryptoOptions()); > cricket::VideoSendParameters parameters; > // We need to look up the codec in the engine to get the correct payload type. 
> for (const webrtc::SdpVideoFormat& format : >@@ -677,8 +701,8 @@ WebRtcVideoEngineTest::SetSendParamsWithAllSupportedCodecs() { > > VideoMediaChannel* WebRtcVideoEngineTest::SetRecvParamsWithSupportedCodecs( > const std::vector<VideoCodec>& codecs) { >- VideoMediaChannel* channel = >- engine_.CreateChannel(call_.get(), GetMediaConfig(), VideoOptions()); >+ VideoMediaChannel* channel = engine_.CreateMediaChannel( >+ call_.get(), GetMediaConfig(), VideoOptions(), webrtc::CryptoOptions()); > cricket::VideoRecvParameters parameters; > parameters.codecs = codecs; > EXPECT_TRUE(channel->SetRecvParameters(parameters)); >@@ -732,8 +756,8 @@ TEST_F(WebRtcVideoEngineTest, ChannelWithH264CanChangeToVp8) { > EXPECT_EQ(cricket::CS_RUNNING, > capturer.Start(capturer.GetSupportedFormats()->front())); > >- std::unique_ptr<VideoMediaChannel> channel( >- engine_.CreateChannel(call_.get(), GetMediaConfig(), VideoOptions())); >+ std::unique_ptr<VideoMediaChannel> channel(engine_.CreateMediaChannel( >+ call_.get(), GetMediaConfig(), VideoOptions(), webrtc::CryptoOptions())); > cricket::VideoSendParameters parameters; > parameters.codecs.push_back(GetEngineCodec("H264")); > EXPECT_TRUE(channel->SetSendParameters(parameters)); >@@ -761,8 +785,8 @@ TEST_F(WebRtcVideoEngineTest, > encoder_factory_->AddSupportedVideoCodecType("VP8"); > encoder_factory_->AddSupportedVideoCodecType("H264"); > >- std::unique_ptr<VideoMediaChannel> channel( >- engine_.CreateChannel(call_.get(), GetMediaConfig(), VideoOptions())); >+ std::unique_ptr<VideoMediaChannel> channel(engine_.CreateMediaChannel( >+ call_.get(), GetMediaConfig(), VideoOptions(), webrtc::CryptoOptions())); > cricket::VideoSendParameters parameters; > parameters.codecs.push_back(GetEngineCodec("VP8")); > EXPECT_TRUE(channel->SetSendParameters(parameters)); >@@ -796,8 +820,8 @@ TEST_F(WebRtcVideoEngineTest, > encoder_factory_->AddSupportedVideoCodecType("VP8"); > encoder_factory_->AddSupportedVideoCodecType("H264"); > >- 
std::unique_ptr<VideoMediaChannel> channel( >- engine_.CreateChannel(call_.get(), GetMediaConfig(), VideoOptions())); >+ std::unique_ptr<VideoMediaChannel> channel(engine_.CreateMediaChannel( >+ call_.get(), GetMediaConfig(), VideoOptions(), webrtc::CryptoOptions())); > cricket::VideoSendParameters parameters; > parameters.codecs.push_back(GetEngineCodec("H264")); > EXPECT_TRUE(channel->SetSendParameters(parameters)); >@@ -828,8 +852,8 @@ TEST_F(WebRtcVideoEngineTest, SimulcastEnabledForH264BehindFieldTrial) { > "WebRTC-H264Simulcast/Enabled/"); > encoder_factory_->AddSupportedVideoCodecType("H264"); > >- std::unique_ptr<VideoMediaChannel> channel( >- engine_.CreateChannel(call_.get(), GetMediaConfig(), VideoOptions())); >+ std::unique_ptr<VideoMediaChannel> channel(engine_.CreateMediaChannel( >+ call_.get(), GetMediaConfig(), VideoOptions(), webrtc::CryptoOptions())); > cricket::VideoSendParameters parameters; > parameters.codecs.push_back(GetEngineCodec("H264")); > EXPECT_TRUE(channel->SetSendParameters(parameters)); >@@ -989,8 +1013,10 @@ TEST_F(WebRtcVideoEngineTest, GetSourcesWithNonExistingSsrc) { > TEST(WebRtcVideoEngineNewVideoCodecFactoryTest, NullFactories) { > std::unique_ptr<webrtc::VideoEncoderFactory> encoder_factory; > std::unique_ptr<webrtc::VideoDecoderFactory> decoder_factory; >+ std::unique_ptr<webrtc::VideoBitrateAllocatorFactory> rate_allocator_factory; > WebRtcVideoEngine engine(std::move(encoder_factory), >- std::move(decoder_factory)); >+ std::move(decoder_factory), >+ std::move(rate_allocator_factory)); > EXPECT_EQ(0u, engine.codecs().size()); > } > >@@ -1000,13 +1026,18 @@ TEST(WebRtcVideoEngineNewVideoCodecFactoryTest, EmptyFactories) { > new webrtc::MockVideoEncoderFactory(); > webrtc::MockVideoDecoderFactory* decoder_factory = > new webrtc::MockVideoDecoderFactory(); >+ webrtc::MockVideoBitrateAllocatorFactory* rate_allocator_factory = >+ new webrtc::MockVideoBitrateAllocatorFactory(); > WebRtcVideoEngine engine( > 
(std::unique_ptr<webrtc::VideoEncoderFactory>(encoder_factory)), >- (std::unique_ptr<webrtc::VideoDecoderFactory>(decoder_factory))); >+ (std::unique_ptr<webrtc::VideoDecoderFactory>(decoder_factory)), >+ (std::unique_ptr<webrtc::VideoBitrateAllocatorFactory>( >+ rate_allocator_factory))); > EXPECT_CALL(*encoder_factory, GetSupportedFormats()); > EXPECT_EQ(0u, engine.codecs().size()); > EXPECT_CALL(*encoder_factory, Die()); > EXPECT_CALL(*decoder_factory, Die()); >+ EXPECT_CALL(*rate_allocator_factory, Die()); > } > > // Test full behavior in the video engine when video codec factories of the new >@@ -1019,9 +1050,17 @@ TEST(WebRtcVideoEngineNewVideoCodecFactoryTest, Vp8) { > new webrtc::MockVideoEncoderFactory(); > webrtc::MockVideoDecoderFactory* decoder_factory = > new webrtc::MockVideoDecoderFactory(); >+ webrtc::MockVideoBitrateAllocatorFactory* rate_allocator_factory = >+ new webrtc::MockVideoBitrateAllocatorFactory(); >+ EXPECT_CALL(*rate_allocator_factory, >+ CreateVideoBitrateAllocatorProxy(Field( >+ &webrtc::VideoCodec::codecType, webrtc::kVideoCodecVP8))) >+ .WillOnce(testing::Return(new webrtc::MockVideoBitrateAllocator())); > WebRtcVideoEngine engine( > (std::unique_ptr<webrtc::VideoEncoderFactory>(encoder_factory)), >- (std::unique_ptr<webrtc::VideoDecoderFactory>(decoder_factory))); >+ (std::unique_ptr<webrtc::VideoDecoderFactory>(decoder_factory)), >+ (std::unique_ptr<webrtc::VideoBitrateAllocatorFactory>( >+ rate_allocator_factory))); > const webrtc::SdpVideoFormat vp8_format("VP8"); > const std::vector<webrtc::SdpVideoFormat> supported_formats = {vp8_format}; > EXPECT_CALL(*encoder_factory, GetSupportedFormats()) >@@ -1066,7 +1105,7 @@ TEST(WebRtcVideoEngineNewVideoCodecFactoryTest, Vp8) { > EXPECT_CALL(*encoder_factory, QueryVideoEncoder(format)) > .WillRepeatedly(testing::Return(codec_info)); > FakeWebRtcVideoEncoder* const encoder = new FakeWebRtcVideoEncoder(nullptr); >- rtc::Event encoder_created(false, false); >+ rtc::Event encoder_created; 
> EXPECT_CALL(*encoder_factory, CreateVideoEncoderProxy(format)) > .WillOnce( > ::testing::DoAll(::testing::InvokeWithoutArgs( >@@ -1085,8 +1124,8 @@ TEST(WebRtcVideoEngineNewVideoCodecFactoryTest, Vp8) { > > // Create send channel. > const int send_ssrc = 123; >- std::unique_ptr<VideoMediaChannel> send_channel( >- engine.CreateChannel(call.get(), GetMediaConfig(), VideoOptions())); >+ std::unique_ptr<VideoMediaChannel> send_channel(engine.CreateMediaChannel( >+ call.get(), GetMediaConfig(), VideoOptions(), webrtc::CryptoOptions())); > cricket::VideoSendParameters send_parameters; > send_parameters.codecs.push_back(engine_codecs.at(0)); > EXPECT_TRUE(send_channel->SetSendParameters(send_parameters)); >@@ -1106,8 +1145,8 @@ TEST(WebRtcVideoEngineNewVideoCodecFactoryTest, Vp8) { > > // Create recv channel. > const int recv_ssrc = 321; >- std::unique_ptr<VideoMediaChannel> recv_channel( >- engine.CreateChannel(call.get(), GetMediaConfig(), VideoOptions())); >+ std::unique_ptr<VideoMediaChannel> recv_channel(engine.CreateMediaChannel( >+ call.get(), GetMediaConfig(), VideoOptions(), webrtc::CryptoOptions())); > cricket::VideoRecvParameters recv_parameters; > recv_parameters.codecs.push_back(engine_codecs.at(0)); > EXPECT_TRUE(recv_channel->SetRecvParameters(recv_parameters)); >@@ -1117,6 +1156,7 @@ TEST(WebRtcVideoEngineNewVideoCodecFactoryTest, Vp8) { > // Remove streams previously added to free the encoder and decoder instance. > EXPECT_CALL(*encoder_factory, Die()); > EXPECT_CALL(*decoder_factory, Die()); >+ EXPECT_CALL(*rate_allocator_factory, Die()); > EXPECT_TRUE(send_channel->RemoveSendStream(send_ssrc)); > EXPECT_TRUE(recv_channel->RemoveRecvStream(recv_ssrc)); > } >@@ -1125,12 +1165,16 @@ TEST(WebRtcVideoEngineNewVideoCodecFactoryTest, Vp8) { > TEST(WebRtcVideoEngineNewVideoCodecFactoryTest, NullDecoder) { > // |engine| take ownership of the factories. 
> webrtc::MockVideoEncoderFactory* encoder_factory = >- new testing::StrictMock<webrtc::MockVideoEncoderFactory>(); >+ new webrtc::MockVideoEncoderFactory(); > webrtc::MockVideoDecoderFactory* decoder_factory = >- new testing::StrictMock<webrtc::MockVideoDecoderFactory>(); >+ new webrtc::MockVideoDecoderFactory(); >+ webrtc::MockVideoBitrateAllocatorFactory* rate_allocator_factory = >+ new webrtc::MockVideoBitrateAllocatorFactory(); > WebRtcVideoEngine engine( > (std::unique_ptr<webrtc::VideoEncoderFactory>(encoder_factory)), >- (std::unique_ptr<webrtc::VideoDecoderFactory>(decoder_factory))); >+ (std::unique_ptr<webrtc::VideoDecoderFactory>(decoder_factory)), >+ (std::unique_ptr<webrtc::VideoBitrateAllocatorFactory>( >+ rate_allocator_factory))); > const webrtc::SdpVideoFormat vp8_format("VP8"); > const std::vector<webrtc::SdpVideoFormat> supported_formats = {vp8_format}; > EXPECT_CALL(*encoder_factory, GetSupportedFormats()) >@@ -1147,8 +1191,8 @@ TEST(WebRtcVideoEngineNewVideoCodecFactoryTest, NullDecoder) { > > // Create recv channel. > const int recv_ssrc = 321; >- std::unique_ptr<VideoMediaChannel> recv_channel( >- engine.CreateChannel(call.get(), GetMediaConfig(), VideoOptions())); >+ std::unique_ptr<VideoMediaChannel> recv_channel(engine.CreateMediaChannel( >+ call.get(), GetMediaConfig(), VideoOptions(), webrtc::CryptoOptions())); > cricket::VideoRecvParameters recv_parameters; > recv_parameters.codecs.push_back(engine.codecs().front()); > EXPECT_TRUE(recv_channel->SetRecvParameters(recv_parameters)); >@@ -1223,7 +1267,8 @@ class WebRtcVideoChannelBaseTest : public testing::Test { > protected: > WebRtcVideoChannelBaseTest() > : engine_(webrtc::CreateBuiltinVideoEncoderFactory(), >- webrtc::CreateBuiltinVideoDecoderFactory()) {} >+ webrtc::CreateBuiltinVideoDecoderFactory(), >+ webrtc::CreateBuiltinVideoBitrateAllocatorFactory()) {} > > virtual void SetUp() { > // One testcase calls SetUp in a loop, only create call_ once. 
>@@ -1236,12 +1281,14 @@ class WebRtcVideoChannelBaseTest : public testing::Test { > // needs to be disabled, otherwise, tests which check the size of received > // frames become flaky. > media_config.video.enable_cpu_adaptation = false; >- channel_.reset(engine_.CreateChannel(call_.get(), media_config, >- cricket::VideoOptions())); >+ channel_.reset( >+ static_cast<cricket::WebRtcVideoChannel*>(engine_.CreateMediaChannel( >+ call_.get(), media_config, cricket::VideoOptions(), >+ webrtc::CryptoOptions()))); > channel_->OnReadyToSend(true); > EXPECT_TRUE(channel_.get() != NULL); > network_interface_.SetDestination(channel_.get()); >- channel_->SetInterface(&network_interface_); >+ channel_->SetInterface(&network_interface_, /*media_transport=*/nullptr); > cricket::VideoRecvParameters parameters; > parameters.codecs = engine_.codecs(); > channel_->SetRecvParameters(parameters); >@@ -1411,7 +1458,7 @@ class WebRtcVideoChannelBaseTest : public testing::Test { > > cricket::VideoCodec GetEngineCodec(const std::string& name) { > for (const cricket::VideoCodec& engine_codec : engine_.codecs()) { >- if (CodecNamesEq(name, engine_codec.name)) >+ if (absl::EqualsIgnoreCase(name, engine_codec.name)) > return engine_codec; > } > // This point should never be reached. >@@ -1465,60 +1512,7 @@ TEST_F(WebRtcVideoChannelBaseTest, SetSendSetsTransportBufferSizes) { > EXPECT_TRUE(SetOneCodec(DefaultCodec())); > EXPECT_TRUE(SetSend(true)); > EXPECT_EQ(64 * 1024, network_interface_.sendbuf_size()); >- EXPECT_EQ(64 * 1024, network_interface_.recvbuf_size()); >-} >- >-// Test that we properly set the send and recv buffer sizes when overriding >-// via field trials. >-TEST_F(WebRtcVideoChannelBaseTest, OverridesRecvBufferSize) { >- // Set field trial to override the default recv buffer size, and then re-run >- // setup where the interface is created and configured. 
>- const int kCustomRecvBufferSize = 123456; >- webrtc::test::ScopedFieldTrials field_trial( >- "WebRTC-IncreasedReceivebuffers/123456/"); >- SetUp(); >- >- EXPECT_TRUE(SetOneCodec(DefaultCodec())); >- EXPECT_TRUE(SetSend(true)); >- EXPECT_EQ(64 * 1024, network_interface_.sendbuf_size()); >- EXPECT_EQ(kCustomRecvBufferSize, network_interface_.recvbuf_size()); >-} >- >-// Test that we properly set the send and recv buffer sizes when overriding >-// via field trials with suffix. >-TEST_F(WebRtcVideoChannelBaseTest, OverridesRecvBufferSizeWithSuffix) { >- // Set field trial to override the default recv buffer size, and then re-run >- // setup where the interface is created and configured. >- const int kCustomRecvBufferSize = 123456; >- webrtc::test::ScopedFieldTrials field_trial( >- "WebRTC-IncreasedReceivebuffers/123456_Dogfood/"); >- SetUp(); >- >- EXPECT_TRUE(SetOneCodec(DefaultCodec())); >- EXPECT_TRUE(SetSend(true)); >- EXPECT_EQ(64 * 1024, network_interface_.sendbuf_size()); >- EXPECT_EQ(kCustomRecvBufferSize, network_interface_.recvbuf_size()); >-} >- >-// Test that we properly set the send and recv buffer sizes when overriding >-// via field trials that don't make any sense. >-TEST_F(WebRtcVideoChannelBaseTest, InvalidRecvBufferSize) { >- // Set bogus field trial values to override the default recv buffer size, and >- // then re-run setup where the interface is created and configured. The >- // default value should still be used. 
>- >- for (std::string group : {" ", "NotANumber", "-1", "0"}) { >- std::string field_trial_string = "WebRTC-IncreasedReceivebuffers/"; >- field_trial_string += group; >- field_trial_string += "/"; >- webrtc::test::ScopedFieldTrials field_trial(field_trial_string); >- SetUp(); >- >- EXPECT_TRUE(SetOneCodec(DefaultCodec())); >- EXPECT_TRUE(SetSend(true)); >- EXPECT_EQ(64 * 1024, network_interface_.sendbuf_size()); >- EXPECT_EQ(64 * 1024, network_interface_.recvbuf_size()); >- } >+ EXPECT_EQ(256 * 1024, network_interface_.recvbuf_size()); > } > > // Test that stats work properly for a 1-1 call. >@@ -1757,7 +1751,7 @@ TEST_F(WebRtcVideoChannelBaseTest, SetSink) { > EXPECT_TRUE(SetDefaultCodec()); > EXPECT_TRUE(SetSend(true)); > EXPECT_EQ(0, renderer_.num_rendered_frames()); >- channel_->OnPacketReceived(&packet1, rtc::PacketTime()); >+ channel_->OnPacketReceived(&packet1, /* packet_time_us */ -1); > EXPECT_TRUE(channel_->SetSink(kDefaultReceiveSsrc, &renderer_)); > EXPECT_TRUE(SendFrame()); > EXPECT_FRAME_WAIT(1, kVideoWidth, kVideoHeight, kTimeout); >@@ -2046,8 +2040,9 @@ class WebRtcVideoChannelTest : public WebRtcVideoEngineTest { > #endif > > fake_call_.reset(new FakeCall()); >- channel_.reset(engine_.CreateChannel(fake_call_.get(), GetMediaConfig(), >- VideoOptions())); >+ channel_.reset(engine_.CreateMediaChannel(fake_call_.get(), >+ GetMediaConfig(), VideoOptions(), >+ webrtc::CryptoOptions())); > channel_->OnReadyToSend(true); > last_ssrc_ = 123; > send_parameters_.codecs = engine_.codecs(); >@@ -2121,6 +2116,29 @@ class WebRtcVideoChannelTest : public WebRtcVideoEngineTest { > &BitrateConstraints::max_bitrate_bps, max_bitrate_bps))); > } > >+ void TestExtmapAllowMixedCaller(bool extmap_allow_mixed) { >+ // For a caller, the answer will be applied in set remote description >+ // where SetSendParameters() is called. 
>+ EXPECT_TRUE( >+ channel_->AddSendStream(cricket::StreamParams::CreateLegacy(kSsrc))); >+ send_parameters_.extmap_allow_mixed = extmap_allow_mixed; >+ EXPECT_TRUE(channel_->SetSendParameters(send_parameters_)); >+ const webrtc::VideoSendStream::Config& config = >+ fake_call_->GetVideoSendStreams()[0]->GetConfig(); >+ EXPECT_EQ(extmap_allow_mixed, config.rtp.extmap_allow_mixed); >+ } >+ >+ void TestExtmapAllowMixedCallee(bool extmap_allow_mixed) { >+ // For a callee, the answer will be applied in set local description >+ // where SetExtmapAllowMixed() and AddSendStream() are called. >+ channel_->SetExtmapAllowMixed(extmap_allow_mixed); >+ EXPECT_TRUE( >+ channel_->AddSendStream(cricket::StreamParams::CreateLegacy(kSsrc))); >+ const webrtc::VideoSendStream::Config& config = >+ fake_call_->GetVideoSendStreams()[0]->GetConfig(); >+ EXPECT_EQ(extmap_allow_mixed, config.rtp.extmap_allow_mixed); >+ } >+ > void TestSetSendRtpHeaderExtensions(const std::string& ext_uri) { > // Enable extension. > const int id = 1; >@@ -2377,6 +2395,20 @@ TEST_F(WebRtcVideoChannelTest, RecvStreamNoRtx) { > ASSERT_EQ(0U, recv_stream->GetConfig().rtp.rtx_ssrc); > } > >+// Test propagation of extmap allow mixed setting. 
>+TEST_F(WebRtcVideoChannelTest, SetExtmapAllowMixedAsCaller) { >+ TestExtmapAllowMixedCaller(/*extmap_allow_mixed=*/true); >+} >+TEST_F(WebRtcVideoChannelTest, SetExtmapAllowMixedDisabledAsCaller) { >+ TestExtmapAllowMixedCaller(/*extmap_allow_mixed=*/false); >+} >+TEST_F(WebRtcVideoChannelTest, SetExtmapAllowMixedAsCallee) { >+ TestExtmapAllowMixedCallee(/*extmap_allow_mixed=*/true); >+} >+TEST_F(WebRtcVideoChannelTest, SetExtmapAllowMixedDisabledAsCallee) { >+ TestExtmapAllowMixedCallee(/*extmap_allow_mixed=*/false); >+} >+ > TEST_F(WebRtcVideoChannelTest, NoHeaderExtesionsByDefault) { > FakeVideoSendStream* send_stream = > AddSendStream(cricket::StreamParams::CreateLegacy(kSsrcs1[0])); >@@ -2853,8 +2885,8 @@ TEST_F(WebRtcVideoChannelTest, SetMediaConfigSuspendBelowMinBitrate) { > MediaConfig media_config = GetMediaConfig(); > media_config.video.suspend_below_min_bitrate = true; > >- channel_.reset( >- engine_.CreateChannel(fake_call_.get(), media_config, VideoOptions())); >+ channel_.reset(engine_.CreateMediaChannel( >+ fake_call_.get(), media_config, VideoOptions(), webrtc::CryptoOptions())); > channel_->OnReadyToSend(true); > > channel_->SetSendParameters(send_parameters_); >@@ -2863,8 +2895,8 @@ TEST_F(WebRtcVideoChannelTest, SetMediaConfigSuspendBelowMinBitrate) { > EXPECT_TRUE(stream->GetConfig().suspend_below_min_bitrate); > > media_config.video.suspend_below_min_bitrate = false; >- channel_.reset( >- engine_.CreateChannel(fake_call_.get(), media_config, VideoOptions())); >+ channel_.reset(engine_.CreateMediaChannel( >+ fake_call_.get(), media_config, VideoOptions(), webrtc::CryptoOptions())); > channel_->OnReadyToSend(true); > > channel_->SetSendParameters(send_parameters_); >@@ -3204,8 +3236,8 @@ TEST_F(WebRtcVideoChannelTest, AdaptsOnOveruseAndChangeResolution) { > parameters.codecs.push_back(codec); > > MediaConfig media_config = GetMediaConfig(); >- channel_.reset( >- engine_.CreateChannel(fake_call_.get(), media_config, VideoOptions())); >+ 
channel_.reset(engine_.CreateMediaChannel( >+ fake_call_.get(), media_config, VideoOptions(), webrtc::CryptoOptions())); > channel_->OnReadyToSend(true); > ASSERT_TRUE(channel_->SetSendParameters(parameters)); > >@@ -3285,8 +3317,8 @@ TEST_F(WebRtcVideoChannelTest, PreviousAdaptationDoesNotApplyToScreenshare) { > > MediaConfig media_config = GetMediaConfig(); > media_config.video.enable_cpu_adaptation = true; >- channel_.reset( >- engine_.CreateChannel(fake_call_.get(), media_config, VideoOptions())); >+ channel_.reset(engine_.CreateMediaChannel( >+ fake_call_.get(), media_config, VideoOptions(), webrtc::CryptoOptions())); > channel_->OnReadyToSend(true); > ASSERT_TRUE(channel_->SetSendParameters(parameters)); > >@@ -3359,8 +3391,8 @@ void WebRtcVideoChannelTest::TestDegradationPreference( > > MediaConfig media_config = GetMediaConfig(); > media_config.video.enable_cpu_adaptation = true; >- channel_.reset( >- engine_.CreateChannel(fake_call_.get(), media_config, VideoOptions())); >+ channel_.reset(engine_.CreateMediaChannel( >+ fake_call_.get(), media_config, VideoOptions(), webrtc::CryptoOptions())); > channel_->OnReadyToSend(true); > > EXPECT_TRUE(channel_->SetSendParameters(parameters)); >@@ -3394,8 +3426,8 @@ void WebRtcVideoChannelTest::TestCpuAdaptation(bool enable_overuse, > if (enable_overuse) { > media_config.video.enable_cpu_adaptation = true; > } >- channel_.reset( >- engine_.CreateChannel(fake_call_.get(), media_config, VideoOptions())); >+ channel_.reset(engine_.CreateMediaChannel( >+ fake_call_.get(), media_config, VideoOptions(), webrtc::CryptoOptions())); > channel_->OnReadyToSend(true); > > EXPECT_TRUE(channel_->SetSendParameters(parameters)); >@@ -4575,31 +4607,55 @@ TEST_F(WebRtcVideoChannelTest, TestSetDscpOptions) { > new cricket::FakeNetworkInterface); > MediaConfig config; > std::unique_ptr<cricket::WebRtcVideoChannel> channel; >+ webrtc::RtpParameters parameters; > >- channel.reset(static_cast<cricket::WebRtcVideoChannel*>( >- 
engine_.CreateChannel(call_.get(), config, VideoOptions()))); >- channel->SetInterface(network_interface.get()); >+ channel.reset( >+ static_cast<cricket::WebRtcVideoChannel*>(engine_.CreateMediaChannel( >+ call_.get(), config, VideoOptions(), webrtc::CryptoOptions()))); >+ channel->SetInterface(network_interface.get(), /*media_transport=*/nullptr); > // Default value when DSCP is disabled should be DSCP_DEFAULT. > EXPECT_EQ(rtc::DSCP_DEFAULT, network_interface->dscp()); > >+ // Default value when DSCP is enabled is also DSCP_DEFAULT, until it is set >+ // through rtp parameters. > config.enable_dscp = true; >- channel.reset(static_cast<cricket::WebRtcVideoChannel*>( >- engine_.CreateChannel(call_.get(), config, VideoOptions()))); >- channel->SetInterface(network_interface.get()); >+ channel.reset( >+ static_cast<cricket::WebRtcVideoChannel*>(engine_.CreateMediaChannel( >+ call_.get(), config, VideoOptions(), webrtc::CryptoOptions()))); >+ channel->SetInterface(network_interface.get(), /*media_transport=*/nullptr); >+ EXPECT_EQ(rtc::DSCP_DEFAULT, network_interface->dscp()); >+ >+ // Create a send stream to configure >+ EXPECT_TRUE(channel->AddSendStream(StreamParams::CreateLegacy(kSsrc))); >+ parameters = channel->GetRtpSendParameters(kSsrc); >+ ASSERT_FALSE(parameters.encodings.empty()); >+ >+ // Various priorities map to various dscp values. >+ parameters.encodings[0].network_priority = 4.0; >+ ASSERT_TRUE(channel->SetRtpSendParameters(kSsrc, parameters).ok()); > EXPECT_EQ(rtc::DSCP_AF41, network_interface->dscp()); >+ parameters.encodings[0].network_priority = 0.5; >+ ASSERT_TRUE(channel->SetRtpSendParameters(kSsrc, parameters).ok()); >+ EXPECT_EQ(rtc::DSCP_CS1, network_interface->dscp()); >+ >+ // A bad priority does not change the dscp value. 
>+ parameters.encodings[0].network_priority = 0.0; >+ ASSERT_FALSE(channel->SetRtpSendParameters(kSsrc, parameters).ok()); >+ EXPECT_EQ(rtc::DSCP_CS1, network_interface->dscp()); > > // Packets should also self-identify their dscp in PacketOptions. > const uint8_t kData[10] = {0}; > EXPECT_TRUE(static_cast<webrtc::Transport*>(channel.get()) > ->SendRtcp(kData, sizeof(kData))); >- EXPECT_EQ(rtc::DSCP_AF41, network_interface->options().dscp); >+ EXPECT_EQ(rtc::DSCP_CS1, network_interface->options().dscp); > > // Verify that setting the option to false resets the > // DiffServCodePoint. > config.enable_dscp = false; >- channel.reset(static_cast<cricket::WebRtcVideoChannel*>( >- engine_.CreateChannel(call_.get(), config, VideoOptions()))); >- channel->SetInterface(network_interface.get()); >+ channel.reset( >+ static_cast<cricket::WebRtcVideoChannel*>(engine_.CreateMediaChannel( >+ call_.get(), config, VideoOptions(), webrtc::CryptoOptions()))); >+ channel->SetInterface(network_interface.get(), /*media_transport=*/nullptr); > EXPECT_EQ(rtc::DSCP_DEFAULT, network_interface->dscp()); > } > >@@ -4957,8 +5013,7 @@ TEST_F(WebRtcVideoChannelTest, DefaultReceiveStreamReconfiguresToUseRtx) { > memset(data, 0, sizeof(data)); > rtc::SetBE32(&data[8], ssrcs[0]); > rtc::CopyOnWriteBuffer packet(data, kDataLength); >- rtc::PacketTime packet_time; >- channel_->OnPacketReceived(&packet, packet_time); >+ channel_->OnPacketReceived(&packet, /* packet_time_us */ -1); > > ASSERT_EQ(1u, fake_call_->GetVideoReceiveStreams().size()) > << "No default receive stream created."; >@@ -5116,8 +5171,7 @@ TEST_F(WebRtcVideoChannelTest, RecvUnsignaledSsrcWithSignaledStreamId) { > memset(data, 0, sizeof(data)); > rtc::SetBE32(&data[8], kIncomingUnsignalledSsrc); > rtc::CopyOnWriteBuffer packet(data, kDataLength); >- rtc::PacketTime packet_time; >- channel_->OnPacketReceived(&packet, packet_time); >+ channel_->OnPacketReceived(&packet, /* packet_time_us */ -1); > > // The stream should now be created 
with the appropriate sync label. > EXPECT_EQ(1u, fake_call_->GetVideoReceiveStreams().size()); >@@ -5130,7 +5184,7 @@ TEST_F(WebRtcVideoChannelTest, RecvUnsignaledSsrcWithSignaledStreamId) { > ASSERT_TRUE(channel_->RemoveRecvStream(kIncomingUnsignalledSsrc)); > EXPECT_EQ(0u, fake_call_->GetVideoReceiveStreams().size()); > >- channel_->OnPacketReceived(&packet, packet_time); >+ channel_->OnPacketReceived(&packet, /* packet_time_us */ -1); > EXPECT_EQ(1u, fake_call_->GetVideoReceiveStreams().size()); > EXPECT_TRUE( > fake_call_->GetVideoReceiveStreams()[0]->GetConfig().sync_group.empty()); >@@ -5156,8 +5210,7 @@ void WebRtcVideoChannelTest::TestReceiveUnsignaledSsrcPacket( > rtc::Set8(data, 1, payload_type); > rtc::SetBE32(&data[8], kIncomingUnsignalledSsrc); > rtc::CopyOnWriteBuffer packet(data, kDataLength); >- rtc::PacketTime packet_time; >- channel_->OnPacketReceived(&packet, packet_time); >+ channel_->OnPacketReceived(&packet, /* packet_time_us */ -1); > > if (expect_created_receive_stream) { > EXPECT_EQ(1u, fake_call_->GetVideoReceiveStreams().size()) >@@ -5170,6 +5223,18 @@ void WebRtcVideoChannelTest::TestReceiveUnsignaledSsrcPacket( > } > } > >+class WebRtcVideoChannelDiscardUnknownSsrcTest : public WebRtcVideoChannelTest { >+ public: >+ WebRtcVideoChannelDiscardUnknownSsrcTest() >+ : WebRtcVideoChannelTest( >+ "WebRTC-Video-DiscardPacketsWithUnknownSsrc/Enabled/") {} >+}; >+ >+TEST_F(WebRtcVideoChannelDiscardUnknownSsrcTest, NoUnsignalledStreamCreated) { >+ TestReceiveUnsignaledSsrcPacket(GetEngineCodec("VP8").id, >+ false /* expect_created_receive_stream */); >+} >+ > TEST_F(WebRtcVideoChannelTest, Vp8PacketCreatesUnsignalledStream) { > TestReceiveUnsignaledSsrcPacket(GetEngineCodec("VP8").id, > true /* expect_created_receive_stream */); >@@ -5232,8 +5297,7 @@ TEST_F(WebRtcVideoChannelTest, ReceiveDifferentUnsignaledSsrc) { > rtpHeader.ssrc = kIncomingUnsignalledSsrc + 1; > cricket::SetRtpHeader(data, sizeof(data), rtpHeader); > rtc::CopyOnWriteBuffer 
packet(data, sizeof(data)); >- rtc::PacketTime packet_time; >- channel_->OnPacketReceived(&packet, packet_time); >+ channel_->OnPacketReceived(&packet, /* packet_time_us */ -1); > // VP8 packet should create default receive stream. > ASSERT_EQ(1u, fake_call_->GetVideoReceiveStreams().size()); > FakeVideoReceiveStream* recv_stream = fake_call_->GetVideoReceiveStreams()[0]; >@@ -5249,7 +5313,7 @@ TEST_F(WebRtcVideoChannelTest, ReceiveDifferentUnsignaledSsrc) { > rtpHeader.ssrc = kIncomingUnsignalledSsrc + 2; > cricket::SetRtpHeader(data, sizeof(data), rtpHeader); > rtc::CopyOnWriteBuffer packet2(data, sizeof(data)); >- channel_->OnPacketReceived(&packet2, packet_time); >+ channel_->OnPacketReceived(&packet2, /* packet_time_us */ -1); > // VP9 packet should replace the default receive SSRC. > ASSERT_EQ(1u, fake_call_->GetVideoReceiveStreams().size()); > recv_stream = fake_call_->GetVideoReceiveStreams()[0]; >@@ -5266,7 +5330,7 @@ TEST_F(WebRtcVideoChannelTest, ReceiveDifferentUnsignaledSsrc) { > rtpHeader.ssrc = kIncomingUnsignalledSsrc + 3; > cricket::SetRtpHeader(data, sizeof(data), rtpHeader); > rtc::CopyOnWriteBuffer packet3(data, sizeof(data)); >- channel_->OnPacketReceived(&packet3, packet_time); >+ channel_->OnPacketReceived(&packet3, /* packet_time_us */ -1); > // H264 packet should replace the default receive SSRC. > ASSERT_EQ(1u, fake_call_->GetVideoReceiveStreams().size()); > recv_stream = fake_call_->GetVideoReceiveStreams()[0]; >@@ -5300,8 +5364,7 @@ TEST_F(WebRtcVideoChannelTest, > rtp_header.ssrc = kSsrcs3[0]; > cricket::SetRtpHeader(data, sizeof(data), rtp_header); > rtc::CopyOnWriteBuffer packet(data, sizeof(data)); >- rtc::PacketTime packet_time; >- channel_->OnPacketReceived(&packet, packet_time); >+ channel_->OnPacketReceived(&packet, /* packet_time_us */ -1); > // Default receive stream should be created. 
> ASSERT_EQ(1u, fake_call_->GetVideoReceiveStreams().size()); > FakeVideoReceiveStream* recv_stream0 = >@@ -5319,7 +5382,7 @@ TEST_F(WebRtcVideoChannelTest, > rtp_header.ssrc = kSsrcs3[1]; > cricket::SetRtpHeader(data, sizeof(data), rtp_header); > packet.SetData(data, sizeof(data)); >- channel_->OnPacketReceived(&packet, packet_time); >+ channel_->OnPacketReceived(&packet, /* packet_time_us */ -1); > // New default receive stream should be created, but old stream should remain. > ASSERT_EQ(2u, fake_call_->GetVideoReceiveStreams().size()); > EXPECT_EQ(recv_stream0, fake_call_->GetVideoReceiveStreams()[0]); >@@ -5646,6 +5709,28 @@ TEST_F(WebRtcVideoChannelTest, SetRtpSendParametersPrioritySimulcastStreams) { > EXPECT_TRUE(channel_->SetVideoSend(primary_ssrc, nullptr, nullptr)); > } > >+// RTCRtpEncodingParameters.network_priority must be one of a few values >+// derived from the default priority, corresponding to very-low, low, medium, >+// or high. >+TEST_F(WebRtcVideoChannelTest, SetRtpSendParametersInvalidNetworkPriority) { >+ AddSendStream(); >+ webrtc::RtpParameters parameters = channel_->GetRtpSendParameters(last_ssrc_); >+ EXPECT_EQ(1UL, parameters.encodings.size()); >+ EXPECT_EQ(webrtc::kDefaultBitratePriority, >+ parameters.encodings[0].network_priority); >+ >+ double good_values[] = {0.5, 1.0, 2.0, 4.0}; >+ double bad_values[] = {-1.0, 0.0, 0.49, 0.51, 1.1, 3.99, 4.1, 5.0}; >+ for (auto it : good_values) { >+ parameters.encodings[0].network_priority = it; >+ EXPECT_TRUE(channel_->SetRtpSendParameters(last_ssrc_, parameters).ok()); >+ } >+ for (auto it : bad_values) { >+ parameters.encodings[0].network_priority = it; >+ EXPECT_FALSE(channel_->SetRtpSendParameters(last_ssrc_, parameters).ok()); >+ } >+} >+ > TEST_F(WebRtcVideoChannelTest, GetAndSetRtpSendParametersMaxFramerate) { > const size_t kNumSimulcastStreams = 3; > SetUpSimulcast(true, false); >@@ -6634,8 +6719,7 @@ TEST_F(WebRtcVideoChannelTest, GetRtpReceiveParametersWithUnsignaledSsrc) { > 
rtpHeader.ssrc = kIncomingUnsignalledSsrc; > cricket::SetRtpHeader(data, sizeof(data), rtpHeader); > rtc::CopyOnWriteBuffer packet(data, sizeof(data)); >- rtc::PacketTime packet_time; >- channel_->OnPacketReceived(&packet, packet_time); >+ channel_->OnPacketReceived(&packet, /* packet_time_us */ -1); > > // The |ssrc| member should still be unset. > rtp_parameters = channel_->GetRtpReceiveParameters(0); >@@ -6699,16 +6783,21 @@ class WebRtcVideoChannelSimulcastTest : public testing::Test { > : fake_call_(), > encoder_factory_(new cricket::FakeWebRtcVideoEncoderFactory), > decoder_factory_(new cricket::FakeWebRtcVideoDecoderFactory), >+ mock_rate_allocator_factory_( >+ new webrtc::MockVideoBitrateAllocatorFactory), > engine_(std::unique_ptr<cricket::FakeWebRtcVideoEncoderFactory>( > encoder_factory_), > std::unique_ptr<cricket::FakeWebRtcVideoDecoderFactory>( >- decoder_factory_)), >+ decoder_factory_), >+ std::unique_ptr<webrtc::VideoBitrateAllocatorFactory>( >+ mock_rate_allocator_factory_)), > last_ssrc_(0) {} > > void SetUp() override { > encoder_factory_->AddSupportedVideoCodecType("VP8"); >- channel_.reset( >- engine_.CreateChannel(&fake_call_, GetMediaConfig(), VideoOptions())); >+ channel_.reset(engine_.CreateMediaChannel(&fake_call_, GetMediaConfig(), >+ VideoOptions(), >+ webrtc::CryptoOptions())); > channel_->OnReadyToSend(true); > last_ssrc_ = 123; > } >@@ -6858,6 +6947,7 @@ class WebRtcVideoChannelSimulcastTest : public testing::Test { > FakeCall fake_call_; > cricket::FakeWebRtcVideoEncoderFactory* encoder_factory_; > cricket::FakeWebRtcVideoDecoderFactory* decoder_factory_; >+ webrtc::MockVideoBitrateAllocatorFactory* mock_rate_allocator_factory_; > WebRtcVideoEngine engine_; > std::unique_ptr<VideoMediaChannel> channel_; > uint32_t last_ssrc_; >@@ -6905,16 +6995,20 @@ TEST_F(WebRtcVideoChannelSimulcastTest, > false); > } > >-// The fake clock needs to be initialize before the call. >-// So defer creating call in base class. 
>-class WebRtcVideoChannelTestWithClock : public WebRtcVideoChannelBaseTest { >+class WebRtcVideoFakeClock { > public: >- WebRtcVideoChannelTestWithClock() { >+ WebRtcVideoFakeClock() { > fake_clock_.AdvanceTime(webrtc::TimeDelta::ms(1)); // avoid time=0 > } > rtc::ScopedFakeClock fake_clock_; > }; > >+// The fake clock needs to be initialized before the call, and not >+// destroyed until after all threads spawned by the test have been stopped. >+// This mixin ensures that. >+class WebRtcVideoChannelTestWithClock : public WebRtcVideoFakeClock, >+ public WebRtcVideoChannelBaseTest {}; >+ > TEST_F(WebRtcVideoChannelTestWithClock, GetSources) { > uint8_t data1[] = {0x80, 0x00, 0x00, 0x00, 0x00, 0x00, > 0x00, 0x00, 0x00, 0x00, 0x00, 0x00}; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvoiceengine.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvoiceengine.cc >index 8ae7b991e805392dc9254aea84b1386186b73ff9..1660bd86e114aafbcec1647d6388b915aabf4da0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvoiceengine.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvoiceengine.cc >@@ -19,8 +19,10 @@ > #include <utility> > #include <vector> > >+#include "absl/strings/match.h" > #include "api/audio_codecs/audio_codec_pair_id.h" > #include "api/call/audio_sink.h" >+#include "api/media_transport_interface.h" > #include "media/base/audiosource.h" > #include "media/base/mediaconstants.h" > #include "media/base/streamparams.h" >@@ -38,10 +40,8 @@ > #include "rtc_base/helpers.h" > #include "rtc_base/logging.h" > #include "rtc_base/race_checker.h" >-#include "rtc_base/stringencode.h" > #include "rtc_base/strings/audio_format_to_string.h" > #include "rtc_base/strings/string_builder.h" >-#include "rtc_base/stringutils.h" > #include "rtc_base/third_party/base64/base64.h" > #include "rtc_base/trace_event.h" > #include "system_wrappers/include/field_trial.h" >@@ -58,11 +58,6 @@ constexpr int 
kNackRtpHistoryMs = 5000; > const int kOpusMinBitrateBps = 6000; > const int kOpusBitrateFbBps = 32000; > >-// Default audio dscp value. >-// See http://tools.ietf.org/html/rfc2474 for details. >-// See also http://tools.ietf.org/html/draft-jennings-rtcweb-qos-00 >-const rtc::DiffServCodePoint kAudioDscpValue = rtc::DSCP_EF; >- > const int kMinTelephoneEventCode = 0; // RFC4733 (Section 2.3.1) > const int kMaxTelephoneEventCode = 255; > >@@ -83,12 +78,12 @@ class ProxySink : public webrtc::AudioSinkInterface { > > bool ValidateStreamParams(const StreamParams& sp) { > if (sp.ssrcs.empty()) { >- RTC_LOG(LS_ERROR) << "No SSRCs in stream parameters: " << sp.ToString(); >+ RTC_DLOG(LS_ERROR) << "No SSRCs in stream parameters: " << sp.ToString(); > return false; > } > if (sp.ssrcs.size() > 1) { >- RTC_LOG(LS_ERROR) << "Multiple SSRCs in stream parameters: " >- << sp.ToString(); >+ RTC_DLOG(LS_ERROR) << "Multiple SSRCs in stream parameters: " >+ << sp.ToString(); > return false; > } > return true; >@@ -110,7 +105,7 @@ std::string ToString(const AudioCodec& codec) { > } > > bool IsCodec(const AudioCodec& codec, const char* ref_name) { >- return (_stricmp(codec.name.c_str(), ref_name) == 0); >+ return absl::EqualsIgnoreCase(codec.name, ref_name); > } > > bool FindCodec(const std::vector<AudioCodec>& codecs, >@@ -284,6 +279,7 @@ void WebRtcVoiceEngine::Init() { > options.stereo_swapping = false; > options.audio_jitter_buffer_max_packets = 50; > options.audio_jitter_buffer_fast_accelerate = false; >+ options.audio_jitter_buffer_min_delay_ms = 0; > options.typing_detection = true; > options.experimental_agc = false; > options.extended_filter_aec = false; >@@ -303,12 +299,14 @@ rtc::scoped_refptr<webrtc::AudioState> WebRtcVoiceEngine::GetAudioState() > return audio_state_; > } > >-VoiceMediaChannel* WebRtcVoiceEngine::CreateChannel( >+VoiceMediaChannel* WebRtcVoiceEngine::CreateMediaChannel( > webrtc::Call* call, > const MediaConfig& config, >- const AudioOptions& options) { >+ 
const AudioOptions& options, >+ const webrtc::CryptoOptions& crypto_options) { > RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- return new WebRtcVoiceMediaChannel(this, config, options, call); >+ return new WebRtcVoiceMediaChannel(this, config, options, crypto_options, >+ call); > } > > bool WebRtcVoiceEngine::ApplyOptions(const AudioOptions& options_in) { >@@ -320,11 +318,6 @@ bool WebRtcVoiceEngine::ApplyOptions(const AudioOptions& options_in) { > // Set and adjust echo canceller options. > // kEcConference is AEC with high suppression. > webrtc::EcModes ec_mode = webrtc::kEcConference; >- if (options.aecm_generate_comfort_noise && >- *options.aecm_generate_comfort_noise) { >- RTC_LOG(LS_WARNING) >- << "Ignoring deprecated mobile AEC setting: comfort noise"; >- } > > #if defined(WEBRTC_IOS) > if (options.ios_force_software_aec_HACK && >@@ -490,6 +483,12 @@ bool WebRtcVoiceEngine::ApplyOptions(const AudioOptions& options_in) { > audio_jitter_buffer_fast_accelerate_ = > *options.audio_jitter_buffer_fast_accelerate; > } >+ if (options.audio_jitter_buffer_min_delay_ms) { >+ RTC_LOG(LS_INFO) << "NetEq minimum delay is " >+ << *options.audio_jitter_buffer_min_delay_ms; >+ audio_jitter_buffer_min_delay_ms_ = >+ *options.audio_jitter_buffer_min_delay_ms; >+ } > > if (options.typing_detection) { > RTC_LOG(LS_INFO) << "Typing detection is enabled? 
" >@@ -713,16 +712,20 @@ class WebRtcVoiceMediaChannel::WebRtcAudioSendStream > const std::string track_id, > const absl::optional<webrtc::AudioSendStream::Config::SendCodecSpec>& > send_codec_spec, >+ bool extmap_allow_mixed, > const std::vector<webrtc::RtpExtension>& extensions, > int max_send_bitrate_bps, >+ int rtcp_report_interval_ms, > const absl::optional<std::string>& audio_network_adaptor_config, > webrtc::Call* call, > webrtc::Transport* send_transport, >+ webrtc::MediaTransportInterface* media_transport, > const rtc::scoped_refptr<webrtc::AudioEncoderFactory>& encoder_factory, > const absl::optional<webrtc::AudioCodecPairId> codec_pair_id, >- rtc::scoped_refptr<webrtc::FrameEncryptorInterface> frame_encryptor) >+ rtc::scoped_refptr<webrtc::FrameEncryptorInterface> frame_encryptor, >+ const webrtc::CryptoOptions& crypto_options) > : call_(call), >- config_(send_transport), >+ config_(send_transport, media_transport), > send_side_bwe_with_overhead_( > webrtc::field_trial::IsEnabled("WebRTC-SendSideBwe-WithOverhead")), > max_send_bitrate_bps_(max_send_bitrate_bps), >@@ -732,12 +735,17 @@ class WebRtcVoiceMediaChannel::WebRtcAudioSendStream > config_.rtp.ssrc = ssrc; > config_.rtp.mid = mid; > config_.rtp.c_name = c_name; >+ config_.rtp.extmap_allow_mixed = extmap_allow_mixed; > config_.rtp.extensions = extensions; >+ config_.has_dscp = rtp_parameters_.encodings[0].network_priority != >+ webrtc::kDefaultBitratePriority; > config_.audio_network_adaptor_config = audio_network_adaptor_config; > config_.encoder_factory = encoder_factory; > config_.codec_pair_id = codec_pair_id; > config_.track_id = track_id; > config_.frame_encryptor = frame_encryptor; >+ config_.crypto_options = crypto_options; >+ config_.rtcp_report_interval_ms = rtcp_report_interval_ms; > rtp_parameters_.encodings[0].ssrc = ssrc; > rtp_parameters_.rtcp.cname = c_name; > rtp_parameters_.header_extensions = extensions; >@@ -768,6 +776,11 @@ class WebRtcVoiceMediaChannel::WebRtcAudioSendStream > 
ReconfigureAudioSendStream(); > } > >+ void SetExtmapAllowMixed(bool extmap_allow_mixed) { >+ config_.rtp.extmap_allow_mixed = extmap_allow_mixed; >+ ReconfigureAudioSendStream(); >+ } >+ > void SetMid(const std::string& mid) { > RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); > if (config_.rtp.mid == mid) { >@@ -928,12 +941,16 @@ class WebRtcVoiceMediaChannel::WebRtcAudioSendStream > const absl::optional<int> old_rtp_max_bitrate = > rtp_parameters_.encodings[0].max_bitrate_bps; > double old_priority = rtp_parameters_.encodings[0].bitrate_priority; >+ double old_dscp = rtp_parameters_.encodings[0].network_priority; > rtp_parameters_ = parameters; > config_.bitrate_priority = rtp_parameters_.encodings[0].bitrate_priority; >+ config_.has_dscp = (rtp_parameters_.encodings[0].network_priority != >+ webrtc::kDefaultBitratePriority); > > bool reconfigure_send_stream = > (rtp_parameters_.encodings[0].max_bitrate_bps != old_rtp_max_bitrate) || >- (rtp_parameters_.encodings[0].bitrate_priority != old_priority); >+ (rtp_parameters_.encodings[0].bitrate_priority != old_priority) || >+ (rtp_parameters_.encodings[0].network_priority != old_dscp); > if (rtp_parameters_.encodings[0].max_bitrate_bps != old_rtp_max_bitrate) { > // Update the bitrate range. 
> if (send_rate) { >@@ -969,8 +986,8 @@ class WebRtcVoiceMediaChannel::WebRtcAudioSendStream > RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); > const bool is_opus = > config_.send_codec_spec && >- !STR_CASE_CMP(config_.send_codec_spec->format.name.c_str(), >- kOpusCodecName); >+ absl::EqualsIgnoreCase(config_.send_codec_spec->format.name, >+ kOpusCodecName); > if (is_opus && webrtc::field_trial::IsEnabled("WebRTC-Audio-SendSideBwe")) { > config_.min_bitrate_bps = kOpusMinBitrateBps; > >@@ -1014,8 +1031,6 @@ class WebRtcVoiceMediaChannel::WebRtcAudioSendStream > void UpdateSendCodecSpec( > const webrtc::AudioSendStream::Config::SendCodecSpec& send_codec_spec) { > RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); >- config_.rtp.nack.rtp_history_ms = >- send_codec_spec.nack_enabled ? kNackRtpHistoryMs : 0; > config_.send_codec_spec = send_codec_spec; > auto info = > config_.encoder_factory->QueryAudioEncoder(send_codec_spec.format); >@@ -1077,12 +1092,15 @@ class WebRtcVoiceMediaChannel::WebRtcAudioReceiveStream { > const std::vector<webrtc::RtpExtension>& extensions, > webrtc::Call* call, > webrtc::Transport* rtcp_send_transport, >+ webrtc::MediaTransportInterface* media_transport, > const rtc::scoped_refptr<webrtc::AudioDecoderFactory>& decoder_factory, > const std::map<int, webrtc::SdpAudioFormat>& decoder_map, > absl::optional<webrtc::AudioCodecPairId> codec_pair_id, > size_t jitter_buffer_max_packets, > bool jitter_buffer_fast_accelerate, >- rtc::scoped_refptr<webrtc::FrameDecryptorInterface> frame_decryptor) >+ int jitter_buffer_min_delay_ms, >+ rtc::scoped_refptr<webrtc::FrameDecryptorInterface> frame_decryptor, >+ const webrtc::CryptoOptions& crypto_options) > : call_(call), config_() { > RTC_DCHECK(call); > config_.rtp.remote_ssrc = remote_ssrc; >@@ -1091,8 +1109,10 @@ class WebRtcVoiceMediaChannel::WebRtcAudioReceiveStream { > config_.rtp.nack.rtp_history_ms = use_nack ? 
kNackRtpHistoryMs : 0; > config_.rtp.extensions = extensions; > config_.rtcp_send_transport = rtcp_send_transport; >+ config_.media_transport = media_transport; > config_.jitter_buffer_max_packets = jitter_buffer_max_packets; > config_.jitter_buffer_fast_accelerate = jitter_buffer_fast_accelerate; >+ config_.jitter_buffer_min_delay_ms = jitter_buffer_min_delay_ms; > if (!stream_ids.empty()) { > config_.sync_group = stream_ids[0]; > } >@@ -1100,6 +1120,7 @@ class WebRtcVoiceMediaChannel::WebRtcAudioReceiveStream { > config_.decoder_map = decoder_map; > config_.codec_pair_id = codec_pair_id; > config_.frame_decryptor = frame_decryptor; >+ config_.crypto_options = crypto_options; > RecreateAudioReceiveStream(); > } > >@@ -1237,11 +1258,17 @@ class WebRtcVoiceMediaChannel::WebRtcAudioReceiveStream { > RTC_DISALLOW_IMPLICIT_CONSTRUCTORS(WebRtcAudioReceiveStream); > }; > >-WebRtcVoiceMediaChannel::WebRtcVoiceMediaChannel(WebRtcVoiceEngine* engine, >- const MediaConfig& config, >- const AudioOptions& options, >- webrtc::Call* call) >- : VoiceMediaChannel(config), engine_(engine), call_(call) { >+WebRtcVoiceMediaChannel::WebRtcVoiceMediaChannel( >+ WebRtcVoiceEngine* engine, >+ const MediaConfig& config, >+ const AudioOptions& options, >+ const webrtc::CryptoOptions& crypto_options, >+ webrtc::Call* call) >+ : VoiceMediaChannel(config), >+ engine_(engine), >+ call_(call), >+ audio_config_(config.audio), >+ crypto_options_(crypto_options) { > RTC_LOG(LS_VERBOSE) << "WebRtcVoiceMediaChannel::WebRtcVoiceMediaChannel"; > RTC_DCHECK(call); > engine->RegisterChannel(this); >@@ -1264,7 +1291,7 @@ WebRtcVoiceMediaChannel::~WebRtcVoiceMediaChannel() { > } > > rtc::DiffServCodePoint WebRtcVoiceMediaChannel::PreferredDscp() const { >- return kAudioDscpValue; >+ return preferred_dscp_; > } > > bool WebRtcVoiceMediaChannel::SetSendParameters( >@@ -1283,6 +1310,14 @@ bool WebRtcVoiceMediaChannel::SetSendParameters( > if (!ValidateRtpExtensions(params.extensions)) { > return false; > } 
>+ >+ if (ExtmapAllowMixed() != params.extmap_allow_mixed) { >+ SetExtmapAllowMixed(params.extmap_allow_mixed); >+ for (auto& it : send_streams_) { >+ it.second->SetExtmapAllowMixed(params.extmap_allow_mixed); >+ } >+ } >+ > std::vector<webrtc::RtpExtension> filtered_extensions = FilterRtpExtensions( > params.extensions, webrtc::RtpExtension::IsSupportedForAudio, true); > if (send_rtp_extensions_ != filtered_extensions) { >@@ -1365,11 +1400,34 @@ webrtc::RTCError WebRtcVoiceMediaChannel::SetRtpSendParameters( > // different order (which should change the send codec). > webrtc::RtpParameters current_parameters = GetRtpSendParameters(ssrc); > if (current_parameters.codecs != parameters.codecs) { >- RTC_LOG(LS_ERROR) << "Using SetParameters to change the set of codecs " >- << "is not currently supported."; >+ RTC_DLOG(LS_ERROR) << "Using SetParameters to change the set of codecs " >+ << "is not currently supported."; > return webrtc::RTCError(webrtc::RTCErrorType::UNSUPPORTED_PARAMETER); > } > >+ if (!parameters.encodings.empty()) { >+ auto& priority = parameters.encodings[0].network_priority; >+ rtc::DiffServCodePoint new_dscp = rtc::DSCP_DEFAULT; >+ if (priority == 0.5 * webrtc::kDefaultBitratePriority) { >+ new_dscp = rtc::DSCP_CS1; >+ } else if (priority == 1.0 * webrtc::kDefaultBitratePriority) { >+ new_dscp = rtc::DSCP_DEFAULT; >+ } else if (priority == 2.0 * webrtc::kDefaultBitratePriority) { >+ new_dscp = rtc::DSCP_EF; >+ } else if (priority == 4.0 * webrtc::kDefaultBitratePriority) { >+ new_dscp = rtc::DSCP_EF; >+ } else { >+ RTC_LOG(LS_WARNING) << "Received invalid send network priority: " >+ << priority; >+ return webrtc::RTCError(webrtc::RTCErrorType::INVALID_RANGE); >+ } >+ >+ if (new_dscp != preferred_dscp_) { >+ preferred_dscp_ = new_dscp; >+ MediaChannel::UpdateDscp(); >+ } >+ } >+ > // TODO(minyue): The following legacy actions go into > // |WebRtcAudioSendStream::SetRtpParameters()| which is called at the end, > // though there are two difference: 
>@@ -1440,8 +1498,8 @@ bool WebRtcVoiceMediaChannel::SetRtpReceiveParameters( > > webrtc::RtpParameters current_parameters = GetRtpReceiveParameters(ssrc); > if (current_parameters != parameters) { >- RTC_LOG(LS_ERROR) << "Changing the RTP receive parameters is currently " >- << "unsupported."; >+ RTC_DLOG(LS_ERROR) << "Changing the RTP receive parameters is currently " >+ << "unsupported."; > return false; > } > return true; >@@ -1623,17 +1681,17 @@ bool WebRtcVoiceMediaChannel::SetSendCodecs( > // TODO(solenberg): Break out into a separate function? > for (const AudioCodec& cn_codec : codecs) { > if (IsCodec(cn_codec, kCnCodecName) && >- cn_codec.clockrate == send_codec_spec->format.clockrate_hz) { >- switch (cn_codec.clockrate) { >- case 8000: >- case 16000: >- case 32000: >- send_codec_spec->cng_payload_type = cn_codec.id; >- break; >- default: >- RTC_LOG(LS_WARNING) >- << "CN frequency " << cn_codec.clockrate << " not supported."; >- break; >+ cn_codec.clockrate == send_codec_spec->format.clockrate_hz && >+ cn_codec.channels == voice_codec_info->num_channels) { >+ if (cn_codec.channels != 1) { >+ RTC_LOG(LS_WARNING) >+ << "CN #channels " << cn_codec.channels << " not supported."; >+ } else if (cn_codec.clockrate != 8000 && cn_codec.clockrate != 16000 && >+ cn_codec.clockrate != 32000) { >+ RTC_LOG(LS_WARNING) >+ << "CN frequency " << cn_codec.clockrate << " not supported."; >+ } else { >+ send_codec_spec->cng_payload_type = cn_codec.id; > } > break; > } >@@ -1761,9 +1819,11 @@ bool WebRtcVoiceMediaChannel::AddSendStream(const StreamParams& sp) { > absl::optional<std::string> audio_network_adaptor_config = > GetAudioNetworkAdaptorConfig(options_); > WebRtcAudioSendStream* stream = new WebRtcAudioSendStream( >- ssrc, mid_, sp.cname, sp.id, send_codec_spec_, send_rtp_extensions_, >- max_send_bitrate_bps_, audio_network_adaptor_config, call_, this, >- engine()->encoder_factory_, codec_pair_id_, nullptr); >+ ssrc, mid_, sp.cname, sp.id, send_codec_spec_, 
ExtmapAllowMixed(), >+ send_rtp_extensions_, max_send_bitrate_bps_, >+ audio_config_.rtcp_report_interval_ms, audio_network_adaptor_config, >+ call_, this, media_transport(), engine()->encoder_factory_, >+ codec_pair_id_, nullptr, crypto_options_); > send_streams_.insert(std::make_pair(ssrc, stream)); > > // At this point the stream's local SSRC has been updated. If it is the first >@@ -1826,7 +1886,7 @@ bool WebRtcVoiceMediaChannel::AddRecvStream(const StreamParams& sp) { > > const uint32_t ssrc = sp.first_ssrc(); > if (ssrc == 0) { >- RTC_LOG(LS_WARNING) << "AddRecvStream with ssrc==0 is not supported."; >+ RTC_DLOG(LS_WARNING) << "AddRecvStream with ssrc==0 is not supported."; > return false; > } > >@@ -1844,13 +1904,15 @@ bool WebRtcVoiceMediaChannel::AddRecvStream(const StreamParams& sp) { > > // Create a new channel for receiving audio data. > recv_streams_.insert(std::make_pair( >- ssrc, new WebRtcAudioReceiveStream( >- ssrc, receiver_reports_ssrc_, recv_transport_cc_enabled_, >- recv_nack_enabled_, sp.stream_ids(), recv_rtp_extensions_, >- call_, this, engine()->decoder_factory_, decoder_map_, >- codec_pair_id_, engine()->audio_jitter_buffer_max_packets_, >- engine()->audio_jitter_buffer_fast_accelerate_, >- unsignaled_frame_decryptor_))); >+ ssrc, >+ new WebRtcAudioReceiveStream( >+ ssrc, receiver_reports_ssrc_, recv_transport_cc_enabled_, >+ recv_nack_enabled_, sp.stream_ids(), recv_rtp_extensions_, call_, >+ this, media_transport(), engine()->decoder_factory_, decoder_map_, >+ codec_pair_id_, engine()->audio_jitter_buffer_max_packets_, >+ engine()->audio_jitter_buffer_fast_accelerate_, >+ engine()->audio_jitter_buffer_min_delay_ms_, >+ unsignaled_frame_decryptor_, crypto_options_))); > recv_streams_[ssrc]->SetPlayout(playout_); > > return true; >@@ -1979,14 +2041,14 @@ bool WebRtcVoiceMediaChannel::InsertDtmf(uint32_t ssrc, > event, duration); > } > >-void WebRtcVoiceMediaChannel::OnPacketReceived( >- rtc::CopyOnWriteBuffer* packet, >- const 
rtc::PacketTime& packet_time) { >+void WebRtcVoiceMediaChannel::OnPacketReceived(rtc::CopyOnWriteBuffer* packet, >+ int64_t packet_time_us) { > RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); > > webrtc::PacketReceiver::DeliveryStatus delivery_result = > call_->Receiver()->DeliverPacket(webrtc::MediaType::AUDIO, *packet, >- packet_time.timestamp); >+ packet_time_us); >+ > if (delivery_result != webrtc::PacketReceiver::DELIVERY_UNKNOWN_SSRC) { > return; > } >@@ -2017,8 +2079,8 @@ void WebRtcVoiceMediaChannel::OnPacketReceived( > // Remove oldest unsignaled stream, if we have too many. > if (unsignaled_recv_ssrcs_.size() > kMaxUnsignaledRecvStreams) { > uint32_t remove_ssrc = unsignaled_recv_ssrcs_.front(); >- RTC_LOG(LS_INFO) << "Removing unsignaled receive stream with SSRC=" >- << remove_ssrc; >+ RTC_DLOG(LS_INFO) << "Removing unsignaled receive stream with SSRC=" >+ << remove_ssrc; > RemoveRecvStream(remove_ssrc); > } > RTC_DCHECK_GE(kMaxUnsignaledRecvStreams, unsignaled_recv_ssrcs_.size()); >@@ -2038,19 +2100,18 @@ void WebRtcVoiceMediaChannel::OnPacketReceived( > SetRawAudioSink(ssrc, std::move(proxy_sink)); > } > >- delivery_result = call_->Receiver()->DeliverPacket( >- webrtc::MediaType::AUDIO, *packet, packet_time.timestamp); >+ delivery_result = call_->Receiver()->DeliverPacket(webrtc::MediaType::AUDIO, >+ *packet, packet_time_us); > RTC_DCHECK_NE(webrtc::PacketReceiver::DELIVERY_UNKNOWN_SSRC, delivery_result); > } > >-void WebRtcVoiceMediaChannel::OnRtcpReceived( >- rtc::CopyOnWriteBuffer* packet, >- const rtc::PacketTime& packet_time) { >+void WebRtcVoiceMediaChannel::OnRtcpReceived(rtc::CopyOnWriteBuffer* packet, >+ int64_t packet_time_us) { > RTC_DCHECK(worker_thread_checker_.CalledOnValidThread()); > > // Forward packet to Call as well. 
> call_->Receiver()->DeliverPacket(webrtc::MediaType::AUDIO, *packet, >- packet_time.timestamp); >+ packet_time_us); > } > > void WebRtcVoiceMediaChannel::OnNetworkRouteChanged( >@@ -2182,6 +2243,7 @@ bool WebRtcVoiceMediaChannel::GetStats(VoiceMediaInfo* info) { > rinfo.secondary_discarded_rate = stats.secondary_discarded_rate; > rinfo.accelerate_rate = stats.accelerate_rate; > rinfo.preemptive_expand_rate = stats.preemptive_expand_rate; >+ rinfo.delayed_packet_outage_samples = stats.delayed_packet_outage_samples; > rinfo.decoding_calls_to_silence_generator = > stats.decoding_calls_to_silence_generator; > rinfo.decoding_calls_to_neteq = stats.decoding_calls_to_neteq; >@@ -2191,6 +2253,8 @@ bool WebRtcVoiceMediaChannel::GetStats(VoiceMediaInfo* info) { > rinfo.decoding_plc_cng = stats.decoding_plc_cng; > rinfo.decoding_muted_output = stats.decoding_muted_output; > rinfo.capture_start_ntp_time_ms = stats.capture_start_ntp_time_ms; >+ rinfo.jitter_buffer_flushes = stats.jitter_buffer_flushes; >+ > info->receivers.push_back(rinfo); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvoiceengine.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvoiceengine.h >index bedede12480dc77fe74c30bb54787c19487d8ab4..213f1b393f9e2ca28e059f05cf11a5e3f6c28e44 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvoiceengine.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvoiceengine.h >@@ -40,7 +40,7 @@ class WebRtcVoiceMediaChannel; > > // WebRtcVoiceEngine is a class to be used with CompositeMediaEngine. > // It uses the WebRtc VoiceEngine library for audio handling. 
>-class WebRtcVoiceEngine final { >+class WebRtcVoiceEngine final : public VoiceEngineInterface { > friend class WebRtcVoiceMediaChannel; > > public: >@@ -50,19 +50,21 @@ class WebRtcVoiceEngine final { > const rtc::scoped_refptr<webrtc::AudioDecoderFactory>& decoder_factory, > rtc::scoped_refptr<webrtc::AudioMixer> audio_mixer, > rtc::scoped_refptr<webrtc::AudioProcessing> audio_processing); >- ~WebRtcVoiceEngine(); >+ ~WebRtcVoiceEngine() override; > > // Does initialization that needs to occur on the worker thread. >- void Init(); >+ void Init() override; > >- rtc::scoped_refptr<webrtc::AudioState> GetAudioState() const; >- VoiceMediaChannel* CreateChannel(webrtc::Call* call, >- const MediaConfig& config, >- const AudioOptions& options); >+ rtc::scoped_refptr<webrtc::AudioState> GetAudioState() const override; >+ VoiceMediaChannel* CreateMediaChannel( >+ webrtc::Call* call, >+ const MediaConfig& config, >+ const AudioOptions& options, >+ const webrtc::CryptoOptions& crypto_options) override; > >- const std::vector<AudioCodec>& send_codecs() const; >- const std::vector<AudioCodec>& recv_codecs() const; >- RtpCapabilities GetCapabilities() const; >+ const std::vector<AudioCodec>& send_codecs() const override; >+ const std::vector<AudioCodec>& recv_codecs() const override; >+ RtpCapabilities GetCapabilities() const override; > > // For tracking WebRtc channels. Needed because we have to pause them > // all when switching devices. >@@ -74,10 +76,10 @@ class WebRtcVoiceEngine final { > // specified. When the maximum file size is reached, logging is stopped and > // the file is closed. If max_size_bytes is set to <= 0, no limit will be > // used. >- bool StartAecDump(rtc::PlatformFile file, int64_t max_size_bytes); >+ bool StartAecDump(rtc::PlatformFile file, int64_t max_size_bytes) override; > > // Stops AEC dump. 
>- void StopAecDump(); >+ void StopAecDump() override; > > const webrtc::AudioProcessing::Config GetApmConfigForTest() const { > return apm()->GetConfig(); >@@ -130,6 +132,7 @@ class WebRtcVoiceEngine final { > // Jitter buffer settings for new streams. > size_t audio_jitter_buffer_max_packets_ = 50; > bool audio_jitter_buffer_fast_accelerate_ = false; >+ int audio_jitter_buffer_min_delay_ms_ = 0; > > RTC_DISALLOW_IMPLICIT_CONSTRUCTORS(WebRtcVoiceEngine); > }; >@@ -142,6 +145,7 @@ class WebRtcVoiceMediaChannel final : public VoiceMediaChannel, > WebRtcVoiceMediaChannel(WebRtcVoiceEngine* engine, > const MediaConfig& config, > const AudioOptions& options, >+ const webrtc::CryptoOptions& crypto_options, > webrtc::Call* call); > ~WebRtcVoiceMediaChannel() override; > >@@ -192,9 +196,9 @@ class WebRtcVoiceMediaChannel final : public VoiceMediaChannel, > bool InsertDtmf(uint32_t ssrc, int event, int duration) override; > > void OnPacketReceived(rtc::CopyOnWriteBuffer* packet, >- const rtc::PacketTime& packet_time) override; >+ int64_t packet_time_us) override; > void OnRtcpReceived(rtc::CopyOnWriteBuffer* packet, >- const rtc::PacketTime& packet_time) override; >+ int64_t packet_time_us) override; > void OnNetworkRouteChanged(const std::string& transport_name, > const rtc::NetworkRoute& network_route) override; > void OnReadyToSend(bool ready) override; >@@ -218,6 +222,10 @@ class WebRtcVoiceMediaChannel final : public VoiceMediaChannel, > if (DscpEnabled()) { > rtc_options.dscp = PreferredDscp(); > } >+ rtc_options.info_signaled_after_sent.included_in_feedback = >+ options.included_in_feedback; >+ rtc_options.info_signaled_after_sent.included_in_allocation = >+ options.included_in_allocation; > return VoiceMediaChannel::SendPacket(&packet, rtc_options); > } > >@@ -227,6 +235,7 @@ class WebRtcVoiceMediaChannel final : public VoiceMediaChannel, > if (DscpEnabled()) { > rtc_options.dscp = PreferredDscp(); > } >+ > return VoiceMediaChannel::SendRtcp(&packet, rtc_options); 
> } > >@@ -258,6 +267,7 @@ class WebRtcVoiceMediaChannel final : public VoiceMediaChannel, > std::vector<AudioCodec> recv_codecs_; > > int max_send_bitrate_bps_ = 0; >+ rtc::DiffServCodePoint preferred_dscp_ = rtc::DSCP_DEFAULT; > AudioOptions options_; > absl::optional<int> dtmf_payload_type_; > int dtmf_payload_freq_ = -1; >@@ -268,6 +278,8 @@ class WebRtcVoiceMediaChannel final : public VoiceMediaChannel, > bool send_ = false; > webrtc::Call* const call_ = nullptr; > >+ const MediaConfig::Audio audio_config_; >+ > // Queue of unsignaled SSRCs; oldest at the beginning. > std::vector<uint32_t> unsignaled_recv_ssrcs_; > >@@ -302,6 +314,9 @@ class WebRtcVoiceMediaChannel final : public VoiceMediaChannel, > const webrtc::AudioCodecPairId codec_pair_id_ = > webrtc::AudioCodecPairId::Create(); > >+ // Per peer connection crypto options that last for the lifetime of the peer >+ // connection. >+ const webrtc::CryptoOptions crypto_options_; > // Unsignaled streams have an option to have a frame decryptor set on them. 
> rtc::scoped_refptr<webrtc::FrameDecryptorInterface> > unsignaled_frame_decryptor_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvoiceengine_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvoiceengine_unittest.cc >index bb816eed11827f2c0cdc62991643ebdb09835f83..9fd5dfb0ad85c4f7839da8a334ca3e1329302077 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvoiceengine_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/engine/webrtcvoiceengine_unittest.cc >@@ -11,6 +11,7 @@ > #include <memory> > #include <utility> > >+#include "absl/strings/match.h" > #include "api/audio_codecs/builtin_audio_decoder_factory.h" > #include "api/audio_codecs/builtin_audio_encoder_factory.h" > #include "api/rtpparameters.h" >@@ -210,8 +211,9 @@ class WebRtcVoiceEngineTestFake : public testing::Test { > > bool SetupChannel() { > EXPECT_CALL(*apm_, SetExtraOptions(testing::_)); >- channel_ = engine_->CreateChannel(&call_, cricket::MediaConfig(), >- cricket::AudioOptions()); >+ channel_ = engine_->CreateMediaChannel(&call_, cricket::MediaConfig(), >+ cricket::AudioOptions(), >+ webrtc::CryptoOptions()); > return (channel_ != nullptr); > } > >@@ -253,7 +255,7 @@ class WebRtcVoiceEngineTestFake : public testing::Test { > > void DeliverPacket(const void* data, int len) { > rtc::CopyOnWriteBuffer packet(reinterpret_cast<const uint8_t*>(data), len); >- channel_->OnPacketReceived(&packet, rtc::PacketTime()); >+ channel_->OnPacketReceived(&packet, /* packet_time_us */ -1); > } > > void TearDown() override { delete channel_; } >@@ -349,6 +351,30 @@ class WebRtcVoiceEngineTestFake : public testing::Test { > EXPECT_EQ(123, telephone_event.duration_ms); > } > >+ void TestExtmapAllowMixedCaller(bool extmap_allow_mixed) { >+ // For a caller, the answer will be applied in set remote description >+ // where SetSendParameters() is called. 
>+ EXPECT_TRUE(SetupChannel()); >+ EXPECT_TRUE( >+ channel_->AddSendStream(cricket::StreamParams::CreateLegacy(kSsrcX))); >+ send_parameters_.extmap_allow_mixed = extmap_allow_mixed; >+ SetSendParameters(send_parameters_); >+ const webrtc::AudioSendStream::Config& config = GetSendStreamConfig(kSsrcX); >+ EXPECT_EQ(extmap_allow_mixed, config.rtp.extmap_allow_mixed); >+ } >+ >+ void TestExtmapAllowMixedCallee(bool extmap_allow_mixed) { >+ // For a callee, the answer will be applied in set local description >+ // where SetExtmapAllowMixed() and AddSendStream() are called. >+ EXPECT_TRUE(SetupChannel()); >+ channel_->SetExtmapAllowMixed(extmap_allow_mixed); >+ EXPECT_TRUE( >+ channel_->AddSendStream(cricket::StreamParams::CreateLegacy(kSsrcX))); >+ >+ const webrtc::AudioSendStream::Config& config = GetSendStreamConfig(kSsrcX); >+ EXPECT_EQ(extmap_allow_mixed, config.rtp.extmap_allow_mixed); >+ } >+ > // Test that send bandwidth is set correctly. > // |codec| is the codec under test. > // |max_bitrate| is a parameter to set to SetMaxSendBandwidth(). >@@ -735,7 +761,7 @@ class WebRtcVoiceEngineTestFake : public testing::Test { > }; > > // Tests that we can create and destroy a channel. >-TEST_F(WebRtcVoiceEngineTestFake, CreateChannel) { >+TEST_F(WebRtcVoiceEngineTestFake, CreateMediaChannel) { > EXPECT_TRUE(SetupChannel()); > } > >@@ -1137,6 +1163,28 @@ TEST_F(WebRtcVoiceEngineTestFake, RtpParametersArePerStream) { > EXPECT_EQ(32000, GetCodecBitrate(kSsrcs4[2])); > } > >+// RTCRtpEncodingParameters.network_priority must be one of a few values >+// derived from the default priority, corresponding to very-low, low, medium, >+// or high. 
>+TEST_F(WebRtcVoiceEngineTestFake, SetRtpSendParametersInvalidNetworkPriority) { >+ EXPECT_TRUE(SetupSendStream()); >+ webrtc::RtpParameters parameters = channel_->GetRtpSendParameters(kSsrcX); >+ EXPECT_EQ(1UL, parameters.encodings.size()); >+ EXPECT_EQ(webrtc::kDefaultBitratePriority, >+ parameters.encodings[0].network_priority); >+ >+ double good_values[] = {0.5, 1.0, 2.0, 4.0}; >+ double bad_values[] = {-1.0, 0.0, 0.49, 0.51, 1.1, 3.99, 4.1, 5.0}; >+ for (auto it : good_values) { >+ parameters.encodings[0].network_priority = it; >+ EXPECT_TRUE(channel_->SetRtpSendParameters(kSsrcX, parameters).ok()); >+ } >+ for (auto it : bad_values) { >+ parameters.encodings[0].network_priority = it; >+ EXPECT_FALSE(channel_->SetRtpSendParameters(kSsrcX, parameters).ok()); >+ } >+} >+ > // Test that GetRtpSendParameters returns the currently configured codecs. > TEST_F(WebRtcVoiceEngineTestFake, GetRtpSendParametersCodecs) { > EXPECT_TRUE(SetupSendStream()); >@@ -1587,18 +1635,6 @@ TEST_F(WebRtcVoiceEngineTestFake, SetMaxSendBandwidthForAudioDoesntAffectBwe) { > SetSendParameters(send_parameters_); > } > >-// Test that we can enable NACK with opus as caller. >-TEST_F(WebRtcVoiceEngineTestFake, SetSendCodecEnableNackAsCaller) { >- EXPECT_TRUE(SetupSendStream()); >- cricket::AudioSendParameters parameters; >- parameters.codecs.push_back(kOpusCodec); >- parameters.codecs[0].AddFeedbackParam(cricket::FeedbackParam( >- cricket::kRtcpFbParamNack, cricket::kParamValueEmpty)); >- EXPECT_EQ(0, GetSendStreamConfig(kSsrcX).rtp.nack.rtp_history_ms); >- SetSendParameters(parameters); >- EXPECT_EQ(kRtpHistoryMs, GetSendStreamConfig(kSsrcX).rtp.nack.rtp_history_ms); >-} >- > // Test that we can enable NACK with opus as callee. 
> TEST_F(WebRtcVoiceEngineTestFake, SetSendCodecEnableNackAsCallee) { > EXPECT_TRUE(SetupRecvStream()); >@@ -1613,7 +1649,6 @@ TEST_F(WebRtcVoiceEngineTestFake, SetSendCodecEnableNackAsCallee) { > > EXPECT_TRUE( > channel_->AddSendStream(cricket::StreamParams::CreateLegacy(kSsrcX))); >- EXPECT_EQ(kRtpHistoryMs, GetSendStreamConfig(kSsrcX).rtp.nack.rtp_history_ms); > } > > // Test that we can enable NACK on receive streams. >@@ -1624,29 +1659,11 @@ TEST_F(WebRtcVoiceEngineTestFake, SetSendCodecEnableNackRecvStreams) { > parameters.codecs.push_back(kOpusCodec); > parameters.codecs[0].AddFeedbackParam(cricket::FeedbackParam( > cricket::kRtcpFbParamNack, cricket::kParamValueEmpty)); >- EXPECT_EQ(0, GetSendStreamConfig(kSsrcX).rtp.nack.rtp_history_ms); > EXPECT_EQ(0, GetRecvStreamConfig(kSsrcY).rtp.nack.rtp_history_ms); > SetSendParameters(parameters); >- EXPECT_EQ(kRtpHistoryMs, GetSendStreamConfig(kSsrcX).rtp.nack.rtp_history_ms); > EXPECT_EQ(kRtpHistoryMs, GetRecvStreamConfig(kSsrcY).rtp.nack.rtp_history_ms); > } > >-// Test that we can disable NACK. >-TEST_F(WebRtcVoiceEngineTestFake, SetSendCodecDisableNack) { >- EXPECT_TRUE(SetupSendStream()); >- cricket::AudioSendParameters parameters; >- parameters.codecs.push_back(kOpusCodec); >- parameters.codecs[0].AddFeedbackParam(cricket::FeedbackParam( >- cricket::kRtcpFbParamNack, cricket::kParamValueEmpty)); >- SetSendParameters(parameters); >- EXPECT_EQ(kRtpHistoryMs, GetSendStreamConfig(kSsrcX).rtp.nack.rtp_history_ms); >- >- parameters.codecs.clear(); >- parameters.codecs.push_back(kOpusCodec); >- SetSendParameters(parameters); >- EXPECT_EQ(0, GetSendStreamConfig(kSsrcX).rtp.nack.rtp_history_ms); >-} >- > // Test that we can disable NACK on receive streams. 
> TEST_F(WebRtcVoiceEngineTestFake, SetSendCodecDisableNackRecvStreams) { > EXPECT_TRUE(SetupSendStream()); >@@ -1656,13 +1673,11 @@ TEST_F(WebRtcVoiceEngineTestFake, SetSendCodecDisableNackRecvStreams) { > parameters.codecs[0].AddFeedbackParam(cricket::FeedbackParam( > cricket::kRtcpFbParamNack, cricket::kParamValueEmpty)); > SetSendParameters(parameters); >- EXPECT_EQ(kRtpHistoryMs, GetSendStreamConfig(kSsrcX).rtp.nack.rtp_history_ms); > EXPECT_EQ(kRtpHistoryMs, GetRecvStreamConfig(kSsrcY).rtp.nack.rtp_history_ms); > > parameters.codecs.clear(); > parameters.codecs.push_back(kOpusCodec); > SetSendParameters(parameters); >- EXPECT_EQ(0, GetSendStreamConfig(kSsrcX).rtp.nack.rtp_history_ms); > EXPECT_EQ(0, GetRecvStreamConfig(kSsrcY).rtp.nack.rtp_history_ms); > } > >@@ -1675,7 +1690,6 @@ TEST_F(WebRtcVoiceEngineTestFake, AddRecvStreamEnableNack) { > parameters.codecs[0].AddFeedbackParam(cricket::FeedbackParam( > cricket::kRtcpFbParamNack, cricket::kParamValueEmpty)); > SetSendParameters(parameters); >- EXPECT_EQ(kRtpHistoryMs, GetSendStreamConfig(kSsrcX).rtp.nack.rtp_history_ms); > > EXPECT_TRUE(AddRecvStream(kSsrcY)); > EXPECT_EQ(kRtpHistoryMs, GetRecvStreamConfig(kSsrcY).rtp.nack.rtp_history_ms); >@@ -2799,6 +2813,20 @@ TEST_F(WebRtcVoiceEngineTestFake, InsertDtmfOnSendStreamAsCallee) { > TestInsertDtmf(kSsrcX, false, kTelephoneEventCodec1); > } > >+// Test propagation of extmap allow mixed setting. 
>+TEST_F(WebRtcVoiceEngineTestFake, SetExtmapAllowMixedAsCaller) { >+ TestExtmapAllowMixedCaller(/*extmap_allow_mixed=*/true); >+} >+TEST_F(WebRtcVoiceEngineTestFake, SetExtmapAllowMixedDisabledAsCaller) { >+ TestExtmapAllowMixedCaller(/*extmap_allow_mixed=*/false); >+} >+TEST_F(WebRtcVoiceEngineTestFake, SetExtmapAllowMixedAsCallee) { >+ TestExtmapAllowMixedCallee(/*extmap_allow_mixed=*/true); >+} >+TEST_F(WebRtcVoiceEngineTestFake, SetExtmapAllowMixedDisabledAsCallee) { >+ TestExtmapAllowMixedCallee(/*extmap_allow_mixed=*/false); >+} >+ > TEST_F(WebRtcVoiceEngineTestFake, SetAudioOptions) { > EXPECT_TRUE(SetupSendStream()); > EXPECT_TRUE(AddRecvStream(kSsrcY)); >@@ -2914,11 +2942,15 @@ TEST_F(WebRtcVoiceEngineTestFake, SetOptionOverridesViaChannels) { > EXPECT_CALL(*apm_, SetExtraOptions(testing::_)).Times(10); > > std::unique_ptr<cricket::WebRtcVoiceMediaChannel> channel1( >- static_cast<cricket::WebRtcVoiceMediaChannel*>(engine_->CreateChannel( >- &call_, cricket::MediaConfig(), cricket::AudioOptions()))); >+ static_cast<cricket::WebRtcVoiceMediaChannel*>( >+ engine_->CreateMediaChannel(&call_, cricket::MediaConfig(), >+ cricket::AudioOptions(), >+ webrtc::CryptoOptions()))); > std::unique_ptr<cricket::WebRtcVoiceMediaChannel> channel2( >- static_cast<cricket::WebRtcVoiceMediaChannel*>(engine_->CreateChannel( >- &call_, cricket::MediaConfig(), cricket::AudioOptions()))); >+ static_cast<cricket::WebRtcVoiceMediaChannel*>( >+ engine_->CreateMediaChannel(&call_, cricket::MediaConfig(), >+ cricket::AudioOptions(), >+ webrtc::CryptoOptions()))); > > // Have to add a stream to make SetSend work. 
> cricket::StreamParams stream1; >@@ -3017,6 +3049,7 @@ TEST_F(WebRtcVoiceEngineTestFake, TestSetDscpOptions) { > cricket::FakeNetworkInterface network_interface; > cricket::MediaConfig config; > std::unique_ptr<cricket::WebRtcVoiceMediaChannel> channel; >+ webrtc::RtpParameters parameters; > > webrtc::AudioProcessing::Config apm_config; > EXPECT_CALL(*apm_, GetConfig()) >@@ -3026,32 +3059,54 @@ TEST_F(WebRtcVoiceEngineTestFake, TestSetDscpOptions) { > EXPECT_CALL(*apm_, SetExtraOptions(testing::_)).Times(3); > > channel.reset(static_cast<cricket::WebRtcVoiceMediaChannel*>( >- engine_->CreateChannel(&call_, config, cricket::AudioOptions()))); >- channel->SetInterface(&network_interface); >+ engine_->CreateMediaChannel(&call_, config, cricket::AudioOptions(), >+ webrtc::CryptoOptions()))); >+ channel->SetInterface(&network_interface, /*media_transport=*/nullptr); > // Default value when DSCP is disabled should be DSCP_DEFAULT. > EXPECT_EQ(rtc::DSCP_DEFAULT, network_interface.dscp()); > > config.enable_dscp = true; > channel.reset(static_cast<cricket::WebRtcVoiceMediaChannel*>( >- engine_->CreateChannel(&call_, config, cricket::AudioOptions()))); >- channel->SetInterface(&network_interface); >+ engine_->CreateMediaChannel(&call_, config, cricket::AudioOptions(), >+ webrtc::CryptoOptions()))); >+ channel->SetInterface(&network_interface, /*media_transport=*/nullptr); >+ EXPECT_EQ(rtc::DSCP_DEFAULT, network_interface.dscp()); >+ >+ // Create a send stream to configure >+ EXPECT_TRUE( >+ channel->AddSendStream(cricket::StreamParams::CreateLegacy(kSsrcZ))); >+ parameters = channel->GetRtpSendParameters(kSsrcZ); >+ ASSERT_FALSE(parameters.encodings.empty()); >+ >+ // Various priorities map to various dscp values. 
>+ parameters.encodings[0].network_priority = 4.0; >+ ASSERT_TRUE(channel->SetRtpSendParameters(kSsrcZ, parameters).ok()); > EXPECT_EQ(rtc::DSCP_EF, network_interface.dscp()); >+ parameters.encodings[0].network_priority = 0.5; >+ ASSERT_TRUE(channel->SetRtpSendParameters(kSsrcZ, parameters).ok()); >+ EXPECT_EQ(rtc::DSCP_CS1, network_interface.dscp()); >+ >+ // A bad priority does not change the dscp value. >+ parameters.encodings[0].network_priority = 0.0; >+ ASSERT_FALSE(channel->SetRtpSendParameters(kSsrcZ, parameters).ok()); >+ EXPECT_EQ(rtc::DSCP_CS1, network_interface.dscp()); > > // Packets should also self-identify their dscp in PacketOptions. > const uint8_t kData[10] = {0}; > EXPECT_TRUE(channel->SendRtcp(kData, sizeof(kData))); >- EXPECT_EQ(rtc::DSCP_EF, network_interface.options().dscp); >+ EXPECT_EQ(rtc::DSCP_CS1, network_interface.options().dscp); > > // Verify that setting the option to false resets the > // DiffServCodePoint. > config.enable_dscp = false; > channel.reset(static_cast<cricket::WebRtcVoiceMediaChannel*>( >- engine_->CreateChannel(&call_, config, cricket::AudioOptions()))); >- channel->SetInterface(&network_interface); >+ engine_->CreateMediaChannel(&call_, config, cricket::AudioOptions(), >+ webrtc::CryptoOptions()))); >+ channel->SetInterface(&network_interface, /*media_transport=*/nullptr); > // Default value when DSCP is disabled should be DSCP_DEFAULT. 
> EXPECT_EQ(rtc::DSCP_DEFAULT, network_interface.dscp()); > >- channel->SetInterface(nullptr); >+ channel->SetInterface(nullptr, nullptr); > } > > TEST_F(WebRtcVoiceEngineTestFake, SetOutputVolume) { >@@ -3194,9 +3249,9 @@ TEST_F(WebRtcVoiceEngineTestFake, DeliverAudioPacket_Call) { > const cricket::FakeAudioReceiveStream* s = > call_.GetAudioReceiveStream(kAudioSsrc); > EXPECT_EQ(0, s->received_packets()); >- channel_->OnPacketReceived(&kPcmuPacket, rtc::PacketTime()); >+ channel_->OnPacketReceived(&kPcmuPacket, /* packet_time_us */ -1); > EXPECT_EQ(1, s->received_packets()); >- channel_->OnRtcpReceived(&kRtcpPacket, rtc::PacketTime()); >+ channel_->OnRtcpReceived(&kRtcpPacket, /* packet_time_us */ -1); > EXPECT_EQ(2, s->received_packets()); > } > >@@ -3374,8 +3429,9 @@ TEST(WebRtcVoiceEngineTest, StartupShutdown) { > webrtc::RtcEventLogNullImpl event_log; > std::unique_ptr<webrtc::Call> call( > webrtc::Call::Create(webrtc::Call::Config(&event_log))); >- cricket::VoiceMediaChannel* channel = engine.CreateChannel( >- call.get(), cricket::MediaConfig(), cricket::AudioOptions()); >+ cricket::VoiceMediaChannel* channel = engine.CreateMediaChannel( >+ call.get(), cricket::MediaConfig(), cricket::AudioOptions(), >+ webrtc::CryptoOptions()); > EXPECT_TRUE(channel != nullptr); > delete channel; > } >@@ -3397,8 +3453,9 @@ TEST(WebRtcVoiceEngineTest, StartupShutdownWithExternalADM) { > webrtc::RtcEventLogNullImpl event_log; > std::unique_ptr<webrtc::Call> call( > webrtc::Call::Create(webrtc::Call::Config(&event_log))); >- cricket::VoiceMediaChannel* channel = engine.CreateChannel( >- call.get(), cricket::MediaConfig(), cricket::AudioOptions()); >+ cricket::VoiceMediaChannel* channel = engine.CreateMediaChannel( >+ call.get(), cricket::MediaConfig(), cricket::AudioOptions(), >+ webrtc::CryptoOptions()); > EXPECT_TRUE(channel != nullptr); > delete channel; > } >@@ -3417,7 +3474,7 @@ TEST(WebRtcVoiceEngineTest, HasCorrectPayloadTypeMapping) { > engine.Init(); > for (const 
cricket::AudioCodec& codec : engine.send_codecs()) { > auto is_codec = [&codec](const char* name, int clockrate = 0) { >- return STR_CASE_CMP(codec.name.c_str(), name) == 0 && >+ return absl::EqualsIgnoreCase(codec.name, name) && > (clockrate == 0 || codec.clockrate == clockrate); > }; > if (is_codec("CN", 16000)) { >@@ -3467,8 +3524,9 @@ TEST(WebRtcVoiceEngineTest, Has32Channels) { > cricket::VoiceMediaChannel* channels[32]; > size_t num_channels = 0; > while (num_channels < arraysize(channels)) { >- cricket::VoiceMediaChannel* channel = engine.CreateChannel( >- call.get(), cricket::MediaConfig(), cricket::AudioOptions()); >+ cricket::VoiceMediaChannel* channel = engine.CreateMediaChannel( >+ call.get(), cricket::MediaConfig(), cricket::AudioOptions(), >+ webrtc::CryptoOptions()); > if (!channel) > break; > channels[num_channels++] = channel; >@@ -3502,7 +3560,8 @@ TEST(WebRtcVoiceEngineTest, SetRecvCodecs) { > std::unique_ptr<webrtc::Call> call( > webrtc::Call::Create(webrtc::Call::Config(&event_log))); > cricket::WebRtcVoiceMediaChannel channel(&engine, cricket::MediaConfig(), >- cricket::AudioOptions(), call.get()); >+ cricket::AudioOptions(), >+ webrtc::CryptoOptions(), call.get()); > cricket::AudioRecvParameters parameters; > parameters.codecs = engine.recv_codecs(); > EXPECT_TRUE(channel.SetRecvParameters(parameters)); >@@ -3564,7 +3623,7 @@ TEST(WebRtcVoiceEngineTest, CollectRecvCodecs) { > auto find_codec = [&codecs](const webrtc::SdpAudioFormat& format) -> int { > for (size_t i = 0; i != codecs.size(); ++i) { > const cricket::AudioCodec& codec = codecs[i]; >- if (STR_CASE_CMP(codec.name.c_str(), format.name.c_str()) == 0 && >+ if (absl::EqualsIgnoreCase(codec.name, format.name) && > codec.clockrate == format.clockrate_hz && > codec.channels == format.num_channels) { > return rtc::checked_cast<int>(i); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/sctp/OWNERS b/Source/ThirdParty/libwebrtc/Source/webrtc/media/sctp/OWNERS >index 
8262ff750896fb69b1c070d5905861ba46d3f0b4..bc2182b1b60f2c9c43060262adea10a8c90ba415 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/sctp/OWNERS >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/sctp/OWNERS >@@ -1,2 +1 @@ > jeroendb@webrtc.org >-shampson@webrtc.org >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/sctp/sctptransport.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/media/sctp/sctptransport.cc >index a8b945a6f173d9ba06f29a4bb28c4a5fabc09822..4ffaaf7f5a0448d10fc31ef405cc3ef6b848bd94 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/sctp/sctptransport.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/sctp/sctptransport.cc >@@ -858,7 +858,7 @@ void SctpTransport::OnWritableState(rtc::PacketTransportInternal* transport) { > void SctpTransport::OnPacketRead(rtc::PacketTransportInternal* transport, > const char* data, > size_t len, >- const rtc::PacketTime& packet_time, >+ const int64_t& /* packet_time_us */, > int flags) { > RTC_DCHECK_RUN_ON(network_thread_); > RTC_DCHECK_EQ(transport_, transport); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/sctp/sctptransport.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/sctp/sctptransport.h >index a401d8b795a8aa914ae364ab13552628c5307721..5c2a750e4f3c3220c4e97b4b80716983bdd4305e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/sctp/sctptransport.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/sctp/sctptransport.h >@@ -14,7 +14,7 @@ > #include <errno.h> > > #include <map> >-#include <memory> // for unique_ptr. >+#include <memory> > #include <set> > #include <string> > #include <vector> >@@ -113,7 +113,7 @@ class SctpTransport : public SctpTransportInternal, > virtual void OnPacketRead(rtc::PacketTransportInternal* transport, > const char* data, > size_t len, >- const rtc::PacketTime& packet_time, >+ const int64_t& packet_time_us, > int flags); > > // Methods related to usrsctp callbacks. 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/media/sctp/sctptransportinternal.h b/Source/ThirdParty/libwebrtc/Source/webrtc/media/sctp/sctptransportinternal.h >index 0380a870b4c0b08d2ce1b46846b588684cba77d0..4c3b542a521d8bb63483720b8064da46495f49a5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/media/sctp/sctptransportinternal.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/media/sctp/sctptransportinternal.h >@@ -14,7 +14,7 @@ > // TODO(deadbeef): Move SCTP code out of media/, and make it not depend on > // anything in media/. > >-#include <memory> // for unique_ptr >+#include <memory> > #include <string> > #include <vector> > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/BUILD.gn >index 8127244462b3357266296af2c3586f15c88bf6d0..ef029914b3093c2be438fa230ebabac1c6f04934 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/BUILD.gn >@@ -54,6 +54,7 @@ rtc_source_set("module_api") { > "../api:libjingle_peerconnection_api", > "../modules/rtp_rtcp:rtp_video_header", > "../rtc_base:safe_conversions", >+ "../rtc_base/system:rtc_export", > ] > } > >@@ -236,6 +237,7 @@ if (rtc_include_tests) { > deps = [ > ":module_api", > "../test:test_main", >+ "../test:test_support", > "audio_coding:audio_coding_unittests", > "audio_device:audio_device_unittests", > "audio_mixer:audio_mixer_unittests", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/BUILD.gn >index 4982d19a7b3036f75089012d1e0e3d7417cbfbc4..df4ba23fff5d37740138c77b802239b4c628315c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/BUILD.gn >@@ -14,32 +14,6 @@ if (!build_with_mozilla) { > > visibility = [ ":*" ] > >-audio_codec_deps = [ >- 
":cng", >- ":g711", >- ":pcm16b", >-] >-if (rtc_include_ilbc) { >- audio_codec_deps += [ ":ilbc" ] >-} >-if (rtc_include_opus) { >- audio_codec_deps += [ ":webrtc_opus" ] >-} >-if (current_cpu == "arm") { >- audio_codec_deps += [ ":isac_fix" ] >-} else { >- audio_codec_deps += [ ":isac" ] >-} >-audio_codec_deps += [ ":g722" ] >-if (!build_with_mozilla && !build_with_chromium) { >- audio_codec_deps += [ ":red" ] >-} >-audio_coding_deps = audio_codec_deps + [ >- "../..:webrtc_common", >- "../../common_audio", >- "../../system_wrappers", >- ] >- > rtc_static_library("audio_format_conversion") { > visibility += webrtc_default_visibility > sources = [ >@@ -53,6 +27,7 @@ rtc_static_library("audio_format_conversion") { > "../../rtc_base:checks", > "../../rtc_base:rtc_base_approved", > "../../rtc_base:sanitizer", >+ "//third_party/abseil-cpp/absl/strings", > "//third_party/abseil-cpp/absl/types:optional", > ] > } >@@ -63,7 +38,6 @@ rtc_static_library("rent_a_codec") { > # TODO(bugs.webrtc.org/9808): Move to private visibility as soon as that > # client code gets updated. 
> visibility += [ "*" ] >- allow_poison = [ "audio_codecs" ] > > sources = [ > "acm2/acm_codec_database.cc", >@@ -72,20 +46,18 @@ rtc_static_library("rent_a_codec") { > "acm2/rent_a_codec.h", > ] > deps = [ >- "../../rtc_base:checks", >- "../../api:array_view", >- "//third_party/abseil-cpp/absl/types:optional", >- "../../api/audio_codecs:audio_codecs_api", >- "../..:webrtc_common", >- "../../rtc_base:protobuf_utils", >- "../../rtc_base:rtc_base_approved", >- "../../system_wrappers", >- ":audio_coding_module_typedefs", >- ":isac_common", >- ":isac_fix_c", >- ":neteq_decoder_enum", >- ] + audio_codec_deps >- >+ ":audio_coding_module_typedefs", >+ ":neteq_decoder_enum", >+ "../..:webrtc_common", >+ "../../api:array_view", >+ "../../api/audio_codecs:audio_codecs_api", >+ "../../rtc_base:checks", >+ "../../rtc_base:protobuf_utils", >+ "../../rtc_base:rtc_base_approved", >+ "../../system_wrappers", >+ "//third_party/abseil-cpp/absl/strings", >+ "//third_party/abseil-cpp/absl/types:optional", >+ ] > defines = audio_codec_defines > } > >@@ -95,12 +67,12 @@ rtc_source_set("audio_coding_module_typedefs") { > ] > deps = [ > "../..:webrtc_common", >+ "../../rtc_base:deprecation", > ] > } > > rtc_static_library("audio_coding") { > visibility += [ "*" ] >- allow_poison = [ "audio_codecs" ] # TODO(bugs.webrtc.org/8396): Remove. 
> sources = [ > "acm2/acm_receiver.cc", > "acm2/acm_receiver.h", >@@ -109,37 +81,34 @@ rtc_static_library("audio_coding") { > "acm2/audio_coding_module.cc", > "acm2/call_statistics.cc", > "acm2/call_statistics.h", >- "acm2/codec_manager.cc", >- "acm2/codec_manager.h", > "include/audio_coding_module.h", > ] > > defines = [] > >- if (rtc_include_opus) { >- public_deps = [ >- ":webrtc_opus", >- ] >- } >- >- deps = audio_coding_deps + [ >- "../../system_wrappers:metrics", >- "../../api/audio:audio_frame_api", >- "..:module_api", >- "../../common_audio:common_audio_c", >- "../../rtc_base:deprecation", >- "../../rtc_base:checks", >- "../../api:array_view", >- "../../api/audio_codecs:audio_codecs_api", >- ":audio_coding_module_typedefs", >- ":neteq", >- ":rent_a_codec", >- "../../rtc_base:audio_format_to_string", >- "../../rtc_base:rtc_base_approved", >- "//third_party/abseil-cpp/absl/types:optional", >- "../../logging:rtc_event_log_api", >- ] >- defines = audio_coding_defines >+ deps = [ >+ ":audio_coding_module_typedefs", >+ ":neteq", >+ ":neteq_decoder_enum", >+ ":rent_a_codec", >+ "..:module_api", >+ "..:module_api_public", >+ "../..:webrtc_common", >+ "../../api:array_view", >+ "../../api/audio:audio_frame_api", >+ "../../api/audio_codecs:audio_codecs_api", >+ "../../common_audio:common_audio", >+ "../../common_audio:common_audio_c", >+ "../../logging:rtc_event_log_api", >+ "../../rtc_base:audio_format_to_string", >+ "../../rtc_base:checks", >+ "../../rtc_base:deprecation", >+ "../../rtc_base:rtc_base_approved", >+ "../../system_wrappers", >+ "../../system_wrappers:metrics", >+ "//third_party/abseil-cpp/absl/strings", >+ "//third_party/abseil-cpp/absl/types:optional", >+ ] > } > > rtc_static_library("legacy_encoded_audio_frame") { >@@ -150,26 +119,41 @@ rtc_static_library("legacy_encoded_audio_frame") { > deps = [ > "../../api:array_view", > "../../api/audio_codecs:audio_codecs_api", >+ "../../rtc_base:checks", > "../../rtc_base:rtc_base_approved", >+ 
"//third_party/abseil-cpp/absl/types:optional", > ] > } > >-rtc_static_library("cng") { >- visibility += [ "*" ] >+rtc_static_library("webrtc_cng") { >+ visibility += webrtc_default_visibility > sources = [ >- "codecs/cng/audio_encoder_cng.cc", >- "codecs/cng/audio_encoder_cng.h", > "codecs/cng/webrtc_cng.cc", > "codecs/cng/webrtc_cng.h", > ] > > deps = [ >- "../..:webrtc_common", > "../../api:array_view", >- "../../api/audio_codecs:audio_codecs_api", >- "../../common_audio", > "../../common_audio:common_audio_c", >+ "../../rtc_base:checks", > "../../rtc_base:rtc_base_approved", >+ "../../rtc_base:safe_conversions", >+ ] >+} >+ >+rtc_static_library("audio_encoder_cng") { >+ visibility += [ "*" ] >+ sources = [ >+ "codecs/cng/audio_encoder_cng.cc", >+ "codecs/cng/audio_encoder_cng.h", >+ ] >+ >+ deps = [ >+ ":webrtc_cng", >+ "../../api/audio_codecs:audio_codecs_api", >+ "../../common_audio", >+ "../../rtc_base:checks", >+ "//third_party/abseil-cpp/absl/memory", > ] > } > >@@ -181,10 +165,12 @@ rtc_static_library("red") { > ] > > deps = [ >+ "../../api:array_view", > "../../api/audio_codecs:audio_codecs_api", > "../../common_audio", > "../../rtc_base:checks", > "../../rtc_base:rtc_base_approved", >+ "//third_party/abseil-cpp/absl/types:optional", > ] > } > >@@ -201,6 +187,7 @@ rtc_static_library("g711") { > deps = [ > ":legacy_encoded_audio_frame", > "../..:webrtc_common", >+ "../../api:array_view", > "../../api/audio_codecs:audio_codecs_api", > "../../rtc_base:checks", > "../../rtc_base:rtc_base_approved", >@@ -235,6 +222,7 @@ rtc_static_library("g722") { > deps = [ > ":legacy_encoded_audio_frame", > "../..:webrtc_common", >+ "../../api:array_view", > "../../api/audio_codecs:audio_codecs_api", > "../../api/audio_codecs/g722:audio_encoder_g722_config", > "../../rtc_base:checks", >@@ -270,6 +258,7 @@ rtc_static_library("ilbc") { > deps = [ > ":legacy_encoded_audio_frame", > "../..:webrtc_common", >+ "../../api:array_view", > "../../api/audio_codecs:audio_codecs_api", 
> "../../api/audio_codecs/ilbc:audio_encoder_ilbc_config", > "../../common_audio", >@@ -786,6 +775,7 @@ rtc_static_library("pcm16b") { > ":g711", > ":legacy_encoded_audio_frame", > "../..:webrtc_common", >+ "../../api:array_view", > "../../api/audio_codecs:audio_codecs_api", > "../../rtc_base:checks", > "../../rtc_base:rtc_base_approved", >@@ -820,6 +810,7 @@ rtc_static_library("webrtc_opus") { > deps = [ > ":audio_network_adaptor", > "../..:webrtc_common", >+ "../../api:array_view", > "../../api/audio_codecs:audio_codecs_api", > "../../api/audio_codecs/opus:audio_encoder_opus_config", > "../../common_audio", >@@ -829,6 +820,7 @@ rtc_static_library("webrtc_opus") { > "../../rtc_base:safe_minmax", > "../../system_wrappers:field_trial", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings", > "//third_party/abseil-cpp/absl/types:optional", > ] > public_deps = [ >@@ -882,6 +874,7 @@ if (rtc_enable_protobuf) { > proto_out_dir = "modules/audio_coding/audio_network_adaptor" > } > proto_library("ana_config_proto") { >+ visibility += [ "*" ] > sources = [ > "audio_network_adaptor/config.proto", > ] >@@ -1032,8 +1025,6 @@ rtc_static_library("neteq") { > "neteq/random_vector.h", > "neteq/red_payload_splitter.cc", > "neteq/red_payload_splitter.h", >- "neteq/rtcp.cc", >- "neteq/rtcp.h", > "neteq/statistics_calculator.cc", > "neteq/statistics_calculator.h", > "neteq/sync_buffer.cc", >@@ -1048,9 +1039,10 @@ rtc_static_library("neteq") { > > deps = [ > ":audio_coding_module_typedefs", >- ":cng", > ":neteq_decoder_enum", >+ ":webrtc_cng", > "..:module_api", >+ "..:module_api_public", > "../..:webrtc_common", > "../../api:array_view", > "../../api:libjingle_peerconnection_api", >@@ -1067,6 +1059,7 @@ rtc_static_library("neteq") { > "../../rtc_base/system:fallthrough", > "../../system_wrappers:field_trial", > "../../system_wrappers:metrics", >+ "//third_party/abseil-cpp/absl/strings", > "//third_party/abseil-cpp/absl/types:optional", > ] > } >@@ 
-1105,6 +1098,7 @@ rtc_source_set("neteq_tools_minimal") { > "../rtp_rtcp", > "//third_party/abseil-cpp/absl/types:optional", > ] >+ defines = audio_codec_defines > } > > rtc_source_set("neteq_test_tools") { >@@ -1140,11 +1134,11 @@ rtc_source_set("neteq_test_tools") { > "../../rtc_base:checks", > "../../rtc_base:rtc_base", > "../../rtc_base:rtc_base_approved", >- "../../rtc_base:rtc_base_tests_utils", > "../../rtc_base/system:arch", > "../../test:rtp_test_utils", > "../rtp_rtcp", > "../rtp_rtcp:rtp_rtcp_format", >+ "//third_party/abseil-cpp/absl/types:optional", > ] > > public_deps = [ >@@ -1238,6 +1232,7 @@ if (rtc_enable_protobuf) { > "../../rtc_base:rtc_base_approved", > "../rtp_rtcp", > "../rtp_rtcp:rtp_rtcp_format", >+ "//third_party/abseil-cpp/absl/types:optional", > ] > public_deps = [ > "../../logging:rtc_event_log_proto", >@@ -1246,6 +1241,30 @@ if (rtc_enable_protobuf) { > } > > if (rtc_include_tests) { >+ audio_coding_deps = [ >+ "../../common_audio", >+ "../../system_wrappers", >+ "../..:webrtc_common", >+ ":audio_encoder_cng", >+ ":g711", >+ ":g722", >+ ":pcm16b", >+ ] >+ if (rtc_include_ilbc) { >+ audio_coding_deps += [ ":ilbc" ] >+ } >+ if (rtc_include_opus) { >+ audio_coding_deps += [ ":webrtc_opus" ] >+ } >+ if (current_cpu == "arm") { >+ audio_coding_deps += [ ":isac_fix" ] >+ } else { >+ audio_coding_deps += [ ":isac" ] >+ } >+ if (!build_with_mozilla && !build_with_chromium) { >+ audio_coding_deps += [ ":red" ] >+ } >+ > rtc_source_set("mocks") { > testonly = true > sources = [ >@@ -1330,10 +1349,11 @@ if (rtc_include_tests) { > deps = [ > ":audio_coding", > ":audio_coding_module_typedefs", >+ ":audio_encoder_cng", > ":audio_format_conversion", >- ":cng", > ":pcm16b_c", > ":red", >+ ":webrtc_opus_c", > "..:module_api", > "../..:webrtc_common", > "../../api/audio:audio_frame_api", >@@ -1359,6 +1379,7 @@ if (rtc_include_tests) { > "../../system_wrappers", > "../../test:fileutils", > "../../test:test_support", >+ 
"//third_party/abseil-cpp/absl/strings", > "//third_party/abseil-cpp/absl/types:optional", > ] > defines = audio_coding_defines >@@ -1414,6 +1435,7 @@ if (rtc_include_tests) { > ":neteq_tools", > "../../rtc_base:rtc_base_approved", > "../../test:test_support", >+ "//third_party/abseil-cpp/absl/strings", > "//testing/gtest", > ] > } >@@ -1477,6 +1499,7 @@ if (rtc_include_tests) { > "../../rtc_base/system:arch", > "../../test:test_main", > "//testing/gtest", >+ "../../test:test_support", > ] + audio_coding_deps > > data = audio_decoder_unittests_resources >@@ -1501,7 +1524,7 @@ if (rtc_include_tests) { > rtc_source_set("neteq_test_factory") { > testonly = true > visibility += webrtc_default_visibility >- defines = [] >+ defines = audio_codec_defines > deps = [ > "../../rtc_base:checks", > "../../test:fileutils", >@@ -1594,6 +1617,7 @@ if (rtc_include_tests) { > "../../api:libjingle_peerconnection_api", > "../../rtc_base:rtc_base_approved", > "../../test:test_main", >+ "../../test:test_support", > "../audio_processing", > "//testing/gtest", > ] >@@ -1661,6 +1685,7 @@ if (rtc_include_tests) { > deps = audio_coding_deps + [ > "//third_party/abseil-cpp/absl/memory", > ":audio_coding", >+ ":audio_encoder_cng", > ":neteq_input_audio_tools", > "../../api/audio:audio_frame_api", > "../../api/audio_codecs/g711:audio_encoder_g711", >@@ -1756,7 +1781,6 @@ if (rtc_include_tests) { > ":neteq_test_support", > "../..:webrtc_common", > "../../rtc_base:rtc_base_approved", >- "../../test:fileutils", > "../../test:test_support", > ] > } >@@ -1956,6 +1980,7 @@ if (rtc_include_tests) { > "../../rtc_base:rtc_base_approved", > "../../test:fileutils", > "../../test:test_main", >+ "../../test:test_support", > "//testing/gtest", > ] > } >@@ -1968,8 +1993,6 @@ if (rtc_include_tests) { > "acm2/acm_receiver_unittest.cc", > "acm2/audio_coding_module_unittest.cc", > "acm2/call_statistics_unittest.cc", >- "acm2/codec_manager_unittest.cc", >- "acm2/rent_a_codec_unittest.cc", > 
"audio_network_adaptor/audio_network_adaptor_impl_unittest.cc", > "audio_network_adaptor/bitrate_controller_unittest.cc", > "audio_network_adaptor/channel_controller_unittest.cc", >@@ -2048,9 +2071,9 @@ if (rtc_include_tests) { > ":acm_send_test", > ":audio_coding", > ":audio_coding_module_typedefs", >+ ":audio_encoder_cng", > ":audio_format_conversion", > ":audio_network_adaptor", >- ":cng", > ":g711", > ":ilbc", > ":isac", >@@ -2064,6 +2087,7 @@ if (rtc_include_tests) { > ":pcm16b", > ":red", > ":rent_a_codec", >+ ":webrtc_cng", > ":webrtc_opus", > "..:module_api", > "../..:webrtc_common", >@@ -2079,6 +2103,7 @@ if (rtc_include_tests) { > "../../logging:mocks", > "../../logging:rtc_event_audio", > "../../logging:rtc_event_log_api", >+ "../../modules/rtp_rtcp:rtp_rtcp_format", > "../../rtc_base:checks", > "../../rtc_base:protobuf_utils", > "../../rtc_base:rtc_base", >@@ -2098,6 +2123,7 @@ if (rtc_include_tests) { > "codecs/opus/test:test_unittest", > "//testing/gtest", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/types:optional", > ] > > defines = audio_coding_defines >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/OWNERS b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/OWNERS >index c1adc561b895dd82247d32f3481f24af47c73c89..46f99586842b39151d20ab7c6f63876ee58d0c68 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/OWNERS >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/OWNERS >@@ -4,6 +4,7 @@ kwiberg@webrtc.org > minyue@webrtc.org > jan.skoglund@webrtc.org > ossu@webrtc.org >+ivoc@webrtc.org > > # These are for the common case of adding or renaming files. If you're doing > # structural changes, please get a review from a reviewer in this file. 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_codec_database.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_codec_database.cc >index 311af2df3ff6e94fd5933f29ee19a1851065eb14..cada80c061884389a1daa04249099a2e8c4b3ae0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_codec_database.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_codec_database.cc >@@ -17,7 +17,9 @@ > // references, where appropriate. > #include "modules/audio_coding/acm2/acm_codec_database.h" > >-#include "rtc_base/checks.h" >+#include "absl/strings/match.h" >+#include "api/array_view.h" >+#include "modules/audio_coding/acm2/rent_a_codec.h" > > #if ((defined WEBRTC_CODEC_ISAC) && (defined WEBRTC_CODEC_ISACFX)) > #error iSAC and iSACFX codecs cannot be enabled at the same time >@@ -239,12 +241,12 @@ int ACMCodecDB::CodecNumber(const CodecInst& codec_inst) { > } > > // Comfort Noise is special case, packet-size & rate is not checked. >- if (STR_CASE_CMP(database_[codec_id].plname, "CN") == 0) { >+ if (absl::EqualsIgnoreCase(database_[codec_id].plname, "CN")) { > return codec_id; > } > > // RED is special case, packet-size & rate is not checked. >- if (STR_CASE_CMP(database_[codec_id].plname, "red") == 0) { >+ if (absl::EqualsIgnoreCase(database_[codec_id].plname, "red")) { > return codec_id; > } > >@@ -272,12 +274,12 @@ int ACMCodecDB::CodecNumber(const CodecInst& codec_inst) { > > // Check the validity of rate. Codecs with multiple rates have their own > // function for this. >- if (STR_CASE_CMP("isac", codec_inst.plname) == 0) { >+ if (absl::EqualsIgnoreCase("isac", codec_inst.plname)) { > return IsISACRateValid(codec_inst.rate) ? codec_id : kInvalidRate; >- } else if (STR_CASE_CMP("ilbc", codec_inst.plname) == 0) { >+ } else if (absl::EqualsIgnoreCase("ilbc", codec_inst.plname)) { > return IsILBCRateValid(codec_inst.rate, codec_inst.pacsize) ? 
codec_id > : kInvalidRate; >- } else if (STR_CASE_CMP("opus", codec_inst.plname) == 0) { >+ } else if (absl::EqualsIgnoreCase("opus", codec_inst.plname)) { > return IsOpusRateValid(codec_inst.rate) ? codec_id : kInvalidRate; > } > >@@ -296,7 +298,7 @@ int ACMCodecDB::CodecId(const CodecInst& codec_inst) { > int ACMCodecDB::CodecId(const char* payload_name, > int frequency, > size_t channels) { >- for (const CodecInst& ci : RentACodec::Database()) { >+ for (const CodecInst& ci : database_) { > bool name_match = false; > bool frequency_match = false; > bool channels_match = false; >@@ -304,10 +306,10 @@ int ACMCodecDB::CodecId(const char* payload_name, > // Payload name, sampling frequency and number of channels need to match. > // NOTE! If |frequency| is -1, the frequency is not applicable, and is > // always treated as true, like for RED. >- name_match = (STR_CASE_CMP(ci.plname, payload_name) == 0); >+ name_match = absl::EqualsIgnoreCase(ci.plname, payload_name); > frequency_match = (frequency == ci.plfreq) || (frequency == -1); > // The number of channels must match for all codecs but Opus. >- if (STR_CASE_CMP(payload_name, "opus") != 0) { >+ if (!absl::EqualsIgnoreCase(payload_name, "opus")) { > channels_match = (channels == ci.channels); > } else { > // For opus we just check that number of channels is valid. >@@ -316,7 +318,7 @@ int ACMCodecDB::CodecId(const char* payload_name, > > if (name_match && frequency_match && channels_match) { > // We have found a matching codec in the list. 
>- return &ci - RentACodec::Database().data(); >+ return &ci - database_; > } > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_receive_test.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_receive_test.cc >index 2d7f2960de589bda4d935ac33294a6e59f619fa5..c149ec1b0bc21defe9ad30cc8fb7138ee6d438e5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_receive_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_receive_test.cc >@@ -14,6 +14,7 @@ > > #include <memory> > >+#include "absl/strings/match.h" > #include "api/audio_codecs/builtin_audio_decoder_factory.h" > #include "modules/audio_coding/codecs/audio_format_conversion.h" > #include "modules/audio_coding/include/audio_coding_module.h" >@@ -30,11 +31,11 @@ namespace { > // Returns true if the codec should be registered, otherwise false. Changes > // the number of channels for the Opus codec to always be 1. > bool ModifyAndUseThisCodec(CodecInst* codec_param) { >- if (STR_CASE_CMP(codec_param->plname, "CN") == 0 && >+ if (absl::EqualsIgnoreCase(codec_param->plname, "CN") && > codec_param->plfreq == 48000) > return false; // Skip 48 kHz comfort noise. > >- if (STR_CASE_CMP(codec_param->plname, "telephone-event") == 0) >+ if (absl::EqualsIgnoreCase(codec_param->plname, "telephone-event")) > return false; // Skip DTFM. > > return true; >@@ -65,39 +66,43 @@ bool RemapPltypeAndUseThisCodec(const char* plname, > return false; // Don't use non-mono codecs. > > // Re-map pltypes to those used in the NetEq test files. 
>- if (STR_CASE_CMP(plname, "PCMU") == 0 && plfreq == 8000) { >+ if (absl::EqualsIgnoreCase(plname, "PCMU") && plfreq == 8000) { > *pltype = 0; >- } else if (STR_CASE_CMP(plname, "PCMA") == 0 && plfreq == 8000) { >+ } else if (absl::EqualsIgnoreCase(plname, "PCMA") && plfreq == 8000) { > *pltype = 8; >- } else if (STR_CASE_CMP(plname, "CN") == 0 && plfreq == 8000) { >+ } else if (absl::EqualsIgnoreCase(plname, "CN") && plfreq == 8000) { > *pltype = 13; >- } else if (STR_CASE_CMP(plname, "CN") == 0 && plfreq == 16000) { >+ } else if (absl::EqualsIgnoreCase(plname, "CN") && plfreq == 16000) { > *pltype = 98; >- } else if (STR_CASE_CMP(plname, "CN") == 0 && plfreq == 32000) { >+ } else if (absl::EqualsIgnoreCase(plname, "CN") && plfreq == 32000) { > *pltype = 99; >- } else if (STR_CASE_CMP(plname, "ILBC") == 0) { >+ } else if (absl::EqualsIgnoreCase(plname, "ILBC")) { > *pltype = 102; >- } else if (STR_CASE_CMP(plname, "ISAC") == 0 && plfreq == 16000) { >+ } else if (absl::EqualsIgnoreCase(plname, "ISAC") && plfreq == 16000) { > *pltype = 103; >- } else if (STR_CASE_CMP(plname, "ISAC") == 0 && plfreq == 32000) { >+ } else if (absl::EqualsIgnoreCase(plname, "ISAC") && plfreq == 32000) { > *pltype = 104; >- } else if (STR_CASE_CMP(plname, "telephone-event") == 0 && plfreq == 8000) { >+ } else if (absl::EqualsIgnoreCase(plname, "telephone-event") && >+ plfreq == 8000) { > *pltype = 106; >- } else if (STR_CASE_CMP(plname, "telephone-event") == 0 && plfreq == 16000) { >+ } else if (absl::EqualsIgnoreCase(plname, "telephone-event") && >+ plfreq == 16000) { > *pltype = 114; >- } else if (STR_CASE_CMP(plname, "telephone-event") == 0 && plfreq == 32000) { >+ } else if (absl::EqualsIgnoreCase(plname, "telephone-event") && >+ plfreq == 32000) { > *pltype = 115; >- } else if (STR_CASE_CMP(plname, "telephone-event") == 0 && plfreq == 48000) { >+ } else if (absl::EqualsIgnoreCase(plname, "telephone-event") && >+ plfreq == 48000) { > *pltype = 116; >- } else if (STR_CASE_CMP(plname, 
"red") == 0) { >+ } else if (absl::EqualsIgnoreCase(plname, "red")) { > *pltype = 117; >- } else if (STR_CASE_CMP(plname, "L16") == 0 && plfreq == 8000) { >+ } else if (absl::EqualsIgnoreCase(plname, "L16") && plfreq == 8000) { > *pltype = 93; >- } else if (STR_CASE_CMP(plname, "L16") == 0 && plfreq == 16000) { >+ } else if (absl::EqualsIgnoreCase(plname, "L16") && plfreq == 16000) { > *pltype = 94; >- } else if (STR_CASE_CMP(plname, "L16") == 0 && plfreq == 32000) { >+ } else if (absl::EqualsIgnoreCase(plname, "L16") && plfreq == 32000) { > *pltype = 95; >- } else if (STR_CASE_CMP(plname, "G722") == 0) { >+ } else if (absl::EqualsIgnoreCase(plname, "G722")) { > *pltype = 9; > } else { > // Don't use any other codecs. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_receiver.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_receiver.cc >index f631746b88f4bfb8e50baadbd36e2950e2e43fdb..d3af7c03418851a16c49bfc3254cacb2ab3d0bd7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_receiver.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_receiver.cc >@@ -10,21 +10,21 @@ > > #include "modules/audio_coding/acm2/acm_receiver.h" > >-#include <stdlib.h> // malloc >- >-#include <algorithm> // sort >+#include <stdlib.h> >+#include <string.h> >+#include <cstdint> > #include <vector> > >+#include "absl/strings/match.h" >+#include "api/audio/audio_frame.h" > #include "api/audio_codecs/audio_decoder.h" >-#include "common_audio/signal_processing/include/signal_processing_library.h" >-#include "common_types.h" // NOLINT(build/include) > #include "modules/audio_coding/acm2/acm_resampler.h" > #include "modules/audio_coding/acm2/call_statistics.h" > #include "modules/audio_coding/acm2/rent_a_codec.h" > #include "modules/audio_coding/neteq/include/neteq.h" >+#include "modules/audio_coding/neteq/neteq_decoder_enum.h" > #include 
"modules/include/module_common_types.h" > #include "rtc_base/checks.h" >-#include "rtc_base/format_macros.h" > #include "rtc_base/logging.h" > #include "rtc_base/numerics/safe_conversions.h" > #include "rtc_base/strings/audio_format_to_string.h" >@@ -92,7 +92,7 @@ int AcmReceiver::InsertPacket(const WebRtcRTPHeader& rtp_header, > } > receive_timestamp = NowInTimestamp(ci->plfreq); > >- if (STR_CASE_CMP(ci->plname, "cn") == 0) { >+ if (absl::EqualsIgnoreCase(ci->plname, "cn")) { > if (last_audio_decoder_ && last_audio_decoder_->channels > 1) { > // This is a CNG and the audio codec is not mono, so skip pushing in > // packets into NetEq. >@@ -345,6 +345,13 @@ void AcmReceiver::GetNetworkStatistics(NetworkStatistics* acm_stat) { > acm_stat->concealedSamples = neteq_lifetime_stat.concealed_samples; > acm_stat->concealmentEvents = neteq_lifetime_stat.concealment_events; > acm_stat->jitterBufferDelayMs = neteq_lifetime_stat.jitter_buffer_delay_ms; >+ acm_stat->delayedPacketOutageSamples = >+ neteq_lifetime_stat.delayed_packet_outage_samples; >+ >+ NetEqOperationsAndState neteq_operations_and_state = >+ neteq_->GetOperationsAndState(); >+ acm_stat->packetBufferFlushes = >+ neteq_operations_and_state.packet_buffer_flushes; > } > > int AcmReceiver::DecoderByPayloadType(uint8_t payload_type, >@@ -391,7 +398,7 @@ const absl::optional<CodecInst> AcmReceiver::RtpHeaderToDecoder( > uint8_t first_payload_byte) const { > const absl::optional<CodecInst> ci = > neteq_->GetDecoder(rtp_header.payloadType); >- if (ci && STR_CASE_CMP(ci->plname, "red") == 0) { >+ if (ci && absl::EqualsIgnoreCase(ci->plname, "red")) { > // This is a RED packet. Get the payload of the audio codec. 
> return neteq_->GetDecoder(first_payload_byte & 0x7f); > } else { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_receiver.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_receiver.h >index a2ae723deee6acdbc7f43ff2f322c8ffe7c4248d..8e7d83978c0d7a13a39b62b9219cd4b4e30a30e5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_receiver.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_receiver.h >@@ -11,6 +11,7 @@ > #ifndef MODULES_AUDIO_CODING_ACM2_ACM_RECEIVER_H_ > #define MODULES_AUDIO_CODING_ACM2_ACM_RECEIVER_H_ > >+#include <stdint.h> > #include <map> > #include <memory> > #include <string> >@@ -18,19 +19,21 @@ > > #include "absl/types/optional.h" > #include "api/array_view.h" >-#include "api/audio/audio_frame.h" >-#include "common_audio/vad/include/webrtc_vad.h" >+#include "api/audio_codecs/audio_decoder.h" >+#include "api/audio_codecs/audio_format.h" > #include "modules/audio_coding/acm2/acm_resampler.h" > #include "modules/audio_coding/acm2/call_statistics.h" > #include "modules/audio_coding/include/audio_coding_module.h" >-#include "modules/audio_coding/neteq/include/neteq.h" > #include "rtc_base/criticalsection.h" > #include "rtc_base/thread_annotations.h" > > namespace webrtc { > >+class Clock; > struct CodecInst; > class NetEq; >+struct RTPHeader; >+struct WebRtcRTPHeader; > > namespace acm2 { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_receiver_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_receiver_unittest.cc >index 29f2a45669b3d99f21edc0402d16c1fed0be1f11..46384febb5749f45d0ed2dc71553e944409394ac 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_receiver_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_receiver_unittest.cc >@@ -73,12 +73,12 @@ class 
AcmReceiverTestOldApi : public AudioPacketizationCallback, > // If we have a compatible CN specification, stack a CNG on top. > auto it = cng_payload_types.find(info.sample_rate_hz); > if (it != cng_payload_types.end()) { >- AudioEncoderCng::Config config; >+ AudioEncoderCngConfig config; > config.speech_encoder = std::move(enc); > config.num_channels = 1; > config.payload_type = it->second; > config.vad_mode = Vad::kVadNormal; >- enc = absl::make_unique<AudioEncoderCng>(std::move(config)); >+ enc = CreateComfortNoiseEncoder(std::move(config)); > } > > // Actually start using the new encoder. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_resampler.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_resampler.cc >index c0b2064453759cd4fdbf167b1adc5b32407cb4b0..ca3583e32cdea464b9946c6ab9c8c5fe5c374c57 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_resampler.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_resampler.cc >@@ -13,7 +13,6 @@ > #include <assert.h> > #include <string.h> > >-#include "common_audio/resampler/include/resampler.h" > #include "rtc_base/logging.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_resampler.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_resampler.h >index 904ea524f8839a604da8a386f7aa5247457adf83..96ba93a7622995ce9aca315bf5aaad93494bd6e1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_resampler.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/acm_resampler.h >@@ -11,6 +11,9 @@ > #ifndef MODULES_AUDIO_CODING_ACM2_ACM_RESAMPLER_H_ > #define MODULES_AUDIO_CODING_ACM2_ACM_RESAMPLER_H_ > >+#include <stddef.h> >+#include <stdint.h> >+ > #include "common_audio/resampler/include/push_resampler.h" > > namespace webrtc { >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/audio_coding_module.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/audio_coding_module.cc >index 60afeb69bded39cfc37f24fc3f7510539c9bcc12..c0aab3a30fa539a3bb90ea8b130fdb027f57cc50 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/audio_coding_module.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/audio_coding_module.cc >@@ -10,28 +10,29 @@ > > #include "modules/audio_coding/include/audio_coding_module.h" > >+#include <assert.h> > #include <algorithm> >+#include <cstdint> > >+#include "absl/strings/match.h" >+#include "api/array_view.h" > #include "modules/audio_coding/acm2/acm_receiver.h" > #include "modules/audio_coding/acm2/acm_resampler.h" >-#include "modules/audio_coding/acm2/codec_manager.h" > #include "modules/audio_coding/acm2/rent_a_codec.h" > #include "modules/include/module_common_types.h" >+#include "modules/include/module_common_types_public.h" >+#include "rtc_base/buffer.h" > #include "rtc_base/checks.h" >+#include "rtc_base/criticalsection.h" > #include "rtc_base/logging.h" > #include "rtc_base/numerics/safe_conversions.h" >+#include "rtc_base/thread_annotations.h" > #include "system_wrappers/include/metrics.h" > > namespace webrtc { > > namespace { > >-struct EncoderFactory { >- AudioEncoder* external_speech_encoder = nullptr; >- acm2::CodecManager codec_manager; >- acm2::RentACodec rent_a_codec; >-}; >- > class AudioCodingModuleImpl final : public AudioCodingModule { > public: > explicit AudioCodingModuleImpl(const AudioCodingModule::Config& config); >@@ -41,12 +42,6 @@ class AudioCodingModuleImpl final : public AudioCodingModule { > // Sender > // > >- // Can be called multiple times for Codec, CNG, RED. 
>- int RegisterSendCodec(const CodecInst& send_codec) override; >- >- void RegisterExternalSendCodec( >- AudioEncoder* external_speech_encoder) override; >- > void ModifyEncoder(rtc::FunctionView<void(std::unique_ptr<AudioEncoder>*)> > modifier) override; > >@@ -65,26 +60,10 @@ class AudioCodingModuleImpl final : public AudioCodingModule { > // Add 10 ms of raw (PCM) audio data to the encoder. > int Add10MsData(const AudioFrame& audio_frame) override; > >- ///////////////////////////////////////// >- // (RED) Redundant Coding >- // >- >- // Configure RED status i.e. on/off. >- int SetREDStatus(bool enable_red) override; >- >- // Get RED status. >- bool REDStatus() const override; >- > ///////////////////////////////////////// > // (FEC) Forward Error Correction (codec internal) > // > >- // Configure FEC status i.e. on/off. >- int SetCodecFEC(bool enabled_codec_fec) override; >- >- // Get FEC status. >- bool CodecFEC() const override; >- > // Set target packet loss rate > int SetPacketLossRate(int loss_rate) override; > >@@ -94,14 +73,6 @@ class AudioCodingModuleImpl final : public AudioCodingModule { > // (CNG) Comfort Noise Generation > // > >- int SetVAD(bool enable_dtx = true, >- bool enable_vad = false, >- ACMVADMode mode = VADNormal) override; >- >- int VAD(bool* dtx_enabled, >- bool* vad_enabled, >- ACMVADMode* mode) const override; >- > int RegisterVADCallback(ACMVADCallback* vad_callback) override; > > ///////////////////////////////////////// >@@ -122,11 +93,6 @@ class AudioCodingModuleImpl final : public AudioCodingModule { > bool RegisterReceiveCodec(int rtp_payload_type, > const SdpAudioFormat& audio_format) override; > >- int RegisterReceiveCodec(const CodecInst& receive_codec) override; >- int RegisterReceiveCodec( >- const CodecInst& receive_codec, >- rtc::FunctionView<std::unique_ptr<AudioDecoder>()> isac_factory) override; >- > int RegisterExternalReceiveCodec(int rtp_payload_type, > AudioDecoder* external_decoder, > int sample_rate_hz, >@@ 
-214,11 +180,6 @@ class AudioCodingModuleImpl final : public AudioCodingModule { > const std::string histogram_name_; > }; > >- int RegisterReceiveCodecUnlocked( >- const CodecInst& codec, >- rtc::FunctionView<std::unique_ptr<AudioDecoder>()> isac_factory) >- RTC_EXCLUSIVE_LOCKS_REQUIRED(acm_crit_sect_); >- > int Add10MsDataInternal(const AudioFrame& audio_frame, InputData* input_data) > RTC_EXCLUSIVE_LOCKS_REQUIRED(acm_crit_sect_); > int Encode(const InputData& input_data) >@@ -256,12 +217,7 @@ class AudioCodingModuleImpl final : public AudioCodingModule { > acm2::AcmReceiver receiver_; // AcmReceiver has it's own internal lock. > ChangeLogger bitrate_logger_ RTC_GUARDED_BY(acm_crit_sect_); > >- std::unique_ptr<EncoderFactory> encoder_factory_ >- RTC_GUARDED_BY(acm_crit_sect_); >- >- // Current encoder stack, either obtained from >- // encoder_factory_->rent_a_codec.RentEncoderStack or provided by a call to >- // RegisterEncoder. >+ // Current encoder stack, provided by a call to RegisterEncoder. > std::unique_ptr<AudioEncoder> encoder_stack_ RTC_GUARDED_BY(acm_crit_sect_); > > std::unique_ptr<AudioDecoder> isac_decoder_16k_ >@@ -397,28 +353,6 @@ class RawAudioEncoderWrapper final : public AudioEncoder { > AudioEncoder* enc_; > }; > >-// Return false on error. >-bool CreateSpeechEncoderIfNecessary(EncoderFactory* ef) { >- auto* sp = ef->codec_manager.GetStackParams(); >- if (sp->speech_encoder) { >- // Do nothing; we already have a speech encoder. >- } else if (ef->codec_manager.GetCodecInst()) { >- RTC_DCHECK(!ef->external_speech_encoder); >- // We have no speech encoder, but we have a specification for making one. >- std::unique_ptr<AudioEncoder> enc = >- ef->rent_a_codec.RentEncoder(*ef->codec_manager.GetCodecInst()); >- if (!enc) >- return false; // Encoder spec was bad. >- sp->speech_encoder = std::move(enc); >- } else if (ef->external_speech_encoder) { >- RTC_DCHECK(!ef->codec_manager.GetCodecInst()); >- // We have an external speech encoder. 
>- sp->speech_encoder = std::unique_ptr<AudioEncoder>( >- new RawAudioEncoderWrapper(ef->external_speech_encoder)); >- } >- return true; >-} >- > void AudioCodingModuleImpl::ChangeLogger::MaybeLog(int value) { > if (value != last_value_ || first_time_) { > first_time_ = false; >@@ -433,7 +367,6 @@ AudioCodingModuleImpl::AudioCodingModuleImpl( > expected_in_ts_(0xD87F3F9F), > receiver_(config), > bitrate_logger_("WebRTC.Audio.TargetBitrateInKbps"), >- encoder_factory_(new EncoderFactory), > encoder_stack_(nullptr), > previous_pltype_(255), > receiver_initialized_(false), >@@ -541,69 +474,29 @@ int32_t AudioCodingModuleImpl::Encode(const InputData& input_data) { > // Sender > // > >-// Can be called multiple times for Codec, CNG, RED. >-int AudioCodingModuleImpl::RegisterSendCodec(const CodecInst& send_codec) { >- rtc::CritScope lock(&acm_crit_sect_); >- if (!encoder_factory_->codec_manager.RegisterEncoder(send_codec)) { >- return -1; >- } >- if (encoder_factory_->codec_manager.GetCodecInst()) { >- encoder_factory_->external_speech_encoder = nullptr; >- } >- if (!CreateSpeechEncoderIfNecessary(encoder_factory_.get())) { >- return -1; >- } >- auto* sp = encoder_factory_->codec_manager.GetStackParams(); >- if (sp->speech_encoder) >- encoder_stack_ = encoder_factory_->rent_a_codec.RentEncoderStack(sp); >- return 0; >-} >- >-void AudioCodingModuleImpl::RegisterExternalSendCodec( >- AudioEncoder* external_speech_encoder) { >- rtc::CritScope lock(&acm_crit_sect_); >- encoder_factory_->codec_manager.UnsetCodecInst(); >- encoder_factory_->external_speech_encoder = external_speech_encoder; >- RTC_CHECK(CreateSpeechEncoderIfNecessary(encoder_factory_.get())); >- auto* sp = encoder_factory_->codec_manager.GetStackParams(); >- RTC_CHECK(sp->speech_encoder); >- encoder_stack_ = encoder_factory_->rent_a_codec.RentEncoderStack(sp); >-} >- > void AudioCodingModuleImpl::ModifyEncoder( > rtc::FunctionView<void(std::unique_ptr<AudioEncoder>*)> modifier) { > rtc::CritScope 
lock(&acm_crit_sect_); >- >- // Wipe the encoder factory, so that everything that relies on it will fail. >- // We don't want the complexity of supporting swapping back and forth. >- if (encoder_factory_) { >- encoder_factory_.reset(); >- RTC_CHECK(!encoder_stack_); // Ensure we hadn't started using the factory. >- } >- > modifier(&encoder_stack_); > } > > // Get current send codec. > absl::optional<CodecInst> AudioCodingModuleImpl::SendCodec() const { > rtc::CritScope lock(&acm_crit_sect_); >- if (encoder_factory_) { >- auto* ci = encoder_factory_->codec_manager.GetCodecInst(); >- if (ci) { >- return *ci; >- } >- CreateSpeechEncoderIfNecessary(encoder_factory_.get()); >- const std::unique_ptr<AudioEncoder>& enc = >- encoder_factory_->codec_manager.GetStackParams()->speech_encoder; >- if (enc) { >- return acm2::CodecManager::ForgeCodecInst(enc.get()); >- } >- return absl::nullopt; >+ if (encoder_stack_) { >+ CodecInst ci; >+ ci.channels = encoder_stack_->NumChannels(); >+ ci.plfreq = encoder_stack_->SampleRateHz(); >+ ci.pacsize = rtc::CheckedDivExact( >+ static_cast<int>(encoder_stack_->Max10MsFramesInAPacket() * ci.plfreq), >+ 100); >+ ci.pltype = -1; // Not valid. >+ ci.rate = -1; // Not valid. >+ static const char kName[] = "external"; >+ memcpy(ci.plname, kName, sizeof(kName)); >+ return ci; > } else { >- return encoder_stack_ >- ? absl::optional<CodecInst>( >- acm2::CodecManager::ForgeCodecInst(encoder_stack_.get())) >- : absl::nullopt; >+ return absl::nullopt; > } > } > >@@ -800,59 +693,10 @@ int AudioCodingModuleImpl::PreprocessToAddData(const AudioFrame& in_frame, > return 0; > } > >-///////////////////////////////////////// >-// (RED) Redundant Coding >-// >- >-bool AudioCodingModuleImpl::REDStatus() const { >- rtc::CritScope lock(&acm_crit_sect_); >- return encoder_factory_->codec_manager.GetStackParams()->use_red; >-} >- >-// Configure RED status i.e on/off. 
>-int AudioCodingModuleImpl::SetREDStatus(bool enable_red) { >-#ifdef WEBRTC_CODEC_RED >- rtc::CritScope lock(&acm_crit_sect_); >- CreateSpeechEncoderIfNecessary(encoder_factory_.get()); >- if (!encoder_factory_->codec_manager.SetCopyRed(enable_red)) { >- return -1; >- } >- auto* sp = encoder_factory_->codec_manager.GetStackParams(); >- if (sp->speech_encoder) >- encoder_stack_ = encoder_factory_->rent_a_codec.RentEncoderStack(sp); >- return 0; >-#else >- RTC_LOG(LS_WARNING) << " WEBRTC_CODEC_RED is undefined"; >- return -1; >-#endif >-} >- > ///////////////////////////////////////// > // (FEC) Forward Error Correction (codec internal) > // > >-bool AudioCodingModuleImpl::CodecFEC() const { >- rtc::CritScope lock(&acm_crit_sect_); >- return encoder_factory_->codec_manager.GetStackParams()->use_codec_fec; >-} >- >-int AudioCodingModuleImpl::SetCodecFEC(bool enable_codec_fec) { >- rtc::CritScope lock(&acm_crit_sect_); >- CreateSpeechEncoderIfNecessary(encoder_factory_.get()); >- if (!encoder_factory_->codec_manager.SetCodecFEC(enable_codec_fec)) { >- return -1; >- } >- auto* sp = encoder_factory_->codec_manager.GetStackParams(); >- if (sp->speech_encoder) >- encoder_stack_ = encoder_factory_->rent_a_codec.RentEncoderStack(sp); >- if (enable_codec_fec) { >- return sp->use_codec_fec ? 0 : -1; >- } else { >- RTC_DCHECK(!sp->use_codec_fec); >- return 0; >- } >-} >- > int AudioCodingModuleImpl::SetPacketLossRate(int loss_rate) { > rtc::CritScope lock(&acm_crit_sect_); > if (HaveValidEncoder("SetPacketLossRate")) { >@@ -861,36 +705,6 @@ int AudioCodingModuleImpl::SetPacketLossRate(int loss_rate) { > return 0; > } > >-///////////////////////////////////////// >-// (VAD) Voice Activity Detection >-// >-int AudioCodingModuleImpl::SetVAD(bool enable_dtx, >- bool enable_vad, >- ACMVADMode mode) { >- // Note: |enable_vad| is not used; VAD is enabled based on the DTX setting. 
>- RTC_DCHECK_EQ(enable_dtx, enable_vad); >- rtc::CritScope lock(&acm_crit_sect_); >- CreateSpeechEncoderIfNecessary(encoder_factory_.get()); >- if (!encoder_factory_->codec_manager.SetVAD(enable_dtx, mode)) { >- return -1; >- } >- auto* sp = encoder_factory_->codec_manager.GetStackParams(); >- if (sp->speech_encoder) >- encoder_stack_ = encoder_factory_->rent_a_codec.RentEncoderStack(sp); >- return 0; >-} >- >-// Get VAD/DTX settings. >-int AudioCodingModuleImpl::VAD(bool* dtx_enabled, >- bool* vad_enabled, >- ACMVADMode* mode) const { >- rtc::CritScope lock(&acm_crit_sect_); >- const auto* sp = encoder_factory_->codec_manager.GetStackParams(); >- *dtx_enabled = *vad_enabled = sp->use_cng; >- *mode = sp->vad_mode; >- return 0; >-} >- > ///////////////////////////////////////// > // Receiver > // >@@ -949,59 +763,6 @@ bool AudioCodingModuleImpl::RegisterReceiveCodec( > return receiver_.AddCodec(rtp_payload_type, audio_format); > } > >-int AudioCodingModuleImpl::RegisterReceiveCodec(const CodecInst& codec) { >- rtc::CritScope lock(&acm_crit_sect_); >- auto* ef = encoder_factory_.get(); >- return RegisterReceiveCodecUnlocked( >- codec, [&] { return ef->rent_a_codec.RentIsacDecoder(codec.plfreq); }); >-} >- >-int AudioCodingModuleImpl::RegisterReceiveCodec( >- const CodecInst& codec, >- rtc::FunctionView<std::unique_ptr<AudioDecoder>()> isac_factory) { >- rtc::CritScope lock(&acm_crit_sect_); >- return RegisterReceiveCodecUnlocked(codec, isac_factory); >-} >- >-int AudioCodingModuleImpl::RegisterReceiveCodecUnlocked( >- const CodecInst& codec, >- rtc::FunctionView<std::unique_ptr<AudioDecoder>()> isac_factory) { >- RTC_DCHECK(receiver_initialized_); >- if (codec.channels > 2) { >- RTC_LOG_F(LS_ERROR) << "Unsupported number of channels: " << codec.channels; >- return -1; >- } >- >- auto codec_id = acm2::RentACodec::CodecIdByParams(codec.plname, codec.plfreq, >- codec.channels); >- if (!codec_id) { >- RTC_LOG_F(LS_ERROR) >- << "Wrong codec params to be registered as 
receive codec"; >- return -1; >- } >- auto codec_index = acm2::RentACodec::CodecIndexFromId(*codec_id); >- RTC_CHECK(codec_index) << "Invalid codec ID: " << static_cast<int>(*codec_id); >- >- // Check if the payload-type is valid. >- if (!acm2::RentACodec::IsPayloadTypeValid(codec.pltype)) { >- RTC_LOG_F(LS_ERROR) << "Invalid payload type " << codec.pltype << " for " >- << codec.plname; >- return -1; >- } >- >- AudioDecoder* isac_decoder = nullptr; >- if (STR_CASE_CMP(codec.plname, "isac") == 0) { >- std::unique_ptr<AudioDecoder>& saved_isac_decoder = >- codec.plfreq == 16000 ? isac_decoder_16k_ : isac_decoder_32k_; >- if (!saved_isac_decoder) { >- saved_isac_decoder = isac_factory(); >- } >- isac_decoder = saved_isac_decoder.get(); >- } >- return receiver_.AddCodec(*codec_index, codec.pltype, codec.channels, >- codec.plfreq, isac_decoder, codec.plname); >-} >- > int AudioCodingModuleImpl::RegisterExternalReceiveCodec( > int rtp_payload_type, > AudioDecoder* external_decoder, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/audio_coding_module_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/audio_coding_module_unittest.cc >index 5ac6102c68209988adbb02b7b1ca5cc048ff0527..4e262f7b038a3bdf9f136f3a564a4952a7a26c0f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/audio_coding_module_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/audio_coding_module_unittest.cc >@@ -35,6 +35,7 @@ > #include "modules/audio_coding/neteq/tools/packet.h" > #include "modules/audio_coding/neteq/tools/rtp_file_source.h" > #include "rtc_base/criticalsection.h" >+#include "rtc_base/event.h" > #include "rtc_base/messagedigest.h" > #include "rtc_base/numerics/safe_conversions.h" > #include "rtc_base/platform_thread.h" >@@ -42,7 +43,6 @@ > #include "rtc_base/system/arch.h" > #include "rtc_base/thread_annotations.h" > #include "system_wrappers/include/clock.h" 
>-#include "system_wrappers/include/event_wrapper.h" > #include "system_wrappers/include/sleep.h" > #include "test/gtest.h" > #include "test/mock_audio_decoder.h" >@@ -408,12 +408,12 @@ class AudioCodingModuleTestWithComfortNoiseOldApi > acm_->RegisterReceiveCodec( > rtp_payload_type, SdpAudioFormat("cn", kSampleRateHz, 1))); > acm_->ModifyEncoder([&](std::unique_ptr<AudioEncoder>* enc) { >- AudioEncoderCng::Config config; >+ AudioEncoderCngConfig config; > config.speech_encoder = std::move(*enc); > config.num_channels = 1; > config.payload_type = rtp_payload_type; > config.vad_mode = Vad::kVadNormal; >- *enc = absl::make_unique<AudioEncoderCng>(std::move(config)); >+ *enc = CreateComfortNoiseEncoder(std::move(config)); > }); > } > >@@ -481,7 +481,6 @@ class AudioCodingModuleMtTestOldApi : public AudioCodingModuleTestOldApi { > send_thread_(CbSendThread, this, "send"), > insert_packet_thread_(CbInsertPacketThread, this, "insert_packet"), > pull_audio_thread_(CbPullAudioThread, this, "pull_audio"), >- test_complete_(EventWrapper::Create()), > send_count_(0), > insert_packet_count_(0), > pull_audio_count_(0), >@@ -512,8 +511,8 @@ class AudioCodingModuleMtTestOldApi : public AudioCodingModuleTestOldApi { > insert_packet_thread_.Stop(); > } > >- EventTypeWrapper RunTest() { >- return test_complete_->Wait(10 * 60 * 1000); // 10 minutes' timeout. >+ bool RunTest() { >+ return test_complete_.Wait(10 * 60 * 1000); // 10 minutes' timeout. > } > > virtual bool TestDone() { >@@ -538,12 +537,12 @@ class AudioCodingModuleMtTestOldApi : public AudioCodingModuleTestOldApi { > SleepMs(1); > if (HasFatalFailure()) { > // End the test early if a fatal failure (ASSERT_*) has occurred. 
>- test_complete_->Set(); >+ test_complete_.Set(); > } > ++send_count_; > InsertAudioAndVerifyEncoding(); > if (TestDone()) { >- test_complete_->Set(); >+ test_complete_.Set(); > } > return true; > } >@@ -592,7 +591,7 @@ class AudioCodingModuleMtTestOldApi : public AudioCodingModuleTestOldApi { > rtc::PlatformThread send_thread_; > rtc::PlatformThread insert_packet_thread_; > rtc::PlatformThread pull_audio_thread_; >- const std::unique_ptr<EventWrapper> test_complete_; >+ rtc::Event test_complete_; > int send_count_; > int insert_packet_count_; > int pull_audio_count_ RTC_GUARDED_BY(crit_sect_); >@@ -607,7 +606,7 @@ class AudioCodingModuleMtTestOldApi : public AudioCodingModuleTestOldApi { > #define MAYBE_DoTest DoTest > #endif > TEST_F(AudioCodingModuleMtTestOldApi, MAYBE_DoTest) { >- EXPECT_EQ(kEventSignaled, RunTest()); >+ EXPECT_TRUE(RunTest()); > } > > // This is a multi-threaded ACM test using iSAC. The test encodes audio >@@ -717,7 +716,7 @@ class AcmIsacMtTestOldApi : public AudioCodingModuleMtTestOldApi { > #endif > #if defined(WEBRTC_CODEC_ISAC) || defined(WEBRTC_CODEC_ISACFX) > TEST_F(AcmIsacMtTestOldApi, MAYBE_DoTest) { >- EXPECT_EQ(kEventSignaled, RunTest()); >+ EXPECT_TRUE(RunTest()); > } > #endif > >@@ -734,7 +733,6 @@ class AcmReRegisterIsacMtTestOldApi : public AudioCodingModuleTestOldApi { > codec_registration_thread_(CbCodecRegistrationThread, > this, > "codec_registration"), >- test_complete_(EventWrapper::Create()), > codec_registered_(false), > receive_packet_count_(0), > next_insert_packet_time_ms_(0), >@@ -781,8 +779,8 @@ class AcmReRegisterIsacMtTestOldApi : public AudioCodingModuleTestOldApi { > codec_registration_thread_.Stop(); > } > >- EventTypeWrapper RunTest() { >- return test_complete_->Wait(10 * 60 * 1000); // 10 minutes' timeout. >+ bool RunTest() { >+ return test_complete_.Wait(10 * 60 * 1000); // 10 minutes' timeout. 
> } > > static bool CbReceiveThread(void* context) { >@@ -845,7 +843,7 @@ class AcmReRegisterIsacMtTestOldApi : public AudioCodingModuleTestOldApi { > SleepMs(1); > if (HasFatalFailure()) { > // End the test early if a fatal failure (ASSERT_*) has occurred. >- test_complete_->Set(); >+ test_complete_.Set(); > } > rtc::CritScope lock(&crit_sect_); > if (!codec_registered_ && >@@ -856,14 +854,14 @@ class AcmReRegisterIsacMtTestOldApi : public AudioCodingModuleTestOldApi { > codec_registered_ = true; > } > if (codec_registered_ && receive_packet_count_ > kNumPackets) { >- test_complete_->Set(); >+ test_complete_.Set(); > } > return true; > } > > rtc::PlatformThread receive_thread_; > rtc::PlatformThread codec_registration_thread_; >- const std::unique_ptr<EventWrapper> test_complete_; >+ rtc::Event test_complete_; > rtc::CriticalSection crit_sect_; > bool codec_registered_ RTC_GUARDED_BY(crit_sect_); > int receive_packet_count_ RTC_GUARDED_BY(crit_sect_); >@@ -880,7 +878,7 @@ class AcmReRegisterIsacMtTestOldApi : public AudioCodingModuleTestOldApi { > #endif > #if defined(WEBRTC_CODEC_ISAC) || defined(WEBRTC_CODEC_ISACFX) > TEST_F(AcmReRegisterIsacMtTestOldApi, MAYBE_DoTest) { >- EXPECT_EQ(kEventSignaled, RunTest()); >+ EXPECT_TRUE(RunTest()); > } > #endif > >@@ -1185,14 +1183,14 @@ class AcmSenderBitExactnessOldApi : public ::testing::Test, > > // Extract and verify the audio checksum. > std::string checksum_string = audio_checksum.Finish(); >- EXPECT_EQ(audio_checksum_ref, checksum_string); >+ ExpectChecksumEq(audio_checksum_ref, checksum_string); > > // Extract and verify the payload checksum. 
> rtc::Buffer checksum_result(payload_checksum_->Size());
> payload_checksum_->Finish(checksum_result.data(), checksum_result.size());
> checksum_string =
> rtc::hex_encode(checksum_result.data<char>(), checksum_result.size());
>- EXPECT_EQ(payload_checksum_ref, checksum_string);
>+ ExpectChecksumEq(payload_checksum_ref, checksum_string);
>
> // Verify number of packets produced.
> EXPECT_EQ(expected_packets, packet_count_);
>@@ -1201,6 +1199,18 @@ class AcmSenderBitExactnessOldApi : public ::testing::Test,
> remove(output_file_name.c_str());
> }
>
>+ // Helper: result must be one of the "|"-separated checksums.
>+ void ExpectChecksumEq(std::string ref, std::string result) {
>+ if (ref.size() == result.size()) {
>+ // Only one checksum: clearer message.
>+ EXPECT_EQ(ref, result);
>+ } else {
>+ EXPECT_NE(ref.find(result), std::string::npos)
>+ << result << " must be one of these:\n"
>+ << ref;
>+ }
>+ }
>+
> // Inherited from test::PacketSource.
> std::unique_ptr<test::Packet> NextPacket() override {
> auto packet = send_test_->NextPacket();
>@@ -1438,21 +1448,35 @@ TEST_F(AcmSenderBitExactnessOldApi, MAYBE_G722_stereo_20ms) {
> 50, test::AcmReceiveTestOldApi::kStereoOutput);
> }
>
>+namespace {
>+// Checksum depends on libopus being compiled with or without SSE.
>+const std::string audio_maybe_sse =
>+ "3e285b74510e62062fbd8142dacd16e9|"
>+ "fd5d57d6d766908e6a7211e2a5c7f78a";
>+const std::string payload_maybe_sse =
>+ "78cf8f03157358acdc69f6835caa0d9b|"
>+ "b693bd95c2ee2354f92340dd09e9da68";
>+// Common checksums. 
>+const std::string audio_checksum = >+ AcmReceiverBitExactnessOldApi::PlatformChecksum( >+ audio_maybe_sse, >+ audio_maybe_sse, >+ "439e97ad1932c49923b5da029c17dd5e", >+ "038ec90f5f3fc2320f3090f8ecef6bb7", >+ "038ec90f5f3fc2320f3090f8ecef6bb7"); >+const std::string payload_checksum = >+ AcmReceiverBitExactnessOldApi::PlatformChecksum( >+ payload_maybe_sse, >+ payload_maybe_sse, >+ "ab88b1a049c36bdfeb7e8b057ef6982a", >+ "27fef7b799393347ec3b5694369a1c36", >+ "27fef7b799393347ec3b5694369a1c36"); >+}; // namespace >+ > TEST_F(AcmSenderBitExactnessOldApi, Opus_stereo_20ms) { > ASSERT_NO_FATAL_FAILURE(SetUpTest("opus", 48000, 2, 120, 960, 960)); >- Run(AcmReceiverBitExactnessOldApi::PlatformChecksum( >- "3e285b74510e62062fbd8142dacd16e9", >- "3e285b74510e62062fbd8142dacd16e9", >- "439e97ad1932c49923b5da029c17dd5e", >- "038ec90f5f3fc2320f3090f8ecef6bb7", >- "038ec90f5f3fc2320f3090f8ecef6bb7"), >- AcmReceiverBitExactnessOldApi::PlatformChecksum( >- "78cf8f03157358acdc69f6835caa0d9b", >- "78cf8f03157358acdc69f6835caa0d9b", >- "ab88b1a049c36bdfeb7e8b057ef6982a", >- "27fef7b799393347ec3b5694369a1c36", >- "27fef7b799393347ec3b5694369a1c36"), >- 50, test::AcmReceiveTestOldApi::kStereoOutput); >+ Run(audio_checksum, payload_checksum, 50, >+ test::AcmReceiveTestOldApi::kStereoOutput); > } > > TEST_F(AcmSenderBitExactnessNewApi, MAYBE_OpusFromFormat_stereo_20ms) { >@@ -1460,19 +1484,8 @@ TEST_F(AcmSenderBitExactnessNewApi, MAYBE_OpusFromFormat_stereo_20ms) { > SdpAudioFormat("opus", 48000, 2, {{"stereo", "1"}})); > ASSERT_NO_FATAL_FAILURE(SetUpTestExternalEncoder( > AudioEncoderOpus::MakeAudioEncoder(*config, 120), 120)); >- Run(AcmReceiverBitExactnessOldApi::PlatformChecksum( >- "3e285b74510e62062fbd8142dacd16e9", >- "3e285b74510e62062fbd8142dacd16e9", >- "439e97ad1932c49923b5da029c17dd5e", >- "038ec90f5f3fc2320f3090f8ecef6bb7", >- "038ec90f5f3fc2320f3090f8ecef6bb7"), >- AcmReceiverBitExactnessOldApi::PlatformChecksum( >- "78cf8f03157358acdc69f6835caa0d9b", >- 
"78cf8f03157358acdc69f6835caa0d9b", >- "ab88b1a049c36bdfeb7e8b057ef6982a", >- "27fef7b799393347ec3b5694369a1c36", >- "27fef7b799393347ec3b5694369a1c36"), >- 50, test::AcmReceiveTestOldApi::kStereoOutput); >+ Run(audio_checksum, payload_checksum, 50, >+ test::AcmReceiveTestOldApi::kStereoOutput); > } > > TEST_F(AcmSenderBitExactnessNewApi, OpusFromFormat_stereo_20ms_voip) { >@@ -1482,15 +1495,19 @@ TEST_F(AcmSenderBitExactnessNewApi, OpusFromFormat_stereo_20ms_voip) { > config->application = AudioEncoderOpusConfig::ApplicationMode::kVoip; > ASSERT_NO_FATAL_FAILURE(SetUpTestExternalEncoder( > AudioEncoderOpus::MakeAudioEncoder(*config, 120), 120)); >+ // Checksum depends on libopus being compiled with or without SSE. >+ const std::string audio_maybe_sse = >+ "b0325df4e8104f04e03af23c0b75800e|" >+ "3cd4e1bc2acd9440bb9e97af34080ffc"; >+ const std::string payload_maybe_sse = >+ "4eab2259b6fe24c22dd242a113e0b3d9|" >+ "4fc0af0aa06c26454af09832d3ec1b4e"; > Run(AcmReceiverBitExactnessOldApi::PlatformChecksum( >- "b0325df4e8104f04e03af23c0b75800e", >- "b0325df4e8104f04e03af23c0b75800e", >- "1c81121f5d9286a5a865d01dbab22ce8", >+ audio_maybe_sse, audio_maybe_sse, "1c81121f5d9286a5a865d01dbab22ce8", > "11d547f89142e9ef03f37d7ca7f32379", > "11d547f89142e9ef03f37d7ca7f32379"), > AcmReceiverBitExactnessOldApi::PlatformChecksum( >- "4eab2259b6fe24c22dd242a113e0b3d9", >- "4eab2259b6fe24c22dd242a113e0b3d9", >+ payload_maybe_sse, payload_maybe_sse, > "839ea60399447268ee0f0262a50b75fd", > "1815fd5589cad0c6f6cf946c76b81aeb", > "1815fd5589cad0c6f6cf946c76b81aeb"), >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/call_statistics.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/call_statistics.h >index 9dced6475a80c773f4c14290bf7f770674ae037a..5d94ac453891db8d351b75569145996d4fefea93 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/call_statistics.h >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/call_statistics.h >@@ -12,7 +12,7 @@ > #define MODULES_AUDIO_CODING_ACM2_CALL_STATISTICS_H_ > > #include "api/audio/audio_frame.h" >-#include "common_types.h" // NOLINT(build/include) >+#include "modules/audio_coding/include/audio_coding_module_typedefs.h" > > // > // This class is for book keeping of calls to ACM. It is not useful to log API >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/codec_manager.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/codec_manager.cc >deleted file mode 100644 >index f29e0f1a1e27b90cc9054d8377eb2f85b4dcd504..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/codec_manager.cc >+++ /dev/null >@@ -1,246 +0,0 @@ >-/* >- * Copyright (c) 2015 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include "modules/audio_coding/acm2/codec_manager.h" >- >-#include "rtc_base/checks.h" >-//#include "rtc_base/format_macros.h" >-#include "modules/audio_coding/acm2/rent_a_codec.h" >-#include "rtc_base/logging.h" >- >-namespace webrtc { >-namespace acm2 { >- >-namespace { >- >-// Check if the given codec is a valid to be registered as send codec. 
>-int IsValidSendCodec(const CodecInst& send_codec) { >- if ((send_codec.channels != 1) && (send_codec.channels != 2)) { >- RTC_LOG(LS_ERROR) << "Wrong number of channels (" << send_codec.channels >- << "), only mono and stereo are supported)"; >- return -1; >- } >- >- auto maybe_codec_id = RentACodec::CodecIdByInst(send_codec); >- if (!maybe_codec_id) { >- RTC_LOG(LS_ERROR) << "Invalid codec setting for the send codec."; >- return -1; >- } >- >- // Telephone-event cannot be a send codec. >- if (!STR_CASE_CMP(send_codec.plname, "telephone-event")) { >- RTC_LOG(LS_ERROR) << "telephone-event cannot be a send codec"; >- return -1; >- } >- >- if (!RentACodec::IsSupportedNumChannels(*maybe_codec_id, send_codec.channels) >- .value_or(false)) { >- RTC_LOG(LS_ERROR) << send_codec.channels >- << " number of channels not supported for " >- << send_codec.plname << "."; >- return -1; >- } >- return RentACodec::CodecIndexFromId(*maybe_codec_id).value_or(-1); >-} >- >-bool IsOpus(const CodecInst& codec) { >- return >-#ifdef WEBRTC_CODEC_OPUS >- !STR_CASE_CMP(codec.plname, "opus") || >-#endif >- false; >-} >- >-} // namespace >- >-CodecManager::CodecManager() { >- thread_checker_.DetachFromThread(); >-} >- >-CodecManager::~CodecManager() = default; >- >-bool CodecManager::RegisterEncoder(const CodecInst& send_codec) { >- RTC_DCHECK(thread_checker_.CalledOnValidThread()); >- int codec_id = IsValidSendCodec(send_codec); >- >- // Check for reported errors from function IsValidSendCodec(). 
>- if (codec_id < 0) { >- return false; >- } >- >- switch (RentACodec::RegisterRedPayloadType( >- &codec_stack_params_.red_payload_types, send_codec)) { >- case RentACodec::RegistrationResult::kOk: >- return true; >- case RentACodec::RegistrationResult::kBadFreq: >- RTC_LOG(LS_ERROR) >- << "RegisterSendCodec() failed, invalid frequency for RED" >- " registration"; >- return false; >- case RentACodec::RegistrationResult::kSkip: >- break; >- } >- switch (RentACodec::RegisterCngPayloadType( >- &codec_stack_params_.cng_payload_types, send_codec)) { >- case RentACodec::RegistrationResult::kOk: >- return true; >- case RentACodec::RegistrationResult::kBadFreq: >- RTC_LOG(LS_ERROR) >- << "RegisterSendCodec() failed, invalid frequency for CNG" >- " registration"; >- return false; >- case RentACodec::RegistrationResult::kSkip: >- break; >- } >- >- if (IsOpus(send_codec)) { >- // VAD/DTX not supported. >- codec_stack_params_.use_cng = false; >- } >- >- send_codec_inst_ = send_codec; >- recreate_encoder_ = true; // Caller must recreate it. >- return true; >-} >- >-CodecInst CodecManager::ForgeCodecInst( >- const AudioEncoder* external_speech_encoder) { >- CodecInst ci; >- ci.channels = external_speech_encoder->NumChannels(); >- ci.plfreq = external_speech_encoder->SampleRateHz(); >- ci.pacsize = rtc::CheckedDivExact( >- static_cast<int>(external_speech_encoder->Max10MsFramesInAPacket() * >- ci.plfreq), >- 100); >- ci.pltype = -1; // Not valid. >- ci.rate = -1; // Not valid. 
>- static const char kName[] = "external"; >- memcpy(ci.plname, kName, sizeof(kName)); >- return ci; >-} >- >-bool CodecManager::SetCopyRed(bool enable) { >- if (enable && codec_stack_params_.use_codec_fec) { >- RTC_LOG(LS_WARNING) << "Codec internal FEC and RED cannot be co-enabled."; >- return false; >- } >- if (enable && send_codec_inst_ && >- codec_stack_params_.red_payload_types.count(send_codec_inst_->plfreq) < >- 1) { >- RTC_LOG(LS_WARNING) << "Cannot enable RED at " << send_codec_inst_->plfreq >- << " Hz."; >- return false; >- } >- codec_stack_params_.use_red = enable; >- return true; >-} >- >-bool CodecManager::SetVAD(bool enable, ACMVADMode mode) { >- // Sanity check of the mode. >- RTC_DCHECK(mode == VADNormal || mode == VADLowBitrate || mode == VADAggr || >- mode == VADVeryAggr); >- >- // Check that the send codec is mono. We don't support VAD/DTX for stereo >- // sending. >- const bool stereo_send = >- codec_stack_params_.speech_encoder >- ? (codec_stack_params_.speech_encoder->NumChannels() != 1) >- : false; >- if (enable && stereo_send) { >- RTC_LOG(LS_ERROR) << "VAD/DTX not supported for stereo sending"; >- return false; >- } >- >- // TODO(kwiberg): This doesn't protect Opus when injected as an external >- // encoder. >- if (send_codec_inst_ && IsOpus(*send_codec_inst_)) { >- // VAD/DTX not supported, but don't fail. 
>- enable = false; >- } >- >- codec_stack_params_.use_cng = enable; >- codec_stack_params_.vad_mode = mode; >- return true; >-} >- >-bool CodecManager::SetCodecFEC(bool enable_codec_fec) { >- if (enable_codec_fec && codec_stack_params_.use_red) { >- RTC_LOG(LS_WARNING) << "Codec internal FEC and RED cannot be co-enabled."; >- return false; >- } >- >- codec_stack_params_.use_codec_fec = enable_codec_fec; >- return true; >-} >- >-bool CodecManager::MakeEncoder(RentACodec* rac, AudioCodingModule* acm) { >- RTC_DCHECK(rac); >- RTC_DCHECK(acm); >- >- if (!recreate_encoder_) { >- bool error = false; >- // Try to re-use the speech encoder we've given to the ACM. >- acm->ModifyEncoder([&](std::unique_ptr<AudioEncoder>* encoder) { >- if (!*encoder) { >- // There is no existing encoder. >- recreate_encoder_ = true; >- return; >- } >- >- // Extract the speech encoder from the ACM. >- std::unique_ptr<AudioEncoder> enc = std::move(*encoder); >- while (true) { >- auto sub_enc = enc->ReclaimContainedEncoders(); >- if (sub_enc.empty()) { >- break; >- } >- RTC_CHECK_EQ(1, sub_enc.size()); >- >- // Replace enc with its sub encoder. We need to put the sub encoder in >- // a temporary first, since otherwise the old value of enc would be >- // destroyed before the new value got assigned, which would be bad >- // since the new value is a part of the old value. >- auto tmp_enc = std::move(sub_enc[0]); >- enc = std::move(tmp_enc); >- } >- >- // Wrap it in a new encoder stack and put it back. >- codec_stack_params_.speech_encoder = std::move(enc); >- *encoder = rac->RentEncoderStack(&codec_stack_params_); >- if (!*encoder) { >- error = true; >- } >- }); >- if (error) { >- return false; >- } >- if (!recreate_encoder_) { >- return true; >- } >- } >- >- if (!send_codec_inst_) { >- // We don't have the information we need to create a new speech encoder. >- // (This is not an error.) 
>- return true; >- } >- >- codec_stack_params_.speech_encoder = rac->RentEncoder(*send_codec_inst_); >- auto stack = rac->RentEncoderStack(&codec_stack_params_); >- if (!stack) { >- return false; >- } >- acm->SetEncoder(std::move(stack)); >- recreate_encoder_ = false; >- return true; >-} >- >-} // namespace acm2 >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/codec_manager.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/codec_manager.h >deleted file mode 100644 >index ffbad9670880508c76f16815769853070571ed8d..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/codec_manager.h >+++ /dev/null >@@ -1,75 +0,0 @@ >-/* >- * Copyright (c) 2015 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef MODULES_AUDIO_CODING_ACM2_CODEC_MANAGER_H_ >-#define MODULES_AUDIO_CODING_ACM2_CODEC_MANAGER_H_ >- >-#include <map> >- >-#include "absl/types/optional.h" >-#include "common_types.h" // NOLINT(build/include) >-#include "modules/audio_coding/acm2/rent_a_codec.h" >-#include "modules/audio_coding/include/audio_coding_module.h" >-#include "modules/audio_coding/include/audio_coding_module_typedefs.h" >-#include "rtc_base/constructormagic.h" >-#include "rtc_base/thread_checker.h" >- >-namespace webrtc { >- >-class AudioDecoder; >-class AudioEncoder; >- >-namespace acm2 { >- >-class CodecManager final { >- public: >- CodecManager(); >- ~CodecManager(); >- >- // Parses the given specification. 
On success, returns true and updates the >- // stored CodecInst and stack parameters; on error, returns false. >- bool RegisterEncoder(const CodecInst& send_codec); >- >- static CodecInst ForgeCodecInst(const AudioEncoder* external_speech_encoder); >- >- const CodecInst* GetCodecInst() const { >- return send_codec_inst_ ? &*send_codec_inst_ : nullptr; >- } >- >- void UnsetCodecInst() { send_codec_inst_ = absl::nullopt; } >- >- const RentACodec::StackParameters* GetStackParams() const { >- return &codec_stack_params_; >- } >- RentACodec::StackParameters* GetStackParams() { return &codec_stack_params_; } >- >- bool SetCopyRed(bool enable); >- >- bool SetVAD(bool enable, ACMVADMode mode); >- >- bool SetCodecFEC(bool enable_codec_fec); >- >- // Uses the provided Rent-A-Codec to create a new encoder stack, if we have a >- // complete specification; if so, it is then passed to set_encoder. On error, >- // returns false. >- bool MakeEncoder(RentACodec* rac, AudioCodingModule* acm); >- >- private: >- rtc::ThreadChecker thread_checker_; >- absl::optional<CodecInst> send_codec_inst_; >- RentACodec::StackParameters codec_stack_params_; >- bool recreate_encoder_ = true; // Need to recreate encoder? >- >- RTC_DISALLOW_COPY_AND_ASSIGN(CodecManager); >-}; >- >-} // namespace acm2 >-} // namespace webrtc >-#endif // MODULES_AUDIO_CODING_ACM2_CODEC_MANAGER_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/codec_manager_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/codec_manager_unittest.cc >deleted file mode 100644 >index 6a5ea5f52e2556248d6c07d42ec3fa2b4a3b3875..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/codec_manager_unittest.cc >+++ /dev/null >@@ -1,70 +0,0 @@ >-/* >- * Copyright (c) 2015 The WebRTC project authors. All Rights Reserved. 
>- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include <memory> >- >-#include "modules/audio_coding/acm2/codec_manager.h" >-#include "modules/audio_coding/acm2/rent_a_codec.h" >-#include "test/gtest.h" >-#include "test/mock_audio_encoder.h" >- >-namespace webrtc { >-namespace acm2 { >- >-using ::testing::Return; >- >-namespace { >- >-// Create a MockAudioEncoder with some reasonable default behavior. >-std::unique_ptr<MockAudioEncoder> CreateMockEncoder() { >- auto enc = std::unique_ptr<MockAudioEncoder>(new MockAudioEncoder); >- EXPECT_CALL(*enc, SampleRateHz()).WillRepeatedly(Return(8000)); >- EXPECT_CALL(*enc, NumChannels()).WillRepeatedly(Return(1)); >- EXPECT_CALL(*enc, Max10MsFramesInAPacket()).WillRepeatedly(Return(1)); >- return enc; >-} >- >-} // namespace >- >-TEST(CodecManagerTest, ExternalEncoderFec) { >- auto enc0 = CreateMockEncoder(); >- auto enc1 = CreateMockEncoder(); >- auto enc2 = CreateMockEncoder(); >- { >- ::testing::InSequence s; >- EXPECT_CALL(*enc0, SetFec(false)).WillOnce(Return(true)); >- EXPECT_CALL(*enc1, SetFec(true)).WillOnce(Return(true)); >- EXPECT_CALL(*enc2, SetFec(true)).WillOnce(Return(false)); >- } >- >- CodecManager cm; >- RentACodec rac; >- >- // use_codec_fec starts out false. >- EXPECT_FALSE(cm.GetStackParams()->use_codec_fec); >- cm.GetStackParams()->speech_encoder = std::move(enc0); >- EXPECT_TRUE(rac.RentEncoderStack(cm.GetStackParams())); >- EXPECT_FALSE(cm.GetStackParams()->use_codec_fec); >- >- // Set it to true. 
>- EXPECT_EQ(true, cm.SetCodecFEC(true)); >- EXPECT_TRUE(cm.GetStackParams()->use_codec_fec); >- cm.GetStackParams()->speech_encoder = std::move(enc1); >- EXPECT_TRUE(rac.RentEncoderStack(cm.GetStackParams())); >- EXPECT_TRUE(cm.GetStackParams()->use_codec_fec); >- >- // Switch to a codec that doesn't support it. >- cm.GetStackParams()->speech_encoder = std::move(enc2); >- EXPECT_TRUE(rac.RentEncoderStack(cm.GetStackParams())); >- EXPECT_FALSE(cm.GetStackParams()->use_codec_fec); >-} >- >-} // namespace acm2 >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/rent_a_codec.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/rent_a_codec.cc >index 45d78d74a5dd4e8a96e84351107a3b4162dcd9a2..bfddc42bc911876cf59b6eda8507801b1d06a0d9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/rent_a_codec.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/rent_a_codec.cc >@@ -13,34 +13,9 @@ > #include <memory> > #include <utility> > >-#include "modules/audio_coding/codecs/cng/audio_encoder_cng.h" >-#include "modules/audio_coding/codecs/g711/audio_encoder_pcm.h" > #include "rtc_base/logging.h" >-#include "modules/audio_coding/codecs/g722/audio_encoder_g722.h" >-#ifdef WEBRTC_CODEC_ILBC >-#include "modules/audio_coding/codecs/ilbc/audio_encoder_ilbc.h" // nogncheck >-#endif >-#ifdef WEBRTC_CODEC_ISACFX >-#include "modules/audio_coding/codecs/isac/fix/include/audio_decoder_isacfix.h" // nogncheck >-#include "modules/audio_coding/codecs/isac/fix/include/audio_encoder_isacfix.h" // nogncheck >-#endif >-#ifdef WEBRTC_CODEC_ISAC >-#include "modules/audio_coding/codecs/isac/main/include/audio_decoder_isac.h" // nogncheck >-#include "modules/audio_coding/codecs/isac/main/include/audio_encoder_isac.h" // nogncheck >-#endif >-#ifdef WEBRTC_CODEC_OPUS >-#include "modules/audio_coding/codecs/opus/audio_encoder_opus.h" >-#endif >-#include 
"modules/audio_coding/codecs/pcm16b/audio_encoder_pcm16b.h" >-#ifdef WEBRTC_CODEC_RED >-#include "modules/audio_coding/codecs/red/audio_encoder_copy_red.h" // nogncheck >-#endif > #include "modules/audio_coding/acm2/acm_codec_database.h" > >-#if defined(WEBRTC_CODEC_ISACFX) || defined(WEBRTC_CODEC_ISAC) >-#include "modules/audio_coding/codecs/isac/locked_bandwidth_info.h" >-#endif >- > namespace webrtc { > namespace acm2 { > >@@ -54,7 +29,8 @@ absl::optional<RentACodec::CodecId> RentACodec::CodecIdByParams( > > absl::optional<CodecInst> RentACodec::CodecInstById(CodecId codec_id) { > absl::optional<int> mi = CodecIndexFromId(codec_id); >- return mi ? absl::optional<CodecInst>(Database()[*mi]) : absl::nullopt; >+ return mi ? absl::optional<CodecInst>(ACMCodecDB::database_[*mi]) >+ : absl::nullopt; > } > > absl::optional<RentACodec::CodecId> RentACodec::CodecIdByInst( >@@ -80,20 +56,6 @@ absl::optional<CodecInst> RentACodec::CodecInstByParams( > return ci; > } > >-absl::optional<bool> RentACodec::IsSupportedNumChannels(CodecId codec_id, >- size_t num_channels) { >- auto i = CodecIndexFromId(codec_id); >- return i ? 
absl::optional<bool>( >- ACMCodecDB::codec_settings_[*i].channel_support >= >- num_channels) >- : absl::nullopt; >-} >- >-rtc::ArrayView<const CodecInst> RentACodec::Database() { >- return rtc::ArrayView<const CodecInst>(ACMCodecDB::database_, >- NumberOfCodecs()); >-} >- > absl::optional<NetEqDecoder> RentACodec::NetEqDecoderFromCodecId( > CodecId codec_id, > size_t num_channels) { >@@ -106,200 +68,5 @@ absl::optional<NetEqDecoder> RentACodec::NetEqDecoderFromCodecId( > : ned; > } > >-RentACodec::RegistrationResult RentACodec::RegisterCngPayloadType( >- std::map<int, int>* pt_map, >- const CodecInst& codec_inst) { >- if (STR_CASE_CMP(codec_inst.plname, "CN") != 0) >- return RegistrationResult::kSkip; >- switch (codec_inst.plfreq) { >- case 8000: >- case 16000: >- case 32000: >- case 48000: >- (*pt_map)[codec_inst.plfreq] = codec_inst.pltype; >- return RegistrationResult::kOk; >- default: >- return RegistrationResult::kBadFreq; >- } >-} >- >-RentACodec::RegistrationResult RentACodec::RegisterRedPayloadType( >- std::map<int, int>* pt_map, >- const CodecInst& codec_inst) { >- if (STR_CASE_CMP(codec_inst.plname, "RED") != 0) >- return RegistrationResult::kSkip; >- switch (codec_inst.plfreq) { >- case 8000: >- (*pt_map)[codec_inst.plfreq] = codec_inst.pltype; >- return RegistrationResult::kOk; >- default: >- return RegistrationResult::kBadFreq; >- } >-} >- >-namespace { >- >-// Returns a new speech encoder, or null on error. 
>-// TODO(kwiberg): Don't handle errors here (bug 5033) >-std::unique_ptr<AudioEncoder> CreateEncoder( >- const CodecInst& speech_inst, >- const rtc::scoped_refptr<LockedIsacBandwidthInfo>& bwinfo) { >-#if defined(WEBRTC_CODEC_ISACFX) >- if (STR_CASE_CMP(speech_inst.plname, "isac") == 0) >- return std::unique_ptr<AudioEncoder>( >- new AudioEncoderIsacFixImpl(speech_inst, bwinfo)); >-#endif >-#if defined(WEBRTC_CODEC_ISAC) >- if (STR_CASE_CMP(speech_inst.plname, "isac") == 0) >- return std::unique_ptr<AudioEncoder>( >- new AudioEncoderIsacFloatImpl(speech_inst, bwinfo)); >-#endif >-#ifdef WEBRTC_CODEC_OPUS >- if (STR_CASE_CMP(speech_inst.plname, "opus") == 0) >- return std::unique_ptr<AudioEncoder>(new AudioEncoderOpusImpl(speech_inst)); >-#endif >- if (STR_CASE_CMP(speech_inst.plname, "pcmu") == 0) >- return std::unique_ptr<AudioEncoder>(new AudioEncoderPcmU(speech_inst)); >- if (STR_CASE_CMP(speech_inst.plname, "pcma") == 0) >- return std::unique_ptr<AudioEncoder>(new AudioEncoderPcmA(speech_inst)); >- if (STR_CASE_CMP(speech_inst.plname, "l16") == 0) >- return std::unique_ptr<AudioEncoder>(new AudioEncoderPcm16B(speech_inst)); >-#ifdef WEBRTC_CODEC_ILBC >- if (STR_CASE_CMP(speech_inst.plname, "ilbc") == 0) >- return std::unique_ptr<AudioEncoder>(new AudioEncoderIlbcImpl(speech_inst)); >-#endif >- if (STR_CASE_CMP(speech_inst.plname, "g722") == 0) >- return std::unique_ptr<AudioEncoder>(new AudioEncoderG722Impl(speech_inst)); >- RTC_LOG_F(LS_ERROR) << "Could not create encoder of type " >- << speech_inst.plname; >- return std::unique_ptr<AudioEncoder>(); >-} >- >-std::unique_ptr<AudioEncoder> CreateRedEncoder( >- std::unique_ptr<AudioEncoder> encoder, >- int red_payload_type) { >-#ifdef WEBRTC_CODEC_RED >- AudioEncoderCopyRed::Config config; >- config.payload_type = red_payload_type; >- config.speech_encoder = std::move(encoder); >- return std::unique_ptr<AudioEncoder>( >- new AudioEncoderCopyRed(std::move(config))); >-#else >- return 
std::unique_ptr<AudioEncoder>(); >-#endif >-} >- >-std::unique_ptr<AudioEncoder> CreateCngEncoder( >- std::unique_ptr<AudioEncoder> encoder, >- int payload_type, >- ACMVADMode vad_mode) { >- AudioEncoderCng::Config config; >- config.num_channels = encoder->NumChannels(); >- config.payload_type = payload_type; >- config.speech_encoder = std::move(encoder); >- switch (vad_mode) { >- case VADNormal: >- config.vad_mode = Vad::kVadNormal; >- break; >- case VADLowBitrate: >- config.vad_mode = Vad::kVadLowBitrate; >- break; >- case VADAggr: >- config.vad_mode = Vad::kVadAggressive; >- break; >- case VADVeryAggr: >- config.vad_mode = Vad::kVadVeryAggressive; >- break; >- default: >- RTC_FATAL(); >- } >- return std::unique_ptr<AudioEncoder>(new AudioEncoderCng(std::move(config))); >-} >- >-std::unique_ptr<AudioDecoder> CreateIsacDecoder( >- int sample_rate_hz, >- const rtc::scoped_refptr<LockedIsacBandwidthInfo>& bwinfo) { >-#if defined(WEBRTC_CODEC_ISACFX) >- return std::unique_ptr<AudioDecoder>( >- new AudioDecoderIsacFixImpl(sample_rate_hz, bwinfo)); >-#elif defined(WEBRTC_CODEC_ISAC) >- return std::unique_ptr<AudioDecoder>( >- new AudioDecoderIsacFloatImpl(sample_rate_hz, bwinfo)); >-#else >- RTC_FATAL() << "iSAC is not supported."; >- return std::unique_ptr<AudioDecoder>(); >-#endif >-} >- >-} // namespace >- >-RentACodec::RentACodec() { >-#if defined(WEBRTC_CODEC_ISACFX) || defined(WEBRTC_CODEC_ISAC) >- isac_bandwidth_info_ = new LockedIsacBandwidthInfo; >-#endif >-} >-RentACodec::~RentACodec() = default; >- >-std::unique_ptr<AudioEncoder> RentACodec::RentEncoder( >- const CodecInst& codec_inst) { >- return CreateEncoder(codec_inst, isac_bandwidth_info_); >-} >- >-RentACodec::StackParameters::StackParameters() { >- // Register the default payload types for RED and CNG. 
>- for (const CodecInst& ci : RentACodec::Database()) { >- RentACodec::RegisterCngPayloadType(&cng_payload_types, ci); >- RentACodec::RegisterRedPayloadType(&red_payload_types, ci); >- } >-} >- >-RentACodec::StackParameters::~StackParameters() = default; >- >-std::unique_ptr<AudioEncoder> RentACodec::RentEncoderStack( >- StackParameters* param) { >- if (!param->speech_encoder) >- return nullptr; >- >- if (param->use_codec_fec) { >- // Switch FEC on. On failure, remember that FEC is off. >- if (!param->speech_encoder->SetFec(true)) >- param->use_codec_fec = false; >- } else { >- // Switch FEC off. This shouldn't fail. >- const bool success = param->speech_encoder->SetFec(false); >- RTC_DCHECK(success); >- } >- >- auto pt = [¶m](const std::map<int, int>& m) { >- auto it = m.find(param->speech_encoder->SampleRateHz()); >- return it == m.end() ? absl::nullopt : absl::optional<int>(it->second); >- }; >- auto cng_pt = pt(param->cng_payload_types); >- param->use_cng = >- param->use_cng && cng_pt && param->speech_encoder->NumChannels() == 1; >- auto red_pt = pt(param->red_payload_types); >- param->use_red = param->use_red && red_pt; >- >- if (param->use_cng || param->use_red) { >- // The RED and CNG encoders need to be in sync with the speech encoder, so >- // reset the latter to ensure its buffer is empty. 
>- param->speech_encoder->Reset(); >- } >- std::unique_ptr<AudioEncoder> encoder_stack = >- std::move(param->speech_encoder); >- if (param->use_red) { >- encoder_stack = CreateRedEncoder(std::move(encoder_stack), *red_pt); >- } >- if (param->use_cng) { >- encoder_stack = >- CreateCngEncoder(std::move(encoder_stack), *cng_pt, param->vad_mode); >- } >- return encoder_stack; >-} >- >-std::unique_ptr<AudioDecoder> RentACodec::RentIsacDecoder(int sample_rate_hz) { >- return CreateIsacDecoder(sample_rate_hz, isac_bandwidth_info_); >-} >- > } // namespace acm2 > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/rent_a_codec.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/rent_a_codec.h >index b0ad382d8a9b243d2cceb7aa013c50f025010b51..2cf1c6e18c9e6f8b060c05408c625e43da2d862a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/rent_a_codec.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/rent_a_codec.h >@@ -31,8 +31,7 @@ class LockedIsacBandwidthInfo; > > namespace acm2 { > >-class RentACodec { >- public: >+struct RentACodec { > enum class CodecId { > #if defined(WEBRTC_CODEC_ISAC) || defined(WEBRTC_CODEC_ISACFX) > kISAC, >@@ -133,64 +132,9 @@ class RentACodec { > return payload_type >= 0 && payload_type <= 127; > } > >- static rtc::ArrayView<const CodecInst> Database(); >- >- static absl::optional<bool> IsSupportedNumChannels(CodecId codec_id, >- size_t num_channels); >- > static absl::optional<NetEqDecoder> NetEqDecoderFromCodecId( > CodecId codec_id, > size_t num_channels); >- >- // Parse codec_inst and extract payload types. If the given CodecInst was for >- // the wrong sort of codec, return kSkip; otherwise, if the rate was illegal, >- // return kBadFreq; otherwise, update the given RTP timestamp rate (Hz) -> >- // payload type map and return kOk. 
>- enum class RegistrationResult { kOk, kSkip, kBadFreq }; >- static RegistrationResult RegisterCngPayloadType(std::map<int, int>* pt_map, >- const CodecInst& codec_inst); >- static RegistrationResult RegisterRedPayloadType(std::map<int, int>* pt_map, >- const CodecInst& codec_inst); >- >- RentACodec(); >- ~RentACodec(); >- >- // Creates and returns an audio encoder built to the given specification. >- // Returns null in case of error. >- std::unique_ptr<AudioEncoder> RentEncoder(const CodecInst& codec_inst); >- >- struct StackParameters { >- StackParameters(); >- ~StackParameters(); >- >- std::unique_ptr<AudioEncoder> speech_encoder; >- >- bool use_codec_fec = false; >- bool use_red = false; >- bool use_cng = false; >- ACMVADMode vad_mode = VADNormal; >- >- // Maps from RTP timestamp rate (in Hz) to payload type. >- std::map<int, int> cng_payload_types; >- std::map<int, int> red_payload_types; >- }; >- >- // Creates and returns an audio encoder stack constructed to the given >- // specification. If the specification isn't compatible with the encoder, it >- // will be changed to match (things will be switched off). The speech encoder >- // will be stolen. If the specification isn't complete, returns nullptr. >- std::unique_ptr<AudioEncoder> RentEncoderStack(StackParameters* param); >- >- // Creates and returns an iSAC decoder. 
>- std::unique_ptr<AudioDecoder> RentIsacDecoder(int sample_rate_hz); >- >- private: >- std::unique_ptr<AudioEncoder> speech_encoder_; >- std::unique_ptr<AudioEncoder> cng_encoder_; >- std::unique_ptr<AudioEncoder> red_encoder_; >- rtc::scoped_refptr<LockedIsacBandwidthInfo> isac_bandwidth_info_; >- >- RTC_DISALLOW_COPY_AND_ASSIGN(RentACodec); > }; > > } // namespace acm2 >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/rent_a_codec_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/rent_a_codec_unittest.cc >deleted file mode 100644 >index fd3329cfb11359cf5d27c3417a19983ca816db5e..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/rent_a_codec_unittest.cc >+++ /dev/null >@@ -1,228 +0,0 @@ >-/* >- * Copyright (c) 2015 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include <memory> >- >-#include "common_types.h" >-#include "modules/audio_coding/acm2/rent_a_codec.h" >-#include "rtc_base/arraysize.h" >-#include "test/gtest.h" >-#include "test/mock_audio_encoder.h" >- >-namespace webrtc { >-namespace acm2 { >- >-using ::testing::Return; >- >-namespace { >- >-const int kDataLengthSamples = 80; >-const int kPacketSizeSamples = 2 * kDataLengthSamples; >-const int16_t kZeroData[kDataLengthSamples] = {0}; >-const CodecInst kDefaultCodecInst = {0, "pcmu", 8000, kPacketSizeSamples, >- 1, 64000}; >-const int kCngPt = 13; >- >-class Marker final { >- public: >- MOCK_METHOD1(Mark, void(std::string desc)); >-}; >- >-} // namespace >- >-class RentACodecTestF : public ::testing::Test { >- protected: >- void CreateCodec() { >- auto speech_encoder = rent_a_codec_.RentEncoder(kDefaultCodecInst); >- ASSERT_TRUE(speech_encoder); >- RentACodec::StackParameters param; >- param.use_cng = true; >- param.speech_encoder = std::move(speech_encoder); >- encoder_ = rent_a_codec_.RentEncoderStack(¶m); >- } >- >- void EncodeAndVerify(size_t expected_out_length, >- uint32_t expected_timestamp, >- int expected_payload_type, >- int expected_send_even_if_empty) { >- rtc::Buffer out; >- AudioEncoder::EncodedInfo encoded_info; >- encoded_info = encoder_->Encode(timestamp_, kZeroData, &out); >- timestamp_ += kDataLengthSamples; >- EXPECT_TRUE(encoded_info.redundant.empty()); >- EXPECT_EQ(expected_out_length, encoded_info.encoded_bytes); >- EXPECT_EQ(expected_timestamp, encoded_info.encoded_timestamp); >- if (expected_payload_type >= 0) >- EXPECT_EQ(expected_payload_type, encoded_info.payload_type); >- if (expected_send_even_if_empty >= 0) >- EXPECT_EQ(static_cast<bool>(expected_send_even_if_empty), >- encoded_info.send_even_if_empty); >- } >- >- RentACodec rent_a_codec_; >- std::unique_ptr<AudioEncoder> encoder_; >- uint32_t timestamp_ = 0; >-}; >- >-// This test verifies that CNG frames are delivered as expected. 
Since the frame >-// size is set to 20 ms, we expect the first encode call to produce no output >-// (which is signaled as 0 bytes output of type kNoEncoding). The next encode >-// call should produce one SID frame of 9 bytes. The third call should not >-// result in any output (just like the first one). The fourth and final encode >-// call should produce an "empty frame", which is like no output, but with >-// AudioEncoder::EncodedInfo::send_even_if_empty set to true. (The reason to >-// produce an empty frame is to drive sending of DTMF packets in the RTP/RTCP >-// module.) >-TEST_F(RentACodecTestF, VerifyCngFrames) { >- CreateCodec(); >- uint32_t expected_timestamp = timestamp_; >- // Verify no frame. >- { >- SCOPED_TRACE("First encoding"); >- EncodeAndVerify(0, expected_timestamp, -1, -1); >- } >- >- // Verify SID frame delivered. >- { >- SCOPED_TRACE("Second encoding"); >- EncodeAndVerify(9, expected_timestamp, kCngPt, 1); >- } >- >- // Verify no frame. >- { >- SCOPED_TRACE("Third encoding"); >- EncodeAndVerify(0, expected_timestamp, -1, -1); >- } >- >- // Verify NoEncoding. 
>- expected_timestamp += 2 * kDataLengthSamples; >- { >- SCOPED_TRACE("Fourth encoding"); >- EncodeAndVerify(0, expected_timestamp, kCngPt, 1); >- } >-} >- >-TEST(RentACodecTest, ExternalEncoder) { >- const int kSampleRateHz = 8000; >- auto* external_encoder = new MockAudioEncoder; >- EXPECT_CALL(*external_encoder, SampleRateHz()) >- .WillRepeatedly(Return(kSampleRateHz)); >- EXPECT_CALL(*external_encoder, NumChannels()).WillRepeatedly(Return(1)); >- EXPECT_CALL(*external_encoder, SetFec(false)).WillRepeatedly(Return(true)); >- >- RentACodec rac; >- RentACodec::StackParameters param; >- param.speech_encoder = std::unique_ptr<AudioEncoder>(external_encoder); >- std::unique_ptr<AudioEncoder> encoder_stack = rac.RentEncoderStack(¶m); >- EXPECT_EQ(external_encoder, encoder_stack.get()); >- const int kPacketSizeSamples = kSampleRateHz / 100; >- int16_t audio[kPacketSizeSamples] = {0}; >- rtc::Buffer encoded; >- AudioEncoder::EncodedInfo info; >- >- Marker marker; >- { >- ::testing::InSequence s; >- info.encoded_timestamp = 0; >- EXPECT_CALL(*external_encoder, >- EncodeImpl(0, rtc::ArrayView<const int16_t>(audio), &encoded)) >- .WillOnce(Return(info)); >- EXPECT_CALL(marker, Mark("A")); >- EXPECT_CALL(marker, Mark("B")); >- EXPECT_CALL(marker, Mark("C")); >- } >- >- info = encoder_stack->Encode(0, audio, &encoded); >- EXPECT_EQ(0u, info.encoded_timestamp); >- marker.Mark("A"); >- >- // Change to internal encoder. >- CodecInst codec_inst = kDefaultCodecInst; >- codec_inst.pacsize = kPacketSizeSamples; >- param.speech_encoder = rac.RentEncoder(codec_inst); >- ASSERT_TRUE(param.speech_encoder); >- AudioEncoder* enc = param.speech_encoder.get(); >- std::unique_ptr<AudioEncoder> stack = rac.RentEncoderStack(¶m); >- EXPECT_EQ(enc, stack.get()); >- >- // Don't expect any more calls to the external encoder. 
>- info = stack->Encode(1, audio, &encoded); >- marker.Mark("B"); >- encoder_stack.reset(); >- marker.Mark("C"); >-} >- >-// Verify that the speech encoder's Reset method is called when CNG or RED >-// (or both) are switched on, but not when they're switched off. >-void TestCngAndRedResetSpeechEncoder(bool use_cng, bool use_red) { >- auto make_enc = [] { >- auto speech_encoder = >- std::unique_ptr<MockAudioEncoder>(new MockAudioEncoder); >- EXPECT_CALL(*speech_encoder, NumChannels()).WillRepeatedly(Return(1)); >- EXPECT_CALL(*speech_encoder, Max10MsFramesInAPacket()) >- .WillRepeatedly(Return(2)); >- EXPECT_CALL(*speech_encoder, SampleRateHz()).WillRepeatedly(Return(8000)); >- EXPECT_CALL(*speech_encoder, SetFec(false)).WillRepeatedly(Return(true)); >- return speech_encoder; >- }; >- auto speech_encoder1 = make_enc(); >- auto speech_encoder2 = make_enc(); >- Marker marker; >- { >- ::testing::InSequence s; >- EXPECT_CALL(marker, Mark("disabled")); >- EXPECT_CALL(marker, Mark("enabled")); >- if (use_cng || use_red) >- EXPECT_CALL(*speech_encoder2, Reset()); >- } >- >- RentACodec::StackParameters param1, param2; >- param1.speech_encoder = std::move(speech_encoder1); >- param2.speech_encoder = std::move(speech_encoder2); >- param2.use_cng = use_cng; >- param2.use_red = use_red; >- marker.Mark("disabled"); >- RentACodec rac; >- rac.RentEncoderStack(¶m1); >- marker.Mark("enabled"); >- rac.RentEncoderStack(¶m2); >-} >- >-TEST(RentACodecTest, CngResetsSpeechEncoder) { >- TestCngAndRedResetSpeechEncoder(true, false); >-} >- >-TEST(RentACodecTest, RedResetsSpeechEncoder) { >- TestCngAndRedResetSpeechEncoder(false, true); >-} >- >-TEST(RentACodecTest, CngAndRedResetsSpeechEncoder) { >- TestCngAndRedResetSpeechEncoder(true, true); >-} >- >-TEST(RentACodecTest, NoCngAndRedNoSpeechEncoderReset) { >- TestCngAndRedResetSpeechEncoder(false, false); >-} >- >-TEST(RentACodecTest, RentEncoderError) { >- const CodecInst codec_inst = { >- 0, "Robert'); DROP TABLE Students;", 8000, 160, 
1, 64000}; >- RentACodec rent_a_codec; >- EXPECT_FALSE(rent_a_codec.RentEncoder(codec_inst)); >-} >- >-TEST(RentACodecTest, RentEncoderStackWithoutSpeechEncoder) { >- RentACodec::StackParameters sp; >- EXPECT_EQ(nullptr, sp.speech_encoder); >- EXPECT_EQ(nullptr, RentACodec().RentEncoderStack(&sp)); >-} >- >-} // namespace acm2 >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/audio_network_adaptor_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/audio_network_adaptor_impl.cc >index c4832a36de4a557e577e6e5d73476060598288ec..9e47a06f7cac9c3bdfafe549f114c3697fe3232a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/audio_network_adaptor_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/audio_network_adaptor_impl.cc >@@ -10,9 +10,14 @@ > > #include "modules/audio_coding/audio_network_adaptor/audio_network_adaptor_impl.h" > >+#include <stdint.h> > #include <utility> >+#include <vector> > >-#include "rtc_base/logging.h" >+#include "modules/audio_coding/audio_network_adaptor/controller_manager.h" >+#include "modules/audio_coding/audio_network_adaptor/debug_dump_writer.h" >+#include "modules/audio_coding/audio_network_adaptor/event_log_writer.h" >+#include "rtc_base/checks.h" > #include "rtc_base/timeutils.h" > #include "system_wrappers/include/field_trial.h" > >@@ -41,17 +46,7 @@ AudioNetworkAdaptorImpl::AudioNetworkAdaptorImpl( > kEventLogMinBitrateChangeBps, > kEventLogMinBitrateChangeFraction, > kEventLogMinPacketLossChangeFraction) >- : nullptr), >- enable_bitrate_adaptation_( >- webrtc::field_trial::IsEnabled("WebRTC-Audio-BitrateAdaptation")), >- enable_dtx_adaptation_( >- webrtc::field_trial::IsEnabled("WebRTC-Audio-DtxAdaptation")), >- enable_fec_adaptation_( >- webrtc::field_trial::IsEnabled("WebRTC-Audio-FecAdaptation")), >- 
enable_channel_adaptation_( >- webrtc::field_trial::IsEnabled("WebRTC-Audio-ChannelAdaptation")), >- enable_frame_length_adaptation_(webrtc::field_trial::IsEnabled( >- "WebRTC-Audio-FrameLengthAdaptation")) { >+ : nullptr) { > RTC_DCHECK(controller_manager_); > } > >@@ -152,24 +147,6 @@ AudioEncoderRuntimeConfig AudioNetworkAdaptorImpl::GetEncoderRuntimeConfig() { > } > prev_config_ = config; > >- // Prevent certain controllers from taking action (determined by field trials) >- if (!enable_bitrate_adaptation_ && config.bitrate_bps) { >- config.bitrate_bps.reset(); >- } >- if (!enable_dtx_adaptation_ && config.enable_dtx) { >- config.enable_dtx.reset(); >- } >- if (!enable_fec_adaptation_ && config.enable_fec) { >- config.enable_fec.reset(); >- config.uplink_packet_loss_fraction.reset(); >- } >- if (!enable_frame_length_adaptation_ && config.frame_length_ms) { >- config.frame_length_ms.reset(); >- } >- if (!enable_channel_adaptation_ && config.num_channels) { >- config.num_channels.reset(); >- } >- > if (debug_dump_writer_) > debug_dump_writer_->DumpEncoderRuntimeConfig(config, rtc::TimeMillis()); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/audio_network_adaptor_impl.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/audio_network_adaptor_impl.h >index e208ed2607c344632ba76718fbf0342637eae4ac..4c1c19b7ea28cb1c5e7ad760be0a41ee1add02ab 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/audio_network_adaptor_impl.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/audio_network_adaptor_impl.h >@@ -11,17 +11,21 @@ > #ifndef MODULES_AUDIO_CODING_AUDIO_NETWORK_ADAPTOR_AUDIO_NETWORK_ADAPTOR_IMPL_H_ > #define MODULES_AUDIO_CODING_AUDIO_NETWORK_ADAPTOR_AUDIO_NETWORK_ADAPTOR_IMPL_H_ > >+#include <stdio.h> > #include <memory> > >+#include "absl/types/optional.h" >+#include 
"api/audio_codecs/audio_encoder.h" > #include "modules/audio_coding/audio_network_adaptor/controller.h" >-#include "modules/audio_coding/audio_network_adaptor/controller_manager.h" > #include "modules/audio_coding/audio_network_adaptor/debug_dump_writer.h" >-#include "modules/audio_coding/audio_network_adaptor/event_log_writer.h" > #include "modules/audio_coding/audio_network_adaptor/include/audio_network_adaptor.h" >+#include "modules/audio_coding/audio_network_adaptor/include/audio_network_adaptor_config.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { > >+class ControllerManager; >+class EventLogWriter; > class RtcEventLog; > > class AudioNetworkAdaptorImpl final : public AudioNetworkAdaptor { >@@ -79,12 +83,6 @@ class AudioNetworkAdaptorImpl final : public AudioNetworkAdaptor { > > ANAStats stats_; > >- const bool enable_bitrate_adaptation_; >- const bool enable_dtx_adaptation_; >- const bool enable_fec_adaptation_; >- const bool enable_channel_adaptation_; >- const bool enable_frame_length_adaptation_; >- > RTC_DISALLOW_COPY_AND_ASSIGN(AudioNetworkAdaptorImpl); > }; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/audio_network_adaptor_impl_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/audio_network_adaptor_impl_unittest.cc >index 5948ac36440da0f02c4a2fc5faddc047efa03954..be9550a7a763e5de345cc2073e0bc65acc5b801a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/audio_network_adaptor_impl_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/audio_network_adaptor_impl_unittest.cc >@@ -51,7 +51,7 @@ MATCHER_P(IsRtcEventAnaConfigEqualTo, config, "") { > return false; > } > auto ana_event = static_cast<RtcEventAudioNetworkAdaptation*>(arg); >- return *ana_event->config_ == config; >+ return ana_event->config() == config; > } > > 
MATCHER_P(EncoderRuntimeConfigIs, config, "") { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/bitrate_controller.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/bitrate_controller.h >index 282f599b6da3025547c5020aed7fd259cbd6dbe0..6b6330b030152b21e2ed8205582d77740fc26d1b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/bitrate_controller.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/bitrate_controller.h >@@ -11,7 +11,11 @@ > #ifndef MODULES_AUDIO_CODING_AUDIO_NETWORK_ADAPTOR_BITRATE_CONTROLLER_H_ > #define MODULES_AUDIO_CODING_AUDIO_NETWORK_ADAPTOR_BITRATE_CONTROLLER_H_ > >+#include <stddef.h> >+ >+#include "absl/types/optional.h" > #include "modules/audio_coding/audio_network_adaptor/controller.h" >+#include "modules/audio_coding/audio_network_adaptor/include/audio_network_adaptor_config.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/channel_controller.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/channel_controller.h >index 23cbef663b4fbfc5551c237278e50f271e5deac1..0d775b1c77f1c97311467d7897baf89ea447b7e7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/channel_controller.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/channel_controller.h >@@ -11,7 +11,11 @@ > #ifndef MODULES_AUDIO_CODING_AUDIO_NETWORK_ADAPTOR_CHANNEL_CONTROLLER_H_ > #define MODULES_AUDIO_CODING_AUDIO_NETWORK_ADAPTOR_CHANNEL_CONTROLLER_H_ > >+#include <stddef.h> >+ >+#include "absl/types/optional.h" > #include "modules/audio_coding/audio_network_adaptor/controller.h" >+#include 
"modules/audio_coding/audio_network_adaptor/include/audio_network_adaptor_config.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/debug_dump_writer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/debug_dump_writer.cc >index 818362ed0da7ea3a3a7cf7a339ab267a6037d035..8e04e8eafc9371f1ff35471906b87fd92b305a78 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/debug_dump_writer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/debug_dump_writer.cc >@@ -10,10 +10,12 @@ > > #include "modules/audio_coding/audio_network_adaptor/debug_dump_writer.h" > >+#include "absl/types/optional.h" > #include "rtc_base/checks.h" > #include "rtc_base/ignore_wundef.h" > #include "rtc_base/numerics/safe_conversions.h" > #include "rtc_base/protobuf_utils.h" >+#include "rtc_base/system/file_wrapper.h" > > #if WEBRTC_ENABLE_PROTOBUF > RTC_PUSH_IGNORING_WUNDEF() >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/dtx_controller.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/dtx_controller.h >index fb40db28b20cca73e959322f42d4d4dba87afcd0..d3334eca9ef001b52840c4bf507fff0e2cfd10db 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/dtx_controller.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/dtx_controller.h >@@ -11,7 +11,9 @@ > #ifndef MODULES_AUDIO_CODING_AUDIO_NETWORK_ADAPTOR_DTX_CONTROLLER_H_ > #define MODULES_AUDIO_CODING_AUDIO_NETWORK_ADAPTOR_DTX_CONTROLLER_H_ > >+#include "absl/types/optional.h" > #include "modules/audio_coding/audio_network_adaptor/controller.h" >+#include "modules/audio_coding/audio_network_adaptor/include/audio_network_adaptor_config.h" > 
#include "rtc_base/constructormagic.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/event_log_writer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/event_log_writer.cc >index 4a92343d9688bd09aaf98e3520a61579d3096033..7925b891277501de0dc62b704e99d5998c48d433 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/event_log_writer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/event_log_writer.cc >@@ -9,13 +9,17 @@ > */ > > #include <math.h> >- > #include <algorithm> >+#include <cstdlib> >+#include <utility> > > #include "absl/memory/memory.h" >+#include "absl/types/optional.h" >+#include "logging/rtc_event_log/events/rtc_event.h" > #include "logging/rtc_event_log/events/rtc_event_audio_network_adaptation.h" > #include "logging/rtc_event_log/rtc_event_log.h" > #include "modules/audio_coding/audio_network_adaptor/event_log_writer.h" >+#include "rtc_base/checks.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/event_log_writer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/event_log_writer.h >index fca8e53b664bf1bf3de89e50e98a82e8b0909267..72b5245e115a6ffd50205d10a3c8cba9e1dc899c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/event_log_writer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/event_log_writer.h >@@ -11,7 +11,7 @@ > #ifndef MODULES_AUDIO_CODING_AUDIO_NETWORK_ADAPTOR_EVENT_LOG_WRITER_H_ > #define MODULES_AUDIO_CODING_AUDIO_NETWORK_ADAPTOR_EVENT_LOG_WRITER_H_ > >-#include "modules/audio_coding/audio_network_adaptor/include/audio_network_adaptor.h" >+#include 
"modules/audio_coding/audio_network_adaptor/include/audio_network_adaptor_config.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/event_log_writer_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/event_log_writer_unittest.cc >index df975947ca13a1f43ac8ceb8e75a5bf4ccd52446..42189c3e11c321fa03639a673dca22306779e0cf 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/event_log_writer_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/event_log_writer_unittest.cc >@@ -36,7 +36,7 @@ MATCHER_P(IsRtcEventAnaConfigEqualTo, config, "") { > return false; > } > auto ana_event = static_cast<RtcEventAudioNetworkAdaptation*>(arg); >- return *ana_event->config_ == config; >+ return ana_event->config() == config; > } > > struct EventLogWriterStates { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/fec_controller_plr_based.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/fec_controller_plr_based.cc >index 7ab72c9437e37219d24fd1e0a12799d4b5d87d88..936e22429ac88854fcd72eca75217e7d326aa10c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/fec_controller_plr_based.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/fec_controller_plr_based.cc >@@ -10,7 +10,7 @@ > > #include "modules/audio_coding/audio_network_adaptor/fec_controller_plr_based.h" > >-#include <limits> >+#include <string> > #include <utility> > > #include "rtc_base/checks.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/fec_controller_plr_based.h 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/fec_controller_plr_based.h >index b66883ef552e514b038e82b25e4d6dc9727c64ad..b7d3d56173e6ba8de2d6cb35203b9ce31bd37e2e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/fec_controller_plr_based.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/fec_controller_plr_based.h >@@ -13,8 +13,10 @@ > > #include <memory> > >+#include "absl/types/optional.h" > #include "common_audio/smoothing_filter.h" > #include "modules/audio_coding/audio_network_adaptor/controller.h" >+#include "modules/audio_coding/audio_network_adaptor/include/audio_network_adaptor_config.h" > #include "modules/audio_coding/audio_network_adaptor/util/threshold_curve.h" > #include "rtc_base/constructormagic.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/fec_controller_rplr_based.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/fec_controller_rplr_based.cc >index c8cfd31607c46e30a4cefc5a41fc349457b26414..6c30b8f2c0743e74008e6a22d41e6018a8c033d6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/fec_controller_rplr_based.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/fec_controller_rplr_based.cc >@@ -10,9 +10,6 @@ > > #include "modules/audio_coding/audio_network_adaptor/fec_controller_rplr_based.h" > >-#include <limits> >-#include <utility> >- > #include "rtc_base/checks.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/fec_controller_rplr_based.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/fec_controller_rplr_based.h >index 9a3c37c3d1d8ce06c208af72720fa7d1686a1e13..421cb70abd18baaabcfc6a0fadf43764f87b05ec 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/fec_controller_rplr_based.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/fec_controller_rplr_based.h >@@ -11,9 +11,9 @@ > #ifndef MODULES_AUDIO_CODING_AUDIO_NETWORK_ADAPTOR_FEC_CONTROLLER_RPLR_BASED_H_ > #define MODULES_AUDIO_CODING_AUDIO_NETWORK_ADAPTOR_FEC_CONTROLLER_RPLR_BASED_H_ > >-#include <memory> >- >+#include "absl/types/optional.h" > #include "modules/audio_coding/audio_network_adaptor/controller.h" >+#include "modules/audio_coding/audio_network_adaptor/include/audio_network_adaptor_config.h" > #include "modules/audio_coding/audio_network_adaptor/util/threshold_curve.h" > #include "rtc_base/constructormagic.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/frame_length_controller.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/frame_length_controller.cc >index 40e97cb371e43c270f771ecf060e1f0e63615f02..b123c7c70ed5e87f825863c09a2c3fa686664fe8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/frame_length_controller.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/frame_length_controller.cc >@@ -11,10 +11,10 @@ > #include "modules/audio_coding/audio_network_adaptor/frame_length_controller.h" > > #include <algorithm> >+#include <iterator> > #include <utility> > > #include "rtc_base/checks.h" >-#include "rtc_base/logging.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/frame_length_controller.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/frame_length_controller.h >index f084fd0382f1607959d0c018c5df42b695bab926..f0a5aaba9a2a6a2a7d4653756400414a6e1b8e8e 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/frame_length_controller.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/frame_length_controller.h >@@ -11,10 +11,13 @@ > #ifndef MODULES_AUDIO_CODING_AUDIO_NETWORK_ADAPTOR_FRAME_LENGTH_CONTROLLER_H_ > #define MODULES_AUDIO_CODING_AUDIO_NETWORK_ADAPTOR_FRAME_LENGTH_CONTROLLER_H_ > >+#include <stddef.h> > #include <map> > #include <vector> > >+#include "absl/types/optional.h" > #include "modules/audio_coding/audio_network_adaptor/controller.h" >+#include "modules/audio_coding/audio_network_adaptor/include/audio_network_adaptor_config.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/include/audio_network_adaptor_config.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/include/audio_network_adaptor_config.h >index 257a79af18d9194f1d0065eb0aadd8c39689390c..94e8ed961e7e683bf0d89be57b10e2be347cefac 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/include/audio_network_adaptor_config.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/audio_network_adaptor/include/audio_network_adaptor_config.h >@@ -11,6 +11,8 @@ > #ifndef MODULES_AUDIO_CODING_AUDIO_NETWORK_ADAPTOR_INCLUDE_AUDIO_NETWORK_ADAPTOR_CONFIG_H_ > #define MODULES_AUDIO_CODING_AUDIO_NETWORK_ADAPTOR_INCLUDE_AUDIO_NETWORK_ADAPTOR_CONFIG_H_ > >+#include <stddef.h> >+ > #include "absl/types/optional.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/audio_format_conversion.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/audio_format_conversion.cc >index e38aa3333505c76420fb54f7892ef6497f87fcf1..f068301adf9168576cb11306fdbcffa71e564551 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/audio_format_conversion.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/audio_format_conversion.cc >@@ -11,11 +11,12 @@ > #include "modules/audio_coding/codecs/audio_format_conversion.h" > > #include <string.h> >+#include <string> >+#include <utility> > >-#include "absl/types/optional.h" >+#include "absl/strings/match.h" > #include "api/array_view.h" > #include "rtc_base/checks.h" >-#include "rtc_base/numerics/safe_conversions.h" > #include "rtc_base/sanitizer.h" > > namespace webrtc { >@@ -41,11 +42,11 @@ CodecInst MakeCodecInst(int payload_type, > } // namespace > > SdpAudioFormat CodecInstToSdp(const CodecInst& ci) { >- if (STR_CASE_CMP(ci.plname, "g722") == 0) { >+ if (absl::EqualsIgnoreCase(ci.plname, "g722")) { > RTC_CHECK_EQ(16000, ci.plfreq); > RTC_CHECK(ci.channels == 1 || ci.channels == 2); > return {"g722", 8000, ci.channels}; >- } else if (STR_CASE_CMP(ci.plname, "opus") == 0) { >+ } else if (absl::EqualsIgnoreCase(ci.plname, "opus")) { > RTC_CHECK_EQ(48000, ci.plfreq); > RTC_CHECK(ci.channels == 1 || ci.channels == 2); > return ci.channels == 1 >@@ -57,12 +58,12 @@ SdpAudioFormat CodecInstToSdp(const CodecInst& ci) { > } > > CodecInst SdpToCodecInst(int payload_type, const SdpAudioFormat& audio_format) { >- if (STR_CASE_CMP(audio_format.name.c_str(), "g722") == 0) { >+ if (absl::EqualsIgnoreCase(audio_format.name, "g722")) { > RTC_CHECK_EQ(8000, audio_format.clockrate_hz); > RTC_CHECK(audio_format.num_channels == 1 || audio_format.num_channels == 2); > return MakeCodecInst(payload_type, "g722", 16000, > audio_format.num_channels); >- } else if (STR_CASE_CMP(audio_format.name.c_str(), "opus") == 0) { >+ } else if (absl::EqualsIgnoreCase(audio_format.name, "opus")) { > RTC_CHECK_EQ(48000, audio_format.clockrate_hz); > RTC_CHECK_EQ(2, audio_format.num_channels); > const int num_channels = [&] { >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/audio_encoder_cng.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/audio_encoder_cng.cc >index 055190497a54c28436ed401295d2a01516ee4360..476954fe60d06d0478cbb5bd47d27d3c89d9ce9d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/audio_encoder_cng.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/audio_encoder_cng.cc >@@ -10,40 +10,72 @@ > > #include "modules/audio_coding/codecs/cng/audio_encoder_cng.h" > >-#include <algorithm> >-#include <limits> >+#include <cstdint> > #include <memory> > #include <utility> > >+#include "absl/memory/memory.h" >+#include "modules/audio_coding/codecs/cng/webrtc_cng.h" >+#include "rtc_base/checks.h" >+ > namespace webrtc { > > namespace { > > const int kMaxFrameSizeMs = 60; > >-} // namespace >- >-AudioEncoderCng::Config::Config() = default; >-AudioEncoderCng::Config::Config(Config&&) = default; >-AudioEncoderCng::Config::~Config() = default; >- >-bool AudioEncoderCng::Config::IsOk() const { >- if (num_channels != 1) >- return false; >- if (!speech_encoder) >- return false; >- if (num_channels != speech_encoder->NumChannels()) >- return false; >- if (sid_frame_interval_ms < >- static_cast<int>(speech_encoder->Max10MsFramesInAPacket() * 10)) >- return false; >- if (num_cng_coefficients > WEBRTC_CNG_MAX_LPC_ORDER || >- num_cng_coefficients <= 0) >- return false; >- return true; >-} >- >-AudioEncoderCng::AudioEncoderCng(Config&& config) >+class AudioEncoderCng final : public AudioEncoder { >+ public: >+ explicit AudioEncoderCng(AudioEncoderCngConfig&& config); >+ ~AudioEncoderCng() override; >+ >+ // Not copyable or moveable. 
>+ AudioEncoderCng(const AudioEncoderCng&) = delete; >+ AudioEncoderCng(AudioEncoderCng&&) = delete; >+ AudioEncoderCng& operator=(const AudioEncoderCng&) = delete; >+ AudioEncoderCng& operator=(AudioEncoderCng&&) = delete; >+ >+ int SampleRateHz() const override; >+ size_t NumChannels() const override; >+ int RtpTimestampRateHz() const override; >+ size_t Num10MsFramesInNextPacket() const override; >+ size_t Max10MsFramesInAPacket() const override; >+ int GetTargetBitrate() const override; >+ EncodedInfo EncodeImpl(uint32_t rtp_timestamp, >+ rtc::ArrayView<const int16_t> audio, >+ rtc::Buffer* encoded) override; >+ void Reset() override; >+ bool SetFec(bool enable) override; >+ bool SetDtx(bool enable) override; >+ bool SetApplication(Application application) override; >+ void SetMaxPlaybackRate(int frequency_hz) override; >+ rtc::ArrayView<std::unique_ptr<AudioEncoder>> ReclaimContainedEncoders() >+ override; >+ void OnReceivedUplinkPacketLossFraction( >+ float uplink_packet_loss_fraction) override; >+ void OnReceivedUplinkRecoverablePacketLossFraction( >+ float uplink_recoverable_packet_loss_fraction) override; >+ void OnReceivedUplinkBandwidth( >+ int target_audio_bitrate_bps, >+ absl::optional<int64_t> bwe_period_ms) override; >+ >+ private: >+ EncodedInfo EncodePassive(size_t frames_to_encode, rtc::Buffer* encoded); >+ EncodedInfo EncodeActive(size_t frames_to_encode, rtc::Buffer* encoded); >+ size_t SamplesPer10msFrame() const; >+ >+ std::unique_ptr<AudioEncoder> speech_encoder_; >+ const int cng_payload_type_; >+ const int num_cng_coefficients_; >+ const int sid_frame_interval_ms_; >+ std::vector<int16_t> speech_buffer_; >+ std::vector<uint32_t> rtp_timestamps_; >+ bool last_frame_active_; >+ std::unique_ptr<Vad> vad_; >+ std::unique_ptr<ComfortNoiseEncoder> cng_encoder_; >+}; >+ >+AudioEncoderCng::AudioEncoderCng(AudioEncoderCngConfig&& config) > : speech_encoder_((static_cast<void>([&] { > RTC_CHECK(config.IsOk()) << "Invalid configuration."; > }()), >@@ 
-261,4 +293,31 @@ size_t AudioEncoderCng::SamplesPer10msFrame() const { > return rtc::CheckedDivExact(10 * SampleRateHz(), 1000); > } > >+} // namespace >+ >+AudioEncoderCngConfig::AudioEncoderCngConfig() = default; >+AudioEncoderCngConfig::AudioEncoderCngConfig(AudioEncoderCngConfig&&) = default; >+AudioEncoderCngConfig::~AudioEncoderCngConfig() = default; >+ >+bool AudioEncoderCngConfig::IsOk() const { >+ if (num_channels != 1) >+ return false; >+ if (!speech_encoder) >+ return false; >+ if (num_channels != speech_encoder->NumChannels()) >+ return false; >+ if (sid_frame_interval_ms < >+ static_cast<int>(speech_encoder->Max10MsFramesInAPacket() * 10)) >+ return false; >+ if (num_cng_coefficients > WEBRTC_CNG_MAX_LPC_ORDER || >+ num_cng_coefficients <= 0) >+ return false; >+ return true; >+} >+ >+std::unique_ptr<AudioEncoder> CreateComfortNoiseEncoder( >+ AudioEncoderCngConfig&& config) { >+ return absl::make_unique<AudioEncoderCng>(std::move(config)); >+} >+ > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/audio_encoder_cng.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/audio_encoder_cng.h >index e4c65077651c0536ec23e0c9cebc99595fa474c3..2ef32364c7ea9df4ec16874b112005599f6208cf 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/audio_encoder_cng.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/audio_encoder_cng.h >@@ -11,86 +11,38 @@ > #ifndef MODULES_AUDIO_CODING_CODECS_CNG_AUDIO_ENCODER_CNG_H_ > #define MODULES_AUDIO_CODING_CODECS_CNG_AUDIO_ENCODER_CNG_H_ > >+#include <stddef.h> > #include <memory> >-#include <vector> > > #include "api/audio_codecs/audio_encoder.h" > #include "common_audio/vad/include/vad.h" >-#include "modules/audio_coding/codecs/cng/webrtc_cng.h" >-#include "rtc_base/constructormagic.h" > > namespace webrtc { > >-class Vad; >- >-class AudioEncoderCng final : public AudioEncoder { 
>- public: >- struct Config { >- Config(); >- Config(Config&&); >- ~Config(); >- bool IsOk() const; >- >- size_t num_channels = 1; >- int payload_type = 13; >- std::unique_ptr<AudioEncoder> speech_encoder; >- Vad::Aggressiveness vad_mode = Vad::kVadNormal; >- int sid_frame_interval_ms = 100; >- int num_cng_coefficients = 8; >- // The Vad pointer is mainly for testing. If a NULL pointer is passed, the >- // AudioEncoderCng creates (and destroys) a Vad object internally. If an >- // object is passed, the AudioEncoderCng assumes ownership of the Vad >- // object. >- Vad* vad = nullptr; >- }; >- >- explicit AudioEncoderCng(Config&& config); >- ~AudioEncoderCng() override; >- >- int SampleRateHz() const override; >- size_t NumChannels() const override; >- int RtpTimestampRateHz() const override; >- size_t Num10MsFramesInNextPacket() const override; >- size_t Max10MsFramesInAPacket() const override; >- int GetTargetBitrate() const override; >- EncodedInfo EncodeImpl(uint32_t rtp_timestamp, >- rtc::ArrayView<const int16_t> audio, >- rtc::Buffer* encoded) override; >- void Reset() override; >- bool SetFec(bool enable) override; >- bool SetDtx(bool enable) override; >- bool SetApplication(Application application) override; >- void SetMaxPlaybackRate(int frequency_hz) override; >- rtc::ArrayView<std::unique_ptr<AudioEncoder>> ReclaimContainedEncoders() >- override; >- void OnReceivedUplinkPacketLossFraction( >- float uplink_packet_loss_fraction) override; >- void OnReceivedUplinkRecoverablePacketLossFraction( >- float uplink_recoverable_packet_loss_fraction) override; >- void OnReceivedUplinkBandwidth( >- int target_audio_bitrate_bps, >- absl::optional<int64_t> bwe_period_ms) override; >- >- private: >- EncodedInfo EncodePassive(size_t frames_to_encode, >- rtc::Buffer* encoded); >- EncodedInfo EncodeActive(size_t frames_to_encode, >- rtc::Buffer* encoded); >- size_t SamplesPer10msFrame() const; >- >- std::unique_ptr<AudioEncoder> speech_encoder_; >- const int 
cng_payload_type_; >- const int num_cng_coefficients_; >- const int sid_frame_interval_ms_; >- std::vector<int16_t> speech_buffer_; >- std::vector<uint32_t> rtp_timestamps_; >- bool last_frame_active_; >- std::unique_ptr<Vad> vad_; >- std::unique_ptr<ComfortNoiseEncoder> cng_encoder_; >- >- RTC_DISALLOW_COPY_AND_ASSIGN(AudioEncoderCng); >+struct AudioEncoderCngConfig { >+ // Moveable, not copyable. >+ AudioEncoderCngConfig(); >+ AudioEncoderCngConfig(AudioEncoderCngConfig&&); >+ ~AudioEncoderCngConfig(); >+ >+ bool IsOk() const; >+ >+ size_t num_channels = 1; >+ int payload_type = 13; >+ std::unique_ptr<AudioEncoder> speech_encoder; >+ Vad::Aggressiveness vad_mode = Vad::kVadNormal; >+ int sid_frame_interval_ms = 100; >+ int num_cng_coefficients = 8; >+ // The Vad pointer is mainly for testing. If a NULL pointer is passed, the >+ // AudioEncoderCng creates (and destroys) a Vad object internally. If an >+ // object is passed, the AudioEncoderCng assumes ownership of the Vad >+ // object. 
>+ Vad* vad = nullptr; > }; > >+std::unique_ptr<AudioEncoder> CreateComfortNoiseEncoder( >+ AudioEncoderCngConfig&& config); >+ > } // namespace webrtc > > #endif // MODULES_AUDIO_CODING_CODECS_CNG_AUDIO_ENCODER_CNG_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/audio_encoder_cng_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/audio_encoder_cng_unittest.cc >index a76dcbda2989f9e1f07c3a257852e6c1f9aeb980..e3655b464a5cef88a6dbe0630fd8ceb364371b27 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/audio_encoder_cng_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/audio_encoder_cng_unittest.cc >@@ -50,8 +50,8 @@ class AudioEncoderCngTest : public ::testing::Test { > cng_.reset(); > } > >- AudioEncoderCng::Config MakeCngConfig() { >- AudioEncoderCng::Config config; >+ AudioEncoderCngConfig MakeCngConfig() { >+ AudioEncoderCngConfig config; > config.speech_encoder = std::move(mock_encoder_owner_); > EXPECT_TRUE(config.speech_encoder); > >@@ -63,7 +63,7 @@ class AudioEncoderCngTest : public ::testing::Test { > return config; > } > >- void CreateCng(AudioEncoderCng::Config&& config) { >+ void CreateCng(AudioEncoderCngConfig&& config) { > num_audio_samples_10ms_ = static_cast<size_t>(10 * sample_rate_hz_ / 1000); > ASSERT_LE(num_audio_samples_10ms_, kMaxNumSamples); > if (config.speech_encoder) { >@@ -75,7 +75,7 @@ class AudioEncoderCngTest : public ::testing::Test { > EXPECT_CALL(*mock_encoder_, Max10MsFramesInAPacket()) > .WillOnce(Return(1u)); > } >- cng_.reset(new AudioEncoderCng(std::move(config))); >+ cng_ = CreateComfortNoiseEncoder(std::move(config)); > } > > void Encode() { >@@ -193,7 +193,7 @@ class AudioEncoderCngTest : public ::testing::Test { > return encoded_info_.payload_type != kCngPayloadType; > } > >- std::unique_ptr<AudioEncoderCng> cng_; >+ std::unique_ptr<AudioEncoder> cng_; > 
std::unique_ptr<MockAudioEncoder> mock_encoder_owner_; > MockAudioEncoder* mock_encoder_; > MockVad* mock_vad_; // Ownership is transferred to |cng_|. >@@ -432,7 +432,7 @@ class AudioEncoderCngDeathTest : public AudioEncoderCngTest { > // deleted. > void TearDown() override { cng_.reset(); } > >- AudioEncoderCng::Config MakeCngConfig() { >+ AudioEncoderCngConfig MakeCngConfig() { > // Don't provide a Vad mock object, since it would leak when the test dies. > auto config = AudioEncoderCngTest::MakeCngConfig(); > config.vad = nullptr; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/webrtc_cng.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/webrtc_cng.cc >index a07b093b3c7095bbdd4582726a63c9ed7f41aea7..f18fb28e98807b3012dbddc06206093dbe03a598 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/webrtc_cng.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/webrtc_cng.cc >@@ -13,6 +13,7 @@ > #include <algorithm> > > #include "common_audio/signal_processing/include/signal_processing_library.h" >+#include "rtc_base/checks.h" > #include "rtc_base/numerics/safe_conversions.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/webrtc_cng.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/webrtc_cng.h >index 3f8fadcd71b4dc7774d9b859e9c46745199bb7a6..6ff75298afa16a41b76e12eeebfd74739ab57302 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/webrtc_cng.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/webrtc_cng.h >@@ -11,6 +11,7 @@ > #ifndef MODULES_AUDIO_CODING_CODECS_CNG_WEBRTC_CNG_H_ > #define MODULES_AUDIO_CODING_CODECS_CNG_WEBRTC_CNG_H_ > >+#include <stdint.h> > #include <cstddef> > > #include "api/array_view.h" >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/g711/audio_decoder_pcm.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/g711/audio_decoder_pcm.cc >index 25f495f39d8187eba22c5ae1b2e04accafe9c5b8..d580a0509b4738d74b8f4dfd868f317fba13e0ef 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/g711/audio_decoder_pcm.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/g711/audio_decoder_pcm.cc >@@ -10,6 +10,8 @@ > > #include "modules/audio_coding/codecs/g711/audio_decoder_pcm.h" > >+#include <utility> >+ > #include "modules/audio_coding/codecs/g711/g711_interface.h" > #include "modules/audio_coding/codecs/legacy_encoded_audio_frame.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/g711/audio_decoder_pcm.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/g711/audio_decoder_pcm.h >index 29e4fa6593b93b7cbae26b26ce2132d31b64c050..9a01b8aff4ebf8d15c4cdeb1835e6d3408fd31ee 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/g711/audio_decoder_pcm.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/g711/audio_decoder_pcm.h >@@ -11,7 +11,12 @@ > #ifndef MODULES_AUDIO_CODING_CODECS_G711_AUDIO_DECODER_PCM_H_ > #define MODULES_AUDIO_CODING_CODECS_G711_AUDIO_DECODER_PCM_H_ > >+#include <stddef.h> >+#include <stdint.h> >+#include <vector> >+ > #include "api/audio_codecs/audio_decoder.h" >+#include "rtc_base/buffer.h" > #include "rtc_base/checks.h" > #include "rtc_base/constructormagic.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/g711/audio_encoder_pcm.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/g711/audio_encoder_pcm.cc >index c14287ed487e9dbb6f9ba13d23137e536fc82b14..dce16358ef3518fccc286ae21d04d92b92f4d64a 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/g711/audio_encoder_pcm.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/g711/audio_encoder_pcm.cc >@@ -10,10 +10,9 @@ > > #include "modules/audio_coding/codecs/g711/audio_encoder_pcm.h" > >-#include <algorithm> >-#include <limits> >+#include <cstdint> > >-#include "common_types.h" // NOLINT(build/include) >+#include "common_types.h" > #include "modules/audio_coding/codecs/g711/g711_interface.h" > #include "rtc_base/checks.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/g722/audio_decoder_g722.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/g722/audio_decoder_g722.cc >index ea4a721b12052565d9819f000417b029cc6767c2..4de55a0bb50fdf7631efe7d029fb2b79486914f7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/g722/audio_decoder_g722.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/g722/audio_decoder_g722.cc >@@ -11,6 +11,7 @@ > #include "modules/audio_coding/codecs/g722/audio_decoder_g722.h" > > #include <string.h> >+#include <utility> > > #include "modules/audio_coding/codecs/g722/g722_interface.h" > #include "modules/audio_coding/codecs/legacy_encoded_audio_frame.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/g722/audio_encoder_g722.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/g722/audio_encoder_g722.cc >index cb96c3c17ef81702649fc888ba3326de8d52f1d0..e63d5903090522985ad19fa4a64d02121359bb6f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/g722/audio_encoder_g722.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/g722/audio_encoder_g722.cc >@@ -10,10 +10,9 @@ > > #include "modules/audio_coding/codecs/g722/audio_encoder_g722.h" > >-#include <algorithm> >+#include <cstdint> > >-#include <limits> >-#include 
"common_types.h" // NOLINT(build/include) >+#include "common_types.h" > #include "modules/audio_coding/codecs/g722/g722_interface.h" > #include "rtc_base/checks.h" > #include "rtc_base/numerics/safe_conversions.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/ilbc/audio_decoder_ilbc.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/ilbc/audio_decoder_ilbc.cc >index 9e58ce03ca474bb0c9cd2f0c623591a75ae41885..57b5abbe23df7fc2a2a197fa880a6c1d839a8c00 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/ilbc/audio_decoder_ilbc.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/ilbc/audio_decoder_ilbc.cc >@@ -10,6 +10,7 @@ > > #include "modules/audio_coding/codecs/ilbc/audio_decoder_ilbc.h" > >+#include <memory> > #include <utility> > > #include "modules/audio_coding/codecs/ilbc/ilbc.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/ilbc/audio_decoder_ilbc.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/ilbc/audio_decoder_ilbc.h >index edb65d09c6d2fdacd78448b1d499a6814daf84bf..fcb2074dc95336018807da6d0c9dbe0c70085053 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/ilbc/audio_decoder_ilbc.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/ilbc/audio_decoder_ilbc.h >@@ -11,7 +11,12 @@ > #ifndef MODULES_AUDIO_CODING_CODECS_ILBC_AUDIO_DECODER_ILBC_H_ > #define MODULES_AUDIO_CODING_CODECS_ILBC_AUDIO_DECODER_ILBC_H_ > >+#include <stddef.h> >+#include <stdint.h> >+#include <vector> >+ > #include "api/audio_codecs/audio_decoder.h" >+#include "rtc_base/buffer.h" > #include "rtc_base/constructormagic.h" > > typedef struct iLBC_decinst_t_ IlbcDecoderInstance; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/ilbc/audio_encoder_ilbc.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/ilbc/audio_encoder_ilbc.cc >index dae956b30c0498ee040adad3d98b69755613a011..35e6fc64a62abac634a6f213a758e9628999d532 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/ilbc/audio_encoder_ilbc.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/ilbc/audio_encoder_ilbc.cc >@@ -11,8 +11,9 @@ > #include "modules/audio_coding/codecs/ilbc/audio_encoder_ilbc.h" > > #include <algorithm> >-#include <limits> >-#include "common_types.h" // NOLINT(build/include) >+#include <cstdint> >+ >+#include "common_types.h" > #include "modules/audio_coding/codecs/ilbc/ilbc.h" > #include "rtc_base/checks.h" > #include "rtc_base/numerics/safe_conversions.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/ilbc/test/iLBCtestscript.txt b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/ilbc/test/iLBCtestscript.txt >deleted file mode 100644 >index 99c6092116b7ff500ecf69cd398d74c81e9166bd..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/ilbc/test/iLBCtestscript.txt >+++ /dev/null >@@ -1,73 +0,0 @@ >-#!/bin/bash >-(set -o igncr) 2>/dev/null && set -o igncr; # force bash to ignore \r character >- >-# >-# This script can be used to verify the bit exactness of iLBC fixed-point version 1.0.6 >-# >- >-INP=../../../../../../resources/audio_coding >-EXEP=../../../../../../../out/Release >-OUTP=./GeneratedFiles >-mkdir ./GeneratedFiles >- >-$EXEP/iLBCtest 20 $INP/F00.INP $OUTP/F00.BIT20 $OUTP/F00.OUT20 $INP/clean.chn >-$EXEP/iLBCtest 20 $INP/F01.INP $OUTP/F01.BIT20 $OUTP/F01.OUT20 $INP/clean.chn >-$EXEP/iLBCtest 20 $INP/F02.INP $OUTP/F02.BIT20 $OUTP/F02.OUT20 $INP/clean.chn >-$EXEP/iLBCtest 20 $INP/F03.INP $OUTP/F03.BIT20 $OUTP/F03.OUT20 $INP/clean.chn >-$EXEP/iLBCtest 20 $INP/F04.INP $OUTP/F04.BIT20 $OUTP/F04.OUT20 $INP/clean.chn >-$EXEP/iLBCtest 
20 $INP/F05.INP $OUTP/F05.BIT20 $OUTP/F05.OUT20 $INP/clean.chn >-$EXEP/iLBCtest 20 $INP/F06.INP $OUTP/F06.BIT20 $OUTP/F06.OUT20 $INP/clean.chn >- >-$EXEP/iLBCtest 30 $INP/F00.INP $OUTP/F00.BIT30 $OUTP/F00.OUT30 $INP/clean.chn >-$EXEP/iLBCtest 30 $INP/F01.INP $OUTP/F01.BIT30 $OUTP/F01.OUT30 $INP/clean.chn >-$EXEP/iLBCtest 30 $INP/F02.INP $OUTP/F02.BIT30 $OUTP/F02.OUT30 $INP/clean.chn >-$EXEP/iLBCtest 30 $INP/F03.INP $OUTP/F03.BIT30 $OUTP/F03.OUT30 $INP/clean.chn >-$EXEP/iLBCtest 30 $INP/F04.INP $OUTP/F04.BIT30 $OUTP/F04.OUT30 $INP/clean.chn >-$EXEP/iLBCtest 30 $INP/F05.INP $OUTP/F05.BIT30 $OUTP/F05.OUT30 $INP/clean.chn >-$EXEP/iLBCtest 30 $INP/F06.INP $OUTP/F06.BIT30 $OUTP/F06.OUT30 $INP/clean.chn >- >-$EXEP/iLBCtest 20 $INP/F00.INP $OUTP/F00.BIT20 $OUTP/F00_tlm10.OUT20 $INP/tlm10.chn >-$EXEP/iLBCtest 20 $INP/F01.INP $OUTP/F01.BIT20 $OUTP/F01_tlm10.OUT20 $INP/tlm10.chn >-$EXEP/iLBCtest 20 $INP/F02.INP $OUTP/F02.BIT20 $OUTP/F02_tlm10.OUT20 $INP/tlm10.chn >-$EXEP/iLBCtest 30 $INP/F00.INP $OUTP/F00.BIT30 $OUTP/F00_tlm10.OUT30 $INP/tlm10.chn >-$EXEP/iLBCtest 30 $INP/F01.INP $OUTP/F01.BIT30 $OUTP/F01_tlm10.OUT30 $INP/tlm10.chn >-$EXEP/iLBCtest 30 $INP/F02.INP $OUTP/F02.BIT30 $OUTP/F02_tlm10.OUT30 $INP/tlm10.chn >- >- >-diff $OUTP/F00.BIT20 $INP/F00.BIT20 >-diff $OUTP/F01.BIT20 $INP/F01.BIT20 >-diff $OUTP/F02.BIT20 $INP/F02.BIT20 >-diff $OUTP/F03.BIT20 $INP/F03.BIT20 >-diff $OUTP/F04.BIT20 $INP/F04.BIT20 >-diff $OUTP/F05.BIT20 $INP/F05.BIT20 >-diff $OUTP/F06.BIT20 $INP/F06.BIT20 >-diff $OUTP/F00.OUT20 $INP/F00.OUT20 >-diff $OUTP/F01.OUT20 $INP/F01.OUT20 >-diff $OUTP/F02.OUT20 $INP/F02.OUT20 >-diff $OUTP/F03.OUT20 $INP/F03.OUT20 >-diff $OUTP/F04.OUT20 $INP/F04.OUT20 >-diff $OUTP/F05.OUT20 $INP/F05.OUT20 >-diff $OUTP/F06.OUT20 $INP/F06.OUT20 >- >-diff $OUTP/F00.BIT30 $INP/F00.BIT30 >-diff $OUTP/F01.BIT30 $INP/F01.BIT30 >-diff $OUTP/F02.BIT30 $INP/F02.BIT30 >-diff $OUTP/F03.BIT30 $INP/F03.BIT30 >-diff $OUTP/F04.BIT30 $INP/F04.BIT30 >-diff $OUTP/F05.BIT30 $INP/F05.BIT30 
>-diff $OUTP/F06.BIT30 $INP/F06.BIT30 >-diff $OUTP/F00.OUT30 $INP/F00.OUT30 >-diff $OUTP/F01.OUT30 $INP/F01.OUT30 >-diff $OUTP/F02.OUT30 $INP/F02.OUT30 >-diff $OUTP/F03.OUT30 $INP/F03.OUT30 >-diff $OUTP/F04.OUT30 $INP/F04.OUT30 >-diff $OUTP/F05.OUT30 $INP/F05.OUT30 >-diff $OUTP/F06.OUT30 $INP/F06.OUT30 >- >-diff $OUTP/F00_tlm10.OUT20 $INP/F00_tlm10.OUT20 >-diff $OUTP/F01_tlm10.OUT20 $INP/F01_tlm10.OUT20 >-diff $OUTP/F02_tlm10.OUT20 $INP/F02_tlm10.OUT20 >-diff $OUTP/F00_tlm10.OUT30 $INP/F00_tlm10.OUT30 >-diff $OUTP/F01_tlm10.OUT30 $INP/F01_tlm10.OUT30 >-diff $OUTP/F02_tlm10.OUT30 $INP/F02_tlm10.OUT30 >- >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/ChannelFiles.txt b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/ChannelFiles.txt >deleted file mode 100644 >index 05f7410141e8399f2b6b5bbe13f262f5d2886e80..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/ChannelFiles.txt >+++ /dev/null >@@ -1,3 +0,0 @@ >-bottlenecks.txt >-lowrates.txt >-tworates.txt >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/InputFiles.txt b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/InputFiles.txt >deleted file mode 100644 >index f26b7afb6ca859a5209c6ccf3799d5803be42a31..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/InputFiles.txt >+++ /dev/null >@@ -1,31 +0,0 @@ >-DTMF_16kHz_long.pcm >-DTMF_16kHz_short.pcm >-F00.INP >-F01.INP >-F02.INP >-F03.INP >-F04.INP >-F05.INP >-F06.INP >-longtest.pcm >-ltest_speech_clean.pcm >-ltest_music.pcm >-ltest_speech_noisy.pcm >-misc2.pcm >-purenb.pcm >-sawsweep_380_60.pcm >-sinesweep.pcm >-sinesweep_half.pcm >-speechmusic.pcm >-speechmusic_nb.pcm >-speechoffice0dB.pcm >-speech_and_misc_NB.pcm 
>-speech_and_misc_WB.pcm >-testM4.pcm >-testM4D_rev.pcm >-testM4D.pcm >-testfile.pcm >-tone_cisco.pcm >-tone_cisco_long.pcm >-wb_contspeech.pcm >-wb_speech_office25db.pcm >\ No newline at end of file >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/InputFilesFew.txt b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/InputFilesFew.txt >deleted file mode 100644 >index 08bbde30d7337b2836dc53a0053a3a309ce2c490..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/InputFilesFew.txt >+++ /dev/null >@@ -1,6 +0,0 @@ >-DTMF_16kHz_short.pcm >-ltest_speech_noisy.pcm >-misc2.pcm >-sinesweep.pcm >-speechmusic.pcm >-tone_cisco.pcm >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/ListOfTestCases.xls b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/ListOfTestCases.xls >deleted file mode 100644 >index f0889ef4eddb53b5295e4248059e3e345223f25b..0000000000000000000000000000000000000000 >GIT binary patch >literal 0 >HcmV?d00001 > >literal 152064 >zcmeIb3Ai0ql|6p$OG3gR#Uzsmk9kb;<|Kjy0V8M_L{wBXJdziP2``X5ASei^IHDq` >zIN&_bIL`_WC>q3>#;LJeyVbU})$Vp^F~7CW-S<@8Q~T6C=iKM_{rlfOzEfnaS9SK@ >zXRq2-_tt**e&g<4?|b71_Wg<RevUVD%wM-{Z?>!ObMWt02k&CcAppdG-L`F8hdv+w >z_-71IfB*SkumbnP$HS3C=5{W|JPhW1KE?u!?J;)1*b!qVj0a%sj8R}f0%li?-7t2? >z*aPE%7<*#ug|RorJ{bFAJP2byj0a=vk8uFTAjW|h2VoqHaR|nt7>8jTj&TIWkr+o| >z9F1`d#zKrm7>hB6ForQkFh()PFvc+^FqU8}#aM>%5R79nj>C8;#_<@-F;2jE7{-Yh >zCt;k7@o<b47>~d>1>;nVM`An*V<pCE7^h>LfpI3rqcP6Hcnrqb7>~tx9LD1@&cS#B >z#uG7~gmEs$lQEuxu?pinj7f~u7|g>O{C^5#Eye{H7h<f#cq+zvj13qUVN7Fe#JCva >z5{yk4n=xiEF2%SE<8q8G7*}9C4ddw;S7Ka+@eGWsF|NTl3o~uRk%NeQ8voy9kH9Y? 
>zfA)l8Ix-)?F@p=ZHRd+EwM*Il7q++V{r+3m?{jDC&=!8*-VN)t5jNMG^Q|9@GVNzK >zGtiogLpnah98J^PmbV?4hs<4yRlNwAVC<)271mmx$jo5<H`?_M&3gpwBOH7F(1*}| >zK7XGO^PswoPyl_HvKdJ8XDRG=Vvn`Y^*Vx1<7fK?IE%NNXWn_hUGr~8HQx@u{D~Pb >zx6ZvCTVwtS<6I~W=l$91rY8qCObtF_)8zD;!N2j(iPne77Jax4zCN_I0NxE?o#&cc >z@c&&<S@=8TAKTsS_WgG6PuuN7@tZ9;YhQojCyrnp;Fao}ffMY0FS6T%>~?2tTjpc- >znD1L=81?Cg*nfe!fon8iPMqI530l|yEu0UHtV7ND|IQv4LVZikr@08u(~HE<N$cq` >z${yXZ%<Z|y53xqRmF_vXV!QTSu|0b3aa_+mj_<KY3$?v_J$z4FxKcFl!D~r#_FK3z >z_L^gd9&-%fD%s1*f}U2k56-dw-2G6UaV0+Jkb@3B=HP>eRvmflxkq%sxkv6l_ds0h >z_v`xolg^r2^W^TG4#Ld<F58{s77caJ1I(!o#|d+gzn3?g?79S3bX>Dn(UqeHJDv$V >z_Xtazd*rb~9JKHt_`T|g6IQ{>BHR$*mSEKp)#HvDtR8gKVAm1Laod0!fU2=}1l{(m >zYZa5rmzPE-aQA`x3}>`+ytCUm*qSbnywBbd8S`J;261+Der_B5JI}S4{dH)W!OkM1 >z8n)->{j6R856iFSzTU2XoqunDe^-EiUx0s2fPa60e@}paZGeAafDic-n*T#~9^S8a >zug~iOeA}P;{%wEi{96L^d?3JoIKY24z`r2Ce<HxA`tuPx&yKbAY3qcqf9$%wY}Z|H >z-tp!g&D>Ym{daZz+O?lM;ByC<xgWFqJk$`o{~xh~{pslQ$N+y#fImv*`TEPw|DQ6w >zoL_wH)}7zBp67S?@7R|y2XxLyzUH(0w|Ru$_H+A^AH+WqzQ=oZalhD}$E{CW|G3-i >z%zqyKxo>A4zP9Z4^WCy;!r!v%u6ypdC)WAZmgj3wyFcb{uXB6-=i$0+d2gM!>+J9O >zxkrcp4qu8L@77_r4qw^-$_0CvE4e`K=Xmqudw;#<ss#(pRb4!H7{ERA9`<zsZzFk; >z#O39vopD7Ts2|Y2(@Z#^eZ!Y<z?@W<_$nbO2A0&M1DKPf11Q?01DKSg11R&P19*NV >z9l(N;bO6tcqyu>BB^|(XDd_+fzoY|>PCek5)B_f#9<V6&0Ng~T1<kjd%2nd|S1kgq >zs}?-Ws~Wh+M7`*o)BXVi3#6Rz#BARW4-VER!HsqANwzI-o`mnF_4H(9w0jc22##Tr >zaQrBOe*8nuDytjn@wV!M4a`OS{KXx=`TDkS{6}<6a*NqcpJdyYn<nAAhwb=j@Rq57 >zC)@t*mWn4Y_0l<xL+9Ica+2lCFAI3G^;K??`4ZyQS6{ttP=1(pPEh5xuW<O4{n<g` >zu3F$b#rH$2PA%7+`rYq-S95Cl^5r$Bs$9n@a^e%-I&M2(<?2r16<x)tKVKDCZq2Fc >zs`R%~z8d*1a;bruQypX5u4%cIsz-K|i;Sl;e^g1o^{sDplXNstUD}SZDi^=p{)x%= >zz}qJ#b%k7YYoHzl-#}}NhEddRXF4b5wryK$QGD=&AFPR{%5|b3hq>)fM>JKge#^8N >z`<78q2VF~h)q=-*=i_lWV6H2#wJ5mm46j<$xb7XUb3WGV)<HO`td;=vBzGnW*Dc-{ >zmy^`I;~LK~Q){c<)jioWW|FNdn<v2=LKRN{v~c|5@1H#f$4t_G+!S6DzL<(w6W(}+ >zC&Bw)y(ihavUw7`|5cd;Kwp#QNe+pbr2Y6U91dTYM63z#t;3V-9W%++mCcjv-!%z< >zz9!9+92zqTt4~uneDNExCe4%V6En%ymCcjj^<%{o0DVoGCpj!;lJ?`-aEkfDHeyZq 
>z3Mo7ZUY_@!Wb4Z2Nd~(n0npc^d6L6pCTTye4u``xsUp^dF9O4pJSb+8tt*=+Ik0OI >z0DVoWliW108=iXM#ofrzp@RztAGUh+)bz~ojSEgQH*)FZr!{x~Z4bb^>qB^wK8UB~ >zgLuNd8vnJQEw*1_wD2T<{@kgxKls^ix6R|fd9}e^z@dC^f@XI0&5YE|9MoW@{TjrY >znK!lecR%`}Ze};%%xK-rVGU;5uTrd;?WWfL^0z<I&Ft=*8LOK)s=*9j$IQW94;E_v >z)Y_lj_bc7Z9=@6Jx|u}{X83YJ%q&1|e*Jsh%maNh6Lm8q4QAT6wKkgVr`F#8i+go5 >zd(zBUv(gTLnXw3^ZC&~Ctv}Vx?B$zLly-1~nX1yZuKbrzd|WrPw{J#K+Tjgms!H3s >z@}8f5S~s(gZ$?qt(G6y*O53_}>uulH&Ft%&QIxj0!Aw<YTUUPcM?criJjgerC~dUC >zOjT)HSN`R@AJ)z6M>7-6N*jcki3p{QjK2Is_q5Hhpy9{Wv<LfU6r~-~V5X|Hk<oi^ >zez|UDf8UIvv?ChKRFyU|`uXp@S~qimZ$?qtF%4#_N*fvd;2XcJn;G=YC`ub@FjH09 >z$mmUXNrm=vbD(cVQQBC8nX1x8M*n=<4O%oi;qf%?yWDm2VcvD~!Q18sua)Tcy7n{r >zqz9LiLQ(#t2YQoMRfI<$XP%}{dPq4b^y5!@s5fa<Gse8;Q%}$*J+zz@YVjvM(wnrZ >z5M!=c^Co@L!^%ma4S&*w-lSDMpr`IVcj%KIUQP-n_>&HMlUB2D%pYE{LYs5}x(V?j >z{KUyMH!fJ#o%z;-w#`8g;G|jOuE6v7rU>34<lPbf&9@PylI(*11H6ZS(&UEKmu{Gx >zS+{Ze#s$}QPdKpWwjI$>$d(~Cc5T4F%0{yJ_D+J6^6jjS0lvLt4Qz+*O8%1XXSA>k >zRkWZBZQq4%8Ys|Jfu|8CT{yLR{oqMc8#dgy;A+pWy|(QDzb?eh;A*to4dUNZ@Q!VT >zrB{3_%H9-KPtuN8eE(@X=T#eZylSJ4S8a3?-n2dXnmOBUTrk+}%Dg!+auPb<+MRc> >zWw`G#*KCJ<kMQ>04t>bwzJutc9I}0s?PC^RGZ)7k<{dM4+X5VODf+cG*?ZsB($`2U >zi>{fEBNll_%-^<C?TBt$4@B1~-llo@mWww`PEXEk+;qj@*;89)Zd~w)?x_aGx9x&< >z$}MP{WA3MI{@Lbu1=jp*xU&U%;G0*S>@(RSuk8^V-ky2H#*Gl_vEB2w_T4rYjj3$d >z?3OCu5%0_eW1YEnM%M~nL7Xx*xn|w;1%r64;5j;H7p&|laERJp2S+b}gYDM{RsuL- >z4mkCy_H>lue0bI_gf|WBj29V|$x-|K$#+#IZ`Z%T<lC0N<F+3o5KcZ?og6jLpL{oE >z@^&2zOy0gV7_gI%RVPO+^a{e9-Id8%58D}=UtN})26jNtG2Zt%b^78<A%#o3Q#P;% >z;$cte#n2*?UY=;%@V<X1X?!@)=jp!-ZjexyR<4^~Kea~A|3`Vw?2Wp#(k6O6lD?*^ >z0?cu0bhW5cefSplc38Nb&{d5q=;>3J&TN`QdGQK!Yd;j`(=jn4-2_dsu(V&%*m{bG >zC7pP0!4nk`JS26E!Ra{K?Q6@x<mpiQCfl0JsbuwA_-FZIb$j$BVv5r@&P?67;Qa16 >z26n*t!Z~K(6yIy)s~+nL-l*=BG``7P7TPxIl(aVLl(aUwX<$b*mE%pVGcKL68Nb{! >z`an#5hFybt&Z{M#6|yVmBuY=(sN+c+bv$V!ma>OjGY@NbNVf~~<{=ShBIZr*g+G^! 
>zQazSZJ$v2E2CUX__fc~Sq~~m?V+M)idezn5;uOc(DK^ef0p5ELbM~=QaKYQzpKC?c >z_5K3ay;Dcp(RL6ewksO+cS864<fdtstaf=_-tFkXj;Pd+hlNczuR2vrE33SS+ZBHC >zf^*D^D{qqAfarF`e`jNo^KG<`fy-Cef1QTdSq#~by$q|k4(;>|sN92R&;_`P&Y)CI >znDcDw@O5ZGzWQU7=8n+fri&&w49-k#p1BTP$gTTH?_gF;Ub<mM*aBA?K6}CWa-vXs >znima{+S#mFw`Jkl4I3xhJHL9t-saU6-`nzUKZSWSnz!Fv`5g{7*W+|(;lJC_Fnv3A >ze(=Aw%wNpl;LA|8I4;EBPb^w|*3^cn$<0%X*H2APU$JOv%hce)QwN8K@H4n@#md3a >zp+W4lW^!h7@y1Q-E?9i>*{7{oH9WEYl&7uQuyOi=MHjEWsI$W*lZ)4l4y_tic8M}I >zk!ETl-PCw~R3nK|IaBFTC7K%PBUhtoQI)3Bqe?a97BF3bV~J5YQ|VD9ni}mRSL10> >zm8R08N;O4YWLArb#HgI9^r$pbn*uektLB}*aR!&kX}p@+yl7_2OsC$(6vRob8g7g+ >zx?$txn<jDPNVcezk0^^Xmv79osGZtR4a69GaoK#~#!WNI)n!RVFK(YGV-13F6LnI{ >zOBt>&$;MnIX((E%F<dv7rbgzBWki;0tgQSQku6J%tTdJpnQCl(SdAlY88adq9@0XK >zvX&8AvbFLWkP}<AbUD#B#8z6%i7nZhyAtHYRxOfIu{mowv8mSZ9)NoLoptGPF;ptp >zWcB96?Qltq*{F+;o3d)Ndh6hJh_V@V`EgTLZB{SrZihz);%vMCtr!|wbn5A6RwlQv >zXhwyPn$s7EGACcU#LO8iP0E~n<uw-Jr<*g<&xkq8dhMTz0YArk`&rHz>1WKGF7`-2 >zCnEfGb4L0ZF=tswBmG>`+s|^&NIzrdbVV`J&!rK5x;Z2L?9BPZ!1*3uPQ|&($v@dy >zgvZBB9xj^7%pJF*W)7D|W#(nQ7dia)&WfR7tpH+9%V!qA+MFzaF>|_;t1htGoG5@J >zLlJ(uIqL$e%?Uqa<}8a<q@TmR{VeB<^fP8o7ki|iBN2YOIV1gyn6oTak$&Q0+Do71 >zoRNOU%;}0%q@TE$_Vm-u8R=(dPJFIRpZAmET(tlWch0DDVrFlufd08PIouI!)YMHS >zus(H!#+qv2h*ASbaD!p_%o<pmleHygPIrFQHCCIGH4rx#J^gfZ)-_g}lYU0bS=Osa >zKXGmC<!3o(q@OW!y4WNA#Fe_IpKi`bKO^QW>s6$mxSsa%vz#;1&zLz~y^8a56cweX >zpKi`LKi!<OQ3I`WT{RF_0!u~}s-_xPpBlwGe(I(gSf4tkwgewXtQZ<qYTzhJw&gQx >zU~Nv+mZ&-1`Bm3gZBEp{QPlsQe!4m98mrAoKO^QW>s6$mP*yKL%Q++cjG5EL9_c4! 
>z+S5-rXQZDIbC&fg(obl<m!IXFk$%R^>FQOapSYg(^wZ55>1SuozH8ud=n*iB#lx;b >z*fG&}A+*7FO^qwJC8MTpDunf^ak1^S*iAQsV_G4EyOz%^gta+YU}ENUCs|!+wK-V` >z5qnQR-JEry)#jw15p$MBE7DKgX!P>4oHNqTm^oeSaej{B=B1~fZq7JAW9BT2R-~V} >zQ1|k)oHNqTm^oe1iu4l~)t-L3IV1h-%-MG#L=Cjgbrq_Ps-g;2)6HOQYQEEC$*8HD >zYG8fph}<>Rz;UGpj^Reb@|iWTHYaLJ)ST}8s%xw^Cu-mrZZvxO>E^6!tTre8jF_{m >zSCM|=Mx&RX<(!dz#?0wrkMt9F0X_Y6b4L0ZF=tt?BK^emw3nadoRNOU%<1Y?q@TE+ >z_Vm-u8R=(d&e^De*12j896^P(WK^MQs)6;XJ4GdG>ZTf4pE}~4ZK{D2S`Eb2!19?j >zur?=aOU#_^{HklLHYaM}I4&(c{d9BIHCCGwe#XpM)~iTAah>YrXE|r2pD}Z~*dzVK >z8PL;DH)o`u5p$OHD$-A!PQCmr=Zy3-W=>bHBK^d<+0#!qXQZE<IcK8=TIafIAg-*I >zj4D)3HLyN4E{5?_H`T!U)VO|GGPEX*-7#$V=if3YZJFca)n7)<?q{(&L1m@chZ#S+ >zJLi1eg=S}QoT&b?$#*w<itn+rmvt@0_a%|Om$Rq%9zDBTnH1lb#`^ANPw_o=_Okw^ >z_`WRC_j2|W-=k-DH8H{WiJ@5E-Rudzch8Q$9_#a=bk)~NS(qH@%J2CxcEVLSXrwC< >zQAWb`IAmnBD<7d&{pQNdvb<{Z#ZyzOFI=@6f6z6p)aZ%f4eD-D7HbMnr>Z1~bvwtR >zJJ)N9?r9OVd}1U$MrScE#&nBiEzXN^G&#o7VqT0H7Tv1m#W<E8qqCS7W4gt%w&%q- >zo*ZLoF)zjpi|!hb7vn^FjLu?Si~)=I`m27gM7;{o<Y=J6UKV4!;p)`eaG(f98E&{@ >z^)?)+43mL~Bd%f>UAlSQYVGp1q%9@)`z9N%7U-&Vb5t)_ynHOpv7zf!wZV&Gf{k8v >zJ7{BBd>r9Ffb!<>)@sR+_xpx?lyxgDA4}wfa&bo4NXti(jUKhh`B_~n<a~*opYHde >zKckB8R`(XaVsLU)E@jSI=+E!Fto5otZZ?{-Jp%kx0Q{xo<fvR$+#XT(dX*hFTUTgO >z-Q6#vZeDfK$?fv=#&w&oQ0mQ+?$YLVi?XOou*o7-l3}q&p=+{;dbgzerY9;!XE86v >zbc<yX%*$i<O;1dWrNz7$Gc3AQ&5N=7rY9;!XE86vbc?+%0=={<*MEPvhV!wsm=|M) >zMR$S7%cI;Iq{Qef=EWGWILo)oPLW;3Jg~k|cdflHSN;-q84d&;Ww_UMtI@FBm-+iV >zHs^NvqIH{Bk7yUJW$rgo^ZL4RM^~22ei}mDysobJXMj2{E<#I(+;5uvb@Rsh8#ixR >z>!bZ0j`6pgH`?F0dEGih`#TclubVg8->7-Z8XWEKXpFz*ywU!~&FkuEw7+9f{<?Xi >z{VnHR6{rtgH}H)gQ{!B$oYa$J<(YSGjD>K`)u%579W{Npvg*^9`V0LPZR6Vw7hk$* >zYWjSofG!;`e-$;epV9L0%4)NO#?I`{MPGyZ%q*f4<*%B2b~DHO95-`W@8W%466JF_ >zbG*-yGrQG@_jzfY&u->;pW|jO>t4Li%c6WPXO8zda%NZiqJ3UA6z8*>Iojvy%%QiF >z;_GVZW#v?#92+h9r}S4@%Za)SjFo(pfpBdtC+ad#se$VP5j0lU&6AgJzI<xxV*I@T >z-t!z%%IvaXWuGWpey){yDzUKeCfjo7udmo1wor<fjVSvx#pP^e#Fc2Ptfd)oji$v_ >z+RBJ4)s|bsjJU=U<8rn#;!3nt*5r)1#?#^|ZDqujYRlE_jJPHe<8rn#;?itw4%EZ= 
>ztA-LlrC?2tX;v?eu^PTZ?`3j{vO|=~@RdcGu`9}@$_`HsMA>|;SvNgpFE-j$X36I2 >zZ=>d~>7+VCB?NnAiJ#xqjhf~|^Yd!5G}?bRf0F;v^OyBJ$^T`s{+IJ7`QJT%=w=A| >z2|wuU^01m;tUvxFpzGIB22uh@Fi=+Llne|H4aY`M8c2&E-hivcX%UP>N8k*kMbK^F >zf<P5YsPKVPdUC8=uWMo~HB|MGnelGbMww}-<UuptnGZeqjjP_PE}Y!3Rw?1bL!&PJ >zZl5SqK@GTRs%JZ1CnTG4H8-ed)s!tPV`))2QyEdEnJO!BMpWaSsBDkG#V-7Ygq8L( >zLUZ=wOu1Fe2yLPhTB50psM1W8bvq-f@&+a*SEZ?ps1i-NTAmSAd0P_|l{1wQm163W >zn6lpZhJ9UXl~Of1US6+Uc+)X9BQ7Ow!Y+eK363%taZ%9?DkUq_fZu$1v0d@5VW<+6 >z{cPW%W$lNtZigr%HI;;o*sDs4k#L0y8{ze2cqkzfXCyI_6eDF>ubsa=A{ma4q%@Km >zNs1A-T&a<aBt+tjq(+ipq%7a5k&MPiQW{B(B*lm;*{P9?B}C$kq(<T!>A&OC^SV;V >zCdbFUUxseBx?aFETnmEd=j8ZAj;V&LK!d3zk)~qn|0?xy@9=PS^<<yW0(O~d(1c=& >z7lKq%t|T;QTrp)Y2h~JHMCD9nM3rW$EZG@RRW;E&s?t<OREehC0%k<Dw4(HgsGO;c >zsM1W8r9305>RnIos7g~AQ6-vkWj#Hrk?L($L{!dHdQ_UJS$JhpR$Otv>r$fVBkD#! >z>T04HR8kjZFzTA37@SZN81)TOx>h!>UAu1e)X<s}wRf&Y<R-Igrzm5(TDs<`q>WD^ >zhLerCGLjRS+;nz%)D)RBmJ?Z`v9j#vL?$<(v5}R=aw1DL=9V!hGPx;@j?5X$i7e4r >zS^je(lUumh$Vy{5k);}Qmw=qe<lZVeGG{C&GR@d5JnptvoURBewQF)hE0R&qo_k$$ >z!X~wX5@oX2RVQpxJJUlZyIyAKs#!HTy=D~~V^^K7UwY*B)!#45q^=!*s!9^u7oZH2 >zuEb<UDYq-WYfVu)lbKPbn=CI8nNi9;QbLra$;>D-OuB{5i*i(M%u=FsCi9}qFj-zY >zGNY7gmk?!XGBe5ylkPH-8Ko51lqj9a%qRmU`~Oq6_KR~rluGrRoRFIxf1jwoDC>0< >z^C#^xEcYV*K2e5yUCaDQy9~>%g1=8w;SGM>gulR#SFdQJas@3-w)cxN*-KM(lYB{) >zW73tRUT03-B;U6itx2hOl+I*kl<6kR%S&dIZQa@)feRB7TH4JFwRXPrT-}DgF?(O@ >zdtW(A+TR{pJ5v%&W=5HAvb@G*CbV|%*E7n}WM-5ZCfzkAGs@b%U+*ZL$;>DNCTIV$ >z)()j2-^q!<3P)XC;xB88;Xu$)hT|`5is9OAZ+oB6$GkSaP~*oZoi?VuFF3la{9V-4 >zzIMvPEBUf7!eXa(*9`wm(5L35X>6$cU6a>t>Uggsr!LEXg4e@QUYAoRcpW*lTaW~= >zN8-G8Qzv*GH+5N_6TBXc^17Tl!RyGWUAay0dMwUsH+6#7m8tvxYs1pb%2_|Tq;gt@ >z+S4N+{09tFWFBQ8T!8}yDvFHiztVGi@~TZ!mucm8Y`ksP?GR-|&$64Nk^)<JlZ?32 >zHZ_uogh-r`)JPJHlw~tDk|ptxltxk`NipJ<D>ag(36VG>sgWcYDa&hWBywBgG~RHY >zltxk`NipI|Zek?kQWK*haYhm&sTt|N$;(qxrN&Gykqe*;Db%BsRO{VrU8dx!<$f7u >zs#n!_vvryBOY;*0ZOjpO+9zwZe_SqJ_LotU2UOrs&i99+CwHfPK*#>%yxks`t71>j >z-Q+2r`;%X5jCm%;voM~GaUI5UFrJI?JdEdKya3~c7}sOG2;;>VFTuD0<E0okV%&uB 
>zGK`mFyaMB9j8|g33ggunufezl<Fy#C!+1T$8!+C8@g|HnW4s09tr%~^css^BFy4vr >zE{u0$ya(gG81KV)KgO*XAHet^#)mLIjPVhSk79fb<Kq~g!1yG_r!YQ^@fnQIVtfwc >zHjLXbK9BJQj4xt*3F8ioFJpWK<4%mPVtfr_E5=<IU&pu`;~N;?#P}A*w=uqh@m-AX >zVcdi94;c4id>`Wn7(c}L5yp=({t@FojGtiq6UI+5?%%d;+t2X-f5G@yjGtrt0^^q$ >z|Az4^jDN@YHO6l+{sZH;7{9~#J;r}x{1?W5WBdW*j~IW#_%p`;VEhH+evH3jY{M`u >zoOFxNn!+c?Z=PDberkI9ibYdfrUr4&m6zTW<#tLHW&OnP30+Q}qTJZY-D#2Hd8hLA >z^4v|H;(6rc<)t;n^G+e{;dwcEis!MDyGv<`=bhWlUY@(jQ#`LuzB(Yqt}ABapXJE) >znVxoC#i~}$i?I@ZcPwP0b1O^RQ6_p`k84qM?lJj`a|2QIybiBjw_(bg*}e}xZUgB2 >zHfnk;&z%V>%Wq$P#ZT`}T}`9f^!8$G*QWE^Chy(!N#4g!UzW@y@9jNzg!kq2N!~|K >z?-nP;`w4q9ALG57KE?a!>C4iY<h{KCjqtvlKFRy&>0S9u^4{Jj#(3|hPx8KN`l&$8 >zNvqEtXDbzAa!Hp$&yTUwT%k21OS^O$Wu&=At46x=9l8)SSLuz@Q>#{^b!?+nrzb{d >zx_%#JqL%ZnDJq$;wK~OwJEd!?4VqxRo*0Xd!kI{nBHlz<RuiKbkB*`=kr+jS3Aa#* >zQB1@~;Y=h(5pSX_vx!kGiH@Q)kr+jS30H0tqgWarg)@;Dg=b=2OtlT)b=w}1br&q| >z_PbJ2CYO$M|88xJ!G`i4DlC&rz0w$EsG-D%4SA*Isey<h%ls;S_ncbmmz68$c8W3< >zmV}$9lA?CiPc`OhL0ICdG1h-P6mzTVc4~^u8Ow+))mT}pb0Ql~i>x%36IrS;w~RTF >zRnFs>oRytfIguqAD{FdAWR=1Z6Ip32C$dyyuFmH~R;g4`kvU^Ik!i*@2Wn%&i-vlo >zsMM~>rIkA*7vjY+R%5O)ZpJQ?mD?G&LzKywtBsqn%Vg!=!|f2dz%^cXE<b<O>W$Mg >z?WSJss<ULgwo8;DUoo{=D)HDWPNJb;?P@e+uQn64U78|thSDQSGE|m+|IDtrTU|OE >z(URIO4H1=w(j!VU<d!ZyqNPa@IYa3YB^fG9eR@R8QX(o1rAL%z$d&lyh?Wi|MdS=6 >zN2D0q7*o;1FH`UerRIC(q?%k>SH*@Hi{WcXv$=Z8qRfS_AC2ag)e|=ph_2VQL;e_~ >z{fCqz<7>39=onr)40qH$qs%ILQ75Y;&|XmT%(_ydoDV(CvKTKN$%)pP&5kz5Y<Vfj >zj&?L7+R|)xv{`1|B4<ZCmJ_Wrn;mVA+42&S9W8JDRLwV>x24(aXtT_^%S(2&yjx3- >z)|t(YHfZ*$KuwN+trQBVQYj~w1&j03V(d3w@zmAqG9D~iQN|muf4z+dYY1AXu*mfm >zo5q!2xMJlXw6}1@spmZW<W&G&d!KyvX)ExcWBn;lTebPZjhkk?ms*p&a%lTxy8=CC >zy24&L&6VqCe^uIFj4YYbzjtzIRrmL$yS=jA39n3Vcd#}AcQ(!b>ZsqO2BDPSjhAgS >zyauXKgk2hqIyL=!@8+|^eSofVT9ltH%U@<24@Vp{j;OnhKV$ZD+}6JPK-*5yuwT_Z >zv`YQ}*8Mu$frjYIPZ;wTAMyJRTsiwIP`*L#+lSOEZ>{mS+2+IXhs`5=&pg^~Gp-A# >zo_^}ttIm4Z*$;2{R*Ob2DsT325hep(biGzXz+Km$@K@0DwbF!qyRQ>kgN^d$-M`an >zu)$Qy_vuF22;MR`*gzahhP>Z5#4+k_-O}P%A{PJ`$5R6N=olCr?oI{n%0h3suAN!D 
>zcBc0Go(4w#)&@rZ)&|D@)&|D^)&?g2)&`dRtqm;e(?F<)syq+cHOck}OwLR$-ngmx >zeK>nI3=gfUoe-fn$EubVp1E>x*eimyi8>KqGAXt$-mtoEZhg-a9QWF%pO|`{4-ZYB >zKU}vKx~J<34%S8n^Fo|Hf23~onKndrGp|^;WyOY#lQZ^j3c4;KwOhK25HWl>sk4gh >zgjQZU)H@m0Zdtow@!Ab*w^UzYbau;h!Y)L*6YxZH;_8T%=z7t+u%BfZQC2I*gZwqf >z^`PsucVBBTnz;tP2l;EDd4NBwg8FCsj8I<tyHjGUvJP&)OgDP0g`2)}>Y@PJGZP!t >zjcX%TBk`;V`he;%t}R854}Gjdwhz88dlbp0b(qLq2j7Q2)<N~5-JCTmXUB@6p+%>j >zex|QPs9=fwD$|%2bHEs6-!XQ<EGa!;31KajU*%Xziz&sDuNY0W;h2^s#N=B_i>YI& >zeN#3|3LFk7aCm6Nko{ezH9a)d8gEBB)-Iei4XUPi`!y_q<(Nv0D#etq4PAx7s74Z^ >z@=Ya1)iIT+3_Y-skPb&z4B6jhTGL}wtx<;^YdPXb3~WpS%Q2N0Rf;KJ9EnklCq(6& >zN{p&wDpMJHV51=&POKQRzst0y$EI4N4m;Lz#E}>nF8|hz98-x=rI_-?kr)+j0HUMv >zO(jOvF_ozdJ+QHm4sjP?f0t=Zk4?2k9d@kch$Ahq5!?b;H*!p+MU`aA7e``LxCMxg >z$~To5RmW7OGW5X413DbRU4Z>vrZqh_)f#l@tmTL!F)-W$ST}M^B}SEE$`?msRJa9* >zj><Qc7*)qqrZV)vCPF&IU4Z>vrZqh_)f#o!v6dr_#K3S1VBN?ul^9iuDPJ6kQQ;OK >zIx62(VpJVdmqxs%jC_Zm@7ME%clYyp_de#x(&`SGcC{$0)B0~0QI=hH+mBJ2-yev1 >z$Fj*NUPzBDtL~6vG$+zzqrQkVy;cM9mlu(a4pn!EiPSfm5ox#4=g*>ssf<p$=W2YG >z)^tG4qr>f%0%=lnyJq^Lu0fP9P<Y6f=R~Yal-8php%%KE;M}6*IMUyW^x=pvPxYLL >zUXf8<CNV4GJJR2Z1RQyKghpcPzg-m9|Dx53F|Q9N)w=BwnAtKjh$oBT<p{dKbH#UI >zOT+@ipCm!kJUhm{=yObGMxA5Y*I)QVXpDLyBWmAtX4Jmv%;FplJlvw}ZhJ<hhtA(M >z)0gl{_y+v*kKzyb5>t)h&w*Z!@KjjZRTf?Ic8(+ctw<k^_#*93CejA!FJ8=w_>T0q >zA^}I*e_F{B-*+pJ(o616m$4!D+f0jE^lnb&q^MYIH;^^Q7;_#3%|ZHO!|t~^=8~gJ >zGv`Z5)3o#7Ura_f;(i+woo_BVy3*Vw5!dm^8<~qQ-841r{mE)6(9!bOnKrd3%SruJ >zMcp2&*qn}tGe)CfgG}RC`Rg2m=@BLx^o678v<$&tQbssl{yHW?-(Y%#6@%?R7-+6D >zky_n6dHLqcr=~7mHM$hN^h1GHt78+&ewkiqkt?(NYfc$!UR*OvdX8{`X<wr3m*YYo >ztB~u0FF{T7N_2iAxC%>^{bE+ZccG6}&|J7YLJP6y#k%P!dt!vnh-F=W-<4@ujlan< >z&w_5t?Z26Bt_3mmyek|f=i@_Nd*qnSi8jNmFEUl<O^XbLj}IqD>zmDrw%cq>Yig>V >zS6w){VQoOv<0CF=w_BzUjnTWweVswxnh)*2Pn<=K2OVJQN9&7_<3RojWIEsruu=Do >zD^UJ<s@pAQ1$+ncS3q$f^{qa6mrq@U(BG!IoAh!24^%U)t1HlKou`0rJ?D+S%)sSa >zW8FJ)Ii@qC&N1yvE;DNRj?<*5ebbpy`=&GNZI8fL1z(Sj%h#;h*k!j&A9}1nzy}Im >z@gb(*MBUevr(3`Q)_nQO*XknVIFP>rnGX0O%$C8#Q2U+3)D`d@$X)@>fy`H!;YDbF 
>zpxDqF@4Mw)-Jck4uR_;;nO=lfqHA{L7D)9X=H*rV`OxSBPKb#SWg&80=wlUfUGOE@ >zha5&TSHX9ok5$lI$b9w_UWoGh>rW4!8e{FX@ORDhCA=cO0qq0{_!9H5BmNxe<p@s` >zxl{9Z&2gl^73sqfU#$JfMDF5pR>XItzZD5M(o?UU?;41GVQud(%5eSflP|9=a;N9- >z+S4I?nt|d=%;M<2;D&1=;7g8fW8|Vc!V^gD{QO19aiqT$>BA9UxBab%+zIBai0?>$ >zD-v)d^P!ctD11lfw6VaOQPr<B&$Ow<AGV3}qG~hdsZPwrBy14+Un1X3%QeU`m>yxG >zL0?koX_POSjg8PZm>yxpVCLI4;UJaoFLnjK_T|i(*2578ty8+4uS8w7Y3edAcFsFD >znSpn{)3hJcnNjDM_9d1Xb?5s~Q=;}wXGZOtZh!wp^Bd&KJw!v`p`u*<lG9A94KZt0 >zsqL!O_E%su#~NRkz3&SzPv+`Zt>&1_h%(irFQ$xiR=@2vCQ9FAMwC^PnSaF*3bg)R >ziDf>!zWg!MZYbK2T}rphZp@!I#A-KW6n7QfUqYE<G$Yb<qrR9j(z>+kn=oS{^^N94 >z+GX^L*=sroeO#wgvHvZ1nYPuK-4`GQim-3{vWU2&wA^TzYOZ{j6@AMwoELG1VP9UV >zrh7(Q`HrjDh<(F(5qpL+-}-J2+I_Q>|IQwBt$oi{rWegC5cC2?TlFI6-S0jYqVgHR >z&ea?j`dEcr7knx9u?m&XC{9`h--SL_L31JIvlJumx0^mZ6jHRGYN>p0P^JgkI`s5_ >zXGHy*i`pNW-T!@fPxCAb^=~rDF`pfMo_Sw{+IbKYz4FCMG12?xbE8+xzbN9`AFqFT >zu+)AgS^u^tclQ(iPLWK10xJ^b4^E1(KQD+_pO||is3Xc5oFdDD-+7YbOusABmovU_ >z1LsZj$^^fVC1z!OXZl^4pffR#JbY)cGW8FL);`=*xz*yQL{~ql*nMDVjw4u=JV$UE >z1s!=|#95@Th?b=2Wf{?zCC8igvgCSW*Cyaij&xPE=zmBV&`bB9#k)Vd%yC9lOq?^K >znSe7dnN`)qd4&4Q&p)3<kHX4{^GLK4@F?f4K2<&Luk`|gDZkz)$E2!}ph=NMX)@+r >zzWDl4wQG9>7O%m#4Rk)?S`^Tq@o>_s#=GoFo=dwikA3u@z3&roUGUobK{niB?|Y1L >zOsgqwG%aZ^Tg2&K`%CSbUySHc!-3j&4K&)9O!sd;ob5*YlJRc)Ij;xQ6S{p<<r<*_ >zr?uapo?~9meA%t5$xd?bn~&*NPRM?<50d}c@S#Bg%|3_*d>`81VwQ6jc;|l4H==pZ >zR(X4fxymuFYM|M;NWe25^L9wQ0-8M#0nCC2;rusyAnEr!XgA5!zg}{92tWAN#`gC} >zrrxHXe&S*zJM=x0y>0R<((-KLi?`I*C%tX*`!ZqE`~Irt7XUWSk-p>1uo;T8-98eL >zPsm95R-%}2@x4T$h<jU?%=TqH6niko?i<`B$6n!z2%1{1emjkO-P_w{Us}$z*_W0x >zZMw{MeNj`y+KwQfs?DI9{CxJMrch>kTXzxnaVCXg59zAw8xJ#UBqFYU%TP}(mp1#- >za;D9`w47<vWj064s?Cs^%C&WI_MxUwW_w$A5%*yoT2^BZYS}-J^XjttB~iU?_NC=a >zn=Z%6x<Y0%ZMw`R+eAK9n?W`C`RunYXIgg=_orpmdPrBJvva?!e$iKNn|*0H(`H{< >z&a~+=o1<mbW=PAmeD<s5OzSS<{<N%G59+G=ZN&Na80wd`^|slUmNRYkrR7YUF0(mW >zR&55g?B}yzEoWMH5%;HM)q4AF`-s=Q-aSUI2P5vCjKHUu_O`3*InFMwz}a?nO-I?~ >zI~v(`U6#9a9cMQjZ~MKvi2Ul>jx*nWXCn}KZx3ALn!ckw2(Lj<YyN##uEqnOOWWIS 
>zpBm4$+o#5}?Yd0oX*^&zs4zdj{cAki0~dL}8V`67(pv3xa<0Y$pETUtZl4;@w%e!1 >zv+cS}=V?4(H>B}ee*4#Wwg)cqel;HOp#8T9^=sM4_>k=pXngPzbse#fUWW8t8?1}r >zK9xS4ZMaXV=Nonz&(nIyaJ%(hO>}pT`xkq@3oiD4)gE#oB)Hm3?A($*-2JsUuEVUi >z;eM5#ZMa{hXB&3e&QyB9a7gL39M6K%vt4kp_pS7R3qis8FTk^v9%#+$ZMa{hXB+NU >z>Dh)|wlkF;FdS66pW|6jdbSHL_P&)KaN%6rFI&H|^=cbYS`ytmEW>3#V(;m7JqM@9 >zi|VB5b=8KZw;doE-g|u?!RfobUmLsjx&nh%q_3llqqm_(oi!QijwqOJ|AnttH$xWe >z?eCcTZSUzb)RZ!PhMH2QcUkDFbT0g|9kbo8#I-M{4<^}P`#d$J7;<rFRVx?{WZNB4 >zDBZP}m?>%+clLWvpP{Cd=`+-nGQG<}yqe1CLu#t6eV&?947oUR)KnS@rrSS1Qq(lj >zTTO2I3^k=ppP{Cd>0K7$)l^O&RFhjA=Y49D@q}YI#gL05)evh_X(*(o+UukgH7)6> >zrgHiWHKk1N@)4JAW+7#ImxZ|Lx%TDsAvKkY<GfEza(5iP4Y@c{4RuEpRFnT&Bt=b2 >zd#lM!pP{Cd>0LhJ(#<TSOz*M~H$B(BoIaRjw>ZxG)FgMu(c6%VBh^rML~CMSS9od) >z{NYjgx~}`XhkBdQRn=&Qk))b&`RPi3lNtLuFV&38PQ00VH0_W1jCj4R-&ZvTC9nIs >z<LYhD#igmPm%)03L0S0M`b71Obbkd^Z!>x7OE#0IzGO2lOG)bU%ml^Y=O$Zy$p&3q >znd<WlhGbED-IA)l?(gR6Z6;HFsb(_OmukjkDNTLdW<u(#<tAT!sRmtKx$5gS7?g#7 >zex<6f{AxD-v1M;Fnd(b5lc~N`GcHSM>gzTWRG*)leDx(7v^@fYF1TC?c3TZ;qxRw` >zRf6UB(ltr2G?OX8R5O_pOf}=OlqSJ$Ga(6<i{_%q5d&UsjHDWLapf4q&7Ws5s6PM2 >zRI2*AKk%TpnN0PinsNDxmj$zwYQ|+L-VAfoZ6>HbKR5XWFx8-oD_4Ep1~1Hg;T!tA >z1^3c-#Q(5`-lhTyY&ONLrJD+<u+bD>0%w?V8S7SJvnegA_NO)HzYuP=*nTNoi?6p) >z7oV!go<_AugHrKtJp7v2XwCj1yZ@z`y-j7SGTl_RD$`B5tfi{bHx(3!pQ}KPY)sjR >zs50HCi!Wc5zR{3WYA?ppRq21JX>U{6s!TVPt;%#$E^Dc(^i73SS<6*Fs!TWP;>%a1 >zZ!{<s|NKi=W%XNT`3k1DsccoIo61&Yx+#~nR8{(>f~xd$)sHIEjk@^qRp}cINu~C> >zEnSt>Z~kpkWoar~mFcFkRhe$eWi3^ezNwHZ%LQ~%<q3rM&FMy6e0fHBU+EhSs?vWw >zm#oUs>eoRxs?wRtRb_@Lm%kLLAZr<>T-H)dAy+k1K~=g1bW!CAM2oMtQ5RpHQ7zKY >zQ|?|3#i7r<b1%V1hr4&_ZAjN-_bgsd<uF4^Hso^GC6h)&eEFSj$mJ`+kP=b*(|EIA >zdp8Hxev#wH*4w0uP19melS-69f%x|)UVUt|#h2ZqBi+07Hk7NxWJ9@1Og7}Qm8e9| >zP*4_rj`VxT?mT$svb@~(v(?+Ai!ED;UX&q$)Ly?OE7ALos@{fjm6&WOSBc4nT(%OG >z=ot#hqL!mRl$dPN#g?r^&ty;_{`r=yMDM%9dK=1BVzQxJB_<nk*-BKRXDFydKSzBi >zG1;VxEnA76$&f&5uhWv1Sosn;zIN$tC|8NehBB0RocZ+}V{aCdt+|vX%F#0vl4C7V >zeaJD{q>C+Ej-JV&O8ggS$#V2Qr@ObITsbBi%8;W~VzMEZtwbeyhJs4;bJT|tlTEtV 
>zvX$tW+-6=qXMy?cLSuF^#+*8A%$hmId}_>??`>zyVM~p9*bc^AeylM!KERm0jyGmx >z7h|@bV9dYnX3U0@jCt+@jX7k6G0XQh=8LBo^N;%)bLFFqx%t7y9DTYm2VvfO&ot&6 >z1IE1IF~(dw&zRxI8FRz}V}5atG52Bq=Re7qYcT(#o?^_2nE%*G*xB8f^=Q2~e@|on >z=K^DXw~sMbtT*Oj%==Mv7d#*HuE3iP`^+`wJNO))x6e1`W%yc<3ozfs_)NQ#Fy9H! >zFlKkm_Y{0<!!YLi?6Zw|DdyYmdGHDI?SwB{`p5l@Iq${B6qxV!m*QB&-+H+*w<7+f >zuSA^N8*{^J5Fh5d|8>T^2lGw6$(TL%Fy`lPMVx!V_jekz_(8@je6KM-+~1h5-fGNa >z5%1y;8}muTJO5)?Kg7G=C*kLg#+>{aWA5JBm?zw3%ujbU=8<18=DV2hk$2$#nD5bd >zVjisH`mM(NH{yNm-Nwvq88h$OSaZbtjPDuqGsJtxy|9V-mjB3@w_(0N+=p0^m$^T~ >zv551JKR0F-<~ie6SToEs{TpLGfH+tG9{XdSC;h>g*TJuk{2BXVp56ay%;T^Sm$zEx >zlbGkbFuFJ98C}pamtnn%om%EB%yZN(EpsaTId_kic{<|z>0T`}2kZ372er(-5Yi0? >zw9NalPOm(;Wj=<uUVM1Vycha-$I&hGWvq8=sAc{FKYlaXGQWf$pIFi|ABG>3$F|J5 >zIR1d;Ewe9<Kk}rOIRwX_`iPb}8OOi-Q7!W}9RI~LS_UaGw>_q1ZpQI1etgSJ;rPcr >zsbv=8_#IZY%uf)<XU}h$mm}}bS=%yC#=1W7sVy^%JT01TnRy2obKs_y`3~}Q|D`RP >z?@O+LJ?Q79SHd3h{o1Qr=1lnU&S%0e%yZ{;EpsK-<AKj_nXh0yCa-UqJ+K~kyrg9w >z3x9UIsb$^>{Y<~2W$uT5e*LPJxeD_=@wF}U3(R-N8?Ybpz2Yq`o9{Qh9sVG$zrL$w >zUIf3U--qL%pSwTMG7o`Ys~7Bvfr;e@rtQywdBuWVFa+X%C;kj80`4^x4hb^XgS)=M >z@d!9%wg+#v+r_-|fV<}3ZdztLtm2LXEBR~t^J5%**T-9C{4*`{ncG|D;divmU0;Rw >zcel)szKzxXhnD&GAGXY;KY_=vecdlx<^jLKYX81v9{k6anfwc|8NkdtVY>!r7yrF0 >zw#MvY7MQ=h^S9qR?ff&Bud@76mOt8VpJ|zEuutdb&;dASj%i(s-`qYIaxf@4=2>>{ >zXWQ*z*n6%0`IqH~SN?s3o8{-rkF5OrjkYQs1C{f?xpwtN?a#NDcmBV5>XrqEnOjOX >zI_|jtA2-_h=VRf;0&|tEp~n2+S%pFJjTyixi65>dG>@@h0IP2QeT?NytFRnF3lX+| >z+2*)XJ_l*lIIfMqA5O^{N00pdP(o@P1NQgBsa@l!B7Z-WkQ#^6vBu$4sBv5?e?FYT >zHI6Ij?}t;W#-V)GIGn;Yj(YOvqrQBO=bq1@q|`W^&NU9DqQ;>t*Ep1b8i#UM<4|&H >z9Lh$GLy7C+?ziWD`&x&p2Smvgc1uLf@l0+1j%q6mxBMM-R@f~O6;{|S5w+FUJP}n@ >z*ewzD1Vv*@L?sn=OGFJ7c1uL{6n0BQ-4u39M8y<#OGK>{c1uK+6n0BQeH3;}L}e6q >zOGHf+c1uJx6n0BQ9Tav;L<JOfOT^q4c1y&p7j{d;d>3|0#7q}<OT-)(c1y(U7IsU- >zycTv##EcepOT=6jc1y%87Iu4&IoX_mL~KB1>E}Pzp+U^*>-j*1J4oRUR=7hH?ofq0 >zOyLe!xFZzqNQFB};f_|gV-#+o!Yxv`#R@m1aKj2WqHw%$_-j9=aN`O$p>RtSZmGg8 >zQ@DpH+_4IGoWea+;f`0h<sJuB<JxNmFy>&`TU-2pJB;}lxZ5<_W9)#jBL?m|%>yuY 
>z#wajw_i1*;*bQTM4BUm92V(4rfjd#NH^x2~`(ogZ)a-}xV2u4SaA#@;F%HBy2m^Ph >z=C9kfZR0qs@^8Su|NISCzzjkYh265o9cU4^M65Z5-4e0d6n0C*I#bv!5i3k#w?wQh >zh20XdsuXrh#ClTLEfFh8VYfuAA%)!%v3eACOT@ZS*eww&Mq#%^tQCdb60u4Yc1y(i >zP}nUID??$oM63yg-4d}H6n0C*I#Adx5i3Apw?sVm3%e!aSzp*K5zqI+Zi#rN7j{d; >zbG)!yBA(ra-4gM<F6@?wXLMn=L_C)ZyCvdTT-fbwUWZP=tbQ$gn8KZ?a3?9;$qM&y >zg<GL;k5IT%6z)`od!)iWO5s*2+-VATy272IaAzvqqZRHfg?o&`ovm<>Rk+6~+~XDQ >z9EE#=!aY&po}_T+D%_J5?kNhl%Hyc&6TlUAOGM=rc1uLf6?RKRwH0<tM4c6OOGJeg >zc1uKU6?RKRRTXwiL_HODOGG6Vc1uJJ6?RKR^%QnXMBNm2OGL#Kc1uL96n0BQl@xYM >zM12%?OGIT9c1uJ}6n0BQH57JBL>&}%OGE_}c1y(E7j{d;tQU4m#C#WaOT<hUc1y$@ >z7j{d;>=t%Q#Jm=EOT>&8c1y%u7IsU-EEaY<n|0_s%<9*nNrgLK;Z`f$8ikuuxU~v* >zfx=y=aO)KAsS3AV;WjATMG7~qa2plwVuiaz;WjDUW`&zkxJwo8GKITb;kGE;6$<w> >zg?qZfU8!(aDcmy@?rM*ts?P&g*ewy2SJ*8PHCNaz5!F`MEfIBA*ewwiR@f~OwN=<H >z5mi;#EfMup*ewy2RM;&MHB{Iw5!F-JEfIB7*ewwiQ`jvLwNltE5mi#yEfMum*ewy2 >zQP?dJHBs0t5!F!GEfIB4*ewwiP}nUIb6?mk5wl*{EfMow*eww=UDz!Vb6nUh5wly^ >zEfMot*eww=TG%ZSb6MCe5wlp>?QGVeYcQ){hptt)XDZyY6z<sycb&pLN8z5UaL-e? >z=PTR`6z+uzcfG>BNa0?ra4%7~8x-!P3U{Nz-K20YQ@EEa+$$9BW`%pD!o5o2UafGi >zQMg+a?zIZ{I)!_^$5GYSfGg~lh{`MMmWY}w?3RdXE9{ntIxFm!hzcw0mWbLa?3ReC >zD(sesdMfOeh)OE#mWUcE?3Re?DeRVrx+(0Ih>9ufmWWy@?3RcsDeRVr`Y7y{h{`DJ >zmWY}t?3RdXDD0MqIw<Uxhzcm|mWa78?3RdGFYK0x`7Z31h?y?zmWVkn?3RexE$o(v >zc`fXgh#4*HmWa75?3RdGEbMkR>(Co8t6zuSsBmvmxHl`@TNLiC3imdJd%MEDL*d@3 >zaPLyMcPrd`6z;tW_dbPtzrx+Ba34^(4=UV;6z;<c_YsBrsKR|r;XbZ#pHR3@D%__O >z?$ZkQ8HM|-!hO!;sOmR>E9{nt$}8-ah?*<xmWXOA?3Rc+E9{nt3M=fEh}tU5_gn3E >znW(D5a{L|jR2XtZR8nDnAbV0nh24X{qk0Ov2N88s7(5EK$5Anb;r{#`wNlvqiKvpo >z{7CksJ_>s@e@A5$hDQ@o6NNpRh-xV8(L~fiVR$qV6;RltiJ1Gs9!<oo7xri(=DV;* >z6EV|;J(`F)F6_}n%x+<iCSqO-do&R<TG*qBn9IT*O~fn~_UPHHL$_g8zYg85aGzJW >zFDTp>74Az4cZb4#S>e8-aCa))R~7DS3b$3^?ozm~E8N`*_YH;nrow$o;l8bK-%+^l >zD%|%J?jD8v2Zg&=;l8hMKTx<ID%_70?#BxEj~+)=-v+L*X8{qFSJ<<Fh?*<xSwKX! 
>z74|G3qRt9?G!Ye6*rSQ4t->BnL{$~`Xd>#VutyV7NrgR{h#D&F(L_{FVUH%FZVG!e >z5fxL|qlu`M!X8aTl@#`9BI={CM-x#Qg*}>xnkekiL{vjzk0zoH3VSpW6;RltiJ1Gs >z9!<oo7xri(=DV;*6EV|;J(`F)F6_}n%x+<iCSqO-do&R<TG*qBm`l`kY>Ajf)NX8N >zvku*d`TRQc6NURHh5KiP`>DeHOyT}T;r><Oey(u8P`F<z+`lQ@uN3az74Fvx_Zx-# >z4~6@!!u?L+ey?!<sc`?LaR04ve^9tTD%_tG?#~MMKMMC3g}Yzj{;F`>JdUcq4;=0@ >zuqC4Ma4#W<nk(#Cz~51Ag*^+1sI$TzO+<wi_Glt%t1vv8h^i{=(L~f!VUH%Fk_vk? >z5j9lUqlu`V!X8aT-4ym{A}Xe^M-x#ig*}>xDk<#IMAS!Nk0zor3VSpWHBs23iKvFc >z9!*3Y6!vH$Dxk1O6EXLNJ(`GFFYM7o%y(grCSs-wdo&SqT-c+DnBBr2O~kww_Gltz >zw6I4LF_(osnuu8}?9sDXhfsuSb%+ns{o2Hb>psVa?LNnc?>;wA;rMXg-;WRLeU1<B >zeU1<FeU1<JeU1<NeU1<ReU8llKF8(&pJTIt&#`&H=h#f(b8If~IW`;k9Gee(j?D-@ >z$L0i|W3z(Kv3bGg*v#N_Y;N#5Haqwnn;(3R%@97v<_Mo-vqY7nszE^2JPU}ZyuzLZ >zMATei&jKQ<t*~bS5p`DBqlu_6^qpZ#L~Wt(Ob}ItzB5786Z*~sQAy}K6GRQ6?@SQY >zgT6CC)D8O11W_^QI}>CZNo;!wty3lFJCom0ALu(1L}j4wOb|7JzB55o1NzPcQ3vQd >z6GR1|?@SPLkG?ZO%sTqc1To*}I}^lAqwh=*bBw+-LCh}t&IB>9=sOd{jH2&M5OayX >zGeOKE`p&SO%{nxQ`KsDrPuYPAcaXvztZ;`Y+@T70n8F>da7QTIkqUQ|!X2$}$0*!F >zg<GU>ixqB2;f57%MBzpiZcO3E6>dV|mMGj(g<Ga@4^g;d74A5Nd#J)4uW-vfj%O|P >zhyF=yiKsmEPYR;u&_5}NYD52|AnL5JNAox;tguHDQCsLcl0B&^^c@MJp3rwBh)P1= >zksxXaeMf?*9`qdvqHfT4B#4SZ-;p3{1${??s1o!Y38FsGcO;0)K;MxdY65*nf~W@c >z9SNci(03$=3P9hHAm$!@M}nAj^c@LezR`Cih?z#;ks#(6eMf?rUGyCZVqVdAB#0SB >z-;p5Z5`9O4m__s*VLO|3=mgB?*P(|g+=&WzlER&=a1U3w6$<wVg*!#zPF1)^D%_(K >zZl%JVrf{b#+!+dYroug1;m%UH$0*#{3inurdz`{OUg6GBxF;yw6BX`B3U{u;Jz3$N >zqHwD`jx~e&L;oDML{uL7=LAu6=${irwV{7b5Os#WBSBOc`i=xqTj)CyL{*{hND%de >zz9T_Y68eq=QA6lE5=8Z&??@1JgT5m{R1Er#1W_yKI}${dpzlZ!^?|-4K~x6%js#H? 
>z=sOZbHK6ZE5OsjQBSBOE`i=xK_vkwk#H^$5ND%Xlz9T`*H2RJNF~{gT62$DH??@2y >zioPR3%qaSf1TmNBI}*e!qVEXX*{nn7VLrbOO)A{^3b$I})+pSR!mU-f3l#1`g<Gd^ >zPgS_}3b#SwE>gH@h1;lb7c1N)3b#q&HY?nW!d<Fxmnq!k3b#e!u28tADcsW)?n;Hb >zO5vWNa94XAD-ZRD{yA)ks66z~38LoEKPQN4L;sv0>I{8Hf~YX`9SNeg(03$=szTq9 >zAnFNyM}nv%^c@MJhR}B;i0VP#ks#^@eMf?*81x+pqE^s%B#0_O-;p5d1ARw=s0{QS >z38E&@cO-~vK;Mxd>HvL5f~WxW9SLIY(RU<>Sx4WIAm$r=M}nAX^c@Lej?s4{h}lKo >zks#(3eMf?rQS==NVlL5lB#2o=-x0R6S%<E{e109eR^gthaL-b>XDi%w3ilj^d#=Jg >zPvM@ga4%4}7b@KK3il#~d$Gd3MB#2wxR)y2jS6>@!o5u4UaoMjP`H~F?v)DnDusKs >z!o5b}Zc(_`D%|T7?)4tW%0vC3e-2wBDi8g0f~Yz4&k3U1&_5@LIz!))ASw)fM}nv= >z^c@MJs?c{Nh<ZZbksvAweMf?*A@m&yqI%GGB#62}-;p3H27O0@s1@`b38G5ScO;1V >zK;MxdDg%8-f~X1f9SNcu(03$=IzZo%ASwWTM}nAp^c@Le*3ow)i1|j}ksxLoeMf?r >zWAq&fVs_DYB#3!M-;p3@6n#g6m`n5>31Sw}cZBV1)}c3GKEDpVQQ_XCaBo()w<z3O >z74B^c_jZMQhr+#6;ohZi?^d|?DBOD$?tKdPeucYL;Xa^nA5^#xDcpw@?js8KQHA@M >z!hKxfKA~`*RJczm+@}@pGYa=vh5MYxvGP!V=%2%uh{{9%oFHlr{d0n-HuTR4qR!BF >zB!~(_-;p3{3w=j|s4Daw38J3RcO-~PLf?@fY6zW1f~X#J8VRCq&|4&kia~FYAZi8O >zM1rUibQ1}pKF~iTh{{0!kRWOT{X>GN2J{aJq7KkMB!~(?|BxW&9{odtn053I31Ys{ >zKO~5mM*ol?<{14$f|y<O4+&yk(LW@J8Abn)Am$SNLxPw^^bcV>n|0_m%;(pk+ZFEf >z3ikzt`=Y{qN#X8LxGyW*R}}6}h5M?)eNExED%@QP_jQH4Tj9Q;aNkt8Zz<fj74AC< >z_g#hip2FRuaQ~oi_bS}?748QL_d|vIk;k#3P*>;+!j_1NLSK*|Y6*Qof~X?&1qq^l >z&=(|#%0XX{AZiAEL4v3j^aTl`PS6)5hzdbpkRWOUeL;e#3iJgDq8`u}B#25tUyvYb >z0DVD%n0@pG31Z&S7bJ)oM_-U2<{EuLf|zCW1qot)(HA6$nMGfaAm$W(L4ue~^aTlG >z9?=&hh#5p*kRav`eL;elHS`4uV!qH9B#4<pUl6wUm=nxj=HRoI`uMDb;|F##AKea* >zIx$!I^Fn-f&I=cm7=Or)Pj0{yOS-~h^TQVZb-{wYFv{KV0XY17>NzJjeFD!OZvUO| >zuf6_?&r90&zv+{77-Z)YbUObx_!OPAPtxgpat_yk3;KWWlXF;8_=Bz<gipj(Kv$2T >z0gLVv1exP^KM^$7qJI)J&!T@8L@}eQN6>tWekO=oMOTlY?JfFOLG%V)J%V<$=odiu >z;1f(<iU0pF@OdQm48q=#XV6aeqha`=QTW&PS!d8O)*3hLG`yrWVD6naU?z7MFu&M& >zz+AHXfcev21LoTO@c#n_{ug~l)$E*EozJK`^=aLoQN^|Tzj>yyZ-I5jzejw4!tJbZ >z=xM1PyNkl@s&Kn0-0lj;tAg*}0~Kyhh1*Nv_Exxk6mDOIdyvBIr*IEexcwFG0FNsx >z-=|UJ%F6c{LG8--SwZc}_c=lB%6FTfcICTWP`mPdUQoO8eL+yW@_kWIyYhWWP`mQo 
>zA*fyXzAT8k!M&9rDhBsff~XbTTM42{aBn4u`oO)FASwg*R)VMr+*=8v8gOqVh&sT% >zl^`kr_f~?Kd)!+IV%BkQC5ZXPy_Fzl8uwO$m}A^q31W6}ZzYI%#l4jvW)$~Uf|yJ6 >zTnl0r(Q^$no3GslF{`ip0~PKdg*#Z`4pF#6749&FJ6z$8P`D!%?kI&jTH$!@^5b2o >zaElafvBC{0+_1upDBP&RjVau?!c8dL5`|l;aLW|#Aqsb_!X2k@4^_D16>ho5vGNRp >zL(jEnoXSJbwIFH^cV~jAHr$;FqR!BBEr<%k-I*Y23q99@s4Dba3!<KIcP5BRLeI4z >zY6y2{f~X$!TnnOZaCat%ib2n{AZi6Y*Mg`L^jr&~KG1V5h{{0EwIFH&J=cP$2J~DD >zq7Kk=Er<$0&$S@t9zEBBn054A3u3;}b1jIOM$fe%<`_NKf|y<OTnl1e(Q_?`8AZ>v >zAm$Q1*MgWu^jrhYW*s^Kv-;=d!xZjBg*!>%PFA>wE8Gf&dxXNBqHw1w+#?n4Q3|(G >z;Z9Sy(-rOvg*#K>9<6X^DcoZe?rep7tinA`;U2GW=P2A06z+)%_aucoSK*$la8FUV >zRUXGWbOJc^T#Lr3JoH=(qUO+ZEr@DE&$S@x40j%L+V@dZ7<#S+QCsM_7DQFy9SlL# >z6MC)%QAy~z7DNr<wSyq42R+w<s2lWL3!-Asb1jHk!DoF6qDs(nEr|L+&$S>b13lM* >zs0s933!)m(b1jHEK+m-xDgZs#f|z^sTnl2>(Q_?``9{yRAZ8jp*MgX1^jr&KcF}V! >zh<QcNwIF5`J=cPmOY~d|ViwVJ4K$l|=se8o*P%&;J73{eE8H4|n^L&73U`6RU8r#D >z6z-`Cw_f2kDBMK~H?43R74Bk%yF}qODcoj-n^Cw+749;HyIkS6DBKka_cVojy24$l >za91hZGZgM>k7FG=4;*@~MdMT+daeagbLhDiwA!Kr1W{*&85Bf?q32o<wS}H*K~xob >zt_4v~=(!d|C86h95H*CJYe7^GdaeagH|V(*M8%-zS`f8@o@+r=33{#tQ6K2J7DQ#B >z=UNaofu3tYR0Ddh1yKj+xfVnPpyyf;bB~^DLCiXOt_3mQ=(!feOrz&o5Oa*4YeCE| >zdaeaAujsiJ#EhcnS`c%Io@+tOB6_ZYX0r}mgIWDLbgjZYQ{kSaaL-n_>lE%e3in)v >zd!E8QU*TS$a4%H2>lN-r3io1#dx^r`pl~l$xEmGjCWU*M!o6JKUZHR|E8Hs;?o|r+ >zYK41^!rh{9uT{9$DctKlj&<l7aOk-fjZ=B(xfVptq32o<)rOvHLDU&~t_4wH=(!d| >zZK3B{5LJbqYeCc#daeagN$9y2L=B<mS`gKPo@+tW4SKExQ8DPb7DTO}=UNa|f}U$Z >z)CYR51yLF3xfVoCpyyf;)qtLBLDT_yt_4v6=(!fe+@t4O5VMY+YeCF6daeaA)9ASt >z#2lmNS`f2~o@+tOD|)U4F{9|Y7Q|el=UNc6h@NYp*{nlvz^r~9dZWU<N#WkCaBoq# >zw<_G*6z=T`_YQ@7r^3BU;ohxq?@_q-D%|@N?)?gPtHOOi;XbHvA5ypvE8IsE?xPC# >zF@^iM!hJ&FKB;h@Qn*hm+-DT-vkLb)k7FHr132_ti^i!u^jr&~=FoF3h-yR6wIJ#Y >zJ=cP$F!WptqPEa;Er_Z@&$S@x2|d?>s3i1U3!;Y5b1jJKLC>`y>IOa6f~XkuTnnOB >z&~q(_DnZY+AnF4>*Mg`F^jr&~CeU*&h-yI3wIJ#MJ=cP$0Q6i7V(!s%Er?l1&$S@t >z8$H*8m}&G}3u2DZb1jJ3MbEV$<`q5Hf|ybCTnl0@(Q_?`Swzn@&}`PB+c2wNhi+H6 >z&nw&)6z+=(_a%k9L*c%xa9>flI~DG$3imaI+p2JPDcsi;?rw$qhQfVQ;l8DC-&VNq 
>zDBO1y?t2P%kH_&mzYQFEnngp*EP9#+F{kKh7Q}3#r&$p5h@NIa%piK21u=K%X%@t+ >zp{H38^M#&fLCh3-nguaO=xG+j?4YMv5c7hbW<ksddYT0>7wBmg#4MnvSrFGBJ<Wo+ >z^5|(6#5G4xvmmZEdYT1sozc@Qh%1bqW<gwA^fU|Ns-mYE=pKBwJzptpZKZvKNxPGP >zXZi^kou6PY4$bqV{2mNu_Tn64r{IQb@Tmg_%)y5Zn0FsFU`7`Ym`{xkn3I+cn6Di- >zU@l+&f47ssT-V)6z_sGS|KID``Fkfp`<u)5vH3rS5`Yle+`bC;Acfmc;U27T`zzc5 >z9>>bfA3XhnFGO?+Prn2WSoAAFC`tI;zYCgc(XR!~v*<U1wzKFz1kJbTw}KW}^gBV@ >zTl9ND^afAA1np?ie+k;jqW>230E_-0XlIN5D5$XLPl9%_=+A<7wdj8Y?Pk$m1nq9o >z{et$e2<?S-T^?u=8VxPl(;|cA;hODb5!$FM+S{T5LCiFs8VF*J@zg*Nvx}z&f|ys- >z2|>&#o*D>ZE~%2X1!fUX4S;6z4s8&#`gdptD%?Q|cd)`8qHu>Q++hlLxWXNwa7QZK >zQ3`jo!tvVW$GcGB7Af3fg&R`1VTBt}xKV{0Q@C-3n^3qV3b$0@mMPpr6z*7sJ5J#q >zs&L0E+;WfOSu_X^PrpRtR34sw38Lok^h*%chNoYGs53nM5=4dJ>6ajC3s1iUQB`>Q >zC5U>$(=S0(5}tkuqK5GFOAytAr(c4o8$A6IM8)9gmmq2dPrn3FC3yNJi2A_OFF{lW >zo_-0UCh*ii5Y>RE27;&qJT(wR1>lof1Tpt`Y9NSN$5R7A%r~AI2x6x3)Ibn(jHd>I >zm|Z+I5X8LVsevG76i*EVF_(C1Ac$GSQv;yctV1VYR{y+wn8KZ?a3?9;$qM&yg<GL; >zk5IT%6z)`od!)iWO5s*2+-VATy272IaAzvqqZRHfg?o&`ovm<>Rk+6~+~XDQ9EE#= >z!aY&po}_T+D%_J5?kNhl%HvpvP5_6eU!rj;4^O`YQFD0uC5URn(=S2P8J>O#qQda> >zOAxh%r(c4oDm?uXL_Oi@mmn$$Prn3FLwNcni0Z-9FG17|o_-0UV(|1!5VeA*UxKI- >zJpB?xec<VrASwe-zXVYecxoVsYQR$iLDT`B8VI5S@YFyMbC0J6f|zwYH4wyn<Eeol >zW*Sco1Tn{WY9NT&#Zv=8%qyN62x3O@)Ibn(iKhmFm_<A_0GiD@bRK5)>(Hdaov(1K >z6>g2fO)1=3g}XrEE>yU63ininTd!~%6z(F0n^w4u3U{%>U7~QC6mGM^%_!WZ3U`^p >zU9NCj6z&Rzdz!*MUE!`&xT_TI847o`$FUBb2M$laMB`K*o_-0U=J51O5Y>jKUxKJJ >zJpB?xh2iO!AZiOwzXVZLc={!XdcxB$K~xf+ehH$6@bpU%)q|&Bf~Xrj{SrjQ;OUnj >zY6VZf1W_e;`Xz|^z|${5R0f`Y38E(O)Ibo`fTsq6r~^DT5JUywsevHo9#0JfG3$70 >zAc*<KQv*TFG@cp=Vvh0DKoGNwrv`$US3ETk#Ejypfgt7*PYnbyi+E}PG@Etk8qDg~ >zp=%ZHnF{wTg?qNbU8iu*QMl(S-18Lf`3m;}g?pjGU9WI2Qn(i@+)EVh28Da6!riEF >zH!0l96z=5;_X>r(S>ax(aIaFhS1a6W6z&#<d#%E~PT^kfajZkvfWy--(KwZdr(c4o >zIXwLmM781Rmmum4Prn3FVR-r_h}y!_FF{llo_-0Up78Wb5S4_dUxKJ1JpB?x_2B83 >zAnFEBzXVY+c={!XTEWvVK~xEzehH#J@bpU%m4T;Uf~W~R)euBA;Hicn>HtqQ1W^Ha >zsv(HE$5RbK%sQTG2x7kRR6`Imji(xdm}5NE5X9`_sfHlt6;Cw;F{5~@A&9xeQw>4P 
>zBA#jh&1M~X17`K>&>I!*O$zsBg?o#_y;b4frf_dpxOXVrI~DF-3iob>dym4sSK;2L >zaPL>RTNUmD3im;U`;fwYSm8dRa358;k15>8748!X_eq8Ol)`;l;Xb2qpH;Zec^vD| >z8^Gb|oM@cN!_zrI)Eu7938LEYbWRX;hNp9as4zU86GUy{>6{>{3Qy+*QBQa}Cx}YI >z(>X!Z5I%885Y>ZE91=v`;1h=gQ8D<$Awkp%K5<A8Rf10(5=4FA6Ndy*8TiB@LDU33 >zsYMXgfKO@>L>=IhS_Dx6_@ov=%soD-MG&)&PihgweB+Z^1ToY2q!vNUF+Qn95VMO< >zY7xY|;*(kgF{AjT7D3D<KB+|zvxrY>0h-M^bQ@;%>(K2A_j!f;g2H`K;l8ABcPQMK >z749nvcc;RARpGv-a9b7bE`|HL!riTK-%z-3dK}NU+rZ(IbtIQuVSKWVAg(PwSw|38 >z6`!mli0g?@))B;$<r;u_+Ubp&zs@X0!YxNi7l9YI_%e6o%pt`$C6M-W#DpR6N@ >z>w{0$5yX|jC+i5}n&6Xl1aUR+$vT3#4)|mpL0kcRvJTKa<_P;r;OvN}@9ih<y!M}f >z(fJ8Jai;?m@7?w*f_nVzKKA3sX4t<?F@MF!V$8$8x5Jo^u>gbI4)T9~Y{uVz{{OWC >zf8A#AH4aoj_YZY(-*vD5-Jj05aF=(#bb&eg@DJTP1dbOFYJ<N&0b?%4DHznsc^JE4 >zY{A$G;|7ceVDP4LcZ_#n(9%zF(Ej-UE*NL8o1PrpFg5syO_S4WfP3+$ty{fm<K~TP >zX9gd>W%bks>x22klTKQ+{hh57aQroQUv=yHeeP_{!M|gEn&Kb$?p@!#WzmCnx#?!a >zf6SlW$N2w<u{{RkKNF+CI0s`d4Enz##s&<=HiN;quf*6B<JlNokLxkGKHtKqt_POI >z{<MAH$<9TBbMnXk|Nbz-V>s988#i4vxnXc-YV(ZUZ^h)L8)n*&sX6_`_P^=wigjBS >zuHCS4QhvP?W}z2cd0Ke<#!c%tUpO^2v)Ry*j#K!j_R~Uzn1cervtn=o=P|aZ<9{m- >z!=}vF8Ee<BTRqiI+u+F1(CDDEWGl{z>(;JaijOZb#%6ZWW9Y-8rRG1Eeee>HfB*TL >ytU$}0gs%p_2ww<(hB@DU)p(kj-JqRnMT|L`=g+2?9V7QiwI4Zc^ltx~?f*Y>YijHO > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/diffiSAC.txt b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/diffiSAC.txt >deleted file mode 100644 >index 96b87c066b3ac7a5bad928d6606e36bccea8e95b..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/diffiSAC.txt >+++ /dev/null >@@ -1,481 +0,0 @@ >-#!/bin/bash >-(set -o igncr) 2>/dev/null && set -o igncr; # force bash to ignore \r character >- >-diff ../dataqa350/i30_1DTMF_16kHz_long.pcm ../dataqa351/i30_1DTMF_16kHz_long.pcm >-diff ../dataqa350/i60_1DTMF_16kHz_long.pcm ../dataqa351/i60_1DTMF_16kHz_long.pcm >-diff 
../dataqa350/i30_2DTMF_16kHz_long.pcm ../dataqa351/i30_2DTMF_16kHz_long.pcm >-diff ../dataqa350/i60_2DTMF_16kHz_long.pcm ../dataqa351/i60_2DTMF_16kHz_long.pcm >-diff ../dataqa350/i30_3DTMF_16kHz_long.pcm ../dataqa351/i30_3DTMF_16kHz_long.pcm >-diff ../dataqa350/i60_3DTMF_16kHz_long.pcm ../dataqa351/i60_3DTMF_16kHz_long.pcm >-diff ../dataqa350/i30_4DTMF_16kHz_long.pcm ../dataqa351/i30_4DTMF_16kHz_long.pcm >-diff ../dataqa350/i60_4DTMF_16kHz_long.pcm ../dataqa351/i60_4DTMF_16kHz_long.pcm >-diff ../dataqa350/i30_5DTMF_16kHz_long.pcm ../dataqa351/i30_5DTMF_16kHz_long.pcm >-diff ../dataqa350/i60_5DTMF_16kHz_long.pcm ../dataqa351/i60_5DTMF_16kHz_long.pcm >-diff ../dataqa350/i30_6DTMF_16kHz_long.pcm ../dataqa351/i30_6DTMF_16kHz_long.pcm >-diff ../dataqa350/i60_6DTMF_16kHz_long.pcm ../dataqa351/i60_6DTMF_16kHz_long.pcm >-diff ../dataqa350/a1DTMF_16kHz_long.pcm ../dataqa351/a1DTMF_16kHz_long.pcm >-diff ../dataqa350/a2DTMF_16kHz_long.pcm ../dataqa351/a2DTMF_16kHz_long.pcm >-diff ../dataqa350/a3DTMF_16kHz_long.pcm ../dataqa351/a3DTMF_16kHz_long.pcm >-diff ../dataqa350/i30_7DTMF_16kHz_short.pcm ../dataqa351/i30_7DTMF_16kHz_short.pcm >-diff ../dataqa350/i60_7DTMF_16kHz_short.pcm ../dataqa351/i60_7DTMF_16kHz_short.pcm >-diff ../dataqa350/i30_8DTMF_16kHz_short.pcm ../dataqa351/i30_8DTMF_16kHz_short.pcm >-diff ../dataqa350/i60_8DTMF_16kHz_short.pcm ../dataqa351/i60_8DTMF_16kHz_short.pcm >-diff ../dataqa350/i30_9DTMF_16kHz_short.pcm ../dataqa351/i30_9DTMF_16kHz_short.pcm >-diff ../dataqa350/i60_9DTMF_16kHz_short.pcm ../dataqa351/i60_9DTMF_16kHz_short.pcm >-diff ../dataqa350/i30_10DTMF_16kHz_short.pcm ../dataqa351/i30_10DTMF_16kHz_short.pcm >-diff ../dataqa350/i60_10DTMF_16kHz_short.pcm ../dataqa351/i60_10DTMF_16kHz_short.pcm >-diff ../dataqa350/i30_11DTMF_16kHz_short.pcm ../dataqa351/i30_11DTMF_16kHz_short.pcm >-diff ../dataqa350/i60_11DTMF_16kHz_short.pcm ../dataqa351/i60_11DTMF_16kHz_short.pcm >-diff ../dataqa350/i30_12DTMF_16kHz_short.pcm ../dataqa351/i30_12DTMF_16kHz_short.pcm 
>-diff ../dataqa350/i60_12DTMF_16kHz_short.pcm ../dataqa351/i60_12DTMF_16kHz_short.pcm
>-diff ../dataqa350/a4DTMF_16kHz_short.pcm ../dataqa351/a4DTMF_16kHz_short.pcm
>-diff ../dataqa350/a5DTMF_16kHz_short.pcm ../dataqa351/a5DTMF_16kHz_short.pcm
>-diff ../dataqa350/a6DTMF_16kHz_short.pcm ../dataqa351/a6DTMF_16kHz_short.pcm
>-diff ../dataqa350/i30_13F00.INP ../dataqa350/i30_13F00.INP
>-diff ../dataqa350/i60_13F00.INP ../dataqa350/i60_13F00.INP
>-diff ../dataqa350/i30_14F00.INP ../dataqa350/i30_14F00.INP
>-diff ../dataqa350/i60_14F00.INP ../dataqa350/i60_14F00.INP
>-diff ../dataqa350/i30_15F00.INP ../dataqa350/i30_15F00.INP
>-diff ../dataqa350/i60_15F00.INP ../dataqa350/i60_15F00.INP
>-diff ../dataqa350/i30_16F00.INP ../dataqa350/i30_16F00.INP
>-diff ../dataqa350/i60_16F00.INP ../dataqa350/i60_16F00.INP
>-diff ../dataqa350/i30_17F00.INP ../dataqa350/i30_17F00.INP
>-diff ../dataqa350/i60_17F00.INP ../dataqa350/i60_17F00.INP
>-diff ../dataqa350/i30_18F00.INP ../dataqa350/i30_18F00.INP
>-diff ../dataqa350/i60_18F00.INP ../dataqa350/i60_18F00.INP
>-diff ../dataqa350/a7F00.INP ../dataqa350/a7F00.INP
>-diff ../dataqa350/a8F00.INP ../dataqa350/a8F00.INP
>-diff ../dataqa350/a9F00.INP ../dataqa350/a9F00.INP
>-diff ../dataqa350/i30_19F01.INP ../dataqa350/i30_19F01.INP
>-diff ../dataqa350/i60_19F01.INP ../dataqa350/i60_19F01.INP
>-diff ../dataqa350/i30_20F01.INP ../dataqa350/i30_20F01.INP
>-diff ../dataqa350/i60_20F01.INP ../dataqa350/i60_20F01.INP
>-diff ../dataqa350/i30_21F01.INP ../dataqa350/i30_21F01.INP
>-diff ../dataqa350/i60_21F01.INP ../dataqa350/i60_21F01.INP
>-diff ../dataqa350/i30_22F01.INP ../dataqa350/i30_22F01.INP
>-diff ../dataqa350/i60_22F01.INP ../dataqa350/i60_22F01.INP
>-diff ../dataqa350/i30_23F01.INP ../dataqa350/i30_23F01.INP
>-diff ../dataqa350/i60_23F01.INP ../dataqa350/i60_23F01.INP
>-diff ../dataqa350/i30_24F01.INP ../dataqa350/i30_24F01.INP
>-diff ../dataqa350/i60_24F01.INP ../dataqa350/i60_24F01.INP
>-diff ../dataqa350/a10F01.INP ../dataqa350/a10F01.INP
>-diff ../dataqa350/a11F01.INP ../dataqa350/a11F01.INP
>-diff ../dataqa350/a12F01.INP ../dataqa350/a12F01.INP
>-diff ../dataqa350/i30_25F02.INP ../dataqa350/i30_25F02.INP
>-diff ../dataqa350/i60_25F02.INP ../dataqa350/i60_25F02.INP
>-diff ../dataqa350/i30_26F02.INP ../dataqa350/i30_26F02.INP
>-diff ../dataqa350/i60_26F02.INP ../dataqa350/i60_26F02.INP
>-diff ../dataqa350/i30_27F02.INP ../dataqa350/i30_27F02.INP
>-diff ../dataqa350/i60_27F02.INP ../dataqa350/i60_27F02.INP
>-diff ../dataqa350/i30_28F02.INP ../dataqa350/i30_28F02.INP
>-diff ../dataqa350/i60_28F02.INP ../dataqa350/i60_28F02.INP
>-diff ../dataqa350/i30_29F02.INP ../dataqa350/i30_29F02.INP
>-diff ../dataqa350/i60_29F02.INP ../dataqa350/i60_29F02.INP
>-diff ../dataqa350/i30_30F02.INP ../dataqa350/i30_30F02.INP
>-diff ../dataqa350/i60_30F02.INP ../dataqa350/i60_30F02.INP
>-diff ../dataqa350/a13F02.INP ../dataqa350/a13F02.INP
>-diff ../dataqa350/a14F02.INP ../dataqa350/a14F02.INP
>-diff ../dataqa350/a15F02.INP ../dataqa350/a15F02.INP
>-diff ../dataqa350/i30_31F03.INP ../dataqa350/i30_31F03.INP
>-diff ../dataqa350/i60_31F03.INP ../dataqa350/i60_31F03.INP
>-diff ../dataqa350/i30_32F03.INP ../dataqa350/i30_32F03.INP
>-diff ../dataqa350/i60_32F03.INP ../dataqa350/i60_32F03.INP
>-diff ../dataqa350/i30_33F03.INP ../dataqa350/i30_33F03.INP
>-diff ../dataqa350/i60_33F03.INP ../dataqa350/i60_33F03.INP
>-diff ../dataqa350/i30_34F03.INP ../dataqa350/i30_34F03.INP
>-diff ../dataqa350/i60_34F03.INP ../dataqa350/i60_34F03.INP
>-diff ../dataqa350/i30_35F03.INP ../dataqa350/i30_35F03.INP
>-diff ../dataqa350/i60_35F03.INP ../dataqa350/i60_35F03.INP
>-diff ../dataqa350/i30_36F03.INP ../dataqa350/i30_36F03.INP
>-diff ../dataqa350/i60_36F03.INP ../dataqa350/i60_36F03.INP
>-diff ../dataqa350/a16F03.INP ../dataqa350/a16F03.INP
>-diff ../dataqa350/a17F03.INP ../dataqa350/a17F03.INP
>-diff ../dataqa350/a18F03.INP ../dataqa350/a18F03.INP
>-diff ../dataqa350/i30_37F04.INP ../dataqa350/i30_37F04.INP
>-diff ../dataqa350/i60_37F04.INP ../dataqa350/i60_37F04.INP
>-diff ../dataqa350/i30_38F04.INP ../dataqa350/i30_38F04.INP
>-diff ../dataqa350/i60_38F04.INP ../dataqa350/i60_38F04.INP
>-diff ../dataqa350/i30_39F04.INP ../dataqa350/i30_39F04.INP
>-diff ../dataqa350/i60_39F04.INP ../dataqa350/i60_39F04.INP
>-diff ../dataqa350/i30_40F04.INP ../dataqa350/i30_40F04.INP
>-diff ../dataqa350/i60_40F04.INP ../dataqa350/i60_40F04.INP
>-diff ../dataqa350/i30_41F04.INP ../dataqa350/i30_41F04.INP
>-diff ../dataqa350/i60_41F04.INP ../dataqa350/i60_41F04.INP
>-diff ../dataqa350/i30_42F04.INP ../dataqa350/i30_42F04.INP
>-diff ../dataqa350/i60_42F04.INP ../dataqa350/i60_42F04.INP
>-diff ../dataqa350/a19F04.INP ../dataqa350/a19F04.INP
>-diff ../dataqa350/a20F04.INP ../dataqa350/a20F04.INP
>-diff ../dataqa350/a21F04.INP ../dataqa350/a21F04.INP
>-diff ../dataqa350/i30_43F05.INP ../dataqa350/i30_43F05.INP
>-diff ../dataqa350/i60_43F05.INP ../dataqa350/i60_43F05.INP
>-diff ../dataqa350/i30_44F05.INP ../dataqa350/i30_44F05.INP
>-diff ../dataqa350/i60_44F05.INP ../dataqa350/i60_44F05.INP
>-diff ../dataqa350/i30_45F05.INP ../dataqa350/i30_45F05.INP
>-diff ../dataqa350/i60_45F05.INP ../dataqa350/i60_45F05.INP
>-diff ../dataqa350/i30_46F05.INP ../dataqa350/i30_46F05.INP
>-diff ../dataqa350/i60_46F05.INP ../dataqa350/i60_46F05.INP
>-diff ../dataqa350/i30_47F05.INP ../dataqa350/i30_47F05.INP
>-diff ../dataqa350/i60_47F05.INP ../dataqa350/i60_47F05.INP
>-diff ../dataqa350/i30_48F05.INP ../dataqa350/i30_48F05.INP
>-diff ../dataqa350/i60_48F05.INP ../dataqa350/i60_48F05.INP
>-diff ../dataqa350/a22F05.INP ../dataqa350/a22F05.INP
>-diff ../dataqa350/a23F05.INP ../dataqa350/a23F05.INP
>-diff ../dataqa350/a24F05.INP ../dataqa350/a24F05.INP
>-diff ../dataqa350/i30_49F06.INP ../dataqa350/i30_49F06.INP
>-diff ../dataqa350/i60_49F06.INP ../dataqa350/i60_49F06.INP
>-diff ../dataqa350/i30_50F06.INP ../dataqa350/i30_50F06.INP
>-diff ../dataqa350/i60_50F06.INP ../dataqa350/i60_50F06.INP
>-diff ../dataqa350/i30_51F06.INP ../dataqa350/i30_51F06.INP
>-diff ../dataqa350/i60_51F06.INP ../dataqa350/i60_51F06.INP
>-diff ../dataqa350/i30_52F06.INP ../dataqa350/i30_52F06.INP
>-diff ../dataqa350/i60_52F06.INP ../dataqa350/i60_52F06.INP
>-diff ../dataqa350/i30_53F06.INP ../dataqa350/i30_53F06.INP
>-diff ../dataqa350/i60_53F06.INP ../dataqa350/i60_53F06.INP
>-diff ../dataqa350/i30_54F06.INP ../dataqa350/i30_54F06.INP
>-diff ../dataqa350/i60_54F06.INP ../dataqa350/i60_54F06.INP
>-diff ../dataqa350/a25F06.INP ../dataqa350/a25F06.INP
>-diff ../dataqa350/a26F06.INP ../dataqa350/a26F06.INP
>-diff ../dataqa350/a27F06.INP ../dataqa350/a27F06.INP
>-diff ../dataqa350/i30_55longtest.pcm ../dataqa351/i30_55longtest.pcm
>-diff ../dataqa350/i60_55longtest.pcm ../dataqa351/i60_55longtest.pcm
>-diff ../dataqa350/i30_56longtest.pcm ../dataqa351/i30_56longtest.pcm
>-diff ../dataqa350/i60_56longtest.pcm ../dataqa351/i60_56longtest.pcm
>-diff ../dataqa350/i30_57longtest.pcm ../dataqa351/i30_57longtest.pcm
>-diff ../dataqa350/i60_57longtest.pcm ../dataqa351/i60_57longtest.pcm
>-diff ../dataqa350/i30_58longtest.pcm ../dataqa351/i30_58longtest.pcm
>-diff ../dataqa350/i60_58longtest.pcm ../dataqa351/i60_58longtest.pcm
>-diff ../dataqa350/i30_59longtest.pcm ../dataqa351/i30_59longtest.pcm
>-diff ../dataqa350/i60_59longtest.pcm ../dataqa351/i60_59longtest.pcm
>-diff ../dataqa350/i30_60longtest.pcm ../dataqa351/i30_60longtest.pcm
>-diff ../dataqa350/i60_60longtest.pcm ../dataqa351/i60_60longtest.pcm
>-diff ../dataqa350/a28longtest.pcm ../dataqa351/a28longtest.pcm
>-diff ../dataqa350/a29longtest.pcm ../dataqa351/a29longtest.pcm
>-diff ../dataqa350/a30longtest.pcm ../dataqa351/a30longtest.pcm
>-diff ../dataqa350/i30_61ltest_speech_clean.pcm ../dataqa351/i30_61ltest_speech_clean.pcm
>-diff ../dataqa350/i60_61ltest_speech_clean.pcm ../dataqa351/i60_61ltest_speech_clean.pcm
>-diff ../dataqa350/i30_62ltest_speech_clean.pcm ../dataqa351/i30_62ltest_speech_clean.pcm
>-diff ../dataqa350/i60_62ltest_speech_clean.pcm ../dataqa351/i60_62ltest_speech_clean.pcm
>-diff ../dataqa350/i30_63ltest_speech_clean.pcm ../dataqa351/i30_63ltest_speech_clean.pcm
>-diff ../dataqa350/i60_63ltest_speech_clean.pcm ../dataqa351/i60_63ltest_speech_clean.pcm
>-diff ../dataqa350/i30_64ltest_speech_clean.pcm ../dataqa351/i30_64ltest_speech_clean.pcm
>-diff ../dataqa350/i60_64ltest_speech_clean.pcm ../dataqa351/i60_64ltest_speech_clean.pcm
>-diff ../dataqa350/i30_65ltest_speech_clean.pcm ../dataqa351/i30_65ltest_speech_clean.pcm
>-diff ../dataqa350/i60_65ltest_speech_clean.pcm ../dataqa351/i60_65ltest_speech_clean.pcm
>-diff ../dataqa350/i30_66ltest_speech_clean.pcm ../dataqa351/i30_66ltest_speech_clean.pcm
>-diff ../dataqa350/i60_66ltest_speech_clean.pcm ../dataqa351/i60_66ltest_speech_clean.pcm
>-diff ../dataqa350/a31ltest_speech_clean.pcm ../dataqa351/a31ltest_speech_clean.pcm
>-diff ../dataqa350/a32ltest_speech_clean.pcm ../dataqa351/a32ltest_speech_clean.pcm
>-diff ../dataqa350/a33ltest_speech_clean.pcm ../dataqa351/a33ltest_speech_clean.pcm
>-diff ../dataqa350/i30_67ltest_music.pcm ../dataqa351/i30_67ltest_music.pcm
>-diff ../dataqa350/i60_67ltest_music.pcm ../dataqa351/i60_67ltest_music.pcm
>-diff ../dataqa350/i30_68ltest_music.pcm ../dataqa351/i30_68ltest_music.pcm
>-diff ../dataqa350/i60_68ltest_music.pcm ../dataqa351/i60_68ltest_music.pcm
>-diff ../dataqa350/i30_69ltest_music.pcm ../dataqa351/i30_69ltest_music.pcm
>-diff ../dataqa350/i60_69ltest_music.pcm ../dataqa351/i60_69ltest_music.pcm
>-diff ../dataqa350/i30_70ltest_music.pcm ../dataqa351/i30_70ltest_music.pcm
>-diff ../dataqa350/i60_70ltest_music.pcm ../dataqa351/i60_70ltest_music.pcm
>-diff ../dataqa350/i30_71ltest_music.pcm ../dataqa351/i30_71ltest_music.pcm
>-diff ../dataqa350/i60_71ltest_music.pcm ../dataqa351/i60_71ltest_music.pcm
>-diff ../dataqa350/i30_72ltest_music.pcm ../dataqa351/i30_72ltest_music.pcm
>-diff ../dataqa350/i60_72ltest_music.pcm ../dataqa351/i60_72ltest_music.pcm
>-diff ../dataqa350/a34ltest_music.pcm ../dataqa351/a34ltest_music.pcm
>-diff ../dataqa350/a35ltest_music.pcm ../dataqa351/a35ltest_music.pcm
>-diff ../dataqa350/a36ltest_music.pcm ../dataqa351/a36ltest_music.pcm
>-diff ../dataqa350/i30_73ltest_speech_noisy.pcm ../dataqa351/i30_73ltest_speech_noisy.pcm
>-diff ../dataqa350/i60_73ltest_speech_noisy.pcm ../dataqa351/i60_73ltest_speech_noisy.pcm
>-diff ../dataqa350/i30_74ltest_speech_noisy.pcm ../dataqa351/i30_74ltest_speech_noisy.pcm
>-diff ../dataqa350/i60_74ltest_speech_noisy.pcm ../dataqa351/i60_74ltest_speech_noisy.pcm
>-diff ../dataqa350/i30_75ltest_speech_noisy.pcm ../dataqa351/i30_75ltest_speech_noisy.pcm
>-diff ../dataqa350/i60_75ltest_speech_noisy.pcm ../dataqa351/i60_75ltest_speech_noisy.pcm
>-diff ../dataqa350/i30_76ltest_speech_noisy.pcm ../dataqa351/i30_76ltest_speech_noisy.pcm
>-diff ../dataqa350/i60_76ltest_speech_noisy.pcm ../dataqa351/i60_76ltest_speech_noisy.pcm
>-diff ../dataqa350/i30_77ltest_speech_noisy.pcm ../dataqa351/i30_77ltest_speech_noisy.pcm
>-diff ../dataqa350/i60_77ltest_speech_noisy.pcm ../dataqa351/i60_77ltest_speech_noisy.pcm
>-diff ../dataqa350/i30_78ltest_speech_noisy.pcm ../dataqa351/i30_78ltest_speech_noisy.pcm
>-diff ../dataqa350/i60_78ltest_speech_noisy.pcm ../dataqa351/i60_78ltest_speech_noisy.pcm
>-diff ../dataqa350/a37ltest_speech_noisy.pcm ../dataqa351/a37ltest_speech_noisy.pcm
>-diff ../dataqa350/a38ltest_speech_noisy.pcm ../dataqa351/a38ltest_speech_noisy.pcm
>-diff ../dataqa350/a39ltest_speech_noisy.pcm ../dataqa351/a39ltest_speech_noisy.pcm
>-diff ../dataqa350/i30_79misc2.pcm ../dataqa351/i30_79misc2.pcm
>-diff ../dataqa350/i60_79misc2.pcm ../dataqa351/i60_79misc2.pcm
>-diff ../dataqa350/i30_80misc2.pcm ../dataqa351/i30_80misc2.pcm
>-diff ../dataqa350/i60_80misc2.pcm ../dataqa351/i60_80misc2.pcm
>-diff ../dataqa350/i30_81misc2.pcm ../dataqa351/i30_81misc2.pcm
>-diff ../dataqa350/i60_81misc2.pcm ../dataqa351/i60_81misc2.pcm
>-diff ../dataqa350/i30_82misc2.pcm ../dataqa351/i30_82misc2.pcm
>-diff ../dataqa350/i60_82misc2.pcm ../dataqa351/i60_82misc2.pcm
>-diff ../dataqa350/i30_83misc2.pcm ../dataqa351/i30_83misc2.pcm
>-diff ../dataqa350/i60_83misc2.pcm ../dataqa351/i60_83misc2.pcm
>-diff ../dataqa350/i30_84misc2.pcm ../dataqa351/i30_84misc2.pcm
>-diff ../dataqa350/i60_84misc2.pcm ../dataqa351/i60_84misc2.pcm
>-diff ../dataqa350/a40misc2.pcm ../dataqa351/a40misc2.pcm
>-diff ../dataqa350/a41misc2.pcm ../dataqa351/a41misc2.pcm
>-diff ../dataqa350/a42misc2.pcm ../dataqa351/a42misc2.pcm
>-diff ../dataqa350/i30_85purenb.pcm ../dataqa351/i30_85purenb.pcm
>-diff ../dataqa350/i60_85purenb.pcm ../dataqa351/i60_85purenb.pcm
>-diff ../dataqa350/i30_86purenb.pcm ../dataqa351/i30_86purenb.pcm
>-diff ../dataqa350/i60_86purenb.pcm ../dataqa351/i60_86purenb.pcm
>-diff ../dataqa350/i30_87purenb.pcm ../dataqa351/i30_87purenb.pcm
>-diff ../dataqa350/i60_87purenb.pcm ../dataqa351/i60_87purenb.pcm
>-diff ../dataqa350/i30_88purenb.pcm ../dataqa351/i30_88purenb.pcm
>-diff ../dataqa350/i60_88purenb.pcm ../dataqa351/i60_88purenb.pcm
>-diff ../dataqa350/i30_89purenb.pcm ../dataqa351/i30_89purenb.pcm
>-diff ../dataqa350/i60_89purenb.pcm ../dataqa351/i60_89purenb.pcm
>-diff ../dataqa350/i30_90purenb.pcm ../dataqa351/i30_90purenb.pcm
>-diff ../dataqa350/i60_90purenb.pcm ../dataqa351/i60_90purenb.pcm
>-diff ../dataqa350/a43purenb.pcm ../dataqa351/a43purenb.pcm
>-diff ../dataqa350/a44purenb.pcm ../dataqa351/a44purenb.pcm
>-diff ../dataqa350/a45purenb.pcm ../dataqa351/a45purenb.pcm
>-diff ../dataqa350/i30_91sawsweep_380_60.pcm ../dataqa351/i30_91sawsweep_380_60.pcm
>-diff ../dataqa350/i60_91sawsweep_380_60.pcm ../dataqa351/i60_91sawsweep_380_60.pcm
>-diff ../dataqa350/i30_92sawsweep_380_60.pcm ../dataqa351/i30_92sawsweep_380_60.pcm
>-diff ../dataqa350/i60_92sawsweep_380_60.pcm ../dataqa351/i60_92sawsweep_380_60.pcm
>-diff ../dataqa350/i30_93sawsweep_380_60.pcm ../dataqa351/i30_93sawsweep_380_60.pcm
>-diff ../dataqa350/i60_93sawsweep_380_60.pcm ../dataqa351/i60_93sawsweep_380_60.pcm
>-diff ../dataqa350/i30_94sawsweep_380_60.pcm ../dataqa351/i30_94sawsweep_380_60.pcm
>-diff ../dataqa350/i60_94sawsweep_380_60.pcm ../dataqa351/i60_94sawsweep_380_60.pcm
>-diff ../dataqa350/i30_95sawsweep_380_60.pcm ../dataqa351/i30_95sawsweep_380_60.pcm
>-diff ../dataqa350/i60_95sawsweep_380_60.pcm ../dataqa351/i60_95sawsweep_380_60.pcm
>-diff ../dataqa350/i30_96sawsweep_380_60.pcm ../dataqa351/i30_96sawsweep_380_60.pcm
>-diff ../dataqa350/i60_96sawsweep_380_60.pcm ../dataqa351/i60_96sawsweep_380_60.pcm
>-diff ../dataqa350/a46sawsweep_380_60.pcm ../dataqa351/a46sawsweep_380_60.pcm
>-diff ../dataqa350/a47sawsweep_380_60.pcm ../dataqa351/a47sawsweep_380_60.pcm
>-diff ../dataqa350/a48sawsweep_380_60.pcm ../dataqa351/a48sawsweep_380_60.pcm
>-diff ../dataqa350/i30_97sinesweep.pcm ../dataqa351/i30_97sinesweep.pcm
>-diff ../dataqa350/i60_97sinesweep.pcm ../dataqa351/i60_97sinesweep.pcm
>-diff ../dataqa350/i30_98sinesweep.pcm ../dataqa351/i30_98sinesweep.pcm
>-diff ../dataqa350/i60_98sinesweep.pcm ../dataqa351/i60_98sinesweep.pcm
>-diff ../dataqa350/i30_99sinesweep.pcm ../dataqa351/i30_99sinesweep.pcm
>-diff ../dataqa350/i60_99sinesweep.pcm ../dataqa351/i60_99sinesweep.pcm
>-diff ../dataqa350/i30_100sinesweep.pcm ../dataqa351/i30_100sinesweep.pcm
>-diff ../dataqa350/i60_100sinesweep.pcm ../dataqa351/i60_100sinesweep.pcm
>-diff ../dataqa350/i30_101sinesweep.pcm ../dataqa351/i30_101sinesweep.pcm
>-diff ../dataqa350/i60_101sinesweep.pcm ../dataqa351/i60_101sinesweep.pcm
>-diff ../dataqa350/i30_102sinesweep.pcm ../dataqa351/i30_102sinesweep.pcm
>-diff ../dataqa350/i60_102sinesweep.pcm ../dataqa351/i60_102sinesweep.pcm
>-diff ../dataqa350/a49sinesweep.pcm ../dataqa351/a49sinesweep.pcm
>-diff ../dataqa350/a50sinesweep.pcm ../dataqa351/a50sinesweep.pcm
>-diff ../dataqa350/a51sinesweep.pcm ../dataqa351/a51sinesweep.pcm
>-diff ../dataqa350/i30_103sinesweep_half.pcm ../dataqa351/i30_103sinesweep_half.pcm
>-diff ../dataqa350/i60_103sinesweep_half.pcm ../dataqa351/i60_103sinesweep_half.pcm
>-diff ../dataqa350/i30_104sinesweep_half.pcm ../dataqa351/i30_104sinesweep_half.pcm
>-diff ../dataqa350/i60_104sinesweep_half.pcm ../dataqa351/i60_104sinesweep_half.pcm
>-diff ../dataqa350/i30_105sinesweep_half.pcm ../dataqa351/i30_105sinesweep_half.pcm
>-diff ../dataqa350/i60_105sinesweep_half.pcm ../dataqa351/i60_105sinesweep_half.pcm
>-diff ../dataqa350/i30_106sinesweep_half.pcm ../dataqa351/i30_106sinesweep_half.pcm
>-diff ../dataqa350/i60_106sinesweep_half.pcm ../dataqa351/i60_106sinesweep_half.pcm
>-diff ../dataqa350/i30_107sinesweep_half.pcm ../dataqa351/i30_107sinesweep_half.pcm
>-diff ../dataqa350/i60_107sinesweep_half.pcm ../dataqa351/i60_107sinesweep_half.pcm
>-diff ../dataqa350/i30_108sinesweep_half.pcm ../dataqa351/i30_108sinesweep_half.pcm
>-diff ../dataqa350/i60_108sinesweep_half.pcm ../dataqa351/i60_108sinesweep_half.pcm
>-diff ../dataqa350/a52sinesweep_half.pcm ../dataqa351/a52sinesweep_half.pcm
>-diff ../dataqa350/a53sinesweep_half.pcm ../dataqa351/a53sinesweep_half.pcm
>-diff ../dataqa350/a54sinesweep_half.pcm ../dataqa351/a54sinesweep_half.pcm
>-diff ../dataqa350/i30_109speechmusic.pcm ../dataqa351/i30_109speechmusic.pcm
>-diff ../dataqa350/i60_109speechmusic.pcm ../dataqa351/i60_109speechmusic.pcm
>-diff ../dataqa350/i30_110speechmusic.pcm ../dataqa351/i30_110speechmusic.pcm
>-diff ../dataqa350/i60_110speechmusic.pcm ../dataqa351/i60_110speechmusic.pcm
>-diff ../dataqa350/i30_111speechmusic.pcm ../dataqa351/i30_111speechmusic.pcm
>-diff ../dataqa350/i60_111speechmusic.pcm ../dataqa351/i60_111speechmusic.pcm
>-diff ../dataqa350/i30_112speechmusic.pcm ../dataqa351/i30_112speechmusic.pcm
>-diff ../dataqa350/i60_112speechmusic.pcm ../dataqa351/i60_112speechmusic.pcm
>-diff ../dataqa350/i30_113speechmusic.pcm ../dataqa351/i30_113speechmusic.pcm
>-diff ../dataqa350/i60_113speechmusic.pcm ../dataqa351/i60_113speechmusic.pcm
>-diff ../dataqa350/i30_114speechmusic.pcm ../dataqa351/i30_114speechmusic.pcm
>-diff ../dataqa350/i60_114speechmusic.pcm ../dataqa351/i60_114speechmusic.pcm
>-diff ../dataqa350/a55speechmusic.pcm ../dataqa351/a55speechmusic.pcm
>-diff ../dataqa350/a56speechmusic.pcm ../dataqa351/a56speechmusic.pcm
>-diff ../dataqa350/a57speechmusic.pcm ../dataqa351/a57speechmusic.pcm
>-diff ../dataqa350/i30_115speechmusic_nb.pcm ../dataqa351/i30_115speechmusic_nb.pcm
>-diff ../dataqa350/i60_115speechmusic_nb.pcm ../dataqa351/i60_115speechmusic_nb.pcm
>-diff ../dataqa350/i30_116speechmusic_nb.pcm ../dataqa351/i30_116speechmusic_nb.pcm
>-diff ../dataqa350/i60_116speechmusic_nb.pcm ../dataqa351/i60_116speechmusic_nb.pcm
>-diff ../dataqa350/i30_117speechmusic_nb.pcm ../dataqa351/i30_117speechmusic_nb.pcm
>-diff ../dataqa350/i60_117speechmusic_nb.pcm ../dataqa351/i60_117speechmusic_nb.pcm
>-diff ../dataqa350/i30_118speechmusic_nb.pcm ../dataqa351/i30_118speechmusic_nb.pcm
>-diff ../dataqa350/i60_118speechmusic_nb.pcm ../dataqa351/i60_118speechmusic_nb.pcm
>-diff ../dataqa350/i30_119speechmusic_nb.pcm ../dataqa351/i30_119speechmusic_nb.pcm
>-diff ../dataqa350/i60_119speechmusic_nb.pcm ../dataqa351/i60_119speechmusic_nb.pcm
>-diff ../dataqa350/i30_120speechmusic_nb.pcm ../dataqa351/i30_120speechmusic_nb.pcm
>-diff ../dataqa350/i60_120speechmusic_nb.pcm ../dataqa351/i60_120speechmusic_nb.pcm
>-diff ../dataqa350/a58speechmusic_nb.pcm ../dataqa351/a58speechmusic_nb.pcm
>-diff ../dataqa350/a59speechmusic_nb.pcm ../dataqa351/a59speechmusic_nb.pcm
>-diff ../dataqa350/a60speechmusic_nb.pcm ../dataqa351/a60speechmusic_nb.pcm
>-diff ../dataqa350/i30_121speechoffice0dB.pcm ../dataqa351/i30_121speechoffice0dB.pcm
>-diff ../dataqa350/i60_121speechoffice0dB.pcm ../dataqa351/i60_121speechoffice0dB.pcm
>-diff ../dataqa350/i30_122speechoffice0dB.pcm ../dataqa351/i30_122speechoffice0dB.pcm
>-diff ../dataqa350/i60_122speechoffice0dB.pcm ../dataqa351/i60_122speechoffice0dB.pcm
>-diff ../dataqa350/i30_123speechoffice0dB.pcm ../dataqa351/i30_123speechoffice0dB.pcm
>-diff ../dataqa350/i60_123speechoffice0dB.pcm ../dataqa351/i60_123speechoffice0dB.pcm
>-diff ../dataqa350/i30_124speechoffice0dB.pcm ../dataqa351/i30_124speechoffice0dB.pcm
>-diff ../dataqa350/i60_124speechoffice0dB.pcm ../dataqa351/i60_124speechoffice0dB.pcm
>-diff ../dataqa350/i30_125speechoffice0dB.pcm ../dataqa351/i30_125speechoffice0dB.pcm
>-diff ../dataqa350/i60_125speechoffice0dB.pcm ../dataqa351/i60_125speechoffice0dB.pcm
>-diff ../dataqa350/i30_126speechoffice0dB.pcm ../dataqa351/i30_126speechoffice0dB.pcm
>-diff ../dataqa350/i60_126speechoffice0dB.pcm ../dataqa351/i60_126speechoffice0dB.pcm
>-diff ../dataqa350/a61speechoffice0dB.pcm ../dataqa351/a61speechoffice0dB.pcm
>-diff ../dataqa350/a62speechoffice0dB.pcm ../dataqa351/a62speechoffice0dB.pcm
>-diff ../dataqa350/a63speechoffice0dB.pcm ../dataqa351/a63speechoffice0dB.pcm
>-diff ../dataqa350/i30_127speech_and_misc_NB.pcm ../dataqa351/i30_127speech_and_misc_NB.pcm
>-diff ../dataqa350/i60_127speech_and_misc_NB.pcm ../dataqa351/i60_127speech_and_misc_NB.pcm
>-diff ../dataqa350/i30_128speech_and_misc_NB.pcm ../dataqa351/i30_128speech_and_misc_NB.pcm
>-diff ../dataqa350/i60_128speech_and_misc_NB.pcm ../dataqa351/i60_128speech_and_misc_NB.pcm
>-diff ../dataqa350/i30_129speech_and_misc_NB.pcm ../dataqa351/i30_129speech_and_misc_NB.pcm
>-diff ../dataqa350/i60_129speech_and_misc_NB.pcm ../dataqa351/i60_129speech_and_misc_NB.pcm
>-diff ../dataqa350/i30_130speech_and_misc_NB.pcm ../dataqa351/i30_130speech_and_misc_NB.pcm
>-diff ../dataqa350/i60_130speech_and_misc_NB.pcm ../dataqa351/i60_130speech_and_misc_NB.pcm
>-diff ../dataqa350/i30_131speech_and_misc_NB.pcm ../dataqa351/i30_131speech_and_misc_NB.pcm
>-diff ../dataqa350/i60_131speech_and_misc_NB.pcm ../dataqa351/i60_131speech_and_misc_NB.pcm
>-diff ../dataqa350/i30_132speech_and_misc_NB.pcm ../dataqa351/i30_132speech_and_misc_NB.pcm
>-diff ../dataqa350/i60_132speech_and_misc_NB.pcm ../dataqa351/i60_132speech_and_misc_NB.pcm
>-diff ../dataqa350/a64speech_and_misc_NB.pcm ../dataqa351/a64speech_and_misc_NB.pcm
>-diff ../dataqa350/a65speech_and_misc_NB.pcm ../dataqa351/a65speech_and_misc_NB.pcm
>-diff ../dataqa350/a66speech_and_misc_NB.pcm ../dataqa351/a66speech_and_misc_NB.pcm
>-diff ../dataqa350/i30_133speech_and_misc_WB.pcm ../dataqa351/i30_133speech_and_misc_WB.pcm
>-diff ../dataqa350/i60_133speech_and_misc_WB.pcm ../dataqa351/i60_133speech_and_misc_WB.pcm
>-diff ../dataqa350/i30_134speech_and_misc_WB.pcm ../dataqa351/i30_134speech_and_misc_WB.pcm
>-diff ../dataqa350/i60_134speech_and_misc_WB.pcm ../dataqa351/i60_134speech_and_misc_WB.pcm
>-diff ../dataqa350/i30_135speech_and_misc_WB.pcm ../dataqa351/i30_135speech_and_misc_WB.pcm
>-diff ../dataqa350/i60_135speech_and_misc_WB.pcm ../dataqa351/i60_135speech_and_misc_WB.pcm
>-diff ../dataqa350/i30_136speech_and_misc_WB.pcm ../dataqa351/i30_136speech_and_misc_WB.pcm
>-diff ../dataqa350/i60_136speech_and_misc_WB.pcm ../dataqa351/i60_136speech_and_misc_WB.pcm
>-diff ../dataqa350/i30_137speech_and_misc_WB.pcm ../dataqa351/i30_137speech_and_misc_WB.pcm
>-diff ../dataqa350/i60_137speech_and_misc_WB.pcm ../dataqa351/i60_137speech_and_misc_WB.pcm
>-diff ../dataqa350/i30_138speech_and_misc_WB.pcm ../dataqa351/i30_138speech_and_misc_WB.pcm
>-diff ../dataqa350/i60_138speech_and_misc_WB.pcm ../dataqa351/i60_138speech_and_misc_WB.pcm
>-diff ../dataqa350/a67speech_and_misc_WB.pcm ../dataqa351/a67speech_and_misc_WB.pcm
>-diff ../dataqa350/a68speech_and_misc_WB.pcm ../dataqa351/a68speech_and_misc_WB.pcm
>-diff ../dataqa350/a69speech_and_misc_WB.pcm ../dataqa351/a69speech_and_misc_WB.pcm
>-diff ../dataqa350/i30_139testM4.pcm ../dataqa351/i30_139testM4.pcm
>-diff ../dataqa350/i60_139testM4.pcm ../dataqa351/i60_139testM4.pcm
>-diff ../dataqa350/i30_140testM4.pcm ../dataqa351/i30_140testM4.pcm
>-diff ../dataqa350/i60_140testM4.pcm ../dataqa351/i60_140testM4.pcm
>-diff ../dataqa350/i30_141testM4.pcm ../dataqa351/i30_141testM4.pcm
>-diff ../dataqa350/i60_141testM4.pcm ../dataqa351/i60_141testM4.pcm
>-diff ../dataqa350/i30_142testM4.pcm ../dataqa351/i30_142testM4.pcm
>-diff ../dataqa350/i60_142testM4.pcm ../dataqa351/i60_142testM4.pcm
>-diff ../dataqa350/i30_143testM4.pcm ../dataqa351/i30_143testM4.pcm
>-diff ../dataqa350/i60_143testM4.pcm ../dataqa351/i60_143testM4.pcm
>-diff ../dataqa350/i30_144testM4.pcm ../dataqa351/i30_144testM4.pcm
>-diff ../dataqa350/i60_144testM4.pcm ../dataqa351/i60_144testM4.pcm
>-diff ../dataqa350/a70testM4.pcm ../dataqa351/a70testM4.pcm
>-diff ../dataqa350/a71testM4.pcm ../dataqa351/a71testM4.pcm
>-diff ../dataqa350/a72testM4.pcm ../dataqa351/a72testM4.pcm
>-diff ../dataqa350/i30_145testM4D_rev.pcm ../dataqa351/i30_145testM4D_rev.pcm
>-diff ../dataqa350/i60_145testM4D_rev.pcm ../dataqa351/i60_145testM4D_rev.pcm
>-diff ../dataqa350/i30_146testM4D_rev.pcm ../dataqa351/i30_146testM4D_rev.pcm
>-diff ../dataqa350/i60_146testM4D_rev.pcm ../dataqa351/i60_146testM4D_rev.pcm
>-diff ../dataqa350/i30_147testM4D_rev.pcm ../dataqa351/i30_147testM4D_rev.pcm
>-diff ../dataqa350/i60_147testM4D_rev.pcm ../dataqa351/i60_147testM4D_rev.pcm
>-diff ../dataqa350/i30_148testM4D_rev.pcm ../dataqa351/i30_148testM4D_rev.pcm
>-diff ../dataqa350/i60_148testM4D_rev.pcm ../dataqa351/i60_148testM4D_rev.pcm
>-diff ../dataqa350/i30_149testM4D_rev.pcm ../dataqa351/i30_149testM4D_rev.pcm
>-diff ../dataqa350/i60_149testM4D_rev.pcm ../dataqa351/i60_149testM4D_rev.pcm
>-diff ../dataqa350/i30_150testM4D_rev.pcm ../dataqa351/i30_150testM4D_rev.pcm
>-diff ../dataqa350/i60_150testM4D_rev.pcm ../dataqa351/i60_150testM4D_rev.pcm
>-diff ../dataqa350/a73testM4D_rev.pcm ../dataqa351/a73testM4D_rev.pcm
>-diff ../dataqa350/a74testM4D_rev.pcm ../dataqa351/a74testM4D_rev.pcm
>-diff ../dataqa350/a75testM4D_rev.pcm ../dataqa351/a75testM4D_rev.pcm
>-diff ../dataqa350/i30_151testM4D.pcm ../dataqa351/i30_151testM4D.pcm
>-diff ../dataqa350/i60_151testM4D.pcm ../dataqa351/i60_151testM4D.pcm
>-diff ../dataqa350/i30_152testM4D.pcm ../dataqa351/i30_152testM4D.pcm
>-diff ../dataqa350/i60_152testM4D.pcm ../dataqa351/i60_152testM4D.pcm
>-diff ../dataqa350/i30_153testM4D.pcm ../dataqa351/i30_153testM4D.pcm
>-diff ../dataqa350/i60_153testM4D.pcm ../dataqa351/i60_153testM4D.pcm
>-diff ../dataqa350/i30_154testM4D.pcm ../dataqa351/i30_154testM4D.pcm
>-diff ../dataqa350/i60_154testM4D.pcm ../dataqa351/i60_154testM4D.pcm
>-diff ../dataqa350/i30_155testM4D.pcm ../dataqa351/i30_155testM4D.pcm
>-diff ../dataqa350/i60_155testM4D.pcm ../dataqa351/i60_155testM4D.pcm
>-diff ../dataqa350/i30_156testM4D.pcm ../dataqa351/i30_156testM4D.pcm
>-diff ../dataqa350/i60_156testM4D.pcm ../dataqa351/i60_156testM4D.pcm
>-diff ../dataqa350/a76testM4D.pcm ../dataqa351/a76testM4D.pcm
>-diff ../dataqa350/a77testM4D.pcm ../dataqa351/a77testM4D.pcm
>-diff ../dataqa350/a78testM4D.pcm ../dataqa351/a78testM4D.pcm
>-diff ../dataqa350/i30_157testfile.pcm ../dataqa351/i30_157testfile.pcm
>-diff ../dataqa350/i60_157testfile.pcm ../dataqa351/i60_157testfile.pcm
>-diff ../dataqa350/i30_158testfile.pcm ../dataqa351/i30_158testfile.pcm
>-diff ../dataqa350/i60_158testfile.pcm ../dataqa351/i60_158testfile.pcm
>-diff ../dataqa350/i30_159testfile.pcm ../dataqa351/i30_159testfile.pcm
>-diff ../dataqa350/i60_159testfile.pcm ../dataqa351/i60_159testfile.pcm
>-diff ../dataqa350/i30_160testfile.pcm ../dataqa351/i30_160testfile.pcm
>-diff ../dataqa350/i60_160testfile.pcm ../dataqa351/i60_160testfile.pcm
>-diff ../dataqa350/i30_161testfile.pcm ../dataqa351/i30_161testfile.pcm
>-diff ../dataqa350/i60_161testfile.pcm ../dataqa351/i60_161testfile.pcm
>-diff ../dataqa350/i30_162testfile.pcm ../dataqa351/i30_162testfile.pcm
>-diff ../dataqa350/i60_162testfile.pcm ../dataqa351/i60_162testfile.pcm
>-diff ../dataqa350/a79testfile.pcm ../dataqa351/a79testfile.pcm
>-diff ../dataqa350/a80testfile.pcm ../dataqa351/a80testfile.pcm
>-diff ../dataqa350/a81testfile.pcm ../dataqa351/a81testfile.pcm
>-diff ../dataqa350/i30_163tone_cisco.pcm ../dataqa351/i30_163tone_cisco.pcm
>-diff ../dataqa350/i60_163tone_cisco.pcm ../dataqa351/i60_163tone_cisco.pcm
>-diff ../dataqa350/i30_164tone_cisco.pcm ../dataqa351/i30_164tone_cisco.pcm
>-diff ../dataqa350/i60_164tone_cisco.pcm ../dataqa351/i60_164tone_cisco.pcm
>-diff ../dataqa350/i30_165tone_cisco.pcm ../dataqa351/i30_165tone_cisco.pcm
>-diff ../dataqa350/i60_165tone_cisco.pcm ../dataqa351/i60_165tone_cisco.pcm
>-diff ../dataqa350/i30_166tone_cisco.pcm ../dataqa351/i30_166tone_cisco.pcm
>-diff ../dataqa350/i60_166tone_cisco.pcm ../dataqa351/i60_166tone_cisco.pcm
>-diff ../dataqa350/i30_167tone_cisco.pcm ../dataqa351/i30_167tone_cisco.pcm
>-diff ../dataqa350/i60_167tone_cisco.pcm ../dataqa351/i60_167tone_cisco.pcm
>-diff ../dataqa350/i30_168tone_cisco.pcm ../dataqa351/i30_168tone_cisco.pcm
>-diff ../dataqa350/i60_168tone_cisco.pcm ../dataqa351/i60_168tone_cisco.pcm
>-diff ../dataqa350/a82tone_cisco.pcm ../dataqa351/a82tone_cisco.pcm
>-diff ../dataqa350/a83tone_cisco.pcm ../dataqa351/a83tone_cisco.pcm
>-diff ../dataqa350/a84tone_cisco.pcm ../dataqa351/a84tone_cisco.pcm
>-diff ../dataqa350/i30_169tone_cisco_long.pcm ../dataqa351/i30_169tone_cisco_long.pcm
>-diff ../dataqa350/i60_169tone_cisco_long.pcm ../dataqa351/i60_169tone_cisco_long.pcm
>-diff ../dataqa350/i30_170tone_cisco_long.pcm ../dataqa351/i30_170tone_cisco_long.pcm
>-diff ../dataqa350/i60_170tone_cisco_long.pcm ../dataqa351/i60_170tone_cisco_long.pcm
>-diff ../dataqa350/i30_171tone_cisco_long.pcm ../dataqa351/i30_171tone_cisco_long.pcm
>-diff ../dataqa350/i60_171tone_cisco_long.pcm ../dataqa351/i60_171tone_cisco_long.pcm
>-diff ../dataqa350/i30_172tone_cisco_long.pcm ../dataqa351/i30_172tone_cisco_long.pcm
>-diff ../dataqa350/i60_172tone_cisco_long.pcm ../dataqa351/i60_172tone_cisco_long.pcm
>-diff ../dataqa350/i30_173tone_cisco_long.pcm ../dataqa351/i30_173tone_cisco_long.pcm
>-diff ../dataqa350/i60_173tone_cisco_long.pcm ../dataqa351/i60_173tone_cisco_long.pcm
>-diff ../dataqa350/i30_174tone_cisco_long.pcm ../dataqa351/i30_174tone_cisco_long.pcm
>-diff ../dataqa350/i60_174tone_cisco_long.pcm ../dataqa351/i60_174tone_cisco_long.pcm
>-diff ../dataqa350/a85tone_cisco_long.pcm ../dataqa351/a85tone_cisco_long.pcm
>-diff ../dataqa350/a86tone_cisco_long.pcm ../dataqa351/a86tone_cisco_long.pcm
>-diff ../dataqa350/a87tone_cisco_long.pcm ../dataqa351/a87tone_cisco_long.pcm
>-diff ../dataqa350/i30_175wb_contspeech.pcm ../dataqa351/i30_175wb_contspeech.pcm
>-diff ../dataqa350/i60_175wb_contspeech.pcm ../dataqa351/i60_175wb_contspeech.pcm
>-diff ../dataqa350/i30_176wb_contspeech.pcm ../dataqa351/i30_176wb_contspeech.pcm
>-diff ../dataqa350/i60_176wb_contspeech.pcm ../dataqa351/i60_176wb_contspeech.pcm
>-diff ../dataqa350/i30_177wb_contspeech.pcm ../dataqa351/i30_177wb_contspeech.pcm
>-diff ../dataqa350/i60_177wb_contspeech.pcm ../dataqa351/i60_177wb_contspeech.pcm
>-diff ../dataqa350/i30_178wb_contspeech.pcm ../dataqa351/i30_178wb_contspeech.pcm
>-diff ../dataqa350/i60_178wb_contspeech.pcm ../dataqa351/i60_178wb_contspeech.pcm
>-diff ../dataqa350/i30_179wb_contspeech.pcm ../dataqa351/i30_179wb_contspeech.pcm
>-diff ../dataqa350/i60_179wb_contspeech.pcm ../dataqa351/i60_179wb_contspeech.pcm
>-diff ../dataqa350/i30_180wb_contspeech.pcm ../dataqa351/i30_180wb_contspeech.pcm
>-diff ../dataqa350/i60_180wb_contspeech.pcm ../dataqa351/i60_180wb_contspeech.pcm
>-diff ../dataqa350/a88wb_contspeech.pcm ../dataqa351/a88wb_contspeech.pcm
>-diff ../dataqa350/a89wb_contspeech.pcm ../dataqa351/a89wb_contspeech.pcm
>-diff ../dataqa350/a90wb_contspeech.pcm ../dataqa351/a90wb_contspeech.pcm
>-diff ../dataqa350/i30_181wb_speech_office25db.pcm ../dataqa351/i30_181wb_speech_office25db.pcm
>-diff ../dataqa350/i60_181wb_speech_office25db.pcm ../dataqa351/i60_181wb_speech_office25db.pcm
>-diff ../dataqa350/i30_182wb_speech_office25db.pcm ../dataqa351/i30_182wb_speech_office25db.pcm
>-diff ../dataqa350/i60_182wb_speech_office25db.pcm ../dataqa351/i60_182wb_speech_office25db.pcm
>-diff ../dataqa350/i30_183wb_speech_office25db.pcm ../dataqa351/i30_183wb_speech_office25db.pcm
>-diff ../dataqa350/i60_183wb_speech_office25db.pcm ../dataqa351/i60_183wb_speech_office25db.pcm
>-diff ../dataqa350/i30_184wb_speech_office25db.pcm ../dataqa351/i30_184wb_speech_office25db.pcm
>-diff ../dataqa350/i60_184wb_speech_office25db.pcm ../dataqa351/i60_184wb_speech_office25db.pcm
>-diff ../dataqa350/i30_185wb_speech_office25db.pcm ../dataqa351/i30_185wb_speech_office25db.pcm
>-diff ../dataqa350/i60_185wb_speech_office25db.pcm ../dataqa351/i60_185wb_speech_office25db.pcm
>-diff ../dataqa350/i30_186wb_speech_office25db.pcm ../dataqa351/i30_186wb_speech_office25db.pcm
>-diff ../dataqa350/i60_186wb_speech_office25db.pcm ../dataqa351/i60_186wb_speech_office25db.pcm
>-diff ../dataqa350/a91wb_speech_office25db.pcm ../dataqa351/a91wb_speech_office25db.pcm
>-diff ../dataqa350/a92wb_speech_office25db.pcm ../dataqa351/a92wb_speech_office25db.pcm
>-diff ../dataqa350/a93wb_speech_office25db.pcm ../dataqa351/a93wb_speech_office25db.pcm
>-diff ../dataqa350/a30_1DTMF_16kHz_short.pcm ../dataqa351/a30_1DTMF_16kHz_short.pcm
>-diff ../dataqa350/a60_1DTMF_16kHz_short.pcm ../dataqa351/a60_1DTMF_16kHz_short.pcm
>-diff ../dataqa350/a30_2ltest_speech_noisy.pcm ../dataqa351/a30_2ltest_speech_noisy.pcm
>-diff ../dataqa350/a60_2ltest_speech_noisy.pcm ../dataqa351/a60_2ltest_speech_noisy.pcm
>-diff ../dataqa350/a30_3misc2.pcm ../dataqa351/a30_3misc2.pcm
>-diff ../dataqa350/a60_3misc2.pcm ../dataqa351/a60_3misc2.pcm
>-diff ../dataqa350/a30_4sinesweep.pcm ../dataqa351/a30_4sinesweep.pcm
>-diff ../dataqa350/a60_4sinesweep.pcm ../dataqa351/a60_4sinesweep.pcm
>-diff ../dataqa350/a30_5speechmusic.pcm ../dataqa351/a30_5speechmusic.pcm
>-diff ../dataqa350/a60_5speechmusic.pcm ../dataqa351/a60_5speechmusic.pcm
>-diff ../dataqa350/a30_6tone_cisco.pcm ../dataqa351/a30_6tone_cisco.pcm
>-diff ../dataqa350/a60_6tone_cisco.pcm ../dataqa351/a60_6tone_cisco.pcm
>-diff ../dataqa350/a60_7tone_cisco.pcm ../dataqa351/a60_7tone_cisco.pcm
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/diffiSACPLC.txt b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/diffiSACPLC.txt
>deleted file mode 100644
>index 9e3629b2ca72a490e31c5b472dd91b3d4d2a8367..0000000000000000000000000000000000000000
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/diffiSACPLC.txt
>+++ /dev/null
>@@ -1,20 +0,0 @@
>-#!/bin/bash
>-(set -o igncr) 2>/dev/null && set -o igncr; # force bash to ignore \r character
>-
>-LOGFILE=logplc.txt
>-echo "START PLC TEST" > $LOGFILE
>-
>-OUTDIR1=../dataqaplc_0
>-OUTDIR2=../dataqaplc_1
>-
>-diff $OUTDIR1/outplc1.pcm $OUTDIR2/outplc1.pcm
>-diff $OUTDIR1/outplc2.pcm $OUTDIR2/outplc2.pcm
>-diff $OUTDIR1/outplc3.pcm $OUTDIR2/outplc3.pcm
>-diff $OUTDIR1/outplc4.pcm $OUTDIR2/outplc4.pcm
>-diff $OUTDIR1/outplc5.pcm $OUTDIR2/outplc5.pcm
>-diff $OUTDIR1/outplc6.pcm $OUTDIR2/outplc6.pcm
>-
>-echo DONE!
>-
>-
>-
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/runiSACLongtest.txt b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/runiSACLongtest.txt
>deleted file mode 100644
>index eeffc0c955edd07ac9f0eb0548cae100b6e03971..0000000000000000000000000000000000000000
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/runiSACLongtest.txt
>+++ /dev/null
>@@ -1,61 +0,0 @@
>-#!/bin/bash
>-(set -o igncr) 2>/dev/null && set -o igncr; # force bash to ignore \r character
>-
>-LOGFILE=logNormal.txt
>-echo "START ISAC TEST" > $LOGFILE
>-echo >> $LOGFILE
>-
>-ISAC=../Release/kenny.exe
>-ISACFIXFLOAT=../Release/testFixFloat.exe
>-
>-INFILES=$(cat InputFiles.txt)
>-SUBSET=$(cat InputFilesFew.txt)
>-CHANNELFILES=$(cat ChannelFiles.txt)
>-CHANNELLIST=($(cat ChannelFiles.txt))
>-INDIR=../data/orig
>-OUTDIR=../dataqa
>-mkdir -p $OUTDIR
>-
>-TARGETRATE=(10000 15000 20000 25000 30000 32000)
>-#echo ${CHANNELFILES[1]}
>-
>-index1=0
>-index2=0
>-
>-for file in $INFILES # loop over all input files
>- do
>-
>- for rate in ${TARGETRATE[*]}
>- do
>- let "index1=index1+1"
>- $ISAC -I $rate -FL 30 $INDIR/"$file" $OUTDIR/i30_$index1"$file" >> $LOGFILE
>- $ISAC -I $rate -FL 60 $INDIR/"$file" $OUTDIR/i60_$index1"$file" >> $LOGFILE
>- done
>- for channel in $CHANNELFILES
>- do
>- let "index2=index2+1"
>- $ISAC $INDIR/$channel $INDIR/"$file" $OUTDIR/a$index2"$file" >> $LOGFILE
>- done
>-
>-done
>-
>-index1=0
>-
>-for file in $SUBSET # loop over the subset of input files
>- do
>- let "index1=index1+1"
>- $ISAC $INDIR/${CHANNELLIST[0]} -FL 30 -FIXED_FL $INDIR/"$file" $OUTDIR/a30_$index1"$file" >> $LOGFILE
>- $ISAC $INDIR/${CHANNELLIST[0]} -FL 60 -FIXED_FL $INDIR/"$file" $OUTDIR/a60_$index1"$file" >> $LOGFILE
>-done
>-
>-let "index1=index1+1"
>- $ISAC $INDIR/${CHANNELLIST[0]} -INITRATE 25000 -FL 30 $INDIR/"$file" $OUTDIR/a60_$index1"$file" >> $LOGFILE
>-
>-# Run fault test
>-
>-#./runiSACfault.txt
>-
>-echo DONE!
>-
>-
>-
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/runiSACNB.txt b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/runiSACNB.txt
>deleted file mode 100644
>index 605595cc04fca61924b9ba7cc7caefb96d02a038..0000000000000000000000000000000000000000
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/runiSACNB.txt
>+++ /dev/null
>@@ -1,45 +0,0 @@
>-#!/bin/bash
>-(set -o igncr) 2>/dev/null && set -o igncr; # force bash to ignore \r character
>-
>-LOGFILE=logNB.txt
>-echo "START NARROWBAND TEST" > $LOGFILE
>-echo >> $LOGFILE
>-
>-ISAC=../Release/kenny.exe
>-ISACFIXFLOAT=../Release/testFixFloat.exe
>-
>-INFILES=$(cat InputFiles.txt)
>-SUBSET=$(cat InputFilesFew.txt)
>-CHANNELFILES=$(cat ChannelFiles.txt)
>-CHANNELLIST=($(cat ChannelFiles.txt))
>-INDIR=../data/orig
>-OUTDIR=../dataqaNB
>-mkdir -p $OUTDIR
>-
>-TARGETRATE=(10000 15000 20000 25000 30000 32000)
>-#echo ${CHANNELFILES[1]}
>-
>-index1=0
>-index2=0
>-
>-# Narrowband Interfaces
>-
>-for file in $SUBSET # loop over all input files
>- do
>- for rate in ${TARGETRATE[*]}
>- do
>- let "index1=index1+1"
>- $ISAC $rate -FL 30 -NB 1 $INDIR/"$file" $OUTDIR/nb130_$index1"$file" >> $LOGFILE
>- $ISAC $rate -FL 60 -NB 1 $INDIR/"$file" $OUTDIR/nb160_$index1"$file" >> $LOGFILE
>- $ISAC $rate -FL 30 -NB 2 $INDIR/"$file" $OUTDIR/nb230_$index1"$file" >> $LOGFILE
>- $ISAC $rate -FL 60 -NB 2 $INDIR/"$file" $OUTDIR/nb260_$index1"$file" >> $LOGFILE
>- $ISAC $rate -FL 30 -NB 2 -PL 10 $INDIR/"$file" $OUTDIR/nb2plc30_$index1"$file" >> $LOGFILE
>- $ISAC $rate -FL 60 -NB 2 -PL 10 $INDIR/"$file" $OUTDIR/nb2plc60_$index1"$file" >> $LOGFILE
>- done
>-
>-done
>-
>-echo DONE!
>- >- >- >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/runiSACPLC.txt b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/runiSACPLC.txt >deleted file mode 100644 >index 6bee6f7c3f3238610728e274921f62878a458e96..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/runiSACPLC.txt >+++ /dev/null >@@ -1,23 +0,0 @@ >-#!/bin/bash >-(set -o igncr) 2>/dev/null && set -o igncr; # force bash to ignore \r character >- >-LOGFILE=logplc.txt >-echo "START PLC TEST" > $LOGFILE >- >-ISAC=../Release/kenny.exe >- >-INDIR=../data/orig >-OUTDIR=../dataqaplc_0 >-mkdir -p $OUTDIR >- >-$ISAC 12000 -PL 15 $INDIR/speechmusic.pcm $OUTDIR/outplc1.pcm >-$ISAC 20000 -PL 15 $INDIR/speechmusic.pcm $OUTDIR/outplc2.pcm >-$ISAC 32000 -PL 15 $INDIR/speechmusic.pcm $OUTDIR/outplc3.pcm >-$ISAC 12000 -PL 15 $INDIR/tone_cisco.pcm $OUTDIR/outplc4.pcm >-$ISAC 20000 -PL 15 $INDIR/tone_cisco.pcm $OUTDIR/outplc5.pcm >-$ISAC 32000 -PL 15 $INDIR/tone_cisco.pcm $OUTDIR/outplc6.pcm >- >-echo DONE! 
>- >- >- >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/runiSACRate.txt b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/runiSACRate.txt >deleted file mode 100644 >index d8403e099d650fb75eb368274bbe139f9d4982f2..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/runiSACRate.txt >+++ /dev/null >@@ -1,23 +0,0 @@ >-#!/bin/bash >-(set -o igncr) 2>/dev/null && set -o igncr; # force bash to ignore \r character >- >-LOGG=loggRate.txt >-OUTDIR=../dataqaRate >-mkdir -p $OUTDIR >- >-../Release/kenny.exe 13000 -FIXED_FL -FL 30 -MAX 100 ../data/orig/sawsweep_380_60.pcm $OUTDIR/out_napi_1.pcm > $LOGG >-../Release/kenny.exe ../data/orig/bottlenecks.txt -FIXED_FL -FL 30 -MAXRATE 32000 ../data/orig/sawsweep_380_60.pcm $OUTDIR/out_napi_2.pcm >> $LOGG >-../Release/kenny.exe 13000 -FIXED_FL -FL 30 -MAX 100 ../data/orig/sawsweep_380_60.pcm $OUTDIR/out_napi_3.pcm >> $LOGG >-../Release/kenny.exe ../data/orig/bottlenecks.txt -FIXED_FL -FL 30 -MAXRATE 32000 ../data/orig/sawsweep_380_60.pcm $OUTDIR/out_napi_4.pcm >> $LOGG >-../Release/kenny.exe 13000 -FIXED_FL -FL 60 -MAX 100 ../data/orig/sawsweep_380_60.pcm $OUTDIR/out_napi_5.pcm >> $LOGG >-../Release/kenny.exe ../data/orig/bottlenecks.txt -FIXED_FL -FL 60 -MAXRATE 32000 ../data/orig/sawsweep_380_60.pcm $OUTDIR/out_napi_6.pcm >> $LOGG >-../Release/kenny.exe 13000 -INIT_RATE 32000 -FIXED_FL -FL 60 -MAX 100 ../data/orig/sawsweep_380_60.pcm $OUTDIR/out_napi_7.pcm >> $LOGG >- >-../Release/kenny.exe 13000 -FIXED_FL -FL 30 -MAX 100 ../data/orig/longspeech.pcm $OUTDIR/out_napi_11.pcm >> $LOGG >-../Release/kenny.exe ../data/orig/bottlenecks.txt -FIXED_FL -FL 30 -MAXRATE 32000 ../data/orig/longspeech.pcm $OUTDIR/out_napi_12.pcm >> $LOGG >-../Release/kenny.exe 13000 -FIXED_FL -FL 30 -MAX 100 ../data/orig/longspeech.pcm $OUTDIR/out_napi_13.pcm >> $LOGG >-../Release/kenny.exe 
../data/orig/bottlenecks.txt -FIXED_FL -FL 30 -MAXRATE 32000 ../data/orig/longspeech.pcm $OUTDIR/out_napi_14.pcm >> $LOGG >-../Release/kenny.exe 13000 -FIXED_FL -FL 60 -MAX 100 ../data/orig/longspeech.pcm $OUTDIR/out_napi_15.pcm >> $LOGG >-../Release/kenny.exe ../data/orig/bottlenecks.txt -FIXED_FL -FL 60 -MAXRATE 32000 ../data/orig/longspeech.pcm $OUTDIR/out_napi_16.pcm >> $LOGG >-../Release/kenny.exe 13000 -INIT_RATE 32000 -FIXED_FL -FL 60 -MAX 100 ../data/orig/longspeech.pcm $OUTDIR/out_napi_17.pcm >> $LOGG >- >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/runiSACfault.txt b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/runiSACfault.txt >deleted file mode 100644 >index f4d9478fd49bff5103cd42ac459ea1c8ba4e8738..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/runiSACfault.txt >+++ /dev/null >@@ -1,40 +0,0 @@ >-#!/bin/bash >-(set -o igncr) 2>/dev/null && set -o igncr; # force bash to ignore \r character >- >-LOGFILE=logfault.txt >-echo "START FAULT TEST" > $LOGFILE >- >-ISAC=../Release/kenny.exe >-ISACFIXFLOAT=../Release/testFixFloat.exe >- >-INFILES=$(cat InputFiles.txt) >-SUBSET=$(cat InputFilesFew.txt) >-CHANNELFILES=$(cat ChannelFiles.txt) >-CHANNELLIST=($(cat ChannelFiles.txt)) >-INDIR=../data/orig >-OUTDIR=../dataqaft >-mkdir -p $OUTDIR >- >-TARGETRATE=(10000 15000 20000 25000 30000 32000) >-FAULTTEST=(1 2 3 4 5 6 7 9) >- >-index1=0 >- >-file=wb_contspeech.pcm >- >-# Fault test >-for testnr in ${FAULTTEST[*]} >- do >- $ISAC 32000 -F $testnr $INDIR/"$file" $OUTDIR/ft$testnr"$file" >> $LOGFILE >-done >- >-# Fault test number 10, error in bitstream >- $ISAC 32000 -F 10 $INDIR/"$file" $OUTDIR/ft10_"$file" >> $LOGFILE >- $ISAC 32000 -F 10 -PL 10 $INDIR/"$file" $OUTDIR/ft10plc_"$file" >> $LOGFILE >- $ISAC 32000 -F 10 -NB 1 $INDIR/"$file" $OUTDIR/ft10nb1_"$file" >> $LOGFILE >- $ISAC 
32000 -F 10 -NB 2 -PL 10 $INDIR/"$file" $OUTDIR/ft10nb2_"$file" >> $LOGFILE >- >-echo DONE! >- >- >- >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/runiSACfixfloat.txt b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/runiSACfixfloat.txt >deleted file mode 100644 >index c9e02df2e9f33660255e62e4f6a62c0aee115615..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/fix/test/QA/runiSACfixfloat.txt >+++ /dev/null >@@ -1,47 +0,0 @@ >-#!/bin/bash >-(set -o igncr) 2>/dev/null && set -o igncr; # force bash to ignore \r character >- >-LOGFILE=logfxfl.txt >-echo "START FIX-FLOAT TEST" > $LOGFILE >- >- >-ISACFIXFLOAT=../testFixFloat.exe >- >-INFILES=$(cat InputFiles.txt) >-SUBSET=$(cat InputFilesFew.txt) >-CHANNELFILES=$(cat ChannelFiles.txt) >-CHANNELLIST=($(cat ChannelFiles.txt)) >-INDIR=../data/orig >-OUTDIR=../dataqafxfl >-mkdir -p $OUTDIR >- >-index1=0 >- >-for file in $INFILES # loop over all input files >- do >- >- for channel in $CHANNELFILES >- do >- let "index1=index1+1" >- >- $ISACFIXFLOAT $INDIR/$channel -m 1 -PLC $INDIR/"$file" $OUTDIR/flfx$index1"$file" >> $LOGFILE >- $ISACFIXFLOAT $INDIR/$channel -m 2 -PLC $INDIR/"$file" $OUTDIR/fxfl$index1"$file" >> $LOGFILE >- done >- >-done >- >-index1=0 >- >-for file in $SUBSET # loop over the subset of input files >- do >- let "index1=index1+1" >- $ISACFIXFLOAT $INDIR/$channel -m 1 -NB 1 $INDIR/"$file" $OUTDIR/flfxnb1_$index1"$file" >> $LOGFILE >- $ISACFIXFLOAT $INDIR/$channel -m 2 -NB 1 $INDIR/"$file" $OUTDIR/fxflnb1_$index1"$file" >> $LOGFILE >- $ISACFIXFLOAT $INDIR/$channel -m 1 -NB 2 -PLC $INDIR/"$file" $OUTDIR/flfxnb2_$index1"$file" >> $LOGFILE >- $ISACFIXFLOAT $INDIR/$channel -m 2 -NB 2 -PLC $INDIR/"$file" $OUTDIR/fxflnb2_$index1"$file" >> $LOGFILE >-done >- >-echo DONE! 
>- >- >- >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/main/source/entropy_coding.c b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/main/source/entropy_coding.c >index 28767afc633c09eec75ad08e2cbf71698b411640..6692a519ca97c302d7af5a0deb3bd5586928ece6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/main/source/entropy_coding.c >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/main/source/entropy_coding.c >@@ -96,7 +96,7 @@ static void FindInvArSpec(const int16_t* ARCoefQ12, > const int32_t gainQ10, > int32_t* CurveQ16) { > int32_t CorrQ11[AR_ORDER + 1]; >- int32_t sum, tmpGain; >+ int64_t sum, tmpGain; > int32_t diffQ16[FRAMESAMPLES / 8]; > const int16_t* CS_ptrQ9; > int k, n; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/main/test/QA/runiSACLongtest.txt b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/main/test/QA/runiSACLongtest.txt >deleted file mode 100644 >index 3f05224a0ecc256ae116a7e8e9744e7a0050da6a..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/main/test/QA/runiSACLongtest.txt >+++ /dev/null >@@ -1,433 +0,0 @@ >-#!/bin/bash >-(set -o igncr) 2>/dev/null && set -o igncr; # force bash to ignore \r character >- >- >- >-if [ "$1" = "x64" ] || [ "$2" = "x64" ] || [ "$#" -eq 0 ] >- then >- PLATFORM=_X64 >- ISAC=../x64/Release/ReleaseTest-API_2005.exe >-elif [ "$1" = "LINUX" ] || [ "$2" = "LINUX" ] >- then >- PLATFORM=_linux >- ISAC=../ReleaseTest-API/isacswtest >-else >- PLATFORM=_2005 >- ISAC=../win32/Release/ReleaseTest-API_2005.exe >-fi >- >-if [ "$#" -eq 0 ] || [ "$1" = "all" ] || [ "$1" = "wb" ] >- then >- LOGFILE=logNormal"$PLATFORM".txt >- echo "START ISAC WB TEST" > $LOGFILE >- echo >> $LOGFILE >- >- INFILES=$(cat InputFiles.txt) >- SUBSET=$(cat InputFilesFew.txt) >- 
CHANNELFILES=$(cat ChannelFiles.txt) >- CHANNELLIST=($(cat ChannelFiles.txt)) >- INDIR=../data/orig >- OUTDIR=../dataqa"$PLATFORM" >- mkdir -p $OUTDIR >- rm -f $OUTDIR/* >- >- idx=0 >- RATE=10000 >- FRAMESIZE=30 >- >- >- for file in $INFILES # loop over all input files >- do >- >- echo "Input file: " $file >- echo "-----------------------------------" >- echo "Instantaneous with RATE " $RATE ", and Frame-size " $FRAMESIZE >- $ISAC -I -B $RATE -FL $FRAMESIZE -FS 16 $INDIR/"$file" $OUTDIR/i_"$FRAMESIZE"_"$RATE"_"$file" >> $LOGFILE >- echo >- >- name="${CHANNELLIST[$idx]}" >- echo "Adaptive with channel file: " $name >- >- $ISAC -B $INDIR/${CHANNELLIST[$idx]} -FS 16 $INDIR/"$file" $OUTDIR/a_${name%.*}_"$file" >> $LOGFILE >- >- echo >- echo >- >-# alternate between 30 & 60 ms. >- if [ $FRAMESIZE -eq 30 ] >- then >- FRAMESIZE=60 >- else >- FRAMESIZE=30 >- fi >- >-# rate between 10000 to 32000 bits/sec >- if [ $RATE -le 30000 ] >- then >- let "RATE=RATE+2000" >- else >- let "RATE=10000" >- fi >- >-# there are only three channel file >- if [ $idx -ge 2 ]; then >- idx=0 >- else >- let "idx=idx+1" >- fi >- >- done >- >- idx=0 >- >-# loop over the subset of input files >- for file in $SUBSET >- do >- >- if [ $idx -eq 0 ]; then >- $ISAC -B $INDIR/${CHANNELLIST[0]} -FL 30 -FIXED_FL -FS 16 $INDIR/"$file" $OUTDIR/a30_"$file" >> $LOGFILE >- idx=1 >- else >- $ISAC -B $INDIR/${CHANNELLIST[0]} -FL 60 -FIXED_FL -FS 16 $INDIR/"$file" $OUTDIR/a60_"$file" >> $LOGFILE >- idx=0 >- fi >- done >- >- $ISAC -B $INDIR/${CHANNELLIST[0]} -INITRATE 25000 -FL 30 -FS 16 $INDIR/"$file" $OUTDIR/a60_Init25kbps_"$file" >> $LOGFILE >- >- echo >- echo WIDEBAND DONE! 
>- echo >- echo >-fi >- >-if [ "$#" -eq 0 ] || [ "$1" = "all" ] || [ "$1" = "swb" ] >- then >- >- LOGFILE=logNormal_SWB"$PLATFORM".txt >- echo "START ISAC SWB TEST" > $LOGFILE >- echo >> $LOGFILE >- >- echo STARTING TO TEST SUPER-WIDEBAND >- >- INFILES=$(cat InputFilesSWB.txt) >- INDIR=../data/origswb >- OUTDIR=../dataqaswb"$PLATFORM" >- mkdir -p $OUTDIR >- rm -f $OUTDIR/* >- >- for file in $INFILES >- do >- echo >- echo "Input file: " $file >- echo "--------------------------------" >- for RATE in 12000 20000 32000 38000 45000 50000 56000 >- do >- >- echo "Rate " $RATE >- $ISAC -I -B $RATE -FL 30 -FS 32 $INDIR/"$file" $OUTDIR/swb_"$RATE"_"$file" >> $LOGFILE >- echo >- >- done >- >- done >-fi >- >-if [ "$#" -eq 0 ] || [ "$1" = "all" ] || [ "$1" = "API" ] >- then >- >- LOGFILE_API=logNormal_API"$PLATFORM".txt >- echo >- echo >- echo "START ISAC API TEST" > $LOGFILE_API >- echo >> $LOGFILE_API >- idx=1 >- echo " Test Enforcement of frame-size" >- echo "========================================================================================" >- mkdir -p ../FrameSizeLim"$PLATFORM" >- rm -f ../FrameSizeLim"$PLATFORM"/* >- echo >- echo "-- No enforcement; BN 10000" >- echo >- $ISAC -B 10000 -FS 16 ../data/orig/speech_and_misc_WB.pcm \ >- ../FrameSizeLim"$PLATFORM"/APITest_"$idx".pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo >- echo "-- Now Enforce 30 ms frame size with the same bottleneck" >- echo "There should not be any 60 ms frame" >- echo >- $ISAC -B 10000 -FL 30 -FIXED_FL -FS 16 ../data/orig/speech_and_misc_WB.pcm \ >- ../FrameSizeLim"$PLATFORM"/APITest_"$idx".pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo >- echo "-- No enforcement; BN 32000" >- echo >- $ISAC -B 32000 -FS 16 ../data/orig/speech_and_misc_WB.pcm \ >- ../FrameSizeLim"$PLATFORM"/APITest_"$idx".pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo >- echo "-- Now Enforce 60 ms frame size with the same bottleneck" >- echo "There should not be any 30 ms frame" >- echo >- $ISAC -B 32000 
-FL 60 -FIXED_FL -FS 16 ../data/orig/speech_and_misc_WB.pcm \ >- ../FrameSizeLim"$PLATFORM"/APITest_"$idx".pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo >- echo >- echo >- echo >- >- echo " Test Limiting of Payload Size and Rate" >- echo "========================================================================================" >- mkdir -p ../PayloadLim"$PLATFORM" >- rm -f ../PayloadLim"$PLATFORM"/* >- echo >- echo >- echo "-- No Limit, frame-size 60 ms, WIDEBAND" >- echo >- $ISAC -I -B 32000 -FL 60 -FS 16 ../data/orig/speech_and_misc_WB.pcm \ >- ../PayloadLim"$PLATFORM"/APITest_"$idx".pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo >- echo "-- Payload-size limit of 250, frame-size 60 ms, WIDEBAND" >- echo >- $ISAC -I -B 32000 -FL 60 -FS 16 -MAX 250 ../data/orig/speech_and_misc_WB.pcm \ >- ../PayloadLim"$PLATFORM"/APITest_"$idx".pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo >- echo "-- Rate limit of 33 kbps for 60 ms frame-size" >- echo >- $ISAC -I -B 32000 -FL 60 -FS 16 -MAXRATE 33000 ../data/orig/speech_and_misc_WB.pcm \ >- ../PayloadLim"$PLATFORM"/APITest_"$idx".pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo "________________________________________________________" >- echo >- echo >- >- echo "-- No Limit, frame-size 30 ms, WIDEBAND" >- echo >- $ISAC -I -B 32000 -FL 30 -FS 16 ../data/orig/speech_and_misc_WB.pcm \ >- ../PayloadLim"$PLATFORM"/APITest_"$idx".pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo >- echo "-- Payload-size limit of 130, frame-size 30 ms, WIDEBAND" >- echo >- $ISAC -I -B 32000 -FL 30 -FS 16 -MAX 130 ../data/orig/speech_and_misc_WB.pcm \ >- ../PayloadLim"$PLATFORM"/APITest_"$idx".pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo >- echo "-- Rate limit of 33 kbps for 30 ms frame-size, wideband" >- echo >- $ISAC -I -B 32000 -FL 30 -FS 16 -MAXRATE 33000 ../data/orig/speech_and_misc_WB.pcm \ >- ../PayloadLim"$PLATFORM"/APITest_"$idx".pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo 
"________________________________________________________" >- echo >- echo >- >- echo "-- No limit for 32 kbps, 30 ms, SUPER-WIDEBAND" >- echo >- $ISAC -I -B 32000 -FL 30 -FS 32 ../data/origswb/jstest_32.pcm \ >- ../PayloadLim"$PLATFORM"/APITest_"$idx".pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo >- echo "-- Payload limit of 130 bytes for 32 kbps, 30 ms, SUPER-WIDEBAND" >- echo >- $ISAC -I -B 32000 -FL 30 -FS 32 -MAX 130 ../data/origswb/jstest_32.pcm \ >- ../PayloadLim"$PLATFORM"/APITest_"$idx".pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo "________________________________________________________" >- echo >- echo >- >- echo "-- No limit, Rate 45 kbps, 30 ms, SUPER-WIDEBAND, 12 kHz" >- echo >- $ISAC -I -B 45000 -FL 30 -FS 32 ../data/origswb/jstest_32.pcm \ >- ../PayloadLim"$PLATFORM"/APITest_"$idx".pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo >- echo "-- Rate limit of 46 kbps for 42 kbps, 30 ms, SUPER-WIDEBAND, 12 kHz" >- echo >- $ISAC -I -B 45000 -FL 30 -FS 32 -MAXRATE 46000 ../data/origswb/jstest_32.pcm \ >- ../PayloadLim"$PLATFORM"/APITest_"$idx".pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo >- echo "-- Payload limit of 170 bytes for 45 kbps, 30 ms, SUPER-WIDEBAND, 12 kHz" >- echo >- $ISAC -I -B 45000 -FL 30 -FS 32 -MAX 170 ../data/origswb/jstest_32.pcm \ >- ../PayloadLim"$PLATFORM"/APITest_"$idx".pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo "________________________________________________________" >- echo >- echo >- >- echo "-- No limit for 56 kbps, 30 ms, SUPER-WIDEBAND, 16 kHz" >- echo >- $ISAC -I -B 56000 -FL 30 -FS 32 ../data/origswb/jstest_32.pcm \ >- ../PayloadLim"$PLATFORM"/APITest_"$idx".pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo >- echo "-- Payload limit of 200 bytes for 56 kbps 30 ms, SUPER-WIDEBAND, 16 kHz" >- echo >- $ISAC -I -B 56000 -FL 30 -FS 32 -MAX 200 ../data/origswb/jstest_32.pcm \ >- ../PayloadLim"$PLATFORM"/APITest_"$idx".pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo >- 
echo "-- Rate limit of 57 kbps for 56 kbps 30 ms, SUPER-WIDEBAND, 16 kHz" >- echo >- $ISAC -I -B 56000 -FL 30 -FS 32 -MAXRATE 57000 ../data/origswb/jstest_32.pcm \ >- ../PayloadLim"$PLATFORM"/APITest_"$idx".pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo "________________________________________________________" >- echo >- echo >- echo >- echo >- echo >- >- echo " Test Trans-Coding" >- echo "========================================================================================" >- mkdir -p ../Transcoding"$PLATFORM" >- rm -f ../Transcoding"$PLATFORM"/* >- echo >- echo >- echo "-- 20 kbps, 30 ms, WIDEBAND" >- echo >- $ISAC -I -B 20000 -FL 30 -FS 16 ../data/orig/speech_and_misc_WB.pcm \ >- ../Transcoding"$PLATFORM"/APITest_refTrans20WB.pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo >- >- echo "-- 32 kbps trans-coding to 20 kbps, 30 ms, WIDEBAND" >- echo >- $ISAC -I -B 32000 -FL 30 -FS 16 -T 20000 ../Transcoding"$PLATFORM"/APITest_32T20.pcm \ >- ../data/orig/speech_and_misc_WB.pcm ../Transcoding"$PLATFORM"/APITest_"$idx".pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo "________________________________________________________" >- >- echo >- echo >- echo "-- 38 kbps, 30 ms, SUPER-WIDEBAND" >- echo >- $ISAC -I -B 38000 -FL 30 -FS 32 ../data/origswb/jstest_32.pcm \ >- ../Transcoding"$PLATFORM"/APITest_refTrans38.pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo >- >- echo "-- 45 kbps trans-coding to 38 kbps, 30 ms, SUPER-WIDEBAND" >- echo >- $ISAC -I -B 45000 -FL 30 -FS 32 -T 38000 ../Transcoding"$PLATFORM"/APITest_45T38.pcm \ >- ../data/origswb/jstest_32.pcm ../Transcoding"$PLATFORM"/APITest_"$idx".pcm >> $LOGFILE_API >- let "idx=idx+1" >- >- echo >- echo >- echo "-- 20 kbps, 30 ms, SUPER-WIDEBAND" >- echo >- $ISAC -I -B 20000 -FL 30 -FS 32 ../data/origswb/jstest_32.pcm \ >- ../Transcoding"$PLATFORM"/APITest_refTrans20SWB.pcm >> $LOGFILE_API >- let "idx=idx+1" >- >- echo >- echo >- >- echo "-- 45 kbps trans-coding to 20 kbps, 30 ms, 
SUPER-WIDEBAND" >- echo >- $ISAC -I -B 45000 -FL 30 -FS 32 -T 20000 ../Transcoding"$PLATFORM"/APITest_45T20.pcm \ >- ../data/origswb/jstest_32.pcm ../Transcoding"$PLATFORM"/APITest_"$idx".pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo "________________________________________________________" >- echo >- echo >- echo "-- 50 kbps, 30 ms, SUPER-WIDEBAND" >- echo >- $ISAC -I -B 50000 -FL 30 -FS 32 ../data/origswb/jstest_32.pcm \ >- ../Transcoding"$PLATFORM"/APITest_refTrans50.pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo >- >- echo "-- 56 kbps trans-coding to 50 kbps, 30 ms, SUPER-WIDEBAND" >- echo >- $ISAC -I -B 56000 -FL 30 -FS 32 -T 50000 ../Transcoding"$PLATFORM"/APITest_56T50.pcm \ >- ../data/origswb/jstest_32.pcm ../Transcoding"$PLATFORM"/APITest_"$idx".pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo >- >- echo "-- 56 kbps trans-coding to 20 kbps, 30 ms, SUPER-WIDEBAND" >- echo >- $ISAC -I -B 56000 -FL 30 -FS 32 -T 20000 ../Transcoding"$PLATFORM"/APITest_56T20.pcm \ >- ../data/origswb/jstest_32.pcm ../Transcoding"$PLATFORM"/APITest_"$idx".pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo "________________________________________________________" >- echo >- echo >- echo >- echo >- echo >- >- echo " Test FEC" >- echo "========================================================================================" >- mkdir -p ../FEC"$PLATFORM" >- rm -f ../FEC"$PLATFORM"/* >- echo >- echo >- echo "-- 32 kbps with transcoding to 20kbps, 30 ms, WIDEBAND, 10% packet loss" >- $ISAC -I -B 32000 -FL 30 -FS 16 -PL 10 -T 20000 ../FEC"$PLATFORM"/APITest_PL10_WB30_T20.pcm \ >- ../data/orig/speech_and_misc_WB.pcm ../FEC"$PLATFORM"/APITest_PL10_WB30.pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo >- >- echo "-- 32 kbps, 60 ms, WIDEBAND, 10% packet loss" >- $ISAC -I -B 32000 -FL 60 -FS 16 -PL 10 ../data/orig/speech_and_misc_WB.pcm \ >- ../FEC"$PLATFORM"/APITest_PL10_WB60.pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo >- >- echo "-- 32 kbps 
with transcoding to 20 kbps, 30 ms, SUPER-WIDEBAND, 10% packet loss" >- $ISAC -I -B 32000 -FL 30 -FS 32 -PL 10 -T 20000 ../FEC"$PLATFORM"/APITest_PL10_SWB_8kHz_T20.pcm \ >- ../data/origswb/jstest_32.pcm ../FEC"$PLATFORM"/APITest_PL10_SWB_8kHz.pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo >- >- echo "-- 45 kbps with Trascoding to 38 kbps, 30 ms, SUPER-WIDEBAND, 10% packet loss" >- $ISAC -I -B 45000 -FL 30 -FS 32 -PL 10 -T 38000 ../FEC"$PLATFORM"/APITest_PL10_SWB_12kHz_T38.pcm \ >- ../data/origswb/jstest_32.pcm ../FEC"$PLATFORM"/APITest_PL10_SWB_12kHz.pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo >- >- echo "-- 56 kbps with transcoding to 50 kbps, 30 ms, SUPER-WIDEBAND, 10% packet loss" >- $ISAC -I -B 56000 -FL 30 -FS 32 -PL 10 -T 50000 ../FEC"$PLATFORM"/APITest_PL10_SWB_16kHz_T50.pcm \ >- ../data/origswb/jstest_32.pcm ../FEC"$PLATFORM"/APITest_PL10_SWB_16kHz.pcm >> $LOGFILE_API >- let "idx=idx+1" >- echo >- echo >-fi >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/main/test/QA/runiSACfault.txt b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/main/test/QA/runiSACfault.txt >deleted file mode 100644 >index 63829a4b9834ec32f24f8bd9f783eabb5bdd29ba..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/main/test/QA/runiSACfault.txt >+++ /dev/null >@@ -1,80 +0,0 @@ >-#!/bin/bash >-(set -o igncr) 2>/dev/null && set -o igncr; # force bash to ignore \r character
>-if [ "$1" = "x64" ] || [ "$#" -eq 0 ] >- then >- PLATFORM=_X64 >- ISAC=../x64/Release/ReleaseTest-API_2005.exe >-elif [ "$1" = "2005" ] >- then >- PLATFORM=_2005 >- ISAC=../win32/Release/ReleaseTest-API_2005.exe >-elif [ "$1" == "LINUX" ] >- then >- PLATFORM=_linux >- ISAC=../ReleaseTest-API/isacswtest >-else >- echo Unknown Platform >- exit 2 >-fi >- >-LOGFILE=logfault$PLATFORM.txt >-echo "START FAULT TEST" > $LOGFILE >- >- >-INFILES=$(cat InputFiles.txt) >-SUBSET=$(cat InputFilesFew.txt) >-CHANNELFILES=$(cat ChannelFiles.txt) >-CHANNELLIST=($(cat ChannelFiles.txt)) >-INDIR=../data/orig >-INDIRSWB=../data/origswb >-OUTDIR=../dataqaft$PLATFORM >-mkdir -p $OUTDIR >- >-#maximum Target rate for different bandwidth >-TARGETRATE=( 32000 32000 44000 56000 ) >-SAMPFREQ=( 16 32 32 32 ) >-FAULTTEST=(1 2 3 4 5 6 7 9) >- >-index1=0 >- >-file_wb=../data/orig/16kHz.pcm >-file_swb=../data/origswb/32kHz.pcm >- >-for idx in 0 1 2 3 >- do >-# Fault test >- echo >- echo "Sampling Frequency " ${SAMPFREQ[idx]} "kHz, Rate " ${TARGETRATE[idx]} "bps." >- echo "---------------------------------------------------" >- if [ ${SAMPFREQ[idx]} -eq 16 ]; then >- file=$file_wb >- else >- file=$file_swb >- fi >- >- for testnr in ${FAULTTEST[*]} >- do >- echo "Running Fault Test " $testnr >- $ISAC -I -B "${TARGETRATE[idx]}" -F $testnr -FS "${SAMPFREQ[idx]}" "$file" \ >- $OUTDIR/ft"$testnr"_"${TARGETRATE[idx]}"_"${SAMPFREQ[idx]}".pcm >> LOGFILE >- echo >- >- done >- >-# Fault test number 10, error in bitstream >- echo "Running Fault Test 10" >- $ISAC -I -B "${TARGETRATE[idx]}" -F 10 -FS "${SAMPFREQ[idx]}" "$file" \ >- $OUTDIR/ft10_"${TARGETRATE[idx]}"_"${SAMPFREQ[idx]}".pcm >> LOGFILE >- echo >- echo "Running Fault Test 10 with packetloss" >- $ISAC -I -B "${TARGETRATE[idx]}" -F 10 -PL 10 -FS "${SAMPFREQ[idx]}" "$file" \ >- $OUTDIR/ft10plc_"${TARGETRATE[idx]}"_"${SAMPFREQ[idx]}".pcm >> LOGFILE >- echo >-done >- >-echo >-echo >-echo DONE! 
>- >- >- >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/main/test/QA/runiSACfixfloat.txt b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/main/test/QA/runiSACfixfloat.txt >deleted file mode 100644 >index 4cda78e4fea6b53a79ff7409935e7641284b9278..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/isac/main/test/QA/runiSACfixfloat.txt >+++ /dev/null >@@ -1,47 +0,0 @@ >-#!/bin/bash >-(set -o igncr) 2>/dev/null && set -o igncr; # force bash to ignore \r character >- >-LOGFILE=logfxfl.txt >-echo "START FIX-FLOAT TEST" > $LOGFILE >- >- >-ISACFIXFLOAT=../../../fix/test/testFixFloat.exe >- >-INFILES=$(cat InputFiles.txt) >-SUBSET=$(cat InputFilesFew.txt) >-CHANNELFILES=$(cat ChannelFiles.txt) >-CHANNELLIST=($(cat ChannelFiles.txt)) >-INDIR=../data/orig >-OUTDIR=../dataqafxfl >-mkdir -p $OUTDIR >- >-index1=0 >- >-for file in $INFILES # loop over all input files >- do >- >- for channel in $CHANNELFILES >- do >- let "index1=index1+1" >- >- $ISACFIXFLOAT $INDIR/$channel -m 1 -PLC $INDIR/"$file" $OUTDIR/flfx$index1"$file" >> $LOGFILE >- $ISACFIXFLOAT $INDIR/$channel -m 2 -PLC $INDIR/"$file" $OUTDIR/fxfl$index1"$file" >> $LOGFILE >- done >- >-done >- >-index1=0 >- >-for file in $SUBSET # loop over the subset of input files >- do >- let "index1=index1+1" >- $ISACFIXFLOAT $INDIR/$channel -m 1 -NB 1 $INDIR/"$file" $OUTDIR/flfxnb1_$index1"$file" >> $LOGFILE >- $ISACFIXFLOAT $INDIR/$channel -m 2 -NB 1 $INDIR/"$file" $OUTDIR/fxflnb1_$index1"$file" >> $LOGFILE >- $ISACFIXFLOAT $INDIR/$channel -m 1 -NB 2 -PLC $INDIR/"$file" $OUTDIR/flfxnb2_$index1"$file" >> $LOGFILE >- $ISACFIXFLOAT $INDIR/$channel -m 2 -NB 2 -PLC $INDIR/"$file" $OUTDIR/fxflnb2_$index1"$file" >> $LOGFILE >-done >- >-echo DONE! 
>- >- >- >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/legacy_encoded_audio_frame.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/legacy_encoded_audio_frame.cc >index 0bf3b19a3b17cb346bccd887762a277b2ec1e405..d9efc211da3dac6800b21b5d979db91bd6de6831 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/legacy_encoded_audio_frame.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/legacy_encoded_audio_frame.cc >@@ -14,6 +14,8 @@ > #include <memory> > #include <utility> > >+#include "rtc_base/checks.h" >+ > namespace webrtc { > > LegacyEncodedAudioFrame::LegacyEncodedAudioFrame(AudioDecoder* decoder, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/legacy_encoded_audio_frame.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/legacy_encoded_audio_frame.h >index 05d4fe489c91de0a3c06bddb6cffff4fb205c85c..41b08f7898e2e8fbf964ccb31fe21349135ec010 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/legacy_encoded_audio_frame.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/legacy_encoded_audio_frame.h >@@ -11,10 +11,14 @@ > #ifndef MODULES_AUDIO_CODING_CODECS_LEGACY_ENCODED_AUDIO_FRAME_H_ > #define MODULES_AUDIO_CODING_CODECS_LEGACY_ENCODED_AUDIO_FRAME_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <vector> > >+#include "absl/types/optional.h" > #include "api/array_view.h" > #include "api/audio_codecs/audio_decoder.h" >+#include "rtc_base/buffer.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/opus/audio_decoder_opus.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/opus/audio_decoder_opus.cc >index 302b71403196aade86bb33fb1d4d8f49c778d26e..357cb1a20da799dd0288dd1ffe8446c75958ba61 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/opus/audio_decoder_opus.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/opus/audio_decoder_opus.cc >@@ -10,8 +10,11 @@ > > #include "modules/audio_coding/codecs/opus/audio_decoder_opus.h" > >+#include <memory> > #include <utility> > >+#include "absl/types/optional.h" >+#include "api/array_view.h" > #include "rtc_base/checks.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/opus/audio_decoder_opus.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/opus/audio_decoder_opus.h >index 70aa40bdc46b01e88b9ff0aa0f5fcec27f7e00ec..8043425243cd9ed564b704ce332331093946eb40 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/opus/audio_decoder_opus.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/opus/audio_decoder_opus.h >@@ -11,8 +11,13 @@ > #ifndef MODULES_AUDIO_CODING_CODECS_OPUS_AUDIO_DECODER_OPUS_H_ > #define MODULES_AUDIO_CODING_CODECS_OPUS_AUDIO_DECODER_OPUS_H_ > >+#include <stddef.h> >+#include <stdint.h> >+#include <vector> >+ > #include "api/audio_codecs/audio_decoder.h" > #include "modules/audio_coding/codecs/opus/opus_interface.h" >+#include "rtc_base/buffer.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/opus/audio_encoder_opus.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/opus/audio_encoder_opus.cc >index e6240e69826a140f86818cc737203c0816c855de..1a88acf3c03526fb61e2bf460954254e41266c5b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/opus/audio_encoder_opus.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/opus/audio_encoder_opus.cc >@@ -15,7 +15,8 @@ > #include <utility> > > #include "absl/memory/memory.h" >-#include 
"common_types.h" // NOLINT(build/include) >+#include "absl/strings/match.h" >+#include "common_types.h" > #include "modules/audio_coding/audio_network_adaptor/audio_network_adaptor_impl.h" > #include "modules/audio_coding/audio_network_adaptor/controller_manager.h" > #include "modules/audio_coding/codecs/opus/opus_interface.h" >@@ -214,8 +215,80 @@ int GetBitrateBps(const AudioEncoderOpusConfig& config) { > return *config.bitrate_bps; > } > >+bool IsValidPacketLossRate(int value) { >+ return value >= 0 && value <= 100; >+} >+ >+float ToFraction(int percent) { >+ return static_cast<float>(percent) / 100; >+} >+ >+float GetMinPacketLossRate() { >+ constexpr char kPacketLossFieldTrial[] = "WebRTC-Audio-OpusMinPacketLossRate"; >+ const bool use_opus_min_packet_loss_rate = >+ webrtc::field_trial::IsEnabled(kPacketLossFieldTrial); >+ if (use_opus_min_packet_loss_rate) { >+ const std::string field_trial_string = >+ webrtc::field_trial::FindFullName(kPacketLossFieldTrial); >+ constexpr int kDefaultMinPacketLossRate = 1; >+ int value = kDefaultMinPacketLossRate; >+ if (sscanf(field_trial_string.c_str(), "Enabled-%d", &value) == 1 && >+ !IsValidPacketLossRate(value)) { >+ RTC_LOG(LS_WARNING) << "Invalid parameter for " << kPacketLossFieldTrial >+ << ", using default value: " >+ << kDefaultMinPacketLossRate; >+ value = kDefaultMinPacketLossRate; >+ } >+ return ToFraction(value); >+ } >+ return 0.0; >+} >+ >+std::unique_ptr<AudioEncoderOpusImpl::NewPacketLossRateOptimizer> >+GetNewPacketLossRateOptimizer() { >+ constexpr char kPacketLossOptimizationName[] = >+ "WebRTC-Audio-NewOpusPacketLossRateOptimization"; >+ const bool use_new_packet_loss_optimization = >+ webrtc::field_trial::IsEnabled(kPacketLossOptimizationName); >+ if (use_new_packet_loss_optimization) { >+ const std::string field_trial_string = >+ webrtc::field_trial::FindFullName(kPacketLossOptimizationName); >+ int min_rate; >+ int max_rate; >+ float slope; >+ if (sscanf(field_trial_string.c_str(), 
"Enabled-%d-%d-%f", &min_rate, >+ &max_rate, &slope) == 3 && >+ IsValidPacketLossRate(min_rate) && IsValidPacketLossRate(max_rate)) { >+ return absl::make_unique< >+ AudioEncoderOpusImpl::NewPacketLossRateOptimizer>( >+ ToFraction(min_rate), ToFraction(max_rate), slope); >+ } >+ RTC_LOG(LS_WARNING) << "Invalid parameters for " >+ << kPacketLossOptimizationName >+ << ", using default values."; >+ return absl::make_unique< >+ AudioEncoderOpusImpl::NewPacketLossRateOptimizer>(); >+ } >+ return nullptr; >+} >+ > } // namespace > >+AudioEncoderOpusImpl::NewPacketLossRateOptimizer::NewPacketLossRateOptimizer( >+ float min_packet_loss_rate, >+ float max_packet_loss_rate, >+ float slope) >+ : min_packet_loss_rate_(min_packet_loss_rate), >+ max_packet_loss_rate_(max_packet_loss_rate), >+ slope_(slope) {} >+ >+float AudioEncoderOpusImpl::NewPacketLossRateOptimizer::OptimizePacketLossRate( >+ float packet_loss_rate) const { >+ packet_loss_rate = slope_ * packet_loss_rate; >+ return std::min(std::max(packet_loss_rate, min_packet_loss_rate_), >+ max_packet_loss_rate_); >+} >+ > void AudioEncoderOpusImpl::AppendSupportedEncoders( > std::vector<AudioCodecSpec>* specs) { > const SdpAudioFormat fmt = { >@@ -244,7 +317,7 @@ std::unique_ptr<AudioEncoder> AudioEncoderOpusImpl::MakeAudioEncoder( > > absl::optional<AudioCodecInfo> AudioEncoderOpusImpl::QueryAudioEncoder( > const SdpAudioFormat& format) { >- if (STR_CASE_CMP(format.name.c_str(), GetPayloadName()) == 0 && >+ if (absl::EqualsIgnoreCase(format.name, GetPayloadName()) && > format.clockrate_hz == 48000 && format.num_channels == 2) { > const size_t num_channels = GetChannelCount(format); > const int bitrate = >@@ -276,7 +349,7 @@ AudioEncoderOpusConfig AudioEncoderOpusImpl::CreateConfig( > > absl::optional<AudioEncoderOpusConfig> AudioEncoderOpusImpl::SdpToConfig( > const SdpAudioFormat& format) { >- if (STR_CASE_CMP(format.name.c_str(), "opus") != 0 || >+ if (!absl::EqualsIgnoreCase(format.name, "opus") || > 
format.clockrate_hz != 48000 || format.num_channels != 2) { > return absl::nullopt; > } >@@ -398,10 +471,14 @@ AudioEncoderOpusImpl::AudioEncoderOpusImpl( > : payload_type_(payload_type), > send_side_bwe_with_overhead_( > webrtc::field_trial::IsEnabled("WebRTC-SendSideBwe-WithOverhead")), >+ use_link_capacity_for_adaptation_(webrtc::field_trial::IsEnabled( >+ "WebRTC-Audio-LinkCapacityAdaptation")), > adjust_bandwidth_( > webrtc::field_trial::IsEnabled("WebRTC-AdjustOpusBandwidth")), > bitrate_changed_(true), > packet_loss_rate_(0.0), >+ min_packet_loss_rate_(GetMinPacketLossRate()), >+ new_packet_loss_optimizer_(GetNewPacketLossRateOptimizer()), > inst_(nullptr), > packet_loss_fraction_smoother_(new PacketLossFractionSmoother()), > audio_network_adaptor_creator_(audio_network_adaptor_creator), >@@ -414,6 +491,7 @@ AudioEncoderOpusImpl::AudioEncoderOpusImpl( > RTC_CHECK(config.payload_type == -1 || config.payload_type == payload_type); > > RTC_CHECK(RecreateEncoderInstance(config)); >+ SetProjectedPacketLossRate(packet_loss_rate_); > } > > AudioEncoderOpusImpl::AudioEncoderOpusImpl(const CodecInst& codec_inst) >@@ -529,7 +607,8 @@ void AudioEncoderOpusImpl::OnReceivedUplinkRecoverablePacketLossFraction( > > void AudioEncoderOpusImpl::OnReceivedUplinkBandwidth( > int target_audio_bitrate_bps, >- absl::optional<int64_t> bwe_period_ms) { >+ absl::optional<int64_t> bwe_period_ms, >+ absl::optional<int64_t> link_capacity_allocation_bps) { > if (audio_network_adaptor_) { > audio_network_adaptor_->SetTargetAudioBitrate(target_audio_bitrate_bps); > // We give smoothed bitrate allocation to audio network adaptor as >@@ -547,6 +626,9 @@ void AudioEncoderOpusImpl::OnReceivedUplinkBandwidth( > bitrate_smoother_->SetTimeConstantMs(*bwe_period_ms * 4); > bitrate_smoother_->AddSample(target_audio_bitrate_bps); > >+ if (link_capacity_allocation_bps) >+ link_capacity_allocation_bps_ = link_capacity_allocation_bps; >+ > ApplyAudioNetworkAdaptor(); > } else if 
(send_side_bwe_with_overhead_) { > if (!overhead_bytes_per_packet_) { >@@ -565,6 +647,18 @@ void AudioEncoderOpusImpl::OnReceivedUplinkBandwidth( > SetTargetBitrate(target_audio_bitrate_bps); > } > } >+void AudioEncoderOpusImpl::OnReceivedUplinkBandwidth( >+ int target_audio_bitrate_bps, >+ absl::optional<int64_t> bwe_period_ms) { >+ OnReceivedUplinkBandwidth(target_audio_bitrate_bps, bwe_period_ms, >+ absl::nullopt); >+} >+ >+void AudioEncoderOpusImpl::OnReceivedUplinkAllocation( >+ BitrateAllocationUpdate update) { >+ OnReceivedUplinkBandwidth(update.target_bitrate.bps(), update.bwe_period.ms(), >+ update.link_capacity.bps()); >+} > > void AudioEncoderOpusImpl::OnReceivedRtt(int rtt_ms) { > if (!audio_network_adaptor_) >@@ -738,9 +832,14 @@ void AudioEncoderOpusImpl::SetNumChannelsToEncode( > } > > void AudioEncoderOpusImpl::SetProjectedPacketLossRate(float fraction) { >- float opt_loss_rate = OptimizePacketLossRate(fraction, packet_loss_rate_); >- if (packet_loss_rate_ != opt_loss_rate) { >- packet_loss_rate_ = opt_loss_rate; >+ if (new_packet_loss_optimizer_) { >+ fraction = new_packet_loss_optimizer_->OptimizePacketLossRate(fraction); >+ } else { >+ fraction = OptimizePacketLossRate(fraction, packet_loss_rate_); >+ fraction = std::max(fraction, min_packet_loss_rate_); >+ } >+ if (packet_loss_rate_ != fraction) { >+ packet_loss_rate_ = fraction; > RTC_CHECK_EQ( > 0, WebRtcOpus_SetPacketLossRate( > inst_, static_cast<int32_t>(packet_loss_rate_ * 100 + .5))); >@@ -794,14 +893,20 @@ AudioEncoderOpusImpl::DefaultAudioNetworkAdaptorCreator( > > void AudioEncoderOpusImpl::MaybeUpdateUplinkBandwidth() { > if (audio_network_adaptor_) { >- int64_t now_ms = rtc::TimeMillis(); >- if (!bitrate_smoother_last_update_time_ || >- now_ms - *bitrate_smoother_last_update_time_ >= >- config_.uplink_bandwidth_update_interval_ms) { >- absl::optional<float> smoothed_bitrate = bitrate_smoother_->GetAverage(); >- if (smoothed_bitrate) >- 
audio_network_adaptor_->SetUplinkBandwidth(*smoothed_bitrate); >- bitrate_smoother_last_update_time_ = now_ms; >+ if (use_link_capacity_for_adaptation_ && link_capacity_allocation_bps_) { >+ audio_network_adaptor_->SetUplinkBandwidth( >+ *link_capacity_allocation_bps_); >+ } else { >+ int64_t now_ms = rtc::TimeMillis(); >+ if (!bitrate_smoother_last_update_time_ || >+ now_ms - *bitrate_smoother_last_update_time_ >= >+ config_.uplink_bandwidth_update_interval_ms) { >+ absl::optional<float> smoothed_bitrate = >+ bitrate_smoother_->GetAverage(); >+ if (smoothed_bitrate) >+ audio_network_adaptor_->SetUplinkBandwidth(*smoothed_bitrate); >+ bitrate_smoother_last_update_time_ = now_ms; >+ } > } > } > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/opus/audio_encoder_opus.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/opus/audio_encoder_opus.h >index ea4b2650bbbe73646a39b1d78de5d62aa5e29ec4..150423fd3ee7aa7ec4a73cf4eebb8889e17816cb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/opus/audio_encoder_opus.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/opus/audio_encoder_opus.h >@@ -34,6 +34,26 @@ struct CodecInst; > > class AudioEncoderOpusImpl final : public AudioEncoder { > public: >+ class NewPacketLossRateOptimizer { >+ public: >+ NewPacketLossRateOptimizer(float min_packet_loss_rate = 0.01, >+ float max_packet_loss_rate = 0.2, >+ float slope = 1.0); >+ >+ float OptimizePacketLossRate(float packet_loss_rate) const; >+ >+ // Getters for testing. 
>+ float min_packet_loss_rate() const { return min_packet_loss_rate_; }; >+ float max_packet_loss_rate() const { return max_packet_loss_rate_; }; >+ float slope() const { return slope_; }; >+ >+ private: >+ const float min_packet_loss_rate_; >+ const float max_packet_loss_rate_; >+ const float slope_; >+ RTC_DISALLOW_COPY_AND_ASSIGN(NewPacketLossRateOptimizer); >+ }; >+ > static AudioEncoderOpusConfig CreateConfig(const CodecInst& codec_inst); > > // Returns empty if the current bitrate falls within the hysteresis window, >@@ -99,6 +119,7 @@ class AudioEncoderOpusImpl final : public AudioEncoder { > void OnReceivedUplinkBandwidth( > int target_audio_bitrate_bps, > absl::optional<int64_t> bwe_period_ms) override; >+ void OnReceivedUplinkAllocation(BitrateAllocationUpdate update) override; > void OnReceivedRtt(int rtt_ms) override; > void OnReceivedOverhead(size_t overhead_bytes_per_packet) override; > void SetReceiverFrameLengthRange(int min_frame_length_ms, >@@ -110,6 +131,9 @@ class AudioEncoderOpusImpl final : public AudioEncoder { > > // Getters for testing. > float packet_loss_rate() const { return packet_loss_rate_; } >+ NewPacketLossRateOptimizer* new_packet_loss_optimizer() const { >+ return new_packet_loss_optimizer_.get(); >+ } > AudioEncoderOpusConfig::ApplicationMode application() const { > return config_.application; > } >@@ -141,6 +165,11 @@ class AudioEncoderOpusImpl final : public AudioEncoder { > void SetNumChannelsToEncode(size_t num_channels_to_encode); > void SetProjectedPacketLossRate(float fraction); > >+ void OnReceivedUplinkBandwidth( >+ int target_audio_bitrate_bps, >+ absl::optional<int64_t> bwe_period_ms, >+ absl::optional<int64_t> link_capacity_allocation); >+ > // TODO(minyue): remove "override" when we can deprecate > // |AudioEncoder::SetTargetBitrate|. 
> void SetTargetBitrate(int target_bps) override; >@@ -155,9 +184,12 @@ class AudioEncoderOpusImpl final : public AudioEncoder { > AudioEncoderOpusConfig config_; > const int payload_type_; > const bool send_side_bwe_with_overhead_; >+ const bool use_link_capacity_for_adaptation_; > const bool adjust_bandwidth_; > bool bitrate_changed_; > float packet_loss_rate_; >+ const float min_packet_loss_rate_; >+ const std::unique_ptr<NewPacketLossRateOptimizer> new_packet_loss_optimizer_; > std::vector<int16_t> input_buffer_; > OpusEncInst* inst_; > uint32_t first_timestamp_in_buffer_; >@@ -170,6 +202,7 @@ class AudioEncoderOpusImpl final : public AudioEncoder { > absl::optional<size_t> overhead_bytes_per_packet_; > const std::unique_ptr<SmoothingFilter> bitrate_smoother_; > absl::optional<int64_t> bitrate_smoother_last_update_time_; >+ absl::optional<int64_t> link_capacity_allocation_bps_; > int consecutive_dtx_frames_; > > friend struct AudioEncoderOpus; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/opus/audio_encoder_opus_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/opus/audio_encoder_opus_unittest.cc >index 7a6d5fda4bbc16141ab84a935cd7334ed79b77de..d5c2c84f1ff713a1bcb61701b1558626a6b216ce 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/opus/audio_encoder_opus_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/opus/audio_encoder_opus_unittest.cc >@@ -272,6 +272,51 @@ TEST(AudioEncoderOpusTest, PacketLossRateOptimized) { > // clang-format on > } > >+TEST(AudioEncoderOpusTest, PacketLossRateLowerBounded) { >+ test::ScopedFieldTrials override_field_trials( >+ "WebRTC-Audio-OpusMinPacketLossRate/Enabled-5/"); >+ auto states = CreateCodec(1); >+ auto I = [](float a, float b) { return IntervalSteps(a, b, 10); }; >+ constexpr float eps = 1e-8f; >+ >+ // clang-format off >+ TestSetPacketLossRate(states.get(), I(0.00f , 0.01f - 
eps), 0.05f); >+ TestSetPacketLossRate(states.get(), I(0.01f + eps, 0.06f - eps), 0.05f); >+ TestSetPacketLossRate(states.get(), I(0.06f + eps, 0.11f - eps), 0.05f); >+ TestSetPacketLossRate(states.get(), I(0.11f + eps, 0.22f - eps), 0.10f); >+ TestSetPacketLossRate(states.get(), I(0.22f + eps, 1.00f ), 0.20f); >+ >+ TestSetPacketLossRate(states.get(), I(1.00f , 0.18f + eps), 0.20f); >+ TestSetPacketLossRate(states.get(), I(0.18f - eps, 0.09f + eps), 0.10f); >+ TestSetPacketLossRate(states.get(), I(0.09f - eps, 0.04f + eps), 0.05f); >+ TestSetPacketLossRate(states.get(), I(0.04f - eps, 0.01f + eps), 0.05f); >+ TestSetPacketLossRate(states.get(), I(0.01f - eps, 0.00f ), 0.05f); >+ // clang-format on >+} >+ >+TEST(AudioEncoderOpusTest, NewPacketLossRateOptimization) { >+ { >+ test::ScopedFieldTrials override_field_trials( >+ "WebRTC-Audio-NewOpusPacketLossRateOptimization/Enabled-5-15-0.5/"); >+ auto states = CreateCodec(1); >+ >+ TestSetPacketLossRate(states.get(), {0.00f}, 0.05f); >+ TestSetPacketLossRate(states.get(), {0.12f}, 0.06f); >+ TestSetPacketLossRate(states.get(), {0.22f}, 0.11f); >+ TestSetPacketLossRate(states.get(), {0.50f}, 0.15f); >+ } >+ { >+ test::ScopedFieldTrials override_field_trials( >+ "WebRTC-Audio-NewOpusPacketLossRateOptimization/Enabled/"); >+ auto states = CreateCodec(1); >+ >+ TestSetPacketLossRate(states.get(), {0.00f}, 0.01f); >+ TestSetPacketLossRate(states.get(), {0.12f}, 0.12f); >+ TestSetPacketLossRate(states.get(), {0.22f}, 0.20f); >+ TestSetPacketLossRate(states.get(), {0.50f}, 0.20f); >+ } >+} >+ > TEST(AudioEncoderOpusTest, SetReceiverFrameLengthRange) { > auto states = CreateCodec(2); > // Before calling to |SetReceiverFrameLengthRange|, >@@ -446,6 +491,57 @@ TEST(AudioEncoderOpusTest, BitrateBounded) { > EXPECT_EQ(kMaxBitrateBps, states->encoder->GetTargetBitrate()); > } > >+TEST(AudioEncoderOpusTest, MinPacketLossRate) { >+ constexpr float kDefaultMinPacketLossRate = 0.01; >+ { >+ test::ScopedFieldTrials 
override_field_trials( >+ "WebRTC-Audio-OpusMinPacketLossRate/Enabled/"); >+ auto states = CreateCodec(1); >+ EXPECT_EQ(kDefaultMinPacketLossRate, states->encoder->packet_loss_rate()); >+ } >+ { >+ test::ScopedFieldTrials override_field_trials( >+ "WebRTC-Audio-OpusMinPacketLossRate/Enabled-200/"); >+ auto states = CreateCodec(1); >+ EXPECT_EQ(kDefaultMinPacketLossRate, states->encoder->packet_loss_rate()); >+ } >+ { >+ test::ScopedFieldTrials override_field_trials( >+ "WebRTC-Audio-OpusMinPacketLossRate/Enabled-50/"); >+ constexpr float kMinPacketLossRate = 0.5; >+ auto states = CreateCodec(1); >+ EXPECT_EQ(kMinPacketLossRate, states->encoder->packet_loss_rate()); >+ } >+} >+ >+TEST(AudioEncoderOpusTest, NewPacketLossRateOptimizer) { >+ { >+ auto states = CreateCodec(1); >+ auto optimizer = states->encoder->new_packet_loss_optimizer(); >+ EXPECT_EQ(nullptr, optimizer); >+ } >+ { >+ test::ScopedFieldTrials override_field_trials( >+ "WebRTC-Audio-NewOpusPacketLossRateOptimization/Enabled/"); >+ auto states = CreateCodec(1); >+ auto optimizer = states->encoder->new_packet_loss_optimizer(); >+ ASSERT_NE(nullptr, optimizer); >+ EXPECT_FLOAT_EQ(0.01, optimizer->min_packet_loss_rate()); >+ EXPECT_FLOAT_EQ(0.20, optimizer->max_packet_loss_rate()); >+ EXPECT_FLOAT_EQ(1.00, optimizer->slope()); >+ } >+ { >+ test::ScopedFieldTrials override_field_trials( >+ "WebRTC-Audio-NewOpusPacketLossRateOptimization/Enabled-2-50-0.7/"); >+ auto states = CreateCodec(1); >+ auto optimizer = states->encoder->new_packet_loss_optimizer(); >+ ASSERT_NE(nullptr, optimizer); >+ EXPECT_FLOAT_EQ(0.02, optimizer->min_packet_loss_rate()); >+ EXPECT_FLOAT_EQ(0.50, optimizer->max_packet_loss_rate()); >+ EXPECT_FLOAT_EQ(0.70, optimizer->slope()); >+ } >+} >+ > // Verifies that the complexity adaptation in the config works as intended. 
> TEST(AudioEncoderOpusTest, ConfigComplexityAdaptation) { > AudioEncoderOpusConfig config; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/pcm16b/audio_decoder_pcm16b.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/pcm16b/audio_decoder_pcm16b.cc >index b07624dffc5c4e32c8363ee0a6f87910504a0560..1dd2ff289ee10280963c3db5de58545886c60d49 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/pcm16b/audio_decoder_pcm16b.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/pcm16b/audio_decoder_pcm16b.cc >@@ -10,6 +10,8 @@ > > #include "modules/audio_coding/codecs/pcm16b/audio_decoder_pcm16b.h" > >+#include <utility> >+ > #include "modules/audio_coding/codecs/legacy_encoded_audio_frame.h" > #include "modules/audio_coding/codecs/pcm16b/pcm16b.h" > #include "rtc_base/checks.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/pcm16b/audio_decoder_pcm16b.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/pcm16b/audio_decoder_pcm16b.h >index 7d234224e3cd00daa8e81b8ad3374c1793185939..9b478d806f7ac605d1a94933168cb0d251fa6141 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/pcm16b/audio_decoder_pcm16b.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/pcm16b/audio_decoder_pcm16b.h >@@ -11,7 +11,12 @@ > #ifndef MODULES_AUDIO_CODING_CODECS_PCM16B_AUDIO_DECODER_PCM16B_H_ > #define MODULES_AUDIO_CODING_CODECS_PCM16B_AUDIO_DECODER_PCM16B_H_ > >+#include <stddef.h> >+#include <stdint.h> >+#include <vector> >+ > #include "api/audio_codecs/audio_decoder.h" >+#include "rtc_base/buffer.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/pcm16b/audio_encoder_pcm16b.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/pcm16b/audio_encoder_pcm16b.cc >index 831daedc63f9d1f62bd33cff133ee8d0f8ea3326..106ab16794025183ada40181b6d4654095471b04 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/pcm16b/audio_encoder_pcm16b.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/pcm16b/audio_encoder_pcm16b.cc >@@ -10,12 +10,9 @@ > > #include "modules/audio_coding/codecs/pcm16b/audio_encoder_pcm16b.h" > >-#include <algorithm> >- >-#include "common_types.h" // NOLINT(build/include) >+#include "common_types.h" > #include "modules/audio_coding/codecs/pcm16b/pcm16b.h" > #include "rtc_base/checks.h" >-#include "rtc_base/numerics/safe_conversions.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/pcm16b/pcm16b_common.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/pcm16b/pcm16b_common.cc >index 6d0fc2d015cc58868b6f8b01dedaf8fd118966c5..8f8bba527be45f90b9708c8a1c0b9c3e0ebba662 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/pcm16b/pcm16b_common.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/pcm16b/pcm16b_common.cc >@@ -10,6 +10,9 @@ > > #include "modules/audio_coding/codecs/pcm16b/pcm16b_common.h" > >+#include <stdint.h> >+#include <initializer_list> >+ > namespace webrtc { > > void Pcm16BAppendSupportedCodecSpecs(std::vector<AudioCodecSpec>* specs) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/pcm16b/pcm16b_common.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/pcm16b/pcm16b_common.h >index 980a9964088642dac8922de816236bb450ea6071..3fae717ff31d9c4c1ae3019097a555ad9c4678a0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/pcm16b/pcm16b_common.h >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/pcm16b/pcm16b_common.h >@@ -13,7 +13,7 @@ > > #include <vector> > >-#include "api/audio_codecs/audio_decoder_factory.h" >+#include "api/audio_codecs/audio_format.h" > > namespace webrtc { > void Pcm16BAppendSupportedCodecSpecs(std::vector<AudioCodecSpec>* specs); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/red/audio_encoder_copy_red.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/red/audio_encoder_copy_red.cc >index 2601f26e44fddacdda18f4ab0edb7344170db2c6..124e81198179cd3cc05824b674d9772c5dc227a5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/red/audio_encoder_copy_red.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/red/audio_encoder_copy_red.cc >@@ -11,8 +11,8 @@ > #include "modules/audio_coding/codecs/red/audio_encoder_copy_red.h" > > #include <string.h> >- > #include <utility> >+#include <vector> > > #include "rtc_base/checks.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/red/audio_encoder_copy_red.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/red/audio_encoder_copy_red.h >index 492ee3afb471763b87d2777d605830696db2b279..5a68876b898c143d56a1213f8bf460fa30728e59 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/red/audio_encoder_copy_red.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/red/audio_encoder_copy_red.h >@@ -11,9 +11,12 @@ > #ifndef MODULES_AUDIO_CODING_CODECS_RED_AUDIO_ENCODER_COPY_RED_H_ > #define MODULES_AUDIO_CODING_CODECS_RED_AUDIO_ENCODER_COPY_RED_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <memory> >-#include <vector> > >+#include "absl/types/optional.h" >+#include "api/array_view.h" > #include "api/audio_codecs/audio_encoder.h" > #include "rtc_base/buffer.h" > #include 
"rtc_base/constructormagic.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/include/audio_coding_module.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/include/audio_coding_module.h >index b9f22288fd9fce32b5f1e5f979be47f1cdc500a3..f9fdba5f511b33d0ab902aa9b2ec0dbd081a58b6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/include/audio_coding_module.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/include/audio_coding_module.h >@@ -154,40 +154,6 @@ class AudioCodingModule { > // Sender > // > >- /////////////////////////////////////////////////////////////////////////// >- // int32_t RegisterSendCodec() >- // Registers a codec, specified by |send_codec|, as sending codec. >- // This API can be called multiple of times to register Codec. The last codec >- // registered overwrites the previous ones. >- // The API can also be used to change payload type for CNG and RED, which are >- // registered by default to default payload types. >- // Note that registering CNG and RED won't overwrite speech codecs. >- // This API can be called to set/change the send payload-type, frame-size >- // or encoding rate (if applicable for the codec). >- // >- // Note: If a stereo codec is registered as send codec, VAD/DTX will >- // automatically be turned off, since it is not supported for stereo sending. >- // >- // Note: If a secondary encoder is already registered, and the new send-codec >- // has a sampling rate that does not match the secondary encoder, the >- // secondary encoder will be unregistered. >- // >- // Input: >- // -send_codec : Parameters of the codec to be registered, c.f. >- // common_types.h for the definition of >- // CodecInst. >- // >- // Return value: >- // -1 if failed to initialize, >- // 0 if succeeded. >- // >- virtual int32_t RegisterSendCodec(const CodecInst& send_codec) = 0; >- >- // Registers |external_speech_encoder| as encoder. 
The new encoder will >- // replace any previously registered speech encoder (internal or external). >- virtual void RegisterExternalSendCodec( >- AudioEncoder* external_speech_encoder) = 0; >- > // |modifier| is called exactly once with one argument: a pointer to the > // unique_ptr that holds the current encoder (which is null if there is no > // current encoder). For the duration of the call, |modifier| has exclusive >@@ -257,71 +223,6 @@ class AudioCodingModule { > // > virtual int32_t Add10MsData(const AudioFrame& audio_frame) = 0; > >- /////////////////////////////////////////////////////////////////////////// >- // (RED) Redundant Coding >- // >- >- /////////////////////////////////////////////////////////////////////////// >- // int32_t SetREDStatus() >- // configure RED status i.e. on/off. >- // >- // RFC 2198 describes a solution which has a single payload type which >- // signifies a packet with redundancy. That packet then becomes a container, >- // encapsulating multiple payloads into a single RTP packet. >- // Such a scheme is flexible, since any amount of redundancy may be >- // encapsulated within a single packet. There is, however, a small overhead >- // since each encapsulated payload must be preceded by a header indicating >- // the type of data enclosed. >- // >- // Input: >- // -enable_red : if true RED is enabled, otherwise RED is >- // disabled. >- // >- // Return value: >- // -1 if failed to set RED status, >- // 0 if succeeded. >- // >- virtual int32_t SetREDStatus(bool enable_red) = 0; >- >- /////////////////////////////////////////////////////////////////////////// >- // bool REDStatus() >- // Get RED status >- // >- // Return value: >- // true if RED is enabled, >- // false if RED is disabled. 
>- // >- virtual bool REDStatus() const = 0; >- >- /////////////////////////////////////////////////////////////////////////// >- // (FEC) Forward Error Correction (codec internal) >- // >- >- /////////////////////////////////////////////////////////////////////////// >- // int32_t SetCodecFEC() >- // Configures codec internal FEC status i.e. on/off. No effects on codecs that >- // do not provide internal FEC. >- // >- // Input: >- // -enable_fec : if true FEC will be enabled otherwise the FEC is >- // disabled. >- // >- // Return value: >- // -1 if failed, or the codec does not support FEC >- // 0 if succeeded. >- // >- virtual int SetCodecFEC(bool enable_codec_fec) = 0; >- >- /////////////////////////////////////////////////////////////////////////// >- // bool CodecFEC() >- // Gets status of codec internal FEC. >- // >- // Return value: >- // true if FEC is enabled, >- // false if FEC is disabled. >- // >- virtual bool CodecFEC() const = 0; >- > /////////////////////////////////////////////////////////////////////////// > // int SetPacketLossRate() > // Sets expected packet loss rate for encoding. Some encoders provide packet >@@ -343,55 +244,6 @@ class AudioCodingModule { > // (VAD) Voice Activity Detection > // > >- /////////////////////////////////////////////////////////////////////////// >- // int32_t SetVAD() >- // If DTX is enabled & the codec does not have internal DTX/VAD >- // WebRtc VAD will be automatically enabled and |enable_vad| is ignored. >- // >- // If DTX is disabled but VAD is enabled no DTX packets are send, >- // regardless of whether the codec has internal DTX/VAD or not. In this >- // case, WebRtc VAD is running to label frames as active/in-active. >- // >- // NOTE! VAD/DTX is not supported when sending stereo. >- // >- // Inputs: >- // -enable_dtx : if true DTX is enabled, >- // otherwise DTX is disabled. >- // -enable_vad : if true VAD is enabled, >- // otherwise VAD is disabled. >- // -vad_mode : determines the aggressiveness of VAD. 
A more >- // aggressive mode results in more frames labeled >- // as in-active, c.f. definition of >- // ACMVADMode in audio_coding_module_typedefs.h >- // for valid values. >- // >- // Return value: >- // -1 if failed to set up VAD/DTX, >- // 0 if succeeded. >- // >- virtual int32_t SetVAD(const bool enable_dtx = true, >- const bool enable_vad = false, >- const ACMVADMode vad_mode = VADNormal) = 0; >- >- /////////////////////////////////////////////////////////////////////////// >- // int32_t VAD() >- // Get VAD status. >- // >- // Outputs: >- // -dtx_enabled : is set to true if DTX is enabled, otherwise >- // is set to false. >- // -vad_enabled : is set to true if VAD is enabled, otherwise >- // is set to false. >- // -vad_mode : is set to the current aggressiveness of VAD. >- // >- // Return value: >- // -1 if fails to retrieve the setting of DTX/VAD, >- // 0 if succeeded. >- // >- virtual int32_t VAD(bool* dtx_enabled, >- bool* vad_enabled, >- ACMVADMode* vad_mode) const = 0; >- > /////////////////////////////////////////////////////////////////////////// > // int32_t RegisterVADCallback() > // Call this method to register a callback function which is called >@@ -455,29 +307,6 @@ class AudioCodingModule { > virtual bool RegisterReceiveCodec(int rtp_payload_type, > const SdpAudioFormat& audio_format) = 0; > >- /////////////////////////////////////////////////////////////////////////// >- // int32_t RegisterReceiveCodec() >- // Register possible decoders, can be called multiple times for >- // codecs, CNG-NB, CNG-WB, CNG-SWB, AVT and RED. >- // >- // Input: >- // -receive_codec : parameters of the codec to be registered, c.f. >- // common_types.h for the definition of >- // CodecInst. >- // >- // Return value: >- // -1 if failed to register the codec >- // 0 if the codec registered successfully. >- // >- virtual int RegisterReceiveCodec(const CodecInst& receive_codec) = 0; >- >- // Register a decoder; call repeatedly to register multiple decoders. 
|df| is >- // a decoder factory that returns an iSAC decoder; it will be called once if >- // the decoder being registered is iSAC. >- virtual int RegisterReceiveCodec( >- const CodecInst& receive_codec, >- rtc::FunctionView<std::unique_ptr<AudioDecoder>()> isac_factory) = 0; >- > // Registers an external decoder. The name is only used to provide information > // back to the caller about the decoder. Hence, the name is arbitrary, and may > // be empty. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/include/audio_coding_module_typedefs.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/include/audio_coding_module_typedefs.h >index cd4351b07ff1b943450904390873296e673354e4..bafff72e5ad4b6acf01ea283de7c061561a5f360 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/include/audio_coding_module_typedefs.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/include/audio_coding_module_typedefs.h >@@ -13,6 +13,8 @@ > > #include <map> > >+#include "rtc_base/deprecation.h" >+ > namespace webrtc { > > /////////////////////////////////////////////////////////////////////////// >@@ -43,6 +45,84 @@ enum OpusApplicationMode { > kAudio = 1, > }; > >+// Statistics for calls to AudioCodingModule::PlayoutData10Ms(). >+struct AudioDecodingCallStats { >+ AudioDecodingCallStats() >+ : calls_to_silence_generator(0), >+ calls_to_neteq(0), >+ decoded_normal(0), >+ decoded_plc(0), >+ decoded_cng(0), >+ decoded_plc_cng(0), >+ decoded_muted_output(0) {} >+ >+ int calls_to_silence_generator; // Number of calls where silence generated, >+ // and NetEq was disengaged from decoding. >+ int calls_to_neteq; // Number of calls to NetEq. >+ int decoded_normal; // Number of calls where audio RTP packet decoded. >+ int decoded_plc; // Number of calls resulted in PLC. >+ int decoded_cng; // Number of calls where comfort noise generated due to DTX. 
>+ int decoded_plc_cng; // Number of calls resulted where PLC faded to CNG. >+ int decoded_muted_output; // Number of calls returning a muted state output. >+}; >+ >+// NETEQ statistics. >+struct NetworkStatistics { >+ // current jitter buffer size in ms >+ uint16_t currentBufferSize; >+ // preferred (optimal) buffer size in ms >+ uint16_t preferredBufferSize; >+ // adding extra delay due to "peaky jitter" >+ bool jitterPeaksFound; >+ // Stats below correspond to similarly-named fields in the WebRTC stats spec. >+ // https://w3c.github.io/webrtc-stats/#dom-rtcmediastreamtrackstats >+ uint64_t totalSamplesReceived; >+ uint64_t concealedSamples; >+ uint64_t concealmentEvents; >+ uint64_t jitterBufferDelayMs; >+ // Stats below DO NOT correspond directly to anything in the WebRTC stats >+ // Loss rate (network + late); fraction between 0 and 1, scaled to Q14. >+ uint16_t currentPacketLossRate; >+ // Late loss rate; fraction between 0 and 1, scaled to Q14. >+ union { >+ RTC_DEPRECATED uint16_t currentDiscardRate; >+ }; >+ // fraction (of original stream) of synthesized audio inserted through >+ // expansion (in Q14) >+ uint16_t currentExpandRate; >+ // fraction (of original stream) of synthesized speech inserted through >+ // expansion (in Q14) >+ uint16_t currentSpeechExpandRate; >+ // fraction of synthesized speech inserted through pre-emptive expansion >+ // (in Q14) >+ uint16_t currentPreemptiveRate; >+ // fraction of data removed through acceleration (in Q14) >+ uint16_t currentAccelerateRate; >+ // fraction of data coming from secondary decoding (in Q14) >+ uint16_t currentSecondaryDecodedRate; >+ // Fraction of secondary data, including FEC and RED, that is discarded (in >+ // Q14). Discarding of secondary data can be caused by the reception of the >+ // primary data, obsoleting the secondary data. It can also be caused by early >+ // or late arrival of secondary data. 
>+ uint16_t currentSecondaryDiscardedRate; >+ // clock-drift in parts-per-million (negative or positive) >+ int32_t clockDriftPPM; >+ // average packet waiting time in the jitter buffer (ms) >+ int meanWaitingTimeMs; >+ // median packet waiting time in the jitter buffer (ms) >+ int medianWaitingTimeMs; >+ // min packet waiting time in the jitter buffer (ms) >+ int minWaitingTimeMs; >+ // max packet waiting time in the jitter buffer (ms) >+ int maxWaitingTimeMs; >+ // added samples in off mode due to packet loss >+ size_t addedSamples; >+ // count of the number of buffer flushes >+ uint64_t packetBufferFlushes; >+ // number of samples expanded due to delayed packets >+ uint64_t delayedPacketOutageSamples; >+}; >+ > } // namespace webrtc > > #endif // MODULES_AUDIO_CODING_INCLUDE_AUDIO_CODING_MODULE_TYPEDEFS_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/accelerate.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/accelerate.cc >index 18350b0a78971064654497bfda761ea1d30ffaf4..6161a8f91bf294fc4ccc52df4fc23bfadb3e6c89 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/accelerate.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/accelerate.cc >@@ -10,7 +10,10 @@ > > #include "modules/audio_coding/neteq/accelerate.h" > >-#include "common_audio/signal_processing/include/signal_processing_library.h" >+#include <assert.h> >+ >+#include "api/array_view.h" >+#include "modules/audio_coding/neteq/audio_multi_vector.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/accelerate.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/accelerate.h >index 560956895b7df63cd665dc4ba141aa933956068c..1a3af42b2e9f3c4c88b91a708f446d141e86e258 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/accelerate.h >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/accelerate.h >@@ -11,13 +11,15 @@ > #ifndef MODULES_AUDIO_CODING_NETEQ_ACCELERATE_H_ > #define MODULES_AUDIO_CODING_NETEQ_ACCELERATE_H_ > >-#include "modules/audio_coding/neteq/audio_multi_vector.h" >+#include <stddef.h> >+#include <stdint.h> >+ > #include "modules/audio_coding/neteq/time_stretch.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { > >-// Forward declarations. >+class AudioMultiVector; > class BackgroundNoise; > > // This class implements the Accelerate operation. Most of the work is done >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/audio_multi_vector.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/audio_multi_vector.h >index 86f82820ca7a77bd66d0c640d195066dce9ebb26..a2dd3c3cb10cfd4ccb58bb2a88d099778bbd7202 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/audio_multi_vector.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/audio_multi_vector.h >@@ -11,8 +11,8 @@ > #ifndef MODULES_AUDIO_CODING_NETEQ_AUDIO_MULTI_VECTOR_H_ > #define MODULES_AUDIO_CODING_NETEQ_AUDIO_MULTI_VECTOR_H_ > >-#include <string.h> // Access to size_t. >- >+#include <stdint.h> >+#include <string.h> > #include <vector> > > #include "api/array_view.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/audio_vector.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/audio_vector.h >index 825a3bceac9be0f3b5ed60a27f568e773f293d30..d0db332db65de848d302f2064b302ff11b103a6a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/audio_vector.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/audio_vector.h >@@ -11,7 +11,8 @@ > #ifndef MODULES_AUDIO_CODING_NETEQ_AUDIO_VECTOR_H_ > #define MODULES_AUDIO_CODING_NETEQ_AUDIO_VECTOR_H_ > >-#include <string.h> // Access to size_t. 
>+#include <string.h> >+#include <cstdint> > #include <memory> > > #include "rtc_base/checks.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/background_noise.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/background_noise.h >index 58eecaa705ce8252dc87bb7754c23c7e2292ace4..84d7eb9d745af36a47c4767e9cfad530db14eeef 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/background_noise.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/background_noise.h >@@ -14,13 +14,12 @@ > #include <string.h> // size_t > #include <memory> > >-#include "modules/audio_coding/neteq/audio_multi_vector.h" >-#include "modules/audio_coding/neteq/include/neteq.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { > > // Forward declarations. >+class AudioMultiVector; > class PostDecodeVad; > > // This class handles estimation of background noise parameters. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/buffer_level_filter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/buffer_level_filter.cc >index 6e8da0a01b95fb0cb96f03e93fcfba2a60c5f216..2f966185362f94eee9cbb609fe88c0301ff9dbcf 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/buffer_level_filter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/buffer_level_filter.cc >@@ -10,7 +10,8 @@ > > #include "modules/audio_coding/neteq/buffer_level_filter.h" > >-#include <algorithm> // Provide access to std::max. 
>+#include <stdint.h> >+#include <algorithm> > > #include "rtc_base/numerics/safe_conversions.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/comfort_noise.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/comfort_noise.cc >index b341acd5a50d6e58b8dd2365af5c6d6bda724689..cb2b74dbf2b19a0af8f153c68e74e4f7805f89c1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/comfort_noise.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/comfort_noise.cc >@@ -11,11 +11,18 @@ > #include "modules/audio_coding/neteq/comfort_noise.h" > > #include <assert.h> >+#include <cstdint> >+#include <memory> > >-#include "api/audio_codecs/audio_decoder.h" >+#include "api/array_view.h" >+#include "modules/audio_coding/codecs/cng/webrtc_cng.h" >+#include "modules/audio_coding/neteq/audio_multi_vector.h" >+#include "modules/audio_coding/neteq/audio_vector.h" > #include "modules/audio_coding/neteq/decoder_database.h" > #include "modules/audio_coding/neteq/dsp_helper.h" > #include "modules/audio_coding/neteq/sync_buffer.h" >+#include "rtc_base/buffer.h" >+#include "rtc_base/checks.h" > #include "rtc_base/logging.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/comfort_noise.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/comfort_noise.h >index 5169124040f677c04fedca12a0419dbc38320191..3a9bfde7aab5b4e61813d5d69ea4f9060383bbeb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/comfort_noise.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/comfort_noise.h >@@ -11,12 +11,14 @@ > #ifndef MODULES_AUDIO_CODING_NETEQ_COMFORT_NOISE_H_ > #define MODULES_AUDIO_CODING_NETEQ_COMFORT_NOISE_H_ > >-#include "modules/audio_coding/neteq/audio_multi_vector.h" >+#include <stddef.h> >+ > #include "rtc_base/constructormagic.h" > > namespace webrtc { > > 
// Forward declarations. >+class AudioMultiVector; > class DecoderDatabase; > class SyncBuffer; > struct Packet; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/cross_correlation.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/cross_correlation.h >index a747772cfc4352bb28da5265f039ca347f44e1b6..9ce8be83aecf0f52deab7cdb2b63e73c9cc66726 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/cross_correlation.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/cross_correlation.h >@@ -11,7 +11,8 @@ > #ifndef MODULES_AUDIO_CODING_NETEQ_CROSS_CORRELATION_H_ > #define MODULES_AUDIO_CODING_NETEQ_CROSS_CORRELATION_H_ > >-#include "common_types.h" // NOLINT(build/include) >+#include <stddef.h> >+#include <stdint.h> > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/decision_logic.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/decision_logic.cc >index 349fdab9f15d0973e9f8c393edd56f43f88fb712..83c2b3b90c569eab3cc11d491971ceb06932d3f8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/decision_logic.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/decision_logic.cc >@@ -11,8 +11,8 @@ > #include "modules/audio_coding/neteq/decision_logic.h" > > #include <assert.h> >-#include <algorithm> >-#include <limits> >+#include <stdio.h> >+#include <string> > > #include "modules/audio_coding/neteq/buffer_level_filter.h" > #include "modules/audio_coding/neteq/decoder_database.h" >@@ -20,8 +20,9 @@ > #include "modules/audio_coding/neteq/expand.h" > #include "modules/audio_coding/neteq/packet_buffer.h" > #include "modules/audio_coding/neteq/sync_buffer.h" >-#include "modules/include/module_common_types.h" >+#include "rtc_base/checks.h" > #include "rtc_base/logging.h" >+#include "rtc_base/numerics/safe_conversions.h" > #include 
"system_wrappers/include/field_trial.h" > > namespace { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/decision_logic.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/decision_logic.h >index 39761da0702b419b7d55573f332e7697d13f4246..2a533594b3382f471ce088a86f223e185f8b6f04 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/decision_logic.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/decision_logic.h >@@ -12,7 +12,6 @@ > #define MODULES_AUDIO_CODING_NETEQ_DECISION_LOGIC_H_ > > #include "modules/audio_coding/neteq/defines.h" >-#include "modules/audio_coding/neteq/include/neteq.h" > #include "modules/audio_coding/neteq/tick_timer.h" > #include "rtc_base/constructormagic.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/decision_logic_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/decision_logic_unittest.cc >index 08720d1768f456d00081b30087e0f105aee15758..183b9c79c9356331f07d3f6806e91b8957c907ab 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/decision_logic_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/decision_logic_unittest.cc >@@ -31,7 +31,7 @@ TEST(DecisionLogic, CreateAndDestroy) { > TickTimer tick_timer; > PacketBuffer packet_buffer(10, &tick_timer); > DelayPeakDetector delay_peak_detector(&tick_timer); >- DelayManager delay_manager(240, &delay_peak_detector, &tick_timer); >+ DelayManager delay_manager(240, 0, &delay_peak_detector, &tick_timer); > BufferLevelFilter buffer_level_filter; > DecisionLogic* logic = DecisionLogic::Create( > fs_hz, output_size_samples, false, &decoder_database, packet_buffer, >@@ -48,7 +48,7 @@ TEST(DecisionLogic, PostponeDecodingAfterExpansionSettings) { > TickTimer tick_timer; > PacketBuffer packet_buffer(10, &tick_timer); > DelayPeakDetector 
delay_peak_detector(&tick_timer); >- DelayManager delay_manager(240, &delay_peak_detector, &tick_timer); >+ DelayManager delay_manager(240, 0, &delay_peak_detector, &tick_timer); > BufferLevelFilter buffer_level_filter; > { > test::ScopedFieldTrials field_trial( >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/decoder_database.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/decoder_database.cc >index 1fd8c03ca24a63ba55590aacac34a05c03f5c6ba..0890beb9342d682fdce6ccbdb98c04571c183749 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/decoder_database.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/decoder_database.cc >@@ -10,9 +10,15 @@ > > #include "modules/audio_coding/neteq/decoder_database.h" > >-#include <utility> // pair >+#include <stddef.h> >+#include <cstdint> >+#include <list> >+#include <type_traits> >+#include <utility> > >+#include "absl/strings/match.h" > #include "api/audio_codecs/audio_decoder.h" >+#include "common_types.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" > #include "rtc_base/strings/audio_format_to_string.h" >@@ -101,7 +107,7 @@ AudioDecoder* DecoderDatabase::DecoderInfo::GetDecoder() const { > } > > bool DecoderDatabase::DecoderInfo::IsType(const char* name) const { >- return STR_CASE_CMP(audio_format_.name.c_str(), name) == 0; >+ return absl::EqualsIgnoreCase(audio_format_.name, name); > } > > bool DecoderDatabase::DecoderInfo::IsType(const std::string& name) const { >@@ -110,7 +116,7 @@ bool DecoderDatabase::DecoderInfo::IsType(const std::string& name) const { > > absl::optional<DecoderDatabase::DecoderInfo::CngDecoder> > DecoderDatabase::DecoderInfo::CngDecoder::Create(const SdpAudioFormat& format) { >- if (STR_CASE_CMP(format.name.c_str(), "CN") == 0) { >+ if (absl::EqualsIgnoreCase(format.name, "CN")) { > // CN has a 1:1 RTP clock rate to sample rate ratio. 
> const int sample_rate_hz = format.clockrate_hz; > RTC_DCHECK(sample_rate_hz == 8000 || sample_rate_hz == 16000 || >@@ -123,11 +129,11 @@ DecoderDatabase::DecoderInfo::CngDecoder::Create(const SdpAudioFormat& format) { > > DecoderDatabase::DecoderInfo::Subtype > DecoderDatabase::DecoderInfo::SubtypeFromFormat(const SdpAudioFormat& format) { >- if (STR_CASE_CMP(format.name.c_str(), "CN") == 0) { >+ if (absl::EqualsIgnoreCase(format.name, "CN")) { > return Subtype::kComfortNoise; >- } else if (STR_CASE_CMP(format.name.c_str(), "telephone-event") == 0) { >+ } else if (absl::EqualsIgnoreCase(format.name, "telephone-event")) { > return Subtype::kDtmf; >- } else if (STR_CASE_CMP(format.name.c_str(), "red") == 0) { >+ } else if (absl::EqualsIgnoreCase(format.name, "red")) { > return Subtype::kRed; > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/delay_manager.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/delay_manager.cc >index e5eb592efa1347565b62b0b19da75f71a3200dd3..67e6a13c1d769526d1718224a877c2e642dc93a7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/delay_manager.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/delay_manager.cc >@@ -11,14 +11,15 @@ > #include "modules/audio_coding/neteq/delay_manager.h" > > #include <assert.h> >-#include <math.h> >- >-#include <algorithm> // max, min >+#include <stdio.h> >+#include <stdlib.h> >+#include <algorithm> > #include <numeric> >+#include <string> > >-#include "common_audio/signal_processing/include/signal_processing_library.h" > #include "modules/audio_coding/neteq/delay_peak_detector.h" >-#include "modules/include/module_common_types.h" >+#include "modules/include/module_common_types_public.h" >+#include "rtc_base/checks.h" > #include "rtc_base/logging.h" > #include "rtc_base/numerics/safe_conversions.h" > #include "system_wrappers/include/field_trial.h" >@@ -61,6 +62,7 @@ 
absl::optional<int> GetForcedLimitProbability() { > namespace webrtc { > > DelayManager::DelayManager(size_t max_packets_in_buffer, >+ int base_min_target_delay_ms, > DelayPeakDetector* peak_detector, > const TickTimer* tick_timer) > : first_packet_received_(false), >@@ -68,13 +70,14 @@ DelayManager::DelayManager(size_t max_packets_in_buffer, > iat_vector_(kMaxIat + 1, 0), > iat_factor_(0), > tick_timer_(tick_timer), >+ base_min_target_delay_ms_(base_min_target_delay_ms), > base_target_level_(4), // In Q0 domain. > target_level_(base_target_level_ << 8), // In Q8 domain. > packet_len_ms_(0), > streaming_mode_(false), > last_seq_no_(0), > last_timestamp_(0), >- minimum_delay_ms_(0), >+ minimum_delay_ms_(base_min_target_delay_ms_), > maximum_delay_ms_(target_level_), > iat_cumulative_sum_(0), > max_iat_cumulative_sum_(0), >@@ -84,6 +87,8 @@ DelayManager::DelayManager(size_t max_packets_in_buffer, > field_trial::IsEnabled("WebRTC-Audio-NetEqFramelengthExperiment")), > forced_limit_probability_(GetForcedLimitProbability()) { > assert(peak_detector); // Should never be NULL. 
>+ RTC_DCHECK_GE(base_min_target_delay_ms_, 0); >+ RTC_DCHECK_LE(minimum_delay_ms_, maximum_delay_ms_); > > Reset(); > } >@@ -484,7 +489,7 @@ bool DelayManager::SetMinimumDelay(int delay_ms) { > static_cast<int>(3 * max_packets_in_buffer_ * packet_len_ms_ / 4))) { > return false; > } >- minimum_delay_ms_ = delay_ms; >+ minimum_delay_ms_ = std::max(delay_ms, base_min_target_delay_ms_); > return true; > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/delay_manager.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/delay_manager.h >index cd5fc09031691f259fde43e1c023f0c30acbf1e5..2c8081b075fe41357ea6c64dbe2091e05d481fd8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/delay_manager.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/delay_manager.h >@@ -31,9 +31,11 @@ class DelayManager { > > // Create a DelayManager object. Notify the delay manager that the packet > // buffer can hold no more than |max_packets_in_buffer| packets (i.e., this >- // is the number of packet slots in the buffer). Supply a PeakDetector >- // object to the DelayManager. >+ // is the number of packet slots in the buffer) and that the target delay >+ // should be greater than or equal to |base_min_target_delay_ms|. Supply a >+ // PeakDetector object to the DelayManager. > DelayManager(size_t max_packets_in_buffer, >+ int base_min_target_delay_ms, > DelayPeakDetector* peak_detector, > const TickTimer* tick_timer); > >@@ -144,6 +146,8 @@ class DelayManager { > IATVector iat_vector_; // Histogram of inter-arrival times. > int iat_factor_; // Forgetting factor for updating the IAT histogram (Q15). > const TickTimer* tick_timer_; >+ const int base_min_target_delay_ms_; // Lower bound for target_level_ and >+ // minimum_delay_ms_. > // Time elapsed since last packet. 
> std::unique_ptr<TickTimer::Stopwatch> packet_iat_stopwatch_; > int base_target_level_; // Currently preferred buffer level before peak >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/delay_manager_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/delay_manager_unittest.cc >index e4e865fe46be30028032c7d546c93c82aadc1a10..6281a156799893c025155ea55e675fbe7e9c6915 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/delay_manager_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/delay_manager_unittest.cc >@@ -27,6 +27,7 @@ using ::testing::_; > class DelayManagerTest : public ::testing::Test { > protected: > static const int kMaxNumberOfPackets = 240; >+ static const int kMinDelayMs = 0; > static const int kTimeStepMs = 10; > static const int kFs = 8000; > static const int kFrameSizeMs = 20; >@@ -56,7 +57,8 @@ void DelayManagerTest::SetUp() { > > void DelayManagerTest::RecreateDelayManager() { > EXPECT_CALL(detector_, Reset()).Times(1); >- dm_.reset(new DelayManager(kMaxNumberOfPackets, &detector_, &tick_timer_)); >+ dm_.reset(new DelayManager(kMaxNumberOfPackets, kMinDelayMs, &detector_, >+ &tick_timer_)); > } > > void DelayManagerTest::SetPacketAudioLength(int lengt_ms) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/delay_peak_detector.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/delay_peak_detector.cc >index eb9f6d530d5d651aa56721de7fecb01c207edaa3..893ce3e8fd24448cea66c7ea43de6b706c72467d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/delay_peak_detector.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/delay_peak_detector.cc >@@ -10,10 +10,9 @@ > > #include "modules/audio_coding/neteq/delay_peak_detector.h" > >-#include <algorithm> // max >+#include <algorithm> > > #include "rtc_base/checks.h" 
>-#include "rtc_base/numerics/safe_conversions.h" > #include "system_wrappers/include/field_trial.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/delay_peak_detector.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/delay_peak_detector.h >index 9defca5ad0216478bc7edc03d4df998d4d8fc415..272d50ed509fce23b56268bd5df17d6e7aec6b3c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/delay_peak_detector.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/delay_peak_detector.h >@@ -11,8 +11,8 @@ > #ifndef MODULES_AUDIO_CODING_NETEQ_DELAY_PEAK_DETECTOR_H_ > #define MODULES_AUDIO_CODING_NETEQ_DELAY_PEAK_DETECTOR_H_ > >-#include <string.h> // size_t >- >+#include <stdint.h> >+#include <string.h> > #include <list> > #include <memory> > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/dsp_helper.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/dsp_helper.h >index efa2f9ce06447a8de2f2e34879506f580a68abcb..83794612c5fdf61a4e5d34a17ba9ba9a3fb98596 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/dsp_helper.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/dsp_helper.h >@@ -11,9 +11,11 @@ > #ifndef MODULES_AUDIO_CODING_NETEQ_DSP_HELPER_H_ > #define MODULES_AUDIO_CODING_NETEQ_DSP_HELPER_H_ > >-#include <string.h> // Access to size_t. 
>+#include <stdint.h> >+#include <string.h> > > #include "modules/audio_coding/neteq/audio_multi_vector.h" >+#include "modules/audio_coding/neteq/audio_vector.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/dtmf_buffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/dtmf_buffer.h >index 6de512772f2ea50d8b3eb285dd7f45a6c80ea01a..24b14ece416a1b46b3dd6754ff83eadf5a7ec4f2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/dtmf_buffer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/dtmf_buffer.h >@@ -11,8 +11,9 @@ > #ifndef MODULES_AUDIO_CODING_NETEQ_DTMF_BUFFER_H_ > #define MODULES_AUDIO_CODING_NETEQ_DTMF_BUFFER_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <list> >-#include <string> // size_t > > #include "rtc_base/constructormagic.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/dtmf_tone_generator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/dtmf_tone_generator.cc >index 6fdb95aacad67bf133cec71471e696703c328ab9..6c412e364d87d4cf747968459672b34eb0ccf924 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/dtmf_tone_generator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/dtmf_tone_generator.cc >@@ -30,6 +30,7 @@ > > #include "modules/audio_coding/neteq/dtmf_tone_generator.h" > >+#include "modules/audio_coding/neteq/audio_vector.h" > #include "rtc_base/arraysize.h" > #include "rtc_base/checks.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/dtmf_tone_generator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/dtmf_tone_generator.h >index a773ff353d3b30df6948dc7cad748d30f9326a53..22e166ef0c43f9bb80127421733de30163363d00 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/dtmf_tone_generator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/dtmf_tone_generator.h >@@ -11,6 +11,9 @@ > #ifndef MODULES_AUDIO_CODING_NETEQ_DTMF_TONE_GENERATOR_H_ > #define MODULES_AUDIO_CODING_NETEQ_DTMF_TONE_GENERATOR_H_ > >+#include <stddef.h> >+#include <stdint.h> >+ > #include "modules/audio_coding/neteq/audio_multi_vector.h" > #include "rtc_base/constructormagic.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/expand.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/expand.cc >index 5f671ada6b6203526e1f283bc096f9213c56a54c..4a06d09c1c61a7261437a9a77cd178f3c59e004f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/expand.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/expand.cc >@@ -17,6 +17,7 @@ > #include <limits> // numeric_limits<T> > > #include "common_audio/signal_processing/include/signal_processing_library.h" >+#include "modules/audio_coding/neteq/audio_multi_vector.h" > #include "modules/audio_coding/neteq/background_noise.h" > #include "modules/audio_coding/neteq/cross_correlation.h" > #include "modules/audio_coding/neteq/dsp_helper.h" >@@ -322,8 +323,7 @@ void Expand::SetParametersForNormalAfterExpand() { > current_lag_index_ = 0; > lag_index_direction_ = 0; > stop_muting_ = true; // Do not mute signal any more. 
>- statistics_->LogDelayedPacketOutageEvent( >- rtc::dchecked_cast<int>(expand_duration_samples_) / (fs_hz_ / 1000)); >+ statistics_->LogDelayedPacketOutageEvent(expand_duration_samples_, fs_hz_); > } > > void Expand::SetParametersForMergeAfterExpand() { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/expand.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/expand.h >index 4cfe7b9d8cbeec1e7990bb4279a64016be149600..30c34a2c300df162429531a0ed32ed92001ae650 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/expand.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/expand.h >@@ -14,12 +14,13 @@ > #include <assert.h> > #include <memory> > >-#include "modules/audio_coding/neteq/audio_multi_vector.h" >+#include "modules/audio_coding/neteq/audio_vector.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { > > // Forward declarations. >+class AudioMultiVector; > class BackgroundNoise; > class RandomVector; > class StatisticsCalculator; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/expand_uma_logger.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/expand_uma_logger.h >index 00907d4e5a841b4bfac592d72c96e788dc696652..bd079c6ab7750b0f3c1c5fa653780318aed765de 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/expand_uma_logger.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/expand_uma_logger.h >@@ -10,6 +10,7 @@ > #ifndef MODULES_AUDIO_CODING_NETEQ_EXPAND_UMA_LOGGER_H_ > #define MODULES_AUDIO_CODING_NETEQ_EXPAND_UMA_LOGGER_H_ > >+#include <stdint.h> > #include <memory> > #include <string> > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/expand_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/expand_unittest.cc >index 
b4e6466e9045002bff808bdaceefa726b7d0ea56..09914daca82295c88dadee8190413186832b5a1d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/expand_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/expand_unittest.cc >@@ -51,14 +51,16 @@ TEST(Expand, CreateUsingFactory) { > namespace { > class FakeStatisticsCalculator : public StatisticsCalculator { > public: >- void LogDelayedPacketOutageEvent(int outage_duration_ms) override { >- last_outage_duration_ms_ = outage_duration_ms; >+ void LogDelayedPacketOutageEvent(int num_samples, int fs_hz) override { >+ last_outage_duration_samples_ = num_samples; > } > >- int last_outage_duration_ms() const { return last_outage_duration_ms_; } >+ int last_outage_duration_samples() const { >+ return last_outage_duration_samples_; >+ } > > private: >- int last_outage_duration_ms_ = 0; >+ int last_outage_duration_samples_ = 0; > }; > > // This is the same size that is given to the SyncBuffer object in NetEq. >@@ -120,13 +122,12 @@ TEST_F(ExpandTest, DelayedPacketOutage) { > EXPECT_EQ(0, expand_.Process(&output)); > EXPECT_GT(output.Size(), 0u); > sum_output_len_samples += output.Size(); >- EXPECT_EQ(0, statistics_.last_outage_duration_ms()); >+ EXPECT_EQ(0, statistics_.last_outage_duration_samples()); > } > expand_.SetParametersForNormalAfterExpand(); > // Convert |sum_output_len_samples| to milliseconds. 
>- EXPECT_EQ(rtc::checked_cast<int>(sum_output_len_samples / >- (test_sample_rate_hz_ / 1000)), >- statistics_.last_outage_duration_ms()); >+ EXPECT_EQ(rtc::checked_cast<int>(sum_output_len_samples), >+ statistics_.last_outage_duration_samples()); > } > > // This test is similar to DelayedPacketOutage, but ends by calling >@@ -140,10 +141,10 @@ TEST_F(ExpandTest, LostPacketOutage) { > EXPECT_EQ(0, expand_.Process(&output)); > EXPECT_GT(output.Size(), 0u); > sum_output_len_samples += output.Size(); >- EXPECT_EQ(0, statistics_.last_outage_duration_ms()); >+ EXPECT_EQ(0, statistics_.last_outage_duration_samples()); > } > expand_.SetParametersForMergeAfterExpand(); >- EXPECT_EQ(0, statistics_.last_outage_duration_ms()); >+ EXPECT_EQ(0, statistics_.last_outage_duration_samples()); > } > > // This test is similar to the DelayedPacketOutage test above, but with the >@@ -161,13 +162,12 @@ TEST_F(ExpandTest, CheckOutageStatsAfterReset) { > expand_.Reset(); > sum_output_len_samples = 0; > } >- EXPECT_EQ(0, statistics_.last_outage_duration_ms()); >+ EXPECT_EQ(0, statistics_.last_outage_duration_samples()); > } > expand_.SetParametersForNormalAfterExpand(); > // Convert |sum_output_len_samples| to milliseconds. 
>- EXPECT_EQ(rtc::checked_cast<int>(sum_output_len_samples / >- (test_sample_rate_hz_ / 1000)), >- statistics_.last_outage_duration_ms()); >+ EXPECT_EQ(rtc::checked_cast<int>(sum_output_len_samples), >+ statistics_.last_outage_duration_samples()); > } > > namespace { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/include/neteq.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/include/neteq.h >index 530975f3b8e7d7c53bce375401fb25fb5745317e..2820fd84787d5636f11b170c92a5b37058d19e59 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/include/neteq.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/include/neteq.h >@@ -72,6 +72,7 @@ struct NetEqLifetimeStatistics { > uint64_t jitter_buffer_delay_ms = 0; > // Below stat is not part of the spec. > uint64_t voice_concealed_samples = 0; >+ uint64_t delayed_packet_outage_samples = 0; > }; > > // Metrics that describe the operations performed in NetEq, and the internal >@@ -112,6 +113,7 @@ class NetEq { > bool enable_post_decode_vad = false; > size_t max_packets_in_buffer = 50; > int max_delay_ms = 2000; >+ int min_delay_ms = 0; > bool enable_fast_accelerate = false; > bool enable_muted_state = false; > absl::optional<AudioCodecPairId> codec_pair_id; >@@ -231,13 +233,6 @@ class NetEq { > // statistics are never reset. > virtual NetEqOperationsAndState GetOperationsAndState() const = 0; > >- // Writes the current RTCP statistics to |stats|. The statistics are reset >- // and a new report period is started with the call. >- virtual void GetRtcpStatistics(RtcpStatistics* stats) = 0; >- >- // Same as RtcpStatistics(), but does not reset anything. >- virtual void GetRtcpStatisticsNoReset(RtcpStatistics* stats) = 0; >- > // Enables post-decode VAD. When enabled, GetAudio() will return > // kOutputVADPassive when the signal contains no speech. 
> virtual void EnableVad() = 0; >@@ -266,10 +261,6 @@ class NetEq { > // Flushes both the packet buffer and the sync buffer. > virtual void FlushBuffers() = 0; > >- // Current usage of packet-buffer and it's limits. >- virtual void PacketBufferStatistics(int* current_num_packets, >- int* max_num_packets) const = 0; >- > // Enables NACK and sets the maximum size of the NACK list, which should be > // positive and no larger than Nack::kNackListSizeLimit. If NACK is already > // enabled then the maximum NACK list size is modified accordingly. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/mock/mock_delay_manager.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/mock/mock_delay_manager.h >index 9b2ed498d9dbfcc241ae14f7a75209601cade097..206cea7f10e7f569a45c486390c9b8748a5c5b05 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/mock/mock_delay_manager.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/mock/mock_delay_manager.h >@@ -20,9 +20,13 @@ namespace webrtc { > class MockDelayManager : public DelayManager { > public: > MockDelayManager(size_t max_packets_in_buffer, >+ int base_min_target_delay_ms, > DelayPeakDetector* peak_detector, > const TickTimer* tick_timer) >- : DelayManager(max_packets_in_buffer, peak_detector, tick_timer) {} >+ : DelayManager(max_packets_in_buffer, >+ base_min_target_delay_ms, >+ peak_detector, >+ tick_timer) {} > virtual ~MockDelayManager() { Die(); } > MOCK_METHOD0(Die, void()); > MOCK_CONST_METHOD0(iat_vector, const IATVector&()); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/nack_tracker.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/nack_tracker.cc >index c62cdf88e4dc13d84fe3011bbe6435ba49c8f7c4..e3ecfeaed27073c66ca2ae2aa0d2e1bd846db1e9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/nack_tracker.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/nack_tracker.cc >@@ -10,9 +10,9 @@ > > #include "modules/audio_coding/neteq/nack_tracker.h" > >-#include <assert.h> // For assert. >- >-#include <algorithm> // For std::max. >+#include <assert.h> >+#include <cstdint> >+#include <utility> > > #include "rtc_base/checks.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/nack_tracker.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/nack_tracker.h >index 1936a944ccb52be74e5982b8dde36cf7a31ac1e3..d7c6b084222cfc2b0c4fde77ee44cd5862276903 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/nack_tracker.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/nack_tracker.h >@@ -11,11 +11,12 @@ > #ifndef MODULES_AUDIO_CODING_NETEQ_NACK_TRACKER_H_ > #define MODULES_AUDIO_CODING_NETEQ_NACK_TRACKER_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <map> > #include <vector> > >-#include "modules/audio_coding/include/audio_coding_module_typedefs.h" >-#include "modules/include/module_common_types.h" >+#include "modules/include/module_common_types_public.h" > #include "rtc_base/gtest_prod_util.h" > > // >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/neteq.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/neteq.cc >index cf1c6aa665364553f30406b76aa430bc6f219e20..0e6147e232cb1470185bdf9872a708d8555ef6f4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/neteq.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/neteq.cc >@@ -10,8 +10,6 @@ > > #include "modules/audio_coding/neteq/include/neteq.h" > >-#include <memory> >- > #include "modules/audio_coding/neteq/neteq_impl.h" > #include "rtc_base/strings/string_builder.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/neteq_impl.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/neteq_impl.cc >index f428be1c3bc8250dcd16cbfe086a6294276ef45f..2a025f304e537ce11800b7c9160032e19d224547 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/neteq_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/neteq_impl.cc >@@ -11,13 +11,16 @@ > #include "modules/audio_coding/neteq/neteq_impl.h" > > #include <assert.h> >- > #include <algorithm> >+#include <cstdint> >+#include <cstring> >+#include <list> > #include <utility> > #include <vector> > > #include "api/audio_codecs/audio_decoder.h" > #include "common_audio/signal_processing/include/signal_processing_library.h" >+#include "modules/audio_coding/codecs/cng/webrtc_cng.h" > #include "modules/audio_coding/neteq/accelerate.h" > #include "modules/audio_coding/neteq/background_noise.h" > #include "modules/audio_coding/neteq/buffer_level_filter.h" >@@ -40,15 +43,14 @@ > #include "modules/audio_coding/neteq/red_payload_splitter.h" > #include "modules/audio_coding/neteq/sync_buffer.h" > #include "modules/audio_coding/neteq/tick_timer.h" >+#include "modules/audio_coding/neteq/time_stretch.h" > #include "modules/audio_coding/neteq/timestamp_scaler.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" > #include "rtc_base/numerics/safe_conversions.h" > #include "rtc_base/sanitizer.h" > #include "rtc_base/strings/audio_format_to_string.h" >-#include "rtc_base/system/fallthrough.h" > #include "rtc_base/trace_event.h" >-#include "system_wrappers/include/field_trial.h" > > namespace webrtc { > >@@ -61,6 +63,7 @@ NetEqImpl::Dependencies::Dependencies( > new DecoderDatabase(decoder_factory, config.codec_pair_id)), > delay_peak_detector(new DelayPeakDetector(tick_timer.get())), > delay_manager(new DelayManager(config.max_packets_in_buffer, >+ config.min_delay_ms, > delay_peak_detector.get(), > tick_timer.get())), > dtmf_buffer(new DtmfBuffer(config.sample_rate_hz)), >@@ -385,20 
+388,6 @@ NetEqOperationsAndState NetEqImpl::GetOperationsAndState() const { > return result; > } > >-void NetEqImpl::GetRtcpStatistics(RtcpStatistics* stats) { >- rtc::CritScope lock(&crit_sect_); >- if (stats) { >- rtcp_.GetStatistics(false, stats); >- } >-} >- >-void NetEqImpl::GetRtcpStatisticsNoReset(RtcpStatistics* stats) { >- rtc::CritScope lock(&crit_sect_); >- if (stats) { >- rtcp_.GetStatistics(true, stats); >- } >-} >- > void NetEqImpl::EnableVad() { > rtc::CritScope lock(&crit_sect_); > assert(vad_.get()); >@@ -473,12 +462,6 @@ void NetEqImpl::FlushBuffers() { > first_packet_ = true; > } > >-void NetEqImpl::PacketBufferStatistics(int* current_num_packets, >- int* max_num_packets) const { >- rtc::CritScope lock(&crit_sect_); >- packet_buffer_->BufferStat(current_num_packets, max_num_packets); >-} >- > void NetEqImpl::EnableNack(size_t max_nack_list_size) { > rtc::CritScope lock(&crit_sect_); > if (!nack_enabled_) { >@@ -574,8 +557,6 @@ int NetEqImpl::InsertPacketInternal(const RTPHeader& rtp_header, > // Note: |first_packet_| will be cleared further down in this method, once > // the packet has been successfully inserted into the packet buffer. > >- rtcp_.Init(rtp_header.sequenceNumber); >- > // Flush the packet buffer and DTMF buffer. > packet_buffer_->Flush(); > dtmf_buffer_->Flush(); >@@ -590,9 +571,6 @@ int NetEqImpl::InsertPacketInternal(const RTPHeader& rtp_header, > timestamp_ = main_timestamp; > } > >- // Update RTCP statistics, only for regular packets. 
>- rtcp_.Update(rtp_header, receive_timestamp); >- > if (nack_enabled_) { > RTC_DCHECK(nack_); > if (update_sample_rate_and_channels) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/neteq_impl.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/neteq_impl.h >index 8ef97ce0af376f941ead4b1ec3eaf1d8896ec9f3..525ae615d856efd42ea168378b9224c0362e1761 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/neteq_impl.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/neteq_impl.h >@@ -17,12 +17,11 @@ > #include "absl/types/optional.h" > #include "api/audio/audio_frame.h" > #include "modules/audio_coding/neteq/audio_multi_vector.h" >-#include "modules/audio_coding/neteq/defines.h" >+#include "modules/audio_coding/neteq/defines.h" // Modes, Operations > #include "modules/audio_coding/neteq/expand_uma_logger.h" > #include "modules/audio_coding/neteq/include/neteq.h" >-#include "modules/audio_coding/neteq/packet.h" // Declare PacketList. >+#include "modules/audio_coding/neteq/packet.h" > #include "modules/audio_coding/neteq/random_vector.h" >-#include "modules/audio_coding/neteq/rtcp.h" > #include "modules/audio_coding/neteq/statistics_calculator.h" > #include "modules/audio_coding/neteq/tick_timer.h" > #include "rtc_base/constructormagic.h" >@@ -170,17 +169,10 @@ class NetEqImpl : public webrtc::NetEq { > // after the call. > int NetworkStatistics(NetEqNetworkStatistics* stats) override; > >- // Writes the current RTCP statistics to |stats|. The statistics are reset >- // and a new report period is started with the call. >- void GetRtcpStatistics(RtcpStatistics* stats) override; >- > NetEqLifetimeStatistics GetLifetimeStatistics() const override; > > NetEqOperationsAndState GetOperationsAndState() const override; > >- // Same as RtcpStatistics(), but does not reset anything. 
>- void GetRtcpStatisticsNoReset(RtcpStatistics* stats) override; >- > // Enables post-decode VAD. When enabled, GetAudio() will return > // kOutputVADPassive when the signal contains no speech. > void EnableVad() override; >@@ -200,9 +192,6 @@ class NetEqImpl : public webrtc::NetEq { > // Flushes both the packet buffer and the sync buffer. > void FlushBuffers() override; > >- void PacketBufferStatistics(int* current_num_packets, >- int* max_num_packets) const override; >- > void EnableNack(size_t max_nack_list_size) override; > > void DisableNack() override; >@@ -395,7 +384,6 @@ class NetEqImpl : public webrtc::NetEq { > RTC_GUARDED_BY(crit_sect_); > RandomVector random_vector_ RTC_GUARDED_BY(crit_sect_); > std::unique_ptr<ComfortNoise> comfort_noise_ RTC_GUARDED_BY(crit_sect_); >- Rtcp rtcp_ RTC_GUARDED_BY(crit_sect_); > StatisticsCalculator stats_ RTC_GUARDED_BY(crit_sect_); > int fs_hz_ RTC_GUARDED_BY(crit_sect_); > int fs_mult_ RTC_GUARDED_BY(crit_sect_); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/neteq_impl_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/neteq_impl_unittest.cc >index b772dfa71de119fbcc0406eb4bed46160e47f5e8..0e087c847f18fdfc35fd28f322f5fab25952a030 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/neteq_impl_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/neteq_impl_unittest.cc >@@ -92,7 +92,8 @@ class NetEqImplTest : public ::testing::Test { > > if (use_mock_delay_manager_) { > std::unique_ptr<MockDelayManager> mock(new MockDelayManager( >- config_.max_packets_in_buffer, delay_peak_detector_, tick_timer_)); >+ config_.max_packets_in_buffer, config_.min_delay_ms, >+ delay_peak_detector_, tick_timer_)); > mock_delay_manager_ = mock.get(); > EXPECT_CALL(*mock_delay_manager_, set_streaming_mode(false)).Times(1); > deps.delay_manager = std::move(mock); >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/neteq_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/neteq_unittest.cc >index 3ee0b5bf693f539fffe746dd449f479da2f2f44f..e8b50237703148a33aefa834c13a03fc85f6efdf 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/neteq_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/neteq_unittest.cc >@@ -22,12 +22,12 @@ > > #include "api/audio/audio_frame.h" > #include "api/audio_codecs/builtin_audio_decoder_factory.h" >-#include "common_types.h" // NOLINT(build/include) > #include "modules/audio_coding/codecs/pcm16b/pcm16b.h" > #include "modules/audio_coding/neteq/tools/audio_loop.h" > #include "modules/audio_coding/neteq/tools/neteq_packet_source_input.h" > #include "modules/audio_coding/neteq/tools/neteq_test.h" > #include "modules/audio_coding/neteq/tools/rtp_file_source.h" >+#include "modules/rtp_rtcp/include/rtcp_statistics.h" > #include "rtc_base/ignore_wundef.h" > #include "rtc_base/messagedigest.h" > #include "rtc_base/numerics/safe_conversions.h" >@@ -52,7 +52,7 @@ RTC_PUSH_IGNORING_WUNDEF() > RTC_POP_IGNORING_WUNDEF() > #endif > >-DEFINE_bool(gen_ref, false, "Generate reference files."); >+WEBRTC_DEFINE_bool(gen_ref, false, "Generate reference files."); > > namespace webrtc { > >@@ -234,7 +234,14 @@ void ResultSink::VerifyChecksum(const std::string& checksum) { > buffer.resize(digest_->Size()); > digest_->Finish(&buffer[0], buffer.size()); > const std::string result = rtc::hex_encode(&buffer[0], digest_->Size()); >- EXPECT_EQ(checksum, result); >+ if (checksum.size() == result.size()) { >+ EXPECT_EQ(checksum, result); >+ } else { >+ // Check result is one of the '|'-separated checksums. 
>+ EXPECT_NE(checksum.find(result), std::string::npos) >+ << result << " should be one of these:\n" >+ << checksum; >+ } > } > > class NetEqDecodingTest : public ::testing::Test { >@@ -258,7 +265,6 @@ class NetEqDecodingTest : public ::testing::Test { > void DecodeAndCompare(const std::string& rtp_file, > const std::string& output_checksum, > const std::string& network_stats_checksum, >- const std::string& rtcp_stats_checksum, > bool gen_ref); > > static void PopulateRtpInfo(int frame_index, >@@ -366,7 +372,6 @@ void NetEqDecodingTest::DecodeAndCompare( > const std::string& rtp_file, > const std::string& output_checksum, > const std::string& network_stats_checksum, >- const std::string& rtcp_stats_checksum, > bool gen_ref) { > OpenInputFile(rtp_file); > >@@ -378,10 +383,6 @@ void NetEqDecodingTest::DecodeAndCompare( > gen_ref ? webrtc::test::OutputPath() + "neteq_network_stats.dat" : ""; > ResultSink network_stats(stat_out_file); > >- std::string rtcp_out_file = >- gen_ref ? webrtc::test::OutputPath() + "neteq_rtcp_stats.dat" : ""; >- ResultSink rtcp_stats(rtcp_out_file); >- > packet_ = rtp_source_->NextPacket(); > int i = 0; > uint64_t last_concealed_samples = 0; >@@ -418,11 +419,6 @@ void NetEqDecodingTest::DecodeAndCompare( > EXPECT_NEAR( > (delta_concealed_samples << 14) / delta_total_samples_received, > current_network_stats.expand_rate, (2 << 14) / 100.0); >- >- // Process RTCPstat. 
>- RtcpStatistics current_rtcp_stats; >- neteq_->GetRtcpStatistics(&current_rtcp_stats); >- ASSERT_NO_FATAL_FAILURE(rtcp_stats.AddResult(current_rtcp_stats)); > } > } > >@@ -430,8 +426,6 @@ void NetEqDecodingTest::DecodeAndCompare( > output.VerifyChecksum(output_checksum); > SCOPED_TRACE("Check network stats."); > network_stats.VerifyChecksum(network_stats_checksum); >- SCOPED_TRACE("Check rtcp stats."); >- rtcp_stats.VerifyChecksum(rtcp_stats_checksum); > } > > void NetEqDecodingTest::PopulateRtpInfo(int frame_index, >@@ -481,14 +475,8 @@ TEST_F(NetEqDecodingTest, MAYBE_TestBitExactness) { > "4b2370f5c794741d2a46be5c7935c66ef3fb53e9", > "4b2370f5c794741d2a46be5c7935c66ef3fb53e9"); > >- const std::string rtcp_stats_checksum = >- PlatformChecksum("b8880bf9fed2487efbddcb8d94b9937a29ae521d", >- "f3f7b3d3e71d7e635240b5373b57df6a7e4ce9d4", "not used", >- "b8880bf9fed2487efbddcb8d94b9937a29ae521d", >- "b8880bf9fed2487efbddcb8d94b9937a29ae521d"); >- > DecodeAndCompare(input_rtp_file, output_checksum, network_stats_checksum, >- rtcp_stats_checksum, FLAG_gen_ref); >+ FLAG_gen_ref); > } > > #if !defined(WEBRTC_IOS) && defined(WEBRTC_NETEQ_UNITTEST_BITEXACT) && \ >@@ -501,12 +489,13 @@ TEST_F(NetEqDecodingTest, MAYBE_TestOpusBitExactness) { > const std::string input_rtp_file = > webrtc::test::ResourcePath("audio_coding/neteq_opus", "rtp"); > >- const std::string output_checksum = >- PlatformChecksum("14a63b3c7b925c82296be4bafc71bec85f2915c2", >- "b7b7ed802b0e18ee416973bf3b9ae98599b0181d", >- "5876e52dda90d5ca433c3726555b907b97c86374", >- "14a63b3c7b925c82296be4bafc71bec85f2915c2", >- "14a63b3c7b925c82296be4bafc71bec85f2915c2"); >+ // Checksum depends on libopus being compiled with or without SSE. 
>+ const std::string maybe_sse = >+ "14a63b3c7b925c82296be4bafc71bec85f2915c2|" >+ "2c05677daa968d6c68b92adf4affb7cd9bb4d363"; >+ const std::string output_checksum = PlatformChecksum( >+ maybe_sse, "b7b7ed802b0e18ee416973bf3b9ae98599b0181d", >+ "5876e52dda90d5ca433c3726555b907b97c86374", maybe_sse, maybe_sse); > > const std::string network_stats_checksum = > PlatformChecksum("adb3272498e436d1c019cbfd71610e9510c54497", >@@ -515,15 +504,8 @@ TEST_F(NetEqDecodingTest, MAYBE_TestOpusBitExactness) { > "adb3272498e436d1c019cbfd71610e9510c54497", > "adb3272498e436d1c019cbfd71610e9510c54497"); > >- const std::string rtcp_stats_checksum = >- PlatformChecksum("e37c797e3de6a64dda88c9ade7a013d022a2e1e0", >- "e37c797e3de6a64dda88c9ade7a013d022a2e1e0", >- "e37c797e3de6a64dda88c9ade7a013d022a2e1e0", >- "e37c797e3de6a64dda88c9ade7a013d022a2e1e0", >- "e37c797e3de6a64dda88c9ade7a013d022a2e1e0"); >- > DecodeAndCompare(input_rtp_file, output_checksum, network_stats_checksum, >- rtcp_stats_checksum, FLAG_gen_ref); >+ FLAG_gen_ref); > } > > #if !defined(WEBRTC_IOS) && defined(WEBRTC_NETEQ_UNITTEST_BITEXACT) && \ >@@ -536,21 +518,18 @@ TEST_F(NetEqDecodingTest, MAYBE_TestOpusDtxBitExactness) { > const std::string input_rtp_file = > webrtc::test::ResourcePath("audio_coding/neteq_opus_dtx", "rtp"); > >- const std::string output_checksum = >- PlatformChecksum("713af6c92881f5aab1285765ee6680da9d1c06ce", >- "3ec991b96872123f1554c03c543ca5d518431e46", >- "da9f9a2d94e0c2d67342fad4965d7b91cda50b25", >- "713af6c92881f5aab1285765ee6680da9d1c06ce", >- "713af6c92881f5aab1285765ee6680da9d1c06ce"); >+ const std::string maybe_sse = >+ "713af6c92881f5aab1285765ee6680da9d1c06ce|" >+ "2ac10c4e79aeedd0df2863b079da5848b40f00b5"; >+ const std::string output_checksum = PlatformChecksum( >+ maybe_sse, "3ec991b96872123f1554c03c543ca5d518431e46", >+ "da9f9a2d94e0c2d67342fad4965d7b91cda50b25", maybe_sse, maybe_sse); > > const std::string network_stats_checksum = > "bab58dc587d956f326056d7340c96eb9d2d3cc21"; > >- 
const std::string rtcp_stats_checksum = >- "ac27a7f305efb58b39bf123dccee25dee5758e63"; >- > DecodeAndCompare(input_rtp_file, output_checksum, network_stats_checksum, >- rtcp_stats_checksum, FLAG_gen_ref); >+ FLAG_gen_ref); > } > > // Use fax mode to avoid time-scaling. This is to simplify the testing of >@@ -1740,7 +1719,7 @@ TEST(NetEqNoTimeStretchingMode, RunTest) { > {8, kRtpExtensionVideoTiming}}; > std::unique_ptr<NetEqInput> input(new NetEqRtpDumpInput( > webrtc::test::ResourcePath("audio_coding/neteq_universal_new", "rtp"), >- rtp_ext_map)); >+ rtp_ext_map, absl::nullopt /*No SSRC filter*/)); > std::unique_ptr<TimeLimitedNetEqInput> input_time_limit( > new TimeLimitedNetEqInput(std::move(input), 20000)); > std::unique_ptr<AudioSink> output(new VoidAudioSink); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/normal.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/normal.h >index b13f16e50dc818d1c0861ecc1fdc239c9877a92e..80451b5007813fa71396b871be55cdd813528570 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/normal.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/normal.h >@@ -11,11 +11,9 @@ > #ifndef MODULES_AUDIO_CODING_NETEQ_NORMAL_H_ > #define MODULES_AUDIO_CODING_NETEQ_NORMAL_H_ > >+#include <stdint.h> > #include <string.h> // Access to size_t. > >-#include <vector> >- >-#include "modules/audio_coding/neteq/audio_multi_vector.h" > #include "modules/audio_coding/neteq/defines.h" > #include "rtc_base/checks.h" > #include "rtc_base/constructormagic.h" >@@ -24,6 +22,7 @@ > namespace webrtc { > > // Forward declarations. 
>+class AudioMultiVector; > class BackgroundNoise; > class DecoderDatabase; > class Expand; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/packet.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/packet.h >index 45c56e1b686b353bdfe947e57c710eb8c00e7cd9..358d8fa1abb8f8b9e097fe175545ba869ef1fdfc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/packet.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/packet.h >@@ -11,12 +11,14 @@ > #ifndef MODULES_AUDIO_CODING_NETEQ_PACKET_H_ > #define MODULES_AUDIO_CODING_NETEQ_PACKET_H_ > >+#include <stdint.h> > #include <list> > #include <memory> > > #include "api/audio_codecs/audio_decoder.h" > #include "modules/audio_coding/neteq/tick_timer.h" > #include "rtc_base/buffer.h" >+#include "rtc_base/checks.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/packet_buffer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/packet_buffer.cc >index eba4d3e68b66d6e571e9c06e45573817af7da978..343763b4e4d813fa6105c3ab99e6fea7060662fc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/packet_buffer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/packet_buffer.cc >@@ -14,12 +14,17 @@ > > #include "modules/audio_coding/neteq/packet_buffer.h" > >-#include <algorithm> // find_if() >+#include <algorithm> >+#include <list> >+#include <memory> >+#include <type_traits> >+#include <utility> > > #include "api/audio_codecs/audio_decoder.h" > #include "modules/audio_coding/neteq/decoder_database.h" > #include "modules/audio_coding/neteq/statistics_calculator.h" > #include "modules/audio_coding/neteq/tick_timer.h" >+#include "rtc_base/checks.h" > #include "rtc_base/logging.h" > > namespace webrtc { >@@ -294,9 +299,4 @@ bool PacketBuffer::ContainsDtxOrCngPacket( > return false; > } > >-void 
PacketBuffer::BufferStat(int* num_packets, int* max_num_packets) const { >- *num_packets = static_cast<int>(buffer_.size()); >- *max_num_packets = static_cast<int>(max_number_of_packets_); >-} >- > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/packet_buffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/packet_buffer.h >index 7e34c1e9fce0b8003d43786625640c3c22f505ce..0f5cd7f8ed5dacd05c7051706b6d5f005fab4de0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/packet_buffer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/packet_buffer.h >@@ -14,7 +14,7 @@ > #include "absl/types/optional.h" > #include "modules/audio_coding/neteq/decoder_database.h" > #include "modules/audio_coding/neteq/packet.h" >-#include "modules/include/module_common_types.h" >+#include "modules/include/module_common_types_public.h" // IsNewerTimestamp > #include "rtc_base/constructormagic.h" > > namespace webrtc { >@@ -125,8 +125,6 @@ class PacketBuffer { > virtual bool ContainsDtxOrCngPacket( > const DecoderDatabase* decoder_database) const; > >- virtual void BufferStat(int* num_packets, int* max_num_packets) const; >- > // Static method returning true if |timestamp| is older than |timestamp_limit| > // but less than |horizon_samples| behind |timestamp_limit|. 
For instance, > // with timestamp_limit = 100 and horizon_samples = 10, a timestamp in the >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/post_decode_vad.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/post_decode_vad.h >index ad4f0823bb7a5d72c8111e0758141f34f76c996d..27d69a6eeb2280afb834e683b9f2da1a4b8d3112 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/post_decode_vad.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/post_decode_vad.h >@@ -11,13 +11,11 @@ > #ifndef MODULES_AUDIO_CODING_NETEQ_POST_DECODE_VAD_H_ > #define MODULES_AUDIO_CODING_NETEQ_POST_DECODE_VAD_H_ > >-#include <string> // size_t >+#include <stddef.h> >+#include <stdint.h> > > #include "api/audio_codecs/audio_decoder.h" > #include "common_audio/vad/include/webrtc_vad.h" >-#include "common_types.h" // NOLINT(build/include) // NULL >-#include "modules/audio_coding/neteq/defines.h" >-#include "modules/audio_coding/neteq/packet.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/preemptive_expand.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/preemptive_expand.cc >index 6159a9cb15d70e67086dff1f7ef0c025c751f393..cad8d6a50f6ccd2a50da2594bba79b0741df722f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/preemptive_expand.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/preemptive_expand.cc >@@ -10,9 +10,11 @@ > > #include "modules/audio_coding/neteq/preemptive_expand.h" > >-#include <algorithm> // min, max >+#include <algorithm> > >-#include "common_audio/signal_processing/include/signal_processing_library.h" >+#include "api/array_view.h" >+#include "modules/audio_coding/neteq/audio_multi_vector.h" >+#include "modules/audio_coding/neteq/time_stretch.h" > > namespace webrtc { > >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/preemptive_expand.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/preemptive_expand.h >index ace648f15cce257109976bbec88dc87b0a125f53..0f7b3bc8acf18bd3ce9e6fceddeffc8aeef5a3f6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/preemptive_expand.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/preemptive_expand.h >@@ -11,13 +11,15 @@ > #ifndef MODULES_AUDIO_CODING_NETEQ_PREEMPTIVE_EXPAND_H_ > #define MODULES_AUDIO_CODING_NETEQ_PREEMPTIVE_EXPAND_H_ > >-#include "modules/audio_coding/neteq/audio_multi_vector.h" >+#include <stddef.h> >+#include <stdint.h> >+ > #include "modules/audio_coding/neteq/time_stretch.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { > >-// Forward declarations. >+class AudioMultiVector; > class BackgroundNoise; > > // This class implements the PreemptiveExpand operation. Most of the work is >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/red_payload_splitter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/red_payload_splitter.cc >index f5435e8eab4ad1efb96d3a039f9676dcb829fc70..2dfe8386ff9a94eea97ee27fa7436574ceb7f021 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/red_payload_splitter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/red_payload_splitter.cc >@@ -11,10 +11,15 @@ > #include "modules/audio_coding/neteq/red_payload_splitter.h" > > #include <assert.h> >+#include <stddef.h> >+#include <cstdint> >+#include <list> >+#include <utility> > #include <vector> > > #include "modules/audio_coding/neteq/decoder_database.h" >-#include "rtc_base/checks.h" >+#include "modules/audio_coding/neteq/packet.h" >+#include "rtc_base/buffer.h" > #include "rtc_base/logging.h" > #include "rtc_base/numerics/safe_conversions.h" > >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/red_payload_splitter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/red_payload_splitter.h >index 5e239ddc708260e66ff4ea361c5cc26117564b46..55063e741769e0c031b92ba0f0753a90fed0f76f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/red_payload_splitter.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/red_payload_splitter.h >@@ -16,7 +16,6 @@ > > namespace webrtc { > >-// Forward declarations. > class DecoderDatabase; > > // This class handles splitting of RED payloads into smaller parts. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/rtcp.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/rtcp.cc >deleted file mode 100644 >index 551eb5f75b401ffb2f95828b282d8fdd5afca9c2..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/rtcp.cc >+++ /dev/null >@@ -1,94 +0,0 @@ >-/* >- * Copyright (c) 2011 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include "modules/audio_coding/neteq/rtcp.h" >- >-#include <stdlib.h> >-#include <string.h> >- >-#include <algorithm> >- >-namespace webrtc { >- >-void Rtcp::Init(uint16_t start_sequence_number) { >- cycles_ = 0; >- max_seq_no_ = start_sequence_number; >- base_seq_no_ = start_sequence_number; >- received_packets_ = 0; >- received_packets_prior_ = 0; >- expected_prior_ = 0; >- jitter_ = 0; >- transit_ = 0; >-} >- >-void Rtcp::Update(const RTPHeader& rtp_header, uint32_t receive_timestamp) { >- // Update number of received packets, and largest packet number received. >- received_packets_++; >- int16_t sn_diff = rtp_header.sequenceNumber - max_seq_no_; >- if (sn_diff >= 0) { >- if (rtp_header.sequenceNumber < max_seq_no_) { >- // Wrap-around detected. >- cycles_++; >- } >- max_seq_no_ = rtp_header.sequenceNumber; >- } >- >- // Calculate jitter according to RFC 3550, and update previous timestamps. >- // Note that the value in |jitter_| is in Q4. >- if (received_packets_ > 1) { >- int32_t ts_diff = receive_timestamp - (rtp_header.timestamp - transit_); >- int64_t jitter_diff = (std::abs(int64_t{ts_diff}) << 4) - jitter_; >- // Calculate 15 * jitter_ / 16 + jitter_diff / 16 (with proper rounding). >- jitter_ = jitter_ + ((jitter_diff + 8) >> 4); >- RTC_DCHECK_GE(jitter_, 0); >- } >- transit_ = rtp_header.timestamp - receive_timestamp; >-} >- >-void Rtcp::GetStatistics(bool no_reset, RtcpStatistics* stats) { >- // Extended highest sequence number received. >- stats->extended_highest_sequence_number = >- (static_cast<int>(cycles_) << 16) + max_seq_no_; >- >- // Calculate expected number of packets and compare it with the number of >- // packets that were actually received. The cumulative number of lost packets >- // can be extracted. >- uint32_t expected_packets = >- stats->extended_highest_sequence_number - base_seq_no_ + 1; >- if (received_packets_ == 0) { >- // No packets received, assume none lost. 
>- stats->packets_lost = 0; >- } else if (expected_packets > received_packets_) { >- stats->packets_lost = expected_packets - received_packets_; >- if (stats->packets_lost > 0xFFFFFF) { >- stats->packets_lost = 0xFFFFFF; >- } >- } else { >- stats->packets_lost = 0; >- } >- >- // Fraction lost since last report. >- uint32_t expected_since_last = expected_packets - expected_prior_; >- uint32_t received_since_last = received_packets_ - received_packets_prior_; >- if (!no_reset) { >- expected_prior_ = expected_packets; >- received_packets_prior_ = received_packets_; >- } >- int32_t lost = expected_since_last - received_since_last; >- if (expected_since_last == 0 || lost <= 0 || received_packets_ == 0) { >- stats->fraction_lost = 0; >- } else { >- stats->fraction_lost = std::min(0xFFU, (lost << 8) / expected_since_last); >- } >- >- stats->jitter = jitter_ >> 4; // Scaling from Q4. >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/rtcp.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/rtcp.h >deleted file mode 100644 >index b1de7eb7f487f27eb8bc696ce810698398498f09..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/rtcp.h >+++ /dev/null >@@ -1,55 +0,0 @@ >-/* >- * Copyright (c) 2011 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef MODULES_AUDIO_CODING_NETEQ_RTCP_H_ >-#define MODULES_AUDIO_CODING_NETEQ_RTCP_H_ >- >-#include "modules/audio_coding/neteq/include/neteq.h" >-#include "rtc_base/constructormagic.h" >- >-namespace webrtc { >- >-// Forward declaration. 
>-struct RTPHeader; >- >-class Rtcp { >- public: >- Rtcp() { Init(0); } >- >- ~Rtcp() {} >- >- // Resets the RTCP statistics, and sets the first received sequence number. >- void Init(uint16_t start_sequence_number); >- >- // Updates the RTCP statistics with a new received packet. >- void Update(const RTPHeader& rtp_header, uint32_t receive_timestamp); >- >- // Returns the current RTCP statistics. If |no_reset| is true, the statistics >- // are not reset, otherwise they are. >- void GetStatistics(bool no_reset, RtcpStatistics* stats); >- >- private: >- uint16_t cycles_; // The number of wrap-arounds for the sequence number. >- uint16_t max_seq_no_; // The maximum sequence number received. Starts over >- // from 0 after wrap-around. >- uint16_t base_seq_no_; // The sequence number of the first received packet. >- uint32_t received_packets_; // The number of packets that have been received. >- uint32_t received_packets_prior_; // Number of packets received when last >- // report was generated. >- uint32_t expected_prior_; // Expected number of packets, at the time of the >- // last report. >- int64_t jitter_; // Current jitter value in Q4. >- int32_t transit_; // Clock difference for previous packet. 
>- >- RTC_DISALLOW_COPY_AND_ASSIGN(Rtcp); >-}; >- >-} // namespace webrtc >-#endif // MODULES_AUDIO_CODING_NETEQ_RTCP_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/statistics_calculator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/statistics_calculator.cc >index c0e7a402aac00aa72a5f79dd34868f263c9f851d..50521fb03d364ea080d457047ac70f1a3879a666 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/statistics_calculator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/statistics_calculator.cc >@@ -126,7 +126,10 @@ StatisticsCalculator::StatisticsCalculator() > 100), > excess_buffer_delay_("WebRTC.Audio.AverageExcessBufferDelayMs", > 60000, // 60 seconds report interval. >- 1000) {} >+ 1000), >+ buffer_full_counter_("WebRTC.Audio.JitterBufferFullPerMinute", >+ 60000, // 60 seconds report interval. >+ 100) {} > > StatisticsCalculator::~StatisticsCalculator() = default; > >@@ -229,6 +232,7 @@ void StatisticsCalculator::IncreaseCounter(size_t num_samples, int fs_hz) { > rtc::CheckedDivExact(static_cast<int>(1000 * num_samples), fs_hz); > delayed_packet_outage_counter_.AdvanceClock(time_step_ms); > excess_buffer_delay_.AdvanceClock(time_step_ms); >+ buffer_full_counter_.AdvanceClock(time_step_ms); > timestamps_since_last_report_ += static_cast<uint32_t>(num_samples); > if (timestamps_since_last_report_ > > static_cast<uint32_t>(fs_hz * kMaxReportPeriod)) { >@@ -250,13 +254,17 @@ void StatisticsCalculator::SecondaryDecodedSamples(int num_samples) { > > void StatisticsCalculator::FlushedPacketBuffer() { > operations_and_state_.packet_buffer_flushes++; >+ buffer_full_counter_.RegisterSample(); > } > >-void StatisticsCalculator::LogDelayedPacketOutageEvent(int outage_duration_ms) { >+void StatisticsCalculator::LogDelayedPacketOutageEvent(int num_samples, >+ int fs_hz) { >+ int outage_duration_ms = num_samples / (fs_hz / 1000); > 
RTC_HISTOGRAM_COUNTS("WebRTC.Audio.DelayedPacketOutageEventMs", > outage_duration_ms, 1 /* min */, 2000 /* max */, > 100 /* bucket count */); > delayed_packet_outage_counter_.RegisterSample(); >+ lifetime_stats_.delayed_packet_outage_samples += num_samples; > } > > void StatisticsCalculator::StoreWaitingTime(int waiting_time_ms) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/statistics_calculator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/statistics_calculator.h >index 9e5dacfd920a63998026c3bcb96bed8f33eb04a6..49b74a04b3b4a6d77ce05e5704fc86c2bc4968ac 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/statistics_calculator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/statistics_calculator.h >@@ -86,10 +86,10 @@ class StatisticsCalculator { > // Rerport that the packet buffer was flushed. > void FlushedPacketBuffer(); > >- // Logs a delayed packet outage event of |outage_duration_ms|. A delayed >- // packet outage event is defined as an expand period caused not by an actual >- // packet loss, but by a delayed packet. >- virtual void LogDelayedPacketOutageEvent(int outage_duration_ms); >+ // Logs a delayed packet outage event of |num_samples| expanded at a sample >+ // rate of |fs_hz|. A delayed packet outage event is defined as an expand >+ // period caused not by an actual packet loss, but by a delayed packet. >+ virtual void LogDelayedPacketOutageEvent(int num_samples, int fs_hz); > > // Returns the current network statistics in |stats|. 
The current sample rate > // is |fs_hz|, the total number of samples in packet buffer and sync buffer >@@ -199,6 +199,7 @@ class StatisticsCalculator { > size_t discarded_secondary_packets_; > PeriodicUmaCount delayed_packet_outage_counter_; > PeriodicUmaAverage excess_buffer_delay_; >+ PeriodicUmaCount buffer_full_counter_; > > RTC_DISALLOW_COPY_AND_ASSIGN(StatisticsCalculator); > }; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/sync_buffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/sync_buffer.h >index 72e320c61f6f9017c8b2d540980249e80eeac092..d645e9163e9105c98f1b900b09caef416fffca73 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/sync_buffer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/sync_buffer.h >@@ -11,8 +11,13 @@ > #ifndef MODULES_AUDIO_CODING_NETEQ_SYNC_BUFFER_H_ > #define MODULES_AUDIO_CODING_NETEQ_SYNC_BUFFER_H_ > >+#include <stddef.h> >+#include <stdint.h> >+#include <vector> >+ > #include "api/audio/audio_frame.h" > #include "modules/audio_coding/neteq/audio_multi_vector.h" >+#include "modules/audio_coding/neteq/audio_vector.h" > #include "rtc_base/buffer.h" > #include "rtc_base/constructormagic.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/test/neteq_ilbc_quality_test.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/test/neteq_ilbc_quality_test.cc >index ad61235cf28966daab491e6ddb7761af82850c4c..6f103452980de52f097452fb496ac32ff95c4f9b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/test/neteq_ilbc_quality_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/test/neteq_ilbc_quality_test.cc >@@ -25,7 +25,7 @@ namespace { > static const int kInputSampleRateKhz = 8; > static const int kOutputSampleRateKhz = 8; > >-DEFINE_int(frame_size_ms, 20, "Codec frame size (milliseconds)."); 
>+WEBRTC_DEFINE_int(frame_size_ms, 20, "Codec frame size (milliseconds)."); > > } // namespace > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/test/neteq_isac_quality_test.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/test/neteq_isac_quality_test.cc >index 94984b87e6f1239201924afa75dbf1016d08e325..651b0ca71a68ed83410a14de9d739b880553566d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/test/neteq_isac_quality_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/test/neteq_isac_quality_test.cc >@@ -21,7 +21,7 @@ static const int kIsacBlockDurationMs = 30; > static const int kIsacInputSamplingKhz = 16; > static const int kIsacOutputSamplingKhz = 16; > >-DEFINE_int(bit_rate_kbps, 32, "Target bit rate (kbps)."); >+WEBRTC_DEFINE_int(bit_rate_kbps, 32, "Target bit rate (kbps)."); > > } // namespace > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/test/neteq_opus_quality_test.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/test/neteq_opus_quality_test.cc >index 6861e4c55aafd6c44647fdbc04b2ad925086fc62..f4a36363baa95bea7eb03ca6d2a53d90eaa66d60 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/test/neteq_opus_quality_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/test/neteq_opus_quality_test.cc >@@ -22,24 +22,26 @@ namespace { > static const int kOpusBlockDurationMs = 20; > static const int kOpusSamplingKhz = 48; > >-DEFINE_int(bit_rate_kbps, 32, "Target bit rate (kbps)."); >+WEBRTC_DEFINE_int(bit_rate_kbps, 32, "Target bit rate (kbps)."); > >-DEFINE_int(complexity, >- 10, >- "Complexity: 0 ~ 10 -- defined as in Opus" >- "specification."); >+WEBRTC_DEFINE_int(complexity, >+ 10, >+ "Complexity: 0 ~ 10 -- defined as in Opus" >+ "specification."); > >-DEFINE_int(maxplaybackrate, 48000, "Maximum playback rate (Hz)."); 
>+WEBRTC_DEFINE_int(maxplaybackrate, 48000, "Maximum playback rate (Hz)."); > >-DEFINE_int(application, 0, "Application mode: 0 -- VOIP, 1 -- Audio."); >+WEBRTC_DEFINE_int(application, 0, "Application mode: 0 -- VOIP, 1 -- Audio."); > >-DEFINE_int(reported_loss_rate, 10, "Reported percentile of packet loss."); >+WEBRTC_DEFINE_int(reported_loss_rate, >+ 10, >+ "Reported percentile of packet loss."); > >-DEFINE_bool(fec, false, "Enable FEC for encoding (-nofec to disable)."); >+WEBRTC_DEFINE_bool(fec, false, "Enable FEC for encoding (-nofec to disable)."); > >-DEFINE_bool(dtx, false, "Enable DTX for encoding (-nodtx to disable)."); >+WEBRTC_DEFINE_bool(dtx, false, "Enable DTX for encoding (-nodtx to disable)."); > >-DEFINE_int(sub_packets, 1, "Number of sub packets to repacketize."); >+WEBRTC_DEFINE_int(sub_packets, 1, "Number of sub packets to repacketize."); > > } // namespace > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/test/neteq_pcm16b_quality_test.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/test/neteq_pcm16b_quality_test.cc >index 8872b94a0639315c2e9fd966b1d5743941b6df8d..9c53919d54077ef3064da88cb942fef1d499f99f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/test/neteq_pcm16b_quality_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/test/neteq_pcm16b_quality_test.cc >@@ -26,7 +26,7 @@ namespace { > static const int kInputSampleRateKhz = 48; > static const int kOutputSampleRateKhz = 48; > >-DEFINE_int(frame_size_ms, 20, "Codec frame size (milliseconds)."); >+WEBRTC_DEFINE_int(frame_size_ms, 20, "Codec frame size (milliseconds)."); > > } // namespace > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/test/neteq_pcmu_quality_test.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/test/neteq_pcmu_quality_test.cc >index 
54ff849739a3902f3c974ab62e0f58361a90f92d..85f22671eac47ada99fe389253b162f1d458d33c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/test/neteq_pcmu_quality_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/test/neteq_pcmu_quality_test.cc >@@ -25,7 +25,7 @@ namespace { > static const int kInputSampleRateKhz = 8; > static const int kOutputSampleRateKhz = 8; > >-DEFINE_int(frame_size_ms, 20, "Codec frame size (milliseconds)."); >+WEBRTC_DEFINE_int(frame_size_ms, 20, "Codec frame size (milliseconds)."); > > } // namespace > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/test/neteq_speed_test.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/test/neteq_speed_test.cc >index c486ac60bd7c89ce417229c520cc9f4a603ab9f6..70777a2d0214921483addcf20ad59d3c91baf3dc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/test/neteq_speed_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/test/neteq_speed_test.cc >@@ -14,13 +14,12 @@ > > #include "modules/audio_coding/neteq/tools/neteq_performance_test.h" > #include "rtc_base/flags.h" >-#include "test/testsupport/fileutils.h" > > // Define command line flags. 
>-DEFINE_int(runtime_ms, 10000, "Simulated runtime in ms."); >-DEFINE_int(lossrate, 10, "Packet lossrate; drop every N packets."); >-DEFINE_float(drift, 0.1f, "Clockdrift factor."); >-DEFINE_bool(help, false, "Print this message."); >+WEBRTC_DEFINE_int(runtime_ms, 10000, "Simulated runtime in ms."); >+WEBRTC_DEFINE_int(lossrate, 10, "Packet lossrate; drop every N packets."); >+WEBRTC_DEFINE_float(drift, 0.1f, "Clockdrift factor."); >+WEBRTC_DEFINE_bool(help, false, "Print this message."); > > int main(int argc, char* argv[]) { > std::string program_name = argv[0]; >@@ -33,7 +32,6 @@ int main(int argc, char* argv[]) { > " --lossrate=N drop every N packets; default is 10\n" > " --drift=F clockdrift factor between 0.0 and 1.0; " > "default is 0.1\n"; >- webrtc::test::SetExecutablePath(argv[0]); > if (rtc::FlagList::SetFlagsFromCommandLine(&argc, argv, true) || FLAG_help || > argc != 1) { > printf("%s", usage.c_str()); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tick_timer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tick_timer.h >index 520099e346624b5575293585dc9d7ca7d3cc4d60..02f083e02eb979cc77affaa458166112622fa542 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tick_timer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tick_timer.h >@@ -11,6 +11,7 @@ > #ifndef MODULES_AUDIO_CODING_NETEQ_TICK_TIMER_H_ > #define MODULES_AUDIO_CODING_NETEQ_TICK_TIMER_H_ > >+#include <stdint.h> > #include <memory> > > #include "rtc_base/checks.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/timestamp_scaler.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/timestamp_scaler.cc >index 07d945edce0712d7f51e4ece41c9f9bad8e2bcba..b0461bb92d84ef34082b1c4b16d149dddff9f813 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/timestamp_scaler.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/timestamp_scaler.cc >@@ -10,6 +10,7 @@ > > #include "modules/audio_coding/neteq/timestamp_scaler.h" > >+#include "api/audio_codecs/audio_format.h" > #include "modules/audio_coding/neteq/decoder_database.h" > #include "rtc_base/checks.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_event_log_input.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_event_log_input.cc >index 21c5f9ef0223f7b10e8a775b0fe007ba2f53ea96..9107e5e5ef4fc024c0e80521f4191d652fde3298 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_event_log_input.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_event_log_input.cc >@@ -19,11 +19,8 @@ namespace webrtc { > namespace test { > > NetEqEventLogInput::NetEqEventLogInput(const std::string& file_name, >- const RtpHeaderExtensionMap& hdr_ext_map) >- : source_(RtcEventLogSource::Create(file_name)) { >- for (const auto& ext_pair : hdr_ext_map) { >- source_->RegisterRtpHeaderExtension(ext_pair.second, ext_pair.first); >- } >+ absl::optional<uint32_t> ssrc_filter) >+ : source_(RtcEventLogSource::Create(file_name, ssrc_filter)) { > LoadNextPacket(); > AdvanceOutputEvent(); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_event_log_input.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_event_log_input.h >index 86cf9f2f02a92b6ef38c447bfd89fcff854cbd0b..e04df1684f1b871d3ccccee77e8fb4950e3b2348 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_event_log_input.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_event_log_input.h >@@ -28,7 +28,7 @@ class RtcEventLogSource; > class NetEqEventLogInput final : public NetEqPacketSourceInput { > public: > 
NetEqEventLogInput(const std::string& file_name, >- const RtpHeaderExtensionMap& hdr_ext_map); >+ absl::optional<uint32_t> ssrc_filter); > > absl::optional<int64_t> NextOutputEventTime() const override; > void AdvanceOutputEvent() override; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_packet_source_input.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_packet_source_input.cc >index a86cf6afa5901784f71640630adde6a98a924f6f..ee02288c86d66e5f3dd06b1495cd8e62456662fb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_packet_source_input.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_packet_source_input.cc >@@ -58,15 +58,10 @@ std::unique_ptr<NetEqInput::PacketData> NetEqPacketSourceInput::PopPacket() { > return packet_data; > } > >-void NetEqPacketSourceInput::SelectSsrc(uint32_t ssrc) { >- source()->SelectSsrc(ssrc); >- if (packet_ && packet_->header().ssrc != ssrc) >- LoadNextPacket(); >-} >- > NetEqRtpDumpInput::NetEqRtpDumpInput(const std::string& file_name, >- const RtpHeaderExtensionMap& hdr_ext_map) >- : source_(RtpFileSource::Create(file_name)) { >+ const RtpHeaderExtensionMap& hdr_ext_map, >+ absl::optional<uint32_t> ssrc_filter) >+ : source_(RtpFileSource::Create(file_name, ssrc_filter)) { > for (const auto& ext_pair : hdr_ext_map) { > source_->RegisterRtpHeaderExtension(ext_pair.second, ext_pair.first); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_packet_source_input.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_packet_source_input.h >index b32c5340851f66255a55d66d86ee8f80e0956e84..8633d1f5dd1cf37ec671b8724e2cac28944e011b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_packet_source_input.h >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_packet_source_input.h >@@ -12,8 +12,10 @@ > #define MODULES_AUDIO_CODING_NETEQ_TOOLS_NETEQ_PACKET_SOURCE_INPUT_H_ > > #include <map> >+#include <memory> > #include <string> > >+#include "absl/types/optional.h" > #include "modules/audio_coding/neteq/tools/neteq_input.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > >@@ -32,7 +34,6 @@ class NetEqPacketSourceInput : public NetEqInput { > std::unique_ptr<PacketData> PopPacket() override; > absl::optional<RTPHeader> NextHeader() const override; > bool ended() const override { return !next_output_event_ms_; } >- void SelectSsrc(uint32_t); > > protected: > virtual PacketSource* source() = 0; >@@ -48,7 +49,8 @@ class NetEqPacketSourceInput : public NetEqInput { > class NetEqRtpDumpInput final : public NetEqPacketSourceInput { > public: > NetEqRtpDumpInput(const std::string& file_name, >- const RtpHeaderExtensionMap& hdr_ext_map); >+ const RtpHeaderExtensionMap& hdr_ext_map, >+ absl::optional<uint32_t> ssrc_filter); > > absl::optional<int64_t> NextOutputEventTime() const override; > void AdvanceOutputEvent() override; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_quality_test.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_quality_test.cc >index faca8959d4d4f5c7cf6a13e98ca32141b2b1f76e..2ee6779c539e1b9e67bb7c5fb071e251f5d09bd8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_quality_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_quality_test.cc >@@ -47,42 +47,47 @@ static bool ValidateFilename(const std::string& value, bool write) { > return true; > } > >-DEFINE_string( >+WEBRTC_DEFINE_string( > in_filename, > DefaultInFilename().c_str(), > "Filename for input audio (specify sample rate with --input_sample_rate, " > "and channels with --channels)."); 
> >-DEFINE_int(input_sample_rate, 16000, "Sample rate of input file in Hz."); >+WEBRTC_DEFINE_int(input_sample_rate, 16000, "Sample rate of input file in Hz."); > >-DEFINE_int(channels, 1, "Number of channels in input audio."); >+WEBRTC_DEFINE_int(channels, 1, "Number of channels in input audio."); > >-DEFINE_string(out_filename, >- DefaultOutFilename().c_str(), >- "Name of output audio file."); >+WEBRTC_DEFINE_string(out_filename, >+ DefaultOutFilename().c_str(), >+ "Name of output audio file."); > >-DEFINE_int(runtime_ms, 10000, "Simulated runtime (milliseconds)."); >+WEBRTC_DEFINE_int(runtime_ms, 10000, "Simulated runtime (milliseconds)."); > >-DEFINE_int(packet_loss_rate, 10, "Percentile of packet loss."); >+WEBRTC_DEFINE_int(packet_loss_rate, 10, "Percentile of packet loss."); > >-DEFINE_int(random_loss_mode, >- kUniformLoss, >- "Random loss mode: 0--no loss, 1--uniform loss, 2--Gilbert Elliot " >- "loss, 3--fixed loss."); >+WEBRTC_DEFINE_int( >+ random_loss_mode, >+ kUniformLoss, >+ "Random loss mode: 0--no loss, 1--uniform loss, 2--Gilbert Elliot " >+ "loss, 3--fixed loss."); > >-DEFINE_int(burst_length, >- 30, >- "Burst length in milliseconds, only valid for Gilbert Elliot loss."); >+WEBRTC_DEFINE_int( >+ burst_length, >+ 30, >+ "Burst length in milliseconds, only valid for Gilbert Elliot loss."); > >-DEFINE_float(drift_factor, 0.0, "Time drift factor."); >+WEBRTC_DEFINE_float(drift_factor, 0.0, "Time drift factor."); > >-DEFINE_int(preload_packets, 0, "Preload the buffer with this many packets."); >+WEBRTC_DEFINE_int(preload_packets, >+ 0, >+ "Preload the buffer with this many packets."); > >-DEFINE_string(loss_events, >- "", >- "List of loss events time and duration separated by comma: " >- "<first_event_time> <first_event_duration>, <second_event_time> " >- "<second_event_duration>, ..."); >+WEBRTC_DEFINE_string( >+ loss_events, >+ "", >+ "List of loss events time and duration separated by comma: " >+ "<first_event_time> <first_event_duration>, 
<second_event_time> " >+ "<second_event_duration>, ..."); > > // ProbTrans00Solver() is to calculate the transition probability from no-loss > // state to itself in a modified Gilbert Elliot packet loss model. The result is >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_rtpplay.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_rtpplay.cc >index 25e8cd8816e120ba7b3d852f7a71e07f7bc9f686..c2726eb4704b873be27e0b71cd122c8e936524a8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_rtpplay.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_rtpplay.cc >@@ -17,17 +17,17 @@ > #include "system_wrappers/include/field_trial.h" > #include "test/field_trial.h" > >-DEFINE_bool(codec_map, >- false, >- "Prints the mapping between RTP payload type and " >- "codec"); >-DEFINE_string( >+WEBRTC_DEFINE_bool(codec_map, >+ false, >+ "Prints the mapping between RTP payload type and " >+ "codec"); >+WEBRTC_DEFINE_string( > force_fieldtrials, > "", > "Field trials control experimental feature code which can be forced. " > "E.g. 
running with --force_fieldtrials=WebRTC-FooFeature/Enable/" > " will assign the group Enable to field trial WebRTC-FooFeature."); >-DEFINE_bool(help, false, "Prints this message"); >+WEBRTC_DEFINE_bool(help, false, "Prints this message"); > > int main(int argc, char* argv[]) { > webrtc::test::NetEqTestFactory factory; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_test_factory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_test_factory.cc >index 51077bc6b8b9235712bf2f16598cb354a617d93e..aa956ce8abcd311828a3337773f4ccc12d42deda 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_test_factory.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/neteq_test_factory.cc >@@ -91,47 +91,60 @@ static bool ValidateExtensionId(int value) { > } > > // Define command line flags. >-DEFINE_int(pcmu, 0, "RTP payload type for PCM-u"); >-DEFINE_int(pcma, 8, "RTP payload type for PCM-a"); >-DEFINE_int(ilbc, 102, "RTP payload type for iLBC"); >-DEFINE_int(isac, 103, "RTP payload type for iSAC"); >-DEFINE_int(isac_swb, 104, "RTP payload type for iSAC-swb (32 kHz)"); >-DEFINE_int(opus, 111, "RTP payload type for Opus"); >-DEFINE_int(pcm16b, 93, "RTP payload type for PCM16b-nb (8 kHz)"); >-DEFINE_int(pcm16b_wb, 94, "RTP payload type for PCM16b-wb (16 kHz)"); >-DEFINE_int(pcm16b_swb32, 95, "RTP payload type for PCM16b-swb32 (32 kHz)"); >-DEFINE_int(pcm16b_swb48, 96, "RTP payload type for PCM16b-swb48 (48 kHz)"); >-DEFINE_int(g722, 9, "RTP payload type for G.722"); >-DEFINE_int(avt, 106, "RTP payload type for AVT/DTMF (8 kHz)"); >-DEFINE_int(avt_16, 114, "RTP payload type for AVT/DTMF (16 kHz)"); >-DEFINE_int(avt_32, 115, "RTP payload type for AVT/DTMF (32 kHz)"); >-DEFINE_int(avt_48, 116, "RTP payload type for AVT/DTMF (48 kHz)"); >-DEFINE_int(red, 117, "RTP payload type for redundant audio (RED)"); >-DEFINE_int(cn_nb, 13, "RTP 
payload type for comfort noise (8 kHz)"); >-DEFINE_int(cn_wb, 98, "RTP payload type for comfort noise (16 kHz)"); >-DEFINE_int(cn_swb32, 99, "RTP payload type for comfort noise (32 kHz)"); >-DEFINE_int(cn_swb48, 100, "RTP payload type for comfort noise (48 kHz)"); >-DEFINE_string(replacement_audio_file, >- "", >- "A PCM file that will be used to populate " >- "dummy" >- " RTP packets"); >-DEFINE_string(ssrc, >- "", >- "Only use packets with this SSRC (decimal or hex, the latter " >- "starting with 0x)"); >-DEFINE_int(audio_level, 1, "Extension ID for audio level (RFC 6464)"); >-DEFINE_int(abs_send_time, 3, "Extension ID for absolute sender time"); >-DEFINE_int(transport_seq_no, 5, "Extension ID for transport sequence number"); >-DEFINE_int(video_content_type, 7, "Extension ID for video content type"); >-DEFINE_int(video_timing, 8, "Extension ID for video timing"); >-DEFINE_bool(matlabplot, >- false, >- "Generates a matlab script for plotting the delay profile"); >-DEFINE_bool(pythonplot, >- false, >- "Generates a python script for plotting the delay profile"); >-DEFINE_bool(concealment_events, false, "Prints concealment events"); >+WEBRTC_DEFINE_int(pcmu, 0, "RTP payload type for PCM-u"); >+WEBRTC_DEFINE_int(pcma, 8, "RTP payload type for PCM-a"); >+WEBRTC_DEFINE_int(ilbc, 102, "RTP payload type for iLBC"); >+WEBRTC_DEFINE_int(isac, 103, "RTP payload type for iSAC"); >+WEBRTC_DEFINE_int(isac_swb, 104, "RTP payload type for iSAC-swb (32 kHz)"); >+WEBRTC_DEFINE_int(opus, 111, "RTP payload type for Opus"); >+WEBRTC_DEFINE_int(pcm16b, 93, "RTP payload type for PCM16b-nb (8 kHz)"); >+WEBRTC_DEFINE_int(pcm16b_wb, 94, "RTP payload type for PCM16b-wb (16 kHz)"); >+WEBRTC_DEFINE_int(pcm16b_swb32, >+ 95, >+ "RTP payload type for PCM16b-swb32 (32 kHz)"); >+WEBRTC_DEFINE_int(pcm16b_swb48, >+ 96, >+ "RTP payload type for PCM16b-swb48 (48 kHz)"); >+WEBRTC_DEFINE_int(g722, 9, "RTP payload type for G.722"); >+WEBRTC_DEFINE_int(avt, 106, "RTP payload type for AVT/DTMF (8 kHz)"); 
>+WEBRTC_DEFINE_int(avt_16, 114, "RTP payload type for AVT/DTMF (16 kHz)"); >+WEBRTC_DEFINE_int(avt_32, 115, "RTP payload type for AVT/DTMF (32 kHz)"); >+WEBRTC_DEFINE_int(avt_48, 116, "RTP payload type for AVT/DTMF (48 kHz)"); >+WEBRTC_DEFINE_int(red, 117, "RTP payload type for redundant audio (RED)"); >+WEBRTC_DEFINE_int(cn_nb, 13, "RTP payload type for comfort noise (8 kHz)"); >+WEBRTC_DEFINE_int(cn_wb, 98, "RTP payload type for comfort noise (16 kHz)"); >+WEBRTC_DEFINE_int(cn_swb32, 99, "RTP payload type for comfort noise (32 kHz)"); >+WEBRTC_DEFINE_int(cn_swb48, 100, "RTP payload type for comfort noise (48 kHz)"); >+WEBRTC_DEFINE_string(replacement_audio_file, >+ "", >+ "A PCM file that will be used to populate " >+ "dummy" >+ " RTP packets"); >+WEBRTC_DEFINE_string( >+ ssrc, >+ "", >+ "Only use packets with this SSRC (decimal or hex, the latter " >+ "starting with 0x)"); >+WEBRTC_DEFINE_int(audio_level, 1, "Extension ID for audio level (RFC 6464)"); >+WEBRTC_DEFINE_int(abs_send_time, 3, "Extension ID for absolute sender time"); >+WEBRTC_DEFINE_int(transport_seq_no, >+ 5, >+ "Extension ID for transport sequence number"); >+WEBRTC_DEFINE_int(video_content_type, 7, "Extension ID for video content type"); >+WEBRTC_DEFINE_int(video_timing, 8, "Extension ID for video timing"); >+WEBRTC_DEFINE_bool(matlabplot, >+ false, >+ "Generates a matlab script for plotting the delay profile"); >+WEBRTC_DEFINE_bool(pythonplot, >+ false, >+ "Generates a python script for plotting the delay profile"); >+WEBRTC_DEFINE_bool(concealment_events, false, "Prints concealment events"); >+WEBRTC_DEFINE_int(max_nr_packets_in_buffer, >+ 50, >+ "Maximum allowed number of packets in the buffer"); >+WEBRTC_DEFINE_bool(enable_fast_accelerate, >+ false, >+ "Enables jitter buffer fast accelerate"); > > // Maps a codec type to a printable name string. 
> std::string CodecName(NetEqDecoder codec) { >@@ -309,25 +322,27 @@ std::unique_ptr<NetEqTest> NetEqTestFactory::InitializeTest( > {FLAG_video_content_type, kRtpExtensionVideoContentType}, > {FLAG_video_timing, kRtpExtensionVideoTiming}}; > >+ absl::optional<uint32_t> ssrc_filter; >+ // Check if an SSRC value was provided. >+ if (strlen(FLAG_ssrc) > 0) { >+ uint32_t ssrc; >+ RTC_CHECK(ParseSsrc(FLAG_ssrc, &ssrc)) << "Flag verification has failed."; >+ ssrc_filter = ssrc; >+ } >+ > std::unique_ptr<NetEqInput> input; > if (RtpFileSource::ValidRtpDump(input_file_name) || > RtpFileSource::ValidPcap(input_file_name)) { >- input.reset(new NetEqRtpDumpInput(input_file_name, rtp_ext_map)); >+ input.reset( >+ new NetEqRtpDumpInput(input_file_name, rtp_ext_map, ssrc_filter)); > } else { >- input.reset(new NetEqEventLogInput(input_file_name, rtp_ext_map)); >+ input.reset(new NetEqEventLogInput(input_file_name, ssrc_filter)); > } > > std::cout << "Input file: " << input_file_name << std::endl; > RTC_CHECK(input) << "Cannot open input file"; > RTC_CHECK(!input->ended()) << "Input file is empty"; > >- // Check if an SSRC value was provided. >- if (strlen(FLAG_ssrc) > 0) { >- uint32_t ssrc; >- RTC_CHECK(ParseSsrc(FLAG_ssrc, &ssrc)) << "Flag verification has failed."; >- static_cast<NetEqPacketSourceInput*>(input.get())->SelectSsrc(ssrc); >- } >- > // Check the sample rate. 
> absl::optional<int> sample_rate_hz; > std::set<std::pair<int, uint32_t>> discarded_pt_and_ssrc; >@@ -457,6 +472,8 @@ std::unique_ptr<NetEqTest> NetEqTestFactory::InitializeTest( > callbacks.simulation_ended_callback = stats_plotter_.get(); > NetEq::Config config; > config.sample_rate_hz = *sample_rate_hz; >+ config.max_packets_in_buffer = FLAG_max_nr_packets_in_buffer; >+ config.enable_fast_accelerate = FLAG_enable_fast_accelerate; > return absl::make_unique<NetEqTest>(config, codecs, ext_codecs_, > std::move(input), std::move(output), > callbacks); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/packet.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/packet.cc >index b1a9b64193bf75b79a70c76e20f2012e89f5308a..0e7951b6e3c343ec22b970c1d61279c0872b6c51 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/packet.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/packet.cc >@@ -49,6 +49,20 @@ Packet::Packet(uint8_t* packet_memory, > valid_header_ = ParseHeader(parser); > } > >+Packet::Packet(const RTPHeader& header, >+ size_t virtual_packet_length_bytes, >+ size_t virtual_payload_length_bytes, >+ double time_ms) >+ : header_(header), >+ payload_memory_(), >+ payload_(NULL), >+ packet_length_bytes_(0), >+ payload_length_bytes_(0), >+ virtual_packet_length_bytes_(virtual_packet_length_bytes), >+ virtual_payload_length_bytes_(virtual_payload_length_bytes), >+ time_ms_(time_ms), >+ valid_header_(true) {} >+ > Packet::Packet(uint8_t* packet_memory, size_t allocated_bytes, double time_ms) > : payload_memory_(packet_memory), > payload_(NULL), >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/packet.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/packet.h >index 623c5cb6359a99f5c93a39d23a07e411cba91ace..39137aae43a0aec59234007542b108fcfa7b883d 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/packet.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/packet.h >@@ -50,7 +50,17 @@ class Packet { > double time_ms, > const RtpHeaderParser& parser); > >- // The following two constructors are the same as above, but without a >+ // Same as above, but creates the packet from an already parsed RTPHeader. >+ // This is typically used when reading RTP dump files that only contain the >+ // RTP headers, and no payload. The |virtual_packet_length_bytes| tells what >+ // size the packet had on the wire, including the now discarded payload. >+ // The |virtual_payload_length_bytes| tells the size of the payload. >+ Packet(const RTPHeader& header, >+ size_t virtual_packet_length_bytes, >+ size_t virtual_payload_length_bytes, >+ double time_ms); >+ >+ // The following constructors are the same as the first two, but without a > // parser. Note that when the object is constructed using any of these > // methods, the header will be parsed using a default RtpHeaderParser object. > // In particular, RTP header extensions won't be parsed. 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/packet_source.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/packet_source.cc >index 30bf431835552395e48cf789aaa3c42dcf7c45f1..598ae6edd408d3e3ec09f129dd5aca1ebb12ddbb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/packet_source.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/packet_source.cc >@@ -13,7 +13,7 @@ > namespace webrtc { > namespace test { > >-PacketSource::PacketSource() : use_ssrc_filter_(false), ssrc_(0) {} >+PacketSource::PacketSource() = default; > > PacketSource::~PacketSource() = default; > >@@ -21,10 +21,5 @@ void PacketSource::FilterOutPayloadType(uint8_t payload_type) { > filter_.set(payload_type, true); > } > >-void PacketSource::SelectSsrc(uint32_t ssrc) { >- use_ssrc_filter_ = true; >- ssrc_ = ssrc; >-} >- > } // namespace test > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/packet_source.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/packet_source.h >index fb689e33d4e4852c981440a2ae2180b82515804d..cb86a98999de06bfee85c568fe98cb6ff23e2082 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/packet_source.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/packet_source.h >@@ -32,13 +32,8 @@ class PacketSource { > > virtual void FilterOutPayloadType(uint8_t payload_type); > >- virtual void SelectSsrc(uint32_t ssrc); >- > protected: > std::bitset<128> filter_; // Payload type is 7 bits in the RFC. >- // If SSRC filtering discards all packet that do not match the SSRC. >- bool use_ssrc_filter_; // True when SSRC filtering is active. >- uint32_t ssrc_; // The selected SSRC. All other SSRCs will be discarded. 
> > private: > RTC_DISALLOW_COPY_AND_ASSIGN(PacketSource); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtc_event_log_source.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtc_event_log_source.cc >index 9e435c7cf88fc2b9d6d866de480889f186b3f83f..582c2f2f13551f987baaf91aaea2ec8976ff2d4e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtc_event_log_source.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtc_event_log_source.cc >@@ -13,94 +13,115 @@ > #include <string.h> > #include <iostream> > #include <limits> >+#include <utility> > >+#include "logging/rtc_event_log/rtc_event_processor.h" > #include "modules/audio_coding/neteq/tools/packet.h" >-#include "modules/rtp_rtcp/include/rtp_header_parser.h" > #include "rtc_base/checks.h" > > namespace webrtc { > namespace test { > >-RtcEventLogSource* RtcEventLogSource::Create(const std::string& file_name) { >+namespace { >+bool ShouldSkipStream(ParsedRtcEventLogNew::MediaType media_type, >+ uint32_t ssrc, >+ absl::optional<uint32_t> ssrc_filter) { >+ if (media_type != ParsedRtcEventLogNew::MediaType::AUDIO) >+ return true; >+ if (ssrc_filter.has_value() && ssrc != *ssrc_filter) >+ return true; >+ return false; >+} >+} // namespace >+ >+RtcEventLogSource* RtcEventLogSource::Create( >+ const std::string& file_name, >+ absl::optional<uint32_t> ssrc_filter) { > RtcEventLogSource* source = new RtcEventLogSource(); >- RTC_CHECK(source->OpenFile(file_name)); >+ RTC_CHECK(source->OpenFile(file_name, ssrc_filter)); > return source; > } > > RtcEventLogSource::~RtcEventLogSource() {} > >-bool RtcEventLogSource::RegisterRtpHeaderExtension(RTPExtensionType type, >- uint8_t id) { >- RTC_CHECK(parser_.get()); >- return parser_->RegisterRtpHeaderExtension(type, id); >-} >- > std::unique_ptr<Packet> RtcEventLogSource::NextPacket() { >- for (; rtp_packet_index_ < 
parsed_stream_.GetNumberOfEvents(); >- rtp_packet_index_++) { >- if (parsed_stream_.GetEventType(rtp_packet_index_) == >- ParsedRtcEventLogNew::EventType::RTP_EVENT) { >- PacketDirection direction; >- size_t header_length; >- size_t packet_length; >- uint64_t timestamp_us = parsed_stream_.GetTimestamp(rtp_packet_index_); >- parsed_stream_.GetRtpHeader(rtp_packet_index_, &direction, nullptr, >- &header_length, &packet_length, nullptr); >- >- if (direction != kIncomingPacket) { >- continue; >- } >- >- uint8_t* packet_header = new uint8_t[header_length]; >- parsed_stream_.GetRtpHeader(rtp_packet_index_, nullptr, packet_header, >- nullptr, nullptr, nullptr); >- std::unique_ptr<Packet> packet( >- new Packet(packet_header, header_length, packet_length, >- static_cast<double>(timestamp_us) / 1000, *parser_.get())); >- >- if (!packet->valid_header()) { >- std::cout << "Warning: Packet with index " << rtp_packet_index_ >- << " has an invalid header and will be ignored." << std::endl; >- continue; >- } >- >- if (parsed_stream_.GetMediaType(packet->header().ssrc, direction) != >- ParsedRtcEventLogNew::MediaType::AUDIO) { >- continue; >- } >- >- // Check if the packet should not be filtered out. 
>- if (!filter_.test(packet->header().payloadType) && >- !(use_ssrc_filter_ && packet->header().ssrc != ssrc_)) { >- ++rtp_packet_index_; >- return packet; >- } >- } >- } >- return nullptr; >+ if (rtp_packet_index_ >= rtp_packets_.size()) >+ return nullptr; >+ >+ std::unique_ptr<Packet> packet = std::move(rtp_packets_[rtp_packet_index_++]); >+ return packet; > } > > int64_t RtcEventLogSource::NextAudioOutputEventMs() { >- while (audio_output_index_ < parsed_stream_.GetNumberOfEvents()) { >- if (parsed_stream_.GetEventType(audio_output_index_) == >- ParsedRtcEventLogNew::EventType::AUDIO_PLAYOUT_EVENT) { >- LoggedAudioPlayoutEvent playout_event = >- parsed_stream_.GetAudioPlayout(audio_output_index_); >- if (!(use_ssrc_filter_ && playout_event.ssrc != ssrc_)) { >- audio_output_index_++; >- return playout_event.timestamp_us / 1000; >- } >+ if (audio_output_index_ >= audio_outputs_.size()) >+ return std::numeric_limits<int64_t>::max(); >+ >+ int64_t output_time_ms = audio_outputs_[audio_output_index_++]; >+ return output_time_ms; >+} >+ >+RtcEventLogSource::RtcEventLogSource() : PacketSource() {} >+ >+bool RtcEventLogSource::OpenFile(const std::string& file_name, >+ absl::optional<uint32_t> ssrc_filter) { >+ ParsedRtcEventLogNew parsed_log; >+ if (!parsed_log.ParseFile(file_name)) >+ return false; >+ >+ const auto first_log_end_time_us = >+ parsed_log.stop_log_events().empty() >+ ? 
std::numeric_limits<int64_t>::max() >+ : parsed_log.stop_log_events().front().log_time_us(); >+ >+ auto handle_rtp_packet = >+ [this, >+ first_log_end_time_us](const webrtc::LoggedRtpPacketIncoming& incoming) { >+ if (!filter_.test(incoming.rtp.header.payloadType) && >+ incoming.log_time_us() < first_log_end_time_us) { >+ rtp_packets_.emplace_back(absl::make_unique<Packet>( >+ incoming.rtp.header, incoming.rtp.total_length, >+ incoming.rtp.total_length - incoming.rtp.header_length, >+ static_cast<double>(incoming.log_time_ms()))); >+ } >+ }; >+ >+ auto handle_audio_playout = >+ [this, first_log_end_time_us]( >+ const webrtc::LoggedAudioPlayoutEvent& audio_playout) { >+ if (audio_playout.log_time_us() < first_log_end_time_us) { >+ audio_outputs_.emplace_back(audio_playout.log_time_ms()); >+ } >+ }; >+ >+ // This wouldn't be needed if we knew that there was at most one audio stream. >+ webrtc::RtcEventProcessor event_processor; >+ for (const auto& rtp_packets : parsed_log.incoming_rtp_packets_by_ssrc()) { >+ ParsedRtcEventLogNew::MediaType media_type = >+ parsed_log.GetMediaType(rtp_packets.ssrc, webrtc::kIncomingPacket); >+ if (ShouldSkipStream(media_type, rtp_packets.ssrc, ssrc_filter)) { >+ continue; > } >- audio_output_index_++; >+ auto rtp_view = absl::make_unique< >+ webrtc::ProcessableEventList<webrtc::LoggedRtpPacketIncoming>>( >+ rtp_packets.incoming_packets.begin(), >+ rtp_packets.incoming_packets.end(), handle_rtp_packet); >+ event_processor.AddEvents(std::move(rtp_view)); >+ } >+ >+ for (const auto& audio_playouts : parsed_log.audio_playout_events()) { >+ if (ssrc_filter.has_value() && audio_playouts.first != *ssrc_filter) >+ continue; >+ auto audio_view = absl::make_unique< >+ webrtc::ProcessableEventList<webrtc::LoggedAudioPlayoutEvent>>( >+ audio_playouts.second.begin(), audio_playouts.second.end(), >+ handle_audio_playout); >+ event_processor.AddEvents(std::move(audio_view)); > } >- return std::numeric_limits<int64_t>::max(); >-} > 
>-RtcEventLogSource::RtcEventLogSource() >- : PacketSource(), parser_(RtpHeaderParser::Create()) {} >+ // Fills in rtp_packets_ and audio_outputs_. >+ event_processor.ProcessEventsInOrder(); > >-bool RtcEventLogSource::OpenFile(const std::string& file_name) { >- return parsed_stream_.ParseFile(file_name); >+ return true; > } > > } // namespace test >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtc_event_log_source.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtc_event_log_source.h >index db4eb19e11f369db904470cbb528eb254586e1ac..6b761f7dbbd262d6fcedaa58aff205af890e6df3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtc_event_log_source.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtc_event_log_source.h >@@ -13,7 +13,9 @@ > > #include <memory> > #include <string> >+#include <vector> > >+#include "absl/types/optional.h" > #include "logging/rtc_event_log/rtc_event_log_parser_new.h" > #include "modules/audio_coding/neteq/tools/packet_source.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" >@@ -31,13 +33,11 @@ class RtcEventLogSource : public PacketSource { > public: > // Creates an RtcEventLogSource reading from |file_name|. If the file cannot > // be opened, or has the wrong format, NULL will be returned. >- static RtcEventLogSource* Create(const std::string& file_name); >+ static RtcEventLogSource* Create(const std::string& file_name, >+ absl::optional<uint32_t> ssrc_filter); > > virtual ~RtcEventLogSource(); > >- // Registers an RTP header extension and binds it to |id|. >- virtual bool RegisterRtpHeaderExtension(RTPExtensionType type, uint8_t id); >- > std::unique_ptr<Packet> NextPacket() override; > > // Returns the timestamp of the next audio output event, in milliseconds. 
The >@@ -48,14 +48,14 @@ class RtcEventLogSource : public PacketSource { > private: > RtcEventLogSource(); > >- bool OpenFile(const std::string& file_name); >+ bool OpenFile(const std::string& file_name, >+ absl::optional<uint32_t> ssrc_filter); > >+ std::vector<std::unique_ptr<Packet>> rtp_packets_; > size_t rtp_packet_index_ = 0; >+ std::vector<int64_t> audio_outputs_; > size_t audio_output_index_ = 0; > >- ParsedRtcEventLogNew parsed_stream_; >- std::unique_ptr<RtpHeaderParser> parser_; >- > RTC_DISALLOW_COPY_AND_ASSIGN(RtcEventLogSource); > }; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtp_analyze.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtp_analyze.cc >index f9390381cd38d0d12ec113a313565eadf70484fe..9d3041ea668dca1debdc320cba5463630a248d8c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtp_analyze.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtp_analyze.cc >@@ -19,16 +19,16 @@ > #include "rtc_base/flags.h" > > // Define command line flags. 
>-DEFINE_int(red, 117, "RTP payload type for RED"); >-DEFINE_int(audio_level, >- -1, >- "Extension ID for audio level (RFC 6464); " >- "-1 not to print audio level"); >-DEFINE_int(abs_send_time, >- -1, >- "Extension ID for absolute sender time; " >- "-1 not to print absolute send time"); >-DEFINE_bool(help, false, "Print this message"); >+WEBRTC_DEFINE_int(red, 117, "RTP payload type for RED"); >+WEBRTC_DEFINE_int(audio_level, >+ -1, >+ "Extension ID for audio level (RFC 6464); " >+ "-1 not to print audio level"); >+WEBRTC_DEFINE_int(abs_send_time, >+ -1, >+ "Extension ID for absolute sender time; " >+ "-1 not to print absolute send time"); >+WEBRTC_DEFINE_bool(help, false, "Print this message"); > > int main(int argc, char* argv[]) { > std::string program_name = argv[0]; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtp_encode.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtp_encode.cc >index 5065ca16757d06fa525e6e3e77ad41d7cf21b298..14c6e58a9d644a731fc05ee84aa9f0c33cd84af6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtp_encode.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtp_encode.cc >@@ -40,20 +40,24 @@ namespace test { > namespace { > > // Define command line flags. 
>-DEFINE_bool(list_codecs, false, "Enumerate all codecs"); >-DEFINE_string(codec, "opus", "Codec to use"); >-DEFINE_int(frame_len, 0, "Frame length in ms; 0 indicates codec default value"); >-DEFINE_int(bitrate, 0, "Bitrate in kbps; 0 indicates codec default value"); >-DEFINE_int(payload_type, >- -1, >- "RTP payload type; -1 indicates codec default value"); >-DEFINE_int(cng_payload_type, >- -1, >- "RTP payload type for CNG; -1 indicates default value"); >-DEFINE_int(ssrc, 0, "SSRC to write to the RTP header"); >-DEFINE_bool(dtx, false, "Use DTX/CNG"); >-DEFINE_int(sample_rate, 48000, "Sample rate of the input file"); >-DEFINE_bool(help, false, "Print this message"); >+WEBRTC_DEFINE_bool(list_codecs, false, "Enumerate all codecs"); >+WEBRTC_DEFINE_string(codec, "opus", "Codec to use"); >+WEBRTC_DEFINE_int(frame_len, >+ 0, >+ "Frame length in ms; 0 indicates codec default value"); >+WEBRTC_DEFINE_int(bitrate, >+ 0, >+ "Bitrate in kbps; 0 indicates codec default value"); >+WEBRTC_DEFINE_int(payload_type, >+ -1, >+ "RTP payload type; -1 indicates codec default value"); >+WEBRTC_DEFINE_int(cng_payload_type, >+ -1, >+ "RTP payload type for CNG; -1 indicates default value"); >+WEBRTC_DEFINE_int(ssrc, 0, "SSRC to write to the RTP header"); >+WEBRTC_DEFINE_bool(dtx, false, "Use DTX/CNG"); >+WEBRTC_DEFINE_int(sample_rate, 48000, "Sample rate of the input file"); >+WEBRTC_DEFINE_bool(help, false, "Print this message"); > > // Add new codecs here, and to the map below. 
> enum class CodecType { >@@ -242,8 +246,8 @@ std::unique_ptr<AudioEncoder> CreateEncoder(CodecType codec_type, > return nullptr; > } > >-AudioEncoderCng::Config GetCngConfig(int sample_rate_hz) { >- AudioEncoderCng::Config cng_config; >+AudioEncoderCngConfig GetCngConfig(int sample_rate_hz) { >+ AudioEncoderCngConfig cng_config; > const auto default_payload_type = [&] { > switch (sample_rate_hz) { > case 8000: >@@ -309,10 +313,10 @@ int RunRtpEncode(int argc, char* argv[]) { > > // Create an external VAD/CNG encoder if needed. > if (FLAG_dtx && !codec_it->second.internal_dtx) { >- AudioEncoderCng::Config cng_config = GetCngConfig(codec->SampleRateHz()); >+ AudioEncoderCngConfig cng_config = GetCngConfig(codec->SampleRateHz()); > RTC_DCHECK(codec); > cng_config.speech_encoder = std::move(codec); >- codec = absl::make_unique<AudioEncoderCng>(std::move(cng_config)); >+ codec = CreateComfortNoiseEncoder(std::move(cng_config)); > } > RTC_DCHECK(codec); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtp_file_source.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtp_file_source.cc >index 7bf7d52083cc65c4018e1372ea4ada90e9fc3bb6..eda2b3e31fcbd4a069a15edd026442c4896d1495 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtp_file_source.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtp_file_source.cc >@@ -26,8 +26,9 @@ > namespace webrtc { > namespace test { > >-RtpFileSource* RtpFileSource::Create(const std::string& file_name) { >- RtpFileSource* source = new RtpFileSource(); >+RtpFileSource* RtpFileSource::Create(const std::string& file_name, >+ absl::optional<uint32_t> ssrc_filter) { >+ RtpFileSource* source = new RtpFileSource(ssrc_filter); > RTC_CHECK(source->OpenFile(file_name)); > return source; > } >@@ -72,7 +73,7 @@ std::unique_ptr<Packet> RtpFileSource::NextPacket() { > continue; > } > if 
(filter_.test(packet->header().payloadType) || >- (use_ssrc_filter_ && packet->header().ssrc != ssrc_)) { >+ (ssrc_filter_ && packet->header().ssrc != *ssrc_filter_)) { > // This payload type should be filtered out. Continue to the next packet. > continue; > } >@@ -80,8 +81,10 @@ std::unique_ptr<Packet> RtpFileSource::NextPacket() { > } > } > >-RtpFileSource::RtpFileSource() >- : PacketSource(), parser_(RtpHeaderParser::Create()) {} >+RtpFileSource::RtpFileSource(absl::optional<uint32_t> ssrc_filter) >+ : PacketSource(), >+ parser_(RtpHeaderParser::Create()), >+ ssrc_filter_(ssrc_filter) {} > > bool RtpFileSource::OpenFile(const std::string& file_name) { > rtp_reader_.reset(RtpFileReader::Create(RtpFileReader::kRtpDump, file_name)); >@@ -89,7 +92,7 @@ bool RtpFileSource::OpenFile(const std::string& file_name) { > return true; > rtp_reader_.reset(RtpFileReader::Create(RtpFileReader::kPcap, file_name)); > if (!rtp_reader_) { >- RTC_FATAL() << "Couldn't open input file as either a rtpdump or .pcap. Note " >+ FATAL() << "Couldn't open input file as either a rtpdump or .pcap. 
Note " > "that .pcapng is not supported."; > } > return true; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtp_file_source.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtp_file_source.h >index 013333b89a369fa222f47cc2890f1f79aee74fa1..02ab897566190b335215cc683f9609b9bf720e8b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtp_file_source.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtp_file_source.h >@@ -16,6 +16,7 @@ > #include <memory> > #include <string> > >+#include "absl/types/optional.h" > #include "common_types.h" // NOLINT(build/include) > #include "modules/audio_coding/neteq/tools/packet_source.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" >@@ -33,7 +34,9 @@ class RtpFileSource : public PacketSource { > public: > // Creates an RtpFileSource reading from |file_name|. If the file cannot be > // opened, or has the wrong format, NULL will be returned. >- static RtpFileSource* Create(const std::string& file_name); >+ static RtpFileSource* Create( >+ const std::string& file_name, >+ absl::optional<uint32_t> ssrc_filter = absl::nullopt); > > // Checks whether a files is a valid RTP dump or PCAP (Wireshark) file. 
> static bool ValidRtpDump(const std::string& file_name); >@@ -51,12 +54,13 @@ class RtpFileSource : public PacketSource { > static const int kRtpFileHeaderSize = 4 + 4 + 4 + 2 + 2; > static const size_t kPacketHeaderSize = 8; > >- RtpFileSource(); >+ explicit RtpFileSource(absl::optional<uint32_t> ssrc_filter); > > bool OpenFile(const std::string& file_name); > > std::unique_ptr<RtpFileReader> rtp_reader_; > std::unique_ptr<RtpHeaderParser> parser_; >+ const absl::optional<uint32_t> ssrc_filter_; > > RTC_DISALLOW_COPY_AND_ASSIGN(RtpFileSource); > }; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtp_jitter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtp_jitter.cc >index 3c49443036c39f7322ef1e8e637c32d616819789..92a7a8d2150afb44e17fcc6d3e14905f67884a93 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtp_jitter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtp_jitter.cc >@@ -23,7 +23,7 @@ namespace webrtc { > namespace test { > namespace { > >-DEFINE_bool(help, false, "Print help message"); >+WEBRTC_DEFINE_bool(help, false, "Print help message"); > > constexpr size_t kRtpDumpHeaderLength = 8; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/test/EncodeDecodeTest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/test/EncodeDecodeTest.cc >index c6f3f8c9f2562d492e48dde354654853c61d25eb..240836688cf0cfae2331af85826732dc6e29d8f3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/test/EncodeDecodeTest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/test/EncodeDecodeTest.cc >@@ -14,9 +14,9 @@ > #include <stdlib.h> > #include <memory> > >+#include "absl/strings/match.h" > #include "api/audio_codecs/builtin_audio_decoder_factory.h" > #include "api/audio_codecs/builtin_audio_encoder_factory.h" >-#include "common_types.h" // 
NOLINT(build/include) > #include "modules/audio_coding/codecs/audio_format_conversion.h" > #include "modules/audio_coding/include/audio_coding_module.h" > #include "modules/audio_coding/test/utility.h" >@@ -248,11 +248,11 @@ void EncodeDecodeTest::Perform() { > > for (int n = 0; n < numCodecs; n++) { > EXPECT_EQ(0, acm->Codec(n, &sendCodecTmp)); >- if (STR_CASE_CMP(sendCodecTmp.plname, "telephone-event") == 0) { >+ if (absl::EqualsIgnoreCase(sendCodecTmp.plname, "telephone-event")) { > numPars[n] = 0; >- } else if (STR_CASE_CMP(sendCodecTmp.plname, "cn") == 0) { >+ } else if (absl::EqualsIgnoreCase(sendCodecTmp.plname, "cn")) { > numPars[n] = 0; >- } else if (STR_CASE_CMP(sendCodecTmp.plname, "red") == 0) { >+ } else if (absl::EqualsIgnoreCase(sendCodecTmp.plname, "red")) { > numPars[n] = 0; > } else if (sendCodecTmp.channels == 2) { > numPars[n] = 0; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/test/TestRedFec.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/test/TestRedFec.cc >index c56fed95dab96dc552c019b2bd22a785cca60e8a..8bb3971373da7f0721081a721b75d374e494961a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/test/TestRedFec.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/test/TestRedFec.cc >@@ -173,14 +173,14 @@ void TestRedFec::RegisterSendCodec( > auto encoder = encoder_factory_->MakeAudioEncoder(payload_type, codec_format, > absl::nullopt); > EXPECT_NE(encoder, nullptr); >- if (STR_CASE_CMP(codec_format.name.c_str(), "opus") != 0) { >+ if (!absl::EqualsIgnoreCase(codec_format.name, "opus")) { > if (vad_mode.has_value()) { >- AudioEncoderCng::Config config; >+ AudioEncoderCngConfig config; > config.speech_encoder = std::move(encoder); > config.num_channels = 1; > config.payload_type = cn_payload_type; > config.vad_mode = vad_mode.value(); >- encoder = absl::make_unique<AudioEncoderCng>(std::move(config)); >+ encoder = 
CreateComfortNoiseEncoder(std::move(config)); > EXPECT_EQ(true, > other_acm->RegisterReceiveCodec( > cn_payload_type, {"CN", codec_format.clockrate_hz, 1})); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/test/TestStereo.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/test/TestStereo.cc >index fd69fd262622e3143651da87f4f251bb8d8336a1..bf8e1894d0e41b936322dd5f7db4f51bd0042e22 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/test/TestStereo.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/test/TestStereo.cc >@@ -12,9 +12,9 @@ > > #include <string> > >+#include "absl/strings/match.h" > #include "api/audio_codecs/builtin_audio_decoder_factory.h" > #include "api/audio_codecs/builtin_audio_encoder_factory.h" >-#include "common_types.h" // NOLINT(build/include) > #include "modules/audio_coding/codecs/audio_format_conversion.h" > #include "modules/audio_coding/include/audio_coding_module_typedefs.h" > #include "modules/audio_coding/test/utility.h" >@@ -626,14 +626,14 @@ void TestStereo::RegisterSendCodec(char side, > ASSERT_TRUE(my_acm != NULL); > > auto encoder_factory = CreateBuiltinAudioEncoderFactory(); >- const int clockrate_hz = STR_CASE_CMP(codec_name, "g722") == 0 >+ const int clockrate_hz = absl::EqualsIgnoreCase(codec_name, "g722") > ? 
sampling_freq_hz / 2 > : sampling_freq_hz; > const std::string ptime = rtc::ToString(rtc::CheckedDivExact( > pack_size, rtc::CheckedDivExact(sampling_freq_hz, 1000))); > SdpAudioFormat::Parameters params = {{"ptime", ptime}}; > RTC_CHECK(channels == 1 || channels == 2); >- if (STR_CASE_CMP(codec_name, "opus") == 0) { >+ if (absl::EqualsIgnoreCase(codec_name, "opus")) { > if (channels == 2) { > params["stereo"] = "1"; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/test/TestVADDTX.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/test/TestVADDTX.cc >index 4f02edaa83744c193b0e742ec22f78738e3e8eba..09c69f9c6bbe566096fa7bf2caa41fed5f71aec8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/test/TestVADDTX.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/test/TestVADDTX.cc >@@ -12,6 +12,7 @@ > > #include <string> > >+#include "absl/strings/match.h" > #include "api/audio_codecs/audio_decoder_factory_template.h" > #include "api/audio_codecs/audio_encoder_factory_template.h" > #include "api/audio_codecs/ilbc/audio_decoder_ilbc.h" >@@ -81,13 +82,13 @@ bool TestVadDtx::RegisterCodec(const SdpAudioFormat& codec_format, > auto encoder = encoder_factory_->MakeAudioEncoder(payload_type, codec_format, > absl::nullopt); > if (vad_mode.has_value() && >- STR_CASE_CMP(codec_format.name.c_str(), "opus") != 0) { >- AudioEncoderCng::Config config; >+ !absl::EqualsIgnoreCase(codec_format.name, "opus")) { >+ AudioEncoderCngConfig config; > config.speech_encoder = std::move(encoder); > config.num_channels = 1; > config.payload_type = cn_payload_type; > config.vad_mode = vad_mode.value(); >- encoder = absl::make_unique<AudioEncoderCng>(std::move(config)); >+ encoder = CreateComfortNoiseEncoder(std::move(config)); > added_comfort_noise = true; > } > channel_->SetIsStereo(encoder->NumChannels() > 1); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/test/iSACTest.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/test/iSACTest.cc >index a130ae3e66fdac8b03101690a0f094dff1f6b6bb..c332fe08f11183694b1a89eb89d1b4a31c28b6c0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/test/iSACTest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/test/iSACTest.cc >@@ -23,6 +23,7 @@ > #include <time.h> > #endif > >+#include "absl/strings/match.h" > #include "api/audio_codecs/builtin_audio_decoder_factory.h" > #include "api/audio_codecs/isac/audio_encoder_isac_float.h" > #include "modules/audio_coding/codecs/audio_format_conversion.h" >@@ -92,12 +93,12 @@ void ISACTest::Setup() { > for (codecCntr = 0; codecCntr < AudioCodingModule::NumberOfCodecs(); > codecCntr++) { > EXPECT_EQ(0, AudioCodingModule::Codec(codecCntr, &codecParam)); >- if (!STR_CASE_CMP(codecParam.plname, "ISAC") && >+ if (absl::EqualsIgnoreCase(codecParam.plname, "ISAC") && > codecParam.plfreq == 16000) { > memcpy(&_paramISAC16kHz, &codecParam, sizeof(CodecInst)); > _idISAC16kHz = codecCntr; > } >- if (!STR_CASE_CMP(codecParam.plname, "ISAC") && >+ if (absl::EqualsIgnoreCase(codecParam.plname, "ISAC") && > codecParam.plfreq == 32000) { > memcpy(&_paramISAC32kHz, &codecParam, sizeof(CodecInst)); > _idISAC32kHz = codecCntr; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/test/utility.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/test/utility.cc >index 83c25b537e45c18270cfc33e86947564257f3f5d..53f807790ef3377ab20ef4700a913d90acb24d89 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/test/utility.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/test/utility.cc >@@ -15,7 +15,7 @@ > #include <stdlib.h> > #include <string.h> > >-#include "common_types.h" // NOLINT(build/include) >+#include "absl/strings/match.h" > #include "modules/audio_coding/include/audio_coding_module.h" > #include "test/gtest.h" > >@@ -268,7 +268,7 
@@ bool FixedPayloadTypeCodec(const char* payloadName) { > "G722", "QCELP", "CN", "MPA", "G728", "G729"}; > > for (int n = 0; n < NUM_CODECS_WITH_FIXED_PAYLOAD_TYPE; n++) { >- if (!STR_CASE_CMP(payloadName, fixPayloadTypeCodecs[n])) { >+ if (absl::EqualsIgnoreCase(payloadName, fixPayloadTypeCodecs[n])) { > return true; > } > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/BUILD.gn >index b690050981194395d7a075895197b71bd785fa67..67495aca77556e75593a1331d2e5f5838d444f8a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/BUILD.gn >@@ -24,6 +24,16 @@ config("audio_device_warnings_config") { > } > } > >+rtc_source_set("audio_device_default") { >+ visibility = [ "*" ] >+ sources = [ >+ "include/audio_device_default.h", >+ ] >+ deps = [ >+ ":audio_device_api", >+ ] >+} >+ > rtc_source_set("audio_device") { > visibility = [ "*" ] > public_deps = [ >@@ -203,8 +213,8 @@ rtc_source_set("audio_device_impl") { > deps = [ > ":audio_device_api", > ":audio_device_buffer", >+ ":audio_device_default", > ":audio_device_generic", >- "../..:webrtc_common", > "../../api:array_view", > "../../common_audio", > "../../common_audio:common_audio_c", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/android/audio_device_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/android/audio_device_unittest.cc >index b0cf42edec53a31595f9db47952e404f0db0a680..6e09139c79b68850f8d922ba945c687852478d49 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/android/audio_device_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/android/audio_device_unittest.cc >@@ -25,10 +25,10 @@ > #include "modules/audio_device/include/mock_audio_transport.h" > #include "rtc_base/arraysize.h" > #include 
"rtc_base/criticalsection.h" >+#include "rtc_base/event.h" > #include "rtc_base/format_macros.h" > #include "rtc_base/scoped_ref_ptr.h" > #include "rtc_base/timeutils.h" >-#include "system_wrappers/include/event_wrapper.h" > #include "test/gmock.h" > #include "test/gtest.h" > #include "test/testsupport/fileutils.h" >@@ -365,7 +365,7 @@ class MockAudioTransportAndroid : public test::MockAudioTransport { > > // Set default actions of the mock object. We are delegating to fake > // implementations (of AudioStreamInterface) here. >- void HandleCallbacks(EventWrapper* test_is_done, >+ void HandleCallbacks(rtc::Event* test_is_done, > AudioStreamInterface* audio_stream, > int num_callbacks) { > test_is_done_ = test_is_done; >@@ -448,7 +448,7 @@ class MockAudioTransportAndroid : public test::MockAudioTransport { > bool rec_mode() const { return type_ & kRecording; } > > private: >- EventWrapper* test_is_done_; >+ rtc::Event* test_is_done_; > size_t num_callbacks_; > int type_; > size_t play_count_; >@@ -460,7 +460,7 @@ class MockAudioTransportAndroid : public test::MockAudioTransport { > // AudioDeviceTest test fixture. > class AudioDeviceTest : public ::testing::Test { > protected: >- AudioDeviceTest() : test_is_done_(EventWrapper::Create()) { >+ AudioDeviceTest() { > // One-time initialization of JVM and application context. Ensures that we > // can do calls between C++ and Java. Initializes both Java and OpenSL ES > // implementations. >@@ -638,7 +638,7 @@ class AudioDeviceTest : public ::testing::Test { > return volume; > } > >- std::unique_ptr<EventWrapper> test_is_done_; >+ rtc::Event test_is_done_; > rtc::scoped_refptr<AudioDeviceModule> audio_device_; > AudioParameters playout_parameters_; > AudioParameters record_parameters_; >@@ -867,14 +867,14 @@ TEST_F(AudioDeviceTest, StopRecordingRequiresInitToRestart) { > // audio samples to play out using the NeedMorePlayData callback. 
> TEST_F(AudioDeviceTest, StartPlayoutVerifyCallbacks) { > MockAudioTransportAndroid mock(kPlayout); >- mock.HandleCallbacks(test_is_done_.get(), nullptr, kNumCallbacks); >+ mock.HandleCallbacks(&test_is_done_, nullptr, kNumCallbacks); > EXPECT_CALL(mock, NeedMorePlayData(playout_frames_per_10ms_buffer(), > kBytesPerSample, playout_channels(), > playout_sample_rate(), NotNull(), _, _, _)) > .Times(AtLeast(kNumCallbacks)); > EXPECT_EQ(0, audio_device()->RegisterAudioCallback(&mock)); > StartPlayout(); >- test_is_done_->Wait(kTestTimeOutInMilliseconds); >+ test_is_done_.Wait(kTestTimeOutInMilliseconds); > StopPlayout(); > } > >@@ -884,7 +884,7 @@ TEST_F(AudioDeviceTest, StartPlayoutVerifyCallbacks) { > // delay estimates as well (argument #6). > TEST_F(AudioDeviceTest, StartRecordingVerifyCallbacks) { > MockAudioTransportAndroid mock(kRecording); >- mock.HandleCallbacks(test_is_done_.get(), nullptr, kNumCallbacks); >+ mock.HandleCallbacks(&test_is_done_, nullptr, kNumCallbacks); > EXPECT_CALL( > mock, RecordedDataIsAvailable(NotNull(), record_frames_per_10ms_buffer(), > kBytesPerSample, record_channels(), >@@ -893,7 +893,7 @@ TEST_F(AudioDeviceTest, StartRecordingVerifyCallbacks) { > > EXPECT_EQ(0, audio_device()->RegisterAudioCallback(&mock)); > StartRecording(); >- test_is_done_->Wait(kTestTimeOutInMilliseconds); >+ test_is_done_.Wait(kTestTimeOutInMilliseconds); > StopRecording(); > } > >@@ -901,7 +901,7 @@ TEST_F(AudioDeviceTest, StartRecordingVerifyCallbacks) { > // active in both directions. 
> TEST_F(AudioDeviceTest, StartPlayoutAndRecordingVerifyCallbacks) { > MockAudioTransportAndroid mock(kPlayout | kRecording); >- mock.HandleCallbacks(test_is_done_.get(), nullptr, kNumCallbacks); >+ mock.HandleCallbacks(&test_is_done_, nullptr, kNumCallbacks); > EXPECT_CALL(mock, NeedMorePlayData(playout_frames_per_10ms_buffer(), > kBytesPerSample, playout_channels(), > playout_sample_rate(), NotNull(), _, _, _)) >@@ -914,7 +914,7 @@ TEST_F(AudioDeviceTest, StartPlayoutAndRecordingVerifyCallbacks) { > EXPECT_EQ(0, audio_device()->RegisterAudioCallback(&mock)); > StartPlayout(); > StartRecording(); >- test_is_done_->Wait(kTestTimeOutInMilliseconds); >+ test_is_done_.Wait(kTestTimeOutInMilliseconds); > StopRecording(); > StopPlayout(); > } >@@ -930,12 +930,11 @@ TEST_F(AudioDeviceTest, RunPlayoutWithFileAsSource) { > std::string file_name = GetFileName(playout_sample_rate()); > std::unique_ptr<FileAudioStream> file_audio_stream( > new FileAudioStream(num_callbacks, file_name, playout_sample_rate())); >- mock.HandleCallbacks(test_is_done_.get(), file_audio_stream.get(), >- num_callbacks); >+ mock.HandleCallbacks(&test_is_done_, file_audio_stream.get(), num_callbacks); > // SetMaxPlayoutVolume(); > EXPECT_EQ(0, audio_device()->RegisterAudioCallback(&mock)); > StartPlayout(); >- test_is_done_->Wait(kTestTimeOutInMilliseconds); >+ test_is_done_.Wait(kTestTimeOutInMilliseconds); > StopPlayout(); > } > >@@ -964,13 +963,13 @@ TEST_F(AudioDeviceTest, DISABLED_RunPlayoutAndRecordingInFullDuplex) { > NiceMock<MockAudioTransportAndroid> mock(kPlayout | kRecording); > std::unique_ptr<FifoAudioStream> fifo_audio_stream( > new FifoAudioStream(playout_frames_per_10ms_buffer())); >- mock.HandleCallbacks(test_is_done_.get(), fifo_audio_stream.get(), >+ mock.HandleCallbacks(&test_is_done_, fifo_audio_stream.get(), > kFullDuplexTimeInSec * kNumCallbacksPerSecond); > SetMaxPlayoutVolume(); > EXPECT_EQ(0, audio_device()->RegisterAudioCallback(&mock)); > StartRecording(); > 
StartPlayout(); >- test_is_done_->Wait( >+ test_is_done_.Wait( > std::max(kTestTimeOutInMilliseconds, 1000 * kFullDuplexTimeInSec)); > StopPlayout(); > StopRecording(); >@@ -997,14 +996,14 @@ TEST_F(AudioDeviceTest, DISABLED_MeasureLoopbackLatency) { > NiceMock<MockAudioTransportAndroid> mock(kPlayout | kRecording); > std::unique_ptr<LatencyMeasuringAudioStream> latency_audio_stream( > new LatencyMeasuringAudioStream(playout_frames_per_10ms_buffer())); >- mock.HandleCallbacks(test_is_done_.get(), latency_audio_stream.get(), >+ mock.HandleCallbacks(&test_is_done_, latency_audio_stream.get(), > kMeasureLatencyTimeInSec * kNumCallbacksPerSecond); > EXPECT_EQ(0, audio_device()->RegisterAudioCallback(&mock)); > SetMaxPlayoutVolume(); > DisableBuiltInAECIfAvailable(); > StartRecording(); > StartPlayout(); >- test_is_done_->Wait( >+ test_is_done_.Wait( > std::max(kTestTimeOutInMilliseconds, 1000 * kMeasureLatencyTimeInSec)); > StopPlayout(); > StopRecording(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_buffer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_buffer.cc >index 5c45780426b78f914a9b195161568c47d81d87a5..8f920cfff8ea9c34a365562b968aa0e7b7c74f04 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_buffer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_buffer.cc >@@ -8,19 +8,16 @@ > * be found in the AUTHORS file in the root of the source tree. 
> */ > >-#include <algorithm> >+#include <string.h> > #include <cmath> >- >-#include "modules/audio_device/audio_device_buffer.h" >+#include <cstddef> >+#include <cstdint> > > #include "common_audio/signal_processing/include/signal_processing_library.h" >-#include "modules/audio_device/audio_device_config.h" >-#include "rtc_base/arraysize.h" >+#include "modules/audio_device/audio_device_buffer.h" > #include "rtc_base/bind.h" > #include "rtc_base/checks.h" >-#include "rtc_base/format_macros.h" > #include "rtc_base/logging.h" >-#include "rtc_base/numerics/safe_conversions.h" > #include "rtc_base/timeutils.h" > #include "system_wrappers/include/metrics.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_buffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_buffer.h >index 7f0cf83baf7d7ea7d6e7bc7dc0331735ebbbe1e3..bbe2969d43697558b65874e3501e7eda607990af 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_buffer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_buffer.h >@@ -11,9 +11,11 @@ > #ifndef MODULES_AUDIO_DEVICE_AUDIO_DEVICE_BUFFER_H_ > #define MODULES_AUDIO_DEVICE_AUDIO_DEVICE_BUFFER_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <atomic> > >-#include "modules/audio_device/include/audio_device.h" >+#include "modules/audio_device/include/audio_device_defines.h" > #include "rtc_base/buffer.h" > #include "rtc_base/criticalsection.h" > #include "rtc_base/task_queue.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_data_observer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_data_observer.cc >index 723d1c8bbdc74c9a9c6bd2856a412a9980818b8b..994c2efdf33210a1bf46ff72ed37596484b5e2a7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_data_observer.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_data_observer.cc >@@ -9,6 +9,8 @@ > */ > > #include "modules/audio_device/include/audio_device_data_observer.h" >+ >+#include "modules/audio_device/include/audio_device_defines.h" > #include "rtc_base/checks.h" > #include "rtc_base/refcountedobject.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_generic.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_generic.h >index be9c072c412ce84b1bc74a493305d1b3989afb30..7d3c83e119603bb6ebde9225a9857243587bdf47 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_generic.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_generic.h >@@ -11,8 +11,11 @@ > #ifndef AUDIO_DEVICE_AUDIO_DEVICE_GENERIC_H_ > #define AUDIO_DEVICE_AUDIO_DEVICE_GENERIC_H_ > >+#include <stdint.h> >+ > #include "modules/audio_device/audio_device_buffer.h" > #include "modules/audio_device/include/audio_device.h" >+#include "modules/audio_device/include/audio_device_defines.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_impl.cc >index 91d6208fe1fd1e8e85470a68ef52e9590a642198..31d5b5ecfc51e9a8ca53b2c3aee89865983e06f4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_impl.cc >@@ -10,12 +10,14 @@ > > #include "modules/audio_device/audio_device_impl.h" > >-#include "modules/audio_device/audio_device_config.h" >+#include <stddef.h> >+ >+#include "modules/audio_device/audio_device_config.h" // IWYU pragma: keep > #include "modules/audio_device/audio_device_generic.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" >-#include "rtc_base/refcount.h" 
> #include "rtc_base/refcountedobject.h" >+#include "rtc_base/scoped_ref_ptr.h" > #include "system_wrappers/include/metrics.h" > > #if defined(_WIN32) >@@ -47,10 +49,10 @@ > #include "modules/audio_device/mac/audio_device_mac.h" > #endif > #if defined(WEBRTC_DUMMY_FILE_DEVICES) >+#include "modules/audio_device/dummy/file_audio_device.h" > #include "modules/audio_device/dummy/file_audio_device_factory.h" > #endif > #include "modules/audio_device/dummy/audio_device_dummy.h" >-#include "modules/audio_device/dummy/file_audio_device.h" > > #define CHECKinitialized_() \ > { \ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_impl.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_impl.h >index 481cdf32e00dca4ac9f766df25b0edfa1f1187d6..afe53b3240328d941769858e345898cd7c123819 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_impl.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_impl.h >@@ -13,12 +13,11 @@ > > #if defined(WEBRTC_INCLUDE_INTERNAL_AUDIO_DEVICE) > >+#include <stdint.h> > #include <memory> > > #include "modules/audio_device/audio_device_buffer.h" > #include "modules/audio_device/include/audio_device.h" >-#include "rtc_base/checks.h" >-#include "rtc_base/criticalsection.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_unittest.cc >index a82bc4a33ba3a7dbb46fc91b33bfee7b9565719d..1aa28e5d7c1230a3706a7976f0c6819117a99e28 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/audio_device_unittest.cc >@@ -509,7 +509,7 @@ class MockAudioTransport : public test::MockAudioTransport { > class AudioDeviceTest > : public 
::testing::TestWithParam<webrtc::AudioDeviceModule::AudioLayer> { > protected: >- AudioDeviceTest() : audio_layer_(GetParam()), event_(false, false) { >+ AudioDeviceTest() : audio_layer_(GetParam()) { > // TODO(webrtc:9778): Re-enable on THREAD_SANITIZER? > #if !defined(ADDRESS_SANITIZER) && !defined(MEMORY_SANITIZER) && \ > !defined(WEBRTC_DUMMY_AUDIO_BUILD) && !defined(THREAD_SANITIZER) >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/dummy/audio_device_dummy.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/dummy/audio_device_dummy.h >index d709f776919fc7dc3a4ed893c9ce5b6fdd7c7458..2a2541098ea86869aee2dc31d9d2921d1f2fc81c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/dummy/audio_device_dummy.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/dummy/audio_device_dummy.h >@@ -11,9 +11,12 @@ > #ifndef AUDIO_DEVICE_AUDIO_DEVICE_DUMMY_H_ > #define AUDIO_DEVICE_AUDIO_DEVICE_DUMMY_H_ > >-#include <stdio.h> >+#include <stdint.h> > >+#include "modules/audio_device/audio_device_buffer.h" > #include "modules/audio_device/audio_device_generic.h" >+#include "modules/audio_device/include/audio_device.h" >+#include "modules/audio_device/include/audio_device_defines.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/dummy/file_audio_device.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/dummy/file_audio_device.cc >index b9aeaa15065a9887007ce5e222fd71129201faf4..2848eeae350895789b01c6153236dd6213d62a1b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/dummy/file_audio_device.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/dummy/file_audio_device.cc >@@ -9,9 +9,13 @@ > */ > > #include "modules/audio_device/dummy/file_audio_device.h" >+ >+#include <string.h> >+ > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" > #include "rtc_base/platform_thread.h" 
>+#include "rtc_base/timeutils.h" > #include "system_wrappers/include/sleep.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/dummy/file_audio_device.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/dummy/file_audio_device.h >index f07690e7e033790202bd904dda6d86f972bc292b..210c8f7b8be6026fb020a332f9fb6b1b06366265 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/dummy/file_audio_device.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/dummy/file_audio_device.h >@@ -26,7 +26,6 @@ class PlatformThread; > } // namespace rtc > > namespace webrtc { >-class EventWrapper; > > // This is a fake audio device which plays audio from a file as its microphone > // and plays out into a file. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/dummy/file_audio_device_factory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/dummy/file_audio_device_factory.cc >index 6b38d8bc055ae8530d1e5285baf705d22aa0b8fb..027b13bf5c5ce1bbb7ee76bfc1fa014e7ffaf54c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/dummy/file_audio_device_factory.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/dummy/file_audio_device_factory.cc >@@ -10,8 +10,8 @@ > > #include "modules/audio_device/dummy/file_audio_device_factory.h" > >+#include <stdio.h> > #include <cstdlib> >-#include <cstring> > > #include "modules/audio_device/dummy/file_audio_device.h" > #include "rtc_base/logging.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/dummy/file_audio_device_factory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/dummy/file_audio_device_factory.h >index 9cd5b3d62566f6b2db2870ab39a15f5c78f816f1..72f4ab2b384ed9407e167688c440f6df8a189fac 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/dummy/file_audio_device_factory.h >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/dummy/file_audio_device_factory.h >@@ -11,7 +11,7 @@ > #ifndef AUDIO_DEVICE_FILE_AUDIO_DEVICE_FACTORY_H_ > #define AUDIO_DEVICE_FILE_AUDIO_DEVICE_FACTORY_H_ > >-#include "common_types.h" // NOLINT(build/include) >+#include <stdint.h> > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/fine_audio_buffer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/fine_audio_buffer.cc >index 4af344aef3786a30027307585b6171e9145d234e..b4f3c371ace369cdcfd45bb932616480830bd401 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/fine_audio_buffer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/fine_audio_buffer.cc >@@ -10,6 +10,9 @@ > > #include "modules/audio_device/fine_audio_buffer.h" > >+#include <cstdint> >+#include <cstring> >+ > #include "modules/audio_device/audio_device_buffer.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/include/audio_device_data_observer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/include/audio_device_data_observer.h >index 3a4ee99518aa926b1cc215f0693b330fc9761bb1..959cfa4662c696b926fff17c6945c15387b1e036 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/include/audio_device_data_observer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/include/audio_device_data_observer.h >@@ -11,6 +11,9 @@ > #ifndef MODULES_AUDIO_DEVICE_INCLUDE_AUDIO_DEVICE_DATA_OBSERVER_H_ > #define MODULES_AUDIO_DEVICE_INCLUDE_AUDIO_DEVICE_DATA_OBSERVER_H_ > >+#include <stddef.h> >+#include <stdint.h> >+ > #include "modules/audio_device/include/audio_device.h" > #include "rtc_base/scoped_ref_ptr.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/include/test_audio_device.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/include/test_audio_device.cc >index 7d55c47529a296f98d38e9fee9ac0882a1943c6c..8aea1d245e226be1d77a36df3bf35502e716c36e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/include/test_audio_device.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/include/test_audio_device.cc >@@ -8,8 +8,11 @@ > * be found in the AUTHORS file in the root of the source tree. > */ > #include <algorithm> >+#include <cstdint> >+#include <cstdlib> > #include <memory> > #include <string> >+#include <type_traits> > #include <utility> > #include <vector> > >@@ -28,6 +31,7 @@ > #include "rtc_base/random.h" > #include "rtc_base/refcountedobject.h" > #include "rtc_base/thread.h" >+#include "rtc_base/thread_annotations.h" > #include "rtc_base/timeutils.h" > > namespace webrtc { >@@ -58,8 +62,6 @@ class TestAudioDeviceModuleImpl > audio_callback_(nullptr), > rendering_(false), > capturing_(false), >- done_rendering_(true, true), >- done_capturing_(true, true), > stop_thread_(false) { > auto good_sample_rate = [](int sr) { > return sr == 8000 || sr == 16000 || sr == 32000 || sr == 44100 || >@@ -108,14 +110,12 @@ class TestAudioDeviceModuleImpl > rtc::CritScope cs(&lock_); > RTC_CHECK(renderer_); > rendering_ = true; >- done_rendering_.Reset(); > return 0; > } > > int32_t StopPlayout() { > rtc::CritScope cs(&lock_); > rendering_ = false; >- done_rendering_.Set(); > return 0; > } > >@@ -123,14 +123,12 @@ class TestAudioDeviceModuleImpl > rtc::CritScope cs(&lock_); > RTC_CHECK(capturer_); > capturing_ = true; >- done_capturing_.Reset(); > return 0; > } > > int32_t StopRecording() { > rtc::CritScope cs(&lock_); > capturing_ = false; >- done_capturing_.Set(); > return 0; > } > >@@ -172,9 +170,10 @@ class TestAudioDeviceModuleImpl > uint32_t new_mic_level = 0; > if (recording_buffer_.size() > 0) { > audio_callback_->RecordedDataIsAvailable( >- recording_buffer_.data(), 
recording_buffer_.size(), 2, >- capturer_->NumChannels(), capturer_->SamplingFrequency(), 0, 0, >- 0, false, new_mic_level); >+ recording_buffer_.data(), >+ recording_buffer_.size() / capturer_->NumChannels(), >+ 2 * capturer_->NumChannels(), capturer_->NumChannels(), >+ capturer_->SamplingFrequency(), 0, 0, 0, false, new_mic_level); > } > if (!keep_capturing) { > capturing_ = false; >@@ -187,9 +186,10 @@ class TestAudioDeviceModuleImpl > int64_t ntp_time_ms = -1; > const int sampling_frequency = renderer_->SamplingFrequency(); > audio_callback_->NeedMorePlayData( >- SamplesPerFrame(sampling_frequency), 2, renderer_->NumChannels(), >- sampling_frequency, playout_buffer_.data(), samples_out, >- &elapsed_time_ms, &ntp_time_ms); >+ SamplesPerFrame(sampling_frequency), 2 * renderer_->NumChannels(), >+ renderer_->NumChannels(), sampling_frequency, >+ playout_buffer_.data(), samples_out, &elapsed_time_ms, >+ &ntp_time_ms); > const bool keep_rendering = > renderer_->Render(rtc::ArrayView<const int16_t>( > playout_buffer_.data(), samples_out)); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/include/test_audio_device.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/include/test_audio_device.h >index 7d034dce8c70ecdb629c2c652b8e6b214bda6a4a..93f0b13601fb790f42548b541f3adbd403eaba4b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/include/test_audio_device.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/include/test_audio_device.h >@@ -10,14 +10,18 @@ > #ifndef MODULES_AUDIO_DEVICE_INCLUDE_TEST_AUDIO_DEVICE_H_ > #define MODULES_AUDIO_DEVICE_INCLUDE_TEST_AUDIO_DEVICE_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <memory> > #include <string> >-#include <vector> > >+#include "api/array_view.h" > #include "modules/audio_device/include/audio_device.h" >+#include "modules/audio_device/include/audio_device_defines.h" > #include "rtc_base/buffer.h" > #include 
"rtc_base/event.h" > #include "rtc_base/platform_file.h" >+#include "rtc_base/scoped_ref_ptr.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/ios/audio_device_unittest_ios.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/ios/audio_device_unittest_ios.mm >index 47b873d4a99b71a0ed6b66116e9ceb89dd43a2d2..19ff9c9233236a8ae502aeb0240119358d3c2a4b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/ios/audio_device_unittest_ios.mm >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/ios/audio_device_unittest_ios.mm >@@ -22,11 +22,11 @@ > #include "modules/audio_device/ios/audio_device_ios.h" > #include "rtc_base/arraysize.h" > #include "rtc_base/criticalsection.h" >+#include "rtc_base/event.h" > #include "rtc_base/format_macros.h" > #include "rtc_base/logging.h" > #include "rtc_base/scoped_ref_ptr.h" > #include "rtc_base/timeutils.h" >-#include "system_wrappers/include/event_wrapper.h" > #include "test/gmock.h" > #include "test/gtest.h" > #include "test/testsupport/fileutils.h" >@@ -372,7 +372,7 @@ class MockAudioTransportIOS : public test::MockAudioTransport { > > // Set default actions of the mock object. We are delegating to fake > // implementations (of AudioStreamInterface) here. >- void HandleCallbacks(EventWrapper* test_is_done, >+ void HandleCallbacks(rtc::Event* test_is_done, > AudioStreamInterface* audio_stream, > size_t num_callbacks) { > test_is_done_ = test_is_done; >@@ -461,7 +461,7 @@ class MockAudioTransportIOS : public test::MockAudioTransport { > bool rec_mode() const { return type_ & kRecording; } > > private: >- EventWrapper* test_is_done_; >+ rtc::Event* test_is_done_; > size_t num_callbacks_; > int type_; > size_t play_count_; >@@ -472,7 +472,7 @@ class MockAudioTransportIOS : public test::MockAudioTransport { > // AudioDeviceTest test fixture. 
> class AudioDeviceTest : public ::testing::Test { > protected: >- AudioDeviceTest() : test_is_done_(EventWrapper::Create()) { >+ AudioDeviceTest() { > old_sev_ = rtc::LogMessage::GetLogToDebug(); > // Set suitable logging level here. Change to rtc::LS_INFO for more verbose > // output. See webrtc/rtc_base/logging.h for complete list of options. >@@ -572,7 +572,7 @@ class AudioDeviceTest : public ::testing::Test { > EXPECT_FALSE(audio_device()->Recording()); > } > >- std::unique_ptr<EventWrapper> test_is_done_; >+ rtc::Event test_is_done_; > rtc::scoped_refptr<AudioDeviceModule> audio_device_; > AudioParameters playout_parameters_; > AudioParameters record_parameters_; >@@ -662,7 +662,7 @@ TEST_F(AudioDeviceTest, DISABLED_StartPlayoutOnTwoInstances) { > // has been done successfully and that there is no conflict with the already > // playing first ADM. > MockAudioTransportIOS mock2(kPlayout); >- mock2.HandleCallbacks(test_is_done_.get(), nullptr, kNumCallbacks); >+ mock2.HandleCallbacks(&test_is_done_, nullptr, kNumCallbacks); > EXPECT_CALL( > mock2, NeedMorePlayData(playout_frames_per_10ms_buffer(), kBytesPerSample, > playout_channels(), playout_sample_rate(), >@@ -671,7 +671,7 @@ TEST_F(AudioDeviceTest, DISABLED_StartPlayoutOnTwoInstances) { > EXPECT_EQ(0, second_audio_device->RegisterAudioCallback(&mock2)); > EXPECT_EQ(0, second_audio_device->StartPlayout()); > EXPECT_TRUE(second_audio_device->Playing()); >- test_is_done_->Wait(kTestTimeOutInMilliseconds); >+ test_is_done_.Wait(kTestTimeOutInMilliseconds); > EXPECT_EQ(0, second_audio_device->StopPlayout()); > EXPECT_FALSE(second_audio_device->Playing()); > EXPECT_FALSE(second_audio_device->PlayoutIsInitialized()); >@@ -683,14 +683,14 @@ TEST_F(AudioDeviceTest, DISABLED_StartPlayoutOnTwoInstances) { > // audio samples to play out using the NeedMorePlayData callback. 
> TEST_F(AudioDeviceTest, StartPlayoutVerifyCallbacks) { > MockAudioTransportIOS mock(kPlayout); >- mock.HandleCallbacks(test_is_done_.get(), nullptr, kNumCallbacks); >+ mock.HandleCallbacks(&test_is_done_, nullptr, kNumCallbacks); > EXPECT_CALL(mock, NeedMorePlayData(playout_frames_per_10ms_buffer(), > kBytesPerSample, playout_channels(), > playout_sample_rate(), NotNull(), _, _, _)) > .Times(AtLeast(kNumCallbacks)); > EXPECT_EQ(0, audio_device()->RegisterAudioCallback(&mock)); > StartPlayout(); >- test_is_done_->Wait(kTestTimeOutInMilliseconds); >+ test_is_done_.Wait(kTestTimeOutInMilliseconds); > StopPlayout(); > } > >@@ -698,7 +698,7 @@ TEST_F(AudioDeviceTest, StartPlayoutVerifyCallbacks) { > // audio samples via the RecordedDataIsAvailable callback. > TEST_F(AudioDeviceTest, StartRecordingVerifyCallbacks) { > MockAudioTransportIOS mock(kRecording); >- mock.HandleCallbacks(test_is_done_.get(), nullptr, kNumCallbacks); >+ mock.HandleCallbacks(&test_is_done_, nullptr, kNumCallbacks); > EXPECT_CALL(mock, > RecordedDataIsAvailable( > NotNull(), record_frames_per_10ms_buffer(), kBytesPerSample, >@@ -708,7 +708,7 @@ TEST_F(AudioDeviceTest, StartRecordingVerifyCallbacks) { > > EXPECT_EQ(0, audio_device()->RegisterAudioCallback(&mock)); > StartRecording(); >- test_is_done_->Wait(kTestTimeOutInMilliseconds); >+ test_is_done_.Wait(kTestTimeOutInMilliseconds); > StopRecording(); > } > >@@ -716,7 +716,7 @@ TEST_F(AudioDeviceTest, StartRecordingVerifyCallbacks) { > // active in both directions. 
> TEST_F(AudioDeviceTest, StartPlayoutAndRecordingVerifyCallbacks) { > MockAudioTransportIOS mock(kPlayout | kRecording); >- mock.HandleCallbacks(test_is_done_.get(), nullptr, kNumCallbacks); >+ mock.HandleCallbacks(&test_is_done_, nullptr, kNumCallbacks); > EXPECT_CALL(mock, NeedMorePlayData(playout_frames_per_10ms_buffer(), > kBytesPerSample, playout_channels(), > playout_sample_rate(), NotNull(), _, _, _)) >@@ -730,7 +730,7 @@ TEST_F(AudioDeviceTest, StartPlayoutAndRecordingVerifyCallbacks) { > EXPECT_EQ(0, audio_device()->RegisterAudioCallback(&mock)); > StartPlayout(); > StartRecording(); >- test_is_done_->Wait(kTestTimeOutInMilliseconds); >+ test_is_done_.Wait(kTestTimeOutInMilliseconds); > StopRecording(); > StopPlayout(); > } >@@ -746,12 +746,11 @@ TEST_F(AudioDeviceTest, RunPlayoutWithFileAsSource) { > std::string file_name = GetFileName(playout_sample_rate()); > std::unique_ptr<FileAudioStream> file_audio_stream( > new FileAudioStream(num_callbacks, file_name, playout_sample_rate())); >- mock.HandleCallbacks(test_is_done_.get(), file_audio_stream.get(), >- num_callbacks); >+ mock.HandleCallbacks(&test_is_done_, file_audio_stream.get(), num_callbacks); > // SetMaxPlayoutVolume(); > EXPECT_EQ(0, audio_device()->RegisterAudioCallback(&mock)); > StartPlayout(); >- test_is_done_->Wait(kTestTimeOutInMilliseconds); >+ test_is_done_.Wait(kTestTimeOutInMilliseconds); > StopPlayout(); > } > >@@ -780,14 +779,13 @@ TEST_F(AudioDeviceTest, RunPlayoutAndRecordingInFullDuplex) { > NiceMock<MockAudioTransportIOS> mock(kPlayout | kRecording); > std::unique_ptr<FifoAudioStream> fifo_audio_stream( > new FifoAudioStream(playout_frames_per_10ms_buffer())); >- mock.HandleCallbacks(test_is_done_.get(), fifo_audio_stream.get(), >- kFullDuplexTimeInSec * kNumCallbacksPerSecond); >+ mock.HandleCallbacks( >+ &test_is_done_, fifo_audio_stream.get(), kFullDuplexTimeInSec * kNumCallbacksPerSecond); > // SetMaxPlayoutVolume(); > EXPECT_EQ(0, 
audio_device()->RegisterAudioCallback(&mock)); > StartRecording(); > StartPlayout(); >- test_is_done_->Wait( >- std::max(kTestTimeOutInMilliseconds, 1000 * kFullDuplexTimeInSec)); >+ test_is_done_.Wait(std::max(kTestTimeOutInMilliseconds, 1000 * kFullDuplexTimeInSec)); > StopPlayout(); > StopRecording(); > EXPECT_LE(fifo_audio_stream->average_size(), 10u); >@@ -809,15 +807,15 @@ TEST_F(AudioDeviceTest, DISABLED_MeasureLoopbackLatency) { > NiceMock<MockAudioTransportIOS> mock(kPlayout | kRecording); > std::unique_ptr<LatencyMeasuringAudioStream> latency_audio_stream( > new LatencyMeasuringAudioStream(playout_frames_per_10ms_buffer())); >- mock.HandleCallbacks(test_is_done_.get(), latency_audio_stream.get(), >+ mock.HandleCallbacks(&test_is_done_, >+ latency_audio_stream.get(), > kMeasureLatencyTimeInSec * kNumCallbacksPerSecond); > EXPECT_EQ(0, audio_device()->RegisterAudioCallback(&mock)); > // SetMaxPlayoutVolume(); > // DisableBuiltInAECIfAvailable(); > StartRecording(); > StartPlayout(); >- test_is_done_->Wait( >- std::max(kTestTimeOutInMilliseconds, 1000 * kMeasureLatencyTimeInSec)); >+ test_is_done_.Wait(std::max(kTestTimeOutInMilliseconds, 1000 * kMeasureLatencyTimeInSec)); > StopPlayout(); > StopRecording(); > // Verify that the correct number of transmitted impulses are detected. 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/linux/audio_device_alsa_linux.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/linux/audio_device_alsa_linux.h >index 3d15565b8dc481400ac1a4e02ec6078e8c109887..d60dcafdced16cbca7ed3923f342cf2c13f7b416 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/linux/audio_device_alsa_linux.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/linux/audio_device_alsa_linux.h >@@ -29,7 +29,6 @@ typedef webrtc::adm_linux_alsa::AlsaSymbolTable WebRTCAlsaSymbolTable; > WebRTCAlsaSymbolTable* GetAlsaSymbolTable(); > > namespace webrtc { >-class EventWrapper; > > class AudioDeviceLinuxALSA : public AudioDeviceGeneric { > public: >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/linux/audio_device_pulse_linux.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/linux/audio_device_pulse_linux.cc >index 2157af328c6d3b237420aaef759d2cec87fcb2c3..3706d9176ecd80fe549683313a071c06eb383b9d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/linux/audio_device_pulse_linux.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/linux/audio_device_pulse_linux.cc >@@ -8,11 +8,12 @@ > * be found in the AUTHORS file in the root of the source tree. 
> */ > >-#include "modules/audio_device/audio_device_config.h" >+#include <string.h> >+ > #include "modules/audio_device/linux/audio_device_pulse_linux.h" >+#include "modules/audio_device/linux/latebindingsymboltable_linux.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" >-#include "system_wrappers/include/event_wrapper.h" > > WebRTCPulseSymbolTable* GetPulseSymbolTable() { > static WebRTCPulseSymbolTable* pulse_symbol_table = >@@ -31,10 +32,6 @@ namespace webrtc { > > AudioDeviceLinuxPulse::AudioDeviceLinuxPulse() > : _ptrAudioBuffer(NULL), >- _timeEventRec(*EventWrapper::Create()), >- _timeEventPlay(*EventWrapper::Create()), >- _recStartEvent(*EventWrapper::Create()), >- _playStartEvent(*EventWrapper::Create()), > _inputDeviceIndex(0), > _outputDeviceIndex(0), > _inputDeviceIsSpecified(false), >@@ -111,11 +108,6 @@ AudioDeviceLinuxPulse::~AudioDeviceLinuxPulse() { > delete[] _recDeviceName; > _recDeviceName = NULL; > } >- >- delete &_recStartEvent; >- delete &_playStartEvent; >- delete &_timeEventRec; >- delete &_timeEventPlay; > } > > void AudioDeviceLinuxPulse::AttachAudioBuffer(AudioDeviceBuffer* audioBuffer) { >@@ -1065,7 +1057,7 @@ int32_t AudioDeviceLinuxPulse::StartRecording() { > > // The audio thread will signal when recording has started. > _timeEventRec.Set(); >- if (kEventTimeout == _recStartEvent.Wait(10000)) { >+ if (!_recStartEvent.Wait(10000)) { > { > rtc::CritScope lock(&_critSect); > _startRec = false; >@@ -1180,7 +1172,7 @@ int32_t AudioDeviceLinuxPulse::StartPlayout() { > > // The audio thread will signal when playout has started. 
> _timeEventPlay.Set(); >- if (kEventTimeout == _playStartEvent.Wait(10000)) { >+ if (!_playStartEvent.Wait(10000)) { > { > rtc::CritScope lock(&_critSect); > _startPlay = false; >@@ -1994,14 +1986,8 @@ bool AudioDeviceLinuxPulse::RecThreadFunc(void* pThis) { > } > > bool AudioDeviceLinuxPulse::PlayThreadProcess() { >- switch (_timeEventPlay.Wait(1000)) { >- case kEventSignaled: >- break; >- case kEventError: >- RTC_LOG(LS_WARNING) << "EventWrapper::Wait() failed"; >- return true; >- case kEventTimeout: >- return true; >+ if (!_timeEventPlay.Wait(1000)) { >+ return true; > } > > rtc::CritScope lock(&_critSect); >@@ -2168,14 +2154,8 @@ bool AudioDeviceLinuxPulse::PlayThreadProcess() { > } > > bool AudioDeviceLinuxPulse::RecThreadProcess() { >- switch (_timeEventRec.Wait(1000)) { >- case kEventSignaled: >- break; >- case kEventError: >- RTC_LOG(LS_WARNING) << "EventWrapper::Wait() failed"; >- return true; >- case kEventTimeout: >- return true; >+ if (!_timeEventRec.Wait(1000)) { >+ return true; > } > > rtc::CritScope lock(&_critSect); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/linux/audio_device_pulse_linux.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/linux/audio_device_pulse_linux.h >index 1655b18117fb24e31738eb9cb246a82c72b6d611..4de87fc3e6e3e4ed6e2154ecc197209c7728807f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/linux/audio_device_pulse_linux.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/linux/audio_device_pulse_linux.h >@@ -13,10 +13,16 @@ > > #include <memory> > >+#include "modules/audio_device/audio_device_buffer.h" > #include "modules/audio_device/audio_device_generic.h" >+#include "modules/audio_device/include/audio_device.h" >+#include "modules/audio_device/include/audio_device_defines.h" > #include "modules/audio_device/linux/audio_mixer_manager_pulse_linux.h" >+#include "modules/audio_device/linux/pulseaudiosymboltable_linux.h" > #include 
"rtc_base/criticalsection.h" >+#include "rtc_base/event.h" > #include "rtc_base/platform_thread.h" >+#include "rtc_base/thread_annotations.h" > #include "rtc_base/thread_checker.h" > > #if defined(WEBRTC_USE_X11) >@@ -24,6 +30,8 @@ > #endif > > #include <pulse/pulseaudio.h> >+#include <stddef.h> >+#include <stdint.h> > > // We define this flag if it's missing from our headers, because we want to be > // able to compile against old headers but still use PA_STREAM_ADJUST_LATENCY >@@ -96,7 +104,6 @@ typedef webrtc::adm_linux_pulse::PulseAudioSymbolTable WebRTCPulseSymbolTable; > WebRTCPulseSymbolTable* GetPulseSymbolTable(); > > namespace webrtc { >-class EventWrapper; > > class AudioDeviceLinuxPulse : public AudioDeviceGeneric { > public: >@@ -255,10 +262,10 @@ class AudioDeviceLinuxPulse : public AudioDeviceGeneric { > AudioDeviceBuffer* _ptrAudioBuffer; > > rtc::CriticalSection _critSect; >- EventWrapper& _timeEventRec; >- EventWrapper& _timeEventPlay; >- EventWrapper& _recStartEvent; >- EventWrapper& _playStartEvent; >+ rtc::Event _timeEventRec; >+ rtc::Event _timeEventPlay; >+ rtc::Event _recStartEvent; >+ rtc::Event _playStartEvent; > > // TODO(pbos): Remove unique_ptr and use directly without resetting. > std::unique_ptr<rtc::PlatformThread> _ptrThreadPlay; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/linux/audio_mixer_manager_pulse_linux.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/linux/audio_mixer_manager_pulse_linux.cc >index 5ba923bb911c1d222449e51c8060d7f874188844..22e10407de2ff13f0aa6306462e6a4e97028ee58 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/linux/audio_mixer_manager_pulse_linux.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/linux/audio_mixer_manager_pulse_linux.cc >@@ -8,8 +8,12 @@ > * be found in the AUTHORS file in the root of the source tree. 
> */ > >+#include <stddef.h> >+ > #include "modules/audio_device/linux/audio_device_pulse_linux.h" > #include "modules/audio_device/linux/audio_mixer_manager_pulse_linux.h" >+#include "modules/audio_device/linux/latebindingsymboltable_linux.h" >+#include "modules/audio_device/linux/pulseaudiosymboltable_linux.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/linux/audio_mixer_manager_pulse_linux.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/linux/audio_mixer_manager_pulse_linux.h >index c5460b6237af872252307b2a7fc148180fc6203e..679d1b4b7c26d236e52306d8ad9d1afd85afa927 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/linux/audio_mixer_manager_pulse_linux.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/linux/audio_mixer_manager_pulse_linux.h >@@ -11,13 +11,11 @@ > #ifndef AUDIO_DEVICE_AUDIO_MIXER_MANAGER_PULSE_LINUX_H_ > #define AUDIO_DEVICE_AUDIO_MIXER_MANAGER_PULSE_LINUX_H_ > >-#include "modules/audio_device/include/audio_device.h" >-#include "modules/audio_device/linux/pulseaudiosymboltable_linux.h" >-#include "rtc_base/thread_checker.h" >- > #include <pulse/pulseaudio.h> > #include <stdint.h> > >+#include "rtc_base/thread_checker.h" >+ > #ifndef UINT32_MAX > #define UINT32_MAX ((uint32_t)-1) > #endif >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/mac/audio_device_mac.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/mac/audio_device_mac.cc >index 063ac85736ddc4d1015f2186351ff02d2797f3fd..9daf7bddf7bf73c59de28fe353564c1b8f335f23 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/mac/audio_device_mac.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/mac/audio_device_mac.cc >@@ -16,7 +16,6 @@ > #include "rtc_base/checks.h" > #include "rtc_base/platform_thread.h" > #include "rtc_base/system/arch.h" 
>-#include "system_wrappers/include/event_wrapper.h" > > #include <ApplicationServices/ApplicationServices.h> > #include <libkern/OSAtomic.h> // OSAtomicCompareAndSwap() >@@ -115,8 +114,6 @@ void AudioDeviceMac::logCAMsg(const rtc::LoggingSeverity sev, > > AudioDeviceMac::AudioDeviceMac() > : _ptrAudioBuffer(NULL), >- _stopEventRec(*EventWrapper::Create()), >- _stopEvent(*EventWrapper::Create()), > _mixerManager(), > _inputDeviceIndex(0), > _outputDeviceIndex(0), >@@ -153,9 +150,6 @@ AudioDeviceMac::AudioDeviceMac() > prev_key_state_() { > RTC_LOG(LS_INFO) << __FUNCTION__ << " created"; > >- RTC_DCHECK(&_stopEvent != NULL); >- RTC_DCHECK(&_stopEventRec != NULL); >- > memset(_renderConvertData, 0, sizeof(_renderConvertData)); > memset(&_outStreamFormat, 0, sizeof(AudioStreamBasicDescription)); > memset(&_outDesiredFormat, 0, sizeof(AudioStreamBasicDescription)); >@@ -204,8 +198,6 @@ AudioDeviceMac::~AudioDeviceMac() { > RTC_LOG(LS_ERROR) << "semaphore_destroy() error: " << kernErr; > } > >- delete &_stopEvent; >- delete &_stopEventRec; > } > > // ============================================================================ >@@ -1338,7 +1330,7 @@ int32_t AudioDeviceMac::StopRecording() { > _recording = false; > _doStopRec = true; // Signal to io proc to stop audio device > _critSect.Leave(); // Cannot be under lock, risk of deadlock >- if (kEventTimeout == _stopEventRec.Wait(2000)) { >+ if (!_stopEventRec.Wait(2000)) { > rtc::CritScope critScoped(&_critSect); > RTC_LOG(LS_WARNING) << "Timed out stopping the capture IOProc." > << "We may have failed to detect a device removal."; >@@ -1366,7 +1358,7 @@ int32_t AudioDeviceMac::StopRecording() { > _recording = false; > _doStop = true; // Signal to io proc to stop audio device > _critSect.Leave(); // Cannot be under lock, risk of deadlock >- if (kEventTimeout == _stopEvent.Wait(2000)) { >+ if (!_stopEvent.Wait(2000)) { > rtc::CritScope critScoped(&_critSect); > RTC_LOG(LS_WARNING) << "Timed out stopping the shared IOProc." 
> << "We may have failed to detect a device removal."; >@@ -1474,7 +1466,7 @@ int32_t AudioDeviceMac::StopPlayout() { > _playing = false; > _doStop = true; // Signal to io proc to stop audio device > _critSect.Leave(); // Cannot be under lock, risk of deadlock >- if (kEventTimeout == _stopEvent.Wait(2000)) { >+ if (!_stopEvent.Wait(2000)) { > rtc::CritScope critScoped(&_critSect); > RTC_LOG(LS_WARNING) << "Timed out stopping the render IOProc." > << "We may have failed to detect a device removal."; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/mac/audio_device_mac.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/mac/audio_device_mac.h >index 5cbdae28371a2930e0735a7423d45b66ea4cc135..5a7076065334cf754d756a04eb8170ea6bc8837c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/mac/audio_device_mac.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_device/mac/audio_device_mac.h >@@ -16,9 +16,9 @@ > #include "modules/audio_device/audio_device_generic.h" > #include "modules/audio_device/mac/audio_mixer_manager_mac.h" > #include "rtc_base/criticalsection.h" >+#include "rtc_base/event.h" > #include "rtc_base/logging.h" > #include "rtc_base/thread_annotations.h" >-#include "system_wrappers/include/event_wrapper.h" > > #include <AudioToolbox/AudioConverter.h> > #include <CoreAudio/CoreAudio.h> >@@ -31,7 +31,6 @@ class PlatformThread; > } // namespace rtc > > namespace webrtc { >-class EventWrapper; > > const uint32_t N_REC_SAMPLES_PER_SEC = 48000; > const uint32_t N_PLAY_SAMPLES_PER_SEC = 48000; >@@ -252,8 +251,8 @@ class AudioDeviceMac : public AudioDeviceGeneric { > > rtc::CriticalSection _critSect; > >- EventWrapper& _stopEventRec; >- EventWrapper& _stopEvent; >+ rtc::Event _stopEventRec; >+ rtc::Event _stopEvent; > > // TODO(pbos): Replace with direct members, just start/stop, no need to > // recreate the thread. 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_mixer/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_mixer/BUILD.gn >index 0bb3ae137aeedddfd99c740f113f09b0a451c9f2..f935e83c5ba23f9efedae5542213b7646b8b5f83 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_mixer/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_mixer/BUILD.gn >@@ -37,7 +37,6 @@ rtc_static_library("audio_mixer_impl") { > > deps = [ > ":audio_frame_manipulator", >- "../..:webrtc_common", > "../../api:array_view", > "../../api/audio:audio_frame_api", > "../../api/audio:audio_mixer_api", >@@ -48,6 +47,7 @@ rtc_static_library("audio_mixer_impl") { > "../../system_wrappers", > "../../system_wrappers:metrics", > "../audio_processing", >+ "../audio_processing:api", > "../audio_processing:apm_logging", > "../audio_processing:audio_frame_view", > "../audio_processing/agc2:fixed_digital", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_mixer/audio_mixer_impl.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_mixer/audio_mixer_impl.h >index e8994a674bc605d856497386e1df4d58ad84014b..8edd3b8bdc4d3e4b60036175a845aeee66890721 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_mixer/audio_mixer_impl.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_mixer/audio_mixer_impl.h >@@ -17,6 +17,7 @@ > #include "api/audio/audio_mixer.h" > #include "modules/audio_mixer/frame_combiner.h" > #include "modules/audio_mixer/output_rate_calculator.h" >+#include "rtc_base/criticalsection.h" > #include "rtc_base/race_checker.h" > #include "rtc_base/scoped_ref_ptr.h" > #include "rtc_base/thread_annotations.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_mixer/audio_mixer_test.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_mixer/audio_mixer_test.cc >index 2004aeef463ab0ab902130f732406b99c0ecdc7b..b60787755d836e0282d0bad4d173f5f785281054 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_mixer/audio_mixer_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_mixer/audio_mixer_test.cc >@@ -19,24 +19,25 @@ > #include "rtc_base/flags.h" > #include "rtc_base/strings/string_builder.h" > >-DEFINE_bool(help, false, "Prints this message"); >-DEFINE_int(sampling_rate, >- 16000, >- "Rate at which to mix (all input streams must have this rate)"); >+WEBRTC_DEFINE_bool(help, false, "Prints this message"); >+WEBRTC_DEFINE_int( >+ sampling_rate, >+ 16000, >+ "Rate at which to mix (all input streams must have this rate)"); > >-DEFINE_bool( >+WEBRTC_DEFINE_bool( > stereo, > false, > "Enable stereo (interleaved). Inputs need not be as this parameter."); > >-DEFINE_bool(limiter, true, "Enable limiter."); >-DEFINE_string(output_file, >- "mixed_file.wav", >- "File in which to store the mixed result."); >-DEFINE_string(input_file_1, "", "First input. Default none."); >-DEFINE_string(input_file_2, "", "Second input. Default none."); >-DEFINE_string(input_file_3, "", "Third input. Default none."); >-DEFINE_string(input_file_4, "", "Fourth input. Default none."); >+WEBRTC_DEFINE_bool(limiter, true, "Enable limiter."); >+WEBRTC_DEFINE_string(output_file, >+ "mixed_file.wav", >+ "File in which to store the mixed result."); >+WEBRTC_DEFINE_string(input_file_1, "", "First input. Default none."); >+WEBRTC_DEFINE_string(input_file_2, "", "Second input. Default none."); >+WEBRTC_DEFINE_string(input_file_3, "", "Third input. Default none."); >+WEBRTC_DEFINE_string(input_file_4, "", "Fourth input. 
Default none."); > > namespace webrtc { > namespace test { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_mixer/frame_combiner.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_mixer/frame_combiner.cc >index cff2282e33fe698241da9aa3b204c299128ed2fb..d3493f5c3dd437fdf4432669451267f7d8fbcf5a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_mixer/frame_combiner.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_mixer/frame_combiner.cc >@@ -92,10 +92,10 @@ std::array<OneChannelBuffer, kMaximumAmountOfChannels> MixToFloatFrame( > return mixing_buffer; > } > >-void RunLimiter(AudioFrameView<float> mixing_buffer_view, >- FixedGainController* limiter) { >+void RunLimiter(AudioFrameView<float> mixing_buffer_view, Limiter* limiter) { > const size_t sample_rate = mixing_buffer_view.samples_per_channel() * 1000 / > AudioMixerImpl::kFrameDurationInMs; >+ // TODO(alessiob): Avoid calling SetSampleRate every time. > limiter->SetSampleRate(sample_rate); > limiter->Process(mixing_buffer_view); > } >@@ -117,10 +117,8 @@ void InterleaveToAudioFrame(AudioFrameView<const float> mixing_buffer_view, > > FrameCombiner::FrameCombiner(bool use_limiter) > : data_dumper_(new ApmDataDumper(0)), >- limiter_(data_dumper_.get(), "AudioMixer"), >- use_limiter_(use_limiter) { >- limiter_.SetGain(0.f); >-} >+ limiter_(static_cast<size_t>(48000), data_dumper_.get(), "AudioMixer"), >+ use_limiter_(use_limiter) {} > > FrameCombiner::~FrameCombiner() = default; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_mixer/frame_combiner.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_mixer/frame_combiner.h >index 70ac02775899070a40c752710f57f066f1d735a5..1c1cd53eff3115ee12ec186fefeb037fdc054e06 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_mixer/frame_combiner.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_mixer/frame_combiner.h >@@ -15,11 +15,10 @@ > #include 
<vector> > > #include "api/audio/audio_frame.h" >-#include "modules/audio_processing/agc2/fixed_gain_controller.h" >+#include "modules/audio_processing/agc2/limiter.h" > > namespace webrtc { > class ApmDataDumper; >-class FixedGainController; > > class FrameCombiner { > public: >@@ -45,7 +44,7 @@ class FrameCombiner { > size_t number_of_streams) const; > > std::unique_ptr<ApmDataDumper> data_dumper_; >- FixedGainController limiter_; >+ Limiter limiter_; > const bool use_limiter_; > mutable int uma_logging_counter_ = 0; > }; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/BUILD.gn >index 61b4d495ed2f9161823cc7c53e38ad5d31468ea1..66b07df199e67dacf9245742beda30d144297847 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/BUILD.gn >@@ -26,6 +26,41 @@ config("apm_debug_dump") { > } > } > >+rtc_static_library("config") { >+ visibility = [ ":*" ] >+ sources = [ >+ "include/config.cc", >+ "include/config.h", >+ ] >+ deps = [ >+ "../../rtc_base:macromagic", >+ "../../rtc_base/system:rtc_export", >+ ] >+} >+ >+rtc_source_set("api") { >+ visibility = [ "*" ] >+ sources = [ >+ "include/audio_processing.cc", >+ "include/audio_processing.h", >+ ] >+ deps = [ >+ ":audio_frame_view", >+ ":audio_generator_interface", >+ ":audio_processing_statistics", >+ ":config", >+ ":gain_control_interface", >+ "../../api/audio:aec3_config", >+ "../../api/audio:echo_control", >+ "../../rtc_base:deprecation", >+ "../../rtc_base:macromagic", >+ "../../rtc_base:ptr_util", >+ "../../rtc_base:rtc_base_approved", >+ "../../rtc_base/system:rtc_export", >+ "//third_party/abseil-cpp/absl/types:optional", >+ ] >+} >+ > rtc_static_library("audio_processing") { > visibility = [ "*" ] > configs += [ ":apm_debug_dump" ] >@@ -55,10 +90,6 @@ rtc_static_library("audio_processing") { > 
"gain_controller2.h", > "include/aec_dump.cc", > "include/aec_dump.h", >- "include/audio_processing.cc", >- "include/audio_processing.h", >- "include/config.cc", >- "include/config.h", > "level_estimator_impl.cc", > "level_estimator_impl.h", > "low_cut_filter.cc", >@@ -95,11 +126,13 @@ rtc_static_library("audio_processing") { > > defines = [] > deps = [ >+ ":api", > ":apm_logging", > ":audio_frame_view", > ":audio_generator_interface", > ":audio_processing_c", > ":audio_processing_statistics", >+ ":config", > ":gain_control_interface", > "../..:webrtc_common", > "../../api:array_view", >@@ -115,6 +148,7 @@ rtc_static_library("audio_processing") { > "../../rtc_base:safe_minmax", > "../../rtc_base:sanitizer", > "../../rtc_base/system:arch", >+ "../../rtc_base/system:rtc_export", > "../../system_wrappers:cpu_features_api", > "../../system_wrappers:field_trial", > "../../system_wrappers:metrics", >@@ -162,6 +196,7 @@ rtc_source_set("audio_processing_statistics") { > "include/audio_processing_statistics.h", > ] > deps = [ >+ "../../rtc_base/system:rtc_export", > "//third_party/abseil-cpp/absl/types:optional", > ] > } >@@ -294,6 +329,7 @@ rtc_source_set("apm_logging") { > deps = [ > "../../api:array_view", > "../../common_audio:common_audio", >+ "../../rtc_base:checks", > "../../rtc_base:rtc_base_approved", > "../../rtc_base:stringutils", > ] >@@ -307,6 +343,7 @@ if (rtc_include_tests) { > "include/mock_audio_processing.h", > ] > deps = [ >+ ":api", > ":audio_processing", > ":audio_processing_statistics", > "../../test:test_support", >@@ -358,10 +395,12 @@ if (rtc_include_tests) { > > deps = [ > ":analog_mic_simulation", >+ ":api", > ":apm_logging", > ":audio_frame_view", > ":audio_processing", > ":audioproc_test_utils", >+ ":config", > ":file_audio_generator_unittests", > ":mocks", > "../..:webrtc_common", >@@ -375,6 +414,7 @@ if (rtc_include_tests) { > "../../rtc_base:protobuf_utils", > "../../rtc_base:rtc_base", > "../../rtc_base:rtc_base_approved", >+ 
"../../rtc_base:rtc_base_tests_utils", > "../../rtc_base:safe_minmax", > "../../rtc_base/system:arch", > "../../rtc_base/system:file_wrapper", >@@ -391,6 +431,8 @@ if (rtc_include_tests) { > "agc2:biquad_filter_unittests", > "agc2:fixed_digital_unittests", > "agc2:noise_estimator_unittests", >+ "agc2:rnn_vad_with_level_unittests", >+ "agc2:test_utils", > "agc2/rnn_vad:unittests", > "test/conversational_speech:unittest", > "utility:block_mean_calculator_unittest", >@@ -418,7 +460,6 @@ if (rtc_include_tests) { > ":audioproc_unittest_proto", > ":runtime_settings_protobuf_utils", > "../../api/audio:audio_frame_api", >- "../../rtc_base:rtc_base_tests_utils", > "../../rtc_base:rtc_task_queue", > "aec_dump", > "aec_dump:aec_dump_unittests", >@@ -482,6 +523,7 @@ if (rtc_include_tests) { > ] > > deps = [ >+ ":api", > ":audio_generator_factory", > ":audio_processing", > ":file_audio_generator", >@@ -512,6 +554,7 @@ if (rtc_include_tests) { > if (rtc_enable_protobuf) { > rtc_source_set("audioproc_f_impl") { > testonly = true >+ configs += [ ":apm_debug_dump" ] > sources = [ > "test/aec_dump_based_simulator.cc", > "test/aec_dump_based_simulator.h", >@@ -525,11 +568,14 @@ if (rtc_include_tests) { > > deps = [ > ":analog_mic_simulation", >+ ":api", >+ ":apm_logging", > ":audio_processing", > ":audioproc_debug_proto", > ":audioproc_protobuf_utils", > ":audioproc_test_utils", > ":runtime_settings_protobuf_utils", >+ "../../api/audio:aec3_config_json", > "../../api/audio:aec3_factory", > "../../common_audio:common_audio", > "../../rtc_base:checks", >@@ -544,6 +590,7 @@ if (rtc_include_tests) { > "aec_dump:aec_dump_impl", > "//testing/gtest", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings", > "//third_party/abseil-cpp/absl/types:optional", > ] > } # audioproc_f_impl >@@ -553,6 +600,7 @@ if (rtc_include_tests) { > "test/audioproc_float_main.cc", > ] > deps = [ >+ ":api", > ":audio_processing", > "../../api:audioproc_f_api", > 
"../../rtc_base:rtc_base_approved", >@@ -578,6 +626,7 @@ if (rtc_include_tests) { > ] > > deps = [ >+ ":api", > ":audio_processing", > "../../api:array_view", > "../../api/audio:audio_frame_api", >@@ -662,6 +711,7 @@ if (rtc_include_tests) { > ] > > deps = [ >+ ":api", > ":audio_processing", > ":audioproc_debug_proto", > ":audioproc_protobuf_utils", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec/aec_core.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec/aec_core.cc >index 5a8cf8f58b601f1ab25e499e32a7d1a37ea7b87f..62b8ad0f7c0709e5c34b7f646385c007b9a089cb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec/aec_core.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec/aec_core.cc >@@ -21,6 +21,7 @@ > #include <algorithm> > > #include "rtc_base/checks.h" >+ > extern "C" { > #include "common_audio/ring_buffer.h" > } >@@ -29,7 +30,6 @@ extern "C" { > #include "modules/audio_processing/aec/aec_core_optimized_methods.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" > #include "modules/audio_processing/utility/delay_estimator_wrapper.h" >-#include "rtc_base/checks.h" > #include "rtc_base/system/arch.h" > #include "system_wrappers/include/cpu_features_wrapper.h" > #include "system_wrappers/include/metrics.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec/aec_resampler.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec/aec_resampler.cc >index 2851c0b6dd575d6968b76669db25ee42005772f6..210c2bebe0e2113ad0e808d24f23b9bc808608ad 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec/aec_resampler.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec/aec_resampler.cc >@@ -14,7 +14,6 @@ > > #include "modules/audio_processing/aec/aec_resampler.h" > >-#include <math.h> > #include <stdlib.h> > #include <string.h> > >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec/aec_resampler.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec/aec_resampler.h >index 130f7ec7c76647d1ce836fc232a41776feeaedeb..a112c434d0bc91ba1a9d56ebf1ae31b3618f0feb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec/aec_resampler.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec/aec_resampler.h >@@ -11,6 +11,8 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC_AEC_RESAMPLER_H_ > #define MODULES_AUDIO_PROCESSING_AEC_AEC_RESAMPLER_H_ > >+#include <stddef.h> >+ > #include "modules/audio_processing/aec/aec_core.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/BUILD.gn >index e631732a4539ab1362f15b998bd5c11cef82732c..189bcfd71255339bc7dbf77f0f370cebb31f4e43 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/BUILD.gn >@@ -20,16 +20,21 @@ rtc_static_library("aec3") { > "aec3_fft.h", > "aec_state.cc", > "aec_state.h", >+ "api_call_jitter_metrics.cc", >+ "api_call_jitter_metrics.h", > "block_delay_buffer.cc", > "block_delay_buffer.h", > "block_framer.cc", > "block_framer.h", > "block_processor.cc", > "block_processor.h", >+ "block_processor2.cc", > "block_processor_metrics.cc", > "block_processor_metrics.h", > "cascaded_biquad_filter.cc", > "cascaded_biquad_filter.h", >+ "clockdrift_detector.cc", >+ "clockdrift_detector.h", > "comfort_noise_generator.cc", > "comfort_noise_generator.h", > "decimator.cc", >@@ -76,10 +81,14 @@ rtc_static_library("aec3") { > "render_buffer.h", > "render_delay_buffer.cc", > "render_delay_buffer.h", >+ "render_delay_buffer2.cc", > "render_delay_controller.cc", > "render_delay_controller.h", >+ "render_delay_controller2.cc", > 
"render_delay_controller_metrics.cc", > "render_delay_controller_metrics.h", >+ "render_reverb_model.cc", >+ "render_reverb_model.h", > "render_signal_analyzer.cc", > "render_signal_analyzer.h", > "residual_echo_estimator.cc", >@@ -96,6 +105,8 @@ rtc_static_library("aec3") { > "reverb_model_fallback.h", > "shadow_filter_update_gain.cc", > "shadow_filter_update_gain.h", >+ "signal_dependent_erle_estimator.cc", >+ "signal_dependent_erle_estimator.h", > "skew_estimator.cc", > "skew_estimator.h", > "stationarity_estimator.cc", >@@ -183,11 +194,13 @@ if (rtc_include_tests) { > "adaptive_fir_filter_unittest.cc", > "aec3_fft_unittest.cc", > "aec_state_unittest.cc", >+ "api_call_jitter_metrics_unittest.cc", > "block_delay_buffer_unittest.cc", > "block_framer_unittest.cc", > "block_processor_metrics_unittest.cc", > "block_processor_unittest.cc", > "cascaded_biquad_filter_unittest.cc", >+ "clockdrift_detector_unittest.cc", > "comfort_noise_generator_unittest.cc", > "decimator_unittest.cc", > "echo_canceller3_unittest.cc", >@@ -211,6 +224,7 @@ if (rtc_include_tests) { > "residual_echo_estimator_unittest.cc", > "reverb_model_estimator_unittest.cc", > "shadow_filter_update_gain_unittest.cc", >+ "signal_dependent_erle_estimator_unittest.cc", > "skew_estimator_unittest.cc", > "subtractor_unittest.cc", > "suppression_filter_unittest.cc", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/adaptive_fir_filter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/adaptive_fir_filter.cc >index 9a1e811fcfe3cf52ea9bb9868f91f927d8de7688..3ab1ebcc4b9debcd919eaabb2db3fd08c3a3d950 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/adaptive_fir_filter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/adaptive_fir_filter.cc >@@ -24,7 +24,7 @@ > > #include "modules/audio_processing/aec3/fft_data.h" > #include "rtc_base/checks.h" >-#include "rtc_base/logging.h" >+#include 
"system_wrappers/include/field_trial.h" > > namespace webrtc { > >@@ -417,12 +417,21 @@ void ApplyFilter_SSE2(const RenderBuffer& render_buffer, > > } // namespace aec3 > >+namespace { >+ >+bool EnablePartialFilterReset() { >+ return !field_trial::IsEnabled("WebRTC-Aec3PartialFilterResetKillSwitch"); >+} >+ >+} // namespace >+ > AdaptiveFirFilter::AdaptiveFirFilter(size_t max_size_partitions, > size_t initial_size_partitions, > size_t size_change_duration_blocks, > Aec3Optimization optimization, > ApmDataDumper* data_dumper) > : data_dumper_(data_dumper), >+ use_partial_filter_reset_(EnablePartialFilterReset()), > fft_(), > optimization_(optimization), > max_size_partitions_(max_size_partitions), >@@ -455,20 +464,22 @@ AdaptiveFirFilter::~AdaptiveFirFilter() = default; > void AdaptiveFirFilter::HandleEchoPathChange() { > size_t current_h_size = h_.size(); > h_.resize(GetTimeDomainLength(max_size_partitions_)); >- std::fill(h_.begin(), h_.end(), 0.f); >+ const size_t begin_coeffficient = >+ use_partial_filter_reset_ ? current_h_size : 0; >+ std::fill(h_.begin() + begin_coeffficient, h_.end(), 0.f); > h_.resize(current_h_size); > > size_t current_size_partitions = H_.size(); > H_.resize(max_size_partitions_); >- for (auto& H_j : H_) { >- H_j.Clear(); >- } >- H_.resize(current_size_partitions); >- > H2_.resize(max_size_partitions_); >- for (auto& H2_k : H2_) { >- H2_k.fill(0.f); >+ >+ const size_t begin_partition = >+ use_partial_filter_reset_ ? 
current_size_partitions : 0; >+ for (size_t k = begin_partition; k < max_size_partitions_; ++k) { >+ H_[k].Clear(); >+ H2_[k].fill(0.f); > } >+ H_.resize(current_size_partitions); > H2_.resize(current_size_partitions); > > erl_.fill(0.f); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/adaptive_fir_filter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/adaptive_fir_filter.h >index 5dfb4660fa21c0375cbba5e2c910549bfa0078b4..7c832a6cd23a405c46f8b985a5619e236080af48 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/adaptive_fir_filter.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/adaptive_fir_filter.h >@@ -11,8 +11,8 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_ADAPTIVE_FIR_FILTER_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_ADAPTIVE_FIR_FILTER_H_ > >+#include <stddef.h> > #include <array> >-#include <memory> > #include <vector> > > #include "api/array_view.h" >@@ -164,6 +164,7 @@ class AdaptiveFirFilter { > void UpdateSize(); > > ApmDataDumper* const data_dumper_; >+ const bool use_partial_filter_reset_; > const Aec3Fft fft_; > const Aec3Optimization optimization_; > const size_t max_size_partitions_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/aec3_common.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/aec3_common.cc >index c374a3517d60e37a7991d4ee964b100ed7001d64..aeb848a5700a64820ddccd4f33ff93305b561f4f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/aec3_common.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/aec3_common.cc >@@ -10,10 +10,11 @@ > > #include "modules/audio_processing/aec3/aec3_common.h" > >-#include "system_wrappers/include/cpu_features_wrapper.h" >+#include <stdint.h> > > #include "rtc_base/checks.h" > #include "rtc_base/system/arch.h" >+#include 
"system_wrappers/include/cpu_features_wrapper.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/aec3_fft.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/aec3_fft.cc >index e29434d3002ddbf21443292bdfe19a7328fab10b..1832101855242030ae0a9be625e6bcdd1d639e42 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/aec3_fft.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/aec3_fft.cc >@@ -12,6 +12,7 @@ > > #include <algorithm> > #include <functional> >+#include <iterator> > > #include "rtc_base/checks.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/aec3_fft.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/aec3_fft.h >index b70022255d9564d332fdc2a46a91ad91aa3e7361..6cd649af8f4c96a421334377cba9556f44315ee2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/aec3_fft.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/aec3_fft.h >@@ -17,6 +17,7 @@ > #include "modules/audio_processing/aec3/aec3_common.h" > #include "modules/audio_processing/aec3/fft_data.h" > #include "modules/audio_processing/utility/ooura_fft.h" >+#include "rtc_base/checks.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/aec_state.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/aec_state.cc >index 5e58a1c279c031abf629ff360934858e1e13485b..d5f256b1c1256e9fef27490d5080f292ebbd8aab 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/aec_state.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/aec_state.cc >@@ -11,7 +11,7 @@ > #include "modules/audio_processing/aec3/aec_state.h" > > #include <math.h> >- >+#include <algorithm> > 
#include <numeric> > #include <vector> > >@@ -30,6 +30,22 @@ bool EnableErleResetsAtGainChanges() { > return !field_trial::IsEnabled("WebRTC-Aec3ResetErleAtGainChangesKillSwitch"); > } > >+bool UseLegacyFilterQualityState() { >+ return field_trial::IsEnabled("WebRTC-Aec3FilterQualityStateKillSwitch"); >+} >+ >+bool EnableLegacySaturationBehavior() { >+ return field_trial::IsEnabled("WebRTC-Aec3NewSaturationBehaviorKillSwitch"); >+} >+ >+bool UseSuppressionGainLimiter() { >+ return field_trial::IsEnabled("WebRTC-Aec3GainLimiterDeactivationKillSwitch"); >+} >+bool EnableErleUpdatesDuringReverb() { >+ return !field_trial::IsEnabled( >+ "WebRTC-Aec3EnableErleUpdatesDuringReverbKillSwitch"); >+} >+ > constexpr size_t kBlocksSinceConvergencedFilterInit = 10000; > constexpr size_t kBlocksSinceConsistentEstimateInit = 10000; > >@@ -52,18 +68,10 @@ void AecState::GetResidualEchoScaling( > } > > absl::optional<float> AecState::ErleUncertainty() const { >- bool filter_has_had_time_to_converge; >- if (config_.filter.conservative_initial_phase) { >- filter_has_had_time_to_converge = >- strong_not_saturated_render_blocks_ >= 1.5f * kNumBlocksPerSecond; >- } else { >- filter_has_had_time_to_converge = >- strong_not_saturated_render_blocks_ >= 0.8f * kNumBlocksPerSecond; >- } >- >- if (!filter_has_had_time_to_converge) { >+ if (SaturatedEcho() && use_legacy_saturation_behavior_) { > return 1.f; > } >+ > return absl::nullopt; > } > >@@ -71,22 +79,24 @@ AecState::AecState(const EchoCanceller3Config& config) > : data_dumper_( > new ApmDataDumper(rtc::AtomicOps::Increment(&instance_count_))), > config_(config), >+ use_legacy_saturation_behavior_(EnableLegacySaturationBehavior()), >+ enable_erle_resets_at_gain_changes_(EnableErleResetsAtGainChanges()), >+ enable_erle_updates_during_reverb_(EnableErleUpdatesDuringReverb()), >+ use_legacy_filter_quality_(UseLegacyFilterQualityState()), >+ use_suppressor_gain_limiter_(UseSuppressionGainLimiter()), > initial_state_(config_), > 
delay_state_(config_), > transparent_state_(config_), > filter_quality_state_(config_), >- saturation_detector_(config_), >+ legacy_filter_quality_state_(config_), >+ legacy_saturation_detector_(config_), > erl_estimator_(2 * kNumBlocksPerSecond), >- erle_estimator_(2 * kNumBlocksPerSecond, >- config_.erle.min, >- config_.erle.max_l, >- config_.erle.max_h), >+ erle_estimator_(2 * kNumBlocksPerSecond, config_), > suppression_gain_limiter_(config_), > filter_analyzer_(config_), > echo_audibility_( > config_.echo_audibility.use_stationarity_properties_at_init), >- reverb_model_estimator_(config_), >- enable_erle_resets_at_gain_changes_(EnableErleResetsAtGainChanges()) {} >+ reverb_model_estimator_(config_) {} > > AecState::~AecState() = default; > >@@ -97,13 +107,21 @@ void AecState::HandleEchoPathChange( > capture_signal_saturation_ = false; > strong_not_saturated_render_blocks_ = 0; > blocks_with_active_render_ = 0; >- suppression_gain_limiter_.Reset(); >+ if (use_suppressor_gain_limiter_) { >+ suppression_gain_limiter_.Reset(); >+ } > initial_state_.Reset(); > transparent_state_.Reset(); >- saturation_detector_.Reset(); >+ if (use_legacy_saturation_behavior_) { >+ legacy_saturation_detector_.Reset(); >+ } > erle_estimator_.Reset(true); > erl_estimator_.Reset(); >- filter_quality_state_.Reset(); >+ if (use_legacy_filter_quality_) { >+ legacy_filter_quality_state_.Reset(); >+ } else { >+ filter_quality_state_.Reset(); >+ } > }; > > // TODO(peah): Refine the reset scheme according to the type of gain and >@@ -155,35 +173,56 @@ void AecState::Update( > strong_not_saturated_render_blocks_ += > active_render && !SaturatedCapture() ? 1 : 0; > >- // Update the limit on the echo suppression after an echo path change to >- // avoid an initial echo burst. 
>- suppression_gain_limiter_.Update(render_buffer.GetRenderActivity(), >- TransparentMode()); >- if (subtractor_output_analyzer_.ConvergedFilter()) { >- suppression_gain_limiter_.Deactivate(); >+ if (use_suppressor_gain_limiter_) { >+ // Update the limit on the echo suppression after an echo path change to >+ // avoid an initial echo burst. >+ suppression_gain_limiter_.Update(render_buffer.GetRenderActivity(), >+ TransparentMode()); >+ >+ if (subtractor_output_analyzer_.ConvergedFilter()) { >+ suppression_gain_limiter_.Deactivate(); >+ } > } > >+ std::array<float, kFftLengthBy2Plus1> X2_reverb; >+ render_reverb_.Apply( >+ render_buffer.GetSpectrumBuffer(), delay_state_.DirectPathFilterDelay(), >+ config_.ep_strength.reverb_based_on_render ? ReverbDecay() : 0.f, >+ X2_reverb); >+ > if (config_.echo_audibility.use_stationary_properties) { > // Update the echo audibility evaluator. >- echo_audibility_.Update( >- render_buffer, delay_state_.DirectPathFilterDelay(), >- delay_state_.ExternalDelayReported(), >- config_.ep_strength.reverb_based_on_render ? ReverbDecay() : 0.f); >+ echo_audibility_.Update(render_buffer, >+ render_reverb_.GetReverbContributionPowerSpectrum(), >+ delay_state_.DirectPathFilterDelay(), >+ delay_state_.ExternalDelayReported()); > } > > // Update the ERL and ERLE measures. > if (initial_state_.TransitionTriggered()) { > erle_estimator_.Reset(false); > } >+ > const auto& X2 = render_buffer.Spectrum(delay_state_.DirectPathFilterDelay()); >- erle_estimator_.Update(X2, Y2, E2_main, >+ const auto& X2_input_erle = >+ enable_erle_updates_during_reverb_ ? X2_reverb : X2; >+ >+ erle_estimator_.Update(render_buffer, adaptive_filter_frequency_response, >+ X2_input_erle, Y2, E2_main, > subtractor_output_analyzer_.ConvergedFilter(), > config_.erle.onset_detection); >+ > erl_estimator_.Update(subtractor_output_analyzer_.ConvergedFilter(), X2, Y2); > > // Detect and flag echo saturation. 
>- saturation_detector_.Update(aligned_render_block, SaturatedCapture(), >- EchoPathGain()); >+ if (use_legacy_saturation_behavior_) { >+ legacy_saturation_detector_.Update(aligned_render_block, SaturatedCapture(), >+ EchoPathGain()); >+ } else { >+ saturation_detector_.Update(aligned_render_block, SaturatedCapture(), >+ UsableLinearEstimate(), subtractor_output, >+ EchoPathGain()); >+ } > > // Update the decision on whether to use the initial state parameter set. > initial_state_.Update(active_render, SaturatedCapture()); >@@ -196,11 +235,17 @@ void AecState::Update( > active_render, SaturatedCapture()); > > // Analyze the quality of the filter. >- filter_quality_state_.Update(saturation_detector_.SaturatedEcho(), >- active_render, SaturatedCapture(), >- TransparentMode(), external_delay, >- subtractor_output_analyzer_.ConvergedFilter(), >- subtractor_output_analyzer_.DivergedFilter()); >+ if (use_legacy_filter_quality_) { >+ legacy_filter_quality_state_.Update( >+ SaturatedEcho(), active_render, SaturatedCapture(), TransparentMode(), >+ external_delay, subtractor_output_analyzer_.ConvergedFilter(), >+ subtractor_output_analyzer_.DivergedFilter()); >+ } else { >+ filter_quality_state_.Update(active_render, TransparentMode(), >+ SaturatedCapture(), >+ filter_analyzer_.Consistent(), external_delay, >+ subtractor_output_analyzer_.ConvergedFilter()); >+ } > > // Update the reverb estimate. 
> const bool stationary_block = >@@ -227,8 +272,7 @@ void AecState::Update( > data_dumper_->DumpRaw("aec3_initial_state", > initial_state_.InitialStateActive()); > data_dumper_->DumpRaw("aec3_capture_saturation", SaturatedCapture()); >- data_dumper_->DumpRaw("aec3_echo_saturation", >- saturation_detector_.SaturatedEcho()); >+ data_dumper_->DumpRaw("aec3_echo_saturation", SaturatedEcho()); > data_dumper_->DumpRaw("aec3_converged_filter", > subtractor_output_analyzer_.ConvergedFilter()); > data_dumper_->DumpRaw("aec3_diverged_filter", >@@ -382,6 +426,51 @@ void AecState::TransparentMode::Update(int filter_delay_blocks, > } > > AecState::FilteringQualityAnalyzer::FilteringQualityAnalyzer( >+ const EchoCanceller3Config& config) {} >+ >+void AecState::FilteringQualityAnalyzer::Reset() { >+ usable_linear_estimate_ = false; >+ filter_update_blocks_since_reset_ = 0; >+} >+ >+void AecState::FilteringQualityAnalyzer::Update( >+ bool active_render, >+ bool transparent_mode, >+ bool saturated_capture, >+ bool consistent_estimate_, >+ const absl::optional<DelayEstimate>& external_delay, >+ bool converged_filter) { >+ // Update blocks counter. >+ const bool filter_update = active_render && !saturated_capture; >+ filter_update_blocks_since_reset_ += filter_update ? 1 : 0; >+ filter_update_blocks_since_start_ += filter_update ? 1 : 0; >+ >+ // Store convergence flag when observed. >+ convergence_seen_ = convergence_seen_ || converged_filter; >+ >+ // Verify requirements for achieving a decent filter. The requirements for >+ // filter adaptation at call startup are more restrictive than after an >+ // in-call reset. >+ const bool sufficient_data_to_converge_at_startup = >+ filter_update_blocks_since_start_ > kNumBlocksPerSecond * 0.4f; >+ const bool sufficient_data_to_converge_at_reset = >+ sufficient_data_to_converge_at_startup && >+ filter_update_blocks_since_reset_ > kNumBlocksPerSecond * 0.2f; >+ >+ // The linear filter can only be used if it has had time to converge. 
>+ usable_linear_estimate_ = sufficient_data_to_converge_at_startup && >+ sufficient_data_to_converge_at_reset; >+ >+ // The linear filter can only be used if an external delay or convergence has >+ // been identified. >+ usable_linear_estimate_ = >+ usable_linear_estimate_ && (external_delay || convergence_seen_); >+ >+ // If transparent mode is on, deactivate using the linear filter. >+ usable_linear_estimate_ = usable_linear_estimate_ && !transparent_mode; >+} >+ >+AecState::LegacyFilteringQualityAnalyzer::LegacyFilteringQualityAnalyzer( > const EchoCanceller3Config& config) > : conservative_initial_phase_(config.filter.conservative_initial_phase), > required_blocks_for_convergence_( >@@ -390,7 +479,7 @@ AecState::FilteringQualityAnalyzer::FilteringQualityAnalyzer( > config.echo_removal_control.linear_and_stable_echo_path), > non_converged_sequence_size_(kBlocksSinceConvergencedFilterInit) {} > >-void AecState::FilteringQualityAnalyzer::Reset() { >+void AecState::LegacyFilteringQualityAnalyzer::Reset() { > usable_linear_estimate_ = false; > strong_not_saturated_render_blocks_ = 0; > if (linear_and_stable_echo_path_) { >@@ -402,7 +491,7 @@ void AecState::FilteringQualityAnalyzer::Reset() { > recent_convergence_ = true; > } > >-void AecState::FilteringQualityAnalyzer::Update( >+void AecState::LegacyFilteringQualityAnalyzer::Update( > bool saturated_echo, > bool active_render, > bool saturated_capture, >@@ -454,18 +543,41 @@ void AecState::FilteringQualityAnalyzer::Update( > } > } > >-AecState::SaturationDetector::SaturationDetector( >+void AecState::SaturationDetector::Update( >+ rtc::ArrayView<const float> x, >+ bool saturated_capture, >+ bool usable_linear_estimate, >+ const SubtractorOutput& subtractor_output, >+ float echo_path_gain) { >+ saturated_echo_ = saturated_capture; >+ if (usable_linear_estimate) { >+ constexpr float kSaturationThreshold = 20000.f; >+ saturated_echo_ = >+ saturated_echo_ && >+ (subtractor_output.s_main_max_abs > kSaturationThreshold 
|| >+ subtractor_output.s_shadow_max_abs > kSaturationThreshold); >+ } else { >+ const float max_sample = fabs(*std::max_element( >+ x.begin(), x.end(), [](float a, float b) { return a * a < b * b; })); >+ >+ const float kMargin = 10.f; >+ float peak_echo_amplitude = max_sample * echo_path_gain * kMargin; >+ saturated_echo_ = saturated_echo_ && peak_echo_amplitude > 32000; >+ } >+} >+ >+AecState::LegacySaturationDetector::LegacySaturationDetector( > const EchoCanceller3Config& config) > : echo_can_saturate_(config.ep_strength.echo_can_saturate), > not_saturated_sequence_size_(1000) {} > >-void AecState::SaturationDetector::Reset() { >+void AecState::LegacySaturationDetector::Reset() { > not_saturated_sequence_size_ = 0; > } > >-void AecState::SaturationDetector::Update(rtc::ArrayView<const float> x, >- bool saturated_capture, >- float echo_path_gain) { >+void AecState::LegacySaturationDetector::Update(rtc::ArrayView<const float> x, >+ bool saturated_capture, >+ float echo_path_gain) { > if (!echo_can_saturate_) { > saturated_echo_ = false; > return; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/aec_state.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/aec_state.h >index ca476428a54d84acd22406087dfbe232d64b45fa..c9d9cf3f3a0bd28c4ab6f6622772c7858b2157ea 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/aec_state.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/aec_state.h >@@ -11,9 +11,8 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_AEC_STATE_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_AEC_STATE_H_ > >-#include <math.h> >- >-#include <algorithm> >+#include <stddef.h> >+#include <array> > #include <memory> > #include <vector> > >@@ -28,6 +27,7 @@ > #include "modules/audio_processing/aec3/erle_estimator.h" > #include "modules/audio_processing/aec3/filter_analyzer.h" > #include "modules/audio_processing/aec3/render_buffer.h" >+#include 
"modules/audio_processing/aec3/render_reverb_model.h" > #include "modules/audio_processing/aec3/reverb_model_estimator.h" > #include "modules/audio_processing/aec3/subtractor_output.h" > #include "modules/audio_processing/aec3/subtractor_output_analyzer.h" >@@ -46,11 +46,17 @@ class AecState { > // Returns whether the echo subtractor can be used to determine the residual > // echo. > bool UsableLinearEstimate() const { >+ if (use_legacy_filter_quality_) { >+ return legacy_filter_quality_state_.LinearFilterUsable(); >+ } > return filter_quality_state_.LinearFilterUsable(); > } > > // Returns whether the echo subtractor output should be used as output. > bool UseLinearFilterOutput() const { >+ if (use_legacy_filter_quality_) { >+ return legacy_filter_quality_state_.LinearFilterUsable(); >+ } > return filter_quality_state_.LinearFilterUsable(); > } > >@@ -99,7 +105,11 @@ class AecState { > bool SaturatedCapture() const { return capture_signal_saturation_; } > > // Returns whether the echo signal is saturated. >- bool SaturatedEcho() const { return saturation_detector_.SaturatedEcho(); } >+ bool SaturatedEcho() const { >+ return use_legacy_saturation_behavior_ >+ ? legacy_saturation_detector_.SaturatedEcho() >+ : saturation_detector_.SaturatedEcho(); >+ } > > // Updates the capture signal saturation. > void UpdateCaptureSaturation(bool capture_signal_saturation) { >@@ -122,7 +132,11 @@ class AecState { > > // Returns the upper limit for the echo suppression gain. > float SuppressionGainLimit() const { >- return suppression_gain_limiter_.Limit(); >+ if (use_suppressor_gain_limiter_) { >+ return suppression_gain_limiter_.Limit(); >+ } else { >+ return 1.f; >+ } > } > > // Returns whether the suppression gain limiter is active. 
>@@ -153,13 +167,14 @@ class AecState { > } > > private: >- void UpdateSuppressorGainLimit(bool render_activity); >- bool DetectEchoSaturation(rtc::ArrayView<const float> x, >- float echo_path_gain); >- > static int instance_count_; > std::unique_ptr<ApmDataDumper> data_dumper_; > const EchoCanceller3Config config_; >+ const bool use_legacy_saturation_behavior_; >+ const bool enable_erle_resets_at_gain_changes_; >+ const bool enable_erle_updates_during_reverb_; >+ const bool use_legacy_filter_quality_; >+ const bool use_suppressor_gain_limiter_; > > // Class for controlling the transition from the intial state, which in turn > // controls when the filter parameters for the initial state should be used. >@@ -255,7 +270,37 @@ class AecState { > // suppressor. > class FilteringQualityAnalyzer { > public: >- explicit FilteringQualityAnalyzer(const EchoCanceller3Config& config); >+ FilteringQualityAnalyzer(const EchoCanceller3Config& config); >+ >+ // Returns whether the linear filter can be used for the echo >+ // canceller output. >+ bool LinearFilterUsable() const { return usable_linear_estimate_; } >+ >+ // Resets the state of the analyzer. >+ void Reset(); >+ >+ // Updates the analysis based on new data. >+ void Update(bool active_render, >+ bool transparent_mode, >+ bool saturated_capture, >+ bool consistent_estimate_, >+ const absl::optional<DelayEstimate>& external_delay, >+ bool converged_filter); >+ >+ private: >+ bool usable_linear_estimate_ = false; >+ size_t filter_update_blocks_since_reset_ = 0; >+ size_t filter_update_blocks_since_start_ = 0; >+ bool convergence_seen_ = false; >+ } filter_quality_state_; >+ >+ // Class containing the legacy functionality for analyzing how well the >+ // linear filter is performing, and can be expected to perform, on the >+ // current signals. The purpose of this is to select the echo suppression >+ // functionality as well as the input to the echo suppressor. 
>+ class LegacyFilteringQualityAnalyzer { >+ public: >+ explicit LegacyFilteringQualityAnalyzer(const EchoCanceller3Config& config); > > // Returns whether the the linear filter is can be used for the echo > // canceller output. >@@ -284,14 +329,32 @@ class AecState { > size_t active_non_converged_sequence_size_ = 0; > bool recent_convergence_during_activity_ = false; > bool recent_convergence_ = false; >- } filter_quality_state_; >+ } legacy_filter_quality_state_; > >- // Class for detecting whether the echo is to be considered to be saturated. >- // The purpose of this is to allow customized behavior in the echo suppressor >- // for when the echo is saturated. >+ // Class for detecting whether the echo is to be considered to be >+ // saturated. > class SaturationDetector { > public: >- explicit SaturationDetector(const EchoCanceller3Config& config); >+ // Returns whether the echo is to be considered saturated. >+ bool SaturatedEcho() const { return saturated_echo_; }; >+ >+ // Updates the detection decision based on new data. >+ void Update(rtc::ArrayView<const float> x, >+ bool saturated_capture, >+ bool usable_linear_estimate, >+ const SubtractorOutput& subtractor_output, >+ float echo_path_gain); >+ >+ private: >+ bool saturated_echo_ = false; >+ } saturation_detector_; >+ >+ // Legacy class for detecting whether the echo is to be considered to be >+ // saturated. This is kept as a fallback solution to use instead of the class >+ // SaturationDetector. >+ class LegacySaturationDetector { >+ public: >+ explicit LegacySaturationDetector(const EchoCanceller3Config& config); > > // Returns whether the echo is to be considered saturated. 
> bool SaturatedEcho() const { return saturated_echo_; }; >@@ -308,7 +371,7 @@ class AecState { > const bool echo_can_saturate_; > size_t not_saturated_sequence_size_; > bool saturated_echo_ = false; >- } saturation_detector_; >+ } legacy_saturation_detector_; > > ErlEstimator erl_estimator_; > ErleEstimator erle_estimator_; >@@ -321,8 +384,8 @@ class AecState { > absl::optional<DelayEstimate> external_delay_; > EchoAudibility echo_audibility_; > ReverbModelEstimator reverb_model_estimator_; >+ RenderReverbModel render_reverb_; > SubtractorOutputAnalyzer subtractor_output_analyzer_; >- bool enable_erle_resets_at_gain_changes_ = true; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/aec_state_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/aec_state_unittest.cc >index bf32a0b60720ecee71f32ef34a35b15374fe706d..a331006378162318b6c1c6e71a9332ceeaa0cce5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/aec_state_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/aec_state_unittest.cc >@@ -25,7 +25,7 @@ TEST(AecState, NormalUsage) { > absl::optional<DelayEstimate> delay_estimate = > DelayEstimate(DelayEstimate::Quality::kRefined, 10); > std::unique_ptr<RenderDelayBuffer> render_delay_buffer( >- RenderDelayBuffer::Create(config, 3)); >+ RenderDelayBuffer::Create2(config, 3)); > std::array<float, kFftLengthBy2Plus1> E2_main = {}; > std::array<float, kFftLengthBy2Plus1> Y2 = {}; > std::vector<std::vector<float>> x(3, std::vector<float>(kBlockSize, 0.f)); >@@ -56,7 +56,7 @@ TEST(AecState, NormalUsage) { > std::fill(x[0].begin(), x[0].end(), 101.f); > for (int k = 0; k < 3000; ++k) { > render_delay_buffer->Insert(x); >- output.UpdatePowers(y); >+ output.ComputeMetrics(y); > state.Update(delay_estimate, converged_filter_frequency_response, > impulse_response, 
*render_delay_buffer->GetRenderBuffer(), > E2_main, Y2, output, y); >@@ -65,7 +65,7 @@ TEST(AecState, NormalUsage) { > > // Verify that linear AEC usability becomes false after an echo path change is > // reported >- output.UpdatePowers(y); >+ output.ComputeMetrics(y); > state.HandleEchoPathChange(EchoPathVariability( > false, EchoPathVariability::DelayAdjustment::kBufferReadjustment, false)); > state.Update(delay_estimate, converged_filter_frequency_response, >@@ -76,7 +76,7 @@ TEST(AecState, NormalUsage) { > // Verify that the active render detection works as intended. > std::fill(x[0].begin(), x[0].end(), 101.f); > render_delay_buffer->Insert(x); >- output.UpdatePowers(y); >+ output.ComputeMetrics(y); > state.HandleEchoPathChange(EchoPathVariability( > true, EchoPathVariability::DelayAdjustment::kNewDetectedDelay, false)); > state.Update(delay_estimate, converged_filter_frequency_response, >@@ -86,7 +86,7 @@ TEST(AecState, NormalUsage) { > > for (int k = 0; k < 1000; ++k) { > render_delay_buffer->Insert(x); >- output.UpdatePowers(y); >+ output.ComputeMetrics(y); > state.Update(delay_estimate, converged_filter_frequency_response, > impulse_response, *render_delay_buffer->GetRenderBuffer(), > E2_main, Y2, output, y); >@@ -110,7 +110,7 @@ TEST(AecState, NormalUsage) { > > Y2.fill(10.f * 10000.f * 10000.f); > for (size_t k = 0; k < 1000; ++k) { >- output.UpdatePowers(y); >+ output.ComputeMetrics(y); > state.Update(delay_estimate, converged_filter_frequency_response, > impulse_response, *render_delay_buffer->GetRenderBuffer(), > E2_main, Y2, output, y); >@@ -128,7 +128,7 @@ TEST(AecState, NormalUsage) { > E2_main.fill(1.f * 10000.f * 10000.f); > Y2.fill(10.f * E2_main[0]); > for (size_t k = 0; k < 1000; ++k) { >- output.UpdatePowers(y); >+ output.ComputeMetrics(y); > state.Update(delay_estimate, converged_filter_frequency_response, > impulse_response, *render_delay_buffer->GetRenderBuffer(), > E2_main, Y2, output, y); >@@ -152,7 +152,7 @@ TEST(AecState, NormalUsage) 
{ > E2_main.fill(1.f * 10000.f * 10000.f); > Y2.fill(5.f * E2_main[0]); > for (size_t k = 0; k < 1000; ++k) { >- output.UpdatePowers(y); >+ output.ComputeMetrics(y); > state.Update(delay_estimate, converged_filter_frequency_response, > impulse_response, *render_delay_buffer->GetRenderBuffer(), > E2_main, Y2, output, y); >@@ -179,7 +179,7 @@ TEST(AecState, ConvergedFilterDelay) { > EchoCanceller3Config config; > AecState state(config); > std::unique_ptr<RenderDelayBuffer> render_delay_buffer( >- RenderDelayBuffer::Create(config, 3)); >+ RenderDelayBuffer::Create2(config, 3)); > absl::optional<DelayEstimate> delay_estimate; > std::array<float, kFftLengthBy2Plus1> E2_main; > std::array<float, kFftLengthBy2Plus1> Y2; >@@ -208,7 +208,7 @@ TEST(AecState, ConvergedFilterDelay) { > impulse_response[k * kBlockSize + 1] = 1.f; > > state.HandleEchoPathChange(echo_path_variability); >- output.UpdatePowers(y); >+ output.ComputeMetrics(y); > state.Update(delay_estimate, frequency_response, impulse_response, > *render_delay_buffer->GetRenderBuffer(), E2_main, Y2, output, > y); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/api_call_jitter_metrics.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/api_call_jitter_metrics.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..45f56a5dce6fab1371f244e89e2c025f7a440c98 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/api_call_jitter_metrics.cc >@@ -0,0 +1,121 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "modules/audio_processing/aec3/api_call_jitter_metrics.h" >+ >+#include <algorithm> >+#include <limits> >+ >+#include "modules/audio_processing/aec3/aec3_common.h" >+#include "system_wrappers/include/metrics.h" >+ >+namespace webrtc { >+namespace { >+ >+bool TimeToReportMetrics(int frames_since_last_report) { >+ constexpr int kNumFramesPerSecond = 100; >+ constexpr int kReportingIntervalFrames = 10 * kNumFramesPerSecond; >+ return frames_since_last_report == kReportingIntervalFrames; >+} >+ >+} // namespace >+ >+ApiCallJitterMetrics::Jitter::Jitter() >+ : max_(0), min_(std::numeric_limits<int>::max()) {} >+ >+void ApiCallJitterMetrics::Jitter::Update(int num_api_calls_in_a_row) { >+ min_ = std::min(min_, num_api_calls_in_a_row); >+ max_ = std::max(max_, num_api_calls_in_a_row); >+} >+ >+void ApiCallJitterMetrics::Jitter::Reset() { >+ min_ = std::numeric_limits<int>::max(); >+ max_ = 0; >+} >+ >+void ApiCallJitterMetrics::Reset() { >+ render_jitter_.Reset(); >+ capture_jitter_.Reset(); >+ num_api_calls_in_a_row_ = 0; >+ frames_since_last_report_ = 0; >+ last_call_was_render_ = false; >+ proper_call_observed_ = false; >+} >+ >+void ApiCallJitterMetrics::ReportRenderCall() { >+ if (!last_call_was_render_) { >+ // If the previous call was a capture and a proper call has been observed >+ // (containing both render and capture data), store the last number of >+ // capture calls into the metrics. >+ if (proper_call_observed_) { >+ capture_jitter_.Update(num_api_calls_in_a_row_); >+ } >+ >+ // Reset the call counter to start counting render calls. >+ num_api_calls_in_a_row_ = 0; >+ } >+ ++num_api_calls_in_a_row_; >+ last_call_was_render_ = true; >+} >+ >+void ApiCallJitterMetrics::ReportCaptureCall() { >+ if (last_call_was_render_) { >+ // If the previous call was a render and a proper call has been observed >+ // (containing both render and capture data), store the last number of >+ // render calls into the metrics. 
>+ if (proper_call_observed_) { >+ render_jitter_.Update(num_api_calls_in_a_row_); >+ } >+ // Reset the call counter to start counting capture calls. >+ num_api_calls_in_a_row_ = 0; >+ >+ // If this statement is reached, at least one render and one capture call >+ // have been observed. >+ proper_call_observed_ = true; >+ } >+ ++num_api_calls_in_a_row_; >+ last_call_was_render_ = false; >+ >+ // Only report and update jitter metrics when a proper call, containing >+ // both render and capture data, has been observed. >+ if (proper_call_observed_ && >+ TimeToReportMetrics(++frames_since_last_report_)) { >+ // Report jitter, where the basic unit is frames. >+ constexpr int kMaxJitterToReport = 50; >+ >+ // Report max and min jitter for render and capture, in units of 20 ms. >+ RTC_HISTOGRAM_COUNTS_LINEAR( >+ "WebRTC.Audio.EchoCanceller.MaxRenderJitter", >+ std::min(kMaxJitterToReport, render_jitter().max()), 1, >+ kMaxJitterToReport, kMaxJitterToReport); >+ RTC_HISTOGRAM_COUNTS_LINEAR( >+ "WebRTC.Audio.EchoCanceller.MinRenderJitter", >+ std::min(kMaxJitterToReport, render_jitter().min()), 1, >+ kMaxJitterToReport, kMaxJitterToReport); >+ >+ RTC_HISTOGRAM_COUNTS_LINEAR( >+ "WebRTC.Audio.EchoCanceller.MaxCaptureJitter", >+ std::min(kMaxJitterToReport, capture_jitter().max()), 1, >+ kMaxJitterToReport, kMaxJitterToReport); >+ RTC_HISTOGRAM_COUNTS_LINEAR( >+ "WebRTC.Audio.EchoCanceller.MinCaptureJitter", >+ std::min(kMaxJitterToReport, capture_jitter().min()), 1, >+ kMaxJitterToReport, kMaxJitterToReport); >+ >+ frames_since_last_report_ = 0; >+ Reset(); >+ } >+} >+ >+bool ApiCallJitterMetrics::WillReportMetricsAtNextCapture() const { >+ return TimeToReportMetrics(frames_since_last_report_ + 1); >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/api_call_jitter_metrics.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/api_call_jitter_metrics.h >new file mode 100644 >index 
0000000000000000000000000000000000000000..dd1fa82e9324707292e87ee2f375cf85e717f3db >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/api_call_jitter_metrics.h >@@ -0,0 +1,60 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef MODULES_AUDIO_PROCESSING_AEC3_API_CALL_JITTER_METRICS_H_ >+#define MODULES_AUDIO_PROCESSING_AEC3_API_CALL_JITTER_METRICS_H_ >+ >+namespace webrtc { >+ >+// Stores data for reporting metrics on the API call jitter. >+class ApiCallJitterMetrics { >+ public: >+ class Jitter { >+ public: >+ Jitter(); >+ void Update(int num_api_calls_in_a_row); >+ void Reset(); >+ >+ int min() const { return min_; } >+ int max() const { return max_; } >+ >+ private: >+ int max_; >+ int min_; >+ }; >+ >+ ApiCallJitterMetrics() { Reset(); } >+ >+ // Update metrics for render API call. >+ void ReportRenderCall(); >+ >+ // Update and periodically report metrics for capture API call. >+ void ReportCaptureCall(); >+ >+ // Methods used only for testing. 
>+ const Jitter& render_jitter() const { return render_jitter_; } >+ const Jitter& capture_jitter() const { return capture_jitter_; } >+ bool WillReportMetricsAtNextCapture() const; >+ >+ private: >+ void Reset(); >+ >+ Jitter render_jitter_; >+ Jitter capture_jitter_; >+ >+ int num_api_calls_in_a_row_ = 0; >+ int frames_since_last_report_ = 0; >+ bool last_call_was_render_ = false; >+ bool proper_call_observed_ = false; >+}; >+ >+} // namespace webrtc >+ >+#endif // MODULES_AUDIO_PROCESSING_AEC3_API_CALL_JITTER_METRICS_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/api_call_jitter_metrics_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/api_call_jitter_metrics_unittest.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..86608aa3e1093f97bc370dc9aa6b91bbff4933d4 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/api_call_jitter_metrics_unittest.cc >@@ -0,0 +1,109 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "modules/audio_processing/aec3/api_call_jitter_metrics.h" >+#include "modules/audio_processing/aec3/aec3_common.h" >+ >+#include "test/gtest.h" >+ >+namespace webrtc { >+ >+// Verify constant jitter. 
>+TEST(ApiCallJitterMetrics, ConstantJitter) { >+ for (int jitter = 1; jitter < 20; ++jitter) { >+ ApiCallJitterMetrics metrics; >+ for (size_t k = 0; k < 30 * kNumBlocksPerSecond; ++k) { >+ for (int j = 0; j < jitter; ++j) { >+ metrics.ReportRenderCall(); >+ } >+ >+ for (int j = 0; j < jitter; ++j) { >+ metrics.ReportCaptureCall(); >+ >+ if (metrics.WillReportMetricsAtNextCapture()) { >+ EXPECT_EQ(jitter, metrics.render_jitter().min()); >+ EXPECT_EQ(jitter, metrics.render_jitter().max()); >+ EXPECT_EQ(jitter, metrics.capture_jitter().min()); >+ EXPECT_EQ(jitter, metrics.capture_jitter().max()); >+ } >+ } >+ } >+ } >+} >+ >+// Verify peaky jitter for the render. >+TEST(ApiCallJitterMetrics, JitterPeakRender) { >+ constexpr int kMinJitter = 2; >+ constexpr int kJitterPeak = 10; >+ constexpr int kPeakInterval = 100; >+ >+ ApiCallJitterMetrics metrics; >+ int render_surplus = 0; >+ >+ for (size_t k = 0; k < 30 * kNumBlocksPerSecond; ++k) { >+ const int num_render_calls = >+ k % kPeakInterval == 0 ? kJitterPeak : kMinJitter; >+ for (int j = 0; j < num_render_calls; ++j) { >+ metrics.ReportRenderCall(); >+ ++render_surplus; >+ } >+ >+ ASSERT_LE(kMinJitter, render_surplus); >+ const int num_capture_calls = >+ render_surplus == kMinJitter ? kMinJitter : kMinJitter + 1; >+ for (int j = 0; j < num_capture_calls; ++j) { >+ metrics.ReportCaptureCall(); >+ >+ if (metrics.WillReportMetricsAtNextCapture()) { >+ EXPECT_EQ(kMinJitter, metrics.render_jitter().min()); >+ EXPECT_EQ(kJitterPeak, metrics.render_jitter().max()); >+ EXPECT_EQ(kMinJitter, metrics.capture_jitter().min()); >+ EXPECT_EQ(kMinJitter + 1, metrics.capture_jitter().max()); >+ } >+ --render_surplus; >+ } >+ } >+} >+ >+// Verify peaky jitter for the capture. 
>+TEST(ApiCallJitterMetrics, JitterPeakCapture) { >+ constexpr int kMinJitter = 2; >+ constexpr int kJitterPeak = 10; >+ constexpr int kPeakInterval = 100; >+ >+ ApiCallJitterMetrics metrics; >+ int capture_surplus = kMinJitter; >+ >+ for (size_t k = 0; k < 30 * kNumBlocksPerSecond; ++k) { >+ ASSERT_LE(kMinJitter, capture_surplus); >+ const int num_render_calls = >+ capture_surplus == kMinJitter ? kMinJitter : kMinJitter + 1; >+ for (int j = 0; j < num_render_calls; ++j) { >+ metrics.ReportRenderCall(); >+ --capture_surplus; >+ } >+ >+ const int num_capture_calls = >+ k % kPeakInterval == 0 ? kJitterPeak : kMinJitter; >+ for (int j = 0; j < num_capture_calls; ++j) { >+ metrics.ReportCaptureCall(); >+ >+ if (metrics.WillReportMetricsAtNextCapture()) { >+ EXPECT_EQ(kMinJitter, metrics.render_jitter().min()); >+ EXPECT_EQ(kMinJitter + 1, metrics.render_jitter().max()); >+ EXPECT_EQ(kMinJitter, metrics.capture_jitter().min()); >+ EXPECT_EQ(kJitterPeak, metrics.capture_jitter().max()); >+ } >+ ++capture_surplus; >+ } >+ } >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_delay_buffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_delay_buffer.h >index 6e5fd5c09d0d46e25082d56805ac2dcf1eb4d088..624e9139a3888c90a086d25fc255a166ee389ced 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_delay_buffer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_delay_buffer.h >@@ -11,6 +11,7 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_BLOCK_DELAY_BUFFER_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_BLOCK_DELAY_BUFFER_H_ > >+#include <stddef.h> > #include <vector> > > #include "modules/audio_processing/audio_buffer.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_framer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_framer.cc 
>index 3160624515abae99c92486b7ce308cf828c382db..ca7667c24f4626f34cb5978e2671039c9d37d744 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_framer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_framer.cc >@@ -12,6 +12,7 @@ > > #include <algorithm> > >+#include "modules/audio_processing/aec3/aec3_common.h" > #include "rtc_base/checks.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_processor.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_processor.cc >index 1bb69584cc3fbf7932c1a672217d1f5e0d569a78..ef25e7c23bc754edcc604888aa03c6f2c9b71fc8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_processor.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_processor.cc >@@ -9,12 +9,16 @@ > */ > #include "modules/audio_processing/aec3/block_processor.h" > >+#include <utility> >+ > #include "absl/types/optional.h" > #include "modules/audio_processing/aec3/aec3_common.h" > #include "modules/audio_processing/aec3/block_processor_metrics.h" >+#include "modules/audio_processing/aec3/delay_estimate.h" > #include "modules/audio_processing/aec3/echo_path_variability.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" > #include "rtc_base/atomicops.h" >+#include "rtc_base/checks.h" > #include "rtc_base/constructormagic.h" > #include "rtc_base/logging.h" > >@@ -104,7 +108,7 @@ void BlockProcessorImpl::ProcessCapture( > if (!capture_properly_started_) { > capture_properly_started_ = true; > render_buffer_->Reset(); >- delay_controller_->Reset(); >+ delay_controller_->Reset(true); > } > } else { > // If no render data has yet arrived, do not process the capture signal. 
>@@ -119,7 +123,7 @@ void BlockProcessorImpl::ProcessCapture( > render_properly_started_) { > echo_path_variability.delay_change = > EchoPathVariability::DelayAdjustment::kBufferFlush; >- delay_controller_->Reset(); >+ delay_controller_->Reset(true); > RTC_LOG(LS_WARNING) << "Reset due to render buffer overrun at block " > << capture_call_counter_; > } >@@ -135,11 +139,11 @@ void BlockProcessorImpl::ProcessCapture( > estimated_delay_->quality == DelayEstimate::Quality::kRefined) { > echo_path_variability.delay_change = > EchoPathVariability::DelayAdjustment::kDelayReset; >- delay_controller_->Reset(); >+ delay_controller_->Reset(true); > capture_properly_started_ = false; > render_properly_started_ = false; > >- RTC_LOG(LS_WARNING) << "Reset due to render buffer underrrun at block " >+ RTC_LOG(LS_WARNING) << "Reset due to render buffer underrun at block " > << capture_call_counter_; > } > } else if (render_event_ == RenderDelayBuffer::BufferingEvent::kApiCallSkew) { >@@ -147,7 +151,7 @@ void BlockProcessorImpl::ProcessCapture( > // echo. > echo_path_variability.delay_change = > EchoPathVariability::DelayAdjustment::kDelayReset; >- delay_controller_->Reset(); >+ delay_controller_->Reset(true); > capture_properly_started_ = false; > render_properly_started_ = false; > RTC_LOG(LS_WARNING) << "Reset due to render buffer api skew at block " >@@ -180,7 +184,7 @@ void BlockProcessorImpl::ProcessCapture( > if (estimated_delay_->quality == DelayEstimate::Quality::kRefined) { > echo_path_variability.delay_change = > EchoPathVariability::DelayAdjustment::kDelayReset; >- delay_controller_->Reset(); >+ delay_controller_->Reset(true); > render_buffer_->Reset(); > capture_properly_started_ = false; > render_properly_started_ = false; >@@ -190,6 +194,8 @@ void BlockProcessorImpl::ProcessCapture( > } > } > >+ echo_path_variability.clock_drift = delay_controller_->HasClockdrift(); >+ > // Remove the echo from the capture signal. 
> echo_remover_->ProcessCapture( > echo_path_variability, capture_signal_saturation, estimated_delay_, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_processor.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_processor.h >index a3967ea7e4a41e577144e9c206b15f38a984b79f..5f7ec7c17c3c86b7892b30f3b1cb85fb0e1db655 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_processor.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_processor.h >@@ -11,9 +11,12 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_BLOCK_PROCESSOR_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_BLOCK_PROCESSOR_H_ > >+#include <stddef.h> > #include <memory> > #include <vector> > >+#include "api/audio/echo_canceller3_config.h" >+#include "api/audio/echo_control.h" > #include "modules/audio_processing/aec3/echo_remover.h" > #include "modules/audio_processing/aec3/render_delay_buffer.h" > #include "modules/audio_processing/aec3/render_delay_controller.h" >@@ -23,19 +26,33 @@ namespace webrtc { > // Class for performing echo cancellation on 64 sample blocks of audio data. > class BlockProcessor { > public: >+ // Create a block processor with the legacy render buffering. > static BlockProcessor* Create(const EchoCanceller3Config& config, > int sample_rate_hz); >+ // Create a block processor with the new render buffering. >+ static BlockProcessor* Create2(const EchoCanceller3Config& config, >+ int sample_rate_hz); > // Only used for testing purposes. 
> static BlockProcessor* Create( > const EchoCanceller3Config& config, > int sample_rate_hz, > std::unique_ptr<RenderDelayBuffer> render_buffer); >+ static BlockProcessor* Create2( >+ const EchoCanceller3Config& config, >+ int sample_rate_hz, >+ std::unique_ptr<RenderDelayBuffer> render_buffer); > static BlockProcessor* Create( > const EchoCanceller3Config& config, > int sample_rate_hz, > std::unique_ptr<RenderDelayBuffer> render_buffer, > std::unique_ptr<RenderDelayController> delay_controller, > std::unique_ptr<EchoRemover> echo_remover); >+ static BlockProcessor* Create2( >+ const EchoCanceller3Config& config, >+ int sample_rate_hz, >+ std::unique_ptr<RenderDelayBuffer> render_buffer, >+ std::unique_ptr<RenderDelayController> delay_controller, >+ std::unique_ptr<EchoRemover> echo_remover); > > virtual ~BlockProcessor() = default; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_processor2.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_processor2.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..30bd3ee5ca2952ca227e41b61c402056340b1d4f >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_processor2.cc >@@ -0,0 +1,256 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+#include <stddef.h> >+#include <memory> >+#include <utility> >+#include <vector> >+ >+#include "absl/types/optional.h" >+#include "api/audio/echo_canceller3_config.h" >+#include "api/audio/echo_control.h" >+#include "modules/audio_processing/aec3/aec3_common.h" >+#include "modules/audio_processing/aec3/block_processor.h" >+#include "modules/audio_processing/aec3/block_processor_metrics.h" >+#include "modules/audio_processing/aec3/delay_estimate.h" >+#include "modules/audio_processing/aec3/echo_path_variability.h" >+#include "modules/audio_processing/aec3/echo_remover.h" >+#include "modules/audio_processing/aec3/render_delay_buffer.h" >+#include "modules/audio_processing/aec3/render_delay_controller.h" >+#include "modules/audio_processing/logging/apm_data_dumper.h" >+#include "rtc_base/atomicops.h" >+#include "rtc_base/checks.h" >+#include "rtc_base/logging.h" >+ >+namespace webrtc { >+namespace { >+ >+enum class BlockProcessorApiCall { kCapture, kRender }; >+ >+class BlockProcessorImpl2 final : public BlockProcessor { >+ public: >+ BlockProcessorImpl2(const EchoCanceller3Config& config, >+ int sample_rate_hz, >+ std::unique_ptr<RenderDelayBuffer> render_buffer, >+ std::unique_ptr<RenderDelayController> delay_controller, >+ std::unique_ptr<EchoRemover> echo_remover); >+ >+ BlockProcessorImpl2() = delete; >+ >+ ~BlockProcessorImpl2() override; >+ >+ void ProcessCapture(bool echo_path_gain_change, >+ bool capture_signal_saturation, >+ std::vector<std::vector<float>>* capture_block) override; >+ >+ void BufferRender(const std::vector<std::vector<float>>& block) override; >+ >+ void UpdateEchoLeakageStatus(bool leakage_detected) override; >+ >+ void GetMetrics(EchoControl::Metrics* metrics) const override; >+ >+ void SetAudioBufferDelay(size_t delay_ms) override; >+ >+ private: >+ static int instance_count_; >+ std::unique_ptr<ApmDataDumper> data_dumper_; >+ const EchoCanceller3Config config_; >+ bool capture_properly_started_ = false; >+ bool 
render_properly_started_ = false; >+ const size_t sample_rate_hz_; >+ std::unique_ptr<RenderDelayBuffer> render_buffer_; >+ std::unique_ptr<RenderDelayController> delay_controller_; >+ std::unique_ptr<EchoRemover> echo_remover_; >+ BlockProcessorMetrics metrics_; >+ RenderDelayBuffer::BufferingEvent render_event_; >+ size_t capture_call_counter_ = 0; >+ absl::optional<DelayEstimate> estimated_delay_; >+ absl::optional<int> echo_remover_delay_; >+}; >+ >+int BlockProcessorImpl2::instance_count_ = 0; >+ >+BlockProcessorImpl2::BlockProcessorImpl2( >+ const EchoCanceller3Config& config, >+ int sample_rate_hz, >+ std::unique_ptr<RenderDelayBuffer> render_buffer, >+ std::unique_ptr<RenderDelayController> delay_controller, >+ std::unique_ptr<EchoRemover> echo_remover) >+ : data_dumper_( >+ new ApmDataDumper(rtc::AtomicOps::Increment(&instance_count_))), >+ config_(config), >+ sample_rate_hz_(sample_rate_hz), >+ render_buffer_(std::move(render_buffer)), >+ delay_controller_(std::move(delay_controller)), >+ echo_remover_(std::move(echo_remover)), >+ render_event_(RenderDelayBuffer::BufferingEvent::kNone) { >+ RTC_DCHECK(ValidFullBandRate(sample_rate_hz_)); >+} >+ >+BlockProcessorImpl2::~BlockProcessorImpl2() = default; >+ >+void BlockProcessorImpl2::ProcessCapture( >+ bool echo_path_gain_change, >+ bool capture_signal_saturation, >+ std::vector<std::vector<float>>* capture_block) { >+ RTC_DCHECK(capture_block); >+ RTC_DCHECK_EQ(NumBandsForRate(sample_rate_hz_), capture_block->size()); >+ RTC_DCHECK_EQ(kBlockSize, (*capture_block)[0].size()); >+ >+ capture_call_counter_++; >+ >+ data_dumper_->DumpRaw("aec3_processblock_call_order", >+ static_cast<int>(BlockProcessorApiCall::kCapture)); >+ data_dumper_->DumpWav("aec3_processblock_capture_input", kBlockSize, >+ &(*capture_block)[0][0], >+ LowestBandRate(sample_rate_hz_), 1); >+ >+ if (render_properly_started_) { >+ if (!capture_properly_started_) { >+ capture_properly_started_ = true; >+ render_buffer_->Reset(); >+ 
delay_controller_->Reset(true); >+ } >+ } else { >+ // If no render data has yet arrived, do not process the capture signal. >+ return; >+ } >+ >+ EchoPathVariability echo_path_variability( >+ echo_path_gain_change, EchoPathVariability::DelayAdjustment::kNone, >+ false); >+ >+ if (render_event_ == RenderDelayBuffer::BufferingEvent::kRenderOverrun && >+ render_properly_started_) { >+ echo_path_variability.delay_change = >+ EchoPathVariability::DelayAdjustment::kBufferFlush; >+ delay_controller_->Reset(true); >+ RTC_LOG(LS_WARNING) << "Reset due to render buffer overrun at block " >+ << capture_call_counter_; >+ } >+ render_event_ = RenderDelayBuffer::BufferingEvent::kNone; >+ >+ // Update the render buffers with any newly arrived render blocks and prepare >+ // the render buffers for reading the render data corresponding to the current >+ // capture block. >+ RenderDelayBuffer::BufferingEvent buffer_event = >+ render_buffer_->PrepareCaptureProcessing(); >+ // Reset the delay controller at render buffer underrun. >+ if (buffer_event == RenderDelayBuffer::BufferingEvent::kRenderUnderrun) { >+ delay_controller_->Reset(false); >+ } >+ >+ data_dumper_->DumpWav("aec3_processblock_capture_input2", kBlockSize, >+ &(*capture_block)[0][0], >+ LowestBandRate(sample_rate_hz_), 1); >+ >+ // Compute and apply the render delay required to achieve proper signal >+ // alignment. 
>+ estimated_delay_ = delay_controller_->GetDelay( >+ render_buffer_->GetDownsampledRenderBuffer(), render_buffer_->Delay(), >+ echo_remover_delay_, (*capture_block)[0]); >+ >+ if (estimated_delay_) { >+ bool delay_change = render_buffer_->SetDelay(estimated_delay_->delay); >+ if (delay_change) { >+ RTC_LOG(LS_WARNING) << "Delay changed to " << estimated_delay_->delay >+ << " at block " << capture_call_counter_; >+ echo_path_variability.delay_change = >+ EchoPathVariability::DelayAdjustment::kNewDetectedDelay; >+ } >+ } >+ >+ echo_path_variability.clock_drift = delay_controller_->HasClockdrift(); >+ >+ // Remove the echo from the capture signal. >+ echo_remover_->ProcessCapture( >+ echo_path_variability, capture_signal_saturation, estimated_delay_, >+ render_buffer_->GetRenderBuffer(), capture_block); >+ >+ // Check to see if a refined delay estimate has been obtained from the echo >+ // remover. >+ echo_remover_delay_ = echo_remover_->Delay(); >+ >+ // Update the metrics. >+ metrics_.UpdateCapture(false); >+} >+ >+void BlockProcessorImpl2::BufferRender( >+ const std::vector<std::vector<float>>& block) { >+ RTC_DCHECK_EQ(NumBandsForRate(sample_rate_hz_), block.size()); >+ RTC_DCHECK_EQ(kBlockSize, block[0].size()); >+ data_dumper_->DumpRaw("aec3_processblock_call_order", >+ static_cast<int>(BlockProcessorApiCall::kRender)); >+ data_dumper_->DumpWav("aec3_processblock_render_input", kBlockSize, >+ &block[0][0], LowestBandRate(sample_rate_hz_), 1); >+ data_dumper_->DumpWav("aec3_processblock_render_input2", kBlockSize, >+ &block[0][0], LowestBandRate(sample_rate_hz_), 1); >+ >+ render_event_ = render_buffer_->Insert(block); >+ >+ metrics_.UpdateRender(render_event_ != >+ RenderDelayBuffer::BufferingEvent::kNone); >+ >+ render_properly_started_ = true; >+ delay_controller_->LogRenderCall(); >+} >+ >+void BlockProcessorImpl2::UpdateEchoLeakageStatus(bool leakage_detected) { >+ echo_remover_->UpdateEchoLeakageStatus(leakage_detected); >+} >+ >+void 
BlockProcessorImpl2::GetMetrics(EchoControl::Metrics* metrics) const { >+ echo_remover_->GetMetrics(metrics); >+ const int block_size_ms = sample_rate_hz_ == 8000 ? 8 : 4; >+ absl::optional<size_t> delay = render_buffer_->Delay(); >+ metrics->delay_ms = delay ? static_cast<int>(*delay) * block_size_ms : 0; >+} >+ >+void BlockProcessorImpl2::SetAudioBufferDelay(size_t delay_ms) { >+ render_buffer_->SetAudioBufferDelay(delay_ms); >+} >+ >+} // namespace >+ >+BlockProcessor* BlockProcessor::Create2(const EchoCanceller3Config& config, >+ int sample_rate_hz) { >+ std::unique_ptr<RenderDelayBuffer> render_buffer( >+ RenderDelayBuffer::Create2(config, NumBandsForRate(sample_rate_hz))); >+ std::unique_ptr<RenderDelayController> delay_controller( >+ RenderDelayController::Create2(config, sample_rate_hz)); >+ std::unique_ptr<EchoRemover> echo_remover( >+ EchoRemover::Create(config, sample_rate_hz)); >+ return Create2(config, sample_rate_hz, std::move(render_buffer), >+ std::move(delay_controller), std::move(echo_remover)); >+} >+ >+BlockProcessor* BlockProcessor::Create2( >+ const EchoCanceller3Config& config, >+ int sample_rate_hz, >+ std::unique_ptr<RenderDelayBuffer> render_buffer) { >+ std::unique_ptr<RenderDelayController> delay_controller( >+ RenderDelayController::Create2(config, sample_rate_hz)); >+ std::unique_ptr<EchoRemover> echo_remover( >+ EchoRemover::Create(config, sample_rate_hz)); >+ return Create2(config, sample_rate_hz, std::move(render_buffer), >+ std::move(delay_controller), std::move(echo_remover)); >+} >+ >+BlockProcessor* BlockProcessor::Create2( >+ const EchoCanceller3Config& config, >+ int sample_rate_hz, >+ std::unique_ptr<RenderDelayBuffer> render_buffer, >+ std::unique_ptr<RenderDelayController> delay_controller, >+ std::unique_ptr<EchoRemover> echo_remover) { >+ return new BlockProcessorImpl2( >+ config, sample_rate_hz, std::move(render_buffer), >+ std::move(delay_controller), std::move(echo_remover)); >+} >+ >+} // namespace webrtc >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_processor_metrics.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_processor_metrics.cc >index c8bdda73898ec0601d5669761f77b515141f7ac1..deac1fcd221a05e420e51eade373b77b90e02046 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_processor_metrics.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_processor_metrics.cc >@@ -11,6 +11,7 @@ > #include "modules/audio_processing/aec3/block_processor_metrics.h" > > #include "modules/audio_processing/aec3/aec3_common.h" >+#include "rtc_base/checks.h" > #include "system_wrappers/include/metrics.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_processor_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_processor_unittest.cc >index 71457868b24adbdbcd621bcdaad3f323b8fda986..8aba5b5661e5d594f6f65a27860e5a5192bffdda 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_processor_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/block_processor_unittest.cc >@@ -37,7 +37,7 @@ using testing::_; > // methods are callable. 
> void RunBasicSetupAndApiCallTest(int sample_rate_hz, int num_iterations) { > std::unique_ptr<BlockProcessor> block_processor( >- BlockProcessor::Create(EchoCanceller3Config(), sample_rate_hz)); >+ BlockProcessor::Create2(EchoCanceller3Config(), sample_rate_hz)); > std::vector<std::vector<float>> block(NumBandsForRate(sample_rate_hz), > std::vector<float>(kBlockSize, 1000.f)); > >@@ -51,7 +51,7 @@ void RunBasicSetupAndApiCallTest(int sample_rate_hz, int num_iterations) { > #if RTC_DCHECK_IS_ON && GTEST_HAS_DEATH_TEST && !defined(WEBRTC_ANDROID) > void RunRenderBlockSizeVerificationTest(int sample_rate_hz) { > std::unique_ptr<BlockProcessor> block_processor( >- BlockProcessor::Create(EchoCanceller3Config(), sample_rate_hz)); >+ BlockProcessor::Create2(EchoCanceller3Config(), sample_rate_hz)); > std::vector<std::vector<float>> block( > NumBandsForRate(sample_rate_hz), std::vector<float>(kBlockSize - 1, 0.f)); > >@@ -60,7 +60,7 @@ void RunRenderBlockSizeVerificationTest(int sample_rate_hz) { > > void RunCaptureBlockSizeVerificationTest(int sample_rate_hz) { > std::unique_ptr<BlockProcessor> block_processor( >- BlockProcessor::Create(EchoCanceller3Config(), sample_rate_hz)); >+ BlockProcessor::Create2(EchoCanceller3Config(), sample_rate_hz)); > std::vector<std::vector<float>> block( > NumBandsForRate(sample_rate_hz), std::vector<float>(kBlockSize - 1, 0.f)); > >@@ -72,7 +72,7 @@ void RunRenderNumBandsVerificationTest(int sample_rate_hz) { > ? NumBandsForRate(sample_rate_hz) + 1 > : 1; > std::unique_ptr<BlockProcessor> block_processor( >- BlockProcessor::Create(EchoCanceller3Config(), sample_rate_hz)); >+ BlockProcessor::Create2(EchoCanceller3Config(), sample_rate_hz)); > std::vector<std::vector<float>> block(wrong_num_bands, > std::vector<float>(kBlockSize, 0.f)); > >@@ -84,7 +84,7 @@ void RunCaptureNumBandsVerificationTest(int sample_rate_hz) { > ? 
NumBandsForRate(sample_rate_hz) + 1 > : 1; > std::unique_ptr<BlockProcessor> block_processor( >- BlockProcessor::Create(EchoCanceller3Config(), sample_rate_hz)); >+ BlockProcessor::Create2(EchoCanceller3Config(), sample_rate_hz)); > std::vector<std::vector<float>> block(wrong_num_bands, > std::vector<float>(kBlockSize, 0.f)); > >@@ -124,7 +124,7 @@ TEST(BlockProcessor, DISABLED_DelayControllerIntegration) { > EXPECT_CALL(*render_delay_buffer_mock, Delay()) > .Times(kNumBlocks + 1) > .WillRepeatedly(Return(0)); >- std::unique_ptr<BlockProcessor> block_processor(BlockProcessor::Create( >+ std::unique_ptr<BlockProcessor> block_processor(BlockProcessor::Create2( > EchoCanceller3Config(), rate, std::move(render_delay_buffer_mock))); > > std::vector<std::vector<float>> render_block( >@@ -173,7 +173,7 @@ TEST(BlockProcessor, DISABLED_SubmoduleIntegration) { > EXPECT_CALL(*echo_remover_mock, UpdateEchoLeakageStatus(_)) > .Times(kNumBlocks); > >- std::unique_ptr<BlockProcessor> block_processor(BlockProcessor::Create( >+ std::unique_ptr<BlockProcessor> block_processor(BlockProcessor::Create2( > EchoCanceller3Config(), rate, std::move(render_delay_buffer_mock), > std::move(render_delay_controller_mock), std::move(echo_remover_mock))); > >@@ -239,7 +239,7 @@ TEST(BlockProcessor, VerifyCaptureNumBandsCheck) { > // Verifiers that the verification for null ProcessCapture input works. > TEST(BlockProcessor, NullProcessCaptureParameter) { > EXPECT_DEATH(std::unique_ptr<BlockProcessor>( >- BlockProcessor::Create(EchoCanceller3Config(), 8000)) >+ BlockProcessor::Create2(EchoCanceller3Config(), 8000)) > ->ProcessCapture(false, false, nullptr), > ""); > } >@@ -249,7 +249,7 @@ TEST(BlockProcessor, NullProcessCaptureParameter) { > // tests on test bots has been fixed. 
> TEST(BlockProcessor, DISABLED_WrongSampleRate) { > EXPECT_DEATH(std::unique_ptr<BlockProcessor>( >- BlockProcessor::Create(EchoCanceller3Config(), 8001)), >+ BlockProcessor::Create2(EchoCanceller3Config(), 8001)), > ""); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/cascaded_biquad_filter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/cascaded_biquad_filter.cc >index 333d2267a418cfb1419a6a2a6bfe8bb55463dfde..5dfd7c54e2b4e4cf77a2600839c38272d073a93e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/cascaded_biquad_filter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/cascaded_biquad_filter.cc >@@ -9,6 +9,8 @@ > */ > #include "modules/audio_processing/aec3/cascaded_biquad_filter.h" > >+#include <algorithm> >+ > #include "rtc_base/checks.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/cascaded_biquad_filter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/cascaded_biquad_filter.h >index 1e09fa6e5a4204761bc2411cc9b3a87a8aed9948..2a3b6d65c5a0f7510c060ff08eb5ab0005f00575 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/cascaded_biquad_filter.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/cascaded_biquad_filter.h >@@ -11,6 +11,7 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_CASCADED_BIQUAD_FILTER_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_CASCADED_BIQUAD_FILTER_H_ > >+#include <stddef.h> > #include <complex> > #include <vector> > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/clockdrift_detector.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/clockdrift_detector.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..2c49b795c43da5363fe286715f61b4837123c4d7 >--- /dev/null >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/clockdrift_detector.cc >@@ -0,0 +1,61 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+#include "modules/audio_processing/aec3/clockdrift_detector.h" >+ >+namespace webrtc { >+ >+ClockdriftDetector::ClockdriftDetector() >+ : level_(Level::kNone), stability_counter_(0) { >+ delay_history_.fill(0); >+} >+ >+ClockdriftDetector::~ClockdriftDetector() = default; >+ >+void ClockdriftDetector::Update(int delay_estimate) { >+ if (delay_estimate == delay_history_[0]) { >+ // Reset clockdrift level if delay estimate is stable for 7500 blocks (30 >+ // seconds). >+ if (++stability_counter_ > 7500) >+ level_ = Level::kNone; >+ return; >+ } >+ >+ stability_counter_ = 0; >+ const int d1 = delay_history_[0] - delay_estimate; >+ const int d2 = delay_history_[1] - delay_estimate; >+ const int d3 = delay_history_[2] - delay_estimate; >+ >+ // Patterns recognized as positive clockdrift: >+ // [x-3], x-2, x-1, x. >+ // [x-3], x-1, x-2, x. >+ const bool probable_drift_up = >+ (d1 == -1 && d2 == -2) || (d1 == -2 && d2 == -1); >+ const bool drift_up = probable_drift_up && d3 == -3; >+ >+ // Patterns recognized as negative clockdrift: >+ // [x+3], x+2, x+1, x. >+ // [x+3], x+1, x+2, x. >+ const bool probable_drift_down = (d1 == 1 && d2 == 2) || (d1 == 2 && d2 == 1); >+ const bool drift_down = probable_drift_down && d3 == 3; >+ >+ // Set clockdrift level. 
>+ if (drift_up || drift_down) { >+ level_ = Level::kVerified; >+ } else if ((probable_drift_up || probable_drift_down) && >+ level_ == Level::kNone) { >+ level_ = Level::kProbable; >+ } >+ >+ // Shift delay history one step. >+ delay_history_[2] = delay_history_[1]; >+ delay_history_[1] = delay_history_[0]; >+ delay_history_[0] = delay_estimate; >+} >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/clockdrift_detector.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/clockdrift_detector.h >new file mode 100644 >index 0000000000000000000000000000000000000000..22528c948922a80cdb94fc364a40a78a8abb2fe7 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/clockdrift_detector.h >@@ -0,0 +1,38 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef MODULES_AUDIO_PROCESSING_AEC3_CLOCKDRIFT_DETECTOR_H_ >+#define MODULES_AUDIO_PROCESSING_AEC3_CLOCKDRIFT_DETECTOR_H_ >+ >+#include <array> >+ >+namespace webrtc { >+ >+class ApmDataDumper; >+struct DownsampledRenderBuffer; >+struct EchoCanceller3Config; >+ >+// Detects clockdrift by analyzing the estimated delay. 
>+class ClockdriftDetector { >+ public: >+ enum class Level { kNone, kProbable, kVerified, kNumCategories }; >+ ClockdriftDetector(); >+ ~ClockdriftDetector(); >+ void Update(int delay_estimate); >+ Level ClockdriftLevel() const { return level_; } >+ >+ private: >+ std::array<int, 3> delay_history_; >+ Level level_; >+ size_t stability_counter_; >+}; >+} // namespace webrtc >+ >+#endif // MODULES_AUDIO_PROCESSING_AEC3_CLOCKDRIFT_DETECTOR_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/clockdrift_detector_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/clockdrift_detector_unittest.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..0f98b01d3aae3ebe167c9d9a4528ab43f8065d7f >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/clockdrift_detector_unittest.cc >@@ -0,0 +1,57 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "modules/audio_processing/aec3/clockdrift_detector.h" >+ >+#include "test/gtest.h" >+ >+namespace webrtc { >+TEST(ClockdriftDetector, ClockdriftDetector) { >+ ClockdriftDetector c; >+ // No clockdrift at start. >+ EXPECT_TRUE(c.ClockdriftLevel() == ClockdriftDetector::Level::kNone); >+ >+ // Monotonically increasing delay. 
>+ for (int i = 0; i < 100; i++) >+ c.Update(1000); >+ EXPECT_TRUE(c.ClockdriftLevel() == ClockdriftDetector::Level::kNone); >+ for (int i = 0; i < 100; i++) >+ c.Update(1001); >+ EXPECT_TRUE(c.ClockdriftLevel() == ClockdriftDetector::Level::kNone); >+ for (int i = 0; i < 100; i++) >+ c.Update(1002); >+ // Probable clockdrift. >+ EXPECT_TRUE(c.ClockdriftLevel() == ClockdriftDetector::Level::kProbable); >+ for (int i = 0; i < 100; i++) >+ c.Update(1003); >+ // Verified clockdrift. >+ EXPECT_TRUE(c.ClockdriftLevel() == ClockdriftDetector::Level::kVerified); >+ >+ // Stable delay. >+ for (int i = 0; i < 10000; i++) >+ c.Update(1003); >+ // No clockdrift. >+ EXPECT_TRUE(c.ClockdriftLevel() == ClockdriftDetector::Level::kNone); >+ >+ // Decreasing delay. >+ for (int i = 0; i < 100; i++) >+ c.Update(1001); >+ for (int i = 0; i < 100; i++) >+ c.Update(999); >+ // Probable clockdrift. >+ EXPECT_TRUE(c.ClockdriftLevel() == ClockdriftDetector::Level::kProbable); >+ for (int i = 0; i < 100; i++) >+ c.Update(1000); >+ for (int i = 0; i < 100; i++) >+ c.Update(998); >+ // Verified clockdrift. 
>+ EXPECT_TRUE(c.ClockdriftLevel() == ClockdriftDetector::Level::kVerified); >+} >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/comfort_noise_generator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/comfort_noise_generator.cc >index 55faf4b5291a175bbe4bdab351614cd07beecf51..766e65863981cafba09dd51085def6f26c1f89f5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/comfort_noise_generator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/comfort_noise_generator.cc >@@ -16,13 +16,16 @@ > #if defined(WEBRTC_ARCH_X86_FAMILY) > #include <emmintrin.h> > #endif >-#include <math.h> > #include <algorithm> > #include <array> >+#include <cmath> >+#include <cstdint> > #include <functional> > #include <numeric> > > #include "common_audio/signal_processing/include/signal_processing_library.h" >+#include "modules/audio_processing/aec3/vector_math.h" >+#include "rtc_base/checks.h" > > namespace webrtc { > >@@ -38,64 +41,10 @@ void TableRandomValue(int16_t* vector, int16_t vector_length, uint32_t* seed) { > > } // namespace > >-namespace aec3 { >- >-#if defined(WEBRTC_ARCH_X86_FAMILY) >- >-void EstimateComfortNoise_SSE2(const std::array<float, kFftLengthBy2Plus1>& N2, >- uint32_t* seed, >- FftData* lower_band_noise, >- FftData* upper_band_noise) { >- FftData* N_low = lower_band_noise; >- FftData* N_high = upper_band_noise; >- >- // Compute square root spectrum. >- std::array<float, kFftLengthBy2Plus1> N; >- for (size_t k = 0; k < kFftLengthBy2; k += 4) { >- __m128 v = _mm_loadu_ps(&N2[k]); >- v = _mm_sqrt_ps(v); >- _mm_storeu_ps(&N[k], v); >- } >- >- N[kFftLengthBy2] = sqrtf(N2[kFftLengthBy2]); >- >- // Compute the noise level for the upper bands. 
>- constexpr float kOneByNumBands = 1.f / (kFftLengthBy2Plus1 / 2 + 1); >- constexpr int kFftLengthBy2Plus1By2 = kFftLengthBy2Plus1 / 2; >- const float high_band_noise_level = >- std::accumulate(N.begin() + kFftLengthBy2Plus1By2, N.end(), 0.f) * >- kOneByNumBands; >- >- // Generate complex noise. >- std::array<int16_t, kFftLengthBy2 - 1> random_values_int; >- TableRandomValue(random_values_int.data(), random_values_int.size(), seed); >- >- std::array<float, kFftLengthBy2 - 1> sin; >- std::array<float, kFftLengthBy2 - 1> cos; >- constexpr float kScale = 6.28318530717959f / 32768.0f; >- std::transform(random_values_int.begin(), random_values_int.end(), >- sin.begin(), [&](int16_t a) { return -sinf(kScale * a); }); >- std::transform(random_values_int.begin(), random_values_int.end(), >- cos.begin(), [&](int16_t a) { return cosf(kScale * a); }); >- >- // Form low-frequency noise via spectral shaping. >- N_low->re[0] = N_low->re[kFftLengthBy2] = N_high->re[0] = >- N_high->re[kFftLengthBy2] = 0.f; >- std::transform(cos.begin(), cos.end(), N.begin() + 1, N_low->re.begin() + 1, >- std::multiplies<float>()); >- std::transform(sin.begin(), sin.end(), N.begin() + 1, N_low->im.begin() + 1, >- std::multiplies<float>()); >- >- // Form the high-frequency noise via simple levelling. >- std::transform(cos.begin(), cos.end(), N_high->re.begin() + 1, >- [&](float a) { return high_band_noise_level * a; }); >- std::transform(sin.begin(), sin.end(), N_high->im.begin() + 1, >- [&](float a) { return high_band_noise_level * a; }); >-} >- >-#endif >+namespace { > >-void EstimateComfortNoise(const std::array<float, kFftLengthBy2Plus1>& N2, >+void GenerateComfortNoise(Aec3Optimization optimization, >+ const std::array<float, kFftLengthBy2Plus1>& N2, > uint32_t* seed, > FftData* lower_band_noise, > FftData* upper_band_noise) { >@@ -104,8 +53,8 @@ void EstimateComfortNoise(const std::array<float, kFftLengthBy2Plus1>& N2, > > // Compute square root spectrum. 
> std::array<float, kFftLengthBy2Plus1> N; >- std::transform(N2.begin(), N2.end(), N.begin(), >- [](float a) { return sqrtf(a); }); >+ std::copy(N2.begin(), N2.end(), N.begin()); >+ aec3::VectorMath(optimization).Sqrt(N); > > // Compute the noise level for the upper bands. > constexpr float kOneByNumBands = 1.f / (kFftLengthBy2Plus1 / 2 + 1); >@@ -118,13 +67,21 @@ void EstimateComfortNoise(const std::array<float, kFftLengthBy2Plus1>& N2, > std::array<int16_t, kFftLengthBy2 - 1> random_values_int; > TableRandomValue(random_values_int.data(), random_values_int.size(), seed); > >+ // The analysis and synthesis windowing cause loss of power when >+ // cross-fading the noise where frames are completely uncorrelated >+ // (generated with random phase), hence the factor sqrt(2). >+ // This is not the case for the speech signal where the input is overlapping >+ // (strong correlation). > std::array<float, kFftLengthBy2 - 1> sin; > std::array<float, kFftLengthBy2 - 1> cos; > constexpr float kScale = 6.28318530717959f / 32768.0f; >+ constexpr float kSqrt2 = 1.4142135623f; > std::transform(random_values_int.begin(), random_values_int.end(), >- sin.begin(), [&](int16_t a) { return -sinf(kScale * a); }); >+ sin.begin(), >+ [&](int16_t a) { return -sinf(kScale * a) * kSqrt2; }); > std::transform(random_values_int.begin(), random_values_int.end(), >- cos.begin(), [&](int16_t a) { return cosf(kScale * a); }); >+ cos.begin(), >+ [&](int16_t a) { return cosf(kScale * a) * kSqrt2; }); > > // Form low-frequency noise via spectral shaping. 
>   N_low->re[0] = N_low->re[kFftLengthBy2] = N_high->re[0] =
>@@ -141,7 +98,7 @@ void EstimateComfortNoise(const std::array<float, kFftLengthBy2Plus1>& N2,
>                  [&](float a) { return high_band_noise_level * a; });
> }
> 
>-}  // namespace aec3
>+}  // namespace
> 
> ComfortNoiseGenerator::ComfortNoiseGenerator(Aec3Optimization optimization)
>     : optimization_(optimization),
>@@ -205,17 +162,8 @@ void ComfortNoiseGenerator::Compute(
>   const std::array<float, kFftLengthBy2Plus1>& N2 =
>       N2_initial_ ? *N2_initial_ : N2_;
> 
>-  switch (optimization_) {
>-#if defined(WEBRTC_ARCH_X86_FAMILY)
>-    case Aec3Optimization::kSse2:
>-      aec3::EstimateComfortNoise_SSE2(N2, &seed_, lower_band_noise,
>-                                      upper_band_noise);
>-      break;
>-#endif
>-    default:
>-      aec3::EstimateComfortNoise(N2, &seed_, lower_band_noise,
>-                                 upper_band_noise);
>-  }
>+  GenerateComfortNoise(optimization_, N2, &seed_, lower_band_noise,
>+                       upper_band_noise);
> }
> 
> }  // namespace webrtc
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/comfort_noise_generator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/comfort_noise_generator.h
>index 6a479896ecb976fb30abd3a3a69719c47ac164b9..3be386bab1e7687cc0f4af25cb8e47c2de15333f 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/comfort_noise_generator.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/comfort_noise_generator.h
>@@ -11,6 +11,7 @@
> #ifndef MODULES_AUDIO_PROCESSING_AEC3_COMFORT_NOISE_GENERATOR_H_
> #define MODULES_AUDIO_PROCESSING_AEC3_COMFORT_NOISE_GENERATOR_H_
> 
>+#include <stdint.h>
> #include <array>
> #include <memory>
> 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/comfort_noise_generator_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/comfort_noise_generator_unittest.cc
>index 77a09edcd6a038efdfb88f9d5356d8de19e87e44..10ba69603681fd30b5edcf13104e782eab53c074 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/comfort_noise_generator_unittest.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/comfort_noise_generator_unittest.cc
>@@ -52,45 +52,6 @@ TEST(ComfortNoiseGenerator, NullUpperBandNoise) {
> 
> #endif
> 
>-#if defined(WEBRTC_ARCH_X86_FAMILY)
>-// Verifies that the optimized methods are bitexact to their reference
>-// counterparts.
>-TEST(ComfortNoiseGenerator, TestOptimizations) {
>-  if (WebRtc_GetCPUInfo(kSSE2) != 0) {
>-    Random random_generator(42U);
>-    uint32_t seed = 42;
>-    uint32_t seed_SSE2 = 42;
>-    std::array<float, kFftLengthBy2Plus1> N2;
>-    FftData lower_band_noise;
>-    FftData upper_band_noise;
>-    FftData lower_band_noise_SSE2;
>-    FftData upper_band_noise_SSE2;
>-    for (int k = 0; k < 10; ++k) {
>-      for (size_t j = 0; j < N2.size(); ++j) {
>-        N2[j] = random_generator.Rand<float>() * 1000.f;
>-      }
>-
>-      EstimateComfortNoise(N2, &seed, &lower_band_noise, &upper_band_noise);
>-      EstimateComfortNoise_SSE2(N2, &seed_SSE2, &lower_band_noise_SSE2,
>-                                &upper_band_noise_SSE2);
>-      for (size_t j = 0; j < lower_band_noise.re.size(); ++j) {
>-        EXPECT_NEAR(lower_band_noise.re[j], lower_band_noise_SSE2.re[j],
>-                    0.00001f);
>-        EXPECT_NEAR(upper_band_noise.re[j], upper_band_noise_SSE2.re[j],
>-                    0.00001f);
>-      }
>-      for (size_t j = 1; j < lower_band_noise.re.size() - 1; ++j) {
>-        EXPECT_NEAR(lower_band_noise.im[j], lower_band_noise_SSE2.im[j],
>-                    0.00001f);
>-        EXPECT_NEAR(upper_band_noise.im[j], upper_band_noise_SSE2.im[j],
>-                    0.00001f);
>-      }
>-    }
>-  }
>-}
>-
>-#endif
>-
> TEST(ComfortNoiseGenerator, CorrectLevel) {
>   ComfortNoiseGenerator cng(DetectOptimization());
>   AecState aec_state(EchoCanceller3Config{});
>@@ -113,8 +74,8 @@ TEST(ComfortNoiseGenerator, CorrectLevel) {
>   for (int k = 0; k < 10000; ++k) {
>     cng.Compute(aec_state, N2, &n_lower, &n_upper);
>   }
>-  EXPECT_NEAR(N2[0], Power(n_lower), N2[0] / 10.f);
>-  EXPECT_NEAR(N2[0], Power(n_upper), N2[0] / 10.f);
>+  EXPECT_NEAR(2.f * N2[0], Power(n_lower), N2[0] / 10.f);
>+  EXPECT_NEAR(2.f * N2[0], Power(n_upper), N2[0] / 10.f);
> }
> 
> }  // namespace aec3
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/decimator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/decimator.cc
>index d9faa62dc446976a13d3ea06f0b3294ddf3acd67..bd03237ca080fb5341452a1faa39e160edeeee07 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/decimator.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/decimator.cc
>@@ -9,6 +9,10 @@
>  */
> #include "modules/audio_processing/aec3/decimator.h"
> 
>+#include <array>
>+#include <vector>
>+
>+#include "modules/audio_processing/aec3/aec3_common.h"
> #include "rtc_base/checks.h"
> 
> namespace webrtc {
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/downsampled_render_buffer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/downsampled_render_buffer.cc
>index df0af6e0d8600160d2b574a969c3883aa9f59081..c105911aa8834df2049d281c84577fdfa558ec23 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/downsampled_render_buffer.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/downsampled_render_buffer.cc
>@@ -10,6 +10,8 @@
> 
> #include "modules/audio_processing/aec3/downsampled_render_buffer.h"
> 
>+#include <algorithm>
>+
> namespace webrtc {
> 
> DownsampledRenderBuffer::DownsampledRenderBuffer(size_t downsampled_buffer_size)
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/downsampled_render_buffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/downsampled_render_buffer.h
>index 943949666a069f0e447ba93257804dcdaed8dccd..c91ea3b836f60e85273b94abb68108efbf452746 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/downsampled_render_buffer.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/downsampled_render_buffer.h
>@@ -11,9 +11,9 @@
> #ifndef MODULES_AUDIO_PROCESSING_AEC3_DOWNSAMPLED_RENDER_BUFFER_H_
> #define MODULES_AUDIO_PROCESSING_AEC3_DOWNSAMPLED_RENDER_BUFFER_H_
> 
>+#include <stddef.h>
> #include <vector>
> 
>-#include "modules/audio_processing/aec3/aec3_common.h"
> #include "rtc_base/checks.h"
> 
> namespace webrtc {
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_audibility.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_audibility.cc
>index fa123de351e5449daaaba66f899a8e79a2108eea..e857a7e45a08476f028c7cfe3804f8f8d328763d 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_audibility.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_audibility.cc
>@@ -12,8 +12,10 @@
> 
> #include <algorithm>
> #include <cmath>
>+#include <utility>
>+#include <vector>
> 
>-#include "modules/audio_processing/aec3/aec3_common.h"
>+#include "api/array_view.h"
> #include "modules/audio_processing/aec3/matrix_buffer.h"
> #include "modules/audio_processing/aec3/stationarity_estimator.h"
> #include "modules/audio_processing/aec3/vector_buffer.h"
>@@ -27,16 +29,18 @@ EchoAudibility::EchoAudibility(bool use_render_stationarity_at_init)
> 
> EchoAudibility::~EchoAudibility() = default;
> 
>-void EchoAudibility::Update(const RenderBuffer& render_buffer,
>-                            int delay_blocks,
>-                            bool external_delay_seen,
>-                            float reverb_decay) {
>+void EchoAudibility::Update(
>+    const RenderBuffer& render_buffer,
>+    rtc::ArrayView<const float> render_reverb_contribution_spectrum,
>+    int delay_blocks,
>+    bool external_delay_seen) {
>   UpdateRenderNoiseEstimator(render_buffer.GetSpectrumBuffer(),
>                              render_buffer.GetBlockBuffer(),
>                              external_delay_seen);
> 
>   if (external_delay_seen || use_render_stationarity_at_init_) {
>-    UpdateRenderStationarityFlags(render_buffer, delay_blocks, reverb_decay);
>+    UpdateRenderStationarityFlags(
>+        render_buffer, render_reverb_contribution_spectrum, delay_blocks);
>   }
> }
> 
>@@ -48,8 +52,8 @@ void EchoAudibility::Reset() {
> 
> void EchoAudibility::UpdateRenderStationarityFlags(
>     const RenderBuffer& render_buffer,
>-    int delay_blocks,
>-    float reverb_decay) {
>+    rtc::ArrayView<const float> render_reverb_contribution_spectrum,
>+    int delay_blocks) {
>   const VectorBuffer& spectrum_buffer = render_buffer.GetSpectrumBuffer();
>   int idx_at_delay =
>       spectrum_buffer.OffsetIndex(spectrum_buffer.read, delay_blocks);
>@@ -57,8 +61,9 @@ void EchoAudibility::UpdateRenderStationarityFlags(
>   int num_lookahead = render_buffer.Headroom() - delay_blocks + 1;
>   num_lookahead = std::max(0, num_lookahead);
> 
>-  render_stationarity_.UpdateStationarityFlags(spectrum_buffer, idx_at_delay,
>-                                               num_lookahead, reverb_decay);
>+  render_stationarity_.UpdateStationarityFlags(
>+      spectrum_buffer, render_reverb_contribution_spectrum, idx_at_delay,
>+      num_lookahead);
> }
> 
> void EchoAudibility::UpdateRenderNoiseEstimator(
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_audibility.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_audibility.h
>index 3b252f3ea2b7e5cbd9065376c2769887a76298b7..b903ca0cf1d79efba8e4b7485392f7500f755221 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_audibility.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_audibility.h
>@@ -11,11 +11,7 @@
> #ifndef MODULES_AUDIO_PROCESSING_AEC3_ECHO_AUDIBILITY_H_
> #define MODULES_AUDIO_PROCESSING_AEC3_ECHO_AUDIBILITY_H_
> 
>-#include <algorithm>
>-#include <array>
>-#include <limits>
>-#include <memory>
>-#include <vector>
>+#include <stddef.h>
> 
> #include "absl/types/optional.h"
> #include "api/array_view.h"
>@@ -27,8 +23,6 @@
> 
> namespace webrtc {
> 
>-class ApmDataDumper;
>-
> class EchoAudibility {
>  public:
>   explicit EchoAudibility(bool use_render_stationarity_at_init);
>@@ -36,9 +30,9 @@ class EchoAudibility {
> 
>   // Feed new render data to the echo audibility estimator.
>   void Update(const RenderBuffer& render_buffer,
>+              rtc::ArrayView<const float> render_reverb_contribution_spectrum,
>               int delay_blocks,
>-              bool external_delay_seen,
>-              float reverb_decay);
>+              bool external_delay_seen);
>   // Get the residual echo scaling.
>   void GetResidualEchoScaling(bool filter_has_had_time_to_converge,
>                               rtc::ArrayView<float> residual_scaling) const {
>@@ -62,9 +56,10 @@ class EchoAudibility {
>   void Reset();
> 
>   // Updates the render stationarity flags for the current frame.
>-  void UpdateRenderStationarityFlags(const RenderBuffer& render_buffer,
>-                                     int delay_blocks,
>-                                     float reverb_decay);
>+  void UpdateRenderStationarityFlags(
>+      const RenderBuffer& render_buffer,
>+      rtc::ArrayView<const float> render_reverb_contribution_spectrum,
>+      int delay_blocks);
> 
>   // Updates the noise estimator with the new render data since the previous
>   // call to this method.
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_canceller3.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_canceller3.cc
>index e5219c72ab7fae9a9e0bfe2e8e5ff453076c9740..f05edb15c3e56af98ca6bfca90262316f86cb39b 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_canceller3.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_canceller3.cc
>@@ -9,9 +9,12 @@
>  */
> #include "modules/audio_processing/aec3/echo_canceller3.h"
> 
>+#include <algorithm>
>+#include <utility>
>+
>+#include "modules/audio_processing/aec3/aec3_common.h"
> #include "modules/audio_processing/logging/apm_data_dumper.h"
> #include "rtc_base/atomicops.h"
>-#include "rtc_base/logging.h"
> #include "system_wrappers/include/field_trial.h"
> 
> namespace webrtc {
>@@ -41,25 +44,44 @@ bool EnableReverbModelling() {
>   return !field_trial::IsEnabled("WebRTC-Aec3ReverbModellingKillSwitch");
> }
> 
>-bool EnableSuppressorNearendAveraging() {
>-  return !field_trial::IsEnabled(
>-      "WebRTC-Aec3SuppressorNearendAveragingKillSwitch");
>+bool EnableUnityInitialRampupGain() {
>+  return field_trial::IsEnabled("WebRTC-Aec3EnableUnityInitialRampupGain");
> }
> 
>-bool EnableSlowFilterAdaptation() {
>-  return !field_trial::IsEnabled("WebRTC-Aec3SlowFilterAdaptationKillSwitch");
>+bool EnableUnityNonZeroRampupGain() {
>+  return field_trial::IsEnabled("WebRTC-Aec3EnableUnityNonZeroRampupGain");
> }
> 
>-bool EnableShadowFilterJumpstart() {
>-  return !field_trial::IsEnabled("WebRTC-Aec3ShadowFilterJumpstartKillSwitch");
>+bool EnableLongReverb() {
>+  return field_trial::IsEnabled("WebRTC-Aec3ShortReverbKillSwitch");
> }
> 
>-bool EnableUnityInitialRampupGain() {
>-  return field_trial::IsEnabled("WebRTC-Aec3EnableUnityInitialRampupGain");
>+bool EnableNewFilterParams() {
>+  return !field_trial::IsEnabled("WebRTC-Aec3NewFilterParamsKillSwitch");
> }
> 
>-bool EnableUnityNonZeroRampupGain() {
>-  return field_trial::IsEnabled("WebRTC-Aec3EnableUnityNonZeroRampupGain");
>+bool EnableLegacyDominantNearend() {
>+  return field_trial::IsEnabled("WebRTC-Aec3EnableLegacyDominantNearend");
>+}
>+
>+bool UseLegacyNormalSuppressorTuning() {
>+  return field_trial::IsEnabled("WebRTC-Aec3UseLegacyNormalSuppressorTuning");
>+}
>+
>+bool ActivateStationarityProperties() {
>+  return field_trial::IsEnabled("WebRTC-Aec3UseStationarityProperties");
>+}
>+
>+bool ActivateStationarityPropertiesAtInit() {
>+  return field_trial::IsEnabled("WebRTC-Aec3UseStationarityPropertiesAtInit");
>+}
>+
>+bool EnableNewRenderBuffering() {
>+  return !field_trial::IsEnabled("WebRTC-Aec3NewRenderBufferingKillSwitch");
>+}
>+
>+bool UseEarlyDelayDetection() {
>+  return !field_trial::IsEnabled("WebRTC-Aec3EarlyDelayDetectionKillSwitch");
> }
> 
> // Method for adjusting config parameter dependencies..
>@@ -71,68 +93,27 @@ EchoCanceller3Config AdjustConfig(const EchoCanceller3Config& config) {
>     adjusted_cfg.ep_strength.default_len = 0.f;
>   }
> 
>-  // Use customized parameters when the system has clock-drift.
>-  if (config.echo_removal_control.has_clock_drift) {
>-    RTC_LOG(LS_WARNING)
>-        << "Customizing parameters to work well for the clock-drift case.";
>-    if (config.ep_strength.bounded_erl) {
>-      adjusted_cfg.ep_strength.default_len = 0.85f;
>-      adjusted_cfg.ep_strength.lf = 0.01f;
>-      adjusted_cfg.ep_strength.mf = 0.01f;
>-      adjusted_cfg.ep_strength.hf = 0.01f;
>-      adjusted_cfg.echo_model.render_pre_window_size = 1;
>-      adjusted_cfg.echo_model.render_post_window_size = 1;
>-      adjusted_cfg.echo_model.nonlinear_hold = 3;
>-      adjusted_cfg.echo_model.nonlinear_release = 0.001f;
>-    } else {
>-      adjusted_cfg.ep_strength.bounded_erl = true;
>-      adjusted_cfg.delay.down_sampling_factor = 2;
>-      adjusted_cfg.ep_strength.default_len = 0.8f;
>-      adjusted_cfg.ep_strength.lf = 0.01f;
>-      adjusted_cfg.ep_strength.mf = 0.01f;
>-      adjusted_cfg.ep_strength.hf = 0.01f;
>-      adjusted_cfg.filter.main = {30, 0.1f, 0.8f, 0.001f, 20075344.f};
>-      adjusted_cfg.filter.shadow = {30, 0.7f, 20075344.f};
>-      adjusted_cfg.filter.main_initial = {30, 0.1f, 1.5f, 0.001f, 20075344.f};
>-      adjusted_cfg.filter.shadow_initial = {30, 0.9f, 20075344.f};
>-      adjusted_cfg.echo_model.render_pre_window_size = 2;
>-      adjusted_cfg.echo_model.render_post_window_size = 2;
>-      adjusted_cfg.echo_model.nonlinear_hold = 3;
>-      adjusted_cfg.echo_model.nonlinear_release = 0.6f;
>-    }
>-  }
>-
>   if (UseShortDelayEstimatorWindow()) {
>     adjusted_cfg.delay.num_filters =
>         std::min(adjusted_cfg.delay.num_filters, static_cast<size_t>(5));
>   }
> 
>-  if (EnableReverbBasedOnRender() == false) {
>-    adjusted_cfg.ep_strength.reverb_based_on_render = false;
>+  bool use_new_render_buffering =
>+      EnableNewRenderBuffering() && config.buffering.use_new_render_buffering;
>+  // Old render buffering needs one more filter to cover the same delay.
>+  if (!use_new_render_buffering) {
>+    adjusted_cfg.delay.num_filters += 1;
>   }
> 
>-  if (!EnableSuppressorNearendAveraging()) {
>-    adjusted_cfg.suppressor.nearend_average_blocks = 1;
>+  if (EnableReverbBasedOnRender() == false) {
>+    adjusted_cfg.ep_strength.reverb_based_on_render = false;
>   }
> 
>-  if (!EnableSlowFilterAdaptation()) {
>-    if (!EnableShadowFilterJumpstart()) {
>-      adjusted_cfg.filter.main.leakage_converged = 0.005f;
>-      adjusted_cfg.filter.main.leakage_diverged = 0.1f;
>-    }
>-    adjusted_cfg.filter.main_initial.leakage_converged = 0.05f;
>-    adjusted_cfg.filter.main_initial.leakage_diverged = 5.f;
>-  }
>-
>-  if (!EnableShadowFilterJumpstart()) {
>-    if (EnableSlowFilterAdaptation()) {
>-      adjusted_cfg.filter.main.leakage_converged = 0.0005f;
>-      adjusted_cfg.filter.main.leakage_diverged = 0.01f;
>-    } else {
>-      adjusted_cfg.filter.main.leakage_converged = 0.005f;
>-      adjusted_cfg.filter.main.leakage_diverged = 0.1f;
>-    }
>-    adjusted_cfg.filter.main.error_floor = 0.001f;
>+  if (!EnableNewFilterParams()) {
>+    adjusted_cfg.filter.main.leakage_diverged = 0.01f;
>+    adjusted_cfg.filter.main.error_floor = 0.1f;
>+    adjusted_cfg.filter.main.error_ceil = 1E10f;
>+    adjusted_cfg.filter.main_initial.error_ceil = 1E10f;
>   }
> 
>   if (EnableUnityInitialRampupGain() &&
>@@ -147,6 +128,42 @@ EchoCanceller3Config AdjustConfig(const EchoCanceller3Config& config) {
>     adjusted_cfg.echo_removal_control.gain_rampup.first_non_zero_gain = 1.f;
>   }
> 
>+  if (EnableLongReverb()) {
>+    adjusted_cfg.ep_strength.default_len = 0.88f;
>+  }
>+
>+  if (EnableLegacyDominantNearend()) {
>+    adjusted_cfg.suppressor.nearend_tuning =
>+        EchoCanceller3Config::Suppressor::Tuning(
>+            EchoCanceller3Config::Suppressor::MaskingThresholds(.2f, .3f, .3f),
>+            EchoCanceller3Config::Suppressor::MaskingThresholds(.07f, .1f, .3f),
>+            2.0f, 0.25f);
>+  }
>+
>+  if (UseLegacyNormalSuppressorTuning()) {
>+    adjusted_cfg.suppressor.normal_tuning =
>+        EchoCanceller3Config::Suppressor::Tuning(
>+            EchoCanceller3Config::Suppressor::MaskingThresholds(.2f, .3f, .3f),
>+            EchoCanceller3Config::Suppressor::MaskingThresholds(.07f, .1f, .3f),
>+            2.0f, 0.25f);
>+
>+    adjusted_cfg.suppressor.dominant_nearend_detection.enr_threshold = 10.f;
>+    adjusted_cfg.suppressor.dominant_nearend_detection.snr_threshold = 10.f;
>+    adjusted_cfg.suppressor.dominant_nearend_detection.hold_duration = 25;
>+  }
>+
>+  if (ActivateStationarityProperties()) {
>+    adjusted_cfg.echo_audibility.use_stationary_properties = true;
>+  }
>+
>+  if (ActivateStationarityPropertiesAtInit()) {
>+    adjusted_cfg.echo_audibility.use_stationarity_properties_at_init = true;
>+  }
>+
>+  if (!UseEarlyDelayDetection()) {
>+    adjusted_cfg.delay.delay_selection_thresholds = {25, 25};
>+  }
>+
>   return adjusted_cfg;
> }
> 
>@@ -330,12 +347,16 @@ int EchoCanceller3::instance_count_ = 0;
> EchoCanceller3::EchoCanceller3(const EchoCanceller3Config& config,
>                                int sample_rate_hz,
>                                bool use_highpass_filter)
>-    : EchoCanceller3(
>-          AdjustConfig(config),
>-          sample_rate_hz,
>-          use_highpass_filter,
>-          std::unique_ptr<BlockProcessor>(
>-              BlockProcessor::Create(AdjustConfig(config), sample_rate_hz))) {}
>+    : EchoCanceller3(AdjustConfig(config),
>+                     sample_rate_hz,
>+                     use_highpass_filter,
>+                     std::unique_ptr<BlockProcessor>(
>+                         EnableNewRenderBuffering() &&
>+                                 config.buffering.use_new_render_buffering
>+                             ? BlockProcessor::Create2(AdjustConfig(config),
>+                                                       sample_rate_hz)
>+                             : BlockProcessor::Create(AdjustConfig(config),
>+                                                      sample_rate_hz))) {}
> EchoCanceller3::EchoCanceller3(const EchoCanceller3Config& config,
>                                int sample_rate_hz,
>                                bool use_highpass_filter,
>@@ -425,6 +446,10 @@ void EchoCanceller3::ProcessCapture(AudioBuffer* capture, bool level_change) {
>   data_dumper_->DumpRaw("aec3_call_order",
>                         static_cast<int>(EchoCanceller3ApiCall::kCapture));
> 
>+  // Report capture call in the metrics and periodically update API call
>+  // metrics.
>+  api_call_metrics_.ReportCaptureCall();
>+
>   // Optionally delay the capture signal.
>   if (config_.delay.fixed_capture_delay_samples > 0) {
>     block_delay_buffer_.DelaySignal(capture);
>@@ -479,6 +504,9 @@ void EchoCanceller3::EmptyRenderQueue() {
>   bool frame_to_buffer =
>       render_transfer_queue_.Remove(&render_queue_output_frame_);
>   while (frame_to_buffer) {
>+    // Report render call in the metrics.
>+    api_call_metrics_.ReportRenderCall();
>+
>     BufferRenderFrameContent(&render_queue_output_frame_, 0, &render_blocker_,
>                              block_processor_.get(), &block_, &sub_frame_view_);
> 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_canceller3.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_canceller3.h
>index f5520ba3ace5c3492a6b2a26896c883255f8855a..671d27167675c66c99c3e21e2572c6baf734e4f2 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_canceller3.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_canceller3.h
>@@ -11,7 +11,14 @@
> #ifndef MODULES_AUDIO_PROCESSING_AEC3_ECHO_CANCELLER3_H_
> #define MODULES_AUDIO_PROCESSING_AEC3_ECHO_CANCELLER3_H_
> 
>+#include <stddef.h>
>+#include <memory>
>+#include <vector>
>+
>+#include "api/array_view.h"
> #include "api/audio/echo_canceller3_config.h"
>+#include "api/audio/echo_control.h"
>+#include "modules/audio_processing/aec3/api_call_jitter_metrics.h"
> #include "modules/audio_processing/aec3/block_delay_buffer.h"
> #include "modules/audio_processing/aec3/block_framer.h"
> #include "modules/audio_processing/aec3/block_processor.h"
>@@ -19,9 +26,11 @@
> #include "modules/audio_processing/aec3/frame_blocker.h"
> #include "modules/audio_processing/audio_buffer.h"
> #include "modules/audio_processing/logging/apm_data_dumper.h"
>+#include "rtc_base/checks.h"
> #include "rtc_base/constructormagic.h"
> #include "rtc_base/race_checker.h"
> #include "rtc_base/swap_queue.h"
>+#include "rtc_base/thread_annotations.h"
> 
> namespace webrtc {
> 
>@@ -132,6 +141,7 @@ class EchoCanceller3 : public EchoControl {
>   std::vector<rtc::ArrayView<float>> sub_frame_view_
>       RTC_GUARDED_BY(capture_race_checker_);
>   BlockDelayBuffer block_delay_buffer_ RTC_GUARDED_BY(capture_race_checker_);
>+  ApiCallJitterMetrics api_call_metrics_ RTC_GUARDED_BY(capture_race_checker_);
> 
>   RTC_DISALLOW_IMPLICIT_CONSTRUCTORS(EchoCanceller3);
> };
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_path_delay_estimator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_path_delay_estimator.cc
>index 638ddc44084e85f2ac670c9ef52023ece4f6a36f..6069ed6be697dbc8315170303763f6f3c5f98e95 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_path_delay_estimator.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_path_delay_estimator.cc
>@@ -9,31 +9,21 @@
>  */
> #include "modules/audio_processing/aec3/echo_path_delay_estimator.h"
> 
>-#include <algorithm>
> #include <array>
> 
> #include "api/audio/echo_canceller3_config.h"
> #include "modules/audio_processing/aec3/aec3_common.h"
>+#include "modules/audio_processing/aec3/downsampled_render_buffer.h"
> #include "modules/audio_processing/logging/apm_data_dumper.h"
> #include "rtc_base/checks.h"
>-#include "system_wrappers/include/field_trial.h"
> 
> namespace webrtc {
>-namespace {
>-size_t GetDownSamplingFactor(const EchoCanceller3Config& config) {
>-  // Do not use down sampling factor 8 if kill switch is triggered.
>-  return (config.delay.down_sampling_factor == 8 &&
>-          field_trial::IsEnabled("WebRTC-Aec3DownSamplingFactor8KillSwitch"))
>-             ? 4
>-             : config.delay.down_sampling_factor;
>-}
>-}  // namespace
> 
> EchoPathDelayEstimator::EchoPathDelayEstimator(
>     ApmDataDumper* data_dumper,
>     const EchoCanceller3Config& config)
>     : data_dumper_(data_dumper),
>-      down_sampling_factor_(GetDownSamplingFactor(config)),
>+      down_sampling_factor_(config.delay.down_sampling_factor),
>       sub_block_size_(down_sampling_factor_ != 0
>                           ? kBlockSize / down_sampling_factor_
>                           : kBlockSize),
>@@ -45,7 +35,7 @@ EchoPathDelayEstimator::EchoPathDelayEstimator(
>                       kMatchedFilterWindowSizeSubBlocks,
>                       config.delay.num_filters,
>                       kMatchedFilterAlignmentShiftSizeSubBlocks,
>-                      GetDownSamplingFactor(config) == 8
>+                      config.delay.down_sampling_factor == 8
>                           ? config.render_levels.poor_excitation_render_limit_ds8
>                           : config.render_levels.poor_excitation_render_limit,
>                       config.delay.delay_estimate_smoothing,
>@@ -59,13 +49,8 @@ EchoPathDelayEstimator::EchoPathDelayEstimator(
> 
> EchoPathDelayEstimator::~EchoPathDelayEstimator() = default;
> 
>-void EchoPathDelayEstimator::Reset(bool soft_reset) {
>-  if (!soft_reset) {
>-    matched_filter_lag_aggregator_.Reset();
>-  }
>-  matched_filter_.Reset();
>-  old_aggregated_lag_ = absl::nullopt;
>-  consistent_estimate_counter_ = 0;
>+void EchoPathDelayEstimator::Reset(bool reset_delay_confidence) {
>+  Reset(true, reset_delay_confidence);
> }
> 
> absl::optional<DelayEstimate> EchoPathDelayEstimator::EstimateDelay(
>@@ -88,6 +73,12 @@ absl::optional<DelayEstimate> EchoPathDelayEstimator::EstimateDelay(
>       matched_filter_lag_aggregator_.Aggregate(
>           matched_filter_.GetLagEstimates());
> 
>+  // Run clockdrift detection.
>+  if (aggregated_matched_filter_lag &&
>+      (*aggregated_matched_filter_lag).quality ==
>+          DelayEstimate::Quality::kRefined)
>+    clockdrift_detector_.Update((*aggregated_matched_filter_lag).delay);
>+
>   // TODO(peah): Move this logging outside of this class once EchoCanceller3
>   // development is done.
>   data_dumper_->DumpRaw(
>@@ -112,10 +103,19 @@ absl::optional<DelayEstimate> EchoPathDelayEstimator::EstimateDelay(
>     old_aggregated_lag_ = aggregated_matched_filter_lag;
>     constexpr size_t kNumBlocksPerSecondBy2 = kNumBlocksPerSecond / 2;
>     if (consistent_estimate_counter_ > kNumBlocksPerSecondBy2) {
>-      Reset(true);
>+      Reset(false, false);
>     }
> 
>   return aggregated_matched_filter_lag;
> }
> 
>+void EchoPathDelayEstimator::Reset(bool reset_lag_aggregator,
>+                                   bool reset_delay_confidence) {
>+  if (reset_lag_aggregator) {
>+    matched_filter_lag_aggregator_.Reset(reset_delay_confidence);
>+  }
>+  matched_filter_.Reset();
>+  old_aggregated_lag_ = absl::nullopt;
>+  consistent_estimate_counter_ = 0;
>+}
> }  // namespace webrtc
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_path_delay_estimator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_path_delay_estimator.h
>index cea9abcb515169cd7bc1fdd3f771e50394b65a70..1f14735d1a51afb0d4dfb3ae6866b92a8280872e 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_path_delay_estimator.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_path_delay_estimator.h
>@@ -11,13 +11,13 @@
> #ifndef MODULES_AUDIO_PROCESSING_AEC3_ECHO_PATH_DELAY_ESTIMATOR_H_
> #define MODULES_AUDIO_PROCESSING_AEC3_ECHO_PATH_DELAY_ESTIMATOR_H_
> 
>-#include <vector>
>+#include <stddef.h>
> 
> #include "absl/types/optional.h"
>-#include "api/audio/echo_canceller3_config.h"
>+#include "api/array_view.h"
>+#include "modules/audio_processing/aec3/clockdrift_detector.h"
> #include "modules/audio_processing/aec3/decimator.h"
> #include "modules/audio_processing/aec3/delay_estimate.h"
>-#include "modules/audio_processing/aec3/downsampled_render_buffer.h"
> #include "modules/audio_processing/aec3/matched_filter.h"
> #include "modules/audio_processing/aec3/matched_filter_lag_aggregator.h"
> #include "rtc_base/constructormagic.h"
>@@ -25,6 +25,8 @@
> namespace webrtc {
> 
> class ApmDataDumper;
>+struct DownsampledRenderBuffer;
>+struct EchoCanceller3Config;
> 
> // Estimates the delay of the echo path.
> class EchoPathDelayEstimator {
>@@ -33,9 +35,9 @@ class EchoPathDelayEstimator {
>                          const EchoCanceller3Config& config);
>   ~EchoPathDelayEstimator();
> 
>-  // Resets the estimation. If the soft-reset is specified, only the matched
>-  // filters are reset.
>-  void Reset(bool soft_reset);
>+  // Resets the estimation. If the delay confidence is reset, the reset behavior
>+  // is as if the call is restarted.
>+  void Reset(bool reset_delay_confidence);
> 
>   // Produce a delay estimate if such is avaliable.
>   absl::optional<DelayEstimate> EstimateDelay(
>@@ -48,6 +50,11 @@ class EchoPathDelayEstimator {
>                          down_sampling_factor_);
>   }
> 
>+  // Returns the level of detected clockdrift.
>+  ClockdriftDetector::Level Clockdrift() const {
>+    return clockdrift_detector_.ClockdriftLevel();
>+  }
>+
>  private:
>   ApmDataDumper* const data_dumper_;
>   const size_t down_sampling_factor_;
>@@ -57,6 +64,10 @@ class EchoPathDelayEstimator {
>   MatchedFilterLagAggregator matched_filter_lag_aggregator_;
>   absl::optional<DelayEstimate> old_aggregated_lag_;
>   size_t consistent_estimate_counter_ = 0;
>+  ClockdriftDetector clockdrift_detector_;
>+
>+  // Internal reset method with more granularity.
>+  void Reset(bool reset_lag_aggregator, bool reset_delay_confidence);
> 
>   RTC_DISALLOW_COPY_AND_ASSIGN(EchoPathDelayEstimator);
> };
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_path_delay_estimator_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_path_delay_estimator_unittest.cc
>index 311a4a22d5f8ee1ec764771050dd29ea1f640e3e..a4e3133bb5baae18b365ef3758c4dbd611cf0563 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_path_delay_estimator_unittest.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_path_delay_estimator_unittest.cc
>@@ -39,7 +39,7 @@ TEST(EchoPathDelayEstimator, BasicApiCalls) {
>   ApmDataDumper data_dumper(0);
>   EchoCanceller3Config config;
>   std::unique_ptr<RenderDelayBuffer> render_delay_buffer(
>-      RenderDelayBuffer::Create(config, 3));
>+      RenderDelayBuffer::Create2(config, 3));
>   EchoPathDelayEstimator estimator(&data_dumper, config);
>   std::vector<std::vector<float>> render(3, std::vector<float>(kBlockSize));
>   std::vector<float> capture(kBlockSize);
>@@ -64,12 +64,9 @@ TEST(EchoPathDelayEstimator, DelayEstimation) {
>   config.delay.num_filters = 10;
>   for (size_t delay_samples : {30, 64, 150, 200, 800, 4000}) {
>     SCOPED_TRACE(ProduceDebugText(delay_samples, down_sampling_factor));
>-
>-    config.delay.api_call_jitter_blocks = 5;
>     std::unique_ptr<RenderDelayBuffer> render_delay_buffer(
>-        RenderDelayBuffer::Create(config, 3));
>-    DelayBuffer<float> signal_delay_buffer(
>-        delay_samples + 2 * config.delay.api_call_jitter_blocks * 64);
>+        RenderDelayBuffer::Create2(config, 3));
>+    DelayBuffer<float> signal_delay_buffer(delay_samples);
>     EchoPathDelayEstimator estimator(&data_dumper, config);
> 
>     absl::optional<DelayEstimate> estimated_delay_samples;
>@@ -97,9 +94,7 @@ TEST(EchoPathDelayEstimator, DelayEstimation) {
>       // domain.
>       size_t delay_ds = delay_samples / down_sampling_factor;
>       size_t estimated_delay_ds =
>-          (estimated_delay_samples->delay -
>-           (config.delay.api_call_jitter_blocks + 1) * 64) /
>-          down_sampling_factor;
>+          estimated_delay_samples->delay / down_sampling_factor;
>       EXPECT_NEAR(delay_ds, estimated_delay_ds, 1);
>     } else {
>       ADD_FAILURE();
>@@ -118,7 +113,7 @@ TEST(EchoPathDelayEstimator, NoDelayEstimatesForLowLevelRenderSignals) {
>   ApmDataDumper data_dumper(0);
>   EchoPathDelayEstimator estimator(&data_dumper, config);
>   std::unique_ptr<RenderDelayBuffer> render_delay_buffer(
>-      RenderDelayBuffer::Create(EchoCanceller3Config(), 3));
>+      RenderDelayBuffer::Create2(EchoCanceller3Config(), 3));
>   for (size_t k = 0; k < 100; ++k) {
>     RandomizeSampleVector(&random_generator, render[0]);
>     for (auto& render_k : render[0]) {
>@@ -142,7 +137,7 @@ TEST(EchoPathDelayEstimator, DISABLED_WrongRenderBlockSize) {
>   EchoCanceller3Config config;
>   EchoPathDelayEstimator estimator(&data_dumper, config);
>   std::unique_ptr<RenderDelayBuffer> render_delay_buffer(
>-      RenderDelayBuffer::Create(config, 3));
>+      RenderDelayBuffer::Create2(config, 3));
>   std::vector<float> capture(kBlockSize);
>   EXPECT_DEATH(estimator.EstimateDelay(
>                    render_delay_buffer->GetDownsampledRenderBuffer(), capture),
>@@ -157,7 +152,7 @@ TEST(EchoPathDelayEstimator, WrongCaptureBlockSize) {
>   EchoCanceller3Config config;
>   EchoPathDelayEstimator estimator(&data_dumper, config);
>   std::unique_ptr<RenderDelayBuffer> render_delay_buffer(
>-      RenderDelayBuffer::Create(config, 3));
>+      RenderDelayBuffer::Create2(config, 3));
>   std::vector<float> capture(std::vector<float>(kBlockSize - 1));
>   EXPECT_DEATH(estimator.EstimateDelay(
>                    render_delay_buffer->GetDownsampledRenderBuffer(), capture),
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_remover.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_remover.cc
>index acdaf3104b9fae28099a02d1d6686c4e5f273156..cfb73951992cd9784d3328f5698bfb9ca7e243de 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_remover.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_remover.cc
>@@ -10,26 +10,29 @@
> #include "modules/audio_processing/aec3/echo_remover.h"
> 
> #include <math.h>
>+#include <stddef.h>
> #include <algorithm>
>+#include <array>
> #include <memory>
>-#include <numeric>
>-#include <string>
> 
> #include "api/array_view.h"
> #include "modules/audio_processing/aec3/aec3_common.h"
>+#include "modules/audio_processing/aec3/aec3_fft.h"
> #include "modules/audio_processing/aec3/aec_state.h"
> #include "modules/audio_processing/aec3/comfort_noise_generator.h"
> #include "modules/audio_processing/aec3/echo_path_variability.h"
> #include "modules/audio_processing/aec3/echo_remover_metrics.h"
> #include "modules/audio_processing/aec3/fft_data.h"
> #include "modules/audio_processing/aec3/render_buffer.h"
>-#include "modules/audio_processing/aec3/render_delay_buffer.h"
>+#include "modules/audio_processing/aec3/render_signal_analyzer.h"
> #include "modules/audio_processing/aec3/residual_echo_estimator.h"
> #include "modules/audio_processing/aec3/subtractor.h"
>+#include "modules/audio_processing/aec3/subtractor_output.h"
> #include "modules/audio_processing/aec3/suppression_filter.h"
> #include "modules/audio_processing/aec3/suppression_gain.h"
> #include "modules/audio_processing/logging/apm_data_dumper.h"
> #include "rtc_base/atomicops.h"
>+#include "rtc_base/checks.h"
> #include "rtc_base/constructormagic.h"
> #include "rtc_base/logging.h"
> #include "system_wrappers/include/field_trial.h"
>@@ -175,7 +178,7 @@ EchoRemoverImpl::EchoRemoverImpl(const EchoCanceller3Config& config,
>       subtractor_(config, data_dumper_.get(), optimization_),
>       suppression_gain_(config_, optimization_, sample_rate_hz),
>       cng_(optimization_),
>-      suppression_filter_(sample_rate_hz_),
>+      suppression_filter_(optimization_, sample_rate_hz_),
>       render_signal_analyzer_(config_),
>       residual_echo_estimator_(config_),
>       aec_state_(config_) {
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_remover_metrics.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_remover_metrics.cc
>index a04026b4f5f55bfb4e42a98b3063dd437707c366..da7a224e863a64630b8cea29801097f9dce69681 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_remover_metrics.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_remover_metrics.cc
>@@ -11,9 +11,11 @@
> #include "modules/audio_processing/aec3/echo_remover_metrics.h"
> 
> #include <math.h>
>+#include <stddef.h>
> #include <algorithm>
> #include <numeric>
> 
>+#include "rtc_base/checks.h"
> #include "rtc_base/numerics/safe_minmax.h"
> #include "system_wrappers/include/metrics.h"
> 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_remover_metrics.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_remover_metrics.h
>index 17b803a1cf9888bd0b49b87c755cbcda3ef0133d..0707a5f031d9ae619b13d1f152e6902ab43fad76 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_remover_metrics.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_remover_metrics.h
>@@ -11,6 +11,9 @@
> #ifndef MODULES_AUDIO_PROCESSING_AEC3_ECHO_REMOVER_METRICS_H_
> #define MODULES_AUDIO_PROCESSING_AEC3_ECHO_REMOVER_METRICS_H_
> 
>+#include <array>
>+
>+#include "modules/audio_processing/aec3/aec3_common.h"
> #include "modules/audio_processing/aec3/aec_state.h"
> #include "rtc_base/constructormagic.h"
> 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_remover_unittest.cc
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_remover_unittest.cc >index da03f4c8e8db7df563a1a28d023a0a3333dea917..8bf76c4060d62045b888b764a54b08cb97c6d96e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_remover_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/echo_remover_unittest.cc >@@ -48,7 +48,7 @@ TEST(EchoRemover, BasicApiCalls) { > SCOPED_TRACE(ProduceDebugText(rate)); > std::unique_ptr<EchoRemover> remover( > EchoRemover::Create(EchoCanceller3Config(), rate)); >- std::unique_ptr<RenderDelayBuffer> render_buffer(RenderDelayBuffer::Create( >+ std::unique_ptr<RenderDelayBuffer> render_buffer(RenderDelayBuffer::Create2( > EchoCanceller3Config(), NumBandsForRate(rate))); > > std::vector<std::vector<float>> render(NumBandsForRate(rate), >@@ -89,7 +89,7 @@ TEST(EchoRemover, WrongCaptureBlockSize) { > SCOPED_TRACE(ProduceDebugText(rate)); > std::unique_ptr<EchoRemover> remover( > EchoRemover::Create(EchoCanceller3Config(), rate)); >- std::unique_ptr<RenderDelayBuffer> render_buffer(RenderDelayBuffer::Create( >+ std::unique_ptr<RenderDelayBuffer> render_buffer(RenderDelayBuffer::Create2( > EchoCanceller3Config(), NumBandsForRate(rate))); > std::vector<std::vector<float>> capture( > NumBandsForRate(rate), std::vector<float>(kBlockSize - 1, 0.f)); >@@ -111,7 +111,7 @@ TEST(EchoRemover, DISABLED_WrongCaptureNumBands) { > SCOPED_TRACE(ProduceDebugText(rate)); > std::unique_ptr<EchoRemover> remover( > EchoRemover::Create(EchoCanceller3Config(), rate)); >- std::unique_ptr<RenderDelayBuffer> render_buffer(RenderDelayBuffer::Create( >+ std::unique_ptr<RenderDelayBuffer> render_buffer(RenderDelayBuffer::Create2( > EchoCanceller3Config(), NumBandsForRate(rate))); > std::vector<std::vector<float>> capture( > NumBandsForRate(rate == 48000 ? 
16000 : rate + 16000), >@@ -131,7 +131,7 @@ TEST(EchoRemover, NullCapture) { > std::unique_ptr<EchoRemover> remover( > EchoRemover::Create(EchoCanceller3Config(), 8000)); > std::unique_ptr<RenderDelayBuffer> render_buffer( >- RenderDelayBuffer::Create(EchoCanceller3Config(), 3)); >+ RenderDelayBuffer::Create2(EchoCanceller3Config(), 3)); > EchoPathVariability echo_path_variability( > false, EchoPathVariability::DelayAdjustment::kNone, false); > EXPECT_DEATH( >@@ -161,7 +161,7 @@ TEST(EchoRemover, BasicEchoRemoval) { > config.delay.min_echo_path_delay_blocks = 0; > std::unique_ptr<EchoRemover> remover(EchoRemover::Create(config, rate)); > std::unique_ptr<RenderDelayBuffer> render_buffer( >- RenderDelayBuffer::Create(config, NumBandsForRate(rate))); >+ RenderDelayBuffer::Create2(config, NumBandsForRate(rate))); > render_buffer->SetDelay(delay_samples / kBlockSize); > > std::vector<std::unique_ptr<DelayBuffer<float>>> delay_buffers(x.size()); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/erl_estimator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/erl_estimator.cc >index 2da9cd888350406bf78902806fe5649689e44e10..85b1e022da2f87266e0e8572a741cc84b11c86c1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/erl_estimator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/erl_estimator.cc >@@ -13,6 +13,8 @@ > #include <algorithm> > #include <numeric> > >+#include "rtc_base/checks.h" >+ > namespace webrtc { > > namespace { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/erl_estimator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/erl_estimator.h >index dacd546a6de732f625b599bd6d04031c90b02331..29718c37836e8a8bd156279371d8272f00c2db34 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/erl_estimator.h >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/erl_estimator.h >@@ -11,6 +11,7 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_ERL_ESTIMATOR_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_ERL_ESTIMATOR_H_ > >+#include <stddef.h> > #include <array> > > #include "api/array_view.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/erle_estimator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/erle_estimator.cc >index 711085f96f0894c7c5c29c86e722e977bb6b4f5d..656a9c7fdf530661e526f55c07de3b9ea680ed93 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/erle_estimator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/erle_estimator.cc >@@ -10,19 +10,18 @@ > > #include "modules/audio_processing/aec3/erle_estimator.h" > >-#include "api/array_view.h" > #include "modules/audio_processing/aec3/aec3_common.h" >-#include "modules/audio_processing/logging/apm_data_dumper.h" >+#include "rtc_base/checks.h" > > namespace webrtc { > > ErleEstimator::ErleEstimator(size_t startup_phase_length_blocks_, >- float min_erle, >- float max_erle_lf, >- float max_erle_hf) >+ const EchoCanceller3Config& config) > : startup_phase_length_blocks__(startup_phase_length_blocks_), >- fullband_erle_estimator_(min_erle, max_erle_lf), >- subband_erle_estimator_(min_erle, max_erle_lf, max_erle_hf) { >+ use_signal_dependent_erle_(config.erle.num_sections > 1), >+ fullband_erle_estimator_(config.erle.min, config.erle.max_l), >+ subband_erle_estimator_(config), >+ signal_dependent_erle_estimator_(config) { > Reset(true); > } > >@@ -31,20 +30,25 @@ ErleEstimator::~ErleEstimator() = default; > void ErleEstimator::Reset(bool delay_change) { > fullband_erle_estimator_.Reset(); > subband_erle_estimator_.Reset(); >+ signal_dependent_erle_estimator_.Reset(); > if (delay_change) { > blocks_since_reset_ = 0; > } > } > >-void ErleEstimator::Update(rtc::ArrayView<const 
float> render_spectrum, >- rtc::ArrayView<const float> capture_spectrum, >- rtc::ArrayView<const float> subtractor_spectrum, >- bool converged_filter, >- bool onset_detection) { >- RTC_DCHECK_EQ(kFftLengthBy2Plus1, render_spectrum.size()); >+void ErleEstimator::Update( >+ const RenderBuffer& render_buffer, >+ const std::vector<std::array<float, kFftLengthBy2Plus1>>& >+ filter_frequency_response, >+ rtc::ArrayView<const float> reverb_render_spectrum, >+ rtc::ArrayView<const float> capture_spectrum, >+ rtc::ArrayView<const float> subtractor_spectrum, >+ bool converged_filter, >+ bool onset_detection) { >+ RTC_DCHECK_EQ(kFftLengthBy2Plus1, reverb_render_spectrum.size()); > RTC_DCHECK_EQ(kFftLengthBy2Plus1, capture_spectrum.size()); > RTC_DCHECK_EQ(kFftLengthBy2Plus1, subtractor_spectrum.size()); >- const auto& X2 = render_spectrum; >+ const auto& X2_reverb = reverb_render_spectrum; > const auto& Y2 = capture_spectrum; > const auto& E2 = subtractor_spectrum; > >@@ -52,14 +56,23 @@ void ErleEstimator::Update(rtc::ArrayView<const float> render_spectrum, > return; > } > >- subband_erle_estimator_.Update(X2, Y2, E2, converged_filter, onset_detection); >- fullband_erle_estimator_.Update(X2, Y2, E2, converged_filter); >+ subband_erle_estimator_.Update(X2_reverb, Y2, E2, converged_filter, >+ onset_detection); >+ >+ if (use_signal_dependent_erle_) { >+ signal_dependent_erle_estimator_.Update( >+ render_buffer, filter_frequency_response, X2_reverb, Y2, E2, >+ subband_erle_estimator_.Erle(), converged_filter); >+ } >+ >+ fullband_erle_estimator_.Update(X2_reverb, Y2, E2, converged_filter); > } > > void ErleEstimator::Dump( > const std::unique_ptr<ApmDataDumper>& data_dumper) const { > fullband_erle_estimator_.Dump(data_dumper); > subband_erle_estimator_.Dump(data_dumper); >+ signal_dependent_erle_estimator_.Dump(data_dumper); > } > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/erle_estimator.h 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/erle_estimator.h >index 8ca605ff71d58eb8679caa6f0d36ec0ea2566d3c..8036c2198b4e32bf4e8da1202685b9f5a50d4b2f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/erle_estimator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/erle_estimator.h >@@ -11,13 +11,17 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_ERLE_ESTIMATOR_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_ERLE_ESTIMATOR_H_ > >+#include <stddef.h> > #include <array> > #include <memory> > > #include "absl/types/optional.h" > #include "api/array_view.h" >+#include "api/audio/echo_canceller3_config.h" > #include "modules/audio_processing/aec3/aec3_common.h" > #include "modules/audio_processing/aec3/fullband_erle_estimator.h" >+#include "modules/audio_processing/aec3/render_buffer.h" >+#include "modules/audio_processing/aec3/signal_dependent_erle_estimator.h" > #include "modules/audio_processing/aec3/subband_erle_estimator.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" > >@@ -28,16 +32,17 @@ namespace webrtc { > class ErleEstimator { > public: > ErleEstimator(size_t startup_phase_length_blocks_, >- float min_erle, >- float max_erle_lf, >- float max_erle_hf); >+ const EchoCanceller3Config& config); > ~ErleEstimator(); > > // Resets the fullband ERLE estimator and the subbands ERLE estimators. > void Reset(bool delay_change); > > // Updates the ERLE estimates. >- void Update(rtc::ArrayView<const float> render_spectrum, >+ void Update(const RenderBuffer& render_buffer, >+ const std::vector<std::array<float, kFftLengthBy2Plus1>>& >+ filter_frequency_response, >+ rtc::ArrayView<const float> reverb_render_spectrum, > rtc::ArrayView<const float> capture_spectrum, > rtc::ArrayView<const float> subtractor_spectrum, > bool converged_filter, >@@ -45,11 +50,12 @@ class ErleEstimator { > > // Returns the most recent subband ERLE estimates. 
> const std::array<float, kFftLengthBy2Plus1>& Erle() const { >- return subband_erle_estimator_.Erle(); >+ return use_signal_dependent_erle_ ? signal_dependent_erle_estimator_.Erle() >+ : subband_erle_estimator_.Erle(); > } > // Returns the subband ERLE that are estimated during onsets. Used > // for logging/testing. >- const std::array<float, kFftLengthBy2Plus1>& ErleOnsets() const { >+ rtc::ArrayView<const float> ErleOnsets() const { > return subband_erle_estimator_.ErleOnsets(); > } > >@@ -70,8 +76,10 @@ class ErleEstimator { > > private: > const size_t startup_phase_length_blocks__; >+ const bool use_signal_dependent_erle_; > FullBandErleEstimator fullband_erle_estimator_; > SubbandErleEstimator subband_erle_estimator_; >+ SignalDependentErleEstimator signal_dependent_erle_estimator_; > size_t blocks_since_reset_ = 0; > }; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/erle_estimator_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/erle_estimator_unittest.cc >index 2cb050af3b3da112ae2f975884d50d1eddb358eb..59a747159337f368960e497e30d9c35bbae56f94 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/erle_estimator_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/erle_estimator_unittest.cc >@@ -12,6 +12,9 @@ > > #include "api/array_view.h" > #include "modules/audio_processing/aec3/erle_estimator.h" >+#include "modules/audio_processing/aec3/render_delay_buffer.h" >+#include "modules/audio_processing/aec3/vector_buffer.h" >+#include "rtc_base/random.h" > #include "test/gtest.h" > > namespace webrtc { >@@ -19,11 +22,9 @@ namespace webrtc { > namespace { > > constexpr int kLowFrequencyLimit = kFftLengthBy2 / 2; >-constexpr float kMaxErleLf = 8.f; >-constexpr float kMaxErleHf = 1.5f; >-constexpr float kMinErle = 1.0f; > constexpr float kTrueErle = 10.f; > constexpr float kTrueErleOnsets = 1.0f; >+constexpr float 
kEchoPathGain = 3.f; > > void VerifyErleBands(rtc::ArrayView<const float> erle, > float reference_lf, >@@ -44,80 +45,157 @@ void VerifyErle(rtc::ArrayView<const float> erle, > EXPECT_NEAR(reference_lf, erle_time_domain, 0.5); > } > >-void FormFarendFrame(std::array<float, kFftLengthBy2Plus1>* X2, >+void FormFarendTimeFrame(rtc::ArrayView<float> x) { >+ const std::array<float, kBlockSize> frame = { >+ 7459.88, 17209.6, 17383, 20768.9, 16816.7, 18386.3, 4492.83, 9675.85, >+ 6665.52, 14808.6, 9342.3, 7483.28, 19261.7, 4145.98, 1622.18, 13475.2, >+ 7166.32, 6856.61, 21937, 7263.14, 9569.07, 14919, 8413.32, 7551.89, >+ 7848.65, 6011.27, 13080.6, 15865.2, 12656, 17459.6, 4263.93, 4503.03, >+ 9311.79, 21095.8, 12657.9, 13906.6, 19267.2, 11338.1, 16828.9, 11501.6, >+ 11405, 15031.4, 14541.6, 19765.5, 18346.3, 19350.2, 3157.47, 18095.8, >+ 1743.68, 21328.2, 19727.5, 7295.16, 10332.4, 11055.5, 20107.4, 14708.4, >+ 12416.2, 16434, 2454.69, 9840.8, 6867.23, 1615.75, 6059.9, 8394.19}; >+ RTC_DCHECK_GE(x.size(), frame.size()); >+ std::copy(frame.begin(), frame.end(), x.begin()); >+} >+ >+void FormFarendFrame(const RenderBuffer& render_buffer, >+ std::array<float, kFftLengthBy2Plus1>* X2, > std::array<float, kFftLengthBy2Plus1>* E2, > std::array<float, kFftLengthBy2Plus1>* Y2, > float erle) { >- X2->fill(500 * 1000.f * 1000.f); >- E2->fill(1000.f * 1000.f); >- Y2->fill(erle * (*E2)[0]); >-} >+ const auto& spectrum_buffer = render_buffer.GetSpectrumBuffer(); >+ const auto& X2_from_buffer = spectrum_buffer.buffer[spectrum_buffer.write]; >+ std::copy(X2_from_buffer.begin(), X2_from_buffer.end(), X2->begin()); >+ std::transform(X2->begin(), X2->end(), Y2->begin(), >+ [](float a) { return a * kEchoPathGain * kEchoPathGain; }); >+ std::transform(Y2->begin(), Y2->end(), E2->begin(), >+ [erle](float a) { return a / erle; }); >+ >+} // namespace > >-void FormNearendFrame(std::array<float, kFftLengthBy2Plus1>* X2, >+void FormNearendFrame(rtc::ArrayView<float> x, >+ std::array<float, 
kFftLengthBy2Plus1>* X2, > std::array<float, kFftLengthBy2Plus1>* E2, > std::array<float, kFftLengthBy2Plus1>* Y2) { >+ x[0] = 0.f; > X2->fill(0.f); > Y2->fill(500.f * 1000.f * 1000.f); > E2->fill((*Y2)[0]); > } > >+void GetFilterFreq(std::vector<std::array<float, kFftLengthBy2Plus1>>& >+ filter_frequency_response, >+ size_t delay_headroom_blocks) { >+ RTC_DCHECK_GE(filter_frequency_response.size(), delay_headroom_blocks); >+ for (auto& block_freq_resp : filter_frequency_response) { >+ block_freq_resp.fill(0.f); >+ } >+ >+ for (size_t k = 0; k < kFftLengthBy2Plus1; ++k) { >+ filter_frequency_response[delay_headroom_blocks][k] = kEchoPathGain; >+ } >+} >+ > } // namespace > > TEST(ErleEstimator, VerifyErleIncreaseAndHold) { > std::array<float, kFftLengthBy2Plus1> X2; > std::array<float, kFftLengthBy2Plus1> E2; > std::array<float, kFftLengthBy2Plus1> Y2; >+ EchoCanceller3Config config; >+ std::vector<std::vector<float>> x(3, std::vector<float>(kBlockSize, 0.f)); >+ std::vector<std::array<float, kFftLengthBy2Plus1>> filter_frequency_response( >+ config.filter.main.length_blocks); >+ std::unique_ptr<RenderDelayBuffer> render_delay_buffer( >+ RenderDelayBuffer::Create2(config, 3)); > >- ErleEstimator estimator(0, kMinErle, kMaxErleLf, kMaxErleHf); >+ GetFilterFreq(filter_frequency_response, config.delay.delay_headroom_blocks); > >- // Verifies that the ERLE estimate is properly increased to higher values. >- FormFarendFrame(&X2, &E2, &Y2, kTrueErle); >+ ErleEstimator estimator(0, config); > >+ FormFarendTimeFrame(x[0]); >+ render_delay_buffer->Insert(x); >+ render_delay_buffer->PrepareCaptureProcessing(); >+ // Verifies that the ERLE estimate is properly increased to higher values. 
>+ FormFarendFrame(*render_delay_buffer->GetRenderBuffer(), &X2, &E2, &Y2, >+ kTrueErle); > for (size_t k = 0; k < 200; ++k) { >- estimator.Update(X2, Y2, E2, true, true); >+ render_delay_buffer->Insert(x); >+ render_delay_buffer->PrepareCaptureProcessing(); >+ estimator.Update(*render_delay_buffer->GetRenderBuffer(), >+ filter_frequency_response, X2, Y2, E2, true, true); > } > VerifyErle(estimator.Erle(), std::pow(2.f, estimator.FullbandErleLog2()), >- kMaxErleLf, kMaxErleHf); >+ config.erle.max_l, config.erle.max_h); > >- FormNearendFrame(&X2, &E2, &Y2); >+ FormNearendFrame(x[0], &X2, &E2, &Y2); > // Verifies that the ERLE is not immediately decreased during nearend > // activity. > for (size_t k = 0; k < 50; ++k) { >- estimator.Update(X2, Y2, E2, true, true); >+ render_delay_buffer->Insert(x); >+ render_delay_buffer->PrepareCaptureProcessing(); >+ estimator.Update(*render_delay_buffer->GetRenderBuffer(), >+ filter_frequency_response, X2, Y2, E2, true, true); > } > VerifyErle(estimator.Erle(), std::pow(2.f, estimator.FullbandErleLog2()), >- kMaxErleLf, kMaxErleHf); >+ config.erle.max_l, config.erle.max_h); > } > > TEST(ErleEstimator, VerifyErleTrackingOnOnsets) { > std::array<float, kFftLengthBy2Plus1> X2; > std::array<float, kFftLengthBy2Plus1> E2; > std::array<float, kFftLengthBy2Plus1> Y2; >+ EchoCanceller3Config config; >+ std::vector<std::vector<float>> x(3, std::vector<float>(kBlockSize, 0.f)); >+ std::vector<std::array<float, kFftLengthBy2Plus1>> filter_frequency_response( >+ config.filter.main.length_blocks); >+ >+ std::unique_ptr<RenderDelayBuffer> render_delay_buffer( >+ RenderDelayBuffer::Create2(config, 3)); >+ >+ GetFilterFreq(filter_frequency_response, config.delay.delay_headroom_blocks); >+ >+ ErleEstimator estimator(0, config); > >- ErleEstimator estimator(0, kMinErle, kMaxErleLf, kMaxErleHf); >+ FormFarendTimeFrame(x[0]); >+ render_delay_buffer->Insert(x); >+ render_delay_buffer->PrepareCaptureProcessing(); > > for (size_t burst = 0; burst < 20; 
++burst) { >- FormFarendFrame(&X2, &E2, &Y2, kTrueErleOnsets); >+ FormFarendFrame(*render_delay_buffer->GetRenderBuffer(), &X2, &E2, &Y2, >+ kTrueErleOnsets); > for (size_t k = 0; k < 10; ++k) { >- estimator.Update(X2, Y2, E2, true, true); >+ render_delay_buffer->Insert(x); >+ render_delay_buffer->PrepareCaptureProcessing(); >+ estimator.Update(*render_delay_buffer->GetRenderBuffer(), >+ filter_frequency_response, X2, Y2, E2, true, true); > } >- FormFarendFrame(&X2, &E2, &Y2, kTrueErle); >+ FormFarendFrame(*render_delay_buffer->GetRenderBuffer(), &X2, &E2, &Y2, >+ kTrueErle); > for (size_t k = 0; k < 200; ++k) { >- estimator.Update(X2, Y2, E2, true, true); >+ render_delay_buffer->Insert(x); >+ render_delay_buffer->PrepareCaptureProcessing(); >+ estimator.Update(*render_delay_buffer->GetRenderBuffer(), >+ filter_frequency_response, X2, Y2, E2, true, true); > } >- FormNearendFrame(&X2, &E2, &Y2); >+ FormNearendFrame(x[0], &X2, &E2, &Y2); > for (size_t k = 0; k < 300; ++k) { >- estimator.Update(X2, Y2, E2, true, true); >+ render_delay_buffer->Insert(x); >+ render_delay_buffer->PrepareCaptureProcessing(); >+ estimator.Update(*render_delay_buffer->GetRenderBuffer(), >+ filter_frequency_response, X2, Y2, E2, true, true); > } > } >- VerifyErleBands(estimator.ErleOnsets(), kMinErle, kMinErle); >- FormNearendFrame(&X2, &E2, &Y2); >+ VerifyErleBands(estimator.ErleOnsets(), config.erle.min, config.erle.min); >+ FormNearendFrame(x[0], &X2, &E2, &Y2); > for (size_t k = 0; k < 1000; k++) { >- estimator.Update(X2, Y2, E2, true, true); >+ estimator.Update(*render_delay_buffer->GetRenderBuffer(), >+ filter_frequency_response, X2, Y2, E2, true, true); > } > // Verifies that during ne activity, Erle converges to the Erle for onsets. 
> VerifyErle(estimator.Erle(), std::pow(2.f, estimator.FullbandErleLog2()), >- kMinErle, kMinErle); >+ config.erle.min, config.erle.min); > } > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/fft_buffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/fft_buffer.h >index 47ede41825cfc1b6355f9b095f152f64b23d403b..9f81a910a8fb7094f87d0515aa2baf30e1d8040b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/fft_buffer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/fft_buffer.h >@@ -11,6 +11,7 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_FFT_BUFFER_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_FFT_BUFFER_H_ > >+#include <stddef.h> > #include <vector> > > #include "modules/audio_processing/aec3/fft_data.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/filter_analyzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/filter_analyzer.cc >index 790cb1e8b52d6bff187b4bdd402019cf7fa1ed68..5b890d74746c9de1bccfc5f2ae3134803bc31bd5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/filter_analyzer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/filter_analyzer.cc >@@ -16,6 +16,7 @@ > #include <numeric> > > #include "modules/audio_processing/aec3/aec3_common.h" >+#include "modules/audio_processing/aec3/render_buffer.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" > #include "rtc_base/atomicops.h" > #include "rtc_base/checks.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/filter_analyzer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/filter_analyzer.h >index 47e953366621dc23893279f5d7eb45cca666988c..99a0e2597377234ae219081aa8d33e10f978552a 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/filter_analyzer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/filter_analyzer.h >@@ -11,20 +11,20 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_FILTER_ANALYZER_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_FILTER_ANALYZER_H_ > >+#include <stddef.h> > #include <array> >+#include <memory> > #include <vector> > >-#include "absl/types/optional.h" > #include "api/array_view.h" > #include "api/audio/echo_canceller3_config.h" > #include "modules/audio_processing/aec3/aec3_common.h" >-#include "modules/audio_processing/aec3/cascaded_biquad_filter.h" >-#include "modules/audio_processing/aec3/render_buffer.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { > > class ApmDataDumper; >+class RenderBuffer; > > // Class for analyzing the properties of an adaptive filter. > class FilterAnalyzer { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/frame_blocker.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/frame_blocker.cc >index 0a0c0e2faea5993086376342aa8df3de69c9bed9..ca122e5ebbdee8a79aab77bca0de599166586b16 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/frame_blocker.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/frame_blocker.cc >@@ -10,8 +10,7 @@ > > #include "modules/audio_processing/aec3/frame_blocker.h" > >-#include <algorithm> >- >+#include "modules/audio_processing/aec3/aec3_common.h" > #include "rtc_base/checks.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/fullband_erle_estimator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/fullband_erle_estimator.cc >index db9be7c104e969e50b2811dac6033174d1bbfaa5..7893b97b3a27aee6673a43a09489a137f6d5663f 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/fullband_erle_estimator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/fullband_erle_estimator.cc >@@ -18,6 +18,7 @@ > #include "api/array_view.h" > #include "modules/audio_processing/aec3/aec3_common.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" >+#include "rtc_base/checks.h" > #include "rtc_base/numerics/safe_minmax.h" > > namespace webrtc { >@@ -25,7 +26,7 @@ namespace webrtc { > namespace { > constexpr float kEpsilon = 1e-3f; > constexpr float kX2BandEnergyThreshold = 44015068.0f; >-constexpr int kErleHold = 100; >+constexpr int kBlocksToHoldErle = 100; > constexpr int kPointsToAccumulate = 6; > } // namespace > >@@ -54,7 +55,7 @@ void FullBandErleEstimator::Update(rtc::ArrayView<const float> X2, > const float Y2_sum = std::accumulate(Y2.begin(), Y2.end(), 0.0f); > const float E2_sum = std::accumulate(E2.begin(), E2.end(), 0.0f); > if (instantaneous_erle_.Update(Y2_sum, E2_sum)) { >- hold_counter_time_domain_ = kErleHold; >+ hold_counter_time_domain_ = kBlocksToHoldErle; > erle_time_domain_log2_ += > 0.1f * ((instantaneous_erle_.GetInstErleLog2().value()) - > erle_time_domain_log2_); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/main_filter_update_gain.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/main_filter_update_gain.cc >index 4f48cd47480b2b60f29738c00d65c69eeb1fa05a..ef87d142db1fc519b8ba85896477117cafea224c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/main_filter_update_gain.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/main_filter_update_gain.cc >@@ -13,7 +13,12 @@ > #include <algorithm> > #include <functional> > >+#include "modules/audio_processing/aec3/adaptive_fir_filter.h" > #include "modules/audio_processing/aec3/aec3_common.h" >+#include 
"modules/audio_processing/aec3/echo_path_variability.h" >+#include "modules/audio_processing/aec3/fft_data.h" >+#include "modules/audio_processing/aec3/render_signal_analyzer.h" >+#include "modules/audio_processing/aec3/subtractor_output.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" > #include "rtc_base/atomicops.h" > #include "rtc_base/checks.h" >@@ -127,7 +132,10 @@ void MainFilterUpdateGain::Compute( > H_error_increase.begin(), std::multiplies<float>()); > std::transform(H_error_.begin(), H_error_.end(), H_error_increase.begin(), > H_error_.begin(), [&](float a, float b) { >- return std::max(a + b, current_config_.error_floor); >+ float error = a + b; >+ error = std::max(error, current_config_.error_floor); >+ error = std::min(error, current_config_.error_ceil); >+ return error; > }); > > data_dumper_->DumpRaw("aec3_main_gain_H_error", H_error_); >@@ -153,6 +161,9 @@ void MainFilterUpdateGain::UpdateCurrentConfig() { > current_config_.error_floor = > average(old_target_config_.error_floor, target_config_.error_floor, > change_factor); >+ current_config_.error_ceil = >+ average(old_target_config_.error_ceil, target_config_.error_ceil, >+ change_factor); > current_config_.noise_gate = > average(old_target_config_.noise_gate, target_config_.noise_gate, > change_factor); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/main_filter_update_gain.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/main_filter_update_gain.h >index 525b52279c46ae6b0ee8323efb4e4c15a8ae075b..892ff689a470d30445af62814044c21a6aecd5f9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/main_filter_update_gain.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/main_filter_update_gain.h >@@ -11,20 +11,22 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_MAIN_FILTER_UPDATE_GAIN_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_MAIN_FILTER_UPDATE_GAIN_H_ > >+#include 
<stddef.h> >+#include <array> > #include <memory> >-#include <vector> > > #include "api/audio/echo_canceller3_config.h" >-#include "modules/audio_processing/aec3/adaptive_fir_filter.h" > #include "modules/audio_processing/aec3/aec3_common.h" >-#include "modules/audio_processing/aec3/echo_path_variability.h" >-#include "modules/audio_processing/aec3/render_signal_analyzer.h" >-#include "modules/audio_processing/aec3/subtractor_output.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { > >+class AdaptiveFirFilter; > class ApmDataDumper; >+struct EchoPathVariability; >+struct FftData; >+class RenderSignalAnalyzer; >+struct SubtractorOutput; > > // Provides functionality for computing the adaptive gain for the main filter. > class MainFilterUpdateGain { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/main_filter_update_gain_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/main_filter_update_gain_unittest.cc >index 8c44ae064e2122e9e128a7ef5d89ecd49e681ce9..093b194736f86dcaf35c9a3875067c58a2b24f53 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/main_filter_update_gain_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/main_filter_update_gain_unittest.cc >@@ -66,7 +66,7 @@ void RunFilterUpdateTest(int num_blocks_to_process, > config.delay.min_echo_path_delay_blocks = 0; > config.delay.default_delay = 1; > std::unique_ptr<RenderDelayBuffer> render_delay_buffer( >- RenderDelayBuffer::Create(config, 3)); >+ RenderDelayBuffer::Create2(config, 3)); > AecState aec_state(config); > RenderSignalAnalyzer render_signal_analyzer(config); > absl::optional<DelayEstimate> delay_estimate; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matched_filter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matched_filter.cc >index 
6744bd5812a0c75e3dc2534e94e60071819454b8..757219d52c22b22f93b9ed92d08291831b4501c0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matched_filter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matched_filter.cc >@@ -19,10 +19,14 @@ > #include <emmintrin.h> > #endif > #include <algorithm> >+#include <cstddef> >+#include <initializer_list> >+#include <iterator> > #include <numeric> > >-#include "api/audio/echo_canceller3_config.h" >+#include "modules/audio_processing/aec3/downsampled_render_buffer.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" >+#include "rtc_base/checks.h" > #include "rtc_base/logging.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matched_filter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matched_filter.h >index 2ef48286d79ee6229642b21a080d943315c0f608..2a65339a4cbfefe3d1b02b187fa1d1b0c28084d5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matched_filter.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matched_filter.h >@@ -11,18 +11,19 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_MATCHED_FILTER_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_MATCHED_FILTER_H_ > >-#include <array> >-#include <memory> >+#include <stddef.h> > #include <vector> > >-#include "absl/types/optional.h" > #include "api/array_view.h" > #include "modules/audio_processing/aec3/aec3_common.h" >-#include "modules/audio_processing/aec3/downsampled_render_buffer.h" > #include "rtc_base/constructormagic.h" > #include "rtc_base/system/arch.h" > > namespace webrtc { >+ >+class ApmDataDumper; >+struct DownsampledRenderBuffer; >+ > namespace aec3 { > > #if defined(WEBRTC_HAS_NEON) >@@ -65,7 +66,6 @@ void MatchedFilterCore(size_t x_start_index, > > } // namespace aec3 > >-class ApmDataDumper; > > // Produces recursively updated 
cross-correlation estimates for several signal > // shifts where the intra-shift spacing is uniform. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matched_filter_lag_aggregator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matched_filter_lag_aggregator.cc >index 7a03e60d8bfd94428b4a3507dc92f1078bda4071..f59525eee9e901ec3b2a4402889eaf12ab828775 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matched_filter_lag_aggregator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matched_filter_lag_aggregator.cc >@@ -9,7 +9,11 @@ > */ > #include "modules/audio_processing/aec3/matched_filter_lag_aggregator.h" > >+#include <algorithm> >+#include <iterator> >+ > #include "modules/audio_processing/logging/apm_data_dumper.h" >+#include "rtc_base/checks.h" > > namespace webrtc { > >@@ -27,11 +31,13 @@ MatchedFilterLagAggregator::MatchedFilterLagAggregator( > > MatchedFilterLagAggregator::~MatchedFilterLagAggregator() = default; > >-void MatchedFilterLagAggregator::Reset() { >+void MatchedFilterLagAggregator::Reset(bool hard_reset) { > std::fill(histogram_.begin(), histogram_.end(), 0); > histogram_data_.fill(0); > histogram_data_index_ = 0; >- significant_candidate_found_ = false; >+ if (hard_reset) { >+ significant_candidate_found_ = false; >+ } > } > > absl::optional<DelayEstimate> MatchedFilterLagAggregator::Aggregate( >@@ -77,9 +83,13 @@ absl::optional<DelayEstimate> MatchedFilterLagAggregator::Aggregate( > if (histogram_[candidate] > thresholds_.converged || > (histogram_[candidate] > thresholds_.initial && > !significant_candidate_found_)) { >- return DelayEstimate(DelayEstimate::Quality::kRefined, candidate); >+ DelayEstimate::Quality quality = significant_candidate_found_ >+ ? 
DelayEstimate::Quality::kRefined >+ : DelayEstimate::Quality::kCoarse; >+ return DelayEstimate(quality, candidate); > } > } >+ > return absl::nullopt; > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matched_filter_lag_aggregator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matched_filter_lag_aggregator.h >index c57051aa685f3374a408e9d51c2df6f71a9c38d8..d7f34aed60734bf4152bbc376c26d5b7f9230688 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matched_filter_lag_aggregator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matched_filter_lag_aggregator.h >@@ -34,7 +34,7 @@ class MatchedFilterLagAggregator { > ~MatchedFilterLagAggregator(); > > // Resets the aggregator. >- void Reset(); >+ void Reset(bool hard_reset); > > // Aggregates the provided lag estimates. > absl::optional<DelayEstimate> Aggregate( >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matched_filter_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matched_filter_unittest.cc >index 16b603a13fe37daec690e9c939ba5d2ecf46814f..0c1711894b44f214599f4ac11ec5d7afe4304f17 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matched_filter_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matched_filter_unittest.cc >@@ -165,7 +165,7 @@ TEST(MatchedFilter, LagEstimation) { > config.delay.delay_candidate_detection_threshold); > > std::unique_ptr<RenderDelayBuffer> render_delay_buffer( >- RenderDelayBuffer::Create(config, 3)); >+ RenderDelayBuffer::Create2(config, 3)); > > // Analyze the correlation between render and capture. 
> for (size_t k = 0; k < (600 + delay_samples / sub_block_size); ++k) { >@@ -261,7 +261,7 @@ TEST(MatchedFilter, LagNotReliableForUncorrelatedRenderAndCapture) { > std::fill(capture.begin(), capture.end(), 0.f); > ApmDataDumper data_dumper(0); > std::unique_ptr<RenderDelayBuffer> render_delay_buffer( >- RenderDelayBuffer::Create(config, 3)); >+ RenderDelayBuffer::Create2(config, 3)); > MatchedFilter filter(&data_dumper, DetectOptimization(), sub_block_size, > kWindowSizeSubBlocks, kNumMatchedFilters, > kAlignmentShiftSubBlocks, 150, >@@ -306,7 +306,7 @@ TEST(MatchedFilter, LagNotUpdatedForLowLevelRender) { > config.delay.delay_estimate_smoothing, > config.delay.delay_candidate_detection_threshold); > std::unique_ptr<RenderDelayBuffer> render_delay_buffer( >- RenderDelayBuffer::Create(config, 3)); >+ RenderDelayBuffer::Create2(EchoCanceller3Config(), 3)); > Decimator capture_decimator(down_sampling_factor); > > // Analyze the correlation between render and capture. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matrix_buffer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matrix_buffer.cc >index f95e7f46902e1f05e31a045aa7b9274223ca5b8a..bd6daea95c905ca9e178f8b5622c8b0ac98fa7be 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matrix_buffer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matrix_buffer.cc >@@ -10,7 +10,7 @@ > > #include "modules/audio_processing/aec3/matrix_buffer.h" > >-#include "modules/audio_processing/aec3/aec3_common.h" >+#include <algorithm> > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matrix_buffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matrix_buffer.h >index 64aac0a4f65e577bb1ebfd78af6e22b2d1e4482e..cae3759f92084a3d9e1c807604da8f39a8bf218e 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matrix_buffer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/matrix_buffer.h >@@ -11,6 +11,7 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_MATRIX_BUFFER_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_MATRIX_BUFFER_H_ > >+#include <stddef.h> > #include <vector> > > #include "rtc_base/checks.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/mock/mock_render_delay_controller.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/mock/mock_render_delay_controller.h >index ed5971cda0d36ccb616f60eaf26aa6d82c7f69c2..5f652e192f97a1d8eddef05597ba4598292c9cd0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/mock/mock_render_delay_controller.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/mock/mock_render_delay_controller.h >@@ -25,7 +25,7 @@ class MockRenderDelayController : public RenderDelayController { > MockRenderDelayController(); > virtual ~MockRenderDelayController(); > >- MOCK_METHOD0(Reset, void()); >+ MOCK_METHOD1(Reset, void(bool reset_delay_statistics)); > MOCK_METHOD0(LogRenderCall, void()); > MOCK_METHOD4(GetDelay, > absl::optional<DelayEstimate>( >@@ -33,6 +33,7 @@ class MockRenderDelayController : public RenderDelayController { > size_t render_delay_buffer_delay, > const absl::optional<int>& echo_remover_delay, > rtc::ArrayView<const float> capture)); >+ MOCK_CONST_METHOD0(HasClockdrift, bool()); > }; > > } // namespace test >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/moving_average.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/moving_average.cc >index e9d64e6b32632660f5a0453352234194ef876135..7a81ee89ea67e0dc8e39626cb6259193a4738f0f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/moving_average.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/moving_average.cc >@@ -14,6 +14,8 @@ > #include <algorithm> > #include <functional> > >+#include "rtc_base/checks.h" >+ > namespace webrtc { > namespace aec3 { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/moving_average.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/moving_average.h >index 94497d782c8d9d2aa40c09b348237ceea0bd5e6a..0f855beffb0f50e2d396803748e42d560153187a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/moving_average.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/moving_average.h >@@ -11,6 +11,7 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_MOVING_AVERAGE_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_MOVING_AVERAGE_H_ > >+#include <stddef.h> > #include <vector> > > #include "api/array_view.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_buffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_buffer.h >index dd67268efd71ba8a98fed2cd902f16fe7d392594..4c7c60cf857d7d83d49ae1d32432ddd5234e2de5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_buffer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_buffer.h >@@ -11,14 +11,17 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_RENDER_BUFFER_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_RENDER_BUFFER_H_ > >+#include <stddef.h> > #include <array> >-#include <memory> >+#include <vector> > > #include "api/array_view.h" >+#include "modules/audio_processing/aec3/aec3_common.h" > #include "modules/audio_processing/aec3/fft_buffer.h" > #include "modules/audio_processing/aec3/fft_data.h" > #include "modules/audio_processing/aec3/matrix_buffer.h" > #include "modules/audio_processing/aec3/vector_buffer.h" >+#include "rtc_base/checks.h" > #include 
"rtc_base/constructormagic.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_buffer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_buffer.cc >index 37a2378738cc5be3060d3c4560c5afd644a63213..1ec27790a842db95ed979da7a4a2559b94719c8f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_buffer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_buffer.cc >@@ -10,17 +10,21 @@ > > #include "modules/audio_processing/aec3/render_delay_buffer.h" > >-#include <string.h> >+#include <stdlib.h> > #include <algorithm> >+#include <memory> > #include <numeric> > >+#include "absl/types/optional.h" >+#include "api/array_view.h" > #include "modules/audio_processing/aec3/aec3_common.h" > #include "modules/audio_processing/aec3/aec3_fft.h" >-#include "modules/audio_processing/aec3/block_processor.h" > #include "modules/audio_processing/aec3/decimator.h" > #include "modules/audio_processing/aec3/fft_buffer.h" > #include "modules/audio_processing/aec3/fft_data.h" > #include "modules/audio_processing/aec3/matrix_buffer.h" >+#include "modules/audio_processing/aec3/vector_buffer.h" >+#include "modules/audio_processing/logging/apm_data_dumper.h" > #include "rtc_base/atomicops.h" > #include "rtc_base/checks.h" > #include "rtc_base/constructormagic.h" >@@ -35,14 +39,6 @@ bool EnableZeroExternalDelayHeadroom() { > "WebRTC-Aec3ZeroExternalDelayHeadroomKillSwitch"); > } > >-size_t GetDownSamplingFactor(const EchoCanceller3Config& config) { >- // Do not use down sampling factor 8 if kill switch is triggered. >- return (config.delay.down_sampling_factor == 8 && >- field_trial::IsEnabled("WebRTC-Aec3DownSamplingFactor8KillSwitch")) >- ? 
4 >- : config.delay.down_sampling_factor; >-} >- > class RenderDelayBufferImpl final : public RenderDelayBuffer { > public: > RenderDelayBufferImpl(const EchoCanceller3Config& config, size_t num_bands); >@@ -177,7 +173,7 @@ RenderDelayBufferImpl::RenderDelayBufferImpl(const EchoCanceller3Config& config, > new ApmDataDumper(rtc::AtomicOps::Increment(&instance_count_))), > optimization_(DetectOptimization()), > config_(config), >- down_sampling_factor_(GetDownSamplingFactor(config)), >+ down_sampling_factor_(config.delay.down_sampling_factor), > use_zero_external_delay_headroom_(EnableZeroExternalDelayHeadroom()), > sub_block_size_(static_cast<int>(down_sampling_factor_ > 0 > ? kBlockSize / down_sampling_factor_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_buffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_buffer.h >index a6d6874e3e4696b105de04a402c3327141384a3f..bd242f743f9b2a240ddb2c6d76359c842eb33b9a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_buffer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_buffer.h >@@ -12,15 +12,10 @@ > #define MODULES_AUDIO_PROCESSING_AEC3_RENDER_DELAY_BUFFER_H_ > > #include <stddef.h> >-#include <array> > #include <vector> > >-#include "absl/types/optional.h" >-#include "api/array_view.h" > #include "api/audio/echo_canceller3_config.h" >-#include "modules/audio_processing/aec3/aec3_common.h" > #include "modules/audio_processing/aec3/downsampled_render_buffer.h" >-#include "modules/audio_processing/aec3/fft_data.h" > #include "modules/audio_processing/aec3/render_buffer.h" > > namespace webrtc { >@@ -33,12 +28,13 @@ class RenderDelayBuffer { > kNone, > kRenderUnderrun, > kRenderOverrun, >- kApiCallSkew, >- kRenderDataLost >+ kApiCallSkew > }; > > static RenderDelayBuffer* Create(const EchoCanceller3Config& config, > size_t num_bands); >+ 
static RenderDelayBuffer* Create2(const EchoCanceller3Config& config, >+ size_t num_bands); > virtual ~RenderDelayBuffer() = default; > > // Resets the buffer alignment. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_buffer2.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_buffer2.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..6992c5b898765f2529d99a355a881cda5f56db95 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_buffer2.cc >@@ -0,0 +1,453 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include <string.h> >+#include <algorithm> >+#include <memory> >+#include <numeric> >+#include <vector> >+ >+#include "absl/types/optional.h" >+#include "api/array_view.h" >+#include "api/audio/echo_canceller3_config.h" >+#include "modules/audio_processing/aec3/aec3_common.h" >+#include "modules/audio_processing/aec3/aec3_fft.h" >+#include "modules/audio_processing/aec3/decimator.h" >+#include "modules/audio_processing/aec3/downsampled_render_buffer.h" >+#include "modules/audio_processing/aec3/fft_buffer.h" >+#include "modules/audio_processing/aec3/fft_data.h" >+#include "modules/audio_processing/aec3/matrix_buffer.h" >+#include "modules/audio_processing/aec3/render_buffer.h" >+#include "modules/audio_processing/aec3/render_delay_buffer.h" >+#include "modules/audio_processing/aec3/vector_buffer.h" >+#include "modules/audio_processing/logging/apm_data_dumper.h" >+#include "rtc_base/atomicops.h" >+#include "rtc_base/checks.h" >+#include "rtc_base/logging.h" >+ >+namespace webrtc { >+namespace { >+ >+class RenderDelayBufferImpl2 final : public RenderDelayBuffer { >+ public: >+ RenderDelayBufferImpl2(const EchoCanceller3Config& config, size_t num_bands); >+ RenderDelayBufferImpl2() = delete; >+ ~RenderDelayBufferImpl2() override; >+ >+ void Reset() override; >+ BufferingEvent Insert(const std::vector<std::vector<float>>& block) override; >+ BufferingEvent PrepareCaptureProcessing() override; >+ bool SetDelay(size_t delay) override; >+ size_t Delay() const override { return ComputeDelay(); } >+ size_t MaxDelay() const override { >+ return blocks_.buffer.size() - 1 - buffer_headroom_; >+ } >+ RenderBuffer* GetRenderBuffer() override { return &echo_remover_buffer_; } >+ >+ const DownsampledRenderBuffer& GetDownsampledRenderBuffer() const override { >+ return low_rate_; >+ } >+ >+ int BufferLatency() const; >+ bool CausalDelay(size_t delay) const override; >+ void SetAudioBufferDelay(size_t delay_ms) override; >+ >+ private: >+ static int instance_count_; >+ 
std::unique_ptr<ApmDataDumper> data_dumper_; >+ const Aec3Optimization optimization_; >+ const EchoCanceller3Config config_; >+ size_t down_sampling_factor_; >+ const int sub_block_size_; >+ MatrixBuffer blocks_; >+ VectorBuffer spectra_; >+ FftBuffer ffts_; >+ absl::optional<size_t> delay_; >+ RenderBuffer echo_remover_buffer_; >+ DownsampledRenderBuffer low_rate_; >+ Decimator render_decimator_; >+ const Aec3Fft fft_; >+ std::vector<float> render_ds_; >+ const int buffer_headroom_; >+ bool last_call_was_render_ = false; >+ int num_api_calls_in_a_row_ = 0; >+ int max_observed_jitter_ = 1; >+ size_t capture_call_counter_ = 0; >+ size_t render_call_counter_ = 0; >+ bool render_activity_ = false; >+ size_t render_activity_counter_ = 0; >+ absl::optional<size_t> external_audio_buffer_delay_; >+ bool external_audio_buffer_delay_verified_after_reset_ = false; >+ size_t min_latency_blocks_ = 0; >+ size_t excess_render_detection_counter_ = 0; >+ size_t num_bands_; >+ >+ int MapDelayToTotalDelay(size_t delay) const; >+ int ComputeDelay() const; >+ void ApplyTotalDelay(int delay); >+ void InsertBlock(const std::vector<std::vector<float>>& block, >+ int previous_write); >+ bool DetectActiveRender(rtc::ArrayView<const float> x) const; >+ bool DetectExcessRenderBlocks(); >+ void IncrementWriteIndices(); >+ void IncrementLowRateReadIndices(); >+ void IncrementReadIndices(); >+ bool RenderOverrun(); >+ bool RenderUnderrun(); >+}; >+ >+int RenderDelayBufferImpl2::instance_count_ = 0; >+ >+RenderDelayBufferImpl2::RenderDelayBufferImpl2( >+ const EchoCanceller3Config& config, >+ size_t num_bands) >+ : data_dumper_( >+ new ApmDataDumper(rtc::AtomicOps::Increment(&instance_count_))), >+ optimization_(DetectOptimization()), >+ config_(config), >+ down_sampling_factor_(config.delay.down_sampling_factor), >+ sub_block_size_(static_cast<int>(down_sampling_factor_ > 0 >+ ? 
kBlockSize / down_sampling_factor_ >+ : kBlockSize)), >+ blocks_(GetRenderDelayBufferSize(down_sampling_factor_, >+ config.delay.num_filters, >+ config.filter.main.length_blocks), >+ num_bands, >+ kBlockSize), >+ spectra_(blocks_.buffer.size(), kFftLengthBy2Plus1), >+ ffts_(blocks_.buffer.size()), >+ delay_(config_.delay.default_delay), >+ echo_remover_buffer_(&blocks_, &spectra_, &ffts_), >+ low_rate_(GetDownSampledBufferSize(down_sampling_factor_, >+ config.delay.num_filters)), >+ render_decimator_(down_sampling_factor_), >+ fft_(), >+ render_ds_(sub_block_size_, 0.f), >+ buffer_headroom_(config.filter.main.length_blocks), >+ num_bands_(num_bands) { >+ RTC_DCHECK_EQ(blocks_.buffer.size(), ffts_.buffer.size()); >+ RTC_DCHECK_EQ(spectra_.buffer.size(), ffts_.buffer.size()); >+ >+ Reset(); >+} >+ >+RenderDelayBufferImpl2::~RenderDelayBufferImpl2() = default; >+ >+// Resets the buffer delays and clears the reported delays. >+void RenderDelayBufferImpl2::Reset() { >+ last_call_was_render_ = false; >+ num_api_calls_in_a_row_ = 1; >+ min_latency_blocks_ = 0; >+ excess_render_detection_counter_ = 0; >+ >+ // Initialize the read index to one sub-block before the write index. >+ low_rate_.read = low_rate_.OffsetIndex(low_rate_.write, sub_block_size_); >+ >+ // Check for any external audio buffer delay and whether it is feasible. >+ if (external_audio_buffer_delay_) { >+ const size_t headroom = 2; >+ size_t audio_buffer_delay_to_set; >+ // Minimum delay is 1 (like the low-rate render buffer). >+ if (*external_audio_buffer_delay_ <= headroom) { >+ audio_buffer_delay_to_set = 1; >+ } else { >+ audio_buffer_delay_to_set = *external_audio_buffer_delay_ - headroom; >+ } >+ >+ audio_buffer_delay_to_set = std::min(audio_buffer_delay_to_set, MaxDelay()); >+ >+ // When an external delay estimate is available, use that delay as the >+ // initial render buffer delay. 
>+ ApplyTotalDelay(audio_buffer_delay_to_set); >+ delay_ = ComputeDelay(); >+ >+ external_audio_buffer_delay_verified_after_reset_ = false; >+ } else { >+ // If an external delay estimate is not available, set the render buffer >+ // delays to the default delay. >+ ApplyTotalDelay(config_.delay.default_delay); >+ >+ // Unset the delays which are set by SetDelay. >+ delay_ = absl::nullopt; >+ } >+} >+ >+// Inserts a new block into the render buffers. >+RenderDelayBuffer::BufferingEvent RenderDelayBufferImpl2::Insert( >+ const std::vector<std::vector<float>>& block) { >+ ++render_call_counter_; >+ if (delay_) { >+ if (!last_call_was_render_) { >+ last_call_was_render_ = true; >+ num_api_calls_in_a_row_ = 1; >+ } else { >+ if (++num_api_calls_in_a_row_ > max_observed_jitter_) { >+ max_observed_jitter_ = num_api_calls_in_a_row_; >+ RTC_LOG(LS_WARNING) >+ << "New max number api jitter observed at render block " >+ << render_call_counter_ << ": " << num_api_calls_in_a_row_ >+ << " blocks"; >+ } >+ } >+ } >+ >+ // Increase the write indices to where the new blocks should be written. >+ const int previous_write = blocks_.write; >+ IncrementWriteIndices(); >+ >+ // Allow overrun and do a reset when a render overrun occurs due to more >+ // render data being inserted than capture data being received. >+ BufferingEvent event = >+ RenderOverrun() ? BufferingEvent::kRenderOverrun : BufferingEvent::kNone; >+ >+ // Detect and update render activity. >+ if (!render_activity_) { >+ render_activity_counter_ += DetectActiveRender(block[0]) ? 1 : 0; >+ render_activity_ = render_activity_counter_ >= 20; >+ } >+ >+ // Insert the new render block into the specified position. >+ InsertBlock(block, previous_write); >+ >+ if (event != BufferingEvent::kNone) { >+ Reset(); >+ } >+ >+ return event; >+} >+ >+// Prepares the render buffers for processing another capture block.
>+RenderDelayBuffer::BufferingEvent >+RenderDelayBufferImpl2::PrepareCaptureProcessing() { >+ RenderDelayBuffer::BufferingEvent event = BufferingEvent::kNone; >+ ++capture_call_counter_; >+ >+ if (delay_) { >+ if (last_call_was_render_) { >+ last_call_was_render_ = false; >+ num_api_calls_in_a_row_ = 1; >+ } else { >+ if (++num_api_calls_in_a_row_ > max_observed_jitter_) { >+ max_observed_jitter_ = num_api_calls_in_a_row_; >+ RTC_LOG(LS_WARNING) >+ << "New max number api jitter observed at capture block " >+ << capture_call_counter_ << ": " << num_api_calls_in_a_row_ >+ << " blocks"; >+ } >+ } >+ } >+ >+ if (DetectExcessRenderBlocks()) { >+ // Too many render blocks compared to capture blocks. Risk of delay ending >+ // up before the filter used by the delay estimator. >+ RTC_LOG(LS_WARNING) << "Excess render blocks detected at block " >+ << capture_call_counter_; >+ Reset(); >+ event = BufferingEvent::kRenderOverrun; >+ } else if (RenderUnderrun()) { >+ // Don't increment the read indices of the low rate buffer if there is a >+ // render underrun. >+ RTC_LOG(LS_WARNING) << "Render buffer underrun detected at block " >+ << capture_call_counter_; >+ IncrementReadIndices(); >+ // Incrementing the buffer index without increasing the low rate buffer >+ // index means that the delay is reduced by one. >+ if (delay_ && *delay_ > 0) >+ delay_ = *delay_ - 1; >+ event = BufferingEvent::kRenderUnderrun; >+ } else { >+ // Increment the read indices in the render buffers to point to the most >+ // recent block to use in the capture processing. >+ IncrementLowRateReadIndices(); >+ IncrementReadIndices(); >+ } >+ >+ echo_remover_buffer_.SetRenderActivity(render_activity_); >+ if (render_activity_) { >+ render_activity_counter_ = 0; >+ render_activity_ = false; >+ } >+ >+ return event; >+} >+ >+// Sets the delay and returns a bool indicating whether the delay was changed. 
>+bool RenderDelayBufferImpl2::SetDelay(size_t delay) { >+ if (!external_audio_buffer_delay_verified_after_reset_ && >+ external_audio_buffer_delay_ && delay_) { >+ int difference = static_cast<int>(delay) - static_cast<int>(*delay_); >+ RTC_LOG(LS_WARNING) << "Mismatch between first estimated delay after reset " >+ "and externally reported audio buffer delay: " >+ << difference << " blocks"; >+ external_audio_buffer_delay_verified_after_reset_ = true; >+ } >+ if (delay_ && *delay_ == delay) { >+ return false; >+ } >+ delay_ = delay; >+ >+ // Compute the total delay and limit the delay to the allowed range. >+ int total_delay = MapDelayToTotalDelay(*delay_); >+ total_delay = >+ std::min(MaxDelay(), static_cast<size_t>(std::max(total_delay, 0))); >+ >+ // Apply the delay to the buffers. >+ ApplyTotalDelay(total_delay); >+ return true; >+} >+ >+// Returns whether the specified delay is causal. >+bool RenderDelayBufferImpl2::CausalDelay(size_t delay) const { >+ // TODO(gustaf): Remove this from RenderDelayBuffer. >+ return true; >+} >+ >+void RenderDelayBufferImpl2::SetAudioBufferDelay(size_t delay_ms) { >+ if (!external_audio_buffer_delay_) { >+ RTC_LOG(LS_WARNING) >+ << "Receiving a first externally reported audio buffer delay of " >+ << delay_ms << " ms."; >+ } >+ >+ // Convert delay from milliseconds to blocks (rounded down). >+ external_audio_buffer_delay_ = delay_ms >> ((num_bands_ == 1) ? 1 : 2); >+} >+ >+// Maps the externally computed delay to the delay used internally. >+int RenderDelayBufferImpl2::MapDelayToTotalDelay( >+ size_t external_delay_blocks) const { >+ const int latency_blocks = BufferLatency(); >+ return latency_blocks + static_cast<int>(external_delay_blocks); >+} >+ >+// Returns the delay (not including call jitter). >+int RenderDelayBufferImpl2::ComputeDelay() const { >+ const int latency_blocks = BufferLatency(); >+ int internal_delay = spectra_.read >= spectra_.write >+ ? 
spectra_.read - spectra_.write >+ : spectra_.size + spectra_.read - spectra_.write; >+ >+ return internal_delay - latency_blocks; >+} >+ >+// Set the read indices according to the delay. >+void RenderDelayBufferImpl2::ApplyTotalDelay(int delay) { >+ RTC_LOG(LS_WARNING) << "Applying total delay of " << delay << " blocks."; >+ blocks_.read = blocks_.OffsetIndex(blocks_.write, -delay); >+ spectra_.read = spectra_.OffsetIndex(spectra_.write, delay); >+ ffts_.read = ffts_.OffsetIndex(ffts_.write, delay); >+} >+ >+// Inserts a block into the render buffers. >+void RenderDelayBufferImpl2::InsertBlock( >+ const std::vector<std::vector<float>>& block, >+ int previous_write) { >+ auto& b = blocks_; >+ auto& lr = low_rate_; >+ auto& ds = render_ds_; >+ auto& f = ffts_; >+ auto& s = spectra_; >+ RTC_DCHECK_EQ(block.size(), b.buffer[b.write].size()); >+ for (size_t k = 0; k < block.size(); ++k) { >+ RTC_DCHECK_EQ(block[k].size(), b.buffer[b.write][k].size()); >+ std::copy(block[k].begin(), block[k].end(), b.buffer[b.write][k].begin()); >+ } >+ >+ data_dumper_->DumpWav("aec3_render_decimator_input", block[0].size(), >+ block[0].data(), 16000, 1); >+ render_decimator_.Decimate(block[0], ds); >+ data_dumper_->DumpWav("aec3_render_decimator_output", ds.size(), ds.data(), >+ 16000 / down_sampling_factor_, 1); >+ std::copy(ds.rbegin(), ds.rend(), lr.buffer.begin() + lr.write); >+ fft_.PaddedFft(block[0], b.buffer[previous_write][0], &f.buffer[f.write]); >+ f.buffer[f.write].Spectrum(optimization_, s.buffer[s.write]); >+} >+ >+bool RenderDelayBufferImpl2::DetectActiveRender( >+ rtc::ArrayView<const float> x) const { >+ const float x_energy = std::inner_product(x.begin(), x.end(), x.begin(), 0.f); >+ return x_energy > (config_.render_levels.active_render_limit * >+ config_.render_levels.active_render_limit) * >+ kFftLengthBy2; >+} >+ >+bool RenderDelayBufferImpl2::DetectExcessRenderBlocks() { >+ bool excess_render_detected = false; >+ const size_t latency_blocks = 
static_cast<size_t>(BufferLatency()); >+ // The recently seen minimum latency in blocks. Should be close to 0. >+ min_latency_blocks_ = std::min(min_latency_blocks_, latency_blocks); >+ // After processing a configurable number of blocks the minimum latency is >+ // checked. >+ if (++excess_render_detection_counter_ >= >+ config_.buffering.excess_render_detection_interval_blocks) { >+ // If the minimum latency is not lower than the threshold there have been >+ // more render than capture frames. >+ excess_render_detected = min_latency_blocks_ > >+ config_.buffering.max_allowed_excess_render_blocks; >+ // Reset the counter and let the minimum latency be the current latency. >+ min_latency_blocks_ = latency_blocks; >+ excess_render_detection_counter_ = 0; >+ } >+ >+ data_dumper_->DumpRaw("aec3_latency_blocks", latency_blocks); >+ data_dumper_->DumpRaw("aec3_min_latency_blocks", min_latency_blocks_); >+ data_dumper_->DumpRaw("aec3_excess_render_detected", excess_render_detected); >+ return excess_render_detected; >+} >+ >+// Computes the latency in the buffer (the number of unread sub-blocks). >+int RenderDelayBufferImpl2::BufferLatency() const { >+ const DownsampledRenderBuffer& l = low_rate_; >+ int latency_samples = (l.buffer.size() + l.read - l.write) % l.buffer.size(); >+ int latency_blocks = latency_samples / sub_block_size_; >+ return latency_blocks; >+} >+ >+// Increments the write indices for the render buffers. >+void RenderDelayBufferImpl2::IncrementWriteIndices() { >+ low_rate_.UpdateWriteIndex(-sub_block_size_); >+ blocks_.IncWriteIndex(); >+ spectra_.DecWriteIndex(); >+ ffts_.DecWriteIndex(); >+} >+ >+// Increments the read indices of the low rate render buffers. >+void RenderDelayBufferImpl2::IncrementLowRateReadIndices() { >+ low_rate_.UpdateReadIndex(-sub_block_size_); >+} >+ >+// Increments the read indices for the render buffers. 
>+void RenderDelayBufferImpl2::IncrementReadIndices() { >+ if (blocks_.read != blocks_.write) { >+ blocks_.IncReadIndex(); >+ spectra_.DecReadIndex(); >+ ffts_.DecReadIndex(); >+ } >+} >+ >+// Checks for a render buffer overrun. >+bool RenderDelayBufferImpl2::RenderOverrun() { >+ return low_rate_.read == low_rate_.write || blocks_.read == blocks_.write; >+} >+ >+// Checks for a render buffer underrun. >+bool RenderDelayBufferImpl2::RenderUnderrun() { >+ return low_rate_.read == low_rate_.write; >+} >+ >+} // namespace >+ >+RenderDelayBuffer* RenderDelayBuffer::Create2( >+ const EchoCanceller3Config& config, >+ size_t num_bands) { >+ return new RenderDelayBufferImpl2(config, num_bands); >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_buffer_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_buffer_unittest.cc >index ee895972619f40416d9cff2bc291a089871f41c6..d1530c6142c4e90b3d219ada7df3dff818bb5aaf 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_buffer_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_buffer_unittest.cc >@@ -38,7 +38,7 @@ TEST(RenderDelayBuffer, BufferOverflow) { > for (auto rate : {8000, 16000, 32000, 48000}) { > SCOPED_TRACE(ProduceDebugText(rate)); > std::unique_ptr<RenderDelayBuffer> delay_buffer( >- RenderDelayBuffer::Create(config, NumBandsForRate(rate))); >+ RenderDelayBuffer::Create2(config, NumBandsForRate(rate))); > std::vector<std::vector<float>> block_to_insert( > NumBandsForRate(rate), std::vector<float>(kBlockSize, 0.f)); > for (size_t k = 0; k < 10; ++k) { >@@ -62,7 +62,7 @@ TEST(RenderDelayBuffer, BufferOverflow) { > TEST(RenderDelayBuffer, AvailableBlock) { > constexpr size_t kNumBands = 1; > std::unique_ptr<RenderDelayBuffer> delay_buffer( >- RenderDelayBuffer::Create(EchoCanceller3Config(), 
kNumBands)); >+ RenderDelayBuffer::Create2(EchoCanceller3Config(), kNumBands)); > std::vector<std::vector<float>> input_block( > kNumBands, std::vector<float>(kBlockSize, 1.f)); > EXPECT_EQ(RenderDelayBuffer::BufferingEvent::kNone, >@@ -74,7 +74,7 @@ TEST(RenderDelayBuffer, AvailableBlock) { > TEST(RenderDelayBuffer, SetDelay) { > EchoCanceller3Config config; > std::unique_ptr<RenderDelayBuffer> delay_buffer( >- RenderDelayBuffer::Create(config, 1)); >+ RenderDelayBuffer::Create2(config, 1)); > ASSERT_TRUE(delay_buffer->Delay()); > delay_buffer->Reset(); > size_t initial_internal_delay = config.delay.min_echo_path_delay_blocks + >@@ -93,7 +93,7 @@ TEST(RenderDelayBuffer, SetDelay) { > // tests on test bots has been fixed. > TEST(RenderDelayBuffer, DISABLED_WrongDelay) { > std::unique_ptr<RenderDelayBuffer> delay_buffer( >- RenderDelayBuffer::Create(EchoCanceller3Config(), 3)); >+ RenderDelayBuffer::Create2(EchoCanceller3Config(), 3)); > EXPECT_DEATH(delay_buffer->SetDelay(21), ""); > } > >@@ -101,7 +101,7 @@ TEST(RenderDelayBuffer, DISABLED_WrongDelay) { > TEST(RenderDelayBuffer, WrongNumberOfBands) { > for (auto rate : {16000, 32000, 48000}) { > SCOPED_TRACE(ProduceDebugText(rate)); >- std::unique_ptr<RenderDelayBuffer> delay_buffer(RenderDelayBuffer::Create( >+ std::unique_ptr<RenderDelayBuffer> delay_buffer(RenderDelayBuffer::Create2( > EchoCanceller3Config(), NumBandsForRate(rate))); > std::vector<std::vector<float>> block_to_insert( > NumBandsForRate(rate < 48000 ? 
rate + 16000 : 16000), >@@ -115,7 +115,7 @@ TEST(RenderDelayBuffer, WrongBlockLength) { > for (auto rate : {8000, 16000, 32000, 48000}) { > SCOPED_TRACE(ProduceDebugText(rate)); > std::unique_ptr<RenderDelayBuffer> delay_buffer( >- RenderDelayBuffer::Create(EchoCanceller3Config(), 3)); >+ RenderDelayBuffer::Create2(EchoCanceller3Config(), 3)); > std::vector<std::vector<float>> block_to_insert( > NumBandsForRate(rate), std::vector<float>(kBlockSize - 1, 0.f)); > EXPECT_DEATH(delay_buffer->Insert(block_to_insert), ""); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller.cc >index f336c392b4ee7039623668221d84c805fb6a9588..c4665eaa224dec1edd4325b22826104d4157a98c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller.cc >@@ -9,10 +9,9 @@ > */ > #include "modules/audio_processing/aec3/render_delay_controller.h" > >+#include <stdlib.h> > #include <algorithm> > #include <memory> >-#include <numeric> >-#include <string> > #include <vector> > > #include "api/audio/echo_canceller3_config.h" >@@ -20,7 +19,9 @@ > #include "modules/audio_processing/aec3/echo_path_delay_estimator.h" > #include "modules/audio_processing/aec3/render_delay_controller_metrics.h" > #include "modules/audio_processing/aec3/skew_estimator.h" >+#include "modules/audio_processing/logging/apm_data_dumper.h" > #include "rtc_base/atomicops.h" >+#include "rtc_base/checks.h" > #include "rtc_base/constructormagic.h" > #include "rtc_base/logging.h" > #include "system_wrappers/include/field_trial.h" >@@ -44,6 +45,10 @@ bool UseOffsetBlocks() { > return field_trial::IsEnabled("WebRTC-Aec3UseOffsetBlocks"); > } > >+bool UseEarlyDelayDetection() { >+ return 
!field_trial::IsEnabled("WebRTC-Aec3EarlyDelayDetectionKillSwitch"); >+} >+ > constexpr int kSkewHistorySizeLog2 = 8; > > class RenderDelayControllerImpl final : public RenderDelayController { >@@ -52,17 +57,19 @@ class RenderDelayControllerImpl final : public RenderDelayController { > int non_causal_offset, > int sample_rate_hz); > ~RenderDelayControllerImpl() override; >- void Reset() override; >+ void Reset(bool reset_delay_confidence) override; > void LogRenderCall() override; > absl::optional<DelayEstimate> GetDelay( > const DownsampledRenderBuffer& render_buffer, > size_t render_delay_buffer_delay, > const absl::optional<int>& echo_remover_delay, > rtc::ArrayView<const float> capture) override; >+ bool HasClockdrift() const override; > > private: > static int instance_count_; > std::unique_ptr<ApmDataDumper> data_dumper_; >+ const bool use_early_delay_detection_; > const int delay_headroom_blocks_; > const int hysteresis_limit_1_blocks_; > const int hysteresis_limit_2_blocks_; >@@ -81,6 +88,7 @@ class RenderDelayControllerImpl final : public RenderDelayController { > size_t capture_call_counter_ = 0; > int delay_change_counter_ = 0; > size_t soft_reset_counter_ = 0; >+ DelayEstimate::Quality last_delay_estimate_quality_; > RTC_DISALLOW_IMPLICIT_CONSTRUCTORS(RenderDelayControllerImpl); > }; > >@@ -129,6 +137,7 @@ RenderDelayControllerImpl::RenderDelayControllerImpl( > int sample_rate_hz) > : data_dumper_( > new ApmDataDumper(rtc::AtomicOps::Increment(&instance_count_))), >+ use_early_delay_detection_(UseEarlyDelayDetection()), > delay_headroom_blocks_( > static_cast<int>(config.delay.delay_headroom_blocks)), > hysteresis_limit_1_blocks_( >@@ -139,7 +148,8 @@ RenderDelayControllerImpl::RenderDelayControllerImpl( > use_offset_blocks_(UseOffsetBlocks()), > delay_estimator_(data_dumper_.get(), config), > delay_buf_(kBlockSize * non_causal_offset, 0.f), >- skew_estimator_(kSkewHistorySizeLog2) { >+ skew_estimator_(kSkewHistorySizeLog2), >+ 
last_delay_estimate_quality_(DelayEstimate::Quality::kCoarse) { > RTC_DCHECK(ValidFullBandRate(sample_rate_hz)); > delay_estimator_.LogDelayEstimationProperties(sample_rate_hz, > delay_buf_.size()); >@@ -147,16 +157,19 @@ RenderDelayControllerImpl::RenderDelayControllerImpl( > > RenderDelayControllerImpl::~RenderDelayControllerImpl() = default; > >-void RenderDelayControllerImpl::Reset() { >+void RenderDelayControllerImpl::Reset(bool reset_delay_confidence) { > delay_ = absl::nullopt; > delay_samples_ = absl::nullopt; > skew_ = absl::nullopt; > previous_offset_blocks_ = 0; > std::fill(delay_buf_.begin(), delay_buf_.end(), 0.f); >- delay_estimator_.Reset(false); >+ delay_estimator_.Reset(reset_delay_confidence); > skew_estimator_.Reset(); > delay_change_counter_ = 0; > soft_reset_counter_ = 0; >+ if (reset_delay_confidence) { >+ last_delay_estimate_quality_ = DelayEstimate::Quality::kCoarse; >+ } > } > > void RenderDelayControllerImpl::LogRenderCall() { >@@ -194,9 +207,6 @@ absl::optional<DelayEstimate> RenderDelayControllerImpl::GetDelay( > absl::optional<int> skew = skew_estimator_.GetSkewFromCapture(); > > if (delay_samples) { >- // TODO(peah): Refactor the rest of the code to assume a kRefined estimate >- // quality. >- RTC_DCHECK(DelayEstimate::Quality::kRefined == delay_samples->quality); > if (!delay_samples_ || delay_samples->delay != delay_samples_->delay) { > delay_change_counter_ = 0; > } >@@ -239,7 +249,7 @@ absl::optional<DelayEstimate> RenderDelayControllerImpl::GetDelay( > } else if (soft_reset_counter_ > 10 * kNumBlocksPerSecond) { > // Soft reset the delay estimator if there is a significant offset > // detected. >- delay_estimator_.Reset(true); >+ delay_estimator_.Reset(false); > soft_reset_counter_ = 0; > } > } >@@ -264,14 +274,20 @@ absl::optional<DelayEstimate> RenderDelayControllerImpl::GetDelay( > > if (delay_samples_) { > // Compute the render delay buffer delay. 
>- delay_ = ComputeBufferDelay( >- delay_, delay_headroom_blocks_, hysteresis_limit_1_blocks_, >- hysteresis_limit_2_blocks_, offset_blocks, *delay_samples_); >+ const bool use_hysteresis = >+ last_delay_estimate_quality_ == DelayEstimate::Quality::kRefined && >+ delay_samples_->quality == DelayEstimate::Quality::kRefined; >+ delay_ = ComputeBufferDelay(delay_, delay_headroom_blocks_, >+ use_hysteresis ? hysteresis_limit_1_blocks_ : 0, >+ use_hysteresis ? hysteresis_limit_2_blocks_ : 0, >+ offset_blocks, *delay_samples_); >+ last_delay_estimate_quality_ = delay_samples_->quality; > } > > metrics_.Update(delay_samples_ ? absl::optional<size_t>(delay_samples_->delay) > : absl::nullopt, >- delay_ ? delay_->delay : 0, skew_shift); >+ delay_ ? delay_->delay : 0, skew_shift, >+ delay_estimator_.Clockdrift()); > > data_dumper_->DumpRaw("aec3_render_delay_controller_delay", > delay_samples ? delay_samples->delay : 0); >@@ -287,6 +303,10 @@ absl::optional<DelayEstimate> RenderDelayControllerImpl::GetDelay( > return delay_; > } > >+bool RenderDelayControllerImpl::HasClockdrift() const { >+ return delay_estimator_.Clockdrift() != ClockdriftDetector::Level::kNone; >+} >+ > } // namespace > > RenderDelayController* RenderDelayController::Create( >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller.h >index ddd95482fcce18b8bba92da9f4698525ea1c0894..b46ed892527f792881ef548c3b219698e2ceb8ab 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller.h >@@ -27,10 +27,13 @@ class RenderDelayController { > static RenderDelayController* Create(const EchoCanceller3Config& config, > int non_causal_offset, > int sample_rate_hz); >+ static RenderDelayController* Create2(const 
EchoCanceller3Config& config, >+ int sample_rate_hz); > virtual ~RenderDelayController() = default; > >- // Resets the delay controller. >- virtual void Reset() = 0; >+ // Resets the delay controller. If the delay confidence is reset, the reset >+ // behavior is as if the call is restarted. >+ virtual void Reset(bool reset_delay_confidence) = 0; > > // Logs a render call. > virtual void LogRenderCall() = 0; >@@ -41,6 +44,9 @@ class RenderDelayController { > size_t render_delay_buffer_delay, > const absl::optional<int>& echo_remover_delay, > rtc::ArrayView<const float> capture) = 0; >+ >+ // Returns true if clockdrift has been detected. >+ virtual bool HasClockdrift() const = 0; > }; > } // namespace webrtc > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller2.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller2.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..00daf8f2af71c2ddabb374acadfedb796998e426 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller2.cc >@@ -0,0 +1,218 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+#include <stddef.h> >+#include <algorithm> >+#include <memory> >+ >+#include "absl/types/optional.h" >+#include "api/array_view.h" >+#include "api/audio/echo_canceller3_config.h" >+#include "modules/audio_processing/aec3/aec3_common.h" >+#include "modules/audio_processing/aec3/delay_estimate.h" >+#include "modules/audio_processing/aec3/downsampled_render_buffer.h" >+#include "modules/audio_processing/aec3/echo_path_delay_estimator.h" >+#include "modules/audio_processing/aec3/render_delay_controller.h" >+#include "modules/audio_processing/aec3/render_delay_controller_metrics.h" >+#include "modules/audio_processing/logging/apm_data_dumper.h" >+#include "rtc_base/atomicops.h" >+#include "rtc_base/checks.h" >+#include "rtc_base/constructormagic.h" >+#include "system_wrappers/include/field_trial.h" >+ >+namespace webrtc { >+ >+namespace { >+ >+bool UseEarlyDelayDetection() { >+ return !field_trial::IsEnabled("WebRTC-Aec3EarlyDelayDetectionKillSwitch"); >+} >+ >+class RenderDelayControllerImpl2 final : public RenderDelayController { >+ public: >+ RenderDelayControllerImpl2(const EchoCanceller3Config& config, >+ int sample_rate_hz); >+ ~RenderDelayControllerImpl2() override; >+ void Reset(bool reset_delay_confidence) override; >+ void LogRenderCall() override; >+ absl::optional<DelayEstimate> GetDelay( >+ const DownsampledRenderBuffer& render_buffer, >+ size_t render_delay_buffer_delay, >+ const absl::optional<int>& echo_remover_delay, >+ rtc::ArrayView<const float> capture) override; >+ bool HasClockdrift() const override; >+ >+ private: >+ static int instance_count_; >+ std::unique_ptr<ApmDataDumper> data_dumper_; >+ const bool use_early_delay_detection_; >+ const int delay_headroom_blocks_; >+ const int hysteresis_limit_1_blocks_; >+ const int hysteresis_limit_2_blocks_; >+ absl::optional<DelayEstimate> delay_; >+ EchoPathDelayEstimator delay_estimator_; >+ RenderDelayControllerMetrics metrics_; >+ absl::optional<DelayEstimate> delay_samples_; >+ size_t 
capture_call_counter_ = 0; >+ int delay_change_counter_ = 0; >+ DelayEstimate::Quality last_delay_estimate_quality_; >+ RTC_DISALLOW_IMPLICIT_CONSTRUCTORS(RenderDelayControllerImpl2); >+}; >+ >+DelayEstimate ComputeBufferDelay( >+ const absl::optional<DelayEstimate>& current_delay, >+ int delay_headroom_blocks, >+ int hysteresis_limit_1_blocks, >+ int hysteresis_limit_2_blocks, >+ DelayEstimate estimated_delay) { >+ // The below division is not exact and the truncation is intended. >+ const int echo_path_delay_blocks = estimated_delay.delay >> kBlockSizeLog2; >+ >+ // Compute the buffer delay increase required to achieve the desired latency. >+ size_t new_delay_blocks = >+ std::max(echo_path_delay_blocks - delay_headroom_blocks, 0); >+ >+ // Add hysteresis. >+ if (current_delay) { >+ size_t current_delay_blocks = current_delay->delay; >+ if (new_delay_blocks > current_delay_blocks) { >+ if (new_delay_blocks <= >+ current_delay_blocks + hysteresis_limit_1_blocks) { >+ new_delay_blocks = current_delay_blocks; >+ } >+ } else if (new_delay_blocks < current_delay_blocks) { >+ size_t hysteresis_limit = std::max( >+ static_cast<int>(current_delay_blocks) - hysteresis_limit_2_blocks, >+ 0); >+ if (new_delay_blocks >= hysteresis_limit) { >+ new_delay_blocks = current_delay_blocks; >+ } >+ } >+ } >+ >+ DelayEstimate new_delay = estimated_delay; >+ new_delay.delay = new_delay_blocks; >+ return new_delay; >+} >+ >+int RenderDelayControllerImpl2::instance_count_ = 0; >+ >+RenderDelayControllerImpl2::RenderDelayControllerImpl2( >+ const EchoCanceller3Config& config, >+ int sample_rate_hz) >+ : data_dumper_( >+ new ApmDataDumper(rtc::AtomicOps::Increment(&instance_count_))), >+ use_early_delay_detection_(UseEarlyDelayDetection()), >+ delay_headroom_blocks_( >+ static_cast<int>(config.delay.delay_headroom_blocks)), >+ hysteresis_limit_1_blocks_( >+ static_cast<int>(config.delay.hysteresis_limit_1_blocks)), >+ hysteresis_limit_2_blocks_( >+ 
static_cast<int>(config.delay.hysteresis_limit_2_blocks)), >+ delay_estimator_(data_dumper_.get(), config), >+ last_delay_estimate_quality_(DelayEstimate::Quality::kCoarse) { >+ RTC_DCHECK(ValidFullBandRate(sample_rate_hz)); >+ delay_estimator_.LogDelayEstimationProperties(sample_rate_hz, 0); >+} >+ >+RenderDelayControllerImpl2::~RenderDelayControllerImpl2() = default; >+ >+void RenderDelayControllerImpl2::Reset(bool reset_delay_confidence) { >+ delay_ = absl::nullopt; >+ delay_samples_ = absl::nullopt; >+ delay_estimator_.Reset(reset_delay_confidence); >+ delay_change_counter_ = 0; >+ if (reset_delay_confidence) { >+ last_delay_estimate_quality_ = DelayEstimate::Quality::kCoarse; >+ } >+} >+ >+void RenderDelayControllerImpl2::LogRenderCall() {} >+ >+absl::optional<DelayEstimate> RenderDelayControllerImpl2::GetDelay( >+ const DownsampledRenderBuffer& render_buffer, >+ size_t render_delay_buffer_delay, >+ const absl::optional<int>& echo_remover_delay, >+ rtc::ArrayView<const float> capture) { >+ RTC_DCHECK_EQ(kBlockSize, capture.size()); >+ ++capture_call_counter_; >+ >+ auto delay_samples = delay_estimator_.EstimateDelay(render_buffer, capture); >+ >+ // Overrule the delay estimator delay if the echo remover reports a delay. >+ if (echo_remover_delay) { >+ int total_echo_remover_delay_samples = >+ (render_delay_buffer_delay + *echo_remover_delay) * kBlockSize; >+ delay_samples = DelayEstimate(DelayEstimate::Quality::kRefined, >+ total_echo_remover_delay_samples); >+ } >+ >+ if (delay_samples) { >+ if (!delay_samples_ || delay_samples->delay != delay_samples_->delay) { >+ delay_change_counter_ = 0; >+ } >+ if (delay_samples_) { >+ delay_samples_->blocks_since_last_change = >+ delay_samples_->delay == delay_samples->delay >+ ? 
delay_samples_->blocks_since_last_change + 1 >+ : 0; >+ delay_samples_->blocks_since_last_update = 0; >+ delay_samples_->delay = delay_samples->delay; >+ delay_samples_->quality = delay_samples->quality; >+ } else { >+ delay_samples_ = delay_samples; >+ } >+ } else { >+ if (delay_samples_) { >+ ++delay_samples_->blocks_since_last_change; >+ ++delay_samples_->blocks_since_last_update; >+ } >+ } >+ >+ if (delay_change_counter_ < 2 * kNumBlocksPerSecond) { >+ ++delay_change_counter_; >+ } >+ >+ if (delay_samples_) { >+ // Compute the render delay buffer delay. >+ const bool use_hysteresis = >+ last_delay_estimate_quality_ == DelayEstimate::Quality::kRefined && >+ delay_samples_->quality == DelayEstimate::Quality::kRefined; >+ delay_ = ComputeBufferDelay(delay_, delay_headroom_blocks_, >+ use_hysteresis ? hysteresis_limit_1_blocks_ : 0, >+ use_hysteresis ? hysteresis_limit_2_blocks_ : 0, >+ *delay_samples_); >+ last_delay_estimate_quality_ = delay_samples_->quality; >+ } >+ >+ metrics_.Update(delay_samples_ ? absl::optional<size_t>(delay_samples_->delay) >+ : absl::nullopt, >+ delay_ ? delay_->delay : 0, 0, delay_estimator_.Clockdrift()); >+ >+ data_dumper_->DumpRaw("aec3_render_delay_controller_delay", >+ delay_samples ? delay_samples->delay : 0); >+ data_dumper_->DumpRaw("aec3_render_delay_controller_buffer_delay", >+ delay_ ? 
delay_->delay : 0); >+ >+ return delay_; >+} >+ >+bool RenderDelayControllerImpl2::HasClockdrift() const { >+ return delay_estimator_.Clockdrift() != ClockdriftDetector::Level::kNone; >+} >+ >+} // namespace >+ >+RenderDelayController* RenderDelayController::Create2( >+ const EchoCanceller3Config& config, >+ int sample_rate_hz) { >+ return new RenderDelayControllerImpl2(config, sample_rate_hz); >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller_metrics.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller_metrics.cc >index 09db33968340e17567053764337cb9fe24c5d646..582e03348297ec39656f76953a2b4f35e6ae8035 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller_metrics.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller_metrics.cc >@@ -13,6 +13,7 @@ > #include <algorithm> > > #include "modules/audio_processing/aec3/aec3_common.h" >+#include "rtc_base/checks.h" > #include "system_wrappers/include/metrics.h" > > namespace webrtc { >@@ -45,7 +46,8 @@ RenderDelayControllerMetrics::RenderDelayControllerMetrics() = default; > void RenderDelayControllerMetrics::Update( > absl::optional<size_t> delay_samples, > size_t buffer_delay_blocks, >- absl::optional<int> skew_shift_blocks) { >+ absl::optional<int> skew_shift_blocks, >+ ClockdriftDetector::Level clockdrift) { > ++call_counter_; > > if (!initial_update) { >@@ -114,6 +116,10 @@ void RenderDelayControllerMetrics::Update( > static_cast<int>(delay_changes), > static_cast<int>(DelayChangesCategory::kNumCategories)); > >+ RTC_HISTOGRAM_ENUMERATION( >+ "WebRTC.Audio.EchoCanceller.Clockdrift", static_cast<int>(clockdrift), >+ static_cast<int>(ClockdriftDetector::Level::kNumCategories)); >+ > metrics_reported_ = true; > call_counter_ = 0; > ResetMetrics(); >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller_metrics.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller_metrics.h
>index 1cfe899b4885cb304b27aff5971eca9627057e2f..22cc202e73a52d316b128ea482bd6dddc3fb0066 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller_metrics.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller_metrics.h
>@@ -11,7 +11,10 @@
> #ifndef MODULES_AUDIO_PROCESSING_AEC3_RENDER_DELAY_CONTROLLER_METRICS_H_
> #define MODULES_AUDIO_PROCESSING_AEC3_RENDER_DELAY_CONTROLLER_METRICS_H_
> 
>+#include <stddef.h>
>+
> #include "absl/types/optional.h"
>+#include "modules/audio_processing/aec3/clockdrift_detector.h"
> #include "rtc_base/constructormagic.h"
> 
> namespace webrtc {
>@@ -24,7 +27,8 @@ class RenderDelayControllerMetrics {
>   // Updates the metric with new data.
>   void Update(absl::optional<size_t> delay_samples,
>               size_t buffer_delay_blocks,
>-              absl::optional<int> skew_shift_blocks);
>+              absl::optional<int> skew_shift_blocks,
>+              ClockdriftDetector::Level clockdrift);
> 
>   // Returns true if the metrics have just been reported, otherwise false.
> bool MetricsReported() { return metrics_reported_; } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller_metrics_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller_metrics_unittest.cc >index e867de4df428280ddc6c1d4fee49040b13001d08..216b0e220d4b265777bd34bca920fd956bcbb3bd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller_metrics_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller_metrics_unittest.cc >@@ -22,10 +22,12 @@ TEST(RenderDelayControllerMetrics, NormalUsage) { > > for (int j = 0; j < 3; ++j) { > for (int k = 0; k < kMetricsReportingIntervalBlocks - 1; ++k) { >- metrics.Update(absl::nullopt, 0, absl::nullopt); >+ metrics.Update(absl::nullopt, 0, absl::nullopt, >+ ClockdriftDetector::Level::kNone); > EXPECT_FALSE(metrics.MetricsReported()); > } >- metrics.Update(absl::nullopt, 0, absl::nullopt); >+ metrics.Update(absl::nullopt, 0, absl::nullopt, >+ ClockdriftDetector::Level::kNone); > EXPECT_TRUE(metrics.MetricsReported()); > } > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller_unittest.cc >index 93c64998e9bd40833584d937f26db112efbae507..e9f02d324c13d34ed4dc2a444e08ae33bc70d674 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_delay_controller_unittest.cc >@@ -57,10 +57,9 @@ TEST(RenderDelayController, NoRenderSignal) { > for (auto rate : {8000, 16000, 32000, 48000}) { > SCOPED_TRACE(ProduceDebugText(rate)); > std::unique_ptr<RenderDelayBuffer> delay_buffer( >- RenderDelayBuffer::Create(config, 
NumBandsForRate(rate))); >+ RenderDelayBuffer::Create2(config, NumBandsForRate(rate))); > std::unique_ptr<RenderDelayController> delay_controller( >- RenderDelayController::Create( >- config, RenderDelayBuffer::DelayEstimatorOffset(config), rate)); >+ RenderDelayController::Create2(config, rate)); > for (size_t k = 0; k < 100; ++k) { > auto delay = delay_controller->GetDelay( > delay_buffer->GetDownsampledRenderBuffer(), delay_buffer->Delay(), >@@ -87,11 +86,9 @@ TEST(RenderDelayController, BasicApiCalls) { > std::vector<std::vector<float>> render_block( > NumBandsForRate(rate), std::vector<float>(kBlockSize, 0.f)); > std::unique_ptr<RenderDelayBuffer> render_delay_buffer( >- RenderDelayBuffer::Create(config, NumBandsForRate(rate))); >+ RenderDelayBuffer::Create2(config, NumBandsForRate(rate))); > std::unique_ptr<RenderDelayController> delay_controller( >- RenderDelayController::Create( >- EchoCanceller3Config(), >- RenderDelayBuffer::DelayEstimatorOffset(config), rate)); >+ RenderDelayController::Create2(EchoCanceller3Config(), rate)); > for (size_t k = 0; k < 10; ++k) { > render_delay_buffer->Insert(render_block); > render_delay_buffer->PrepareCaptureProcessing(); >@@ -128,11 +125,9 @@ TEST(RenderDelayController, Alignment) { > absl::optional<DelayEstimate> delay_blocks; > SCOPED_TRACE(ProduceDebugText(rate, delay_samples)); > std::unique_ptr<RenderDelayBuffer> render_delay_buffer( >- RenderDelayBuffer::Create(config, NumBandsForRate(rate))); >+ RenderDelayBuffer::Create2(config, NumBandsForRate(rate))); > std::unique_ptr<RenderDelayController> delay_controller( >- RenderDelayController::Create( >- config, RenderDelayBuffer::DelayEstimatorOffset(config), >- rate)); >+ RenderDelayController::Create2(config, rate)); > DelayBuffer<float> signal_delay_buffer(delay_samples); > for (size_t k = 0; k < (400 + delay_samples / kBlockSize); ++k) { > RandomizeSampleVector(&random_generator, render_block[0]); >@@ -179,11 +174,9 @@ TEST(RenderDelayController, 
NonCausalAlignment) { > absl::optional<DelayEstimate> delay_blocks; > SCOPED_TRACE(ProduceDebugText(rate, -delay_samples)); > std::unique_ptr<RenderDelayBuffer> render_delay_buffer( >- RenderDelayBuffer::Create(config, NumBandsForRate(rate))); >+ RenderDelayBuffer::Create2(config, NumBandsForRate(rate))); > std::unique_ptr<RenderDelayController> delay_controller( >- RenderDelayController::Create( >- EchoCanceller3Config(), >- RenderDelayBuffer::DelayEstimatorOffset(config), rate)); >+ RenderDelayController::Create2(EchoCanceller3Config(), rate)); > DelayBuffer<float> signal_delay_buffer(-delay_samples); > for (int k = 0; > k < (400 - delay_samples / static_cast<int>(kBlockSize)); ++k) { >@@ -223,11 +216,9 @@ TEST(RenderDelayController, AlignmentWithJitter) { > absl::optional<DelayEstimate> delay_blocks; > SCOPED_TRACE(ProduceDebugText(rate, delay_samples)); > std::unique_ptr<RenderDelayBuffer> render_delay_buffer( >- RenderDelayBuffer::Create(config, NumBandsForRate(rate))); >+ RenderDelayBuffer::Create2(config, NumBandsForRate(rate))); > std::unique_ptr<RenderDelayController> delay_controller( >- RenderDelayController::Create( >- config, RenderDelayBuffer::DelayEstimatorOffset(config), >- rate)); >+ RenderDelayController::Create2(config, rate)); > DelayBuffer<float> signal_delay_buffer(delay_samples); > for (size_t j = 0; j < (1000 + delay_samples / kBlockSize) / > config.delay.api_call_jitter_blocks + >@@ -280,11 +271,10 @@ TEST(RenderDelayController, InitialHeadroom) { > for (auto rate : {8000, 16000, 32000, 48000}) { > SCOPED_TRACE(ProduceDebugText(rate)); > std::unique_ptr<RenderDelayBuffer> render_delay_buffer( >- RenderDelayBuffer::Create(config, NumBandsForRate(rate))); >+ RenderDelayBuffer::Create2(config, NumBandsForRate(rate))); > > std::unique_ptr<RenderDelayController> delay_controller( >- RenderDelayController::Create( >- config, RenderDelayBuffer::DelayEstimatorOffset(config), rate)); >+ RenderDelayController::Create2(config, rate)); > } > } > } >@@ 
-300,12 +290,10 @@ TEST(RenderDelayController, WrongCaptureSize) { > for (auto rate : {8000, 16000, 32000, 48000}) { > SCOPED_TRACE(ProduceDebugText(rate)); > std::unique_ptr<RenderDelayBuffer> render_delay_buffer( >- RenderDelayBuffer::Create(config, NumBandsForRate(rate))); >+ RenderDelayBuffer::Create2(config, NumBandsForRate(rate))); > EXPECT_DEATH( > std::unique_ptr<RenderDelayController>( >- RenderDelayController::Create( >- EchoCanceller3Config(), >- RenderDelayBuffer::DelayEstimatorOffset(config), rate)) >+ RenderDelayController::Create2(EchoCanceller3Config(), rate)) > ->GetDelay(render_delay_buffer->GetDownsampledRenderBuffer(), > render_delay_buffer->Delay(), echo_remover_delay, block), > ""); >@@ -320,11 +308,10 @@ TEST(RenderDelayController, DISABLED_WrongSampleRate) { > SCOPED_TRACE(ProduceDebugText(rate)); > EchoCanceller3Config config; > std::unique_ptr<RenderDelayBuffer> render_delay_buffer( >- RenderDelayBuffer::Create(config, NumBandsForRate(rate))); >+ RenderDelayBuffer::Create2(config, NumBandsForRate(rate))); > EXPECT_DEATH( >- std::unique_ptr<RenderDelayController>(RenderDelayController::Create( >- EchoCanceller3Config(), >- RenderDelayBuffer::DelayEstimatorOffset(config), rate)), >+ std::unique_ptr<RenderDelayController>( >+ RenderDelayController::Create2(EchoCanceller3Config(), rate)), > ""); > } > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_reverb_model.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_reverb_model.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..8ad54c0cc345f65d56df2e7f04fc399dfc56c32f >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_reverb_model.cc >@@ -0,0 +1,44 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. 
>+ *
>+ * Use of this source code is governed by a BSD-style license
>+ * that can be found in the LICENSE file in the root of the source
>+ * tree. An additional intellectual property rights grant can be found
>+ * in the file PATENTS. All contributing project authors may
>+ * be found in the AUTHORS file in the root of the source tree.
>+ */
>+
>+#include "modules/audio_processing/aec3/render_reverb_model.h"
>+
>+#include <algorithm>
>+
>+#include "api/array_view.h"
>+#include "rtc_base/checks.h"
>+
>+namespace webrtc {
>+
>+RenderReverbModel::RenderReverbModel() {
>+  Reset();
>+}
>+
>+RenderReverbModel::~RenderReverbModel() = default;
>+
>+void RenderReverbModel::Reset() {
>+  render_reverb_.Reset();
>+}
>+
>+void RenderReverbModel::Apply(const VectorBuffer& spectrum_buffer,
>+                              int delay_blocks,
>+                              float reverb_decay,
>+                              rtc::ArrayView<float> reverb_power_spectrum) {
>+  int idx_at_delay =
>+      spectrum_buffer.OffsetIndex(spectrum_buffer.read, delay_blocks);
>+  int idx_past = spectrum_buffer.IncIndex(idx_at_delay);
>+  const auto& X2 = spectrum_buffer.buffer[idx_at_delay];
>+  RTC_DCHECK_EQ(X2.size(), reverb_power_spectrum.size());
>+  std::copy(X2.begin(), X2.end(), reverb_power_spectrum.begin());
>+  render_reverb_.AddReverbNoFreqShaping(spectrum_buffer.buffer[idx_past], 1.0f,
>+                                        reverb_decay, reverb_power_spectrum);
>+}
>+
>+}  // namespace webrtc
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_reverb_model.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_reverb_model.h
>new file mode 100644
>index 0000000000000000000000000000000000000000..d404a69ecd2f20c075979a89e54ce7c315449124
>--- /dev/null
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_reverb_model.h
>@@ -0,0 +1,49 @@
>+/*
>+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved.
>+ *
>+ * Use of this source code is governed by a BSD-style license
>+ * that can be found in the LICENSE file in the root of the source
>+ * tree. An additional intellectual property rights grant can be found
>+ * in the file PATENTS. All contributing project authors may
>+ * be found in the AUTHORS file in the root of the source tree.
>+ */
>+
>+#ifndef MODULES_AUDIO_PROCESSING_AEC3_RENDER_REVERB_MODEL_H_
>+#define MODULES_AUDIO_PROCESSING_AEC3_RENDER_REVERB_MODEL_H_
>+
>+#include "api/array_view.h"
>+#include "modules/audio_processing/aec3/reverb_model.h"
>+#include "modules/audio_processing/aec3/vector_buffer.h"
>+
>+namespace webrtc {
>+
>+// The RenderReverbModel class applies an exponential reverberant model over the
>+// render spectrum.
>+class RenderReverbModel {
>+ public:
>+  RenderReverbModel();
>+  ~RenderReverbModel();
>+
>+  // Resets the state.
>+  void Reset();
>+
>+  // Applies the reverberation model over the render spectrum. It also returns
>+  // the reverberation render power spectrum in the array reverb_power_spectrum.
>+  void Apply(const VectorBuffer& spectrum_buffer,
>+             int delay_blocks,
>+             float reverb_decay,
>+             rtc::ArrayView<float> reverb_power_spectrum);
>+
>+  // Gets the reverberation spectrum that was added to the render spectrum for
>+  // computing the reverberation render spectrum.
>+  rtc::ArrayView<const float> GetReverbContributionPowerSpectrum() const {
>+    return render_reverb_.GetPowerSpectrum();
>+  }
>+
>+ private:
>+  ReverbModel render_reverb_;
>+};
>+
>+}  // namespace webrtc.
>+ >+#endif // MODULES_AUDIO_PROCESSING_AEC3_RENDER_REVERB_MODEL_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_signal_analyzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_signal_analyzer.cc >index 50c34ce43f61a502edf2813cc07b044068f4141c..33b04bf70c9ff80d3e903911e793121f3c99d55c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_signal_analyzer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_signal_analyzer.cc >@@ -12,7 +12,10 @@ > > #include <math.h> > #include <algorithm> >+#include <utility> >+#include <vector> > >+#include "api/array_view.h" > #include "rtc_base/checks.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_signal_analyzer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_signal_analyzer.h >index c603c921447b905cd0f42e5101389056683fbd6e..8a44232cf9a0d31c7fd5a2d6056b47e03ade3856 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_signal_analyzer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_signal_analyzer.h >@@ -11,13 +11,15 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_RENDER_SIGNAL_ANALYZER_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_RENDER_SIGNAL_ANALYZER_H_ > >+#include <algorithm> > #include <array> >-#include <memory> >+#include <cstddef> > > #include "absl/types/optional.h" > #include "api/audio/echo_canceller3_config.h" > #include "modules/audio_processing/aec3/aec3_common.h" > #include "modules/audio_processing/aec3/render_buffer.h" >+#include "rtc_base/checks.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_signal_analyzer_unittest.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_signal_analyzer_unittest.cc >index f9b195583875f02968900314a326fb662d28e9ac..a993f8f8acf89cd9b0235c38044571fc3259987c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_signal_analyzer_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/render_signal_analyzer_unittest.cc >@@ -59,7 +59,7 @@ TEST(RenderSignalAnalyzer, NoFalseDetectionOfNarrowBands) { > std::vector<std::vector<float>> x(3, std::vector<float>(kBlockSize, 0.f)); > std::array<float, kBlockSize> x_old; > std::unique_ptr<RenderDelayBuffer> render_delay_buffer( >- RenderDelayBuffer::Create(EchoCanceller3Config(), 3)); >+ RenderDelayBuffer::Create2(EchoCanceller3Config(), 3)); > std::array<float, kFftLengthBy2Plus1> mask; > x_old.fill(0.f); > >@@ -93,7 +93,7 @@ TEST(RenderSignalAnalyzer, NarrowBandDetection) { > EchoCanceller3Config config; > config.delay.min_echo_path_delay_blocks = 0; > std::unique_ptr<RenderDelayBuffer> render_delay_buffer( >- RenderDelayBuffer::Create(config, 3)); >+ RenderDelayBuffer::Create2(config, 3)); > > std::array<float, kFftLengthBy2Plus1> mask; > x_old.fill(0.f); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/residual_echo_estimator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/residual_echo_estimator.cc >index 2e3ad9fea727e29ea4cfc1706f293dff678d809a..627dd900c4e8950e0dc687bed54d3c5d6d2dbb4a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/residual_echo_estimator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/residual_echo_estimator.cc >@@ -11,9 +11,11 @@ > > #include "modules/audio_processing/aec3/residual_echo_estimator.h" > >-#include <numeric> >+#include <stddef.h> >+#include <algorithm> > #include <vector> > >+#include "api/array_view.h" > #include 
"modules/audio_processing/aec3/reverb_model.h" > #include "modules/audio_processing/aec3/reverb_model_fallback.h" > #include "rtc_base/checks.h" >@@ -105,9 +107,15 @@ void ResidualEchoEstimator::Estimate( > > // Estimate the residual echo power. > if (aec_state.UsableLinearEstimate()) { >- RTC_DCHECK(!aec_state.SaturatedEcho()); > LinearEstimate(S2_linear, aec_state.Erle(), aec_state.ErleUncertainty(), > R2); >+ >+ // When there is saturated echo, assume the same spectral content as is >+ // present in the microphone signal. >+ if (aec_state.SaturatedEcho()) { >+ std::copy(Y2.begin(), Y2.end(), R2->begin()); >+ } >+ > // Adds the estimated unmodelled echo power to the residual echo power > // estimate. > if (echo_reverb_) { >@@ -151,10 +159,10 @@ void ResidualEchoEstimator::Estimate( > } > NonLinearEstimate(echo_path_gain, X2, Y2, R2); > >- // If the echo is saturated, estimate the echo power as the maximum echo >- // power with a leakage factor. >+ // When there is saturated echo, assume the same spectral content as is >+ // present in the microphone signal. 
> if (aec_state.SaturatedEcho()) { >- R2->fill((*std::max_element(R2->begin(), R2->end())) * 100.f); >+ std::copy(Y2.begin(), Y2.end(), R2->begin()); > } > > if (!(aec_state.TransparentMode() && soft_transparent_mode_)) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/residual_echo_estimator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/residual_echo_estimator.h >index 6dcf24f090790b724e8e2e4742cf33ca14213cae..52885a58d85a3382422cdf9ce22ef203ba129fbb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/residual_echo_estimator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/residual_echo_estimator.h >@@ -11,18 +11,18 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_RESIDUAL_ECHO_ESTIMATOR_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_RESIDUAL_ECHO_ESTIMATOR_H_ > >-#include <algorithm> > #include <array> > #include <memory> >-#include <vector> > >-#include "api/array_view.h" >+#include "absl/types/optional.h" > #include "api/audio/echo_canceller3_config.h" > #include "modules/audio_processing/aec3/aec3_common.h" > #include "modules/audio_processing/aec3/aec_state.h" > #include "modules/audio_processing/aec3/render_buffer.h" > #include "modules/audio_processing/aec3/reverb_model.h" > #include "modules/audio_processing/aec3/reverb_model_fallback.h" >+#include "modules/audio_processing/aec3/vector_buffer.h" >+#include "rtc_base/checks.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { >@@ -39,7 +39,7 @@ class ResidualEchoEstimator { > std::array<float, kFftLengthBy2Plus1>* R2); > > // Returns the reverberant power spectrum contributions to the echo residual. 
>- const std::array<float, kFftLengthBy2Plus1>& GetReverbPowerSpectrum() const { >+ rtc::ArrayView<const float> GetReverbPowerSpectrum() const { > if (echo_reverb_) { > return echo_reverb_->GetPowerSpectrum(); > } else { >@@ -66,7 +66,6 @@ class ResidualEchoEstimator { > const std::array<float, kFftLengthBy2Plus1>& Y2, > std::array<float, kFftLengthBy2Plus1>* R2); > >- > // Estimates the echo generating signal power as gated maximal power over a > // time window. > void EchoGeneratingPower(const VectorBuffer& spectrum_buffer, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/residual_echo_estimator_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/residual_echo_estimator_unittest.cc >index 832d8ca1d33b6576489ae21b81025a9b663c3207..2e73a7e6116101fcc6bcca17f2c7081a511c9af1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/residual_echo_estimator_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/residual_echo_estimator_unittest.cc >@@ -27,7 +27,7 @@ TEST(ResidualEchoEstimator, NullResidualEchoPowerOutput) { > EchoCanceller3Config config; > AecState aec_state(config); > std::unique_ptr<RenderDelayBuffer> render_delay_buffer( >- RenderDelayBuffer::Create(config, 3)); >+ RenderDelayBuffer::Create2(config, 3)); > std::vector<std::array<float, kFftLengthBy2Plus1>> H2; > std::array<float, kFftLengthBy2Plus1> S2_linear; > std::array<float, kFftLengthBy2Plus1> Y2; >@@ -48,7 +48,7 @@ TEST(ResidualEchoEstimator, DISABLED_BasicTest) { > ResidualEchoEstimator estimator(config); > AecState aec_state(config); > std::unique_ptr<RenderDelayBuffer> render_delay_buffer( >- RenderDelayBuffer::Create(config, 3)); >+ RenderDelayBuffer::Create2(config, 3)); > > std::array<float, kFftLengthBy2Plus1> E2_main; > std::array<float, kFftLengthBy2Plus1> E2_shadow; >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_decay_estimator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_decay_estimator.cc >index f80afa2dfcba0963fe19c209d3203d177c0c8c59..95fd13af596269ae151afeaf874146a173c3a17f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_decay_estimator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_decay_estimator.cc >@@ -10,11 +10,13 @@ > > #include "modules/audio_processing/aec3/reverb_decay_estimator.h" > >+#include <stddef.h> > #include <algorithm> > #include <cmath> > #include <numeric> > > #include "api/array_view.h" >+#include "api/audio/echo_canceller3_config.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" > #include "rtc_base/checks.h" > #include "system_wrappers/include/field_trial.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_decay_estimator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_decay_estimator.h >index 67a84ab8fcbf533d9c7af0da8e9928883a60a6d6..4c8d0c643462feae77eda9a45e5e1a175e061628 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_decay_estimator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_decay_estimator.h >@@ -16,12 +16,12 @@ > > #include "absl/types/optional.h" > #include "api/array_view.h" >-#include "api/audio/echo_canceller3_config.h" >-#include "modules/audio_processing/aec3/aec3_common.h" >+#include "modules/audio_processing/aec3/aec3_common.h" // kMaxAdaptiveFilter... > > namespace webrtc { > > class ApmDataDumper; >+struct EchoCanceller3Config; > > // Class for estimating the decay of the late reverb. 
> class ReverbDecayEstimator { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_frequency_response.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_frequency_response.cc >index 0d82515ec4b25688d7354be132a82208a965dc6b..d2103d442e4b45ff84e16a76d604c8a04649b65e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_frequency_response.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_frequency_response.cc >@@ -10,14 +10,12 @@ > > #include "modules/audio_processing/aec3/reverb_frequency_response.h" > >+#include <stddef.h> > #include <algorithm> > #include <array> >-#include <cmath> >-#include <memory> > #include <numeric> > > #include "api/array_view.h" >-#include "api/audio/echo_canceller3_config.h" > #include "modules/audio_processing/aec3/aec3_common.h" > #include "rtc_base/checks.h" > #include "system_wrappers/include/field_trial.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_frequency_response.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_frequency_response.h >index 23485e009fbe30eb6d327361be6a14073081b02c..eb63b8e52e854bcca9f184eefa6518739cd8b8a1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_frequency_response.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_frequency_response.h >@@ -11,13 +11,12 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_REVERB_FREQUENCY_RESPONSE_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_REVERB_FREQUENCY_RESPONSE_H_ > >-#include <memory> >+#include <array> > #include <vector> > > #include "absl/types/optional.h" > #include "api/array_view.h" > #include "modules/audio_processing/aec3/aec3_common.h" >-#include "modules/audio_processing/logging/apm_data_dumper.h" > > namespace webrtc { > >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_model.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_model.cc >index 0ca248fc75431ea63377c24ed693c6bec6b907e6..f0a24c0249823485d997361b7566db1926363444 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_model.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_model.cc >@@ -10,13 +10,11 @@ > > #include "modules/audio_processing/aec3/reverb_model.h" > >-#include <math.h> >- >+#include <stddef.h> > #include <algorithm> > #include <functional> > > #include "api/array_view.h" >-#include "modules/audio_processing/aec3/aec3_common.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_model.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_model.h >index d3087a72d91023a53d7ca59e6e5fd054929738d5..56e2266e569e7447a1fb0fc2f6279820d997034a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_model.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_model.h >@@ -11,6 +11,8 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_REVERB_MODEL_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_REVERB_MODEL_H_ > >+#include <array> >+ > #include "api/array_view.h" > #include "modules/audio_processing/aec3/aec3_common.h" > >@@ -50,9 +52,7 @@ class ReverbModel { > float reverb_decay); > > // Returns the current power spectrum reverberation contributions. >- const std::array<float, kFftLengthBy2Plus1>& GetPowerSpectrum() const { >- return reverb_; >- } >+ rtc::ArrayView<const float> GetPowerSpectrum() const { return reverb_; } > > private: > // Updates the reverberation contributions. 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_model_estimator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_model_estimator.h >index b6a3591e091970b90c8ce012c01d7ca34967ca0c..1112f93a719354baafc3c98844cc0b941a5acf10 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_model_estimator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_model_estimator.h >@@ -11,11 +11,13 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_REVERB_MODEL_ESTIMATOR_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_REVERB_MODEL_ESTIMATOR_H_ > >+#include <array> > #include <vector> > > #include "absl/types/optional.h" > #include "api/array_view.h" > #include "api/audio/echo_canceller3_config.h" >+#include "modules/audio_processing/aec3/aec3_common.h" // kFftLengthBy2Plus1 > #include "modules/audio_processing/aec3/reverb_decay_estimator.h" > #include "modules/audio_processing/aec3/reverb_frequency_response.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_model_fallback.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_model_fallback.h >index 1b2a953ea6417c2bee806c637de6f12e473b5451..1bd2b594e06aac0ba509d6ecf5192501c8c6f1d0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_model_fallback.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/reverb_model_fallback.h >@@ -11,6 +11,7 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_REVERB_MODEL_FALLBACK_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_REVERB_MODEL_FALLBACK_H_ > >+#include <stddef.h> > #include <array> > #include <vector> > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/shadow_filter_update_gain.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/shadow_filter_update_gain.h >index 
a92bc3b8b7fa017a88a3eb94e0b75d2b26bbb8e6..05e632fa78b87313e2ef5d1db7fd774c5a167056 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/shadow_filter_update_gain.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/shadow_filter_update_gain.h >@@ -11,11 +11,13 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_SHADOW_FILTER_UPDATE_GAIN_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_SHADOW_FILTER_UPDATE_GAIN_H_ > >+#include <stddef.h> >+#include <array> >+ > #include "api/audio/echo_canceller3_config.h" > #include "modules/audio_processing/aec3/aec3_common.h" >-#include "modules/audio_processing/aec3/render_buffer.h" >+#include "modules/audio_processing/aec3/fft_data.h" > #include "modules/audio_processing/aec3/render_signal_analyzer.h" >-#include "rtc_base/constructormagic.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/shadow_filter_update_gain_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/shadow_filter_update_gain_unittest.cc >index c040bbff91f2b7bbbefa58d5412cf300004fe2de..017c679588b1b725de763860e609cdbabdc6c0c0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/shadow_filter_update_gain_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/shadow_filter_update_gain_unittest.cc >@@ -53,7 +53,7 @@ void RunFilterUpdateTest(int num_blocks_to_process, > config.delay.min_echo_path_delay_blocks = 0; > config.delay.default_delay = 1; > std::unique_ptr<RenderDelayBuffer> render_delay_buffer( >- RenderDelayBuffer::Create(config, 3)); >+ RenderDelayBuffer::Create2(config, 3)); > > std::array<float, kBlockSize> x_old; > x_old.fill(0.f); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/signal_dependent_erle_estimator.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/signal_dependent_erle_estimator.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..32b36ab215a5471c1cb90d390515af738d9650c9 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/signal_dependent_erle_estimator.cc >@@ -0,0 +1,368 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "modules/audio_processing/aec3/signal_dependent_erle_estimator.h" >+ >+#include <algorithm> >+#include <functional> >+#include <numeric> >+ >+#include "modules/audio_processing/aec3/vector_buffer.h" >+#include "rtc_base/numerics/safe_minmax.h" >+ >+namespace webrtc { >+ >+namespace { >+ >+constexpr std::array<size_t, SignalDependentErleEstimator::kSubbands + 1> >+ kBandBoundaries = {1, 8, 16, 24, 32, 48, kFftLengthBy2Plus1}; >+ >+std::array<size_t, kFftLengthBy2Plus1> FormSubbandMap() { >+ std::array<size_t, kFftLengthBy2Plus1> map_band_to_subband; >+ size_t subband = 1; >+ for (size_t k = 0; k < map_band_to_subband.size(); ++k) { >+ RTC_DCHECK_LT(subband, kBandBoundaries.size()); >+ if (k >= kBandBoundaries[subband]) { >+ subband++; >+ RTC_DCHECK_LT(k, kBandBoundaries[subband]); >+ } >+ map_band_to_subband[k] = subband - 1; >+ } >+ return map_band_to_subband; >+} >+ >+// Defines the size in blocks of the sections that are used for dividing the >+// linear filter. 
The sections are split in a non-linear manner so that lower >+// sections that typically represent the direct path have a larger resolution >+// than the higher sections which typically represent more reverberant acoustic >+// paths. >+std::vector<size_t> DefineFilterSectionSizes(size_t delay_headroom_blocks, >+ size_t num_blocks, >+ size_t num_sections) { >+ size_t filter_length_blocks = num_blocks - delay_headroom_blocks; >+ std::vector<size_t> section_sizes(num_sections); >+ size_t remaining_blocks = filter_length_blocks; >+ size_t remaining_sections = num_sections; >+ size_t estimator_size = 2; >+ size_t idx = 0; >+ while (remaining_sections > 1 && >+ remaining_blocks > estimator_size * remaining_sections) { >+ RTC_DCHECK_LT(idx, section_sizes.size()); >+ section_sizes[idx] = estimator_size; >+ remaining_blocks -= estimator_size; >+ remaining_sections--; >+ estimator_size *= 2; >+ idx++; >+ } >+ >+ size_t last_groups_size = remaining_blocks / remaining_sections; >+ for (; idx < num_sections; idx++) { >+ section_sizes[idx] = last_groups_size; >+ } >+ section_sizes[num_sections - 1] += >+ remaining_blocks - last_groups_size * remaining_sections; >+ return section_sizes; >+} >+ >+// Forms the limits in blocks for each filter section. Those sections >+// are used for analyzing the echo estimates and investigating which >+// linear filter sections contribute most to the echo estimate energy. 
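The non-linear sizing described in the comment above (early sections that double in size for the direct path, with the remainder split evenly across the later, more reverberant sections) can be sketched as follows. This is an illustrative Python model of the `DefineFilterSectionSizes` logic in this hunk, not code from the patch; the function name and example numbers are mine.

```python
# Illustrative sketch of the non-linear filter-section sizing: early
# sections double in size (2, 4, 8, ...) while enough blocks remain,
# then the leftover blocks are split evenly, with any remainder added
# to the last (most reverberant) section.
def define_filter_section_sizes(delay_headroom_blocks, num_blocks, num_sections):
    remaining_blocks = num_blocks - delay_headroom_blocks
    remaining_sections = num_sections
    estimator_size = 2
    sizes = []
    while (remaining_sections > 1
           and remaining_blocks > estimator_size * remaining_sections):
        sizes.append(estimator_size)
        remaining_blocks -= estimator_size
        remaining_sections -= 1
        estimator_size *= 2
    last = remaining_blocks // remaining_sections
    sizes.extend([last] * remaining_sections)
    # Leftover blocks (from integer division) go to the final section.
    sizes[-1] += remaining_blocks - last * remaining_sections
    return sizes

# Hypothetical example: 20-block filter, 2 blocks of delay headroom,
# 4 sections -> [2, 4, 6, 6], which sums to the 18 usable blocks.
print(define_filter_section_sizes(2, 20, 4))
```

Note how the total always equals `num_blocks - delay_headroom_blocks`, matching the invariant the C++ code relies on when it later converts the sizes into section boundaries.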
>+std::vector<size_t> SetSectionsBoundaries(size_t delay_headroom_blocks, >+ size_t num_blocks, >+ size_t num_sections) { >+ std::vector<size_t> estimator_boundaries_blocks(num_sections + 1); >+ if (estimator_boundaries_blocks.size() == 2) { >+ estimator_boundaries_blocks[0] = 0; >+ estimator_boundaries_blocks[1] = num_blocks; >+ return estimator_boundaries_blocks; >+ } >+ RTC_DCHECK_GT(estimator_boundaries_blocks.size(), 2); >+ const std::vector<size_t> section_sizes = >+ DefineFilterSectionSizes(delay_headroom_blocks, num_blocks, >+ estimator_boundaries_blocks.size() - 1); >+ >+ size_t idx = 0; >+ size_t current_size_block = 0; >+ RTC_DCHECK_EQ(section_sizes.size() + 1, estimator_boundaries_blocks.size()); >+ estimator_boundaries_blocks[0] = delay_headroom_blocks; >+ for (size_t k = delay_headroom_blocks; k < num_blocks; ++k) { >+ current_size_block++; >+ if (current_size_block >= section_sizes[idx]) { >+ idx = idx + 1; >+ if (idx == section_sizes.size()) { >+ break; >+ } >+ estimator_boundaries_blocks[idx] = k + 1; >+ current_size_block = 0; >+ } >+ } >+ estimator_boundaries_blocks[section_sizes.size()] = num_blocks; >+ return estimator_boundaries_blocks; >+} >+ >+std::array<float, SignalDependentErleEstimator::kSubbands> >+SetMaxErleSubbands(float max_erle_l, float max_erle_h, size_t limit_subband_l) { >+ std::array<float, SignalDependentErleEstimator::kSubbands> max_erle; >+ std::fill(max_erle.begin(), max_erle.begin() + limit_subband_l, max_erle_l); >+ std::fill(max_erle.begin() + limit_subband_l, max_erle.end(), max_erle_h); >+ return max_erle; >+} >+ >+} // namespace >+ >+SignalDependentErleEstimator::SignalDependentErleEstimator( >+ const EchoCanceller3Config& config) >+ : min_erle_(config.erle.min), >+ num_sections_(config.erle.num_sections), >+ num_blocks_(config.filter.main.length_blocks), >+ delay_headroom_blocks_(config.delay.delay_headroom_blocks), >+ band_to_subband_(FormSubbandMap()), >+ max_erle_(SetMaxErleSubbands(config.erle.max_l, >+ 
config.erle.max_h, >+ band_to_subband_[kFftLengthBy2 / 2])), >+ section_boundaries_blocks_(SetSectionsBoundaries(delay_headroom_blocks_, >+ num_blocks_, >+ num_sections_)), >+ S2_section_accum_(num_sections_), >+ erle_estimators_(num_sections_), >+ correction_factors_(num_sections_) { >+ RTC_DCHECK_LE(num_sections_, num_blocks_); >+ RTC_DCHECK_GE(num_sections_, 1); >+ >+ Reset(); >+} >+ >+SignalDependentErleEstimator::~SignalDependentErleEstimator() = default; >+ >+void SignalDependentErleEstimator::Reset() { >+ erle_.fill(min_erle_); >+ for (auto& erle : erle_estimators_) { >+ erle.fill(min_erle_); >+ } >+ erle_ref_.fill(min_erle_); >+ for (auto& factor : correction_factors_) { >+ factor.fill(1.0f); >+ } >+ num_updates_.fill(0); >+} >+ >+// Updates the Erle estimate by analyzing the current input signals. It takes >+// the render buffer and the filter frequency response in order to do an >+// estimation of the number of sections of the linear filter that are needed >+// for getting the majority of the energy in the echo estimate. Based on that >+// number of sections, it updates the erle estimation by introducing a >+// correction factor to the erle that is given as an input to this method. >+void SignalDependentErleEstimator::Update( >+ const RenderBuffer& render_buffer, >+ const std::vector<std::array<float, kFftLengthBy2Plus1>>& >+ filter_frequency_response, >+ rtc::ArrayView<const float> X2, >+ rtc::ArrayView<const float> Y2, >+ rtc::ArrayView<const float> E2, >+ rtc::ArrayView<const float> average_erle, >+ bool converged_filter) { >+ RTC_DCHECK_GT(num_sections_, 1); >+ >+ // Gets the number of filter sections that are needed for achieving 90 % >+ // of the power spectrum energy of the echo estimate. 
>+ std::array<size_t, kFftLengthBy2Plus1> n_active_sections; >+ ComputeNumberOfActiveFilterSections(render_buffer, filter_frequency_response, >+ n_active_sections); >+ >+ if (converged_filter) { >+ // Updates the correction factor that is used for correcting the erle and >+ // adapt it to the particular characteristics of the input signal. >+ UpdateCorrectionFactors(X2, Y2, E2, n_active_sections); >+ } >+ >+ // Applies the correction factor to the input erle for getting a more refined >+ // erle estimation for the current input signal. >+ for (size_t k = 0; k < kFftLengthBy2; ++k) { >+ float correction_factor = >+ correction_factors_[n_active_sections[k]][band_to_subband_[k]]; >+ erle_[k] = rtc::SafeClamp(average_erle[k] * correction_factor, min_erle_, >+ max_erle_[band_to_subband_[k]]); >+ } >+} >+ >+void SignalDependentErleEstimator::Dump( >+ const std::unique_ptr<ApmDataDumper>& data_dumper) const { >+ for (auto& erle : erle_estimators_) { >+ data_dumper->DumpRaw("aec3_all_erle", erle); >+ } >+ data_dumper->DumpRaw("aec3_ref_erle", erle_ref_); >+ for (auto& factor : correction_factors_) { >+ data_dumper->DumpRaw("aec3_erle_correction_factor", factor); >+ } >+ data_dumper->DumpRaw("aec3_erle", erle_); >+} >+ >+// Estimates for each band the smallest number of sections in the filter that >+// together constitute 90% of the estimated echo energy. >+void SignalDependentErleEstimator::ComputeNumberOfActiveFilterSections( >+ const RenderBuffer& render_buffer, >+ const std::vector<std::array<float, kFftLengthBy2Plus1>>& >+ filter_frequency_response, >+ rtc::ArrayView<size_t> n_active_filter_sections) { >+ RTC_DCHECK_GT(num_sections_, 1); >+ // Computes an approximation of the power spectrum if the filter would have >+ // been limited to a certain number of filter sections. 
>+ ComputeEchoEstimatePerFilterSection(render_buffer, filter_frequency_response); >+ // For each band, computes the number of filter sections that are needed for >+ // achieving the 90 % energy in the echo estimate. >+ ComputeActiveFilterSections(n_active_filter_sections); >+} >+ >+void SignalDependentErleEstimator::UpdateCorrectionFactors( >+ rtc::ArrayView<const float> X2, >+ rtc::ArrayView<const float> Y2, >+ rtc::ArrayView<const float> E2, >+ rtc::ArrayView<const size_t> n_active_sections) { >+ constexpr float kX2BandEnergyThreshold = 44015068.0f; >+ constexpr float kSmthConstantDecreases = 0.1f; >+ constexpr float kSmthConstantIncreases = kSmthConstantDecreases / 2.f; >+ auto subband_powers = [](rtc::ArrayView<const float> power_spectrum, >+ rtc::ArrayView<float> power_spectrum_subbands) { >+ for (size_t subband = 0; subband < kSubbands; ++subband) { >+ RTC_DCHECK_LE(kBandBoundaries[subband + 1], power_spectrum.size()); >+ power_spectrum_subbands[subband] = std::accumulate( >+ power_spectrum.begin() + kBandBoundaries[subband], >+ power_spectrum.begin() + kBandBoundaries[subband + 1], 0.f); >+ } >+ }; >+ >+ std::array<float, kSubbands> X2_subbands, E2_subbands, Y2_subbands; >+ subband_powers(X2, X2_subbands); >+ subband_powers(E2, E2_subbands); >+ subband_powers(Y2, Y2_subbands); >+ std::array<size_t, kSubbands> idx_subbands; >+ for (size_t subband = 0; subband < kSubbands; ++subband) { >+ // When aggregating the number of active sections in the filter for >+ // different bands, we choose to take the minimum of all of them. As an >+ // example, if for one of the bands the direct path is the main >+ // contributor to the final echo estimate, we consider the direct path to >+ // be the main contributor for the subband that contains that particular >+ // band as well. That aggregate number of sections will later be used as >+ // the identifier of the erle estimator that needs to be updated. 
>+ RTC_DCHECK_LE(kBandBoundaries[subband + 1], n_active_sections.size()); >+ idx_subbands[subband] = *std::min_element( >+ n_active_sections.begin() + kBandBoundaries[subband], >+ n_active_sections.begin() + kBandBoundaries[subband + 1]); >+ } >+ >+ std::array<float, kSubbands> new_erle; >+ std::array<bool, kSubbands> is_erle_updated; >+ is_erle_updated.fill(false); >+ new_erle.fill(0.f); >+ for (size_t subband = 0; subband < kSubbands; ++subband) { >+ if (X2_subbands[subband] > kX2BandEnergyThreshold && >+ E2_subbands[subband] > 0) { >+ new_erle[subband] = Y2_subbands[subband] / E2_subbands[subband]; >+ RTC_DCHECK_GT(new_erle[subband], 0); >+ is_erle_updated[subband] = true; >+ ++num_updates_[subband]; >+ } >+ } >+ >+ for (size_t subband = 0; subband < kSubbands; ++subband) { >+ const size_t idx = idx_subbands[subband]; >+ RTC_DCHECK_LT(idx, erle_estimators_.size()); >+ float alpha = new_erle[subband] > erle_estimators_[idx][subband] >+ ? kSmthConstantIncreases >+ : kSmthConstantDecreases; >+ alpha = static_cast<float>(is_erle_updated[subband]) * alpha; >+ erle_estimators_[idx][subband] += >+ alpha * (new_erle[subband] - erle_estimators_[idx][subband]); >+ erle_estimators_[idx][subband] = rtc::SafeClamp( >+ erle_estimators_[idx][subband], min_erle_, max_erle_[subband]); >+ } >+ >+ for (size_t subband = 0; subband < kSubbands; ++subband) { >+ float alpha = new_erle[subband] > erle_ref_[subband] >+ ? 
kSmthConstantIncreases >+ : kSmthConstantDecreases; >+ alpha = static_cast<float>(is_erle_updated[subband]) * alpha; >+ erle_ref_[subband] += alpha * (new_erle[subband] - erle_ref_[subband]); >+ erle_ref_[subband] = >+ rtc::SafeClamp(erle_ref_[subband], min_erle_, max_erle_[subband]); >+ } >+ >+ for (size_t subband = 0; subband < kSubbands; ++subband) { >+ constexpr int kNumUpdateThr = 50; >+ if (is_erle_updated[subband] && num_updates_[subband] > kNumUpdateThr) { >+ const size_t idx = idx_subbands[subband]; >+ RTC_DCHECK_GT(erle_ref_[subband], 0.f); >+ // Computes the ratio between the erle that is updated using all the >+ // points and the erle that is updated only on signals that share the >+ // same number of active filter sections. >+ float new_correction_factor = >+ erle_estimators_[idx][subband] / erle_ref_[subband]; >+ >+ correction_factors_[idx][subband] += >+ 0.1f * (new_correction_factor - correction_factors_[idx][subband]); >+ } >+ } >+} >+ >+void SignalDependentErleEstimator::ComputeEchoEstimatePerFilterSection( >+ const RenderBuffer& render_buffer, >+ const std::vector<std::array<float, kFftLengthBy2Plus1>>& >+ filter_frequency_response) { >+ const VectorBuffer& spectrum_render_buffer = >+ render_buffer.GetSpectrumBuffer(); >+ >+ RTC_DCHECK_EQ(S2_section_accum_.size() + 1, >+ section_boundaries_blocks_.size()); >+ size_t idx_render = render_buffer.Position(); >+ idx_render = spectrum_render_buffer.OffsetIndex( >+ idx_render, section_boundaries_blocks_[0]); >+ >+ for (size_t section = 0; section < num_sections_; ++section) { >+ std::array<float, kFftLengthBy2Plus1> X2_section; >+ std::array<float, kFftLengthBy2Plus1> H2_section; >+ X2_section.fill(0.f); >+ H2_section.fill(0.f); >+ for (size_t block = section_boundaries_blocks_[section]; >+ block < section_boundaries_blocks_[section + 1]; ++block) { >+ std::transform(X2_section.begin(), X2_section.end(), >+ spectrum_render_buffer.buffer[idx_render].begin(), >+ X2_section.begin(), std::plus<float>()); >+ 
std::transform(H2_section.begin(), H2_section.end(), >+ filter_frequency_response[block].begin(), >+ H2_section.begin(), std::plus<float>()); >+ idx_render = spectrum_render_buffer.IncIndex(idx_render); >+ } >+ >+ std::transform(X2_section.begin(), X2_section.end(), H2_section.begin(), >+ S2_section_accum_[section].begin(), >+ std::multiplies<float>()); >+ } >+ >+ for (size_t section = 1; section < num_sections_; ++section) { >+ std::transform(S2_section_accum_[section - 1].begin(), >+ S2_section_accum_[section - 1].end(), >+ S2_section_accum_[section].begin(), >+ S2_section_accum_[section].begin(), std::plus<float>()); >+ } >+} >+ >+void SignalDependentErleEstimator::ComputeActiveFilterSections( >+ rtc::ArrayView<size_t> number_active_filter_sections) const { >+ std::fill(number_active_filter_sections.begin(), >+ number_active_filter_sections.end(), 0); >+ for (size_t k = 0; k < kFftLengthBy2Plus1; ++k) { >+ size_t section = num_sections_; >+ float target = 0.9f * S2_section_accum_[num_sections_ - 1][k]; >+ while (section > 0 && S2_section_accum_[section - 1][k] >= target) { >+ number_active_filter_sections[k] = --section; >+ } >+ } >+} >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/signal_dependent_erle_estimator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/signal_dependent_erle_estimator.h >new file mode 100644 >index 0000000000000000000000000000000000000000..d8b56c2b20c382dd219c0dd27ad6535cbe495cd5 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/signal_dependent_erle_estimator.h >@@ -0,0 +1,93 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. 
All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef MODULES_AUDIO_PROCESSING_AEC3_SIGNAL_DEPENDENT_ERLE_ESTIMATOR_H_ >+#define MODULES_AUDIO_PROCESSING_AEC3_SIGNAL_DEPENDENT_ERLE_ESTIMATOR_H_ >+ >+#include <memory> >+#include <vector> >+ >+#include "api/array_view.h" >+#include "api/audio/echo_canceller3_config.h" >+#include "modules/audio_processing/aec3/aec3_common.h" >+#include "modules/audio_processing/aec3/render_buffer.h" >+#include "modules/audio_processing/logging/apm_data_dumper.h" >+ >+namespace webrtc { >+ >+// This class estimates the dependency of the Erle on the input signal. By >+// looking at the input signal, it estimates whether the current echo >+// estimate is due to the direct path or to a more reverberant one. Once >+// that estimation is done, the average Erle that this class receives as >+// an input can be refined. >+class SignalDependentErleEstimator { >+ public: >+ explicit SignalDependentErleEstimator(const EchoCanceller3Config& config); >+ >+ ~SignalDependentErleEstimator(); >+ >+ void Reset(); >+ >+ // Returns the Erle per frequency subband. >+ const std::array<float, kFftLengthBy2Plus1>& Erle() const { return erle_; } >+ >+ // Updates the Erle estimate. The Erle that is passed as an input is required >+ // to be an estimation of the average Erle achieved by the linear filter. 
>+ void Update(const RenderBuffer& render_buffer, >+ const std::vector<std::array<float, kFftLengthBy2Plus1>>& >+ filter_frequency_response, >+ rtc::ArrayView<const float> X2, >+ rtc::ArrayView<const float> Y2, >+ rtc::ArrayView<const float> E2, >+ rtc::ArrayView<const float> average_erle, >+ bool converged_filter); >+ >+ void Dump(const std::unique_ptr<ApmDataDumper>& data_dumper) const; >+ >+ static constexpr size_t kSubbands = 6; >+ >+ private: >+ void ComputeNumberOfActiveFilterSections( >+ const RenderBuffer& render_buffer, >+ const std::vector<std::array<float, kFftLengthBy2Plus1>>& >+ filter_frequency_response, >+ rtc::ArrayView<size_t> n_active_filter_sections); >+ >+ void UpdateCorrectionFactors(rtc::ArrayView<const float> X2, >+ rtc::ArrayView<const float> Y2, >+ rtc::ArrayView<const float> E2, >+ rtc::ArrayView<const size_t> n_active_sections); >+ >+ void ComputeEchoEstimatePerFilterSection( >+ const RenderBuffer& render_buffer, >+ const std::vector<std::array<float, kFftLengthBy2Plus1>>& >+ filter_frequency_response); >+ >+ void ComputeActiveFilterSections( >+ rtc::ArrayView<size_t> number_active_filter_sections) const; >+ >+ const float min_erle_; >+ const size_t num_sections_; >+ const size_t num_blocks_; >+ const size_t delay_headroom_blocks_; >+ const std::array<size_t, kFftLengthBy2Plus1> band_to_subband_; >+ const std::array<float, kSubbands> max_erle_; >+ const std::vector<size_t> section_boundaries_blocks_; >+ std::array<float, kFftLengthBy2Plus1> erle_; >+ std::vector<std::array<float, kFftLengthBy2Plus1>> S2_section_accum_; >+ std::vector<std::array<float, kSubbands>> erle_estimators_; >+ std::array<float, kSubbands> erle_ref_; >+ std::vector<std::array<float, kSubbands>> correction_factors_; >+ std::array<int, kSubbands> num_updates_; >+}; >+ >+} // namespace webrtc >+ >+#endif // MODULES_AUDIO_PROCESSING_AEC3_SIGNAL_DEPENDENT_ERLE_ESTIMATOR_H_ >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/signal_dependent_erle_estimator_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/signal_dependent_erle_estimator_unittest.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..aec605f7abf0285d03552b2c11fb3a5322698ac3 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/signal_dependent_erle_estimator_unittest.cc >@@ -0,0 +1,155 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "modules/audio_processing/aec3/signal_dependent_erle_estimator.h" >+ >+#include <algorithm> >+#include <iostream> >+#include <string> >+ >+#include "api/audio/echo_canceller3_config.h" >+#include "modules/audio_processing/aec3/render_buffer.h" >+#include "modules/audio_processing/aec3/render_delay_buffer.h" >+#include "rtc_base/strings/string_builder.h" >+#include "test/gtest.h" >+ >+namespace webrtc { >+ >+namespace { >+ >+void GetActiveFrame(rtc::ArrayView<float> x) { >+ const std::array<float, kBlockSize> frame = { >+ 7459.88, 17209.6, 17383, 20768.9, 16816.7, 18386.3, 4492.83, 9675.85, >+ 6665.52, 14808.6, 9342.3, 7483.28, 19261.7, 4145.98, 1622.18, 13475.2, >+ 7166.32, 6856.61, 21937, 7263.14, 9569.07, 14919, 8413.32, 7551.89, >+ 7848.65, 6011.27, 13080.6, 15865.2, 12656, 17459.6, 4263.93, 4503.03, >+ 9311.79, 21095.8, 12657.9, 13906.6, 19267.2, 11338.1, 16828.9, 11501.6, >+ 11405, 15031.4, 14541.6, 19765.5, 18346.3, 19350.2, 3157.47, 18095.8, >+ 1743.68, 21328.2, 19727.5, 7295.16, 10332.4, 11055.5, 20107.4, 14708.4, >+ 12416.2, 16434, 
2454.69, 9840.8, 6867.23, 1615.75, 6059.9, 8394.19}; >+ RTC_DCHECK_GE(x.size(), frame.size()); >+ std::copy(frame.begin(), frame.end(), x.begin()); >+} >+ >+class TestInputs { >+ public: >+ explicit TestInputs(const EchoCanceller3Config& cfg); >+ ~TestInputs(); >+ const RenderBuffer& GetRenderBuffer() { return *render_buffer_; } >+ rtc::ArrayView<const float> GetX2() { return X2_; } >+ rtc::ArrayView<const float> GetY2() { return Y2_; } >+ rtc::ArrayView<const float> GetE2() { return E2_; } >+ std::vector<std::array<float, kFftLengthBy2Plus1>> GetH2() { return H2_; } >+ void Update(); >+ >+ private: >+ void UpdateCurrentPowerSpectra(); >+ int n_ = 0; >+ std::unique_ptr<RenderDelayBuffer> render_delay_buffer_; >+ RenderBuffer* render_buffer_; >+ std::array<float, kFftLengthBy2Plus1> X2_; >+ std::array<float, kFftLengthBy2Plus1> Y2_; >+ std::array<float, kFftLengthBy2Plus1> E2_; >+ std::vector<std::array<float, kFftLengthBy2Plus1>> H2_; >+ std::vector<std::vector<float>> x_; >+}; >+ >+TestInputs::TestInputs(const EchoCanceller3Config& cfg) >+ : render_delay_buffer_(RenderDelayBuffer::Create2(cfg, 1)), >+ H2_(cfg.filter.main.length_blocks), >+ x_(1, std::vector<float>(kBlockSize, 0.f)) { >+ render_delay_buffer_->SetDelay(4); >+ render_buffer_ = render_delay_buffer_->GetRenderBuffer(); >+ for (auto& H : H2_) { >+ H.fill(0.f); >+ } >+ H2_[0].fill(1.0f); >+} >+ >+TestInputs::~TestInputs() = default; >+ >+void TestInputs::Update() { >+ if (n_ % 2 == 0) { >+ std::fill(x_[0].begin(), x_[0].end(), 0.f); >+ } else { >+ GetActiveFrame(x_[0]); >+ } >+ >+ render_delay_buffer_->Insert(x_); >+ render_delay_buffer_->PrepareCaptureProcessing(); >+ UpdateCurrentPowerSpectra(); >+ ++n_; >+} >+ >+void TestInputs::UpdateCurrentPowerSpectra() { >+ const VectorBuffer& spectrum_render_buffer = >+ render_buffer_->GetSpectrumBuffer(); >+ size_t idx = render_buffer_->Position(); >+ size_t prev_idx = spectrum_render_buffer.OffsetIndex(idx, 1); >+ auto& X2 = spectrum_render_buffer.buffer[idx]; 
>+ auto& X2_prev = spectrum_render_buffer.buffer[prev_idx]; >+ std::copy(X2.begin(), X2.end(), X2_.begin()); >+ RTC_DCHECK_EQ(X2.size(), Y2_.size()); >+ for (size_t k = 0; k < X2.size(); ++k) { >+ E2_[k] = 0.01f * X2_prev[k]; >+ Y2_[k] = X2[k] + E2_[k]; >+ } >+} >+ >+} // namespace >+ >+TEST(SignalDependentErleEstimator, SweepSettings) { >+ EchoCanceller3Config cfg; >+ size_t max_length_blocks = 50; >+ for (size_t blocks = 0; blocks < max_length_blocks; blocks = blocks + 10) { >+ for (size_t delay_headroom = 0; delay_headroom < 5; ++delay_headroom) { >+ for (size_t num_sections = 2; num_sections < max_length_blocks; >+ ++num_sections) { >+ cfg.filter.main.length_blocks = blocks; >+ cfg.filter.main_initial.length_blocks = >+ std::min(cfg.filter.main_initial.length_blocks, blocks); >+ cfg.delay.delay_headroom_blocks = delay_headroom; >+ cfg.erle.num_sections = num_sections; >+ if (EchoCanceller3Config::Validate(&cfg)) { >+ SignalDependentErleEstimator s(cfg); >+ std::array<float, kFftLengthBy2Plus1> average_erle; >+ average_erle.fill(cfg.erle.max_l); >+ TestInputs inputs(cfg); >+ for (size_t n = 0; n < 10; ++n) { >+ inputs.Update(); >+ s.Update(inputs.GetRenderBuffer(), inputs.GetH2(), inputs.GetX2(), >+ inputs.GetY2(), inputs.GetE2(), average_erle, true); >+ } >+ } >+ } >+ } >+ } >+} >+ >+TEST(SignalDependentErleEstimator, LongerRun) { >+ EchoCanceller3Config cfg; >+ cfg.filter.main.length_blocks = 2; >+ cfg.filter.main_initial.length_blocks = 1; >+ cfg.delay.delay_headroom_blocks = 0; >+ cfg.delay.hysteresis_limit_1_blocks = 0; >+ cfg.erle.num_sections = 2; >+ EXPECT_EQ(EchoCanceller3Config::Validate(&cfg), true); >+ std::array<float, kFftLengthBy2Plus1> average_erle; >+ average_erle.fill(cfg.erle.max_l); >+ SignalDependentErleEstimator s(cfg); >+ TestInputs inputs(cfg); >+ for (size_t n = 0; n < 200; ++n) { >+ inputs.Update(); >+ s.Update(inputs.GetRenderBuffer(), inputs.GetH2(), inputs.GetX2(), >+ inputs.GetY2(), inputs.GetE2(), average_erle, true); >+ } >+} >+ 
>+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/skew_estimator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/skew_estimator.cc >index 310e4e9dd1da22c541ba3a0708b4a44fef4b05fb..a2099fcd01301c2b95848368a7341fcd6d9e7735 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/skew_estimator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/skew_estimator.cc >@@ -10,7 +10,6 @@ > #include "modules/audio_processing/aec3/skew_estimator.h" > > #include <algorithm> >-#include <numeric> > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/skew_estimator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/skew_estimator.h >index ff2526025a87faafafe67dd09c30b9f747c535bc..b0946d833e543a872a6ab23c7ddf028d9ecca651 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/skew_estimator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/skew_estimator.h >@@ -11,6 +11,7 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_SKEW_ESTIMATOR_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_SKEW_ESTIMATOR_H_ > >+#include <stddef.h> > #include <vector> > > #include "absl/types/optional.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/stationarity_estimator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/stationarity_estimator.cc >index efeabf1117f2098f434f728997657a64529e88c4..25100bfd4af56e745c95cfd6d86676df3c7c2081 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/stationarity_estimator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/stationarity_estimator.cc >@@ -14,6 +14,7 @@ > #include <array> > #include <vector> > >+#include "api/array_view.h" > #include 
"modules/audio_processing/aec3/aec3_common.h" > #include "modules/audio_processing/aec3/vector_buffer.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" >@@ -40,7 +41,6 @@ void StationarityEstimator::Reset() { > noise_.Reset(); > hangovers_.fill(0); > stationarity_flags_.fill(false); >- render_reverb_.Reset(); > } > > // Update just the noise estimator. Usefull until the delay is known >@@ -54,9 +54,9 @@ void StationarityEstimator::UpdateNoiseEstimator( > > void StationarityEstimator::UpdateStationarityFlags( > const VectorBuffer& spectrum_buffer, >+ rtc::ArrayView<const float> render_reverb_contribution_spectrum, > int idx_current, >- int num_lookahead, >- float reverb_decay) { >+ int num_lookahead) { > std::array<int, kWindowLength> indexes; > int num_lookahead_bounded = std::min(num_lookahead, kWindowLength - 1); > int idx = idx_current; >@@ -79,12 +79,9 @@ void StationarityEstimator::UpdateStationarityFlags( > spectrum_buffer.DecIndex(indexes[kWindowLength - 1]), > spectrum_buffer.OffsetIndex(idx_current, -(num_lookahead_bounded + 1))); > >- int idx_past = spectrum_buffer.IncIndex(idx_current); >- render_reverb_.UpdateReverbContributionsNoFreqShaping( >- spectrum_buffer.buffer[idx_past], 1.0f, reverb_decay); > for (size_t k = 0; k < stationarity_flags_.size(); ++k) { > stationarity_flags_[k] = EstimateBandStationarity( >- spectrum_buffer, render_reverb_.GetPowerSpectrum(), indexes, k); >+ spectrum_buffer, render_reverb_contribution_spectrum, indexes, k); > } > UpdateHangover(); > SmoothStationaryPerFreq(); >@@ -102,7 +99,7 @@ bool StationarityEstimator::IsBlockStationary() const { > > bool StationarityEstimator::EstimateBandStationarity( > const VectorBuffer& spectrum_buffer, >- const std::array<float, kFftLengthBy2Plus1>& reverb, >+ rtc::ArrayView<const float> reverb, > const std::array<int, kWindowLength>& indexes, > size_t band) const { > constexpr float kThrStationarity = 10.f; >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/stationarity_estimator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/stationarity_estimator.h >index e2c5a6253471f8486f11bee7c6193774c54ae6d9..704859a1fe783c59c1b8eef79461c550f19e400e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/stationarity_estimator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/stationarity_estimator.h >@@ -11,18 +11,19 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_STATIONARITY_ESTIMATOR_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_STATIONARITY_ESTIMATOR_H_ > >+#include <stddef.h> > #include <array> > #include <memory> >-#include <vector> > > #include "api/array_view.h" >-#include "modules/audio_processing/aec3/aec3_common.h" >+#include "modules/audio_processing/aec3/aec3_common.h" // kFftLengthBy2Plus1... > #include "modules/audio_processing/aec3/reverb_model.h" >-#include "modules/audio_processing/aec3/vector_buffer.h" >+#include "rtc_base/checks.h" > > namespace webrtc { > > class ApmDataDumper; >+struct VectorBuffer; > > class StationarityEstimator { > public: >@@ -37,10 +38,11 @@ class StationarityEstimator { > > // Update the flag indicating whether this current frame is stationary. For > // getting a more robust estimation, it looks at future and/or past frames. >- void UpdateStationarityFlags(const VectorBuffer& spectrum_buffer, >- int idx_current, >- int num_lookahead, >- float reverb_decay); >+ void UpdateStationarityFlags( >+ const VectorBuffer& spectrum_buffer, >+ rtc::ArrayView<const float> render_reverb_contribution_spectrum, >+ int idx_current, >+ int num_lookahead); > > // Returns true if the current band is stationary. > bool IsBandStationary(size_t band) const { >@@ -57,11 +59,10 @@ class StationarityEstimator { > > // Get an estimation of the stationarity for the current band by looking > // at the past/present/future available data. 
>- bool EstimateBandStationarity( >- const VectorBuffer& spectrum_buffer, >- const std::array<float, kFftLengthBy2Plus1>& reverb, >- const std::array<int, kWindowLength>& indexes, >- size_t band) const; >+ bool EstimateBandStationarity(const VectorBuffer& spectrum_buffer, >+ rtc::ArrayView<const float> reverb, >+ const std::array<int, kWindowLength>& indexes, >+ size_t band) const; > > // True if all bands at the current point are stationary. > bool AreAllBandsStationary(); >@@ -111,7 +112,6 @@ class StationarityEstimator { > NoiseSpectrum noise_; > std::array<int, kFftLengthBy2Plus1> hangovers_; > std::array<bool, kFftLengthBy2Plus1> stationarity_flags_; >- ReverbModel render_reverb_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subband_erle_estimator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subband_erle_estimator.cc >index d8cb7a76315f26624a106540d75e69a39cecd725..9453e5739f0d6a39143088739b311875b35792b7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subband_erle_estimator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subband_erle_estimator.cc >@@ -11,35 +11,38 @@ > #include "modules/audio_processing/aec3/subband_erle_estimator.h" > > #include <algorithm> >-#include <memory> >+#include <functional> > >-#include "absl/types/optional.h" >-#include "api/array_view.h" >-#include "modules/audio_processing/aec3/aec3_common.h" >-#include "modules/audio_processing/logging/apm_data_dumper.h" >+#include "rtc_base/checks.h" > #include "rtc_base/numerics/safe_minmax.h" > #include "system_wrappers/include/field_trial.h" > > namespace webrtc { > > namespace { >-constexpr int kPointsToAccumulate = 6; >+ > constexpr float kX2BandEnergyThreshold = 44015068.0f; >-constexpr int kErleHold = 100; >-constexpr int kBlocksForOnsetDetection = kErleHold + 150; >+constexpr int kBlocksToHoldErle = 100; 
>+constexpr int kBlocksForOnsetDetection = kBlocksToHoldErle + 150; >+constexpr int kPointsToAccumulate = 6; > > bool EnableAdaptErleOnLowRender() { > return !field_trial::IsEnabled("WebRTC-Aec3AdaptErleOnLowRenderKillSwitch"); > } > >+std::array<float, kFftLengthBy2Plus1> SetMaxErleBands(float max_erle_l, >+ float max_erle_h) { >+ std::array<float, kFftLengthBy2Plus1> max_erle; >+ std::fill(max_erle.begin(), max_erle.begin() + kFftLengthBy2 / 2, max_erle_l); >+ std::fill(max_erle.begin() + kFftLengthBy2 / 2, max_erle.end(), max_erle_h); >+ return max_erle; >+} >+ > } // namespace > >-SubbandErleEstimator::SubbandErleEstimator(float min_erle, >- float max_erle_lf, >- float max_erle_hf) >- : min_erle_(min_erle), >- max_erle_lf_(max_erle_lf), >- max_erle_hf_(max_erle_hf), >+SubbandErleEstimator::SubbandErleEstimator(const EchoCanceller3Config& config) >+ : min_erle_(config.erle.min), >+ max_erle_(SetMaxErleBands(config.erle.max_l, config.erle.max_h)), > adapt_on_low_render_(EnableAdaptErleOnLowRender()) { > Reset(); > } >@@ -49,8 +52,9 @@ SubbandErleEstimator::~SubbandErleEstimator() = default; > void SubbandErleEstimator::Reset() { > erle_.fill(min_erle_); > erle_onsets_.fill(min_erle_); >- hold_counters_.fill(0); > coming_onset_.fill(true); >+ hold_counters_.fill(0); >+ ResetAccumulatedSpectra(); > } > > void SubbandErleEstimator::Update(rtc::ArrayView<const float> X2, >@@ -62,10 +66,8 @@ void SubbandErleEstimator::Update(rtc::ArrayView<const float> X2, > // Note that the use of the converged_filter flag already imposed > // a minimum of the erle that can be estimated as that flag would > // be false if the filter is performing poorly. 
>- constexpr size_t kFftLengthBy4 = kFftLengthBy2 / 2; >- UpdateBands(X2, Y2, E2, 1, kFftLengthBy4, max_erle_lf_, onset_detection); >- UpdateBands(X2, Y2, E2, kFftLengthBy4, kFftLengthBy2, max_erle_hf_, >- onset_detection); >+ UpdateAccumulatedSpectra(X2, Y2, E2); >+ UpdateBands(onset_detection); > } > > if (onset_detection) { >@@ -78,53 +80,45 @@ void SubbandErleEstimator::Update(rtc::ArrayView<const float> X2, > > void SubbandErleEstimator::Dump( > const std::unique_ptr<ApmDataDumper>& data_dumper) const { >- data_dumper->DumpRaw("aec3_erle", Erle()); > data_dumper->DumpRaw("aec3_erle_onset", ErleOnsets()); > } > >-void SubbandErleEstimator::UpdateBands(rtc::ArrayView<const float> X2, >- rtc::ArrayView<const float> Y2, >- rtc::ArrayView<const float> E2, >- size_t start, >- size_t stop, >- float max_erle, >- bool onset_detection) { >- auto erle_band_update = [](float erle_band, float new_erle, >- bool low_render_energy, float alpha_inc, >- float alpha_dec, float min_erle, float max_erle) { >- if (new_erle < erle_band && low_render_energy) { >- // Decreases are not allowed if low render energy signals were used for >- // the erle computation. >- return erle_band; >+void SubbandErleEstimator::UpdateBands(bool onset_detection) { >+ std::array<float, kFftLengthBy2> new_erle; >+ std::array<bool, kFftLengthBy2> is_erle_updated; >+ is_erle_updated.fill(false); >+ >+ for (size_t k = 1; k < kFftLengthBy2; ++k) { >+ if (accum_spectra_.num_points_[k] == kPointsToAccumulate && >+ accum_spectra_.E2_[k] > 0.f) { >+ new_erle[k] = accum_spectra_.Y2_[k] / accum_spectra_.E2_[k]; >+ is_erle_updated[k] = true; > } >- float alpha = new_erle > erle_band ? 
alpha_inc : alpha_dec; >- float erle_band_out = erle_band; >- erle_band_out = erle_band + alpha * (new_erle - erle_band); >- erle_band_out = rtc::SafeClamp(erle_band_out, min_erle, max_erle); >- return erle_band_out; >- }; >- >- for (size_t k = start; k < stop; ++k) { >- if (adapt_on_low_render_ || X2[k] > kX2BandEnergyThreshold) { >- bool low_render_energy = false; >- absl::optional<float> new_erle = instantaneous_erle_.Update( >- X2[k], Y2[k], E2[k], k, &low_render_energy); >- if (new_erle) { >- RTC_DCHECK(adapt_on_low_render_ || !low_render_energy); >- if (onset_detection && !low_render_energy) { >- if (coming_onset_[k]) { >- coming_onset_[k] = false; >- erle_onsets_[k] = erle_band_update( >- erle_onsets_[k], new_erle.value(), low_render_energy, 0.15f, >- 0.3f, min_erle_, max_erle); >- } >- hold_counters_[k] = kBlocksForOnsetDetection; >+ } >+ >+ if (onset_detection) { >+ for (size_t k = 1; k < kFftLengthBy2; ++k) { >+ if (is_erle_updated[k] && !accum_spectra_.low_render_energy_[k]) { >+ if (coming_onset_[k]) { >+ coming_onset_[k] = false; >+ float alpha = new_erle[k] < erle_onsets_[k] ? 0.3f : 0.15f; >+ erle_onsets_[k] = rtc::SafeClamp( >+ erle_onsets_[k] + alpha * (new_erle[k] - erle_onsets_[k]), >+ min_erle_, max_erle_[k]); > } >+ hold_counters_[k] = kBlocksForOnsetDetection; >+ } >+ } >+ } > >- erle_[k] = >- erle_band_update(erle_[k], new_erle.value(), low_render_energy, >- 0.05f, 0.1f, min_erle_, max_erle); >+ for (size_t k = 1; k < kFftLengthBy2; ++k) { >+ if (is_erle_updated[k]) { >+ float alpha = 0.05f; >+ if (new_erle[k] < erle_[k]) { >+ alpha = accum_spectra_.low_render_energy_[k] ? 
0.f : 0.1f; > } >+ erle_[k] = rtc::SafeClamp(erle_[k] + alpha * (new_erle[k] - erle_[k]), >+ min_erle_, max_erle_[k]); > } > } > } >@@ -132,7 +126,7 @@ void SubbandErleEstimator::UpdateBands(rtc::ArrayView<const float> X2, > void SubbandErleEstimator::DecreaseErlePerBandForLowRenderSignals() { > for (size_t k = 1; k < kFftLengthBy2; ++k) { > hold_counters_[k]--; >- if (hold_counters_[k] <= (kBlocksForOnsetDetection - kErleHold)) { >+ if (hold_counters_[k] <= (kBlocksForOnsetDetection - kBlocksToHoldErle)) { > if (erle_[k] > erle_onsets_[k]) { > erle_[k] = std::max(erle_onsets_[k], 0.97f * erle_[k]); > RTC_DCHECK_LE(min_erle_, erle_[k]); >@@ -145,43 +139,55 @@ void SubbandErleEstimator::DecreaseErlePerBandForLowRenderSignals() { > } > } > >-SubbandErleEstimator::ErleInstantaneous::ErleInstantaneous() { >- Reset(); >+void SubbandErleEstimator::ResetAccumulatedSpectra() { >+ accum_spectra_.Y2_.fill(0.f); >+ accum_spectra_.E2_.fill(0.f); >+ accum_spectra_.num_points_.fill(0); >+ accum_spectra_.low_render_energy_.fill(false); > } > >-SubbandErleEstimator::ErleInstantaneous::~ErleInstantaneous() = default; >- >-absl::optional<float> SubbandErleEstimator::ErleInstantaneous::Update( >- float X2, >- float Y2, >- float E2, >- size_t band, >- bool* low_render_energy) { >- absl::optional<float> erle_instantaneous = absl::nullopt; >- RTC_DCHECK_LT(band, kFftLengthBy2Plus1); >- Y2_acum_[band] += Y2; >- E2_acum_[band] += E2; >- low_render_energy_[band] = >- low_render_energy_[band] || X2 < kX2BandEnergyThreshold; >- if (++num_points_[band] == kPointsToAccumulate) { >- if (E2_acum_[band]) { >- erle_instantaneous = Y2_acum_[band] / E2_acum_[band]; >+void SubbandErleEstimator::UpdateAccumulatedSpectra( >+ rtc::ArrayView<const float> X2, >+ rtc::ArrayView<const float> Y2, >+ rtc::ArrayView<const float> E2) { >+ auto& st = accum_spectra_; >+ if (adapt_on_low_render_) { >+ if (st.num_points_[0] == kPointsToAccumulate) { >+ st.num_points_[0] = 0; >+ st.Y2_.fill(0.f); >+ 
st.E2_.fill(0.f); >+ st.low_render_energy_.fill(false); >+ } >+ std::transform(Y2.begin(), Y2.end(), st.Y2_.begin(), st.Y2_.begin(), >+ std::plus<float>()); >+ std::transform(E2.begin(), E2.end(), st.E2_.begin(), st.E2_.begin(), >+ std::plus<float>()); >+ >+ for (size_t k = 0; k < X2.size(); ++k) { >+ st.low_render_energy_[k] = >+ st.low_render_energy_[k] || X2[k] < kX2BandEnergyThreshold; >+ } >+ st.num_points_[0]++; >+ st.num_points_.fill(st.num_points_[0]); >+ >+ } else { >+ // The update is always done using high render energy signals and >+ // therefore the field accum_spectra_.low_render_energy_ does not need to >+ // be modified. >+ for (size_t k = 0; k < X2.size(); ++k) { >+ if (X2[k] > kX2BandEnergyThreshold) { >+ if (st.num_points_[k] == kPointsToAccumulate) { >+ st.Y2_[k] = 0.f; >+ st.E2_[k] = 0.f; >+ st.num_points_[k] = 0; >+ } >+ st.Y2_[k] += Y2[k]; >+ st.E2_[k] += E2[k]; >+ st.num_points_[k]++; >+ } >+ RTC_DCHECK_EQ(st.low_render_energy_[k], false); > } >- *low_render_energy = low_render_energy_[band]; >- num_points_[band] = 0; >- Y2_acum_[band] = 0.f; >- E2_acum_[band] = 0.f; >- low_render_energy_[band] = false; > } >- >- return erle_instantaneous; >-} >- >-void SubbandErleEstimator::ErleInstantaneous::Reset() { >- Y2_acum_.fill(0.f); >- E2_acum_.fill(0.f); >- low_render_energy_.fill(false); >- num_points_.fill(0); > } > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subband_erle_estimator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subband_erle_estimator.h >index aa5e5ccb2486362553218a9c5990f37c03dcd6b8..b9862dbc6def1f314a87ed44effe2633ad47ba75 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subband_erle_estimator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subband_erle_estimator.h >@@ -11,11 +11,13 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_SUBBAND_ERLE_ESTIMATOR_H_ > #define 
MODULES_AUDIO_PROCESSING_AEC3_SUBBAND_ERLE_ESTIMATOR_H_ > >+#include <stddef.h> > #include <array> > #include <memory> >+#include <vector> > >-#include "absl/types/optional.h" > #include "api/array_view.h" >+#include "api/audio/echo_canceller3_config.h" > #include "modules/audio_processing/aec3/aec3_common.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" > >@@ -24,7 +26,7 @@ namespace webrtc { > // Estimates the echo return loss enhancement for each frequency subband. > class SubbandErleEstimator { > public: >- SubbandErleEstimator(float min_erle, float max_erle_lf, float max_erle_hf); >+ explicit SubbandErleEstimator(const EchoCanceller3Config& config); > ~SubbandErleEstimator(); > > // Resets the ERLE estimator. >@@ -41,55 +43,35 @@ class SubbandErleEstimator { > const std::array<float, kFftLengthBy2Plus1>& Erle() const { return erle_; } > > // Returns the ERLE estimate at onsets. >- const std::array<float, kFftLengthBy2Plus1>& ErleOnsets() const { >- return erle_onsets_; >- } >+ rtc::ArrayView<const float> ErleOnsets() const { return erle_onsets_; } > > void Dump(const std::unique_ptr<ApmDataDumper>& data_dumper) const; > > private: >- void UpdateBands(rtc::ArrayView<const float> X2, >- rtc::ArrayView<const float> Y2, >- rtc::ArrayView<const float> E2, >- size_t start, >- size_t stop, >- float max_erle, >- bool onset_detection); >- void DecreaseErlePerBandForLowRenderSignals(); >- >- class ErleInstantaneous { >- public: >- ErleInstantaneous(); >- ~ErleInstantaneous(); >- // Updates the ERLE for a band with a new block. Returns absl::nullopt >- // if not enough points were accumulated for doing the estimation, >- // otherwise, it returns the ERLE. When the ERLE is returned, the >- // low_render_energy flag contains information on whether the estimation was >- // done using low level render signals. 
>- absl::optional<float> Update(float X2, >- float Y2, >- float E2, >- size_t band, >- bool* low_render_energy); >- // Resets the ERLE estimator to its initial state. >- void Reset(); >- >- private: >- std::array<float, kFftLengthBy2Plus1> Y2_acum_; >- std::array<float, kFftLengthBy2Plus1> E2_acum_; >+ struct AccumulatedSpectra { >+ std::array<float, kFftLengthBy2Plus1> Y2_; >+ std::array<float, kFftLengthBy2Plus1> E2_; > std::array<bool, kFftLengthBy2Plus1> low_render_energy_; > std::array<int, kFftLengthBy2Plus1> num_points_; > }; > >- ErleInstantaneous instantaneous_erle_; >+ void UpdateAccumulatedSpectra(rtc::ArrayView<const float> X2, >+ rtc::ArrayView<const float> Y2, >+ rtc::ArrayView<const float> E2); >+ >+ void ResetAccumulatedSpectra(); >+ >+ void UpdateBands(bool onset_detection); >+ void DecreaseErlePerBandForLowRenderSignals(); >+ >+ const float min_erle_; >+ const std::array<float, kFftLengthBy2Plus1> max_erle_; >+ const bool adapt_on_low_render_; >+ AccumulatedSpectra accum_spectra_; > std::array<float, kFftLengthBy2Plus1> erle_; > std::array<float, kFftLengthBy2Plus1> erle_onsets_; > std::array<bool, kFftLengthBy2Plus1> coming_onset_; > std::array<int, kFftLengthBy2Plus1> hold_counters_; >- const float min_erle_; >- const float max_erle_lf_; >- const float max_erle_hf_; >- const bool adapt_on_low_render_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor.cc >index 609e8ac7ed1b927c72809b91421af7c263854b8d..90069c7a749fc5bb1523f778f49d79a578b7d106 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor.cc >@@ -11,12 +11,12 @@ > #include "modules/audio_processing/aec3/subtractor.h" > > #include <algorithm> >-#include <numeric> >+#include <utility> > > 
#include "api/array_view.h" >+#include "modules/audio_processing/aec3/fft_data.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" > #include "rtc_base/checks.h" >-#include "rtc_base/logging.h" > #include "rtc_base/numerics/safe_minmax.h" > #include "system_wrappers/include/field_trial.h" > >@@ -191,7 +191,7 @@ void Subtractor::Process(const RenderBuffer& render_buffer, > adaptation_during_saturation_, &shadow_saturation); > > // Compute the signal powers in the subtractor output. >- output->UpdatePowers(y); >+ output->ComputeMetrics(y); > > // Adjust the filter if needed. > bool main_filter_adjusted = false; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor.h >index c92a971f1178373ba0a362c4ca4fb667bc97dbbd..bec014deec957d3ccdee4722ce1fbdd786b20406 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor.h >@@ -11,11 +11,13 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_SUBTRACTOR_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_SUBTRACTOR_H_ > >-#include <algorithm> >+#include <math.h> >+#include <stddef.h> > #include <array> > #include <vector> >-#include "math.h" > >+#include "api/array_view.h" >+#include "api/audio/echo_canceller3_config.h" > #include "modules/audio_processing/aec3/adaptive_fir_filter.h" > #include "modules/audio_processing/aec3/aec3_common.h" > #include "modules/audio_processing/aec3/aec3_fft.h" >@@ -23,10 +25,11 @@ > #include "modules/audio_processing/aec3/echo_path_variability.h" > #include "modules/audio_processing/aec3/main_filter_update_gain.h" > #include "modules/audio_processing/aec3/render_buffer.h" >+#include "modules/audio_processing/aec3/render_signal_analyzer.h" > #include "modules/audio_processing/aec3/shadow_filter_update_gain.h" > #include 
"modules/audio_processing/aec3/subtractor_output.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" >-#include "modules/audio_processing/utility/ooura_fft.h" >+#include "rtc_base/checks.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor_output.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor_output.cc >index affa4a3a06fa24db1cf3e52b27c3438212b0ed79..922cc3d1b3f3e413e5f27f0c47d75eb25c97aade 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor_output.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor_output.cc >@@ -33,7 +33,7 @@ void SubtractorOutput::Reset() { > y2 = 0.f; > } > >-void SubtractorOutput::UpdatePowers(rtc::ArrayView<const float> y) { >+void SubtractorOutput::ComputeMetrics(rtc::ArrayView<const float> y) { > const auto sum_of_squares = [](float a, float b) { return a + b * b; }; > y2 = std::accumulate(y.begin(), y.end(), 0.f, sum_of_squares); > e2_main = std::accumulate(e_main.begin(), e_main.end(), 0.f, sum_of_squares); >@@ -42,6 +42,14 @@ void SubtractorOutput::UpdatePowers(rtc::ArrayView<const float> y) { > s2_main = std::accumulate(s_main.begin(), s_main.end(), 0.f, sum_of_squares); > s2_shadow = > std::accumulate(s_shadow.begin(), s_shadow.end(), 0.f, sum_of_squares); >+ >+ s_main_max_abs = *std::max_element(s_main.begin(), s_main.end()); >+ s_main_max_abs = std::max(s_main_max_abs, >+ -(*std::min_element(s_main.begin(), s_main.end()))); >+ >+ s_shadow_max_abs = *std::max_element(s_shadow.begin(), s_shadow.end()); >+ s_shadow_max_abs = std::max( >+ s_shadow_max_abs, -(*std::min_element(s_shadow.begin(), s_shadow.end()))); > } > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor_output.h 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor_output.h >index 89727e9396f122baf36bdf895abd4be478ef88b3..5f6fd3ed71207e419b476887f74739b6aaa8fa87 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor_output.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor_output.h >@@ -36,12 +36,14 @@ struct SubtractorOutput { > float e2_main = 0.f; > float e2_shadow = 0.f; > float y2 = 0.f; >+ float s_main_max_abs = 0.f; >+ float s_shadow_max_abs = 0.f; > > // Reset the struct content. > void Reset(); > > // Updates the powers of the signals. >- void UpdatePowers(rtc::ArrayView<const float> y); >+ void ComputeMetrics(rtc::ArrayView<const float> y); > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor_output_analyzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor_output_analyzer.cc >index 3cacb4563014a365a9a1e279f96c02ac64f1a1fb..9374b8099f47b6b43b17582cf43f8d53ca3f863c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor_output_analyzer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor_output_analyzer.cc >@@ -10,9 +10,9 @@ > > #include "modules/audio_processing/aec3/subtractor_output_analyzer.h" > >-#include <array> >-#include <numeric> >+#include <algorithm> > >+#include "modules/audio_processing/aec3/aec3_common.h" > #include "system_wrappers/include/field_trial.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor_output_analyzer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor_output_analyzer.h >index b59a68e5e60b19451eaa502381cae2d5c23f184a..0e23ad5dfaf39b0bb40d2b5a75ddddab07308a10 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor_output_analyzer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor_output_analyzer.h >@@ -11,7 +11,6 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_SUBTRACTOR_OUTPUT_ANALYZER_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_SUBTRACTOR_OUTPUT_ANALYZER_H_ > >-#include "api/array_view.h" > #include "modules/audio_processing/aec3/subtractor_output.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor_unittest.cc >index ba744a6a8396537e219e86190202a2dc140feb78..8d14cc1a072527887774bdd57728e355a3068083 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/subtractor_unittest.cc >@@ -44,7 +44,7 @@ float RunSubtractorTest(int num_blocks_to_process, > config.delay.min_echo_path_delay_blocks = 0; > config.delay.default_delay = 1; > std::unique_ptr<RenderDelayBuffer> render_delay_buffer( >- RenderDelayBuffer::Create(config, 3)); >+ RenderDelayBuffer::Create2(config, 3)); > RenderSignalAnalyzer render_signal_analyzer(config); > Random random_generator(42U); > Aec3Fft fft; >@@ -127,7 +127,7 @@ TEST(Subtractor, DISABLED_NullOutput) { > EchoCanceller3Config config; > Subtractor subtractor(config, &data_dumper, DetectOptimization()); > std::unique_ptr<RenderDelayBuffer> render_delay_buffer( >- RenderDelayBuffer::Create(config, 3)); >+ RenderDelayBuffer::Create2(config, 3)); > RenderSignalAnalyzer render_signal_analyzer(config); > std::vector<float> y(kBlockSize, 0.f); > >@@ -143,7 +143,7 @@ TEST(Subtractor, WrongCaptureSize) { > EchoCanceller3Config config; > Subtractor subtractor(config, &data_dumper, DetectOptimization()); > std::unique_ptr<RenderDelayBuffer> 
render_delay_buffer( >- RenderDelayBuffer::Create(config, 3)); >+ RenderDelayBuffer::Create2(config, 3)); > RenderSignalAnalyzer render_signal_analyzer(config); > std::vector<float> y(kBlockSize - 1, 0.f); > SubtractorOutput output; >@@ -210,21 +210,4 @@ TEST(Subtractor, NonConvergenceOnUncorrelatedSignals) { > } > } > >-// Verifies that the subtractor is properly reset when there is an echo path >-// change. >-TEST(Subtractor, EchoPathChangeReset) { >- std::vector<int> blocks_with_echo_path_changes; >- blocks_with_echo_path_changes.push_back(99); >- for (size_t filter_length_blocks : {12, 20, 30}) { >- for (size_t delay_samples : {0, 64, 150, 200, 301}) { >- SCOPED_TRACE(ProduceDebugText(delay_samples, filter_length_blocks)); >- >- float echo_to_nearend_power = RunSubtractorTest( >- 100, delay_samples, filter_length_blocks, filter_length_blocks, false, >- blocks_with_echo_path_changes); >- EXPECT_NEAR(1.f, echo_to_nearend_power, 0.0000001f); >- } >- } >-} >- > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_filter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_filter.cc >index 87e3008dc76aad8b26c709a44651ff3d6703792c..6fe296c219c632408d4797cb42a9967bf504ba4e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_filter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_filter.cc >@@ -10,13 +10,14 @@ > > #include "modules/audio_processing/aec3/suppression_filter.h" > >-#include <math.h> > #include <algorithm> >+#include <cmath> > #include <cstring> > #include <functional> >-#include <numeric> >+#include <iterator> > >-#include "modules/audio_processing/utility/ooura_fft.h" >+#include "modules/audio_processing/aec3/vector_math.h" >+#include "rtc_base/checks.h" > #include "rtc_base/numerics/safe_minmax.h" > > namespace webrtc { >@@ -59,8 +60,10 @@ const float 
kSqrtHanning[kFftLength] = { > > } // namespace > >-SuppressionFilter::SuppressionFilter(int sample_rate_hz) >- : sample_rate_hz_(sample_rate_hz), >+SuppressionFilter::SuppressionFilter(Aec3Optimization optimization, >+ int sample_rate_hz) >+ : optimization_(optimization), >+ sample_rate_hz_(sample_rate_hz), > fft_(), > e_output_old_(NumBandsForRate(sample_rate_hz_)) { > RTC_DCHECK(ValidFullBandRate(sample_rate_hz_)); >@@ -90,18 +93,17 @@ void SuppressionFilter::ApplyGain( > std::transform(suppression_gain.begin(), suppression_gain.end(), E.im.begin(), > E.im.begin(), std::multiplies<float>()); > >- // Compute and add the comfort noise. >- std::array<float, kFftLengthBy2Plus1> scaled_comfort_noise; >+ // Comfort noise gain is sqrt(1-g^2), where g is the suppression gain. >+ std::array<float, kFftLengthBy2Plus1> noise_gain; > std::transform(suppression_gain.begin(), suppression_gain.end(), >- comfort_noise.re.begin(), scaled_comfort_noise.begin(), >- [](float a, float b) { return std::max(1.f - a, 0.f) * b; }); >- std::transform(scaled_comfort_noise.begin(), scaled_comfort_noise.end(), >- E.re.begin(), E.re.begin(), std::plus<float>()); >- std::transform(suppression_gain.begin(), suppression_gain.end(), >- comfort_noise.im.begin(), scaled_comfort_noise.begin(), >- [](float a, float b) { return std::max(1.f - a, 0.f) * b; }); >- std::transform(scaled_comfort_noise.begin(), scaled_comfort_noise.end(), >- E.im.begin(), E.im.begin(), std::plus<float>()); >+ noise_gain.begin(), [](float g) { return 1.f - g * g; }); >+ aec3::VectorMath(optimization_).Sqrt(noise_gain); >+ >+ // Scale and add the comfort noise. >+ for (size_t k = 0; k < kFftLengthBy2Plus1; k++) { >+ E.re[k] += noise_gain[k] * comfort_noise.re[k]; >+ E.im[k] += noise_gain[k] * comfort_noise.im[k]; >+ } > > // Synthesis filterbank. > std::array<float, kFftLength> e_extended; >@@ -135,7 +137,7 @@ void SuppressionFilter::ApplyGain( > > // Scale and apply the noise to the signals. 
> const float high_bands_noise_scaling = >- 0.4f * std::max(1.f - high_bands_gain, 0.f); >+ 0.4f * std::sqrt(1.f - high_bands_gain * high_bands_gain); > > std::transform( > (*e)[1].begin(), (*e)[1].end(), time_domain_high_band_noise.begin(), >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_filter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_filter.h >index 237408d7f9dc3e598ee38edcd894282ed7d4b206..edd1290014d2a4ab62373a2a7955a908383f3d09 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_filter.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_filter.h >@@ -16,13 +16,15 @@ > > #include "modules/audio_processing/aec3/aec3_common.h" > #include "modules/audio_processing/aec3/aec3_fft.h" >+#include "modules/audio_processing/aec3/fft_data.h" >+#include "modules/audio_processing/utility/ooura_fft.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { > > class SuppressionFilter { > public: >- explicit SuppressionFilter(int sample_rate_hz); >+ SuppressionFilter(Aec3Optimization optimization, int sample_rate_hz); > ~SuppressionFilter(); > void ApplyGain(const FftData& comfort_noise, > const FftData& comfort_noise_high_bands, >@@ -32,6 +34,7 @@ class SuppressionFilter { > std::vector<std::vector<float>>* e); > > private: >+ const Aec3Optimization optimization_; > const int sample_rate_hz_; > const OouraFft ooura_fft_; > const Aec3Fft fft_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_filter_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_filter_unittest.cc >index eaa608eed5d7faa05b346b3bd7ec9afc54a13e4a..9e4ff7c28b4c82310304b2321f0172b675e8dfc5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_filter_unittest.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_filter_unittest.cc >@@ -45,21 +45,21 @@ TEST(SuppressionFilter, NullOutput) { > FftData E; > std::array<float, kFftLengthBy2Plus1> gain; > >- EXPECT_DEATH(SuppressionFilter(16000).ApplyGain(cn, cn_high_bands, gain, 1.0f, >- E, nullptr), >+ EXPECT_DEATH(SuppressionFilter(Aec3Optimization::kNone, 16000) >+ .ApplyGain(cn, cn_high_bands, gain, 1.0f, E, nullptr), > ""); > } > > // Verifies the check for allowed sample rate. > TEST(SuppressionFilter, ProperSampleRate) { >- EXPECT_DEATH(SuppressionFilter(16001), ""); >+ EXPECT_DEATH(SuppressionFilter(Aec3Optimization::kNone, 16001), ""); > } > > #endif > > // Verifies that no comfort noise is added when the gain is 1. > TEST(SuppressionFilter, ComfortNoiseInUnityGain) { >- SuppressionFilter filter(48000); >+ SuppressionFilter filter(Aec3Optimization::kNone, 48000); > FftData cn; > FftData cn_high_bands; > std::array<float, kFftLengthBy2Plus1> gain; >@@ -89,7 +89,7 @@ TEST(SuppressionFilter, ComfortNoiseInUnityGain) { > > // Verifies that the suppressor is able to suppress a signal. > TEST(SuppressionFilter, SignalSuppression) { >- SuppressionFilter filter(48000); >+ SuppressionFilter filter(Aec3Optimization::kNone, 48000); > FftData cn; > FftData cn_high_bands; > std::array<float, kFftLengthBy2> e_old_; >@@ -131,7 +131,7 @@ TEST(SuppressionFilter, SignalSuppression) { > // Verifies that the suppressor is able to pass through a desired signal while > // applying suppressing for some frequencies. > TEST(SuppressionFilter, SignalTransparency) { >- SuppressionFilter filter(48000); >+ SuppressionFilter filter(Aec3Optimization::kNone, 48000); > FftData cn; > std::array<float, kFftLengthBy2> e_old_; > Aec3Fft fft; >@@ -171,7 +171,7 @@ TEST(SuppressionFilter, SignalTransparency) { > > // Verifies that the suppressor delay. 
> TEST(SuppressionFilter, Delay) { >- SuppressionFilter filter(48000); >+ SuppressionFilter filter(Aec3Optimization::kNone, 48000); > FftData cn; > FftData cn_high_bands; > std::array<float, kFftLengthBy2> e_old_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_gain.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_gain.cc >index 613256673069a56ad976dd9abd3a55d0e4f98555..88cfc0a01e4e0d18385c59bfc6180c7ce9773681 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_gain.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_gain.cc >@@ -11,8 +11,8 @@ > #include "modules/audio_processing/aec3/suppression_gain.h" > > #include <math.h> >+#include <stddef.h> > #include <algorithm> >-#include <functional> > #include <numeric> > > #include "modules/audio_processing/aec3/moving_average.h" >@@ -20,15 +20,10 @@ > #include "modules/audio_processing/logging/apm_data_dumper.h" > #include "rtc_base/atomicops.h" > #include "rtc_base/checks.h" >-#include "system_wrappers/include/field_trial.h" > > namespace webrtc { > namespace { > >-bool EnableNewSuppression() { >- return !field_trial::IsEnabled("WebRTC-Aec3NewSuppressionKillSwitch"); >-} >- > // Adjust the gains according to the presence of known external filters. > void AdjustForExternalFilters(std::array<float, kFftLengthBy2Plus1>* gain) { > // Limit the low frequency gains to avoid the impact of the high-pass filter >@@ -83,57 +78,6 @@ void WeightEchoForAudibility(const EchoCanceller3Config& config, > weigh(threshold, normalizer, 7, kFftLengthBy2Plus1, echo, weighted_echo); > } > >-// Computes the gain to reduce the echo to a non audible level. 
>-void GainToNoAudibleEchoFallback( >- const EchoCanceller3Config& config, >- bool low_noise_render, >- bool saturated_echo, >- bool linear_echo_estimate, >- const std::array<float, kFftLengthBy2Plus1>& nearend, >- const std::array<float, kFftLengthBy2Plus1>& weighted_echo, >- const std::array<float, kFftLengthBy2Plus1>& masker, >- const std::array<float, kFftLengthBy2Plus1>& min_gain, >- const std::array<float, kFftLengthBy2Plus1>& max_gain, >- const std::array<float, kFftLengthBy2Plus1>& one_by_weighted_echo, >- std::array<float, kFftLengthBy2Plus1>* gain) { >- float nearend_masking_margin = 0.f; >- if (linear_echo_estimate) { >- nearend_masking_margin = >- low_noise_render >- ? config.gain_mask.m9 >- : (saturated_echo ? config.gain_mask.m2 : config.gain_mask.m3); >- } else { >- nearend_masking_margin = config.gain_mask.m7; >- } >- >- RTC_DCHECK_LE(0.f, nearend_masking_margin); >- RTC_DCHECK_GT(1.f, nearend_masking_margin); >- >- const float masker_margin = >- linear_echo_estimate ? config.gain_mask.m0 : config.gain_mask.m8; >- >- for (size_t k = 0; k < gain->size(); ++k) { >- // TODO(devicentepena): Experiment by removing the reverberation estimation >- // from the nearend signal before computing the gains. >- const float unity_gain_masker = std::max(nearend[k], masker[k]); >- RTC_DCHECK_LE(0.f, nearend_masking_margin * unity_gain_masker); >- if (weighted_echo[k] <= nearend_masking_margin * unity_gain_masker || >- unity_gain_masker <= 0.f) { >- (*gain)[k] = 1.f; >- } else { >- RTC_DCHECK_LT(0.f, unity_gain_masker); >- (*gain)[k] = >- std::max(0.f, (1.f - config.gain_mask.gain_curve_slope * >- weighted_echo[k] / unity_gain_masker) * >- config.gain_mask.gain_curve_offset); >- (*gain)[k] = std::max(masker_margin * masker[k] * one_by_weighted_echo[k], >- (*gain)[k]); >- } >- >- (*gain)[k] = std::min(std::max((*gain)[k], min_gain[k]), max_gain[k]); >- } >-} >- > // TODO(peah): Make adaptive to take the actual filter error into account. 
> constexpr size_t kUpperAccurateBandPlus1 = 29; > >@@ -321,30 +265,9 @@ void SuppressionGain::LowerBandGain( > std::array<float, kFftLengthBy2Plus1> max_gain; > GetMaxGain(max_gain); > >- // Iteratively compute the gain required to attenuate the echo to a non >- // noticeable level. >- >- if (enable_new_suppression_) { > GainToNoAudibleEcho(nearend, weighted_residual_echo, comfort_noise, > min_gain, max_gain, gain); > AdjustForExternalFilters(gain); >- } else { >- const bool linear_echo_estimate = aec_state.UsableLinearEstimate(); >- std::array<float, kFftLengthBy2Plus1> masker; >- std::array<float, kFftLengthBy2Plus1> one_by_weighted_echo; >- std::transform(weighted_residual_echo.begin(), weighted_residual_echo.end(), >- one_by_weighted_echo.begin(), >- [](float e) { return e > 0.f ? 1.f / e : 1.f; }); >- gain->fill(0.f); >- for (int k = 0; k < 2; ++k) { >- std::copy(comfort_noise.begin(), comfort_noise.end(), masker.begin()); >- GainToNoAudibleEchoFallback(config_, low_noise_render, saturated_echo, >- linear_echo_estimate, nearend, >- weighted_residual_echo, masker, min_gain, >- max_gain, one_by_weighted_echo, gain); >- AdjustForExternalFilters(gain); >- } >- } > > // Adjust the gain for frequencies which have not yet converged. > AdjustNonConvergedFrequencies(gain); >@@ -372,7 +295,6 @@ SuppressionGain::SuppressionGain(const EchoCanceller3Config& config, > config_(config), > state_change_duration_blocks_( > static_cast<int>(config_.filter.config_change_duration_blocks)), >- enable_new_suppression_(EnableNewSuppression()), > moving_average_(kFftLengthBy2Plus1, > config.suppressor.nearend_average_blocks), > nearend_params_(config_.suppressor.nearend_tuning), >@@ -416,7 +338,7 @@ void SuppressionGain::GetGain( > > // Update the state selection. > dominant_nearend_detector_.Update(nearend_spectrum, residual_echo_spectrum, >- comfort_noise_spectrum); >+ comfort_noise_spectrum, initial_state_); > > // Compute gain for the lower band. 
> bool low_noise_render = low_render_detector_.Detect(render); >@@ -475,14 +397,17 @@ bool SuppressionGain::LowNoiseRenderDetector::Detect( > SuppressionGain::DominantNearendDetector::DominantNearendDetector( > const EchoCanceller3Config::Suppressor::DominantNearendDetection config) > : enr_threshold_(config.enr_threshold), >+ enr_exit_threshold_(config.enr_exit_threshold), > snr_threshold_(config.snr_threshold), > hold_duration_(config.hold_duration), >- trigger_threshold_(config.trigger_threshold) {} >+ trigger_threshold_(config.trigger_threshold), >+ use_during_initial_phase_(config.use_during_initial_phase) {} > > void SuppressionGain::DominantNearendDetector::Update( > rtc::ArrayView<const float> nearend_spectrum, > rtc::ArrayView<const float> residual_echo_spectrum, >- rtc::ArrayView<const float> comfort_noise_spectrum) { >+ rtc::ArrayView<const float> comfort_noise_spectrum, >+ bool initial_state) { > auto low_frequency_energy = [](rtc::ArrayView<const float> spectrum) { > RTC_DCHECK_LE(16, spectrum.size()); > return std::accumulate(spectrum.begin() + 1, spectrum.begin() + 16, 0.f); >@@ -493,7 +418,8 @@ void SuppressionGain::DominantNearendDetector::Update( > > // Detect strong active nearend if the nearend is sufficiently stronger than > // the echo and the nearend noise. >- if (ne_sum > enr_threshold_ * echo_sum && >+ if ((!initial_state || use_during_initial_phase_) && >+ ne_sum > enr_threshold_ * echo_sum && > ne_sum > snr_threshold_ * noise_sum) { > if (++trigger_counter_ >= trigger_threshold_) { > // After a period of strong active nearend activity, flag nearend mode. >@@ -505,6 +431,12 @@ void SuppressionGain::DominantNearendDetector::Update( > trigger_counter_ = std::max(0, trigger_counter_ - 1); > } > >+ // Exit nearend-state early at strong echo. >+ if (ne_sum < enr_exit_threshold_ * echo_sum && >+ echo_sum > snr_threshold_ * noise_sum) { >+ hold_counter_ = 0; >+ } >+ > // Remain in any nearend mode for a certain duration. 
> hold_counter_ = std::max(0, hold_counter_ - 1); > nearend_state_ = hold_counter_ > 0; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_gain.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_gain.h >index 4eb8581a862a6dcca3202e0c06e6063d7237f9fd..e74cd9632b123aadb1042ba4b46bb871f7d2e8e9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_gain.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_gain.h >@@ -12,13 +12,18 @@ > #define MODULES_AUDIO_PROCESSING_AEC3_SUPPRESSION_GAIN_H_ > > #include <array> >+#include <memory> > #include <vector> > >+#include "absl/types/optional.h" >+#include "api/array_view.h" > #include "api/audio/echo_canceller3_config.h" > #include "modules/audio_processing/aec3/aec3_common.h" > #include "modules/audio_processing/aec3/aec_state.h" >+#include "modules/audio_processing/aec3/fft_data.h" > #include "modules/audio_processing/aec3/moving_average.h" > #include "modules/audio_processing/aec3/render_signal_analyzer.h" >+#include "modules/audio_processing/logging/apm_data_dumper.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { >@@ -102,13 +107,16 @@ class SuppressionGain { > // Updates the state selection based on latest spectral estimates. 
> void Update(rtc::ArrayView<const float> nearend_spectrum, > rtc::ArrayView<const float> residual_echo_spectrum, >- rtc::ArrayView<const float> comfort_noise_spectrum); >+ rtc::ArrayView<const float> comfort_noise_spectrum, >+ bool initial_state); > > private: > const float enr_threshold_; >+ const float enr_exit_threshold_; > const float snr_threshold_; > const int hold_duration_; > const int trigger_threshold_; >+ const bool use_during_initial_phase_; > > bool nearend_state_ = false; > int trigger_counter_ = 0; >@@ -137,7 +145,6 @@ class SuppressionGain { > LowNoiseRenderDetector low_render_detector_; > bool initial_state_ = true; > int initial_state_change_counter_ = 0; >- const bool enable_new_suppression_; > aec3::MovingAverage moving_average_; > const GainParameters nearend_params_; > const GainParameters normal_params_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_gain_limiter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_gain_limiter.cc >index e3d7a660bf0879193155d009e8c822dc79adb7d2..6625a779c244a889fa6aed3b30ee27093faa4d66 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_gain_limiter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_gain_limiter.cc >@@ -14,6 +14,7 @@ > #include <algorithm> > > #include "modules/audio_processing/aec3/aec3_common.h" >+#include "rtc_base/checks.h" > > namespace webrtc { > namespace { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_gain_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_gain_unittest.cc >index 1ff96ca6e0552d8cd3c35083ea80bd18cf637e66..651fd3674bc13c96ee3b210f931673e012f0c3ba 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_gain_unittest.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/suppression_gain_unittest.cc >@@ -77,7 +77,7 @@ TEST(SuppressionGain, BasicGainComputation) { > ApmDataDumper data_dumper(42); > Subtractor subtractor(config, &data_dumper, DetectOptimization()); > std::unique_ptr<RenderDelayBuffer> render_delay_buffer( >- RenderDelayBuffer::Create(config, 3)); >+ RenderDelayBuffer::Create2(config, 3)); > absl::optional<DelayEstimate> delay_estimate; > > // Ensure that a strong noise is detected to mask any echoes. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/vector_buffer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/vector_buffer.cc >index f491168d9a141ae1789770ea01696c74233b4be4..0682885c0c7108881cc4269074bb6aa30cb039a3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/vector_buffer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/vector_buffer.cc >@@ -10,7 +10,7 @@ > > #include "modules/audio_processing/aec3/vector_buffer.h" > >-#include "modules/audio_processing/aec3/aec3_common.h" >+#include <algorithm> > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/vector_buffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/vector_buffer.h >index c7d4f68b27c3886f39bac4b9198eca06d0a53288..4c0257ccea4cd0e89b5367381acc87973483e5c9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/vector_buffer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec3/vector_buffer.h >@@ -11,6 +11,7 @@ > #ifndef MODULES_AUDIO_PROCESSING_AEC3_VECTOR_BUFFER_H_ > #define MODULES_AUDIO_PROCESSING_AEC3_VECTOR_BUFFER_H_ > >+#include <stddef.h> > #include <vector> > > #include "rtc_base/checks.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec_dump/BUILD.gn 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec_dump/BUILD.gn >index e5fee3e95f7154c9a2076f7c63a15ae0ebb1cb03..d4262ca485835dad51497a35c951386b9ef2c7f9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec_dump/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec_dump/BUILD.gn >@@ -17,6 +17,7 @@ rtc_source_set("aec_dump") { > deps = [ > "../", > "../../../rtc_base:rtc_base_approved", >+ "../../../rtc_base/system:rtc_export", > ] > } > >@@ -42,6 +43,7 @@ rtc_source_set("mock_aec_dump_unittests") { > > deps = [ > ":mock_aec_dump", >+ "..:api", > "../", > "../../../rtc_base:rtc_base_approved", > "//testing/gtest", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec_dump/aec_dump_factory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec_dump/aec_dump_factory.h >index e3f00f67a2e28e0bcec4a150661d1e50229a80e4..1e55d594ee31340f2965831369040b176cec7115 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec_dump/aec_dump_factory.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec_dump/aec_dump_factory.h >@@ -16,6 +16,7 @@ > > #include "modules/audio_processing/include/aec_dump.h" > #include "rtc_base/platform_file.h" >+#include "rtc_base/system/rtc_export.h" > > namespace rtc { > class TaskQueue; >@@ -23,7 +24,7 @@ class TaskQueue; > > namespace webrtc { > >-class AecDumpFactory { >+class RTC_EXPORT AecDumpFactory { > public: > // The |worker_queue| may not be null and must outlive the created > // AecDump instance. 
|max_log_size_bytes == -1| means the log size >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec_dump/aec_dump_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec_dump/aec_dump_impl.cc >index 2732934140684886959eeea5a5669019562881a6..9020f2b9dca14e635b8341f1d64692b100bf8ae5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec_dump/aec_dump_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aec_dump/aec_dump_impl.cc >@@ -65,7 +65,7 @@ AecDumpImpl::AecDumpImpl(std::unique_ptr<FileWrapper> debug_file, > > AecDumpImpl::~AecDumpImpl() { > // Block until all tasks have finished running. >- rtc::Event thread_sync_event(false /* manual_reset */, false); >+ rtc::Event thread_sync_event; > worker_queue_->PostTask([&thread_sync_event] { thread_sync_event.Set(); }); > // Wait until the event has been signaled with .Set(). By then all > // pending tasks will have finished. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aecm/aecm_core.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aecm/aecm_core.cc >index 0e56b50c5ef306b89bb6e28a83fcd051d7d403a2..67b70bfb36670a4398ec8d20019c43e80fe21541 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aecm/aecm_core.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aecm/aecm_core.cc >@@ -12,16 +12,15 @@ > > #include <stddef.h> > #include <stdlib.h> >+#include <string.h> > > extern "C" { > #include "common_audio/ring_buffer.h" > #include "common_audio/signal_processing/include/real_fft.h" > } >+#include "common_audio/signal_processing/include/signal_processing_library.h" > #include "modules/audio_processing/aecm/echo_control_mobile.h" > #include "modules/audio_processing/utility/delay_estimator_wrapper.h" >-extern "C" { >-#include "system_wrappers/include/cpu_features_wrapper.h" >-} > > #include 
"rtc_base/checks.h" > #include "rtc_base/numerics/safe_conversions.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aecm/echo_control_mobile.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aecm/echo_control_mobile.cc >index c947563b2805e2e9771241f3bd1e543c6af2f848..257dfbf56e4e81634ca881691349a16dcfb687be 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aecm/echo_control_mobile.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/aecm/echo_control_mobile.cc >@@ -14,10 +14,12 @@ > #include <stdio.h> > #endif > #include <stdlib.h> >+#include <string.h> > > extern "C" { > #include "common_audio/ring_buffer.h" > #include "common_audio/signal_processing/include/signal_processing_library.h" >+#include "modules/audio_processing/aecm/aecm_defines.h" > } > #include "modules/audio_processing/aecm/aecm_core.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc/agc.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc/agc.cc >index 12c8cfb7fbd1731d782ffc4d9312d4e488d976d2..c24db0dd5225be2821437af84f59da43d82ecc67 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc/agc.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc/agc.cc >@@ -12,8 +12,6 @@ > > #include <cmath> > #include <cstdlib> >- >-#include <algorithm> > #include <vector> > > #include "modules/audio_processing/agc/loudness_histogram.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc/agc.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc/agc.h >index 5d34c212f3abff98ee6f3aafbdf5d801d97d2d84..abd68d5e311d81fa8d2a24b5c14e3d54f32abeb2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc/agc.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc/agc.h >@@ -17,7 +17,6 @@ > > 
namespace webrtc { > >-class AudioFrame; > class LoudnessHistogram; > > class Agc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc/agc_manager_direct.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc/agc_manager_direct.cc >index dc6d45121cdd6dbfd0f9fcc018870cafb1efec22..5c4deeccbf29c5e93ab01745bd643d3b7a5880ae 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc/agc_manager_direct.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc/agc_manager_direct.cc >@@ -10,6 +10,7 @@ > > #include "modules/audio_processing/agc/agc_manager_direct.h" > >+#include <algorithm> > #include <cmath> > > #ifdef WEBRTC_AGC_DEBUG_DUMP >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc/loudness_histogram.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc/loudness_histogram.cc >index 0ed5850e72d567533a32badaf32f4485a4fb13b7..cd57b82bcc72465a911a7f367c1941584239eaaf 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc/loudness_histogram.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc/loudness_histogram.cc >@@ -10,8 +10,8 @@ > > #include "modules/audio_processing/agc/loudness_histogram.h" > >+#include <string.h> > #include <cmath> >-#include <cstring> > > #include "rtc_base/checks.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc/loudness_histogram.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc/loudness_histogram.h >index d8ad7511764a5aa1846541f8cbeba0a8d896d739..b210be96e3b91eaa282d1df539377cfabd39dcf2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc/loudness_histogram.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc/loudness_histogram.h >@@ -11,8 +11,7 @@ > #ifndef 
MODULES_AUDIO_PROCESSING_AGC_LOUDNESS_HISTOGRAM_H_ > #define MODULES_AUDIO_PROCESSING_AGC_LOUDNESS_HISTOGRAM_H_ > >-#include <string.h> >- >+#include <stdint.h> > #include <memory> > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/BUILD.gn >index 1865fde3e91e4fabf779cae6f180c326ad95c65b..5431a150c23c2b80a98551ade63247e70ec6afb3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/BUILD.gn >@@ -27,6 +27,7 @@ rtc_source_set("level_estimation_agc") { > ":gain_applier", > ":noise_level_estimator", > ":rnn_vad_with_level", >+ "..:api", > "..:apm_logging", > "..:audio_frame_view", > "../../../api:array_view", >@@ -58,6 +59,7 @@ rtc_source_set("adaptive_digital") { > ":gain_applier", > ":noise_level_estimator", > ":rnn_vad_with_level", >+ "..:api", > "..:apm_logging", > "..:audio_frame_view", > "../../../api:array_view", >@@ -96,12 +98,10 @@ rtc_source_set("fixed_digital") { > sources = [ > "fixed_digital_level_estimator.cc", > "fixed_digital_level_estimator.h", >- "fixed_gain_controller.cc", >- "fixed_gain_controller.h", >- "gain_curve_applier.cc", >- "gain_curve_applier.h", > "interpolated_gain_curve.cc", > "interpolated_gain_curve.h", >+ "limiter.cc", >+ "limiter.h", > ] > > configs += [ "..:apm_debug_dump" ] >@@ -128,6 +128,7 @@ rtc_source_set("gain_applier") { > deps = [ > ":common", > "..:audio_frame_view", >+ "../../../api:array_view", > "../../../rtc_base:safe_minmax", > ] > } >@@ -191,8 +192,8 @@ rtc_source_set("adaptive_digital_unittests") { > "../../../api:array_view", > "../../../common_audio", > "../../../rtc_base:checks", >+ "../../../rtc_base:gunit_helpers", > "../../../rtc_base:rtc_base_approved", >- "../../../rtc_base:rtc_base_tests_utils", > ] > } > >@@ -203,7 +204,7 @@ 
rtc_source_set("biquad_filter_unittests") { > ] > deps = [ > ":biquad_filter", >- "../../../rtc_base:rtc_base_tests_utils", >+ "../../../rtc_base:gunit_helpers", > ] > } > >@@ -216,11 +217,10 @@ rtc_source_set("fixed_digital_unittests") { > "compute_interpolated_gain_curve.cc", > "compute_interpolated_gain_curve.h", > "fixed_digital_level_estimator_unittest.cc", >- "fixed_gain_controller_unittest.cc", >- "gain_curve_applier_unittest.cc", > "interpolated_gain_curve_unittest.cc", >- "limiter.cc", >- "limiter.h", >+ "limiter_db_gain_curve.cc", >+ "limiter_db_gain_curve.h", >+ "limiter_db_gain_curve_unittest.cc", > "limiter_unittest.cc", > ] > deps = [ >@@ -232,8 +232,8 @@ rtc_source_set("fixed_digital_unittests") { > "../../../api:array_view", > "../../../common_audio", > "../../../rtc_base:checks", >+ "../../../rtc_base:gunit_helpers", > "../../../rtc_base:rtc_base_approved", >- "../../../rtc_base:rtc_base_tests_utils", > "../../../system_wrappers:metrics", > "//third_party/abseil-cpp/absl/memory", > ] >@@ -254,14 +254,29 @@ rtc_source_set("noise_estimator_unittests") { > "..:audio_frame_view", > "../../../api:array_view", > "../../../rtc_base:checks", >+ "../../../rtc_base:gunit_helpers", > "../../../rtc_base:rtc_base_approved", >- "../../../rtc_base:rtc_base_tests_utils", >+ ] >+} >+ >+rtc_source_set("rnn_vad_with_level_unittests") { >+ testonly = true >+ sources = [ >+ "vad_with_level_unittest.cc", >+ ] >+ deps = [ >+ ":rnn_vad_with_level", >+ "..:audio_frame_view", >+ "../../../rtc_base:gunit_helpers", > ] > } > > rtc_source_set("test_utils") { > testonly = true >- visibility = [ ":*" ] >+ visibility = [ >+ ":*", >+ "..:audio_processing_unittests", >+ ] > sources = [ > "agc2_testing_common.cc", > "agc2_testing_common.h", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_agc.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_agc.cc >index 
c7346c6042997e009d90db2cccd518804c513958..a5d36089c484964dbcfb5df00c0a25acd6f25501 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_agc.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_agc.cc >@@ -10,12 +10,10 @@ > > #include "modules/audio_processing/agc2/adaptive_agc.h" > >-#include <algorithm> >-#include <numeric> >- > #include "common_audio/include/audio_util.h" > #include "modules/audio_processing/agc2/vad_with_level.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" >+#include "rtc_base/checks.h" > > namespace webrtc { > >@@ -27,6 +25,19 @@ AdaptiveAgc::AdaptiveAgc(ApmDataDumper* apm_data_dumper) > RTC_DCHECK(apm_data_dumper); > } > >+AdaptiveAgc::AdaptiveAgc(ApmDataDumper* apm_data_dumper, >+ const AudioProcessing::Config::GainController2& config) >+ : speech_level_estimator_( >+ apm_data_dumper, >+ config.adaptive_digital.level_estimator, >+ config.adaptive_digital.use_saturation_protector, >+ config.adaptive_digital.extra_saturation_margin_db), >+ gain_applier_(apm_data_dumper), >+ apm_data_dumper_(apm_data_dumper), >+ noise_level_estimator_(apm_data_dumper) { >+ RTC_DCHECK(apm_data_dumper); >+} >+ > AdaptiveAgc::~AdaptiveAgc() = default; > > void AdaptiveAgc::Process(AudioFrameView<float> float_frame, >@@ -37,9 +48,9 @@ void AdaptiveAgc::Process(AudioFrameView<float> float_frame, > signal_with_levels.vad_result.speech_probability); > apm_data_dumper_->DumpRaw("agc2_vad_rms_dbfs", > signal_with_levels.vad_result.speech_rms_dbfs); >- > apm_data_dumper_->DumpRaw("agc2_vad_peak_dbfs", > signal_with_levels.vad_result.speech_peak_dbfs); >+ > speech_level_estimator_.UpdateEstimation(signal_with_levels.vad_result); > > signal_with_levels.input_level_dbfs = >@@ -61,7 +72,6 @@ void AdaptiveAgc::Process(AudioFrameView<float> float_frame, > > // The gain applier applies the gain. 
> gain_applier_.Process(signal_with_levels); >- ; > } > > void AdaptiveAgc::Reset() { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_agc.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_agc.h >index 792b2bc916de5ab9c6facd99961fb47b8171e22e..16c0082ed8166d2e4a873041913b18ed4414abe9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_agc.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_agc.h >@@ -11,13 +11,12 @@ > #ifndef MODULES_AUDIO_PROCESSING_AGC2_ADAPTIVE_AGC_H_ > #define MODULES_AUDIO_PROCESSING_AGC2_ADAPTIVE_AGC_H_ > >-#include <memory> >- > #include "modules/audio_processing/agc2/adaptive_digital_gain_applier.h" > #include "modules/audio_processing/agc2/adaptive_mode_level_estimator.h" > #include "modules/audio_processing/agc2/noise_level_estimator.h" > #include "modules/audio_processing/agc2/vad_with_level.h" > #include "modules/audio_processing/include/audio_frame_view.h" >+#include "modules/audio_processing/include/audio_processing.h" > > namespace webrtc { > class ApmDataDumper; >@@ -25,6 +24,8 @@ class ApmDataDumper; > class AdaptiveAgc { > public: > explicit AdaptiveAgc(ApmDataDumper* apm_data_dumper); >+ AdaptiveAgc(ApmDataDumper* apm_data_dumper, >+ const AudioProcessing::Config::GainController2& config); > ~AdaptiveAgc(); > > void Process(AudioFrameView<float> float_frame, float last_audio_level); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_digital_gain_applier.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_digital_gain_applier.cc >index d4560caf1a9d8471a7e26e8127297df51393d829..6ece83b239240fa8e11abe4c0544a9e7ce5721ed 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_digital_gain_applier.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_digital_gain_applier.cc >@@ -15,6 +15,7 @@ > #include "common_audio/include/audio_util.h" > #include "modules/audio_processing/agc2/agc2_common.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" >+#include "rtc_base/checks.h" > #include "rtc_base/numerics/safe_minmax.h" > #include "system_wrappers/include/metrics.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_mode_level_estimator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_mode_level_estimator.cc >index b670f4b4692056c40aab7ff8e8e97d8a7fdbb807..8640324b593ffb126f15d54b21792efe4155e50c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_mode_level_estimator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_mode_level_estimator.cc >@@ -12,13 +12,27 @@ > > #include "modules/audio_processing/agc2/agc2_common.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" >+#include "rtc_base/checks.h" > #include "rtc_base/numerics/safe_minmax.h" > > namespace webrtc { > > AdaptiveModeLevelEstimator::AdaptiveModeLevelEstimator( > ApmDataDumper* apm_data_dumper) >- : saturation_protector_(apm_data_dumper), >+ : level_estimator_( >+ AudioProcessing::Config::GainController2::LevelEstimator::kRms), >+ use_saturation_protector_(true), >+ saturation_protector_(apm_data_dumper), >+ apm_data_dumper_(apm_data_dumper) {} >+ >+AdaptiveModeLevelEstimator::AdaptiveModeLevelEstimator( >+ ApmDataDumper* apm_data_dumper, >+ AudioProcessing::Config::GainController2::LevelEstimator level_estimator, >+ bool use_saturation_protector, >+ float extra_saturation_margin_db) >+ : level_estimator_(level_estimator), >+ use_saturation_protector_(use_saturation_protector), >+ saturation_protector_(apm_data_dumper, extra_saturation_margin_db), > 
apm_data_dumper_(apm_data_dumper) {} > > void AdaptiveModeLevelEstimator::UpdateEstimation( >@@ -42,20 +56,38 @@ void AdaptiveModeLevelEstimator::UpdateEstimation( > > const float leak_factor = buffer_is_full ? kFullBufferLeakFactor : 1.f; > >+ // Read speech level estimation. >+ float speech_level_dbfs = 0.f; >+ using LevelEstimatorType = >+ AudioProcessing::Config::GainController2::LevelEstimator; >+ switch (level_estimator_) { >+ case LevelEstimatorType::kRms: >+ speech_level_dbfs = vad_data.speech_rms_dbfs; >+ break; >+ case LevelEstimatorType::kPeak: >+ speech_level_dbfs = vad_data.speech_peak_dbfs; >+ break; >+ } >+ >+ // Update speech level estimation. > estimate_numerator_ = estimate_numerator_ * leak_factor + >- vad_data.speech_rms_dbfs * vad_data.speech_probability; >+ speech_level_dbfs * vad_data.speech_probability; > estimate_denominator_ = > estimate_denominator_ * leak_factor + vad_data.speech_probability; >- > last_estimate_with_offset_dbfs_ = estimate_numerator_ / estimate_denominator_; > >- saturation_protector_.UpdateMargin(vad_data, last_estimate_with_offset_dbfs_); >- DebugDumpEstimate(); >+ if (use_saturation_protector_) { >+ saturation_protector_.UpdateMargin(vad_data, >+ last_estimate_with_offset_dbfs_); >+ DebugDumpEstimate(); >+ } > } > > float AdaptiveModeLevelEstimator::LatestLevelEstimate() const { > return rtc::SafeClamp<float>( >- last_estimate_with_offset_dbfs_ + saturation_protector_.LastMargin(), >+ last_estimate_with_offset_dbfs_ + >+ (use_saturation_protector_ ? 
saturation_protector_.LastMargin() >+ : 0.f), > -90.f, 30.f); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_mode_level_estimator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_mode_level_estimator.h >index 4d4180c4806870004120aa3c921ee755e8b1b619..63b9de2aec44a124dea725a3a6ffc4f601709afb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_mode_level_estimator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_mode_level_estimator.h >@@ -11,8 +11,12 @@ > #ifndef MODULES_AUDIO_PROCESSING_AGC2_ADAPTIVE_MODE_LEVEL_ESTIMATOR_H_ > #define MODULES_AUDIO_PROCESSING_AGC2_ADAPTIVE_MODE_LEVEL_ESTIMATOR_H_ > >+#include <stddef.h> >+ >+#include "modules/audio_processing/agc2/agc2_common.h" // kFullBufferSizeMs... > #include "modules/audio_processing/agc2/saturation_protector.h" > #include "modules/audio_processing/agc2/vad_with_level.h" >+#include "modules/audio_processing/include/audio_processing.h" > > namespace webrtc { > class ApmDataDumper; >@@ -20,6 +24,11 @@ class ApmDataDumper; > class AdaptiveModeLevelEstimator { > public: > explicit AdaptiveModeLevelEstimator(ApmDataDumper* apm_data_dumper); >+ AdaptiveModeLevelEstimator( >+ ApmDataDumper* apm_data_dumper, >+ AudioProcessing::Config::GainController2::LevelEstimator level_estimator, >+ bool use_saturation_protector, >+ float extra_saturation_margin_db); > void UpdateEstimation(const VadWithLevel::LevelAndProbability& vad_data); > float LatestLevelEstimate() const; > void Reset(); >@@ -30,6 +39,9 @@ class AdaptiveModeLevelEstimator { > private: > void DebugDumpEstimate(); > >+ const AudioProcessing::Config::GainController2::LevelEstimator >+ level_estimator_; >+ const bool use_saturation_protector_; > size_t buffer_size_ms_ = 0; > float last_estimate_with_offset_dbfs_ = kInitialSpeechLevelEstimateDbfs; > float estimate_numerator_ = 0.f; >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_mode_level_estimator_agc.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_mode_level_estimator_agc.cc >index 4cee9633f7778daeda05f10fcf0649bfe2844641..b7c64373fc8a00a8291046e71741c133f8739e1d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_mode_level_estimator_agc.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_mode_level_estimator_agc.cc >@@ -10,6 +10,10 @@ > > #include "modules/audio_processing/agc2/adaptive_mode_level_estimator_agc.h" > >+#include <cmath> >+#include <vector> >+ >+#include "modules/audio_processing/agc2/agc2_common.h" > #include "modules/audio_processing/include/audio_frame_view.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_mode_level_estimator_agc.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_mode_level_estimator_agc.h >index 484b12893325a01ececbe6be6cae8742d0a8c557..6d12339888c2e8400513c3bf4e02202b6bc1826b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_mode_level_estimator_agc.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/adaptive_mode_level_estimator_agc.h >@@ -11,10 +11,12 @@ > #ifndef MODULES_AUDIO_PROCESSING_AGC2_ADAPTIVE_MODE_LEVEL_ESTIMATOR_AGC_H_ > #define MODULES_AUDIO_PROCESSING_AGC2_ADAPTIVE_MODE_LEVEL_ESTIMATOR_AGC_H_ > >-#include <vector> >+#include <stddef.h> >+#include <stdint.h> > > #include "modules/audio_processing/agc/agc.h" > #include "modules/audio_processing/agc2/adaptive_mode_level_estimator.h" >+#include "modules/audio_processing/agc2/saturation_protector.h" > #include "modules/audio_processing/agc2/vad_with_level.h" > > namespace webrtc { >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/agc2_common.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/agc2_common.cc >index 5da353fa8fd138feb4ff7b23c54c61c92124497b..af943df78fbb828f83c0a0cf6af1728ab2505a64 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/agc2_common.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/agc2_common.cc >@@ -10,6 +10,7 @@ > > #include "modules/audio_processing/agc2/agc2_common.h" > >+#include <stdio.h> > #include <string> > > #include "system_wrappers/include/field_trial.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/agc2_common.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/agc2_common.h >index 71d33e594da23bdf249c4b6b7cbb6fc0ffff07b0..55dd648db49ab161a1f9850ed4a87e689c61e3d2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/agc2_common.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/agc2_common.h >@@ -13,8 +13,6 @@ > > #include <stddef.h> > >-#include <cmath> >- > namespace webrtc { > > constexpr float kMinFloatS16Value = -32768.f; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/biquad_filter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/biquad_filter.cc >index 9858d505846591f880e94db8174a321131a1f3f4..da8557c19082c1ca18091b3b6f484ae5ebb46ef3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/biquad_filter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/biquad_filter.cc >@@ -10,6 +10,8 @@ > > #include "modules/audio_processing/agc2/biquad_filter.h" > >+#include <stddef.h> >+ > namespace webrtc { > > // Transposed direct form I implementation of a bi-quad filter applied to an >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/biquad_filter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/biquad_filter.h >index 284930c5952831a0c7f12550136508b5504e9a1e..3d78c07607888922760fc05b681ac5ea5caac0ef 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/biquad_filter.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/biquad_filter.h >@@ -11,6 +11,8 @@ > #ifndef MODULES_AUDIO_PROCESSING_AGC2_BIQUAD_FILTER_H_ > #define MODULES_AUDIO_PROCESSING_AGC2_BIQUAD_FILTER_H_ > >+#include <algorithm> >+ > #include "api/array_view.h" > #include "rtc_base/arraysize.h" > #include "rtc_base/constructormagic.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/compute_interpolated_gain_curve.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/compute_interpolated_gain_curve.cc >index f395bceffa1bc77cd6379d06d619dfb44b9bee12..bc92613b697e6f1a40624c2d352aaae510b0c824 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/compute_interpolated_gain_curve.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/compute_interpolated_gain_curve.cc >@@ -19,23 +19,24 @@ > > #include "modules/audio_processing/agc2/agc2_common.h" > #include "modules/audio_processing/agc2/agc2_testing_common.h" >-#include "modules/audio_processing/agc2/limiter.h" >+#include "modules/audio_processing/agc2/limiter_db_gain_curve.h" > #include "rtc_base/checks.h" > > namespace webrtc { > namespace { > > std::pair<double, double> ComputeLinearApproximationParams( >- const Limiter* limiter, >+ const LimiterDbGainCurve* limiter, > const double x) { > const double m = limiter->GetGainFirstDerivativeLinear(x); > const double q = limiter->GetGainLinear(x) - m * x; > return {m, q}; > } > >-double ComputeAreaUnderPiecewiseLinearApproximation(const Limiter* limiter, >- const 
double x0, >- const double x1) { >+double ComputeAreaUnderPiecewiseLinearApproximation( >+ const LimiterDbGainCurve* limiter, >+ const double x0, >+ const double x1) { > RTC_CHECK_LT(x0, x1); > > // Linear approximation in x0 and x1. >@@ -60,7 +61,7 @@ double ComputeAreaUnderPiecewiseLinearApproximation(const Limiter* limiter, > // Computes the approximation error in the limiter region for a given interval. > // The error is computed as the difference between the areas beneath the limiter > // curve to approximate and its linear under-approximation. >-double LimiterUnderApproximationNegativeError(const Limiter* limiter, >+double LimiterUnderApproximationNegativeError(const LimiterDbGainCurve* limiter, > const double x0, > const double x1) { > const double area_limiter = limiter->GetGainIntegralLinear(x0, x1); >@@ -77,7 +78,7 @@ double LimiterUnderApproximationNegativeError(const Limiter* limiter, > // are assigned by halving intervals (starting with the whole beyond-knee region > // as a single interval). However, even if sub-optimal, this algorithm works > // well in practice and it is efficiently implemented using priority queues. >-std::vector<double> SampleLimiterRegion(const Limiter* limiter) { >+std::vector<double> SampleLimiterRegion(const LimiterDbGainCurve* limiter) { > static_assert(kInterpolatedGainCurveBeyondKneePoints > 2, ""); > > struct Interval { >@@ -131,7 +132,7 @@ std::vector<double> SampleLimiterRegion(const Limiter* limiter) { > // Compute the parameters to over-approximate the knee region via linear > // interpolation. Over-approximating is saturation-safe since the knee region is > // convex. >-void PrecomputeKneeApproxParams(const Limiter* limiter, >+void PrecomputeKneeApproxParams(const LimiterDbGainCurve* limiter, > test::InterpolatedParameters* parameters) { > static_assert(kInterpolatedGainCurveKneePoints > 2, ""); > // Get |kInterpolatedGainCurveKneePoints| - 1 equally spaced points. 
>@@ -165,7 +166,7 @@ void PrecomputeKneeApproxParams(const Limiter* limiter, > // interpolation and greedy sampling. Under-approximating is saturation-safe > // since the beyond-knee region is concave. > void PrecomputeBeyondKneeApproxParams( >- const Limiter* limiter, >+ const LimiterDbGainCurve* limiter, > test::InterpolatedParameters* parameters) { > // Find points on which the linear pieces are tangent to the gain curve. > const auto samples = SampleLimiterRegion(limiter); >@@ -216,7 +217,7 @@ namespace test { > > InterpolatedParameters ComputeInterpolatedGainCurveApproximationParams() { > InterpolatedParameters parameters; >- Limiter limiter; >+ LimiterDbGainCurve limiter; > parameters.computed_approximation_params_x.fill(0.0f); > parameters.computed_approximation_params_m.fill(0.0f); > parameters.computed_approximation_params_q.fill(0.0f); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/fixed_digital_level_estimator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/fixed_digital_level_estimator.cc >index 39cc764185c1d8fb135d4cb843e9366613d83cc7..971f4f62b7901588b7406ccd8d5cbdec0520b2a4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/fixed_digital_level_estimator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/fixed_digital_level_estimator.cc >@@ -13,6 +13,7 @@ > #include <algorithm> > #include <cmath> > >+#include "api/array_view.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" > #include "rtc_base/checks.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/fixed_gain_controller.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/fixed_gain_controller.cc >index 0d7e3a61b15d30ac545067d767695fe2e7c0e57e..ef908dc35828494e4d6eadf92c08bc5f08cef3df 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/fixed_gain_controller.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/fixed_gain_controller.cc >@@ -10,13 +10,9 @@ > > #include "modules/audio_processing/agc2/fixed_gain_controller.h" > >-#include <algorithm> >-#include <cmath> >- > #include "api/array_view.h" > #include "common_audio/include/audio_util.h" > #include "modules/audio_processing/agc2/agc2_common.h" >-#include "modules/audio_processing/agc2/interpolated_gain_curve.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" >@@ -39,7 +35,7 @@ FixedGainController::FixedGainController(ApmDataDumper* apm_data_dumper) > FixedGainController::FixedGainController(ApmDataDumper* apm_data_dumper, > std::string histogram_name_prefix) > : apm_data_dumper_(apm_data_dumper), >- gain_curve_applier_(48000, apm_data_dumper_, histogram_name_prefix) { >+ limiter_(48000, apm_data_dumper_, histogram_name_prefix) { > // Do update histograms.xml when adding name prefixes. > RTC_DCHECK(histogram_name_prefix == "" || histogram_name_prefix == "Test" || > histogram_name_prefix == "AudioMixer" || >@@ -61,12 +57,12 @@ void FixedGainController::SetGain(float gain_to_apply_db) { > // Reset the gain curve applier to quickly react on abrupt level changes > // caused by large changes of the applied gain. > if (previous_applied_gained != gain_to_apply_) { >- gain_curve_applier_.Reset(); >+ limiter_.Reset(); > } > } > > void FixedGainController::SetSampleRate(size_t sample_rate_hz) { >- gain_curve_applier_.SetSampleRate(sample_rate_hz); >+ limiter_.SetSampleRate(sample_rate_hz); > } > > void FixedGainController::Process(AudioFrameView<float> signal) { >@@ -84,7 +80,7 @@ void FixedGainController::Process(AudioFrameView<float> signal) { > } > > // Use the limiter. >- gain_curve_applier_.Process(signal); >+ limiter_.Process(signal); > > // Dump data for debug. 
> const auto channel_view = signal.channel(0); >@@ -100,6 +96,6 @@ void FixedGainController::Process(AudioFrameView<float> signal) { > } > > float FixedGainController::LastAudioLevel() const { >- return gain_curve_applier_.LastAudioLevel(); >+ return limiter_.LastAudioLevel(); > } > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/fixed_gain_controller.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/fixed_gain_controller.h >deleted file mode 100644 >index ff6ab811729494ed86dbd8626a6319bba61a4f1d..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/fixed_gain_controller.h >+++ /dev/null >@@ -1,42 +0,0 @@ >-/* >- * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef MODULES_AUDIO_PROCESSING_AGC2_FIXED_GAIN_CONTROLLER_H_ >-#define MODULES_AUDIO_PROCESSING_AGC2_FIXED_GAIN_CONTROLLER_H_ >- >-#include "modules/audio_processing/agc2/gain_curve_applier.h" >-#include "modules/audio_processing/include/audio_frame_view.h" >- >-namespace webrtc { >-class ApmDataDumper; >- >-class FixedGainController { >- public: >- explicit FixedGainController(ApmDataDumper* apm_data_dumper); >- FixedGainController(ApmDataDumper* apm_data_dumper, >- std::string histogram_name_prefix); >- >- void Process(AudioFrameView<float> signal); >- >- // Rate and gain may be changed at any time (but not concurrently >- // with any other method call). 
>- void SetGain(float gain_to_apply_db); >- void SetSampleRate(size_t sample_rate_hz); >- float LastAudioLevel() const; >- >- private: >- float gain_to_apply_ = 1.f; >- ApmDataDumper* apm_data_dumper_ = nullptr; >- GainCurveApplier gain_curve_applier_; >-}; >- >-} // namespace webrtc >- >-#endif // MODULES_AUDIO_PROCESSING_AGC2_FIXED_GAIN_CONTROLLER_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/fixed_gain_controller_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/fixed_gain_controller_unittest.cc >deleted file mode 100644 >index db1732aa47648ad190e626f696cfe49386d952a4..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/fixed_gain_controller_unittest.cc >+++ /dev/null >@@ -1,258 +0,0 @@ >-/* >- * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include "modules/audio_processing/agc2/fixed_gain_controller.h" >- >-#include "absl/memory/memory.h" >-#include "api/array_view.h" >-#include "modules/audio_processing/agc2/agc2_testing_common.h" >-#include "modules/audio_processing/agc2/vector_float_frame.h" >-#include "modules/audio_processing/logging/apm_data_dumper.h" >-#include "rtc_base/gunit.h" >-#include "system_wrappers/include/metrics.h" >- >-namespace webrtc { >-namespace { >- >-constexpr float kInputLevelLinear = 15000.f; >- >-constexpr float kGainToApplyDb = 15.f; >- >-float RunFixedGainControllerWithConstantInput(FixedGainController* fixed_gc, >- const float input_level, >- const size_t num_frames, >- const int sample_rate) { >- // Give time to the level estimator to converge. >- for (size_t i = 0; i < num_frames; ++i) { >- VectorFloatFrame vectors_with_float_frame( >- 1, rtc::CheckedDivExact(sample_rate, 100), input_level); >- fixed_gc->Process(vectors_with_float_frame.float_frame_view()); >- } >- >- // Process the last frame with constant input level. >- VectorFloatFrame vectors_with_float_frame_last( >- 1, rtc::CheckedDivExact(sample_rate, 100), input_level); >- fixed_gc->Process(vectors_with_float_frame_last.float_frame_view()); >- >- // Return the last sample from the last processed frame. 
>- const auto channel = >- vectors_with_float_frame_last.float_frame_view().channel(0); >- return channel[channel.size() - 1]; >-} >- >-std::unique_ptr<ApmDataDumper> GetApmDataDumper() { >- return absl::make_unique<ApmDataDumper>(0); >-} >- >-std::unique_ptr<FixedGainController> CreateFixedGainController( >- float gain_to_apply, >- size_t rate, >- std::string histogram_name_prefix, >- ApmDataDumper* test_data_dumper) { >- std::unique_ptr<FixedGainController> fgc = >- absl::make_unique<FixedGainController>(test_data_dumper, >- histogram_name_prefix); >- fgc->SetGain(gain_to_apply); >- fgc->SetSampleRate(rate); >- return fgc; >-} >- >-std::unique_ptr<FixedGainController> CreateFixedGainController( >- float gain_to_apply, >- size_t rate, >- ApmDataDumper* test_data_dumper) { >- return CreateFixedGainController(gain_to_apply, rate, "", test_data_dumper); >-} >- >-} // namespace >- >-TEST(AutomaticGainController2FixedDigital, CreateUse) { >- const int kSampleRate = 44000; >- auto test_data_dumper = GetApmDataDumper(); >- std::unique_ptr<FixedGainController> fixed_gc = CreateFixedGainController( >- kGainToApplyDb, kSampleRate, test_data_dumper.get()); >- VectorFloatFrame vectors_with_float_frame( >- 1, rtc::CheckedDivExact(kSampleRate, 100), kInputLevelLinear); >- auto float_frame = vectors_with_float_frame.float_frame_view(); >- fixed_gc->Process(float_frame); >- const auto channel = float_frame.channel(0); >- EXPECT_LT(kInputLevelLinear, channel[0]); >-} >- >-TEST(AutomaticGainController2FixedDigital, CheckSaturationBehaviorWithLimiter) { >- const float kInputLevel = 32767.f; >- const size_t kNumFrames = 5; >- const size_t kSampleRate = 42000; >- >- auto test_data_dumper = GetApmDataDumper(); >- >- const auto gains_no_saturation = >- test::LinSpace(0.1, test::kLimiterMaxInputLevelDbFs - 0.01, 10); >- for (const auto gain_db : gains_no_saturation) { >- // Since |test::kLimiterMaxInputLevelDbFs| > |gain_db|, the >- // limiter will not saturate the signal. 
>- std::unique_ptr<FixedGainController> fixed_gc_no_saturation = >- CreateFixedGainController(gain_db, kSampleRate, test_data_dumper.get()); >- >- // Saturation not expected. >- SCOPED_TRACE(std::to_string(gain_db)); >- EXPECT_LT( >- RunFixedGainControllerWithConstantInput( >- fixed_gc_no_saturation.get(), kInputLevel, kNumFrames, kSampleRate), >- 32767.f); >- } >- >- const auto gains_saturation = >- test::LinSpace(test::kLimiterMaxInputLevelDbFs + 0.01, 10, 10); >- for (const auto gain_db : gains_saturation) { >- // Since |test::kLimiterMaxInputLevelDbFs| < |gain|, the limiter >- // will saturate the signal. >- std::unique_ptr<FixedGainController> fixed_gc_saturation = >- CreateFixedGainController(gain_db, kSampleRate, test_data_dumper.get()); >- >- // Saturation expected. >- SCOPED_TRACE(std::to_string(gain_db)); >- EXPECT_FLOAT_EQ( >- RunFixedGainControllerWithConstantInput( >- fixed_gc_saturation.get(), kInputLevel, kNumFrames, kSampleRate), >- 32767.f); >- } >-} >- >-TEST(AutomaticGainController2FixedDigital, >- CheckSaturationBehaviorWithLimiterSingleSample) { >- const float kInputLevel = 32767.f; >- const size_t kNumFrames = 5; >- const size_t kSampleRate = 8000; >- >- auto test_data_dumper = GetApmDataDumper(); >- >- const auto gains_no_saturation = >- test::LinSpace(0.1, test::kLimiterMaxInputLevelDbFs - 0.01, 10); >- for (const auto gain_db : gains_no_saturation) { >- // Since |gain| > |test::kLimiterMaxInputLevelDbFs|, the limiter will >- // not saturate the signal. >- std::unique_ptr<FixedGainController> fixed_gc_no_saturation = >- CreateFixedGainController(gain_db, kSampleRate, test_data_dumper.get()); >- >- // Saturation not expected. 
>- SCOPED_TRACE(std::to_string(gain_db)); >- EXPECT_LT( >- RunFixedGainControllerWithConstantInput( >- fixed_gc_no_saturation.get(), kInputLevel, kNumFrames, kSampleRate), >- 32767.f); >- } >- >- const auto gains_saturation = >- test::LinSpace(test::kLimiterMaxInputLevelDbFs + 0.01, 10, 10); >- for (const auto gain_db : gains_saturation) { >- // Singe |gain| < |test::kLimiterMaxInputLevelDbFs|, the limiter will >- // saturate the signal. >- std::unique_ptr<FixedGainController> fixed_gc_saturation = >- CreateFixedGainController(gain_db, kSampleRate, test_data_dumper.get()); >- >- // Saturation expected. >- SCOPED_TRACE(std::to_string(gain_db)); >- EXPECT_FLOAT_EQ( >- RunFixedGainControllerWithConstantInput( >- fixed_gc_saturation.get(), kInputLevel, kNumFrames, kSampleRate), >- 32767.f); >- } >-} >- >-TEST(AutomaticGainController2FixedDigital, GainShouldChangeOnSetGain) { >- constexpr float kInputLevel = 1000.f; >- constexpr size_t kNumFrames = 5; >- constexpr size_t kSampleRate = 8000; >- constexpr float kGainDbNoChange = 0.f; >- constexpr float kGainDbFactor10 = 20.f; >- >- auto test_data_dumper = GetApmDataDumper(); >- std::unique_ptr<FixedGainController> fixed_gc_no_saturation = >- CreateFixedGainController(kGainDbNoChange, kSampleRate, >- test_data_dumper.get()); >- >- // Signal level is unchanged with 0 db gain. >- EXPECT_FLOAT_EQ( >- RunFixedGainControllerWithConstantInput( >- fixed_gc_no_saturation.get(), kInputLevel, kNumFrames, kSampleRate), >- kInputLevel); >- >- fixed_gc_no_saturation->SetGain(kGainDbFactor10); >- >- // +20db should increase signal by a factor of 10. >- EXPECT_FLOAT_EQ( >- RunFixedGainControllerWithConstantInput( >- fixed_gc_no_saturation.get(), kInputLevel, kNumFrames, kSampleRate), >- kInputLevel * 10); >-} >- >-TEST(AutomaticGainController2FixedDigital, >- SetGainShouldBeFastAndTimeInvariant) { >- // Number of frames required for the fixed gain controller to adapt on the >- // input signal when the gain changes. 
>- constexpr size_t kNumFrames = 5; >- >- constexpr float kInputLevel = 1000.f; >- constexpr size_t kSampleRate = 8000; >- constexpr float kGainDbLow = 0.f; >- constexpr float kGainDbHigh = 40.f; >- static_assert(kGainDbLow < kGainDbHigh, ""); >- >- auto test_data_dumper = GetApmDataDumper(); >- std::unique_ptr<FixedGainController> fixed_gc = CreateFixedGainController( >- kGainDbLow, kSampleRate, test_data_dumper.get()); >- >- fixed_gc->SetGain(kGainDbLow); >- const float output_level_pre = RunFixedGainControllerWithConstantInput( >- fixed_gc.get(), kInputLevel, kNumFrames, kSampleRate); >- >- fixed_gc->SetGain(kGainDbHigh); >- RunFixedGainControllerWithConstantInput(fixed_gc.get(), kInputLevel, >- kNumFrames, kSampleRate); >- >- fixed_gc->SetGain(kGainDbLow); >- const float output_level_post = RunFixedGainControllerWithConstantInput( >- fixed_gc.get(), kInputLevel, kNumFrames, kSampleRate); >- >- EXPECT_EQ(output_level_pre, output_level_post); >-} >- >-TEST(AutomaticGainController2FixedDigital, RegionHistogramIsUpdated) { >- constexpr size_t kSampleRate = 8000; >- constexpr float kGainDb = 0.f; >- constexpr float kInputLevel = 1000.f; >- constexpr size_t kNumFrames = 5; >- >- metrics::Reset(); >- >- auto test_data_dumper = GetApmDataDumper(); >- std::unique_ptr<FixedGainController> fixed_gc_no_saturation = >- CreateFixedGainController(kGainDb, kSampleRate, "Test", >- test_data_dumper.get()); >- >- static_cast<void>(RunFixedGainControllerWithConstantInput( >- fixed_gc_no_saturation.get(), kInputLevel, kNumFrames, kSampleRate)); >- >- // Destroying FixedGainController should cause the last limiter region to be >- // logged. 
>- fixed_gc_no_saturation.reset(); >- >- EXPECT_EQ(1, metrics::NumSamples( >- "WebRTC.Audio.Test.FixedDigitalGainCurveRegion.Identity")); >- EXPECT_EQ(0, metrics::NumSamples( >- "WebRTC.Audio.Test.FixedDigitalGainCurveRegion.Knee")); >- EXPECT_EQ(0, metrics::NumSamples( >- "WebRTC.Audio.Test.FixedDigitalGainCurveRegion.Limiter")); >- EXPECT_EQ(0, metrics::NumSamples( >- "WebRTC.Audio.Test.FixedDigitalGainCurveRegion.Saturation")); >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/gain_applier.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/gain_applier.cc >index 38eb1de03f9589bbd7fed044b830e88fc28c1470..8c437177e3cbfa6be506a6bdf5676a70f89b1cf7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/gain_applier.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/gain_applier.cc >@@ -10,6 +10,7 @@ > > #include "modules/audio_processing/agc2/gain_applier.h" > >+#include "api/array_view.h" > #include "modules/audio_processing/agc2/agc2_common.h" > #include "rtc_base/numerics/safe_minmax.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/gain_applier.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/gain_applier.h >index e2567b19dbbdc9a3be18db4146c6a2ab009db9ca..7f9f00e07600e2c818489e2c611e7a326bc30038 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/gain_applier.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/gain_applier.h >@@ -11,6 +11,8 @@ > #ifndef MODULES_AUDIO_PROCESSING_AGC2_GAIN_APPLIER_H_ > #define MODULES_AUDIO_PROCESSING_AGC2_GAIN_APPLIER_H_ > >+#include <stddef.h> >+ > #include "modules/audio_processing/include/audio_frame_view.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/gain_curve_applier.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/gain_curve_applier.cc >deleted file mode 100644 >index 1eca21b98e71817872a0543a0f09e88591bdab5e..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/gain_curve_applier.cc >+++ /dev/null >@@ -1,141 +0,0 @@ >-/* >- * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include "modules/audio_processing/agc2/gain_curve_applier.h" >- >-#include <algorithm> >-#include <array> >-#include <cmath> >- >-#include "api/array_view.h" >-#include "modules/audio_processing/logging/apm_data_dumper.h" >-#include "rtc_base/checks.h" >- >-namespace webrtc { >-namespace { >- >-// This constant affects the way scaling factors are interpolated for the first >-// sub-frame of a frame. Only in the case in which the first sub-frame has an >-// estimated level which is greater than the that of the previous analyzed >-// sub-frame, linear interpolation is replaced with a power function which >-// reduces the chances of over-shooting (and hence saturation), however reducing >-// the fixed gain effectiveness. 
>-constexpr float kAttackFirstSubframeInterpolationPower = 8.f; >- >-void InterpolateFirstSubframe(float last_factor, >- float current_factor, >- rtc::ArrayView<float> subframe) { >- const auto n = subframe.size(); >- constexpr auto p = kAttackFirstSubframeInterpolationPower; >- for (size_t i = 0; i < n; ++i) { >- subframe[i] = std::pow(1.f - i / n, p) * (last_factor - current_factor) + >- current_factor; >- } >-} >- >-void ComputePerSampleSubframeFactors( >- const std::array<float, kSubFramesInFrame + 1>& scaling_factors, >- size_t samples_per_channel, >- rtc::ArrayView<float> per_sample_scaling_factors) { >- const size_t num_subframes = scaling_factors.size() - 1; >- const size_t subframe_size = >- rtc::CheckedDivExact(samples_per_channel, num_subframes); >- >- // Handle first sub-frame differently in case of attack. >- const bool is_attack = scaling_factors[0] > scaling_factors[1]; >- if (is_attack) { >- InterpolateFirstSubframe( >- scaling_factors[0], scaling_factors[1], >- rtc::ArrayView<float>( >- per_sample_scaling_factors.subview(0, subframe_size))); >- } >- >- for (size_t i = is_attack ? 
1 : 0; i < num_subframes; ++i) { >- const size_t subframe_start = i * subframe_size; >- const float scaling_start = scaling_factors[i]; >- const float scaling_end = scaling_factors[i + 1]; >- const float scaling_diff = (scaling_end - scaling_start) / subframe_size; >- for (size_t j = 0; j < subframe_size; ++j) { >- per_sample_scaling_factors[subframe_start + j] = >- scaling_start + scaling_diff * j; >- } >- } >-} >- >-void ScaleSamples(rtc::ArrayView<const float> per_sample_scaling_factors, >- AudioFrameView<float> signal) { >- const size_t samples_per_channel = signal.samples_per_channel(); >- RTC_DCHECK_EQ(samples_per_channel, per_sample_scaling_factors.size()); >- for (size_t i = 0; i < signal.num_channels(); ++i) { >- auto channel = signal.channel(i); >- for (size_t j = 0; j < samples_per_channel; ++j) { >- channel[j] *= per_sample_scaling_factors[j]; >- } >- } >-} >- >-} // namespace >- >-GainCurveApplier::GainCurveApplier(size_t sample_rate_hz, >- ApmDataDumper* apm_data_dumper, >- std::string histogram_name) >- : interp_gain_curve_(apm_data_dumper, histogram_name), >- level_estimator_(sample_rate_hz, apm_data_dumper), >- apm_data_dumper_(apm_data_dumper) {} >- >-GainCurveApplier::~GainCurveApplier() = default; >- >-void GainCurveApplier::Process(AudioFrameView<float> signal) { >- const auto level_estimate = level_estimator_.ComputeLevel(signal); >- >- RTC_DCHECK_EQ(level_estimate.size() + 1, scaling_factors_.size()); >- scaling_factors_[0] = last_scaling_factor_; >- std::transform(level_estimate.begin(), level_estimate.end(), >- scaling_factors_.begin() + 1, [this](float x) { >- return interp_gain_curve_.LookUpGainToApply(x); >- }); >- >- const size_t samples_per_channel = signal.samples_per_channel(); >- RTC_DCHECK_LE(samples_per_channel, kMaximalNumberOfSamplesPerChannel); >- >- auto per_sample_scaling_factors = rtc::ArrayView<float>( >- &per_sample_scaling_factors_[0], samples_per_channel); >- ComputePerSampleSubframeFactors(scaling_factors_, 
samples_per_channel, >- per_sample_scaling_factors); >- ScaleSamples(per_sample_scaling_factors, signal); >- >- last_scaling_factor_ = scaling_factors_.back(); >- >- // Dump data for debug. >- apm_data_dumper_->DumpRaw("agc2_gain_curve_applier_scaling_factors", >- samples_per_channel, >- per_sample_scaling_factors_.data()); >-} >- >-InterpolatedGainCurve::Stats GainCurveApplier::GetGainCurveStats() const { >- return interp_gain_curve_.get_stats(); >-} >- >-void GainCurveApplier::SetSampleRate(size_t sample_rate_hz) { >- level_estimator_.SetSampleRate(sample_rate_hz); >- // Check that per_sample_scaling_factors_ is large enough. >- RTC_DCHECK_LE(sample_rate_hz, >- kMaximalNumberOfSamplesPerChannel * 1000 / kFrameDurationMs); >-} >- >-void GainCurveApplier::Reset() { >- level_estimator_.Reset(); >-} >- >-float GainCurveApplier::LastAudioLevel() const { >- return level_estimator_.LastAudioLevel(); >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/gain_curve_applier.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/gain_curve_applier.h >deleted file mode 100644 >index e0be19e06944c83c12ed695cbc50072dd8eba34f..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/gain_curve_applier.h >+++ /dev/null >@@ -1,63 +0,0 @@ >-/* >- * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#ifndef MODULES_AUDIO_PROCESSING_AGC2_GAIN_CURVE_APPLIER_H_ >-#define MODULES_AUDIO_PROCESSING_AGC2_GAIN_CURVE_APPLIER_H_ >- >-#include <vector> >- >-#include "modules/audio_processing/agc2/fixed_digital_level_estimator.h" >-#include "modules/audio_processing/agc2/interpolated_gain_curve.h" >-#include "modules/audio_processing/include/audio_frame_view.h" >-#include "rtc_base/constructormagic.h" >- >-namespace webrtc { >-class ApmDataDumper; >- >-class GainCurveApplier { >- public: >- GainCurveApplier(size_t sample_rate_hz, >- ApmDataDumper* apm_data_dumper, >- std::string histogram_name_prefix); >- >- ~GainCurveApplier(); >- >- void Process(AudioFrameView<float> signal); >- InterpolatedGainCurve::Stats GetGainCurveStats() const; >- >- // Supported rates must be >- // * supported by FixedDigitalLevelEstimator >- // * below kMaximalNumberOfSamplesPerChannel*1000/kFrameDurationMs >- // so that samples_per_channel fit in the >- // per_sample_scaling_factors_ array. >- void SetSampleRate(size_t sample_rate_hz); >- >- // Resets the internal state. >- void Reset(); >- >- float LastAudioLevel() const; >- >- private: >- const InterpolatedGainCurve interp_gain_curve_; >- FixedDigitalLevelEstimator level_estimator_; >- ApmDataDumper* const apm_data_dumper_ = nullptr; >- >- // Work array containing the sub-frame scaling factors to be interpolated. 
>- std::array<float, kSubFramesInFrame + 1> scaling_factors_ = {}; >- std::array<float, kMaximalNumberOfSamplesPerChannel> >- per_sample_scaling_factors_ = {}; >- float last_scaling_factor_ = 1.f; >- >- RTC_DISALLOW_COPY_AND_ASSIGN(GainCurveApplier); >-}; >- >-} // namespace webrtc >- >-#endif // MODULES_AUDIO_PROCESSING_AGC2_GAIN_CURVE_APPLIER_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/gain_curve_applier_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/gain_curve_applier_unittest.cc >deleted file mode 100644 >index 0f75f62c407cbb3d99048f4082bbc7c49a1ff0c8..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/gain_curve_applier_unittest.cc >+++ /dev/null >@@ -1,60 +0,0 @@ >-/* >- * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include "modules/audio_processing/agc2/gain_curve_applier.h" >- >-#include "common_audio/include/audio_util.h" >-#include "modules/audio_processing/agc2/agc2_common.h" >-#include "modules/audio_processing/agc2/agc2_testing_common.h" >-#include "modules/audio_processing/agc2/vector_float_frame.h" >-#include "modules/audio_processing/logging/apm_data_dumper.h" >-#include "rtc_base/gunit.h" >- >-namespace webrtc { >- >-TEST(GainCurveApplier, GainCurveApplierShouldConstructAndRun) { >- const int sample_rate_hz = 48000; >- ApmDataDumper apm_data_dumper(0); >- >- GainCurveApplier gain_curve_applier(sample_rate_hz, &apm_data_dumper, ""); >- >- VectorFloatFrame vectors_with_float_frame(1, sample_rate_hz / 100, >- kMaxAbsFloatS16Value); >- gain_curve_applier.Process(vectors_with_float_frame.float_frame_view()); >-} >- >-TEST(GainCurveApplier, OutputVolumeAboveThreshold) { >- const int sample_rate_hz = 48000; >- const float input_level = >- (kMaxAbsFloatS16Value + DbfsToFloatS16(test::kLimiterMaxInputLevelDbFs)) / >- 2.f; >- ApmDataDumper apm_data_dumper(0); >- >- GainCurveApplier gain_curve_applier(sample_rate_hz, &apm_data_dumper, ""); >- >- // Give the level estimator time to adapt. 
>- for (int i = 0; i < 5; ++i) { >- VectorFloatFrame vectors_with_float_frame(1, sample_rate_hz / 100, >- input_level); >- gain_curve_applier.Process(vectors_with_float_frame.float_frame_view()); >- } >- >- VectorFloatFrame vectors_with_float_frame(1, sample_rate_hz / 100, >- input_level); >- gain_curve_applier.Process(vectors_with_float_frame.float_frame_view()); >- rtc::ArrayView<const float> channel = >- vectors_with_float_frame.float_frame_view().channel(0); >- >- for (const auto& sample : channel) { >- EXPECT_LT(0.9f * kMaxAbsFloatS16Value, sample); >- } >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/interpolated_gain_curve.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/interpolated_gain_curve.cc >index 73e6a8ecc36f1560c82509797ffd55875f8bd0f7..f5d6b47169c2226d4d1c2dd125b396d00b792c57 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/interpolated_gain_curve.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/interpolated_gain_curve.cc >@@ -10,10 +10,12 @@ > > #include "modules/audio_processing/agc2/interpolated_gain_curve.h" > >+#include <algorithm> >+#include <iterator> >+ > #include "modules/audio_processing/agc2/agc2_common.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" > #include "rtc_base/checks.h" >-#include "rtc_base/logging.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/interpolated_gain_curve_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/interpolated_gain_curve_unittest.cc >index dd696313ae266574a4b8531ee9fbb96450cc2b0a..a8e0f2361c91a1213067807e04958d01f58f7879 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/interpolated_gain_curve_unittest.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/interpolated_gain_curve_unittest.cc >@@ -9,6 +9,7 @@ > */ > > #include <array> >+#include <type_traits> > #include <vector> > > #include "api/array_view.h" >@@ -16,7 +17,7 @@ > #include "modules/audio_processing/agc2/agc2_common.h" > #include "modules/audio_processing/agc2/compute_interpolated_gain_curve.h" > #include "modules/audio_processing/agc2/interpolated_gain_curve.h" >-#include "modules/audio_processing/agc2/limiter.h" >+#include "modules/audio_processing/agc2/limiter_db_gain_curve.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" > #include "rtc_base/checks.h" > #include "rtc_base/gunit.h" >@@ -27,7 +28,8 @@ namespace { > constexpr double kLevelEpsilon = 1e-2 * kMaxAbsFloatS16Value; > constexpr float kInterpolatedGainCurveTolerance = 1.f / 32768.f; > ApmDataDumper apm_data_dumper(0); >-const Limiter limiter; >+static_assert(std::is_trivially_destructible<LimiterDbGainCurve>::value, ""); >+const LimiterDbGainCurve limiter; > > } // namespace > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/limiter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/limiter.cc >index d2b9877be11ed4a361de93fa6dad73520aa00cdf..1589f07404d990039ddb93fe063c41dbea5ecea5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/limiter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/limiter.cc >@@ -10,128 +10,141 @@ > > #include "modules/audio_processing/agc2/limiter.h" > >+#include <algorithm> >+#include <array> > #include <cmath> > >-#include "common_audio/include/audio_util.h" >+#include "api/array_view.h" > #include "modules/audio_processing/agc2/agc2_common.h" >+#include "modules/audio_processing/logging/apm_data_dumper.h" > #include "rtc_base/checks.h" >+#include "rtc_base/numerics/safe_minmax.h" > > namespace webrtc { > namespace { > >-double 
ComputeKneeStart(double max_input_level_db,
>-                        double knee_smoothness_db,
>-                        double compression_ratio) {
>-  RTC_CHECK_LT((compression_ratio - 1.0) * knee_smoothness_db /
>-                   (2.0 * compression_ratio),
>-               max_input_level_db);
>-  return -knee_smoothness_db / 2.0 -
>-         max_input_level_db / (compression_ratio - 1.0);
>+// This constant affects the way scaling factors are interpolated for the first
>+// sub-frame of a frame. Only in the case in which the first sub-frame has an
>+// estimated level which is greater than that of the previous analyzed
>+// sub-frame, linear interpolation is replaced with a power function which
>+// reduces the chances of over-shooting (and hence saturation), however reducing
>+// the fixed gain effectiveness.
>+constexpr float kAttackFirstSubframeInterpolationPower = 8.f;
>+
>+void InterpolateFirstSubframe(float last_factor,
>+                              float current_factor,
>+                              rtc::ArrayView<float> subframe) {
>+  const auto n = subframe.size();
>+  constexpr auto p = kAttackFirstSubframeInterpolationPower;
>+  for (size_t i = 0; i < n; ++i) {
>+    subframe[i] = std::pow(1.f - i / n, p) * (last_factor - current_factor) +
>+                  current_factor;
>+  }
> }
> 
>-std::array<double, 3> ComputeKneeRegionPolynomial(double knee_start_dbfs,
>-                                                  double knee_smoothness_db,
>-                                                  double compression_ratio) {
>-  const double a = (1.0 - compression_ratio) /
>-                   (2.0 * knee_smoothness_db * compression_ratio);
>-  const double b = 1.0 - 2.0 * a * knee_start_dbfs;
>-  const double c = a * knee_start_dbfs * knee_start_dbfs;
>-  return {{a, b, c}};
>-}
>+void ComputePerSampleSubframeFactors(
>+    const std::array<float, kSubFramesInFrame + 1>& scaling_factors,
>+    size_t samples_per_channel,
>+    rtc::ArrayView<float> per_sample_scaling_factors) {
>+  const size_t num_subframes = scaling_factors.size() - 1;
>+  const size_t subframe_size =
>+      rtc::CheckedDivExact(samples_per_channel, num_subframes);
>+
>+  // Handle first sub-frame differently in case of attack.
>+ const bool is_attack = scaling_factors[0] > scaling_factors[1]; >+ if (is_attack) { >+ InterpolateFirstSubframe( >+ scaling_factors[0], scaling_factors[1], >+ rtc::ArrayView<float>( >+ per_sample_scaling_factors.subview(0, subframe_size))); >+ } > >-double ComputeLimiterD1(double max_input_level_db, double compression_ratio) { >- return (std::pow(10.0, -max_input_level_db / (20.0 * compression_ratio)) * >- (1.0 - compression_ratio) / compression_ratio) / >- kMaxAbsFloatS16Value; >+ for (size_t i = is_attack ? 1 : 0; i < num_subframes; ++i) { >+ const size_t subframe_start = i * subframe_size; >+ const float scaling_start = scaling_factors[i]; >+ const float scaling_end = scaling_factors[i + 1]; >+ const float scaling_diff = (scaling_end - scaling_start) / subframe_size; >+ for (size_t j = 0; j < subframe_size; ++j) { >+ per_sample_scaling_factors[subframe_start + j] = >+ scaling_start + scaling_diff * j; >+ } >+ } > } > >-constexpr double ComputeLimiterD2(double compression_ratio) { >- return (1.0 - 2.0 * compression_ratio) / compression_ratio; >+void ScaleSamples(rtc::ArrayView<const float> per_sample_scaling_factors, >+ AudioFrameView<float> signal) { >+ const size_t samples_per_channel = signal.samples_per_channel(); >+ RTC_DCHECK_EQ(samples_per_channel, per_sample_scaling_factors.size()); >+ for (size_t i = 0; i < signal.num_channels(); ++i) { >+ auto channel = signal.channel(i); >+ for (size_t j = 0; j < samples_per_channel; ++j) { >+ channel[j] = rtc::SafeClamp(channel[j] * per_sample_scaling_factors[j], >+ kMinFloatS16Value, kMaxFloatS16Value); >+ } >+ } > } > >-double ComputeLimiterI2(double max_input_level_db, >- double compression_ratio, >- double gain_curve_limiter_i1) { >- RTC_CHECK_NE(gain_curve_limiter_i1, 0.f); >- return std::pow(10.0, -max_input_level_db / (20.0 * compression_ratio)) / >- gain_curve_limiter_i1 / >- std::pow(kMaxAbsFloatS16Value, gain_curve_limiter_i1 - 1); >+void CheckLimiterSampleRate(size_t sample_rate_hz) { >+ // Check that 
per_sample_scaling_factors_ is large enough. >+ RTC_DCHECK_LE(sample_rate_hz, >+ kMaximalNumberOfSamplesPerChannel * 1000 / kFrameDurationMs); > } > > } // namespace > >-Limiter::Limiter() >- : max_input_level_linear_(DbfsToFloatS16(max_input_level_db_)), >- knee_start_dbfs_(ComputeKneeStart(max_input_level_db_, >- knee_smoothness_db_, >- compression_ratio_)), >- knee_start_linear_(DbfsToFloatS16(knee_start_dbfs_)), >- limiter_start_dbfs_(knee_start_dbfs_ + knee_smoothness_db_), >- limiter_start_linear_(DbfsToFloatS16(limiter_start_dbfs_)), >- knee_region_polynomial_(ComputeKneeRegionPolynomial(knee_start_dbfs_, >- knee_smoothness_db_, >- compression_ratio_)), >- gain_curve_limiter_d1_( >- ComputeLimiterD1(max_input_level_db_, compression_ratio_)), >- gain_curve_limiter_d2_(ComputeLimiterD2(compression_ratio_)), >- gain_curve_limiter_i1_(1.0 / compression_ratio_), >- gain_curve_limiter_i2_(ComputeLimiterI2(max_input_level_db_, >- compression_ratio_, >- gain_curve_limiter_i1_)) { >- static_assert(knee_smoothness_db_ > 0.0f, ""); >- static_assert(compression_ratio_ > 1.0f, ""); >- RTC_CHECK_GE(max_input_level_db_, knee_start_dbfs_ + knee_smoothness_db_); >+Limiter::Limiter(size_t sample_rate_hz, >+ ApmDataDumper* apm_data_dumper, >+ std::string histogram_name) >+ : interp_gain_curve_(apm_data_dumper, histogram_name), >+ level_estimator_(sample_rate_hz, apm_data_dumper), >+ apm_data_dumper_(apm_data_dumper) { >+ CheckLimiterSampleRate(sample_rate_hz); > } > >-constexpr double Limiter::max_input_level_db_; >-constexpr double Limiter::knee_smoothness_db_; >-constexpr double Limiter::compression_ratio_; >+Limiter::~Limiter() = default; > >-double Limiter::GetOutputLevelDbfs(double input_level_dbfs) const { >- if (input_level_dbfs < knee_start_dbfs_) { >- return input_level_dbfs; >- } else if (input_level_dbfs < limiter_start_dbfs_) { >- return GetKneeRegionOutputLevelDbfs(input_level_dbfs); >- } >- return GetCompressorRegionOutputLevelDbfs(input_level_dbfs); >-} >+void 
Limiter::Process(AudioFrameView<float> signal) { >+ const auto level_estimate = level_estimator_.ComputeLevel(signal); > >-double Limiter::GetGainLinear(double input_level_linear) const { >- if (input_level_linear < knee_start_linear_) { >- return 1.0; >- } >- return DbfsToFloatS16( >- GetOutputLevelDbfs(FloatS16ToDbfs(input_level_linear))) / >- input_level_linear; >+ RTC_DCHECK_EQ(level_estimate.size() + 1, scaling_factors_.size()); >+ scaling_factors_[0] = last_scaling_factor_; >+ std::transform(level_estimate.begin(), level_estimate.end(), >+ scaling_factors_.begin() + 1, [this](float x) { >+ return interp_gain_curve_.LookUpGainToApply(x); >+ }); >+ >+ const size_t samples_per_channel = signal.samples_per_channel(); >+ RTC_DCHECK_LE(samples_per_channel, kMaximalNumberOfSamplesPerChannel); >+ >+ auto per_sample_scaling_factors = rtc::ArrayView<float>( >+ &per_sample_scaling_factors_[0], samples_per_channel); >+ ComputePerSampleSubframeFactors(scaling_factors_, samples_per_channel, >+ per_sample_scaling_factors); >+ ScaleSamples(per_sample_scaling_factors, signal); >+ >+ last_scaling_factor_ = scaling_factors_.back(); >+ >+ // Dump data for debug. >+ apm_data_dumper_->DumpRaw("agc2_gain_curve_applier_scaling_factors", >+ samples_per_channel, >+ per_sample_scaling_factors_.data()); > } > >-// Computes the first derivative of GetGainLinear() in |x|. >-double Limiter::GetGainFirstDerivativeLinear(double x) const { >- // Beyond-knee region only. >- RTC_CHECK_GE(x, limiter_start_linear_ - 1e-7 * kMaxAbsFloatS16Value); >- return gain_curve_limiter_d1_ * >- std::pow(x / kMaxAbsFloatS16Value, gain_curve_limiter_d2_); >+InterpolatedGainCurve::Stats Limiter::GetGainCurveStats() const { >+ return interp_gain_curve_.get_stats(); > } > >-// Computes the integral of GetGainLinear() in the range [x0, x1]. >-double Limiter::GetGainIntegralLinear(double x0, double x1) const { >- RTC_CHECK_LE(x0, x1); // Valid interval. 
>- RTC_CHECK_GE(x0, limiter_start_linear_); // Beyond-knee region only. >- auto limiter_integral = [this](const double& x) { >- return gain_curve_limiter_i2_ * std::pow(x, gain_curve_limiter_i1_); >- }; >- return limiter_integral(x1) - limiter_integral(x0); >+void Limiter::SetSampleRate(size_t sample_rate_hz) { >+ CheckLimiterSampleRate(sample_rate_hz); >+ level_estimator_.SetSampleRate(sample_rate_hz); > } > >-double Limiter::GetKneeRegionOutputLevelDbfs(double input_level_dbfs) const { >- return knee_region_polynomial_[0] * input_level_dbfs * input_level_dbfs + >- knee_region_polynomial_[1] * input_level_dbfs + >- knee_region_polynomial_[2]; >+void Limiter::Reset() { >+ level_estimator_.Reset(); > } > >-double Limiter::GetCompressorRegionOutputLevelDbfs( >- double input_level_dbfs) const { >- return (input_level_dbfs - max_input_level_db_) / compression_ratio_; >+float Limiter::LastAudioLevel() const { >+ return level_estimator_.LastAudioLevel(); > } > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/limiter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/limiter.h >index f350baeef105e3b9acff472bf947872f80eb1db8..1e0ab71a183786dd1616bd9e03086d2df7ae0369 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/limiter.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/limiter.h >@@ -11,62 +11,52 @@ > #ifndef MODULES_AUDIO_PROCESSING_AGC2_LIMITER_H_ > #define MODULES_AUDIO_PROCESSING_AGC2_LIMITER_H_ > >-#include <array> >+#include <string> >+#include <vector> > >-#include "modules/audio_processing/agc2/agc2_testing_common.h" >+#include "modules/audio_processing/agc2/fixed_digital_level_estimator.h" >+#include "modules/audio_processing/agc2/interpolated_gain_curve.h" >+#include "modules/audio_processing/include/audio_frame_view.h" >+#include "rtc_base/constructormagic.h" > > namespace webrtc { >+class ApmDataDumper; > 
>-// A class for computing gain curve parameters. The gain curve is >-// defined by constants kLimiterMaxInputLevelDbFs, kLimiterKneeSmoothnessDb, >-// kLimiterCompressionRatio. The curve consints of one linear part, >-// one quadratic polynomial part and another linear part. The >-// constants define the parameters of the parts. > class Limiter { > public: >- Limiter(); >+ Limiter(size_t sample_rate_hz, >+ ApmDataDumper* apm_data_dumper, >+ std::string histogram_name_prefix); >+ Limiter(const Limiter& limiter) = delete; >+ Limiter& operator=(const Limiter& limiter) = delete; >+ ~Limiter(); > >- double max_input_level_db() const { return max_input_level_db_; } >- double max_input_level_linear() const { return max_input_level_linear_; } >- double knee_start_linear() const { return knee_start_linear_; } >- double limiter_start_linear() const { return limiter_start_linear_; } >+ // Applies limiter and hard-clipping to |signal|. >+ void Process(AudioFrameView<float> signal); >+ InterpolatedGainCurve::Stats GetGainCurveStats() const; > >- // These methods can be marked 'constexpr' in C++ 14. >- double GetOutputLevelDbfs(double input_level_dbfs) const; >- double GetGainLinear(double input_level_linear) const; >- double GetGainFirstDerivativeLinear(double x) const; >- double GetGainIntegralLinear(double x0, double x1) const; >+ // Supported rates must be >+ // * supported by FixedDigitalLevelEstimator >+ // * below kMaximalNumberOfSamplesPerChannel*1000/kFrameDurationMs >+ // so that samples_per_channel fit in the >+ // per_sample_scaling_factors_ array. 
>+ void SetSampleRate(size_t sample_rate_hz); > >- private: >- double GetKneeRegionOutputLevelDbfs(double input_level_dbfs) const; >- double GetCompressorRegionOutputLevelDbfs(double input_level_dbfs) const; >- >- static constexpr double max_input_level_db_ = test::kLimiterMaxInputLevelDbFs; >- static constexpr double knee_smoothness_db_ = test::kLimiterKneeSmoothnessDb; >- static constexpr double compression_ratio_ = test::kLimiterCompressionRatio; >- >- const double max_input_level_linear_; >- >- // Do not modify signal with level <= knee_start_dbfs_. >- const double knee_start_dbfs_; >- const double knee_start_linear_; >+ // Resets the internal state. >+ void Reset(); > >- // The upper end of the knee region, which is between knee_start_dbfs_ and >- // limiter_start_dbfs_. >- const double limiter_start_dbfs_; >- const double limiter_start_linear_; >+ float LastAudioLevel() const; > >- // Coefficients {a, b, c} of the knee region polynomial >- // ax^2 + bx + c in the DB scale. >- const std::array<double, 3> knee_region_polynomial_; >- >- // Parameters for the computation of the first derivative of GetGainLinear(). >- const double gain_curve_limiter_d1_; >- const double gain_curve_limiter_d2_; >- >- // Parameters for the computation of the integral of GetGainLinear(). >- const double gain_curve_limiter_i1_; >- const double gain_curve_limiter_i2_; >+ private: >+ const InterpolatedGainCurve interp_gain_curve_; >+ FixedDigitalLevelEstimator level_estimator_; >+ ApmDataDumper* const apm_data_dumper_ = nullptr; >+ >+ // Work array containing the sub-frame scaling factors to be interpolated. 
>+ std::array<float, kSubFramesInFrame + 1> scaling_factors_ = {}; >+ std::array<float, kMaximalNumberOfSamplesPerChannel> >+ per_sample_scaling_factors_ = {}; >+ float last_scaling_factor_ = 1.f; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/limiter_db_gain_curve.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/limiter_db_gain_curve.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..d55ed5df58c1aa5dc1834f6019c011322d2e37c2 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/limiter_db_gain_curve.cc >@@ -0,0 +1,138 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "modules/audio_processing/agc2/limiter_db_gain_curve.h" >+ >+#include <cmath> >+ >+#include "common_audio/include/audio_util.h" >+#include "modules/audio_processing/agc2/agc2_common.h" >+#include "rtc_base/checks.h" >+ >+namespace webrtc { >+namespace { >+ >+double ComputeKneeStart(double max_input_level_db, >+ double knee_smoothness_db, >+ double compression_ratio) { >+ RTC_CHECK_LT((compression_ratio - 1.0) * knee_smoothness_db / >+ (2.0 * compression_ratio), >+ max_input_level_db); >+ return -knee_smoothness_db / 2.0 - >+ max_input_level_db / (compression_ratio - 1.0); >+} >+ >+std::array<double, 3> ComputeKneeRegionPolynomial(double knee_start_dbfs, >+ double knee_smoothness_db, >+ double compression_ratio) { >+ const double a = (1.0 - compression_ratio) / >+ (2.0 * knee_smoothness_db * compression_ratio); >+ const double b = 1.0 - 2.0 * a * knee_start_dbfs; >+ const double c = a * knee_start_dbfs * knee_start_dbfs; >+ return {{a, b, c}}; >+} >+ >+double ComputeLimiterD1(double max_input_level_db, double compression_ratio) { >+ return (std::pow(10.0, -max_input_level_db / (20.0 * compression_ratio)) * >+ (1.0 - compression_ratio) / compression_ratio) / >+ kMaxAbsFloatS16Value; >+} >+ >+constexpr double ComputeLimiterD2(double compression_ratio) { >+ return (1.0 - 2.0 * compression_ratio) / compression_ratio; >+} >+ >+double ComputeLimiterI2(double max_input_level_db, >+ double compression_ratio, >+ double gain_curve_limiter_i1) { >+ RTC_CHECK_NE(gain_curve_limiter_i1, 0.f); >+ return std::pow(10.0, -max_input_level_db / (20.0 * compression_ratio)) / >+ gain_curve_limiter_i1 / >+ std::pow(kMaxAbsFloatS16Value, gain_curve_limiter_i1 - 1); >+} >+ >+} // namespace >+ >+LimiterDbGainCurve::LimiterDbGainCurve() >+ : max_input_level_linear_(DbfsToFloatS16(max_input_level_db_)), >+ knee_start_dbfs_(ComputeKneeStart(max_input_level_db_, >+ knee_smoothness_db_, >+ compression_ratio_)), >+ knee_start_linear_(DbfsToFloatS16(knee_start_dbfs_)), >+ 
limiter_start_dbfs_(knee_start_dbfs_ + knee_smoothness_db_), >+ limiter_start_linear_(DbfsToFloatS16(limiter_start_dbfs_)), >+ knee_region_polynomial_(ComputeKneeRegionPolynomial(knee_start_dbfs_, >+ knee_smoothness_db_, >+ compression_ratio_)), >+ gain_curve_limiter_d1_( >+ ComputeLimiterD1(max_input_level_db_, compression_ratio_)), >+ gain_curve_limiter_d2_(ComputeLimiterD2(compression_ratio_)), >+ gain_curve_limiter_i1_(1.0 / compression_ratio_), >+ gain_curve_limiter_i2_(ComputeLimiterI2(max_input_level_db_, >+ compression_ratio_, >+ gain_curve_limiter_i1_)) { >+ static_assert(knee_smoothness_db_ > 0.0f, ""); >+ static_assert(compression_ratio_ > 1.0f, ""); >+ RTC_CHECK_GE(max_input_level_db_, knee_start_dbfs_ + knee_smoothness_db_); >+} >+ >+constexpr double LimiterDbGainCurve::max_input_level_db_; >+constexpr double LimiterDbGainCurve::knee_smoothness_db_; >+constexpr double LimiterDbGainCurve::compression_ratio_; >+ >+double LimiterDbGainCurve::GetOutputLevelDbfs(double input_level_dbfs) const { >+ if (input_level_dbfs < knee_start_dbfs_) { >+ return input_level_dbfs; >+ } else if (input_level_dbfs < limiter_start_dbfs_) { >+ return GetKneeRegionOutputLevelDbfs(input_level_dbfs); >+ } >+ return GetCompressorRegionOutputLevelDbfs(input_level_dbfs); >+} >+ >+double LimiterDbGainCurve::GetGainLinear(double input_level_linear) const { >+ if (input_level_linear < knee_start_linear_) { >+ return 1.0; >+ } >+ return DbfsToFloatS16( >+ GetOutputLevelDbfs(FloatS16ToDbfs(input_level_linear))) / >+ input_level_linear; >+} >+ >+// Computes the first derivative of GetGainLinear() in |x|. >+double LimiterDbGainCurve::GetGainFirstDerivativeLinear(double x) const { >+ // Beyond-knee region only. >+ RTC_CHECK_GE(x, limiter_start_linear_ - 1e-7 * kMaxAbsFloatS16Value); >+ return gain_curve_limiter_d1_ * >+ std::pow(x / kMaxAbsFloatS16Value, gain_curve_limiter_d2_); >+} >+ >+// Computes the integral of GetGainLinear() in the range [x0, x1]. 
>+double LimiterDbGainCurve::GetGainIntegralLinear(double x0, double x1) const { >+ RTC_CHECK_LE(x0, x1); // Valid interval. >+ RTC_CHECK_GE(x0, limiter_start_linear_); // Beyond-knee region only. >+ auto limiter_integral = [this](const double& x) { >+ return gain_curve_limiter_i2_ * std::pow(x, gain_curve_limiter_i1_); >+ }; >+ return limiter_integral(x1) - limiter_integral(x0); >+} >+ >+double LimiterDbGainCurve::GetKneeRegionOutputLevelDbfs( >+ double input_level_dbfs) const { >+ return knee_region_polynomial_[0] * input_level_dbfs * input_level_dbfs + >+ knee_region_polynomial_[1] * input_level_dbfs + >+ knee_region_polynomial_[2]; >+} >+ >+double LimiterDbGainCurve::GetCompressorRegionOutputLevelDbfs( >+ double input_level_dbfs) const { >+ return (input_level_dbfs - max_input_level_db_) / compression_ratio_; >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/limiter_db_gain_curve.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/limiter_db_gain_curve.h >new file mode 100644 >index 0000000000000000000000000000000000000000..9086e26739b7cfc20fe6a4c353a4d7bc5aa06885 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/limiter_db_gain_curve.h >@@ -0,0 +1,76 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#ifndef MODULES_AUDIO_PROCESSING_AGC2_LIMITER_DB_GAIN_CURVE_H_ >+#define MODULES_AUDIO_PROCESSING_AGC2_LIMITER_DB_GAIN_CURVE_H_ >+ >+#include <array> >+ >+#include "modules/audio_processing/agc2/agc2_testing_common.h" >+ >+namespace webrtc { >+ >+// A class for computing a limiter gain curve (in dB scale) given a set of >+// hard-coded parameters (namely, kLimiterDbGainCurveMaxInputLevelDbFs, >+// kLimiterDbGainCurveKneeSmoothnessDb, and >+// kLimiterDbGainCurveCompressionRatio). The generated curve consists of four >+// regions: identity (linear), knee (quadratic polynomial), compression >+// (linear), saturation (linear). The aforementioned constants are used to shape >+// the different regions. >+class LimiterDbGainCurve { >+ public: >+ LimiterDbGainCurve(); >+ >+ double max_input_level_db() const { return max_input_level_db_; } >+ double max_input_level_linear() const { return max_input_level_linear_; } >+ double knee_start_linear() const { return knee_start_linear_; } >+ double limiter_start_linear() const { return limiter_start_linear_; } >+ >+ // These methods can be marked 'constexpr' in C++ 14. >+ double GetOutputLevelDbfs(double input_level_dbfs) const; >+ double GetGainLinear(double input_level_linear) const; >+ double GetGainFirstDerivativeLinear(double x) const; >+ double GetGainIntegralLinear(double x0, double x1) const; >+ >+ private: >+ double GetKneeRegionOutputLevelDbfs(double input_level_dbfs) const; >+ double GetCompressorRegionOutputLevelDbfs(double input_level_dbfs) const; >+ >+ static constexpr double max_input_level_db_ = test::kLimiterMaxInputLevelDbFs; >+ static constexpr double knee_smoothness_db_ = test::kLimiterKneeSmoothnessDb; >+ static constexpr double compression_ratio_ = test::kLimiterCompressionRatio; >+ >+ const double max_input_level_linear_; >+ >+ // Do not modify signal with level <= knee_start_dbfs_. 
>+ const double knee_start_dbfs_; >+ const double knee_start_linear_; >+ >+ // The upper end of the knee region, which is between knee_start_dbfs_ and >+ // limiter_start_dbfs_. >+ const double limiter_start_dbfs_; >+ const double limiter_start_linear_; >+ >+ // Coefficients {a, b, c} of the knee region polynomial >+ // ax^2 + bx + c in the DB scale. >+ const std::array<double, 3> knee_region_polynomial_; >+ >+ // Parameters for the computation of the first derivative of GetGainLinear(). >+ const double gain_curve_limiter_d1_; >+ const double gain_curve_limiter_d2_; >+ >+ // Parameters for the computation of the integral of GetGainLinear(). >+ const double gain_curve_limiter_i1_; >+ const double gain_curve_limiter_i2_; >+}; >+ >+} // namespace webrtc >+ >+#endif // MODULES_AUDIO_PROCESSING_AGC2_LIMITER_DB_GAIN_CURVE_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/limiter_db_gain_curve_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/limiter_db_gain_curve_unittest.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..049c8d568ea50f0caef27952f1d7bfa5ed0d54b6 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/limiter_db_gain_curve_unittest.cc >@@ -0,0 +1,60 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "modules/audio_processing/agc2/limiter_db_gain_curve.h" >+ >+#include "rtc_base/gunit.h" >+ >+namespace webrtc { >+ >+TEST(FixedDigitalGainController2Limiter, ConstructDestruct) { >+ LimiterDbGainCurve l; >+} >+ >+TEST(FixedDigitalGainController2Limiter, GainCurveShouldBeMonotone) { >+ LimiterDbGainCurve l; >+ float last_output_level = 0.f; >+ bool has_last_output_level = false; >+ for (float level = -90.f; level <= l.max_input_level_db(); level += 0.5f) { >+ const float current_output_level = l.GetOutputLevelDbfs(level); >+ if (!has_last_output_level) { >+ last_output_level = current_output_level; >+ has_last_output_level = true; >+ } >+ EXPECT_LE(last_output_level, current_output_level); >+ last_output_level = current_output_level; >+ } >+} >+ >+TEST(FixedDigitalGainController2Limiter, GainCurveShouldBeContinuous) { >+ LimiterDbGainCurve l; >+ float last_output_level = 0.f; >+ bool has_last_output_level = false; >+ constexpr float kMaxDelta = 0.5f; >+ for (float level = -90.f; level <= l.max_input_level_db(); level += 0.5f) { >+ const float current_output_level = l.GetOutputLevelDbfs(level); >+ if (!has_last_output_level) { >+ last_output_level = current_output_level; >+ has_last_output_level = true; >+ } >+ EXPECT_LE(current_output_level, last_output_level + kMaxDelta); >+ last_output_level = current_output_level; >+ } >+} >+ >+TEST(FixedDigitalGainController2Limiter, OutputGainShouldBeLessThanFullScale) { >+ LimiterDbGainCurve l; >+ for (float level = -90.f; level <= l.max_input_level_db(); level += 0.5f) { >+ const float current_output_level = l.GetOutputLevelDbfs(level); >+ EXPECT_LE(current_output_level, 0.f); >+ } >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/limiter_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/limiter_unittest.cc >index 707981263475aa3b9f0eb35189175985bb3a7932..e662a7fc890458e38b58f7812f70299387fdae2d 100644 
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/limiter_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/limiter_unittest.cc >@@ -10,50 +10,50 @@ > > #include "modules/audio_processing/agc2/limiter.h" > >+#include "common_audio/include/audio_util.h" >+#include "modules/audio_processing/agc2/agc2_common.h" >+#include "modules/audio_processing/agc2/agc2_testing_common.h" >+#include "modules/audio_processing/agc2/vector_float_frame.h" >+#include "modules/audio_processing/logging/apm_data_dumper.h" > #include "rtc_base/gunit.h" > > namespace webrtc { > >-TEST(FixedDigitalGainController2Limiter, ConstructDestruct) { >- Limiter l; >-} >+TEST(Limiter, LimiterShouldConstructAndRun) { >+ const int sample_rate_hz = 48000; >+ ApmDataDumper apm_data_dumper(0); > >-TEST(FixedDigitalGainController2Limiter, GainCurveShouldBeMonotone) { >- Limiter l; >- float last_output_level = 0.f; >- bool has_last_output_level = false; >- for (float level = -90.f; level <= l.max_input_level_db(); level += 0.5f) { >- const float current_output_level = l.GetOutputLevelDbfs(level); >- if (!has_last_output_level) { >- last_output_level = current_output_level; >- has_last_output_level = true; >- } >- EXPECT_LE(last_output_level, current_output_level); >- last_output_level = current_output_level; >- } >+ Limiter limiter(sample_rate_hz, &apm_data_dumper, ""); >+ >+ VectorFloatFrame vectors_with_float_frame(1, sample_rate_hz / 100, >+ kMaxAbsFloatS16Value); >+ limiter.Process(vectors_with_float_frame.float_frame_view()); > } > >-TEST(FixedDigitalGainController2Limiter, GainCurveShouldBeContinuous) { >- Limiter l; >- float last_output_level = 0.f; >- bool has_last_output_level = false; >- constexpr float kMaxDelta = 0.5f; >- for (float level = -90.f; level <= l.max_input_level_db(); level += 0.5f) { >- const float current_output_level = l.GetOutputLevelDbfs(level); >- if (!has_last_output_level) { >- last_output_level = 
current_output_level; >- has_last_output_level = true; >- } >- EXPECT_LE(current_output_level, last_output_level + kMaxDelta); >- last_output_level = current_output_level; >+TEST(Limiter, OutputVolumeAboveThreshold) { >+ const int sample_rate_hz = 48000; >+ const float input_level = >+ (kMaxAbsFloatS16Value + DbfsToFloatS16(test::kLimiterMaxInputLevelDbFs)) / >+ 2.f; >+ ApmDataDumper apm_data_dumper(0); >+ >+ Limiter limiter(sample_rate_hz, &apm_data_dumper, ""); >+ >+ // Give the level estimator time to adapt. >+ for (int i = 0; i < 5; ++i) { >+ VectorFloatFrame vectors_with_float_frame(1, sample_rate_hz / 100, >+ input_level); >+ limiter.Process(vectors_with_float_frame.float_frame_view()); > } >-} > >-TEST(FixedDigitalGainController2Limiter, OutputGainShouldBeLessThanFullScale) { >- Limiter l; >- for (float level = -90.f; level <= l.max_input_level_db(); level += 0.5f) { >- const float current_output_level = l.GetOutputLevelDbfs(level); >- EXPECT_LE(current_output_level, 0.f); >+ VectorFloatFrame vectors_with_float_frame(1, sample_rate_hz / 100, >+ input_level); >+ limiter.Process(vectors_with_float_frame.float_frame_view()); >+ rtc::ArrayView<const float> channel = >+ vectors_with_float_frame.float_frame_view().channel(0); >+ >+ for (const auto& sample : channel) { >+ EXPECT_LT(0.9f * kMaxAbsFloatS16Value, sample); > } > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/noise_level_estimator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/noise_level_estimator.cc >index d9aaf1f9bd3f6689de7fad047b9334a3f0f706d2..6e43672ce0316f2762ce960e62607bc85b972c6f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/noise_level_estimator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/noise_level_estimator.cc >@@ -10,13 +10,15 @@ > > #include "modules/audio_processing/agc2/noise_level_estimator.h" > >-#include <math.h> >- >+#include 
<stddef.h> > #include <algorithm> >+#include <cmath> > #include <numeric> > >+#include "api/array_view.h" > #include "common_audio/include/audio_util.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" >+#include "rtc_base/checks.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/noise_spectrum_estimator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/noise_spectrum_estimator.cc >index 9e08126e892e3752c69ab3fc8120ce64d51a2d17..5735faf3d2cbd364d0d50c8b5ad0b4abb0d01cd2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/noise_spectrum_estimator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/noise_spectrum_estimator.cc >@@ -16,6 +16,7 @@ > #include "api/array_view.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" > #include "rtc_base/arraysize.h" >+#include "rtc_base/checks.h" > > namespace webrtc { > namespace { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/features_extraction.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/features_extraction.cc >index 8ab56736a2f2b8b919f43f6415bff14b9024262a..8f472a55ed649d28121b9ea4c2bc15745c096eb9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/features_extraction.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/features_extraction.cc >@@ -10,6 +10,8 @@ > > #include "modules/audio_processing/agc2/rnn_vad/features_extraction.h" > >+#include <array> >+ > #include "modules/audio_processing/agc2/rnn_vad/lp_residual.h" > #include "rtc_base/checks.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/features_extraction.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/features_extraction.h >index 
1f63885c4e6d8774c2c44dc01125f598a753a402..ce5cce1857cd417bcbea4d79d011944cb86e5e4a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/features_extraction.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/features_extraction.h >@@ -11,12 +11,12 @@ > #ifndef MODULES_AUDIO_PROCESSING_AGC2_RNN_VAD_FEATURES_EXTRACTION_H_ > #define MODULES_AUDIO_PROCESSING_AGC2_RNN_VAD_FEATURES_EXTRACTION_H_ > >-#include <memory> > #include <vector> > > #include "api/array_view.h" > #include "modules/audio_processing/agc2/biquad_filter.h" > #include "modules/audio_processing/agc2/rnn_vad/common.h" >+#include "modules/audio_processing/agc2/rnn_vad/pitch_info.h" > #include "modules/audio_processing/agc2/rnn_vad/pitch_search.h" > #include "modules/audio_processing/agc2/rnn_vad/sequence_buffer.h" > #include "modules/audio_processing/agc2/rnn_vad/spectral_features.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/fft_util.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/fft_util.cc >index 1017400a2824d1bba2c462c0673348bc14a5baa0..a1c5dac47774e40b189708b6545d5f9af901ee25 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/fft_util.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/fft_util.cc >@@ -10,6 +10,7 @@ > > #include "modules/audio_processing/agc2/rnn_vad/fft_util.h" > >+#include <stddef.h> > #include <cmath> > > #include "rtc_base/checks.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/lp_residual.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/lp_residual.h >index bffafd291f936e79a4961b1f7dddbb4b5e157f21..cddedca5d614ec94a0f36544b73d9309e4e8dd6a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/lp_residual.h >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/lp_residual.h >@@ -11,6 +11,8 @@ > #ifndef MODULES_AUDIO_PROCESSING_AGC2_RNN_VAD_LP_RESIDUAL_H_ > #define MODULES_AUDIO_PROCESSING_AGC2_RNN_VAD_LP_RESIDUAL_H_ > >+#include <stddef.h> >+ > #include "api/array_view.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/pitch_search.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/pitch_search.cc >index 1f8859d5b04d453506ba185e7cc90ee8d4af9af3..aa0b751d28a1abd5e4d70cd5e2d0c91e644f0827 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/pitch_search.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/pitch_search.cc >@@ -10,6 +10,11 @@ > > #include "modules/audio_processing/agc2/rnn_vad/pitch_search.h" > >+#include <array> >+#include <cstddef> >+ >+#include "rtc_base/checks.h" >+ > namespace webrtc { > namespace rnn_vad { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/pitch_search_internal.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/pitch_search_internal.cc >index b7b44d293769c0af932416922b75ed6aae002534..32ee8c00df849b5642170a080304c51df3c67fd0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/pitch_search_internal.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/pitch_search_internal.cc >@@ -10,10 +10,12 @@ > > #include "modules/audio_processing/agc2/rnn_vad/pitch_search_internal.h" > >+#include <stdlib.h> > #include <algorithm> > #include <cmath> >+#include <complex> >+#include <cstddef> > #include <numeric> >-#include <utility> > > #include "modules/audio_processing/agc2/rnn_vad/common.h" > #include "rtc_base/checks.h" >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/pitch_search_internal.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/pitch_search_internal.h >index 75f7f17a42fc2968f3c287ee146d26731ae58693..bb747bb03e4c5d89c122dfc824477693d38ffb5d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/pitch_search_internal.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/pitch_search_internal.h >@@ -11,6 +11,7 @@ > #ifndef MODULES_AUDIO_PROCESSING_AGC2_RNN_VAD_PITCH_SEARCH_INTERNAL_H_ > #define MODULES_AUDIO_PROCESSING_AGC2_RNN_VAD_PITCH_SEARCH_INTERNAL_H_ > >+#include <stddef.h> > #include <array> > > #include "api/array_view.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/rnn.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/rnn.h >index b3a3b9c485c172f6276cb49a2bb54d17a03c8b15..a7d057d576959001a44f7d3b0a7969960974c2d1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/rnn.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/rnn.h >@@ -11,6 +11,8 @@ > #ifndef MODULES_AUDIO_PROCESSING_AGC2_RNN_VAD_RNN_H_ > #define MODULES_AUDIO_PROCESSING_AGC2_RNN_VAD_RNN_H_ > >+#include <stddef.h> >+#include <sys/types.h> > #include <array> > > #include "api/array_view.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/rnn_vad_tool.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/rnn_vad_tool.cc >index 5fba0bfac877c4dc6dfaf84268706c630bc679bf..b66dfd684d7443c83669f237c561d3ace9f05866 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/rnn_vad_tool.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/rnn_vad_tool.cc >@@ -25,22 +25,22 
@@ namespace rnn_vad { > namespace test { > namespace { > >-DEFINE_string(i, "", "Path to the input wav file"); >+WEBRTC_DEFINE_string(i, "", "Path to the input wav file"); > std::string InputWavFile() { > return static_cast<std::string>(FLAG_i); > } > >-DEFINE_string(f, "", "Path to the output features file"); >+WEBRTC_DEFINE_string(f, "", "Path to the output features file"); > std::string OutputFeaturesFile() { > return static_cast<std::string>(FLAG_f); > } > >-DEFINE_string(o, "", "Path to the output VAD probabilities file"); >+WEBRTC_DEFINE_string(o, "", "Path to the output VAD probabilities file"); > std::string OutputVadProbsFile() { > return static_cast<std::string>(FLAG_o); > } > >-DEFINE_bool(help, false, "Prints this message"); >+WEBRTC_DEFINE_bool(help, false, "Prints this message"); > > } // namespace > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/spectral_features.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/spectral_features.h >index bedd7ab3d7d1ada5824f176a7b55a9797b3b75b8..5c33dcdd245616b89131d0b7fc36be731aba4f7e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/spectral_features.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/spectral_features.h >@@ -13,6 +13,7 @@ > > #include <array> > #include <complex> >+#include <cstddef> > #include <vector> > > #include "api/array_view.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/spectral_features_internal.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/spectral_features_internal.cc >index 9f4e21850f5f76e9c0fb6b71aa28f4f9d1adae12..74211fe814da8871969a3ab0600e5a7552a8c83f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/spectral_features_internal.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/spectral_features_internal.cc >@@ -12,6 +12,7 @@ > > #include <algorithm> > #include <cmath> >+#include <cstddef> > > #include "rtc_base/checks.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/spectral_features_internal.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/spectral_features_internal.h >index 45bb382078e65727342d84633abb658d0d8919d8..edfd18cc85063b28c72bfcd6a72c9ce8b63391dc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/spectral_features_internal.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/rnn_vad/spectral_features_internal.h >@@ -11,6 +11,7 @@ > #ifndef MODULES_AUDIO_PROCESSING_AGC2_RNN_VAD_SPECTRAL_FEATURES_INTERNAL_H_ > #define MODULES_AUDIO_PROCESSING_AGC2_RNN_VAD_SPECTRAL_FEATURES_INTERNAL_H_ > >+#include <stddef.h> > #include <array> > #include <complex> > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/saturation_protector.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/saturation_protector.cc >index 0895583b83962ff360ae6013e3a6bbdf66acd7ed..94a52eaacaad867770a475dbc1bb33abf2a86088 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/saturation_protector.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/saturation_protector.cc >@@ -11,6 +11,7 @@ > #include "modules/audio_processing/agc2/saturation_protector.h" > > #include <algorithm> >+#include <iterator> > > #include "modules/audio_processing/logging/apm_data_dumper.h" > #include "rtc_base/numerics/safe_minmax.h" >@@ -56,9 +57,14 @@ float SaturationProtector::PeakEnveloper::Query() const { > } > > SaturationProtector::SaturationProtector(ApmDataDumper* apm_data_dumper) >+ : SaturationProtector(apm_data_dumper, 
GetExtraSaturationMarginOffsetDb()) { >+} >+ >+SaturationProtector::SaturationProtector(ApmDataDumper* apm_data_dumper, >+ float extra_saturation_margin_db) > : apm_data_dumper_(apm_data_dumper), > last_margin_(GetInitialSaturationMarginDb()), >- extra_saturation_margin_db_(GetExtraSaturationMarginOffsetDb()) {} >+ extra_saturation_margin_db_(extra_saturation_margin_db) {} > > void SaturationProtector::UpdateMargin( > const VadWithLevel::LevelAndProbability& vad_data, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/saturation_protector.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/saturation_protector.h >index 1705f6ac5fbbbb0a21ff1cee43bce975feb5d682..e637469070eab6032a1f66e5fb5d9059c85ac1f2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/saturation_protector.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/saturation_protector.h >@@ -24,6 +24,9 @@ class SaturationProtector { > public: > explicit SaturationProtector(ApmDataDumper* apm_data_dumper); > >+ SaturationProtector(ApmDataDumper* apm_data_dumper, >+ float extra_saturation_margin_db); >+ > // Update and return margin estimate. This method should be called > // whenever a frame is reliably classified as 'speech'. 
> // >@@ -60,7 +63,7 @@ class SaturationProtector { > > float last_margin_; > PeakEnveloper peak_enveloper_; >- float extra_saturation_margin_db_; >+ const float extra_saturation_margin_db_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/signal_classifier.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/signal_classifier.cc >index 0ec34148b9353a884e1c1a208f043ad95fd636af..8778c494265797a4ff88b620e7804b944af53f5d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/signal_classifier.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/signal_classifier.cc >@@ -18,7 +18,7 @@ > #include "modules/audio_processing/agc2/down_sampler.h" > #include "modules/audio_processing/agc2/noise_spectrum_estimator.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" >-#include "rtc_base/constructormagic.h" >+#include "rtc_base/checks.h" > > namespace webrtc { > namespace { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/vad_with_level.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/vad_with_level.cc >index decfacd0c18037608357cb7d153d6dc9ccd81312..52970dfe677c3d5e71efc2fd2e87ab4ec4a35203 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/vad_with_level.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/vad_with_level.cc >@@ -11,10 +11,12 @@ > #include "modules/audio_processing/agc2/vad_with_level.h" > > #include <algorithm> >+#include <array> >+#include <cmath> > >+#include "api/array_view.h" > #include "common_audio/include/audio_util.h" > #include "modules/audio_processing/agc2/rnn_vad/common.h" >-#include "rtc_base/checks.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/vad_with_level.h 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/vad_with_level.h >index 67a00ced6c1060966991db4b8ec83d59cda0a825..b0ad868d4b97c4a6b5360ece1363aa056be0516b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/vad_with_level.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/vad_with_level.h >@@ -11,7 +11,6 @@ > #ifndef MODULES_AUDIO_PROCESSING_AGC2_VAD_WITH_LEVEL_H_ > #define MODULES_AUDIO_PROCESSING_AGC2_VAD_WITH_LEVEL_H_ > >-#include "api/array_view.h" > #include "common_audio/resampler/include/push_resampler.h" > #include "modules/audio_processing/agc2/rnn_vad/features_extraction.h" > #include "modules/audio_processing/agc2/rnn_vad/rnn.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/vad_with_level_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/vad_with_level_unittest.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..f9aee62ba9aab410af1be2d019587fdc2cf5639a >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/agc2/vad_with_level_unittest.cc >@@ -0,0 +1,40 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "modules/audio_processing/agc2/vad_with_level.h" >+ >+#include "rtc_base/gunit.h" >+ >+namespace webrtc { >+namespace test { >+ >+TEST(AutomaticGainController2VadWithLevelEstimator, >+ PeakLevelGreaterThanRmsLevel) { >+ constexpr size_t kSampleRateHz = 8000; >+ >+ // 10 ms input frame, constant except for one peak value. 
>+ // Handcrafted so that the average is lower than the peak value. >+ std::array<float, kSampleRateHz / 100> frame; >+ frame.fill(1000.f); >+ frame[10] = 2000.f; >+ float* const channel0 = frame.data(); >+ AudioFrameView<float> frame_view(&channel0, 1, frame.size()); >+ >+ // Compute audio frame levels (the VAD result is ignored). >+ VadWithLevel vad_with_level; >+ auto levels_and_vad_prob = vad_with_level.AnalyzeFrame(frame_view); >+ >+ // Compare peak and RMS levels. >+ EXPECT_LT(levels_and_vad_prob.speech_rms_dbfs, >+ levels_and_vad_prob.speech_peak_dbfs); >+} >+ >+} // namespace test >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_buffer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_buffer.cc >index f163f5a07ded289233fe8faa8ca29dda041cc6a5..0c38a4fe82b854de906f06e675fc694842c9b61b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_buffer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_buffer.cc >@@ -10,11 +10,13 @@ > > #include "modules/audio_processing/audio_buffer.h" > >+#include <string.h> >+#include <cstdint> >+ > #include "common_audio/channel_buffer.h" > #include "common_audio/include/audio_util.h" > #include "common_audio/resampler/push_sinc_resampler.h" >-#include "common_audio/signal_processing/include/signal_processing_library.h" >-#include "modules/audio_processing/common.h" >+#include "modules/audio_processing/splitting_filter.h" > #include "rtc_base/checks.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_buffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_buffer.h >index ade3eec13db2a17377d4cebf074ba772ccfac55a..469646e8db4b68d9fd9748289f45a3f0ddd366d3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_buffer.h >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_buffer.h >@@ -11,18 +11,21 @@ > #ifndef MODULES_AUDIO_PROCESSING_AUDIO_BUFFER_H_ > #define MODULES_AUDIO_PROCESSING_AUDIO_BUFFER_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <memory> > #include <vector> > > #include "api/audio/audio_frame.h" > #include "common_audio/channel_buffer.h" > #include "modules/audio_processing/include/audio_processing.h" >-#include "modules/audio_processing/splitting_filter.h" >+#include "rtc_base/gtest_prod_util.h" > > namespace webrtc { > >-class PushSincResampler; > class IFChannelBuffer; >+class PushSincResampler; >+class SplittingFilter; > > enum Band { kBand0To8kHz = 0, kBand8To16kHz = 1, kBand16To24kHz = 2 }; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_processing_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_processing_impl.cc >index b9153fa08b92040a66b88ab2bfaab209c95ecb72..2937c0680bf5ac699b2c28ff968419688381f1eb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_processing_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_processing_impl.cc >@@ -10,15 +10,16 @@ > > #include "modules/audio_processing/audio_processing_impl.h" > >-#include <math.h> > #include <algorithm> >+#include <cstdint> > #include <string> >+#include <type_traits> >+#include <utility> > >+#include "absl/types/optional.h" >+#include "api/array_view.h" > #include "common_audio/audio_converter.h" >-#include "common_audio/channel_buffer.h" > #include "common_audio/include/audio_util.h" >-#include "common_audio/signal_processing/include/signal_processing_library.h" >-#include "modules/audio_processing/aec/aec_core.h" > #include "modules/audio_processing/agc/agc_manager_direct.h" > #include "modules/audio_processing/agc2/gain_applier.h" > #include "modules/audio_processing/audio_buffer.h" >@@ -28,6 +29,7 @@ > #include 
"modules/audio_processing/gain_control_for_experimental_agc.h" > #include "modules/audio_processing/gain_control_impl.h" > #include "modules/audio_processing/gain_controller2.h" >+#include "modules/audio_processing/include/audio_frame_view.h" > #include "modules/audio_processing/level_estimator_impl.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" > #include "modules/audio_processing/low_cut_filter.h" >@@ -37,10 +39,9 @@ > #include "modules/audio_processing/voice_detection_impl.h" > #include "rtc_base/atomicops.h" > #include "rtc_base/checks.h" >+#include "rtc_base/constructormagic.h" > #include "rtc_base/logging.h" >-#include "rtc_base/platform_file.h" > #include "rtc_base/refcountedobject.h" >-#include "rtc_base/system/arch.h" > #include "rtc_base/timeutils.h" > #include "rtc_base/trace_event.h" > #include "system_wrappers/include/metrics.h" >@@ -115,29 +116,6 @@ static const size_t kMaxAllowedValuesOfSamplesPerFrame = 480; > // TODO(peah): Decrease this once we properly handle hugely unbalanced > // reverse and forward call numbers. > static const size_t kMaxNumFramesToBuffer = 100; >- >-class HighPassFilterImpl : public HighPassFilter { >- public: >- explicit HighPassFilterImpl(AudioProcessingImpl* apm) : apm_(apm) {} >- ~HighPassFilterImpl() override = default; >- >- // HighPassFilter implementation. >- int Enable(bool enable) override { >- apm_->MutateConfig([enable](AudioProcessing::Config* config) { >- config->high_pass_filter.enabled = enable; >- }); >- >- return AudioProcessing::kNoError; >- } >- >- bool is_enabled() const override { >- return apm_->GetConfig().high_pass_filter.enabled; >- } >- >- private: >- AudioProcessingImpl* apm_; >- RTC_DISALLOW_IMPLICIT_CONSTRUCTORS(HighPassFilterImpl); >-}; > } // namespace > > // Throughout webrtc, it's assumed that success is represented by zero. 
>@@ -249,8 +227,6 @@ bool AudioProcessingImpl::ApmSubmoduleStates::LowCutFilteringRequired() const { > struct AudioProcessingImpl::ApmPublicSubmodules { > ApmPublicSubmodules() {} > // Accessed externally of APM without any lock acquired. >- std::unique_ptr<EchoCancellationImpl> echo_cancellation; >- std::unique_ptr<EchoControlMobileImpl> echo_control_mobile; > std::unique_ptr<GainControlImpl> gain_control; > std::unique_ptr<LevelEstimatorImpl> level_estimator; > std::unique_ptr<NoiseSuppressionImpl> noise_suppression; >@@ -276,11 +252,14 @@ struct AudioProcessingImpl::ApmPrivateSubmodules { > std::unique_ptr<GainController2> gain_controller2; > std::unique_ptr<LowCutFilter> low_cut_filter; > rtc::scoped_refptr<EchoDetector> echo_detector; >+ std::unique_ptr<EchoCancellationImpl> echo_cancellation; > std::unique_ptr<EchoControl> echo_controller; >+ std::unique_ptr<EchoControlMobileImpl> echo_control_mobile; > std::unique_ptr<CustomProcessing> capture_post_processor; > std::unique_ptr<CustomProcessing> render_pre_processor; > std::unique_ptr<GainApplier> pre_amplifier; > std::unique_ptr<CustomAudioAnalyzer> capture_analyzer; >+ std::unique_ptr<LevelEstimatorImpl> output_level_estimator; > }; > > AudioProcessingBuilder::AudioProcessingBuilder() = default; >@@ -352,7 +331,6 @@ AudioProcessingImpl::AudioProcessingImpl( > render_runtime_settings_(kRuntimeSettingQueueSize), > capture_runtime_settings_enqueuer_(&capture_runtime_settings_), > render_runtime_settings_enqueuer_(&render_runtime_settings_), >- high_pass_filter_impl_(new HighPassFilterImpl(this)), > echo_control_factory_(std::move(echo_control_factory)), > submodule_states_(!!capture_post_processor, > !!render_pre_processor, >@@ -390,10 +368,6 @@ AudioProcessingImpl::AudioProcessingImpl( > capture_nonlocked_.echo_controller_enabled = > static_cast<bool>(echo_control_factory_); > >- public_submodules_->echo_cancellation.reset( >- new EchoCancellationImpl(&crit_render_, &crit_capture_)); >- 
public_submodules_->echo_control_mobile.reset( >- new EchoControlMobileImpl(&crit_render_, &crit_capture_)); > public_submodules_->gain_control.reset( > new GainControlImpl(&crit_render_, &crit_capture_)); > public_submodules_->level_estimator.reset( >@@ -412,6 +386,8 @@ AudioProcessingImpl::AudioProcessingImpl( > new rtc::RefCountedObject<ResidualEchoDetector>(); > } > >+ private_submodules_->echo_cancellation.reset(new EchoCancellationImpl()); >+ private_submodules_->echo_control_mobile.reset(new EchoControlMobileImpl()); > // TODO(alessiob): Move the injected gain controller once injection is > // implemented. > private_submodules_->gain_controller2.reset(new GainController2()); >@@ -530,16 +506,16 @@ int AudioProcessingImpl::InitializeLocked() { > formats_.api_format.output_stream().num_channels(), > formats_.api_format.output_stream().num_frames())); > >- public_submodules_->echo_cancellation->Initialize( >+ private_submodules_->echo_cancellation->Initialize( > proc_sample_rate_hz(), num_reverse_channels(), num_output_channels(), > num_proc_channels()); > AllocateRenderQueue(); > >- int success = public_submodules_->echo_cancellation->enable_metrics(true); >+ int success = private_submodules_->echo_cancellation->enable_metrics(true); > RTC_DCHECK_EQ(0, success); >- success = public_submodules_->echo_cancellation->enable_delay_logging(true); >+ success = private_submodules_->echo_cancellation->enable_delay_logging(true); > RTC_DCHECK_EQ(0, success); >- public_submodules_->echo_control_mobile->Initialize( >+ private_submodules_->echo_control_mobile->Initialize( > proc_split_sample_rate_hz(), num_reverse_channels(), > num_output_channels()); > >@@ -662,18 +638,18 @@ int AudioProcessingImpl::InitializeLocked(const ProcessingConfig& config) { > } > > void AudioProcessingImpl::ApplyConfig(const AudioProcessing::Config& config) { >- config_ = config; >- > // Run in a single-threaded manner when applying the settings. 
> rtc::CritScope cs_render(&crit_render_); > rtc::CritScope cs_capture(&crit_capture_); > >- public_submodules_->echo_cancellation->Enable( >+ config_ = config; >+ >+ private_submodules_->echo_cancellation->Enable( > config_.echo_canceller.enabled && !config_.echo_canceller.mobile_mode); >- public_submodules_->echo_control_mobile->Enable( >+ private_submodules_->echo_control_mobile->Enable( > config_.echo_canceller.enabled && config_.echo_canceller.mobile_mode); > >- public_submodules_->echo_cancellation->set_suppression_level( >+ private_submodules_->echo_cancellation->set_suppression_level( > config.echo_canceller.legacy_moderate_suppression_level > ? EchoCancellationImpl::SuppressionLevel::kModerateSuppression > : EchoCancellationImpl::SuppressionLevel::kHighSuppression); >@@ -698,6 +674,13 @@ void AudioProcessingImpl::ApplyConfig(const AudioProcessing::Config& config) { > << config_.gain_controller2.enabled; > RTC_LOG(LS_INFO) << "Pre-amplifier activated: " > << config_.pre_amplifier.enabled; >+ >+ if (config_.level_estimation.enabled && >+ !private_submodules_->output_level_estimator) { >+ private_submodules_->output_level_estimator.reset( >+ new LevelEstimatorImpl(&crit_capture_)); >+ private_submodules_->output_level_estimator->Enable(true); >+ } > } > > void AudioProcessingImpl::SetExtraOptions(const webrtc::Config& config) { >@@ -705,7 +688,7 @@ void AudioProcessingImpl::SetExtraOptions(const webrtc::Config& config) { > rtc::CritScope cs_render(&crit_render_); > rtc::CritScope cs_capture(&crit_capture_); > >- public_submodules_->echo_cancellation->SetExtraOptions(config); >+ private_submodules_->echo_cancellation->SetExtraOptions(config); > > if (capture_.transient_suppressor_enabled != > config.Get<ExperimentalNs>().enabled) { >@@ -1082,12 +1065,12 @@ void AudioProcessingImpl::AllocateRenderQueue() { > void AudioProcessingImpl::EmptyQueuedRenderAudio() { > rtc::CritScope cs_capture(&crit_capture_); > while 
(aec_render_signal_queue_->Remove(&aec_capture_queue_buffer_)) { >- public_submodules_->echo_cancellation->ProcessRenderAudio( >+ private_submodules_->echo_cancellation->ProcessRenderAudio( > aec_capture_queue_buffer_); > } > > while (aecm_render_signal_queue_->Remove(&aecm_capture_queue_buffer_)) { >- public_submodules_->echo_control_mobile->ProcessRenderAudio( >+ private_submodules_->echo_control_mobile->ProcessRenderAudio( > aecm_capture_queue_buffer_); > } > >@@ -1109,9 +1092,6 @@ int AudioProcessingImpl::ProcessStream(AudioFrame* frame) { > // Acquire the capture lock in order to safely call the function > // that retrieves the render side data. This function accesses APM > // getters that need the capture lock held when being called. >- // The lock needs to be released as >- // public_submodules_->echo_control_mobile->is_enabled() acquires this lock >- // as well. > rtc::CritScope cs_capture(&crit_capture_); > EmptyQueuedRenderAudio(); > } >@@ -1180,8 +1160,8 @@ int AudioProcessingImpl::ProcessCaptureStreamLocked() { > // Ensure that not both the AEC and AECM are active at the same time. > // TODO(peah): Simplify once the public API Enable functions for these > // are moved to APM. >- RTC_DCHECK(!(public_submodules_->echo_cancellation->is_enabled() && >- public_submodules_->echo_control_mobile->is_enabled())); >+ RTC_DCHECK(!(private_submodules_->echo_cancellation->is_enabled() && >+ private_submodules_->echo_control_mobile->is_enabled())); > > MaybeUpdateHistograms(); > >@@ -1265,7 +1245,7 @@ int AudioProcessingImpl::ProcessCaptureStreamLocked() { > > // Ensure that the stream delay was set before the call to the > // AEC ProcessCaptureAudio function. 
>- if (public_submodules_->echo_cancellation->is_enabled() && >+ if (private_submodules_->echo_cancellation->is_enabled() && > !private_submodules_->echo_controller && !was_stream_delay_set()) { > return AudioProcessing::kStreamParameterNotSetError; > } >@@ -1281,11 +1261,11 @@ int AudioProcessingImpl::ProcessCaptureStreamLocked() { > private_submodules_->echo_controller->ProcessCapture( > capture_buffer, capture_.echo_path_gain_change); > } else { >- RETURN_ON_ERR(public_submodules_->echo_cancellation->ProcessCaptureAudio( >+ RETURN_ON_ERR(private_submodules_->echo_cancellation->ProcessCaptureAudio( > capture_buffer, stream_delay_ms())); > } > >- if (public_submodules_->echo_control_mobile->is_enabled() && >+ if (private_submodules_->echo_control_mobile->is_enabled() && > public_submodules_->noise_suppression->is_enabled()) { > capture_buffer->CopyLowPassToReference(); > } >@@ -1293,14 +1273,14 @@ int AudioProcessingImpl::ProcessCaptureStreamLocked() { > > // Ensure that the stream delay was set before the call to the > // AECM ProcessCaptureAudio function. 
>- if (public_submodules_->echo_control_mobile->is_enabled() && >+ if (private_submodules_->echo_control_mobile->is_enabled() && > !was_stream_delay_set()) { > return AudioProcessing::kStreamParameterNotSetError; > } > > if (!(private_submodules_->echo_controller || >- public_submodules_->echo_cancellation->is_enabled())) { >- RETURN_ON_ERR(public_submodules_->echo_control_mobile->ProcessCaptureAudio( >+ private_submodules_->echo_cancellation->is_enabled())) { >+ RETURN_ON_ERR(private_submodules_->echo_control_mobile->ProcessCaptureAudio( > capture_buffer, stream_delay_ms())); > } > >@@ -1315,7 +1295,7 @@ int AudioProcessingImpl::ProcessCaptureStreamLocked() { > } > RETURN_ON_ERR(public_submodules_->gain_control->ProcessCaptureAudio( > capture_buffer, >- public_submodules_->echo_cancellation->stream_has_echo())); >+ private_submodules_->echo_cancellation->stream_has_echo())); > > if (submodule_states_.CaptureMultiBandProcessingActive() && > SampleRateSupportsMultiBand( >@@ -1364,6 +1344,13 @@ int AudioProcessingImpl::ProcessCaptureStreamLocked() { > > // The level estimator operates on the recombined data. > public_submodules_->level_estimator->ProcessStream(capture_buffer); >+ if (config_.level_estimation.enabled) { >+ private_submodules_->output_level_estimator->ProcessStream(capture_buffer); >+ capture_.stats.output_rms_dbfs = >+ private_submodules_->output_level_estimator->RMS(); >+ } else { >+ capture_.stats.output_rms_dbfs = absl::nullopt; >+ } > > capture_output_rms_.Analyze(rtc::ArrayView<const int16_t>( > capture_buffer->channels_const()[0], >@@ -1613,112 +1600,52 @@ void AudioProcessingImpl::DetachPlayoutAudioGenerator() { > // Delete audio generator, if one is attached. 
> } > >-AudioProcessing::AudioProcessingStatistics::AudioProcessingStatistics() { >- residual_echo_return_loss.Set(-100.0f, -100.0f, -100.0f, -100.0f); >- echo_return_loss.Set(-100.0f, -100.0f, -100.0f, -100.0f); >- echo_return_loss_enhancement.Set(-100.0f, -100.0f, -100.0f, -100.0f); >- a_nlp.Set(-100.0f, -100.0f, -100.0f, -100.0f); >-} >- >-AudioProcessing::AudioProcessingStatistics::AudioProcessingStatistics( >- const AudioProcessingStatistics& other) = default; >- >-AudioProcessing::AudioProcessingStatistics::~AudioProcessingStatistics() = >- default; >- >-// TODO(ivoc): Remove this when GetStatistics() becomes pure virtual. >-AudioProcessing::AudioProcessingStatistics AudioProcessing::GetStatistics() >- const { >- return AudioProcessingStatistics(); >-} >- >-// TODO(ivoc): Remove this when GetStatistics() becomes pure virtual. >-AudioProcessingStats AudioProcessing::GetStatistics( >+AudioProcessingStats AudioProcessingImpl::GetStatistics( > bool has_remote_tracks) const { >- return AudioProcessingStats(); >-} >- >-AudioProcessing::AudioProcessingStatistics AudioProcessingImpl::GetStatistics() >- const { >- AudioProcessingStatistics stats; >+ rtc::CritScope cs_capture(&crit_capture_); >+ if (!has_remote_tracks) { >+ return capture_.stats; >+ } >+ AudioProcessingStats stats = capture_.stats; > EchoCancellationImpl::Metrics metrics; > if (private_submodules_->echo_controller) { >- rtc::CritScope cs_capture(&crit_capture_); > auto ec_metrics = private_submodules_->echo_controller->GetMetrics(); >- float erl = static_cast<float>(ec_metrics.echo_return_loss); >- float erle = static_cast<float>(ec_metrics.echo_return_loss_enhancement); >- // Instant value will also be used for min, max and average. 
>- stats.echo_return_loss.Set(erl, erl, erl, erl); >- stats.echo_return_loss_enhancement.Set(erle, erle, erle, erle); >- } else if (public_submodules_->echo_cancellation->GetMetrics(&metrics) == >+ stats.echo_return_loss = ec_metrics.echo_return_loss; >+ stats.echo_return_loss_enhancement = >+ ec_metrics.echo_return_loss_enhancement; >+ stats.delay_ms = ec_metrics.delay_ms; >+ } else if (private_submodules_->echo_cancellation->GetMetrics(&metrics) == > Error::kNoError) { >- stats.a_nlp.Set(metrics.a_nlp); >- stats.divergent_filter_fraction = metrics.divergent_filter_fraction; >- stats.echo_return_loss.Set(metrics.echo_return_loss); >- stats.echo_return_loss_enhancement.Set( >- metrics.echo_return_loss_enhancement); >- stats.residual_echo_return_loss.Set(metrics.residual_echo_return_loss); >+ if (metrics.divergent_filter_fraction != -1.0f) { >+ stats.divergent_filter_fraction = >+ absl::optional<double>(metrics.divergent_filter_fraction); >+ } >+ if (metrics.echo_return_loss.instant != -100) { >+ stats.echo_return_loss = >+ absl::optional<double>(metrics.echo_return_loss.instant); >+ } >+ if (metrics.echo_return_loss_enhancement.instant != -100) { >+ stats.echo_return_loss_enhancement = >+ absl::optional<double>(metrics.echo_return_loss_enhancement.instant); >+ } > } >- { >- rtc::CritScope cs_capture(&crit_capture_); >+ if (config_.residual_echo_detector.enabled) { > RTC_DCHECK(private_submodules_->echo_detector); > auto ed_metrics = private_submodules_->echo_detector->GetMetrics(); > stats.residual_echo_likelihood = ed_metrics.echo_likelihood; > stats.residual_echo_likelihood_recent_max = > ed_metrics.echo_likelihood_recent_max; > } >- public_submodules_->echo_cancellation->GetDelayMetrics( >- &stats.delay_median, &stats.delay_standard_deviation, >- &stats.fraction_poor_delays); >- return stats; >-} >- >-AudioProcessingStats AudioProcessingImpl::GetStatistics( >- bool has_remote_tracks) const { >- AudioProcessingStats stats; >- if (has_remote_tracks) { >- 
EchoCancellationImpl::Metrics metrics; >- if (private_submodules_->echo_controller) { >- rtc::CritScope cs_capture(&crit_capture_); >- auto ec_metrics = private_submodules_->echo_controller->GetMetrics(); >- stats.echo_return_loss = ec_metrics.echo_return_loss; >- stats.echo_return_loss_enhancement = >- ec_metrics.echo_return_loss_enhancement; >- stats.delay_ms = ec_metrics.delay_ms; >- } else if (public_submodules_->echo_cancellation->GetMetrics(&metrics) == >- Error::kNoError) { >- if (metrics.divergent_filter_fraction != -1.0f) { >- stats.divergent_filter_fraction = >- absl::optional<double>(metrics.divergent_filter_fraction); >- } >- if (metrics.echo_return_loss.instant != -100) { >- stats.echo_return_loss = >- absl::optional<double>(metrics.echo_return_loss.instant); >- } >- if (metrics.echo_return_loss_enhancement.instant != -100) { >- stats.echo_return_loss_enhancement = absl::optional<double>( >- metrics.echo_return_loss_enhancement.instant); >- } >+ int delay_median, delay_std; >+ float fraction_poor_delays; >+ if (private_submodules_->echo_cancellation->GetDelayMetrics( >+ &delay_median, &delay_std, &fraction_poor_delays) == >+ Error::kNoError) { >+ if (delay_median >= 0) { >+ stats.delay_median_ms = absl::optional<int32_t>(delay_median); > } >- if (config_.residual_echo_detector.enabled) { >- rtc::CritScope cs_capture(&crit_capture_); >- RTC_DCHECK(private_submodules_->echo_detector); >- auto ed_metrics = private_submodules_->echo_detector->GetMetrics(); >- stats.residual_echo_likelihood = ed_metrics.echo_likelihood; >- stats.residual_echo_likelihood_recent_max = >- ed_metrics.echo_likelihood_recent_max; >- } >- int delay_median, delay_std; >- float fraction_poor_delays; >- if (public_submodules_->echo_cancellation->GetDelayMetrics( >- &delay_median, &delay_std, &fraction_poor_delays) == >- Error::kNoError) { >- if (delay_median >= 0) { >- stats.delay_median_ms = absl::optional<int32_t>(delay_median); >- } >- if (delay_std >= 0) { >- 
stats.delay_standard_deviation_ms = absl::optional<int32_t>(delay_std); >- } >+ if (delay_std >= 0) { >+ stats.delay_standard_deviation_ms = absl::optional<int32_t>(delay_std); > } > } > return stats; >@@ -1731,10 +1658,6 @@ GainControl* AudioProcessingImpl::gain_control() const { > return public_submodules_->gain_control.get(); > } > >-HighPassFilter* AudioProcessingImpl::high_pass_filter() const { >- return high_pass_filter_impl_.get(); >-} >- > LevelEstimator* AudioProcessingImpl::level_estimator() const { > return public_submodules_->level_estimator.get(); > } >@@ -1764,8 +1687,8 @@ AudioProcessing::Config AudioProcessingImpl::GetConfig() const { > bool AudioProcessingImpl::UpdateActiveSubmoduleStates() { > return submodule_states_.Update( > config_.high_pass_filter.enabled, >- public_submodules_->echo_cancellation->is_enabled(), >- public_submodules_->echo_control_mobile->is_enabled(), >+ private_submodules_->echo_cancellation->is_enabled(), >+ private_submodules_->echo_control_mobile->is_enabled(), > config_.residual_echo_detector.enabled, > public_submodules_->noise_suppression->is_enabled(), > public_submodules_->gain_control->is_enabled(), >@@ -1852,15 +1775,15 @@ void AudioProcessingImpl::InitializePreProcessor() { > void AudioProcessingImpl::MaybeUpdateHistograms() { > static const int kMinDiffDelayMs = 60; > >- if (public_submodules_->echo_cancellation->is_enabled()) { >+ if (private_submodules_->echo_cancellation->is_enabled()) { > // Activate delay_jumps_ counters if we know echo_cancellation is running. > // If a stream has echo we know that the echo_cancellation is in process. 
> if (capture_.stream_delay_jumps == -1 && >- public_submodules_->echo_cancellation->stream_has_echo()) { >+ private_submodules_->echo_cancellation->stream_has_echo()) { > capture_.stream_delay_jumps = 0; > } > if (capture_.aec_system_delay_jumps == -1 && >- public_submodules_->echo_cancellation->stream_has_echo()) { >+ private_submodules_->echo_cancellation->stream_has_echo()) { > capture_.aec_system_delay_jumps = 0; > } > >@@ -1883,7 +1806,7 @@ void AudioProcessingImpl::MaybeUpdateHistograms() { > rtc::CheckedDivExact(capture_nonlocked_.split_rate, 1000); > RTC_DCHECK_LT(0, samples_per_ms); > const int aec_system_delay_ms = >- public_submodules_->echo_cancellation->GetSystemDelayInSamples() / >+ private_submodules_->echo_cancellation->GetSystemDelayInSamples() / > samples_per_ms; > const int diff_aec_system_delay_ms = > aec_system_delay_ms - capture_.last_aec_system_delay_ms; >@@ -1927,7 +1850,7 @@ void AudioProcessingImpl::WriteAecDumpConfigMessage(bool forced) { > return; > } > std::string experiments_description = >- public_submodules_->echo_cancellation->GetExperimentsDescription(); >+ private_submodules_->echo_cancellation->GetExperimentsDescription(); > // TODO(peah): Add semicolon-separated concatenations of experiment > // descriptions for other submodules. 
> if (constants_.agc_clipped_level_min != kClippedLevelMin) { >@@ -1942,22 +1865,22 @@ void AudioProcessingImpl::WriteAecDumpConfigMessage(bool forced) { > > InternalAPMConfig apm_config; > >- apm_config.aec_enabled = public_submodules_->echo_cancellation->is_enabled(); >+ apm_config.aec_enabled = private_submodules_->echo_cancellation->is_enabled(); > apm_config.aec_delay_agnostic_enabled = >- public_submodules_->echo_cancellation->is_delay_agnostic_enabled(); >+ private_submodules_->echo_cancellation->is_delay_agnostic_enabled(); > apm_config.aec_drift_compensation_enabled = >- public_submodules_->echo_cancellation->is_drift_compensation_enabled(); >+ private_submodules_->echo_cancellation->is_drift_compensation_enabled(); > apm_config.aec_extended_filter_enabled = >- public_submodules_->echo_cancellation->is_extended_filter_enabled(); >+ private_submodules_->echo_cancellation->is_extended_filter_enabled(); > apm_config.aec_suppression_level = static_cast<int>( >- public_submodules_->echo_cancellation->suppression_level()); >+ private_submodules_->echo_cancellation->suppression_level()); > > apm_config.aecm_enabled = >- public_submodules_->echo_control_mobile->is_enabled(); >+ private_submodules_->echo_control_mobile->is_enabled(); > apm_config.aecm_comfort_noise_enabled = >- public_submodules_->echo_control_mobile->is_comfort_noise_enabled(); >- apm_config.aecm_routing_mode = >- static_cast<int>(public_submodules_->echo_control_mobile->routing_mode()); >+ private_submodules_->echo_control_mobile->is_comfort_noise_enabled(); >+ apm_config.aecm_routing_mode = static_cast<int>( >+ private_submodules_->echo_control_mobile->routing_mode()); > > apm_config.agc_enabled = public_submodules_->gain_control->is_enabled(); > apm_config.agc_mode = >@@ -2032,7 +1955,7 @@ void AudioProcessingImpl::RecordAudioProcessingState() { > AecDump::AudioProcessingState audio_proc_state; > audio_proc_state.delay = capture_nonlocked_.stream_delay_ms; > audio_proc_state.drift = >- 
public_submodules_->echo_cancellation->stream_drift_samples(); >+ private_submodules_->echo_cancellation->stream_drift_samples(); > audio_proc_state.level = gain_control()->stream_analog_level(); > audio_proc_state.keypress = capture_.key_pressed; > aec_dump_->AddAudioProcessingState(audio_proc_state); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_processing_impl.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_processing_impl.h >index 50cb82bf13ae1752df518e44197ac0eb9fb7a7b8..2f946c5e13d89204df74621beb42c48048d2b8e8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_processing_impl.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_processing_impl.h >@@ -18,6 +18,7 @@ > #include "modules/audio_processing/audio_buffer.h" > #include "modules/audio_processing/include/aec_dump.h" > #include "modules/audio_processing/include/audio_processing.h" >+#include "modules/audio_processing/include/audio_processing_statistics.h" > #include "modules/audio_processing/render_queue_item_verifier.h" > #include "modules/audio_processing/rms_level.h" > #include "rtc_base/criticalsection.h" >@@ -109,7 +110,6 @@ class AudioProcessingImpl : public AudioProcessing { > bool was_stream_delay_set() const override > RTC_EXCLUSIVE_LOCKS_REQUIRED(crit_capture_); > >- AudioProcessingStatistics GetStatistics() const override; > AudioProcessingStats GetStatistics(bool has_remote_tracks) const override; > > // Methods returning pointers to APM submodules. >@@ -118,8 +118,6 @@ class AudioProcessingImpl : public AudioProcessing { > // created only once in a single-treaded manner > // during APM creation). > GainControl* gain_control() const override; >- // TODO(peah): Deprecate this API call. 
>- HighPassFilter* high_pass_filter() const override; > LevelEstimator* level_estimator() const override; > NoiseSuppression* noise_suppression() const override; > VoiceDetection* voice_detection() const override; >@@ -164,9 +162,6 @@ class AudioProcessingImpl : public AudioProcessing { > RuntimeSettingEnqueuer capture_runtime_settings_enqueuer_; > RuntimeSettingEnqueuer render_runtime_settings_enqueuer_; > >- // Submodule interface implementations. >- std::unique_ptr<HighPassFilter> high_pass_filter_impl_; >- > // EchoControl factory. > std::unique_ptr<EchoControlFactory> echo_control_factory_; > >@@ -396,6 +391,7 @@ class AudioProcessingImpl : public AudioProcessing { > bool echo_path_gain_change; > int prev_analog_mic_level; > float prev_pre_amp_gain; >+ AudioProcessingStats stats; > } capture_ RTC_GUARDED_BY(crit_capture_); > > struct ApmCaptureNonLockedState { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_processing_impl_locking_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_processing_impl_locking_unittest.cc >index 0e0a33f5700693f134274d7737afa3c5440891b7..9685ef95e9cf571ad60c2c8001f73e81f4359157 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_processing_impl_locking_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_processing_impl_locking_unittest.cc >@@ -487,10 +487,7 @@ void PopulateAudioFrame(AudioFrame* frame, > } > > AudioProcessingImplLockTest::AudioProcessingImplLockTest() >- : test_complete_(false, false), >- render_call_event_(false, false), >- capture_call_event_(false, false), >- render_thread_(RenderProcessorThreadFunc, this, "render"), >+ : render_thread_(RenderProcessorThreadFunc, this, "render"), > capture_thread_(CaptureProcessorThreadFunc, this, "capture"), > stats_thread_(StatsProcessorThreadFunc, this, "stats"), > apm_(AudioProcessingBuilder().Create()), >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_processing_performance_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_processing_performance_unittest.cc >index 95f736c22f168bfdb87267839d7b042ea3fe9916..84cb5743072706b01f88cc21418e6e3e17cc9d80 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_processing_performance_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_processing_performance_unittest.cc >@@ -18,11 +18,11 @@ > #include "api/array_view.h" > #include "modules/audio_processing/test/test_utils.h" > #include "rtc_base/atomicops.h" >+#include "rtc_base/event.h" > #include "rtc_base/numerics/safe_conversions.h" > #include "rtc_base/platform_thread.h" > #include "rtc_base/random.h" > #include "system_wrappers/include/clock.h" >-#include "system_wrappers/include/event_wrapper.h" > #include "test/gtest.h" > #include "test/testsupport/perf_test.h" > >@@ -391,8 +391,7 @@ class TimedThreadApiProcessor { > class CallSimulator : public ::testing::TestWithParam<SimulationConfig> { > public: > CallSimulator() >- : test_complete_(EventWrapper::Create()), >- render_thread_( >+ : render_thread_( > new rtc::PlatformThread(RenderProcessorThreadFunc, this, "render")), > capture_thread_(new rtc::PlatformThread(CaptureProcessorThreadFunc, > this, >@@ -401,10 +400,10 @@ class CallSimulator : public ::testing::TestWithParam<SimulationConfig> { > simulation_config_(static_cast<SimulationConfig>(GetParam())) {} > > // Run the call simulation with a timeout. >- EventTypeWrapper Run() { >+ bool Run() { > StartThreads(); > >- EventTypeWrapper result = test_complete_->Wait(kTestTimeout); >+ bool result = test_complete_.Wait(kTestTimeout); > > StopThreads(); > >@@ -420,7 +419,7 @@ class CallSimulator : public ::testing::TestWithParam<SimulationConfig> { > // done. 
> bool MaybeEndTest() { > if (frame_counters_.BothCountersExceedeThreshold(kMinNumFramesToProcess)) { >- test_complete_->Set(); >+ test_complete_.Set(); > return true; > } > return false; >@@ -570,7 +569,7 @@ class CallSimulator : public ::testing::TestWithParam<SimulationConfig> { > } > > // Event handler for the test. >- const std::unique_ptr<EventWrapper> test_complete_; >+ rtc::Event test_complete_; > > // Thread related variables. > std::unique_ptr<rtc::PlatformThread> render_thread_; >@@ -619,7 +618,7 @@ const float CallSimulator::kCaptureInputFloatLevel = 0.03125f; > // TODO(peah): Reactivate once issue 7712 has been resolved. > TEST_P(CallSimulator, DISABLED_ApiCallDurationTest) { > // Run test and verify that it did not time out. >- EXPECT_EQ(kEventSignaled, Run()); >+ EXPECT_TRUE(Run()); > } > > INSTANTIATE_TEST_CASE_P( >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_processing_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_processing_unittest.cc >index 3f0decc957a634214f3d393f42c4518819b79b81..6809ab9dcd9e3140e02906e47c48a2410ad14c2b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_processing_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/audio_processing_unittest.cc >@@ -1278,7 +1278,6 @@ TEST_F(ApmTest, AllProcessingDisabledByDefault) { > EXPECT_FALSE(config.echo_canceller.enabled); > EXPECT_FALSE(config.high_pass_filter.enabled); > EXPECT_FALSE(apm_->gain_control()->is_enabled()); >- EXPECT_FALSE(apm_->high_pass_filter()->is_enabled()); > EXPECT_FALSE(apm_->level_estimator()->is_enabled()); > EXPECT_FALSE(apm_->noise_suppression()->is_enabled()); > EXPECT_FALSE(apm_->voice_detection()->is_enabled()); >@@ -2802,4 +2801,42 @@ TEST(MAYBE_ApmStatistics, AECMEnabledTest) { > EXPECT_FALSE(stats.delay_median_ms); > EXPECT_FALSE(stats.delay_standard_deviation_ms); > } >+ >+TEST(ApmStatistics, 
ReportOutputRmsDbfs) { >+ ProcessingConfig processing_config = { >+ {{32000, 1}, {32000, 1}, {32000, 1}, {32000, 1}}}; >+ AudioProcessing::Config config; >+ >+ // Set up an audioframe. >+ AudioFrame frame; >+ frame.num_channels_ = 1; >+ SetFrameSampleRate(&frame, AudioProcessing::NativeRate::kSampleRate48kHz); >+ >+ // Fill the audio frame with a sawtooth pattern. >+ int16_t* ptr = frame.mutable_data(); >+ for (size_t i = 0; i < frame.kMaxDataSizeSamples; i++) { >+ ptr[i] = 10000 * ((i % 3) - 1); >+ } >+ >+ std::unique_ptr<AudioProcessing> apm(AudioProcessingBuilder().Create()); >+ apm->Initialize(processing_config); >+ >+ // If not enabled, no metric should be reported. >+ EXPECT_EQ(apm->ProcessStream(&frame), 0); >+ EXPECT_FALSE(apm->GetStatistics(false).output_rms_dbfs); >+ >+ // If enabled, metrics should be reported. >+ config.level_estimation.enabled = true; >+ apm->ApplyConfig(config); >+ EXPECT_EQ(apm->ProcessStream(&frame), 0); >+ auto stats = apm->GetStatistics(false); >+ EXPECT_TRUE(stats.output_rms_dbfs); >+ EXPECT_GE(*stats.output_rms_dbfs, 0); >+ >+ // If re-disabled, the value is again not reported. 
>+ config.level_estimation.enabled = false; >+ apm->ApplyConfig(config); >+ EXPECT_EQ(apm->ProcessStream(&frame), 0); >+ EXPECT_FALSE(apm->GetStatistics(false).output_rms_dbfs); >+} > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_cancellation_bit_exact_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_cancellation_bit_exact_unittest.cc >index 7f60c4b5d83881d46e7fa2a63d18cf01a68b4ecc..af0741ee178496a1319e9ca4b531735d5b72b707 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_cancellation_bit_exact_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_cancellation_bit_exact_unittest.cc >@@ -74,9 +74,7 @@ void RunBitexactnessTest( > EchoCancellationImpl::SuppressionLevel suppression_level, > bool stream_has_echo_reference, > const rtc::ArrayView<const float>& output_reference) { >- rtc::CriticalSection crit_render; >- rtc::CriticalSection crit_capture; >- EchoCancellationImpl echo_canceller(&crit_render, &crit_capture); >+ EchoCancellationImpl echo_canceller; > SetupComponent(sample_rate_hz, suppression_level, drift_compensation_enabled, > &echo_canceller); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_cancellation_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_cancellation_impl.cc >index e6fe7691c4903d9279ab01be53796ffc724697e2..73fe51b752f66d2c0f8a251c29ee6af14aa34286 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_cancellation_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_cancellation_impl.cc >@@ -10,11 +10,13 @@ > > #include "modules/audio_processing/echo_cancellation_impl.h" > >+#include <stdint.h> > #include <string.h> > > #include "modules/audio_processing/aec/aec_core.h" > #include "modules/audio_processing/aec/echo_cancellation.h" > 
#include "modules/audio_processing/audio_buffer.h" >+#include "modules/audio_processing/include/config.h" > #include "rtc_base/checks.h" > #include "system_wrappers/include/field_trial.h" > >@@ -103,11 +105,8 @@ class EchoCancellationImpl::Canceller { > void* state_; > }; > >-EchoCancellationImpl::EchoCancellationImpl(rtc::CriticalSection* crit_render, >- rtc::CriticalSection* crit_capture) >- : crit_render_(crit_render), >- crit_capture_(crit_capture), >- drift_compensation_enabled_(false), >+EchoCancellationImpl::EchoCancellationImpl() >+ : drift_compensation_enabled_(false), > metrics_enabled_(true), > suppression_level_(kHighSuppression), > stream_drift_samples_(0), >@@ -116,16 +115,12 @@ EchoCancellationImpl::EchoCancellationImpl(rtc::CriticalSection* crit_render, > delay_logging_enabled_(true), > extended_filter_enabled_(false), > delay_agnostic_enabled_(false), >- enforce_zero_stream_delay_(EnforceZeroStreamDelay()) { >- RTC_DCHECK(crit_render); >- RTC_DCHECK(crit_capture); >-} >+ enforce_zero_stream_delay_(EnforceZeroStreamDelay()) {} > > EchoCancellationImpl::~EchoCancellationImpl() = default; > > void EchoCancellationImpl::ProcessRenderAudio( > rtc::ArrayView<const float> packed_render_audio) { >- rtc::CritScope cs_capture(crit_capture_); > if (!enabled_) { > return; > } >@@ -149,7 +144,6 @@ void EchoCancellationImpl::ProcessRenderAudio( > > int EchoCancellationImpl::ProcessCaptureAudio(AudioBuffer* audio, > int stream_delay_ms) { >- rtc::CritScope cs_capture(crit_capture_); > if (!enabled_) { > return AudioProcessing::kNoError; > } >@@ -206,10 +200,6 @@ int EchoCancellationImpl::ProcessCaptureAudio(AudioBuffer* audio, > } > > int EchoCancellationImpl::Enable(bool enable) { >- // Run in a single-threaded manner. >- rtc::CritScope cs_render(crit_render_); >- rtc::CritScope cs_capture(crit_capture_); >- > if (enable && !enabled_) { > enabled_ = enable; // Must be set before Initialize() is called. 
> >@@ -227,68 +217,52 @@ int EchoCancellationImpl::Enable(bool enable) { > } > > bool EchoCancellationImpl::is_enabled() const { >- rtc::CritScope cs(crit_capture_); > return enabled_; > } > > int EchoCancellationImpl::set_suppression_level(SuppressionLevel level) { >- { >- if (MapSetting(level) == -1) { >- return AudioProcessing::kBadParameterError; >- } >- rtc::CritScope cs(crit_capture_); >- suppression_level_ = level; >+ if (MapSetting(level) == -1) { >+ return AudioProcessing::kBadParameterError; > } >+ suppression_level_ = level; > return Configure(); > } > > EchoCancellationImpl::SuppressionLevel EchoCancellationImpl::suppression_level() > const { >- rtc::CritScope cs(crit_capture_); > return suppression_level_; > } > > int EchoCancellationImpl::enable_drift_compensation(bool enable) { >- { >- rtc::CritScope cs(crit_capture_); >- drift_compensation_enabled_ = enable; >- } >+ drift_compensation_enabled_ = enable; > return Configure(); > } > > bool EchoCancellationImpl::is_drift_compensation_enabled() const { >- rtc::CritScope cs(crit_capture_); > return drift_compensation_enabled_; > } > > void EchoCancellationImpl::set_stream_drift_samples(int drift) { >- rtc::CritScope cs(crit_capture_); > was_stream_drift_set_ = true; > stream_drift_samples_ = drift; > } > > int EchoCancellationImpl::stream_drift_samples() const { >- rtc::CritScope cs(crit_capture_); > return stream_drift_samples_; > } > > int EchoCancellationImpl::enable_metrics(bool enable) { >- { >- rtc::CritScope cs(crit_capture_); >- metrics_enabled_ = enable; >- } >+ metrics_enabled_ = enable; > return Configure(); > } > > bool EchoCancellationImpl::are_metrics_enabled() const { >- rtc::CritScope cs(crit_capture_); > return enabled_ && metrics_enabled_; > } > > // TODO(ajm): we currently just use the metrics from the first AEC. Think more > // aboue the best way to extend this to multi-channel. 
> int EchoCancellationImpl::GetMetrics(Metrics* metrics) { >- rtc::CritScope cs(crit_capture_); > if (metrics == NULL) { > return AudioProcessing::kNullPointerError; > } >@@ -331,46 +305,36 @@ int EchoCancellationImpl::GetMetrics(Metrics* metrics) { > } > > bool EchoCancellationImpl::stream_has_echo() const { >- rtc::CritScope cs(crit_capture_); > return stream_has_echo_; > } > > int EchoCancellationImpl::enable_delay_logging(bool enable) { >- { >- rtc::CritScope cs(crit_capture_); >- delay_logging_enabled_ = enable; >- } >+ delay_logging_enabled_ = enable; > return Configure(); > } > > bool EchoCancellationImpl::is_delay_logging_enabled() const { >- rtc::CritScope cs(crit_capture_); > return enabled_ && delay_logging_enabled_; > } > > bool EchoCancellationImpl::is_delay_agnostic_enabled() const { >- rtc::CritScope cs(crit_capture_); > return delay_agnostic_enabled_; > } > > std::string EchoCancellationImpl::GetExperimentsDescription() { >- rtc::CritScope cs(crit_capture_); > return refined_adaptive_filter_enabled_ ? "RefinedAdaptiveFilter;" : ""; > } > > bool EchoCancellationImpl::is_refined_adaptive_filter_enabled() const { >- rtc::CritScope cs(crit_capture_); > return refined_adaptive_filter_enabled_; > } > > bool EchoCancellationImpl::is_extended_filter_enabled() const { >- rtc::CritScope cs(crit_capture_); > return extended_filter_enabled_; > } > > // TODO(bjornv): How should we handle the multi-channel case? 
> int EchoCancellationImpl::GetDelayMetrics(int* median, int* std) { >- rtc::CritScope cs(crit_capture_); > float fraction_poor_delays = 0; > return GetDelayMetrics(median, std, &fraction_poor_delays); > } >@@ -378,7 +342,6 @@ int EchoCancellationImpl::GetDelayMetrics(int* median, int* std) { > int EchoCancellationImpl::GetDelayMetrics(int* median, > int* std, > float* fraction_poor_delays) { >- rtc::CritScope cs(crit_capture_); > if (median == NULL) { > return AudioProcessing::kNullPointerError; > } >@@ -400,7 +363,6 @@ int EchoCancellationImpl::GetDelayMetrics(int* median, > } > > struct AecCore* EchoCancellationImpl::aec_core() const { >- rtc::CritScope cs(crit_capture_); > if (!enabled_) { > return NULL; > } >@@ -411,9 +373,6 @@ void EchoCancellationImpl::Initialize(int sample_rate_hz, > size_t num_reverse_channels, > size_t num_output_channels, > size_t num_proc_channels) { >- rtc::CritScope cs_render(crit_render_); >- rtc::CritScope cs_capture(crit_capture_); >- > stream_properties_.reset( > new StreamProperties(sample_rate_hz, num_reverse_channels, > num_output_channels, num_proc_channels)); >@@ -442,7 +401,6 @@ void EchoCancellationImpl::Initialize(int sample_rate_hz, > } > > int EchoCancellationImpl::GetSystemDelayInSamples() const { >- rtc::CritScope cs(crit_capture_); > RTC_DCHECK(enabled_); > // Report the delay for the first AEC component. 
> return WebRtcAec_system_delay(WebRtcAec_aec_core(cancellers_[0]->state())); >@@ -471,7 +429,6 @@ void EchoCancellationImpl::PackRenderAudioBuffer( > > void EchoCancellationImpl::SetExtraOptions(const webrtc::Config& config) { > { >- rtc::CritScope cs(crit_capture_); > extended_filter_enabled_ = config.Get<ExtendedFilter>().enabled; > delay_agnostic_enabled_ = config.Get<DelayAgnostic>().enabled; > refined_adaptive_filter_enabled_ = >@@ -481,8 +438,6 @@ void EchoCancellationImpl::SetExtraOptions(const webrtc::Config& config) { > } > > int EchoCancellationImpl::Configure() { >- rtc::CritScope cs_render(crit_render_); >- rtc::CritScope cs_capture(crit_capture_); > AecConfig config; > config.metricsMode = metrics_enabled_; > config.nlpMode = MapSetting(suppression_level_); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_cancellation_impl.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_cancellation_impl.h >index 091a7b5116a2ad7e618a9cfe1847413e7f442ba2..a8b43a8cd62b8ec151b80de0b0ae64913677ffc1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_cancellation_impl.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_cancellation_impl.h >@@ -11,12 +11,16 @@ > #ifndef MODULES_AUDIO_PROCESSING_ECHO_CANCELLATION_IMPL_H_ > #define MODULES_AUDIO_PROCESSING_ECHO_CANCELLATION_IMPL_H_ > >+#include <stddef.h> > #include <memory> >+#include <string> > #include <vector> > >+#include "api/array_view.h" > #include "modules/audio_processing/include/audio_processing.h" > #include "rtc_base/constructormagic.h" > #include "rtc_base/criticalsection.h" >+#include "rtc_base/thread_annotations.h" > > namespace webrtc { > >@@ -28,8 +32,7 @@ class AudioBuffer; > // for PC and IP phone applications. 
> class EchoCancellationImpl { > public: >- EchoCancellationImpl(rtc::CriticalSection* crit_render, >- rtc::CriticalSection* crit_capture); >+ explicit EchoCancellationImpl(); > ~EchoCancellationImpl(); > > void ProcessRenderAudio(rtc::ArrayView<const float> packed_render_audio); >@@ -80,17 +83,23 @@ class EchoCancellationImpl { > // P_a: Internal signal power at the point before the AEC's non-linear > // processor. > struct Metrics { >+ struct Statistic { >+ int instant = 0; // Instantaneous value. >+ int average = 0; // Long-term average. >+ int maximum = 0; // Long-term maximum. >+ int minimum = 0; // Long-term minimum. >+ }; > // RERL = ERL + ERLE >- AudioProcessing::Statistic residual_echo_return_loss; >+ Statistic residual_echo_return_loss; > > // ERL = 10log_10(P_far / P_echo) >- AudioProcessing::Statistic echo_return_loss; >+ Statistic echo_return_loss; > > // ERLE = 10log_10(P_echo / P_out) >- AudioProcessing::Statistic echo_return_loss_enhancement; >+ Statistic echo_return_loss_enhancement; > > // (Pre non-linear processing suppression) A_NLP = 10log_10(P_echo / P_a) >- AudioProcessing::Statistic a_nlp; >+ Statistic a_nlp; > > // Fraction of time that the AEC linear filter is divergent, in a 1-second > // non-overlapped aggregation window. 
>@@ -150,28 +159,23 @@ class EchoCancellationImpl { > void AllocateRenderQueue(); > int Configure(); > >- rtc::CriticalSection* const crit_render_ RTC_ACQUIRED_BEFORE(crit_capture_); >- rtc::CriticalSection* const crit_capture_; >- > bool enabled_ = false; >- bool drift_compensation_enabled_ RTC_GUARDED_BY(crit_capture_); >- bool metrics_enabled_ RTC_GUARDED_BY(crit_capture_); >- SuppressionLevel suppression_level_ RTC_GUARDED_BY(crit_capture_); >- int stream_drift_samples_ RTC_GUARDED_BY(crit_capture_); >- bool was_stream_drift_set_ RTC_GUARDED_BY(crit_capture_); >- bool stream_has_echo_ RTC_GUARDED_BY(crit_capture_); >- bool delay_logging_enabled_ RTC_GUARDED_BY(crit_capture_); >- bool extended_filter_enabled_ RTC_GUARDED_BY(crit_capture_); >- bool delay_agnostic_enabled_ RTC_GUARDED_BY(crit_capture_); >- bool refined_adaptive_filter_enabled_ RTC_GUARDED_BY(crit_capture_) = false; >+ bool drift_compensation_enabled_; >+ bool metrics_enabled_; >+ SuppressionLevel suppression_level_; >+ int stream_drift_samples_; >+ bool was_stream_drift_set_; >+ bool stream_has_echo_; >+ bool delay_logging_enabled_; >+ bool extended_filter_enabled_; >+ bool delay_agnostic_enabled_; >+ bool refined_adaptive_filter_enabled_ = false; > > // Only active on Chrome OS devices. 
>- const bool enforce_zero_stream_delay_ RTC_GUARDED_BY(crit_capture_); >+ const bool enforce_zero_stream_delay_; > > std::vector<std::unique_ptr<Canceller>> cancellers_; > std::unique_ptr<StreamProperties> stream_properties_; >- >- RTC_DISALLOW_IMPLICIT_CONSTRUCTORS(EchoCancellationImpl); > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_cancellation_impl_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_cancellation_impl_unittest.cc >index c50c52f857bf7351b6b81515c7b97bd5758dda5e..9d81c1df3b76b23a19d0aaa106dcc52f0c4d0cdb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_cancellation_impl_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_cancellation_impl_unittest.cc >@@ -18,9 +18,7 @@ > > namespace webrtc { > TEST(EchoCancellationInternalTest, ExtendedFilter) { >- rtc::CriticalSection crit_render; >- rtc::CriticalSection crit_capture; >- EchoCancellationImpl echo_canceller(&crit_render, &crit_capture); >+ EchoCancellationImpl echo_canceller; > echo_canceller.Initialize(AudioProcessing::kSampleRate32kHz, 2, 2, 2); > > EXPECT_TRUE(echo_canceller.aec_core() == nullptr); >@@ -51,9 +49,7 @@ TEST(EchoCancellationInternalTest, ExtendedFilter) { > } > > TEST(EchoCancellationInternalTest, DelayAgnostic) { >- rtc::CriticalSection crit_render; >- rtc::CriticalSection crit_capture; >- EchoCancellationImpl echo_canceller(&crit_render, &crit_capture); >+ EchoCancellationImpl echo_canceller; > echo_canceller.Initialize(AudioProcessing::kSampleRate32kHz, 1, 1, 1); > > EXPECT_TRUE(echo_canceller.aec_core() == NULL); >@@ -85,9 +81,7 @@ TEST(EchoCancellationInternalTest, DelayAgnostic) { > } > > TEST(EchoCancellationInternalTest, InterfaceConfiguration) { >- rtc::CriticalSection crit_render; >- rtc::CriticalSection crit_capture; >- EchoCancellationImpl echo_canceller(&crit_render, &crit_capture); >+ 
EchoCancellationImpl echo_canceller; > echo_canceller.Initialize(AudioProcessing::kSampleRate16kHz, 1, 1, 1); > > EXPECT_EQ(0, echo_canceller.enable_drift_compensation(true)); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_control_mobile_bit_exact_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_control_mobile_bit_exact_unittest.cc >index 8b8369260a55651dec20e891d9639e92af17b51e..7844daf5225266acfe0dbcc461d9c8ef0c7e0e96 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_control_mobile_bit_exact_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_control_mobile_bit_exact_unittest.cc >@@ -64,9 +64,7 @@ void RunBitexactnessTest(int sample_rate_hz, > EchoControlMobileImpl::RoutingMode routing_mode, > bool comfort_noise_enabled, > const rtc::ArrayView<const float>& output_reference) { >- rtc::CriticalSection crit_render; >- rtc::CriticalSection crit_capture; >- EchoControlMobileImpl echo_control_mobile(&crit_render, &crit_capture); >+ EchoControlMobileImpl echo_control_mobile; > SetupComponent(sample_rate_hz, routing_mode, comfort_noise_enabled, > &echo_control_mobile); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_control_mobile_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_control_mobile_impl.cc >index bd125c6e0de09a457eb25ba9ada59dd2e3e5ea71..b9fbf42052797dffaa3d639e1620268b08560801 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_control_mobile_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_control_mobile_impl.cc >@@ -11,11 +11,13 @@ > #include "modules/audio_processing/echo_control_mobile_impl.h" > > #include <string.h> >+#include <cstdint> > > #include "modules/audio_processing/aecm/echo_control_mobile.h" > #include "modules/audio_processing/audio_buffer.h" 
>+#include "modules/audio_processing/include/audio_processing.h" >+#include "rtc_base/checks.h" > #include "rtc_base/constructormagic.h" >-#include "rtc_base/logging.h" > > namespace webrtc { > >@@ -55,10 +57,6 @@ AudioProcessing::Error MapError(int err) { > } > } // namespace > >-size_t EchoControlMobileImpl::echo_path_size_bytes() { >- return WebRtcAecm_echo_path_size_bytes(); >-} >- > struct EchoControlMobileImpl::StreamProperties { > StreamProperties() = delete; > StreamProperties(int sample_rate_hz, >@@ -90,17 +88,10 @@ class EchoControlMobileImpl::Canceller { > return state_; > } > >- void Initialize(int sample_rate_hz, >- unsigned char* external_echo_path, >- size_t echo_path_size_bytes) { >+ void Initialize(int sample_rate_hz) { > RTC_DCHECK(state_); > int error = WebRtcAecm_Init(state_, sample_rate_hz); > RTC_DCHECK_EQ(AudioProcessing::kNoError, error); >- if (external_echo_path != NULL) { >- error = WebRtcAecm_InitEchoPath(state_, external_echo_path, >- echo_path_size_bytes); >- RTC_DCHECK_EQ(AudioProcessing::kNoError, error); >- } > } > > private: >@@ -108,27 +99,13 @@ class EchoControlMobileImpl::Canceller { > RTC_DISALLOW_COPY_AND_ASSIGN(Canceller); > }; > >-EchoControlMobileImpl::EchoControlMobileImpl(rtc::CriticalSection* crit_render, >- rtc::CriticalSection* crit_capture) >- : crit_render_(crit_render), >- crit_capture_(crit_capture), >- routing_mode_(kSpeakerphone), >- comfort_noise_enabled_(false), >- external_echo_path_(NULL) { >- RTC_DCHECK(crit_render); >- RTC_DCHECK(crit_capture); >-} >+EchoControlMobileImpl::EchoControlMobileImpl() >+ : routing_mode_(kSpeakerphone), comfort_noise_enabled_(false) {} > >-EchoControlMobileImpl::~EchoControlMobileImpl() { >- if (external_echo_path_ != NULL) { >- delete[] external_echo_path_; >- external_echo_path_ = NULL; >- } >-} >+EchoControlMobileImpl::~EchoControlMobileImpl() {} > > void EchoControlMobileImpl::ProcessRenderAudio( > rtc::ArrayView<const int16_t> packed_render_audio) { >- rtc::CritScope 
cs_capture(crit_capture_); > if (!enabled_) { > return; > } >@@ -181,7 +158,6 @@ size_t EchoControlMobileImpl::NumCancellersRequired( > > int EchoControlMobileImpl::ProcessCaptureAudio(AudioBuffer* audio, > int stream_delay_ms) { >- rtc::CritScope cs_capture(crit_capture_); > if (!enabled_) { > return AudioProcessing::kNoError; > } >@@ -228,8 +204,6 @@ int EchoControlMobileImpl::ProcessCaptureAudio(AudioBuffer* audio, > > int EchoControlMobileImpl::Enable(bool enable) { > // Ensure AEC and AECM are not both enabled. >- rtc::CritScope cs_render(crit_render_); >- rtc::CritScope cs_capture(crit_capture_); > RTC_DCHECK(stream_properties_); > > if (enable && >@@ -252,7 +226,6 @@ int EchoControlMobileImpl::Enable(bool enable) { > } > > bool EchoControlMobileImpl::is_enabled() const { >- rtc::CritScope cs(crit_capture_); > return enabled_; > } > >@@ -260,90 +233,26 @@ int EchoControlMobileImpl::set_routing_mode(RoutingMode mode) { > if (MapSetting(mode) == -1) { > return AudioProcessing::kBadParameterError; > } >- >- { >- rtc::CritScope cs(crit_capture_); > routing_mode_ = mode; >- } > return Configure(); > } > > EchoControlMobileImpl::RoutingMode EchoControlMobileImpl::routing_mode() const { >- rtc::CritScope cs(crit_capture_); > return routing_mode_; > } > > int EchoControlMobileImpl::enable_comfort_noise(bool enable) { >- { >- rtc::CritScope cs(crit_capture_); > comfort_noise_enabled_ = enable; >- } > return Configure(); > } > > bool EchoControlMobileImpl::is_comfort_noise_enabled() const { >- rtc::CritScope cs(crit_capture_); > return comfort_noise_enabled_; > } > >-int EchoControlMobileImpl::SetEchoPath(const void* echo_path, >- size_t size_bytes) { >- { >- rtc::CritScope cs_render(crit_render_); >- rtc::CritScope cs_capture(crit_capture_); >- if (echo_path == NULL) { >- return AudioProcessing::kNullPointerError; >- } >- if (size_bytes != echo_path_size_bytes()) { >- // Size mismatch >- return AudioProcessing::kBadParameterError; >- } >- >- if (external_echo_path_ == 
NULL) { >- external_echo_path_ = new unsigned char[size_bytes]; >- } >- memcpy(external_echo_path_, echo_path, size_bytes); >- } >- >- // TODO(peah): Simplify once the Enable function has been removed from >- // the public APM API. >- RTC_DCHECK(stream_properties_); >- Initialize(stream_properties_->sample_rate_hz, >- stream_properties_->num_reverse_channels, >- stream_properties_->num_output_channels); >- return AudioProcessing::kNoError; >-} >- >-int EchoControlMobileImpl::GetEchoPath(void* echo_path, >- size_t size_bytes) const { >- rtc::CritScope cs(crit_capture_); >- if (echo_path == NULL) { >- return AudioProcessing::kNullPointerError; >- } >- if (size_bytes != echo_path_size_bytes()) { >- // Size mismatch >- return AudioProcessing::kBadParameterError; >- } >- if (!enabled_) { >- return AudioProcessing::kNotEnabledError; >- } >- >- // Get the echo path from the first channel >- int32_t err = >- WebRtcAecm_GetEchoPath(cancellers_[0]->state(), echo_path, size_bytes); >- if (err != 0) { >- return MapError(err); >- } >- >- return AudioProcessing::kNoError; >-} >- > void EchoControlMobileImpl::Initialize(int sample_rate_hz, > size_t num_reverse_channels, > size_t num_output_channels) { >- rtc::CritScope cs_render(crit_render_); >- rtc::CritScope cs_capture(crit_capture_); >- > stream_properties_.reset(new StreamProperties( > sample_rate_hz, num_reverse_channels, num_output_channels)); > >@@ -363,16 +272,12 @@ void EchoControlMobileImpl::Initialize(int sample_rate_hz, > if (!canceller) { > canceller.reset(new Canceller()); > } >- canceller->Initialize(sample_rate_hz, external_echo_path_, >- echo_path_size_bytes()); >+ canceller->Initialize(sample_rate_hz); > } >- > Configure(); > } > > int EchoControlMobileImpl::Configure() { >- rtc::CritScope cs_render(crit_render_); >- rtc::CritScope cs_capture(crit_capture_); > AecmConfig config; > config.cngMode = comfort_noise_enabled_; > config.echoMode = MapSetting(routing_mode_); >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_control_mobile_impl.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_control_mobile_impl.h >index a5b66c8311b80447a91d557a749ffe702727abbb..d5a8b0b445155e655e42b3702f1f37523f7a2f0e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_control_mobile_impl.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_control_mobile_impl.h >@@ -11,14 +11,12 @@ > #ifndef MODULES_AUDIO_PROCESSING_ECHO_CONTROL_MOBILE_IMPL_H_ > #define MODULES_AUDIO_PROCESSING_ECHO_CONTROL_MOBILE_IMPL_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <memory> > #include <vector> > >-#include "modules/audio_processing/include/audio_processing.h" >-#include "modules/audio_processing/render_queue_item_verifier.h" >-#include "rtc_base/constructormagic.h" >-#include "rtc_base/criticalsection.h" >-#include "rtc_base/swap_queue.h" >+#include "api/array_view.h" > > namespace webrtc { > >@@ -28,8 +26,7 @@ class AudioBuffer; > // robust option intended for use on mobile devices. > class EchoControlMobileImpl { > public: >- EchoControlMobileImpl(rtc::CriticalSection* crit_render, >- rtc::CriticalSection* crit_capture); >+ EchoControlMobileImpl(); > > ~EchoControlMobileImpl(); > >@@ -57,26 +54,6 @@ class EchoControlMobileImpl { > int enable_comfort_noise(bool enable); > bool is_comfort_noise_enabled() const; > >- // A typical use case is to initialize the component with an echo path from a >- // previous call. The echo path is retrieved using |GetEchoPath()|, typically >- // at the end of a call. The data can then be stored for later use as an >- // initializer before the next call, using |SetEchoPath()|. >- // >- // Controlling the echo path this way requires the data |size_bytes| to match >- // the internal echo path size. This size can be acquired using >- // |echo_path_size_bytes()|. 
|SetEchoPath()| causes an entire reset, worth >- // noting if it is to be called during an ongoing call. >- // >- // It is possible that version incompatibilities may result in a stored echo >- // path of the incorrect size. In this case, the stored path should be >- // discarded. >- int SetEchoPath(const void* echo_path, size_t size_bytes); >- int GetEchoPath(void* echo_path, size_t size_bytes) const; >- >- // The returned path size is guaranteed not to change for the lifetime of >- // the application. >- static size_t echo_path_size_bytes(); >- > void ProcessRenderAudio(rtc::ArrayView<const int16_t> packed_render_audio); > int ProcessCaptureAudio(AudioBuffer* audio, int stream_delay_ms); > >@@ -98,20 +75,13 @@ class EchoControlMobileImpl { > > int Configure(); > >- rtc::CriticalSection* const crit_render_ RTC_ACQUIRED_BEFORE(crit_capture_); >- rtc::CriticalSection* const crit_capture_; >- > bool enabled_ = false; > >- RoutingMode routing_mode_ RTC_GUARDED_BY(crit_capture_); >- bool comfort_noise_enabled_ RTC_GUARDED_BY(crit_capture_); >- unsigned char* external_echo_path_ RTC_GUARDED_BY(crit_render_) >- RTC_GUARDED_BY(crit_capture_); >+ RoutingMode routing_mode_; >+ bool comfort_noise_enabled_; > > std::vector<std::unique_ptr<Canceller>> cancellers_; > std::unique_ptr<StreamProperties> stream_properties_; >- >- RTC_DISALLOW_IMPLICIT_CONSTRUCTORS(EchoControlMobileImpl); > }; > } // namespace webrtc > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_control_mobile_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_control_mobile_unittest.cc >index d7e470ca120f9cadda7feb81a21fe49570905312..f0e60483d411a3e4e43b5e5b88aa7616ab438cd3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_control_mobile_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_control_mobile_unittest.cc >@@ -18,9 +18,7 @@ > > namespace webrtc { > 
TEST(EchoControlMobileTest, InterfaceConfiguration) { >- rtc::CriticalSection crit_render; >- rtc::CriticalSection crit_capture; >- EchoControlMobileImpl aecm(&crit_render, &crit_capture); >+ EchoControlMobileImpl aecm; > aecm.Initialize(AudioProcessing::kSampleRate16kHz, 2, 2); > > // Turn AECM on >@@ -46,28 +44,6 @@ TEST(EchoControlMobileTest, InterfaceConfiguration) { > EXPECT_EQ(0, aecm.enable_comfort_noise(true)); > EXPECT_TRUE(aecm.is_comfort_noise_enabled()); > >- // Set and get echo path >- const size_t echo_path_size = aecm.echo_path_size_bytes(); >- std::vector<uint8_t> echo_path_in(echo_path_size); >- std::vector<uint8_t> echo_path_out(echo_path_size); >- EXPECT_EQ(AudioProcessing::kNullPointerError, >- aecm.SetEchoPath(nullptr, echo_path_size)); >- EXPECT_EQ(AudioProcessing::kNullPointerError, >- aecm.GetEchoPath(nullptr, echo_path_size)); >- EXPECT_EQ(AudioProcessing::kBadParameterError, >- aecm.GetEchoPath(echo_path_out.data(), 1)); >- EXPECT_EQ(0, aecm.GetEchoPath(echo_path_out.data(), echo_path_size)); >- for (size_t i = 0; i < echo_path_size; i++) { >- echo_path_in[i] = echo_path_out[i] + 1; >- } >- EXPECT_EQ(AudioProcessing::kBadParameterError, >- aecm.SetEchoPath(echo_path_in.data(), 1)); >- EXPECT_EQ(0, aecm.SetEchoPath(echo_path_in.data(), echo_path_size)); >- EXPECT_EQ(0, aecm.GetEchoPath(echo_path_out.data(), echo_path_size)); >- for (size_t i = 0; i < echo_path_size; i++) { >- EXPECT_EQ(echo_path_in[i], echo_path_out[i]); >- } >- > // Turn AECM off > EXPECT_EQ(0, aecm.Enable(false)); > EXPECT_FALSE(aecm.is_enabled()); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_detector/circular_buffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_detector/circular_buffer.h >index 9f5cdfa95364359d6d3ae3d56f6590e1434b4548..c52311f863b48b6278ff330f183adf5d2dbbe522 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_detector/circular_buffer.h >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/echo_detector/circular_buffer.h >@@ -11,6 +11,7 @@ > #ifndef MODULES_AUDIO_PROCESSING_ECHO_DETECTOR_CIRCULAR_BUFFER_H_ > #define MODULES_AUDIO_PROCESSING_ECHO_DETECTOR_CIRCULAR_BUFFER_H_ > >+#include <stddef.h> > #include <vector> > > #include "absl/types/optional.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/gain_control_for_experimental_agc.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/gain_control_for_experimental_agc.cc >index 4ab856c2c33e570940faeb98c0f7f388d359840e..1479d58b5430c98ba707dc263a2f2cd9b8bd116d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/gain_control_for_experimental_agc.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/gain_control_for_experimental_agc.cc >@@ -13,7 +13,6 @@ > #include "modules/audio_processing/include/audio_processing.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" > #include "rtc_base/atomicops.h" >-#include "rtc_base/checks.h" > #include "rtc_base/criticalsection.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/gain_control_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/gain_control_impl.cc >index 685a27ff6681209ab2a4b842bc712e2b494dbb90..d5118ba073d9940cfcbab353c88cefec72ad6863 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/gain_control_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/gain_control_impl.cc >@@ -10,10 +10,14 @@ > > #include "modules/audio_processing/gain_control_impl.h" > >+#include <cstdint> >+ > #include "absl/types/optional.h" > #include "modules/audio_processing/agc/legacy/gain_control.h" > #include "modules/audio_processing/audio_buffer.h" >+#include "modules/audio_processing/include/audio_processing.h" > #include 
"modules/audio_processing/logging/apm_data_dumper.h" >+#include "rtc_base/checks.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/gain_control_impl.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/gain_control_impl.h >index 959422fb44c20ea139378157d6a121ac30a67b6e..c21d9113262d475b83d57cb296e225e9afd7747f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/gain_control_impl.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/gain_control_impl.h >@@ -11,14 +11,16 @@ > #ifndef MODULES_AUDIO_PROCESSING_GAIN_CONTROL_IMPL_H_ > #define MODULES_AUDIO_PROCESSING_GAIN_CONTROL_IMPL_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <memory> > #include <vector> > >-#include "modules/audio_processing/include/audio_processing.h" >-#include "modules/audio_processing/render_queue_item_verifier.h" >+#include "absl/types/optional.h" >+#include "api/array_view.h" >+#include "modules/audio_processing/include/gain_control.h" > #include "rtc_base/constructormagic.h" > #include "rtc_base/criticalsection.h" >-#include "rtc_base/swap_queue.h" > #include "rtc_base/thread_annotations.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/gain_controller2.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/gain_controller2.cc >index 29af962e745fb50631722241b9d949299c7f1e7d..2a327449df2e164282bd06c28d3ce9812df762d7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/gain_controller2.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/gain_controller2.cc >@@ -10,6 +10,7 @@ > > #include "modules/audio_processing/gain_controller2.h" > >+#include "common_audio/include/audio_util.h" > #include "modules/audio_processing/audio_buffer.h" > #include 
"modules/audio_processing/include/audio_frame_view.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" >@@ -24,8 +25,10 @@ int GainController2::instance_count_ = 0; > GainController2::GainController2() > : data_dumper_( > new ApmDataDumper(rtc::AtomicOps::Increment(&instance_count_))), >- fixed_gain_controller_(data_dumper_.get()), >- adaptive_agc_(data_dumper_.get()) {} >+ gain_applier_(/*hard_clip_samples=*/false, >+ /*initial_gain_factor=*/0.f), >+ adaptive_agc_(new AdaptiveAgc(data_dumper_.get())), >+ limiter_(static_cast<size_t>(48000), data_dumper_.get(), "Agc2") {} > > GainController2::~GainController2() = default; > >@@ -34,7 +37,7 @@ void GainController2::Initialize(int sample_rate_hz) { > sample_rate_hz == AudioProcessing::kSampleRate16kHz || > sample_rate_hz == AudioProcessing::kSampleRate32kHz || > sample_rate_hz == AudioProcessing::kSampleRate48kHz); >- fixed_gain_controller_.SetSampleRate(sample_rate_hz); >+ limiter_.SetSampleRate(sample_rate_hz); > data_dumper_->InitiateNewSetOfRecordings(); > data_dumper_->DumpRaw("sample_rate_hz", sample_rate_hz); > } >@@ -42,37 +45,71 @@ void GainController2::Initialize(int sample_rate_hz) { > void GainController2::Process(AudioBuffer* audio) { > AudioFrameView<float> float_frame(audio->channels_f(), audio->num_channels(), > audio->num_frames()); >- if (adaptive_digital_mode_) { >- adaptive_agc_.Process(float_frame, fixed_gain_controller_.LastAudioLevel()); >+ // Apply fixed gain first, then the adaptive one. 
>+ gain_applier_.ApplyGain(float_frame); >+ if (config_.adaptive_digital.enabled) { >+ adaptive_agc_->Process(float_frame, limiter_.LastAudioLevel()); > } >- fixed_gain_controller_.Process(float_frame); >+ limiter_.Process(float_frame); > } > > void GainController2::NotifyAnalogLevel(int level) { >- if (analog_level_ != level && adaptive_digital_mode_) { >- adaptive_agc_.Reset(); >+ if (analog_level_ != level && config_.adaptive_digital.enabled) { >+ adaptive_agc_->Reset(); > } > analog_level_ = level; > } > > void GainController2::ApplyConfig( > const AudioProcessing::Config::GainController2& config) { >- RTC_DCHECK(Validate(config)); >+ RTC_DCHECK(Validate(config)) >+ << " the invalid config was " << ToString(config); >+ > config_ = config; >- fixed_gain_controller_.SetGain(config_.fixed_gain_db); >- adaptive_digital_mode_ = config_.adaptive_digital_mode; >+ if (config.fixed_digital.gain_db != config_.fixed_digital.gain_db) { >+ // Reset the limiter to quickly react on abrupt level changes caused by >+ // large changes of the fixed gain. >+ limiter_.Reset(); >+ } >+ gain_applier_.SetGainFactor(DbToRatio(config_.fixed_digital.gain_db)); >+ adaptive_agc_.reset(new AdaptiveAgc(data_dumper_.get(), config_)); > } > > bool GainController2::Validate( > const AudioProcessing::Config::GainController2& config) { >- return config.fixed_gain_db >= 0.f; >+ return config.fixed_digital.gain_db >= 0.f && >+ config.fixed_digital.gain_db < 50.f && >+ config.adaptive_digital.extra_saturation_margin_db >= 0.f && >+ config.adaptive_digital.extra_saturation_margin_db <= 100.f; > } > > std::string GainController2::ToString( > const AudioProcessing::Config::GainController2& config) { > rtc::StringBuilder ss; >- ss << "{enabled: " << (config.enabled ? 
"true" : "false") << ", " >- << "fixed_gain_dB: " << config.fixed_gain_db << "}"; >+ std::string adaptive_digital_level_estimator; >+ using LevelEstimatorType = >+ AudioProcessing::Config::GainController2::LevelEstimator; >+ switch (config.adaptive_digital.level_estimator) { >+ case LevelEstimatorType::kRms: >+ adaptive_digital_level_estimator = "RMS"; >+ break; >+ case LevelEstimatorType::kPeak: >+ adaptive_digital_level_estimator = "peak"; >+ break; >+ } >+ // clang-format off >+ // clang formatting doesn't respect custom nested style. >+ ss << "{" >+ << "enabled: " << (config.enabled ? "true" : "false") << ", " >+ << "fixed_digital: {gain_db: " << config.fixed_digital.gain_db << "}, " >+ << "adaptive_digital: {" >+ << "enabled: " >+ << (config.adaptive_digital.enabled ? "true" : "false") << ", " >+ << "level_estimator: " << adaptive_digital_level_estimator << ", " >+ << "extra_saturation_margin_db:" >+ << config.adaptive_digital.extra_saturation_margin_db << "}" >+ << "}"; >+ // clang-format on > return ss.Release(); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/gain_controller2.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/gain_controller2.h >index b4727d0e777695e70fd24d49ceb7ad8fa7979a9c..3a11810bb59d9539566d8c75e13000e09811c8a2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/gain_controller2.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/gain_controller2.h >@@ -15,7 +15,8 @@ > #include <string> > > #include "modules/audio_processing/agc2/adaptive_agc.h" >-#include "modules/audio_processing/agc2/fixed_gain_controller.h" >+#include "modules/audio_processing/agc2/gain_applier.h" >+#include "modules/audio_processing/agc2/limiter.h" > #include "modules/audio_processing/include/audio_processing.h" > #include "rtc_base/constructormagic.h" > >@@ -43,11 +44,11 @@ class GainController2 { > private: > static int instance_count_; > 
std::unique_ptr<ApmDataDumper> data_dumper_; >- FixedGainController fixed_gain_controller_; > AudioProcessing::Config::GainController2 config_; >- AdaptiveAgc adaptive_agc_; >+ GainApplier gain_applier_; >+ std::unique_ptr<AdaptiveAgc> adaptive_agc_; >+ Limiter limiter_; > int analog_level_ = -1; >- bool adaptive_digital_mode_ = true; > > RTC_DISALLOW_COPY_AND_ASSIGN(GainController2); > }; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/gain_controller2_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/gain_controller2_unittest.cc >index 5bedccd64a8bea76cca79a3663ac50980fdaf5c4..27d540a1988e9e8969fcdb08c75ea02bfce4fd72 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/gain_controller2_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/gain_controller2_unittest.cc >@@ -10,20 +10,20 @@ > > #include <algorithm> > >+#include "absl/memory/memory.h" > #include "api/array_view.h" >+#include "modules/audio_processing/agc2/agc2_testing_common.h" > #include "modules/audio_processing/audio_buffer.h" > #include "modules/audio_processing/gain_controller2.h" >+#include "modules/audio_processing/test/audio_buffer_tools.h" >+#include "modules/audio_processing/test/bitexactness_tools.h" > #include "rtc_base/checks.h" > #include "test/gtest.h" > > namespace webrtc { > namespace test { >- > namespace { > >-constexpr size_t kFrameSizeMs = 10u; >-constexpr size_t kStereo = 2u; >- > void SetAudioBufferSamples(float value, AudioBuffer* ab) { > // Sets all the samples in |ab| to |value|. 
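The header change above replaces the old FixedGainController member with a GainApplier plus a Limiter, and GainController2::Process() now runs them in a fixed order: apply the fixed gain, then the adaptive stage, then limit. A minimal standalone sketch of that ordering, using hypothetical simplified types rather than the real webrtc classes:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Hypothetical sketch of the new GainController2 pipeline: fixed gain first,
// then a limiter, mirroring the order in GainController2::Process() above.
// (The real Limiter ramps gain smoothly; a hard clip stands in for it here.)
struct FrameSketch {
  std::vector<float> samples;
};

void ApplyFixedGain(FrameSketch& f, float gain_factor) {
  // Corresponds to GainApplier::ApplyGain() in the patch.
  for (float& s : f.samples) s *= gain_factor;
}

void Limit(FrameSketch& f, float full_scale = 32767.f) {
  // Stand-in for Limiter::Process(): keep samples within full scale.
  for (float& s : f.samples)
    s = std::max(-full_scale, std::min(full_scale, s));
}
```

Running the fixed gain before the limiter is what lets the saturation tests below distinguish gains under and over the limiter's maximum input level.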
> for (size_t k = 0; k < ab->num_channels(); ++k) { >@@ -32,6 +32,75 @@ void SetAudioBufferSamples(float value, AudioBuffer* ab) { > } > } > >+float RunAgc2WithConstantInput(GainController2* agc2, >+ float input_level, >+ size_t num_frames, >+ int sample_rate) { >+ const int num_samples = rtc::CheckedDivExact(sample_rate, 100); >+ AudioBuffer ab(num_samples, 1, num_samples, 1, num_samples); >+ >+ // Give time to the level estimator to converge. >+ for (size_t i = 0; i < num_frames + 1; ++i) { >+ SetAudioBufferSamples(input_level, &ab); >+ agc2->Process(&ab); >+ } >+ >+ // Return the last sample from the last processed frame. >+ return ab.channels_f()[0][num_samples - 1]; >+} >+ >+AudioProcessing::Config::GainController2 CreateAgc2FixedDigitalModeConfig( >+ float fixed_gain_db) { >+ AudioProcessing::Config::GainController2 config; >+ config.adaptive_digital.enabled = false; >+ config.fixed_digital.gain_db = fixed_gain_db; >+ // TODO(alessiob): Check why ASSERT_TRUE() below does not compile. >+ EXPECT_TRUE(GainController2::Validate(config)); >+ return config; >+} >+ >+std::unique_ptr<GainController2> CreateAgc2FixedDigitalMode( >+ float fixed_gain_db, >+ size_t sample_rate_hz) { >+ auto agc2 = absl::make_unique<GainController2>(); >+ agc2->ApplyConfig(CreateAgc2FixedDigitalModeConfig(fixed_gain_db)); >+ agc2->Initialize(sample_rate_hz); >+ return agc2; >+} >+ >+float GainAfterProcessingFile(GainController2* gain_controller) { >+ // Set up an AudioBuffer to be filled from the speech file. 
>+ constexpr size_t kStereo = 2u; >+ const StreamConfig capture_config(AudioProcessing::kSampleRate48kHz, kStereo, >+ false); >+ AudioBuffer ab(capture_config.num_frames(), capture_config.num_channels(), >+ capture_config.num_frames(), capture_config.num_channels(), >+ capture_config.num_frames()); >+ test::InputAudioFile capture_file( >+ test::GetApmCaptureTestVectorFileName(AudioProcessing::kSampleRate48kHz)); >+ std::vector<float> capture_input(capture_config.num_frames() * >+ capture_config.num_channels()); >+ >+ // The file should contain at least this many frames. Every iteration, we put >+ // a frame through the gain controller. >+ const int kNumFramesToProcess = 100; >+ for (int frame_no = 0; frame_no < kNumFramesToProcess; ++frame_no) { >+ ReadFloatSamplesFromStereoFile(capture_config.num_frames(), >+ capture_config.num_channels(), &capture_file, >+ capture_input); >+ >+ test::CopyVectorToAudioBuffer(capture_config, capture_input, &ab); >+ gain_controller->Process(&ab); >+ } >+ >+ // Send in a last frame with values constant 1 (It's low enough to detect high >+ // gain, and for ease of computation). The applied gain is the result. >+ constexpr float sample_value = 1.f; >+ SetAudioBufferSamples(sample_value, &ab); >+ gain_controller->Process(&ab); >+ return ab.channels_f()[0][0]; >+} >+ > } // namespace > > TEST(GainController2, CreateApplyConfig) { >@@ -44,47 +113,173 @@ TEST(GainController2, CreateApplyConfig) { > gain_controller2->ApplyConfig(config); > > // Check that attenuation is not allowed. >- config.fixed_gain_db = -5.f; >+ config.fixed_digital.gain_db = -5.f; > EXPECT_FALSE(GainController2::Validate(config)); > > // Check that valid configurations are applied. 
>- for (const float& fixed_gain_db : {0.f, 5.f, 10.f, 50.f}) { >- config.fixed_gain_db = fixed_gain_db; >+ for (const float& fixed_gain_db : {0.f, 5.f, 10.f, 40.f}) { >+ config.fixed_digital.gain_db = fixed_gain_db; > EXPECT_TRUE(GainController2::Validate(config)); > gain_controller2->ApplyConfig(config); > } > } > > TEST(GainController2, ToString) { >- // Tests GainController2::ToString(). >+ // Tests GainController2::ToString(). Only test the enabled property. > AudioProcessing::Config::GainController2 config; >- config.fixed_gain_db = 5.f; > > config.enabled = false; >- EXPECT_EQ("{enabled: false, fixed_gain_dB: 5}", >- GainController2::ToString(config)); >+ EXPECT_EQ("{enabled: false", GainController2::ToString(config).substr(0, 15)); > > config.enabled = true; >- EXPECT_EQ("{enabled: true, fixed_gain_dB: 5}", >- GainController2::ToString(config)); >+ EXPECT_EQ("{enabled: true", GainController2::ToString(config).substr(0, 14)); > } > >-TEST(GainController2, Usage) { >- // Tests GainController2::Process() on an AudioBuffer instance. >- std::unique_ptr<GainController2> gain_controller2(new GainController2()); >- gain_controller2->Initialize(AudioProcessing::kSampleRate48kHz); >- const size_t num_frames = rtc::CheckedDivExact<size_t>( >- kFrameSizeMs * AudioProcessing::kSampleRate48kHz, 1000); >- AudioBuffer ab(num_frames, kStereo, num_frames, kStereo, num_frames); >- constexpr float sample_value = 1000.f; >- SetAudioBufferSamples(sample_value, &ab); >+TEST(GainController2FixedDigital, GainShouldChangeOnSetGain) { >+ constexpr float kInputLevel = 1000.f; >+ constexpr size_t kNumFrames = 5; >+ constexpr size_t kSampleRateHz = 8000; >+ constexpr float kGain0Db = 0.f; >+ constexpr float kGain20Db = 20.f; >+ >+ auto agc2_fixed = CreateAgc2FixedDigitalMode(kGain0Db, kSampleRateHz); >+ >+ // Signal level is unchanged with 0 db gain. 
>+ EXPECT_FLOAT_EQ(RunAgc2WithConstantInput(agc2_fixed.get(), kInputLevel, >+ kNumFrames, kSampleRateHz), >+ kInputLevel); >+ >+ // +20 db should increase signal by a factor of 10. >+ agc2_fixed->ApplyConfig(CreateAgc2FixedDigitalModeConfig(kGain20Db)); >+ EXPECT_FLOAT_EQ(RunAgc2WithConstantInput(agc2_fixed.get(), kInputLevel, >+ kNumFrames, kSampleRateHz), >+ kInputLevel * 10); >+} >+ >+TEST(GainController2FixedDigital, ChangeFixedGainShouldBeFastAndTimeInvariant) { >+ // Number of frames required for the fixed gain controller to adapt on the >+ // input signal when the gain changes. >+ constexpr size_t kNumFrames = 5; >+ >+ constexpr float kInputLevel = 1000.f; >+ constexpr size_t kSampleRateHz = 8000; >+ constexpr float kGainDbLow = 0.f; >+ constexpr float kGainDbHigh = 25.f; >+ static_assert(kGainDbLow < kGainDbHigh, ""); >+ >+ auto agc2_fixed = CreateAgc2FixedDigitalMode(kGainDbLow, kSampleRateHz); >+ >+ // Start with a lower gain. >+ const float output_level_pre = RunAgc2WithConstantInput( >+ agc2_fixed.get(), kInputLevel, kNumFrames, kSampleRateHz); >+ >+ // Increase gain. >+ agc2_fixed->ApplyConfig(CreateAgc2FixedDigitalModeConfig(kGainDbHigh)); >+ static_cast<void>(RunAgc2WithConstantInput(agc2_fixed.get(), kInputLevel, >+ kNumFrames, kSampleRateHz)); >+ >+ // Back to the lower gain. 
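The fixed-digital tests above rely on the standard amplitude conversion from decibels to a linear factor, 10^(dB/20), so 0 dB leaves the level unchanged and +20 dB multiplies it by 10. A sketch of that conversion (the patch's own helper is DbToRatio(); this local version is illustrative only):

```cpp
#include <cmath>

// dB-to-linear amplitude conversion assumed by the fixed-digital gain tests:
// 0 dB -> factor 1, +20 dB -> factor 10 (amplitude, not power).
float DbToLinearFactor(float gain_db) {
  return std::pow(10.f, gain_db / 20.f);
}
```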
>+ agc2_fixed->ApplyConfig(CreateAgc2FixedDigitalModeConfig(kGainDbLow)); >+ const float output_level_post = RunAgc2WithConstantInput( >+ agc2_fixed.get(), kInputLevel, kNumFrames, kSampleRateHz); >+ >+ EXPECT_EQ(output_level_pre, output_level_post); >+} >+ >+struct FixedDigitalTestParams { >+ FixedDigitalTestParams(float gain_db_min, >+ float gain_db_max, >+ size_t sample_rate, >+ bool saturation_expected) >+ : gain_db_min(gain_db_min), >+ gain_db_max(gain_db_max), >+ sample_rate(sample_rate), >+ saturation_expected(saturation_expected) {} >+ float gain_db_min; >+ float gain_db_max; >+ size_t sample_rate; >+ bool saturation_expected; >+}; >+ >+class FixedDigitalTest >+ : public testing::Test, >+ public testing::WithParamInterface<FixedDigitalTestParams> {}; >+ >+TEST_P(FixedDigitalTest, CheckSaturationBehaviorWithLimiter) { >+ const float kInputLevel = 32767.f; >+ const size_t kNumFrames = 5; >+ >+ const auto params = GetParam(); >+ >+ const auto gains_db = >+ test::LinSpace(params.gain_db_min, params.gain_db_max, 10); >+ for (const auto gain_db : gains_db) { >+ SCOPED_TRACE(std::to_string(gain_db)); >+ auto agc2_fixed = CreateAgc2FixedDigitalMode(gain_db, params.sample_rate); >+ const float processed_sample = RunAgc2WithConstantInput( >+ agc2_fixed.get(), kInputLevel, kNumFrames, params.sample_rate); >+ if (params.saturation_expected) { >+ EXPECT_FLOAT_EQ(processed_sample, 32767.f); >+ } else { >+ EXPECT_LT(processed_sample, 32767.f); >+ } >+ } >+} >+ >+static_assert(test::kLimiterMaxInputLevelDbFs < 10, ""); >+INSTANTIATE_TEST_CASE_P( >+ GainController2, >+ FixedDigitalTest, >+ ::testing::Values( >+ // When gain < |test::kLimiterMaxInputLevelDbFs|, the limiter will not >+ // saturate the signal (at any sample rate). 
>+ FixedDigitalTestParams(0.1f, >+ test::kLimiterMaxInputLevelDbFs - 0.01f, >+ 8000, >+ false), >+ FixedDigitalTestParams(0.1, >+ test::kLimiterMaxInputLevelDbFs - 0.01f, >+ 48000, >+ false), >+ // When gain > |test::kLimiterMaxInputLevelDbFs|, the limiter will >+ // saturate the signal (at any sample rate). >+ FixedDigitalTestParams(test::kLimiterMaxInputLevelDbFs + 0.01f, >+ 10.f, >+ 8000, >+ true), >+ FixedDigitalTestParams(test::kLimiterMaxInputLevelDbFs + 0.01f, >+ 10.f, >+ 48000, >+ true))); >+ >+TEST(GainController2, UsageSaturationMargin) { >+ GainController2 gain_controller2; >+ gain_controller2.Initialize(AudioProcessing::kSampleRate48kHz); >+ > AudioProcessing::Config::GainController2 config; >+ // Check that samples are not amplified as much when extra margin is >+ // high. They should not be amplified at all, but only after convergence. GC2 >+ // starts with a gain, and it takes time until it's down to 0 dB. >+ config.fixed_digital.gain_db = 0.f; >+ config.adaptive_digital.enabled = true; >+ config.adaptive_digital.extra_saturation_margin_db = 50.f; >+ gain_controller2.ApplyConfig(config); > >- // Check that samples are amplified when the fixed gain is greater than 0 dB. >- config.fixed_gain_db = 5.f; >- gain_controller2->ApplyConfig(config); >- gain_controller2->Process(&ab); >- EXPECT_LT(sample_value, ab.channels_f()[0][0]); >+ EXPECT_LT(GainAfterProcessingFile(&gain_controller2), 2.f); >+} >+ >+TEST(GainController2, UsageNoSaturationMargin) { >+ GainController2 gain_controller2; >+ gain_controller2.Initialize(AudioProcessing::kSampleRate48kHz); >+ >+ AudioProcessing::Config::GainController2 config; >+ // Check that some gain is applied if there is no margin. 
>+ config.fixed_digital.gain_db = 0.f; >+ config.adaptive_digital.enabled = true; >+ config.adaptive_digital.extra_saturation_margin_db = 0.f; >+ gain_controller2.ApplyConfig(config); >+ >+ EXPECT_GT(GainAfterProcessingFile(&gain_controller2), 2.f); > } > > } // namespace test >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/include/aec_dump.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/include/aec_dump.h >index 313e9d7b1de0b62e1abb6ca254ded0414cfa8b72..b734adfbbbb36a777e3ec9b6bb99f4505fdcf20f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/include/aec_dump.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/include/aec_dump.h >@@ -11,14 +11,13 @@ > #ifndef MODULES_AUDIO_PROCESSING_INCLUDE_AEC_DUMP_H_ > #define MODULES_AUDIO_PROCESSING_INCLUDE_AEC_DUMP_H_ > >-#include <memory> >+#include <stdint.h> > #include <string> >-#include <vector> > >-#include "api/array_view.h" > #include "api/audio/audio_frame.h" > #include "modules/audio_processing/include/audio_frame_view.h" > #include "modules/audio_processing/include/audio_processing.h" >+#include "rtc_base/deprecation.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/include/audio_processing.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/include/audio_processing.cc >index 75eedafc292387f2e9915f318d002ec1520afe97..27cd8824fec305c5b31f5a335ef8d63e6af32e86 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/include/audio_processing.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/include/audio_processing.cc >@@ -10,8 +10,6 @@ > > #include "modules/audio_processing/include/audio_processing.h" > >-#include "rtc_base/checks.h" >- > namespace webrtc { > > void CustomProcessing::SetRuntimeSetting( >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/include/audio_processing.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/include/audio_processing.h >index 26880e37760264ddaac9404ee80e7479ec12634d..df51313229da28c1391878ac39a8e94343ab155f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/include/audio_processing.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/include/audio_processing.h >@@ -34,6 +34,7 @@ > #include "rtc_base/platform_file.h" > #include "rtc_base/refcount.h" > #include "rtc_base/scoped_ref_ptr.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > >@@ -48,7 +49,6 @@ class ProcessingConfig; > > class EchoDetector; > class GainControl; >-class HighPassFilter; > class LevelEstimator; > class NoiseSuppression; > class CustomAudioAnalyzer; >@@ -240,7 +240,6 @@ class AudioProcessing : public rtc::RefCountInterface { > // by changing the default values in the AudioProcessing::Config struct. > // The config is applied by passing the struct to the ApplyConfig method. > struct Config { >- // TODO(bugs.webrtc.org/9535): Currently unused. Use this to determine AEC. > struct EchoCanceller { > bool enabled = false; > bool mobile_mode = false; >@@ -271,11 +270,24 @@ class AudioProcessing : public rtc::RefCountInterface { > // first applies a fixed gain. The adaptive digital AGC can be turned off by > // setting |adaptive_digital_mode=false|. > struct GainController2 { >+ enum LevelEstimator { kRms, kPeak }; > bool enabled = false; >- bool adaptive_digital_mode = true; >- float fixed_gain_db = 0.f; >+ struct { >+ float gain_db = 0.f; >+ } fixed_digital; >+ struct { >+ bool enabled = false; >+ LevelEstimator level_estimator = kRms; >+ bool use_saturation_protector = true; >+ float extra_saturation_margin_db = 2.f; >+ } adaptive_digital; > } gain_controller2; > >+ // Enables reporting of |output_rms_dbfs| in webrtc::AudioProcessingStats. 
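The hunk above restructures the flat `adaptive_digital_mode`/`fixed_gain_db` fields into nested `fixed_digital` and `adaptive_digital` sub-structs. A standalone sketch of the new shape and of the Validate() ranges introduced elsewhere in this patch (a local mirror for illustration; the real struct lives in audio_processing.h):

```cpp
// Local mirror of AudioProcessing::Config::GainController2 after this update,
// so the example is self-contained. Field names follow the patch.
struct GainController2ConfigSketch {
  enum LevelEstimator { kRms, kPeak };
  bool enabled = false;
  struct {
    float gain_db = 0.f;
  } fixed_digital;
  struct {
    bool enabled = false;
    LevelEstimator level_estimator = kRms;
    bool use_saturation_protector = true;
    float extra_saturation_margin_db = 2.f;
  } adaptive_digital;
};

// Mirrors GainController2::Validate() from this patch: no attenuation,
// fixed gain below 50 dB, saturation margin within [0, 100] dB.
bool ValidateSketch(const GainController2ConfigSketch& c) {
  return c.fixed_digital.gain_db >= 0.f && c.fixed_digital.gain_db < 50.f &&
         c.adaptive_digital.extra_saturation_margin_db >= 0.f &&
         c.adaptive_digital.extra_saturation_margin_db <= 100.f;
}
```

Note the tightened range: the old Validate() accepted any non-negative fixed gain, which is why the unit test's 50 dB case is changed to 40 dB in this patch.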
>+ struct LevelEstimation { >+ bool enabled = false; >+ } level_estimation; >+ > // Explicit copy assignment implementation to avoid issues with memory > // sanitizer complaints in case of self-assignment. > // TODO(peah): Add buildflag to ensure that this is only included for memory >@@ -520,85 +532,17 @@ class AudioProcessing : public rtc::RefCountInterface { > // specific member variables are reset. > virtual void UpdateHistogramsOnCallEnd() = 0; > >- // TODO(ivoc): Remove when the calling code no longer uses the old Statistics >- // API. >- struct Statistic { >- int instant = 0; // Instantaneous value. >- int average = 0; // Long-term average. >- int maximum = 0; // Long-term maximum. >- int minimum = 0; // Long-term minimum. >- }; >- >- struct Stat { >- void Set(const Statistic& other) { >- Set(other.instant, other.average, other.maximum, other.minimum); >- } >- void Set(float instant, float average, float maximum, float minimum) { >- instant_ = instant; >- average_ = average; >- maximum_ = maximum; >- minimum_ = minimum; >- } >- float instant() const { return instant_; } >- float average() const { return average_; } >- float maximum() const { return maximum_; } >- float minimum() const { return minimum_; } >- >- private: >- float instant_ = 0.0f; // Instantaneous value. >- float average_ = 0.0f; // Long-term average. >- float maximum_ = 0.0f; // Long-term maximum. >- float minimum_ = 0.0f; // Long-term minimum. >- }; >- >- struct AudioProcessingStatistics { >- AudioProcessingStatistics(); >- AudioProcessingStatistics(const AudioProcessingStatistics& other); >- ~AudioProcessingStatistics(); >- >- // AEC Statistics. 
>- // RERL = ERL + ERLE >- Stat residual_echo_return_loss; >- // ERL = 10log_10(P_far / P_echo) >- Stat echo_return_loss; >- // ERLE = 10log_10(P_echo / P_out) >- Stat echo_return_loss_enhancement; >- // (Pre non-linear processing suppression) A_NLP = 10log_10(P_echo / P_a) >- Stat a_nlp; >- // Fraction of time that the AEC linear filter is divergent, in a 1-second >- // non-overlapped aggregation window. >- float divergent_filter_fraction = -1.0f; >- >- // The delay metrics consists of the delay median and standard deviation. It >- // also consists of the fraction of delay estimates that can make the echo >- // cancellation perform poorly. The values are aggregated until the first >- // call to |GetStatistics()| and afterwards aggregated and updated every >- // second. Note that if there are several clients pulling metrics from >- // |GetStatistics()| during a session the first call from any of them will >- // change to one second aggregation window for all. >- int delay_median = -1; >- int delay_standard_deviation = -1; >- float fraction_poor_delays = -1.0f; >- >- // Residual echo detector likelihood. >- float residual_echo_likelihood = -1.0f; >- // Maximum residual echo likelihood from the last time period. >- float residual_echo_likelihood_recent_max = -1.0f; >- }; >- >- // TODO(ivoc): Make this pure virtual when all subclasses have been updated. >- virtual AudioProcessingStatistics GetStatistics() const; >- >- // This returns the stats as optionals and it will replace the regular >- // GetStatistics. >- virtual AudioProcessingStats GetStatistics(bool has_remote_tracks) const; >+ // Get audio processing statistics. The |has_remote_tracks| argument should be >+ // set if there are active remote tracks (this would usually be true during >+ // a call). If there are no remote tracks some of the stats will not be set by >+ // AudioProcessing, because they only make sense if there is at least one >+ // remote track. 
>+ virtual AudioProcessingStats GetStatistics(bool has_remote_tracks) const = 0; > > // These provide access to the component interfaces and should never return > // NULL. The pointers will be valid for the lifetime of the APM instance. > // The memory for these objects is entirely managed internally. > virtual GainControl* gain_control() const = 0; >- // TODO(peah): Deprecate this API call. >- virtual HighPassFilter* high_pass_filter() const = 0; > virtual LevelEstimator* level_estimator() const = 0; > virtual NoiseSuppression* noise_suppression() const = 0; > virtual VoiceDetection* voice_detection() const = 0; >@@ -648,7 +592,7 @@ class AudioProcessing : public rtc::RefCountInterface { > static const int kChunkSizeMs = 10; > }; > >-class AudioProcessingBuilder { >+class RTC_EXPORT AudioProcessingBuilder { > public: > AudioProcessingBuilder(); > ~AudioProcessingBuilder(); >@@ -788,17 +732,6 @@ class ProcessingConfig { > StreamConfig streams[StreamName::kNumStreamNames]; > }; > >-// TODO(peah): Remove this interface. >-// A filtering component which removes DC offset and low-frequency noise. >-// Recommended to be enabled on the client-side. >-class HighPassFilter { >- public: >- virtual int Enable(bool enable) = 0; >- virtual bool is_enabled() const = 0; >- >- virtual ~HighPassFilter() {} >-}; >- > // An estimation component used to retrieve level metrics. 
> class LevelEstimator { > public: >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/include/audio_processing_statistics.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/include/audio_processing_statistics.h >index 237d23c5bce8b2bdadd7cba7f4298d1d6e93739c..683db052e6c6b7e69e87035840e1cc6f292c7129 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/include/audio_processing_statistics.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/include/audio_processing_statistics.h >@@ -11,16 +11,27 @@ > #ifndef MODULES_AUDIO_PROCESSING_INCLUDE_AUDIO_PROCESSING_STATISTICS_H_ > #define MODULES_AUDIO_PROCESSING_INCLUDE_AUDIO_PROCESSING_STATISTICS_H_ > >+#include <stdint.h> >+ > #include "absl/types/optional.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > // This version of the stats uses Optionals, it will replace the regular > // AudioProcessingStatistics struct. >-struct AudioProcessingStats { >+struct RTC_EXPORT AudioProcessingStats { > AudioProcessingStats(); > AudioProcessingStats(const AudioProcessingStats& other); > ~AudioProcessingStats(); > >+ // The root mean square (RMS) level in dBFS (decibels from digital >+ // full-scale) of the last capture frame, after processing. It is >+ // constrained to [-127, 0]. >+ // The computation follows: https://tools.ietf.org/html/rfc6465 >+ // with the intent that it can provide the RTP audio level indication. >+ // Only reported if level estimation is enabled in AudioProcessing::Config. >+ absl::optional<int> output_rms_dbfs; >+ > // AEC Statistics. 
> // ERL = 10log_10(P_far / P_echo) > absl::optional<double> echo_return_loss; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/include/config.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/include/config.h >index 398aab61cffe63dd52f0581d980ac0f67f144da9..e77d3fd16d7b41963de37fa79bf61083dce92d37 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/include/config.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/include/config.h >@@ -14,6 +14,7 @@ > #include <map> > > #include "rtc_base/constructormagic.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > >@@ -57,7 +58,7 @@ enum class ConfigOptionID { > // config.Set<Algo1_CostFunction>(new SqrCost()); > // > // Note: This class is thread-compatible (like STL containers). >-class Config { >+class RTC_EXPORT Config { > public: > // Returns the option if set or a default constructed one. > // Callers that access options too often are encouraged to cache the result. 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/include/mock_audio_processing.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/include/mock_audio_processing.h >index 4a1fe626bd5d58b0b85e19e8d1b242f00341d048..f00a16d5e2d2610b171b40c62a13dfb82cac34a9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/include/mock_audio_processing.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/include/mock_audio_processing.h >@@ -42,13 +42,6 @@ class MockGainControl : public GainControl { > MOCK_CONST_METHOD0(stream_is_saturated, bool()); > }; > >-class MockHighPassFilter : public HighPassFilter { >- public: >- virtual ~MockHighPassFilter() {} >- MOCK_METHOD1(Enable, int(bool enable)); >- MOCK_CONST_METHOD0(is_enabled, bool()); >-}; >- > class MockLevelEstimator : public LevelEstimator { > public: > virtual ~MockLevelEstimator() {} >@@ -114,7 +107,6 @@ class MockAudioProcessing : public testing::NiceMock<AudioProcessing> { > public: > MockAudioProcessing() > : gain_control_(new testing::NiceMock<MockGainControl>()), >- high_pass_filter_(new testing::NiceMock<MockHighPassFilter>()), > level_estimator_(new testing::NiceMock<MockLevelEstimator>()), > noise_suppression_(new testing::NiceMock<MockNoiseSuppression>()), > voice_detection_(new testing::NiceMock<MockVoiceDetection>()) {} >@@ -180,12 +172,8 @@ class MockAudioProcessing : public testing::NiceMock<AudioProcessing> { > MOCK_METHOD0(DetachPlayoutAudioGenerator, void()); > > MOCK_METHOD0(UpdateHistogramsOnCallEnd, void()); >- MOCK_CONST_METHOD0(GetStatistics, AudioProcessingStatistics()); > MOCK_CONST_METHOD1(GetStatistics, AudioProcessingStats(bool)); > virtual MockGainControl* gain_control() const { return gain_control_.get(); } >- virtual MockHighPassFilter* high_pass_filter() const { >- return high_pass_filter_.get(); >- } > virtual MockLevelEstimator* level_estimator() const { > return level_estimator_.get(); > } 
>@@ -200,7 +188,6 @@ class MockAudioProcessing : public testing::NiceMock<AudioProcessing> { > > private: > std::unique_ptr<MockGainControl> gain_control_; >- std::unique_ptr<MockHighPassFilter> high_pass_filter_; > std::unique_ptr<MockLevelEstimator> level_estimator_; > std::unique_ptr<MockNoiseSuppression> noise_suppression_; > std::unique_ptr<MockVoiceDetection> voice_detection_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/level_estimator_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/level_estimator_impl.cc >index c937f8452534cea733b0c1f622858e7ea5ebe372..5b49b35fdcc5653824e084ae544425dda0f0722e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/level_estimator_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/level_estimator_impl.cc >@@ -10,9 +10,13 @@ > > #include "modules/audio_processing/level_estimator_impl.h" > >+#include <stddef.h> >+#include <stdint.h> >+ > #include "api/array_view.h" > #include "modules/audio_processing/audio_buffer.h" > #include "modules/audio_processing/rms_level.h" >+#include "rtc_base/checks.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/logging/apm_data_dumper.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/logging/apm_data_dumper.cc >index e2e8602b989b366bc2a62f67681b508ee53fac5f..cc879c8b08fdc5d7d6cbff9949166e2ad6894125 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/logging/apm_data_dumper.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/logging/apm_data_dumper.cc >@@ -11,7 +11,6 @@ > #include "modules/audio_processing/logging/apm_data_dumper.h" > > #include "rtc_base/strings/string_builder.h" >-#include "rtc_base/stringutils.h" > > // Check to verify that the define is properly set. 
> #if !defined(WEBRTC_APM_DEBUG_DUMP) || \ >@@ -20,16 +19,30 @@ > #endif > > namespace webrtc { >- > namespace { > > #if WEBRTC_APM_DEBUG_DUMP == 1 >-std::string FormFileName(const char* name, >+ >+#if defined(WEBRTC_WIN) >+constexpr char kPathDelimiter = '\\'; >+#else >+constexpr char kPathDelimiter = '/'; >+#endif >+ >+std::string FormFileName(const char* output_dir, >+ const char* name, > int instance_index, > int reinit_index, > const std::string& suffix) { > char buf[1024]; > rtc::SimpleStringBuilder ss(buf); >+ const size_t output_dir_size = strlen(output_dir); >+ if (output_dir_size > 0) { >+ ss << output_dir; >+ if (output_dir[output_dir_size - 1] != kPathDelimiter) { >+ ss << kPathDelimiter; >+ } >+ } > ss << name << "_" << instance_index << "-" << reinit_index << suffix; > return ss.str(); > } >@@ -41,18 +54,22 @@ std::string FormFileName(const char* name, > ApmDataDumper::ApmDataDumper(int instance_index) > : instance_index_(instance_index) {} > #else >-ApmDataDumper::ApmDataDumper(int instance_index) {} >+ApmDataDumper::ApmDataDumper(int instance_index){}; > #endif > >-ApmDataDumper::~ApmDataDumper() {} >+ApmDataDumper::~ApmDataDumper() = default; > > #if WEBRTC_APM_DEBUG_DUMP == 1 >+bool ApmDataDumper::recording_activated_ = false; >+char ApmDataDumper::output_dir_[] = ""; >+ > FILE* ApmDataDumper::GetRawFile(const char* name) { >- std::string filename = >- FormFileName(name, instance_index_, recording_set_index_, ".dat"); >+ std::string filename = FormFileName(output_dir_, name, instance_index_, >+ recording_set_index_, ".dat"); > auto& f = raw_files_[filename]; > if (!f) { > f.reset(fopen(filename.c_str(), "wb")); >+ RTC_CHECK(f.get()) << "Cannot write to " << filename << "."; > } > return f.get(); > } >@@ -60,15 +77,14 @@ FILE* ApmDataDumper::GetRawFile(const char* name) { > WavWriter* ApmDataDumper::GetWavFile(const char* name, > int sample_rate_hz, > int num_channels) { >- std::string filename = >- FormFileName(name, instance_index_, 
recording_set_index_, ".wav"); >+ std::string filename = FormFileName(output_dir_, name, instance_index_, >+ recording_set_index_, ".wav"); > auto& f = wav_files_[filename]; > if (!f) { > f.reset(new WavWriter(filename.c_str(), sample_rate_hz, num_channels)); > } > return f.get(); > } >- > #endif > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/logging/apm_data_dumper.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/logging/apm_data_dumper.h >index d045027b354165a02419ba9d44b768f6313efc2a..b541ae84b9b86a7494aef73d797079419963bc1f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/logging/apm_data_dumper.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/logging/apm_data_dumper.h >@@ -11,14 +11,20 @@ > #ifndef MODULES_AUDIO_PROCESSING_LOGGING_APM_DATA_DUMPER_H_ > #define MODULES_AUDIO_PROCESSING_LOGGING_APM_DATA_DUMPER_H_ > >+#include <stdint.h> > #include <stdio.h> >+#include <string.h> > >-#include <memory> > #include <string> >+#if WEBRTC_APM_DEBUG_DUMP == 1 > #include <unordered_map> >+#endif > > #include "api/array_view.h" >+#if WEBRTC_APM_DEBUG_DUMP == 1 > #include "common_audio/wav_file.h" >+#include "rtc_base/checks.h" >+#endif > #include "rtc_base/constructormagic.h" > > // Check to verify that the define is properly set. >@@ -47,6 +53,21 @@ class ApmDataDumper { > > ~ApmDataDumper(); > >+ // Activates or deactivate the dumping functionality. >+ static void SetActivated(bool activated) { >+#if WEBRTC_APM_DEBUG_DUMP == 1 >+ recording_activated_ = activated; >+#endif >+ } >+ >+ // Set an optional output directory. 
>+ static void SetOutputDirectory(const std::string& output_dir) { >+#if WEBRTC_APM_DEBUG_DUMP == 1 >+ RTC_CHECK_LT(output_dir.size(), kOutputDirMaxLength); >+ strncpy(output_dir_, output_dir.c_str(), output_dir.size()); >+#endif >+ } >+ > // Reinitializes the data dumping such that new versions > // of all files being dumped to are created. > void InitiateNewSetOfRecordings() { >@@ -59,115 +80,155 @@ class ApmDataDumper { > // various formats. > void DumpRaw(const char* name, double v) { > #if WEBRTC_APM_DEBUG_DUMP == 1 >- FILE* file = GetRawFile(name); >- fwrite(&v, sizeof(v), 1, file); >+ if (recording_activated_) { >+ FILE* file = GetRawFile(name); >+ fwrite(&v, sizeof(v), 1, file); >+ } > #endif > } > > void DumpRaw(const char* name, size_t v_length, const double* v) { > #if WEBRTC_APM_DEBUG_DUMP == 1 >- FILE* file = GetRawFile(name); >- fwrite(v, sizeof(v[0]), v_length, file); >+ if (recording_activated_) { >+ FILE* file = GetRawFile(name); >+ fwrite(v, sizeof(v[0]), v_length, file); >+ } > #endif > } > > void DumpRaw(const char* name, rtc::ArrayView<const double> v) { > #if WEBRTC_APM_DEBUG_DUMP == 1 >- DumpRaw(name, v.size(), v.data()); >+ if (recording_activated_) { >+ DumpRaw(name, v.size(), v.data()); >+ } > #endif > } > > void DumpRaw(const char* name, float v) { > #if WEBRTC_APM_DEBUG_DUMP == 1 >- FILE* file = GetRawFile(name); >- fwrite(&v, sizeof(v), 1, file); >+ if (recording_activated_) { >+ FILE* file = GetRawFile(name); >+ fwrite(&v, sizeof(v), 1, file); >+ } > #endif > } > > void DumpRaw(const char* name, size_t v_length, const float* v) { > #if WEBRTC_APM_DEBUG_DUMP == 1 >- FILE* file = GetRawFile(name); >- fwrite(v, sizeof(v[0]), v_length, file); >+ if (recording_activated_) { >+ FILE* file = GetRawFile(name); >+ fwrite(v, sizeof(v[0]), v_length, file); >+ } > #endif > } > > void DumpRaw(const char* name, rtc::ArrayView<const float> v) { > #if WEBRTC_APM_DEBUG_DUMP == 1 >- DumpRaw(name, v.size(), v.data()); >+ if (recording_activated_) { >+ 
DumpRaw(name, v.size(), v.data()); >+ } > #endif > } > > void DumpRaw(const char* name, bool v) { > #if WEBRTC_APM_DEBUG_DUMP == 1 >- DumpRaw(name, static_cast<int16_t>(v)); >+ if (recording_activated_) { >+ DumpRaw(name, static_cast<int16_t>(v)); >+ } > #endif > } > > void DumpRaw(const char* name, size_t v_length, const bool* v) { > #if WEBRTC_APM_DEBUG_DUMP == 1 >- FILE* file = GetRawFile(name); >- for (size_t k = 0; k < v_length; ++k) { >- int16_t value = static_cast<int16_t>(v[k]); >- fwrite(&value, sizeof(value), 1, file); >+ if (recording_activated_) { >+ FILE* file = GetRawFile(name); >+ for (size_t k = 0; k < v_length; ++k) { >+ int16_t value = static_cast<int16_t>(v[k]); >+ fwrite(&value, sizeof(value), 1, file); >+ } > } > #endif > } > > void DumpRaw(const char* name, rtc::ArrayView<const bool> v) { > #if WEBRTC_APM_DEBUG_DUMP == 1 >- DumpRaw(name, v.size(), v.data()); >+ if (recording_activated_) { >+ DumpRaw(name, v.size(), v.data()); >+ } > #endif > } > > void DumpRaw(const char* name, int16_t v) { > #if WEBRTC_APM_DEBUG_DUMP == 1 >- FILE* file = GetRawFile(name); >- fwrite(&v, sizeof(v), 1, file); >+ if (recording_activated_) { >+ FILE* file = GetRawFile(name); >+ fwrite(&v, sizeof(v), 1, file); >+ } > #endif > } > > void DumpRaw(const char* name, size_t v_length, const int16_t* v) { > #if WEBRTC_APM_DEBUG_DUMP == 1 >- FILE* file = GetRawFile(name); >- fwrite(v, sizeof(v[0]), v_length, file); >+ if (recording_activated_) { >+ FILE* file = GetRawFile(name); >+ fwrite(v, sizeof(v[0]), v_length, file); >+ } > #endif > } > > void DumpRaw(const char* name, rtc::ArrayView<const int16_t> v) { > #if WEBRTC_APM_DEBUG_DUMP == 1 >- DumpRaw(name, v.size(), v.data()); >+ if (recording_activated_) { >+ DumpRaw(name, v.size(), v.data()); >+ } > #endif > } > > void DumpRaw(const char* name, int32_t v) { > #if WEBRTC_APM_DEBUG_DUMP == 1 >- FILE* file = GetRawFile(name); >- fwrite(&v, sizeof(v), 1, file); >+ if (recording_activated_) { >+ FILE* file = 
GetRawFile(name); >+ fwrite(&v, sizeof(v), 1, file); >+ } > #endif > } > > void DumpRaw(const char* name, size_t v_length, const int32_t* v) { > #if WEBRTC_APM_DEBUG_DUMP == 1 >- FILE* file = GetRawFile(name); >- fwrite(v, sizeof(v[0]), v_length, file); >+ if (recording_activated_) { >+ FILE* file = GetRawFile(name); >+ fwrite(v, sizeof(v[0]), v_length, file); >+ } > #endif > } > > void DumpRaw(const char* name, size_t v) { > #if WEBRTC_APM_DEBUG_DUMP == 1 >- FILE* file = GetRawFile(name); >- fwrite(&v, sizeof(v), 1, file); >+ if (recording_activated_) { >+ FILE* file = GetRawFile(name); >+ fwrite(&v, sizeof(v), 1, file); >+ } > #endif > } > > void DumpRaw(const char* name, size_t v_length, const size_t* v) { > #if WEBRTC_APM_DEBUG_DUMP == 1 >- FILE* file = GetRawFile(name); >- fwrite(v, sizeof(v[0]), v_length, file); >+ if (recording_activated_) { >+ FILE* file = GetRawFile(name); >+ fwrite(v, sizeof(v[0]), v_length, file); >+ } > #endif > } > > void DumpRaw(const char* name, rtc::ArrayView<const int32_t> v) { >+#if WEBRTC_APM_DEBUG_DUMP == 1 >+ if (recording_activated_) { >+ DumpRaw(name, v.size(), v.data()); >+ } >+#endif >+ } >+ >+ void DumpRaw(const char* name, rtc::ArrayView<const size_t> v) { > #if WEBRTC_APM_DEBUG_DUMP == 1 > DumpRaw(name, v.size(), v.data()); > #endif >@@ -179,8 +240,10 @@ class ApmDataDumper { > int sample_rate_hz, > int num_channels) { > #if WEBRTC_APM_DEBUG_DUMP == 1 >- WavWriter* file = GetWavFile(name, sample_rate_hz, num_channels); >- file->WriteSamples(v, v_length); >+ if (recording_activated_) { >+ WavWriter* file = GetWavFile(name, sample_rate_hz, num_channels); >+ file->WriteSamples(v, v_length); >+ } > #endif > } > >@@ -189,12 +252,17 @@ class ApmDataDumper { > int sample_rate_hz, > int num_channels) { > #if WEBRTC_APM_DEBUG_DUMP == 1 >- DumpWav(name, v.size(), v.data(), sample_rate_hz, num_channels); >+ if (recording_activated_) { >+ DumpWav(name, v.size(), v.data(), sample_rate_hz, num_channels); >+ } > #endif > } > > private: 
> #if WEBRTC_APM_DEBUG_DUMP == 1 >+ static bool recording_activated_; >+ static constexpr size_t kOutputDirMaxLength = 1024; >+ static char output_dir_[kOutputDirMaxLength]; > const int instance_index_; > int recording_set_index_ = 0; > std::unordered_map<std::string, std::unique_ptr<FILE, RawFileCloseFunctor>> >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/low_cut_filter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/low_cut_filter.cc >index 5245c68d02a419e6f6560550e6ebd09de1ffdbb4..581d6bc42a88030f6af20071318dad9ad76807a7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/low_cut_filter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/low_cut_filter.cc >@@ -10,8 +10,13 @@ > > #include "modules/audio_processing/low_cut_filter.h" > >+#include <stdint.h> >+#include <cstring> >+ > #include "common_audio/signal_processing/include/signal_processing_library.h" > #include "modules/audio_processing/audio_buffer.h" >+#include "modules/audio_processing/include/audio_processing.h" >+#include "rtc_base/checks.h" > > namespace webrtc { > namespace { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/noise_suppression_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/noise_suppression_impl.cc >index 15d404323c36f88eff1f1618e0421a0f65aac1cf..d8d9e32340b1ccf12eea7f7ccbd3575f142ddd1d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/noise_suppression_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/noise_suppression_impl.cc >@@ -11,9 +11,11 @@ > #include "modules/audio_processing/noise_suppression_impl.h" > > #include "modules/audio_processing/audio_buffer.h" >+#include "rtc_base/checks.h" > #include "rtc_base/constructormagic.h" > #if defined(WEBRTC_NS_FLOAT) > #include "modules/audio_processing/ns/noise_suppression.h" >+ > #define NS_CREATE 
WebRtcNs_Create > #define NS_FREE WebRtcNs_Free > #define NS_INIT WebRtcNs_Init >@@ -21,6 +23,7 @@ > typedef NsHandle NsState; > #elif defined(WEBRTC_NS_FIXED) > #include "modules/audio_processing/ns/noise_suppression_x.h" >+ > #define NS_CREATE WebRtcNsx_Create > #define NS_FREE WebRtcNsx_Free > #define NS_INIT WebRtcNsx_Init >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/residual_echo_detector.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/residual_echo_detector.cc >index e80501358eaaafe241118c7cbfbc8248f9d466e1..3454214fd739d6dd36f0aafc0b8f2503ad5ba9a5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/residual_echo_detector.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/residual_echo_detector.cc >@@ -13,9 +13,11 @@ > #include <algorithm> > #include <numeric> > >+#include "absl/types/optional.h" > #include "modules/audio_processing/audio_buffer.h" > #include "modules/audio_processing/logging/apm_data_dumper.h" > #include "rtc_base/atomicops.h" >+#include "rtc_base/checks.h" > #include "rtc_base/logging.h" > #include "system_wrappers/include/metrics.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/rms_level.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/rms_level.h >index 9aa549a57c1e336a4ec6d793437e3bfaf3eb4870..e6b5849ead8bde88dcc2f717c306b6b20b266fdc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/rms_level.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/rms_level.h >@@ -11,6 +11,9 @@ > #ifndef MODULES_AUDIO_PROCESSING_RMS_LEVEL_H_ > #define MODULES_AUDIO_PROCESSING_RMS_LEVEL_H_ > >+#include <stddef.h> >+#include <stdint.h> >+ > #include "absl/types/optional.h" > #include "api/array_view.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/audio_processing_simulator.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/audio_processing_simulator.cc >index 65f3c87497c85cda635a4f26e86806c5004ec8f3..df9f4976be27d0ae03c8053b0052cd0dbe37eca0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/audio_processing_simulator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/audio_processing_simulator.cc >@@ -18,12 +18,14 @@ > #include <vector> > > #include "absl/memory/memory.h" >+#include "api/audio/echo_canceller3_config_json.h" > #include "api/audio/echo_canceller3_factory.h" > #include "common_audio/include/audio_util.h" > #include "modules/audio_processing/aec_dump/aec_dump_factory.h" > #include "modules/audio_processing/echo_cancellation_impl.h" > #include "modules/audio_processing/echo_control_mobile_impl.h" > #include "modules/audio_processing/include/audio_processing.h" >+#include "modules/audio_processing/logging/apm_data_dumper.h" > #include "modules/audio_processing/test/fake_recording_device.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" >@@ -34,603 +36,32 @@ > namespace webrtc { > namespace test { > namespace { >- >-// Prints out the currently used AEC3 parameter values in JSON format. 
>-void PrintAec3ParameterValues(const EchoCanceller3Config& cfg) { >- std::cout << "{"; >- std::cout << "\"delay\": {"; >- std::cout << "\"default_delay\": " << cfg.delay.default_delay << ","; >- std::cout << "\"down_sampling_factor\": " << cfg.delay.down_sampling_factor >- << ","; >- std::cout << "\"num_filters\": " << cfg.delay.num_filters << ","; >- std::cout << "\"api_call_jitter_blocks\": " >- << cfg.delay.api_call_jitter_blocks << ","; >- std::cout << "\"min_echo_path_delay_blocks\": " >- << cfg.delay.min_echo_path_delay_blocks << ","; >- std::cout << "\"delay_headroom_blocks\": " << cfg.delay.delay_headroom_blocks >- << ","; >- std::cout << "\"hysteresis_limit_1_blocks\": " >- << cfg.delay.hysteresis_limit_1_blocks << ","; >- std::cout << "\"hysteresis_limit_2_blocks\": " >- << cfg.delay.hysteresis_limit_2_blocks << ","; >- std::cout << "\"skew_hysteresis_blocks\": " >- << cfg.delay.skew_hysteresis_blocks << ","; >- std::cout << "\"fixed_capture_delay_samples\": " >- << cfg.delay.fixed_capture_delay_samples << ","; >- std::cout << "\"delay_estimate_smoothing\": " >- << cfg.delay.delay_estimate_smoothing << ","; >- std::cout << "\"delay_candidate_detection_threshold\": " >- << cfg.delay.delay_candidate_detection_threshold << ","; >- >- std::cout << "\"delay_selection_thresholds\": {"; >- std::cout << "\"initial\": " << cfg.delay.delay_selection_thresholds.initial >- << ","; >- std::cout << "\"converged\": " >- << cfg.delay.delay_selection_thresholds.converged; >- std::cout << "}"; >- >- std::cout << "},"; >- >- std::cout << "\"filter\": {"; >- std::cout << "\"main\": ["; >- std::cout << cfg.filter.main.length_blocks << ","; >- std::cout << cfg.filter.main.leakage_converged << ","; >- std::cout << cfg.filter.main.leakage_diverged << ","; >- std::cout << cfg.filter.main.error_floor << ","; >- std::cout << cfg.filter.main.noise_gate; >- std::cout << "],"; >- >- std::cout << "\"shadow\": ["; >- std::cout << cfg.filter.shadow.length_blocks << ","; >- std::cout << 
cfg.filter.shadow.rate << ","; >- std::cout << cfg.filter.shadow.noise_gate; >- std::cout << "],"; >- >- std::cout << "\"main_initial\": ["; >- std::cout << cfg.filter.main_initial.length_blocks << ","; >- std::cout << cfg.filter.main_initial.leakage_converged << ","; >- std::cout << cfg.filter.main_initial.leakage_diverged << ","; >- std::cout << cfg.filter.main_initial.error_floor << ","; >- std::cout << cfg.filter.main_initial.noise_gate; >- std::cout << "],"; >- >- std::cout << "\"shadow_initial\": ["; >- std::cout << cfg.filter.shadow_initial.length_blocks << ","; >- std::cout << cfg.filter.shadow_initial.rate << ","; >- std::cout << cfg.filter.shadow_initial.noise_gate; >- std::cout << "],"; >- >- std::cout << "\"config_change_duration_blocks\": " >- << cfg.filter.config_change_duration_blocks << ","; >- std::cout << "\"initial_state_seconds\": " << cfg.filter.initial_state_seconds >- << ","; >- std::cout << "\"conservative_initial_phase\": " >- << (cfg.filter.conservative_initial_phase ? "true" : "false") >- << ","; >- std::cout << "\"enable_shadow_filter_output_usage\": " >- << (cfg.filter.enable_shadow_filter_output_usage ? "true" >- : "false"); >- >- std::cout << "},"; >- >- std::cout << "\"erle\": {"; >- std::cout << "\"min\": " << cfg.erle.min << ","; >- std::cout << "\"max_l\": " << cfg.erle.max_l << ","; >- std::cout << "\"max_h\": " << cfg.erle.max_h << ","; >- std::cout << "\"onset_detection\": " >- << (cfg.erle.onset_detection ? "true" : "false"); >- std::cout << "},"; >- >- std::cout << "\"ep_strength\": {"; >- std::cout << "\"lf\": " << cfg.ep_strength.lf << ","; >- std::cout << "\"mf\": " << cfg.ep_strength.mf << ","; >- std::cout << "\"hf\": " << cfg.ep_strength.hf << ","; >- std::cout << "\"default_len\": " << cfg.ep_strength.default_len << ","; >- std::cout << "\"reverb_based_on_render\": " >- << (cfg.ep_strength.reverb_based_on_render ? 
"true" : "false") >- << ","; >- std::cout << "\"echo_can_saturate\": " >- << (cfg.ep_strength.echo_can_saturate ? "true" : "false") << ","; >- std::cout << "\"bounded_erl\": " >- << (cfg.ep_strength.bounded_erl ? "true" : "false"); >- >- std::cout << "},"; >- >- std::cout << "\"gain_mask\": {"; >- std::cout << "\"m0\": " << cfg.gain_mask.m0 << ","; >- std::cout << "\"m1\": " << cfg.gain_mask.m1 << ","; >- std::cout << "\"m2\": " << cfg.gain_mask.m2 << ","; >- std::cout << "\"m3\": " << cfg.gain_mask.m3 << ","; >- std::cout << "\"m5\": " << cfg.gain_mask.m5 << ","; >- std::cout << "\"m6\": " << cfg.gain_mask.m6 << ","; >- std::cout << "\"m7\": " << cfg.gain_mask.m7 << ","; >- std::cout << "\"m8\": " << cfg.gain_mask.m8 << ","; >- std::cout << "\"m9\": " << cfg.gain_mask.m9 << ","; >- std::cout << "\"gain_curve_offset\": " << cfg.gain_mask.gain_curve_offset >- << ","; >- std::cout << "\"gain_curve_slope\": " << cfg.gain_mask.gain_curve_slope >- << ","; >- std::cout << "\"temporal_masking_lf\": " << cfg.gain_mask.temporal_masking_lf >- << ","; >- std::cout << "\"temporal_masking_hf\": " << cfg.gain_mask.temporal_masking_hf >- << ","; >- std::cout << "\"temporal_masking_lf_bands\": " >- << cfg.gain_mask.temporal_masking_lf_bands; >- std::cout << "},"; >- >- std::cout << "\"echo_audibility\": {"; >- std::cout << "\"low_render_limit\": " << cfg.echo_audibility.low_render_limit >- << ","; >- std::cout << "\"normal_render_limit\": " >- << cfg.echo_audibility.normal_render_limit << ","; >- std::cout << "\"floor_power\": " << cfg.echo_audibility.floor_power << ","; >- std::cout << "\"audibility_threshold_lf\": " >- << cfg.echo_audibility.audibility_threshold_lf << ","; >- std::cout << "\"audibility_threshold_mf\": " >- << cfg.echo_audibility.audibility_threshold_mf << ","; >- std::cout << "\"audibility_threshold_hf\": " >- << cfg.echo_audibility.audibility_threshold_hf << ","; >- std::cout << "\"use_stationary_properties\": " >- << 
(cfg.echo_audibility.use_stationary_properties ? "true" >- : "false") >- << ","; >- std::cout << "\"use_stationarity_properties_at_init\": " >- << (cfg.echo_audibility.use_stationarity_properties_at_init >- ? "true" >- : "false"); >- std::cout << "},"; >- >- std::cout << "\"render_levels\": {"; >- std::cout << "\"active_render_limit\": " >- << cfg.render_levels.active_render_limit << ","; >- std::cout << "\"poor_excitation_render_limit\": " >- << cfg.render_levels.poor_excitation_render_limit << ","; >- std::cout << "\"poor_excitation_render_limit_ds8\": " >- << cfg.render_levels.poor_excitation_render_limit_ds8; >- std::cout << "},"; >- >- std::cout << "\"echo_removal_control\": {"; >- std::cout << "\"gain_rampup\": {"; >- std::cout << "\"initial_gain\": " >- << cfg.echo_removal_control.gain_rampup.initial_gain << ","; >- std::cout << "\"first_non_zero_gain\": " >- << cfg.echo_removal_control.gain_rampup.first_non_zero_gain << ","; >- std::cout << "\"non_zero_gain_blocks\": " >- << cfg.echo_removal_control.gain_rampup.non_zero_gain_blocks << ","; >- std::cout << "\"full_gain_blocks\": " >- << cfg.echo_removal_control.gain_rampup.full_gain_blocks; >- std::cout << "},"; >- std::cout << "\"has_clock_drift\": " >- << (cfg.echo_removal_control.has_clock_drift ? "true" : "false") >- << ","; >- std::cout << "\"linear_and_stable_echo_path\": " >- << (cfg.echo_removal_control.linear_and_stable_echo_path ? 
"true" >- : "false"); >- >- std::cout << "},"; >- >- std::cout << "\"echo_model\": {"; >- std::cout << "\"noise_floor_hold\": " << cfg.echo_model.noise_floor_hold >- << ","; >- std::cout << "\"min_noise_floor_power\": " >- << cfg.echo_model.min_noise_floor_power << ","; >- std::cout << "\"stationary_gate_slope\": " >- << cfg.echo_model.stationary_gate_slope << ","; >- std::cout << "\"noise_gate_power\": " << cfg.echo_model.noise_gate_power >- << ","; >- std::cout << "\"noise_gate_slope\": " << cfg.echo_model.noise_gate_slope >- << ","; >- std::cout << "\"render_pre_window_size\": " >- << cfg.echo_model.render_pre_window_size << ","; >- std::cout << "\"render_post_window_size\": " >- << cfg.echo_model.render_post_window_size << ","; >- std::cout << "\"render_pre_window_size_init\": " >- << cfg.echo_model.render_pre_window_size_init << ","; >- std::cout << "\"render_post_window_size_init\": " >- << cfg.echo_model.render_post_window_size_init << ","; >- std::cout << "\"nonlinear_hold\": " << cfg.echo_model.nonlinear_hold << ","; >- std::cout << "\"nonlinear_release\": " << cfg.echo_model.nonlinear_release; >- std::cout << "},"; >- >- std::cout << "\"suppressor\": {"; >- std::cout << "\"nearend_average_blocks\": " >- << cfg.suppressor.nearend_average_blocks << ","; >- std::cout << "\"normal_tuning\": {"; >- std::cout << "\"mask_lf\": ["; >- std::cout << cfg.suppressor.normal_tuning.mask_lf.enr_transparent << ","; >- std::cout << cfg.suppressor.normal_tuning.mask_lf.enr_suppress << ","; >- std::cout << cfg.suppressor.normal_tuning.mask_lf.emr_transparent; >- std::cout << "],"; >- std::cout << "\"mask_hf\": ["; >- std::cout << cfg.suppressor.normal_tuning.mask_hf.enr_transparent << ","; >- std::cout << cfg.suppressor.normal_tuning.mask_hf.enr_suppress << ","; >- std::cout << cfg.suppressor.normal_tuning.mask_hf.emr_transparent; >- std::cout << "],"; >- std::cout << "\"max_inc_factor\": " >- << cfg.suppressor.normal_tuning.max_inc_factor << ","; >- std::cout << 
"\"max_dec_factor_lf\": " >- << cfg.suppressor.normal_tuning.max_dec_factor_lf; >- std::cout << "},"; >- std::cout << "\"nearend_tuning\": {"; >- std::cout << "\"mask_lf\": ["; >- std::cout << cfg.suppressor.nearend_tuning.mask_lf.enr_transparent << ","; >- std::cout << cfg.suppressor.nearend_tuning.mask_lf.enr_suppress << ","; >- std::cout << cfg.suppressor.nearend_tuning.mask_lf.emr_transparent; >- std::cout << "],"; >- std::cout << "\"mask_hf\": ["; >- std::cout << cfg.suppressor.nearend_tuning.mask_hf.enr_transparent << ","; >- std::cout << cfg.suppressor.nearend_tuning.mask_hf.enr_suppress << ","; >- std::cout << cfg.suppressor.nearend_tuning.mask_hf.emr_transparent; >- std::cout << "],"; >- std::cout << "\"max_inc_factor\": " >- << cfg.suppressor.nearend_tuning.max_inc_factor << ","; >- std::cout << "\"max_dec_factor_lf\": " >- << cfg.suppressor.nearend_tuning.max_dec_factor_lf; >- std::cout << "},"; >- std::cout << "\"dominant_nearend_detection\": {"; >- std::cout << "\"enr_threshold\": " >- << cfg.suppressor.dominant_nearend_detection.enr_threshold << ","; >- std::cout << "\"snr_threshold\": " >- << cfg.suppressor.dominant_nearend_detection.snr_threshold << ","; >- std::cout << "\"hold_duration\": " >- << cfg.suppressor.dominant_nearend_detection.hold_duration << ","; >- std::cout << "\"trigger_threshold\": " >- << cfg.suppressor.dominant_nearend_detection.trigger_threshold; >- std::cout << "},"; >- std::cout << "\"high_bands_suppression\": {"; >- std::cout << "\"enr_threshold\": " >- << cfg.suppressor.high_bands_suppression.enr_threshold << ","; >- std::cout << "\"max_gain_during_echo\": " >- << cfg.suppressor.high_bands_suppression.max_gain_during_echo; >- std::cout << "},"; >- std::cout << "\"floor_first_increase\": " >- << cfg.suppressor.floor_first_increase << ","; >- std::cout << "\"enforce_transparent\": " >- << (cfg.suppressor.enforce_transparent ? 
"true" : "false") << ","; >- std::cout << "\"enforce_empty_higher_bands\": " >- << (cfg.suppressor.enforce_empty_higher_bands ? "true" : "false"); >- std::cout << "}"; >- std::cout << "}"; >- std::cout << std::endl; >+// Helper for reading JSON from a file and parsing it to an AEC3 configuration. >+EchoCanceller3Config ReadAec3ConfigFromJsonFile(const std::string& filename) { >+ std::string json_string; >+ std::string s; >+ std::ifstream f(filename.c_str()); >+ if (f.fail()) { >+ std::cout << "Failed to open the file " << filename << std::endl; >+ RTC_CHECK(false); >+ } >+ while (std::getline(f, s)) { >+ json_string += s; >+ } >+ >+ bool parsing_successful; >+ EchoCanceller3Config cfg; >+ Aec3ConfigFromJsonString(json_string, &cfg, &parsing_successful); >+ if (!parsing_successful) { >+ std::cout << "Parsing of json string failed: " << std::endl >+ << json_string << std::endl; >+ RTC_CHECK(false); >+ } >+ RTC_CHECK(EchoCanceller3Config::Validate(&cfg)); >+ >+ return cfg; > } > >-// Class for parsing the AEC3 parameters from a JSON file and producing a config >-// struct. 
>-class Aec3ParametersParser { >- public: >- static EchoCanceller3Config Parse(const std::string& filename) { >- return Aec3ParametersParser().ParseFile(filename); >- } >- >- private: >- Aec3ParametersParser() = default; >- >- void ReadParam(const Json::Value& root, >- std::string param_name, >- bool* param) const { >- RTC_CHECK(param); >- bool v; >- if (rtc::GetBoolFromJsonObject(root, param_name, &v)) { >- *param = v; >- } >- } >- >- void ReadParam(const Json::Value& root, >- std::string param_name, >- size_t* param) const { >- RTC_CHECK(param); >- int v; >- if (rtc::GetIntFromJsonObject(root, param_name, &v)) { >- *param = v; >- } >- } >- >- void ReadParam(const Json::Value& root, >- std::string param_name, >- int* param) const { >- RTC_CHECK(param); >- int v; >- if (rtc::GetIntFromJsonObject(root, param_name, &v)) { >- *param = v; >- } >- } >- >- void ReadParam(const Json::Value& root, >- std::string param_name, >- float* param) const { >- RTC_CHECK(param); >- double v; >- if (rtc::GetDoubleFromJsonObject(root, param_name, &v)) { >- *param = static_cast<float>(v); >- } >- } >- >- void ReadParam(const Json::Value& root, >- std::string param_name, >- EchoCanceller3Config::Filter::MainConfiguration* param) const { >- RTC_CHECK(param); >- Json::Value json_array; >- if (rtc::GetValueFromJsonObject(root, param_name, &json_array)) { >- std::vector<double> v; >- rtc::JsonArrayToDoubleVector(json_array, &v); >- if (v.size() != 5) { >- std::cout << "Incorrect array size for " << param_name << std::endl; >- RTC_CHECK(false); >- } >- param->length_blocks = static_cast<size_t>(v[0]); >- param->leakage_converged = static_cast<float>(v[1]); >- param->leakage_diverged = static_cast<float>(v[2]); >- param->error_floor = static_cast<float>(v[3]); >- param->noise_gate = static_cast<float>(v[4]); >- } >- } >- >- void ReadParam( >- const Json::Value& root, >- std::string param_name, >- EchoCanceller3Config::Filter::ShadowConfiguration* param) const { >- RTC_CHECK(param); >- 
Json::Value json_array; >- if (rtc::GetValueFromJsonObject(root, param_name, &json_array)) { >- std::vector<double> v; >- rtc::JsonArrayToDoubleVector(json_array, &v); >- if (v.size() != 3) { >- std::cout << "Incorrect array size for " << param_name << std::endl; >- RTC_CHECK(false); >- } >- param->length_blocks = static_cast<size_t>(v[0]); >- param->rate = static_cast<float>(v[1]); >- param->noise_gate = static_cast<float>(v[2]); >- } >- } >- >- void ReadParam( >- const Json::Value& root, >- std::string param_name, >- EchoCanceller3Config::Suppressor::MaskingThresholds* param) const { >- RTC_CHECK(param); >- Json::Value json_array; >- if (rtc::GetValueFromJsonObject(root, param_name, &json_array)) { >- std::vector<double> v; >- rtc::JsonArrayToDoubleVector(json_array, &v); >- if (v.size() != 3) { >- std::cout << "Incorrect array size for " << param_name << std::endl; >- RTC_CHECK(false); >- } >- param->enr_transparent = static_cast<float>(v[0]); >- param->enr_suppress = static_cast<float>(v[1]); >- param->emr_transparent = static_cast<float>(v[2]); >- } >- } >- >- EchoCanceller3Config ParseFile(const std::string& filename) const { >- EchoCanceller3Config cfg; >- Json::Value root; >- std::string s; >- std::string json_string; >- std::ifstream f(filename.c_str()); >- >- if (f.fail()) { >- std::cout << "Failed to open the file " << filename << std::endl; >- RTC_CHECK(false); >- } >- >- while (std::getline(f, s)) { >- json_string += s; >- } >- bool success = Json::Reader().parse(json_string, root); >- if (!success) { >- std::cout << "Incorrect JSON format:" << std::endl; >- std::cout << json_string << std::endl; >- RTC_CHECK(false); >- } >- >- Json::Value section; >- if (rtc::GetValueFromJsonObject(root, "delay", &section)) { >- ReadParam(section, "default_delay", &cfg.delay.default_delay); >- ReadParam(section, "down_sampling_factor", >- &cfg.delay.down_sampling_factor); >- ReadParam(section, "num_filters", &cfg.delay.num_filters); >- ReadParam(section, 
"api_call_jitter_blocks", >- &cfg.delay.api_call_jitter_blocks); >- ReadParam(section, "min_echo_path_delay_blocks", >- &cfg.delay.min_echo_path_delay_blocks); >- ReadParam(section, "delay_headroom_blocks", >- &cfg.delay.delay_headroom_blocks); >- ReadParam(section, "hysteresis_limit_1_blocks", >- &cfg.delay.hysteresis_limit_1_blocks); >- ReadParam(section, "hysteresis_limit_2_blocks", >- &cfg.delay.hysteresis_limit_2_blocks); >- ReadParam(section, "skew_hysteresis_blocks", >- &cfg.delay.skew_hysteresis_blocks); >- ReadParam(section, "fixed_capture_delay_samples", >- &cfg.delay.fixed_capture_delay_samples); >- ReadParam(section, "delay_estimate_smoothing", >- &cfg.delay.delay_estimate_smoothing); >- ReadParam(section, "delay_candidate_detection_threshold", >- &cfg.delay.delay_candidate_detection_threshold); >- >- Json::Value subsection; >- if (rtc::GetValueFromJsonObject(section, "delay_selection_thresholds", >- &subsection)) { >- ReadParam(subsection, "initial", >- &cfg.delay.delay_selection_thresholds.initial); >- ReadParam(subsection, "converged", >- &cfg.delay.delay_selection_thresholds.converged); >- } >- } >- >- if (rtc::GetValueFromJsonObject(root, "filter", &section)) { >- ReadParam(section, "main", &cfg.filter.main); >- ReadParam(section, "shadow", &cfg.filter.shadow); >- ReadParam(section, "main_initial", &cfg.filter.main_initial); >- ReadParam(section, "shadow_initial", &cfg.filter.shadow_initial); >- ReadParam(section, "config_change_duration_blocks", >- &cfg.filter.config_change_duration_blocks); >- ReadParam(section, "initial_state_seconds", >- &cfg.filter.initial_state_seconds); >- ReadParam(section, "conservative_initial_phase", >- &cfg.filter.conservative_initial_phase); >- ReadParam(section, "enable_shadow_filter_output_usage", >- &cfg.filter.enable_shadow_filter_output_usage); >- } >- >- if (rtc::GetValueFromJsonObject(root, "erle", &section)) { >- ReadParam(section, "min", &cfg.erle.min); >- ReadParam(section, "max_l", &cfg.erle.max_l); >- 
ReadParam(section, "max_h", &cfg.erle.max_h); >- ReadParam(section, "onset_detection", &cfg.erle.onset_detection); >- } >- >- if (rtc::GetValueFromJsonObject(root, "ep_strength", &section)) { >- ReadParam(section, "lf", &cfg.ep_strength.lf); >- ReadParam(section, "mf", &cfg.ep_strength.mf); >- ReadParam(section, "hf", &cfg.ep_strength.hf); >- ReadParam(section, "default_len", &cfg.ep_strength.default_len); >- ReadParam(section, "reverb_based_on_render", >- &cfg.ep_strength.reverb_based_on_render); >- ReadParam(section, "echo_can_saturate", >- &cfg.ep_strength.echo_can_saturate); >- ReadParam(section, "bounded_erl", &cfg.ep_strength.bounded_erl); >- } >- >- if (rtc::GetValueFromJsonObject(root, "gain_mask", &section)) { >- ReadParam(section, "m1", &cfg.gain_mask.m1); >- ReadParam(section, "m2", &cfg.gain_mask.m2); >- ReadParam(section, "m3", &cfg.gain_mask.m3); >- ReadParam(section, "m5", &cfg.gain_mask.m5); >- ReadParam(section, "m6", &cfg.gain_mask.m6); >- ReadParam(section, "m7", &cfg.gain_mask.m7); >- ReadParam(section, "m8", &cfg.gain_mask.m8); >- ReadParam(section, "m9", &cfg.gain_mask.m9); >- >- ReadParam(section, "gain_curve_offset", &cfg.gain_mask.gain_curve_offset); >- ReadParam(section, "gain_curve_slope", &cfg.gain_mask.gain_curve_slope); >- ReadParam(section, "temporal_masking_lf", >- &cfg.gain_mask.temporal_masking_lf); >- ReadParam(section, "temporal_masking_hf", >- &cfg.gain_mask.temporal_masking_hf); >- ReadParam(section, "temporal_masking_lf_bands", >- &cfg.gain_mask.temporal_masking_lf_bands); >- } >- >- if (rtc::GetValueFromJsonObject(root, "echo_audibility", &section)) { >- ReadParam(section, "low_render_limit", >- &cfg.echo_audibility.low_render_limit); >- ReadParam(section, "normal_render_limit", >- &cfg.echo_audibility.normal_render_limit); >- >- ReadParam(section, "floor_power", &cfg.echo_audibility.floor_power); >- ReadParam(section, "audibility_threshold_lf", >- &cfg.echo_audibility.audibility_threshold_lf); >- ReadParam(section, 
"audibility_threshold_mf", >- &cfg.echo_audibility.audibility_threshold_mf); >- ReadParam(section, "audibility_threshold_hf", >- &cfg.echo_audibility.audibility_threshold_hf); >- ReadParam(section, "use_stationary_properties", >- &cfg.echo_audibility.use_stationary_properties); >- ReadParam(section, "use_stationary_properties_at_init", >- &cfg.echo_audibility.use_stationarity_properties_at_init); >- } >- >- if (rtc::GetValueFromJsonObject(root, "echo_removal_control", &section)) { >- Json::Value subsection; >- if (rtc::GetValueFromJsonObject(section, "gain_rampup", &subsection)) { >- ReadParam(subsection, "initial_gain", >- &cfg.echo_removal_control.gain_rampup.initial_gain); >- ReadParam(subsection, "first_non_zero_gain", >- &cfg.echo_removal_control.gain_rampup.first_non_zero_gain); >- ReadParam(subsection, "non_zero_gain_blocks", >- &cfg.echo_removal_control.gain_rampup.non_zero_gain_blocks); >- ReadParam(subsection, "full_gain_blocks", >- &cfg.echo_removal_control.gain_rampup.full_gain_blocks); >- } >- ReadParam(section, "has_clock_drift", >- &cfg.echo_removal_control.has_clock_drift); >- ReadParam(section, "linear_and_stable_echo_path", >- &cfg.echo_removal_control.linear_and_stable_echo_path); >- } >- >- if (rtc::GetValueFromJsonObject(root, "echo_model", &section)) { >- Json::Value subsection; >- ReadParam(section, "noise_floor_hold", &cfg.echo_model.noise_floor_hold); >- ReadParam(section, "min_noise_floor_power", >- &cfg.echo_model.min_noise_floor_power); >- ReadParam(section, "stationary_gate_slope", >- &cfg.echo_model.stationary_gate_slope); >- ReadParam(section, "noise_gate_power", &cfg.echo_model.noise_gate_power); >- ReadParam(section, "noise_gate_slope", &cfg.echo_model.noise_gate_slope); >- ReadParam(section, "render_pre_window_size", >- &cfg.echo_model.render_pre_window_size); >- ReadParam(section, "render_post_window_size", >- &cfg.echo_model.render_post_window_size); >- ReadParam(section, "render_pre_window_size_init", >- 
&cfg.echo_model.render_pre_window_size_init); >- ReadParam(section, "render_post_window_size_init", >- &cfg.echo_model.render_post_window_size_init); >- ReadParam(section, "nonlinear_hold", &cfg.echo_model.nonlinear_hold); >- ReadParam(section, "nonlinear_release", >- &cfg.echo_model.nonlinear_release); >- } >- >- Json::Value subsection; >- if (rtc::GetValueFromJsonObject(root, "suppressor", &section)) { >- ReadParam(section, "nearend_average_blocks", >- &cfg.suppressor.nearend_average_blocks); >- >- if (rtc::GetValueFromJsonObject(section, "normal_tuning", &subsection)) { >- ReadParam(subsection, "mask_lf", &cfg.suppressor.normal_tuning.mask_lf); >- ReadParam(subsection, "mask_hf", &cfg.suppressor.normal_tuning.mask_hf); >- ReadParam(subsection, "max_inc_factor", >- &cfg.suppressor.normal_tuning.max_inc_factor); >- ReadParam(subsection, "max_dec_factor_lf", >- &cfg.suppressor.normal_tuning.max_dec_factor_lf); >- } >- >- if (rtc::GetValueFromJsonObject(section, "nearend_tuning", &subsection)) { >- ReadParam(subsection, "mask_lf", >- &cfg.suppressor.nearend_tuning.mask_lf); >- ReadParam(subsection, "mask_hf", >- &cfg.suppressor.nearend_tuning.mask_hf); >- ReadParam(subsection, "max_inc_factor", >- &cfg.suppressor.nearend_tuning.max_inc_factor); >- ReadParam(subsection, "max_dec_factor_lf", >- &cfg.suppressor.nearend_tuning.max_dec_factor_lf); >- } >- >- if (rtc::GetValueFromJsonObject(section, "dominant_nearend_detection", >- &subsection)) { >- ReadParam(subsection, "enr_threshold", >- &cfg.suppressor.dominant_nearend_detection.enr_threshold); >- ReadParam(subsection, "snr_threshold", >- &cfg.suppressor.dominant_nearend_detection.snr_threshold); >- ReadParam(subsection, "hold_duration", >- &cfg.suppressor.dominant_nearend_detection.hold_duration); >- ReadParam(subsection, "trigger_threshold", >- &cfg.suppressor.dominant_nearend_detection.trigger_threshold); >- } >- >- if (rtc::GetValueFromJsonObject(section, "high_bands_suppression", >- &subsection)) { >- 
ReadParam(subsection, "enr_threshold", >- &cfg.suppressor.high_bands_suppression.enr_threshold); >- ReadParam(subsection, "max_gain_during_echo", >- &cfg.suppressor.high_bands_suppression.max_gain_during_echo); >- } >- >- ReadParam(section, "floor_first_increase", >- &cfg.suppressor.floor_first_increase); >- ReadParam(section, "enforce_transparent", >- &cfg.suppressor.enforce_transparent); >- ReadParam(section, "enforce_empty_higher_bands", >- &cfg.suppressor.enforce_empty_higher_bands); >- } >- >- std::cout << std::endl; >- return cfg; >- } >-}; >- > void CopyFromAudioFrame(const AudioFrame& src, ChannelBuffer<float>* dest) { > RTC_CHECK_EQ(src.num_channels_, dest->num_channels()); > RTC_CHECK_EQ(src.samples_per_channel_, dest->num_frames()); >@@ -695,6 +126,13 @@ AudioProcessingSimulator::AudioProcessingSimulator( > settings.initial_mic_level, > settings_.simulate_mic_gain ? *settings.simulated_mic_kind : 0), > worker_queue_("file_writer_task_queue") { >+ RTC_CHECK(!settings_.dump_internal_data || WEBRTC_APM_DEBUG_DUMP == 1); >+ ApmDataDumper::SetActivated(settings_.dump_internal_data); >+ if (settings_.dump_internal_data_output_dir.has_value()) { >+ ApmDataDumper::SetOutputDirectory( >+ settings_.dump_internal_data_output_dir.value()); >+ } >+ > if (settings_.ed_graph_output_filename && > !settings_.ed_graph_output_filename->empty()) { > residual_echo_likelihood_graph_writer_.open( >@@ -776,9 +214,9 @@ void AudioProcessingSimulator::ProcessStream(bool fixed_interface) { > } > > if (residual_echo_likelihood_graph_writer_.is_open()) { >- auto stats = ap_->GetStatistics(); >- residual_echo_likelihood_graph_writer_ << stats.residual_echo_likelihood >- << ", "; >+ auto stats = ap_->GetStatistics(true /*has_remote_tracks*/); >+ residual_echo_likelihood_graph_writer_ >+ << stats.residual_echo_likelihood.value_or(-1.f) << ", "; > } > > ++num_process_stream_calls_; >@@ -918,10 +356,13 @@ void AudioProcessingSimulator::CreateAudioProcessor() { > } > if 
(settings_.use_agc2) { > apm_config.gain_controller2.enabled = *settings_.use_agc2; >- apm_config.gain_controller2.fixed_gain_db = settings_.agc2_fixed_gain_db; >+ apm_config.gain_controller2.fixed_digital.gain_db = >+ settings_.agc2_fixed_gain_db; > if (settings_.agc2_use_adaptive_gain) { >- apm_config.gain_controller2.adaptive_digital_mode = >+ apm_config.gain_controller2.adaptive_digital.enabled = > *settings_.agc2_use_adaptive_gain; >+ apm_config.gain_controller2.adaptive_digital.level_estimator = >+ settings_.agc2_adaptive_level_estimator; > } > } > if (settings_.use_pre_amplifier) { >@@ -944,7 +385,7 @@ void AudioProcessingSimulator::CreateAudioProcessor() { > if (settings_.use_verbose_logging) { > std::cout << "Reading AEC3 Parameters from JSON input." << std::endl; > } >- cfg = Aec3ParametersParser::Parse(*settings_.aec3_settings_filename); >+ cfg = ReadAec3ConfigFromJsonFile(*settings_.aec3_settings_filename); > } > echo_control_factory.reset(new EchoCanceller3Factory(cfg)); > >@@ -952,7 +393,7 @@ void AudioProcessingSimulator::CreateAudioProcessor() { > if (!settings_.use_quiet_output) { > std::cout << "AEC3 settings:" << std::endl; > } >- PrintAec3ParameterValues(cfg); >+ std::cout << Aec3ConfigToJsonString(cfg) << std::endl; > } > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/audio_processing_simulator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/audio_processing_simulator.h >index 4e580b4fced9548d58ac2d55443f446fbb53e7c2..4b67537b49a5b00b6a57d2610924522e2fb826cf 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/audio_processing_simulator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/audio_processing_simulator.h >@@ -75,6 +75,8 @@ struct SimulationSettings { > absl::optional<int> agc_compression_gain; > absl::optional<bool> agc2_use_adaptive_gain; > float agc2_fixed_gain_db; >+ 
AudioProcessing::Config::GainController2::LevelEstimator >+ agc2_adaptive_level_estimator; > float pre_amplifier_gain_factor; > absl::optional<int> vad_likelihood; > absl::optional<int> ns_level; >@@ -92,6 +94,8 @@ struct SimulationSettings { > bool fixed_interface = false; > bool store_intermediate_output = false; > bool print_aec3_parameter_values = false; >+ bool dump_internal_data = false; >+ absl::optional<std::string> dump_internal_data_output_dir; > absl::optional<std::string> custom_call_order_filename; > absl::optional<std::string> aec3_settings_filename; > }; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/audioproc_float_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/audioproc_float_impl.cc >index 3a448877d819ad11601173b0957ecb3dadb12c51..0ff2806e6d5decb06d687700b6d23c4320c156e1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/audioproc_float_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/audioproc_float_impl.cc >@@ -14,13 +14,17 @@ > #include <memory> > #include <string> > #include <utility> >+#include <vector> > >+#include "absl/strings/string_view.h" > #include "modules/audio_processing/include/audio_processing.h" > #include "modules/audio_processing/test/aec_dump_based_simulator.h" > #include "modules/audio_processing/test/audio_processing_simulator.h" > #include "modules/audio_processing/test/audioproc_float_impl.h" > #include "modules/audio_processing/test/wav_based_simulator.h" >+#include "rtc_base/checks.h" > #include "rtc_base/flags.h" >+#include "rtc_base/strings/string_builder.h" > > namespace webrtc { > namespace test { >@@ -37,155 +41,185 @@ const char kUsageDescription[] = > "processing module, either based on wav files or " > "protobuf debug dump recordings.\n"; > >-DEFINE_string(dump_input, "", "Aec dump input filename"); >-DEFINE_string(dump_output, "", "Aec dump output filename"); 
>-DEFINE_string(i, "", "Forward stream input wav filename"); >-DEFINE_string(o, "", "Forward stream output wav filename"); >-DEFINE_string(ri, "", "Reverse stream input wav filename"); >-DEFINE_string(ro, "", "Reverse stream output wav filename"); >-DEFINE_string(artificial_nearend, "", "Artificial nearend wav filename"); >-DEFINE_int(output_num_channels, >- kParameterNotSpecifiedValue, >- "Number of forward stream output channels"); >-DEFINE_int(reverse_output_num_channels, >- kParameterNotSpecifiedValue, >- "Number of Reverse stream output channels"); >-DEFINE_int(output_sample_rate_hz, >- kParameterNotSpecifiedValue, >- "Forward stream output sample rate in Hz"); >-DEFINE_int(reverse_output_sample_rate_hz, >- kParameterNotSpecifiedValue, >- "Reverse stream output sample rate in Hz"); >-DEFINE_bool(fixed_interface, >- false, >- "Use the fixed interface when operating on wav files"); >-DEFINE_int(aec, >- kParameterNotSpecifiedValue, >- "Activate (1) or deactivate(0) the echo canceller"); >-DEFINE_int(aecm, >- kParameterNotSpecifiedValue, >- "Activate (1) or deactivate(0) the mobile echo controller"); >-DEFINE_int(ed, >- kParameterNotSpecifiedValue, >- "Activate (1) or deactivate (0) the residual echo detector"); >-DEFINE_string(ed_graph, "", "Output filename for graph of echo likelihood"); >-DEFINE_int(agc, >- kParameterNotSpecifiedValue, >- "Activate (1) or deactivate(0) the AGC"); >-DEFINE_int(agc2, >- kParameterNotSpecifiedValue, >- "Activate (1) or deactivate(0) the AGC2"); >-DEFINE_int(pre_amplifier, >- kParameterNotSpecifiedValue, >- "Activate (1) or deactivate(0) the pre amplifier"); >-DEFINE_int(hpf, >- kParameterNotSpecifiedValue, >- "Activate (1) or deactivate(0) the high-pass filter"); >-DEFINE_int(ns, >- kParameterNotSpecifiedValue, >- "Activate (1) or deactivate(0) the noise suppressor"); >-DEFINE_int(ts, >- kParameterNotSpecifiedValue, >- "Activate (1) or deactivate(0) the transient suppressor"); >-DEFINE_int(vad, >- kParameterNotSpecifiedValue, >- 
"Activate (1) or deactivate(0) the voice activity detector"); >-DEFINE_int(le, >- kParameterNotSpecifiedValue, >- "Activate (1) or deactivate(0) the level estimator"); >-DEFINE_bool(all_default, >- false, >- "Activate all of the default components (will be overridden by any " >- "other settings)"); >-DEFINE_int(aec_suppression_level, >- kParameterNotSpecifiedValue, >- "Set the aec suppression level (0-2)"); >-DEFINE_int(delay_agnostic, >- kParameterNotSpecifiedValue, >- "Activate (1) or deactivate(0) the AEC delay agnostic mode"); >-DEFINE_int(extended_filter, >- kParameterNotSpecifiedValue, >- "Activate (1) or deactivate(0) the AEC extended filter mode"); >-DEFINE_int(aec3, >- kParameterNotSpecifiedValue, >- "Activate (1) or deactivate(0) the experimental AEC mode AEC3"); >-DEFINE_int(experimental_agc, >- kParameterNotSpecifiedValue, >- "Activate (1) or deactivate(0) the experimental AGC"); >-DEFINE_int(experimental_agc_disable_digital_adaptive, >- kParameterNotSpecifiedValue, >- "Force-deactivate (1) digital adaptation in " >- "experimental AGC. Digital adaptation is active by default (0)."); >-DEFINE_int(experimental_agc_analyze_before_aec, >- kParameterNotSpecifiedValue, >- "Make level estimation happen before AEC" >- " in the experimental AGC. 
After AEC is the default (0)"); >-DEFINE_int( >+WEBRTC_DEFINE_string(dump_input, "", "Aec dump input filename"); >+WEBRTC_DEFINE_string(dump_output, "", "Aec dump output filename"); >+WEBRTC_DEFINE_string(i, "", "Forward stream input wav filename"); >+WEBRTC_DEFINE_string(o, "", "Forward stream output wav filename"); >+WEBRTC_DEFINE_string(ri, "", "Reverse stream input wav filename"); >+WEBRTC_DEFINE_string(ro, "", "Reverse stream output wav filename"); >+WEBRTC_DEFINE_string(artificial_nearend, "", "Artificial nearend wav filename"); >+WEBRTC_DEFINE_int(output_num_channels, >+ kParameterNotSpecifiedValue, >+ "Number of forward stream output channels"); >+WEBRTC_DEFINE_int(reverse_output_num_channels, >+ kParameterNotSpecifiedValue, >+ "Number of Reverse stream output channels"); >+WEBRTC_DEFINE_int(output_sample_rate_hz, >+ kParameterNotSpecifiedValue, >+ "Forward stream output sample rate in Hz"); >+WEBRTC_DEFINE_int(reverse_output_sample_rate_hz, >+ kParameterNotSpecifiedValue, >+ "Reverse stream output sample rate in Hz"); >+WEBRTC_DEFINE_bool(fixed_interface, >+ false, >+ "Use the fixed interface when operating on wav files"); >+WEBRTC_DEFINE_int(aec, >+ kParameterNotSpecifiedValue, >+ "Activate (1) or deactivate(0) the echo canceller"); >+WEBRTC_DEFINE_int(aecm, >+ kParameterNotSpecifiedValue, >+ "Activate (1) or deactivate(0) the mobile echo controller"); >+WEBRTC_DEFINE_int(ed, >+ kParameterNotSpecifiedValue, >+ "Activate (1) or deactivate (0) the residual echo detector"); >+WEBRTC_DEFINE_string(ed_graph, >+ "", >+ "Output filename for graph of echo likelihood"); >+WEBRTC_DEFINE_int(agc, >+ kParameterNotSpecifiedValue, >+ "Activate (1) or deactivate(0) the AGC"); >+WEBRTC_DEFINE_int(agc2, >+ kParameterNotSpecifiedValue, >+ "Activate (1) or deactivate(0) the AGC2"); >+WEBRTC_DEFINE_int(pre_amplifier, >+ kParameterNotSpecifiedValue, >+ "Activate (1) or deactivate(0) the pre amplifier"); >+WEBRTC_DEFINE_int(hpf, >+ kParameterNotSpecifiedValue, >+ "Activate (1) 
or deactivate(0) the high-pass filter"); >+WEBRTC_DEFINE_int(ns, >+ kParameterNotSpecifiedValue, >+ "Activate (1) or deactivate(0) the noise suppressor"); >+WEBRTC_DEFINE_int(ts, >+ kParameterNotSpecifiedValue, >+ "Activate (1) or deactivate(0) the transient suppressor"); >+WEBRTC_DEFINE_int(vad, >+ kParameterNotSpecifiedValue, >+ "Activate (1) or deactivate(0) the voice activity detector"); >+WEBRTC_DEFINE_int(le, >+ kParameterNotSpecifiedValue, >+ "Activate (1) or deactivate(0) the level estimator"); >+WEBRTC_DEFINE_bool( >+ all_default, >+ false, >+ "Activate all of the default components (will be overridden by any " >+ "other settings)"); >+WEBRTC_DEFINE_int(aec_suppression_level, >+ kParameterNotSpecifiedValue, >+ "Set the aec suppression level (0-2)"); >+WEBRTC_DEFINE_int(delay_agnostic, >+ kParameterNotSpecifiedValue, >+ "Activate (1) or deactivate(0) the AEC delay agnostic mode"); >+WEBRTC_DEFINE_int(extended_filter, >+ kParameterNotSpecifiedValue, >+ "Activate (1) or deactivate(0) the AEC extended filter mode"); >+WEBRTC_DEFINE_int( >+ aec3, >+ kParameterNotSpecifiedValue, >+ "Activate (1) or deactivate(0) the experimental AEC mode AEC3"); >+WEBRTC_DEFINE_int(experimental_agc, >+ kParameterNotSpecifiedValue, >+ "Activate (1) or deactivate(0) the experimental AGC"); >+WEBRTC_DEFINE_int( >+ experimental_agc_disable_digital_adaptive, >+ kParameterNotSpecifiedValue, >+ "Force-deactivate (1) digital adaptation in " >+ "experimental AGC. Digital adaptation is active by default (0)."); >+WEBRTC_DEFINE_int(experimental_agc_analyze_before_aec, >+ kParameterNotSpecifiedValue, >+ "Make level estimation happen before AEC" >+ " in the experimental AGC. After AEC is the default (0)"); >+WEBRTC_DEFINE_int( > experimental_agc_agc2_level_estimator, > kParameterNotSpecifiedValue, > "AGC2 level estimation" > " in the experimental AGC. 
AGC1 level estimation is the default (0)"); >-DEFINE_int( >+WEBRTC_DEFINE_int( > refined_adaptive_filter, > kParameterNotSpecifiedValue, > "Activate (1) or deactivate(0) the refined adaptive filter functionality"); >-DEFINE_int(agc_mode, kParameterNotSpecifiedValue, "Specify the AGC mode (0-2)"); >-DEFINE_int(agc_target_level, >- kParameterNotSpecifiedValue, >- "Specify the AGC target level (0-31)"); >-DEFINE_int(agc_limiter, >- kParameterNotSpecifiedValue, >- "Activate (1) or deactivate(0) the level estimator"); >-DEFINE_int(agc_compression_gain, >- kParameterNotSpecifiedValue, >- "Specify the AGC compression gain (0-90)"); >-DEFINE_float(agc2_enable_adaptive_gain, >- kParameterNotSpecifiedValue, >- "Activate (1) or deactivate(0) the AGC2 adaptive gain"); >-DEFINE_float(agc2_fixed_gain_db, 0.f, "AGC2 fixed gain (dB) to apply"); >-DEFINE_float(pre_amplifier_gain_factor, >- 1.f, >- "Pre-amplifier gain factor (linear) to apply"); >-DEFINE_int(vad_likelihood, >- kParameterNotSpecifiedValue, >- "Specify the VAD likelihood (0-3)"); >-DEFINE_int(ns_level, kParameterNotSpecifiedValue, "Specify the NS level (0-3)"); >-DEFINE_int(stream_delay, >- kParameterNotSpecifiedValue, >- "Specify the stream delay in ms to use"); >-DEFINE_int(use_stream_delay, >- kParameterNotSpecifiedValue, >- "Activate (1) or deactivate(0) reporting the stream delay"); >-DEFINE_int(stream_drift_samples, >- kParameterNotSpecifiedValue, >- "Specify the number of stream drift samples to use"); >-DEFINE_int(initial_mic_level, 100, "Initial mic level (0-255)"); >-DEFINE_int(simulate_mic_gain, >- 0, >- "Activate (1) or deactivate(0) the analog mic gain simulation"); >-DEFINE_int(simulated_mic_kind, >- kParameterNotSpecifiedValue, >- "Specify which microphone kind to use for microphone simulation"); >-DEFINE_bool(performance_report, false, "Report the APM performance "); >-DEFINE_bool(verbose, false, "Produce verbose output"); >-DEFINE_bool(quiet, false, "Avoid producing information about the progress."); 
>-DEFINE_bool(bitexactness_report, >- false, >- "Report bitexactness for aec dump result reproduction"); >-DEFINE_bool(discard_settings_in_aecdump, >- false, >- "Discard any config settings specified in the aec dump"); >-DEFINE_bool(store_intermediate_output, >- false, >- "Creates new output files after each init"); >-DEFINE_string(custom_call_order_file, "", "Custom process API call order file"); >-DEFINE_bool(print_aec3_parameter_values, >- false, >- "Print parameter values used in AEC3 in JSON-format"); >-DEFINE_string(aec3_settings, >- "", >- "File in JSON-format with custom AEC3 settings"); >-DEFINE_bool(help, false, "Print this message"); >+WEBRTC_DEFINE_int(agc_mode, >+ kParameterNotSpecifiedValue, >+ "Specify the AGC mode (0-2)"); >+WEBRTC_DEFINE_int(agc_target_level, >+ kParameterNotSpecifiedValue, >+ "Specify the AGC target level (0-31)"); >+WEBRTC_DEFINE_int(agc_limiter, >+ kParameterNotSpecifiedValue, >+ "Activate (1) or deactivate(0) the level estimator"); >+WEBRTC_DEFINE_int(agc_compression_gain, >+ kParameterNotSpecifiedValue, >+ "Specify the AGC compression gain (0-90)"); >+WEBRTC_DEFINE_float(agc2_enable_adaptive_gain, >+ kParameterNotSpecifiedValue, >+ "Activate (1) or deactivate(0) the AGC2 adaptive gain"); >+WEBRTC_DEFINE_float(agc2_fixed_gain_db, 0.f, "AGC2 fixed gain (dB) to apply"); >+ >+std::vector<std::string> GetAgc2AdaptiveLevelEstimatorNames() { >+ return {"RMS", "peak"}; >+} >+WEBRTC_DEFINE_string( >+ agc2_adaptive_level_estimator, >+ "RMS", >+ "AGC2 adaptive digital level estimator to use [RMS, peak]"); >+ >+WEBRTC_DEFINE_float(pre_amplifier_gain_factor, >+ 1.f, >+ "Pre-amplifier gain factor (linear) to apply"); >+WEBRTC_DEFINE_int(vad_likelihood, >+ kParameterNotSpecifiedValue, >+ "Specify the VAD likelihood (0-3)"); >+WEBRTC_DEFINE_int(ns_level, >+ kParameterNotSpecifiedValue, >+ "Specify the NS level (0-3)"); >+WEBRTC_DEFINE_int(stream_delay, >+ kParameterNotSpecifiedValue, >+ "Specify the stream delay in ms to use"); 
>+WEBRTC_DEFINE_int(use_stream_delay, >+ kParameterNotSpecifiedValue, >+ "Activate (1) or deactivate(0) reporting the stream delay"); >+WEBRTC_DEFINE_int(stream_drift_samples, >+ kParameterNotSpecifiedValue, >+ "Specify the number of stream drift samples to use"); >+WEBRTC_DEFINE_int(initial_mic_level, 100, "Initial mic level (0-255)"); >+WEBRTC_DEFINE_int( >+ simulate_mic_gain, >+ 0, >+ "Activate (1) or deactivate(0) the analog mic gain simulation"); >+WEBRTC_DEFINE_int( >+ simulated_mic_kind, >+ kParameterNotSpecifiedValue, >+ "Specify which microphone kind to use for microphone simulation"); >+WEBRTC_DEFINE_bool(performance_report, false, "Report the APM performance "); >+WEBRTC_DEFINE_bool(verbose, false, "Produce verbose output"); >+WEBRTC_DEFINE_bool(quiet, >+ false, >+ "Avoid producing information about the progress."); >+WEBRTC_DEFINE_bool(bitexactness_report, >+ false, >+ "Report bitexactness for aec dump result reproduction"); >+WEBRTC_DEFINE_bool(discard_settings_in_aecdump, >+ false, >+ "Discard any config settings specified in the aec dump"); >+WEBRTC_DEFINE_bool(store_intermediate_output, >+ false, >+ "Creates new output files after each init"); >+WEBRTC_DEFINE_string(custom_call_order_file, >+ "", >+ "Custom process API call order file"); >+WEBRTC_DEFINE_bool(print_aec3_parameter_values, >+ false, >+ "Print parameter values used in AEC3 in JSON-format"); >+WEBRTC_DEFINE_string(aec3_settings, >+ "", >+ "File in JSON-format with custom AEC3 settings"); >+WEBRTC_DEFINE_bool(dump_data, >+ false, >+ "Dump internal data during the call (requires build flag)"); >+WEBRTC_DEFINE_string(dump_data_output_dir, >+ "", >+ "Internal data dump output directory"); >+WEBRTC_DEFINE_bool(help, false, "Print this message"); > > void SetSettingIfSpecified(const std::string& value, > absl::optional<std::string>* parameter) { >@@ -208,6 +242,27 @@ void SetSettingIfFlagSet(int32_t flag, absl::optional<bool>* parameter) { > } > } > 
>+AudioProcessing::Config::GainController2::LevelEstimator >+MapAgc2AdaptiveLevelEstimator(absl::string_view name) { >+ if (name.compare("RMS") == 0) { >+ return AudioProcessing::Config::GainController2::LevelEstimator::kRms; >+ } >+ if (name.compare("peak") == 0) { >+ return AudioProcessing::Config::GainController2::LevelEstimator::kPeak; >+ } >+ auto concat_strings = >+ [](const std::vector<std::string>& strings) -> std::string { >+ rtc::StringBuilder ss; >+ for (const auto& s : strings) { >+ ss << " " << s; >+ } >+ return ss.Release(); >+ }; >+ RTC_CHECK(false) >+ << "Invalid value for agc2_adaptive_level_estimator, valid options:" >+ << concat_strings(GetAgc2AdaptiveLevelEstimatorNames()) << "."; >+} >+ > SimulationSettings CreateSettings() { > SimulationSettings settings; > if (FLAG_all_default) { >@@ -275,6 +330,8 @@ SimulationSettings CreateSettings() { > SetSettingIfFlagSet(FLAG_agc2_enable_adaptive_gain, > &settings.agc2_use_adaptive_gain); > settings.agc2_fixed_gain_db = FLAG_agc2_fixed_gain_db; >+ settings.agc2_adaptive_level_estimator = >+ MapAgc2AdaptiveLevelEstimator(FLAG_agc2_adaptive_level_estimator); > settings.pre_amplifier_gain_factor = FLAG_pre_amplifier_gain_factor; > SetSettingIfSpecified(FLAG_vad_likelihood, &settings.vad_likelihood); > SetSettingIfSpecified(FLAG_ns_level, &settings.ns_level); >@@ -296,6 +353,9 @@ SimulationSettings CreateSettings() { > settings.fixed_interface = FLAG_fixed_interface; > settings.store_intermediate_output = FLAG_store_intermediate_output; > settings.print_aec3_parameter_values = FLAG_print_aec3_parameter_values; >+ settings.dump_internal_data = FLAG_dump_data; >+ SetSettingIfSpecified(FLAG_dump_data_output_dir, >+ &settings.dump_internal_data_output_dir); > > return settings; > } >@@ -446,6 +506,15 @@ void PerformBasicParameterSanityChecks(const SimulationSettings& settings) { > settings.artificial_nearend_filename && > !valid_wav_name(*settings.artificial_nearend_filename), > "Error: --artifical_nearend must 
be a valid .wav file name.\n"); >+ >+ ReportConditionalErrorAndExit( >+ WEBRTC_APM_DEBUG_DUMP == 0 && settings.dump_internal_data, >+ "Error: --dump_data cannot be set without proper build support.\n"); >+ >+ ReportConditionalErrorAndExit( >+ !settings.dump_internal_data && >+ settings.dump_internal_data_output_dir.has_value(), >+ "Error: --dump_data_output_dir cannot be set without --dump_data.\n"); > } > > } // namespace >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/conversational_speech/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/conversational_speech/BUILD.gn >index 721ebc7ee5669f552a1eec96b4211a822a658d74..551781b8bc90a6bfad530aa3817e58f9a794b9cc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/conversational_speech/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/conversational_speech/BUILD.gn >@@ -51,6 +51,7 @@ rtc_static_library("lib") { > "../../../../common_audio", > "../../../../rtc_base:checks", > "../../../../rtc_base:rtc_base_approved", >+ "../../../../test:fileutils", > "//third_party/abseil-cpp/absl/memory", > ] > visibility = [ ":*" ] # Only targets in this file can depend on this. 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/conversational_speech/generator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/conversational_speech/generator.cc >index 741a1ca86ba210bf58290c82805a175d2ab932e5..fa561cf8e05516a8c0faccbec5778bacb4b7a53d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/conversational_speech/generator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/conversational_speech/generator.cc >@@ -32,10 +32,10 @@ const char kUsageDescription[] = > "Command-line tool to generate multiple-end audio tracks to simulate " > "conversational speech with two or more participants.\n"; > >-DEFINE_string(i, "", "Directory containing the speech turn wav files"); >-DEFINE_string(t, "", "Path to the timing text file"); >-DEFINE_string(o, "", "Output wav files destination path"); >-DEFINE_bool(help, false, "Prints this message"); >+WEBRTC_DEFINE_string(i, "", "Directory containing the speech turn wav files"); >+WEBRTC_DEFINE_string(t, "", "Path to the timing text file"); >+WEBRTC_DEFINE_string(o, "", "Output wav files destination path"); >+WEBRTC_DEFINE_bool(help, false, "Prints this message"); > > } // namespace > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/conversational_speech/mock_wavreader_factory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/conversational_speech/mock_wavreader_factory.cc >index eb8e3bed3c510181ccc46938ad4b36caf1f174d4..a45e3bbfa72fbee5c8a659b065a6a8f8f30d295c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/conversational_speech/mock_wavreader_factory.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/conversational_speech/mock_wavreader_factory.cc >@@ -12,7 +12,6 @@ > > #include "modules/audio_processing/test/conversational_speech/mock_wavreader.h" > #include 
"rtc_base/logging.h" >-#include "rtc_base/pathutils.h" > #include "test/gmock.h" > > namespace webrtc { >@@ -39,9 +38,10 @@ MockWavReaderFactory::~MockWavReaderFactory() = default; > std::unique_ptr<WavReaderInterface> MockWavReaderFactory::CreateMock( > const std::string& filepath) { > // Search the parameters corresponding to filepath. >- const rtc::Pathname audiotrack_file_path(filepath); >- const auto it = >- audiotrack_names_params_.find(audiotrack_file_path.filename()); >+ size_t delimiter = filepath.find_last_of("/\\"); // Either windows or posix >+ std::string filename = >+ filepath.substr(delimiter == std::string::npos ? 0 : delimiter + 1); >+ const auto it = audiotrack_names_params_.find(filename); > > // If not found, use default parameters. > if (it == audiotrack_names_params_.end()) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/conversational_speech/multiend_call.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/conversational_speech/multiend_call.cc >index d633d90e9c426a2765f6aae219c5bc92064649c8..4e2f54d1509579d3d53fa11522d6a4dac27717fc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/conversational_speech/multiend_call.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/conversational_speech/multiend_call.cc >@@ -14,7 +14,7 @@ > #include <iterator> > > #include "rtc_base/logging.h" >-#include "rtc_base/pathutils.h" >+#include "test/testsupport/fileutils.h" > > namespace webrtc { > namespace test { >@@ -50,13 +50,13 @@ bool MultiEndCall::CreateAudioTrackReaders() { > if (it != audiotrack_readers_.end()) > continue; > >- // Instance Pathname to retrieve the full path to the audiotrack file. 
>- const rtc::Pathname audiotrack_file_path(audiotracks_path_, >- turn.audiotrack_file_name); >+ const std::string audiotrack_file_path = >+ test::JoinFilename(audiotracks_path_, turn.audiotrack_file_name); > > // Map the audiotrack file name to a new instance of WavReaderInterface. > std::unique_ptr<WavReaderInterface> wavreader = >- wavreader_abstract_factory_->Create(audiotrack_file_path.pathname()); >+ wavreader_abstract_factory_->Create( >+ test::JoinFilename(audiotracks_path_, turn.audiotrack_file_name)); > > if (sample_rate_hz_ == 0) { > sample_rate_hz_ = wavreader->SampleRate(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/conversational_speech/simulator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/conversational_speech/simulator.cc >index c2fb7808e08e6102571558fba61f5572bcfe9c48..c400499fd34103006c4d8a20e08aa73ef0940684 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/conversational_speech/simulator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/conversational_speech/simulator.cc >@@ -25,7 +25,7 @@ > #include "rtc_base/constructormagic.h" > #include "rtc_base/logging.h" > #include "rtc_base/numerics/safe_conversions.h" >-#include "rtc_base/pathutils.h" >+#include "test/testsupport/fileutils.h" > > namespace webrtc { > namespace test { >@@ -46,21 +46,20 @@ InitSpeakerOutputFilePaths(const std::set<std::string>& speaker_names, > > // Add near-end and far-end output paths into the map. 
> for (const auto& speaker_name : speaker_names) { >- const rtc::Pathname near_end_path(output_path, >- "s_" + speaker_name + "-near_end.wav"); >+ const std::string near_end_path = >+ test::JoinFilename(output_path, "s_" + speaker_name + "-near_end.wav"); > RTC_LOG(LS_VERBOSE) << "The near-end audio track will be created in " >- << near_end_path.pathname() << "."; >+ << near_end_path << "."; > >- const rtc::Pathname far_end_path(output_path, >- "s_" + speaker_name + "-far_end.wav"); >+ const std::string far_end_path = >+ test::JoinFilename(output_path, "s_" + speaker_name + "-far_end.wav"); > RTC_LOG(LS_VERBOSE) << "The far-end audio track will be created in " >- << far_end_path.pathname() << "."; >+ << far_end_path << "."; > > // Add to map. > speaker_output_file_paths_map->emplace( > std::piecewise_construct, std::forward_as_tuple(speaker_name), >- std::forward_as_tuple(near_end_path.pathname(), >- far_end_path.pathname())); >+ std::forward_as_tuple(near_end_path, far_end_path)); > } > > return speaker_output_file_paths_map; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/py_quality_assessment/quality_assessment/apm_vad.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/py_quality_assessment/quality_assessment/apm_vad.cc >index a6184b5f9fda011fa7d68ae4c7a7ef526716517e..4b6ada27b6a6f82c17e86f5077fc88128ff6a71f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/py_quality_assessment/quality_assessment/apm_vad.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/py_quality_assessment/quality_assessment/apm_vad.cc >@@ -24,9 +24,9 @@ constexpr int kMaxSampleRate = 48000; > constexpr size_t kMaxFrameLen = > kAudioFrameLengthMilliseconds * kMaxSampleRate / 1000; > >-DEFINE_string(i, "", "Input wav file"); >-DEFINE_string(o_probs, "", "VAD probabilities output file"); >-DEFINE_string(o_rms, "", "VAD output file"); >+WEBRTC_DEFINE_string(i, "", 
"Input wav file"); >+WEBRTC_DEFINE_string(o_probs, "", "VAD probabilities output file"); >+WEBRTC_DEFINE_string(o_rms, "", "VAD output file"); > > int main(int argc, char* argv[]) { > if (rtc::FlagList::SetFlagsFromCommandLine(&argc, argv, true)) >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/py_quality_assessment/quality_assessment/sound_level.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/py_quality_assessment/quality_assessment/sound_level.cc >index 98cf84c888f7b5c13f84c0e3fbe4eb2592bc06ac..35a2c11aeb2279cdfb3e3fd1cafc3157eb6278e2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/py_quality_assessment/quality_assessment/sound_level.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/py_quality_assessment/quality_assessment/sound_level.cc >@@ -26,13 +26,13 @@ constexpr size_t kMaxFrameLen = kMaxFrameLenMs * kMaxSampleRate / 1000; > > const double kOneDbReduction = DbToRatio(-1.0); > >-DEFINE_string(i, "", "Input wav file"); >-DEFINE_string(oc, "", "Config output file"); >-DEFINE_string(ol, "", "Levels output file"); >-DEFINE_float(a, 5.f, "Attack (ms)"); >-DEFINE_float(d, 20.f, "Decay (ms)"); >-DEFINE_int(f, 10, "Frame length (ms)"); >-DEFINE_bool(help, false, "prints this message"); >+WEBRTC_DEFINE_string(i, "", "Input wav file"); >+WEBRTC_DEFINE_string(oc, "", "Config output file"); >+WEBRTC_DEFINE_string(ol, "", "Levels output file"); >+WEBRTC_DEFINE_float(a, 5.f, "Attack (ms)"); >+WEBRTC_DEFINE_float(d, 20.f, "Decay (ms)"); >+WEBRTC_DEFINE_int(f, 10, "Frame length (ms)"); >+WEBRTC_DEFINE_bool(help, false, "prints this message"); > > int main(int argc, char* argv[]) { > if (rtc::FlagList::SetFlagsFromCommandLine(&argc, argv, true)) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/py_quality_assessment/quality_assessment/vad.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/py_quality_assessment/quality_assessment/vad.cc >index 191cb1e9fcfd152f9e0943768f0f5b36dfe72598..8a134ed185d9ce9603ad7d1bc559c4afc3ba3526 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/py_quality_assessment/quality_assessment/vad.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/test/py_quality_assessment/quality_assessment/vad.cc >@@ -27,8 +27,8 @@ constexpr size_t kMaxFrameLen = > > constexpr uint8_t kBitmaskBuffSize = 8; > >-DEFINE_string(i, "", "Input wav file"); >-DEFINE_string(o, "", "VAD output file"); >+WEBRTC_DEFINE_string(i, "", "Input wav file"); >+WEBRTC_DEFINE_string(o, "", "VAD output file"); > > int main(int argc, char* argv[]) { > if (rtc::FlagList::SetFlagsFromCommandLine(&argc, argv, true)) >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/moving_moments.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/moving_moments.cc >index a199bb0c1b9d3e1c07bcea1287534d1ade091752..83810bfe3c276eb0022d0b8348fe070881f6a4a2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/moving_moments.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/moving_moments.cc >@@ -10,7 +10,7 @@ > > #include "modules/audio_processing/transient/moving_moments.h" > >-#include <cmath> >+#include <algorithm> > > #include "rtc_base/checks.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/transient_detector.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/transient_detector.cc >index c3bf282b0e1ff17d1ae7b48789b15120b57f211c..8997d4c09206faae3a5c4ca41d420e9eeef58acd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/transient_detector.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/transient_detector.cc >@@ -13,12 +13,12 @@ > #include <float.h> > #include <math.h> > #include <string.h> >- > #include <algorithm> > > #include "modules/audio_processing/transient/common.h" > #include "modules/audio_processing/transient/daubechies_8_wavelet_coeffs.h" > #include "modules/audio_processing/transient/moving_moments.h" >+#include "modules/audio_processing/transient/wpd_node.h" > #include "modules/audio_processing/transient/wpd_tree.h" > #include "rtc_base/checks.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/transient_detector.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/transient_detector.h >index 3267b3a6cdc4a5a666b7b4ed1b87845feaca829a..23b88f82b10cf2e6ec37728b11631c7a2e175ac1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/transient_detector.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/transient_detector.h >@@ -11,6 +11,7 @@ > #ifndef MODULES_AUDIO_PROCESSING_TRANSIENT_TRANSIENT_DETECTOR_H_ > #define MODULES_AUDIO_PROCESSING_TRANSIENT_TRANSIENT_DETECTOR_H_ > >+#include <stddef.h> > #include <deque> > #include <memory> > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/transient_suppression_test.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/transient_suppression_test.cc >index 9e7ecd578652e46b4462f97739c6a4ce650ab342..e15f69c04201550ea1622e843d8b7a252b951101 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/transient_suppression_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/transient_suppression_test.cc >@@ -23,26 +23,28 @@ > #include "test/gtest.h" > #include "test/testsupport/fileutils.h" > >-DEFINE_string(in_file_name, "", "PCM file that 
contains the signal."); >-DEFINE_string(detection_file_name, >- "", >- "PCM file that contains the detection signal."); >-DEFINE_string(reference_file_name, >- "", >- "PCM file that contains the reference signal."); >- >-DEFINE_int(chunk_size_ms, >- 10, >- "Time between each chunk of samples in milliseconds."); >- >-DEFINE_int(sample_rate_hz, 16000, "Sampling frequency of the signal in Hertz."); >-DEFINE_int(detection_rate_hz, >- 0, >- "Sampling frequency of the detection signal in Hertz."); >- >-DEFINE_int(num_channels, 1, "Number of channels."); >- >-DEFINE_bool(help, false, "Print this message."); >+WEBRTC_DEFINE_string(in_file_name, "", "PCM file that contains the signal."); >+WEBRTC_DEFINE_string(detection_file_name, >+ "", >+ "PCM file that contains the detection signal."); >+WEBRTC_DEFINE_string(reference_file_name, >+ "", >+ "PCM file that contains the reference signal."); >+ >+WEBRTC_DEFINE_int(chunk_size_ms, >+ 10, >+ "Time between each chunk of samples in milliseconds."); >+ >+WEBRTC_DEFINE_int(sample_rate_hz, >+ 16000, >+ "Sampling frequency of the signal in Hertz."); >+WEBRTC_DEFINE_int(detection_rate_hz, >+ 0, >+ "Sampling frequency of the detection signal in Hertz."); >+ >+WEBRTC_DEFINE_int(num_channels, 1, "Number of channels."); >+ >+WEBRTC_DEFINE_bool(help, false, "Print this message."); > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/transient_suppressor.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/transient_suppressor.h >index 9ae3fc660a881e2dfc51f5f79b73a46abef819a0..ae51966a3d4ebb850b60cd468c53e2cff2041db0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/transient_suppressor.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/transient_suppressor.h >@@ -11,9 +11,9 @@ > #ifndef MODULES_AUDIO_PROCESSING_TRANSIENT_TRANSIENT_SUPPRESSOR_H_ > #define 
MODULES_AUDIO_PROCESSING_TRANSIENT_TRANSIENT_SUPPRESSOR_H_ > >-#include <deque> >+#include <stddef.h> >+#include <stdint.h> > #include <memory> >-#include <set> > > #include "rtc_base/gtest_prod_util.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/wpd_tree.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/wpd_tree.cc >index 72f4d76891f386de3dc12707b79d16141ed07d85..c8aa6158817afeeceb9c249b0737e07ab7de6c87 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/wpd_tree.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/wpd_tree.cc >@@ -10,10 +10,8 @@ > > #include "modules/audio_processing/transient/wpd_tree.h" > >-#include <math.h> > #include <string.h> > >-#include "modules/audio_processing/transient/dyadic_decimator.h" > #include "modules/audio_processing/transient/wpd_node.h" > #include "rtc_base/checks.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/wpd_tree.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/wpd_tree.h >index 707a89d44ac16fdb24abdf99e9c15897ad6461a7..b62135dafa8d7ced899ac1a67023dc1c7e4261a2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/wpd_tree.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/transient/wpd_tree.h >@@ -11,6 +11,7 @@ > #ifndef MODULES_AUDIO_PROCESSING_TRANSIENT_WPD_TREE_H_ > #define MODULES_AUDIO_PROCESSING_TRANSIENT_WPD_TREE_H_ > >+#include <stddef.h> > #include <memory> > > #include "modules/audio_processing/transient/wpd_node.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/typing_detection.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/typing_detection.h >index 70fd9033661092133ae3f736f9320c981899a2b7..d8fb3592c9644d4d2ab68c7c9e418efa684357f6 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/typing_detection.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/typing_detection.h >@@ -11,9 +11,11 @@ > #ifndef MODULES_AUDIO_PROCESSING_TYPING_DETECTION_H_ > #define MODULES_AUDIO_PROCESSING_TYPING_DETECTION_H_ > >+#include "rtc_base/system/rtc_export.h" >+ > namespace webrtc { > >-class TypingDetection { >+class RTC_EXPORT TypingDetection { > public: > TypingDetection(); > virtual ~TypingDetection(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/utility/ooura_fft.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/utility/ooura_fft.cc >index c3333cea5beb000a9f52263c4f1f78c992b407d8..8628bd39f0aecc2191c3d4a252112d4d3422f4a4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/utility/ooura_fft.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/utility/ooura_fft.cc >@@ -23,8 +23,6 @@ > > #include "modules/audio_processing/utility/ooura_fft.h" > >-#include <math.h> >- > #include "modules/audio_processing/utility/ooura_fft_tables_common.h" > #include "rtc_base/system/arch.h" > #include "system_wrappers/include/cpu_features_wrapper.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/utility/ooura_fft_sse2.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/utility/ooura_fft_sse2.cc >index 9b5d0f37a4497c7476ada387bfd7bb86fc2dc86c..0e4a44beccf41e6203e6f05811aaf3f9aca52649 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/utility/ooura_fft_sse2.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/utility/ooura_fft_sse2.cc >@@ -8,10 +8,10 @@ > * be found in the AUTHORS file in the root of the source tree. 
> */ > >-#include "modules/audio_processing/utility/ooura_fft.h" >- > #include <emmintrin.h> >+#include <xmmintrin.h> > >+#include "modules/audio_processing/utility/ooura_fft.h" > #include "modules/audio_processing/utility/ooura_fft_tables_common.h" > #include "modules/audio_processing/utility/ooura_fft_tables_neon_sse2.h" > #include "rtc_base/system/arch.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/gmm.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/gmm.cc >index cd8a1a8a375c7edb89a85302ca0b75771e688ab3..3b8764c4d02a9bb31733aae1ac112119d2ce56a4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/gmm.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/gmm.cc >@@ -11,7 +11,6 @@ > #include "modules/audio_processing/vad/gmm.h" > > #include <math.h> >-#include <stdlib.h> > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/pitch_based_vad.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/pitch_based_vad.cc >index 025ef2059d912b36b8e87f36b210734ef47a4c93..68e60dc66ae5176b29acd0477d1cd40544e3eee6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/pitch_based_vad.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/pitch_based_vad.cc >@@ -10,7 +10,6 @@ > > #include "modules/audio_processing/vad/pitch_based_vad.h" > >-#include <math.h> > #include <string.h> > > #include "modules/audio_processing/vad/common.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/pitch_based_vad.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/pitch_based_vad.h >index 4d327652c680172752b9f67310024b156e050355..22bc0f2263901007e2afda78aaa0c66ba44e6ceb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/pitch_based_vad.h >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/pitch_based_vad.h >@@ -18,7 +18,6 @@ > > namespace webrtc { > >-class AudioFrame; > class VadCircularBuffer; > > // Computes the probability of the input audio frame to be active given >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/pole_zero_filter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/pole_zero_filter.cc >index b9967d7aea5bf43944662c4361c5e5c825be08fd..4156d7e7952a9f1ae4e01078b43ccd60c9c55da5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/pole_zero_filter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/pole_zero_filter.cc >@@ -10,7 +10,6 @@ > > #include "modules/audio_processing/vad/pole_zero_filter.h" > >-#include <stdlib.h> > #include <string.h> > #include <algorithm> > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/standalone_vad.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/standalone_vad.cc >index 19a5282c3f71584b840fd3c85e0ba954fb01bd22..1397668eb4bcbd2432fd03c72856a29d64819d3d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/standalone_vad.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/standalone_vad.cc >@@ -12,7 +12,7 @@ > > #include <string.h> > >-#include "audio/utility/audio_frame_operations.h" >+#include "common_audio/vad/include/webrtc_vad.h" > #include "rtc_base/checks.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/standalone_vad.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/standalone_vad.h >index 79650fbe8c6cae1c3aded93acc12ae9ffd03159f..3dff4163c22e81d325ea2bec5a0b90eb52029f22 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/standalone_vad.h >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/standalone_vad.h >@@ -11,13 +11,14 @@ > #ifndef MODULES_AUDIO_PROCESSING_AGC_STANDALONE_VAD_H_ > #define MODULES_AUDIO_PROCESSING_AGC_STANDALONE_VAD_H_ > >+#include <stddef.h> >+#include <stdint.h> >+ > #include "common_audio/vad/include/webrtc_vad.h" > #include "modules/audio_processing/vad/common.h" > > namespace webrtc { > >-class AudioFrame; >- > class StandaloneVad { > public: > static StandaloneVad* Create(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/vad_audio_proc.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/vad_audio_proc.h >index e34091bdb6abd1e8a3b31c159f1f024e370d3b16..9be3467ef8523086473bb2046818ed24113fc8e3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/vad_audio_proc.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/vad_audio_proc.h >@@ -11,13 +11,14 @@ > #ifndef MODULES_AUDIO_PROCESSING_VAD_VAD_AUDIO_PROC_H_ > #define MODULES_AUDIO_PROCESSING_VAD_VAD_AUDIO_PROC_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <memory> > >-#include "modules/audio_processing/vad/common.h" >+#include "modules/audio_processing/vad/common.h" // AudioFeatures, kSampleR... 
> > namespace webrtc { > >-class AudioFrame; > class PoleZeroFilter; > > class VadAudioProc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/voice_activity_detector.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/voice_activity_detector.h >index e424ac1a28278e5452e0253bc5e5977aa07493ce..d140fe2aa3a4140361cda106551d77a9679c694a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/voice_activity_detector.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/vad/voice_activity_detector.h >@@ -11,6 +11,8 @@ > #ifndef MODULES_AUDIO_PROCESSING_VAD_VOICE_ACTIVITY_DETECTOR_H_ > #define MODULES_AUDIO_PROCESSING_VAD_VOICE_ACTIVITY_DETECTOR_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <memory> > #include <vector> > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/voice_detection_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/voice_detection_impl.cc >index 9280be1ef9d0c24f5eea8f18500ac880e2eaa12f..c55ca4aee8771ab10b98ba80b1f58e44b55f9a1e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/voice_detection_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/voice_detection_impl.cc >@@ -10,8 +10,10 @@ > > #include "modules/audio_processing/voice_detection_impl.h" > >+#include "api/audio/audio_frame.h" > #include "common_audio/vad/include/webrtc_vad.h" > #include "modules/audio_processing/audio_buffer.h" >+#include "rtc_base/checks.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/voice_detection_impl.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/voice_detection_impl.h >index 4b724bdd25647ec3f1ae524cc57ffef04248dab5..c4384736799e01083d4358dace2ec511817edf79 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/voice_detection_impl.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_processing/voice_detection_impl.h >@@ -11,11 +11,13 @@ > #ifndef MODULES_AUDIO_PROCESSING_VOICE_DETECTION_IMPL_H_ > #define MODULES_AUDIO_PROCESSING_VOICE_DETECTION_IMPL_H_ > >+#include <stddef.h> > #include <memory> > > #include "modules/audio_processing/include/audio_processing.h" > #include "rtc_base/constructormagic.h" > #include "rtc_base/criticalsection.h" >+#include "rtc_base/thread_annotations.h" > > namespace webrtc { > >@@ -42,6 +44,7 @@ class VoiceDetectionImpl : public VoiceDetection { > > private: > class Vad; >+ > rtc::CriticalSection* const crit_; > bool enabled_ RTC_GUARDED_BY(crit_) = false; > bool stream_has_voice_ RTC_GUARDED_BY(crit_) = false; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/BUILD.gn >index 72046f0a015e02a8272f9430294bad0e4887addd..b501cb4457f2fe5f40d910bb5c06eae3ebd60322 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/BUILD.gn >@@ -15,6 +15,8 @@ rtc_static_library("bitrate_controller") { > "bitrate_controller_impl.cc", > "bitrate_controller_impl.h", > "include/bitrate_controller.h", >+ "loss_based_bandwidth_estimation.cc", >+ "loss_based_bandwidth_estimation.h", > "send_side_bandwidth_estimation.cc", > "send_side_bandwidth_estimation.h", > ] >@@ -31,10 +33,16 @@ rtc_static_library("bitrate_controller") { > > deps = [ > "..:module_api", >+ "../../api/transport:network_control", >+ "../../api/units:data_rate", >+ "../../api/units:time_delta", >+ "../../api/units:timestamp", > "../../logging:rtc_event_bwe", > "../../logging:rtc_event_log_api", > "../../rtc_base:checks", >+ "../../rtc_base:deprecation", > "../../rtc_base:rtc_base_approved", >+ 
"../../rtc_base/experiments:field_trial_parser", > "../../system_wrappers", > "../../system_wrappers:field_trial", > "../../system_wrappers:metrics", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/bitrate_controller_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/bitrate_controller_impl.cc >index 93045b73a36bb344a365d7bcaebf8b38ad223791..57c85f6f6fc57fe1217d48c6c0ff67769201f6bd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/bitrate_controller_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/bitrate_controller_impl.cc >@@ -21,10 +21,10 @@ > > namespace webrtc { > namespace { >-absl::optional<DataRate> ToOptionalDataRate(int start_bitrate_bps) { >- if (start_bitrate_bps == -1) >- return absl::nullopt; >- return DataRate::bps(start_bitrate_bps); >+absl::optional<DataRate> ToOptionalDataRate(int send_bitrate_bps) { >+ if (send_bitrate_bps > 0) >+ return DataRate::bps(send_bitrate_bps); >+ return absl::nullopt; > } > DataRate MaxRate(int max_bitrate_bps) { > if (max_bitrate_bps == -1) >@@ -153,14 +153,12 @@ void BitrateControllerImpl::OnDelayBasedBweResult( > rtc::CritScope cs(&critsect_); > if (result.probe) { > bandwidth_estimation_.SetSendBitrate( >- DataRate::bps(result.target_bitrate_bps), >- Timestamp::ms(clock_->TimeInMilliseconds())); >+ result.target_bitrate, Timestamp::ms(clock_->TimeInMilliseconds())); > } > // Since SetSendBitrate now resets the delay-based estimate, we have to call > // UpdateDelayBasedEstimate after SetSendBitrate. 
> bandwidth_estimation_.UpdateDelayBasedEstimate( >- Timestamp::ms(clock_->TimeInMilliseconds()), >- DataRate::bps(result.target_bitrate_bps)); >+ Timestamp::ms(clock_->TimeInMilliseconds()), result.target_bitrate); > } > MaybeTriggerOnNetworkChanged(); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/bitrate_controller_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/bitrate_controller_unittest.cc >index 96f8a1ef266eb9e62ecf0f801ccc31d08c7eda8b..cf281214aa3cfb4685e4be8f3965323b9c90ce00 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/bitrate_controller_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/bitrate_controller_unittest.cc >@@ -169,7 +169,7 @@ TEST_F(BitrateControllerTest, OneBitrateObserverOneRtcpObserver) { > EXPECT_EQ(300000, bitrate_observer_.last_bitrate_); > > // Test that a low delay-based estimate limits the combined estimate. >- webrtc::DelayBasedBwe::Result result(false, 280000); >+ webrtc::DelayBasedBwe::Result result(false, webrtc::DataRate::kbps(280)); > controller_->OnDelayBasedBweResult(result); > EXPECT_EQ(280000, bitrate_observer_.last_bitrate_); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/include/bitrate_controller.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/include/bitrate_controller.h >index f67600d19083784e467d3fd22f727d3f0e332311..3e11fa76a3241de6089e8e2a3410c0c865f09d61 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/include/bitrate_controller.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/include/bitrate_controller.h >@@ -15,15 +15,17 @@ > #ifndef MODULES_BITRATE_CONTROLLER_INCLUDE_BITRATE_CONTROLLER_H_ > #define MODULES_BITRATE_CONTROLLER_INCLUDE_BITRATE_CONTROLLER_H_ > >-#include <map> >+#include <stddef.h> >+#include <stdint.h> > > #include 
"modules/congestion_controller/goog_cc/delay_based_bwe.h" > #include "modules/include/module.h" >-#include "modules/pacing/paced_sender.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" >+#include "rtc_base/deprecation.h" > > namespace webrtc { > >+class Clock; > class RtcEventLog; > > // Deprecated >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/loss_based_bandwidth_estimation.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/loss_based_bandwidth_estimation.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..5d7f8aa2c36d7fd66d562aacb86de6ce7666ce47 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/loss_based_bandwidth_estimation.cc >@@ -0,0 +1,215 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "modules/bitrate_controller/loss_based_bandwidth_estimation.h" >+ >+#include <algorithm> >+#include <string> >+#include <vector> >+ >+#include "api/units/data_rate.h" >+#include "api/units/time_delta.h" >+#include "system_wrappers/include/field_trial.h" >+ >+namespace webrtc { >+namespace { >+const char kBweLossBasedControl[] = "WebRTC-Bwe-LossBasedControl"; >+ >+// Increase slower when RTT is high. 
>+double GetIncreaseFactor(const LossBasedControlConfig& config, TimeDelta rtt) { >+ // Clamp the RTT >+ if (rtt < config.increase_low_rtt) { >+ rtt = config.increase_low_rtt; >+ } else if (rtt > config.increase_high_rtt) { >+ rtt = config.increase_high_rtt; >+ } >+ auto rtt_range = config.increase_high_rtt.Get() - config.increase_low_rtt; >+ if (rtt_range <= TimeDelta::Zero()) { >+ RTC_DCHECK(false); // Only on misconfiguration. >+ return config.min_increase_factor; >+ } >+ auto rtt_offset = rtt - config.increase_low_rtt; >+ auto relative_offset = std::max(0.0, std::min(rtt_offset / rtt_range, 1.0)); >+ auto factor_range = config.max_increase_factor - config.min_increase_factor; >+ return config.min_increase_factor + (1 - relative_offset) * factor_range; >+} >+ >+double LossFromBitrate(DataRate bitrate, >+ DataRate loss_bandwidth_balance, >+ double exponent) { >+ if (loss_bandwidth_balance >= bitrate) >+ return 1.0; >+ return pow(loss_bandwidth_balance / bitrate, exponent); >+} >+ >+DataRate BitrateFromLoss(double loss, >+ DataRate loss_bandwidth_balance, >+ double exponent) { >+ if (exponent <= 0) { >+ RTC_DCHECK(false); >+ return DataRate::Infinity(); >+ } >+ if (loss < 1e-5) >+ return DataRate::Infinity(); >+ return loss_bandwidth_balance * pow(loss, -1.0 / exponent); >+} >+ >+double ExponentialUpdate(TimeDelta window, TimeDelta interval) { >+ // Use the convention that exponential window length (which is really >+ // infinite) is the time it takes to dampen to 1/e. 
>+ if (window <= TimeDelta::Zero()) { >+ RTC_DCHECK(false); >+ return 1.0f; >+ } >+ return 1.0f - exp(interval / window * -1.0); >+} >+ >+} // namespace >+ >+LossBasedControlConfig::LossBasedControlConfig() >+ : enabled(field_trial::IsEnabled(kBweLossBasedControl)), >+ min_increase_factor("min_incr", 1.02), >+ max_increase_factor("max_incr", 1.08), >+ increase_low_rtt("incr_low_rtt", TimeDelta::ms(200)), >+ increase_high_rtt("incr_high_rtt", TimeDelta::ms(800)), >+ decrease_factor("decr", 0.99), >+ loss_window("loss_win", TimeDelta::ms(800)), >+ loss_max_window("loss_max_win", TimeDelta::ms(800)), >+ acknowledged_rate_max_window("ackrate_max_win", TimeDelta::ms(800)), >+ increase_offset("incr_offset", DataRate::bps(1000)), >+ loss_bandwidth_balance_increase("balance_incr", DataRate::kbps(0.5)), >+ loss_bandwidth_balance_decrease("balance_decr", DataRate::kbps(4)), >+ loss_bandwidth_balance_exponent("exponent", 0.5), >+ allow_resets("resets", false), >+ decrease_interval("decr_intvl", TimeDelta::ms(300)), >+ loss_report_timeout("timeout", TimeDelta::ms(6000)) { >+ std::string trial_string = field_trial::FindFullName(kBweLossBasedControl); >+ ParseFieldTrial( >+ {&min_increase_factor, &max_increase_factor, &increase_low_rtt, >+ &increase_high_rtt, &decrease_factor, &loss_window, &loss_max_window, >+ &acknowledged_rate_max_window, &increase_offset, >+ &loss_bandwidth_balance_increase, &loss_bandwidth_balance_decrease, >+ &loss_bandwidth_balance_exponent, &allow_resets, &decrease_interval, >+ &loss_report_timeout}, >+ trial_string); >+} >+LossBasedControlConfig::LossBasedControlConfig(const LossBasedControlConfig&) = >+ default; >+LossBasedControlConfig::~LossBasedControlConfig() = default; >+ >+LossBasedBandwidthEstimation::LossBasedBandwidthEstimation() >+ : config_(LossBasedControlConfig()), >+ average_loss_(0), >+ average_loss_max_(0), >+ loss_based_bitrate_(DataRate::Zero()), >+ acknowledged_bitrate_max_(DataRate::Zero()), >+ 
acknowledged_bitrate_last_update_(Timestamp::MinusInfinity()), >+ time_last_decrease_(Timestamp::MinusInfinity()), >+ has_decreased_since_last_loss_report_(false), >+ last_loss_packet_report_(Timestamp::MinusInfinity()), >+ last_loss_ratio_(0) {} >+ >+void LossBasedBandwidthEstimation::UpdateLossStatistics( >+ const std::vector<PacketResult>& packet_results, >+ Timestamp at_time) { >+ if (packet_results.empty()) { >+ RTC_DCHECK(false); >+ return; >+ } >+ int loss_count = 0; >+ for (auto pkt : packet_results) { >+ loss_count += pkt.receive_time.IsInfinite() ? 1 : 0; >+ } >+ last_loss_ratio_ = static_cast<double>(loss_count) / packet_results.size(); >+ const TimeDelta time_passed = last_loss_packet_report_.IsFinite() >+ ? at_time - last_loss_packet_report_ >+ : TimeDelta::seconds(1); >+ last_loss_packet_report_ = at_time; >+ has_decreased_since_last_loss_report_ = false; >+ >+ average_loss_ += ExponentialUpdate(config_.loss_window, time_passed) * >+ (last_loss_ratio_ - average_loss_); >+ if (average_loss_ > average_loss_max_) { >+ average_loss_max_ = average_loss_; >+ } else { >+ average_loss_max_ += >+ ExponentialUpdate(config_.loss_max_window, time_passed) * >+ (average_loss_ - average_loss_max_); >+ } >+} >+ >+void LossBasedBandwidthEstimation::UpdateAcknowledgedBitrate( >+ DataRate acknowledged_bitrate, >+ Timestamp at_time) { >+ const TimeDelta time_passed = >+ acknowledged_bitrate_last_update_.IsFinite() >+ ? 
at_time - acknowledged_bitrate_last_update_ >+ : TimeDelta::seconds(1); >+ acknowledged_bitrate_last_update_ = at_time; >+ if (acknowledged_bitrate > acknowledged_bitrate_max_) { >+ acknowledged_bitrate_max_ = acknowledged_bitrate; >+ } else { >+ acknowledged_bitrate_max_ += >+ ExponentialUpdate(config_.acknowledged_rate_max_window, time_passed) * >+ (acknowledged_bitrate - acknowledged_bitrate_max_); >+ } >+} >+ >+void LossBasedBandwidthEstimation::Update(Timestamp at_time, >+ DataRate min_bitrate, >+ TimeDelta last_round_trip_time) { >+ // Only increase if loss has been low for some time. >+ const double loss_estimate_for_increase = average_loss_max_; >+ // Avoid multiple decreases from averaging over one loss spike. >+ const double loss_estimate_for_decrease = >+ std::min(average_loss_, last_loss_ratio_); >+ >+ const double loss_increase_threshold = LossFromBitrate( >+ loss_based_bitrate_, config_.loss_bandwidth_balance_increase, >+ config_.loss_bandwidth_balance_exponent); >+ const double loss_decrease_threshold = LossFromBitrate( >+ loss_based_bitrate_, config_.loss_bandwidth_balance_decrease, >+ config_.loss_bandwidth_balance_exponent); >+ const bool allow_decrease = >+ !has_decreased_since_last_loss_report_ && >+ (at_time - time_last_decrease_ >= >+ last_round_trip_time + config_.decrease_interval); >+ >+ if (loss_estimate_for_increase < loss_increase_threshold) { >+ // Increase bitrate by RTT-adaptive ratio. >+ DataRate new_increased_bitrate = >+ min_bitrate * GetIncreaseFactor(config_, last_round_trip_time) + >+ config_.increase_offset; >+ // The bitrate that would make the loss "just high enough". 
>+ const DataRate new_increased_bitrate_cap = BitrateFromLoss( >+ loss_estimate_for_increase, config_.loss_bandwidth_balance_increase, >+ config_.loss_bandwidth_balance_exponent); >+ new_increased_bitrate = >+ std::min(new_increased_bitrate, new_increased_bitrate_cap); >+ loss_based_bitrate_ = std::max(new_increased_bitrate, loss_based_bitrate_); >+ } else if (loss_estimate_for_decrease > loss_decrease_threshold && >+ allow_decrease) { >+ DataRate new_decreased_bitrate = >+ config_.decrease_factor * acknowledged_bitrate_max_; >+ // The bitrate that would make the loss "just acceptable". >+ const DataRate new_decreased_bitrate_floor = BitrateFromLoss( >+ loss_estimate_for_decrease, config_.loss_bandwidth_balance_decrease, >+ config_.loss_bandwidth_balance_exponent); >+ new_decreased_bitrate = >+ std::max(new_decreased_bitrate, new_decreased_bitrate_floor); >+ if (new_decreased_bitrate < loss_based_bitrate_) { >+ time_last_decrease_ = at_time; >+ has_decreased_since_last_loss_report_ = true; >+ loss_based_bitrate_ = new_decreased_bitrate; >+ } >+ } >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/loss_based_bandwidth_estimation.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/loss_based_bandwidth_estimation.h >new file mode 100644 >index 0000000000000000000000000000000000000000..0f560769ba43c5d823fd56fbbc089e627c95e6f9 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/loss_based_bandwidth_estimation.h >@@ -0,0 +1,80 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#ifndef MODULES_BITRATE_CONTROLLER_LOSS_BASED_BANDWIDTH_ESTIMATION_H_ >+#define MODULES_BITRATE_CONTROLLER_LOSS_BASED_BANDWIDTH_ESTIMATION_H_ >+ >+#include <vector> >+ >+#include "api/units/data_rate.h" >+#include "api/units/time_delta.h" >+#include "api/units/timestamp.h" >+#include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" >+#include "rtc_base/experiments/field_trial_parser.h" >+ >+namespace webrtc { >+ >+struct LossBasedControlConfig { >+ LossBasedControlConfig(); >+ LossBasedControlConfig(const LossBasedControlConfig&); >+ LossBasedControlConfig& operator=(const LossBasedControlConfig&) = default; >+ ~LossBasedControlConfig(); >+ bool enabled; >+ FieldTrialParameter<double> min_increase_factor; >+ FieldTrialParameter<double> max_increase_factor; >+ FieldTrialParameter<TimeDelta> increase_low_rtt; >+ FieldTrialParameter<TimeDelta> increase_high_rtt; >+ FieldTrialParameter<double> decrease_factor; >+ FieldTrialParameter<TimeDelta> loss_window; >+ FieldTrialParameter<TimeDelta> loss_max_window; >+ FieldTrialParameter<TimeDelta> acknowledged_rate_max_window; >+ FieldTrialParameter<DataRate> increase_offset; >+ FieldTrialParameter<DataRate> loss_bandwidth_balance_increase; >+ FieldTrialParameter<DataRate> loss_bandwidth_balance_decrease; >+ FieldTrialParameter<double> loss_bandwidth_balance_exponent; >+ FieldTrialParameter<bool> allow_resets; >+ FieldTrialParameter<TimeDelta> decrease_interval; >+ FieldTrialParameter<TimeDelta> loss_report_timeout; >+}; >+ >+class LossBasedBandwidthEstimation { >+ public: >+ LossBasedBandwidthEstimation(); >+ void Update(Timestamp at_time, >+ DataRate min_bitrate, >+ TimeDelta last_round_trip_time); >+ void UpdateAcknowledgedBitrate(DataRate acknowledged_bitrate, >+ Timestamp at_time); >+ void MaybeReset(DataRate bitrate) { >+ if (config_.allow_resets) >+ loss_based_bitrate_ = bitrate; >+ } >+ void SetInitialBitrate(DataRate bitrate) { loss_based_bitrate_ = bitrate; } >+ bool Enabled() const { return config_.enabled; 
} >+ void UpdateLossStatistics(const std::vector<PacketResult>& packet_results, >+ Timestamp at_time); >+ DataRate GetEstimate() const { return loss_based_bitrate_; } >+ >+ private: >+ LossBasedControlConfig config_; >+ double average_loss_; >+ double average_loss_max_; >+ DataRate loss_based_bitrate_; >+ DataRate acknowledged_bitrate_max_; >+ Timestamp acknowledged_bitrate_last_update_; >+ Timestamp time_last_decrease_; >+ bool has_decreased_since_last_loss_report_; >+ Timestamp last_loss_packet_report_; >+ double last_loss_ratio_; >+}; >+ >+} // namespace webrtc >+ >+#endif // MODULES_BITRATE_CONTROLLER_LOSS_BASED_BANDWIDTH_ESTIMATION_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/send_side_bandwidth_estimation.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/send_side_bandwidth_estimation.cc >index cc108a95b08fcb9f59376421e062f6e4155b82d1..394c14747b7acb2ad014a54005d48a557f03b74d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/send_side_bandwidth_estimation.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/send_side_bandwidth_estimation.cc >@@ -11,12 +11,12 @@ > #include "modules/bitrate_controller/send_side_bandwidth_estimation.h" > > #include <algorithm> >-#include <cmath> > #include <cstdio> > #include <limits> > #include <string> > > #include "absl/memory/memory.h" >+#include "logging/rtc_event_log/events/rtc_event.h" > #include "logging/rtc_event_log/events/rtc_event_bwe_update_loss_based.h" > #include "logging/rtc_event_log/rtc_event_log.h" > #include "modules/remote_bitrate_estimator/include/bwe_defines.h" >@@ -104,6 +104,83 @@ bool ReadBweLossExperimentParameters(float* low_loss_threshold, > } > } // namespace > >+LinkCapacityTracker::LinkCapacityTracker() >+ : tracking_rate("rate", TimeDelta::seconds(10)) { >+ ParseFieldTrial({&tracking_rate}, >+ field_trial::FindFullName("WebRTC-Bwe-LinkCapacity")); >+} >+ 
>+LinkCapacityTracker::~LinkCapacityTracker() {} >+ >+void LinkCapacityTracker::OnOveruse(DataRate acknowledged_rate, >+ Timestamp at_time) { >+ capacity_estimate_bps_ = >+ std::min(capacity_estimate_bps_, acknowledged_rate.bps<double>()); >+ last_link_capacity_update_ = at_time; >+} >+ >+void LinkCapacityTracker::OnStartingRate(DataRate start_rate) { >+ if (last_link_capacity_update_.IsInfinite()) >+ capacity_estimate_bps_ = start_rate.bps<double>(); >+} >+ >+void LinkCapacityTracker::OnRateUpdate(DataRate acknowledged, >+ Timestamp at_time) { >+ if (acknowledged.bps() > capacity_estimate_bps_) { >+ TimeDelta delta = at_time - last_link_capacity_update_; >+ double alpha = delta.IsFinite() ? exp(-(delta / tracking_rate.Get())) : 0; >+ capacity_estimate_bps_ = alpha * capacity_estimate_bps_ + >+ (1 - alpha) * acknowledged.bps<double>(); >+ } >+ last_link_capacity_update_ = at_time; >+} >+ >+void LinkCapacityTracker::OnRttBackoff(DataRate backoff_rate, >+ Timestamp at_time) { >+ capacity_estimate_bps_ = >+ std::min(capacity_estimate_bps_, backoff_rate.bps<double>()); >+ last_link_capacity_update_ = at_time; >+} >+ >+DataRate LinkCapacityTracker::estimate() const { >+ return DataRate::bps(capacity_estimate_bps_); >+} >+ >+RttBasedBackoff::RttBasedBackoff() >+ : rtt_limit_("limit", TimeDelta::PlusInfinity()), >+ drop_fraction_("fraction", 0.5), >+ drop_interval_("interval", TimeDelta::ms(300)), >+ persist_on_route_change_("persist"), >+ // By initializing this to plus infinity, we make sure that we never >+ // trigger rtt backoff unless packet feedback is enabled. 
>+ last_propagation_rtt_update_(Timestamp::PlusInfinity()), >+ last_propagation_rtt_(TimeDelta::Zero()) { >+ ParseFieldTrial({&rtt_limit_, &drop_fraction_, &drop_interval_, >+ &persist_on_route_change_}, >+ field_trial::FindFullName("WebRTC-Bwe-MaxRttLimit")); >+} >+ >+void RttBasedBackoff::OnRouteChange() { >+ if (!persist_on_route_change_) { >+ last_propagation_rtt_update_ = Timestamp::PlusInfinity(); >+ last_propagation_rtt_ = TimeDelta::Zero(); >+ } >+} >+ >+void RttBasedBackoff::UpdatePropagationRtt(Timestamp at_time, >+ TimeDelta propagation_rtt) { >+ last_propagation_rtt_update_ = at_time; >+ last_propagation_rtt_ = propagation_rtt; >+} >+ >+TimeDelta RttBasedBackoff::RttLowerBound(Timestamp at_time) const { >+ // TODO(srte): Use time since last unacknowledged packet for this. >+ TimeDelta time_since_rtt = at_time - last_propagation_rtt_update_; >+ return time_since_rtt + last_propagation_rtt_; >+} >+ >+RttBasedBackoff::~RttBasedBackoff() = default; >+ > SendSideBandwidthEstimation::SendSideBandwidthEstimation(RtcEventLog* event_log) > : lost_packets_since_last_loss_update_(0), > expected_packets_since_last_loss_update_(0), >@@ -151,14 +228,44 @@ SendSideBandwidthEstimation::SendSideBandwidthEstimation(RtcEventLog* event_log) > > SendSideBandwidthEstimation::~SendSideBandwidthEstimation() {} > >+void SendSideBandwidthEstimation::OnRouteChange() { >+ lost_packets_since_last_loss_update_ = 0; >+ expected_packets_since_last_loss_update_ = 0; >+ current_bitrate_ = DataRate::Zero(); >+ min_bitrate_configured_ = >+ DataRate::bps(congestion_controller::GetMinBitrateBps()); >+ max_bitrate_configured_ = kDefaultMaxBitrate; >+ last_low_bitrate_log_ = Timestamp::MinusInfinity(); >+ has_decreased_since_last_fraction_loss_ = false; >+ last_loss_feedback_ = Timestamp::MinusInfinity(); >+ last_loss_packet_report_ = Timestamp::MinusInfinity(); >+ last_timeout_ = Timestamp::MinusInfinity(); >+ last_fraction_loss_ = 0; >+ last_logged_fraction_loss_ = 0; >+ 
last_round_trip_time_ = TimeDelta::Zero(); >+ bwe_incoming_ = DataRate::Zero(); >+ delay_based_bitrate_ = DataRate::Zero(); >+ time_last_decrease_ = Timestamp::MinusInfinity(); >+ first_report_time_ = Timestamp::MinusInfinity(); >+ initially_lost_packets_ = 0; >+ bitrate_at_2_seconds_ = DataRate::Zero(); >+ uma_update_state_ = kNoUpdate; >+ uma_rtt_state_ = kNoUpdate; >+ last_rtc_event_log_ = Timestamp::MinusInfinity(); >+ >+ rtt_backoff_.OnRouteChange(); >+} >+ > void SendSideBandwidthEstimation::SetBitrates( > absl::optional<DataRate> send_bitrate, > DataRate min_bitrate, > DataRate max_bitrate, > Timestamp at_time) { > SetMinMaxBitrate(min_bitrate, max_bitrate); >- if (send_bitrate) >+ if (send_bitrate) { >+ link_capacity_.OnStartingRate(*send_bitrate); > SetSendBitrate(*send_bitrate, at_time); >+ } > } > > void SendSideBandwidthEstimation::SetSendBitrate(DataRate bitrate, >@@ -166,6 +273,9 @@ void SendSideBandwidthEstimation::SetSendBitrate(DataRate bitrate, > RTC_DCHECK(bitrate > DataRate::Zero()); > // Reset to avoid being capped by the estimate. > delay_based_bitrate_ = DataRate::Zero(); >+ if (loss_based_bandwidth_estimation_.Enabled()) { >+ loss_based_bandwidth_estimation_.MaybeReset(bitrate); >+ } > CapBitrateToThresholds(at_time, bitrate); > // Clear last sent bitrate history so the new value can be used directly > // and not capped. 
>@@ -195,6 +305,10 @@ void SendSideBandwidthEstimation::CurrentEstimate(int* bitrate, > *rtt = last_round_trip_time_.ms<int64_t>(); > } > >+DataRate SendSideBandwidthEstimation::GetEstimatedLinkCapacity() const { >+ return link_capacity_.estimate(); >+} >+ > void SendSideBandwidthEstimation::UpdateReceiverEstimate(Timestamp at_time, > DataRate bandwidth) { > bwe_incoming_ = bandwidth; >@@ -203,10 +317,33 @@ void SendSideBandwidthEstimation::UpdateReceiverEstimate(Timestamp at_time, > > void SendSideBandwidthEstimation::UpdateDelayBasedEstimate(Timestamp at_time, > DataRate bitrate) { >+ if (acknowledged_rate_) { >+ if (bitrate < delay_based_bitrate_) { >+ link_capacity_.OnOveruse(*acknowledged_rate_, at_time); >+ } >+ } > delay_based_bitrate_ = bitrate; > CapBitrateToThresholds(at_time, current_bitrate_); > } > >+void SendSideBandwidthEstimation::SetAcknowledgedRate( >+ absl::optional<DataRate> acknowledged_rate, >+ Timestamp at_time) { >+ acknowledged_rate_ = acknowledged_rate; >+ if (acknowledged_rate && loss_based_bandwidth_estimation_.Enabled()) { >+ loss_based_bandwidth_estimation_.UpdateAcknowledgedBitrate( >+ *acknowledged_rate, at_time); >+ } >+} >+ >+void SendSideBandwidthEstimation::IncomingPacketFeedbackVector( >+ const TransportPacketsFeedback& report) { >+ if (loss_based_bandwidth_estimation_.Enabled()) { >+ loss_based_bandwidth_estimation_.UpdateLossStatistics( >+ report.packet_feedbacks, report.feedback_time); >+ } >+} >+ > void SendSideBandwidthEstimation::UpdateReceiverBlock(uint8_t fraction_loss, > TimeDelta rtt, > int number_of_packets, >@@ -295,11 +432,24 @@ void SendSideBandwidthEstimation::UpdateRtt(TimeDelta rtt, Timestamp at_time) { > > void SendSideBandwidthEstimation::UpdateEstimate(Timestamp at_time) { > DataRate new_bitrate = current_bitrate_; >+ if (rtt_backoff_.RttLowerBound(at_time) > rtt_backoff_.rtt_limit_) { >+ if (at_time - time_last_decrease_ >= rtt_backoff_.drop_interval_) { >+ time_last_decrease_ = at_time; >+ new_bitrate = 
current_bitrate_ * rtt_backoff_.drop_fraction_; >+ link_capacity_.OnRttBackoff(new_bitrate, at_time); >+ } >+ CapBitrateToThresholds(at_time, new_bitrate); >+ return; >+ } >+ > // We trust the REMB and/or delay-based estimate during the first 2 seconds if > // we haven't had any packet loss reported, to allow startup bitrate probing. > if (last_fraction_loss_ == 0 && IsInStartPhase(at_time)) { > new_bitrate = std::max(bwe_incoming_, new_bitrate); > new_bitrate = std::max(delay_based_bitrate_, new_bitrate); >+ if (loss_based_bandwidth_estimation_.Enabled()) { >+ loss_based_bandwidth_estimation_.SetInitialBitrate(new_bitrate); >+ } > > if (new_bitrate != current_bitrate_) { > min_bitrate_history_.clear(); >@@ -314,6 +464,15 @@ void SendSideBandwidthEstimation::UpdateEstimate(Timestamp at_time) { > CapBitrateToThresholds(at_time, current_bitrate_); > return; > } >+ >+ if (loss_based_bandwidth_estimation_.Enabled()) { >+ loss_based_bandwidth_estimation_.Update( >+ at_time, min_bitrate_history_.front().second, last_round_trip_time_); >+ new_bitrate = MaybeRampupOrBackoff(new_bitrate, at_time); >+ CapBitrateToThresholds(at_time, new_bitrate); >+ return; >+ } >+ > TimeDelta time_since_loss_packet_report = at_time - last_loss_packet_report_; > TimeDelta time_since_loss_feedback = at_time - last_loss_feedback_; > if (time_since_loss_packet_report < 1.2 * kMaxRtcpFeedbackInterval) { >@@ -328,7 +487,7 @@ void SendSideBandwidthEstimation::UpdateEstimate(Timestamp at_time) { > // Note that by remembering the bitrate over the last second one can > // rampup up one second faster than if only allowed to start ramping > // at 8% per second rate now. E.g.: >- // If sending a constant 100kbps it can rampup immediatly to 108kbps >+ // If sending a constant 100kbps it can rampup immediately to 108kbps > // whenever a receiver report is received with lower packet loss. 
> // If instead one would do: current_bitrate_ *= 1.08^(delta time), > // it would take over one second since the lower packet loss to achieve >@@ -382,6 +541,12 @@ void SendSideBandwidthEstimation::UpdateEstimate(Timestamp at_time) { > CapBitrateToThresholds(at_time, new_bitrate); > } > >+void SendSideBandwidthEstimation::UpdatePropagationRtt( >+ Timestamp at_time, >+ TimeDelta propagation_rtt) { >+ rtt_backoff_.UpdatePropagationRtt(at_time, propagation_rtt); >+} >+ > bool SendSideBandwidthEstimation::IsInStartPhase(Timestamp at_time) const { > return first_report_time_.IsInfinite() || > at_time - first_report_time_ < kStartPhase; >@@ -407,6 +572,35 @@ void SendSideBandwidthEstimation::UpdateMinHistory(Timestamp at_time) { > min_bitrate_history_.push_back(std::make_pair(at_time, current_bitrate_)); > } > >+DataRate SendSideBandwidthEstimation::MaybeRampupOrBackoff(DataRate new_bitrate, >+ Timestamp at_time) { >+ // TODO(crodbro): reuse this code in UpdateEstimate instead of current >+ // inlining of very similar functionality. >+ const TimeDelta time_since_loss_packet_report = >+ at_time - last_loss_packet_report_; >+ const TimeDelta time_since_loss_feedback = at_time - last_loss_feedback_; >+ if (time_since_loss_packet_report < 1.2 * kMaxRtcpFeedbackInterval) { >+ new_bitrate = min_bitrate_history_.front().second * 1.08; >+ new_bitrate += DataRate::bps(1000); >+ } else if (time_since_loss_feedback > >+ kFeedbackTimeoutIntervals * kMaxRtcpFeedbackInterval && >+ (last_timeout_.IsInfinite() || >+ at_time - last_timeout_ > kTimeoutInterval)) { >+ if (in_timeout_experiment_) { >+ RTC_LOG(LS_WARNING) << "Feedback timed out (" >+ << ToString(time_since_loss_feedback) >+ << "), reducing bitrate."; >+ new_bitrate = new_bitrate * 0.8; >+ // Reset accumulators since we've already acted on missing feedback and >+ // shouldn't act again on these old lost packets. 
>+ lost_packets_since_last_loss_update_ = 0; >+ expected_packets_since_last_loss_update_ = 0; >+ last_timeout_ = at_time; >+ } >+ } >+ return new_bitrate; >+} >+ > void SendSideBandwidthEstimation::CapBitrateToThresholds(Timestamp at_time, > DataRate bitrate) { > if (bwe_incoming_ > DataRate::Zero() && bitrate > bwe_incoming_) { >@@ -416,6 +610,10 @@ void SendSideBandwidthEstimation::CapBitrateToThresholds(Timestamp at_time, > bitrate > delay_based_bitrate_) { > bitrate = delay_based_bitrate_; > } >+ if (loss_based_bandwidth_estimation_.Enabled() && >+ loss_based_bandwidth_estimation_.GetEstimate() > DataRate::Zero()) { >+ bitrate = std::min(bitrate, loss_based_bandwidth_estimation_.GetEstimate()); >+ } > if (bitrate > max_bitrate_configured_) { > bitrate = max_bitrate_configured_; > } >@@ -441,5 +639,10 @@ void SendSideBandwidthEstimation::CapBitrateToThresholds(Timestamp at_time, > last_rtc_event_log_ = at_time; > } > current_bitrate_ = bitrate; >+ >+ if (acknowledged_rate_) { >+ link_capacity_.OnRateUpdate(std::min(current_bitrate_, *acknowledged_rate_), >+ at_time); >+ } > } > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/send_side_bandwidth_estimation.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/send_side_bandwidth_estimation.h >index 2c8b4ee3d5aba0463329d7c24b8200323b6e49b7..b016faba2ee3f987951452faa1ea79048de15678 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/send_side_bandwidth_estimation.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/send_side_bandwidth_estimation.h >@@ -13,27 +13,71 @@ > #ifndef MODULES_BITRATE_CONTROLLER_SEND_SIDE_BANDWIDTH_ESTIMATION_H_ > #define MODULES_BITRATE_CONTROLLER_SEND_SIDE_BANDWIDTH_ESTIMATION_H_ > >+#include <stdint.h> > #include <deque> > #include <utility> > #include <vector> > > #include "absl/types/optional.h" >+#include "api/transport/network_types.h" >+#include 
"api/units/data_rate.h" >+#include "api/units/time_delta.h" >+#include "api/units/timestamp.h" >+#include "modules/bitrate_controller/loss_based_bandwidth_estimation.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" >+#include "rtc_base/experiments/field_trial_parser.h" > > namespace webrtc { > > class RtcEventLog; > >+class LinkCapacityTracker { >+ public: >+ LinkCapacityTracker(); >+ ~LinkCapacityTracker(); >+ void OnOveruse(DataRate acknowledged_rate, Timestamp at_time); >+ void OnStartingRate(DataRate start_rate); >+ void OnRateUpdate(DataRate acknowledged, Timestamp at_time); >+ void OnRttBackoff(DataRate backoff_rate, Timestamp at_time); >+ DataRate estimate() const; >+ >+ private: >+ FieldTrialParameter<TimeDelta> tracking_rate; >+ double capacity_estimate_bps_ = 0; >+ Timestamp last_link_capacity_update_ = Timestamp::MinusInfinity(); >+}; >+ >+class RttBasedBackoff { >+ public: >+ RttBasedBackoff(); >+ ~RttBasedBackoff(); >+ void OnRouteChange(); >+ void UpdatePropagationRtt(Timestamp at_time, TimeDelta propagation_rtt); >+ TimeDelta RttLowerBound(Timestamp at_time) const; >+ >+ FieldTrialParameter<TimeDelta> rtt_limit_; >+ FieldTrialParameter<double> drop_fraction_; >+ FieldTrialParameter<TimeDelta> drop_interval_; >+ FieldTrialFlag persist_on_route_change_; >+ >+ public: >+ Timestamp last_propagation_rtt_update_; >+ TimeDelta last_propagation_rtt_; >+}; >+ > class SendSideBandwidthEstimation { > public: > SendSideBandwidthEstimation() = delete; > explicit SendSideBandwidthEstimation(RtcEventLog* event_log); >- virtual ~SendSideBandwidthEstimation(); >+ ~SendSideBandwidthEstimation(); > >+ void OnRouteChange(); > void CurrentEstimate(int* bitrate, uint8_t* loss, int64_t* rtt) const; >- >+ DataRate GetEstimatedLinkCapacity() const; > // Call periodically to update estimate. 
> void UpdateEstimate(Timestamp at_time); >+ void OnSentPacket(SentPacket sent_packet); >+ void UpdatePropagationRtt(Timestamp at_time, TimeDelta propagation_rtt); > > // Call when we receive a RTCP message with TMMBR or REMB. > void UpdateReceiverEstimate(Timestamp at_time, DataRate bandwidth); >@@ -62,6 +106,9 @@ class SendSideBandwidthEstimation { > void SetSendBitrate(DataRate bitrate, Timestamp at_time); > void SetMinMaxBitrate(DataRate min_bitrate, DataRate max_bitrate); > int GetMinBitrate() const; >+ void SetAcknowledgedRate(absl::optional<DataRate> acknowledged_rate, >+ Timestamp at_time); >+ void IncomingPacketFeedbackVector(const TransportPacketsFeedback& report); > > private: > enum UmaState { kNoUpdate, kFirstDone, kDone }; >@@ -75,16 +122,22 @@ class SendSideBandwidthEstimation { > // min bitrate used during last kBweIncreaseIntervalMs. > void UpdateMinHistory(Timestamp at_time); > >+ DataRate MaybeRampupOrBackoff(DataRate new_bitrate, Timestamp at_time); >+ > // Cap |bitrate| to [min_bitrate_configured_, max_bitrate_configured_] and > // set |current_bitrate_| to the capped value and updates the event log. 
> void CapBitrateToThresholds(Timestamp at_time, DataRate bitrate); > >+ RttBasedBackoff rtt_backoff_; >+ LinkCapacityTracker link_capacity_; >+ > std::deque<std::pair<Timestamp, DataRate> > min_bitrate_history_; > > // incoming filters > int lost_packets_since_last_loss_update_; > int expected_packets_since_last_loss_update_; > >+ absl::optional<DataRate> acknowledged_rate_; > DataRate current_bitrate_; > DataRate min_bitrate_configured_; > DataRate max_bitrate_configured_; >@@ -113,6 +166,7 @@ class SendSideBandwidthEstimation { > float low_loss_threshold_; > float high_loss_threshold_; > DataRate bitrate_threshold_; >+ LossBasedBandwidthEstimation loss_based_bandwidth_estimation_; > }; > } // namespace webrtc > #endif // MODULES_BITRATE_CONTROLLER_SEND_SIDE_BANDWIDTH_ESTIMATION_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/send_side_bandwidth_estimation_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/send_side_bandwidth_estimation_unittest.cc >index becc616cf31561b3b4e869158e3e0d48246b74af..ccfbd8550327e86fa1757248f7409b5be1b46fb7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/send_side_bandwidth_estimation_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/bitrate_controller/send_side_bandwidth_estimation_unittest.cc >@@ -23,7 +23,7 @@ MATCHER(LossBasedBweUpdateWithBitrateOnly, "") { > return false; > } > auto bwe_event = static_cast<RtcEventBweUpdateLossBased*>(arg); >- return bwe_event->bitrate_bps_ > 0 && bwe_event->fraction_loss_ == 0; >+ return bwe_event->bitrate_bps() > 0 && bwe_event->fraction_loss() == 0; > } > > MATCHER(LossBasedBweUpdateWithBitrateAndLossFraction, "") { >@@ -31,7 +31,7 @@ MATCHER(LossBasedBweUpdateWithBitrateAndLossFraction, "") { > return false; > } > auto bwe_event = static_cast<RtcEventBweUpdateLossBased*>(arg); >- return bwe_event->bitrate_bps_ > 0 && bwe_event->fraction_loss_ > 0; >+ return 
bwe_event->bitrate_bps() > 0 && bwe_event->fraction_loss() > 0; > } > > void TestProbing(bool use_delay_based) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/BUILD.gn >index 2fcb71212f5247cdab800b22571f0186a05e98f7..73d81f16ec27cb021ab9f16f73d9898c6d977385 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/BUILD.gn >@@ -20,8 +20,6 @@ rtc_static_library("congestion_controller") { > visibility = [ "*" ] > configs += [ ":bwe_test_logging" ] > sources = [ >- "congestion_window_pushback_controller.cc", >- "congestion_window_pushback_controller.h", > "include/network_changed_observer.h", > "include/receive_side_congestion_controller.h", > "include/send_side_congestion_controller.h", >@@ -43,6 +41,7 @@ rtc_static_library("congestion_controller") { > "../../rtc_base:checks", > "../../rtc_base:ptr_util", > "../../rtc_base:rate_limiter", >+ "../../rtc_base/network:sent_packet", > "../../system_wrappers", > "../../system_wrappers:field_trial", > "../bitrate_controller", >@@ -52,6 +51,7 @@ rtc_static_library("congestion_controller") { > "goog_cc:delay_based_bwe", > "goog_cc:estimators", > "goog_cc:probe_controller", >+ "goog_cc:pushback_controller", > "//third_party/abseil-cpp/absl/memory", > ] > >@@ -98,7 +98,6 @@ if (rtc_include_tests) { > testonly = true > > sources = [ >- "congestion_window_pushback_controller_unittest.cc", > "receive_side_congestion_controller_unittest.cc", > "send_side_congestion_controller_unittest.cc", > "transport_feedback_adapter_unittest.cc", >@@ -111,7 +110,7 @@ if (rtc_include_tests) { > "../../rtc_base:checks", > "../../rtc_base:rtc_base", > "../../rtc_base:rtc_base_approved", >- "../../rtc_base:rtc_base_tests_utils", >+ "../../rtc_base/network:sent_packet", > "../../system_wrappers", > 
"../../test:field_trial", > "../../test:test_support", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/bbr/bbr_network_controller.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/bbr/bbr_network_controller.cc >index b82ded42202cc4740eaf993c05aa7e743cb8ac0f..863ce4bf98d9692292cf8fea490c27c4b63044f6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/bbr/bbr_network_controller.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/bbr/bbr_network_controller.cc >@@ -401,16 +401,15 @@ bool BbrNetworkController::IsProbingForMoreBandwidth() const { > > NetworkControlUpdate BbrNetworkController::OnTransportPacketsFeedback( > TransportPacketsFeedback msg) { >+ if (msg.packet_feedbacks.empty()) >+ return NetworkControlUpdate(); >+ > Timestamp feedback_recv_time = msg.feedback_time; >- absl::optional<SentPacket> last_sent_packet = >- msg.PacketsWithFeedback().back().sent_packet; >- if (!last_sent_packet.has_value()) { >- RTC_LOG(LS_WARNING) << "Last ack packet not in history, no RTT update"; >- } else { >- Timestamp send_time = last_sent_packet->send_time; >- TimeDelta send_delta = feedback_recv_time - send_time; >- rtt_stats_.UpdateRtt(send_delta, TimeDelta::Zero(), feedback_recv_time); >- } >+ SentPacket last_sent_packet = msg.PacketsWithFeedback().back().sent_packet; >+ >+ Timestamp send_time = last_sent_packet.send_time; >+ TimeDelta send_delta = feedback_recv_time - send_time; >+ rtt_stats_.UpdateRtt(send_delta, TimeDelta::Zero(), feedback_recv_time); > > const DataSize total_data_acked_before = sampler_->total_data_acked(); > >@@ -431,7 +430,7 @@ NetworkControlUpdate BbrNetworkController::OnTransportPacketsFeedback( > // Input the new data into the BBR model of the connection. 
> if (!acked_packets.empty()) { > int64_t last_acked_packet = >- acked_packets.rbegin()->sent_packet->sequence_number; >+ acked_packets.rbegin()->sent_packet.sequence_number; > > is_round_start = UpdateRoundTripCounter(last_acked_packet); > min_rtt_expired = >@@ -472,7 +471,7 @@ NetworkControlUpdate BbrNetworkController::OnTransportPacketsFeedback( > DataSize data_acked = sampler_->total_data_acked() - total_data_acked_before; > DataSize data_lost = DataSize::Zero(); > for (const PacketResult& packet : lost_packets) { >- data_lost += packet.sent_packet->size; >+ data_lost += packet.sent_packet.size; > } > > // After the model is updated, recalculate the pacing rate and congestion >@@ -483,7 +482,7 @@ NetworkControlUpdate BbrNetworkController::OnTransportPacketsFeedback( > // Cleanup internal state. > if (!acked_packets.empty()) { > sampler_->RemoveObsoletePackets( >- acked_packets.back().sent_packet->sequence_number); >+ acked_packets.back().sent_packet.sequence_number); > } > return CreateRateUpdate(msg.feedback_time); > } >@@ -550,7 +549,7 @@ void BbrNetworkController::EnterProbeBandwidthMode(Timestamp now) { > void BbrNetworkController::DiscardLostPackets( > const std::vector<PacketResult>& lost_packets) { > for (const PacketResult& packet : lost_packets) { >- sampler_->OnPacketLost(packet.sent_packet->sequence_number); >+ sampler_->OnPacketLost(packet.sent_packet.sequence_number); > } > } > >@@ -569,8 +568,8 @@ bool BbrNetworkController::UpdateBandwidthAndMinRtt( > const std::vector<PacketResult>& acked_packets) { > TimeDelta sample_rtt = TimeDelta::PlusInfinity(); > for (const auto& packet : acked_packets) { >- BandwidthSample bandwidth_sample = sampler_->OnPacketAcknowledged( >- now, packet.sent_packet->sequence_number); >+ BandwidthSample bandwidth_sample = >+ sampler_->OnPacketAcknowledged(now, packet.sent_packet.sequence_number); > last_sample_is_app_limited_ = bandwidth_sample.is_app_limited; > if (!bandwidth_sample.rtt.IsZero()) { > sample_rtt = 
std::min(sample_rtt, bandwidth_sample.rtt); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/congestion_window_pushback_controller.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/congestion_window_pushback_controller.cc >deleted file mode 100644 >index daa51bddd30977992d8ad0e9f8529377be6d3ae4..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/congestion_window_pushback_controller.cc >+++ /dev/null >@@ -1,93 +0,0 @@ >-/* >- * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include <string> >-#include <algorithm> >- >-#include "modules/congestion_controller/congestion_window_pushback_controller.h" >-#include "system_wrappers/include/field_trial.h" >- >-namespace webrtc { >- >-// When CongestionWindowPushback is enabled, the pacer is oblivious to >-// the congestion window. The relation between outstanding data and >-// the congestion window affects encoder allocations directly. >-// This experiment is build on top of congestion window experiment. 
>-const char kCongestionPushbackExperiment[] = "WebRTC-CongestionWindowPushback"; >-const uint32_t kDefaultMinPushbackTargetBitrateBps = 30000; >- >-bool ReadCongestionWindowPushbackExperimentParameter( >- uint32_t* min_pushback_target_bitrate_bps) { >- RTC_DCHECK(min_pushback_target_bitrate_bps); >- std::string experiment_string = >- webrtc::field_trial::FindFullName(kCongestionPushbackExperiment); >- int parsed_values = sscanf(experiment_string.c_str(), "Enabled-%" PRIu32, >- min_pushback_target_bitrate_bps); >- if (parsed_values == 1) { >- RTC_CHECK_GE(*min_pushback_target_bitrate_bps, 0) >- << "Min pushback target bitrate must be greater than or equal to 0."; >- return true; >- } >- return false; >-} >- >-CongestionWindowPushbackController::CongestionWindowPushbackController() { >- if (!ReadCongestionWindowPushbackExperimentParameter( >- &min_pushback_target_bitrate_bps_)) { >- min_pushback_target_bitrate_bps_ = kDefaultMinPushbackTargetBitrateBps; >- } >-} >- >-void CongestionWindowPushbackController::UpdateOutstandingData( >- size_t outstanding_bytes) { >- outstanding_bytes_ = outstanding_bytes; >-} >- >-void CongestionWindowPushbackController::UpdateMaxOutstandingData( >- size_t max_outstanding_bytes) { >- DataSize data_window = DataSize::bytes(max_outstanding_bytes); >- if (current_data_window_) { >- data_window = (data_window + current_data_window_.value()) / 2; >- } >- current_data_window_ = data_window; >-} >- >-void CongestionWindowPushbackController::SetDataWindow(DataSize data_window) { >- current_data_window_ = data_window; >-} >- >-uint32_t CongestionWindowPushbackController::UpdateTargetBitrate( >- uint32_t bitrate_bps) { >- if (!current_data_window_ || current_data_window_->IsZero()) >- return bitrate_bps; >- double fill_ratio = >- outstanding_bytes_ / static_cast<double>(current_data_window_->bytes()); >- if (fill_ratio > 1.5) { >- encoding_rate_ratio_ *= 0.9; >- } else if (fill_ratio > 1) { >- encoding_rate_ratio_ *= 0.95; >- } else if 
(fill_ratio < 0.1) { >- encoding_rate_ratio_ = 1.0; >- } else { >- encoding_rate_ratio_ *= 1.05; >- encoding_rate_ratio_ = std::min(encoding_rate_ratio_, 1.0); >- } >- uint32_t adjusted_target_bitrate_bps = >- static_cast<uint32_t>(bitrate_bps * encoding_rate_ratio_); >- >- // Do not adjust below the minimum pushback bitrate but do obey if the >- // original estimate is below it. >- bitrate_bps = adjusted_target_bitrate_bps < min_pushback_target_bitrate_bps_ >- ? std::min(bitrate_bps, min_pushback_target_bitrate_bps_) >- : adjusted_target_bitrate_bps; >- return bitrate_bps; >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/congestion_window_pushback_controller.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/congestion_window_pushback_controller.h >deleted file mode 100644 >index f608590583bdb811f846a43ca9e3588745392488..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/congestion_window_pushback_controller.h >+++ /dev/null >@@ -1,43 +0,0 @@ >-/* >- * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef MODULES_CONGESTION_CONTROLLER_CONGESTION_WINDOW_PUSHBACK_CONTROLLER_H_ >-#define MODULES_CONGESTION_CONTROLLER_CONGESTION_WINDOW_PUSHBACK_CONTROLLER_H_ >- >-#include "api/transport/network_types.h" >-#include "common_types.h" // NOLINT(build/include) >-#include "rtc_base/criticalsection.h" >-#include "rtc_base/format_macros.h" >- >-namespace webrtc { >- >-// This class enables pushback from congestion window directly to video encoder. 
>-// When the congestion window is filling up, the video encoder target bitrate >-// will be reduced accordingly to accommodate the network changes. To avoid >-// pausing video too frequently, a minimum encoder target bitrate threshold is >-// used to prevent video pause due to a full congestion window. >-class CongestionWindowPushbackController { >- public: >- CongestionWindowPushbackController(); >- void UpdateOutstandingData(size_t outstanding_bytes); >- void UpdateMaxOutstandingData(size_t max_outstanding_bytes); >- uint32_t UpdateTargetBitrate(uint32_t bitrate_bps); >- void SetDataWindow(DataSize data_window); >- >- private: >- absl::optional<DataSize> current_data_window_; >- size_t outstanding_bytes_ = 0; >- uint32_t min_pushback_target_bitrate_bps_; >- double encoding_rate_ratio_ = 1.0; >-}; >- >-} // namespace webrtc >- >-#endif // MODULES_CONGESTION_CONTROLLER_CONGESTION_WINDOW_PUSHBACK_CONTROLLER_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/congestion_window_pushback_controller_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/congestion_window_pushback_controller_unittest.cc >deleted file mode 100644 >index d9d6a79d62ffbc60bdd8a558d5e01b9524b54f1d..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/congestion_window_pushback_controller_unittest.cc >+++ /dev/null >@@ -1,65 +0,0 @@ >-/* >- * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include "modules/congestion_controller/congestion_window_pushback_controller.h" >-#include "test/gmock.h" >-#include "test/gtest.h" >- >-using testing::_; >- >-namespace webrtc { >-namespace test { >- >-class CongestionWindowPushbackControllerTest : public ::testing::Test { >- protected: >- CongestionWindowPushbackController cwnd_controller_; >-}; >- >-TEST_F(CongestionWindowPushbackControllerTest, FullCongestionWindow) { >- cwnd_controller_.UpdateOutstandingData(100000); >- cwnd_controller_.UpdateMaxOutstandingData(50000); >- >- uint32_t bitrate_bps = 80000; >- bitrate_bps = cwnd_controller_.UpdateTargetBitrate(bitrate_bps); >- EXPECT_EQ(72000u, bitrate_bps); >- >- cwnd_controller_.UpdateMaxOutstandingData(50000); >- bitrate_bps = cwnd_controller_.UpdateTargetBitrate(bitrate_bps); >- EXPECT_EQ(static_cast<uint32_t>(72000 * 0.9 * 0.9), bitrate_bps); >-} >- >-TEST_F(CongestionWindowPushbackControllerTest, NormalCongestionWindow) { >- cwnd_controller_.UpdateOutstandingData(100000); >- cwnd_controller_.SetDataWindow(DataSize::bytes(200000)); >- >- uint32_t bitrate_bps = 80000; >- bitrate_bps = cwnd_controller_.UpdateTargetBitrate(bitrate_bps); >- EXPECT_EQ(80000u, bitrate_bps); >- >- cwnd_controller_.UpdateMaxOutstandingData(20000); >- bitrate_bps = cwnd_controller_.UpdateTargetBitrate(bitrate_bps); >- EXPECT_EQ(80000u, bitrate_bps); >-} >- >-TEST_F(CongestionWindowPushbackControllerTest, LowBitrate) { >- cwnd_controller_.UpdateOutstandingData(100000); >- cwnd_controller_.SetDataWindow(DataSize::bytes(50000)); >- >- uint32_t bitrate_bps = 35000; >- bitrate_bps = cwnd_controller_.UpdateTargetBitrate(bitrate_bps); >- EXPECT_EQ(static_cast<uint32_t>(35000 * 0.9), bitrate_bps); >- >- cwnd_controller_.UpdateMaxOutstandingData(20000); >- bitrate_bps = cwnd_controller_.UpdateTargetBitrate(bitrate_bps); >- EXPECT_EQ(30000u, bitrate_bps); >-} >- >-} // namespace test >-} // namespace webrtc >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/BUILD.gn >index b12690b817e13cccb5e0063c937323ca75b0aec2..38169bbb4d029a3e85706ca6b9c4d79b7a745e3a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/BUILD.gn >@@ -19,10 +19,8 @@ config("bwe_test_logging") { > rtc_static_library("goog_cc") { > configs += [ ":bwe_test_logging" ] > sources = [ >- "goog_cc_factory.cc", > "goog_cc_network_control.cc", > "goog_cc_network_control.h", >- "include/goog_cc_factory.h", > ] > > deps = [ >@@ -30,6 +28,7 @@ rtc_static_library("goog_cc") { > ":delay_based_bwe", > ":estimators", > ":probe_controller", >+ ":pushback_controller", > "../..:module_api", > "../../..:webrtc_common", > "../../../api/transport:network_control", >@@ -49,6 +48,20 @@ rtc_static_library("goog_cc") { > ] > } > >+rtc_source_set("pushback_controller") { >+ sources = [ >+ "congestion_window_pushback_controller.cc", >+ "congestion_window_pushback_controller.h", >+ ] >+ deps = [ >+ "../../../api/transport:network_control", >+ "../../../rtc_base:checks", >+ "../../../rtc_base:rtc_base_approved", >+ "../../../system_wrappers:field_trial", >+ "//third_party/abseil-cpp/absl/types:optional", >+ ] >+} >+ > rtc_source_set("alr_detector") { > sources = [ > "alr_detector.cc", >@@ -84,6 +97,7 @@ rtc_source_set("estimators") { > ] > > deps = [ >+ "../../../api/units:data_rate", > "../../../logging:rtc_event_bwe", > "../../../logging:rtc_event_log_api", > "../../../rtc_base:checks", >@@ -107,6 +121,7 @@ rtc_source_set("delay_based_bwe") { > > deps = [ > ":estimators", >+ "../../../api/transport:network_control", > "../../../logging:rtc_event_bwe", > "../../../logging:rtc_event_log_api", > "../../../rtc_base:checks", >@@ -115,7 +130,9 @@ rtc_source_set("delay_based_bwe") 
{ > "../../../system_wrappers:metrics", > "../../pacing", > "../../remote_bitrate_estimator", >+ "../../rtp_rtcp:rtp_rtcp_format", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/types:optional", > ] > } > >@@ -149,6 +166,7 @@ if (rtc_include_tests) { > ":estimators", > ":goog_cc", > "..:test_controller_printer", >+ "../../../api/transport:goog_cc", > ] > } > rtc_source_set("goog_cc_unittests") { >@@ -157,6 +175,7 @@ if (rtc_include_tests) { > sources = [ > "acknowledged_bitrate_estimator_unittest.cc", > "alr_detector_unittest.cc", >+ "congestion_window_pushback_controller_unittest.cc", > "delay_based_bwe_unittest.cc", > "delay_based_bwe_unittest_helper.cc", > "delay_based_bwe_unittest_helper.h", >@@ -175,6 +194,8 @@ if (rtc_include_tests) { > ":estimators", > ":goog_cc", > ":probe_controller", >+ ":pushback_controller", >+ "../../../api/transport:goog_cc", > "../../../api/transport:network_control", > "../../../api/transport:network_control_test", > "../../../logging:mocks", >@@ -194,4 +215,39 @@ if (rtc_include_tests) { > "//third_party/abseil-cpp/absl/memory", > ] > } >+ rtc_source_set("goog_cc_slow_tests") { >+ testonly = true >+ >+ sources = [ >+ "goog_cc_network_control_slowtest.cc", >+ ] >+ if (!build_with_chromium && is_clang) { >+ suppressed_configs += [ "//build/config/clang:find_bad_constructs" ] >+ } >+ deps = [ >+ ":alr_detector", >+ ":delay_based_bwe", >+ ":estimators", >+ ":goog_cc", >+ ":probe_controller", >+ ":pushback_controller", >+ "../../../api/transport:goog_cc", >+ "../../../api/transport:network_control", >+ "../../../api/transport:network_control_test", >+ "../../../logging:mocks", >+ "../../../rtc_base:checks", >+ "../../../rtc_base:rtc_base_approved", >+ "../../../rtc_base/experiments:alr_experiment", >+ "../../../system_wrappers", >+ "../../../system_wrappers:field_trial", >+ "../../../test:field_trial", >+ "../../../test:test_support", >+ "../../../test/scenario", >+ "../../pacing", >+ 
"../../remote_bitrate_estimator", >+ "../../rtp_rtcp:rtp_rtcp_format", >+ "//testing/gmock", >+ "//third_party/abseil-cpp/absl/memory", >+ ] >+ } > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/acknowledged_bitrate_estimator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/acknowledged_bitrate_estimator.cc >index 0722b2218bd597f7873f8a178766de6975579735..939da4cc33ceafd0b6b62d44f7fc6ce642ab7c8b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/acknowledged_bitrate_estimator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/acknowledged_bitrate_estimator.cc >@@ -10,11 +10,15 @@ > > #include "modules/congestion_controller/goog_cc/acknowledged_bitrate_estimator.h" > >+#include <stddef.h> >+#include <algorithm> > #include <utility> > > #include "absl/memory/memory.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" >+#include "rtc_base/checks.h" > #include "rtc_base/numerics/safe_conversions.h" >+#include "system_wrappers/include/field_trial.h" > > namespace webrtc { > >@@ -31,7 +35,9 @@ AcknowledgedBitrateEstimator::~AcknowledgedBitrateEstimator() {} > > AcknowledgedBitrateEstimator::AcknowledgedBitrateEstimator( > std::unique_ptr<BitrateEstimator> bitrate_estimator) >- : bitrate_estimator_(std::move(bitrate_estimator)) {} >+ : account_for_unacknowledged_traffic_( >+ field_trial::IsEnabled("WebRTC-Bwe-AccountForUnacked")), >+ bitrate_estimator_(std::move(bitrate_estimator)) {} > > void AcknowledgedBitrateEstimator::IncomingPacketFeedbackVector( > const std::vector<PacketFeedback>& packet_feedback_vector) { >@@ -41,18 +47,41 @@ void AcknowledgedBitrateEstimator::IncomingPacketFeedbackVector( > for (const auto& packet : packet_feedback_vector) { > if (IsInSendTimeHistory(packet)) { > MaybeExpectFastRateChange(packet.send_time_ms); >- bitrate_estimator_->Update(packet.arrival_time_ms, >- 
rtc::dchecked_cast<int>(packet.payload_size)); >+ int acknowledged_estimate = rtc::dchecked_cast<int>(packet.payload_size); >+ if (account_for_unacknowledged_traffic_) >+ acknowledged_estimate += packet.unacknowledged_data; >+ bitrate_estimator_->Update(packet.arrival_time_ms, acknowledged_estimate); > } > } > } > > absl::optional<uint32_t> AcknowledgedBitrateEstimator::bitrate_bps() const { > auto estimated_bitrate = bitrate_estimator_->bitrate_bps(); >- return estimated_bitrate >- ? *estimated_bitrate + >- allocated_bitrate_without_feedback_bps_.value_or(0) >- : estimated_bitrate; >+ // If we account for unacknowledged traffic, we should not add the allocated >+ // bitrate for unallocated stream as we expect it to be included already. >+ if (account_for_unacknowledged_traffic_) { >+ return estimated_bitrate; >+ } else { >+ return estimated_bitrate >+ ? *estimated_bitrate + allocated_bitrate_without_feedback_bps_ >+ : estimated_bitrate; >+ } >+} >+ >+absl::optional<uint32_t> AcknowledgedBitrateEstimator::PeekBps() const { >+ return bitrate_estimator_->PeekBps(); >+} >+ >+absl::optional<DataRate> AcknowledgedBitrateEstimator::bitrate() const { >+ if (bitrate_bps()) >+ return DataRate::bps(*bitrate_bps()); >+ return absl::nullopt; >+} >+ >+absl::optional<DataRate> AcknowledgedBitrateEstimator::PeekRate() const { >+ if (PeekBps()) >+ return DataRate::bps(*PeekBps()); >+ return absl::nullopt; > } > > void AcknowledgedBitrateEstimator::SetAlrEndedTimeMs( >@@ -62,7 +91,7 @@ void AcknowledgedBitrateEstimator::SetAlrEndedTimeMs( > > void AcknowledgedBitrateEstimator::SetAllocatedBitrateWithoutFeedback( > uint32_t bitrate_bps) { >- allocated_bitrate_without_feedback_bps_.emplace(bitrate_bps); >+ allocated_bitrate_without_feedback_bps_ = bitrate_bps; > } > > void AcknowledgedBitrateEstimator::MaybeExpectFastRateChange( >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/acknowledged_bitrate_estimator.h 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/acknowledged_bitrate_estimator.h >index ccb9718b2509e6ac783fe6935473963af2942ea7..645c0b913a7e04e6b7a0d51600b73388e9d2a4d3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/acknowledged_bitrate_estimator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/acknowledged_bitrate_estimator.h >@@ -17,6 +17,8 @@ > #include "absl/types/optional.h" > #include "modules/congestion_controller/goog_cc/bitrate_estimator.h" > >+#include "api/units/data_rate.h" >+ > namespace webrtc { > > struct PacketFeedback; >@@ -32,14 +34,18 @@ class AcknowledgedBitrateEstimator { > void IncomingPacketFeedbackVector( > const std::vector<PacketFeedback>& packet_feedback_vector); > absl::optional<uint32_t> bitrate_bps() const; >+ absl::optional<uint32_t> PeekBps() const; >+ absl::optional<DataRate> bitrate() const; >+ absl::optional<DataRate> PeekRate() const; > void SetAlrEndedTimeMs(int64_t alr_ended_time_ms); > void SetAllocatedBitrateWithoutFeedback(uint32_t bitrate_bps); > > private: > void MaybeExpectFastRateChange(int64_t packet_arrival_time_ms); >+ const bool account_for_unacknowledged_traffic_; > absl::optional<int64_t> alr_ended_time_ms_; > std::unique_ptr<BitrateEstimator> bitrate_estimator_; >- absl::optional<uint32_t> allocated_bitrate_without_feedback_bps_; >+ uint32_t allocated_bitrate_without_feedback_bps_ = 0; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/alr_detector.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/alr_detector.cc >index ed8f3a66f4128c8c4bef81a4ee7564389bcdbf7f..db669428f022123c551f7704f7fa54d68a6d4e76 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/alr_detector.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/alr_detector.cc >@@ -10,20 +10,17 @@ > > #include "modules/congestion_controller/goog_cc/alr_detector.h" > >-#include <algorithm> >+#include <cstdint> > #include <cstdio> >-#include <string> > > #include "absl/memory/memory.h" >+#include "logging/rtc_event_log/events/rtc_event.h" > #include "logging/rtc_event_log/events/rtc_event_alr_state.h" > #include "logging/rtc_event_log/rtc_event_log.h" > #include "rtc_base/checks.h" > #include "rtc_base/experiments/alr_experiment.h" >-#include "rtc_base/format_macros.h" >-#include "rtc_base/logging.h" > #include "rtc_base/numerics/safe_conversions.h" > #include "rtc_base/timeutils.h" >-#include "system_wrappers/include/field_trial.h" > > namespace webrtc { > AlrDetector::AlrDetector() : AlrDetector(nullptr) {} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/alr_detector.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/alr_detector.h >index 874354bc580d9f20ced28c1f1f80bfab6b987366..c30ba1d0932d17b51ce3ea83cfb24d843f84f483 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/alr_detector.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/alr_detector.h >@@ -11,10 +11,11 @@ > #ifndef MODULES_CONGESTION_CONTROLLER_GOOG_CC_ALR_DETECTOR_H_ > #define MODULES_CONGESTION_CONTROLLER_GOOG_CC_ALR_DETECTOR_H_ > >+#include <stddef.h> >+#include <stdint.h> >+ > #include "absl/types/optional.h" >-#include "common_types.h" // NOLINT(build/include) > #include "modules/pacing/interval_budget.h" >-#include "rtc_base/rate_statistics.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/alr_detector_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/alr_detector_unittest.cc >index 
3a3d002c23be081e511d12d2dbd7ee027177f8df..425268514d8b3a34510a8c7fc3f95d36094570fc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/alr_detector_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/alr_detector_unittest.cc >@@ -10,6 +10,7 @@ > > #include "modules/congestion_controller/goog_cc/alr_detector.h" > >+#include "rtc_base/checks.h" > #include "rtc_base/experiments/alr_experiment.h" > #include "test/field_trial.h" > #include "test/gtest.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/bitrate_estimator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/bitrate_estimator.cc >index c776827802c9ec058702b68db890c59b701417bf..0a403927429fc9a207b03fef0219476de1210d6d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/bitrate_estimator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/bitrate_estimator.cc >@@ -10,11 +10,11 @@ > > #include "modules/congestion_controller/goog_cc/bitrate_estimator.h" > >+#include <stdio.h> > #include <cmath> > #include <string> > > #include "modules/remote_bitrate_estimator/test/bwe_test_logging.h" >-#include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > #include "rtc_base/logging.h" > #include "system_wrappers/include/field_trial.h" > >@@ -132,6 +132,12 @@ absl::optional<uint32_t> BitrateEstimator::bitrate_bps() const { > return bitrate_estimate_ * 1000; > } > >+absl::optional<uint32_t> BitrateEstimator::PeekBps() const { >+ if (current_window_ms_ > 0) >+ return sum_ * 8000 / current_window_ms_; >+ return absl::nullopt; >+} >+ > void BitrateEstimator::ExpectFastRateChange() { > // By setting the bitrate-estimate variance to a higher value we allow the > // bitrate to change fast for the next few samples. 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/bitrate_estimator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/bitrate_estimator.h >index 610fa214e7a900c9992b8c22158563d29c3978c2..d3df2b510472d0a81511a8c7819c4d6e0ef9dfba 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/bitrate_estimator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/bitrate_estimator.h >@@ -11,7 +11,7 @@ > #ifndef MODULES_CONGESTION_CONTROLLER_GOOG_CC_BITRATE_ESTIMATOR_H_ > #define MODULES_CONGESTION_CONTROLLER_GOOG_CC_BITRATE_ESTIMATOR_H_ > >-#include <vector> >+#include <stdint.h> > > #include "absl/types/optional.h" > >@@ -29,6 +29,7 @@ class BitrateEstimator { > virtual void Update(int64_t now_ms, int bytes); > > virtual absl::optional<uint32_t> bitrate_bps() const; >+ absl::optional<uint32_t> PeekBps() const; > > virtual void ExpectFastRateChange(); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/congestion_window_pushback_controller.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/congestion_window_pushback_controller.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..1e59cb5b4dfe102144e0e6b398e51ecd66a629c1 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/congestion_window_pushback_controller.cc >@@ -0,0 +1,99 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include <algorithm> >+#include <string> >+ >+#include "modules/congestion_controller/goog_cc/congestion_window_pushback_controller.h" >+#include "rtc_base/checks.h" >+#include "rtc_base/format_macros.h" >+#include "system_wrappers/include/field_trial.h" >+ >+namespace webrtc { >+ >+// When CongestionWindowPushback is enabled, the pacer is oblivious to >+// the congestion window. The relation between outstanding data and >+// the congestion window affects encoder allocations directly. >+// This experiment is built on top of the congestion window experiment. >+const char kCongestionPushbackExperiment[] = "WebRTC-CongestionWindowPushback"; >+const uint32_t kDefaultMinPushbackTargetBitrateBps = 30000; >+ >+bool ReadCongestionWindowPushbackExperimentParameter( >+ uint32_t* min_pushback_target_bitrate_bps) { >+ RTC_DCHECK(min_pushback_target_bitrate_bps); >+ std::string experiment_string = >+ webrtc::field_trial::FindFullName(kCongestionPushbackExperiment); >+ int parsed_values = sscanf(experiment_string.c_str(), "Enabled-%" PRIu32, >+ min_pushback_target_bitrate_bps); >+ if (parsed_values == 1) { >+ RTC_CHECK_GE(*min_pushback_target_bitrate_bps, 0) >+ << "Min pushback target bitrate must be greater than or equal to 0."; >+ return true; >+ } >+ return false; >+} >+ >+CongestionWindowPushbackController::CongestionWindowPushbackController() { >+ if (!ReadCongestionWindowPushbackExperimentParameter( >+ &min_pushback_target_bitrate_bps_)) { >+ min_pushback_target_bitrate_bps_ = kDefaultMinPushbackTargetBitrateBps; >+ } >+} >+ >+CongestionWindowPushbackController::CongestionWindowPushbackController( >+ uint32_t min_pushback_target_bitrate_bps) >+ : min_pushback_target_bitrate_bps_(min_pushback_target_bitrate_bps) {} >+ >+void CongestionWindowPushbackController::UpdateOutstandingData( >+ size_t outstanding_bytes) { >+ outstanding_bytes_ = outstanding_bytes; >+} >+ >+void CongestionWindowPushbackController::UpdateMaxOutstandingData( >+ size_t max_outstanding_bytes) { >+ 
DataSize data_window = DataSize::bytes(max_outstanding_bytes); >+ if (current_data_window_) { >+ data_window = (data_window + current_data_window_.value()) / 2; >+ } >+ current_data_window_ = data_window; >+} >+ >+void CongestionWindowPushbackController::SetDataWindow(DataSize data_window) { >+ current_data_window_ = data_window; >+} >+ >+uint32_t CongestionWindowPushbackController::UpdateTargetBitrate( >+ uint32_t bitrate_bps) { >+ if (!current_data_window_ || current_data_window_->IsZero()) >+ return bitrate_bps; >+ double fill_ratio = >+ outstanding_bytes_ / static_cast<double>(current_data_window_->bytes()); >+ if (fill_ratio > 1.5) { >+ encoding_rate_ratio_ *= 0.9; >+ } else if (fill_ratio > 1) { >+ encoding_rate_ratio_ *= 0.95; >+ } else if (fill_ratio < 0.1) { >+ encoding_rate_ratio_ = 1.0; >+ } else { >+ encoding_rate_ratio_ *= 1.05; >+ encoding_rate_ratio_ = std::min(encoding_rate_ratio_, 1.0); >+ } >+ uint32_t adjusted_target_bitrate_bps = >+ static_cast<uint32_t>(bitrate_bps * encoding_rate_ratio_); >+ >+ // Do not adjust below the minimum pushback bitrate but do obey if the >+ // original estimate is below it. >+ bitrate_bps = adjusted_target_bitrate_bps < min_pushback_target_bitrate_bps_ >+ ? std::min(bitrate_bps, min_pushback_target_bitrate_bps_) >+ : adjusted_target_bitrate_bps; >+ return bitrate_bps; >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/congestion_window_pushback_controller.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/congestion_window_pushback_controller.h >new file mode 100644 >index 0000000000000000000000000000000000000000..7ef0974cce3459611d90a74b2ad6969cff7aaf38 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/congestion_window_pushback_controller.h >@@ -0,0 +1,42 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. 
>+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef MODULES_CONGESTION_CONTROLLER_GOOG_CC_CONGESTION_WINDOW_PUSHBACK_CONTROLLER_H_ >+#define MODULES_CONGESTION_CONTROLLER_GOOG_CC_CONGESTION_WINDOW_PUSHBACK_CONTROLLER_H_ >+ >+#include "api/transport/network_types.h" >+ >+namespace webrtc { >+ >+// This class enables pushback from the congestion window directly to the >+// video encoder. When the congestion window is filling up, the video encoder >+// target bitrate will be reduced accordingly to accommodate network changes. >+// To avoid pausing video too frequently, a minimum encoder target bitrate >+// threshold is used to prevent video pause due to a full congestion window. >+class CongestionWindowPushbackController { >+ public: >+ CongestionWindowPushbackController(); >+ explicit CongestionWindowPushbackController( >+ uint32_t min_pushback_target_bitrate_bps); >+ void UpdateOutstandingData(size_t outstanding_bytes); >+ void UpdateMaxOutstandingData(size_t max_outstanding_bytes); >+ uint32_t UpdateTargetBitrate(uint32_t bitrate_bps); >+ void SetDataWindow(DataSize data_window); >+ >+ private: >+ absl::optional<DataSize> current_data_window_; >+ size_t outstanding_bytes_ = 0; >+ uint32_t min_pushback_target_bitrate_bps_; >+ double encoding_rate_ratio_ = 1.0; >+}; >+ >+} // namespace webrtc >+ >+#endif // MODULES_CONGESTION_CONTROLLER_GOOG_CC_CONGESTION_WINDOW_PUSHBACK_CONTROLLER_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/congestion_window_pushback_controller_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/congestion_window_pushback_controller_unittest.cc >new file mode 
100644 >index 0000000000000000000000000000000000000000..30617eebb52cce983b20b8ed4f798b25c866c6a9 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/congestion_window_pushback_controller_unittest.cc >@@ -0,0 +1,65 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "modules/congestion_controller/goog_cc/congestion_window_pushback_controller.h" >+#include "test/gmock.h" >+#include "test/gtest.h" >+ >+using testing::_; >+ >+namespace webrtc { >+namespace test { >+ >+class CongestionWindowPushbackControllerTest : public ::testing::Test { >+ protected: >+ CongestionWindowPushbackController cwnd_controller_; >+}; >+ >+TEST_F(CongestionWindowPushbackControllerTest, FullCongestionWindow) { >+ cwnd_controller_.UpdateOutstandingData(100000); >+ cwnd_controller_.UpdateMaxOutstandingData(50000); >+ >+ uint32_t bitrate_bps = 80000; >+ bitrate_bps = cwnd_controller_.UpdateTargetBitrate(bitrate_bps); >+ EXPECT_EQ(72000u, bitrate_bps); >+ >+ cwnd_controller_.UpdateMaxOutstandingData(50000); >+ bitrate_bps = cwnd_controller_.UpdateTargetBitrate(bitrate_bps); >+ EXPECT_EQ(static_cast<uint32_t>(72000 * 0.9 * 0.9), bitrate_bps); >+} >+ >+TEST_F(CongestionWindowPushbackControllerTest, NormalCongestionWindow) { >+ cwnd_controller_.UpdateOutstandingData(100000); >+ cwnd_controller_.SetDataWindow(DataSize::bytes(200000)); >+ >+ uint32_t bitrate_bps = 80000; >+ bitrate_bps = cwnd_controller_.UpdateTargetBitrate(bitrate_bps); >+ EXPECT_EQ(80000u, bitrate_bps); >+ >+ cwnd_controller_.UpdateMaxOutstandingData(20000); >+ bitrate_bps = 
cwnd_controller_.UpdateTargetBitrate(bitrate_bps); >+ EXPECT_EQ(80000u, bitrate_bps); >+} >+ >+TEST_F(CongestionWindowPushbackControllerTest, LowBitrate) { >+ cwnd_controller_.UpdateOutstandingData(100000); >+ cwnd_controller_.SetDataWindow(DataSize::bytes(50000)); >+ >+ uint32_t bitrate_bps = 35000; >+ bitrate_bps = cwnd_controller_.UpdateTargetBitrate(bitrate_bps); >+ EXPECT_EQ(static_cast<uint32_t>(35000 * 0.9), bitrate_bps); >+ >+ cwnd_controller_.UpdateMaxOutstandingData(20000); >+ bitrate_bps = cwnd_controller_.UpdateTargetBitrate(bitrate_bps); >+ EXPECT_EQ(30000u, bitrate_bps); >+} >+ >+} // namespace test >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/delay_based_bwe.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/delay_based_bwe.cc >index c21966cf7f5444de87126a2dc68eabb350a34f11..4894c9863b6836113c5e13bbac39d4892ce709e9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/delay_based_bwe.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/delay_based_bwe.cc >@@ -11,26 +11,25 @@ > #include "modules/congestion_controller/goog_cc/delay_based_bwe.h" > > #include <algorithm> >-#include <cmath> >+#include <cstdint> > #include <cstdio> > #include <string> > > #include "absl/memory/memory.h" >+#include "api/transport/network_types.h" // For PacedPacketInfo >+#include "logging/rtc_event_log/events/rtc_event.h" > #include "logging/rtc_event_log/events/rtc_event_bwe_update_delay_based.h" > #include "logging/rtc_event_log/rtc_event_log.h" > #include "modules/congestion_controller/goog_cc/trendline_estimator.h" >-#include "modules/pacing/paced_sender.h" >-#include "modules/remote_bitrate_estimator/include/remote_bitrate_estimator.h" > #include "modules/remote_bitrate_estimator/test/bwe_test_logging.h" > #include "rtc_base/checks.h" >-#include "rtc_base/constructormagic.h" > #include 
"rtc_base/logging.h" >-#include "rtc_base/thread_annotations.h" > #include "system_wrappers/include/field_trial.h" > #include "system_wrappers/include/metrics.h" > >+namespace webrtc { > namespace { >-static const int64_t kStreamTimeOutMs = 2000; >+constexpr TimeDelta kStreamTimeOut = TimeDelta::Seconds<2>(); > constexpr int kTimestampGroupLengthMs = 5; > constexpr int kAbsSendTimeFraction = 18; > constexpr int kAbsSendTimeInterArrivalUpshift = 8; >@@ -63,24 +62,22 @@ size_t ReadTrendlineFilterWindowSize() { > return window_size; > RTC_LOG(WARNING) << "Window size must be greater than 1."; > } >- RTC_LOG(LS_WARNING) << "Failed to parse parameters for BweTrendlineFilter " >- "experiment from field trial string. Using default."; >+ RTC_LOG(LS_WARNING) << "Failed to parse parameters for BweWindowSizeInPackets" >+ " experiment from field trial string. Using default."; > return kDefaultTrendlineWindowSize; > } > } // namespace > >-namespace webrtc { >- > DelayBasedBwe::Result::Result() > : updated(false), > probe(false), >- target_bitrate_bps(0), >+ target_bitrate(DataRate::Zero()), > recovered_from_overuse(false) {} > >-DelayBasedBwe::Result::Result(bool probe, uint32_t target_bitrate_bps) >+DelayBasedBwe::Result::Result(bool probe, DataRate target_bitrate) > : updated(true), > probe(probe), >- target_bitrate_bps(target_bitrate_bps), >+ target_bitrate(target_bitrate), > recovered_from_overuse(false) {} > > DelayBasedBwe::Result::~Result() {} >@@ -89,9 +86,8 @@ DelayBasedBwe::DelayBasedBwe(RtcEventLog* event_log) > : event_log_(event_log), > inter_arrival_(), > delay_detector_(), >- last_seen_packet_ms_(-1), >+ last_seen_packet_(Timestamp::MinusInfinity()), > uma_recorded_(false), >- probe_bitrate_estimator_(event_log), > trendline_window_size_( > webrtc::field_trial::IsEnabled(kBweWindowSizeInPacketsExperiment) > ? 
ReadTrendlineFilterWindowSize() >@@ -99,7 +95,7 @@ DelayBasedBwe::DelayBasedBwe(RtcEventLog* event_log) > trendline_smoothing_coeff_(kDefaultTrendlineSmoothingCoeff), > trendline_threshold_gain_(kDefaultTrendlineThresholdGain), > consecutive_delayed_feedbacks_(0), >- prev_bitrate_(0), >+ prev_bitrate_(DataRate::Zero()), > prev_state_(BandwidthUsage::kBwNormal) { > RTC_LOG(LS_INFO) > << "Using Trendline filter for delay change estimation with window size " >@@ -113,8 +109,9 @@ DelayBasedBwe::~DelayBasedBwe() {} > > DelayBasedBwe::Result DelayBasedBwe::IncomingPacketFeedbackVector( > const std::vector<PacketFeedback>& packet_feedback_vector, >- absl::optional<uint32_t> acked_bitrate_bps, >- int64_t at_time_ms) { >+ absl::optional<DataRate> acked_bitrate, >+ absl::optional<DataRate> probe_bitrate, >+ Timestamp at_time) { > RTC_DCHECK(std::is_sorted(packet_feedback_vector.begin(), > packet_feedback_vector.end(), > PacketFeedbackComparator())); >@@ -141,7 +138,7 @@ DelayBasedBwe::Result DelayBasedBwe::IncomingPacketFeedbackVector( > if (packet_feedback.send_time_ms < 0) > continue; > delayed_feedback = false; >- IncomingPacketFeedback(packet_feedback, at_time_ms); >+ IncomingPacketFeedback(packet_feedback, at_time); > if (prev_detector_state == BandwidthUsage::kBwUnderusing && > delay_detector_->State() == BandwidthUsage::kBwNormal) { > recovered_from_overuse = true; >@@ -150,43 +147,51 @@ DelayBasedBwe::Result DelayBasedBwe::IncomingPacketFeedbackVector( > } > > if (delayed_feedback) { >- ++consecutive_delayed_feedbacks_; >- if (consecutive_delayed_feedbacks_ >= kMaxConsecutiveFailedLookups) { >- consecutive_delayed_feedbacks_ = 0; >- return OnLongFeedbackDelay(packet_feedback_vector.back().arrival_time_ms); >- } >+ Timestamp arrival_time = Timestamp::PlusInfinity(); >+ if (packet_feedback_vector.back().arrival_time_ms > 0) >+ arrival_time = >+ Timestamp::ms(packet_feedback_vector.back().arrival_time_ms); >+ return OnDelayedFeedback(arrival_time); >+ > } else { > 
consecutive_delayed_feedbacks_ = 0; >- return MaybeUpdateEstimate(acked_bitrate_bps, recovered_from_overuse, >- at_time_ms); >+ return MaybeUpdateEstimate(acked_bitrate, probe_bitrate, >+ recovered_from_overuse, at_time); >+ } >+ return Result(); >+} >+ >+DelayBasedBwe::Result DelayBasedBwe::OnDelayedFeedback(Timestamp receive_time) { >+ ++consecutive_delayed_feedbacks_; >+ if (consecutive_delayed_feedbacks_ >= kMaxConsecutiveFailedLookups) { >+ consecutive_delayed_feedbacks_ = 0; >+ return OnLongFeedbackDelay(receive_time); > } > return Result(); > } > > DelayBasedBwe::Result DelayBasedBwe::OnLongFeedbackDelay( >- int64_t arrival_time_ms) { >+ Timestamp arrival_time) { > // Estimate should always be valid since a start bitrate always is set in the > // Call constructor. An alternative would be to return an empty Result here, > // or to estimate the throughput based on the feedback we received. > RTC_DCHECK(rate_control_.ValidEstimate()); >- rate_control_.SetEstimate(rate_control_.LatestEstimate() / 2, >- arrival_time_ms); >+ rate_control_.SetEstimate(rate_control_.LatestEstimate() / 2, arrival_time); > Result result; > result.updated = true; > result.probe = false; >- result.target_bitrate_bps = rate_control_.LatestEstimate(); >+ result.target_bitrate = rate_control_.LatestEstimate(); > RTC_LOG(LS_WARNING) << "Long feedback delay detected, reducing BWE to " >- << result.target_bitrate_bps; >+ << ToString(result.target_bitrate); > return result; > } > > void DelayBasedBwe::IncomingPacketFeedback( > const PacketFeedback& packet_feedback, >- int64_t at_time_ms) { >- int64_t now_ms = at_time_ms; >+ Timestamp at_time) { > // Reset if the stream has timed out. 
>- if (last_seen_packet_ms_ == -1 || >- now_ms - last_seen_packet_ms_ > kStreamTimeOutMs) { >+ if (last_seen_packet_.IsInfinite() || >+ at_time - last_seen_packet_ > kStreamTimeOut) { > inter_arrival_.reset( > new InterArrival((kTimestampGroupLengthMs << kInterArrivalShift) / 1000, > kTimestampToMs, true)); >@@ -194,7 +199,7 @@ void DelayBasedBwe::IncomingPacketFeedback( > trendline_smoothing_coeff_, > trendline_threshold_gain_)); > } >- last_seen_packet_ms_ = now_ms; >+ last_seen_packet_ = at_time; > > uint32_t send_time_24bits = > static_cast<uint32_t>( >@@ -211,115 +216,109 @@ void DelayBasedBwe::IncomingPacketFeedback( > int64_t t_delta = 0; > int size_delta = 0; > if (inter_arrival_->ComputeDeltas(timestamp, packet_feedback.arrival_time_ms, >- now_ms, packet_feedback.payload_size, >+ at_time.ms(), packet_feedback.payload_size, > &ts_delta, &t_delta, &size_delta)) { > double ts_delta_ms = (1000.0 * ts_delta) / (1 << kInterArrivalShift); > delay_detector_->Update(t_delta, ts_delta_ms, > packet_feedback.arrival_time_ms); > } >- if (packet_feedback.pacing_info.probe_cluster_id != >- PacedPacketInfo::kNotAProbe) { >- probe_bitrate_estimator_.HandleProbeAndEstimateBitrate(packet_feedback); >- } > } > > DelayBasedBwe::Result DelayBasedBwe::MaybeUpdateEstimate( >- absl::optional<uint32_t> acked_bitrate_bps, >+ absl::optional<DataRate> acked_bitrate, >+ absl::optional<DataRate> probe_bitrate, > bool recovered_from_overuse, >- int64_t at_time_ms) { >+ Timestamp at_time) { > Result result; >- int64_t now_ms = at_time_ms; > >- absl::optional<int> probe_bitrate_bps = >- probe_bitrate_estimator_.FetchAndResetLastEstimatedBitrateBps(); > // Currently overusing the bandwidth. 
> if (delay_detector_->State() == BandwidthUsage::kBwOverusing) { >- if (acked_bitrate_bps && >- rate_control_.TimeToReduceFurther(now_ms, *acked_bitrate_bps)) { >+ if (acked_bitrate && >+ rate_control_.TimeToReduceFurther(at_time, *acked_bitrate)) { > result.updated = >- UpdateEstimate(now_ms, acked_bitrate_bps, &result.target_bitrate_bps); >- } else if (!acked_bitrate_bps && rate_control_.ValidEstimate() && >- rate_control_.InitialTimeToReduceFurther(now_ms)) { >+ UpdateEstimate(at_time, acked_bitrate, &result.target_bitrate); >+ } else if (!acked_bitrate && rate_control_.ValidEstimate() && >+ rate_control_.InitialTimeToReduceFurther(at_time)) { > // Overusing before we have a measured acknowledged bitrate. Reduce send > // rate by 50% every 200 ms. > // TODO(tschumim): Improve this and/or the acknowledged bitrate estimator > // so that we (almost) always have a bitrate estimate. >- rate_control_.SetEstimate(rate_control_.LatestEstimate() / 2, now_ms); >+ rate_control_.SetEstimate(rate_control_.LatestEstimate() / 2, at_time); > result.updated = true; > result.probe = false; >- result.target_bitrate_bps = rate_control_.LatestEstimate(); >+ result.target_bitrate = rate_control_.LatestEstimate(); > } > } else { >- if (probe_bitrate_bps) { >+ if (probe_bitrate) { > result.probe = true; > result.updated = true; >- result.target_bitrate_bps = *probe_bitrate_bps; >- rate_control_.SetEstimate(*probe_bitrate_bps, now_ms); >+ result.target_bitrate = *probe_bitrate; >+ rate_control_.SetEstimate(*probe_bitrate, at_time); > } else { > result.updated = >- UpdateEstimate(now_ms, acked_bitrate_bps, &result.target_bitrate_bps); >+ UpdateEstimate(at_time, acked_bitrate, &result.target_bitrate); > result.recovered_from_overuse = recovered_from_overuse; > } > } > BandwidthUsage detector_state = delay_detector_->State(); >- if ((result.updated && prev_bitrate_ != result.target_bitrate_bps) || >+ if ((result.updated && prev_bitrate_ != result.target_bitrate) || > detector_state != 
prev_state_) { >- uint32_t bitrate_bps = >- result.updated ? result.target_bitrate_bps : prev_bitrate_; >+ DataRate bitrate = result.updated ? result.target_bitrate : prev_bitrate_; > >- BWE_TEST_LOGGING_PLOT(1, "target_bitrate_bps", now_ms, bitrate_bps); >+ BWE_TEST_LOGGING_PLOT(1, "target_bitrate_bps", at_time.ms(), bitrate.bps()); > > if (event_log_) { > event_log_->Log(absl::make_unique<RtcEventBweUpdateDelayBased>( >- bitrate_bps, detector_state)); >+ bitrate.bps(), detector_state)); > } > >- prev_bitrate_ = bitrate_bps; >+ prev_bitrate_ = bitrate; > prev_state_ = detector_state; > } > return result; > } > >-bool DelayBasedBwe::UpdateEstimate(int64_t now_ms, >- absl::optional<uint32_t> acked_bitrate_bps, >- uint32_t* target_bitrate_bps) { >- const RateControlInput input(delay_detector_->State(), acked_bitrate_bps); >- *target_bitrate_bps = rate_control_.Update(&input, now_ms); >+bool DelayBasedBwe::UpdateEstimate(Timestamp at_time, >+ absl::optional<DataRate> acked_bitrate, >+ DataRate* target_rate) { >+ const RateControlInput input(delay_detector_->State(), acked_bitrate); >+ *target_rate = rate_control_.Update(&input, at_time); > return rate_control_.ValidEstimate(); > } > >-void DelayBasedBwe::OnRttUpdate(int64_t avg_rtt_ms) { >- rate_control_.SetRtt(avg_rtt_ms); >+void DelayBasedBwe::OnRttUpdate(TimeDelta avg_rtt) { >+ rate_control_.SetRtt(avg_rtt); > } > > bool DelayBasedBwe::LatestEstimate(std::vector<uint32_t>* ssrcs, >- uint32_t* bitrate_bps) const { >+ DataRate* bitrate) const { > // Currently accessed from both the process thread (see > // ModuleRtpRtcpImpl::Process()) and the configuration thread (see > // Call::GetStats()). Should in the future only be accessed from a single > // thread. 
> RTC_DCHECK(ssrcs); >- RTC_DCHECK(bitrate_bps); >+ RTC_DCHECK(bitrate); > if (!rate_control_.ValidEstimate()) > return false; > > *ssrcs = {kFixedSsrc}; >- *bitrate_bps = rate_control_.LatestEstimate(); >+ *bitrate = rate_control_.LatestEstimate(); > return true; > } > >-void DelayBasedBwe::SetStartBitrate(int start_bitrate_bps) { >- RTC_LOG(LS_INFO) << "BWE Setting start bitrate to: " << start_bitrate_bps; >- rate_control_.SetStartBitrate(start_bitrate_bps); >+void DelayBasedBwe::SetStartBitrate(DataRate start_bitrate) { >+ RTC_LOG(LS_INFO) << "BWE Setting start bitrate to: " >+ << ToString(start_bitrate); >+ rate_control_.SetStartBitrate(start_bitrate); > } > >-void DelayBasedBwe::SetMinBitrate(int min_bitrate_bps) { >+void DelayBasedBwe::SetMinBitrate(DataRate min_bitrate) { > // Called from both the configuration thread and the network thread. Shouldn't > // be called from the network thread in the future. >- rate_control_.SetMinBitrate(min_bitrate_bps); >+ rate_control_.SetMinBitrate(min_bitrate); > } > >-int64_t DelayBasedBwe::GetExpectedBwePeriodMs() const { >- return rate_control_.GetExpectedBandwidthPeriodMs(); >+TimeDelta DelayBasedBwe::GetExpectedBwePeriod() const { >+ return rate_control_.GetExpectedBandwidthPeriod(); > } > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/delay_based_bwe.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/delay_based_bwe.h >index 42164baad4ac71c20e9768432c1adff6d44ddb47..616c5b0896bdf2fe03490b748ea0ce3419e8dac6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/delay_based_bwe.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/delay_based_bwe.h >@@ -11,16 +11,18 @@ > #ifndef MODULES_CONGESTION_CONTROLLER_GOOG_CC_DELAY_BASED_BWE_H_ > #define MODULES_CONGESTION_CONTROLLER_GOOG_CC_DELAY_BASED_BWE_H_ > >+#include <stddef.h> >+#include <stdint.h> 
> #include <memory> >-#include <utility> > #include <vector> > >+#include "absl/types/optional.h" > #include "modules/congestion_controller/goog_cc/delay_increase_detector_interface.h" > #include "modules/congestion_controller/goog_cc/probe_bitrate_estimator.h" > #include "modules/remote_bitrate_estimator/aimd_rate_control.h" >-#include "modules/remote_bitrate_estimator/include/remote_bitrate_estimator.h" >+#include "modules/remote_bitrate_estimator/include/bwe_defines.h" > #include "modules/remote_bitrate_estimator/inter_arrival.h" >-#include "rtc_base/checks.h" >+#include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" // For PacketFeedback > #include "rtc_base/constructormagic.h" > #include "rtc_base/race_checker.h" > >@@ -31,11 +33,11 @@ class DelayBasedBwe { > public: > struct Result { > Result(); >- Result(bool probe, uint32_t target_bitrate_bps); >+ Result(bool probe, DataRate target_bitrate); > ~Result(); > bool updated; > bool probe; >- uint32_t target_bitrate_bps; >+ DataRate target_bitrate = DataRate::Zero(); > bool recovered_from_overuse; > }; > >@@ -44,42 +46,43 @@ class DelayBasedBwe { > > Result IncomingPacketFeedbackVector( > const std::vector<PacketFeedback>& packet_feedback_vector, >- absl::optional<uint32_t> acked_bitrate_bps, >- int64_t at_time_ms); >- void OnRttUpdate(int64_t avg_rtt_ms); >- bool LatestEstimate(std::vector<uint32_t>* ssrcs, >- uint32_t* bitrate_bps) const; >- void SetStartBitrate(int start_bitrate_bps); >- void SetMinBitrate(int min_bitrate_bps); >- int64_t GetExpectedBwePeriodMs() const; >+ absl::optional<DataRate> acked_bitrate, >+ absl::optional<DataRate> probe_bitrate, >+ Timestamp at_time); >+ Result OnDelayedFeedback(Timestamp receive_time); >+ void OnRttUpdate(TimeDelta avg_rtt); >+ bool LatestEstimate(std::vector<uint32_t>* ssrcs, DataRate* bitrate) const; >+ void SetStartBitrate(DataRate start_bitrate); >+ void SetMinBitrate(DataRate min_bitrate); >+ TimeDelta GetExpectedBwePeriod() const; > > private: > friend class 
GoogCcStatePrinter; > void IncomingPacketFeedback(const PacketFeedback& packet_feedback, >- int64_t at_time_ms); >- Result OnLongFeedbackDelay(int64_t arrival_time_ms); >- Result MaybeUpdateEstimate(absl::optional<uint32_t> acked_bitrate_bps, >+ Timestamp at_time); >+ Result OnLongFeedbackDelay(Timestamp arrival_time); >+ Result MaybeUpdateEstimate(absl::optional<DataRate> acked_bitrate, >+ absl::optional<DataRate> probe_bitrate, > bool request_probe, >- int64_t at_time_ms); >+ Timestamp at_time); > // Updates the current remote rate estimate and returns true if a valid > // estimate exists. >- bool UpdateEstimate(int64_t now_ms, >- absl::optional<uint32_t> acked_bitrate_bps, >- uint32_t* target_bitrate_bps); >+ bool UpdateEstimate(Timestamp now, >+ absl::optional<DataRate> acked_bitrate, >+ DataRate* target_bitrate); > > rtc::RaceChecker network_race_; > RtcEventLog* const event_log_; > std::unique_ptr<InterArrival> inter_arrival_; > std::unique_ptr<DelayIncreaseDetectorInterface> delay_detector_; >- int64_t last_seen_packet_ms_; >+ Timestamp last_seen_packet_; > bool uma_recorded_; > AimdRateControl rate_control_; >- ProbeBitrateEstimator probe_bitrate_estimator_; > size_t trendline_window_size_; > double trendline_smoothing_coeff_; > double trendline_threshold_gain_; > int consecutive_delayed_feedbacks_; >- uint32_t prev_bitrate_; >+ DataRate prev_bitrate_; > BandwidthUsage prev_state_; > > RTC_DISALLOW_IMPLICIT_CONSTRUCTORS(DelayBasedBwe); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/delay_based_bwe_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/delay_based_bwe_unittest.cc >index b2add3e6e94a8f502780076bc377c92677f123c6..7de11b16840f00afad8e70afd59f732e4079b230 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/delay_based_bwe_unittest.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/delay_based_bwe_unittest.cc >@@ -25,13 +25,13 @@ constexpr int kNumProbesCluster1 = 8; > const PacedPacketInfo kPacingInfo0(0, kNumProbesCluster0, 2000); > const PacedPacketInfo kPacingInfo1(1, kNumProbesCluster1, 4000); > constexpr float kTargetUtilizationFraction = 0.95f; >-constexpr int64_t kDummyTimestamp = 1000; >+constexpr Timestamp kDummyTimestamp = Timestamp::Seconds<1000>(); > } // namespace > > TEST_F(DelayBasedBweTest, NoCrashEmptyFeedback) { > std::vector<PacketFeedback> packet_feedback_vector; > bitrate_estimator_->IncomingPacketFeedbackVector( >- packet_feedback_vector, absl::nullopt, kDummyTimestamp); >+ packet_feedback_vector, absl::nullopt, absl::nullopt, kDummyTimestamp); > } > > TEST_F(DelayBasedBweTest, NoCrashOnlyLostFeedback) { >@@ -43,7 +43,7 @@ TEST_F(DelayBasedBweTest, NoCrashOnlyLostFeedback) { > PacketFeedback::kNoSendTime, > 1, 1500, PacedPacketInfo())); > bitrate_estimator_->IncomingPacketFeedbackVector( >- packet_feedback_vector, absl::nullopt, kDummyTimestamp); >+ packet_feedback_vector, absl::nullopt, absl::nullopt, kDummyTimestamp); > } > > TEST_F(DelayBasedBweTest, ProbeDetection) { >@@ -144,12 +144,12 @@ TEST_F(DelayBasedBweTest, ProbeDetectionSlowerArrivalHighBitrate) { > } > > TEST_F(DelayBasedBweTest, GetExpectedBwePeriodMs) { >- int64_t default_interval_ms = bitrate_estimator_->GetExpectedBwePeriodMs(); >- EXPECT_GT(default_interval_ms, 0); >+ auto default_interval = bitrate_estimator_->GetExpectedBwePeriod(); >+ EXPECT_GT(default_interval.ms(), 0); > CapacityDropTestHelper(1, true, 333, 0); >- int64_t interval_ms = bitrate_estimator_->GetExpectedBwePeriodMs(); >- EXPECT_GT(interval_ms, 0); >- EXPECT_NE(interval_ms, default_interval_ms); >+ auto interval = bitrate_estimator_->GetExpectedBwePeriod(); >+ EXPECT_GT(interval.ms(), 0); >+ EXPECT_NE(interval.ms(), default_interval.ms()); > } > > TEST_F(DelayBasedBweTest, InitialBehavior) { >@@ -199,20 
+199,20 @@ TEST_F(DelayBasedBweTest, TestLongTimeoutAndWrap) { > } > > TEST_F(DelayBasedBweTest, TestInitialOveruse) { >- const uint32_t kStartBitrate = 300e3; >- const uint32_t kInitialCapacityBps = 200e3; >+ const DataRate kStartBitrate = DataRate::kbps(300); >+ const DataRate kInitialCapacity = DataRate::kbps(200); > const uint32_t kDummySsrc = 0; > // High FPS to ensure that we send a lot of packets in a short time. > const int kFps = 90; > >- stream_generator_->AddStream(new test::RtpStream(kFps, kStartBitrate)); >- stream_generator_->set_capacity_bps(kInitialCapacityBps); >+ stream_generator_->AddStream(new test::RtpStream(kFps, kStartBitrate.bps())); >+ stream_generator_->set_capacity_bps(kInitialCapacity.bps()); > > // Needed to initialize the AimdRateControl. > bitrate_estimator_->SetStartBitrate(kStartBitrate); > > // Produce 30 frames (in 1/3 second) and give them to the estimator. >- uint32_t bitrate_bps = kStartBitrate; >+ int64_t bitrate_bps = kStartBitrate.bps(); > bool seen_overuse = false; > for (int i = 0; i < 30; ++i) { > bool overuse = GenerateAndProcessFrame(kDummySsrc, bitrate_bps); >@@ -222,7 +222,8 @@ TEST_F(DelayBasedBweTest, TestInitialOveruse) { > EXPECT_FALSE(acknowledged_bitrate_estimator_->bitrate_bps().has_value()); > if (overuse) { > EXPECT_TRUE(bitrate_observer_.updated()); >- EXPECT_NEAR(bitrate_observer_.latest_bitrate(), kStartBitrate / 2, 15000); >+ EXPECT_NEAR(bitrate_observer_.latest_bitrate(), kStartBitrate.bps() / 2, >+ 15000); > bitrate_bps = bitrate_observer_.latest_bitrate(); > seen_overuse = true; > break; >@@ -232,7 +233,8 @@ TEST_F(DelayBasedBweTest, TestInitialOveruse) { > } > } > EXPECT_TRUE(seen_overuse); >- EXPECT_NEAR(bitrate_observer_.latest_bitrate(), kStartBitrate / 2, 15000); >+ EXPECT_NEAR(bitrate_observer_.latest_bitrate(), kStartBitrate.bps() / 2, >+ 15000); > } > > class DelayBasedBweTestWithBackoffTimeoutExperiment : public DelayBasedBweTest { >@@ -243,20 +245,20 @@ class 
DelayBasedBweTestWithBackoffTimeoutExperiment : public DelayBasedBweTest { > > // This test subsumes and improves DelayBasedBweTest.TestInitialOveruse above. > TEST_F(DelayBasedBweTestWithBackoffTimeoutExperiment, TestInitialOveruse) { >- const uint32_t kStartBitrate = 300e3; >- const uint32_t kInitialCapacityBps = 200e3; >+ const DataRate kStartBitrate = DataRate::kbps(300); >+ const DataRate kInitialCapacity = DataRate::kbps(200); > const uint32_t kDummySsrc = 0; > // High FPS to ensure that we send a lot of packets in a short time. > const int kFps = 90; > >- stream_generator_->AddStream(new test::RtpStream(kFps, kStartBitrate)); >- stream_generator_->set_capacity_bps(kInitialCapacityBps); >+ stream_generator_->AddStream(new test::RtpStream(kFps, kStartBitrate.bps())); >+ stream_generator_->set_capacity_bps(kInitialCapacity.bps()); > > // Needed to initialize the AimdRateControl. > bitrate_estimator_->SetStartBitrate(kStartBitrate); > > // Produce 30 frames (in 1/3 second) and give them to the estimator. 
>- uint32_t bitrate_bps = kStartBitrate; >+ int64_t bitrate_bps = kStartBitrate.bps(); > bool seen_overuse = false; > for (int frames = 0; frames < 30 && !seen_overuse; ++frames) { > bool overuse = GenerateAndProcessFrame(kDummySsrc, bitrate_bps); >@@ -266,7 +268,8 @@ TEST_F(DelayBasedBweTestWithBackoffTimeoutExperiment, TestInitialOveruse) { > EXPECT_FALSE(acknowledged_bitrate_estimator_->bitrate_bps().has_value()); > if (overuse) { > EXPECT_TRUE(bitrate_observer_.updated()); >- EXPECT_NEAR(bitrate_observer_.latest_bitrate(), kStartBitrate / 2, 15000); >+ EXPECT_NEAR(bitrate_observer_.latest_bitrate(), kStartBitrate.bps() / 2, >+ 15000); > bitrate_bps = bitrate_observer_.latest_bitrate(); > seen_overuse = true; > } else if (bitrate_observer_.updated()) { >@@ -282,8 +285,8 @@ TEST_F(DelayBasedBweTestWithBackoffTimeoutExperiment, TestInitialOveruse) { > EXPECT_FALSE(overuse); > if (bitrate_observer_.updated()) { > bitrate_bps = bitrate_observer_.latest_bitrate(); >- EXPECT_GE(bitrate_bps, kStartBitrate / 2 - 15000); >- EXPECT_LE(bitrate_bps, kInitialCapacityBps + 15000); >+ EXPECT_GE(bitrate_bps, kStartBitrate.bps() / 2 - 15000); >+ EXPECT_LE(bitrate_bps, kInitialCapacity.bps() + 15000); > bitrate_observer_.Reset(); > } > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/delay_based_bwe_unittest_helper.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/delay_based_bwe_unittest_helper.cc >index 2a00801eb1c6cd686432479a72987c4eba351812..47288e5fbc64fcad3a924a2276994cf604005032 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/delay_based_bwe_unittest_helper.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/delay_based_bwe_unittest_helper.cc >@@ -154,6 +154,7 @@ DelayBasedBweTest::DelayBasedBweTest() > clock_(100000000), > acknowledged_bitrate_estimator_( > absl::make_unique<AcknowledgedBitrateEstimator>()), 
>+ probe_bitrate_estimator_(new ProbeBitrateEstimator(nullptr)), > bitrate_estimator_(new DelayBasedBwe(nullptr)), > stream_generator_(new test::StreamGenerator(1e6, // Capacity. > clock_.TimeInMicroseconds())), >@@ -166,6 +167,7 @@ DelayBasedBweTest::DelayBasedBweTest(const std::string& field_trial_string) > clock_(100000000), > acknowledged_bitrate_estimator_( > absl::make_unique<AcknowledgedBitrateEstimator>()), >+ probe_bitrate_estimator_(new ProbeBitrateEstimator(nullptr)), > bitrate_estimator_(new DelayBasedBwe(nullptr)), > stream_generator_(new test::StreamGenerator(1e6, // Capacity. > clock_.TimeInMicroseconds())), >@@ -198,15 +200,20 @@ void DelayBasedBweTest::IncomingFeedback(int64_t arrival_time_ms, > sequence_number, payload_size, pacing_info); > std::vector<PacketFeedback> packets; > packets.push_back(packet); >+ if (packet.send_time_ms != PacketFeedback::kNoSendTime && >+ packet.pacing_info.probe_cluster_id != PacedPacketInfo::kNotAProbe) >+ probe_bitrate_estimator_->HandleProbeAndEstimateBitrate(packet); >+ > acknowledged_bitrate_estimator_->IncomingPacketFeedbackVector(packets); > DelayBasedBwe::Result result = > bitrate_estimator_->IncomingPacketFeedbackVector( >- packets, acknowledged_bitrate_estimator_->bitrate_bps(), >- clock_.TimeInMilliseconds()); >+ packets, acknowledged_bitrate_estimator_->bitrate(), >+ probe_bitrate_estimator_->FetchAndResetLastEstimatedBitrate(), >+ Timestamp::ms(clock_.TimeInMilliseconds())); > const uint32_t kDummySsrc = 0; > if (result.updated) { > bitrate_observer_.OnReceiveBitrateChanged({kDummySsrc}, >- result.target_bitrate_bps); >+ result.target_bitrate.bps()); > } > } > >@@ -232,18 +239,23 @@ bool DelayBasedBweTest::GenerateAndProcessFrame(uint32_t ssrc, > for (auto& packet : packets) { > RTC_CHECK_GE(packet.arrival_time_ms + arrival_time_offset_ms_, 0); > packet.arrival_time_ms += arrival_time_offset_ms_; >+ >+ if (packet.send_time_ms != PacketFeedback::kNoSendTime && >+ packet.pacing_info.probe_cluster_id != 
PacedPacketInfo::kNotAProbe) >+ probe_bitrate_estimator_->HandleProbeAndEstimateBitrate(packet); > } > > acknowledged_bitrate_estimator_->IncomingPacketFeedbackVector(packets); > DelayBasedBwe::Result result = > bitrate_estimator_->IncomingPacketFeedbackVector( >- packets, acknowledged_bitrate_estimator_->bitrate_bps(), >- clock_.TimeInMilliseconds()); >+ packets, acknowledged_bitrate_estimator_->bitrate(), >+ probe_bitrate_estimator_->FetchAndResetLastEstimatedBitrate(), >+ Timestamp::ms(clock_.TimeInMilliseconds())); > const uint32_t kDummySsrc = 0; > if (result.updated) { > bitrate_observer_.OnReceiveBitrateChanged({kDummySsrc}, >- result.target_bitrate_bps); >- if (!first_update_ && result.target_bitrate_bps < bitrate_bps) >+ result.target_bitrate.bps()); >+ if (!first_update_ && result.target_bitrate.bps() < bitrate_bps) > overuse = true; > first_update_ = false; > } >@@ -289,14 +301,14 @@ void DelayBasedBweTest::InitialBehaviorTestHelper( > const int kFramerate = 50; // 50 fps to avoid rounding errors. > const int kFrameIntervalMs = 1000 / kFramerate; > const PacedPacketInfo kPacingInfo(0, 5, 5000); >- uint32_t bitrate_bps = 0; >+ DataRate bitrate = DataRate::Zero(); > int64_t send_time_ms = 0; > uint16_t sequence_number = 0; > std::vector<uint32_t> ssrcs; >- EXPECT_FALSE(bitrate_estimator_->LatestEstimate(&ssrcs, &bitrate_bps)); >+ EXPECT_FALSE(bitrate_estimator_->LatestEstimate(&ssrcs, &bitrate)); > EXPECT_EQ(0u, ssrcs.size()); > clock_.AdvanceTimeMilliseconds(1000); >- EXPECT_FALSE(bitrate_estimator_->LatestEstimate(&ssrcs, &bitrate_bps)); >+ EXPECT_FALSE(bitrate_estimator_->LatestEstimate(&ssrcs, &bitrate)); > EXPECT_FALSE(bitrate_observer_.updated()); > bitrate_observer_.Reset(); > clock_.AdvanceTimeMilliseconds(1000); >@@ -308,7 +320,7 @@ void DelayBasedBweTest::InitialBehaviorTestHelper( > i < kInitialProbingPackets ? 
kPacingInfo : PacedPacketInfo(); > > if (i == kNumInitialPackets) { >- EXPECT_FALSE(bitrate_estimator_->LatestEstimate(&ssrcs, &bitrate_bps)); >+ EXPECT_FALSE(bitrate_estimator_->LatestEstimate(&ssrcs, &bitrate)); > EXPECT_EQ(0u, ssrcs.size()); > EXPECT_FALSE(bitrate_observer_.updated()); > bitrate_observer_.Reset(); >@@ -318,13 +330,14 @@ void DelayBasedBweTest::InitialBehaviorTestHelper( > clock_.AdvanceTimeMilliseconds(1000 / kFramerate); > send_time_ms += kFrameIntervalMs; > } >- EXPECT_TRUE(bitrate_estimator_->LatestEstimate(&ssrcs, &bitrate_bps)); >+ EXPECT_TRUE(bitrate_estimator_->LatestEstimate(&ssrcs, &bitrate)); > ASSERT_EQ(1u, ssrcs.size()); > EXPECT_EQ(kDefaultSsrc, ssrcs.front()); >- EXPECT_NEAR(expected_converge_bitrate, bitrate_bps, kAcceptedBitrateErrorBps); >+ EXPECT_NEAR(expected_converge_bitrate, bitrate.bps(), >+ kAcceptedBitrateErrorBps); > EXPECT_TRUE(bitrate_observer_.updated()); > bitrate_observer_.Reset(); >- EXPECT_EQ(bitrate_observer_.latest_bitrate(), bitrate_bps); >+ EXPECT_EQ(bitrate_observer_.latest_bitrate(), bitrate.bps()); > } > > void DelayBasedBweTest::RateIncreaseReorderingTestHelper( >@@ -509,7 +522,7 @@ void DelayBasedBweTest::TestWrappingHelper(int silence_time_s) { > clock_.AdvanceTimeMilliseconds(kFrameIntervalMs); > send_time_ms += kFrameIntervalMs; > } >- uint32_t bitrate_before = 0; >+ DataRate bitrate_before = DataRate::Zero(); > std::vector<uint32_t> ssrcs; > bitrate_estimator_->LatestEstimate(&ssrcs, &bitrate_before); > >@@ -522,7 +535,7 @@ void DelayBasedBweTest::TestWrappingHelper(int silence_time_s) { > clock_.AdvanceTimeMilliseconds(2 * kFrameIntervalMs); > send_time_ms += kFrameIntervalMs; > } >- uint32_t bitrate_after = 0; >+ DataRate bitrate_after = DataRate::Zero(); > bitrate_estimator_->LatestEstimate(&ssrcs, &bitrate_after); > EXPECT_LT(bitrate_after, bitrate_before); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/delay_based_bwe_unittest_helper.h 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/delay_based_bwe_unittest_helper.h >index 7ba18c367527bb726354e4c7412443e4ed4066b1..444e0cd67df4fb4685e339fc5328a51f28585190 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/delay_based_bwe_unittest_helper.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/delay_based_bwe_unittest_helper.h >@@ -172,6 +172,7 @@ class DelayBasedBweTest : public ::testing::Test { > SimulatedClock clock_; // Time at the receiver. > test::TestBitrateObserver bitrate_observer_; > std::unique_ptr<AcknowledgedBitrateEstimator> acknowledged_bitrate_estimator_; >+ const std::unique_ptr<ProbeBitrateEstimator> probe_bitrate_estimator_; > std::unique_ptr<DelayBasedBwe> bitrate_estimator_; > std::unique_ptr<test::StreamGenerator> stream_generator_; > int64_t arrival_time_offset_ms_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/goog_cc_factory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/goog_cc_factory.cc >deleted file mode 100644 >index 2e0ccc17ca4b3a3b8186bfe63a16e96ad519ad16..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/goog_cc_factory.cc >+++ /dev/null >@@ -1,43 +0,0 @@ >-/* >- * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include "modules/congestion_controller/goog_cc/include/goog_cc_factory.h" >- >-#include "absl/memory/memory.h" >-#include "modules/congestion_controller/goog_cc/goog_cc_network_control.h" >-namespace webrtc { >-GoogCcNetworkControllerFactory::GoogCcNetworkControllerFactory( >- RtcEventLog* event_log) >- : event_log_(event_log) {} >- >-std::unique_ptr<NetworkControllerInterface> >-GoogCcNetworkControllerFactory::Create(NetworkControllerConfig config) { >- return absl::make_unique<GoogCcNetworkController>(event_log_, config, false); >-} >- >-TimeDelta GoogCcNetworkControllerFactory::GetProcessInterval() const { >- const int64_t kUpdateIntervalMs = 25; >- return TimeDelta::ms(kUpdateIntervalMs); >-} >- >-GoogCcFeedbackNetworkControllerFactory::GoogCcFeedbackNetworkControllerFactory( >- RtcEventLog* event_log) >- : event_log_(event_log) {} >- >-std::unique_ptr<NetworkControllerInterface> >-GoogCcFeedbackNetworkControllerFactory::Create(NetworkControllerConfig config) { >- return absl::make_unique<GoogCcNetworkController>(event_log_, config, true); >-} >- >-TimeDelta GoogCcFeedbackNetworkControllerFactory::GetProcessInterval() const { >- const int64_t kUpdateIntervalMs = 25; >- return TimeDelta::ms(kUpdateIntervalMs); >-} >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/goog_cc_network_control.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/goog_cc_network_control.cc >index 4ab9694f37bc037e4c95e26ac439b5f2c15a1f6d..4376953b7cc1cf4935c22ab2a87de90efc0e9e36 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/goog_cc_network_control.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/goog_cc_network_control.cc >@@ -22,7 +22,6 @@ > #include "absl/memory/memory.h" > #include "modules/congestion_controller/goog_cc/acknowledged_bitrate_estimator.h" > #include 
"modules/congestion_controller/goog_cc/alr_detector.h" >-#include "modules/congestion_controller/goog_cc/include/goog_cc_factory.h" > #include "modules/congestion_controller/goog_cc/probe_controller.h" > #include "modules/remote_bitrate_estimator/include/bwe_defines.h" > #include "modules/remote_bitrate_estimator/test/bwe_test_logging.h" >@@ -36,6 +35,11 @@ namespace webrtc { > namespace { > > const char kCwndExperiment[] = "WebRTC-CwndExperiment"; >+// When CongestionWindowPushback is enabled, the pacer is oblivious to >+// the congestion window. The relation between outstanding data and >+// the congestion window affects encoder allocations directly. >+const char kCongestionPushbackExperiment[] = "WebRTC-CongestionWindowPushback"; >+ > const int64_t kDefaultAcceptedQueueMs = 250; > > // From RTCPSender video report interval. >@@ -48,6 +52,18 @@ constexpr TimeDelta kLossUpdateInterval = TimeDelta::Millis<1000>(); > // overshoots from the encoder. > const float kDefaultPaceMultiplier = 2.5f; > >+bool IsCongestionWindowPushbackExperimentEnabled() { >+ return webrtc::field_trial::IsEnabled(kCongestionPushbackExperiment) && >+ webrtc::field_trial::IsEnabled(kCwndExperiment); >+} >+ >+std::unique_ptr<CongestionWindowPushbackController> >+MaybeInitalizeCongestionWindowPushbackController() { >+ return IsCongestionWindowPushbackExperimentEnabled() >+ ? 
absl::make_unique<CongestionWindowPushbackController>() >+ : nullptr; >+} >+ > bool CwndExperimentEnabled() { > std::string experiment_string = > webrtc::field_trial::FindFullName(kCwndExperiment); >@@ -90,13 +106,10 @@ std::vector<PacketFeedback> ReceivedPacketsFeedbackAsRtp( > if (fb.receive_time.IsFinite()) { > PacketFeedback pf(fb.receive_time.ms(), 0); > pf.creation_time_ms = report.feedback_time.ms(); >- if (fb.sent_packet.has_value()) { >- pf.payload_size = fb.sent_packet->size.bytes(); >- pf.pacing_info = fb.sent_packet->pacing_info; >- pf.send_time_ms = fb.sent_packet->send_time.ms(); >- } else { >- pf.send_time_ms = PacketFeedback::kNoSendTime; >- } >+ pf.payload_size = fb.sent_packet.size.bytes(); >+ pf.pacing_info = fb.sent_packet.pacing_info; >+ pf.send_time_ms = fb.sent_packet.send_time.ms(); >+ pf.unacknowledged_data = fb.sent_packet.prior_unacked_data.bytes(); > packet_feedback_vector.push_back(pf); > } > } >@@ -114,21 +127,27 @@ int64_t GetBpsOrDefault(const absl::optional<DataRate>& rate, > > } // namespace > >- > GoogCcNetworkController::GoogCcNetworkController(RtcEventLog* event_log, > NetworkControllerConfig config, > bool feedback_only) > : event_log_(event_log), > packet_feedback_only_(feedback_only), >+ safe_reset_on_route_change_("Enabled"), >+ safe_reset_acknowledged_rate_("ack"), >+ use_stable_bandwidth_estimate_( >+ field_trial::IsEnabled("WebRTC-Bwe-StableBandwidthEstimate")), > probe_controller_(new ProbeController()), >+ congestion_window_pushback_controller_( >+ MaybeInitalizeCongestionWindowPushbackController()), > bandwidth_estimation_( > absl::make_unique<SendSideBandwidthEstimation>(event_log_)), > alr_detector_(absl::make_unique<AlrDetector>()), >+ probe_bitrate_estimator_(new ProbeBitrateEstimator(event_log)), > delay_based_bwe_(new DelayBasedBwe(event_log_)), > acknowledged_bitrate_estimator_( > absl::make_unique<AcknowledgedBitrateEstimator>()), > initial_config_(config), >- last_bandwidth_(*config.constraints.starting_rate), 
>+ last_target_rate_(*config.constraints.starting_rate), > pacing_factor_(config.stream_based_config.pacing_factor.value_or( > kDefaultPaceMultiplier)), > min_pacing_rate_(config.stream_based_config.min_pacing_rate.value_or( >@@ -138,7 +157,12 @@ GoogCcNetworkController::GoogCcNetworkController(RtcEventLog* event_log, > max_total_allocated_bitrate_(DataRate::Zero()), > in_cwnd_experiment_(CwndExperimentEnabled()), > accepted_queue_ms_(kDefaultAcceptedQueueMs) { >- delay_based_bwe_->SetMinBitrate(congestion_controller::GetMinBitrateBps()); >+ RTC_DCHECK(config.constraints.at_time.IsFinite()); >+ ParseFieldTrial( >+ {&safe_reset_on_route_change_, &safe_reset_acknowledged_rate_}, >+ field_trial::FindFullName("WebRTC-Bwe-SafeResetOnRouteChange")); >+ >+ delay_based_bwe_->SetMinBitrate(congestion_controller::GetMinBitrate()); > if (in_cwnd_experiment_ && > !ReadCwndExperimentParameter(&accepted_queue_ms_)) { > RTC_LOG(LS_WARNING) << "Failed to parse parameters for CwndExperiment " >@@ -165,16 +189,39 @@ NetworkControlUpdate GoogCcNetworkController::OnNetworkRouteChange( > > ClampBitrates(&start_bitrate_bps, &min_bitrate_bps, &max_bitrate_bps); > >- bandwidth_estimation_ = >- absl::make_unique<SendSideBandwidthEstimation>(event_log_); >+ if (safe_reset_on_route_change_) { >+ absl::optional<uint32_t> estimated_bitrate_bps; >+ if (safe_reset_acknowledged_rate_) { >+ estimated_bitrate_bps = acknowledged_bitrate_estimator_->bitrate_bps(); >+ if (!estimated_bitrate_bps) >+ estimated_bitrate_bps = acknowledged_bitrate_estimator_->PeekBps(); >+ } else { >+ int32_t target_bitrate_bps; >+ uint8_t fraction_loss; >+ int64_t rtt_ms; >+ bandwidth_estimation_->CurrentEstimate(&target_bitrate_bps, >+ &fraction_loss, &rtt_ms); >+ estimated_bitrate_bps = target_bitrate_bps; >+ } >+ if (estimated_bitrate_bps && (!msg.constraints.starting_rate || >+ estimated_bitrate_bps < start_bitrate_bps)) { >+ start_bitrate_bps = *estimated_bitrate_bps; >+ msg.constraints.starting_rate = 
DataRate::bps(start_bitrate_bps); >+ } >+ } >+ >+ acknowledged_bitrate_estimator_.reset(new AcknowledgedBitrateEstimator()); >+ probe_bitrate_estimator_.reset(new ProbeBitrateEstimator(event_log_)); >+ delay_based_bwe_.reset(new DelayBasedBwe(event_log_)); >+ if (msg.constraints.starting_rate) >+ delay_based_bwe_->SetStartBitrate(*msg.constraints.starting_rate); >+ // TODO(srte): Use original values instead of converted. >+ delay_based_bwe_->SetMinBitrate(DataRate::bps(min_bitrate_bps)); >+ bandwidth_estimation_->OnRouteChange(); > bandwidth_estimation_->SetBitrates( > msg.constraints.starting_rate, DataRate::bps(min_bitrate_bps), > msg.constraints.max_data_rate.value_or(DataRate::Infinity()), > msg.at_time); >- delay_based_bwe_.reset(new DelayBasedBwe(event_log_)); >- acknowledged_bitrate_estimator_.reset(new AcknowledgedBitrateEstimator()); >- delay_based_bwe_->SetStartBitrate(start_bitrate_bps); >- delay_based_bwe_->SetMinBitrate(min_bitrate_bps); > > probe_controller_->Reset(msg.at_time.ms()); > NetworkControlUpdate update; >@@ -239,7 +286,7 @@ NetworkControlUpdate GoogCcNetworkController::OnRoundTripTimeUpdate( > if (packet_feedback_only_) > return NetworkControlUpdate(); > if (msg.smoothed) { >- delay_based_bwe_->OnRttUpdate(msg.round_trip_time.ms()); >+ delay_based_bwe_->OnRttUpdate(msg.round_trip_time); > } else { > bandwidth_estimation_->UpdateRtt(msg.round_trip_time, msg.receive_time); > } >@@ -250,7 +297,22 @@ NetworkControlUpdate GoogCcNetworkController::OnSentPacket( > SentPacket sent_packet) { > alr_detector_->OnBytesSent(sent_packet.size.bytes(), > sent_packet.send_time.ms()); >- return NetworkControlUpdate(); >+ if (!first_packet_sent_) { >+ first_packet_sent_ = true; >+ // Initialize feedback time to send time to allow estimation of RTT until >+ // first feedback is received. 
>+ bandwidth_estimation_->UpdatePropagationRtt(sent_packet.send_time, >+ TimeDelta::Zero()); >+ } >+ if (congestion_window_pushback_controller_) { >+ congestion_window_pushback_controller_->UpdateOutstandingData( >+ sent_packet.data_in_flight.bytes()); >+ NetworkControlUpdate update; >+ MaybeTriggerOnNetworkChanged(&update, sent_packet.send_time); >+ return update; >+ } else { >+ return NetworkControlUpdate(); >+ } > } > > NetworkControlUpdate GoogCcNetworkController::OnStreamsConfig( >@@ -312,9 +374,9 @@ GoogCcNetworkController::UpdateBitrateConstraints( > starting_rate, DataRate::bps(min_bitrate_bps), > constraints.max_data_rate.value_or(DataRate::Infinity()), > constraints.at_time); >- if (start_bitrate_bps > 0) >- delay_based_bwe_->SetStartBitrate(start_bitrate_bps); >- delay_based_bwe_->SetMinBitrate(min_bitrate_bps); >+ if (starting_rate) >+ delay_based_bwe_->SetStartBitrate(*starting_rate); >+ delay_based_bwe_->SetMinBitrate(DataRate::bps(min_bitrate_bps)); > return probes; > } > >@@ -331,38 +393,61 @@ NetworkControlUpdate GoogCcNetworkController::OnTransportLossReport( > > NetworkControlUpdate GoogCcNetworkController::OnTransportPacketsFeedback( > TransportPacketsFeedback report) { >- TimeDelta feedback_max_rtt = TimeDelta::MinusInfinity(); >+ if (report.packet_feedbacks.empty()) { >+ DelayBasedBwe::Result result = delay_based_bwe_->OnDelayedFeedback( >+ report.sendless_arrival_times.back()); >+ NetworkControlUpdate update; >+ if (result.updated) { >+ bandwidth_estimation_->UpdateDelayBasedEstimate(report.feedback_time, >+ result.target_bitrate); >+ MaybeTriggerOnNetworkChanged(&update, report.feedback_time); >+ } >+ return update; >+ } >+ >+ if (congestion_window_pushback_controller_) { >+ congestion_window_pushback_controller_->UpdateOutstandingData( >+ report.data_in_flight.bytes()); >+ } >+ TimeDelta max_feedback_rtt = TimeDelta::MinusInfinity(); >+ TimeDelta min_propagation_rtt = TimeDelta::PlusInfinity(); > Timestamp max_recv_time = 
Timestamp::MinusInfinity(); >- for (const auto& packet_feedback : report.ReceivedWithSendInfo()) { >- TimeDelta rtt = >- report.feedback_time - packet_feedback.sent_packet->send_time; >- // max() is used to account for feedback being delayed by the >- // receiver. >- feedback_max_rtt = std::max(feedback_max_rtt, rtt); >- max_recv_time = std::max(max_recv_time, packet_feedback.receive_time); >+ >+ std::vector<PacketResult> feedbacks = report.ReceivedWithSendInfo(); >+ for (const auto& feedback : feedbacks) >+ max_recv_time = std::max(max_recv_time, feedback.receive_time); >+ >+ for (const auto& feedback : feedbacks) { >+ TimeDelta feedback_rtt = >+ report.feedback_time - feedback.sent_packet.send_time; >+ TimeDelta min_pending_time = feedback.receive_time - max_recv_time; >+ TimeDelta propagation_rtt = feedback_rtt - min_pending_time; >+ max_feedback_rtt = std::max(max_feedback_rtt, feedback_rtt); >+ min_propagation_rtt = std::min(min_propagation_rtt, propagation_rtt); > } >- absl::optional<int64_t> min_feedback_max_rtt_ms; >- if (feedback_max_rtt.IsFinite()) { >- feedback_max_rtts_.push_back(feedback_max_rtt.ms()); >+ >+ if (max_feedback_rtt.IsFinite()) { >+ feedback_max_rtts_.push_back(max_feedback_rtt.ms()); > const size_t kMaxFeedbackRttWindow = 32; > if (feedback_max_rtts_.size() > kMaxFeedbackRttWindow) > feedback_max_rtts_.pop_front(); >- min_feedback_max_rtt_ms.emplace(*std::min_element( >- feedback_max_rtts_.begin(), feedback_max_rtts_.end())); >+ // TODO(srte): Use time since last unacknowledged packet. 
>+ bandwidth_estimation_->UpdatePropagationRtt(report.feedback_time, >+ min_propagation_rtt); > } > if (packet_feedback_only_) { > if (!feedback_max_rtts_.empty()) { > int64_t sum_rtt_ms = std::accumulate(feedback_max_rtts_.begin(), > feedback_max_rtts_.end(), 0); > int64_t mean_rtt_ms = sum_rtt_ms / feedback_max_rtts_.size(); >- delay_based_bwe_->OnRttUpdate(mean_rtt_ms); >+ delay_based_bwe_->OnRttUpdate(TimeDelta::ms(mean_rtt_ms)); > } > > TimeDelta feedback_min_rtt = TimeDelta::PlusInfinity(); >- for (const auto& packet_feedback : report.ReceivedWithSendInfo()) { >+ for (const auto& packet_feedback : feedbacks) { > TimeDelta pending_time = packet_feedback.receive_time - max_recv_time; > TimeDelta rtt = report.feedback_time - >- packet_feedback.sent_packet->send_time - pending_time; >+ packet_feedback.sent_packet.send_time - pending_time; > // Value used for predicting NACK round trip time in FEC controller. > feedback_min_rtt = std::min(rtt, feedback_min_rtt); > } >@@ -400,22 +485,33 @@ NetworkControlUpdate GoogCcNetworkController::OnTransportPacketsFeedback( > previously_in_alr = alr_start_time.has_value(); > acknowledged_bitrate_estimator_->IncomingPacketFeedbackVector( > received_feedback_vector); >- auto acknowledged_bitrate = acknowledged_bitrate_estimator_->bitrate_bps(); >+ auto acknowledged_bitrate = acknowledged_bitrate_estimator_->bitrate(); >+ bandwidth_estimation_->SetAcknowledgedRate(acknowledged_bitrate, >+ report.feedback_time); >+ bandwidth_estimation_->IncomingPacketFeedbackVector(report); >+ for (const auto& feedback : received_feedback_vector) { >+ if (feedback.pacing_info.probe_cluster_id != PacedPacketInfo::kNotAProbe) { >+ probe_bitrate_estimator_->HandleProbeAndEstimateBitrate(feedback); >+ } >+ } >+ absl::optional<DataRate> probe_bitrate = >+ probe_bitrate_estimator_->FetchAndResetLastEstimatedBitrate(); >+ > DelayBasedBwe::Result result; > result = delay_based_bwe_->IncomingPacketFeedbackVector( >- received_feedback_vector, 
acknowledged_bitrate, >- report.feedback_time.ms()); >+ received_feedback_vector, acknowledged_bitrate, probe_bitrate, >+ report.feedback_time); > > NetworkControlUpdate update; > if (result.updated) { > if (result.probe) { >- bandwidth_estimation_->SetSendBitrate( >- DataRate::bps(result.target_bitrate_bps), report.feedback_time); >+ bandwidth_estimation_->SetSendBitrate(result.target_bitrate, >+ report.feedback_time); > } > // Since SetSendBitrate now resets the delay-based estimate, we have to call > // UpdateDelayBasedEstimate after SetSendBitrate. >- bandwidth_estimation_->UpdateDelayBasedEstimate( >- report.feedback_time, DataRate::bps(result.target_bitrate_bps)); >+ bandwidth_estimation_->UpdateDelayBasedEstimate(report.feedback_time, >+ result.target_bitrate); > // Update the estimate in the ProbeController, in case we want to probe. > MaybeTriggerOnNetworkChanged(&update, report.feedback_time); > } >@@ -428,11 +524,14 @@ NetworkControlUpdate GoogCcNetworkController::OnTransportPacketsFeedback( > > // No valid RTT could be because send-side BWE isn't used, in which case > // we don't try to limit the outstanding packets. 
>- if (in_cwnd_experiment_ && min_feedback_max_rtt_ms) { >+ if (in_cwnd_experiment_ && max_feedback_rtt.IsFinite()) { >+ int64_t min_feedback_max_rtt_ms = >+ *std::min_element(feedback_max_rtts_.begin(), feedback_max_rtts_.end()); >+ > const DataSize kMinCwnd = DataSize::bytes(2 * 1500); > TimeDelta time_window = >- TimeDelta::ms(*min_feedback_max_rtt_ms + accepted_queue_ms_); >- DataSize data_window = last_bandwidth_ * time_window; >+ TimeDelta::ms(min_feedback_max_rtt_ms + accepted_queue_ms_); >+ DataSize data_window = last_target_rate_ * time_window; > if (current_data_window_) { > data_window = > std::max(kMinCwnd, (data_window + current_data_window_.value()) / 2); >@@ -440,10 +539,13 @@ NetworkControlUpdate GoogCcNetworkController::OnTransportPacketsFeedback( > data_window = std::max(kMinCwnd, data_window); > } > current_data_window_ = data_window; >- RTC_LOG(LS_INFO) << "Feedback rtt: " << *min_feedback_max_rtt_ms >- << " Bitrate: " << last_bandwidth_.bps(); > } >- update.congestion_window = current_data_window_; >+ if (congestion_window_pushback_controller_ && current_data_window_) { >+ congestion_window_pushback_controller_->SetDataWindow( >+ *current_data_window_); >+ } else { >+ update.congestion_window = current_data_window_; >+ } > > return update; > } >@@ -460,7 +562,7 @@ NetworkControlUpdate GoogCcNetworkController::GetNetworkState( > last_estimated_fraction_loss_ / 255.0; > update.target_rate->network_estimate.round_trip_time = rtt; > update.target_rate->network_estimate.bwe_period = >- TimeDelta::ms(delay_based_bwe_->GetExpectedBwePeriodMs()); >+ delay_based_bwe_->GetExpectedBwePeriod(); > update.target_rate->at_time = at_time; > update.target_rate->target_rate = bandwidth; > update.pacer_config = GetPacingRates(at_time); >@@ -496,27 +598,38 @@ void GoogCcNetworkController::MaybeTriggerOnNetworkChanged( > > alr_detector_->SetEstimatedBitrate(estimated_bitrate_bps); > >- DataRate bandwidth = DataRate::bps(estimated_bitrate_bps); >- last_bandwidth_ = 
bandwidth; >+ last_target_rate_ = DataRate::bps(estimated_bitrate_bps); >+ DataRate bandwidth = use_stable_bandwidth_estimate_ >+ ? bandwidth_estimation_->GetEstimatedLinkCapacity() >+ : last_target_rate_; > >- TimeDelta bwe_period = >- TimeDelta::ms(delay_based_bwe_->GetExpectedBwePeriodMs()); >+ TimeDelta bwe_period = delay_based_bwe_->GetExpectedBwePeriod(); > >- TargetTransferRate target_rate; >- target_rate.at_time = at_time; > // Set the target rate to the full estimated bandwidth since the estimation > // for legacy reasons includes target rate constraints. >- target_rate.target_rate = bandwidth; >+ DataRate target_rate = last_target_rate_; >+ if (congestion_window_pushback_controller_) { >+ int64_t pushback_rate = >+ congestion_window_pushback_controller_->UpdateTargetBitrate( >+ target_rate.bps()); >+ pushback_rate = std::max<int64_t>(bandwidth_estimation_->GetMinBitrate(), >+ pushback_rate); >+ target_rate = DataRate::bps(pushback_rate); >+ } >+ >+ TargetTransferRate target_rate_msg; >+ target_rate_msg.at_time = at_time; >+ target_rate_msg.target_rate = target_rate; >+ target_rate_msg.network_estimate.at_time = at_time; >+ target_rate_msg.network_estimate.round_trip_time = TimeDelta::ms(rtt_ms); >+ target_rate_msg.network_estimate.bandwidth = bandwidth; >+ target_rate_msg.network_estimate.loss_rate_ratio = fraction_loss / 255.0f; >+ target_rate_msg.network_estimate.bwe_period = bwe_period; > >- target_rate.network_estimate.at_time = at_time; >- target_rate.network_estimate.round_trip_time = TimeDelta::ms(rtt_ms); >- target_rate.network_estimate.bandwidth = bandwidth; >- target_rate.network_estimate.loss_rate_ratio = fraction_loss / 255.0f; >- target_rate.network_estimate.bwe_period = bwe_period; >- update->target_rate = target_rate; >+ update->target_rate = target_rate_msg; > >- auto probes = >- probe_controller_->SetEstimatedBitrate(bandwidth.bps(), at_time.ms()); >+ auto probes = probe_controller_->SetEstimatedBitrate( >+ last_target_rate_.bps(), 
at_time.ms()); > update->probe_cluster_configs.insert(update->probe_cluster_configs.end(), > probes.begin(), probes.end()); > update->pacer_config = GetPacingRates(at_time); >@@ -525,8 +638,8 @@ void GoogCcNetworkController::MaybeTriggerOnNetworkChanged( > > PacerConfig GoogCcNetworkController::GetPacingRates(Timestamp at_time) const { > DataRate pacing_rate = >- std::max(min_pacing_rate_, last_bandwidth_) * pacing_factor_; >- DataRate padding_rate = std::min(max_padding_rate_, last_bandwidth_); >+ std::max(min_pacing_rate_, last_target_rate_) * pacing_factor_; >+ DataRate padding_rate = std::min(max_padding_rate_, last_target_rate_); > PacerConfig msg; > msg.at_time = at_time; > msg.time_window = TimeDelta::seconds(1); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/goog_cc_network_control.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/goog_cc_network_control.h >index f7efda396277e4e633c3da1f80509ecac9be9f47..a62f59bb44bc0e0e5c3b8a12ab244e05098b49a7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/goog_cc_network_control.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/goog_cc_network_control.h >@@ -22,6 +22,7 @@ > #include "modules/bitrate_controller/send_side_bandwidth_estimation.h" > #include "modules/congestion_controller/goog_cc/acknowledged_bitrate_estimator.h" > #include "modules/congestion_controller/goog_cc/alr_detector.h" >+#include "modules/congestion_controller/goog_cc/congestion_window_pushback_controller.h" > #include "modules/congestion_controller/goog_cc/delay_based_bwe.h" > #include "modules/congestion_controller/goog_cc/probe_controller.h" > #include "rtc_base/constructormagic.h" >@@ -63,24 +64,31 @@ class GoogCcNetworkController : public NetworkControllerInterface { > > RtcEventLog* const event_log_; > const bool packet_feedback_only_; >+ FieldTrialFlag 
safe_reset_on_route_change_; >+ FieldTrialFlag safe_reset_acknowledged_rate_; >+ const bool use_stable_bandwidth_estimate_; > > const std::unique_ptr<ProbeController> probe_controller_; >+ const std::unique_ptr<CongestionWindowPushbackController> >+ congestion_window_pushback_controller_; > > std::unique_ptr<SendSideBandwidthEstimation> bandwidth_estimation_; > std::unique_ptr<AlrDetector> alr_detector_; >+ std::unique_ptr<ProbeBitrateEstimator> probe_bitrate_estimator_; > std::unique_ptr<DelayBasedBwe> delay_based_bwe_; > std::unique_ptr<AcknowledgedBitrateEstimator> acknowledged_bitrate_estimator_; > > absl::optional<NetworkControllerConfig> initial_config_; > >+ bool first_packet_sent_ = false; >+ > Timestamp next_loss_update_ = Timestamp::MinusInfinity(); > int lost_packets_since_last_loss_update_ = 0; > int expected_packets_since_last_loss_update_ = 0; > > std::deque<int64_t> feedback_max_rtts_; > >- DataRate last_bandwidth_; >- absl::optional<TargetTransferRate> last_target_rate_; >+ DataRate last_target_rate_; > > int32_t last_estimated_bitrate_bps_ = 0; > uint8_t last_estimated_fraction_loss_ = 0; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/goog_cc_network_control_slowtest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/goog_cc_network_control_slowtest.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..cfa0d1c096dda3f6a35a6f8c4aa60b3ed5bcece7 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/goog_cc_network_control_slowtest.cc >@@ -0,0 +1,129 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. 
All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "api/transport/goog_cc_factory.h" >+#include "test/field_trial.h" >+#include "test/gtest.h" >+#include "test/scenario/scenario.h" >+ >+namespace webrtc { >+namespace test { >+ >+TEST(GoogCcNetworkControllerTest, MaintainsLowRateInSafeResetTrial) { >+ const DataRate kLinkCapacity = DataRate::kbps(200); >+ const DataRate kStartRate = DataRate::kbps(300); >+ >+ ScopedFieldTrials trial("WebRTC-Bwe-SafeResetOnRouteChange/Enabled/"); >+ Scenario s("googcc_unit/safe_reset_low", true); >+ auto* send_net = s.CreateSimulationNode([&](NetworkNodeConfig* c) { >+ c->simulation.bandwidth = kLinkCapacity; >+ c->simulation.delay = TimeDelta::ms(10); >+ }); >+ // TODO(srte): replace with SimulatedTimeClient when it supports probing. >+ auto* client = s.CreateClient("send", [&](CallClientConfig* c) { >+ c->transport.cc = TransportControllerConfig::CongestionController::kGoogCc; >+ c->transport.rates.start_rate = kStartRate; >+ }); >+ auto* route = s.CreateRoutes(client, {send_net}, >+ s.CreateClient("return", CallClientConfig()), >+ {s.CreateSimulationNode(NetworkNodeConfig())}); >+ s.CreateVideoStream(route->forward(), VideoStreamConfig()); >+ // Trigger reroute message, but keep transport unchanged. >+ s.ChangeRoute(route->forward(), {send_net}); >+ >+ // Allow the controller to stabilize. >+ s.RunFor(TimeDelta::ms(500)); >+ EXPECT_NEAR(client->send_bandwidth().kbps(), kLinkCapacity.kbps(), 50); >+ s.ChangeRoute(route->forward(), {send_net}); >+ // Allow new settings to propagate. >+ s.RunFor(TimeDelta::ms(100)); >+ // Under the trial, the target should be unchanged for low rates. 
>+ EXPECT_NEAR(client->send_bandwidth().kbps(), kLinkCapacity.kbps(), 50); >+} >+ >+TEST(GoogCcNetworkControllerTest, CutsHighRateInSafeResetTrial) { >+ const DataRate kLinkCapacity = DataRate::kbps(1000); >+ const DataRate kStartRate = DataRate::kbps(300); >+ >+ ScopedFieldTrials trial("WebRTC-Bwe-SafeResetOnRouteChange/Enabled/"); >+ Scenario s("googcc_unit/safe_reset_high_cut", true); >+ auto send_net = s.CreateSimulationNode([&](NetworkNodeConfig* c) { >+ c->simulation.bandwidth = kLinkCapacity; >+ c->simulation.delay = TimeDelta::ms(50); >+ }); >+ // TODO(srte): replace with SimulatedTimeClient when it supports probing. >+ auto* client = s.CreateClient("send", [&](CallClientConfig* c) { >+ c->transport.cc = TransportControllerConfig::CongestionController::kGoogCc; >+ c->transport.rates.start_rate = kStartRate; >+ }); >+ auto* route = s.CreateRoutes(client, {send_net}, >+ s.CreateClient("return", CallClientConfig()), >+ {s.CreateSimulationNode(NetworkNodeConfig())}); >+ s.CreateVideoStream(route->forward(), VideoStreamConfig()); >+ >+ s.ChangeRoute(route->forward(), {send_net}); >+ // Allow the controller to stabilize. >+ s.RunFor(TimeDelta::ms(500)); >+ EXPECT_NEAR(client->send_bandwidth().kbps(), kLinkCapacity.kbps(), 300); >+ s.ChangeRoute(route->forward(), {send_net}); >+ // Allow new settings to propagate. >+ s.RunFor(TimeDelta::ms(50)); >+ // Under the trial, the target should be reset from high values. 
>+ EXPECT_NEAR(client->send_bandwidth().kbps(), kStartRate.kbps(), 30); >+} >+ >+#ifdef WEBRTC_LINUX // bugs.webrtc.org/10036 >+#define MAYBE_DetectsHighRateInSafeResetTrial \ >+ DISABLED_DetectsHighRateInSafeResetTrial >+#else >+#define MAYBE_DetectsHighRateInSafeResetTrial DetectsHighRateInSafeResetTrial >+#endif >+TEST(GoogCcNetworkControllerTest, MAYBE_DetectsHighRateInSafeResetTrial) { >+ ScopedFieldTrials trial("WebRTC-Bwe-SafeResetOnRouteChange/Enabled/"); >+ const DataRate kInitialLinkCapacity = DataRate::kbps(200); >+ const DataRate kNewLinkCapacity = DataRate::kbps(800); >+ const DataRate kStartRate = DataRate::kbps(300); >+ >+ Scenario s("googcc_unit/safe_reset_high_detect", true); >+ auto* initial_net = s.CreateSimulationNode([&](NetworkNodeConfig* c) { >+ c->simulation.bandwidth = kInitialLinkCapacity; >+ c->simulation.delay = TimeDelta::ms(50); >+ }); >+ auto* new_net = s.CreateSimulationNode([&](NetworkNodeConfig* c) { >+ c->simulation.bandwidth = kNewLinkCapacity; >+ c->simulation.delay = TimeDelta::ms(50); >+ }); >+ // TODO(srte): replace with SimulatedTimeClient when it supports probing. >+ auto* client = s.CreateClient("send", [&](CallClientConfig* c) { >+ c->transport.cc = TransportControllerConfig::CongestionController::kGoogCc; >+ c->transport.rates.start_rate = kStartRate; >+ }); >+ auto* route = s.CreateRoutes(client, {initial_net}, >+ s.CreateClient("return", CallClientConfig()), >+ {s.CreateSimulationNode(NetworkNodeConfig())}); >+ s.CreateVideoStream(route->forward(), VideoStreamConfig()); >+ s.ChangeRoute(route->forward(), {initial_net}); >+ >+ // Allow the controller to stabilize. >+ s.RunFor(TimeDelta::ms(1000)); >+ EXPECT_NEAR(client->send_bandwidth().kbps(), kInitialLinkCapacity.kbps(), 50); >+ s.ChangeRoute(route->forward(), {new_net}); >+ // Allow new settings to propagate. >+ s.RunFor(TimeDelta::ms(100)); >+ // Under the field trial, the target rate should be unchanged since it's lower >+ // than the starting rate. 
>+ EXPECT_NEAR(client->send_bandwidth().kbps(), kInitialLinkCapacity.kbps(), 50); >+ // However, probing should have made us detect the higher rate. >+ s.RunFor(TimeDelta::ms(2000)); >+ EXPECT_NEAR(client->send_bandwidth().kbps(), kNewLinkCapacity.kbps(), 200); >+} >+ >+} // namespace test >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/goog_cc_network_control_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/goog_cc_network_control_unittest.cc >index e3f3abf2b11ebe1164fc3fd6caec49f9ef8c4c37..fecd1b32d464bcab04aaf37acbf0e49b40a13abc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/goog_cc_network_control_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/goog_cc_network_control_unittest.cc >@@ -8,12 +8,12 @@ > * be found in the AUTHORS file in the root of the source tree. > */ > >+#include "api/transport/goog_cc_factory.h" > #include "api/transport/test/network_control_tester.h" > #include "logging/rtc_event_log/mock/mock_rtc_event_log.h" >-#include "modules/congestion_controller/goog_cc/include/goog_cc_factory.h" >-#include "test/scenario/scenario.h" >- >+#include "test/field_trial.h" > #include "test/gtest.h" >+#include "test/scenario/scenario.h" > > using testing::Field; > using testing::Matcher; >@@ -84,9 +84,9 @@ class GoogCcNetworkControllerTest : public ::testing::Test { > PacedPacketInfo pacing_info) { > PacketResult packet_result; > packet_result.sent_packet = SentPacket(); >- packet_result.sent_packet->send_time = Timestamp::ms(send_time_ms); >- packet_result.sent_packet->size = DataSize::bytes(payload_size); >- packet_result.sent_packet->pacing_info = pacing_info; >+ packet_result.sent_packet.send_time = Timestamp::ms(send_time_ms); >+ packet_result.sent_packet.size = DataSize::bytes(payload_size); >+ packet_result.sent_packet.pacing_info = 
pacing_info; > packet_result.receive_time = Timestamp::ms(arrival_time_ms); > return packet_result; > } >@@ -122,7 +122,7 @@ class GoogCcNetworkControllerTest : public ::testing::Test { > CreateResult(current_time_.ms() + delay_buildup, current_time_.ms(), > kPayloadSize, PacedPacketInfo()); > delay_buildup += delay; >- controller_->OnSentPacket(*packet.sent_packet); >+ controller_->OnSentPacket(packet.sent_packet); > TransportPacketsFeedback feedback; > feedback.feedback_time = packet.receive_time; > feedback.packet_feedbacks.push_back(packet); >@@ -213,16 +213,13 @@ TEST_F(GoogCcNetworkControllerTest, LongFeedbackDelays) { > packets.push_back( > CreateResult(i * 100 + 40, 2 * i * 100 + 40, 1500, kPacingInfo1)); > >+ TransportPacketsFeedback feedback; > for (PacketResult& packet : packets) { >- controller_->OnSentPacket(*packet.sent_packet); >- // Simulate packet timeout >- packet.sent_packet = absl::nullopt; >+ controller_->OnSentPacket(packet.sent_packet); >+ feedback.sendless_arrival_times.push_back(packet.receive_time); > } > >- TransportPacketsFeedback feedback; > feedback.feedback_time = packets[0].receive_time; >- feedback.packet_feedbacks = packets; >- > AdvanceTimeMilliseconds(kFeedbackTimeoutMs); > SentPacket later_packet; > later_packet.send_time = Timestamp::ms(kFeedbackTimeoutMs + i * 200 + 40); >@@ -246,7 +243,7 @@ TEST_F(GoogCcNetworkControllerTest, LongFeedbackDelays) { > packets.push_back(CreateResult(140, 240, 1500, kPacingInfo1)); > > for (const PacketResult& packet : packets) >- controller_->OnSentPacket(*packet.sent_packet); >+ controller_->OnSentPacket(packet.sent_packet); > > TransportPacketsFeedback feedback; > feedback.feedback_time = packets[0].receive_time; >@@ -308,8 +305,51 @@ TEST_F(GoogCcNetworkControllerTest, > 20); > } > >-TEST_F(GoogCcNetworkControllerTest, ScenarioQuickTest) { >- Scenario s("googcc_unit/scenario_quick", false); >+TEST_F(GoogCcNetworkControllerTest, LimitsToMinRateIfRttIsHighInTrial) { >+ // The field trial limits 
maximum RTT to 2 seconds, higher RTT means that the >+ // controller backs off until it reaches the minimum configured bitrate. This >+ // allows the RTT to recover faster than the regular control mechanism would >+ // achieve. >+ ScopedFieldTrials trial("WebRTC-Bwe-MaxRttLimit/limit:2s/"); >+ // In the test case, we limit the capacity and add a cross traffic packet >+ // burst that blocks media from being sent. This causes the RTT to quickly >+ // increase above the threshold in the trial. >+ const DataRate kLinkCapacity = DataRate::kbps(100); >+ const DataRate kMinRate = DataRate::kbps(20); >+ const TimeDelta kBufferBloatDuration = TimeDelta::seconds(10); >+ Scenario s("googcc_unit/limit_trial", false); >+ NetworkNodeConfig net_conf; >+ auto send_net = s.CreateSimulationNode([=](NetworkNodeConfig* c) { >+ c->simulation.bandwidth = kLinkCapacity; >+ c->simulation.delay = TimeDelta::ms(100); >+ c->update_frequency = TimeDelta::ms(5); >+ }); >+ auto ret_net = s.CreateSimulationNode([](NetworkNodeConfig* c) { >+ c->simulation.delay = TimeDelta::ms(100); >+ c->update_frequency = TimeDelta::ms(5); >+ }); >+ SimulatedTimeClientConfig config; >+ config.transport.cc = >+ TransportControllerConfig::CongestionController::kGoogCc; >+ config.transport.rates.min_rate = kMinRate; >+ config.transport.rates.start_rate = kLinkCapacity; >+ SimulatedTimeClient* client = s.CreateSimulatedTimeClient( >+ "send", config, {PacketStreamConfig()}, {send_net}, {ret_net}); >+ // Run for a few seconds to allow the controller to stabilize. >+ s.RunFor(TimeDelta::seconds(10)); >+ const DataSize kBloatPacketSize = DataSize::bytes(1000); >+ const int kBloatPacketCount = >+ static_cast<int>(kBufferBloatDuration * kLinkCapacity / kBloatPacketSize); >+ // This will cause the RTT to be large for a while. >+ s.TriggerPacketBurst({send_net}, kBloatPacketCount, kBloatPacketSize.bytes()); >+ // Wait to allow the high RTT to be detected and acted upon. 
>+ s.RunFor(TimeDelta::seconds(4)); >+ // By now the target rate should have dropped to the minimum configured rate. >+ EXPECT_NEAR(client->target_rate_kbps(), kMinRate.kbps(), 1); >+} >+ >+TEST_F(GoogCcNetworkControllerTest, UpdatesTargetRateBasedOnLinkCapacity) { >+ Scenario s("googcc_unit/target_capacity", false); > SimulatedTimeClientConfig config; > config.transport.cc = > TransportControllerConfig::CongestionController::kGoogCcFeedback; >@@ -359,5 +399,61 @@ TEST_F(GoogCcNetworkControllerTest, ScenarioQuickTest) { > EXPECT_NEAR(client->target_rate_kbps(), 90, 20); > } > >+TEST_F(GoogCcNetworkControllerTest, DefaultEstimateVariesInSteadyState) { >+ ScopedFieldTrials trial("WebRTC-Bwe-StableBandwidthEstimate/Disabled/"); >+ Scenario s("googcc_unit/no_stable_varies", false); >+ SimulatedTimeClientConfig config; >+ config.transport.cc = >+ TransportControllerConfig::CongestionController::kGoogCcFeedback; >+ NetworkNodeConfig net_conf; >+ net_conf.simulation.bandwidth = DataRate::kbps(500); >+ net_conf.simulation.delay = TimeDelta::ms(100); >+ net_conf.update_frequency = TimeDelta::ms(5); >+ auto send_net = s.CreateSimulationNode(net_conf); >+ auto ret_net = s.CreateSimulationNode(net_conf); >+ SimulatedTimeClient* client = s.CreateSimulatedTimeClient( >+ "send", config, {PacketStreamConfig()}, {send_net}, {ret_net}); >+ // Run for a while to allow the estimate to stabilize. >+ s.RunFor(TimeDelta::seconds(20)); >+ DataRate min_estimate = DataRate::PlusInfinity(); >+ DataRate max_estimate = DataRate::MinusInfinity(); >+ // Measure variation in steady state. >+ for (int i = 0; i < 20; ++i) { >+ min_estimate = std::min(min_estimate, client->link_capacity()); >+ max_estimate = std::max(max_estimate, client->link_capacity()); >+ s.RunFor(TimeDelta::seconds(1)); >+ } >+ // We should expect drops by at least 15% (default backoff.) 
>+ EXPECT_LT(min_estimate / max_estimate, 0.85); >+} >+ >+TEST_F(GoogCcNetworkControllerTest, StableEstimateDoesNotVaryInSteadyState) { >+ ScopedFieldTrials trial("WebRTC-Bwe-StableBandwidthEstimate/Enabled/"); >+ Scenario s("googcc_unit/stable_is_stable", false); >+ SimulatedTimeClientConfig config; >+ config.transport.cc = >+ TransportControllerConfig::CongestionController::kGoogCcFeedback; >+ NetworkNodeConfig net_conf; >+ net_conf.simulation.bandwidth = DataRate::kbps(500); >+ net_conf.simulation.delay = TimeDelta::ms(100); >+ net_conf.update_frequency = TimeDelta::ms(5); >+ auto send_net = s.CreateSimulationNode(net_conf); >+ auto ret_net = s.CreateSimulationNode(net_conf); >+ SimulatedTimeClient* client = s.CreateSimulatedTimeClient( >+ "send", config, {PacketStreamConfig()}, {send_net}, {ret_net}); >+ // Run for a while to allow the estimate to stabilize. >+ s.RunFor(TimeDelta::seconds(20)); >+ DataRate min_estimate = DataRate::PlusInfinity(); >+ DataRate max_estimate = DataRate::MinusInfinity(); >+ // Measure variation in steady state. >+ for (int i = 0; i < 20; ++i) { >+ min_estimate = std::min(min_estimate, client->link_capacity()); >+ max_estimate = std::max(max_estimate, client->link_capacity()); >+ s.RunFor(TimeDelta::seconds(1)); >+ } >+ // We expect no variation under the trial in steady state. >+ EXPECT_GT(min_estimate / max_estimate, 0.95); >+} >+ > } // namespace test > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/include/goog_cc_factory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/include/goog_cc_factory.h >deleted file mode 100644 >index 0d91fff65fa96d8a5cac251fabe8032660464ad5..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/include/goog_cc_factory.h >+++ /dev/null >@@ -1,48 +0,0 @@ >-/* >- * Copyright (c) 2018 The WebRTC project authors. 
All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef MODULES_CONGESTION_CONTROLLER_GOOG_CC_INCLUDE_GOOG_CC_FACTORY_H_ >-#define MODULES_CONGESTION_CONTROLLER_GOOG_CC_INCLUDE_GOOG_CC_FACTORY_H_ >-#include <memory> >- >-#include "api/transport/network_control.h" >- >-namespace webrtc { >-class Clock; >-class RtcEventLog; >- >-class GoogCcNetworkControllerFactory >- : public NetworkControllerFactoryInterface { >- public: >- explicit GoogCcNetworkControllerFactory(RtcEventLog*); >- std::unique_ptr<NetworkControllerInterface> Create( >- NetworkControllerConfig config) override; >- TimeDelta GetProcessInterval() const override; >- >- private: >- RtcEventLog* const event_log_; >-}; >- >-// Factory to create packet feedback only GoogCC, this can be used for >-// connections providing packet receive time feedback but no other reports. 
>-class GoogCcFeedbackNetworkControllerFactory >- : public NetworkControllerFactoryInterface { >- public: >- explicit GoogCcFeedbackNetworkControllerFactory(RtcEventLog*); >- std::unique_ptr<NetworkControllerInterface> Create( >- NetworkControllerConfig config) override; >- TimeDelta GetProcessInterval() const override; >- >- private: >- RtcEventLog* const event_log_; >-}; >-} // namespace webrtc >- >-#endif // MODULES_CONGESTION_CONTROLLER_GOOG_CC_INCLUDE_GOOG_CC_FACTORY_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/median_slope_estimator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/median_slope_estimator.cc >index 0ad7728302ec20878f9aa504a5becd8c59be612f..45d2fe32114edea59fe41fedb78c2d99d7aead74 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/median_slope_estimator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/median_slope_estimator.cc >@@ -10,12 +10,10 @@ > > #include "modules/congestion_controller/goog_cc/median_slope_estimator.h" > >-#include <algorithm> > #include <vector> > >-#include "modules/remote_bitrate_estimator/include/bwe_defines.h" > #include "modules/remote_bitrate_estimator/test/bwe_test_logging.h" >-#include "rtc_base/logging.h" >+#include "rtc_base/checks.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/probe_bitrate_estimator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/probe_bitrate_estimator.cc >index 0f0d17ea48db06221e4cbdf17f53b77ef3cb5753..7ba2fb5a13bc94d284ee657ece482801886db7d7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/probe_bitrate_estimator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/probe_bitrate_estimator.cc >@@ -170,11 +170,13 @@ int 
ProbeBitrateEstimator::HandleProbeAndEstimateBitrate( > return *estimated_bitrate_bps_; > } > >-absl::optional<int> >-ProbeBitrateEstimator::FetchAndResetLastEstimatedBitrateBps() { >+absl::optional<DataRate> >+ProbeBitrateEstimator::FetchAndResetLastEstimatedBitrate() { > absl::optional<int> estimated_bitrate_bps = estimated_bitrate_bps_; > estimated_bitrate_bps_.reset(); >- return estimated_bitrate_bps; >+ if (estimated_bitrate_bps) >+ return DataRate::bps(*estimated_bitrate_bps); >+ return absl::nullopt; > } > > void ProbeBitrateEstimator::EraseOldClusters(int64_t timestamp_ms) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/probe_bitrate_estimator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/probe_bitrate_estimator.h >index ed9252137fea1afe3fcfb6a77ca4a441588b6bea..ae83ed380feb875c6c94ea425ab96d8aecd998aa 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/probe_bitrate_estimator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/probe_bitrate_estimator.h >@@ -28,7 +28,7 @@ class ProbeBitrateEstimator { > // Returns the estimated bitrate if the probe completes a valid cluster. 
> int HandleProbeAndEstimateBitrate(const PacketFeedback& packet_feedback); > >- absl::optional<int> FetchAndResetLastEstimatedBitrateBps(); >+ absl::optional<DataRate> FetchAndResetLastEstimatedBitrate(); > > private: > struct AggregatedCluster { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/probe_bitrate_estimator_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/probe_bitrate_estimator_unittest.cc >index f8a40f839f385a3005b4a665772292d020ab8555..fcf555924537ab86ad93de34831ebb66d82c0390 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/probe_bitrate_estimator_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/probe_bitrate_estimator_unittest.cc >@@ -202,7 +202,7 @@ TEST_F(TestProbeBitrateEstimator, IgnoreSizeFirstReceivePacket) { > } > > TEST_F(TestProbeBitrateEstimator, NoLastEstimatedBitrateBps) { >- EXPECT_FALSE(probe_bitrate_estimator_.FetchAndResetLastEstimatedBitrateBps()); >+ EXPECT_FALSE(probe_bitrate_estimator_.FetchAndResetLastEstimatedBitrate()); > } > > TEST_F(TestProbeBitrateEstimator, FetchLastEstimatedBitrateBps) { >@@ -211,11 +211,11 @@ TEST_F(TestProbeBitrateEstimator, FetchLastEstimatedBitrateBps) { > AddPacketFeedback(0, 1000, 20, 30); > AddPacketFeedback(0, 1000, 30, 40); > >- auto estimated_bitrate_bps = >- probe_bitrate_estimator_.FetchAndResetLastEstimatedBitrateBps(); >- EXPECT_TRUE(estimated_bitrate_bps); >- EXPECT_NEAR(*estimated_bitrate_bps, 800000, 10); >- EXPECT_FALSE(probe_bitrate_estimator_.FetchAndResetLastEstimatedBitrateBps()); >+ auto estimated_bitrate = >+ probe_bitrate_estimator_.FetchAndResetLastEstimatedBitrate(); >+ EXPECT_TRUE(estimated_bitrate); >+ EXPECT_NEAR(estimated_bitrate->bps(), 800000, 10); >+ EXPECT_FALSE(probe_bitrate_estimator_.FetchAndResetLastEstimatedBitrate()); > } > > } // namespace webrtc >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/probe_controller.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/probe_controller.cc >index c2a8e023d7d40eb78ee52c9243b51f607ce4b904..7bbe8e7138a4e627679f6c4856b2e841e37f2d7d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/probe_controller.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/probe_controller.cc >@@ -146,6 +146,7 @@ std::vector<ProbeClusterConfig> ProbeController::OnMaxTotalAllocatedBitrate( > max_total_allocated_bitrate_ = max_total_allocated_bitrate; > return InitiateProbing(at_time_ms, {max_total_allocated_bitrate}, false); > } >+ max_total_allocated_bitrate_ = max_total_allocated_bitrate; > return std::vector<ProbeClusterConfig>(); > } > >@@ -260,6 +261,16 @@ std::vector<ProbeClusterConfig> ProbeController::RequestProbe( > return std::vector<ProbeClusterConfig>(); > } > >+std::vector<ProbeClusterConfig> ProbeController::InitiateCapacityProbing( >+ int64_t bitrate_bps, >+ int64_t at_time_ms) { >+ if (state_ != State::kWaitingForProbingResult) { >+ RTC_DCHECK(network_available_); >+ return InitiateProbing(at_time_ms, {2 * bitrate_bps}, true); >+ } >+ return std::vector<ProbeClusterConfig>(); >+} >+ > void ProbeController::Reset(int64_t at_time_ms) { > network_available_ = true; > state_ = State::kInit; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/probe_controller.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/probe_controller.h >index 6b6d4ae8b4b52a815aa7e8e4ee92161b7e5d66b3..4ddd77f75c37b6dbd9b53893d3c0b5e93944c7f1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/probe_controller.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/probe_controller.h >@@ -60,6 +60,9 @@ class 
ProbeController { > RTC_WARN_UNUSED_RESULT std::vector<ProbeClusterConfig> RequestProbe( > int64_t at_time_ms); > >+ RTC_WARN_UNUSED_RESULT std::vector<ProbeClusterConfig> >+ InitiateCapacityProbing(int64_t bitrate_bps, int64_t at_time_ms); >+ > // Resets the ProbeController to a state equivalent to as if it was just > // created EXCEPT for |enable_periodic_alr_probing_|. > void Reset(int64_t at_time_ms); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/test/goog_cc_printer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/test/goog_cc_printer.cc >index 3264b37e6ddc0e4638b57819b0de87ed0cd8768d..59aad511c008b080cdef5c5e2c3b0543bfcbcb53 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/test/goog_cc_printer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/test/goog_cc_printer.cc >@@ -8,6 +8,9 @@ > * be found in the AUTHORS file in the root of the source tree. 
> */ > #include "modules/congestion_controller/goog_cc/test/goog_cc_printer.h" >+ >+#include <math.h> >+ > #include "modules/congestion_controller/goog_cc/trendline_estimator.h" > > namespace webrtc { >@@ -33,9 +36,11 @@ void GoogCcStatePrinter::PrintValues(FILE* out) { > RTC_CHECK(controller_); > auto* detector = controller_->delay_based_bwe_->delay_detector_.get(); > auto* trendline_estimator = reinterpret_cast<TrendlineEstimator*>(detector); >- fprintf(out, "%i %i %i %.6lf %.6lf %.6lf", >+ fprintf(out, "%i %f %i %.6lf %.6lf %.6lf", > controller_->delay_based_bwe_->rate_control_.rate_control_state_, >- controller_->delay_based_bwe_->rate_control_.rate_control_region_, >+ controller_->delay_based_bwe_->rate_control_ >+ .link_capacity_estimate_kbps_.value_or(NAN) * >+ 1000 / 8, > controller_->alr_detector_->alr_started_time_ms_.has_value(), > trendline_estimator->prev_trend_, > trendline_estimator->prev_modified_trend_, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/test/goog_cc_printer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/test/goog_cc_printer.h >index 4dfc059b479a9475abeb961f539b58cde040d2c4..5d4f2c1840f4c97a83075f29cc9dcc6c6e557af2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/test/goog_cc_printer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/test/goog_cc_printer.h >@@ -12,8 +12,8 @@ > > #include <memory> > >+#include "api/transport/goog_cc_factory.h" > #include "modules/congestion_controller/goog_cc/goog_cc_network_control.h" >-#include "modules/congestion_controller/goog_cc/include/goog_cc_factory.h" > #include "modules/congestion_controller/test/controller_printer.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/trendline_estimator.h 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/trendline_estimator.h >index a7d22a950efad02dc387569a7eb2400ae81bfc97..1826d812480e4720701b86961a56e51041a1908f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/trendline_estimator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/goog_cc/trendline_estimator.h >@@ -12,11 +12,11 @@ > > #include <stddef.h> > #include <stdint.h> >- > #include <deque> > #include <utility> > > #include "modules/congestion_controller/goog_cc/delay_increase_detector_interface.h" >+#include "modules/remote_bitrate_estimator/include/bwe_defines.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/include/send_side_congestion_controller.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/include/send_side_congestion_controller.h >index ba26272e6fb47b4c20df53a51f8f9e83ac16144f..746e590a59f41e7abc4452f3c35e13b9cb9997a3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/include/send_side_congestion_controller.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/include/send_side_congestion_controller.h >@@ -113,6 +113,9 @@ class SendSideCongestionController > > void SetAllocatedBitrateWithoutFeedback(uint32_t bitrate_bps) override; > >+ void EnableCongestionWindowPushback(int64_t accepted_queue_ms, >+ uint32_t min_pushback_target_bitrate_bps); >+ > private: > void MaybeTriggerOnNetworkChanged(); > >@@ -136,7 +139,7 @@ class SendSideCongestionController > RTC_GUARDED_BY(probe_lock_); > > const std::unique_ptr<RateLimiter> retransmission_rate_limiter_; >- TransportFeedbackAdapter transport_feedback_adapter_; >+ LegacyTransportFeedbackAdapter transport_feedback_adapter_; > rtc::CriticalSection network_state_lock_; > uint32_t last_reported_bitrate_bps_ 
RTC_GUARDED_BY(network_state_lock_); > uint8_t last_reported_fraction_loss_ RTC_GUARDED_BY(network_state_lock_); >@@ -149,6 +152,8 @@ class SendSideCongestionController > bool pacer_paused_; > rtc::CriticalSection bwe_lock_; > int min_bitrate_bps_ RTC_GUARDED_BY(bwe_lock_); >+ std::unique_ptr<ProbeBitrateEstimator> probe_bitrate_estimator_ >+ RTC_GUARDED_BY(bwe_lock_); > std::unique_ptr<DelayBasedBwe> delay_based_bwe_ RTC_GUARDED_BY(bwe_lock_); > bool in_cwnd_experiment_; > int64_t accepted_queue_ms_; >@@ -161,7 +166,7 @@ class SendSideCongestionController > bool pacer_pushback_experiment_ = false; > float encoding_rate_ = 1.0; > >- const std::unique_ptr<CongestionWindowPushbackController> >+ std::unique_ptr<CongestionWindowPushbackController> > congestion_window_pushback_controller_; > > RTC_DISALLOW_IMPLICIT_CONSTRUCTORS(SendSideCongestionController); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/pcc/monitor_interval.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/pcc/monitor_interval.cc >index d430b7f1790a7a38febd40adc6152a59f546554d..281b9224f0c3beb0503f35336157ff05104c48e1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/pcc/monitor_interval.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/pcc/monitor_interval.cc >@@ -31,25 +31,24 @@ PccMonitorInterval::PccMonitorInterval(const PccMonitorInterval& other) = > void PccMonitorInterval::OnPacketsFeedback( > const std::vector<PacketResult>& packets_results) { > for (const PacketResult& packet_result : packets_results) { >- if (!packet_result.sent_packet.has_value() || >- packet_result.sent_packet->send_time <= start_time_) { >+ if (packet_result.sent_packet.send_time <= start_time_) { > continue; > } > // Here we assume that if some packets are reordered with packets sent > // after the end of the monitor interval, then they are lost. 
(Otherwise > // it is not clear how long should we wait for packets feedback to arrive). >- if (packet_result.sent_packet->send_time > >+ if (packet_result.sent_packet.send_time > > start_time_ + interval_duration_) { > feedback_collection_done_ = true; > return; > } > if (packet_result.receive_time.IsInfinite()) { >- lost_packets_sent_time_.push_back(packet_result.sent_packet->send_time); >+ lost_packets_sent_time_.push_back(packet_result.sent_packet.send_time); > } else { > received_packets_.push_back( >- {packet_result.receive_time - packet_result.sent_packet->send_time, >- packet_result.sent_packet->send_time}); >- received_packets_size_ += packet_result.sent_packet->size; >+ {packet_result.receive_time - packet_result.sent_packet.send_time, >+ packet_result.sent_packet.send_time}); >+ received_packets_size_ += packet_result.sent_packet.size; > } > } > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/pcc/pcc_network_controller.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/pcc/pcc_network_controller.cc >index 98cda7409bd1fb4227149419c0527a8b8b3449b5..47ad33eac227bd053221589166e73633a06d6674 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/pcc/pcc_network_controller.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/pcc/pcc_network_controller.cc >@@ -148,7 +148,7 @@ NetworkControlUpdate PccNetworkController::OnSentPacket(SentPacket msg) { > if (IsTimeoutExpired(msg.send_time)) { > DataSize received_size = DataSize::Zero(); > for (size_t i = 1; i < last_received_packets_.size(); ++i) { >- received_size += last_received_packets_[i].sent_packet->size; >+ received_size += last_received_packets_[i].sent_packet.size; > } > TimeDelta sending_time = TimeDelta::Zero(); > if (last_received_packets_.size() > 0) >@@ -166,7 +166,7 @@ NetworkControlUpdate PccNetworkController::OnSentPacket(SentPacket msg) { > msg.send_time - start_time_ >= 
kStartupDuration) { > DataSize received_size = DataSize::Zero(); > for (size_t i = 1; i < last_received_packets_.size(); ++i) { >- received_size += last_received_packets_[i].sent_packet->size; >+ received_size += last_received_packets_[i].sent_packet.size; > } > TimeDelta sending_time = TimeDelta::Zero(); > if (last_received_packets_.size() > 0) >@@ -260,6 +260,8 @@ bool PccNetworkController::IsFeedbackCollectionDone() const { > > NetworkControlUpdate PccNetworkController::OnTransportPacketsFeedback( > TransportPacketsFeedback msg) { >+ if (msg.packet_feedbacks.empty()) >+ return NetworkControlUpdate(); > // Save packets to last_received_packets_ array. > for (const PacketResult& packet_result : msg.ReceivedWithSendInfo()) { > last_received_packets_.push_back(packet_result); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/pcc/rtt_tracker.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/pcc/rtt_tracker.cc >index a0545759f9f1208ed9161aada4d6a56d4ffe55b1..533a573a7ac8d4bb4077a67d3791633d3ae04856 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/pcc/rtt_tracker.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/pcc/rtt_tracker.cc >@@ -23,12 +23,11 @@ void RttTracker::OnPacketsFeedback( > Timestamp feedback_received_time) { > TimeDelta packet_rtt = TimeDelta::MinusInfinity(); > for (const PacketResult& packet_result : packet_feedbacks) { >- if (!packet_result.sent_packet.has_value() || >- packet_result.receive_time.IsInfinite()) >+ if (packet_result.receive_time.IsInfinite()) > continue; > packet_rtt = std::max<TimeDelta>( > packet_rtt, >- feedback_received_time - packet_result.sent_packet->send_time); >+ feedback_received_time - packet_result.sent_packet.send_time); > } > if (packet_rtt.IsFinite()) > rtt_estimate_ = (1 - alpha_) * rtt_estimate_ + alpha_ * packet_rtt; >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/BUILD.gn >index 134fb940fdd6528e65ba332d8294b175d1074542..bffe1f893eb54ba5bacdc980545ba58cca8d678c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/BUILD.gn >@@ -21,8 +21,6 @@ rtc_static_library("congestion_controller") { > configs += [ ":bwe_test_logging" ] > sources = [ > "include/send_side_congestion_controller.h", >- "pacer_controller.cc", >- "pacer_controller.h", > "send_side_congestion_controller.cc", > ] > >@@ -32,10 +30,12 @@ rtc_static_library("congestion_controller") { > } > > deps = [ >+ ":control_handler", > ":transport_feedback", > "../:congestion_controller", > "../..:module_api", > "../../..:webrtc_common", >+ "../../../api/transport:goog_cc", > "../../../api/transport:network_control", > "../../../rtc_base:checks", > "../../../rtc_base:rate_limiter", >@@ -43,12 +43,12 @@ rtc_static_library("congestion_controller") { > "../../../rtc_base:safe_minmax", > "../../../rtc_base:sequenced_task_checker", > "../../../rtc_base/experiments:congestion_controller_experiment", >+ "../../../rtc_base/network:sent_packet", > "../../../system_wrappers", > "../../../system_wrappers:field_trial", > "../../pacing", > "../../remote_bitrate_estimator", > "../../rtp_rtcp:rtp_rtcp_format", >- "../goog_cc", > "//third_party/abseil-cpp/absl/memory", > ] > >@@ -57,6 +57,38 @@ rtc_static_library("congestion_controller") { > } > } > >+rtc_source_set("control_handler") { >+ visibility = [ "*" ] >+ sources = [ >+ "control_handler.cc", >+ "control_handler.h", >+ ] >+ >+ if (!build_with_chromium && is_clang) { >+ # Suppress warnings from the Chromium Clang plugin (bugs.webrtc.org/163). 
>+ suppressed_configs += [ "//build/config/clang:find_bad_constructs" ] >+ } >+ >+ deps = [ >+ "../:congestion_controller", >+ "../../../api/transport:network_control", >+ "../../../rtc_base:checks", >+ "../../../rtc_base:rate_limiter", >+ "../../../rtc_base:safe_minmax", >+ "../../../rtc_base:sequenced_task_checker", >+ "../../../rtc_base/experiments:congestion_controller_experiment", >+ "../../../system_wrappers", >+ "../../../system_wrappers:field_trial", >+ "../../pacing", >+ "../../remote_bitrate_estimator", >+ "../../rtp_rtcp:rtp_rtcp_format", >+ "//third_party/abseil-cpp/absl/memory", >+ ] >+ >+ if (!build_with_mozilla) { >+ deps += [ "../../../rtc_base:rtc_base" ] >+ } >+} > rtc_static_library("transport_feedback") { > visibility = [ "*" ] > sources = [ >@@ -69,8 +101,10 @@ rtc_static_library("transport_feedback") { > deps = [ > "../..:module_api", > "../../../api/transport:network_control", >+ "../../../api/units:data_size", > "../../../rtc_base:checks", > "../../../rtc_base:rtc_base_approved", >+ "../../../rtc_base/network:sent_packet", > "../../../system_wrappers", > "../../rtp_rtcp:rtp_rtcp_format", > ] >@@ -97,7 +131,7 @@ if (rtc_include_tests) { > "../../../rtc_base:checks", > "../../../rtc_base:rtc_base", > "../../../rtc_base:rtc_base_approved", >- "../../../rtc_base:rtc_base_tests_utils", >+ "../../../rtc_base/network:sent_packet", > "../../../system_wrappers", > "../../../test:field_trial", > "../../../test:test_support", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/control_handler.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/control_handler.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..e6cadb720e9c0b1cca5a26f5e22325f947e5d696 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/control_handler.cc >@@ -0,0 +1,169 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. 
>+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "modules/congestion_controller/rtp/control_handler.h" >+ >+#include "rtc_base/checks.h" >+#include "rtc_base/logging.h" >+#include "rtc_base/numerics/safe_minmax.h" >+#include "system_wrappers/include/field_trial.h" >+ >+namespace webrtc { >+namespace { >+ >+// When PacerPushbackExperiment is enabled, build-up in the pacer due to >+// the congestion window and/or data spikes reduces encoder allocations. >+bool IsPacerPushbackExperimentEnabled() { >+ return field_trial::IsEnabled("WebRTC-PacerPushbackExperiment"); >+} >+ >+// By default, pacer emergency stops encoder when buffer reaches a high level. >+bool IsPacerEmergencyStopDisabled() { >+ return field_trial::IsEnabled("WebRTC-DisablePacerEmergencyStop"); >+} >+ >+} // namespace >+CongestionControlHandler::CongestionControlHandler( >+ NetworkChangedObserver* observer, >+ PacedSender* pacer) >+ : observer_(observer), >+ pacer_(pacer), >+ pacer_pushback_experiment_(IsPacerPushbackExperimentEnabled()), >+ disable_pacer_emergency_stop_(IsPacerEmergencyStopDisabled()) { >+ sequenced_checker_.Detach(); >+} >+ >+CongestionControlHandler::~CongestionControlHandler() {} >+ >+void CongestionControlHandler::PostUpdates(NetworkControlUpdate update) { >+ RTC_DCHECK_CALLED_SEQUENTIALLY(&sequenced_checker_); >+ if (update.congestion_window) { >+ if (update.congestion_window->IsFinite()) >+ pacer_->SetCongestionWindow(update.congestion_window->bytes()); >+ else >+ pacer_->SetCongestionWindow(PacedSender::kNoCongestionWindow); >+ } >+ if (update.pacer_config) { >+ pacer_->SetPacingRates(update.pacer_config->data_rate().bps(), >+ update.pacer_config->pad_rate().bps()); >+ } >+ 
for (const auto& probe : update.probe_cluster_configs) { >+ int64_t bitrate_bps = probe.target_data_rate.bps(); >+ pacer_->CreateProbeCluster(bitrate_bps); >+ } >+ if (update.target_rate) { >+ current_target_rate_msg_ = *update.target_rate; >+ OnNetworkInvalidation(); >+ } >+} >+ >+void CongestionControlHandler::OnNetworkAvailability(NetworkAvailability msg) { >+ RTC_DCHECK_CALLED_SEQUENTIALLY(&sequenced_checker_); >+ if (network_available_ != msg.network_available) { >+ network_available_ = msg.network_available; >+ pacer_->UpdateOutstandingData(0); >+ SetPacerState(!msg.network_available); >+ OnNetworkInvalidation(); >+ } >+} >+ >+void CongestionControlHandler::OnOutstandingData(DataSize in_flight_data) { >+ RTC_DCHECK_CALLED_SEQUENTIALLY(&sequenced_checker_); >+ pacer_->UpdateOutstandingData(in_flight_data.bytes()); >+ OnNetworkInvalidation(); >+} >+ >+void CongestionControlHandler::OnPacerQueueUpdate( >+ TimeDelta expected_queue_time) { >+ RTC_DCHECK_CALLED_SEQUENTIALLY(&sequenced_checker_); >+ pacer_expected_queue_ms_ = expected_queue_time.ms(); >+ OnNetworkInvalidation(); >+} >+ >+void CongestionControlHandler::SetPacerState(bool paused) { >+ if (paused && !pacer_paused_) >+ pacer_->Pause(); >+ else if (!paused && pacer_paused_) >+ pacer_->Resume(); >+ pacer_paused_ = paused; >+} >+ >+void CongestionControlHandler::OnNetworkInvalidation() { >+ if (!current_target_rate_msg_.has_value()) >+ return; >+ >+ uint32_t target_bitrate_bps = current_target_rate_msg_->target_rate.bps(); >+ int64_t rtt_ms = >+ current_target_rate_msg_->network_estimate.round_trip_time.ms(); >+ float loss_rate_ratio = >+ current_target_rate_msg_->network_estimate.loss_rate_ratio; >+ >+ int loss_ratio_255 = loss_rate_ratio * 255; >+ uint8_t fraction_loss = >+ rtc::dchecked_cast<uint8_t>(rtc::SafeClamp(loss_ratio_255, 0, 255)); >+ >+ int64_t probing_interval_ms = >+ current_target_rate_msg_->network_estimate.bwe_period.ms(); >+ >+ if (!network_available_) { >+ target_bitrate_bps = 0; >+ } 
else if (pacer_pushback_experiment_) { >+ int64_t queue_length_ms = pacer_expected_queue_ms_; >+ >+ if (queue_length_ms == 0) { >+ encoding_rate_ratio_ = 1.0; >+ } else if (queue_length_ms > 50) { >+ double encoding_ratio = 1.0 - queue_length_ms / 1000.0; >+ encoding_rate_ratio_ = std::min(encoding_rate_ratio_, encoding_ratio); >+ encoding_rate_ratio_ = std::max(encoding_rate_ratio_, 0.0); >+ } >+ >+ target_bitrate_bps *= encoding_rate_ratio_; >+ target_bitrate_bps = target_bitrate_bps < 50000 ? 0 : target_bitrate_bps; >+ } else if (!disable_pacer_emergency_stop_) { >+ target_bitrate_bps = IsSendQueueFull() ? 0 : target_bitrate_bps; >+ } >+ >+ if (HasNetworkParametersToReportChanged(target_bitrate_bps, fraction_loss, >+ rtt_ms)) { >+ observer_->OnNetworkChanged(target_bitrate_bps, fraction_loss, rtt_ms, >+ probing_interval_ms); >+ } >+} >+bool CongestionControlHandler::HasNetworkParametersToReportChanged( >+ int64_t target_bitrate_bps, >+ uint8_t fraction_loss, >+ int64_t rtt_ms) { >+ bool changed = last_reported_target_bitrate_bps_ != target_bitrate_bps || >+ (target_bitrate_bps > 0 && >+ (last_reported_fraction_loss_ != fraction_loss || >+ last_reported_rtt_ms_ != rtt_ms)); >+ if (changed && >+ (last_reported_target_bitrate_bps_ == 0 || target_bitrate_bps == 0)) { >+ RTC_LOG(LS_INFO) << "Bitrate estimate state changed, BWE: " >+ << target_bitrate_bps << " bps."; >+ } >+ last_reported_target_bitrate_bps_ = target_bitrate_bps; >+ last_reported_fraction_loss_ = fraction_loss; >+ last_reported_rtt_ms_ = rtt_ms; >+ return changed; >+} >+ >+bool CongestionControlHandler::IsSendQueueFull() const { >+ return pacer_expected_queue_ms_ > PacedSender::kMaxQueueLengthMs; >+} >+ >+absl::optional<TargetTransferRate> >+CongestionControlHandler::last_transfer_rate() { >+ RTC_DCHECK_CALLED_SEQUENTIALLY(&sequenced_checker_); >+ return current_target_rate_msg_; >+} >+ >+} // namespace webrtc >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/control_handler.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/control_handler.h >new file mode 100644 >index 0000000000000000000000000000000000000000..456c2a9e27128406f5bb8040a56dd44cd8905600 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/control_handler.h >@@ -0,0 +1,72 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef MODULES_CONGESTION_CONTROLLER_RTP_CONTROL_HANDLER_H_ >+#define MODULES_CONGESTION_CONTROLLER_RTP_CONTROL_HANDLER_H_ >+ >+#include <algorithm> >+#include <memory> >+ >+#include "api/transport/network_control.h" >+#include "modules/congestion_controller/include/network_changed_observer.h" >+#include "modules/pacing/paced_sender.h" >+#include "rtc_base/sequenced_task_checker.h" >+ >+namespace webrtc { >+// This is used to observe the network controller state and route calls to >+// the proper handler. It also keeps cached values for safe asynchronous use. >+// This makes sure that things running on the worker queue can't access state >+// in SendSideCongestionController, which would risk causing data race on >+// destruction unless members are properly ordered. 
>+class CongestionControlHandler { >+ public: >+ CongestionControlHandler(NetworkChangedObserver* observer, >+ PacedSender* pacer); >+ ~CongestionControlHandler(); >+ >+ void PostUpdates(NetworkControlUpdate update); >+ >+ void OnNetworkAvailability(NetworkAvailability msg); >+ void OnOutstandingData(DataSize in_flight_data); >+ void OnPacerQueueUpdate(TimeDelta expected_queue_time); >+ >+ absl::optional<TargetTransferRate> last_transfer_rate(); >+ >+ private: >+ void SetPacerState(bool paused); >+ void OnNetworkInvalidation(); >+ bool IsSendQueueFull() const; >+ bool HasNetworkParametersToReportChanged(int64_t bitrate_bps, >+ float loss_rate_ratio, >+ TimeDelta rtt); >+ >+ bool HasNetworkParametersToReportChanged(int64_t bitrate_bps, >+ uint8_t fraction_loss, >+ int64_t rtt); >+ NetworkChangedObserver* observer_ = nullptr; >+ PacedSender* const pacer_; >+ >+ absl::optional<TargetTransferRate> current_target_rate_msg_; >+ bool network_available_ = true; >+ bool pacer_paused_ = false; >+ int64_t last_reported_target_bitrate_bps_ = 0; >+ uint8_t last_reported_fraction_loss_ = 0; >+ int64_t last_reported_rtt_ms_ = 0; >+ const bool pacer_pushback_experiment_; >+ const bool disable_pacer_emergency_stop_; >+ uint32_t min_pushback_target_bitrate_bps_; >+ int64_t pacer_expected_queue_ms_ = 0; >+ double encoding_rate_ratio_ = 1.0; >+ >+ rtc::SequencedTaskChecker sequenced_checker_; >+ RTC_DISALLOW_IMPLICIT_CONSTRUCTORS(CongestionControlHandler); >+}; >+} // namespace webrtc >+#endif // MODULES_CONGESTION_CONTROLLER_RTP_CONTROL_HANDLER_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/include/send_side_congestion_controller.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/include/send_side_congestion_controller.h >index efe4d2ff47756b441f52d64f149bf8a2ecac2854..55465cb75f88f695cec986697dfebddeb911f364 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/include/send_side_congestion_controller.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/include/send_side_congestion_controller.h >@@ -21,7 +21,7 @@ > #include "api/transport/network_types.h" > #include "common_types.h" // NOLINT(build/include) > #include "modules/congestion_controller/include/send_side_congestion_controller_interface.h" >-#include "modules/congestion_controller/rtp/pacer_controller.h" >+#include "modules/congestion_controller/rtp/control_handler.h" > #include "modules/congestion_controller/rtp/transport_feedback_adapter.h" > #include "modules/include/module.h" > #include "modules/include/module_common_types.h" >@@ -40,18 +40,11 @@ struct SentPacket; > namespace webrtc { > > class Clock; >-class RateLimiter; > class RtcEventLog; > > namespace webrtc_cc { > > namespace send_side_cc_internal { >-// This is used to observe the network controller state and route calls to >-// the proper handler. It also keeps cached values for safe asynchronous use. >-// This makes sure that things running on the worker queue can't access state >-// in SendSideCongestionController, which would risk causing data race on >-// destruction unless members are properly ordered. >-class ControlHandler; > > // TODO(srte): Make sure the PeriodicTask implementation is reusable and move it > // to task_queue.h. 
>@@ -182,10 +175,7 @@ class SendSideCongestionController > const std::unique_ptr<NetworkControllerFactoryInterface> > controller_factory_fallback_ RTC_GUARDED_BY(task_queue_); > >- const std::unique_ptr<PacerController> pacer_controller_ >- RTC_GUARDED_BY(task_queue_); >- >- std::unique_ptr<send_side_cc_internal::ControlHandler> control_handler_ >+ std::unique_ptr<CongestionControlHandler> control_handler_ > RTC_GUARDED_BY(task_queue_); > > std::unique_ptr<NetworkControllerInterface> controller_ >@@ -201,6 +191,7 @@ class SendSideCongestionController > NetworkControllerConfig initial_config_ RTC_GUARDED_BY(task_queue_); > StreamsConfig streams_config_ RTC_GUARDED_BY(task_queue_); > >+ const bool reset_feedback_on_route_change_; > const bool send_side_bwe_with_overhead_; > // Transport overhead is written by OnNetworkRouteChanged and read by > // AddPacket. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/pacer_controller.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/pacer_controller.cc >deleted file mode 100644 >index af062e881eeef20800be0077bcc977433337cd58..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/pacer_controller.cc >+++ /dev/null >@@ -1,70 +0,0 @@ >-/* >- * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include "modules/congestion_controller/rtp/pacer_controller.h" >- >-#include "rtc_base/checks.h" >-#include "rtc_base/logging.h" >- >-namespace webrtc { >- >-PacerController::PacerController(PacedSender* pacer) : pacer_(pacer) { >- sequenced_checker_.Detach(); >-} >- >-PacerController::~PacerController() = default; >- >-void PacerController::OnCongestionWindow(DataSize congestion_window) { >- RTC_DCHECK_CALLED_SEQUENTIALLY(&sequenced_checker_); >- if (congestion_window.IsFinite()) >- pacer_->SetCongestionWindow(congestion_window.bytes()); >- else >- pacer_->SetCongestionWindow(PacedSender::kNoCongestionWindow); >-} >- >-void PacerController::OnNetworkAvailability(NetworkAvailability msg) { >- RTC_DCHECK_CALLED_SEQUENTIALLY(&sequenced_checker_); >- network_available_ = msg.network_available; >- pacer_->UpdateOutstandingData(0); >- SetPacerState(!msg.network_available); >-} >- >-void PacerController::OnNetworkRouteChange(NetworkRouteChange) { >- RTC_DCHECK_CALLED_SEQUENTIALLY(&sequenced_checker_); >- pacer_->UpdateOutstandingData(0); >-} >- >-void PacerController::OnPacerConfig(PacerConfig msg) { >- RTC_DCHECK_CALLED_SEQUENTIALLY(&sequenced_checker_); >- DataRate pacing_rate = msg.data_window / msg.time_window; >- DataRate padding_rate = msg.pad_window / msg.time_window; >- pacer_->SetPacingRates(pacing_rate.bps(), padding_rate.bps()); >-} >- >-void PacerController::OnProbeClusterConfig(ProbeClusterConfig config) { >- RTC_DCHECK_CALLED_SEQUENTIALLY(&sequenced_checker_); >- int64_t bitrate_bps = config.target_data_rate.bps(); >- pacer_->CreateProbeCluster(bitrate_bps); >-} >- >-void PacerController::OnOutstandingData(DataSize in_flight_data) { >- RTC_DCHECK_CALLED_SEQUENTIALLY(&sequenced_checker_); >- pacer_->UpdateOutstandingData(in_flight_data.bytes()); >-} >- >-void PacerController::SetPacerState(bool paused) { >- if (paused && !pacer_paused_) >- pacer_->Pause(); >- else if (!paused && pacer_paused_) >- pacer_->Resume(); >- pacer_paused_ = paused; >-} 
>- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/pacer_controller.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/pacer_controller.h >deleted file mode 100644 >index b778b784890355491474aeeee56312be37498103..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/pacer_controller.h >+++ /dev/null >@@ -1,50 +0,0 @@ >-/* >- * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef MODULES_CONGESTION_CONTROLLER_RTP_PACER_CONTROLLER_H_ >-#define MODULES_CONGESTION_CONTROLLER_RTP_PACER_CONTROLLER_H_ >- >-#include <memory> >- >-#include "api/transport/network_types.h" >-#include "modules/pacing/paced_sender.h" >-#include "rtc_base/sequenced_task_checker.h" >- >-namespace webrtc { >-class Clock; >- >-// Wrapper class to control pacer using task queues. Note that this class is >-// only designed to be used from a single task queue and has no built in >-// concurrency safety. >-// TODO(srte): Integrate this interface directly into PacedSender. 
>-class PacerController { >- public: >- explicit PacerController(PacedSender* pacer); >- ~PacerController(); >- void OnCongestionWindow(DataSize msg); >- void OnNetworkAvailability(NetworkAvailability msg); >- void OnNetworkRouteChange(NetworkRouteChange msg); >- void OnOutstandingData(DataSize in_flight_data); >- void OnPacerConfig(PacerConfig msg); >- void OnProbeClusterConfig(ProbeClusterConfig msg); >- >- private: >- void SetPacerState(bool paused); >- PacedSender* const pacer_; >- >- absl::optional<PacerConfig> current_pacer_config_; >- bool pacer_paused_ = false; >- bool network_available_ = true; >- >- rtc::SequencedTaskChecker sequenced_checker_; >- RTC_DISALLOW_IMPLICIT_CONSTRUCTORS(PacerController); >-}; >-} // namespace webrtc >-#endif // MODULES_CONGESTION_CONTROLLER_RTP_PACER_CONTROLLER_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/send_side_congestion_controller.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/send_side_congestion_controller.cc >index 266a5dc7400cbd11079294f59809cad87bc51584..982380f457aed3c84fe3dd097b1f54dbb77615e0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/send_side_congestion_controller.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/send_side_congestion_controller.cc >@@ -15,19 +15,19 @@ > #include <memory> > #include <vector> > #include "absl/memory/memory.h" >+#include "api/transport/goog_cc_factory.h" > #include "api/transport/network_types.h" >-#include "modules/congestion_controller/congestion_window_pushback_controller.h" >-#include "modules/congestion_controller/goog_cc/include/goog_cc_factory.h" > #include "modules/remote_bitrate_estimator/include/bwe_defines.h" > #include "rtc_base/bind.h" > #include "rtc_base/checks.h" >+#include "rtc_base/event.h" > #include "rtc_base/format_macros.h" > #include "rtc_base/logging.h" >+#include "rtc_base/network/sent_packet.h" 
> #include "rtc_base/numerics/safe_conversions.h" > #include "rtc_base/numerics/safe_minmax.h" > #include "rtc_base/rate_limiter.h" > #include "rtc_base/sequenced_task_checker.h" >-#include "rtc_base/socket.h" > #include "rtc_base/timeutils.h" > #include "system_wrappers/include/field_trial.h" > >@@ -37,70 +37,8 @@ namespace webrtc { > namespace webrtc_cc { > namespace { > using send_side_cc_internal::PeriodicTask; >- >-const char kCwndExperiment[] = "WebRTC-CwndExperiment"; >- >-// When CongestionWindowPushback is enabled, the pacer is oblivious to >-// the congestion window. The relation between outstanding data and >-// the congestion window affects encoder allocations directly. >-const char kCongestionPushbackExperiment[] = "WebRTC-CongestionWindowPushback"; >- >-// When PacerPushbackExperiment is enabled, build-up in the pacer due to >-// the congestion window and/or data spikes reduces encoder allocations. >-const char kPacerPushbackExperiment[] = "WebRTC-PacerPushbackExperiment"; > const int64_t PacerQueueUpdateIntervalMs = 25; > >-bool IsPacerPushbackExperimentEnabled() { >- return webrtc::field_trial::IsEnabled(kPacerPushbackExperiment); >-} >- >-bool IsCongestionWindowPushbackExperimentEnabled() { >- return webrtc::field_trial::IsEnabled(kCongestionPushbackExperiment) && >- webrtc::field_trial::IsEnabled(kCwndExperiment); >-} >- >-std::unique_ptr<CongestionWindowPushbackController> >-MaybeInitalizeCongestionWindowPushbackController() { >- return IsCongestionWindowPushbackExperimentEnabled() >- ? 
absl::make_unique<CongestionWindowPushbackController>() >- : nullptr; >-} >- >-void SortPacketFeedbackVector(std::vector<webrtc::PacketFeedback>* input) { >- std::sort(input->begin(), input->end(), PacketFeedbackComparator()); >-} >- >-PacketResult NetworkPacketFeedbackFromRtpPacketFeedback( >- const webrtc::PacketFeedback& pf) { >- PacketResult feedback; >- if (pf.arrival_time_ms == webrtc::PacketFeedback::kNotReceived) >- feedback.receive_time = Timestamp::PlusInfinity(); >- else >- feedback.receive_time = Timestamp::ms(pf.arrival_time_ms); >- if (pf.send_time_ms != webrtc::PacketFeedback::kNoSendTime) { >- feedback.sent_packet = SentPacket(); >- feedback.sent_packet->sequence_number = pf.long_sequence_number; >- feedback.sent_packet->send_time = Timestamp::ms(pf.send_time_ms); >- feedback.sent_packet->size = DataSize::bytes(pf.payload_size); >- feedback.sent_packet->pacing_info = pf.pacing_info; >- } >- return feedback; >-} >- >-std::vector<PacketResult> PacketResultsFromRtpFeedbackVector( >- const std::vector<PacketFeedback>& feedback_vector) { >- RTC_DCHECK(std::is_sorted(feedback_vector.begin(), feedback_vector.end(), >- PacketFeedbackComparator())); >- >- std::vector<PacketResult> packet_feedbacks; >- packet_feedbacks.reserve(feedback_vector.size()); >- for (const PacketFeedback& rtp_feedback : feedback_vector) { >- auto feedback = NetworkPacketFeedbackFromRtpPacketFeedback(rtp_feedback); >- packet_feedbacks.push_back(feedback); >- } >- return packet_feedbacks; >-} >- > TargetRateConstraints ConvertConstraints(int min_bitrate_bps, > int max_bitrate_bps, > int start_bitrate_bps, >@@ -166,177 +104,6 @@ static PeriodicTask* StartPeriodicTask(rtc::TaskQueue* task_queue, > > } // namespace > >-namespace send_side_cc_internal { >-class ControlHandler { >- public: >- ControlHandler(NetworkChangedObserver* observer, >- PacerController* pacer_controller, >- const Clock* clock); >- >- void PostUpdates(NetworkControlUpdate update); >- >- void 
OnNetworkAvailability(NetworkAvailability msg); >- void OnOutstandingData(DataSize in_flight_data); >- void OnPacerQueueUpdate(TimeDelta expected_queue_time); >- >- absl::optional<TargetTransferRate> last_transfer_rate(); >- >- private: >- void OnNetworkInvalidation(); >- bool GetNetworkParameters(int32_t* estimated_bitrate_bps, >- uint8_t* fraction_loss, >- int64_t* rtt_ms); >- bool IsSendQueueFull() const; >- bool HasNetworkParametersToReportChanged(int64_t bitrate_bps, >- uint8_t fraction_loss, >- int64_t rtt); >- NetworkChangedObserver* observer_ = nullptr; >- PacerController* pacer_controller_; >- >- absl::optional<TargetTransferRate> current_target_rate_msg_; >- bool network_available_ = true; >- int64_t last_reported_target_bitrate_bps_ = 0; >- uint8_t last_reported_fraction_loss_ = 0; >- int64_t last_reported_rtt_ms_ = 0; >- const bool pacer_pushback_experiment_ = false; >- uint32_t min_pushback_target_bitrate_bps_; >- int64_t pacer_expected_queue_ms_ = 0; >- double encoding_rate_ratio_ = 1.0; >- const std::unique_ptr<CongestionWindowPushbackController> >- congestion_window_pushback_controller_; >- >- rtc::SequencedTaskChecker sequenced_checker_; >- RTC_DISALLOW_IMPLICIT_CONSTRUCTORS(ControlHandler); >-}; >- >-ControlHandler::ControlHandler(NetworkChangedObserver* observer, >- PacerController* pacer_controller, >- const Clock* clock) >- : observer_(observer), >- pacer_controller_(pacer_controller), >- pacer_pushback_experiment_(IsPacerPushbackExperimentEnabled()), >- congestion_window_pushback_controller_( >- MaybeInitalizeCongestionWindowPushbackController()) { >- sequenced_checker_.Detach(); >-} >- >-void ControlHandler::PostUpdates(NetworkControlUpdate update) { >- RTC_DCHECK_CALLED_SEQUENTIALLY(&sequenced_checker_); >- if (update.congestion_window) { >- if (congestion_window_pushback_controller_) { >- congestion_window_pushback_controller_->SetDataWindow( >- update.congestion_window.value()); >- } else { >- 
pacer_controller_->OnCongestionWindow(*update.congestion_window); >- } >- } >- if (update.pacer_config) { >- pacer_controller_->OnPacerConfig(*update.pacer_config); >- } >- for (const auto& probe : update.probe_cluster_configs) { >- pacer_controller_->OnProbeClusterConfig(probe); >- } >- if (update.target_rate) { >- current_target_rate_msg_ = *update.target_rate; >- OnNetworkInvalidation(); >- } >-} >- >-void ControlHandler::OnNetworkAvailability(NetworkAvailability msg) { >- RTC_DCHECK_CALLED_SEQUENTIALLY(&sequenced_checker_); >- network_available_ = msg.network_available; >- OnNetworkInvalidation(); >-} >- >-void ControlHandler::OnOutstandingData(DataSize in_flight_data) { >- RTC_DCHECK_CALLED_SEQUENTIALLY(&sequenced_checker_); >- if (congestion_window_pushback_controller_) { >- congestion_window_pushback_controller_->UpdateOutstandingData( >- in_flight_data.bytes()); >- } >- OnNetworkInvalidation(); >-} >- >-void ControlHandler::OnPacerQueueUpdate(TimeDelta expected_queue_time) { >- RTC_DCHECK_CALLED_SEQUENTIALLY(&sequenced_checker_); >- pacer_expected_queue_ms_ = expected_queue_time.ms(); >- OnNetworkInvalidation(); >-} >- >-void ControlHandler::OnNetworkInvalidation() { >- if (!current_target_rate_msg_.has_value()) >- return; >- >- uint32_t target_bitrate_bps = current_target_rate_msg_->target_rate.bps(); >- int64_t rtt_ms = >- current_target_rate_msg_->network_estimate.round_trip_time.ms(); >- float loss_rate_ratio = >- current_target_rate_msg_->network_estimate.loss_rate_ratio; >- >- int loss_ratio_255 = loss_rate_ratio * 255; >- uint8_t fraction_loss = >- rtc::dchecked_cast<uint8_t>(rtc::SafeClamp(loss_ratio_255, 0, 255)); >- >- int64_t probing_interval_ms = >- current_target_rate_msg_->network_estimate.bwe_period.ms(); >- >- if (!network_available_) { >- target_bitrate_bps = 0; >- } else if (congestion_window_pushback_controller_) { >- target_bitrate_bps = >- congestion_window_pushback_controller_->UpdateTargetBitrate( >- target_bitrate_bps); >- } else if 
(!pacer_pushback_experiment_) { >- target_bitrate_bps = IsSendQueueFull() ? 0 : target_bitrate_bps; >- } else { >- int64_t queue_length_ms = pacer_expected_queue_ms_; >- >- if (queue_length_ms == 0) { >- encoding_rate_ratio_ = 1.0; >- } else if (queue_length_ms > 50) { >- double encoding_ratio = 1.0 - queue_length_ms / 1000.0; >- encoding_rate_ratio_ = std::min(encoding_rate_ratio_, encoding_ratio); >- encoding_rate_ratio_ = std::max(encoding_rate_ratio_, 0.0); >- } >- >- target_bitrate_bps *= encoding_rate_ratio_; >- target_bitrate_bps = target_bitrate_bps < 50000 ? 0 : target_bitrate_bps; >- } >- if (HasNetworkParametersToReportChanged(target_bitrate_bps, fraction_loss, >- rtt_ms)) { >- observer_->OnNetworkChanged(target_bitrate_bps, fraction_loss, rtt_ms, >- probing_interval_ms); >- } >-} >-bool ControlHandler::HasNetworkParametersToReportChanged( >- int64_t target_bitrate_bps, >- uint8_t fraction_loss, >- int64_t rtt_ms) { >- bool changed = last_reported_target_bitrate_bps_ != target_bitrate_bps || >- (target_bitrate_bps > 0 && >- (last_reported_fraction_loss_ != fraction_loss || >- last_reported_rtt_ms_ != rtt_ms)); >- if (changed && >- (last_reported_target_bitrate_bps_ == 0 || target_bitrate_bps == 0)) { >- RTC_LOG(LS_INFO) << "Bitrate estimate state changed, BWE: " >- << target_bitrate_bps << " bps."; >- } >- last_reported_target_bitrate_bps_ = target_bitrate_bps; >- last_reported_fraction_loss_ = fraction_loss; >- last_reported_rtt_ms_ = rtt_ms; >- return changed; >-} >- >-bool ControlHandler::IsSendQueueFull() const { >- return pacer_expected_queue_ms_ > PacedSender::kMaxQueueLengthMs; >-} >- >-absl::optional<TargetTransferRate> ControlHandler::last_transfer_rate() { >- RTC_DCHECK_CALLED_SEQUENTIALLY(&sequenced_checker_); >- return current_target_rate_msg_; >-} >-} // namespace send_side_cc_internal >- > SendSideCongestionController::SendSideCongestionController( > const Clock* clock, > rtc::TaskQueue* task_queue, >@@ -352,10 +119,11 @@ 
SendSideCongestionController::SendSideCongestionController( > controller_factory_with_feedback_(controller_factory), > controller_factory_fallback_( > absl::make_unique<GoogCcNetworkControllerFactory>(event_log)), >- pacer_controller_(absl::make_unique<PacerController>(pacer_)), > process_interval_(controller_factory_fallback_->GetProcessInterval()), > last_report_block_time_(Timestamp::ms(clock_->TimeInMilliseconds())), > observer_(nullptr), >+ reset_feedback_on_route_change_( >+ !field_trial::IsEnabled("WebRTC-Bwe-NoFeedbackReset")), > send_side_bwe_with_overhead_( > webrtc::field_trial::IsEnabled("WebRTC-SendSideBwe-WithOverhead")), > transport_overhead_bytes_per_packet_(0), >@@ -398,8 +166,8 @@ void SendSideCongestionController::MaybeRecreateControllers() { > if (!network_available_ || !observer_) > return; > if (!control_handler_) { >- control_handler_ = absl::make_unique<send_side_cc_internal::ControlHandler>( >- observer_, pacer_controller_.get(), clock_); >+ control_handler_ = >+ absl::make_unique<CongestionControlHandler>(observer_, pacer_); > } > > initial_config_.constraints.at_time = >@@ -489,8 +257,9 @@ void SendSideCongestionController::OnNetworkRouteChanged( > int start_bitrate_bps, > int min_bitrate_bps, > int max_bitrate_bps) { >- transport_feedback_adapter_.SetNetworkIds(network_route.local_network_id, >- network_route.remote_network_id); >+ if (reset_feedback_on_route_change_) >+ transport_feedback_adapter_.SetNetworkIds(network_route.local_network_id, >+ network_route.remote_network_id); > transport_overhead_bytes_per_packet_ = network_route.packet_overhead; > > NetworkRouteChange msg; >@@ -505,7 +274,7 @@ void SendSideCongestionController::OnNetworkRouteChanged( > } else { > UpdateInitialConstraints(msg.constraints); > } >- pacer_controller_->OnNetworkRouteChange(msg); >+ pacer_->UpdateOutstandingData(0); > }); > } > >@@ -572,7 +341,6 @@ void SendSideCongestionController::SignalNetworkState(NetworkState state) { > network_available_ = 
msg.network_available; > if (controller_) { > control_handler_->PostUpdates(controller_->OnNetworkAvailability(msg)); >- pacer_controller_->OnNetworkAvailability(msg); > control_handler_->OnNetworkAvailability(msg); > } else { > MaybeCreateControllers(); >@@ -582,27 +350,16 @@ void SendSideCongestionController::SignalNetworkState(NetworkState state) { > > void SendSideCongestionController::OnSentPacket( > const rtc::SentPacket& sent_packet) { >- // We're not interested in packets without an id, which may be stun packets, >- // etc, sent on the same transport. >- if (sent_packet.packet_id == -1) >- return; >- transport_feedback_adapter_.OnSentPacket(sent_packet.packet_id, >- sent_packet.send_time_ms); >- MaybeUpdateOutstandingData(); >- auto packet = transport_feedback_adapter_.GetPacket(sent_packet.packet_id); >- if (packet.has_value()) { >- SentPacket msg; >- msg.size = DataSize::bytes(packet->payload_size); >- msg.send_time = Timestamp::ms(packet->send_time_ms); >- msg.sequence_number = packet->long_sequence_number; >- msg.data_in_flight = >- DataSize::bytes(transport_feedback_adapter_.GetOutstandingBytes()); >- task_queue_->PostTask([this, msg]() { >+ absl::optional<SentPacket> packet_msg = >+ transport_feedback_adapter_.ProcessSentPacket(sent_packet); >+ if (packet_msg) { >+ task_queue_->PostTask([this, packet_msg]() { > RTC_DCHECK_RUN_ON(task_queue_); > if (controller_) >- control_handler_->PostUpdates(controller_->OnSentPacket(msg)); >+ control_handler_->PostUpdates(controller_->OnSentPacket(*packet_msg)); > }); > } >+ MaybeUpdateOutstandingData(); > } > > void SendSideCongestionController::OnRttUpdate(int64_t avg_rtt_ms, >@@ -687,39 +444,24 @@ void SendSideCongestionController::AddPacket( > void SendSideCongestionController::OnTransportFeedback( > const rtcp::TransportFeedback& feedback) { > RTC_DCHECK_RUNS_SERIALIZED(&worker_race_); >- int64_t feedback_time_ms = clock_->TimeInMilliseconds(); > >- DataSize prior_in_flight = >- 
DataSize::bytes(transport_feedback_adapter_.GetOutstandingBytes()); >- transport_feedback_adapter_.OnTransportFeedback(feedback); >- MaybeUpdateOutstandingData(); >- >- std::vector<PacketFeedback> feedback_vector = >- transport_feedback_adapter_.GetTransportFeedbackVector(); >- SortPacketFeedbackVector(&feedback_vector); >- >- if (!feedback_vector.empty()) { >- TransportPacketsFeedback msg; >- msg.packet_feedbacks = PacketResultsFromRtpFeedbackVector(feedback_vector); >- msg.feedback_time = Timestamp::ms(feedback_time_ms); >- msg.prior_in_flight = prior_in_flight; >- msg.data_in_flight = >- DataSize::bytes(transport_feedback_adapter_.GetOutstandingBytes()); >- task_queue_->PostTask([this, msg]() { >+ absl::optional<TransportPacketsFeedback> feedback_msg = >+ transport_feedback_adapter_.ProcessTransportFeedback(feedback); >+ if (feedback_msg) { >+ task_queue_->PostTask([this, feedback_msg]() { > RTC_DCHECK_RUN_ON(task_queue_); > if (controller_) > control_handler_->PostUpdates( >- controller_->OnTransportPacketsFeedback(msg)); >+ controller_->OnTransportPacketsFeedback(*feedback_msg)); > }); > } >+ MaybeUpdateOutstandingData(); > } > > void SendSideCongestionController::MaybeUpdateOutstandingData() { >- DataSize in_flight_data = >- DataSize::bytes(transport_feedback_adapter_.GetOutstandingBytes()); >+ DataSize in_flight_data = transport_feedback_adapter_.GetOutstandingData(); > task_queue_->PostTask([this, in_flight_data]() { > RTC_DCHECK_RUN_ON(task_queue_); >- pacer_controller_->OnOutstandingData(in_flight_data); > if (control_handler_) > control_handler_->OnOutstandingData(in_flight_data); > }); >@@ -740,7 +482,7 @@ void SendSideCongestionController::PostPeriodicTasksForTest() { > } > > void SendSideCongestionController::WaitOnTasksForTest() { >- rtc::Event event(false, false); >+ rtc::Event event; > task_queue_->PostTask([&event]() { event.Set(); }); > event.Wait(rtc::Event::kForever); > } >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/send_side_congestion_controller_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/send_side_congestion_controller_unittest.cc >index d610afe13d0f3e2145187062788a693f6f4d7de6..6ed7a49553f1924378ab2e78000d225c1951f84c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/send_side_congestion_controller_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/send_side_congestion_controller_unittest.cc >@@ -17,7 +17,7 @@ > #include "modules/remote_bitrate_estimator/include/bwe_defines.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > #include "modules/rtp_rtcp/source/rtcp_packet/transport_feedback.h" >-#include "rtc_base/socket.h" >+#include "rtc_base/network/sent_packet.h" > #include "system_wrappers/include/clock.h" > #include "test/field_trial.h" > #include "test/gmock.h" >@@ -108,8 +108,11 @@ class SendSideCongestionControllerTest : public ::testing::Test { > controller_->AddPacket(ssrc, packet_feedback.sequence_number, > packet_feedback.payload_size, > packet_feedback.pacing_info); >+ rtc::PacketInfo packet_info; >+ packet_info.included_in_feedback = true; > controller_->OnSentPacket(rtc::SentPacket(packet_feedback.sequence_number, >- packet_feedback.send_time_ms)); >+ packet_feedback.send_time_ms, >+ packet_info)); > } > > // Allows us to track the target bitrate, without prescribing the exact >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/send_time_history.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/send_time_history.cc >index 73682cd5b2426545caeb513468b4de3a9908c525..110dac399321248869e1a9f81502a6fd4132e27e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/send_time_history.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/send_time_history.cc >@@ -15,6 +15,7 @@ > > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > #include "rtc_base/checks.h" >+#include "rtc_base/logging.h" > #include "system_wrappers/include/clock.h" > > namespace webrtc { >@@ -41,8 +42,19 @@ void SendTimeHistory::AddAndRemoveOld(const PacketFeedback& packet) { > PacketFeedback packet_copy = packet; > packet_copy.long_sequence_number = unwrapped_seq_num; > history_.insert(std::make_pair(unwrapped_seq_num, packet_copy)); >- if (packet.send_time_ms >= 0) >+ if (packet.send_time_ms >= 0) { > AddPacketBytes(packet_copy); >+ last_send_time_ms_ = std::max(last_send_time_ms_, packet.send_time_ms); >+ } >+} >+ >+void SendTimeHistory::AddUntracked(size_t packet_size, int64_t send_time_ms) { >+ if (send_time_ms < last_send_time_ms_) { >+ RTC_LOG(LS_WARNING) << "ignoring untracked data for out of order packet."; >+ } >+ pending_untracked_size_ += packet_size; >+ last_untracked_send_time_ms_ = >+ std::max(last_untracked_send_time_ms_, send_time_ms); > } > > bool SendTimeHistory::OnSentPacket(uint16_t sequence_number, >@@ -53,8 +65,17 @@ bool SendTimeHistory::OnSentPacket(uint16_t sequence_number, > return false; > bool packet_retransmit = it->second.send_time_ms >= 0; > it->second.send_time_ms = send_time_ms; >+ last_send_time_ms_ = std::max(last_send_time_ms_, send_time_ms); > if (!packet_retransmit) > AddPacketBytes(it->second); >+ if (pending_untracked_size_ > 0) { >+ if (send_time_ms < last_untracked_send_time_ms_) >+ RTC_LOG(LS_WARNING) >+ << "appending acknowledged data for out of order packet. 
(Diff: " >+ << last_untracked_send_time_ms_ - send_time_ms << " ms.)"; >+ it->second.unacknowledged_data += pending_untracked_size_; >+ pending_untracked_size_ = 0; >+ } > return true; > } > >@@ -90,13 +111,13 @@ bool SendTimeHistory::GetFeedback(PacketFeedback* packet_feedback, > return true; > } > >-size_t SendTimeHistory::GetOutstandingBytes(uint16_t local_net_id, >- uint16_t remote_net_id) const { >+DataSize SendTimeHistory::GetOutstandingData(uint16_t local_net_id, >+ uint16_t remote_net_id) const { > auto it = in_flight_bytes_.find({local_net_id, remote_net_id}); > if (it != in_flight_bytes_.end()) { >- return it->second; >+ return DataSize::bytes(it->second); > } else { >- return 0; >+ return DataSize::Zero(); > } > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/send_time_history.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/send_time_history.h >index 7cc6ad76414880491178e55d6b664b1f1d817894..656b7e16a34b1dfdab877287d21179ffd1fc2b5f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/send_time_history.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/send_time_history.h >@@ -14,6 +14,7 @@ > #include <map> > #include <utility> > >+#include "api/units/data_size.h" > #include "modules/include/module_common_types.h" > #include "rtc_base/constructormagic.h" > >@@ -29,6 +30,8 @@ class SendTimeHistory { > // Cleanup old entries, then add new packet info with provided parameters. > void AddAndRemoveOld(const PacketFeedback& packet); > >+ void AddUntracked(size_t packet_size, int64_t send_time_ms); >+ > // Updates packet info identified by |sequence_number| with |send_time_ms|. > // Return false if not found. > bool OnSentPacket(uint16_t sequence_number, int64_t send_time_ms); >@@ -41,8 +44,8 @@ class SendTimeHistory { > // thus be non-null and have the sequence_number field set. 
> bool GetFeedback(PacketFeedback* packet_feedback, bool remove); > >- size_t GetOutstandingBytes(uint16_t local_net_id, >- uint16_t remote_net_id) const; >+ DataSize GetOutstandingData(uint16_t local_net_id, >+ uint16_t remote_net_id) const; > > private: > using RemoteAndLocalNetworkId = std::pair<uint16_t, uint16_t>; >@@ -52,6 +55,9 @@ class SendTimeHistory { > void UpdateAckedSeqNum(int64_t acked_seq_num); > const Clock* const clock_; > const int64_t packet_age_limit_ms_; >+ size_t pending_untracked_size_ = 0; >+ int64_t last_send_time_ms_ = -1; >+ int64_t last_untracked_send_time_ms_ = -1; > SequenceNumberUnwrapper seq_num_unwrapper_; > std::map<int64_t, PacketFeedback> history_; > absl::optional<int64_t> last_ack_seq_num_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/transport_feedback_adapter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/transport_feedback_adapter.cc >index d1b4855e369d4d026a3c9d53f33ae3210745b784..73d95231843b74db4e65379e255e0bb6f241697e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/transport_feedback_adapter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/transport_feedback_adapter.cc >@@ -19,8 +19,28 @@ > #include "rtc_base/numerics/mod_ops.h" > > namespace webrtc { >-namespace webrtc_cc { >+namespace { >+void SortPacketFeedbackVector(std::vector<webrtc::PacketFeedback>* input) { >+ std::sort(input->begin(), input->end(), PacketFeedbackComparator()); >+} > >+PacketResult NetworkPacketFeedbackFromRtpPacketFeedback( >+ const webrtc::PacketFeedback& pf) { >+ PacketResult feedback; >+ if (pf.arrival_time_ms == webrtc::PacketFeedback::kNotReceived) { >+ feedback.receive_time = Timestamp::PlusInfinity(); >+ } else { >+ feedback.receive_time = Timestamp::ms(pf.arrival_time_ms); >+ } >+ feedback.sent_packet.sequence_number = pf.long_sequence_number; >+ feedback.sent_packet.send_time = 
Timestamp::ms(pf.send_time_ms); >+ feedback.sent_packet.size = DataSize::bytes(pf.payload_size); >+ feedback.sent_packet.pacing_info = pf.pacing_info; >+ feedback.sent_packet.prior_unacked_data = >+ DataSize::bytes(pf.unacknowledged_data); >+ return feedback; >+} >+} // namespace > const int64_t kNoTimestamp = -1; > const int64_t kSendTimeHistoryWindowMs = 60000; > const int64_t kBaseTimestampScaleFactor = >@@ -77,16 +97,59 @@ void TransportFeedbackAdapter::AddPacket(uint32_t ssrc, > } > } > >-void TransportFeedbackAdapter::OnSentPacket(uint16_t sequence_number, >- int64_t send_time_ms) { >+absl::optional<SentPacket> TransportFeedbackAdapter::ProcessSentPacket( >+ const rtc::SentPacket& sent_packet) { > rtc::CritScope cs(&lock_); >- send_time_history_.OnSentPacket(sequence_number, send_time_ms); >+ // TODO(srte): Only use one way to indicate that packet feedback is used. >+ if (sent_packet.info.included_in_feedback || sent_packet.packet_id != -1) { >+ send_time_history_.OnSentPacket(sent_packet.packet_id, >+ sent_packet.send_time_ms); >+ absl::optional<PacketFeedback> packet = >+ send_time_history_.GetPacket(sent_packet.packet_id); >+ if (packet) { >+ SentPacket msg; >+ msg.size = DataSize::bytes(packet->payload_size); >+ msg.send_time = Timestamp::ms(packet->send_time_ms); >+ msg.sequence_number = packet->long_sequence_number; >+ msg.prior_unacked_data = DataSize::bytes(packet->unacknowledged_data); >+ msg.data_in_flight = >+ send_time_history_.GetOutstandingData(local_net_id_, remote_net_id_); >+ return msg; >+ } >+ } else if (sent_packet.info.included_in_allocation) { >+ send_time_history_.AddUntracked(sent_packet.info.packet_size_bytes, >+ sent_packet.send_time_ms); >+ } >+ return absl::nullopt; > } > >-absl::optional<PacketFeedback> TransportFeedbackAdapter::GetPacket( >- uint16_t sequence_number) const { >- rtc::CritScope cs(&lock_); >- return send_time_history_.GetPacket(sequence_number); >+absl::optional<TransportPacketsFeedback> 
>+TransportFeedbackAdapter::ProcessTransportFeedback( >+ const rtcp::TransportFeedback& feedback) { >+ int64_t feedback_time_ms = clock_->TimeInMilliseconds(); >+ DataSize prior_in_flight = GetOutstandingData(); >+ OnTransportFeedback(feedback); >+ std::vector<PacketFeedback> feedback_vector = last_packet_feedback_vector_; >+ if (feedback_vector.empty()) >+ return absl::nullopt; >+ >+ SortPacketFeedbackVector(&feedback_vector); >+ TransportPacketsFeedback msg; >+ for (const PacketFeedback& rtp_feedback : feedback_vector) { >+ if (rtp_feedback.send_time_ms != PacketFeedback::kNoSendTime) { >+ auto feedback = NetworkPacketFeedbackFromRtpPacketFeedback(rtp_feedback); >+ msg.packet_feedbacks.push_back(feedback); >+ } else if (rtp_feedback.arrival_time_ms == PacketFeedback::kNotReceived) { >+ msg.sendless_arrival_times.push_back(Timestamp::PlusInfinity()); >+ } else { >+ msg.sendless_arrival_times.push_back( >+ Timestamp::ms(rtp_feedback.arrival_time_ms)); >+ } >+ } >+ msg.feedback_time = Timestamp::ms(feedback_time_ms); >+ msg.prior_in_flight = prior_in_flight; >+ msg.data_in_flight = GetOutstandingData(); >+ return msg; > } > > void TransportFeedbackAdapter::SetNetworkIds(uint16_t local_id, >@@ -96,6 +159,11 @@ void TransportFeedbackAdapter::SetNetworkIds(uint16_t local_id, > remote_net_id_ = remote_id; > } > >+DataSize TransportFeedbackAdapter::GetOutstandingData() const { >+ rtc::CritScope cs(&lock_); >+ return send_time_history_.GetOutstandingData(local_net_id_, remote_net_id_); >+} >+ > std::vector<PacketFeedback> TransportFeedbackAdapter::GetPacketFeedbackVector( > const rtcp::TransportFeedback& feedback) { > int64_t timestamp_us = feedback.GetBaseTimeUs(); >@@ -184,10 +252,4 @@ std::vector<PacketFeedback> > TransportFeedbackAdapter::GetTransportFeedbackVector() const { > return last_packet_feedback_vector_; > } >- >-size_t TransportFeedbackAdapter::GetOutstandingBytes() const { >- rtc::CritScope cs(&lock_); >- return 
send_time_history_.GetOutstandingBytes(local_net_id_, remote_net_id_); >-} >-} // namespace webrtc_cc > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/transport_feedback_adapter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/transport_feedback_adapter.h >index e2477ca31f3443faa1d2341b02b290e31a1b308f..1741d6d66a0c413ad05605bff95a0162da041069 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/transport_feedback_adapter.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/transport_feedback_adapter.h >@@ -17,6 +17,7 @@ > #include "api/transport/network_types.h" > #include "modules/congestion_controller/rtp/send_time_history.h" > #include "rtc_base/criticalsection.h" >+#include "rtc_base/network/sent_packet.h" > #include "rtc_base/thread_annotations.h" > #include "rtc_base/thread_checker.h" > #include "system_wrappers/include/clock.h" >@@ -29,7 +30,6 @@ namespace rtcp { > class TransportFeedback; > } // namespace rtcp > >-namespace webrtc_cc { > class TransportFeedbackAdapter { > public: > explicit TransportFeedbackAdapter(const Clock* clock); >@@ -42,22 +42,22 @@ class TransportFeedbackAdapter { > uint16_t sequence_number, > size_t length, > const PacedPacketInfo& pacing_info); >- void OnSentPacket(uint16_t sequence_number, int64_t send_time_ms); > >- // TODO(holmer): This method should return DelayBasedBwe::Result so that we >- // can get rid of the dependency on BitrateController. Requires changes >- // to the CongestionController interface. 
>- void OnTransportFeedback(const rtcp::TransportFeedback& feedback); >- std::vector<PacketFeedback> GetTransportFeedbackVector() const; >- absl::optional<PacketFeedback> GetPacket(uint16_t sequence_number) const; >+ absl::optional<SentPacket> ProcessSentPacket( >+ const rtc::SentPacket& sent_packet); > >- void SetTransportOverhead(int transport_overhead_bytes_per_packet); >+ absl::optional<TransportPacketsFeedback> ProcessTransportFeedback( >+ const rtcp::TransportFeedback& feedback); >+ >+ std::vector<PacketFeedback> GetTransportFeedbackVector() const; > > void SetNetworkIds(uint16_t local_id, uint16_t remote_id); > >- size_t GetOutstandingBytes() const; >+ DataSize GetOutstandingData() const; > > private: >+ void OnTransportFeedback(const rtcp::TransportFeedback& feedback); >+ > std::vector<PacketFeedback> GetPacketFeedbackVector( > const rtcp::TransportFeedback& feedback); > >@@ -75,7 +75,6 @@ class TransportFeedbackAdapter { > RTC_GUARDED_BY(&observers_lock_); > }; > >-} // namespace webrtc_cc > } // namespace webrtc > > #endif // MODULES_CONGESTION_CONTROLLER_RTP_TRANSPORT_FEEDBACK_ADAPTER_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/transport_feedback_adapter_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/transport_feedback_adapter_unittest.cc >index 70087678a239da39b1716802f2e9353bfc90ff50..d1404df63f418d72137ebd4e6a6cc411795a2f2c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/transport_feedback_adapter_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/rtp/transport_feedback_adapter_unittest.cc >@@ -69,8 +69,9 @@ class TransportFeedbackAdapterTest : public ::testing::Test { > adapter_->AddPacket(kSsrc, packet_feedback.sequence_number, > packet_feedback.payload_size, > packet_feedback.pacing_info); >- adapter_->OnSentPacket(packet_feedback.sequence_number, >- 
packet_feedback.send_time_ms); >+ adapter_->ProcessSentPacket(rtc::SentPacket(packet_feedback.sequence_number, >+ packet_feedback.send_time_ms, >+ rtc::PacketInfo())); > } > > static constexpr uint32_t kSsrc = 8492; >@@ -100,7 +101,7 @@ TEST_F(TransportFeedbackAdapterTest, ObserverSanity) { > } > > EXPECT_CALL(mock, OnPacketFeedbackVector(_)).Times(1); >- adapter_->OnTransportFeedback(feedback); >+ adapter_->ProcessTransportFeedback(feedback); > > adapter_->DeRegisterPacketFeedbackObserver(&mock); > >@@ -115,7 +116,7 @@ TEST_F(TransportFeedbackAdapterTest, ObserverSanity) { > EXPECT_TRUE(feedback.AddReceivedPacket(new_packet.sequence_number, > new_packet.arrival_time_ms * 1000)); > EXPECT_CALL(mock, OnPacketFeedbackVector(_)).Times(0); >- adapter_->OnTransportFeedback(second_feedback); >+ adapter_->ProcessTransportFeedback(second_feedback); > } > > #if RTC_DCHECK_IS_ON && GTEST_HAS_DEATH_TEST && !defined(WEBRTC_ANDROID) >@@ -156,7 +157,7 @@ TEST_F(TransportFeedbackAdapterTest, AdaptsFeedbackAndPopulatesSendTimes) { > > feedback.Build(); > >- adapter_->OnTransportFeedback(feedback); >+ adapter_->ProcessTransportFeedback(feedback); > ComparePacketFeedbackVectors(packets, adapter_->GetTransportFeedbackVector()); > } > >@@ -189,7 +190,7 @@ TEST_F(TransportFeedbackAdapterTest, FeedbackVectorReportsUnreceived) { > > feedback.Build(); > >- adapter_->OnTransportFeedback(feedback); >+ adapter_->ProcessTransportFeedback(feedback); > ComparePacketFeedbackVectors(sent_packets, > adapter_->GetTransportFeedbackVector()); > } >@@ -233,7 +234,7 @@ TEST_F(TransportFeedbackAdapterTest, HandlesDroppedPackets) { > expected_packets[i].pacing_info = PacedPacketInfo(); > } > >- adapter_->OnTransportFeedback(feedback); >+ adapter_->ProcessTransportFeedback(feedback); > ComparePacketFeedbackVectors(expected_packets, > adapter_->GetTransportFeedbackVector()); > } >@@ -269,7 +270,7 @@ TEST_F(TransportFeedbackAdapterTest, SendTimeWrapsBothWays) { > std::vector<PacketFeedback> 
expected_packets; > expected_packets.push_back(packets[i]); > >- adapter_->OnTransportFeedback(*feedback.get()); >+ adapter_->ProcessTransportFeedback(*feedback.get()); > ComparePacketFeedbackVectors(expected_packets, > adapter_->GetTransportFeedbackVector()); > } >@@ -298,7 +299,7 @@ TEST_F(TransportFeedbackAdapterTest, HandlesArrivalReordering) { > // Adapter keeps the packets ordered by sequence number (which is itself > // assigned by the order of transmission). Reordering by some other criteria, > // eg. arrival time, is up to the observers. >- adapter_->OnTransportFeedback(feedback); >+ adapter_->ProcessTransportFeedback(feedback); > ComparePacketFeedbackVectors(packets, adapter_->GetTransportFeedbackVector()); > } > >@@ -362,7 +363,7 @@ TEST_F(TransportFeedbackAdapterTest, TimestampDeltas) { > std::vector<PacketFeedback> received_feedback; > > EXPECT_TRUE(feedback.get() != nullptr); >- adapter_->OnTransportFeedback(*feedback.get()); >+ adapter_->ProcessTransportFeedback(*feedback.get()); > ComparePacketFeedbackVectors(sent_packets, > adapter_->GetTransportFeedbackVector()); > >@@ -377,7 +378,7 @@ TEST_F(TransportFeedbackAdapterTest, TimestampDeltas) { > rtcp::TransportFeedback::ParseFrom(raw_packet.data(), raw_packet.size()); > > EXPECT_TRUE(feedback.get() != nullptr); >- adapter_->OnTransportFeedback(*feedback.get()); >+ adapter_->ProcessTransportFeedback(*feedback.get()); > { > std::vector<PacketFeedback> expected_packets; > expected_packets.push_back(packet_feedback); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/send_side_congestion_controller.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/send_side_congestion_controller.cc >index f8b3155bf9dc66d90d4157296edba75cffb2563c..147f103e7869771d26e99f3c5f1af67fd16c17d5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/send_side_congestion_controller.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/send_side_congestion_controller.cc >@@ -18,16 +18,16 @@ > > #include "absl/memory/memory.h" > #include "modules/bitrate_controller/include/bitrate_controller.h" >-#include "modules/congestion_controller/congestion_window_pushback_controller.h" > #include "modules/congestion_controller/goog_cc/acknowledged_bitrate_estimator.h" >+#include "modules/congestion_controller/goog_cc/congestion_window_pushback_controller.h" > #include "modules/congestion_controller/goog_cc/probe_controller.h" > #include "modules/remote_bitrate_estimator/include/bwe_defines.h" > #include "rtc_base/checks.h" > #include "rtc_base/format_macros.h" > #include "rtc_base/logging.h" >+#include "rtc_base/network/sent_packet.h" > #include "rtc_base/numerics/safe_conversions.h" > #include "rtc_base/rate_limiter.h" >-#include "rtc_base/socket.h" > #include "rtc_base/timeutils.h" > #include "system_wrappers/include/field_trial.h" > >@@ -143,6 +143,7 @@ SendSideCongestionController::SendSideCongestionController( > pause_pacer_(false), > pacer_paused_(false), > min_bitrate_bps_(congestion_controller::GetMinBitrateBps()), >+ probe_bitrate_estimator_(new ProbeBitrateEstimator(event_log_)), > delay_based_bwe_(new DelayBasedBwe(event_log_)), > in_cwnd_experiment_(CwndExperimentEnabled()), > accepted_queue_ms_(kDefaultAcceptedQueueMs), >@@ -153,7 +154,7 @@ SendSideCongestionController::SendSideCongestionController( > pacer_pushback_experiment_(IsPacerPushbackExperimentEnabled()), > congestion_window_pushback_controller_( > MaybeCreateCongestionWindowPushbackController()) { >- delay_based_bwe_->SetMinBitrate(min_bitrate_bps_); >+ delay_based_bwe_->SetMinBitrate(DataRate::bps(min_bitrate_bps_)); > if (in_cwnd_experiment_ && > !ReadCwndExperimentParameter(&accepted_queue_ms_)) { > RTC_LOG(LS_WARNING) << "Failed to parse parameters for CwndExperiment " >@@ -164,6 +165,23 @@ SendSideCongestionController::SendSideCongestionController( > > 
SendSideCongestionController::~SendSideCongestionController() {} > >+void SendSideCongestionController::EnableCongestionWindowPushback( >+ int64_t accepted_queue_ms, >+ uint32_t min_pushback_target_bitrate_bps) { >+ RTC_DCHECK(!congestion_window_pushback_controller_) >+ << "The congestion pushback is already enabled."; >+ RTC_CHECK_GE(accepted_queue_ms, 0) >+ << "Accepted must be greater than or equal to 0."; >+ RTC_CHECK_GE(min_pushback_target_bitrate_bps, 0) >+ << "Min pushback target bitrate must be greater than or equal to 0."; >+ >+ in_cwnd_experiment_ = true; >+ accepted_queue_ms_ = accepted_queue_ms; >+ congestion_window_pushback_controller_ = >+ absl::make_unique<CongestionWindowPushbackController>( >+ min_pushback_target_bitrate_bps); >+} >+ > void SendSideCongestionController::RegisterPacketFeedbackObserver( > PacketFeedbackObserver* observer) { > transport_feedback_adapter_.RegisterPacketFeedbackObserver(observer); >@@ -204,9 +222,9 @@ void SendSideCongestionController::SetBweBitrates(int min_bitrate_bps, > { > rtc::CritScope cs(&bwe_lock_); > if (start_bitrate_bps > 0) >- delay_based_bwe_->SetStartBitrate(start_bitrate_bps); >+ delay_based_bwe_->SetStartBitrate(DataRate::bps(start_bitrate_bps)); > min_bitrate_bps_ = min_bitrate_bps; >- delay_based_bwe_->SetMinBitrate(min_bitrate_bps_); >+ delay_based_bwe_->SetMinBitrate(DataRate::bps(min_bitrate_bps_)); > } > MaybeTriggerOnNetworkChanged(); > } >@@ -241,10 +259,13 @@ void SendSideCongestionController::OnNetworkRouteChanged( > rtc::CritScope cs(&bwe_lock_); > transport_overhead_bytes_per_packet_ = network_route.packet_overhead; > min_bitrate_bps_ = min_bitrate_bps; >+ probe_bitrate_estimator_.reset(new ProbeBitrateEstimator(event_log_)); > delay_based_bwe_.reset(new DelayBasedBwe(event_log_)); > acknowledged_bitrate_estimator_.reset(new AcknowledgedBitrateEstimator()); >- delay_based_bwe_->SetStartBitrate(bitrate_bps); >- delay_based_bwe_->SetMinBitrate(min_bitrate_bps); >+ if (bitrate_bps > 0) { >+ 
delay_based_bwe_->SetStartBitrate(DataRate::bps(bitrate_bps)); >+ } >+ delay_based_bwe_->SetMinBitrate(DataRate::bps(min_bitrate_bps)); > } > { > rtc::CritScope cs(&probe_lock_); >@@ -321,7 +342,7 @@ void SendSideCongestionController::OnSentPacket( > void SendSideCongestionController::OnRttUpdate(int64_t avg_rtt_ms, > int64_t max_rtt_ms) { > rtc::CritScope cs(&bwe_lock_); >- delay_based_bwe_->OnRttUpdate(avg_rtt_ms); >+ delay_based_bwe_->OnRttUpdate(TimeDelta::ms(avg_rtt_ms)); > } > > int64_t SendSideCongestionController::TimeUntilNextProcess() { >@@ -397,9 +418,16 @@ void SendSideCongestionController::OnTransportFeedback( > DelayBasedBwe::Result result; > { > rtc::CritScope cs(&bwe_lock_); >+ for (const auto& packet : feedback_vector) { >+ if (packet.send_time_ms != PacketFeedback::kNoSendTime && >+ packet.pacing_info.probe_cluster_id != PacedPacketInfo::kNotAProbe) { >+ probe_bitrate_estimator_->HandleProbeAndEstimateBitrate(packet); >+ } >+ } > result = delay_based_bwe_->IncomingPacketFeedbackVector( >- feedback_vector, acknowledged_bitrate_estimator_->bitrate_bps(), >- clock_->TimeInMilliseconds()); >+ feedback_vector, acknowledged_bitrate_estimator_->bitrate(), >+ probe_bitrate_estimator_->FetchAndResetLastEstimatedBitrate(), >+ Timestamp::ms(clock_->TimeInMilliseconds())); > } > if (result.updated) { > bitrate_controller_->OnDelayBasedBweResult(result); >@@ -501,7 +529,7 @@ void SendSideCongestionController::MaybeTriggerOnNetworkChanged() { > int64_t probing_interval_ms; > { > rtc::CritScope cs(&bwe_lock_); >- probing_interval_ms = delay_based_bwe_->GetExpectedBwePeriodMs(); >+ probing_interval_ms = delay_based_bwe_->GetExpectedBwePeriod().ms(); > } > { > rtc::CritScope cs(&observer_lock_); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/send_side_congestion_controller_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/send_side_congestion_controller_unittest.cc >index 
3653b8c22f6cc7a7ac572045b7013baa6191dc0c..5f58f21cdf8946c4ac8fa885454606c4b7563ef9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/send_side_congestion_controller_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/send_side_congestion_controller_unittest.cc >@@ -18,7 +18,7 @@ > #include "modules/remote_bitrate_estimator/include/bwe_defines.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > #include "modules/rtp_rtcp/source/rtcp_packet/transport_feedback.h" >-#include "rtc_base/socket.h" >+#include "rtc_base/network/sent_packet.h" > #include "system_wrappers/include/clock.h" > #include "test/field_trial.h" > #include "test/gmock.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/transport_feedback_adapter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/transport_feedback_adapter.cc >index 3baa3ad5a53c9ae2d58956ca24a842231784e842..b8b620d008b302f3a51873a49bf1efddae572aa6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/transport_feedback_adapter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/transport_feedback_adapter.cc >@@ -26,7 +26,8 @@ const int64_t kBaseTimestampScaleFactor = > rtcp::TransportFeedback::kDeltaScaleFactor * (1 << 8); > const int64_t kBaseTimestampRangeSizeUs = kBaseTimestampScaleFactor * (1 << 24); > >-TransportFeedbackAdapter::TransportFeedbackAdapter(const Clock* clock) >+LegacyTransportFeedbackAdapter::LegacyTransportFeedbackAdapter( >+ const Clock* clock) > : send_time_history_(clock, kSendTimeHistoryWindowMs), > clock_(clock), > current_offset_ms_(kNoTimestamp), >@@ -34,11 +35,11 @@ TransportFeedbackAdapter::TransportFeedbackAdapter(const Clock* clock) > local_net_id_(0), > remote_net_id_(0) {} > >-TransportFeedbackAdapter::~TransportFeedbackAdapter() { 
>+LegacyTransportFeedbackAdapter::~LegacyTransportFeedbackAdapter() { > RTC_DCHECK(observers_.empty()); > } > >-void TransportFeedbackAdapter::RegisterPacketFeedbackObserver( >+void LegacyTransportFeedbackAdapter::RegisterPacketFeedbackObserver( > PacketFeedbackObserver* observer) { > rtc::CritScope cs(&observers_lock_); > RTC_DCHECK(observer); >@@ -47,7 +48,7 @@ void TransportFeedbackAdapter::RegisterPacketFeedbackObserver( > observers_.push_back(observer); > } > >-void TransportFeedbackAdapter::DeRegisterPacketFeedbackObserver( >+void LegacyTransportFeedbackAdapter::DeRegisterPacketFeedbackObserver( > PacketFeedbackObserver* observer) { > rtc::CritScope cs(&observers_lock_); > RTC_DCHECK(observer); >@@ -56,10 +57,11 @@ void TransportFeedbackAdapter::DeRegisterPacketFeedbackObserver( > observers_.erase(it); > } > >-void TransportFeedbackAdapter::AddPacket(uint32_t ssrc, >- uint16_t sequence_number, >- size_t length, >- const PacedPacketInfo& pacing_info) { >+void LegacyTransportFeedbackAdapter::AddPacket( >+ uint32_t ssrc, >+ uint16_t sequence_number, >+ size_t length, >+ const PacedPacketInfo& pacing_info) { > { > rtc::CritScope cs(&lock_); > const int64_t creation_time_ms = clock_->TimeInMilliseconds(); >@@ -76,20 +78,21 @@ void TransportFeedbackAdapter::AddPacket(uint32_t ssrc, > } > } > >-void TransportFeedbackAdapter::OnSentPacket(uint16_t sequence_number, >- int64_t send_time_ms) { >+void LegacyTransportFeedbackAdapter::OnSentPacket(uint16_t sequence_number, >+ int64_t send_time_ms) { > rtc::CritScope cs(&lock_); > send_time_history_.OnSentPacket(sequence_number, send_time_ms); > } > >-void TransportFeedbackAdapter::SetNetworkIds(uint16_t local_id, >- uint16_t remote_id) { >+void LegacyTransportFeedbackAdapter::SetNetworkIds(uint16_t local_id, >+ uint16_t remote_id) { > rtc::CritScope cs(&lock_); > local_net_id_ = local_id; > remote_net_id_ = remote_id; > } > >-std::vector<PacketFeedback> TransportFeedbackAdapter::GetPacketFeedbackVector( 
>+std::vector<PacketFeedback> >+LegacyTransportFeedbackAdapter::GetPacketFeedbackVector( > const rtcp::TransportFeedback& feedback) { > int64_t timestamp_us = feedback.GetBaseTimeUs(); > int64_t now_ms = clock_->TimeInMilliseconds(); >@@ -177,7 +180,7 @@ std::vector<PacketFeedback> TransportFeedbackAdapter::GetPacketFeedbackVector( > return packet_feedback_vector; > } > >-void TransportFeedbackAdapter::OnTransportFeedback( >+void LegacyTransportFeedbackAdapter::OnTransportFeedback( > const rtcp::TransportFeedback& feedback) { > last_packet_feedback_vector_ = GetPacketFeedbackVector(feedback); > { >@@ -189,18 +192,19 @@ void TransportFeedbackAdapter::OnTransportFeedback( > } > > std::vector<PacketFeedback> >-TransportFeedbackAdapter::GetTransportFeedbackVector() const { >+LegacyTransportFeedbackAdapter::GetTransportFeedbackVector() const { > return last_packet_feedback_vector_; > } > >-absl::optional<int64_t> TransportFeedbackAdapter::GetMinFeedbackLoopRtt() >+absl::optional<int64_t> LegacyTransportFeedbackAdapter::GetMinFeedbackLoopRtt() > const { > rtc::CritScope cs(&lock_); > return min_feedback_rtt_; > } > >-size_t TransportFeedbackAdapter::GetOutstandingBytes() const { >+size_t LegacyTransportFeedbackAdapter::GetOutstandingBytes() const { > rtc::CritScope cs(&lock_); >- return send_time_history_.GetOutstandingBytes(local_net_id_, remote_net_id_); >+ return send_time_history_.GetOutstandingData(local_net_id_, remote_net_id_) >+ .bytes(); > } > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/transport_feedback_adapter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/transport_feedback_adapter.h >index 206236ce8ba7d2d92d16960c7c761ce406a4d2a7..c6ac2ad9ad4bae9b2d254832e96535a07f60dfbc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/transport_feedback_adapter.h >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/transport_feedback_adapter.h >@@ -29,10 +29,12 @@ namespace rtcp { > class TransportFeedback; > } // namespace rtcp > >-class TransportFeedbackAdapter { >+// Deprecated, use version in >+// modules/congeestion_controller/rtp/transport_feedback_adapter.h >+class LegacyTransportFeedbackAdapter { > public: >- explicit TransportFeedbackAdapter(const Clock* clock); >- virtual ~TransportFeedbackAdapter(); >+ explicit LegacyTransportFeedbackAdapter(const Clock* clock); >+ virtual ~LegacyTransportFeedbackAdapter(); > > void RegisterPacketFeedbackObserver(PacketFeedbackObserver* observer); > void DeRegisterPacketFeedbackObserver(PacketFeedbackObserver* observer); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/transport_feedback_adapter_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/transport_feedback_adapter_unittest.cc >index 39aa72e8167e7ab8c744c4752e2e47535fea6cb9..dc4eab13d651080c2c8bd2d25b1c59e182e8e60e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/transport_feedback_adapter_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/congestion_controller/transport_feedback_adapter_unittest.cc >@@ -52,7 +52,7 @@ class LegacyTransportFeedbackAdapterTest : public ::testing::Test { > virtual ~LegacyTransportFeedbackAdapterTest() {} > > virtual void SetUp() { >- adapter_.reset(new TransportFeedbackAdapter(&clock_)); >+ adapter_.reset(new LegacyTransportFeedbackAdapter(&clock_)); > } > > virtual void TearDown() { adapter_.reset(); } >@@ -75,7 +75,7 @@ class LegacyTransportFeedbackAdapterTest : public ::testing::Test { > static constexpr uint32_t kSsrc = 8492; > > SimulatedClock clock_; >- std::unique_ptr<TransportFeedbackAdapter> adapter_; >+ std::unique_ptr<LegacyTransportFeedbackAdapter> adapter_; > }; > > TEST_F(LegacyTransportFeedbackAdapterTest, ObserverSanity) { >diff 
--git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/BUILD.gn >index 74de1747afd3c04b871597dc83b1ba20ee302866..390fc0a051b264058cd0c6c6febd9c7207a01544 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/BUILD.gn >@@ -6,6 +6,7 @@ > # in the file PATENTS. All contributing project authors may > # be found in the AUTHORS file in the root of the source tree. > >+import("//build/config/linux/pkg_config.gni") > import("//build/config/ui.gni") > import("../../webrtc.gni") > >@@ -28,8 +29,8 @@ rtc_static_library("primitives") { > ] > > deps = [ >- "../..:webrtc_common", > "../../rtc_base:checks", >+ "../../rtc_base/system:rtc_export", > "//third_party/abseil-cpp/absl/memory", > ] > >@@ -95,7 +96,6 @@ if (rtc_include_tests) { > ":desktop_capture", > ":desktop_capture_mock", > ":primitives", >- "../..:webrtc_common", > "../../rtc_base:checks", > "../../rtc_base:rtc_base_approved", > "../../system_wrappers:cpu_features_api", >@@ -164,6 +164,23 @@ if (rtc_include_tests) { > } > } > >+if (is_linux) { >+ if (rtc_use_pipewire) { >+ pkg_config("pipewire") { >+ packages = [ "libpipewire-0.2" ] >+ >+ defines = [ "WEBRTC_USE_PIPEWIRE" ] >+ } >+ >+ pkg_config("gio") { >+ packages = [ >+ "gio-2.0", >+ "gio-unix-2.0", >+ ] >+ } >+ } >+} >+ > rtc_source_set("desktop_capture") { > visibility = [ "*" ] > public_deps = [ >@@ -321,39 +338,69 @@ rtc_static_library("desktop_capture_generic") { > ] > } > >+ if (rtc_use_x11 || rtc_use_pipewire) { >+ sources += [ >+ "mouse_cursor_monitor_linux.cc", >+ "screen_capturer_linux.cc", >+ "window_capturer_linux.cc", >+ ] >+ >+ if (build_with_mozilla) { >+ sources += [ "app_capturer_linux.cc" ] >+ } >+ } >+ > if (rtc_use_x11) { > sources += [ >- "mouse_cursor_monitor_x11.cc", >- "screen_capturer_x11.cc", >- "window_capturer_x11.cc", >- 
"window_finder_x11.cc", >- "window_finder_x11.h", >- "x11/shared_x_display.cc", >- "x11/shared_x_display.h", >- "x11/window_list_utils.cc", >- "x11/window_list_utils.h", >- "x11/x_atom_cache.cc", >- "x11/x_atom_cache.h", >- "x11/x_error_trap.cc", >- "x11/x_error_trap.h", >- "x11/x_server_pixel_buffer.cc", >- "x11/x_server_pixel_buffer.h", >+ "linux/mouse_cursor_monitor_x11.cc", >+ "linux/mouse_cursor_monitor_x11.h", >+ "linux/screen_capturer_x11.cc", >+ "linux/screen_capturer_x11.h", >+ "linux/shared_x_display.cc", >+ "linux/shared_x_display.h", >+ "linux/window_capturer_x11.cc", >+ "linux/window_capturer_x11.h", >+ "linux/window_finder_x11.cc", >+ "linux/window_finder_x11.h", >+ "linux/window_list_utils.cc", >+ "linux/window_list_utils.h", >+ "linux/x_atom_cache.cc", >+ "linux/x_atom_cache.h", >+ "linux/x_error_trap.cc", >+ "linux/x_error_trap.h", >+ "linux/x_server_pixel_buffer.cc", >+ "linux/x_server_pixel_buffer.h", > ] > configs += [ "//build/config/linux:x11" ] > > if (build_with_mozilla) { > sources += [ >- "app_capturer_x11.cc", >- "app_capturer_x11.h", >- "x11/desktop_device_info_x11.cc", >- "x11/desktop_device_info_x11.h", >- "x11/shared_x_util.cc", >- "x11/shared_x_util.h", >+ "linux/app_capturer_x11.cc", >+ "linux/desktop_device_info_linux.cc", >+ "linux/desktop_device_info_linux.h", >+ "linux/shared_x_util.cc", >+ "linux/shared_x_util.h", > ] > } > } > >- if (!is_win && !is_mac && !rtc_use_x11) { >+ if (rtc_use_pipewire) { >+ sources += [ >+ "linux/base_capturer_pipewire.cc", >+ "linux/base_capturer_pipewire.h", >+ "linux/screen_capturer_pipewire.cc", >+ "linux/screen_capturer_pipewire.h", >+ "linux/window_capturer_pipewire.cc", >+ "linux/window_capturer_pipewire.h", >+ ] >+ >+ configs += [ >+ ":gio", >+ ":pipewire", >+ ] >+ } >+ >+ if (!is_win && !is_mac && !rtc_use_x11 && !rtc_use_pipewire) { > sources += [ > "mouse_cursor_monitor_null.cc", > "screen_capturer_null.cc", >@@ -370,12 +417,12 @@ rtc_static_library("desktop_capture_generic") { > > deps = 
[ > ":primitives", >- "../..:webrtc_common", > "../../api:refcountedbase", > "../../rtc_base:checks", > "../../rtc_base:rtc_base", # TODO(kjellander): Cleanup in bugs.webrtc.org/3806. > "../../rtc_base/synchronization:rw_lock_wrapper", > "../../rtc_base/system:arch", >+ "../../rtc_base/system:rtc_export", > "../../system_wrappers", > "../../system_wrappers:cpu_features_api", > "../../system_wrappers:metrics", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/cropped_desktop_frame.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/cropped_desktop_frame.h >index 59dced40179bdd93d81c3b47ae73a5f7199c6a3c..6782398a6ab1e21d35e115bc4801a6acf32814c8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/cropped_desktop_frame.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/cropped_desktop_frame.h >@@ -12,6 +12,7 @@ > #define MODULES_DESKTOP_CAPTURE_CROPPED_DESKTOP_FRAME_H_ > > #include "modules/desktop_capture/desktop_frame.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > >@@ -20,9 +21,9 @@ namespace webrtc { > // |frame| should not be nullptr. |rect| is in |frame| coordinate, i.e. > // |frame|->top_left() does not impact the area of |rect|. > // Returns nullptr frame if |rect| is not contained by the bounds of |frame|. 
>-std::unique_ptr<DesktopFrame> CreateCroppedDesktopFrame( >- std::unique_ptr<DesktopFrame> frame, >- const DesktopRect& rect); >+std::unique_ptr<DesktopFrame> RTC_EXPORT >+CreateCroppedDesktopFrame(std::unique_ptr<DesktopFrame> frame, >+ const DesktopRect& rect); > > } // namespace webrtc > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/cropping_window_capturer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/cropping_window_capturer.h >index c889801da3c5bbfeb5c849cb76e0fc5c86f4b70f..224198fa615110b14216dc50e3fcc03181af7ffd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/cropping_window_capturer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/cropping_window_capturer.h >@@ -15,14 +15,15 @@ > > #include "modules/desktop_capture/desktop_capture_options.h" > #include "modules/desktop_capture/desktop_capturer.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > > // WindowCapturer implementation that uses a screen capturer to capture the > // whole screen and crops the video frame to the window area when the captured > // window is on top. 
>-class CroppingWindowCapturer : public DesktopCapturer, >- public DesktopCapturer::Callback { >+class RTC_EXPORT CroppingWindowCapturer : public DesktopCapturer, >+ public DesktopCapturer::Callback { > public: > static std::unique_ptr<DesktopCapturer> CreateCapturer( > const DesktopCaptureOptions& options); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_and_cursor_composer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_and_cursor_composer.h >index 7dff7101f35d833c69331f66b7b7a6fb3c330089..fa5d15c92f6018f8ffbaf0c44ecdfc433a94df99 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_and_cursor_composer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_and_cursor_composer.h >@@ -17,14 +17,16 @@ > #include "modules/desktop_capture/desktop_capturer.h" > #include "modules/desktop_capture/mouse_cursor_monitor.h" > #include "rtc_base/constructormagic.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > > // A wrapper for DesktopCapturer that also captures mouse using specified > // MouseCursorMonitor and renders it on the generated streams. 
>-class DesktopAndCursorComposer : public DesktopCapturer, >- public DesktopCapturer::Callback, >- public MouseCursorMonitor::Callback { >+class RTC_EXPORT DesktopAndCursorComposer >+ : public DesktopCapturer, >+ public DesktopCapturer::Callback, >+ public MouseCursorMonitor::Callback { > public: > // Creates a new blender that captures mouse cursor using > // MouseCursorMonitor::Create(options) and renders it into the frames >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_capture_options.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_capture_options.h >index 60aa05c3fd3329d2747812f0f4838514e79d3f82..e9587a3f50c31b08535d8d7a05bfd469ff377671 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_capture_options.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_capture_options.h >@@ -12,9 +12,10 @@ > > #include "rtc_base/constructormagic.h" > #include "rtc_base/scoped_ref_ptr.h" >+#include "rtc_base/system/rtc_export.h" > > #if defined(USE_X11) >-#include "modules/desktop_capture/x11/shared_x_display.h" >+#include "modules/desktop_capture/linux/shared_x_display.h" > #endif > > #if defined(WEBRTC_MAC) && !defined(WEBRTC_IOS) >@@ -26,7 +27,7 @@ namespace webrtc { > > // An object that stores initialization parameters for screen and window > // capturers. >-class DesktopCaptureOptions { >+class RTC_EXPORT DesktopCaptureOptions { > public: > // Returns instance of DesktopCaptureOptions with default parameters. On Linux > // also initializes X window connection. 
x_display() will be set to null if >@@ -114,6 +115,11 @@ class DesktopCaptureOptions { > } > #endif > >+#if defined(WEBRTC_USE_PIPEWIRE) >+ bool allow_pipewire() const { return allow_pipewire_; } >+ void set_allow_pipewire(bool allow) { allow_pipewire_ = allow; } >+#endif >+ > private: > #if defined(USE_X11) > rtc::scoped_refptr<SharedXDisplay> x_display_; >@@ -137,6 +143,9 @@ class DesktopCaptureOptions { > #endif > bool disable_effects_ = true; > bool detect_updated_region_ = false; >+#if defined(WEBRTC_USE_PIPEWIRE) >+ bool allow_pipewire_ = false; >+#endif > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_capturer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_capturer.cc >index 166d97cc52af69977e3da1eb9b1b94c570b8105e..190f61e81755a326ffa046aeef3a03bc2792c460 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_capturer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_capturer.cc >@@ -8,6 +8,8 @@ > * be found in the AUTHORS file in the root of the source tree. 
> */ > >+#include <cstring> >+ > #include "modules/desktop_capture/desktop_capturer.h" > > #include "modules/desktop_capture/desktop_capture_options.h" >@@ -60,4 +62,17 @@ std::unique_ptr<DesktopCapturer> DesktopCapturer::CreateScreenCapturer( > return capturer; > } > >+#if defined(WEBRTC_USE_PIPEWIRE) || defined(USE_X11) >+bool DesktopCapturer::IsRunningUnderWayland() { >+ const char* xdg_session_type = getenv("XDG_SESSION_TYPE"); >+ if (!xdg_session_type || strncmp(xdg_session_type, "wayland", 7) != 0) >+ return false; >+ >+ if (!(getenv("WAYLAND_DISPLAY"))) >+ return false; >+ >+ return true; >+} >+#endif // defined(WEBRTC_USE_PIPEWIRE) || defined(USE_X11) >+ > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_capturer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_capturer.h >index 47196cfe491083d666930eb9556b39a1c3d7292f..f9dacc1f7a38477f7e3283eeb62d83a913abf0a3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_capturer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_capturer.h >@@ -22,6 +22,7 @@ > #include "modules/desktop_capture/desktop_capture_types.h" > #include "modules/desktop_capture/desktop_frame.h" > #include "modules/desktop_capture/shared_memory.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > >@@ -29,7 +30,7 @@ class DesktopCaptureOptions; > class DesktopFrame; > > // Abstract interface for screen and window capturers. >-class DesktopCapturer { >+class RTC_EXPORT DesktopCapturer { > public: > enum class Result { > // The frame was captured successfully. 
>@@ -134,6 +135,10 @@ class DesktopCapturer { > static std::unique_ptr<DesktopCapturer> CreateScreenCapturer( > const DesktopCaptureOptions& options); > >+#if defined(WEBRTC_USE_PIPEWIRE) || defined(USE_X11) >+ static bool IsRunningUnderWayland(); >+#endif // defined(WEBRTC_USE_PIPEWIRE) || defined(USE_X11) >+ > protected: > // CroppingWindowCapturer needs to create raw capturers without wrappers, so > // the following two functions are protected. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_frame.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_frame.h >index 6f7edfcc4f7095e1ef2366e8088e4694b5389218..29b84b718372cf0d2275729484c5961408926438 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_frame.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_frame.h >@@ -18,13 +18,14 @@ > #include "modules/desktop_capture/desktop_region.h" > #include "modules/desktop_capture/shared_memory.h" > #include "rtc_base/constructormagic.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > > const int kStandardDPI = 96; > > // DesktopFrame represents a video frame captured from the screen. >-class DesktopFrame { >+class RTC_EXPORT DesktopFrame { > public: > // DesktopFrame objects always hold RGBA data. > static const int kBytesPerPixel = 4; >@@ -135,7 +136,7 @@ class DesktopFrame { > }; > > // A DesktopFrame that stores data in the heap. 
>-class BasicDesktopFrame : public DesktopFrame { >+class RTC_EXPORT BasicDesktopFrame : public DesktopFrame { > public: > explicit BasicDesktopFrame(DesktopSize size); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_geometry.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_geometry.h >index 09ad55bb4bf156d3ca99f8136cbc5f6686fac398..623a4555f9f48d6c610831d2d0335a179bd6a7ed 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_geometry.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_geometry.h >@@ -14,6 +14,7 @@ > #include <stdint.h> > > #include "rtc_base/constructormagic.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > >@@ -74,7 +75,7 @@ class DesktopSize { > }; > > // Represents a rectangle on the screen. >-class DesktopRect { >+class RTC_EXPORT DesktopRect { > public: > static DesktopRect MakeSize(const DesktopSize& size) { > return DesktopRect(0, 0, size.width(), size.height()); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_region.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_region.h >index f4b9209e2dcaffc5d2127febff276d34037e3390..cbb2d8c48b5303da731a5b7e47e17f00dae77371 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_region.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/desktop_region.h >@@ -16,6 +16,7 @@ > > #include "modules/desktop_capture/desktop_geometry.h" > #include "rtc_base/constructormagic.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > >@@ -23,7 +24,7 @@ namespace webrtc { > // > // Internally each region is stored as a set of rows where each row contains one > // or more rectangles aligned vertically. 
>-class DesktopRegion { >+class RTC_EXPORT DesktopRegion { > private: > // The following private types need to be declared first because they are used > // in the public Iterator. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/fake_desktop_capturer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/fake_desktop_capturer.h >index 82c053cbe098b991bba10963bbab13f57f485261..fd867e335da1923a226df62c50b13fc3d5249904 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/fake_desktop_capturer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/fake_desktop_capturer.h >@@ -17,6 +17,7 @@ > #include "modules/desktop_capture/desktop_capturer.h" > #include "modules/desktop_capture/desktop_frame_generator.h" > #include "modules/desktop_capture/shared_memory.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > >@@ -31,7 +32,7 @@ namespace webrtc { > // Double buffering is guaranteed by the FrameGenerator. FrameGenerator > // implements in desktop_frame_generator.h guarantee double buffering, they > // creates a new instance of DesktopFrame each time. >-class FakeDesktopCapturer : public DesktopCapturer { >+class RTC_EXPORT FakeDesktopCapturer : public DesktopCapturer { > public: > FakeDesktopCapturer(); > ~FakeDesktopCapturer() override; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/base_capturer_pipewire.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/base_capturer_pipewire.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..62d9994d32a525ee9e0fe59b2ede6ae3cc41310b >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/base_capturer_pipewire.cc >@@ -0,0 +1,843 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. 
>+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "modules/desktop_capture/linux/base_capturer_pipewire.h" >+ >+#include <gio/gunixfdlist.h> >+#include <glib-object.h> >+ >+#include <spa/param/format-utils.h> >+#include <spa/param/props.h> >+#include <spa/param/video/raw-utils.h> >+#include <spa/support/type-map.h> >+ >+#include <memory> >+#include <utility> >+ >+#include "absl/memory/memory.h" >+#include "modules/desktop_capture/desktop_capture_options.h" >+#include "modules/desktop_capture/desktop_capturer.h" >+#include "rtc_base/checks.h" >+#include "rtc_base/logging.h" >+ >+namespace webrtc { >+ >+const char kDesktopBusName[] = "org.freedesktop.portal.Desktop"; >+const char kDesktopObjectPath[] = "/org/freedesktop/portal/desktop"; >+const char kDesktopRequestObjectPath[] = >+ "/org/freedesktop/portal/desktop/request"; >+const char kSessionInterfaceName[] = "org.freedesktop.portal.Session"; >+const char kRequestInterfaceName[] = "org.freedesktop.portal.Request"; >+const char kScreenCastInterfaceName[] = "org.freedesktop.portal.ScreenCast"; >+ >+const int kBytesPerPixel = 4; >+ >+// static >+void BaseCapturerPipeWire::OnStateChanged(void* data, >+ pw_remote_state old_state, >+ pw_remote_state state, >+ const char* error_message) { >+ BaseCapturerPipeWire* that = static_cast<BaseCapturerPipeWire*>(data); >+ RTC_DCHECK(that); >+ >+ switch (state) { >+ case PW_REMOTE_STATE_ERROR: >+ RTC_LOG(LS_ERROR) << "PipeWire remote state error: " << error_message; >+ break; >+ case PW_REMOTE_STATE_CONNECTED: >+ RTC_LOG(LS_INFO) << "PipeWire remote state: connected."; >+ that->CreateReceivingStream(); >+ break; >+ case PW_REMOTE_STATE_CONNECTING: >+ RTC_LOG(LS_INFO) << 
"PipeWire remote state: connecting."; >+ break; >+ case PW_REMOTE_STATE_UNCONNECTED: >+ RTC_LOG(LS_INFO) << "PipeWire remote state: unconnected."; >+ break; >+ } >+} >+ >+// static >+void BaseCapturerPipeWire::OnStreamStateChanged(void* data, >+ pw_stream_state old_state, >+ pw_stream_state state, >+ const char* error_message) { >+ BaseCapturerPipeWire* that = static_cast<BaseCapturerPipeWire*>(data); >+ RTC_DCHECK(that); >+ >+ switch (state) { >+ case PW_STREAM_STATE_ERROR: >+ RTC_LOG(LS_ERROR) << "PipeWire stream state error: " << error_message; >+ break; >+ case PW_STREAM_STATE_CONFIGURE: >+ pw_stream_set_active(that->pw_stream_, true); >+ break; >+ case PW_STREAM_STATE_UNCONNECTED: >+ case PW_STREAM_STATE_CONNECTING: >+ case PW_STREAM_STATE_READY: >+ case PW_STREAM_STATE_PAUSED: >+ case PW_STREAM_STATE_STREAMING: >+ break; >+ } >+} >+ >+// static >+void BaseCapturerPipeWire::OnStreamFormatChanged(void* data, >+ const struct spa_pod* format) { >+ BaseCapturerPipeWire* that = static_cast<BaseCapturerPipeWire*>(data); >+ RTC_DCHECK(that); >+ >+ RTC_LOG(LS_INFO) << "PipeWire stream format changed."; >+ >+ if (!format) { >+ pw_stream_finish_format(that->pw_stream_, /*res=*/0, /*params=*/nullptr, >+ /*n_params=*/0); >+ return; >+ } >+ >+ that->spa_video_format_ = new spa_video_info_raw(); >+ spa_format_video_raw_parse(format, that->spa_video_format_, >+ &that->pw_type_->format_video); >+ >+ auto width = that->spa_video_format_->size.width; >+ auto height = that->spa_video_format_->size.height; >+ auto stride = SPA_ROUND_UP_N(width * kBytesPerPixel, 4); >+ auto size = height * stride; >+ >+ uint8_t buffer[1024] = {}; >+ auto builder = spa_pod_builder{buffer, sizeof(buffer)}; >+ >+ // Setup buffers and meta header for new format. 
>+ const struct spa_pod* params[2]; >+ params[0] = reinterpret_cast<spa_pod*>(spa_pod_builder_object( >+ &builder, >+ // id to enumerate buffer requirements >+ that->pw_core_type_->param.idBuffers, >+ that->pw_core_type_->param_buffers.Buffers, >+ // Size: specified as integer (i) and set to specified size >+ ":", that->pw_core_type_->param_buffers.size, "i", size, >+ // Stride: specified as integer (i) and set to specified stride >+ ":", that->pw_core_type_->param_buffers.stride, "i", stride, >+ // Buffers: specifies how many buffers we want to deal with, set as >+ // integer (i) where preferred number is 8, then allowed number is defined >+ // as range (r) from min and max values and it is undecided (u) to allow >+ // negotiation >+ ":", that->pw_core_type_->param_buffers.buffers, "iru", 8, >+ SPA_POD_PROP_MIN_MAX(1, 32), >+ // Align: memory alignment of the buffer, set as integer (i) to specified >+ // value >+ ":", that->pw_core_type_->param_buffers.align, "i", 16)); >+ params[1] = reinterpret_cast<spa_pod*>(spa_pod_builder_object( >+ &builder, >+ // id to enumerate supported metadata >+ that->pw_core_type_->param.idMeta, that->pw_core_type_->param_meta.Meta, >+ // Type: specified as id or enum (I) >+ ":", that->pw_core_type_->param_meta.type, "I", >+ that->pw_core_type_->meta.Header, >+ // Size: size of the metadata, specified as integer (i) >+ ":", that->pw_core_type_->param_meta.size, "i", >+ sizeof(struct spa_meta_header))); >+ >+ pw_stream_finish_format(that->pw_stream_, /*res=*/0, params, /*n_params=*/2); >+} >+ >+// static >+void BaseCapturerPipeWire::OnStreamProcess(void* data) { >+ BaseCapturerPipeWire* that = static_cast<BaseCapturerPipeWire*>(data); >+ RTC_DCHECK(that); >+ >+ pw_buffer* buf = nullptr; >+ >+ if (!(buf = pw_stream_dequeue_buffer(that->pw_stream_))) { >+ return; >+ } >+ >+ that->HandleBuffer(buf); >+ >+ pw_stream_queue_buffer(that->pw_stream_, buf); >+} >+ >+BaseCapturerPipeWire::BaseCapturerPipeWire(CaptureSourceType source_type) >+ : 
capture_source_type_(source_type) {} >+ >+BaseCapturerPipeWire::~BaseCapturerPipeWire() { >+ if (pw_main_loop_) { >+ pw_thread_loop_stop(pw_main_loop_); >+ } >+ >+ if (pw_type_) { >+ delete pw_type_; >+ } >+ >+ if (spa_video_format_) { >+ delete spa_video_format_; >+ } >+ >+ if (pw_stream_) { >+ pw_stream_destroy(pw_stream_); >+ } >+ >+ if (pw_remote_) { >+ pw_remote_destroy(pw_remote_); >+ } >+ >+ if (pw_core_) { >+ pw_core_destroy(pw_core_); >+ } >+ >+ if (pw_main_loop_) { >+ pw_thread_loop_destroy(pw_main_loop_); >+ } >+ >+ if (pw_loop_) { >+ pw_loop_destroy(pw_loop_); >+ } >+ >+ if (current_frame_) { >+ free(current_frame_); >+ } >+ >+ if (start_request_signal_id_) { >+ g_dbus_connection_signal_unsubscribe(connection_, start_request_signal_id_); >+ } >+ if (sources_request_signal_id_) { >+ g_dbus_connection_signal_unsubscribe(connection_, >+ sources_request_signal_id_); >+ } >+ if (session_request_signal_id_) { >+ g_dbus_connection_signal_unsubscribe(connection_, >+ session_request_signal_id_); >+ } >+ >+ if (session_handle_) { >+ GDBusMessage* message = g_dbus_message_new_method_call( >+ kDesktopBusName, session_handle_, kSessionInterfaceName, "Close"); >+ if (message) { >+ GError* error = nullptr; >+ g_dbus_connection_send_message(connection_, message, >+ G_DBUS_SEND_MESSAGE_FLAGS_NONE, >+ /*out_serial=*/nullptr, &error); >+ if (error) { >+ RTC_LOG(LS_ERROR) << "Failed to close the session: " << error->message; >+ g_error_free(error); >+ } >+ g_object_unref(message); >+ } >+ } >+ >+ g_free(start_handle_); >+ g_free(sources_handle_); >+ g_free(session_handle_); >+ g_free(portal_handle_); >+ >+ if (proxy_) { >+ g_clear_object(&proxy_); >+ } >+} >+ >+void BaseCapturerPipeWire::InitPortal() { >+ g_dbus_proxy_new_for_bus( >+ G_BUS_TYPE_SESSION, G_DBUS_PROXY_FLAGS_NONE, /*info=*/nullptr, >+ kDesktopBusName, kDesktopObjectPath, kScreenCastInterfaceName, >+ /*cancellable=*/nullptr, >+ reinterpret_cast<GAsyncReadyCallback>(OnProxyRequested), this); >+} >+ >+void 
BaseCapturerPipeWire::InitPipeWire() { >+  pw_init(/*argc=*/nullptr, /*argv=*/nullptr); >+ >+  pw_loop_ = pw_loop_new(/*properties=*/nullptr); >+  pw_main_loop_ = pw_thread_loop_new(pw_loop_, "pipewire-main-loop"); >+ >+  pw_core_ = pw_core_new(pw_loop_, /*properties=*/nullptr); >+  pw_core_type_ = pw_core_get_type(pw_core_); >+  pw_remote_ = pw_remote_new(pw_core_, nullptr, /*user_data_size=*/0); >+ >+  InitPipeWireTypes(); >+ >+  // Initialize event handlers, remote end and stream-related. >+  pw_remote_events_.version = PW_VERSION_REMOTE_EVENTS; >+  pw_remote_events_.state_changed = &OnStateChanged; >+ >+  pw_stream_events_.version = PW_VERSION_STREAM_EVENTS; >+  pw_stream_events_.state_changed = &OnStreamStateChanged; >+  pw_stream_events_.format_changed = &OnStreamFormatChanged; >+  pw_stream_events_.process = &OnStreamProcess; >+ >+  pw_remote_add_listener(pw_remote_, &spa_remote_listener_, &pw_remote_events_, >+                         this); >+  pw_remote_connect_fd(pw_remote_, pw_fd_); >+ >+  if (pw_thread_loop_start(pw_main_loop_) < 0) { >+    RTC_LOG(LS_ERROR) << "Failed to start main PipeWire loop"; >+    portal_init_failed_ = true; >+  } >+} >+ >+void BaseCapturerPipeWire::InitPipeWireTypes() { >+  spa_type_map* map = pw_core_type_->map; >+  pw_type_ = new PipeWireType(); >+ >+  spa_type_media_type_map(map, &pw_type_->media_type); >+  spa_type_media_subtype_map(map, &pw_type_->media_subtype); >+  spa_type_format_video_map(map, &pw_type_->format_video); >+  spa_type_video_format_map(map, &pw_type_->video_format); >+} >+ >+void BaseCapturerPipeWire::CreateReceivingStream() { >+  spa_rectangle pwMinScreenBounds = spa_rectangle{1, 1}; >+  spa_rectangle pwScreenBounds = >+      spa_rectangle{static_cast<uint32_t>(desktop_size_.width()), >+                    static_cast<uint32_t>(desktop_size_.height())}; >+ >+  spa_fraction pwFrameRateMin = spa_fraction{0, 1}; >+  spa_fraction pwFrameRateMax = spa_fraction{60, 1}; >+ >+  pw_properties* reuseProps = pw_properties_new("pipewire.client.reuse", "1", >+                                                /*end of varargs*/ nullptr); >+ 
pw_stream_ = pw_stream_new(pw_remote_, "webrtc-consume-stream", reuseProps); >+ >+ uint8_t buffer[1024] = {}; >+ const spa_pod* params[1]; >+ spa_pod_builder builder = spa_pod_builder{buffer, sizeof(buffer)}; >+ params[0] = reinterpret_cast<spa_pod*>(spa_pod_builder_object( >+ &builder, >+ // id to enumerate formats >+ pw_core_type_->param.idEnumFormat, pw_core_type_->spa_format, "I", >+ pw_type_->media_type.video, "I", pw_type_->media_subtype.raw, >+ // Video format: specified as id or enum (I), preferred format is BGRx, >+ // then allowed formats are enumerated (e) and the format is undecided (u) >+ // to allow negotiation >+ ":", pw_type_->format_video.format, "Ieu", pw_type_->video_format.BGRx, >+ SPA_POD_PROP_ENUM(2, pw_type_->video_format.RGBx, >+ pw_type_->video_format.BGRx), >+ // Video size: specified as rectangle (R), preferred size is specified as >+ // first parameter, then allowed size is defined as range (r) from min and >+ // max values and the format is undecided (u) to allow negotiation >+ ":", pw_type_->format_video.size, "Rru", &pwScreenBounds, 2, >+ &pwMinScreenBounds, &pwScreenBounds, >+ // Frame rate: specified as fraction (F) and set to minimum frame rate >+ // value >+ ":", pw_type_->format_video.framerate, "F", &pwFrameRateMin, >+ // Max frame rate: specified as fraction (F), preferred frame rate is set >+ // to maximum value, then allowed frame rate is defined as range (r) from >+ // min and max values and it is undecided (u) to allow negotiation >+ ":", pw_type_->format_video.max_framerate, "Fru", &pwFrameRateMax, 2, >+ &pwFrameRateMin, &pwFrameRateMax)); >+ >+ pw_stream_add_listener(pw_stream_, &spa_stream_listener_, &pw_stream_events_, >+ this); >+ pw_stream_flags flags = static_cast<pw_stream_flags>( >+ PW_STREAM_FLAG_AUTOCONNECT | PW_STREAM_FLAG_INACTIVE | >+ PW_STREAM_FLAG_MAP_BUFFERS); >+ if (pw_stream_connect(pw_stream_, PW_DIRECTION_INPUT, /*port_path=*/nullptr, >+ flags, params, >+ /*n_params=*/1) != 0) { >+ RTC_LOG(LS_ERROR) << 
"Could not connect receiving stream."; >+ portal_init_failed_ = true; >+ return; >+ } >+} >+ >+void BaseCapturerPipeWire::HandleBuffer(pw_buffer* buffer) { >+ spa_buffer* spaBuffer = buffer->buffer; >+ void* src = nullptr; >+ >+ if (!(src = spaBuffer->datas[0].data)) { >+ return; >+ } >+ >+ uint32_t maxSize = spaBuffer->datas[0].maxsize; >+ int32_t srcStride = spaBuffer->datas[0].chunk->stride; >+ if (srcStride != (desktop_size_.width() * kBytesPerPixel)) { >+ RTC_LOG(LS_ERROR) << "Got buffer with stride different from screen stride: " >+ << srcStride >+ << " != " << (desktop_size_.width() * kBytesPerPixel); >+ portal_init_failed_ = true; >+ return; >+ } >+ >+ if (!current_frame_) { >+ current_frame_ = static_cast<uint8_t*>(malloc(maxSize)); >+ } >+ RTC_DCHECK(current_frame_ != nullptr); >+ >+ // If both sides decided to go with the RGBx format we need to convert it to >+ // BGRx to match color format expected by WebRTC. >+ if (spa_video_format_->format == pw_type_->video_format.RGBx) { >+ uint8_t* tempFrame = static_cast<uint8_t*>(malloc(maxSize)); >+ std::memcpy(tempFrame, src, maxSize); >+ ConvertRGBxToBGRx(tempFrame, maxSize); >+ std::memcpy(current_frame_, tempFrame, maxSize); >+ free(tempFrame); >+ } else { >+ std::memcpy(current_frame_, src, maxSize); >+ } >+} >+ >+void BaseCapturerPipeWire::ConvertRGBxToBGRx(uint8_t* frame, uint32_t size) { >+ // Change color format for KDE KWin which uses RGBx and not BGRx >+ for (uint32_t i = 0; i < size; i += 4) { >+ uint8_t tempR = frame[i]; >+ uint8_t tempB = frame[i + 2]; >+ frame[i] = tempB; >+ frame[i + 2] = tempR; >+ } >+} >+ >+guint BaseCapturerPipeWire::SetupRequestResponseSignal( >+ const gchar* object_path, >+ GDBusSignalCallback callback) { >+ return g_dbus_connection_signal_subscribe( >+ connection_, kDesktopBusName, kRequestInterfaceName, "Response", >+ object_path, /*arg0=*/nullptr, G_DBUS_SIGNAL_FLAGS_NO_MATCH_RULE, >+ callback, this, /*user_data_free_func=*/nullptr); >+} >+ >+// static >+void 
BaseCapturerPipeWire::OnProxyRequested(GObject* /*object*/, >+ GAsyncResult* result, >+ gpointer user_data) { >+ BaseCapturerPipeWire* that = static_cast<BaseCapturerPipeWire*>(user_data); >+ RTC_DCHECK(that); >+ >+ GError* error = nullptr; >+ that->proxy_ = g_dbus_proxy_new_finish(result, &error); >+ if (!that->proxy_) { >+ RTC_LOG(LS_ERROR) << "Failed to create a proxy for the screen cast portal: " >+ << error->message; >+ g_error_free(error); >+ that->portal_init_failed_ = true; >+ return; >+ } >+ that->connection_ = g_dbus_proxy_get_connection(that->proxy_); >+ >+ RTC_LOG(LS_INFO) << "Created proxy for the screen cast portal."; >+ that->SessionRequest(); >+} >+ >+// static >+gchar* BaseCapturerPipeWire::PrepareSignalHandle(GDBusConnection* connection, >+ const gchar* token) { >+ gchar* sender = g_strdup(g_dbus_connection_get_unique_name(connection) + 1); >+ for (int i = 0; sender[i]; i++) { >+ if (sender[i] == '.') { >+ sender[i] = '_'; >+ } >+ } >+ >+ gchar* handle = g_strconcat(kDesktopRequestObjectPath, "/", sender, "/", >+ token, /*end of varargs*/ nullptr); >+ g_free(sender); >+ >+ return handle; >+} >+ >+void BaseCapturerPipeWire::SessionRequest() { >+ GVariantBuilder builder; >+ gchar* variant_string; >+ >+ g_variant_builder_init(&builder, G_VARIANT_TYPE_VARDICT); >+ variant_string = >+ g_strdup_printf("webrtc_session%d", g_random_int_range(0, G_MAXINT)); >+ g_variant_builder_add(&builder, "{sv}", "session_handle_token", >+ g_variant_new_string(variant_string)); >+ g_free(variant_string); >+ variant_string = g_strdup_printf("webrtc%d", g_random_int_range(0, G_MAXINT)); >+ g_variant_builder_add(&builder, "{sv}", "handle_token", >+ g_variant_new_string(variant_string)); >+ >+ portal_handle_ = PrepareSignalHandle(connection_, variant_string); >+ session_request_signal_id_ = SetupRequestResponseSignal( >+ portal_handle_, OnSessionRequestResponseSignal); >+ g_free(variant_string); >+ >+ RTC_LOG(LS_INFO) << "Screen cast session requested."; >+ 
g_dbus_proxy_call( >+ proxy_, "CreateSession", g_variant_new("(a{sv})", &builder), >+ G_DBUS_CALL_FLAGS_NONE, /*timeout=*/-1, /*cancellable=*/nullptr, >+ reinterpret_cast<GAsyncReadyCallback>(OnSessionRequested), this); >+} >+ >+// static >+void BaseCapturerPipeWire::OnSessionRequested(GDBusConnection* connection, >+ GAsyncResult* result, >+ gpointer user_data) { >+ BaseCapturerPipeWire* that = static_cast<BaseCapturerPipeWire*>(user_data); >+ RTC_DCHECK(that); >+ >+ GError* error = nullptr; >+ GVariant* variant = g_dbus_proxy_call_finish(that->proxy_, result, &error); >+ if (!variant) { >+ RTC_LOG(LS_ERROR) << "Failed to create a screen cast session: " >+ << error->message; >+ g_error_free(error); >+ that->portal_init_failed_ = true; >+ return; >+ } >+ RTC_LOG(LS_INFO) << "Initializing the screen cast session."; >+ >+ gchar* handle = nullptr; >+ g_variant_get_child(variant, 0, "o", &handle); >+ g_variant_unref(variant); >+ if (!handle) { >+ RTC_LOG(LS_ERROR) << "Failed to initialize the screen cast session."; >+ if (that->session_request_signal_id_) { >+ g_dbus_connection_signal_unsubscribe(connection, >+ that->session_request_signal_id_); >+ that->session_request_signal_id_ = 0; >+ } >+ that->portal_init_failed_ = true; >+ return; >+ } >+ >+ g_free(handle); >+ >+ RTC_LOG(LS_INFO) << "Subscribing to the screen cast session."; >+} >+ >+// static >+void BaseCapturerPipeWire::OnSessionRequestResponseSignal( >+ GDBusConnection* connection, >+ const gchar* sender_name, >+ const gchar* object_path, >+ const gchar* interface_name, >+ const gchar* signal_name, >+ GVariant* parameters, >+ gpointer user_data) { >+ BaseCapturerPipeWire* that = static_cast<BaseCapturerPipeWire*>(user_data); >+ RTC_DCHECK(that); >+ >+ RTC_LOG(LS_INFO) >+ << "Received response for the screen cast session subscription."; >+ >+ guint32 portal_response; >+ GVariant* response_data; >+ g_variant_get(parameters, "(u@a{sv})", &portal_response, &response_data); >+ g_variant_lookup(response_data, 
"session_handle", "s", >+ &that->session_handle_); >+ g_variant_unref(response_data); >+ >+ if (!that->session_handle_ || portal_response) { >+ RTC_LOG(LS_ERROR) >+ << "Failed to request the screen cast session subscription."; >+ that->portal_init_failed_ = true; >+ return; >+ } >+ >+ that->SourcesRequest(); >+} >+ >+void BaseCapturerPipeWire::SourcesRequest() { >+ GVariantBuilder builder; >+ gchar* variant_string; >+ >+ g_variant_builder_init(&builder, G_VARIANT_TYPE_VARDICT); >+ // We want to record monitor content. >+ g_variant_builder_add(&builder, "{sv}", "types", >+ g_variant_new_uint32(capture_source_type_)); >+ // We don't want to allow selection of multiple sources. >+ g_variant_builder_add(&builder, "{sv}", "multiple", >+ g_variant_new_boolean(false)); >+ variant_string = g_strdup_printf("webrtc%d", g_random_int_range(0, G_MAXINT)); >+ g_variant_builder_add(&builder, "{sv}", "handle_token", >+ g_variant_new_string(variant_string)); >+ >+ sources_handle_ = PrepareSignalHandle(connection_, variant_string); >+ sources_request_signal_id_ = SetupRequestResponseSignal( >+ sources_handle_, OnSourcesRequestResponseSignal); >+ g_free(variant_string); >+ >+ RTC_LOG(LS_INFO) << "Requesting sources from the screen cast session."; >+ g_dbus_proxy_call( >+ proxy_, "SelectSources", >+ g_variant_new("(oa{sv})", session_handle_, &builder), >+ G_DBUS_CALL_FLAGS_NONE, /*timeout=*/-1, /*cancellable=*/nullptr, >+ reinterpret_cast<GAsyncReadyCallback>(OnSourcesRequested), this); >+} >+ >+// static >+void BaseCapturerPipeWire::OnSourcesRequested(GDBusConnection* connection, >+ GAsyncResult* result, >+ gpointer user_data) { >+ BaseCapturerPipeWire* that = static_cast<BaseCapturerPipeWire*>(user_data); >+ RTC_DCHECK(that); >+ >+ GError* error = nullptr; >+ GVariant* variant = g_dbus_proxy_call_finish(that->proxy_, result, &error); >+ if (!variant) { >+ RTC_LOG(LS_ERROR) << "Failed to request the sources: " << error->message; >+ g_error_free(error); >+ that->portal_init_failed_ = 
true; >+ return; >+ } >+ >+ RTC_LOG(LS_INFO) << "Sources requested from the screen cast session."; >+ >+ gchar* handle = nullptr; >+ g_variant_get_child(variant, 0, "o", &handle); >+ g_variant_unref(variant); >+ if (!handle) { >+ RTC_LOG(LS_ERROR) << "Failed to initialize the screen cast session."; >+ if (that->sources_request_signal_id_) { >+ g_dbus_connection_signal_unsubscribe(connection, >+ that->sources_request_signal_id_); >+ that->sources_request_signal_id_ = 0; >+ } >+ that->portal_init_failed_ = true; >+ return; >+ } >+ >+ g_free(handle); >+ >+ RTC_LOG(LS_INFO) << "Subscribed to sources signal."; >+} >+ >+// static >+void BaseCapturerPipeWire::OnSourcesRequestResponseSignal( >+ GDBusConnection* connection, >+ const gchar* sender_name, >+ const gchar* object_path, >+ const gchar* interface_name, >+ const gchar* signal_name, >+ GVariant* parameters, >+ gpointer user_data) { >+ BaseCapturerPipeWire* that = static_cast<BaseCapturerPipeWire*>(user_data); >+ RTC_DCHECK(that); >+ >+ RTC_LOG(LS_INFO) << "Received sources signal from session."; >+ >+ guint32 portal_response; >+ g_variant_get(parameters, "(u@a{sv})", &portal_response, nullptr); >+ if (portal_response) { >+ RTC_LOG(LS_ERROR) >+ << "Failed to select sources for the screen cast session."; >+ that->portal_init_failed_ = true; >+ return; >+ } >+ >+ that->StartRequest(); >+} >+ >+void BaseCapturerPipeWire::StartRequest() { >+ GVariantBuilder builder; >+ gchar* variant_string; >+ >+ g_variant_builder_init(&builder, G_VARIANT_TYPE_VARDICT); >+ variant_string = g_strdup_printf("webrtc%d", g_random_int_range(0, G_MAXINT)); >+ g_variant_builder_add(&builder, "{sv}", "handle_token", >+ g_variant_new_string(variant_string)); >+ >+ start_handle_ = PrepareSignalHandle(connection_, variant_string); >+ start_request_signal_id_ = >+ SetupRequestResponseSignal(start_handle_, OnStartRequestResponseSignal); >+ g_free(variant_string); >+ >+ // "Identifier for the application window", this is Wayland, so not "x11:...". 
>+ const gchar parent_window[] = ""; >+ >+ RTC_LOG(LS_INFO) << "Starting the screen cast session."; >+ g_dbus_proxy_call( >+ proxy_, "Start", >+ g_variant_new("(osa{sv})", session_handle_, parent_window, &builder), >+ G_DBUS_CALL_FLAGS_NONE, /*timeout=*/-1, /*cancellable=*/nullptr, >+ reinterpret_cast<GAsyncReadyCallback>(OnStartRequested), this); >+} >+ >+// static >+void BaseCapturerPipeWire::OnStartRequested(GDBusConnection* connection, >+ GAsyncResult* result, >+ gpointer user_data) { >+ BaseCapturerPipeWire* that = static_cast<BaseCapturerPipeWire*>(user_data); >+ RTC_DCHECK(that); >+ >+ GError* error = nullptr; >+ GVariant* variant = g_dbus_proxy_call_finish(that->proxy_, result, &error); >+ if (!variant) { >+ RTC_LOG(LS_ERROR) << "Failed to start the screen cast session: " >+ << error->message; >+ g_error_free(error); >+ that->portal_init_failed_ = true; >+ return; >+ } >+ >+ RTC_LOG(LS_INFO) << "Initializing the start of the screen cast session."; >+ >+ gchar* handle = nullptr; >+ g_variant_get_child(variant, 0, "o", &handle); >+ g_variant_unref(variant); >+ if (!handle) { >+ RTC_LOG(LS_ERROR) >+ << "Failed to initialize the start of the screen cast session."; >+ if (that->start_request_signal_id_) { >+ g_dbus_connection_signal_unsubscribe(connection, >+ that->start_request_signal_id_); >+ that->start_request_signal_id_ = 0; >+ } >+ that->portal_init_failed_ = true; >+ return; >+ } >+ >+ g_free(handle); >+ >+ RTC_LOG(LS_INFO) << "Subscribed to the start signal."; >+} >+ >+// static >+void BaseCapturerPipeWire::OnStartRequestResponseSignal( >+ GDBusConnection* connection, >+ const gchar* sender_name, >+ const gchar* object_path, >+ const gchar* interface_name, >+ const gchar* signal_name, >+ GVariant* parameters, >+ gpointer user_data) { >+ BaseCapturerPipeWire* that = static_cast<BaseCapturerPipeWire*>(user_data); >+ RTC_DCHECK(that); >+ >+ RTC_LOG(LS_INFO) << "Start signal received."; >+ guint32 portal_response; >+ GVariant* response_data; >+ GVariantIter* 
iter = nullptr; >+ g_variant_get(parameters, "(u@a{sv})", &portal_response, &response_data); >+ if (portal_response || !response_data) { >+ RTC_LOG(LS_ERROR) << "Failed to start the screen cast session."; >+ that->portal_init_failed_ = true; >+ return; >+ } >+ >+ // Array of PipeWire streams. See >+ // https://github.com/flatpak/xdg-desktop-portal/blob/master/data/org.freedesktop.portal.ScreenCast.xml >+ // documentation for <method name="Start">. >+ if (g_variant_lookup(response_data, "streams", "a(ua{sv})", &iter)) { >+ GVariant* variant; >+ >+ while (g_variant_iter_next(iter, "@(ua{sv})", &variant)) { >+ guint32 stream_id; >+ gint32 width; >+ gint32 height; >+ GVariant* options; >+ >+ g_variant_get(variant, "(u@a{sv})", &stream_id, &options); >+ RTC_DCHECK(options != nullptr); >+ >+ g_variant_lookup(options, "size", "(ii)", &width, &height); >+ >+ that->desktop_size_.set(width, height); >+ >+ g_variant_unref(options); >+ g_variant_unref(variant); >+ } >+ } >+ g_variant_iter_free(iter); >+ g_variant_unref(response_data); >+ >+ that->OpenPipeWireRemote(); >+} >+ >+void BaseCapturerPipeWire::OpenPipeWireRemote() { >+ GVariantBuilder builder; >+ g_variant_builder_init(&builder, G_VARIANT_TYPE_VARDICT); >+ >+ RTC_LOG(LS_INFO) << "Opening the PipeWire remote."; >+ >+ g_dbus_proxy_call_with_unix_fd_list( >+ proxy_, "OpenPipeWireRemote", >+ g_variant_new("(oa{sv})", session_handle_, &builder), >+ G_DBUS_CALL_FLAGS_NONE, /*timeout=*/-1, /*fd_list=*/nullptr, >+ /*cancellable=*/nullptr, >+ reinterpret_cast<GAsyncReadyCallback>(OnOpenPipeWireRemoteRequested), >+ this); >+} >+ >+// static >+void BaseCapturerPipeWire::OnOpenPipeWireRemoteRequested( >+ GDBusConnection* connection, >+ GAsyncResult* result, >+ gpointer user_data) { >+ BaseCapturerPipeWire* that = static_cast<BaseCapturerPipeWire*>(user_data); >+ RTC_DCHECK(that); >+ >+ GError* error = nullptr; >+ GUnixFDList* outlist = nullptr; >+ GVariant* variant = g_dbus_proxy_call_with_unix_fd_list_finish( >+ that->proxy_, 
&outlist, result, &error); >+ if (!variant) { >+ RTC_LOG(LS_ERROR) << "Failed to open the PipeWire remote: " >+ << error->message; >+ g_error_free(error); >+ that->portal_init_failed_ = true; >+ return; >+ } >+ >+ gint32 index; >+ g_variant_get(variant, "(h)", &index); >+ >+ if ((that->pw_fd_ = g_unix_fd_list_get(outlist, index, &error)) == -1) { >+ RTC_LOG(LS_ERROR) << "Failed to get file descriptor from the list: " >+ << error->message; >+ g_error_free(error); >+ g_variant_unref(variant); >+ that->portal_init_failed_ = true; >+ return; >+ } >+ >+ g_variant_unref(variant); >+ g_object_unref(outlist); >+ >+ that->InitPipeWire(); >+ RTC_LOG(LS_INFO) << "PipeWire remote opened."; >+} >+ >+void BaseCapturerPipeWire::Start(Callback* callback) { >+ RTC_DCHECK(!callback_); >+ RTC_DCHECK(callback); >+ >+ InitPortal(); >+ >+ callback_ = callback; >+} >+ >+void BaseCapturerPipeWire::CaptureFrame() { >+ if (portal_init_failed_) { >+ callback_->OnCaptureResult(Result::ERROR_PERMANENT, nullptr); >+ return; >+ } >+ >+ if (!current_frame_) { >+ callback_->OnCaptureResult(Result::ERROR_TEMPORARY, nullptr); >+ return; >+ } >+ >+ std::unique_ptr<DesktopFrame> result(new BasicDesktopFrame(desktop_size_)); >+ result->CopyPixelsFrom( >+ current_frame_, (desktop_size_.width() * kBytesPerPixel), >+ DesktopRect::MakeWH(desktop_size_.width(), desktop_size_.height())); >+ if (!result) { >+ callback_->OnCaptureResult(Result::ERROR_TEMPORARY, nullptr); >+ return; >+ } >+ callback_->OnCaptureResult(Result::SUCCESS, std::move(result)); >+} >+ >+bool BaseCapturerPipeWire::GetSourceList(SourceList* sources) { >+ RTC_DCHECK(sources->size() == 0); >+ // List of available screens is already presented by the xdg-desktop-portal. >+ // But we have to add an empty source as the code expects it. >+ sources->push_back({0}); >+ return true; >+} >+ >+bool BaseCapturerPipeWire::SelectSource(SourceId id) { >+ // Screen selection is handled by the xdg-desktop-portal. 
>+ return true; >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/base_capturer_pipewire.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/base_capturer_pipewire.h >new file mode 100644 >index 0000000000000000000000000000000000000000..56b101acbaa601b83f81c7b48890c79f3cb66755 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/base_capturer_pipewire.h >@@ -0,0 +1,167 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef MODULES_DESKTOP_CAPTURE_LINUX_BASE_CAPTURER_PIPEWIRE_H_ >+#define MODULES_DESKTOP_CAPTURE_LINUX_BASE_CAPTURER_PIPEWIRE_H_ >+ >+#include <gio/gio.h> >+#define typeof __typeof__ >+#include <pipewire/pipewire.h> >+#include <spa/param/video/format-utils.h> >+ >+#include "modules/desktop_capture/desktop_capture_options.h" >+#include "modules/desktop_capture/desktop_capturer.h" >+#include "rtc_base/constructormagic.h" >+ >+namespace webrtc { >+ >+class PipeWireType { >+ public: >+ spa_type_media_type media_type; >+ spa_type_media_subtype media_subtype; >+ spa_type_format_video format_video; >+ spa_type_video_format video_format; >+}; >+ >+class BaseCapturerPipeWire : public DesktopCapturer { >+ public: >+ enum CaptureSourceType { Screen = 1, Window }; >+ >+ explicit BaseCapturerPipeWire(CaptureSourceType source_type); >+ ~BaseCapturerPipeWire() override; >+ >+ // DesktopCapturer interface. 
>+ void Start(Callback* delegate) override; >+ void CaptureFrame() override; >+ bool GetSourceList(SourceList* sources) override; >+ bool SelectSource(SourceId id) override; >+ >+ private: >+ // PipeWire types --> >+ pw_core* pw_core_ = nullptr; >+ pw_type* pw_core_type_ = nullptr; >+ pw_stream* pw_stream_ = nullptr; >+ pw_remote* pw_remote_ = nullptr; >+ pw_loop* pw_loop_ = nullptr; >+ pw_thread_loop* pw_main_loop_ = nullptr; >+ PipeWireType* pw_type_ = nullptr; >+ >+ spa_hook spa_stream_listener_ = {}; >+ spa_hook spa_remote_listener_ = {}; >+ >+ pw_stream_events pw_stream_events_ = {}; >+ pw_remote_events pw_remote_events_ = {}; >+ >+ spa_video_info_raw* spa_video_format_ = nullptr; >+ >+ gint32 pw_fd_ = -1; >+ >+ CaptureSourceType capture_source_type_ = >+ BaseCapturerPipeWire::CaptureSourceType::Screen; >+ >+ // <-- end of PipeWire types >+ >+ GDBusConnection* connection_ = nullptr; >+ GDBusProxy* proxy_ = nullptr; >+ gchar* portal_handle_ = nullptr; >+ gchar* session_handle_ = nullptr; >+ gchar* sources_handle_ = nullptr; >+ gchar* start_handle_ = nullptr; >+ guint session_request_signal_id_ = 0; >+ guint sources_request_signal_id_ = 0; >+ guint start_request_signal_id_ = 0; >+ >+ DesktopSize desktop_size_ = {}; >+ DesktopCaptureOptions options_ = {}; >+ >+ uint8_t* current_frame_ = nullptr; >+ Callback* callback_ = nullptr; >+ >+ bool portal_init_failed_ = false; >+ >+ void InitPortal(); >+ void InitPipeWire(); >+ void InitPipeWireTypes(); >+ >+ void CreateReceivingStream(); >+ void HandleBuffer(pw_buffer* buffer); >+ >+ void ConvertRGBxToBGRx(uint8_t* frame, uint32_t size); >+ >+ static void OnStateChanged(void* data, >+ pw_remote_state old_state, >+ pw_remote_state state, >+ const char* error); >+ static void OnStreamStateChanged(void* data, >+ pw_stream_state old_state, >+ pw_stream_state state, >+ const char* error_message); >+ >+ static void OnStreamFormatChanged(void* data, const struct spa_pod* format); >+ static void OnStreamProcess(void* data); >+ 
static void OnNewBuffer(void* data, uint32_t id); >+ >+ guint SetupRequestResponseSignal(const gchar* object_path, >+ GDBusSignalCallback callback); >+ >+ static void OnProxyRequested(GObject* object, >+ GAsyncResult* result, >+ gpointer user_data); >+ >+ static gchar* PrepareSignalHandle(GDBusConnection* connection, >+ const gchar* token); >+ >+ void SessionRequest(); >+ static void OnSessionRequested(GDBusConnection* connection, >+ GAsyncResult* result, >+ gpointer user_data); >+ static void OnSessionRequestResponseSignal(GDBusConnection* connection, >+ const gchar* sender_name, >+ const gchar* object_path, >+ const gchar* interface_name, >+ const gchar* signal_name, >+ GVariant* parameters, >+ gpointer user_data); >+ >+ void SourcesRequest(); >+ static void OnSourcesRequested(GDBusConnection* connection, >+ GAsyncResult* result, >+ gpointer user_data); >+ static void OnSourcesRequestResponseSignal(GDBusConnection* connection, >+ const gchar* sender_name, >+ const gchar* object_path, >+ const gchar* interface_name, >+ const gchar* signal_name, >+ GVariant* parameters, >+ gpointer user_data); >+ >+ void StartRequest(); >+ static void OnStartRequested(GDBusConnection* connection, >+ GAsyncResult* result, >+ gpointer user_data); >+ static void OnStartRequestResponseSignal(GDBusConnection* connection, >+ const gchar* sender_name, >+ const gchar* object_path, >+ const gchar* interface_name, >+ const gchar* signal_name, >+ GVariant* parameters, >+ gpointer user_data); >+ >+ void OpenPipeWireRemote(); >+ static void OnOpenPipeWireRemoteRequested(GDBusConnection* connection, >+ GAsyncResult* result, >+ gpointer user_data); >+ >+ RTC_DISALLOW_COPY_AND_ASSIGN(BaseCapturerPipeWire); >+}; >+ >+} // namespace webrtc >+ >+#endif // MODULES_DESKTOP_CAPTURE_LINUX_BASE_CAPTURER_PIPEWIRE_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/mouse_cursor_monitor_x11.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/mouse_cursor_monitor_x11.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..f7503822bced8bbdbe47f3cd105abd8f1032f350 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/mouse_cursor_monitor_x11.cc >@@ -0,0 +1,247 @@ >+/* >+ * Copyright (c) 2013 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "modules/desktop_capture/linux/mouse_cursor_monitor_x11.h" >+ >+#include <X11/Xutil.h> >+#include <X11/extensions/Xfixes.h> >+ >+#include <algorithm> >+#include <memory> >+ >+#include "modules/desktop_capture/desktop_capture_options.h" >+#include "modules/desktop_capture/desktop_capture_types.h" >+#include "modules/desktop_capture/desktop_frame.h" >+#include "modules/desktop_capture/linux/x_error_trap.h" >+#include "modules/desktop_capture/mouse_cursor.h" >+#include "modules/desktop_capture/mouse_cursor_monitor.h" >+#include "rtc_base/logging.h" >+ >+namespace { >+ >+// WindowCapturer returns window IDs of X11 windows with WM_STATE attribute. >+// These windows may not be immediate children of the root window, because >+// window managers may re-parent them to add decorations. However, >+// XQueryPointer() expects to be passed children of the root. This function >+// searches up the list of the windows to find the root child that corresponds >+// to |window|. >+Window GetTopLevelWindow(Display* display, Window window) { >+ while (true) { >+ // If the window is in WithdrawnState then look at all of its children. 
>+ ::Window root, parent; >+ ::Window* children; >+ unsigned int num_children; >+ if (!XQueryTree(display, window, &root, &parent, &children, >+ &num_children)) { >+ RTC_LOG(LS_ERROR) << "Failed to query for child windows although window" >+ << " does not have a valid WM_STATE."; >+ return None; >+ } >+ if (children) >+ XFree(children); >+ >+ if (parent == root) >+ break; >+ >+ window = parent; >+ } >+ >+ return window; >+} >+ >+} // namespace >+ >+namespace webrtc { >+ >+MouseCursorMonitorX11::MouseCursorMonitorX11( >+ const DesktopCaptureOptions& options, >+ Window window) >+ : x_display_(options.x_display()), >+ callback_(NULL), >+ mode_(SHAPE_AND_POSITION), >+ window_(window), >+ have_xfixes_(false), >+ xfixes_event_base_(-1), >+ xfixes_error_base_(-1) { >+ // Set a default initial cursor shape in case XFixes is not present. >+ const int kSize = 5; >+ std::unique_ptr<DesktopFrame> default_cursor( >+ new BasicDesktopFrame(DesktopSize(kSize, kSize))); >+ const uint8_t pixels[kSize * kSize] = { >+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0xff, 0xff, 0xff, >+ 0x00, 0x00, 0xff, 0xff, 0xff, 0x00, 0x00, 0xff, 0xff, >+ 0xff, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00}; >+ uint8_t* ptr = default_cursor->data(); >+ for (int y = 0; y < kSize; ++y) { >+ for (int x = 0; x < kSize; ++x) { >+ *ptr++ = pixels[kSize * y + x]; >+ *ptr++ = pixels[kSize * y + x]; >+ *ptr++ = pixels[kSize * y + x]; >+ *ptr++ = 0xff; >+ } >+ } >+ DesktopVector hotspot(2, 2); >+ cursor_shape_.reset(new MouseCursor(default_cursor.release(), hotspot)); >+} >+ >+MouseCursorMonitorX11::~MouseCursorMonitorX11() { >+ if (have_xfixes_) { >+ x_display_->RemoveEventHandler(xfixes_event_base_ + XFixesCursorNotify, >+ this); >+ } >+} >+ >+void MouseCursorMonitorX11::Init(Callback* callback, Mode mode) { >+ // Init can be called only once per instance of MouseCursorMonitor. 
>+ RTC_DCHECK(!callback_); >+ RTC_DCHECK(callback); >+ >+ callback_ = callback; >+ mode_ = mode; >+ >+ have_xfixes_ = >+ XFixesQueryExtension(display(), &xfixes_event_base_, &xfixes_error_base_); >+ >+ if (have_xfixes_) { >+ // Register for changes to the cursor shape. >+ XFixesSelectCursorInput(display(), window_, XFixesDisplayCursorNotifyMask); >+ x_display_->AddEventHandler(xfixes_event_base_ + XFixesCursorNotify, this); >+ >+ CaptureCursor(); >+ } else { >+ RTC_LOG(LS_INFO) << "X server does not support XFixes."; >+ } >+} >+ >+void MouseCursorMonitorX11::Capture() { >+ RTC_DCHECK(callback_); >+ >+ // Process X11 events in case XFixes has sent cursor notification. >+ x_display_->ProcessPendingXEvents(); >+ >+ // |cursor_shape_| is set only if we were notified of a cursor shape change. >+ if (cursor_shape_.get()) >+ callback_->OnMouseCursor(cursor_shape_.release()); >+ >+ // Get cursor position if necessary. >+ if (mode_ == SHAPE_AND_POSITION) { >+ int root_x; >+ int root_y; >+ int win_x; >+ int win_y; >+ Window root_window; >+ Window child_window; >+ unsigned int mask; >+ >+ XErrorTrap error_trap(display()); >+ Bool result = XQueryPointer(display(), window_, &root_window, &child_window, >+ &root_x, &root_y, &win_x, &win_y, &mask); >+ CursorState state; >+ if (!result || error_trap.GetLastErrorAndDisable() != 0) { >+ state = OUTSIDE; >+ } else { >+ // In screen mode (window_ == root_window) the mouse is always inside. >+ // XQueryPointer() sets |child_window| to None if the cursor is outside >+ // |window_|. >+ state = >+ (window_ == root_window || child_window != None) ? INSIDE : OUTSIDE; >+ } >+ >+ // As the comments to GetTopLevelWindow() above indicate, in window capture, >+ // the cursor position capture happens in |window_|, while the frame capture >+ // happens in |child_window|. These two windows are not always the same, as >+ // the window manager may add some decorations to the |window_|. 
So translate >+ // the coordinate in |window_| to the coordinate space of |child_window|. >+ if (window_ != root_window && state == INSIDE) { >+ int translated_x, translated_y; >+ Window unused; >+ if (XTranslateCoordinates(display(), window_, child_window, win_x, win_y, >+ &translated_x, &translated_y, &unused)) { >+ win_x = translated_x; >+ win_y = translated_y; >+ } >+ } >+ >+ // X11 always starts the coordinate from (0, 0), so we do not need to >+ // translate here. >+ callback_->OnMouseCursorPosition(DesktopVector(root_x, root_y)); >+ } >+} >+ >+bool MouseCursorMonitorX11::HandleXEvent(const XEvent& event) { >+ if (have_xfixes_ && event.type == xfixes_event_base_ + XFixesCursorNotify) { >+ const XFixesCursorNotifyEvent* cursor_event = >+ reinterpret_cast<const XFixesCursorNotifyEvent*>(&event); >+ if (cursor_event->subtype == XFixesDisplayCursorNotify) { >+ CaptureCursor(); >+ } >+ // Return false, even if the event has been handled, because there might be >+ // other listeners for cursor notifications. 
>+ } >+ return false; >+} >+ >+void MouseCursorMonitorX11::CaptureCursor() { >+ RTC_DCHECK(have_xfixes_); >+ >+ XFixesCursorImage* img; >+ { >+ XErrorTrap error_trap(display()); >+ img = XFixesGetCursorImage(display()); >+ if (!img || error_trap.GetLastErrorAndDisable() != 0) >+ return; >+ } >+ >+ std::unique_ptr<DesktopFrame> image( >+ new BasicDesktopFrame(DesktopSize(img->width, img->height))); >+ >+ uint64_t* src = reinterpret_cast<uint64_t*>(img->pixels); >+ uint32_t* dst = reinterpret_cast<uint32_t*>(image->data()); >+ uint32_t* dst_end = dst + (img->width * img->height); >+ while (dst < dst_end) { >+ *dst++ = *src++; >+ } >+ >+ DesktopVector hotspot(std::min(img->width, img->xhot), >+ std::min(img->height, img->yhot)); >+ >+ XFree(img); >+ >+ cursor_shape_.reset(new MouseCursor(image.release(), hotspot)); >+} >+ >+// static >+MouseCursorMonitor* MouseCursorMonitorX11::CreateForWindow( >+ const DesktopCaptureOptions& options, >+ WindowId window) { >+ if (!options.x_display()) >+ return NULL; >+ window = GetTopLevelWindow(options.x_display()->display(), window); >+ if (window == None) >+ return NULL; >+ return new MouseCursorMonitorX11(options, window); >+} >+ >+MouseCursorMonitor* MouseCursorMonitorX11::CreateForScreen( >+ const DesktopCaptureOptions& options, >+ ScreenId screen) { >+ if (!options.x_display()) >+ return NULL; >+ return new MouseCursorMonitorX11( >+ options, DefaultRootWindow(options.x_display()->display())); >+} >+ >+std::unique_ptr<MouseCursorMonitor> MouseCursorMonitorX11::Create( >+ const DesktopCaptureOptions& options) { >+ return std::unique_ptr<MouseCursorMonitor>( >+ CreateForScreen(options, kFullDesktopScreenId)); >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/mouse_cursor_monitor_x11.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/mouse_cursor_monitor_x11.h >new file mode 100644 >index 
0000000000000000000000000000000000000000..3e8e0ca78c324de451b6e94a114156d5a342eaac >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/mouse_cursor_monitor_x11.h >@@ -0,0 +1,64 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef MODULES_DESKTOP_CAPTURE_LINUX_MOUSE_CURSOR_MONITOR_X11_H_ >+#define MODULES_DESKTOP_CAPTURE_LINUX_MOUSE_CURSOR_MONITOR_X11_H_ >+ >+#include <X11/Xlib.h> >+ >+#include <memory> >+ >+#include "modules/desktop_capture/linux/shared_x_display.h" >+#include "modules/desktop_capture/mouse_cursor_monitor.h" >+ >+namespace webrtc { >+ >+class MouseCursorMonitorX11 : public MouseCursorMonitor, >+ public SharedXDisplay::XEventHandler { >+ public: >+ MouseCursorMonitorX11(const DesktopCaptureOptions& options, Window window); >+ ~MouseCursorMonitorX11() override; >+ >+ static MouseCursorMonitor* CreateForWindow( >+ const DesktopCaptureOptions& options, >+ WindowId window); >+ static MouseCursorMonitor* CreateForScreen( >+ const DesktopCaptureOptions& options, >+ ScreenId screen); >+ static std::unique_ptr<MouseCursorMonitor> Create( >+ const DesktopCaptureOptions& options); >+ >+ void Init(Callback* callback, Mode mode) override; >+ void Capture() override; >+ >+ private: >+ // SharedXDisplay::XEventHandler interface. >+ bool HandleXEvent(const XEvent& event) override; >+ >+ Display* display() { return x_display_->display(); } >+ >+ // Captures current cursor shape and stores it in |cursor_shape_|. 
>+ void CaptureCursor(); >+ >+ rtc::scoped_refptr<SharedXDisplay> x_display_; >+ Callback* callback_; >+ Mode mode_; >+ Window window_; >+ >+ bool have_xfixes_; >+ int xfixes_event_base_; >+ int xfixes_error_base_; >+ >+ std::unique_ptr<MouseCursor> cursor_shape_; >+}; >+ >+} // namespace webrtc >+ >+#endif // MODULES_DESKTOP_CAPTURE_LINUX_MOUSE_CURSOR_MONITOR_X11_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/screen_capturer_pipewire.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/screen_capturer_pipewire.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..724d8537d31178df40ad324074c30f5cced0090a >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/screen_capturer_pipewire.cc >@@ -0,0 +1,30 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "modules/desktop_capture/linux/screen_capturer_pipewire.h" >+ >+#include <memory> >+ >+#include "absl/memory/memory.h" >+ >+namespace webrtc { >+ >+ScreenCapturerPipeWire::ScreenCapturerPipeWire() >+ : BaseCapturerPipeWire(BaseCapturerPipeWire::CaptureSourceType::Screen) {} >+ScreenCapturerPipeWire::~ScreenCapturerPipeWire() {} >+ >+// static >+std::unique_ptr<DesktopCapturer> >+ScreenCapturerPipeWire::CreateRawScreenCapturer( >+ const DesktopCaptureOptions& options) { >+ return absl::make_unique<ScreenCapturerPipeWire>(); >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/screen_capturer_pipewire.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/screen_capturer_pipewire.h >new file mode 100644 >index 0000000000000000000000000000000000000000..66dcd680e06a320fa37acfbf0bafb72b5670eefa >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/screen_capturer_pipewire.h >@@ -0,0 +1,33 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#ifndef MODULES_DESKTOP_CAPTURE_LINUX_SCREEN_CAPTURER_PIPEWIRE_H_ >+#define MODULES_DESKTOP_CAPTURE_LINUX_SCREEN_CAPTURER_PIPEWIRE_H_ >+ >+#include <memory> >+ >+#include "modules/desktop_capture/linux/base_capturer_pipewire.h" >+ >+namespace webrtc { >+ >+class ScreenCapturerPipeWire : public BaseCapturerPipeWire { >+ public: >+ ScreenCapturerPipeWire(); >+ ~ScreenCapturerPipeWire() override; >+ >+ static std::unique_ptr<DesktopCapturer> CreateRawScreenCapturer( >+ const DesktopCaptureOptions& options); >+ >+ RTC_DISALLOW_COPY_AND_ASSIGN(ScreenCapturerPipeWire); >+}; >+ >+} // namespace webrtc >+ >+#endif // MODULES_DESKTOP_CAPTURE_LINUX_SCREEN_CAPTURER_PIPEWIRE_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/screen_capturer_x11.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/screen_capturer_x11.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..9d6a085b65ccfaaf3f0e2853d797377c468a9a79 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/screen_capturer_x11.cc >@@ -0,0 +1,337 @@ >+/* >+ * Copyright (c) 2013 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "modules/desktop_capture/linux/screen_capturer_x11.h" >+ >+#include <string.h> >+ >+#include <X11/Xlib.h> >+#include <X11/Xutil.h> >+#include <X11/extensions/Xdamage.h> >+#include <X11/extensions/Xfixes.h> >+ >+#include <memory> >+#include <set> >+#include <utility> >+ >+#include "modules/desktop_capture/desktop_capture_options.h" >+#include "modules/desktop_capture/desktop_capturer.h" >+#include "modules/desktop_capture/desktop_frame.h" >+#include "modules/desktop_capture/linux/x_server_pixel_buffer.h" >+#include "modules/desktop_capture/screen_capture_frame_queue.h" >+#include "modules/desktop_capture/screen_capturer_helper.h" >+#include "modules/desktop_capture/shared_desktop_frame.h" >+#include "rtc_base/checks.h" >+#include "rtc_base/logging.h" >+#include "rtc_base/timeutils.h" >+#include "rtc_base/trace_event.h" >+ >+namespace webrtc { >+ >+ScreenCapturerX11::ScreenCapturerX11() { >+ helper_.SetLogGridSize(4); >+} >+ >+ScreenCapturerX11::~ScreenCapturerX11() { >+ options_.x_display()->RemoveEventHandler(ConfigureNotify, this); >+ if (use_damage_) { >+ options_.x_display()->RemoveEventHandler(damage_event_base_ + XDamageNotify, >+ this); >+ } >+ DeinitXlib(); >+} >+ >+bool ScreenCapturerX11::Init(const DesktopCaptureOptions& options) { >+ TRACE_EVENT0("webrtc", "ScreenCapturerX11::Init"); >+ options_ = options; >+ >+ root_window_ = RootWindow(display(), DefaultScreen(display())); >+ if (root_window_ == BadValue) { >+ RTC_LOG(LS_ERROR) << "Unable to get the root window"; >+ DeinitXlib(); >+ return false; >+ } >+ >+ gc_ = XCreateGC(display(), root_window_, 0, NULL); >+ if (gc_ == NULL) { >+ RTC_LOG(LS_ERROR) << "Unable to get graphics context"; >+ DeinitXlib(); >+ return false; >+ } >+ >+ options_.x_display()->AddEventHandler(ConfigureNotify, this); >+ >+ // Check for XFixes extension. This is required for cursor shape >+ // notifications, and for our use of XDamage. 
>+ if (XFixesQueryExtension(display(), &xfixes_event_base_, >+ &xfixes_error_base_)) { >+ has_xfixes_ = true; >+ } else { >+ RTC_LOG(LS_INFO) << "X server does not support XFixes."; >+ } >+ >+ // Register for changes to the dimensions of the root window. >+ XSelectInput(display(), root_window_, StructureNotifyMask); >+ >+ if (!x_server_pixel_buffer_.Init(display(), DefaultRootWindow(display()))) { >+ RTC_LOG(LS_ERROR) << "Failed to initialize pixel buffer."; >+ return false; >+ } >+ >+ if (options_.use_update_notifications()) { >+ InitXDamage(); >+ } >+ >+ return true; >+} >+ >+void ScreenCapturerX11::InitXDamage() { >+ // Our use of XDamage requires XFixes. >+ if (!has_xfixes_) { >+ return; >+ } >+ >+ // Check for XDamage extension. >+ if (!XDamageQueryExtension(display(), &damage_event_base_, >+ &damage_error_base_)) { >+ RTC_LOG(LS_INFO) << "X server does not support XDamage."; >+ return; >+ } >+ >+ // TODO(lambroslambrou): Disable DAMAGE in situations where it is known >+ // to fail, such as when Desktop Effects are enabled, with graphics >+ // drivers (nVidia, ATI) that fail to report DAMAGE notifications >+ // properly. >+ >+ // Request notifications every time the screen becomes damaged. >+ damage_handle_ = >+ XDamageCreate(display(), root_window_, XDamageReportNonEmpty); >+ if (!damage_handle_) { >+ RTC_LOG(LS_ERROR) << "Unable to initialize XDamage."; >+ return; >+ } >+ >+ // Create an XFixes server-side region to collate damage into. 
>+ damage_region_ = XFixesCreateRegion(display(), 0, 0); >+ if (!damage_region_) { >+ XDamageDestroy(display(), damage_handle_); >+ RTC_LOG(LS_ERROR) << "Unable to create XFixes region."; >+ return; >+ } >+ >+ options_.x_display()->AddEventHandler(damage_event_base_ + XDamageNotify, >+ this); >+ >+ use_damage_ = true; >+ RTC_LOG(LS_INFO) << "Using XDamage extension."; >+} >+ >+void ScreenCapturerX11::Start(Callback* callback) { >+ RTC_DCHECK(!callback_); >+ RTC_DCHECK(callback); >+ >+ callback_ = callback; >+} >+ >+void ScreenCapturerX11::CaptureFrame() { >+ TRACE_EVENT0("webrtc", "ScreenCapturerX11::CaptureFrame"); >+ int64_t capture_start_time_nanos = rtc::TimeNanos(); >+ >+ queue_.MoveToNextFrame(); >+ RTC_DCHECK(!queue_.current_frame() || !queue_.current_frame()->IsShared()); >+ >+ // Process XEvents for XDamage and cursor shape tracking. >+ options_.x_display()->ProcessPendingXEvents(); >+ >+ // ProcessPendingXEvents() may call ScreenConfigurationChanged() which >+ // reinitializes |x_server_pixel_buffer_|. Check if the pixel buffer is still >+ // in a good shape. >+ if (!x_server_pixel_buffer_.is_initialized()) { >+ // We failed to initialize pixel buffer. >+ RTC_LOG(LS_ERROR) << "Pixel buffer is not initialized."; >+ callback_->OnCaptureResult(Result::ERROR_PERMANENT, nullptr); >+ return; >+ } >+ >+ // If the current frame is from an older generation then allocate a new one. >+ // Note that we can't reallocate other buffers at this point, since the caller >+ // may still be reading from them. 
>+ if (!queue_.current_frame()) { >+ queue_.ReplaceCurrentFrame( >+ SharedDesktopFrame::Wrap(std::unique_ptr<DesktopFrame>( >+ new BasicDesktopFrame(x_server_pixel_buffer_.window_size())))); >+ } >+ >+ std::unique_ptr<DesktopFrame> result = CaptureScreen(); >+ if (!result) { >+ RTC_LOG(LS_WARNING) << "Temporarily failed to capture screen."; >+ callback_->OnCaptureResult(Result::ERROR_TEMPORARY, nullptr); >+ return; >+ } >+ >+ last_invalid_region_ = result->updated_region(); >+ result->set_capture_time_ms((rtc::TimeNanos() - capture_start_time_nanos) / >+ rtc::kNumNanosecsPerMillisec); >+ callback_->OnCaptureResult(Result::SUCCESS, std::move(result)); >+} >+ >+bool ScreenCapturerX11::GetSourceList(SourceList* sources) { >+ RTC_DCHECK(sources->size() == 0); >+ // TODO(jiayl): implement screen enumeration. >+ sources->push_back({0}); >+ return true; >+} >+ >+bool ScreenCapturerX11::SelectSource(SourceId id) { >+ // TODO(jiayl): implement screen selection. >+ return true; >+} >+ >+bool ScreenCapturerX11::HandleXEvent(const XEvent& event) { >+ if (use_damage_ && (event.type == damage_event_base_ + XDamageNotify)) { >+ const XDamageNotifyEvent* damage_event = >+ reinterpret_cast<const XDamageNotifyEvent*>(&event); >+ if (damage_event->damage != damage_handle_) >+ return false; >+ RTC_DCHECK(damage_event->level == XDamageReportNonEmpty); >+ return true; >+ } else if (event.type == ConfigureNotify) { >+ ScreenConfigurationChanged(); >+ return true; >+ } >+ return false; >+} >+ >+std::unique_ptr<DesktopFrame> ScreenCapturerX11::CaptureScreen() { >+ std::unique_ptr<SharedDesktopFrame> frame = queue_.current_frame()->Share(); >+ RTC_DCHECK(x_server_pixel_buffer_.window_size().equals(frame->size())); >+ >+ // Pass the screen size to the helper, so it can clip the invalid region if it >+ // expands that region to a grid. >+ helper_.set_size_most_recent(frame->size()); >+ >+ // In the DAMAGE case, ensure the frame is up-to-date with the previous frame >+ // if any. 
If there isn't a previous frame, that means a screen-resolution >+ // change occurred, and |invalid_rects| will be updated to include the whole >+ // screen. >+ if (use_damage_ && queue_.previous_frame()) >+ SynchronizeFrame(); >+ >+ DesktopRegion* updated_region = frame->mutable_updated_region(); >+ >+ x_server_pixel_buffer_.Synchronize(); >+ if (use_damage_ && queue_.previous_frame()) { >+ // Atomically fetch and clear the damage region. >+ XDamageSubtract(display(), damage_handle_, None, damage_region_); >+ int rects_num = 0; >+ XRectangle bounds; >+ XRectangle* rects = XFixesFetchRegionAndBounds(display(), damage_region_, >+ &rects_num, &bounds); >+ for (int i = 0; i < rects_num; ++i) { >+ updated_region->AddRect(DesktopRect::MakeXYWH( >+ rects[i].x, rects[i].y, rects[i].width, rects[i].height)); >+ } >+ XFree(rects); >+ helper_.InvalidateRegion(*updated_region); >+ >+ // Capture the damaged portions of the desktop. >+ helper_.TakeInvalidRegion(updated_region); >+ >+ // Clip the damaged portions to the current screen size, just in case some >+ // spurious XDamage notifications were received for a previous (larger) >+ // screen size. >+ updated_region->IntersectWith( >+ DesktopRect::MakeSize(x_server_pixel_buffer_.window_size())); >+ >+ for (DesktopRegion::Iterator it(*updated_region); !it.IsAtEnd(); >+ it.Advance()) { >+ if (!x_server_pixel_buffer_.CaptureRect(it.rect(), frame.get())) >+ return nullptr; >+ } >+ } else { >+ // Doing full-screen polling, or this is the first capture after a >+ // screen-resolution change. In either case, need a full-screen capture. 
>+ DesktopRect screen_rect = DesktopRect::MakeSize(frame->size()); >+ if (!x_server_pixel_buffer_.CaptureRect(screen_rect, frame.get())) >+ return nullptr; >+ updated_region->SetRect(screen_rect); >+ } >+ >+ return std::move(frame); >+} >+ >+void ScreenCapturerX11::ScreenConfigurationChanged() { >+ TRACE_EVENT0("webrtc", "ScreenCapturerX11::ScreenConfigurationChanged"); >+ // Make sure the frame buffers will be reallocated. >+ queue_.Reset(); >+ >+ helper_.ClearInvalidRegion(); >+ if (!x_server_pixel_buffer_.Init(display(), DefaultRootWindow(display()))) { >+ RTC_LOG(LS_ERROR) << "Failed to initialize pixel buffer after screen " >+ "configuration change."; >+ } >+} >+ >+void ScreenCapturerX11::SynchronizeFrame() { >+ // Synchronize the current buffer with the previous one since we do not >+ // capture the entire desktop. Note that encoder may be reading from the >+ // previous buffer at this time so thread access complaints are false >+ // positives. >+ >+ // TODO(hclam): We can reduce the amount of copying here by subtracting >+ // |capturer_helper_|s region from |last_invalid_region_|. 
>+ // http://crbug.com/92354 >+ RTC_DCHECK(queue_.previous_frame()); >+ >+ DesktopFrame* current = queue_.current_frame(); >+ DesktopFrame* last = queue_.previous_frame(); >+ RTC_DCHECK(current != last); >+ for (DesktopRegion::Iterator it(last_invalid_region_); !it.IsAtEnd(); >+ it.Advance()) { >+ current->CopyPixelsFrom(*last, it.rect().top_left(), it.rect()); >+ } >+} >+ >+void ScreenCapturerX11::DeinitXlib() { >+ if (gc_) { >+ XFreeGC(display(), gc_); >+ gc_ = nullptr; >+ } >+ >+ x_server_pixel_buffer_.Release(); >+ >+ if (display()) { >+ if (damage_handle_) { >+ XDamageDestroy(display(), damage_handle_); >+ damage_handle_ = 0; >+ } >+ >+ if (damage_region_) { >+ XFixesDestroyRegion(display(), damage_region_); >+ damage_region_ = 0; >+ } >+ } >+} >+ >+// static >+std::unique_ptr<DesktopCapturer> ScreenCapturerX11::CreateRawScreenCapturer( >+ const DesktopCaptureOptions& options) { >+ if (!options.x_display()) >+ return nullptr; >+ >+ std::unique_ptr<ScreenCapturerX11> capturer(new ScreenCapturerX11()); >+ if (!capturer.get()->Init(options)) { >+ return nullptr; >+ } >+ >+ return std::move(capturer); >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/screen_capturer_x11.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/screen_capturer_x11.h >new file mode 100644 >index 0000000000000000000000000000000000000000..9f3b60c3d87e5b17dd5949777851fb9cc49b98e3 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/screen_capturer_x11.h >@@ -0,0 +1,123 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. 
All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef MODULES_DESKTOP_CAPTURE_LINUX_SCREEN_CAPTURER_X11_H_ >+#define MODULES_DESKTOP_CAPTURE_LINUX_SCREEN_CAPTURER_X11_H_ >+ >+#include <X11/Xlib.h> >+#include <X11/extensions/Xdamage.h> >+ >+#include <memory> >+#include <set> >+#include <utility> >+ >+#include "modules/desktop_capture/desktop_capture_options.h" >+#include "modules/desktop_capture/desktop_capturer.h" >+#include "modules/desktop_capture/desktop_frame.h" >+#include "modules/desktop_capture/linux/x_server_pixel_buffer.h" >+#include "modules/desktop_capture/screen_capture_frame_queue.h" >+#include "modules/desktop_capture/screen_capturer_helper.h" >+#include "modules/desktop_capture/shared_desktop_frame.h" >+#include "rtc_base/constructormagic.h" >+ >+namespace webrtc { >+ >+// A class to perform video frame capturing for Linux on X11. >+// >+// If XDamage is used, this class sets DesktopFrame::updated_region() according >+// to the areas reported by XDamage. Otherwise this class does not detect >+// DesktopFrame::updated_region(), the field is always set to the entire frame >+// rectangle. ScreenCapturerDifferWrapper should be used if that functionality >+// is necessary. >+class ScreenCapturerX11 : public DesktopCapturer, >+ public SharedXDisplay::XEventHandler { >+ public: >+ ScreenCapturerX11(); >+ ~ScreenCapturerX11() override; >+ >+ static std::unique_ptr<DesktopCapturer> CreateRawScreenCapturer( >+ const DesktopCaptureOptions& options); >+ >+ // TODO(ajwong): Do we really want this to be synchronous? >+ bool Init(const DesktopCaptureOptions& options); >+ >+ // DesktopCapturer interface. 
>+ void Start(Callback* delegate) override; >+ void CaptureFrame() override; >+ bool GetSourceList(SourceList* sources) override; >+ bool SelectSource(SourceId id) override; >+ >+ private: >+ Display* display() { return options_.x_display()->display(); } >+ >+ // SharedXDisplay::XEventHandler interface. >+ bool HandleXEvent(const XEvent& event) override; >+ >+ void InitXDamage(); >+ >+ // Capture screen pixels to the current buffer in the queue. In the DAMAGE >+ // case, the ScreenCapturerHelper already holds the list of invalid rectangles >+ // from HandleXEvent(). In the non-DAMAGE case, this captures the >+ // whole screen, then calculates some invalid rectangles that include any >+ // differences between this and the previous capture. >+ std::unique_ptr<DesktopFrame> CaptureScreen(); >+ >+ // Called when the screen configuration is changed. >+ void ScreenConfigurationChanged(); >+ >+ // Synchronize the current buffer with |last_buffer_|, by copying pixels from >+ // the area of |last_invalid_rects|. >+ // Note this only works on the assumption that kNumBuffers == 2, as >+ // |last_invalid_rects| holds the differences from the previous buffer and >+ // the one prior to that (which will then be the current buffer). >+ void SynchronizeFrame(); >+ >+ void DeinitXlib(); >+ >+ DesktopCaptureOptions options_; >+ >+ Callback* callback_ = nullptr; >+ >+ // X11 graphics context. >+ GC gc_ = nullptr; >+ Window root_window_ = BadValue; >+ >+ // XFixes. >+ bool has_xfixes_ = false; >+ int xfixes_event_base_ = -1; >+ int xfixes_error_base_ = -1; >+ >+ // XDamage information. >+ bool use_damage_ = false; >+ Damage damage_handle_ = 0; >+ int damage_event_base_ = -1; >+ int damage_error_base_ = -1; >+ XserverRegion damage_region_ = 0; >+ >+ // Access to the X Server's pixel buffer. >+ XServerPixelBuffer x_server_pixel_buffer_; >+ >+ // A thread-safe list of invalid rectangles, and the size of the most >+ // recently captured screen. 
>+ ScreenCapturerHelper helper_; >+ >+ // Queue of frame buffers. >+ ScreenCaptureFrameQueue<SharedDesktopFrame> queue_; >+ >+ // Invalid region from the previous capture. This is used to synchronize the >+ // current with the last buffer used. >+ DesktopRegion last_invalid_region_; >+ >+ RTC_DISALLOW_COPY_AND_ASSIGN(ScreenCapturerX11); >+}; >+ >+} // namespace webrtc >+ >+#endif // MODULES_DESKTOP_CAPTURE_LINUX_SCREEN_CAPTURER_X11_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/shared_x_display.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/shared_x_display.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..c475db6e78759b53abbee977f09b60e4056ac055 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/shared_x_display.cc >@@ -0,0 +1,89 @@ >+/* >+ * Copyright (c) 2013 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "modules/desktop_capture/linux/shared_x_display.h" >+ >+#include <X11/Xlib.h> >+ >+#include <algorithm> >+ >+#include "rtc_base/checks.h" >+#include "rtc_base/logging.h" >+ >+namespace webrtc { >+ >+SharedXDisplay::SharedXDisplay(Display* display) : display_(display) { >+ RTC_DCHECK(display_); >+} >+ >+SharedXDisplay::~SharedXDisplay() { >+ RTC_DCHECK(event_handlers_.empty()); >+ XCloseDisplay(display_); >+} >+ >+// static >+rtc::scoped_refptr<SharedXDisplay> SharedXDisplay::Create( >+ const std::string& display_name) { >+ Display* display = >+ XOpenDisplay(display_name.empty() ?
NULL : display_name.c_str()); >+ if (!display) { >+ RTC_LOG(LS_ERROR) << "Unable to open display"; >+ return NULL; >+ } >+ return new SharedXDisplay(display); >+} >+ >+// static >+rtc::scoped_refptr<SharedXDisplay> SharedXDisplay::CreateDefault() { >+ return Create(std::string()); >+} >+ >+void SharedXDisplay::AddEventHandler(int type, XEventHandler* handler) { >+ event_handlers_[type].push_back(handler); >+} >+ >+void SharedXDisplay::RemoveEventHandler(int type, XEventHandler* handler) { >+ EventHandlersMap::iterator handlers = event_handlers_.find(type); >+ if (handlers == event_handlers_.end()) >+ return; >+ >+ std::vector<XEventHandler*>::iterator new_end = >+ std::remove(handlers->second.begin(), handlers->second.end(), handler); >+ handlers->second.erase(new_end, handlers->second.end()); >+ >+ // Check if no handlers left for this event. >+ if (handlers->second.empty()) >+ event_handlers_.erase(handlers); >+} >+ >+void SharedXDisplay::ProcessPendingXEvents() { >+ // Hold reference to |this| to prevent it from being destroyed while >+ // processing events. >+ rtc::scoped_refptr<SharedXDisplay> self(this); >+ >+ // Find the number of events that are outstanding "now." We don't just loop >+ // on XPending because we want to guarantee this terminates. 
>+ int events_to_process = XPending(display()); >+ XEvent e; >+ >+ for (int i = 0; i < events_to_process; i++) { >+ XNextEvent(display(), &e); >+ EventHandlersMap::iterator handlers = event_handlers_.find(e.type); >+ if (handlers == event_handlers_.end()) >+ continue; >+ for (std::vector<XEventHandler*>::iterator it = handlers->second.begin(); >+ it != handlers->second.end(); ++it) { >+ if ((*it)->HandleXEvent(e)) >+ break; >+ } >+ } >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/shared_x_display.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/shared_x_display.h >new file mode 100644 >index 0000000000000000000000000000000000000000..a14f41fab8a3b73241777405b1ac476188ea1722 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/shared_x_display.h >@@ -0,0 +1,81 @@ >+/* >+ * Copyright (c) 2013 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef MODULES_DESKTOP_CAPTURE_LINUX_SHARED_X_DISPLAY_H_ >+#define MODULES_DESKTOP_CAPTURE_LINUX_SHARED_X_DISPLAY_H_ >+ >+#include <map> >+#include <vector> >+ >+#include <string> >+ >+#include "api/refcountedbase.h" >+#include "rtc_base/constructormagic.h" >+#include "rtc_base/scoped_ref_ptr.h" >+ >+// Including Xlib.h will involve evil defines (Bool, Status, True, False), which >+// easily conflict with other headers. >+typedef struct _XDisplay Display; >+typedef union _XEvent XEvent; >+ >+namespace webrtc { >+ >+// A ref-counted object to store XDisplay connection. 
>+class SharedXDisplay : public rtc::RefCountedBase { >+ public: >+ class XEventHandler { >+ public: >+ virtual ~XEventHandler() {} >+ >+ // Processes XEvent. Returns true if the event has been handled. >+ virtual bool HandleXEvent(const XEvent& event) = 0; >+ }; >+ >+ // Takes ownership of |display|. >+ explicit SharedXDisplay(Display* display); >+ >+ // Creates a new X11 Display for the |display_name|. NULL is returned if X11 >+ // connection failed. Equivalent to CreateDefault() when |display_name| is >+ // empty. >+ static rtc::scoped_refptr<SharedXDisplay> Create( >+ const std::string& display_name); >+ >+ // Creates X11 Display connection for the default display (e.g. specified in >+ // DISPLAY). NULL is returned if X11 connection failed. >+ static rtc::scoped_refptr<SharedXDisplay> CreateDefault(); >+ >+ Display* display() { return display_; } >+ >+ // Adds a new event |handler| for XEvent's of |type|. >+ void AddEventHandler(int type, XEventHandler* handler); >+ >+ // Removes event |handler| added using |AddEventHandler|. Doesn't do anything >+ // if |handler| is not registered. >+ void RemoveEventHandler(int type, XEventHandler* handler); >+ >+ // Processes pending XEvents, calling corresponding event handlers. 
>+ void ProcessPendingXEvents(); >+ >+ protected: >+ ~SharedXDisplay() override; >+ >+ private: >+ typedef std::map<int, std::vector<XEventHandler*> > EventHandlersMap; >+ >+ Display* display_; >+ >+ EventHandlersMap event_handlers_; >+ >+ RTC_DISALLOW_COPY_AND_ASSIGN(SharedXDisplay); >+}; >+ >+} // namespace webrtc >+ >+#endif // MODULES_DESKTOP_CAPTURE_LINUX_SHARED_X_DISPLAY_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/window_capturer_pipewire.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/window_capturer_pipewire.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..1c195aaf4a01423407c727cb602c54ed76e9d9ea >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/window_capturer_pipewire.cc >@@ -0,0 +1,30 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "modules/desktop_capture/linux/window_capturer_pipewire.h" >+ >+#include <memory> >+ >+#include "absl/memory/memory.h" >+ >+namespace webrtc { >+ >+WindowCapturerPipeWire::WindowCapturerPipeWire() >+ : BaseCapturerPipeWire(BaseCapturerPipeWire::CaptureSourceType::Window) {} >+WindowCapturerPipeWire::~WindowCapturerPipeWire() {} >+ >+// static >+std::unique_ptr<DesktopCapturer> >+WindowCapturerPipeWire::CreateRawWindowCapturer( >+ const DesktopCaptureOptions& options) { >+ return absl::make_unique<WindowCapturerPipeWire>(); >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/window_capturer_pipewire.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/window_capturer_pipewire.h >new file mode 100644 >index 0000000000000000000000000000000000000000..7f184ef29992dfcb7884cefe32eba591d716f455 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/window_capturer_pipewire.h >@@ -0,0 +1,33 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#ifndef MODULES_DESKTOP_CAPTURE_LINUX_WINDOW_CAPTURER_PIPEWIRE_H_ >+#define MODULES_DESKTOP_CAPTURE_LINUX_WINDOW_CAPTURER_PIPEWIRE_H_ >+ >+#include <memory> >+ >+#include "modules/desktop_capture/linux/base_capturer_pipewire.h" >+ >+namespace webrtc { >+ >+class WindowCapturerPipeWire : public BaseCapturerPipeWire { >+ public: >+ WindowCapturerPipeWire(); >+ ~WindowCapturerPipeWire() override; >+ >+ static std::unique_ptr<DesktopCapturer> CreateRawWindowCapturer( >+ const DesktopCaptureOptions& options); >+ >+ RTC_DISALLOW_COPY_AND_ASSIGN(WindowCapturerPipeWire); >+}; >+ >+} // namespace webrtc >+ >+#endif // MODULES_DESKTOP_CAPTURE_LINUX_WINDOW_CAPTURER_PIPEWIRE_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/window_capturer_x11.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/window_capturer_x11.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..4ef7c5cb3882eebf4a325b01664f935d56b342b1 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/window_capturer_x11.cc >@@ -0,0 +1,244 @@ >+/* >+ * Copyright (c) 2013 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "modules/desktop_capture/linux/window_capturer_x11.h" >+ >+#include <X11/Xutil.h> >+#include <X11/extensions/Xcomposite.h> >+#include <X11/extensions/Xrender.h> >+ >+#include <memory> >+#include <string> >+#include <utility> >+ >+#include "modules/desktop_capture/desktop_frame.h" >+#include "modules/desktop_capture/linux/shared_x_display.h" >+#include "modules/desktop_capture/linux/window_finder_x11.h" >+#include "modules/desktop_capture/linux/window_list_utils.h" >+#include "rtc_base/checks.h" >+#include "rtc_base/logging.h" >+#include "rtc_base/scoped_ref_ptr.h" >+#include "rtc_base/trace_event.h" >+ >+namespace webrtc { >+ >+WindowCapturerX11::WindowCapturerX11(const DesktopCaptureOptions& options) >+ : x_display_(options.x_display()), >+ atom_cache_(display()), >+ window_finder_(&atom_cache_) { >+ int event_base, error_base, major_version, minor_version; >+ if (XCompositeQueryExtension(display(), &event_base, &error_base) && >+ XCompositeQueryVersion(display(), &major_version, &minor_version) && >+ // XCompositeNameWindowPixmap() requires version 0.2 >+ (major_version > 0 || minor_version >= 2)) { >+ has_composite_extension_ = true; >+ } else { >+ RTC_LOG(LS_INFO) << "Xcomposite extension not available or too old."; >+ } >+ >+ x_display_->AddEventHandler(ConfigureNotify, this); >+} >+ >+WindowCapturerX11::~WindowCapturerX11() { >+ x_display_->RemoveEventHandler(ConfigureNotify, this); >+} >+ >+bool WindowCapturerX11::GetSourceList(SourceList* sources) { >+ return GetWindowList(&atom_cache_, [this, sources](::Window window) { >+ Source w; >+ w.id = window; >+ if (this->GetWindowTitle(window, &w.title)) { >+ sources->push_back(w); >+ } >+ return true; >+ }); >+} >+ >+bool WindowCapturerX11::SelectSource(SourceId id) { >+ if (!x_server_pixel_buffer_.Init(display(), id)) >+ return false; >+ >+ // Tell the X server to send us window resizing events. 
>+ XSelectInput(display(), id, StructureNotifyMask); >+ >+ selected_window_ = id; >+ >+ // In addition to needing X11 server-side support for Xcomposite, it actually >+ // needs to be turned on for the window. If the user has modern >+ // hardware/drivers but isn't using a compositing window manager, that won't >+ // be the case. Here we automatically turn it on. >+ >+ // Redirect drawing to an offscreen buffer (ie, turn on compositing). X11 >+ // remembers who has requested this and will turn it off for us when we exit. >+ XCompositeRedirectWindow(display(), id, CompositeRedirectAutomatic); >+ >+ return true; >+} >+ >+bool WindowCapturerX11::FocusOnSelectedSource() { >+ if (!selected_window_) >+ return false; >+ >+ unsigned int num_children; >+ ::Window* children; >+ ::Window parent; >+ ::Window root; >+ // Find the root window to pass event to. >+ int status = XQueryTree(display(), selected_window_, &root, &parent, >+ &children, &num_children); >+ if (status == 0) { >+ RTC_LOG(LS_ERROR) << "Failed to query for the root window."; >+ return false; >+ } >+ >+ if (children) >+ XFree(children); >+ >+ XRaiseWindow(display(), selected_window_); >+ >+ // Some window managers (e.g., metacity in GNOME) consider it illegal to >+ // raise a window without also giving it input focus with >+ // _NET_ACTIVE_WINDOW, so XRaiseWindow() on its own isn't enough. >+ Atom atom = XInternAtom(display(), "_NET_ACTIVE_WINDOW", True); >+ if (atom != None) { >+ XEvent xev; >+ xev.xclient.type = ClientMessage; >+ xev.xclient.serial = 0; >+ xev.xclient.send_event = True; >+ xev.xclient.window = selected_window_; >+ xev.xclient.message_type = atom; >+ >+ // The format member is set to 8, 16, or 32 and specifies whether the >+ // data should be viewed as a list of bytes, shorts, or longs. 
>+ xev.xclient.format = 32; >+ >+ memset(xev.xclient.data.l, 0, sizeof(xev.xclient.data.l)); >+ >+ XSendEvent(display(), root, False, >+ SubstructureRedirectMask | SubstructureNotifyMask, &xev); >+ } >+ XFlush(display()); >+ return true; >+} >+ >+void WindowCapturerX11::Start(Callback* callback) { >+ RTC_DCHECK(!callback_); >+ RTC_DCHECK(callback); >+ >+ callback_ = callback; >+} >+ >+void WindowCapturerX11::CaptureFrame() { >+ TRACE_EVENT0("webrtc", "WindowCapturerX11::CaptureFrame"); >+ >+ if (!x_server_pixel_buffer_.IsWindowValid()) { >+ RTC_LOG(LS_ERROR) << "The window is no longer valid."; >+ callback_->OnCaptureResult(Result::ERROR_PERMANENT, nullptr); >+ return; >+ } >+ >+ x_display_->ProcessPendingXEvents(); >+ >+ if (!has_composite_extension_) { >+ // Without the Xcomposite extension we capture when the whole window is >+ // visible on screen and not covered by any other window. This is not >+ // something we want, so just bail out. >+ RTC_LOG(LS_ERROR) << "No Xcomposite extension detected."; >+ callback_->OnCaptureResult(Result::ERROR_PERMANENT, nullptr); >+ return; >+ } >+ >+ if (GetWindowState(&atom_cache_, selected_window_) == IconicState) { >+ // The window is minimized. Return a 1x1 frame, as OSX/Win do.
>+ std::unique_ptr<DesktopFrame> frame( >+ new BasicDesktopFrame(DesktopSize(1, 1))); >+ callback_->OnCaptureResult(Result::SUCCESS, std::move(frame)); >+ return; >+ } >+ >+ std::unique_ptr<DesktopFrame> frame( >+ new BasicDesktopFrame(x_server_pixel_buffer_.window_size())); >+ >+ x_server_pixel_buffer_.Synchronize(); >+ if (!x_server_pixel_buffer_.CaptureRect(DesktopRect::MakeSize(frame->size()), >+ frame.get())) { >+ RTC_LOG(LS_WARNING) << "Temporarily failed to capture window."; >+ callback_->OnCaptureResult(Result::ERROR_TEMPORARY, nullptr); >+ return; >+ } >+ >+ frame->mutable_updated_region()->SetRect( >+ DesktopRect::MakeSize(frame->size())); >+ frame->set_top_left(x_server_pixel_buffer_.window_rect().top_left()); >+ >+ callback_->OnCaptureResult(Result::SUCCESS, std::move(frame)); >+} >+ >+bool WindowCapturerX11::IsOccluded(const DesktopVector& pos) { >+ return window_finder_.GetWindowUnderPoint(pos) != >+ static_cast<WindowId>(selected_window_); >+} >+ >+bool WindowCapturerX11::HandleXEvent(const XEvent& event) { >+ if (event.type == ConfigureNotify) { >+ XConfigureEvent xce = event.xconfigure; >+ if (xce.window == selected_window_) { >+ if (!DesktopRectFromXAttributes(xce).equals( >+ x_server_pixel_buffer_.window_rect())) { >+ if (!x_server_pixel_buffer_.Init(display(), selected_window_)) { >+ RTC_LOG(LS_ERROR) >+ << "Failed to initialize pixel buffer after resizing."; >+ } >+ } >+ } >+ } >+ >+ // Always returns false, so other observers can still receive the events.
>+ return false; >+} >+ >+bool WindowCapturerX11::GetWindowTitle(::Window window, std::string* title) { >+ int status; >+ bool result = false; >+ XTextProperty window_name; >+ window_name.value = nullptr; >+ if (window) { >+ status = XGetWMName(display(), window, &window_name); >+ if (status && window_name.value && window_name.nitems) { >+ int cnt; >+ char** list = nullptr; >+ status = >+ Xutf8TextPropertyToTextList(display(), &window_name, &list, &cnt); >+ if (status >= Success && cnt && *list) { >+ if (cnt > 1) { >+ RTC_LOG(LS_INFO) << "Window has " << cnt >+ << " text properties, only using the first one."; >+ } >+ *title = *list; >+ result = true; >+ } >+ if (list) >+ XFreeStringList(list); >+ } >+ if (window_name.value) >+ XFree(window_name.value); >+ } >+ return result; >+} >+ >+// static >+std::unique_ptr<DesktopCapturer> WindowCapturerX11::CreateRawWindowCapturer( >+ const DesktopCaptureOptions& options) { >+ if (!options.x_display()) >+ return nullptr; >+ return std::unique_ptr<DesktopCapturer>(new WindowCapturerX11(options)); >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/window_capturer_x11.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/window_capturer_x11.h >new file mode 100644 >index 0000000000000000000000000000000000000000..4cd2c97cb9fddea5a7875fde236495c17f5591b8 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/window_capturer_x11.h >@@ -0,0 +1,68 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#ifndef MODULES_DESKTOP_CAPTURE_LINUX_WINDOW_CAPTURER_X11_H_ >+#define MODULES_DESKTOP_CAPTURE_LINUX_WINDOW_CAPTURER_X11_H_ >+ >+#include <memory> >+#include <string> >+ >+#include "modules/desktop_capture/desktop_capture_options.h" >+#include "modules/desktop_capture/desktop_capturer.h" >+#include "modules/desktop_capture/linux/window_finder_x11.h" >+#include "modules/desktop_capture/linux/x_atom_cache.h" >+#include "modules/desktop_capture/linux/x_server_pixel_buffer.h" >+#include "rtc_base/constructormagic.h" >+ >+namespace webrtc { >+ >+class WindowCapturerX11 : public DesktopCapturer, >+ public SharedXDisplay::XEventHandler { >+ public: >+ explicit WindowCapturerX11(const DesktopCaptureOptions& options); >+ ~WindowCapturerX11() override; >+ >+ static std::unique_ptr<DesktopCapturer> CreateRawWindowCapturer( >+ const DesktopCaptureOptions& options); >+ >+ // DesktopCapturer interface. >+ void Start(Callback* callback) override; >+ void CaptureFrame() override; >+ bool GetSourceList(SourceList* sources) override; >+ bool SelectSource(SourceId id) override; >+ bool FocusOnSelectedSource() override; >+ bool IsOccluded(const DesktopVector& pos) override; >+ >+ // SharedXDisplay::XEventHandler interface. >+ bool HandleXEvent(const XEvent& event) override; >+ >+ private: >+ Display* display() { return x_display_->display(); } >+ >+ // Returns window title for the specified X |window|. 
>+ bool GetWindowTitle(::Window window, std::string* title); >+ >+ Callback* callback_ = nullptr; >+ >+ rtc::scoped_refptr<SharedXDisplay> x_display_; >+ >+ bool has_composite_extension_ = false; >+ >+ ::Window selected_window_ = 0; >+ XServerPixelBuffer x_server_pixel_buffer_; >+ XAtomCache atom_cache_; >+ WindowFinderX11 window_finder_; >+ >+ RTC_DISALLOW_COPY_AND_ASSIGN(WindowCapturerX11); >+}; >+ >+} // namespace webrtc >+ >+#endif // MODULES_DESKTOP_CAPTURE_LINUX_WINDOW_CAPTURER_X11_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/window_finder_x11.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/window_finder_x11.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..46fadc2c6fb43292dba613706bdc302fe9c819e3 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/window_finder_x11.cc >@@ -0,0 +1,51 @@ >+/* >+ * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "modules/desktop_capture/linux/window_finder_x11.h" >+ >+#include <memory> >+ >+#include "absl/memory/memory.h" >+#include "modules/desktop_capture/linux/window_list_utils.h" >+#include "rtc_base/checks.h" >+ >+namespace webrtc { >+ >+WindowFinderX11::WindowFinderX11(XAtomCache* cache) : cache_(cache) { >+ RTC_DCHECK(cache_); >+} >+ >+WindowFinderX11::~WindowFinderX11() = default; >+ >+WindowId WindowFinderX11::GetWindowUnderPoint(DesktopVector point) { >+ WindowId id = kNullWindowId; >+ GetWindowList(cache_, [&id, this, point](::Window window) { >+ DesktopRect rect; >+ if (GetWindowRect(this->cache_->display(), window, &rect) && >+ rect.Contains(point)) { >+ id = window; >+ return false; >+ } >+ return true; >+ }); >+ return id; >+} >+ >+// static >+std::unique_ptr<WindowFinder> WindowFinder::Create( >+ const WindowFinder::Options& options) { >+ if (options.cache == nullptr) { >+ return nullptr; >+ } >+ >+ return absl::make_unique<WindowFinderX11>(options.cache); >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/window_finder_x11.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/window_finder_x11.h >new file mode 100644 >index 0000000000000000000000000000000000000000..d0bba8697b5f7544d4d6cccde6f12ce56a9ce488 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/window_finder_x11.h >@@ -0,0 +1,35 @@ >+/* >+ * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#ifndef MODULES_DESKTOP_CAPTURE_LINUX_WINDOW_FINDER_X11_H_ >+#define MODULES_DESKTOP_CAPTURE_LINUX_WINDOW_FINDER_X11_H_ >+ >+#include "modules/desktop_capture/window_finder.h" >+ >+namespace webrtc { >+ >+class XAtomCache; >+ >+// The implementation of WindowFinder for X11. >+class WindowFinderX11 final : public WindowFinder { >+ public: >+ explicit WindowFinderX11(XAtomCache* cache); >+ ~WindowFinderX11() override; >+ >+ // WindowFinder implementation. >+ WindowId GetWindowUnderPoint(DesktopVector point) override; >+ >+ private: >+ XAtomCache* const cache_; >+}; >+ >+} // namespace webrtc >+ >+#endif // MODULES_DESKTOP_CAPTURE_LINUX_WINDOW_FINDER_X11_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/window_list_utils.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/window_list_utils.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..fd8f5f72cd5e5e5e0512909f797a30229de0dc96 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/window_list_utils.cc >@@ -0,0 +1,246 @@ >+/* >+ * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */
>+
>+#include "modules/desktop_capture/linux/window_list_utils.h"
>+
>+#include <X11/Xatom.h>
>+#include <X11/Xlib.h>
>+#include <X11/Xutil.h>
>+#include <string.h>
>+
>+#include <algorithm>
>+
>+#include "modules/desktop_capture/linux/x_error_trap.h"
>+#include "rtc_base/checks.h"
>+#include "rtc_base/constructormagic.h"
>+#include "rtc_base/logging.h"
>+
>+namespace webrtc {
>+
>+namespace {
>+
>+class DeferXFree {
>+ public:
>+  explicit DeferXFree(void* data) : data_(data) {}
>+  ~DeferXFree();
>+
>+ private:
>+  void* const data_;
>+};
>+
>+DeferXFree::~DeferXFree() {
>+  if (data_)
>+    XFree(data_);
>+}
>+
>+// Convenience wrapper for XGetWindowProperty() results.
>+template <class PropertyType>
>+class XWindowProperty {
>+ public:
>+  XWindowProperty(Display* display, Window window, Atom property) {
>+    const int kBitsPerByte = 8;
>+    Atom actual_type;
>+    int actual_format;
>+    unsigned long bytes_after;  // NOLINT: type required by XGetWindowProperty
>+    int status = XGetWindowProperty(
>+        display, window, property, 0L, ~0L, False, AnyPropertyType,
>+        &actual_type, &actual_format, &size_, &bytes_after, &data_);
>+    if (status != Success) {
>+      data_ = nullptr;
>+      return;
>+    }
>+    if (sizeof(PropertyType) * kBitsPerByte != actual_format) {
>+      size_ = 0;
>+      return;
>+    }
>+
>+    is_valid_ = true;
>+  }
>+
>+  ~XWindowProperty() {
>+    if (data_)
>+      XFree(data_);
>+  }
>+
>+  // True if we fetched the property value successfully.
>+  bool is_valid() const { return is_valid_; }
>+
>+  // Size and value of the property.
>+  size_t size() const { return size_; }
>+  const PropertyType* data() const {
>+    return reinterpret_cast<PropertyType*>(data_);
>+  }
>+  PropertyType* data() { return reinterpret_cast<PropertyType*>(data_); }
>+
>+ private:
>+  bool is_valid_ = false;
>+  unsigned long size_ = 0;  // NOLINT: type required by XGetWindowProperty
>+  unsigned char* data_ = nullptr;
>+
>+  RTC_DISALLOW_COPY_AND_ASSIGN(XWindowProperty);
>+};
>+
>+// Iterates through the |window| hierarchy to find the first visible window,
>+// i.e. one that has its WM_STATE property set to NormalState.
>+// See http://tronche.com/gui/x/icccm/sec-4.html#s-4.1.3.1 .
>+::Window GetApplicationWindow(XAtomCache* cache, ::Window window) {
>+  int32_t state = GetWindowState(cache, window);
>+  if (state == NormalState) {
>+    // Window has WM_STATE==NormalState. Return it.
>+    return window;
>+  } else if (state == IconicState) {
>+    // Window is minimized. Skip it.
>+    return 0;
>+  }
>+
>+  RTC_DCHECK_EQ(state, WithdrawnState);
>+  // If the window is in WithdrawnState then look at all of its children.
>+  ::Window root, parent;
>+  ::Window* children;
>+  unsigned int num_children;
>+  if (!XQueryTree(cache->display(), window, &root, &parent, &children,
>+                  &num_children)) {
>+    RTC_LOG(LS_ERROR) << "Failed to query for child windows although window "
>+                      << "does not have a valid WM_STATE.";
>+    return 0;
>+  }
>+  ::Window app_window = 0;
>+  for (unsigned int i = 0; i < num_children; ++i) {
>+    app_window = GetApplicationWindow(cache, children[i]);
>+    if (app_window)
>+      break;
>+  }
>+
>+  if (children)
>+    XFree(children);
>+  return app_window;
>+}
>+
>+// Returns true if the |window| is a desktop element.
>+bool IsDesktopElement(XAtomCache* cache, ::Window window) {
>+  RTC_DCHECK(cache);
>+  if (window == 0)
>+    return false;
>+
>+  // First look for _NET_WM_WINDOW_TYPE. The standard
>+  // (http://standards.freedesktop.org/wm-spec/latest/ar01s05.html#id2760306)
>+  // says this hint *should* be present on all windows, and we use the existence
>+  // of _NET_WM_WINDOW_TYPE_NORMAL in the property to indicate a window is not
>+  // a desktop element (that is, only "normal" windows should be shareable).
>+  XWindowProperty<uint32_t> window_type(cache->display(), window,
>+                                        cache->WindowType());
>+  if (window_type.is_valid() && window_type.size() > 0) {
>+    uint32_t* end = window_type.data() + window_type.size();
>+    bool is_normal =
>+        (end != std::find(window_type.data(), end, cache->WindowTypeNormal()));
>+    return !is_normal;
>+  }
>+
>+  // Fall back on using the class hint.
>+  XClassHint class_hint;
>+  Status status = XGetClassHint(cache->display(), window, &class_hint);
>+  if (status == 0) {
>+    // No hints, assume this is a normal application window.
>+    return false;
>+  }
>+
>+  DeferXFree free_res_name(class_hint.res_name);
>+  DeferXFree free_res_class(class_hint.res_class);
>+  return strcmp("gnome-panel", class_hint.res_name) == 0 ||
>+         strcmp("desktop_window", class_hint.res_name) == 0;
>+}
>+
>+}  // namespace
>+
>+int32_t GetWindowState(XAtomCache* cache, ::Window window) {
>+  // Get WM_STATE property of the window.
>+  XWindowProperty<uint32_t> window_state(cache->display(), window,
>+                                         cache->WmState());
>+
>+  // WM_STATE is considered to be set to WithdrawnState when it is missing.
>+  return window_state.is_valid() ? *window_state.data() : WithdrawnState;
>+}
>+
>+bool GetWindowList(XAtomCache* cache,
>+                   rtc::FunctionView<bool(::Window)> on_window) {
>+  RTC_DCHECK(cache);
>+  RTC_DCHECK(on_window);
>+  ::Display* const display = cache->display();
>+
>+  int failed_screens = 0;
>+  const int num_screens = XScreenCount(display);
>+  for (int screen = 0; screen < num_screens; screen++) {
>+    ::Window root_window = XRootWindow(display, screen);
>+    ::Window parent;
>+    ::Window* children;
>+    unsigned int num_children;
>+    {
>+      XErrorTrap error_trap(display);
>+      if (XQueryTree(display, root_window, &root_window, &parent, &children,
>+                     &num_children) == 0 ||
>+          error_trap.GetLastErrorAndDisable() != 0) {
>+        failed_screens++;
>+        RTC_LOG(LS_ERROR) << "Failed to query for child windows for screen "
>+                          << screen;
>+        continue;
>+      }
>+    }
>+
>+    DeferXFree free_children(children);
>+
>+    for (unsigned int i = 0; i < num_children; i++) {
>+      // Iterate in reverse order to return windows from front to back.
>+      ::Window app_window =
>+          GetApplicationWindow(cache, children[num_children - 1 - i]);
>+      if (app_window && !IsDesktopElement(cache, app_window)) {
>+        if (!on_window(app_window)) {
>+          return true;
>+        }
>+      }
>+    }
>+  }
>+
>+  return failed_screens < num_screens;
>+}
>+
>+bool GetWindowRect(::Display* display,
>+                   ::Window window,
>+                   DesktopRect* rect,
>+                   XWindowAttributes* attributes /* = nullptr */) {
>+  XWindowAttributes local_attributes;
>+  int offset_x;
>+  int offset_y;
>+  if (attributes == nullptr) {
>+    attributes = &local_attributes;
>+  }
>+
>+  {
>+    XErrorTrap error_trap(display);
>+    if (!XGetWindowAttributes(display, window, attributes) ||
>+        error_trap.GetLastErrorAndDisable() != 0) {
>+      return false;
>+    }
>+  }
>+  *rect = DesktopRectFromXAttributes(*attributes);
>+
>+  {
>+    XErrorTrap error_trap(display);
>+    ::Window child;
>+    if (!XTranslateCoordinates(display, window, attributes->root, -rect->left(),
>+                               -rect->top(), &offset_x, &offset_y, &child) ||
>+        error_trap.GetLastErrorAndDisable() != 0)
{ >+ return false; >+ } >+ } >+ rect->Translate(offset_x, offset_y); >+ return true; >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/window_list_utils.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/window_list_utils.h >new file mode 100644 >index 0000000000000000000000000000000000000000..2057c99b84c23e4772fc56acc354afb4ee1eba81 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/window_list_utils.h >@@ -0,0 +1,54 @@ >+/* >+ * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef MODULES_DESKTOP_CAPTURE_LINUX_WINDOW_LIST_UTILS_H_ >+#define MODULES_DESKTOP_CAPTURE_LINUX_WINDOW_LIST_UTILS_H_ >+ >+#include <X11/Xlib.h> >+ >+#include "modules/desktop_capture/desktop_geometry.h" >+#include "modules/desktop_capture/linux/x_atom_cache.h" >+#include "rtc_base/function_view.h" >+ >+namespace webrtc { >+ >+// Synchronously iterates all on-screen windows in |cache|.display() in >+// decreasing z-order and sends them one-by-one to |on_window| function before >+// GetWindowList() returns. If |on_window| returns false, this function ignores >+// other windows and returns immediately. GetWindowList() returns false if >+// native APIs failed. If multiple screens are attached to the |display|, this >+// function returns false only when native APIs failed on all screens. Menus, >+// panels and minimized windows will be ignored. >+bool GetWindowList(XAtomCache* cache, >+ rtc::FunctionView<bool(::Window)> on_window); >+ >+// Returns WM_STATE property of the |window|. 
This function returns >+// WithdrawnState if the |window| is missing. >+int32_t GetWindowState(XAtomCache* cache, ::Window window); >+ >+// Returns the rectangle of the |window| in the coordinates of |display|. This >+// function returns false if native APIs failed. If |attributes| is provided, it >+// will be filled with the attributes of |window|. The |rect| is in system >+// coordinate, i.e. the primary monitor always starts from (0, 0). >+bool GetWindowRect(::Display* display, >+ ::Window window, >+ DesktopRect* rect, >+ XWindowAttributes* attributes = nullptr); >+ >+// Creates a DesktopRect from |attributes|. >+template <typename T> >+DesktopRect DesktopRectFromXAttributes(const T& attributes) { >+ return DesktopRect::MakeXYWH(attributes.x, attributes.y, attributes.width, >+ attributes.height); >+} >+ >+} // namespace webrtc >+ >+#endif // MODULES_DESKTOP_CAPTURE_LINUX_WINDOW_LIST_UTILS_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/x_atom_cache.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/x_atom_cache.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..832e6166cb435cda87e3a9d1b46f4ea49aa6c0b6 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/x_atom_cache.cc >@@ -0,0 +1,47 @@ >+/* >+ * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "modules/desktop_capture/linux/x_atom_cache.h" >+ >+#include "rtc_base/checks.h" >+ >+namespace webrtc { >+ >+XAtomCache::XAtomCache(::Display* display) : display_(display) { >+ RTC_DCHECK(display_); >+} >+ >+XAtomCache::~XAtomCache() = default; >+ >+::Display* XAtomCache::display() const { >+ return display_; >+} >+ >+Atom XAtomCache::WmState() { >+ return CreateIfNotExist(&wm_state_, "WM_STATE"); >+} >+ >+Atom XAtomCache::WindowType() { >+ return CreateIfNotExist(&window_type_, "_NET_WM_WINDOW_TYPE"); >+} >+ >+Atom XAtomCache::WindowTypeNormal() { >+ return CreateIfNotExist(&window_type_normal_, "_NET_WM_WINDOW_TYPE_NORMAL"); >+} >+ >+Atom XAtomCache::CreateIfNotExist(Atom* atom, const char* name) { >+ RTC_DCHECK(atom); >+ if (*atom == None) { >+ *atom = XInternAtom(display(), name, True); >+ } >+ return *atom; >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/x_atom_cache.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/x_atom_cache.h >new file mode 100644 >index 0000000000000000000000000000000000000000..71789f05124e10100bfa0d290883ba82f1c362ee >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/x_atom_cache.h >@@ -0,0 +1,43 @@ >+/* >+ * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef MODULES_DESKTOP_CAPTURE_LINUX_X_ATOM_CACHE_H_ >+#define MODULES_DESKTOP_CAPTURE_LINUX_X_ATOM_CACHE_H_ >+ >+#include <X11/Xatom.h> >+#include <X11/Xlib.h> >+ >+namespace webrtc { >+ >+// A cache of Atom. Each Atom object is created on demand. 
>+class XAtomCache final {
>+ public:
>+  explicit XAtomCache(::Display* display);
>+  ~XAtomCache();
>+
>+  ::Display* display() const;
>+
>+  Atom WmState();
>+  Atom WindowType();
>+  Atom WindowTypeNormal();
>+
>+ private:
>+  // If |*atom| is None, this function uses XInternAtom() to retrieve an Atom.
>+  Atom CreateIfNotExist(Atom* atom, const char* name);
>+
>+  ::Display* const display_;
>+  Atom wm_state_ = None;
>+  Atom window_type_ = None;
>+  Atom window_type_normal_ = None;
>+};
>+
>+}  // namespace webrtc
>+
>+#endif  // MODULES_DESKTOP_CAPTURE_LINUX_X_ATOM_CACHE_H_
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/x_error_trap.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/x_error_trap.cc
>new file mode 100644
>index 0000000000000000000000000000000000000000..bb5ca4512dc62a28253228a7f603113fcaa1d996
>--- /dev/null
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/x_error_trap.cc
>@@ -0,0 +1,68 @@
>+/*
>+ * Copyright (c) 2013 The WebRTC project authors. All Rights Reserved.
>+ *
>+ * Use of this source code is governed by a BSD-style license
>+ * that can be found in the LICENSE file in the root of the source
>+ * tree. An additional intellectual property rights grant can be found
>+ * in the file PATENTS. All contributing project authors may
>+ * be found in the AUTHORS file in the root of the source tree.
>+ */
>+
>+#include "modules/desktop_capture/linux/x_error_trap.h"
>+
>+#include <assert.h>
>+
>+#if defined(TOOLKIT_GTK)
>+#include <gdk/gdk.h>
>+#endif  // defined(TOOLKIT_GTK)
>+
>+namespace webrtc {
>+
>+namespace {
>+
>+#if !defined(TOOLKIT_GTK)
>+
>+// TODO(sergeyu): This code is not thread safe. Fix it. Bug 2202.
>+static bool g_xserver_error_trap_enabled = false;
>+static int g_last_xserver_error_code = 0;
>+
>+int XServerErrorHandler(Display* display, XErrorEvent* error_event) {
>+  assert(g_xserver_error_trap_enabled);
>+  g_last_xserver_error_code = error_event->error_code;
>+  return 0;
>+}
>+
>+#endif  // !defined(TOOLKIT_GTK)
>+
>+}  // namespace
>+
>+XErrorTrap::XErrorTrap(Display* display)
>+    : original_error_handler_(NULL), enabled_(true) {
>+#if defined(TOOLKIT_GTK)
>+  gdk_error_trap_push();
>+#else  // !defined(TOOLKIT_GTK)
>+  assert(!g_xserver_error_trap_enabled);
>+  original_error_handler_ = XSetErrorHandler(&XServerErrorHandler);
>+  g_xserver_error_trap_enabled = true;
>+  g_last_xserver_error_code = 0;
>+#endif  // !defined(TOOLKIT_GTK)
>+}
>+
>+int XErrorTrap::GetLastErrorAndDisable() {
>+  enabled_ = false;
>+#if defined(TOOLKIT_GTK)
>+  return gdk_error_trap_pop();
>+#else  // !defined(TOOLKIT_GTK)
>+  assert(g_xserver_error_trap_enabled);
>+  XSetErrorHandler(original_error_handler_);
>+  g_xserver_error_trap_enabled = false;
>+  return g_last_xserver_error_code;
>+#endif  // !defined(TOOLKIT_GTK)
>+}
>+
>+XErrorTrap::~XErrorTrap() {
>+  if (enabled_)
>+    GetLastErrorAndDisable();
>+}
>+
>+}  // namespace webrtc
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/x_error_trap.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/x_error_trap.h
>new file mode 100644
>index 0000000000000000000000000000000000000000..a9d059a0bc3608e6d15e6e9674eaeb89e396bd3d
>--- /dev/null
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/x_error_trap.h
>@@ -0,0 +1,39 @@
>+/*
>+ * Copyright (c) 2013 The WebRTC project authors. All Rights Reserved.
>+ *
>+ * Use of this source code is governed by a BSD-style license
>+ * that can be found in the LICENSE file in the root of the source
>+ * tree. An additional intellectual property rights grant can be found
>+ * in the file PATENTS. All contributing project authors may
>+ * be found in the AUTHORS file in the root of the source tree.
>+ */
>+
>+#ifndef MODULES_DESKTOP_CAPTURE_LINUX_X_ERROR_TRAP_H_
>+#define MODULES_DESKTOP_CAPTURE_LINUX_X_ERROR_TRAP_H_
>+
>+#include <X11/Xlib.h>
>+
>+#include "rtc_base/constructormagic.h"
>+
>+namespace webrtc {
>+
>+// Helper class that registers an X Window error handler. Caller can use
>+// GetLastErrorAndDisable() to get the last error that was caught, if any.
>+class XErrorTrap {
>+ public:
>+  explicit XErrorTrap(Display* display);
>+  ~XErrorTrap();
>+
>+  // Returns the last error and unregisters the error handler.
>+  int GetLastErrorAndDisable();
>+
>+ private:
>+  XErrorHandler original_error_handler_;
>+  bool enabled_;
>+
>+  RTC_DISALLOW_COPY_AND_ASSIGN(XErrorTrap);
>+};
>+
>+}  // namespace webrtc
>+
>+#endif  // MODULES_DESKTOP_CAPTURE_LINUX_X_ERROR_TRAP_H_
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/x_server_pixel_buffer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/x_server_pixel_buffer.cc
>new file mode 100644
>index 0000000000000000000000000000000000000000..19dd1398c9cb0f6c64009c575ef0fc5115c1db5b
>--- /dev/null
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/x_server_pixel_buffer.cc
>@@ -0,0 +1,351 @@
>+/*
>+ * Copyright (c) 2013 The WebRTC project authors. All Rights Reserved.
>+ *
>+ * Use of this source code is governed by a BSD-style license
>+ * that can be found in the LICENSE file in the root of the source
>+ * tree. An additional intellectual property rights grant can be found
>+ * in the file PATENTS. All contributing project authors may
>+ * be found in the AUTHORS file in the root of the source tree.
>+ */ >+ >+#include "modules/desktop_capture/linux/x_server_pixel_buffer.h" >+ >+#include <string.h> >+#include <sys/shm.h> >+ >+#include "modules/desktop_capture/desktop_frame.h" >+#include "modules/desktop_capture/linux/window_list_utils.h" >+#include "modules/desktop_capture/linux/x_error_trap.h" >+#include "rtc_base/checks.h" >+#include "rtc_base/logging.h" >+ >+namespace webrtc { >+ >+namespace { >+ >+// Returns the number of bits |mask| has to be shifted left so its last >+// (most-significant) bit set becomes the most-significant bit of the word. >+// When |mask| is 0 the function returns 31. >+uint32_t MaskToShift(uint32_t mask) { >+ int shift = 0; >+ if ((mask & 0xffff0000u) == 0) { >+ mask <<= 16; >+ shift += 16; >+ } >+ if ((mask & 0xff000000u) == 0) { >+ mask <<= 8; >+ shift += 8; >+ } >+ if ((mask & 0xf0000000u) == 0) { >+ mask <<= 4; >+ shift += 4; >+ } >+ if ((mask & 0xc0000000u) == 0) { >+ mask <<= 2; >+ shift += 2; >+ } >+ if ((mask & 0x80000000u) == 0) >+ shift += 1; >+ >+ return shift; >+} >+ >+// Returns true if |image| is in RGB format. >+bool IsXImageRGBFormat(XImage* image) { >+ return image->bits_per_pixel == 32 && image->red_mask == 0xff0000 && >+ image->green_mask == 0xff00 && image->blue_mask == 0xff; >+} >+ >+// We expose two forms of blitting to handle variations in the pixel format. >+// In FastBlit(), the operation is effectively a memcpy. 
>+void FastBlit(XImage* x_image,
>+              uint8_t* src_pos,
>+              const DesktopRect& rect,
>+              DesktopFrame* frame) {
>+  int src_stride = x_image->bytes_per_line;
>+  int dst_x = rect.left(), dst_y = rect.top();
>+
>+  uint8_t* dst_pos = frame->data() + frame->stride() * dst_y;
>+  dst_pos += dst_x * DesktopFrame::kBytesPerPixel;
>+
>+  int height = rect.height();
>+  int row_bytes = rect.width() * DesktopFrame::kBytesPerPixel;
>+  for (int y = 0; y < height; ++y) {
>+    memcpy(dst_pos, src_pos, row_bytes);
>+    src_pos += src_stride;
>+    dst_pos += frame->stride();
>+  }
>+}
>+
>+void SlowBlit(XImage* x_image,
>+              uint8_t* src_pos,
>+              const DesktopRect& rect,
>+              DesktopFrame* frame) {
>+  int src_stride = x_image->bytes_per_line;
>+  int dst_x = rect.left(), dst_y = rect.top();
>+  int width = rect.width(), height = rect.height();
>+
>+  uint32_t red_mask = x_image->red_mask;
>+  uint32_t green_mask = x_image->green_mask;
>+  uint32_t blue_mask = x_image->blue_mask;
>+
>+  uint32_t red_shift = MaskToShift(red_mask);
>+  uint32_t green_shift = MaskToShift(green_mask);
>+  uint32_t blue_shift = MaskToShift(blue_mask);
>+
>+  int bits_per_pixel = x_image->bits_per_pixel;
>+
>+  uint8_t* dst_pos = frame->data() + frame->stride() * dst_y;
>+  dst_pos += dst_x * DesktopFrame::kBytesPerPixel;
>+  // TODO(hclam): Optimize, perhaps using MMX code or by converting to
>+  // YUV directly.
>+  // TODO(sergeyu): This code doesn't handle XImage byte order properly and
>+  // won't work with 24bpp images. Fix it.
>+  for (int y = 0; y < height; y++) {
>+    uint32_t* dst_pos_32 = reinterpret_cast<uint32_t*>(dst_pos);
>+    uint32_t* src_pos_32 = reinterpret_cast<uint32_t*>(src_pos);
>+    uint16_t* src_pos_16 = reinterpret_cast<uint16_t*>(src_pos);
>+    for (int x = 0; x < width; x++) {
>+      // Dereference through an appropriately-aligned pointer.
>+ uint32_t pixel; >+ if (bits_per_pixel == 32) { >+ pixel = src_pos_32[x]; >+ } else if (bits_per_pixel == 16) { >+ pixel = src_pos_16[x]; >+ } else { >+ pixel = src_pos[x]; >+ } >+ uint32_t r = (pixel & red_mask) << red_shift; >+ uint32_t g = (pixel & green_mask) << green_shift; >+ uint32_t b = (pixel & blue_mask) << blue_shift; >+ // Write as 32-bit RGB. >+ dst_pos_32[x] = >+ ((r >> 8) & 0xff0000) | ((g >> 16) & 0xff00) | ((b >> 24) & 0xff); >+ } >+ dst_pos += frame->stride(); >+ src_pos += src_stride; >+ } >+} >+ >+} // namespace >+ >+XServerPixelBuffer::XServerPixelBuffer() {} >+ >+XServerPixelBuffer::~XServerPixelBuffer() { >+ Release(); >+} >+ >+void XServerPixelBuffer::Release() { >+ if (x_image_) { >+ XDestroyImage(x_image_); >+ x_image_ = nullptr; >+ } >+ if (x_shm_image_) { >+ XDestroyImage(x_shm_image_); >+ x_shm_image_ = nullptr; >+ } >+ if (shm_pixmap_) { >+ XFreePixmap(display_, shm_pixmap_); >+ shm_pixmap_ = 0; >+ } >+ if (shm_gc_) { >+ XFreeGC(display_, shm_gc_); >+ shm_gc_ = nullptr; >+ } >+ >+ ReleaseSharedMemorySegment(); >+ >+ window_ = 0; >+} >+ >+void XServerPixelBuffer::ReleaseSharedMemorySegment() { >+ if (!shm_segment_info_) >+ return; >+ if (shm_segment_info_->shmaddr != nullptr) >+ shmdt(shm_segment_info_->shmaddr); >+ if (shm_segment_info_->shmid != -1) >+ shmctl(shm_segment_info_->shmid, IPC_RMID, 0); >+ delete shm_segment_info_; >+ shm_segment_info_ = nullptr; >+} >+ >+bool XServerPixelBuffer::Init(Display* display, Window window) { >+ Release(); >+ display_ = display; >+ >+ XWindowAttributes attributes; >+ if (!GetWindowRect(display_, window, &window_rect_, &attributes)) { >+ return false; >+ } >+ >+ window_ = window; >+ InitShm(attributes); >+ >+ return true; >+} >+ >+void XServerPixelBuffer::InitShm(const XWindowAttributes& attributes) { >+ Visual* default_visual = attributes.visual; >+ int default_depth = attributes.depth; >+ >+ int major, minor; >+ Bool have_pixmaps; >+ if (!XShmQueryVersion(display_, &major, &minor, 
&have_pixmaps)) { >+ // Shared memory not supported. CaptureRect will use the XImage API instead. >+ return; >+ } >+ >+ bool using_shm = false; >+ shm_segment_info_ = new XShmSegmentInfo; >+ shm_segment_info_->shmid = -1; >+ shm_segment_info_->shmaddr = nullptr; >+ shm_segment_info_->readOnly = False; >+ x_shm_image_ = XShmCreateImage(display_, default_visual, default_depth, >+ ZPixmap, 0, shm_segment_info_, >+ window_rect_.width(), window_rect_.height()); >+ if (x_shm_image_) { >+ shm_segment_info_->shmid = >+ shmget(IPC_PRIVATE, x_shm_image_->bytes_per_line * x_shm_image_->height, >+ IPC_CREAT | 0600); >+ if (shm_segment_info_->shmid != -1) { >+ void* shmat_result = shmat(shm_segment_info_->shmid, 0, 0); >+ if (shmat_result != reinterpret_cast<void*>(-1)) { >+ shm_segment_info_->shmaddr = reinterpret_cast<char*>(shmat_result); >+ x_shm_image_->data = shm_segment_info_->shmaddr; >+ >+ XErrorTrap error_trap(display_); >+ using_shm = XShmAttach(display_, shm_segment_info_); >+ XSync(display_, False); >+ if (error_trap.GetLastErrorAndDisable() != 0) >+ using_shm = false; >+ if (using_shm) { >+ RTC_LOG(LS_VERBOSE) >+ << "Using X shared memory segment " << shm_segment_info_->shmid; >+ } >+ } >+ } else { >+ RTC_LOG(LS_WARNING) << "Failed to get shared memory segment. " >+ "Performance may be degraded."; >+ } >+ } >+ >+ if (!using_shm) { >+ RTC_LOG(LS_WARNING) >+ << "Not using shared memory. Performance may be degraded."; >+ ReleaseSharedMemorySegment(); >+ return; >+ } >+ >+ if (have_pixmaps) >+ have_pixmaps = InitPixmaps(default_depth); >+ >+ shmctl(shm_segment_info_->shmid, IPC_RMID, 0); >+ shm_segment_info_->shmid = -1; >+ >+ RTC_LOG(LS_VERBOSE) << "Using X shared memory extension v" << major << "." >+ << minor << " with" << (have_pixmaps ? 
"" : "out") >+ << " pixmaps."; >+} >+ >+bool XServerPixelBuffer::InitPixmaps(int depth) { >+ if (XShmPixmapFormat(display_) != ZPixmap) >+ return false; >+ >+ { >+ XErrorTrap error_trap(display_); >+ shm_pixmap_ = XShmCreatePixmap( >+ display_, window_, shm_segment_info_->shmaddr, shm_segment_info_, >+ window_rect_.width(), window_rect_.height(), depth); >+ XSync(display_, False); >+ if (error_trap.GetLastErrorAndDisable() != 0) { >+ // |shm_pixmap_| is not not valid because the request was not processed >+ // by the X Server, so zero it. >+ shm_pixmap_ = 0; >+ return false; >+ } >+ } >+ >+ { >+ XErrorTrap error_trap(display_); >+ XGCValues shm_gc_values; >+ shm_gc_values.subwindow_mode = IncludeInferiors; >+ shm_gc_values.graphics_exposures = False; >+ shm_gc_ = XCreateGC(display_, window_, >+ GCSubwindowMode | GCGraphicsExposures, &shm_gc_values); >+ XSync(display_, False); >+ if (error_trap.GetLastErrorAndDisable() != 0) { >+ XFreePixmap(display_, shm_pixmap_); >+ shm_pixmap_ = 0; >+ shm_gc_ = 0; // See shm_pixmap_ comment above. >+ return false; >+ } >+ } >+ >+ return true; >+} >+ >+bool XServerPixelBuffer::IsWindowValid() const { >+ XWindowAttributes attributes; >+ { >+ XErrorTrap error_trap(display_); >+ if (!XGetWindowAttributes(display_, window_, &attributes) || >+ error_trap.GetLastErrorAndDisable() != 0) { >+ return false; >+ } >+ } >+ return true; >+} >+ >+void XServerPixelBuffer::Synchronize() { >+ if (shm_segment_info_ && !shm_pixmap_) { >+ // XShmGetImage can fail if the display is being reconfigured. >+ XErrorTrap error_trap(display_); >+ // XShmGetImage fails if the window is partially out of screen. 
>+ xshm_get_image_succeeded_ = >+ XShmGetImage(display_, window_, x_shm_image_, 0, 0, AllPlanes); >+ } >+} >+ >+bool XServerPixelBuffer::CaptureRect(const DesktopRect& rect, >+ DesktopFrame* frame) { >+ RTC_DCHECK_LE(rect.right(), window_rect_.width()); >+ RTC_DCHECK_LE(rect.bottom(), window_rect_.height()); >+ >+ XImage* image; >+ uint8_t* data; >+ >+ if (shm_segment_info_ && (shm_pixmap_ || xshm_get_image_succeeded_)) { >+ if (shm_pixmap_) { >+ XCopyArea(display_, window_, shm_pixmap_, shm_gc_, rect.left(), >+ rect.top(), rect.width(), rect.height(), rect.left(), >+ rect.top()); >+ XSync(display_, False); >+ } >+ >+ image = x_shm_image_; >+ data = reinterpret_cast<uint8_t*>(image->data) + >+ rect.top() * image->bytes_per_line + >+ rect.left() * image->bits_per_pixel / 8; >+ >+ } else { >+ if (x_image_) >+ XDestroyImage(x_image_); >+ x_image_ = XGetImage(display_, window_, rect.left(), rect.top(), >+ rect.width(), rect.height(), AllPlanes, ZPixmap); >+ if (!x_image_) >+ return false; >+ >+ image = x_image_; >+ data = reinterpret_cast<uint8_t*>(image->data); >+ } >+ >+ if (IsXImageRGBFormat(image)) { >+ FastBlit(image, data, rect, frame); >+ } else { >+ SlowBlit(image, data, rect, frame); >+ } >+ >+ return true; >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/x_server_pixel_buffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/x_server_pixel_buffer.h >new file mode 100644 >index 0000000000000000000000000000000000000000..a9398dac9968016981f3e3fea57a3e86e5bd49f2 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/linux/x_server_pixel_buffer.h >@@ -0,0 +1,84 @@ >+/* >+ * Copyright (c) 2013 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. 
An additional intellectual property rights grant can be found
>+ * in the file PATENTS.  All contributing project authors may
>+ * be found in the AUTHORS file in the root of the source tree.
>+ */
>+
>+// Don't include this file in any .h files because it pulls in some X headers.
>+
>+#ifndef MODULES_DESKTOP_CAPTURE_LINUX_X_SERVER_PIXEL_BUFFER_H_
>+#define MODULES_DESKTOP_CAPTURE_LINUX_X_SERVER_PIXEL_BUFFER_H_
>+
>+#include <X11/Xutil.h>
>+#include <X11/extensions/XShm.h>
>+
>+#include "modules/desktop_capture/desktop_geometry.h"
>+#include "rtc_base/constructormagic.h"
>+
>+namespace webrtc {
>+
>+class DesktopFrame;
>+
>+// A class to allow the X server's pixel buffer to be accessed as efficiently
>+// as possible.
>+class XServerPixelBuffer {
>+ public:
>+  XServerPixelBuffer();
>+  ~XServerPixelBuffer();
>+
>+  void Release();
>+
>+  // Allocate (or reallocate) the pixel buffer for |window|. Returns false in
>+  // case of an error (e.g. window doesn't exist).
>+  bool Init(Display* display, Window window);
>+
>+  bool is_initialized() { return window_ != 0; }
>+
>+  // Returns the size of the window the buffer was initialized for.
>+  DesktopSize window_size() { return window_rect_.size(); }
>+
>+  // Returns the rectangle of the window the buffer was initialized for.
>+  const DesktopRect& window_rect() { return window_rect_; }
>+
>+  // Returns true if the window can be found.
>+  bool IsWindowValid() const;
>+
>+  // If shared memory is being used without pixmaps, synchronize this pixel
>+  // buffer with the root window contents (otherwise, this is a no-op).
>+  // This is to avoid doing a full-screen capture for each individual
>+  // rectangle in the capture list, when it only needs to be done once at the
>+  // beginning.
>+  void Synchronize();
>+
>+  // Captures the specified rectangle and stores it in the |frame|. In the case
>+  // where the full-screen data is captured by Synchronize(), this simply
>+  // returns the pointer without doing any more work.
The caller must ensure >+ // that |rect| is not larger than window_size(). >+ bool CaptureRect(const DesktopRect& rect, DesktopFrame* frame); >+ >+ private: >+ void ReleaseSharedMemorySegment(); >+ >+ void InitShm(const XWindowAttributes& attributes); >+ bool InitPixmaps(int depth); >+ >+ Display* display_ = nullptr; >+ Window window_ = 0; >+ DesktopRect window_rect_; >+ XImage* x_image_ = nullptr; >+ XShmSegmentInfo* shm_segment_info_ = nullptr; >+ XImage* x_shm_image_ = nullptr; >+ Pixmap shm_pixmap_ = 0; >+ GC shm_gc_ = nullptr; >+ bool xshm_get_image_succeeded_ = false; >+ >+ RTC_DISALLOW_COPY_AND_ASSIGN(XServerPixelBuffer); >+}; >+ >+} // namespace webrtc >+ >+#endif // MODULES_DESKTOP_CAPTURE_LINUX_X_SERVER_PIXEL_BUFFER_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/mac/desktop_configuration_monitor.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/mac/desktop_configuration_monitor.cc >index d4364bedc2be5e3d6fdcb55db3fa73e1b24971aa..9a32f638d8fdb9f3abc1ec43442928021416b62a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/mac/desktop_configuration_monitor.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/mac/desktop_configuration_monitor.cc >@@ -17,19 +17,12 @@ > > namespace webrtc { > >-// The amount of time allowed for displays to reconfigure. 
>-static const int64_t kDisplayConfigurationEventTimeoutMs = 10 * 1000; >- >-DesktopConfigurationMonitor::DesktopConfigurationMonitor() >- : display_configuration_capture_event_(EventWrapper::Create()) { >+DesktopConfigurationMonitor::DesktopConfigurationMonitor() { > CGError err = CGDisplayRegisterReconfigurationCallback( > DesktopConfigurationMonitor::DisplaysReconfiguredCallback, this); >- if (err != kCGErrorSuccess) { >+ if (err != kCGErrorSuccess) > RTC_LOG(LS_ERROR) << "CGDisplayRegisterReconfigurationCallback " << err; >- abort(); >- } >- display_configuration_capture_event_->Set(); >- >+ rtc::CritScope cs(&desktop_configuration_lock_); > desktop_configuration_ = MacDesktopConfiguration::GetCurrent( > MacDesktopConfiguration::TopLeftOrigin); > } >@@ -41,19 +34,13 @@ DesktopConfigurationMonitor::~DesktopConfigurationMonitor() { > RTC_LOG(LS_ERROR) << "CGDisplayRemoveReconfigurationCallback " << err; > } > >-void DesktopConfigurationMonitor::Lock() { >- if (!display_configuration_capture_event_->Wait( >- kDisplayConfigurationEventTimeoutMs)) { >- RTC_LOG_F(LS_ERROR) << "Event wait timed out."; >- abort(); >- } >-} >- >-void DesktopConfigurationMonitor::Unlock() { >- display_configuration_capture_event_->Set(); >+MacDesktopConfiguration DesktopConfigurationMonitor::desktop_configuration() { >+ rtc::CritScope crit(&desktop_configuration_lock_); >+ return desktop_configuration_; > } > > // static >+// This method may be called on any system thread. 
> void DesktopConfigurationMonitor::DisplaysReconfiguredCallback( > CGDirectDisplayID display, > CGDisplayChangeSummaryFlags flags, >@@ -72,24 +59,15 @@ void DesktopConfigurationMonitor::DisplaysReconfigured( > << flags; > > if (flags & kCGDisplayBeginConfigurationFlag) { >- if (reconfiguring_displays_.empty()) { >- // If this is the first display to start reconfiguring then wait on >- // |display_configuration_capture_event_| to block the capture thread >- // from accessing display memory until the reconfiguration completes. >- if (!display_configuration_capture_event_->Wait( >- kDisplayConfigurationEventTimeoutMs)) { >- RTC_LOG_F(LS_ERROR) << "Event wait timed out."; >- abort(); >- } >- } > reconfiguring_displays_.insert(display); >- } else { >- reconfiguring_displays_.erase(display); >- if (reconfiguring_displays_.empty()) { >- desktop_configuration_ = MacDesktopConfiguration::GetCurrent( >- MacDesktopConfiguration::TopLeftOrigin); >- display_configuration_capture_event_->Set(); >- } >+ return; >+ } >+ >+ reconfiguring_displays_.erase(display); >+ if (reconfiguring_displays_.empty()) { >+ rtc::CritScope cs(&desktop_configuration_lock_); >+ desktop_configuration_ = MacDesktopConfiguration::GetCurrent( >+ MacDesktopConfiguration::TopLeftOrigin); > } > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/mac/desktop_configuration_monitor.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/mac/desktop_configuration_monitor.h >index 5215e5496ade8d2b012cb0c582f955bb4d02b634..cbf580b235301f6a37fd389efad9d75a1db7954e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/mac/desktop_configuration_monitor.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/mac/desktop_configuration_monitor.h >@@ -19,25 +19,17 @@ > #include "api/refcountedbase.h" > #include "modules/desktop_capture/mac/desktop_configuration.h" > #include "rtc_base/constructormagic.h" >+#include 
"rtc_base/criticalsection.h" > > namespace webrtc { > >-class EventWrapper; >- > // The class provides functions to synchronize capturing and display > // reconfiguring across threads, and the up-to-date MacDesktopConfiguration. > class DesktopConfigurationMonitor : public rtc::RefCountedBase { > public: > DesktopConfigurationMonitor(); >- // Acquires a lock on the current configuration. >- void Lock(); >- // Releases the lock previously acquired. >- void Unlock(); >- // Returns the current desktop configuration. Should only be called when the >- // lock has been acquired. >- const MacDesktopConfiguration& desktop_configuration() { >- return desktop_configuration_; >- } >+ // Returns the current desktop configuration. >+ MacDesktopConfiguration desktop_configuration(); > > protected: > ~DesktopConfigurationMonitor() override; >@@ -49,9 +41,10 @@ class DesktopConfigurationMonitor : public rtc::RefCountedBase { > void DisplaysReconfigured(CGDirectDisplayID display, > CGDisplayChangeSummaryFlags flags); > >+ rtc::CriticalSection desktop_configuration_lock_; >+ MacDesktopConfiguration desktop_configuration_ >+ RTC_GUARDED_BY(&desktop_configuration_lock_); > std::set<CGDirectDisplayID> reconfiguring_displays_; >- MacDesktopConfiguration desktop_configuration_; >- std::unique_ptr<EventWrapper> display_configuration_capture_event_; > > RTC_DISALLOW_COPY_AND_ASSIGN(DesktopConfigurationMonitor); > }; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/mac/screen_capturer_mac.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/mac/screen_capturer_mac.mm >index 0d4fa072b5f6433af00822e2a376378f51f43f28..542d1c5f43fe74fa80d5fa52f39e704d9a438759 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/mac/screen_capturer_mac.mm >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/mac/screen_capturer_mac.mm >@@ -167,11 +167,7 @@ ScreenCapturerMac::~ScreenCapturerMac() { > > bool 
ScreenCapturerMac::Init() { > TRACE_EVENT0("webrtc", "ScreenCapturerMac::Init"); >- >- desktop_config_monitor_->Lock(); > desktop_config_ = desktop_config_monitor_->desktop_configuration(); >- desktop_config_monitor_->Unlock(); >- > return true; > } > >@@ -207,7 +203,6 @@ void ScreenCapturerMac::CaptureFrame() { > queue_.MoveToNextFrame(); > RTC_DCHECK(!queue_.current_frame() || !queue_.current_frame()->IsShared()); > >- desktop_config_monitor_->Lock(); > MacDesktopConfiguration new_config = desktop_config_monitor_->desktop_configuration(); > if (!desktop_config_.Equals(new_config)) { > desktop_config_ = new_config; >@@ -234,7 +229,6 @@ void ScreenCapturerMac::CaptureFrame() { > DesktopFrame* current_frame = queue_.current_frame(); > > if (!CgBlit(*current_frame, region)) { >- desktop_config_monitor_->Unlock(); > callback_->OnCaptureResult(Result::ERROR_PERMANENT, nullptr); > return; > } >@@ -256,10 +250,6 @@ void ScreenCapturerMac::CaptureFrame() { > > helper_.set_size_most_recent(new_frame->size()); > >- // Signal that we are done capturing data from the display framebuffer, >- // and accessing display structures. >- desktop_config_monitor_->Unlock(); >- > new_frame->set_capture_time_ms((rtc::TimeNanos() - capture_start_time_nanos) / > rtc::kNumNanosecsPerMillisec); > callback_->OnCaptureResult(Result::SUCCESS, std::move(new_frame)); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/mouse_cursor_monitor_linux.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/mouse_cursor_monitor_linux.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..def344ef408a6962706c1e94084d207daa687cbc >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/mouse_cursor_monitor_linux.cc >@@ -0,0 +1,51 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. 
>+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "modules/desktop_capture/mouse_cursor_monitor.h" >+ >+#if defined(USE_X11) >+#include "modules/desktop_capture/linux/mouse_cursor_monitor_x11.h" >+#endif // defined(USE_X11) >+ >+namespace webrtc { >+ >+// static >+MouseCursorMonitor* MouseCursorMonitor::CreateForWindow( >+ const DesktopCaptureOptions& options, >+ WindowId window) { >+#if defined(USE_X11) >+ return MouseCursorMonitorX11::CreateForWindow(options, window); >+#else >+ return nullptr; >+#endif // defined(USE_X11) >+} >+ >+// static >+MouseCursorMonitor* MouseCursorMonitor::CreateForScreen( >+ const DesktopCaptureOptions& options, >+ ScreenId screen) { >+#if defined(USE_X11) >+ return MouseCursorMonitorX11::CreateForScreen(options, screen); >+#else >+ return nullptr; >+#endif // defined(USE_X11) >+} >+ >+// static >+std::unique_ptr<MouseCursorMonitor> MouseCursorMonitor::Create( >+ const DesktopCaptureOptions& options) { >+#if defined(USE_X11) >+ return MouseCursorMonitorX11::Create(options); >+#else >+ return nullptr; >+#endif // defined(USE_X11) >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/mouse_cursor_monitor_mac.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/mouse_cursor_monitor_mac.mm >index de8876438f8a9860e9b540ce9e8fdd165c785fb7..3402a5ffbbcbe040dd153baf4f4cddcbfc6e3997 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/mouse_cursor_monitor_mac.mm >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/mouse_cursor_monitor_mac.mm >@@ -117,10 +117,8 @@ void MouseCursorMonitorMac::Capture() { > > DesktopVector 
position(gc_position.x, gc_position.y); > >- configuration_monitor_->Lock(); > MacDesktopConfiguration configuration = > configuration_monitor_->desktop_configuration(); >- configuration_monitor_->Unlock(); > float scale = GetScaleFactorAtPosition(configuration, position); > > CaptureImage(scale); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/mouse_cursor_monitor_x11.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/mouse_cursor_monitor_x11.cc >deleted file mode 100644 >index 41749bd2841bb5f6aadb276bb429e50a484b6a4d..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/mouse_cursor_monitor_x11.cc >+++ /dev/null >@@ -1,277 +0,0 @@ >-/* >- * Copyright (c) 2013 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include <memory> >- >-#include "modules/desktop_capture/mouse_cursor_monitor.h" >- >-#include <X11/Xlib.h> >-#include <X11/Xutil.h> >-#include <X11/extensions/Xfixes.h> >- >-#include "modules/desktop_capture/desktop_capture_options.h" >-#include "modules/desktop_capture/desktop_capture_types.h" >-#include "modules/desktop_capture/desktop_frame.h" >-#include "modules/desktop_capture/mouse_cursor.h" >-#include "modules/desktop_capture/x11/x_error_trap.h" >-#include "rtc_base/logging.h" >- >-namespace { >- >-// WindowCapturer returns window IDs of X11 windows with WM_STATE attribute. >-// These windows may not be immediate children of the root window, because >-// window managers may re-parent them to add decorations. However, >-// XQueryPointer() expects to be passed children of the root. 
This function >-// searches up the list of the windows to find the root child that corresponds >-// to |window|. >-Window GetTopLevelWindow(Display* display, Window window) { >- while (true) { >- // If the window is in WithdrawnState then look at all of its children. >- ::Window root, parent; >- ::Window* children; >- unsigned int num_children; >- if (!XQueryTree(display, window, &root, &parent, &children, >- &num_children)) { >- RTC_LOG(LS_ERROR) << "Failed to query for child windows although window" >- << "does not have a valid WM_STATE."; >- return None; >- } >- if (children) >- XFree(children); >- >- if (parent == root) >- break; >- >- window = parent; >- } >- >- return window; >-} >- >-} // namespace >- >-namespace webrtc { >- >-class MouseCursorMonitorX11 : public MouseCursorMonitor, >- public SharedXDisplay::XEventHandler { >- public: >- MouseCursorMonitorX11(const DesktopCaptureOptions& options, Window window); >- ~MouseCursorMonitorX11() override; >- >- void Init(Callback* callback, Mode mode) override; >- void Capture() override; >- >- private: >- // SharedXDisplay::XEventHandler interface. >- bool HandleXEvent(const XEvent& event) override; >- >- Display* display() { return x_display_->display(); } >- >- // Captures current cursor shape and stores it in |cursor_shape_|. >- void CaptureCursor(); >- >- rtc::scoped_refptr<SharedXDisplay> x_display_; >- Callback* callback_; >- Mode mode_; >- Window window_; >- >- bool have_xfixes_; >- int xfixes_event_base_; >- int xfixes_error_base_; >- >- std::unique_ptr<MouseCursor> cursor_shape_; >-}; >- >-MouseCursorMonitorX11::MouseCursorMonitorX11( >- const DesktopCaptureOptions& options, >- Window window) >- : x_display_(options.x_display()), >- callback_(NULL), >- mode_(SHAPE_AND_POSITION), >- window_(window), >- have_xfixes_(false), >- xfixes_event_base_(-1), >- xfixes_error_base_(-1) { >- // Set a default initial cursor shape in case XFixes is not present. 
>- const int kSize = 5; >- std::unique_ptr<DesktopFrame> default_cursor( >- new BasicDesktopFrame(DesktopSize(kSize, kSize))); >- const uint8_t pixels[kSize * kSize] = { >- 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0xff, 0xff, 0xff, >- 0x00, 0x00, 0xff, 0xff, 0xff, 0x00, 0x00, 0xff, 0xff, >- 0xff, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00}; >- uint8_t* ptr = default_cursor->data(); >- for (int y = 0; y < kSize; ++y) { >- for (int x = 0; x < kSize; ++x) { >- *ptr++ = pixels[kSize * y + x]; >- *ptr++ = pixels[kSize * y + x]; >- *ptr++ = pixels[kSize * y + x]; >- *ptr++ = 0xff; >- } >- } >- DesktopVector hotspot(2, 2); >- cursor_shape_.reset(new MouseCursor(default_cursor.release(), hotspot)); >-} >- >-MouseCursorMonitorX11::~MouseCursorMonitorX11() { >- if (have_xfixes_) { >- x_display_->RemoveEventHandler(xfixes_event_base_ + XFixesCursorNotify, >- this); >- } >-} >- >-void MouseCursorMonitorX11::Init(Callback* callback, Mode mode) { >- // Init can be called only once per instance of MouseCursorMonitor. >- RTC_DCHECK(!callback_); >- RTC_DCHECK(callback); >- >- callback_ = callback; >- mode_ = mode; >- >- have_xfixes_ = >- XFixesQueryExtension(display(), &xfixes_event_base_, &xfixes_error_base_); >- >- if (have_xfixes_) { >- // Register for changes to the cursor shape. >- XFixesSelectCursorInput(display(), window_, XFixesDisplayCursorNotifyMask); >- x_display_->AddEventHandler(xfixes_event_base_ + XFixesCursorNotify, this); >- >- CaptureCursor(); >- } else { >- RTC_LOG(LS_INFO) << "X server does not support XFixes."; >- } >-} >- >-void MouseCursorMonitorX11::Capture() { >- RTC_DCHECK(callback_); >- >- // Process X11 events in case XFixes has sent cursor notification. >- x_display_->ProcessPendingXEvents(); >- >- // cursor_shape_| is set only if we were notified of a cursor shape change. >- if (cursor_shape_.get()) >- callback_->OnMouseCursor(cursor_shape_.release()); >- >- // Get cursor position if necessary. 
>- if (mode_ == SHAPE_AND_POSITION) { >- int root_x; >- int root_y; >- int win_x; >- int win_y; >- Window root_window; >- Window child_window; >- unsigned int mask; >- >- XErrorTrap error_trap(display()); >- Bool result = XQueryPointer(display(), window_, &root_window, &child_window, >- &root_x, &root_y, &win_x, &win_y, &mask); >- CursorState state; >- if (!result || error_trap.GetLastErrorAndDisable() != 0) { >- state = OUTSIDE; >- } else { >- // In screen mode (window_ == root_window) the mouse is always inside. >- // XQueryPointer() sets |child_window| to None if the cursor is outside >- // |window_|. >- state = >- (window_ == root_window || child_window != None) ? INSIDE : OUTSIDE; >- } >- >- // As the comments to GetTopLevelWindow() above indicate, in window capture, >- // the cursor position capture happens in |window_|, while the frame catpure >- // happens in |child_window|. These two windows are not alwyas same, as >- // window manager may add some decorations to the |window_|. So translate >- // the coordinate in |window_| to the coordinate space of |child_window|. >- if (window_ != root_window && state == INSIDE) { >- int translated_x, translated_y; >- Window unused; >- if (XTranslateCoordinates(display(), window_, child_window, win_x, win_y, >- &translated_x, &translated_y, &unused)) { >- win_x = translated_x; >- win_y = translated_y; >- } >- } >- >- // X11 always starts the coordinate from (0, 0), so we do not need to >- // translate here. 
>- callback_->OnMouseCursorPosition(DesktopVector(root_x, root_y)); >- } >-} >- >-bool MouseCursorMonitorX11::HandleXEvent(const XEvent& event) { >- if (have_xfixes_ && event.type == xfixes_event_base_ + XFixesCursorNotify) { >- const XFixesCursorNotifyEvent* cursor_event = >- reinterpret_cast<const XFixesCursorNotifyEvent*>(&event); >- if (cursor_event->subtype == XFixesDisplayCursorNotify) { >- CaptureCursor(); >- } >- // Return false, even if the event has been handled, because there might be >- // other listeners for cursor notifications. >- } >- return false; >-} >- >-void MouseCursorMonitorX11::CaptureCursor() { >- RTC_DCHECK(have_xfixes_); >- >- XFixesCursorImage* img; >- { >- XErrorTrap error_trap(display()); >- img = XFixesGetCursorImage(display()); >- if (!img || error_trap.GetLastErrorAndDisable() != 0) >- return; >- } >- >- std::unique_ptr<DesktopFrame> image( >- new BasicDesktopFrame(DesktopSize(img->width, img->height))); >- >- // Xlib stores 32-bit data in longs, even if longs are 64-bits long. 
>- unsigned long* src = img->pixels; >- uint32_t* dst = reinterpret_cast<uint32_t*>(image->data()); >- uint32_t* dst_end = dst + (img->width * img->height); >- while (dst < dst_end) { >- *dst++ = static_cast<uint32_t>(*src++); >- } >- >- DesktopVector hotspot(std::min(img->width, img->xhot), >- std::min(img->height, img->yhot)); >- >- XFree(img); >- >- cursor_shape_.reset(new MouseCursor(image.release(), hotspot)); >-} >- >-// static >-MouseCursorMonitor* MouseCursorMonitor::CreateForWindow( >- const DesktopCaptureOptions& options, >- WindowId window) { >- if (!options.x_display()) >- return NULL; >- window = GetTopLevelWindow(options.x_display()->display(), window); >- if (window == None) >- return NULL; >- return new MouseCursorMonitorX11(options, window); >-} >- >-MouseCursorMonitor* MouseCursorMonitor::CreateForScreen( >- const DesktopCaptureOptions& options, >- ScreenId screen) { >- if (!options.x_display()) >- return NULL; >- return new MouseCursorMonitorX11( >- options, DefaultRootWindow(options.x_display()->display())); >-} >- >-std::unique_ptr<MouseCursorMonitor> MouseCursorMonitor::Create( >- const DesktopCaptureOptions& options) { >- return std::unique_ptr<MouseCursorMonitor>( >- CreateForScreen(options, kFullDesktopScreenId)); >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/screen_capturer_linux.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/screen_capturer_linux.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..cf8a9dd0e0db4008e17f43aa2507e74910672b46 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/screen_capturer_linux.cc >@@ -0,0 +1,40 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. 
An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "modules/desktop_capture/desktop_capture_options.h" >+#include "modules/desktop_capture/desktop_capturer.h" >+ >+#if defined(WEBRTC_USE_PIPEWIRE) >+#include "modules/desktop_capture/linux/screen_capturer_pipewire.h" >+#endif // defined(WEBRTC_USE_PIPEWIRE) >+ >+#if defined(USE_X11) >+#include "modules/desktop_capture/linux/screen_capturer_x11.h" >+#endif // defined(USE_X11) >+ >+namespace webrtc { >+ >+// static >+std::unique_ptr<DesktopCapturer> DesktopCapturer::CreateRawScreenCapturer( >+ const DesktopCaptureOptions& options) { >+#if defined(WEBRTC_USE_PIPEWIRE) >+ if (options.allow_pipewire() && DesktopCapturer::IsRunningUnderWayland()) { >+ return ScreenCapturerPipeWire::CreateRawScreenCapturer(options); >+ } >+#endif // defined(WEBRTC_USE_PIPEWIRE) >+ >+#if defined(USE_X11) >+ return ScreenCapturerX11::CreateRawScreenCapturer(options); >+#endif // defined(USE_X11) >+ >+ return nullptr; >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/screen_capturer_x11.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/screen_capturer_x11.cc >deleted file mode 100644 >index e95fccc2082a97042aaffb8ddd6c2434df497e70..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/screen_capturer_x11.cc >+++ /dev/null >@@ -1,425 +0,0 @@ >-/* >- * Copyright (c) 2013 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. 
All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include <string.h> >- >-#include <memory> >-#include <set> >-#include <utility> >- >-#include <X11/Xlib.h> >-#include <X11/Xutil.h> >-#include <X11/extensions/Xdamage.h> >-#include <X11/extensions/Xfixes.h> >- >-#include "modules/desktop_capture/desktop_capture_options.h" >-#include "modules/desktop_capture/desktop_capturer.h" >-#include "modules/desktop_capture/desktop_frame.h" >-#include "modules/desktop_capture/screen_capture_frame_queue.h" >-#include "modules/desktop_capture/screen_capturer_helper.h" >-#include "modules/desktop_capture/shared_desktop_frame.h" >-#include "modules/desktop_capture/x11/x_server_pixel_buffer.h" >-#include "rtc_base/checks.h" >-#include "rtc_base/constructormagic.h" >-#include "rtc_base/logging.h" >-#include "rtc_base/timeutils.h" >-#include "rtc_base/trace_event.h" >- >-namespace webrtc { >-namespace { >- >-// A class to perform video frame capturing for Linux. >-// >-// If XDamage is used, this class sets DesktopFrame::updated_region() according >-// to the areas reported by XDamage. Otherwise this class does not detect >-// DesktopFrame::updated_region(), the field is always set to the entire frame >-// rectangle. ScreenCapturerDifferWrapper should be used if that functionality >-// is necessary. >-class ScreenCapturerLinux : public DesktopCapturer, >- public SharedXDisplay::XEventHandler { >- public: >- ScreenCapturerLinux(); >- ~ScreenCapturerLinux() override; >- >- // TODO(ajwong): Do we really want this to be synchronous? >- bool Init(const DesktopCaptureOptions& options); >- >- // DesktopCapturer interface. >- void Start(Callback* delegate) override; >- void CaptureFrame() override; >- bool GetSourceList(SourceList* sources) override; >- bool SelectSource(SourceId id) override; >- >- private: >- Display* display() { return options_.x_display()->display(); } >- >- // SharedXDisplay::XEventHandler interface. 
>- bool HandleXEvent(const XEvent& event) override; >- >- void InitXDamage(); >- >- // Capture screen pixels to the current buffer in the queue. In the DAMAGE >- // case, the ScreenCapturerHelper already holds the list of invalid rectangles >- // from HandleXEvent(). In the non-DAMAGE case, this captures the >- // whole screen, then calculates some invalid rectangles that include any >- // differences between this and the previous capture. >- std::unique_ptr<DesktopFrame> CaptureScreen(); >- >- // Called when the screen configuration is changed. >- void ScreenConfigurationChanged(); >- >- // Synchronize the current buffer with |last_buffer_|, by copying pixels from >- // the area of |last_invalid_rects|. >- // Note this only works on the assumption that kNumBuffers == 2, as >- // |last_invalid_rects| holds the differences from the previous buffer and >- // the one prior to that (which will then be the current buffer). >- void SynchronizeFrame(); >- >- void DeinitXlib(); >- >- DesktopCaptureOptions options_; >- >- Callback* callback_ = nullptr; >- >- // X11 graphics context. >- GC gc_ = nullptr; >- Window root_window_ = BadValue; >- >- // XFixes. >- bool has_xfixes_ = false; >- int xfixes_event_base_ = -1; >- int xfixes_error_base_ = -1; >- >- // XDamage information. >- bool use_damage_ = false; >- Damage damage_handle_ = 0; >- int damage_event_base_ = -1; >- int damage_error_base_ = -1; >- XserverRegion damage_region_ = 0; >- >- // Access to the X Server's pixel buffer. >- XServerPixelBuffer x_server_pixel_buffer_; >- >- // A thread-safe list of invalid rectangles, and the size of the most >- // recently captured screen. >- ScreenCapturerHelper helper_; >- >- // Queue of the frames buffers. >- ScreenCaptureFrameQueue<SharedDesktopFrame> queue_; >- >- // Invalid region from the previous capture. This is used to synchronize the >- // current with the last buffer used. 
>- DesktopRegion last_invalid_region_; >- >- RTC_DISALLOW_COPY_AND_ASSIGN(ScreenCapturerLinux); >-}; >- >-ScreenCapturerLinux::ScreenCapturerLinux() { >- helper_.SetLogGridSize(4); >-} >- >-ScreenCapturerLinux::~ScreenCapturerLinux() { >- options_.x_display()->RemoveEventHandler(ConfigureNotify, this); >- if (use_damage_) { >- options_.x_display()->RemoveEventHandler(damage_event_base_ + XDamageNotify, >- this); >- } >- DeinitXlib(); >-} >- >-bool ScreenCapturerLinux::Init(const DesktopCaptureOptions& options) { >- TRACE_EVENT0("webrtc", "ScreenCapturerLinux::Init"); >- options_ = options; >- >- root_window_ = RootWindow(display(), DefaultScreen(display())); >- if (root_window_ == BadValue) { >- RTC_LOG(LS_ERROR) << "Unable to get the root window"; >- DeinitXlib(); >- return false; >- } >- >- gc_ = XCreateGC(display(), root_window_, 0, NULL); >- if (gc_ == NULL) { >- RTC_LOG(LS_ERROR) << "Unable to get graphics context"; >- DeinitXlib(); >- return false; >- } >- >- options_.x_display()->AddEventHandler(ConfigureNotify, this); >- >- // Check for XFixes extension. This is required for cursor shape >- // notifications, and for our use of XDamage. >- if (XFixesQueryExtension(display(), &xfixes_event_base_, >- &xfixes_error_base_)) { >- has_xfixes_ = true; >- } else { >- RTC_LOG(LS_INFO) << "X server does not support XFixes."; >- } >- >- // Register for changes to the dimensions of the root window. >- XSelectInput(display(), root_window_, StructureNotifyMask); >- >- if (!x_server_pixel_buffer_.Init(display(), DefaultRootWindow(display()))) { >- RTC_LOG(LS_ERROR) << "Failed to initialize pixel buffer."; >- return false; >- } >- >- if (options_.use_update_notifications()) { >- InitXDamage(); >- } >- >- return true; >-} >- >-void ScreenCapturerLinux::InitXDamage() { >- // Our use of XDamage requires XFixes. >- if (!has_xfixes_) { >- return; >- } >- >- // Check for XDamage extension. 
>- if (!XDamageQueryExtension(display(), &damage_event_base_, >- &damage_error_base_)) { >- RTC_LOG(LS_INFO) << "X server does not support XDamage."; >- return; >- } >- >- // TODO(lambroslambrou): Disable DAMAGE in situations where it is known >- // to fail, such as when Desktop Effects are enabled, with graphics >- // drivers (nVidia, ATI) that fail to report DAMAGE notifications >- // properly. >- >- // Request notifications every time the screen becomes damaged. >- damage_handle_ = >- XDamageCreate(display(), root_window_, XDamageReportNonEmpty); >- if (!damage_handle_) { >- RTC_LOG(LS_ERROR) << "Unable to initialize XDamage."; >- return; >- } >- >- // Create an XFixes server-side region to collate damage into. >- damage_region_ = XFixesCreateRegion(display(), 0, 0); >- if (!damage_region_) { >- XDamageDestroy(display(), damage_handle_); >- RTC_LOG(LS_ERROR) << "Unable to create XFixes region."; >- return; >- } >- >- options_.x_display()->AddEventHandler(damage_event_base_ + XDamageNotify, >- this); >- >- use_damage_ = true; >- RTC_LOG(LS_INFO) << "Using XDamage extension."; >-} >- >-void ScreenCapturerLinux::Start(Callback* callback) { >- RTC_DCHECK(!callback_); >- RTC_DCHECK(callback); >- >- callback_ = callback; >-} >- >-void ScreenCapturerLinux::CaptureFrame() { >- TRACE_EVENT0("webrtc", "ScreenCapturerLinux::CaptureFrame"); >- int64_t capture_start_time_nanos = rtc::TimeNanos(); >- >- queue_.MoveToNextFrame(); >- RTC_DCHECK(!queue_.current_frame() || !queue_.current_frame()->IsShared()); >- >- // Process XEvents for XDamage and cursor shape tracking. >- options_.x_display()->ProcessPendingXEvents(); >- >- // ProcessPendingXEvents() may call ScreenConfigurationChanged() which >- // reinitializes |x_server_pixel_buffer_|. Check if the pixel buffer is still >- // in a good shape. >- if (!x_server_pixel_buffer_.is_initialized()) { >- // We failed to initialize pixel buffer. 
>- RTC_LOG(LS_ERROR) << "Pixel buffer is not initialized."; >- callback_->OnCaptureResult(Result::ERROR_PERMANENT, nullptr); >- return; >- } >- >- // If the current frame is from an older generation then allocate a new one. >- // Note that we can't reallocate other buffers at this point, since the caller >- // may still be reading from them. >- if (!queue_.current_frame()) { >- queue_.ReplaceCurrentFrame( >- SharedDesktopFrame::Wrap(std::unique_ptr<DesktopFrame>( >- new BasicDesktopFrame(x_server_pixel_buffer_.window_size())))); >- } >- >- std::unique_ptr<DesktopFrame> result = CaptureScreen(); >- if (!result) { >- RTC_LOG(LS_WARNING) << "Temporarily failed to capture screen."; >- callback_->OnCaptureResult(Result::ERROR_TEMPORARY, nullptr); >- return; >- } >- >- last_invalid_region_ = result->updated_region(); >- result->set_capture_time_ms((rtc::TimeNanos() - capture_start_time_nanos) / >- rtc::kNumNanosecsPerMillisec); >- callback_->OnCaptureResult(Result::SUCCESS, std::move(result)); >-} >- >-bool ScreenCapturerLinux::GetSourceList(SourceList* sources) { >- RTC_DCHECK(sources->size() == 0); >- // TODO(jiayl): implement screen enumeration. >- sources->push_back({0}); >- return true; >-} >- >-bool ScreenCapturerLinux::SelectSource(SourceId id) { >- // TODO(jiayl): implement screen selection. 
>- return true; >-} >- >-bool ScreenCapturerLinux::HandleXEvent(const XEvent& event) { >- if (use_damage_ && (event.type == damage_event_base_ + XDamageNotify)) { >- const XDamageNotifyEvent* damage_event = >- reinterpret_cast<const XDamageNotifyEvent*>(&event); >- if (damage_event->damage != damage_handle_) >- return false; >- RTC_DCHECK(damage_event->level == XDamageReportNonEmpty); >- return true; >- } else if (event.type == ConfigureNotify) { >- ScreenConfigurationChanged(); >- return true; >- } >- return false; >-} >- >-std::unique_ptr<DesktopFrame> ScreenCapturerLinux::CaptureScreen() { >- std::unique_ptr<SharedDesktopFrame> frame = queue_.current_frame()->Share(); >- RTC_DCHECK(x_server_pixel_buffer_.window_size().equals(frame->size())); >- >- // Pass the screen size to the helper, so it can clip the invalid region if it >- // expands that region to a grid. >- helper_.set_size_most_recent(frame->size()); >- >- // In the DAMAGE case, ensure the frame is up-to-date with the previous frame >- // if any. If there isn't a previous frame, that means a screen-resolution >- // change occurred, and |invalid_rects| will be updated to include the whole >- // screen. >- if (use_damage_ && queue_.previous_frame()) >- SynchronizeFrame(); >- >- DesktopRegion* updated_region = frame->mutable_updated_region(); >- >- x_server_pixel_buffer_.Synchronize(); >- if (use_damage_ && queue_.previous_frame()) { >- // Atomically fetch and clear the damage region. >- XDamageSubtract(display(), damage_handle_, None, damage_region_); >- int rects_num = 0; >- XRectangle bounds; >- XRectangle* rects = XFixesFetchRegionAndBounds(display(), damage_region_, >- &rects_num, &bounds); >- for (int i = 0; i < rects_num; ++i) { >- updated_region->AddRect(DesktopRect::MakeXYWH( >- rects[i].x, rects[i].y, rects[i].width, rects[i].height)); >- } >- XFree(rects); >- helper_.InvalidateRegion(*updated_region); >- >- // Capture the damaged portions of the desktop. 
>- helper_.TakeInvalidRegion(updated_region); >- >- // Clip the damaged portions to the current screen size, just in case some >- // spurious XDamage notifications were received for a previous (larger) >- // screen size. >- updated_region->IntersectWith( >- DesktopRect::MakeSize(x_server_pixel_buffer_.window_size())); >- >- for (DesktopRegion::Iterator it(*updated_region); !it.IsAtEnd(); >- it.Advance()) { >- if (!x_server_pixel_buffer_.CaptureRect(it.rect(), frame.get())) >- return nullptr; >- } >- } else { >- // Doing full-screen polling, or this is the first capture after a >- // screen-resolution change. In either case, need a full-screen capture. >- DesktopRect screen_rect = DesktopRect::MakeSize(frame->size()); >- if (!x_server_pixel_buffer_.CaptureRect(screen_rect, frame.get())) >- return nullptr; >- updated_region->SetRect(screen_rect); >- } >- >- return std::move(frame); >-} >- >-void ScreenCapturerLinux::ScreenConfigurationChanged() { >- TRACE_EVENT0("webrtc", "ScreenCapturerLinux::ScreenConfigurationChanged"); >- // Make sure the frame buffers will be reallocated. >- queue_.Reset(); >- >- helper_.ClearInvalidRegion(); >- if (!x_server_pixel_buffer_.Init(display(), DefaultRootWindow(display()))) { >- RTC_LOG(LS_ERROR) << "Failed to initialize pixel buffer after screen " >- "configuration change."; >- } >-} >- >-void ScreenCapturerLinux::SynchronizeFrame() { >- // Synchronize the current buffer with the previous one since we do not >- // capture the entire desktop. Note that encoder may be reading from the >- // previous buffer at this time so thread access complaints are false >- // positives. >- >- // TODO(hclam): We can reduce the amount of copying here by subtracting >- // |capturer_helper_|s region from |last_invalid_region_|. 
>- // http://crbug.com/92354 >- RTC_DCHECK(queue_.previous_frame()); >- >- DesktopFrame* current = queue_.current_frame(); >- DesktopFrame* last = queue_.previous_frame(); >- RTC_DCHECK(current != last); >- for (DesktopRegion::Iterator it(last_invalid_region_); !it.IsAtEnd(); >- it.Advance()) { >- current->CopyPixelsFrom(*last, it.rect().top_left(), it.rect()); >- } >-} >- >-void ScreenCapturerLinux::DeinitXlib() { >- if (gc_) { >- XFreeGC(display(), gc_); >- gc_ = nullptr; >- } >- >- x_server_pixel_buffer_.Release(); >- >- if (display()) { >- if (damage_handle_) { >- XDamageDestroy(display(), damage_handle_); >- damage_handle_ = 0; >- } >- >- if (damage_region_) { >- XFixesDestroyRegion(display(), damage_region_); >- damage_region_ = 0; >- } >- } >-} >- >-} // namespace >- >-// static >-std::unique_ptr<DesktopCapturer> DesktopCapturer::CreateRawScreenCapturer( >- const DesktopCaptureOptions& options) { >- if (!options.x_display()) >- return nullptr; >- >- std::unique_ptr<ScreenCapturerLinux> capturer(new ScreenCapturerLinux()); >- if (!capturer.get()->Init(options)) { >- return nullptr; >- } >- >- return std::move(capturer); >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/screen_drawer_linux.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/screen_drawer_linux.cc >index 39ceca7dc61799c46b7322518e6651a356d51017..2ab5e74891c62af9074e6eda486397d67f82e725 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/screen_drawer_linux.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/screen_drawer_linux.cc >@@ -14,9 +14,9 @@ > #include <memory> > > #include "absl/memory/memory.h" >+#include "modules/desktop_capture/linux/shared_x_display.h" > #include "modules/desktop_capture/screen_drawer.h" > #include "modules/desktop_capture/screen_drawer_lock_posix.h" >-#include "modules/desktop_capture/x11/shared_x_display.h" > #include 
"rtc_base/checks.h" > #include "system_wrappers/include/sleep.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/screen_drawer_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/screen_drawer_unittest.cc >index 5f0f3cc5733518d57abc9a86ed0d7cb4eb2fbd3a..d48fa89c0b5bc0a621eef6992a5887e2f1c4f9bb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/screen_drawer_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/screen_drawer_unittest.cc >@@ -140,7 +140,12 @@ TEST(ScreenDrawerTest, DISABLED_DrawRectangles) { > SleepMs(10000); > } > >-TEST(ScreenDrawerTest, TwoScreenDrawerLocks) { >+#if defined(THREAD_SANITIZER) // bugs.webrtc.org/10019 >+#define MAYBE_TwoScreenDrawerLocks DISABLED_TwoScreenDrawerLocks >+#else >+#define MAYBE_TwoScreenDrawerLocks TwoScreenDrawerLocks >+#endif >+TEST(ScreenDrawerTest, MAYBE_TwoScreenDrawerLocks) { > #if defined(WEBRTC_POSIX) > // ScreenDrawerLockPosix won't be able to unlink the named semaphore. So use a > // different semaphore name here to avoid deadlock. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/shared_desktop_frame.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/shared_desktop_frame.h >index f70508cee15b6906c24e6d136632e3b3a524c0fd..ea12c000f2a64528f5c08fc73c27d12e198bb91d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/shared_desktop_frame.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/shared_desktop_frame.h >@@ -16,12 +16,13 @@ > #include "rtc_base/refcount.h" > #include "rtc_base/refcountedobject.h" > #include "rtc_base/scoped_ref_ptr.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > > // SharedDesktopFrame is a DesktopFrame that may have multiple instances all > // sharing the same buffer. 
>-class SharedDesktopFrame : public DesktopFrame { >+class RTC_EXPORT SharedDesktopFrame : public DesktopFrame { > public: > ~SharedDesktopFrame() override; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/window_capturer_linux.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/window_capturer_linux.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..82359e50c2db19c748e417bceec6b588e4ed1038 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/window_capturer_linux.cc >@@ -0,0 +1,40 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "modules/desktop_capture/desktop_capture_options.h" >+#include "modules/desktop_capture/desktop_capturer.h" >+ >+#if defined(WEBRTC_USE_PIPEWIRE) >+#include "modules/desktop_capture/linux/window_capturer_pipewire.h" >+#endif // defined(WEBRTC_USE_PIPEWIRE) >+ >+#if defined(USE_X11) >+#include "modules/desktop_capture/linux/window_capturer_x11.h" >+#endif // defined(USE_X11) >+ >+namespace webrtc { >+ >+// static >+std::unique_ptr<DesktopCapturer> DesktopCapturer::CreateRawWindowCapturer( >+ const DesktopCaptureOptions& options) { >+#if defined(WEBRTC_USE_PIPEWIRE) >+ if (options.allow_pipewire() && DesktopCapturer::IsRunningUnderWayland()) { >+ return WindowCapturerPipeWire::CreateRawWindowCapturer(options); >+ } >+#endif // defined(WEBRTC_USE_PIPEWIRE) >+ >+#if defined(USE_X11) >+ return WindowCapturerX11::CreateRawWindowCapturer(options); >+#endif // defined(USE_X11) >+ >+ return nullptr; >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/window_capturer_mac.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/window_capturer_mac.mm >index 95e622a77a1c3cbc1b90b26722b7d7710866c3ee..b0647f041b6d86085175de0e75b82420a1fb4a71 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/window_capturer_mac.mm >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/window_capturer_mac.mm >@@ -140,9 +140,7 @@ bool WindowCapturerMac::FocusOnSelectedSource() { > bool WindowCapturerMac::IsOccluded(const DesktopVector& pos) { > DesktopVector sys_pos = pos; > if (configuration_monitor_) { >- configuration_monitor_->Lock(); > auto configuration = configuration_monitor_->desktop_configuration(); >- configuration_monitor_->Unlock(); > sys_pos = pos.add(configuration.bounds.top_left()); > } > return window_finder_.GetWindowUnderPoint(sys_pos) != window_id_; >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/window_capturer_x11.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/window_capturer_x11.cc >deleted file mode 100644 >index c4ca3ae5e8c40096a945e9f120ffc9bbba58465b..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/window_capturer_x11.cc >+++ /dev/null >@@ -1,288 +0,0 @@ >-/* >- * Copyright (c) 2013 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include <string.h> >- >-#include <X11/Xutil.h> >-#include <X11/extensions/Xcomposite.h> >-#include <X11/extensions/Xrender.h> >- >-#include <utility> >- >-#include "modules/desktop_capture/desktop_capture_options.h" >-#include "modules/desktop_capture/desktop_capturer.h" >-#include "modules/desktop_capture/desktop_frame.h" >-#include "modules/desktop_capture/window_finder_x11.h" >-#include "modules/desktop_capture/x11/shared_x_display.h" >-#include "modules/desktop_capture/x11/window_list_utils.h" >-#include "modules/desktop_capture/x11/x_atom_cache.h" >-#include "modules/desktop_capture/x11/x_server_pixel_buffer.h" >-#include "rtc_base/checks.h" >-#include "rtc_base/constructormagic.h" >-#include "rtc_base/logging.h" >-#include "rtc_base/scoped_ref_ptr.h" >-#include "rtc_base/trace_event.h" >- >-namespace webrtc { >- >-namespace { >- >-class WindowCapturerLinux : public DesktopCapturer, >- public SharedXDisplay::XEventHandler { >- public: >- WindowCapturerLinux(const DesktopCaptureOptions& options); >- ~WindowCapturerLinux() override; >- >- // DesktopCapturer interface. 
>- void Start(Callback* callback) override; >- void CaptureFrame() override; >- bool GetSourceList(SourceList* sources) override; >- bool SelectSource(SourceId id) override; >- bool FocusOnSelectedSource() override; >- bool IsOccluded(const DesktopVector& pos) override; >- >- // SharedXDisplay::XEventHandler interface. >- bool HandleXEvent(const XEvent& event) override; >- >- private: >- Display* display() { return x_display_->display(); } >- >- // Returns window title for the specified X |window|. >- bool GetWindowTitle(::Window window, std::string* title); >- >- Callback* callback_ = nullptr; >- >- rtc::scoped_refptr<SharedXDisplay> x_display_; >- >- bool has_composite_extension_ = false; >- >- ::Window selected_window_ = 0; >- XServerPixelBuffer x_server_pixel_buffer_; >- XAtomCache atom_cache_; >- WindowFinderX11 window_finder_; >- >- RTC_DISALLOW_COPY_AND_ASSIGN(WindowCapturerLinux); >-}; >- >-WindowCapturerLinux::WindowCapturerLinux(const DesktopCaptureOptions& options) >- : x_display_(options.x_display()), >- atom_cache_(display()), >- window_finder_(&atom_cache_) { >- int event_base, error_base, major_version, minor_version; >- if (XCompositeQueryExtension(display(), &event_base, &error_base) && >- XCompositeQueryVersion(display(), &major_version, &minor_version) && >- // XCompositeNameWindowPixmap() requires version 0.2 >- (major_version > 0 || minor_version >= 2)) { >- has_composite_extension_ = true; >- } else { >- RTC_LOG(LS_INFO) << "Xcomposite extension not available or too old."; >- } >- >- x_display_->AddEventHandler(ConfigureNotify, this); >-} >- >-WindowCapturerLinux::~WindowCapturerLinux() { >- x_display_->RemoveEventHandler(ConfigureNotify, this); >-} >- >-bool WindowCapturerLinux::GetSourceList(SourceList* sources) { >- return GetWindowList(&atom_cache_, [this, sources](::Window window) { >- Source w; >- w.id = window; >- if (this->GetWindowTitle(window, &w.title)) { >- sources->push_back(w); >- } >- return true; >- }); >-} >- >-bool 
WindowCapturerLinux::SelectSource(SourceId id) { >- if (!x_server_pixel_buffer_.Init(display(), id)) >- return false; >- >- // Tell the X server to send us window resizing events. >- XSelectInput(display(), id, StructureNotifyMask); >- >- selected_window_ = id; >- >- // In addition to needing X11 server-side support for Xcomposite, it actually >- // needs to be turned on for the window. If the user has modern >- // hardware/drivers but isn't using a compositing window manager, that won't >- // be the case. Here we automatically turn it on. >- >- // Redirect drawing to an offscreen buffer (ie, turn on compositing). X11 >- // remembers who has requested this and will turn it off for us when we exit. >- XCompositeRedirectWindow(display(), id, CompositeRedirectAutomatic); >- >- return true; >-} >- >-bool WindowCapturerLinux::FocusOnSelectedSource() { >- if (!selected_window_) >- return false; >- >- unsigned int num_children; >- ::Window* children; >- ::Window parent; >- ::Window root; >- // Find the root window to pass event to. >- int status = XQueryTree(display(), selected_window_, &root, &parent, >- &children, &num_children); >- if (status == 0) { >- RTC_LOG(LS_ERROR) << "Failed to query for the root window."; >- return false; >- } >- >- if (children) >- XFree(children); >- >- XRaiseWindow(display(), selected_window_); >- >- // Some window managers (e.g., metacity in GNOME) consider it illegal to >- // raise a window without also giving it input focus with >- // _NET_ACTIVE_WINDOW, so XRaiseWindow() on its own isn't enough. >- Atom atom = XInternAtom(display(), "_NET_ACTIVE_WINDOW", True); >- if (atom != None) { >- XEvent xev; >- xev.xclient.type = ClientMessage; >- xev.xclient.serial = 0; >- xev.xclient.send_event = True; >- xev.xclient.window = selected_window_; >- xev.xclient.message_type = atom; >- >- // The format member is set to 8, 16, or 32 and specifies whether the >- // data should be viewed as a list of bytes, shorts, or longs. 
>- xev.xclient.format = 32; >- >- memset(xev.xclient.data.l, 0, sizeof(xev.xclient.data.l)); >- >- XSendEvent(display(), root, False, >- SubstructureRedirectMask | SubstructureNotifyMask, &xev); >- } >- XFlush(display()); >- return true; >-} >- >-void WindowCapturerLinux::Start(Callback* callback) { >- RTC_DCHECK(!callback_); >- RTC_DCHECK(callback); >- >- callback_ = callback; >-} >- >-void WindowCapturerLinux::CaptureFrame() { >- TRACE_EVENT0("webrtc", "WindowCapturerLinux::CaptureFrame"); >- >- if (!x_server_pixel_buffer_.IsWindowValid()) { >- RTC_LOG(LS_ERROR) << "The window is no longer valid."; >- callback_->OnCaptureResult(Result::ERROR_PERMANENT, nullptr); >- return; >- } >- >- x_display_->ProcessPendingXEvents(); >- >- if (!has_composite_extension_) { >- // Without the Xcomposite extension we capture when the whole window is >- // visible on screen and not covered by any other window. This is not >- // something we want so instead, just bail out. >- RTC_LOG(LS_ERROR) << "No Xcomposite extension detected."; >- callback_->OnCaptureResult(Result::ERROR_PERMANENT, nullptr); >- return; >- } >- >- if (GetWindowState(&atom_cache_, selected_window_) == IconicState) { >- // Window is in minimized. Return a 1x1 frame as same as OSX/Win does. 
>- std::unique_ptr<DesktopFrame> frame( >- new BasicDesktopFrame(DesktopSize(1, 1))); >- callback_->OnCaptureResult(Result::SUCCESS, std::move(frame)); >- return; >- } >- >- std::unique_ptr<DesktopFrame> frame( >- new BasicDesktopFrame(x_server_pixel_buffer_.window_size())); >- >- x_server_pixel_buffer_.Synchronize(); >- if (!x_server_pixel_buffer_.CaptureRect(DesktopRect::MakeSize(frame->size()), >- frame.get())) { >- RTC_LOG(LS_WARNING) << "Temporarily failed to capture winodw."; >- callback_->OnCaptureResult(Result::ERROR_TEMPORARY, nullptr); >- return; >- } >- >- frame->mutable_updated_region()->SetRect( >- DesktopRect::MakeSize(frame->size())); >- frame->set_top_left(x_server_pixel_buffer_.window_rect().top_left()); >- >- callback_->OnCaptureResult(Result::SUCCESS, std::move(frame)); >-} >- >-bool WindowCapturerLinux::IsOccluded(const DesktopVector& pos) { >- return window_finder_.GetWindowUnderPoint(pos) != >- static_cast<WindowId>(selected_window_); >-} >- >-bool WindowCapturerLinux::HandleXEvent(const XEvent& event) { >- if (event.type == ConfigureNotify) { >- XConfigureEvent xce = event.xconfigure; >- if (xce.window == selected_window_) { >- if (!DesktopRectFromXAttributes(xce).equals( >- x_server_pixel_buffer_.window_rect())) { >- if (!x_server_pixel_buffer_.Init(display(), selected_window_)) { >- RTC_LOG(LS_ERROR) >- << "Failed to initialize pixel buffer after resizing."; >- } >- } >- } >- } >- >- // Always returns false, so other observers can still receive the events. 
>- return false; >-} >- >-bool WindowCapturerLinux::GetWindowTitle(::Window window, std::string* title) { >- int status; >- bool result = false; >- XTextProperty window_name; >- window_name.value = nullptr; >- if (window) { >- status = XGetWMName(display(), window, &window_name); >- if (status && window_name.value && window_name.nitems) { >- int cnt; >- char** list = nullptr; >- status = >- Xutf8TextPropertyToTextList(display(), &window_name, &list, &cnt); >- if (status >= Success && cnt && *list) { >- if (cnt > 1) { >- RTC_LOG(LS_INFO) << "Window has " << cnt >- << " text properties, only using the first one."; >- } >- *title = *list; >- result = true; >- } >- if (list) >- XFreeStringList(list); >- } >- if (window_name.value) >- XFree(window_name.value); >- } >- return result; >-} >- >-} // namespace >- >-// static >-std::unique_ptr<DesktopCapturer> DesktopCapturer::CreateRawWindowCapturer( >- const DesktopCaptureOptions& options) { >- if (!options.x_display()) >- return nullptr; >- return std::unique_ptr<DesktopCapturer>(new WindowCapturerLinux(options)); >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/window_finder_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/window_finder_unittest.cc >index 88327aaac5483ad95199810177e875e2f6838120..df9067ef2cb597335732aefe491cea24a4e38b72 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/window_finder_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/window_finder_unittest.cc >@@ -21,8 +21,8 @@ > > #if defined(USE_X11) > #include "absl/memory/memory.h" >-#include "modules/desktop_capture/x11/shared_x_display.h" >-#include "modules/desktop_capture/x11/x_atom_cache.h" >+#include "modules/desktop_capture/linux/shared_x_display.h" >+#include "modules/desktop_capture/linux/x_atom_cache.h" > #endif > > #if defined(WEBRTC_WIN) >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/window_finder_x11.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/window_finder_x11.cc >deleted file mode 100644 >index 03694326d2fd840fb4e932a290e8dada17aed52b..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/window_finder_x11.cc >+++ /dev/null >@@ -1,49 +0,0 @@ >-/* >- * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include "modules/desktop_capture/window_finder_x11.h" >- >-#include "absl/memory/memory.h" >-#include "modules/desktop_capture/x11/window_list_utils.h" >-#include "rtc_base/checks.h" >- >-namespace webrtc { >- >-WindowFinderX11::WindowFinderX11(XAtomCache* cache) : cache_(cache) { >- RTC_DCHECK(cache_); >-} >- >-WindowFinderX11::~WindowFinderX11() = default; >- >-WindowId WindowFinderX11::GetWindowUnderPoint(DesktopVector point) { >- WindowId id = kNullWindowId; >- GetWindowList(cache_, [&id, this, point](::Window window) { >- DesktopRect rect; >- if (GetWindowRect(this->cache_->display(), window, &rect) && >- rect.Contains(point)) { >- id = window; >- return false; >- } >- return true; >- }); >- return id; >-} >- >-// static >-std::unique_ptr<WindowFinder> WindowFinder::Create( >- const WindowFinder::Options& options) { >- if (options.cache == nullptr) { >- return nullptr; >- } >- >- return absl::make_unique<WindowFinderX11>(options.cache); >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/window_finder_x11.h 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/window_finder_x11.h >deleted file mode 100644 >index 38c30c4670bac82e3995c9e1177ca14cba6d7ed2..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/window_finder_x11.h >+++ /dev/null >@@ -1,35 +0,0 @@ >-/* >- * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef MODULES_DESKTOP_CAPTURE_WINDOW_FINDER_X11_H_ >-#define MODULES_DESKTOP_CAPTURE_WINDOW_FINDER_X11_H_ >- >-#include "modules/desktop_capture/window_finder.h" >- >-namespace webrtc { >- >-class XAtomCache; >- >-// The implementation of WindowFinder for X11. >-class WindowFinderX11 final : public WindowFinder { >- public: >- explicit WindowFinderX11(XAtomCache* cache); >- ~WindowFinderX11() override; >- >- // WindowFinder implementation. >- WindowId GetWindowUnderPoint(DesktopVector point) override; >- >- private: >- XAtomCache* const cache_; >-}; >- >-} // namespace webrtc >- >-#endif // MODULES_DESKTOP_CAPTURE_WINDOW_FINDER_X11_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/shared_x_display.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/shared_x_display.cc >deleted file mode 100644 >index db6a64b81d9b751ba95a3caf2be9b2e8719aa1d1..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/shared_x_display.cc >+++ /dev/null >@@ -1,89 +0,0 @@ >-/* >- * Copyright (c) 2013 The WebRTC project authors. All Rights Reserved. 
>- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include "modules/desktop_capture/x11/shared_x_display.h" >- >-#include <X11/Xlib.h> >- >-#include <algorithm> >- >-#include "rtc_base/checks.h" >-#include "rtc_base/logging.h" >- >-namespace webrtc { >- >-SharedXDisplay::SharedXDisplay(Display* display) : display_(display) { >- RTC_DCHECK(display_); >-} >- >-SharedXDisplay::~SharedXDisplay() { >- RTC_DCHECK(event_handlers_.empty()); >- XCloseDisplay(display_); >-} >- >-// static >-rtc::scoped_refptr<SharedXDisplay> SharedXDisplay::Create( >- const std::string& display_name) { >- Display* display = >- XOpenDisplay(display_name.empty() ? NULL : display_name.c_str()); >- if (!display) { >- RTC_LOG(LS_ERROR) << "Unable to open display"; >- return NULL; >- } >- return new SharedXDisplay(display); >-} >- >-// static >-rtc::scoped_refptr<SharedXDisplay> SharedXDisplay::CreateDefault() { >- return Create(std::string()); >-} >- >-void SharedXDisplay::AddEventHandler(int type, XEventHandler* handler) { >- event_handlers_[type].push_back(handler); >-} >- >-void SharedXDisplay::RemoveEventHandler(int type, XEventHandler* handler) { >- EventHandlersMap::iterator handlers = event_handlers_.find(type); >- if (handlers == event_handlers_.end()) >- return; >- >- std::vector<XEventHandler*>::iterator new_end = >- std::remove(handlers->second.begin(), handlers->second.end(), handler); >- handlers->second.erase(new_end, handlers->second.end()); >- >- // Check if no handlers left for this event. 
>- if (handlers->second.empty()) >- event_handlers_.erase(handlers); >-} >- >-void SharedXDisplay::ProcessPendingXEvents() { >- // Hold reference to |this| to prevent it from being destroyed while >- // processing events. >- rtc::scoped_refptr<SharedXDisplay> self(this); >- >- // Find the number of events that are outstanding "now." We don't just loop >- // on XPending because we want to guarantee this terminates. >- int events_to_process = XPending(display()); >- XEvent e; >- >- for (int i = 0; i < events_to_process; i++) { >- XNextEvent(display(), &e); >- EventHandlersMap::iterator handlers = event_handlers_.find(e.type); >- if (handlers == event_handlers_.end()) >- continue; >- for (std::vector<XEventHandler*>::iterator it = handlers->second.begin(); >- it != handlers->second.end(); ++it) { >- if ((*it)->HandleXEvent(e)) >- break; >- } >- } >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/shared_x_display.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/shared_x_display.h >deleted file mode 100644 >index 4d93b3b175144d14c7d363b4c1b3c62787c48cef..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/shared_x_display.h >+++ /dev/null >@@ -1,81 +0,0 @@ >-/* >- * Copyright (c) 2013 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#ifndef MODULES_DESKTOP_CAPTURE_X11_SHARED_X_DISPLAY_H_ >-#define MODULES_DESKTOP_CAPTURE_X11_SHARED_X_DISPLAY_H_ >- >-#include <map> >-#include <vector> >- >-#include <string> >- >-#include "api/refcountedbase.h" >-#include "rtc_base/constructormagic.h" >-#include "rtc_base/scoped_ref_ptr.h" >- >-// Including Xlib.h will involve evil defines (Bool, Status, True, False), which >-// easily conflict with other headers. >-typedef struct _XDisplay Display; >-typedef union _XEvent XEvent; >- >-namespace webrtc { >- >-// A ref-counted object to store XDisplay connection. >-class SharedXDisplay : public rtc::RefCountedBase { >- public: >- class XEventHandler { >- public: >- virtual ~XEventHandler() {} >- >- // Processes XEvent. Returns true if the event has been handled. >- virtual bool HandleXEvent(const XEvent& event) = 0; >- }; >- >- // Takes ownership of |display|. >- explicit SharedXDisplay(Display* display); >- >- // Creates a new X11 Display for the |display_name|. NULL is returned if X11 >- // connection failed. Equivalent to CreateDefault() when |display_name| is >- // empty. >- static rtc::scoped_refptr<SharedXDisplay> Create( >- const std::string& display_name); >- >- // Creates X11 Display connection for the default display (e.g. specified in >- // DISPLAY). NULL is returned if X11 connection failed. >- static rtc::scoped_refptr<SharedXDisplay> CreateDefault(); >- >- Display* display() { return display_; } >- >- // Adds a new event |handler| for XEvent's of |type|. >- void AddEventHandler(int type, XEventHandler* handler); >- >- // Removes event |handler| added using |AddEventHandler|. Doesn't do anything >- // if |handler| is not registered. >- void RemoveEventHandler(int type, XEventHandler* handler); >- >- // Processes pending XEvents, calling corresponding event handlers. 
>- void ProcessPendingXEvents(); >- >- protected: >- ~SharedXDisplay() override; >- >- private: >- typedef std::map<int, std::vector<XEventHandler*> > EventHandlersMap; >- >- Display* display_; >- >- EventHandlersMap event_handlers_; >- >- RTC_DISALLOW_COPY_AND_ASSIGN(SharedXDisplay); >-}; >- >-} // namespace webrtc >- >-#endif // MODULES_DESKTOP_CAPTURE_X11_SHARED_X_DISPLAY_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/window_list_utils.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/window_list_utils.cc >deleted file mode 100644 >index fe070d887824c0af65a9f657917ee9286cbaca5c..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/window_list_utils.cc >+++ /dev/null >@@ -1,246 +0,0 @@ >-/* >- * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include "modules/desktop_capture/x11/window_list_utils.h" >- >-#include <X11/Xatom.h> >-#include <X11/Xlib.h> >-#include <X11/Xutil.h> >-#include <string.h> >- >-#include <algorithm> >- >-#include "modules/desktop_capture/x11/x_error_trap.h" >-#include "rtc_base/checks.h" >-#include "rtc_base/constructormagic.h" >-#include "rtc_base/logging.h" >- >-namespace webrtc { >- >-namespace { >- >-class DeferXFree { >- public: >- explicit DeferXFree(void* data) : data_(data) {} >- ~DeferXFree(); >- >- private: >- void* const data_; >-}; >- >-DeferXFree::~DeferXFree() { >- if (data_) >- XFree(data_); >-} >- >-// Convenience wrapper for XGetWindowProperty() results. 
>-template <class PropertyType> >-class XWindowProperty { >- public: >- XWindowProperty(Display* display, Window window, Atom property) { >- const int kBitsPerByte = 8; >- Atom actual_type; >- int actual_format; >- unsigned long bytes_after; // NOLINT: type required by XGetWindowProperty >- int status = XGetWindowProperty( >- display, window, property, 0L, ~0L, False, AnyPropertyType, >- &actual_type, &actual_format, &size_, &bytes_after, &data_); >- if (status != Success) { >- data_ = nullptr; >- return; >- } >- if (sizeof(PropertyType) * kBitsPerByte != actual_format) { >- size_ = 0; >- return; >- } >- >- is_valid_ = true; >- } >- >- ~XWindowProperty() { >- if (data_) >- XFree(data_); >- } >- >- // True if we got properly value successfully. >- bool is_valid() const { return is_valid_; } >- >- // Size and value of the property. >- size_t size() const { return size_; } >- const PropertyType* data() const { >- return reinterpret_cast<PropertyType*>(data_); >- } >- PropertyType* data() { return reinterpret_cast<PropertyType*>(data_); } >- >- private: >- bool is_valid_ = false; >- unsigned long size_ = 0; // NOLINT: type required by XGetWindowProperty >- unsigned char* data_ = nullptr; >- >- RTC_DISALLOW_COPY_AND_ASSIGN(XWindowProperty); >-}; >- >-// Iterates through |window| hierarchy to find first visible window, i.e. one >-// that has WM_STATE property set to NormalState. >-// See http://tronche.com/gui/x/icccm/sec-4.html#s-4.1.3.1 . >-::Window GetApplicationWindow(XAtomCache* cache, ::Window window) { >- int32_t state = GetWindowState(cache, window); >- if (state == NormalState) { >- // Window has WM_STATE==NormalState. Return it. >- return window; >- } else if (state == IconicState) { >- // Window is in minimized. Skip it. >- return 0; >- } >- >- RTC_DCHECK_EQ(state, WithdrawnState); >- // If the window is in WithdrawnState then look at all of its children. 
>- ::Window root, parent; >- ::Window* children; >- unsigned int num_children; >- if (!XQueryTree(cache->display(), window, &root, &parent, &children, >- &num_children)) { >- RTC_LOG(LS_ERROR) << "Failed to query for child windows although window" >- << "does not have a valid WM_STATE."; >- return 0; >- } >- ::Window app_window = 0; >- for (unsigned int i = 0; i < num_children; ++i) { >- app_window = GetApplicationWindow(cache, children[i]); >- if (app_window) >- break; >- } >- >- if (children) >- XFree(children); >- return app_window; >-} >- >-// Returns true if the |window| is a desktop element. >-bool IsDesktopElement(XAtomCache* cache, ::Window window) { >- RTC_DCHECK(cache); >- if (window == 0) >- return false; >- >- // First look for _NET_WM_WINDOW_TYPE. The standard >- // (http://standards.freedesktop.org/wm-spec/latest/ar01s05.html#id2760306) >- // says this hint *should* be present on all windows, and we use the existence >- // of _NET_WM_WINDOW_TYPE_NORMAL in the property to indicate a window is not >- // a desktop element (that is, only "normal" windows should be shareable). >- XWindowProperty<uint32_t> window_type(cache->display(), window, >- cache->WindowType()); >- if (window_type.is_valid() && window_type.size() > 0) { >- uint32_t* end = window_type.data() + window_type.size(); >- bool is_normal = >- (end != std::find(window_type.data(), end, cache->WindowTypeNormal())); >- return !is_normal; >- } >- >- // Fall back on using the hint. >- XClassHint class_hint; >- Status status = XGetClassHint(cache->display(), window, &class_hint); >- if (status == 0) { >- // No hints, assume this is a normal application window. 
>- return false; >- } >- >- DeferXFree free_res_name(class_hint.res_name); >- DeferXFree free_res_class(class_hint.res_class); >- return strcmp("gnome-panel", class_hint.res_name) == 0 || >- strcmp("desktop_window", class_hint.res_name) == 0; >-} >- >-} // namespace >- >-int32_t GetWindowState(XAtomCache* cache, ::Window window) { >- // Get WM_STATE property of the window. >- XWindowProperty<uint32_t> window_state(cache->display(), window, >- cache->WmState()); >- >- // WM_STATE is considered to be set to WithdrawnState when it missing. >- return window_state.is_valid() ? *window_state.data() : WithdrawnState; >-} >- >-bool GetWindowList(XAtomCache* cache, >- rtc::FunctionView<bool(::Window)> on_window) { >- RTC_DCHECK(cache); >- RTC_DCHECK(on_window); >- ::Display* const display = cache->display(); >- >- int failed_screens = 0; >- const int num_screens = XScreenCount(display); >- for (int screen = 0; screen < num_screens; screen++) { >- ::Window root_window = XRootWindow(display, screen); >- ::Window parent; >- ::Window* children; >- unsigned int num_children; >- { >- XErrorTrap error_trap(display); >- if (XQueryTree(display, root_window, &root_window, &parent, &children, >- &num_children) == 0 || >- error_trap.GetLastErrorAndDisable() != 0) { >- failed_screens++; >- RTC_LOG(LS_ERROR) << "Failed to query for child windows for screen " >- << screen; >- continue; >- } >- } >- >- DeferXFree free_children(children); >- >- for (unsigned int i = 0; i < num_children; i++) { >- // Iterates in reverse order to return windows from front to back. 
>- ::Window app_window = >- GetApplicationWindow(cache, children[num_children - 1 - i]); >- if (app_window && !IsDesktopElement(cache, app_window)) { >- if (!on_window(app_window)) { >- return true; >- } >- } >- } >- } >- >- return failed_screens < num_screens; >-} >- >-bool GetWindowRect(::Display* display, >- ::Window window, >- DesktopRect* rect, >- XWindowAttributes* attributes /* = nullptr */) { >- XWindowAttributes local_attributes; >- int offset_x; >- int offset_y; >- if (attributes == nullptr) { >- attributes = &local_attributes; >- } >- >- { >- XErrorTrap error_trap(display); >- if (!XGetWindowAttributes(display, window, attributes) || >- error_trap.GetLastErrorAndDisable() != 0) { >- return false; >- } >- } >- *rect = DesktopRectFromXAttributes(*attributes); >- >- { >- XErrorTrap error_trap(display); >- ::Window child; >- if (!XTranslateCoordinates(display, window, attributes->root, -rect->left(), >- -rect->top(), &offset_x, &offset_y, &child) || >- error_trap.GetLastErrorAndDisable() != 0) { >- return false; >- } >- } >- rect->Translate(offset_x, offset_y); >- return true; >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/window_list_utils.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/window_list_utils.h >deleted file mode 100644 >index 72d2a70d2107ddda919db78d91130de2d4813e2e..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/window_list_utils.h >+++ /dev/null >@@ -1,54 +0,0 @@ >-/* >- * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#ifndef MODULES_DESKTOP_CAPTURE_X11_WINDOW_LIST_UTILS_H_ >-#define MODULES_DESKTOP_CAPTURE_X11_WINDOW_LIST_UTILS_H_ >- >-#include <X11/Xlib.h> >- >-#include "modules/desktop_capture/desktop_geometry.h" >-#include "modules/desktop_capture/x11/x_atom_cache.h" >-#include "rtc_base/function_view.h" >- >-namespace webrtc { >- >-// Synchronously iterates all on-screen windows in |cache|.display() in >-// decreasing z-order and sends them one-by-one to |on_window| function before >-// GetWindowList() returns. If |on_window| returns false, this function ignores >-// other windows and returns immediately. GetWindowList() returns false if >-// native APIs failed. If multiple screens are attached to the |display|, this >-// function returns false only when native APIs failed on all screens. Menus, >-// panels and minimized windows will be ignored. >-bool GetWindowList(XAtomCache* cache, >- rtc::FunctionView<bool(::Window)> on_window); >- >-// Returns WM_STATE property of the |window|. This function returns >-// WithdrawnState if the |window| is missing. >-int32_t GetWindowState(XAtomCache* cache, ::Window window); >- >-// Returns the rectangle of the |window| in the coordinates of |display|. This >-// function returns false if native APIs failed. If |attributes| is provided, it >-// will be filled with the attributes of |window|. The |rect| is in system >-// coordinate, i.e. the primary monitor always starts from (0, 0). >-bool GetWindowRect(::Display* display, >- ::Window window, >- DesktopRect* rect, >- XWindowAttributes* attributes = nullptr); >- >-// Creates a DesktopRect from |attributes|. 
>-template <typename T> >-DesktopRect DesktopRectFromXAttributes(const T& attributes) { >- return DesktopRect::MakeXYWH(attributes.x, attributes.y, attributes.width, >- attributes.height); >-} >- >-} // namespace webrtc >- >-#endif // MODULES_DESKTOP_CAPTURE_X11_WINDOW_LIST_UTILS_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/x_atom_cache.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/x_atom_cache.cc >deleted file mode 100644 >index fa3a1fdebddd5d91bb99a87a8d0dc6b16c3d58a7..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/x_atom_cache.cc >+++ /dev/null >@@ -1,47 +0,0 @@ >-/* >- * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include "modules/desktop_capture/x11/x_atom_cache.h" >- >-#include "rtc_base/checks.h" >- >-namespace webrtc { >- >-XAtomCache::XAtomCache(::Display* display) : display_(display) { >- RTC_DCHECK(display_); >-} >- >-XAtomCache::~XAtomCache() = default; >- >-::Display* XAtomCache::display() const { >- return display_; >-} >- >-Atom XAtomCache::WmState() { >- return CreateIfNotExist(&wm_state_, "WM_STATE"); >-} >- >-Atom XAtomCache::WindowType() { >- return CreateIfNotExist(&window_type_, "_NET_WM_WINDOW_TYPE"); >-} >- >-Atom XAtomCache::WindowTypeNormal() { >- return CreateIfNotExist(&window_type_normal_, "_NET_WM_WINDOW_TYPE_NORMAL"); >-} >- >-Atom XAtomCache::CreateIfNotExist(Atom* atom, const char* name) { >- RTC_DCHECK(atom); >- if (*atom == None) { >- *atom = XInternAtom(display(), name, True); >- } >- return *atom; >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/x_atom_cache.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/x_atom_cache.h >deleted file mode 100644 >index 5246c2bb5c9a86b8b0db1c66e95b6e2415d7e686..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/x_atom_cache.h >+++ /dev/null >@@ -1,43 +0,0 @@ >-/* >- * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef MODULES_DESKTOP_CAPTURE_X11_X_ATOM_CACHE_H_ >-#define MODULES_DESKTOP_CAPTURE_X11_X_ATOM_CACHE_H_ >- >-#include <X11/Xatom.h> >-#include <X11/Xlib.h> >- >-namespace webrtc { >- >-// A cache of Atom. Each Atom object is created on demand. 
>-class XAtomCache final { >- public: >- explicit XAtomCache(::Display* display); >- ~XAtomCache(); >- >- ::Display* display() const; >- >- Atom WmState(); >- Atom WindowType(); >- Atom WindowTypeNormal(); >- >- private: >- // If |*atom| is None, this function uses XInternAtom() to retrieve an Atom. >- Atom CreateIfNotExist(Atom* atom, const char* name); >- >- ::Display* const display_; >- Atom wm_state_ = None; >- Atom window_type_ = None; >- Atom window_type_normal_ = None; >-}; >- >-} // namespace webrtc >- >-#endif // MODULES_DESKTOP_CAPTURE_X11_X_ATOM_CACHE_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/x_error_trap.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/x_error_trap.cc >deleted file mode 100644 >index 6559c3dbe9f5166564ae8abb608bc2ee277b39d7..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/x_error_trap.cc >+++ /dev/null >@@ -1,68 +0,0 @@ >-/* >- * Copyright (c) 2013 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include "modules/desktop_capture/x11/x_error_trap.h" >- >-#include <assert.h> >- >-#if defined(TOOLKIT_GTK) >-#include <gdk/gdk.h> >-#endif // !defined(TOOLKIT_GTK) >- >-namespace webrtc { >- >-namespace { >- >-#if !defined(TOOLKIT_GTK) >- >-// TODO(sergeyu): This code is not thread safe. Fix it. Bug 2202. 
>-static bool g_xserver_error_trap_enabled = false; >-static int g_last_xserver_error_code = 0; >- >-int XServerErrorHandler(Display* display, XErrorEvent* error_event) { >- assert(g_xserver_error_trap_enabled); >- g_last_xserver_error_code = error_event->error_code; >- return 0; >-} >- >-#endif // !defined(TOOLKIT_GTK) >- >-} // namespace >- >-XErrorTrap::XErrorTrap(Display* display) >- : original_error_handler_(NULL), enabled_(true) { >-#if defined(TOOLKIT_GTK) >- gdk_error_trap_push(); >-#else // !defined(TOOLKIT_GTK) >- assert(!g_xserver_error_trap_enabled); >- original_error_handler_ = XSetErrorHandler(&XServerErrorHandler); >- g_xserver_error_trap_enabled = true; >- g_last_xserver_error_code = 0; >-#endif // !defined(TOOLKIT_GTK) >-} >- >-int XErrorTrap::GetLastErrorAndDisable() { >- enabled_ = false; >-#if defined(TOOLKIT_GTK) >- return gdk_error_trap_push(); >-#else // !defined(TOOLKIT_GTK) >- assert(g_xserver_error_trap_enabled); >- XSetErrorHandler(original_error_handler_); >- g_xserver_error_trap_enabled = false; >- return g_last_xserver_error_code; >-#endif // !defined(TOOLKIT_GTK) >-} >- >-XErrorTrap::~XErrorTrap() { >- if (enabled_) >- GetLastErrorAndDisable(); >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/x_error_trap.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/x_error_trap.h >deleted file mode 100644 >index ea9ebf670b7ea10fe7528a6207a14f19a6fec90f..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/x_error_trap.h >+++ /dev/null >@@ -1,39 +0,0 @@ >-/* >- * Copyright (c) 2013 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. 
All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef MODULES_DESKTOP_CAPTURE_X11_X_ERROR_TRAP_H_ >-#define MODULES_DESKTOP_CAPTURE_X11_X_ERROR_TRAP_H_ >- >-#include <X11/Xlib.h> >- >-#include "rtc_base/constructormagic.h" >- >-namespace webrtc { >- >-// Helper class that registers X Window error handler. Caller can use >-// GetLastErrorAndDisable() to get the last error that was caught, if any. >-class XErrorTrap { >- public: >- explicit XErrorTrap(Display* display); >- ~XErrorTrap(); >- >- // Returns last error and removes unregisters the error handler. >- int GetLastErrorAndDisable(); >- >- private: >- XErrorHandler original_error_handler_; >- bool enabled_; >- >- RTC_DISALLOW_COPY_AND_ASSIGN(XErrorTrap); >-}; >- >-} // namespace webrtc >- >-#endif // MODULES_DESKTOP_CAPTURE_X11_X_ERROR_TRAP_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/x_server_pixel_buffer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/x_server_pixel_buffer.cc >deleted file mode 100644 >index e327d3fc7dde2a21283987ea4882213dd514d0cf..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/x_server_pixel_buffer.cc >+++ /dev/null >@@ -1,351 +0,0 @@ >-/* >- * Copyright (c) 2013 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include "modules/desktop_capture/x11/x_server_pixel_buffer.h" >- >-#include <string.h> >-#include <sys/shm.h> >- >-#include "modules/desktop_capture/desktop_frame.h" >-#include "modules/desktop_capture/x11/window_list_utils.h" >-#include "modules/desktop_capture/x11/x_error_trap.h" >-#include "rtc_base/checks.h" >-#include "rtc_base/logging.h" >- >-namespace webrtc { >- >-namespace { >- >-// Returns the number of bits |mask| has to be shifted left so its last >-// (most-significant) bit set becomes the most-significant bit of the word. >-// When |mask| is 0 the function returns 31. >-uint32_t MaskToShift(uint32_t mask) { >- int shift = 0; >- if ((mask & 0xffff0000u) == 0) { >- mask <<= 16; >- shift += 16; >- } >- if ((mask & 0xff000000u) == 0) { >- mask <<= 8; >- shift += 8; >- } >- if ((mask & 0xf0000000u) == 0) { >- mask <<= 4; >- shift += 4; >- } >- if ((mask & 0xc0000000u) == 0) { >- mask <<= 2; >- shift += 2; >- } >- if ((mask & 0x80000000u) == 0) >- shift += 1; >- >- return shift; >-} >- >-// Returns true if |image| is in RGB format. >-bool IsXImageRGBFormat(XImage* image) { >- return image->bits_per_pixel == 32 && image->red_mask == 0xff0000 && >- image->green_mask == 0xff00 && image->blue_mask == 0xff; >-} >- >-// We expose two forms of blitting to handle variations in the pixel format. >-// In FastBlit(), the operation is effectively a memcpy. 
>-void FastBlit(XImage* x_image, >- uint8_t* src_pos, >- const DesktopRect& rect, >- DesktopFrame* frame) { >- int src_stride = x_image->bytes_per_line; >- int dst_x = rect.left(), dst_y = rect.top(); >- >- uint8_t* dst_pos = frame->data() + frame->stride() * dst_y; >- dst_pos += dst_x * DesktopFrame::kBytesPerPixel; >- >- int height = rect.height(); >- int row_bytes = rect.width() * DesktopFrame::kBytesPerPixel; >- for (int y = 0; y < height; ++y) { >- memcpy(dst_pos, src_pos, row_bytes); >- src_pos += src_stride; >- dst_pos += frame->stride(); >- } >-} >- >-void SlowBlit(XImage* x_image, >- uint8_t* src_pos, >- const DesktopRect& rect, >- DesktopFrame* frame) { >- int src_stride = x_image->bytes_per_line; >- int dst_x = rect.left(), dst_y = rect.top(); >- int width = rect.width(), height = rect.height(); >- >- uint32_t red_mask = x_image->red_mask; >- uint32_t green_mask = x_image->red_mask; >- uint32_t blue_mask = x_image->blue_mask; >- >- uint32_t red_shift = MaskToShift(red_mask); >- uint32_t green_shift = MaskToShift(green_mask); >- uint32_t blue_shift = MaskToShift(blue_mask); >- >- int bits_per_pixel = x_image->bits_per_pixel; >- >- uint8_t* dst_pos = frame->data() + frame->stride() * dst_y; >- dst_pos += dst_x * DesktopFrame::kBytesPerPixel; >- // TODO(hclam): Optimize, perhaps using MMX code or by converting to >- // YUV directly. >- // TODO(sergeyu): This code doesn't handle XImage byte order properly and >- // won't work with 24bpp images. Fix it. >- for (int y = 0; y < height; y++) { >- uint32_t* dst_pos_32 = reinterpret_cast<uint32_t*>(dst_pos); >- uint32_t* src_pos_32 = reinterpret_cast<uint32_t*>(src_pos); >- uint16_t* src_pos_16 = reinterpret_cast<uint16_t*>(src_pos); >- for (int x = 0; x < width; x++) { >- // Dereference through an appropriately-aligned pointer. 
>- uint32_t pixel; >- if (bits_per_pixel == 32) { >- pixel = src_pos_32[x]; >- } else if (bits_per_pixel == 16) { >- pixel = src_pos_16[x]; >- } else { >- pixel = src_pos[x]; >- } >- uint32_t r = (pixel & red_mask) << red_shift; >- uint32_t g = (pixel & green_mask) << green_shift; >- uint32_t b = (pixel & blue_mask) << blue_shift; >- // Write as 32-bit RGB. >- dst_pos_32[x] = >- ((r >> 8) & 0xff0000) | ((g >> 16) & 0xff00) | ((b >> 24) & 0xff); >- } >- dst_pos += frame->stride(); >- src_pos += src_stride; >- } >-} >- >-} // namespace >- >-XServerPixelBuffer::XServerPixelBuffer() {} >- >-XServerPixelBuffer::~XServerPixelBuffer() { >- Release(); >-} >- >-void XServerPixelBuffer::Release() { >- if (x_image_) { >- XDestroyImage(x_image_); >- x_image_ = nullptr; >- } >- if (x_shm_image_) { >- XDestroyImage(x_shm_image_); >- x_shm_image_ = nullptr; >- } >- if (shm_pixmap_) { >- XFreePixmap(display_, shm_pixmap_); >- shm_pixmap_ = 0; >- } >- if (shm_gc_) { >- XFreeGC(display_, shm_gc_); >- shm_gc_ = nullptr; >- } >- >- ReleaseSharedMemorySegment(); >- >- window_ = 0; >-} >- >-void XServerPixelBuffer::ReleaseSharedMemorySegment() { >- if (!shm_segment_info_) >- return; >- if (shm_segment_info_->shmaddr != nullptr) >- shmdt(shm_segment_info_->shmaddr); >- if (shm_segment_info_->shmid != -1) >- shmctl(shm_segment_info_->shmid, IPC_RMID, 0); >- delete shm_segment_info_; >- shm_segment_info_ = nullptr; >-} >- >-bool XServerPixelBuffer::Init(Display* display, Window window) { >- Release(); >- display_ = display; >- >- XWindowAttributes attributes; >- if (!GetWindowRect(display_, window, &window_rect_, &attributes)) { >- return false; >- } >- >- window_ = window; >- InitShm(attributes); >- >- return true; >-} >- >-void XServerPixelBuffer::InitShm(const XWindowAttributes& attributes) { >- Visual* default_visual = attributes.visual; >- int default_depth = attributes.depth; >- >- int major, minor; >- Bool have_pixmaps; >- if (!XShmQueryVersion(display_, &major, &minor, 
&have_pixmaps)) { >- // Shared memory not supported. CaptureRect will use the XImage API instead. >- return; >- } >- >- bool using_shm = false; >- shm_segment_info_ = new XShmSegmentInfo; >- shm_segment_info_->shmid = -1; >- shm_segment_info_->shmaddr = nullptr; >- shm_segment_info_->readOnly = False; >- x_shm_image_ = XShmCreateImage(display_, default_visual, default_depth, >- ZPixmap, 0, shm_segment_info_, >- window_rect_.width(), window_rect_.height()); >- if (x_shm_image_) { >- shm_segment_info_->shmid = >- shmget(IPC_PRIVATE, x_shm_image_->bytes_per_line * x_shm_image_->height, >- IPC_CREAT | 0600); >- if (shm_segment_info_->shmid != -1) { >- void* shmat_result = shmat(shm_segment_info_->shmid, 0, 0); >- if (shmat_result != reinterpret_cast<void*>(-1)) { >- shm_segment_info_->shmaddr = reinterpret_cast<char*>(shmat_result); >- x_shm_image_->data = shm_segment_info_->shmaddr; >- >- XErrorTrap error_trap(display_); >- using_shm = XShmAttach(display_, shm_segment_info_); >- XSync(display_, False); >- if (error_trap.GetLastErrorAndDisable() != 0) >- using_shm = false; >- if (using_shm) { >- RTC_LOG(LS_VERBOSE) >- << "Using X shared memory segment " << shm_segment_info_->shmid; >- } >- } >- } else { >- RTC_LOG(LS_WARNING) << "Failed to get shared memory segment. " >- "Performance may be degraded."; >- } >- } >- >- if (!using_shm) { >- RTC_LOG(LS_WARNING) >- << "Not using shared memory. Performance may be degraded."; >- ReleaseSharedMemorySegment(); >- return; >- } >- >- if (have_pixmaps) >- have_pixmaps = InitPixmaps(default_depth); >- >- shmctl(shm_segment_info_->shmid, IPC_RMID, 0); >- shm_segment_info_->shmid = -1; >- >- RTC_LOG(LS_VERBOSE) << "Using X shared memory extension v" << major << "." >- << minor << " with" << (have_pixmaps ? 
"" : "out") >- << " pixmaps."; >-} >- >-bool XServerPixelBuffer::InitPixmaps(int depth) { >- if (XShmPixmapFormat(display_) != ZPixmap) >- return false; >- >- { >- XErrorTrap error_trap(display_); >- shm_pixmap_ = XShmCreatePixmap( >- display_, window_, shm_segment_info_->shmaddr, shm_segment_info_, >- window_rect_.width(), window_rect_.height(), depth); >- XSync(display_, False); >- if (error_trap.GetLastErrorAndDisable() != 0) { >- // |shm_pixmap_| is not not valid because the request was not processed >- // by the X Server, so zero it. >- shm_pixmap_ = 0; >- return false; >- } >- } >- >- { >- XErrorTrap error_trap(display_); >- XGCValues shm_gc_values; >- shm_gc_values.subwindow_mode = IncludeInferiors; >- shm_gc_values.graphics_exposures = False; >- shm_gc_ = XCreateGC(display_, window_, >- GCSubwindowMode | GCGraphicsExposures, &shm_gc_values); >- XSync(display_, False); >- if (error_trap.GetLastErrorAndDisable() != 0) { >- XFreePixmap(display_, shm_pixmap_); >- shm_pixmap_ = 0; >- shm_gc_ = 0; // See shm_pixmap_ comment above. >- return false; >- } >- } >- >- return true; >-} >- >-bool XServerPixelBuffer::IsWindowValid() const { >- XWindowAttributes attributes; >- { >- XErrorTrap error_trap(display_); >- if (!XGetWindowAttributes(display_, window_, &attributes) || >- error_trap.GetLastErrorAndDisable() != 0) { >- return false; >- } >- } >- return true; >-} >- >-void XServerPixelBuffer::Synchronize() { >- if (shm_segment_info_ && !shm_pixmap_) { >- // XShmGetImage can fail if the display is being reconfigured. >- XErrorTrap error_trap(display_); >- // XShmGetImage fails if the window is partially out of screen. 
>- xshm_get_image_succeeded_ = >- XShmGetImage(display_, window_, x_shm_image_, 0, 0, AllPlanes); >- } >-} >- >-bool XServerPixelBuffer::CaptureRect(const DesktopRect& rect, >- DesktopFrame* frame) { >- RTC_DCHECK_LE(rect.right(), window_rect_.width()); >- RTC_DCHECK_LE(rect.bottom(), window_rect_.height()); >- >- XImage* image; >- uint8_t* data; >- >- if (shm_segment_info_ && (shm_pixmap_ || xshm_get_image_succeeded_)) { >- if (shm_pixmap_) { >- XCopyArea(display_, window_, shm_pixmap_, shm_gc_, rect.left(), >- rect.top(), rect.width(), rect.height(), rect.left(), >- rect.top()); >- XSync(display_, False); >- } >- >- image = x_shm_image_; >- data = reinterpret_cast<uint8_t*>(image->data) + >- rect.top() * image->bytes_per_line + >- rect.left() * image->bits_per_pixel / 8; >- >- } else { >- if (x_image_) >- XDestroyImage(x_image_); >- x_image_ = XGetImage(display_, window_, rect.left(), rect.top(), >- rect.width(), rect.height(), AllPlanes, ZPixmap); >- if (!x_image_) >- return false; >- >- image = x_image_; >- data = reinterpret_cast<uint8_t*>(image->data); >- } >- >- if (IsXImageRGBFormat(image)) { >- FastBlit(image, data, rect, frame); >- } else { >- SlowBlit(image, data, rect, frame); >- } >- >- return true; >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/x_server_pixel_buffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/x_server_pixel_buffer.h >deleted file mode 100644 >index 7e20945f3d529cafee0be365604a4ff35eb99e44..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/desktop_capture/x11/x_server_pixel_buffer.h >+++ /dev/null >@@ -1,84 +0,0 @@ >-/* >- * Copyright (c) 2013 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. 
An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-// Don't include this file in any .h files because it pulls in some X headers. >- >-#ifndef MODULES_DESKTOP_CAPTURE_X11_X_SERVER_PIXEL_BUFFER_H_ >-#define MODULES_DESKTOP_CAPTURE_X11_X_SERVER_PIXEL_BUFFER_H_ >- >-#include "modules/desktop_capture/desktop_geometry.h" >-#include "rtc_base/constructormagic.h" >- >-#include <X11/Xutil.h> >-#include <X11/extensions/XShm.h> >- >-namespace webrtc { >- >-class DesktopFrame; >- >-// A class to allow the X server's pixel buffer to be accessed as efficiently >-// as possible. >-class XServerPixelBuffer { >- public: >- XServerPixelBuffer(); >- ~XServerPixelBuffer(); >- >- void Release(); >- >- // Allocate (or reallocate) the pixel buffer for |window|. Returns false in >- // case of an error (e.g. window doesn't exist). >- bool Init(Display* display, Window window); >- >- bool is_initialized() { return window_ != 0; } >- >- // Returns the size of the window the buffer was initialized for. >- DesktopSize window_size() { return window_rect_.size(); } >- >- // Returns the rectangle of the window the buffer was initialized for. >- const DesktopRect& window_rect() { return window_rect_; } >- >- // Returns true if the window can be found. >- bool IsWindowValid() const; >- >- // If shared memory is being used without pixmaps, synchronize this pixel >- // buffer with the root window contents (otherwise, this is a no-op). >- // This is to avoid doing a full-screen capture for each individual >- // rectangle in the capture list, when it only needs to be done once at the >- // beginning. >- void Synchronize(); >- >- // Capture the specified rectangle and stores it in the |frame|. In the case >- // where the full-screen data is captured by Synchronize(), this simply >- // returns the pointer without doing any more work. 
The caller must ensure >- // that |rect| is not larger than window_size(). >- bool CaptureRect(const DesktopRect& rect, DesktopFrame* frame); >- >- private: >- void ReleaseSharedMemorySegment(); >- >- void InitShm(const XWindowAttributes& attributes); >- bool InitPixmaps(int depth); >- >- Display* display_ = nullptr; >- Window window_ = 0; >- DesktopRect window_rect_; >- XImage* x_image_ = nullptr; >- XShmSegmentInfo* shm_segment_info_ = nullptr; >- XImage* x_shm_image_ = nullptr; >- Pixmap shm_pixmap_ = 0; >- GC shm_gc_ = nullptr; >- bool xshm_get_image_succeeded_ = false; >- >- RTC_DISALLOW_COPY_AND_ASSIGN(XServerPixelBuffer); >-}; >- >-} // namespace webrtc >- >-#endif // MODULES_DESKTOP_CAPTURE_X11_X_SERVER_PIXEL_BUFFER_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/include/module_common_types.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/include/module_common_types.cc >index 4ad5d1444e9c57a71215b5969037af372d5e25a7..80eba2e3051cd2d1fdf0aa51f62f5048da05ea01 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/include/module_common_types.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/include/module_common_types.cc >@@ -11,6 +11,7 @@ > #include "modules/include/module_common_types.h" > > #include <string.h> >+#include <cstdint> > #include <utility> > > #include "rtc_base/numerics/safe_conversions.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/include/module_common_types.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/include/module_common_types.h >index 98ff7673110cd039ec6cd32547ae718d9cbdda7c..e058cc84e66ff79545afe0e083b311246a89f3f1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/include/module_common_types.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/include/module_common_types.h >@@ -19,6 +19,7 @@ > #include "modules/include/module_common_types_public.h" > #include "modules/include/module_fec_types.h" > #include 
"modules/rtp_rtcp/source/rtp_video_header.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > >@@ -33,7 +34,7 @@ struct WebRtcRTPHeader { > int64_t ntp_time_ms; > }; > >-class RTPFragmentationHeader { >+class RTC_EXPORT RTPFragmentationHeader { > public: > RTPFragmentationHeader(); > RTPFragmentationHeader(const RTPFragmentationHeader&) = delete; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/BUILD.gn >index a06233ad90131b407b019684ead81fd7330d0aa4..9bc61936f16822b7d8eab37a50fa55bf8fa365bd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/BUILD.gn >@@ -29,7 +29,6 @@ rtc_static_library("pacing") { > deps = [ > ":interval_budget", > "..:module_api", >- "../../:webrtc_common", > "../../api/transport:network_control", > "../../logging:rtc_event_bwe", > "../../logging:rtc_event_log_api", >@@ -56,7 +55,6 @@ rtc_source_set("interval_budget") { > ] > > deps = [ >- "../../:webrtc_common", > "../../rtc_base:checks", > "../../rtc_base:rtc_base_approved", > ] >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/bitrate_prober.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/bitrate_prober.cc >index cbf1d2b97efb9272eb259caeb0dc3cde93205fa0..4ff7c0d583dd1d708dd90034f1008e013ed1ab53 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/bitrate_prober.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/bitrate_prober.cc >@@ -13,9 +13,9 @@ > #include <algorithm> > > #include "absl/memory/memory.h" >+#include "logging/rtc_event_log/events/rtc_event.h" > #include "logging/rtc_event_log/events/rtc_event_probe_cluster_created.h" > #include "logging/rtc_event_log/rtc_event_log.h" >-#include "modules/pacing/paced_sender.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" > >@@ -96,8 +96,9 @@ void 
BitrateProber::CreateProbeCluster(int bitrate_bps, int64_t now_ms) { > ProbeCluster cluster; > cluster.time_created_ms = now_ms; > cluster.pace_info.probe_cluster_min_probes = kMinProbePacketsSent; >- cluster.pace_info.probe_cluster_min_bytes = >- bitrate_bps * kMinProbeDurationMs / 8000; >+ cluster.pace_info.probe_cluster_min_bytes = static_cast<int32_t>( >+ static_cast<int64_t>(bitrate_bps) * kMinProbeDurationMs / 8000); >+ RTC_DCHECK_GE(cluster.pace_info.probe_cluster_min_bytes, 0); > cluster.pace_info.send_bitrate_bps = bitrate_bps; > cluster.pace_info.probe_cluster_id = next_cluster_id_++; > clusters_.push(cluster); >@@ -126,9 +127,9 @@ int BitrateProber::TimeUntilNextProbe(int64_t now_ms) { > if (next_probe_time_ms_ >= 0) { > time_until_probe_ms = next_probe_time_ms_ - now_ms; > if (time_until_probe_ms < -kMaxProbeDelayMs) { >- RTC_LOG(LS_WARNING) << "Probe delay too high" >- << " (next_ms:" << next_probe_time_ms_ >- << ", now_ms: " << now_ms << ")"; >+ RTC_DLOG(LS_WARNING) << "Probe delay too high" >+ << " (next_ms:" << next_probe_time_ms_ >+ << ", now_ms: " << now_ms << ")"; > return -1; > } > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/bitrate_prober.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/bitrate_prober.h >index b2548c006de83f5e597d3652df48feb20c25c67c..bb98100dfac5c6c74966cec6058c723479ad5693 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/bitrate_prober.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/bitrate_prober.h >@@ -11,10 +11,11 @@ > #ifndef MODULES_PACING_BITRATE_PROBER_H_ > #define MODULES_PACING_BITRATE_PROBER_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <queue> > > #include "api/transport/network_types.h" >-#include "modules/include/module_common_types.h" > > namespace webrtc { > class RtcEventLog; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/bitrate_prober_unittest.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/bitrate_prober_unittest.cc >index dc596982db022b3a9334a1ecbb9a0385bcf10247..95d26593cbddb91fc6325f52f402479a8f903837 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/bitrate_prober_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/bitrate_prober_unittest.cc >@@ -147,6 +147,24 @@ TEST(BitrateProberTest, ScaleBytesUsedForProbing) { > EXPECT_FALSE(prober.IsProbing()); > } > >+TEST(BitrateProberTest, HighBitrateProbing) { >+ BitrateProber prober; >+ constexpr int kBitrateBps = 1000000000; // 1 Gbps. >+ constexpr int kPacketSizeBytes = 1000; >+ constexpr int kExpectedBytesSent = (kBitrateBps / 8000) * 15; >+ >+ prober.CreateProbeCluster(kBitrateBps, 0); >+ prober.OnIncomingPacket(kPacketSizeBytes); >+ int bytes_sent = 0; >+ while (bytes_sent < kExpectedBytesSent) { >+ ASSERT_TRUE(prober.IsProbing()); >+ prober.ProbeSent(0, kPacketSizeBytes); >+ bytes_sent += kPacketSizeBytes; >+ } >+ >+ EXPECT_FALSE(prober.IsProbing()); >+} >+ > TEST(BitrateProberTest, ProbeClusterTimeout) { > BitrateProber prober; > constexpr int kBitrateBps = 300000; // 300 kbps >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/interval_budget.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/interval_budget.h >index 880fe784ba74f07ee812e94dfea9cad5ef3d5ce3..d09b06e34034ae243f6bf1a10c2fddc645cfc108 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/interval_budget.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/interval_budget.h >@@ -11,7 +11,8 @@ > #ifndef MODULES_PACING_INTERVAL_BUDGET_H_ > #define MODULES_PACING_INTERVAL_BUDGET_H_ > >-#include "common_types.h" // NOLINT(build/include) >+#include <stddef.h> >+#include <stdint.h> > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/paced_sender.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/paced_sender.cc >index 
8cc442c228e849fdfbf2dd6b8ebfb33efd17307e..7e0ac781e2b784cb7e9a68a1599260971cda3572 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/paced_sender.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/paced_sender.cc >@@ -11,15 +11,11 @@ > #include "modules/pacing/paced_sender.h" > > #include <algorithm> >-#include <map> >-#include <queue> >-#include <set> > #include <utility> >-#include <vector> > > #include "absl/memory/memory.h" >+#include "logging/rtc_event_log/rtc_event_log.h" > #include "modules/congestion_controller/goog_cc/alr_detector.h" >-#include "modules/include/module_common_types.h" > #include "modules/pacing/bitrate_prober.h" > #include "modules/pacing/interval_budget.h" > #include "modules/utility/include/process_thread.h" >@@ -58,9 +54,9 @@ PacedSender::PacedSender(const Clock* clock, > video_blocks_audio_(!field_trial::IsDisabled("WebRTC-Pacer-BlockAudio")), > last_timestamp_ms_(clock_->TimeInMilliseconds()), > paused_(false), >- media_budget_(absl::make_unique<IntervalBudget>(0)), >- padding_budget_(absl::make_unique<IntervalBudget>(0)), >- prober_(absl::make_unique<BitrateProber>(event_log)), >+ media_budget_(0), >+ padding_budget_(0), >+ prober_(event_log), > probing_send_failure_(false), > estimated_bitrate_bps_(0), > min_send_bitrate_kbps_(0u), >@@ -69,7 +65,7 @@ PacedSender::PacedSender(const Clock* clock, > time_last_process_us_(clock->TimeInMicroseconds()), > last_send_time_us_(clock->TimeInMicroseconds()), > first_sent_packet_ms_(-1), >- packets_(clock), >+ packets_(clock->TimeInMicroseconds()), > packet_counter_(0), > pacing_factor_(kDefaultPaceMultiplier), > queue_time_limit(kMaxQueueLengthMs), >@@ -84,7 +80,7 @@ PacedSender::~PacedSender() {} > > void PacedSender::CreateProbeCluster(int bitrate_bps) { > rtc::CritScope cs(&critsect_); >- prober_->CreateProbeCluster(bitrate_bps, TimeMilliseconds()); >+ prober_.CreateProbeCluster(bitrate_bps, TimeMilliseconds()); > } > > void PacedSender::Pause() { >@@ 
-149,7 +145,7 @@ int64_t PacedSender::TimeMilliseconds() const { > void PacedSender::SetProbingEnabled(bool enabled) { > rtc::CritScope cs(&critsect_); > RTC_CHECK_EQ(0, packet_counter_); >- prober_->SetEnabled(enabled); >+ prober_.SetEnabled(enabled); > } > > void PacedSender::SetEstimatedBitrate(uint32_t bitrate_bps) { >@@ -157,7 +153,7 @@ void PacedSender::SetEstimatedBitrate(uint32_t bitrate_bps) { > RTC_LOG(LS_ERROR) << "PacedSender is not designed to handle 0 bitrate."; > rtc::CritScope cs(&critsect_); > estimated_bitrate_bps_ = bitrate_bps; >- padding_budget_->set_target_rate_kbps( >+ padding_budget_.set_target_rate_kbps( > std::min(estimated_bitrate_bps_ / 1000, max_padding_bitrate_kbps_)); > pacing_bitrate_kbps_ = > std::max(min_send_bitrate_kbps_, estimated_bitrate_bps_ / 1000) * >@@ -173,7 +169,7 @@ void PacedSender::SetSendBitrateLimits(int min_send_bitrate_bps, > std::max(min_send_bitrate_kbps_, estimated_bitrate_bps_ / 1000) * > pacing_factor_; > max_padding_bitrate_kbps_ = padding_bitrate / 1000; >- padding_budget_->set_target_rate_kbps( >+ padding_budget_.set_target_rate_kbps( > std::min(estimated_bitrate_bps_ / 1000, max_padding_bitrate_kbps_)); > } > >@@ -182,7 +178,7 @@ void PacedSender::SetPacingRates(uint32_t pacing_rate_bps, > rtc::CritScope cs(&critsect_); > RTC_DCHECK(pacing_rate_bps > 0); > pacing_bitrate_kbps_ = pacing_rate_bps / 1000; >- padding_budget_->set_target_rate_kbps(padding_rate_bps / 1000); >+ padding_budget_.set_target_rate_kbps(padding_rate_bps / 1000); > } > > void PacedSender::InsertPacket(RtpPacketSender::Priority priority, >@@ -196,7 +192,7 @@ void PacedSender::InsertPacket(RtpPacketSender::Priority priority, > << "SetPacingRate must be called before InsertPacket."; > > int64_t now_ms = TimeMilliseconds(); >- prober_->OnIncomingPacket(bytes); >+ prober_.OnIncomingPacket(bytes); > > if (capture_time_ms < 0) > capture_time_ms = now_ms; >@@ -254,17 +250,15 @@ int64_t PacedSender::TimeUntilNextProcess() { > if (paused_) > 
return std::max<int64_t>(kPausedProcessIntervalMs - elapsed_time_ms, 0); > >- if (prober_->IsProbing()) { >- int64_t ret = prober_->TimeUntilNextProbe(TimeMilliseconds()); >+ if (prober_.IsProbing()) { >+ int64_t ret = prober_.TimeUntilNextProbe(TimeMilliseconds()); > if (ret > 0 || (ret == 0 && !probing_send_failure_)) > return ret; > } > return std::max<int64_t>(kMinPacketLimitMs - elapsed_time_ms, 0); > } > >-void PacedSender::Process() { >- int64_t now_us = clock_->TimeInMicroseconds(); >- rtc::CritScope cs(&critsect_); >+int64_t PacedSender::UpdateTimeAndGetElapsedMs(int64_t now_us) { > int64_t elapsed_time_ms = (now_us - time_last_process_us_ + 500) / 1000; > time_last_process_us_ = now_us; > if (elapsed_time_ms > kMaxElapsedTimeMs) { >@@ -273,6 +267,10 @@ void PacedSender::Process() { > << kMaxElapsedTimeMs << " ms"; > elapsed_time_ms = kMaxElapsedTimeMs; > } >+ return elapsed_time_ms; >+} >+ >+bool PacedSender::ShouldSendKeepalive(int64_t now_us) const { > if (send_padding_if_silent_ || paused_ || Congested()) { > // We send a padding packet every 500 ms to ensure we won't get stuck in > // congested state due to no feedback being received. >@@ -281,12 +279,25 @@ void PacedSender::Process() { > // We can not send padding unless a normal packet has first been sent. If > // we do, timestamps get messed up. 
> if (packet_counter_ > 0) { >- PacedPacketInfo pacing_info; >- size_t bytes_sent = SendPadding(1, pacing_info); >- alr_detector_->OnBytesSent(bytes_sent, now_us / 1000); >+ return true; > } > } > } >+ return false; >+} >+ >+void PacedSender::Process() { >+ rtc::CritScope cs(&critsect_); >+ int64_t now_us = clock_->TimeInMicroseconds(); >+ int64_t elapsed_time_ms = UpdateTimeAndGetElapsedMs(now_us); >+ if (ShouldSendKeepalive(now_us)) { >+ critsect_.Leave(); >+ size_t bytes_sent = packet_sender_->TimeToSendPadding(1, PacedPacketInfo()); >+ critsect_.Enter(); >+ OnPaddingSent(bytes_sent); >+ alr_detector_->OnBytesSent(bytes_sent, now_us / 1000); >+ } >+ > if (paused_) > return; > >@@ -308,35 +319,39 @@ void PacedSender::Process() { > } > } > >- media_budget_->set_target_rate_kbps(target_bitrate_kbps); >+ media_budget_.set_target_rate_kbps(target_bitrate_kbps); > UpdateBudgetWithElapsedTime(elapsed_time_ms); > } > >- bool is_probing = prober_->IsProbing(); >+ bool is_probing = prober_.IsProbing(); > PacedPacketInfo pacing_info; > size_t bytes_sent = 0; > size_t recommended_probe_size = 0; > if (is_probing) { >- pacing_info = prober_->CurrentCluster(); >- recommended_probe_size = prober_->RecommendedMinProbeSize(); >+ pacing_info = prober_.CurrentCluster(); >+ recommended_probe_size = prober_.RecommendedMinProbeSize(); > } >- // The paused state is checked in the loop since SendPacket leaves the >- // critical section allowing the paused state to be changed from other code. >+ // The paused state is checked in the loop since it leaves the critical >+ // section allowing the paused state to be changed from other code. > while (!packets_.Empty() && !paused_) { >- // Since we need to release the lock in order to send, we first pop the >- // element from the priority queue but keep it in storage, so that we can >- // reinsert it if send fails. 
>- const RoundRobinPacketQueue::Packet& packet = packets_.BeginPop(); >+ const auto* packet = GetPendingPacket(pacing_info); >+ if (packet == nullptr) >+ break; > >- if (SendPacket(packet, pacing_info)) { >- bytes_sent += packet.bytes; >+ critsect_.Leave(); >+ bool success = packet_sender_->TimeToSendPacket( >+ packet->ssrc, packet->sequence_number, packet->capture_time_ms, >+ packet->retransmission, pacing_info); >+ critsect_.Enter(); >+ if (success) { >+ bytes_sent += packet->bytes; > // Send succeeded, remove it from the queue. >- packets_.FinalizePop(packet); >+ OnPacketSent(std::move(packet)); > if (is_probing && bytes_sent > recommended_probe_size) > break; > } else { > // Send failed, put it back into the queue. >- packets_.CancelPop(packet); >+ packets_.CancelPop(*packet); > break; > } > } >@@ -347,16 +362,21 @@ void PacedSender::Process() { > if (packet_counter_ > 0) { > int padding_needed = > static_cast<int>(is_probing ? (recommended_probe_size - bytes_sent) >- : padding_budget_->bytes_remaining()); >+ : padding_budget_.bytes_remaining()); > if (padding_needed > 0) { >- bytes_sent += SendPadding(padding_needed, pacing_info); >+ critsect_.Leave(); >+ size_t padding_sent = >+ packet_sender_->TimeToSendPadding(padding_needed, pacing_info); >+ critsect_.Enter(); >+ bytes_sent += padding_sent; >+ OnPaddingSent(padding_sent); > } > } > } > if (is_probing) { > probing_send_failure_ = bytes_sent == 0; > if (!probing_send_failure_) >- prober_->ProbeSent(TimeMilliseconds(), bytes_sent); >+ prober_.ProbeSent(TimeMilliseconds(), bytes_sent); > } > alr_detector_->OnBytesSent(bytes_sent, now_us / 1000); > } >@@ -367,66 +387,58 @@ void PacedSender::ProcessThreadAttached(ProcessThread* process_thread) { > process_thread_ = process_thread; > } > >-bool PacedSender::SendPacket(const RoundRobinPacketQueue::Packet& packet, >- const PacedPacketInfo& pacing_info) { >- RTC_DCHECK(!paused_); >- bool audio_packet = packet.priority == kHighPriority; >+const 
RoundRobinPacketQueue::Packet* PacedSender::GetPendingPacket( >+ const PacedPacketInfo& pacing_info) { >+ // Since we need to release the lock in order to send, we first pop the >+ // element from the priority queue but keep it in storage, so that we can >+ // reinsert it if send fails. >+ const RoundRobinPacketQueue::Packet* packet = &packets_.BeginPop(); >+ bool audio_packet = packet->priority == kHighPriority; > bool apply_pacing = > !audio_packet || account_for_audio_ || video_blocks_audio_; >- if (apply_pacing && (Congested() || (media_budget_->bytes_remaining() == 0 && >+ if (apply_pacing && (Congested() || (media_budget_.bytes_remaining() == 0 && > pacing_info.probe_cluster_id == > PacedPacketInfo::kNotAProbe))) { >- return false; >+ packets_.CancelPop(*packet); >+ return nullptr; > } >+ return packet; >+} > >- critsect_.Leave(); >- const bool success = packet_sender_->TimeToSendPacket( >- packet.ssrc, packet.sequence_number, packet.capture_time_ms, >- packet.retransmission, pacing_info); >- critsect_.Enter(); >- >- if (success) { >- if (first_sent_packet_ms_ == -1) >- first_sent_packet_ms_ = TimeMilliseconds(); >- if (!audio_packet || account_for_audio_) { >- // Update media bytes sent. >- // TODO(eladalon): TimeToSendPacket() can also return |true| in some >- // situations where nothing actually ended up being sent to the network, >- // and we probably don't want to update the budget in such cases. >- // https://bugs.chromium.org/p/webrtc/issues/detail?id=8052 >- UpdateBudgetWithBytesSent(packet.bytes); >- last_send_time_us_ = clock_->TimeInMicroseconds(); >- } >+void PacedSender::OnPacketSent(const RoundRobinPacketQueue::Packet* packet) { >+ if (first_sent_packet_ms_ == -1) >+ first_sent_packet_ms_ = TimeMilliseconds(); >+ bool audio_packet = packet->priority == kHighPriority; >+ if (!audio_packet || account_for_audio_) { >+ // Update media bytes sent. 
>+ // TODO(eladalon): TimeToSendPacket() can also return |true| in some >+ // situations where nothing actually ended up being sent to the network, >+ // and we probably don't want to update the budget in such cases. >+ // https://bugs.chromium.org/p/webrtc/issues/detail?id=8052 >+ UpdateBudgetWithBytesSent(packet->bytes); >+ last_send_time_us_ = clock_->TimeInMicroseconds(); > } >- >- return success; >+ // Send succeeded, remove it from the queue. >+ packets_.FinalizePop(*packet); > } > >-size_t PacedSender::SendPadding(size_t padding_needed, >- const PacedPacketInfo& pacing_info) { >- RTC_DCHECK_GT(packet_counter_, 0); >- critsect_.Leave(); >- size_t bytes_sent = >- packet_sender_->TimeToSendPadding(padding_needed, pacing_info); >- critsect_.Enter(); >- >+void PacedSender::OnPaddingSent(size_t bytes_sent) { > if (bytes_sent > 0) { > UpdateBudgetWithBytesSent(bytes_sent); > } > last_send_time_us_ = clock_->TimeInMicroseconds(); >- return bytes_sent; > } > > void PacedSender::UpdateBudgetWithElapsedTime(int64_t delta_time_ms) { > delta_time_ms = std::min(kMaxIntervalTimeMs, delta_time_ms); >- media_budget_->IncreaseBudget(delta_time_ms); >- padding_budget_->IncreaseBudget(delta_time_ms); >+ media_budget_.IncreaseBudget(delta_time_ms); >+ padding_budget_.IncreaseBudget(delta_time_ms); > } > > void PacedSender::UpdateBudgetWithBytesSent(size_t bytes_sent) { > outstanding_bytes_ += bytes_sent; >- media_budget_->UseBudget(bytes_sent); >- padding_budget_->UseBudget(bytes_sent); >+ media_budget_.UseBudget(bytes_sent); >+ padding_budget_.UseBudget(bytes_sent); > } > > void PacedSender::SetPacingFactor(float pacing_factor) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/paced_sender.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/paced_sender.h >index f12917408a6b41684228fb040983d29b3f838120..4586d29cca97e76e05b7615edf07f2b2da7635a6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/paced_sender.h >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/paced_sender.h >@@ -11,20 +11,25 @@ > #ifndef MODULES_PACING_PACED_SENDER_H_ > #define MODULES_PACING_PACED_SENDER_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <memory> > > #include "absl/types/optional.h" >+#include "api/transport/network_types.h" >+#include "modules/pacing/bitrate_prober.h" >+#include "modules/pacing/interval_budget.h" > #include "modules/pacing/pacer.h" > #include "modules/pacing/round_robin_packet_queue.h" >+#include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" >+#include "modules/utility/include/process_thread.h" > #include "rtc_base/criticalsection.h" > #include "rtc_base/thread_annotations.h" > > namespace webrtc { > class AlrDetector; >-class BitrateProber; > class Clock; > class RtcEventLog; >-class IntervalBudget; > > class PacedSender : public Pacer { > public: >@@ -138,19 +143,25 @@ class PacedSender : public Pacer { > void SetQueueTimeLimit(int limit_ms); > > private: >+ int64_t UpdateTimeAndGetElapsedMs(int64_t now_us) >+ RTC_EXCLUSIVE_LOCKS_REQUIRED(critsect_); >+ bool ShouldSendKeepalive(int64_t at_time_us) const >+ RTC_EXCLUSIVE_LOCKS_REQUIRED(critsect_); >+ > // Updates the number of bytes that can be sent for the next time interval. 
> void UpdateBudgetWithElapsedTime(int64_t delta_time_in_ms) > RTC_EXCLUSIVE_LOCKS_REQUIRED(critsect_); > void UpdateBudgetWithBytesSent(size_t bytes) > RTC_EXCLUSIVE_LOCKS_REQUIRED(critsect_); > >- bool SendPacket(const RoundRobinPacketQueue::Packet& packet, >- const PacedPacketInfo& cluster_info) >+ const RoundRobinPacketQueue::Packet* GetPendingPacket( >+ const PacedPacketInfo& pacing_info) >+ RTC_EXCLUSIVE_LOCKS_REQUIRED(critsect_); >+ void OnPacketSent(const RoundRobinPacketQueue::Packet* packet) > RTC_EXCLUSIVE_LOCKS_REQUIRED(critsect_); >- size_t SendPadding(size_t padding_needed, const PacedPacketInfo& cluster_info) >+ void OnPaddingSent(size_t padding_sent) > RTC_EXCLUSIVE_LOCKS_REQUIRED(critsect_); > >- void OnBytesSent(size_t bytes_sent) RTC_EXCLUSIVE_LOCKS_REQUIRED(critsect_); > bool Congested() const RTC_EXCLUSIVE_LOCKS_REQUIRED(critsect_); > int64_t TimeMilliseconds() const RTC_EXCLUSIVE_LOCKS_REQUIRED(critsect_); > >@@ -169,15 +180,13 @@ class PacedSender : public Pacer { > bool paused_ RTC_GUARDED_BY(critsect_); > // This is the media budget, keeping track of how many bits of media > // we can pace out during the current interval. >- const std::unique_ptr<IntervalBudget> media_budget_ >- RTC_PT_GUARDED_BY(critsect_); >+ IntervalBudget media_budget_ RTC_GUARDED_BY(critsect_); > // This is the padding budget, keeping track of how many bits of padding we're > // allowed to send out during the current interval. This budget will be > // utilized when there's no media to send. >- const std::unique_ptr<IntervalBudget> padding_budget_ >- RTC_PT_GUARDED_BY(critsect_); >+ IntervalBudget padding_budget_ RTC_GUARDED_BY(critsect_); > >- const std::unique_ptr<BitrateProber> prober_ RTC_PT_GUARDED_BY(critsect_); >+ BitrateProber prober_ RTC_GUARDED_BY(critsect_); > bool probing_send_failure_ RTC_GUARDED_BY(critsect_); > // Actual configured bitrates (media_budget_ may temporarily be higher in > // order to meet pace time constraint). 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/packet_router.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/packet_router.cc >index b27c73c2f0c7e7e76ba86fbeb803d36596c8c7a8..24be7eebf2a9e32db77286c687af41adf9f4a5e1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/packet_router.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/packet_router.cc >@@ -11,8 +11,10 @@ > #include "modules/pacing/packet_router.h" > > #include <algorithm> >+#include <cstdint> > #include <limits> > >+#include "absl/types/optional.h" > #include "modules/rtp_rtcp/include/rtp_rtcp.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > #include "modules/rtp_rtcp/source/rtcp_packet/transport_feedback.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/packet_router.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/packet_router.h >index f01e9ef493396541a6f883f2d84c8e68ae37b9a7..b7cc5ff196687da2bc8a490bf8f2836b1323d16d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/packet_router.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/packet_router.h >@@ -11,16 +11,17 @@ > #ifndef MODULES_PACING_PACKET_ROUTER_H_ > #define MODULES_PACING_PACKET_ROUTER_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <list> > #include <vector> > >-#include "common_types.h" // NOLINT(build/include) >+#include "api/transport/network_types.h" > #include "modules/pacing/paced_sender.h" > #include "modules/remote_bitrate_estimator/include/remote_bitrate_estimator.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > #include "rtc_base/constructormagic.h" > #include "rtc_base/criticalsection.h" >-#include "rtc_base/race_checker.h" > #include "rtc_base/thread_annotations.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/round_robin_packet_queue.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/round_robin_packet_queue.cc >index e892ddb4ea2b5086481bd90213e560bdbaab5dd1..e61d68215aca8a7476babe34878b6de6046ab359 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/round_robin_packet_queue.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/round_robin_packet_queue.cc >@@ -11,9 +11,10 @@ > #include "modules/pacing/round_robin_packet_queue.h" > > #include <algorithm> >+#include <cstdint> >+#include <utility> > > #include "rtc_base/checks.h" >-#include "system_wrappers/include/clock.h" > > namespace webrtc { > >@@ -53,8 +54,8 @@ RoundRobinPacketQueue::Stream::Stream() : bytes(0), ssrc(0) {} > RoundRobinPacketQueue::Stream::Stream(const Stream& stream) = default; > RoundRobinPacketQueue::Stream::~Stream() {} > >-RoundRobinPacketQueue::RoundRobinPacketQueue(const Clock* clock) >- : time_last_updated_(clock->TimeInMilliseconds()) {} >+RoundRobinPacketQueue::RoundRobinPacketQueue(int64_t start_time_us) >+ : time_last_updated_ms_(start_time_us / 1000) {} > > RoundRobinPacketQueue::~RoundRobinPacketQueue() {} > >@@ -132,7 +133,7 @@ void RoundRobinPacketQueue::FinalizePop(const Packet& packet) { > // by subtracting it now we effectively remove the time spent in in the > // queue while in a paused state. 
> int64_t time_in_non_paused_state_ms = >- time_last_updated_ - packet.enqueue_time_ms - pause_time_sum_ms_; >+ time_last_updated_ms_ - packet.enqueue_time_ms - pause_time_sum_ms_; > queue_time_sum_ms_ -= time_in_non_paused_state_ms; > > RTC_CHECK(packet.enqueue_time_it != enqueue_times_.end()); >@@ -189,11 +190,11 @@ int64_t RoundRobinPacketQueue::OldestEnqueueTimeMs() const { > } > > void RoundRobinPacketQueue::UpdateQueueTime(int64_t timestamp_ms) { >- RTC_CHECK_GE(timestamp_ms, time_last_updated_); >- if (timestamp_ms == time_last_updated_) >+ RTC_CHECK_GE(timestamp_ms, time_last_updated_ms_); >+ if (timestamp_ms == time_last_updated_ms_) > return; > >- int64_t delta_ms = timestamp_ms - time_last_updated_; >+ int64_t delta_ms = timestamp_ms - time_last_updated_ms_; > > if (paused_) { > pause_time_sum_ms_ += delta_ms; >@@ -201,7 +202,7 @@ void RoundRobinPacketQueue::UpdateQueueTime(int64_t timestamp_ms) { > queue_time_sum_ms_ += delta_ms * size_packets_; > } > >- time_last_updated_ = timestamp_ms; >+ time_last_updated_ms_ = timestamp_ms; > } > > void RoundRobinPacketQueue::SetPauseState(bool paused, int64_t timestamp_ms) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/round_robin_packet_queue.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/round_robin_packet_queue.h >index ee91e2efc0c93444fb6959551f9fcb10d07ccf38..74b855a483d55e157ba0a09f66204b2ac7998bd1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/round_robin_packet_queue.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/pacing/round_robin_packet_queue.h >@@ -11,18 +11,22 @@ > #ifndef MODULES_PACING_ROUND_ROBIN_PACKET_QUEUE_H_ > #define MODULES_PACING_ROUND_ROBIN_PACKET_QUEUE_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <list> > #include <map> > #include <queue> > #include <set> > >+#include "absl/types/optional.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" >+#include "system_wrappers/include/clock.h" > > 
namespace webrtc { > > class RoundRobinPacketQueue { > public: >- explicit RoundRobinPacketQueue(const Clock* clock); >+ explicit RoundRobinPacketQueue(int64_t start_time_us); > ~RoundRobinPacketQueue(); > > struct Packet { >@@ -105,7 +109,7 @@ class RoundRobinPacketQueue { > // Just used to verify correctness. > bool IsSsrcScheduled(uint32_t ssrc) const; > >- int64_t time_last_updated_; >+ int64_t time_last_updated_ms_; > absl::optional<Packet> pop_packet_; > absl::optional<Stream*> pop_stream_; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/BUILD.gn >index a8d6850749cc9081f93ccd93b5e923116fe6ad14..f6991ea69d403cfd7d50d1bca9e6163b837a70fd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/BUILD.gn >@@ -42,7 +42,9 @@ rtc_static_library("remote_bitrate_estimator") { > deps = [ > "../..:webrtc_common", > "../../api/units:data_rate", >+ "../../api/units:timestamp", > "../../modules:module_api", >+ "../../modules:module_api_public", > "../../modules/rtp_rtcp:rtp_rtcp_format", > "../../rtc_base:checks", > "../../rtc_base:rtc_base_approved", >@@ -227,6 +229,7 @@ if (rtc_include_tests) { > "../../rtc_base:rtc_base_approved", > "../../test:fileutils", > "../../test:test_main", >+ "../../test:test_support", > "//testing/gtest", > ] > data = [ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/aimd_rate_control.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/aimd_rate_control.cc >index a6e64368977f5ab06b004ba8a6e9d216e88f1997..b2ca8012889c6d953e3599efbd27cbeba5f743b8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/aimd_rate_control.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/aimd_rate_control.cc >@@ -11,7 +11,6 @@ > #include "modules/remote_bitrate_estimator/aimd_rate_control.h" > > #include <inttypes.h> >- > #include <algorithm> > #include <cassert> > #include <cmath> >@@ -19,9 +18,7 @@ > #include <string> > > #include "modules/remote_bitrate_estimator/include/bwe_defines.h" >-#include "modules/remote_bitrate_estimator/include/remote_bitrate_estimator.h" > #include "modules/remote_bitrate_estimator/overuse_detector.h" >-#include "modules/remote_bitrate_estimator/test/bwe_test_logging.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" > #include "rtc_base/numerics/safe_minmax.h" >@@ -29,25 +26,24 @@ > > namespace webrtc { > >-static const int64_t kDefaultRttMs = 200; >-static const int64_t kMaxFeedbackIntervalMs = 1000; >-static const float kDefaultBackoffFactor = 0.85f; >-static const int64_t kDefaultInitialBackOffIntervalMs = 200; >+constexpr TimeDelta kDefaultRtt = TimeDelta::Millis<200>(); >+constexpr double kDefaultBackoffFactor = 0.85; >+constexpr TimeDelta kDefaultInitialBackOffInterval = TimeDelta::Millis<200>(); > > const char kBweBackOffFactorExperiment[] = "WebRTC-BweBackOffFactor"; > const char kBweInitialBackOffIntervalExperiment[] = > "WebRTC-BweInitialBackOffInterval"; > >-float ReadBackoffFactor() { >+double ReadBackoffFactor() { > std::string experiment_string = > webrtc::field_trial::FindFullName(kBweBackOffFactorExperiment); >- float backoff_factor; >+ double backoff_factor; > int parsed_values = >- sscanf(experiment_string.c_str(), "Enabled-%f", &backoff_factor); >+ sscanf(experiment_string.c_str(), "Enabled-%lf", &backoff_factor); > if (parsed_values == 1) { >- if (backoff_factor >= 1.0f) { >+ if (backoff_factor >= 1.0) { > RTC_LOG(WARNING) << "Back-off factor must be less than 1."; >- } else if (backoff_factor <= 0.0f) { >+ } else if (backoff_factor <= 0.0) { > RTC_LOG(WARNING) << "Back-off factor must be greater than 0."; 
> } else { > return backoff_factor; >@@ -58,7 +54,7 @@ float ReadBackoffFactor() { > return kDefaultBackoffFactor; > } > >-int64_t ReadInitialBackoffIntervalMs() { >+TimeDelta ReadInitialBackoffInterval() { > std::string experiment_string = > webrtc::field_trial::FindFullName(kBweInitialBackOffIntervalExperiment); > int64_t backoff_interval; >@@ -66,7 +62,7 @@ int64_t ReadInitialBackoffIntervalMs() { > sscanf(experiment_string.c_str(), "Enabled-%" SCNd64, &backoff_interval); > if (parsed_values == 1) { > if (10 <= backoff_interval && backoff_interval <= 200) { >- return backoff_interval; >+ return TimeDelta::ms(backoff_interval); > } > RTC_LOG(WARNING) > << "Initial back-off interval must be between 10 and 200 ms."; >@@ -74,334 +70,347 @@ int64_t ReadInitialBackoffIntervalMs() { > RTC_LOG(LS_WARNING) << "Failed to parse parameters for " > << kBweInitialBackOffIntervalExperiment > << " experiment. Using default."; >- return kDefaultInitialBackOffIntervalMs; >+ return kDefaultInitialBackOffInterval; > } > > AimdRateControl::AimdRateControl() >- : min_configured_bitrate_bps_(congestion_controller::GetMinBitrateBps()), >- max_configured_bitrate_bps_(30000000), >- current_bitrate_bps_(max_configured_bitrate_bps_), >- latest_estimated_throughput_bps_(current_bitrate_bps_), >- avg_max_bitrate_kbps_(-1.0f), >- var_max_bitrate_kbps_(0.4f), >+ : min_configured_bitrate_(congestion_controller::GetMinBitrate()), >+ max_configured_bitrate_(DataRate::kbps(30000)), >+ current_bitrate_(max_configured_bitrate_), >+ latest_estimated_throughput_(current_bitrate_), >+ link_capacity_estimate_kbps_(), >+ var_link_capacity_estimate_kbps_(0.4), > rate_control_state_(kRcHold), >- rate_control_region_(kRcMaxUnknown), >- time_last_bitrate_change_(-1), >- time_last_bitrate_decrease_(-1), >- time_first_throughput_estimate_(-1), >+ time_last_bitrate_change_(Timestamp::MinusInfinity()), >+ time_last_bitrate_decrease_(Timestamp::MinusInfinity()), >+ 
time_first_throughput_estimate_(Timestamp::MinusInfinity()), > bitrate_is_initialized_(false), > beta_(webrtc::field_trial::IsEnabled(kBweBackOffFactorExperiment) > ? ReadBackoffFactor() > : kDefaultBackoffFactor), >- rtt_(kDefaultRttMs), >+ rtt_(kDefaultRtt), > in_experiment_(!AdaptiveThresholdExperimentIsDisabled()), > smoothing_experiment_( > webrtc::field_trial::IsEnabled("WebRTC-Audio-BandwidthSmoothing")), > in_initial_backoff_interval_experiment_( > webrtc::field_trial::IsEnabled(kBweInitialBackOffIntervalExperiment)), >- initial_backoff_interval_ms_(kDefaultInitialBackOffIntervalMs) { >+ initial_backoff_interval_(kDefaultInitialBackOffInterval) { > if (in_initial_backoff_interval_experiment_) { >- initial_backoff_interval_ms_ = ReadInitialBackoffIntervalMs(); >+ initial_backoff_interval_ = ReadInitialBackoffInterval(); > RTC_LOG(LS_INFO) << "Using aimd rate control with initial back-off interval" >- << " " << initial_backoff_interval_ms_ << " ms."; >+ << " " << ToString(initial_backoff_interval_) << "."; > } > RTC_LOG(LS_INFO) << "Using aimd rate control with back off factor " << beta_; > } > > AimdRateControl::~AimdRateControl() {} > >-void AimdRateControl::SetStartBitrate(int start_bitrate_bps) { >- current_bitrate_bps_ = start_bitrate_bps; >- latest_estimated_throughput_bps_ = current_bitrate_bps_; >+void AimdRateControl::SetStartBitrate(DataRate start_bitrate) { >+ current_bitrate_ = start_bitrate; >+ latest_estimated_throughput_ = current_bitrate_; > bitrate_is_initialized_ = true; > } > >-void AimdRateControl::SetMinBitrate(int min_bitrate_bps) { >- min_configured_bitrate_bps_ = min_bitrate_bps; >- current_bitrate_bps_ = std::max<int>(min_bitrate_bps, current_bitrate_bps_); >+void AimdRateControl::SetMinBitrate(DataRate min_bitrate) { >+ min_configured_bitrate_ = min_bitrate; >+ current_bitrate_ = std::max(min_bitrate, current_bitrate_); > } > > bool AimdRateControl::ValidEstimate() const { > return bitrate_is_initialized_; > } > >-int64_t 
AimdRateControl::GetFeedbackInterval() const { >+TimeDelta AimdRateControl::GetFeedbackInterval() const { > // Estimate how often we can send RTCP if we allocate up to 5% of bandwidth > // to feedback. >- static const int kRtcpSize = 80; >- const int64_t interval = static_cast<int64_t>( >- kRtcpSize * 8.0 * 1000.0 / (0.05 * current_bitrate_bps_) + 0.5); >- const int64_t kMinFeedbackIntervalMs = 200; >- return rtc::SafeClamp(interval, kMinFeedbackIntervalMs, >- kMaxFeedbackIntervalMs); >+ const DataSize kRtcpSize = DataSize::bytes(80); >+ const DataRate rtcp_bitrate = current_bitrate_ * 0.05; >+ const TimeDelta interval = kRtcpSize / rtcp_bitrate; >+ const TimeDelta kMinFeedbackInterval = TimeDelta::ms(200); >+ const TimeDelta kMaxFeedbackInterval = TimeDelta::ms(1000); >+ return interval.Clamped(kMinFeedbackInterval, kMaxFeedbackInterval); > } > >-bool AimdRateControl::TimeToReduceFurther( >- int64_t now_ms, >- uint32_t estimated_throughput_bps) const { >- const int64_t bitrate_reduction_interval = >- std::max<int64_t>(std::min<int64_t>(rtt_, 200), 10); >- if (now_ms - time_last_bitrate_change_ >= bitrate_reduction_interval) { >+bool AimdRateControl::TimeToReduceFurther(Timestamp at_time, >+ DataRate estimated_throughput) const { >+ const TimeDelta bitrate_reduction_interval = >+ rtt_.Clamped(TimeDelta::ms(10), TimeDelta::ms(200)); >+ if (at_time - time_last_bitrate_change_ >= bitrate_reduction_interval) { > return true; > } > if (ValidEstimate()) { > // TODO(terelius/holmer): Investigate consequences of increasing > // the threshold to 0.95 * LatestEstimate(). 
>- const uint32_t threshold = static_cast<uint32_t>(0.5 * LatestEstimate()); >- return estimated_throughput_bps < threshold; >+ const DataRate threshold = 0.5 * LatestEstimate(); >+ return estimated_throughput < threshold; > } > return false; > } > >-bool AimdRateControl::InitialTimeToReduceFurther(int64_t now_ms) const { >+bool AimdRateControl::InitialTimeToReduceFurther(Timestamp at_time) const { > if (!in_initial_backoff_interval_experiment_) { > return ValidEstimate() && >- TimeToReduceFurther(now_ms, LatestEstimate() / 2 - 1); >+ TimeToReduceFurther(at_time, >+ LatestEstimate() / 2 - DataRate::bps(1)); > } > // TODO(terelius): We could use the RTT (clamped to suitable limits) instead > // of a fixed bitrate_reduction_interval. >- if (time_last_bitrate_decrease_ == -1 || >- now_ms - time_last_bitrate_decrease_ >= initial_backoff_interval_ms_) { >+ if (time_last_bitrate_decrease_.IsInfinite() || >+ at_time - time_last_bitrate_decrease_ >= initial_backoff_interval_) { > return true; > } > return false; > } > >-uint32_t AimdRateControl::LatestEstimate() const { >- return current_bitrate_bps_; >+DataRate AimdRateControl::LatestEstimate() const { >+ return current_bitrate_; > } > >-void AimdRateControl::SetRtt(int64_t rtt) { >+void AimdRateControl::SetRtt(TimeDelta rtt) { > rtt_ = rtt; > } > >-uint32_t AimdRateControl::Update(const RateControlInput* input, >- int64_t now_ms) { >+DataRate AimdRateControl::Update(const RateControlInput* input, >+ Timestamp at_time) { > RTC_CHECK(input); > > // Set the initial bit rate value to what we're receiving the first half > // second. > // TODO(bugs.webrtc.org/9379): The comment above doesn't match to the code. 
> if (!bitrate_is_initialized_) { >- const int64_t kInitializationTimeMs = 5000; >- RTC_DCHECK_LE(kBitrateWindowMs, kInitializationTimeMs); >- if (time_first_throughput_estimate_ < 0) { >- if (input->estimated_throughput_bps) >- time_first_throughput_estimate_ = now_ms; >- } else if (now_ms - time_first_throughput_estimate_ > >- kInitializationTimeMs && >- input->estimated_throughput_bps) { >- current_bitrate_bps_ = *input->estimated_throughput_bps; >+ const TimeDelta kInitializationTime = TimeDelta::seconds(5); >+ RTC_DCHECK_LE(kBitrateWindowMs, kInitializationTime.ms()); >+ if (time_first_throughput_estimate_.IsInfinite()) { >+ if (input->estimated_throughput) >+ time_first_throughput_estimate_ = at_time; >+ } else if (at_time - time_first_throughput_estimate_ > >+ kInitializationTime && >+ input->estimated_throughput) { >+ current_bitrate_ = *input->estimated_throughput; > bitrate_is_initialized_ = true; > } > } > >- current_bitrate_bps_ = ChangeBitrate(current_bitrate_bps_, *input, now_ms); >- return current_bitrate_bps_; >+ current_bitrate_ = ChangeBitrate(current_bitrate_, *input, at_time); >+ return current_bitrate_; > } > >-void AimdRateControl::SetEstimate(int bitrate_bps, int64_t now_ms) { >+void AimdRateControl::SetEstimate(DataRate bitrate, Timestamp at_time) { > bitrate_is_initialized_ = true; >- uint32_t prev_bitrate_bps = current_bitrate_bps_; >- current_bitrate_bps_ = ClampBitrate(bitrate_bps, bitrate_bps); >- time_last_bitrate_change_ = now_ms; >- if (current_bitrate_bps_ < prev_bitrate_bps) { >- time_last_bitrate_decrease_ = now_ms; >+ DataRate prev_bitrate = current_bitrate_; >+ current_bitrate_ = ClampBitrate(bitrate, bitrate); >+ time_last_bitrate_change_ = at_time; >+ if (current_bitrate_ < prev_bitrate) { >+ time_last_bitrate_decrease_ = at_time; > } > } > >-int AimdRateControl::GetNearMaxIncreaseRateBps() const { >- RTC_DCHECK_GT(current_bitrate_bps_, 0); >- double bits_per_frame = static_cast<double>(current_bitrate_bps_) / 30.0; >- double 
packets_per_frame = std::ceil(bits_per_frame / (8.0 * 1200.0)); >- double avg_packet_size_bits = bits_per_frame / packets_per_frame; >+double AimdRateControl::GetNearMaxIncreaseRateBpsPerSecond() const { >+ RTC_DCHECK(!current_bitrate_.IsZero()); >+ const TimeDelta kFrameInterval = TimeDelta::seconds(1) / 30; >+ DataSize frame_size = current_bitrate_ * kFrameInterval; >+ const DataSize kPacketSize = DataSize::bytes(1200); >+ double packets_per_frame = std::ceil(frame_size / kPacketSize); >+ DataSize avg_packet_size = frame_size / packets_per_frame; > > // Approximate the over-use estimator delay to 100 ms. >- const int64_t response_time = in_experiment_ ? (rtt_ + 100) * 2 : rtt_ + 100; >- constexpr double kMinIncreaseRateBps = 4000; >- return static_cast<int>(std::max( >- kMinIncreaseRateBps, (avg_packet_size_bits * 1000) / response_time)); >+ TimeDelta response_time = rtt_ + TimeDelta::ms(100); >+ if (in_experiment_) >+ response_time = response_time * 2; >+ double increase_rate_bps_per_second = >+ (avg_packet_size / response_time).bps<double>(); >+ double kMinIncreaseRateBpsPerSecond = 4000; >+ return std::max(kMinIncreaseRateBpsPerSecond, increase_rate_bps_per_second); > } > >-int AimdRateControl::GetExpectedBandwidthPeriodMs() const { >- const int kMinPeriodMs = smoothing_experiment_ ? 500 : 2000; >- constexpr int kDefaultPeriodMs = 3000; >- constexpr int kMaxPeriodMs = 50000; >+TimeDelta AimdRateControl::GetExpectedBandwidthPeriod() const { >+ const TimeDelta kMinPeriod = >+ smoothing_experiment_ ? TimeDelta::ms(500) : TimeDelta::seconds(2); >+ const TimeDelta kDefaultPeriod = TimeDelta::seconds(3); >+ const TimeDelta kMaxPeriod = TimeDelta::seconds(50); > >- int increase_rate = GetNearMaxIncreaseRateBps(); >+ double increase_rate_bps_per_second = GetNearMaxIncreaseRateBpsPerSecond(); > if (!last_decrease_) >- return smoothing_experiment_ ? 
kMinPeriodMs : kDefaultPeriodMs; >- >- return std::min(kMaxPeriodMs, >- std::max<int>(1000 * static_cast<int64_t>(*last_decrease_) / >- increase_rate, >- kMinPeriodMs)); >+ return smoothing_experiment_ ? kMinPeriod : kDefaultPeriod; >+ double time_to_recover_decrease_seconds = >+ last_decrease_->bps() / increase_rate_bps_per_second; >+ TimeDelta period = TimeDelta::seconds(time_to_recover_decrease_seconds); >+ return period.Clamped(kMinPeriod, kMaxPeriod); > } > >-uint32_t AimdRateControl::ChangeBitrate(uint32_t new_bitrate_bps, >+DataRate AimdRateControl::ChangeBitrate(DataRate new_bitrate, > const RateControlInput& input, >- int64_t now_ms) { >- uint32_t estimated_throughput_bps = >- input.estimated_throughput_bps.value_or(latest_estimated_throughput_bps_); >- if (input.estimated_throughput_bps) >- latest_estimated_throughput_bps_ = *input.estimated_throughput_bps; >+ Timestamp at_time) { >+ DataRate estimated_throughput = >+ input.estimated_throughput.value_or(latest_estimated_throughput_); >+ if (input.estimated_throughput) >+ latest_estimated_throughput_ = *input.estimated_throughput; > > // An over-use should always trigger us to reduce the bitrate, even though > // we have not yet established our first estimate. By acting on the over-use, > // we will end up with a valid estimate. > if (!bitrate_is_initialized_ && > input.bw_state != BandwidthUsage::kBwOverusing) >- return current_bitrate_bps_; >+ return current_bitrate_; > >- ChangeState(input, now_ms); >+ ChangeState(input, at_time); > // Calculated here because it's used in multiple places. >- const float estimated_throughput_kbps = estimated_throughput_bps / 1000.0f; >+ const double estimated_throughput_kbps = estimated_throughput.kbps<double>(); >+ > // Calculate the max bit rate std dev given the normalized >- // variance and the current throughput bitrate. >- const float std_max_bit_rate = >- sqrt(var_max_bitrate_kbps_ * avg_max_bitrate_kbps_); >+ // variance and the current throughput bitrate. 
The standard deviation will >+ // only be used if link_capacity_estimate_kbps_ has a value. >+ const double std_link_capacity_kbps = >+ sqrt(var_link_capacity_estimate_kbps_ * >+ link_capacity_estimate_kbps_.value_or(0)); > switch (rate_control_state_) { > case kRcHold: > break; > > case kRcIncrease: >- if (avg_max_bitrate_kbps_ >= 0 && >- estimated_throughput_kbps > >- avg_max_bitrate_kbps_ + 3 * std_max_bit_rate) { >- rate_control_region_ = kRcMaxUnknown; >- >- avg_max_bitrate_kbps_ = -1.0; >+ if (link_capacity_estimate_kbps_.has_value() && >+ estimated_throughput_kbps > link_capacity_estimate_kbps_.value() + >+ 3 * std_link_capacity_kbps) { >+ // The link capacity appears to have changed. Forget the previous >+ // estimate and use multiplicative increase to quickly discover new >+ // capacity. >+ link_capacity_estimate_kbps_.reset(); > } >- if (rate_control_region_ == kRcNearMax) { >- uint32_t additive_increase_bps = >- AdditiveRateIncrease(now_ms, time_last_bitrate_change_); >- new_bitrate_bps += additive_increase_bps; >+ if (link_capacity_estimate_kbps_.has_value()) { >+ // The link_capacity_estimate_kbps_ is reset if the measured throughput >+ // is too far from the estimate. We can therefore assume that our target >+ // rate is reasonably close to link capacity and use additive increase. >+ DataRate additive_increase = >+ AdditiveRateIncrease(at_time, time_last_bitrate_change_); >+ new_bitrate += additive_increase; > } else { >- uint32_t multiplicative_increase_bps = MultiplicativeRateIncrease( >- now_ms, time_last_bitrate_change_, new_bitrate_bps); >- new_bitrate_bps += multiplicative_increase_bps; >+ // If we don't have an estimate of the link capacity, use faster ramp up >+ // to discover the capacity. 
>+ DataRate multiplicative_increase = MultiplicativeRateIncrease( >+ at_time, time_last_bitrate_change_, new_bitrate); >+ new_bitrate += multiplicative_increase; > } > >- time_last_bitrate_change_ = now_ms; >+ time_last_bitrate_change_ = at_time; > break; > > case kRcDecrease: >- // Set bit rate to something slightly lower than max >+ // Set bit rate to something slightly lower than the measured throughput > // to get rid of any self-induced delay. >- new_bitrate_bps = >- static_cast<uint32_t>(beta_ * estimated_throughput_bps + 0.5); >- if (new_bitrate_bps > current_bitrate_bps_) { >+ new_bitrate = estimated_throughput * beta_; >+ if (new_bitrate > current_bitrate_) { > // Avoid increasing the rate when over-using. >- if (rate_control_region_ != kRcMaxUnknown) { >- new_bitrate_bps = static_cast<uint32_t>( >- beta_ * avg_max_bitrate_kbps_ * 1000 + 0.5f); >+ if (link_capacity_estimate_kbps_.has_value()) { >+ new_bitrate = >+ DataRate::kbps(beta_ * link_capacity_estimate_kbps_.value()); > } >- new_bitrate_bps = std::min(new_bitrate_bps, current_bitrate_bps_); >+ new_bitrate = std::min(new_bitrate, current_bitrate_); > } >- rate_control_region_ = kRcNearMax; > >- if (bitrate_is_initialized_ && >- estimated_throughput_bps < current_bitrate_bps_) { >- constexpr float kDegradationFactor = 0.9f; >+ if (bitrate_is_initialized_ && estimated_throughput < current_bitrate_) { >+ constexpr double kDegradationFactor = 0.9; > if (smoothing_experiment_ && >- new_bitrate_bps < >- kDegradationFactor * beta_ * current_bitrate_bps_) { >+ new_bitrate < kDegradationFactor * beta_ * current_bitrate_) { > // If bitrate decreases more than a normal back off after overuse, it > // indicates a real network degradation. We do not let such a decrease > // to determine the bandwidth estimation period. 
> last_decrease_ = absl::nullopt; > } else { >- last_decrease_ = current_bitrate_bps_ - new_bitrate_bps; >+ last_decrease_ = current_bitrate_ - new_bitrate; > } > } >- if (estimated_throughput_kbps < >- avg_max_bitrate_kbps_ - 3 * std_max_bit_rate) { >- avg_max_bitrate_kbps_ = -1.0f; >+ if (link_capacity_estimate_kbps_.has_value() && >+ estimated_throughput_kbps < link_capacity_estimate_kbps_.value() - >+ 3 * std_link_capacity_kbps) { >+ // The current throughput is far from the estimated link capacity. Clear >+ // the estimate to allow a fast update in UpdateLinkCapacityEstimate. >+ link_capacity_estimate_kbps_.reset(); > } > > bitrate_is_initialized_ = true; >- UpdateMaxThroughputEstimate(estimated_throughput_kbps); >+ UpdateLinkCapacityEstimate(estimated_throughput_kbps); > // Stay on hold until the pipes are cleared. > rate_control_state_ = kRcHold; >- time_last_bitrate_change_ = now_ms; >- time_last_bitrate_decrease_ = now_ms; >+ time_last_bitrate_change_ = at_time; >+ time_last_bitrate_decrease_ = at_time; > break; > > default: > assert(false); > } >- return ClampBitrate(new_bitrate_bps, estimated_throughput_bps); >+ return ClampBitrate(new_bitrate, estimated_throughput); > } > >-uint32_t AimdRateControl::ClampBitrate( >- uint32_t new_bitrate_bps, >- uint32_t estimated_throughput_bps) const { >+DataRate AimdRateControl::ClampBitrate(DataRate new_bitrate, >+ DataRate estimated_throughput) const { > // Don't change the bit rate if the send side is too far off. > // We allow a bit more lag at very low rates to not too easily get stuck if > // the encoder produces uneven outputs. 
>- const uint32_t max_bitrate_bps = >- static_cast<uint32_t>(1.5f * estimated_throughput_bps) + 10000; >- if (new_bitrate_bps > current_bitrate_bps_ && >- new_bitrate_bps > max_bitrate_bps) { >- new_bitrate_bps = std::max(current_bitrate_bps_, max_bitrate_bps); >+ const DataRate max_bitrate = 1.5 * estimated_throughput + DataRate::kbps(10); >+ if (new_bitrate > current_bitrate_ && new_bitrate > max_bitrate) { >+ new_bitrate = std::max(current_bitrate_, max_bitrate); > } >- new_bitrate_bps = std::max(new_bitrate_bps, min_configured_bitrate_bps_); >- return new_bitrate_bps; >+ new_bitrate = std::max(new_bitrate, min_configured_bitrate_); >+ return new_bitrate; > } > >-uint32_t AimdRateControl::MultiplicativeRateIncrease( >- int64_t now_ms, >- int64_t last_ms, >- uint32_t current_bitrate_bps) const { >+DataRate AimdRateControl::MultiplicativeRateIncrease( >+ Timestamp at_time, >+ Timestamp last_time, >+ DataRate current_bitrate) const { > double alpha = 1.08; >- if (last_ms > -1) { >- auto time_since_last_update_ms = >- rtc::SafeMin<int64_t>(now_ms - last_ms, 1000); >- alpha = pow(alpha, time_since_last_update_ms / 1000.0); >+ if (last_time.IsFinite()) { >+ auto time_since_last_update = at_time - last_time; >+ alpha = pow(alpha, std::min(time_since_last_update.seconds<double>(), 1.0)); > } >- uint32_t multiplicative_increase_bps = >- std::max(current_bitrate_bps * (alpha - 1.0), 1000.0); >- return multiplicative_increase_bps; >+ DataRate multiplicative_increase = >+ std::max(current_bitrate * (alpha - 1.0), DataRate::bps(1000)); >+ return multiplicative_increase; > } > >-uint32_t AimdRateControl::AdditiveRateIncrease(int64_t now_ms, >- int64_t last_ms) const { >- return static_cast<uint32_t>((now_ms - last_ms) * >- GetNearMaxIncreaseRateBps() / 1000); >+DataRate AimdRateControl::AdditiveRateIncrease(Timestamp at_time, >+ Timestamp last_time) const { >+ double time_period_seconds = (at_time - last_time).seconds<double>(); >+ double data_rate_increase_bps = >+ 
GetNearMaxIncreaseRateBpsPerSecond() * time_period_seconds; >+ return DataRate::bps(data_rate_increase_bps); > } > >-void AimdRateControl::UpdateMaxThroughputEstimate( >- float estimated_throughput_kbps) { >- const float alpha = 0.05f; >- if (avg_max_bitrate_kbps_ == -1.0f) { >- avg_max_bitrate_kbps_ = estimated_throughput_kbps; >+void AimdRateControl::UpdateLinkCapacityEstimate( >+ double estimated_throughput_kbps) { >+ const double alpha = 0.05; >+ if (!link_capacity_estimate_kbps_.has_value()) { >+ link_capacity_estimate_kbps_ = estimated_throughput_kbps; > } else { >- avg_max_bitrate_kbps_ = >- (1 - alpha) * avg_max_bitrate_kbps_ + alpha * estimated_throughput_kbps; >+ link_capacity_estimate_kbps_ = >+ (1 - alpha) * link_capacity_estimate_kbps_.value() + >+ alpha * estimated_throughput_kbps; > } >- // Estimate the max bit rate variance and normalize the variance >- // with the average max bit rate. >- const float norm = std::max(avg_max_bitrate_kbps_, 1.0f); >- var_max_bitrate_kbps_ = >- (1 - alpha) * var_max_bitrate_kbps_ + >- alpha * (avg_max_bitrate_kbps_ - estimated_throughput_kbps) * >- (avg_max_bitrate_kbps_ - estimated_throughput_kbps) / norm; >+ // Estimate the variance of the link capacity estimate and normalize the >+ // variance with the link capacity estimate. 
>+ const double norm = std::max(link_capacity_estimate_kbps_.value(), 1.0); >+ var_link_capacity_estimate_kbps_ = >+ (1 - alpha) * var_link_capacity_estimate_kbps_ + >+ alpha * >+ (link_capacity_estimate_kbps_.value() - estimated_throughput_kbps) * >+ (link_capacity_estimate_kbps_.value() - estimated_throughput_kbps) / >+ norm; > // 0.4 ~= 14 kbit/s at 500 kbit/s >- if (var_max_bitrate_kbps_ < 0.4f) { >- var_max_bitrate_kbps_ = 0.4f; >+ if (var_link_capacity_estimate_kbps_ < 0.4) { >+ var_link_capacity_estimate_kbps_ = 0.4; > } >- // 2.5f ~= 35 kbit/s at 500 kbit/s >- if (var_max_bitrate_kbps_ > 2.5f) { >- var_max_bitrate_kbps_ = 2.5f; >+ // 2.5 ~= 35 kbit/s at 500 kbit/s >+ if (var_link_capacity_estimate_kbps_ > 2.5) { >+ var_link_capacity_estimate_kbps_ = 2.5; > } > } > > void AimdRateControl::ChangeState(const RateControlInput& input, >- int64_t now_ms) { >+ Timestamp at_time) { > switch (input.bw_state) { > case BandwidthUsage::kBwNormal: > if (rate_control_state_ == kRcHold) { >- time_last_bitrate_change_ = now_ms; >+ time_last_bitrate_change_ = at_time; > rate_control_state_ = kRcIncrease; > } > break; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/aimd_rate_control.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/aimd_rate_control.h >index 2a3f71d98ae8a7558a152fd754eb817b41960946..adb959a14b9c38b058e5e061afb9edace5ffbdcc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/aimd_rate_control.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/aimd_rate_control.h >@@ -11,8 +11,13 @@ > #ifndef MODULES_REMOTE_BITRATE_ESTIMATOR_AIMD_RATE_CONTROL_H_ > #define MODULES_REMOTE_BITRATE_ESTIMATOR_AIMD_RATE_CONTROL_H_ > >+#include <stdint.h> >+ >+#include "absl/types/optional.h" > #include "modules/remote_bitrate_estimator/include/bwe_defines.h" >-#include "rtc_base/constructormagic.h" >+ >+#include "api/units/data_rate.h" 
>+#include "api/units/timestamp.h" > > namespace webrtc { > >@@ -30,28 +35,28 @@ class AimdRateControl { > // either if it has been explicitly set via SetStartBitrate/SetEstimate, or if > // we have measured a throughput. > bool ValidEstimate() const; >- void SetStartBitrate(int start_bitrate_bps); >- void SetMinBitrate(int min_bitrate_bps); >- int64_t GetFeedbackInterval() const; >+ void SetStartBitrate(DataRate start_bitrate); >+ void SetMinBitrate(DataRate min_bitrate); >+ TimeDelta GetFeedbackInterval() const; > > // Returns true if the bitrate estimate hasn't been changed for more than > // an RTT, or if the estimated_throughput is less than half of the current > // estimate. Should be used to decide if we should reduce the rate further > // when over-using. >- bool TimeToReduceFurther(int64_t now_ms, >- uint32_t estimated_throughput_bps) const; >+ bool TimeToReduceFurther(Timestamp at_time, >+ DataRate estimated_throughput) const; > // As above. To be used if overusing before we have measured a throughput. >- bool InitialTimeToReduceFurther(int64_t now_ms) const; >+ bool InitialTimeToReduceFurther(Timestamp at_time) const; > >- uint32_t LatestEstimate() const; >- void SetRtt(int64_t rtt); >- uint32_t Update(const RateControlInput* input, int64_t now_ms); >- void SetEstimate(int bitrate_bps, int64_t now_ms); >+ DataRate LatestEstimate() const; >+ void SetRtt(TimeDelta rtt); >+ DataRate Update(const RateControlInput* input, Timestamp at_time); >+ void SetEstimate(DataRate bitrate, Timestamp at_time); > > // Returns the increase rate when used bandwidth is near the link capacity. >- int GetNearMaxIncreaseRateBps() const; >+ double GetNearMaxIncreaseRateBpsPerSecond() const; > // Returns the expected time between overuse signals (assuming steady state). 
>- int GetExpectedBandwidthPeriodMs() const; >+ TimeDelta GetExpectedBandwidthPeriod() const; > > private: > friend class GoogCcStatePrinter; >@@ -62,41 +67,40 @@ class AimdRateControl { > // in the "decrease" state the bitrate will be decreased to slightly below the > // current throughput. When in the "hold" state the bitrate will be kept > // constant to allow built up queues to drain. >- uint32_t ChangeBitrate(uint32_t current_bitrate, >+ DataRate ChangeBitrate(DataRate current_bitrate, > const RateControlInput& input, >- int64_t now_ms); >- // Clamps new_bitrate_bps to within the configured min bitrate and a linear >+ Timestamp at_time); >+ // Clamps new_bitrate to within the configured min bitrate and a linear > // function of the throughput, so that the new bitrate can't grow too > // large compared to the bitrate actually being received by the other end. >- uint32_t ClampBitrate(uint32_t new_bitrate_bps, >- uint32_t estimated_throughput_bps) const; >- uint32_t MultiplicativeRateIncrease(int64_t now_ms, >- int64_t last_ms, >- uint32_t current_bitrate_bps) const; >- uint32_t AdditiveRateIncrease(int64_t now_ms, int64_t last_ms) const; >- void UpdateChangePeriod(int64_t now_ms); >- void UpdateMaxThroughputEstimate(float estimated_throughput_kbps); >- void ChangeState(const RateControlInput& input, int64_t now_ms); >+ DataRate ClampBitrate(DataRate new_bitrate, >+ DataRate estimated_throughput) const; >+ DataRate MultiplicativeRateIncrease(Timestamp at_time, >+ Timestamp last_ms, >+ DataRate current_bitrate) const; >+ DataRate AdditiveRateIncrease(Timestamp at_time, Timestamp last_time) const; >+ void UpdateChangePeriod(Timestamp at_time); >+ void UpdateLinkCapacityEstimate(double estimated_throughput_kbps); >+ void ChangeState(const RateControlInput& input, Timestamp at_time); > >- uint32_t min_configured_bitrate_bps_; >- uint32_t max_configured_bitrate_bps_; >- uint32_t current_bitrate_bps_; >- uint32_t latest_estimated_throughput_bps_; >- float 
avg_max_bitrate_kbps_; >- float var_max_bitrate_kbps_; >+ DataRate min_configured_bitrate_; >+ DataRate max_configured_bitrate_; >+ DataRate current_bitrate_; >+ DataRate latest_estimated_throughput_; >+ absl::optional<double> link_capacity_estimate_kbps_; >+ double var_link_capacity_estimate_kbps_; > RateControlState rate_control_state_; >- RateControlRegion rate_control_region_; >- int64_t time_last_bitrate_change_; >- int64_t time_last_bitrate_decrease_; >- int64_t time_first_throughput_estimate_; >+ Timestamp time_last_bitrate_change_; >+ Timestamp time_last_bitrate_decrease_; >+ Timestamp time_first_throughput_estimate_; > bool bitrate_is_initialized_; >- float beta_; >- int64_t rtt_; >+ double beta_; >+ TimeDelta rtt_; > const bool in_experiment_; > const bool smoothing_experiment_; > const bool in_initial_backoff_interval_experiment_; >- int64_t initial_backoff_interval_ms_; >- absl::optional<int> last_decrease_; >+ TimeDelta initial_backoff_interval_; >+ absl::optional<DataRate> last_decrease_; > }; > } // namespace webrtc > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/aimd_rate_control_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/aimd_rate_control_unittest.cc >index 5cde7cedaf492dbda5a3f3bb0f30d7e11e5e0072..4bec9e8ff500781d253f5cd3475a627b202d4b62 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/aimd_rate_control_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/aimd_rate_control_unittest.cc >@@ -40,13 +40,26 @@ AimdRateControlStates CreateAimdRateControlStates() { > states.simulated_clock.reset(new SimulatedClock(kClockInitialTime)); > return states; > } >- >+absl::optional<DataRate> OptionalRateFromOptionalBps( >+ absl::optional<int> bitrate_bps) { >+ if (bitrate_bps) { >+ return DataRate::bps(*bitrate_bps); >+ } else { >+ return absl::nullopt; >+ } >+} > void UpdateRateControl(const 
AimdRateControlStates& states, > const BandwidthUsage& bandwidth_usage, > absl::optional<uint32_t> throughput_estimate, > int64_t now_ms) { >- RateControlInput input(bandwidth_usage, throughput_estimate); >- states.aimd_rate_control->Update(&input, now_ms); >+ RateControlInput input(bandwidth_usage, >+ OptionalRateFromOptionalBps(throughput_estimate)); >+ states.aimd_rate_control->Update(&input, Timestamp::ms(now_ms)); >+} >+void SetEstimate(const AimdRateControlStates& states, int bitrate_bps) { >+ states.aimd_rate_control->SetEstimate( >+ DataRate::bps(bitrate_bps), >+ Timestamp::ms(states.simulated_clock->TimeInMilliseconds())); > } > > } // namespace >@@ -54,40 +67,40 @@ void UpdateRateControl(const AimdRateControlStates& states, > TEST(AimdRateControlTest, MinNearMaxIncreaseRateOnLowBandwith) { > auto states = CreateAimdRateControlStates(); > constexpr int kBitrate = 30000; >- states.aimd_rate_control->SetEstimate( >- kBitrate, states.simulated_clock->TimeInMilliseconds()); >- EXPECT_EQ(4000, states.aimd_rate_control->GetNearMaxIncreaseRateBps()); >+ SetEstimate(states, kBitrate); >+ EXPECT_EQ(4000, >+ states.aimd_rate_control->GetNearMaxIncreaseRateBpsPerSecond()); > } > > TEST(AimdRateControlTest, NearMaxIncreaseRateIs5kbpsOn90kbpsAnd200msRtt) { > auto states = CreateAimdRateControlStates(); > constexpr int kBitrate = 90000; >- states.aimd_rate_control->SetEstimate( >- kBitrate, states.simulated_clock->TimeInMilliseconds()); >- EXPECT_EQ(5000, states.aimd_rate_control->GetNearMaxIncreaseRateBps()); >+ SetEstimate(states, kBitrate); >+ EXPECT_EQ(5000, >+ states.aimd_rate_control->GetNearMaxIncreaseRateBpsPerSecond()); > } > > TEST(AimdRateControlTest, NearMaxIncreaseRateIs5kbpsOn60kbpsAnd100msRtt) { > auto states = CreateAimdRateControlStates(); > constexpr int kBitrate = 60000; >- states.aimd_rate_control->SetEstimate( >- kBitrate, states.simulated_clock->TimeInMilliseconds()); >- states.aimd_rate_control->SetRtt(100); >- EXPECT_EQ(5000, 
states.aimd_rate_control->GetNearMaxIncreaseRateBps()); >+ SetEstimate(states, kBitrate); >+ states.aimd_rate_control->SetRtt(TimeDelta::ms(100)); >+ EXPECT_EQ(5000, >+ states.aimd_rate_control->GetNearMaxIncreaseRateBpsPerSecond()); > } > > TEST(AimdRateControlTest, GetIncreaseRateAndBandwidthPeriod) { > // Smoothing experiment disabled > auto states = CreateAimdRateControlStates(); > constexpr int kBitrate = 300000; >- states.aimd_rate_control->SetEstimate( >- kBitrate, states.simulated_clock->TimeInMilliseconds()); >+ SetEstimate(states, kBitrate); > UpdateRateControl(states, BandwidthUsage::kBwOverusing, kBitrate, > states.simulated_clock->TimeInMilliseconds()); >- EXPECT_NEAR(14000, states.aimd_rate_control->GetNearMaxIncreaseRateBps(), >+ EXPECT_NEAR(14000, >+ states.aimd_rate_control->GetNearMaxIncreaseRateBpsPerSecond(), > 1000); > EXPECT_EQ(kDefaultPeriodMsNoSmoothingExp, >- states.aimd_rate_control->GetExpectedBandwidthPeriodMs()); >+ states.aimd_rate_control->GetExpectedBandwidthPeriod().ms()); > } > > TEST(AimdRateControlTest, GetIncreaseRateAndBandwidthPeriodSmoothingExp) { >@@ -95,21 +108,20 @@ TEST(AimdRateControlTest, GetIncreaseRateAndBandwidthPeriodSmoothingExp) { > test::ScopedFieldTrials override_field_trials(kSmoothingExpFieldTrial); > auto states = CreateAimdRateControlStates(); > constexpr int kBitrate = 300000; >- states.aimd_rate_control->SetEstimate( >- kBitrate, states.simulated_clock->TimeInMilliseconds()); >+ SetEstimate(states, kBitrate); > UpdateRateControl(states, BandwidthUsage::kBwOverusing, kBitrate, > states.simulated_clock->TimeInMilliseconds()); >- EXPECT_NEAR(14000, states.aimd_rate_control->GetNearMaxIncreaseRateBps(), >+ EXPECT_NEAR(14000, >+ states.aimd_rate_control->GetNearMaxIncreaseRateBpsPerSecond(), > 1000); > EXPECT_EQ(kMinBwePeriodMsSmoothingExp, >- states.aimd_rate_control->GetExpectedBandwidthPeriodMs()); >+ states.aimd_rate_control->GetExpectedBandwidthPeriod().ms()); > } > > TEST(AimdRateControlTest, 
BweLimitedByAckedBitrate) { > auto states = CreateAimdRateControlStates(); > constexpr int kAckedBitrate = 10000; >- states.aimd_rate_control->SetEstimate( >- kAckedBitrate, states.simulated_clock->TimeInMilliseconds()); >+ SetEstimate(states, kAckedBitrate); > while (states.simulated_clock->TimeInMilliseconds() - kClockInitialTime < > 20000) { > UpdateRateControl(states, BandwidthUsage::kBwNormal, kAckedBitrate, >@@ -118,14 +130,13 @@ TEST(AimdRateControlTest, BweLimitedByAckedBitrate) { > } > ASSERT_TRUE(states.aimd_rate_control->ValidEstimate()); > EXPECT_EQ(static_cast<uint32_t>(1.5 * kAckedBitrate + 10000), >- states.aimd_rate_control->LatestEstimate()); >+ states.aimd_rate_control->LatestEstimate().bps()); > } > > TEST(AimdRateControlTest, BweNotLimitedByDecreasingAckedBitrate) { > auto states = CreateAimdRateControlStates(); > constexpr int kAckedBitrate = 100000; >- states.aimd_rate_control->SetEstimate( >- kAckedBitrate, states.simulated_clock->TimeInMilliseconds()); >+ SetEstimate(states, kAckedBitrate); > while (states.simulated_clock->TimeInMilliseconds() - kClockInitialTime < > 20000) { > UpdateRateControl(states, BandwidthUsage::kBwNormal, kAckedBitrate, >@@ -135,10 +146,10 @@ TEST(AimdRateControlTest, BweNotLimitedByDecreasingAckedBitrate) { > ASSERT_TRUE(states.aimd_rate_control->ValidEstimate()); > // If the acked bitrate decreases the BWE shouldn't be reduced to 1.5x > // what's being acked, but also shouldn't get to increase more. 
>- uint32_t prev_estimate = states.aimd_rate_control->LatestEstimate(); >+ uint32_t prev_estimate = states.aimd_rate_control->LatestEstimate().bps(); > UpdateRateControl(states, BandwidthUsage::kBwNormal, kAckedBitrate / 2, > states.simulated_clock->TimeInMilliseconds()); >- uint32_t new_estimate = states.aimd_rate_control->LatestEstimate(); >+ uint32_t new_estimate = states.aimd_rate_control->LatestEstimate().bps(); > EXPECT_NEAR(new_estimate, static_cast<uint32_t>(1.5 * kAckedBitrate + 10000), > 2000); > EXPECT_EQ(new_estimate, prev_estimate); >@@ -147,35 +158,34 @@ TEST(AimdRateControlTest, BweNotLimitedByDecreasingAckedBitrate) { > TEST(AimdRateControlTest, DefaultPeriodUntilFirstOveruse) { > // Smoothing experiment disabled > auto states = CreateAimdRateControlStates(); >- states.aimd_rate_control->SetStartBitrate(300000); >+ states.aimd_rate_control->SetStartBitrate(DataRate::kbps(300)); > EXPECT_EQ(kDefaultPeriodMsNoSmoothingExp, >- states.aimd_rate_control->GetExpectedBandwidthPeriodMs()); >+ states.aimd_rate_control->GetExpectedBandwidthPeriod().ms()); > states.simulated_clock->AdvanceTimeMilliseconds(100); > UpdateRateControl(states, BandwidthUsage::kBwOverusing, 280000, > states.simulated_clock->TimeInMilliseconds()); > EXPECT_NE(kDefaultPeriodMsNoSmoothingExp, >- states.aimd_rate_control->GetExpectedBandwidthPeriodMs()); >+ states.aimd_rate_control->GetExpectedBandwidthPeriod().ms()); > } > > TEST(AimdRateControlTest, MinPeriodUntilFirstOveruseSmoothingExp) { > // Smoothing experiment enabled > test::ScopedFieldTrials override_field_trials(kSmoothingExpFieldTrial); > auto states = CreateAimdRateControlStates(); >- states.aimd_rate_control->SetStartBitrate(300000); >+ states.aimd_rate_control->SetStartBitrate(DataRate::kbps(300)); > EXPECT_EQ(kMinBwePeriodMsSmoothingExp, >- states.aimd_rate_control->GetExpectedBandwidthPeriodMs()); >+ states.aimd_rate_control->GetExpectedBandwidthPeriod().ms()); > states.simulated_clock->AdvanceTimeMilliseconds(100); > 
UpdateRateControl(states, BandwidthUsage::kBwOverusing, 280000, > states.simulated_clock->TimeInMilliseconds()); > EXPECT_NE(kMinBwePeriodMsSmoothingExp, >- states.aimd_rate_control->GetExpectedBandwidthPeriodMs()); >+ states.aimd_rate_control->GetExpectedBandwidthPeriod().ms()); > } > > TEST(AimdRateControlTest, ExpectedPeriodAfter20kbpsDropAnd5kbpsIncrease) { > auto states = CreateAimdRateControlStates(); > constexpr int kInitialBitrate = 110000; >- states.aimd_rate_control->SetEstimate( >- kInitialBitrate, states.simulated_clock->TimeInMilliseconds()); >+ SetEstimate(states, kInitialBitrate); > states.simulated_clock->AdvanceTimeMilliseconds(100); > // Make the bitrate drop by 20 kbps to get to 90 kbps. > // The rate increase at 90 kbps should be 5 kbps, so the period should be 4 s. >@@ -183,8 +193,9 @@ TEST(AimdRateControlTest, ExpectedPeriodAfter20kbpsDropAnd5kbpsIncrease) { > (kInitialBitrate - 20000) / kFractionAfterOveruse; > UpdateRateControl(states, BandwidthUsage::kBwOverusing, kAckedBitrate, > states.simulated_clock->TimeInMilliseconds()); >- EXPECT_EQ(5000, states.aimd_rate_control->GetNearMaxIncreaseRateBps()); >- EXPECT_EQ(4000, states.aimd_rate_control->GetExpectedBandwidthPeriodMs()); >+ EXPECT_EQ(5000, >+ states.aimd_rate_control->GetNearMaxIncreaseRateBpsPerSecond()); >+ EXPECT_EQ(4000, states.aimd_rate_control->GetExpectedBandwidthPeriod().ms()); > } > > TEST(AimdRateControlTest, MinPeriodAfterLargeBitrateDecreaseSmoothingExp) { >@@ -192,8 +203,7 @@ TEST(AimdRateControlTest, MinPeriodAfterLargeBitrateDecreaseSmoothingExp) { > test::ScopedFieldTrials override_field_trials(kSmoothingExpFieldTrial); > auto states = CreateAimdRateControlStates(); > constexpr int kInitialBitrate = 110000; >- states.aimd_rate_control->SetEstimate( >- kInitialBitrate, states.simulated_clock->TimeInMilliseconds()); >+ SetEstimate(states, kInitialBitrate); > states.simulated_clock->AdvanceTimeMilliseconds(100); > // Make such a large drop in bitrate that should be 
treated as network > // degradation. >@@ -201,20 +211,19 @@ TEST(AimdRateControlTest, MinPeriodAfterLargeBitrateDecreaseSmoothingExp) { > UpdateRateControl(states, BandwidthUsage::kBwOverusing, kAckedBitrate, > states.simulated_clock->TimeInMilliseconds()); > EXPECT_EQ(kMinBwePeriodMsSmoothingExp, >- states.aimd_rate_control->GetExpectedBandwidthPeriodMs()); >+ states.aimd_rate_control->GetExpectedBandwidthPeriod().ms()); > } > > TEST(AimdRateControlTest, BandwidthPeriodIsNotBelowMin) { > auto states = CreateAimdRateControlStates(); > constexpr int kInitialBitrate = 10000; >- states.aimd_rate_control->SetEstimate( >- kInitialBitrate, states.simulated_clock->TimeInMilliseconds()); >+ SetEstimate(states, kInitialBitrate); > states.simulated_clock->AdvanceTimeMilliseconds(100); > // Make a small (1.5 kbps) bitrate drop to 8.5 kbps. > UpdateRateControl(states, BandwidthUsage::kBwOverusing, kInitialBitrate - 1, > states.simulated_clock->TimeInMilliseconds()); > EXPECT_EQ(kMinBwePeriodMsNoSmoothingExp, >- states.aimd_rate_control->GetExpectedBandwidthPeriodMs()); >+ states.aimd_rate_control->GetExpectedBandwidthPeriod().ms()); > } > > TEST(AimdRateControlTest, BandwidthPeriodIsNotAboveMaxSmoothingExp) { >@@ -222,29 +231,27 @@ TEST(AimdRateControlTest, BandwidthPeriodIsNotAboveMaxSmoothingExp) { > test::ScopedFieldTrials override_field_trials(kSmoothingExpFieldTrial); > auto states = CreateAimdRateControlStates(); > constexpr int kInitialBitrate = 50000000; >- states.aimd_rate_control->SetEstimate( >- kInitialBitrate, states.simulated_clock->TimeInMilliseconds()); >+ SetEstimate(states, kInitialBitrate); > states.simulated_clock->AdvanceTimeMilliseconds(100); > // Make a large (10 Mbps) bitrate drop to 10 kbps. 
> constexpr int kAckedBitrate = 40000000 / kFractionAfterOveruse; > UpdateRateControl(states, BandwidthUsage::kBwOverusing, kAckedBitrate, > states.simulated_clock->TimeInMilliseconds()); > EXPECT_EQ(kMaxBwePeriodMs, >- states.aimd_rate_control->GetExpectedBandwidthPeriodMs()); >+ states.aimd_rate_control->GetExpectedBandwidthPeriod().ms()); > } > > TEST(AimdRateControlTest, BandwidthPeriodIsNotAboveMaxNoSmoothingExp) { > auto states = CreateAimdRateControlStates(); > constexpr int kInitialBitrate = 10010000; >- states.aimd_rate_control->SetEstimate( >- kInitialBitrate, states.simulated_clock->TimeInMilliseconds()); >+ SetEstimate(states, kInitialBitrate); > states.simulated_clock->AdvanceTimeMilliseconds(100); > // Make a large (10 Mbps) bitrate drop to 10 kbps. > constexpr int kAckedBitrate = 10000 / kFractionAfterOveruse; > UpdateRateControl(states, BandwidthUsage::kBwOverusing, kAckedBitrate, > states.simulated_clock->TimeInMilliseconds()); > EXPECT_EQ(kMaxBwePeriodMs, >- states.aimd_rate_control->GetExpectedBandwidthPeriodMs()); >+ states.aimd_rate_control->GetExpectedBandwidthPeriod().ms()); > } > > TEST(AimdRateControlTest, SendingRateBoundedWhenThroughputNotEstimated) { >@@ -265,7 +272,7 @@ TEST(AimdRateControlTest, SendingRateBoundedWhenThroughputNotEstimated) { > states.simulated_clock->TimeInMilliseconds()); > states.simulated_clock->AdvanceTimeMilliseconds(100); > } >- EXPECT_LE(states.aimd_rate_control->LatestEstimate(), >+ EXPECT_LE(states.aimd_rate_control->LatestEstimate().bps(), > kInitialBitrateBps * 1.5 + 10000); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/bwe_defines.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/bwe_defines.cc >index 6cbe46834896771863a38a073d5d190c545b3b3c..91f3cd4050b78f0ff72d246f05fb38317f74a2e7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/bwe_defines.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/bwe_defines.cc >@@ -34,8 +34,8 @@ DataRate GetMinBitrate() { > > RateControlInput::RateControlInput( > BandwidthUsage bw_state, >- const absl::optional<uint32_t>& estimated_throughput_bps) >- : bw_state(bw_state), estimated_throughput_bps(estimated_throughput_bps) {} >+ const absl::optional<DataRate>& estimated_throughput) >+ : bw_state(bw_state), estimated_throughput(estimated_throughput) {} > > RateControlInput::~RateControlInput() = default; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/include/bwe_defines.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/include/bwe_defines.h >index d9185de99cadb0f0c4434586ce64768b3f8bfd1c..5223a6b424b1a0057eeaecd2acc55a1a41f478b0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/include/bwe_defines.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/include/bwe_defines.h >@@ -11,6 +11,8 @@ > #ifndef MODULES_REMOTE_BITRATE_ESTIMATOR_INCLUDE_BWE_DEFINES_H_ > #define MODULES_REMOTE_BITRATE_ESTIMATOR_INCLUDE_BWE_DEFINES_H_ > >+#include <stdint.h> >+ > #include "absl/types/optional.h" > #include "api/units/data_rate.h" > >@@ -45,15 +47,13 @@ enum class BandwidthUsage { > > enum RateControlState { kRcHold, kRcIncrease, kRcDecrease }; > >-enum RateControlRegion { kRcNearMax, kRcAboveMax, kRcMaxUnknown }; >- > struct RateControlInput { > RateControlInput(BandwidthUsage bw_state, >- const absl::optional<uint32_t>& estimated_throughput_bps); >+ const absl::optional<DataRate>& estimated_throughput); > ~RateControlInput(); > > BandwidthUsage bw_state; >- absl::optional<uint32_t> estimated_throughput_bps; >+ absl::optional<DataRate> estimated_throughput; > }; > } // namespace webrtc > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/inter_arrival.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/inter_arrival.cc >index 3a0f579052f90f4d90499c516941718fa740dbcd..b8e683b89a6f2efc2f39580394c136309dd675ab 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/inter_arrival.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/inter_arrival.cc >@@ -10,10 +10,9 @@ > > #include "modules/remote_bitrate_estimator/inter_arrival.h" > >-#include <algorithm> > #include <cassert> > >-#include "modules/include/module_common_types.h" >+#include "modules/include/module_common_types_public.h" > #include "rtc_base/logging.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/overuse_detector.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/overuse_detector.cc >index 776ca42a56ff2203262b0149a62958e2b741c1e6..f3dbe1e998829580047d951513cdd673d9b93c86 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/overuse_detector.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/overuse_detector.cc >@@ -12,15 +12,12 @@ > > #include <math.h> > #include <stdio.h> >-#include <stdlib.h> >- > #include <algorithm> > #include <string> > > #include "modules/remote_bitrate_estimator/include/bwe_defines.h" > #include "modules/remote_bitrate_estimator/test/bwe_test_logging.h" > #include "rtc_base/checks.h" >-#include "rtc_base/logging.h" > #include "rtc_base/numerics/safe_minmax.h" > #include "system_wrappers/include/field_trial.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/overuse_detector.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/overuse_detector.h >index f322c30496097fc15b63028351ade6e0efd7f283..61d1c3f707e3206a139dff33e07421601369c402 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/overuse_detector.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/overuse_detector.h >@@ -10,9 +10,8 @@ > #ifndef MODULES_REMOTE_BITRATE_ESTIMATOR_OVERUSE_DETECTOR_H_ > #define MODULES_REMOTE_BITRATE_ESTIMATOR_OVERUSE_DETECTOR_H_ > >-#include <list> >+#include <stdint.h> > >-#include "modules/include/module_common_types.h" > #include "modules/remote_bitrate_estimator/include/bwe_defines.h" > #include "rtc_base/constructormagic.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/overuse_estimator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/overuse_estimator.cc >index 09de5c6e7c1c6279cebd135db1c41ce475c8dd66..206ceba84c2468e6f8359b6b9004e962f0ab5b70 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/overuse_estimator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/overuse_estimator.cc >@@ -12,9 +12,7 @@ > > #include <assert.h> > #include <math.h> >-#include <stdlib.h> > #include <string.h> >- > #include <algorithm> > > #include "modules/remote_bitrate_estimator/include/bwe_defines.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/overuse_estimator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/overuse_estimator.h >index 4c924938aa0b864c08188c6464aff528d26f5a4d..3d7bd16be47e7379923040dc5a868592ff761452 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/overuse_estimator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/overuse_estimator.h >@@ -10,6 +10,7 @@ > #ifndef MODULES_REMOTE_BITRATE_ESTIMATOR_OVERUSE_ESTIMATOR_H_ > #define MODULES_REMOTE_BITRATE_ESTIMATOR_OVERUSE_ESTIMATOR_H_ > >+#include <stdint.h> > #include <deque> > > #include "common_types.h" // 
NOLINT(build/include) >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/remote_bitrate_estimator_abs_send_time.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/remote_bitrate_estimator_abs_send_time.cc >index a7cfe4c90acc6bb0f00b0a7a242c93e896bd2f75..1ad35c7bf0db741bd6a7253f6858a7fa28dabb65 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/remote_bitrate_estimator_abs_send_time.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/remote_bitrate_estimator_abs_send_time.cc >@@ -22,6 +22,16 @@ > #include "system_wrappers/include/metrics.h" > > namespace webrtc { >+namespace { >+absl::optional<DataRate> OptionalRateFromOptionalBps( >+ absl::optional<int> bitrate_bps) { >+ if (bitrate_bps) { >+ return DataRate::bps(*bitrate_bps); >+ } else { >+ return absl::nullopt; >+ } >+} >+} // namespace > > enum { > kTimestampGroupLengthMs = 5, >@@ -188,7 +198,8 @@ RemoteBitrateEstimatorAbsSendTime::ProcessClusters(int64_t now_ms) { > << " bps. 
Mean send delta: " << best_it->send_mean_ms > << " ms, mean recv delta: " << best_it->recv_mean_ms > << " ms, num probes: " << best_it->count; >- remote_rate_.SetEstimate(probe_bitrate_bps, now_ms); >+ remote_rate_.SetEstimate(DataRate::bps(probe_bitrate_bps), >+ Timestamp::ms(now_ms)); > return ProbeResult::kBitrateUpdated; > } > } >@@ -205,7 +216,7 @@ bool RemoteBitrateEstimatorAbsSendTime::IsBitrateImproving( > bool initial_probe = !remote_rate_.ValidEstimate() && new_bitrate_bps > 0; > bool bitrate_above_estimate = > remote_rate_.ValidEstimate() && >- new_bitrate_bps > static_cast<int>(remote_rate_.LatestEstimate()); >+ new_bitrate_bps > remote_rate_.LatestEstimate().bps<int>(); > return initial_probe || bitrate_above_estimate; > } > >@@ -316,13 +327,14 @@ void RemoteBitrateEstimatorAbsSendTime::IncomingPacketInfo( > // Check if it's time for a periodic update or if we should update because > // of an over-use. > if (last_update_ms_ == -1 || >- now_ms - last_update_ms_ > remote_rate_.GetFeedbackInterval()) { >+ now_ms - last_update_ms_ > remote_rate_.GetFeedbackInterval().ms()) { > update_estimate = true; > } else if (detector_.State() == BandwidthUsage::kBwOverusing) { > absl::optional<uint32_t> incoming_rate = > incoming_bitrate_.Rate(arrival_time_ms); > if (incoming_rate && >- remote_rate_.TimeToReduceFurther(now_ms, *incoming_rate)) { >+ remote_rate_.TimeToReduceFurther(Timestamp::ms(now_ms), >+ DataRate::bps(*incoming_rate))) { > update_estimate = true; > } > } >@@ -332,9 +344,11 @@ void RemoteBitrateEstimatorAbsSendTime::IncomingPacketInfo( > // The first overuse should immediately trigger a new estimate. > // We also have to update the estimate immediately if we are overusing > // and the target bitrate is too high compared to what we are receiving. 
>- const RateControlInput input(detector_.State(), >- incoming_bitrate_.Rate(arrival_time_ms)); >- target_bitrate_bps = remote_rate_.Update(&input, now_ms); >+ const RateControlInput input( >+ detector_.State(), >+ OptionalRateFromOptionalBps(incoming_bitrate_.Rate(arrival_time_ms))); >+ target_bitrate_bps = >+ remote_rate_.Update(&input, Timestamp::ms(now_ms)).bps<uint32_t>(); > update_estimate = remote_rate_.ValidEstimate(); > ssrcs = Keys(ssrcs_); > } >@@ -374,7 +388,7 @@ void RemoteBitrateEstimatorAbsSendTime::TimeoutStreams(int64_t now_ms) { > void RemoteBitrateEstimatorAbsSendTime::OnRttUpdate(int64_t avg_rtt_ms, > int64_t max_rtt_ms) { > rtc::CritScope lock(&crit_); >- remote_rate_.SetRtt(avg_rtt_ms); >+ remote_rate_.SetRtt(TimeDelta::ms(avg_rtt_ms)); > } > > void RemoteBitrateEstimatorAbsSendTime::RemoveStream(uint32_t ssrc) { >@@ -399,7 +413,7 @@ bool RemoteBitrateEstimatorAbsSendTime::LatestEstimate( > if (ssrcs_.empty()) { > *bitrate_bps = 0; > } else { >- *bitrate_bps = remote_rate_.LatestEstimate(); >+ *bitrate_bps = remote_rate_.LatestEstimate().bps<uint32_t>(); > } > return true; > } >@@ -408,6 +422,6 @@ void RemoteBitrateEstimatorAbsSendTime::SetMinBitrate(int min_bitrate_bps) { > // Called from both the configuration thread and the network thread. Shouldn't > // be called from the network thread in the future. 
> rtc::CritScope lock(&crit_); >- remote_rate_.SetMinBitrate(min_bitrate_bps); >+ remote_rate_.SetMinBitrate(DataRate::bps(min_bitrate_bps)); > } > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/remote_bitrate_estimator_single_stream.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/remote_bitrate_estimator_single_stream.cc >index 3d791f78487d9b4da877b0dcea01d5e91f5a6964..a267051944b10f19b8a10dedd522eb7b723e9da8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/remote_bitrate_estimator_single_stream.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/remote_bitrate_estimator_single_stream.cc >@@ -10,17 +10,33 @@ > > #include "modules/remote_bitrate_estimator/remote_bitrate_estimator_single_stream.h" > >+#include <assert.h> >+#include <cstdint> > #include <utility> > >+#include "absl/types/optional.h" >+#include "common_types.h" // NOLINT(build/include) > #include "modules/remote_bitrate_estimator/aimd_rate_control.h" >+#include "modules/remote_bitrate_estimator/include/bwe_defines.h" > #include "modules/remote_bitrate_estimator/inter_arrival.h" > #include "modules/remote_bitrate_estimator/overuse_detector.h" > #include "modules/remote_bitrate_estimator/overuse_estimator.h" >+#include "rtc_base/checks.h" > #include "rtc_base/logging.h" > #include "system_wrappers/include/clock.h" > #include "system_wrappers/include/metrics.h" > > namespace webrtc { >+namespace { >+absl::optional<DataRate> OptionalRateFromOptionalBps( >+ absl::optional<int> bitrate_bps) { >+ if (bitrate_bps) { >+ return DataRate::bps(*bitrate_bps); >+ } else { >+ return absl::nullopt; >+ } >+} >+} // namespace > > enum { kTimestampGroupLengthMs = 5 }; > static const double kTimestampToMs = 1.0 / 90.0; >@@ -127,7 +143,8 @@ void RemoteBitrateEstimatorSingleStream::IncomingPacket( > incoming_bitrate_.Rate(now_ms); > if (incoming_bitrate_bps 
&& > (prior_state != BandwidthUsage::kBwOverusing || >- GetRemoteRate()->TimeToReduceFurther(now_ms, *incoming_bitrate_bps))) { >+ GetRemoteRate()->TimeToReduceFurther( >+ Timestamp::ms(now_ms), DataRate::bps(*incoming_bitrate_bps)))) { > // The first overuse should immediately trigger a new estimate. > // We also have to update the estimate immediately if we are overusing > // and the target bitrate is too high compared to what we are receiving. >@@ -181,10 +198,12 @@ void RemoteBitrateEstimatorSingleStream::UpdateEstimate(int64_t now_ms) { > } > AimdRateControl* remote_rate = GetRemoteRate(); > >- const RateControlInput input(bw_state, incoming_bitrate_.Rate(now_ms)); >- uint32_t target_bitrate = remote_rate->Update(&input, now_ms); >+ const RateControlInput input( >+ bw_state, OptionalRateFromOptionalBps(incoming_bitrate_.Rate(now_ms))); >+ uint32_t target_bitrate = >+ remote_rate->Update(&input, Timestamp::ms(now_ms)).bps<uint32_t>(); > if (remote_rate->ValidEstimate()) { >- process_interval_ms_ = remote_rate->GetFeedbackInterval(); >+ process_interval_ms_ = remote_rate->GetFeedbackInterval().ms(); > RTC_DCHECK_GT(process_interval_ms_, 0); > std::vector<uint32_t> ssrcs; > GetSsrcs(&ssrcs); >@@ -196,7 +215,7 @@ void RemoteBitrateEstimatorSingleStream::UpdateEstimate(int64_t now_ms) { > void RemoteBitrateEstimatorSingleStream::OnRttUpdate(int64_t avg_rtt_ms, > int64_t max_rtt_ms) { > rtc::CritScope cs(&crit_sect_); >- GetRemoteRate()->SetRtt(avg_rtt_ms); >+ GetRemoteRate()->SetRtt(TimeDelta::ms(avg_rtt_ms)); > } > > void RemoteBitrateEstimatorSingleStream::RemoveStream(unsigned int ssrc) { >@@ -220,7 +239,7 @@ bool RemoteBitrateEstimatorSingleStream::LatestEstimate( > if (ssrcs->empty()) > *bitrate_bps = 0; > else >- *bitrate_bps = remote_rate_->LatestEstimate(); >+ *bitrate_bps = remote_rate_->LatestEstimate().bps<uint32_t>(); > return true; > } > >@@ -243,7 +262,7 @@ AimdRateControl* RemoteBitrateEstimatorSingleStream::GetRemoteRate() { > > void 
RemoteBitrateEstimatorSingleStream::SetMinBitrate(int min_bitrate_bps) { > rtc::CritScope cs(&crit_sect_); >- remote_rate_->SetMinBitrate(min_bitrate_bps); >+ remote_rate_->SetMinBitrate(DataRate::bps(min_bitrate_bps)); > } > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/remote_bitrate_estimator_single_stream.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/remote_bitrate_estimator_single_stream.h >index 18b435affdd094931a331f2f4fcf1f32dec6f61e..638f0d68027abc5508c1c8a01a69d6dfc00ff04c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/remote_bitrate_estimator_single_stream.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/remote_bitrate_estimator_single_stream.h >@@ -11,6 +11,8 @@ > #ifndef MODULES_REMOTE_BITRATE_ESTIMATOR_REMOTE_BITRATE_ESTIMATOR_SINGLE_STREAM_H_ > #define MODULES_REMOTE_BITRATE_ESTIMATOR_REMOTE_BITRATE_ESTIMATOR_SINGLE_STREAM_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <map> > #include <memory> > #include <vector> >@@ -20,9 +22,13 @@ > #include "rtc_base/constructormagic.h" > #include "rtc_base/criticalsection.h" > #include "rtc_base/rate_statistics.h" >+#include "rtc_base/thread_annotations.h" > > namespace webrtc { > >+class Clock; >+struct RTPHeader; >+ > class RemoteBitrateEstimatorSingleStream : public RemoteBitrateEstimator { > public: > RemoteBitrateEstimatorSingleStream(RemoteBitrateObserver* observer, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/remote_bitrate_estimator_single_stream_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/remote_bitrate_estimator_single_stream_unittest.cc >index 120db13c7f2854d76313424ef9abb4c7b570df57..2a4ef0647373b665186db5f7831c354d0dca7669 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/remote_bitrate_estimator_single_stream_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/remote_bitrate_estimator_single_stream_unittest.cc >@@ -55,7 +55,7 @@ TEST_F(RemoteBitrateEstimatorSingleTest, CapacityDropThreeStreamsWrap) { > } > > TEST_F(RemoteBitrateEstimatorSingleTest, CapacityDropThirteenStreamsWrap) { >- CapacityDropTestHelper(13, true, 733, 0); >+ CapacityDropTestHelper(13, true, 567, 0); > } > > TEST_F(RemoteBitrateEstimatorSingleTest, CapacityDropNineteenStreamsWrap) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/test/estimators/send_side.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/test/estimators/send_side.cc >index a4444375805b4be6e52688d4dbbe7dc0eaa246c4..289ff70ee2157ece315e0a2311c094b57e7040b1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/test/estimators/send_side.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/test/estimators/send_side.cc >@@ -32,6 +32,7 @@ SendSideBweSender::SendSideBweSender(int kbps, > &event_log_)), > acknowledged_bitrate_estimator_( > absl::make_unique<AcknowledgedBitrateEstimator>()), >+ probe_bitrate_estimator_(new ProbeBitrateEstimator(nullptr)), > bwe_(new DelayBasedBwe(nullptr)), > feedback_observer_(bitrate_controller_.get()), > clock_(clock), >@@ -44,7 +45,7 @@ SendSideBweSender::SendSideBweSender(int kbps, > bitrate_controller_->SetStartBitrate(1000 * kbps); > bitrate_controller_->SetMinMaxBitrate(1000 * kMinBitrateKbps, > 1000 * kMaxBitrateKbps); >- bwe_->SetMinBitrate(1000 * kMinBitrateKbps); >+ bwe_->SetMinBitrate(DataRate::kbps(kMinBitrateKbps)); > } > > SendSideBweSender::~SendSideBweSender() {} >@@ -72,16 +73,22 @@ void SendSideBweSender::GiveFeedback(const FeedbackPacket& feedback) { > > int64_t rtt_ms = > 
clock_->TimeInMilliseconds() - feedback.latest_send_time_ms(); >- bwe_->OnRttUpdate(rtt_ms); >+ bwe_->OnRttUpdate(TimeDelta::ms(rtt_ms)); > BWE_TEST_LOGGING_PLOT(1, "RTT", clock_->TimeInMilliseconds(), rtt_ms); > > std::sort(packet_feedback_vector.begin(), packet_feedback_vector.end(), > PacketFeedbackComparator()); > acknowledged_bitrate_estimator_->IncomingPacketFeedbackVector( > packet_feedback_vector); >+ for (PacketFeedback& packet : packet_feedback_vector) { >+ if (packet.send_time_ms != PacketFeedback::kNoSendTime && >+ packet.pacing_info.probe_cluster_id != PacedPacketInfo::kNotAProbe) >+ probe_bitrate_estimator_->HandleProbeAndEstimateBitrate(packet); >+ } > DelayBasedBwe::Result result = bwe_->IncomingPacketFeedbackVector( >- packet_feedback_vector, acknowledged_bitrate_estimator_->bitrate_bps(), >- clock_->TimeInMilliseconds()); >+ packet_feedback_vector, acknowledged_bitrate_estimator_->bitrate(), >+ probe_bitrate_estimator_->FetchAndResetLastEstimatedBitrate(), >+ Timestamp::ms(clock_->TimeInMilliseconds())); > if (result.updated) > bitrate_controller_->OnDelayBasedBweResult(result); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/test/estimators/send_side.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/test/estimators/send_side.h >index 6e939c1f035706addcd35f20637181eb7794e50b..5b45e66fbca754b893847d4851c648330e47df5c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/test/estimators/send_side.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/test/estimators/send_side.h >@@ -39,6 +39,7 @@ class SendSideBweSender : public BweSender, public RemoteBitrateObserver { > protected: > std::unique_ptr<BitrateController> bitrate_controller_; > std::unique_ptr<AcknowledgedBitrateEstimator> acknowledged_bitrate_estimator_; >+ std::unique_ptr<ProbeBitrateEstimator> probe_bitrate_estimator_; > 
std::unique_ptr<DelayBasedBwe> bwe_; > RtcpBandwidthObserver* feedback_observer_; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/tools/bwe_rtp.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/tools/bwe_rtp.cc >index 0230db17dc01c2ddc159287e2320519a2a7b950c..d7cdb54d92879a42c87aa7bd6f5f2072ce8fad48 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/tools/bwe_rtp.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/remote_bitrate_estimator/tools/bwe_rtp.cc >@@ -24,28 +24,30 @@ > > namespace flags { > >-DEFINE_string(extension_type, >- "abs", >- "Extension type, either abs for absolute send time or tsoffset " >- "for timestamp offset."); >+WEBRTC_DEFINE_string( >+ extension_type, >+ "abs", >+ "Extension type, either abs for absolute send time or tsoffset " >+ "for timestamp offset."); > std::string ExtensionType() { > return static_cast<std::string>(FLAG_extension_type); > } > >-DEFINE_int(extension_id, 3, "Extension id."); >+WEBRTC_DEFINE_int(extension_id, 3, "Extension id."); > int ExtensionId() { > return static_cast<int>(FLAG_extension_id); > } > >-DEFINE_string(input_file, "", "Input file."); >+WEBRTC_DEFINE_string(input_file, "", "Input file."); > std::string InputFile() { > return static_cast<std::string>(FLAG_input_file); > } > >-DEFINE_string(ssrc_filter, >- "", >- "Comma-separated list of SSRCs in hexadecimal which are to be " >- "used as input to the BWE (only applicable to pcap files)."); >+WEBRTC_DEFINE_string( >+ ssrc_filter, >+ "", >+ "Comma-separated list of SSRCs in hexadecimal which are to be " >+ "used as input to the BWE (only applicable to pcap files)."); > std::set<uint32_t> SsrcFilter() { > std::string ssrc_filter_string = static_cast<std::string>(FLAG_ssrc_filter); > if (ssrc_filter_string.empty()) >@@ -64,7 +66,7 @@ std::set<uint32_t> SsrcFilter() { > return ssrcs; > } > >-DEFINE_bool(help, false, "Print this message."); 
>+WEBRTC_DEFINE_bool(help, false, "Print this message."); > } // namespace flags > > bool ParseArgsAndSetupEstimator(int argc, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/BUILD.gn >index 4b37290765273efc693c1a14aa7a1056923371ff..edb981bc67edc172d1ad858b59435a40a0b65835 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/BUILD.gn >@@ -11,6 +11,7 @@ import("../../webrtc.gni") > rtc_source_set("rtp_rtcp_format") { > visibility = [ "*" ] > public = [ >+ "include/rtcp_statistics.h", > "include/rtp_cvo.h", > "include/rtp_header_extension_map.h", > "include/rtp_rtcp_defines.h", >@@ -85,6 +86,7 @@ rtc_source_set("rtp_rtcp_format") { > > deps = [ > "..:module_api", >+ "..:module_api_public", > "../..:webrtc_common", > "../../api:array_view", > "../../api:libjingle_peerconnection_api", >@@ -96,6 +98,7 @@ rtc_source_set("rtp_rtcp_format") { > "../../rtc_base:deprecation", > "../../rtc_base:rtc_base_approved", > "../../system_wrappers", >+ "../video_coding:codec_globals_headers", > "//third_party/abseil-cpp/absl/types:optional", > "//third_party/abseil-cpp/absl/types:variant", > ] >@@ -186,7 +189,10 @@ rtc_static_library("rtp_rtcp") { > > deps = [ > ":rtp_rtcp_format", >+ ":rtp_video_header", > "..:module_api", >+ "..:module_api_public", >+ "..:module_fec_api", > "../..:webrtc_common", > "../../api:array_view", > "../../api:libjingle_peerconnection_api", >@@ -217,9 +223,12 @@ rtc_static_library("rtp_rtcp") { > "../../system_wrappers:metrics", > "../audio_coding:audio_format_conversion", > "../remote_bitrate_estimator", >+ "../video_coding:codec_globals_headers", > "//third_party/abseil-cpp/absl/container:inlined_vector", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings", > "//third_party/abseil-cpp/absl/types:optional", >+ 
"//third_party/abseil-cpp/absl/types:variant", > ] > } > >@@ -264,6 +273,7 @@ rtc_source_set("rtp_video_header") { > "../../api/video:video_frame", > "../../modules/video_coding:codec_globals_headers", > "//third_party/abseil-cpp/absl/container:inlined_vector", >+ "//third_party/abseil-cpp/absl/types:optional", > "//third_party/abseil-cpp/absl/types:variant", > ] > } >@@ -322,6 +332,7 @@ if (rtc_include_tests) { > ":rtp_rtcp", > "../../test:fileutils", > "../../test:test_main", >+ "../../test:test_support", > "//testing/gtest", > ] > } # test_packet_masks_metrics >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/flexfec_receiver.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/flexfec_receiver.h >index f006111e117b62fb3de994ae0c2089a2ae869258..f0ed576c87560793c39b3793338cb04b88ed15e5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/flexfec_receiver.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/flexfec_receiver.h >@@ -11,16 +11,20 @@ > #ifndef MODULES_RTP_RTCP_INCLUDE_FLEXFEC_RECEIVER_H_ > #define MODULES_RTP_RTCP_INCLUDE_FLEXFEC_RECEIVER_H_ > >+#include <stdint.h> > #include <memory> > >+#include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > #include "modules/rtp_rtcp/include/ulpfec_receiver.h" > #include "modules/rtp_rtcp/source/forward_error_correction.h" > #include "modules/rtp_rtcp/source/rtp_packet_received.h" > #include "rtc_base/sequenced_task_checker.h" >-#include "system_wrappers/include/clock.h" >+#include "rtc_base/thread_annotations.h" > > namespace webrtc { > >+class Clock; >+ > class FlexfecReceiver { > public: > FlexfecReceiver(uint32_t ssrc, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/flexfec_sender.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/flexfec_sender.h >index 12277e128f5554919ff29763d2f3adf83677d850..acee11764c0aaf15ab67234bb0288353fec4f574 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/flexfec_sender.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/flexfec_sender.h >@@ -18,17 +18,15 @@ > #include "api/array_view.h" > #include "api/rtpparameters.h" > #include "modules/include/module_common_types.h" >-#include "modules/rtp_rtcp/include/flexfec_sender.h" > #include "modules/rtp_rtcp/include/rtp_header_extension_map.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > #include "modules/rtp_rtcp/source/rtp_header_extension_size.h" >-#include "modules/rtp_rtcp/source/rtp_packet_to_send.h" > #include "modules/rtp_rtcp/source/ulpfec_generator.h" > #include "rtc_base/random.h" >-#include "system_wrappers/include/clock.h" > > namespace webrtc { > >+class Clock; > class RtpPacketToSend; > > // Note that this class is not thread safe, and thus requires external >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/receive_statistics.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/receive_statistics.h >index 24d6f81237a5b6eefbd264806650f0143cb994b4..c299ea69d66ec03bd481dc79e86c539734541db0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/receive_statistics.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/receive_statistics.h >@@ -12,11 +12,13 @@ > #define MODULES_RTP_RTCP_INCLUDE_RECEIVE_STATISTICS_H_ > > #include <map> >+#include <memory> > #include <vector> > > #include "call/rtp_packet_sink_interface.h" > #include "modules/include/module.h" > #include "modules/include/module_common_types.h" >+#include "modules/rtp_rtcp/include/rtcp_statistics.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > #include "modules/rtp_rtcp/source/rtcp_packet/report_block.h" > #include "rtc_base/deprecation.h" >@@ -54,14 +56,14 @@ class ReceiveStatistics : public ReceiveStatisticsProvider, > public: > ~ReceiveStatistics() override = default; > >- static 
ReceiveStatistics* Create(Clock* clock); >+ static ReceiveStatistics* Create(Clock* clock) { >+ return Create(clock, nullptr, nullptr).release(); >+ } > >- // Updates the receive statistics with this packet. >- // TODO(bugs.webrtc.org/8016): Deprecated. Delete as soon as >- // downstream code is updated to use OnRtpPacket. >- RTC_DEPRECATED >- virtual void IncomingPacket(const RTPHeader& rtp_header, >- size_t packet_length) = 0; >+ static std::unique_ptr<ReceiveStatistics> Create( >+ Clock* clock, >+ RtcpStatisticsCallback* rtcp_callback, >+ StreamDataCountersCallback* rtp_callback); > > // Increment counter for number of FEC packets received. > virtual void FecPacketReceived(const RtpPacketReceived& packet) = 0; >@@ -74,14 +76,6 @@ class ReceiveStatistics : public ReceiveStatisticsProvider, > // Detect retransmissions, enabling updates of the retransmitted counters. The > // default is false. > virtual void EnableRetransmitDetection(uint32_t ssrc, bool enable) = 0; >- >- // Called on new RTCP stats creation. >- virtual void RegisterRtcpStatisticsCallback( >- RtcpStatisticsCallback* callback) = 0; >- >- // Called on new RTP stats creation. 
>- virtual void RegisterRtpStatisticsCallback( >- StreamDataCountersCallback* callback) = 0; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/remote_ntp_time_estimator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/remote_ntp_time_estimator.h >index 5195e8ab7e316c9edef7410cdaa65b1df1b04b1a..e6d269c4ddabb1b9b53ba88e35ef767542d53832 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/remote_ntp_time_estimator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/remote_ntp_time_estimator.h >@@ -11,7 +11,7 @@ > #ifndef MODULES_RTP_RTCP_INCLUDE_REMOTE_NTP_TIME_ESTIMATOR_H_ > #define MODULES_RTP_RTCP_INCLUDE_REMOTE_NTP_TIME_ESTIMATOR_H_ > >-#include <memory> >+#include <stdint.h> > > #include "rtc_base/constructormagic.h" > #include "rtc_base/numerics/moving_median_filter.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/rtcp_statistics.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/rtcp_statistics.h >new file mode 100644 >index 0000000000000000000000000000000000000000..e1d576de2d86fa8e85226bbfb59f5a6b3874bf62 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/rtcp_statistics.h >@@ -0,0 +1,36 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#ifndef MODULES_RTP_RTCP_INCLUDE_RTCP_STATISTICS_H_ >+#define MODULES_RTP_RTCP_INCLUDE_RTCP_STATISTICS_H_ >+ >+#include <stdint.h> >+ >+namespace webrtc { >+ >+// Statistics for an RTCP channel >+struct RtcpStatistics { >+ uint8_t fraction_lost = 0; >+ int32_t packets_lost = 0; // Defined as a 24 bit signed integer in RTCP >+ uint32_t extended_highest_sequence_number = 0; >+ uint32_t jitter = 0; >+}; >+ >+class RtcpStatisticsCallback { >+ public: >+ virtual ~RtcpStatisticsCallback() {} >+ >+ virtual void StatisticsUpdated(const RtcpStatistics& statistics, >+ uint32_t ssrc) = 0; >+ virtual void CNameChanged(const char* cname, uint32_t ssrc) = 0; >+}; >+ >+} // namespace webrtc >+#endif // MODULES_RTP_RTCP_INCLUDE_RTCP_STATISTICS_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/rtp_header_extension_map.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/rtp_header_extension_map.h >index b8f27a13124532e5b42eec626063e9b31c98c82d..391c5beaa5a62bcb079d191231a625ba2a832ab8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/rtp_header_extension_map.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/rtp_header_extension_map.h >@@ -11,6 +11,7 @@ > #ifndef MODULES_RTP_RTCP_INCLUDE_RTP_HEADER_EXTENSION_MAP_H_ > #define MODULES_RTP_RTCP_INCLUDE_RTP_HEADER_EXTENSION_MAP_H_ > >+#include <stdint.h> > #include <string> > > #include "api/array_view.h" >@@ -26,6 +27,7 @@ class RtpHeaderExtensionMap { > static constexpr int kInvalidId = 0; > > RtpHeaderExtensionMap(); >+ explicit RtpHeaderExtensionMap(bool extmap_allow_mixed); > explicit RtpHeaderExtensionMap(rtc::ArrayView<const RtpExtension> extensions); > > template <typename Extension> >@@ -53,18 +55,19 @@ class RtpHeaderExtensionMap { > } > int32_t Deregister(RTPExtensionType type); > >- bool IsMixedOneTwoByteHeaderSupported() const { >- return mixed_one_two_byte_header_supported_; >- } >- void 
SetMixedOneTwoByteHeaderSupported(bool supported) { >- mixed_one_two_byte_header_supported_ = supported; >+ // Corresponds to the SDP attribute extmap-allow-mixed, see RFC8285. >+ // Set to true if it's allowed to mix one- and two-byte RTP header extensions >+ // in the same stream. >+ bool ExtmapAllowMixed() const { return extmap_allow_mixed_; } >+ void SetExtmapAllowMixed(bool extmap_allow_mixed) { >+ extmap_allow_mixed_ = extmap_allow_mixed; > } > > private: > bool Register(int id, RTPExtensionType type, const char* uri); > > uint8_t ids_[kRtpExtensionNumberOfExtensions]; >- bool mixed_one_two_byte_header_supported_; >+ bool extmap_allow_mixed_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/rtp_header_parser.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/rtp_header_parser.h >index 2d84fc1e6903920f997df44b6df08863dbf057d8..85eab90a73acd52147a17cbe594445c326352376 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/rtp_header_parser.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/rtp_header_parser.h >@@ -10,6 +10,7 @@ > #ifndef MODULES_RTP_RTCP_INCLUDE_RTP_HEADER_PARSER_H_ > #define MODULES_RTP_RTCP_INCLUDE_RTP_HEADER_PARSER_H_ > >+#include "api/rtpparameters.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > > namespace webrtc { >@@ -36,8 +37,14 @@ class RtpHeaderParser { > virtual bool RegisterRtpHeaderExtension(RTPExtensionType type, > uint8_t id) = 0; > >+ // Registers an RTP header extension. >+ virtual bool RegisterRtpHeaderExtension(RtpExtension extension) = 0; >+ > // De-registers an RTP header extension. > virtual bool DeregisterRtpHeaderExtension(RTPExtensionType type) = 0; >+ >+ // De-registers an RTP header extension. 
>+ virtual bool DeregisterRtpHeaderExtension(RtpExtension extension) = 0; > }; > } // namespace webrtc > #endif // MODULES_RTP_RTCP_INCLUDE_RTP_HEADER_PARSER_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/rtp_rtcp.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/rtp_rtcp.h >index 9b29f7622e2ff134230ca056b1d8a65553d6e2c5..d136a5e6e9a3ba7623ccb3c4c732131212714d69 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/rtp_rtcp.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/rtp_rtcp.h >@@ -18,9 +18,9 @@ > > #include "absl/types/optional.h" > #include "api/video/video_bitrate_allocation.h" >-#include "common_types.h" // NOLINT(build/include) > #include "modules/include/module.h" > #include "modules/rtp_rtcp/include/flexfec_sender.h" >+#include "modules/rtp_rtcp/include/receive_statistics.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > #include "rtc_base/constructormagic.h" > #include "rtc_base/deprecation.h" >@@ -28,6 +28,7 @@ > namespace webrtc { > > // Forward declarations. >+class FrameEncryptorInterface; > class OverheadObserver; > class RateLimiter; > class ReceiveStatisticsProvider; >@@ -92,11 +93,20 @@ class RtpRtcp : public Module, public RtcpFeedbackSenderInterface { > RateLimiter* retransmission_rate_limiter = nullptr; > OverheadObserver* overhead_observer = nullptr; > RtpKeepAliveConfig keepalive_config; >- RtcpIntervalConfig rtcp_interval_config; >+ >+ int rtcp_report_interval_ms = 0; > > // Update network2 instead of pacer_exit field of video timing extension. > bool populate_network2_timestamp = false; > >+ // E2EE Custom Video Frame Encryption >+ FrameEncryptorInterface* frame_encryptor = nullptr; >+ // Require all outgoing frames to be encrypted with a FrameEncryptor. >+ bool require_frame_encryption = false; >+ >+ // Corresponds to extmap-allow-mixed in SDP negotiation. 
>+ bool extmap_allow_mixed = false; >+ > private: > RTC_DISALLOW_COPY_AND_ASSIGN(Configuration); > }; >@@ -136,6 +146,8 @@ class RtpRtcp : public Module, public RtcpFeedbackSenderInterface { > // Returns -1 on failure else 0. > virtual int32_t DeRegisterSendPayload(int8_t payload_type) = 0; > >+ virtual void SetExtmapAllowMixed(bool extmap_allow_mixed) = 0; >+ > // (De)registers RTP header extension type and id. > // Returns -1 on failure else 0. > virtual int32_t RegisterSendRtpHeaderExtension(RTPExtensionType type, >@@ -213,6 +225,10 @@ class RtpRtcp : public Module, public RtcpFeedbackSenderInterface { > // Returns current media sending status. > virtual bool SendingMedia() const = 0; > >+ // Indicate that the packets sent by this module should be counted towards the >+ // bitrate estimate since the stream participates in the bitrate allocation. >+ virtual void SetAsPartOfAllocation(bool part_of_allocation) = 0; >+ > // Returns current bitrate in Kbit/s. > virtual void BitrateSent(uint32_t* total_rate, > uint32_t* video_rate, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/rtp_rtcp_defines.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/rtp_rtcp_defines.cc >index f86b238b7daebe632f1fe4dad23a4483444a8c9c..d743f52f7e6df3ea268321200c33758516f29eeb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/rtp_rtcp_defines.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/rtp_rtcp_defines.cc >@@ -9,6 +9,14 @@ > */ > > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" >+#include "modules/rtp_rtcp/source/rtp_packet.h" >+ >+#include <ctype.h> >+#include <string.h> >+#include <algorithm> >+#include <type_traits> >+ >+#include "api/array_view.h" > > namespace webrtc { > >@@ -54,4 +62,80 @@ PayloadUnion::~PayloadUnion() = default; > PayloadUnion& PayloadUnion::operator=(const PayloadUnion&) = default; > PayloadUnion& PayloadUnion::operator=(PayloadUnion&&) = 
default; > >+PacketFeedback::PacketFeedback(int64_t arrival_time_ms, >+ uint16_t sequence_number) >+ : PacketFeedback(-1, >+ arrival_time_ms, >+ kNoSendTime, >+ sequence_number, >+ 0, >+ 0, >+ 0, >+ PacedPacketInfo()) {} >+ >+PacketFeedback::PacketFeedback(int64_t arrival_time_ms, >+ int64_t send_time_ms, >+ uint16_t sequence_number, >+ size_t payload_size, >+ const PacedPacketInfo& pacing_info) >+ : PacketFeedback(-1, >+ arrival_time_ms, >+ send_time_ms, >+ sequence_number, >+ payload_size, >+ 0, >+ 0, >+ pacing_info) {} >+ >+PacketFeedback::PacketFeedback(int64_t creation_time_ms, >+ uint16_t sequence_number, >+ size_t payload_size, >+ uint16_t local_net_id, >+ uint16_t remote_net_id, >+ const PacedPacketInfo& pacing_info) >+ : PacketFeedback(creation_time_ms, >+ kNotReceived, >+ kNoSendTime, >+ sequence_number, >+ payload_size, >+ local_net_id, >+ remote_net_id, >+ pacing_info) {} >+ >+PacketFeedback::PacketFeedback(int64_t creation_time_ms, >+ int64_t arrival_time_ms, >+ int64_t send_time_ms, >+ uint16_t sequence_number, >+ size_t payload_size, >+ uint16_t local_net_id, >+ uint16_t remote_net_id, >+ const PacedPacketInfo& pacing_info) >+ : creation_time_ms(creation_time_ms), >+ arrival_time_ms(arrival_time_ms), >+ send_time_ms(send_time_ms), >+ sequence_number(sequence_number), >+ payload_size(payload_size), >+ unacknowledged_data(0), >+ local_net_id(local_net_id), >+ remote_net_id(remote_net_id), >+ pacing_info(pacing_info) {} >+ >+PacketFeedback::PacketFeedback(const PacketFeedback&) = default; >+PacketFeedback& PacketFeedback::operator=(const PacketFeedback&) = default; >+PacketFeedback::~PacketFeedback() = default; >+ >+bool PacketFeedback::operator==(const PacketFeedback& rhs) const { >+ return arrival_time_ms == rhs.arrival_time_ms && >+ send_time_ms == rhs.send_time_ms && >+ sequence_number == rhs.sequence_number && >+ payload_size == rhs.payload_size && pacing_info == rhs.pacing_info; >+} >+ >+void RtpPacketCounter::AddPacket(const RtpPacket& packet) { 
>+ ++packets; >+ header_bytes += packet.headers_size(); >+ padding_bytes += packet.padding_size(); >+ payload_bytes += packet.payload_size(); >+} >+ > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/rtp_rtcp_defines.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/rtp_rtcp_defines.h >index 6503b4edcc3a3b88d7fe0ce657b03ef7b780ad6c..ab4fcaecc79a481071bc166dc79a52feb0a62556 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/rtp_rtcp_defines.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/include/rtp_rtcp_defines.h >@@ -27,28 +27,20 @@ > #define IP_PACKET_SIZE 1500 // we assume ethernet > > namespace webrtc { >+class RtpPacket; > namespace rtcp { > class TransportFeedback; > } > > const int kVideoPayloadTypeFrequency = 90000; >-// TODO(solenberg): RTP time stamp rate for RTCP is fixed at 8k, this is legacy >-// and should be fixed. >-// See: https://bugs.chromium.org/p/webrtc/issues/detail?id=6458 >+ >+// TODO(bugs.webrtc.org/6458): Remove this when all the depending projects are >+// updated to correctly set rtp rate for RtcpSender. > const int kBogusRtpRateForAudioRtcp = 8000; > > // Minimum RTP header size in bytes. > const uint8_t kRtpHeaderSize = 12; > >-struct RtcpIntervalConfig final { >- RtcpIntervalConfig() = default; >- RtcpIntervalConfig(int64_t video_interval_ms, int64_t audio_interval_ms) >- : video_interval_ms(video_interval_ms), >- audio_interval_ms(audio_interval_ms) {} >- int64_t video_interval_ms = 1000; >- int64_t audio_interval_ms = 5000; >-}; >- > struct AudioPayload { > SdpAudioFormat format; > uint32_t rate; >@@ -112,6 +104,7 @@ enum RTPExtensionType : int { > kRtpExtensionRepairedRtpStreamId, > kRtpExtensionMid, > kRtpExtensionGenericFrameDescriptor, >+ kRtpExtensionColorSpace, > kRtpExtensionNumberOfExtensions // Must be the last entity in the enum. 
> }; > >@@ -222,15 +215,6 @@ struct RtpState { > bool media_has_been_sent; > }; > >-class RtpData { >- public: >- virtual ~RtpData() {} >- >- virtual int32_t OnReceivedPayloadData(const uint8_t* payload_data, >- size_t payload_size, >- const WebRtcRTPHeader* rtp_header) = 0; >-}; >- > // Callback interface for packets recovered by FlexFEC or ULPFEC. In > // the FlexFEC case, the implementation should be able to demultiplex > // the recovered RTP packets based on SSRC. >@@ -263,44 +247,20 @@ class RtcpBandwidthObserver { > }; > > struct PacketFeedback { >- PacketFeedback(int64_t arrival_time_ms, uint16_t sequence_number) >- : PacketFeedback(-1, >- arrival_time_ms, >- kNoSendTime, >- sequence_number, >- 0, >- 0, >- 0, >- PacedPacketInfo()) {} >+ PacketFeedback(int64_t arrival_time_ms, uint16_t sequence_number); > > PacketFeedback(int64_t arrival_time_ms, > int64_t send_time_ms, > uint16_t sequence_number, > size_t payload_size, >- const PacedPacketInfo& pacing_info) >- : PacketFeedback(-1, >- arrival_time_ms, >- send_time_ms, >- sequence_number, >- payload_size, >- 0, >- 0, >- pacing_info) {} >+ const PacedPacketInfo& pacing_info); > > PacketFeedback(int64_t creation_time_ms, > uint16_t sequence_number, > size_t payload_size, > uint16_t local_net_id, > uint16_t remote_net_id, >- const PacedPacketInfo& pacing_info) >- : PacketFeedback(creation_time_ms, >- kNotReceived, >- kNoSendTime, >- sequence_number, >- payload_size, >- local_net_id, >- remote_net_id, >- pacing_info) {} >+ const PacedPacketInfo& pacing_info); > > PacketFeedback(int64_t creation_time_ms, > int64_t arrival_time_ms, >@@ -309,15 +269,10 @@ struct PacketFeedback { > size_t payload_size, > uint16_t local_net_id, > uint16_t remote_net_id, >- const PacedPacketInfo& pacing_info) >- : creation_time_ms(creation_time_ms), >- arrival_time_ms(arrival_time_ms), >- send_time_ms(send_time_ms), >- sequence_number(sequence_number), >- payload_size(payload_size), >- local_net_id(local_net_id), >- 
remote_net_id(remote_net_id), >- pacing_info(pacing_info) {} >+ const PacedPacketInfo& pacing_info); >+ PacketFeedback(const PacketFeedback&); >+ PacketFeedback& operator=(const PacketFeedback&); >+ ~PacketFeedback(); > > static constexpr int kNotAProbe = -1; > static constexpr int64_t kNotReceived = -1; >@@ -328,12 +283,7 @@ struct PacketFeedback { > // for book-keeping, and is of no interest outside that class. > // TODO(philipel): Remove |creation_time_ms| from PacketFeedback when cleaning > // up SendTimeHistory. >- bool operator==(const PacketFeedback& rhs) const { >- return arrival_time_ms == rhs.arrival_time_ms && >- send_time_ms == rhs.send_time_ms && >- sequence_number == rhs.sequence_number && >- payload_size == rhs.payload_size && pacing_info == rhs.pacing_info; >- } >+ bool operator==(const PacketFeedback& rhs) const; > > // Time corresponding to when this object was created. > int64_t creation_time_ms; >@@ -352,6 +302,8 @@ struct PacketFeedback { > int64_t long_sequence_number; > // Size of the packet excluding RTP headers. > size_t payload_size; >+ // Size of preceeding packets that are not part of feedback. >+ size_t unacknowledged_data; > // The network route ids that this packet is associated with. > uint16_t local_net_id; > uint16_t remote_net_id; >@@ -489,13 +441,8 @@ struct RtpPacketCounter { > packets -= other.packets; > } > >- void AddPacket(size_t packet_length, const RTPHeader& header) { >- ++packets; >- header_bytes += header.headerLength; >- padding_bytes += header.paddingLength; >- payload_bytes += >- packet_length - (header.headerLength + header.paddingLength); >- } >+ // Not inlined, since use of RtpPacket would result in circular includes. 
>+ void AddPacket(const RtpPacket& packet); > > size_t TotalBytes() const { > return header_bytes + payload_bytes + padding_bytes; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/mocks/mock_rtp_rtcp.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/mocks/mock_rtp_rtcp.cc >index d24c1b094c51f998edab65d6def9e37937a8bfe3..061f82765cc75baa3ca04cdd43c64f1992dd5eb4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/mocks/mock_rtp_rtcp.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/mocks/mock_rtp_rtcp.cc >@@ -12,9 +12,6 @@ > > namespace webrtc { > >-MockRtpData::MockRtpData() = default; >-MockRtpData::~MockRtpData() = default; >- > MockRtpRtcp::MockRtpRtcp() = default; > MockRtpRtcp::~MockRtpRtcp() = default; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/mocks/mock_rtp_rtcp.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/mocks/mock_rtp_rtcp.h >index 6d5bde4788f4a6ae48ff6a8ddc084f766344fade..3b9b943c4539504130d26d55bc4ecf34a9ccb5ad 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/mocks/mock_rtp_rtcp.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/mocks/mock_rtp_rtcp.h >@@ -27,17 +27,6 @@ > > namespace webrtc { > >-class MockRtpData : public RtpData { >- public: >- MockRtpData(); >- ~MockRtpData(); >- >- MOCK_METHOD3(OnReceivedPayloadData, >- int32_t(const uint8_t* payload_data, >- size_t payload_size, >- const WebRtcRTPHeader* rtp_header)); >-}; >- > class MockRtpRtcp : public RtpRtcp { > public: > MockRtpRtcp(); >@@ -52,6 +41,7 @@ class MockRtpRtcp : public RtpRtcp { > MOCK_METHOD2(RegisterVideoSendPayload, > void(int payload_type, const char* payload_name)); > MOCK_METHOD1(DeRegisterSendPayload, int32_t(int8_t payload_type)); >+ MOCK_METHOD1(SetExtmapAllowMixed, void(bool extmap_allow_mixed)); > MOCK_METHOD2(RegisterSendRtpHeaderExtension, > int32_t(RTPExtensionType type, uint8_t id)); > 
MOCK_METHOD2(RegisterRtpHeaderExtension, >@@ -83,6 +73,7 @@ class MockRtpRtcp : public RtpRtcp { > MOCK_CONST_METHOD0(Sending, bool()); > MOCK_METHOD1(SetSendingMediaStatus, void(bool sending)); > MOCK_CONST_METHOD0(SendingMedia, bool()); >+ MOCK_METHOD1(SetAsPartOfAllocation, void(bool)); > MOCK_CONST_METHOD4(BitrateSent, > void(uint32_t* total_rate, > uint32_t* video_rate, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/contributing_sources.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/contributing_sources.cc >index 853706c4f5dc140909ee22c9e26c02d47fac1b94..64dc443e295f5c46d8a70a0d71df3ffcf5c83736 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/contributing_sources.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/contributing_sources.cc >@@ -25,9 +25,11 @@ ContributingSources::ContributingSources() = default; > ContributingSources::~ContributingSources() = default; > > void ContributingSources::Update(int64_t now_ms, >- rtc::ArrayView<const uint32_t> csrcs) { >+ rtc::ArrayView<const uint32_t> csrcs, >+ absl::optional<uint8_t> audio_level) { >+ Entry entry = { now_ms, audio_level }; > for (uint32_t csrc : csrcs) { >- last_seen_ms_[csrc] = now_ms; >+ active_csrcs_[csrc] = entry; > } > if (!next_pruning_ms_) { > next_pruning_ms_ = now_ms + kPruningIntervalMs; >@@ -43,9 +45,16 @@ void ContributingSources::Update(int64_t now_ms, > // non-const. 
> std::vector<RtpSource> ContributingSources::GetSources(int64_t now_ms) const { > std::vector<RtpSource> sources; >- for (auto& record : last_seen_ms_) { >- if (record.second >= now_ms - kHistoryMs) { >- sources.emplace_back(record.second, record.first, RtpSourceType::CSRC); >+ for (auto& record : active_csrcs_) { >+ if (record.second.last_seen_ms >= now_ms - kHistoryMs) { >+ if (record.second.audio_level.has_value()) { >+ sources.emplace_back(record.second.last_seen_ms, record.first, >+ RtpSourceType::CSRC, >+ *record.second.audio_level); >+ } else { >+ sources.emplace_back(record.second.last_seen_ms, record.first, >+ RtpSourceType::CSRC); >+ } > } > } > >@@ -54,15 +63,20 @@ std::vector<RtpSource> ContributingSources::GetSources(int64_t now_ms) const { > > // Delete stale entries. > void ContributingSources::DeleteOldEntries(int64_t now_ms) { >- for (auto it = last_seen_ms_.begin(); it != last_seen_ms_.end();) { >- if (it->second >= now_ms - kHistoryMs) { >+ for (auto it = active_csrcs_.begin(); it != active_csrcs_.end();) { >+ if (it->second.last_seen_ms >= now_ms - kHistoryMs) { > // Still relevant. 
> ++it; > } else { >- it = last_seen_ms_.erase(it); >+ it = active_csrcs_.erase(it); > } > } > next_pruning_ms_ = now_ms + kPruningIntervalMs; > } > >+ContributingSources::Entry::Entry() = default; >+ContributingSources::Entry::Entry(int64_t timestamp_ms, >+ absl::optional<uint8_t> audio_level_arg) >+ : last_seen_ms(timestamp_ms), audio_level(audio_level_arg) {} >+ > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/contributing_sources.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/contributing_sources.h >index 1a4a572d8f54431ba6a523e40f5d33af01f8fb22..5e34539ce477e0b7bedaf80d099a744735114e63 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/contributing_sources.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/contributing_sources.h >@@ -18,7 +18,8 @@ > > #include "absl/types/optional.h" > #include "api/array_view.h" >-#include "api/rtpreceiverinterface.h" >+#include "api/rtpreceiverinterface.h" // For RtpSource >+#include "rtc_base/timeutils.h" // For kNumMillisecsPerSec > > namespace webrtc { > >@@ -31,18 +32,25 @@ class ContributingSources { > ContributingSources(); > ~ContributingSources(); > >- // TODO(bugs.webrtc.org/3333): Needs to be extended with audio-level, to >- // support RFC6465. >- void Update(int64_t now_ms, rtc::ArrayView<const uint32_t> csrcs); >+ void Update(int64_t now_ms, rtc::ArrayView<const uint32_t> csrcs, >+ absl::optional<uint8_t> audio_level); > > // Returns contributing sources seen the last 10 s. > std::vector<RtpSource> GetSources(int64_t now_ms) const; > > private: >+ struct Entry { >+ Entry(); >+ Entry(int64_t timestamp_ms, absl::optional<uint8_t> audio_level); >+ >+ int64_t last_seen_ms; >+ absl::optional<uint8_t> audio_level; >+ }; >+ > void DeleteOldEntries(int64_t now_ms); > > // Indexed by csrc. 
>- std::map<uint32_t, int64_t> last_seen_ms_; >+ std::map<uint32_t, Entry> active_csrcs_; > absl::optional<int64_t> next_pruning_ms_; > }; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/contributing_sources_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/contributing_sources_unittest.cc >index 8b22d26c94804cf4422c0db61d65873c18e4bcd2..5f1d8d3d112042c8942415679bd7dbb92e6fc1f3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/contributing_sources_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/contributing_sources_unittest.cc >@@ -30,7 +30,7 @@ TEST(ContributingSourcesTest, RecordSources) { > ContributingSources csrcs; > constexpr uint32_t kCsrcs[] = {kCsrc1, kCsrc2}; > constexpr int64_t kTime1 = 10; >- csrcs.Update(kTime1, kCsrcs); >+ csrcs.Update(kTime1, kCsrcs, absl::nullopt); > EXPECT_THAT( > csrcs.GetSources(kTime1), > UnorderedElementsAre(RtpSource(kTime1, kCsrc1, RtpSourceType::CSRC), >@@ -45,12 +45,12 @@ TEST(ContributingSourcesTest, UpdateSources) { > constexpr uint32_t kCsrcs2[] = {kCsrc2, kCsrc3}; > constexpr int64_t kTime1 = 10; > constexpr int64_t kTime2 = kTime1 + 5 * rtc::kNumMillisecsPerSec; >- csrcs.Update(kTime1, kCsrcs1); >+ csrcs.Update(kTime1, kCsrcs1, absl::nullopt); > EXPECT_THAT( > csrcs.GetSources(kTime1), > UnorderedElementsAre(RtpSource(kTime1, kCsrc1, RtpSourceType::CSRC), > RtpSource(kTime1, kCsrc2, RtpSourceType::CSRC))); >- csrcs.Update(kTime2, kCsrcs2); >+ csrcs.Update(kTime2, kCsrcs2, absl::nullopt); > EXPECT_THAT( > csrcs.GetSources(kTime2), > UnorderedElementsAre(RtpSource(kTime1, kCsrc1, RtpSourceType::CSRC), >@@ -65,12 +65,12 @@ TEST(ContributingSourcesTest, ReturnRecentOnly) { > constexpr int64_t kTime1 = 10; > constexpr int64_t kTime2 = kTime1 + 5 * rtc::kNumMillisecsPerSec; > constexpr int64_t kTime3 = kTime1 + 12 * rtc::kNumMillisecsPerSec; >- csrcs.Update(kTime1, kCsrcs1); >+ 
csrcs.Update(kTime1, kCsrcs1, absl::nullopt); > EXPECT_THAT( > csrcs.GetSources(kTime1), > UnorderedElementsAre(RtpSource(kTime1, kCsrc1, RtpSourceType::CSRC), > RtpSource(kTime1, kCsrc2, RtpSourceType::CSRC))); >- csrcs.Update(kTime2, kCsrcs2); >+ csrcs.Update(kTime2, kCsrcs2, absl::nullopt); > EXPECT_THAT( > csrcs.GetSources(kTime3), > UnorderedElementsAre(RtpSource(kTime2, kCsrc2, RtpSourceType::CSRC), >@@ -84,18 +84,18 @@ TEST(ContributingSourcesTest, PurgeOldSources) { > constexpr int64_t kTime1 = 10; > constexpr int64_t kTime2 = kTime1 + 10 * rtc::kNumMillisecsPerSec; > constexpr int64_t kTime3 = kTime1 + 20 * rtc::kNumMillisecsPerSec; >- csrcs.Update(kTime1, kCsrcs1); >+ csrcs.Update(kTime1, kCsrcs1, absl::nullopt); > EXPECT_THAT( > csrcs.GetSources(kTime2), > UnorderedElementsAre(RtpSource(kTime1, kCsrc1, RtpSourceType::CSRC), > RtpSource(kTime1, kCsrc2, RtpSourceType::CSRC))); >- csrcs.Update(kTime2, kCsrcs2); >+ csrcs.Update(kTime2, kCsrcs2, absl::nullopt); > EXPECT_THAT( > csrcs.GetSources(kTime2), > UnorderedElementsAre(RtpSource(kTime1, kCsrc1, RtpSourceType::CSRC), > RtpSource(kTime2, kCsrc2, RtpSourceType::CSRC), > RtpSource(kTime2, kCsrc3, RtpSourceType::CSRC))); >- csrcs.Update(kTime3, kCsrcs2); >+ csrcs.Update(kTime3, kCsrcs2, absl::nullopt); > EXPECT_THAT( > csrcs.GetSources(kTime3), > UnorderedElementsAre(RtpSource(kTime3, kCsrc2, RtpSourceType::CSRC), >@@ -108,4 +108,22 @@ TEST(ContributingSourcesTest, PurgeOldSources) { > RtpSource(kTime3, kCsrc3, RtpSourceType::CSRC))); > } > >+TEST(ContributingSourcesTest, AudioLevel) { >+ ContributingSources csrcs; >+ constexpr uint32_t kCsrcs[] = {kCsrc1, kCsrc2}; >+ constexpr int64_t kTime1 = 10; >+ csrcs.Update(kTime1, kCsrcs, 47); >+ EXPECT_THAT( >+ csrcs.GetSources(kTime1), >+ UnorderedElementsAre(RtpSource(kTime1, kCsrc1, RtpSourceType::CSRC, 47), >+ RtpSource(kTime1, kCsrc2, RtpSourceType::CSRC, 47))); >+ >+ constexpr uint32_t kCsrcsSubset[] = {kCsrc1}; >+ csrcs.Update(kTime1 + 1, kCsrcsSubset, 
absl::nullopt); >+ EXPECT_THAT( >+ csrcs.GetSources(kTime1 + 1), >+ UnorderedElementsAre(RtpSource(kTime1 + 1, kCsrc1, RtpSourceType::CSRC), >+ RtpSource(kTime1, kCsrc2, RtpSourceType::CSRC, 47))); >+} >+ > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/dtmf_queue.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/dtmf_queue.cc >index 86ddb105f924e998f19dee296b0e19c0c5a184e1..10e674789a4ba0fcf6761b7fc33bb5a9d95a8f05 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/dtmf_queue.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/dtmf_queue.cc >@@ -10,6 +10,10 @@ > > #include "modules/rtp_rtcp/source/dtmf_queue.h" > >+#include <stddef.h> >+ >+#include "rtc_base/checks.h" >+ > namespace { > constexpr size_t kDtmfOutbandMax = 20; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/dtmf_queue.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/dtmf_queue.h >index db70c9750800286fa7e381f229d2e5193e406961..e5955a1297c44383da19c55d413885ad3fd21ff3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/dtmf_queue.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/dtmf_queue.h >@@ -11,6 +11,7 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_DTMF_QUEUE_H_ > #define MODULES_RTP_RTCP_SOURCE_DTMF_QUEUE_H_ > >+#include <stdint.h> > #include <list> > > #include "rtc_base/criticalsection.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/flexfec_header_reader_writer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/flexfec_header_reader_writer.cc >index d7666e1192a339cafb192516735309fac63b8add..b813340ea68d8555e4539687aec85f5fbf599553 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/flexfec_header_reader_writer.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/flexfec_header_reader_writer.cc >@@ -12,12 +12,11 @@ > > #include <string.h> > >-#include <utility> >- > #include "modules/rtp_rtcp/source/byte_io.h" > #include "modules/rtp_rtcp/source/forward_error_correction_internal.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" >+#include "rtc_base/scoped_ref_ptr.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/flexfec_header_reader_writer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/flexfec_header_reader_writer.h >index 1d6ddda4b40985f4b612166a451129db3796c930..d305c4c288de14fdb840b7acca2ead83cf1c228b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/flexfec_header_reader_writer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/flexfec_header_reader_writer.h >@@ -11,6 +11,9 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_FLEXFEC_HEADER_READER_WRITER_H_ > #define MODULES_RTP_RTCP_SOURCE_FLEXFEC_HEADER_READER_WRITER_H_ > >+#include <stddef.h> >+#include <stdint.h> >+ > #include "modules/rtp_rtcp/source/forward_error_correction.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/flexfec_receiver.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/flexfec_receiver.cc >index c3ed4d53b2a43ff9e6dec909cfde4b165951d3a9..1a62bced68a4d2f22fb1a6e230f728c512b4f010 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/flexfec_receiver.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/flexfec_receiver.cc >@@ -10,6 +10,9 @@ > > #include "modules/rtp_rtcp/include/flexfec_receiver.h" > >+#include <string.h> >+ >+#include "api/array_view.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" > #include "rtc_base/scoped_ref_ptr.h" >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/flexfec_sender.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/flexfec_sender.cc >index 286f47c5fe808d26749f5abf0fafae542ea1e1bf..1204b2d978c44cff1dbb709f51bc2ad88006debf 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/flexfec_sender.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/flexfec_sender.cc >@@ -10,11 +10,15 @@ > > #include "modules/rtp_rtcp/include/flexfec_sender.h" > >+#include <string.h> >+#include <list> > #include <utility> > > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > #include "modules/rtp_rtcp/source/forward_error_correction.h" > #include "modules/rtp_rtcp/source/rtp_header_extensions.h" >+#include "modules/rtp_rtcp/source/rtp_packet_to_send.h" >+#include "rtc_base/checks.h" > #include "rtc_base/logging.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/forward_error_correction.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/forward_error_correction.cc >index b743110b2d6024ef0150aa793487fcd5ea1edcbb..a52fecad18e70f3c372c8ce9d088d168f60496aa 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/forward_error_correction.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/forward_error_correction.cc >@@ -11,11 +11,10 @@ > #include "modules/rtp_rtcp/source/forward_error_correction.h" > > #include <string.h> >- > #include <algorithm> >-#include <iterator> > #include <utility> > >+#include "modules/include/module_common_types_public.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > #include "modules/rtp_rtcp/source/byte_io.h" > #include "modules/rtp_rtcp/source/flexfec_header_reader_writer.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/forward_error_correction.h 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/forward_error_correction.h >index 819f6bcec56a16ea7078b45fcbadd303f01fa364..adb7572ae670a964eef5c0fb1109afcdbd310d14 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/forward_error_correction.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/forward_error_correction.h >@@ -11,16 +11,15 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_FORWARD_ERROR_CORRECTION_H_ > #define MODULES_RTP_RTCP_SOURCE_FORWARD_ERROR_CORRECTION_H_ > >+#include <stddef.h> > #include <stdint.h> >- > #include <list> > #include <memory> > #include <vector> > >+#include "modules/include/module_fec_types.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > #include "modules/rtp_rtcp/source/forward_error_correction_internal.h" >-#include "rtc_base/constructormagic.h" >-#include "rtc_base/refcount.h" > #include "rtc_base/scoped_ref_ptr.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/forward_error_correction_internal.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/forward_error_correction_internal.cc >index 7e5fd91e035644e01b6d193e24119ba38a680387..9b02026a77c505f38f8677c55e586bbcef8c8b9c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/forward_error_correction_internal.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/forward_error_correction_internal.cc >@@ -10,6 +10,7 @@ > > #include "modules/rtp_rtcp/source/forward_error_correction_internal.h" > >+#include <string.h> > #include <algorithm> > > #include "modules/rtp_rtcp/source/fec_private_tables_bursty.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/forward_error_correction_internal.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/forward_error_correction_internal.h >index 
2e8a202113af94867b1b1a5868d60be5f0aec30c..ed93f520e526fc788bc62eee3cd7e2e04c144bbc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/forward_error_correction_internal.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/forward_error_correction_internal.h >@@ -11,9 +11,11 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_FORWARD_ERROR_CORRECTION_INTERNAL_H_ > #define MODULES_RTP_RTCP_SOURCE_FORWARD_ERROR_CORRECTION_INTERNAL_H_ > >-#include "modules/include/module_common_types.h" >+#include <stddef.h> >+#include <stdint.h> > > #include "api/array_view.h" >+#include "modules/include/module_fec_types.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/packet_loss_stats.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/packet_loss_stats.cc >index 076348d2da3e0893e1b05490cf60da8279d784d6..36f0a63d596657be643391fdd4bde0dc8c0efdd7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/packet_loss_stats.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/packet_loss_stats.cc >@@ -10,6 +10,8 @@ > > #include "modules/rtp_rtcp/source/packet_loss_stats.h" > >+#include <cstdint> >+#include <iterator> > #include <vector> > > #include "rtc_base/checks.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/playout_delay_oracle.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/playout_delay_oracle.cc >index d3a75dd13aa34a99f1dfc900c07f4423233806f5..dc33fad536ded3d067f752500209d129b3d28437 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/playout_delay_oracle.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/playout_delay_oracle.cc >@@ -13,7 +13,6 @@ > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > #include "modules/rtp_rtcp/source/rtp_header_extensions.h" > #include "rtc_base/checks.h" >-#include 
"rtc_base/logging.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/playout_delay_oracle.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/playout_delay_oracle.h >index 6e6e253c8c76e56d92d8ab5ae1c263f25ff83681..0e3bd39a7ef5ee1ae85dff256abbb7943268ee25 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/playout_delay_oracle.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/playout_delay_oracle.h >@@ -13,8 +13,10 @@ > > #include <stdint.h> > >-#include "modules/include/module_common_types.h" >+#include "common_types.h" // NOLINT(build/include) >+#include "modules/include/module_common_types_public.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" >+#include "rtc_base/constructormagic.h" > #include "rtc_base/criticalsection.h" > #include "rtc_base/thread_annotations.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/receive_statistics_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/receive_statistics_impl.cc >index abe028d670237a8921a63d03e23e00bf7da709f9..bc742d1ea63ff7bd4794cceac3ddbd833d12c499 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/receive_statistics_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/receive_statistics_impl.cc >@@ -11,10 +11,11 @@ > #include "modules/rtp_rtcp/source/receive_statistics_impl.h" > > #include <math.h> >- > #include <cstdlib> >+#include <memory> > #include <vector> > >+#include "absl/memory/memory.h" > #include "modules/remote_bitrate_estimator/test/bwe_test_logging.h" > #include "modules/rtp_rtcp/source/rtp_packet_received.h" > #include "modules/rtp_rtcp/source/rtp_rtcp_config.h" >@@ -33,13 +34,14 @@ StreamStatisticianImpl::StreamStatisticianImpl( > uint32_t ssrc, > Clock* clock, > bool enable_retransmit_detection, >+ int max_reordering_threshold, > 
RtcpStatisticsCallback* rtcp_callback, > StreamDataCountersCallback* rtp_callback) > : ssrc_(ssrc), > clock_(clock), > incoming_bitrate_(kStatisticsProcessIntervalMs, > RateStatistics::kBpsScale), >- max_reordering_threshold_(kDefaultMaxReorderingThreshold), >+ max_reordering_threshold_(max_reordering_threshold), > enable_retransmit_detection_(enable_retransmit_detection), > jitter_q4_(0), > cumulative_loss_(0), >@@ -48,7 +50,6 @@ StreamStatisticianImpl::StreamStatisticianImpl( > received_seq_first_(0), > received_seq_max_(0), > received_seq_wraps_(0), >- received_packet_overhead_(12), > last_report_inorder_packets_(0), > last_report_old_packets_(0), > last_report_seq_max_(0), >@@ -57,79 +58,71 @@ StreamStatisticianImpl::StreamStatisticianImpl( > > StreamStatisticianImpl::~StreamStatisticianImpl() = default; > >-void StreamStatisticianImpl::IncomingPacket(const RTPHeader& header, >- size_t packet_length) { >- StreamDataCounters counters; >- { >- rtc::CritScope cs(&stream_lock_); >- >- bool retransmitted = >- enable_retransmit_detection_ && IsRetransmitOfOldPacket(header); >- counters = UpdateCounters(header, packet_length, retransmitted); >- } >- rtp_callback_->DataCountersUpdated(counters, ssrc_); >+void StreamStatisticianImpl::OnRtpPacket(const RtpPacketReceived& packet) { >+ StreamDataCounters counters = UpdateCounters(packet); >+ if (rtp_callback_) >+ rtp_callback_->DataCountersUpdated(counters, ssrc_); > } > > StreamDataCounters StreamStatisticianImpl::UpdateCounters( >- const RTPHeader& header, >- size_t packet_length, >- bool retransmitted) { >- bool in_order = InOrderPacketInternal(header.sequenceNumber); >- RTC_DCHECK_EQ(ssrc_, header.ssrc); >- incoming_bitrate_.Update(packet_length, clock_->TimeInMilliseconds()); >- receive_counters_.transmitted.AddPacket(packet_length, header); >- if (!in_order && retransmitted) { >- receive_counters_.retransmitted.AddPacket(packet_length, header); >+ const RtpPacketReceived& packet) { >+ rtc::CritScope 
cs(&stream_lock_); >+ RTC_DCHECK_EQ(ssrc_, packet.Ssrc()); >+ uint16_t sequence_number = packet.SequenceNumber(); >+ bool in_order = >+ // First packet is always in order. >+ last_receive_time_ms_ == 0 || >+ IsNewerSequenceNumber(sequence_number, received_seq_max_) || >+ // If we have a restart of the remote side this packet is still in order. >+ !IsNewerSequenceNumber(sequence_number, >+ received_seq_max_ - max_reordering_threshold_); >+ int64_t now_ms = clock_->TimeInMilliseconds(); >+ >+ incoming_bitrate_.Update(packet.size(), now_ms); >+ receive_counters_.transmitted.AddPacket(packet); >+ if (!in_order && enable_retransmit_detection_ && >+ IsRetransmitOfOldPacket(packet, now_ms)) { >+ receive_counters_.retransmitted.AddPacket(packet); > } > > if (receive_counters_.transmitted.packets == 1) { >- received_seq_first_ = header.sequenceNumber; >- receive_counters_.first_packet_time_ms = clock_->TimeInMilliseconds(); >+ received_seq_first_ = packet.SequenceNumber(); >+ receive_counters_.first_packet_time_ms = now_ms; > } > > // Count only the new packets received. That is, if packets 1, 2, 3, 5, 4, 6 > // are received, 4 will be ignored. > if (in_order) { >- // Current time in samples. >- NtpTime receive_time = clock_->CurrentNtpTime(); >- > // Wrong if we use RetransmitOfOldPacket. > if (receive_counters_.transmitted.packets > 1 && >- received_seq_max_ > header.sequenceNumber) { >+ received_seq_max_ > packet.SequenceNumber()) { > // Wrap around detected. > received_seq_wraps_++; > } > // New max. >- received_seq_max_ = header.sequenceNumber; >+ received_seq_max_ = packet.SequenceNumber(); > > // If new time stamp and more than one in-order packet received, calculate > // new jitter statistics. 
>- if (header.timestamp != last_received_timestamp_ && >+ if (packet.Timestamp() != last_received_timestamp_ && > (receive_counters_.transmitted.packets - > receive_counters_.retransmitted.packets) > 1) { >- UpdateJitter(header, receive_time); >+ UpdateJitter(packet, now_ms); > } >- last_received_timestamp_ = header.timestamp; >- last_receive_time_ntp_ = receive_time; >- last_receive_time_ms_ = clock_->TimeInMilliseconds(); >+ last_received_timestamp_ = packet.Timestamp(); >+ last_receive_time_ms_ = now_ms; > } >- >- size_t packet_oh = header.headerLength + header.paddingLength; >- >- // Our measured overhead. Filter from RFC 5104 4.2.1.2: >- // avg_OH (new) = 15/16*avg_OH (old) + 1/16*pckt_OH, >- received_packet_overhead_ = (15 * received_packet_overhead_ + packet_oh) >> 4; > return receive_counters_; > } > >-void StreamStatisticianImpl::UpdateJitter(const RTPHeader& header, >- NtpTime receive_time) { >- uint32_t receive_time_rtp = >- NtpToRtp(receive_time, header.payload_type_frequency); >- uint32_t last_receive_time_rtp = >- NtpToRtp(last_receive_time_ntp_, header.payload_type_frequency); >- int32_t time_diff_samples = (receive_time_rtp - last_receive_time_rtp) - >- (header.timestamp - last_received_timestamp_); >+void StreamStatisticianImpl::UpdateJitter(const RtpPacketReceived& packet, >+ int64_t receive_time_ms) { >+ int64_t receive_diff_ms = receive_time_ms - last_receive_time_ms_; >+ RTC_DCHECK_GE(receive_diff_ms, 0); >+ uint32_t receive_diff_rtp = static_cast<uint32_t>( >+ (receive_diff_ms * packet.payload_type_frequency()) / 1000); >+ int32_t time_diff_samples = >+ receive_diff_rtp - (packet.Timestamp() - last_received_timestamp_); > > time_diff_samples = std::abs(time_diff_samples); > >@@ -143,15 +136,16 @@ void StreamStatisticianImpl::UpdateJitter(const RTPHeader& header, > } > } > >-void StreamStatisticianImpl::FecPacketReceived(const RTPHeader& header, >- size_t packet_length) { >+void StreamStatisticianImpl::FecPacketReceived( >+ const 
RtpPacketReceived& packet) { > StreamDataCounters counters; > { > rtc::CritScope cs(&stream_lock_); >- receive_counters_.fec.AddPacket(packet_length, header); >+ receive_counters_.fec.AddPacket(packet); > counters = receive_counters_; > } >- rtp_callback_->DataCountersUpdated(counters, ssrc_); >+ if (rtp_callback_) >+ rtp_callback_->DataCountersUpdated(counters, ssrc_); > } > > void StreamStatisticianImpl::SetMaxReorderingThreshold( >@@ -188,7 +182,8 @@ bool StreamStatisticianImpl::GetStatistics(RtcpStatistics* statistics, > *statistics = CalculateRtcpStatistics(); > } > >- rtcp_callback_->StatisticsUpdated(*statistics, ssrc_); >+ if (rtcp_callback_) >+ rtcp_callback_->StatisticsUpdated(*statistics, ssrc_); > return true; > } > >@@ -196,7 +191,7 @@ bool StreamStatisticianImpl::GetActiveStatisticsAndReset( > RtcpStatistics* statistics) { > { > rtc::CritScope cs(&stream_lock_); >- if (clock_->CurrentNtpInMilliseconds() - last_receive_time_ntp_.ToMs() >= >+ if (clock_->TimeInMilliseconds() - last_receive_time_ms_ >= > kStatisticsTimeoutMs) { > // Not active. 
> return false; >@@ -210,7 +205,8 @@ bool StreamStatisticianImpl::GetActiveStatisticsAndReset( > *statistics = CalculateRtcpStatistics(); > } > >- rtcp_callback_->StatisticsUpdated(*statistics, ssrc_); >+ if (rtcp_callback_) >+ rtcp_callback_->StatisticsUpdated(*statistics, ssrc_); > return true; > } > >@@ -312,17 +308,15 @@ uint32_t StreamStatisticianImpl::BitrateReceived() const { > } > > bool StreamStatisticianImpl::IsRetransmitOfOldPacket( >- const RTPHeader& header) const { >- if (InOrderPacketInternal(header.sequenceNumber)) { >- return false; >- } >- uint32_t frequency_khz = header.payload_type_frequency / 1000; >- assert(frequency_khz > 0); >+ const RtpPacketReceived& packet, >+ int64_t now_ms) const { >+ uint32_t frequency_khz = packet.payload_type_frequency() / 1000; >+ RTC_DCHECK_GT(frequency_khz, 0); > >- int64_t time_diff_ms = clock_->TimeInMilliseconds() - last_receive_time_ms_; >+ int64_t time_diff_ms = now_ms - last_receive_time_ms_; > > // Diff in time stamp since last received in order. >- uint32_t timestamp_diff = header.timestamp - last_received_timestamp_; >+ uint32_t timestamp_diff = packet.Timestamp() - last_received_timestamp_; > uint32_t rtp_time_stamp_diff_ms = timestamp_diff / frequency_khz; > > int64_t max_delay_ms = 0; >@@ -341,30 +335,23 @@ bool StreamStatisticianImpl::IsRetransmitOfOldPacket( > return time_diff_ms > rtp_time_stamp_diff_ms + max_delay_ms; > } > >-bool StreamStatisticianImpl::InOrderPacketInternal( >- uint16_t sequence_number) const { >- // First packet is always in order. >- if (last_receive_time_ms_ == 0) >- return true; >- >- if (IsNewerSequenceNumber(sequence_number, received_seq_max_)) { >- return true; >- } else { >- // If we have a restart of the remote side this packet is still in order. 
>- return !IsNewerSequenceNumber( >- sequence_number, received_seq_max_ - max_reordering_threshold_); >- } >-} >- >-ReceiveStatistics* ReceiveStatistics::Create(Clock* clock) { >- return new ReceiveStatisticsImpl(clock); >+std::unique_ptr<ReceiveStatistics> ReceiveStatistics::Create( >+ Clock* clock, >+ RtcpStatisticsCallback* rtcp_callback, >+ StreamDataCountersCallback* rtp_callback) { >+ return absl::make_unique<ReceiveStatisticsImpl>(clock, rtcp_callback, >+ rtp_callback); > } > >-ReceiveStatisticsImpl::ReceiveStatisticsImpl(Clock* clock) >+ReceiveStatisticsImpl::ReceiveStatisticsImpl( >+ Clock* clock, >+ RtcpStatisticsCallback* rtcp_callback, >+ StreamDataCountersCallback* rtp_callback) > : clock_(clock), > last_returned_ssrc_(0), >- rtcp_stats_callback_(NULL), >- rtp_stats_callback_(NULL) {} >+ max_reordering_threshold_(kDefaultMaxReorderingThreshold), >+ rtcp_stats_callback_(rtcp_callback), >+ rtp_stats_callback_(rtp_callback) {} > > ReceiveStatisticsImpl::~ReceiveStatisticsImpl() { > while (!statisticians_.empty()) { >@@ -374,31 +361,24 @@ ReceiveStatisticsImpl::~ReceiveStatisticsImpl() { > } > > void ReceiveStatisticsImpl::OnRtpPacket(const RtpPacketReceived& packet) { >- RTPHeader header; >- packet.GetHeader(&header); >- IncomingPacket(header, packet.size()); >-} >- >-void ReceiveStatisticsImpl::IncomingPacket(const RTPHeader& header, >- size_t packet_length) { > StreamStatisticianImpl* impl; > { > rtc::CritScope cs(&receive_statistics_lock_); >- auto it = statisticians_.find(header.ssrc); >+ auto it = statisticians_.find(packet.Ssrc()); > if (it != statisticians_.end()) { > impl = it->second; > } else { > impl = new StreamStatisticianImpl( >- header.ssrc, clock_, /* enable_retransmit_detection = */ false, this, >- this); >- statisticians_[header.ssrc] = impl; >+ packet.Ssrc(), clock_, /* enable_retransmit_detection = */ false, >+ max_reordering_threshold_, rtcp_stats_callback_, rtp_stats_callback_); >+ statisticians_[packet.Ssrc()] = impl; > } > } > // 
StreamStatisticianImpl instance is created once and only destroyed when > // this whole ReceiveStatisticsImpl is destroyed. StreamStatisticianImpl has > // it's own locking so don't hold receive_statistics_lock_ (potential > // deadlock). >- impl->IncomingPacket(header, packet_length); >+ impl->OnRtpPacket(packet); > } > > void ReceiveStatisticsImpl::FecPacketReceived(const RtpPacketReceived& packet) { >@@ -411,9 +391,7 @@ void ReceiveStatisticsImpl::FecPacketReceived(const RtpPacketReceived& packet) { > return; > impl = it->second; > } >- RTPHeader header; >- packet.GetHeader(&header); >- impl->FecPacketReceived(header, packet.size()); >+ impl->FecPacketReceived(packet); > } > > StreamStatistician* ReceiveStatisticsImpl::GetStatistician( >@@ -427,8 +405,13 @@ StreamStatistician* ReceiveStatisticsImpl::GetStatistician( > > void ReceiveStatisticsImpl::SetMaxReorderingThreshold( > int max_reordering_threshold) { >- rtc::CritScope cs(&receive_statistics_lock_); >- for (auto& statistician : statisticians_) { >+ std::map<uint32_t, StreamStatisticianImpl*> statisticians; >+ { >+ rtc::CritScope cs(&receive_statistics_lock_); >+ max_reordering_threshold_ = max_reordering_threshold; >+ statisticians = statisticians_; >+ } >+ for (auto& statistician : statisticians) { > statistician.second->SetMaxReorderingThreshold(max_reordering_threshold); > } > } >@@ -440,7 +423,9 @@ void ReceiveStatisticsImpl::EnableRetransmitDetection(uint32_t ssrc, > rtc::CritScope cs(&receive_statistics_lock_); > StreamStatisticianImpl*& impl_ref = statisticians_[ssrc]; > if (impl_ref == nullptr) { // new element >- impl_ref = new StreamStatisticianImpl(ssrc, clock_, enable, this, this); >+ impl_ref = new StreamStatisticianImpl( >+ ssrc, clock_, enable, max_reordering_threshold_, rtcp_stats_callback_, >+ rtp_stats_callback_); > return; > } > impl = impl_ref; >@@ -448,43 +433,6 @@ void ReceiveStatisticsImpl::EnableRetransmitDetection(uint32_t ssrc, > impl->EnableRetransmitDetection(enable); > } > 
>-void ReceiveStatisticsImpl::RegisterRtcpStatisticsCallback( >- RtcpStatisticsCallback* callback) { >- rtc::CritScope cs(&receive_statistics_lock_); >- if (callback != NULL) >- assert(rtcp_stats_callback_ == NULL); >- rtcp_stats_callback_ = callback; >-} >- >-void ReceiveStatisticsImpl::StatisticsUpdated(const RtcpStatistics& statistics, >- uint32_t ssrc) { >- rtc::CritScope cs(&receive_statistics_lock_); >- if (rtcp_stats_callback_) >- rtcp_stats_callback_->StatisticsUpdated(statistics, ssrc); >-} >- >-void ReceiveStatisticsImpl::CNameChanged(const char* cname, uint32_t ssrc) { >- rtc::CritScope cs(&receive_statistics_lock_); >- if (rtcp_stats_callback_) >- rtcp_stats_callback_->CNameChanged(cname, ssrc); >-} >- >-void ReceiveStatisticsImpl::RegisterRtpStatisticsCallback( >- StreamDataCountersCallback* callback) { >- rtc::CritScope cs(&receive_statistics_lock_); >- if (callback != NULL) >- assert(rtp_stats_callback_ == NULL); >- rtp_stats_callback_ = callback; >-} >- >-void ReceiveStatisticsImpl::DataCountersUpdated(const StreamDataCounters& stats, >- uint32_t ssrc) { >- rtc::CritScope cs(&receive_statistics_lock_); >- if (rtp_stats_callback_) { >- rtp_stats_callback_->DataCountersUpdated(stats, ssrc); >- } >-} >- > std::vector<rtcp::ReportBlock> ReceiveStatisticsImpl::RtcpReportBlocks( > size_t max_blocks) { > std::map<uint32_t, StreamStatisticianImpl*> statisticians; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/receive_statistics_impl.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/receive_statistics_impl.h >index 56bfd2b4a405ce6d1786566bdd69a1b9c035010d..8153c4400d40bcf595fdb820db829a5aad72fd9c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/receive_statistics_impl.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/receive_statistics_impl.h >@@ -19,15 +19,17 @@ > > #include "rtc_base/criticalsection.h" > #include "rtc_base/rate_statistics.h" 
>-#include "system_wrappers/include/ntp_time.h" >+#include "rtc_base/thread_annotations.h" > > namespace webrtc { > >-class StreamStatisticianImpl : public StreamStatistician { >+class StreamStatisticianImpl : public StreamStatistician, >+ public RtpPacketSinkInterface { > public: > StreamStatisticianImpl(uint32_t ssrc, > Clock* clock, > bool enable_retransmit_detection, >+ int max_reordering_threshold, > RtcpStatisticsCallback* rtcp_callback, > StreamDataCountersCallback* rtp_callback); > ~StreamStatisticianImpl() override; >@@ -41,24 +43,22 @@ class StreamStatisticianImpl : public StreamStatistician { > StreamDataCounters* data_counters) const override; > uint32_t BitrateReceived() const override; > >- void IncomingPacket(const RTPHeader& rtp_header, size_t packet_length); >- void FecPacketReceived(const RTPHeader& header, size_t packet_length); >+ // Implements RtpPacketSinkInterface >+ void OnRtpPacket(const RtpPacketReceived& packet) override; >+ >+ void FecPacketReceived(const RtpPacketReceived& packet); > void SetMaxReorderingThreshold(int max_reordering_threshold); > void EnableRetransmitDetection(bool enable); > > private: >- bool IsRetransmitOfOldPacket(const RTPHeader& header) const >- RTC_EXCLUSIVE_LOCKS_REQUIRED(stream_lock_); >- bool InOrderPacketInternal(uint16_t sequence_number) const >+ bool IsRetransmitOfOldPacket(const RtpPacketReceived& packet, >+ int64_t now_ms) const > RTC_EXCLUSIVE_LOCKS_REQUIRED(stream_lock_); > RtcpStatistics CalculateRtcpStatistics() > RTC_EXCLUSIVE_LOCKS_REQUIRED(stream_lock_); >- void UpdateJitter(const RTPHeader& header, NtpTime receive_time) >- RTC_EXCLUSIVE_LOCKS_REQUIRED(stream_lock_); >- StreamDataCounters UpdateCounters(const RTPHeader& rtp_header, >- size_t packet_length, >- bool retransmitted) >+ void UpdateJitter(const RtpPacketReceived& packet, int64_t receive_time_ms) > RTC_EXCLUSIVE_LOCKS_REQUIRED(stream_lock_); >+ StreamDataCounters UpdateCounters(const RtpPacketReceived& packet); > > const uint32_t ssrc_; > 
Clock* const clock_; >@@ -73,14 +73,12 @@ class StreamStatisticianImpl : public StreamStatistician { > uint32_t cumulative_loss_ RTC_GUARDED_BY(&stream_lock_); > > int64_t last_receive_time_ms_ RTC_GUARDED_BY(&stream_lock_); >- NtpTime last_receive_time_ntp_ RTC_GUARDED_BY(&stream_lock_); > uint32_t last_received_timestamp_ RTC_GUARDED_BY(&stream_lock_); > uint16_t received_seq_first_ RTC_GUARDED_BY(&stream_lock_); > uint16_t received_seq_max_ RTC_GUARDED_BY(&stream_lock_); > uint16_t received_seq_wraps_ RTC_GUARDED_BY(&stream_lock_); > > // Current counter values. >- size_t received_packet_overhead_ RTC_GUARDED_BY(&stream_lock_); > StreamDataCounters receive_counters_ RTC_GUARDED_BY(&stream_lock_); > > // Counter values when we sent the last report. >@@ -94,47 +92,36 @@ class StreamStatisticianImpl : public StreamStatistician { > StreamDataCountersCallback* const rtp_callback_; > }; > >-class ReceiveStatisticsImpl : public ReceiveStatistics, >- public RtcpStatisticsCallback, >- public StreamDataCountersCallback { >+class ReceiveStatisticsImpl : public ReceiveStatistics { > public: >- explicit ReceiveStatisticsImpl(Clock* clock); >+ ReceiveStatisticsImpl(Clock* clock, >+ RtcpStatisticsCallback* rtcp_callback, >+ StreamDataCountersCallback* rtp_callback); > > ~ReceiveStatisticsImpl() override; > >- // Implement ReceiveStatisticsProvider. >+ // Implements ReceiveStatisticsProvider. > std::vector<rtcp::ReportBlock> RtcpReportBlocks(size_t max_blocks) override; > >- // Implement RtpPacketSinkInterface >+ // Implements RtpPacketSinkInterface > void OnRtpPacket(const RtpPacketReceived& packet) override; > >- // Implement ReceiveStatistics. >- void IncomingPacket(const RTPHeader& header, size_t packet_length) override; >+ // Implements ReceiveStatistics. 
> void FecPacketReceived(const RtpPacketReceived& packet) override; > StreamStatistician* GetStatistician(uint32_t ssrc) const override; > void SetMaxReorderingThreshold(int max_reordering_threshold) override; > void EnableRetransmitDetection(uint32_t ssrc, bool enable) override; > >- void RegisterRtcpStatisticsCallback( >- RtcpStatisticsCallback* callback) override; >- >- void RegisterRtpStatisticsCallback( >- StreamDataCountersCallback* callback) override; >- > private: >- void StatisticsUpdated(const RtcpStatistics& statistics, >- uint32_t ssrc) override; >- void CNameChanged(const char* cname, uint32_t ssrc) override; >- void DataCountersUpdated(const StreamDataCounters& counters, >- uint32_t ssrc) override; >- > Clock* const clock_; > rtc::CriticalSection receive_statistics_lock_; > uint32_t last_returned_ssrc_; >- std::map<uint32_t, StreamStatisticianImpl*> statisticians_; >+ int max_reordering_threshold_ RTC_GUARDED_BY(receive_statistics_lock_); >+ std::map<uint32_t, StreamStatisticianImpl*> statisticians_ >+ RTC_GUARDED_BY(receive_statistics_lock_); > >- RtcpStatisticsCallback* rtcp_stats_callback_; >- StreamDataCountersCallback* rtp_stats_callback_; >+ RtcpStatisticsCallback* const rtcp_stats_callback_; >+ StreamDataCountersCallback* const rtp_stats_callback_; > }; > } // namespace webrtc > #endif // MODULES_RTP_RTCP_SOURCE_RECEIVE_STATISTICS_IMPL_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/receive_statistics_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/receive_statistics_unittest.cc >index 18448cb77ec6e5551b92d81d9dea2d55aabff8e7..25393631df9b89bce1d719791b3db06d848265d5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/receive_statistics_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/receive_statistics_unittest.cc >@@ -48,10 +48,7 @@ RtpPacketReceived CreateRtpPacket(uint32_t ssrc, > packet.SetCsrcs(csrcs); > } > 
packet.SetPayloadSize(payload_size); >- if (padding_size > 0) { >- Random random(17); >- packet.SetPadding(padding_size, &random); >- } >+ packet.SetPadding(padding_size); > return packet; > } > >@@ -74,7 +71,9 @@ void IncrementTimestamp(RtpPacketReceived* packet, uint32_t incr) { > class ReceiveStatisticsTest : public ::testing::Test { > public: > ReceiveStatisticsTest() >- : clock_(0), receive_statistics_(ReceiveStatistics::Create(&clock_)) { >+ : clock_(0), >+ receive_statistics_( >+ ReceiveStatistics::Create(&clock_, nullptr, nullptr)) { > packet1_ = CreateRtpPacket(kSsrc1, kPacketSize1); > packet2_ = CreateRtpPacket(kSsrc2, kPacketSize2); > } >@@ -254,7 +253,7 @@ TEST_F(ReceiveStatisticsTest, RtcpCallbacks) { > RtcpStatistics stats_; > } callback; > >- receive_statistics_->RegisterRtcpStatisticsCallback(&callback); >+ receive_statistics_ = ReceiveStatistics::Create(&clock_, &callback, nullptr); > receive_statistics_->EnableRetransmitDetection(kSsrc1, true); > > // Add some arbitrary data, with loss and jitter. >@@ -294,33 +293,6 @@ TEST_F(ReceiveStatisticsTest, RtcpCallbacks) { > EXPECT_EQ(1, statistics.packets_lost); > EXPECT_EQ(5u, statistics.extended_highest_sequence_number); > EXPECT_EQ(177u, statistics.jitter); >- >- receive_statistics_->RegisterRtcpStatisticsCallback(NULL); >- >- // Add some more data. 
>- packet1_.SetSequenceNumber(1); >- clock_.AdvanceTimeMilliseconds(7); >- IncrementTimestamp(&packet1_, 3); >- receive_statistics_->OnRtpPacket(packet1_); >- IncrementSequenceNumber(&packet1_, 2); >- clock_.AdvanceTimeMilliseconds(9); >- IncrementTimestamp(&packet1_, 9); >- receive_statistics_->OnRtpPacket(packet1_); >- IncrementSequenceNumber(&packet1_, -1); >- clock_.AdvanceTimeMilliseconds(13); >- IncrementTimestamp(&packet1_, 47); >- receive_statistics_->OnRtpPacket(packet1_); >- IncrementSequenceNumber(&packet1_, 3); >- clock_.AdvanceTimeMilliseconds(11); >- IncrementTimestamp(&packet1_, 17); >- receive_statistics_->OnRtpPacket(packet1_); >- IncrementSequenceNumber(&packet1_); >- >- receive_statistics_->GetStatistician(kSsrc1)->GetStatistics(&statistics, >- true); >- >- // Should not have been called after deregister. >- EXPECT_EQ(1u, callback.num_calls_); > } > > class RtpTestCallback : public StreamDataCountersCallback { >@@ -361,7 +333,7 @@ class RtpTestCallback : public StreamDataCountersCallback { > > TEST_F(ReceiveStatisticsTest, RtpCallbacks) { > RtpTestCallback callback; >- receive_statistics_->RegisterRtpStatisticsCallback(&callback); >+ receive_statistics_ = ReceiveStatistics::Create(&clock_, nullptr, &callback); > receive_statistics_->EnableRetransmitDetection(kSsrc1, true); > > const size_t kHeaderLength = 20; >@@ -420,19 +392,11 @@ TEST_F(ReceiveStatisticsTest, RtpCallbacks) { > expected.fec.header_bytes = kHeaderLength; > expected.fec.packets = 1; > callback.Matches(5, kSsrc1, expected); >- >- receive_statistics_->RegisterRtpStatisticsCallback(NULL); >- >- // New stats, but callback should not be called. 
>- IncrementSequenceNumber(&packet1); >- clock_.AdvanceTimeMilliseconds(5); >- receive_statistics_->OnRtpPacket(packet1); >- callback.Matches(5, kSsrc1, expected); > } > > TEST_F(ReceiveStatisticsTest, RtpCallbacksFecFirst) { > RtpTestCallback callback; >- receive_statistics_->RegisterRtpStatisticsCallback(&callback); >+ receive_statistics_ = ReceiveStatistics::Create(&clock_, nullptr, &callback); > > const uint32_t kHeaderLength = 20; > RtpPacketReceived packet = >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/remote_ntp_time_estimator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/remote_ntp_time_estimator.cc >index fc867a491a3749896620fcf87630beaf1f18208e..fd19b1383a44e3c8e9faeb6837b3dbd3f208e201 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/remote_ntp_time_estimator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/remote_ntp_time_estimator.cc >@@ -10,8 +10,9 @@ > > #include "modules/rtp_rtcp/include/remote_ntp_time_estimator.h" > >+#include <cstdint> >+ > #include "rtc_base/logging.h" >-#include "rtc_base/time/timestamp_extrapolator.h" > #include "system_wrappers/include/clock.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/remote_ntp_time_estimator_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/remote_ntp_time_estimator_unittest.cc >index 5254cd57bb933b410894d12fac7715eaf91d2f0f..b301461427b697c95049619b0ba5b7da8c3e8882 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/remote_ntp_time_estimator_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/remote_ntp_time_estimator_unittest.cc >@@ -56,7 +56,8 @@ class RemoteNtpTimeEstimatorTest : public ::testing::Test { > int64_t networking_delay_ms) { > uint32_t rtcp_timestamp = GetRemoteTimestamp(); > int64_t ntp_error_fractions = >- ntp_error_ms * 
NtpTime::kFractionsPerSecond / 1000; >+ ntp_error_ms * static_cast<int64_t>(NtpTime::kFractionsPerSecond) / >+ 1000; > NtpTime ntp(static_cast<uint64_t>(remote_clock_.CurrentNtpTime()) + > ntp_error_fractions); > AdvanceTimeMilliseconds(kTestRtt / 2 + networking_delay_ms); >@@ -110,7 +111,17 @@ TEST_F(RemoteNtpTimeEstimatorTest, Estimate) { > } > > TEST_F(RemoteNtpTimeEstimatorTest, AveragesErrorsOut) { >- // Remote peer sends first 5 RTCP SR without errors. >+ // Remote peer sends first 10 RTCP SR without errors. >+ AdvanceTimeMilliseconds(1000); >+ SendRtcpSr(); >+ AdvanceTimeMilliseconds(1000); >+ SendRtcpSr(); >+ AdvanceTimeMilliseconds(1000); >+ SendRtcpSr(); >+ AdvanceTimeMilliseconds(1000); >+ SendRtcpSr(); >+ AdvanceTimeMilliseconds(1000); >+ SendRtcpSr(); > AdvanceTimeMilliseconds(1000); > SendRtcpSr(); > AdvanceTimeMilliseconds(1000); >@@ -122,18 +133,17 @@ TEST_F(RemoteNtpTimeEstimatorTest, AveragesErrorsOut) { > AdvanceTimeMilliseconds(1000); > SendRtcpSr(); > >- AdvanceTimeMilliseconds(15); >+ AdvanceTimeMilliseconds(150); > uint32_t rtp_timestamp = GetRemoteTimestamp(); > int64_t capture_ntp_time_ms = local_clock_.CurrentNtpInMilliseconds(); >- > // Local peer gets enough RTCP SR to calculate the capture time. > EXPECT_EQ(capture_ntp_time_ms, estimator_->Estimate(rtp_timestamp)); > > // Remote sends corrupted RTCP SRs > AdvanceTimeMilliseconds(1000); >- SendRtcpSrInaccurately(10, 10); >+ SendRtcpSrInaccurately(/*ntp_error_ms=*/2, /*networking_delay_ms=*/-1); > AdvanceTimeMilliseconds(1000); >- SendRtcpSrInaccurately(-20, 5); >+ SendRtcpSrInaccurately(/*ntp_error_ms=*/-2, /*networking_delay_ms=*/1); > > // New RTP packet to estimate timestamp. 
> AdvanceTimeMilliseconds(150); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_nack_stats.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_nack_stats.cc >index 24b708502ee6c2751aeeb0c1b41d43dc0316644d..1d652d0b5ba7bc7581be0bf920aa514a68e2be9b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_nack_stats.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_nack_stats.cc >@@ -10,7 +10,7 @@ > > #include "modules/rtp_rtcp/source/rtcp_nack_stats.h" > >-#include "modules/include/module_common_types.h" >+#include "modules/include/module_common_types_public.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet.h >index 11037cbd82d627376bcdaab497924dd53dba56d5..40e51e87ad33d6f0a00417ace80b2d0b0e1ebda5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet.h >@@ -11,6 +11,10 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_RTCP_PACKET_H_ > #define MODULES_RTP_RTCP_SOURCE_RTCP_PACKET_H_ > >+#include <stddef.h> >+#include <stdint.h> >+ >+#include "api/array_view.h" > #include "rtc_base/buffer.h" > #include "rtc_base/function_view.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/app.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/app.cc >index 4e21bc93e4879900a798934476467c4213ee99e0..eadd4d9c352f8fff473fa36d14558b335da882be 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/app.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/app.cc >@@ -10,6 +10,9 @@ > > #include "modules/rtp_rtcp/source/rtcp_packet/app.h" > >+#include <string.h> 
>+#include <cstdint> >+ > #include "modules/rtp_rtcp/source/byte_io.h" > #include "modules/rtp_rtcp/source/rtcp_packet/common_header.h" > #include "rtc_base/checks.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/app.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/app.h >index 19a97e0ed920a31a54ae25d37b6435de1b9065ff..a9602a80cfb5a878b8b5bcd369f045b809846126 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/app.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/app.h >@@ -11,6 +11,9 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_RTCP_PACKET_APP_H_ > #define MODULES_RTP_RTCP_SOURCE_RTCP_PACKET_APP_H_ > >+#include <stddef.h> >+#include <stdint.h> >+ > #include "modules/rtp_rtcp/source/rtcp_packet.h" > #include "rtc_base/buffer.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/bye.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/bye.cc >index 0e2eb9e41a3e1d2c27652c5e10580f61999973a4..23ac35f856b5389133dd0fb54d8f7e59713e3fb1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/bye.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/bye.cc >@@ -10,6 +10,8 @@ > > #include "modules/rtp_rtcp/source/rtcp_packet/bye.h" > >+#include <string.h> >+#include <cstdint> > #include <utility> > > #include "modules/rtp_rtcp/source/byte_io.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/extended_jitter_report.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/extended_jitter_report.cc >index 27ed4ccef4f6321aa23e861fd3df2c3cc4d92654..5e7dadd1f482a11d38dea0ba7f14b1e4ad6c5975 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/extended_jitter_report.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/extended_jitter_report.cc >@@ -10,6 +10,7 @@ > > #include "modules/rtp_rtcp/source/rtcp_packet/extended_jitter_report.h" > >+#include <cstdint> > #include <utility> > > #include "modules/rtp_rtcp/source/byte_io.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/extended_reports.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/extended_reports.cc >index 5513f37dba0b1f374b7c13a23f9aa2c76af740ac..2b5f9cacd72821392bb88b2e391917c9b32f8c1a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/extended_reports.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/extended_reports.cc >@@ -10,6 +10,8 @@ > > #include "modules/rtp_rtcp/source/rtcp_packet/extended_reports.h" > >+#include <vector> >+ > #include "modules/rtp_rtcp/source/byte_io.h" > #include "modules/rtp_rtcp/source/rtcp_packet/common_header.h" > #include "rtc_base/checks.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/nack.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/nack.cc >index 6a4a0bdc0e43ba7c348eff75c09fa39f9821d56a..6fe7eade62b6478096b2dbbade23cb32835d91fa 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/nack.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/nack.cc >@@ -11,6 +11,7 @@ > #include "modules/rtp_rtcp/source/rtcp_packet/nack.h" > > #include <algorithm> >+#include <cstdint> > #include <utility> > > #include "modules/rtp_rtcp/source/byte_io.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/psfb.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/psfb.h >index 
ae66a172387ecc86e177f1357c166e12339daec6..46ee291285046085bf43363cfe5f965bc16287ab 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/psfb.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/psfb.h >@@ -12,6 +12,9 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_RTCP_PACKET_PSFB_H_ > #define MODULES_RTP_RTCP_SOURCE_RTCP_PACKET_PSFB_H_ > >+#include <stddef.h> >+#include <stdint.h> >+ > #include "modules/rtp_rtcp/source/rtcp_packet.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/receiver_report.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/receiver_report.h >index 8f143da72c0901ef63c0aada2c289dec94057a82..7470d1d2e10abb72cdb7daf5304bf34412a8291a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/receiver_report.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/receiver_report.h >@@ -11,6 +11,8 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_RTCP_PACKET_RECEIVER_REPORT_H_ > #define MODULES_RTP_RTCP_SOURCE_RTCP_PACKET_RECEIVER_REPORT_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <vector> > > #include "modules/rtp_rtcp/source/rtcp_packet.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/remb.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/remb.cc >index 02406116f6e793e6b02b6466fa8939022165b9c9..3ed1fbdb803e6434a1f25585babbe4a59e6aecf1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/remb.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/remb.cc >@@ -10,6 +10,7 @@ > > #include "modules/rtp_rtcp/source/rtcp_packet/remb.h" > >+#include <cstdint> > #include <utility> > > #include "modules/rtp_rtcp/source/byte_io.h" >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/rrtr.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/rrtr.h >index a470b1a217c556b530486e8a805f4b2bebd81c33..8eb4ce62adef8b86a8ad6d9fd5d7cd7566811b99 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/rrtr.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/rrtr.h >@@ -12,6 +12,9 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_RTCP_PACKET_RRTR_H_ > #define MODULES_RTP_RTCP_SOURCE_RTCP_PACKET_RRTR_H_ > >+#include <stddef.h> >+#include <stdint.h> >+ > #include "system_wrappers/include/ntp_time.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/rtpfb.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/rtpfb.h >index a0407411ab38800c096bbb16a182fa68594d14df..21977736b619e487be8d9e9c0dafc29ed65766cc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/rtpfb.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/rtpfb.h >@@ -12,6 +12,9 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_RTCP_PACKET_RTPFB_H_ > #define MODULES_RTP_RTCP_SOURCE_RTCP_PACKET_RTPFB_H_ > >+#include <stddef.h> >+#include <stdint.h> >+ > #include "modules/rtp_rtcp/source/rtcp_packet.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/sdes.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/sdes.cc >index 337c8b04a8487092e46ce0a36828e43f442ced7b..0ef432903db55eff4721632455b083c9417ea825 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/sdes.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/sdes.cc >@@ -10,6 +10,7 @@ > > #include "modules/rtp_rtcp/source/rtcp_packet/sdes.h" > >+#include <string.h> > 
#include <utility> > > #include "modules/rtp_rtcp/source/byte_io.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/tmmbn.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/tmmbn.cc >index 4d38b3b6dbb3091d5d29f8cf01ea5d5c3d3b6e28..f57e5749c2ded05a9ca1bc2f8620c7ae231fccd1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/tmmbn.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/tmmbn.cc >@@ -10,7 +10,6 @@ > > #include "modules/rtp_rtcp/source/rtcp_packet/tmmbn.h" > >-#include "modules/rtp_rtcp/source/byte_io.h" > #include "modules/rtp_rtcp/source/rtcp_packet/common_header.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/tmmbr.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/tmmbr.cc >index d8f073b87f309df4649b569d3958b97d97554a97..9dc745e509c48a090e81f92e98308339a1f35a59 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/tmmbr.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/tmmbr.cc >@@ -10,7 +10,6 @@ > > #include "modules/rtp_rtcp/source/rtcp_packet/tmmbr.h" > >-#include "modules/rtp_rtcp/source/byte_io.h" > #include "modules/rtp_rtcp/source/rtcp_packet/common_header.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/transport_feedback.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/transport_feedback.cc >index 4703d3132ee872b3b8917dad60851c3b4893a354..28165594bec4c995e090aabecdcbabb6647786e9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/transport_feedback.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_packet/transport_feedback.cc >@@ -11,9 +11,10 @@ > #include "modules/rtp_rtcp/source/rtcp_packet/transport_feedback.h" > > #include <algorithm> >+#include <cstdint> > #include <utility> > >-#include "modules/include/module_common_types.h" >+#include "modules/include/module_common_types_public.h" > #include "modules/rtp_rtcp/source/byte_io.h" > #include "modules/rtp_rtcp/source/rtcp_packet/common_header.h" > #include "rtc_base/checks.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_receiver.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_receiver.cc >index 496675497ad23e3b5cecc44118f9bc47c682268c..383f785dfea82062251d0a4fca71e6fab9a278a7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_receiver.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_receiver.cc >@@ -131,6 +131,7 @@ RTCPReceiver::RTCPReceiver( > RtcpIntraFrameObserver* rtcp_intra_frame_observer, > TransportFeedbackObserver* transport_feedback_observer, > VideoBitrateAllocationObserver* bitrate_allocation_observer, >+ int report_interval_ms, > ModuleRtpRtcp* owner) > : clock_(clock), > receiver_only_(receiver_only), >@@ -139,6 +140,7 @@ RTCPReceiver::RTCPReceiver( > rtcp_intra_frame_observer_(rtcp_intra_frame_observer), > transport_feedback_observer_(transport_feedback_observer), > bitrate_allocation_observer_(bitrate_allocation_observer), >+ report_interval_ms_(report_interval_ms), > main_ssrc_(0), > remote_ssrc_(0), > remote_sender_rtp_time_(0), >@@ -280,7 +282,8 @@ RTCPReceiver::ConsumeReceivedXrReferenceTimeInfo() { > std::vector<rtcp::ReceiveTimeInfo> last_xr_rtis; > last_xr_rtis.reserve(last_xr_rtis_size); > >- const uint32_t now_ntp = CompactNtp(clock_->CurrentNtpTime()); >+ const uint32_t now_ntp = >+ CompactNtp(TimeMicrosToNtp(clock_->TimeInMicroseconds())); > > for (size_t i = 0; i < 
last_xr_rtis_size; ++i) { > RrtrInformation& rrtr = received_rrtrs_.front(); >@@ -427,7 +430,7 @@ void RTCPReceiver::HandleSenderReport(const CommonHeader& rtcp_block, > > remote_sender_ntp_time_ = sender_report.ntp(); > remote_sender_rtp_time_ = sender_report.rtp_timestamp(); >- last_received_sr_ntp_ = clock_->CurrentNtpTime(); >+ last_received_sr_ntp_ = TimeMicrosToNtp(clock_->TimeInMicroseconds()); > } else { > // We will only store the send report from one source, but > // we will store all the receive blocks. >@@ -503,10 +506,18 @@ void RTCPReceiver::HandleReportBlock(const ReportBlock& report_block, > // If no SR has been received yet, the field is set to zero. > // Receiver rtp_rtcp module is not expected to calculate rtt using > // Sender Reports even if it accidentally can. >- if (!receiver_only_ && send_time_ntp != 0) { >+ >+ // TODO(nisse): Use this way to determine the RTT only when |receiver_only_| >+ // is false. However, that currently breaks the tests of the >+ // googCaptureStartNtpTimeMs stat for audio receive streams. To fix, either >+ // delete all dependencies on RTT measurements for audio receive streams, or >+ // ensure that audio receive streams that need RTT and stats that depend on it >+ // are configured with an associated audio send stream. >+ if (send_time_ntp != 0) { > uint32_t delay_ntp = report_block.delay_since_last_sr(); > // Local NTP time. >- uint32_t receive_time_ntp = CompactNtp(clock_->CurrentNtpTime()); >+ uint32_t receive_time_ntp = >+ CompactNtp(TimeMicrosToNtp(clock_->TimeInMicroseconds())); > > // RTT in 1/(2^16) seconds. 
> uint32_t rtt_ntp = receive_time_ntp - delay_ntp - send_time_ntp; >@@ -552,12 +563,12 @@ RTCPReceiver::TmmbrInformation* RTCPReceiver::GetTmmbrInformation( > return &it->second; > } > >-bool RTCPReceiver::RtcpRrTimeout(int64_t rtcp_interval_ms) { >+bool RTCPReceiver::RtcpRrTimeout() { > rtc::CritScope lock(&rtcp_receiver_lock_); > if (last_received_rb_ms_ == 0) > return false; > >- int64_t time_out_ms = kRrTimeoutIntervals * rtcp_interval_ms; >+ int64_t time_out_ms = kRrTimeoutIntervals * report_interval_ms_; > if (clock_->TimeInMilliseconds() > last_received_rb_ms_ + time_out_ms) { > // Reset the timer to only trigger one log. > last_received_rb_ms_ = 0; >@@ -566,12 +577,12 @@ bool RTCPReceiver::RtcpRrTimeout(int64_t rtcp_interval_ms) { > return false; > } > >-bool RTCPReceiver::RtcpRrSequenceNumberTimeout(int64_t rtcp_interval_ms) { >+bool RTCPReceiver::RtcpRrSequenceNumberTimeout() { > rtc::CritScope lock(&rtcp_receiver_lock_); > if (last_increased_sequence_number_ms_ == 0) > return false; > >- int64_t time_out_ms = kRrTimeoutIntervals * rtcp_interval_ms; >+ int64_t time_out_ms = kRrTimeoutIntervals * report_interval_ms_; > if (clock_->TimeInMilliseconds() > > last_increased_sequence_number_ms_ + time_out_ms) { > // Reset the timer to only trigger one log. 
>@@ -720,7 +731,8 @@ void RTCPReceiver::HandleXr(const CommonHeader& rtcp_block, > void RTCPReceiver::HandleXrReceiveReferenceTime(uint32_t sender_ssrc, > const rtcp::Rrtr& rrtr) { > uint32_t received_remote_mid_ntp_time = CompactNtp(rrtr.ntp()); >- uint32_t local_receive_mid_ntp_time = CompactNtp(clock_->CurrentNtpTime()); >+ uint32_t local_receive_mid_ntp_time = >+ CompactNtp(TimeMicrosToNtp(clock_->TimeInMicroseconds())); > > auto it = received_rrtrs_ssrc_it_.find(sender_ssrc); > if (it != received_rrtrs_ssrc_it_.end()) { >@@ -754,7 +766,7 @@ void RTCPReceiver::HandleXrDlrrReportBlock(const rtcp::ReceiveTimeInfo& rti) { > return; > > uint32_t delay_ntp = rti.delay_since_last_rr; >- uint32_t now_ntp = CompactNtp(clock_->CurrentNtpTime()); >+ uint32_t now_ntp = CompactNtp(TimeMicrosToNtp(clock_->TimeInMicroseconds())); > > uint32_t rtt_ntp = now_ntp - delay_ntp - send_time_ntp; > xr_rr_rtt_ms_ = CompactNtpRttToMs(rtt_ntp); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_receiver.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_receiver.h >index a863caeff7b69e6c97b034eb7306013e18713b31..be4c70e3cea49186e009bd17f6d0cd4e1b448645 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_receiver.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_receiver.h >@@ -17,6 +17,7 @@ > #include <string> > #include <vector> > >+#include "modules/rtp_rtcp/include/rtcp_statistics.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > #include "modules/rtp_rtcp/source/rtcp_nack_stats.h" > #include "modules/rtp_rtcp/source/rtcp_packet/dlrr.h" >@@ -56,6 +57,7 @@ class RTCPReceiver { > RtcpIntraFrameObserver* rtcp_intra_frame_observer, > TransportFeedbackObserver* transport_feedback_observer, > VideoBitrateAllocationObserver* bitrate_allocation_observer, >+ int report_interval_ms, > ModuleRtpRtcp* owner); > virtual ~RTCPReceiver(); > >@@ -94,13 +96,13 @@ class 
RTCPReceiver { > > // Returns true if we haven't received an RTCP RR for several RTCP > // intervals, but only triggers true once. >- bool RtcpRrTimeout(int64_t rtcp_interval_ms); >+ bool RtcpRrTimeout(); > > // Returns true if we haven't received an RTCP RR telling the receive side > // has not received RTP packets for too long, i.e. extended highest sequence > // number hasn't increased for several RTCP intervals. The function only > // returns true once until a new RR is received. >- bool RtcpRrSequenceNumberTimeout(int64_t rtcp_interval_ms); >+ bool RtcpRrSequenceNumberTimeout(); > > std::vector<rtcp::TmmbItem> TmmbrReceived(); > // Return true if new bandwidth should be set. >@@ -215,6 +217,7 @@ class RTCPReceiver { > RtcpIntraFrameObserver* const rtcp_intra_frame_observer_; > TransportFeedbackObserver* const transport_feedback_observer_; > VideoBitrateAllocationObserver* const bitrate_allocation_observer_; >+ const int report_interval_ms_; > > rtc::CriticalSection rtcp_receiver_lock_; > uint32_t main_ssrc_ RTC_GUARDED_BY(rtcp_receiver_lock_); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_receiver_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_receiver_unittest.cc >index c7708f0bddd97caa4bbd64df48360f68662ff04a..a576fdc5e9a576391985fe84379d8c143204db2b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_receiver_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_receiver_unittest.cc >@@ -107,6 +107,8 @@ constexpr uint32_t kReceiverExtraSsrc = 0x1234567; > constexpr uint32_t kNotToUsSsrc = 0x654321; > constexpr uint32_t kUnknownSenderSsrc = 0x54321; > >+constexpr int64_t kRtcpIntervalMs = 1000; >+ > } // namespace > > class RtcpReceiverTest : public ::testing::Test { >@@ -120,6 +122,7 @@ class RtcpReceiverTest : public ::testing::Test { > &intra_frame_observer_, > &transport_feedback_observer_, > 
&bitrate_allocation_observer_, >+ kRtcpIntervalMs, > &rtp_rtcp_impl_) {} > void SetUp() { > std::set<uint32_t> ssrcs = {kReceiverMainSsrc, kReceiverExtraSsrc}; >@@ -208,7 +211,8 @@ TEST_F(RtcpReceiverTest, InjectSrPacketCalculatesRTT) { > EXPECT_EQ( > -1, rtcp_receiver_.RTT(kSenderSsrc, &rtt_ms, nullptr, nullptr, nullptr)); > >- uint32_t sent_ntp = CompactNtp(system_clock_.CurrentNtpTime()); >+ uint32_t sent_ntp = >+ CompactNtp(TimeMicrosToNtp(system_clock_.TimeInMicroseconds())); > system_clock_.AdvanceTimeMilliseconds(kRttMs + kDelayMs); > > rtcp::SenderReport sr; >@@ -238,7 +242,8 @@ TEST_F(RtcpReceiverTest, InjectSrPacketCalculatesNegativeRTTAsOne) { > EXPECT_EQ( > -1, rtcp_receiver_.RTT(kSenderSsrc, &rtt_ms, nullptr, nullptr, nullptr)); > >- uint32_t sent_ntp = CompactNtp(system_clock_.CurrentNtpTime()); >+ uint32_t sent_ntp = >+ CompactNtp(TimeMicrosToNtp(system_clock_.TimeInMicroseconds())); > system_clock_.AdvanceTimeMilliseconds(kRttMs + kDelayMs); > > rtcp::SenderReport sr; >@@ -266,7 +271,8 @@ TEST_F( > const uint32_t kDelayNtp = 123000; > const int64_t kDelayMs = CompactNtpRttToMs(kDelayNtp); > >- uint32_t sent_ntp = CompactNtp(system_clock_.CurrentNtpTime()); >+ uint32_t sent_ntp = >+ CompactNtp(TimeMicrosToNtp(system_clock_.TimeInMicroseconds())); > system_clock_.AdvanceTimeMilliseconds(kRttMs + kDelayMs); > > rtcp::SenderReport sr; >@@ -737,7 +743,8 @@ TEST_F(RtcpReceiverTest, InjectExtendedReportsDlrrPacketWithSubBlock) { > > InjectRtcpPacket(xr); > >- uint32_t compact_ntp_now = CompactNtp(system_clock_.CurrentNtpTime()); >+ uint32_t compact_ntp_now = >+ CompactNtp(TimeMicrosToNtp(system_clock_.TimeInMicroseconds())); > EXPECT_TRUE(rtcp_receiver_.GetAndResetXrRrRtt(&rtt_ms)); > uint32_t rtt_ntp = compact_ntp_now - kDelay - kLastRR; > EXPECT_NEAR(CompactNtpRttToMs(rtt_ntp), rtt_ms, 1); >@@ -756,7 +763,8 @@ TEST_F(RtcpReceiverTest, InjectExtendedReportsDlrrPacketWithMultipleSubBlocks) { > > InjectRtcpPacket(xr); > >- uint32_t compact_ntp_now = 
CompactNtp(system_clock_.CurrentNtpTime()); >+ uint32_t compact_ntp_now = >+ CompactNtp(TimeMicrosToNtp(system_clock_.TimeInMicroseconds())); > int64_t rtt_ms = 0; > EXPECT_TRUE(rtcp_receiver_.GetAndResetXrRrRtt(&rtt_ms)); > uint32_t rtt_ntp = compact_ntp_now - kDelay - kLastRR; >@@ -818,7 +826,7 @@ TEST_F(RtcpReceiverTest, RttCalculatedAfterExtendedReportsDlrr) { > const uint32_t kDelayNtp = rand.Rand(0, 0x7fffffff); > const int64_t kDelayMs = CompactNtpRttToMs(kDelayNtp); > rtcp_receiver_.SetRtcpXrRrtrStatus(true); >- NtpTime now = system_clock_.CurrentNtpTime(); >+ NtpTime now = TimeMicrosToNtp(system_clock_.TimeInMicroseconds()); > uint32_t sent_ntp = CompactNtp(now); > system_clock_.AdvanceTimeMilliseconds(kRttMs + kDelayMs); > >@@ -838,7 +846,7 @@ TEST_F(RtcpReceiverTest, XrDlrrCalculatesNegativeRttAsOne) { > const int64_t kRttMs = rand.Rand(-3600 * 1000, -1); > const uint32_t kDelayNtp = rand.Rand(0, 0x7fffffff); > const int64_t kDelayMs = CompactNtpRttToMs(kDelayNtp); >- NtpTime now = system_clock_.CurrentNtpTime(); >+ NtpTime now = TimeMicrosToNtp(system_clock_.TimeInMicroseconds()); > uint32_t sent_ntp = CompactNtp(now); > system_clock_.AdvanceTimeMilliseconds(kRttMs + kDelayMs); > rtcp_receiver_.SetRtcpXrRrtrStatus(true); >@@ -936,13 +944,12 @@ TEST_F(RtcpReceiverTest, StoresLastReceivedRrtrPerSsrc) { > } > > TEST_F(RtcpReceiverTest, ReceiveReportTimeout) { >- const int64_t kRtcpIntervalMs = 1000; > const uint16_t kSequenceNumber = 1234; > system_clock_.AdvanceTimeMilliseconds(3 * kRtcpIntervalMs); > > // No RR received, shouldn't trigger a timeout. >- EXPECT_FALSE(rtcp_receiver_.RtcpRrTimeout(kRtcpIntervalMs)); >- EXPECT_FALSE(rtcp_receiver_.RtcpRrSequenceNumberTimeout(kRtcpIntervalMs)); >+ EXPECT_FALSE(rtcp_receiver_.RtcpRrTimeout()); >+ EXPECT_FALSE(rtcp_receiver_.RtcpRrSequenceNumberTimeout()); > > // Add a RR and advance the clock just enough to not trigger a timeout. 
> rtcp::ReportBlock rb1; >@@ -957,8 +964,8 @@ TEST_F(RtcpReceiverTest, ReceiveReportTimeout) { > InjectRtcpPacket(rr1); > > system_clock_.AdvanceTimeMilliseconds(3 * kRtcpIntervalMs - 1); >- EXPECT_FALSE(rtcp_receiver_.RtcpRrTimeout(kRtcpIntervalMs)); >- EXPECT_FALSE(rtcp_receiver_.RtcpRrSequenceNumberTimeout(kRtcpIntervalMs)); >+ EXPECT_FALSE(rtcp_receiver_.RtcpRrTimeout()); >+ EXPECT_FALSE(rtcp_receiver_.RtcpRrSequenceNumberTimeout()); > > // Add a RR with the same extended max as the previous RR to trigger a > // sequence number timeout, but not a RR timeout. >@@ -967,17 +974,17 @@ TEST_F(RtcpReceiverTest, ReceiveReportTimeout) { > InjectRtcpPacket(rr1); > > system_clock_.AdvanceTimeMilliseconds(2); >- EXPECT_FALSE(rtcp_receiver_.RtcpRrTimeout(kRtcpIntervalMs)); >- EXPECT_TRUE(rtcp_receiver_.RtcpRrSequenceNumberTimeout(kRtcpIntervalMs)); >+ EXPECT_FALSE(rtcp_receiver_.RtcpRrTimeout()); >+ EXPECT_TRUE(rtcp_receiver_.RtcpRrSequenceNumberTimeout()); > > // Advance clock enough to trigger an RR timeout too. > system_clock_.AdvanceTimeMilliseconds(3 * kRtcpIntervalMs); >- EXPECT_TRUE(rtcp_receiver_.RtcpRrTimeout(kRtcpIntervalMs)); >+ EXPECT_TRUE(rtcp_receiver_.RtcpRrTimeout()); > > // We should only get one timeout even though we still haven't received a new > // RR. >- EXPECT_FALSE(rtcp_receiver_.RtcpRrTimeout(kRtcpIntervalMs)); >- EXPECT_FALSE(rtcp_receiver_.RtcpRrSequenceNumberTimeout(kRtcpIntervalMs)); >+ EXPECT_FALSE(rtcp_receiver_.RtcpRrTimeout()); >+ EXPECT_FALSE(rtcp_receiver_.RtcpRrSequenceNumberTimeout()); > > // Add a new RR with increase sequence number to reset timers. 
> rtcp::ReportBlock rb2; >@@ -991,8 +998,8 @@ TEST_F(RtcpReceiverTest, ReceiveReportTimeout) { > EXPECT_CALL(bandwidth_observer_, OnReceivedRtcpReceiverReport(_, _, _)); > InjectRtcpPacket(rr2); > >- EXPECT_FALSE(rtcp_receiver_.RtcpRrTimeout(kRtcpIntervalMs)); >- EXPECT_FALSE(rtcp_receiver_.RtcpRrSequenceNumberTimeout(kRtcpIntervalMs)); >+ EXPECT_FALSE(rtcp_receiver_.RtcpRrTimeout()); >+ EXPECT_FALSE(rtcp_receiver_.RtcpRrSequenceNumberTimeout()); > > // Verify we can get a timeout again once we've received new RR. > system_clock_.AdvanceTimeMilliseconds(2 * kRtcpIntervalMs); >@@ -1001,11 +1008,11 @@ TEST_F(RtcpReceiverTest, ReceiveReportTimeout) { > InjectRtcpPacket(rr2); > > system_clock_.AdvanceTimeMilliseconds(kRtcpIntervalMs + 1); >- EXPECT_FALSE(rtcp_receiver_.RtcpRrTimeout(kRtcpIntervalMs)); >- EXPECT_TRUE(rtcp_receiver_.RtcpRrSequenceNumberTimeout(kRtcpIntervalMs)); >+ EXPECT_FALSE(rtcp_receiver_.RtcpRrTimeout()); >+ EXPECT_TRUE(rtcp_receiver_.RtcpRrSequenceNumberTimeout()); > > system_clock_.AdvanceTimeMilliseconds(2 * kRtcpIntervalMs); >- EXPECT_TRUE(rtcp_receiver_.RtcpRrTimeout(kRtcpIntervalMs)); >+ EXPECT_TRUE(rtcp_receiver_.RtcpRrTimeout()); > } > > TEST_F(RtcpReceiverTest, TmmbrReceivedWithNoIncomingPacket) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_sender.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_sender.cc >index bfddc422f0128be27fccd188690a18b2c0fd425c..2581487927bf84094ebfcf0de6deb5943292fa18 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_sender.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_sender.cc >@@ -10,7 +10,8 @@ > > #include "modules/rtp_rtcp/source/rtcp_sender.h" > >-#include <string.h> // memcpy >+#include <string.h> // memcpy >+#include <algorithm> // std::min > > #include <utility> > >@@ -99,16 +100,16 @@ class RTCPSender::RtcpContext { > RtcpContext(const FeedbackState& feedback_state, > 
int32_t nack_size, > const uint16_t* nack_list, >- NtpTime now) >+ int64_t now_us) > : feedback_state_(feedback_state), > nack_size_(nack_size), > nack_list_(nack_list), >- now_(now) {} >+ now_us_(now_us) {} > > const FeedbackState& feedback_state_; > const int32_t nack_size_; > const uint16_t* nack_list_; >- const NtpTime now_; >+ const int64_t now_us_; > }; > > RTCPSender::RTCPSender( >@@ -118,15 +119,14 @@ RTCPSender::RTCPSender( > RtcpPacketTypeCounterObserver* packet_type_counter_observer, > RtcEventLog* event_log, > Transport* outgoing_transport, >- RtcpIntervalConfig interval_config) >+ int report_interval_ms) > : audio_(audio), > clock_(clock), > random_(clock_->TimeInMicroseconds()), > method_(RtcpMode::kOff), > event_log_(event_log), > transport_(outgoing_transport), >- interval_config_(interval_config), >- using_nack_(false), >+ report_interval_ms_(report_interval_ms), > sending_(false), > next_time_to_send_rtcp_(0), > timestamp_offset_(0), >@@ -151,7 +151,8 @@ RTCPSender::RTCPSender( > > xr_send_receiver_reference_time_enabled_(false), > packet_type_counter_observer_(packet_type_counter_observer), >- send_video_bitrate_allocation_(false) { >+ send_video_bitrate_allocation_(false), >+ last_payload_type_(-1) { > RTC_DCHECK(transport_ != nullptr); > > builders_[kRtcpSr] = &RTCPSender::BuildSR; >@@ -180,9 +181,8 @@ void RTCPSender::SetRTCPStatus(RtcpMode new_method) { > > if (method_ == RtcpMode::kOff && new_method != RtcpMode::kOff) { > // When switching on, reschedule the next packet >- int64_t interval_ms = audio_ ? 
interval_config_.audio_interval_ms >- : interval_config_.video_interval_ms; >- next_time_to_send_rtcp_ = clock_->TimeInMilliseconds() + (interval_ms / 2); >+ next_time_to_send_rtcp_ = >+ clock_->TimeInMilliseconds() + (report_interval_ms_ / 2); > } > method_ = new_method; > } >@@ -254,8 +254,14 @@ void RTCPSender::SetTimestampOffset(uint32_t timestamp_offset) { > } > > void RTCPSender::SetLastRtpTime(uint32_t rtp_timestamp, >- int64_t capture_time_ms) { >+ int64_t capture_time_ms, >+ int8_t payload_type) { > rtc::CritScope lock(&critical_section_rtcp_sender_); >+ // For compatibility with clients who don't set payload type correctly on all >+ // calls. >+ if (payload_type != -1) { >+ last_payload_type_ = payload_type; >+ } > last_rtp_timestamp_ = rtp_timestamp; > if (capture_time_ms < 0) { > // We don't currently get a capture time from VoiceEngine. >@@ -265,6 +271,11 @@ void RTCPSender::SetLastRtpTime(uint32_t rtp_timestamp, > } > } > >+void RTCPSender::SetRtpClockRate(int8_t payload_type, int rtp_clock_rate_hz) { >+ rtc::CritScope lock(&critical_section_rtcp_sender_); >+ rtp_clock_rates_khz_[payload_type] = rtp_clock_rate_hz / 1000; >+} >+ > uint32_t RTCPSender::SSRC() const { > rtc::CritScope lock(&critical_section_rtcp_sender_); > return ssrc_; >@@ -411,15 +422,21 @@ std::unique_ptr<rtcp::RtcpPacket> RTCPSender::BuildSR(const RtcpContext& ctx) { > // the frame being captured at this moment. We are calculating that > // timestamp as the last frame's timestamp + the time since the last frame > // was captured. >- uint32_t rtp_rate = >- (audio_ ? kBogusRtpRateForAudioRtcp : kVideoPayloadTypeFrequency) / 1000; >+ int rtp_rate = rtp_clock_rates_khz_[last_payload_type_]; >+ if (rtp_rate <= 0) { >+ rtp_rate = >+ (audio_ ? 
kBogusRtpRateForAudioRtcp : kVideoPayloadTypeFrequency) / >+ 1000; >+ } >+ // Round now_us_ to the closest millisecond, because Ntp time is rounded >+ // when converted to milliseconds. > uint32_t rtp_timestamp = > timestamp_offset_ + last_rtp_timestamp_ + >- (clock_->TimeInMilliseconds() - last_frame_capture_time_ms_) * rtp_rate; >+ ((ctx.now_us_ + 500) / 1000 - last_frame_capture_time_ms_) * rtp_rate; > > rtcp::SenderReport* report = new rtcp::SenderReport(); > report->SetSenderSsrc(ssrc_); >- report->SetNtp(ctx.now_); >+ report->SetNtp(TimeMicrosToNtp(ctx.now_us_)); > report->SetRtpTimestamp(rtp_timestamp); > report->SetPacketCount(ctx.feedback_state_.packets_sent); > report->SetOctetCount(ctx.feedback_state_.media_bytes_sent); >@@ -600,7 +617,7 @@ std::unique_ptr<rtcp::RtcpPacket> RTCPSender::BuildExtendedReports( > > if (!sending_ && xr_send_receiver_reference_time_enabled_) { > rtcp::Rrtr rrtr; >- rrtr.SetNtp(ctx.now_); >+ rrtr.SetNtp(TimeMicrosToNtp(ctx.now_us_)); > xr->SetRrtr(rrtr); > } > >@@ -675,7 +692,7 @@ int32_t RTCPSender::SendCompoundRTCP( > > // We need to send our NTP even if we haven't received any reports. > RtcpContext context(feedback_state, nack_size, nack_list, >- clock_->CurrentNtpTime()); >+ clock_->TimeInMicroseconds()); > > PrepareReport(feedback_state); > >@@ -748,28 +765,24 @@ void RTCPSender::PrepareReport(const FeedbackState& feedback_state) { > } > > // generate next time to send an RTCP report >- uint32_t minIntervalMs = >- rtc::dchecked_cast<uint32_t>(interval_config_.audio_interval_ms); >- >- if (!audio_) { >- if (sending_) { >- // Calculate bandwidth for video; 360 / send bandwidth in kbit/s. 
>- uint32_t send_bitrate_kbit = feedback_state.send_bitrate / 1000; >- if (send_bitrate_kbit != 0) >- minIntervalMs = 360000 / send_bitrate_kbit; >- } >- if (minIntervalMs > >- rtc::dchecked_cast<uint32_t>(interval_config_.video_interval_ms)) { >- minIntervalMs = >- rtc::dchecked_cast<uint32_t>(interval_config_.video_interval_ms); >+ int min_interval_ms = report_interval_ms_; >+ >+ if (!audio_ && sending_) { >+ // Calculate bandwidth for video; 360 / send bandwidth in kbit/s. >+ int send_bitrate_kbit = feedback_state.send_bitrate / 1000; >+ if (send_bitrate_kbit != 0) { >+ min_interval_ms = 360000 / send_bitrate_kbit; >+ min_interval_ms = std::min(min_interval_ms, report_interval_ms_); > } > } > > // The interval between RTCP packets is varied randomly over the > // range [1/2,3/2] times the calculated interval. >- uint32_t timeToNext = >- random_.Rand(minIntervalMs * 1 / 2, minIntervalMs * 3 / 2); >- next_time_to_send_rtcp_ = clock_->TimeInMilliseconds() + timeToNext; >+ int time_to_next = >+ random_.Rand(min_interval_ms * 1 / 2, min_interval_ms * 3 / 2); >+ >+ RTC_DCHECK_GT(time_to_next, 0); >+ next_time_to_send_rtcp_ = clock_->TimeInMilliseconds() + time_to_next; > > // RtcpSender expected to be used for sending either just sender reports > // or just receiver reports. >@@ -791,7 +804,7 @@ std::vector<rtcp::ReportBlock> RTCPSender::CreateReportBlocks( > if (!result.empty() && ((feedback_state.last_rr_ntp_secs != 0) || > (feedback_state.last_rr_ntp_frac != 0))) { > // Get our NTP as late as possible to avoid a race. 
>- uint32_t now = CompactNtp(clock_->CurrentNtpTime()); >+ uint32_t now = CompactNtp(TimeMicrosToNtp(clock_->TimeInMicroseconds())); > > uint32_t receive_time = feedback_state.last_rr_ntp_secs & 0x0000FFFF; > receive_time <<= 16; >@@ -949,12 +962,4 @@ bool RTCPSender::SendFeedbackPacket(const rtcp::TransportFeedback& packet) { > return packet.Build(max_packet_size, callback) && !send_failure; > } > >-int64_t RTCPSender::RtcpAudioReportInverval() const { >- return interval_config_.audio_interval_ms; >-} >- >-int64_t RTCPSender::RtcpVideoReportInverval() const { >- return interval_config_.video_interval_ms; >-} >- > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_sender.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_sender.h >index c9220dd35c7a9dfa00ad2ac5d0971b5d335737f9..0845397da3bf6d224578f51b057accc813ac2495 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_sender.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_sender.h >@@ -68,7 +68,7 @@ class RTCPSender { > RtcpPacketTypeCounterObserver* packet_type_counter_observer, > RtcEventLog* event_log, > Transport* outgoing_transport, >- RtcpIntervalConfig interval_config); >+ int report_interval_ms); > virtual ~RTCPSender(); > > RtcpMode Status() const; >@@ -82,7 +82,13 @@ class RTCPSender { > > void SetTimestampOffset(uint32_t timestamp_offset); > >- void SetLastRtpTime(uint32_t rtp_timestamp, int64_t capture_time_ms); >+ // TODO(bugs.webrtc.org/6458): Remove default parameter value when all the >+ // depending projects are updated to correctly set payload type. 
>+ void SetLastRtpTime(uint32_t rtp_timestamp, >+ int64_t capture_time_ms, >+ int8_t payload_type = -1); >+ >+ void SetRtpClockRate(int8_t payload_type, int rtp_clock_rate_hz); > > uint32_t SSRC() const; > >@@ -135,9 +141,6 @@ class RTCPSender { > void SetVideoBitrateAllocation(const VideoBitrateAllocation& bitrate); > bool SendFeedbackPacket(const rtcp::TransportFeedback& packet); > >- int64_t RtcpAudioReportInverval() const; >- int64_t RtcpVideoReportInverval() const; >- > private: > class RtcpContext; > >@@ -184,10 +187,9 @@ class RTCPSender { > RtcEventLog* const event_log_; > Transport* const transport_; > >- const RtcpIntervalConfig interval_config_; >+ const int report_interval_ms_; > > rtc::CriticalSection critical_section_rtcp_sender_; >- bool using_nack_ RTC_GUARDED_BY(critical_section_rtcp_sender_); > bool sending_ RTC_GUARDED_BY(critical_section_rtcp_sender_); > > int64_t next_time_to_send_rtcp_ RTC_GUARDED_BY(critical_section_rtcp_sender_); >@@ -244,6 +246,11 @@ class RTCPSender { > RTC_GUARDED_BY(critical_section_rtcp_sender_); > bool send_video_bitrate_allocation_ > RTC_GUARDED_BY(critical_section_rtcp_sender_); >+ >+ std::map<int8_t, int> rtp_clock_rates_khz_ >+ RTC_GUARDED_BY(critical_section_rtcp_sender_); >+ int8_t last_payload_type_ RTC_GUARDED_BY(critical_section_rtcp_sender_); >+ > absl::optional<VideoBitrateAllocation> CheckAndUpdateLayerStructure( > const VideoBitrateAllocation& bitrate) const > RTC_EXCLUSIVE_LOCKS_REQUIRED(critical_section_rtcp_sender_); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_sender_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_sender_unittest.cc >index bcffc81213d9f1249c833433883e655f359223be..e1a436b8452ff7291c94b8ff7180edb5f1c3b85c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_sender_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_sender_unittest.cc >@@ 
-11,11 +11,13 @@ > #include <memory> > > #include "common_types.h" // NOLINT(build/include) >+#include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > #include "modules/rtp_rtcp/source/rtcp_packet/bye.h" > #include "modules/rtp_rtcp/source/rtcp_packet/common_header.h" > #include "modules/rtp_rtcp/source/rtcp_sender.h" > #include "modules/rtp_rtcp/source/rtp_packet_received.h" > #include "modules/rtp_rtcp/source/rtp_rtcp_impl.h" >+#include "modules/rtp_rtcp/source/time_util.h" > #include "rtc_base/rate_limiter.h" > #include "test/gmock.h" > #include "test/gtest.h" >@@ -43,7 +45,7 @@ class RtcpPacketTypeCounterObserverImpl : public RtcpPacketTypeCounterObserver { > RtcpPacketTypeCounter counter_; > }; > >-class TestTransport : public Transport, public RtpData { >+class TestTransport : public Transport { > public: > TestTransport() {} > >@@ -56,11 +58,6 @@ class TestTransport : public Transport, public RtpData { > parser_.Parse(data, len); > return true; > } >- int OnReceivedPayloadData(const uint8_t* payload_data, >- size_t payload_size, >- const WebRtcRTPHeader* rtp_header) override { >- return 0; >- } > test::RtcpPacketParser parser_; > }; > >@@ -82,15 +79,17 @@ class RtcpSenderTest : public ::testing::Test { > configuration.clock = &clock_; > configuration.outgoing_transport = &test_transport_; > configuration.retransmission_rate_limiter = &retransmission_rate_limiter_; >+ configuration.rtcp_report_interval_ms = 1000; > > rtp_rtcp_impl_.reset(new ModuleRtpRtcpImpl(configuration)); > rtcp_sender_.reset(new RTCPSender(false, &clock_, receive_statistics_.get(), > nullptr, nullptr, &test_transport_, >- configuration.rtcp_interval_config)); >+ configuration.rtcp_report_interval_ms)); > rtcp_sender_->SetSSRC(kSenderSsrc); > rtcp_sender_->SetRemoteSSRC(kRemoteSsrc); > rtcp_sender_->SetTimestampOffset(kStartRtpTimestamp); >- rtcp_sender_->SetLastRtpTime(kRtpTimestamp, clock_.TimeInMilliseconds()); >+ rtcp_sender_->SetLastRtpTime(kRtpTimestamp, clock_.TimeInMilliseconds(), 
>+ /*payload_type=*/0); > } > > void InsertIncomingPacket(uint32_t remote_ssrc, uint16_t seq_num) { >@@ -141,7 +140,7 @@ TEST_F(RtcpSenderTest, SendSr) { > rtcp_sender_->SetSendingStatus(feedback_state, true); > feedback_state.packets_sent = kPacketCount; > feedback_state.media_bytes_sent = kOctetCount; >- NtpTime ntp = clock_.CurrentNtpTime(); >+ NtpTime ntp = TimeMicrosToNtp(clock_.TimeInMicroseconds()); > EXPECT_EQ(0, rtcp_sender_->SendRTCP(feedback_state, kRtcpSr)); > EXPECT_EQ(1, parser()->sender_report()->num_packets()); > EXPECT_EQ(kSenderSsrc, parser()->sender_report()->sender_ssrc()); >@@ -153,10 +152,42 @@ TEST_F(RtcpSenderTest, SendSr) { > EXPECT_EQ(0U, parser()->sender_report()->report_blocks().size()); > } > >+TEST_F(RtcpSenderTest, SendConsecutiveSrWithExactSlope) { >+ const uint32_t kPacketCount = 0x12345; >+ const uint32_t kOctetCount = 0x23456; >+ const int kTimeBetweenSRsUs = 10043; // Not exact value in milliseconds. >+ const int kExtraPackets = 30; >+ // Make sure clock is not exactly at some milliseconds point. 
>+ clock_.AdvanceTimeMicroseconds(kTimeBetweenSRsUs); >+ rtcp_sender_->SetRTCPStatus(RtcpMode::kReducedSize); >+ RTCPSender::FeedbackState feedback_state = rtp_rtcp_impl_->GetFeedbackState(); >+ rtcp_sender_->SetSendingStatus(feedback_state, true); >+ feedback_state.packets_sent = kPacketCount; >+ feedback_state.media_bytes_sent = kOctetCount; >+ >+ EXPECT_EQ(0, rtcp_sender_->SendRTCP(feedback_state, kRtcpSr)); >+ EXPECT_EQ(1, parser()->sender_report()->num_packets()); >+ NtpTime ntp1 = parser()->sender_report()->ntp(); >+ uint32_t rtp1 = parser()->sender_report()->rtp_timestamp(); >+ >+ // Send more SRs to ensure slope is always exact for different offsets >+ for (int packets = 1; packets <= kExtraPackets; ++packets) { >+ clock_.AdvanceTimeMicroseconds(kTimeBetweenSRsUs); >+ EXPECT_EQ(0, rtcp_sender_->SendRTCP(feedback_state, kRtcpSr)); >+ EXPECT_EQ(packets + 1, parser()->sender_report()->num_packets()); >+ >+ NtpTime ntp2 = parser()->sender_report()->ntp(); >+ uint32_t rtp2 = parser()->sender_report()->rtp_timestamp(); >+ >+ uint32_t ntp_diff_in_rtp_units = >+ (ntp2.ToMs() - ntp1.ToMs()) * (kVideoPayloadTypeFrequency / 1000); >+ EXPECT_EQ(rtp2 - rtp1, ntp_diff_in_rtp_units); >+ } >+} >+ > TEST_F(RtcpSenderTest, DoNotSendSrBeforeRtp) { > rtcp_sender_.reset(new RTCPSender(false, &clock_, receive_statistics_.get(), >- nullptr, nullptr, &test_transport_, >- RtcpIntervalConfig{})); >+ nullptr, nullptr, &test_transport_, 1000)); > rtcp_sender_->SetSSRC(kSenderSsrc); > rtcp_sender_->SetRemoteSSRC(kRemoteSsrc); > rtcp_sender_->SetRTCPStatus(RtcpMode::kReducedSize); >@@ -174,8 +205,7 @@ TEST_F(RtcpSenderTest, DoNotSendSrBeforeRtp) { > > TEST_F(RtcpSenderTest, DoNotSendCompundBeforeRtp) { > rtcp_sender_.reset(new RTCPSender(false, &clock_, receive_statistics_.get(), >- nullptr, nullptr, &test_transport_, >- RtcpIntervalConfig{})); >+ nullptr, nullptr, &test_transport_, 1000)); > rtcp_sender_->SetSSRC(kSenderSsrc); > rtcp_sender_->SetRemoteSSRC(kRemoteSsrc); > 
rtcp_sender_->SetRTCPStatus(RtcpMode::kCompound); >@@ -447,7 +477,7 @@ TEST_F(RtcpSenderTest, SendXrWithRrtr) { > rtcp_sender_->SetRTCPStatus(RtcpMode::kCompound); > EXPECT_EQ(0, rtcp_sender_->SetSendingStatus(feedback_state(), false)); > rtcp_sender_->SendRtcpXrReceiverReferenceTime(true); >- NtpTime ntp = clock_.CurrentNtpTime(); >+ NtpTime ntp = TimeMicrosToNtp(clock_.TimeInMicroseconds()); > EXPECT_EQ(0, rtcp_sender_->SendRTCP(feedback_state(), kRtcpReport)); > EXPECT_EQ(1, parser()->xr()->num_packets()); > EXPECT_EQ(kSenderSsrc, parser()->xr()->sender_ssrc()); >@@ -476,7 +506,7 @@ TEST_F(RtcpSenderTest, TestRegisterRtcpPacketTypeObserver) { > RtcpPacketTypeCounterObserverImpl observer; > rtcp_sender_.reset(new RTCPSender(false, &clock_, receive_statistics_.get(), > &observer, nullptr, &test_transport_, >- RtcpIntervalConfig{})); >+ 1000)); > rtcp_sender_->SetRemoteSSRC(kRemoteSsrc); > rtcp_sender_->SetRTCPStatus(RtcpMode::kReducedSize); > EXPECT_EQ(0, rtcp_sender_->SendRTCP(feedback_state(), kRtcpPli)); >@@ -598,12 +628,12 @@ TEST_F(RtcpSenderTest, ByeMustBeLast) { > > // Re-configure rtcp_sender_ with mock_transport_ > rtcp_sender_.reset(new RTCPSender(false, &clock_, receive_statistics_.get(), >- nullptr, nullptr, &mock_transport, >- RtcpIntervalConfig{})); >+ nullptr, nullptr, &mock_transport, 1000)); > rtcp_sender_->SetSSRC(kSenderSsrc); > rtcp_sender_->SetRemoteSSRC(kRemoteSsrc); > rtcp_sender_->SetTimestampOffset(kStartRtpTimestamp); >- rtcp_sender_->SetLastRtpTime(kRtpTimestamp, clock_.TimeInMilliseconds()); >+ rtcp_sender_->SetLastRtpTime(kRtpTimestamp, clock_.TimeInMilliseconds(), >+ /*payload_type=*/0); > > // Set up REMB info to be included with BYE. 
> rtcp_sender_->SetRTCPStatus(RtcpMode::kCompound); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_transceiver.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_transceiver.cc >index 77b10ca052d9864993fda1beae4ade0c71b22346..57d2142ffad4fff1e172955fc6f31d6ce89de9fd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_transceiver.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_transceiver.cc >@@ -39,10 +39,10 @@ RtcpTransceiver::~RtcpTransceiver() { > RTC_DCHECK(!rtcp_transceiver_); > } > >-void RtcpTransceiver::Stop(std::unique_ptr<rtc::QueuedTask> on_destroyed) { >+void RtcpTransceiver::Stop(std::function<void()> on_destroyed) { > RTC_DCHECK(rtcp_transceiver_); >- task_queue_->PostTaskAndReply(Destructor{std::move(rtcp_transceiver_)}, >- std::move(on_destroyed)); >+ task_queue_->PostTask(rtc::NewClosure( >+ Destructor{std::move(rtcp_transceiver_)}, std::move(on_destroyed))); > RTC_DCHECK(!rtcp_transceiver_); > } > >@@ -59,13 +59,14 @@ void RtcpTransceiver::AddMediaReceiverRtcpObserver( > void RtcpTransceiver::RemoveMediaReceiverRtcpObserver( > uint32_t remote_ssrc, > MediaReceiverRtcpObserver* observer, >- std::unique_ptr<rtc::QueuedTask> on_removed) { >+ std::function<void()> on_removed) { > RTC_CHECK(rtcp_transceiver_); > RtcpTransceiverImpl* ptr = rtcp_transceiver_.get(); > auto remove = [ptr, remote_ssrc, observer] { > ptr->RemoveMediaReceiverRtcpObserver(remote_ssrc, observer); > }; >- task_queue_->PostTaskAndReply(std::move(remove), std::move(on_removed)); >+ task_queue_->PostTask( >+ rtc::NewClosure(std::move(remove), std::move(on_removed))); > } > > void RtcpTransceiver::SetReadyToSend(bool ready) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_transceiver.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_transceiver.h >index 
fc9488c4abd640c507f5fce439f37b8b3e55bbea..9c9675110fa60e395f51250341f1b0e0c90c98b5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_transceiver.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_transceiver.h >@@ -11,6 +11,7 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_RTCP_TRANSCEIVER_H_ > #define MODULES_RTP_RTCP_SOURCE_RTCP_TRANSCEIVER_H_ > >+#include <functional> > #include <memory> > #include <string> > #include <vector> >@@ -42,18 +43,17 @@ class RtcpTransceiver : public RtcpFeedbackSenderInterface { > // Note that interfaces provided in constructor or registered with AddObserver > // still might be used by the transceiver on the task queue > // until |on_destroyed| runs. >- void Stop(std::unique_ptr<rtc::QueuedTask> on_destroyed); >+ void Stop(std::function<void()> on_destroyed); > > // Registers observer to be notified about incoming rtcp packets. > // Calls to observer will be done on the |config.task_queue|. > void AddMediaReceiverRtcpObserver(uint32_t remote_ssrc, > MediaReceiverRtcpObserver* observer); > // Deregisters the observer. Might return before observer is deregistered. >- // Posts |on_removed| task when observer is deregistered. >- void RemoveMediaReceiverRtcpObserver( >- uint32_t remote_ssrc, >- MediaReceiverRtcpObserver* observer, >- std::unique_ptr<rtc::QueuedTask> on_removed); >+ // Runs |on_removed| when observer is deregistered. >+ void RemoveMediaReceiverRtcpObserver(uint32_t remote_ssrc, >+ MediaReceiverRtcpObserver* observer, >+ std::function<void()> on_removed); > > // Enables/disables sending rtcp packets eventually. 
> // Packets may be sent after the SetReadyToSend(false) returns, but no new >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_transceiver_config.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_transceiver_config.h >index 0da18e2cc1b128a3fd3bf2703255a8bff0c8ffdf..01330d0bc7dfb0aff1649760fafb5e7f1f6ff2d8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_transceiver_config.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_transceiver_config.h >@@ -97,11 +97,6 @@ struct RtcpTransceiverConfig { > // Estimate RTT as non-sender as described in > // https://tools.ietf.org/html/rfc3611#section-4.4 and #section-4.5 > bool non_sender_rtt_measurement = false; >- // Copies LastSR/DelaySinceLastSR for previous report block to avoid >- // triggering bug in older version of RtcpReceiver. >- // TODO(bugs.webrtc.org/8805): Change to false by default then remove when >- // all major webrtc clients updated with the fix in RtcpReceiver. 
>- bool avoid_zero_last_sr_in_last_report_block = true; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_transceiver_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_transceiver_impl.cc >index 2c6c3acc391365ca58f66e3dc7e7569630274cc9..97c2ac018b31d0a56f222a0d08442e5f3cb7ad9e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_transceiver_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_transceiver_impl.cc >@@ -421,12 +421,6 @@ std::vector<rtcp::ReportBlock> RtcpTransceiverImpl::CreateReportBlocks( > auto it = remote_senders_.find(report_block.source_ssrc()); > if (it == remote_senders_.end() || > !it->second.last_received_sender_report) { >- if (config_.avoid_zero_last_sr_in_last_report_block && last_sr != 0) { >- // Simulate behaviour of the RtcpSender to avoid hitting bug in >- // RtcpReceiver. >- report_block.SetLastSr(last_sr); >- report_block.SetDelayLastSr(last_delay); >- } > continue; > } > const SenderReportTimes& last_sender_report = >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_transceiver_impl_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_transceiver_impl_unittest.cc >index e5db0862cc476109bcc93b2abb4b1df6eee765dc..e86d67f5790c7f2ccea2db271ffde79c15ef9ce6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_transceiver_impl_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_transceiver_impl_unittest.cc >@@ -77,7 +77,6 @@ constexpr int kAlmostForeverMs = 1000; > // Helper to wait for an rtcp packet produced on a different thread/task queue. 
> class FakeRtcpTransport : public webrtc::Transport { > public: >- FakeRtcpTransport() : sent_rtcp_(false, false) {} > bool SendRtcp(const uint8_t* data, size_t size) override { > sent_rtcp_.Set(); > return true; >@@ -148,7 +147,7 @@ TEST(RtcpTransceiverImplTest, CanDestroyOnTaskQueue) { > // Wait for a periodic packet. > EXPECT_TRUE(transport.WaitPacket()); > >- rtc::Event done(false, false); >+ rtc::Event done; > queue.PostTask([rtcp_transceiver, &done] { > delete rtcp_transceiver; > done.Set(); >@@ -187,7 +186,7 @@ TEST(RtcpTransceiverImplTest, DelaysSendingFirstCompondPacket) { > EXPECT_GE(rtc::TimeMillis() - started_ms, config.initial_report_delay_ms); > > // Cleanup. >- rtc::Event done(false, false); >+ rtc::Event done; > queue.PostTask([&] { > rtcp_transceiver.reset(); > done.Set(); >@@ -220,7 +219,7 @@ TEST(RtcpTransceiverImplTest, PeriodicallySendsPackets) { > config.report_period_ms - 1); > > // Cleanup. >- rtc::Event done(false, false); >+ rtc::Event done; > queue.PostTask([&] { > rtcp_transceiver.reset(); > done.Set(); >@@ -242,7 +241,7 @@ TEST(RtcpTransceiverImplTest, SendCompoundPacketDelaysPeriodicSendPackets) { > // Wait for first packet. > EXPECT_TRUE(transport.WaitPacket()); > // Send non periodic one after half period. >- rtc::Event non_periodic(false, false); >+ rtc::Event non_periodic; > int64_t time_of_non_periodic_packet_ms = 0; > queue.PostDelayedTask( > [&] { >@@ -265,7 +264,7 @@ TEST(RtcpTransceiverImplTest, SendCompoundPacketDelaysPeriodicSendPackets) { > config.report_period_ms - 1); > > // Cleanup. >- rtc::Event done(false, false); >+ rtc::Event done; > queue.PostTask([&] { > rtcp_transceiver.reset(); > done.Set(); >@@ -329,7 +328,7 @@ TEST(RtcpTransceiverImplTest, SendsPeriodicRtcpWhenNetworkStateIsUp) { > EXPECT_TRUE(transport.WaitPacket()); > > // Cleanup. 
>- rtc::Event done(false, false); >+ rtc::Event done; > queue.PostTask([&] { > rtcp_transceiver.reset(); > done.Set(); >@@ -671,7 +670,6 @@ TEST(RtcpTransceiverImplTest, > > RtcpTransceiverConfig config; > config.schedule_periodic_compound_packets = false; >- config.avoid_zero_last_sr_in_last_report_block = false; > RtcpPacketParser rtcp_parser; > RtcpParserTransport transport(&rtcp_parser); > config.outgoing_transport = &transport; >@@ -703,52 +701,6 @@ TEST(RtcpTransceiverImplTest, > EXPECT_EQ(report_blocks[1].last_sr(), 0u); > } > >-TEST(RtcpTransceiverImplTest, AvoidLastReportBlockToHaveZeroLastSrField) { >- const uint32_t kRemoteSsrc1 = 54321; >- const uint32_t kRemoteSsrc2 = 54323; >- MockReceiveStatisticsProvider receive_statistics; >- std::vector<ReportBlock> statistics_report_blocks(2); >- statistics_report_blocks[0].SetMediaSsrc(kRemoteSsrc1); >- statistics_report_blocks[1].SetMediaSsrc(kRemoteSsrc2); >- ON_CALL(receive_statistics, RtcpReportBlocks(_)) >- .WillByDefault(Return(statistics_report_blocks)); >- >- RtcpTransceiverConfig config; >- config.schedule_periodic_compound_packets = false; >- config.avoid_zero_last_sr_in_last_report_block = true; >- RtcpPacketParser rtcp_parser; >- RtcpParserTransport transport(&rtcp_parser); >- config.outgoing_transport = &transport; >- config.receive_statistics = &receive_statistics; >- RtcpTransceiverImpl rtcp_transceiver(config); >- >- const NtpTime kRemoteNtp(0x9876543211); >- // Receive SenderReport for RemoteSsrc1, but no report for RemoteSsrc2. >- SenderReport sr; >- sr.SetSenderSsrc(kRemoteSsrc1); >- sr.SetNtp(kRemoteNtp); >- auto raw_packet = sr.Build(); >- rtcp_transceiver.ReceivePacket(raw_packet, /*now_us=*/0); >- >- // Trigger sending ReceiverReport. 
>- rtcp_transceiver.SendCompoundPacket(); >- >- EXPECT_GT(rtcp_parser.receiver_report()->num_packets(), 0); >- const auto& report_blocks = rtcp_parser.receiver_report()->report_blocks(); >- ASSERT_EQ(report_blocks.size(), 2u); >- // RtcpTransceiverImpl doesn't guarantee order of the report blocks >- // match result of ReceiveStatisticsProvider::RtcpReportBlocks callback, >- // but for simplicity of the test asume it is the same. >- ASSERT_EQ(report_blocks[0].source_ssrc(), kRemoteSsrc1); >- EXPECT_NE(report_blocks[0].last_sr(), 0u); >- >- ASSERT_EQ(report_blocks[1].source_ssrc(), kRemoteSsrc2); >- // No Sender Report for kRemoteSsrc2, use same LastSR as for kRemoteSsrc1 >- EXPECT_EQ(report_blocks[1].last_sr(), report_blocks[0].last_sr()); >- EXPECT_EQ(report_blocks[1].delay_since_last_sr(), >- report_blocks[0].delay_since_last_sr()); >-} >- > TEST(RtcpTransceiverImplTest, > WhenSendsReceiverReportCalculatesDelaySinceLastSenderReport) { > const uint32_t kRemoteSsrc1 = 4321; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_transceiver_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_transceiver_unittest.cc >index 2ea5bc9b5377d212c87e3190ba5073429536d70b..d71918db03b4f7a56c338c76b5d48ac0e12aabf2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_transceiver_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtcp_transceiver_unittest.cc >@@ -41,7 +41,7 @@ class MockMediaReceiverRtcpObserver : public webrtc::MediaReceiverRtcpObserver { > constexpr int kTimeoutMs = 1000; > > void WaitPostedTasks(rtc::TaskQueue* queue) { >- rtc::Event done(false, false); >+ rtc::Event done; > queue->PostTask([&done] { done.Set(); }); > ASSERT_TRUE(done.Wait(kTimeoutMs)); > } >@@ -108,8 +108,8 @@ TEST(RtcpTransceiverTest, CanBeDestroyedWithoutBlocking) { > auto* rtcp_transceiver = new RtcpTransceiver(config); > rtcp_transceiver->SendCompoundPacket(); > 
>- rtc::Event done(false, false); >- rtc::Event heavy_task(false, false); >+ rtc::Event done; >+ rtc::Event heavy_task; > queue.PostTask([&] { > EXPECT_TRUE(heavy_task.Wait(kTimeoutMs)); > done.Set(); >@@ -128,7 +128,7 @@ TEST(RtcpTransceiverTest, MaySendPacketsAfterDestructor) { // i.e. Be careful! > config.task_queue = &queue; > auto* rtcp_transceiver = new RtcpTransceiver(config); > >- rtc::Event heavy_task(false, false); >+ rtc::Event heavy_task; > queue.PostTask([&] { EXPECT_TRUE(heavy_task.Wait(kTimeoutMs)); }); > rtcp_transceiver->SendCompoundPacket(); > delete rtcp_transceiver; >@@ -158,7 +158,7 @@ TEST(RtcpTransceiverTest, DoesntPostToRtcpObserverAfterCallToRemove) { > config.outgoing_transport = &null_transport; > config.task_queue = &queue; > RtcpTransceiver rtcp_transceiver(config); >- rtc::Event observer_deleted(false, false); >+ rtc::Event observer_deleted; > > auto observer = absl::make_unique<MockMediaReceiverRtcpObserver>(); > EXPECT_CALL(*observer, OnSenderReport(kRemoteSsrc, _, 1)); >@@ -166,12 +166,11 @@ TEST(RtcpTransceiverTest, DoesntPostToRtcpObserverAfterCallToRemove) { > > rtcp_transceiver.AddMediaReceiverRtcpObserver(kRemoteSsrc, observer.get()); > rtcp_transceiver.ReceivePacket(CreateSenderReport(kRemoteSsrc, 1)); >- rtcp_transceiver.RemoveMediaReceiverRtcpObserver( >- kRemoteSsrc, observer.get(), >- /*on_removed=*/rtc::NewClosure([&] { >- observer.reset(); >- observer_deleted.Set(); >- })); >+ rtcp_transceiver.RemoveMediaReceiverRtcpObserver(kRemoteSsrc, observer.get(), >+ /*on_removed=*/[&] { >+ observer.reset(); >+ observer_deleted.Set(); >+ }); > rtcp_transceiver.ReceivePacket(CreateSenderReport(kRemoteSsrc, 2)); > > EXPECT_TRUE(observer_deleted.Wait(kTimeoutMs)); >@@ -189,15 +188,14 @@ TEST(RtcpTransceiverTest, RemoveMediaReceiverRtcpObserverIsNonBlocking) { > auto observer = absl::make_unique<MockMediaReceiverRtcpObserver>(); > rtcp_transceiver.AddMediaReceiverRtcpObserver(kRemoteSsrc, observer.get()); > >- rtc::Event 
queue_blocker(false, false); >- rtc::Event observer_deleted(false, false); >+ rtc::Event queue_blocker; >+ rtc::Event observer_deleted; > queue.PostTask([&] { EXPECT_TRUE(queue_blocker.Wait(kTimeoutMs)); }); >- rtcp_transceiver.RemoveMediaReceiverRtcpObserver( >- kRemoteSsrc, observer.get(), >- /*on_removed=*/rtc::NewClosure([&] { >- observer.reset(); >- observer_deleted.Set(); >- })); >+ rtcp_transceiver.RemoveMediaReceiverRtcpObserver(kRemoteSsrc, observer.get(), >+ /*on_removed=*/[&] { >+ observer.reset(); >+ observer_deleted.Set(); >+ }); > > EXPECT_THAT(observer, Not(IsNull())); > queue_blocker.Set(); >@@ -242,12 +240,12 @@ TEST(RtcpTransceiverTest, DoesntSendPacketsAfterStopCallback) { > config.schedule_periodic_compound_packets = true; > > auto rtcp_transceiver = absl::make_unique<RtcpTransceiver>(config); >- rtc::Event done(false, false); >+ rtc::Event done; > rtcp_transceiver->SendCompoundPacket(); >- rtcp_transceiver->Stop(rtc::NewClosure([&] { >+ rtcp_transceiver->Stop([&] { > EXPECT_CALL(outgoing_transport, SendRtcp).Times(0); > done.Set(); >- })); >+ }); > rtcp_transceiver = nullptr; > EXPECT_TRUE(done.Wait(kTimeoutMs)); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format.cc >index 13ec0afe9f7bd675ac9841be41cb998b9ef151b0..0010d90750f29efd001a8a5d2b81136befe9c396 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format.cc >@@ -10,13 +10,15 @@ > > #include "modules/rtp_rtcp/source/rtp_format.h" > >-#include <utility> >- > #include "absl/memory/memory.h" >+#include "absl/types/variant.h" > #include "modules/rtp_rtcp/source/rtp_format_h264.h" > #include "modules/rtp_rtcp/source/rtp_format_video_generic.h" > #include "modules/rtp_rtcp/source/rtp_format_vp8.h" > #include "modules/rtp_rtcp/source/rtp_format_vp9.h" 
>+#include "modules/video_coding/codecs/h264/include/h264_globals.h" >+#include "modules/video_coding/codecs/vp8/include/vp8_globals.h" >+#include "modules/video_coding/codecs/vp9/include/vp9_globals.h" > #include "rtc_base/checks.h" > > namespace webrtc { >@@ -63,6 +65,11 @@ std::vector<int> RtpPacketizer::SplitAboutEqually( > RTC_DCHECK_GE(limits.last_packet_reduction_len, 0); > > std::vector<int> result; >+ if (limits.max_payload_len >= >+ limits.single_packet_reduction_len + payload_len) { >+ result.push_back(payload_len); >+ return result; >+ } > if (limits.max_payload_len - limits.first_packet_reduction_len < 1 || > limits.max_payload_len - limits.last_packet_reduction_len < 1) { > // Capacity is not enough to put a single byte into one of the packets. >@@ -77,6 +84,10 @@ std::vector<int> RtpPacketizer::SplitAboutEqually( > // Integer divisions with rounding up. > int num_packets_left = > (total_bytes + limits.max_payload_len - 1) / limits.max_payload_len; >+ if (num_packets_left == 1) { >+ // Single packet is a special case handled above. 
>+ num_packets_left = 2; >+ } > > if (payload_len < num_packets_left) { > // Edge case where limits force to have more packets than there are payload >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format.h >index 9fad4cf8f9761cacbf3c87ca4051d5b4be621e4d..71c7dc5e0ecd8d4fa80a079b191a78f423cfb421 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format.h >@@ -11,17 +11,17 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_RTP_FORMAT_H_ > #define MODULES_RTP_RTCP_SOURCE_RTP_FORMAT_H_ > >+#include <stdint.h> > #include <memory> >-#include <string> > #include <vector> > > #include "api/array_view.h" > #include "common_types.h" // NOLINT(build/include) > #include "modules/include/module_common_types.h" >-#include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" >-#include "rtc_base/constructormagic.h" >+#include "modules/rtp_rtcp/source/rtp_video_header.h" > > namespace webrtc { >+ > class RtpPacketToSend; > > class RtpPacketizer { >@@ -30,6 +30,8 @@ class RtpPacketizer { > int max_payload_len = 1200; > int first_packet_reduction_len = 0; > int last_packet_reduction_len = 0; >+ // Reduction len for packet that is first & last at the same time. 
>+ int single_packet_reduction_len = 0; > }; > static std::unique_ptr<RtpPacketizer> Create( > VideoCodecType type, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_h264.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_h264.cc >index 9793ebae259131ddfec81e3176e2788fa248e07d..7a7bcdfbb89c874891fa2ce22022ec5c9e71af8e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_h264.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_h264.cc >@@ -11,10 +11,16 @@ > #include "modules/rtp_rtcp/source/rtp_format_h264.h" > > #include <string.h> >+#include <cstddef> >+#include <cstdint> >+#include <iterator> > #include <memory> > #include <utility> > #include <vector> > >+#include "absl/types/optional.h" >+#include "absl/types/variant.h" >+#include "common_types.h" // NOLINT(build/include) > #include "common_video/h264/h264_common.h" > #include "common_video/h264/pps_parser.h" > #include "common_video/h264/sps_parser.h" >@@ -186,9 +192,11 @@ bool RtpPacketizerH264::GeneratePackets( > case H264PacketizationMode::NonInterleaved: > int fragment_len = input_fragments_[i].length; > int single_packet_capacity = limits_.max_payload_len; >- if (i == 0) >+ if (input_fragments_.size() == 1) >+ single_packet_capacity -= limits_.single_packet_reduction_len; >+ else if (i == 0) > single_packet_capacity -= limits_.first_packet_reduction_len; >- if (i + 1 == input_fragments_.size()) >+ else if (i + 1 == input_fragments_.size()) > single_packet_capacity -= limits_.last_packet_reduction_len; > > if (fragment_len > single_packet_capacity) { >@@ -211,7 +219,10 @@ bool RtpPacketizerH264::PacketizeFuA(size_t fragment_index) { > PayloadSizeLimits limits = limits_; > // Leave room for the FU-A header. > limits.max_payload_len -= kFuAHeaderSize; >- // Ignore first/last packet reductions unless it is first/last fragment. 
>+ // Ignore single/first/last packet reductions unless it is single/first/last
>+ // fragment.
>+ if (input_fragments_.size() != 1)
>+ limits.single_packet_reduction_len = 0;
> if (fragment_index != 0)
> limits.first_packet_reduction_len = 0;
> if (fragment_index != input_fragments_.size() - 1)
>@@ -243,17 +254,31 @@ bool RtpPacketizerH264::PacketizeFuA(size_t fragment_index) {
> size_t RtpPacketizerH264::PacketizeStapA(size_t fragment_index) {
> // Aggregate fragments into one packet (STAP-A).
> size_t payload_size_left = limits_.max_payload_len;
>- if (fragment_index == 0)
>+ if (input_fragments_.size() == 1)
>+ payload_size_left -= limits_.single_packet_reduction_len;
>+ else if (fragment_index == 0)
> payload_size_left -= limits_.first_packet_reduction_len;
> int aggregated_fragments = 0;
> size_t fragment_headers_length = 0;
> const Fragment* fragment = &input_fragments_[fragment_index];
> RTC_CHECK_GE(payload_size_left, fragment->length);
> ++num_packets_left_;
>+
>+ auto payload_size_needed = [&] {
>+ size_t fragment_size = fragment->length + fragment_headers_length;
>+ if (input_fragments_.size() == 1) {
>+ // Single fragment, single packet, payload_size_left already adjusted
>+ // with limits_.single_packet_reduction_len.
>+ return fragment_size;
>+ }
>+ if (fragment_index == input_fragments_.size() - 1) {
>+ // Last fragment, so StapA might be the last packet.
>+ return fragment_size + limits_.last_packet_reduction_len; >+ } >+ return fragment_size; >+ }; >+ >+ while (payload_size_left >= payload_size_needed()) { > RTC_CHECK_GT(fragment->length, 0); > packets_.push(PacketUnit(*fragment, aggregated_fragments == 0, false, true, > fragment->buffer[0])); >@@ -282,9 +307,11 @@ size_t RtpPacketizerH264::PacketizeStapA(size_t fragment_index) { > bool RtpPacketizerH264::PacketizeSingleNalu(size_t fragment_index) { > // Add a single NALU to the queue, no aggregation. > size_t payload_size_left = limits_.max_payload_len; >- if (fragment_index == 0) >+ if (input_fragments_.size() == 1) >+ payload_size_left -= limits_.single_packet_reduction_len; >+ else if (fragment_index == 0) > payload_size_left -= limits_.first_packet_reduction_len; >- if (fragment_index + 1 == input_fragments_.size()) >+ else if (fragment_index + 1 == input_fragments_.size()) > payload_size_left -= limits_.last_packet_reduction_len; > const Fragment* fragment = &input_fragments_[fragment_index]; > if (payload_size_left < fragment->length) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_h264.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_h264.h >index 73e40878a18bd9edd1d5016f0bf6b9303b0ceaa1..fbd4fd9509075d8fb852e33d4526bb32e338d394 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_h264.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_h264.h >@@ -11,12 +11,17 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_RTP_FORMAT_H264_H_ > #define MODULES_RTP_RTCP_SOURCE_RTP_FORMAT_H264_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <deque> > #include <memory> > #include <queue> >-#include <string> > >+#include "api/array_view.h" >+#include "modules/include/module_common_types.h" > #include "modules/rtp_rtcp/source/rtp_format.h" >+#include "modules/rtp_rtcp/source/rtp_packet_to_send.h" >+#include 
"modules/video_coding/codecs/h264/include/h264_globals.h" > #include "rtc_base/buffer.h" > #include "rtc_base/constructormagic.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_h264_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_h264_unittest.cc >index edf907aac859cf47ef10f06a9be0af86cc9d9ff8..aeab813b53e2651c4deb02094474654acfb54ba4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_h264_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_h264_unittest.cc >@@ -390,7 +390,7 @@ std::vector<int> TestFua(size_t frame_payload_size, > NoFragmentation(frame)); > std::vector<RtpPacketToSend> packets = FetchAllPackets(&packetizer); > >- RTC_CHECK_GE(packets.size(), 2); // Single packet indicates it is not FuA. >+ EXPECT_GE(packets.size(), 2u); // Single packet indicates it is not FuA. > std::vector<uint16_t> fua_header; > std::vector<int> payload_sizes; > >@@ -422,6 +422,7 @@ TEST(RtpPacketizerH264Test, FUAWithFirstPacketReduction) { > RtpPacketizer::PayloadSizeLimits limits; > limits.max_payload_len = 1200; > limits.first_packet_reduction_len = 4; >+ limits.single_packet_reduction_len = 4; > EXPECT_THAT(TestFua(1198, limits), ElementsAre(597, 601)); > } > >@@ -429,14 +430,14 @@ TEST(RtpPacketizerH264Test, FUAWithLastPacketReduction) { > RtpPacketizer::PayloadSizeLimits limits; > limits.max_payload_len = 1200; > limits.last_packet_reduction_len = 4; >+ limits.single_packet_reduction_len = 4; > EXPECT_THAT(TestFua(1198, limits), ElementsAre(601, 597)); > } > >-TEST(RtpPacketizerH264Test, FUAWithFirstAndLastPacketReduction) { >+TEST(RtpPacketizerH264Test, FUAWithSinglePacketReduction) { > RtpPacketizer::PayloadSizeLimits limits; > limits.max_payload_len = 1199; >- limits.first_packet_reduction_len = 100; >- limits.last_packet_reduction_len = 100; >+ limits.single_packet_reduction_len = 200; > 
EXPECT_THAT(TestFua(1000, limits), ElementsAre(500, 500)); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_unittest.cc >index a79c434b629f325b1ef6424a265be3630c79183c..ae1b5b054adfa3e4e2150eab41871eb50198f3bd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_unittest.cc >@@ -199,21 +199,32 @@ TEST(RtpPacketizerSplitAboutEqually, GivesNonZeroPayloadLengthEachPacket) { > } > > TEST(RtpPacketizerSplitAboutEqually, >- OnePacketWhenExtraSpaceIsEnoughForSumOfFirstAndLastPacketReductions) { >+ IgnoresFirstAndLastPacketReductionWhenPayloadFitsIntoSinglePacket) { > RtpPacketizer::PayloadSizeLimits limits; > limits.max_payload_len = 30; >- limits.first_packet_reduction_len = 6; >- limits.last_packet_reduction_len = 4; >+ limits.first_packet_reduction_len = 29; >+ limits.last_packet_reduction_len = 29; >+ limits.single_packet_reduction_len = 10; > > EXPECT_THAT(RtpPacketizer::SplitAboutEqually(20, limits), ElementsAre(20)); > } > > TEST(RtpPacketizerSplitAboutEqually, >- TwoPacketsWhenExtraSpaceIsTooSmallForSumOfFirstAndLastPacketReductions) { >+ OnePacketWhenExtraSpaceIsEnoughForSinglePacketReduction) { >+ RtpPacketizer::PayloadSizeLimits limits; >+ limits.max_payload_len = 30; >+ limits.single_packet_reduction_len = 10; >+ >+ EXPECT_THAT(RtpPacketizer::SplitAboutEqually(20, limits), ElementsAre(20)); >+} >+ >+TEST(RtpPacketizerSplitAboutEqually, >+ TwoPacketsWhenExtraSpaceIsTooSmallForSinglePacketReduction) { > RtpPacketizer::PayloadSizeLimits limits; > limits.max_payload_len = 29; >- limits.first_packet_reduction_len = 6; >- limits.last_packet_reduction_len = 4; >+ limits.first_packet_reduction_len = 3; >+ limits.last_packet_reduction_len = 1; >+ limits.single_packet_reduction_len = 10; > > // First 
packet needs two more extra bytes compared to last one, > // so should have two less payload bytes. >@@ -246,8 +257,7 @@ TEST(RtpPacketizerSplitAboutEqually, RejectsZeroLastPacketLen) { > TEST(RtpPacketizerSplitAboutEqually, CantPutSinglePayloadByteInTwoPackets) { > RtpPacketizer::PayloadSizeLimits limits; > limits.max_payload_len = 10; >- limits.first_packet_reduction_len = 6; >- limits.last_packet_reduction_len = 4; >+ limits.single_packet_reduction_len = 10; > > EXPECT_THAT(RtpPacketizer::SplitAboutEqually(1, limits), IsEmpty()); > } >@@ -255,8 +265,7 @@ TEST(RtpPacketizerSplitAboutEqually, CantPutSinglePayloadByteInTwoPackets) { > TEST(RtpPacketizerSplitAboutEqually, CanPutTwoPayloadBytesInTwoPackets) { > RtpPacketizer::PayloadSizeLimits limits; > limits.max_payload_len = 10; >- limits.first_packet_reduction_len = 6; >- limits.last_packet_reduction_len = 4; >+ limits.single_packet_reduction_len = 10; > > EXPECT_THAT(RtpPacketizer::SplitAboutEqually(2, limits), ElementsAre(1, 1)); > } >@@ -264,8 +273,7 @@ TEST(RtpPacketizerSplitAboutEqually, CanPutTwoPayloadBytesInTwoPackets) { > TEST(RtpPacketizerSplitAboutEqually, CanPutSinglePayloadByteInOnePacket) { > RtpPacketizer::PayloadSizeLimits limits; > limits.max_payload_len = 11; >- limits.first_packet_reduction_len = 6; >- limits.last_packet_reduction_len = 4; >+ limits.single_packet_reduction_len = 10; > > EXPECT_THAT(RtpPacketizer::SplitAboutEqually(1, limits), ElementsAre(1)); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_video_generic.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_video_generic.cc >index edd1e3c53051635435b8c0e11a85113cacb38425..92aada4688a85fa5cbcc758720a785691d509868 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_video_generic.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_video_generic.cc >@@ -8,11 +8,13 @@ > * be found in the 
AUTHORS file in the root of the source tree. > */ > >-#include <string> >+#include <assert.h> >+#include <string.h> > >-#include "modules/include/module_common_types.h" >+#include "absl/types/optional.h" > #include "modules/rtp_rtcp/source/rtp_format_video_generic.h" > #include "modules/rtp_rtcp/source/rtp_packet_to_send.h" >+#include "rtc_base/checks.h" > #include "rtc_base/logging.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_video_generic.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_video_generic.h >index 03509f5041df6f7cf6d48ae788b889565e32dcb0..3458d4942329a15c120dcff59e30ff7cb126db11 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_video_generic.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_video_generic.h >@@ -10,6 +10,7 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_RTP_FORMAT_VIDEO_GENERIC_H_ > #define MODULES_RTP_RTCP_SOURCE_RTP_FORMAT_VIDEO_GENERIC_H_ > >+#include <stdint.h> > #include <vector> > > #include "api/array_view.h" >@@ -18,6 +19,10 @@ > #include "rtc_base/constructormagic.h" > > namespace webrtc { >+ >+class RtpPacketToSend; >+struct RTPVideoHeader; >+ > namespace RtpFormatVideoGeneric { > static const uint8_t kKeyFrameBit = 0x01; > static const uint8_t kFirstPacketBit = 0x02; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_vp8.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_vp8.cc >index f40434e7e031f729f0d43122855c95bc923dc200..a1248cf040b173c5c07ad4b74fdddc1794258dad 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_vp8.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_vp8.cc >@@ -10,13 +10,13 @@ > > #include "modules/rtp_rtcp/source/rtp_format_vp8.h" > >+#include <stdint.h> > #include <string.h> // memcpy >- 
>-#include <limits> >-#include <utility> > #include <vector> > >+#include "common_types.h" // NOLINT(build/include) > #include "modules/rtp_rtcp/source/rtp_packet_to_send.h" >+#include "modules/video_coding/codecs/interface/common_constants.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_vp8.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_vp8.h >index e4bc36e3f3f4ee25f636c9c1ee785ca7b847b636..444298fbbf809d555bbcd408f62d5c8e674b56c5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_vp8.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_vp8.h >@@ -25,13 +25,15 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_RTP_FORMAT_VP8_H_ > #define MODULES_RTP_RTCP_SOURCE_RTP_FORMAT_VP8_H_ > >-#include <string> >+#include <stddef.h> >+#include <cstdint> > #include <vector> > > #include "absl/container/inlined_vector.h" > #include "api/array_view.h" >-#include "modules/include/module_common_types.h" > #include "modules/rtp_rtcp/source/rtp_format.h" >+#include "modules/rtp_rtcp/source/rtp_packet_to_send.h" >+#include "modules/video_coding/codecs/vp8/include/vp8_globals.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_vp9.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_vp9.cc >index 4419d1a30c0b5f983b542428ab498e3c6ad80e17..9cd75143b8c2971af32aab540a48fc643d54ee74 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_vp9.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_vp9.cc >@@ -12,9 +12,9 @@ > > #include <string.h> > >-#include <cmath> >- >+#include "common_types.h" // NOLINT(build/include) > #include "modules/rtp_rtcp/source/rtp_packet_to_send.h" >+#include 
"modules/video_coding/codecs/interface/common_constants.h" > #include "rtc_base/bitbuffer.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" >@@ -451,6 +451,7 @@ RtpPacketizerVp9::RtpPacketizerVp9(rtc::ArrayView<const uint8_t> payload, > remaining_payload_(payload) { > limits.max_payload_len -= header_size_; > limits.first_packet_reduction_len += first_packet_extra_header_size_; >+ limits.single_packet_reduction_len += first_packet_extra_header_size_; > > payload_sizes_ = SplitAboutEqually(payload.size(), limits); > current_packet_ = payload_sizes_.begin(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_vp9.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_vp9.h >index 8a2a2a604c814d32255589343419efdb926e9ea9..c3b8f17a4b0981bac1481c964e5338474e4ae693 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_vp9.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_format_vp9.h >@@ -21,12 +21,14 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_RTP_FORMAT_VP9_H_ > #define MODULES_RTP_RTCP_SOURCE_RTP_FORMAT_VP9_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <vector> > > #include "api/array_view.h" >-#include "modules/include/module_common_types.h" > #include "modules/rtp_rtcp/source/rtp_format.h" > #include "modules/rtp_rtcp/source/rtp_packet_to_send.h" >+#include "modules/video_coding/codecs/vp9/include/vp9_globals.h" > #include "rtc_base/constructormagic.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_generic_frame_descriptor.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_generic_frame_descriptor.cc >index 1b931327bad1f8a791297420b635763836ce1d94..c27fb6e3f6f85fb91da9409e86326ed8c8333dde 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_generic_frame_descriptor.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_generic_frame_descriptor.cc >@@ -10,6 +10,8 @@ > > #include "modules/rtp_rtcp/source/rtp_generic_frame_descriptor.h" > >+#include <cstdint> >+ > #include "rtc_base/checks.h" > > namespace webrtc { >@@ -56,6 +58,16 @@ void RtpGenericFrameDescriptor::SetSpatialLayersBitmask( > spatial_layers_ = spatial_layers; > } > >+void RtpGenericFrameDescriptor::SetResolution(int width, int height) { >+ RTC_DCHECK(FirstPacketInSubFrame()); >+ RTC_DCHECK_GE(width, 0); >+ RTC_DCHECK_LE(width, 0xFFFF); >+ RTC_DCHECK_GE(height, 0); >+ RTC_DCHECK_LE(height, 0xFFFF); >+ width_ = width; >+ height_ = height; >+} >+ > uint16_t RtpGenericFrameDescriptor::FrameId() const { > RTC_DCHECK(FirstPacketInSubFrame()); > return frame_id_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_generic_frame_descriptor.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_generic_frame_descriptor.h >index 1bde4fa56dcc789f8f6cc3b6c7eba1f861e429e3..3a6c34d59ff2630fa1436de57f4fce6968a1049c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_generic_frame_descriptor.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_generic_frame_descriptor.h >@@ -50,6 +50,10 @@ class RtpGenericFrameDescriptor { > uint8_t SpatialLayersBitmask() const; > void SetSpatialLayersBitmask(uint8_t spatial_layers); > >+ int Width() const { return width_; } >+ int Height() const { return height_; } >+ void SetResolution(int width, int height); >+ > uint16_t FrameId() const; > void SetFrameId(uint16_t frame_id); > >@@ -72,6 +76,9 @@ class RtpGenericFrameDescriptor { > uint8_t temporal_layer_ = 0; > size_t num_frame_deps_ = 0; > uint16_t frame_deps_id_diffs_[kMaxNumFrameDependencies]; >+ int width_ = 0; >+ int height_ = 0; >+ > std::vector<uint8_t> byte_representation_; > }; > >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_generic_frame_descriptor_extension.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_generic_frame_descriptor_extension.cc >index c7b52d5ba680d14fd86562fa4a85bb10f090278c..7cd120d9f45c395706e7519350d9591f82c32293 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_generic_frame_descriptor_extension.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_generic_frame_descriptor_extension.cc >@@ -36,6 +36,14 @@ constexpr uint8_t kFlageXtendedOffset = 0x02; > // B: + FID + > // | | > // +-+-+-+-+-+-+-+-+ >+// | | >+// + Width + >+// B=1 | | >+// and +-+-+-+-+-+-+-+-+ >+// D=0 | | >+// + Height + >+// | | >+// +-+-+-+-+-+-+-+-+ > // D: | FDIFF |X|M| > // +---------------+ > // X: | ... | >@@ -75,6 +83,12 @@ bool RtpGenericFrameDescriptorExtension::Parse( > descriptor->ClearFrameDependencies(); > size_t offset = 4; > bool has_more_dependencies = (data[0] & kFlagDependencies) != 0; >+ if (!has_more_dependencies && data.size() >= offset + 4) { >+ uint16_t width = (data[offset] << 8) | data[offset + 1]; >+ uint16_t height = (data[offset + 2] << 8) | data[offset + 3]; >+ descriptor->SetResolution(width, height); >+ offset += 4; >+ } > while (has_more_dependencies) { > if (data.size() == offset) > return false; >@@ -91,7 +105,7 @@ bool RtpGenericFrameDescriptorExtension::Parse( > if (!descriptor->AddFrameDependencyDiff(fdiff)) > return false; > } >- return data.size() == offset; >+ return true; > } > > size_t RtpGenericFrameDescriptorExtension::ValueSize( >@@ -103,6 +117,11 @@ size_t RtpGenericFrameDescriptorExtension::ValueSize( > for (uint16_t fdiff : descriptor.FrameDependenciesDiffs()) { > size += (fdiff >= (1 << 6)) ? 
2 : 1; > } >+ if (descriptor.FirstPacketInSubFrame() && >+ descriptor.FrameDependenciesDiffs().empty() && descriptor.Width() > 0 && >+ descriptor.Height() > 0) { >+ size += 4; >+ } > return size; > } > >@@ -129,6 +148,13 @@ bool RtpGenericFrameDescriptorExtension::Write( > data[3] = frame_id >> 8; > rtc::ArrayView<const uint16_t> fdiffs = descriptor.FrameDependenciesDiffs(); > size_t offset = 4; >+ if (descriptor.FirstPacketInSubFrame() && fdiffs.empty() && >+ descriptor.Width() > 0 && descriptor.Height() > 0) { >+ data[offset++] = (descriptor.Width() >> 8); >+ data[offset++] = (descriptor.Width() & 0xFF); >+ data[offset++] = (descriptor.Height() >> 8); >+ data[offset++] = (descriptor.Height() & 0xFF); >+ } > for (size_t i = 0; i < fdiffs.size(); i++) { > bool extended = fdiffs[i] >= (1 << 6); > bool more = i < fdiffs.size() - 1; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_generic_frame_descriptor_extension.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_generic_frame_descriptor_extension.h >index 2dd9dd49ebe12e02a9c05fe26c3bdb431e182ee0..0d673e060198d81dc0faf378cfe8e853bb280a27 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_generic_frame_descriptor_extension.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_generic_frame_descriptor_extension.h >@@ -21,6 +21,7 @@ namespace webrtc { > > class RtpGenericFrameDescriptorExtension { > public: >+ using value_type = RtpGenericFrameDescriptor; > static constexpr RTPExtensionType kId = kRtpExtensionGenericFrameDescriptor; > static constexpr char kUri[] = > "http://www.webrtc.org/experiments/rtp-hdrext/" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_generic_frame_descriptor_extension_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_generic_frame_descriptor_extension_unittest.cc >index 
bb69e1455dd940caf57bb15280bc3195f5ce7473..7f8fa2f5a41e447fcd935c59c4e55046f8e724f7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_generic_frame_descriptor_extension_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_generic_frame_descriptor_extension_unittest.cc >@@ -279,5 +279,32 @@ TEST(RtpGenericFrameDescriptorExtensionTest, WriteTwoFrameDependencies) { > EXPECT_THAT(buffer, ElementsAreArray(kRaw)); > } > >+TEST(RtpGenericFrameDescriptorExtensionTest, >+ ParseResolutionOnIndependentFrame) { >+ constexpr int kWidth = 0x2468; >+ constexpr int kHeight = 0x6543; >+ constexpr uint8_t kRaw[] = {0x80, 0x01, 0x00, 0x00, 0x24, 0x68, 0x65, 0x43}; >+ RtpGenericFrameDescriptor descriptor; >+ >+ ASSERT_TRUE(RtpGenericFrameDescriptorExtension::Parse(kRaw, &descriptor)); >+ EXPECT_EQ(descriptor.Width(), kWidth); >+ EXPECT_EQ(descriptor.Height(), kHeight); >+} >+ >+TEST(RtpGenericFrameDescriptorExtensionTest, >+ WriteResolutionOnIndependentFrame) { >+ constexpr int kWidth = 0x2468; >+ constexpr int kHeight = 0x6543; >+ constexpr uint8_t kRaw[] = {0x80, 0x01, 0x00, 0x00, 0x24, 0x68, 0x65, 0x43}; >+ RtpGenericFrameDescriptor descriptor; >+ descriptor.SetFirstPacketInSubFrame(true); >+ descriptor.SetResolution(kWidth, kHeight); >+ >+ ASSERT_EQ(RtpGenericFrameDescriptorExtension::ValueSize(descriptor), >+ sizeof(kRaw)); >+ uint8_t buffer[sizeof(kRaw)]; >+ EXPECT_TRUE(RtpGenericFrameDescriptorExtension::Write(buffer, descriptor)); >+ EXPECT_THAT(buffer, ElementsAreArray(kRaw)); >+} > } // namespace > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_header_extension_map.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_header_extension_map.cc >index 168665afc4532deb4d9e8635a2d7e0ab320842ab..8e0a484d9774e606ead845cda46e4087266cc72f 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_header_extension_map.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_header_extension_map.cc >@@ -43,6 +43,7 @@ constexpr ExtensionInfo kExtensions[] = { > CreateExtensionInfo<RepairedRtpStreamId>(), > CreateExtensionInfo<RtpMid>(), > CreateExtensionInfo<RtpGenericFrameDescriptorExtension>(), >+ CreateExtensionInfo<ColorSpaceExtension>(), > }; > > // Because of kRtpExtensionNone, NumberOfExtension is 1 bigger than the actual >@@ -56,15 +57,17 @@ static_assert(arraysize(kExtensions) == > constexpr RTPExtensionType RtpHeaderExtensionMap::kInvalidType; > constexpr int RtpHeaderExtensionMap::kInvalidId; > >-RtpHeaderExtensionMap::RtpHeaderExtensionMap() >- : mixed_one_two_byte_header_supported_(false) { >+RtpHeaderExtensionMap::RtpHeaderExtensionMap() : RtpHeaderExtensionMap(false) {} >+ >+RtpHeaderExtensionMap::RtpHeaderExtensionMap(bool extmap_allow_mixed) >+ : extmap_allow_mixed_(extmap_allow_mixed) { > for (auto& id : ids_) > id = kInvalidId; > } > > RtpHeaderExtensionMap::RtpHeaderExtensionMap( > rtc::ArrayView<const RtpExtension> extensions) >- : RtpHeaderExtensionMap() { >+ : RtpHeaderExtensionMap(false) { > for (const RtpExtension& extension : extensions) > RegisterByUri(extension.id, extension.uri); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_header_extensions.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_header_extensions.cc >index fc4b7ceb9281af5c122e5b49bda50d54ccafe759..92694cd5a3a466a328d81c4377f04561b28b2080 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_header_extensions.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_header_extensions.cc >@@ -10,10 +10,14 @@ > > #include "modules/rtp_rtcp/source/rtp_header_extensions.h" > >+#include <string.h> >+#include <cmath> >+ > #include 
"modules/rtp_rtcp/include/rtp_cvo.h" > #include "modules/rtp_rtcp/source/byte_io.h" >+// TODO(bug:9855) Move kNoSpatialIdx from vp9_globals.h to common_constants >+#include "modules/video_coding/codecs/interface/common_constants.h" > #include "rtc_base/checks.h" >-#include "rtc_base/logging.h" > > namespace webrtc { > // Absolute send time in RTP streams. >@@ -430,6 +434,172 @@ bool FrameMarkingExtension::Write(rtc::ArrayView<uint8_t> data, > return true; > } > >+// Color space including HDR metadata as an optional field. >+// >+// RTP header extension to carry HDR metadata. >+// Float values are upscaled by a static factor and transmitted as integers. >+// >+// Data layout with HDR metadata >+// 0 1 2 3 >+// 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1 2 >+// +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+ >+// | ID | length=30 | Primaries | Transfer | >+// +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+ >+// | Matrix | Range | luminance_max | >+// +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+ >+// | | luminance_min | >+// +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+ >+// | mastering_metadata.primary_r.x and .y | >+// +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+ >+// | mastering_metadata.primary_g.x and .y | >+// +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+ >+// | mastering_metadata.primary_b.x and .y | >+// +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+ >+// | mastering_metadata.white.x and .y | >+// +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+ >+// | max_content_light_level | max_frame_average_light_level | >+// +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+ >+// >+// Data layout without HDR metadata >+// 0 1 2 3 >+// 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1 2 >+// 
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+ >+// | ID | length=4 | Primaries | Transfer | >+// +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+ >+// | Matrix | Range | >+// +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+- >+ >+constexpr RTPExtensionType ColorSpaceExtension::kId; >+constexpr uint8_t ColorSpaceExtension::kValueSizeBytes; >+constexpr const char ColorSpaceExtension::kUri[]; >+ >+bool ColorSpaceExtension::Parse(rtc::ArrayView<const uint8_t> data, >+ ColorSpace* color_space) { >+ RTC_DCHECK(color_space); >+ if (data.size() != kValueSizeBytes && >+ data.size() != kValueSizeBytesWithoutHdrMetadata) >+ return false; >+ >+ size_t offset = 0; >+ // Read color space information. >+ if (!color_space->set_primaries_from_uint8(data.data()[offset++])) >+ return false; >+ if (!color_space->set_transfer_from_uint8(data.data()[offset++])) >+ return false; >+ if (!color_space->set_matrix_from_uint8(data.data()[offset++])) >+ return false; >+ if (!color_space->set_range_from_uint8(data.data()[offset++])) >+ return false; >+ >+ // Read HDR metadata if it exists, otherwise clear it. 
>+ if (data.size() == kValueSizeBytesWithoutHdrMetadata) { >+ color_space->set_hdr_metadata(nullptr); >+ } else { >+ HdrMetadata hdr_metadata; >+ offset += ParseLuminance(data.data() + offset, >+ &hdr_metadata.mastering_metadata.luminance_max, >+ kLuminanceMaxDenominator); >+ offset += ParseLuminance(data.data() + offset, >+ &hdr_metadata.mastering_metadata.luminance_min, >+ kLuminanceMinDenominator); >+ offset += ParseChromaticity(data.data() + offset, >+ &hdr_metadata.mastering_metadata.primary_r); >+ offset += ParseChromaticity(data.data() + offset, >+ &hdr_metadata.mastering_metadata.primary_g); >+ offset += ParseChromaticity(data.data() + offset, >+ &hdr_metadata.mastering_metadata.primary_b); >+ offset += ParseChromaticity(data.data() + offset, >+ &hdr_metadata.mastering_metadata.white_point); >+ hdr_metadata.max_content_light_level = >+ ByteReader<uint16_t>::ReadBigEndian(data.data() + offset); >+ offset += 2; >+ hdr_metadata.max_frame_average_light_level = >+ ByteReader<uint16_t>::ReadBigEndian(data.data() + offset); >+ offset += 2; >+ color_space->set_hdr_metadata(&hdr_metadata); >+ } >+ RTC_DCHECK_EQ(ValueSize(*color_space), offset); >+ return true; >+} >+ >+bool ColorSpaceExtension::Write(rtc::ArrayView<uint8_t> data, >+ const ColorSpace& color_space) { >+ RTC_DCHECK(data.size() >= ValueSize(color_space)); >+ size_t offset = 0; >+ // Write color space information. >+ data.data()[offset++] = static_cast<uint8_t>(color_space.primaries()); >+ data.data()[offset++] = static_cast<uint8_t>(color_space.transfer()); >+ data.data()[offset++] = static_cast<uint8_t>(color_space.matrix()); >+ data.data()[offset++] = static_cast<uint8_t>(color_space.range()); >+ >+ // Write HDR metadata if it exists. 
>+ if (color_space.hdr_metadata()) { >+ const HdrMetadata& hdr_metadata = *color_space.hdr_metadata(); >+ offset += WriteLuminance(data.data() + offset, >+ hdr_metadata.mastering_metadata.luminance_max, >+ kLuminanceMaxDenominator); >+ offset += WriteLuminance(data.data() + offset, >+ hdr_metadata.mastering_metadata.luminance_min, >+ kLuminanceMinDenominator); >+ offset += WriteChromaticity(data.data() + offset, >+ hdr_metadata.mastering_metadata.primary_r); >+ offset += WriteChromaticity(data.data() + offset, >+ hdr_metadata.mastering_metadata.primary_g); >+ offset += WriteChromaticity(data.data() + offset, >+ hdr_metadata.mastering_metadata.primary_b); >+ offset += WriteChromaticity(data.data() + offset, >+ hdr_metadata.mastering_metadata.white_point); >+ >+ ByteWriter<uint16_t>::WriteBigEndian(data.data() + offset, >+ hdr_metadata.max_content_light_level); >+ offset += 2; >+ ByteWriter<uint16_t>::WriteBigEndian( >+ data.data() + offset, hdr_metadata.max_frame_average_light_level); >+ offset += 2; >+ } >+ RTC_DCHECK_EQ(ValueSize(color_space), offset); >+ return true; >+} >+ >+size_t ColorSpaceExtension::ParseChromaticity( >+ const uint8_t* data, >+ HdrMasteringMetadata::Chromaticity* p) { >+ uint16_t chromaticity_x_scaled = ByteReader<uint16_t>::ReadBigEndian(data); >+ uint16_t chromaticity_y_scaled = >+ ByteReader<uint16_t>::ReadBigEndian(data + 2); >+ p->x = static_cast<float>(chromaticity_x_scaled) / kChromaticityDenominator; >+ p->y = static_cast<float>(chromaticity_y_scaled) / kChromaticityDenominator; >+ return 4; // Return number of bytes read. >+} >+ >+size_t ColorSpaceExtension::ParseLuminance(const uint8_t* data, >+ float* f, >+ int denominator) { >+ uint32_t luminance_scaled = ByteReader<uint32_t, 3>::ReadBigEndian(data); >+ *f = static_cast<float>(luminance_scaled) / denominator; >+ return 3; // Return number of bytes read. 
>+} >+ >+size_t ColorSpaceExtension::WriteChromaticity( >+ uint8_t* data, >+ const HdrMasteringMetadata::Chromaticity& p) { >+ RTC_DCHECK_GE(p.x, 0.0f); >+ RTC_DCHECK_GE(p.y, 0.0f); >+ ByteWriter<uint16_t>::WriteBigEndian( >+ data, std::round(p.x * kChromaticityDenominator)); >+ ByteWriter<uint16_t>::WriteBigEndian( >+ data + 2, std::round(p.y * kChromaticityDenominator)); >+ return 4; // Return number of bytes written. >+} >+ >+size_t ColorSpaceExtension::WriteLuminance(uint8_t* data, >+ float f, >+ int denominator) { >+ RTC_DCHECK_GE(f, 0.0f); >+ ByteWriter<uint32_t, 3>::WriteBigEndian(data, std::round(f * denominator)); >+ return 3; // Return number of bytes written. >+} >+ > bool BaseRtpStringExtension::Parse(rtc::ArrayView<const uint8_t> data, > StringRtpHeaderExtension* str) { > if (data.empty() || data[0] == 0) // Valid string extension can't be empty. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_header_extensions.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_header_extensions.h >index 84e9831479db3e94e995b6fb65fa403a3fcce1b0..42a6216c7b529892f5bd58d27cd940c895012dda 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_header_extensions.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_header_extensions.h >@@ -10,19 +10,25 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_RTP_HEADER_EXTENSIONS_H_ > #define MODULES_RTP_RTCP_SOURCE_RTP_HEADER_EXTENSIONS_H_ > >+#include <stddef.h> > #include <stdint.h> > #include <string> > > #include "api/array_view.h" >+#include "api/rtp_headers.h" >+#include "api/video/color_space.h" > #include "api/video/video_content_type.h" >+#include "api/video/video_frame_marking.h" > #include "api/video/video_rotation.h" > #include "api/video/video_timing.h" >+#include "common_types.h" // NOLINT(build/include) > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > > namespace webrtc { > > class AbsoluteSendTime { > 
public: >+ using value_type = uint32_t; > static constexpr RTPExtensionType kId = kRtpExtensionAbsoluteSendTime; > static constexpr uint8_t kValueSizeBytes = 3; > static constexpr const char kUri[] = >@@ -57,6 +63,7 @@ class AudioLevel { > > class TransmissionOffset { > public: >+ using value_type = int32_t; > static constexpr RTPExtensionType kId = kRtpExtensionTransmissionTimeOffset; > static constexpr uint8_t kValueSizeBytes = 3; > static constexpr const char kUri[] = "urn:ietf:params:rtp-hdrext:toffset"; >@@ -68,6 +75,7 @@ class TransmissionOffset { > > class TransportSequenceNumber { > public: >+ using value_type = uint16_t; > static constexpr RTPExtensionType kId = kRtpExtensionTransportSequenceNumber; > static constexpr uint8_t kValueSizeBytes = 2; > static constexpr const char kUri[] = >@@ -80,6 +88,7 @@ class TransportSequenceNumber { > > class VideoOrientation { > public: >+ using value_type = VideoRotation; > static constexpr RTPExtensionType kId = kRtpExtensionVideoRotation; > static constexpr uint8_t kValueSizeBytes = 1; > static constexpr const char kUri[] = "urn:3gpp:video-orientation"; >@@ -94,6 +103,7 @@ class VideoOrientation { > > class PlayoutDelayLimits { > public: >+ using value_type = PlayoutDelay; > static constexpr RTPExtensionType kId = kRtpExtensionPlayoutDelay; > static constexpr uint8_t kValueSizeBytes = 3; > static constexpr const char kUri[] = >@@ -117,6 +127,7 @@ class PlayoutDelayLimits { > > class VideoContentTypeExtension { > public: >+ using value_type = VideoContentType; > static constexpr RTPExtensionType kId = kRtpExtensionVideoContentType; > static constexpr uint8_t kValueSizeBytes = 1; > static constexpr const char kUri[] = >@@ -133,6 +144,7 @@ class VideoContentTypeExtension { > > class VideoTimingExtension { > public: >+ using value_type = VideoSendTiming; > static constexpr RTPExtensionType kId = kRtpExtensionVideoTiming; > static constexpr uint8_t kValueSizeBytes = 13; > static constexpr const char kUri[] = >@@ -155,6 
+167,7 @@ class VideoTimingExtension { > > class FrameMarkingExtension { > public: >+ using value_type = FrameMarking; > static constexpr RTPExtensionType kId = kRtpExtensionFrameMarking; > static constexpr const char kUri[] = > "http://tools.ietf.org/html/draft-ietf-avtext-framemarking-07"; >@@ -169,10 +182,41 @@ class FrameMarkingExtension { > static bool IsScalable(uint8_t temporal_id, uint8_t layer_id); > }; > >+class ColorSpaceExtension { >+ public: >+ using value_type = ColorSpace; >+ static constexpr RTPExtensionType kId = kRtpExtensionColorSpace; >+ static constexpr uint8_t kValueSizeBytes = 30; >+ static constexpr uint8_t kValueSizeBytesWithoutHdrMetadata = 4; >+ // TODO(webrtc:8651): Change to a valid uri. >+ static constexpr const char kUri[] = "rtp-colorspace-uri-placeholder"; >+ >+ static bool Parse(rtc::ArrayView<const uint8_t> data, >+ ColorSpace* color_space); >+ static size_t ValueSize(const ColorSpace& color_space) { >+ return color_space.hdr_metadata() ? kValueSizeBytes >+ : kValueSizeBytesWithoutHdrMetadata; >+ } >+ static bool Write(rtc::ArrayView<uint8_t> data, >+ const ColorSpace& color_space); >+ >+ private: >+ static constexpr int kChromaticityDenominator = 10000; // 0.0001 resolution. >+ static constexpr int kLuminanceMaxDenominator = 100; // 0.01 resolution. >+ static constexpr int kLuminanceMinDenominator = 10000; // 0.0001 resolution. >+ static size_t ParseChromaticity(const uint8_t* data, >+ HdrMasteringMetadata::Chromaticity* p); >+ static size_t ParseLuminance(const uint8_t* data, float* f, int denominator); >+ static size_t WriteChromaticity(uint8_t* data, >+ const HdrMasteringMetadata::Chromaticity& p); >+ static size_t WriteLuminance(uint8_t* data, float f, int denominator); >+}; >+ > // Base extension class for RTP header extensions which are strings. > // Subclasses must defined kId and kUri static constexpr members. 
> class BaseRtpStringExtension { > public: >+ using value_type = std::string; > // String RTP header extensions are limited to 16 bytes because it is the > // maximum length that can be encoded with one-byte header extensions. > static constexpr uint8_t kMaxValueSizeBytes = 16; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_header_parser.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_header_parser.cc >index df68f74dd9377222319d6fe0a30fc1aa8327ccbf..6481a403a0436a3d3e46f45cc77b2e2cb14a12b1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_header_parser.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_header_parser.cc >@@ -9,9 +9,12 @@ > */ > #include "modules/rtp_rtcp/include/rtp_header_parser.h" > >+#include <string.h> >+ > #include "modules/rtp_rtcp/include/rtp_header_extension_map.h" > #include "modules/rtp_rtcp/source/rtp_utility.h" > #include "rtc_base/criticalsection.h" >+#include "rtc_base/thread_annotations.h" > > namespace webrtc { > >@@ -25,8 +28,10 @@ class RtpHeaderParserImpl : public RtpHeaderParser { > RTPHeader* header) const override; > > bool RegisterRtpHeaderExtension(RTPExtensionType type, uint8_t id) override; >+ bool RegisterRtpHeaderExtension(RtpExtension extension) override; > > bool DeregisterRtpHeaderExtension(RTPExtensionType type) override; >+ bool DeregisterRtpHeaderExtension(RtpExtension extension) override; > > private: > rtc::CriticalSection critical_section_; >@@ -63,6 +68,10 @@ bool RtpHeaderParserImpl::Parse(const uint8_t* packet, > } > return true; > } >+bool RtpHeaderParserImpl::RegisterRtpHeaderExtension(RtpExtension extension) { >+ rtc::CritScope cs(&critical_section_); >+ return rtp_header_extension_map_.RegisterByUri(extension.id, extension.uri); >+} > > bool RtpHeaderParserImpl::RegisterRtpHeaderExtension(RTPExtensionType type, > uint8_t id) { >@@ -70,6 +79,12 @@ bool 
RtpHeaderParserImpl::RegisterRtpHeaderExtension(RTPExtensionType type, > return rtp_header_extension_map_.RegisterByType(id, type); > } > >+bool RtpHeaderParserImpl::DeregisterRtpHeaderExtension(RtpExtension extension) { >+ rtc::CritScope cs(&critical_section_); >+ return rtp_header_extension_map_.Deregister( >+ rtp_header_extension_map_.GetType(extension.id)); >+} >+ > bool RtpHeaderParserImpl::DeregisterRtpHeaderExtension(RTPExtensionType type) { > rtc::CritScope cs(&critical_section_); > return rtp_header_extension_map_.Deregister(type) == 0; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet.cc >index 0b9ed80af56ed867a54c9580fcbcc6344b3ecdc8..9d4dce4dea27e4db04951ffc4f14d7f5b004fb41 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet.cc >@@ -10,16 +10,15 @@ > > #include "modules/rtp_rtcp/source/rtp_packet.h" > >+#include <cstdint> > #include <cstring> > #include <utility> > > #include "api/rtpparameters.h" >-#include "common_types.h" // NOLINT(build/include) > #include "modules/rtp_rtcp/source/byte_io.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" > #include "rtc_base/numerics/safe_conversions.h" >-#include "rtc_base/random.h" > > namespace webrtc { > namespace { >@@ -212,8 +211,7 @@ rtc::ArrayView<uint8_t> RtpPacket::AllocateRawExtension(int id, size_t length) { > const bool two_byte_header_required = > id > RtpExtension::kOneByteHeaderExtensionMaxId || > length > RtpExtension::kOneByteHeaderExtensionMaxValueSize || length == 0; >- RTC_CHECK(!two_byte_header_required || >- extensions_.IsMixedOneTwoByteHeaderSupported()); >+ RTC_CHECK(!two_byte_header_required || extensions_.ExtmapAllowMixed()); > > uint16_t profile_id; > if (extensions_size_ > 0) { >@@ -359,22 +357,20 @@ uint8_t* 
RtpPacket::SetPayloadSize(size_t size_bytes) { > return WriteAt(payload_offset_); > } > >-bool RtpPacket::SetPadding(uint8_t size_bytes, Random* random) { >- RTC_DCHECK(random); >- if (payload_offset_ + payload_size_ + size_bytes > capacity()) { >- RTC_LOG(LS_WARNING) << "Cannot set padding size " << size_bytes << ", only " >+bool RtpPacket::SetPadding(size_t padding_bytes) { >+ if (payload_offset_ + payload_size_ + padding_bytes > capacity()) { >+ RTC_LOG(LS_WARNING) << "Cannot set padding size " << padding_bytes >+ << ", only " > << (capacity() - payload_offset_ - payload_size_) > << " bytes left in buffer."; > return false; > } >- padding_size_ = size_bytes; >+ padding_size_ = rtc::dchecked_cast<uint8_t>(padding_bytes); > buffer_.SetSize(payload_offset_ + payload_size_ + padding_size_); > if (padding_size_ > 0) { > size_t padding_offset = payload_offset_ + payload_size_; > size_t padding_end = padding_offset + padding_size_; >- for (size_t offset = padding_offset; offset < padding_end - 1; ++offset) { >- WriteAt(offset, random->Rand<uint8_t>()); >- } >+ memset(WriteAt(padding_offset), 0, padding_size_ - 1); > WriteAt(padding_end - 1, padding_size_); > WriteAt(0, data()[0] | 0x20); // Set padding bit. > } else { >@@ -556,7 +552,7 @@ rtc::ArrayView<uint8_t> RtpPacket::AllocateExtension(ExtensionType type, > size_t length) { > // TODO(webrtc:7990): Add support for empty extensions (length==0). > if (length == 0 || length > RtpExtension::kMaxValueSize || >- (!extensions_.IsMixedOneTwoByteHeaderSupported() && >+ (!extensions_.ExtmapAllowMixed() && > length > RtpExtension::kOneByteHeaderExtensionMaxValueSize)) { > return nullptr; > } >@@ -566,7 +562,7 @@ rtc::ArrayView<uint8_t> RtpPacket::AllocateExtension(ExtensionType type, > // Extension not registered. 
> return nullptr; > } >- if (!extensions_.IsMixedOneTwoByteHeaderSupported() && >+ if (!extensions_.ExtmapAllowMixed() && > id > RtpExtension::kOneByteHeaderExtensionMaxId) { > return nullptr; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet.h >index 15c3865b3594dc876e41806a4783f1746253dd74..76666b7f12d7637519f052339309123ce215e7b4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet.h >@@ -12,10 +12,12 @@ > > #include <vector> > >+#include "absl/types/optional.h" > #include "api/array_view.h" > #include "modules/rtp_rtcp/include/rtp_header_extension_map.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > #include "rtc_base/copyonwritebuffer.h" >+#include "rtc_base/deprecation.h" > > namespace webrtc { > class Random; >@@ -94,8 +96,11 @@ class RtpPacket { > template <typename Extension> > bool HasExtension() const; > >- template <typename Extension, typename... Values> >- bool GetExtension(Values...) const; >+ template <typename Extension, typename FirstValue, typename... Values> >+ bool GetExtension(FirstValue, Values...) const; >+ >+ template <typename Extension> >+ absl::optional<typename Extension::value_type> GetExtension() const; > > // Returns view of the raw extension or empty view on failure. > template <typename Extension> >@@ -111,7 +116,12 @@ class RtpPacket { > uint8_t* SetPayloadSize(size_t size_bytes); > // Same as SetPayloadSize but doesn't guarantee to keep current payload. 
> uint8_t* AllocatePayload(size_t size_bytes); >- bool SetPadding(uint8_t size_bytes, Random* random); >+ RTC_DEPRECATED >+ bool SetPadding(uint8_t size_bytes, Random* random) { >+ return SetPadding(size_bytes); >+ } >+ >+ bool SetPadding(size_t padding_size); > > private: > struct ExtensionInfo { >@@ -177,12 +187,21 @@ bool RtpPacket::HasExtension() const { > return !FindExtension(Extension::kId).empty(); > } > >-template <typename Extension, typename... Values> >-bool RtpPacket::GetExtension(Values... values) const { >+template <typename Extension, typename FirstValue, typename... Values> >+bool RtpPacket::GetExtension(FirstValue first, Values... values) const { > auto raw = FindExtension(Extension::kId); > if (raw.empty()) > return false; >- return Extension::Parse(raw, values...); >+ return Extension::Parse(raw, first, values...); >+} >+ >+template <typename Extension> >+absl::optional<typename Extension::value_type> RtpPacket::GetExtension() const { >+ absl::optional<typename Extension::value_type> result; >+ auto raw = FindExtension(Extension::kId); >+ if (raw.empty() || !Extension::Parse(raw, &result.emplace())) >+ result = absl::nullopt; >+ return result; > } > > template <typename Extension> >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_history.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_history.cc >index e1374feb57fcd9961215128e111f2337ec68fae0..8c1c8ebbbfae91cab6ea04c528a36d9c9974b9ca 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_history.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_history.cc >@@ -97,6 +97,15 @@ void RtpPacketHistory::PutRtpPacket(std::unique_ptr<RtpPacketToSend> packet, > const uint16_t rtp_seq_no = packet->SequenceNumber(); > StoredPacket& stored_packet = packet_history_[rtp_seq_no]; > RTC_DCHECK(stored_packet.packet == nullptr); >+ if (stored_packet.packet) { >+ // It 
is an error if this happens. But it can happen if the sequence numbers
>+ // for some reason restart without the history having been reset.
>+ auto size_iterator = packet_size_.find(stored_packet.packet->size());
>+ if (size_iterator != packet_size_.end() &&
>+ size_iterator->second == stored_packet.packet->SequenceNumber()) {
>+ packet_size_.erase(size_iterator);
>+ }
>+ }
> stored_packet.packet = std::move(packet);
>
> if (stored_packet.packet->capture_time_ms() <= 0) {
>@@ -116,8 +125,7 @@ void RtpPacketHistory::PutRtpPacket(std::unique_ptr<RtpPacketToSend> packet,
> }
>
> std::unique_ptr<RtpPacketToSend> RtpPacketHistory::GetPacketAndSetSendTime(
>- uint16_t sequence_number,
>- bool verify_rtt) {
>+ uint16_t sequence_number) {
> rtc::CritScope cs(&lock_);
> if (mode_ == StorageMode::kDisabled) {
> return nullptr;
>@@ -130,7 +138,7 @@ std::unique_ptr<RtpPacketToSend> RtpPacketHistory::GetPacketAndSetSendTime(
> }
>
> StoredPacket& packet = rtp_it->second;
>- if (verify_rtt && !VerifyRtt(rtp_it->second, now_ms)) {
>+ if (!VerifyRtt(rtp_it->second, now_ms)) {
> return nullptr;
> }
>
>@@ -150,8 +158,7 @@ std::unique_ptr<RtpPacketToSend> RtpPacketHistory::GetPacketAndSetSendTime(
> }
>
> absl::optional<RtpPacketHistory::PacketState> RtpPacketHistory::GetPacketState(
>- uint16_t sequence_number,
>- bool verify_rtt) const {
>+ uint16_t sequence_number) const {
> rtc::CritScope cs(&lock_);
> if (mode_ == StorageMode::kDisabled) {
> return absl::nullopt;
>@@ -162,7 +169,7 @@ absl::optional<RtpPacketHistory::PacketState> RtpPacketHistory::GetPacketState(
> return absl::nullopt;
> }
>
>- if (verify_rtt && !VerifyRtt(rtp_it->second, clock_->TimeInMilliseconds())) {
>+ if (!VerifyRtt(rtp_it->second, clock_->TimeInMilliseconds())) {
> return absl::nullopt;
> }
>
>@@ -209,8 +216,19 @@ std::unique_ptr<RtpPacketToSend> RtpPacketHistory::GetBestFittingPacket(
> const uint16_t seq_no = upper_bound_diff < lower_bound_diff
> ? 
size_iter_upper->second > : size_iter_lower->second; >- RtpPacketToSend* best_packet = >- packet_history_.find(seq_no)->second.packet.get(); >+ auto history_it = packet_history_.find(seq_no); >+ if (history_it == packet_history_.end()) { >+ RTC_LOG(LS_ERROR) << "Can't find packet in history with seq_no" << seq_no; >+ RTC_DCHECK(false); >+ return nullptr; >+ } >+ if (!history_it->second.packet) { >+ RTC_LOG(LS_ERROR) << "Packet pointer is null in history for seq_no" >+ << seq_no; >+ RTC_DCHECK(false); >+ return nullptr; >+ } >+ RtpPacketToSend* best_packet = history_it->second.packet.get(); > return absl::make_unique<RtpPacketToSend>(*best_packet); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_history.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_history.h >index 1646ba7c76f3afa5e9265e2630929a76ff3da072..095424e3d713123c9c3bb3b7096dbcdd5ea50760 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_history.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_history.h >@@ -76,16 +76,13 @@ class RtpPacketHistory { > absl::optional<int64_t> send_time_ms); > > // Gets stored RTP packet corresponding to the input |sequence number|. >- // Returns nullptr if packet is not found. If |verify_rtt| is true, doesn't >- // return packet that was (re)sent too recently. >+ // Returns nullptr if packet is not found or was (re)sent too recently. > std::unique_ptr<RtpPacketToSend> GetPacketAndSetSendTime( >- uint16_t sequence_number, >- bool verify_rtt); >+ uint16_t sequence_number); > > // Similar to GetPacketAndSetSendTime(), but only returns a snapshot of the > // current state for packet, and never updates internal state. 
>- absl::optional<PacketState> GetPacketState(uint16_t sequence_number, >- bool verify_rtt) const; >+ absl::optional<PacketState> GetPacketState(uint16_t sequence_number) const; > > // Get the packet (if any) from the history, with size closest to > // |packet_size|. The exact size of the packet is not guaranteed. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_history_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_history_unittest.cc >index 85e9eb28cd7453a462d287a610d3d46e426c9020..140434cbbde79a6d1a25942ff801eaf1f6d9936f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_history_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_history_unittest.cc >@@ -63,11 +63,11 @@ TEST_F(RtpPacketHistoryTest, ClearsHistoryAfterSetStoreStatus) { > // Store a packet, but with send-time. It should then not be removed. > hist_.PutRtpPacket(CreateRtpPacket(kStartSeqNum), kAllowRetransmission, > absl::nullopt); >- EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum, false)); >+ EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum)); > > // Changing store status, even to the current one, will clear the history. > hist_.SetStorePacketsStatus(StorageMode::kStore, 10); >- EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum, false)); >+ EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum)); > } > > TEST_F(RtpPacketHistoryTest, StartSeqResetAfterReset) { >@@ -75,16 +75,16 @@ TEST_F(RtpPacketHistoryTest, StartSeqResetAfterReset) { > // Store a packet, but with send-time. It should then not be removed. > hist_.PutRtpPacket(CreateRtpPacket(kStartSeqNum), kAllowRetransmission, > absl::nullopt); >- EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum, false)); >+ EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum)); > > // Changing store status, to clear the history. 
> hist_.SetStorePacketsStatus(StorageMode::kStoreAndCull, 10); >- EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum, false)); >+ EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum)); > > // Add a new packet. > hist_.PutRtpPacket(CreateRtpPacket(kStartSeqNum + 1), kAllowRetransmission, > absl::nullopt); >- EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum + 1, false)); >+ EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum + 1)); > > // Advance time past where packet expires. > fake_clock_.AdvanceTimeMilliseconds( >@@ -94,9 +94,9 @@ TEST_F(RtpPacketHistoryTest, StartSeqResetAfterReset) { > // Add one more packet and verify no state left from packet before reset. > hist_.PutRtpPacket(CreateRtpPacket(To16u(kStartSeqNum + 2)), > kAllowRetransmission, absl::nullopt); >- EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum, false)); >- EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum + 1, false)); >- EXPECT_TRUE(hist_.GetPacketState(To16u(kStartSeqNum + 2), false)); >+ EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum)); >+ EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum + 1)); >+ EXPECT_TRUE(hist_.GetPacketState(To16u(kStartSeqNum + 2))); > } > > TEST_F(RtpPacketHistoryTest, NoStoreStatus) { >@@ -104,21 +104,21 @@ TEST_F(RtpPacketHistoryTest, NoStoreStatus) { > std::unique_ptr<RtpPacketToSend> packet = CreateRtpPacket(kStartSeqNum); > hist_.PutRtpPacket(std::move(packet), kAllowRetransmission, absl::nullopt); > // Packet should not be stored. 
>- EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum, false)); >+ EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum)); > } > > TEST_F(RtpPacketHistoryTest, GetRtpPacket_NotStored) { > hist_.SetStorePacketsStatus(StorageMode::kStore, 10); >- EXPECT_FALSE(hist_.GetPacketState(0, false)); >+ EXPECT_FALSE(hist_.GetPacketState(0)); > } > > TEST_F(RtpPacketHistoryTest, PutRtpPacket) { > hist_.SetStorePacketsStatus(StorageMode::kStore, 10); > std::unique_ptr<RtpPacketToSend> packet = CreateRtpPacket(kStartSeqNum); > >- EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum, false)); >+ EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum)); > hist_.PutRtpPacket(std::move(packet), kAllowRetransmission, absl::nullopt); >- EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum, false)); >+ EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum)); > } > > TEST_F(RtpPacketHistoryTest, GetRtpPacket) { >@@ -130,7 +130,7 @@ TEST_F(RtpPacketHistoryTest, GetRtpPacket) { > hist_.PutRtpPacket(std::move(packet), kAllowRetransmission, absl::nullopt); > > std::unique_ptr<RtpPacketToSend> packet_out = >- hist_.GetPacketAndSetSendTime(kStartSeqNum, false); >+ hist_.GetPacketAndSetSendTime(kStartSeqNum); > EXPECT_TRUE(packet_out); > EXPECT_EQ(buffer, packet_out->Buffer()); > EXPECT_EQ(capture_time_ms, packet_out->capture_time_ms()); >@@ -146,7 +146,7 @@ TEST_F(RtpPacketHistoryTest, NoCaptureTime) { > hist_.PutRtpPacket(std::move(packet), kAllowRetransmission, absl::nullopt); > > std::unique_ptr<RtpPacketToSend> packet_out = >- hist_.GetPacketAndSetSendTime(kStartSeqNum, false); >+ hist_.GetPacketAndSetSendTime(kStartSeqNum); > EXPECT_TRUE(packet_out); > EXPECT_EQ(buffer, packet_out->Buffer()); > EXPECT_EQ(capture_time_ms, packet_out->capture_time_ms()); >@@ -161,19 +161,21 @@ TEST_F(RtpPacketHistoryTest, DontRetransmit) { > > // Get the packet and verify data. 
> std::unique_ptr<RtpPacketToSend> packet_out; >- packet_out = hist_.GetPacketAndSetSendTime(kStartSeqNum, false); >+ packet_out = hist_.GetPacketAndSetSendTime(kStartSeqNum); > ASSERT_TRUE(packet_out); > EXPECT_EQ(buffer.size(), packet_out->size()); > EXPECT_EQ(capture_time_ms, packet_out->capture_time_ms()); > > // Non-retransmittable packets are immediately removed, so getting in again > // should fail. >- EXPECT_FALSE(hist_.GetPacketAndSetSendTime(kStartSeqNum, false)); >+ EXPECT_FALSE(hist_.GetPacketAndSetSendTime(kStartSeqNum)); > } > > TEST_F(RtpPacketHistoryTest, PacketStateIsCorrect) { > const uint32_t kSsrc = 92384762; >+ const int64_t kRttMs = 100; > hist_.SetStorePacketsStatus(StorageMode::kStoreAndCull, 10); >+ hist_.SetRtt(kRttMs); > std::unique_ptr<RtpPacketToSend> packet = CreateRtpPacket(kStartSeqNum); > packet->SetSsrc(kSsrc); > packet->SetPayloadSize(1234); >@@ -183,7 +185,7 @@ TEST_F(RtpPacketHistoryTest, PacketStateIsCorrect) { > fake_clock_.TimeInMilliseconds()); > > absl::optional<RtpPacketHistory::PacketState> state = >- hist_.GetPacketState(kStartSeqNum, false); >+ hist_.GetPacketState(kStartSeqNum); > ASSERT_TRUE(state); > EXPECT_EQ(state->rtp_sequence_number, kStartSeqNum); > EXPECT_EQ(state->send_time_ms, fake_clock_.TimeInMilliseconds()); >@@ -193,9 +195,10 @@ TEST_F(RtpPacketHistoryTest, PacketStateIsCorrect) { > EXPECT_EQ(state->times_retransmitted, 0u); > > fake_clock_.AdvanceTimeMilliseconds(1); >- EXPECT_TRUE(hist_.GetPacketAndSetSendTime(kStartSeqNum, false)); >+ EXPECT_TRUE(hist_.GetPacketAndSetSendTime(kStartSeqNum)); >+ fake_clock_.AdvanceTimeMilliseconds(kRttMs + 1); > >- state = hist_.GetPacketState(kStartSeqNum, false); >+ state = hist_.GetPacketState(kStartSeqNum); > ASSERT_TRUE(state); > EXPECT_EQ(state->times_retransmitted, 1u); > } >@@ -211,7 +214,7 @@ TEST_F(RtpPacketHistoryTest, MinResendTimeWithPacer) { > hist_.PutRtpPacket(std::move(packet), kAllowRetransmission, absl::nullopt); > > // First transmission: 
TimeToSendPacket() call from pacer. >- EXPECT_TRUE(hist_.GetPacketAndSetSendTime(kStartSeqNum, false)); >+ EXPECT_TRUE(hist_.GetPacketAndSetSendTime(kStartSeqNum)); > > // First retransmission - allow early retransmission. > fake_clock_.AdvanceTimeMilliseconds(1); >@@ -221,26 +224,24 @@ TEST_F(RtpPacketHistoryTest, MinResendTimeWithPacer) { > // packet is there and verify RTT constraints. Then we use the ssrc > // and sequence number to enqueue the retransmission in the pacer > // 2) When the pacer determines that it is time to send the packet, it calls >- // GetPacketAndSetSendTime(). This time we do not need to verify RTT as >- // has that has already been done. >+ // GetPacketAndSetSendTime(). > absl::optional<RtpPacketHistory::PacketState> packet_state = >- hist_.GetPacketState(kStartSeqNum, /*verify_rtt=*/true); >+ hist_.GetPacketState(kStartSeqNum); > EXPECT_TRUE(packet_state); > EXPECT_EQ(len, packet_state->payload_size); > EXPECT_EQ(capture_time_ms, packet_state->capture_time_ms); > > // Retransmission was allowed, next send it from pacer. >- EXPECT_TRUE(hist_.GetPacketAndSetSendTime(kStartSeqNum, >- /*verify_rtt=*/false)); >+ EXPECT_TRUE(hist_.GetPacketAndSetSendTime(kStartSeqNum)); > > // Second retransmission - advance time to just before retransmission OK. > fake_clock_.AdvanceTimeMilliseconds(kMinRetransmitIntervalMs - 1); >- EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum, /*verify_rtt=*/true)); >+ EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum)); > > // Advance time to just after retransmission OK. 
> fake_clock_.AdvanceTimeMilliseconds(1); >- EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum, /*verify_rtt=*/true)); >- EXPECT_TRUE(hist_.GetPacketAndSetSendTime(kStartSeqNum, false)); >+ EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum)); >+ EXPECT_TRUE(hist_.GetPacketAndSetSendTime(kStartSeqNum)); > } > > TEST_F(RtpPacketHistoryTest, MinResendTimeWithoutPacer) { >@@ -256,18 +257,18 @@ TEST_F(RtpPacketHistoryTest, MinResendTimeWithoutPacer) { > > // First retransmission - allow early retransmission. > fake_clock_.AdvanceTimeMilliseconds(1); >- packet = hist_.GetPacketAndSetSendTime(kStartSeqNum, true); >+ packet = hist_.GetPacketAndSetSendTime(kStartSeqNum); > EXPECT_TRUE(packet); > EXPECT_EQ(len, packet->size()); > EXPECT_EQ(capture_time_ms, packet->capture_time_ms()); > > // Second retransmission - advance time to just before retransmission OK. > fake_clock_.AdvanceTimeMilliseconds(kMinRetransmitIntervalMs - 1); >- EXPECT_FALSE(hist_.GetPacketAndSetSendTime(kStartSeqNum, true)); >+ EXPECT_FALSE(hist_.GetPacketAndSetSendTime(kStartSeqNum)); > > // Advance time to just after retransmission OK. > fake_clock_.AdvanceTimeMilliseconds(1); >- EXPECT_TRUE(hist_.GetPacketAndSetSendTime(kStartSeqNum, true)); >+ EXPECT_TRUE(hist_.GetPacketAndSetSendTime(kStartSeqNum)); > } > > TEST_F(RtpPacketHistoryTest, RemovesOldestSentPacketWhenAtMaxSize) { >@@ -289,7 +290,7 @@ TEST_F(RtpPacketHistoryTest, RemovesOldestSentPacketWhenAtMaxSize) { > } > > // First packet should still be there. >- EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum, false)); >+ EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum)); > > // History is full, oldest one should be overwritten. > std::unique_ptr<RtpPacketToSend> packet = >@@ -298,8 +299,8 @@ TEST_F(RtpPacketHistoryTest, RemovesOldestSentPacketWhenAtMaxSize) { > fake_clock_.TimeInMilliseconds()); > > // Oldest packet should be gone, but packet after than one still present. 
>- EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum, false)); >- EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum + 1, false)); >+ EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum)); >+ EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum + 1)); > } > > TEST_F(RtpPacketHistoryTest, RemovesOldestPacketWhenAtMaxCapacity) { >@@ -317,7 +318,7 @@ TEST_F(RtpPacketHistoryTest, RemovesOldestPacketWhenAtMaxCapacity) { > } > > // First packet should still be there. >- EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum, false)); >+ EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum)); > > // History is full, oldest one should be overwritten. > std::unique_ptr<RtpPacketToSend> packet = >@@ -326,8 +327,8 @@ TEST_F(RtpPacketHistoryTest, RemovesOldestPacketWhenAtMaxCapacity) { > fake_clock_.TimeInMilliseconds()); > > // Oldest packet should be gone, but packet after than one still present. >- EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum, false)); >- EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum + 1, false)); >+ EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum)); >+ EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum + 1)); > } > > TEST_F(RtpPacketHistoryTest, DontRemoveUnsentPackets) { >@@ -343,25 +344,25 @@ TEST_F(RtpPacketHistoryTest, DontRemoveUnsentPackets) { > fake_clock_.AdvanceTimeMilliseconds(RtpPacketHistory::kMinPacketDurationMs); > > // First packet should still be there. >- EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum, false)); >+ EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum)); > > // History is full, but old packets not sent, so allow expansion. > hist_.PutRtpPacket(CreateRtpPacket(To16u(kStartSeqNum + kMaxNumPackets)), > kAllowRetransmission, fake_clock_.TimeInMilliseconds()); >- EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum, false)); >+ EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum)); > > // Set all packet as sent and advance time past min packet duration time, > // otherwise packets till still be prevented from being removed. 
> for (size_t i = 0; i <= kMaxNumPackets; ++i) { >- EXPECT_TRUE(hist_.GetPacketAndSetSendTime(To16u(kStartSeqNum + i), false)); >+ EXPECT_TRUE(hist_.GetPacketAndSetSendTime(To16u(kStartSeqNum + i))); > } > fake_clock_.AdvanceTimeMilliseconds(RtpPacketHistory::kMinPacketDurationMs); > // Add a new packet, this means the two oldest ones will be culled. > hist_.PutRtpPacket(CreateRtpPacket(To16u(kStartSeqNum + kMaxNumPackets + 1)), > kAllowRetransmission, fake_clock_.TimeInMilliseconds()); >- EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum, false)); >- EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum + 1, false)); >- EXPECT_TRUE(hist_.GetPacketState(To16u(kStartSeqNum + 2), false)); >+ EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum)); >+ EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum + 1)); >+ EXPECT_TRUE(hist_.GetPacketState(To16u(kStartSeqNum + 2))); > } > > TEST_F(RtpPacketHistoryTest, DontRemoveTooRecentlyTransmittedPackets) { >@@ -378,15 +379,15 @@ TEST_F(RtpPacketHistoryTest, DontRemoveTooRecentlyTransmittedPackets) { > hist_.PutRtpPacket(CreateRtpPacket(kStartSeqNum + 1), kAllowRetransmission, > fake_clock_.TimeInMilliseconds()); > // First packet should still be there. >- EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum, false)); >+ EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum)); > > // Advance time to where packet will be eligible for removal and try again. > fake_clock_.AdvanceTimeMilliseconds(1); > hist_.PutRtpPacket(CreateRtpPacket(To16u(kStartSeqNum + 2)), > kAllowRetransmission, fake_clock_.TimeInMilliseconds()); > // First packet should no be gone, but next one still there. 
>- EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum, false)); >- EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum + 1, false)); >+ EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum)); >+ EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum + 1)); > } > > TEST_F(RtpPacketHistoryTest, DontRemoveTooRecentlyTransmittedPacketsHighRtt) { >@@ -407,15 +408,15 @@ TEST_F(RtpPacketHistoryTest, DontRemoveTooRecentlyTransmittedPacketsHighRtt) { > hist_.PutRtpPacket(CreateRtpPacket(kStartSeqNum + 1), kAllowRetransmission, > fake_clock_.TimeInMilliseconds()); > // First packet should still be there. >- EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum, false)); >+ EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum)); > > // Advance time to where packet will be eligible for removal and try again. > fake_clock_.AdvanceTimeMilliseconds(1); > hist_.PutRtpPacket(CreateRtpPacket(To16u(kStartSeqNum + 2)), > kAllowRetransmission, fake_clock_.TimeInMilliseconds()); > // First packet should no be gone, but next one still there. >- EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum, false)); >- EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum + 1, false)); >+ EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum)); >+ EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum + 1)); > } > > TEST_F(RtpPacketHistoryTest, RemovesOldWithCulling) { >@@ -431,14 +432,14 @@ TEST_F(RtpPacketHistoryTest, RemovesOldWithCulling) { > fake_clock_.AdvanceTimeMilliseconds(kMaxPacketDurationMs - 1); > > // First packet should still be there. >- EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum, false)); >+ EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum)); > > // Advance to where packet can be culled, even if buffer is not full. 
> fake_clock_.AdvanceTimeMilliseconds(1); > hist_.PutRtpPacket(CreateRtpPacket(kStartSeqNum + 1), kAllowRetransmission, > fake_clock_.TimeInMilliseconds()); > >- EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum, false)); >+ EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum)); > } > > TEST_F(RtpPacketHistoryTest, RemovesOldWithCullingHighRtt) { >@@ -457,14 +458,14 @@ TEST_F(RtpPacketHistoryTest, RemovesOldWithCullingHighRtt) { > fake_clock_.AdvanceTimeMilliseconds(kMaxPacketDurationMs - 1); > > // First packet should still be there. >- EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum, false)); >+ EXPECT_TRUE(hist_.GetPacketState(kStartSeqNum)); > > // Advance to where packet can be culled, even if buffer is not full. > fake_clock_.AdvanceTimeMilliseconds(1); > hist_.PutRtpPacket(CreateRtpPacket(kStartSeqNum + 1), kAllowRetransmission, > fake_clock_.TimeInMilliseconds()); > >- EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum, false)); >+ EXPECT_FALSE(hist_.GetPacketState(kStartSeqNum)); > } > > TEST_F(RtpPacketHistoryTest, GetBestFittingPacket) { >@@ -503,7 +504,7 @@ TEST_F(RtpPacketHistoryTest, > ASSERT_THAT(packet, ::testing::NotNull()); > > // Send the packet and advance time past where packet expires. 
>- ASSERT_THAT(hist_.GetPacketAndSetSendTime(kStartSeqNum, false), >+ ASSERT_THAT(hist_.GetPacketAndSetSendTime(kStartSeqNum), > ::testing::NotNull()); > fake_clock_.AdvanceTimeMilliseconds( > RtpPacketHistory::kPacketCullingDelayFactor * >@@ -513,7 +514,7 @@ TEST_F(RtpPacketHistoryTest, > packet->SetPayloadSize(100); > hist_.PutRtpPacket(std::move(packet), kAllowRetransmission, > fake_clock_.TimeInMilliseconds()); >- ASSERT_FALSE(hist_.GetPacketState(kStartSeqNum, false)); >+ ASSERT_FALSE(hist_.GetPacketState(kStartSeqNum)); > > auto best_packet = hist_.GetBestFittingPacket(target_packet_size + 2); > ASSERT_THAT(best_packet, ::testing::NotNull()); >@@ -573,7 +574,7 @@ TEST_F(RtpPacketHistoryTest, > hist_.PutRtpPacket(std::move(packet), kDontRetransmit, > fake_clock_.TimeInMilliseconds()); > EXPECT_THAT(hist_.GetBestFittingPacket(50), ::testing::IsNull()); >- EXPECT_THAT(hist_.GetPacketAndSetSendTime(kStartSeqNum, false), >+ EXPECT_THAT(hist_.GetPacketAndSetSendTime(kStartSeqNum), > ::testing::NotNull()); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_received.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_received.cc >index c8deb99d90e5586fa4c41525df4adf7f95524dd3..f80fad68e0d326d6effeb5595a240f24173a980c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_received.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_received.cc >@@ -10,6 +10,8 @@ > > #include "modules/rtp_rtcp/source/rtp_packet_received.h" > >+#include <stddef.h> >+#include <cstdint> > #include <vector> > > #include "modules/rtp_rtcp/source/rtp_header_extensions.h" >@@ -67,6 +69,7 @@ void RtpPacketReceived::GetHeader(RTPHeader* header) const { > GetExtension<RepairedRtpStreamId>(&header->extension.repaired_stream_id); > GetExtension<RtpMid>(&header->extension.mid); > GetExtension<PlayoutDelayLimits>(&header->extension.playout_delay); >+ 
header->extension.color_space = GetExtension<ColorSpaceExtension>(); > } > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_received.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_received.h >index 86900310e49e3198e9e49125d35b38a670858851..566b116a10a613c246093f37929bdfd3d6159f7e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_received.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_received.h >@@ -10,9 +10,11 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_RTP_PACKET_RECEIVED_H_ > #define MODULES_RTP_RTCP_SOURCE_RTP_PACKET_RECEIVED_H_ > >+#include <stdint.h> > #include <vector> > >-#include "common_types.h" // NOLINT(build/include) >+#include "api/array_view.h" >+#include "api/rtp_headers.h" > #include "modules/rtp_rtcp/source/rtp_packet.h" > #include "system_wrappers/include/ntp_time.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_to_send.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_to_send.cc >index 0153bbe73ac229ed3400728f5c0be062b2d01d93..b55e74aaf01f807dd29758521a23e8de9e30fd99 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_to_send.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_to_send.cc >@@ -10,6 +10,8 @@ > > #include "modules/rtp_rtcp/source/rtp_packet_to_send.h" > >+#include <cstdint> >+ > namespace webrtc { > > RtpPacketToSend::RtpPacketToSend(const ExtensionManager* extensions) >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_to_send.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_to_send.h >index 1ed7dda478269772047eb670f3a87b43d29ef7ea..56b1024a059d7ac5bdaae1d3d0b22303f4ccdf2d 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_to_send.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_to_send.h >@@ -10,9 +10,12 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_RTP_PACKET_TO_SEND_H_ > #define MODULES_RTP_RTCP_SOURCE_RTP_PACKET_TO_SEND_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <vector> > > #include "api/array_view.h" >+#include "api/video/video_timing.h" > #include "modules/rtp_rtcp/source/rtp_header_extensions.h" > #include "modules/rtp_rtcp/source/rtp_packet.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_unittest.cc >index 125000d2dc94ed4be17862e04b24406bf3d2747c..b1c0e42525eae52677c0f80a0ccf64171459e4e7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_packet_unittest.cc >@@ -19,6 +19,7 @@ > namespace webrtc { > namespace { > >+using ::testing::Each; > using ::testing::ElementsAre; > using ::testing::ElementsAreArray; > using ::testing::IsEmpty; >@@ -184,6 +185,52 @@ constexpr uint8_t kPacketWithLegacyTimingExtension[] = { > 0x04, 0x00, 0x00, 0x00, > 0x00, 0x00, 0x00, 0x00}; > // clang-format on >+ >+HdrMetadata CreateTestHdrMetadata() { >+ // Random but reasonable HDR metadata. 
>+ HdrMetadata hdr_metadata; >+ hdr_metadata.mastering_metadata.luminance_max = 2000.0; >+ hdr_metadata.mastering_metadata.luminance_min = 2.0001; >+ hdr_metadata.mastering_metadata.primary_r.x = 0.3003; >+ hdr_metadata.mastering_metadata.primary_r.y = 0.4004; >+ hdr_metadata.mastering_metadata.primary_g.x = 0.3201; >+ hdr_metadata.mastering_metadata.primary_g.y = 0.4604; >+ hdr_metadata.mastering_metadata.primary_b.x = 0.3409; >+ hdr_metadata.mastering_metadata.primary_b.y = 0.4907; >+ hdr_metadata.mastering_metadata.white_point.x = 0.4103; >+ hdr_metadata.mastering_metadata.white_point.y = 0.4806; >+ hdr_metadata.max_content_light_level = 2345; >+ hdr_metadata.max_frame_average_light_level = 1789; >+ return hdr_metadata; >+} >+ >+ColorSpace CreateTestColorSpace(bool with_hdr_metadata) { >+ ColorSpace color_space( >+ ColorSpace::PrimaryID::kBT709, ColorSpace::TransferID::kGAMMA22, >+ ColorSpace::MatrixID::kSMPTE2085, ColorSpace::RangeID::kFull); >+ if (with_hdr_metadata) { >+ HdrMetadata hdr_metadata = CreateTestHdrMetadata(); >+ color_space.set_hdr_metadata(&hdr_metadata); >+ } >+ return color_space; >+} >+ >+void TestCreateAndParseColorSpaceExtension(bool with_hdr_metadata) { >+ // Create packet with extension. >+ RtpPacket::ExtensionManager extensions(/*extmap-allow-mixed=*/true); >+ extensions.Register<ColorSpaceExtension>(1); >+ RtpPacket packet(&extensions); >+ const ColorSpace kColorSpace = CreateTestColorSpace(with_hdr_metadata); >+ EXPECT_TRUE(packet.SetExtension<ColorSpaceExtension>(kColorSpace)); >+ packet.SetPayloadSize(42); >+ >+ // Read packet with the extension. 
>+ RtpPacketReceived parsed(&extensions); >+ EXPECT_TRUE(parsed.Parse(packet.Buffer())); >+ ColorSpace parsed_color_space; >+ EXPECT_TRUE(parsed.GetExtension<ColorSpaceExtension>(&parsed_color_space)); >+ EXPECT_EQ(kColorSpace, parsed_color_space); >+} > } // namespace > > TEST(RtpPacketTest, CreateMinimum) { >@@ -225,8 +272,7 @@ TEST(RtpPacketTest, CreateWith2Extensions) { > } > > TEST(RtpPacketTest, CreateWithTwoByteHeaderExtensionFirst) { >- RtpPacketToSend::ExtensionManager extensions; >- extensions.SetMixedOneTwoByteHeaderSupported(true); >+ RtpPacketToSend::ExtensionManager extensions(true); > extensions.Register(kRtpExtensionTransmissionTimeOffset, > kTransmissionOffsetExtensionId); > extensions.Register(kRtpExtensionAudioLevel, kAudioLevelExtensionId); >@@ -247,8 +293,7 @@ TEST(RtpPacketTest, CreateWithTwoByteHeaderExtensionFirst) { > > TEST(RtpPacketTest, CreateWithTwoByteHeaderExtensionLast) { > // This test will trigger RtpPacket::PromoteToTwoByteHeaderExtension(). >- RtpPacketToSend::ExtensionManager extensions; >- extensions.SetMixedOneTwoByteHeaderSupported(true); >+ RtpPacketToSend::ExtensionManager extensions(true); > extensions.Register(kRtpExtensionTransmissionTimeOffset, > kTransmissionOffsetExtensionId); > extensions.Register(kRtpExtensionAudioLevel, kAudioLevelExtensionId); >@@ -367,11 +412,10 @@ TEST(RtpPacketTest, CreatePurePadding) { > packet.SetSequenceNumber(kSeqNum); > packet.SetTimestamp(kTimestamp); > packet.SetSsrc(kSsrc); >- Random random(0x123456789); > > EXPECT_LT(packet.size(), packet.capacity()); >- EXPECT_FALSE(packet.SetPadding(kPaddingSize + 1, &random)); >- EXPECT_TRUE(packet.SetPadding(kPaddingSize, &random)); >+ EXPECT_FALSE(packet.SetPadding(kPaddingSize + 1)); >+ EXPECT_TRUE(packet.SetPadding(kPaddingSize)); > EXPECT_EQ(packet.size(), packet.capacity()); > } > >@@ -383,13 +427,48 @@ TEST(RtpPacketTest, CreateUnalignedPadding) { > packet.SetTimestamp(kTimestamp); > packet.SetSsrc(kSsrc); > 
packet.SetPayloadSize(kPayloadSize); >- Random r(0x123456789); > > EXPECT_LT(packet.size(), packet.capacity()); >- EXPECT_TRUE(packet.SetPadding(kMaxPaddingSize, &r)); >+ EXPECT_TRUE(packet.SetPadding(kMaxPaddingSize)); > EXPECT_EQ(packet.size(), packet.capacity()); > } > >+TEST(RtpPacketTest, WritesPaddingSizeToLastByte) { >+ const size_t kPaddingSize = 5; >+ RtpPacket packet; >+ >+ EXPECT_TRUE(packet.SetPadding(kPaddingSize)); >+ EXPECT_EQ(packet.data()[packet.size() - 1], kPaddingSize); >+} >+ >+TEST(RtpPacketTest, UsesZerosForPadding) { >+ const size_t kPaddingSize = 5; >+ RtpPacket packet; >+ >+ EXPECT_TRUE(packet.SetPadding(kPaddingSize)); >+ EXPECT_THAT(rtc::MakeArrayView(packet.data() + 12, kPaddingSize - 1), >+ Each(0)); >+} >+ >+TEST(RtpPacketTest, CreateOneBytePadding) { >+ size_t kPayloadSize = 123; >+ RtpPacket packet(nullptr, 12 + kPayloadSize + 1); >+ packet.SetPayloadSize(kPayloadSize); >+ >+ EXPECT_TRUE(packet.SetPadding(1)); >+ >+ EXPECT_EQ(packet.size(), 12 + kPayloadSize + 1); >+ EXPECT_EQ(packet.padding_size(), 1u); >+} >+ >+TEST(RtpPacketTest, FailsToAddPaddingWithoutCapacity) { >+ size_t kPayloadSize = 123; >+ RtpPacket packet(nullptr, 12 + kPayloadSize); >+ packet.SetPayloadSize(kPayloadSize); >+ >+ EXPECT_FALSE(packet.SetPadding(1)); >+} >+ > TEST(RtpPacketTest, ParseMinimum) { > RtpPacketReceived packet; > EXPECT_TRUE(packet.Parse(kMinimumPacket, sizeof(kMinimumPacket))); >@@ -433,6 +512,23 @@ TEST(RtpPacketTest, ParseWithExtension) { > EXPECT_EQ(0u, packet.padding_size()); > } > >+TEST(RtpPacketTest, GetExtensionWithoutParametersReturnsOptionalValue) { >+ RtpPacket::ExtensionManager extensions; >+ extensions.Register<TransmissionOffset>(kTransmissionOffsetExtensionId); >+ extensions.Register<RtpStreamId>(kRtpStreamIdExtensionId); >+ >+ RtpPacketReceived packet(&extensions); >+ EXPECT_TRUE(packet.Parse(kPacketWithTO, sizeof(kPacketWithTO))); >+ >+ auto time_offset = packet.GetExtension<TransmissionOffset>(); >+ static_assert( >+ 
std::is_same<decltype(time_offset), >+ absl::optional<TransmissionOffset::value_type>>::value, >+ ""); >+ EXPECT_EQ(time_offset, kTimeOffset); >+ EXPECT_FALSE(packet.GetExtension<RtpStreamId>().has_value()); >+} >+ > TEST(RtpPacketTest, GetRawExtensionWhenPresent) { > constexpr uint8_t kRawPacket[] = { > // comment for clang-format to align kRawPacket nicer. >@@ -751,4 +847,12 @@ TEST(RtpPacketTest, ParseLegacyTimingFrameExtension) { > EXPECT_EQ(receivied_timing.flags, 0); > } > >+TEST(RtpPacketTest, CreateAndParseColorSpaceExtension) { >+ TestCreateAndParseColorSpaceExtension(/*with_hdr_metadata=*/true); >+} >+ >+TEST(RtpPacketTest, CreateAndParseColorSpaceExtensionWithoutHdrMetadata) { >+ TestCreateAndParseColorSpaceExtension(/*with_hdr_metadata=*/false); >+} >+ > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_rtcp_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_rtcp_impl.cc >index 175a94be802161ef0779960c58adf30f0e46cbd1..0d0ca9618880373bd4ef10687439dbea27b659b0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_rtcp_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_rtcp_impl.cc >@@ -11,12 +11,14 @@ > #include "modules/rtp_rtcp/source/rtp_rtcp_impl.h" > > #include <string.h> >- > #include <algorithm> >+#include <cstdint> > #include <set> > #include <string> >+#include <utility> > >-#include "api/rtpparameters.h" >+#include "modules/rtp_rtcp/source/rtcp_packet/dlrr.h" >+#include "modules/rtp_rtcp/source/rtp_rtcp_config.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" > >@@ -31,6 +33,8 @@ const int64_t kRtpRtcpMaxIdleTimeProcessMs = 5; > const int64_t kRtpRtcpRttProcessTimeMs = 1000; > const int64_t kRtpRtcpBitrateProcessTimeMs = 10; > const int64_t kDefaultExpectedRetransmissionTimeMs = 125; >+constexpr int32_t kDefaultVideoReportInterval = 1000; >+constexpr int32_t 
kDefaultAudioReportInterval = 5000; > } // namespace > > RtpRtcp::Configuration::Configuration() = default; >@@ -62,7 +66,10 @@ ModuleRtpRtcpImpl::ModuleRtpRtcpImpl(const Configuration& configuration) > configuration.rtcp_packet_type_counter_observer, > configuration.event_log, > configuration.outgoing_transport, >- configuration.rtcp_interval_config), >+ configuration.rtcp_report_interval_ms > 0 >+ ? configuration.rtcp_report_interval_ms >+ : (configuration.audio ? kDefaultAudioReportInterval >+ : kDefaultVideoReportInterval)), > rtcp_receiver_(configuration.clock, > configuration.receiver_only, > configuration.rtcp_packet_type_counter_observer, >@@ -70,6 +77,10 @@ ModuleRtpRtcpImpl::ModuleRtpRtcpImpl(const Configuration& configuration) > configuration.intra_frame_callback, > configuration.transport_feedback_callback, > configuration.bitrate_allocation_observer, >+ configuration.rtcp_report_interval_ms > 0 >+ ? configuration.rtcp_report_interval_ms >+ : (configuration.audio ? kDefaultAudioReportInterval >+ : kDefaultVideoReportInterval), > this), > clock_(configuration.clock), > audio_(configuration.audio), >@@ -99,7 +110,9 @@ ModuleRtpRtcpImpl::ModuleRtpRtcpImpl(const Configuration& configuration) > configuration.send_packet_observer, > configuration.retransmission_rate_limiter, > configuration.overhead_observer, >- configuration.populate_network2_timestamp)); >+ configuration.populate_network2_timestamp, >+ configuration.frame_encryptor, configuration.require_frame_encryption, >+ configuration.extmap_allow_mixed)); > // Make sure rtcp sender use same timestamp offset as rtp sender. > rtcp_sender_.SetTimestampOffset(rtp_sender_->TimestampOffset()); > >@@ -175,10 +188,9 @@ void ModuleRtpRtcpImpl::Process() { > > // Verify receiver reports are delivered and the reported sequence number > // is increasing. 
>- int64_t rtcp_interval = RtcpReportInterval(); >- if (rtcp_receiver_.RtcpRrTimeout(rtcp_interval)) { >+ if (rtcp_receiver_.RtcpRrTimeout()) { > RTC_LOG_F(LS_WARNING) << "Timeout: No RTCP RR received."; >- } else if (rtcp_receiver_.RtcpRrSequenceNumberTimeout(rtcp_interval)) { >+ } else if (rtcp_receiver_.RtcpRrSequenceNumberTimeout()) { > RTC_LOG_F(LS_WARNING) << "Timeout: No increase in RTCP RR extended " > "highest sequence number."; > } >@@ -253,6 +265,7 @@ void ModuleRtpRtcpImpl::IncomingRtcpPacket(const uint8_t* rtcp_packet, > } > > int32_t ModuleRtpRtcpImpl::RegisterSendPayload(const CodecInst& voice_codec) { >+ rtcp_sender_.SetRtpClockRate(voice_codec.pltype, voice_codec.plfreq); > return rtp_sender_->RegisterPayload( > voice_codec.plname, voice_codec.pltype, voice_codec.plfreq, > voice_codec.channels, (voice_codec.rate < 0) ? 0 : voice_codec.rate); >@@ -260,8 +273,10 @@ int32_t ModuleRtpRtcpImpl::RegisterSendPayload(const CodecInst& voice_codec) { > > void ModuleRtpRtcpImpl::RegisterVideoSendPayload(int payload_type, > const char* payload_name) { >- RTC_CHECK_EQ( >- 0, rtp_sender_->RegisterPayload(payload_name, payload_type, 90000, 0, 0)); >+ rtcp_sender_.SetRtpClockRate(payload_type, kVideoPayloadTypeFrequency); >+ RTC_CHECK_EQ(0, >+ rtp_sender_->RegisterPayload(payload_name, payload_type, >+ kVideoPayloadTypeFrequency, 0, 0)); > } > > int32_t ModuleRtpRtcpImpl::DeRegisterSendPayload(const int8_t payload_type) { >@@ -392,6 +407,11 @@ bool ModuleRtpRtcpImpl::SendingMedia() const { > return rtp_sender_ ? 
rtp_sender_->SendingMedia() : false; > } > >+void ModuleRtpRtcpImpl::SetAsPartOfAllocation(bool part_of_allocation) { >+ RTC_CHECK(rtp_sender_); >+ rtp_sender_->SetAsPartOfAllocation(part_of_allocation); >+} >+ > bool ModuleRtpRtcpImpl::SendOutgoingData( > FrameType frame_type, > int8_t payload_type, >@@ -402,7 +422,7 @@ bool ModuleRtpRtcpImpl::SendOutgoingData( > const RTPFragmentationHeader* fragmentation, > const RTPVideoHeader* rtp_video_header, > uint32_t* transport_frame_id_out) { >- rtcp_sender_.SetLastRtpTime(time_stamp, capture_time_ms); >+ rtcp_sender_.SetLastRtpTime(time_stamp, capture_time_ms, payload_type); > // Make sure an RTCP report isn't queued behind a key frame. > if (rtcp_sender_.TimeToSendRTCPReport(kVideoFrameKey == frame_type)) { > rtcp_sender_.SendRTCP(GetFeedbackState(), kRtcpReport); >@@ -604,6 +624,10 @@ void ModuleRtpRtcpImpl::UnsetRemb() { > rtcp_sender_.UnsetRemb(); > } > >+void ModuleRtpRtcpImpl::SetExtmapAllowMixed(bool extmap_allow_mixed) { >+ rtp_sender_->SetExtmapAllowMixed(extmap_allow_mixed); >+} >+ > int32_t ModuleRtpRtcpImpl::RegisterSendRtpHeaderExtension( > const RTPExtensionType type, > const uint8_t id) { >@@ -843,13 +867,6 @@ std::vector<rtcp::TmmbItem> ModuleRtpRtcpImpl::BoundingSet(bool* tmmbr_owner) { > return rtcp_receiver_.BoundingSet(tmmbr_owner); > } > >-int64_t ModuleRtpRtcpImpl::RtcpReportInterval() { >- if (audio_) >- return rtcp_sender_.RtcpAudioReportInverval(); >- else >- return rtcp_sender_.RtcpVideoReportInverval(); >-} >- > void ModuleRtpRtcpImpl::SetRtcpReceiverSsrcs(uint32_t main_ssrc) { > std::set<uint32_t> ssrcs; > ssrcs.insert(main_ssrc); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_rtcp_impl.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_rtcp_impl.h >index 604414adf2ff4541185298a1550e342adeceed84..8e9751dcc1f0b237960e4fd5496ca739210302a2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_rtcp_impl.h 
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_rtcp_impl.h >@@ -11,17 +11,24 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_RTP_RTCP_IMPL_H_ > #define MODULES_RTP_RTCP_SOURCE_RTP_RTCP_IMPL_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <memory> > #include <set> > #include <string> >-#include <utility> > #include <vector> > > #include "absl/types/optional.h" >+#include "api/rtp_headers.h" > #include "api/video/video_bitrate_allocation.h" >+#include "common_types.h" // NOLINT(build/include) >+#include "modules/include/module_common_types.h" >+#include "modules/include/module_fec_types.h" >+#include "modules/remote_bitrate_estimator/include/remote_bitrate_estimator.h" > #include "modules/rtp_rtcp/include/rtp_rtcp.h" >-#include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" >+#include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" // RTCPPacketType > #include "modules/rtp_rtcp/source/packet_loss_stats.h" >+#include "modules/rtp_rtcp/source/rtcp_packet/tmmb_item.h" > #include "modules/rtp_rtcp/source/rtcp_receiver.h" > #include "modules/rtp_rtcp/source/rtcp_sender.h" > #include "modules/rtp_rtcp/source/rtp_sender.h" >@@ -30,6 +37,10 @@ > > namespace webrtc { > >+class Clock; >+struct PacedPacketInfo; >+struct RTPVideoHeader; >+ > class ModuleRtpRtcpImpl : public RtpRtcp, public RTCPReceiver::ModuleRtpRtcp { > public: > explicit ModuleRtpRtcpImpl(const RtpRtcp::Configuration& configuration); >@@ -59,6 +70,8 @@ class ModuleRtpRtcpImpl : public RtpRtcp, public RTCPReceiver::ModuleRtpRtcp { > > int32_t DeRegisterSendPayload(int8_t payload_type) override; > >+ void SetExtmapAllowMixed(bool extmap_allow_mixed) override; >+ > // Register RTP header extension. 
> int32_t RegisterSendRtpHeaderExtension(RTPExtensionType type, > uint8_t id) override; >@@ -115,6 +128,8 @@ class ModuleRtpRtcpImpl : public RtpRtcp, public RTCPReceiver::ModuleRtpRtcp { > > bool SendingMedia() const override; > >+ void SetAsPartOfAllocation(bool part_of_allocation) override; >+ > // Used by the codec module to deliver a video or audio frame for > // packetization. > bool SendOutgoingData(FrameType frame_type, >@@ -311,7 +326,6 @@ class ModuleRtpRtcpImpl : public RtpRtcp, public RTCPReceiver::ModuleRtpRtcp { > private: > FRIEND_TEST_ALL_PREFIXES(RtpRtcpImplTest, Rtt); > FRIEND_TEST_ALL_PREFIXES(RtpRtcpImplTest, RttForReceiverOnly); >- int64_t RtcpReportInterval(); > void SetRtcpReceiverSsrcs(uint32_t main_ssrc); > > void set_rtt_ms(int64_t rtt_ms); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_rtcp_impl_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_rtcp_impl_unittest.cc >index 5160a64be73d892270ec3ce811ead803eab67d24..632a537e8029101b5069fd26913dc7445f90a47a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_rtcp_impl_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_rtcp_impl_unittest.cc >@@ -49,7 +49,7 @@ class RtcpRttStatsTestImpl : public RtcpRttStats { > int64_t rtt_ms_; > }; > >-class SendTransport : public Transport, public RtpData { >+class SendTransport : public Transport { > public: > SendTransport() > : receiver_(nullptr), >@@ -90,11 +90,6 @@ class SendTransport : public Transport, public RtpData { > ++rtcp_packets_sent_; > return true; > } >- int32_t OnReceivedPayloadData(const uint8_t* payload_data, >- size_t payload_size, >- const WebRtcRTPHeader* rtp_header) override { >- return 0; >- } > void SetKeepalivePayloadType(uint8_t payload_type) { > keepalive_payload_type_ = payload_type; > } >@@ -129,7 +124,7 @@ class RtpRtcpModule : public RtcpPacketTypeCounterObserver { > 
std::unique_ptr<ModuleRtpRtcpImpl> impl_; > uint32_t remote_ssrc_; > RtpKeepAliveConfig keepalive_config_; >- RtcpIntervalConfig rtcp_interval_config_; >+ int rtcp_report_interval_ms_ = 0; > > void SetRemoteSsrc(uint32_t ssrc) { > remote_ssrc_ = ssrc; >@@ -164,8 +159,8 @@ class RtpRtcpModule : public RtcpPacketTypeCounterObserver { > CreateModuleImpl(); > transport_.SetKeepalivePayloadType(config.payload_type); > } >- void SetRtcpIntervalConfigAndReset(const RtcpIntervalConfig& config) { >- rtcp_interval_config_ = config; >+ void SetRtcpReportIntervalAndReset(int rtcp_report_interval_ms) { >+ rtcp_report_interval_ms_ = rtcp_report_interval_ms; > CreateModuleImpl(); > } > >@@ -179,7 +174,7 @@ class RtpRtcpModule : public RtcpPacketTypeCounterObserver { > config.rtcp_packet_type_counter_observer = this; > config.rtt_stats = &rtt_stats_; > config.keepalive_config = keepalive_config_; >- config.rtcp_interval_config = rtcp_interval_config_; >+ config.rtcp_report_interval_ms = rtcp_report_interval_ms_; > > impl_.reset(new ModuleRtpRtcpImpl(config)); > impl_->SetRTCPStatus(RtcpMode::kCompound); >@@ -648,11 +643,8 @@ TEST_F(RtpRtcpImplTest, SendsKeepaliveAfterTimout) { > TEST_F(RtpRtcpImplTest, ConfigurableRtcpReportInterval) { > const int kVideoReportInterval = 3000; > >- RtcpIntervalConfig config; >- config.video_interval_ms = kVideoReportInterval; >- > // Recreate sender impl with new configuration, and redo setup. 
>- sender_.SetRtcpIntervalConfigAndReset(config); >+ sender_.SetRtcpReportIntervalAndReset(kVideoReportInterval); > SetUp(); > > SendFrame(&sender_, kBaseLayerTid); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender.cc >index 9f10c40e5754026955234852007197e175bd7369..ddf91f5f5a2d580d5f99ff7f92e777f40436e049 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender.cc >@@ -16,6 +16,7 @@ > #include <utility> > > #include "absl/memory/memory.h" >+#include "absl/strings/match.h" > #include "logging/rtc_event_log/events/rtc_event_rtp_packet_outgoing.h" > #include "logging/rtc_event_log/rtc_event_log.h" > #include "modules/remote_bitrate_estimator/test/bwe_test_logging.h" >@@ -94,14 +95,6 @@ const char* FrameTypeToString(FrameType frame_type) { > } > return ""; > } >- >-void CountPacket(RtpPacketCounter* counter, const RtpPacketToSend& packet) { >- ++counter->packets; >- counter->header_bytes += packet.headers_size(); >- counter->padding_bytes += packet.padding_size(); >- counter->payload_bytes += packet.payload_size(); >-} >- > } // namespace > > RTPSender::RTPSender( >@@ -119,24 +112,33 @@ RTPSender::RTPSender( > SendPacketObserver* send_packet_observer, > RateLimiter* retransmission_rate_limiter, > OverheadObserver* overhead_observer, >- bool populate_network2_timestamp) >+ bool populate_network2_timestamp, >+ FrameEncryptorInterface* frame_encryptor, >+ bool require_frame_encryption, >+ bool extmap_allow_mixed) > : clock_(clock), > // TODO(holmer): Remove this conversion? > clock_delta_ms_(clock_->TimeInMilliseconds() - rtc::TimeMillis()), > random_(clock_->TimeInMicroseconds()), > audio_configured_(audio), > audio_(audio ? new RTPSenderAudio(clock, this) : nullptr), >- video_(audio ? 
nullptr : new RTPSenderVideo(clock, this, flexfec_sender)), >+ video_(audio ? nullptr >+ : new RTPSenderVideo(clock, >+ this, >+ flexfec_sender, >+ frame_encryptor, >+ require_frame_encryption)), > paced_sender_(paced_sender), > transport_sequence_number_allocator_(sequence_number_allocator), > transport_feedback_observer_(transport_feedback_observer), > last_capture_time_ms_sent_(0), > transport_(transport), >- sending_media_(true), // Default to sending media. >+ sending_media_(true), // Default to sending media. >+ force_part_of_allocation_(false), > max_packet_size_(IP_PACKET_SIZE - 28), // Default is IP-v4/UDP. > last_payload_type_(-1), > payload_type_map_(), >- rtp_header_extension_map_(), >+ rtp_header_extension_map_(extmap_allow_mixed), > packet_history_(clock), > flexfec_packet_history_(clock), > // Statistics >@@ -167,9 +169,7 @@ RTPSender::RTPSender( > overhead_observer_(overhead_observer), > populate_network2_timestamp_(populate_network2_timestamp), > send_side_bwe_with_overhead_( >- webrtc::field_trial::IsEnabled("WebRTC-SendSideBwe-WithOverhead")), >- unlimited_retransmission_experiment_( >- field_trial::IsEnabled("WebRTC-UnlimitedScreenshareRetransmission")) { >+ webrtc::field_trial::IsEnabled("WebRTC-SendSideBwe-WithOverhead")) { > // This random initialization is not intended to be cryptographic strong. > timestamp_offset_ = random_.Rand<uint32_t>(); > // Random start, 16 bits. Can't be 0. 
>@@ -239,6 +239,11 @@ uint32_t RTPSender::NackOverheadRate() const { > return nack_bitrate_sent_.Rate(clock_->TimeInMilliseconds()).value_or(0); > } > >+void RTPSender::SetExtmapAllowMixed(bool extmap_allow_mixed) { >+ rtc::CritScope lock(&send_critsect_); >+ rtp_header_extension_map_.SetExtmapAllowMixed(extmap_allow_mixed); >+} >+ > int32_t RTPSender::RegisterRtpHeaderExtension(RTPExtensionType type, > uint8_t id) { > rtc::CritScope lock(&send_critsect_); >@@ -260,13 +265,12 @@ int32_t RTPSender::DeregisterRtpHeaderExtension(RTPExtensionType type) { > return rtp_header_extension_map_.Deregister(type); > } > >-int32_t RTPSender::RegisterPayload( >- const char payload_name[RTP_PAYLOAD_NAME_SIZE], >- int8_t payload_number, >- uint32_t frequency, >- size_t channels, >- uint32_t rate) { >- RTC_DCHECK_LT(strlen(payload_name), RTP_PAYLOAD_NAME_SIZE); >+int32_t RTPSender::RegisterPayload(absl::string_view payload_name, >+ int8_t payload_number, >+ uint32_t frequency, >+ size_t channels, >+ uint32_t rate) { >+ RTC_DCHECK_LT(payload_name.size(), RTP_PAYLOAD_NAME_SIZE); > rtc::CritScope lock(&send_critsect_); > > std::map<int8_t, RtpUtility::Payload*>::iterator it = >@@ -278,8 +282,7 @@ int32_t RTPSender::RegisterPayload( > RTC_DCHECK(payload); > > // Check if it's the same as we already have. >- if (RtpUtility::StringCompare(payload->name, payload_name, >- RTP_PAYLOAD_NAME_SIZE - 1)) { >+ if (absl::EqualsIgnoreCase(payload->name, payload_name)) { > if (audio_configured_ && payload->typeSpecific.is_audio()) { > auto& p = payload->typeSpecific.audio_payload(); > if (rtc::SafeEq(p.format.clockrate_hz, frequency) && >@@ -426,11 +429,6 @@ bool RTPSender::SendOutgoingData(FrameType frame_type, > *transport_frame_id_out = rtp_timestamp; > if (!sending_media_) > return true; >- >- // Cache video content type. 
>- if (!audio_configured_ && rtp_header) { >- video_content_type_ = rtp_header->content_type; >- } > } > VideoCodecType video_type = kVideoCodecGeneric; > if (CheckPayloadType(payload_type, &video_type) != 0) { >@@ -619,10 +617,16 @@ size_t RTPSender::SendPadData(size_t bytes, > PacketOptions options; > // Padding packets are never retransmissions. > options.is_retransmit = false; >- bool has_transport_seq_num = >- UpdateTransportSequenceNumber(&padding_packet, &options.packet_id); >- padding_packet.SetPadding(padding_bytes_in_packet, &random_); >- >+ bool has_transport_seq_num; >+ { >+ rtc::CritScope lock(&send_critsect_); >+ has_transport_seq_num = >+ UpdateTransportSequenceNumber(&padding_packet, &options.packet_id); >+ options.included_in_allocation = >+ has_transport_seq_num || force_part_of_allocation_; >+ options.included_in_feedback = has_transport_seq_num; >+ } >+ padding_packet.SetPadding(padding_bytes_in_packet); > if (has_transport_seq_num) { > AddPacketToTransportFeedback(options.packet_id, padding_packet, > pacing_info); >@@ -654,7 +658,7 @@ int32_t RTPSender::ReSendPacket(uint16_t packet_id) { > // Try to find packet in RTP packet history. Also verify RTT here, so that we > // don't retransmit too often. > absl::optional<RtpPacketHistory::PacketState> stored_packet = >- packet_history_.GetPacketState(packet_id, true); >+ packet_history_.GetPacketState(packet_id); > if (!stored_packet) { > // Packet not found. > return 0; >@@ -664,20 +668,9 @@ int32_t RTPSender::ReSendPacket(uint16_t packet_id) { > > // Skip retransmission rate check if not configured. > if (retransmission_rate_limiter_) { >- // Skip retransmission rate check if sending screenshare and the experiment >- // is on. 
>- bool skip_retransmission_rate_limit = false; >- if (unlimited_retransmission_experiment_) { >- rtc::CritScope lock(&send_critsect_); >- skip_retransmission_rate_limit = >- video_content_type_ && >- videocontenttypehelpers::IsScreenshare(*video_content_type_); >- } >- > // Check if we're overusing retransmission bitrate. > // TODO(sprang): Add histograms for nack success or failure reasons. >- if (!skip_retransmission_rate_limit && >- !retransmission_rate_limiter_->TryUseRate(packet_size)) { >+ if (!retransmission_rate_limiter_->TryUseRate(packet_size)) { > return -1; > } > } >@@ -696,7 +689,7 @@ int32_t RTPSender::ReSendPacket(uint16_t packet_id) { > } > > std::unique_ptr<RtpPacketToSend> packet = >- packet_history_.GetPacketAndSetSendTime(packet_id, true); >+ packet_history_.GetPacketAndSetSendTime(packet_id); > if (!packet) { > // Packet could theoretically time out between the first check and this one. > return 0; > } >@@ -774,17 +767,14 @@ bool RTPSender::TimeToSendPacket(uint32_t ssrc, > return true; > > std::unique_ptr<RtpPacketToSend> packet; >- // No need to verify RTT here, it has already been checked before putting the >- // packet into the pacer. But _do_ update the send time. > if (ssrc == SSRC()) { >- packet = packet_history_.GetPacketAndSetSendTime(sequence_number, false); >+ packet = packet_history_.GetPacketAndSetSendTime(sequence_number); > } else if (ssrc == FlexfecSsrc()) { >- packet = >- flexfec_packet_history_.GetPacketAndSetSendTime(sequence_number, false); >+ packet = flexfec_packet_history_.GetPacketAndSetSendTime(sequence_number); > } > > if (!packet) { >- // Packet cannot be found. >+ // Packet cannot be found or was resent too recently. > return true; > } > >@@ -838,7 +828,16 @@ bool RTPSender::PrepareAndSendPacket(std::unique_ptr<RtpPacketToSend> packet, > // E.g. RTPSender::TrySendRedundantPayloads calls PrepareAndSendPacket with > // send_over_rtx = true but is_retransmit = false. 
> options.is_retransmit = is_retransmit || send_over_rtx; >- if (UpdateTransportSequenceNumber(packet_to_send, &options.packet_id)) { >+ bool has_transport_seq_num; >+ { >+ rtc::CritScope lock(&send_critsect_); >+ has_transport_seq_num = >+ UpdateTransportSequenceNumber(packet_to_send, &options.packet_id); >+ options.included_in_allocation = >+ has_transport_seq_num || force_part_of_allocation_; >+ options.included_in_feedback = has_transport_seq_num; >+ } >+ if (has_transport_seq_num) { > AddPacketToTransportFeedback(options.packet_id, *packet_to_send, > pacing_info); > } >@@ -876,13 +875,13 @@ void RTPSender::UpdateRtpStats(const RtpPacketToSend& packet, > counters->first_packet_time_ms = now_ms; > > if (IsFecPacket(packet)) >- CountPacket(&counters->fec, packet); >+ counters->fec.AddPacket(packet); > > if (is_retransmit) { >- CountPacket(&counters->retransmitted, packet); >+ counters->retransmitted.AddPacket(packet); > nack_bitrate_sent_.Update(packet.size(), now_ms); > } >- CountPacket(&counters->transmitted, packet); >+ counters->transmitted.AddPacket(packet); > > if (rtp_stats_callback_) > rtp_stats_callback_->DataCountersUpdated(*counters, packet.Ssrc()); >@@ -920,15 +919,6 @@ bool RTPSender::SendToNetwork(std::unique_ptr<RtpPacketToSend> packet, > RTC_DCHECK(packet); > int64_t now_ms = clock_->TimeInMilliseconds(); > >- // |capture_time_ms| <= 0 is considered invalid. >- // TODO(holmer): This should be changed all over Video Engine so that negative >- // time is consider invalid, while 0 is considered a valid time. 
>- if (packet->capture_time_ms() > 0) { >- packet->SetExtension<TransmissionOffset>( >- kTimestampTicksPerMs * (now_ms - packet->capture_time_ms())); >- } >- packet->SetExtension<AbsoluteSendTime>(AbsoluteSendTime::MsTo24Bits(now_ms)); >- > if (video_) { > BWE_TEST_LOGGING_PLOT_WITH_SSRC(1, "VideoTotBitrate_kbps", now_ms, > ActualSendBitrateKbit(), packet->Ssrc()); >@@ -971,7 +961,31 @@ bool RTPSender::SendToNetwork(std::unique_ptr<RtpPacketToSend> packet, > > PacketOptions options; > options.is_retransmit = false; >- if (UpdateTransportSequenceNumber(packet.get(), &options.packet_id)) { >+ >+ // |capture_time_ms| <= 0 is considered invalid. >+ // TODO(holmer): This should be changed all over Video Engine so that negative >+ // time is considered invalid, while 0 is considered a valid time. >+ if (packet->capture_time_ms() > 0) { >+ packet->SetExtension<TransmissionOffset>( >+ kTimestampTicksPerMs * (now_ms - packet->capture_time_ms())); >+ >+ if (populate_network2_timestamp_ && >+ packet->HasExtension<VideoTimingExtension>()) { >+ packet->set_network2_time_ms(now_ms); >+ } >+ } >+ packet->SetExtension<AbsoluteSendTime>(AbsoluteSendTime::MsTo24Bits(now_ms)); >+ >+ bool has_transport_seq_num; >+ { >+ rtc::CritScope lock(&send_critsect_); >+ has_transport_seq_num = >+ UpdateTransportSequenceNumber(packet.get(), &options.packet_id); >+ options.included_in_allocation = >+ has_transport_seq_num || force_part_of_allocation_; >+ options.included_in_feedback = has_transport_seq_num; >+ } >+ if (has_transport_seq_num) { > AddPacketToTransportFeedback(options.packet_id, *packet.get(), > PacedPacketInfo()); > } >@@ -1141,8 +1155,15 @@ void RTPSender::GetDataCounters(StreamDataCounters* rtp_stats, > > std::unique_ptr<RtpPacketToSend> RTPSender::AllocatePacket() const { > rtc::CritScope lock(&send_critsect_); >- std::unique_ptr<RtpPacketToSend> packet( >- new RtpPacketToSend(&rtp_header_extension_map_, max_packet_size_)); >+ // TODO(danilchap): Find better motivator and value for 
extra capacity. >+ // RtpPacketizer might slightly miscalculate the needed size, >+ // SRTP may benefit from extra space in the buffer and do encryption in place >+ // saving reallocation. >+ // While sending a slightly oversized packet increases the chance of it being >+ // dropped, that is better than crashing or dropping the packet without >+ // trying to send it. >+ static constexpr int kExtraCapacity = 16; >+ auto packet = absl::make_unique<RtpPacketToSend>( >+ &rtp_header_extension_map_, max_packet_size_ + kExtraCapacity); > RTC_DCHECK(ssrc_); > packet->SetSsrc(*ssrc_); > packet->SetCsrcs(csrcs_); >@@ -1181,10 +1202,9 @@ bool RTPSender::AssignSequenceNumber(RtpPacketToSend* packet) { > } > > bool RTPSender::UpdateTransportSequenceNumber(RtpPacketToSend* packet, >- int* packet_id) const { >+ int* packet_id) { > RTC_DCHECK(packet); > RTC_DCHECK(packet_id); >- rtc::CritScope lock(&send_critsect_); > if (!rtp_header_extension_map_.IsRegistered(TransportSequenceNumber::kId)) > return false; > >@@ -1209,6 +1229,11 @@ bool RTPSender::SendingMedia() const { > return sending_media_; > } > >+void RTPSender::SetAsPartOfAllocation(bool part_of_allocation) { >+ rtc::CritScope lock(&send_critsect_); >+ force_part_of_allocation_ = part_of_allocation; >+} >+ > void RTPSender::SetTimestampOffset(uint32_t timestamp) { > rtc::CritScope lock(&send_critsect_); > timestamp_offset_ = timestamp; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender.h >index 5a93d7a52021e14c508d754d306e0a2d884ebf1b..f9bbbdd9e22a23df01d22c19d87cc72b69c6dcd7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender.h >@@ -17,10 +17,10 @@ > #include <utility> > #include <vector> > >+#include "absl/strings/string_view.h" > #include "absl/types/optional.h" > #include "api/array_view.h" > #include 
"api/call/transport.h" >-#include "api/video/video_content_type.h" > #include "common_types.h" // NOLINT(build/include) > #include "modules/rtp_rtcp/include/flexfec_sender.h" > #include "modules/rtp_rtcp/include/rtp_header_extension_map.h" >@@ -38,6 +38,7 @@ > > namespace webrtc { > >+class FrameEncryptorInterface; > class OverheadObserver; > class RateLimiter; > class RtcEventLog; >@@ -63,7 +64,10 @@ class RTPSender { > SendPacketObserver* send_packet_observer, > RateLimiter* nack_rate_limiter, > OverheadObserver* overhead_observer, >- bool populate_network2_timestamp); >+ bool populate_network2_timestamp, >+ FrameEncryptorInterface* frame_encryptor, >+ bool require_frame_encryption, >+ bool extmap_allow_mixed); > > ~RTPSender(); > >@@ -75,7 +79,7 @@ class RTPSender { > uint32_t FecOverheadRate() const; > uint32_t NackOverheadRate() const; > >- int32_t RegisterPayload(const char* payload_name, >+ int32_t RegisterPayload(absl::string_view payload_name, > const int8_t payload_type, > const uint32_t frequency, > const size_t channels, >@@ -86,6 +90,8 @@ class RTPSender { > void SetSendingMediaStatus(bool enabled); > bool SendingMedia() const; > >+ void SetAsPartOfAllocation(bool part_of_allocation); >+ > void GetDataCounters(StreamDataCounters* rtp_stats, > StreamDataCounters* rtx_stats) const; > >@@ -114,6 +120,8 @@ class RTPSender { > uint32_t* transport_frame_id_out, > int64_t expected_retransmission_time_ms); > >+ void SetExtmapAllowMixed(bool extmap_allow_mixed); >+ > // RTP header extension > int32_t RegisterRtpHeaderExtension(RTPExtensionType type, uint8_t id); > bool RegisterRtpHeaderExtension(const std::string& uri, int id); >@@ -247,8 +255,8 @@ class RTPSender { > int64_t capture_time_ms, > uint32_t ssrc); > >- bool UpdateTransportSequenceNumber(RtpPacketToSend* packet, >- int* packet_id) const; >+ bool UpdateTransportSequenceNumber(RtpPacketToSend* packet, int* packet_id) >+ RTC_EXCLUSIVE_LOCKS_REQUIRED(send_critsect_); > > void UpdateRtpStats(const 
RtpPacketToSend& packet, > bool is_rtx, >@@ -277,7 +285,7 @@ class RTPSender { > > Transport* transport_; > bool sending_media_ RTC_GUARDED_BY(send_critsect_); >- >+ bool force_part_of_allocation_ RTC_GUARDED_BY(send_critsect_); > size_t max_packet_size_; > > int8_t last_payload_type_ RTC_GUARDED_BY(send_critsect_); >@@ -343,11 +351,6 @@ class RTPSender { > > const bool send_side_bwe_with_overhead_; > >- const bool unlimited_retransmission_experiment_; >- >- absl::optional<VideoContentType> video_content_type_ >- RTC_GUARDED_BY(send_critsect_); >- > RTC_DISALLOW_IMPLICIT_CONSTRUCTORS(RTPSender); > }; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender_audio.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender_audio.cc >index bf86a39339be73f9078359c7755bded87741ec46..636ccccf0d9b7ff90a0dad4d3d529aab6fdb0fa6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender_audio.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender_audio.cc >@@ -11,16 +11,18 @@ > #include "modules/rtp_rtcp/source/rtp_sender_audio.h" > > #include <string.h> >- > #include <memory> > #include <utility> > >+#include "absl/strings/match.h" >+#include "api/audio_codecs/audio_format.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > #include "modules/rtp_rtcp/source/byte_io.h" > #include "modules/rtp_rtcp/source/rtp_header_extensions.h" >+#include "modules/rtp_rtcp/source/rtp_packet.h" > #include "modules/rtp_rtcp/source/rtp_packet_to_send.h" >+#include "rtc_base/checks.h" > #include "rtc_base/logging.h" >-#include "rtc_base/timeutils.h" > #include "rtc_base/trace_event.h" > > namespace webrtc { >@@ -30,14 +32,13 @@ RTPSenderAudio::RTPSenderAudio(Clock* clock, RTPSender* rtp_sender) > > RTPSenderAudio::~RTPSenderAudio() {} > >-int32_t RTPSenderAudio::RegisterAudioPayload( >- const char payloadName[RTP_PAYLOAD_NAME_SIZE], >- const int8_t payload_type, >- 
const uint32_t frequency, >- const size_t channels, >- const uint32_t rate, >- RtpUtility::Payload** payload) { >- if (RtpUtility::StringCompare(payloadName, "cn", 2)) { >+int32_t RTPSenderAudio::RegisterAudioPayload(absl::string_view payload_name, >+ const int8_t payload_type, >+ const uint32_t frequency, >+ const size_t channels, >+ const uint32_t rate, >+ RtpUtility::Payload** payload) { >+ if (absl::EqualsIgnoreCase(payload_name, "cn")) { > rtc::CritScope cs(&send_audio_critsect_); > // we can have multiple CNG payload types > switch (frequency) { >@@ -56,7 +57,7 @@ int32_t RTPSenderAudio::RegisterAudioPayload( > default: > return -1; > } >- } else if (RtpUtility::StringCompare(payloadName, "telephone-event", 15)) { >+ } else if (absl::EqualsIgnoreCase(payload_name, "telephone-event")) { > rtc::CritScope cs(&send_audio_critsect_); > // Don't add it to the list > // we dont want to allow send with a DTMF payloadtype >@@ -65,9 +66,9 @@ int32_t RTPSenderAudio::RegisterAudioPayload( > return 0; > } > *payload = new RtpUtility::Payload( >- payloadName, >+ payload_name, > PayloadUnion(AudioPayload{ >- SdpAudioFormat(payloadName, frequency, channels), rate})); >+ SdpAudioFormat(payload_name, frequency, channels), rate})); > return 0; > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender_audio.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender_audio.h >index 63dfc2b99c393852783d6c80f797a4a50d3ca2cd..1dbe5b5817cc3d1674629711b106343445b100af 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender_audio.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender_audio.h >@@ -11,14 +11,19 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_RTP_SENDER_AUDIO_H_ > #define MODULES_RTP_RTCP_SOURCE_RTP_SENDER_AUDIO_H_ > >+#include <stddef.h> >+#include <stdint.h> >+ >+#include "absl/strings/string_view.h" > #include "common_types.h" // NOLINT(build/include) > 
#include "modules/rtp_rtcp/source/dtmf_queue.h" >-#include "modules/rtp_rtcp/source/rtp_rtcp_config.h" > #include "modules/rtp_rtcp/source/rtp_sender.h" > #include "modules/rtp_rtcp/source/rtp_utility.h" > #include "rtc_base/constructormagic.h" > #include "rtc_base/criticalsection.h" > #include "rtc_base/onetimeevent.h" >+#include "rtc_base/thread_annotations.h" >+#include "system_wrappers/include/clock.h" > > namespace webrtc { > >@@ -27,7 +32,7 @@ class RTPSenderAudio { > RTPSenderAudio(Clock* clock, RTPSender* rtp_sender); > ~RTPSenderAudio(); > >- int32_t RegisterAudioPayload(const char payloadName[RTP_PAYLOAD_NAME_SIZE], >+ int32_t RegisterAudioPayload(absl::string_view payload_name, > int8_t payload_type, > uint32_t frequency, > size_t channels, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender_unittest.cc >index 86b92da2e00b9cd14cbd3606c3555b8bb41f7816..a687bcbb35b27adc73b67ecfa082044b822916b8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender_unittest.cc >@@ -184,7 +184,8 @@ class RtpSenderTest : public ::testing::TestWithParam<bool> { > false, &fake_clock_, &transport_, pacer ? 
&mock_paced_sender_ : nullptr, > nullptr, &seq_num_allocator_, nullptr, nullptr, nullptr, nullptr, > &mock_rtc_event_log_, &send_packet_observer_, >- &retransmission_rate_limiter_, nullptr, populate_network2)); >+ &retransmission_rate_limiter_, nullptr, populate_network2, nullptr, >+ false, false)); > rtp_sender_->SetSequenceNumber(kSeqNum); > rtp_sender_->SetTimestampOffset(0); > rtp_sender_->SetSSRC(kSsrc); >@@ -276,7 +277,7 @@ class TestRtpSenderVideo : public RTPSenderVideo { > TestRtpSenderVideo(Clock* clock, > RTPSender* rtp_sender, > FlexfecSender* flexfec_sender) >- : RTPSenderVideo(clock, rtp_sender, flexfec_sender) {} >+ : RTPSenderVideo(clock, rtp_sender, flexfec_sender, nullptr, false) {} > ~TestRtpSenderVideo() override {} > > StorageType GetStorageType(const RTPVideoHeader& header, >@@ -382,7 +383,8 @@ TEST_P(RtpSenderTest, AssignSequenceNumberAllowsPaddingOnAudio) { > rtp_sender_.reset(new RTPSender( > kEnableAudio, &fake_clock_, &transport, &mock_paced_sender_, nullptr, > nullptr, nullptr, nullptr, nullptr, nullptr, &mock_rtc_event_log_, >- nullptr, &retransmission_rate_limiter_, nullptr, false)); >+ nullptr, &retransmission_rate_limiter_, nullptr, false, nullptr, false, >+ false)); > rtp_sender_->SetTimestampOffset(0); > rtp_sender_->SetSSRC(kSsrc); > >@@ -428,7 +430,8 @@ TEST_P(RtpSenderTestWithoutPacer, > rtp_sender_.reset(new RTPSender( > false, &fake_clock_, &transport_, nullptr, nullptr, &seq_num_allocator_, > &feedback_observer_, nullptr, nullptr, nullptr, &mock_rtc_event_log_, >- nullptr, &retransmission_rate_limiter_, &mock_overhead_observer, false)); >+ nullptr, &retransmission_rate_limiter_, &mock_overhead_observer, false, >+ nullptr, false, false)); > rtp_sender_->SetSSRC(kSsrc); > EXPECT_EQ(0, rtp_sender_->RegisterRtpHeaderExtension( > kRtpExtensionTransportSequenceNumber, >@@ -455,7 +458,8 @@ TEST_P(RtpSenderTestWithoutPacer, SendsPacketsWithTransportSequenceNumber) { > rtp_sender_.reset(new RTPSender( > false, &fake_clock_, 
&transport_, nullptr, nullptr, &seq_num_allocator_, > &feedback_observer_, nullptr, nullptr, nullptr, &mock_rtc_event_log_, >- &send_packet_observer_, &retransmission_rate_limiter_, nullptr, false)); >+ &send_packet_observer_, &retransmission_rate_limiter_, nullptr, false, >+ nullptr, false, false)); > rtp_sender_->SetSSRC(kSsrc); > EXPECT_EQ(0, rtp_sender_->RegisterRtpHeaderExtension( > kRtpExtensionTransportSequenceNumber, >@@ -479,13 +483,15 @@ TEST_P(RtpSenderTestWithoutPacer, SendsPacketsWithTransportSequenceNumber) { > ASSERT_TRUE(packet.GetExtension<TransportSequenceNumber>(&transport_seq_no)); > EXPECT_EQ(kTransportSequenceNumber, transport_seq_no); > EXPECT_EQ(transport_.last_options_.packet_id, transport_seq_no); >+ EXPECT_TRUE(transport_.last_options_.included_in_allocation); > } > > TEST_P(RtpSenderTestWithoutPacer, PacketOptionsNoRetransmission) { > rtp_sender_.reset(new RTPSender( > false, &fake_clock_, &transport_, nullptr, nullptr, &seq_num_allocator_, > &feedback_observer_, nullptr, nullptr, nullptr, &mock_rtc_event_log_, >- &send_packet_observer_, &retransmission_rate_limiter_, nullptr, false)); >+ &send_packet_observer_, &retransmission_rate_limiter_, nullptr, false, >+ nullptr, false, false)); > rtp_sender_->SetSSRC(kSsrc); > > SendGenericPayload(); >@@ -493,16 +499,53 @@ TEST_P(RtpSenderTestWithoutPacer, PacketOptionsNoRetransmission) { > EXPECT_FALSE(transport_.last_options_.is_retransmit); > } > >-TEST_P(RtpSenderTestWithoutPacer, NoAllocationIfNotRegistered) { >+TEST_P(RtpSenderTestWithoutPacer, >+ SetsIncludedInFeedbackWhenTransportSequenceNumberExtensionIsRegistered) { >+ SetUpRtpSender(false, false); >+ rtp_sender_->RegisterRtpHeaderExtension(kRtpExtensionTransportSequenceNumber, >+ kTransportSequenceNumberExtensionId); >+ EXPECT_CALL(seq_num_allocator_, AllocateSequenceNumber()) >+ .WillOnce(testing::Return(kTransportSequenceNumber)); >+ EXPECT_CALL(send_packet_observer_, OnSendPacket).Times(1); >+ SendGenericPayload(); >+ 
EXPECT_TRUE(transport_.last_options_.included_in_feedback); >+} >+ >+TEST_P( >+ RtpSenderTestWithoutPacer, >+ SetsIncludedInAllocationWhenTransportSequenceNumberExtensionIsRegistered) { >+ SetUpRtpSender(false, false); >+ rtp_sender_->RegisterRtpHeaderExtension(kRtpExtensionTransportSequenceNumber, >+ kTransportSequenceNumberExtensionId); >+ EXPECT_CALL(seq_num_allocator_, AllocateSequenceNumber()) >+ .WillOnce(testing::Return(kTransportSequenceNumber)); >+ EXPECT_CALL(send_packet_observer_, OnSendPacket).Times(1); > SendGenericPayload(); >+ EXPECT_TRUE(transport_.last_options_.included_in_allocation); >+} >+ >+TEST_P(RtpSenderTestWithoutPacer, >+ SetsIncludedInAllocationWhenForcedAsPartOfAllocation) { >+ SetUpRtpSender(false, false); >+ rtp_sender_->SetAsPartOfAllocation(true); >+ SendGenericPayload(); >+ EXPECT_FALSE(transport_.last_options_.included_in_feedback); >+ EXPECT_TRUE(transport_.last_options_.included_in_allocation); >+} >+ >+TEST_P(RtpSenderTestWithoutPacer, DoesnSetIncludedInAllocationByDefault) { >+ SetUpRtpSender(false, false); >+ SendGenericPayload(); >+ EXPECT_FALSE(transport_.last_options_.included_in_feedback); >+ EXPECT_FALSE(transport_.last_options_.included_in_allocation); > } > > TEST_P(RtpSenderTestWithoutPacer, OnSendSideDelayUpdated) { > testing::StrictMock<MockSendSideDelayObserver> send_side_delay_observer_; >- rtp_sender_.reset( >- new RTPSender(false, &fake_clock_, &transport_, nullptr, nullptr, nullptr, >- nullptr, nullptr, nullptr, &send_side_delay_observer_, >- &mock_rtc_event_log_, nullptr, nullptr, nullptr, false)); >+ rtp_sender_.reset(new RTPSender( >+ false, &fake_clock_, &transport_, nullptr, nullptr, nullptr, nullptr, >+ nullptr, nullptr, &send_side_delay_observer_, &mock_rtc_event_log_, >+ nullptr, nullptr, nullptr, false, nullptr, false, false)); > rtp_sender_->SetSSRC(kSsrc); > > const uint8_t kPayloadType = 127; >@@ -580,7 +623,7 @@ TEST_P(RtpSenderTest, SendsPacketsWithTransportSequenceNumber) { > false, &fake_clock_, 
&transport_, &mock_paced_sender_, nullptr, > &seq_num_allocator_, &feedback_observer_, nullptr, nullptr, nullptr, > &mock_rtc_event_log_, &send_packet_observer_, >- &retransmission_rate_limiter_, nullptr, false)); >+ &retransmission_rate_limiter_, nullptr, false, nullptr, false, false)); > rtp_sender_->SetSequenceNumber(kSeqNum); > rtp_sender_->SetSSRC(kSsrc); > rtp_sender_->SetStorePacketsStatus(true, 10); >@@ -647,8 +690,8 @@ TEST_P(RtpSenderTest, WritesPacerExitToTimingExtension) { > EXPECT_EQ(kStoredTimeInMs, video_timing.pacer_exit_delta_ms); > } > >-TEST_P(RtpSenderTest, WritesNetwork2ToTimingExtension) { >- SetUpRtpSender(true, true); >+TEST_P(RtpSenderTest, WritesNetwork2ToTimingExtensionWithPacer) { >+ SetUpRtpSender(/*pacer=*/true, /*populate_network2=*/true); > rtp_sender_->SetStorePacketsStatus(true, 10); > EXPECT_EQ(0, rtp_sender_->RegisterRtpHeaderExtension( > kRtpExtensionVideoTiming, kVideoTimingExtensionId)); >@@ -686,6 +729,31 @@ TEST_P(RtpSenderTest, WritesNetwork2ToTimingExtension) { > EXPECT_EQ(kPacerExitMs, video_timing.pacer_exit_delta_ms); > } > >+TEST_P(RtpSenderTest, WritesNetwork2ToTimingExtensionWithoutPacer) { >+ SetUpRtpSender(/*pacer=*/false, /*populate_network2=*/true); >+ EXPECT_EQ(0, rtp_sender_->RegisterRtpHeaderExtension( >+ kRtpExtensionVideoTiming, kVideoTimingExtensionId)); >+ auto packet = rtp_sender_->AllocatePacket(); >+ packet->SetMarker(true); >+ packet->set_capture_time_ms(fake_clock_.TimeInMilliseconds()); >+ const VideoSendTiming kVideoTiming = {0u, 0u, 0u, 0u, 0u, 0u, true}; >+ packet->SetExtension<VideoTimingExtension>(kVideoTiming); >+ EXPECT_TRUE(rtp_sender_->AssignSequenceNumber(packet.get())); >+ >+ const int kPropagateTimeMs = 10; >+ fake_clock_.AdvanceTimeMilliseconds(kPropagateTimeMs); >+ >+ EXPECT_TRUE(rtp_sender_->SendToNetwork(std::move(packet), >+ kAllowRetransmission, >+ RtpPacketSender::kNormalPriority)); >+ >+ EXPECT_EQ(1, transport_.packets_sent()); >+ absl::optional<VideoSendTiming> video_timing = >+ 
transport_.last_sent_packet().GetExtension<VideoTimingExtension>(); >+ ASSERT_TRUE(video_timing); >+ EXPECT_EQ(kPropagateTimeMs, video_timing->network2_timestamp_delta_ms); >+} >+ > TEST_P(RtpSenderTest, TrafficSmoothingWithExtensions) { > EXPECT_CALL(mock_paced_sender_, InsertPacket(RtpPacketSender::kNormalPriority, > kSsrc, kSeqNum, _, _, _)); >@@ -940,7 +1008,7 @@ TEST_P(RtpSenderTest, OnSendPacketNotUpdatedWithoutSeqNumAllocator) { > false, &fake_clock_, &transport_, &mock_paced_sender_, nullptr, > nullptr /* TransportSequenceNumberAllocator */, nullptr, nullptr, nullptr, > nullptr, nullptr, &send_packet_observer_, &retransmission_rate_limiter_, >- nullptr, false)); >+ nullptr, false, nullptr, false, false)); > rtp_sender_->SetSequenceNumber(kSeqNum); > rtp_sender_->SetSSRC(kSsrc); > EXPECT_EQ(0, rtp_sender_->RegisterRtpHeaderExtension( >@@ -966,7 +1034,7 @@ TEST_P(RtpSenderTest, SendRedundantPayloads) { > rtp_sender_.reset(new RTPSender( > false, &fake_clock_, &transport, &mock_paced_sender_, nullptr, nullptr, > nullptr, nullptr, nullptr, nullptr, &mock_rtc_event_log_, nullptr, >- &retransmission_rate_limiter_, nullptr, false)); >+ &retransmission_rate_limiter_, nullptr, false, nullptr, false, false)); > rtp_sender_->SetSequenceNumber(kSeqNum); > rtp_sender_->SetSSRC(kSsrc); > rtp_sender_->SetRtxPayloadType(kRtxPayload, kPayload); >@@ -1090,7 +1158,7 @@ TEST_P(RtpSenderTest, SendFlexfecPackets) { > false, &fake_clock_, &transport_, &mock_paced_sender_, &flexfec_sender, > &seq_num_allocator_, nullptr, nullptr, nullptr, nullptr, > &mock_rtc_event_log_, &send_packet_observer_, >- &retransmission_rate_limiter_, nullptr, false)); >+ &retransmission_rate_limiter_, nullptr, false, nullptr, false, false)); > rtp_sender_->SetSSRC(kMediaSsrc); > rtp_sender_->SetSequenceNumber(kSeqNum); > rtp_sender_->SetStorePacketsStatus(true, 10); >@@ -1150,7 +1218,7 @@ TEST_P(RtpSenderTest, NoFlexfecForTimingFrames) { > false, &fake_clock_, &transport_, &mock_paced_sender_, 
&flexfec_sender, > &seq_num_allocator_, nullptr, nullptr, nullptr, nullptr, > &mock_rtc_event_log_, &send_packet_observer_, >- &retransmission_rate_limiter_, nullptr, false)); >+ &retransmission_rate_limiter_, nullptr, false, nullptr, false, false)); > rtp_sender_->SetSSRC(kMediaSsrc); > rtp_sender_->SetSequenceNumber(kSeqNum); > rtp_sender_->SetStorePacketsStatus(true, 10); >@@ -1245,11 +1313,11 @@ TEST_P(RtpSenderTestWithoutPacer, SendFlexfecPackets) { > nullptr /* rtp_state */, &fake_clock_); > > // Reset |rtp_sender_| to use FlexFEC. >- rtp_sender_.reset( >- new RTPSender(false, &fake_clock_, &transport_, nullptr, &flexfec_sender, >- &seq_num_allocator_, nullptr, nullptr, nullptr, nullptr, >- &mock_rtc_event_log_, &send_packet_observer_, >- &retransmission_rate_limiter_, nullptr, false)); >+ rtp_sender_.reset(new RTPSender( >+ false, &fake_clock_, &transport_, nullptr, &flexfec_sender, >+ &seq_num_allocator_, nullptr, nullptr, nullptr, nullptr, >+ &mock_rtc_event_log_, &send_packet_observer_, >+ &retransmission_rate_limiter_, nullptr, false, nullptr, false, false)); > rtp_sender_->SetSSRC(kMediaSsrc); > rtp_sender_->SetSequenceNumber(kSeqNum); > >@@ -1313,7 +1381,7 @@ TEST_P(RtpSenderTest, FecOverheadRate) { > false, &fake_clock_, &transport_, &mock_paced_sender_, &flexfec_sender, > &seq_num_allocator_, nullptr, nullptr, nullptr, nullptr, > &mock_rtc_event_log_, &send_packet_observer_, >- &retransmission_rate_limiter_, nullptr, false)); >+ &retransmission_rate_limiter_, nullptr, false, nullptr, false, false)); > rtp_sender_->SetSSRC(kMediaSsrc); > rtp_sender_->SetSequenceNumber(kSeqNum); > >@@ -1365,7 +1433,7 @@ TEST_P(RtpSenderTest, FrameCountCallbacks) { > rtp_sender_.reset(new RTPSender( > false, &fake_clock_, &transport_, &mock_paced_sender_, nullptr, nullptr, > nullptr, nullptr, &callback, nullptr, nullptr, nullptr, >- &retransmission_rate_limiter_, nullptr, false)); >+ &retransmission_rate_limiter_, nullptr, false, nullptr, false, false)); > 
rtp_sender_->SetSSRC(kSsrc); > char payload_name[RTP_PAYLOAD_NAME_SIZE] = "GENERIC"; > const uint8_t payload_type = 127; >@@ -1425,10 +1493,10 @@ TEST_P(RtpSenderTest, BitrateCallbacks) { > uint32_t total_bitrate_; > uint32_t retransmit_bitrate_; > } callback; >- rtp_sender_.reset( >- new RTPSender(false, &fake_clock_, &transport_, nullptr, nullptr, nullptr, >- nullptr, &callback, nullptr, nullptr, nullptr, nullptr, >- &retransmission_rate_limiter_, nullptr, false)); >+ rtp_sender_.reset(new RTPSender( >+ false, &fake_clock_, &transport_, nullptr, nullptr, nullptr, nullptr, >+ &callback, nullptr, nullptr, nullptr, nullptr, >+ &retransmission_rate_limiter_, nullptr, false, nullptr, false, false)); > rtp_sender_->SetSSRC(kSsrc); > > // Simulate kNumPackets sent with kPacketInterval ms intervals, with the >@@ -1485,10 +1553,10 @@ class RtpSenderAudioTest : public RtpSenderTest { > > void SetUp() override { > payload_ = kAudioPayload; >- rtp_sender_.reset( >- new RTPSender(true, &fake_clock_, &transport_, nullptr, nullptr, >- nullptr, nullptr, nullptr, nullptr, nullptr, nullptr, >- nullptr, &retransmission_rate_limiter_, nullptr, false)); >+ rtp_sender_.reset(new RTPSender( >+ true, &fake_clock_, &transport_, nullptr, nullptr, nullptr, nullptr, >+ nullptr, nullptr, nullptr, nullptr, nullptr, >+ &retransmission_rate_limiter_, nullptr, false, nullptr, false, false)); > rtp_sender_->SetSSRC(kSsrc); > rtp_sender_->SetSequenceNumber(kSeqNum); > } >@@ -2151,10 +2219,11 @@ TEST_P(RtpSenderVideoTest, > > TEST_P(RtpSenderTest, OnOverheadChanged) { > MockOverheadObserver mock_overhead_observer; >- rtp_sender_.reset(new RTPSender( >- false, &fake_clock_, &transport_, nullptr, nullptr, nullptr, nullptr, >- nullptr, nullptr, nullptr, nullptr, nullptr, >- &retransmission_rate_limiter_, &mock_overhead_observer, false)); >+ rtp_sender_.reset( >+ new RTPSender(false, &fake_clock_, &transport_, nullptr, nullptr, nullptr, >+ nullptr, nullptr, nullptr, nullptr, nullptr, nullptr, >+ 
&retransmission_rate_limiter_, &mock_overhead_observer, >+ false, nullptr, false, false)); > rtp_sender_->SetSSRC(kSsrc); > > // RTP overhead is 12B. >@@ -2172,10 +2241,11 @@ TEST_P(RtpSenderTest, OnOverheadChanged) { > > TEST_P(RtpSenderTest, DoesNotUpdateOverheadOnEqualSize) { > MockOverheadObserver mock_overhead_observer; >- rtp_sender_.reset(new RTPSender( >- false, &fake_clock_, &transport_, nullptr, nullptr, nullptr, nullptr, >- nullptr, nullptr, nullptr, nullptr, nullptr, >- &retransmission_rate_limiter_, &mock_overhead_observer, false)); >+ rtp_sender_.reset( >+ new RTPSender(false, &fake_clock_, &transport_, nullptr, nullptr, nullptr, >+ nullptr, nullptr, nullptr, nullptr, nullptr, nullptr, >+ &retransmission_rate_limiter_, &mock_overhead_observer, >+ false, nullptr, false, false)); > rtp_sender_->SetSSRC(kSsrc); > > EXPECT_CALL(mock_overhead_observer, OnOverheadChanged(_)).Times(1); >@@ -2185,10 +2255,10 @@ TEST_P(RtpSenderTest, DoesNotUpdateOverheadOnEqualSize) { > > TEST_P(RtpSenderTest, SendsKeepAlive) { > MockTransport transport; >- rtp_sender_.reset( >- new RTPSender(false, &fake_clock_, &transport, nullptr, nullptr, nullptr, >- nullptr, nullptr, nullptr, nullptr, &mock_rtc_event_log_, >- nullptr, &retransmission_rate_limiter_, nullptr, false)); >+ rtp_sender_.reset(new RTPSender( >+ false, &fake_clock_, &transport, nullptr, nullptr, nullptr, nullptr, >+ nullptr, nullptr, nullptr, &mock_rtc_event_log_, nullptr, >+ &retransmission_rate_limiter_, nullptr, false, nullptr, false, false)); > rtp_sender_->SetSequenceNumber(kSeqNum); > rtp_sender_->SetTimestampOffset(0); > rtp_sender_->SetSSRC(kSsrc); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender_video.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender_video.cc >index e8f0ea5c8de07180acbf1930c652a28aa1db8635..cb0b665608455745ccd05cb7a5ca9e9b50b7122f 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender_video.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender_video.cc >@@ -19,6 +19,8 @@ > #include <vector> > > #include "absl/memory/memory.h" >+#include "absl/strings/match.h" >+#include "api/crypto/frameencryptorinterface.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > #include "modules/rtp_rtcp/source/byte_io.h" > #include "modules/rtp_rtcp/source/rtp_format_video_generic.h" >@@ -91,6 +93,11 @@ void AddRtpHeaderExtensions(const RTPVideoHeader& video_header, > generic_descriptor.SetSpatialLayersBitmask(spatial_bimask); > > generic_descriptor.SetTemporalLayer(video_header.generic->temporal_index); >+ >+ if (frame_type == kVideoFrameKey) { >+ generic_descriptor.SetResolution(video_header.width, >+ video_header.height); >+ } > } > packet->SetExtension<RtpGenericFrameDescriptorExtension>( > generic_descriptor); >@@ -115,7 +122,9 @@ bool MinimizeDescriptor(const RTPVideoHeader& full, RTPVideoHeader* minimized) { > > RTPSenderVideo::RTPSenderVideo(Clock* clock, > RTPSender* rtp_sender, >- FlexfecSender* flexfec_sender) >+ FlexfecSender* flexfec_sender, >+ FrameEncryptorInterface* frame_encryptor, >+ bool require_frame_encryption) > : rtp_sender_(rtp_sender), > clock_(clock), > video_type_(kVideoCodecGeneric), >@@ -128,7 +137,9 @@ RTPSenderVideo::RTPSenderVideo(Clock* clock, > delta_fec_params_{0, 1, kFecMaskRandom}, > key_fec_params_{0, 1, kFecMaskRandom}, > fec_bitrate_(1000, RateStatistics::kBpsScale), >- video_bitrate_(1000, RateStatistics::kBpsScale) {} >+ video_bitrate_(1000, RateStatistics::kBpsScale), >+ frame_encryptor_(frame_encryptor), >+ require_frame_encryption_(require_frame_encryption) {} > > RTPSenderVideo::~RTPSenderVideo() {} > >@@ -142,18 +153,18 @@ VideoCodecType RTPSenderVideo::VideoCodecType() const { > > // Static. 
> RtpUtility::Payload* RTPSenderVideo::CreateVideoPayload( >- const char payload_name[RTP_PAYLOAD_NAME_SIZE], >+ absl::string_view payload_name, > int8_t payload_type) { > enum VideoCodecType video_type = kVideoCodecGeneric; >- if (RtpUtility::StringCompare(payload_name, "VP8", 3)) { >+ if (absl::EqualsIgnoreCase(payload_name, "VP8")) { > video_type = kVideoCodecVP8; >- } else if (RtpUtility::StringCompare(payload_name, "VP9", 3)) { >+ } else if (absl::EqualsIgnoreCase(payload_name, "VP9")) { > video_type = kVideoCodecVP9; >- } else if (RtpUtility::StringCompare(payload_name, "H264", 4)) { >+ } else if (absl::EqualsIgnoreCase(payload_name, "H264")) { > video_type = kVideoCodecH264; >- } else if (RtpUtility::StringCompare(payload_name, "I420", 4)) { >+ } else if (absl::EqualsIgnoreCase(payload_name, "I420")) { > video_type = kVideoCodecGeneric; >- } else if (RtpUtility::StringCompare(payload_name, "stereo", 6)) { >+ } else if (absl::EqualsIgnoreCase(payload_name, "stereo")) { > video_type = kVideoCodecGeneric; > } else { > video_type = kVideoCodecGeneric; >@@ -387,20 +398,19 @@ bool RTPSenderVideo::SendVideo(enum VideoCodecType video_type, > int packet_capacity = rtp_sender_->MaxRtpPacketSize() - fec_packet_overhead - > (rtp_sender_->RtxStatus() ? 
kRtxHeaderSize : 0); > >- auto create_packet = [&] { >- std::unique_ptr<RtpPacketToSend> rtp_packet = rtp_sender_->AllocatePacket(); >- RTC_DCHECK_LE(packet_capacity, rtp_packet->capacity()); >- >- rtp_packet->SetPayloadType(payload_type); >- rtp_packet->SetTimestamp(rtp_timestamp); >- rtp_packet->set_capture_time_ms(capture_time_ms); >- return rtp_packet; >- }; >+ std::unique_ptr<RtpPacketToSend> single_packet = >+ rtp_sender_->AllocatePacket(); >+ RTC_DCHECK_LE(packet_capacity, single_packet->capacity()); >+ single_packet->SetPayloadType(payload_type); >+ single_packet->SetTimestamp(rtp_timestamp); >+ single_packet->set_capture_time_ms(capture_time_ms); > >- auto first_packet = create_packet(); >- auto middle_packet = absl::make_unique<RtpPacketToSend>(*first_packet); >- auto last_packet = absl::make_unique<RtpPacketToSend>(*first_packet); >+ auto first_packet = absl::make_unique<RtpPacketToSend>(*single_packet); >+ auto middle_packet = absl::make_unique<RtpPacketToSend>(*single_packet); >+ auto last_packet = absl::make_unique<RtpPacketToSend>(*single_packet); > // Simplest way to estimate how much extensions would occupy is to set them. 
>+ AddRtpHeaderExtensions(*video_header, frame_type, set_video_rotation, >+ /*first=*/true, /*last=*/true, single_packet.get()); > AddRtpHeaderExtensions(*video_header, frame_type, set_video_rotation, > /*first=*/true, /*last=*/false, first_packet.get()); > AddRtpHeaderExtensions(*video_header, frame_type, set_video_rotation, >@@ -408,12 +418,17 @@ bool RTPSenderVideo::SendVideo(enum VideoCodecType video_type, > AddRtpHeaderExtensions(*video_header, frame_type, set_video_rotation, > /*first=*/false, /*last=*/true, last_packet.get()); > >+ RTC_DCHECK_GT(packet_capacity, single_packet->headers_size()); > RTC_DCHECK_GT(packet_capacity, first_packet->headers_size()); > RTC_DCHECK_GT(packet_capacity, middle_packet->headers_size()); > RTC_DCHECK_GT(packet_capacity, last_packet->headers_size()); > RtpPacketizer::PayloadSizeLimits limits; > limits.max_payload_len = packet_capacity - middle_packet->headers_size(); > >+ RTC_DCHECK_GE(single_packet->headers_size(), middle_packet->headers_size()); >+ limits.single_packet_reduction_len = >+ single_packet->headers_size() - middle_packet->headers_size(); >+ > RTC_DCHECK_GE(first_packet->headers_size(), middle_packet->headers_size()); > limits.first_packet_reduction_len = > first_packet->headers_size() - middle_packet->headers_size(); >@@ -424,9 +439,42 @@ bool RTPSenderVideo::SendVideo(enum VideoCodecType video_type, > > RTPVideoHeader minimized_video_header; > const RTPVideoHeader* packetize_video_header = video_header; >- if (first_packet->HasExtension<RtpGenericFrameDescriptorExtension>() && >- MinimizeDescriptor(*video_header, &minimized_video_header)) { >- packetize_video_header = &minimized_video_header; >+ rtc::ArrayView<const uint8_t> generic_descriptor_raw = >+ first_packet->GetRawExtension<RtpGenericFrameDescriptorExtension>(); >+ if (!generic_descriptor_raw.empty()) { >+ if (MinimizeDescriptor(*video_header, &minimized_video_header)) { >+ packetize_video_header = &minimized_video_header; >+ } >+ } >+ >+ // 
TODO(benwright@webrtc.org) - Allocate enough to always encrypt inline. >+ rtc::Buffer encrypted_video_payload; >+ if (frame_encryptor_ != nullptr) { >+ if (generic_descriptor_raw.empty()) { >+ return false; >+ } >+ >+ const size_t max_ciphertext_size = >+ frame_encryptor_->GetMaxCiphertextByteSize(cricket::MEDIA_TYPE_VIDEO, >+ payload_size); >+ encrypted_video_payload.SetSize(max_ciphertext_size); >+ >+ size_t bytes_written = 0; >+ if (frame_encryptor_->Encrypt( >+ cricket::MEDIA_TYPE_VIDEO, first_packet->Ssrc(), >+ /*additional_data=*/nullptr, >+ rtc::MakeArrayView(payload_data, payload_size), >+ encrypted_video_payload, &bytes_written) != 0) { >+ return false; >+ } >+ >+ encrypted_video_payload.SetSize(bytes_written); >+ payload_data = encrypted_video_payload.data(); >+ payload_size = encrypted_video_payload.size(); >+ } else if (require_frame_encryption_) { >+ RTC_LOG(LS_WARNING) >+ << "No FrameEncryptor is attached to this video sending stream but " >+ << "one is required since require_frame_encryptor is set"; > } > > std::unique_ptr<RtpPacketizer> packetizer = RtpPacketizer::Create( >@@ -447,16 +495,9 @@ bool RTPSenderVideo::SendVideo(enum VideoCodecType video_type, > int expected_payload_capacity; > // Choose right packet template: > if (num_packets == 1) { >- // No prepared template, create a new packet. >- packet = create_packet(); >- AddRtpHeaderExtensions(*video_header, frame_type, set_video_rotation, >- /*first=*/true, /*last=*/true, packet.get()); >- // TODO(bugs.webrtc.org/7990): Revisit this case when two byte header >- // extension are implemented because then single packet might need more >- // space for extensions than sum of first and last packet reductions. 
>- expected_payload_capacity = limits.max_payload_len - >- limits.first_packet_reduction_len - >- limits.last_packet_reduction_len; >+ packet = std::move(single_packet); >+ expected_payload_capacity = >+ limits.max_payload_len - limits.single_packet_reduction_len; > } else if (i == 0) { > packet = std::move(first_packet); > expected_payload_capacity = >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender_video.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender_video.h >index ce7be16149c3b37a44d74693117d4dc4ea639c9a..d3a898b40a306a7dbc4106029c99de61fea539e9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender_video.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_sender_video.h >@@ -14,6 +14,7 @@ > #include <map> > #include <memory> > >+#include "absl/strings/string_view.h" > #include "absl/types/optional.h" > #include "common_types.h" // NOLINT(build/include) > #include "modules/rtp_rtcp/include/flexfec_sender.h" >@@ -29,6 +30,8 @@ > #include "rtc_base/thread_annotations.h" > > namespace webrtc { >+ >+class FrameEncryptorInterface; > class RtpPacketizer; > class RtpPacketToSend; > >@@ -38,14 +41,15 @@ class RTPSenderVideo { > > RTPSenderVideo(Clock* clock, > RTPSender* rtpSender, >- FlexfecSender* flexfec_sender); >+ FlexfecSender* flexfec_sender, >+ FrameEncryptorInterface* frame_encryptor, >+ bool require_frame_encryption); > virtual ~RTPSenderVideo(); > > virtual enum VideoCodecType VideoCodecType() const; > >- static RtpUtility::Payload* CreateVideoPayload( >- const char payload_name[RTP_PAYLOAD_NAME_SIZE], >- int8_t payload_type); >+ static RtpUtility::Payload* CreateVideoPayload(absl::string_view payload_name, >+ int8_t payload_type); > > bool SendVideo(enum VideoCodecType video_type, > FrameType frame_type, >@@ -158,6 +162,13 @@ class RTPSenderVideo { > RTC_GUARDED_BY(stats_crit_); > > OneTimeEvent first_frame_sent_; >+ >+ // 
E2EE Custom Video Frame Encryptor (optional) >+ FrameEncryptorInterface* const frame_encryptor_ = nullptr; >+ // If set to true, all outgoing frames are required to pass through an >+ // initialized frame_encryptor_ before being sent out over the network. >+ // Otherwise these payloads will be dropped. >+ bool require_frame_encryption_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_utility.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_utility.cc >index 228572f8d169c21cb0b76d6ba04539cee4647cf3..44c671f5067eaf37bfadf8f960afb8fc8c53cf81 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_utility.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_utility.cc >@@ -10,9 +10,19 @@ > > #include "modules/rtp_rtcp/source/rtp_utility.h" > >+#include <assert.h> >+#include <stddef.h> >+ >+#include "api/array_view.h" >+#include "api/video/video_content_type.h" >+#include "api/video/video_frame_marking.h" >+#include "api/video/video_rotation.h" >+#include "api/video/video_timing.h" > #include "modules/rtp_rtcp/include/rtp_cvo.h" > #include "modules/rtp_rtcp/source/byte_io.h" > #include "modules/rtp_rtcp/source/rtp_header_extensions.h" >+#include "modules/video_coding/codecs/interface/common_constants.h" >+#include "rtc_base/checks.h" > #include "rtc_base/logging.h" > #include "rtc_base/stringutils.h" > >@@ -33,10 +43,6 @@ enum { > * Misc utility routines > */ > >-bool StringCompare(const char* str1, const char* str2, const uint32_t length) { >- return _strnicmp(str1, str2, length) == 0; >-} >- > size_t Word32Align(size_t size) { > uint32_t remainder = size % 4; > if (remainder != 0) >@@ -501,6 +507,10 @@ void RtpHeaderParser::ParseOneByteExtensionHeader( > RTC_LOG(WARNING) > << "RtpGenericFrameDescriptor unsupported by rtp header parser."; > break; >+ case kRtpExtensionColorSpace: >+ RTC_LOG(WARNING) >+ << 
"RtpExtensionColorSpace unsupported by rtp header parser."; >+ break; > case kRtpExtensionNone: > case kRtpExtensionNumberOfExtensions: { > RTC_NOTREACHED() << "Invalid extension type: " << type; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_utility.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_utility.h >index 762f964a2e2f31eaace15485960f3891e3db96bf..408517481bb495e5424eb66cf0d078e5d24cb803 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_utility.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_utility.h >@@ -11,14 +11,14 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_RTP_UTILITY_H_ > #define MODULES_RTP_RTCP_SOURCE_RTP_UTILITY_H_ > >-#include <cstring> >-#include <map> >+#include <stdint.h> >+#include <algorithm> > >-#include "modules/rtp_rtcp/include/receive_statistics.h" >+#include "absl/strings/string_view.h" >+#include "api/rtp_headers.h" >+#include "common_types.h" // NOLINT(build/include) > #include "modules/rtp_rtcp/include/rtp_header_extension_map.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" >-#include "modules/rtp_rtcp/source/rtp_rtcp_config.h" >-#include "rtc_base/deprecation.h" > > namespace webrtc { > >@@ -27,16 +27,15 @@ const uint8_t kRtpMarkerBitMask = 0x80; > namespace RtpUtility { > > struct Payload { >- Payload(const char* name, const PayloadUnion& pu) : typeSpecific(pu) { >- std::strncpy(this->name, name, sizeof(this->name) - 1); >- this->name[sizeof(this->name) - 1] = '\0'; >+ Payload(absl::string_view payload_name, const PayloadUnion& pu) >+ : typeSpecific(pu) { >+ size_t clipped_size = payload_name.copy(name, sizeof(name) - 1); >+ name[clipped_size] = '\0'; > } > char name[RTP_PAYLOAD_NAME_SIZE]; > PayloadUnion typeSpecific; > }; > >-bool StringCompare(const char* str1, const char* str2, const uint32_t length); >- > // Round up to the nearest size that is a multiple of 4. 
> size_t Word32Align(size_t size); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_video_header.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_video_header.cc >index a3ee8baa83e13afbd55f528901c2903bfcf8faf7..bb9413ddd5e28a89f39d77c3452ab1be3ba6165b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_video_header.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_video_header.cc >@@ -12,7 +12,7 @@ > > namespace webrtc { > >-RTPVideoHeader::RTPVideoHeader() : playout_delay(), video_timing() {} >+RTPVideoHeader::RTPVideoHeader() : video_timing() {} > RTPVideoHeader::RTPVideoHeader(const RTPVideoHeader& other) = default; > RTPVideoHeader::~RTPVideoHeader() = default; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_video_header.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_video_header.h >index 288b7d03a0260e52f3de0e12597c4465ec66d0ce..1c75f539a1e86bb63ff1a717e0e1ab256da473b4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_video_header.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/rtp_video_header.h >@@ -10,8 +10,12 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_RTP_VIDEO_HEADER_H_ > #define MODULES_RTP_RTCP_SOURCE_RTP_VIDEO_HEADER_H_ > >+#include <cstdint> >+ > #include "absl/container/inlined_vector.h" >+#include "absl/types/optional.h" > #include "absl/types/variant.h" >+#include "api/video/video_codec_type.h" > #include "api/video/video_content_type.h" > #include "api/video/video_frame_marking.h" > #include "api/video/video_rotation.h" >@@ -56,7 +60,7 @@ struct RTPVideoHeader { > uint8_t simulcastIdx = 0; > VideoCodecType codec = VideoCodecType::kVideoCodecGeneric; > >- PlayoutDelay playout_delay; >+ PlayoutDelay playout_delay = {-1, -1}; > VideoSendTiming video_timing; > FrameMarking frame_marking; > RTPVideoTypeHeader 
video_type_header; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/time_util.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/time_util.cc >index 6ac280a61db7e93e757a8cf7a5d2ce800ca858d5..e65329d06b78052b204053b2fd9da457160de03a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/time_util.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/time_util.cc >@@ -12,6 +12,7 @@ > > #include <algorithm> > >+#include "rtc_base/checks.h" > #include "rtc_base/timeutils.h" > > namespace webrtc { >@@ -22,20 +23,27 @@ inline int64_t DivideRoundToNearest(int64_t x, uint32_t y) { > return (x + y / 2) / y; > } > >-int64_t NtpOffsetUs() { >+int64_t NtpOffsetMsCalledOnce() { > constexpr int64_t kNtpJan1970Sec = 2208988800; >- int64_t clock_time = rtc::TimeMicros(); >- int64_t utc_time = rtc::TimeUTCMicros(); >- return utc_time - clock_time + kNtpJan1970Sec * rtc::kNumMicrosecsPerSec; >+ int64_t clock_time = rtc::TimeMillis(); >+ int64_t utc_time = rtc::TimeUTCMillis(); >+ return utc_time - clock_time + kNtpJan1970Sec * rtc::kNumMillisecsPerSec; > } > > } // namespace > >-NtpTime TimeMicrosToNtp(int64_t time_us) { >+int64_t NtpOffsetMs() { > // Calculate the offset once. >- static int64_t ntp_offset_us = NtpOffsetUs(); >+ static int64_t ntp_offset_ms = NtpOffsetMsCalledOnce(); >+ return ntp_offset_ms; >+} > >- int64_t time_ntp_us = time_us + ntp_offset_us; >+NtpTime TimeMicrosToNtp(int64_t time_us) { >+ // Since this doesn't return a wallclock time, but only an NTP representation >+ // of the rtc::TimeMillis() clock, the exact offset doesn't matter. >+ // To simplify conversions between NTP and RTP time, this offset is >+ // limited to milliseconds in resolution. >+ int64_t time_ntp_us = time_us + NtpOffsetMs() * 1000; > RTC_DCHECK_GE(time_ntp_us, 0); // Time before year 1900 is unsupported. 
> > // TODO(danilchap): Convert both seconds and fraction together using int128 >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/time_util.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/time_util.h >index 672722c07f99ac0fb72773177600e27a509e0194..94b914310ccf20dcab6ca60448b29a9ae1f39243 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/time_util.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/time_util.h >@@ -22,13 +22,14 @@ namespace webrtc { > // difference of the passed values. > // As a result TimeMicrosToNtp(rtc::TimeMicros()) doesn't guarantee to match > // system time. >+// However, TimeMicrosToNtp guarantees that the returned NtpTime will be offset >+// from rtc::TimeMicros() by an integral number of milliseconds. >+// Use NtpOffsetMs() to get that offset value. > NtpTime TimeMicrosToNtp(int64_t time_us); > >-// Converts NTP timestamp to RTP timestamp. >-inline uint32_t NtpToRtp(NtpTime ntp, uint32_t freq) { >- uint32_t tmp = (static_cast<uint64_t>(ntp.fractions()) * freq) >> 32; >- return ntp.seconds() * freq + tmp; >-} >+// Difference between NTP time and the local relative time returned by >+// rtc::TimeMicros(). >+int64_t NtpOffsetMs(); > > // Helper function for compact ntp representation: > // RFC 3550, Section 4. Time Format. 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/tmmbr_help.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/tmmbr_help.cc >index 8aa4530211353c5ad584b4a9c2ff795a49cab0a6..315a4c21209eadbe2baf82363d38f49a14318ea8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/tmmbr_help.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/tmmbr_help.cc >@@ -10,6 +10,7 @@ > > #include "modules/rtp_rtcp/source/tmmbr_help.h" > >+#include <stddef.h> > #include <algorithm> > #include <limits> > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/tmmbr_help.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/tmmbr_help.h >index 91aeaf464101c2cfc0b36a832e90918d074a7e06..bf86f65222c7e092cc848b46e42c0cb4665d17c0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/tmmbr_help.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/tmmbr_help.h >@@ -11,7 +11,9 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_TMMBR_HELP_H_ > #define MODULES_RTP_RTCP_SOURCE_TMMBR_HELP_H_ > >+#include <stdint.h> > #include <vector> >+ > #include "modules/rtp_rtcp/source/rtcp_packet/tmmb_item.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/ulpfec_generator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/ulpfec_generator.cc >index e5777edab8504b1740b164a6b19dff9ab84d60c5..56dae29450e950edd56a75f330e3b059859ca3be 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/ulpfec_generator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/ulpfec_generator.cc >@@ -10,12 +10,15 @@ > > #include "modules/rtp_rtcp/source/ulpfec_generator.h" > >+#include <string.h> >+#include <cstdint> > #include <memory> > #include <utility> > > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > #include 
"modules/rtp_rtcp/source/byte_io.h" > #include "modules/rtp_rtcp/source/forward_error_correction.h" >+#include "modules/rtp_rtcp/source/forward_error_correction_internal.h" > #include "modules/rtp_rtcp/source/rtp_utility.h" > #include "rtc_base/checks.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/ulpfec_generator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/ulpfec_generator.h >index efc753f4c74132331ae6c5357ccccc2b9ece9634..74a1d80256ea9c9c3b96564cb30d7abaaf197891 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/ulpfec_generator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/ulpfec_generator.h >@@ -11,10 +11,13 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_ULPFEC_GENERATOR_H_ > #define MODULES_RTP_RTCP_SOURCE_ULPFEC_GENERATOR_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <list> > #include <memory> > #include <vector> > >+#include "modules/include/module_fec_types.h" > #include "modules/rtp_rtcp/source/forward_error_correction.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/ulpfec_header_reader_writer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/ulpfec_header_reader_writer.cc >index c54d3cdd8f90b1f34124c9fd489e4b183a0ab181..22af7e7736904bb24d9b83767cbb45bbc973c95f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/ulpfec_header_reader_writer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/ulpfec_header_reader_writer.cc >@@ -10,11 +10,12 @@ > > #include "modules/rtp_rtcp/source/ulpfec_header_reader_writer.h" > >-#include <utility> >+#include <string.h> > > #include "modules/rtp_rtcp/source/byte_io.h" > #include "modules/rtp_rtcp/source/forward_error_correction_internal.h" > #include "rtc_base/checks.h" >+#include "rtc_base/scoped_ref_ptr.h" > > namespace webrtc { > >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/ulpfec_header_reader_writer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/ulpfec_header_reader_writer.h >index fc83afdf0cc025a92966ba355f6de6a6cb744bc1..a8bb737dbb58da05eb2dea68e1e5f26857c20ff8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/ulpfec_header_reader_writer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/ulpfec_header_reader_writer.h >@@ -11,6 +11,9 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_ULPFEC_HEADER_READER_WRITER_H_ > #define MODULES_RTP_RTCP_SOURCE_ULPFEC_HEADER_READER_WRITER_H_ > >+#include <stddef.h> >+#include <stdint.h> >+ > #include "modules/rtp_rtcp/source/forward_error_correction.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/ulpfec_receiver_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/ulpfec_receiver_impl.cc >index eb09c95813413d57595ebd94ab899684ab838dd4..7da6b88a007cf5ddefdca8c218ebb1975f86f66d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/ulpfec_receiver_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/ulpfec_receiver_impl.cc >@@ -10,12 +10,13 @@ > > #include "modules/rtp_rtcp/source/ulpfec_receiver_impl.h" > >+#include <string.h> > #include <memory> > #include <utility> > > #include "modules/rtp_rtcp/source/byte_io.h" >-#include "rtc_base/checks.h" > #include "rtc_base/logging.h" >+#include "rtc_base/scoped_ref_ptr.h" > #include "system_wrappers/include/clock.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/ulpfec_receiver_impl.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/ulpfec_receiver_impl.h >index 96367dc7aa821edcc856e7eda61ee3c35439faf8..0943266161802c7842dd246084a6f9c507887291 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/ulpfec_receiver_impl.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/rtp_rtcp/source/ulpfec_receiver_impl.h >@@ -11,9 +11,12 @@ > #ifndef MODULES_RTP_RTCP_SOURCE_ULPFEC_RECEIVER_IMPL_H_ > #define MODULES_RTP_RTCP_SOURCE_ULPFEC_RECEIVER_IMPL_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <memory> > #include <vector> > >+#include "api/rtp_headers.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > #include "modules/rtp_rtcp/include/ulpfec_receiver.h" > #include "modules/rtp_rtcp/source/forward_error_correction.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/utility/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/utility/BUILD.gn >index 32da10950f293e626a92f97f68f0af55f7adc028..1d780b0e4a770fd1eff3565ef11e6449572f68da 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/utility/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/utility/BUILD.gn >@@ -26,7 +26,6 @@ rtc_static_library("utility") { > > deps = [ > "..:module_api", >- "../..:webrtc_common", > "../../common_audio", > "../../rtc_base:checks", > "../../rtc_base:rtc_base_approved", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/utility/source/process_thread_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/utility/source/process_thread_impl.cc >index 97272da2bf10a76e32baf105e34ab7ce3f518e90..20f26d927a39463898572ba9f5f877126a54d796 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/utility/source/process_thread_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/utility/source/process_thread_impl.cc >@@ -10,6 +10,8 @@ > > #include "modules/utility/source/process_thread_impl.h" > >+#include <string> >+ > #include "modules/include/module.h" > #include "rtc_base/checks.h" > #include "rtc_base/task_queue.h" >@@ -42,9 +44,7 @@ std::unique_ptr<ProcessThread> ProcessThread::Create(const char* thread_name) { > 
} > > ProcessThreadImpl::ProcessThreadImpl(const char* thread_name) >- : wake_up_(EventWrapper::Create()), >- stop_(false), >- thread_name_(thread_name) {} >+ : stop_(false), thread_name_(thread_name) {} > > ProcessThreadImpl::~ProcessThreadImpl() { > RTC_DCHECK(thread_checker_.CalledOnValidThread()); >@@ -83,7 +83,7 @@ void ProcessThreadImpl::Stop() { > stop_ = true; > } > >- wake_up_->Set(); >+ wake_up_.Set(); > > thread_->Stop(); > stop_ = false; >@@ -102,7 +102,7 @@ void ProcessThreadImpl::WakeUp(Module* module) { > m.next_callback = kCallProcessImmediately; > } > } >- wake_up_->Set(); >+ wake_up_.Set(); > } > > void ProcessThreadImpl::PostTask(std::unique_ptr<rtc::QueuedTask> task) { >@@ -111,7 +111,7 @@ void ProcessThreadImpl::PostTask(std::unique_ptr<rtc::QueuedTask> task) { > rtc::CritScope lock(&lock_); > queue_.push(task.release()); > } >- wake_up_->Set(); >+ wake_up_.Set(); > } > > void ProcessThreadImpl::RegisterModule(Module* module, >@@ -145,7 +145,7 @@ void ProcessThreadImpl::RegisterModule(Module* module, > // Wake the thread calling ProcessThreadImpl::Process() to update the > // waiting time. The waiting time for the just registered module may be > // shorter than all other registered modules. 
>- wake_up_->Set(); >+ wake_up_.Set(); > } > > void ProcessThreadImpl::DeRegisterModule(Module* module) { >@@ -215,7 +215,7 @@ bool ProcessThreadImpl::Process() { > > int64_t time_to_wait = next_checkpoint - rtc::TimeMillis(); > if (time_to_wait > 0) >- wake_up_->Wait(static_cast<unsigned long>(time_to_wait)); >+ wake_up_.Wait(static_cast<int>(time_to_wait)); > > return true; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/utility/source/process_thread_impl.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/utility/source/process_thread_impl.h >index c1f6ed44475d6d7485704247eb6bd0dba2581ff1..fff21d04619c029641a690ccb47dd63f27f7eff2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/utility/source/process_thread_impl.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/utility/source/process_thread_impl.h >@@ -11,16 +11,19 @@ > #ifndef MODULES_UTILITY_SOURCE_PROCESS_THREAD_IMPL_H_ > #define MODULES_UTILITY_SOURCE_PROCESS_THREAD_IMPL_H_ > >+#include <stdint.h> > #include <list> > #include <memory> > #include <queue> > >+#include "modules/include/module.h" > #include "modules/utility/include/process_thread.h" > #include "rtc_base/criticalsection.h" >+#include "rtc_base/event.h" > #include "rtc_base/location.h" > #include "rtc_base/platform_thread.h" >+#include "rtc_base/task_queue.h" > #include "rtc_base/thread_checker.h" >-#include "system_wrappers/include/event_wrapper.h" > > namespace webrtc { > >@@ -72,7 +75,7 @@ class ProcessThreadImpl : public ProcessThread { > rtc::CriticalSection lock_; // Used to guard modules_, tasks_ and stop_. > > rtc::ThreadChecker thread_checker_; >- const std::unique_ptr<EventWrapper> wake_up_; >+ rtc::Event wake_up_; > // TODO(pbos): Remove unique_ptr and stop recreating the thread. 
> std::unique_ptr<rtc::PlatformThread> thread_; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/utility/source/process_thread_impl_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/utility/source/process_thread_impl_unittest.cc >index d5926b26c3e2ca4b07722663cac6bb0c46e15c70..19157963a3bbbf9f93d78dc33ba055b4299d8ed7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/utility/source/process_thread_impl_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/utility/source/process_thread_impl_unittest.cc >@@ -42,14 +42,14 @@ class MockModule : public Module { > > class RaiseEventTask : public rtc::QueuedTask { > public: >- RaiseEventTask(EventWrapper* event) : event_(event) {} >+ RaiseEventTask(rtc::Event* event) : event_(event) {} > bool Run() override { > event_->Set(); > return true; > } > > private: >- EventWrapper* event_; >+ rtc::Event* event_; > }; > > ACTION_P(SetEvent, event) { >@@ -83,19 +83,19 @@ TEST(ProcessThreadImpl, ProcessCall) { > ProcessThreadImpl thread("ProcessThread"); > thread.Start(); > >- std::unique_ptr<EventWrapper> event(EventWrapper::Create()); >+ rtc::Event event; > > MockModule module; > EXPECT_CALL(module, TimeUntilNextProcess()) > .WillOnce(Return(0)) > .WillRepeatedly(Return(1)); > EXPECT_CALL(module, Process()) >- .WillOnce(DoAll(SetEvent(event.get()), Return())) >+ .WillOnce(DoAll(SetEvent(&event), Return())) > .WillRepeatedly(Return()); > EXPECT_CALL(module, ProcessThreadAttached(&thread)).Times(1); > > thread.RegisterModule(&module, RTC_FROM_HERE); >- EXPECT_EQ(kEventSignaled, event->Wait(kEventWaitTimeout)); >+ EXPECT_TRUE(event.Wait(kEventWaitTimeout)); > > EXPECT_CALL(module, ProcessThreadAttached(nullptr)).Times(1); > thread.Stop(); >@@ -105,21 +105,21 @@ TEST(ProcessThreadImpl, ProcessCall) { > // call to Start(). 
> TEST(ProcessThreadImpl, ProcessCall2) { > ProcessThreadImpl thread("ProcessThread"); >- std::unique_ptr<EventWrapper> event(EventWrapper::Create()); >+ rtc::Event event; > > MockModule module; > EXPECT_CALL(module, TimeUntilNextProcess()) > .WillOnce(Return(0)) > .WillRepeatedly(Return(1)); > EXPECT_CALL(module, Process()) >- .WillOnce(DoAll(SetEvent(event.get()), Return())) >+ .WillOnce(DoAll(SetEvent(&event), Return())) > .WillRepeatedly(Return()); > > thread.RegisterModule(&module, RTC_FROM_HERE); > > EXPECT_CALL(module, ProcessThreadAttached(&thread)).Times(1); > thread.Start(); >- EXPECT_EQ(kEventSignaled, event->Wait(kEventWaitTimeout)); >+ EXPECT_TRUE(event.Wait(kEventWaitTimeout)); > > EXPECT_CALL(module, ProcessThreadAttached(nullptr)).Times(1); > thread.Stop(); >@@ -129,7 +129,7 @@ TEST(ProcessThreadImpl, ProcessCall2) { > // After unregistration, we should not receive any further callbacks. > TEST(ProcessThreadImpl, Deregister) { > ProcessThreadImpl thread("ProcessThread"); >- std::unique_ptr<EventWrapper> event(EventWrapper::Create()); >+ rtc::Event event; > > int process_count = 0; > MockModule module; >@@ -137,8 +137,7 @@ TEST(ProcessThreadImpl, Deregister) { > .WillOnce(Return(0)) > .WillRepeatedly(Return(1)); > EXPECT_CALL(module, Process()) >- .WillOnce( >- DoAll(SetEvent(event.get()), Increment(&process_count), Return())) >+ .WillOnce(DoAll(SetEvent(&event), Increment(&process_count), Return())) > .WillRepeatedly(DoAll(Increment(&process_count), Return())); > > thread.RegisterModule(&module, RTC_FROM_HERE); >@@ -146,7 +145,7 @@ TEST(ProcessThreadImpl, Deregister) { > EXPECT_CALL(module, ProcessThreadAttached(&thread)).Times(1); > thread.Start(); > >- EXPECT_EQ(kEventSignaled, event->Wait(kEventWaitTimeout)); >+ EXPECT_TRUE(event.Wait(kEventWaitTimeout)); > > EXPECT_CALL(module, ProcessThreadAttached(nullptr)).Times(1); > thread.DeRegisterModule(&module); >@@ -155,7 +154,7 @@ TEST(ProcessThreadImpl, Deregister) { > int count_after_deregister = 
process_count; > > // We shouldn't get any more callbacks. >- EXPECT_EQ(kEventTimeout, event->Wait(20)); >+ EXPECT_FALSE(event.Wait(20)); > EXPECT_EQ(count_after_deregister, process_count); > thread.Stop(); > } >@@ -167,7 +166,7 @@ void ProcessCallAfterAFewMs(int64_t milliseconds) { > ProcessThreadImpl thread("ProcessThread"); > thread.Start(); > >- std::unique_ptr<EventWrapper> event(EventWrapper::Create()); >+ rtc::Event event; > > MockModule module; > int64_t start_time = 0; >@@ -176,8 +175,7 @@ void ProcessCallAfterAFewMs(int64_t milliseconds) { > .WillOnce(DoAll(SetTimestamp(&start_time), Return(milliseconds))) > .WillRepeatedly(Return(milliseconds)); > EXPECT_CALL(module, Process()) >- .WillOnce( >- DoAll(SetTimestamp(&called_time), SetEvent(event.get()), Return())) >+ .WillOnce(DoAll(SetTimestamp(&called_time), SetEvent(&event), Return())) > .WillRepeatedly(Return()); > > EXPECT_CALL(module, ProcessThreadAttached(&thread)).Times(1); >@@ -185,7 +183,7 @@ void ProcessCallAfterAFewMs(int64_t milliseconds) { > > // Add a buffer of 50ms due to slowness of some trybots > // (e.g. 
win_drmemory_light) >- EXPECT_EQ(kEventSignaled, event->Wait(milliseconds + 50)); >+ EXPECT_TRUE(event.Wait(milliseconds + 50)); > > EXPECT_CALL(module, ProcessThreadAttached(nullptr)).Times(1); > thread.Stop(); >@@ -230,7 +228,7 @@ TEST(ProcessThreadImpl, DISABLED_Process50Times) { > ProcessThreadImpl thread("ProcessThread"); > thread.Start(); > >- std::unique_ptr<EventWrapper> event(EventWrapper::Create()); >+ rtc::Event event; > > MockModule module; > int callback_count = 0; >@@ -242,7 +240,7 @@ TEST(ProcessThreadImpl, DISABLED_Process50Times) { > EXPECT_CALL(module, ProcessThreadAttached(&thread)).Times(1); > thread.RegisterModule(&module, RTC_FROM_HERE); > >- EXPECT_EQ(kEventTimeout, event->Wait(1000)); >+ EXPECT_TRUE(event.Wait(1000)); > > EXPECT_CALL(module, ProcessThreadAttached(nullptr)).Times(1); > thread.Stop(); >@@ -261,8 +259,8 @@ TEST(ProcessThreadImpl, WakeUp) { > ProcessThreadImpl thread("ProcessThread"); > thread.Start(); > >- std::unique_ptr<EventWrapper> started(EventWrapper::Create()); >- std::unique_ptr<EventWrapper> called(EventWrapper::Create()); >+ rtc::Event started; >+ rtc::Event called; > > MockModule module; > int64_t start_time; >@@ -276,20 +274,19 @@ TEST(ProcessThreadImpl, WakeUp) { > // The second time TimeUntilNextProcess is then called, is after Process > // has been called and we don't expect any more calls. 
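A recurring change in the hunks above is the migration from `std::unique_ptr<EventWrapper>` (with `kEventSignaled`/`kEventTimeout` return codes) to a plain `rtc::Event` member whose `Wait(ms)` returns a `bool`, which is why the tests switch to `EXPECT_TRUE`/`EXPECT_FALSE`. A minimal sketch of that contract, using a hypothetical `SimpleEvent` built on `std::condition_variable` (not the real `rtc::Event`, and not modeling its auto-reset behavior):

```cpp
#include <chrono>
#include <condition_variable>
#include <mutex>

// Hypothetical stand-in for rtc::Event: Wait(ms) returns true if the
// event was signaled before the timeout expired, false otherwise.
class SimpleEvent {
 public:
  void Set() {
    std::lock_guard<std::mutex> lock(mutex_);
    signaled_ = true;
    cv_.notify_all();
  }
  bool Wait(int timeout_ms) {
    std::unique_lock<std::mutex> lock(mutex_);
    return cv_.wait_for(lock, std::chrono::milliseconds(timeout_ms),
                        [this] { return signaled_; });
  }

 private:
  std::mutex mutex_;
  std::condition_variable cv_;
  bool signaled_ = false;
};
```

The bool return also explains the `Wait(static_cast<int>(time_to_wait))` cast in `ProcessThreadImpl::Process()`: the new API takes a signed millisecond count rather than the old `unsigned long`.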
> EXPECT_CALL(module, TimeUntilNextProcess()) >- .WillOnce(DoAll(SetTimestamp(&start_time), SetEvent(started.get()), >- Return(1000))) >+ .WillOnce( >+ DoAll(SetTimestamp(&start_time), SetEvent(&started), Return(1000))) > .WillOnce(Return(1000)); > EXPECT_CALL(module, Process()) >- .WillOnce( >- DoAll(SetTimestamp(&called_time), SetEvent(called.get()), Return())) >+ .WillOnce(DoAll(SetTimestamp(&called_time), SetEvent(&called), Return())) > .WillRepeatedly(Return()); > > EXPECT_CALL(module, ProcessThreadAttached(&thread)).Times(1); > thread.RegisterModule(&module, RTC_FROM_HERE); > >- EXPECT_EQ(kEventSignaled, started->Wait(kEventWaitTimeout)); >+ EXPECT_TRUE(started.Wait(kEventWaitTimeout)); > thread.WakeUp(&module); >- EXPECT_EQ(kEventSignaled, called->Wait(kEventWaitTimeout)); >+ EXPECT_TRUE(called.Wait(kEventWaitTimeout)); > > EXPECT_CALL(module, ProcessThreadAttached(nullptr)).Times(1); > thread.Stop(); >@@ -304,11 +301,11 @@ TEST(ProcessThreadImpl, WakeUp) { > // thread. > TEST(ProcessThreadImpl, PostTask) { > ProcessThreadImpl thread("ProcessThread"); >- std::unique_ptr<EventWrapper> task_ran(EventWrapper::Create()); >- std::unique_ptr<RaiseEventTask> task(new RaiseEventTask(task_ran.get())); >+ rtc::Event task_ran; >+ std::unique_ptr<RaiseEventTask> task(new RaiseEventTask(&task_ran)); > thread.Start(); > thread.PostTask(std::move(task)); >- EXPECT_EQ(kEventSignaled, task_ran->Wait(kEventWaitTimeout)); >+ EXPECT_TRUE(task_ran.Wait(kEventWaitTimeout)); > thread.Stop(); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_capture/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_capture/BUILD.gn >index ef6f335e4939d576ce6181e89094aebffc10c4ab..44db2eb477aa28a53fc6b1091baee0cf764bc798 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_capture/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_capture/BUILD.gn >@@ -38,6 +38,7 @@ rtc_static_library("video_capture_module") { > 
"../../rtc_base:stringutils", > "../../rtc_base/synchronization:rw_lock_wrapper", > "../../system_wrappers", >+ "//third_party/abseil-cpp/absl/strings", > "//third_party/libyuv", > ] > } >@@ -190,6 +191,7 @@ if (!build_with_chromium) { > "../../common_video:common_video", > "../../rtc_base:rtc_base_approved", > "../../system_wrappers:system_wrappers", >+ "../../test:test_support", > "../../test:video_test_common", > "../utility", > "//testing/gtest", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_capture/device_info_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_capture/device_info_impl.cc >index 2f0a39f2964fb37c703f9864f14ad77a0eb361fa..cdcb5ddc4a82744cc298de512d8204e27a8026b5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_capture/device_info_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_capture/device_info_impl.cc >@@ -11,6 +11,7 @@ > #include <assert.h> > #include <stdlib.h> > >+#include "absl/strings/match.h" > #include "modules/video_capture/device_info_impl.h" > #include "modules/video_capture/video_capture_config.h" > #include "rtc_base/logging.h" >@@ -42,14 +43,12 @@ int32_t DeviceInfoImpl::NumberOfCapabilities(const char* deviceUniqueIdUTF8) { > > _apiLock.AcquireLockShared(); > >- if (_lastUsedDeviceNameLength == strlen((char*)deviceUniqueIdUTF8)) { >- // Is it the same device that is asked for again. >- if (_strnicmp((char*)_lastUsedDeviceName, (char*)deviceUniqueIdUTF8, >- _lastUsedDeviceNameLength) == 0) { >- // yes >- _apiLock.ReleaseLockShared(); >- return static_cast<int32_t>(_captureCapabilities.size()); >- } >+ // Is it the same device that is asked for again. >+ if (absl::EqualsIgnoreCase( >+ deviceUniqueIdUTF8, >+ absl::string_view(_lastUsedDeviceName, _lastUsedDeviceNameLength))) { >+ _apiLock.ReleaseLockShared(); >+ return static_cast<int32_t>(_captureCapabilities.size()); > } > // Need to get exclusive rights to create the new capability map. 
> _apiLock.ReleaseLockShared(); >@@ -66,9 +65,9 @@ int32_t DeviceInfoImpl::GetCapability(const char* deviceUniqueIdUTF8, > > ReadLockScoped cs(_apiLock); > >- if ((_lastUsedDeviceNameLength != strlen((char*)deviceUniqueIdUTF8)) || >- (_strnicmp((char*)_lastUsedDeviceName, (char*)deviceUniqueIdUTF8, >- _lastUsedDeviceNameLength) != 0)) { >+ if (!absl::EqualsIgnoreCase( >+ deviceUniqueIdUTF8, >+ absl::string_view(_lastUsedDeviceName, _lastUsedDeviceNameLength))) { > _apiLock.ReleaseLockShared(); > _apiLock.AcquireLockExclusive(); > if (-1 == CreateCapabilityMap(deviceUniqueIdUTF8)) { >@@ -100,9 +99,9 @@ int32_t DeviceInfoImpl::GetBestMatchedCapability( > return -1; > > ReadLockScoped cs(_apiLock); >- if ((_lastUsedDeviceNameLength != strlen((char*)deviceUniqueIdUTF8)) || >- (_strnicmp((char*)_lastUsedDeviceName, (char*)deviceUniqueIdUTF8, >- _lastUsedDeviceNameLength) != 0)) { >+ if (!absl::EqualsIgnoreCase( >+ deviceUniqueIdUTF8, >+ absl::string_view(_lastUsedDeviceName, _lastUsedDeviceNameLength))) { > _apiLock.ReleaseLockShared(); > _apiLock.AcquireLockExclusive(); > if (-1 == CreateCapabilityMap(deviceUniqueIdUTF8)) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_capture/video_capture_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_capture/video_capture_impl.cc >index 53afed00b644e895fd2da5e083b328770713044a..235fd3438a216469e78297acade568132574aac7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_capture/video_capture_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_capture/video_capture_impl.cc >@@ -143,7 +143,7 @@ int32_t VideoCaptureImpl::IncomingFrame(uint8_t* videoFrame, > int stride_y = width; > int stride_uv = (width + 1) / 2; > int target_width = width; >- int target_height = height; >+ int target_height = abs(height); > > // SetApplyRotation doesn't take any lock. Make a local copy here. 
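The `DeviceInfoImpl` hunks above fold a two-step check (`strlen` length comparison followed by `_strnicmp`) into a single `absl::EqualsIgnoreCase` call over a `string_view` of the cached name. A portable stand-in showing the equivalent semantics (same length *and* case-insensitive match); this is a sketch, not the Abseil implementation:

```cpp
#include <cctype>
#include <cstddef>
#include <string>

// Stand-in for absl::EqualsIgnoreCase: true only when both strings
// have the same length and compare equal ignoring ASCII case.
bool EqualsIgnoreCase(const std::string& a, const std::string& b) {
  if (a.size() != b.size()) return false;
  for (std::size_t i = 0; i < a.size(); ++i) {
    if (std::tolower(static_cast<unsigned char>(a[i])) !=
        std::tolower(static_cast<unsigned char>(b[i])))
      return false;
  }
  return true;
}
```

Because the length check is built in, the patched code no longer needs the separate `_lastUsedDeviceNameLength == strlen(...)` guard, and it drops the Windows-only `_strnicmp` dependency.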
> bool apply_rotation = apply_rotation_; >@@ -163,7 +163,7 @@ int32_t VideoCaptureImpl::IncomingFrame(uint8_t* videoFrame, > > // TODO(nisse): Use a pool? > rtc::scoped_refptr<I420Buffer> buffer = I420Buffer::Create( >- target_width, abs(target_height), stride_y, stride_uv, stride_uv); >+ target_width, target_height, stride_y, stride_uv, stride_uv); > > libyuv::RotationMode rotation_mode = libyuv::kRotate0; > if (apply_rotation) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_capture/windows/video_capture_ds.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_capture/windows/video_capture_ds.cc >index 778fd8761ac19b3961ac25747a5c743ac581157c..ea377a3b05cb6d5d5a454fbef6307e5bcb31b784 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_capture/windows/video_capture_ds.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_capture/windows/video_capture_ds.cc >@@ -95,6 +95,10 @@ int32_t VideoCaptureDS::Init(const char* deviceUniqueIdUTF8) { > } > > _outputCapturePin = GetOutputPin(_captureFilter, PIN_CATEGORY_CAPTURE); >+ if (!_outputCapturePin) { >+ RTC_LOG(LS_INFO) << "Failed to get output capture pin"; >+ return -1; >+ } > > // Create the sink filte used for receiving Captured frames. > _sinkFilter = new CaptureSinkFilter(SINK_FILTER_NAME, NULL, &hr, *this); >@@ -109,7 +113,12 @@ int32_t VideoCaptureDS::Init(const char* deviceUniqueIdUTF8) { > RTC_LOG(LS_INFO) << "Failed to add the send filter to the graph."; > return -1; > } >+ > _inputSendPin = GetInputPin(_sinkFilter); >+ if (!_inputSendPin) { >+ RTC_LOG(LS_INFO) << "Failed to get input send pin"; >+ return -1; >+ } > > // Temporary connect here. > // This is done so that no one else can use the capture device. 
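The `VideoCaptureImpl::IncomingFrame` hunk moves the `abs()` from the `I420Buffer::Create` call up to the `target_height` initialization. A negative height is a common convention for bottom-up frame layouts (e.g. Windows DIBs), so normalizing once up front means every later use of `target_height` sees a positive value. A trivial sketch of the normalization:

```cpp
#include <cstdlib>

// Normalize a possibly-negative (bottom-up) frame height once, so all
// downstream uses (buffer allocation, rotation) get a positive value.
int NormalizedTargetHeight(int height) {
  return std::abs(height);
}
```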
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/BUILD.gn >index e759a4a7e8dd5e635dd7f2cef40e55848d6bacb8..ac359bc64f827712ef25406d7d2cb06f5a9da475 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/BUILD.gn >@@ -17,8 +17,8 @@ rtc_static_library("encoded_frame") { > deps = [ > ":video_codec_interface", > "../../:webrtc_common", >+ "../../api/video:encoded_image", > "../../api/video:video_frame_i420", >- "../../common_video:common_video", > "../../modules:module_api", > "../../modules:module_api_public", > "../../modules/video_coding:video_coding_utility", >@@ -57,6 +57,7 @@ rtc_static_library("nack_module") { > "../../rtc_base:rtc_base_approved", > "../../rtc_base:rtc_numerics", > "../../system_wrappers", >+ "../../system_wrappers:field_trial", > "../utility:utility", > ] > } >@@ -158,6 +159,7 @@ rtc_static_library("video_coding") { > "..:module_api_public", > "../..:webrtc_common", > "../../api:fec_controller_api", >+ "../../api/video:builtin_video_bitrate_allocator_factory", > "../../api/video:encoded_frame", > "../../api/video:video_bitrate_allocator", > "../../api/video:video_frame", >@@ -165,12 +167,14 @@ rtc_static_library("video_coding") { > "../../api/video_codecs:video_codecs_api", > "../../common_video", > "../../rtc_base:checks", >+ "../../rtc_base:deprecation", > "../../rtc_base:rtc_base", > "../../rtc_base:rtc_base_approved", > "../../rtc_base:rtc_numerics", > "../../rtc_base:rtc_task_queue", > "../../rtc_base:sequenced_task_checker", > "../../rtc_base/experiments:alr_experiment", >+ "../../rtc_base/experiments:jitter_upper_bound_experiment", > "../../rtc_base/experiments:rtt_mult_experiment", > "../../rtc_base/system:fallthrough", > "../../rtc_base/third_party/base64", >@@ -238,8 +242,6 @@ rtc_source_set("video_coding_utility") { > 
"utility/framerate_controller.h", > "utility/ivf_file_writer.cc", > "utility/ivf_file_writer.h", >- "utility/moving_average.cc", >- "utility/moving_average.h", > "utility/quality_scaler.cc", > "utility/quality_scaler.h", > "utility/simulcast_rate_allocator.cc", >@@ -261,6 +263,7 @@ rtc_source_set("video_coding_utility") { > ":video_codec_interface", > "..:module_api", > "../..:webrtc_common", >+ "../../api/video:encoded_image", > "../../api/video:video_bitrate_allocator", > "../../api/video_codecs:video_codecs_api", > "../../common_video", >@@ -302,8 +305,10 @@ rtc_static_library("webrtc_h264") { > "../../media:rtc_media_base", > "../../rtc_base:checks", > "../../rtc_base:rtc_base", >+ "../../rtc_base/system:rtc_export", > "../../system_wrappers:metrics", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings", > "//third_party/libyuv", > ] > >@@ -376,6 +381,7 @@ rtc_static_library("webrtc_multiplex") { > ":video_coding_utility", > "..:module_api", > "../..:webrtc_common", >+ "../../api/video:encoded_image", > "../../api/video:video_frame", > "../../api/video:video_frame_i420", > "../../api/video_codecs:video_codecs_api", >@@ -391,22 +397,13 @@ rtc_static_library("webrtc_vp8") { > visibility = [ "*" ] > poisonous = [ "software_video_codecs" ] > sources = [ >- "codecs/vp8/default_temporal_layers.cc", >- "codecs/vp8/default_temporal_layers.h", >- "codecs/vp8/include/temporal_layers_checker.h", > "codecs/vp8/include/vp8.h", >- "codecs/vp8/include/vp8_temporal_layers.h", > "codecs/vp8/libvpx_interface.cc", > "codecs/vp8/libvpx_interface.h", > "codecs/vp8/libvpx_vp8_decoder.cc", > "codecs/vp8/libvpx_vp8_decoder.h", > "codecs/vp8/libvpx_vp8_encoder.cc", > "codecs/vp8/libvpx_vp8_encoder.h", >- "codecs/vp8/screenshare_layers.cc", >- "codecs/vp8/screenshare_layers.h", >- "codecs/vp8/temporal_layers.h", >- "codecs/vp8/temporal_layers_checker.cc", >- "codecs/vp8/vp8_temporal_layers.cc", > ] > > if (!build_with_chromium && is_clang) { >@@ -418,14 
+415,18 @@ rtc_static_library("webrtc_vp8") { > ":codec_globals_headers", > ":video_codec_interface", > ":video_coding_utility", >+ ":webrtc_vp8_temporal_layers", > "..:module_api", > "../..:webrtc_common", >+ "../../api/video:encoded_image", > "../../api/video:video_frame", >+ "../../api/video_codecs:create_vp8_temporal_layers", > "../../api/video_codecs:video_codecs_api", > "../../common_video", > "../../rtc_base:checks", > "../../rtc_base:rtc_base_approved", > "../../rtc_base:rtc_numerics", >+ "../../rtc_base/experiments:cpu_speed_experiment", > "../../system_wrappers", > "../../system_wrappers:field_trial", > "../../system_wrappers:metrics", >@@ -438,6 +439,41 @@ rtc_static_library("webrtc_vp8") { > } > } > >+rtc_static_library("webrtc_vp8_temporal_layers") { >+ visibility = [ "*" ] >+ sources = [ >+ "codecs/vp8/default_temporal_layers.cc", >+ "codecs/vp8/default_temporal_layers.h", >+ "codecs/vp8/include/temporal_layers_checker.h", >+ "codecs/vp8/screenshare_layers.cc", >+ "codecs/vp8/screenshare_layers.h", >+ "codecs/vp8/temporal_layers.h", >+ "codecs/vp8/temporal_layers_checker.cc", >+ ] >+ >+ if (!build_with_chromium && is_clang) { >+ # Suppress warnings from the Chromium Clang plugin (bugs.webrtc.org/163). >+ suppressed_configs += [ "//build/config/clang:find_bad_constructs" ] >+ } >+ >+ deps = [ >+ ":codec_globals_headers", >+ ":video_codec_interface", >+ ":video_coding_utility", >+ "..:module_api", >+ "../..:webrtc_common", >+ "../../api/video_codecs:video_codecs_api", >+ "../../rtc_base:checks", >+ "../../rtc_base:rtc_base_approved", >+ "../../rtc_base:rtc_numerics", >+ "../../system_wrappers", >+ "../../system_wrappers:field_trial", >+ "../../system_wrappers:metrics", >+ "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/types:optional", >+ ] >+} >+ > # This target includes VP9 files that may be used for any VP9 codec, internal SW or external HW. 
> rtc_static_library("webrtc_vp9_helpers") { > sources = [ >@@ -465,19 +501,14 @@ rtc_static_library("webrtc_vp9_helpers") { > rtc_static_library("webrtc_vp9") { > visibility = [ "*" ] > poisonous = [ "software_video_codecs" ] >- if (rtc_libvpx_build_vp9) { >- sources = [ >- "codecs/vp9/include/vp9.h", >- "codecs/vp9/vp9_frame_buffer_pool.cc", >- "codecs/vp9/vp9_frame_buffer_pool.h", >- "codecs/vp9/vp9_impl.cc", >- "codecs/vp9/vp9_impl.h", >- ] >- } else { >- sources = [ >- "codecs/vp9/vp9_noop.cc", >- ] >- } >+ sources = [ >+ "codecs/vp9/include/vp9.h", >+ "codecs/vp9/vp9.cc", >+ "codecs/vp9/vp9_frame_buffer_pool.cc", >+ "codecs/vp9/vp9_frame_buffer_pool.h", >+ "codecs/vp9/vp9_impl.cc", >+ "codecs/vp9/vp9_impl.h", >+ ] > > if (!build_with_chromium && is_clang) { > # Suppress warnings from the Chromium Clang plugin (bugs.webrtc.org/163). >@@ -574,7 +605,10 @@ if (rtc_include_tests) { > ":video_coding", > ":video_coding_utility", > "../../:webrtc_common", >+ "../../api:mock_video_decoder", >+ "../../api:mock_video_encoder", > "../../api:simulcast_test_fixture_api", >+ "../../api/video:encoded_image", > "../../api/video:video_frame", > "../../api/video:video_frame_i420", > "../../api/video_codecs:video_codecs_api", >@@ -606,6 +640,7 @@ if (rtc_include_tests) { > ":webrtc_vp9_helpers", > "../..:webrtc_common", > "../../api:videocodec_test_fixture_api", >+ "../../api/video:builtin_video_bitrate_allocator_factory", > "../../api/video:video_bitrate_allocator", > "../../api/video:video_frame", > "../../api/video:video_frame_i420", >@@ -674,6 +709,7 @@ if (rtc_include_tests) { > ":webrtc_vp9_helpers", > "../..:webrtc_common", > "../../api:videocodec_test_fixture_api", >+ "../../api/test/video:function_video_factory", > "../../api/video_codecs:video_codecs_api", > "../../call:video_stream_api", > "../../common_video", >@@ -719,6 +755,7 @@ if (rtc_include_tests) { > } > > deps = [ >+ ":mock_headers", > ":video_codecs_test_framework", > ":video_coding_utility", > 
":videocodec_test_impl", >@@ -730,7 +767,10 @@ if (rtc_include_tests) { > "../..:webrtc_common", > "../../api:create_videocodec_test_fixture_api", > "../../api:mock_video_codec_factory", >+ "../../api:mock_video_decoder", >+ "../../api:mock_video_encoder", > "../../api:videocodec_test_fixture_api", >+ "../../api/test/video:function_video_factory", > "../../api/video:video_frame", > "../../api/video:video_frame_i420", > "../../api/video_codecs:rtc_software_fallback_wrappers", >@@ -739,11 +779,11 @@ if (rtc_include_tests) { > "../../media:rtc_h264_profile_id", > "../../media:rtc_internal_video_codecs", > "../../media:rtc_media_base", >+ "../../media:rtc_simulcast_encoder_adapter", > "../../media:rtc_vp9_profile", > "../../rtc_base:rtc_base", > "../../test:field_trial", > "../../test:fileutils", >- "../../test:test_common", > "../../test:test_support", > "../../test:video_test_common", > "../rtp_rtcp:rtp_rtcp_format", >@@ -774,6 +814,10 @@ if (rtc_include_tests) { > # Suppress warnings from the Chromium Clang plugin (bugs.webrtc.org/163). 
> suppressed_configs += [ "//build/config/clang:find_bad_constructs" ] > } >+ >+ if (rtc_build_libvpx) { >+ deps += [ rtc_libvpx_dir ] >+ } > } > > rtc_source_set("video_coding_unittests") { >@@ -804,13 +848,11 @@ if (rtc_include_tests) { > "session_info_unittest.cc", > "test/stream_generator.cc", > "test/stream_generator.h", >- "test/test_util.h", > "timing_unittest.cc", > "utility/default_video_bitrate_allocator_unittest.cc", > "utility/frame_dropper_unittest.cc", > "utility/framerate_controller_unittest.cc", > "utility/ivf_file_writer_unittest.cc", >- "utility/moving_average_unittest.cc", > "utility/quality_scaler_unittest.cc", > "utility/simulcast_rate_allocator_unittest.cc", > "video_codec_initializer_unittest.cc", >@@ -839,16 +881,23 @@ if (rtc_include_tests) { > ":videocodec_test_impl", > ":webrtc_h264", > ":webrtc_vp8", >+ ":webrtc_vp8_temporal_layers", > ":webrtc_vp9", > ":webrtc_vp9_helpers", > "..:module_api", > "../..:webrtc_common", > "../../api:create_simulcast_test_fixture_api", >+ "../../api:mock_video_decoder", >+ "../../api:mock_video_encoder", > "../../api:simulcast_test_fixture_api", > "../../api:videocodec_test_fixture_api", >+ "../../api/test/video:function_video_factory", >+ "../../api/video:builtin_video_bitrate_allocator_factory", > "../../api/video:video_bitrate_allocator", >+ "../../api/video:video_bitrate_allocator_factory", > "../../api/video:video_frame", > "../../api/video:video_frame_i420", >+ "../../api/video_codecs:create_vp8_temporal_layers", > "../../api/video_codecs:video_codecs_api", > "../../common_video:common_video", > "../../media:rtc_media_base", >@@ -859,6 +908,7 @@ if (rtc_include_tests) { > "../../rtc_base:rtc_numerics", > "../../rtc_base:rtc_task_queue", > "../../rtc_base:rtc_task_queue_for_test", >+ "../../rtc_base/experiments:jitter_upper_bound_experiment", > "../../system_wrappers", > "../../system_wrappers:field_trial", > "../../system_wrappers:metrics", >@@ -870,6 +920,7 @@ if (rtc_include_tests) { > 
"../../test:video_test_support", > "../rtp_rtcp:rtp_rtcp_format", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/types:optional", > ] > if (rtc_build_libvpx) { > deps += [ rtc_libvpx_dir ] >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/h264/h264_decoder_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/h264/h264_decoder_impl.cc >index 73d45837af77d699f6a796c1dcf51df3216eadeb..e213223ca4f811d71e99b5fc8bf1398c89cc7726 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/h264/h264_decoder_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/h264/h264_decoder_impl.cc >@@ -323,13 +323,11 @@ int32_t H264DecoderImpl::Decode(const EncodedImage& input_image, > // without copying the underlying buffer. > if (av_frame_->width != i420_buffer->width() || > av_frame_->height != i420_buffer->height()) { >- rtc::scoped_refptr<VideoFrameBuffer> cropped_buf( >- new rtc::RefCountedObject<WrappedI420Buffer>( >- av_frame_->width, av_frame_->height, >- i420_buffer->DataY(), i420_buffer->StrideY(), >- i420_buffer->DataU(), i420_buffer->StrideU(), >- i420_buffer->DataV(), i420_buffer->StrideV(), >- rtc::KeepRefUntilDone(i420_buffer))); >+ rtc::scoped_refptr<VideoFrameBuffer> cropped_buf = WrapI420Buffer( >+ av_frame_->width, av_frame_->height, i420_buffer->DataY(), >+ i420_buffer->StrideY(), i420_buffer->DataU(), i420_buffer->StrideU(), >+ i420_buffer->DataV(), i420_buffer->StrideV(), >+ rtc::KeepRefUntilDone(i420_buffer)); > VideoFrame cropped_frame = > VideoFrame::Builder() > .set_video_frame_buffer(cropped_buf) >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/h264/h264_encoder_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/h264/h264_encoder_impl.cc >index 4effcdbc524fea997727f2be4ef686efd9421e41..f191f2c64032aa97453c0c8199df4245fb6db8c4 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/h264/h264_encoder_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/h264/h264_encoder_impl.cc >@@ -19,6 +19,7 @@ > #include "third_party/openh264/src/codec/api/svc/codec_def.h" > #include "third_party/openh264/src/codec/api/svc/codec_ver.h" > >+#include "absl/strings/match.h" > #include "common_video/libyuv/include/webrtc_libyuv.h" > #include "modules/video_coding/utility/simulcast_rate_allocator.h" > #include "modules/video_coding/utility/simulcast_utility.h" >@@ -167,7 +168,7 @@ H264EncoderImpl::H264EncoderImpl(const cricket::VideoCodec& codec) > encoded_image_callback_(nullptr), > has_reported_init_(false), > has_reported_error_(false) { >- RTC_CHECK(cricket::CodecNamesEq(codec.name, cricket::kH264CodecName)); >+ RTC_CHECK(absl::EqualsIgnoreCase(codec.name, cricket::kH264CodecName)); > std::string packetization_mode_string; > if (codec.GetParam(cricket::kH264FmtpPacketizationMode, > &packetization_mode_string) && >@@ -310,7 +311,7 @@ int32_t H264EncoderImpl::InitEncode(const VideoCodec* inst, > } > > SimulcastRateAllocator init_allocator(codec_); >- BitrateAllocation allocation = init_allocator.GetAllocation( >+ VideoBitrateAllocation allocation = init_allocator.GetAllocation( > codec_.startBitrate * 1000, codec_.maxFramerate); > return SetRateAllocation(allocation, codec_.maxFramerate); > } >@@ -339,7 +340,7 @@ int32_t H264EncoderImpl::RegisterEncodeCompleteCallback( > } > > int32_t H264EncoderImpl::SetRateAllocation( >- const BitrateAllocation& bitrate, >+ const VideoBitrateAllocation& bitrate, > uint32_t new_framerate) { > if (encoders_.empty()) > return WEBRTC_VIDEO_CODEC_UNINITIALIZED; >@@ -534,10 +535,6 @@ int32_t H264EncoderImpl::Encode(const VideoFrame& input_frame, > return WEBRTC_VIDEO_CODEC_OK; > } > >-const char* H264EncoderImpl::ImplementationName() const { >- return "OpenH264"; >-} >- > // Initialization parameters. 
> // There are two ways to initialize. There is SEncParamBase (cleared with > // memset(&p, 0, sizeof(SEncParamBase)) used in Initialize, and SEncParamExt >@@ -624,14 +621,13 @@ void H264EncoderImpl::ReportError() { > has_reported_error_ = true; > } > >-int32_t H264EncoderImpl::SetChannelParameters(uint32_t packet_loss, >- int64_t rtt) { >- return WEBRTC_VIDEO_CODEC_OK; >-} >- >-VideoEncoder::ScalingSettings H264EncoderImpl::GetScalingSettings() const { >- return VideoEncoder::ScalingSettings(kLowH264QpThreshold, >- kHighH264QpThreshold); >+VideoEncoder::EncoderInfo H264EncoderImpl::GetEncoderInfo() const { >+ EncoderInfo info; >+ info.supports_native_handle = false; >+ info.implementation_name = "OpenH264"; >+ info.scaling_settings = >+ VideoEncoder::ScalingSettings(kLowH264QpThreshold, kHighH264QpThreshold); >+ return info; > } > > void H264EncoderImpl::LayerConfig::SetStreamState(bool send_stream) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/h264/h264_encoder_impl.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/h264/h264_encoder_impl.h >index 0d259660f699b6b55ab3b5631bc76274d6d3a138..da32563c9b2de2f7ef2f09484d0f88d1141fcf28 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/h264/h264_encoder_impl.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/h264/h264_encoder_impl.h >@@ -70,12 +70,7 @@ class H264EncoderImpl : public H264Encoder { > const CodecSpecificInfo* codec_specific_info, > const std::vector<FrameType>* frame_types) override; > >- const char* ImplementationName() const override; >- >- VideoEncoder::ScalingSettings GetScalingSettings() const override; >- >- // Unsupported / Do nothing. >- int32_t SetChannelParameters(uint32_t packet_loss, int64_t rtt) override; >+ EncoderInfo GetEncoderInfo() const override; > > // Exposed for testing. 
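The `H264EncoderImpl` hunks above replace three separate virtuals (`ImplementationName()`, `GetScalingSettings()`, and the no-op `SetChannelParameters()`) with a single `GetEncoderInfo()` returning an aggregate. A simplified stand-in for that struct-consolidation pattern (field names follow the patch, but this is not the real `webrtc::VideoEncoder` API), with the QP thresholds passed in as parameters rather than hard-coded:

```cpp
#include <string>

// Simplified stand-ins for VideoEncoder::ScalingSettings / EncoderInfo:
// one struct carries what used to be several separate getter results.
struct ScalingSettings {
  int low_qp = 0;
  int high_qp = 0;
};

struct EncoderInfo {
  bool supports_native_handle = false;
  std::string implementation_name;
  ScalingSettings scaling_settings;
};

// Mirrors the shape of H264EncoderImpl::GetEncoderInfo() in the patch.
EncoderInfo MakeOpenH264Info(int low_qp, int high_qp) {
  EncoderInfo info;
  info.supports_native_handle = false;
  info.implementation_name = "OpenH264";
  info.scaling_settings = {low_qp, high_qp};
  return info;
}
```

Consolidating into one struct lets new encoder capabilities be added without touching every `VideoEncoder` subclass, which is presumably why the upstream API moved this way.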
> H264PacketizationMode PacketizationModeForTesting() const { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/h264/h264_simulcast_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/h264/h264_simulcast_unittest.cc >index 6c6fae8c1a0d143aaa12a2731e41a7ec881249f0..3b720b38430b470973ee4454fbdc29cf7b7e7e61 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/h264/h264_simulcast_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/h264/h264_simulcast_unittest.cc >@@ -13,9 +13,9 @@ > #include "absl/memory/memory.h" > #include "api/test/create_simulcast_test_fixture.h" > #include "api/test/simulcast_test_fixture.h" >+#include "api/test/video/function_video_decoder_factory.h" >+#include "api/test/video/function_video_encoder_factory.h" > #include "modules/video_coding/codecs/h264/include/h264.h" >-#include "test/function_video_decoder_factory.h" >-#include "test/function_video_encoder_factory.h" > #include "test/gtest.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/h264/include/h264.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/h264/include/h264.h >index e23818b756b86f3033eb6b953a3688ba3c806a32..f5cebcfe62122c5ef727f3770503b63f7fd4f1ff 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/h264/include/h264.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/h264/include/h264.h >@@ -17,6 +17,7 @@ > > #include "media/base/codec.h" > #include "modules/video_coding/include/video_codec_interface.h" >+#include "rtc_base/system/rtc_export.h" > > namespace webrtc { > >@@ -26,13 +27,13 @@ struct SdpVideoFormat; > // |rtc_use_h264| build flag is true (if false, this function does nothing). > // This function should only be called before or during WebRTC initialization > // and is not thread-safe. 
>-void DisableRtcUseH264(); >+RTC_EXPORT void DisableRtcUseH264(); > > // Returns a vector with all supported internal H264 profiles that we can > // negotiate in SDP, in order of preference. > std::vector<SdpVideoFormat> SupportedH264Codecs(); > >-class H264Encoder : public VideoEncoder { >+class RTC_EXPORT H264Encoder : public VideoEncoder { > public: > static std::unique_ptr<H264Encoder> Create(const cricket::VideoCodec& codec); > // If H.264 is supported (any implementation). >@@ -41,7 +42,7 @@ class H264Encoder : public VideoEncoder { > ~H264Encoder() override {} > }; > >-class H264Decoder : public VideoDecoder { >+class RTC_EXPORT H264Decoder : public VideoDecoder { > public: > static std::unique_ptr<H264Decoder> Create(); > static bool IsSupported(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/h264/test/h264_impl_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/h264/test/h264_impl_unittest.cc >index 14bb6bcc47e63f152f8d1a3af9035ae5e38f75a4..82c14b7fd529c06e46aca010aaf9f6f2c954aa6b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/h264/test/h264_impl_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/h264/test/h264_impl_unittest.cc >@@ -56,7 +56,7 @@ TEST_F(TestH264Impl, MAYBE_EncodeDecode) { > ASSERT_TRUE(decoded_frame); > EXPECT_GT(I420PSNR(input_frame, decoded_frame.get()), 36); > >- const ColorSpace color_space = decoded_frame->color_space().value(); >+ const ColorSpace color_space = *decoded_frame->color_space(); > EXPECT_EQ(ColorSpace::PrimaryID::kInvalid, color_space.primaries()); > EXPECT_EQ(ColorSpace::TransferID::kInvalid, color_space.transfer()); > EXPECT_EQ(ColorSpace::MatrixID::kInvalid, color_space.matrix()); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/i420/include/i420.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/i420/include/i420.h 
>index 580bc16c2f8a9c0345e77f9530c964ff05e4125a..967286f42cb43e0fc36b78aab2a6edc0682ee1be 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/i420/include/i420.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/i420/include/i420.h >@@ -65,10 +65,6 @@ class I420Encoder : public VideoEncoder { > // Return value : WEBRTC_VIDEO_CODEC_OK if OK, < 0 otherwise. > int Release() override; > >- int SetChannelParameters(uint32_t /*packetLoss*/, int64_t /*rtt*/) override { >- return WEBRTC_VIDEO_CODEC_OK; >- } >- > private: > static uint8_t* InsertHeader(uint8_t* buffer, > uint16_t width, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/multiplex/include/multiplex_encoder_adapter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/multiplex/include/multiplex_encoder_adapter.h >index 531d07870e76de1ea47de70a9efbc9dee9c14663..235a36030b078cfaea2bf3108d4295b5eb360096 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/multiplex/include/multiplex_encoder_adapter.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/multiplex/include/multiplex_encoder_adapter.h >@@ -46,11 +46,10 @@ class MultiplexEncoderAdapter : public VideoEncoder { > const CodecSpecificInfo* codec_specific_info, > const std::vector<FrameType>* frame_types) override; > int RegisterEncodeCompleteCallback(EncodedImageCallback* callback) override; >- int SetChannelParameters(uint32_t packet_loss, int64_t rtt) override; > int SetRateAllocation(const VideoBitrateAllocation& bitrate, > uint32_t new_framerate) override; > int Release() override; >- const char* ImplementationName() const override; >+ EncoderInfo GetEncoderInfo() const override; > > EncodedImageCallback::Result OnEncodedImage( > AlphaCodecStream stream_idx, >@@ -81,6 +80,8 @@ class MultiplexEncoderAdapter : public VideoEncoder { > > const bool supports_augmented_data_; > int 
augmenting_data_size_ = 0; >+ >+ EncoderInfo encoder_info_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/multiplex/multiplex_decoder_adapter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/multiplex/multiplex_decoder_adapter.cc >index cc588d0347a58552702135b368526b4a1df73a96..a27bc8d2588345f5048a2c4d79354b41f178fb98 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/multiplex/multiplex_decoder_adapter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/multiplex/multiplex_decoder_adapter.cc >@@ -10,9 +10,9 @@ > > #include "modules/video_coding/codecs/multiplex/include/multiplex_decoder_adapter.h" > >+#include "api/video/encoded_image.h" > #include "api/video/i420_buffer.h" > #include "api/video/video_frame_buffer.h" >-#include "common_video/include/video_frame.h" > #include "common_video/include/video_frame_buffer.h" > #include "common_video/libyuv/include/webrtc_libyuv.h" > #include "modules/video_coding/codecs/multiplex/include/augmented_video_frame_buffer.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/multiplex/multiplex_encoded_image_packer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/multiplex/multiplex_encoded_image_packer.h >index ea37f0c4d9ff465160ce8fa5357aedf75593d654..0d2f9fa26ea35d5a8c4b154354796ee057c302d2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/multiplex/multiplex_encoded_image_packer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/multiplex/multiplex_encoded_image_packer.h >@@ -14,8 +14,8 @@ > #include <memory> > #include <vector> > >+#include "api/video/encoded_image.h" > #include "common_types.h" // NOLINT(build/include) >-#include "common_video/include/video_frame.h" > > namespace webrtc { > >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/multiplex/multiplex_encoder_adapter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/multiplex/multiplex_encoder_adapter.cc >index c3a75062b54ebe58914939ad2714c8b2d6d12db1..4dabca77fd0059a51b383231601d03593500e1ba 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/multiplex/multiplex_encoder_adapter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/multiplex/multiplex_encoder_adapter.cc >@@ -12,7 +12,7 @@ > > #include <cstring> > >-#include "common_video/include/video_frame.h" >+#include "api/video/encoded_image.h" > #include "common_video/include/video_frame_buffer.h" > #include "common_video/libyuv/include/webrtc_libyuv.h" > #include "modules/include/module_common_types.h" >@@ -53,6 +53,7 @@ MultiplexEncoderAdapter::MultiplexEncoderAdapter( > : factory_(factory), > associated_format_(associated_format), > encoded_complete_callback_(nullptr), >+ key_frame_interval_(0), > supports_augmented_data_(supports_augmented_data) {} > > MultiplexEncoderAdapter::~MultiplexEncoderAdapter() { >@@ -92,6 +93,11 @@ int MultiplexEncoderAdapter::InitEncode(const VideoCodec* inst, > break; > } > >+ encoder_info_ = EncoderInfo(); >+ encoder_info_.implementation_name = "MultiplexEncoderAdapter ("; >+ // This needs to be false so that we can do the split in Encode(). 
>+ encoder_info_.supports_native_handle = false; >+ > for (size_t i = 0; i < kAlphaCodecStreams; ++i) { > std::unique_ptr<VideoEncoder> encoder = > factory_->CreateVideoEncoder(associated_format_); >@@ -104,8 +110,17 @@ int MultiplexEncoderAdapter::InitEncode(const VideoCodec* inst, > adapter_callbacks_.emplace_back(new AdapterEncodedImageCallback( > this, static_cast<AlphaCodecStream>(i))); > encoder->RegisterEncodeCompleteCallback(adapter_callbacks_.back().get()); >+ >+ const EncoderInfo& encoder_impl_info = encoder->GetEncoderInfo(); >+ encoder_info_.implementation_name += encoder_impl_info.implementation_name; >+ if (i != kAlphaCodecStreams - 1) { >+ encoder_info_.implementation_name += ", "; >+ } >+ > encoders_.emplace_back(std::move(encoder)); > } >+ encoder_info_.implementation_name += ")"; >+ > return WEBRTC_VIDEO_CODEC_OK; > } > >@@ -187,16 +202,6 @@ int MultiplexEncoderAdapter::RegisterEncodeCompleteCallback( > return WEBRTC_VIDEO_CODEC_OK; > } > >-int MultiplexEncoderAdapter::SetChannelParameters(uint32_t packet_loss, >- int64_t rtt) { >- for (auto& encoder : encoders_) { >- const int rv = encoder->SetChannelParameters(packet_loss, rtt); >- if (rv) >- return rv; >- } >- return WEBRTC_VIDEO_CODEC_OK; >-} >- > int MultiplexEncoderAdapter::SetRateAllocation( > const VideoBitrateAllocation& bitrate, > uint32_t framerate) { >@@ -238,8 +243,8 @@ int MultiplexEncoderAdapter::Release() { > return WEBRTC_VIDEO_CODEC_OK; > } > >-const char* MultiplexEncoderAdapter::ImplementationName() const { >- return "MultiplexEncoderAdapter"; >+VideoEncoder::EncoderInfo MultiplexEncoderAdapter::GetEncoderInfo() const { >+ return encoder_info_; > } > > EncodedImageCallback::Result MultiplexEncoderAdapter::OnEncodedImage( >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/video_codec_unittest.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/video_codec_unittest.h >index 
48860c35a6f34971592e3d3ec9b4c707eddb7e51..9a4d60cec2678726613a1a72c186a705e715c54d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/video_codec_unittest.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/video_codec_unittest.h >@@ -32,11 +32,7 @@ class VideoCodecUnitTest : public ::testing::Test { > VideoCodecUnitTest() > : encode_complete_callback_(this), > decode_complete_callback_(this), >- encoded_frame_event_(false /* manual reset */, >- false /* initially signaled */), > wait_for_encoded_frames_threshold_(1), >- decoded_frame_event_(false /* manual reset */, >- false /* initially signaled */), > last_input_frame_timestamp_(0) {} > > protected: >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videocodec_test_fixture_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videocodec_test_fixture_impl.cc >index 4dc268a0431f095efd9443c73accd88969986e52..847e84785bcb4326d47336131fe6b4fcc076749a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videocodec_test_fixture_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videocodec_test_fixture_impl.cc >@@ -130,9 +130,7 @@ bool RunEncodeInRealTime(const VideoCodecTestFixtureImpl::Config& config) { > > std::string FilenameWithParams( > const VideoCodecTestFixtureImpl::Config& config) { >- std::string implementation_type = config.hw_encoder ? "hw" : "sw"; > return config.filename + "_" + config.CodecName() + "_" + >- implementation_type + "_" + > std::to_string(config.codec_settings.startBitrate); > } > >@@ -446,7 +444,7 @@ void VideoCodecTestFixtureImpl::ProcessAllFrames( > } > > // Wait until we know that the last frame has been sent for encode. 
>- rtc::Event sync_event(false, false); >+ rtc::Event sync_event; > task_queue->PostTask([&sync_event] { sync_event.Set(); }); > sync_event.Wait(rtc::Event::kForever); > >@@ -673,7 +671,7 @@ void VideoCodecTestFixtureImpl::PrintSettings( > std::string encoder_name; > std::string decoder_name; > task_queue->SendTask([this, &encoder_name, &decoder_name] { >- encoder_name = encoder_->ImplementationName(); >+ encoder_name = encoder_->GetEncoderInfo().implementation_name; > decoder_name = decoders_.at(0)->ImplementationName(); > }); > printf("enc_impl_name: %s\n", encoder_name.c_str()); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videocodec_test_libvpx.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videocodec_test_libvpx.cc >index 9586ce9c792382133421aed816cbaf5edee8d048..f69fde6884010fc35fc396702e243bb276ab4302 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videocodec_test_libvpx.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videocodec_test_libvpx.cc >@@ -12,6 +12,7 @@ > > #include "absl/memory/memory.h" > #include "api/test/create_videocodec_test_fixture.h" >+#include "api/test/video/function_video_encoder_factory.h" > #include "api/video_codecs/sdp_video_format.h" > #include "media/base/mediaconstants.h" > #include "media/engine/internaldecoderfactory.h" >@@ -19,7 +20,6 @@ > #include "media/engine/simulcast_encoder_adapter.h" > #include "modules/video_coding/utility/vp8_header_parser.h" > #include "modules/video_coding/utility/vp9_uncompressed_header_parser.h" >-#include "test/function_video_encoder_factory.h" > #include "test/gtest.h" > #include "test/testsupport/fileutils.h" > >@@ -63,8 +63,6 @@ VideoCodecTestFixture::Config CreateConfig() { > config.filepath = ResourcePath(config.filename, "yuv"); > config.num_frames = kNumFramesLong; > config.use_single_core = true; >- config.hw_encoder = false; >- 
config.hw_decoder = false; > return config; > } > >@@ -92,7 +90,7 @@ void PrintRdPerf(std::map<size_t, std::vector<VideoStatistics>> rd_stats) { > } > } // namespace > >-#if !defined(RTC_DISABLE_VP9) >+#if defined(RTC_ENABLE_VP9) > TEST(VideoCodecTestLibvpx, HighBitrateVP9) { > auto config = CreateConfig(); > config.SetCodecSettings(cricket::kVp9CodecName, 1, 1, 1, false, true, false, >@@ -105,7 +103,7 @@ TEST(VideoCodecTestLibvpx, HighBitrateVP9) { > std::vector<RateProfile> rate_profiles = {{500, 30, kNumFramesShort}}; > > std::vector<RateControlThresholds> rc_thresholds = { >- {5, 1, 0, 0.11, 0.3, 0.1, 0, 1}}; >+ {5, 1, 0, 1, 0.3, 0.1, 0, 1}}; > > std::vector<QualityThresholds> quality_thresholds = {{37, 36, 0.94, 0.92}}; > >@@ -126,9 +124,9 @@ TEST(VideoCodecTestLibvpx, ChangeBitrateVP9) { > {500, 30, kNumFramesLong}}; > > std::vector<RateControlThresholds> rc_thresholds = { >- {5, 1, 0, 0.15, 0.5, 0.1, 0, 1}, >- {15, 2, 0, 0.2, 0.5, 0.1, 0, 0}, >- {10, 1, 0, 0.3, 0.5, 0.1, 0, 0}}; >+ {5, 1, 0, 1, 0.5, 0.1, 0, 1}, >+ {15, 2, 0, 1, 0.5, 0.1, 0, 0}, >+ {10, 1, 0, 1, 0.5, 0.1, 0, 0}}; > > std::vector<QualityThresholds> quality_thresholds = { > {34, 33, 0.90, 0.88}, {38, 35, 0.95, 0.91}, {35, 34, 0.93, 0.90}}; >@@ -151,9 +149,9 @@ TEST(VideoCodecTestLibvpx, ChangeFramerateVP9) { > > // Framerate mismatch should be lower for lower framerate. > std::vector<RateControlThresholds> rc_thresholds = { >- {10, 2, 40, 0.4, 0.5, 0.2, 0, 1}, >- {8, 2, 5, 0.2, 0.5, 0.2, 0, 0}, >- {5, 2, 0, 0.21, 0.5, 0.3, 0, 0}}; >+ {10, 2, 40, 1, 0.5, 0.2, 0, 1}, >+ {8, 2, 5, 1, 0.5, 0.2, 0, 0}, >+ {5, 2, 0, 1, 0.5, 0.3, 0, 0}}; > > // Quality should be higher for lower framerates for the same content. 
> std::vector<QualityThresholds> quality_thresholds = { >@@ -174,7 +172,7 @@ TEST(VideoCodecTestLibvpx, DenoiserOnVP9) { > std::vector<RateProfile> rate_profiles = {{500, 30, kNumFramesShort}}; > > std::vector<RateControlThresholds> rc_thresholds = { >- {5, 1, 0, 0.11, 0.3, 0.1, 0, 1}}; >+ {5, 1, 0, 1, 0.3, 0.1, 0, 1}}; > > std::vector<QualityThresholds> quality_thresholds = {{37.5, 36, 0.94, 0.93}}; > >@@ -192,7 +190,7 @@ TEST(VideoCodecTestLibvpx, VeryLowBitrateVP9) { > std::vector<RateProfile> rate_profiles = {{50, 30, kNumFramesLong}}; > > std::vector<RateControlThresholds> rc_thresholds = { >- {15, 3, 75, 1.0, 0.5, 0.4, 1, 1}}; >+ {15, 3, 75, 1, 0.5, 0.4, 1, 1}}; > > std::vector<QualityThresholds> quality_thresholds = {{28, 25, 0.80, 0.65}}; > >@@ -202,7 +200,7 @@ TEST(VideoCodecTestLibvpx, VeryLowBitrateVP9) { > // TODO(marpan): Add temporal layer test for VP9, once changes are in > // vp9 wrapper for this. > >-#endif // !defined(RTC_DISABLE_VP9) >+#endif // defined(RTC_ENABLE_VP9) > > TEST(VideoCodecTestLibvpx, HighBitrateVP8) { > auto config = CreateConfig(); >@@ -216,7 +214,7 @@ TEST(VideoCodecTestLibvpx, HighBitrateVP8) { > std::vector<RateProfile> rate_profiles = {{500, 30, kNumFramesShort}}; > > std::vector<RateControlThresholds> rc_thresholds = { >- {5, 1, 0, 0.1, 0.2, 0.1, 0, 1}}; >+ {5, 1, 0, 1, 0.2, 0.1, 0, 1}}; > > #if defined(WEBRTC_ARCH_ARM) || defined(WEBRTC_ARCH_ARM64) > std::vector<QualityThresholds> quality_thresholds = {{35, 33, 0.91, 0.89}}; >@@ -255,9 +253,9 @@ TEST(VideoCodecTestLibvpx, MAYBE_ChangeBitrateVP8) { > {500, 30, kNumFramesLong}}; > > std::vector<RateControlThresholds> rc_thresholds = { >- {5, 1, 0, 0.1, 0.2, 0.1, 0, 1}, >- {15.5, 1, 0, 0.1, 0.2, 0.1, 0, 0}, >- {15, 1, 0, 0.3, 0.2, 0.1, 0, 0}}; >+ {5, 1, 0, 1, 0.2, 0.1, 0, 1}, >+ {15.5, 1, 0, 1, 0.2, 0.1, 0, 0}, >+ {15, 1, 0, 1, 0.2, 0.1, 0, 0}}; > > #if defined(WEBRTC_ARCH_ARM) || defined(WEBRTC_ARCH_ARM64) > std::vector<QualityThresholds> quality_thresholds = { >@@ -290,14 
+288,14 @@ TEST(VideoCodecTestLibvpx, MAYBE_ChangeFramerateVP8) { > > #if defined(WEBRTC_ARCH_ARM) || defined(WEBRTC_ARCH_ARM64) > std::vector<RateControlThresholds> rc_thresholds = { >- {10, 2, 60, 0.5, 0.3, 0.3, 0, 1}, >- {10, 2, 30, 0.3, 0.3, 0.3, 0, 0}, >- {10, 2, 10, 0.2, 0.3, 0.2, 0, 0}}; >+ {10, 2, 60, 1, 0.3, 0.3, 0, 1}, >+ {10, 2, 30, 1, 0.3, 0.3, 0, 0}, >+ {10, 2, 10, 1, 0.3, 0.2, 0, 0}}; > #else > std::vector<RateControlThresholds> rc_thresholds = { >- {10, 2, 20, 0.4, 0.3, 0.1, 0, 1}, >- {5, 2, 5, 0.3, 0.3, 0.1, 0, 0}, >- {4, 2, 1, 0.2, 0.3, 0.2, 0, 0}}; >+ {10, 2, 20, 1, 0.3, 0.1, 0, 1}, >+ {5, 2, 5, 1, 0.3, 0.1, 0, 0}, >+ {4, 2, 1, 1, 0.3, 0.2, 0, 0}}; > #endif > > #if defined(WEBRTC_ARCH_ARM) || defined(WEBRTC_ARCH_ARM64) >@@ -328,10 +326,10 @@ TEST(VideoCodecTestLibvpx, MAYBE_TemporalLayersVP8) { > > #if defined(WEBRTC_ARCH_ARM) || defined(WEBRTC_ARCH_ARM64) > std::vector<RateControlThresholds> rc_thresholds = { >- {10, 1, 2, 0.3, 0.2, 0.1, 0, 1}, {12, 2, 3, 0.1, 0.2, 0.1, 0, 1}}; >+ {10, 1, 2, 1, 0.2, 0.1, 0, 1}, {12, 2, 3, 1, 0.2, 0.1, 0, 1}}; > #else > std::vector<RateControlThresholds> rc_thresholds = { >- {5, 1, 0, 0.1, 0.2, 0.1, 0, 1}, {10, 2, 0, 0.1, 0.2, 0.1, 0, 1}}; >+ {5, 1, 0, 1, 0.2, 0.1, 0, 1}, {10, 2, 0, 1, 0.2, 0.1, 0, 1}}; > #endif > // Min SSIM drops because of high motion scene with complex backgound (trees). 
> #if defined(WEBRTC_ARCH_ARM) || defined(WEBRTC_ARCH_ARM64) >@@ -364,7 +362,7 @@ TEST(VideoCodecTestLibvpx, MAYBE_MultiresVP8) { > std::vector<RateProfile> rate_profiles = {{1500, 30, config.num_frames}}; > > std::vector<RateControlThresholds> rc_thresholds = { >- {5, 1, 5, 0.2, 0.3, 0.1, 0, 1}}; >+ {5, 1, 5, 1, 0.3, 0.1, 0, 1}}; > std::vector<QualityThresholds> quality_thresholds = {{34, 32, 0.90, 0.88}}; > > fixture->RunTest(rate_profiles, &rc_thresholds, &quality_thresholds, nullptr); >@@ -401,7 +399,7 @@ TEST(VideoCodecTestLibvpx, MAYBE_SimulcastVP8) { > std::vector<RateProfile> rate_profiles = {{1500, 30, config.num_frames}}; > > std::vector<RateControlThresholds> rc_thresholds = { >- {20, 5, 90, 0.8, 0.5, 0.3, 0, 1}}; >+ {20, 5, 90, 1, 0.5, 0.3, 0, 1}}; > std::vector<QualityThresholds> quality_thresholds = {{34, 32, 0.90, 0.88}}; > > fixture->RunTest(rate_profiles, &rc_thresholds, &quality_thresholds, nullptr); >@@ -426,7 +424,7 @@ TEST(VideoCodecTestLibvpx, MAYBE_SvcVP9) { > std::vector<RateProfile> rate_profiles = {{1500, 30, config.num_frames}}; > > std::vector<RateControlThresholds> rc_thresholds = { >- {5, 1, 5, 0.2, 0.3, 0.1, 0, 1}}; >+ {5, 1, 5, 1, 0.3, 0.1, 0, 1}}; > std::vector<QualityThresholds> quality_thresholds = {{36, 34, 0.93, 0.90}}; > > fixture->RunTest(rate_profiles, &rc_thresholds, &quality_thresholds, nullptr); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videocodec_test_mediacodec.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videocodec_test_mediacodec.cc >index 99b1aed8c0a580536798025a8c658f57b36b7245..edfc211c226d549cc0c5934f723f35ce07926e49 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videocodec_test_mediacodec.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videocodec_test_mediacodec.cc >@@ -33,8 +33,6 @@ VideoCodecTestFixture::Config CreateConfig() { > config.filename = 
"foreman_cif"; > config.filepath = ResourcePath(config.filename, "yuv"); > config.num_frames = kForemanNumFrames; >- config.hw_encoder = true; >- config.hw_decoder = true; > // In order to not overwhelm the OpenMAX buffers in the Android MediaCodec. > config.encode_in_real_time = true; > return config; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videocodec_test_openh264.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videocodec_test_openh264.cc >index e5cf71ba90f9042ee693c116d93baba13edde0f3..bd4bb52b85191dc560af3a8d09c4e39f82c3734e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videocodec_test_openh264.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videocodec_test_openh264.cc >@@ -32,8 +32,6 @@ VideoCodecTestFixture::Config CreateConfig() { > config.num_frames = kNumFrames; > // Only allow encoder/decoder to use single core, for predictability. > config.use_single_core = true; >- config.hw_encoder = false; >- config.hw_decoder = false; > return config; > } > } // namespace >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videocodec_test_parameterized.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videocodec_test_parameterized.cc >index 0d49b80d95af30c5b2914013c4a4d56c549fd913..efc80add2da5c1d23923917169f301242662e2fe 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videocodec_test_parameterized.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videocodec_test_parameterized.cc >@@ -20,7 +20,6 @@ namespace { > // Loop variables. > const size_t kBitrates[] = {500}; > const VideoCodecType kVideoCodecType[] = {kVideoCodecVP8}; >-const bool kHwCodec[] = {false}; > > // Codec settings. 
> const int kNumSpatialLayers = 1; >@@ -39,12 +38,11 @@ const int kNumFrames = 30; > class VideoCodecTestParameterized > : public ::testing::Test, > public ::testing::WithParamInterface< >- ::testing::tuple<size_t, VideoCodecType, bool>> { >+ ::testing::tuple<size_t, VideoCodecType>> { > protected: > VideoCodecTestParameterized() > : bitrate_(::testing::get<0>(GetParam())), >- codec_type_(::testing::get<1>(GetParam())), >- hw_codec_(::testing::get<2>(GetParam())) {} >+ codec_type_(::testing::get<1>(GetParam())) {} > ~VideoCodecTestParameterized() override = default; > > void RunTest(size_t width, >@@ -56,8 +54,6 @@ class VideoCodecTestParameterized > config.filepath = ResourcePath(filename, "yuv"); > config.use_single_core = kUseSingleCore; > config.measure_cpu = kMeasureCpu; >- config.hw_encoder = hw_codec_; >- config.hw_decoder = hw_codec_; > config.num_frames = kNumFrames; > > const size_t num_simulcast_streams = >@@ -80,14 +76,13 @@ class VideoCodecTestParameterized > std::unique_ptr<VideoCodecTestFixture> fixture_; > const size_t bitrate_; > const VideoCodecType codec_type_; >- const bool hw_codec_; > }; > >-INSTANTIATE_TEST_CASE_P(CodecSettings, >- VideoCodecTestParameterized, >- ::testing::Combine(::testing::ValuesIn(kBitrates), >- ::testing::ValuesIn(kVideoCodecType), >- ::testing::ValuesIn(kHwCodec))); >+INSTANTIATE_TEST_CASE_P( >+ CodecSettings, >+ VideoCodecTestParameterized, >+ ::testing::Combine(::testing::ValuesIn(kBitrates), >+ ::testing::ValuesIn(kVideoCodecType))); > > TEST_P(VideoCodecTestParameterized, Foreman_352x288_30) { > RunTest(352, 288, 30, "foreman_cif"); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videocodec_test_videotoolbox.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videocodec_test_videotoolbox.cc >index f31ff548a81255bec445973ec750c3670328f057..6dfb993ce881c7eeed46c8164ef1db8865b4d6aa 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videocodec_test_videotoolbox.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videocodec_test_videotoolbox.cc >@@ -28,8 +28,6 @@ VideoCodecTestFixture::Config CreateConfig() { > config.filename = "foreman_cif"; > config.filepath = ResourcePath(config.filename, "yuv"); > config.num_frames = kForemanNumFrames; >- config.hw_encoder = true; >- config.hw_decoder = true; > return config; > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videoprocessor.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videoprocessor.cc >index 5e980cdcf1acc8155f841e4f7e047793ba206d74..69582660f91188bab809dae344149f1a6fc9f5c4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videoprocessor.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videoprocessor.cc >@@ -14,6 +14,7 @@ > #include <limits> > #include <utility> > >+#include "api/video/builtin_video_bitrate_allocator_factory.h" > #include "api/video/i420_buffer.h" > #include "common_types.h" // NOLINT(build/include) > #include "common_video/h264/h264_common.h" >@@ -173,8 +174,9 @@ VideoProcessor::VideoProcessor(webrtc::VideoEncoder* encoder, > stats_(stats), > encoder_(encoder), > decoders_(decoders), >- bitrate_allocator_(VideoCodecInitializer::CreateBitrateAllocator( >- config_.codec_settings)), >+ bitrate_allocator_( >+ CreateBuiltinVideoBitrateAllocatorFactory() >+ ->CreateVideoBitrateAllocator(config_.codec_settings)), > framerate_fps_(0), > encode_callback_(this), > input_frame_reader_(input_frame_reader), >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videoprocessor_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videoprocessor_unittest.cc >index 
589a3d898a8ee488a4016eb623c9ead43854a578..3d41ad998febfc62063787ca0bd35fde18f7f36e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videoprocessor_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/test/videoprocessor_unittest.cc >@@ -11,13 +11,14 @@ > #include <memory> > > #include "absl/memory/memory.h" >+#include "api/test/mock_video_decoder.h" >+#include "api/test/mock_video_encoder.h" > #include "api/test/videocodec_test_fixture.h" > #include "api/video/i420_buffer.h" > #include "common_types.h" // NOLINT(build/include) > #include "media/base/mediaconstants.h" > #include "modules/video_coding/codecs/test/videocodec_test_stats_impl.h" > #include "modules/video_coding/codecs/test/videoprocessor.h" >-#include "modules/video_coding/include/mock/mock_video_codec_interface.h" > #include "modules/video_coding/include/video_coding.h" > #include "rtc_base/event.h" > #include "rtc_base/task_queue_for_test.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/default_temporal_layers.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/default_temporal_layers.cc >index d53f46994d3608ef39e3e06e61edafaac3a9b875..c5e163fcf720b8b28ec98275321774735a907734 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/default_temporal_layers.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/default_temporal_layers.cc >@@ -7,8 +7,6 @@ > * be found in the AUTHORS file in the root of the source tree. 
> */ > >-#include "modules/video_coding/codecs/vp8/default_temporal_layers.h" >- > #include <stdlib.h> > #include <string.h> > >@@ -20,6 +18,7 @@ > #include <vector> > > #include "modules/include/module_common_types.h" >+#include "modules/video_coding/codecs/vp8/default_temporal_layers.h" > #include "modules/video_coding/include/video_codec_interface.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" >@@ -27,27 +26,30 @@ > > namespace webrtc { > >-TemporalLayers::FrameConfig::FrameConfig() >+Vp8TemporalLayers::FrameConfig::FrameConfig() > : FrameConfig(kNone, kNone, kNone, false) {} > >-TemporalLayers::FrameConfig::FrameConfig(TemporalLayers::BufferFlags last, >- TemporalLayers::BufferFlags golden, >- TemporalLayers::BufferFlags arf) >+Vp8TemporalLayers::FrameConfig::FrameConfig( >+ Vp8TemporalLayers::BufferFlags last, >+ Vp8TemporalLayers::BufferFlags golden, >+ Vp8TemporalLayers::BufferFlags arf) > : FrameConfig(last, golden, arf, false) {} > >-TemporalLayers::FrameConfig::FrameConfig(TemporalLayers::BufferFlags last, >- TemporalLayers::BufferFlags golden, >- TemporalLayers::BufferFlags arf, >- FreezeEntropy) >+Vp8TemporalLayers::FrameConfig::FrameConfig( >+ Vp8TemporalLayers::BufferFlags last, >+ Vp8TemporalLayers::BufferFlags golden, >+ Vp8TemporalLayers::BufferFlags arf, >+ FreezeEntropy) > : FrameConfig(last, golden, arf, true) {} > >-TemporalLayers::FrameConfig::FrameConfig(TemporalLayers::BufferFlags last, >- TemporalLayers::BufferFlags golden, >- TemporalLayers::BufferFlags arf, >- bool freeze_entropy) >- : drop_frame(last == TemporalLayers::kNone && >- golden == TemporalLayers::kNone && >- arf == TemporalLayers::kNone), >+Vp8TemporalLayers::FrameConfig::FrameConfig( >+ Vp8TemporalLayers::BufferFlags last, >+ Vp8TemporalLayers::BufferFlags golden, >+ Vp8TemporalLayers::BufferFlags arf, >+ bool freeze_entropy) >+ : drop_frame(last == Vp8TemporalLayers::kNone && >+ golden == Vp8TemporalLayers::kNone && >+ arf == Vp8TemporalLayers::kNone), > 
last_buffer_flags(last), > golden_buffer_flags(golden), > arf_buffer_flags(arf), >@@ -106,15 +108,15 @@ std::vector<unsigned int> GetTemporalIds(size_t num_layers) { > return {0}; > } > >-uint8_t GetUpdatedBuffers(const TemporalLayers::FrameConfig& config) { >+uint8_t GetUpdatedBuffers(const Vp8TemporalLayers::FrameConfig& config) { > uint8_t flags = 0; >- if (config.last_buffer_flags & TemporalLayers::BufferFlags::kUpdate) { >+ if (config.last_buffer_flags & Vp8TemporalLayers::BufferFlags::kUpdate) { > flags |= static_cast<uint8_t>(Vp8BufferReference::kLast); > } >- if (config.golden_buffer_flags & TemporalLayers::BufferFlags::kUpdate) { >+ if (config.golden_buffer_flags & Vp8TemporalLayers::BufferFlags::kUpdate) { > flags |= static_cast<uint8_t>(Vp8BufferReference::kGolden); > } >- if (config.arf_buffer_flags & TemporalLayers::BufferFlags::kUpdate) { >+ if (config.arf_buffer_flags & Vp8TemporalLayers::BufferFlags::kUpdate) { > flags |= static_cast<uint8_t>(Vp8BufferReference::kAltref); > } > return flags; >@@ -122,10 +124,10 @@ uint8_t GetUpdatedBuffers(const TemporalLayers::FrameConfig& config) { > > // Find the set of buffers that are never updated by the given pattern. > std::set<Vp8BufferReference> FindKfBuffers( >- const std::vector<TemporalLayers::FrameConfig>& frame_configs) { >+ const std::vector<Vp8TemporalLayers::FrameConfig>& frame_configs) { > std::set<Vp8BufferReference> kf_buffers(kAllBuffers.begin(), > kAllBuffers.end()); >- for (TemporalLayers::FrameConfig config : frame_configs) { >+ for (Vp8TemporalLayers::FrameConfig config : frame_configs) { > // Get bit-masked set of update buffers for this frame config. 
> uint8_t updated_buffers = GetUpdatedBuffers(config); > for (Vp8BufferReference buffer : kAllBuffers) { >@@ -138,7 +140,7 @@ std::set<Vp8BufferReference> FindKfBuffers( > } > } // namespace > >-std::vector<TemporalLayers::FrameConfig> >+std::vector<Vp8TemporalLayers::FrameConfig> > DefaultTemporalLayers::GetTemporalPattern(size_t num_layers) { > // For indexing in the patterns described below (which temporal layers they > // belong to), see the diagram above. >@@ -262,7 +264,7 @@ DefaultTemporalLayers::DefaultTemporalLayers(int number_of_temporal_layers) > kf_buffers_(FindKfBuffers(temporal_pattern_)), > pattern_idx_(kUninitializedPatternIndex), > checker_(TemporalLayersChecker::CreateTemporalLayersChecker( >- TemporalLayersType::kFixedPattern, >+ Vp8TemporalLayersType::kFixedPattern, > number_of_temporal_layers)) { > RTC_CHECK_GE(kMaxTemporalStreams, number_of_temporal_layers); > RTC_CHECK_GE(number_of_temporal_layers, 0); >@@ -278,6 +280,8 @@ DefaultTemporalLayers::DefaultTemporalLayers(int number_of_temporal_layers) > } > } > >+DefaultTemporalLayers::~DefaultTemporalLayers() = default; >+ > bool DefaultTemporalLayers::SupportsEncoderFrameDropping() const { > // This class allows the encoder drop frames as it sees fit. 
> return true; >@@ -346,13 +350,13 @@ bool DefaultTemporalLayers::IsSyncFrame(const FrameConfig& config) const { > return true; > } > >-TemporalLayers::FrameConfig DefaultTemporalLayers::UpdateLayerConfig( >+Vp8TemporalLayers::FrameConfig DefaultTemporalLayers::UpdateLayerConfig( > uint32_t timestamp) { > RTC_DCHECK_GT(num_layers_, 0); > RTC_DCHECK_LT(0, temporal_pattern_.size()); > > pattern_idx_ = (pattern_idx_ + 1) % temporal_pattern_.size(); >- TemporalLayers::FrameConfig tl_config = temporal_pattern_[pattern_idx_]; >+ Vp8TemporalLayers::FrameConfig tl_config = temporal_pattern_[pattern_idx_]; > tl_config.encoder_layer_id = tl_config.packetizer_temporal_idx = > temporal_ids_[pattern_idx_ % temporal_ids_.size()]; > >@@ -560,9 +564,11 @@ DefaultTemporalLayersChecker::DefaultTemporalLayersChecker( > } > } > >+DefaultTemporalLayersChecker::~DefaultTemporalLayersChecker() = default; >+ > bool DefaultTemporalLayersChecker::CheckTemporalConfig( > bool frame_is_keyframe, >- const TemporalLayers::FrameConfig& frame_config) { >+ const Vp8TemporalLayers::FrameConfig& frame_config) { > if (!TemporalLayersChecker::CheckTemporalConfig(frame_is_keyframe, > frame_config)) { > return false; >@@ -613,7 +619,7 @@ bool DefaultTemporalLayersChecker::CheckTemporalConfig( > std::vector<int> dependencies; > > if (frame_config.last_buffer_flags & >- TemporalLayers::BufferFlags::kReference) { >+ Vp8TemporalLayers::BufferFlags::kReference) { > uint8_t referenced_layer = temporal_ids_[last_.pattern_idx]; > if (referenced_layer > 0) { > need_sync = false; >@@ -628,7 +634,8 @@ bool DefaultTemporalLayersChecker::CheckTemporalConfig( > return false; > } > >- if (frame_config.arf_buffer_flags & TemporalLayers::BufferFlags::kReference) { >+ if (frame_config.arf_buffer_flags & >+ Vp8TemporalLayers::BufferFlags::kReference) { > uint8_t referenced_layer = temporal_ids_[arf_.pattern_idx]; > if (referenced_layer > 0) { > need_sync = false; >@@ -644,7 +651,7 @@ bool 
DefaultTemporalLayersChecker::CheckTemporalConfig( > } > > if (frame_config.golden_buffer_flags & >- TemporalLayers::BufferFlags::kReference) { >+ Vp8TemporalLayers::BufferFlags::kReference) { > uint8_t referenced_layer = temporal_ids_[golden_.pattern_idx]; > if (referenced_layer > 0) { > need_sync = false; >@@ -680,17 +687,19 @@ bool DefaultTemporalLayersChecker::CheckTemporalConfig( > } > } > >- if (frame_config.last_buffer_flags & TemporalLayers::BufferFlags::kUpdate) { >+ if (frame_config.last_buffer_flags & >+ Vp8TemporalLayers::BufferFlags::kUpdate) { > last_.is_updated_this_cycle = true; > last_.pattern_idx = pattern_idx_; > last_.is_keyframe = false; > } >- if (frame_config.arf_buffer_flags & TemporalLayers::BufferFlags::kUpdate) { >+ if (frame_config.arf_buffer_flags & Vp8TemporalLayers::BufferFlags::kUpdate) { > arf_.is_updated_this_cycle = true; > arf_.pattern_idx = pattern_idx_; > arf_.is_keyframe = false; > } >- if (frame_config.golden_buffer_flags & TemporalLayers::BufferFlags::kUpdate) { >+ if (frame_config.golden_buffer_flags & >+ Vp8TemporalLayers::BufferFlags::kUpdate) { > golden_.is_updated_this_cycle = true; > golden_.pattern_idx = pattern_idx_; > golden_.is_keyframe = false; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/default_temporal_layers.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/default_temporal_layers.h >index bda43cd6d93b414bbe7a4baaa5bd2661fc749d6b..fff62a557d36e0de9ffdabfb0ccd008153202385 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/default_temporal_layers.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/default_temporal_layers.h >@@ -18,23 +18,23 @@ > #include <set> > #include <vector> > >-#include "modules/video_coding/codecs/vp8/include/temporal_layers_checker.h" >-#include "modules/video_coding/codecs/vp8/include/vp8_temporal_layers.h" >- > #include "absl/types/optional.h" > 
>+#include "api/video_codecs/vp8_temporal_layers.h" >+#include "modules/video_coding/codecs/vp8/include/temporal_layers_checker.h" >+ > namespace webrtc { > >-class DefaultTemporalLayers : public TemporalLayers { >+class DefaultTemporalLayers : public Vp8TemporalLayers { > public: > explicit DefaultTemporalLayers(int number_of_temporal_layers); >- virtual ~DefaultTemporalLayers() {} >+ ~DefaultTemporalLayers() override; > > bool SupportsEncoderFrameDropping() const override; > > // Returns the recommended VP8 encode flags needed. May refresh the decoder > // and/or update the reference buffers. >- TemporalLayers::FrameConfig UpdateLayerConfig(uint32_t timestamp) override; >+ Vp8TemporalLayers::FrameConfig UpdateLayerConfig(uint32_t timestamp) override; > > // New target bitrate, per temporal layer. > void OnRatesUpdated(const std::vector<uint32_t>& bitrates_bps, >@@ -50,7 +50,7 @@ class DefaultTemporalLayers : public TemporalLayers { > > private: > static constexpr size_t kKeyframeBuffer = std::numeric_limits<size_t>::max(); >- static std::vector<TemporalLayers::FrameConfig> GetTemporalPattern( >+ static std::vector<Vp8TemporalLayers::FrameConfig> GetTemporalPattern( > size_t num_layers); > bool IsSyncFrame(const FrameConfig& config) const; > void ValidateReferences(BufferFlags* flags, Vp8BufferReference ref) const; >@@ -58,7 +58,7 @@ class DefaultTemporalLayers : public TemporalLayers { > > const size_t num_layers_; > const std::vector<unsigned int> temporal_ids_; >- const std::vector<TemporalLayers::FrameConfig> temporal_pattern_; >+ const std::vector<Vp8TemporalLayers::FrameConfig> temporal_pattern_; > // Set of buffers that are never updated except by keyframes. 
> const std::set<Vp8BufferReference> kf_buffers_; > >@@ -95,9 +95,11 @@ class DefaultTemporalLayers : public TemporalLayers { > class DefaultTemporalLayersChecker : public TemporalLayersChecker { > public: > explicit DefaultTemporalLayersChecker(int number_of_temporal_layers); >+ ~DefaultTemporalLayersChecker() override; >+ > bool CheckTemporalConfig( > bool frame_is_keyframe, >- const TemporalLayers::FrameConfig& frame_config) override; >+ const Vp8TemporalLayers::FrameConfig& frame_config) override; > > private: > struct BufferState { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/default_temporal_layers_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/default_temporal_layers_unittest.cc >index ad31a6ef4a0aecbb332b10156fe9d79988fd0f1c..5ee652055cea517782b3ccedcf09943daf71c7da 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/default_temporal_layers_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/default_temporal_layers_unittest.cc >@@ -74,7 +74,7 @@ constexpr int kDefaultBytesPerFrame = > constexpr int kDefaultQp = 2; > } // namespace > >-using BufferFlags = TemporalLayers::BufferFlags; >+using BufferFlags = Vp8TemporalLayers::BufferFlags; > > TEST(TemporalLayersTest, 2Layers) { > constexpr int kNumLayers = 2; >@@ -114,7 +114,7 @@ TEST(TemporalLayersTest, 2Layers) { > > uint32_t timestamp = 0; > for (int i = 0; i < 16; ++i) { >- TemporalLayers::FrameConfig tl_config = tl.UpdateLayerConfig(timestamp); >+ Vp8TemporalLayers::FrameConfig tl_config = tl.UpdateLayerConfig(timestamp); > EXPECT_EQ(expected_flags[i], LibvpxVp8Encoder::EncodeFlags(tl_config)) << i; > tl.OnEncodeDone(timestamp, kDefaultBytesPerFrame, i == 0, kDefaultQp, > &vp8_info); >@@ -166,7 +166,7 @@ TEST(TemporalLayersTest, 3Layers) { > > unsigned int timestamp = 0; > for (int i = 0; i < 16; ++i) { >- TemporalLayers::FrameConfig tl_config = 
tl.UpdateLayerConfig(timestamp); >+ Vp8TemporalLayers::FrameConfig tl_config = tl.UpdateLayerConfig(timestamp); > EXPECT_EQ(expected_flags[i], LibvpxVp8Encoder::EncodeFlags(tl_config)) << i; > tl.OnEncodeDone(timestamp, kDefaultBytesPerFrame, i == 0, kDefaultQp, > &vp8_info); >@@ -207,7 +207,7 @@ TEST(TemporalLayersTest, Alternative3Layers) { > > unsigned int timestamp = 0; > for (int i = 0; i < 8; ++i) { >- TemporalLayers::FrameConfig tl_config = tl.UpdateLayerConfig(timestamp); >+ Vp8TemporalLayers::FrameConfig tl_config = tl.UpdateLayerConfig(timestamp); > EXPECT_EQ(expected_flags[i], LibvpxVp8Encoder::EncodeFlags(tl_config)) << i; > tl.OnEncodeDone(timestamp, kDefaultBytesPerFrame, i == 0, kDefaultQp, > &vp8_info); >@@ -238,7 +238,7 @@ TEST(TemporalLayersTest, SearchOrder) { > > // Start with a key-frame. tl_config flags can be ignored. > uint32_t timestamp = 0; >- TemporalLayers::FrameConfig tl_config = tl.UpdateLayerConfig(timestamp); >+ Vp8TemporalLayers::FrameConfig tl_config = tl.UpdateLayerConfig(timestamp); > tl.OnEncodeDone(timestamp, kDefaultBytesPerFrame, true, kDefaultQp, > &vp8_info); > >@@ -282,7 +282,7 @@ TEST(TemporalLayersTest, SearchOrderWithDrop) { > > // Start with a key-frame. tl_config flags can be ignored. 
> uint32_t timestamp = 0; >- TemporalLayers::FrameConfig tl_config = tl.UpdateLayerConfig(timestamp); >+ Vp8TemporalLayers::FrameConfig tl_config = tl.UpdateLayerConfig(timestamp); > tl.OnEncodeDone(timestamp, kDefaultBytesPerFrame, true, kDefaultQp, > &vp8_info); > >@@ -343,7 +343,7 @@ TEST(TemporalLayersTest, 4Layers) { > > uint32_t timestamp = 0; > for (int i = 0; i < 16; ++i) { >- TemporalLayers::FrameConfig tl_config = tl.UpdateLayerConfig(timestamp); >+ Vp8TemporalLayers::FrameConfig tl_config = tl.UpdateLayerConfig(timestamp); > EXPECT_EQ(expected_flags[i], LibvpxVp8Encoder::EncodeFlags(tl_config)) << i; > tl.OnEncodeDone(timestamp, kDefaultBytesPerFrame, i == 0, kDefaultQp, > &vp8_info); >@@ -373,7 +373,7 @@ TEST(TemporalLayersTest, DoesNotReferenceDroppedFrames) { > > // Start with a keyframe. > uint32_t timestamp = 0; >- TemporalLayers::FrameConfig tl_config = tl.UpdateLayerConfig(timestamp); >+ Vp8TemporalLayers::FrameConfig tl_config = tl.UpdateLayerConfig(timestamp); > tl.OnEncodeDone(timestamp, kDefaultBytesPerFrame, true, kDefaultQp, > &vp8_info); > >@@ -460,7 +460,7 @@ TEST(TemporalLayersTest, DoesNotReferenceUnlessGuaranteedToExist) { > > // Start with a keyframe. > uint32_t timestamp = 0; >- TemporalLayers::FrameConfig tl_config = tl.UpdateLayerConfig(timestamp); >+ Vp8TemporalLayers::FrameConfig tl_config = tl.UpdateLayerConfig(timestamp); > tl.OnEncodeDone(timestamp, kDefaultBytesPerFrame, true, kDefaultQp, > &vp8_info); > >@@ -530,7 +530,7 @@ TEST(TemporalLayersTest, DoesNotReferenceUnlessGuaranteedToExistLongDelay) { > > // Start with a keyframe. 
> uint32_t timestamp = 0; >- TemporalLayers::FrameConfig tl_config = tl.UpdateLayerConfig(timestamp); >+ Vp8TemporalLayers::FrameConfig tl_config = tl.UpdateLayerConfig(timestamp); > tl.OnEncodeDone(timestamp, kDefaultBytesPerFrame, true, kDefaultQp, > &vp8_info); > >@@ -610,7 +610,8 @@ TEST(TemporalLayersTest, KeyFrame) { > for (int j = 1; j <= i; ++j) { > // Since last frame was always a keyframe and thus index 0 in the pattern, > // this loop starts at index 1. >- TemporalLayers::FrameConfig tl_config = tl.UpdateLayerConfig(timestamp); >+ Vp8TemporalLayers::FrameConfig tl_config = >+ tl.UpdateLayerConfig(timestamp); > EXPECT_EQ(expected_flags[j], LibvpxVp8Encoder::EncodeFlags(tl_config)) > << j; > tl.OnEncodeDone(timestamp, kDefaultBytesPerFrame, false, kDefaultQp, >@@ -622,7 +623,7 @@ TEST(TemporalLayersTest, KeyFrame) { > timestamp += 3000; > } > >- TemporalLayers::FrameConfig tl_config = tl.UpdateLayerConfig(timestamp); >+ Vp8TemporalLayers::FrameConfig tl_config = tl.UpdateLayerConfig(timestamp); > tl.OnEncodeDone(timestamp, kDefaultBytesPerFrame, true, kDefaultQp, > &vp8_info); > EXPECT_TRUE(vp8_info.layerSync) << "Key frame should be marked layer sync."; >@@ -652,9 +653,9 @@ class TemporalLayersReferenceTest : public ::testing::TestWithParam<int> { > bool sync; > }; > >- bool UpdateSyncRefState(const TemporalLayers::BufferFlags& flags, >+ bool UpdateSyncRefState(const Vp8TemporalLayers::BufferFlags& flags, > BufferState* buffer_state) { >- if (flags & TemporalLayers::kReference) { >+ if (flags & Vp8TemporalLayers::kReference) { > if (buffer_state->temporal_idx == -1) > return true; // References key-frame. > if (buffer_state->temporal_idx == 0) { >@@ -668,10 +669,10 @@ class TemporalLayersReferenceTest : public ::testing::TestWithParam<int> { > return true; // No reference, does not affect sync frame status. 
> } > >- void ValidateReference(const TemporalLayers::BufferFlags& flags, >+ void ValidateReference(const Vp8TemporalLayers::BufferFlags& flags, > const BufferState& buffer_state, > int temporal_layer) { >- if (flags & TemporalLayers::kReference) { >+ if (flags & Vp8TemporalLayers::kReference) { > if (temporal_layer > 0 && buffer_state.timestamp > 0) { > // Check that high layer reference does not go past last sync frame. > EXPECT_GE(buffer_state.timestamp, last_sync_timestamp_); >@@ -709,9 +710,9 @@ TEST_P(TemporalLayersReferenceTest, ValidFrameConfigs) { > // (any). If a given buffer is never updated, it is legal to reference it > // even for sync frames. In order to be general, don't assume TL0 always > // updates |last|. >- std::vector<TemporalLayers::FrameConfig> tl_configs(kMaxPatternLength); >+ std::vector<Vp8TemporalLayers::FrameConfig> tl_configs(kMaxPatternLength); > for (int i = 0; i < kMaxPatternLength; ++i) { >- TemporalLayers::FrameConfig tl_config = tl.UpdateLayerConfig(timestamp_); >+ Vp8TemporalLayers::FrameConfig tl_config = tl.UpdateLayerConfig(timestamp_); > tl.OnEncodeDone(timestamp_, kDefaultBytesPerFrame, i == 0, kDefaultQp, > &vp8_specifics); > ++timestamp_; >@@ -752,11 +753,11 @@ TEST_P(TemporalLayersReferenceTest, ValidFrameConfigs) { > > // Update the current layer state. 
> BufferState state = {temporal_idx, timestamp_, is_sync_frame}; >- if (tl_config.last_buffer_flags & TemporalLayers::kUpdate) >+ if (tl_config.last_buffer_flags & Vp8TemporalLayers::kUpdate) > last_state = state; >- if (tl_config.golden_buffer_flags & TemporalLayers::kUpdate) >+ if (tl_config.golden_buffer_flags & Vp8TemporalLayers::kUpdate) > golden_state = state; >- if (tl_config.arf_buffer_flags & TemporalLayers::kUpdate) >+ if (tl_config.arf_buffer_flags & Vp8TemporalLayers::kUpdate) > altref_state = state; > } > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/include/temporal_layers_checker.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/include/temporal_layers_checker.h >index 9878ac9f3a1c6bdcd15bcfe79f94b14f02810090..ae14f682361c3f4549fbbb7661ea60fa09826eb0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/include/temporal_layers_checker.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/include/temporal_layers_checker.h >@@ -15,7 +15,7 @@ > > #include <memory> > >-#include "modules/video_coding/codecs/vp8/include/vp8_temporal_layers.h" >+#include "api/video_codecs/vp8_temporal_layers.h" > > namespace webrtc { > >@@ -29,10 +29,10 @@ class TemporalLayersChecker { > > virtual bool CheckTemporalConfig( > bool frame_is_keyframe, >- const TemporalLayers::FrameConfig& frame_config); >+ const Vp8TemporalLayers::FrameConfig& frame_config); > > static std::unique_ptr<TemporalLayersChecker> CreateTemporalLayersChecker( >- TemporalLayersType type, >+ Vp8TemporalLayersType type, > int num_temporal_layers); > > private: >@@ -46,7 +46,7 @@ class TemporalLayersChecker { > bool* need_sync, > bool frame_is_keyframe, > uint8_t temporal_layer, >- webrtc::TemporalLayers::BufferFlags flags, >+ webrtc::Vp8TemporalLayers::BufferFlags flags, > uint32_t sequence_number, > uint32_t* lowest_sequence_referenced); > BufferState last_; >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/include/vp8_temporal_layers.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/include/vp8_temporal_layers.h >deleted file mode 100644 >index b5dbc4ec2bfc2b6013ba4aa222a07de0e4fc38f9..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/include/vp8_temporal_layers.h >+++ /dev/null >@@ -1,198 +0,0 @@ >-/* >- * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef MODULES_VIDEO_CODING_CODECS_VP8_INCLUDE_VP8_TEMPORAL_LAYERS_H_ >-#define MODULES_VIDEO_CODING_CODECS_VP8_INCLUDE_VP8_TEMPORAL_LAYERS_H_ >- >-#include <memory> >-#include <vector> >- >-namespace webrtc { >- >-// Some notes on the prerequisites of the TemporalLayers interface. >-// * Implementations of TemporalLayers may not contain internal synchronization >-// so caller must make sure doing so thread safe. >-// * The encoder is assumed to encode all frames in order, and callbacks to >-// PopulateCodecSpecific() / FrameEncoded() must happen in the same order. >-// >-// This means that in the case of pipelining encoders, it is OK to have a chain >-// of calls such as this: >-// - UpdateLayerConfig(timestampA) >-// - UpdateLayerConfig(timestampB) >-// - PopulateCodecSpecific(timestampA, ...) >-// - UpdateLayerConfig(timestampC) >-// - OnEncodeDone(timestampA, 1234, ...) >-// - UpdateLayerConfig(timestampC) >-// - OnEncodeDone(timestampB, 0, ...) >-// - OnEncodeDone(timestampC, 1234, ...) 
>-// Note that UpdateLayerConfig() for a new frame can happen before >-// FrameEncoded() for a previous one, but calls themselves must be both >-// synchronized (e.g. run on a task queue) and in order (per type). >- >-enum class TemporalLayersType { kFixedPattern, kBitrateDynamic }; >- >-struct CodecSpecificInfoVP8; >-enum class Vp8BufferReference : uint8_t { >- kNone = 0, >- kLast = 1, >- kGolden = 2, >- kAltref = 4 >-}; >- >-struct Vp8EncoderConfig { >- static constexpr size_t kMaxPeriodicity = 16; >- static constexpr size_t kMaxLayers = 5; >- >- // Number of active temporal layers. Set to 0 if not used. >- uint32_t ts_number_layers; >- // Arrays of length |ts_number_layers|, indicating (cumulative) target bitrate >- // and rate decimator (e.g. 4 if every 4th frame is in the given layer) for >- // each active temporal layer, starting with temporal id 0. >- uint32_t ts_target_bitrate[kMaxLayers]; >- uint32_t ts_rate_decimator[kMaxLayers]; >- >- // The periodicity of the temporal pattern. Set to 0 if not used. >- uint32_t ts_periodicity; >- // Array of length |ts_periodicity| indicating the sequence of temporal id's >- // to assign to incoming frames. >- uint32_t ts_layer_id[kMaxPeriodicity]; >- >- // Target bitrate, in bps. >- uint32_t rc_target_bitrate; >- >- // Clamp QP to min/max. Use 0 to disable clamping. >- uint32_t rc_min_quantizer; >- uint32_t rc_max_quantizer; >-}; >- >-// This interface defines a way of getting the encoder settings needed to >-// realize a temporal layer structure of predefined size. 
>-class TemporalLayers { >- public: >- enum BufferFlags : int { >- kNone = 0, >- kReference = 1, >- kUpdate = 2, >- kReferenceAndUpdate = kReference | kUpdate, >- }; >- enum FreezeEntropy { kFreezeEntropy }; >- >- struct FrameConfig { >- FrameConfig(); >- >- FrameConfig(BufferFlags last, BufferFlags golden, BufferFlags arf); >- FrameConfig(BufferFlags last, >- BufferFlags golden, >- BufferFlags arf, >- FreezeEntropy); >- >- bool drop_frame; >- BufferFlags last_buffer_flags; >- BufferFlags golden_buffer_flags; >- BufferFlags arf_buffer_flags; >- >- // The encoder layer ID is used to utilize the correct bitrate allocator >- // inside the encoder. It does not control references nor determine which >- // "actual" temporal layer this is. The packetizer temporal index determines >- // which layer the encoded frame should be packetized into. >- // Normally these are the same, but current temporal-layer strategies for >- // screenshare use one bitrate allocator for all layers, but attempt to >- // packetize / utilize references to split a stream into multiple layers, >- // with different quantizer settings, to hit target bitrate. >- // TODO(pbos): Screenshare layers are being reconsidered at the time of >- // writing, we might be able to remove this distinction, and have a temporal >- // layer imply both (the normal case). >- int encoder_layer_id; >- int packetizer_temporal_idx; >- >- bool layer_sync; >- >- bool freeze_entropy; >- >- // Indicates in which order the encoder should search the reference buffers >- // when doing motion prediction. Set to kNone to use unspecified order. Any >- // buffer indicated here must not have the corresponding no_ref bit set. >- // If all three buffers can be reference, the one not listed here should be >- // searched last. 
>- Vp8BufferReference first_reference; >- Vp8BufferReference second_reference; >- >- bool operator==(const FrameConfig& o) const; >- bool operator!=(const FrameConfig& o) const { return !(*this == o); } >- >- private: >- FrameConfig(BufferFlags last, >- BufferFlags golden, >- BufferFlags arf, >- bool freeze_entropy); >- }; >- >- // Factory for TemporalLayer strategy. Default behavior is a fixed pattern >- // of temporal layers. See default_temporal_layers.cc >- static std::unique_ptr<TemporalLayers> CreateTemporalLayers( >- TemporalLayersType type, >- int num_temporal_layers); >- >- virtual ~TemporalLayers() = default; >- >- // If this method returns true, the encoder is free to drop frames for >- // instance in an effort to uphold encoding bitrate. >- // If this return false, the encoder must not drop any frames unless: >- // 1. Requested to do so via FrameConfig.drop_frame >- // 2. The frame to be encoded is requested to be a keyframe >- // 3. The encoded detected a large overshoot and decided to drop and then >- // re-encode the image at a low bitrate. In this case the encoder should >- // call OnEncodeDone() once with size = 0 to indicate drop, and then call >- // OnEncodeDone() again when the frame has actually been encoded. >- virtual bool SupportsEncoderFrameDropping() const = 0; >- >- // New target bitrate, per temporal layer. >- virtual void OnRatesUpdated(const std::vector<uint32_t>& bitrates_bps, >- int framerate_fps) = 0; >- >- // Called by the encoder before encoding a frame. |cfg| contains the current >- // configuration. If the TemporalLayers instance wishes any part of that >- // to be changed before the encode step, |cfg| should be changed and then >- // return true. If false is returned, the encoder will proceed without >- // updating the configuration. >- virtual bool UpdateConfiguration(Vp8EncoderConfig* cfg) = 0; >- >- // Returns the recommended VP8 encode flags needed, and moves the temporal >- // pattern to the next frame. 
>- // The timestamp may be used as both a time and a unique identifier, and so >- // the caller must make sure no two frames use the same timestamp. >- // The timestamp uses a 90kHz RTP clock. >- // After calling this method, first call the actual encoder with the provided >- // frame configuration, and then OnEncodeDone() below. >- virtual FrameConfig UpdateLayerConfig(uint32_t rtp_timestamp) = 0; >- >- // Called after the encode step is done. |rtp_timestamp| must match the >- // parameter use in the UpdateLayerConfig() call. >- // |is_keyframe| must be true iff the encoder decided to encode this frame as >- // a keyframe. >- // If the encoder decided to drop this frame, |size_bytes| must be set to 0, >- // otherwise it should indicate the size in bytes of the encoded frame. >- // If |size_bytes| > 0, and |vp8_info| is not null, the TemporalLayers >- // instance my update |vp8_info| with codec specific data such as temporal id. >- // Some fields of this struct may have already been populated by the encoder, >- // check before overwriting. >- // If |size_bytes| > 0, |qp| should indicate the frame-level QP this frame was >- // encoded at. If the encoder does not support extracting this, |qp| should be >- // set to 0. 
>- virtual void OnEncodeDone(uint32_t rtp_timestamp, >- size_t size_bytes, >- bool is_keyframe, >- int qp, >- CodecSpecificInfoVP8* vp8_info) = 0; >-}; >- >-} // namespace webrtc >- >-#endif // MODULES_VIDEO_CODING_CODECS_VP8_INCLUDE_VP8_TEMPORAL_LAYERS_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/libvpx_vp8_decoder.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/libvpx_vp8_decoder.cc >index 20684d1d39185e62e10fc713e3d9953588682913..48838e93e7f369b6ca0dc672f5545809bcc257ec 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/libvpx_vp8_decoder.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/libvpx_vp8_decoder.cc >@@ -12,7 +12,6 @@ > #include <string> > > #include "absl/memory/memory.h" >-#include "common_video/libyuv/include/webrtc_libyuv.h" > #include "modules/video_coding/codecs/vp8/libvpx_vp8_decoder.h" > #include "rtc_base/checks.h" > #include "rtc_base/numerics/exp_filter.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/libvpx_vp8_decoder.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/libvpx_vp8_decoder.h >index 3a6cf92405cc7edc8b5a304a454d0cd63205cb48..cff9e8998841b9c3848469455dff36bbf88f78f8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/libvpx_vp8_decoder.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/libvpx_vp8_decoder.h >@@ -13,10 +13,10 @@ > > #include <memory> > >+#include "api/video/encoded_image.h" > #include "api/video_codecs/video_decoder.h" > #include "common_types.h" // NOLINT(build/include) > #include "common_video/include/i420_buffer_pool.h" >-#include "common_video/include/video_frame.h" > #include "modules/include/module_common_types.h" > #include "modules/video_coding/codecs/vp8/include/vp8.h" > #include 
"modules/video_coding/include/video_codec_interface.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/libvpx_vp8_encoder.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/libvpx_vp8_encoder.cc >index 41dad40d9e40bf3f9c6fa0ea3bd6463de960a17b..be5242c1b9168b06f9adb1d6604e4a9c016f8c53 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/libvpx_vp8_encoder.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/libvpx_vp8_encoder.cc >@@ -14,6 +14,8 @@ > #include <vector> > > #include "absl/memory/memory.h" >+#include "api/video_codecs/create_vp8_temporal_layers.h" >+#include "api/video_codecs/vp8_temporal_layers.h" > #include "common_video/libyuv/include/webrtc_libyuv.h" > #include "modules/video_coding/codecs/vp8/libvpx_vp8_encoder.h" > #include "modules/video_coding/utility/simulcast_rate_allocator.h" >@@ -27,7 +29,8 @@ > > namespace webrtc { > namespace { >-const char kVp8GfBoostFieldTrial[] = "WebRTC-VP8-GfBoost"; >+const char kVp8TrustedRateControllerFieldTrial[] = >+ "WebRTC-LibvpxVp8TrustedRateController"; > > // QP is obtained from VP8-bitstream for HW, so the QP corresponds to the > // bitstream range of [0, 127] and not the user-level range of [0,63]. 
>@@ -59,20 +62,6 @@ static int GCD(int a, int b) { > return b; > } > >-bool GetGfBoostPercentageFromFieldTrialGroup(int* boost_percentage) { >- std::string group = webrtc::field_trial::FindFullName(kVp8GfBoostFieldTrial); >- if (group.empty()) >- return false; >- >- if (sscanf(group.c_str(), "Enabled-%d", boost_percentage) != 1) >- return false; >- >- if (*boost_percentage < 0 || *boost_percentage > 100) >- return false; >- >- return true; >-} >- > static_assert(Vp8EncoderConfig::kMaxPeriodicity == VPX_TS_MAX_PERIODICITY, > "Vp8EncoderConfig::kMaxPeriodicity must be kept in sync with the " > "constant in libvpx."); >@@ -113,7 +102,7 @@ static void FillInEncoderConfig(vpx_codec_enc_cfg* vpx_config, > vpx_config->rc_max_quantizer = config.rc_max_quantizer; > } > >-bool UpdateVpxConfiguration(TemporalLayers* temporal_layers, >+bool UpdateVpxConfiguration(Vp8TemporalLayers* temporal_layers, > vpx_codec_enc_cfg_t* cfg) { > Vp8EncoderConfig config = GetEncoderConfig(cfg); > const bool res = temporal_layers->UpdateConfiguration(&config); >@@ -129,22 +118,22 @@ std::unique_ptr<VideoEncoder> VP8Encoder::Create() { > } > > vpx_enc_frame_flags_t LibvpxVp8Encoder::EncodeFlags( >- const TemporalLayers::FrameConfig& references) { >+ const Vp8TemporalLayers::FrameConfig& references) { > RTC_DCHECK(!references.drop_frame); > > vpx_enc_frame_flags_t flags = 0; > >- if ((references.last_buffer_flags & TemporalLayers::kReference) == 0) >+ if ((references.last_buffer_flags & Vp8TemporalLayers::kReference) == 0) > flags |= VP8_EFLAG_NO_REF_LAST; >- if ((references.last_buffer_flags & TemporalLayers::kUpdate) == 0) >+ if ((references.last_buffer_flags & Vp8TemporalLayers::kUpdate) == 0) > flags |= VP8_EFLAG_NO_UPD_LAST; >- if ((references.golden_buffer_flags & TemporalLayers::kReference) == 0) >+ if ((references.golden_buffer_flags & Vp8TemporalLayers::kReference) == 0) > flags |= VP8_EFLAG_NO_REF_GF; >- if ((references.golden_buffer_flags & TemporalLayers::kUpdate) == 0) >+ if 
((references.golden_buffer_flags & Vp8TemporalLayers::kUpdate) == 0) > flags |= VP8_EFLAG_NO_UPD_GF; >- if ((references.arf_buffer_flags & TemporalLayers::kReference) == 0) >+ if ((references.arf_buffer_flags & Vp8TemporalLayers::kReference) == 0) > flags |= VP8_EFLAG_NO_REF_ARF; >- if ((references.arf_buffer_flags & TemporalLayers::kUpdate) == 0) >+ if ((references.arf_buffer_flags & Vp8TemporalLayers::kUpdate) == 0) > flags |= VP8_EFLAG_NO_UPD_ARF; > if (references.freeze_entropy) > flags |= VP8_EFLAG_NO_UPD_ENTROPY; >@@ -157,7 +146,9 @@ LibvpxVp8Encoder::LibvpxVp8Encoder() > > LibvpxVp8Encoder::LibvpxVp8Encoder(std::unique_ptr<LibvpxInterface> interface) > : libvpx_(std::move(interface)), >- use_gf_boost_(webrtc::field_trial::IsEnabled(kVp8GfBoostFieldTrial)), >+ experimental_cpu_speed_config_arm_(CpuSpeedExperiment::GetConfigs()), >+ trusted_rate_controller_( >+ field_trial::IsEnabled(kVp8TrustedRateControllerFieldTrial)), > encoded_complete_callback_(nullptr), > inited_(false), > timestamp_(0), >@@ -167,7 +158,6 @@ LibvpxVp8Encoder::LibvpxVp8Encoder(std::unique_ptr<LibvpxInterface> interface) > rc_max_intra_target_(0), > key_frame_request_(kMaxSimulcastStreams, false) { > temporal_layers_.reserve(kMaxSimulcastStreams); >- temporal_layers_checkers_.reserve(kMaxSimulcastStreams); > raw_images_.reserve(kMaxSimulcastStreams); > encoded_images_.reserve(kMaxSimulcastStreams); > send_stream_.reserve(kMaxSimulcastStreams); >@@ -206,7 +196,6 @@ int LibvpxVp8Encoder::Release() { > raw_images_.pop_back(); > } > temporal_layers_.clear(); >- temporal_layers_checkers_.clear(); > inited_ = false; > return ret_val; > } >@@ -278,10 +267,6 @@ int LibvpxVp8Encoder::SetRateAllocation(const VideoBitrateAllocation& bitrate, > return WEBRTC_VIDEO_CODEC_OK; > } > >-const char* LibvpxVp8Encoder::ImplementationName() const { >- return "libvpx"; >-} >- > void LibvpxVp8Encoder::SetStreamState(bool send_stream, int stream_idx) { > if (send_stream && !send_stream_[stream_idx]) { > // Need 
a key frame if we have not sent this stream before. >@@ -294,21 +279,18 @@ void LibvpxVp8Encoder::SetupTemporalLayers(const VideoCodec& codec) { > RTC_DCHECK(temporal_layers_.empty()); > int num_streams = SimulcastUtility::NumberOfSimulcastStreams(codec); > for (int i = 0; i < num_streams; ++i) { >- TemporalLayersType type; >+ Vp8TemporalLayersType type; > int num_temporal_layers = > SimulcastUtility::NumberOfTemporalLayers(codec, i); > if (SimulcastUtility::IsConferenceModeScreenshare(codec) && i == 0) { >- type = TemporalLayersType::kBitrateDynamic; >+ type = Vp8TemporalLayersType::kBitrateDynamic; > // Legacy screenshare layers supports max 2 layers. > num_temporal_layers = std::max<int>(2, num_temporal_layers); > } else { >- type = TemporalLayersType::kFixedPattern; >+ type = Vp8TemporalLayersType::kFixedPattern; > } > temporal_layers_.emplace_back( >- TemporalLayers::CreateTemporalLayers(type, num_temporal_layers)); >- temporal_layers_checkers_.emplace_back( >- TemporalLayersChecker::CreateTemporalLayersChecker( >- type, num_temporal_layers)); >+ CreateVp8TemporalLayers(type, num_temporal_layers)); > } > } > >@@ -453,10 +435,10 @@ int LibvpxVp8Encoder::InitEncode(const VideoCodec* inst, > } > cpu_speed_default_ = cpu_speed_[0]; > // Set encoding complexity (cpu_speed) based on resolution and/or platform. 
>- cpu_speed_[0] = SetCpuSpeed(inst->width, inst->height); >+ cpu_speed_[0] = GetCpuSpeed(inst->width, inst->height); > for (int i = 1; i < number_of_streams; ++i) { > cpu_speed_[i] = >- SetCpuSpeed(inst->simulcastStream[number_of_streams - 1 - i].width, >+ GetCpuSpeed(inst->simulcastStream[number_of_streams - 1 - i].width, > inst->simulcastStream[number_of_streams - 1 - i].height); > } > configurations_[0].g_w = inst->width; >@@ -528,7 +510,7 @@ int LibvpxVp8Encoder::InitEncode(const VideoCodec* inst, > return InitAndSetControlSettings(); > } > >-int LibvpxVp8Encoder::SetCpuSpeed(int width, int height) { >+int LibvpxVp8Encoder::GetCpuSpeed(int width, int height) { > #if defined(WEBRTC_ARCH_ARM) || defined(WEBRTC_ARCH_ARM64) || \ > defined(WEBRTC_ANDROID) > // On mobile platform, use a lower speed setting for lower resolutions for >@@ -537,6 +519,11 @@ int LibvpxVp8Encoder::SetCpuSpeed(int width, int height) { > if (number_of_cores_ <= 3) > return -12; > >+ if (experimental_cpu_speed_config_arm_) { >+ return CpuSpeedExperiment::GetValue(width * height, >+ *experimental_cpu_speed_config_arm_); >+ } >+ > if (width * height <= 352 * 288) > return -8; > else if (width * height <= 640 * 480) >@@ -644,14 +631,6 @@ int LibvpxVp8Encoder::InitAndSetControlSettings() { > libvpx_->codec_control( > &(encoders_[i]), VP8E_SET_SCREEN_CONTENT_MODE, > codec_.mode == VideoCodecMode::kScreensharing ? 2u : 0u); >- // Apply boost on golden frames (has only effect when resilience is off). 
>- if (use_gf_boost_ && configurations_[0].g_error_resilient == 0) { >- int gf_boost_percent; >- if (GetGfBoostPercentageFromFieldTrialGroup(&gf_boost_percent)) { >- libvpx_->codec_control(&(encoders_[i]), VP8E_SET_GF_CBR_BOOST_PCT, >- gf_boost_percent); >- } >- } > } > inited_ = true; > return WEBRTC_VIDEO_CODEC_OK; >@@ -750,7 +729,7 @@ int LibvpxVp8Encoder::Encode(const VideoFrame& frame, > } > } > vpx_enc_frame_flags_t flags[kMaxSimulcastStreams]; >- TemporalLayers::FrameConfig tl_configs[kMaxSimulcastStreams]; >+ Vp8TemporalLayers::FrameConfig tl_configs[kMaxSimulcastStreams]; > for (size_t i = 0; i < encoders_.size(); ++i) { > tl_configs[i] = temporal_layers_[i]->UpdateLayerConfig(frame.timestamp()); > if (tl_configs[i].drop_frame) { >@@ -831,10 +810,11 @@ int LibvpxVp8Encoder::Encode(const VideoFrame& frame, > } > if (error) > return WEBRTC_VIDEO_CODEC_ERROR; >- timestamp_ += duration; > // Examines frame timestamps only. > error = GetEncodedPartitions(frame); > } >+ // TODO(sprang): Shouldn't we use the frame timestamp instead? 
>+ timestamp_ += duration; > return error; > } > >@@ -845,7 +825,6 @@ void LibvpxVp8Encoder::PopulateCodecSpecific(CodecSpecificInfo* codec_specific, > uint32_t timestamp) { > assert(codec_specific != NULL); > codec_specific->codecType = kVideoCodecVP8; >- codec_specific->codec_name = ImplementationName(); > CodecSpecificInfoVP8* vp8Info = &(codec_specific->codecSpecific.VP8); > vp8Info->keyIdx = kNoKeyIdx; // TODO(hlundin) populate this > vp8Info->nonReference = (pkt.data.frame.flags & VPX_FRAME_IS_DROPPABLE) != 0; >@@ -863,14 +842,9 @@ int LibvpxVp8Encoder::GetEncodedPartitions(const VideoFrame& input_image) { > for (size_t encoder_idx = 0; encoder_idx < encoders_.size(); > ++encoder_idx, --stream_idx) { > vpx_codec_iter_t iter = NULL; >- int part_idx = 0; > encoded_images_[encoder_idx]._length = 0; > encoded_images_[encoder_idx]._frameType = kVideoFrameDelta; >- RTPFragmentationHeader frag_info; >- // kTokenPartitions is number of bits used. >- frag_info.VerifyAndAllocateFragmentationHeader((1 << kTokenPartitions) + 1); > CodecSpecificInfo codec_specific; >- bool is_keyframe = false; > const vpx_codec_cx_pkt_t* pkt = NULL; > while ((pkt = libvpx_->codec_get_cx_data(&encoders_[encoder_idx], &iter)) != > NULL) { >@@ -887,13 +861,8 @@ int LibvpxVp8Encoder::GetEncodedPartitions(const VideoFrame& input_image) { > } > memcpy(&encoded_images_[encoder_idx]._buffer[length], > pkt->data.frame.buf, pkt->data.frame.sz); >- frag_info.fragmentationOffset[part_idx] = length; >- frag_info.fragmentationLength[part_idx] = pkt->data.frame.sz; >- frag_info.fragmentationPlType[part_idx] = 0; // not known here >- frag_info.fragmentationTimeDiff[part_idx] = 0; > encoded_images_[encoder_idx]._length += pkt->data.frame.sz; > assert(length <= encoded_images_[encoder_idx]._size); >- ++part_idx; > break; > } > default: >@@ -904,7 +873,6 @@ int LibvpxVp8Encoder::GetEncodedPartitions(const VideoFrame& input_image) { > // check if encoded frame is a key frame > if (pkt->data.frame.flags & 
VPX_FRAME_IS_KEY) { > encoded_images_[encoder_idx]._frameType = kVideoFrameKey; >- is_keyframe = true; > } > encoded_images_[encoder_idx].SetSpatialIndex(stream_idx); > PopulateCodecSpecific(&codec_specific, *pkt, stream_idx, encoder_idx, >@@ -935,7 +903,7 @@ int LibvpxVp8Encoder::GetEncodedPartitions(const VideoFrame& input_image) { > &qp_128); > encoded_images_[encoder_idx].qp_ = qp_128; > encoded_complete_callback_->OnEncodedImage(encoded_images_[encoder_idx], >- &codec_specific, &frag_info); >+ &codec_specific, nullptr); > } else if (!temporal_layers_[stream_idx] > ->SupportsEncoderFrameDropping()) { > result = WEBRTC_VIDEO_CODEC_TARGET_BITRATE_OVERSHOOT; >@@ -950,17 +918,21 @@ int LibvpxVp8Encoder::GetEncodedPartitions(const VideoFrame& input_image) { > return result; > } > >-VideoEncoder::ScalingSettings LibvpxVp8Encoder::GetScalingSettings() const { >+VideoEncoder::EncoderInfo LibvpxVp8Encoder::GetEncoderInfo() const { >+ EncoderInfo info; >+ info.supports_native_handle = false; >+ info.implementation_name = "libvpx"; >+ info.has_trusted_rate_controller = trusted_rate_controller_; >+ > const bool enable_scaling = encoders_.size() == 1 && > configurations_[0].rc_dropframe_thresh > 0 && > codec_.VP8().automaticResizeOn; >- return enable_scaling ? VideoEncoder::ScalingSettings(kLowVp8QpThreshold, >- kHighVp8QpThreshold) >- : VideoEncoder::ScalingSettings::kOff; >-} >+ info.scaling_settings = enable_scaling >+ ? 
VideoEncoder::ScalingSettings( >+ kLowVp8QpThreshold, kHighVp8QpThreshold) >+ : VideoEncoder::ScalingSettings::kOff; > >-int LibvpxVp8Encoder::SetChannelParameters(uint32_t packetLoss, int64_t rtt) { >- return WEBRTC_VIDEO_CODEC_OK; >+ return info; > } > > int LibvpxVp8Encoder::RegisterEncodeCompleteCallback( >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/libvpx_vp8_encoder.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/libvpx_vp8_encoder.h >index 3c20672df2cd4e6041a0fb5389a8425e425b6a31..df2dbcee59e663ac27d3a7b11e2d11ad4f499e92 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/libvpx_vp8_encoder.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/libvpx_vp8_encoder.h >@@ -14,15 +14,15 @@ > #include <memory> > #include <vector> > >+#include "api/video/encoded_image.h" > #include "api/video/video_frame.h" > #include "api/video_codecs/video_encoder.h" >+#include "api/video_codecs/vp8_temporal_layers.h" > #include "common_types.h" // NOLINT(build/include) >-#include "common_video/include/video_frame.h" >-#include "modules/video_coding/codecs/vp8/include/temporal_layers_checker.h" > #include "modules/video_coding/codecs/vp8/include/vp8.h" >-#include "modules/video_coding/codecs/vp8/include/vp8_temporal_layers.h" > #include "modules/video_coding/codecs/vp8/libvpx_interface.h" > #include "modules/video_coding/include/video_codec_interface.h" >+#include "rtc_base/experiments/cpu_speed_experiment.h" > > #include "vpx/vp8cx.h" > #include "vpx/vpx_encoder.h" >@@ -47,23 +47,19 @@ class LibvpxVp8Encoder : public VideoEncoder { > > int RegisterEncodeCompleteCallback(EncodedImageCallback* callback) override; > >- int SetChannelParameters(uint32_t packet_loss, int64_t rtt) override; >- > int SetRateAllocation(const VideoBitrateAllocation& bitrate, > uint32_t new_framerate) override; > >- ScalingSettings GetScalingSettings() const override; 
>- >- const char* ImplementationName() const override; >+ EncoderInfo GetEncoderInfo() const override; > > static vpx_enc_frame_flags_t EncodeFlags( >- const TemporalLayers::FrameConfig& references); >+ const Vp8TemporalLayers::FrameConfig& references); > > private: > void SetupTemporalLayers(const VideoCodec& codec); > >- // Set the cpu_speed setting for encoder based on resolution and/or platform. >- int SetCpuSpeed(int width, int height); >+ // Get the cpu_speed setting for encoder based on resolution and/or platform. >+ int GetCpuSpeed(int width, int height); > > // Determine number of encoder threads to use. > int NumberOfThreads(int width, int height, int number_of_cores); >@@ -87,7 +83,10 @@ class LibvpxVp8Encoder : public VideoEncoder { > uint32_t FrameDropThreshold(size_t spatial_idx) const; > > const std::unique_ptr<LibvpxInterface> libvpx_; >- const bool use_gf_boost_; >+ >+ const absl::optional<std::vector<CpuSpeedExperiment::Config>> >+ experimental_cpu_speed_config_arm_; >+ const bool trusted_rate_controller_; > > EncodedImageCallback* encoded_complete_callback_; > VideoCodec codec_; >@@ -97,8 +96,7 @@ class LibvpxVp8Encoder : public VideoEncoder { > int cpu_speed_default_; > int number_of_cores_; > uint32_t rc_max_intra_target_; >- std::vector<std::unique_ptr<TemporalLayers>> temporal_layers_; >- std::vector<std::unique_ptr<TemporalLayersChecker>> temporal_layers_checkers_; >+ std::vector<std::unique_ptr<Vp8TemporalLayers>> temporal_layers_; > std::vector<bool> key_frame_request_; > std::vector<bool> send_stream_; > std::vector<int> cpu_speed_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/libvpx_vp8_simulcast_test.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/libvpx_vp8_simulcast_test.cc >index 5f974d643bb0ee32cd6f76dbe00ae3b19f155d1e..d493e2aceb648771fa95ed5e23cc208e30ab2c1d 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/libvpx_vp8_simulcast_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/libvpx_vp8_simulcast_test.cc >@@ -13,9 +13,9 @@ > #include "absl/memory/memory.h" > #include "api/test/create_simulcast_test_fixture.h" > #include "api/test/simulcast_test_fixture.h" >+#include "api/test/video/function_video_decoder_factory.h" >+#include "api/test/video/function_video_encoder_factory.h" > #include "modules/video_coding/codecs/vp8/include/vp8.h" >-#include "test/function_video_decoder_factory.h" >-#include "test/function_video_encoder_factory.h" > #include "test/gtest.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/screenshare_layers.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/screenshare_layers.cc >index 76c4382a21f8dfeece7ee26cc0ff78d31e2fed46..ede21e554d72648a99005e8a77d9cbda9f6a725e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/screenshare_layers.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/screenshare_layers.cc >@@ -53,7 +53,7 @@ ScreenshareLayers::ScreenshareLayers(int num_temporal_layers, Clock* clock) > encode_framerate_(1000.0f, 1000.0f), // 1 second window, second scale. 
> bitrate_updated_(false), > checker_(TemporalLayersChecker::CreateTemporalLayersChecker( >- TemporalLayersType::kBitrateDynamic, >+ Vp8TemporalLayersType::kBitrateDynamic, > num_temporal_layers)) { > RTC_CHECK_GT(number_of_temporal_layers_, 0); > RTC_CHECK_LE(number_of_temporal_layers_, kMaxNumTemporalLayers); >@@ -68,7 +68,7 @@ bool ScreenshareLayers::SupportsEncoderFrameDropping() const { > return false; > } > >-TemporalLayers::FrameConfig ScreenshareLayers::UpdateLayerConfig( >+Vp8TemporalLayers::FrameConfig ScreenshareLayers::UpdateLayerConfig( > uint32_t timestamp) { > auto it = pending_frame_configs_.find(timestamp); > if (it != pending_frame_configs_.end()) { >@@ -79,7 +79,7 @@ TemporalLayers::FrameConfig ScreenshareLayers::UpdateLayerConfig( > if (number_of_temporal_layers_ <= 1) { > // No flags needed for 1 layer screenshare. > // TODO(pbos): Consider updating only last, and not all buffers. >- TemporalLayers::FrameConfig tl_config( >+ Vp8TemporalLayers::FrameConfig tl_config( > kReferenceAndUpdate, kReferenceAndUpdate, kReferenceAndUpdate); > pending_frame_configs_[timestamp] = tl_config; > return tl_config; >@@ -100,7 +100,7 @@ TemporalLayers::FrameConfig ScreenshareLayers::UpdateLayerConfig( > // averaging window, or if frame interval is below 90% of desired value, > // drop frame. > if (encode_framerate_.Rate(now_ms).value_or(0) > *target_framerate_) >- return TemporalLayers::FrameConfig(kNone, kNone, kNone); >+ return Vp8TemporalLayers::FrameConfig(kNone, kNone, kNone); > > // Primarily check if frame interval is too short using frame timestamps, > // as if they are correct they won't be affected by queuing in webrtc. 
>@@ -108,7 +108,7 @@ TemporalLayers::FrameConfig ScreenshareLayers::UpdateLayerConfig( > kOneSecond90Khz / *target_framerate_; > if (last_timestamp_ != -1 && ts_diff > 0) { > if (ts_diff < 85 * expected_frame_interval_90khz / 100) { >- return TemporalLayers::FrameConfig(kNone, kNone, kNone); >+ return Vp8TemporalLayers::FrameConfig(kNone, kNone, kNone); > } > } else { > // Timestamps looks off, use realtime clock here instead. >@@ -116,7 +116,7 @@ TemporalLayers::FrameConfig ScreenshareLayers::UpdateLayerConfig( > if (last_frame_time_ms_ != -1 && > now_ms - last_frame_time_ms_ < > (85 * expected_frame_interval_ms) / 100) { >- return TemporalLayers::FrameConfig(kNone, kNone, kNone); >+ return Vp8TemporalLayers::FrameConfig(kNone, kNone, kNone); > } > } > } >@@ -182,30 +182,30 @@ TemporalLayers::FrameConfig ScreenshareLayers::UpdateLayerConfig( > RTC_NOTREACHED(); > } > >- TemporalLayers::FrameConfig tl_config; >+ Vp8TemporalLayers::FrameConfig tl_config; > // TODO(pbos): Consider referencing but not updating the 'alt' buffer for all > // layers. > switch (layer_state) { > case TemporalLayerState::kDrop: >- tl_config = TemporalLayers::FrameConfig(kNone, kNone, kNone); >+ tl_config = Vp8TemporalLayers::FrameConfig(kNone, kNone, kNone); > break; > case TemporalLayerState::kTl0: > // TL0 only references and updates 'last'. > tl_config = >- TemporalLayers::FrameConfig(kReferenceAndUpdate, kNone, kNone); >+ Vp8TemporalLayers::FrameConfig(kReferenceAndUpdate, kNone, kNone); > tl_config.packetizer_temporal_idx = 0; > break; > case TemporalLayerState::kTl1: > // TL1 references both 'last' and 'golden' but only updates 'golden'. 
>- tl_config = >- TemporalLayers::FrameConfig(kReference, kReferenceAndUpdate, kNone); >+ tl_config = Vp8TemporalLayers::FrameConfig(kReference, >+ kReferenceAndUpdate, kNone); > tl_config.packetizer_temporal_idx = 1; > break; > case TemporalLayerState::kTl1Sync: > // Predict from only TL0 to allow participants to switch to the high > // bitrate stream. Updates 'golden' so that TL1 can continue to refer to > // and update 'golden' from this point on. >- tl_config = TemporalLayers::FrameConfig(kReference, kUpdate, kNone); >+ tl_config = Vp8TemporalLayers::FrameConfig(kReference, kUpdate, kNone); > tl_config.packetizer_temporal_idx = 1; > break; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/screenshare_layers.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/screenshare_layers.h >index 6193be360010df66b7b4f281f2a4c6b4908aa238..0359b2c33fc590cc0fcc1ceca4425e5576b57d43 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/screenshare_layers.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/screenshare_layers.h >@@ -13,8 +13,8 @@ > #include <memory> > #include <vector> > >+#include "api/video_codecs/vp8_temporal_layers.h" > #include "modules/video_coding/codecs/vp8/include/temporal_layers_checker.h" >-#include "modules/video_coding/codecs/vp8/include/vp8_temporal_layers.h" > #include "modules/video_coding/utility/frame_dropper.h" > #include "rtc_base/rate_statistics.h" > #include "rtc_base/timeutils.h" >@@ -24,7 +24,7 @@ namespace webrtc { > struct CodecSpecificInfoVP8; > class Clock; > >-class ScreenshareLayers : public TemporalLayers { >+class ScreenshareLayers : public Vp8TemporalLayers { > public: > static const double kMaxTL0FpsReduction; > static const double kAcceptableTargetOvershoot; >@@ -38,7 +38,7 @@ class ScreenshareLayers : public TemporalLayers { > > // Returns the recommended VP8 encode flags needed. 
May refresh the decoder > // and/or update the reference buffers. >- TemporalLayers::FrameConfig UpdateLayerConfig( >+ Vp8TemporalLayers::FrameConfig UpdateLayerConfig( > uint32_t rtp_timestamp) override; > > // New target bitrate, per temporal layer. >@@ -74,7 +74,7 @@ class ScreenshareLayers : public TemporalLayers { > int max_qp_; > uint32_t max_debt_bytes_; > >- std::map<uint32_t, TemporalLayers::FrameConfig> pending_frame_configs_; >+ std::map<uint32_t, Vp8TemporalLayers::FrameConfig> pending_frame_configs_; > > // Configured max framerate. > absl::optional<uint32_t> target_framerate_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/screenshare_layers_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/screenshare_layers_unittest.cc >index 4ae7ed5e7e425b6ae5c708db404dbcc0c53f4cf4..474e4f825b597ee10e4d964415fa7bc5f679c9fc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/screenshare_layers_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/screenshare_layers_unittest.cc >@@ -87,7 +87,7 @@ class ScreenshareLayerTest : public ::testing::Test { > return flags; > } > >- TemporalLayers::FrameConfig UpdateLayerConfig(uint32_t timestamp) { >+ Vp8TemporalLayers::FrameConfig UpdateLayerConfig(uint32_t timestamp) { > int64_t timestamp_ms = timestamp / 90; > clock_.AdvanceTimeMilliseconds(timestamp_ms - clock_.TimeInMilliseconds()); > return layers_->UpdateLayerConfig(timestamp); >@@ -167,7 +167,7 @@ class ScreenshareLayerTest : public ::testing::Test { > std::unique_ptr<ScreenshareLayers> layers_; > > uint32_t timestamp_; >- TemporalLayers::FrameConfig tl_config_; >+ Vp8TemporalLayers::FrameConfig tl_config_; > Vp8EncoderConfig cfg_; > bool config_updated_; > CodecSpecificInfoVP8 vp8_info_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/temporal_layers.h 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/temporal_layers.h >index 788d69a27bb3174e622af0450809ce3851c75880..9576fb27beaa6317c11945e13da08b43707078e1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/temporal_layers.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/temporal_layers.h >@@ -12,6 +12,6 @@ > #define MODULES_VIDEO_CODING_CODECS_VP8_TEMPORAL_LAYERS_H_ > > // TODO(webrtc:9012) Remove this file when downstream projects have updated. >-#include "modules/video_coding/codecs/vp8/include/vp8_temporal_layers.h" >+#include "api/video_codecs/vp8_temporal_layers.h" > > #endif // MODULES_VIDEO_CODING_CODECS_VP8_TEMPORAL_LAYERS_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/temporal_layers_checker.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/temporal_layers_checker.cc >index c0a736a4edda9fd94c01c07a2f564afe19e070ad..4dba7a366a9d928fbe408c0c42faa7854e3e5235 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/temporal_layers_checker.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/temporal_layers_checker.cc >@@ -17,13 +17,13 @@ > namespace webrtc { > > std::unique_ptr<TemporalLayersChecker> >-TemporalLayersChecker::CreateTemporalLayersChecker(TemporalLayersType type, >+TemporalLayersChecker::CreateTemporalLayersChecker(Vp8TemporalLayersType type, > int num_temporal_layers) { > switch (type) { >- case TemporalLayersType::kFixedPattern: >+ case Vp8TemporalLayersType::kFixedPattern: > return absl::make_unique<DefaultTemporalLayersChecker>( > num_temporal_layers); >- case TemporalLayersType::kBitrateDynamic: >+ case Vp8TemporalLayersType::kBitrateDynamic: > // Conference mode temporal layering for screen content in base stream. 
> return absl::make_unique<TemporalLayersChecker>(num_temporal_layers); > } >@@ -40,10 +40,10 @@ bool TemporalLayersChecker::CheckAndUpdateBufferState( > bool* need_sync, > bool frame_is_keyframe, > uint8_t temporal_layer, >- webrtc::TemporalLayers::BufferFlags flags, >+ webrtc::Vp8TemporalLayers::BufferFlags flags, > uint32_t sequence_number, > uint32_t* lowest_sequence_referenced) { >- if (flags & TemporalLayers::BufferFlags::kReference) { >+ if (flags & Vp8TemporalLayers::BufferFlags::kReference) { > if (state->temporal_layer > 0 && !state->is_keyframe) { > *need_sync = false; > } >@@ -57,7 +57,7 @@ bool TemporalLayersChecker::CheckAndUpdateBufferState( > return false; > } > } >- if ((flags & TemporalLayers::BufferFlags::kUpdate)) { >+ if ((flags & Vp8TemporalLayers::BufferFlags::kUpdate)) { > state->temporal_layer = temporal_layer; > state->sequence_number = sequence_number; > state->is_keyframe = frame_is_keyframe; >@@ -69,7 +69,7 @@ bool TemporalLayersChecker::CheckAndUpdateBufferState( > > bool TemporalLayersChecker::CheckTemporalConfig( > bool frame_is_keyframe, >- const TemporalLayers::FrameConfig& frame_config) { >+ const Vp8TemporalLayers::FrameConfig& frame_config) { > if (frame_config.drop_frame || > frame_config.packetizer_temporal_idx == kNoTemporalIdx) { > return true; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/test/mock_libvpx_interface.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/test/mock_libvpx_interface.h >index 5804e814ec9148858ddc83dc7e776b527c57eef2..dcff1e6a18846adbe3de419026e1773e798c6ac7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/test/mock_libvpx_interface.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/test/mock_libvpx_interface.h >@@ -32,7 +32,7 @@ class MockLibvpxVp8Interface : public LibvpxInterface { > unsigned int, > unsigned int, > unsigned char*)); >- 
MOCK_CONST_METHOD1(img_free, vpx_codec_err_t(const vpx_codec_enc_cfg_t*)); >+ MOCK_CONST_METHOD1(img_free, void(vpx_image_t* img)); > MOCK_CONST_METHOD2(codec_enc_config_set, > vpx_codec_err_t(vpx_codec_ctx_t*, > const vpx_codec_enc_cfg_t*)); >@@ -84,7 +84,7 @@ class MockLibvpxVp8Interface : public LibvpxInterface { > uint64_t, > vpx_enc_frame_flags_t, > uint64_t)); >- MOCK_CONST_METHOD3(codec_get_cx_data, >+ MOCK_CONST_METHOD2(codec_get_cx_data, > const vpx_codec_cx_pkt_t*(vpx_codec_ctx_t*, > vpx_codec_iter_t*)); > }; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/test/vp8_impl_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/test/vp8_impl_unittest.cc >index 45a090df3736de910663445cabd769a92b19c109..5eed5d380cf6a6f012ce1f80d2d90d7550a49a84 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/test/vp8_impl_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/test/vp8_impl_unittest.cc >@@ -12,16 +12,25 @@ > > #include <memory> > >+#include "api/test/mock_video_decoder.h" >+#include "api/test/mock_video_encoder.h" >+#include "api/video_codecs/vp8_temporal_layers.h" > #include "common_video/libyuv/include/webrtc_libyuv.h" > #include "modules/video_coding/codecs/test/video_codec_unittest.h" > #include "modules/video_coding/codecs/vp8/include/vp8.h" >-#include "modules/video_coding/codecs/vp8/include/vp8_temporal_layers.h" >+#include "modules/video_coding/codecs/vp8/libvpx_vp8_encoder.h" >+#include "modules/video_coding/codecs/vp8/test/mock_libvpx_interface.h" > #include "modules/video_coding/utility/vp8_header_parser.h" > #include "rtc_base/timeutils.h" > #include "test/video_codec_settings.h" > > namespace webrtc { > >+using testing::Invoke; >+using testing::NiceMock; >+using testing::Return; >+using testing::_; >+ > namespace { > constexpr uint32_t kInitialTimestampRtp = 123; > constexpr int64_t kTestNtpTimeMs = 
456; >@@ -68,7 +77,7 @@ class TestVp8Impl : public VideoCodecUnitTest { > encoder_->Encode(input_frame, nullptr, &frame_types)); > ASSERT_TRUE(WaitForEncodedFrame(encoded_frame, codec_specific_info)); > VerifyQpParser(*encoded_frame); >- EXPECT_STREQ("libvpx", codec_specific_info->codec_name); >+ EXPECT_EQ("libvpx", encoder_->GetEncoderInfo().implementation_name); > EXPECT_EQ(kVideoCodecVP8, codec_specific_info->codecType); > EXPECT_EQ(0, encoded_frame->SpatialIndex()); > } >@@ -325,7 +334,8 @@ TEST_F(TestVp8Impl, ScalingDisabledIfAutomaticResizeOff) { > EXPECT_EQ(WEBRTC_VIDEO_CODEC_OK, > encoder_->InitEncode(&codec_settings_, kNumCores, kMaxPayloadSize)); > >- VideoEncoder::ScalingSettings settings = encoder_->GetScalingSettings(); >+ VideoEncoder::ScalingSettings settings = >+ encoder_->GetEncoderInfo().scaling_settings; > EXPECT_FALSE(settings.thresholds.has_value()); > } > >@@ -335,7 +345,8 @@ TEST_F(TestVp8Impl, ScalingEnabledIfAutomaticResizeOn) { > EXPECT_EQ(WEBRTC_VIDEO_CODEC_OK, > encoder_->InitEncode(&codec_settings_, kNumCores, kMaxPayloadSize)); > >- VideoEncoder::ScalingSettings settings = encoder_->GetScalingSettings(); >+ VideoEncoder::ScalingSettings settings = >+ encoder_->GetEncoderInfo().scaling_settings; > EXPECT_TRUE(settings.thresholds.has_value()); > EXPECT_EQ(kDefaultMinPixelsPerFrame, settings.min_pixels_per_frame); > } >@@ -380,4 +391,41 @@ TEST_F(TestVp8Impl, DontDropKeyframes) { > EncodeAndExpectFrameWith(*NextInputFrame(), 0, true); > } > >+TEST_F(TestVp8Impl, KeepsTimestampOnReencode) { >+ auto* const vpx = new NiceMock<MockLibvpxVp8Interface>(); >+ LibvpxVp8Encoder encoder((std::unique_ptr<LibvpxInterface>(vpx))); >+ >+ // Settings needed to trigger ScreenshareLayers usage, which is required for >+ // overshoot-drop-reencode logic. 
>+ codec_settings_.targetBitrate = 200; >+ codec_settings_.maxBitrate = 1000; >+ codec_settings_.mode = VideoCodecMode::kScreensharing; >+ codec_settings_.VP8()->numberOfTemporalLayers = 2; >+ >+ EXPECT_CALL(*vpx, img_wrap(_, _, _, _, _, _)) >+ .WillOnce(Invoke([](vpx_image_t* img, vpx_img_fmt_t fmt, unsigned int d_w, >+ unsigned int d_h, unsigned int stride_align, >+ unsigned char* img_data) { >+ img->fmt = fmt; >+ img->d_w = d_w; >+ img->d_h = d_h; >+ img->img_data = img_data; >+ return img; >+ })); >+ EXPECT_EQ(WEBRTC_VIDEO_CODEC_OK, >+ encoder.InitEncode(&codec_settings_, 1, 1000)); >+ MockEncodedImageCallback callback; >+ encoder.RegisterEncodeCompleteCallback(&callback); >+ >+ // Simulate overshoot drop, re-encode: encode function will be called twice >+ // with the same parameters. codec_get_cx_data() will by default return no >+ // image data and be interpreted as drop. >+ EXPECT_CALL(*vpx, codec_encode(_, _, /* pts = */ 0, _, _, _)) >+ .Times(2) >+ .WillRepeatedly(Return(vpx_codec_err_t::VPX_CODEC_OK)); >+ >+ auto delta_frame = std::vector<FrameType>{kVideoFrameDelta}; >+ encoder.Encode(*NextInputFrame(), nullptr, &delta_frame); >+} >+ > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/vp8_temporal_layers.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/vp8_temporal_layers.cc >deleted file mode 100644 >index 49584414a911370d749686b430ca1e5098efcd6a..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp8/vp8_temporal_layers.cc >+++ /dev/null >@@ -1,42 +0,0 @@ >-/* Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. 
All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include "modules/video_coding/codecs/vp8/include/vp8_temporal_layers.h" >- >-#include "absl/memory/memory.h" >-#include "modules/video_coding/codecs/vp8/default_temporal_layers.h" >-#include "modules/video_coding/codecs/vp8/screenshare_layers.h" >-#include "system_wrappers/include/clock.h" >- >-namespace webrtc { >- >-bool TemporalLayers::FrameConfig::operator==(const FrameConfig& o) const { >- return drop_frame == o.drop_frame && >- last_buffer_flags == o.last_buffer_flags && >- golden_buffer_flags == o.golden_buffer_flags && >- arf_buffer_flags == o.arf_buffer_flags && layer_sync == o.layer_sync && >- freeze_entropy == o.freeze_entropy && >- encoder_layer_id == o.encoder_layer_id && >- packetizer_temporal_idx == o.packetizer_temporal_idx; >-} >- >-std::unique_ptr<TemporalLayers> TemporalLayers::CreateTemporalLayers( >- TemporalLayersType type, >- int num_temporal_layers) { >- switch (type) { >- case TemporalLayersType::kFixedPattern: >- return absl::make_unique<DefaultTemporalLayers>(num_temporal_layers); >- case TemporalLayersType::kBitrateDynamic: >- // Conference mode temporal layering for screen content in base stream. 
>- return absl::make_unique<ScreenshareLayers>(num_temporal_layers, >- Clock::GetRealTimeClock()); >- } >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/svc_config.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/svc_config.cc >index 6807698246e8ff3355e9ac2ac0452fb91aea6ab0..3e922801483928cc13ea31c7dea13a54bdeef3b1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/svc_config.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/svc_config.cc >@@ -22,9 +22,9 @@ namespace webrtc { > namespace { > const size_t kMinVp9SvcBitrateKbps = 30; > >-const size_t kMaxNumLayersForScreenSharing = 2; >-const float kMaxScreenSharingLayerFramerateFps[] = {5.0, 5.0}; >-const size_t kMaxScreenSharingLayerBitrateKbps[] = {200, 500}; >+const size_t kMaxNumLayersForScreenSharing = 3; >+const float kMaxScreenSharingLayerFramerateFps[] = {5.0, 5.0, 30.0}; >+const size_t kMaxScreenSharingLayerBitrateKbps[] = {200, 500, 1250}; > } // namespace > > std::vector<SpatialLayer> ConfigureSvcScreenSharing(size_t input_width, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/svc_config_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/svc_config_unittest.cc >index 257c5df2f037ba0f502dcf5a4f7ef0c2e781dc5e..b997767465661e8b1844b45750fd176ec8d7e11b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/svc_config_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/svc_config_unittest.cc >@@ -49,12 +49,13 @@ TEST(SvcConfig, ScreenSharing) { > std::vector<SpatialLayer> spatial_layers = > GetSvcConfig(1920, 1080, 30, 3, 3, true); > >- EXPECT_EQ(spatial_layers.size(), 2UL); >+ EXPECT_EQ(spatial_layers.size(), 3UL); > >- for (const SpatialLayer& layer : spatial_layers) { >+ for (size_t i = 0; i 
< 3; ++i) { >+ const SpatialLayer& layer = spatial_layers[i]; > EXPECT_EQ(layer.width, 1920); > EXPECT_EQ(layer.height, 1080); >- EXPECT_EQ(layer.maxFramerate, 5); >+ EXPECT_EQ(layer.maxFramerate, (i < 2) ? 5 : 30); > EXPECT_EQ(layer.numberOfTemporalLayers, 1); > EXPECT_LE(layer.minBitrate, layer.maxBitrate); > EXPECT_LE(layer.minBitrate, layer.targetBitrate); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/svc_rate_allocator_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/svc_rate_allocator_unittest.cc >index 048bf7d69471111e77461aa1cb8592817452cb69..eec2b9d4199ea9f77e3c99b6cb2d47849bd9f57c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/svc_rate_allocator_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/svc_rate_allocator_unittest.cc >@@ -149,7 +149,7 @@ TEST(SvcRateAllocatorTest, MinBitrateToGetQualityLayer) { > > const SpatialLayer* layers = codec.spatialLayers; > >- EXPECT_LE(codec.VP9()->numberOfSpatialLayers, 2U); >+ EXPECT_LE(codec.VP9()->numberOfSpatialLayers, 3U); > > VideoBitrateAllocation allocation = > allocator.GetAllocation(layers[0].minBitrate * 1000, 30); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/test/vp9_impl_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/test/vp9_impl_unittest.cc >index 4733ad979a4995a1acb3b6fd00bb9ccdfa4a22ca..85fa278a0d376eaf5512ec0758b8cb7dd721eaf2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/test/vp9_impl_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/test/vp9_impl_unittest.cc >@@ -104,6 +104,33 @@ class TestVp9Impl : public VideoCodecUnitTest { > codec_settings_.spatialLayers[i] = layers[i]; > } > } >+ >+ HdrMetadata CreateTestHdrMetadata() const { >+ // Random but reasonable HDR metadata. 
>+ HdrMetadata hdr_metadata; >+ hdr_metadata.mastering_metadata.luminance_max = 2000.0; >+ hdr_metadata.mastering_metadata.luminance_min = 2.0001; >+ hdr_metadata.mastering_metadata.primary_r.x = 0.30; >+ hdr_metadata.mastering_metadata.primary_r.y = 0.40; >+ hdr_metadata.mastering_metadata.primary_g.x = 0.32; >+ hdr_metadata.mastering_metadata.primary_g.y = 0.46; >+ hdr_metadata.mastering_metadata.primary_b.x = 0.34; >+ hdr_metadata.mastering_metadata.primary_b.y = 0.49; >+ hdr_metadata.mastering_metadata.white_point.x = 0.41; >+ hdr_metadata.mastering_metadata.white_point.y = 0.48; >+ hdr_metadata.max_content_light_level = 2345; >+ hdr_metadata.max_frame_average_light_level = 1789; >+ return hdr_metadata; >+ } >+ >+ ColorSpace CreateTestColorSpace() const { >+ HdrMetadata hdr_metadata = CreateTestHdrMetadata(); >+ ColorSpace color_space(ColorSpace::PrimaryID::kBT709, >+ ColorSpace::TransferID::kGAMMA22, >+ ColorSpace::MatrixID::kSMPTE2085, >+ ColorSpace::RangeID::kFull, &hdr_metadata); >+ return color_space; >+ } > }; > > // Disabled on ios as flake, see https://crbug.com/webrtc/7057 >@@ -128,7 +155,7 @@ TEST_F(TestVp9Impl, EncodeDecode) { > ASSERT_TRUE(decoded_frame); > EXPECT_GT(I420PSNR(input_frame, decoded_frame.get()), 36); > >- const ColorSpace color_space = decoded_frame->color_space().value(); >+ const ColorSpace color_space = *decoded_frame->color_space(); > EXPECT_EQ(ColorSpace::PrimaryID::kInvalid, color_space.primaries()); > EXPECT_EQ(ColorSpace::TransferID::kInvalid, color_space.transfer()); > EXPECT_EQ(ColorSpace::MatrixID::kInvalid, color_space.matrix()); >@@ -157,6 +184,60 @@ TEST_F(TestVp9Impl, EncodedRotationEqualsInputRotation) { > EXPECT_EQ(kVideoRotation_90, encoded_frame.rotation_); > } > >+TEST_F(TestVp9Impl, EncodedColorSpaceEqualsInputColorSpace) { >+ // Video frame without explicit color space information. 
>+ VideoFrame* input_frame = NextInputFrame(); >+ EXPECT_EQ(WEBRTC_VIDEO_CODEC_OK, >+ encoder_->Encode(*input_frame, nullptr, nullptr)); >+ EncodedImage encoded_frame; >+ CodecSpecificInfo codec_specific_info; >+ ASSERT_TRUE(WaitForEncodedFrame(&encoded_frame, &codec_specific_info)); >+ EXPECT_FALSE(encoded_frame.ColorSpace()); >+ >+ // Video frame with explicit color space information. >+ ColorSpace color_space = CreateTestColorSpace(); >+ VideoFrame input_frame_w_hdr = >+ VideoFrame::Builder() >+ .set_video_frame_buffer(input_frame->video_frame_buffer()) >+ .set_color_space(&color_space) >+ .build(); >+ EXPECT_EQ(WEBRTC_VIDEO_CODEC_OK, >+ encoder_->Encode(input_frame_w_hdr, nullptr, nullptr)); >+ ASSERT_TRUE(WaitForEncodedFrame(&encoded_frame, &codec_specific_info)); >+ ASSERT_TRUE(encoded_frame.ColorSpace()); >+ EXPECT_EQ(*encoded_frame.ColorSpace(), color_space); >+} >+ >+TEST_F(TestVp9Impl, DecodedHdrMetadataEqualsEncodedHdrMetadata) { >+ EXPECT_EQ(WEBRTC_VIDEO_CODEC_OK, >+ encoder_->Encode(*NextInputFrame(), nullptr, nullptr)); >+ EncodedImage encoded_frame; >+ CodecSpecificInfo codec_specific_info; >+ ASSERT_TRUE(WaitForEncodedFrame(&encoded_frame, &codec_specific_info)); >+ >+ // Encoded frame without explicit color space information. >+ EXPECT_EQ(WEBRTC_VIDEO_CODEC_OK, >+ decoder_->Decode(encoded_frame, false, nullptr, 0)); >+ std::unique_ptr<VideoFrame> decoded_frame; >+ absl::optional<uint8_t> decoded_qp; >+ ASSERT_TRUE(WaitForDecodedFrame(&decoded_frame, &decoded_qp)); >+ ASSERT_TRUE(decoded_frame); >+ // Color space present from encoded bitstream. >+ ASSERT_TRUE(decoded_frame->color_space()); >+ // No HDR metadata present. >+ EXPECT_FALSE(decoded_frame->color_space()->hdr_metadata()); >+ >+ // Encoded frame with explicit color space information. 
>+ ColorSpace color_space = CreateTestColorSpace(); >+ encoded_frame.SetColorSpace(&color_space); >+ EXPECT_EQ(WEBRTC_VIDEO_CODEC_OK, >+ decoder_->Decode(encoded_frame, false, nullptr, 0)); >+ ASSERT_TRUE(WaitForDecodedFrame(&decoded_frame, &decoded_qp)); >+ ASSERT_TRUE(decoded_frame); >+ ASSERT_TRUE(decoded_frame->color_space()); >+ EXPECT_EQ(color_space, *decoded_frame->color_space()); >+} >+ > TEST_F(TestVp9Impl, DecodedQpEqualsEncodedQp) { > EXPECT_EQ(WEBRTC_VIDEO_CODEC_OK, > encoder_->Encode(*NextInputFrame(), nullptr, nullptr)); >@@ -393,7 +474,7 @@ TEST_F(TestVp9Impl, InterLayerPred) { > ConfigureSvc(num_spatial_layers); > codec_settings_.VP9()->frameDroppingOn = false; > >- BitrateAllocation bitrate_allocation; >+ VideoBitrateAllocation bitrate_allocation; > for (size_t i = 0; i < num_spatial_layers; ++i) { > bitrate_allocation.SetBitrate( > i, 0, codec_settings_.spatialLayers[i].targetBitrate * 1000); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/vp9.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/vp9.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..22a08881208a129eb52e35a60d69f96b432f32bd >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/vp9.cc >@@ -0,0 +1,75 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "modules/video_coding/codecs/vp9/include/vp9.h" >+ >+#include "absl/memory/memory.h" >+#include "api/video_codecs/sdp_video_format.h" >+#include "modules/video_coding/codecs/vp9/vp9_impl.h" >+#include "rtc_base/checks.h" >+#include "vpx/vp8cx.h" >+#include "vpx/vp8dx.h" >+#include "vpx/vpx_codec.h" >+ >+namespace webrtc { >+ >+std::vector<SdpVideoFormat> SupportedVP9Codecs() { >+#ifdef RTC_ENABLE_VP9 >+ // Profile 2 might not be available on some platforms until >+ // https://bugs.chromium.org/p/webm/issues/detail?id=1544 is solved. >+ static bool vpx_supports_high_bit_depth = >+ (vpx_codec_get_caps(vpx_codec_vp9_cx()) & VPX_CODEC_CAP_HIGHBITDEPTH) != >+ 0 && >+ (vpx_codec_get_caps(vpx_codec_vp9_dx()) & VPX_CODEC_CAP_HIGHBITDEPTH) != >+ 0; >+ >+ std::vector<SdpVideoFormat> supported_formats{SdpVideoFormat( >+ cricket::kVp9CodecName, >+ {{kVP9FmtpProfileId, VP9ProfileToString(VP9Profile::kProfile0)}})}; >+ if (vpx_supports_high_bit_depth) { >+ supported_formats.push_back(SdpVideoFormat( >+ cricket::kVp9CodecName, >+ {{kVP9FmtpProfileId, VP9ProfileToString(VP9Profile::kProfile2)}})); >+ } >+ return supported_formats; >+#else >+ return std::vector<SdpVideoFormat>(); >+#endif >+} >+ >+std::unique_ptr<VP9Encoder> VP9Encoder::Create() { >+#ifdef RTC_ENABLE_VP9 >+ return absl::make_unique<VP9EncoderImpl>(cricket::VideoCodec()); >+#else >+ RTC_NOTREACHED(); >+ return nullptr; >+#endif >+} >+ >+std::unique_ptr<VP9Encoder> VP9Encoder::Create( >+ const cricket::VideoCodec& codec) { >+#ifdef RTC_ENABLE_VP9 >+ return absl::make_unique<VP9EncoderImpl>(codec); >+#else >+ RTC_NOTREACHED(); >+ return nullptr; >+#endif >+} >+ >+std::unique_ptr<VP9Decoder> VP9Decoder::Create() { >+#ifdef RTC_ENABLE_VP9 >+ return absl::make_unique<VP9DecoderImpl>(); >+#else >+ RTC_NOTREACHED(); >+ return nullptr; >+#endif >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/vp9_frame_buffer_pool.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/vp9_frame_buffer_pool.cc >index 69101cb506f350431990081f3cbf66f50085ebf7..ecbda4cb3bd0f24d36969746d8d58968154a0b95 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/vp9_frame_buffer_pool.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/vp9_frame_buffer_pool.cc >@@ -9,6 +9,8 @@ > * > */ > >+#ifdef RTC_ENABLE_VP9 >+ > #include "modules/video_coding/codecs/vp9/vp9_frame_buffer_pool.h" > > #include "vpx/vpx_codec.h" >@@ -138,3 +140,5 @@ int32_t Vp9FrameBufferPool::VpxReleaseFrameBuffer(void* user_priv, > } > > } // namespace webrtc >+ >+#endif // RTC_ENABLE_VP9 >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/vp9_frame_buffer_pool.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/vp9_frame_buffer_pool.h >index 9b035eb94ade737ded3b35bb67a27dcf98893f6c..02ede2432955fdea572b1c59ecca8f513f9c3a4c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/vp9_frame_buffer_pool.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/vp9_frame_buffer_pool.h >@@ -12,6 +12,8 @@ > #ifndef MODULES_VIDEO_CODING_CODECS_VP9_VP9_FRAME_BUFFER_POOL_H_ > #define MODULES_VIDEO_CODING_CODECS_VP9_VP9_FRAME_BUFFER_POOL_H_ > >+#ifdef RTC_ENABLE_VP9 >+ > #include <vector> > > #include "rtc_base/buffer.h" >@@ -120,4 +122,6 @@ class Vp9FrameBufferPool { > > } // namespace webrtc > >+#endif // RTC_ENABLE_VP9 >+ > #endif // MODULES_VIDEO_CODING_CODECS_VP9_VP9_FRAME_BUFFER_POOL_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/vp9_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/vp9_impl.cc >index 6b982a8a132389d2aea51dfeaac92bedf9c2af7f..61542c508211496297b36cecde4794b46284e605 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/vp9_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/vp9_impl.cc >@@ -9,6 +9,8 @@ > * > */ > >+#ifdef RTC_ENABLE_VP9 >+ > #include "modules/video_coding/codecs/vp9/vp9_impl.h" > > #include <algorithm> >@@ -37,11 +39,16 @@ > namespace webrtc { > > namespace { >+const char kVp9TrustedRateControllerFieldTrial[] = >+ "WebRTC-LibvpxVp9TrustedRateController"; >+ > // Maps from gof_idx to encoder internal reference frame buffer index. These > // maps work for 1,2 and 3 temporal layers with GOF length of 1,2 and 4 frames. > uint8_t kRefBufIdx[4] = {0, 0, 0, 1}; > uint8_t kUpdBufIdx[4] = {0, 0, 1, 0}; > >+int kMaxNumTiles4kVideo = 8; >+ > // Only positive speeds, range for real-time coding currently is: 5 - 8. > // Lower means slower/better quality, higher means fastest/lower quality. > int GetCpuSpeed(int width, int height) { >@@ -119,23 +126,6 @@ ColorSpace ExtractVP9ColorSpace(vpx_color_space_t space_t, > } > } // namespace > >-std::vector<SdpVideoFormat> SupportedVP9Codecs() { >- // TODO(emircan): Add Profile 2 support after fixing browser_tests. 
>- std::vector<SdpVideoFormat> supported_formats{SdpVideoFormat( >- cricket::kVp9CodecName, >- {{kVP9FmtpProfileId, VP9ProfileToString(VP9Profile::kProfile0)}})}; >- return supported_formats; >-} >- >-std::unique_ptr<VP9Encoder> VP9Encoder::Create() { >- return absl::make_unique<VP9EncoderImpl>(cricket::VideoCodec()); >-} >- >-std::unique_ptr<VP9Encoder> VP9Encoder::Create( >- const cricket::VideoCodec& codec) { >- return absl::make_unique<VP9EncoderImpl>(codec); >-} >- > void VP9EncoderImpl::EncoderOutputCodedPacketCallback(vpx_codec_cx_pkt* pkt, > void* user_data) { > VP9EncoderImpl* enc = static_cast<VP9EncoderImpl*>(user_data); >@@ -160,12 +150,16 @@ VP9EncoderImpl::VP9EncoderImpl(const cricket::VideoCodec& codec) > num_temporal_layers_(0), > num_spatial_layers_(0), > num_active_spatial_layers_(0), >- layer_deactivation_requires_key_frame_(webrtc::field_trial::IsEnabled( >- "WebRTC-Vp9IssueKeyFrameOnLayerDeactivation")), >+ layer_deactivation_requires_key_frame_( >+ field_trial::IsEnabled("WebRTC-Vp9IssueKeyFrameOnLayerDeactivation")), > is_svc_(false), > inter_layer_pred_(InterLayerPredMode::kOn), > external_ref_control_( >- webrtc::field_trial::IsEnabled("WebRTC-Vp9ExternalRefCtrl")), >+ field_trial::IsEnabled("WebRTC-Vp9ExternalRefCtrl")), >+ trusted_rate_controller_( >+ field_trial::IsEnabled(kVp9TrustedRateControllerFieldTrial)), >+ full_superframe_drop_(true), >+ first_frame_in_picture_(true), > is_flexible_mode_(false) { > memset(&codec_, 0, sizeof(codec_)); > memset(&svc_params_, 0, sizeof(vpx_svc_extra_cfg_t)); >@@ -627,7 +621,8 @@ int VP9EncoderImpl::InitAndSetControlSettings(const VideoCodec* inst) { > // quality flickering and is not compatible with RTP non-flexible mode. > vpx_svc_frame_drop_t svc_drop_frame; > memset(&svc_drop_frame, 0, sizeof(svc_drop_frame)); >- svc_drop_frame.framedrop_mode = FULL_SUPERFRAME_DROP; >+ svc_drop_frame.framedrop_mode = >+ full_superframe_drop_ ? 
FULL_SUPERFRAME_DROP : CONSTRAINED_LAYER_DROP; > svc_drop_frame.max_consec_drop = std::numeric_limits<int>::max(); > for (size_t i = 0; i < num_spatial_layers_; ++i) { > svc_drop_frame.framedrop_thresh[i] = config_->rc_dropframe_thresh; >@@ -701,30 +696,33 @@ int VP9EncoderImpl::Encode(const VideoFrame& input_image, > } > } > >- if (VideoCodecMode::kScreensharing == codec_.mode && !force_key_frame_) { >- // Skip encoding spatial layer frames if their target frame rate is lower >- // than actual input frame rate. >+ size_t first_active_spatial_layer_id = 0; >+ if (VideoCodecMode::kScreensharing == codec_.mode) { > vpx_svc_layer_id_t layer_id = {0}; >- const size_t gof_idx = (pics_since_key_ + 1) % gof_.num_frames_in_gof; >- layer_id.temporal_layer_id = gof_.temporal_idx[gof_idx]; >+ if (!force_key_frame_) { >+ // Skip encoding spatial layer frames if their target frame rate is lower >+ // than actual input frame rate. >+ const size_t gof_idx = (pics_since_key_ + 1) % gof_.num_frames_in_gof; >+ layer_id.temporal_layer_id = gof_.temporal_idx[gof_idx]; > >- const uint32_t frame_timestamp_ms = >- 1000 * input_image.timestamp() / kVideoPayloadTypeFrequency; >+ const uint32_t frame_timestamp_ms = >+ 1000 * input_image.timestamp() / kVideoPayloadTypeFrequency; > >- for (uint8_t sl_idx = 0; sl_idx < num_active_spatial_layers_; ++sl_idx) { >- if (framerate_controller_[sl_idx].DropFrame(frame_timestamp_ms)) { >- ++layer_id.spatial_layer_id; >- } else { >- break; >+ for (uint8_t sl_idx = 0; sl_idx < num_active_spatial_layers_; ++sl_idx) { >+ if (framerate_controller_[sl_idx].DropFrame(frame_timestamp_ms)) { >+ ++layer_id.spatial_layer_id; >+ } else { >+ break; >+ } > } >- } > >- RTC_DCHECK_LE(layer_id.spatial_layer_id, num_active_spatial_layers_); >- if (layer_id.spatial_layer_id >= num_active_spatial_layers_) { >- // Drop entire picture. 
>- return WEBRTC_VIDEO_CODEC_OK; >+ RTC_DCHECK_LE(layer_id.spatial_layer_id, num_active_spatial_layers_); >+ if (layer_id.spatial_layer_id >= num_active_spatial_layers_) { >+ // Drop entire picture. >+ return WEBRTC_VIDEO_CODEC_OK; >+ } > } >- >+ first_active_spatial_layer_id = layer_id.spatial_layer_id; > vpx_codec_control(encoder_, VP9E_SET_SVC_LAYER_ID, &layer_id); > } > >@@ -785,10 +783,21 @@ int VP9EncoderImpl::Encode(const VideoFrame& input_image, > } > > if (external_ref_control_) { >- vpx_svc_ref_frame_config_t ref_config = SetReferences(force_key_frame_); >+ vpx_svc_ref_frame_config_t ref_config = >+ SetReferences(force_key_frame_, first_active_spatial_layer_id); >+ >+ if (VideoCodecMode::kScreensharing == codec_.mode) { >+ for (uint8_t sl_idx = 0; sl_idx < num_active_spatial_layers_; ++sl_idx) { >+ ref_config.duration[sl_idx] = static_cast<int64_t>( >+ 90000 / framerate_controller_[sl_idx].GetTargetRate()); >+ } >+ } >+ > vpx_codec_control(encoder_, VP9E_SET_SVC_REF_FRAME_CONFIG, &ref_config); > } > >+ first_frame_in_picture_ = true; >+ > // TODO(ssilkin): Frame duration should be specified per spatial layer > // since their frame rate can be different. 
For now calculate frame duration > // based on target frame rate of the highest spatial layer, which frame rate >@@ -814,8 +823,10 @@ int VP9EncoderImpl::Encode(const VideoFrame& input_image, > } > timestamp_ += duration; > >- const bool end_of_picture = true; >- DeliverBufferedFrame(end_of_picture); >+ if (!full_superframe_drop_) { >+ const bool end_of_picture = true; >+ DeliverBufferedFrame(end_of_picture); >+ } > > return WEBRTC_VIDEO_CODEC_OK; > } >@@ -823,14 +834,12 @@ int VP9EncoderImpl::Encode(const VideoFrame& input_image, > void VP9EncoderImpl::PopulateCodecSpecific(CodecSpecificInfo* codec_specific, > absl::optional<int>* spatial_idx, > const vpx_codec_cx_pkt& pkt, >- uint32_t timestamp, >- bool first_frame_in_picture) { >+ uint32_t timestamp) { > RTC_CHECK(codec_specific != nullptr); > codec_specific->codecType = kVideoCodecVP9; >- codec_specific->codec_name = ImplementationName(); > CodecSpecificInfoVP9* vp9_info = &(codec_specific->codecSpecific.VP9); > >- vp9_info->first_frame_in_picture = first_frame_in_picture; >+ vp9_info->first_frame_in_picture = first_frame_in_picture_; > vp9_info->flexible_mode = is_flexible_mode_; > vp9_info->ss_data_available = > (pkt.data.frame.flags & VPX_FRAME_IS_KEY) ? true : false; >@@ -861,7 +870,7 @@ void VP9EncoderImpl::PopulateCodecSpecific(CodecSpecificInfo* codec_specific, > > if (pkt.data.frame.flags & VPX_FRAME_IS_KEY) { > pics_since_key_ = 0; >- } else if (first_frame_in_picture) { >+ } else if (first_frame_in_picture_) { > ++pics_since_key_; > } > >@@ -877,7 +886,7 @@ void VP9EncoderImpl::PopulateCodecSpecific(CodecSpecificInfo* codec_specific, > // if low layer frame is lost) then receiver won't be able to decode next high > // layer frame which uses ILP. > vp9_info->inter_layer_predicted = >- first_frame_in_picture ? false : is_inter_layer_pred_allowed; >+ first_frame_in_picture_ ? 
false : is_inter_layer_pred_allowed; > > // Mark all low spatial layer frames as references (not just frames of > // active low spatial layers) if inter-layer prediction is enabled since >@@ -921,6 +930,8 @@ void VP9EncoderImpl::PopulateCodecSpecific(CodecSpecificInfo* codec_specific, > vp9_info->gof.CopyGofInfoVP9(gof_); > } > } >+ >+ first_frame_in_picture_ = false; > } > > void VP9EncoderImpl::FillReferenceIndices(const vpx_codec_cx_pkt& pkt, >@@ -978,6 +989,8 @@ void VP9EncoderImpl::FillReferenceIndices(const vpx_codec_cx_pkt& pkt, > > size_t max_ref_temporal_layer_id = 0; > >+ std::vector<size_t> ref_pid_list; >+ > vp9_info->num_ref_pics = 0; > for (const RefFrameBuffer& ref_buf : ref_buf_list) { > RTC_DCHECK_LE(ref_buf.pic_num, pic_num); >@@ -990,6 +1003,16 @@ void VP9EncoderImpl::FillReferenceIndices(const vpx_codec_cx_pkt& pkt, > } > RTC_DCHECK_LE(ref_buf.temporal_layer_id, layer_id.temporal_layer_id); > >+ // Encoder may reference several spatial layers on the same previous >+ // frame in case if some spatial layers are skipped on the current frame. >+ // We shouldn't put duplicate references as it may break some old >+ // clients and isn't RTP compatible. >+ if (std::find(ref_pid_list.begin(), ref_pid_list.end(), >+ ref_buf.pic_num) != ref_pid_list.end()) { >+ continue; >+ } >+ ref_pid_list.push_back(ref_buf.pic_num); >+ > const size_t p_diff = pic_num - ref_buf.pic_num; > RTC_DCHECK_LE(p_diff, 127UL); > >@@ -1054,7 +1077,9 @@ void VP9EncoderImpl::UpdateReferenceBuffers(const vpx_codec_cx_pkt& pkt, > } > } > >-vpx_svc_ref_frame_config_t VP9EncoderImpl::SetReferences(bool is_key_pic) { >+vpx_svc_ref_frame_config_t VP9EncoderImpl::SetReferences( >+ bool is_key_pic, >+ size_t first_active_spatial_layer_id) { > // kRefBufIdx, kUpdBufIdx need to be updated to support longer GOFs. 
> RTC_DCHECK_LE(gof_.num_frames_in_gof, 4); > >@@ -1106,13 +1131,14 @@ vpx_svc_ref_frame_config_t VP9EncoderImpl::SetReferences(bool is_key_pic) { > } > } > >- if (is_inter_layer_pred_allowed && sl_idx > 0) { >+ if (is_inter_layer_pred_allowed && sl_idx > first_active_spatial_layer_id) { > // Set up spatial reference. > RTC_DCHECK(last_updated_buf_idx); > ref_config.gld_fb_idx[sl_idx] = *last_updated_buf_idx; > ref_config.reference_golden[sl_idx] = 1; > } else { >- RTC_DCHECK(ref_config.reference_last[sl_idx] != 0 || sl_idx == 0 || >+ RTC_DCHECK(ref_config.reference_last[sl_idx] != 0 || >+ sl_idx == first_active_spatial_layer_id || > inter_layer_pred_ == InterLayerPredMode::kOff); > } > >@@ -1147,12 +1173,11 @@ int VP9EncoderImpl::GetEncodedLayerFrame(const vpx_codec_cx_pkt* pkt) { > vpx_svc_layer_id_t layer_id = {0}; > vpx_codec_control(encoder_, VP9E_GET_SVC_LAYER_ID, &layer_id); > >- const bool first_frame_in_picture = encoded_image_._length == 0; >- // Ensure we don't buffer layers of previous picture (superframe). >- RTC_DCHECK(first_frame_in_picture || layer_id.spatial_layer_id > 0); >- >- const bool end_of_picture = false; >- DeliverBufferedFrame(end_of_picture); >+ if (!full_superframe_drop_) { >+ // Deliver buffered low spatial layer frame. 
>+ const bool end_of_picture = false; >+ DeliverBufferedFrame(end_of_picture); >+ } > > if (pkt->data.frame.sz > encoded_image_._size) { > delete[] encoded_image_._buffer; >@@ -1178,7 +1203,7 @@ int VP9EncoderImpl::GetEncodedLayerFrame(const vpx_codec_cx_pkt* pkt) { > memset(&codec_specific_, 0, sizeof(codec_specific_)); > absl::optional<int> spatial_index; > PopulateCodecSpecific(&codec_specific_, &spatial_index, *pkt, >- input_image_->timestamp(), first_frame_in_picture); >+ input_image_->timestamp()); > encoded_image_.SetSpatialIndex(spatial_index); > > if (is_flexible_mode_) { >@@ -1200,6 +1225,13 @@ int VP9EncoderImpl::GetEncodedLayerFrame(const vpx_codec_cx_pkt* pkt) { > int qp = -1; > vpx_codec_control(encoder_, VP8E_GET_LAST_QUANTIZER, &qp); > encoded_image_.qp_ = qp; >+ encoded_image_.SetColorSpace(input_image_->color_space()); >+ >+ if (full_superframe_drop_) { >+ const bool end_of_picture = encoded_image_.SpatialIndex().value_or(0) + 1 == >+ num_active_spatial_layers_; >+ DeliverBufferedFrame(end_of_picture); >+ } > > return WEBRTC_VIDEO_CODEC_OK; > } >@@ -1230,22 +1262,19 @@ void VP9EncoderImpl::DeliverBufferedFrame(bool end_of_picture) { > } > } > >-int VP9EncoderImpl::SetChannelParameters(uint32_t packet_loss, int64_t rtt) { >- return WEBRTC_VIDEO_CODEC_OK; >-} >- > int VP9EncoderImpl::RegisterEncodeCompleteCallback( > EncodedImageCallback* callback) { > encoded_complete_callback_ = callback; > return WEBRTC_VIDEO_CODEC_OK; > } > >-const char* VP9EncoderImpl::ImplementationName() const { >- return "libvpx"; >-} >- >-std::unique_ptr<VP9Decoder> VP9Decoder::Create() { >- return absl::make_unique<VP9DecoderImpl>(); >+VideoEncoder::EncoderInfo VP9EncoderImpl::GetEncoderInfo() const { >+ EncoderInfo info; >+ info.supports_native_handle = false; >+ info.implementation_name = "libvpx"; >+ info.scaling_settings = VideoEncoder::ScalingSettings::kOff; >+ info.has_trusted_rate_controller = trusted_rate_controller_; >+ return info; > } > > 
VP9DecoderImpl::VP9DecoderImpl() >@@ -1272,13 +1301,18 @@ int VP9DecoderImpl::InitDecode(const VideoCodec* inst, int number_of_cores) { > if (ret_val < 0) { > return ret_val; > } >+ > if (decoder_ == nullptr) { > decoder_ = new vpx_codec_ctx_t; > } > vpx_codec_dec_cfg_t cfg; >- // Setting number of threads to a constant value (1) >- cfg.threads = 1; >- cfg.h = cfg.w = 0; // set after decode >+ memset(&cfg, 0, sizeof(cfg)); >+ >+ // We want to use multithreading when decoding high resolution videos. But, >+ // since we don't know resolution of input stream at this stage, we always >+ // enable it. >+ cfg.threads = std::min(number_of_cores, kMaxNumTiles4kVideo); >+ > vpx_codec_flags_t flags = 0; > if (vpx_codec_dec_init(decoder_, vpx_codec_vp9_dx(), &cfg, flags)) { > return WEBRTC_VIDEO_CODEC_MEMORY; >@@ -1336,8 +1370,8 @@ int VP9DecoderImpl::Decode(const EncodedImage& input_image, > vpx_codec_err_t vpx_ret = > vpx_codec_control(decoder_, VPXD_GET_LAST_QUANTIZER, &qp); > RTC_DCHECK_EQ(vpx_ret, VPX_CODEC_OK); >- int ret = >- ReturnFrame(img, input_image.Timestamp(), input_image.ntp_time_ms_, qp); >+ int ret = ReturnFrame(img, input_image.Timestamp(), input_image.ntp_time_ms_, >+ qp, input_image.ColorSpace()); > if (ret != 0) { > return ret; > } >@@ -1347,7 +1381,8 @@ int VP9DecoderImpl::Decode(const EncodedImage& input_image, > int VP9DecoderImpl::ReturnFrame(const vpx_image_t* img, > uint32_t timestamp, > int64_t ntp_time_ms, >- int qp) { >+ int qp, >+ const ColorSpace* explicit_color_space) { > if (img == nullptr) { > // Decoder OK and nullptr image => No show frame. 
> return WEBRTC_VIDEO_CODEC_NO_OUTPUT; >@@ -1389,15 +1424,21 @@ int VP9DecoderImpl::ReturnFrame(const vpx_image_t* img, > return WEBRTC_VIDEO_CODEC_NO_OUTPUT; > } > >- VideoFrame decoded_image = VideoFrame::Builder() >- .set_video_frame_buffer(img_wrapped_buffer) >- .set_timestamp_ms(0) >- .set_timestamp_rtp(timestamp) >- .set_ntp_time_ms(ntp_time_ms) >- .set_rotation(webrtc::kVideoRotation_0) >- .set_color_space(ExtractVP9ColorSpace( >- img->cs, img->range, img->bit_depth)) >- .build(); >+ auto builder = VideoFrame::Builder() >+ .set_video_frame_buffer(img_wrapped_buffer) >+ .set_timestamp_ms(0) >+ .set_timestamp_rtp(timestamp) >+ .set_ntp_time_ms(ntp_time_ms) >+ .set_rotation(webrtc::kVideoRotation_0); >+ if (explicit_color_space) { >+ builder.set_color_space(explicit_color_space); >+ } else { >+ builder.set_color_space( >+ ExtractVP9ColorSpace(img->cs, img->range, img->bit_depth)); >+ } >+ >+ VideoFrame decoded_image = builder.build(); >+ > decode_complete_callback_->Decoded(decoded_image, absl::nullopt, qp); > return WEBRTC_VIDEO_CODEC_OK; > } >@@ -1435,3 +1476,5 @@ const char* VP9DecoderImpl::ImplementationName() const { > } > > } // namespace webrtc >+ >+#endif // RTC_ENABLE_VP9 >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/vp9_impl.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/vp9_impl.h >index d8ce1ff2ab5efb23773cf7f202808bf314fb9e72..3bfab9ad5fd589badd523b419f818d1f2e8207b2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/vp9_impl.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/vp9_impl.h >@@ -12,6 +12,8 @@ > #ifndef MODULES_VIDEO_CODING_CODECS_VP9_VP9_IMPL_H_ > #define MODULES_VIDEO_CODING_CODECS_VP9_VP9_IMPL_H_ > >+#ifdef RTC_ENABLE_VP9 >+ > #include <map> > #include <memory> > #include <vector> >@@ -46,12 +48,10 @@ class VP9EncoderImpl : public VP9Encoder { > > int 
RegisterEncodeCompleteCallback(EncodedImageCallback* callback) override; > >- int SetChannelParameters(uint32_t packet_loss, int64_t rtt) override; >- > int SetRateAllocation(const VideoBitrateAllocation& bitrate_allocation, > uint32_t frame_rate) override; > >- const char* ImplementationName() const override; >+ EncoderInfo GetEncoderInfo() const override; > > private: > // Determine number of encoder threads to use. >@@ -63,15 +63,16 @@ class VP9EncoderImpl : public VP9Encoder { > void PopulateCodecSpecific(CodecSpecificInfo* codec_specific, > absl::optional<int>* spatial_idx, > const vpx_codec_cx_pkt& pkt, >- uint32_t timestamp, >- bool first_frame_in_picture); >+ uint32_t timestamp); > void FillReferenceIndices(const vpx_codec_cx_pkt& pkt, > const size_t pic_num, > const bool inter_layer_predicted, > CodecSpecificInfoVP9* vp9_info); > void UpdateReferenceBuffers(const vpx_codec_cx_pkt& pkt, > const size_t pic_num); >- vpx_svc_ref_frame_config_t SetReferences(bool is_key_pic); >+ vpx_svc_ref_frame_config_t SetReferences( >+ bool is_key_pic, >+ size_t first_active_spatial_layer_id); > > bool ExplicitlyConfiguredSpatialLayers() const; > bool SetSvcRates(const VideoBitrateAllocation& bitrate_allocation); >@@ -119,6 +120,9 @@ class VP9EncoderImpl : public VP9Encoder { > bool is_svc_; > InterLayerPredMode inter_layer_pred_; > bool external_ref_control_; >+ const bool trusted_rate_controller_; >+ const bool full_superframe_drop_; >+ bool first_frame_in_picture_; > > std::vector<FramerateController> framerate_controller_; > >@@ -168,7 +172,8 @@ class VP9DecoderImpl : public VP9Decoder { > int ReturnFrame(const vpx_image_t* img, > uint32_t timestamp, > int64_t ntp_time_ms, >- int qp); >+ int qp, >+ const ColorSpace* explicit_color_space); > > // Memory pool used to share buffers between libvpx and webrtc. 
> Vp9FrameBufferPool frame_buffer_pool_; >@@ -179,4 +184,6 @@ class VP9DecoderImpl : public VP9Decoder { > }; > } // namespace webrtc > >+#endif // RTC_ENABLE_VP9 >+ > #endif // MODULES_VIDEO_CODING_CODECS_VP9_VP9_IMPL_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/encoded_frame.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/encoded_frame.h >index a08eb0769c77f9f4aeb33fc9e1f15c2daec148a5..c7efd400727b864afdee490d629b1ea1747e99c6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/encoded_frame.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/encoded_frame.h >@@ -13,8 +13,8 @@ > > #include <vector> > >+#include "api/video/encoded_image.h" > #include "common_types.h" // NOLINT(build/include) >-#include "common_video/include/video_frame.h" > #include "modules/include/module_common_types.h" > #include "modules/video_coding/include/video_codec_interface.h" > #include "modules/video_coding/include/video_coding_defines.h" >@@ -60,11 +60,21 @@ class VCMEncodedFrame : protected EncodedImage { > * Get pointer to frame buffer > */ > const uint8_t* Buffer() const { return _buffer; } >+ /** >+ * Get pointer to frame buffer that can be mutated. 
>+ */ >+ uint8_t* MutableBuffer() { return _buffer; } > /** > * Get frame length > */ > size_t Length() const { return _length; } >- >+ /** >+ * Set frame length >+ */ >+ void SetLength(size_t length) { >+ RTC_DCHECK(length <= _size); >+ _length = length; >+ } > /** > * Frame RTP timestamp (90kHz) > */ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/fec_controller_default.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/fec_controller_default.cc >index 5fe38e6297dd1379b0e3483bcfda82e45ccbaf03..841ed742dac509e5772a02674044c3fbef474a8f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/fec_controller_default.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/fec_controller_default.cc >@@ -8,10 +8,17 @@ > * be found in the AUTHORS file in the root of the source tree. > */ > >-#include "modules/video_coding/fec_controller_default.h" >+#include <algorithm> >+ >+#include "modules/video_coding/fec_controller_default.h" // NOLINT >+#include "rtc_base/logging.h" >+#include "system_wrappers/include/field_trial.h" > > namespace webrtc { > using rtc::CritScope; >+ >+const float kProtectionOverheadRateThreshold = 0.5; >+ > FecControllerDefault::FecControllerDefault( > Clock* clock, > VCMProtectionCallback* protection_callback) >@@ -19,13 +26,15 @@ FecControllerDefault::FecControllerDefault( > protection_callback_(protection_callback), > loss_prot_logic_(new media_optimization::VCMLossProtectionLogic( > clock_->TimeInMilliseconds())), >- max_payload_size_(1460) {} >+ max_payload_size_(1460), >+ overhead_threshold_(GetProtectionOverheadRateThreshold()) {} > > FecControllerDefault::FecControllerDefault(Clock* clock) > : clock_(clock), > loss_prot_logic_(new media_optimization::VCMLossProtectionLogic( > clock_->TimeInMilliseconds())), >- max_payload_size_(1460) {} >+ max_payload_size_(1460), >+ overhead_threshold_(GetProtectionOverheadRateThreshold()) {} > > 
FecControllerDefault::~FecControllerDefault(void) { > loss_prot_logic_->Release(); >@@ -46,6 +55,25 @@ void FecControllerDefault::SetEncodingData(size_t width, > max_payload_size_ = max_payload_size; > } > >+float FecControllerDefault::GetProtectionOverheadRateThreshold() { >+ float overhead_threshold = >+ strtof(webrtc::field_trial::FindFullName( >+ "WebRTC-ProtectionOverheadRateThreshold") >+ .c_str(), >+ nullptr); >+ if (overhead_threshold > 0 && overhead_threshold <= 1) { >+ RTC_LOG(LS_INFO) << "ProtectionOverheadRateThreshold is set to " >+ << overhead_threshold; >+ return overhead_threshold; >+ } else if (overhead_threshold < 0 || overhead_threshold > 1) { >+ RTC_LOG(WARNING) << "ProtectionOverheadRateThreshold field trial is set to " >+ "an invalid value, expecting a value between (0, 1]."; >+ } >+ // WebRTC-ProtectionOverheadRateThreshold field trial string is not found, use >+ // the default value. >+ return kProtectionOverheadRateThreshold; >+} >+ > uint32_t FecControllerDefault::UpdateFecRates( > uint32_t estimated_bitrate_bps, > int actual_framerate_fps, >@@ -125,9 +153,9 @@ uint32_t FecControllerDefault::UpdateFecRates( > static_cast<float>(sent_nack_rate_bps + sent_fec_rate_bps) / > sent_total_rate_bps; > } >- // Cap the overhead estimate to 50%. >- if (protection_overhead_rate > 0.5) >- protection_overhead_rate = 0.5; >+ // Cap the overhead estimate to a threshold, default is 50%. >+ protection_overhead_rate = >+ std::min(protection_overhead_rate, overhead_threshold_); > // Source coding rate: total rate - protection overhead. 
> return estimated_bitrate_bps * (1.0 - protection_overhead_rate); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/fec_controller_default.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/fec_controller_default.h >index a95eced245a2208a02420d12bacd56891f617900..9d79a61d28ebbaeed7f0f31a51c7f7680ae44ae5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/fec_controller_default.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/fec_controller_default.h >@@ -42,6 +42,7 @@ class FecControllerDefault : public FecController { > void UpdateWithEncodedData(const size_t encoded_image_length, > const FrameType encoded_image_frametype) override; > bool UseLossVectorMask() override; >+ float GetProtectionOverheadRateThreshold(); > > private: > enum { kBitrateAverageWinMs = 1000 }; >@@ -52,6 +53,7 @@ class FecControllerDefault : public FecController { > RTC_GUARDED_BY(crit_sect_); > size_t max_payload_size_ RTC_GUARDED_BY(crit_sect_); > RTC_DISALLOW_COPY_AND_ASSIGN(FecControllerDefault); >+ const float overhead_threshold_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/frame_buffer2.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/frame_buffer2.cc >index 377611e5e271437506ad9847ea6840c9e1d6d606..04fba84caf55b447c057da93d06e654016ec1ec3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/frame_buffer2.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/frame_buffer2.cc >@@ -46,11 +46,9 @@ FrameBuffer::FrameBuffer(Clock* clock, > VCMTiming* timing, > VCMReceiveStatisticsCallback* stats_callback) > : clock_(clock), >- new_continuous_frame_event_(false, false), > jitter_estimator_(jitter_estimator), > timing_(timing), > inter_frame_delay_(clock_->TimeInMilliseconds()), >- last_decoded_frame_timestamp_(0), > last_decoded_frame_it_(frames_.end()), > 
last_continuous_frame_it_(frames_.end()), > num_frames_history_(0), >@@ -115,10 +113,19 @@ FrameBuffer::ReturnReason FrameBuffer::NextFrame( > if (keyframe_required && !frame->is_keyframe()) > continue; > >+ // TODO(https://bugs.webrtc.org/9974): consider removing this check >+ // as it may make a stream undecodable after a very long delay between >+ // frames. >+ if (last_decoded_frame_timestamp_ && >+ AheadOf(*last_decoded_frame_timestamp_, frame->Timestamp())) { >+ continue; >+ } >+ > next_frame_it_ = frame_it; >- if (frame->RenderTime() == -1) >+ if (frame->RenderTime() == -1) { > frame->SetRenderTime( > timing_->RenderTimeMs(frame->Timestamp(), now_ms)); >+ } > wait_ms = timing_->MaxWaitingTime(frame->RenderTime(), now_ms); > > // This will cause the frame buffer to prefer high framerate rather >@@ -175,33 +182,6 @@ FrameBuffer::ReturnReason FrameBuffer::NextFrame( > UpdateTimingFrameInfo(); > PropagateDecodability(next_frame_it_->second); > >- // Sanity check for RTP timestamp monotonicity. >- if (last_decoded_frame_it_ != frames_.end()) { >- const VideoLayerFrameId& last_decoded_frame_key = >- last_decoded_frame_it_->first; >- const VideoLayerFrameId& frame_key = next_frame_it_->first; >- >- const bool frame_is_higher_spatial_layer_of_last_decoded_frame = >- last_decoded_frame_timestamp_ == frame->Timestamp() && >- last_decoded_frame_key.picture_id == frame_key.picture_id && >- last_decoded_frame_key.spatial_layer < frame_key.spatial_layer; >- >- if (AheadOrAt(last_decoded_frame_timestamp_, frame->Timestamp()) && >- !frame_is_higher_spatial_layer_of_last_decoded_frame) { >- // TODO(brandtr): Consider clearing the entire buffer when we hit >- // these conditions. 
>- RTC_LOG(LS_WARNING) >- << "Frame with (timestamp:picture_id:spatial_id) (" >- << frame->Timestamp() << ":" << frame->id.picture_id << ":" >- << static_cast<int>(frame->id.spatial_layer) << ")" >- << " sent to decoder after frame with" >- << " (timestamp:picture_id:spatial_id) (" >- << last_decoded_frame_timestamp_ << ":" >- << last_decoded_frame_key.picture_id << ":" >- << static_cast<int>(last_decoded_frame_key.spatial_layer) << ")."; >- } >- } >- > AdvanceLastDecodedFrame(next_frame_it_); > last_decoded_frame_timestamp_ = frame->Timestamp(); > *frame_out = std::move(frame); >@@ -267,6 +247,11 @@ void FrameBuffer::Stop() { > new_continuous_frame_event_.Set(); > } > >+void FrameBuffer::Clear() { >+ rtc::CritScope lock(&crit_); >+ ClearFramesAndHistory(); >+} >+ > void FrameBuffer::UpdateRtt(int64_t rtt_ms) { > rtc::CritScope lock(&crit_); > jitter_estimator_->UpdateRtt(rtt_ms); >@@ -348,7 +333,7 @@ int64_t FrameBuffer::InsertFrame(std::unique_ptr<EncodedFrame> frame) { > > if (last_decoded_frame_it_ != frames_.end() && > id <= last_decoded_frame_it_->first) { >- if (AheadOf(frame->Timestamp(), last_decoded_frame_timestamp_) && >+ if (AheadOf(frame->Timestamp(), *last_decoded_frame_timestamp_) && > frame->is_keyframe()) { > // If this frame has a newer timestamp but an earlier picture id then we > // assume there has been a jump in the picture id due to some encoder >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/frame_buffer2.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/frame_buffer2.h >index ae84fcf9e6c293810d44432ef3a009db41eef74e..dc5e5a2e372fdc10ac80ac6ee71f769c8d86c983 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/frame_buffer2.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/frame_buffer2.h >@@ -79,6 +79,9 @@ class FrameBuffer { > // Updates the RTT for jitter buffer estimation. 
> void UpdateRtt(int64_t rtt_ms); > >+ // Clears the FrameBuffer, removing all the buffered frames. >+ void Clear(); >+ > private: > struct FrameInfo { > FrameInfo(); >@@ -161,7 +164,7 @@ class FrameBuffer { > VCMJitterEstimator* const jitter_estimator_ RTC_GUARDED_BY(crit_); > VCMTiming* const timing_ RTC_GUARDED_BY(crit_); > VCMInterFrameDelay inter_frame_delay_ RTC_GUARDED_BY(crit_); >- uint32_t last_decoded_frame_timestamp_ RTC_GUARDED_BY(crit_); >+ absl::optional<uint32_t> last_decoded_frame_timestamp_ RTC_GUARDED_BY(crit_); > FrameMap::iterator last_decoded_frame_it_ RTC_GUARDED_BY(crit_); > FrameMap::iterator last_continuous_frame_it_ RTC_GUARDED_BY(crit_); > FrameMap::iterator next_frame_it_ RTC_GUARDED_BY(crit_); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/frame_buffer2_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/frame_buffer2_unittest.cc >index 357ba86540551f992aed561c5ff9138532de02d3..e10f78527957e77c075a0f0827b37d6d7bf4caf5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/frame_buffer2_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/frame_buffer2_unittest.cc >@@ -137,9 +137,7 @@ class TestFrameBuffer2 : public ::testing::Test { > &stats_callback_)), > rand_(0x34678213), > tear_down_(false), >- extract_thread_(&ExtractLoop, this, "Extract Thread"), >- trigger_extract_event_(false, false), >- crit_acquired_event_(false, false) {} >+ extract_thread_(&ExtractLoop, this, "Extract Thread") {} > > void SetUp() override { extract_thread_.Start(); } > >@@ -600,5 +598,21 @@ TEST_F(TestFrameBuffer2, DontUpdateOnUndecodableFrame) { > ExtractFrame(0, true); > } > >+TEST_F(TestFrameBuffer2, DontDecodeOlderTimestamp) { >+ InsertFrame(2, 0, 1, false); >+ InsertFrame(1, 0, 2, false); // Older picture id but newer timestamp. 
>+ ExtractFrame(0); >+ ExtractFrame(0); >+ CheckFrame(0, 1, 0); >+ CheckNoFrame(1); >+ >+ InsertFrame(3, 0, 4, false); >+ InsertFrame(4, 0, 3, false); // Newer picture id but older timestamp. >+ ExtractFrame(0); >+ ExtractFrame(0); >+ CheckFrame(2, 3, 0); >+ CheckNoFrame(3); >+} >+ > } // namespace video_coding > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/generic_encoder.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/generic_encoder.cc >index b1d7c28abd24733738f6087f8c9393aa05fee0b3..d5cc0a8ba068dcfdfef5eaf24d78911156ebb9d3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/generic_encoder.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/generic_encoder.cc >@@ -101,26 +101,14 @@ int32_t VCMGenericEncoder::Encode(const VideoFrame& frame, > > void VCMGenericEncoder::SetEncoderParameters(const EncoderParameters& params) { > RTC_DCHECK_RUNS_SERIALIZED(&race_checker_); >- bool channel_parameters_have_changed; > bool rates_have_changed; > { > rtc::CritScope lock(¶ms_lock_); >- channel_parameters_have_changed = >- params.loss_rate != encoder_params_.loss_rate || >- params.rtt != encoder_params_.rtt; > rates_have_changed = > params.target_bitrate != encoder_params_.target_bitrate || > params.input_frame_rate != encoder_params_.input_frame_rate; > encoder_params_ = params; > } >- if (channel_parameters_have_changed) { >- int res = encoder_->SetChannelParameters(params.loss_rate, params.rtt); >- if (res != 0) { >- RTC_LOG(LS_WARNING) << "Error set encoder parameters (loss = " >- << params.loss_rate << ", rtt = " << params.rtt >- << "): " << res; >- } >- } > if (rates_have_changed) { > int res = encoder_->SetRateAllocation(params.target_bitrate, > params.input_frame_rate); >@@ -164,9 +152,9 @@ bool VCMGenericEncoder::InternalSource() const { > return internal_source_; > } > >-bool VCMGenericEncoder::SupportsNativeHandle() const { 
>+VideoEncoder::EncoderInfo VCMGenericEncoder::GetEncoderInfo() const { > RTC_DCHECK_RUNS_SERIALIZED(&race_checker_); >- return encoder_->SupportsNativeHandle(); >+ return encoder_->GetEncoderInfo(); > } > > VCMEncodedFrameCallback::VCMEncodedFrameCallback( >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/generic_encoder.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/generic_encoder.h >index 151e93e49759796095aedfeef9a40d13b410ad9c..2f841b34dd815d2995e48b4ac96813b0f0874dd8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/generic_encoder.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/generic_encoder.h >@@ -146,7 +146,7 @@ class VCMGenericEncoder { > > int32_t RequestFrame(const std::vector<FrameType>& frame_types); > bool InternalSource() const; >- bool SupportsNativeHandle() const; >+ VideoEncoder::EncoderInfo GetEncoderInfo() const; > > private: > rtc::RaceChecker race_checker_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/include/mock/mock_video_codec_interface.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/include/mock/mock_video_codec_interface.h >index c29efeead68d4630c3917b610ba35182ab14493c..7d003591ea3d95cc7ea744103da39d4267b710d5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/include/mock/mock_video_codec_interface.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/include/mock/mock_video_codec_interface.h >@@ -19,6 +19,8 @@ > > namespace webrtc { > >+// NOTE: Deprecated file, include api/mock_video_(encoder|decoder).h instead. 
>+ > class MockEncodedImageCallback : public EncodedImageCallback { > public: > MOCK_METHOD3(OnEncodedImage, >@@ -42,7 +44,6 @@ class MockVideoEncoder : public VideoEncoder { > int32_t(EncodedImageCallback* callback)); > MOCK_METHOD0(Release, int32_t()); > MOCK_METHOD0(Reset, int32_t()); >- MOCK_METHOD2(SetChannelParameters, int32_t(uint32_t packetLoss, int64_t rtt)); > MOCK_METHOD2(SetRates, int32_t(uint32_t newBitRate, uint32_t frameRate)); > MOCK_METHOD2(SetRateAllocation, > int32_t(const VideoBitrateAllocation& newBitRate, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/include/video_codec_initializer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/include/video_codec_initializer.h >index ee70810f62ed73d358491b17216ca3210713b7b9..e979f9c86776e2b9edbe746bd6d95823dbc923d8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/include/video_codec_initializer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/include/video_codec_initializer.h >@@ -30,15 +30,9 @@ class VideoCodecInitializer { > // type used. For instance, VP8 will create an allocator than can handle > // simulcast and temporal layering. > // GetBitrateAllocator is called implicitly from here, no need to call again. >- static bool SetupCodec( >- const VideoEncoderConfig& config, >- const std::vector<VideoStream>& streams, >- VideoCodec* codec, >- std::unique_ptr<VideoBitrateAllocator>* bitrate_allocator); >- >- // Create a bitrate allocator for the specified codec. 
>- static std::unique_ptr<VideoBitrateAllocator> CreateBitrateAllocator( >- const VideoCodec& codec); >+ static bool SetupCodec(const VideoEncoderConfig& config, >+ const std::vector<VideoStream>& streams, >+ VideoCodec* codec); > > private: > static VideoCodec VideoEncoderConfigToVideoCodec( >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/include/video_codec_interface.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/include/video_codec_interface.h >index d2badf9201b21ee4aa326386d201c49541a19b4e..38583bfa3b410c6285b32cf2b9654b7e076f2f8d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/include/video_codec_interface.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/include/video_codec_interface.h >@@ -79,6 +79,7 @@ struct CodecSpecificInfo { > memset(&codecSpecific, 0, sizeof(codecSpecific)); > } > VideoCodecType codecType; >+ // |codec_name| is deprecated, use name provided by VideoEncoder instead. 
> const char* codec_name; > CodecSpecificInfoUnion codecSpecific; > }; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/include/video_coding.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/include/video_coding.h >index 8ef046aa23193f7f30f2b1ecded11ff73c834d65..0889b231dea347cce047c0547dbea7ce80534120 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/include/video_coding.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/include/video_coding.h >@@ -27,6 +27,7 @@ > #include "modules/include/module.h" > #include "modules/include/module_common_types.h" > #include "modules/video_coding/include/video_coding_defines.h" >+#include "rtc_base/deprecation.h" > #include "system_wrappers/include/event_wrapper.h" > > namespace webrtc { >@@ -37,20 +38,6 @@ class VideoDecoder; > class VideoEncoder; > struct CodecSpecificInfo; > >-class EventFactory { >- public: >- virtual ~EventFactory() {} >- >- virtual EventWrapper* CreateEvent() = 0; >-}; >- >-class EventFactoryImpl : public EventFactory { >- public: >- ~EventFactoryImpl() override {} >- >- EventWrapper* CreateEvent() override; >-}; >- > // Used to indicate which decode with errors mode should be used. > enum VCMDecodeErrorMode { > kNoErrors, // Never decode with errors. Video will freeze >@@ -69,7 +56,7 @@ class VideoCodingModule : public Module { > enum SenderNackMode { kNackNone, kNackAll, kNackSelective }; > > // DEPRECATED. 
>- static VideoCodingModule* Create(Clock* clock, EventFactory* event_factory); >+ static VideoCodingModule* Create(Clock* clock); > > /* > * Sender >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/include/video_coding_defines.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/include/video_coding_defines.h >index 7fb6f58364773f8fa818396e1cb7f684f8edb06f..7bbc98fd4831a8e28a440cee9ad27f6826960c15 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/include/video_coding_defines.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/include/video_coding_defines.h >@@ -15,8 +15,6 @@ > #include <vector> > > #include "api/video/video_frame.h" >-// For EncodedImage >-#include "common_video/include/video_frame.h" > #include "modules/include/module_common_types.h" > > namespace webrtc { >@@ -43,7 +41,7 @@ enum { > // |kDefaultTimingFramesDelayMs|, or if the frame is at least > // |kDefaultOutliserFrameSizePercent| in size of average frame. > kDefaultTimingFramesDelayMs = 200, >- kDefaultOutlierFrameSizePercent = 250, >+ kDefaultOutlierFrameSizePercent = 500, > // Maximum number of frames for what we store encode start timing information. 
> kMaxEncodeStartTimeListSize = 50, > }; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/jitter_buffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/jitter_buffer.h >index f649adc8b3ab359304ad1c5ccbce3b246f884800..482627604553cd321222f2bda84d6ce5d9759044 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/jitter_buffer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/jitter_buffer.h >@@ -35,7 +35,6 @@ enum VCMNackMode { kNack, kNoNack }; > > // forward declarations > class Clock; >-class EventFactory; > class EventWrapper; > class VCMFrameBuffer; > class VCMPacket; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/jitter_buffer_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/jitter_buffer_unittest.cc >index 6ae65d62e986c728cad98d47d4eacbbb1ecb6d0c..3ed18a938650b7be0dcdc6539ecebd261a0bee5d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/jitter_buffer_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/jitter_buffer_unittest.cc >@@ -20,7 +20,6 @@ > #include "modules/video_coding/media_opt_util.h" > #include "modules/video_coding/packet.h" > #include "modules/video_coding/test/stream_generator.h" >-#include "modules/video_coding/test/test_util.h" > #include "rtc_base/location.h" > #include "system_wrappers/include/clock.h" > #include "system_wrappers/include/field_trial.h" >@@ -220,9 +219,7 @@ class TestBasicJitterBuffer : public ::testing::TestWithParam<std::string>, > void SetUp() override { > clock_.reset(new SimulatedClock(0)); > jitter_buffer_.reset(new VCMJitterBuffer( >- clock_.get(), >- std::unique_ptr<EventWrapper>(event_factory_.CreateEvent()), this, >- this)); >+ clock_.get(), absl::WrapUnique(EventWrapper::Create()), this, this)); > jitter_buffer_->Start(); > seq_num_ = 1234; > timestamp_ = 0; >@@ -313,7 +310,6 @@ class TestBasicJitterBuffer : 
public ::testing::TestWithParam<std::string>, > uint8_t data_[1500]; > std::unique_ptr<VCMPacket> packet_; > std::unique_ptr<SimulatedClock> clock_; >- NullEventFactory event_factory_; > std::unique_ptr<VCMJitterBuffer> jitter_buffer_; > }; > >@@ -339,9 +335,7 @@ class TestRunningJitterBuffer : public ::testing::TestWithParam<std::string>, > max_nack_list_size_ = 150; > oldest_packet_to_nack_ = 250; > jitter_buffer_ = new VCMJitterBuffer( >- clock_.get(), >- std::unique_ptr<EventWrapper>(event_factory_.CreateEvent()), this, >- this); >+ clock_.get(), absl::WrapUnique(EventWrapper::Create()), this, this); > stream_generator_ = new StreamGenerator(0, clock_->TimeInMilliseconds()); > jitter_buffer_->Start(); > jitter_buffer_->SetNackSettings(max_nack_list_size_, oldest_packet_to_nack_, >@@ -433,7 +427,6 @@ class TestRunningJitterBuffer : public ::testing::TestWithParam<std::string>, > VCMJitterBuffer* jitter_buffer_; > StreamGenerator* stream_generator_; > std::unique_ptr<SimulatedClock> clock_; >- NullEventFactory event_factory_; > size_t max_nack_list_size_; > int oldest_packet_to_nack_; > uint8_t data_buffer_[kDataBufferSize]; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/jitter_estimator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/jitter_estimator.cc >index 5e754f10885edc6655e2e7152a447bed4038597b..0d0fac7291e8fddd41dc6b355a935f6496d164c1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/jitter_estimator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/jitter_estimator.cc >@@ -14,18 +14,24 @@ > #include <math.h> > #include <stdlib.h> > #include <string.h> >+#include <algorithm> > #include <string> > >+#include "absl/types/optional.h" > #include "modules/video_coding/internal_defines.h" > #include "modules/video_coding/rtt_filter.h" >+#include "rtc_base/experiments/jitter_upper_bound_experiment.h" > #include "system_wrappers/include/clock.h" > #include 
"system_wrappers/include/field_trial.h" > > namespace webrtc { >- >-enum { kStartupDelaySamples = 30 }; >-enum { kFsAccuStartupSamples = 5 }; >-enum { kMaxFramerateEstimate = 200 }; >+namespace { >+static constexpr uint32_t kStartupDelaySamples = 30; >+static constexpr int64_t kFsAccuStartupSamples = 5; >+static constexpr double kMaxFramerateEstimate = 200.0; >+static constexpr int64_t kNackCountTimeoutMs = 60000; >+static constexpr double kDefaultMaxTimestampDeviationInSigmas = 3.5; >+} // namespace > > VCMJitterEstimator::VCMJitterEstimator(const Clock* clock, > int32_t vcmId, >@@ -45,7 +51,9 @@ VCMJitterEstimator::VCMJitterEstimator(const Clock* clock, > _rttFilter(), > fps_counter_(30), // TODO(sprang): Use an estimator with limit based on > // time, rather than number of samples. >- low_rate_experiment_(kInit), >+ time_deviation_upper_bound_( >+ JitterUpperBoundExperiment::GetUpperBoundSigmas().value_or( >+ kDefaultMaxTimestampDeviationInSigmas)), > clock_(clock) { > Reset(); > } >@@ -75,6 +83,7 @@ VCMJitterEstimator& VCMJitterEstimator::operator=( > _latestNackTimestamp = rhs._latestNackTimestamp; > _nackCount = rhs._nackCount; > _rttFilter = rhs._rttFilter; >+ clock_ = rhs.clock_; > } > return *this; > } >@@ -102,6 +111,7 @@ void VCMJitterEstimator::Reset() { > _filterJitterEstimate = 0.0; > _latestNackTimestamp = 0; > _nackCount = 0; >+ _latestNackTimestamp = 0; > _fsSum = 0; > _fsCount = 0; > _startupCount = 0; >@@ -155,6 +165,12 @@ void VCMJitterEstimator::UpdateEstimate(int64_t frameDelayMS, > } > _prevFrameSize = frameSizeBytes; > >+ // Cap frameDelayMS based on the current time deviation noise. >+ int64_t max_time_deviation_ms = >+ static_cast<int64_t>(time_deviation_upper_bound_ * sqrt(_varNoise) + 0.5); >+ frameDelayMS = std::max(std::min(frameDelayMS, max_time_deviation_ms), >+ -max_time_deviation_ms); >+ > // Only update the Kalman filter if the sample is not considered > // an extreme outlier. 
Even if it is an extreme outlier from a > // delay point of view, if the frame size also is large the >@@ -193,15 +209,10 @@ void VCMJitterEstimator::UpdateEstimate(int64_t frameDelayMS, > > // Updates the nack/packet ratio > void VCMJitterEstimator::FrameNacked() { >- // Wait until _nackLimit retransmissions has been received, >- // then always add ~1 RTT delay. >- // TODO(holmer): Should we ever remove the additional delay if the >- // the packet losses seem to have stopped? We could for instance scale >- // the number of RTTs to add with the amount of retransmissions in a given >- // time interval, or similar. > if (_nackCount < _nackLimit) { > _nackCount++; > } >+ _latestNackTimestamp = clock_->TimeInMicroseconds(); > } > > // Updates Kalman estimate of the channel >@@ -309,22 +320,20 @@ void VCMJitterEstimator::EstimateRandomJitter(double d_dT, > if (_alphaCount > _alphaCountMax) > _alphaCount = _alphaCountMax; > >- if (LowRateExperimentEnabled()) { >- // In order to avoid a low frame rate stream to react slower to changes, >- // scale the alpha weight relative a 30 fps stream. >- double fps = GetFrameRate(); >- if (fps > 0.0) { >- double rate_scale = 30.0 / fps; >- // At startup, there can be a lot of noise in the fps estimate. >- // Interpolate rate_scale linearly, from 1.0 at sample #1, to 30.0 / fps >- // at sample #kStartupDelaySamples. >- if (_alphaCount < kStartupDelaySamples) { >- rate_scale = >- (_alphaCount * rate_scale + (kStartupDelaySamples - _alphaCount)) / >- kStartupDelaySamples; >- } >- alpha = pow(alpha, rate_scale); >+ // In order to avoid a low frame rate stream to react slower to changes, >+ // scale the alpha weight relative a 30 fps stream. >+ double fps = GetFrameRate(); >+ if (fps > 0.0) { >+ double rate_scale = 30.0 / fps; >+ // At startup, there can be a lot of noise in the fps estimate. >+ // Interpolate rate_scale linearly, from 1.0 at sample #1, to 30.0 / fps >+ // at sample #kStartupDelaySamples. 
>+ if (_alphaCount < kStartupDelaySamples) { >+ rate_scale = >+ (_alphaCount * rate_scale + (kStartupDelaySamples - _alphaCount)) / >+ kStartupDelaySamples; > } >+ alpha = pow(alpha, rate_scale); > } > > double avgNoise = alpha * _avgNoise + (1 - alpha) * d_dT; >@@ -386,46 +395,35 @@ void VCMJitterEstimator::UpdateMaxFrameSize(uint32_t frameSizeBytes) { > // otherwise tries to calculate an estimate. > int VCMJitterEstimator::GetJitterEstimate(double rttMultiplier) { > double jitterMS = CalculateEstimate() + OPERATING_SYSTEM_JITTER; >+ uint64_t now = clock_->TimeInMicroseconds(); >+ >+ if (now - _latestNackTimestamp > kNackCountTimeoutMs * 1000) >+ _nackCount = 0; >+ > if (_filterJitterEstimate > jitterMS) > jitterMS = _filterJitterEstimate; > if (_nackCount >= _nackLimit) > jitterMS += _rttFilter.RttMs() * rttMultiplier; > >- if (LowRateExperimentEnabled()) { >- static const double kJitterScaleLowThreshold = 5.0; >- static const double kJitterScaleHighThreshold = 10.0; >- double fps = GetFrameRate(); >- // Ignore jitter for very low fps streams. >- if (fps < kJitterScaleLowThreshold) { >- if (fps == 0.0) { >- return jitterMS; >- } >- return 0; >- } >- >- // Semi-low frame rate; scale by factor linearly interpolated from 0.0 at >- // kJitterScaleLowThreshold to 1.0 at kJitterScaleHighThreshold. >- if (fps < kJitterScaleHighThreshold) { >- jitterMS = >- (1.0 / (kJitterScaleHighThreshold - kJitterScaleLowThreshold)) * >- (fps - kJitterScaleLowThreshold) * jitterMS; >+ static const double kJitterScaleLowThreshold = 5.0; >+ static const double kJitterScaleHighThreshold = 10.0; >+ double fps = GetFrameRate(); >+ // Ignore jitter for very low fps streams. 
>+ if (fps < kJitterScaleLowThreshold) { >+ if (fps == 0.0) { >+ return rtc::checked_cast<int>(std::max(0.0, jitterMS) + 0.5); > } >+ return 0; > } > >- return static_cast<uint32_t>(jitterMS + 0.5); >-} >- >-bool VCMJitterEstimator::LowRateExperimentEnabled() { >- if (low_rate_experiment_ == kInit) { >- std::string group = >- webrtc::field_trial::FindFullName("WebRTC-ReducedJitterDelay"); >- if (group == "Disabled") { >- low_rate_experiment_ = kDisabled; >- } else { >- low_rate_experiment_ = kEnabled; >- } >+ // Semi-low frame rate; scale by factor linearly interpolated from 0.0 at >+ // kJitterScaleLowThreshold to 1.0 at kJitterScaleHighThreshold. >+ if (fps < kJitterScaleHighThreshold) { >+ jitterMS = (1.0 / (kJitterScaleHighThreshold - kJitterScaleLowThreshold)) * >+ (fps - kJitterScaleLowThreshold) * jitterMS; > } >- return low_rate_experiment_ == kEnabled ? true : false; >+ >+ return rtc::checked_cast<int>(std::max(0.0, jitterMS) + 0.5); > } > > double VCMJitterEstimator::GetFrameRate() const { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/jitter_estimator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/jitter_estimator.h >index aeba3eded33cd157c54ce75ca31d1fcf43386c0d..56b532851d939ce8936606dc62cfccce4b71871e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/jitter_estimator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/jitter_estimator.h >@@ -72,8 +72,6 @@ class VCMJitterEstimator { > double _theta[2]; // Estimated line parameters (slope, offset) > double _varNoise; // Variance of the time-deviation from the line > >- virtual bool LowRateExperimentEnabled(); >- > private: > // Updates the Kalman filter for the line describing > // the frame size dependent jitter. 
>@@ -159,8 +157,7 @@ class VCMJitterEstimator { > VCMRttFilter _rttFilter; > > rtc::RollingAccumulator<uint64_t> fps_counter_; >- enum ExperimentFlag { kInit, kEnabled, kDisabled }; >- ExperimentFlag low_rate_experiment_; >+ const double time_deviation_upper_bound_; > const Clock* clock_; > }; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/jitter_estimator_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/jitter_estimator_tests.cc >index f00e1ef6da8b64565972c8cdacd7493ed2422fb7..45262ef83602e7c15e2a9cf3b9e8066f35a92764 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/jitter_estimator_tests.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/jitter_estimator_tests.cc >@@ -9,38 +9,30 @@ > > #include "modules/video_coding/jitter_estimator.h" > >+#include "rtc_base/experiments/jitter_upper_bound_experiment.h" >+#include "rtc_base/logging.h" >+#include "rtc_base/numerics/histogram_percentile_counter.h" >+#include "rtc_base/timeutils.h" > #include "system_wrappers/include/clock.h" >+#include "test/field_trial.h" > #include "test/gtest.h" > > namespace webrtc { > >-class TestEstimator : public VCMJitterEstimator { >- public: >- explicit TestEstimator(bool exp_enabled) >- : VCMJitterEstimator(&fake_clock_, 0, 0), >- fake_clock_(0), >- exp_enabled_(exp_enabled) {} >+class TestVCMJitterEstimator : public ::testing::Test { >+ protected: >+ TestVCMJitterEstimator() : fake_clock_(0) {} > >- virtual bool LowRateExperimentEnabled() { return exp_enabled_; } >+ virtual void SetUp() { >+ estimator_ = absl::make_unique<VCMJitterEstimator>(&fake_clock_, 0, 0); >+ } > > void AdvanceClock(int64_t microseconds) { > fake_clock_.AdvanceTimeMicroseconds(microseconds); > } > >- private: > SimulatedClock fake_clock_; >- const bool exp_enabled_; >-}; >- >-class TestVCMJitterEstimator : public ::testing::Test { >- protected: >- TestVCMJitterEstimator() >- : regular_estimator_(false), 
low_rate_estimator_(true) {} >- >- virtual void SetUp() { regular_estimator_.Reset(); } >- >- TestEstimator regular_estimator_; >- TestEstimator low_rate_estimator_; >+ std::unique_ptr<VCMJitterEstimator> estimator_; > }; > > // Generates some simple test data in the form of a sawtooth wave. >@@ -64,98 +56,56 @@ class ValueGenerator { > // 5 fps, disable jitter delay altogether. > TEST_F(TestVCMJitterEstimator, TestLowRate) { > ValueGenerator gen(10); >- uint64_t time_delta = 1000000 / 5; >+ uint64_t time_delta_us = rtc::kNumMicrosecsPerSec / 5; > for (int i = 0; i < 60; ++i) { >- regular_estimator_.UpdateEstimate(gen.Delay(), gen.FrameSize()); >- regular_estimator_.AdvanceClock(time_delta); >- low_rate_estimator_.UpdateEstimate(gen.Delay(), gen.FrameSize()); >- low_rate_estimator_.AdvanceClock(time_delta); >- EXPECT_GT(regular_estimator_.GetJitterEstimate(0), 0); >+ estimator_->UpdateEstimate(gen.Delay(), gen.FrameSize()); >+ AdvanceClock(time_delta_us); > if (i > 2) >- EXPECT_EQ(low_rate_estimator_.GetJitterEstimate(0), 0); >- gen.Advance(); >- } >-} >- >-// 8 fps, steady state estimate should be in interpolated interval between 0 >-// and value of previous method. >-TEST_F(TestVCMJitterEstimator, TestMidRate) { >- ValueGenerator gen(10); >- uint64_t time_delta = 1000000 / 8; >- for (int i = 0; i < 60; ++i) { >- regular_estimator_.UpdateEstimate(gen.Delay(), gen.FrameSize()); >- regular_estimator_.AdvanceClock(time_delta); >- low_rate_estimator_.UpdateEstimate(gen.Delay(), gen.FrameSize()); >- low_rate_estimator_.AdvanceClock(time_delta); >- EXPECT_GT(regular_estimator_.GetJitterEstimate(0), 0); >- EXPECT_GT(low_rate_estimator_.GetJitterEstimate(0), 0); >- EXPECT_GE(regular_estimator_.GetJitterEstimate(0), >- low_rate_estimator_.GetJitterEstimate(0)); >+ EXPECT_EQ(estimator_->GetJitterEstimate(0), 0); > gen.Advance(); > } > } > >-// 30 fps, steady state estimate should be same as previous method. 
>-TEST_F(TestVCMJitterEstimator, TestHighRate) { >- ValueGenerator gen(10); >- uint64_t time_delta = 1000000 / 30; >- for (int i = 0; i < 60; ++i) { >- regular_estimator_.UpdateEstimate(gen.Delay(), gen.FrameSize()); >- regular_estimator_.AdvanceClock(time_delta); >- low_rate_estimator_.UpdateEstimate(gen.Delay(), gen.FrameSize()); >- low_rate_estimator_.AdvanceClock(time_delta); >- EXPECT_EQ(regular_estimator_.GetJitterEstimate(0), >- low_rate_estimator_.GetJitterEstimate(0)); >- gen.Advance(); >- } >-} >- >-// 10 fps, high jitter then low jitter. Low rate estimator should converge >-// faster to low noise estimate. >-TEST_F(TestVCMJitterEstimator, TestConvergence) { >- // Reach a steady state with high noise. >- ValueGenerator gen(50); >- uint64_t time_delta = 1000000 / 10; >- for (int i = 0; i < 100; ++i) { >- regular_estimator_.UpdateEstimate(gen.Delay(), gen.FrameSize()); >- regular_estimator_.AdvanceClock(time_delta * 2); >- low_rate_estimator_.UpdateEstimate(gen.Delay(), gen.FrameSize()); >- low_rate_estimator_.AdvanceClock(time_delta * 2); >- gen.Advance(); >- } >- >- int threshold = regular_estimator_.GetJitterEstimate(0) / 2; >- >- // New generator with zero noise. >- ValueGenerator low_gen(0); >- int regular_iterations = 0; >- int low_rate_iterations = 0; >- for (int i = 0; i < 500; ++i) { >- if (regular_iterations == 0) { >- regular_estimator_.UpdateEstimate(low_gen.Delay(), low_gen.FrameSize()); >- regular_estimator_.AdvanceClock(time_delta); >- if (regular_estimator_.GetJitterEstimate(0) < threshold) { >- regular_iterations = i; >- } >+TEST_F(TestVCMJitterEstimator, TestUpperBound) { >+ struct TestContext { >+ TestContext() : upper_bound(0.0), percentiles(1000) {} >+ double upper_bound; >+ rtc::HistogramPercentileCounter percentiles; >+ }; >+ std::vector<TestContext> test_cases(2); >+ >+ test_cases[0].upper_bound = 100.0; // First use essentially no cap. >+ test_cases[1].upper_bound = 3.5; // Second, reasonably small cap. 
>+ >+ for (TestContext& context : test_cases) { >+ // Set up field trial and reset jitter estimator. >+ char string_buf[64]; >+ rtc::SimpleStringBuilder ssb(string_buf); >+ ssb << JitterUpperBoundExperiment::kJitterUpperBoundExperimentName >+ << "/Enabled-" << context.upper_bound << "/"; >+ test::ScopedFieldTrials field_trials(ssb.str()); >+ SetUp(); >+ >+ ValueGenerator gen(50); >+ uint64_t time_delta_us = rtc::kNumMicrosecsPerSec / 30; >+ for (int i = 0; i < 100; ++i) { >+ estimator_->UpdateEstimate(gen.Delay(), gen.FrameSize()); >+ AdvanceClock(time_delta_us); >+ context.percentiles.Add( >+ static_cast<uint32_t>(estimator_->GetJitterEstimate(0))); >+ gen.Advance(); > } >- >- if (low_rate_iterations == 0) { >- low_rate_estimator_.UpdateEstimate(low_gen.Delay(), low_gen.FrameSize()); >- low_rate_estimator_.AdvanceClock(time_delta); >- if (low_rate_estimator_.GetJitterEstimate(0) < threshold) { >- low_rate_iterations = i; >- } >- } >- >- if (regular_iterations != 0 && low_rate_iterations != 0) { >- break; >- } >- >- gen.Advance(); > } > >- EXPECT_NE(regular_iterations, 0); >- EXPECT_NE(low_rate_iterations, 0); >- EXPECT_LE(low_rate_iterations, regular_iterations); >+ // Median should be similar after three seconds. Allow 5% error margin. >+ uint32_t median_unbound = *test_cases[0].percentiles.GetPercentile(0.5); >+ uint32_t median_bounded = *test_cases[1].percentiles.GetPercentile(0.5); >+ EXPECT_NEAR(median_unbound, median_bounded, (median_unbound * 5) / 100); >+ >+ // Max should be lower for the bounded case. 
>+ uint32_t max_unbound = *test_cases[0].percentiles.GetPercentile(1.0); >+ uint32_t max_bounded = *test_cases[1].percentiles.GetPercentile(1.0); >+ EXPECT_GT(max_unbound, static_cast<uint32_t>(max_bounded * 1.25)); > } >+ > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/nack_module.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/nack_module.cc >index 7fbcf34b41a91373bd7861ae0f5dfe3e8e1f7fd1..0c69b57749349ffafce1d0877e5fd50704a71071 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/nack_module.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/nack_module.cc >@@ -16,6 +16,7 @@ > #include "modules/utility/include/process_thread.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" >+#include "system_wrappers/include/field_trial.h" > > namespace webrtc { > >@@ -28,14 +29,29 @@ const int kProcessFrequency = 50; > const int kProcessIntervalMs = 1000 / kProcessFrequency; > const int kMaxReorderedPackets = 128; > const int kNumReorderingBuckets = 10; >+const int kDefaultSendNackDelayMs = 0; >+ >+int64_t GetSendNackDelay() { >+ int64_t delay_ms = strtol( >+ webrtc::field_trial::FindFullName("WebRTC-SendNackDelayMs").c_str(), >+ nullptr, 10); >+ if (delay_ms > 0 && delay_ms <= 20) { >+ RTC_LOG(LS_INFO) << "SendNackDelay is set to " << delay_ms; >+ return delay_ms; >+ } >+ return kDefaultSendNackDelayMs; >+} > } // namespace > > NackModule::NackInfo::NackInfo() > : seq_num(0), send_at_seq_num(0), sent_at_time(-1), retries(0) {} > >-NackModule::NackInfo::NackInfo(uint16_t seq_num, uint16_t send_at_seq_num) >+NackModule::NackInfo::NackInfo(uint16_t seq_num, >+ uint16_t send_at_seq_num, >+ int64_t created_at_time) > : seq_num(seq_num), > send_at_seq_num(send_at_seq_num), >+ created_at_time(created_at_time), > sent_at_time(-1), > retries(0) {} > >@@ -49,13 +65,20 @@ NackModule::NackModule(Clock* clock, > initialized_(false), > 
rtt_ms_(kDefaultRttMs), > newest_seq_num_(0), >- next_process_time_ms_(-1) { >+ next_process_time_ms_(-1), >+ send_nack_delay_ms_(GetSendNackDelay()) { > RTC_DCHECK(clock_); > RTC_DCHECK(nack_sender_); > RTC_DCHECK(keyframe_request_sender_); > } > > int NackModule::OnReceivedPacket(uint16_t seq_num, bool is_keyframe) { >+ return OnReceivedPacket(seq_num, is_keyframe, false); >+} >+ >+int NackModule::OnReceivedPacket(uint16_t seq_num, >+ bool is_keyframe, >+ bool is_recovered) { > rtc::CritScope lock(&crit_); > // TODO(philipel): When the packet includes information whether it is > // retransmitted or not, use that value instead. For >@@ -88,8 +111,6 @@ int NackModule::OnReceivedPacket(uint16_t seq_num, bool is_keyframe) { > UpdateReorderingStatistics(seq_num); > return nacks_sent_for_packet; > } >- AddPacketsToNack(newest_seq_num_ + 1, seq_num); >- newest_seq_num_ = seq_num; > > // Keep track of new keyframes. > if (is_keyframe) >@@ -100,6 +121,21 @@ int NackModule::OnReceivedPacket(uint16_t seq_num, bool is_keyframe) { > if (it != keyframe_list_.begin()) > keyframe_list_.erase(keyframe_list_.begin(), it); > >+ if (is_recovered) { >+ recovered_list_.insert(seq_num); >+ >+ // Remove old ones so we don't accumulate recovered packets. >+ auto it = recovered_list_.lower_bound(seq_num - kMaxPacketAge); >+ if (it != recovered_list_.begin()) >+ recovered_list_.erase(recovered_list_.begin(), it); >+ >+ // Do not send nack for packets recovered by FEC or RTX. >+ return 0; >+ } >+ >+ AddPacketsToNack(newest_seq_num_ + 1, seq_num); >+ newest_seq_num_ = seq_num; >+ > // Are there any nacks that are waiting for this seq_num. 
> std::vector<uint16_t> nack_batch = GetNackBatch(kSeqNumOnly); > if (!nack_batch.empty()) >@@ -113,6 +149,8 @@ void NackModule::ClearUpTo(uint16_t seq_num) { > nack_list_.erase(nack_list_.begin(), nack_list_.lower_bound(seq_num)); > keyframe_list_.erase(keyframe_list_.begin(), > keyframe_list_.lower_bound(seq_num)); >+ recovered_list_.erase(recovered_list_.begin(), >+ recovered_list_.lower_bound(seq_num)); > } > > void NackModule::UpdateRtt(int64_t rtt_ms) { >@@ -124,6 +162,7 @@ void NackModule::Clear() { > rtc::CritScope lock(&crit_); > nack_list_.clear(); > keyframe_list_.clear(); >+ recovered_list_.clear(); > } > > int64_t NackModule::TimeUntilNextProcess() { >@@ -200,7 +239,11 @@ void NackModule::AddPacketsToNack(uint16_t seq_num_start, > } > > for (uint16_t seq_num = seq_num_start; seq_num != seq_num_end; ++seq_num) { >- NackInfo nack_info(seq_num, seq_num + WaitNumberOfPackets(0.5)); >+ // Do not send nack for packets that are already recovered by FEC or RTX >+ if (recovered_list_.find(seq_num) != recovered_list_.end()) >+ continue; >+ NackInfo nack_info(seq_num, seq_num + WaitNumberOfPackets(0.5), >+ clock_->TimeInMilliseconds()); > RTC_DCHECK(nack_list_.find(seq_num) == nack_list_.end()); > nack_list_[seq_num] = nack_info; > } >@@ -213,22 +256,14 @@ std::vector<uint16_t> NackModule::GetNackBatch(NackFilterOptions options) { > std::vector<uint16_t> nack_batch; > auto it = nack_list_.begin(); > while (it != nack_list_.end()) { >- if (consider_seq_num && it->second.sent_at_time == -1 && >- AheadOrAt(newest_seq_num_, it->second.send_at_seq_num)) { >- nack_batch.emplace_back(it->second.seq_num); >- ++it->second.retries; >- it->second.sent_at_time = now_ms; >- if (it->second.retries >= kMaxNackRetries) { >- RTC_LOG(LS_WARNING) << "Sequence number " << it->second.seq_num >- << " removed from NACK list due to max retries."; >- it = nack_list_.erase(it); >- } else { >- ++it; >- } >- continue; >- } >- >- if (consider_timestamp && it->second.sent_at_time + rtt_ms_ <= 
now_ms) { >+ bool delay_timed_out = >+ now_ms - it->second.created_at_time >= send_nack_delay_ms_; >+ bool nack_on_rtt_passed = now_ms - it->second.sent_at_time >= rtt_ms_; >+ bool nack_on_seq_num_passed = >+ it->second.sent_at_time == -1 && >+ AheadOrAt(newest_seq_num_, it->second.send_at_seq_num); >+ if (delay_timed_out && ((consider_seq_num && nack_on_seq_num_passed) || >+ (consider_timestamp && nack_on_rtt_passed))) { > nack_batch.emplace_back(it->second.seq_num); > ++it->second.retries; > it->second.sent_at_time = now_ms; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/nack_module.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/nack_module.h >index 44340e9d4ca476454ca7d0b3b5a687f57dd743e9..f86dd16db28ac30e0f4a136c04c5b53a9a9532bc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/nack_module.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/nack_module.h >@@ -32,6 +32,8 @@ class NackModule : public Module { > KeyFrameRequestSender* keyframe_request_sender); > > int OnReceivedPacket(uint16_t seq_num, bool is_keyframe); >+ int OnReceivedPacket(uint16_t seq_num, bool is_keyframe, bool is_recovered); >+ > void ClearUpTo(uint16_t seq_num); > void UpdateRtt(int64_t rtt_ms); > void Clear(); >@@ -50,10 +52,13 @@ class NackModule : public Module { > // we have tried to nack this packet. 
> struct NackInfo {
> NackInfo();
>- NackInfo(uint16_t seq_num, uint16_t send_at_seq_num);
>+ NackInfo(uint16_t seq_num,
>+ uint16_t send_at_seq_num,
>+ int64_t created_at_time);
> 
> uint16_t seq_num;
> uint16_t send_at_seq_num;
>+ int64_t created_at_time;
> int64_t sent_at_time;
> int retries;
> };
>@@ -87,6 +92,8 @@ class NackModule : public Module {
> RTC_GUARDED_BY(crit_);
> std::set<uint16_t, DescendingSeqNumComp<uint16_t>> keyframe_list_
> RTC_GUARDED_BY(crit_);
>+ std::set<uint16_t, DescendingSeqNumComp<uint16_t>> recovered_list_
>+ RTC_GUARDED_BY(crit_);
> video_coding::Histogram reordering_histogram_ RTC_GUARDED_BY(crit_);
> bool initialized_ RTC_GUARDED_BY(crit_);
> int64_t rtt_ms_ RTC_GUARDED_BY(crit_);
>@@ -94,6 +101,9 @@ class NackModule : public Module {
> 
> // Only touched on the process thread.
> int64_t next_process_time_ms_;
>+
>+ // Adds a delay before sending a NACK for a received packet.
>+ const int64_t send_nack_delay_ms_;
> };
> 
> } // namespace webrtc
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/nack_module_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/nack_module_unittest.cc
>index 8a5f46bf7fba56872f259363287c03d9d3871608..8efb8ac1549f9d4b4cb798e45a9ba366597ed113 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/nack_module_unittest.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/nack_module_unittest.cc
>@@ -14,6 +14,7 @@
> #include "modules/video_coding/include/video_coding_defines.h"
> #include "modules/video_coding/nack_module.h"
> #include "system_wrappers/include/clock.h"
>+#include "test/field_trial.h"
> #include "test/gtest.h"
> 
> namespace webrtc {
>@@ -40,38 +41,38 @@ class TestNackModule : public ::testing::Test,
> };
> 
> TEST_F(TestNackModule, NackOnePacket) {
>- nack_module_.OnReceivedPacket(1, false);
>- nack_module_.OnReceivedPacket(3, false);
>+ nack_module_.OnReceivedPacket(1, false, false);
>+ 
nack_module_.OnReceivedPacket(3, false, false); > EXPECT_EQ(1u, sent_nacks_.size()); > EXPECT_EQ(2, sent_nacks_[0]); > } > > TEST_F(TestNackModule, WrappingSeqNum) { >- nack_module_.OnReceivedPacket(0xfffe, false); >- nack_module_.OnReceivedPacket(1, false); >+ nack_module_.OnReceivedPacket(0xfffe, false, false); >+ nack_module_.OnReceivedPacket(1, false, false); > EXPECT_EQ(2u, sent_nacks_.size()); > EXPECT_EQ(0xffff, sent_nacks_[0]); > EXPECT_EQ(0, sent_nacks_[1]); > } > > TEST_F(TestNackModule, WrappingSeqNumClearToKeyframe) { >- nack_module_.OnReceivedPacket(0xfffe, false); >- nack_module_.OnReceivedPacket(1, false); >+ nack_module_.OnReceivedPacket(0xfffe, false, false); >+ nack_module_.OnReceivedPacket(1, false, false); > EXPECT_EQ(2u, sent_nacks_.size()); > EXPECT_EQ(0xffff, sent_nacks_[0]); > EXPECT_EQ(0, sent_nacks_[1]); > > sent_nacks_.clear(); >- nack_module_.OnReceivedPacket(2, true); >+ nack_module_.OnReceivedPacket(2, true, false); > EXPECT_EQ(0u, sent_nacks_.size()); > >- nack_module_.OnReceivedPacket(501, true); >+ nack_module_.OnReceivedPacket(501, true, false); > EXPECT_EQ(498u, sent_nacks_.size()); > for (int seq_num = 3; seq_num < 501; ++seq_num) > EXPECT_EQ(seq_num, sent_nacks_[seq_num - 3]); > > sent_nacks_.clear(); >- nack_module_.OnReceivedPacket(1001, false); >+ nack_module_.OnReceivedPacket(1001, false, false); > EXPECT_EQ(499u, sent_nacks_.size()); > for (int seq_num = 502; seq_num < 1001; ++seq_num) > EXPECT_EQ(seq_num, sent_nacks_[seq_num - 502]); >@@ -91,7 +92,7 @@ TEST_F(TestNackModule, WrappingSeqNumClearToKeyframe) { > // It will then clear all nacks up to the next keyframe (seq num 2), > // thus removing 0xffff and 0 from the nack list. 
> sent_nacks_.clear(); >- nack_module_.OnReceivedPacket(1004, false); >+ nack_module_.OnReceivedPacket(1004, false, false); > EXPECT_EQ(2u, sent_nacks_.size()); > EXPECT_EQ(1002, sent_nacks_[0]); > EXPECT_EQ(1003, sent_nacks_[1]); >@@ -107,7 +108,7 @@ TEST_F(TestNackModule, WrappingSeqNumClearToKeyframe) { > > // Adding packet 1007 will cause the nack module to overflow again, thus > // clearing everything up to 501 which is the next keyframe. >- nack_module_.OnReceivedPacket(1007, false); >+ nack_module_.OnReceivedPacket(1007, false, false); > sent_nacks_.clear(); > clock_->AdvanceTimeMilliseconds(100); > nack_module_.Process(); >@@ -146,8 +147,8 @@ TEST_F(TestNackModule, DontBurstOnTimeSkip) { > } > > TEST_F(TestNackModule, ResendNack) { >- nack_module_.OnReceivedPacket(1, false); >- nack_module_.OnReceivedPacket(3, false); >+ nack_module_.OnReceivedPacket(1, false, false); >+ nack_module_.OnReceivedPacket(3, false, false); > EXPECT_EQ(1u, sent_nacks_.size()); > EXPECT_EQ(2, sent_nacks_[0]); > >@@ -169,15 +170,15 @@ TEST_F(TestNackModule, ResendNack) { > nack_module_.Process(); > EXPECT_EQ(4u, sent_nacks_.size()); > >- nack_module_.OnReceivedPacket(2, false); >+ nack_module_.OnReceivedPacket(2, false, false); > clock_->AdvanceTimeMilliseconds(50); > nack_module_.Process(); > EXPECT_EQ(4u, sent_nacks_.size()); > } > > TEST_F(TestNackModule, ResendPacketMaxRetries) { >- nack_module_.OnReceivedPacket(1, false); >- nack_module_.OnReceivedPacket(3, false); >+ nack_module_.OnReceivedPacket(1, false, false); >+ nack_module_.OnReceivedPacket(3, false, false); > EXPECT_EQ(1u, sent_nacks_.size()); > EXPECT_EQ(2, sent_nacks_[0]); > >@@ -193,35 +194,35 @@ TEST_F(TestNackModule, ResendPacketMaxRetries) { > } > > TEST_F(TestNackModule, TooLargeNackList) { >- nack_module_.OnReceivedPacket(0, false); >- nack_module_.OnReceivedPacket(1001, false); >+ nack_module_.OnReceivedPacket(0, false, false); >+ nack_module_.OnReceivedPacket(1001, false, false); > EXPECT_EQ(1000u, 
sent_nacks_.size()); > EXPECT_EQ(0, keyframes_requested_); >- nack_module_.OnReceivedPacket(1003, false); >+ nack_module_.OnReceivedPacket(1003, false, false); > EXPECT_EQ(1000u, sent_nacks_.size()); > EXPECT_EQ(1, keyframes_requested_); >- nack_module_.OnReceivedPacket(1004, false); >+ nack_module_.OnReceivedPacket(1004, false, false); > EXPECT_EQ(1000u, sent_nacks_.size()); > EXPECT_EQ(1, keyframes_requested_); > } > > TEST_F(TestNackModule, TooLargeNackListWithKeyFrame) { >- nack_module_.OnReceivedPacket(0, false); >- nack_module_.OnReceivedPacket(1, true); >- nack_module_.OnReceivedPacket(1001, false); >+ nack_module_.OnReceivedPacket(0, false, false); >+ nack_module_.OnReceivedPacket(1, true, false); >+ nack_module_.OnReceivedPacket(1001, false, false); > EXPECT_EQ(999u, sent_nacks_.size()); > EXPECT_EQ(0, keyframes_requested_); >- nack_module_.OnReceivedPacket(1003, false); >+ nack_module_.OnReceivedPacket(1003, false, false); > EXPECT_EQ(1000u, sent_nacks_.size()); > EXPECT_EQ(0, keyframes_requested_); >- nack_module_.OnReceivedPacket(1005, false); >+ nack_module_.OnReceivedPacket(1005, false, false); > EXPECT_EQ(1000u, sent_nacks_.size()); > EXPECT_EQ(1, keyframes_requested_); > } > > TEST_F(TestNackModule, ClearUpTo) { >- nack_module_.OnReceivedPacket(0, false); >- nack_module_.OnReceivedPacket(100, false); >+ nack_module_.OnReceivedPacket(0, false, false); >+ nack_module_.OnReceivedPacket(100, false, false); > EXPECT_EQ(99u, sent_nacks_.size()); > > sent_nacks_.clear(); >@@ -233,8 +234,8 @@ TEST_F(TestNackModule, ClearUpTo) { > } > > TEST_F(TestNackModule, ClearUpToWrap) { >- nack_module_.OnReceivedPacket(0xfff0, false); >- nack_module_.OnReceivedPacket(0xf, false); >+ nack_module_.OnReceivedPacket(0xfff0, false, false); >+ nack_module_.OnReceivedPacket(0xf, false, false); > EXPECT_EQ(30u, sent_nacks_.size()); > > sent_nacks_.clear(); >@@ -246,20 +247,20 @@ TEST_F(TestNackModule, ClearUpToWrap) { > } > > TEST_F(TestNackModule, PacketNackCount) { >- 
EXPECT_EQ(0, nack_module_.OnReceivedPacket(0, false)); >- EXPECT_EQ(0, nack_module_.OnReceivedPacket(2, false)); >- EXPECT_EQ(1, nack_module_.OnReceivedPacket(1, false)); >+ EXPECT_EQ(0, nack_module_.OnReceivedPacket(0, false, false)); >+ EXPECT_EQ(0, nack_module_.OnReceivedPacket(2, false, false)); >+ EXPECT_EQ(1, nack_module_.OnReceivedPacket(1, false, false)); > > sent_nacks_.clear(); > nack_module_.UpdateRtt(100); >- EXPECT_EQ(0, nack_module_.OnReceivedPacket(5, false)); >+ EXPECT_EQ(0, nack_module_.OnReceivedPacket(5, false, false)); > clock_->AdvanceTimeMilliseconds(100); > nack_module_.Process(); > clock_->AdvanceTimeMilliseconds(100); > nack_module_.Process(); >- EXPECT_EQ(3, nack_module_.OnReceivedPacket(3, false)); >- EXPECT_EQ(3, nack_module_.OnReceivedPacket(4, false)); >- EXPECT_EQ(0, nack_module_.OnReceivedPacket(4, false)); >+ EXPECT_EQ(3, nack_module_.OnReceivedPacket(3, false, false)); >+ EXPECT_EQ(3, nack_module_.OnReceivedPacket(4, false, false)); >+ EXPECT_EQ(0, nack_module_.OnReceivedPacket(4, false, false)); > } > > TEST_F(TestNackModule, NackListFullAndNoOverlapWithKeyframes) { >@@ -267,14 +268,63 @@ TEST_F(TestNackModule, NackListFullAndNoOverlapWithKeyframes) { > const unsigned int kFirstGap = kMaxNackPackets - 20; > const unsigned int kSecondGap = 200; > uint16_t seq_num = 0; >- nack_module_.OnReceivedPacket(seq_num++, true); >+ nack_module_.OnReceivedPacket(seq_num++, true, false); > seq_num += kFirstGap; >- nack_module_.OnReceivedPacket(seq_num++, true); >+ nack_module_.OnReceivedPacket(seq_num++, true, false); > EXPECT_EQ(kFirstGap, sent_nacks_.size()); > sent_nacks_.clear(); > seq_num += kSecondGap; >- nack_module_.OnReceivedPacket(seq_num, true); >+ nack_module_.OnReceivedPacket(seq_num, true, false); > EXPECT_EQ(kSecondGap, sent_nacks_.size()); > } > >+TEST_F(TestNackModule, HandleFecRecoveredPacket) { >+ nack_module_.OnReceivedPacket(1, false, false); >+ nack_module_.OnReceivedPacket(4, false, true); >+ EXPECT_EQ(0u, 
sent_nacks_.size()); >+ nack_module_.OnReceivedPacket(5, false, false); >+ EXPECT_EQ(2u, sent_nacks_.size()); >+} >+ >+TEST_F(TestNackModule, SendNackWithoutDelay) { >+ nack_module_.OnReceivedPacket(0, false, false); >+ nack_module_.OnReceivedPacket(100, false, false); >+ EXPECT_EQ(99u, sent_nacks_.size()); >+} >+ >+class TestNackModuleWithFieldTrial : public ::testing::Test, >+ public NackSender, >+ public KeyFrameRequestSender { >+ protected: >+ TestNackModuleWithFieldTrial() >+ : nack_delay_field_trial_("WebRTC-SendNackDelayMs/10/"), >+ clock_(new SimulatedClock(0)), >+ nack_module_(clock_.get(), this, this), >+ keyframes_requested_(0) {} >+ >+ void SendNack(const std::vector<uint16_t>& sequence_numbers) override { >+ sent_nacks_.insert(sent_nacks_.end(), sequence_numbers.begin(), >+ sequence_numbers.end()); >+ } >+ >+ void RequestKeyFrame() override { ++keyframes_requested_; } >+ >+ test::ScopedFieldTrials nack_delay_field_trial_; >+ std::unique_ptr<SimulatedClock> clock_; >+ NackModule nack_module_; >+ std::vector<uint16_t> sent_nacks_; >+ int keyframes_requested_; >+}; >+ >+TEST_F(TestNackModuleWithFieldTrial, SendNackWithDelay) { >+ nack_module_.OnReceivedPacket(0, false, false); >+ nack_module_.OnReceivedPacket(100, false, false); >+ EXPECT_EQ(0u, sent_nacks_.size()); >+ clock_->AdvanceTimeMilliseconds(10); >+ nack_module_.OnReceivedPacket(106, false, false); >+ EXPECT_EQ(99u, sent_nacks_.size()); >+ clock_->AdvanceTimeMilliseconds(10); >+ nack_module_.OnReceivedPacket(109, false, false); >+ EXPECT_EQ(104u, sent_nacks_.size()); >+} > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/receiver.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/receiver.cc >index 7f22b73a5a621a289e860a0bd4f9d926354360a5..5bf27257b07a31a6b9d0afc7c5cec422027ada6a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/receiver.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/receiver.cc >@@ -28,32 +28,25 @@ namespace webrtc { > > enum { kMaxReceiverDelayMs = 10000 }; > >-VCMReceiver::VCMReceiver(VCMTiming* timing, >- Clock* clock, >- EventFactory* event_factory) >+VCMReceiver::VCMReceiver(VCMTiming* timing, Clock* clock) > : VCMReceiver::VCMReceiver(timing, > clock, >- event_factory, >+ absl::WrapUnique(EventWrapper::Create()), >+ absl::WrapUnique(EventWrapper::Create()), > nullptr, // NackSender > nullptr) // KeyframeRequestSender > {} > > VCMReceiver::VCMReceiver(VCMTiming* timing, > Clock* clock, >- EventFactory* event_factory, > NackSender* nack_sender, > KeyFrameRequestSender* keyframe_request_sender) >- : VCMReceiver( >- timing, >- clock, >- std::unique_ptr<EventWrapper>(event_factory >- ? event_factory->CreateEvent() >- : EventWrapper::Create()), >- std::unique_ptr<EventWrapper>(event_factory >- ? event_factory->CreateEvent() >- : EventWrapper::Create()), >- nack_sender, >- keyframe_request_sender) {} >+ : VCMReceiver(timing, >+ clock, >+ absl::WrapUnique(EventWrapper::Create()), >+ absl::WrapUnique(EventWrapper::Create()), >+ nack_sender, >+ keyframe_request_sender) {} > > VCMReceiver::VCMReceiver(VCMTiming* timing, > Clock* clock, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/receiver.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/receiver.h >index 4f6590d274822f392dc2adeaaca1063fd36b9c41..503cae7c8deedd67e9c4ddb0c21a59660ade26d1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/receiver.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/receiver.h >@@ -30,17 +30,16 @@ class VCMReceiver { > public: > // Constructor for current interface, will be removed when the > // new jitter buffer is in place. 
>- VCMReceiver(VCMTiming* timing, Clock* clock, EventFactory* event_factory);
>+ VCMReceiver(VCMTiming* timing, Clock* clock);
> 
> // Create method for the new jitter buffer.
> VCMReceiver(VCMTiming* timing,
> Clock* clock,
>- EventFactory* event_factory,
> NackSender* nack_sender,
> KeyFrameRequestSender* keyframe_request_sender);
> 
>- // Using this constructor, you can specify a different event factory for the
>- // jitter buffer. Useful for unit tests when you want to simulate incoming
>+ // Using this constructor, you can specify a different event implementation for
>+ // the jitter buffer. Useful for unit tests when you want to simulate incoming
> // packets, in which case the jitter buffer's wait event is different from
> // that of VCMReceiver itself.
> //
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/receiver_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/receiver_unittest.cc
>index 1ff866298e2ccdf4616631fb911dd8ac8049fefd..ba35f693582d481c3cbb1204cd3fd407913a4eba 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/receiver_unittest.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/receiver_unittest.cc
>@@ -18,7 +18,6 @@
> #include "modules/video_coding/packet.h"
> #include "modules/video_coding/receiver.h"
> #include "modules/video_coding/test/stream_generator.h"
>-#include "modules/video_coding/test/test_util.h"
> #include "modules/video_coding/timing.h"
> #include "rtc_base/checks.h"
> #include "system_wrappers/include/clock.h"
>@@ -31,7 +30,7 @@ class TestVCMReceiver : public ::testing::Test {
> TestVCMReceiver()
> : clock_(new SimulatedClock(0)),
> timing_(clock_.get()),
>- receiver_(&timing_, clock_.get(), &event_factory_) {
>+ receiver_(&timing_, clock_.get()) {
> stream_generator_.reset(
> new StreamGenerator(0, clock_->TimeInMilliseconds()));
> }
>@@ -81,7 +80,6 @@ class TestVCMReceiver : public ::testing::Test {
> 
std::unique_ptr<SimulatedClock> clock_; > VCMTiming timing_; >- NullEventFactory event_factory_; > VCMReceiver receiver_; > std::unique_ptr<StreamGenerator> stream_generator_; > }; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/test/test_util.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/test/test_util.h >deleted file mode 100644 >index a38fc58b1284bc781b1c6f5cd642f7f66dc3ae48..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/test/test_util.h >+++ /dev/null >@@ -1,34 +0,0 @@ >-/* >- * Copyright (c) 2012 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef MODULES_VIDEO_CODING_TEST_TEST_UTIL_H_ >-#define MODULES_VIDEO_CODING_TEST_TEST_UTIL_H_ >- >-#include "system_wrappers/include/event_wrapper.h" >- >-class NullEventFactory : public webrtc::EventFactory { >- public: >- virtual ~NullEventFactory() {} >- >- webrtc::EventWrapper* CreateEvent() override { return new NullEvent; } >- >- private: >- // Private class to avoid more dependencies on it in tests. 
>- class NullEvent : public webrtc::EventWrapper { >- public: >- ~NullEvent() override {} >- bool Set() override { return true; } >- webrtc::EventTypeWrapper Wait(unsigned long max_time) override { // NOLINT >- return webrtc::kEventTimeout; >- } >- }; >-}; >- >-#endif // MODULES_VIDEO_CODING_TEST_TEST_UTIL_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/ivf_file_writer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/ivf_file_writer.h >index 7d917ed35ee9b10f987ecfc94a1cea8803aafd32..4775beddac85a51b69298c5b13552e74a5395de3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/ivf_file_writer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/ivf_file_writer.h >@@ -14,7 +14,7 @@ > #include <memory> > #include <string> > >-#include "common_video/include/video_frame.h" >+#include "api/video/encoded_image.h" > #include "modules/include/module_common_types.h" > #include "rtc_base/constructormagic.h" > #include "rtc_base/file.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/moving_average.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/moving_average.cc >deleted file mode 100644 >index eb23e3d92f3e6063eacdedfd8802a9ffa83d1c72..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/moving_average.cc >+++ /dev/null >@@ -1,47 +0,0 @@ >-/* >- * Copyright (c) 2016 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */
>-
>-#include "modules/video_coding/utility/moving_average.h"
>-
>-#include <algorithm>
>-
>-namespace webrtc {
>-
>-MovingAverage::MovingAverage(size_t s) : sum_history_(s + 1, 0) {}
>-MovingAverage::~MovingAverage() = default;
>-
>-void MovingAverage::AddSample(int sample) {
>-  count_++;
>-  sum_ += sample;
>-  sum_history_[count_ % sum_history_.size()] = sum_;
>-}
>-
>-absl::optional<int> MovingAverage::GetAverage() const {
>-  return GetAverage(size());
>-}
>-
>-absl::optional<int> MovingAverage::GetAverage(size_t num_samples) const {
>-  if (num_samples > size() || num_samples == 0)
>-    return absl::nullopt;
>-  int sum = sum_ - sum_history_[(count_ - num_samples) % sum_history_.size()];
>-  return sum / static_cast<int>(num_samples);
>-}
>-
>-void MovingAverage::Reset() {
>-  count_ = 0;
>-  sum_ = 0;
>-  std::fill(sum_history_.begin(), sum_history_.end(), 0);
>-}
>-
>-size_t MovingAverage::size() const {
>-  return std::min(count_, sum_history_.size() - 1);
>-}
>-
>-}  // namespace webrtc
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/moving_average.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/moving_average.h
>deleted file mode 100644
>index 77ce633b18fcc16f6f001d717d01d16dd5adb796..0000000000000000000000000000000000000000
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/moving_average.h
>+++ /dev/null
>@@ -1,36 +0,0 @@
>-/*
>- *  Copyright (c) 2015 The WebRTC project authors. All Rights Reserved.
>- *
>- *  Use of this source code is governed by a BSD-style license
>- *  that can be found in the LICENSE file in the root of the source
>- *  tree. An additional intellectual property rights grant can be found
>- *  in the file PATENTS.  All contributing project authors may
>- *  be found in the AUTHORS file in the root of the source tree.
>- */
>-
>-#ifndef MODULES_VIDEO_CODING_UTILITY_MOVING_AVERAGE_H_
>-#define MODULES_VIDEO_CODING_UTILITY_MOVING_AVERAGE_H_
>-
>-#include <vector>
>-
>-#include "absl/types/optional.h"
>-
>-namespace webrtc {
>-class MovingAverage {
>- public:
>-  explicit MovingAverage(size_t s);
>-  ~MovingAverage();
>-  void AddSample(int sample);
>-  absl::optional<int> GetAverage() const;
>-  absl::optional<int> GetAverage(size_t num_samples) const;
>-  void Reset();
>-  size_t size() const;
>-
>- private:
>-  size_t count_ = 0;
>-  int sum_ = 0;
>-  std::vector<int> sum_history_;
>-};
>-}  // namespace webrtc
>-
>-#endif  // MODULES_VIDEO_CODING_UTILITY_MOVING_AVERAGE_H_
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/moving_average_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/moving_average_unittest.cc
>deleted file mode 100644
>index 72fc775a82732dcd175be761eb620d85dd60d6a9..0000000000000000000000000000000000000000
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/moving_average_unittest.cc
>+++ /dev/null
>@@ -1,62 +0,0 @@
>-/*
>- *  Copyright (c) 2016 The WebRTC project authors. All Rights Reserved.
>- *
>- *  Use of this source code is governed by a BSD-style license
>- *  that can be found in the LICENSE file in the root of the source
>- *  tree. An additional intellectual property rights grant can be found
>- *  in the file PATENTS.  All contributing project authors may
>- *  be found in the AUTHORS file in the root of the source tree.
>- */
>-
>-#include "modules/video_coding/utility/moving_average.h"
>-
>-#include "test/gtest.h"
>-
>-TEST(MovingAverageTest, EmptyAverage) {
>-  webrtc::MovingAverage moving_average(1);
>-  EXPECT_EQ(0u, moving_average.size());
>-  EXPECT_FALSE(moving_average.GetAverage(0));
>-}
>-
>-// Test single value.
>-TEST(MovingAverageTest, OneElement) {
>-  webrtc::MovingAverage moving_average(1);
>-  moving_average.AddSample(3);
>-  EXPECT_EQ(1u, moving_average.size());
>-  EXPECT_EQ(3, *moving_average.GetAverage());
>-  EXPECT_EQ(3, *moving_average.GetAverage(1));
>-  EXPECT_FALSE(moving_average.GetAverage(2));
>-}
>-
>-TEST(MovingAverageTest, GetAverage) {
>-  webrtc::MovingAverage moving_average(1024);
>-  moving_average.AddSample(1);
>-  moving_average.AddSample(1);
>-  moving_average.AddSample(3);
>-  moving_average.AddSample(3);
>-  EXPECT_EQ(*moving_average.GetAverage(4), 2);
>-  EXPECT_EQ(*moving_average.GetAverage(2), 3);
>-  EXPECT_FALSE(moving_average.GetAverage(0));
>-}
>-
>-TEST(MovingAverageTest, Reset) {
>-  webrtc::MovingAverage moving_average(5);
>-  moving_average.AddSample(1);
>-  EXPECT_EQ(1, *moving_average.GetAverage(1));
>-  moving_average.Reset();
>-  EXPECT_FALSE(moving_average.GetAverage(1));
>-  EXPECT_FALSE(moving_average.GetAverage(6));
>-}
>-
>-TEST(MovingAverageTest, ManySamples) {
>-  webrtc::MovingAverage moving_average(10);
>-  for (int i = 1; i < 11; i++) {
>-    moving_average.AddSample(i);
>-  }
>-  EXPECT_EQ(*moving_average.GetAverage(), 5);
>-  moving_average.Reset();
>-  for (int i = 1; i < 2001; i++) {
>-    moving_average.AddSample(i);
>-  }
>-  EXPECT_EQ(*moving_average.GetAverage(), 1995);
>-}
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/quality_scaler.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/quality_scaler.cc
>index ea19e418c9d4e921fce0c97d6d6a93431f6ce725..32ae166a9c677a21cc3946f2a26572f52d5db35b 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/quality_scaler.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/quality_scaler.cc
>@@ -173,8 +173,8 @@ void QualityScaler::CheckQp() {
>   // If we have not observed at least this many frames we can't make a good
>   // scaling decision.
>   const size_t frames = config_.use_all_drop_reasons
>-                            ? framedrop_percent_all_.size()
>-                            : framedrop_percent_media_opt_.size();
>+                            ? framedrop_percent_all_.Size()
>+                            : framedrop_percent_media_opt_.Size();
>   if (frames < kMinFramesNeededToScale) {
>     observed_enough_frames_ = false;
>     return;
>@@ -183,8 +183,9 @@
>
>   // Check if we should scale down due to high frame drop.
>   const absl::optional<int> drop_rate =
>-      config_.use_all_drop_reasons ? framedrop_percent_all_.GetAverage()
>-                                   : framedrop_percent_media_opt_.GetAverage();
>+      config_.use_all_drop_reasons
>+          ? framedrop_percent_all_.GetAverageRoundedDown()
>+          : framedrop_percent_media_opt_.GetAverageRoundedDown();
>   if (drop_rate && *drop_rate >= kFramedropPercentThreshold) {
>     RTC_LOG(LS_INFO) << "Reporting high QP, framedrop percent " << *drop_rate;
>     ReportQpHigh();
>@@ -192,11 +193,12 @@
>   }
>
>   // Check if we should scale up or down based on QP.
>-  const absl::optional<int> avg_qp_high = qp_smoother_high_
>-                                              ? qp_smoother_high_->GetAvg()
>-                                              : average_qp_.GetAverage();
>+  const absl::optional<int> avg_qp_high =
>+      qp_smoother_high_ ? qp_smoother_high_->GetAvg()
>+                        : average_qp_.GetAverageRoundedDown();
>   const absl::optional<int> avg_qp_low =
>-      qp_smoother_low_ ? qp_smoother_low_->GetAvg() : average_qp_.GetAverage();
>+      qp_smoother_low_ ? qp_smoother_low_->GetAvg()
>+                       : average_qp_.GetAverageRoundedDown();
>   if (avg_qp_high && avg_qp_low) {
>     RTC_LOG(LS_INFO) << "Checking average QP " << *avg_qp_high << " ("
>                      << *avg_qp_low << ").";
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/quality_scaler.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/quality_scaler.h
>index 3091ebf36c42e9d96728e007e716642b7a77239e..272c02e53a80381513493d12739546932fb714e0 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/quality_scaler.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/quality_scaler.h
>@@ -17,8 +17,8 @@
> #include "absl/types/optional.h"
> #include "api/video_codecs/video_encoder.h"
> #include "common_types.h"  // NOLINT(build/include)
>-#include "modules/video_coding/utility/moving_average.h"
> #include "rtc_base/experiments/quality_scaling_experiment.h"
>+#include "rtc_base/numerics/moving_average.h"
> #include "rtc_base/sequenced_task_checker.h"
>
> namespace webrtc {
>@@ -79,9 +79,10 @@ class QualityScaler {
>   const VideoEncoder::QpThresholds thresholds_;
>   const int64_t sampling_period_ms_;
>   bool fast_rampup_ RTC_GUARDED_BY(&task_checker_);
>-  MovingAverage average_qp_ RTC_GUARDED_BY(&task_checker_);
>-  MovingAverage framedrop_percent_media_opt_ RTC_GUARDED_BY(&task_checker_);
>-  MovingAverage framedrop_percent_all_ RTC_GUARDED_BY(&task_checker_);
>+  rtc::MovingAverage average_qp_ RTC_GUARDED_BY(&task_checker_);
>+  rtc::MovingAverage framedrop_percent_media_opt_
>+      RTC_GUARDED_BY(&task_checker_);
>+  rtc::MovingAverage framedrop_percent_all_ RTC_GUARDED_BY(&task_checker_);
>
>   // Used by QualityScalingExperiment.
>   const bool experiment_enabled_;
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/quality_scaler_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/quality_scaler_unittest.cc
>index b17062df5c158ca5afdbba22a2a79dcee366a17a..ed7937992979b30b5bfb68466f24027f2300cc8f 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/quality_scaler_unittest.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/quality_scaler_unittest.cc
>@@ -28,19 +28,18 @@ static const int kMinFramesNeededToScale = 60;  // From quality_scaler.cc.
> static const size_t kDefaultTimeoutMs = 150;
> }  // namespace
>
>-#define DO_SYNC(q, block)           \
>-  do {                              \
>-    rtc::Event event(false, false); \
>-    q->PostTask([this, &event] {    \
>-      block;                        \
>-      event.Set();                  \
>-    });                             \
>-    RTC_CHECK(event.Wait(1000));    \
>+#define DO_SYNC(q, block)        \
>+  do {                           \
>+    rtc::Event event;            \
>+    q->PostTask([this, &event] { \
>+      block;                     \
>+      event.Set();               \
>+    });                          \
>+    RTC_CHECK(event.Wait(1000)); \
>   } while (0)
>
> class MockAdaptationObserver : public AdaptationObserverInterface {
>  public:
>-  MockAdaptationObserver() : event(false, false) {}
>   virtual ~MockAdaptationObserver() {}
>
>   void AdaptUp(AdaptReason r) override {
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/simulcast_rate_allocator_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/simulcast_rate_allocator_unittest.cc
>index 7f54932a813ddb5141c8d3b14f8d136276ad30e6..bbe3f19fb7320d9e641e5286fe35817249f9f5a6 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/simulcast_rate_allocator_unittest.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/simulcast_rate_allocator_unittest.cc
>@@ -15,8 +15,7 @@
> #include <utility>
> #include <vector>
>
>-#include "modules/video_coding/codecs/vp8/include/vp8_temporal_layers.h"
>-
>+#include "api/video_codecs/vp8_temporal_layers.h"
> #include "test/field_trial.h"
> #include "test/gmock.h"
> #include "test/gtest.h"
>@@ -34,9 +33,9 @@ constexpr uint32_t kLegacyScreenshareMaxBitrateKbps = 1000;
> constexpr uint32_t kSimulcastScreenshareMinBitrateKbps = 600;
> constexpr uint32_t kSimulcastScreenshareMaxBitrateKbps = 1250;
>
>-class MockTemporalLayers : public TemporalLayers {
>+class MockTemporalLayers : public Vp8TemporalLayers {
>  public:
>-  MOCK_METHOD1(UpdateLayerConfig, TemporalLayers::FrameConfig(uint32_t));
>+  MOCK_METHOD1(UpdateLayerConfig, Vp8TemporalLayers::FrameConfig(uint32_t));
>   MOCK_METHOD2(OnRatesUpdated, void(const std::vector<uint32_t>&, int));
>   MOCK_METHOD1(UpdateConfiguration, bool(Vp8EncoderConfig*));
>   MOCK_METHOD5(OnEncodeDone,
>@@ -44,7 +43,7 @@ class MockTemporalLayers : public TemporalLayers {
>   MOCK_METHOD3(FrameEncoded, void(uint32_t, size_t, int));
>   MOCK_CONST_METHOD0(Tl0PicIdx, uint8_t());
>   MOCK_CONST_METHOD1(GetTemporalLayerId,
>-                     int(const TemporalLayers::FrameConfig&));
>+                     int(const Vp8TemporalLayers::FrameConfig&));
> };
> }  // namespace
>
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/simulcast_test_fixture_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/simulcast_test_fixture_impl.cc
>index 4af526c66319cd772dce81c5a0c99bf0753a440c..992773ecc609eefd898610786f2522d06dbc2f4b 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/simulcast_test_fixture_impl.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/simulcast_test_fixture_impl.cc
>@@ -15,9 +15,10 @@
> #include <memory>
> #include <vector>
>
>+#include "api/video/encoded_image.h"
> #include "api/video_codecs/sdp_video_format.h"
>-#include "common_video/include/video_frame.h"
> #include "common_video/libyuv/include/webrtc_libyuv.h"
>+#include "modules/video_coding/include/video_codec_interface.h"
> #include "modules/video_coding/include/video_coding_defines.h"
> #include "rtc_base/checks.h"
> #include "test/gtest.h"
>@@ -819,5 +820,67 @@ void SimulcastTestFixtureImpl::TestStrideEncodeDecode() {
>   EXPECT_EQ(2, decoder_callback.DecodedFrames());
> }
>
>+void SimulcastTestFixtureImpl::TestDecodeWidthHeightSet() {
>+  MockEncodedImageCallback encoder_callback;
>+  MockDecodedImageCallback decoder_callback;
>+
>+  EncodedImage encoded_frame[3];
>+  SetRates(kMaxBitrates[2], 30);  // To get all three streams.
>+  encoder_->RegisterEncodeCompleteCallback(&encoder_callback);
>+  decoder_->RegisterDecodeCompleteCallback(&decoder_callback);
>+
>+  EXPECT_CALL(encoder_callback, OnEncodedImage(_, _, _))
>+      .Times(3)
>+      .WillRepeatedly(
>+          testing::Invoke([&](const EncodedImage& encoded_image,
>+                              const CodecSpecificInfo* codec_specific_info,
>+                              const RTPFragmentationHeader* fragmentation) {
>+            EXPECT_EQ(encoded_image._frameType, kVideoFrameKey);
>+
>+            size_t index = encoded_image.SpatialIndex().value_or(0);
>+            encoded_frame[index]._buffer = new uint8_t[encoded_image._size];
>+            encoded_frame[index]._size = encoded_image._size;
>+            encoded_frame[index]._length = encoded_image._length;
>+            encoded_frame[index]._frameType = encoded_image._frameType;
>+            encoded_frame[index]._completeFrame = encoded_image._completeFrame;
>+            memcpy(encoded_frame[index]._buffer, encoded_image._buffer,
>+                   encoded_image._length);
>+            return EncodedImageCallback::Result(
>+                EncodedImageCallback::Result::OK, 0);
>+          }));
>+  EXPECT_EQ(0, encoder_->Encode(*input_frame_, NULL, NULL));
>+
>+  EXPECT_CALL(decoder_callback, Decoded(_, _, _))
>+      .WillOnce(testing::Invoke([](VideoFrame& decodedImage,
>+                                   absl::optional<int32_t> decode_time_ms,
>+                                   absl::optional<uint8_t> qp) {
>+        EXPECT_EQ(decodedImage.width(), kDefaultWidth / 4);
>+        EXPECT_EQ(decodedImage.height(), kDefaultHeight / 4);
>+      }));
>+  EXPECT_EQ(0, decoder_->Decode(encoded_frame[0], false, NULL, 0));
>+
>+  EXPECT_CALL(decoder_callback, Decoded(_, _, _))
>+      .WillOnce(testing::Invoke([](VideoFrame& decodedImage,
>+                                   absl::optional<int32_t> decode_time_ms,
>+                                   absl::optional<uint8_t> qp) {
>+        EXPECT_EQ(decodedImage.width(), kDefaultWidth / 2);
>+        EXPECT_EQ(decodedImage.height(), kDefaultHeight / 2);
>+      }));
>+  EXPECT_EQ(0, decoder_->Decode(encoded_frame[1], false, NULL, 0));
>+
>+  EXPECT_CALL(decoder_callback, Decoded(_, _, _))
>+      .WillOnce(testing::Invoke([](VideoFrame& decodedImage,
>+                                   absl::optional<int32_t> decode_time_ms,
>+                                   absl::optional<uint8_t> qp) {
>+        EXPECT_EQ(decodedImage.width(), kDefaultWidth);
>+        EXPECT_EQ(decodedImage.height(), kDefaultHeight);
>+      }));
>+  EXPECT_EQ(0, decoder_->Decode(encoded_frame[2], false, NULL, 0));
>+
>+  for (int i = 0; i < 3; ++i) {
>+    delete [] encoded_frame[i]._buffer;
>+  }
>+}
>+
> }  // namespace test
> }  // namespace webrtc
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/simulcast_test_fixture_impl.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/simulcast_test_fixture_impl.h
>index 6634a69ad1ca543402f7e3fef09411b7ce26e800..2f834bdfb55fc2439b2888fa29a21e26d9898a8e 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/simulcast_test_fixture_impl.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/utility/simulcast_test_fixture_impl.h
>@@ -14,6 +14,8 @@
> #include <memory>
> #include <vector>
>
>+#include "api/test/mock_video_decoder.h"
>+#include "api/test/mock_video_encoder.h"
> #include "api/test/simulcast_test_fixture.h"
> #include "api/video/i420_buffer.h"
> #include "api/video/video_frame.h"
>@@ -21,7 +23,6 @@
> #include "api/video_codecs/video_encoder_factory.h"
> #include "common_types.h"  // NOLINT(build/include)
> #include "modules/video_coding/utility/simulcast_rate_allocator.h"
>-#include "modules/video_coding/include/mock/mock_video_codec_interface.h"
>
> namespace webrtc {
> namespace test {
>@@ -50,6 +51,7 @@ class SimulcastTestFixtureImpl final : public SimulcastTestFixture {
>   void TestSpatioTemporalLayers333PatternEncoder() override;
>   void TestSpatioTemporalLayers321PatternEncoder() override;
>   void TestStrideEncodeDecode() override;
>+  void TestDecodeWidthHeightSet() override;
>
>   static void DefaultSettings(VideoCodec* settings,
>                               const int* temporal_layer_profile,
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_codec_initializer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_codec_initializer.cc
>index e7125ce2782313f8fbc2606b96dbe46ab3ec4e5c..86a5ab2a4820994fd7402874e9f34f2bbe2ef381 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_codec_initializer.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_codec_initializer.cc
>@@ -24,15 +24,13 @@
>
> namespace webrtc {
>
>-bool VideoCodecInitializer::SetupCodec(
>-    const VideoEncoderConfig& config,
>-    const std::vector<VideoStream>& streams,
>-    VideoCodec* codec,
>-    std::unique_ptr<VideoBitrateAllocator>* bitrate_allocator) {
>+bool VideoCodecInitializer::SetupCodec(const VideoEncoderConfig& config,
>+                                       const std::vector<VideoStream>& streams,
>+                                       VideoCodec* codec) {
>   if (config.codec_type == kVideoCodecMultiplex) {
>     VideoEncoderConfig associated_config = config.Copy();
>     associated_config.codec_type = kVideoCodecVP9;
>-    if (!SetupCodec(associated_config, streams, codec, bitrate_allocator)) {
>+    if (!SetupCodec(associated_config, streams, codec)) {
>       RTC_LOG(LS_ERROR) << "Failed to create stereo encoder configuration.";
>       return false;
>     }
>@@ -41,31 +39,9 @@ bool VideoCodecInitializer::SetupCodec(
>   }
>
>   *codec = VideoEncoderConfigToVideoCodec(config, streams);
>-  *bitrate_allocator = CreateBitrateAllocator(*codec);
>-
>   return true;
> }
>
>-std::unique_ptr<VideoBitrateAllocator>
>-VideoCodecInitializer::CreateBitrateAllocator(const VideoCodec& codec) {
>-  std::unique_ptr<VideoBitrateAllocator> rate_allocator;
>-
>-  switch (codec.codecType) {
>-    case kVideoCodecVP8:
>-      RTC_FALLTHROUGH();
>-    case kVideoCodecH264:
>-      rate_allocator.reset(new SimulcastRateAllocator(codec));
>-      break;
>-    case kVideoCodecVP9:
>-      rate_allocator.reset(new SvcRateAllocator(codec));
>-      break;
>-    default:
>-      rate_allocator.reset(new DefaultVideoBitrateAllocator(codec));
>-  }
>-
>-  return rate_allocator;
>-}
>-
> // TODO(sprang): Split this up and separate the codec specific parts.
> VideoCodec VideoCodecInitializer::VideoEncoderConfigToVideoCodec(
>     const VideoEncoderConfig& config,
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_codec_initializer_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_codec_initializer_unittest.cc
>index fb601f0d4299ec952c2f4657606bbd618f5bde23..ceff1ebaf7427bf928fe7e6adbe03c530a187ce3 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_codec_initializer_unittest.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_codec_initializer_unittest.cc
>@@ -9,11 +9,13 @@
>  */
>
> #include "modules/video_coding/include/video_codec_initializer.h"
>-#include "api/video/video_bitrate_allocator.h"
>+#include "api/video/builtin_video_bitrate_allocator_factory.h"
>+#include "api/video_codecs/create_vp8_temporal_layers.h"
> #include "api/video_codecs/video_encoder.h"
>+#include "api/video_codecs/vp8_temporal_layers.h"
> #include "common_types.h"  // NOLINT(build/include)
>-#include "modules/video_coding/codecs/vp8/include/vp8_temporal_layers.h"
> #include "modules/video_coding/codecs/vp9/include/vp9_globals.h"
>+#include "rtc_base/checks.h"
> #include "rtc_base/refcountedobject.h"
> #include "test/gtest.h"
>
>@@ -74,21 +76,22 @@ class VideoCodecInitializerTest : public ::testing::Test {
>
>   bool InitializeCodec() {
>     codec_out_ = VideoCodec();
>-    bitrate_allocator_out_.reset();
>     temporal_layers_.clear();
>-    if (!VideoCodecInitializer::SetupCodec(config_, streams_, &codec_out_,
>-                                           &bitrate_allocator_out_)) {
>+    if (!VideoCodecInitializer::SetupCodec(config_, streams_, &codec_out_)) {
>       return false;
>     }
>+    bitrate_allocator_ = CreateBuiltinVideoBitrateAllocatorFactory()
>+                             ->CreateVideoBitrateAllocator(codec_out_);
>+    RTC_CHECK(bitrate_allocator_);
>     if (codec_out_.codecType == VideoCodecType::kVideoCodecMultiplex)
>       return true;
>
>     // Make sure temporal layers instances have been created.
>     if (codec_out_.codecType == VideoCodecType::kVideoCodecVP8) {
>       for (int i = 0; i < codec_out_.numberOfSimulcastStreams; ++i) {
>-        temporal_layers_.emplace_back(TemporalLayers::CreateTemporalLayers(
>-            TemporalLayersType::kFixedPattern,
>-            codec_out_.VP8()->numberOfTemporalLayers));
>+        temporal_layers_.emplace_back(
>+            CreateVp8TemporalLayers(Vp8TemporalLayersType::kFixedPattern,
>+                                    codec_out_.VP8()->numberOfTemporalLayers));
>       }
>     }
>     return true;
>@@ -125,8 +128,8 @@ class VideoCodecInitializerTest : public ::testing::Test {
>
>   // Output.
>   VideoCodec codec_out_;
>-  std::unique_ptr<VideoBitrateAllocator> bitrate_allocator_out_;
>-  std::vector<std::unique_ptr<TemporalLayers>> temporal_layers_;
>+  std::unique_ptr<VideoBitrateAllocator> bitrate_allocator_;
>+  std::vector<std::unique_ptr<Vp8TemporalLayers>> temporal_layers_;
> };
>
> TEST_F(VideoCodecInitializerTest, SingleStreamVp8Screenshare) {
>@@ -134,9 +137,8 @@ TEST_F(VideoCodecInitializerTest, SingleStreamVp8Screenshare) {
>   streams_.push_back(DefaultStream());
>   EXPECT_TRUE(InitializeCodec());
>
>-  VideoBitrateAllocation bitrate_allocation =
>-      bitrate_allocator_out_->GetAllocation(kDefaultTargetBitrateBps,
>-                                            kDefaultFrameRate);
>+  VideoBitrateAllocation bitrate_allocation = bitrate_allocator_->GetAllocation(
>+      kDefaultTargetBitrateBps, kDefaultFrameRate);
>   EXPECT_EQ(1u, codec_out_.numberOfSimulcastStreams);
>   EXPECT_EQ(1u, codec_out_.VP8()->numberOfTemporalLayers);
>   EXPECT_EQ(kDefaultTargetBitrateBps, bitrate_allocation.get_sum_bps());
>@@ -149,9 +151,8 @@ TEST_F(VideoCodecInitializerTest, SingleStreamVp8ScreenshareInactive) {
>   streams_.push_back(inactive_stream);
>   EXPECT_TRUE(InitializeCodec());
>
>-  VideoBitrateAllocation bitrate_allocation =
>-      bitrate_allocator_out_->GetAllocation(kDefaultTargetBitrateBps,
>-                                            kDefaultFrameRate);
>+  VideoBitrateAllocation bitrate_allocation = bitrate_allocator_->GetAllocation(
>+      kDefaultTargetBitrateBps, kDefaultFrameRate);
>   EXPECT_EQ(1u, codec_out_.numberOfSimulcastStreams);
>   EXPECT_EQ(1u, codec_out_.VP8()->numberOfTemporalLayers);
>   EXPECT_EQ(0U, bitrate_allocation.get_sum_bps());
>@@ -164,9 +165,8 @@ TEST_F(VideoCodecInitializerTest, TemporalLayeredVp8Screenshare) {
>
>   EXPECT_EQ(1u, codec_out_.numberOfSimulcastStreams);
>   EXPECT_EQ(2u, codec_out_.VP8()->numberOfTemporalLayers);
>-  VideoBitrateAllocation bitrate_allocation =
>-      bitrate_allocator_out_->GetAllocation(kScreenshareCodecTargetBitrateBps,
>-                                            kScreenshareDefaultFramerate);
>+  VideoBitrateAllocation bitrate_allocation = bitrate_allocator_->GetAllocation(
>+      kScreenshareCodecTargetBitrateBps, kScreenshareDefaultFramerate);
>   EXPECT_EQ(kScreenshareCodecTargetBitrateBps,
>             bitrate_allocation.get_sum_bps());
>   EXPECT_EQ(kScreenshareTl0BitrateBps, bitrate_allocation.GetBitrate(0, 0));
>@@ -184,9 +184,8 @@ TEST_F(VideoCodecInitializerTest, SimulcastVp8Screenshare) {
>   EXPECT_EQ(1u, codec_out_.VP8()->numberOfTemporalLayers);
>   const uint32_t max_bitrate_bps =
>       streams_[0].target_bitrate_bps + streams_[1].max_bitrate_bps;
>-  VideoBitrateAllocation bitrate_allocation =
>-      bitrate_allocator_out_->GetAllocation(max_bitrate_bps,
>-                                            kScreenshareDefaultFramerate);
>+  VideoBitrateAllocation bitrate_allocation = bitrate_allocator_->GetAllocation(
>+      max_bitrate_bps, kScreenshareDefaultFramerate);
>   EXPECT_EQ(max_bitrate_bps, bitrate_allocation.get_sum_bps());
>   EXPECT_EQ(static_cast<uint32_t>(streams_[0].target_bitrate_bps),
>             bitrate_allocation.GetSpatialLayerSum(0));
>@@ -209,9 +208,8 @@ TEST_F(VideoCodecInitializerTest, SimulcastVp8ScreenshareInactive) {
>   EXPECT_EQ(1u, codec_out_.VP8()->numberOfTemporalLayers);
>   const uint32_t target_bitrate =
>       streams_[0].target_bitrate_bps + streams_[1].target_bitrate_bps;
>-  VideoBitrateAllocation bitrate_allocation =
>-      bitrate_allocator_out_->GetAllocation(target_bitrate,
>-                                            kScreenshareDefaultFramerate);
>+  VideoBitrateAllocation bitrate_allocation = bitrate_allocator_->GetAllocation(
>+      target_bitrate, kScreenshareDefaultFramerate);
>   EXPECT_EQ(static_cast<uint32_t>(streams_[0].max_bitrate_bps),
>             bitrate_allocation.get_sum_bps());
>   EXPECT_EQ(static_cast<uint32_t>(streams_[0].max_bitrate_bps),
>@@ -234,7 +232,7 @@ TEST_F(VideoCodecInitializerTest, HighFpsSimulcastVp8Screenshare) {
>   const uint32_t max_bitrate_bps =
>       streams_[0].target_bitrate_bps + streams_[1].max_bitrate_bps;
>   VideoBitrateAllocation bitrate_allocation =
>-      bitrate_allocator_out_->GetAllocation(max_bitrate_bps, kDefaultFrameRate);
>+      bitrate_allocator_->GetAllocation(max_bitrate_bps, kDefaultFrameRate);
>   EXPECT_EQ(max_bitrate_bps, bitrate_allocation.get_sum_bps());
>   EXPECT_EQ(static_cast<uint32_t>(streams_[0].target_bitrate_bps),
>             bitrate_allocation.GetSpatialLayerSum(0));
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_coding_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_coding_impl.cc
>index a061c18fd70e222ecccb742fa075df31516c2509..58397ad386d3221e03fb72b6dbfe0716578c83b6 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_coding_impl.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_coding_impl.cc
>@@ -13,6 +13,7 @@
> #include <algorithm>
> #include <utility>
>
>+#include "api/video/builtin_video_bitrate_allocator_factory.h"
> #include "api/video/video_bitrate_allocator.h"
> #include "common_types.h"  // NOLINT(build/include)
> #include "common_video/libyuv/include/webrtc_libyuv.h"
>@@ -27,10 +28,6 @@
> #include "system_wrappers/include/clock.h"
>
> namespace webrtc {
>-EventWrapper* EventFactoryImpl::CreateEvent() {
>-  return EventWrapper::Create();
>-}
>-
> namespace vcm {
>
> int64_t VCMProcessTimer::Period() const {
>@@ -81,17 +78,13 @@ class EncodedImageCallbackWrapper : public EncodedImageCallback {
> class VideoCodingModuleImpl : public VideoCodingModule {
>  public:
>   VideoCodingModuleImpl(Clock* clock,
>-                        EventFactory* event_factory,
>                         NackSender* nack_sender,
>                         KeyFrameRequestSender* keyframe_request_sender)
>       : VideoCodingModule(),
>         sender_(clock, &post_encode_callback_),
>+        rate_allocator_factory_(CreateBuiltinVideoBitrateAllocatorFactory()),
>         timing_(new VCMTiming(clock)),
>-        receiver_(clock,
>-                  event_factory,
>-                  timing_.get(),
>-                  nack_sender,
>-                  keyframe_request_sender) {}
>+        receiver_(clock, timing_.get(), nack_sender, keyframe_request_sender) {}
>
>   virtual ~VideoCodingModuleImpl() {}
>
>@@ -114,7 +107,8 @@ class VideoCodingModuleImpl : public VideoCodingModule {
>     // asynchronously keep the instance alive until destruction or until a
>     // new send codec is registered.
>     VideoCodec codec = *sendCodec;
>-    rate_allocator_ = VideoCodecInitializer::CreateBitrateAllocator(codec);
>+    rate_allocator_ =
>+        rate_allocator_factory_->CreateVideoBitrateAllocator(codec);
>     return sender_.RegisterSendCodec(&codec, numberOfCores, maxPayloadSize);
>   }
>   return sender_.RegisterSendCodec(sendCodec, numberOfCores, maxPayloadSize);
>@@ -142,7 +136,7 @@ class VideoCodingModuleImpl : public VideoCodingModule {
>
>   int32_t AddVideoFrame(const VideoFrame& videoFrame,
>                         const CodecSpecificInfo* codecSpecificInfo) override {
>-    return sender_.AddVideoFrame(videoFrame, codecSpecificInfo);
>+    return sender_.AddVideoFrame(videoFrame, codecSpecificInfo, absl::nullopt);
>   }
>
>   int32_t IntraFrameRequest(size_t stream_index) override {
>@@ -213,19 +207,18 @@ class VideoCodingModuleImpl : public VideoCodingModule {
>   rtc::ThreadChecker construction_thread_;
>   EncodedImageCallbackWrapper post_encode_callback_;
>   vcm::VideoSender sender_;
>+  const std::unique_ptr<VideoBitrateAllocatorFactory> rate_allocator_factory_;
>   std::unique_ptr<VideoBitrateAllocator> rate_allocator_;
>-  std::unique_ptr<VCMTiming> timing_;
>+  const std::unique_ptr<VCMTiming> timing_;
>   vcm::VideoReceiver receiver_;
> };
> }  // namespace
>
> // DEPRECATED.  Create method for current interface, will be removed when the
> // new jitter buffer is in place.
>-VideoCodingModule* VideoCodingModule::Create(Clock* clock,
>-                                             EventFactory* event_factory) {
>+VideoCodingModule* VideoCodingModule::Create(Clock* clock) {
>   RTC_DCHECK(clock);
>-  RTC_DCHECK(event_factory);
>-  return new VideoCodingModuleImpl(clock, event_factory, nullptr, nullptr);
>+  return new VideoCodingModuleImpl(clock, nullptr, nullptr);
> }
>
> }  // namespace webrtc
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_coding_impl.h b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_coding_impl.h
>index 3c09f96e269f9fc0aa3e4d0c148fc2d22ff30665..e27a2fe76011ee5d2fa64d352bec87a55ce28fba 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_coding_impl.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_coding_impl.h
>@@ -17,7 +17,7 @@
> #include <string>
> #include <vector>
>
>-#include "common_video/include/frame_callback.h"
>+#include "absl/types/optional.h"
> #include "modules/video_coding/decoder_database.h"
> #include "modules/video_coding/encoder_database.h"
> #include "modules/video_coding/frame_buffer.h"
>@@ -94,7 +94,8 @@ class VideoSender {
>       VideoBitrateAllocationObserver* bitrate_updated_callback);
>
>   int32_t AddVideoFrame(const VideoFrame& videoFrame,
>-                        const CodecSpecificInfo* codecSpecificInfo);
>+                        const CodecSpecificInfo* codecSpecificInfo,
>+                        absl::optional<VideoEncoder::EncoderInfo> encoder_info);
>
>   int32_t IntraFrameRequest(size_t stream_index);
>   int32_t EnableFrameDropper(bool enable);
>@@ -113,7 +114,18 @@ class VideoSender {
>   VCMEncodedFrameCallback _encodedFrameCallback RTC_GUARDED_BY(encoder_crit_);
>   EncodedImageCallback* const post_encode_callback_;
>   VCMEncoderDataBase _codecDataBase RTC_GUARDED_BY(encoder_crit_);
>-  bool frame_dropper_enabled_ RTC_GUARDED_BY(encoder_crit_);
>+
>+  // |frame_dropper_requested_| specifies if the user of this class has
>+  // requested frame dropping to be enabled, via EnableFrameDropper().
>+  // Depending on video encoder configuration, this setting may be overridden
>+  // and the frame dropper be force disabled. If so,
>+  // |force_disable_frame_dropper_| will be set to true.
>+  // If frame dropper is requested, and is not force disabled, frame dropping
>+  // might still be disabled if VideoEncoder::GetEncoderInfo() indicates that
>+  // the encoder has a trusted rate controller. This is determined on a
>+  // per-frame basis, as the encoder behavior might dynamically change.
>+  bool frame_dropper_requested_ RTC_GUARDED_BY(encoder_crit_);
>+  bool force_disable_frame_dropper_ RTC_GUARDED_BY(encoder_crit_);
>
>   // Must be accessed on the construction thread of VideoSender.
>   VideoCodec current_codec_;
>@@ -128,7 +140,6 @@ class VideoSender {
> class VideoReceiver : public Module {
>  public:
>   VideoReceiver(Clock* clock,
>-                EventFactory* event_factory,
>                 VCMTiming* timing,
>                 NackSender* nack_sender = nullptr,
>                 KeyFrameRequestSender* keyframe_request_sender = nullptr);
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_packet_buffer_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_packet_buffer_unittest.cc
>index 425542c949c0365222e4aca04b5f1c6713b8a68d..13e64eebab41f09d73a8c51a8b439a247f4f8a99 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_packet_buffer_unittest.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_packet_buffer_unittest.cc
>@@ -120,14 +120,7 @@ TEST_F(TestPacketBuffer, InsertDuplicatePacket) {
>   EXPECT_TRUE(Insert(seq_num, kKeyFrame, kFirst, kLast));
> }
>
>-#if defined(WEBRTC_ANDROID)
>-// Fails on android after clang update
>-// TODO(crbug.com/887464): Reenable this
>-#define MAYBE_SeqNumWrapOneFrame DISABLED_SeqNumWrapOneFrame
>-#else
>-#define MAYBE_SeqNumWrapOneFrame SeqNumWrapOneFrame
>-#endif
>-TEST_F(TestPacketBuffer, MAYBE_SeqNumWrapOneFrame) {
>+TEST_F(TestPacketBuffer, SeqNumWrapOneFrame) {
>   EXPECT_TRUE(Insert(0xFFFF, kKeyFrame, kFirst, kNotLast));
>   EXPECT_TRUE(Insert(0x0, kKeyFrame, kNotFirst, kLast));
>
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_receiver.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_receiver.cc
>index d94ba5e69f9ef70c0458dcb2a2e526e8ae3388b1..8b4ee785eef7987c249e04251b32f49063329712 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_receiver.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_receiver.cc
>@@ -26,7 +26,6 @@ namespace webrtc {
> namespace vcm {
>
> VideoReceiver::VideoReceiver(Clock* clock,
>-                             EventFactory* event_factory,
>                              VCMTiming* timing,
>                              NackSender* nack_sender,
>                              KeyFrameRequestSender* keyframe_request_sender)
>@@ -34,7 +33,6 @@ VideoReceiver::VideoReceiver(Clock* clock,
>       _timing(timing),
>       _receiver(_timing,
>                 clock_,
>-                event_factory,
>                 nack_sender,
>                 keyframe_request_sender),
>       _decodedFrameCallback(_timing, clock_),
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_receiver_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_receiver_unittest.cc
>index b7a46f92ea88e7ee2d5bd1514e35d01b14922bf3..f99dac49200a3bc955a8875bec7b6dafdf6da074 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_receiver_unittest.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_receiver_unittest.cc
>@@ -11,10 +11,9 @@
> #include <memory>
> #include <vector>
>
>+#include "api/test/mock_video_decoder.h"
> #include "modules/video_coding/include/mock/mock_vcm_callbacks.h"
>-#include "modules/video_coding/include/mock/mock_video_codec_interface.h"
> #include "modules/video_coding/include/video_coding.h"
>-#include "modules/video_coding/test/test_util.h"
> #include "modules/video_coding/timing.h"
> #include "modules/video_coding/video_coding_impl.h"
> #include "system_wrappers/include/clock.h"
>@@ -37,7 +36,7 @@ class TestVideoReceiver : public ::testing::Test {
>
>   virtual void SetUp() {
>     timing_.reset(new VCMTiming(&clock_));
>-    receiver_.reset(new VideoReceiver(&clock_, &event_factory_, timing_.get()));
>+    receiver_.reset(new VideoReceiver(&clock_, timing_.get()));
>     receiver_->RegisterExternalDecoder(&decoder_, kUnusedPayloadType);
>     const size_t kMaxNackListSize = 250;
>     const int kMaxPacketAgeToNack = 450;
>@@ -81,7 +80,6 @@ class TestVideoReceiver : public ::testing::Test {
>   }
>
>   SimulatedClock clock_;
>-  NullEventFactory event_factory_;
>   VideoCodec settings_;
>   NiceMock<MockVideoDecoder> decoder_;
>   NiceMock<MockPacketRequestCallback> packet_request_callback_;
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_sender.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_sender.cc
>index 4c1e24361204dc059b8e37adaffa401b6e5363ef..4bd52530c9f5020a7fbc2786c8cf459a0ab32a77 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_sender.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_sender.cc
>@@ -40,7 +40,8 @@ VideoSender::VideoSender(Clock* clock,
>       _encodedFrameCallback(post_encode_callback, &_mediaOpt),
>       post_encode_callback_(post_encode_callback),
>       _codecDataBase(&_encodedFrameCallback),
>-      frame_dropper_enabled_(true),
>+      frame_dropper_requested_(true),
>+      force_disable_frame_dropper_(false),
>       current_codec_(),
>       encoder_params_({VideoBitrateAllocation(), 0, 0, 0}),
>       encoder_has_internal_source_(false),
>@@ -96,15 +97,12 @@ int32_t VideoSender::RegisterSendCodec(const VideoCodec* sendCodec,
>     numLayers = 1;
>   }
>
>-  // If we have screensharing and we have layers, we disable frame dropper.
>-  const bool disable_frame_dropper =
>+  // Force-disable frame dropper if either:
>+  //  * We have screensharing with layers.
>+  //  * "WebRTC-FrameDropper" field trial is "Disabled".
>+  force_disable_frame_dropper_ =
>       field_trial::IsDisabled(kFrameDropperFieldTrial) ||
>       (numLayers > 1 && sendCodec->mode == VideoCodecMode::kScreensharing);
>-  if (disable_frame_dropper) {
>-    _mediaOpt.EnableFrameDropper(false);
>-  } else if (frame_dropper_enabled_) {
>-    _mediaOpt.EnableFrameDropper(true);
>-  }
>
>   {
>     rtc::CritScope cs(&params_crit_);
>@@ -247,8 +245,10 @@ void VideoSender::SetEncoderParameters(EncoderParameters params,
> }
>
> // Add one raw video frame to the encoder, blocking.
>-int32_t VideoSender::AddVideoFrame(const VideoFrame& videoFrame,
>-                                   const CodecSpecificInfo* codecSpecificInfo) {
>+int32_t VideoSender::AddVideoFrame(
>+    const VideoFrame& videoFrame,
>+    const CodecSpecificInfo* codecSpecificInfo,
>+    absl::optional<VideoEncoder::EncoderInfo> encoder_info) {
>   EncoderParameters encoder_params;
>   std::vector<FrameType> next_frame_types;
>   bool encoder_has_internal_source = false;
>@@ -262,6 +262,17 @@ int32_t VideoSender::AddVideoFrame(const VideoFrame& videoFrame,
>   if (_encoder == nullptr)
>     return VCM_UNINITIALIZED;
>   SetEncoderParameters(encoder_params, encoder_has_internal_source);
>+  if (!encoder_info) {
>+    encoder_info = _encoder->GetEncoderInfo();
>+  }
>+
>+  // Frame dropping is enabled iff frame dropping has been requested, and
>+  // frame dropping is not force-disabled, and rate controller is not trusted.
>+  const bool frame_dropping_enabled =
>+      frame_dropper_requested_ && !force_disable_frame_dropper_ &&
>+      !encoder_info->has_trusted_rate_controller;
>+  _mediaOpt.EnableFrameDropper(frame_dropping_enabled);
>+
>   if (_mediaOpt.DropFrame()) {
>     RTC_LOG(LS_VERBOSE) << "Drop Frame "
>                         << "target bitrate "
>@@ -287,7 +298,7 @@ int32_t VideoSender::AddVideoFrame(const VideoFrame& videoFrame,
>   const bool is_buffer_type_supported =
>       buffer_type == VideoFrameBuffer::Type::kI420 ||
>       (buffer_type == VideoFrameBuffer::Type::kNative &&
>-       _encoder->SupportsNativeHandle());
>+       encoder_info->supports_native_handle);
>   if (!is_buffer_type_supported) {
>     // This module only supports software encoding.
>     // TODO(pbos): Offload conversion from the encoder thread.
>@@ -355,7 +366,7 @@ int32_t VideoSender::IntraFrameRequest(size_t stream_index) {
>
> int32_t VideoSender::EnableFrameDropper(bool enable) {
>   rtc::CritScope lock(&encoder_crit_);
>-  frame_dropper_enabled_ = enable;
>+  frame_dropper_requested_ = enable;
>   _mediaOpt.EnableFrameDropper(enable);
>   return VCM_OK;
> }
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_sender_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_sender_unittest.cc
>index 3cd9caf65c306798f888f9dc5ca7c701ad32d9b8..c2ff0f6207c129a9dc838bb7e855fb1ff3aaa805 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_sender_unittest.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/video_sender_unittest.cc
>@@ -11,11 +11,11 @@
> #include <memory>
> #include <vector>
>
>+#include "api/test/mock_video_encoder.h"
> #include "api/video/i420_buffer.h"
>+#include "api/video_codecs/vp8_temporal_layers.h"
> #include "modules/video_coding/codecs/vp8/include/vp8.h"
>-#include "modules/video_coding/codecs/vp8/include/vp8_temporal_layers.h"
> #include "modules/video_coding/include/mock/mock_vcm_callbacks.h"
>-#include
"modules/video_coding/include/mock/mock_video_codec_interface.h" > #include "modules/video_coding/include/video_coding.h" > #include "modules/video_coding/utility/default_video_bitrate_allocator.h" > #include "modules/video_coding/utility/simulcast_rate_allocator.h" >@@ -186,7 +186,10 @@ class TestVideoSender : public ::testing::Test { > > void AddFrame() { > assert(generator_.get()); >- sender_->AddVideoFrame(*generator_->NextFrame(), NULL); >+ sender_->AddVideoFrame(*generator_->NextFrame(), nullptr, >+ encoder_ ? absl::optional<VideoEncoder::EncoderInfo>( >+ encoder_->GetEncoderInfo()) >+ : absl::nullopt); > } > > SimulatedClock clock_; >@@ -369,9 +372,6 @@ TEST_F(TestVideoSenderWithMockEncoder, > // Expect initial call to SetChannelParameters. Rates are initialized through > // InitEncode and expects no additional call before the framerate (or bitrate) > // updates. >- EXPECT_CALL(encoder_, SetChannelParameters(kLossRate, kRtt)) >- .Times(1) >- .WillOnce(Return(0)); > sender_->SetChannelParameters(settings_.startBitrate * 1000, kLossRate, kRtt, > rate_allocator_.get(), nullptr); > while (clock_.TimeInMilliseconds() < start_time + kRateStatsWindowMs) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/BUILD.gn >deleted file mode 100644 >index b65165a037be5fcedfb11ec0f8dbf50ac6285d55..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/BUILD.gn >+++ /dev/null >@@ -1,108 +0,0 @@ >-# Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. >-# >-# Use of this source code is governed by a BSD-style license >-# that can be found in the LICENSE file in the root of the source >-# tree. An additional intellectual property rights grant can be found >-# in the file PATENTS. All contributing project authors may >-# be found in the AUTHORS file in the root of the source tree. 
>- >-import("../webrtc.gni") >-if (is_android) { >- import("//build/config/android/config.gni") >- import("//build/config/android/rules.gni") >-} >- >-rtc_static_library("ortc") { >- defines = [] >- sources = [ >- "ortcfactory.cc", >- "ortcfactory.h", >- "ortcrtpreceiveradapter.cc", >- "ortcrtpreceiveradapter.h", >- "ortcrtpsenderadapter.cc", >- "ortcrtpsenderadapter.h", >- "rtptransportadapter.cc", >- "rtptransportadapter.h", >- "rtptransportcontrolleradapter.cc", >- "rtptransportcontrolleradapter.h", >- ] >- >- # TODO(deadbeef): Create a separate target for the common things ORTC and >- # PeerConnection code shares, so that ortc can depend on that instead of >- # libjingle_peerconnection. >- deps = [ >- "../api:libjingle_peerconnection_api", >- "../api:ortc_api", >- "../api/video_codecs:builtin_video_decoder_factory", >- "../api/video_codecs:builtin_video_encoder_factory", >- "../call:call_interfaces", >- "../call:rtp_sender", >- "../logging:rtc_event_log_api", >- "../logging:rtc_event_log_impl_base", >- "../media:rtc_audio_video", >- "../media:rtc_media", >- "../media:rtc_media_base", >- "../modules/audio_processing:audio_processing", >- "../p2p:rtc_p2p", >- "../pc:libjingle_peerconnection", >- "../pc:peerconnection", >- "../pc:rtc_pc", >- "../pc:rtc_pc_base", >- "../rtc_base:checks", >- "../rtc_base:rtc_base", >- "../rtc_base:rtc_base_approved", >- "../rtc_base/third_party/sigslot", >- "//third_party/abseil-cpp/absl/memory", >- "//third_party/abseil-cpp/absl/types:optional", >- ] >- >- if (!build_with_chromium && is_clang) { >- # Suppress warnings from the Chromium Clang plugin (bugs.webrtc.org/163). 
>- suppressed_configs += [ "//build/config/clang:find_bad_constructs" ] >- } >-} >- >-if (rtc_include_tests) { >- rtc_test("ortc_unittests") { >- testonly = true >- >- sources = [ >- "ortcfactory_integrationtest.cc", >- "ortcfactory_unittest.cc", >- "ortcrtpreceiver_unittest.cc", >- "ortcrtpsender_unittest.cc", >- "rtptransport_unittest.cc", >- "rtptransportcontroller_unittest.cc", >- "srtptransport_unittest.cc", >- "testrtpparameters.cc", >- "testrtpparameters.h", >- ] >- >- deps = [ >- ":ortc", >- "../api:libjingle_peerconnection_api", >- "../api:ortc_api", >- "../api/audio_codecs:builtin_audio_decoder_factory", >- "../api/audio_codecs:builtin_audio_encoder_factory", >- "../media:rtc_media_tests_utils", >- "../p2p:p2p_test_utils", >- "../p2p:rtc_p2p", >- "../pc:pc_test_utils", >- "../pc:peerconnection", >- "../rtc_base:rtc_base", >- "../rtc_base:rtc_base_approved", >- "../rtc_base:rtc_base_tests_main", >- "../rtc_base:rtc_base_tests_utils", >- "../rtc_base/system:arch", >- ] >- >- if (!build_with_chromium && is_clang) { >- # Suppress warnings from the Chromium Clang plugin (bugs.webrtc.org/163). 
>- suppressed_configs += [ "//build/config/clang:find_bad_constructs" ] >- } >- >- if (is_android) { >- deps += [ "//testing/android/native_test:native_test_support" ] >- } >- } >-} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/DEPS b/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/DEPS >deleted file mode 100644 >index bebf030b8290e6290c9326999d1e5eadc17fe6fd..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/DEPS >+++ /dev/null >@@ -1,17 +0,0 @@ >-include_rules = [ >- "+api", >- "+call", >- "+logging/rtc_event_log", >- "+media", >- "+modules/audio_coding", >- "+modules/audio_processing", >- "+p2p", >- "+pc", >- >- "+modules/rtp_rtcp", >- "+system_wrappers", >- >- "+modules/audio_device", >- "+modules/video_coding", >- "+modules/video_render", >-] >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/OWNERS b/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/OWNERS >deleted file mode 100644 >index 4c08fff50b8f2ea7393d9ab9d3e75d7e4b6cd301..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/OWNERS >+++ /dev/null >@@ -1,7 +0,0 @@ >-pthatcher@webrtc.org >-steveanton@webrtc.org >- >-# These are for the common case of adding or renaming files. If you're doing >-# structural changes, please get a review from a reviewer in this file. >-per-file *.gn=* >-per-file *.gni=* >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcfactory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcfactory.cc >deleted file mode 100644 >index cb63d26174ba6786409544fc154e2b1355ef2f59..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcfactory.cc >+++ /dev/null >@@ -1,563 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. 
An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include "ortc/ortcfactory.h" >- >-#include <utility> // For std::move. >-#include <vector> >- >-#include "absl/memory/memory.h" >-#include "api/mediastreamtrackproxy.h" >-#include "api/proxy.h" >-#include "api/rtcerror.h" >-#include "api/video_codecs/builtin_video_decoder_factory.h" >-#include "api/video_codecs/builtin_video_encoder_factory.h" >-#include "api/videosourceproxy.h" >-#include "logging/rtc_event_log/rtc_event_log.h" >-#include "media/base/mediaconstants.h" >-#include "media/base/rtpdataengine.h" >-#include "modules/audio_processing/include/audio_processing.h" >-#include "ortc/ortcrtpreceiveradapter.h" >-#include "ortc/ortcrtpsenderadapter.h" >-#include "ortc/rtptransportadapter.h" >-#include "ortc/rtptransportcontrolleradapter.h" >-#include "p2p/base/basicpacketsocketfactory.h" >-#include "p2p/base/udptransport.h" >-#include "pc/audiotrack.h" >-#include "pc/channelmanager.h" >-#include "pc/localaudiosource.h" >-#include "pc/rtpparametersconversion.h" >-#include "pc/videotrack.h" >-#include "rtc_base/asyncpacketsocket.h" >-#include "rtc_base/bind.h" >-#include "rtc_base/checks.h" >-#include "rtc_base/helpers.h" >-#include "rtc_base/logging.h" >-#include "rtc_base/strings/string_builder.h" >- >-namespace { >- >-const int kDefaultRtcpCnameLength = 16; >- >-// Asserts that all of the built-in capabilities can be converted to >-// RtpCapabilities. If they can't, something's wrong (for example, maybe a new >-// feedback mechanism is supported, but an enum value wasn't added to >-// rtpparameters.h). 
>-template <typename C> >-webrtc::RtpCapabilities ToRtpCapabilitiesWithAsserts( >- const std::vector<C>& cricket_codecs, >- const cricket::RtpHeaderExtensions& cricket_extensions) { >- webrtc::RtpCapabilities capabilities = >- webrtc::ToRtpCapabilities(cricket_codecs, cricket_extensions); >- RTC_DCHECK_EQ(capabilities.codecs.size(), cricket_codecs.size()); >- for (size_t i = 0; i < capabilities.codecs.size(); ++i) { >- RTC_DCHECK_EQ(capabilities.codecs[i].rtcp_feedback.size(), >- cricket_codecs[i].feedback_params.params().size()); >- } >- RTC_DCHECK_EQ(capabilities.header_extensions.size(), >- cricket_extensions.size()); >- return capabilities; >-} >- >-} // namespace >- >-namespace webrtc { >- >-// Note that this proxy class uses the network thread as the "worker" thread. >-BEGIN_OWNED_PROXY_MAP(OrtcFactory) >-PROXY_SIGNALING_THREAD_DESTRUCTOR() >-PROXY_METHOD0(RTCErrorOr<std::unique_ptr<RtpTransportControllerInterface>>, >- CreateRtpTransportController) >-PROXY_METHOD4(RTCErrorOr<std::unique_ptr<RtpTransportInterface>>, >- CreateRtpTransport, >- const RtpTransportParameters&, >- PacketTransportInterface*, >- PacketTransportInterface*, >- RtpTransportControllerInterface*) >- >-PROXY_METHOD4(RTCErrorOr<std::unique_ptr<SrtpTransportInterface>>, >- CreateSrtpTransport, >- const RtpTransportParameters&, >- PacketTransportInterface*, >- PacketTransportInterface*, >- RtpTransportControllerInterface*) >- >-PROXY_CONSTMETHOD1(RtpCapabilities, >- GetRtpSenderCapabilities, >- cricket::MediaType) >-PROXY_METHOD2(RTCErrorOr<std::unique_ptr<OrtcRtpSenderInterface>>, >- CreateRtpSender, >- rtc::scoped_refptr<MediaStreamTrackInterface>, >- RtpTransportInterface*) >-PROXY_METHOD2(RTCErrorOr<std::unique_ptr<OrtcRtpSenderInterface>>, >- CreateRtpSender, >- cricket::MediaType, >- RtpTransportInterface*) >-PROXY_CONSTMETHOD1(RtpCapabilities, >- GetRtpReceiverCapabilities, >- cricket::MediaType) >-PROXY_METHOD2(RTCErrorOr<std::unique_ptr<OrtcRtpReceiverInterface>>, >- 
CreateRtpReceiver, >- cricket::MediaType, >- RtpTransportInterface*) >-PROXY_WORKER_METHOD3(RTCErrorOr<std::unique_ptr<UdpTransportInterface>>, >- CreateUdpTransport, >- int, >- uint16_t, >- uint16_t) >-PROXY_METHOD1(rtc::scoped_refptr<AudioSourceInterface>, >- CreateAudioSource, >- const cricket::AudioOptions&) >-PROXY_METHOD2(rtc::scoped_refptr<VideoTrackInterface>, >- CreateVideoTrack, >- const std::string&, >- VideoTrackSourceInterface*) >-PROXY_METHOD2(rtc::scoped_refptr<AudioTrackInterface>, >- CreateAudioTrack, >- const std::string&, >- AudioSourceInterface*) >-END_PROXY_MAP() >- >-// static >-RTCErrorOr<std::unique_ptr<OrtcFactoryInterface>> OrtcFactory::Create( >- rtc::Thread* network_thread, >- rtc::Thread* signaling_thread, >- rtc::NetworkManager* network_manager, >- rtc::PacketSocketFactory* socket_factory, >- AudioDeviceModule* adm, >- std::unique_ptr<cricket::MediaEngineInterface> media_engine, >- rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >- rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory) { >- // Hop to signaling thread if needed. 
>- if (signaling_thread && !signaling_thread->IsCurrent()) { >- return signaling_thread >- ->Invoke<RTCErrorOr<std::unique_ptr<OrtcFactoryInterface>>>( >- RTC_FROM_HERE, >- rtc::Bind(&OrtcFactory::Create_s, network_thread, signaling_thread, >- network_manager, socket_factory, adm, >- media_engine.release(), audio_encoder_factory, >- audio_decoder_factory)); >- } >- return Create_s(network_thread, signaling_thread, network_manager, >- socket_factory, adm, media_engine.release(), >- audio_encoder_factory, audio_decoder_factory); >-} >- >-RTCErrorOr<std::unique_ptr<OrtcFactoryInterface>> OrtcFactoryInterface::Create( >- rtc::Thread* network_thread, >- rtc::Thread* signaling_thread, >- rtc::NetworkManager* network_manager, >- rtc::PacketSocketFactory* socket_factory, >- AudioDeviceModule* adm, >- rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >- rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory) { >- return OrtcFactory::Create(network_thread, signaling_thread, network_manager, >- socket_factory, adm, nullptr, >- audio_encoder_factory, audio_decoder_factory); >-} >- >-OrtcFactory::OrtcFactory( >- rtc::Thread* network_thread, >- rtc::Thread* signaling_thread, >- rtc::NetworkManager* network_manager, >- rtc::PacketSocketFactory* socket_factory, >- AudioDeviceModule* adm, >- rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >- rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory) >- : network_thread_(network_thread), >- signaling_thread_(signaling_thread), >- network_manager_(network_manager), >- socket_factory_(socket_factory), >- adm_(adm), >- null_event_log_(RtcEventLog::CreateNull()), >- audio_encoder_factory_(audio_encoder_factory), >- audio_decoder_factory_(audio_decoder_factory) { >- if (!rtc::CreateRandomString(kDefaultRtcpCnameLength, &default_cname_)) { >- RTC_LOG(LS_ERROR) << "Failed to generate CNAME?"; >- RTC_NOTREACHED(); >- } >- if (!network_thread_) { >- owned_network_thread_ = 
rtc::Thread::CreateWithSocketServer(); >- owned_network_thread_->Start(); >- network_thread_ = owned_network_thread_.get(); >- } >- >- // The worker thread is created internally because it's an implementation >- // detail, and consumers of the API don't need to really know about it. >- worker_thread_ = rtc::Thread::Create(); >- worker_thread_->SetName("ORTC-worker", this); >- worker_thread_->Start(); >- >- if (signaling_thread_) { >- RTC_DCHECK_RUN_ON(signaling_thread_); >- } else { >- signaling_thread_ = rtc::Thread::Current(); >- if (!signaling_thread_) { >- // If this thread isn't already wrapped by an rtc::Thread, create a >- // wrapper and own it in this class. >- signaling_thread_ = rtc::ThreadManager::Instance()->WrapCurrentThread(); >- wraps_signaling_thread_ = true; >- } >- } >- >- if (signaling_thread_->name().empty()) { >- signaling_thread_->SetName("ORTC-signaling", this); >- } >- >- if (!network_manager_) { >- owned_network_manager_.reset(new rtc::BasicNetworkManager()); >- network_manager_ = owned_network_manager_.get(); >- } >- if (!socket_factory_) { >- owned_socket_factory_.reset( >- new rtc::BasicPacketSocketFactory(network_thread_)); >- socket_factory_ = owned_socket_factory_.get(); >- } >-} >- >-OrtcFactory::~OrtcFactory() { >- RTC_DCHECK_RUN_ON(signaling_thread_); >- if (wraps_signaling_thread_) { >- rtc::ThreadManager::Instance()->UnwrapCurrentThread(); >- } >-} >- >-RTCErrorOr<std::unique_ptr<RtpTransportControllerInterface>> >-OrtcFactory::CreateRtpTransportController() { >- RTC_DCHECK_RUN_ON(signaling_thread_); >- return RtpTransportControllerAdapter::CreateProxied( >- cricket::MediaConfig(), channel_manager_.get(), null_event_log_.get(), >- signaling_thread_, worker_thread_.get(), network_thread_); >-} >- >-RTCErrorOr<std::unique_ptr<RtpTransportInterface>> >-OrtcFactory::CreateRtpTransport( >- const RtpTransportParameters& parameters, >- PacketTransportInterface* rtp, >- PacketTransportInterface* rtcp, >- RtpTransportControllerInterface* 
transport_controller) { >- RTC_DCHECK_RUN_ON(signaling_thread_); >- RtpTransportParameters copied_parameters = parameters; >- if (copied_parameters.rtcp.cname.empty()) { >- copied_parameters.rtcp.cname = default_cname_; >- } >- if (transport_controller) { >- return transport_controller->GetInternal()->CreateProxiedRtpTransport( >- copied_parameters, rtp, rtcp); >- } else { >- // If |transport_controller| is null, create one automatically, which the >- // returned RtpTransport will own. >- auto controller_result = CreateRtpTransportController(); >- if (!controller_result.ok()) { >- return controller_result.MoveError(); >- } >- auto controller = controller_result.MoveValue(); >- auto transport_result = >- controller->GetInternal()->CreateProxiedRtpTransport(copied_parameters, >- rtp, rtcp); >- // If RtpTransport was successfully created, transfer ownership of >- // |rtp_transport_controller|. Otherwise it will go out of scope and be >- // deleted automatically. >- if (transport_result.ok()) { >- transport_result.value() >- ->GetInternal() >- ->TakeOwnershipOfRtpTransportController(std::move(controller)); >- } >- return transport_result; >- } >-} >- >-RTCErrorOr<std::unique_ptr<SrtpTransportInterface>> >-OrtcFactory::CreateSrtpTransport( >- const RtpTransportParameters& parameters, >- PacketTransportInterface* rtp, >- PacketTransportInterface* rtcp, >- RtpTransportControllerInterface* transport_controller) { >- RTC_DCHECK_RUN_ON(signaling_thread_); >- RtpTransportParameters copied_parameters = parameters; >- if (copied_parameters.rtcp.cname.empty()) { >- copied_parameters.rtcp.cname = default_cname_; >- } >- if (transport_controller) { >- return transport_controller->GetInternal()->CreateProxiedSrtpTransport( >- copied_parameters, rtp, rtcp); >- } else { >- // If |transport_controller| is null, create one automatically, which the >- // returned SrtpTransport will own. 
>- auto controller_result = CreateRtpTransportController(); >- if (!controller_result.ok()) { >- return controller_result.MoveError(); >- } >- auto controller = controller_result.MoveValue(); >- auto transport_result = >- controller->GetInternal()->CreateProxiedSrtpTransport(copied_parameters, >- rtp, rtcp); >- // If SrtpTransport was successfully created, transfer ownership of >- // |rtp_transport_controller|. Otherwise it will go out of scope and be >- // deleted automatically. >- if (transport_result.ok()) { >- transport_result.value() >- ->GetInternal() >- ->TakeOwnershipOfRtpTransportController(std::move(controller)); >- } >- return transport_result; >- } >-} >- >-RtpCapabilities OrtcFactory::GetRtpSenderCapabilities( >- cricket::MediaType kind) const { >- RTC_DCHECK_RUN_ON(signaling_thread_); >- switch (kind) { >- case cricket::MEDIA_TYPE_AUDIO: { >- cricket::AudioCodecs cricket_codecs; >- cricket::RtpHeaderExtensions cricket_extensions; >- channel_manager_->GetSupportedAudioSendCodecs(&cricket_codecs); >- channel_manager_->GetSupportedAudioRtpHeaderExtensions( >- &cricket_extensions); >- return ToRtpCapabilitiesWithAsserts(cricket_codecs, cricket_extensions); >- } >- case cricket::MEDIA_TYPE_VIDEO: { >- cricket::VideoCodecs cricket_codecs; >- cricket::RtpHeaderExtensions cricket_extensions; >- channel_manager_->GetSupportedVideoCodecs(&cricket_codecs); >- channel_manager_->GetSupportedVideoRtpHeaderExtensions( >- &cricket_extensions); >- return ToRtpCapabilitiesWithAsserts(cricket_codecs, cricket_extensions); >- } >- case cricket::MEDIA_TYPE_DATA: >- return RtpCapabilities(); >- } >- // Not reached; avoids compile warning. 
>- FATAL(); >-} >- >-RTCErrorOr<std::unique_ptr<OrtcRtpSenderInterface>> >-OrtcFactory::CreateRtpSender( >- rtc::scoped_refptr<MediaStreamTrackInterface> track, >- RtpTransportInterface* transport) { >- RTC_DCHECK_RUN_ON(signaling_thread_); >- if (!track) { >- LOG_AND_RETURN_ERROR(RTCErrorType::INVALID_PARAMETER, >- "Cannot pass null track into CreateRtpSender."); >- } >- auto result = >- CreateRtpSender(cricket::MediaTypeFromString(track->kind()), transport); >- if (!result.ok()) { >- return result; >- } >- auto err = result.value()->SetTrack(track); >- if (!err.ok()) { >- return std::move(err); >- } >- return result; >-} >- >-RTCErrorOr<std::unique_ptr<OrtcRtpSenderInterface>> >-OrtcFactory::CreateRtpSender(cricket::MediaType kind, >- RtpTransportInterface* transport) { >- RTC_DCHECK_RUN_ON(signaling_thread_); >- if (kind == cricket::MEDIA_TYPE_DATA) { >- LOG_AND_RETURN_ERROR(RTCErrorType::INVALID_PARAMETER, >- "Cannot create data RtpSender."); >- } >- if (!transport) { >- LOG_AND_RETURN_ERROR(RTCErrorType::INVALID_PARAMETER, >- "Cannot pass null transport into CreateRtpSender."); >- } >- return transport->GetInternal() >- ->rtp_transport_controller() >- ->CreateProxiedRtpSender(kind, transport); >-} >- >-RtpCapabilities OrtcFactory::GetRtpReceiverCapabilities( >- cricket::MediaType kind) const { >- RTC_DCHECK_RUN_ON(signaling_thread_); >- switch (kind) { >- case cricket::MEDIA_TYPE_AUDIO: { >- cricket::AudioCodecs cricket_codecs; >- cricket::RtpHeaderExtensions cricket_extensions; >- channel_manager_->GetSupportedAudioReceiveCodecs(&cricket_codecs); >- channel_manager_->GetSupportedAudioRtpHeaderExtensions( >- &cricket_extensions); >- return ToRtpCapabilitiesWithAsserts(cricket_codecs, cricket_extensions); >- } >- case cricket::MEDIA_TYPE_VIDEO: { >- cricket::VideoCodecs cricket_codecs; >- cricket::RtpHeaderExtensions cricket_extensions; >- channel_manager_->GetSupportedVideoCodecs(&cricket_codecs); >- channel_manager_->GetSupportedVideoRtpHeaderExtensions( >- 
&cricket_extensions); >- return ToRtpCapabilitiesWithAsserts(cricket_codecs, cricket_extensions); >- } >- case cricket::MEDIA_TYPE_DATA: >- return RtpCapabilities(); >- } >- // Not reached; avoids compile warning. >- FATAL(); >-} >- >-RTCErrorOr<std::unique_ptr<OrtcRtpReceiverInterface>> >-OrtcFactory::CreateRtpReceiver(cricket::MediaType kind, >- RtpTransportInterface* transport) { >- RTC_DCHECK_RUN_ON(signaling_thread_); >- if (kind == cricket::MEDIA_TYPE_DATA) { >- LOG_AND_RETURN_ERROR(RTCErrorType::INVALID_PARAMETER, >- "Cannot create data RtpReceiver."); >- } >- if (!transport) { >- LOG_AND_RETURN_ERROR(RTCErrorType::INVALID_PARAMETER, >- "Cannot pass null transport into CreateRtpReceiver."); >- } >- return transport->GetInternal() >- ->rtp_transport_controller() >- ->CreateProxiedRtpReceiver(kind, transport); >-} >- >-// UdpTransport expects all methods to be called on one thread, which needs to >-// be the network thread, since that's where its socket can safely be used. So >-// return a proxy to the created UdpTransport. 
>-BEGIN_OWNED_PROXY_MAP(UdpTransport) >-PROXY_WORKER_THREAD_DESTRUCTOR() >-PROXY_WORKER_CONSTMETHOD0(rtc::SocketAddress, GetLocalAddress) >-PROXY_WORKER_METHOD1(bool, SetRemoteAddress, const rtc::SocketAddress&) >-PROXY_WORKER_CONSTMETHOD0(rtc::SocketAddress, GetRemoteAddress) >-protected: >-rtc::PacketTransportInternal* GetInternal() override { >- return internal(); >-} >-END_PROXY_MAP() >- >-RTCErrorOr<std::unique_ptr<UdpTransportInterface>> >-OrtcFactory::CreateUdpTransport(int family, >- uint16_t min_port, >- uint16_t max_port) { >- RTC_DCHECK_RUN_ON(network_thread_); >- if (family != AF_INET && family != AF_INET6) { >- LOG_AND_RETURN_ERROR(RTCErrorType::INVALID_PARAMETER, >- "Address family must be AF_INET or AF_INET6."); >- } >- if (min_port > max_port) { >- LOG_AND_RETURN_ERROR(RTCErrorType::INVALID_RANGE, >- "Port range invalid; minimum port must be less than " >- "or equal to max port."); >- } >- std::unique_ptr<rtc::AsyncPacketSocket> socket( >- socket_factory_->CreateUdpSocket( >- rtc::SocketAddress(rtc::GetAnyIP(family), 0), min_port, max_port)); >- if (!socket) { >- // Only log at warning level, because this method may be called with >- // specific port ranges to determine if a port is available, expecting the >- // possibility of an error. >- LOG_AND_RETURN_ERROR_EX(RTCErrorType::RESOURCE_EXHAUSTED, >- "Local socket allocation failure.", LS_WARNING); >- } >- RTC_LOG(LS_INFO) << "Created UDP socket with address " >- << socket->GetLocalAddress().ToSensitiveString() << "."; >- // Make a unique debug name (for logging/diagnostics only). 
>- rtc::StringBuilder oss; >- static int udp_id = 0; >- oss << "udp" << udp_id++; >- return UdpTransportProxyWithInternal<cricket::UdpTransport>::Create( >- signaling_thread_, network_thread_, >- std::unique_ptr<cricket::UdpTransport>( >- new cricket::UdpTransport(oss.str(), std::move(socket)))); >-} >- >-rtc::scoped_refptr<AudioSourceInterface> OrtcFactory::CreateAudioSource( >- const cricket::AudioOptions& options) { >- RTC_DCHECK_RUN_ON(signaling_thread_); >- return rtc::scoped_refptr<LocalAudioSource>( >- LocalAudioSource::Create(&options)); >-} >- >-rtc::scoped_refptr<VideoTrackInterface> OrtcFactory::CreateVideoTrack( >- const std::string& id, >- VideoTrackSourceInterface* source) { >- RTC_DCHECK_RUN_ON(signaling_thread_); >- rtc::scoped_refptr<VideoTrackInterface> track( >- VideoTrack::Create(id, source, worker_thread_.get())); >- return VideoTrackProxy::Create(signaling_thread_, worker_thread_.get(), >- track); >-} >- >-rtc::scoped_refptr<AudioTrackInterface> OrtcFactory::CreateAudioTrack( >- const std::string& id, >- AudioSourceInterface* source) { >- RTC_DCHECK_RUN_ON(signaling_thread_); >- rtc::scoped_refptr<AudioTrackInterface> track(AudioTrack::Create(id, source)); >- return AudioTrackProxy::Create(signaling_thread_, track); >-} >- >-// static >-RTCErrorOr<std::unique_ptr<OrtcFactoryInterface>> OrtcFactory::Create_s( >- rtc::Thread* network_thread, >- rtc::Thread* signaling_thread, >- rtc::NetworkManager* network_manager, >- rtc::PacketSocketFactory* socket_factory, >- AudioDeviceModule* adm, >- cricket::MediaEngineInterface* media_engine, >- rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >- rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory) { >- // Add the unique_ptr wrapper back. 
>- std::unique_ptr<cricket::MediaEngineInterface> owned_media_engine( >- media_engine); >- std::unique_ptr<OrtcFactory> new_factory(new OrtcFactory( >- network_thread, signaling_thread, network_manager, socket_factory, adm, >- audio_encoder_factory, audio_decoder_factory)); >- RTCError err = new_factory->Initialize(std::move(owned_media_engine)); >- if (!err.ok()) { >- return std::move(err); >- } >- // Return a proxy so that any calls on the returned object (including >- // destructor) happen on the signaling thread. >- rtc::Thread* signaling = new_factory->signaling_thread(); >- rtc::Thread* network = new_factory->network_thread(); >- return OrtcFactoryProxy::Create(signaling, network, std::move(new_factory)); >-} >- >-RTCError OrtcFactory::Initialize( >- std::unique_ptr<cricket::MediaEngineInterface> media_engine) { >- RTC_DCHECK_RUN_ON(signaling_thread_); >- // TODO(deadbeef): Get rid of requirement to hop to worker thread here. >- if (!media_engine) { >- media_engine = >- worker_thread_->Invoke<std::unique_ptr<cricket::MediaEngineInterface>>( >- RTC_FROM_HERE, rtc::Bind(&OrtcFactory::CreateMediaEngine_w, this)); >- } >- >- channel_manager_.reset(new cricket::ChannelManager( >- std::move(media_engine), absl::make_unique<cricket::RtpDataEngine>(), >- worker_thread_.get(), network_thread_)); >- channel_manager_->SetVideoRtxEnabled(true); >- if (!channel_manager_->Init()) { >- LOG_AND_RETURN_ERROR(RTCErrorType::INTERNAL_ERROR, >- "Failed to initialize ChannelManager."); >- } >- return RTCError::OK(); >-} >- >-std::unique_ptr<cricket::MediaEngineInterface> >-OrtcFactory::CreateMediaEngine_w() { >- RTC_DCHECK_RUN_ON(worker_thread_.get()); >- // The null arguments are optional factories that could be passed into the >- // OrtcFactory, but aren't yet. >- // >- // Note that |adm_| may be null, in which case the platform-specific default >- // AudioDeviceModule will be used. 
>- return std::unique_ptr<cricket::MediaEngineInterface>( >- cricket::WebRtcMediaEngineFactory::Create( >- rtc::scoped_refptr<webrtc::AudioDeviceModule>(adm_), >- audio_encoder_factory_, audio_decoder_factory_, >- webrtc::CreateBuiltinVideoEncoderFactory(), >- webrtc::CreateBuiltinVideoDecoderFactory(), nullptr, >- webrtc::AudioProcessingBuilder().Create())); >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcfactory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcfactory.h >deleted file mode 100644 >index d72f6121dbbd92a38e69af5ca703c81a4358232e..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcfactory.h >+++ /dev/null >@@ -1,156 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef ORTC_ORTCFACTORY_H_ >-#define ORTC_ORTCFACTORY_H_ >- >-#include <memory> >-#include <string> >- >-#include "api/ortc/ortcfactoryinterface.h" >-#include "media/base/mediaengine.h" >-#include "media/engine/webrtcmediaengine.h" >-#include "pc/channelmanager.h" >-#include "rtc_base/constructormagic.h" >-#include "rtc_base/scoped_ref_ptr.h" >- >-namespace webrtc { >- >-// Implementation of OrtcFactoryInterface. >-// >-// See ortcfactoryinterface.h for documentation. >-class OrtcFactory : public OrtcFactoryInterface { >- public: >- ~OrtcFactory() override; >- >- // Internal-only Create method that allows passing in a fake media engine, >- // for testing. 
>- static RTCErrorOr<std::unique_ptr<OrtcFactoryInterface>> Create( >- rtc::Thread* network_thread, >- rtc::Thread* signaling_thread, >- rtc::NetworkManager* network_manager, >- rtc::PacketSocketFactory* socket_factory, >- AudioDeviceModule* adm, >- std::unique_ptr<cricket::MediaEngineInterface> media_engine, >- rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >- rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory); >- >- RTCErrorOr<std::unique_ptr<RtpTransportControllerInterface>> >- CreateRtpTransportController() override; >- >- RTCErrorOr<std::unique_ptr<RtpTransportInterface>> CreateRtpTransport( >- const RtpTransportParameters& parameters, >- PacketTransportInterface* rtp, >- PacketTransportInterface* rtcp, >- RtpTransportControllerInterface* transport_controller) override; >- >- RTCErrorOr<std::unique_ptr<SrtpTransportInterface>> CreateSrtpTransport( >- const RtpTransportParameters& parameters, >- PacketTransportInterface* rtp, >- PacketTransportInterface* rtcp, >- RtpTransportControllerInterface* transport_controller) override; >- >- RtpCapabilities GetRtpSenderCapabilities( >- cricket::MediaType kind) const override; >- >- RTCErrorOr<std::unique_ptr<OrtcRtpSenderInterface>> CreateRtpSender( >- rtc::scoped_refptr<MediaStreamTrackInterface> track, >- RtpTransportInterface* transport) override; >- >- RTCErrorOr<std::unique_ptr<OrtcRtpSenderInterface>> CreateRtpSender( >- cricket::MediaType kind, >- RtpTransportInterface* transport) override; >- >- RtpCapabilities GetRtpReceiverCapabilities( >- cricket::MediaType kind) const override; >- >- RTCErrorOr<std::unique_ptr<OrtcRtpReceiverInterface>> CreateRtpReceiver( >- cricket::MediaType kind, >- RtpTransportInterface* transport) override; >- >- RTCErrorOr<std::unique_ptr<UdpTransportInterface>> >- CreateUdpTransport(int family, uint16_t min_port, uint16_t max_port) override; >- >- rtc::scoped_refptr<AudioSourceInterface> CreateAudioSource( >- const cricket::AudioOptions& options) override; 
>- >- rtc::scoped_refptr<VideoTrackInterface> CreateVideoTrack( >- const std::string& id, >- VideoTrackSourceInterface* source) override; >- >- rtc::scoped_refptr<AudioTrackInterface> CreateAudioTrack( >- const std::string& id, >- AudioSourceInterface* source) override; >- >- rtc::Thread* network_thread() { return network_thread_; } >- rtc::Thread* worker_thread() { return worker_thread_.get(); } >- rtc::Thread* signaling_thread() { return signaling_thread_; } >- >- private: >- // Should only be called by OrtcFactoryInterface::Create. >- OrtcFactory(rtc::Thread* network_thread, >- rtc::Thread* signaling_thread, >- rtc::NetworkManager* network_manager, >- rtc::PacketSocketFactory* socket_factory, >- AudioDeviceModule* adm, >- rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >- rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory); >- >- RTCErrorOr<std::unique_ptr<RtpTransportControllerInterface>> >- CreateRtpTransportController(const RtpTransportParameters& parameters); >- >- // Thread::Invoke doesn't support move-only arguments, so we need to remove >- // the unique_ptr wrapper from media_engine. TODO(deadbeef): Fix this. >- static RTCErrorOr<std::unique_ptr<OrtcFactoryInterface>> Create_s( >- rtc::Thread* network_thread, >- rtc::Thread* signaling_thread, >- rtc::NetworkManager* network_manager, >- rtc::PacketSocketFactory* socket_factory, >- AudioDeviceModule* adm, >- cricket::MediaEngineInterface* media_engine, >- rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >- rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory); >- >- // Performs initialization that can fail. Called by factory method after >- // construction, and if it fails, no object is returned. >- RTCError Initialize( >- std::unique_ptr<cricket::MediaEngineInterface> media_engine); >- std::unique_ptr<cricket::MediaEngineInterface> CreateMediaEngine_w(); >- >- // Threads and networking objects. 
>- rtc::Thread* network_thread_; >- rtc::Thread* signaling_thread_; >- rtc::NetworkManager* network_manager_; >- rtc::PacketSocketFactory* socket_factory_; >- AudioDeviceModule* adm_; >- // If we created/own the objects above, these will be non-null and thus will >- // be released automatically upon destruction. >- std::unique_ptr<rtc::Thread> owned_network_thread_; >- bool wraps_signaling_thread_ = false; >- std::unique_ptr<rtc::NetworkManager> owned_network_manager_; >- std::unique_ptr<rtc::PacketSocketFactory> owned_socket_factory_; >- // We always own the worker thread. >- std::unique_ptr<rtc::Thread> worker_thread_; >- // Media-releated objects. >- std::unique_ptr<RtcEventLog> null_event_log_; >- rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory_; >- rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory_; >- std::unique_ptr<cricket::ChannelManager> channel_manager_; >- // Default CNAME to use for RtpTransports if none is passed in. >- std::string default_cname_; >- >- friend class OrtcFactoryInterface; >- >- RTC_DISALLOW_COPY_AND_ASSIGN(OrtcFactory); >-}; >- >-} // namespace webrtc >- >-#endif // ORTC_ORTCFACTORY_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcfactory_integrationtest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcfactory_integrationtest.cc >deleted file mode 100644 >index 62f29520ed8d76888a71890a7715967a19118f15..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcfactory_integrationtest.cc >+++ /dev/null >@@ -1,724 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include <memory> >-#include <utility> // For std::pair, std::move. >- >-#include "api/audio_codecs/builtin_audio_decoder_factory.h" >-#include "api/audio_codecs/builtin_audio_encoder_factory.h" >-#include "api/ortc/ortcfactoryinterface.h" >-#include "ortc/testrtpparameters.h" >-#include "p2p/base/udptransport.h" >-#include "pc/test/fakeaudiocapturemodule.h" >-#include "pc/test/fakeperiodicvideotracksource.h" >-#include "pc/test/fakevideotrackrenderer.h" >-#include "pc/videotracksource.h" >-#include "rtc_base/criticalsection.h" >-#include "rtc_base/fakenetwork.h" >-#include "rtc_base/gunit.h" >-#include "rtc_base/system/arch.h" >-#include "rtc_base/timeutils.h" >-#include "rtc_base/virtualsocketserver.h" >- >-namespace { >- >-const int kDefaultTimeout = 10000; // 10 seconds. >-const int kReceivingDuration = 1000; // 1 second. >- >-// Default number of audio/video frames to wait for before considering a test a >-// success. >-const int kDefaultNumFrames = 3; >-const rtc::IPAddress kIPv4LocalHostAddress = >- rtc::IPAddress(0x7F000001); // 127.0.0.1 >- >-static const char kTestKeyParams1[] = >- "inline:WVNfX19zZW1jdGwgKskgewkyMjA7fQp9CnVubGVz"; >-static const char kTestKeyParams2[] = >- "inline:PS1uQCVeeCFCanVmcjkpaywjNWhcYD0mXXtxaVBR"; >-static const char kTestKeyParams3[] = >- "inline:WVNfX19zZW1jdGwgKskgewkyMjA7fQp9CnVubGVa"; >-static const char kTestKeyParams4[] = >- "inline:WVNfX19zZW1jdGwgKskgewkyMjA7fQp9CnVubGVb"; >-static const cricket::CryptoParams kTestCryptoParams1(1, >- "AES_CM_128_HMAC_SHA1_80", >- kTestKeyParams1, >- ""); >-static const cricket::CryptoParams kTestCryptoParams2(1, >- "AES_CM_128_HMAC_SHA1_80", >- kTestKeyParams2, >- ""); >-static const cricket::CryptoParams kTestCryptoParams3(1, >- "AES_CM_128_HMAC_SHA1_80", >- kTestKeyParams3, >- ""); >-static const cricket::CryptoParams kTestCryptoParams4(1, >- "AES_CM_128_HMAC_SHA1_80", >- kTestKeyParams4, >- ""); >-} // namespace >- >-namespace webrtc { >- >-// Used to test that things work 
end-to-end when using the default >-// implementations of threads/etc. provided by OrtcFactory, with the exception >-// of using a virtual network. >-// >-// By default, the virtual network manager doesn't enumerate any networks, but >-// sockets can still be created in this state. >-class OrtcFactoryIntegrationTest : public testing::Test { >- public: >- OrtcFactoryIntegrationTest() >- : network_thread_(&virtual_socket_server_), >- fake_audio_capture_module1_(FakeAudioCaptureModule::Create()), >- fake_audio_capture_module2_(FakeAudioCaptureModule::Create()) { >- // Sockets are bound to the ANY address, so this is needed to tell the >- // virtual network which address to use in this case. >- virtual_socket_server_.SetDefaultRoute(kIPv4LocalHostAddress); >- network_thread_.SetName("TestNetworkThread", this); >- network_thread_.Start(); >- // Need to create after network thread is started. >- ortc_factory1_ = >- OrtcFactoryInterface::Create( >- &network_thread_, nullptr, &fake_network_manager_, nullptr, >- fake_audio_capture_module1_, CreateBuiltinAudioEncoderFactory(), >- CreateBuiltinAudioDecoderFactory()) >- .MoveValue(); >- ortc_factory2_ = >- OrtcFactoryInterface::Create( >- &network_thread_, nullptr, &fake_network_manager_, nullptr, >- fake_audio_capture_module2_, CreateBuiltinAudioEncoderFactory(), >- CreateBuiltinAudioDecoderFactory()) >- .MoveValue(); >- } >- >- protected: >- typedef std::pair<std::unique_ptr<UdpTransportInterface>, >- std::unique_ptr<UdpTransportInterface>> >- UdpTransportPair; >- typedef std::pair<std::unique_ptr<RtpTransportInterface>, >- std::unique_ptr<RtpTransportInterface>> >- RtpTransportPair; >- typedef std::pair<std::unique_ptr<SrtpTransportInterface>, >- std::unique_ptr<SrtpTransportInterface>> >- SrtpTransportPair; >- typedef std::pair<std::unique_ptr<RtpTransportControllerInterface>, >- std::unique_ptr<RtpTransportControllerInterface>> >- RtpTransportControllerPair; >- >- // Helper function that creates one UDP transport each for 
|ortc_factory1_| >- // and |ortc_factory2_|, and connects them. >- UdpTransportPair CreateAndConnectUdpTransportPair() { >- auto transport1 = ortc_factory1_->CreateUdpTransport(AF_INET).MoveValue(); >- auto transport2 = ortc_factory2_->CreateUdpTransport(AF_INET).MoveValue(); >- transport1->SetRemoteAddress( >- rtc::SocketAddress(virtual_socket_server_.GetDefaultRoute(AF_INET), >- transport2->GetLocalAddress().port())); >- transport2->SetRemoteAddress( >- rtc::SocketAddress(virtual_socket_server_.GetDefaultRoute(AF_INET), >- transport1->GetLocalAddress().port())); >- return {std::move(transport1), std::move(transport2)}; >- } >- >- // Creates one transport controller each for |ortc_factory1_| and >- // |ortc_factory2_|. >- RtpTransportControllerPair CreateRtpTransportControllerPair() { >- return {ortc_factory1_->CreateRtpTransportController().MoveValue(), >- ortc_factory2_->CreateRtpTransportController().MoveValue()}; >- } >- >- // Helper function that creates a pair of RtpTransports between >- // |ortc_factory1_| and |ortc_factory2_|. Expected to be called with the >- // result of CreateAndConnectUdpTransportPair. |rtcp_udp_transports| can be >- // empty if RTCP muxing is used. |transport_controllers| can be empty if >- // these transports are being created using a default transport controller. 
>- RtpTransportPair CreateRtpTransportPair( >- const RtpTransportParameters& parameters, >- const UdpTransportPair& rtp_udp_transports, >- const UdpTransportPair& rtcp_udp_transports, >- const RtpTransportControllerPair& transport_controllers) { >- auto transport_result1 = ortc_factory1_->CreateRtpTransport( >- parameters, rtp_udp_transports.first.get(), >- rtcp_udp_transports.first.get(), transport_controllers.first.get()); >- auto transport_result2 = ortc_factory2_->CreateRtpTransport( >- parameters, rtp_udp_transports.second.get(), >- rtcp_udp_transports.second.get(), transport_controllers.second.get()); >- return {transport_result1.MoveValue(), transport_result2.MoveValue()}; >- } >- >- SrtpTransportPair CreateSrtpTransportPair( >- const RtpTransportParameters& parameters, >- const UdpTransportPair& rtp_udp_transports, >- const UdpTransportPair& rtcp_udp_transports, >- const RtpTransportControllerPair& transport_controllers) { >- auto transport_result1 = ortc_factory1_->CreateSrtpTransport( >- parameters, rtp_udp_transports.first.get(), >- rtcp_udp_transports.first.get(), transport_controllers.first.get()); >- auto transport_result2 = ortc_factory2_->CreateSrtpTransport( >- parameters, rtp_udp_transports.second.get(), >- rtcp_udp_transports.second.get(), transport_controllers.second.get()); >- return {transport_result1.MoveValue(), transport_result2.MoveValue()}; >- } >- >- // For convenience when |rtcp_udp_transports| and |transport_controllers| >- // aren't needed. 
>- RtpTransportPair CreateRtpTransportPair( >- const RtpTransportParameters& parameters, >- const UdpTransportPair& rtp_udp_transports) { >- return CreateRtpTransportPair(parameters, rtp_udp_transports, >- UdpTransportPair(), >- RtpTransportControllerPair()); >- } >- >- SrtpTransportPair CreateSrtpTransportPairAndSetKeys( >- const RtpTransportParameters& parameters, >- const UdpTransportPair& rtp_udp_transports) { >- SrtpTransportPair srtp_transports = CreateSrtpTransportPair( >- parameters, rtp_udp_transports, UdpTransportPair(), >- RtpTransportControllerPair()); >- EXPECT_TRUE(srtp_transports.first->SetSrtpSendKey(kTestCryptoParams1).ok()); >- EXPECT_TRUE( >- srtp_transports.first->SetSrtpReceiveKey(kTestCryptoParams2).ok()); >- EXPECT_TRUE( >- srtp_transports.second->SetSrtpSendKey(kTestCryptoParams2).ok()); >- EXPECT_TRUE( >- srtp_transports.second->SetSrtpReceiveKey(kTestCryptoParams1).ok()); >- return srtp_transports; >- } >- >- SrtpTransportPair CreateSrtpTransportPairAndSetMismatchingKeys( >- const RtpTransportParameters& parameters, >- const UdpTransportPair& rtp_udp_transports) { >- SrtpTransportPair srtp_transports = CreateSrtpTransportPair( >- parameters, rtp_udp_transports, UdpTransportPair(), >- RtpTransportControllerPair()); >- EXPECT_TRUE(srtp_transports.first->SetSrtpSendKey(kTestCryptoParams1).ok()); >- EXPECT_TRUE( >- srtp_transports.first->SetSrtpReceiveKey(kTestCryptoParams2).ok()); >- EXPECT_TRUE( >- srtp_transports.second->SetSrtpSendKey(kTestCryptoParams1).ok()); >- EXPECT_TRUE( >- srtp_transports.second->SetSrtpReceiveKey(kTestCryptoParams2).ok()); >- return srtp_transports; >- } >- >- // Ends up using fake audio capture module, which was passed into OrtcFactory >- // on creation. >- rtc::scoped_refptr<webrtc::AudioTrackInterface> CreateLocalAudioTrack( >- const std::string& id, >- OrtcFactoryInterface* ortc_factory) { >- // Disable echo cancellation to make test more efficient. 
>- cricket::AudioOptions options; >- options.echo_cancellation.emplace(true); >- rtc::scoped_refptr<webrtc::AudioSourceInterface> source = >- ortc_factory->CreateAudioSource(options); >- return ortc_factory->CreateAudioTrack(id, source); >- } >- >- // Stores created video source in |fake_video_track_sources_|. >- rtc::scoped_refptr<webrtc::VideoTrackInterface> >- CreateLocalVideoTrackAndFakeSource(const std::string& id, >- OrtcFactoryInterface* ortc_factory) { >- FakePeriodicVideoSource::Config config; >- config.timestamp_offset_ms = rtc::TimeMillis(); >- fake_video_track_sources_.emplace_back( >- new rtc::RefCountedObject<FakePeriodicVideoTrackSource>( >- config, false /* remote */)); >- return rtc::scoped_refptr<VideoTrackInterface>( >- ortc_factory->CreateVideoTrack(id, fake_video_track_sources_.back())); >- } >- >- // Helper function used to test two way RTP senders and receivers with basic >- // configurations. >- // If |expect_success| is true, waits for kDefaultTimeout for >- // kDefaultNumFrames frames to be received by all RtpReceivers. >- // If |expect_success| is false, simply waits for |kReceivingDuration|, and >- // stores the number of received frames in |received_audio_frame1_| etc. >- void BasicTwoWayRtpSendersAndReceiversTest(RtpTransportPair srtp_transports, >- bool expect_success) { >- received_audio_frames1_ = 0; >- received_audio_frames2_ = 0; >- rendered_video_frames1_ = 0; >- rendered_video_frames2_ = 0; >- // Create all the senders and receivers (four per endpoint). 
>- auto audio_sender_result1 = ortc_factory1_->CreateRtpSender( >- cricket::MEDIA_TYPE_AUDIO, srtp_transports.first.get()); >- auto video_sender_result1 = ortc_factory1_->CreateRtpSender( >- cricket::MEDIA_TYPE_VIDEO, srtp_transports.first.get()); >- auto audio_receiver_result1 = ortc_factory1_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_AUDIO, srtp_transports.first.get()); >- auto video_receiver_result1 = ortc_factory1_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_VIDEO, srtp_transports.first.get()); >- ASSERT_TRUE(audio_sender_result1.ok()); >- ASSERT_TRUE(video_sender_result1.ok()); >- ASSERT_TRUE(audio_receiver_result1.ok()); >- ASSERT_TRUE(video_receiver_result1.ok()); >- auto audio_sender1 = audio_sender_result1.MoveValue(); >- auto video_sender1 = video_sender_result1.MoveValue(); >- auto audio_receiver1 = audio_receiver_result1.MoveValue(); >- auto video_receiver1 = video_receiver_result1.MoveValue(); >- >- auto audio_sender_result2 = ortc_factory2_->CreateRtpSender( >- cricket::MEDIA_TYPE_AUDIO, srtp_transports.second.get()); >- auto video_sender_result2 = ortc_factory2_->CreateRtpSender( >- cricket::MEDIA_TYPE_VIDEO, srtp_transports.second.get()); >- auto audio_receiver_result2 = ortc_factory2_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_AUDIO, srtp_transports.second.get()); >- auto video_receiver_result2 = ortc_factory2_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_VIDEO, srtp_transports.second.get()); >- ASSERT_TRUE(audio_sender_result2.ok()); >- ASSERT_TRUE(video_sender_result2.ok()); >- ASSERT_TRUE(audio_receiver_result2.ok()); >- ASSERT_TRUE(video_receiver_result2.ok()); >- auto audio_sender2 = audio_sender_result2.MoveValue(); >- auto video_sender2 = video_sender_result2.MoveValue(); >- auto audio_receiver2 = audio_receiver_result2.MoveValue(); >- auto video_receiver2 = video_receiver_result2.MoveValue(); >- >- // Add fake tracks. 
>- RTCError error = audio_sender1->SetTrack( >- CreateLocalAudioTrack("audio", ortc_factory1_.get())); >- EXPECT_TRUE(error.ok()); >- error = video_sender1->SetTrack( >- CreateLocalVideoTrackAndFakeSource("video", ortc_factory1_.get())); >- EXPECT_TRUE(error.ok()); >- error = audio_sender2->SetTrack( >- CreateLocalAudioTrack("audio", ortc_factory2_.get())); >- EXPECT_TRUE(error.ok()); >- error = video_sender2->SetTrack( >- CreateLocalVideoTrackAndFakeSource("video", ortc_factory2_.get())); >- EXPECT_TRUE(error.ok()); >- >- // "sent_X_parameters1" are the parameters that endpoint 1 sends with and >- // endpoint 2 receives with. >- RtpParameters sent_opus_parameters1 = >- MakeMinimalOpusParametersWithSsrc(0xdeadbeef); >- RtpParameters sent_vp8_parameters1 = >- MakeMinimalVp8ParametersWithSsrc(0xbaadfeed); >- RtpParameters sent_opus_parameters2 = >- MakeMinimalOpusParametersWithSsrc(0x13333337); >- RtpParameters sent_vp8_parameters2 = >- MakeMinimalVp8ParametersWithSsrc(0x12345678); >- >- // Configure the senders' and receivers' parameters. 
>- EXPECT_TRUE(audio_receiver1->Receive(sent_opus_parameters2).ok()); >- EXPECT_TRUE(video_receiver1->Receive(sent_vp8_parameters2).ok()); >- EXPECT_TRUE(audio_receiver2->Receive(sent_opus_parameters1).ok()); >- EXPECT_TRUE(video_receiver2->Receive(sent_vp8_parameters1).ok()); >- EXPECT_TRUE(audio_sender1->Send(sent_opus_parameters1).ok()); >- EXPECT_TRUE(video_sender1->Send(sent_vp8_parameters1).ok()); >- EXPECT_TRUE(audio_sender2->Send(sent_opus_parameters2).ok()); >- EXPECT_TRUE(video_sender2->Send(sent_vp8_parameters2).ok()); >- >- FakeVideoTrackRenderer fake_video_renderer1( >- static_cast<VideoTrackInterface*>(video_receiver1->GetTrack().get())); >- FakeVideoTrackRenderer fake_video_renderer2( >- static_cast<VideoTrackInterface*>(video_receiver2->GetTrack().get())); >- >- if (expect_success) { >- EXPECT_TRUE_WAIT( >- fake_audio_capture_module1_->frames_received() > kDefaultNumFrames && >- fake_video_renderer1.num_rendered_frames() > kDefaultNumFrames && >- fake_audio_capture_module2_->frames_received() > >- kDefaultNumFrames && >- fake_video_renderer2.num_rendered_frames() > kDefaultNumFrames, >- kDefaultTimeout) >- << "Audio capture module 1 received " >- << fake_audio_capture_module1_->frames_received() >- << " frames, Video renderer 1 rendered " >- << fake_video_renderer1.num_rendered_frames() >- << " frames, Audio capture module 2 received " >- << fake_audio_capture_module2_->frames_received() >- << " frames, Video renderer 2 rendered " >- << fake_video_renderer2.num_rendered_frames() << " frames."; >- } else { >- WAIT(false, kReceivingDuration); >- rendered_video_frames1_ = fake_video_renderer1.num_rendered_frames(); >- rendered_video_frames2_ = fake_video_renderer2.num_rendered_frames(); >- received_audio_frames1_ = fake_audio_capture_module1_->frames_received(); >- received_audio_frames2_ = fake_audio_capture_module2_->frames_received(); >- } >- } >- >- rtc::VirtualSocketServer virtual_socket_server_; >- rtc::Thread network_thread_; >- 
rtc::FakeNetworkManager fake_network_manager_; >- rtc::scoped_refptr<FakeAudioCaptureModule> fake_audio_capture_module1_; >- rtc::scoped_refptr<FakeAudioCaptureModule> fake_audio_capture_module2_; >- std::unique_ptr<OrtcFactoryInterface> ortc_factory1_; >- std::unique_ptr<OrtcFactoryInterface> ortc_factory2_; >- std::vector<rtc::scoped_refptr<VideoTrackSource>> fake_video_track_sources_; >- int received_audio_frames1_ = 0; >- int received_audio_frames2_ = 0; >- int rendered_video_frames1_ = 0; >- int rendered_video_frames2_ = 0; >-}; >- >-// Disable for TSan v2, see >-// https://bugs.chromium.org/p/webrtc/issues/detail?id=7366 for details. >-#if !defined(THREAD_SANITIZER) >- >-// Very basic end-to-end test with a single pair of audio RTP sender and >-// receiver. >-// >-// Uses muxed RTCP, and minimal parameters with a hard-coded config that's >-// known to work. >-TEST_F(OrtcFactoryIntegrationTest, BasicOneWayAudioRtpSenderAndReceiver) { >- auto udp_transports = CreateAndConnectUdpTransportPair(); >- auto rtp_transports = >- CreateRtpTransportPair(MakeRtcpMuxParameters(), udp_transports); >- >- auto sender_result = ortc_factory1_->CreateRtpSender( >- cricket::MEDIA_TYPE_AUDIO, rtp_transports.first.get()); >- auto receiver_result = ortc_factory2_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_AUDIO, rtp_transports.second.get()); >- ASSERT_TRUE(sender_result.ok()); >- ASSERT_TRUE(receiver_result.ok()); >- auto sender = sender_result.MoveValue(); >- auto receiver = receiver_result.MoveValue(); >- >- RTCError error = >- sender->SetTrack(CreateLocalAudioTrack("audio", ortc_factory1_.get())); >- EXPECT_TRUE(error.ok()); >- >- RtpParameters opus_parameters = MakeMinimalOpusParameters(); >- EXPECT_TRUE(receiver->Receive(opus_parameters).ok()); >- EXPECT_TRUE(sender->Send(opus_parameters).ok()); >- // Sender and receiver are connected and configured; audio frames should be >- // able to flow at this point. 
>- EXPECT_TRUE_WAIT( >- fake_audio_capture_module2_->frames_received() > kDefaultNumFrames, >- kDefaultTimeout); >-} >- >-// Very basic end-to-end test with a single pair of video RTP sender and >-// receiver. >-// >-// Uses muxed RTCP, and minimal parameters with a hard-coded config that's >-// known to work. >-TEST_F(OrtcFactoryIntegrationTest, BasicOneWayVideoRtpSenderAndReceiver) { >- auto udp_transports = CreateAndConnectUdpTransportPair(); >- auto rtp_transports = >- CreateRtpTransportPair(MakeRtcpMuxParameters(), udp_transports); >- >- auto sender_result = ortc_factory1_->CreateRtpSender( >- cricket::MEDIA_TYPE_VIDEO, rtp_transports.first.get()); >- auto receiver_result = ortc_factory2_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_VIDEO, rtp_transports.second.get()); >- ASSERT_TRUE(sender_result.ok()); >- ASSERT_TRUE(receiver_result.ok()); >- auto sender = sender_result.MoveValue(); >- auto receiver = receiver_result.MoveValue(); >- >- RTCError error = sender->SetTrack( >- CreateLocalVideoTrackAndFakeSource("video", ortc_factory1_.get())); >- EXPECT_TRUE(error.ok()); >- >- RtpParameters vp8_parameters = MakeMinimalVp8Parameters(); >- EXPECT_TRUE(receiver->Receive(vp8_parameters).ok()); >- EXPECT_TRUE(sender->Send(vp8_parameters).ok()); >- FakeVideoTrackRenderer fake_renderer( >- static_cast<VideoTrackInterface*>(receiver->GetTrack().get())); >- // Sender and receiver are connected and configured; video frames should be >- // able to flow at this point. >- EXPECT_TRUE_WAIT(fake_renderer.num_rendered_frames() > kDefaultNumFrames, >- kDefaultTimeout); >-} >- >-// Test that if the track is changed while sending, the sender seamlessly >-// transitions to sending it and frames are received end-to-end. >-// >-// Only doing this for video, since given that audio is sourced from a single >-// fake audio capture module, the audio track is just a dummy object. >-// TODO(deadbeef): Change this when possible. 
>-TEST_F(OrtcFactoryIntegrationTest, SetTrackWhileSending) { >- auto udp_transports = CreateAndConnectUdpTransportPair(); >- auto rtp_transports = >- CreateRtpTransportPair(MakeRtcpMuxParameters(), udp_transports); >- >- auto sender_result = ortc_factory1_->CreateRtpSender( >- cricket::MEDIA_TYPE_VIDEO, rtp_transports.first.get()); >- auto receiver_result = ortc_factory2_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_VIDEO, rtp_transports.second.get()); >- ASSERT_TRUE(sender_result.ok()); >- ASSERT_TRUE(receiver_result.ok()); >- auto sender = sender_result.MoveValue(); >- auto receiver = receiver_result.MoveValue(); >- >- RTCError error = sender->SetTrack( >- CreateLocalVideoTrackAndFakeSource("video_1", ortc_factory1_.get())); >- EXPECT_TRUE(error.ok()); >- RtpParameters vp8_parameters = MakeMinimalVp8Parameters(); >- EXPECT_TRUE(receiver->Receive(vp8_parameters).ok()); >- EXPECT_TRUE(sender->Send(vp8_parameters).ok()); >- FakeVideoTrackRenderer fake_renderer( >- static_cast<VideoTrackInterface*>(receiver->GetTrack().get())); >- // Expect for some initial number of frames to be received. >- EXPECT_TRUE_WAIT(fake_renderer.num_rendered_frames() > kDefaultNumFrames, >- kDefaultTimeout); >- // Destroy old source, set a new track, and verify new frames are received >- // from the new track. The VideoTrackSource is reference counted and may live >- // a little longer, so tell it that its source is going away now. >- fake_video_track_sources_[0] = nullptr; >- int prev_num_frames = fake_renderer.num_rendered_frames(); >- error = sender->SetTrack( >- CreateLocalVideoTrackAndFakeSource("video_2", ortc_factory1_.get())); >- EXPECT_TRUE(error.ok()); >- EXPECT_TRUE_WAIT( >- fake_renderer.num_rendered_frames() > kDefaultNumFrames + prev_num_frames, >- kDefaultTimeout); >-} >- >-// TODO(webrtc:7915, webrtc:9184): Tests below are disabled for iOS 64 on debug >-// builds because of flakiness. 
>-#if !(defined(WEBRTC_IOS) && defined(WEBRTC_ARCH_64_BITS) && !defined(NDEBUG)) >-#define MAYBE_BasicTwoWayAudioVideoRtpSendersAndReceivers \ >- BasicTwoWayAudioVideoRtpSendersAndReceivers >-#define MAYBE_BasicTwoWayAudioVideoSrtpSendersAndReceivers \ >- BasicTwoWayAudioVideoSrtpSendersAndReceivers >-#define MAYBE_SrtpSendersAndReceiversWithMismatchingKeys \ >- SrtpSendersAndReceiversWithMismatchingKeys >-#define MAYBE_OneSideSrtpSenderAndReceiver OneSideSrtpSenderAndReceiver >-#define MAYBE_FullTwoWayAudioVideoSrtpSendersAndReceivers \ >- FullTwoWayAudioVideoSrtpSendersAndReceivers >-#else >-#define MAYBE_BasicTwoWayAudioVideoRtpSendersAndReceivers \ >- DISABLED_BasicTwoWayAudioVideoRtpSendersAndReceivers >-#define MAYBE_BasicTwoWayAudioVideoSrtpSendersAndReceivers \ >- DISABLED_BasicTwoWayAudioVideoSrtpSendersAndReceivers >-#define MAYBE_SrtpSendersAndReceiversWithMismatchingKeys \ >- DISABLED_SrtpSendersAndReceiversWithMismatchingKeys >-#define MAYBE_OneSideSrtpSenderAndReceiver DISABLED_OneSideSrtpSenderAndReceiver >-#define MAYBE_FullTwoWayAudioVideoSrtpSendersAndReceivers \ >- DISABLED_FullTwoWayAudioVideoSrtpSendersAndReceivers >-#endif >- >-// End-to-end test with two pairs of RTP senders and receivers, for audio and >-// video. >-// >-// Uses muxed RTCP, and minimal parameters with hard-coded configs that are >-// known to work. 
>-TEST_F(OrtcFactoryIntegrationTest, >- MAYBE_BasicTwoWayAudioVideoRtpSendersAndReceivers) { >- auto udp_transports = CreateAndConnectUdpTransportPair(); >- auto rtp_transports = >- CreateRtpTransportPair(MakeRtcpMuxParameters(), udp_transports); >- bool expect_success = true; >- BasicTwoWayRtpSendersAndReceiversTest(std::move(rtp_transports), >- expect_success); >-} >- >-TEST_F(OrtcFactoryIntegrationTest, >- MAYBE_BasicTwoWayAudioVideoSrtpSendersAndReceivers) { >- auto udp_transports = CreateAndConnectUdpTransportPair(); >- auto srtp_transports = CreateSrtpTransportPairAndSetKeys( >- MakeRtcpMuxParameters(), udp_transports); >- bool expect_success = true; >- BasicTwoWayRtpSendersAndReceiversTest(std::move(srtp_transports), >- expect_success); >-} >- >-// Tests that the packets cannot be decoded if the keys are mismatched. >-// TODO(webrtc:9184): Disabled because this test is flaky. >-TEST_F(OrtcFactoryIntegrationTest, >- MAYBE_SrtpSendersAndReceiversWithMismatchingKeys) { >- auto udp_transports = CreateAndConnectUdpTransportPair(); >- auto srtp_transports = CreateSrtpTransportPairAndSetMismatchingKeys( >- MakeRtcpMuxParameters(), udp_transports); >- bool expect_success = false; >- BasicTwoWayRtpSendersAndReceiversTest(std::move(srtp_transports), >- expect_success); >- // No frames are expected to be decoded. >- EXPECT_TRUE(received_audio_frames1_ == 0 && received_audio_frames2_ == 0 && >- rendered_video_frames1_ == 0 && rendered_video_frames2_ == 0); >-} >- >-// Tests that the frames cannot be decoded if only one side uses SRTP. 
>-TEST_F(OrtcFactoryIntegrationTest, MAYBE_OneSideSrtpSenderAndReceiver) { >- auto rtcp_parameters = MakeRtcpMuxParameters(); >- auto udp_transports = CreateAndConnectUdpTransportPair(); >- auto rtcp_udp_transports = UdpTransportPair(); >- auto transport_controllers = RtpTransportControllerPair(); >- auto transport_result1 = ortc_factory1_->CreateRtpTransport( >- rtcp_parameters, udp_transports.first.get(), >- rtcp_udp_transports.first.get(), transport_controllers.first.get()); >- auto transport_result2 = ortc_factory2_->CreateSrtpTransport( >- rtcp_parameters, udp_transports.second.get(), >- rtcp_udp_transports.second.get(), transport_controllers.second.get()); >- >- auto rtp_transport = transport_result1.MoveValue(); >- auto srtp_transport = transport_result2.MoveValue(); >- EXPECT_TRUE(srtp_transport->SetSrtpSendKey(kTestCryptoParams1).ok()); >- EXPECT_TRUE(srtp_transport->SetSrtpReceiveKey(kTestCryptoParams2).ok()); >- bool expect_success = false; >- BasicTwoWayRtpSendersAndReceiversTest( >- {std::move(rtp_transport), std::move(srtp_transport)}, expect_success); >- >- // The SRTP side is not expected to decode any audio or video frames. >- // The RTP side is not expected to decode any video frames while it is >- // possible that the encrypted audio frames can be accidentally decoded which >- // is why received_audio_frames1_ is not validated. >- EXPECT_TRUE(received_audio_frames2_ == 0 && rendered_video_frames1_ == 0 && >- rendered_video_frames2_ == 0); >-} >- >-// End-to-end test with two pairs of RTP senders and receivers, for audio and >-// video. Unlike the test above, this attempts to make the parameters as >-// complex as possible. The senders and receivers use the SRTP transport with >-// different keys. >-// >-// Uses non-muxed RTCP, with separate audio/video transports, and a full set of >-// parameters, as would normally be used in a PeerConnection. >-// >-// TODO(deadbeef): Update this test as more audio/video features become >-// supported. 
>-TEST_F(OrtcFactoryIntegrationTest, >- MAYBE_FullTwoWayAudioVideoSrtpSendersAndReceivers) { >- // We want four pairs of UDP transports for this test, for audio/video and >- // RTP/RTCP. >- auto audio_rtp_udp_transports = CreateAndConnectUdpTransportPair(); >- auto audio_rtcp_udp_transports = CreateAndConnectUdpTransportPair(); >- auto video_rtp_udp_transports = CreateAndConnectUdpTransportPair(); >- auto video_rtcp_udp_transports = CreateAndConnectUdpTransportPair(); >- >- // Since we have multiple RTP transports on each side, we need an RTP >- // transport controller. >- auto transport_controllers = CreateRtpTransportControllerPair(); >- >- RtpTransportParameters audio_rtp_transport_parameters; >- audio_rtp_transport_parameters.rtcp.mux = false; >- auto audio_srtp_transports = CreateSrtpTransportPair( >- audio_rtp_transport_parameters, audio_rtp_udp_transports, >- audio_rtcp_udp_transports, transport_controllers); >- >- RtpTransportParameters video_rtp_transport_parameters; >- video_rtp_transport_parameters.rtcp.mux = false; >- video_rtp_transport_parameters.rtcp.reduced_size = true; >- auto video_srtp_transports = CreateSrtpTransportPair( >- video_rtp_transport_parameters, video_rtp_udp_transports, >- video_rtcp_udp_transports, transport_controllers); >- >- // Set keys for SRTP transports. >- audio_srtp_transports.first->SetSrtpSendKey(kTestCryptoParams1); >- audio_srtp_transports.first->SetSrtpReceiveKey(kTestCryptoParams2); >- video_srtp_transports.first->SetSrtpSendKey(kTestCryptoParams3); >- video_srtp_transports.first->SetSrtpReceiveKey(kTestCryptoParams4); >- >- audio_srtp_transports.second->SetSrtpSendKey(kTestCryptoParams2); >- audio_srtp_transports.second->SetSrtpReceiveKey(kTestCryptoParams1); >- video_srtp_transports.second->SetSrtpSendKey(kTestCryptoParams4); >- video_srtp_transports.second->SetSrtpReceiveKey(kTestCryptoParams3); >- >- // Create all the senders and receivers (four per endpoint). 
>- auto audio_sender_result1 = ortc_factory1_->CreateRtpSender( >- cricket::MEDIA_TYPE_AUDIO, audio_srtp_transports.first.get()); >- auto video_sender_result1 = ortc_factory1_->CreateRtpSender( >- cricket::MEDIA_TYPE_VIDEO, video_srtp_transports.first.get()); >- auto audio_receiver_result1 = ortc_factory1_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_AUDIO, audio_srtp_transports.first.get()); >- auto video_receiver_result1 = ortc_factory1_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_VIDEO, video_srtp_transports.first.get()); >- ASSERT_TRUE(audio_sender_result1.ok()); >- ASSERT_TRUE(video_sender_result1.ok()); >- ASSERT_TRUE(audio_receiver_result1.ok()); >- ASSERT_TRUE(video_receiver_result1.ok()); >- auto audio_sender1 = audio_sender_result1.MoveValue(); >- auto video_sender1 = video_sender_result1.MoveValue(); >- auto audio_receiver1 = audio_receiver_result1.MoveValue(); >- auto video_receiver1 = video_receiver_result1.MoveValue(); >- >- auto audio_sender_result2 = ortc_factory2_->CreateRtpSender( >- cricket::MEDIA_TYPE_AUDIO, audio_srtp_transports.second.get()); >- auto video_sender_result2 = ortc_factory2_->CreateRtpSender( >- cricket::MEDIA_TYPE_VIDEO, video_srtp_transports.second.get()); >- auto audio_receiver_result2 = ortc_factory2_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_AUDIO, audio_srtp_transports.second.get()); >- auto video_receiver_result2 = ortc_factory2_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_VIDEO, video_srtp_transports.second.get()); >- ASSERT_TRUE(audio_sender_result2.ok()); >- ASSERT_TRUE(video_sender_result2.ok()); >- ASSERT_TRUE(audio_receiver_result2.ok()); >- ASSERT_TRUE(video_receiver_result2.ok()); >- auto audio_sender2 = audio_sender_result2.MoveValue(); >- auto video_sender2 = video_sender_result2.MoveValue(); >- auto audio_receiver2 = audio_receiver_result2.MoveValue(); >- auto video_receiver2 = video_receiver_result2.MoveValue(); >- >- RTCError error = audio_sender1->SetTrack( >- CreateLocalAudioTrack("audio", 
ortc_factory1_.get())); >- EXPECT_TRUE(error.ok()); >- error = video_sender1->SetTrack( >- CreateLocalVideoTrackAndFakeSource("video", ortc_factory1_.get())); >- EXPECT_TRUE(error.ok()); >- error = audio_sender2->SetTrack( >- CreateLocalAudioTrack("audio", ortc_factory2_.get())); >- EXPECT_TRUE(error.ok()); >- error = video_sender2->SetTrack( >- CreateLocalVideoTrackAndFakeSource("video", ortc_factory2_.get())); >- EXPECT_TRUE(error.ok()); >- >- // Use different codecs in different directions for extra challenge. >- RtpParameters opus_send_parameters = MakeFullOpusParameters(); >- RtpParameters isac_send_parameters = MakeFullIsacParameters(); >- RtpParameters vp8_send_parameters = MakeFullVp8Parameters(); >- RtpParameters vp9_send_parameters = MakeFullVp9Parameters(); >- >- // Remove "payload_type" from receive parameters. Receiver will need to >- // discern the payload type from packets received. >- RtpParameters opus_receive_parameters = opus_send_parameters; >- RtpParameters isac_receive_parameters = isac_send_parameters; >- RtpParameters vp8_receive_parameters = vp8_send_parameters; >- RtpParameters vp9_receive_parameters = vp9_send_parameters; >- opus_receive_parameters.encodings[0].codec_payload_type.reset(); >- isac_receive_parameters.encodings[0].codec_payload_type.reset(); >- vp8_receive_parameters.encodings[0].codec_payload_type.reset(); >- vp9_receive_parameters.encodings[0].codec_payload_type.reset(); >- >- // Configure the senders' and receivers' parameters. >- // >- // Note: Intentionally, the top codec in the receive parameters does not >- // match the codec sent by the other side. If "Receive" is called with a list >- // of codecs, the receiver should be prepared to receive any of them, not >- // just the one on top. 
>- EXPECT_TRUE(audio_receiver1->Receive(opus_receive_parameters).ok()); >- EXPECT_TRUE(video_receiver1->Receive(vp8_receive_parameters).ok()); >- EXPECT_TRUE(audio_receiver2->Receive(isac_receive_parameters).ok()); >- EXPECT_TRUE(video_receiver2->Receive(vp9_receive_parameters).ok()); >- EXPECT_TRUE(audio_sender1->Send(opus_send_parameters).ok()); >- EXPECT_TRUE(video_sender1->Send(vp8_send_parameters).ok()); >- EXPECT_TRUE(audio_sender2->Send(isac_send_parameters).ok()); >- EXPECT_TRUE(video_sender2->Send(vp9_send_parameters).ok()); >- >- FakeVideoTrackRenderer fake_video_renderer1( >- static_cast<VideoTrackInterface*>(video_receiver1->GetTrack().get())); >- FakeVideoTrackRenderer fake_video_renderer2( >- static_cast<VideoTrackInterface*>(video_receiver2->GetTrack().get())); >- >- // Senders and receivers are connected and configured; audio and video frames >- // should be able to flow at this point. >- EXPECT_TRUE_WAIT( >- fake_audio_capture_module1_->frames_received() > kDefaultNumFrames && >- fake_video_renderer1.num_rendered_frames() > kDefaultNumFrames && >- fake_audio_capture_module2_->frames_received() > kDefaultNumFrames && >- fake_video_renderer2.num_rendered_frames() > kDefaultNumFrames, >- kDefaultTimeout); >-} >- >-// TODO(deadbeef): End-to-end test for multiple senders/receivers of the same >-// media type, once that's supported. Currently, it is not because the >-// BaseChannel model relies on there being a single VoiceChannel and >-// VideoChannel, and these only support a single set of codecs/etc. per >-// send/receive direction. >- >-// TODO(deadbeef): End-to-end test for simulcast, once that's supported by this >-// API. 
>- >-#endif // if !defined(THREAD_SANITIZER) >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcfactory_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcfactory_unittest.cc >deleted file mode 100644 >index 40afbd328b6dfe176a68e11b2323c84e33593855..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcfactory_unittest.cc >+++ /dev/null >@@ -1,250 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include <memory> >- >-#include "api/audio_codecs/builtin_audio_decoder_factory.h" >-#include "api/audio_codecs/builtin_audio_encoder_factory.h" >-#include "media/base/fakemediaengine.h" >-#include "ortc/ortcfactory.h" >-#include "ortc/testrtpparameters.h" >-#include "p2p/base/fakepackettransport.h" >-#include "rtc_base/fakenetwork.h" >-#include "rtc_base/gunit.h" >-#include "rtc_base/virtualsocketserver.h" >- >-namespace webrtc { >- >-// This test uses a virtual network and fake media engine, in order to test the >-// OrtcFactory at only an API level. Any end-to-end test should go in >-// ortcfactory_integrationtest.cc instead. 
>-class OrtcFactoryTest : public testing::Test { >- public: >- OrtcFactoryTest() >- : thread_(&virtual_socket_server_), >- fake_packet_transport_("fake transport") { >- ortc_factory_ = >- OrtcFactory::Create(&thread_, nullptr, &fake_network_manager_, nullptr, >- nullptr, >- std::unique_ptr<cricket::MediaEngineInterface>( >- new cricket::FakeMediaEngine()), >- CreateBuiltinAudioEncoderFactory(), >- CreateBuiltinAudioDecoderFactory()) >- .MoveValue(); >- } >- >- protected: >- // Uses a single pre-made FakePacketTransport, so shouldn't be called twice in >- // the same test. >- std::unique_ptr<RtpTransportInterface> >- CreateRtpTransportWithFakePacketTransport() { >- return ortc_factory_ >- ->CreateRtpTransport(MakeRtcpMuxParameters(), &fake_packet_transport_, >- nullptr, nullptr) >- .MoveValue(); >- } >- >- rtc::VirtualSocketServer virtual_socket_server_; >- rtc::AutoSocketServerThread thread_; >- rtc::FakeNetworkManager fake_network_manager_; >- rtc::FakePacketTransport fake_packet_transport_; >- std::unique_ptr<OrtcFactoryInterface> ortc_factory_; >-}; >- >-TEST_F(OrtcFactoryTest, CanCreateMultipleRtpTransportControllers) { >- auto controller_result1 = ortc_factory_->CreateRtpTransportController(); >- EXPECT_TRUE(controller_result1.ok()); >- auto controller_result2 = ortc_factory_->CreateRtpTransportController(); >- EXPECT_TRUE(controller_result1.ok()); >-} >- >-// Simple test for the successful cases of CreateRtpTransport. >-TEST_F(OrtcFactoryTest, CreateRtpTransportWithAndWithoutMux) { >- rtc::FakePacketTransport rtp("rtp"); >- rtc::FakePacketTransport rtcp("rtcp"); >- // With muxed RTCP. >- RtpTransportParameters parameters = MakeRtcpMuxParameters(); >- auto result = >- ortc_factory_->CreateRtpTransport(parameters, &rtp, nullptr, nullptr); >- EXPECT_TRUE(result.ok()); >- result.MoveValue().reset(); >- // With non-muxed RTCP. 
>- parameters.rtcp.mux = false; >- result = ortc_factory_->CreateRtpTransport(parameters, &rtp, &rtcp, nullptr); >- EXPECT_TRUE(result.ok()); >-} >- >-// Simple test for the successful cases of CreateSrtpTransport. >-TEST_F(OrtcFactoryTest, CreateSrtpTransport) { >- rtc::FakePacketTransport rtp("rtp"); >- rtc::FakePacketTransport rtcp("rtcp"); >- // With muxed RTCP. >- RtpTransportParameters parameters = MakeRtcpMuxParameters(); >- auto result = >- ortc_factory_->CreateSrtpTransport(parameters, &rtp, nullptr, nullptr); >- EXPECT_TRUE(result.ok()); >- result.MoveValue().reset(); >- // With non-muxed RTCP. >- parameters.rtcp.mux = false; >- result = ortc_factory_->CreateSrtpTransport(parameters, &rtp, &rtcp, nullptr); >- EXPECT_TRUE(result.ok()); >-} >- >-// If no CNAME is provided, one should be generated and returned by >-// GetRtpParameters. >-TEST_F(OrtcFactoryTest, CreateRtpTransportGeneratesCname) { >- rtc::FakePacketTransport rtp("rtp"); >- auto result = ortc_factory_->CreateRtpTransport(MakeRtcpMuxParameters(), &rtp, >- nullptr, nullptr); >- ASSERT_TRUE(result.ok()); >- EXPECT_FALSE(result.value()->GetParameters().rtcp.cname.empty()); >-} >- >-// Extension of the above test; multiple transports created by the same factory >-// should use the same generated CNAME. >-TEST_F(OrtcFactoryTest, MultipleRtpTransportsUseSameGeneratedCname) { >- rtc::FakePacketTransport packet_transport1("1"); >- rtc::FakePacketTransport packet_transport2("2"); >- RtpTransportParameters parameters = MakeRtcpMuxParameters(); >- // Sanity check. 
>- ASSERT_TRUE(parameters.rtcp.cname.empty()); >- auto result = ortc_factory_->CreateRtpTransport( >- parameters, &packet_transport1, nullptr, nullptr); >- ASSERT_TRUE(result.ok()); >- auto rtp_transport1 = result.MoveValue(); >- result = ortc_factory_->CreateRtpTransport(parameters, &packet_transport2, >- nullptr, nullptr); >- ASSERT_TRUE(result.ok()); >- auto rtp_transport2 = result.MoveValue(); >- RtcpParameters params1 = rtp_transport1->GetParameters().rtcp; >- RtcpParameters params2 = rtp_transport2->GetParameters().rtcp; >- EXPECT_FALSE(params1.cname.empty()); >- EXPECT_EQ(params1.cname, params2.cname); >-} >- >-TEST_F(OrtcFactoryTest, CreateRtpTransportWithNoPacketTransport) { >- auto result = ortc_factory_->CreateRtpTransport(MakeRtcpMuxParameters(), >- nullptr, nullptr, nullptr); >- EXPECT_EQ(RTCErrorType::INVALID_PARAMETER, result.error().type()); >-} >- >-// If the |mux| member of the RtcpParameters is false, both an RTP and RTCP >-// packet transport are needed. >-TEST_F(OrtcFactoryTest, CreateRtpTransportWithMissingRtcpTransport) { >- rtc::FakePacketTransport rtp("rtp"); >- RtpTransportParameters parameters; >- parameters.rtcp.mux = false; >- auto result = >- ortc_factory_->CreateRtpTransport(parameters, &rtp, nullptr, nullptr); >- EXPECT_EQ(RTCErrorType::INVALID_PARAMETER, result.error().type()); >-} >- >-// If the |mux| member of the RtcpParameters is true, only an RTP packet >-// transport is necessary. So, passing in an RTCP transport is most likely >-// an accident, and thus should be treated as an error. >-TEST_F(OrtcFactoryTest, CreateRtpTransportWithExtraneousRtcpTransport) { >- rtc::FakePacketTransport rtp("rtp"); >- rtc::FakePacketTransport rtcp("rtcp"); >- auto result = ortc_factory_->CreateRtpTransport(MakeRtcpMuxParameters(), &rtp, >- &rtcp, nullptr); >- EXPECT_EQ(RTCErrorType::INVALID_PARAMETER, result.error().type()); >-} >- >-// Basic test that CreateUdpTransport works with AF_INET and AF_INET6. 
>-TEST_F(OrtcFactoryTest, CreateUdpTransport) { >- auto result = ortc_factory_->CreateUdpTransport(AF_INET); >- EXPECT_TRUE(result.ok()); >- result = ortc_factory_->CreateUdpTransport(AF_INET6); >- EXPECT_TRUE(result.ok()); >-} >- >-// Test CreateUdpPort with the |min_port| and |max_port| arguments. >-TEST_F(OrtcFactoryTest, CreateUdpTransportWithPortRange) { >- auto socket_result1 = ortc_factory_->CreateUdpTransport(AF_INET, 2000, 2002); >- ASSERT_TRUE(socket_result1.ok()); >- EXPECT_EQ(2000, socket_result1.value()->GetLocalAddress().port()); >- auto socket_result2 = ortc_factory_->CreateUdpTransport(AF_INET, 2000, 2002); >- ASSERT_TRUE(socket_result2.ok()); >- EXPECT_EQ(2001, socket_result2.value()->GetLocalAddress().port()); >- auto socket_result3 = ortc_factory_->CreateUdpTransport(AF_INET, 2000, 2002); >- ASSERT_TRUE(socket_result3.ok()); >- EXPECT_EQ(2002, socket_result3.value()->GetLocalAddress().port()); >- >- // All sockets in the range have been exhausted, so the next call should >- // fail. >- auto failed_result = ortc_factory_->CreateUdpTransport(AF_INET, 2000, 2002); >- EXPECT_EQ(RTCErrorType::RESOURCE_EXHAUSTED, failed_result.error().type()); >- >- // If one socket is destroyed, that port should be freed up again. >- socket_result2.MoveValue().reset(); >- auto socket_result4 = ortc_factory_->CreateUdpTransport(AF_INET, 2000, 2002); >- ASSERT_TRUE(socket_result4.ok()); >- EXPECT_EQ(2001, socket_result4.value()->GetLocalAddress().port()); >-} >- >-// Basic test that CreateUdpTransport works with AF_INET and AF_INET6. 
>-TEST_F(OrtcFactoryTest, CreateUdpTransportWithInvalidAddressFamily) { >- auto result = ortc_factory_->CreateUdpTransport(12345); >- EXPECT_EQ(RTCErrorType::INVALID_PARAMETER, result.error().type()); >-} >- >-TEST_F(OrtcFactoryTest, CreateUdpTransportWithInvalidPortRange) { >- auto result = ortc_factory_->CreateUdpTransport(AF_INET, 3000, 2000); >- EXPECT_EQ(RTCErrorType::INVALID_RANGE, result.error().type()); >-} >- >-// Just sanity check that each "GetCapabilities" method returns some codecs. >-TEST_F(OrtcFactoryTest, GetSenderAndReceiverCapabilities) { >- RtpCapabilities audio_send_caps = >- ortc_factory_->GetRtpSenderCapabilities(cricket::MEDIA_TYPE_AUDIO); >- EXPECT_GT(audio_send_caps.codecs.size(), 0u); >- RtpCapabilities video_send_caps = >- ortc_factory_->GetRtpSenderCapabilities(cricket::MEDIA_TYPE_VIDEO); >- EXPECT_GT(video_send_caps.codecs.size(), 0u); >- RtpCapabilities audio_receive_caps = >- ortc_factory_->GetRtpReceiverCapabilities(cricket::MEDIA_TYPE_AUDIO); >- EXPECT_GT(audio_receive_caps.codecs.size(), 0u); >- RtpCapabilities video_receive_caps = >- ortc_factory_->GetRtpReceiverCapabilities(cricket::MEDIA_TYPE_VIDEO); >- EXPECT_GT(video_receive_caps.codecs.size(), 0u); >-} >- >-// Calling CreateRtpSender with a null track should fail, since that makes it >-// impossible to know whether to create an audio or video sender. The >-// application should be using the method that takes a cricket::MediaType >-// instead. >-TEST_F(OrtcFactoryTest, CreateSenderWithNullTrack) { >- auto rtp_transport = CreateRtpTransportWithFakePacketTransport(); >- auto result = ortc_factory_->CreateRtpSender(nullptr, rtp_transport.get()); >- EXPECT_EQ(RTCErrorType::INVALID_PARAMETER, result.error().type()); >-} >- >-// Calling CreateRtpSender or CreateRtpReceiver with MEDIA_TYPE_DATA should >-// fail. 
>-TEST_F(OrtcFactoryTest, CreateSenderOrReceieverWithInvalidKind) { >- auto rtp_transport = CreateRtpTransportWithFakePacketTransport(); >- auto sender_result = ortc_factory_->CreateRtpSender(cricket::MEDIA_TYPE_DATA, >- rtp_transport.get()); >- EXPECT_EQ(RTCErrorType::INVALID_PARAMETER, sender_result.error().type()); >- auto receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_DATA, rtp_transport.get()); >- EXPECT_EQ(RTCErrorType::INVALID_PARAMETER, receiver_result.error().type()); >-} >- >-TEST_F(OrtcFactoryTest, CreateSendersOrReceieversWithNullTransport) { >- auto sender_result = >- ortc_factory_->CreateRtpSender(cricket::MEDIA_TYPE_AUDIO, nullptr); >- EXPECT_EQ(RTCErrorType::INVALID_PARAMETER, sender_result.error().type()); >- auto receiver_result = >- ortc_factory_->CreateRtpReceiver(cricket::MEDIA_TYPE_AUDIO, nullptr); >- EXPECT_EQ(RTCErrorType::INVALID_PARAMETER, receiver_result.error().type()); >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcrtpreceiver_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcrtpreceiver_unittest.cc >deleted file mode 100644 >index f0e92387145eac64a9f6d808f24efafa9fea35df..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcrtpreceiver_unittest.cc >+++ /dev/null >@@ -1,550 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include <memory> >- >-#include "api/audio_codecs/builtin_audio_decoder_factory.h" >-#include "api/audio_codecs/builtin_audio_encoder_factory.h" >-#include "media/base/fakemediaengine.h" >-#include "ortc/ortcfactory.h" >-#include "ortc/testrtpparameters.h" >-#include "p2p/base/fakepackettransport.h" >-#include "pc/test/fakevideotracksource.h" >-#include "rtc_base/gunit.h" >- >-namespace webrtc { >- >-// This test uses an individual RtpReceiver using only the public interface, >-// and verifies that it behaves as designed at an API level. Also tests that >-// parameters are applied to the audio/video engines as expected. Network and >-// media interfaces are faked to isolate what's being tested. >-// >-// This test shouldn't result any any actual media being sent. That sort of >-// test should go in ortcfactory_integrationtest.cc. >-class OrtcRtpReceiverTest : public testing::Test { >- public: >- OrtcRtpReceiverTest() : fake_packet_transport_("fake") { >- fake_media_engine_ = new cricket::FakeMediaEngine(); >- // Note: This doesn't need to use fake network classes, since we already >- // use FakePacketTransport. >- auto ortc_factory_result = OrtcFactory::Create( >- nullptr, nullptr, nullptr, nullptr, nullptr, >- std::unique_ptr<cricket::MediaEngineInterface>(fake_media_engine_), >- CreateBuiltinAudioEncoderFactory(), CreateBuiltinAudioDecoderFactory()); >- ortc_factory_ = ortc_factory_result.MoveValue(); >- RtpTransportParameters parameters; >- parameters.rtcp.mux = true; >- auto rtp_transport_result = ortc_factory_->CreateRtpTransport( >- parameters, &fake_packet_transport_, nullptr, nullptr); >- rtp_transport_ = rtp_transport_result.MoveValue(); >- } >- >- protected: >- // Owned by |ortc_factory_|. 
>- cricket::FakeMediaEngine* fake_media_engine_; >- rtc::FakePacketTransport fake_packet_transport_; >- std::unique_ptr<OrtcFactoryInterface> ortc_factory_; >- std::unique_ptr<RtpTransportInterface> rtp_transport_; >-}; >- >-// See ortcrtpreceiverinterface.h for the current expectations of what GetTrack >-// will return after calls to Receive. >-// TODO(deadbeef): Replace this test when the non-standard behavior is fixed >-// and GetTrack starts returning the same track for the lifetime of the >-// receiver. >-TEST_F(OrtcRtpReceiverTest, GetTrack) { >- auto receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_VIDEO, rtp_transport_.get()); >- ASSERT_TRUE(receiver_result.ok()); >- auto receiver = receiver_result.MoveValue(); >- >- // Track initially expected to be null. >- EXPECT_EQ(nullptr, receiver_result.value().get()); >- >- EXPECT_TRUE(receiver->Receive(MakeMinimalVp8ParametersWithNoSsrc()).ok()); >- auto initial_track = receiver->GetTrack(); >- EXPECT_NE(nullptr, initial_track); >- >- // Codec changing but SSRC (or lack thereof) isn't; shouldn't create new track >- EXPECT_TRUE(receiver->Receive(MakeMinimalVp9ParametersWithNoSsrc()).ok()); >- EXPECT_EQ(initial_track, receiver->GetTrack()); >- >- // Explicitly set SSRC and expect a different track. >- EXPECT_TRUE( >- receiver->Receive(MakeMinimalVp9ParametersWithSsrc(0xdeadbeef)).ok()); >- auto next_track = receiver->GetTrack(); >- EXPECT_NE(next_track, initial_track); >- >- // Deactivating the encoding shouldn't change the track. >- RtpParameters inactive_encoding = >- MakeMinimalVp9ParametersWithSsrc(0xdeadbeef); >- inactive_encoding.encodings[0].active = false; >- EXPECT_TRUE(receiver->Receive(inactive_encoding).ok()); >- EXPECT_EQ(next_track, receiver->GetTrack()); >- >- // Removing all encodings *is* expected to clear the track. 
>- RtpParameters no_encodings = MakeMinimalVp9ParametersWithSsrc(0xdeadbeef); >- no_encodings.encodings.clear(); >- EXPECT_TRUE(receiver->Receive(no_encodings).ok()); >- EXPECT_EQ(nullptr, receiver->GetTrack()); >-} >- >-// Currently SetTransport isn't supported. When it is, replace this test with a >-// test/tests for it. >-TEST_F(OrtcRtpReceiverTest, SetTransportFails) { >- rtc::FakePacketTransport fake_packet_transport("another_transport"); >- RtpTransportParameters parameters; >- parameters.rtcp.mux = true; >- auto rtp_transport_result = ortc_factory_->CreateRtpTransport( >- parameters, &fake_packet_transport, nullptr, nullptr); >- auto rtp_transport = rtp_transport_result.MoveValue(); >- >- auto receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_AUDIO, rtp_transport_.get()); >- auto receiver = receiver_result.MoveValue(); >- EXPECT_EQ(RTCErrorType::UNSUPPORTED_OPERATION, >- receiver->SetTransport(rtp_transport.get()).type()); >-} >- >-TEST_F(OrtcRtpReceiverTest, GetTransport) { >- auto result = ortc_factory_->CreateRtpReceiver(cricket::MEDIA_TYPE_AUDIO, >- rtp_transport_.get()); >- EXPECT_EQ(rtp_transport_.get(), result.value()->GetTransport()); >-} >- >-// Test that "Receive" causes the expected parameters to be applied to the media >-// engine level, for an audio receiver. >-TEST_F(OrtcRtpReceiverTest, ReceiveAppliesAudioParametersToMediaEngine) { >- auto audio_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_AUDIO, rtp_transport_.get()); >- auto audio_receiver = audio_receiver_result.MoveValue(); >- >- // First, create parameters with all the bells and whistles. 
>- RtpParameters parameters; >- >- RtpCodecParameters opus_codec; >- opus_codec.name = "opus"; >- opus_codec.kind = cricket::MEDIA_TYPE_AUDIO; >- opus_codec.payload_type = 120; >- opus_codec.clock_rate.emplace(48000); >- opus_codec.num_channels.emplace(2); >- opus_codec.parameters["minptime"] = "10"; >- opus_codec.rtcp_feedback.emplace_back(RtcpFeedbackType::TRANSPORT_CC); >- parameters.codecs.push_back(std::move(opus_codec)); >- >- // Add two codecs, expecting the first to be used. >- // TODO(deadbeef): Once "codec_payload_type" is supported, use it to select a >- // codec that's not at the top of the list. >- RtpCodecParameters isac_codec; >- isac_codec.name = "ISAC"; >- isac_codec.kind = cricket::MEDIA_TYPE_AUDIO; >- isac_codec.payload_type = 110; >- isac_codec.clock_rate.emplace(16000); >- parameters.codecs.push_back(std::move(isac_codec)); >- >- RtpEncodingParameters encoding; >- encoding.ssrc.emplace(0xdeadbeef); >- parameters.encodings.push_back(std::move(encoding)); >- >- parameters.header_extensions.emplace_back( >- "urn:ietf:params:rtp-hdrext:ssrc-audio-level", 3); >- >- EXPECT_TRUE(audio_receiver->Receive(parameters).ok()); >- >- // Now verify that the parameters were applied to the fake media engine layer >- // that exists below BaseChannel. >- cricket::FakeVoiceMediaChannel* fake_voice_channel = >- fake_media_engine_->GetVoiceChannel(0); >- ASSERT_NE(nullptr, fake_voice_channel); >- EXPECT_TRUE(fake_voice_channel->playout()); >- >- // Verify codec parameters. >- ASSERT_GT(fake_voice_channel->recv_codecs().size(), 0u); >- const cricket::AudioCodec& top_codec = fake_voice_channel->recv_codecs()[0]; >- EXPECT_EQ("opus", top_codec.name); >- EXPECT_EQ(120, top_codec.id); >- EXPECT_EQ(48000, top_codec.clockrate); >- EXPECT_EQ(2u, top_codec.channels); >- ASSERT_NE(top_codec.params.end(), top_codec.params.find("minptime")); >- EXPECT_EQ("10", top_codec.params.at("minptime")); >- >- // Verify encoding parameters. 
>- ASSERT_EQ(1u, fake_voice_channel->recv_streams().size()); >- const cricket::StreamParams& recv_stream = >- fake_voice_channel->recv_streams()[0]; >- EXPECT_EQ(1u, recv_stream.ssrcs.size()); >- EXPECT_EQ(0xdeadbeef, recv_stream.first_ssrc()); >- >- // Verify header extensions. >- ASSERT_EQ(1u, fake_voice_channel->recv_extensions().size()); >- const RtpExtension& extension = fake_voice_channel->recv_extensions()[0]; >- EXPECT_EQ("urn:ietf:params:rtp-hdrext:ssrc-audio-level", extension.uri); >- EXPECT_EQ(3, extension.id); >-} >- >-// Test that "Receive" causes the expected parameters to be applied to the media >-// engine level, for a video receiver. >-TEST_F(OrtcRtpReceiverTest, ReceiveAppliesVideoParametersToMediaEngine) { >- auto video_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_VIDEO, rtp_transport_.get()); >- auto video_receiver = video_receiver_result.MoveValue(); >- >- // First, create parameters with all the bells and whistles. >- RtpParameters parameters; >- >- RtpCodecParameters vp8_codec; >- vp8_codec.name = "VP8"; >- vp8_codec.kind = cricket::MEDIA_TYPE_VIDEO; >- vp8_codec.payload_type = 99; >- // Try a couple types of feedback params. "Generic NACK" is a bit of a >- // special case, so test it here. >- vp8_codec.rtcp_feedback.emplace_back(RtcpFeedbackType::CCM, >- RtcpFeedbackMessageType::FIR); >- vp8_codec.rtcp_feedback.emplace_back(RtcpFeedbackType::NACK, >- RtcpFeedbackMessageType::GENERIC_NACK); >- parameters.codecs.push_back(std::move(vp8_codec)); >- >- RtpCodecParameters vp8_rtx_codec; >- vp8_rtx_codec.name = "rtx"; >- vp8_rtx_codec.kind = cricket::MEDIA_TYPE_VIDEO; >- vp8_rtx_codec.payload_type = 100; >- vp8_rtx_codec.parameters["apt"] = "99"; >- parameters.codecs.push_back(std::move(vp8_rtx_codec)); >- >- // Add two codecs, expecting the first to be used. >- // TODO(deadbeef): Once "codec_payload_type" is supported, use it to select a >- // codec that's not at the top of the list. 
>- RtpCodecParameters vp9_codec; >- vp9_codec.name = "VP9"; >- vp9_codec.kind = cricket::MEDIA_TYPE_VIDEO; >- vp9_codec.payload_type = 102; >- parameters.codecs.push_back(std::move(vp9_codec)); >- >- RtpCodecParameters vp9_rtx_codec; >- vp9_rtx_codec.name = "rtx"; >- vp9_rtx_codec.kind = cricket::MEDIA_TYPE_VIDEO; >- vp9_rtx_codec.payload_type = 103; >- vp9_rtx_codec.parameters["apt"] = "102"; >- parameters.codecs.push_back(std::move(vp9_rtx_codec)); >- >- RtpEncodingParameters encoding; >- encoding.ssrc.emplace(0xdeadbeef); >- encoding.rtx.emplace(0xbaadfeed); >- parameters.encodings.push_back(std::move(encoding)); >- >- parameters.header_extensions.emplace_back("urn:3gpp:video-orientation", 4); >- parameters.header_extensions.emplace_back( >- "http://www.webrtc.org/experiments/rtp-hdrext/playout-delay", 6); >- >- EXPECT_TRUE(video_receiver->Receive(parameters).ok()); >- >- // Now verify that the parameters were applied to the fake media engine layer >- // that exists below BaseChannel. >- cricket::FakeVideoMediaChannel* fake_video_channel = >- fake_media_engine_->GetVideoChannel(0); >- ASSERT_NE(nullptr, fake_video_channel); >- >- // Verify codec parameters. >- ASSERT_GE(fake_video_channel->recv_codecs().size(), 2u); >- const cricket::VideoCodec& top_codec = fake_video_channel->recv_codecs()[0]; >- EXPECT_EQ("VP8", top_codec.name); >- EXPECT_EQ(99, top_codec.id); >- EXPECT_TRUE(top_codec.feedback_params.Has({"ccm", "fir"})); >- EXPECT_TRUE(top_codec.feedback_params.Has(cricket::FeedbackParam("nack"))); >- >- const cricket::VideoCodec& rtx_codec = fake_video_channel->recv_codecs()[1]; >- EXPECT_EQ("rtx", rtx_codec.name); >- EXPECT_EQ(100, rtx_codec.id); >- ASSERT_NE(rtx_codec.params.end(), rtx_codec.params.find("apt")); >- EXPECT_EQ("99", rtx_codec.params.at("apt")); >- >- // Verify encoding parameters. 
>- ASSERT_EQ(1u, fake_video_channel->recv_streams().size()); >- const cricket::StreamParams& recv_stream = >- fake_video_channel->recv_streams()[0]; >- EXPECT_EQ(2u, recv_stream.ssrcs.size()); >- EXPECT_EQ(0xdeadbeef, recv_stream.first_ssrc()); >- uint32_t rtx_ssrc = 0u; >- EXPECT_TRUE(recv_stream.GetFidSsrc(recv_stream.first_ssrc(), &rtx_ssrc)); >- EXPECT_EQ(0xbaadfeed, rtx_ssrc); >- >- // Verify header extensions. >- ASSERT_EQ(2u, fake_video_channel->recv_extensions().size()); >- const RtpExtension& extension1 = fake_video_channel->recv_extensions()[0]; >- EXPECT_EQ("urn:3gpp:video-orientation", extension1.uri); >- EXPECT_EQ(4, extension1.id); >- const RtpExtension& extension2 = fake_video_channel->recv_extensions()[1]; >- EXPECT_EQ("http://www.webrtc.org/experiments/rtp-hdrext/playout-delay", >- extension2.uri); >- EXPECT_EQ(6, extension2.id); >-} >- >-// Test changing both the receive codec and SSRC at the same time, and verify >-// that the new parameters are applied to the media engine level. 
>-TEST_F(OrtcRtpReceiverTest, CallingReceiveTwiceChangesParameters) { >- auto audio_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_AUDIO, rtp_transport_.get()); >- auto audio_receiver = audio_receiver_result.MoveValue(); >- RTCError error = >- audio_receiver->Receive(MakeMinimalOpusParametersWithSsrc(0x11111111)); >- EXPECT_TRUE(error.ok()); >- error = >- audio_receiver->Receive(MakeMinimalIsacParametersWithSsrc(0x22222222)); >- EXPECT_TRUE(error.ok()); >- >- cricket::FakeVoiceMediaChannel* fake_voice_channel = >- fake_media_engine_->GetVoiceChannel(0); >- ASSERT_NE(nullptr, fake_voice_channel); >- ASSERT_GT(fake_voice_channel->recv_codecs().size(), 0u); >- EXPECT_EQ("ISAC", fake_voice_channel->recv_codecs()[0].name); >- ASSERT_EQ(1u, fake_voice_channel->recv_streams().size()); >- EXPECT_EQ(0x22222222u, fake_voice_channel->recv_streams()[0].first_ssrc()); >- >- auto video_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_VIDEO, rtp_transport_.get()); >- auto video_receiver = video_receiver_result.MoveValue(); >- error = video_receiver->Receive(MakeMinimalVp8ParametersWithSsrc(0x33333333)); >- EXPECT_TRUE(error.ok()); >- error = video_receiver->Receive(MakeMinimalVp9ParametersWithSsrc(0x44444444)); >- EXPECT_TRUE(error.ok()); >- >- cricket::FakeVideoMediaChannel* fake_video_channel = >- fake_media_engine_->GetVideoChannel(0); >- ASSERT_NE(nullptr, fake_video_channel); >- ASSERT_GT(fake_video_channel->recv_codecs().size(), 0u); >- EXPECT_EQ("VP9", fake_video_channel->recv_codecs()[0].name); >- ASSERT_EQ(1u, fake_video_channel->recv_streams().size()); >- EXPECT_EQ(0x44444444u, fake_video_channel->recv_streams()[0].first_ssrc()); >-} >- >-// Ensure that if the |active| flag of RtpEncodingParameters is set to false, >-// playout stops at the media engine level. Note that this is only applicable >-// to audio (at least currently). 
>-TEST_F(OrtcRtpReceiverTest, DeactivatingEncodingStopsPlayout) { >- auto audio_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_AUDIO, rtp_transport_.get()); >- auto audio_receiver = audio_receiver_result.MoveValue(); >- RtpParameters parameters = MakeMinimalOpusParameters(); >- EXPECT_TRUE(audio_receiver->Receive(parameters).ok()); >- >- // Expect "playout" flag to initially be true. >- cricket::FakeVoiceMediaChannel* fake_voice_channel = >- fake_media_engine_->GetVoiceChannel(0); >- ASSERT_NE(nullptr, fake_voice_channel); >- EXPECT_TRUE(fake_voice_channel->playout()); >- >- // Deactivate encoding and expect it to change to false. >- parameters.encodings[0].active = false; >- EXPECT_TRUE(audio_receiver->Receive(parameters).ok()); >- EXPECT_FALSE(fake_voice_channel->playout()); >-} >- >-// Ensure that calling Receive with an empty list of encodings causes receive >-// streams at the media engine level to be cleared. >-TEST_F(OrtcRtpReceiverTest, >- CallingReceiveWithEmptyEncodingsClearsReceiveStreams) { >- auto audio_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_AUDIO, rtp_transport_.get()); >- auto audio_receiver = audio_receiver_result.MoveValue(); >- RtpParameters parameters = MakeMinimalOpusParameters(); >- EXPECT_TRUE(audio_receiver->Receive(parameters).ok()); >- parameters.encodings.clear(); >- EXPECT_TRUE(audio_receiver->Receive(parameters).ok()); >- >- cricket::FakeVoiceMediaChannel* fake_voice_channel = >- fake_media_engine_->GetVoiceChannel(0); >- ASSERT_NE(nullptr, fake_voice_channel); >- EXPECT_TRUE(fake_voice_channel->recv_streams().empty()); >- >- auto video_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_VIDEO, rtp_transport_.get()); >- auto video_receiver = video_receiver_result.MoveValue(); >- parameters = MakeMinimalVp8Parameters(); >- EXPECT_TRUE(video_receiver->Receive(parameters).ok()); >- parameters.encodings.clear(); >- 
EXPECT_TRUE(video_receiver->Receive(parameters).ok()); >- >- cricket::FakeVideoMediaChannel* fake_video_channel = >- fake_media_engine_->GetVideoChannel(0); >- ASSERT_NE(nullptr, fake_video_channel); >- EXPECT_TRUE(fake_video_channel->recv_streams().empty()); >-} >- >-// These errors should be covered by rtpparametersconversion_unittest.cc, but >-// we should at least test that those errors are propogated from calls to >-// Receive, with a few examples. >-TEST_F(OrtcRtpReceiverTest, ReceiveReturnsErrorOnInvalidParameters) { >- auto result = ortc_factory_->CreateRtpReceiver(cricket::MEDIA_TYPE_AUDIO, >- rtp_transport_.get()); >- auto receiver = result.MoveValue(); >- // CCM feedback missing message type. >- RtpParameters invalid_feedback = MakeMinimalOpusParameters(); >- invalid_feedback.codecs[0].rtcp_feedback.emplace_back(RtcpFeedbackType::CCM); >- EXPECT_EQ(RTCErrorType::INVALID_PARAMETER, >- receiver->Receive(invalid_feedback).type()); >- // Payload type greater than 127. >- RtpParameters invalid_pt = MakeMinimalOpusParameters(); >- invalid_pt.codecs[0].payload_type = 128; >- EXPECT_EQ(RTCErrorType::INVALID_RANGE, receiver->Receive(invalid_pt).type()); >- // Duplicate header extension IDs. >- RtpParameters duplicate_ids = MakeMinimalOpusParameters(); >- duplicate_ids.header_extensions.emplace_back("foo", 5); >- duplicate_ids.header_extensions.emplace_back("bar", 5); >- EXPECT_EQ(RTCErrorType::INVALID_PARAMETER, >- receiver->Receive(duplicate_ids).type()); >-} >- >-// Two receivers using the same transport shouldn't be able to use the same >-// payload type to refer to different codecs, same header extension IDs to >-// refer to different extensions, or same SSRC. 
>-TEST_F(OrtcRtpReceiverTest, ReceiveReturnsErrorOnIdConflicts) { >- auto audio_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_AUDIO, rtp_transport_.get()); >- auto video_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_VIDEO, rtp_transport_.get()); >- auto audio_receiver = audio_receiver_result.MoveValue(); >- auto video_receiver = video_receiver_result.MoveValue(); >- >- // First test payload type conflict. >- RtpParameters audio_parameters = MakeMinimalOpusParameters(); >- RtpParameters video_parameters = MakeMinimalVp8Parameters(); >- audio_parameters.codecs[0].payload_type = 100; >- video_parameters.codecs[0].payload_type = 100; >- EXPECT_TRUE(audio_receiver->Receive(audio_parameters).ok()); >- EXPECT_EQ(RTCErrorType::INVALID_PARAMETER, >- video_receiver->Receive(video_parameters).type()); >- >- // Test header extension ID conflict. >- video_parameters.codecs[0].payload_type = 110; >- audio_parameters.header_extensions.emplace_back("foo", 4); >- video_parameters.header_extensions.emplace_back("bar", 4); >- EXPECT_TRUE(audio_receiver->Receive(audio_parameters).ok()); >- EXPECT_EQ(RTCErrorType::INVALID_PARAMETER, >- video_receiver->Receive(video_parameters).type()); >- >- // Test SSRC conflict. Have an RTX SSRC that conflicts with a primary SSRC >- // for extra challenge. >- video_parameters.header_extensions[0].uri = "foo"; >- audio_parameters.encodings[0].ssrc.emplace(0xabbaabba); >- audio_parameters.encodings[0].rtx.emplace(0xdeadbeef); >- video_parameters.encodings[0].ssrc.emplace(0xdeadbeef); >- EXPECT_TRUE(audio_receiver->Receive(audio_parameters).ok()); >- EXPECT_EQ(RTCErrorType::INVALID_PARAMETER, >- video_receiver->Receive(video_parameters).type()); >- >- // Sanity check that parameters can be set if the conflicts are all resolved. 
>- video_parameters.encodings[0].ssrc.emplace(0xbaadf00d); >- EXPECT_TRUE(video_receiver->Receive(video_parameters).ok()); >-} >- >-// Ensure that deleting a receiver causes receive streams at the media engine >-// level to be cleared. >-TEST_F(OrtcRtpReceiverTest, DeletingReceiverClearsReceiveStreams) { >- auto audio_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_AUDIO, rtp_transport_.get()); >- auto audio_receiver = audio_receiver_result.MoveValue(); >- EXPECT_TRUE(audio_receiver->Receive(MakeMinimalOpusParameters()).ok()); >- >- // Also create an audio sender, to prevent the voice channel from being >- // completely deleted. >- auto audio_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_AUDIO, rtp_transport_.get()); >- auto audio_sender = audio_sender_result.MoveValue(); >- EXPECT_TRUE(audio_sender->Send(MakeMinimalOpusParameters()).ok()); >- >- audio_receiver.reset(nullptr); >- cricket::FakeVoiceMediaChannel* fake_voice_channel = >- fake_media_engine_->GetVoiceChannel(0); >- ASSERT_NE(nullptr, fake_voice_channel); >- EXPECT_TRUE(fake_voice_channel->recv_streams().empty()); >- >- auto video_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_VIDEO, rtp_transport_.get()); >- auto video_receiver = video_receiver_result.MoveValue(); >- EXPECT_TRUE(video_receiver->Receive(MakeMinimalVp8Parameters()).ok()); >- >- // Also create an video sender, to prevent the video channel from being >- // completely deleted. 
>- auto video_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_VIDEO, rtp_transport_.get()); >- auto video_sender = video_sender_result.MoveValue(); >- EXPECT_TRUE(video_sender->Send(MakeMinimalVp8Parameters()).ok()); >- >- video_receiver.reset(nullptr); >- cricket::FakeVideoMediaChannel* fake_video_channel = >- fake_media_engine_->GetVideoChannel(0); >- ASSERT_NE(nullptr, fake_video_channel); >- EXPECT_TRUE(fake_video_channel->recv_streams().empty()); >-} >- >-// If Receive hasn't been called, GetParameters should return empty parameters. >-TEST_F(OrtcRtpReceiverTest, GetDefaultParameters) { >- auto result = ortc_factory_->CreateRtpReceiver(cricket::MEDIA_TYPE_AUDIO, >- rtp_transport_.get()); >- EXPECT_EQ(RtpParameters(), result.value()->GetParameters()); >- result = ortc_factory_->CreateRtpReceiver(cricket::MEDIA_TYPE_VIDEO, >- rtp_transport_.get()); >- EXPECT_EQ(RtpParameters(), result.value()->GetParameters()); >-} >- >-// Test that GetParameters returns the last parameters passed into Receive, >-// along with the implementation-default values filled in where they were left >-// unset. >-TEST_F(OrtcRtpReceiverTest, >- GetParametersReturnsLastSetParametersWithDefaultsFilled) { >- auto audio_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_AUDIO, rtp_transport_.get()); >- auto audio_receiver = audio_receiver_result.MoveValue(); >- >- RtpParameters opus_parameters = MakeMinimalOpusParameters(); >- EXPECT_TRUE(audio_receiver->Receive(opus_parameters).ok()); >- EXPECT_EQ(opus_parameters, audio_receiver->GetParameters()); >- >- RtpParameters isac_parameters = MakeMinimalIsacParameters(); >- // Sanity check that num_channels actually is left unset. >- ASSERT_FALSE(isac_parameters.codecs[0].num_channels); >- EXPECT_TRUE(audio_receiver->Receive(isac_parameters).ok()); >- // Should be filled with a default "num channels" of 1. >- // TODO(deadbeef): This should actually default to 2 for some codecs. 
Update >- // this test once that's implemented. >- isac_parameters.codecs[0].num_channels.emplace(1); >- EXPECT_EQ(isac_parameters, audio_receiver->GetParameters()); >- >- auto video_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_VIDEO, rtp_transport_.get()); >- auto video_receiver = video_receiver_result.MoveValue(); >- >- RtpParameters vp8_parameters = MakeMinimalVp8Parameters(); >- // Sanity check that clock_rate actually is left unset. >- EXPECT_TRUE(video_receiver->Receive(vp8_parameters).ok()); >- // Should be filled with a default clock rate of 90000. >- vp8_parameters.codecs[0].clock_rate.emplace(90000); >- EXPECT_EQ(vp8_parameters, video_receiver->GetParameters()); >- >- RtpParameters vp9_parameters = MakeMinimalVp9Parameters(); >- // Sanity check that clock_rate actually is left unset. >- EXPECT_TRUE(video_receiver->Receive(vp9_parameters).ok()); >- // Should be filled with a default clock rate of 90000. >- vp9_parameters.codecs[0].clock_rate.emplace(90000); >- EXPECT_EQ(vp9_parameters, video_receiver->GetParameters()); >-} >- >-TEST_F(OrtcRtpReceiverTest, GetKind) { >- auto audio_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_AUDIO, rtp_transport_.get()); >- auto video_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_VIDEO, rtp_transport_.get()); >- auto audio_receiver = audio_receiver_result.MoveValue(); >- auto video_receiver = video_receiver_result.MoveValue(); >- EXPECT_EQ(cricket::MEDIA_TYPE_AUDIO, audio_receiver->GetKind()); >- EXPECT_EQ(cricket::MEDIA_TYPE_VIDEO, video_receiver->GetKind()); >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcrtpreceiveradapter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcrtpreceiveradapter.cc >deleted file mode 100644 >index 689412217acc8b5d0fc84ea69b344ca73b2fad06..0000000000000000000000000000000000000000 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcrtpreceiveradapter.cc >+++ /dev/null >@@ -1,181 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include "ortc/ortcrtpreceiveradapter.h" >- >-#include <string> >-#include <utility> >-#include <vector> >- >-#include "media/base/mediaconstants.h" >-#include "ortc/rtptransportadapter.h" >-#include "rtc_base/checks.h" >-#include "rtc_base/helpers.h" // For "CreateRandomX". >- >-namespace { >- >-void FillAudioReceiverParameters(webrtc::RtpParameters* parameters) { >- for (webrtc::RtpCodecParameters& codec : parameters->codecs) { >- if (!codec.num_channels) { >- codec.num_channels = 1; >- } >- } >-} >- >-void FillVideoReceiverParameters(webrtc::RtpParameters* parameters) { >- for (webrtc::RtpCodecParameters& codec : parameters->codecs) { >- if (!codec.clock_rate) { >- codec.clock_rate = cricket::kVideoCodecClockrate; >- } >- } >-} >- >-} // namespace >- >-namespace webrtc { >- >-BEGIN_OWNED_PROXY_MAP(OrtcRtpReceiver) >-PROXY_SIGNALING_THREAD_DESTRUCTOR() >-PROXY_CONSTMETHOD0(rtc::scoped_refptr<MediaStreamTrackInterface>, GetTrack) >-PROXY_METHOD1(RTCError, SetTransport, RtpTransportInterface*) >-PROXY_CONSTMETHOD0(RtpTransportInterface*, GetTransport) >-PROXY_METHOD1(RTCError, Receive, const RtpParameters&) >-PROXY_CONSTMETHOD0(RtpParameters, GetParameters) >-PROXY_CONSTMETHOD0(cricket::MediaType, GetKind) >-END_PROXY_MAP() >- >-// static >-std::unique_ptr<OrtcRtpReceiverInterface> OrtcRtpReceiverAdapter::CreateProxy( >- std::unique_ptr<OrtcRtpReceiverAdapter> wrapped_receiver) { >- RTC_DCHECK(wrapped_receiver); >- rtc::Thread* signaling = 
>- wrapped_receiver->rtp_transport_controller_->signaling_thread(); >- rtc::Thread* worker = >- wrapped_receiver->rtp_transport_controller_->worker_thread(); >- return OrtcRtpReceiverProxy::Create(signaling, worker, >- std::move(wrapped_receiver)); >-} >- >-OrtcRtpReceiverAdapter::~OrtcRtpReceiverAdapter() { >- internal_receiver_ = nullptr; >- SignalDestroyed(); >-} >- >-rtc::scoped_refptr<MediaStreamTrackInterface> OrtcRtpReceiverAdapter::GetTrack() >- const { >- return internal_receiver_ ? internal_receiver_->track() : nullptr; >-} >- >-RTCError OrtcRtpReceiverAdapter::SetTransport( >- RtpTransportInterface* transport) { >- LOG_AND_RETURN_ERROR( >- RTCErrorType::UNSUPPORTED_OPERATION, >- "Changing the transport of an RtpReceiver is not yet supported."); >-} >- >-RtpTransportInterface* OrtcRtpReceiverAdapter::GetTransport() const { >- return transport_; >-} >- >-RTCError OrtcRtpReceiverAdapter::Receive(const RtpParameters& parameters) { >- RtpParameters filled_parameters = parameters; >- RTCError err; >- switch (kind_) { >- case cricket::MEDIA_TYPE_AUDIO: >- FillAudioReceiverParameters(&filled_parameters); >- err = rtp_transport_controller_->ValidateAndApplyAudioReceiverParameters( >- filled_parameters); >- if (!err.ok()) { >- return err; >- } >- break; >- case cricket::MEDIA_TYPE_VIDEO: >- FillVideoReceiverParameters(&filled_parameters); >- err = rtp_transport_controller_->ValidateAndApplyVideoReceiverParameters( >- filled_parameters); >- if (!err.ok()) { >- return err; >- } >- break; >- case cricket::MEDIA_TYPE_DATA: >- RTC_NOTREACHED(); >- return webrtc::RTCError(webrtc::RTCErrorType::INTERNAL_ERROR); >- } >- last_applied_parameters_ = filled_parameters; >- >- // Now that parameters were applied, can create (or recreate) the internal >- // receiver. >- // >- // This is analogous to a PeerConnection creating a receiver after >- // SetRemoteDescription is successful. 
>- MaybeRecreateInternalReceiver(); >- return RTCError::OK(); >-} >- >-RtpParameters OrtcRtpReceiverAdapter::GetParameters() const { >- return last_applied_parameters_; >-} >- >-cricket::MediaType OrtcRtpReceiverAdapter::GetKind() const { >- return kind_; >-} >- >-OrtcRtpReceiverAdapter::OrtcRtpReceiverAdapter( >- cricket::MediaType kind, >- RtpTransportInterface* transport, >- RtpTransportControllerAdapter* rtp_transport_controller) >- : kind_(kind), >- transport_(transport), >- rtp_transport_controller_(rtp_transport_controller) {} >- >-void OrtcRtpReceiverAdapter::MaybeRecreateInternalReceiver() { >- if (last_applied_parameters_.encodings.empty()) { >- internal_receiver_ = nullptr; >- return; >- } >- // An SSRC of 0 is valid; this is used to identify "the default SSRC" (which >- // is the first one seen by the underlying media engine). >- uint32_t ssrc = 0; >- if (last_applied_parameters_.encodings[0].ssrc) { >- ssrc = *last_applied_parameters_.encodings[0].ssrc; >- } >- if (internal_receiver_ && ssrc == internal_receiver_->ssrc()) { >- // SSRC not changing; nothing to do. 
>- return; >- } >- internal_receiver_ = nullptr; >- switch (kind_) { >- case cricket::MEDIA_TYPE_AUDIO: { >- auto* audio_receiver = new AudioRtpReceiver( >- rtp_transport_controller_->worker_thread(), rtc::CreateRandomUuid(), >- std::vector<std::string>({})); >- auto* voice_channel = rtp_transport_controller_->voice_channel(); >- RTC_DCHECK(voice_channel); >- audio_receiver->SetVoiceMediaChannel(voice_channel->media_channel()); >- internal_receiver_ = audio_receiver; >- break; >- } >- case cricket::MEDIA_TYPE_VIDEO: { >- auto* video_receiver = new VideoRtpReceiver( >- rtp_transport_controller_->worker_thread(), rtc::CreateRandomUuid(), >- std::vector<std::string>({})); >- auto* video_channel = rtp_transport_controller_->video_channel(); >- RTC_DCHECK(video_channel); >- video_receiver->SetVideoMediaChannel(video_channel->media_channel()); >- internal_receiver_ = video_receiver; >- break; >- } >- case cricket::MEDIA_TYPE_DATA: >- RTC_NOTREACHED(); >- } >- internal_receiver_->SetupMediaChannel(ssrc); >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcrtpreceiveradapter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcrtpreceiveradapter.h >deleted file mode 100644 >index 4647b043145adae4b02f87275133c7ebea59d3ce..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcrtpreceiveradapter.h >+++ /dev/null >@@ -1,79 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#ifndef ORTC_ORTCRTPRECEIVERADAPTER_H_ >-#define ORTC_ORTCRTPRECEIVERADAPTER_H_ >- >-#include <memory> >- >-#include "api/ortc/ortcrtpreceiverinterface.h" >-#include "api/rtcerror.h" >-#include "api/rtpparameters.h" >-#include "ortc/rtptransportcontrolleradapter.h" >-#include "pc/rtpreceiver.h" >-#include "rtc_base/constructormagic.h" >-#include "rtc_base/third_party/sigslot/sigslot.h" >-#include "rtc_base/thread.h" >- >-namespace webrtc { >- >-// Implementation of OrtcRtpReceiverInterface that works with >-// RtpTransportAdapter, and wraps a VideoRtpReceiver/AudioRtpReceiver that's >-// normally used with the PeerConnection. >-// >-// TODO(deadbeef): When BaseChannel is split apart into separate >-// "RtpReceiver"/"RtpTransceiver"/"RtpReceiver"/"RtpReceiver" objects, this >-// adapter object can be removed. >-class OrtcRtpReceiverAdapter : public OrtcRtpReceiverInterface { >- public: >- // Wraps |wrapped_receiver| in a proxy that will safely call methods on the >- // correct thread. >- static std::unique_ptr<OrtcRtpReceiverInterface> CreateProxy( >- std::unique_ptr<OrtcRtpReceiverAdapter> wrapped_receiver); >- >- // Should only be called by RtpTransportControllerAdapter. >- OrtcRtpReceiverAdapter( >- cricket::MediaType kind, >- RtpTransportInterface* transport, >- RtpTransportControllerAdapter* rtp_transport_controller); >- ~OrtcRtpReceiverAdapter() override; >- >- // OrtcRtpReceiverInterface implementation. >- rtc::scoped_refptr<MediaStreamTrackInterface> GetTrack() const override; >- >- RTCError SetTransport(RtpTransportInterface* transport) override; >- RtpTransportInterface* GetTransport() const override; >- >- RTCError Receive(const RtpParameters& parameters) override; >- RtpParameters GetParameters() const override; >- >- cricket::MediaType GetKind() const override; >- >- // Used so that the RtpTransportControllerAdapter knows when it can >- // deallocate resources allocated for this object. 
>- sigslot::signal0<> SignalDestroyed; >- >- private: >- void MaybeRecreateInternalReceiver(); >- >- cricket::MediaType kind_; >- RtpTransportInterface* transport_; >- RtpTransportControllerAdapter* rtp_transport_controller_; >- // Scoped refptr due to ref-counted interface, but we should be the only >- // reference holder. >- rtc::scoped_refptr<RtpReceiverInternal> internal_receiver_; >- RtpParameters last_applied_parameters_; >- >- RTC_DISALLOW_IMPLICIT_CONSTRUCTORS(OrtcRtpReceiverAdapter); >-}; >- >-} // namespace webrtc >- >-#endif // ORTC_ORTCRTPRECEIVERADAPTER_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcrtpsender_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcrtpsender_unittest.cc >deleted file mode 100644 >index c8bd8f93934538b51ac8f062fe0907e40031052d..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcrtpsender_unittest.cc >+++ /dev/null >@@ -1,670 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include <memory> >- >-#include "api/audio_codecs/builtin_audio_decoder_factory.h" >-#include "api/audio_codecs/builtin_audio_encoder_factory.h" >-#include "media/base/fakemediaengine.h" >-#include "ortc/ortcfactory.h" >-#include "ortc/testrtpparameters.h" >-#include "p2p/base/fakepackettransport.h" >-#include "pc/test/fakevideotracksource.h" >-#include "rtc_base/gunit.h" >- >-namespace webrtc { >- >-// This test uses an individual RtpSender using only the public interface, and >-// verifies that its behaves as designed at an API level. 
Also tests that >-// parameters are applied to the audio/video engines as expected. Network and >-// media interfaces are faked to isolate what's being tested. >-// >-// This test shouldn't result any any actual media being sent. That sort of >-// test should go in ortcfactory_integrationtest.cc. >-class OrtcRtpSenderTest : public testing::Test { >- public: >- OrtcRtpSenderTest() : fake_packet_transport_("fake") { >- // Need to set the fake packet transport to writable, in order to test that >- // the "send" flag is applied to the media engine based on the encoding >- // |active| flag. >- fake_packet_transport_.SetWritable(true); >- fake_media_engine_ = new cricket::FakeMediaEngine(); >- // Note: This doesn't need to use fake network classes, since we already >- // use FakePacketTransport. >- auto ortc_factory_result = OrtcFactory::Create( >- nullptr, nullptr, nullptr, nullptr, nullptr, >- std::unique_ptr<cricket::MediaEngineInterface>(fake_media_engine_), >- CreateBuiltinAudioEncoderFactory(), CreateBuiltinAudioDecoderFactory()); >- ortc_factory_ = ortc_factory_result.MoveValue(); >- RtpTransportParameters parameters; >- parameters.rtcp.mux = true; >- auto rtp_transport_result = ortc_factory_->CreateRtpTransport( >- parameters, &fake_packet_transport_, nullptr, nullptr); >- rtp_transport_ = rtp_transport_result.MoveValue(); >- } >- >- protected: >- rtc::scoped_refptr<AudioTrackInterface> CreateAudioTrack( >- const std::string& id) { >- return ortc_factory_->CreateAudioTrack(id, nullptr); >- } >- >- rtc::scoped_refptr<VideoTrackInterface> CreateVideoTrack( >- const std::string& id) { >- return rtc::scoped_refptr<webrtc::VideoTrackInterface>( >- ortc_factory_->CreateVideoTrack(id, FakeVideoTrackSource::Create())); >- } >- >- // Owned by |ortc_factory_|. 
>- cricket::FakeMediaEngine* fake_media_engine_; >- rtc::FakePacketTransport fake_packet_transport_; >- std::unique_ptr<OrtcFactoryInterface> ortc_factory_; >- std::unique_ptr<RtpTransportInterface> rtp_transport_; >-}; >- >-TEST_F(OrtcRtpSenderTest, GetAndSetTrack) { >- // Test GetTrack with a sender constructed with a track. >- auto audio_track = CreateAudioTrack("audio"); >- auto audio_sender_result = >- ortc_factory_->CreateRtpSender(audio_track, rtp_transport_.get()); >- auto audio_sender = audio_sender_result.MoveValue(); >- EXPECT_EQ(audio_track, audio_sender->GetTrack()); >- >- // Test GetTrack after SetTrack. >- auto video_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_VIDEO, rtp_transport_.get()); >- auto video_sender = video_sender_result.MoveValue(); >- auto video_track = CreateVideoTrack("video1"); >- EXPECT_TRUE(video_sender->SetTrack(video_track).ok()); >- EXPECT_EQ(video_track, video_sender->GetTrack()); >- video_track = CreateVideoTrack("video2"); >- EXPECT_TRUE(video_sender->SetTrack(video_track).ok()); >- EXPECT_EQ(video_track, video_sender->GetTrack()); >-} >- >-// Test that track can be set when previously unset, even after Send has been >-// called. >-TEST_F(OrtcRtpSenderTest, SetTrackWhileSending) { >- auto audio_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_AUDIO, rtp_transport_.get()); >- auto audio_sender = audio_sender_result.MoveValue(); >- EXPECT_TRUE(audio_sender->Send(MakeMinimalOpusParameters()).ok()); >- EXPECT_TRUE(audio_sender->SetTrack(CreateAudioTrack("audio")).ok()); >- >- auto video_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_VIDEO, rtp_transport_.get()); >- auto video_sender = video_sender_result.MoveValue(); >- EXPECT_TRUE(video_sender->Send(MakeMinimalVp8Parameters()).ok()); >- EXPECT_TRUE(video_sender->SetTrack(CreateVideoTrack("video")).ok()); >-} >- >-// Test that track can be changed mid-sending. 
Differs from the above test in >-// that the track is set and being changed, rather than unset and being set for >-// the first time. >-TEST_F(OrtcRtpSenderTest, ChangeTrackWhileSending) { >- auto audio_sender_result = ortc_factory_->CreateRtpSender( >- CreateAudioTrack("audio1"), rtp_transport_.get()); >- auto audio_sender = audio_sender_result.MoveValue(); >- EXPECT_TRUE(audio_sender->Send(MakeMinimalOpusParameters()).ok()); >- EXPECT_TRUE(audio_sender->SetTrack(CreateAudioTrack("audio2")).ok()); >- >- auto video_sender_result = ortc_factory_->CreateRtpSender( >- CreateVideoTrack("video1"), rtp_transport_.get()); >- auto video_sender = video_sender_result.MoveValue(); >- EXPECT_TRUE(video_sender->Send(MakeMinimalVp8Parameters()).ok()); >- EXPECT_TRUE(video_sender->SetTrack(CreateVideoTrack("video2")).ok()); >-} >- >-// Test that track can be set to null while sending. >-TEST_F(OrtcRtpSenderTest, UnsetTrackWhileSending) { >- auto audio_sender_result = ortc_factory_->CreateRtpSender( >- CreateAudioTrack("audio"), rtp_transport_.get()); >- auto audio_sender = audio_sender_result.MoveValue(); >- EXPECT_TRUE(audio_sender->Send(MakeMinimalOpusParameters()).ok()); >- EXPECT_TRUE(audio_sender->SetTrack(nullptr).ok()); >- >- auto video_sender_result = ortc_factory_->CreateRtpSender( >- CreateVideoTrack("video"), rtp_transport_.get()); >- auto video_sender = video_sender_result.MoveValue(); >- EXPECT_TRUE(video_sender->Send(MakeMinimalVp8Parameters()).ok()); >- EXPECT_TRUE(video_sender->SetTrack(nullptr).ok()); >-} >- >-// Shouldn't be able to set an audio track on a video sender or vice versa. 
>-TEST_F(OrtcRtpSenderTest, SetTrackOfWrongKindFails) { >- auto audio_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_AUDIO, rtp_transport_.get()); >- auto audio_sender = audio_sender_result.MoveValue(); >- EXPECT_EQ(RTCErrorType::INVALID_PARAMETER, >- audio_sender->SetTrack(CreateVideoTrack("video")).type()); >- >- auto video_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_VIDEO, rtp_transport_.get()); >- auto video_sender = video_sender_result.MoveValue(); >- EXPECT_EQ(RTCErrorType::INVALID_PARAMETER, >- video_sender->SetTrack(CreateAudioTrack("audio")).type()); >-} >- >-// Currently SetTransport isn't supported. When it is, replace this test with a >-// test/tests for it. >-TEST_F(OrtcRtpSenderTest, SetTransportFails) { >- rtc::FakePacketTransport fake_packet_transport("another_transport"); >- RtpTransportParameters parameters; >- parameters.rtcp.mux = true; >- auto rtp_transport_result = ortc_factory_->CreateRtpTransport( >- parameters, &fake_packet_transport, nullptr, nullptr); >- auto rtp_transport = rtp_transport_result.MoveValue(); >- >- auto sender_result = ortc_factory_->CreateRtpSender(cricket::MEDIA_TYPE_AUDIO, >- rtp_transport_.get()); >- auto sender = sender_result.MoveValue(); >- EXPECT_EQ(RTCErrorType::UNSUPPORTED_OPERATION, >- sender->SetTransport(rtp_transport.get()).type()); >-} >- >-TEST_F(OrtcRtpSenderTest, GetTransport) { >- auto result = ortc_factory_->CreateRtpSender(cricket::MEDIA_TYPE_AUDIO, >- rtp_transport_.get()); >- EXPECT_EQ(rtp_transport_.get(), result.value()->GetTransport()); >-} >- >-// Test that "Send" causes the expected parameters to be applied to the media >-// engine level, for an audio sender. 
>-TEST_F(OrtcRtpSenderTest, SendAppliesAudioParametersToMediaEngine) { >- auto audio_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_AUDIO, rtp_transport_.get()); >- auto audio_sender = audio_sender_result.MoveValue(); >- >- // First, create parameters with all the bells and whistles. >- RtpParameters parameters; >- >- RtpCodecParameters opus_codec; >- opus_codec.name = "opus"; >- opus_codec.kind = cricket::MEDIA_TYPE_AUDIO; >- opus_codec.payload_type = 120; >- opus_codec.clock_rate.emplace(48000); >- opus_codec.num_channels.emplace(2); >- opus_codec.parameters["minptime"] = "10"; >- opus_codec.rtcp_feedback.emplace_back(RtcpFeedbackType::TRANSPORT_CC); >- parameters.codecs.push_back(std::move(opus_codec)); >- >- // Add two codecs, expecting the first to be used. >- // TODO(deadbeef): Once "codec_payload_type" is supported, use it to select a >- // codec that's not at the top of the list. >- RtpCodecParameters isac_codec; >- isac_codec.name = "ISAC"; >- isac_codec.kind = cricket::MEDIA_TYPE_AUDIO; >- isac_codec.payload_type = 110; >- isac_codec.clock_rate.emplace(16000); >- parameters.codecs.push_back(std::move(isac_codec)); >- >- RtpEncodingParameters encoding; >- encoding.ssrc.emplace(0xdeadbeef); >- encoding.max_bitrate_bps.emplace(20000); >- parameters.encodings.push_back(std::move(encoding)); >- >- parameters.header_extensions.emplace_back( >- "urn:ietf:params:rtp-hdrext:ssrc-audio-level", 3); >- >- EXPECT_TRUE(audio_sender->Send(parameters).ok()); >- >- // Now verify that the parameters were applied to the fake media engine layer >- // that exists below BaseChannel. >- cricket::FakeVoiceMediaChannel* fake_voice_channel = >- fake_media_engine_->GetVoiceChannel(0); >- ASSERT_NE(nullptr, fake_voice_channel); >- EXPECT_TRUE(fake_voice_channel->sending()); >- >- // Verify codec parameters. 
>- ASSERT_GT(fake_voice_channel->send_codecs().size(), 0u); >- const cricket::AudioCodec& top_codec = fake_voice_channel->send_codecs()[0]; >- EXPECT_EQ("opus", top_codec.name); >- EXPECT_EQ(120, top_codec.id); >- EXPECT_EQ(48000, top_codec.clockrate); >- EXPECT_EQ(2u, top_codec.channels); >- ASSERT_NE(top_codec.params.end(), top_codec.params.find("minptime")); >- EXPECT_EQ("10", top_codec.params.at("minptime")); >- >- // Verify encoding parameters. >- EXPECT_EQ(20000, fake_voice_channel->max_bps()); >- ASSERT_EQ(1u, fake_voice_channel->send_streams().size()); >- const cricket::StreamParams& send_stream = >- fake_voice_channel->send_streams()[0]; >- EXPECT_EQ(1u, send_stream.ssrcs.size()); >- EXPECT_EQ(0xdeadbeef, send_stream.first_ssrc()); >- >- // Verify header extensions. >- ASSERT_EQ(1u, fake_voice_channel->send_extensions().size()); >- const RtpExtension& extension = fake_voice_channel->send_extensions()[0]; >- EXPECT_EQ("urn:ietf:params:rtp-hdrext:ssrc-audio-level", extension.uri); >- EXPECT_EQ(3, extension.id); >-} >- >-// Test that "Send" causes the expected parameters to be applied to the media >-// engine level, for a video sender. >-TEST_F(OrtcRtpSenderTest, SendAppliesVideoParametersToMediaEngine) { >- auto video_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_VIDEO, rtp_transport_.get()); >- auto video_sender = video_sender_result.MoveValue(); >- >- // First, create parameters with all the bells and whistles. >- RtpParameters parameters; >- >- RtpCodecParameters vp8_codec; >- vp8_codec.name = "VP8"; >- vp8_codec.kind = cricket::MEDIA_TYPE_VIDEO; >- vp8_codec.payload_type = 99; >- // Try a couple types of feedback params. "Generic NACK" is a bit of a >- // special case, so test it here. 
>- vp8_codec.rtcp_feedback.emplace_back(RtcpFeedbackType::CCM, >- RtcpFeedbackMessageType::FIR); >- vp8_codec.rtcp_feedback.emplace_back(RtcpFeedbackType::NACK, >- RtcpFeedbackMessageType::GENERIC_NACK); >- parameters.codecs.push_back(std::move(vp8_codec)); >- >- RtpCodecParameters vp8_rtx_codec; >- vp8_rtx_codec.name = "rtx"; >- vp8_rtx_codec.kind = cricket::MEDIA_TYPE_VIDEO; >- vp8_rtx_codec.payload_type = 100; >- vp8_rtx_codec.parameters["apt"] = "99"; >- parameters.codecs.push_back(std::move(vp8_rtx_codec)); >- >- // Add two codecs, expecting the first to be used. >- // TODO(deadbeef): Once "codec_payload_type" is supported, use it to select a >- // codec that's not at the top of the list. >- RtpCodecParameters vp9_codec; >- vp9_codec.name = "VP9"; >- vp9_codec.kind = cricket::MEDIA_TYPE_VIDEO; >- vp9_codec.payload_type = 102; >- parameters.codecs.push_back(std::move(vp9_codec)); >- >- RtpCodecParameters vp9_rtx_codec; >- vp9_rtx_codec.name = "rtx"; >- vp9_rtx_codec.kind = cricket::MEDIA_TYPE_VIDEO; >- vp9_rtx_codec.payload_type = 103; >- vp9_rtx_codec.parameters["apt"] = "102"; >- parameters.codecs.push_back(std::move(vp9_rtx_codec)); >- >- RtpEncodingParameters encoding; >- encoding.ssrc.emplace(0xdeadbeef); >- encoding.rtx.emplace(0xbaadfeed); >- encoding.max_bitrate_bps.emplace(99999); >- parameters.encodings.push_back(std::move(encoding)); >- >- parameters.header_extensions.emplace_back("urn:3gpp:video-orientation", 4); >- parameters.header_extensions.emplace_back( >- "http://www.webrtc.org/experiments/rtp-hdrext/playout-delay", 6); >- >- EXPECT_TRUE(video_sender->Send(parameters).ok()); >- >- // Now verify that the parameters were applied to the fake media engine layer >- // that exists below BaseChannel. >- cricket::FakeVideoMediaChannel* fake_video_channel = >- fake_media_engine_->GetVideoChannel(0); >- ASSERT_NE(nullptr, fake_video_channel); >- EXPECT_TRUE(fake_video_channel->sending()); >- >- // Verify codec parameters. 
>- ASSERT_GE(fake_video_channel->send_codecs().size(), 2u); >- const cricket::VideoCodec& top_codec = fake_video_channel->send_codecs()[0]; >- EXPECT_EQ("VP8", top_codec.name); >- EXPECT_EQ(99, top_codec.id); >- EXPECT_TRUE(top_codec.feedback_params.Has({"ccm", "fir"})); >- EXPECT_TRUE(top_codec.feedback_params.Has(cricket::FeedbackParam("nack"))); >- >- const cricket::VideoCodec& rtx_codec = fake_video_channel->send_codecs()[1]; >- EXPECT_EQ("rtx", rtx_codec.name); >- EXPECT_EQ(100, rtx_codec.id); >- ASSERT_NE(rtx_codec.params.end(), rtx_codec.params.find("apt")); >- EXPECT_EQ("99", rtx_codec.params.at("apt")); >- >- // Verify encoding parameters. >- EXPECT_EQ(99999, fake_video_channel->max_bps()); >- ASSERT_EQ(1u, fake_video_channel->send_streams().size()); >- const cricket::StreamParams& send_stream = >- fake_video_channel->send_streams()[0]; >- EXPECT_EQ(2u, send_stream.ssrcs.size()); >- EXPECT_EQ(0xdeadbeef, send_stream.first_ssrc()); >- uint32_t rtx_ssrc = 0u; >- EXPECT_TRUE(send_stream.GetFidSsrc(send_stream.first_ssrc(), &rtx_ssrc)); >- EXPECT_EQ(0xbaadfeed, rtx_ssrc); >- >- // Verify header extensions. >- ASSERT_EQ(2u, fake_video_channel->send_extensions().size()); >- const RtpExtension& extension1 = fake_video_channel->send_extensions()[0]; >- EXPECT_EQ("urn:3gpp:video-orientation", extension1.uri); >- EXPECT_EQ(4, extension1.id); >- const RtpExtension& extension2 = fake_video_channel->send_extensions()[1]; >- EXPECT_EQ("http://www.webrtc.org/experiments/rtp-hdrext/playout-delay", >- extension2.uri); >- EXPECT_EQ(6, extension2.id); >-} >- >-// Ensure that when primary or RTX SSRCs are left unset, they're generated >-// automatically. 
>-TEST_F(OrtcRtpSenderTest, SendGeneratesSsrcsWhenEmpty) { >- auto audio_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_AUDIO, rtp_transport_.get()); >- auto audio_sender = audio_sender_result.MoveValue(); >- RtpParameters parameters = MakeMinimalOpusParametersWithNoSsrc(); >- // Default RTX parameters, with no SSRC. >- parameters.encodings[0].rtx.emplace(); >- EXPECT_TRUE(audio_sender->Send(parameters).ok()); >- >- cricket::FakeVoiceMediaChannel* fake_voice_channel = >- fake_media_engine_->GetVoiceChannel(0); >- ASSERT_NE(nullptr, fake_voice_channel); >- ASSERT_EQ(1u, fake_voice_channel->send_streams().size()); >- const cricket::StreamParams& audio_send_stream = >- fake_voice_channel->send_streams()[0]; >- EXPECT_NE(0u, audio_send_stream.first_ssrc()); >- uint32_t rtx_ssrc = 0u; >- EXPECT_TRUE( >- audio_send_stream.GetFidSsrc(audio_send_stream.first_ssrc(), &rtx_ssrc)); >- EXPECT_NE(0u, rtx_ssrc); >- EXPECT_NE(audio_send_stream.first_ssrc(), rtx_ssrc); >- >- auto video_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_VIDEO, rtp_transport_.get()); >- auto video_sender = video_sender_result.MoveValue(); >- parameters = MakeMinimalVp8ParametersWithNoSsrc(); >- // Default RTX parameters, with no SSRC. 
>- parameters.encodings[0].rtx.emplace(); >- EXPECT_TRUE(video_sender->Send(parameters).ok()); >- >- cricket::FakeVideoMediaChannel* fake_video_channel = >- fake_media_engine_->GetVideoChannel(0); >- ASSERT_NE(nullptr, fake_video_channel); >- ASSERT_EQ(1u, fake_video_channel->send_streams().size()); >- const cricket::StreamParams& video_send_stream = >- fake_video_channel->send_streams()[0]; >- EXPECT_NE(0u, video_send_stream.first_ssrc()); >- rtx_ssrc = 0u; >- EXPECT_TRUE( >- video_send_stream.GetFidSsrc(video_send_stream.first_ssrc(), &rtx_ssrc)); >- EXPECT_NE(0u, rtx_ssrc); >- EXPECT_NE(video_send_stream.first_ssrc(), rtx_ssrc); >- EXPECT_NE(video_send_stream.first_ssrc(), audio_send_stream.first_ssrc()); >-} >- >-// Test changing both the send codec and SSRC at the same time, and verify that >-// the new parameters are applied to the media engine level. >-TEST_F(OrtcRtpSenderTest, CallingSendTwiceChangesParameters) { >- auto audio_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_AUDIO, rtp_transport_.get()); >- auto audio_sender = audio_sender_result.MoveValue(); >- EXPECT_TRUE( >- audio_sender->Send(MakeMinimalOpusParametersWithSsrc(0x11111111)).ok()); >- EXPECT_TRUE( >- audio_sender->Send(MakeMinimalIsacParametersWithSsrc(0x22222222)).ok()); >- >- cricket::FakeVoiceMediaChannel* fake_voice_channel = >- fake_media_engine_->GetVoiceChannel(0); >- ASSERT_NE(nullptr, fake_voice_channel); >- ASSERT_GT(fake_voice_channel->send_codecs().size(), 0u); >- EXPECT_EQ("ISAC", fake_voice_channel->send_codecs()[0].name); >- ASSERT_EQ(1u, fake_voice_channel->send_streams().size()); >- EXPECT_EQ(0x22222222u, fake_voice_channel->send_streams()[0].first_ssrc()); >- >- auto video_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_VIDEO, rtp_transport_.get()); >- auto video_sender = video_sender_result.MoveValue(); >- EXPECT_TRUE( >- video_sender->Send(MakeMinimalVp8ParametersWithSsrc(0x33333333)).ok()); >- EXPECT_TRUE( >- 
video_sender->Send(MakeMinimalVp9ParametersWithSsrc(0x44444444)).ok()); >- >- cricket::FakeVideoMediaChannel* fake_video_channel = >- fake_media_engine_->GetVideoChannel(0); >- ASSERT_NE(nullptr, fake_video_channel); >- ASSERT_GT(fake_video_channel->send_codecs().size(), 0u); >- EXPECT_EQ("VP9", fake_video_channel->send_codecs()[0].name); >- ASSERT_EQ(1u, fake_video_channel->send_streams().size()); >- EXPECT_EQ(0x44444444u, fake_video_channel->send_streams()[0].first_ssrc()); >-} >- >-// Ensure that if the |active| flag of RtpEncodingParameters is set to false, >-// sending stops at the media engine level. >-TEST_F(OrtcRtpSenderTest, DeactivatingEncodingStopsSending) { >- auto audio_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_AUDIO, rtp_transport_.get()); >- auto audio_sender = audio_sender_result.MoveValue(); >- RtpParameters parameters = MakeMinimalOpusParameters(); >- EXPECT_TRUE(audio_sender->Send(parameters).ok()); >- >- // Expect "sending" flag to initially be true. >- cricket::FakeVoiceMediaChannel* fake_voice_channel = >- fake_media_engine_->GetVoiceChannel(0); >- ASSERT_NE(nullptr, fake_voice_channel); >- EXPECT_TRUE(fake_voice_channel->sending()); >- >- // Deactivate encoding and expect it to change to false. >- parameters.encodings[0].active = false; >- EXPECT_TRUE(audio_sender->Send(parameters).ok()); >- EXPECT_FALSE(fake_voice_channel->sending()); >- >- // Try the same thing for video now. 
>- auto video_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_VIDEO, rtp_transport_.get()); >- auto video_sender = video_sender_result.MoveValue(); >- parameters = MakeMinimalVp8Parameters(); >- EXPECT_TRUE(video_sender->Send(parameters).ok()); >- >- cricket::FakeVideoMediaChannel* fake_video_channel = >- fake_media_engine_->GetVideoChannel(0); >- ASSERT_NE(nullptr, fake_video_channel); >- EXPECT_TRUE(fake_video_channel->sending()); >- >- parameters.encodings[0].active = false; >- EXPECT_TRUE(video_sender->Send(parameters).ok()); >- EXPECT_FALSE(fake_video_channel->sending()); >-} >- >-// Ensure that calling Send with an empty list of encodings causes send streams >-// at the media engine level to be cleared. >-TEST_F(OrtcRtpSenderTest, CallingSendWithEmptyEncodingsClearsSendStreams) { >- auto audio_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_AUDIO, rtp_transport_.get()); >- auto audio_sender = audio_sender_result.MoveValue(); >- RtpParameters parameters = MakeMinimalOpusParameters(); >- EXPECT_TRUE(audio_sender->Send(parameters).ok()); >- parameters.encodings.clear(); >- EXPECT_TRUE(audio_sender->Send(parameters).ok()); >- >- cricket::FakeVoiceMediaChannel* fake_voice_channel = >- fake_media_engine_->GetVoiceChannel(0); >- ASSERT_NE(nullptr, fake_voice_channel); >- EXPECT_TRUE(fake_voice_channel->send_streams().empty()); >- >- auto video_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_VIDEO, rtp_transport_.get()); >- auto video_sender = video_sender_result.MoveValue(); >- parameters = MakeMinimalVp8Parameters(); >- EXPECT_TRUE(video_sender->Send(parameters).ok()); >- parameters.encodings.clear(); >- EXPECT_TRUE(video_sender->Send(parameters).ok()); >- >- cricket::FakeVideoMediaChannel* fake_video_channel = >- fake_media_engine_->GetVideoChannel(0); >- ASSERT_NE(nullptr, fake_video_channel); >- EXPECT_TRUE(fake_video_channel->send_streams().empty()); >-} >- >-// These errors should be covered 
by rtpparametersconversion_unittest.cc, but >-// we should at least test that those errors are propogated from calls to Send, >-// with a few examples. >-TEST_F(OrtcRtpSenderTest, SendReturnsErrorOnInvalidParameters) { >- auto result = ortc_factory_->CreateRtpSender(cricket::MEDIA_TYPE_VIDEO, >- rtp_transport_.get()); >- auto sender = result.MoveValue(); >- // NACK feedback missing message type. >- RtpParameters invalid_feedback = MakeMinimalVp8Parameters(); >- invalid_feedback.codecs[0].rtcp_feedback.emplace_back(RtcpFeedbackType::NACK); >- EXPECT_EQ(RTCErrorType::INVALID_PARAMETER, >- sender->Send(invalid_feedback).type()); >- // Negative payload type. >- RtpParameters invalid_pt = MakeMinimalVp8Parameters(); >- invalid_pt.codecs[0].payload_type = -1; >- EXPECT_EQ(RTCErrorType::INVALID_RANGE, sender->Send(invalid_pt).type()); >- // Duplicate codec payload types. >- RtpParameters duplicate_payload_types = MakeMinimalVp8Parameters(); >- duplicate_payload_types.codecs.push_back(duplicate_payload_types.codecs[0]); >- duplicate_payload_types.codecs.back().name = "VP9"; >- EXPECT_EQ(RTCErrorType::INVALID_PARAMETER, >- sender->Send(duplicate_payload_types).type()); >-} >- >-// Two senders using the same transport shouldn't be able to use the same >-// payload type to refer to different codecs, same header extension IDs to >-// refer to different extensions, or same SSRC. >-TEST_F(OrtcRtpSenderTest, SendReturnsErrorOnIdConflicts) { >- auto audio_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_AUDIO, rtp_transport_.get()); >- auto video_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_VIDEO, rtp_transport_.get()); >- auto audio_sender = audio_sender_result.MoveValue(); >- auto video_sender = video_sender_result.MoveValue(); >- >- // First test payload type conflict. 
>- RtpParameters audio_parameters = MakeMinimalOpusParameters(); >- RtpParameters video_parameters = MakeMinimalVp8Parameters(); >- audio_parameters.codecs[0].payload_type = 100; >- video_parameters.codecs[0].payload_type = 100; >- EXPECT_TRUE(audio_sender->Send(audio_parameters).ok()); >- EXPECT_EQ(RTCErrorType::INVALID_PARAMETER, >- video_sender->Send(video_parameters).type()); >- >- // Test header extension ID conflict. >- video_parameters.codecs[0].payload_type = 110; >- audio_parameters.header_extensions.emplace_back("foo", 4); >- video_parameters.header_extensions.emplace_back("bar", 4); >- EXPECT_TRUE(audio_sender->Send(audio_parameters).ok()); >- EXPECT_EQ(RTCErrorType::INVALID_PARAMETER, >- video_sender->Send(video_parameters).type()); >- >- // Test SSRC conflict. Have an RTX SSRC that conflicts with a primary SSRC >- // for extra challenge. >- video_parameters.header_extensions[0].uri = "foo"; >- audio_parameters.encodings[0].ssrc.emplace(0xdeadbeef); >- video_parameters.encodings[0].ssrc.emplace(0xabbaabba); >- video_parameters.encodings[0].rtx.emplace(0xdeadbeef); >- EXPECT_TRUE(audio_sender->Send(audio_parameters).ok()); >- EXPECT_EQ(RTCErrorType::INVALID_PARAMETER, >- video_sender->Send(video_parameters).type()); >- >- // Sanity check that parameters can be set if the conflicts are all resolved. >- video_parameters.encodings[0].rtx->ssrc.emplace(0xbaadf00d); >- EXPECT_TRUE(video_sender->Send(video_parameters).ok()); >-} >- >-// Ensure that deleting a sender causes send streams at the media engine level >-// to be cleared. >-TEST_F(OrtcRtpSenderTest, DeletingSenderClearsSendStreams) { >- auto audio_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_AUDIO, rtp_transport_.get()); >- auto audio_sender = audio_sender_result.MoveValue(); >- EXPECT_TRUE(audio_sender->Send(MakeMinimalOpusParameters()).ok()); >- >- // Also create an audio receiver, to prevent the voice channel from being >- // completely deleted. 
>- auto audio_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_AUDIO, rtp_transport_.get()); >- auto audio_receiver = audio_receiver_result.MoveValue(); >- EXPECT_TRUE(audio_receiver->Receive(MakeMinimalOpusParameters()).ok()); >- >- audio_sender.reset(nullptr); >- cricket::FakeVoiceMediaChannel* fake_voice_channel = >- fake_media_engine_->GetVoiceChannel(0); >- ASSERT_NE(nullptr, fake_voice_channel); >- EXPECT_TRUE(fake_voice_channel->send_streams().empty()); >- >- auto video_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_VIDEO, rtp_transport_.get()); >- auto video_sender = video_sender_result.MoveValue(); >- EXPECT_TRUE(video_sender->Send(MakeMinimalVp8Parameters()).ok()); >- >- // Also create an video receiver, to prevent the video channel from being >- // completely deleted. >- auto video_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_VIDEO, rtp_transport_.get()); >- auto video_receiver = video_receiver_result.MoveValue(); >- EXPECT_TRUE(video_receiver->Receive(MakeMinimalVp8Parameters()).ok()); >- >- video_sender.reset(nullptr); >- cricket::FakeVideoMediaChannel* fake_video_channel = >- fake_media_engine_->GetVideoChannel(0); >- ASSERT_NE(nullptr, fake_video_channel); >- EXPECT_TRUE(fake_video_channel->send_streams().empty()); >-} >- >-// If Send hasn't been called, GetParameters should return empty parameters. >-TEST_F(OrtcRtpSenderTest, GetDefaultParameters) { >- auto result = ortc_factory_->CreateRtpSender(cricket::MEDIA_TYPE_AUDIO, >- rtp_transport_.get()); >- EXPECT_EQ(RtpParameters(), result.value()->GetParameters()); >- result = ortc_factory_->CreateRtpSender(cricket::MEDIA_TYPE_VIDEO, >- rtp_transport_.get()); >- EXPECT_EQ(RtpParameters(), result.value()->GetParameters()); >-} >- >-// Test that GetParameters returns the last parameters passed into Send, along >-// with the implementation-default values filled in where they were left unset. 
>-TEST_F(OrtcRtpSenderTest, >- GetParametersReturnsLastSetParametersWithDefaultsFilled) { >- auto audio_sender_result = ortc_factory_->CreateRtpSender( >- CreateAudioTrack("audio"), rtp_transport_.get()); >- auto audio_sender = audio_sender_result.MoveValue(); >- >- RtpParameters opus_parameters = MakeMinimalOpusParameters(); >- EXPECT_TRUE(audio_sender->Send(opus_parameters).ok()); >- EXPECT_EQ(opus_parameters, audio_sender->GetParameters()); >- >- RtpParameters isac_parameters = MakeMinimalIsacParameters(); >- // Sanity check that num_channels actually is left unset. >- ASSERT_FALSE(isac_parameters.codecs[0].num_channels); >- EXPECT_TRUE(audio_sender->Send(isac_parameters).ok()); >- // Should be filled with a default "num channels" of 1. >- // TODO(deadbeef): This should actually default to 2 for some codecs. Update >- // this test once that's implemented. >- isac_parameters.codecs[0].num_channels.emplace(1); >- EXPECT_EQ(isac_parameters, audio_sender->GetParameters()); >- >- auto video_sender_result = ortc_factory_->CreateRtpSender( >- CreateVideoTrack("video"), rtp_transport_.get()); >- auto video_sender = video_sender_result.MoveValue(); >- >- RtpParameters vp8_parameters = MakeMinimalVp8Parameters(); >- // Sanity check that clock_rate actually is left unset. >- EXPECT_TRUE(video_sender->Send(vp8_parameters).ok()); >- // Should be filled with a default clock rate of 90000. >- vp8_parameters.codecs[0].clock_rate.emplace(90000); >- EXPECT_EQ(vp8_parameters, video_sender->GetParameters()); >- >- RtpParameters vp9_parameters = MakeMinimalVp9Parameters(); >- // Sanity check that clock_rate actually is left unset. >- EXPECT_TRUE(video_sender->Send(vp9_parameters).ok()); >- // Should be filled with a default clock rate of 90000. >- vp9_parameters.codecs[0].clock_rate.emplace(90000); >- EXPECT_EQ(vp9_parameters, video_sender->GetParameters()); >-} >- >-TEST_F(OrtcRtpSenderTest, GetKind) { >- // Construct one sender from the "kind" enum and another from a track. 
>- auto audio_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_AUDIO, rtp_transport_.get()); >- auto video_sender_result = ortc_factory_->CreateRtpSender( >- CreateVideoTrack("video"), rtp_transport_.get()); >- auto audio_sender = audio_sender_result.MoveValue(); >- auto video_sender = video_sender_result.MoveValue(); >- EXPECT_EQ(cricket::MEDIA_TYPE_AUDIO, audio_sender->GetKind()); >- EXPECT_EQ(cricket::MEDIA_TYPE_VIDEO, video_sender->GetKind()); >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcrtpsenderadapter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcrtpsenderadapter.cc >deleted file mode 100644 >index f2541cb8f85f3a64ae1a0f4594a42755de51cbd4..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcrtpsenderadapter.cc >+++ /dev/null >@@ -1,188 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include "ortc/ortcrtpsenderadapter.h" >- >-#include <utility> >- >-#include "media/base/mediaconstants.h" >-#include "ortc/rtptransportadapter.h" >-#include "rtc_base/checks.h" >- >-namespace { >- >-void FillAudioSenderParameters(webrtc::RtpParameters* parameters) { >- for (webrtc::RtpCodecParameters& codec : parameters->codecs) { >- if (!codec.num_channels) { >- codec.num_channels = 1; >- } >- } >-} >- >-void FillVideoSenderParameters(webrtc::RtpParameters* parameters) { >- for (webrtc::RtpCodecParameters& codec : parameters->codecs) { >- if (!codec.clock_rate) { >- codec.clock_rate = cricket::kVideoCodecClockrate; >- } >- } >-} >- >-} // namespace >- >-namespace webrtc { >- >-BEGIN_OWNED_PROXY_MAP(OrtcRtpSender) >-PROXY_SIGNALING_THREAD_DESTRUCTOR() >-PROXY_METHOD1(RTCError, SetTrack, MediaStreamTrackInterface*) >-PROXY_CONSTMETHOD0(rtc::scoped_refptr<MediaStreamTrackInterface>, GetTrack) >-PROXY_METHOD1(RTCError, SetTransport, RtpTransportInterface*) >-PROXY_CONSTMETHOD0(RtpTransportInterface*, GetTransport) >-PROXY_METHOD1(RTCError, Send, const RtpParameters&) >-PROXY_CONSTMETHOD0(RtpParameters, GetParameters) >-PROXY_CONSTMETHOD0(cricket::MediaType, GetKind) >-END_PROXY_MAP() >- >-// static >-std::unique_ptr<OrtcRtpSenderInterface> OrtcRtpSenderAdapter::CreateProxy( >- std::unique_ptr<OrtcRtpSenderAdapter> wrapped_sender) { >- RTC_DCHECK(wrapped_sender); >- rtc::Thread* signaling = >- wrapped_sender->rtp_transport_controller_->signaling_thread(); >- rtc::Thread* worker = >- wrapped_sender->rtp_transport_controller_->worker_thread(); >- return OrtcRtpSenderProxy::Create(signaling, worker, >- std::move(wrapped_sender)); >-} >- >-OrtcRtpSenderAdapter::~OrtcRtpSenderAdapter() { >- internal_sender_ = nullptr; >- SignalDestroyed(); >-} >- >-RTCError OrtcRtpSenderAdapter::SetTrack(MediaStreamTrackInterface* track) { >- if (track && cricket::MediaTypeFromString(track->kind()) != kind_) { >- LOG_AND_RETURN_ERROR( >- RTCErrorType::INVALID_PARAMETER, >- 
"Track kind (audio/video) doesn't match the kind of this sender."); >- } >- if (internal_sender_ && !internal_sender_->SetTrack(track)) { >- // Since we checked the track type above, this should never happen... >- RTC_NOTREACHED(); >- LOG_AND_RETURN_ERROR(RTCErrorType::INTERNAL_ERROR, >- "Failed to set track on RtpSender."); >- } >- track_ = track; >- return RTCError::OK(); >-} >- >-rtc::scoped_refptr<MediaStreamTrackInterface> OrtcRtpSenderAdapter::GetTrack() >- const { >- return track_; >-} >- >-RTCError OrtcRtpSenderAdapter::SetTransport(RtpTransportInterface* transport) { >- LOG_AND_RETURN_ERROR( >- RTCErrorType::UNSUPPORTED_OPERATION, >- "Changing the transport of an RtpSender is not yet supported."); >-} >- >-RtpTransportInterface* OrtcRtpSenderAdapter::GetTransport() const { >- return transport_; >-} >- >-RTCError OrtcRtpSenderAdapter::Send(const RtpParameters& parameters) { >- RtpParameters filled_parameters = parameters; >- RTCError err; >- uint32_t ssrc = 0; >- switch (kind_) { >- case cricket::MEDIA_TYPE_AUDIO: >- FillAudioSenderParameters(&filled_parameters); >- err = rtp_transport_controller_->ValidateAndApplyAudioSenderParameters( >- filled_parameters, &ssrc); >- if (!err.ok()) { >- return err; >- } >- break; >- case cricket::MEDIA_TYPE_VIDEO: >- FillVideoSenderParameters(&filled_parameters); >- err = rtp_transport_controller_->ValidateAndApplyVideoSenderParameters( >- filled_parameters, &ssrc); >- if (!err.ok()) { >- return err; >- } >- break; >- case cricket::MEDIA_TYPE_DATA: >- RTC_NOTREACHED(); >- return webrtc::RTCError(webrtc::RTCErrorType::INTERNAL_ERROR); >- } >- last_applied_parameters_ = filled_parameters; >- >- // Now that parameters were applied, can call SetSsrc on the internal sender. >- // This is analogous to a PeerConnection calling SetSsrc after >- // SetLocalDescription is successful. >- // >- // If there were no encodings, this SSRC may be 0, which is valid. 
>- if (!internal_sender_) { >- CreateInternalSender(); >- } >- internal_sender_->SetSsrc(ssrc); >- >- return RTCError::OK(); >-} >- >-RtpParameters OrtcRtpSenderAdapter::GetParameters() const { >- return last_applied_parameters_; >-} >- >-cricket::MediaType OrtcRtpSenderAdapter::GetKind() const { >- return kind_; >-} >- >-OrtcRtpSenderAdapter::OrtcRtpSenderAdapter( >- cricket::MediaType kind, >- RtpTransportInterface* transport, >- RtpTransportControllerAdapter* rtp_transport_controller) >- : kind_(kind), >- transport_(transport), >- rtp_transport_controller_(rtp_transport_controller) {} >- >-void OrtcRtpSenderAdapter::CreateInternalSender() { >- switch (kind_) { >- case cricket::MEDIA_TYPE_AUDIO: { >- auto* audio_sender = new AudioRtpSender( >- rtp_transport_controller_->worker_thread(), /*id=*/"", nullptr); >- auto* voice_channel = rtp_transport_controller_->voice_channel(); >- RTC_DCHECK(voice_channel); >- audio_sender->SetVoiceMediaChannel(voice_channel->media_channel()); >- internal_sender_ = audio_sender; >- break; >- } >- case cricket::MEDIA_TYPE_VIDEO: { >- auto* video_sender = new VideoRtpSender( >- rtp_transport_controller_->worker_thread(), /*id=*/""); >- auto* video_channel = rtp_transport_controller_->video_channel(); >- RTC_DCHECK(video_channel); >- video_sender->SetVideoMediaChannel(video_channel->media_channel()); >- internal_sender_ = video_sender; >- break; >- } >- case cricket::MEDIA_TYPE_DATA: >- RTC_NOTREACHED(); >- } >- if (track_) { >- if (!internal_sender_->SetTrack(track_)) { >- // Since we checked the track type when it was set, this should never >- // happen... 
>- RTC_NOTREACHED(); >- } >- } >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcrtpsenderadapter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcrtpsenderadapter.h >deleted file mode 100644 >index b078da985c9d18a06886220d249f8a9094fc5abb..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/ortcrtpsenderadapter.h >+++ /dev/null >@@ -1,79 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef ORTC_ORTCRTPSENDERADAPTER_H_ >-#define ORTC_ORTCRTPSENDERADAPTER_H_ >- >-#include <memory> >- >-#include "api/ortc/ortcrtpsenderinterface.h" >-#include "api/rtcerror.h" >-#include "api/rtpparameters.h" >-#include "ortc/rtptransportcontrolleradapter.h" >-#include "pc/rtpsender.h" >-#include "rtc_base/constructormagic.h" >-#include "rtc_base/third_party/sigslot/sigslot.h" >- >-namespace webrtc { >- >-// Implementation of OrtcRtpSenderInterface that works with RtpTransportAdapter, >-// and wraps a VideoRtpSender/AudioRtpSender that's normally used with the >-// PeerConnection. >-// >-// TODO(deadbeef): When BaseChannel is split apart into separate >-// "RtpSender"/"RtpTransceiver"/"RtpSender"/"RtpReceiver" objects, this adapter >-// object can be removed. >-class OrtcRtpSenderAdapter : public OrtcRtpSenderInterface { >- public: >- // Wraps |wrapped_sender| in a proxy that will safely call methods on the >- // correct thread. >- static std::unique_ptr<OrtcRtpSenderInterface> CreateProxy( >- std::unique_ptr<OrtcRtpSenderAdapter> wrapped_sender); >- >- // Should only be called by RtpTransportControllerAdapter. 
>- OrtcRtpSenderAdapter(cricket::MediaType kind, >- RtpTransportInterface* transport, >- RtpTransportControllerAdapter* rtp_transport_controller); >- ~OrtcRtpSenderAdapter() override; >- >- // OrtcRtpSenderInterface implementation. >- RTCError SetTrack(MediaStreamTrackInterface* track) override; >- rtc::scoped_refptr<MediaStreamTrackInterface> GetTrack() const override; >- >- RTCError SetTransport(RtpTransportInterface* transport) override; >- RtpTransportInterface* GetTransport() const override; >- >- RTCError Send(const RtpParameters& parameters) override; >- RtpParameters GetParameters() const override; >- >- cricket::MediaType GetKind() const override; >- >- // Used so that the RtpTransportControllerAdapter knows when it can >- // deallocate resources allocated for this object. >- sigslot::signal0<> SignalDestroyed; >- >- private: >- void CreateInternalSender(); >- >- cricket::MediaType kind_; >- RtpTransportInterface* transport_; >- RtpTransportControllerAdapter* rtp_transport_controller_; >- // Scoped refptr due to ref-counted interface, but we should be the only >- // reference holder. >- rtc::scoped_refptr<RtpSenderInternal> internal_sender_; >- rtc::scoped_refptr<MediaStreamTrackInterface> track_; >- RtpParameters last_applied_parameters_; >- >- RTC_DISALLOW_IMPLICIT_CONSTRUCTORS(OrtcRtpSenderAdapter); >-}; >- >-} // namespace webrtc >- >-#endif // ORTC_ORTCRTPSENDERADAPTER_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/rtptransport_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/rtptransport_unittest.cc >deleted file mode 100644 >index 92ad43fab3edee70405ba5521d915dfb59309b8a..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/rtptransport_unittest.cc >+++ /dev/null >@@ -1,286 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. 
>- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include <memory> >- >-#include "api/audio_codecs/builtin_audio_decoder_factory.h" >-#include "api/audio_codecs/builtin_audio_encoder_factory.h" >-#include "media/base/fakemediaengine.h" >-#include "ortc/ortcfactory.h" >-#include "ortc/testrtpparameters.h" >-#include "p2p/base/fakepackettransport.h" >-#include "rtc_base/gunit.h" >- >-namespace webrtc { >- >-// This test uses fake packet transports and a fake media engine, in order to >-// test the RtpTransport at only an API level. Any end-to-end test should go in >-// ortcfactory_integrationtest.cc instead. >-class RtpTransportTest : public testing::Test { >- public: >- RtpTransportTest() { >- fake_media_engine_ = new cricket::FakeMediaEngine(); >- // Note: This doesn't need to use fake network classes, since it uses >- // FakePacketTransports. >- auto result = OrtcFactory::Create( >- nullptr, nullptr, nullptr, nullptr, nullptr, >- std::unique_ptr<cricket::MediaEngineInterface>(fake_media_engine_), >- CreateBuiltinAudioEncoderFactory(), CreateBuiltinAudioDecoderFactory()); >- ortc_factory_ = result.MoveValue(); >- } >- >- protected: >- // Owned by |ortc_factory_|. >- cricket::FakeMediaEngine* fake_media_engine_; >- std::unique_ptr<OrtcFactoryInterface> ortc_factory_; >-}; >- >-// Test GetRtpPacketTransport and GetRtcpPacketTransport, with and without RTCP >-// muxing. >-TEST_F(RtpTransportTest, GetPacketTransports) { >- rtc::FakePacketTransport rtp("rtp"); >- rtc::FakePacketTransport rtcp("rtcp"); >- // With muxed RTCP. 
>- RtpTransportParameters parameters; >- parameters.rtcp.mux = true; >- auto result = >- ortc_factory_->CreateRtpTransport(parameters, &rtp, nullptr, nullptr); >- ASSERT_TRUE(result.ok()); >- EXPECT_EQ(&rtp, result.value()->GetRtpPacketTransport()); >- EXPECT_EQ(nullptr, result.value()->GetRtcpPacketTransport()); >- result.MoveValue().reset(); >- // With non-muxed RTCP. >- parameters.rtcp.mux = false; >- result = ortc_factory_->CreateRtpTransport(parameters, &rtp, &rtcp, nullptr); >- ASSERT_TRUE(result.ok()); >- EXPECT_EQ(&rtp, result.value()->GetRtpPacketTransport()); >- EXPECT_EQ(&rtcp, result.value()->GetRtcpPacketTransport()); >-} >- >-// If an RtpTransport starts out un-muxed and then starts muxing, the RTCP >-// packet transport should be forgotten and GetRtcpPacketTransport should >-// return null. >-TEST_F(RtpTransportTest, EnablingRtcpMuxingUnsetsRtcpTransport) { >- rtc::FakePacketTransport rtp("rtp"); >- rtc::FakePacketTransport rtcp("rtcp"); >- >- // Create non-muxed. >- RtpTransportParameters parameters; >- parameters.rtcp.mux = false; >- auto result = >- ortc_factory_->CreateRtpTransport(parameters, &rtp, &rtcp, nullptr); >- ASSERT_TRUE(result.ok()); >- auto rtp_transport = result.MoveValue(); >- >- // Enable muxing. >- parameters.rtcp.mux = true; >- EXPECT_TRUE(rtp_transport->SetParameters(parameters).ok()); >- EXPECT_EQ(nullptr, rtp_transport->GetRtcpPacketTransport()); >-} >- >-TEST_F(RtpTransportTest, GetAndSetRtcpParameters) { >- rtc::FakePacketTransport rtp("rtp"); >- rtc::FakePacketTransport rtcp("rtcp"); >- // Start with non-muxed RTCP. >- RtpTransportParameters parameters; >- parameters.rtcp.mux = false; >- parameters.rtcp.cname = "teST"; >- parameters.rtcp.reduced_size = false; >- auto result = >- ortc_factory_->CreateRtpTransport(parameters, &rtp, &rtcp, nullptr); >- ASSERT_TRUE(result.ok()); >- auto transport = result.MoveValue(); >- EXPECT_EQ(parameters, transport->GetParameters()); >- >- // Changing the CNAME is currently unsupported. 
>- parameters.rtcp.cname = "different"; >- EXPECT_EQ(RTCErrorType::UNSUPPORTED_OPERATION, >- transport->SetParameters(parameters).type()); >- parameters.rtcp.cname = "teST"; >- >- // Enable RTCP muxing and reduced-size RTCP. >- parameters.rtcp.mux = true; >- parameters.rtcp.reduced_size = true; >- EXPECT_TRUE(transport->SetParameters(parameters).ok()); >- EXPECT_EQ(parameters, transport->GetParameters()); >- >- // Empty CNAME should result in the existing CNAME being used. >- parameters.rtcp.cname.clear(); >- EXPECT_TRUE(transport->SetParameters(parameters).ok()); >- EXPECT_EQ("teST", transport->GetParameters().rtcp.cname); >- >- // Disabling RTCP muxing after enabling shouldn't be allowed, since enabling >- // muxing should have made the RTP transport forget about the RTCP packet >- // transport initially passed into it. >- parameters.rtcp.mux = false; >- EXPECT_EQ(RTCErrorType::INVALID_STATE, >- transport->SetParameters(parameters).type()); >-} >- >-// When Send or Receive is called on a sender or receiver, the RTCP parameters >-// from the RtpTransport underneath the sender should be applied to the created >-// media stream. The only relevant parameters (currently) are |cname| and >-// |reduced_size|. >-TEST_F(RtpTransportTest, SendAndReceiveApplyRtcpParametersToMediaEngine) { >- // First, create video transport with reduced-size RTCP. >- rtc::FakePacketTransport fake_packet_transport1("1"); >- RtpTransportParameters parameters; >- parameters.rtcp.mux = true; >- parameters.rtcp.reduced_size = true; >- parameters.rtcp.cname = "foo"; >- auto rtp_transport_result = ortc_factory_->CreateRtpTransport( >- parameters, &fake_packet_transport1, nullptr, nullptr); >- auto video_transport = rtp_transport_result.MoveValue(); >- >- // Create video sender and call Send, expecting parameters to be applied. 
>- auto sender_result = ortc_factory_->CreateRtpSender(cricket::MEDIA_TYPE_VIDEO, >- video_transport.get()); >- auto video_sender = sender_result.MoveValue(); >- EXPECT_TRUE(video_sender->Send(MakeMinimalVp8Parameters()).ok()); >- cricket::FakeVideoMediaChannel* fake_video_channel = >- fake_media_engine_->GetVideoChannel(0); >- ASSERT_NE(nullptr, fake_video_channel); >- EXPECT_TRUE(fake_video_channel->send_rtcp_parameters().reduced_size); >- ASSERT_EQ(1u, fake_video_channel->send_streams().size()); >- const cricket::StreamParams& video_send_stream = >- fake_video_channel->send_streams()[0]; >- EXPECT_EQ("foo", video_send_stream.cname); >- >- // Create video receiver and call Receive, expecting parameters to be applied >- // (minus |cname|, since that's the sent cname, not received). >- auto receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_VIDEO, video_transport.get()); >- auto video_receiver = receiver_result.MoveValue(); >- EXPECT_TRUE( >- video_receiver->Receive(MakeMinimalVp8ParametersWithSsrc(0xdeadbeef)) >- .ok()); >- EXPECT_TRUE(fake_video_channel->recv_rtcp_parameters().reduced_size); >- >- // Create audio transport with non-reduced size RTCP. >- rtc::FakePacketTransport fake_packet_transport2("2"); >- parameters.rtcp.reduced_size = false; >- parameters.rtcp.cname = "bar"; >- rtp_transport_result = ortc_factory_->CreateRtpTransport( >- parameters, &fake_packet_transport2, nullptr, nullptr); >- auto audio_transport = rtp_transport_result.MoveValue(); >- >- // Create audio sender and call Send, expecting parameters to be applied. 
>- sender_result = ortc_factory_->CreateRtpSender(cricket::MEDIA_TYPE_AUDIO, >- audio_transport.get()); >- auto audio_sender = sender_result.MoveValue(); >- EXPECT_TRUE(audio_sender->Send(MakeMinimalIsacParameters()).ok()); >- >- cricket::FakeVoiceMediaChannel* fake_voice_channel = >- fake_media_engine_->GetVoiceChannel(0); >- ASSERT_NE(nullptr, fake_voice_channel); >- EXPECT_FALSE(fake_voice_channel->send_rtcp_parameters().reduced_size); >- ASSERT_EQ(1u, fake_voice_channel->send_streams().size()); >- const cricket::StreamParams& audio_send_stream = >- fake_voice_channel->send_streams()[0]; >- EXPECT_EQ("bar", audio_send_stream.cname); >- >- // Create audio receiver and call Receive, expecting parameters to be applied >- // (minus |cname|, since that's the sent cname, not received). >- receiver_result = ortc_factory_->CreateRtpReceiver(cricket::MEDIA_TYPE_AUDIO, >- audio_transport.get()); >- auto audio_receiver = receiver_result.MoveValue(); >- EXPECT_TRUE( >- audio_receiver->Receive(MakeMinimalOpusParametersWithSsrc(0xbaadf00d)) >- .ok()); >- EXPECT_FALSE(fake_voice_channel->recv_rtcp_parameters().reduced_size); >-} >- >-// When SetParameters is called, the modified parameters should be applied >-// to the media engine. >-// TODO(deadbeef): Once the implementation supports changing the CNAME, >-// test that here. >-TEST_F(RtpTransportTest, SetRtcpParametersAppliesParametersToMediaEngine) { >- rtc::FakePacketTransport fake_packet_transport("fake"); >- RtpTransportParameters parameters; >- parameters.rtcp.mux = true; >- parameters.rtcp.reduced_size = false; >- auto rtp_transport_result = ortc_factory_->CreateRtpTransport( >- parameters, &fake_packet_transport, nullptr, nullptr); >- auto rtp_transport = rtp_transport_result.MoveValue(); >- >- // Create video sender and call Send, applying an initial set of parameters. 
>- auto sender_result = ortc_factory_->CreateRtpSender(cricket::MEDIA_TYPE_VIDEO, >- rtp_transport.get()); >- auto sender = sender_result.MoveValue(); >- EXPECT_TRUE(sender->Send(MakeMinimalVp8Parameters()).ok()); >- >- // Modify parameters and expect them to be changed at the media engine level. >- parameters.rtcp.reduced_size = true; >- EXPECT_TRUE(rtp_transport->SetParameters(parameters).ok()); >- >- cricket::FakeVideoMediaChannel* fake_video_channel = >- fake_media_engine_->GetVideoChannel(0); >- ASSERT_NE(nullptr, fake_video_channel); >- EXPECT_TRUE(fake_video_channel->send_rtcp_parameters().reduced_size); >-} >- >-// SetParameters should set keepalive for all RTP transports. >-// It is impossible to modify keepalive parameters if any streams are created. >-// Note: This is an implementation detail for current way of configuring the >-// keep-alive. It may change in the future. >-TEST_F(RtpTransportTest, CantChangeKeepAliveAfterCreatedSendStreams) { >- rtc::FakePacketTransport fake_packet_transport("fake"); >- RtpTransportParameters parameters; >- parameters.keepalive.timeout_interval_ms = 100; >- auto rtp_transport_result = ortc_factory_->CreateRtpTransport( >- parameters, &fake_packet_transport, nullptr, nullptr); >- ASSERT_TRUE(rtp_transport_result.ok()); >- std::unique_ptr<RtpTransportInterface> rtp_transport = >- rtp_transport_result.MoveValue(); >- >- // Updating keepalive parameters is ok, since no rtp sender created. >- parameters.keepalive.timeout_interval_ms = 200; >- EXPECT_TRUE(rtp_transport->SetParameters(parameters).ok()); >- >- // Create video sender. Note: |sender_result| scope must extend past the >- // SetParameters() call below. >- auto sender_result = ortc_factory_->CreateRtpSender(cricket::MEDIA_TYPE_VIDEO, >- rtp_transport.get()); >- EXPECT_TRUE(sender_result.ok()); >- >- // Modify parameters second time after video send stream created. 
>- parameters.keepalive.timeout_interval_ms = 10; >- EXPECT_EQ(RTCErrorType::INVALID_MODIFICATION, >- rtp_transport->SetParameters(parameters).type()); >-} >- >-// Note: This is an implementation detail for current way of configuring the >-// keep-alive. It may change in the future. >-TEST_F(RtpTransportTest, KeepAliveMustBeSameAcrossTransportController) { >- rtc::FakePacketTransport fake_packet_transport("fake"); >- RtpTransportParameters parameters; >- parameters.keepalive.timeout_interval_ms = 100; >- >- // Manually create a controller, that can be shared by multiple transports. >- auto controller_result = ortc_factory_->CreateRtpTransportController(); >- ASSERT_TRUE(controller_result.ok()); >- std::unique_ptr<RtpTransportControllerInterface> controller = >- controller_result.MoveValue(); >- >- // Create a first transport. >- auto first_transport_result = ortc_factory_->CreateRtpTransport( >- parameters, &fake_packet_transport, nullptr, controller.get()); >- ASSERT_TRUE(first_transport_result.ok()); >- >- // Update the parameters, and create another transport for the same >- // controller. >- parameters.keepalive.timeout_interval_ms = 10; >- auto seconds_transport_result = ortc_factory_->CreateRtpTransport( >- parameters, &fake_packet_transport, nullptr, controller.get()); >- EXPECT_EQ(RTCErrorType::INVALID_MODIFICATION, >- seconds_transport_result.error().type()); >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/rtptransportadapter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/rtptransportadapter.cc >deleted file mode 100644 >index 84c2ef09e66eddb9ac784c371611de723fea7ae0..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/rtptransportadapter.cc >+++ /dev/null >@@ -1,230 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. 
>- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include "ortc/rtptransportadapter.h" >- >-#include <algorithm> // For std::find. >-#include <set> >-#include <utility> // For std::move. >- >-#include "absl/memory/memory.h" >-#include "api/proxy.h" >-#include "rtc_base/logging.h" >- >-namespace webrtc { >- >-BEGIN_OWNED_PROXY_MAP(RtpTransport) >-PROXY_SIGNALING_THREAD_DESTRUCTOR() >-PROXY_CONSTMETHOD0(PacketTransportInterface*, GetRtpPacketTransport) >-PROXY_CONSTMETHOD0(PacketTransportInterface*, GetRtcpPacketTransport) >-PROXY_METHOD1(RTCError, SetParameters, const RtpTransportParameters&) >-PROXY_CONSTMETHOD0(RtpTransportParameters, GetParameters) >-protected: >-RtpTransportAdapter* GetInternal() override { >- return internal(); >-} >-END_PROXY_MAP() >- >-BEGIN_OWNED_PROXY_MAP(SrtpTransport) >-PROXY_SIGNALING_THREAD_DESTRUCTOR() >-PROXY_CONSTMETHOD0(PacketTransportInterface*, GetRtpPacketTransport) >-PROXY_CONSTMETHOD0(PacketTransportInterface*, GetRtcpPacketTransport) >-PROXY_METHOD1(RTCError, SetParameters, const RtpTransportParameters&) >-PROXY_CONSTMETHOD0(RtpTransportParameters, GetParameters) >-PROXY_METHOD1(RTCError, SetSrtpSendKey, const cricket::CryptoParams&) >-PROXY_METHOD1(RTCError, SetSrtpReceiveKey, const cricket::CryptoParams&) >-protected: >-RtpTransportAdapter* GetInternal() override { >- return internal(); >-} >-END_PROXY_MAP() >- >-// static >-RTCErrorOr<std::unique_ptr<RtpTransportInterface>> >-RtpTransportAdapter::CreateProxied( >- const RtpTransportParameters& parameters, >- PacketTransportInterface* rtp, >- PacketTransportInterface* rtcp, >- RtpTransportControllerAdapter* rtp_transport_controller) { >- if (!rtp) { >- 
LOG_AND_RETURN_ERROR(RTCErrorType::INVALID_PARAMETER, >- "Must provide an RTP packet transport."); >- } >- if (!parameters.rtcp.mux && !rtcp) { >- LOG_AND_RETURN_ERROR( >- RTCErrorType::INVALID_PARAMETER, >- "Must provide an RTCP packet transport when RTCP muxing is not used."); >- } >- if (parameters.rtcp.mux && rtcp) { >- LOG_AND_RETURN_ERROR(RTCErrorType::INVALID_PARAMETER, >- "Creating an RtpTransport with RTCP muxing enabled, " >- "with a separate RTCP packet transport?"); >- } >- if (!rtp_transport_controller) { >- // Since OrtcFactory::CreateRtpTransport creates an RtpTransportController >- // automatically when one isn't passed in, this should never be reached. >- RTC_NOTREACHED(); >- LOG_AND_RETURN_ERROR(RTCErrorType::INVALID_PARAMETER, >- "Must provide an RTP transport controller."); >- } >- std::unique_ptr<RtpTransportAdapter> transport_adapter( >- new RtpTransportAdapter(parameters.rtcp, rtp, rtcp, >- rtp_transport_controller, >- false /*is_srtp_transport*/)); >- RTCError params_result = transport_adapter->SetParameters(parameters); >- if (!params_result.ok()) { >- return std::move(params_result); >- } >- >- return RtpTransportProxyWithInternal<RtpTransportAdapter>::Create( >- rtp_transport_controller->signaling_thread(), >- rtp_transport_controller->worker_thread(), std::move(transport_adapter)); >-} >- >-RTCErrorOr<std::unique_ptr<SrtpTransportInterface>> >-RtpTransportAdapter::CreateSrtpProxied( >- const RtpTransportParameters& parameters, >- PacketTransportInterface* rtp, >- PacketTransportInterface* rtcp, >- RtpTransportControllerAdapter* rtp_transport_controller) { >- if (!rtp) { >- LOG_AND_RETURN_ERROR(RTCErrorType::INVALID_PARAMETER, >- "Must provide an RTP packet transport."); >- } >- if (!parameters.rtcp.mux && !rtcp) { >- LOG_AND_RETURN_ERROR( >- RTCErrorType::INVALID_PARAMETER, >- "Must provide an RTCP packet transport when RTCP muxing is not used."); >- } >- if (parameters.rtcp.mux && rtcp) { >- 
LOG_AND_RETURN_ERROR(RTCErrorType::INVALID_PARAMETER, >- "Creating an RtpTransport with RTCP muxing enabled, " >- "with a separate RTCP packet transport?"); >- } >- if (!rtp_transport_controller) { >- // Since OrtcFactory::CreateRtpTransport creates an RtpTransportController >- // automatically when one isn't passed in, this should never be reached. >- RTC_NOTREACHED(); >- LOG_AND_RETURN_ERROR(RTCErrorType::INVALID_PARAMETER, >- "Must provide an RTP transport controller."); >- } >- >- std::unique_ptr<RtpTransportAdapter> transport_adapter; >- transport_adapter.reset(new RtpTransportAdapter(parameters.rtcp, rtp, rtcp, >- rtp_transport_controller, >- true /*is_srtp_transport*/)); >- RTCError params_result = transport_adapter->SetParameters(parameters); >- if (!params_result.ok()) { >- return std::move(params_result); >- } >- >- return SrtpTransportProxyWithInternal<RtpTransportAdapter>::Create( >- rtp_transport_controller->signaling_thread(), >- rtp_transport_controller->worker_thread(), std::move(transport_adapter)); >-} >- >-void RtpTransportAdapter::TakeOwnershipOfRtpTransportController( >- std::unique_ptr<RtpTransportControllerInterface> controller) { >- RTC_DCHECK_EQ(rtp_transport_controller_, controller->GetInternal()); >- RTC_DCHECK(owned_rtp_transport_controller_.get() == nullptr); >- owned_rtp_transport_controller_ = std::move(controller); >-} >- >-RtpTransportAdapter::RtpTransportAdapter( >- const RtcpParameters& rtcp_params, >- PacketTransportInterface* rtp, >- PacketTransportInterface* rtcp, >- RtpTransportControllerAdapter* rtp_transport_controller, >- bool is_srtp_transport) >- : rtp_packet_transport_(rtp), >- rtcp_packet_transport_(rtcp), >- rtp_transport_controller_(rtp_transport_controller), >- network_thread_(rtp_transport_controller_->network_thread()) { >- parameters_.rtcp = rtcp_params; >- // CNAME should have been filled by OrtcFactory if empty. 
>- RTC_DCHECK(!parameters_.rtcp.cname.empty()); >- RTC_DCHECK(rtp_transport_controller); >- >- if (is_srtp_transport) { >- srtp_transport_ = absl::make_unique<SrtpTransport>(rtcp == nullptr); >- transport_ = srtp_transport_.get(); >- } else { >- unencrypted_rtp_transport_ = >- absl::make_unique<RtpTransport>(rtcp == nullptr); >- transport_ = unencrypted_rtp_transport_.get(); >- } >- RTC_DCHECK(transport_); >- >- network_thread_->Invoke<void>(RTC_FROM_HERE, [=] { >- SetRtpPacketTransport(rtp->GetInternal()); >- if (rtcp) { >- SetRtcpPacketTransport(rtcp->GetInternal()); >- } >- }); >- >- transport_->SignalReadyToSend.connect(this, >- &RtpTransportAdapter::OnReadyToSend); >- transport_->SignalRtcpPacketReceived.connect( >- this, &RtpTransportAdapter::OnRtcpPacketReceived); >- transport_->SignalWritableState.connect( >- this, &RtpTransportAdapter::OnWritableState); >-} >- >-RtpTransportAdapter::~RtpTransportAdapter() { >- SignalDestroyed(this); >-} >- >-RTCError RtpTransportAdapter::SetParameters( >- const RtpTransportParameters& parameters) { >- if (!parameters.rtcp.mux && parameters_.rtcp.mux) { >- LOG_AND_RETURN_ERROR(webrtc::RTCErrorType::INVALID_STATE, >- "Can't disable RTCP muxing after enabling."); >- } >- if (!parameters.rtcp.cname.empty() && >- parameters.rtcp.cname != parameters_.rtcp.cname) { >- LOG_AND_RETURN_ERROR(webrtc::RTCErrorType::UNSUPPORTED_OPERATION, >- "Changing the RTCP CNAME is currently unsupported."); >- } >- // If the CNAME is empty, use the existing one. 
>- RtpTransportParameters copy = parameters; >- if (copy.rtcp.cname.empty()) { >- copy.rtcp.cname = parameters_.rtcp.cname; >- } >- RTCError err = >- rtp_transport_controller_->SetRtpTransportParameters(copy, this); >- if (!err.ok()) { >- return err; >- } >- parameters_ = copy; >- if (parameters_.rtcp.mux) { >- rtcp_packet_transport_ = nullptr; >- } >- return RTCError::OK(); >-} >- >-RTCError RtpTransportAdapter::SetSrtpSendKey( >- const cricket::CryptoParams& params) { >- if (!network_thread_->IsCurrent()) { >- return network_thread_->Invoke<RTCError>( >- RTC_FROM_HERE, [&] { return SetSrtpSendKey(params); }); >- } >- return transport_->SetSrtpSendKey(params); >-} >- >-RTCError RtpTransportAdapter::SetSrtpReceiveKey( >- const cricket::CryptoParams& params) { >- if (!network_thread_->IsCurrent()) { >- return network_thread_->Invoke<RTCError>( >- RTC_FROM_HERE, [&] { return SetSrtpReceiveKey(params); }); >- } >- return transport_->SetSrtpReceiveKey(params); >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/rtptransportadapter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/rtptransportadapter.h >deleted file mode 100644 >index 81914f6b8d1b475f75b0485595a18a7f1641a544..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/rtptransportadapter.h >+++ /dev/null >@@ -1,121 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#ifndef ORTC_RTPTRANSPORTADAPTER_H_ >-#define ORTC_RTPTRANSPORTADAPTER_H_ >- >-#include <memory> >-#include <vector> >- >-#include "api/rtcerror.h" >-#include "media/base/streamparams.h" >-#include "ortc/rtptransportcontrolleradapter.h" >-#include "pc/channel.h" >-#include "pc/rtptransportinternaladapter.h" >-#include "pc/srtptransport.h" >-#include "rtc_base/constructormagic.h" >-#include "rtc_base/third_party/sigslot/sigslot.h" >- >-namespace webrtc { >-// This class is a wrapper over an RtpTransport or an SrtpTransport. The base >-// class RtpTransportInternalAdapter keeps a raw pointer, |transport_|, of the >-// transport object and implements both the public SrtpTransportInterface and >-// RtpTransport internal interface by calling the |transport_| underneath. >-// >-// This adapter can be used as an unencrypted RTP transport or an SrtpTransport >-// with RtpSenderAdapter, RtpReceiverAdapter, and RtpTransportControllerAdapter. >-// >-// TODO(deadbeef): When BaseChannel is split apart into separate >-// "RtpTransport"/"RtpTransceiver"/"RtpSender"/"RtpReceiver" objects, this >-// adapter object can be removed. >-class RtpTransportAdapter : public RtpTransportInternalAdapter { >- public: >- // |rtp| can't be null. |rtcp| can if RTCP muxing is used immediately (meaning >- // |rtcp_parameters.mux| is also true). >- static RTCErrorOr<std::unique_ptr<RtpTransportInterface>> CreateProxied( >- const RtpTransportParameters& rtcp_parameters, >- PacketTransportInterface* rtp, >- PacketTransportInterface* rtcp, >- RtpTransportControllerAdapter* rtp_transport_controller); >- >- static RTCErrorOr<std::unique_ptr<SrtpTransportInterface>> CreateSrtpProxied( >- const RtpTransportParameters& rtcp_parameters, >- PacketTransportInterface* rtp, >- PacketTransportInterface* rtcp, >- RtpTransportControllerAdapter* rtp_transport_controller); >- >- ~RtpTransportAdapter() override; >- >- // RtpTransportInterface implementation. 
>- PacketTransportInterface* GetRtpPacketTransport() const override { >- return rtp_packet_transport_; >- } >- PacketTransportInterface* GetRtcpPacketTransport() const override { >- return rtcp_packet_transport_; >- } >- RTCError SetParameters(const RtpTransportParameters& parameters) override; >- RtpTransportParameters GetParameters() const override { return parameters_; } >- >- // SRTP specific implementation. >- RTCError SetSrtpSendKey(const cricket::CryptoParams& params) override; >- RTCError SetSrtpReceiveKey(const cricket::CryptoParams& params) override; >- >- // Methods used internally by OrtcFactory. >- RtpTransportControllerAdapter* rtp_transport_controller() { >- return rtp_transport_controller_; >- } >- void TakeOwnershipOfRtpTransportController( >- std::unique_ptr<RtpTransportControllerInterface> controller); >- >- // Used by RtpTransportControllerAdapter to tell when it should stop >- // returning this transport from GetTransports(). >- sigslot::signal1<RtpTransportAdapter*> SignalDestroyed; >- >- bool IsSrtpActive() const override { return transport_->IsSrtpActive(); } >- >- protected: >- RtpTransportAdapter* GetInternal() override { return this; } >- >- private: >- RtpTransportAdapter(const RtcpParameters& rtcp_params, >- PacketTransportInterface* rtp, >- PacketTransportInterface* rtcp, >- RtpTransportControllerAdapter* rtp_transport_controller, >- bool is_srtp_transport); >- >- void OnReadyToSend(bool ready) { SignalReadyToSend(ready); } >- >- void OnRtcpPacketReceived(rtc::CopyOnWriteBuffer* packet, >- const rtc::PacketTime& time) { >- SignalRtcpPacketReceived(packet, time); >- } >- >- void OnWritableState(bool writable) { SignalWritableState(writable); } >- >- PacketTransportInterface* rtp_packet_transport_ = nullptr; >- PacketTransportInterface* rtcp_packet_transport_ = nullptr; >- RtpTransportControllerAdapter* const rtp_transport_controller_ = nullptr; >- // Non-null if this class owns the transport controller. 
>- std::unique_ptr<RtpTransportControllerInterface> >- owned_rtp_transport_controller_; >- RtpTransportParameters parameters_; >- >- // Only one of them is non-null; >- std::unique_ptr<RtpTransport> unencrypted_rtp_transport_; >- std::unique_ptr<SrtpTransport> srtp_transport_; >- >- rtc::Thread* network_thread_ = nullptr; >- >- RTC_DISALLOW_IMPLICIT_CONSTRUCTORS(RtpTransportAdapter); >-}; >- >-} // namespace webrtc >- >-#endif // ORTC_RTPTRANSPORTADAPTER_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/rtptransportcontroller_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/rtptransportcontroller_unittest.cc >deleted file mode 100644 >index 49548417ee7cf67f4297df77a87c6b0f77f397c3..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/rtptransportcontroller_unittest.cc >+++ /dev/null >@@ -1,199 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include <memory> >- >-#include "api/audio_codecs/builtin_audio_decoder_factory.h" >-#include "api/audio_codecs/builtin_audio_encoder_factory.h" >-#include "media/base/fakemediaengine.h" >-#include "ortc/ortcfactory.h" >-#include "ortc/testrtpparameters.h" >-#include "p2p/base/fakepackettransport.h" >-#include "rtc_base/gunit.h" >- >-namespace webrtc { >- >-// This test uses fake packet transports and a fake media engine, in order to >-// test the RtpTransportController at only an API level. Any end-to-end test >-// should go in ortcfactory_integrationtest.cc instead. 
>-// >-// Currently, this test mainly focuses on the limitations of the "adapter" >-// RtpTransportController implementation. Only one of each type of >-// sender/receiver can be created, and the sender/receiver of the same media >-// type must use the same transport. >-class RtpTransportControllerTest : public testing::Test { >- public: >- RtpTransportControllerTest() { >- // Note: This doesn't need to use fake network classes, since it uses >- // FakePacketTransports. >- auto result = OrtcFactory::Create( >- nullptr, nullptr, nullptr, nullptr, nullptr, >- std::unique_ptr<cricket::MediaEngineInterface>( >- new cricket::FakeMediaEngine()), >- CreateBuiltinAudioEncoderFactory(), CreateBuiltinAudioDecoderFactory()); >- ortc_factory_ = result.MoveValue(); >- rtp_transport_controller_ = >- ortc_factory_->CreateRtpTransportController().MoveValue(); >- } >- >- protected: >- std::unique_ptr<OrtcFactoryInterface> ortc_factory_; >- std::unique_ptr<RtpTransportControllerInterface> rtp_transport_controller_; >-}; >- >-TEST_F(RtpTransportControllerTest, GetTransports) { >- rtc::FakePacketTransport packet_transport1("one"); >- rtc::FakePacketTransport packet_transport2("two"); >- >- auto rtp_transport_result1 = ortc_factory_->CreateRtpTransport( >- MakeRtcpMuxParameters(), &packet_transport1, nullptr, >- rtp_transport_controller_.get()); >- ASSERT_TRUE(rtp_transport_result1.ok()); >- >- auto rtp_transport_result2 = ortc_factory_->CreateRtpTransport( >- MakeRtcpMuxParameters(), &packet_transport2, nullptr, >- rtp_transport_controller_.get()); >- ASSERT_TRUE(rtp_transport_result2.ok()); >- >- auto returned_transports = rtp_transport_controller_->GetTransports(); >- ASSERT_EQ(2u, returned_transports.size()); >- EXPECT_EQ(rtp_transport_result1.value().get(), returned_transports[0]); >- EXPECT_EQ(rtp_transport_result2.value().get(), returned_transports[1]); >- >- // If a transport is deleted, it shouldn't be returned any more. 
>- rtp_transport_result1.MoveValue().reset(); >- returned_transports = rtp_transport_controller_->GetTransports(); >- ASSERT_EQ(1u, returned_transports.size()); >- EXPECT_EQ(rtp_transport_result2.value().get(), returned_transports[0]); >-} >- >-// Create RtpSenders and RtpReceivers on top of RtpTransports controlled by the >-// same RtpTransportController. Currently only one each of audio/video is >-// supported. >-TEST_F(RtpTransportControllerTest, AttachMultipleSendersAndReceivers) { >- rtc::FakePacketTransport audio_packet_transport("audio"); >- rtc::FakePacketTransport video_packet_transport("video"); >- >- auto audio_rtp_transport_result = ortc_factory_->CreateRtpTransport( >- MakeRtcpMuxParameters(), &audio_packet_transport, nullptr, >- rtp_transport_controller_.get()); >- ASSERT_TRUE(audio_rtp_transport_result.ok()); >- auto audio_rtp_transport = audio_rtp_transport_result.MoveValue(); >- >- auto video_rtp_transport_result = ortc_factory_->CreateRtpTransport( >- MakeRtcpMuxParameters(), &video_packet_transport, nullptr, >- rtp_transport_controller_.get()); >- ASSERT_TRUE(video_rtp_transport_result.ok()); >- auto video_rtp_transport = video_rtp_transport_result.MoveValue(); >- >- auto audio_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_AUDIO, audio_rtp_transport.get()); >- EXPECT_TRUE(audio_sender_result.ok()); >- auto audio_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_AUDIO, audio_rtp_transport.get()); >- EXPECT_TRUE(audio_receiver_result.ok()); >- auto video_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_VIDEO, video_rtp_transport.get()); >- EXPECT_TRUE(video_sender_result.ok()); >- auto video_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_VIDEO, video_rtp_transport.get()); >- EXPECT_TRUE(video_receiver_result.ok()); >- >- // Now that we have one each of audio/video senders/receivers, trying to >- // create more on top of the same controller is 
expected to fail. >- // TODO(deadbeef): Update this test once multiple senders/receivers on top of >- // the same controller is supported. >- auto failed_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_AUDIO, audio_rtp_transport.get()); >- EXPECT_EQ(RTCErrorType::UNSUPPORTED_OPERATION, >- failed_sender_result.error().type()); >- auto failed_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_AUDIO, audio_rtp_transport.get()); >- EXPECT_EQ(RTCErrorType::UNSUPPORTED_OPERATION, >- failed_receiver_result.error().type()); >- failed_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_VIDEO, video_rtp_transport.get()); >- EXPECT_EQ(RTCErrorType::UNSUPPORTED_OPERATION, >- failed_sender_result.error().type()); >- failed_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_VIDEO, video_rtp_transport.get()); >- EXPECT_EQ(RTCErrorType::UNSUPPORTED_OPERATION, >- failed_receiver_result.error().type()); >- >- // If we destroy the existing sender/receiver using a transport controller, >- // we should be able to make a new one, despite the above limitation. 
>- audio_sender_result.MoveValue().reset(); >- audio_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_AUDIO, audio_rtp_transport.get()); >- EXPECT_TRUE(audio_sender_result.ok()); >- audio_receiver_result.MoveValue().reset(); >- audio_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_AUDIO, audio_rtp_transport.get()); >- EXPECT_TRUE(audio_receiver_result.ok()); >- video_sender_result.MoveValue().reset(); >- video_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_VIDEO, video_rtp_transport.get()); >- EXPECT_TRUE(video_sender_result.ok()); >- video_receiver_result.MoveValue().reset(); >- video_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_VIDEO, video_rtp_transport.get()); >- EXPECT_TRUE(video_receiver_result.ok()); >-} >- >-// Given the current limitations of the BaseChannel-based implementation, it's >-// not possible for an audio sender and receiver to use different RtpTransports. >-// TODO(deadbeef): Once this is supported, update/replace this test. >-TEST_F(RtpTransportControllerTest, >- SenderAndReceiverUsingDifferentTransportsUnsupported) { >- rtc::FakePacketTransport packet_transport1("one"); >- rtc::FakePacketTransport packet_transport2("two"); >- >- auto rtp_transport_result1 = ortc_factory_->CreateRtpTransport( >- MakeRtcpMuxParameters(), &packet_transport1, nullptr, >- rtp_transport_controller_.get()); >- ASSERT_TRUE(rtp_transport_result1.ok()); >- auto rtp_transport1 = rtp_transport_result1.MoveValue(); >- >- auto rtp_transport_result2 = ortc_factory_->CreateRtpTransport( >- MakeRtcpMuxParameters(), &packet_transport2, nullptr, >- rtp_transport_controller_.get()); >- ASSERT_TRUE(rtp_transport_result2.ok()); >- auto rtp_transport2 = rtp_transport_result2.MoveValue(); >- >- // Create an audio sender on transport 1, then try to create a receiver on 2. 
>- auto audio_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_AUDIO, rtp_transport1.get()); >- EXPECT_TRUE(audio_sender_result.ok()); >- auto audio_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_AUDIO, rtp_transport2.get()); >- EXPECT_EQ(RTCErrorType::UNSUPPORTED_OPERATION, >- audio_receiver_result.error().type()); >- // Delete the sender; now we should be ok to create the receiver on 2. >- audio_sender_result.MoveValue().reset(); >- audio_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_AUDIO, rtp_transport2.get()); >- EXPECT_TRUE(audio_receiver_result.ok()); >- >- // Do the same thing for video, reversing 1 and 2 (for variety). >- auto video_sender_result = ortc_factory_->CreateRtpSender( >- cricket::MEDIA_TYPE_VIDEO, rtp_transport2.get()); >- EXPECT_TRUE(video_sender_result.ok()); >- >- auto video_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_VIDEO, rtp_transport1.get()); >- EXPECT_EQ(RTCErrorType::UNSUPPORTED_OPERATION, >- video_receiver_result.error().type()); >- video_sender_result.MoveValue().reset(); >- video_receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_VIDEO, rtp_transport1.get()); >- EXPECT_TRUE(video_receiver_result.ok()); >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/rtptransportcontrolleradapter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/rtptransportcontrolleradapter.cc >deleted file mode 100644 >index dbb03ea321f546737ef31fd31eab8648d5d00886..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/rtptransportcontrolleradapter.cc >+++ /dev/null >@@ -1,971 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. 
An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include "ortc/rtptransportcontrolleradapter.h" >- >-#include <algorithm> // For "remove", "find". >-#include <set> >-#include <unordered_map> >-#include <utility> // For std::move. >- >-#include "absl/memory/memory.h" >-#include "api/proxy.h" >-#include "media/base/mediaconstants.h" >-#include "ortc/ortcrtpreceiveradapter.h" >-#include "ortc/ortcrtpsenderadapter.h" >-#include "ortc/rtptransportadapter.h" >-#include "pc/rtpmediautils.h" >-#include "pc/rtpparametersconversion.h" >-#include "rtc_base/checks.h" >-#include "rtc_base/strings/string_builder.h" >- >-namespace webrtc { >- >-// Note: It's assumed that each individual list doesn't have conflicts, since >-// they should have been detected already by rtpparametersconversion.cc. This >-// only needs to detect conflicts *between* A and B. >-template <typename C1, typename C2> >-static RTCError CheckForIdConflicts( >- const std::vector<C1>& codecs_a, >- const cricket::RtpHeaderExtensions& extensions_a, >- const cricket::StreamParamsVec& streams_a, >- const std::vector<C2>& codecs_b, >- const cricket::RtpHeaderExtensions& extensions_b, >- const cricket::StreamParamsVec& streams_b) { >- rtc::StringBuilder oss; >- // Since it's assumed that C1 and C2 are different types, codecs_a and >- // codecs_b should never contain the same payload type, and thus we can just >- // use a set. >- std::set<int> seen_payload_types; >- for (const C1& codec : codecs_a) { >- seen_payload_types.insert(codec.id); >- } >- for (const C2& codec : codecs_b) { >- if (!seen_payload_types.insert(codec.id).second) { >- oss << "Same payload type used for audio and video codecs: " << codec.id; >- LOG_AND_RETURN_ERROR(RTCErrorType::INVALID_PARAMETER, oss.str()); >- } >- } >- // Audio and video *may* use the same header extensions, so use a map. 
>- std::unordered_map<int, std::string> seen_extensions; >- for (const webrtc::RtpExtension& extension : extensions_a) { >- seen_extensions[extension.id] = extension.uri; >- } >- for (const webrtc::RtpExtension& extension : extensions_b) { >- if (seen_extensions.find(extension.id) != seen_extensions.end() && >- seen_extensions.at(extension.id) != extension.uri) { >- oss << "Same ID used for different RTP header extensions: " >- << extension.id; >- LOG_AND_RETURN_ERROR(RTCErrorType::INVALID_PARAMETER, oss.str()); >- } >- } >- std::set<uint32_t> seen_ssrcs; >- for (const cricket::StreamParams& stream : streams_a) { >- seen_ssrcs.insert(stream.ssrcs.begin(), stream.ssrcs.end()); >- } >- for (const cricket::StreamParams& stream : streams_b) { >- for (uint32_t ssrc : stream.ssrcs) { >- if (!seen_ssrcs.insert(ssrc).second) { >- oss << "Same SSRC used for audio and video senders: " << ssrc; >- LOG_AND_RETURN_ERROR(RTCErrorType::INVALID_PARAMETER, oss.str()); >- } >- } >- } >- return RTCError::OK(); >-} >- >-BEGIN_OWNED_PROXY_MAP(RtpTransportController) >-PROXY_SIGNALING_THREAD_DESTRUCTOR() >-PROXY_CONSTMETHOD0(std::vector<RtpTransportInterface*>, GetTransports) >-protected: >-RtpTransportControllerAdapter* GetInternal() override { >- return internal(); >-} >-END_PROXY_MAP() >- >-// static >-std::unique_ptr<RtpTransportControllerInterface> >-RtpTransportControllerAdapter::CreateProxied( >- const cricket::MediaConfig& config, >- cricket::ChannelManager* channel_manager, >- webrtc::RtcEventLog* event_log, >- rtc::Thread* signaling_thread, >- rtc::Thread* worker_thread, >- rtc::Thread* network_thread) { >- std::unique_ptr<RtpTransportControllerAdapter> wrapped( >- new RtpTransportControllerAdapter(config, channel_manager, event_log, >- signaling_thread, worker_thread, >- network_thread)); >- return RtpTransportControllerProxyWithInternal< >- RtpTransportControllerAdapter>::Create(signaling_thread, worker_thread, >- std::move(wrapped)); >-} >- 
>-RtpTransportControllerAdapter::~RtpTransportControllerAdapter() {
>-  RTC_DCHECK_RUN_ON(signaling_thread_);
>-  if (!transport_proxies_.empty()) {
>-    RTC_LOG(LS_ERROR)
>-        << "Destroying RtpTransportControllerAdapter while RtpTransports "
>-           "are still using it; this is unsafe.";
>-  }
>-  if (voice_channel_) {
>-    // This would mean audio RTP senders/receivers that are using us haven't
>-    // been destroyed. This isn't safe (see error log above).
>-    DestroyVoiceChannel();
>-  }
>-  if (video_channel_) {
>-    // This would mean video RTP senders/receivers that are using us haven't
>-    // been destroyed. This isn't safe (see error log above).
>-    DestroyVideoChannel();
>-  }
>-  // Call must be destroyed on the worker thread.
>-  worker_thread_->Invoke<void>(
>-      RTC_FROM_HERE, rtc::Bind(&RtpTransportControllerAdapter::Close_w, this));
>-}
>-
>-RTCErrorOr<std::unique_ptr<RtpTransportInterface>>
>-RtpTransportControllerAdapter::CreateProxiedRtpTransport(
>-    const RtpTransportParameters& parameters,
>-    PacketTransportInterface* rtp,
>-    PacketTransportInterface* rtcp) {
>-  if (!transport_proxies_.empty() && (parameters.keepalive != keepalive_)) {
>-    LOG_AND_RETURN_ERROR(RTCErrorType::INVALID_MODIFICATION,
>-                         "Cannot create RtpTransport with different keep-alive "
>-                         "from the RtpTransports already associated with this "
>-                         "transport controller.");
>-  }
>-  auto result = RtpTransportAdapter::CreateProxied(parameters, rtp, rtcp, this);
>-  if (result.ok()) {
>-    transport_proxies_.push_back(result.value().get());
>-    transport_proxies_.back()->GetInternal()->SignalDestroyed.connect(
>-        this, &RtpTransportControllerAdapter::OnRtpTransportDestroyed);
>-  }
>-  return result;
>-}
>-
>-RTCErrorOr<std::unique_ptr<SrtpTransportInterface>>
>-RtpTransportControllerAdapter::CreateProxiedSrtpTransport(
>-    const RtpTransportParameters& parameters,
>-    PacketTransportInterface* rtp,
>-    PacketTransportInterface* rtcp) {
>-  auto result =
>-      RtpTransportAdapter::CreateSrtpProxied(parameters, rtp, rtcp,
this); >- if (result.ok()) { >- transport_proxies_.push_back(result.value().get()); >- transport_proxies_.back()->GetInternal()->SignalDestroyed.connect( >- this, &RtpTransportControllerAdapter::OnRtpTransportDestroyed); >- } >- return result; >-} >- >-RTCErrorOr<std::unique_ptr<OrtcRtpSenderInterface>> >-RtpTransportControllerAdapter::CreateProxiedRtpSender( >- cricket::MediaType kind, >- RtpTransportInterface* transport_proxy) { >- RTC_DCHECK(transport_proxy); >- RTC_DCHECK(std::find(transport_proxies_.begin(), transport_proxies_.end(), >- transport_proxy) != transport_proxies_.end()); >- std::unique_ptr<OrtcRtpSenderAdapter> new_sender( >- new OrtcRtpSenderAdapter(kind, transport_proxy, this)); >- RTCError err; >- switch (kind) { >- case cricket::MEDIA_TYPE_AUDIO: >- err = AttachAudioSender(new_sender.get(), transport_proxy->GetInternal()); >- break; >- case cricket::MEDIA_TYPE_VIDEO: >- err = AttachVideoSender(new_sender.get(), transport_proxy->GetInternal()); >- break; >- case cricket::MEDIA_TYPE_DATA: >- RTC_NOTREACHED(); >- } >- if (!err.ok()) { >- return std::move(err); >- } >- >- return OrtcRtpSenderAdapter::CreateProxy(std::move(new_sender)); >-} >- >-RTCErrorOr<std::unique_ptr<OrtcRtpReceiverInterface>> >-RtpTransportControllerAdapter::CreateProxiedRtpReceiver( >- cricket::MediaType kind, >- RtpTransportInterface* transport_proxy) { >- RTC_DCHECK(transport_proxy); >- RTC_DCHECK(std::find(transport_proxies_.begin(), transport_proxies_.end(), >- transport_proxy) != transport_proxies_.end()); >- std::unique_ptr<OrtcRtpReceiverAdapter> new_receiver( >- new OrtcRtpReceiverAdapter(kind, transport_proxy, this)); >- RTCError err; >- switch (kind) { >- case cricket::MEDIA_TYPE_AUDIO: >- err = AttachAudioReceiver(new_receiver.get(), >- transport_proxy->GetInternal()); >- break; >- case cricket::MEDIA_TYPE_VIDEO: >- err = AttachVideoReceiver(new_receiver.get(), >- transport_proxy->GetInternal()); >- break; >- case cricket::MEDIA_TYPE_DATA: >- RTC_NOTREACHED(); >- } 
>- if (!err.ok()) { >- return std::move(err); >- } >- >- return OrtcRtpReceiverAdapter::CreateProxy(std::move(new_receiver)); >-} >- >-std::vector<RtpTransportInterface*> >-RtpTransportControllerAdapter::GetTransports() const { >- RTC_DCHECK_RUN_ON(signaling_thread_); >- return transport_proxies_; >-} >- >-RTCError RtpTransportControllerAdapter::SetRtpTransportParameters( >- const RtpTransportParameters& parameters, >- RtpTransportInterface* inner_transport) { >- if ((video_channel_ != nullptr || voice_channel_ != nullptr) && >- (parameters.keepalive != keepalive_)) { >- LOG_AND_RETURN_ERROR(RTCErrorType::INVALID_MODIFICATION, >- "Cannot change keep-alive settings after creating " >- "media streams or additional transports for the same " >- "transport controller."); >- } >- // Call must be configured on the worker thread. >- worker_thread_->Invoke<void>( >- RTC_FROM_HERE, >- rtc::Bind(&RtpTransportControllerAdapter::SetRtpTransportParameters_w, >- this, parameters)); >- >- do { >- if (inner_transport == inner_audio_transport_) { >- CopyRtcpParametersToDescriptions(parameters.rtcp, >- &local_audio_description_, >- &remote_audio_description_); >- if (!voice_channel_->SetLocalContent(&local_audio_description_, >- SdpType::kOffer, nullptr)) { >- break; >- } >- if (!voice_channel_->SetRemoteContent(&remote_audio_description_, >- SdpType::kAnswer, nullptr)) { >- break; >- } >- } else if (inner_transport == inner_video_transport_) { >- CopyRtcpParametersToDescriptions(parameters.rtcp, >- &local_video_description_, >- &remote_video_description_); >- if (!video_channel_->SetLocalContent(&local_video_description_, >- SdpType::kOffer, nullptr)) { >- break; >- } >- if (!video_channel_->SetRemoteContent(&remote_video_description_, >- SdpType::kAnswer, nullptr)) { >- break; >- } >- } >- return RTCError::OK(); >- } while (false); >- LOG_AND_RETURN_ERROR(RTCErrorType::INTERNAL_ERROR, >- "Failed to apply new RTCP parameters."); >-} >- >-void 
RtpTransportControllerAdapter::SetRtpTransportParameters_w( >- const RtpTransportParameters& parameters) { >- call_send_rtp_transport_controller_->SetKeepAliveConfig(parameters.keepalive); >-} >- >-RTCError RtpTransportControllerAdapter::ValidateAndApplyAudioSenderParameters( >- const RtpParameters& parameters, >- uint32_t* primary_ssrc) { >- RTC_DCHECK(voice_channel_); >- RTC_DCHECK(have_audio_sender_); >- >- auto codecs_result = ToCricketCodecs<cricket::AudioCodec>(parameters.codecs); >- if (!codecs_result.ok()) { >- return codecs_result.MoveError(); >- } >- >- auto extensions_result = >- ToCricketRtpHeaderExtensions(parameters.header_extensions); >- if (!extensions_result.ok()) { >- return extensions_result.MoveError(); >- } >- >- auto stream_params_result = MakeSendStreamParamsVec( >- parameters.encodings, inner_audio_transport_->GetParameters().rtcp.cname, >- local_audio_description_); >- if (!stream_params_result.ok()) { >- return stream_params_result.MoveError(); >- } >- >- // Check that audio/video sender aren't using the same IDs to refer to >- // different things, if they share the same transport. 
>- if (inner_audio_transport_ == inner_video_transport_) { >- RTCError err = CheckForIdConflicts( >- codecs_result.value(), extensions_result.value(), >- stream_params_result.value(), remote_video_description_.codecs(), >- remote_video_description_.rtp_header_extensions(), >- local_video_description_.streams()); >- if (!err.ok()) { >- return err; >- } >- } >- >- bool local_send = false; >- int bandwidth = cricket::kAutoBandwidth; >- if (parameters.encodings.size() == 1u) { >- if (parameters.encodings[0].max_bitrate_bps) { >- bandwidth = *parameters.encodings[0].max_bitrate_bps; >- } >- local_send = parameters.encodings[0].active; >- } >- const bool local_recv = >- RtpTransceiverDirectionHasRecv(local_audio_description_.direction()); >- const auto local_direction = >- RtpTransceiverDirectionFromSendRecv(local_send, local_recv); >- if (primary_ssrc && !stream_params_result.value().empty()) { >- *primary_ssrc = stream_params_result.value()[0].first_ssrc(); >- } >- >- // Validation is done, so we can attempt applying the descriptions. Sent >- // codecs and header extensions go in remote description, streams go in >- // local. >- // >- // If there are no codecs or encodings, just leave the previous set of >- // codecs. The media engine doesn't like an empty set of codecs. >- if (local_audio_description_.streams().empty() && >- remote_audio_description_.codecs().empty()) { >- } else { >- remote_audio_description_.set_codecs(codecs_result.MoveValue()); >- } >- remote_audio_description_.set_rtp_header_extensions( >- extensions_result.MoveValue()); >- remote_audio_description_.set_bandwidth(bandwidth); >- local_audio_description_.mutable_streams() = stream_params_result.MoveValue(); >- // Direction set based on encoding "active" flag. 
>- local_audio_description_.set_direction(local_direction); >- remote_audio_description_.set_direction( >- RtpTransceiverDirectionReversed(local_direction)); >- >- // Set remote content first, to ensure the stream is created with the correct >- // codec. >- if (!voice_channel_->SetRemoteContent(&remote_audio_description_, >- SdpType::kOffer, nullptr)) { >- LOG_AND_RETURN_ERROR(RTCErrorType::INTERNAL_ERROR, >- "Failed to apply remote parameters to media channel."); >- } >- if (!voice_channel_->SetLocalContent(&local_audio_description_, >- SdpType::kAnswer, nullptr)) { >- LOG_AND_RETURN_ERROR(RTCErrorType::INTERNAL_ERROR, >- "Failed to apply local parameters to media channel."); >- } >- return RTCError::OK(); >-} >- >-RTCError RtpTransportControllerAdapter::ValidateAndApplyVideoSenderParameters( >- const RtpParameters& parameters, >- uint32_t* primary_ssrc) { >- RTC_DCHECK(video_channel_); >- RTC_DCHECK(have_video_sender_); >- >- auto codecs_result = ToCricketCodecs<cricket::VideoCodec>(parameters.codecs); >- if (!codecs_result.ok()) { >- return codecs_result.MoveError(); >- } >- >- auto extensions_result = >- ToCricketRtpHeaderExtensions(parameters.header_extensions); >- if (!extensions_result.ok()) { >- return extensions_result.MoveError(); >- } >- >- auto stream_params_result = MakeSendStreamParamsVec( >- parameters.encodings, inner_video_transport_->GetParameters().rtcp.cname, >- local_video_description_); >- if (!stream_params_result.ok()) { >- return stream_params_result.MoveError(); >- } >- >- // Check that audio/video sender aren't using the same IDs to refer to >- // different things, if they share the same transport. 
>-  if (inner_audio_transport_ == inner_video_transport_) {
>-    RTCError err = CheckForIdConflicts(
>-        codecs_result.value(), extensions_result.value(),
>-        stream_params_result.value(), remote_audio_description_.codecs(),
>-        remote_audio_description_.rtp_header_extensions(),
>-        local_audio_description_.streams());
>-    if (!err.ok()) {
>-      return err;
>-    }
>-  }
>-
>-  bool local_send = false;
>-  int bandwidth = cricket::kAutoBandwidth;
>-  if (parameters.encodings.size() == 1u) {
>-    if (parameters.encodings[0].max_bitrate_bps) {
>-      bandwidth = *parameters.encodings[0].max_bitrate_bps;
>-    }
>-    local_send = parameters.encodings[0].active;
>-  }
>-  const bool local_recv =
>-      RtpTransceiverDirectionHasRecv(local_video_description_.direction());
>-  const auto local_direction =
>-      RtpTransceiverDirectionFromSendRecv(local_send, local_recv);
>-  if (primary_ssrc && !stream_params_result.value().empty()) {
>-    *primary_ssrc = stream_params_result.value()[0].first_ssrc();
>-  }
>-
>-  // Validation is done, so we can attempt applying the descriptions. Sent
>-  // codecs and header extensions go in remote description, streams go in
>-  // local.
>-  //
>-  // If there are no codecs or encodings, just leave the previous set of
>-  // codecs. The media engine doesn't like an empty set of codecs.
>-  if (local_video_description_.streams().empty() &&
>-      remote_video_description_.codecs().empty()) {
>-  } else {
>-    remote_video_description_.set_codecs(codecs_result.MoveValue());
>-  }
>-  remote_video_description_.set_rtp_header_extensions(
>-      extensions_result.MoveValue());
>-  remote_video_description_.set_bandwidth(bandwidth);
>-  local_video_description_.mutable_streams() = stream_params_result.MoveValue();
>-  // Direction set based on encoding "active" flag.
>- local_video_description_.set_direction(local_direction); >- remote_video_description_.set_direction( >- RtpTransceiverDirectionReversed(local_direction)); >- >- // Set remote content first, to ensure the stream is created with the correct >- // codec. >- if (!video_channel_->SetRemoteContent(&remote_video_description_, >- SdpType::kOffer, nullptr)) { >- LOG_AND_RETURN_ERROR(RTCErrorType::INTERNAL_ERROR, >- "Failed to apply remote parameters to media channel."); >- } >- if (!video_channel_->SetLocalContent(&local_video_description_, >- SdpType::kAnswer, nullptr)) { >- LOG_AND_RETURN_ERROR(RTCErrorType::INTERNAL_ERROR, >- "Failed to apply local parameters to media channel."); >- } >- return RTCError::OK(); >-} >- >-RTCError RtpTransportControllerAdapter::ValidateAndApplyAudioReceiverParameters( >- const RtpParameters& parameters) { >- RTC_DCHECK(voice_channel_); >- RTC_DCHECK(have_audio_receiver_); >- >- auto codecs_result = ToCricketCodecs<cricket::AudioCodec>(parameters.codecs); >- if (!codecs_result.ok()) { >- return codecs_result.MoveError(); >- } >- >- auto extensions_result = >- ToCricketRtpHeaderExtensions(parameters.header_extensions); >- if (!extensions_result.ok()) { >- return extensions_result.MoveError(); >- } >- >- auto stream_params_result = ToCricketStreamParamsVec(parameters.encodings); >- if (!stream_params_result.ok()) { >- return stream_params_result.MoveError(); >- } >- >- // Check that audio/video receive aren't using the same IDs to refer to >- // different things, if they share the same transport. 
>- if (inner_audio_transport_ == inner_video_transport_) { >- RTCError err = CheckForIdConflicts( >- codecs_result.value(), extensions_result.value(), >- stream_params_result.value(), local_video_description_.codecs(), >- local_video_description_.rtp_header_extensions(), >- remote_video_description_.streams()); >- if (!err.ok()) { >- return err; >- } >- } >- >- const bool local_send = >- RtpTransceiverDirectionHasSend(local_audio_description_.direction()); >- const bool local_recv = >- !parameters.encodings.empty() && parameters.encodings[0].active; >- const auto local_direction = >- RtpTransceiverDirectionFromSendRecv(local_send, local_recv); >- >- // Validation is done, so we can attempt applying the descriptions. Received >- // codecs and header extensions go in local description, streams go in >- // remote. >- // >- // If there are no codecs or encodings, just leave the previous set of >- // codecs. The media engine doesn't like an empty set of codecs. >- if (remote_audio_description_.streams().empty() && >- local_audio_description_.codecs().empty()) { >- } else { >- local_audio_description_.set_codecs(codecs_result.MoveValue()); >- } >- local_audio_description_.set_rtp_header_extensions( >- extensions_result.MoveValue()); >- remote_audio_description_.mutable_streams() = >- stream_params_result.MoveValue(); >- // Direction set based on encoding "active" flag. 
>- local_audio_description_.set_direction(local_direction); >- remote_audio_description_.set_direction( >- RtpTransceiverDirectionReversed(local_direction)); >- >- if (!voice_channel_->SetLocalContent(&local_audio_description_, >- SdpType::kOffer, nullptr)) { >- LOG_AND_RETURN_ERROR(RTCErrorType::INTERNAL_ERROR, >- "Failed to apply local parameters to media channel."); >- } >- if (!voice_channel_->SetRemoteContent(&remote_audio_description_, >- SdpType::kAnswer, nullptr)) { >- LOG_AND_RETURN_ERROR(RTCErrorType::INTERNAL_ERROR, >- "Failed to apply remote parameters to media channel."); >- } >- return RTCError::OK(); >-} >- >-RTCError RtpTransportControllerAdapter::ValidateAndApplyVideoReceiverParameters( >- const RtpParameters& parameters) { >- RTC_DCHECK(video_channel_); >- RTC_DCHECK(have_video_receiver_); >- >- auto codecs_result = ToCricketCodecs<cricket::VideoCodec>(parameters.codecs); >- if (!codecs_result.ok()) { >- return codecs_result.MoveError(); >- } >- >- auto extensions_result = >- ToCricketRtpHeaderExtensions(parameters.header_extensions); >- if (!extensions_result.ok()) { >- return extensions_result.MoveError(); >- } >- >- int bandwidth = cricket::kAutoBandwidth; >- auto stream_params_result = ToCricketStreamParamsVec(parameters.encodings); >- if (!stream_params_result.ok()) { >- return stream_params_result.MoveError(); >- } >- >- // Check that audio/video receiver aren't using the same IDs to refer to >- // different things, if they share the same transport. 
>- if (inner_audio_transport_ == inner_video_transport_) { >- RTCError err = CheckForIdConflicts( >- codecs_result.value(), extensions_result.value(), >- stream_params_result.value(), local_audio_description_.codecs(), >- local_audio_description_.rtp_header_extensions(), >- remote_audio_description_.streams()); >- if (!err.ok()) { >- return err; >- } >- } >- >- const bool local_send = >- RtpTransceiverDirectionHasSend(local_video_description_.direction()); >- const bool local_recv = >- !parameters.encodings.empty() && parameters.encodings[0].active; >- const auto local_direction = >- RtpTransceiverDirectionFromSendRecv(local_send, local_recv); >- >- // Validation is done, so we can attempt applying the descriptions. Received >- // codecs and header extensions go in local description, streams go in >- // remote. >- // >- // If there are no codecs or encodings, just leave the previous set of >- // codecs. The media engine doesn't like an empty set of codecs. >- if (remote_video_description_.streams().empty() && >- local_video_description_.codecs().empty()) { >- } else { >- local_video_description_.set_codecs(codecs_result.MoveValue()); >- } >- local_video_description_.set_rtp_header_extensions( >- extensions_result.MoveValue()); >- local_video_description_.set_bandwidth(bandwidth); >- remote_video_description_.mutable_streams() = >- stream_params_result.MoveValue(); >- // Direction set based on encoding "active" flag. 
>- local_video_description_.set_direction(local_direction); >- remote_video_description_.set_direction( >- RtpTransceiverDirectionReversed(local_direction)); >- >- if (!video_channel_->SetLocalContent(&local_video_description_, >- SdpType::kOffer, nullptr)) { >- LOG_AND_RETURN_ERROR(RTCErrorType::INTERNAL_ERROR, >- "Failed to apply local parameters to media channel."); >- } >- if (!video_channel_->SetRemoteContent(&remote_video_description_, >- SdpType::kAnswer, nullptr)) { >- LOG_AND_RETURN_ERROR(RTCErrorType::INTERNAL_ERROR, >- "Failed to apply remote parameters to media channel."); >- } >- return RTCError::OK(); >-} >- >-RtpTransportControllerAdapter::RtpTransportControllerAdapter( >- const cricket::MediaConfig& config, >- cricket::ChannelManager* channel_manager, >- webrtc::RtcEventLog* event_log, >- rtc::Thread* signaling_thread, >- rtc::Thread* worker_thread, >- rtc::Thread* network_thread) >- : signaling_thread_(signaling_thread), >- worker_thread_(worker_thread), >- network_thread_(network_thread), >- media_config_(config), >- channel_manager_(channel_manager), >- event_log_(event_log), >- call_send_rtp_transport_controller_(nullptr) { >- RTC_DCHECK_RUN_ON(signaling_thread_); >- RTC_DCHECK(channel_manager_); >- // Add "dummy" codecs to the descriptions, because the media engines >- // currently reject empty lists of codecs. Note that these codecs will never >- // actually be used, because when parameters are set, the dummy codecs will >- // be replaced by actual codecs before any send/receive streams are created. 
>- const cricket::AudioCodec dummy_audio(0, cricket::kPcmuCodecName, 8000, 0, 1); >- const cricket::VideoCodec dummy_video(96, cricket::kVp8CodecName); >- local_audio_description_.AddCodec(dummy_audio); >- remote_audio_description_.AddCodec(dummy_audio); >- local_video_description_.AddCodec(dummy_video); >- remote_video_description_.AddCodec(dummy_video); >- >- worker_thread_->Invoke<void>( >- RTC_FROM_HERE, rtc::Bind(&RtpTransportControllerAdapter::Init_w, this)); >-} >- >-// TODO(nisse): Duplicates corresponding method in PeerConnection (used >-// to be in MediaController). >-void RtpTransportControllerAdapter::Init_w() { >- RTC_DCHECK(worker_thread_->IsCurrent()); >- RTC_DCHECK(!call_); >- >- const int kMinBandwidthBps = 30000; >- const int kStartBandwidthBps = 300000; >- const int kMaxBandwidthBps = 2000000; >- >- webrtc::Call::Config call_config(event_log_); >- call_config.audio_state = channel_manager_->media_engine()->GetAudioState(); >- call_config.bitrate_config.min_bitrate_bps = kMinBandwidthBps; >- call_config.bitrate_config.start_bitrate_bps = kStartBandwidthBps; >- call_config.bitrate_config.max_bitrate_bps = kMaxBandwidthBps; >- std::unique_ptr<RtpTransportControllerSend> controller_send = >- absl::make_unique<RtpTransportControllerSend>( >- Clock::GetRealTimeClock(), event_log_, >- call_config.network_controller_factory, call_config.bitrate_config); >- call_send_rtp_transport_controller_ = controller_send.get(); >- call_.reset(webrtc::Call::Create(call_config, std::move(controller_send))); >-} >- >-void RtpTransportControllerAdapter::Close_w() { >- call_.reset(); >- call_send_rtp_transport_controller_ = nullptr; >-} >- >-RTCError RtpTransportControllerAdapter::AttachAudioSender( >- OrtcRtpSenderAdapter* sender, >- RtpTransportInterface* inner_transport) { >- if (have_audio_sender_) { >- LOG_AND_RETURN_ERROR(RTCErrorType::UNSUPPORTED_OPERATION, >- "Using two audio RtpSenders with the same " >- "RtpTransportControllerAdapter is not currently " >- 
"supported."); >- } >- if (inner_audio_transport_ && inner_audio_transport_ != inner_transport) { >- LOG_AND_RETURN_ERROR(RTCErrorType::UNSUPPORTED_OPERATION, >- "Using different transports for the audio " >- "RtpSender and RtpReceiver is not currently " >- "supported."); >- } >- >- // If setting new transport, extract its RTCP parameters and create voice >- // channel. >- if (!inner_audio_transport_) { >- CopyRtcpParametersToDescriptions(inner_transport->GetParameters().rtcp, >- &local_audio_description_, >- &remote_audio_description_); >- inner_audio_transport_ = inner_transport; >- CreateVoiceChannel(); >- } >- have_audio_sender_ = true; >- sender->SignalDestroyed.connect( >- this, &RtpTransportControllerAdapter::OnAudioSenderDestroyed); >- return RTCError::OK(); >-} >- >-RTCError RtpTransportControllerAdapter::AttachVideoSender( >- OrtcRtpSenderAdapter* sender, >- RtpTransportInterface* inner_transport) { >- if (have_video_sender_) { >- LOG_AND_RETURN_ERROR(RTCErrorType::UNSUPPORTED_OPERATION, >- "Using two video RtpSenders with the same " >- "RtpTransportControllerAdapter is not currently " >- "supported."); >- } >- if (inner_video_transport_ && inner_video_transport_ != inner_transport) { >- LOG_AND_RETURN_ERROR(RTCErrorType::UNSUPPORTED_OPERATION, >- "Using different transports for the video " >- "RtpSender and RtpReceiver is not currently " >- "supported."); >- } >- >- // If setting new transport, extract its RTCP parameters and create video >- // channel. 
>-  if (!inner_video_transport_) {
>-    CopyRtcpParametersToDescriptions(inner_transport->GetParameters().rtcp,
>-                                     &local_video_description_,
>-                                     &remote_video_description_);
>-    inner_video_transport_ = inner_transport;
>-    CreateVideoChannel();
>-  }
>-  have_video_sender_ = true;
>-  sender->SignalDestroyed.connect(
>-      this, &RtpTransportControllerAdapter::OnVideoSenderDestroyed);
>-  return RTCError::OK();
>-}
>-
>-RTCError RtpTransportControllerAdapter::AttachAudioReceiver(
>-    OrtcRtpReceiverAdapter* receiver,
>-    RtpTransportInterface* inner_transport) {
>-  if (have_audio_receiver_) {
>-    LOG_AND_RETURN_ERROR(RTCErrorType::UNSUPPORTED_OPERATION,
>-                         "Using two audio RtpReceivers with the same "
>-                         "RtpTransportControllerAdapter is not currently "
>-                         "supported.");
>-  }
>-  if (inner_audio_transport_ && inner_audio_transport_ != inner_transport) {
>-    LOG_AND_RETURN_ERROR(RTCErrorType::UNSUPPORTED_OPERATION,
>-                         "Using different transports for the audio "
>-                         "RtpSender and RtpReceiver is not currently "
>-                         "supported.");
>-  }
>-
>-  // If setting new transport, extract its RTCP parameters and create voice
>-  // channel.
>-  if (!inner_audio_transport_) {
>-    CopyRtcpParametersToDescriptions(inner_transport->GetParameters().rtcp,
>-                                     &local_audio_description_,
>-                                     &remote_audio_description_);
>-    inner_audio_transport_ = inner_transport;
>-    CreateVoiceChannel();
>-  }
>-  have_audio_receiver_ = true;
>-  receiver->SignalDestroyed.connect(
>-      this, &RtpTransportControllerAdapter::OnAudioReceiverDestroyed);
>-  return RTCError::OK();
>-}
>-
>-RTCError RtpTransportControllerAdapter::AttachVideoReceiver(
>-    OrtcRtpReceiverAdapter* receiver,
>-    RtpTransportInterface* inner_transport) {
>-  if (have_video_receiver_) {
>-    LOG_AND_RETURN_ERROR(RTCErrorType::UNSUPPORTED_OPERATION,
>-                         "Using two video RtpReceivers with the same "
>-                         "RtpTransportControllerAdapter is not currently "
>-                         "supported.");
>-  }
>-  if (inner_video_transport_ && inner_video_transport_ != inner_transport) {
>-    LOG_AND_RETURN_ERROR(RTCErrorType::UNSUPPORTED_OPERATION,
>-                         "Using different transports for the video "
>-                         "RtpSender and RtpReceiver is not currently "
>-                         "supported.");
>-  }
>-  // If setting new transport, extract its RTCP parameters and create video
>-  // channel.
>- if (!inner_video_transport_) { >- CopyRtcpParametersToDescriptions(inner_transport->GetParameters().rtcp, >- &local_video_description_, >- &remote_video_description_); >- inner_video_transport_ = inner_transport; >- CreateVideoChannel(); >- } >- have_video_receiver_ = true; >- receiver->SignalDestroyed.connect( >- this, &RtpTransportControllerAdapter::OnVideoReceiverDestroyed); >- return RTCError::OK(); >-} >- >-void RtpTransportControllerAdapter::OnRtpTransportDestroyed( >- RtpTransportAdapter* transport) { >- RTC_DCHECK_RUN_ON(signaling_thread_); >- auto it = std::find_if(transport_proxies_.begin(), transport_proxies_.end(), >- [transport](RtpTransportInterface* proxy) { >- return proxy->GetInternal() == transport; >- }); >- if (it == transport_proxies_.end()) { >- RTC_NOTREACHED(); >- return; >- } >- transport_proxies_.erase(it); >-} >- >-void RtpTransportControllerAdapter::OnAudioSenderDestroyed() { >- if (!have_audio_sender_) { >- RTC_NOTREACHED(); >- return; >- } >- // Empty parameters should result in sending being stopped. >- RTCError err = >- ValidateAndApplyAudioSenderParameters(RtpParameters(), nullptr); >- RTC_DCHECK(err.ok()); >- have_audio_sender_ = false; >- if (!have_audio_receiver_) { >- DestroyVoiceChannel(); >- } >-} >- >-void RtpTransportControllerAdapter::OnVideoSenderDestroyed() { >- if (!have_video_sender_) { >- RTC_NOTREACHED(); >- return; >- } >- // Empty parameters should result in sending being stopped. >- RTCError err = >- ValidateAndApplyVideoSenderParameters(RtpParameters(), nullptr); >- RTC_DCHECK(err.ok()); >- have_video_sender_ = false; >- if (!have_video_receiver_) { >- DestroyVideoChannel(); >- } >-} >- >-void RtpTransportControllerAdapter::OnAudioReceiverDestroyed() { >- if (!have_audio_receiver_) { >- RTC_NOTREACHED(); >- return; >- } >- // Empty parameters should result in receiving being stopped. 
>- RTCError err = ValidateAndApplyAudioReceiverParameters(RtpParameters()); >- RTC_DCHECK(err.ok()); >- have_audio_receiver_ = false; >- if (!have_audio_sender_) { >- DestroyVoiceChannel(); >- } >-} >- >-void RtpTransportControllerAdapter::OnVideoReceiverDestroyed() { >- if (!have_video_receiver_) { >- RTC_NOTREACHED(); >- return; >- } >- // Empty parameters should result in receiving being stopped. >- RTCError err = ValidateAndApplyVideoReceiverParameters(RtpParameters()); >- RTC_DCHECK(err.ok()); >- have_video_receiver_ = false; >- if (!have_video_sender_) { >- DestroyVideoChannel(); >- } >-} >- >-void RtpTransportControllerAdapter::CreateVoiceChannel() { >- voice_channel_ = channel_manager_->CreateVoiceChannel( >- call_.get(), media_config_, inner_audio_transport_->GetInternal(), >- signaling_thread_, "audio", false, rtc::CryptoOptions(), >- cricket::AudioOptions()); >- RTC_DCHECK(voice_channel_); >- voice_channel_->Enable(true); >-} >- >-void RtpTransportControllerAdapter::CreateVideoChannel() { >- video_channel_ = channel_manager_->CreateVideoChannel( >- call_.get(), media_config_, inner_video_transport_->GetInternal(), >- signaling_thread_, "video", false, rtc::CryptoOptions(), >- cricket::VideoOptions()); >- RTC_DCHECK(video_channel_); >- video_channel_->Enable(true); >-} >- >-void RtpTransportControllerAdapter::DestroyVoiceChannel() { >- RTC_DCHECK(voice_channel_); >- channel_manager_->DestroyVoiceChannel(voice_channel_); >- voice_channel_ = nullptr; >- inner_audio_transport_ = nullptr; >-} >- >-void RtpTransportControllerAdapter::DestroyVideoChannel() { >- RTC_DCHECK(video_channel_); >- channel_manager_->DestroyVideoChannel(video_channel_); >- video_channel_ = nullptr; >- inner_video_transport_ = nullptr; >-} >- >-void RtpTransportControllerAdapter::CopyRtcpParametersToDescriptions( >- const RtcpParameters& params, >- cricket::MediaContentDescription* local, >- cricket::MediaContentDescription* remote) { >- local->set_rtcp_mux(params.mux); >- 
remote->set_rtcp_mux(params.mux); >- local->set_rtcp_reduced_size(params.reduced_size); >- remote->set_rtcp_reduced_size(params.reduced_size); >- for (cricket::StreamParams& stream_params : local->mutable_streams()) { >- stream_params.cname = params.cname; >- } >-} >- >-uint32_t RtpTransportControllerAdapter::GenerateUnusedSsrc( >- std::set<uint32_t>* new_ssrcs) const { >- uint32_t ssrc; >- do { >- ssrc = rtc::CreateRandomNonZeroId(); >- } while ( >- cricket::GetStreamBySsrc(local_audio_description_.streams(), ssrc) || >- cricket::GetStreamBySsrc(remote_audio_description_.streams(), ssrc) || >- cricket::GetStreamBySsrc(local_video_description_.streams(), ssrc) || >- cricket::GetStreamBySsrc(remote_video_description_.streams(), ssrc) || >- !new_ssrcs->insert(ssrc).second); >- return ssrc; >-} >- >-RTCErrorOr<cricket::StreamParamsVec> >-RtpTransportControllerAdapter::MakeSendStreamParamsVec( >- std::vector<RtpEncodingParameters> encodings, >- const std::string& cname, >- const cricket::MediaContentDescription& description) const { >- if (encodings.size() > 1u) { >- LOG_AND_RETURN_ERROR(webrtc::RTCErrorType::UNSUPPORTED_PARAMETER, >- "ORTC API implementation doesn't currently " >- "support simulcast or layered encodings."); >- } else if (encodings.empty()) { >- return cricket::StreamParamsVec(); >- } >- RtpEncodingParameters& encoding = encodings[0]; >- std::set<uint32_t> new_ssrcs; >- if (encoding.ssrc) { >- new_ssrcs.insert(*encoding.ssrc); >- } >- if (encoding.rtx && encoding.rtx->ssrc) { >- new_ssrcs.insert(*encoding.rtx->ssrc); >- } >- // May need to fill missing SSRCs with generated ones. 
>- if (!encoding.ssrc) { >- if (!description.streams().empty()) { >- encoding.ssrc.emplace(description.streams()[0].first_ssrc()); >- } else { >- encoding.ssrc.emplace(GenerateUnusedSsrc(&new_ssrcs)); >- } >- } >- if (encoding.rtx && !encoding.rtx->ssrc) { >- uint32_t existing_rtx_ssrc; >- if (!description.streams().empty() && >- description.streams()[0].GetFidSsrc( >- description.streams()[0].first_ssrc(), &existing_rtx_ssrc)) { >- encoding.rtx->ssrc.emplace(existing_rtx_ssrc); >- } else { >- encoding.rtx->ssrc.emplace(GenerateUnusedSsrc(&new_ssrcs)); >- } >- } >- >- auto result = ToCricketStreamParamsVec(encodings); >- if (!result.ok()) { >- return result.MoveError(); >- } >- // If conversion was successful, there should be one StreamParams. >- RTC_DCHECK_EQ(1u, result.value().size()); >- result.value()[0].cname = cname; >- return result; >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/rtptransportcontrolleradapter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/rtptransportcontrolleradapter.h >deleted file mode 100644 >index 14c578aa46f5caabb770d2340de7d44c59cab5bb..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/rtptransportcontrolleradapter.h >+++ /dev/null >@@ -1,221 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#ifndef ORTC_RTPTRANSPORTCONTROLLERADAPTER_H_ >-#define ORTC_RTPTRANSPORTCONTROLLERADAPTER_H_ >- >-#include <memory> >-#include <set> >-#include <string> >-#include <vector> >- >-#include "api/ortc/ortcrtpreceiverinterface.h" >-#include "api/ortc/ortcrtpsenderinterface.h" >-#include "api/ortc/rtptransportcontrollerinterface.h" >-#include "api/ortc/srtptransportinterface.h" >-#include "call/call.h" >-#include "call/rtp_transport_controller_send.h" >-#include "logging/rtc_event_log/rtc_event_log.h" >-#include "media/base/mediachannel.h" // For MediaConfig. >-#include "pc/channelmanager.h" >-#include "rtc_base/constructormagic.h" >-#include "rtc_base/third_party/sigslot/sigslot.h" >-#include "rtc_base/thread.h" >- >-namespace webrtc { >- >-class RtpTransportAdapter; >-class OrtcRtpSenderAdapter; >-class OrtcRtpReceiverAdapter; >- >-// Implementation of RtpTransportControllerInterface. Wraps a Call, >-// a VoiceChannel and VideoChannel, and maintains a list of dependent RTP >-// transports. >-// >-// When used along with an RtpSenderAdapter or RtpReceiverAdapter, the >-// sender/receiver passes its parameters along to this class, which turns them >-// into cricket:: media descriptions (the interface used by BaseChannel). >-// >-// Due to the fact that BaseChannel has different subclasses for audio/video, >-// the actual BaseChannel object is not created until an RtpSender/RtpReceiver >-// needs them. >-// >-// All methods should be called on the signaling thread. >-// >-// TODO(deadbeef): When BaseChannel is split apart into separate >-// "RtpSender"/"RtpTransceiver"/"RtpSender"/"RtpReceiver" objects, this adapter >-// object can be replaced by a "real" one. >-class RtpTransportControllerAdapter : public RtpTransportControllerInterface, >- public sigslot::has_slots<> { >- public: >- // Creates a proxy that will call "public interface" methods on the correct >- // thread. >- // >- // Doesn't take ownership of any objects passed in. 
>- // >- // |channel_manager| must not be null. >- static std::unique_ptr<RtpTransportControllerInterface> CreateProxied( >- const cricket::MediaConfig& config, >- cricket::ChannelManager* channel_manager, >- webrtc::RtcEventLog* event_log, >- rtc::Thread* signaling_thread, >- rtc::Thread* worker_thread, >- rtc::Thread* network_thread); >- >- ~RtpTransportControllerAdapter() override; >- >- // RtpTransportControllerInterface implementation. >- std::vector<RtpTransportInterface*> GetTransports() const override; >- >- // These methods are used by OrtcFactory to create RtpTransports, RtpSenders >- // and RtpReceivers using this controller. Called "CreateProxied" because >- // these methods return proxies that will safely call methods on the correct >- // thread. >- RTCErrorOr<std::unique_ptr<RtpTransportInterface>> CreateProxiedRtpTransport( >- const RtpTransportParameters& rtcp_parameters, >- PacketTransportInterface* rtp, >- PacketTransportInterface* rtcp); >- >- RTCErrorOr<std::unique_ptr<SrtpTransportInterface>> >- CreateProxiedSrtpTransport(const RtpTransportParameters& rtcp_parameters, >- PacketTransportInterface* rtp, >- PacketTransportInterface* rtcp); >- >- // |transport_proxy| needs to be a proxy to a transport because the >- // application may call GetTransport() on the returned sender or receiver, >- // and expects it to return a thread-safe transport proxy. >- RTCErrorOr<std::unique_ptr<OrtcRtpSenderInterface>> CreateProxiedRtpSender( >- cricket::MediaType kind, >- RtpTransportInterface* transport_proxy); >- RTCErrorOr<std::unique_ptr<OrtcRtpReceiverInterface>> >- CreateProxiedRtpReceiver(cricket::MediaType kind, >- RtpTransportInterface* transport_proxy); >- >- // Methods used internally by other "adapter" classes. 
>- rtc::Thread* signaling_thread() const { return signaling_thread_; } >- rtc::Thread* worker_thread() const { return worker_thread_; } >- rtc::Thread* network_thread() const { return network_thread_; } >- >- // |parameters.keepalive| will be set for ALL RTP transports in the call. >- RTCError SetRtpTransportParameters(const RtpTransportParameters& parameters, >- RtpTransportInterface* inner_transport); >- void SetRtpTransportParameters_w(const RtpTransportParameters& parameters); >- >- cricket::VoiceChannel* voice_channel() { return voice_channel_; } >- cricket::VideoChannel* video_channel() { return video_channel_; } >- >- // |primary_ssrc| out parameter is filled with either >- // |parameters.encodings[0].ssrc|, or a generated SSRC if that's left unset. >- RTCError ValidateAndApplyAudioSenderParameters( >- const RtpParameters& parameters, >- uint32_t* primary_ssrc); >- RTCError ValidateAndApplyVideoSenderParameters( >- const RtpParameters& parameters, >- uint32_t* primary_ssrc); >- RTCError ValidateAndApplyAudioReceiverParameters( >- const RtpParameters& parameters); >- RTCError ValidateAndApplyVideoReceiverParameters( >- const RtpParameters& parameters); >- >- protected: >- RtpTransportControllerAdapter* GetInternal() override { return this; } >- >- private: >- // Only expected to be called by RtpTransportControllerAdapter::CreateProxied. >- RtpTransportControllerAdapter(const cricket::MediaConfig& config, >- cricket::ChannelManager* channel_manager, >- webrtc::RtcEventLog* event_log, >- rtc::Thread* signaling_thread, >- rtc::Thread* worker_thread, >- rtc::Thread* network_thread); >- void Init_w(); >- void Close_w(); >- >- // These return an error if another of the same type of object is already >- // attached, or if |transport_proxy| can't be used with the sender/receiver >- // due to the limitation that the sender/receiver of the same media type must >- // use the same transport. 
>- RTCError AttachAudioSender(OrtcRtpSenderAdapter* sender, >- RtpTransportInterface* inner_transport); >- RTCError AttachVideoSender(OrtcRtpSenderAdapter* sender, >- RtpTransportInterface* inner_transport); >- RTCError AttachAudioReceiver(OrtcRtpReceiverAdapter* receiver, >- RtpTransportInterface* inner_transport); >- RTCError AttachVideoReceiver(OrtcRtpReceiverAdapter* receiver, >- RtpTransportInterface* inner_transport); >- >- void OnRtpTransportDestroyed(RtpTransportAdapter* transport); >- >- void OnAudioSenderDestroyed(); >- void OnVideoSenderDestroyed(); >- void OnAudioReceiverDestroyed(); >- void OnVideoReceiverDestroyed(); >- >- void CreateVoiceChannel(); >- void CreateVideoChannel(); >- void DestroyVoiceChannel(); >- void DestroyVideoChannel(); >- >- void CopyRtcpParametersToDescriptions( >- const RtcpParameters& params, >- cricket::MediaContentDescription* local, >- cricket::MediaContentDescription* remote); >- >- // Helper function to generate an SSRC that doesn't match one in any of the >- // "content description" structs, or in |new_ssrcs| (which is needed since >- // multiple SSRCs may be generated in one go). >- uint32_t GenerateUnusedSsrc(std::set<uint32_t>* new_ssrcs) const; >- >- // |description| is the matching description where existing SSRCs can be >- // found. >- // >- // This is a member function because it may need to generate SSRCs that don't >- // match existing ones, which is more than ToStreamParamsVec does. >- RTCErrorOr<cricket::StreamParamsVec> MakeSendStreamParamsVec( >- std::vector<RtpEncodingParameters> encodings, >- const std::string& cname, >- const cricket::MediaContentDescription& description) const; >- >- rtc::Thread* signaling_thread_; >- rtc::Thread* worker_thread_; >- rtc::Thread* network_thread_; >- // |transport_proxies_| and |inner_audio_transport_|/|inner_audio_transport_| >- // are somewhat redundant, but the latter are only set when >- // RtpSenders/RtpReceivers are attached to the transport. 
>- std::vector<RtpTransportInterface*> transport_proxies_; >- RtpTransportInterface* inner_audio_transport_ = nullptr; >- RtpTransportInterface* inner_video_transport_ = nullptr; >- const cricket::MediaConfig media_config_; >- RtpKeepAliveConfig keepalive_; >- cricket::ChannelManager* channel_manager_; >- webrtc::RtcEventLog* event_log_; >- std::unique_ptr<Call> call_; >- webrtc::RtpTransportControllerSend* call_send_rtp_transport_controller_; >- >- // BaseChannel takes content descriptions as input, so we store them here >- // such that they can be updated when a new RtpSenderAdapter/ >- // RtpReceiverAdapter attaches itself. >- cricket::AudioContentDescription local_audio_description_; >- cricket::AudioContentDescription remote_audio_description_; >- cricket::VideoContentDescription local_video_description_; >- cricket::VideoContentDescription remote_video_description_; >- cricket::VoiceChannel* voice_channel_ = nullptr; >- cricket::VideoChannel* video_channel_ = nullptr; >- bool have_audio_sender_ = false; >- bool have_video_sender_ = false; >- bool have_audio_receiver_ = false; >- bool have_video_receiver_ = false; >- >- RTC_DISALLOW_IMPLICIT_CONSTRUCTORS(RtpTransportControllerAdapter); >-}; >- >-} // namespace webrtc >- >-#endif // ORTC_RTPTRANSPORTCONTROLLERADAPTER_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/srtptransport_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/srtptransport_unittest.cc >deleted file mode 100644 >index 1506aa70414e29ae27b0fb9ad81d2d498fe212ac..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/srtptransport_unittest.cc >+++ /dev/null >@@ -1,125 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. 
All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include <memory> >- >-#include "api/audio_codecs/builtin_audio_decoder_factory.h" >-#include "api/audio_codecs/builtin_audio_encoder_factory.h" >-#include "media/base/fakemediaengine.h" >-#include "ortc/ortcfactory.h" >-#include "ortc/testrtpparameters.h" >-#include "p2p/base/fakepackettransport.h" >-#include "rtc_base/gunit.h" >- >-namespace webrtc { >- >-static const char kTestSha1KeyParams1[] = >- "inline:WVNfX19zZW1jdGwgKCkgewkyMjA7fQp9CnVubGVz"; >-static const char kTestSha1KeyParams2[] = >- "inline:PS1uQCVeeCFCanVmcjkpPywjNWhcYD0mXXtxaVBR"; >-static const char kTestGcmKeyParams3[] = >- "inline:e166KFlKzJsGW0d5apX+rrI05vxbrvMJEzFI14aTDCa63IRTlLK4iH66uOI="; >- >-static const cricket::CryptoParams kTestSha1CryptoParams1( >- 1, >- "AES_CM_128_HMAC_SHA1_80", >- kTestSha1KeyParams1, >- ""); >-static const cricket::CryptoParams kTestSha1CryptoParams2( >- 1, >- "AES_CM_128_HMAC_SHA1_80", >- kTestSha1KeyParams2, >- ""); >-static const cricket::CryptoParams kTestGcmCryptoParams3(1, >- "AEAD_AES_256_GCM", >- kTestGcmKeyParams3, >- ""); >- >-// This test uses fake packet transports and a fake media engine, in order to >-// test the SrtpTransport at only an API level. Any end-to-end test should go in >-// ortcfactory_integrationtest.cc instead. >-class SrtpTransportTest : public testing::Test { >- public: >- SrtpTransportTest() { >- fake_media_engine_ = new cricket::FakeMediaEngine(); >- // Note: This doesn't need to use fake network classes, since it uses >- // FakePacketTransports. 
>- auto result = OrtcFactory::Create( >- nullptr, nullptr, nullptr, nullptr, nullptr, >- std::unique_ptr<cricket::MediaEngineInterface>(fake_media_engine_), >- CreateBuiltinAudioEncoderFactory(), CreateBuiltinAudioDecoderFactory()); >- ortc_factory_ = result.MoveValue(); >- rtp_transport_controller_ = >- ortc_factory_->CreateRtpTransportController().MoveValue(); >- >- fake_packet_transport_.reset(new rtc::FakePacketTransport("fake")); >- auto srtp_transport_result = ortc_factory_->CreateSrtpTransport( >- rtp_transport_parameters_, fake_packet_transport_.get(), nullptr, >- rtp_transport_controller_.get()); >- srtp_transport_ = srtp_transport_result.MoveValue(); >- } >- >- protected: >- // Owned by |ortc_factory_|. >- cricket::FakeMediaEngine* fake_media_engine_; >- std::unique_ptr<OrtcFactoryInterface> ortc_factory_; >- std::unique_ptr<RtpTransportControllerInterface> rtp_transport_controller_; >- std::unique_ptr<SrtpTransportInterface> srtp_transport_; >- RtpTransportParameters rtp_transport_parameters_; >- std::unique_ptr<rtc::FakePacketTransport> fake_packet_transport_; >-}; >- >-// Tests that setting the SRTP send/receive key succeeds. >-TEST_F(SrtpTransportTest, SetSrtpSendAndReceiveKey) { >- EXPECT_TRUE(srtp_transport_->SetSrtpSendKey(kTestSha1CryptoParams1).ok()); >- EXPECT_TRUE(srtp_transport_->SetSrtpReceiveKey(kTestSha1CryptoParams2).ok()); >- auto sender_result = ortc_factory_->CreateRtpSender(cricket::MEDIA_TYPE_AUDIO, >- srtp_transport_.get()); >- EXPECT_TRUE(sender_result.ok()); >- auto receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_AUDIO, srtp_transport_.get()); >- EXPECT_TRUE(receiver_result.ok()); >-} >- >-// Tests that setting the SRTP send/receive key twice is not supported. 
>-TEST_F(SrtpTransportTest, SetSrtpSendAndReceiveKeyTwice) { >- EXPECT_TRUE(srtp_transport_->SetSrtpSendKey(kTestSha1CryptoParams1).ok()); >- EXPECT_TRUE(srtp_transport_->SetSrtpReceiveKey(kTestSha1CryptoParams2).ok()); >- EXPECT_EQ(RTCErrorType::UNSUPPORTED_OPERATION, >- srtp_transport_->SetSrtpSendKey(kTestSha1CryptoParams2).type()); >- EXPECT_EQ(RTCErrorType::UNSUPPORTED_OPERATION, >- srtp_transport_->SetSrtpReceiveKey(kTestSha1CryptoParams1).type()); >- // Ensure that the senders and receivers can be created despite the previous >- // errors. >- auto sender_result = ortc_factory_->CreateRtpSender(cricket::MEDIA_TYPE_AUDIO, >- srtp_transport_.get()); >- EXPECT_TRUE(sender_result.ok()); >- auto receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_AUDIO, srtp_transport_.get()); >- EXPECT_TRUE(receiver_result.ok()); >-} >- >-// Test that the SRTP send key and receive key must have the same cipher suite. >-TEST_F(SrtpTransportTest, SetSrtpSendAndReceiveKeyDifferentCipherSuite) { >- EXPECT_TRUE(srtp_transport_->SetSrtpSendKey(kTestSha1CryptoParams1).ok()); >- EXPECT_EQ(RTCErrorType::UNSUPPORTED_OPERATION, >- srtp_transport_->SetSrtpReceiveKey(kTestGcmCryptoParams3).type()); >- EXPECT_TRUE(srtp_transport_->SetSrtpReceiveKey(kTestSha1CryptoParams2).ok()); >- // Ensure that the senders and receivers can be created despite the previous >- // error. 
>- auto sender_result = ortc_factory_->CreateRtpSender(cricket::MEDIA_TYPE_AUDIO, >- srtp_transport_.get()); >- EXPECT_TRUE(sender_result.ok()); >- auto receiver_result = ortc_factory_->CreateRtpReceiver( >- cricket::MEDIA_TYPE_AUDIO, srtp_transport_.get()); >- EXPECT_TRUE(receiver_result.ok()); >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/testrtpparameters.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/testrtpparameters.cc >deleted file mode 100644 >index 82d96590fb9655ea7885305724893116f67af29f..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/testrtpparameters.cc >+++ /dev/null >@@ -1,315 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include "ortc/testrtpparameters.h" >- >-#include <algorithm> >-#include <utility> >- >-namespace webrtc { >- >-RtpParameters MakeMinimalOpusParameters() { >- RtpParameters parameters; >- RtpCodecParameters opus_codec; >- opus_codec.name = "opus"; >- opus_codec.kind = cricket::MEDIA_TYPE_AUDIO; >- opus_codec.payload_type = 111; >- opus_codec.clock_rate.emplace(48000); >- opus_codec.num_channels.emplace(2); >- parameters.codecs.push_back(std::move(opus_codec)); >- RtpEncodingParameters encoding; >- encoding.codec_payload_type.emplace(111); >- parameters.encodings.push_back(std::move(encoding)); >- return parameters; >-} >- >-RtpParameters MakeMinimalIsacParameters() { >- RtpParameters parameters; >- RtpCodecParameters isac_codec; >- isac_codec.name = "ISAC"; >- isac_codec.kind = cricket::MEDIA_TYPE_AUDIO; >- isac_codec.payload_type = 103; >- isac_codec.clock_rate.emplace(16000); >- parameters.codecs.push_back(std::move(isac_codec)); >- RtpEncodingParameters encoding; >- encoding.codec_payload_type.emplace(111); >- parameters.encodings.push_back(std::move(encoding)); >- return parameters; >-} >- >-RtpParameters MakeMinimalOpusParametersWithSsrc(uint32_t ssrc) { >- RtpParameters parameters = MakeMinimalOpusParameters(); >- parameters.encodings[0].ssrc.emplace(ssrc); >- return parameters; >-} >- >-RtpParameters MakeMinimalIsacParametersWithSsrc(uint32_t ssrc) { >- RtpParameters parameters = MakeMinimalIsacParameters(); >- parameters.encodings[0].ssrc.emplace(ssrc); >- return parameters; >-} >- >-RtpParameters MakeMinimalVideoParameters(const char* codec_name) { >- RtpParameters parameters; >- RtpCodecParameters vp8_codec; >- vp8_codec.name = codec_name; >- vp8_codec.kind = cricket::MEDIA_TYPE_VIDEO; >- vp8_codec.payload_type = 96; >- parameters.codecs.push_back(std::move(vp8_codec)); >- RtpEncodingParameters encoding; >- encoding.codec_payload_type.emplace(96); >- parameters.encodings.push_back(std::move(encoding)); >- return parameters; >-} >- 
>-RtpParameters MakeMinimalVp8Parameters() { >- return MakeMinimalVideoParameters("VP8"); >-} >- >-RtpParameters MakeMinimalVp9Parameters() { >- return MakeMinimalVideoParameters("VP9"); >-} >- >-RtpParameters MakeMinimalVp8ParametersWithSsrc(uint32_t ssrc) { >- RtpParameters parameters = MakeMinimalVp8Parameters(); >- parameters.encodings[0].ssrc.emplace(ssrc); >- return parameters; >-} >- >-RtpParameters MakeMinimalVp9ParametersWithSsrc(uint32_t ssrc) { >- RtpParameters parameters = MakeMinimalVp9Parameters(); >- parameters.encodings[0].ssrc.emplace(ssrc); >- return parameters; >-} >- >-// Note: Currently, these "WithNoSsrc" methods are identical to the normal >-// "MakeMinimal" methods, but with the added guarantee that they will never be >-// changed to include an SSRC. >- >-RtpParameters MakeMinimalOpusParametersWithNoSsrc() { >- RtpParameters parameters = MakeMinimalOpusParameters(); >- RTC_DCHECK(!parameters.encodings[0].ssrc); >- return parameters; >-} >- >-RtpParameters MakeMinimalIsacParametersWithNoSsrc() { >- RtpParameters parameters = MakeMinimalIsacParameters(); >- RTC_DCHECK(!parameters.encodings[0].ssrc); >- return parameters; >-} >- >-RtpParameters MakeMinimalVp8ParametersWithNoSsrc() { >- RtpParameters parameters = MakeMinimalVp8Parameters(); >- RTC_DCHECK(!parameters.encodings[0].ssrc); >- return parameters; >-} >- >-RtpParameters MakeMinimalVp9ParametersWithNoSsrc() { >- RtpParameters parameters = MakeMinimalVp9Parameters(); >- RTC_DCHECK(!parameters.encodings[0].ssrc); >- return parameters; >-} >- >-// Make audio parameters with all the available properties configured and >-// features used, and with multiple codecs offered. Obtained by taking a >-// snapshot of a default PeerConnection offer (and adding other things, like >-// bitrate limit). >-// >-// See "MakeFullOpusParameters"/"MakeFullIsacParameters" below. 
>-RtpParameters MakeFullAudioParameters(int preferred_payload_type) { >- RtpParameters parameters; >- >- RtpCodecParameters opus_codec; >- opus_codec.name = "opus"; >- opus_codec.kind = cricket::MEDIA_TYPE_AUDIO; >- opus_codec.payload_type = 111; >- opus_codec.clock_rate.emplace(48000); >- opus_codec.num_channels.emplace(2); >- opus_codec.parameters["minptime"] = "10"; >- opus_codec.parameters["useinbandfec"] = "1"; >- opus_codec.parameters["usedtx"] = "1"; >- opus_codec.parameters["stereo"] = "1"; >- opus_codec.rtcp_feedback.emplace_back(RtcpFeedbackType::TRANSPORT_CC); >- parameters.codecs.push_back(std::move(opus_codec)); >- >- RtpCodecParameters isac_codec; >- isac_codec.name = "ISAC"; >- isac_codec.kind = cricket::MEDIA_TYPE_AUDIO; >- isac_codec.payload_type = 103; >- isac_codec.clock_rate.emplace(16000); >- parameters.codecs.push_back(std::move(isac_codec)); >- >- RtpCodecParameters cn_codec; >- cn_codec.name = "CN"; >- cn_codec.kind = cricket::MEDIA_TYPE_AUDIO; >- cn_codec.payload_type = 106; >- cn_codec.clock_rate.emplace(32000); >- parameters.codecs.push_back(std::move(cn_codec)); >- >- RtpCodecParameters dtmf_codec; >- dtmf_codec.name = "telephone-event"; >- dtmf_codec.kind = cricket::MEDIA_TYPE_AUDIO; >- dtmf_codec.payload_type = 126; >- dtmf_codec.clock_rate.emplace(8000); >- parameters.codecs.push_back(std::move(dtmf_codec)); >- >- // "codec_payload_type" isn't implemented, so we need to reorder codecs to >- // cause one to be used. >- // TODO(deadbeef): Remove this when it becomes unnecessary. >- auto it = std::find_if(parameters.codecs.begin(), parameters.codecs.end(), >- [preferred_payload_type](const RtpCodecParameters& p) { >- return p.payload_type == preferred_payload_type; >- }); >- RtpCodecParameters preferred = *it; >- parameters.codecs.erase(it); >- parameters.codecs.insert(parameters.codecs.begin(), preferred); >- >- // Intentionally leave out SSRC so one's chosen automatically. 
>- RtpEncodingParameters encoding; >- encoding.codec_payload_type.emplace(preferred_payload_type); >- encoding.dtx.emplace(DtxStatus::ENABLED); >- // 20 kbps. >- encoding.max_bitrate_bps.emplace(20000); >- parameters.encodings.push_back(std::move(encoding)); >- >- parameters.header_extensions.emplace_back( >- "urn:ietf:params:rtp-hdrext:ssrc-audio-level", 1); >- return parameters; >-} >- >-RtpParameters MakeFullOpusParameters() { >- return MakeFullAudioParameters(111); >-} >- >-RtpParameters MakeFullIsacParameters() { >- return MakeFullAudioParameters(103); >-} >- >-// Make video parameters with all the available properties configured and >-// features used, and with multiple codecs offered. Obtained by taking a >-// snapshot of a default PeerConnection offer (and adding other things, like >-// bitrate limit). >-// >-// See "MakeFullVp8Parameters"/"MakeFullVp9Parameters" below. >-RtpParameters MakeFullVideoParameters(int preferred_payload_type) { >- RtpParameters parameters; >- >- RtpCodecParameters vp8_codec; >- vp8_codec.name = "VP8"; >- vp8_codec.kind = cricket::MEDIA_TYPE_VIDEO; >- vp8_codec.payload_type = 100; >- vp8_codec.clock_rate.emplace(90000); >- vp8_codec.rtcp_feedback.emplace_back(RtcpFeedbackType::CCM, >- RtcpFeedbackMessageType::FIR); >- vp8_codec.rtcp_feedback.emplace_back(RtcpFeedbackType::NACK, >- RtcpFeedbackMessageType::GENERIC_NACK); >- vp8_codec.rtcp_feedback.emplace_back(RtcpFeedbackType::NACK, >- RtcpFeedbackMessageType::PLI); >- vp8_codec.rtcp_feedback.emplace_back(RtcpFeedbackType::REMB); >- vp8_codec.rtcp_feedback.emplace_back(RtcpFeedbackType::TRANSPORT_CC); >- parameters.codecs.push_back(std::move(vp8_codec)); >- >- RtpCodecParameters vp8_rtx_codec; >- vp8_rtx_codec.name = "rtx"; >- vp8_rtx_codec.kind = cricket::MEDIA_TYPE_VIDEO; >- vp8_rtx_codec.payload_type = 96; >- vp8_rtx_codec.clock_rate.emplace(90000); >- vp8_rtx_codec.parameters["apt"] = "100"; >- parameters.codecs.push_back(std::move(vp8_rtx_codec)); >- >- RtpCodecParameters 
vp9_codec; >- vp9_codec.name = "VP9"; >- vp9_codec.kind = cricket::MEDIA_TYPE_VIDEO; >- vp9_codec.payload_type = 101; >- vp9_codec.clock_rate.emplace(90000); >- vp9_codec.rtcp_feedback.emplace_back(RtcpFeedbackType::CCM, >- RtcpFeedbackMessageType::FIR); >- vp9_codec.rtcp_feedback.emplace_back(RtcpFeedbackType::NACK, >- RtcpFeedbackMessageType::GENERIC_NACK); >- vp9_codec.rtcp_feedback.emplace_back(RtcpFeedbackType::NACK, >- RtcpFeedbackMessageType::PLI); >- vp9_codec.rtcp_feedback.emplace_back(RtcpFeedbackType::REMB); >- vp9_codec.rtcp_feedback.emplace_back(RtcpFeedbackType::TRANSPORT_CC); >- parameters.codecs.push_back(std::move(vp9_codec)); >- >- RtpCodecParameters vp9_rtx_codec; >- vp9_rtx_codec.name = "rtx"; >- vp9_rtx_codec.kind = cricket::MEDIA_TYPE_VIDEO; >- vp9_rtx_codec.payload_type = 97; >- vp9_rtx_codec.clock_rate.emplace(90000); >- vp9_rtx_codec.parameters["apt"] = "101"; >- parameters.codecs.push_back(std::move(vp9_rtx_codec)); >- >- RtpCodecParameters red_codec; >- red_codec.name = "red"; >- red_codec.kind = cricket::MEDIA_TYPE_VIDEO; >- red_codec.payload_type = 116; >- red_codec.clock_rate.emplace(90000); >- parameters.codecs.push_back(std::move(red_codec)); >- >- RtpCodecParameters red_rtx_codec; >- red_rtx_codec.name = "rtx"; >- red_rtx_codec.kind = cricket::MEDIA_TYPE_VIDEO; >- red_rtx_codec.payload_type = 98; >- red_rtx_codec.clock_rate.emplace(90000); >- red_rtx_codec.parameters["apt"] = "116"; >- parameters.codecs.push_back(std::move(red_rtx_codec)); >- >- RtpCodecParameters ulpfec_codec; >- ulpfec_codec.name = "ulpfec"; >- ulpfec_codec.kind = cricket::MEDIA_TYPE_VIDEO; >- ulpfec_codec.payload_type = 117; >- ulpfec_codec.clock_rate.emplace(90000); >- parameters.codecs.push_back(std::move(ulpfec_codec)); >- >- // "codec_payload_type" isn't implemented, so we need to reorder codecs to >- // cause one to be used. >- // TODO(deadbeef): Remove this when it becomes unnecessary. 
>- auto it = std::find_if(parameters.codecs.begin(), parameters.codecs.end(), >- [preferred_payload_type](const RtpCodecParameters& p) { >- return p.payload_type == preferred_payload_type; >- }); >- RtpCodecParameters preferred = *it; >- parameters.codecs.erase(it); >- parameters.codecs.insert(parameters.codecs.begin(), preferred); >- >- // Intentionally leave out SSRC so one's chosen automatically. >- RtpEncodingParameters encoding; >- encoding.codec_payload_type.emplace(preferred_payload_type); >- encoding.fec.emplace(FecMechanism::RED_AND_ULPFEC); >- // Will create default RtxParameters, with unset SSRC. >- encoding.rtx.emplace(); >- // 100 kbps. >- encoding.max_bitrate_bps.emplace(100000); >- parameters.encodings.push_back(std::move(encoding)); >- >- parameters.header_extensions.emplace_back( >- "urn:ietf:params:rtp-hdrext:toffset", 2); >- parameters.header_extensions.emplace_back( >- "http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time", 3); >- parameters.header_extensions.emplace_back("urn:3gpp:video-orientation", 4); >- parameters.header_extensions.emplace_back( >- "http://www.ietf.org/id/" >- "draft-holmer-rmcat-transport-wide-cc-extensions-01", >- 5); >- parameters.header_extensions.emplace_back( >- "http://www.webrtc.org/experiments/rtp-hdrext/playout-delay", 6); >- return parameters; >-} >- >-RtpParameters MakeFullVp8Parameters() { >- return MakeFullVideoParameters(100); >-} >- >-RtpParameters MakeFullVp9Parameters() { >- return MakeFullVideoParameters(101); >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/testrtpparameters.h b/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/testrtpparameters.h >deleted file mode 100644 >index 47e51e140837d4b9c496c56e3c49895197970586..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/ortc/testrtpparameters.h >+++ /dev/null >@@ -1,72 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. 
>- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef ORTC_TESTRTPPARAMETERS_H_ >-#define ORTC_TESTRTPPARAMETERS_H_ >- >-#include "api/ortc/rtptransportinterface.h" >-#include "api/rtpparameters.h" >- >-namespace webrtc { >- >-// Helper methods to create RtpParameters to use for sending/receiving. >-// >-// "MakeMinimal" methods contain the minimal necessary information for an >-// RtpSender or RtpReceiver to function. The "MakeFull" methods are the >-// opposite, and include all features that would normally be offered by a >-// PeerConnection, and in some cases additional ones. >-// >-// These methods are intended to be used for end-to-end testing (such as in >-// ortcfactory_integrationtest.cc), or unit testing that doesn't care about the >-// specific contents of the parameters. Tests should NOT assume that these >-// methods will not change; tests that are testing that a specific value in the >-// parameters is applied properly should construct the parameters in the test >-// itself. 
>- >-inline RtpTransportParameters MakeRtcpMuxParameters() { >- RtpTransportParameters parameters; >- parameters.rtcp.mux = true; >- return parameters; >-} >- >-RtpParameters MakeMinimalOpusParameters(); >-RtpParameters MakeMinimalIsacParameters(); >-RtpParameters MakeMinimalOpusParametersWithSsrc(uint32_t ssrc); >-RtpParameters MakeMinimalIsacParametersWithSsrc(uint32_t ssrc); >- >-RtpParameters MakeMinimalVp8Parameters(); >-RtpParameters MakeMinimalVp9Parameters(); >-RtpParameters MakeMinimalVp8ParametersWithSsrc(uint32_t ssrc); >-RtpParameters MakeMinimalVp9ParametersWithSsrc(uint32_t ssrc); >- >-// Will create an encoding with no SSRC (meaning "match first SSRC seen" for a >-// receiver, or "pick one automatically" for a sender). >-RtpParameters MakeMinimalOpusParametersWithNoSsrc(); >-RtpParameters MakeMinimalIsacParametersWithNoSsrc(); >-RtpParameters MakeMinimalVp8ParametersWithNoSsrc(); >-RtpParameters MakeMinimalVp9ParametersWithNoSsrc(); >- >-// Make audio parameters with all the available properties configured and >-// features used, and with multiple codecs offered. Obtained by taking a >-// snapshot of a default PeerConnection offer (and adding other things, like >-// bitrate limit). >-RtpParameters MakeFullOpusParameters(); >-RtpParameters MakeFullIsacParameters(); >- >-// Make video parameters with all the available properties configured and >-// features used, and with multiple codecs offered. Obtained by taking a >-// snapshot of a default PeerConnection offer (and adding other things, like >-// bitrate limit). 
>-RtpParameters MakeFullVp8Parameters();
>-RtpParameters MakeFullVp9Parameters();
>-
>-}  // namespace webrtc
>-
>-#endif  // ORTC_TESTRTPPARAMETERS_H_
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/BUILD.gn
>index 2d351bb0fa2cff038923bf6127e7bf37e5fbb1a7..29e633c5d227f924350c94b2b5e351a9718afe9b 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/BUILD.gn
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/BUILD.gn
>@@ -29,6 +29,8 @@ rtc_static_library("rtc_p2p") {
>     "base/dtlstransport.h",
>     "base/dtlstransportinternal.cc",
>     "base/dtlstransportinternal.h",
>+    "base/icecredentialsiterator.cc",
>+    "base/icecredentialsiterator.h",
>     "base/icetransportinternal.cc",
>     "base/icetransportinternal.h",
>     "base/mdns_message.cc",
>@@ -74,8 +76,6 @@ rtc_static_library("rtc_p2p") {
>     "base/turnport.cc",
>     "base/turnport.h",
>     "base/udpport.h",
>-    "base/udptransport.cc",
>-    "base/udptransport.h",
>     "client/basicportallocator.cc",
>     "client/basicportallocator.h",
>     "client/relayportfactoryinterface.h",
>@@ -86,17 +86,20 @@ rtc_static_library("rtc_p2p") {
>   deps = [
>     "../api:libjingle_peerconnection_api",
>     "../api:ortc_api",
>+    "../api/transport:enums",
>     "../logging:ice_log",
>     "../rtc_base:checks",
>     "../rtc_base:rtc_base",
>     "../rtc_base:safe_minmax",
>     "../rtc_base:stringutils",
>     "../rtc_base:weak_ptr",
>+    "../rtc_base/system:rtc_export",
>     "../rtc_base/third_party/base64",
>     "../rtc_base/third_party/sigslot",
>     "../system_wrappers:field_trial",
>     "../system_wrappers:metrics",
>     "//third_party/abseil-cpp/absl/memory",
>+    "//third_party/abseil-cpp/absl/strings",
>     "//third_party/abseil-cpp/absl/types:optional",
>   ]
> 
>@@ -137,6 +140,7 @@ if (rtc_include_tests) {
>     ":rtc_p2p",
>     "../api:libjingle_peerconnection_api",
>     "../api:ortc_api",
>+    "../rtc_base:gunit_helpers",
>     "../rtc_base:rtc_base",
>     "../rtc_base:rtc_base_approved",
>     "../rtc_base:rtc_base_tests_utils",
>@@ -153,6 +157,7 @@ if (rtc_include_tests) {
>     "base/asyncstuntcpsocket_unittest.cc",
>     "base/basicasyncresolverfactory_unittest.cc",
>     "base/dtlstransport_unittest.cc",
>+    "base/icecredentialsiterator_unittest.cc",
>     "base/mdns_message_unittest.cc",
>     "base/p2ptransportchannel_unittest.cc",
>     "base/packetlossestimator_unittest.cc",
>@@ -170,7 +175,6 @@ if (rtc_include_tests) {
>     "base/transportdescriptionfactory_unittest.cc",
>     "base/turnport_unittest.cc",
>     "base/turnserver_unittest.cc",
>-    "base/udptransport_unittest.cc",
>     "client/basicportallocator_unittest.cc",
>   ]
>   deps = [
>@@ -178,10 +182,12 @@ if (rtc_include_tests) {
>     ":rtc_p2p",
>     "../api:ortc_api",
>     "../rtc_base:checks",
>+    "../rtc_base:gunit_helpers",
>     "../rtc_base:rtc_base",
>     "../rtc_base:rtc_base_approved",
>     "../rtc_base:rtc_base_tests_utils",
>     "../rtc_base:stringutils",
>+    "../rtc_base:testclient",
>     "../system_wrappers:metrics",
>     "../test:test_support",
>     "//testing/gtest",
>@@ -206,6 +212,7 @@ rtc_static_library("libstunprober") {
>     "..:webrtc_common",
>     "../rtc_base:checks",
>     "../rtc_base:rtc_base",
>+    "../rtc_base/system:rtc_export",
>   ]
> }
> 
>@@ -221,6 +228,7 @@ if (rtc_include_tests) {
>     ":p2p_test_utils",
>     ":rtc_p2p",
>     "../rtc_base:checks",
>+    "../rtc_base:gunit_helpers",
>     "../rtc_base:rtc_base",
>     "../rtc_base:rtc_base_tests_utils",
>     "//testing/gtest",
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/OWNERS b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/OWNERS
>index 18f6044b163d85f92f01d74f9068422ea21d9027..84a19ed596cced35d3dcb7031ee8d9c059cafb72 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/OWNERS
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/OWNERS
>@@ -6,6 +6,9 @@ mflodman@webrtc.org
> perkj@webrtc.org
> pthatcher@webrtc.org
> qingsi@webrtc.org
>+jeroendb@webrtc.org
>+emadomara@webrtc.org
>+steveanton@webrtc.org
> sergeyu@chromium.org
> tommi@webrtc.org
> 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/asyncstuntcpsocket.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/asyncstuntcpsocket.cc
>index 8bb796fd1d59cb59f8904751ee2b1bc0be482a26..d0185c3eb290ee872b9330c8b2a0751762209432 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/asyncstuntcpsocket.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/asyncstuntcpsocket.cc
>@@ -112,7 +112,7 @@ void AsyncStunTCPSocket::ProcessInput(char* data, size_t* len) {
>   }
> 
>   SignalReadPacket(this, data, expected_pkt_len, remote_addr,
>-                   rtc::CreatePacketTime(0));
>+                   rtc::TimeMicros());
> 
>   *len -= actual_length;
>   if (*len > 0) {
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/asyncstuntcpsocket_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/asyncstuntcpsocket_unittest.cc
>index de9541941f418da14581e14788b9cb28e87a8a11..b5fa14edc14bf8f436d6a62b335e6e1b1d553430 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/asyncstuntcpsocket_unittest.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/asyncstuntcpsocket_unittest.cc
>@@ -81,7 +81,7 @@ class AsyncStunTCPSocketTest : public testing::Test,
>                     const char* data,
>                     size_t len,
>                     const rtc::SocketAddress& remote_addr,
>-                    const rtc::PacketTime& packet_time) {
>+                    const int64_t& /* packet_time_us */) {
>     recv_packets_.push_back(std::string(data, len));
>   }
> 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/dtlstransport.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/dtlstransport.cc
>index 314f63ce74f865f1af346928b5d4e117d6bfaf34..97c6b13fb7a7b0d1b3a80769cfb3a25299fd03a4 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/dtlstransport.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/dtlstransport.cc
>@@ -20,6 +20,7 @@
> #include "rtc_base/dscp.h"
> #include "rtc_base/logging.h"
> #include "rtc_base/messagequeue.h"
>+#include "rtc_base/rtccertificate.h"
> #include "rtc_base/sslstreamadapter.h"
> #include "rtc_base/stream.h"
> #include "rtc_base/thread.h"
>@@ -113,37 +114,23 @@ void StreamInterfaceChannel::Close() {
>   state_ = rtc::SS_CLOSED;
> }
> 
>-DtlsTransport::DtlsTransport(IceTransportInternal* ice_transport,
>-                             const rtc::CryptoOptions& crypto_options)
>-    : transport_name_(ice_transport->transport_name()),
>-      component_(ice_transport->component()),
>-      ice_transport_(ice_transport),
>-      downward_(NULL),
>-      srtp_ciphers_(GetSupportedDtlsSrtpCryptoSuites(crypto_options)),
>-      ssl_max_version_(rtc::SSL_PROTOCOL_DTLS_12),
>-      crypto_options_(crypto_options) {
>-  RTC_DCHECK(ice_transport_);
>-  ConnectToIceTransport();
>-}
>-
> DtlsTransport::DtlsTransport(
>     std::unique_ptr<IceTransportInternal> ice_transport,
>-    const rtc::CryptoOptions& crypto_options)
>+    const webrtc::CryptoOptions& crypto_options)
>     : transport_name_(ice_transport->transport_name()),
>       component_(ice_transport->component()),
>-      ice_transport_(ice_transport.get()),
>-      owned_ice_transport_(std::move(ice_transport)),
>+      ice_transport_(std::move(ice_transport)),
>       downward_(NULL),
>-      srtp_ciphers_(GetSupportedDtlsSrtpCryptoSuites(crypto_options)),
>+      srtp_ciphers_(crypto_options.GetSupportedDtlsSrtpCryptoSuites()),
>       ssl_max_version_(rtc::SSL_PROTOCOL_DTLS_12),
>       crypto_options_(crypto_options) {
>-  RTC_DCHECK(owned_ice_transport_);
>+  RTC_DCHECK(ice_transport_);
>   ConnectToIceTransport();
> }
> 
> DtlsTransport::~DtlsTransport() = default;
> 
>-const rtc::CryptoOptions& DtlsTransport::crypto_options() const {
>+const webrtc::CryptoOptions& DtlsTransport::crypto_options() const {
>   return crypto_options_;
> }
> 
>@@ -335,7 +322,8 @@ bool DtlsTransport::ExportKeyingMaterial(const std::string& label,
> 
> bool DtlsTransport::SetupDtls() {
>   RTC_DCHECK(dtls_role_);
>-  StreamInterfaceChannel* downward = new StreamInterfaceChannel(ice_transport_);
>+  StreamInterfaceChannel* downward =
>+      new StreamInterfaceChannel(ice_transport_.get());
> 
>   dtls_.reset(rtc::SSLStreamAdapter::Create(downward));
>   if (!dtls_) {
>@@ -431,7 +419,7 @@ int DtlsTransport::SendPacket(const char* data,
> }
> 
> IceTransportInternal* DtlsTransport::ice_transport() {
>-  return ice_transport_;
>+  return ice_transport_.get();
> }
> 
> bool DtlsTransport::IsDtlsConnected() {
>@@ -488,7 +476,7 @@ void DtlsTransport::ConnectToIceTransport() {
> // impl again
> void DtlsTransport::OnWritableState(rtc::PacketTransportInternal* transport) {
>   RTC_DCHECK_RUN_ON(&thread_checker_);
>-  RTC_DCHECK(transport == ice_transport_);
>+  RTC_DCHECK(transport == ice_transport_.get());
>   RTC_LOG(LS_VERBOSE) << ToString()
>                       << ": ice_transport writable state changed to "
>                       << ice_transport_->writable();
>@@ -520,7 +508,7 @@ void DtlsTransport::OnWritableState(rtc::PacketTransportInternal* transport) {
> 
> void DtlsTransport::OnReceivingState(rtc::PacketTransportInternal* transport) {
>   RTC_DCHECK_RUN_ON(&thread_checker_);
>-  RTC_DCHECK(transport == ice_transport_);
>+  RTC_DCHECK(transport == ice_transport_.get());
>   RTC_LOG(LS_VERBOSE) << ToString()
>                       << ": ice_transport "
>                          "receiving state changed to "
>@@ -534,15 +522,15 @@ void DtlsTransport::OnReceivingState(rtc::PacketTransportInternal* transport) {
> 
> void DtlsTransport::OnReadPacket(rtc::PacketTransportInternal* transport,
>                                  const char* data,
>                                  size_t size,
>-                                 const rtc::PacketTime& packet_time,
>+                                 const int64_t& packet_time_us,
>                                  int flags) {
>   RTC_DCHECK_RUN_ON(&thread_checker_);
>-  RTC_DCHECK(transport == ice_transport_);
>+  RTC_DCHECK(transport == ice_transport_.get());
>   RTC_DCHECK(flags == 0);
> 
>   if (!dtls_active_) {
>     // Not doing DTLS.
>-    SignalReadPacket(this, data, size, packet_time, 0);
>+    SignalReadPacket(this, data, size, packet_time_us, 0);
>     return;
>   }
> 
>@@ -605,7 +593,7 @@ void DtlsTransport::OnReadPacket(rtc::PacketTransportInternal* transport,
>       RTC_DCHECK(!srtp_ciphers_.empty());
> 
>       // Signal this upwards as a bypass packet.
>-        SignalReadPacket(this, data, size, packet_time, PF_SRTP_BYPASS);
>+        SignalReadPacket(this, data, size, packet_time_us, PF_SRTP_BYPASS);
>       }
>       break;
>     case DTLS_TRANSPORT_FAILED:
>@@ -651,7 +639,7 @@ void DtlsTransport::OnDtlsEvent(rtc::StreamInterface* dtls, int sig, int err) {
>       do {
>         ret = dtls_->Read(buf, sizeof(buf), &read, &read_error);
>         if (ret == rtc::SR_SUCCESS) {
>-          SignalReadPacket(this, buf, read, rtc::CreatePacketTime(0), 0);
>+          SignalReadPacket(this, buf, read, rtc::TimeMicros(), 0);
>         } else if (ret == rtc::SR_EOS) {
>           // Remote peer shut down the association with no error.
>           RTC_LOG(LS_INFO) << ToString() << ": DTLS transport closed";
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/dtlstransport.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/dtlstransport.h
>index 456f375a1070807e7a7b278001a759cabfb1c10a..ce95803aecb213de73e34844ab4bcd33c44042cb 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/dtlstransport.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/dtlstransport.h
>@@ -15,6 +15,7 @@
> #include <string>
> #include <vector>
> 
>+#include "api/crypto/cryptooptions.h"
> #include "p2p/base/dtlstransportinternal.h"
> #include "p2p/base/icetransportinternal.h"
> #include "rtc_base/buffer.h"
>@@ -94,15 +95,12 @@ class DtlsTransport : public DtlsTransportInternal {
>   //
>   // |crypto_options| are the options used for the DTLS handshake. This affects
>   // whether GCM crypto suites are negotiated.
>-  // TODO(zhihuang): Remove this once we switch to JsepTransportController.
>-  explicit DtlsTransport(IceTransportInternal* ice_transport,
>-                         const rtc::CryptoOptions& crypto_options);
>   explicit DtlsTransport(std::unique_ptr<IceTransportInternal> ice_transport,
>-                         const rtc::CryptoOptions& crypto_options);
>+                         const webrtc::CryptoOptions& crypto_options);
> 
>   ~DtlsTransport() override;
> 
>-  const rtc::CryptoOptions& crypto_options() const override;
>+  const webrtc::CryptoOptions& crypto_options() const override;
>   DtlsTransportState dtls_state() const override;
>   const std::string& transport_name() const override;
>   int component() const override;
>@@ -196,7 +194,7 @@ class DtlsTransport : public DtlsTransportInternal {
>   void OnReadPacket(rtc::PacketTransportInternal* transport,
>                     const char* data,
>                     size_t size,
>-                    const rtc::PacketTime& packet_time,
>+                    const int64_t& packet_time_us,
>                     int flags);
>   void OnSentPacket(rtc::PacketTransportInternal* transport,
>                     const rtc::SentPacket& sent_packet);
>@@ -220,9 +218,8 @@ class DtlsTransport : public DtlsTransportInternal {
>   std::string transport_name_;
>   int component_;
>   DtlsTransportState dtls_state_ = DTLS_TRANSPORT_NEW;
>-  // Underlying ice_transport, not owned by this class.
>-  IceTransportInternal* const ice_transport_;
>-  std::unique_ptr<IceTransportInternal> owned_ice_transport_;
>+  // Underlying ice_transport, owned by this class.
>+  std::unique_ptr<IceTransportInternal> ice_transport_;
>   std::unique_ptr<rtc::SSLStreamAdapter> dtls_;  // The DTLS stream
>   StreamInterfaceChannel*
>       downward_;  // Wrapper for ice_transport_, owned by dtls_.
>@@ -231,7 +228,7 @@ class DtlsTransport : public DtlsTransportInternal {
>   rtc::scoped_refptr<rtc::RTCCertificate> local_certificate_;
>   absl::optional<rtc::SSLRole> dtls_role_;
>   rtc::SSLProtocolVersion ssl_max_version_;
>-  rtc::CryptoOptions crypto_options_;
>+  webrtc::CryptoOptions crypto_options_;
>   rtc::Buffer remote_fingerprint_value_;
>   std::string remote_fingerprint_algorithm_;
> 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/dtlstransport_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/dtlstransport_unittest.cc
>index 3cf423e8513cdf9f784a73f679dc51f51a6cfd92..93f3da337d303084f07bf90aaf3f2cb403aabf22 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/dtlstransport_unittest.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/dtlstransport_unittest.cc
>@@ -11,6 +11,7 @@
> #include <algorithm>
> #include <memory>
> #include <set>
>+#include <utility>
> 
> #include "p2p/base/dtlstransport.h"
> #include "p2p/base/fakeicetransport.h"
>@@ -19,6 +20,7 @@
> #include "rtc_base/dscp.h"
> #include "rtc_base/gunit.h"
> #include "rtc_base/helpers.h"
>+#include "rtc_base/rtccertificate.h"
> #include "rtc_base/ssladapter.h"
> #include "rtc_base/sslidentity.h"
> #include "rtc_base/sslstreamadapter.h"
>@@ -47,8 +49,8 @@ void SetRemoteFingerprintFromCert(
>     DtlsTransport* transport,
>     const rtc::scoped_refptr<rtc::RTCCertificate>& cert,
>     bool modify_digest = false) {
>-  rtc::SSLFingerprint* fingerprint =
>-      rtc::SSLFingerprint::CreateFromCertificate(cert);
>+  std::unique_ptr<rtc::SSLFingerprint> fingerprint =
>+      rtc::SSLFingerprint::CreateFromCertificate(*cert);
>   if (modify_digest) {
>     ++fingerprint->digest[0];
>   }
>@@ -57,7 +59,6 @@ void SetRemoteFingerprintFromCert(
>       fingerprint->algorithm,
>       reinterpret_cast<const uint8_t*>(fingerprint->digest.data()),
>       fingerprint->digest.size()));
>-  delete fingerprint;
> }
> 
> class DtlsTestClient : public sigslot::has_slots<> {
>@@ -76,18 +77,18 @@ class DtlsTestClient : public sigslot::has_slots<> {
>   }
>   // Set up fake ICE transport and real DTLS transport under test.
>   void SetupTransports(IceRole role, int async_delay_ms = 0) {
>-    fake_ice_transport_.reset(new FakeIceTransport("fake", 0));
>-    fake_ice_transport_->SetAsync(true);
>-    fake_ice_transport_->SetAsyncDelay(async_delay_ms);
>-    fake_ice_transport_->SetIceRole(role);
>-    fake_ice_transport_->SetIceTiebreaker((role == ICEROLE_CONTROLLING) ? 1
>-                                                                        : 2);
>+    std::unique_ptr<FakeIceTransport> fake_ice_transport;
>+    fake_ice_transport.reset(new FakeIceTransport("fake", 0));
>+    fake_ice_transport->SetAsync(true);
>+    fake_ice_transport->SetAsyncDelay(async_delay_ms);
>+    fake_ice_transport->SetIceRole(role);
>+    fake_ice_transport->SetIceTiebreaker((role == ICEROLE_CONTROLLING) ? 1 : 2);
>     // Hook the raw packets so that we can verify they are encrypted.
>-    fake_ice_transport_->SignalReadPacket.connect(
>+    fake_ice_transport->SignalReadPacket.connect(
>         this, &DtlsTestClient::OnFakeIceTransportReadPacket);
> 
>-    dtls_transport_.reset(
>-        new DtlsTransport(fake_ice_transport_.get(), rtc::CryptoOptions()));
>+    dtls_transport_ = absl::make_unique<DtlsTransport>(
>+        std::move(fake_ice_transport), webrtc::CryptoOptions());
>     dtls_transport_->SetSslMaxProtocolVersion(ssl_max_version_);
>     // Note: Certificate may be null here if testing passthrough.
>     dtls_transport_->SetLocalCertificate(certificate_);
>@@ -99,13 +100,16 @@ class DtlsTestClient : public sigslot::has_slots<> {
>         this, &DtlsTestClient::OnTransportSentPacket);
>   }
> 
>-  FakeIceTransport* fake_ice_transport() { return fake_ice_transport_.get(); }
>+  FakeIceTransport* fake_ice_transport() {
>+    return static_cast<FakeIceTransport*>(dtls_transport_->ice_transport());
>+  }
> 
>   DtlsTransport* dtls_transport() { return dtls_transport_.get(); }
> 
>   // Simulate fake ICE transports connecting.
>   bool Connect(DtlsTestClient* peer, bool asymmetric) {
>-    fake_ice_transport_->SetDestination(peer->fake_ice_transport(), asymmetric);
>+    fake_ice_transport()->SetDestination(peer->fake_ice_transport(),
>+                                         asymmetric);
>     return true;
>   }
> 
>@@ -232,7 +236,7 @@ class DtlsTestClient : public sigslot::has_slots<> {
>   void OnTransportReadPacket(rtc::PacketTransportInternal* transport,
>                              const char* data,
>                              size_t size,
>-                             const rtc::PacketTime& packet_time,
>+                             const int64_t& /* packet_time_us */,
>                              int flags) {
>     uint32_t packet_num = 0;
>     ASSERT_TRUE(VerifyPacket(data, size, &packet_num));
>@@ -254,7 +258,7 @@ class DtlsTestClient : public sigslot::has_slots<> {
>   void OnFakeIceTransportReadPacket(rtc::PacketTransportInternal* transport,
>                                     const char* data,
>                                     size_t size,
>-                                    const rtc::PacketTime& time,
>+                                    const int64_t& /* packet_time_us */,
>                                     int flags) {
>     // Flags shouldn't be set on the underlying Transport packets.
>     ASSERT_EQ(0, flags);
>@@ -513,8 +517,8 @@ TEST_F(DtlsTransportTest, TestCertificatesBeforeConnect) {
>   // remote certificate, because connection has not yet occurred.
>   auto certificate1 = client1_.dtls_transport()->GetLocalCertificate();
>   auto certificate2 = client2_.dtls_transport()->GetLocalCertificate();
>-  ASSERT_NE(certificate1->ssl_certificate().ToPEMString(),
>-            certificate2->ssl_certificate().ToPEMString());
>+  ASSERT_NE(certificate1->GetSSLCertificate().ToPEMString(),
>+            certificate2->GetSSLCertificate().ToPEMString());
>   ASSERT_FALSE(client1_.dtls_transport()->GetRemoteSSLCertChain());
>   ASSERT_FALSE(client2_.dtls_transport()->GetRemoteSSLCertChain());
> }
>@@ -527,8 +531,8 @@ TEST_F(DtlsTransportTest, TestCertificatesAfterConnect) {
>   // After connection, each side has a distinct local certificate.
>   auto certificate1 = client1_.dtls_transport()->GetLocalCertificate();
>   auto certificate2 = client2_.dtls_transport()->GetLocalCertificate();
>-  ASSERT_NE(certificate1->ssl_certificate().ToPEMString(),
>-            certificate2->ssl_certificate().ToPEMString());
>+  ASSERT_NE(certificate1->GetSSLCertificate().ToPEMString(),
>+            certificate2->GetSSLCertificate().ToPEMString());
> 
>   // Each side's remote certificate is the other side's local certificate.
>   std::unique_ptr<rtc::SSLCertChain> remote_cert1 =
>@@ -536,13 +540,13 @@ TEST_F(DtlsTransportTest, TestCertificatesAfterConnect) {
>   ASSERT_TRUE(remote_cert1);
>   ASSERT_EQ(1u, remote_cert1->GetSize());
>   ASSERT_EQ(remote_cert1->Get(0).ToPEMString(),
>-            certificate2->ssl_certificate().ToPEMString());
>+            certificate2->GetSSLCertificate().ToPEMString());
>   std::unique_ptr<rtc::SSLCertChain> remote_cert2 =
>       client2_.dtls_transport()->GetRemoteSSLCertChain();
>   ASSERT_TRUE(remote_cert2);
>   ASSERT_EQ(1u, remote_cert2->GetSize());
>   ASSERT_EQ(remote_cert2->Get(0).ToPEMString(),
>-            certificate1->ssl_certificate().ToPEMString());
>+            certificate1->GetSSLCertificate().ToPEMString());
> }
> 
> // Test that packets are retransmitted according to the expected schedule.
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/dtlstransportinternal.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/dtlstransportinternal.h
>index 58d36fa08dbc83f6606f4ce98e8446f25adbfaad..a7137449f4cf159844338d25aac4130d85299464 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/dtlstransportinternal.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/dtlstransportinternal.h
>@@ -15,10 +15,10 @@
> #include <string>
> #include <vector>
> 
>+#include "api/crypto/cryptooptions.h"
> #include "p2p/base/icetransportinternal.h"
> #include "p2p/base/packettransportinternal.h"
> #include "rtc_base/sslstreamadapter.h"
>-#include "rtc_base/stringencode.h"
> 
> namespace cricket {
> 
>@@ -51,7 +51,7 @@ class DtlsTransportInternal : public rtc::PacketTransportInternal {
>  public:
>   ~DtlsTransportInternal() override;
> 
>-  virtual const rtc::CryptoOptions& crypto_options() const = 0;
>+  virtual const webrtc::CryptoOptions& crypto_options() const = 0;
> 
>   virtual DtlsTransportState dtls_state() const = 0;
> 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/fakedtlstransport.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/fakedtlstransport.h
>index cb944d3db5b39da760d4542245792df52bdb67ad..daef5c77fc998a41190ba88e4bf4209a5f5fc3f3 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/fakedtlstransport.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/fakedtlstransport.h
>@@ -17,9 +17,11 @@
> #include <vector>
> 
> #include "absl/memory/memory.h"
>+#include "api/crypto/cryptooptions.h"
> #include "p2p/base/dtlstransportinternal.h"
> #include "p2p/base/fakeicetransport.h"
> #include "rtc_base/fakesslidentity.h"
>+#include "rtc_base/rtccertificate.h"
> 
> namespace cricket {
> 
>@@ -32,7 +34,7 @@ class FakeDtlsTransport : public DtlsTransportInternal {
>       : ice_transport_(ice_transport),
>         transport_name_(ice_transport->transport_name()),
>         component_(ice_transport->component()),
>-        dtls_fingerprint_("", nullptr, 0) {
>+        dtls_fingerprint_("", nullptr) {
>     RTC_DCHECK(ice_transport_);
>     ice_transport_->SignalReadPacket.connect(
>         this, &FakeDtlsTransport::OnIceTransportReadPacket);
>@@ -44,7 +46,7 @@ class FakeDtlsTransport : public DtlsTransportInternal {
>       : owned_ice_transport_(std::move(ice)),
>         transport_name_(owned_ice_transport_->transport_name()),
>         component_(owned_ice_transport_->component()),
>-        dtls_fingerprint_("", nullptr, 0) {
>+        dtls_fingerprint_("", rtc::ArrayView<const uint8_t>()) {
>     ice_transport_ = owned_ice_transport_.get();
>     ice_transport_->SignalReadPacket.connect(
>         this, &FakeDtlsTransport::OnIceTransportReadPacket);
>@@ -82,6 +84,7 @@ class FakeDtlsTransport : public DtlsTransportInternal {
>     ice_transport_->SetReceiving(receiving);
>     set_receiving(receiving);
>   }
>+  void SetDtlsState(DtlsTransportState state) { dtls_state_ = state; }
> 
>   // Simulates the two DTLS transports connecting to each other.
>   // If |asymmetric| is true this method only affects this FakeDtlsTransport.
>@@ -132,7 +135,8 @@ class FakeDtlsTransport : public DtlsTransportInternal {
>   bool SetRemoteFingerprint(const std::string& alg,
>                             const uint8_t* digest,
>                             size_t digest_len) override {
>-    dtls_fingerprint_ = rtc::SSLFingerprint(alg, digest, digest_len);
>+    dtls_fingerprint_ =
>+        rtc::SSLFingerprint(alg, rtc::MakeArrayView(digest, digest_len));
>     return true;
>   }
>   bool SetSslMaxProtocolVersion(rtc::SSLProtocolVersion version) override {
>@@ -149,10 +153,10 @@ class FakeDtlsTransport : public DtlsTransportInternal {
>     *role = *dtls_role_;
>     return true;
>   }
>-  const rtc::CryptoOptions& crypto_options() const override {
>+  const webrtc::CryptoOptions& crypto_options() const override {
>     return crypto_options_;
>   }
>-  void SetCryptoOptions(const rtc::CryptoOptions& crypto_options) {
>+  void SetCryptoOptions(const webrtc::CryptoOptions& crypto_options) {
>     crypto_options_ = crypto_options;
>   }
>   bool SetLocalCertificate(
>@@ -178,8 +182,10 @@ class FakeDtlsTransport : public DtlsTransportInternal {
>     return local_cert_;
>   }
>   std::unique_ptr<rtc::SSLCertChain> GetRemoteSSLCertChain() const override {
>-    return remote_cert_ ? absl::make_unique<rtc::SSLCertChain>(remote_cert_)
>-                        : nullptr;
>+    if (!remote_cert_) {
>+      return nullptr;
>+    }
>+    return absl::make_unique<rtc::SSLCertChain>(remote_cert_->Clone());
>   }
>   bool ExportKeyingMaterial(const std::string& label,
>                             const uint8_t* context,
>@@ -232,9 +238,9 @@ class FakeDtlsTransport : public DtlsTransportInternal {
>   void OnIceTransportReadPacket(PacketTransportInternal* ice_,
>                                 const char* data,
>                                 size_t len,
>-                                const rtc::PacketTime& time,
>+                                const int64_t& packet_time_us,
>                                 int flags) {
>-    SignalReadPacket(this, data, len, time, flags);
>+    SignalReadPacket(this, data, len, packet_time_us, flags);
>   }
> 
>   void set_receiving(bool receiving) {
>@@ -272,7 +278,7 @@ class FakeDtlsTransport : public DtlsTransportInternal {
>   rtc::SSLFingerprint dtls_fingerprint_;
>   absl::optional<rtc::SSLRole> dtls_role_;
>   int crypto_suite_ = rtc::SRTP_AES128_CM_SHA1_80;
>-  rtc::CryptoOptions crypto_options_;
>+  webrtc::CryptoOptions crypto_options_;
> 
>   DtlsTransportState dtls_state_ = DTLS_TRANSPORT_NEW;
> 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/fakeicetransport.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/fakeicetransport.h
>index 48157451ea69a1b1522d418654e007bb9ae435d0..88afc382ecbb851915328b36936e740cacee7868 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/fakeicetransport.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/fakeicetransport.h
>@@ -23,8 +23,13 @@ namespace cricket {
> 
> class FakeIceTransport : public IceTransportInternal {
>  public:
>-  explicit FakeIceTransport(const std::string& name, int component)
>-      : name_(name), component_(component) {}
>+  explicit FakeIceTransport(const std::string& name,
>+                            int component,
>+                            rtc::Thread* network_thread = nullptr)
>+      : name_(name),
>+        component_(component),
>+        network_thread_(network_thread ? network_thread
>+                                       : rtc::Thread::Current()) {}
>   ~FakeIceTransport() override {
>     if (dest_ && dest_->dest_ == this) {
>       dest_->dest_ = nullptr;
>@@ -114,6 +119,19 @@ class FakeIceTransport : public IceTransportInternal {
>     return IceTransportState::STATE_CONNECTING;
>   }
> 
>+  webrtc::IceTransportState GetIceTransportState() const override {
>+    if (connection_count_ == 0) {
>+      return had_connection_ ? webrtc::IceTransportState::kFailed
>+                             : webrtc::IceTransportState::kNew;
>+    }
>+
>+    if (connection_count_ == 1) {
>+      return webrtc::IceTransportState::kCompleted;
>+    }
>+
>+    return webrtc::IceTransportState::kConnected;
>+  }
>+
>   void SetIceRole(IceRole role) override { role_ = role; }
>   IceRole GetIceRole() const override { return role_; }
>   void SetIceTiebreaker(uint64_t tiebreaker) override {
>@@ -226,6 +244,8 @@ class FakeIceTransport : public IceTransportInternal {
>   }
>   void SetNetworkRoute(absl::optional<rtc::NetworkRoute> network_route) {
>     network_route_ = network_route;
>+    network_thread_->Invoke<void>(
>+        RTC_FROM_HERE, [this] { SignalNetworkRouteChanged(network_route_); });
>   }
> 
>  private:
>@@ -253,7 +273,7 @@ class FakeIceTransport : public IceTransportInternal {
>     if (dest_) {
>       last_sent_packet_ = packet;
>       dest_->SignalReadPacket(dest_, packet.data<char>(), packet.size(),
>-                              rtc::CreatePacketTime(0), 0);
>+                              rtc::TimeMicros(), 0);
>     }
>   }
> 
>@@ -282,6 +302,7 @@ class FakeIceTransport : public IceTransportInternal {
>   absl::optional<rtc::NetworkRoute> network_route_;
>   std::map<rtc::Socket::Option, int> socket_options_;
>   rtc::CopyOnWriteBuffer last_sent_packet_;
>+  rtc::Thread* const network_thread_;
> };
> 
> }  // namespace cricket
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/fakepackettransport.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/fakepackettransport.h
>index e57bc179a7991714c1b4b4323b57f5472d19db7f..52b39215c7524ac345bc539dca075c21827f818e 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/fakepackettransport.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/fakepackettransport.h
>@@ -119,7 +119,7 @@ class FakePacketTransport : public PacketTransportInternal {
>     last_sent_packet_ = packet;
>     if (dest_) {
>       dest_->SignalReadPacket(dest_, packet.data<char>(), packet.size(),
>-                              CreatePacketTime(0), 0);
>+                              TimeMicros(), 0);
>     }
>   }
> 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/icecredentialsiterator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/icecredentialsiterator.cc
>new file mode 100644
>index 0000000000000000000000000000000000000000..7d29653440d7cb2da80c6d5acf10e339c123ed0d
>--- /dev/null
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/icecredentialsiterator.cc
>@@ -0,0 +1,36 @@
>+/*
>+ *  Copyright 2018 The WebRTC Project Authors. All rights reserved.
>+ *
>+ *  Use of this source code is governed by a BSD-style license
>+ *  that can be found in the LICENSE file in the root of the source
>+ *  tree. An additional intellectual property rights grant can be found
>+ *  in the file PATENTS.  All contributing project authors may
>+ *  be found in the AUTHORS file in the root of the source tree.
>+ */
>+
>+#include "p2p/base/icecredentialsiterator.h"
>+#include "rtc_base/helpers.h"
>+
>+namespace cricket {
>+
>+IceCredentialsIterator::IceCredentialsIterator(
>+    const std::vector<IceParameters>& pooled_credentials)
>+    : pooled_ice_credentials_(pooled_credentials) {}
>+
>+IceCredentialsIterator::~IceCredentialsIterator() = default;
>+
>+IceParameters IceCredentialsIterator::CreateRandomIceCredentials() {
>+  return IceParameters(rtc::CreateRandomString(ICE_UFRAG_LENGTH),
>+                       rtc::CreateRandomString(ICE_PWD_LENGTH), false);
>+}
>+
>+IceParameters IceCredentialsIterator::GetIceCredentials() {
>+  if (pooled_ice_credentials_.empty()) {
>+    return CreateRandomIceCredentials();
>+  }
>+  IceParameters credentials = pooled_ice_credentials_.back();
>+  pooled_ice_credentials_.pop_back();
>+  return credentials;
>+}
>+
>+}  // namespace cricket
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/icecredentialsiterator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/icecredentialsiterator.h
>new file mode 100644
>index 0000000000000000000000000000000000000000..33e1d6460ae906b356897c4edb8ac6b9e904ec3a
>--- /dev/null
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/icecredentialsiterator.h
>@@ -0,0 +1,37 @@
>+/*
>+ *  Copyright 2018 The WebRTC Project Authors. All rights reserved.
>+ *
>+ *  Use of this source code is governed by a BSD-style license
>+ *  that can be found in the LICENSE file in the root of the source
>+ *  tree. An additional intellectual property rights grant can be found
>+ *  in the file PATENTS.  All contributing project authors may
>+ *  be found in the AUTHORS file in the root of the source tree.
>+ */
>+
>+#ifndef P2P_BASE_ICECREDENTIALSITERATOR_H_
>+#define P2P_BASE_ICECREDENTIALSITERATOR_H_
>+
>+#include <vector>
>+
>+#include "p2p/base/transportdescription.h"
>+
>+namespace cricket {
>+
>+class IceCredentialsIterator {
>+ public:
>+  explicit IceCredentialsIterator(const std::vector<IceParameters>&);
>+  virtual ~IceCredentialsIterator();
>+
>+  // Get next pooled ice credentials.
>+  // Returns a new random credential if the pool is empty.
>+  IceParameters GetIceCredentials();
>+
>+  static IceParameters CreateRandomIceCredentials();
>+
>+ private:
>+  std::vector<IceParameters> pooled_ice_credentials_;
>+};
>+
>+}  // namespace cricket
>+
>+#endif  // P2P_BASE_ICECREDENTIALSITERATOR_H_
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/icecredentialsiterator_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/icecredentialsiterator_unittest.cc
>new file mode 100644
>index 0000000000000000000000000000000000000000..00facfbb88d1efc9ce639b89d7016ba53d4d49ae
>--- /dev/null
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/icecredentialsiterator_unittest.cc
>@@ -0,0 +1,49 @@
>+/*
>+ *  Copyright 2018 The WebRTC Project Authors. All rights reserved.
>+ *
>+ *  Use of this source code is governed by a BSD-style license
>+ *  that can be found in the LICENSE file in the root of the source
>+ *  tree. An additional intellectual property rights grant can be found
>+ *  in the file PATENTS.  All contributing project authors may
>+ *  be found in the AUTHORS file in the root of the source tree.
>+ */
>+
>+#include <memory>
>+#include <string>
>+#include <vector>
>+
>+#include "p2p/base/icecredentialsiterator.h"
>+#include "rtc_base/gunit.h"
>+
>+using cricket::IceParameters;
>+using cricket::IceCredentialsIterator;
>+
>+TEST(IceCredentialsIteratorTest, GetEmpty) {
>+  std::vector<IceParameters> empty;
>+  IceCredentialsIterator iterator(empty);
>+  // Verify that we can get credentials even if input is empty.
>+  IceParameters credentials1 = iterator.GetIceCredentials();
>+}
>+
>+TEST(IceCredentialsIteratorTest, GetOne) {
>+  std::vector<IceParameters> one = {
>+      IceCredentialsIterator::CreateRandomIceCredentials()};
>+  IceCredentialsIterator iterator(one);
>+  EXPECT_EQ(iterator.GetIceCredentials(), one[0]);
>+  auto random = iterator.GetIceCredentials();
>+  EXPECT_NE(random, one[0]);
>+  EXPECT_NE(random, iterator.GetIceCredentials());
>+}
>+
>+TEST(IceCredentialsIteratorTest, GetTwo) {
>+  std::vector<IceParameters> two = {
>+      IceCredentialsIterator::CreateRandomIceCredentials(),
>+      IceCredentialsIterator::CreateRandomIceCredentials()};
>+  IceCredentialsIterator iterator(two);
>+  EXPECT_EQ(iterator.GetIceCredentials(), two[1]);
>+  EXPECT_EQ(iterator.GetIceCredentials(), two[0]);
>+  auto random = iterator.GetIceCredentials();
>+  EXPECT_NE(random, two[0]);
>+  EXPECT_NE(random, two[1]);
>+  EXPECT_NE(random, iterator.GetIceCredentials());
>+}
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/icetransportinternal.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/icetransportinternal.h
>index 7483c3babc2e21ecc2d397628db6441fa073a42b..099cea70e8f890837d9b420f089bb86c691b820f 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/icetransportinternal.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/icetransportinternal.h
>@@ -15,11 +15,12 @@
> #include <vector>
> 
> #include "api/candidate.h"
>+#include "api/transport/enums.h"
> #include "p2p/base/candidatepairinterface.h"
> #include "p2p/base/packettransportinternal.h"
> #include "p2p/base/port.h"
> #include "p2p/base/transportdescription.h"
>-#include "rtc_base/stringencode.h"
>+#include "rtc_base/system/rtc_export.h"
> 
> namespace cricket {
> 
>@@ -181,14 +182,17 @@ enum IceProtocolType {
> 
> // IceTransportInternal is an internal abstract class that does ICE.
> // Once the public interface is supported, >-// (https://www.w3.org/TR/webrtc/#rtcicetransport-interface) >+// (https://www.w3.org/TR/webrtc/#rtcicetransport) > // the IceTransportInterface will be split from this class. >-class IceTransportInternal : public rtc::PacketTransportInternal { >+class RTC_EXPORT IceTransportInternal : public rtc::PacketTransportInternal { > public: > IceTransportInternal(); > ~IceTransportInternal() override; > >+ // TODO(bugs.webrtc.org/9308): Remove GetState once all uses have been >+ // migrated to GetIceTransportState. > virtual IceTransportState GetState() const = 0; >+ virtual webrtc::IceTransportState GetIceTransportState() const = 0; > > virtual int component() const = 0; > >@@ -258,8 +262,13 @@ class IceTransportInternal : public rtc::PacketTransportInternal { > sigslot::signal1<IceTransportInternal*> SignalRoleConflict; > > // Emitted whenever the transport state changed. >+ // TODO(bugs.webrtc.org/9308): Remove once all uses have migrated to the new >+ // IceTransportState. > sigslot::signal1<IceTransportInternal*> SignalStateChanged; > >+ // Emitted whenever the new standards-compliant transport state changed. >+ sigslot::signal1<IceTransportInternal*> SignalIceTransportStateChanged; >+ > // Invoked when the transport is being destroyed. > sigslot::signal1<IceTransportInternal*> SignalDestroyed; > }; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/mdns_message.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/mdns_message.cc >index 61af6f39f29c898e3c2a0db21ac5097c7381c60b..f14a0d117e35ea39c221ee1fff0cf7823d994f81 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/mdns_message.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/mdns_message.cc >@@ -19,13 +19,13 @@ namespace { > // RFC 1035, Section 4.1.1. > // > // QR bit. >-constexpr uint16_t kMDnsFlagMaskQueryOrResponse = 0x8000; >+constexpr uint16_t kMdnsFlagMaskQueryOrResponse = 0x8000; > // AA bit. 
>-constexpr uint16_t kMDnsFlagMaskAuthoritative = 0x0400; >+constexpr uint16_t kMdnsFlagMaskAuthoritative = 0x0400; > // RFC 1035, Section 4.1.2, QCLASS and RFC 6762, Section 18.12, repurposing of > // top bit of QCLASS as the unicast response bit. >-constexpr uint16_t kMDnsQClassMaskUnicastResponse = 0x8000; >-constexpr size_t kMDnsHeaderSizeBytes = 12; >+constexpr uint16_t kMdnsQClassMaskUnicastResponse = 0x8000; >+constexpr size_t kMdnsHeaderSizeBytes = 12; > > bool ReadDomainName(MessageBufferReader* buf, std::string* name) { > size_t name_start_pos = buf->CurrentOffset(); >@@ -64,7 +64,7 @@ bool ReadDomainName(MessageBufferReader* buf, std::string* name) { > // A legitimate pointer only refers to a prior occurrence of the same name, > // and we should only move strictly backward to a prior name field after the > // header. >- if (pos_jump_to >= name_start_pos || pos_jump_to < kMDnsHeaderSizeBytes) { >+ if (pos_jump_to >= name_start_pos || pos_jump_to < kMdnsHeaderSizeBytes) { > return false; > } > MessageBufferReader new_buf(buf->MessageData(), buf->MessageLength()); >@@ -88,27 +88,27 @@ void WriteDomainName(rtc::ByteBufferWriter* buf, const std::string& name) { > > } // namespace > >-void MDnsHeader::SetQueryOrResponse(bool is_query) { >+void MdnsHeader::SetQueryOrResponse(bool is_query) { > if (is_query) { >- flags &= ~kMDnsFlagMaskQueryOrResponse; >+ flags &= ~kMdnsFlagMaskQueryOrResponse; > } else { >- flags |= kMDnsFlagMaskQueryOrResponse; >+ flags |= kMdnsFlagMaskQueryOrResponse; > } > } > >-void MDnsHeader::SetAuthoritative(bool is_authoritative) { >+void MdnsHeader::SetAuthoritative(bool is_authoritative) { > if (is_authoritative) { >- flags |= kMDnsFlagMaskAuthoritative; >+ flags |= kMdnsFlagMaskAuthoritative; > } else { >- flags &= ~kMDnsFlagMaskAuthoritative; >+ flags &= ~kMdnsFlagMaskAuthoritative; > } > } > >-bool MDnsHeader::IsAuthoritative() const { >- return flags & kMDnsFlagMaskAuthoritative; >+bool MdnsHeader::IsAuthoritative() const { >+ 
return flags & kMdnsFlagMaskAuthoritative; > } > >-bool MDnsHeader::Read(MessageBufferReader* buf) { >+bool MdnsHeader::Read(MessageBufferReader* buf) { > if (!buf->ReadUInt16(&id) || !buf->ReadUInt16(&flags) || > !buf->ReadUInt16(&qdcount) || !buf->ReadUInt16(&ancount) || > !buf->ReadUInt16(&nscount) || !buf->ReadUInt16(&arcount)) { >@@ -118,7 +118,7 @@ bool MDnsHeader::Read(MessageBufferReader* buf) { > return true; > } > >-void MDnsHeader::Write(rtc::ByteBufferWriter* buf) const { >+void MdnsHeader::Write(rtc::ByteBufferWriter* buf) const { > buf->WriteUInt16(id); > buf->WriteUInt16(flags); > buf->WriteUInt16(qdcount); >@@ -127,15 +127,15 @@ void MDnsHeader::Write(rtc::ByteBufferWriter* buf) const { > buf->WriteUInt16(arcount); > } > >-bool MDnsHeader::IsQuery() const { >- return !(flags & kMDnsFlagMaskQueryOrResponse); >+bool MdnsHeader::IsQuery() const { >+ return !(flags & kMdnsFlagMaskQueryOrResponse); > } > >-MDnsSectionEntry::MDnsSectionEntry() = default; >-MDnsSectionEntry::~MDnsSectionEntry() = default; >-MDnsSectionEntry::MDnsSectionEntry(const MDnsSectionEntry& other) = default; >+MdnsSectionEntry::MdnsSectionEntry() = default; >+MdnsSectionEntry::~MdnsSectionEntry() = default; >+MdnsSectionEntry::MdnsSectionEntry(const MdnsSectionEntry& other) = default; > >-void MDnsSectionEntry::SetType(SectionEntryType type) { >+void MdnsSectionEntry::SetType(SectionEntryType type) { > switch (type) { > case SectionEntryType::kA: > type_ = 1; >@@ -148,7 +148,7 @@ void MDnsSectionEntry::SetType(SectionEntryType type) { > } > } > >-SectionEntryType MDnsSectionEntry::GetType() const { >+SectionEntryType MdnsSectionEntry::GetType() const { > switch (type_) { > case 1: > return SectionEntryType::kA; >@@ -159,7 +159,7 @@ SectionEntryType MDnsSectionEntry::GetType() const { > } > } > >-void MDnsSectionEntry::SetClass(SectionEntryClass cls) { >+void MdnsSectionEntry::SetClass(SectionEntryClass cls) { > switch (cls) { > case SectionEntryClass::kIN: > class_ = 1; >@@ -169,7 
+169,7 @@ void MDnsSectionEntry::SetClass(SectionEntryClass cls) { > } > } > >-SectionEntryClass MDnsSectionEntry::GetClass() const { >+SectionEntryClass MdnsSectionEntry::GetClass() const { > switch (class_) { > case 1: > return SectionEntryClass::kIN; >@@ -178,11 +178,11 @@ SectionEntryClass MDnsSectionEntry::GetClass() const { > } > } > >-MDnsQuestion::MDnsQuestion() = default; >-MDnsQuestion::MDnsQuestion(const MDnsQuestion& other) = default; >-MDnsQuestion::~MDnsQuestion() = default; >+MdnsQuestion::MdnsQuestion() = default; >+MdnsQuestion::MdnsQuestion(const MdnsQuestion& other) = default; >+MdnsQuestion::~MdnsQuestion() = default; > >-bool MDnsQuestion::Read(MessageBufferReader* buf) { >+bool MdnsQuestion::Read(MessageBufferReader* buf) { > if (!ReadDomainName(buf, &name_)) { > RTC_LOG(LS_ERROR) << "Invalid name."; > return false; >@@ -194,31 +194,31 @@ bool MDnsQuestion::Read(MessageBufferReader* buf) { > return true; > } > >-bool MDnsQuestion::Write(rtc::ByteBufferWriter* buf) const { >+bool MdnsQuestion::Write(rtc::ByteBufferWriter* buf) const { > WriteDomainName(buf, name_); > buf->WriteUInt16(type_); > buf->WriteUInt16(class_); > return true; > } > >-void MDnsQuestion::SetUnicastResponse(bool should_unicast) { >+void MdnsQuestion::SetUnicastResponse(bool should_unicast) { > if (should_unicast) { >- class_ |= kMDnsQClassMaskUnicastResponse; >+ class_ |= kMdnsQClassMaskUnicastResponse; > } else { >- class_ &= ~kMDnsQClassMaskUnicastResponse; >+ class_ &= ~kMdnsQClassMaskUnicastResponse; > } > } > >-bool MDnsQuestion::ShouldUnicastResponse() const { >- return class_ & kMDnsQClassMaskUnicastResponse; >+bool MdnsQuestion::ShouldUnicastResponse() const { >+ return class_ & kMdnsQClassMaskUnicastResponse; > } > >-MDnsResourceRecord::MDnsResourceRecord() = default; >-MDnsResourceRecord::MDnsResourceRecord(const MDnsResourceRecord& other) = >+MdnsResourceRecord::MdnsResourceRecord() = default; >+MdnsResourceRecord::MdnsResourceRecord(const MdnsResourceRecord& 
other) = > default; >-MDnsResourceRecord::~MDnsResourceRecord() = default; >+MdnsResourceRecord::~MdnsResourceRecord() = default; > >-bool MDnsResourceRecord::Read(MessageBufferReader* buf) { >+bool MdnsResourceRecord::Read(MessageBufferReader* buf) { > if (!ReadDomainName(buf, &name_)) { > return false; > } >@@ -239,17 +239,17 @@ bool MDnsResourceRecord::Read(MessageBufferReader* buf) { > } > return false; > } >-bool MDnsResourceRecord::ReadARData(MessageBufferReader* buf) { >+bool MdnsResourceRecord::ReadARData(MessageBufferReader* buf) { > // A RDATA contains a 32-bit IPv4 address. > return buf->ReadString(&rdata_, 4); > } > >-bool MDnsResourceRecord::ReadQuadARData(MessageBufferReader* buf) { >+bool MdnsResourceRecord::ReadQuadARData(MessageBufferReader* buf) { > // AAAA RDATA contains a 128-bit IPv6 address. > return buf->ReadString(&rdata_, 16); > } > >-bool MDnsResourceRecord::Write(rtc::ByteBufferWriter* buf) const { >+bool MdnsResourceRecord::Write(rtc::ByteBufferWriter* buf) const { > WriteDomainName(buf, name_); > buf->WriteUInt16(type_); > buf->WriteUInt16(class_); >@@ -270,15 +270,15 @@ bool MDnsResourceRecord::Write(rtc::ByteBufferWriter* buf) const { > return true; > } > >-void MDnsResourceRecord::WriteARData(rtc::ByteBufferWriter* buf) const { >+void MdnsResourceRecord::WriteARData(rtc::ByteBufferWriter* buf) const { > buf->WriteString(rdata_); > } > >-void MDnsResourceRecord::WriteQuadARData(rtc::ByteBufferWriter* buf) const { >+void MdnsResourceRecord::WriteQuadARData(rtc::ByteBufferWriter* buf) const { > buf->WriteString(rdata_); > } > >-bool MDnsResourceRecord::SetIPAddressInRecordData( >+bool MdnsResourceRecord::SetIPAddressInRecordData( > const rtc::IPAddress& address) { > int af = address.family(); > if (af != AF_INET && af != AF_INET6) { >@@ -293,7 +293,7 @@ bool MDnsResourceRecord::SetIPAddressInRecordData( > return true; > } > >-bool MDnsResourceRecord::GetIPAddressFromRecordData( >+bool MdnsResourceRecord::GetIPAddressFromRecordData( > 
rtc::IPAddress* address) const { > if (GetType() != SectionEntryType::kA && > GetType() != SectionEntryType::kAAAA) { >@@ -310,16 +310,16 @@ bool MDnsResourceRecord::GetIPAddressFromRecordData( > return rtc::IPFromString(std::string(out), address); > } > >-MDnsMessage::MDnsMessage() = default; >-MDnsMessage::~MDnsMessage() = default; >+MdnsMessage::MdnsMessage() = default; >+MdnsMessage::~MdnsMessage() = default; > >-bool MDnsMessage::Read(MessageBufferReader* buf) { >+bool MdnsMessage::Read(MessageBufferReader* buf) { > RTC_DCHECK_EQ(0u, buf->CurrentOffset()); > if (!header_.Read(buf)) { > return false; > } > >- auto read_question = [&buf](std::vector<MDnsQuestion>* section, >+ auto read_question = [&buf](std::vector<MdnsQuestion>* section, > uint16_t count) { > section->resize(count); > for (auto& question : (*section)) { >@@ -329,7 +329,7 @@ bool MDnsMessage::Read(MessageBufferReader* buf) { > } > return true; > }; >- auto read_rr = [&buf](std::vector<MDnsResourceRecord>* section, >+ auto read_rr = [&buf](std::vector<MdnsResourceRecord>* section, > uint16_t count) { > section->resize(count); > for (auto& rr : (*section)) { >@@ -349,10 +349,10 @@ bool MDnsMessage::Read(MessageBufferReader* buf) { > return true; > } > >-bool MDnsMessage::Write(rtc::ByteBufferWriter* buf) const { >+bool MdnsMessage::Write(rtc::ByteBufferWriter* buf) const { > header_.Write(buf); > >- auto write_rr = [&buf](const std::vector<MDnsResourceRecord>& section) { >+ auto write_rr = [&buf](const std::vector<MdnsResourceRecord>& section) { > for (auto rr : section) { > if (!rr.Write(buf)) { > return false; >@@ -374,7 +374,7 @@ bool MDnsMessage::Write(rtc::ByteBufferWriter* buf) const { > return true; > } > >-bool MDnsMessage::ShouldUnicastResponse() const { >+bool MdnsMessage::ShouldUnicastResponse() const { > bool should_unicast = false; > for (const auto& question : question_section_) { > should_unicast |= question.ShouldUnicastResponse(); >@@ -382,12 +382,12 @@ bool 
MDnsMessage::ShouldUnicastResponse() const { > return should_unicast; > } > >-void MDnsMessage::AddQuestion(const MDnsQuestion& question) { >+void MdnsMessage::AddQuestion(const MdnsQuestion& question) { > question_section_.push_back(question); > header_.qdcount = question_section_.size(); > } > >-void MDnsMessage::AddAnswerRecord(const MDnsResourceRecord& answer) { >+void MdnsMessage::AddAnswerRecord(const MdnsResourceRecord& answer) { > answer_section_.push_back(answer); > header_.ancount = answer_section_.size(); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/mdns_message.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/mdns_message.h >index f6dec20a956d53020748dfc9cb2b489fd4098207..7aa1b21039a530f11f8c8f8551c9b50cad3ddc52 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/mdns_message.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/mdns_message.h >@@ -15,7 +15,7 @@ > // 6762 and RFC 1025 (DNS messages). Note that it is recommended by RFC 6762 to > // use the name compression scheme defined in RFC 1035 whenever possible. We > // currently only implement the capability of reading compressed names in mDNS >-// messages in MDnsMessage::Read(); however, the MDnsMessage::Write() does not >+// messages in MdnsMessage::Read(); however, the MdnsMessage::Write() does not > // support name compression yet. > // > // Fuzzer tests (test/fuzzers/mdns_parser_fuzzer.cc) MUST always be performed >@@ -50,7 +50,7 @@ enum class SectionEntryClass { > }; > > // RFC 1035, Section 4.1.1. >-class MDnsHeader final { >+class MdnsHeader final { > public: > bool Read(MessageBufferReader* buf); > void Write(rtc::ByteBufferWriter* buf) const; >@@ -74,11 +74,11 @@ class MDnsHeader final { > > // Entries in each section after the header share a common structure. Note that > // this is not a concept defined in RFC 1035. 
>-class MDnsSectionEntry { >+class MdnsSectionEntry { > public: >- MDnsSectionEntry(); >- MDnsSectionEntry(const MDnsSectionEntry& other); >- virtual ~MDnsSectionEntry(); >+ MdnsSectionEntry(); >+ MdnsSectionEntry(const MdnsSectionEntry& other); >+ virtual ~MdnsSectionEntry(); > virtual bool Read(MessageBufferReader* buf) = 0; > virtual bool Write(rtc::ByteBufferWriter* buf) const = 0; > >@@ -99,11 +99,11 @@ class MDnsSectionEntry { > }; > > // RFC 1035, Section 4.1.2. >-class MDnsQuestion final : public MDnsSectionEntry { >+class MdnsQuestion final : public MdnsSectionEntry { > public: >- MDnsQuestion(); >- MDnsQuestion(const MDnsQuestion& other); >- ~MDnsQuestion() override; >+ MdnsQuestion(); >+ MdnsQuestion(const MdnsQuestion& other); >+ ~MdnsQuestion() override; > > bool Read(MessageBufferReader* buf) override; > bool Write(rtc::ByteBufferWriter* buf) const override; >@@ -113,11 +113,11 @@ class MDnsQuestion final : public MDnsSectionEntry { > }; > > // RFC 1035, Section 4.1.3. >-class MDnsResourceRecord final : public MDnsSectionEntry { >+class MdnsResourceRecord final : public MdnsSectionEntry { > public: >- MDnsResourceRecord(); >- MDnsResourceRecord(const MDnsResourceRecord& other); >- ~MDnsResourceRecord() override; >+ MdnsResourceRecord(); >+ MdnsResourceRecord(const MdnsResourceRecord& other); >+ ~MdnsResourceRecord() override; > > bool Read(MessageBufferReader* buf) override; > bool Write(rtc::ByteBufferWriter* buf) const override; >@@ -145,17 +145,17 @@ class MDnsResourceRecord final : public MDnsSectionEntry { > std::string rdata_; > }; > >-class MDnsMessage final { >+class MdnsMessage final { > public: > // RFC 1035, Section 4.1. > enum class Section { kQuestion, kAnswer, kAuthority, kAdditional }; > >- MDnsMessage(); >- ~MDnsMessage(); >+ MdnsMessage(); >+ ~MdnsMessage(); > // Reads the mDNS message in |buf| and populates the corresponding fields in >- // MDnsMessage. >+ // MdnsMessage. 
> bool Read(MessageBufferReader* buf); >- // Write an mDNS message to |buf| based on the fields in MDnsMessage. >+ // Write an mDNS message to |buf| based on the fields in MdnsMessage. > // > // TODO(qingsi): Implement name compression when writing mDNS messages. > bool Write(rtc::ByteBufferWriter* buf) const; >@@ -177,29 +177,29 @@ class MDnsMessage final { > // preferred. False otherwise. > bool ShouldUnicastResponse() const; > >- void AddQuestion(const MDnsQuestion& question); >+ void AddQuestion(const MdnsQuestion& question); > // TODO(qingsi): Implement AddXRecord for name server and additional records. >- void AddAnswerRecord(const MDnsResourceRecord& answer); >+ void AddAnswerRecord(const MdnsResourceRecord& answer); > >- const std::vector<MDnsQuestion>& question_section() const { >+ const std::vector<MdnsQuestion>& question_section() const { > return question_section_; > } >- const std::vector<MDnsResourceRecord>& answer_section() const { >+ const std::vector<MdnsResourceRecord>& answer_section() const { > return answer_section_; > } >- const std::vector<MDnsResourceRecord>& authority_section() const { >+ const std::vector<MdnsResourceRecord>& authority_section() const { > return authority_section_; > } >- const std::vector<MDnsResourceRecord>& additional_section() const { >+ const std::vector<MdnsResourceRecord>& additional_section() const { > return additional_section_; > } > > private: >- MDnsHeader header_; >- std::vector<MDnsQuestion> question_section_; >- std::vector<MDnsResourceRecord> answer_section_; >- std::vector<MDnsResourceRecord> authority_section_; >- std::vector<MDnsResourceRecord> additional_section_; >+ MdnsHeader header_; >+ std::vector<MdnsQuestion> question_section_; >+ std::vector<MdnsResourceRecord> answer_section_; >+ std::vector<MdnsResourceRecord> authority_section_; >+ std::vector<MdnsResourceRecord> additional_section_; > }; > > } // namespace webrtc >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/mdns_message_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/mdns_message_unittest.cc >index fb4a3b1e7a2615609d1a5a43d2d3abef00448f72..7f816f740b8a40d471fa1b1ecb372485f4ffb682 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/mdns_message_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/mdns_message_unittest.cc >@@ -19,9 +19,9 @@ > #include "rtc_base/socketaddress.h" > #include "test/gmock.h" > >-#define ReadMDnsMessage(X, Y) ReadMDnsMessageTestCase(X, Y, sizeof(Y)) >-#define WriteMDnsMessageAndCompare(X, Y) \ >- WriteMDnsMessageAndCompareWithTestCast(X, Y, sizeof(Y)) >+#define ReadMdnsMessage(X, Y) ReadMdnsMessageTestCase(X, Y, sizeof(Y)) >+#define WriteMdnsMessageAndCompare(X, Y) \ >+ WriteMdnsMessageAndCompareWithTestCast(X, Y, sizeof(Y)) > > using ::testing::ElementsAre; > using ::testing::Pair; >@@ -255,14 +255,14 @@ const uint8_t kCorruptedAnswerWithNameCompression2[] = { > 0xc0, 0xA8, 0x00, 0x01, // 192.168.0.1 > }; > >-bool ReadMDnsMessageTestCase(MDnsMessage* msg, >+bool ReadMdnsMessageTestCase(MdnsMessage* msg, > const uint8_t* testcase, > size_t size) { > MessageBufferReader buf(reinterpret_cast<const char*>(testcase), size); > return msg->Read(&buf); > } > >-void WriteMDnsMessageAndCompareWithTestCast(MDnsMessage* msg, >+void WriteMdnsMessageAndCompareWithTestCast(MdnsMessage* msg, > const uint8_t* testcase, > size_t size) { > rtc::ByteBufferWriter out; >@@ -276,7 +276,7 @@ void WriteMDnsMessageAndCompareWithTestCast(MDnsMessage* msg, > EXPECT_EQ(testcase_bytes, bytes); > } > >-bool GetQueriedNames(MDnsMessage* msg, std::set<std::string>* names) { >+bool GetQueriedNames(MdnsMessage* msg, std::set<std::string>* names) { > if (!msg->IsQuery() || msg->question_section().empty()) { > return false; > } >@@ -286,7 +286,7 @@ bool GetQueriedNames(MDnsMessage* msg, std::set<std::string>* names) { > return true; > } > >-bool GetResolution(MDnsMessage* msg, 
>+bool GetResolution(MdnsMessage* msg, > std::map<std::string, rtc::IPAddress>* names) { > if (msg->IsQuery() || msg->answer_section().empty()) { > return false; >@@ -303,10 +303,10 @@ bool GetResolution(MDnsMessage* msg, > > } // namespace > >-TEST(MDnsMessageTest, ReadSingleQuestionForIPv4Address) { >- MDnsMessage msg; >+TEST(MdnsMessageTest, ReadSingleQuestionForIPv4Address) { >+ MdnsMessage msg; > ASSERT_TRUE( >- ReadMDnsMessage(&msg, kSingleQuestionForIPv4AddrWithUnicastResponse)); >+ ReadMdnsMessage(&msg, kSingleQuestionForIPv4AddrWithUnicastResponse)); > EXPECT_TRUE(msg.IsQuery()); > EXPECT_EQ(0x1234, msg.GetId()); > ASSERT_EQ(1u, msg.question_section().size()); >@@ -323,9 +323,9 @@ TEST(MDnsMessageTest, ReadSingleQuestionForIPv4Address) { > EXPECT_THAT(queried_names, ElementsAre("webrtc.org.")); > } > >-TEST(MDnsMessageTest, ReadTwoQuestionsForIPv4AndIPv6Addr) { >- MDnsMessage msg; >- ASSERT_TRUE(ReadMDnsMessage( >+TEST(MdnsMessageTest, ReadTwoQuestionsForIPv4AndIPv6Addr) { >+ MdnsMessage msg; >+ ASSERT_TRUE(ReadMdnsMessage( > &msg, kTwoQuestionsForIPv4AndIPv6AddrWithMulticastResponse)); > EXPECT_TRUE(msg.IsQuery()); > EXPECT_EQ(0x1234, msg.GetId()); >@@ -345,9 +345,9 @@ TEST(MDnsMessageTest, ReadTwoQuestionsForIPv4AndIPv6Addr) { > UnorderedElementsAre("webrtc4.org.", "webrtc6.org.")); > } > >-TEST(MDnsMessageTest, ReadTwoQuestionsForIPv4AndIPv6AddrWithNameCompression) { >- MDnsMessage msg; >- ASSERT_TRUE(ReadMDnsMessage( >+TEST(MdnsMessageTest, ReadTwoQuestionsForIPv4AndIPv6AddrWithNameCompression) { >+ MdnsMessage msg; >+ ASSERT_TRUE(ReadMdnsMessage( > &msg, > kTwoQuestionsForIPv4AndIPv6AddrWithMulticastResponseAndNameCompression)); > >@@ -363,10 +363,10 @@ TEST(MDnsMessageTest, ReadTwoQuestionsForIPv4AndIPv6AddrWithNameCompression) { > UnorderedElementsAre("www.webrtc.org.", "mdns.webrtc.org.")); > } > >-TEST(MDnsMessageTest, ReadThreeQuestionsWithTwoPointersToTheSameNameSuffix) { >- MDnsMessage msg; >+TEST(MdnsMessageTest, 
ReadThreeQuestionsWithTwoPointersToTheSameNameSuffix) { >+ MdnsMessage msg; > ASSERT_TRUE( >- ReadMDnsMessage(&msg, kThreeQuestionsWithTwoPointersToTheSameNameSuffix)); >+ ReadMdnsMessage(&msg, kThreeQuestionsWithTwoPointersToTheSameNameSuffix)); > > ASSERT_EQ(3u, msg.question_section().size()); > const auto& question1 = msg.question_section()[0]; >@@ -383,10 +383,10 @@ TEST(MDnsMessageTest, ReadThreeQuestionsWithTwoPointersToTheSameNameSuffix) { > "webrtc.org.")); > } > >-TEST(MDnsMessageTest, >+TEST(MdnsMessageTest, > ReadThreeQuestionsWithPointerToNameSuffixContainingAnotherPointer) { >- MDnsMessage msg; >- ASSERT_TRUE(ReadMDnsMessage( >+ MdnsMessage msg; >+ ASSERT_TRUE(ReadMdnsMessage( > &msg, kThreeQuestionsWithPointerToNameSuffixContainingAnotherPointer)); > > ASSERT_EQ(3u, msg.question_section().size()); >@@ -404,16 +404,16 @@ TEST(MDnsMessageTest, > "www.mdns.webrtc.org.")); > } > >-TEST(MDnsMessageTest, >+TEST(MdnsMessageTest, > ReadQuestionWithCorruptedPointerInNameCompressionShouldFail) { >- MDnsMessage msg; >- EXPECT_FALSE(ReadMDnsMessage(&msg, kCorruptedQuestionWithNameCompression1)); >- EXPECT_FALSE(ReadMDnsMessage(&msg, kCorruptedQuestionWithNameCompression2)); >+ MdnsMessage msg; >+ EXPECT_FALSE(ReadMdnsMessage(&msg, kCorruptedQuestionWithNameCompression1)); >+ EXPECT_FALSE(ReadMdnsMessage(&msg, kCorruptedQuestionWithNameCompression2)); > } > >-TEST(MDnsMessageTest, ReadSingleAnswerForIPv4Addr) { >- MDnsMessage msg; >- ASSERT_TRUE(ReadMDnsMessage(&msg, kSingleAuthoritativeAnswerWithIPv4Addr)); >+TEST(MdnsMessageTest, ReadSingleAnswerForIPv4Addr) { >+ MdnsMessage msg; >+ ASSERT_TRUE(ReadMdnsMessage(&msg, kSingleAuthoritativeAnswerWithIPv4Addr)); > EXPECT_FALSE(msg.IsQuery()); > EXPECT_TRUE(msg.IsAuthoritative()); > EXPECT_EQ(0x1234, msg.GetId()); >@@ -432,10 +432,10 @@ TEST(MDnsMessageTest, ReadSingleAnswerForIPv4Addr) { > EXPECT_THAT(resolution, ElementsAre(Pair("webrtc.org.", expected_addr))); > } > >-TEST(MDnsMessageTest, 
ReadTwoAnswersForIPv4AndIPv6Addr) { >- MDnsMessage msg; >+TEST(MdnsMessageTest, ReadTwoAnswersForIPv4AndIPv6Addr) { >+ MdnsMessage msg; > ASSERT_TRUE( >- ReadMDnsMessage(&msg, kTwoAuthoritativeAnswersWithIPv4AndIPv6Addr)); >+ ReadMdnsMessage(&msg, kTwoAuthoritativeAnswersWithIPv4AndIPv6Addr)); > EXPECT_FALSE(msg.IsQuery()); > EXPECT_TRUE(msg.IsAuthoritative()); > EXPECT_EQ(0x1234, msg.GetId()); >@@ -462,9 +462,9 @@ TEST(MDnsMessageTest, ReadTwoAnswersForIPv4AndIPv6Addr) { > Pair("webrtc6.org.", expected_addr_ipv6))); > } > >-TEST(MDnsMessageTest, ReadTwoAnswersForIPv4AndIPv6AddrWithNameCompression) { >- MDnsMessage msg; >- ASSERT_TRUE(ReadMDnsMessage( >+TEST(MdnsMessageTest, ReadTwoAnswersForIPv4AndIPv6AddrWithNameCompression) { >+ MdnsMessage msg; >+ ASSERT_TRUE(ReadMdnsMessage( > &msg, kTwoAuthoritativeAnswersWithIPv4AndIPv6AddrWithNameCompression)); > > std::map<std::string, rtc::IPAddress> resolution; >@@ -478,57 +478,57 @@ TEST(MDnsMessageTest, ReadTwoAnswersForIPv4AndIPv6AddrWithNameCompression) { > Pair("webrtc.org.", expected_addr_ipv6))); > } > >-TEST(MDnsMessageTest, >+TEST(MdnsMessageTest, > ReadAnswerWithCorruptedPointerInNameCompressionShouldFail) { >- MDnsMessage msg; >- EXPECT_FALSE(ReadMDnsMessage(&msg, kCorruptedAnswerWithNameCompression1)); >- EXPECT_FALSE(ReadMDnsMessage(&msg, kCorruptedAnswerWithNameCompression2)); >+ MdnsMessage msg; >+ EXPECT_FALSE(ReadMdnsMessage(&msg, kCorruptedAnswerWithNameCompression1)); >+ EXPECT_FALSE(ReadMdnsMessage(&msg, kCorruptedAnswerWithNameCompression2)); > } > >-TEST(MDnsMessageTest, WriteSingleQuestionForIPv4Addr) { >- MDnsMessage msg; >+TEST(MdnsMessageTest, WriteSingleQuestionForIPv4Addr) { >+ MdnsMessage msg; > msg.SetId(0x1234); > msg.SetQueryOrResponse(true); > >- MDnsQuestion question; >+ MdnsQuestion question; > question.SetName("webrtc.org."); > question.SetType(SectionEntryType::kA); > question.SetClass(SectionEntryClass::kIN); > question.SetUnicastResponse(true); > msg.AddQuestion(question); > >- 
WriteMDnsMessageAndCompare(&msg, >+ WriteMdnsMessageAndCompare(&msg, > kSingleQuestionForIPv4AddrWithUnicastResponse); > } > >-TEST(MDnsMessageTest, WriteTwoQuestionsForIPv4AndIPv6Addr) { >- MDnsMessage msg; >+TEST(MdnsMessageTest, WriteTwoQuestionsForIPv4AndIPv6Addr) { >+ MdnsMessage msg; > msg.SetId(0x1234); > msg.SetQueryOrResponse(true); > >- MDnsQuestion question1; >+ MdnsQuestion question1; > question1.SetName("webrtc4.org."); > question1.SetType(SectionEntryType::kA); > question1.SetClass(SectionEntryClass::kIN); > msg.AddQuestion(question1); > >- MDnsQuestion question2; >+ MdnsQuestion question2; > question2.SetName("webrtc6.org."); > question2.SetType(SectionEntryType::kAAAA); > question2.SetClass(SectionEntryClass::kIN); > msg.AddQuestion(question2); > >- WriteMDnsMessageAndCompare( >+ WriteMdnsMessageAndCompare( > &msg, kTwoQuestionsForIPv4AndIPv6AddrWithMulticastResponse); > } > >-TEST(MDnsMessageTest, WriteSingleAnswerToIPv4Addr) { >- MDnsMessage msg; >+TEST(MdnsMessageTest, WriteSingleAnswerToIPv4Addr) { >+ MdnsMessage msg; > msg.SetId(0x1234); > msg.SetQueryOrResponse(false); > msg.SetAuthoritative(true); > >- MDnsResourceRecord answer; >+ MdnsResourceRecord answer; > answer.SetName("webrtc.org."); > answer.SetType(SectionEntryType::kA); > answer.SetClass(SectionEntryClass::kIN); >@@ -537,16 +537,16 @@ TEST(MDnsMessageTest, WriteSingleAnswerToIPv4Addr) { > answer.SetTtlSeconds(120); > msg.AddAnswerRecord(answer); > >- WriteMDnsMessageAndCompare(&msg, kSingleAuthoritativeAnswerWithIPv4Addr); >+ WriteMdnsMessageAndCompare(&msg, kSingleAuthoritativeAnswerWithIPv4Addr); > } > >-TEST(MDnsMessageTest, WriteTwoAnswersToIPv4AndIPv6Addr) { >- MDnsMessage msg; >+TEST(MdnsMessageTest, WriteTwoAnswersToIPv4AndIPv6Addr) { >+ MdnsMessage msg; > msg.SetId(0x1234); > msg.SetQueryOrResponse(false); > msg.SetAuthoritative(true); > >- MDnsResourceRecord answer1; >+ MdnsResourceRecord answer1; > answer1.SetName("webrtc4.org."); > answer1.SetType(SectionEntryType::kA); > 
answer1.SetClass(SectionEntryClass::kIN); >@@ -555,7 +555,7 @@ TEST(MDnsMessageTest, WriteTwoAnswersToIPv4AndIPv6Addr) { > answer1.SetTtlSeconds(60); > msg.AddAnswerRecord(answer1); > >- MDnsResourceRecord answer2; >+ MdnsResourceRecord answer2; > answer2.SetName("webrtc6.org."); > answer2.SetType(SectionEntryType::kAAAA); > answer2.SetClass(SectionEntryClass::kIN); >@@ -564,7 +564,7 @@ TEST(MDnsMessageTest, WriteTwoAnswersToIPv4AndIPv6Addr) { > answer2.SetTtlSeconds(120); > msg.AddAnswerRecord(answer2); > >- WriteMDnsMessageAndCompare(&msg, kTwoAuthoritativeAnswersWithIPv4AndIPv6Addr); >+ WriteMdnsMessageAndCompare(&msg, kTwoAuthoritativeAnswersWithIPv4AndIPv6Addr); > } > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/mockicetransport.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/mockicetransport.h >index d1863c52935209eee442a483c4265018f6eb2c70..b18ce3d5ade1da9400ccad202bd8adda8caa5fe4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/mockicetransport.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/mockicetransport.h >@@ -47,6 +47,10 @@ class MockIceTransport : public IceTransportInternal { > IceTransportState GetState() const override { > return IceTransportState::STATE_INIT; > } >+ webrtc::IceTransportState GetIceTransportState() const override { >+ return webrtc::IceTransportState::kNew; >+ } >+ > const std::string& transport_name() const override { return transport_name_; } > int component() const override { return 0; } > void SetIceRole(IceRole role) override {} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/p2pconstants.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/p2pconstants.h >index b2e92eff1352ac8c605ca0148498194a6a711bef..2948be0bb837d4c04dcbf2081f98b02d0d47d94b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/p2pconstants.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/p2pconstants.h >@@ -13,6 +13,8 @@ > > #include 
<string> > >+#include "rtc_base/system/rtc_export.h" >+ > namespace cricket { > > // CN_ == "content name". When we initiate a session, we choose the >@@ -29,7 +31,7 @@ extern const char CN_OTHER[]; > extern const char GROUP_TYPE_BUNDLE[]; > > extern const int ICE_UFRAG_LENGTH; >-extern const int ICE_PWD_LENGTH; >+RTC_EXPORT extern const int ICE_PWD_LENGTH; > extern const size_t ICE_UFRAG_MIN_LENGTH; > extern const size_t ICE_PWD_MIN_LENGTH; > extern const size_t ICE_UFRAG_MAX_LENGTH; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/p2ptransportchannel.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/p2ptransportchannel.cc >index b0abe24a0580a12a0fbec025e63af120dec67fb0..f61291c5ddc00826eb4247364ac4f5a1ba251203 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/p2ptransportchannel.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/p2ptransportchannel.cc >@@ -323,6 +323,10 @@ IceTransportState P2PTransportChannel::GetState() const { > return state_; > } > >+webrtc::IceTransportState P2PTransportChannel::GetIceTransportState() const { >+ return standardized_state_; >+} >+ > const std::string& P2PTransportChannel::transport_name() const { > return transport_name_; > } >@@ -387,6 +391,41 @@ IceTransportState P2PTransportChannel::ComputeState() const { > return IceTransportState::STATE_COMPLETED; > } > >+// Compute the current RTCIceTransportState as described in >+// https://www.w3.org/TR/webrtc/#dom-rtcicetransportstate >+// TODO(bugs.webrtc.org/9308): Return IceTransportState::kDisconnected when it >+// makes sense. >+// TODO(bugs.webrtc.org/9218): Avoid prematurely signalling kFailed once we have >+// implemented end-of-candidates signalling. 
>+webrtc::IceTransportState P2PTransportChannel::ComputeIceTransportState() >+ const { >+ bool has_connection = false; >+ for (Connection* connection : connections_) { >+ if (connection->active()) { >+ has_connection = true; >+ break; >+ } >+ } >+ >+ switch (gathering_state_) { >+ case kIceGatheringComplete: >+ if (has_connection) >+ return webrtc::IceTransportState::kCompleted; >+ else >+ return webrtc::IceTransportState::kFailed; >+ case kIceGatheringNew: >+ return webrtc::IceTransportState::kNew; >+ case kIceGatheringGathering: >+ if (has_connection) >+ return webrtc::IceTransportState::kConnected; >+ else >+ return webrtc::IceTransportState::kChecking; >+ default: >+ RTC_NOTREACHED(); >+ return webrtc::IceTransportState::kFailed; >+ } >+} >+ > void P2PTransportChannel::SetIceParameters(const IceParameters& ice_params) { > RTC_DCHECK(network_thread_ == rtc::Thread::Current()); > RTC_LOG(LS_INFO) << "Set ICE ufrag: " << ice_params.ufrag >@@ -1827,6 +1866,9 @@ void P2PTransportChannel::SwitchSelectedConnection(Connection* conn) { > // example, we call this at the end of SortConnectionsAndUpdateState. > void P2PTransportChannel::UpdateState() { > IceTransportState state = ComputeState(); >+ webrtc::IceTransportState current_standardized_state = >+ ComputeIceTransportState(); >+ > if (state_ != state) { > RTC_LOG(LS_INFO) << ToString() > << ": Transport channel state changed from " >@@ -1868,6 +1910,10 @@ void P2PTransportChannel::UpdateState() { > SignalStateChanged(this); > } > >+ if (standardized_state_ != current_standardized_state) { >+ standardized_state_ = current_standardized_state; >+ SignalIceTransportStateChanged(this); >+ } > // If our selected connection is "presumed writable" (TURN-TURN with no > // CreatePermission required), act like we're already writable to the upper > // layers, so they can start media quicker. 
>@@ -2319,7 +2365,7 @@ bool P2PTransportChannel::PrunePort(PortInterface* port) { > void P2PTransportChannel::OnReadPacket(Connection* connection, > const char* data, > size_t len, >- const rtc::PacketTime& packet_time) { >+ int64_t packet_time_us) { > RTC_DCHECK(network_thread_ == rtc::Thread::Current()); > > // Do not deliver, if packet doesn't belong to the correct transport channel. >@@ -2327,7 +2373,7 @@ void P2PTransportChannel::OnReadPacket(Connection* connection, > return; > > // Let the client know of an incoming packet >- SignalReadPacket(this, data, len, packet_time, 0); >+ SignalReadPacket(this, data, len, packet_time_us, 0); > > // May need to switch the sending connection based on the receiving media path > // if this is the controlled side. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/p2ptransportchannel.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/p2ptransportchannel.h >index fbed344cc33e6ec731c9f08600bddb5a8db01fc9..abedd0bdd3489375e52f9436122fcd619ca75369 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/p2ptransportchannel.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/p2ptransportchannel.h >@@ -42,6 +42,7 @@ > #include "rtc_base/asyncpacketsocket.h" > #include "rtc_base/constructormagic.h" > #include "rtc_base/strings/string_builder.h" >+#include "rtc_base/system/rtc_export.h" > #include "rtc_base/third_party/sigslot/sigslot.h" > > namespace webrtc { >@@ -75,7 +76,7 @@ class RemoteCandidate : public Candidate { > > // P2PTransportChannel manages the candidates and connection process to keep > // two P2P clients connected to each other. >-class P2PTransportChannel : public IceTransportInternal { >+class RTC_EXPORT P2PTransportChannel : public IceTransportInternal { > public: > // For testing only. > // TODO(zstein): Remove once AsyncResolverFactory is required. 
>@@ -91,6 +92,8 @@ class P2PTransportChannel : public IceTransportInternal { > > // From TransportChannelImpl: > IceTransportState GetState() const override; >+ webrtc::IceTransportState GetIceTransportState() const override; >+ > const std::string& transport_name() const override; > int component() const override; > bool writable() const override; >@@ -243,7 +246,13 @@ class P2PTransportChannel : public IceTransportInternal { > void UpdateState(); > void HandleAllTimedOut(); > void MaybeStopPortAllocatorSessions(); >+ >+ // ComputeIceTransportState computes the RTCIceTransportState as described in >+ // https://w3c.github.io/webrtc-pc/#dom-rtcicetransportstate. ComputeState >+ // computes the value we currently export as RTCIceTransportState. >+ // TODO(bugs.webrtc.org/9308): Remove ComputeState once it's no longer used. > IceTransportState ComputeState() const; >+ webrtc::IceTransportState ComputeIceTransportState() const; > > Connection* GetBestConnectionOnNetwork(rtc::Network* network) const; > bool CreateConnections(const Candidate& remote_candidate, >@@ -295,7 +304,7 @@ class P2PTransportChannel : public IceTransportInternal { > void OnReadPacket(Connection* connection, > const char* data, > size_t len, >- const rtc::PacketTime& packet_time); >+ int64_t packet_time_us); > void OnSentPacket(const rtc::SentPacket& sent_packet); > void OnReadyToSend(Connection* connection); > void OnConnectionDestroyed(Connection* connection); >@@ -407,7 +416,11 @@ class P2PTransportChannel : public IceTransportInternal { > std::unique_ptr<webrtc::BasicRegatheringController> regathering_controller_; > int64_t last_ping_sent_ms_ = 0; > int weak_ping_interval_ = WEAK_PING_INTERVAL; >+ // TODO(jonasolsson): Remove state_ and rename standardized_state_ once state_ >+ // is no longer used to compute the ICE connection state. 
> IceTransportState state_ = IceTransportState::STATE_INIT; >+ webrtc::IceTransportState standardized_state_ = >+ webrtc::IceTransportState::kNew; > IceConfig config_; > int last_sent_packet_id_ = -1; // -1 indicates no packet was sent before. > bool started_pinging_ = false; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/p2ptransportchannel_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/p2ptransportchannel_unittest.cc >index 76b918ca1e4bce92763bdee4571137a73eb5cf5f..2ab3d88afbc597ad7e520c7e8f1bf9ffc0c48cf6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/p2ptransportchannel_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/p2ptransportchannel_unittest.cc >@@ -699,7 +699,6 @@ class P2PTransportChannelTestBase : public testing::Test, > EXPECT_NE(info.packet_type, rtc::PacketType::kUnknown); > EXPECT_NE(info.protocol, rtc::PacketInfoProtocolType::kUnknown); > EXPECT_TRUE(info.network_id.has_value()); >- EXPECT_FALSE(info.local_socket_address.IsNil()); > } > > void OnReadyToSend(rtc::PacketTransportInternal* transport) { >@@ -811,7 +810,7 @@ class P2PTransportChannelTestBase : public testing::Test, > void OnReadPacket(rtc::PacketTransportInternal* transport, > const char* data, > size_t len, >- const rtc::PacketTime& packet_time, >+ const int64_t& /* packet_time_us */, > int flags) { > std::list<std::string>& packets = GetPacketList(transport); > packets.push_front(std::string(data, len)); >@@ -3204,7 +3203,7 @@ class P2PTransportChannelPingTest : public testing::Test, > msg.AddFingerprint(); > rtc::ByteBufferWriter buf; > msg.Write(&buf); >- conn->OnReadPacket(buf.Data(), buf.Length(), rtc::CreatePacketTime(0)); >+ conn->OnReadPacket(buf.Data(), buf.Length(), rtc::TimeMicros()); > } > > void OnReadyToSend(rtc::PacketTransportInternal* transport) { >@@ -3613,7 +3612,7 @@ TEST_F(P2PTransportChannelPingTest, TestReceivingStateChange) { > > clock.AdvanceTime(webrtc::TimeDelta::seconds(1)); > 
conn1->ReceivedPing(); >- conn1->OnReadPacket("ABC", 3, rtc::CreatePacketTime(0)); >+ conn1->OnReadPacket("ABC", 3, rtc::TimeMicros()); > EXPECT_TRUE_SIMULATED_WAIT(ch.receiving(), kShortTimeout, clock); > EXPECT_TRUE_SIMULATED_WAIT(!ch.receiving(), kShortTimeout, clock); > } >@@ -3804,7 +3803,7 @@ TEST_F(P2PTransportChannelPingTest, TestSelectConnectionBasedOnMediaReceived) { > Connection* conn2 = WaitForConnectionTo(&ch, "2.2.2.2", 2); > ASSERT_TRUE(conn2 != nullptr); > conn2->ReceivedPingResponse(LOW_RTT, "id"); // Become writable and receiving. >- conn2->OnReadPacket("ABC", 3, rtc::CreatePacketTime(0)); >+ conn2->OnReadPacket("ABC", 3, rtc::TimeMicros()); > EXPECT_EQ(conn2, ch.selected_connection()); > conn2->ReceivedPingResponse(LOW_RTT, "id"); // Become writable. > >@@ -3832,7 +3831,7 @@ TEST_F(P2PTransportChannelPingTest, TestSelectConnectionBasedOnMediaReceived) { > // selected connection was nominated by the controlling side. > conn2->ReceivedPing(); > conn2->ReceivedPingResponse(LOW_RTT, "id"); >- conn2->OnReadPacket("XYZ", 3, rtc::CreatePacketTime(0)); >+ conn2->OnReadPacket("XYZ", 3, rtc::TimeMicros()); > EXPECT_EQ_WAIT(conn3, ch.selected_connection(), kDefaultTimeout); > } > >@@ -3861,12 +3860,12 @@ TEST_F(P2PTransportChannelPingTest, > // Advance the clock by 1ms so that the last data receiving timestamp of > // conn2 is larger. > SIMULATED_WAIT(false, 1, clock); >- conn2->OnReadPacket("XYZ", 3, rtc::CreatePacketTime(0)); >+ conn2->OnReadPacket("XYZ", 3, rtc::TimeMicros()); > EXPECT_EQ(1, reset_selected_candidate_pair_switches()); > EXPECT_TRUE(CandidatePairMatchesNetworkRoute(conn2)); > > // conn1 also receives data; it becomes selected due to priority again. 
>- conn1->OnReadPacket("XYZ", 3, rtc::CreatePacketTime(0)); >+ conn1->OnReadPacket("XYZ", 3, rtc::TimeMicros()); > EXPECT_EQ(1, reset_selected_candidate_pair_switches()); > EXPECT_TRUE(CandidatePairMatchesNetworkRoute(conn2)); > >@@ -3875,7 +3874,7 @@ TEST_F(P2PTransportChannelPingTest, > SIMULATED_WAIT(false, 1, clock); > // Need to become writable again because it was pruned. > conn2->ReceivedPingResponse(LOW_RTT, "id"); >- conn2->OnReadPacket("XYZ", 3, rtc::CreatePacketTime(0)); >+ conn2->OnReadPacket("XYZ", 3, rtc::TimeMicros()); > EXPECT_EQ(1, reset_selected_candidate_pair_switches()); > EXPECT_TRUE(CandidatePairMatchesNetworkRoute(conn2)); > >@@ -3905,7 +3904,7 @@ TEST_F(P2PTransportChannelPingTest, > // conn1 received data; it is the selected connection. > // Advance the clock to have a non-zero last-data-receiving time. > SIMULATED_WAIT(false, 1, clock); >- conn1->OnReadPacket("XYZ", 3, rtc::CreatePacketTime(0)); >+ conn1->OnReadPacket("XYZ", 3, rtc::TimeMicros()); > EXPECT_EQ(1, reset_selected_candidate_pair_switches()); > EXPECT_TRUE(CandidatePairMatchesNetworkRoute(conn1)); > >@@ -4126,7 +4125,7 @@ TEST_F(P2PTransportChannelPingTest, TestDontPruneHighPriorityConnections) { > // conn2. 
> NominateConnection(conn1); > SIMULATED_WAIT(false, 1, clock); >- conn1->OnReadPacket("XYZ", 3, rtc::CreatePacketTime(0)); >+ conn1->OnReadPacket("XYZ", 3, rtc::TimeMicros()); > SIMULATED_WAIT(conn2->pruned(), 100, clock); > EXPECT_FALSE(conn2->pruned()); > } >@@ -4137,6 +4136,7 @@ TEST_F(P2PTransportChannelPingTest, TestGetState) { > clock.AdvanceTime(webrtc::TimeDelta::seconds(1)); > FakePortAllocator pa(rtc::Thread::Current(), nullptr); > P2PTransportChannel ch("test channel", 1, &pa); >+ EXPECT_EQ(webrtc::IceTransportState::kNew, ch.GetIceTransportState()); > PrepareChannel(&ch); > ch.MaybeStartGathering(); > EXPECT_EQ(IceTransportState::STATE_INIT, ch.GetState()); >@@ -4144,6 +4144,8 @@ TEST_F(P2PTransportChannelPingTest, TestGetState) { > ch.AddRemoteCandidate(CreateUdpCandidate(LOCAL_PORT_TYPE, "2.2.2.2", 2, 1)); > Connection* conn1 = WaitForConnectionTo(&ch, "1.1.1.1", 1, &clock); > Connection* conn2 = WaitForConnectionTo(&ch, "2.2.2.2", 2, &clock); >+ // Gathering complete with candidates. >+ EXPECT_EQ(webrtc::IceTransportState::kCompleted, ch.GetIceTransportState()); > ASSERT_TRUE(conn1 != nullptr); > ASSERT_TRUE(conn2 != nullptr); > // Now there are two connections, so the transport channel is connecting. >@@ -4156,6 +4158,7 @@ TEST_F(P2PTransportChannelPingTest, TestGetState) { > // Need to wait until the channel state is updated. > EXPECT_EQ_SIMULATED_WAIT(IceTransportState::STATE_FAILED, ch.GetState(), > kShortTimeout, clock); >+ EXPECT_EQ(webrtc::IceTransportState::kFailed, ch.GetIceTransportState()); > } > > // Test that when a low-priority connection is pruned, it is not deleted >@@ -4590,7 +4593,7 @@ TEST(P2PTransportChannelResolverTest, HostnameCandidateIsResolved) { > // prflx candidate is updated to a host candidate after the name resolution is > // done. 
> TEST_F(P2PTransportChannelTest, >- PeerReflexiveCandidateBeforeSignalingWithMDnsName) { >+ PeerReflexiveCandidateBeforeSignalingWithMdnsName) { > rtc::MockAsyncResolver mock_async_resolver; > webrtc::MockAsyncResolverFactory mock_async_resolver_factory; > EXPECT_CALL(mock_async_resolver_factory, Create()) >@@ -4601,7 +4604,7 @@ TEST_F(P2PTransportChannelTest, > ConfigureEndpoints(OPEN, OPEN, kOnlyLocalPorts, kOnlyLocalPorts); > // ICE parameter will be set up when creating the channels. > set_remote_ice_parameter_source(FROM_SETICEPARAMETERS); >- GetEndpoint(0)->network_manager_.CreateMDnsResponder(); >+ GetEndpoint(0)->network_manager_.CreateMdnsResponder(); > GetEndpoint(1)->async_resolver_factory_ = &mock_async_resolver_factory; > CreateChannels(); > // Pause sending candidates from both endpoints until we find out what port >@@ -4656,7 +4659,7 @@ TEST_F(P2PTransportChannelTest, > // a host candidate if the hostname candidate turns out to have the same IP > // address after the resolution completes. > TEST_F(P2PTransportChannelTest, >- PeerReflexiveCandidateDuringResolvingHostCandidateWithMDnsName) { >+ PeerReflexiveCandidateDuringResolvingHostCandidateWithMdnsName) { > NiceMock<rtc::MockAsyncResolver> mock_async_resolver; > webrtc::MockAsyncResolverFactory mock_async_resolver_factory; > EXPECT_CALL(mock_async_resolver_factory, Create()) >@@ -4667,7 +4670,7 @@ TEST_F(P2PTransportChannelTest, > ConfigureEndpoints(OPEN, OPEN, kOnlyLocalPorts, kOnlyLocalPorts); > // ICE parameter will be set up when creating the channels. 
> set_remote_ice_parameter_source(FROM_SETICEPARAMETERS); >- GetEndpoint(0)->network_manager_.CreateMDnsResponder(); >+ GetEndpoint(0)->network_manager_.CreateMdnsResponder(); > GetEndpoint(1)->async_resolver_factory_ = &mock_async_resolver_factory; > CreateChannels(); > // Pause sending candidates from both endpoints until we find out what port >@@ -4723,7 +4726,7 @@ TEST_F(P2PTransportChannelTest, > // Test that if we only gather and signal a host candidate, the IP address of > // which is obfuscated by an mDNS name, and if the peer can complete the name > // resolution with the correct IP address, we can have a p2p connection. >-TEST_F(P2PTransportChannelTest, CanConnectWithHostCandidateWithMDnsName) { >+TEST_F(P2PTransportChannelTest, CanConnectWithHostCandidateWithMdnsName) { > NiceMock<rtc::MockAsyncResolver> mock_async_resolver; > webrtc::MockAsyncResolverFactory mock_async_resolver_factory; > EXPECT_CALL(mock_async_resolver_factory, Create()) >@@ -4734,7 +4737,7 @@ TEST_F(P2PTransportChannelTest, CanConnectWithHostCandidateWithMDnsName) { > ConfigureEndpoints(OPEN, OPEN, kOnlyLocalPorts, kOnlyLocalPorts); > // ICE parameter will be set up when creating the channels. 
> set_remote_ice_parameter_source(FROM_SETICEPARAMETERS); >- GetEndpoint(0)->network_manager_.CreateMDnsResponder(); >+ GetEndpoint(0)->network_manager_.CreateMdnsResponder(); > GetEndpoint(1)->async_resolver_factory_ = &mock_async_resolver_factory; > CreateChannels(); > // Pause sending candidates from both endpoints until we find out what port >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/packetsocketfactory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/packetsocketfactory.h >index 4667bb1fd70028b9e06d4f655756306b62363ed6..c903df06c8a98dd45653cd4e03fa4f050c1be575 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/packetsocketfactory.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/packetsocketfactory.h >@@ -17,6 +17,7 @@ > #include "rtc_base/constructormagic.h" > #include "rtc_base/proxyinfo.h" > #include "rtc_base/sslcertificate.h" >+#include "rtc_base/system/rtc_export.h" > > namespace rtc { > >@@ -36,7 +37,7 @@ struct PacketSocketTcpOptions { > class AsyncPacketSocket; > class AsyncResolverInterface; > >-class PacketSocketFactory { >+class RTC_EXPORT PacketSocketFactory { > public: > enum Options { > OPT_STUN = 0x04, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/packettransportinternal.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/packettransportinternal.h >index ca7f0036114df836f1bccc68e0c5855987b3371b..7d50f9c0eea1d977bb805a2fb6478deddb591264 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/packettransportinternal.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/packettransportinternal.h >@@ -21,15 +21,16 @@ > #include "rtc_base/asyncpacketsocket.h" > #include "rtc_base/networkroute.h" > #include "rtc_base/socket.h" >+#include "rtc_base/system/rtc_export.h" > #include "rtc_base/third_party/sigslot/sigslot.h" > > namespace rtc { > struct PacketOptions; >-struct PacketTime; > struct SentPacket; > >-class PacketTransportInternal : public virtual 
webrtc::PacketTransportInterface, >- public sigslot::has_slots<> { >+class RTC_EXPORT PacketTransportInternal >+ : public virtual webrtc::PacketTransportInterface, >+ public sigslot::has_slots<> { > public: > virtual const std::string& transport_name() const = 0; > >@@ -85,7 +86,9 @@ class PacketTransportInternal : public virtual webrtc::PacketTransportInterface, > sigslot::signal5<PacketTransportInternal*, > const char*, > size_t, >- const rtc::PacketTime&, >+ // TODO(bugs.webrtc.org/9584): Change to passing the int64_t >+ // timestamp by value. >+ const int64_t&, > int> > SignalReadPacket; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/port.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/port.cc >index ac7521e66727f6e9434807f650c1ae3bd2f63179..5b8e02d28ff894d3d376efab3b07f0f3bee40cbf 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/port.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/port.cc >@@ -17,6 +17,7 @@ > #include <vector> > > #include "absl/memory/memory.h" >+#include "absl/strings/match.h" > #include "p2p/base/portallocator.h" > #include "rtc_base/checks.h" > #include "rtc_base/crc32.h" >@@ -27,7 +28,6 @@ > #include "rtc_base/network.h" > #include "rtc_base/numerics/safe_minmax.h" > #include "rtc_base/stringencode.h" >-#include "rtc_base/stringutils.h" > #include "rtc_base/third_party/base64/base64.h" > > namespace { >@@ -183,7 +183,7 @@ const char* ProtoToString(ProtocolType proto) { > > bool StringToProto(const char* value, ProtocolType* proto) { > for (size_t i = 0; i <= PROTO_LAST; ++i) { >- if (_stricmp(PROTO_NAMES[i], value) == 0) { >+ if (absl::EqualsIgnoreCase(PROTO_NAMES[i], value)) { > *proto = static_cast<ProtocolType>(i); > return true; > } >@@ -399,9 +399,9 @@ void Port::AddAddress(const rtc::SocketAddress& address, > const std::string& type, > uint32_t type_preference, > uint32_t relay_preference, >- bool final) { >+ bool is_final) { > AddAddress(address, base_address, 
related_address, protocol, relay_protocol, >- tcptype, type, type_preference, relay_preference, "", final); >+ tcptype, type, type_preference, relay_preference, "", is_final); > } > > void Port::AddAddress(const rtc::SocketAddress& address, >@@ -430,44 +430,60 @@ void Port::AddAddress(const rtc::SocketAddress& address, > c.set_network_name(network_->name()); > c.set_network_type(network_->type()); > c.set_url(url); >+ c.set_related_address(related_address); >+ >+ bool pending = MaybeObfuscateAddress(&c, type, is_final); >+ >+ if (!pending) { >+ FinishAddingAddress(c, is_final); >+ } >+} >+ >+bool Port::MaybeObfuscateAddress(Candidate* c, >+ const std::string& type, >+ bool is_final) { > // TODO(bugs.webrtc.org/9723): Use a config to control the feature of IP > // handling with mDNS. >- if (network_->GetMDnsResponder() != nullptr) { >- // Obfuscate the IP address of a host candidates by an mDNS hostname. >- if (type == LOCAL_PORT_TYPE) { >- auto weak_ptr = weak_factory_.GetWeakPtr(); >- auto callback = [weak_ptr, c, is_final](const rtc::IPAddress& addr, >- const std::string& name) mutable { >- RTC_DCHECK(c.address().ipaddr() == addr); >- rtc::SocketAddress hostname_address(name, c.address().port()); >- // In Port and Connection, we need the IP address information to >- // correctly handle the update of candidate type to prflx. The removal >- // of IP address when signaling this candidate will take place in >- // BasicPortAllocatorSession::OnCandidateReady, via SanitizeCandidate. >- hostname_address.SetResolvedIP(addr); >- c.set_address(hostname_address); >- RTC_DCHECK(c.related_address() == rtc::SocketAddress()); >- if (weak_ptr != nullptr) { >- weak_ptr->FinishAddingAddress(c, is_final); >- } >- }; >- network_->GetMDnsResponder()->CreateNameForAddress(c.address().ipaddr(), >- callback); >- return; >- } >- // For other types of candidates, the related address should be set to >- // 0.0.0.0 or ::0. 
>- c.set_related_address(rtc::SocketAddress()); >- } else { >- c.set_related_address(related_address); >+ if (network_->GetMdnsResponder() == nullptr) { >+ return false; >+ } >+ if (type != LOCAL_PORT_TYPE) { >+ return false; > } >- FinishAddingAddress(c, is_final); >+ >+ auto copy = *c; >+ auto weak_ptr = weak_factory_.GetWeakPtr(); >+ auto callback = [weak_ptr, copy, is_final](const rtc::IPAddress& addr, >+ const std::string& name) mutable { >+ RTC_DCHECK(copy.address().ipaddr() == addr); >+ rtc::SocketAddress hostname_address(name, copy.address().port()); >+ // In Port and Connection, we need the IP address information to >+ // correctly handle the update of candidate type to prflx. The removal >+ // of IP address when signaling this candidate will take place in >+ // BasicPortAllocatorSession::OnCandidateReady, via SanitizeCandidate. >+ hostname_address.SetResolvedIP(addr); >+ copy.set_address(hostname_address); >+ copy.set_related_address(rtc::SocketAddress()); >+ if (weak_ptr != nullptr) { >+ weak_ptr->set_mdns_name_registration_status( >+ MdnsNameRegistrationStatus::kCompleted); >+ weak_ptr->FinishAddingAddress(copy, is_final); >+ } >+ }; >+ set_mdns_name_registration_status(MdnsNameRegistrationStatus::kInProgress); >+ network_->GetMdnsResponder()->CreateNameForAddress(copy.address().ipaddr(), >+ callback); >+ return true; > } > > void Port::FinishAddingAddress(const Candidate& c, bool is_final) { > candidates_.push_back(c); > SignalCandidateReady(this, c); > >+ PostAddAddress(is_final); >+} >+ >+void Port::PostAddAddress(bool is_final) { > if (is_final) { > SignalPortComplete(this); > } >@@ -773,7 +789,7 @@ bool Port::HandleIncomingPacket(rtc::AsyncPacketSocket* socket, > const char* data, > size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time) { >+ int64_t packet_time_us) { > RTC_NOTREACHED(); > return false; > } >@@ -1236,7 +1252,7 @@ void Connection::OnSendStunPacket(const void* data, > > void 
Connection::OnReadPacket(const char* data, > size_t size, >- const rtc::PacketTime& packet_time) { >+ int64_t packet_time_us) { > std::unique_ptr<IceMessage> msg; > std::string remote_ufrag; > const rtc::SocketAddress& addr(remote_candidate_.address()); >@@ -1246,7 +1262,7 @@ void Connection::OnReadPacket(const char* data, > last_data_received_ = rtc::TimeMillis(); > UpdateReceiving(last_data_received_); > recv_rate_tracker_.AddSamples(size); >- SignalReadPacket(this, data, size, packet_time); >+ SignalReadPacket(this, data, size, packet_time_us); > > // If timed out sending writability checks, start up again > if (!pruned_ && (write_state_ == STATE_WRITE_TIMEOUT)) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/port.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/port.h >index 85711e41f56f93892e43d733b9a5cb3048cda967..9a8f92a96e45fd9a39e8cf8db4b53c5356e640bf 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/port.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/port.h >@@ -37,6 +37,7 @@ > #include "rtc_base/proxyinfo.h" > #include "rtc_base/ratetracker.h" > #include "rtc_base/socketaddress.h" >+#include "rtc_base/system/rtc_export.h" > #include "rtc_base/third_party/sigslot/sigslot.h" > #include "rtc_base/thread.h" > #include "rtc_base/weak_ptr.h" >@@ -49,7 +50,7 @@ class ConnectionRequest; > extern const char LOCAL_PORT_TYPE[]; > extern const char STUN_PORT_TYPE[]; > extern const char PRFLX_PORT_TYPE[]; >-extern const char RELAY_PORT_TYPE[]; >+RTC_EXPORT extern const char RELAY_PORT_TYPE[]; > > // RFC 6544, TCP candidate encoding rules. > extern const int DISCARD_PORT; >@@ -83,6 +84,18 @@ enum class IceCandidatePairState { > // frozen because we have not implemented ICE freezing logic. > }; > >+enum class MdnsNameRegistrationStatus { >+ // IP concealment with mDNS is not enabled or the name registration process is >+ // not started yet. 
>+ kNotStarted, >+ // A request to create and register an mDNS name for a local IP address of a >+ // host candidate is sent to the mDNS responder. >+ kInProgress, >+ // The name registration is complete and the created name is returned by the >+ // mDNS responder. >+ kCompleted, >+}; >+ > // Stats that we can return about the port of a STUN candidate. > class StunStats { > public: >@@ -305,7 +318,7 @@ class Port : public PortInterface, > const char* data, > size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time); >+ int64_t packet_time_us); > > // Shall the port handle packet from this |remote_addr|. > // This method is overridden by TurnPort. >@@ -392,7 +405,7 @@ class Port : public PortInterface, > const std::string& type, > uint32_t type_preference, > uint32_t relay_preference, >- bool final); >+ bool is_final); > > void AddAddress(const rtc::SocketAddress& address, > const rtc::SocketAddress& base_address, >@@ -408,6 +421,8 @@ class Port : public PortInterface, > > void FinishAddingAddress(const Candidate& c, bool is_final); > >+ virtual void PostAddAddress(bool is_final); >+ > // Adds the given connection to the map keyed by the remote candidate address. > // If an existing connection has the same address, the existing one will be > // replaced and destroyed. >@@ -443,6 +458,13 @@ class Port : public PortInterface, > > void CopyPortInformationToPacketInfo(rtc::PacketInfo* info) const; > >+ MdnsNameRegistrationStatus mdns_name_registration_status() const { >+ return mdns_name_registration_status_; >+ } >+ void set_mdns_name_registration_status(MdnsNameRegistrationStatus status) { >+ mdns_name_registration_status_ = status; >+ } >+ > private: > void Construct(); > // Called when one of our connections deletes itself. 
>@@ -487,9 +509,15 @@ class Port : public PortInterface, > int16_t network_cost_; > State state_ = State::INIT; > int64_t last_time_all_connections_removed_ = 0; >+ MdnsNameRegistrationStatus mdns_name_registration_status_ = >+ MdnsNameRegistrationStatus::kNotStarted; > > rtc::WeakPtrFactory<Port> weak_factory_; > >+ bool MaybeObfuscateAddress(Candidate* c, >+ const std::string& type, >+ bool is_final); >+ > friend class Connection; > }; > >@@ -578,15 +606,12 @@ class Connection : public CandidatePairInterface, > // Error if Send() returns < 0 > virtual int GetError() = 0; > >- sigslot::signal4<Connection*, const char*, size_t, const rtc::PacketTime&> >- SignalReadPacket; >+ sigslot::signal4<Connection*, const char*, size_t, int64_t> SignalReadPacket; > > sigslot::signal1<Connection*> SignalReadyToSend; > > // Called when a packet is received on this connection. >- void OnReadPacket(const char* data, >- size_t size, >- const rtc::PacketTime& packet_time); >+ void OnReadPacket(const char* data, size_t size, int64_t packet_time_us); > > // Called when the socket is currently able to send. 
> void OnReadyToSend(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/port_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/port_unittest.cc >index 50280301d4ce18ffbf7b1eb57b0b8fd3ad55b712..71929423cb620048e0e32183362b66216d994270 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/port_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/port_unittest.cc >@@ -91,18 +91,18 @@ SocketAddress GetAddress(Port* port) { > return GetCandidate(port).address(); > } > >-IceMessage* CopyStunMessage(const IceMessage* src) { >- IceMessage* dst = new IceMessage(); >+std::unique_ptr<IceMessage> CopyStunMessage(const IceMessage& src) { >+ auto dst = absl::make_unique<IceMessage>(); > ByteBufferWriter buf; >- src->Write(&buf); >+ src.Write(&buf); > ByteBufferReader read_buf(buf); > dst->Read(&read_buf); > return dst; > } > >-bool WriteStunMessage(const StunMessage* msg, ByteBufferWriter* buf) { >+bool WriteStunMessage(const StunMessage& msg, ByteBufferWriter* buf) { > buf->Resize(0); // clear out any existing buffer contents >- return msg->Write(buf); >+ return msg.Write(buf); > } > > } // namespace >@@ -189,17 +189,15 @@ class TestPort : public Port { > const rtc::PacketOptions& options, > bool payload) { > if (!payload) { >- IceMessage* msg = new IceMessage; >- auto* buf = >- new rtc::BufferT<uint8_t>(static_cast<const char*>(data), size); >+ auto msg = absl::make_unique<IceMessage>(); >+ auto buf = absl::make_unique<rtc::BufferT<uint8_t>>( >+ static_cast<const char*>(data), size); > ByteBufferReader read_buf(*buf); > if (!msg->Read(&read_buf)) { >- delete msg; >- delete buf; > return -1; > } >- last_stun_buf_.reset(buf); >- last_stun_msg_.reset(msg); >+ last_stun_buf_ = std::move(buf); >+ last_stun_msg_ = std::move(msg); > } > return static_cast<int>(size); > } >@@ -234,24 +232,18 @@ static void SendPingAndReceiveResponse(Connection* lconn, > ASSERT_TRUE_WAIT(lport->last_stun_msg(), kDefaultTimeout); > 
ASSERT_TRUE(lport->last_stun_buf()); > rconn->OnReadPacket(lport->last_stun_buf()->data<char>(), >- lport->last_stun_buf()->size(), rtc::PacketTime()); >+ lport->last_stun_buf()->size(), /* packet_time_us */ -1); > clock->AdvanceTime(webrtc::TimeDelta::ms(ms)); > ASSERT_TRUE_WAIT(rport->last_stun_msg(), kDefaultTimeout); > ASSERT_TRUE(rport->last_stun_buf()); > lconn->OnReadPacket(rport->last_stun_buf()->data<char>(), >- rport->last_stun_buf()->size(), rtc::PacketTime()); >+ rport->last_stun_buf()->size(), /* packet_time_us */ -1); > } > > class TestChannel : public sigslot::has_slots<> { > public: > // Takes ownership of |p1| (but not |p2|). >- explicit TestChannel(Port* p1) >- : ice_mode_(ICEMODE_FULL), >- port_(p1), >- complete_count_(0), >- conn_(NULL), >- remote_request_(), >- nominated_(false) { >+ explicit TestChannel(std::unique_ptr<Port> p1) : port_(std::move(p1)) { > port_->SignalPortComplete.connect(this, &TestChannel::OnPortComplete); > port_->SignalUnknownAddress.connect(this, &TestChannel::OnUnknownAddress); > port_->SignalDestroyed.connect(this, &TestChannel::OnSrcPortDestroyed); >@@ -327,7 +319,7 @@ class TestChannel : public sigslot::has_slots<> { > EXPECT_TRUE(mi_attr != NULL); > EXPECT_TRUE(fingerprint_attr != NULL); > remote_address_ = addr; >- remote_request_.reset(CopyStunMessage(msg)); >+ remote_request_ = CopyStunMessage(*msg); > remote_frag_ = rf; > } > >@@ -362,15 +354,15 @@ class TestChannel : public sigslot::has_slots<> { > connection_ready_to_send_ = true; > } > >- IceMode ice_mode_; >+ IceMode ice_mode_ = ICEMODE_FULL; > std::unique_ptr<Port> port_; > >- int complete_count_; >- Connection* conn_; >+ int complete_count_ = 0; >+ Connection* conn_ = nullptr; > SocketAddress remote_address_; > std::unique_ptr<StunMessage> remote_request_; > std::string remote_frag_; >- bool nominated_; >+ bool nominated_ = false; > bool connection_ready_to_send_ = false; > }; > >@@ -400,82 +392,85 @@ class PortTest : public testing::Test, public 
sigslot::has_slots<> { > > protected: > void TestLocalToLocal() { >- Port* port1 = CreateUdpPort(kLocalAddr1); >+ auto port1 = CreateUdpPort(kLocalAddr1); > port1->SetIceRole(cricket::ICEROLE_CONTROLLING); >- Port* port2 = CreateUdpPort(kLocalAddr2); >+ auto port2 = CreateUdpPort(kLocalAddr2); > port2->SetIceRole(cricket::ICEROLE_CONTROLLED); >- TestConnectivity("udp", port1, "udp", port2, true, true, true, true); >+ TestConnectivity("udp", std::move(port1), "udp", std::move(port2), true, >+ true, true, true); > } > void TestLocalToStun(NATType ntype) { >- Port* port1 = CreateUdpPort(kLocalAddr1); >+ auto port1 = CreateUdpPort(kLocalAddr1); > port1->SetIceRole(cricket::ICEROLE_CONTROLLING); >- nat_server2_.reset(CreateNatServer(kNatAddr2, ntype)); >- Port* port2 = CreateStunPort(kLocalAddr2, &nat_socket_factory2_); >+ nat_server2_ = CreateNatServer(kNatAddr2, ntype); >+ auto port2 = CreateStunPort(kLocalAddr2, &nat_socket_factory2_); > port2->SetIceRole(cricket::ICEROLE_CONTROLLED); >- TestConnectivity("udp", port1, StunName(ntype), port2, >+ TestConnectivity("udp", std::move(port1), StunName(ntype), std::move(port2), > ntype == NAT_OPEN_CONE, true, ntype != NAT_SYMMETRIC, > true); > } > void TestLocalToRelay(RelayType rtype, ProtocolType proto) { >- Port* port1 = CreateUdpPort(kLocalAddr1); >+ auto port1 = CreateUdpPort(kLocalAddr1); > port1->SetIceRole(cricket::ICEROLE_CONTROLLING); >- Port* port2 = CreateRelayPort(kLocalAddr2, rtype, proto, PROTO_UDP); >+ auto port2 = CreateRelayPort(kLocalAddr2, rtype, proto, PROTO_UDP); > port2->SetIceRole(cricket::ICEROLE_CONTROLLED); >- TestConnectivity("udp", port1, RelayName(rtype, proto), port2, >- rtype == RELAY_GTURN, true, true, true); >+ TestConnectivity("udp", std::move(port1), RelayName(rtype, proto), >+ std::move(port2), rtype == RELAY_GTURN, true, true, true); > } > void TestStunToLocal(NATType ntype) { >- nat_server1_.reset(CreateNatServer(kNatAddr1, ntype)); >- Port* port1 = CreateStunPort(kLocalAddr1, 
&nat_socket_factory1_); >+ nat_server1_ = CreateNatServer(kNatAddr1, ntype); >+ auto port1 = CreateStunPort(kLocalAddr1, &nat_socket_factory1_); > port1->SetIceRole(cricket::ICEROLE_CONTROLLING); >- Port* port2 = CreateUdpPort(kLocalAddr2); >+ auto port2 = CreateUdpPort(kLocalAddr2); > port2->SetIceRole(cricket::ICEROLE_CONTROLLED); >- TestConnectivity(StunName(ntype), port1, "udp", port2, true, >- ntype != NAT_SYMMETRIC, true, true); >+ TestConnectivity(StunName(ntype), std::move(port1), "udp", std::move(port2), >+ true, ntype != NAT_SYMMETRIC, true, true); > } > void TestStunToStun(NATType ntype1, NATType ntype2) { >- nat_server1_.reset(CreateNatServer(kNatAddr1, ntype1)); >- Port* port1 = CreateStunPort(kLocalAddr1, &nat_socket_factory1_); >+ nat_server1_ = CreateNatServer(kNatAddr1, ntype1); >+ auto port1 = CreateStunPort(kLocalAddr1, &nat_socket_factory1_); > port1->SetIceRole(cricket::ICEROLE_CONTROLLING); >- nat_server2_.reset(CreateNatServer(kNatAddr2, ntype2)); >- Port* port2 = CreateStunPort(kLocalAddr2, &nat_socket_factory2_); >+ nat_server2_ = CreateNatServer(kNatAddr2, ntype2); >+ auto port2 = CreateStunPort(kLocalAddr2, &nat_socket_factory2_); > port2->SetIceRole(cricket::ICEROLE_CONTROLLED); >- TestConnectivity(StunName(ntype1), port1, StunName(ntype2), port2, >- ntype2 == NAT_OPEN_CONE, ntype1 != NAT_SYMMETRIC, >- ntype2 != NAT_SYMMETRIC, >+ TestConnectivity(StunName(ntype1), std::move(port1), StunName(ntype2), >+ std::move(port2), ntype2 == NAT_OPEN_CONE, >+ ntype1 != NAT_SYMMETRIC, ntype2 != NAT_SYMMETRIC, > ntype1 + ntype2 < (NAT_PORT_RESTRICTED + NAT_SYMMETRIC)); > } > void TestStunToRelay(NATType ntype, RelayType rtype, ProtocolType proto) { >- nat_server1_.reset(CreateNatServer(kNatAddr1, ntype)); >- Port* port1 = CreateStunPort(kLocalAddr1, &nat_socket_factory1_); >+ nat_server1_ = CreateNatServer(kNatAddr1, ntype); >+ auto port1 = CreateStunPort(kLocalAddr1, &nat_socket_factory1_); > port1->SetIceRole(cricket::ICEROLE_CONTROLLING); >- Port* 
port2 = CreateRelayPort(kLocalAddr2, rtype, proto, PROTO_UDP); >+ auto port2 = CreateRelayPort(kLocalAddr2, rtype, proto, PROTO_UDP); > port2->SetIceRole(cricket::ICEROLE_CONTROLLED); >- TestConnectivity(StunName(ntype), port1, RelayName(rtype, proto), port2, >- rtype == RELAY_GTURN, ntype != NAT_SYMMETRIC, true, true); >+ TestConnectivity(StunName(ntype), std::move(port1), RelayName(rtype, proto), >+ std::move(port2), rtype == RELAY_GTURN, >+ ntype != NAT_SYMMETRIC, true, true); > } > void TestTcpToTcp() { >- Port* port1 = CreateTcpPort(kLocalAddr1); >+ auto port1 = CreateTcpPort(kLocalAddr1); > port1->SetIceRole(cricket::ICEROLE_CONTROLLING); >- Port* port2 = CreateTcpPort(kLocalAddr2); >+ auto port2 = CreateTcpPort(kLocalAddr2); > port2->SetIceRole(cricket::ICEROLE_CONTROLLED); >- TestConnectivity("tcp", port1, "tcp", port2, true, false, true, true); >+ TestConnectivity("tcp", std::move(port1), "tcp", std::move(port2), true, >+ false, true, true); > } > void TestTcpToRelay(RelayType rtype, ProtocolType proto) { >- Port* port1 = CreateTcpPort(kLocalAddr1); >+ auto port1 = CreateTcpPort(kLocalAddr1); > port1->SetIceRole(cricket::ICEROLE_CONTROLLING); >- Port* port2 = CreateRelayPort(kLocalAddr2, rtype, proto, PROTO_TCP); >+ auto port2 = CreateRelayPort(kLocalAddr2, rtype, proto, PROTO_TCP); > port2->SetIceRole(cricket::ICEROLE_CONTROLLED); >- TestConnectivity("tcp", port1, RelayName(rtype, proto), port2, >- rtype == RELAY_GTURN, false, true, true); >+ TestConnectivity("tcp", std::move(port1), RelayName(rtype, proto), >+ std::move(port2), rtype == RELAY_GTURN, false, true, true); > } > void TestSslTcpToRelay(RelayType rtype, ProtocolType proto) { >- Port* port1 = CreateTcpPort(kLocalAddr1); >+ auto port1 = CreateTcpPort(kLocalAddr1); > port1->SetIceRole(cricket::ICEROLE_CONTROLLING); >- Port* port2 = CreateRelayPort(kLocalAddr2, rtype, proto, PROTO_SSLTCP); >+ auto port2 = CreateRelayPort(kLocalAddr2, rtype, proto, PROTO_SSLTCP); > 
port2->SetIceRole(cricket::ICEROLE_CONTROLLED); >- TestConnectivity("ssltcp", port1, RelayName(rtype, proto), port2, >- rtype == RELAY_GTURN, false, true, true); >+ TestConnectivity("ssltcp", std::move(port1), RelayName(rtype, proto), >+ std::move(port2), rtype == RELAY_GTURN, false, true, true); > } > > rtc::Network* MakeNetwork(const SocketAddress& addr) { >@@ -485,71 +480,71 @@ class PortTest : public testing::Test, public sigslot::has_slots<> { > } > > // helpers for above functions >- UDPPort* CreateUdpPort(const SocketAddress& addr) { >+ std::unique_ptr<UDPPort> CreateUdpPort(const SocketAddress& addr) { > return CreateUdpPort(addr, &socket_factory_); > } >- UDPPort* CreateUdpPort(const SocketAddress& addr, >- PacketSocketFactory* socket_factory) { >+ std::unique_ptr<UDPPort> CreateUdpPort(const SocketAddress& addr, >+ PacketSocketFactory* socket_factory) { > return UDPPort::Create(&main_, socket_factory, MakeNetwork(addr), 0, 0, > username_, password_, std::string(), true, > absl::nullopt); > } >- TCPPort* CreateTcpPort(const SocketAddress& addr) { >+ std::unique_ptr<TCPPort> CreateTcpPort(const SocketAddress& addr) { > return CreateTcpPort(addr, &socket_factory_); > } >- TCPPort* CreateTcpPort(const SocketAddress& addr, >- PacketSocketFactory* socket_factory) { >+ std::unique_ptr<TCPPort> CreateTcpPort(const SocketAddress& addr, >+ PacketSocketFactory* socket_factory) { > return TCPPort::Create(&main_, socket_factory, MakeNetwork(addr), 0, 0, > username_, password_, true); > } >- StunPort* CreateStunPort(const SocketAddress& addr, >- rtc::PacketSocketFactory* factory) { >+ std::unique_ptr<StunPort> CreateStunPort(const SocketAddress& addr, >+ rtc::PacketSocketFactory* factory) { > ServerAddresses stun_servers; > stun_servers.insert(kStunAddr); > return StunPort::Create(&main_, factory, MakeNetwork(addr), 0, 0, username_, > password_, stun_servers, std::string(), > absl::nullopt); > } >- Port* CreateRelayPort(const SocketAddress& addr, >- RelayType rtype, >- 
ProtocolType int_proto, >- ProtocolType ext_proto) { >+ std::unique_ptr<Port> CreateRelayPort(const SocketAddress& addr, >+ RelayType rtype, >+ ProtocolType int_proto, >+ ProtocolType ext_proto) { > if (rtype == RELAY_TURN) { > return CreateTurnPort(addr, &socket_factory_, int_proto, ext_proto); > } else { > return CreateGturnPort(addr, int_proto, ext_proto); > } > } >- TurnPort* CreateTurnPort(const SocketAddress& addr, >- PacketSocketFactory* socket_factory, >- ProtocolType int_proto, >- ProtocolType ext_proto) { >+ std::unique_ptr<TurnPort> CreateTurnPort(const SocketAddress& addr, >+ PacketSocketFactory* socket_factory, >+ ProtocolType int_proto, >+ ProtocolType ext_proto) { > SocketAddress server_addr = > int_proto == PROTO_TCP ? kTurnTcpIntAddr : kTurnUdpIntAddr; > return CreateTurnPort(addr, socket_factory, int_proto, ext_proto, > server_addr); > } >- TurnPort* CreateTurnPort(const SocketAddress& addr, >- PacketSocketFactory* socket_factory, >- ProtocolType int_proto, >- ProtocolType ext_proto, >- const rtc::SocketAddress& server_addr) { >- return TurnPort::Create( >- &main_, socket_factory, MakeNetwork(addr), 0, 0, username_, password_, >- ProtocolAddress(server_addr, int_proto), kRelayCredentials, 0, >- std::string(), std::vector<std::string>(), std::vector<std::string>(), >- nullptr, nullptr); >- } >- RelayPort* CreateGturnPort(const SocketAddress& addr, >- ProtocolType int_proto, >- ProtocolType ext_proto) { >- RelayPort* port = CreateGturnPort(addr); >+ std::unique_ptr<TurnPort> CreateTurnPort( >+ const SocketAddress& addr, >+ PacketSocketFactory* socket_factory, >+ ProtocolType int_proto, >+ ProtocolType ext_proto, >+ const rtc::SocketAddress& server_addr) { >+ return TurnPort::Create(&main_, socket_factory, MakeNetwork(addr), 0, 0, >+ username_, password_, >+ ProtocolAddress(server_addr, int_proto), >+ kRelayCredentials, 0, "", {}, {}, nullptr, nullptr); >+ } >+ std::unique_ptr<RelayPort> CreateGturnPort(const SocketAddress& addr, >+ ProtocolType 
int_proto, >+ ProtocolType ext_proto) { >+ std::unique_ptr<RelayPort> port = CreateGturnPort(addr); > SocketAddress addrs[] = {kRelayUdpIntAddr, kRelayTcpIntAddr, > kRelaySslTcpIntAddr}; > port->AddServerAddress(ProtocolAddress(addrs[int_proto], int_proto)); > return port; > } >- RelayPort* CreateGturnPort(const SocketAddress& addr) { >+ std::unique_ptr<RelayPort> CreateGturnPort(const SocketAddress& addr) { > // TODO(pthatcher): Remove GTURN. > // Generate a username with length of 16 for Gturn only. > std::string username = rtc::CreateRandomString(kGturnUserNameLength); >@@ -558,9 +553,10 @@ class PortTest : public testing::Test, public sigslot::has_slots<> { > // TODO(?): Add an external address for ext_proto, so that the > // other side can connect to this port using a non-UDP protocol. > } >- rtc::NATServer* CreateNatServer(const SocketAddress& addr, >- rtc::NATType type) { >- return new rtc::NATServer(type, ss_.get(), addr, addr, ss_.get(), addr); >+ std::unique_ptr<rtc::NATServer> CreateNatServer(const SocketAddress& addr, >+ rtc::NATType type) { >+ return absl::make_unique<rtc::NATServer>(type, ss_.get(), addr, addr, >+ ss_.get(), addr); > } > static const char* StunName(NATType type) { > switch (type) { >@@ -612,9 +608,9 @@ class PortTest : public testing::Test, public sigslot::has_slots<> { > > // This does all the work and then deletes |port1| and |port2|. 
> void TestConnectivity(const char* name1, >- Port* port1, >+ std::unique_ptr<Port> port1, > const char* name2, >- Port* port2, >+ std::unique_ptr<Port> port2, > bool accept, > bool same_addr1, > bool same_addr2, >@@ -680,17 +676,17 @@ class PortTest : public testing::Test, public sigslot::has_slots<> { > > void TestTcpReconnect(bool ping_after_disconnected, > bool send_after_disconnected) { >- Port* port1 = CreateTcpPort(kLocalAddr1); >+ auto port1 = CreateTcpPort(kLocalAddr1); > port1->SetIceRole(cricket::ICEROLE_CONTROLLING); >- Port* port2 = CreateTcpPort(kLocalAddr2); >+ auto port2 = CreateTcpPort(kLocalAddr2); > port2->SetIceRole(cricket::ICEROLE_CONTROLLED); > > port1->set_component(cricket::ICE_CANDIDATE_COMPONENT_DEFAULT); > port2->set_component(cricket::ICE_CANDIDATE_COMPONENT_DEFAULT); > > // Set up channels and ensure both ports will be deleted. >- TestChannel ch1(port1); >- TestChannel ch2(port2); >+ TestChannel ch1(std::move(port1)); >+ TestChannel ch2(std::move(port2)); > EXPECT_EQ(0, ch1.complete_count()); > EXPECT_EQ(0, ch2.complete_count()); > >@@ -700,7 +696,7 @@ class PortTest : public testing::Test, public sigslot::has_slots<> { > ASSERT_EQ_WAIT(1, ch2.complete_count(), kDefaultTimeout); > > // Initial connecting the channel, create connection on channel1. >- ch1.CreateConnection(GetCandidate(port2)); >+ ch1.CreateConnection(GetCandidate(ch2.port())); > ConnectStartedChannels(&ch1, &ch2); > > // Shorten the timeout period. 
>@@ -753,43 +749,45 @@ class PortTest : public testing::Test, public sigslot::has_slots<> { > EXPECT_TRUE_WAIT(ch2.conn() == NULL, kDefaultTimeout); > } > >- IceMessage* CreateStunMessage(int type) { >- IceMessage* msg = new IceMessage(); >+ std::unique_ptr<IceMessage> CreateStunMessage(int type) { >+ auto msg = absl::make_unique<IceMessage>(); > msg->SetType(type); > msg->SetTransactionID("TESTTESTTEST"); > return msg; > } >- IceMessage* CreateStunMessageWithUsername(int type, >- const std::string& username) { >- IceMessage* msg = CreateStunMessage(type); >+ std::unique_ptr<IceMessage> CreateStunMessageWithUsername( >+ int type, >+ const std::string& username) { >+ std::unique_ptr<IceMessage> msg = CreateStunMessage(type); > msg->AddAttribute(absl::make_unique<StunByteStringAttribute>( > STUN_ATTR_USERNAME, username)); > return msg; > } >- TestPort* CreateTestPort(const rtc::SocketAddress& addr, >- const std::string& username, >- const std::string& password) { >- TestPort* port = new TestPort(&main_, "test", &socket_factory_, >- MakeNetwork(addr), 0, 0, username, password); >+ std::unique_ptr<TestPort> CreateTestPort(const rtc::SocketAddress& addr, >+ const std::string& username, >+ const std::string& password) { >+ auto port = absl::make_unique<TestPort>(&main_, "test", &socket_factory_, >+ MakeNetwork(addr), 0, 0, username, >+ password); > port->SignalRoleConflict.connect(this, &PortTest::OnRoleConflict); > return port; > } >- TestPort* CreateTestPort(const rtc::SocketAddress& addr, >- const std::string& username, >- const std::string& password, >- cricket::IceRole role, >- int tiebreaker) { >- TestPort* port = CreateTestPort(addr, username, password); >+ std::unique_ptr<TestPort> CreateTestPort(const rtc::SocketAddress& addr, >+ const std::string& username, >+ const std::string& password, >+ cricket::IceRole role, >+ int tiebreaker) { >+ auto port = CreateTestPort(addr, username, password); > port->SetIceRole(role); > port->SetIceTiebreaker(tiebreaker); > 
return port; > } > // Overload to create a test port given an rtc::Network directly. >- TestPort* CreateTestPort(rtc::Network* network, >- const std::string& username, >- const std::string& password) { >- TestPort* port = new TestPort(&main_, "test", &socket_factory_, network, 0, >- 0, username, password); >+ std::unique_ptr<TestPort> CreateTestPort(rtc::Network* network, >+ const std::string& username, >+ const std::string& password) { >+ auto port = absl::make_unique<TestPort>(&main_, "test", &socket_factory_, >+ network, 0, 0, username, password); > port->SignalRoleConflict.connect(this, &PortTest::OnRoleConflict); > return port; > } >@@ -834,9 +832,9 @@ class PortTest : public testing::Test, public sigslot::has_slots<> { > }; > > void PortTest::TestConnectivity(const char* name1, >- Port* port1, >+ std::unique_ptr<Port> port1, > const char* name2, >- Port* port2, >+ std::unique_ptr<Port> port2, > bool accept, > bool same_addr1, > bool same_addr2, >@@ -847,8 +845,8 @@ void PortTest::TestConnectivity(const char* name1, > port2->set_component(cricket::ICE_CANDIDATE_COMPONENT_DEFAULT); > > // Set up channels and ensure both ports will be deleted. >- TestChannel ch1(port1); >- TestChannel ch2(port2); >+ TestChannel ch1(std::move(port1)); >+ TestChannel ch2(std::move(port2)); > EXPECT_EQ(0, ch1.complete_count()); > EXPECT_EQ(0, ch2.complete_count()); > >@@ -859,7 +857,7 @@ void PortTest::TestConnectivity(const char* name1, > ASSERT_EQ_SIMULATED_WAIT(1, ch2.complete_count(), kDefaultTimeout, clock); > > // Send a ping from src to dst. This may or may not make it. >- ch1.CreateConnection(GetCandidate(port2)); >+ ch1.CreateConnection(GetCandidate(ch2.port())); > ASSERT_TRUE(ch1.conn() != NULL); > EXPECT_TRUE_SIMULATED_WAIT(ch1.conn()->connected(), kDefaultTimeout, > clock); // for TCP connect >@@ -870,16 +868,16 @@ void PortTest::TestConnectivity(const char* name1, > // We are able to send a ping from src to dst. 
This is the case when > // sending to UDP ports and cone NATs. > EXPECT_TRUE(ch1.remote_address().IsNil()); >- EXPECT_EQ(ch2.remote_fragment(), port1->username_fragment()); >+ EXPECT_EQ(ch2.remote_fragment(), ch1.port()->username_fragment()); > > // Ensure the ping came from the same address used for src. > // This is the case unless the source NAT was symmetric. > if (same_addr1) >- EXPECT_EQ(ch2.remote_address(), GetAddress(port1)); >+ EXPECT_EQ(ch2.remote_address(), GetAddress(ch1.port())); > EXPECT_TRUE(same_addr2); > > // Send a ping from dst to src. >- ch2.AcceptConnection(GetCandidate(port1)); >+ ch2.AcceptConnection(GetCandidate(ch1.port())); > ASSERT_TRUE(ch2.conn() != NULL); > ch2.Ping(); > EXPECT_EQ_SIMULATED_WAIT(Connection::STATE_WRITABLE, >@@ -891,7 +889,7 @@ void PortTest::TestConnectivity(const char* name1, > EXPECT_TRUE(ch2.remote_address().IsNil()); > > // Send a ping from dst to src. Again, this may or may not make it. >- ch2.CreateConnection(GetCandidate(port1)); >+ ch2.CreateConnection(GetCandidate(ch1.port())); > ASSERT_TRUE(ch2.conn() != NULL); > ch2.Ping(); > SIMULATED_WAIT(ch2.conn()->write_state() == Connection::STATE_WRITABLE, >@@ -925,7 +923,7 @@ void PortTest::TestConnectivity(const char* name1, > EXPECT_TRUE(ch1.remote_address().IsNil()); > > // Pick up the actual address and establish the connection. >- ch2.AcceptConnection(GetCandidate(port1)); >+ ch2.AcceptConnection(GetCandidate(ch1.port())); > ASSERT_TRUE(ch2.conn() != NULL); > ch2.Ping(); > EXPECT_EQ_SIMULATED_WAIT(Connection::STATE_WRITABLE, >@@ -938,7 +936,7 @@ void PortTest::TestConnectivity(const char* name1, > EXPECT_FALSE(ch1.conn()->receiving()); > > // Update our address and complete the connection. 
>- ch1.AcceptConnection(GetCandidate(port2)); >+ ch1.AcceptConnection(GetCandidate(ch2.port())); > ch1.Ping(); > EXPECT_EQ_SIMULATED_WAIT(Connection::STATE_WRITABLE, > ch1.conn()->write_state(), kDefaultTimeout, >@@ -1261,12 +1259,12 @@ TEST_F(PortTest, TestTcpReconnectTimeout) { > // Test when TcpConnection never connects, the OnClose() will be called to > // destroy the connection. > TEST_F(PortTest, TestTcpNeverConnect) { >- Port* port1 = CreateTcpPort(kLocalAddr1); >+ auto port1 = CreateTcpPort(kLocalAddr1); > port1->SetIceRole(cricket::ICEROLE_CONTROLLING); > port1->set_component(cricket::ICE_CANDIDATE_COMPONENT_DEFAULT); > > // Set up a channel and ensure the port will be deleted. >- TestChannel ch1(port1); >+ TestChannel ch1(std::move(port1)); > EXPECT_EQ(0, ch1.complete_count()); > > ch1.Start(); >@@ -1277,7 +1275,7 @@ TEST_F(PortTest, TestTcpNeverConnect) { > // Bind but not listen. > EXPECT_EQ(0, server->Bind(kLocalAddr2)); > >- Candidate c = GetCandidate(port1); >+ Candidate c = GetCandidate(ch1.port()); > c.set_address(server->GetLocalAddress()); > > ch1.CreateConnection(c); >@@ -1312,10 +1310,8 @@ TEST_F(PortTest, TestSslTcpToSslTcpRelay) { > // ii) it has not received anything for DEAD_CONNECTION_RECEIVE_TIMEOUT > // milliseconds since last receiving. > TEST_F(PortTest, TestConnectionDead) { >- UDPPort* port1 = CreateUdpPort(kLocalAddr1); >- UDPPort* port2 = CreateUdpPort(kLocalAddr2); >- TestChannel ch1(port1); >- TestChannel ch2(port2); >+ TestChannel ch1(CreateUdpPort(kLocalAddr1)); >+ TestChannel ch2(CreateUdpPort(kLocalAddr2)); > // Acquire address. > ch1.Start(); > ch2.Start(); >@@ -1324,7 +1320,7 @@ TEST_F(PortTest, TestConnectionDead) { > > // Test case that the connection has never received anything. 
> int64_t before_created = rtc::TimeMillis(); >- ch1.CreateConnection(GetCandidate(port2)); >+ ch1.CreateConnection(GetCandidate(ch2.port())); > int64_t after_created = rtc::TimeMillis(); > Connection* conn = ch1.conn(); > ASSERT_NE(conn, nullptr); >@@ -1343,7 +1339,7 @@ TEST_F(PortTest, TestConnectionDead) { > > // Test case that the connection has received something. > // Create a connection again and receive a ping. >- ch1.CreateConnection(GetCandidate(port2)); >+ ch1.CreateConnection(GetCandidate(ch2.port())); > conn = ch1.conn(); > ASSERT_NE(conn, nullptr); > int64_t before_last_receiving = rtc::TimeMillis(); >@@ -1362,14 +1358,15 @@ TEST_F(PortTest, TestConnectionDead) { > // verifies Message Integrity attribute in STUN messages and username in STUN > // binding request will have colon (":") between remote and local username. > TEST_F(PortTest, TestLocalToLocalStandard) { >- UDPPort* port1 = CreateUdpPort(kLocalAddr1); >+ auto port1 = CreateUdpPort(kLocalAddr1); > port1->SetIceRole(cricket::ICEROLE_CONTROLLING); > port1->SetIceTiebreaker(kTiebreaker1); >- UDPPort* port2 = CreateUdpPort(kLocalAddr2); >+ auto port2 = CreateUdpPort(kLocalAddr2); > port2->SetIceRole(cricket::ICEROLE_CONTROLLED); > port2->SetIceTiebreaker(kTiebreaker2); > // Same parameters as TestLocalToLocal above. >- TestConnectivity("udp", port1, "udp", port2, true, true, true, true); >+ TestConnectivity("udp", std::move(port1), "udp", std::move(port2), true, true, >+ true, true); > } > > // This test is trying to validate a successful and failure scenario in a >@@ -1377,8 +1374,7 @@ TEST_F(PortTest, TestLocalToLocalStandard) { > // should remain equal to the request generated by the port and role of port > // must be in controlling. 
> TEST_F(PortTest, TestLoopbackCall) { >- std::unique_ptr<TestPort> lport( >- CreateTestPort(kLocalAddr1, "lfrag", "lpass")); >+ auto lport = CreateTestPort(kLocalAddr1, "lfrag", "lpass"); > lport->SetIceRole(cricket::ICEROLE_CONTROLLING); > lport->SetIceTiebreaker(kTiebreaker1); > lport->PrepareAddress(); >@@ -1391,7 +1387,7 @@ TEST_F(PortTest, TestLoopbackCall) { > IceMessage* msg = lport->last_stun_msg(); > EXPECT_EQ(STUN_BINDING_REQUEST, msg->type()); > conn->OnReadPacket(lport->last_stun_buf()->data<char>(), >- lport->last_stun_buf()->size(), rtc::PacketTime()); >+ lport->last_stun_buf()->size(), /* packet_time_us */ -1); > ASSERT_TRUE_WAIT(lport->last_stun_msg() != NULL, kDefaultTimeout); > msg = lport->last_stun_msg(); > EXPECT_EQ(STUN_BINDING_RESPONSE, msg->type()); >@@ -1422,9 +1418,9 @@ TEST_F(PortTest, TestLoopbackCall) { > modified_req->AddFingerprint(); > > lport->Reset(); >- std::unique_ptr<ByteBufferWriter> buf(new ByteBufferWriter()); >- WriteStunMessage(modified_req.get(), buf.get()); >- conn1->OnReadPacket(buf->Data(), buf->Length(), rtc::PacketTime()); >+ auto buf = absl::make_unique<ByteBufferWriter>(); >+ WriteStunMessage(*modified_req, buf.get()); >+ conn1->OnReadPacket(buf->Data(), buf->Length(), /* packet_time_us */ -1); > ASSERT_TRUE_WAIT(lport->last_stun_msg() != NULL, kDefaultTimeout); > msg = lport->last_stun_msg(); > EXPECT_EQ(STUN_BINDING_ERROR_RESPONSE, msg->type()); >@@ -1436,12 +1432,10 @@ TEST_F(PortTest, TestLoopbackCall) { > // value of tiebreaker, when it receives ping request from |rport| it will > // send role conflict signal. 
> TEST_F(PortTest, TestIceRoleConflict) { >- std::unique_ptr<TestPort> lport( >- CreateTestPort(kLocalAddr1, "lfrag", "lpass")); >+ auto lport = CreateTestPort(kLocalAddr1, "lfrag", "lpass"); > lport->SetIceRole(cricket::ICEROLE_CONTROLLING); > lport->SetIceTiebreaker(kTiebreaker1); >- std::unique_ptr<TestPort> rport( >- CreateTestPort(kLocalAddr2, "rfrag", "rpass")); >+ auto rport = CreateTestPort(kLocalAddr2, "rfrag", "rpass"); > rport->SetIceRole(cricket::ICEROLE_CONTROLLING); > rport->SetIceTiebreaker(kTiebreaker2); > >@@ -1460,7 +1454,7 @@ TEST_F(PortTest, TestIceRoleConflict) { > EXPECT_EQ(STUN_BINDING_REQUEST, msg->type()); > // Send rport binding request to lport. > lconn->OnReadPacket(rport->last_stun_buf()->data<char>(), >- rport->last_stun_buf()->size(), rtc::PacketTime()); >+ rport->last_stun_buf()->size(), /* packet_time_us */ -1); > > ASSERT_TRUE_WAIT(lport->last_stun_msg() != NULL, kDefaultTimeout); > EXPECT_EQ(STUN_BINDING_RESPONSE, lport->last_stun_msg()->type()); >@@ -1468,13 +1462,12 @@ TEST_F(PortTest, TestIceRoleConflict) { > } > > TEST_F(PortTest, TestTcpNoDelay) { >- TCPPort* port1 = CreateTcpPort(kLocalAddr1); >+ auto port1 = CreateTcpPort(kLocalAddr1); > port1->SetIceRole(cricket::ICEROLE_CONTROLLING); > int option_value = -1; > int success = port1->GetOption(rtc::Socket::OPT_NODELAY, &option_value); > ASSERT_EQ(0, success); // GetOption() should complete successfully w/ 0 > ASSERT_EQ(1, option_value); >- delete port1; > } > > TEST_F(PortTest, TestDelayedBindingUdp) { >@@ -1482,7 +1475,7 @@ TEST_F(PortTest, TestDelayedBindingUdp) { > FakePacketSocketFactory socket_factory; > > socket_factory.set_next_udp_socket(socket); >- std::unique_ptr<UDPPort> port(CreateUdpPort(kLocalAddr1, &socket_factory)); >+ auto port = CreateUdpPort(kLocalAddr1, &socket_factory); > > socket->set_state(AsyncPacketSocket::STATE_BINDING); > port->PrepareAddress(); >@@ -1498,7 +1491,7 @@ TEST_F(PortTest, TestDelayedBindingTcp) { > FakePacketSocketFactory 
socket_factory; > > socket_factory.set_next_server_tcp_socket(socket); >- std::unique_ptr<TCPPort> port(CreateTcpPort(kLocalAddr1, &socket_factory)); >+ auto port = CreateTcpPort(kLocalAddr1, &socket_factory); > > socket->set_state(AsyncPacketSocket::STATE_BINDING); > port->PrepareAddress(); >@@ -1519,10 +1512,10 @@ void PortTest::TestCrossFamilyPorts(int type) { > FakeAsyncPacketSocket* socket = new FakeAsyncPacketSocket(); > if (type == SOCK_DGRAM) { > factory.set_next_udp_socket(socket); >- ports[i].reset(CreateUdpPort(addresses[i], &factory)); >+ ports[i] = CreateUdpPort(addresses[i], &factory); > } else if (type == SOCK_STREAM) { > factory.set_next_server_tcp_socket(socket); >- ports[i].reset(CreateTcpPort(addresses[i], &factory)); >+ ports[i] = CreateTcpPort(addresses[i], &factory); > } > socket->set_state(AsyncPacketSocket::STATE_BINDING); > socket->SignalAddressReady(socket, addresses[i]); >@@ -1586,7 +1579,7 @@ TEST_F(PortTest, TestUdpV6CrossTypePorts) { > for (int i = 0; i < 4; i++) { > FakeAsyncPacketSocket* socket = new FakeAsyncPacketSocket(); > factory.set_next_udp_socket(socket); >- ports[i].reset(CreateUdpPort(addresses[i], &factory)); >+ ports[i] = CreateUdpPort(addresses[i], &factory); > socket->set_state(AsyncPacketSocket::STATE_BINDING); > socket->SignalAddressReady(socket, addresses[i]); > ports[i]->PrepareAddress(); >@@ -1611,28 +1604,27 @@ TEST_F(PortTest, TestUdpV6CrossTypePorts) { > // get through DefaultDscpValue. 
> TEST_F(PortTest, TestDefaultDscpValue) { > int dscp; >- std::unique_ptr<UDPPort> udpport(CreateUdpPort(kLocalAddr1)); >+ auto udpport = CreateUdpPort(kLocalAddr1); > EXPECT_EQ(0, udpport->SetOption(rtc::Socket::OPT_DSCP, rtc::DSCP_CS6)); > EXPECT_EQ(0, udpport->GetOption(rtc::Socket::OPT_DSCP, &dscp)); >- std::unique_ptr<TCPPort> tcpport(CreateTcpPort(kLocalAddr1)); >+ auto tcpport = CreateTcpPort(kLocalAddr1); > EXPECT_EQ(0, tcpport->SetOption(rtc::Socket::OPT_DSCP, rtc::DSCP_AF31)); > EXPECT_EQ(0, tcpport->GetOption(rtc::Socket::OPT_DSCP, &dscp)); > EXPECT_EQ(rtc::DSCP_AF31, dscp); >- std::unique_ptr<StunPort> stunport( >- CreateStunPort(kLocalAddr1, nat_socket_factory1())); >+ auto stunport = CreateStunPort(kLocalAddr1, nat_socket_factory1()); > EXPECT_EQ(0, stunport->SetOption(rtc::Socket::OPT_DSCP, rtc::DSCP_AF41)); > EXPECT_EQ(0, stunport->GetOption(rtc::Socket::OPT_DSCP, &dscp)); > EXPECT_EQ(rtc::DSCP_AF41, dscp); >- std::unique_ptr<TurnPort> turnport1( >- CreateTurnPort(kLocalAddr1, nat_socket_factory1(), PROTO_UDP, PROTO_UDP)); >+ auto turnport1 = >+ CreateTurnPort(kLocalAddr1, nat_socket_factory1(), PROTO_UDP, PROTO_UDP); > // Socket is created in PrepareAddress. > turnport1->PrepareAddress(); > EXPECT_EQ(0, turnport1->SetOption(rtc::Socket::OPT_DSCP, rtc::DSCP_CS7)); > EXPECT_EQ(0, turnport1->GetOption(rtc::Socket::OPT_DSCP, &dscp)); > EXPECT_EQ(rtc::DSCP_CS7, dscp); > // This will verify correct value returned without the socket. >- std::unique_ptr<TurnPort> turnport2( >- CreateTurnPort(kLocalAddr1, nat_socket_factory1(), PROTO_UDP, PROTO_UDP)); >+ auto turnport2 = >+ CreateTurnPort(kLocalAddr1, nat_socket_factory1(), PROTO_UDP, PROTO_UDP); > EXPECT_EQ(0, turnport2->SetOption(rtc::Socket::OPT_DSCP, rtc::DSCP_CS6)); > EXPECT_EQ(0, turnport2->GetOption(rtc::Socket::OPT_DSCP, &dscp)); > EXPECT_EQ(rtc::DSCP_CS6, dscp); >@@ -1640,10 +1632,8 @@ TEST_F(PortTest, TestDefaultDscpValue) { > > // Test sending STUN messages. 
> TEST_F(PortTest, TestSendStunMessage) { >- std::unique_ptr<TestPort> lport( >- CreateTestPort(kLocalAddr1, "lfrag", "lpass")); >- std::unique_ptr<TestPort> rport( >- CreateTestPort(kLocalAddr2, "rfrag", "rpass")); >+ auto lport = CreateTestPort(kLocalAddr1, "lfrag", "lpass"); >+ auto rport = CreateTestPort(kLocalAddr2, "rfrag", "rpass"); > lport->SetIceRole(cricket::ICEROLE_CONTROLLING); > lport->SetIceTiebreaker(kTiebreaker1); > rport->SetIceRole(cricket::ICEROLE_CONTROLLED); >@@ -1689,17 +1679,17 @@ TEST_F(PortTest, TestSendStunMessage) { > ASSERT_TRUE(msg->GetUInt32(STUN_ATTR_RETRANSMIT_COUNT) == NULL); > > // Save a copy of the BINDING-REQUEST for use below. >- std::unique_ptr<IceMessage> request(CopyStunMessage(msg)); >+ std::unique_ptr<IceMessage> request = CopyStunMessage(*msg); > > // Receive the BINDING-REQUEST and respond with BINDING-RESPONSE. > rconn->OnReadPacket(lport->last_stun_buf()->data<char>(), >- lport->last_stun_buf()->size(), rtc::PacketTime()); >+ lport->last_stun_buf()->size(), /* packet_time_us */ -1); > msg = rport->last_stun_msg(); > ASSERT_TRUE(msg != NULL); > EXPECT_EQ(STUN_BINDING_RESPONSE, msg->type()); > // Received a BINDING-RESPONSE. > lconn->OnReadPacket(rport->last_stun_buf()->data<char>(), >- rport->last_stun_buf()->size(), rtc::PacketTime()); >+ rport->last_stun_buf()->size(), /* packet_time_us */ -1); > // Verify the STUN Stats. > EXPECT_EQ(1U, lconn->stats().sent_ping_requests_total); > EXPECT_EQ(1U, lconn->stats().sent_ping_requests_before_first_response); >@@ -1777,13 +1767,13 @@ TEST_F(PortTest, TestSendStunMessage) { > EXPECT_EQ(2U, retransmit_attr->value()); > > // Respond with a BINDING-RESPONSE. >- request.reset(CopyStunMessage(msg)); >+ request = CopyStunMessage(*msg); > lconn->OnReadPacket(rport->last_stun_buf()->data<char>(), >- rport->last_stun_buf()->size(), rtc::PacketTime()); >+ rport->last_stun_buf()->size(), /* packet_time_us */ -1); > msg = lport->last_stun_msg(); > // Receive the BINDING-RESPONSE. 
> rconn->OnReadPacket(lport->last_stun_buf()->data<char>(), >- lport->last_stun_buf()->size(), rtc::PacketTime()); >+ lport->last_stun_buf()->size(), /* packet_time_us */ -1); > > // Verify the Stun ping stats. > EXPECT_EQ(3U, rconn->stats().sent_ping_requests_total); >@@ -1804,10 +1794,8 @@ TEST_F(PortTest, TestSendStunMessage) { > } > > TEST_F(PortTest, TestNomination) { >- std::unique_ptr<TestPort> lport( >- CreateTestPort(kLocalAddr1, "lfrag", "lpass")); >- std::unique_ptr<TestPort> rport( >- CreateTestPort(kLocalAddr2, "rfrag", "rpass")); >+ auto lport = CreateTestPort(kLocalAddr1, "lfrag", "lpass"); >+ auto rport = CreateTestPort(kLocalAddr2, "rfrag", "rpass"); > lport->SetIceRole(cricket::ICEROLE_CONTROLLING); > lport->SetIceTiebreaker(kTiebreaker1); > rport->SetIceRole(cricket::ICEROLE_CONTROLLED); >@@ -1837,7 +1825,7 @@ TEST_F(PortTest, TestNomination) { > ASSERT_TRUE_WAIT(lport->last_stun_msg(), kDefaultTimeout); > ASSERT_TRUE(lport->last_stun_buf()); > rconn->OnReadPacket(lport->last_stun_buf()->data<char>(), >- lport->last_stun_buf()->size(), rtc::PacketTime()); >+ lport->last_stun_buf()->size(), /* packet_time_us */ -1); > EXPECT_EQ(nomination, rconn->remote_nomination()); > EXPECT_FALSE(lconn->nominated()); > EXPECT_TRUE(rconn->nominated()); >@@ -1849,7 +1837,7 @@ TEST_F(PortTest, TestNomination) { > ASSERT_TRUE_WAIT(rport->last_stun_msg(), kDefaultTimeout); > ASSERT_TRUE(rport->last_stun_buf()); > lconn->OnReadPacket(rport->last_stun_buf()->data<char>(), >- rport->last_stun_buf()->size(), rtc::PacketTime()); >+ rport->last_stun_buf()->size(), /* packet_time_us */ -1); > EXPECT_EQ(nomination, lconn->acked_nomination()); > EXPECT_TRUE(lconn->nominated()); > EXPECT_TRUE(rconn->nominated()); >@@ -1860,10 +1848,8 @@ TEST_F(PortTest, TestNomination) { > TEST_F(PortTest, TestRoundTripTime) { > rtc::ScopedFakeClock clock; > >- std::unique_ptr<TestPort> lport( >- CreateTestPort(kLocalAddr1, "lfrag", "lpass")); >- std::unique_ptr<TestPort> rport( >- 
CreateTestPort(kLocalAddr2, "rfrag", "rpass")); >+ auto lport = CreateTestPort(kLocalAddr1, "lfrag", "lpass"); >+ auto rport = CreateTestPort(kLocalAddr2, "rfrag", "rpass"); > lport->SetIceRole(cricket::ICEROLE_CONTROLLING); > lport->SetIceTiebreaker(kTiebreaker1); > rport->SetIceRole(cricket::ICEROLE_CONTROLLED); >@@ -1901,10 +1887,8 @@ TEST_F(PortTest, TestRoundTripTime) { > } > > TEST_F(PortTest, TestUseCandidateAttribute) { >- std::unique_ptr<TestPort> lport( >- CreateTestPort(kLocalAddr1, "lfrag", "lpass")); >- std::unique_ptr<TestPort> rport( >- CreateTestPort(kLocalAddr2, "rfrag", "rpass")); >+ auto lport = CreateTestPort(kLocalAddr1, "lfrag", "lpass"); >+ auto rport = CreateTestPort(kLocalAddr2, "rfrag", "rpass"); > lport->SetIceRole(cricket::ICEROLE_CONTROLLING); > lport->SetIceTiebreaker(kTiebreaker1); > rport->SetIceRole(cricket::ICEROLE_CONTROLLED); >@@ -1932,10 +1916,8 @@ TEST_F(PortTest, TestUseCandidateAttribute) { > // the remote network costs are updated with the stun binding requests. > TEST_F(PortTest, TestNetworkCostChange) { > rtc::Network* test_network = MakeNetwork(kLocalAddr1); >- std::unique_ptr<TestPort> lport( >- CreateTestPort(test_network, "lfrag", "lpass")); >- std::unique_ptr<TestPort> rport( >- CreateTestPort(test_network, "rfrag", "rpass")); >+ auto lport = CreateTestPort(test_network, "lfrag", "lpass"); >+ auto rport = CreateTestPort(test_network, "rfrag", "rpass"); > lport->SetIceRole(cricket::ICEROLE_CONTROLLING); > lport->SetIceTiebreaker(kTiebreaker1); > rport->SetIceRole(cricket::ICEROLE_CONTROLLED); >@@ -1982,7 +1964,7 @@ TEST_F(PortTest, TestNetworkCostChange) { > EXPECT_EQ(STUN_BINDING_REQUEST, msg->type()); > // Pass the binding request to rport. > rconn->OnReadPacket(lport->last_stun_buf()->data<char>(), >- lport->last_stun_buf()->size(), rtc::PacketTime()); >+ lport->last_stun_buf()->size(), /* packet_time_us */ -1); > // Wait until rport sends the response and then check the remote network cost. 
> ASSERT_TRUE_WAIT(rport->last_stun_msg() != NULL, kDefaultTimeout); > EXPECT_EQ(rtc::kNetworkCostHigh, rconn->remote_candidate().network_cost()); >@@ -1990,10 +1972,8 @@ TEST_F(PortTest, TestNetworkCostChange) { > > TEST_F(PortTest, TestNetworkInfoAttribute) { > rtc::Network* test_network = MakeNetwork(kLocalAddr1); >- std::unique_ptr<TestPort> lport( >- CreateTestPort(test_network, "lfrag", "lpass")); >- std::unique_ptr<TestPort> rport( >- CreateTestPort(test_network, "rfrag", "rpass")); >+ auto lport = CreateTestPort(test_network, "lfrag", "lpass"); >+ auto rport = CreateTestPort(test_network, "rfrag", "rpass"); > lport->SetIceRole(cricket::ICEROLE_CONTROLLING); > lport->SetIceTiebreaker(kTiebreaker1); > rport->SetIceRole(cricket::ICEROLE_CONTROLLED); >@@ -2037,44 +2017,43 @@ TEST_F(PortTest, TestNetworkInfoAttribute) { > // Test handling STUN messages. > TEST_F(PortTest, TestHandleStunMessage) { > // Our port will act as the "remote" port. >- std::unique_ptr<TestPort> port(CreateTestPort(kLocalAddr2, "rfrag", "rpass")); >+ auto port = CreateTestPort(kLocalAddr2, "rfrag", "rpass"); > > std::unique_ptr<IceMessage> in_msg, out_msg; >- std::unique_ptr<ByteBufferWriter> buf(new ByteBufferWriter()); >+ auto buf = absl::make_unique<ByteBufferWriter>(); > rtc::SocketAddress addr(kLocalAddr1); > std::string username; > > // BINDING-REQUEST from local to remote with valid ICE username, > // MESSAGE-INTEGRITY, and FINGERPRINT. 
>- in_msg.reset( >- CreateStunMessageWithUsername(STUN_BINDING_REQUEST, "rfrag:lfrag")); >+ in_msg = CreateStunMessageWithUsername(STUN_BINDING_REQUEST, "rfrag:lfrag"); > in_msg->AddMessageIntegrity("rpass"); > in_msg->AddFingerprint(); >- WriteStunMessage(in_msg.get(), buf.get()); >+ WriteStunMessage(*in_msg, buf.get()); > EXPECT_TRUE(port->GetStunMessage(buf->Data(), buf->Length(), addr, &out_msg, > &username)); > EXPECT_TRUE(out_msg.get() != NULL); > EXPECT_EQ("lfrag", username); > > // BINDING-RESPONSE without username, with MESSAGE-INTEGRITY and FINGERPRINT. >- in_msg.reset(CreateStunMessage(STUN_BINDING_RESPONSE)); >+ in_msg = CreateStunMessage(STUN_BINDING_RESPONSE); > in_msg->AddAttribute(absl::make_unique<StunXorAddressAttribute>( > STUN_ATTR_XOR_MAPPED_ADDRESS, kLocalAddr2)); > in_msg->AddMessageIntegrity("rpass"); > in_msg->AddFingerprint(); >- WriteStunMessage(in_msg.get(), buf.get()); >+ WriteStunMessage(*in_msg, buf.get()); > EXPECT_TRUE(port->GetStunMessage(buf->Data(), buf->Length(), addr, &out_msg, > &username)); > EXPECT_TRUE(out_msg.get() != NULL); > EXPECT_EQ("", username); > > // BINDING-ERROR-RESPONSE without username, with error, M-I, and FINGERPRINT. >- in_msg.reset(CreateStunMessage(STUN_BINDING_ERROR_RESPONSE)); >+ in_msg = CreateStunMessage(STUN_BINDING_ERROR_RESPONSE); > in_msg->AddAttribute(absl::make_unique<StunErrorCodeAttribute>( > STUN_ATTR_ERROR_CODE, STUN_ERROR_SERVER_ERROR, > STUN_ERROR_REASON_SERVER_ERROR)); > in_msg->AddFingerprint(); >- WriteStunMessage(in_msg.get(), buf.get()); >+ WriteStunMessage(*in_msg, buf.get()); > EXPECT_TRUE(port->GetStunMessage(buf->Data(), buf->Length(), addr, &out_msg, > &username)); > EXPECT_TRUE(out_msg.get() != NULL); >@@ -2087,18 +2066,18 @@ TEST_F(PortTest, TestHandleStunMessage) { > > // Tests handling of ICE binding requests with missing or incorrect usernames. 
> TEST_F(PortTest, TestHandleStunMessageBadUsername) { >- std::unique_ptr<TestPort> port(CreateTestPort(kLocalAddr2, "rfrag", "rpass")); >+ auto port = CreateTestPort(kLocalAddr2, "rfrag", "rpass"); > > std::unique_ptr<IceMessage> in_msg, out_msg; >- std::unique_ptr<ByteBufferWriter> buf(new ByteBufferWriter()); >+ auto buf = absl::make_unique<ByteBufferWriter>(); > rtc::SocketAddress addr(kLocalAddr1); > std::string username; > > // BINDING-REQUEST with no username. >- in_msg.reset(CreateStunMessage(STUN_BINDING_REQUEST)); >+ in_msg = CreateStunMessage(STUN_BINDING_REQUEST); > in_msg->AddMessageIntegrity("rpass"); > in_msg->AddFingerprint(); >- WriteStunMessage(in_msg.get(), buf.get()); >+ WriteStunMessage(*in_msg, buf.get()); > EXPECT_TRUE(port->GetStunMessage(buf->Data(), buf->Length(), addr, &out_msg, > &username)); > EXPECT_TRUE(out_msg.get() == NULL); >@@ -2106,10 +2085,10 @@ TEST_F(PortTest, TestHandleStunMessageBadUsername) { > EXPECT_EQ(STUN_ERROR_BAD_REQUEST, port->last_stun_error_code()); > > // BINDING-REQUEST with empty username. >- in_msg.reset(CreateStunMessageWithUsername(STUN_BINDING_REQUEST, "")); >+ in_msg = CreateStunMessageWithUsername(STUN_BINDING_REQUEST, ""); > in_msg->AddMessageIntegrity("rpass"); > in_msg->AddFingerprint(); >- WriteStunMessage(in_msg.get(), buf.get()); >+ WriteStunMessage(*in_msg, buf.get()); > EXPECT_TRUE(port->GetStunMessage(buf->Data(), buf->Length(), addr, &out_msg, > &username)); > EXPECT_TRUE(out_msg.get() == NULL); >@@ -2117,10 +2096,10 @@ TEST_F(PortTest, TestHandleStunMessageBadUsername) { > EXPECT_EQ(STUN_ERROR_UNAUTHORIZED, port->last_stun_error_code()); > > // BINDING-REQUEST with too-short username. 
>- in_msg.reset(CreateStunMessageWithUsername(STUN_BINDING_REQUEST, "rfra")); >+ in_msg = CreateStunMessageWithUsername(STUN_BINDING_REQUEST, "rfra"); > in_msg->AddMessageIntegrity("rpass"); > in_msg->AddFingerprint(); >- WriteStunMessage(in_msg.get(), buf.get()); >+ WriteStunMessage(*in_msg, buf.get()); > EXPECT_TRUE(port->GetStunMessage(buf->Data(), buf->Length(), addr, &out_msg, > &username)); > EXPECT_TRUE(out_msg.get() == NULL); >@@ -2128,11 +2107,10 @@ TEST_F(PortTest, TestHandleStunMessageBadUsername) { > EXPECT_EQ(STUN_ERROR_UNAUTHORIZED, port->last_stun_error_code()); > > // BINDING-REQUEST with reversed username. >- in_msg.reset( >- CreateStunMessageWithUsername(STUN_BINDING_REQUEST, "lfrag:rfrag")); >+ in_msg = CreateStunMessageWithUsername(STUN_BINDING_REQUEST, "lfrag:rfrag"); > in_msg->AddMessageIntegrity("rpass"); > in_msg->AddFingerprint(); >- WriteStunMessage(in_msg.get(), buf.get()); >+ WriteStunMessage(*in_msg, buf.get()); > EXPECT_TRUE(port->GetStunMessage(buf->Data(), buf->Length(), addr, &out_msg, > &username)); > EXPECT_TRUE(out_msg.get() == NULL); >@@ -2140,11 +2118,10 @@ TEST_F(PortTest, TestHandleStunMessageBadUsername) { > EXPECT_EQ(STUN_ERROR_UNAUTHORIZED, port->last_stun_error_code()); > > // BINDING-REQUEST with garbage username. >- in_msg.reset( >- CreateStunMessageWithUsername(STUN_BINDING_REQUEST, "abcd:efgh")); >+ in_msg = CreateStunMessageWithUsername(STUN_BINDING_REQUEST, "abcd:efgh"); > in_msg->AddMessageIntegrity("rpass"); > in_msg->AddFingerprint(); >- WriteStunMessage(in_msg.get(), buf.get()); >+ WriteStunMessage(*in_msg, buf.get()); > EXPECT_TRUE(port->GetStunMessage(buf->Data(), buf->Length(), addr, &out_msg, > &username)); > EXPECT_TRUE(out_msg.get() == NULL); >@@ -2155,19 +2132,18 @@ TEST_F(PortTest, TestHandleStunMessageBadUsername) { > // Test handling STUN messages with missing or malformed M-I. > TEST_F(PortTest, TestHandleStunMessageBadMessageIntegrity) { > // Our port will act as the "remote" port. 
>- std::unique_ptr<TestPort> port(CreateTestPort(kLocalAddr2, "rfrag", "rpass")); >+ auto port = CreateTestPort(kLocalAddr2, "rfrag", "rpass"); > > std::unique_ptr<IceMessage> in_msg, out_msg; >- std::unique_ptr<ByteBufferWriter> buf(new ByteBufferWriter()); >+ auto buf = absl::make_unique<ByteBufferWriter>(); > rtc::SocketAddress addr(kLocalAddr1); > std::string username; > > // BINDING-REQUEST from local to remote with valid ICE username and > // FINGERPRINT, but no MESSAGE-INTEGRITY. >- in_msg.reset( >- CreateStunMessageWithUsername(STUN_BINDING_REQUEST, "rfrag:lfrag")); >+ in_msg = CreateStunMessageWithUsername(STUN_BINDING_REQUEST, "rfrag:lfrag"); > in_msg->AddFingerprint(); >- WriteStunMessage(in_msg.get(), buf.get()); >+ WriteStunMessage(*in_msg, buf.get()); > EXPECT_TRUE(port->GetStunMessage(buf->Data(), buf->Length(), addr, &out_msg, > &username)); > EXPECT_TRUE(out_msg.get() == NULL); >@@ -2176,11 +2152,10 @@ TEST_F(PortTest, TestHandleStunMessageBadMessageIntegrity) { > > // BINDING-REQUEST from local to remote with valid ICE username and > // FINGERPRINT, but invalid MESSAGE-INTEGRITY. >- in_msg.reset( >- CreateStunMessageWithUsername(STUN_BINDING_REQUEST, "rfrag:lfrag")); >+ in_msg = CreateStunMessageWithUsername(STUN_BINDING_REQUEST, "rfrag:lfrag"); > in_msg->AddMessageIntegrity("invalid"); > in_msg->AddFingerprint(); >- WriteStunMessage(in_msg.get(), buf.get()); >+ WriteStunMessage(*in_msg, buf.get()); > EXPECT_TRUE(port->GetStunMessage(buf->Data(), buf->Length(), addr, &out_msg, > &username)); > EXPECT_TRUE(out_msg.get() == NULL); >@@ -2195,19 +2170,18 @@ TEST_F(PortTest, TestHandleStunMessageBadMessageIntegrity) { > // Test handling STUN messages with missing or malformed FINGERPRINT. > TEST_F(PortTest, TestHandleStunMessageBadFingerprint) { > // Our port will act as the "remote" port. 
>- std::unique_ptr<TestPort> port(CreateTestPort(kLocalAddr2, "rfrag", "rpass")); >+ auto port = CreateTestPort(kLocalAddr2, "rfrag", "rpass"); > > std::unique_ptr<IceMessage> in_msg, out_msg; >- std::unique_ptr<ByteBufferWriter> buf(new ByteBufferWriter()); >+ auto buf = absl::make_unique<ByteBufferWriter>(); > rtc::SocketAddress addr(kLocalAddr1); > std::string username; > > // BINDING-REQUEST from local to remote with valid ICE username and > // MESSAGE-INTEGRITY, but no FINGERPRINT; GetStunMessage should fail. >- in_msg.reset( >- CreateStunMessageWithUsername(STUN_BINDING_REQUEST, "rfrag:lfrag")); >+ in_msg = CreateStunMessageWithUsername(STUN_BINDING_REQUEST, "rfrag:lfrag"); > in_msg->AddMessageIntegrity("rpass"); >- WriteStunMessage(in_msg.get(), buf.get()); >+ WriteStunMessage(*in_msg, buf.get()); > EXPECT_FALSE(port->GetStunMessage(buf->Data(), buf->Length(), addr, &out_msg, > &username)); > EXPECT_EQ(0, port->last_stun_error_code()); >@@ -2215,17 +2189,17 @@ TEST_F(PortTest, TestHandleStunMessageBadFingerprint) { > // Now, add a fingerprint, but munge the message so it's not valid. > in_msg->AddFingerprint(); > in_msg->SetTransactionID("TESTTESTBADD"); >- WriteStunMessage(in_msg.get(), buf.get()); >+ WriteStunMessage(*in_msg, buf.get()); > EXPECT_FALSE(port->GetStunMessage(buf->Data(), buf->Length(), addr, &out_msg, > &username)); > EXPECT_EQ(0, port->last_stun_error_code()); > > // Valid BINDING-RESPONSE, except no FINGERPRINT. 
>- in_msg.reset(CreateStunMessage(STUN_BINDING_RESPONSE)); >+ in_msg = CreateStunMessage(STUN_BINDING_RESPONSE); > in_msg->AddAttribute(absl::make_unique<StunXorAddressAttribute>( > STUN_ATTR_XOR_MAPPED_ADDRESS, kLocalAddr2)); > in_msg->AddMessageIntegrity("rpass"); >- WriteStunMessage(in_msg.get(), buf.get()); >+ WriteStunMessage(*in_msg, buf.get()); > EXPECT_FALSE(port->GetStunMessage(buf->Data(), buf->Length(), addr, &out_msg, > &username)); > EXPECT_EQ(0, port->last_stun_error_code()); >@@ -2233,18 +2207,18 @@ TEST_F(PortTest, TestHandleStunMessageBadFingerprint) { > // Now, add a fingerprint, but munge the message so it's not valid. > in_msg->AddFingerprint(); > in_msg->SetTransactionID("TESTTESTBADD"); >- WriteStunMessage(in_msg.get(), buf.get()); >+ WriteStunMessage(*in_msg, buf.get()); > EXPECT_FALSE(port->GetStunMessage(buf->Data(), buf->Length(), addr, &out_msg, > &username)); > EXPECT_EQ(0, port->last_stun_error_code()); > > // Valid BINDING-ERROR-RESPONSE, except no FINGERPRINT. >- in_msg.reset(CreateStunMessage(STUN_BINDING_ERROR_RESPONSE)); >+ in_msg = CreateStunMessage(STUN_BINDING_ERROR_RESPONSE); > in_msg->AddAttribute(absl::make_unique<StunErrorCodeAttribute>( > STUN_ATTR_ERROR_CODE, STUN_ERROR_SERVER_ERROR, > STUN_ERROR_REASON_SERVER_ERROR)); > in_msg->AddMessageIntegrity("rpass"); >- WriteStunMessage(in_msg.get(), buf.get()); >+ WriteStunMessage(*in_msg, buf.get()); > EXPECT_FALSE(port->GetStunMessage(buf->Data(), buf->Length(), addr, &out_msg, > &username)); > EXPECT_EQ(0, port->last_stun_error_code()); >@@ -2252,7 +2226,7 @@ TEST_F(PortTest, TestHandleStunMessageBadFingerprint) { > // Now, add a fingerprint, but munge the message so it's not valid. 
> in_msg->AddFingerprint(); > in_msg->SetTransactionID("TESTTESTBADD"); >- WriteStunMessage(in_msg.get(), buf.get()); >+ WriteStunMessage(*in_msg, buf.get()); > EXPECT_FALSE(port->GetStunMessage(buf->Data(), buf->Length(), addr, &out_msg, > &username)); > EXPECT_EQ(0, port->last_stun_error_code()); >@@ -2261,8 +2235,7 @@ TEST_F(PortTest, TestHandleStunMessageBadFingerprint) { > // Test handling of STUN binding indication messages . STUN binding > // indications are allowed only to the connection which is in read mode. > TEST_F(PortTest, TestHandleStunBindingIndication) { >- std::unique_ptr<TestPort> lport( >- CreateTestPort(kLocalAddr2, "lfrag", "lpass")); >+ auto lport = CreateTestPort(kLocalAddr2, "lfrag", "lpass"); > lport->SetIceRole(cricket::ICEROLE_CONTROLLING); > lport->SetIceTiebreaker(kTiebreaker1); > >@@ -2272,9 +2245,9 @@ TEST_F(PortTest, TestHandleStunBindingIndication) { > rtc::SocketAddress addr(kLocalAddr1); > std::string username; > >- in_msg.reset(CreateStunMessage(STUN_BINDING_INDICATION)); >+ in_msg = CreateStunMessage(STUN_BINDING_INDICATION); > in_msg->AddFingerprint(); >- WriteStunMessage(in_msg.get(), buf.get()); >+ WriteStunMessage(*in_msg, buf.get()); > EXPECT_TRUE(lport->GetStunMessage(buf->Data(), buf->Length(), addr, &out_msg, > &username)); > EXPECT_TRUE(out_msg.get() != NULL); >@@ -2283,8 +2256,7 @@ TEST_F(PortTest, TestHandleStunBindingIndication) { > > // Verify connection can handle STUN indication and updates > // last_ping_received. >- std::unique_ptr<TestPort> rport( >- CreateTestPort(kLocalAddr2, "rfrag", "rpass")); >+ auto rport = CreateTestPort(kLocalAddr2, "rfrag", "rpass"); > rport->SetIceRole(cricket::ICEROLE_CONTROLLED); > rport->SetIceTiebreaker(kTiebreaker2); > >@@ -2304,7 +2276,7 @@ TEST_F(PortTest, TestHandleStunBindingIndication) { > EXPECT_EQ(STUN_BINDING_REQUEST, msg->type()); > // Send rport binding request to lport. 
> lconn->OnReadPacket(rport->last_stun_buf()->data<char>(), >- rport->last_stun_buf()->size(), rtc::PacketTime()); >+ rport->last_stun_buf()->size(), /* packet_time_us */ -1); > ASSERT_TRUE_WAIT(lport->last_stun_msg() != NULL, kDefaultTimeout); > EXPECT_EQ(STUN_BINDING_RESPONSE, lport->last_stun_msg()->type()); > int64_t last_ping_received1 = lconn->last_ping_received(); >@@ -2312,13 +2284,13 @@ TEST_F(PortTest, TestHandleStunBindingIndication) { > // Adding a delay of 100ms. > rtc::Thread::Current()->ProcessMessages(100); > // Pinging lconn using stun indication message. >- lconn->OnReadPacket(buf->Data(), buf->Length(), rtc::PacketTime()); >+ lconn->OnReadPacket(buf->Data(), buf->Length(), /* packet_time_us */ -1); > int64_t last_ping_received2 = lconn->last_ping_received(); > EXPECT_GT(last_ping_received2, last_ping_received1); > } > > TEST_F(PortTest, TestComputeCandidatePriority) { >- std::unique_ptr<TestPort> port(CreateTestPort(kLocalAddr1, "name", "pass")); >+ auto port = CreateTestPort(kLocalAddr1, "name", "pass"); > port->set_type_preference(90); > port->set_component(177); > port->AddCandidateAddress(SocketAddress("192.168.1.4", 1234)); >@@ -2355,8 +2327,7 @@ TEST_F(PortTest, TestComputeCandidatePriority) { > // In the case of shared socket, one port may be shared by local and stun. > // Test that candidates with different types will have different foundation. 
> TEST_F(PortTest, TestFoundation) { >- std::unique_ptr<TestPort> testport( >- CreateTestPort(kLocalAddr1, "name", "pass")); >+ auto testport = CreateTestPort(kLocalAddr1, "name", "pass"); > testport->AddCandidateAddress(kLocalAddr1, kLocalAddr1, LOCAL_PORT_TYPE, > cricket::ICE_TYPE_PREFERENCE_HOST, false); > testport->AddCandidateAddress(kLocalAddr2, kLocalAddr1, STUN_PORT_TYPE, >@@ -2369,20 +2340,19 @@ TEST_F(PortTest, TestFoundation) { > TEST_F(PortTest, TestCandidateFoundation) { > std::unique_ptr<rtc::NATServer> nat_server( > CreateNatServer(kNatAddr1, NAT_OPEN_CONE)); >- std::unique_ptr<UDPPort> udpport1(CreateUdpPort(kLocalAddr1)); >+ auto udpport1 = CreateUdpPort(kLocalAddr1); > udpport1->PrepareAddress(); >- std::unique_ptr<UDPPort> udpport2(CreateUdpPort(kLocalAddr1)); >+ auto udpport2 = CreateUdpPort(kLocalAddr1); > udpport2->PrepareAddress(); > EXPECT_EQ(udpport1->Candidates()[0].foundation(), > udpport2->Candidates()[0].foundation()); >- std::unique_ptr<TCPPort> tcpport1(CreateTcpPort(kLocalAddr1)); >+ auto tcpport1 = CreateTcpPort(kLocalAddr1); > tcpport1->PrepareAddress(); >- std::unique_ptr<TCPPort> tcpport2(CreateTcpPort(kLocalAddr1)); >+ auto tcpport2 = CreateTcpPort(kLocalAddr1); > tcpport2->PrepareAddress(); > EXPECT_EQ(tcpport1->Candidates()[0].foundation(), > tcpport2->Candidates()[0].foundation()); >- std::unique_ptr<Port> stunport( >- CreateStunPort(kLocalAddr1, nat_socket_factory1())); >+ auto stunport = CreateStunPort(kLocalAddr1, nat_socket_factory1()); > stunport->PrepareAddress(); > ASSERT_EQ_WAIT(1U, stunport->Candidates().size(), kDefaultTimeout); > EXPECT_NE(tcpport1->Candidates()[0].foundation(), >@@ -2394,7 +2364,7 @@ TEST_F(PortTest, TestCandidateFoundation) { > EXPECT_NE(udpport2->Candidates()[0].foundation(), > stunport->Candidates()[0].foundation()); > // Verify GTURN candidate foundation. 
>- std::unique_ptr<RelayPort> relayport(CreateGturnPort(kLocalAddr1)); >+ auto relayport = CreateGturnPort(kLocalAddr1); > relayport->AddServerAddress( > cricket::ProtocolAddress(kRelayUdpIntAddr, cricket::PROTO_UDP)); > relayport->PrepareAddress(); >@@ -2404,8 +2374,8 @@ TEST_F(PortTest, TestCandidateFoundation) { > EXPECT_NE(udpport2->Candidates()[0].foundation(), > relayport->Candidates()[0].foundation()); > // Verifying TURN candidate foundation. >- std::unique_ptr<Port> turnport1( >- CreateTurnPort(kLocalAddr1, nat_socket_factory1(), PROTO_UDP, PROTO_UDP)); >+ auto turnport1 = >+ CreateTurnPort(kLocalAddr1, nat_socket_factory1(), PROTO_UDP, PROTO_UDP); > turnport1->PrepareAddress(); > ASSERT_EQ_WAIT(1U, turnport1->Candidates().size(), kDefaultTimeout); > EXPECT_NE(udpport1->Candidates()[0].foundation(), >@@ -2414,8 +2384,8 @@ TEST_F(PortTest, TestCandidateFoundation) { > turnport1->Candidates()[0].foundation()); > EXPECT_NE(stunport->Candidates()[0].foundation(), > turnport1->Candidates()[0].foundation()); >- std::unique_ptr<Port> turnport2( >- CreateTurnPort(kLocalAddr1, nat_socket_factory1(), PROTO_UDP, PROTO_UDP)); >+ auto turnport2 = >+ CreateTurnPort(kLocalAddr1, nat_socket_factory1(), PROTO_UDP, PROTO_UDP); > turnport2->PrepareAddress(); > ASSERT_EQ_WAIT(1U, turnport2->Candidates().size(), kDefaultTimeout); > EXPECT_EQ(turnport1->Candidates()[0].foundation(), >@@ -2426,9 +2396,8 @@ TEST_F(PortTest, TestCandidateFoundation) { > SocketAddress kTurnUdpExtAddr2("99.99.98.5", 0); > TestTurnServer turn_server2(rtc::Thread::Current(), kTurnUdpIntAddr2, > kTurnUdpExtAddr2); >- std::unique_ptr<Port> turnport3( >- CreateTurnPort(kLocalAddr1, nat_socket_factory1(), PROTO_UDP, PROTO_UDP, >- kTurnUdpIntAddr2)); >+ auto turnport3 = CreateTurnPort(kLocalAddr1, nat_socket_factory1(), PROTO_UDP, >+ PROTO_UDP, kTurnUdpIntAddr2); > turnport3->PrepareAddress(); > ASSERT_EQ_WAIT(1U, turnport3->Candidates().size(), kDefaultTimeout); > 
EXPECT_NE(turnport3->Candidates()[0].foundation(), >@@ -2438,8 +2407,8 @@ TEST_F(PortTest, TestCandidateFoundation) { > // different foundations if their relay protocols are different. > TestTurnServer turn_server3(rtc::Thread::Current(), kTurnTcpIntAddr, > kTurnUdpExtAddr, PROTO_TCP); >- std::unique_ptr<Port> turnport4( >- CreateTurnPort(kLocalAddr1, nat_socket_factory1(), PROTO_TCP, PROTO_UDP)); >+ auto turnport4 = >+ CreateTurnPort(kLocalAddr1, nat_socket_factory1(), PROTO_TCP, PROTO_UDP); > turnport4->PrepareAddress(); > ASSERT_EQ_WAIT(1U, turnport4->Candidates().size(), kDefaultTimeout); > EXPECT_NE(turnport2->Candidates()[0].foundation(), >@@ -2449,17 +2418,15 @@ TEST_F(PortTest, TestCandidateFoundation) { > // This test verifies the related addresses of different types of > // ICE candiates. > TEST_F(PortTest, TestCandidateRelatedAddress) { >- std::unique_ptr<rtc::NATServer> nat_server( >- CreateNatServer(kNatAddr1, NAT_OPEN_CONE)); >- std::unique_ptr<UDPPort> udpport(CreateUdpPort(kLocalAddr1)); >+ auto nat_server = CreateNatServer(kNatAddr1, NAT_OPEN_CONE); >+ auto udpport = CreateUdpPort(kLocalAddr1); > udpport->PrepareAddress(); > // For UDPPort, related address will be empty. > EXPECT_TRUE(udpport->Candidates()[0].related_address().IsNil()); > // Testing related address for stun candidates. > // For stun candidate related address must be equal to the base > // socket address. >- std::unique_ptr<StunPort> stunport( >- CreateStunPort(kLocalAddr1, nat_socket_factory1())); >+ auto stunport = CreateStunPort(kLocalAddr1, nat_socket_factory1()); > stunport->PrepareAddress(); > ASSERT_EQ_WAIT(1U, stunport->Candidates().size(), kDefaultTimeout); > // Check STUN candidate address. >@@ -2470,7 +2437,7 @@ TEST_F(PortTest, TestCandidateRelatedAddress) { > // Verifying the related address for the GTURN candidates. > // NOTE: In case of GTURN related address will be equal to the mapped > // address, but address(mapped) will not be XOR. 
>- std::unique_ptr<RelayPort> relayport(CreateGturnPort(kLocalAddr1)); >+ auto relayport = CreateGturnPort(kLocalAddr1); > relayport->AddServerAddress( > cricket::ProtocolAddress(kRelayUdpIntAddr, cricket::PROTO_UDP)); > relayport->PrepareAddress(); >@@ -2479,8 +2446,8 @@ TEST_F(PortTest, TestCandidateRelatedAddress) { > EXPECT_EQ(rtc::SocketAddress(), relayport->Candidates()[0].related_address()); > // Verifying the related address for TURN candidate. > // For TURN related address must be equal to the mapped address. >- std::unique_ptr<Port> turnport( >- CreateTurnPort(kLocalAddr1, nat_socket_factory1(), PROTO_UDP, PROTO_UDP)); >+ auto turnport = >+ CreateTurnPort(kLocalAddr1, nat_socket_factory1(), PROTO_UDP, PROTO_UDP); > turnport->PrepareAddress(); > ASSERT_EQ_WAIT(1U, turnport->Candidates().size(), kDefaultTimeout); > EXPECT_EQ(kTurnUdpExtAddr.ipaddr(), >@@ -2500,11 +2467,9 @@ TEST_F(PortTest, TestCandidatePriority) { > > // Test the Connection priority is calculated correctly. > TEST_F(PortTest, TestConnectionPriority) { >- std::unique_ptr<TestPort> lport( >- CreateTestPort(kLocalAddr1, "lfrag", "lpass")); >+ auto lport = CreateTestPort(kLocalAddr1, "lfrag", "lpass"); > lport->set_type_preference(cricket::ICE_TYPE_PREFERENCE_HOST); >- std::unique_ptr<TestPort> rport( >- CreateTestPort(kLocalAddr2, "rfrag", "rpass")); >+ auto rport = CreateTestPort(kLocalAddr2, "rfrag", "rpass"); > rport->set_type_preference(cricket::ICE_TYPE_PREFERENCE_RELAY_UDP); > lport->set_component(123); > lport->AddCandidateAddress(SocketAddress("192.168.1.4", 1234)); >@@ -2545,14 +2510,14 @@ TEST_F(PortTest, TestConnectionPriority) { > // estimate given by |MINIMUM_RTT| = 100. 
> TEST_F(PortTest, TestWritableState) { > rtc::ScopedFakeClock clock; >- UDPPort* port1 = CreateUdpPort(kLocalAddr1); >+ auto port1 = CreateUdpPort(kLocalAddr1); > port1->SetIceRole(cricket::ICEROLE_CONTROLLING); >- UDPPort* port2 = CreateUdpPort(kLocalAddr2); >+ auto port2 = CreateUdpPort(kLocalAddr2); > port2->SetIceRole(cricket::ICEROLE_CONTROLLED); > > // Set up channels. >- TestChannel ch1(port1); >- TestChannel ch2(port2); >+ TestChannel ch1(std::move(port1)); >+ TestChannel ch2(std::move(port2)); > > // Acquire addresses. > ch1.Start(); >@@ -2561,7 +2526,7 @@ TEST_F(PortTest, TestWritableState) { > ASSERT_EQ_SIMULATED_WAIT(1, ch2.complete_count(), kDefaultTimeout, clock); > > // Send a ping from src to dst. >- ch1.CreateConnection(GetCandidate(port2)); >+ ch1.CreateConnection(GetCandidate(ch2.port())); > ASSERT_TRUE(ch1.conn() != NULL); > EXPECT_EQ(Connection::STATE_WRITE_INIT, ch1.conn()->write_state()); > // for TCP connect >@@ -2577,7 +2542,7 @@ TEST_F(PortTest, TestWritableState) { > > // Accept the connection to return the binding response, transition to > // writable, and allow data to be sent. >- ch2.AcceptConnection(GetCandidate(port1)); >+ ch2.AcceptConnection(GetCandidate(ch1.port())); > EXPECT_EQ_SIMULATED_WAIT(Connection::STATE_WRITABLE, > ch1.conn()->write_state(), kDefaultTimeout, clock); > EXPECT_EQ(data_size, ch1.conn()->Send(data, data_size, options)); >@@ -2622,14 +2587,14 @@ TEST_F(PortTest, TestWritableState) { > // |CONNECTION_WRITE_CONNECT_FAILURES|. > TEST_F(PortTest, TestWritableStateWithConfiguredThreshold) { > rtc::ScopedFakeClock clock; >- UDPPort* port1 = CreateUdpPort(kLocalAddr1); >+ auto port1 = CreateUdpPort(kLocalAddr1); > port1->SetIceRole(cricket::ICEROLE_CONTROLLING); >- UDPPort* port2 = CreateUdpPort(kLocalAddr2); >+ auto port2 = CreateUdpPort(kLocalAddr2); > port2->SetIceRole(cricket::ICEROLE_CONTROLLED); > > // Set up channels. 
>- TestChannel ch1(port1); >- TestChannel ch2(port2); >+ TestChannel ch1(std::move(port1)); >+ TestChannel ch2(std::move(port2)); > > // Acquire addresses. > ch1.Start(); >@@ -2638,14 +2603,14 @@ TEST_F(PortTest, TestWritableStateWithConfiguredThreshold) { > ASSERT_EQ_SIMULATED_WAIT(1, ch2.complete_count(), kDefaultTimeout, clock); > > // Send a ping from src to dst. >- ch1.CreateConnection(GetCandidate(port2)); >+ ch1.CreateConnection(GetCandidate(ch2.port())); > ASSERT_TRUE(ch1.conn() != NULL); > ch1.Ping(); > SIMULATED_WAIT(!ch2.remote_address().IsNil(), kShortTimeout, clock); > > // Accept the connection to return the binding response, transition to > // writable, and allow data to be sent. >- ch2.AcceptConnection(GetCandidate(port1)); >+ ch2.AcceptConnection(GetCandidate(ch1.port())); > EXPECT_EQ_SIMULATED_WAIT(Connection::STATE_WRITABLE, > ch1.conn()->write_state(), kDefaultTimeout, clock); > >@@ -2676,20 +2641,20 @@ TEST_F(PortTest, TestWritableStateWithConfiguredThreshold) { > } > > TEST_F(PortTest, TestTimeoutForNeverWritable) { >- UDPPort* port1 = CreateUdpPort(kLocalAddr1); >+ auto port1 = CreateUdpPort(kLocalAddr1); > port1->SetIceRole(cricket::ICEROLE_CONTROLLING); >- UDPPort* port2 = CreateUdpPort(kLocalAddr2); >+ auto port2 = CreateUdpPort(kLocalAddr2); > port2->SetIceRole(cricket::ICEROLE_CONTROLLED); > > // Set up channels. >- TestChannel ch1(port1); >- TestChannel ch2(port2); >+ TestChannel ch1(std::move(port1)); >+ TestChannel ch2(std::move(port2)); > > // Acquire addresses. > ch1.Start(); > ch2.Start(); > >- ch1.CreateConnection(GetCandidate(port2)); >+ ch1.CreateConnection(GetCandidate(ch2.port())); > ASSERT_TRUE(ch1.conn() != NULL); > EXPECT_EQ(Connection::STATE_WRITE_INIT, ch1.conn()->write_state()); > >@@ -2706,15 +2671,15 @@ TEST_F(PortTest, TestTimeoutForNeverWritable) { > // In this test |ch1| behaves like FULL mode client and we have created > // port which responds to the ping message just like LITE client. 
> TEST_F(PortTest, TestIceLiteConnectivity) { >- TestPort* ice_full_port = >+ auto ice_full_port = > CreateTestPort(kLocalAddr1, "lfrag", "lpass", > cricket::ICEROLE_CONTROLLING, kTiebreaker1); >+ auto* ice_full_port_ptr = ice_full_port.get(); > >- std::unique_ptr<TestPort> ice_lite_port( >- CreateTestPort(kLocalAddr2, "rfrag", "rpass", cricket::ICEROLE_CONTROLLED, >- kTiebreaker2)); >+ auto ice_lite_port = CreateTestPort( >+ kLocalAddr2, "rfrag", "rpass", cricket::ICEROLE_CONTROLLED, kTiebreaker2); > // Setup TestChannel. This behaves like FULL mode client. >- TestChannel ch1(ice_full_port); >+ TestChannel ch1(std::move(ice_full_port)); > ch1.SetIceMode(ICEMODE_FULL); > > // Start gathering candidates. >@@ -2734,24 +2699,24 @@ TEST_F(PortTest, TestIceLiteConnectivity) { > > // Verify stun ping is without USE_CANDIDATE_ATTR. Getting message directly > // from port. >- ASSERT_TRUE_WAIT(ice_full_port->last_stun_msg() != NULL, kDefaultTimeout); >- IceMessage* msg = ice_full_port->last_stun_msg(); >+ ASSERT_TRUE_WAIT(ice_full_port_ptr->last_stun_msg() != NULL, kDefaultTimeout); >+ IceMessage* msg = ice_full_port_ptr->last_stun_msg(); > EXPECT_TRUE(msg->GetByteString(STUN_ATTR_USE_CANDIDATE) == NULL); > > // Respond with a BINDING-RESPONSE from litemode client. > // NOTE: Ideally we should't create connection at this stage from lite > // port, as it should be done only after receiving ping with USE_CANDIDATE. > // But we need a connection to send a response message. 
>- ice_lite_port->CreateConnection(ice_full_port->Candidates()[0], >+ ice_lite_port->CreateConnection(ice_full_port_ptr->Candidates()[0], > cricket::Port::ORIGIN_MESSAGE); >- std::unique_ptr<IceMessage> request(CopyStunMessage(msg)); >- ice_lite_port->SendBindingResponse(request.get(), >- ice_full_port->Candidates()[0].address()); >+ std::unique_ptr<IceMessage> request = CopyStunMessage(*msg); >+ ice_lite_port->SendBindingResponse( >+ request.get(), ice_full_port_ptr->Candidates()[0].address()); > > // Feeding the respone message from litemode to the full mode connection. > ch1.conn()->OnReadPacket(ice_lite_port->last_stun_buf()->data<char>(), > ice_lite_port->last_stun_buf()->size(), >- rtc::PacketTime()); >+ /* packet_time_us */ -1); > // Verifying full mode connection becomes writable from the response. > EXPECT_EQ_WAIT(Connection::STATE_WRITABLE, ch1.conn()->write_state(), > kDefaultTimeout); >@@ -2759,11 +2724,11 @@ TEST_F(PortTest, TestIceLiteConnectivity) { > > // Clear existing stun messsages. Otherwise we will process old stun > // message right after we send ping. >- ice_full_port->Reset(); >+ ice_full_port_ptr->Reset(); > // Send ping. This must have USE_CANDIDATE_ATTR. 
> ch1.Ping(); >- ASSERT_TRUE_WAIT(ice_full_port->last_stun_msg() != NULL, kDefaultTimeout); >- msg = ice_full_port->last_stun_msg(); >+ ASSERT_TRUE_WAIT(ice_full_port_ptr->last_stun_msg() != NULL, kDefaultTimeout); >+ msg = ice_full_port_ptr->last_stun_msg(); > EXPECT_TRUE(msg->GetByteString(STUN_ATTR_USE_CANDIDATE) != NULL); > ch1.Stop(); > } >@@ -2774,21 +2739,21 @@ TEST_F(PortTest, TestIceLiteConnectivity) { > TEST_F(PortTest, TestPortTimeoutIfNotKeptAlive) { > rtc::ScopedFakeClock clock; > int timeout_delay = 100; >- UDPPort* port1 = CreateUdpPort(kLocalAddr1); >- ConnectToSignalDestroyed(port1); >+ auto port1 = CreateUdpPort(kLocalAddr1); >+ ConnectToSignalDestroyed(port1.get()); > port1->set_timeout_delay(timeout_delay); // milliseconds > port1->SetIceRole(cricket::ICEROLE_CONTROLLING); > port1->SetIceTiebreaker(kTiebreaker1); > >- UDPPort* port2 = CreateUdpPort(kLocalAddr2); >- ConnectToSignalDestroyed(port2); >+ auto port2 = CreateUdpPort(kLocalAddr2); >+ ConnectToSignalDestroyed(port2.get()); > port2->set_timeout_delay(timeout_delay); // milliseconds > port2->SetIceRole(cricket::ICEROLE_CONTROLLED); > port2->SetIceTiebreaker(kTiebreaker2); > > // Set up channels and ensure both ports will be deleted. >- TestChannel ch1(port1); >- TestChannel ch2(port2); >+ TestChannel ch1(std::move(port1)); >+ TestChannel ch2(std::move(port2)); > > // Simulate a connection that succeeds, and then is destroyed. 
> StartConnectAndStopChannels(&ch1, &ch2); >@@ -2803,22 +2768,22 @@ TEST_F(PortTest, TestPortTimeoutIfNotKeptAlive) { > TEST_F(PortTest, TestPortTimeoutAfterNewConnectionCreatedAndDestroyed) { > rtc::ScopedFakeClock clock; > int timeout_delay = 100; >- UDPPort* port1 = CreateUdpPort(kLocalAddr1); >- ConnectToSignalDestroyed(port1); >+ auto port1 = CreateUdpPort(kLocalAddr1); >+ ConnectToSignalDestroyed(port1.get()); > port1->set_timeout_delay(timeout_delay); // milliseconds > port1->SetIceRole(cricket::ICEROLE_CONTROLLING); > port1->SetIceTiebreaker(kTiebreaker1); > >- UDPPort* port2 = CreateUdpPort(kLocalAddr2); >- ConnectToSignalDestroyed(port2); >+ auto port2 = CreateUdpPort(kLocalAddr2); >+ ConnectToSignalDestroyed(port2.get()); > port2->set_timeout_delay(timeout_delay); // milliseconds > > port2->SetIceRole(cricket::ICEROLE_CONTROLLED); > port2->SetIceTiebreaker(kTiebreaker2); > > // Set up channels and ensure both ports will be deleted. >- TestChannel ch1(port1); >- TestChannel ch2(port2); >+ TestChannel ch1(std::move(port1)); >+ TestChannel ch2(std::move(port2)); > > // Simulate a connection that succeeds, and then is destroyed. 
> StartConnectAndStopChannels(&ch1, &ch2); >@@ -2844,14 +2809,14 @@ TEST_F(PortTest, TestPortTimeoutAfterNewConnectionCreatedAndDestroyed) { > TEST_F(PortTest, TestPortNotTimeoutUntilPruned) { > rtc::ScopedFakeClock clock; > int timeout_delay = 100; >- UDPPort* port1 = CreateUdpPort(kLocalAddr1); >- ConnectToSignalDestroyed(port1); >+ auto port1 = CreateUdpPort(kLocalAddr1); >+ ConnectToSignalDestroyed(port1.get()); > port1->set_timeout_delay(timeout_delay); // milliseconds > port1->SetIceRole(cricket::ICEROLE_CONTROLLING); > port1->SetIceTiebreaker(kTiebreaker1); > >- UDPPort* port2 = CreateUdpPort(kLocalAddr2); >- ConnectToSignalDestroyed(port2); >+ auto port2 = CreateUdpPort(kLocalAddr2); >+ ConnectToSignalDestroyed(port2.get()); > port2->set_timeout_delay(timeout_delay); // milliseconds > port2->SetIceRole(cricket::ICEROLE_CONTROLLED); > port2->SetIceTiebreaker(kTiebreaker2); >@@ -2862,40 +2827,39 @@ TEST_F(PortTest, TestPortNotTimeoutUntilPruned) { > port2->set_component(cricket::ICE_CANDIDATE_COMPONENT_DEFAULT); > > // Set up channels and keep the port alive. >- TestChannel ch1(port1); >- TestChannel ch2(port2); >+ TestChannel ch1(std::move(port1)); >+ TestChannel ch2(std::move(port2)); > // Simulate a connection that succeeds, and then is destroyed. But ports > // are kept alive. Ports won't be destroyed. > StartConnectAndStopChannels(&ch1, &ch2); >- port1->KeepAliveUntilPruned(); >- port2->KeepAliveUntilPruned(); >+ ch1.port()->KeepAliveUntilPruned(); >+ ch2.port()->KeepAliveUntilPruned(); > SIMULATED_WAIT(ports_destroyed() > 0, 150, clock); > EXPECT_EQ(0, ports_destroyed()); > > // If they are pruned now, they will be destroyed right away. >- port1->Prune(); >- port2->Prune(); >+ ch1.port()->Prune(); >+ ch2.port()->Prune(); > // The ports on both sides should be destroyed after timeout. 
> EXPECT_TRUE_SIMULATED_WAIT(ports_destroyed() == 2, 1, clock); > } > > TEST_F(PortTest, TestSupportsProtocol) { >- std::unique_ptr<Port> udp_port(CreateUdpPort(kLocalAddr1)); >+ auto udp_port = CreateUdpPort(kLocalAddr1); > EXPECT_TRUE(udp_port->SupportsProtocol(UDP_PROTOCOL_NAME)); > EXPECT_FALSE(udp_port->SupportsProtocol(TCP_PROTOCOL_NAME)); > >- std::unique_ptr<Port> stun_port( >- CreateStunPort(kLocalAddr1, nat_socket_factory1())); >+ auto stun_port = CreateStunPort(kLocalAddr1, nat_socket_factory1()); > EXPECT_TRUE(stun_port->SupportsProtocol(UDP_PROTOCOL_NAME)); > EXPECT_FALSE(stun_port->SupportsProtocol(TCP_PROTOCOL_NAME)); > >- std::unique_ptr<Port> tcp_port(CreateTcpPort(kLocalAddr1)); >+ auto tcp_port = CreateTcpPort(kLocalAddr1); > EXPECT_TRUE(tcp_port->SupportsProtocol(TCP_PROTOCOL_NAME)); > EXPECT_TRUE(tcp_port->SupportsProtocol(SSLTCP_PROTOCOL_NAME)); > EXPECT_FALSE(tcp_port->SupportsProtocol(UDP_PROTOCOL_NAME)); > >- std::unique_ptr<Port> turn_port( >- CreateTurnPort(kLocalAddr1, nat_socket_factory1(), PROTO_UDP, PROTO_UDP)); >+ auto turn_port = >+ CreateTurnPort(kLocalAddr1, nat_socket_factory1(), PROTO_UDP, PROTO_UDP); > EXPECT_TRUE(turn_port->SupportsProtocol(UDP_PROTOCOL_NAME)); > EXPECT_FALSE(turn_port->SupportsProtocol(TCP_PROTOCOL_NAME)); > } >@@ -2903,8 +2867,7 @@ TEST_F(PortTest, TestSupportsProtocol) { > // Test that SetIceParameters updates the component, ufrag and password > // on both the port itself and its candidates. 
> TEST_F(PortTest, TestSetIceParameters) { >- std::unique_ptr<TestPort> port( >- CreateTestPort(kLocalAddr1, "ufrag1", "password1")); >+ auto port = CreateTestPort(kLocalAddr1, "ufrag1", "password1"); > port->PrepareAddress(); > EXPECT_EQ(1UL, port->Candidates().size()); > port->SetIceParameters(1, "ufrag2", "password2"); >@@ -2918,8 +2881,7 @@ TEST_F(PortTest, TestSetIceParameters) { > } > > TEST_F(PortTest, TestAddConnectionWithSameAddress) { >- std::unique_ptr<TestPort> port( >- CreateTestPort(kLocalAddr1, "ufrag1", "password1")); >+ auto port = CreateTestPort(kLocalAddr1, "ufrag1", "password1"); > port->PrepareAddress(); > EXPECT_EQ(1u, port->Candidates().size()); > rtc::SocketAddress address("1.1.1.1", 5000); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/portallocator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/portallocator.cc >index 5470b5b82e655e3643bee0ed9f42eb9eb706578a..d3b3a56edd926c7c0a0314bc53c86a472974c22b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/portallocator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/portallocator.cc >@@ -10,8 +10,10 @@ > > #include "p2p/base/portallocator.h" > >+#include <iterator> > #include <utility> > >+#include "p2p/base/icecredentialsiterator.h" > #include "rtc_base/checks.h" > > namespace cricket { >@@ -121,6 +123,10 @@ PortAllocator::~PortAllocator() { > CheckRunOnValidThreadIfInitialized(); > } > >+void PortAllocator::set_restrict_ice_credentials_change(bool value) { >+ restrict_ice_credentials_change_ = value; >+} >+ > bool PortAllocator::SetConfiguration( > const ServerAddresses& stun_servers, > const std::vector<RelayServerConfig>& turn_servers, >@@ -166,8 +172,8 @@ bool PortAllocator::SetConfiguration( > // If |candidate_pool_size_| is less than the number of pooled sessions, get > // rid of the extras. 
> while (candidate_pool_size_ < static_cast<int>(pooled_sessions_.size())) { >- pooled_sessions_.front().reset(nullptr); >- pooled_sessions_.pop_front(); >+ pooled_sessions_.back().reset(nullptr); >+ pooled_sessions_.pop_back(); > } > > // |stun_candidate_keepalive_interval_| will be used in STUN port allocation >@@ -183,7 +189,11 @@ bool PortAllocator::SetConfiguration( > // If |candidate_pool_size_| is greater than the number of pooled sessions, > // create new sessions. > while (static_cast<int>(pooled_sessions_.size()) < candidate_pool_size_) { >- PortAllocatorSession* pooled_session = CreateSessionInternal("", 0, "", ""); >+ IceParameters iceCredentials = >+ IceCredentialsIterator::CreateRandomIceCredentials(); >+ PortAllocatorSession* pooled_session = >+ CreateSessionInternal("", 0, iceCredentials.ufrag, iceCredentials.pwd); >+ pooled_session->set_pooled(true); > pooled_session->StartGettingPorts(); > pooled_sessions_.push_back( > std::unique_ptr<PortAllocatorSession>(pooled_session)); >@@ -214,22 +224,50 @@ std::unique_ptr<PortAllocatorSession> PortAllocator::TakePooledSession( > if (pooled_sessions_.empty()) { > return nullptr; > } >- std::unique_ptr<PortAllocatorSession> ret = >- std::move(pooled_sessions_.front()); >+ >+ IceParameters credentials(ice_ufrag, ice_pwd, false); >+ // If restrict_ice_credentials_change_ is TRUE, then call FindPooledSession >+ // with ice credentials. Otherwise call it with nullptr which means >+ // "find any" pooled session. >+ auto cit = FindPooledSession(restrict_ice_credentials_change_ ? &credentials >+ : nullptr); >+ if (cit == pooled_sessions_.end()) { >+ return nullptr; >+ } >+ >+ auto it = >+ pooled_sessions_.begin() + std::distance(pooled_sessions_.cbegin(), cit); >+ std::unique_ptr<PortAllocatorSession> ret = std::move(*it); > ret->SetIceParameters(content_name, component, ice_ufrag, ice_pwd); >- // According to JSEP, a pooled session should filter candidates only after >- // it's taken out of the pool. 
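The `TakePooledSession` rewrite above converts the `const_iterator` returned by `FindPooledSession` into a mutable iterator via `begin() + std::distance(cbegin(), cit)` before moving the element out and erasing it. A self-contained sketch of that idiom over a hypothetical pool (a `vector` of `unique_ptr<string>` standing in for the session list):

```cpp
#include <cassert>
#include <iterator>
#include <memory>
#include <string>
#include <utility>
#include <vector>

using Pool = std::vector<std::unique_ptr<std::string>>;

// Const lookup, as in FindPooledSession: read-only callers get a
// const_iterator.
Pool::const_iterator Find(const Pool& pool, const std::string& value) {
  for (auto it = pool.begin(); it != pool.end(); ++it) {
    if (**it == value) return it;
  }
  return pool.end();
}

// Take, as in TakePooledSession: rebase the const_iterator onto a mutable
// iterator so the element can be moved out before erasing its slot.
std::unique_ptr<std::string> Take(Pool& pool, const std::string& value) {
  auto cit = Find(pool, value);
  if (cit == pool.cend()) return nullptr;
  auto it = pool.begin() + std::distance(pool.cbegin(), cit);
  std::unique_ptr<std::string> taken = std::move(*it);
  pool.erase(it);
  return taken;
}
```

Since C++11, `vector::erase` accepts a `const_iterator` directly; the conversion is needed only because dereferencing the `const_iterator` yields a `const unique_ptr&`, which cannot be moved from.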
>+ ret->set_pooled(false); >+ // According to JSEP, a pooled session should filter candidates only >+ // after it's taken out of the pool. > ret->SetCandidateFilter(candidate_filter()); >- pooled_sessions_.pop_front(); >+ pooled_sessions_.erase(it); > return ret; > } > >-const PortAllocatorSession* PortAllocator::GetPooledSession() const { >+const PortAllocatorSession* PortAllocator::GetPooledSession( >+ const IceParameters* ice_credentials) const { > CheckRunOnValidThreadAndInitialized(); >- if (pooled_sessions_.empty()) { >+ auto it = FindPooledSession(ice_credentials); >+ if (it == pooled_sessions_.end()) { > return nullptr; >+ } else { >+ return it->get(); >+ } >+} >+ >+std::vector<std::unique_ptr<PortAllocatorSession>>::const_iterator >+PortAllocator::FindPooledSession(const IceParameters* ice_credentials) const { >+ for (auto it = pooled_sessions_.begin(); it != pooled_sessions_.end(); ++it) { >+ if (ice_credentials == nullptr || >+ ((*it)->ice_ufrag() == ice_credentials->ufrag && >+ (*it)->ice_pwd() == ice_credentials->pwd)) { >+ return it; >+ } > } >- return pooled_sessions_.front().get(); >+ return pooled_sessions_.end(); > } > > void PortAllocator::FreezeCandidatePool() { >@@ -250,4 +288,14 @@ void PortAllocator::GetCandidateStatsFromPooledSessions( > } > } > >+std::vector<IceParameters> PortAllocator::GetPooledIceCredentials() { >+ CheckRunOnValidThreadAndInitialized(); >+ std::vector<IceParameters> list; >+ for (const auto& session : pooled_sessions_) { >+ list.push_back( >+ IceParameters(session->ice_ufrag(), session->ice_pwd(), false)); >+ } >+ return list; >+} >+ > } // namespace cricket >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/portallocator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/portallocator.h >index 8bd709642c8aec3846581be8d6c6b06e78bce020..7026f2b2b63204f72aabd723369d5c3f19d2497e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/portallocator.h >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/portallocator.h >@@ -21,6 +21,7 @@ > #include "rtc_base/helpers.h" > #include "rtc_base/proxyinfo.h" > #include "rtc_base/sslcertificate.h" >+#include "rtc_base/system/rtc_export.h" > #include "rtc_base/third_party/sigslot/sigslot.h" > #include "rtc_base/thread.h" > #include "rtc_base/thread_checker.h" >@@ -146,7 +147,7 @@ struct RelayCredentials { > > typedef std::vector<ProtocolAddress> PortList; > // TODO(deadbeef): Rename to TurnServerConfig. >-struct RelayServerConfig { >+struct RTC_EXPORT RelayServerConfig { > explicit RelayServerConfig(RelayType type); > RelayServerConfig(const rtc::SocketAddress& address, > const std::string& username, >@@ -183,7 +184,7 @@ struct RelayServerConfig { > rtc::SSLCertificateVerifier* tls_cert_verifier = nullptr; > }; > >-class PortAllocatorSession : public sigslot::has_slots<> { >+class RTC_EXPORT PortAllocatorSession : public sigslot::has_slots<> { > public: > // Content name passed in mostly for logging and debugging. 
> PortAllocatorSession(const std::string& content_name, >@@ -201,7 +202,7 @@ class PortAllocatorSession : public sigslot::has_slots<> { > int component() const { return component_; } > const std::string& ice_ufrag() const { return ice_ufrag_; } > const std::string& ice_pwd() const { return ice_pwd_; } >- bool pooled() const { return ice_ufrag_.empty(); } >+ bool pooled() const { return pooled_; } > > // Setting this filter should affect not only candidates gathered in the > // future, but candidates already gathered and ports already "ready", >@@ -309,6 +310,8 @@ class PortAllocatorSession : public sigslot::has_slots<> { > UpdateIceParametersInternal(); > } > >+ void set_pooled(bool value) { pooled_ = value; } >+ > uint32_t flags_; > uint32_t generation_; > std::string content_name_; >@@ -316,6 +319,8 @@ class PortAllocatorSession : public sigslot::has_slots<> { > std::string ice_ufrag_; > std::string ice_pwd_; > >+ bool pooled_ = false; >+ > // SetIceParameters is an implementation detail which only PortAllocator > // should be able to call. > friend class PortAllocator; >@@ -326,7 +331,7 @@ class PortAllocatorSession : public sigslot::has_slots<> { > // > // This allows a PortAllocator subclass to be constructed and configured on one > // thread, and passed into an object that uses it on a different thread. >-class PortAllocator : public sigslot::has_slots<> { >+class RTC_EXPORT PortAllocator : public sigslot::has_slots<> { > public: > PortAllocator(); > ~PortAllocator() override; >@@ -335,6 +340,11 @@ class PortAllocator : public sigslot::has_slots<> { > // constructing and configuring the PortAllocator subclasses. > virtual void Initialize(); > >+ // Set to true if some Ports need to know the ICE credentials when they are >+ // created. This will ensure that the PortAllocator will only match pooled >+ // allocator sessions to the ICE transport with the same credentials. 
>+ virtual void set_restrict_ice_credentials_change(bool value); >+ > // Set STUN and TURN servers to be used in future sessions, and set > // candidate pool size, as described in JSEP. > // >@@ -392,6 +402,8 @@ class PortAllocator : public sigslot::has_slots<> { > // > // Caller takes ownership of the returned session. > // >+ // If restrict_ice_credentials_change is TRUE, then it will only >+ // return a pooled session with matching ice credentials. > // If no pooled sessions are available, returns null. > std::unique_ptr<PortAllocatorSession> TakePooledSession( > const std::string& content_name, >@@ -399,8 +411,10 @@ class PortAllocator : public sigslot::has_slots<> { > const std::string& ice_ufrag, > const std::string& ice_pwd); > >- // Returns the next session that would be returned by TakePooledSession. >- const PortAllocatorSession* GetPooledSession() const; >+ // Returns the next session that would be returned by TakePooledSession >+ // optionally restricting it to sessions with specified ice credentials. >+ const PortAllocatorSession* GetPooledSession( >+ const IceParameters* ice_credentials = nullptr) const; > > // After FreezeCandidatePool is called, changing the candidate pool size will > // no longer be allowed, and changing ICE servers will not cause pooled >@@ -548,6 +562,9 @@ class PortAllocator : public sigslot::has_slots<> { > virtual void GetCandidateStatsFromPooledSessions( > CandidateStatsList* candidate_stats_list); > >+ // Return IceParameters of the pooled sessions. 
>+ std::vector<IceParameters> GetPooledIceCredentials(); >+ > protected: > virtual PortAllocatorSession* CreateSessionInternal( > const std::string& content_name, >@@ -555,7 +572,7 @@ class PortAllocator : public sigslot::has_slots<> { > const std::string& ice_ufrag, > const std::string& ice_pwd) = 0; > >- const std::deque<std::unique_ptr<PortAllocatorSession>>& pooled_sessions() { >+ const std::vector<std::unique_ptr<PortAllocatorSession>>& pooled_sessions() { > return pooled_sessions_; > } > >@@ -586,7 +603,7 @@ class PortAllocator : public sigslot::has_slots<> { > ServerAddresses stun_servers_; > std::vector<RelayServerConfig> turn_servers_; > int candidate_pool_size_ = 0; // Last value passed into SetConfiguration. >- std::deque<std::unique_ptr<PortAllocatorSession>> pooled_sessions_; >+ std::vector<std::unique_ptr<PortAllocatorSession>> pooled_sessions_; > bool candidate_pool_frozen_ = false; > bool prune_turn_ports_ = false; > >@@ -596,6 +613,15 @@ class PortAllocator : public sigslot::has_slots<> { > webrtc::TurnCustomizer* turn_customizer_ = nullptr; > > absl::optional<int> stun_candidate_keepalive_interval_; >+ >+ // If true, TakePooledSession() will only return sessions that has same ice >+ // credentials as requested. >+ bool restrict_ice_credentials_change_ = false; >+ >+ // Returns iterator to pooled session with specified ice_credentials or first >+ // if ice_credentials is nullptr. 
>+ std::vector<std::unique_ptr<PortAllocatorSession>>::const_iterator >+ FindPooledSession(const IceParameters* ice_credentials = nullptr) const; > }; > > } // namespace cricket >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/portallocator_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/portallocator_unittest.cc >index 3887a90db1f29ba6f08123bb13e2d8cbf5327f17..8b317f4d5a953c847332809b6e66c2031ba63600 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/portallocator_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/portallocator_unittest.cc >@@ -71,8 +71,7 @@ class PortAllocatorTest : public testing::Test, public sigslot::has_slots<> { > > int GetAllPooledSessionsReturnCount() { > int count = 0; >- while (GetPooledSession()) { >- TakePooledSession(); >+ while (TakePooledSession() != nullptr) { > ++count; > } > return count; >@@ -275,3 +274,29 @@ TEST_F(PortAllocatorTest, DiscardCandidatePool) { > allocator_->DiscardCandidatePool(); > EXPECT_EQ(0, GetAllPooledSessionsReturnCount()); > } >+ >+TEST_F(PortAllocatorTest, RestrictIceCredentialsChange) { >+ SetConfigurationWithPoolSize(1); >+ EXPECT_EQ(1, GetAllPooledSessionsReturnCount()); >+ allocator_->DiscardCandidatePool(); >+ >+ // Only return pooled sessions with the ice credentials that >+ // match those requested in TakePooledSession(). 
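The `RestrictIceCredentialsChange` behavior added above boils down to: when the flag is set, a pooled session is handed out only if the requested ufrag/pwd match, and each session can be taken exactly once. A miniature sketch of that lookup with hypothetical `Session`/`Pool` types (not the real allocator API):

```cpp
#include <cassert>
#include <memory>
#include <string>
#include <utility>
#include <vector>

// Hypothetical miniature of the credential-restricted pooled-session lookup.
struct Session {
  std::string ufrag;
  std::string pwd;
};

class Pool {
 public:
  void Add(std::string ufrag, std::string pwd) {
    sessions_.push_back(
        std::make_unique<Session>(Session{std::move(ufrag), std::move(pwd)}));
  }
  void set_restrict_credentials(bool v) { restrict_credentials_ = v; }

  // When unrestricted this behaves like "take any"; when restricted it only
  // matches sessions created with the requested credentials.
  std::unique_ptr<Session> Take(const std::string& ufrag,
                                const std::string& pwd) {
    for (auto it = sessions_.begin(); it != sessions_.end(); ++it) {
      if (!restrict_credentials_ ||
          ((*it)->ufrag == ufrag && (*it)->pwd == pwd)) {
        auto ret = std::move(*it);
        sessions_.erase(it);  // taken at most once, as the unit test checks
        return ret;
      }
    }
    return nullptr;
  }

 private:
  bool restrict_credentials_ = false;
  std::vector<std::unique_ptr<Session>> sessions_;
};
```

This mirrors the unit test: taking with the wrong credentials yields null, the matching credentials succeed once, and a second take of the same credentials yields null again.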
>+ allocator_->set_restrict_ice_credentials_change(true); >+ SetConfigurationWithPoolSize(1); >+ EXPECT_EQ(0, GetAllPooledSessionsReturnCount()); >+ allocator_->DiscardCandidatePool(); >+ >+ SetConfigurationWithPoolSize(1); >+ auto credentials = allocator_->GetPooledIceCredentials(); >+ ASSERT_EQ(1u, credentials.size()); >+ EXPECT_EQ(nullptr, >+ allocator_->TakePooledSession(kContentName, 0, kIceUfrag, kIcePwd)); >+ EXPECT_NE(nullptr, >+ allocator_->TakePooledSession(kContentName, 0, credentials[0].ufrag, >+ credentials[0].pwd)); >+ EXPECT_EQ(nullptr, >+ allocator_->TakePooledSession(kContentName, 0, credentials[0].ufrag, >+ credentials[0].pwd)); >+ allocator_->DiscardCandidatePool(); >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/pseudotcp.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/pseudotcp.cc >index 161122b6d5c73842b77521fc11e286c6d857c0a9..c74298584aee1af1bb2382d98cfb003cdf3741c4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/pseudotcp.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/pseudotcp.cc >@@ -24,7 +24,6 @@ > #include "rtc_base/logging.h" > #include "rtc_base/numerics/safe_minmax.h" > #include "rtc_base/socket.h" >-#include "rtc_base/stringutils.h" > #include "rtc_base/timeutils.h" > > // The following logging is for detailed (packet-level) analysis only. >@@ -186,8 +185,8 @@ void ReportStats() { > char buffer[256]; > size_t len = 0; > for (int i = 0; i < S_NUM_STATS; ++i) { >- len += rtc::sprintfn(buffer, arraysize(buffer), "%s%s:%d", >- (i == 0) ? "" : ",", STAT_NAMES[i], g_stats[i]); >+ len += snprintf(buffer, arraysize(buffer), "%s%s:%d", >+ (i == 0) ? 
"" : ",", STAT_NAMES[i], g_stats[i]); > g_stats[i] = 0; > } > RTC_LOG(LS_INFO) << "Stats[" << buffer << "]"; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/pseudotcp.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/pseudotcp.h >index c363ef64b74af9b1223ff7ca533179ae82c1acb4..5b26aeb21a5bce435a19857ca7d051fadd91370a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/pseudotcp.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/pseudotcp.h >@@ -14,6 +14,7 @@ > #include <list> > > #include "rtc_base/stream.h" >+#include "rtc_base/system/rtc_export.h" > > namespace cricket { > >@@ -45,7 +46,7 @@ class IPseudoTcpNotify { > // PseudoTcp > ////////////////////////////////////////////////////////////////////// > >-class PseudoTcp { >+class RTC_EXPORT PseudoTcp { > public: > static uint32_t Now(); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/pseudotcp_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/pseudotcp_unittest.cc >index dd86f921d52faa15692ca23a2498e0ca76fe1ff2..2fc9900cbcca5b1fbf4960bc15ea1b8490a82a80 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/pseudotcp_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/pseudotcp_unittest.cc >@@ -15,8 +15,8 @@ > #include "p2p/base/pseudotcp.h" > #include "rtc_base/gunit.h" > #include "rtc_base/helpers.h" >+#include "rtc_base/memory_stream.h" > #include "rtc_base/messagehandler.h" >-#include "rtc_base/stream.h" > #include "rtc_base/thread.h" > #include "rtc_base/timeutils.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/relayport.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/relayport.cc >index fb830ce66a943cd249dda40ad28e9dce4baa38f0..b1b924b80274c394242fce5da359d4632d6398b4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/relayport.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/relayport.cc >@@ -139,7 +139,7 @@ class RelayEntry : public 
rtc::MessageHandler, public sigslot::has_slots<> { > const char* data, > size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time); >+ const int64_t& packet_time_us); > > void OnSentPacket(rtc::AsyncPacketSocket* socket, > const rtc::SentPacket& sent_packet); >@@ -397,9 +397,9 @@ void RelayPort::OnReadPacket(const char* data, > size_t size, > const rtc::SocketAddress& remote_addr, > ProtocolType proto, >- const rtc::PacketTime& packet_time) { >+ int64_t packet_time_us) { > if (Connection* conn = GetConnection(remote_addr)) { >- conn->OnReadPacket(data, size, packet_time); >+ conn->OnReadPacket(data, size, packet_time_us); > } else { > Port::OnReadPacket(data, size, remote_addr, proto); > } >@@ -686,7 +686,7 @@ void RelayEntry::OnReadPacket(rtc::AsyncPacketSocket* socket, > const char* data, > size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time) { >+ const int64_t& packet_time_us) { > // RTC_DCHECK(remote_addr == port_->server_addr()); > // TODO(?): are we worried about this? > >@@ -700,7 +700,7 @@ void RelayEntry::OnReadPacket(rtc::AsyncPacketSocket* socket, > // by the server, The actual remote address is the one we recorded. > if (!port_->HasMagicCookie(data, size)) { > if (locked_) { >- port_->OnReadPacket(data, size, ext_addr_, PROTO_UDP, packet_time); >+ port_->OnReadPacket(data, size, ext_addr_, PROTO_UDP, packet_time_us); > } else { > RTC_LOG(WARNING) << "Dropping packet: entry not locked"; > } >@@ -753,7 +753,7 @@ void RelayEntry::OnReadPacket(rtc::AsyncPacketSocket* socket, > > // Process the actual data and remote address in the normal manner. 
> port_->OnReadPacket(data_attr->bytes(), data_attr->length(), remote_addr2, >- PROTO_UDP, packet_time); >+ PROTO_UDP, packet_time_us); > } > > void RelayEntry::OnSentPacket(rtc::AsyncPacketSocket* socket, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/relayport.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/relayport.h >index 48a939e0cfd10e8f7b2fdff7adcdc3013cd54dbb..5989f76e81f5f664fc1d8c63788381379189388f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/relayport.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/relayport.h >@@ -12,10 +12,12 @@ > #define P2P_BASE_RELAYPORT_H_ > > #include <deque> >+#include <memory> > #include <string> > #include <utility> > #include <vector> > >+#include "absl/memory/memory.h" > #include "p2p/base/port.h" > #include "p2p/base/stunrequest.h" > >@@ -35,15 +37,16 @@ class RelayPort : public Port { > typedef std::pair<rtc::Socket::Option, int> OptionValue; > > // RelayPort doesn't yet do anything fancy in the ctor. >- static RelayPort* Create(rtc::Thread* thread, >- rtc::PacketSocketFactory* factory, >- rtc::Network* network, >- uint16_t min_port, >- uint16_t max_port, >- const std::string& username, >- const std::string& password) { >- return new RelayPort(thread, factory, network, min_port, max_port, username, >- password); >+ static std::unique_ptr<RelayPort> Create(rtc::Thread* thread, >+ rtc::PacketSocketFactory* factory, >+ rtc::Network* network, >+ uint16_t min_port, >+ uint16_t max_port, >+ const std::string& username, >+ const std::string& password) { >+ // Using `new` to access a non-public constructor. 
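The `RelayPort::Create` change above keeps the constructor non-public while returning `std::unique_ptr`: `std::make_unique` cannot reach a private constructor, so the factory calls `new` itself and wraps the result (`absl::WrapUnique` is essentially `std::unique_ptr<T>(ptr)` with `T` deduced). A sketch of the same pattern without Abseil, using a hypothetical class name:

```cpp
#include <cassert>
#include <memory>
#include <string>
#include <utility>

// Hypothetical class illustrating the factory pattern from the hunk above.
class RelayishPort {
 public:
  static std::unique_ptr<RelayishPort> Create(const std::string& username) {
    // Using `new` to access the non-public constructor; wrap immediately so
    // ownership is never held as a raw pointer.
    return std::unique_ptr<RelayishPort>(new RelayishPort(username));
  }
  const std::string& username() const { return username_; }

 private:
  explicit RelayishPort(std::string username)
      : username_(std::move(username)) {}
  std::string username_;
};
```

Keeping the constructor private forces all callers through `Create()`, so every live instance is guaranteed to be uniquely owned.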
>+ return absl::WrapUnique(new RelayPort(thread, factory, network, min_port, >+ max_port, username, password)); > } > ~RelayPort() override; > >@@ -92,7 +95,7 @@ class RelayPort : public Port { > size_t size, > const rtc::SocketAddress& remote_addr, > ProtocolType proto, >- const rtc::PacketTime& packet_time); >+ int64_t packet_time_us); > > // The OnSentPacket callback is left empty here since they are handled by > // RelayEntry. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/relayport_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/relayport_unittest.cc >index e6e922fdaf77fea7d0443474a7392e0d06badfff..1ca69f4ef96ce20bf98eac37f0263e60e144e1e5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/relayport_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/relayport_unittest.cc >@@ -62,10 +62,10 @@ class RelayPortTest : public testing::Test, public sigslot::has_slots<> { > } > > void OnReadPacket(rtc::AsyncPacketSocket* socket, >- const char* data, >- size_t size, >- const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time) { >+ const char* /* data */, >+ size_t /* size */, >+ const rtc::SocketAddress& /* remote_addr */, >+ const int64_t& /* packet_time_us */) { > received_packet_count_[socket]++; > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/relayserver.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/relayserver.cc >index 8e17a79db8d444d56c33dfffc748ce6bea43bbde..ad6b487282e148c0927457c7de19013edf5b84dd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/relayserver.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/relayserver.cc >@@ -115,7 +115,7 @@ void RelayServer::AddInternalSocket(rtc::AsyncPacketSocket* socket) { > } > > void RelayServer::RemoveInternalSocket(rtc::AsyncPacketSocket* socket) { >- SocketList::iterator iter = >+ auto iter = > std::find(internal_sockets_.begin(), internal_sockets_.end(), socket); > 
RTC_DCHECK(iter != internal_sockets_.end()); > internal_sockets_.erase(iter); >@@ -132,7 +132,7 @@ void RelayServer::AddExternalSocket(rtc::AsyncPacketSocket* socket) { > } > > void RelayServer::RemoveExternalSocket(rtc::AsyncPacketSocket* socket) { >- SocketList::iterator iter = >+ auto iter = > std::find(external_sockets_.begin(), external_sockets_.end(), socket); > RTC_DCHECK(iter != external_sockets_.end()); > external_sockets_.erase(iter); >@@ -148,7 +148,7 @@ void RelayServer::AddInternalServerSocket(rtc::AsyncSocket* socket, > } > > void RelayServer::RemoveInternalServerSocket(rtc::AsyncSocket* socket) { >- ServerSocketMap::iterator iter = server_sockets_.find(socket); >+ auto iter = server_sockets_.find(socket); > RTC_DCHECK(iter != server_sockets_.end()); > server_sockets_.erase(iter); > socket->SignalReadEvent.disconnect(this); >@@ -160,10 +160,9 @@ int RelayServer::GetConnectionCount() const { > > rtc::SocketAddressPair RelayServer::GetConnection(int connection) const { > int i = 0; >- for (ConnectionMap::const_iterator it = connections_.begin(); >- it != connections_.end(); ++it) { >+ for (const auto& entry : connections_) { > if (i == connection) { >- return it->second->addr_pair(); >+ return entry.second->addr_pair(); > } > ++i; > } >@@ -171,9 +170,8 @@ rtc::SocketAddressPair RelayServer::GetConnection(int connection) const { > } > > bool RelayServer::HasConnection(const rtc::SocketAddress& address) const { >- for (ConnectionMap::const_iterator it = connections_.begin(); >- it != connections_.end(); ++it) { >- if (it->second->addr_pair().destination() == address) { >+ for (const auto& entry : connections_) { >+ if (entry.second->addr_pair().destination() == address) { > return true; > } > } >@@ -189,14 +187,14 @@ void RelayServer::OnInternalPacket(rtc::AsyncPacketSocket* socket, > const char* bytes, > size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time) { >+ const int64_t& /* packet_time_us */) { > // Get the 
address of the connection we just received on. > rtc::SocketAddressPair ap(remote_addr, socket->GetLocalAddress()); > RTC_DCHECK(!ap.destination().IsNil()); > > // If this did not come from an existing connection, it should be a STUN > // allocate request. >- ConnectionMap::iterator piter = connections_.find(ap); >+ auto piter = connections_.find(ap); > if (piter == connections_.end()) { > HandleStunAllocate(bytes, size, ap, socket); > return; >@@ -234,13 +232,13 @@ void RelayServer::OnExternalPacket(rtc::AsyncPacketSocket* socket, > const char* bytes, > size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time) { >+ const int64_t& /* packet_time_us */) { > // Get the address of the connection we just received on. > rtc::SocketAddressPair ap(remote_addr, socket->GetLocalAddress()); > RTC_DCHECK(!ap.destination().IsNil()); > > // If this connection already exists, then forward the traffic. >- ConnectionMap::iterator piter = connections_.find(ap); >+ auto piter = connections_.find(ap); > if (piter != connections_.end()) { > // TODO(?): Check the HMAC. > RelayServerConnection* ext_conn = piter->second; >@@ -276,7 +274,7 @@ void RelayServer::OnExternalPacket(rtc::AsyncPacketSocket* socket, > // TODO(?): Check the HMAC. > > // The binding should already be present. 
>- BindingMap::iterator biter = bindings_.find(username); >+ auto biter = bindings_.find(username); > if (biter == bindings_.end()) { > RTC_LOG(LS_WARNING) << "Dropping packet: no binding with username"; > return; >@@ -351,7 +349,7 @@ void RelayServer::HandleStunAllocate(const char* bytes, > > RelayServerBinding* binding; > >- BindingMap::iterator biter = bindings_.find(username); >+ auto biter = bindings_.find(username); > if (biter != bindings_.end()) { > binding = biter->second; > } else { >@@ -514,13 +512,13 @@ void RelayServer::AddConnection(RelayServerConnection* conn) { > } > > void RelayServer::RemoveConnection(RelayServerConnection* conn) { >- ConnectionMap::iterator iter = connections_.find(conn->addr_pair()); >+ auto iter = connections_.find(conn->addr_pair()); > RTC_DCHECK(iter != connections_.end()); > connections_.erase(iter); > } > > void RelayServer::RemoveBinding(RelayServerBinding* binding) { >- BindingMap::iterator iter = bindings_.find(binding->username()); >+ auto iter = bindings_.find(binding->username()); > RTC_DCHECK(iter != bindings_.end()); > bindings_.erase(iter); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/relayserver.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/relayserver.h >index c7661204f8199099064bbcbb13ef470ed3ae25b3..5ba5e065a86812058223efca5188c62c98a79e2b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/relayserver.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/relayserver.h >@@ -68,33 +68,27 @@ class RelayServer : public rtc::MessageHandler, public sigslot::has_slots<> { > bool HasConnection(const rtc::SocketAddress& address) const; > > private: >- typedef std::vector<rtc::AsyncPacketSocket*> SocketList; >- typedef std::map<rtc::AsyncSocket*, cricket::ProtocolType> ServerSocketMap; >- typedef std::map<std::string, RelayServerBinding*> BindingMap; >- typedef std::map<rtc::SocketAddressPair, RelayServerConnection*> >- ConnectionMap; >- > rtc::Thread* thread_; > 
webrtc::Random random_; > bool log_bindings_; >- SocketList internal_sockets_; >- SocketList external_sockets_; >- SocketList removed_sockets_; >- ServerSocketMap server_sockets_; >- BindingMap bindings_; >- ConnectionMap connections_; >+ std::vector<rtc::AsyncPacketSocket*> internal_sockets_; >+ std::vector<rtc::AsyncPacketSocket*> external_sockets_; >+ std::vector<rtc::AsyncPacketSocket*> removed_sockets_; >+ std::map<rtc::AsyncSocket*, cricket::ProtocolType> server_sockets_; >+ std::map<std::string, RelayServerBinding*> bindings_; >+ std::map<rtc::SocketAddressPair, RelayServerConnection*> connections_; > > // Called when a packet is received by the server on one of its sockets. > void OnInternalPacket(rtc::AsyncPacketSocket* socket, > const char* bytes, > size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time); >+ const int64_t& packet_time_us); > void OnExternalPacket(rtc::AsyncPacketSocket* socket, > const char* bytes, > size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time); >+ const int64_t& packet_time_us); > > void OnReadEvent(rtc::AsyncSocket* socket); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stun.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stun.cc >index cc9fb4e4c090c33b57c999badf0696812b77b942..fa6b6f832e08575a98a320f815f260b050aacce9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stun.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stun.cc >@@ -22,7 +22,6 @@ > #include "rtc_base/crc32.h" > #include "rtc_base/logging.h" > #include "rtc_base/messagedigest.h" >-#include "rtc_base/stringencode.h" > > using rtc::ByteBufferReader; > using rtc::ByteBufferWriter; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stunport.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stunport.cc >index c60bf92eb4920c3defdc2c51d60029f2ccef3e2f..727312329ee4d09967be717dd8b640c998700f76 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stunport.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stunport.cc >@@ -319,9 +319,9 @@ bool UDPPort::HandleIncomingPacket(rtc::AsyncPacketSocket* socket, > const char* data, > size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time) { >+ int64_t packet_time_us) { > // All packets given to UDP port will be consumed. >- OnReadPacket(socket, data, size, remote_addr, packet_time); >+ OnReadPacket(socket, data, size, remote_addr, packet_time_us); > return true; > } > >@@ -358,11 +358,15 @@ void UDPPort::OnLocalAddressReady(rtc::AsyncPacketSocket* socket, > MaybePrepareStunCandidate(); > } > >+void UDPPort::PostAddAddress(bool is_final) { >+ MaybeSetPortCompleteOrError(); >+} >+ > void UDPPort::OnReadPacket(rtc::AsyncPacketSocket* socket, > const char* data, > size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time) { >+ const int64_t& packet_time_us) { > RTC_DCHECK(socket == socket_); > RTC_DCHECK(!remote_addr.IsUnresolvedIP()); > >@@ -376,7 +380,7 @@ void UDPPort::OnReadPacket(rtc::AsyncPacketSocket* socket, > } > > if (Connection* conn = GetConnection(remote_addr)) { >- conn->OnReadPacket(data, size, packet_time); >+ conn->OnReadPacket(data, size, packet_time_us); > } else { > Port::OnReadPacket(data, size, remote_addr, PROTO_UDP); > } >@@ -517,8 +521,14 @@ void UDPPort::OnStunBindingOrResolveRequestFailed( > } > > void UDPPort::MaybeSetPortCompleteOrError() { >- if (ready_) >+ if (mdns_name_registration_status() == >+ MdnsNameRegistrationStatus::kInProgress) { > return; >+ } >+ >+ if (ready_) { >+ return; >+ } > > // Do not set port ready if we are still waiting for bind responses. 
> const size_t servers_done_bind_request = >@@ -563,22 +573,24 @@ bool UDPPort::HasCandidateWithAddress(const rtc::SocketAddress& addr) const { > return false; > } > >-StunPort* StunPort::Create(rtc::Thread* thread, >- rtc::PacketSocketFactory* factory, >- rtc::Network* network, >- uint16_t min_port, >- uint16_t max_port, >- const std::string& username, >- const std::string& password, >- const ServerAddresses& servers, >- const std::string& origin, >- absl::optional<int> stun_keepalive_interval) { >- StunPort* port = new StunPort(thread, factory, network, min_port, max_port, >- username, password, servers, origin); >+std::unique_ptr<StunPort> StunPort::Create( >+ rtc::Thread* thread, >+ rtc::PacketSocketFactory* factory, >+ rtc::Network* network, >+ uint16_t min_port, >+ uint16_t max_port, >+ const std::string& username, >+ const std::string& password, >+ const ServerAddresses& servers, >+ const std::string& origin, >+ absl::optional<int> stun_keepalive_interval) { >+ // Using `new` to access a non-public constructor. 
>+ auto port = absl::WrapUnique(new StunPort(thread, factory, network, min_port, >+ max_port, username, password, >+ servers, origin)); > port->set_stun_keepalive_delay(stun_keepalive_interval); > if (!port->Init()) { >- delete port; >- port = NULL; >+ return nullptr; > } > return port; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stunport.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stunport.h >index 5e397835dbeb6518669fd387427ceaf974e4fe90..ca43cbc8c483d0749193c6b1e44894160d4cff15 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stunport.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stunport.h >@@ -15,6 +15,7 @@ > #include <memory> > #include <string> > >+#include "absl/memory/memory.h" > #include "p2p/base/port.h" > #include "p2p/base/stunrequest.h" > #include "rtc_base/asyncpacketsocket.h" >@@ -35,42 +36,45 @@ static const int HIGH_COST_PORT_KEEPALIVE_LIFETIME = 2 * 60 * 1000; > // Communicates using the address on the outside of a NAT. > class UDPPort : public Port { > public: >- static UDPPort* Create(rtc::Thread* thread, >- rtc::PacketSocketFactory* factory, >- rtc::Network* network, >- rtc::AsyncPacketSocket* socket, >- const std::string& username, >- const std::string& password, >- const std::string& origin, >- bool emit_local_for_anyaddress, >- absl::optional<int> stun_keepalive_interval) { >- UDPPort* port = new UDPPort(thread, factory, network, socket, username, >- password, origin, emit_local_for_anyaddress); >+ static std::unique_ptr<UDPPort> Create( >+ rtc::Thread* thread, >+ rtc::PacketSocketFactory* factory, >+ rtc::Network* network, >+ rtc::AsyncPacketSocket* socket, >+ const std::string& username, >+ const std::string& password, >+ const std::string& origin, >+ bool emit_local_for_anyaddress, >+ absl::optional<int> stun_keepalive_interval) { >+ // Using `new` to access a non-public constructor. 
>+ auto port = absl::WrapUnique(new UDPPort(thread, factory, network, socket, >+ username, password, origin, >+ emit_local_for_anyaddress)); > port->set_stun_keepalive_delay(stun_keepalive_interval); > if (!port->Init()) { >- delete port; >- port = NULL; >+ return nullptr; > } > return port; > } > >- static UDPPort* Create(rtc::Thread* thread, >- rtc::PacketSocketFactory* factory, >- rtc::Network* network, >- uint16_t min_port, >- uint16_t max_port, >- const std::string& username, >- const std::string& password, >- const std::string& origin, >- bool emit_local_for_anyaddress, >- absl::optional<int> stun_keepalive_interval) { >- UDPPort* port = >+ static std::unique_ptr<UDPPort> Create( >+ rtc::Thread* thread, >+ rtc::PacketSocketFactory* factory, >+ rtc::Network* network, >+ uint16_t min_port, >+ uint16_t max_port, >+ const std::string& username, >+ const std::string& password, >+ const std::string& origin, >+ bool emit_local_for_anyaddress, >+ absl::optional<int> stun_keepalive_interval) { >+ // Using `new` to access a non-public constructor. 
>+ auto port = absl::WrapUnique( > new UDPPort(thread, factory, network, min_port, max_port, username, >- password, origin, emit_local_for_anyaddress); >+ password, origin, emit_local_for_anyaddress)); > port->set_stun_keepalive_delay(stun_keepalive_interval); > if (!port->Init()) { >- delete port; >- port = NULL; >+ return nullptr; > } > return port; > } >@@ -100,7 +104,7 @@ class UDPPort : public Port { > const char* data, > size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time) override; >+ int64_t packet_time_us) override; > > bool SupportsProtocol(const std::string& protocol) const override; > ProtocolType GetProtocol() const override; >@@ -156,10 +160,14 @@ class UDPPort : public Port { > > void OnLocalAddressReady(rtc::AsyncPacketSocket* socket, > const rtc::SocketAddress& address); >+ >+ void PostAddAddress(bool is_final) override; >+ > void OnReadPacket(rtc::AsyncPacketSocket* socket, >- const char* data, size_t size, >+ const char* data, >+ size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time); >+ const int64_t& packet_time_us); > > void OnSentPacket(rtc::AsyncPacketSocket* socket, > const rtc::SentPacket& sent_packet) override; >@@ -262,16 +270,17 @@ class UDPPort : public Port { > > class StunPort : public UDPPort { > public: >- static StunPort* Create(rtc::Thread* thread, >- rtc::PacketSocketFactory* factory, >- rtc::Network* network, >- uint16_t min_port, >- uint16_t max_port, >- const std::string& username, >- const std::string& password, >- const ServerAddresses& servers, >- const std::string& origin, >- absl::optional<int> stun_keepalive_interval); >+ static std::unique_ptr<StunPort> Create( >+ rtc::Thread* thread, >+ rtc::PacketSocketFactory* factory, >+ rtc::Network* network, >+ uint16_t min_port, >+ uint16_t max_port, >+ const std::string& username, >+ const std::string& password, >+ const ServerAddresses& servers, >+ const std::string& origin, >+ absl::optional<int> 
stun_keepalive_interval); > > void PrepareAddress() override; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stunport_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stunport_unittest.cc >index 29754e861c938a41a7ea8fbc867cafc3f0f7b1d2..01518a9468364d10ec341dc307e83b5a282a01cd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stunport_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stunport_unittest.cc >@@ -75,10 +75,10 @@ class StunPortTestBase : public testing::Test, public sigslot::has_slots<> { > } > > void CreateStunPort(const ServerAddresses& stun_servers) { >- stun_port_.reset(cricket::StunPort::Create( >+ stun_port_ = cricket::StunPort::Create( > rtc::Thread::Current(), &socket_factory_, &network_, 0, 0, > rtc::CreateRandomString(16), rtc::CreateRandomString(22), stun_servers, >- std::string(), absl::nullopt)); >+ std::string(), absl::nullopt); > stun_port_->set_stun_keepalive_delay(stun_keepalive_delay_); > // If |stun_keepalive_lifetime_| is negative, let the stun port > // choose its lifetime from the network type. 
>@@ -100,10 +100,10 @@ class StunPortTestBase : public testing::Test, public sigslot::has_slots<> { > } > ASSERT_TRUE(socket_ != NULL); > socket_->SignalReadPacket.connect(this, &StunPortTestBase::OnReadPacket); >- stun_port_.reset(cricket::UDPPort::Create( >+ stun_port_ = cricket::UDPPort::Create( > rtc::Thread::Current(), &socket_factory_, &network_, socket_.get(), > rtc::CreateRandomString(16), rtc::CreateRandomString(22), std::string(), >- false, absl::nullopt)); >+ false, absl::nullopt); > ASSERT_TRUE(stun_port_ != NULL); > ServerAddresses stun_servers; > stun_servers.insert(server_addr); >@@ -119,15 +119,15 @@ class StunPortTestBase : public testing::Test, public sigslot::has_slots<> { > const char* data, > size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time) { >+ const int64_t& /* packet_time_us */) { > stun_port_->HandleIncomingPacket(socket, data, size, remote_addr, >- rtc::PacketTime()); >+ /* packet_time_us */ -1); > } > > void SendData(const char* data, size_t len) { > stun_port_->HandleIncomingPacket(socket_.get(), data, len, > rtc::SocketAddress("22.22.22.22", 0), >- rtc::PacketTime()); >+ /* packet_time_us */ -1); > } > > protected: >@@ -216,7 +216,7 @@ TEST_F(StunPortTest, TestPrepareAddressFail) { > > // Test that we can get an address from a STUN server specified by a hostname. 
> // Crashes on Linux, see webrtc:7416 >-#if defined(WEBRTC_LINUX) >+#if defined(WEBRTC_LINUX) || defined(WEBRTC_WIN) > #define MAYBE_TestPrepareAddressHostname DISABLED_TestPrepareAddressHostname > #else > #define MAYBE_TestPrepareAddressHostname TestPrepareAddressHostname >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stunrequest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stunrequest.cc >index 56410bea089d9b269e2ffe216f7cc9ad4477724c..edba4d68e54bc20a0ef3fd466a84e5c9d3632fce 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stunrequest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stunrequest.cc >@@ -18,7 +18,7 @@ > #include "rtc_base/checks.h" > #include "rtc_base/helpers.h" > #include "rtc_base/logging.h" >-#include "rtc_base/stringencode.h" >+#include "rtc_base/timeutils.h" // For TimeMillis > > namespace cricket { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stunserver.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stunserver.cc >index 00eacf17bc9e19a6aebe7efc2cee615da4e6a44d..95f0cebe0a70d88cf039c855aff2544182156bac 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stunserver.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stunserver.cc >@@ -29,7 +29,7 @@ void StunServer::OnPacket(rtc::AsyncPacketSocket* socket, > const char* buf, > size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time) { >+ const int64_t& /* packet_time_us */) { > // Parse the STUN message; eat any messages that fail to parse. 
> rtc::ByteBufferReader bbuf(buf, size); > StunMessage msg; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stunserver.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stunserver.h >index 9016d06c20aca8d91860969b1b6a1a559bdc88fa..0061d382bc1492524064a23f720f8c2bd3b72645 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stunserver.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/stunserver.h >@@ -33,7 +33,7 @@ class StunServer : public sigslot::has_slots<> { > const char* buf, > size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time); >+ const int64_t& packet_time_us); > > // Handlers for the different types of STUN/TURN requests: > virtual void OnBindingRequest(StunMessage* msg, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/tcpport.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/tcpport.cc >index c66c7307375e03fb8974b90a0ef2bf92084b7f83..0d7aea9d60a483f15c2511c1a0a05f8c792dbd80 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/tcpport.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/tcpport.cc >@@ -307,7 +307,7 @@ void TCPPort::OnReadPacket(rtc::AsyncPacketSocket* socket, > const char* data, > size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time) { >+ const int64_t& packet_time_us) { > Port::OnReadPacket(data, size, remote_addr, PROTO_TCP); > } > >@@ -545,9 +545,9 @@ void TCPConnection::OnReadPacket(rtc::AsyncPacketSocket* socket, > const char* data, > size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time) { >+ const int64_t& packet_time_us) { > RTC_DCHECK(socket == socket_.get()); >- Connection::OnReadPacket(data, size, packet_time); >+ Connection::OnReadPacket(data, size, packet_time_us); > } > > void TCPConnection::OnReadyToSend(rtc::AsyncPacketSocket* socket) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/tcpport.h 
b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/tcpport.h >index fa5c958c21c263bbb4b1ea019b8a170bfa7ad697..4d06a6507c353421eacce9132a9e31dd1628c091 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/tcpport.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/tcpport.h >@@ -15,6 +15,7 @@ > #include <memory> > #include <string> > >+#include "absl/memory/memory.h" > #include "p2p/base/port.h" > #include "rtc_base/asyncpacketsocket.h" > >@@ -30,16 +31,18 @@ class TCPConnection; > // call this TCPPort::OnReadPacket (3 arg) to dispatch to a connection. > class TCPPort : public Port { > public: >- static TCPPort* Create(rtc::Thread* thread, >- rtc::PacketSocketFactory* factory, >- rtc::Network* network, >- uint16_t min_port, >- uint16_t max_port, >- const std::string& username, >- const std::string& password, >- bool allow_listen) { >- return new TCPPort(thread, factory, network, min_port, max_port, username, >- password, allow_listen); >+ static std::unique_ptr<TCPPort> Create(rtc::Thread* thread, >+ rtc::PacketSocketFactory* factory, >+ rtc::Network* network, >+ uint16_t min_port, >+ uint16_t max_port, >+ const std::string& username, >+ const std::string& password, >+ bool allow_listen) { >+ // Using `new` to access a non-public constructor. 
>+ return absl::WrapUnique(new TCPPort(thread, factory, network, min_port, >+ max_port, username, password, >+ allow_listen)); > } > ~TCPPort() override; > >@@ -91,7 +94,7 @@ class TCPPort : public Port { > const char* data, > size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time); >+ const int64_t& packet_time_us); > > void OnSentPacket(rtc::AsyncPacketSocket* socket, > const rtc::SentPacket& sent_packet) override; >@@ -159,7 +162,7 @@ class TCPConnection : public Connection { > const char* data, > size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time); >+ const int64_t& packet_time_us); > void OnReadyToSend(rtc::AsyncPacketSocket* socket); > > std::unique_ptr<rtc::AsyncPacketSocket> socket_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/tcpport_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/tcpport_unittest.cc >index ef302f9026032d04320bd1cc08efe700598054ee..34385a7f9bee694084cdb5b6364ddee61cd7301c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/tcpport_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/tcpport_unittest.cc >@@ -14,6 +14,7 @@ > #include "p2p/base/basicpacketsocketfactory.h" > #include "p2p/base/tcpport.h" > #include "rtc_base/gunit.h" >+#include "rtc_base/helpers.h" > #include "rtc_base/thread.h" > #include "rtc_base/virtualsocketserver.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/transportdescription.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/transportdescription.cc >index ad29a1199e7d531b087d0c0a26cce8e91db49913..377a4c3eee327854166e22e0a7dae4196c95aa2f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/transportdescription.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/transportdescription.cc >@@ -10,9 +10,9 @@ > > #include "p2p/base/transportdescription.h" > >+#include "absl/strings/match.h" > #include "p2p/base/p2pconstants.h" > 
#include "rtc_base/arraysize.h" >-#include "rtc_base/stringutils.h" > > namespace cricket { > >@@ -22,7 +22,7 @@ bool StringToConnectionRole(const std::string& role_str, ConnectionRole* role) { > CONNECTIONROLE_ACTPASS_STR, CONNECTIONROLE_HOLDCONN_STR}; > > for (size_t i = 0; i < arraysize(roles); ++i) { >- if (_stricmp(roles[i], role_str.c_str()) == 0) { >+ if (absl::EqualsIgnoreCase(roles[i], role_str)) { > *role = static_cast<ConnectionRole>(CONNECTIONROLE_ACTIVE + i); > return true; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/transportdescription.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/transportdescription.h >index 3bffdf971a0e936f20081163c9893fcfbddc4ddc..2ab973278faf72aa6034d883483d0e857730ffeb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/transportdescription.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/transportdescription.h >@@ -67,11 +67,13 @@ struct IceParameters { > bool ice_renomination) > : ufrag(ice_ufrag), pwd(ice_pwd), renomination(ice_renomination) {} > >- bool operator==(const IceParameters& other) { >+ bool operator==(const IceParameters& other) const { > return ufrag == other.ufrag && pwd == other.pwd && > renomination == other.renomination; > } >- bool operator!=(const IceParameters& other) { return !(*this == other); } >+ bool operator!=(const IceParameters& other) const { >+ return !(*this == other); >+ } > }; > > extern const char CONNECTIONROLE_ACTIVE_STR[]; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/transportdescriptionfactory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/transportdescriptionfactory.cc >index 618726e841164c096ae05089c54d15ec1745d046..689cd4f7a2d0bdc7f051f3a44ca750a96afe60e6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/transportdescriptionfactory.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/transportdescriptionfactory.cc >@@ -27,13 +27,15 @@ 
TransportDescriptionFactory::~TransportDescriptionFactory() = default; > > TransportDescription* TransportDescriptionFactory::CreateOffer( > const TransportOptions& options, >- const TransportDescription* current_description) const { >+ const TransportDescription* current_description, >+ IceCredentialsIterator* ice_credentials) const { > std::unique_ptr<TransportDescription> desc(new TransportDescription()); > > // Generate the ICE credentials if we don't already have them. > if (!current_description || options.ice_restart) { >- desc->ice_ufrag = rtc::CreateRandomString(ICE_UFRAG_LENGTH); >- desc->ice_pwd = rtc::CreateRandomString(ICE_PWD_LENGTH); >+ IceParameters credentials = ice_credentials->GetIceCredentials(); >+ desc->ice_ufrag = credentials.ufrag; >+ desc->ice_pwd = credentials.pwd; > } else { > desc->ice_ufrag = current_description->ice_ufrag; > desc->ice_pwd = current_description->ice_pwd; >@@ -59,7 +61,8 @@ TransportDescription* TransportDescriptionFactory::CreateAnswer( > const TransportDescription* offer, > const TransportOptions& options, > bool require_transport_attributes, >- const TransportDescription* current_description) const { >+ const TransportDescription* current_description, >+ IceCredentialsIterator* ice_credentials) const { > // TODO(juberti): Figure out why we get NULL offers, and fix this upstream. > if (!offer) { > RTC_LOG(LS_WARNING) << "Failed to create TransportDescription answer " >@@ -71,8 +74,9 @@ TransportDescription* TransportDescriptionFactory::CreateAnswer( > // Generate the ICE credentials if we don't already have them or ice is > // being restarted. 
> if (!current_description || options.ice_restart) { >- desc->ice_ufrag = rtc::CreateRandomString(ICE_UFRAG_LENGTH); >- desc->ice_pwd = rtc::CreateRandomString(ICE_PWD_LENGTH); >+ IceParameters credentials = ice_credentials->GetIceCredentials(); >+ desc->ice_ufrag = credentials.ufrag; >+ desc->ice_pwd = credentials.pwd; > } else { > desc->ice_ufrag = current_description->ice_ufrag; > desc->ice_pwd = current_description->ice_pwd; >@@ -116,8 +120,8 @@ bool TransportDescriptionFactory::SetSecurityInfo(TransportDescription* desc, > // This digest algorithm is used to produce the a=fingerprint lines in SDP. > // RFC 4572 Section 5 requires that those lines use the same hash function as > // the certificate's signature, which is what CreateFromCertificate does. >- desc->identity_fingerprint.reset( >- rtc::SSLFingerprint::CreateFromCertificate(certificate_)); >+ desc->identity_fingerprint = >+ rtc::SSLFingerprint::CreateFromCertificate(*certificate_); > if (!desc->identity_fingerprint) { > return false; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/transportdescriptionfactory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/transportdescriptionfactory.h >index 937c5fa1fea4a21faaccbf5e5195e32bf2d10b3e..dc1476a80f5071b9068c014c27fd134cd4dd972e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/transportdescriptionfactory.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/transportdescriptionfactory.h >@@ -11,6 +11,7 @@ > #ifndef P2P_BASE_TRANSPORTDESCRIPTIONFACTORY_H_ > #define P2P_BASE_TRANSPORTDESCRIPTIONFACTORY_H_ > >+#include "p2p/base/icecredentialsiterator.h" > #include "p2p/base/transportdescription.h" > #include "rtc_base/rtccertificate.h" > >@@ -54,7 +55,8 @@ class TransportDescriptionFactory { > // Creates a transport description suitable for use in an offer. 
> TransportDescription* CreateOffer( > const TransportOptions& options, >- const TransportDescription* current_description) const; >+ const TransportDescription* current_description, >+ IceCredentialsIterator* ice_credentials) const; > // Create a transport description that is a response to an offer. > // > // If |require_transport_attributes| is true, then TRANSPORT category >@@ -66,7 +68,8 @@ class TransportDescriptionFactory { > const TransportDescription* offer, > const TransportOptions& options, > bool require_transport_attributes, >- const TransportDescription* current_description) const; >+ const TransportDescription* current_description, >+ IceCredentialsIterator* ice_credentials) const; > > private: > bool SetSecurityInfo(TransportDescription* description, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/transportdescriptionfactory_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/transportdescriptionfactory_unittest.cc >index a7c34b56a04d35c6df90b08dae943cf879351ded..c46630ace2192f1586a76ce3ea6370e96c16cbc7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/transportdescriptionfactory_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/transportdescriptionfactory_unittest.cc >@@ -26,7 +26,8 @@ using cricket::TransportOptions; > class TransportDescriptionFactoryTest : public testing::Test { > public: > TransportDescriptionFactoryTest() >- : cert1_(rtc::RTCCertificate::Create(std::unique_ptr<rtc::SSLIdentity>( >+ : ice_credentials_({}), >+ cert1_(rtc::RTCCertificate::Create(std::unique_ptr<rtc::SSLIdentity>( > new rtc::FakeSSLIdentity("User1")))), > cert2_(rtc::RTCCertificate::Create(std::unique_ptr<rtc::SSLIdentity>( > new rtc::FakeSSLIdentity("User2")))) {} >@@ -64,21 +65,22 @@ class TransportDescriptionFactoryTest : public testing::Test { > SetDtls(dtls); > cricket::TransportOptions options; > // The initial offer / answer exchange. 
>- std::unique_ptr<TransportDescription> offer(f1_.CreateOffer(options, NULL)); >+ std::unique_ptr<TransportDescription> offer( >+ f1_.CreateOffer(options, NULL, &ice_credentials_)); > std::unique_ptr<TransportDescription> answer( >- f2_.CreateAnswer(offer.get(), options, true, NULL)); >+ f2_.CreateAnswer(offer.get(), options, true, NULL, &ice_credentials_)); > > // Create an updated offer where we restart ice. > options.ice_restart = true; > std::unique_ptr<TransportDescription> restart_offer( >- f1_.CreateOffer(options, offer.get())); >+ f1_.CreateOffer(options, offer.get(), &ice_credentials_)); > > VerifyUfragAndPasswordChanged(dtls, offer.get(), restart_offer.get()); > > // Create a new answer. The transport ufrag and password is changed since > // |options.ice_restart == true| >- std::unique_ptr<TransportDescription> restart_answer( >- f2_.CreateAnswer(restart_offer.get(), options, true, answer.get())); >+ std::unique_ptr<TransportDescription> restart_answer(f2_.CreateAnswer( >+ restart_offer.get(), options, true, answer.get(), &ice_credentials_)); > ASSERT_TRUE(restart_answer.get() != NULL); > > VerifyUfragAndPasswordChanged(dtls, answer.get(), restart_answer.get()); >@@ -108,19 +110,20 @@ class TransportDescriptionFactoryTest : public testing::Test { > cricket::TransportOptions options; > // The initial offer / answer exchange. 
> std::unique_ptr<TransportDescription> offer( >- f1_.CreateOffer(options, nullptr)); >- std::unique_ptr<TransportDescription> answer( >- f2_.CreateAnswer(offer.get(), options, true, nullptr)); >+ f1_.CreateOffer(options, nullptr, &ice_credentials_)); >+ std::unique_ptr<TransportDescription> answer(f2_.CreateAnswer( >+ offer.get(), options, true, nullptr, &ice_credentials_)); > VerifyRenomination(offer.get(), false); > VerifyRenomination(answer.get(), false); > > options.enable_ice_renomination = true; > std::unique_ptr<TransportDescription> renomination_offer( >- f1_.CreateOffer(options, offer.get())); >+ f1_.CreateOffer(options, offer.get(), &ice_credentials_)); > VerifyRenomination(renomination_offer.get(), true); > >- std::unique_ptr<TransportDescription> renomination_answer(f2_.CreateAnswer( >- renomination_offer.get(), options, true, answer.get())); >+ std::unique_ptr<TransportDescription> renomination_answer( >+ f2_.CreateAnswer(renomination_offer.get(), options, true, answer.get(), >+ &ice_credentials_)); > VerifyRenomination(renomination_answer.get(), true); > } > >@@ -145,6 +148,7 @@ class TransportDescriptionFactoryTest : public testing::Test { > } > } > >+ cricket::IceCredentialsIterator ice_credentials_; > TransportDescriptionFactory f1_; > TransportDescriptionFactory f2_; > >@@ -154,7 +158,7 @@ class TransportDescriptionFactoryTest : public testing::Test { > > TEST_F(TransportDescriptionFactoryTest, TestOfferDefault) { > std::unique_ptr<TransportDescription> desc( >- f1_.CreateOffer(TransportOptions(), NULL)); >+ f1_.CreateOffer(TransportOptions(), NULL, &ice_credentials_)); > CheckDesc(desc.get(), "", "", "", ""); > } > >@@ -163,13 +167,13 @@ TEST_F(TransportDescriptionFactoryTest, TestOfferDtls) { > f1_.set_certificate(cert1_); > std::string digest_alg; > ASSERT_TRUE( >- cert1_->ssl_certificate().GetSignatureDigestAlgorithm(&digest_alg)); >+ cert1_->GetSSLCertificate().GetSignatureDigestAlgorithm(&digest_alg)); > 
std::unique_ptr<TransportDescription> desc( >- f1_.CreateOffer(TransportOptions(), NULL)); >+ f1_.CreateOffer(TransportOptions(), NULL, &ice_credentials_)); > CheckDesc(desc.get(), "", "", "", digest_alg); > // Ensure it also works with SEC_REQUIRED. > f1_.set_secure(cricket::SEC_REQUIRED); >- desc.reset(f1_.CreateOffer(TransportOptions(), NULL)); >+ desc.reset(f1_.CreateOffer(TransportOptions(), NULL, &ice_credentials_)); > CheckDesc(desc.get(), "", "", "", digest_alg); > } > >@@ -177,7 +181,7 @@ TEST_F(TransportDescriptionFactoryTest, TestOfferDtls) { > TEST_F(TransportDescriptionFactoryTest, TestOfferDtlsWithNoIdentity) { > f1_.set_secure(cricket::SEC_ENABLED); > std::unique_ptr<TransportDescription> desc( >- f1_.CreateOffer(TransportOptions(), NULL)); >+ f1_.CreateOffer(TransportOptions(), NULL, &ice_credentials_)); > ASSERT_TRUE(desc.get() == NULL); > } > >@@ -188,36 +192,38 @@ TEST_F(TransportDescriptionFactoryTest, TestOfferDtlsReofferDtls) { > f1_.set_certificate(cert1_); > std::string digest_alg; > ASSERT_TRUE( >- cert1_->ssl_certificate().GetSignatureDigestAlgorithm(&digest_alg)); >+ cert1_->GetSSLCertificate().GetSignatureDigestAlgorithm(&digest_alg)); > std::unique_ptr<TransportDescription> old_desc( >- f1_.CreateOffer(TransportOptions(), NULL)); >+ f1_.CreateOffer(TransportOptions(), NULL, &ice_credentials_)); > ASSERT_TRUE(old_desc.get() != NULL); > std::unique_ptr<TransportDescription> desc( >- f1_.CreateOffer(TransportOptions(), old_desc.get())); >+ f1_.CreateOffer(TransportOptions(), old_desc.get(), &ice_credentials_)); > CheckDesc(desc.get(), "", old_desc->ice_ufrag, old_desc->ice_pwd, digest_alg); > } > > TEST_F(TransportDescriptionFactoryTest, TestAnswerDefault) { > std::unique_ptr<TransportDescription> offer( >- f1_.CreateOffer(TransportOptions(), NULL)); >+ f1_.CreateOffer(TransportOptions(), NULL, &ice_credentials_)); > ASSERT_TRUE(offer.get() != NULL); >- std::unique_ptr<TransportDescription> desc( >- f2_.CreateAnswer(offer.get(), 
TransportOptions(), true, NULL)); >+ std::unique_ptr<TransportDescription> desc(f2_.CreateAnswer( >+ offer.get(), TransportOptions(), true, NULL, &ice_credentials_)); > CheckDesc(desc.get(), "", "", "", ""); >- desc.reset(f2_.CreateAnswer(offer.get(), TransportOptions(), true, NULL)); >+ desc.reset(f2_.CreateAnswer(offer.get(), TransportOptions(), true, NULL, >+ &ice_credentials_)); > CheckDesc(desc.get(), "", "", "", ""); > } > > // Test that we can update an answer properly; ICE credentials shouldn't change. > TEST_F(TransportDescriptionFactoryTest, TestReanswer) { > std::unique_ptr<TransportDescription> offer( >- f1_.CreateOffer(TransportOptions(), NULL)); >+ f1_.CreateOffer(TransportOptions(), NULL, &ice_credentials_)); > ASSERT_TRUE(offer.get() != NULL); >- std::unique_ptr<TransportDescription> old_desc( >- f2_.CreateAnswer(offer.get(), TransportOptions(), true, NULL)); >+ std::unique_ptr<TransportDescription> old_desc(f2_.CreateAnswer( >+ offer.get(), TransportOptions(), true, NULL, &ice_credentials_)); > ASSERT_TRUE(old_desc.get() != NULL); > std::unique_ptr<TransportDescription> desc( >- f2_.CreateAnswer(offer.get(), TransportOptions(), true, old_desc.get())); >+ f2_.CreateAnswer(offer.get(), TransportOptions(), true, old_desc.get(), >+ &ice_credentials_)); > ASSERT_TRUE(desc.get() != NULL); > CheckDesc(desc.get(), "", old_desc->ice_ufrag, old_desc->ice_pwd, ""); > } >@@ -227,10 +233,10 @@ TEST_F(TransportDescriptionFactoryTest, TestAnswerDtlsToNoDtls) { > f1_.set_secure(cricket::SEC_ENABLED); > f1_.set_certificate(cert1_); > std::unique_ptr<TransportDescription> offer( >- f1_.CreateOffer(TransportOptions(), NULL)); >+ f1_.CreateOffer(TransportOptions(), NULL, &ice_credentials_)); > ASSERT_TRUE(offer.get() != NULL); >- std::unique_ptr<TransportDescription> desc( >- f2_.CreateAnswer(offer.get(), TransportOptions(), true, NULL)); >+ std::unique_ptr<TransportDescription> desc(f2_.CreateAnswer( >+ offer.get(), TransportOptions(), true, NULL, 
&ice_credentials_)); > CheckDesc(desc.get(), "", "", "", ""); > } > >@@ -240,13 +246,14 @@ TEST_F(TransportDescriptionFactoryTest, TestAnswerNoDtlsToDtls) { > f2_.set_secure(cricket::SEC_ENABLED); > f2_.set_certificate(cert2_); > std::unique_ptr<TransportDescription> offer( >- f1_.CreateOffer(TransportOptions(), NULL)); >+ f1_.CreateOffer(TransportOptions(), NULL, &ice_credentials_)); > ASSERT_TRUE(offer.get() != NULL); >- std::unique_ptr<TransportDescription> desc( >- f2_.CreateAnswer(offer.get(), TransportOptions(), true, NULL)); >+ std::unique_ptr<TransportDescription> desc(f2_.CreateAnswer( >+ offer.get(), TransportOptions(), true, NULL, &ice_credentials_)); > CheckDesc(desc.get(), "", "", "", ""); > f2_.set_secure(cricket::SEC_REQUIRED); >- desc.reset(f2_.CreateAnswer(offer.get(), TransportOptions(), true, NULL)); >+ desc.reset(f2_.CreateAnswer(offer.get(), TransportOptions(), true, NULL, >+ &ice_credentials_)); > ASSERT_TRUE(desc.get() == NULL); > } > >@@ -262,16 +269,17 @@ TEST_F(TransportDescriptionFactoryTest, TestAnswerDtlsToDtls) { > // answer must contain fingerprint lines with cert2_'s digest algorithm. 
> std::string digest_alg2; > ASSERT_TRUE( >- cert2_->ssl_certificate().GetSignatureDigestAlgorithm(&digest_alg2)); >+ cert2_->GetSSLCertificate().GetSignatureDigestAlgorithm(&digest_alg2)); > > std::unique_ptr<TransportDescription> offer( >- f1_.CreateOffer(TransportOptions(), NULL)); >+ f1_.CreateOffer(TransportOptions(), NULL, &ice_credentials_)); > ASSERT_TRUE(offer.get() != NULL); >- std::unique_ptr<TransportDescription> desc( >- f2_.CreateAnswer(offer.get(), TransportOptions(), true, NULL)); >+ std::unique_ptr<TransportDescription> desc(f2_.CreateAnswer( >+ offer.get(), TransportOptions(), true, NULL, &ice_credentials_)); > CheckDesc(desc.get(), "", "", "", digest_alg2); > f2_.set_secure(cricket::SEC_REQUIRED); >- desc.reset(f2_.CreateAnswer(offer.get(), TransportOptions(), true, NULL)); >+ desc.reset(f2_.CreateAnswer(offer.get(), TransportOptions(), true, NULL, >+ &ice_credentials_)); > CheckDesc(desc.get(), "", "", "", digest_alg2); > } > >@@ -304,9 +312,36 @@ TEST_F(TransportDescriptionFactoryTest, TestIceRenominationWithDtls) { > TEST_F(TransportDescriptionFactoryTest, AddsTrickleIceOption) { > cricket::TransportOptions options; > std::unique_ptr<TransportDescription> offer( >- f1_.CreateOffer(options, nullptr)); >+ f1_.CreateOffer(options, nullptr, &ice_credentials_)); > EXPECT_TRUE(offer->HasOption("trickle")); > std::unique_ptr<TransportDescription> answer( >- f2_.CreateAnswer(offer.get(), options, true, nullptr)); >+ f2_.CreateAnswer(offer.get(), options, true, nullptr, &ice_credentials_)); > EXPECT_TRUE(answer->HasOption("trickle")); > } >+ >+// Test CreateOffer with IceCredentialsIterator. 
>+TEST_F(TransportDescriptionFactoryTest, CreateOfferIceCredentialsIterator) { >+ std::vector<cricket::IceParameters> credentials = { >+ cricket::IceParameters("kalle", "anka", false)}; >+ cricket::IceCredentialsIterator credentialsIterator(credentials); >+ cricket::TransportOptions options; >+ std::unique_ptr<TransportDescription> offer( >+ f1_.CreateOffer(options, nullptr, &credentialsIterator)); >+ EXPECT_EQ(offer->GetIceParameters().ufrag, credentials[0].ufrag); >+ EXPECT_EQ(offer->GetIceParameters().pwd, credentials[0].pwd); >+} >+ >+// Test CreateAnswer with IceCredentialsIterator. >+TEST_F(TransportDescriptionFactoryTest, CreateAnswerIceCredentialsIterator) { >+ cricket::TransportOptions options; >+ std::unique_ptr<TransportDescription> offer( >+ f1_.CreateOffer(options, nullptr, &ice_credentials_)); >+ >+ std::vector<cricket::IceParameters> credentials = { >+ cricket::IceParameters("kalle", "anka", false)}; >+ cricket::IceCredentialsIterator credentialsIterator(credentials); >+ std::unique_ptr<TransportDescription> answer(f1_.CreateAnswer( >+ offer.get(), options, false, nullptr, &credentialsIterator)); >+ EXPECT_EQ(answer->GetIceParameters().ufrag, credentials[0].ufrag); >+ EXPECT_EQ(answer->GetIceParameters().pwd, credentials[0].pwd); >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/transportfactoryinterface.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/transportfactoryinterface.h >index ce32ee8483edeb7ee2f084578323024aa783a67e..9805db005cc02155fbd22df6cedea03b0240994e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/transportfactoryinterface.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/transportfactoryinterface.h >@@ -34,7 +34,7 @@ class TransportFactoryInterface { > > virtual std::unique_ptr<DtlsTransportInternal> CreateDtlsTransport( > std::unique_ptr<IceTransportInternal> ice, >- const rtc::CryptoOptions& crypto_options) = 0; >+ const webrtc::CryptoOptions& crypto_options) = 0; > }; > > } // 
namespace cricket >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/turnport.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/turnport.cc >index 97541f60c02dd9b742d7def22c3509fbbb84b211..eb0adfa3ce4e1db990c303972a6c0e85587093a6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/turnport.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/turnport.cc >@@ -24,7 +24,6 @@ > #include "rtc_base/logging.h" > #include "rtc_base/nethelpers.h" > #include "rtc_base/socketaddress.h" >-#include "rtc_base/stringencode.h" > #include "rtc_base/strings/string_builder.h" > > namespace cricket { >@@ -623,7 +622,7 @@ bool TurnPort::HandleIncomingPacket(rtc::AsyncPacketSocket* socket, > const char* data, > size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time) { >+ int64_t packet_time_us) { > if (socket != socket_) { > // The packet was received on a shared socket after we've allocated a new > // socket for this TURN port. >@@ -660,12 +659,12 @@ bool TurnPort::HandleIncomingPacket(rtc::AsyncPacketSocket* socket, > // a response to a previous request. 
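Reviewer note: throughout these turnport.cc hunks the `const rtc::PacketTime&` parameter is replaced by a bare `int64_t packet_time_us` carrying the receive timestamp in microseconds. A minimal sketch of what that migration looks like at a call boundary (these types and the chrono-based clock are simplified stand-ins, not the real rtc implementations):

```cpp
#include <chrono>
#include <cstdint>

// Sketch of the old-style wrapper being removed: a struct holding the
// receive timestamp (simplified stand-in for rtc::PacketTime).
struct PacketTime {
  int64_t timestamp = -1;  // microseconds
};

// New-style helper: the timestamp travels as a plain int64_t in
// microseconds, in the spirit of rtc::TimeMicros() (this chrono-based
// version is an assumption, not the real implementation).
inline int64_t TimeMicros() {
  return std::chrono::duration_cast<std::chrono::microseconds>(
             std::chrono::steady_clock::now().time_since_epoch())
      .count();
}

// Converting legacy callers is mechanical: unwrap the struct once at the
// boundary and pass the integer straight through.
inline int64_t ToPacketTimeUs(const PacketTime& legacy) {
  return legacy.timestamp;
}
```

Passing a trivially copyable `int64_t` by value (or `const int64_t&` where a signal signature demands it, as in `OnReadPacket`) removes a struct copy per packet on the hot receive path.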
> uint16_t msg_type = rtc::GetBE16(data); > if (IsTurnChannelData(msg_type)) { >- HandleChannelData(msg_type, data, size, packet_time); >+ HandleChannelData(msg_type, data, size, packet_time_us); > return true; > } > > if (msg_type == TURN_DATA_INDICATION) { >- HandleDataIndication(data, size, packet_time); >+ HandleDataIndication(data, size, packet_time_us); > return true; > } > >@@ -696,8 +695,8 @@ void TurnPort::OnReadPacket(rtc::AsyncPacketSocket* socket, > const char* data, > size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time) { >- HandleIncomingPacket(socket, data, size, remote_addr, packet_time); >+ const int64_t& packet_time_us) { >+ HandleIncomingPacket(socket, data, size, remote_addr, packet_time_us); > } > > void TurnPort::OnSentPacket(rtc::AsyncPacketSocket* socket, >@@ -933,7 +932,7 @@ void TurnPort::OnAllocateRequestTimeout() { > > void TurnPort::HandleDataIndication(const char* data, > size_t size, >- const rtc::PacketTime& packet_time) { >+ int64_t packet_time_us) { > // Read in the message, and process according to RFC5766, Section 10.4. > rtc::ByteBufferReader buf(data, size); > TurnMessage msg; >@@ -972,13 +971,13 @@ void TurnPort::HandleDataIndication(const char* data, > } > > DispatchPacket(data_attr->bytes(), data_attr->length(), ext_addr, PROTO_UDP, >- packet_time); >+ packet_time_us); > } > > void TurnPort::HandleChannelData(int channel_id, > const char* data, > size_t size, >- const rtc::PacketTime& packet_time) { >+ int64_t packet_time_us) { > // Read the message, and process according to RFC5766, Section 11.6. 
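Reviewer note: the `HandleChannelData` hunk below parses TURN ChannelData framing (RFC 5766): a 16-bit channel number in 0x4000-0x7FFF followed by a 16-bit big-endian payload length. A self-contained sketch of that header parse (`GetBE16` here mimics `rtc::GetBE16`; the helpers are illustrative, not the patched functions themselves):

```cpp
#include <cstddef>
#include <cstdint>

// Mimics rtc::GetBE16: read a big-endian 16-bit value.
inline uint16_t GetBE16(const uint8_t* p) {
  return static_cast<uint16_t>((p[0] << 8) | p[1]);
}

constexpr size_t kTurnChannelHeaderSize = 4;

// ChannelData messages start with a channel number in 0x4000-0x7FFF, which
// is also how HandleIncomingPacket distinguishes them from STUN messages
// (whose leading two bits are zero).
inline bool IsTurnChannelData(uint16_t msg_type) {
  return msg_type >= 0x4000 && msg_type <= 0x7FFF;
}

// Parse the 4-byte header: channel number, then payload length, both
// big-endian. Returns false if the packet is too short or the declared
// length overruns the buffer.
inline bool ParseChannelData(const uint8_t* data, size_t size,
                             uint16_t* channel, uint16_t* payload_len) {
  if (size < kTurnChannelHeaderSize) return false;
  *channel = GetBE16(data);
  *payload_len = GetBE16(data + 2);
  return IsTurnChannelData(*channel) &&
         kTurnChannelHeaderSize + *payload_len <= size;
}
```

The payload that survives this check is what `DispatchPacket` forwards, starting at `data + TURN_CHANNEL_HEADER_SIZE`.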
> // 0 1 2 3 > // 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1 >@@ -1014,16 +1013,16 @@ void TurnPort::HandleChannelData(int channel_id, > } > > DispatchPacket(data + TURN_CHANNEL_HEADER_SIZE, len, entry->address(), >- PROTO_UDP, packet_time); >+ PROTO_UDP, packet_time_us); > } > > void TurnPort::DispatchPacket(const char* data, > size_t size, > const rtc::SocketAddress& remote_addr, > ProtocolType proto, >- const rtc::PacketTime& packet_time) { >+ int64_t packet_time_us) { > if (Connection* conn = GetConnection(remote_addr)) { >- conn->OnReadPacket(data, size, packet_time); >+ conn->OnReadPacket(data, size, packet_time_us); > } else { > Port::OnReadPacket(data, size, remote_addr, proto); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/turnport.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/turnport.h >index 11b58d4f6c99faa9428b825bac67ae863cd1d3aa..5db1956a9b1afa7ca647c26eb320e5f4b36a950b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/turnport.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/turnport.h >@@ -14,10 +14,12 @@ > #include <stdio.h> > #include <list> > #include <map> >+#include <memory> > #include <set> > #include <string> > #include <vector> > >+#include "absl/memory/memory.h" > #include "p2p/base/port.h" > #include "p2p/client/basicportallocator.h" > #include "rtc_base/asyncinvoker.h" >@@ -50,25 +52,44 @@ class TurnPort : public Port { > // packets. > }; > // Create a TURN port using the shared UDP socket, |socket|. >- static TurnPort* Create(rtc::Thread* thread, >- rtc::PacketSocketFactory* factory, >- rtc::Network* network, >- rtc::AsyncPacketSocket* socket, >- const std::string& username, // ice username. >- const std::string& password, // ice password. 
>- const ProtocolAddress& server_address, >- const RelayCredentials& credentials, >- int server_priority, >- const std::string& origin, >- webrtc::TurnCustomizer* customizer) { >- return new TurnPort(thread, factory, network, socket, username, password, >- server_address, credentials, server_priority, origin, >- customizer); >+ static std::unique_ptr<TurnPort> Create( >+ rtc::Thread* thread, >+ rtc::PacketSocketFactory* factory, >+ rtc::Network* network, >+ rtc::AsyncPacketSocket* socket, >+ const std::string& username, // ice username. >+ const std::string& password, // ice password. >+ const ProtocolAddress& server_address, >+ const RelayCredentials& credentials, >+ int server_priority, >+ const std::string& origin, >+ webrtc::TurnCustomizer* customizer) { >+ // Using `new` to access a non-public constructor. >+ return absl::WrapUnique(new TurnPort( >+ thread, factory, network, socket, username, password, server_address, >+ credentials, server_priority, origin, customizer)); >+ } >+ // TODO(steveanton): Remove once downstream clients have moved to |Create|. >+ static std::unique_ptr<TurnPort> CreateUnique( >+ rtc::Thread* thread, >+ rtc::PacketSocketFactory* factory, >+ rtc::Network* network, >+ rtc::AsyncPacketSocket* socket, >+ const std::string& username, // ice username. >+ const std::string& password, // ice password. >+ const ProtocolAddress& server_address, >+ const RelayCredentials& credentials, >+ int server_priority, >+ const std::string& origin, >+ webrtc::TurnCustomizer* customizer) { >+ return Create(thread, factory, network, socket, username, password, >+ server_address, credentials, server_priority, origin, >+ customizer); > } > > // Create a TURN port that will use a new socket, bound to |network| and > // using a port in the range between |min_port| and |max_port|. 
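Reviewer note: the `TurnPort::Create` overloads above now return `std::unique_ptr<TurnPort>` and go through `absl::WrapUnique` because the constructor is non-public, so `std::make_unique` cannot reach it. A minimal self-contained illustration of that pattern (`Widget` is a made-up class; `WrapUnique` is a local stand-in for the Abseil helper):

```cpp
#include <memory>

// Stand-in for absl::WrapUnique: adopt an already-created raw pointer.
template <typename T>
std::unique_ptr<T> WrapUnique(T* ptr) {
  return std::unique_ptr<T>(ptr);
}

// Hypothetical class mirroring TurnPort's shape: the constructor is
// private, so callers must go through the factory, which owns the only
// `new` in the program.
class Widget {
 public:
  static std::unique_ptr<Widget> Create(int id) {
    // Using `new` to access the non-public constructor;
    // std::make_unique<Widget>(id) would not compile here.
    return WrapUnique(new Widget(id));
  }
  int id() const { return id_; }

 private:
  explicit Widget(int id) : id_(id) {}
  int id_;
};
```

This is also why the patch keeps the deprecated `CreateUnique` shims: downstream callers that already expected a `unique_ptr` keep compiling while they migrate to `Create`.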
>- static TurnPort* Create( >+ static std::unique_ptr<TurnPort> Create( > rtc::Thread* thread, > rtc::PacketSocketFactory* factory, > rtc::Network* network, >@@ -84,10 +105,34 @@ class TurnPort : public Port { > const std::vector<std::string>& tls_elliptic_curves, > webrtc::TurnCustomizer* customizer, > rtc::SSLCertificateVerifier* tls_cert_verifier = nullptr) { >- return new TurnPort(thread, factory, network, min_port, max_port, username, >- password, server_address, credentials, server_priority, >- origin, tls_alpn_protocols, tls_elliptic_curves, >- customizer, tls_cert_verifier); >+ // Using `new` to access a non-public constructor. >+ return absl::WrapUnique( >+ new TurnPort(thread, factory, network, min_port, max_port, username, >+ password, server_address, credentials, server_priority, >+ origin, tls_alpn_protocols, tls_elliptic_curves, >+ customizer, tls_cert_verifier)); >+ } >+ // TODO(steveanton): Remove once downstream clients have moved to |Create|. >+ static std::unique_ptr<TurnPort> CreateUnique( >+ rtc::Thread* thread, >+ rtc::PacketSocketFactory* factory, >+ rtc::Network* network, >+ uint16_t min_port, >+ uint16_t max_port, >+ const std::string& username, // ice username. >+ const std::string& password, // ice password. 
>+ const ProtocolAddress& server_address, >+ const RelayCredentials& credentials, >+ int server_priority, >+ const std::string& origin, >+ const std::vector<std::string>& tls_alpn_protocols, >+ const std::vector<std::string>& tls_elliptic_curves, >+ webrtc::TurnCustomizer* customizer, >+ rtc::SSLCertificateVerifier* tls_cert_verifier = nullptr) { >+ return Create(thread, factory, network, min_port, max_port, username, >+ password, server_address, credentials, server_priority, >+ origin, tls_alpn_protocols, tls_elliptic_curves, customizer, >+ tls_cert_verifier); > } > > ~TurnPort() override; >@@ -130,13 +175,14 @@ class TurnPort : public Port { > const char* data, > size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time) override; >+ int64_t packet_time_us) override; > bool CanHandleIncomingPacketsFrom( > const rtc::SocketAddress& addr) const override; > virtual void OnReadPacket(rtc::AsyncPacketSocket* socket, >- const char* data, size_t size, >+ const char* data, >+ size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time); >+ const int64_t& packet_time_us); > > void OnSentPacket(rtc::AsyncPacketSocket* socket, > const rtc::SentPacket& sent_packet) override; >@@ -268,13 +314,18 @@ class TurnPort : public Port { > void OnAllocateError(); > void OnAllocateRequestTimeout(); > >- void HandleDataIndication(const char* data, size_t size, >- const rtc::PacketTime& packet_time); >- void HandleChannelData(int channel_id, const char* data, size_t size, >- const rtc::PacketTime& packet_time); >- void DispatchPacket(const char* data, size_t size, >- const rtc::SocketAddress& remote_addr, >- ProtocolType proto, const rtc::PacketTime& packet_time); >+ void HandleDataIndication(const char* data, >+ size_t size, >+ int64_t packet_time_us); >+ void HandleChannelData(int channel_id, >+ const char* data, >+ size_t size, >+ int64_t packet_time_us); >+ void DispatchPacket(const char* data, >+ size_t size, >+ const 
rtc::SocketAddress& remote_addr, >+ ProtocolType proto, >+ int64_t packet_time_us); > > bool ScheduleRefresh(uint32_t lifetime); > void SendRequest(StunRequest* request, int delay); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/turnport_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/turnport_unittest.cc >index bac35e891cd49dcde0ca36941cccb6bc89e2b041..617e7798adf53fae544e00c19c444611cc9dc347 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/turnport_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/turnport_unittest.cc >@@ -194,23 +194,23 @@ class TurnPortTest : public testing::Test, > void OnTurnReadPacket(Connection* conn, > const char* data, > size_t size, >- const rtc::PacketTime& packet_time) { >+ int64_t packet_time_us) { > turn_packets_.push_back(rtc::Buffer(data, size)); > } > void OnUdpPortComplete(Port* port) { udp_ready_ = true; } > void OnUdpReadPacket(Connection* conn, > const char* data, > size_t size, >- const rtc::PacketTime& packet_time) { >+ int64_t packet_time_us) { > udp_packets_.push_back(rtc::Buffer(data, size)); > } > void OnSocketReadPacket(rtc::AsyncPacketSocket* socket, > const char* data, > size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time) { >+ const int64_t& packet_time_us) { > turn_port_->HandleIncomingPacket(socket, data, size, remote_addr, >- packet_time); >+ packet_time_us); > } > void OnTurnPortClosed(TurnPort* port) { turn_port_closed_ = true; } > void OnTurnPortDestroyed(PortInterface* port) { turn_port_destroyed_ = true; } >@@ -270,10 +270,9 @@ class TurnPortTest : public testing::Test, > const ProtocolAddress& server_address, > const std::string& origin) { > RelayCredentials credentials(username, password); >- turn_port_.reset(TurnPort::Create( >+ turn_port_ = TurnPort::Create( > &main_, &socket_factory_, network, 0, 0, kIceUfrag1, kIcePwd1, >- server_address, credentials, 0, origin, std::vector<std::string>(), >- 
std::vector<std::string>(), turn_customizer_.get())); >+ server_address, credentials, 0, origin, {}, {}, turn_customizer_.get()); > // This TURN port will be the controlling. > turn_port_->SetIceRole(ICEROLE_CONTROLLING); > ConnectSignals(); >@@ -301,10 +300,10 @@ class TurnPortTest : public testing::Test, > } > > RelayCredentials credentials(username, password); >- turn_port_.reset(TurnPort::Create(&main_, &socket_factory_, >- MakeNetwork(kLocalAddr1), socket_.get(), >- kIceUfrag1, kIcePwd1, server_address, >- credentials, 0, std::string(), nullptr)); >+ turn_port_ = >+ TurnPort::Create(&main_, &socket_factory_, MakeNetwork(kLocalAddr1), >+ socket_.get(), kIceUfrag1, kIcePwd1, server_address, >+ credentials, 0, std::string(), nullptr); > // This TURN port will be the controlling. > turn_port_->SetIceRole(ICEROLE_CONTROLLING); > ConnectSignals(); >@@ -329,9 +328,9 @@ class TurnPortTest : public testing::Test, > void CreateUdpPort() { CreateUdpPort(kLocalAddr2); } > > void CreateUdpPort(const SocketAddress& address) { >- udp_port_.reset(UDPPort::Create( >- &main_, &socket_factory_, MakeNetwork(address), 0, 0, kIceUfrag2, >- kIcePwd2, std::string(), false, absl::nullopt)); >+ udp_port_ = UDPPort::Create(&main_, &socket_factory_, MakeNetwork(address), >+ 0, 0, kIceUfrag2, kIcePwd2, std::string(), >+ false, absl::nullopt); > // UDP port will be controlled. 
> udp_port_->SetIceRole(ICEROLE_CONTROLLED); > udp_port_->SignalPortComplete.connect(this, >@@ -1032,8 +1031,7 @@ TEST_F(TurnPortTest, TestTurnAllocateMismatch) { > std::string test_packet = "Test packet"; > EXPECT_FALSE(turn_port_->HandleIncomingPacket( > socket_.get(), test_packet.data(), test_packet.size(), >- rtc::SocketAddress(kTurnUdpExtAddr.ipaddr(), 0), >- rtc::CreatePacketTime(0))); >+ rtc::SocketAddress(kTurnUdpExtAddr.ipaddr(), 0), rtc::TimeMicros())); > } > > // Tests that a shared-socket-TurnPort creates its own socket after >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/turnserver.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/turnserver.cc >index 022113e7e746a0c59076b8b062cb310992a94e30..a8bb41c87a6acad2e284aa783101537f7774ffd0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/turnserver.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/turnserver.cc >@@ -24,7 +24,6 @@ > #include "rtc_base/logging.h" > #include "rtc_base/messagedigest.h" > #include "rtc_base/socketadapters.h" >-#include "rtc_base/stringencode.h" > #include "rtc_base/strings/string_builder.h" > #include "rtc_base/thread.h" > >@@ -199,9 +198,10 @@ void TurnServer::OnInternalSocketClose(rtc::AsyncPacketSocket* socket, > } > > void TurnServer::OnInternalPacket(rtc::AsyncPacketSocket* socket, >- const char* data, size_t size, >+ const char* data, >+ size_t size, > const rtc::SocketAddress& addr, >- const rtc::PacketTime& packet_time) { >+ const int64_t& /* packet_time_us */) { > RTC_DCHECK(thread_checker_.CalledOnValidThread()); > // Fail if the packet is too small to even contain a channel header. 
> if (size < TURN_CHANNEL_HEADER_SIZE) { >@@ -842,9 +842,10 @@ void TurnServerAllocation::HandleChannelData(const char* data, size_t size) { > > void TurnServerAllocation::OnExternalPacket( > rtc::AsyncPacketSocket* socket, >- const char* data, size_t size, >+ const char* data, >+ size_t size, > const rtc::SocketAddress& addr, >- const rtc::PacketTime& packet_time) { >+ const int64_t& /* packet_time_us */) { > RTC_DCHECK(external_socket_.get() == socket); > Channel* channel = FindChannel(addr); > if (channel) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/turnserver.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/turnserver.h >index e7f886cf8fb553de3494cbe72813a9d88d3f984b..af51251bbcd0cf03984f031bb272897e343e439c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/turnserver.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/turnserver.h >@@ -105,9 +105,10 @@ class TurnServerAllocation : public rtc::MessageHandler, > void HandleChannelBindRequest(const TurnMessage* msg); > > void OnExternalPacket(rtc::AsyncPacketSocket* socket, >- const char* data, size_t size, >+ const char* data, >+ size_t size, > const rtc::SocketAddress& addr, >- const rtc::PacketTime& packet_time); >+ const int64_t& packet_time_us); > > static int ComputeLifetime(const TurnMessage* msg); > bool HasPermission(const rtc::IPAddress& addr); >@@ -256,9 +257,11 @@ class TurnServer : public sigslot::has_slots<> { > > private: > std::string GenerateNonce(int64_t now) const; >- void OnInternalPacket(rtc::AsyncPacketSocket* socket, const char* data, >- size_t size, const rtc::SocketAddress& address, >- const rtc::PacketTime& packet_time); >+ void OnInternalPacket(rtc::AsyncPacketSocket* socket, >+ const char* data, >+ size_t size, >+ const rtc::SocketAddress& address, >+ const int64_t& packet_time_us); > > void OnNewInternalConnection(rtc::AsyncSocket* socket); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/udptransport.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/udptransport.cc >deleted file mode 100644 >index c21e5b8769ee6dc66f5a3b724b37f8304e11026d..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/udptransport.cc >+++ /dev/null >@@ -1,133 +0,0 @@ >-/* >- * Copyright 2016 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include "p2p/base/udptransport.h" >- >-#include <string> >-#include <utility> // For std::move. >- >-#include "rtc_base/asyncpacketsocket.h" >-#include "rtc_base/asyncudpsocket.h" >-#include "rtc_base/logging.h" >-#include "rtc_base/nethelper.h" >-#include "rtc_base/socketaddress.h" >-#include "rtc_base/thread.h" >-#include "rtc_base/thread_checker.h" >- >-namespace cricket { >- >-UdpTransport::UdpTransport(const std::string& transport_name, >- std::unique_ptr<rtc::AsyncPacketSocket> socket) >- : transport_name_(transport_name), socket_(std::move(socket)) { >- RTC_DCHECK(socket_); >- socket_->SignalReadPacket.connect(this, &UdpTransport::OnSocketReadPacket); >- socket_->SignalSentPacket.connect(this, &UdpTransport::OnSocketSentPacket); >-} >- >-UdpTransport::~UdpTransport() { >- RTC_DCHECK_RUN_ON(&network_thread_checker_); >-} >- >-rtc::SocketAddress UdpTransport::GetLocalAddress() const { >- RTC_DCHECK_RUN_ON(&network_thread_checker_); >- return socket_->GetLocalAddress(); >-} >- >-bool UdpTransport::SetRemoteAddress(const rtc::SocketAddress& addr) { >- RTC_DCHECK_RUN_ON(&network_thread_checker_); >- if (!addr.IsComplete()) { >- RTC_LOG(LS_WARNING) << "Remote address not complete."; >- return false; >- } >- // TODO(johan): check for ipv4, other settings. 
>- bool prev_destination_nil = remote_address_.IsNil(); >- remote_address_ = addr; >- // Going from "didn't have destination" to "have destination" or vice versa. >- if (prev_destination_nil != remote_address_.IsNil()) { >- SignalWritableState(this); >- if (prev_destination_nil) { >- SignalReadyToSend(this); >- } >- } >- return true; >-} >- >-rtc::SocketAddress UdpTransport::GetRemoteAddress() const { >- RTC_DCHECK_RUN_ON(&network_thread_checker_); >- return remote_address_; >-} >- >-const std::string& UdpTransport::transport_name() const { >- return transport_name_; >-} >- >-bool UdpTransport::receiving() const { >- // TODO(johan): Implement method and signal. >- return true; >-} >- >-bool UdpTransport::writable() const { >- RTC_DCHECK_RUN_ON(&network_thread_checker_); >- return !remote_address_.IsNil(); >-} >- >-int UdpTransport::SendPacket(const char* data, >- size_t len, >- const rtc::PacketOptions& options, >- int flags) { >- // No thread_checker in high frequency network function. 
>- if (remote_address_.IsNil()) { >- RTC_LOG(LS_WARNING) << "Remote address not set."; >- send_error_ = ENOTCONN; >- return -1; >- } >- int result = >- socket_->SendTo((const void*)data, len, remote_address_, options); >- if (result <= 0) { >- RTC_LOG(LS_VERBOSE) << "SendPacket() " << result; >- } >- return result; >-} >- >-absl::optional<rtc::NetworkRoute> UdpTransport::network_route() const { >- rtc::NetworkRoute network_route; >- network_route.packet_overhead = >- /*kUdpOverhead=*/8 + GetIpOverhead(GetLocalAddress().family()); >- return absl::optional<rtc::NetworkRoute>(network_route); >-} >- >-int UdpTransport::SetOption(rtc::Socket::Option opt, int value) { >- return 0; >-} >- >-int UdpTransport::GetError() { >- return send_error_; >-} >- >-rtc::PacketTransportInternal* UdpTransport::GetInternal() { >- return this; >-} >- >-void UdpTransport::OnSocketReadPacket(rtc::AsyncPacketSocket* socket, >- const char* data, >- size_t len, >- const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time) { >- // No thread_checker in high frequency network function. >- SignalReadPacket(this, data, len, packet_time, 0); >-} >- >-void UdpTransport::OnSocketSentPacket(rtc::AsyncPacketSocket* socket, >- const rtc::SentPacket& packet) { >- RTC_DCHECK_EQ(socket_.get(), socket); >- SignalSentPacket(this, packet); >-} >- >-} // namespace cricket >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/udptransport.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/udptransport.h >deleted file mode 100644 >index 06a795f2e1f9a5041146f8a9aadbc6d699026e4c..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/udptransport.h >+++ /dev/null >@@ -1,89 +0,0 @@ >-/* >- * Copyright 2016 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. 
An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef P2P_BASE_UDPTRANSPORT_H_ >-#define P2P_BASE_UDPTRANSPORT_H_ >- >-#include <memory> >-#include <string> >- >-#include "absl/types/optional.h" >-#include "api/ortc/udptransportinterface.h" >-#include "p2p/base/packettransportinternal.h" >-#include "rtc_base/asyncpacketsocket.h" // For PacketOptions. >-#include "rtc_base/thread_checker.h" >- >-namespace rtc { >-class AsyncPacketSocket; >-struct PacketTime; >-struct SentPacket; >-class SocketAddress; >-} // namespace rtc >- >-namespace cricket { >- >-// Implementation of UdpTransportInterface. >-// Used by OrtcFactory. >-class UdpTransport : public rtc::PacketTransportInternal, >- public webrtc::UdpTransportInterface { >- public: >- // |transport_name| is only used for identification/logging. >- // |socket| must be non-null. >- UdpTransport(const std::string& transport_name, >- std::unique_ptr<rtc::AsyncPacketSocket> socket); >- ~UdpTransport() override; >- >- // Overrides of UdpTransportInterface, used by the API consumer. >- rtc::SocketAddress GetLocalAddress() const override; >- bool SetRemoteAddress(const rtc::SocketAddress& addr) override; >- rtc::SocketAddress GetRemoteAddress() const override; >- >- // Overrides of PacketTransportInternal, used by webrtc internally. 
>- const std::string& transport_name() const override; >- >- bool receiving() const override; >- >- bool writable() const override; >- >- int SendPacket(const char* data, >- size_t len, >- const rtc::PacketOptions& options, >- int flags) override; >- >- int SetOption(rtc::Socket::Option opt, int value) override; >- >- int GetError() override; >- >- absl::optional<rtc::NetworkRoute> network_route() const override; >- >- protected: >- PacketTransportInternal* GetInternal() override; >- >- private: >- void OnSocketReadPacket(rtc::AsyncPacketSocket* socket, >- const char* data, >- size_t len, >- const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time); >- void OnSocketSentPacket(rtc::AsyncPacketSocket* socket, >- const rtc::SentPacket& packet); >- bool IsLocalConsistent(); >- >- std::string transport_name_; >- int send_error_ = 0; >- std::unique_ptr<rtc::AsyncPacketSocket> socket_; >- // If not set, will be an "nil" address ("IsNil" returns true). >- rtc::SocketAddress remote_address_; >- rtc::ThreadChecker network_thread_checker_; >-}; >- >-} // namespace cricket >- >-#endif // P2P_BASE_UDPTRANSPORT_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/udptransport_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/udptransport_unittest.cc >deleted file mode 100644 >index ead366595f416b82ca22e1bdad890ba7857dfbdd..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/udptransport_unittest.cc >+++ /dev/null >@@ -1,189 +0,0 @@ >-/* >- * Copyright 2016 The WebRTC Project Authors. All rights reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include <algorithm> >-#include <list> >-#include <memory> >-#include <utility> >-#include <vector> >- >-#include "p2p/base/basicpacketsocketfactory.h" >-#include "p2p/base/packettransportinternal.h" >-#include "p2p/base/udptransport.h" >-#include "rtc_base/asyncpacketsocket.h" >-#include "rtc_base/gunit.h" >-#include "rtc_base/ipaddress.h" >-#include "rtc_base/socketaddress.h" >-#include "rtc_base/socketserver.h" >-#include "rtc_base/thread.h" >-#include "rtc_base/virtualsocketserver.h" >- >-namespace cricket { >- >-constexpr int kTimeoutMs = 10000; >-static const rtc::IPAddress kIPv4LocalHostAddress = >- rtc::IPAddress(0x7F000001); // 127.0.0.1 >- >-class UdpTransportTest : public testing::Test, public sigslot::has_slots<> { >- public: >- UdpTransportTest() >- : virtual_socket_server_(new rtc::VirtualSocketServer()), >- network_thread_(virtual_socket_server_.get()), >- ep1_("Name1", >- std::unique_ptr<rtc::AsyncPacketSocket>( >- socket_factory_.CreateUdpSocket( >- rtc::SocketAddress(rtc::GetAnyIP(AF_INET), 0), >- 0, >- 0))), >- ep2_("Name2", >- std::unique_ptr<rtc::AsyncPacketSocket>( >- socket_factory_.CreateUdpSocket( >- rtc::SocketAddress(rtc::GetAnyIP(AF_INET), 0), >- 0, >- 0))) { >- // Setup IP Address for outgoing packets from sockets bound to IPV4 >- // INADDR_ANY ("0.0.0.0."), as used above when creating the virtual >- // sockets. The virtual socket server sends these packets only if the >- // default address is explicit set. With a physical socket, the actual >- // network stack / operating system would set the IP address for outgoing >- // packets. 
>- virtual_socket_server_->SetDefaultRoute(kIPv4LocalHostAddress); >- } >- >- struct Endpoint : public sigslot::has_slots<> { >- explicit Endpoint(std::string tch_name, >- std::unique_ptr<rtc::AsyncPacketSocket> socket) { >- ch_.reset(new UdpTransport(std::move(tch_name), std::move(socket))); >- ch_->SignalReadPacket.connect(this, &Endpoint::OnReadPacket); >- ch_->SignalSentPacket.connect(this, &Endpoint::OnSentPacket); >- ch_->SignalReadyToSend.connect(this, &Endpoint::OnReadyToSend); >- ch_->SignalWritableState.connect(this, &Endpoint::OnWritableState); >- } >- >- bool CheckData(const char* data, int len) { >- bool ret = false; >- if (!ch_packets_.empty()) { >- std::string packet = ch_packets_.front(); >- ret = (packet == std::string(data, len)); >- ch_packets_.pop_front(); >- } >- return ret; >- } >- >- void OnWritableState(rtc::PacketTransportInternal* transport) { >- num_sig_writable_++; >- } >- >- void OnReadyToSend(rtc::PacketTransportInternal* transport) { >- num_sig_ready_to_send_++; >- } >- >- void OnReadPacket(rtc::PacketTransportInternal* transport, >- const char* data, >- size_t len, >- const rtc::PacketTime& packet_time, >- int flags) { >- num_received_packets_++; >- RTC_LOG(LS_VERBOSE) << "OnReadPacket (unittest)"; >- ch_packets_.push_front(std::string(data, len)); >- } >- >- void OnSentPacket(rtc::PacketTransportInternal* transport, >- const rtc::SentPacket&) { >- num_sig_sent_packets_++; >- } >- >- int SendData(const char* data, size_t len) { >- rtc::PacketOptions options; >- return ch_->SendPacket(data, len, options, 0); >- } >- >- void GetLocalPort(uint16_t* local_port) { >- *local_port = ch_->GetLocalAddress().port(); >- } >- >- std::list<std::string> ch_packets_; >- std::unique_ptr<UdpTransport> ch_; >- uint32_t num_received_packets_ = 0; // Increases on SignalReadPacket. >- uint32_t num_sig_sent_packets_ = 0; // Increases on SignalSentPacket. >- uint32_t num_sig_writable_ = 0; // Increases on SignalWritable. 
>- uint32_t num_sig_ready_to_send_ = 0; // Increases on SignalReadyToSend. >- }; >- >- std::unique_ptr<rtc::VirtualSocketServer> virtual_socket_server_; >- rtc::AutoSocketServerThread network_thread_; >- // Uses current thread's socket server, which will be set by ss_scope_. >- rtc::BasicPacketSocketFactory socket_factory_; >- >- Endpoint ep1_; >- Endpoint ep2_; >- >- void TestSendRecv() { >- for (uint32_t i = 0; i < 5; ++i) { >- static const char* data = "ABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890"; >- int len = static_cast<int>(strlen(data)); >- // local_channel <==> remote_channel >- EXPECT_EQ_WAIT(len, ep1_.SendData(data, len), kTimeoutMs); >- EXPECT_TRUE_WAIT(ep2_.CheckData(data, len), kTimeoutMs); >- EXPECT_EQ_WAIT(i + 1u, ep2_.num_received_packets_, kTimeoutMs); >- EXPECT_EQ_WAIT(len, ep2_.SendData(data, len), kTimeoutMs); >- EXPECT_TRUE_WAIT(ep1_.CheckData(data, len), kTimeoutMs); >- EXPECT_EQ_WAIT(i + 1u, ep1_.num_received_packets_, kTimeoutMs); >- } >- } >-}; >- >-TEST_F(UdpTransportTest, AddressGetters) { >- // Initially, remote address should be nil but local address shouldn't be. >- EXPECT_FALSE(ep1_.ch_->GetLocalAddress().IsNil()); >- EXPECT_TRUE(ep1_.ch_->GetRemoteAddress().IsNil()); >- rtc::SocketAddress destination("127.0.0.1", 1337); >- ASSERT_TRUE(ep1_.ch_->SetRemoteAddress(destination)); >- EXPECT_EQ(destination, ep1_.ch_->GetRemoteAddress()); >-} >- >-// Setting an invalid address should fail and have no effect. 
>-TEST_F(UdpTransportTest, SettingIncompleteRemoteAddressFails) { >- EXPECT_FALSE(ep1_.ch_->SetRemoteAddress(rtc::SocketAddress("127.0.0.1", 0))); >- EXPECT_TRUE(ep1_.ch_->GetRemoteAddress().IsNil()); >-} >- >-TEST_F(UdpTransportTest, SendRecvBasic) { >- uint16_t port; >- ep2_.GetLocalPort(&port); >- rtc::SocketAddress addr2 = rtc::SocketAddress("127.0.0.1", port); >- EXPECT_TRUE(ep1_.ch_->SetRemoteAddress(addr2)); >- ep1_.GetLocalPort(&port); >- rtc::SocketAddress addr1 = rtc::SocketAddress("127.0.0.1", port); >- EXPECT_TRUE(ep2_.ch_->SetRemoteAddress(addr1)); >- TestSendRecv(); >-} >- >-// Test the signals and state methods used internally by causing a UdpTransport >-// to send a packet to itself. >-TEST_F(UdpTransportTest, StatusAndSignals) { >- EXPECT_EQ(0u, ep1_.num_sig_writable_); >- EXPECT_EQ(0u, ep1_.num_sig_ready_to_send_); >- // Loopback >- EXPECT_TRUE(!ep1_.ch_->writable()); >- rtc::SocketAddress addr = ep1_.ch_->GetLocalAddress(); >- // Keep port, but explicitly set IP. >- addr.SetIP("127.0.0.1"); >- ep1_.ch_->SetRemoteAddress(addr); >- EXPECT_TRUE(ep1_.ch_->writable()); >- EXPECT_EQ(1u, ep1_.num_sig_writable_); >- EXPECT_EQ(1u, ep1_.num_sig_ready_to_send_); >- const char data[] = "abc"; >- ep1_.SendData(data, sizeof(data)); >- EXPECT_EQ_WAIT(1u, ep1_.ch_packets_.size(), kTimeoutMs); >- EXPECT_EQ_WAIT(1u, ep1_.num_sig_sent_packets_, kTimeoutMs); >-} >- >-} // namespace cricket >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/client/basicportallocator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/client/basicportallocator.cc >index c1b1de58ddb60ff278ae53a1c863c994003b2466..0c2fef3112605796b683c5edd5801dfe7478c677 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/client/basicportallocator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/client/basicportallocator.cc >@@ -518,6 +518,10 @@ void BasicPortAllocatorSession::GetCandidatesFromPort( > } > } > >+bool BasicPortAllocatorSession::MdnsObfuscationEnabled() const { >+ 
return allocator_->network_manager()->GetMdnsResponder() != nullptr; >+} >+ > Candidate BasicPortAllocatorSession::SanitizeCandidate( > const Candidate& c) const { > RTC_DCHECK_RUN_ON(network_thread_); >@@ -534,7 +538,7 @@ Candidate BasicPortAllocatorSession::SanitizeCandidate( > bool filter_stun_related_address = > ((flags() & PORTALLOCATOR_DISABLE_ADAPTER_ENUMERATION) && > (flags() & PORTALLOCATOR_DISABLE_DEFAULT_LOCAL_CANDIDATE)) || >- !(candidate_filter_ & CF_HOST); >+ !(candidate_filter_ & CF_HOST) || MdnsObfuscationEnabled(); > // If the candidate filter doesn't allow reflexive addresses, empty TURN raddr > // to avoid reflexive address leakage. > bool filter_turn_related_address = !(candidate_filter_ & CF_REFLEXIVE); >@@ -1362,7 +1366,7 @@ void AllocationSequence::CreateUDPPorts() { > > // TODO(mallinath) - Remove UDPPort creating socket after shared socket > // is enabled completely. >- UDPPort* port = NULL; >+ std::unique_ptr<UDPPort> port; > bool emit_local_candidate_for_anyaddress = > !IsFlagSet(PORTALLOCATOR_DISABLE_DEFAULT_LOCAL_CANDIDATE); > if (IsFlagSet(PORTALLOCATOR_ENABLE_SHARED_SOCKET) && udp_socket_) { >@@ -1384,7 +1388,7 @@ void AllocationSequence::CreateUDPPorts() { > // If shared socket is enabled, STUN candidate will be allocated by the > // UDPPort. > if (IsFlagSet(PORTALLOCATOR_ENABLE_SHARED_SOCKET)) { >- udp_port_ = port; >+ udp_port_ = port.get(); > port->SignalDestroyed.connect(this, &AllocationSequence::OnPortDestroyed); > > // If STUN is not disabled, setting stun server address to port. 
>@@ -1398,7 +1402,7 @@ void AllocationSequence::CreateUDPPorts() { > } > } > >- session_->AddAllocatedPort(port, this, true); >+ session_->AddAllocatedPort(port.release(), this, true); > } > } > >@@ -1408,13 +1412,13 @@ void AllocationSequence::CreateTCPPorts() { > return; > } > >- Port* port = TCPPort::Create( >+ std::unique_ptr<Port> port = TCPPort::Create( > session_->network_thread(), session_->socket_factory(), network_, > session_->allocator()->min_port(), session_->allocator()->max_port(), > session_->username(), session_->password(), > session_->allocator()->allow_tcp_listen()); > if (port) { >- session_->AddAllocatedPort(port, this, true); >+ session_->AddAllocatedPort(port.release(), this, true); > // Since TCPPort is not created using shared socket, |port| will not be > // added to the dequeue. > } >@@ -1436,14 +1440,14 @@ void AllocationSequence::CreateStunPorts() { > return; > } > >- StunPort* port = StunPort::Create( >+ std::unique_ptr<StunPort> port = StunPort::Create( > session_->network_thread(), session_->socket_factory(), network_, > session_->allocator()->min_port(), session_->allocator()->max_port(), > session_->username(), session_->password(), config_->StunServers(), > session_->allocator()->origin(), > session_->allocator()->stun_candidate_keepalive_interval()); > if (port) { >- session_->AddAllocatedPort(port, this, true); >+ session_->AddAllocatedPort(port.release(), this, true); > // Since StunPort is not created using shared socket, |port| will not be > // added to the dequeue. > } >@@ -1479,11 +1483,12 @@ void AllocationSequence::CreateRelayPorts() { > > void AllocationSequence::CreateGturnPort(const RelayServerConfig& config) { > // TODO(mallinath) - Rename RelayPort to GTurnPort. 
>- RelayPort* port = RelayPort::Create( >+ std::unique_ptr<RelayPort> port = RelayPort::Create( > session_->network_thread(), session_->socket_factory(), network_, > session_->allocator()->min_port(), session_->allocator()->max_port(), > config_->username, config_->password); > if (port) { >+ RelayPort* port_ptr = port.release(); > // Since RelayPort is not created using shared socket, |port| will not be > // added to the dequeue. > // Note: We must add the allocated port before we add addresses because >@@ -1491,17 +1496,17 @@ void AllocationSequence::CreateGturnPort(const RelayServerConfig& config) { > // settings. However, we also can't prepare the address (normally > // done by AddAllocatedPort) until we have these addresses. So we > // wait to do that until below. >- session_->AddAllocatedPort(port, this, false); >+ session_->AddAllocatedPort(port_ptr, this, false); > > // Add the addresses of this protocol. > PortList::const_iterator relay_port; > for (relay_port = config.ports.begin(); relay_port != config.ports.end(); > ++relay_port) { >- port->AddServerAddress(*relay_port); >- port->AddExternalAddress(*relay_port); >+ port_ptr->AddServerAddress(*relay_port); >+ port_ptr->AddExternalAddress(*relay_port); > } > // Start fetching an address for this port. 
>- port->PrepareAddress(); >+ port_ptr->PrepareAddress(); > } > } > >@@ -1579,7 +1584,7 @@ void AllocationSequence::OnReadPacket(rtc::AsyncPacketSocket* socket, > const char* data, > size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time) { >+ const int64_t& packet_time_us) { > RTC_DCHECK(socket == udp_socket_.get()); > > bool turn_port_found = false; >@@ -1593,7 +1598,7 @@ void AllocationSequence::OnReadPacket(rtc::AsyncPacketSocket* socket, > for (auto* port : relay_ports_) { > if (port->CanHandleIncomingPacketsFrom(remote_addr)) { > if (port->HandleIncomingPacket(socket, data, size, remote_addr, >- packet_time)) { >+ packet_time_us)) { > return; > } > turn_port_found = true; >@@ -1609,7 +1614,7 @@ void AllocationSequence::OnReadPacket(rtc::AsyncPacketSocket* socket, > stun_servers.find(remote_addr) != stun_servers.end()) { > RTC_DCHECK(udp_port_->SharedSocket()); > udp_port_->HandleIncomingPacket(socket, data, size, remote_addr, >- packet_time); >+ packet_time_us); > } > } > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/client/basicportallocator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/client/basicportallocator.h >index 8b951cae9e62cbf7eec1872b4b99fcc3289d458f..672f3ddb7c263b1a55fa31422b9f103a5be7b601 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/client/basicportallocator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/client/basicportallocator.h >@@ -22,11 +22,12 @@ > #include "rtc_base/checks.h" > #include "rtc_base/messagequeue.h" > #include "rtc_base/network.h" >+#include "rtc_base/system/rtc_export.h" > #include "rtc_base/thread.h" > > namespace cricket { > >-class BasicPortAllocator : public PortAllocator { >+class RTC_EXPORT BasicPortAllocator : public PortAllocator { > public: > // note: The (optional) relay_port_factory is owned by caller > // and must have a life time that exceeds that of BasicPortAllocator. 
>@@ -110,8 +111,8 @@ enum class SessionState { > // process will be started. > }; > >-class BasicPortAllocatorSession : public PortAllocatorSession, >- public rtc::MessageHandler { >+class RTC_EXPORT BasicPortAllocatorSession : public PortAllocatorSession, >+ public rtc::MessageHandler { > public: > BasicPortAllocatorSession(BasicPortAllocator* allocator, > const std::string& content_name, >@@ -235,6 +236,10 @@ class BasicPortAllocatorSession : public PortAllocatorSession, > > bool CheckCandidateFilter(const Candidate& c) const; > bool CandidatePairable(const Candidate& c, const Port* port) const; >+ >+ // Returns true if there is an mDNS responder attached to the network manager. >+ bool MdnsObfuscationEnabled() const; >+ > // Clears 1) the address if the candidate is supposedly a hostname candidate; > // 2) the related address according to the flags and candidate filter in order > // to avoid leaking any information. >@@ -274,7 +279,7 @@ class BasicPortAllocatorSession : public PortAllocatorSession, > > // Records configuration information useful in creating ports. > // TODO(deadbeef): Rename "relay" to "turn_server" in this struct. >-struct PortConfiguration : public rtc::MessageData { >+struct RTC_EXPORT PortConfiguration : public rtc::MessageData { > // TODO(jiayl): remove |stun_address| when Chrome is updated. 
> rtc::SocketAddress stun_address; > ServerAddresses stun_servers; >@@ -383,7 +388,7 @@ class AllocationSequence : public rtc::MessageHandler, > const char* data, > size_t size, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time); >+ const int64_t& packet_time_us); > > void OnPortDestroyed(PortInterface* port); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/client/basicportallocator_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/client/basicportallocator_unittest.cc >index a46e53ba9da6620e7b465d21ad2a59471e98821c..8943ebc73dba2db68a2fbcdbe40735786f4eaf0f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/client/basicportallocator_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/client/basicportallocator_unittest.cc >@@ -38,7 +38,6 @@ > > using rtc::IPAddress; > using rtc::SocketAddress; >-using rtc::Thread; > > #define MAYBE_SKIP_IPV4 \ > if (!rtc::HasIPv4Enabled()) { \ >@@ -147,15 +146,15 @@ class BasicPortAllocatorTestBase : public testing::Test, > // must be called. 
> nat_factory_(vss_.get(), kNatUdpAddr, kNatTcpAddr), > nat_socket_factory_(new rtc::BasicPacketSocketFactory(&nat_factory_)), >- stun_server_(TestStunServer::Create(Thread::Current(), kStunAddr)), >- relay_server_(Thread::Current(), >+ stun_server_(TestStunServer::Create(rtc::Thread::Current(), kStunAddr)), >+ relay_server_(rtc::Thread::Current(), > kRelayUdpIntAddr, > kRelayUdpExtAddr, > kRelayTcpIntAddr, > kRelayTcpExtAddr, > kRelaySslTcpIntAddr, > kRelaySslTcpExtAddr), >- turn_server_(Thread::Current(), kTurnUdpIntAddr, kTurnUdpExtAddr), >+ turn_server_(rtc::Thread::Current(), kTurnUdpIntAddr, kTurnUdpExtAddr), > candidate_allocation_done_(false) { > ServerAddresses stun_servers; > stun_servers.insert(kStunAddr); >@@ -2247,8 +2246,8 @@ TEST_F(BasicPortAllocatorTest, IceRegatheringMetricsLoggedWhenNetworkChanges) { > } > > // Test that when an mDNS responder is present, the local address of a host >-// candidate is masked by an mDNS hostname and the related address of any other >-// type of candidates is set to 0.0.0.0 or ::0. >+// candidate is concealed by an mDNS hostname and the related address of a srflx >+// candidate is set to 0.0.0.0 or ::0. > TEST_F(BasicPortAllocatorTest, HostCandidateAddressIsReplacedByHostname) { > // Default config uses GTURN and no NAT, so replace that with the > // desired setup (NAT, STUN server, TURN server, UDP/TCP). 
>@@ -2258,7 +2257,7 @@ TEST_F(BasicPortAllocatorTest, HostCandidateAddressIsReplacedByHostname) { > AddTurnServers(kTurnUdpIntIPv6Addr, kTurnTcpIntIPv6Addr); > > ASSERT_EQ(&network_manager_, allocator().network_manager()); >- network_manager_.CreateMDnsResponder(); >+ network_manager_.CreateMdnsResponder(); > AddInterface(kClientAddr); > ASSERT_TRUE(CreateSession(ICE_CANDIDATE_COMPONENT_RTP)); > session_->StartGettingPorts(); >@@ -2270,23 +2269,29 @@ TEST_F(BasicPortAllocatorTest, HostCandidateAddressIsReplacedByHostname) { > int num_srflx_candidates = 0; > int num_relay_candidates = 0; > for (const auto& candidate : candidates_) { >+ const auto& raddr = candidate.related_address(); >+ > if (candidate.type() == LOCAL_PORT_TYPE) { >- EXPECT_TRUE(candidate.address().IsUnresolvedIP()); >+ EXPECT_FALSE(candidate.address().hostname().empty()); >+ EXPECT_TRUE(raddr.IsNil()); > if (candidate.protocol() == UDP_PROTOCOL_NAME) { > ++num_host_udp_candidates; > } else { > ++num_host_tcp_candidates; > } >+ } else if (candidate.type() == STUN_PORT_TYPE) { >+ // For a srflx candidate, the related address should be set to 0.0.0.0 or >+ // ::0 >+ EXPECT_TRUE(IPIsAny(raddr.ipaddr())); >+ EXPECT_EQ(raddr.port(), 0); >+ ++num_srflx_candidates; >+ } else if (candidate.type() == RELAY_PORT_TYPE) { >+ EXPECT_EQ(kNatUdpAddr.ipaddr(), raddr.ipaddr()); >+ EXPECT_EQ(kNatUdpAddr.family(), raddr.family()); >+ ++num_relay_candidates; > } else { >- EXPECT_NE(PRFLX_PORT_TYPE, candidate.type()); >- // The related address should be set to 0.0.0.0 or ::0 for srflx and >- // relay candidates. 
>- EXPECT_EQ(rtc::SocketAddress(), candidate.related_address()); >- if (candidate.type() == STUN_PORT_TYPE) { >- ++num_srflx_candidates; >- } else { >- ++num_relay_candidates; >- } >+ // prflx candidates are not expected >+ FAIL(); > } > } > EXPECT_EQ(1, num_host_udp_candidates); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/client/turnportfactory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/client/turnportfactory.cc >index 6404134c4a19d8c5e1a0ee734a6b766e0fb4a778..c0c172076bc1eb7d22f75931a25a90504572c729 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/client/turnportfactory.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/client/turnportfactory.cc >@@ -11,6 +11,7 @@ > #include "p2p/client/turnportfactory.h" > > #include <memory> >+#include <utility> > > #include "p2p/base/turnport.h" > >@@ -21,26 +22,26 @@ TurnPortFactory::~TurnPortFactory() {} > std::unique_ptr<Port> TurnPortFactory::Create( > const CreateRelayPortArgs& args, > rtc::AsyncPacketSocket* udp_socket) { >- TurnPort* port = TurnPort::Create( >+ auto port = TurnPort::CreateUnique( > args.network_thread, args.socket_factory, args.network, udp_socket, > args.username, args.password, *args.server_address, > args.config->credentials, args.config->priority, args.origin, > args.turn_customizer); > port->SetTlsCertPolicy(args.config->tls_cert_policy); >- return std::unique_ptr<Port>(port); >+ return std::move(port); > } > > std::unique_ptr<Port> TurnPortFactory::Create(const CreateRelayPortArgs& args, > int min_port, > int max_port) { >- TurnPort* port = TurnPort::Create( >+ auto port = TurnPort::CreateUnique( > args.network_thread, args.socket_factory, args.network, min_port, > max_port, args.username, args.password, *args.server_address, > args.config->credentials, args.config->priority, args.origin, > args.config->tls_alpn_protocols, args.config->tls_elliptic_curves, > args.turn_customizer, args.config->tls_cert_verifier); > 
port->SetTlsCertPolicy(args.config->tls_cert_policy); >- return std::unique_ptr<Port>(port); >+ return std::move(port); > } > > } // namespace cricket >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/stunprober/stunprober.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/stunprober/stunprober.cc >index 51506b670d7518af371ad0acc0026ae2a3657716..9778416a1b0091205d27bddc67dfa5875c2a5c25 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/stunprober/stunprober.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/stunprober/stunprober.cc >@@ -77,7 +77,7 @@ class StunProber::Requester : public sigslot::has_slots<> { > const char* buf, > size_t size, > const rtc::SocketAddress& addr, >- const rtc::PacketTime& time); >+ const int64_t& packet_time_us); > > const std::vector<Request*>& requests() { return requests_; } > >@@ -204,7 +204,7 @@ void StunProber::Requester::OnStunResponseReceived( > const char* buf, > size_t size, > const rtc::SocketAddress& addr, >- const rtc::PacketTime& time) { >+ const int64_t& /* packet_time_us */) { > RTC_DCHECK(thread_checker_.CalledOnValidThread()); > RTC_DCHECK(socket_); > Request* request = GetRequestByAddress(addr.ipaddr()); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/stunprober/stunprober.h b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/stunprober/stunprober.h >index 9f1d42f32ce37b1bb2e5638772a64a8b61a13a71..a4feecd8132a9f35b49cb978c33b64550ee66cea 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/stunprober/stunprober.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/stunprober/stunprober.h >@@ -22,6 +22,7 @@ > #include "rtc_base/ipaddress.h" > #include "rtc_base/network.h" > #include "rtc_base/socketaddress.h" >+#include "rtc_base/system/rtc_export.h" > #include "rtc_base/thread.h" > #include "rtc_base/thread_checker.h" > >@@ -49,7 +50,7 @@ enum NatType { > NATTYPE_NON_SYMMETRIC // Behind a non-symmetric NAT. 
> }; > >-class StunProber : public sigslot::has_slots<> { >+class RTC_EXPORT StunProber : public sigslot::has_slots<> { > public: > enum Status { // Used in UMA_HISTOGRAM_ENUMERATION. > SUCCESS, // Successfully received bytes from the server. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/BUILD.gn >index 4e6e55810ad9c20b849de756c973747a7e638a36..65ab33a50b497d2dc2036c3650628b8d748bf8a6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/BUILD.gn >@@ -31,6 +31,7 @@ rtc_static_library("rtc_pc_base") { > sources = [ > "channel.cc", > "channel.h", >+ "channelinterface.h", > "channelmanager.cc", > "channelmanager.h", > "dtlssrtptransport.cc", >@@ -87,6 +88,7 @@ rtc_static_library("rtc_pc_base") { > "../rtc_base/third_party/sigslot", > "../system_wrappers:metrics", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings", > "//third_party/abseil-cpp/absl/types:optional", > ] > >@@ -203,6 +205,7 @@ rtc_static_library("peerconnection") { > "../rtc_base:rtc_base_approved", > "../rtc_base:stringutils", > "../rtc_base/experiments:congestion_controller_experiment", >+ "../rtc_base/system:rtc_export", > "../rtc_base/third_party/base64", > "../rtc_base/third_party/sigslot", > "../stats", >@@ -210,62 +213,11 @@ rtc_static_library("peerconnection") { > "../system_wrappers:field_trial", > "../system_wrappers:metrics", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings", > "//third_party/abseil-cpp/absl/types:optional", > ] > } > >-rtc_static_library("builtin_video_bitrate_allocator_factory") { >- sources = [ >- "builtin_video_bitrate_allocator_factory.cc", >- "builtin_video_bitrate_allocator_factory.h", >- ] >- >- deps = [ >- "../api/video:video_bitrate_allocator_factory", >- "../media:rtc_media_base", >- "../modules/video_coding:video_coding_utility", >- 
"../modules/video_coding:webrtc_vp9_helpers", >- "../rtc_base:ptr_util", >- "../rtc_base/system:fallthrough", >- "//third_party/abseil-cpp/absl/memory", >- ] >-} >- >-# This target implements CreatePeerConnectionFactory methods that will create a >-# PeerConnection will full functionality (audio, video and data). Applications >-# that wish to reduce their binary size by ommitting functionality they don't >-# need should use CreateModularCreatePeerConnectionFactory instead, using the >-# "peerconnection" build target and other targets specific to their >-# requrements. See comment in peerconnectionfactoryinterface.h. >-rtc_static_library("create_pc_factory") { >- sources = [ >- "createpeerconnectionfactory.cc", >- ] >- >- deps = [ >- "../api:callfactory_api", >- "../api:libjingle_peerconnection_api", >- "../api/audio:audio_mixer_api", >- "../api/audio_codecs:audio_codecs_api", >- "../api/video_codecs:video_codecs_api", >- "../call", >- "../call:call_interfaces", >- "../logging:rtc_event_log_api", >- "../logging:rtc_event_log_impl_base", >- "../media:rtc_audio_video", >- "../media:rtc_media_base", >- "../modules/audio_device:audio_device", >- "../modules/audio_processing:audio_processing", >- "../rtc_base:rtc_base", >- "../rtc_base:rtc_base_approved", >- ] >- >- if (!build_with_chromium && is_clang) { >- # Suppress warnings from the Chromium Clang plugin (bugs.webrtc.org/163). >- suppressed_configs += [ "//build/config/clang:find_bad_constructs" ] >- } >-} >- > rtc_source_set("libjingle_peerconnection") { > visibility = [ "*" ] > allow_poison = [ >@@ -273,8 +225,8 @@ rtc_source_set("libjingle_peerconnection") { > "software_video_codecs", # TODO(bugs.webrtc.org/7925): Remove. 
> ] > deps = [ >- ":create_pc_factory", > ":peerconnection", >+ "../api:create_peerconnection_factory", > "../api:libjingle_peerconnection_api", > ] > } >@@ -293,6 +245,7 @@ if (rtc_include_tests) { > "rtcpmuxfilter_unittest.cc", > "rtptransport_unittest.cc", > "rtptransporttestutil.h", >+ "sessiondescription_unittest.cc", > "srtpfilter_unittest.cc", > "srtpsession_unittest.cc", > "srtptestutil.h", >@@ -316,6 +269,7 @@ if (rtc_include_tests) { > ":rtc_pc", > ":rtc_pc_base", > "../api:array_view", >+ "../api:fake_media_transport", > "../api:libjingle_peerconnection_api", > "../call:rtp_interfaces", > "../logging:rtc_event_log_api", >@@ -325,6 +279,7 @@ if (rtc_include_tests) { > "../p2p:p2p_test_utils", > "../p2p:rtc_p2p", > "../rtc_base:checks", >+ "../rtc_base:gunit_helpers", > "../rtc_base:rtc_base", > "../rtc_base:rtc_base_approved", > "../rtc_base:rtc_base_tests_main", >@@ -353,6 +308,7 @@ if (rtc_include_tests) { > ] > deps = [ > ":pc_test_utils", >+ "../api:create_peerconnection_factory", > "../api:libjingle_peerconnection_api", > "../api:rtc_stats_api", > "../api/audio_codecs:builtin_audio_decoder_factory", >@@ -363,6 +319,7 @@ if (rtc_include_tests) { > "../p2p:p2p_test_utils", > "../p2p:rtc_p2p", > "../pc:peerconnection", >+ "../rtc_base:gunit_helpers", > "../rtc_base:rtc_base", > "../rtc_base:rtc_base_approved", > "../rtc_base:rtc_base_tests_utils", >@@ -391,6 +348,7 @@ if (rtc_include_tests) { > "test/fakevideotrackrenderer.h", > "test/fakevideotracksource.h", > "test/framegeneratorcapturervideotracksource.h", >+ "test/mock_channelinterface.h", > "test/mock_datachannel.h", > "test/mock_rtpreceiverinternal.h", > "test/mock_rtpsenderinternal.h", >@@ -406,6 +364,7 @@ if (rtc_include_tests) { > ":peerconnection", > ":rtc_pc_base", > "..:webrtc_common", >+ "../api:create_peerconnection_factory", > "../api:libjingle_peerconnection_api", > "../api:libjingle_peerconnection_test_api", > "../api:rtc_stats_api", >@@ -419,12 +378,13 @@ if (rtc_include_tests) { > 
"../media:rtc_media_base", > "../media:rtc_media_tests_utils", > "../modules/audio_device:audio_device", >+ "../modules/audio_processing:api", > "../modules/audio_processing:audio_processing", > "../p2p:p2p_test_utils", > "../rtc_base:checks", >+ "../rtc_base:gunit_helpers", > "../rtc_base:rtc_base", > "../rtc_base:rtc_base_approved", >- "../rtc_base:rtc_base_tests_utils", > "../rtc_base:rtc_task_queue", > "../rtc_base/third_party/sigslot", > "../test:test_support", >@@ -470,6 +430,7 @@ if (rtc_include_tests) { > "rtpmediautils_unittest.cc", > "rtpparametersconversion_unittest.cc", > "rtpsenderreceiver_unittest.cc", >+ "rtptransceiver_unittest.cc", > "sctputils_unittest.cc", > "statscollector_unittest.cc", > "test/fakeaudiocapturemodule_unittest.cc", >@@ -492,17 +453,23 @@ if (rtc_include_tests) { > deps = [ > ":peerconnection", > ":rtc_pc_base", >- "../api:fake_frame_crypto", >+ "../api:create_peerconnection_factory", >+ "../api:fake_frame_decryptor", >+ "../api:fake_frame_encryptor", > "../api:libjingle_peerconnection_api", >+ "../api:loopback_media_transport", > "../api:mock_rtp", > "../api/units:time_delta", > "../logging:fake_rtc_event_log", > "../rtc_base:checks", >+ "../rtc_base:gunit_helpers", >+ "../rtc_base:rtc_base_tests_utils", > "../rtc_base:stringutils", > "../rtc_base/third_party/base64", > "../system_wrappers:metrics", > "../test:fileutils", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings", > ] > if (is_android) { > deps += [ ":android_black_magic" ] >@@ -513,6 +480,7 @@ if (rtc_include_tests) { > ":pc_test_utils", > "..:webrtc_common", > "../api:callfactory_api", >+ "../api:fake_media_transport", > "../api:libjingle_peerconnection_test_api", > "../api:rtc_stats_api", > "../api/audio_codecs:audio_codecs_api", >@@ -531,6 +499,7 @@ if (rtc_include_tests) { > "../media:rtc_data", # TODO(phoglund): AFAIK only used for one sctp constant. 
> "../media:rtc_media_base", > "../media:rtc_media_tests_utils", >+ "../modules/audio_processing:api", > "../modules/audio_processing:audio_processing", > "../modules/utility:utility", > "../p2p:p2p_test_utils", >@@ -539,7 +508,6 @@ if (rtc_include_tests) { > "../rtc_base:rtc_base", > "../rtc_base:rtc_base_approved", > "../rtc_base:rtc_base_tests_main", >- "../rtc_base:rtc_base_tests_utils", > "../rtc_base:rtc_task_queue", > "../rtc_base:safe_conversions", > "../test:audio_codec_mocks", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/builtin_video_bitrate_allocator_factory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/builtin_video_bitrate_allocator_factory.cc >deleted file mode 100644 >index 46d7daea6fa98af7f00a9bfff415f999f8b79e34..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/builtin_video_bitrate_allocator_factory.cc >+++ /dev/null >@@ -1,56 +0,0 @@ >-/* >- * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include "pc/builtin_video_bitrate_allocator_factory.h" >- >-#include "absl/memory/memory.h" >-#include "media/base/codec.h" >-#include "modules/video_coding/codecs/vp9/svc_rate_allocator.h" >-#include "modules/video_coding/utility/default_video_bitrate_allocator.h" >-#include "modules/video_coding/utility/simulcast_rate_allocator.h" >-#include "rtc_base/system/fallthrough.h" >- >-namespace webrtc { >- >-namespace { >- >-class BuiltinVideoBitrateAllocatorFactory >- : public VideoBitrateAllocatorFactory { >- public: >- BuiltinVideoBitrateAllocatorFactory() = default; >- ~BuiltinVideoBitrateAllocatorFactory() override = default; >- >- std::unique_ptr<VideoBitrateAllocator> CreateVideoBitrateAllocator( >- const VideoCodec& codec) override { >- std::unique_ptr<VideoBitrateAllocator> rate_allocator; >- switch (codec.codecType) { >- case kVideoCodecVP8: >- RTC_FALLTHROUGH(); >- case kVideoCodecH264: >- rate_allocator.reset(new SimulcastRateAllocator(codec)); >- break; >- case kVideoCodecVP9: >- rate_allocator.reset(new SvcRateAllocator(codec)); >- break; >- default: >- rate_allocator.reset(new DefaultVideoBitrateAllocator(codec)); >- } >- return rate_allocator; >- } >-}; >- >-} // namespace >- >-std::unique_ptr<VideoBitrateAllocatorFactory> >-CreateBuiltinVideoBitrateAllocatorFactory() { >- return absl::make_unique<BuiltinVideoBitrateAllocatorFactory>(); >-} >- >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/builtin_video_bitrate_allocator_factory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/builtin_video_bitrate_allocator_factory.h >deleted file mode 100644 >index 60f2afcbaba127814d635bae42f003ad3020fe47..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/builtin_video_bitrate_allocator_factory.h >+++ /dev/null >@@ -1,25 +0,0 @@ >-/* >- * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. 
>- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef PC_BUILTIN_VIDEO_BITRATE_ALLOCATOR_FACTORY_H_ >-#define PC_BUILTIN_VIDEO_BITRATE_ALLOCATOR_FACTORY_H_ >- >-#include <memory> >- >-#include "api/video/video_bitrate_allocator_factory.h" >- >-namespace webrtc { >- >-std::unique_ptr<VideoBitrateAllocatorFactory> >-CreateBuiltinVideoBitrateAllocatorFactory(); >- >-} // namespace webrtc >- >-#endif // PC_BUILTIN_VIDEO_BITRATE_ALLOCATOR_FACTORY_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/bundlefilter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/bundlefilter.cc >deleted file mode 100644 >index 201d2bf95cab547162092d9197e7910981ceefd6..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/bundlefilter.cc >+++ /dev/null >@@ -1,47 +0,0 @@ >-/* >- * Copyright 2004 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include "pc/bundlefilter.h" >- >-#include "media/base/rtputils.h" >-#include "rtc_base/logging.h" >- >-namespace cricket { >- >-BundleFilter::BundleFilter() {} >- >-BundleFilter::~BundleFilter() {} >- >-bool BundleFilter::DemuxPacket(const uint8_t* data, size_t len) { >- // For RTP packets, we check whether the payload type can be found. 
>- if (!IsRtpPacket(data, len)) { >- return false; >- } >- >- int payload_type = 0; >- if (!GetRtpPayloadType(data, len, &payload_type)) { >- return false; >- } >- return FindPayloadType(payload_type); >-} >- >-void BundleFilter::AddPayloadType(int payload_type) { >- payload_types_.insert(payload_type); >-} >- >-bool BundleFilter::FindPayloadType(int pl_type) const { >- return payload_types_.find(pl_type) != payload_types_.end(); >-} >- >-void BundleFilter::ClearAllPayloadTypes() { >- payload_types_.clear(); >-} >- >-} // namespace cricket >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/bundlefilter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/bundlefilter.h >deleted file mode 100644 >index de477f457f2948bb4341573eabdce2d1ff899c50..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/bundlefilter.h >+++ /dev/null >@@ -1,54 +0,0 @@ >-/* >- * Copyright 2004 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef PC_BUNDLEFILTER_H_ >-#define PC_BUNDLEFILTER_H_ >- >-#include <stdint.h> >- >-#include <set> >-#include <vector> >- >-#include "media/base/streamparams.h" >-//#include "rtc_base/basictypes.h" >- >-namespace cricket { >- >-// In case of single RTP session and single transport channel, all session >-// (or media) channels share a common transport channel. Hence they all get >-// SignalReadPacket when packet received on transport channel. This requires >-// cricket::BaseChannel to know all the valid sources, else media channel >-// will decode invalid packets. >-// >-// This class determines whether a packet is destined for cricket::BaseChannel. 
>-// This is only to be used for RTP packets as RTCP packets are not filtered. >-// For RTP packets, this is decided based on the payload type. >-class BundleFilter { >- public: >- BundleFilter(); >- ~BundleFilter(); >- >- // Determines if a RTP packet belongs to valid cricket::BaseChannel. >- bool DemuxPacket(const uint8_t* data, size_t len); >- >- // Adds the supported payload type. >- void AddPayloadType(int payload_type); >- >- // Public for unittests. >- bool FindPayloadType(int pl_type) const; >- void ClearAllPayloadTypes(); >- >- private: >- std::set<int> payload_types_; >-}; >- >-} // namespace cricket >- >-#endif // PC_BUNDLEFILTER_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/bundlefilter_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/bundlefilter_unittest.cc >deleted file mode 100644 >index e7cadbfbac613c6429a8be277e21f375cfc4bd71..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/bundlefilter_unittest.cc >+++ /dev/null >@@ -1,73 +0,0 @@ >-/* >- * Copyright 2004 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include "pc/bundlefilter.h" >-#include "rtc_base/gunit.h" >- >-using cricket::StreamParams; >- >-static const int kPayloadType1 = 0x11; >-static const int kPayloadType2 = 0x22; >-static const int kPayloadType3 = 0x33; >- >-// SSRC = 0x1111, Payload type = 0x11 >-static const unsigned char kRtpPacketPt1Ssrc1[] = { >- 0x80, kPayloadType1, 0x00, 0x01, 0x00, 0x00, >- 0x00, 0x00, 0x00, 0x00, 0x11, 0x11, >-}; >- >-// SSRC = 0x2222, Payload type = 0x22 >-static const unsigned char kRtpPacketPt2Ssrc2[] = { >- 0x80, 0x80 + kPayloadType2, >- 0x00, 0x01, >- 0x00, 0x00, >- 0x00, 0x00, >- 0x00, 0x00, >- 0x22, 0x22, >-}; >- >-// SSRC = 0x2222, Payload type = 0x33 >-static const unsigned char kRtpPacketPt3Ssrc2[] = { >- 0x80, kPayloadType3, 0x00, 0x01, 0x00, 0x00, >- 0x00, 0x00, 0x00, 0x00, 0x22, 0x22, >-}; >- >-// An SCTP packet. >-static const unsigned char kSctpPacket[] = { >- 0x00, 0x01, 0x00, 0x01, 0xff, 0xff, 0xff, 0xff, 0x00, 0x00, >- 0x00, 0x00, 0x03, 0x00, 0x00, 0x04, 0x00, 0x00, 0x00, 0x00, >-}; >- >-TEST(BundleFilterTest, RtpPacketTest) { >- cricket::BundleFilter bundle_filter; >- bundle_filter.AddPayloadType(kPayloadType1); >- EXPECT_TRUE(bundle_filter.DemuxPacket(kRtpPacketPt1Ssrc1, >- sizeof(kRtpPacketPt1Ssrc1))); >- bundle_filter.AddPayloadType(kPayloadType2); >- EXPECT_TRUE(bundle_filter.DemuxPacket(kRtpPacketPt2Ssrc2, >- sizeof(kRtpPacketPt2Ssrc2))); >- >- // Payload type 0x33 is not added. >- EXPECT_FALSE(bundle_filter.DemuxPacket(kRtpPacketPt3Ssrc2, >- sizeof(kRtpPacketPt3Ssrc2))); >- // Size is too small. 
>- EXPECT_FALSE(bundle_filter.DemuxPacket(kRtpPacketPt1Ssrc1, 11)); >- >- bundle_filter.ClearAllPayloadTypes(); >- EXPECT_FALSE(bundle_filter.DemuxPacket(kRtpPacketPt1Ssrc1, >- sizeof(kRtpPacketPt1Ssrc1))); >- EXPECT_FALSE(bundle_filter.DemuxPacket(kRtpPacketPt2Ssrc2, >- sizeof(kRtpPacketPt2Ssrc2))); >-} >- >-TEST(BundleFilterTest, InvalidRtpPacket) { >- cricket::BundleFilter bundle_filter; >- EXPECT_FALSE(bundle_filter.DemuxPacket(kSctpPacket, sizeof(kSctpPacket))); >-} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channel.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channel.cc >index b63cea7cecf3021c362a7a72d7846c7e2132acfa..67d698487ce4f89bcd2f9236b7cef024f6a9d4b5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channel.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channel.cc >@@ -93,6 +93,7 @@ void RtpSendParametersFromMediaDescription( > RtpSendParameters<Codec>* send_params) { > RtpParametersFromMediaDescription(desc, extensions, send_params); > send_params->max_bandwidth_bps = desc->bandwidth(); >+ send_params->extmap_allow_mixed = desc->extmap_allow_mixed(); > } > > BaseChannel::BaseChannel(rtc::Thread* worker_thread, >@@ -101,7 +102,7 @@ BaseChannel::BaseChannel(rtc::Thread* worker_thread, > std::unique_ptr<MediaChannel> media_channel, > const std::string& content_name, > bool srtp_required, >- rtc::CryptoOptions crypto_options) >+ webrtc::CryptoOptions crypto_options) > : worker_thread_(worker_thread), > network_thread_(network_thread), > signaling_thread_(signaling_thread), >@@ -117,6 +118,11 @@ BaseChannel::BaseChannel(rtc::Thread* worker_thread, > BaseChannel::~BaseChannel() { > TRACE_EVENT0("webrtc", "BaseChannel::~BaseChannel"); > RTC_DCHECK_RUN_ON(worker_thread_); >+ >+ if (media_transport_) { >+ media_transport_->SetNetworkChangeCallback(nullptr); >+ } >+ > // Eats any outstanding messages or packets. 
> worker_thread_->Clear(&invoker_); > worker_thread_->Clear(this); >@@ -136,8 +142,15 @@ bool BaseChannel::ConnectToRtpTransport() { > this, &BaseChannel::OnTransportReadyToSend); > rtp_transport_->SignalRtcpPacketReceived.connect( > this, &BaseChannel::OnRtcpPacketReceived); >- rtp_transport_->SignalNetworkRouteChanged.connect( >- this, &BaseChannel::OnNetworkRouteChanged); >+ >+ // If media transport is used, it's responsible for providing network >+ // route changed callbacks. >+ if (!media_transport_) { >+ rtp_transport_->SignalNetworkRouteChanged.connect( >+ this, &BaseChannel::OnNetworkRouteChanged); >+ } >+ // TODO(bugs.webrtc.org/9719): Media transport should also be used to provide >+ // 'writable' state here. > rtp_transport_->SignalWritableState.connect(this, > &BaseChannel::OnWritableState); > rtp_transport_->SignalSentPacket.connect(this, >@@ -155,19 +168,29 @@ void BaseChannel::DisconnectFromRtpTransport() { > rtp_transport_->SignalSentPacket.disconnect(this); > } > >-void BaseChannel::Init_w(webrtc::RtpTransportInternal* rtp_transport) { >+void BaseChannel::Init_w(webrtc::RtpTransportInternal* rtp_transport, >+ webrtc::MediaTransportInterface* media_transport) { > RTC_DCHECK_RUN_ON(worker_thread_); >+ media_transport_ = media_transport; >+ > network_thread_->Invoke<void>( > RTC_FROM_HERE, [this, rtp_transport] { SetRtpTransport(rtp_transport); }); > > // Both RTP and RTCP channels should be set, we can call SetInterface on > // the media channel and it can set network options. 
>- media_channel_->SetInterface(this); >+ media_channel_->SetInterface(this, media_transport); >+ >+ RTC_LOG(LS_INFO) << "BaseChannel::Init_w, media_transport=" >+ << (media_transport_ != nullptr); >+ if (media_transport_) { >+ media_transport_->SetNetworkChangeCallback(this); >+ } > } > > void BaseChannel::Deinit() { > RTC_DCHECK(worker_thread_->IsCurrent()); >- media_channel_->SetInterface(NULL); >+ media_channel_->SetInterface(/*iface=*/nullptr, >+ /*media_transport=*/nullptr); > // Packets arrive on the network thread, processing packets calls virtual > // functions, so need to stop this process in Deinit that is called in > // derived classes destructor. >@@ -350,6 +373,8 @@ void BaseChannel::OnWritableState(bool writable) { > > void BaseChannel::OnNetworkRouteChanged( > absl::optional<rtc::NetworkRoute> network_route) { >+ RTC_LOG(LS_INFO) << "Network route was changed."; >+ > RTC_DCHECK(network_thread_->IsCurrent()); > rtc::NetworkRoute new_route; > if (network_route) { >@@ -439,13 +464,12 @@ void BaseChannel::OnRtpPacket(const webrtc::RtpPacketReceived& parsed_packet) { > // RtpPacketReceived.arrival_time_ms = (PacketTime + 500) / 1000; > // Note: The |not_before| field is always 0 here. This field is not currently > // used, so it should be fine. 
>- int64_t timestamp = -1; >+ int64_t timestamp_us = -1; > if (parsed_packet.arrival_time_ms() > 0) { >- timestamp = parsed_packet.arrival_time_ms() * 1000; >+ timestamp_us = parsed_packet.arrival_time_ms() * 1000; > } >- rtc::PacketTime packet_time(timestamp, /*not_before=*/0); > >- OnPacketReceived(/*rtcp=*/false, parsed_packet.Buffer(), packet_time); >+ OnPacketReceived(/*rtcp=*/false, parsed_packet.Buffer(), timestamp_us); > } > > void BaseChannel::UpdateRtpHeaderExtensionMap( >@@ -472,13 +496,13 @@ bool BaseChannel::RegisterRtpDemuxerSink() { > } > > void BaseChannel::OnRtcpPacketReceived(rtc::CopyOnWriteBuffer* packet, >- const rtc::PacketTime& packet_time) { >- OnPacketReceived(/*rtcp=*/true, *packet, packet_time); >+ int64_t packet_time_us) { >+ OnPacketReceived(/*rtcp=*/true, *packet, packet_time_us); > } > > void BaseChannel::OnPacketReceived(bool rtcp, > const rtc::CopyOnWriteBuffer& packet, >- const rtc::PacketTime& packet_time) { >+ int64_t packet_time_us) { > if (!has_received_packet_ && !rtcp) { > has_received_packet_ = true; > signaling_thread()->Post(RTC_FROM_HERE, this, MSG_FIRSTPACKETRECEIVED); >@@ -504,21 +528,21 @@ void BaseChannel::OnPacketReceived(bool rtcp, > > invoker_.AsyncInvoke<void>( > RTC_FROM_HERE, worker_thread_, >- Bind(&BaseChannel::ProcessPacket, this, rtcp, packet, packet_time)); >+ Bind(&BaseChannel::ProcessPacket, this, rtcp, packet, packet_time_us)); > } > > void BaseChannel::ProcessPacket(bool rtcp, > const rtc::CopyOnWriteBuffer& packet, >- const rtc::PacketTime& packet_time) { >+ int64_t packet_time_us) { > RTC_DCHECK(worker_thread_->IsCurrent()); > > // Need to copy variable because OnRtcpReceived/OnPacketReceived > // requires non-const pointer to buffer. This doesn't memcpy the actual data. 
> rtc::CopyOnWriteBuffer data(packet); > if (rtcp) { >- media_channel_->OnRtcpReceived(&data, packet_time); >+ media_channel_->OnRtcpReceived(&data, packet_time_us); > } else { >- media_channel_->OnPacketReceived(&data, packet_time); >+ media_channel_->OnPacketReceived(&data, packet_time_us); > } > } > >@@ -673,7 +697,7 @@ bool BaseChannel::UpdateRemoteStreams_w( > RtpHeaderExtensions BaseChannel::GetFilteredRtpHeaderExtensions( > const RtpHeaderExtensions& extensions) { > RTC_DCHECK(rtp_transport_); >- if (crypto_options_.enable_encrypted_rtp_header_extensions) { >+ if (crypto_options_.srtp.enable_encrypted_rtp_header_extensions) { > RtpHeaderExtensions filtered; > auto pred = [](const webrtc::RtpExtension& extension) { > return !extension.encrypt; >@@ -700,7 +724,7 @@ void BaseChannel::OnMessage(rtc::Message* pmsg) { > break; > } > case MSG_FIRSTPACKETRECEIVED: { >- SignalFirstPacketReceived(this); >+ SignalFirstPacketReceived_(this); > break; > } > } >@@ -742,7 +766,7 @@ VoiceChannel::VoiceChannel(rtc::Thread* worker_thread, > std::unique_ptr<VoiceMediaChannel> media_channel, > const std::string& content_name, > bool srtp_required, >- rtc::CryptoOptions crypto_options) >+ webrtc::CryptoOptions crypto_options) > : BaseChannel(worker_thread, > network_thread, > signaling_thread, >@@ -765,6 +789,11 @@ void BaseChannel::UpdateMediaSendRecvState() { > Bind(&BaseChannel::UpdateMediaSendRecvState_w, this)); > } > >+void BaseChannel::OnNetworkRouteChanged( >+ const rtc::NetworkRoute& network_route) { >+ OnNetworkRouteChanged(absl::make_optional(network_route)); >+} >+ > void VoiceChannel::UpdateMediaSendRecvState_w() { > // Render incoming data if we're the active call, and we have the local > // content. We receive data on the default channel and multiplexed streams. 
>@@ -797,6 +826,7 @@ bool VoiceChannel::SetLocalContent_w(const MediaContentDescription* content, > RtpHeaderExtensions rtp_header_extensions = > GetFilteredRtpHeaderExtensions(audio->rtp_header_extensions()); > UpdateRtpHeaderExtensionMap(rtp_header_extensions); >+ media_channel()->SetExtmapAllowMixed(audio->extmap_allow_mixed()); > > AudioRecvParameters recv_params = last_recv_params_; > RtpParametersFromMediaDescription(audio, rtp_header_extensions, &recv_params); >@@ -881,7 +911,7 @@ VideoChannel::VideoChannel(rtc::Thread* worker_thread, > std::unique_ptr<VideoMediaChannel> media_channel, > const std::string& content_name, > bool srtp_required, >- rtc::CryptoOptions crypto_options) >+ webrtc::CryptoOptions crypto_options) > : BaseChannel(worker_thread, > network_thread, > signaling_thread, >@@ -932,6 +962,7 @@ bool VideoChannel::SetLocalContent_w(const MediaContentDescription* content, > RtpHeaderExtensions rtp_header_extensions = > GetFilteredRtpHeaderExtensions(video->rtp_header_extensions()); > UpdateRtpHeaderExtensionMap(rtp_header_extensions); >+ media_channel()->SetExtmapAllowMixed(video->extmap_allow_mixed()); > > VideoRecvParameters recv_params = last_recv_params_; > RtpParametersFromMediaDescription(video, rtp_header_extensions, &recv_params); >@@ -1019,7 +1050,7 @@ RtpDataChannel::RtpDataChannel(rtc::Thread* worker_thread, > std::unique_ptr<DataMediaChannel> media_channel, > const std::string& content_name, > bool srtp_required, >- rtc::CryptoOptions crypto_options) >+ webrtc::CryptoOptions crypto_options) > : BaseChannel(worker_thread, > network_thread, > signaling_thread, >@@ -1036,7 +1067,7 @@ RtpDataChannel::~RtpDataChannel() { > } > > void RtpDataChannel::Init_w(webrtc::RtpTransportInternal* rtp_transport) { >- BaseChannel::Init_w(rtp_transport); >+ BaseChannel::Init_w(rtp_transport, /*media_transport=*/nullptr); > media_channel()->SignalDataReceived.connect(this, > &RtpDataChannel::OnDataReceived); > media_channel()->SignalReadyToSend.connect( 
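The channel.cc hunks above replace the `rtc::PacketTime` struct with a raw `int64_t` microsecond timestamp threaded through `OnPacketReceived`/`ProcessPacket`. The conversion applied in `BaseChannel::OnRtpPacket` can be sketched in isolation; the free-function name below is illustrative, not part of the patch:

```cpp
#include <cassert>
#include <cstdint>

// Mirrors the conversion in BaseChannel::OnRtpPacket after this update:
// PacketTime is gone, and a bare microsecond timestamp is passed instead,
// with -1 meaning "no arrival time available".
int64_t PacketTimeUs(int64_t arrival_time_ms) {
  int64_t timestamp_us = -1;
  if (arrival_time_ms > 0) {
    timestamp_us = arrival_time_ms * 1000;
  }
  return timestamp_us;
}
```

Note that the `not_before` field of the old struct was always 0 here (as the in-diff comment says), which is why a single integer can replace it without losing information.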
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channel.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channel.h >index ba584690ba6ad07557bc196a160330e1d844f507..6264bb39041ae7d3d4a1a5e7270f4348c234a27e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channel.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channel.h >@@ -29,6 +29,7 @@ > #include "media/base/streamparams.h" > #include "p2p/base/dtlstransportinternal.h" > #include "p2p/base/packettransportinternal.h" >+#include "pc/channelinterface.h" > #include "pc/dtlssrtptransport.h" > #include "pc/mediasession.h" > #include "pc/rtptransport.h" >@@ -42,12 +43,12 @@ > > namespace webrtc { > class AudioSinkInterface; >+class MediaTransportInterface; > } // namespace webrtc > > namespace cricket { > > struct CryptoParams; >-class MediaContentDescription; > > // BaseChannel contains logic common to voice and video, including enable, > // marshaling calls to a worker and network threads, and connection and media >@@ -67,10 +68,12 @@ class MediaContentDescription; > // vtable, and the media channel's thread using BaseChannel as the > // NetworkInterface. > >-class BaseChannel : public rtc::MessageHandler, >+class BaseChannel : public ChannelInterface, >+ public rtc::MessageHandler, > public sigslot::has_slots<>, > public MediaChannel::NetworkInterface, >- public webrtc::RtpPacketSinkInterface { >+ public webrtc::RtpPacketSinkInterface, >+ public webrtc::MediaTransportNetworkChangeCallback { > public: > // If |srtp_required| is true, the channel will not send or receive any > // RTP/RTCP packets without using SRTP (either using SDES or DTLS-SRTP). 
>@@ -82,9 +85,10 @@ class BaseChannel : public rtc::MessageHandler, > std::unique_ptr<MediaChannel> media_channel, > const std::string& content_name, > bool srtp_required, >- rtc::CryptoOptions crypto_options); >+ webrtc::CryptoOptions crypto_options); > virtual ~BaseChannel(); >- void Init_w(webrtc::RtpTransportInternal* rtp_transport); >+ void Init_w(webrtc::RtpTransportInternal* rtp_transport, >+ webrtc::MediaTransportInterface* media_transport); > > // Deinit may be called multiple times and is simply ignored if it's already > // done. >@@ -92,10 +96,10 @@ class BaseChannel : public rtc::MessageHandler, > > rtc::Thread* worker_thread() const { return worker_thread_; } > rtc::Thread* network_thread() const { return network_thread_; } >- const std::string& content_name() const { return content_name_; } >+ const std::string& content_name() const override { return content_name_; } > // TODO(deadbeef): This is redundant; remove this. >- const std::string& transport_name() const { return transport_name_; } >- bool enabled() const { return enabled_; } >+ const std::string& transport_name() const override { return transport_name_; } >+ bool enabled() const override { return enabled_; } > > // This function returns true if using SRTP (DTLS-based keying or SDES). > bool srtp_active() const { >@@ -108,17 +112,17 @@ class BaseChannel : public rtc::MessageHandler, > // encryption, an SrtpTransport for SDES or a DtlsSrtpTransport for DTLS-SRTP. > // This can be called from any thread and it hops to the network thread > // internally. It would replace the |SetTransports| and its variants. 
>- bool SetRtpTransport(webrtc::RtpTransportInternal* rtp_transport); >+ bool SetRtpTransport(webrtc::RtpTransportInternal* rtp_transport) override; > > // Channel control > bool SetLocalContent(const MediaContentDescription* content, > webrtc::SdpType type, >- std::string* error_desc); >+ std::string* error_desc) override; > bool SetRemoteContent(const MediaContentDescription* content, > webrtc::SdpType type, >- std::string* error_desc); >+ std::string* error_desc) override; > >- bool Enable(bool enable); >+ bool Enable(bool enable) override; > > // TODO(zhihuang): These methods are used for testing and can be removed. > bool AddRecvStream(const StreamParams& sp); >@@ -138,7 +142,9 @@ class BaseChannel : public rtc::MessageHandler, > void SignalDtlsSrtpSetupFailure_s(bool rtcp); > > // Used for latency measurements. >- sigslot::signal1<BaseChannel*> SignalFirstPacketReceived; >+ sigslot::signal1<ChannelInterface*>& SignalFirstPacketReceived() override { >+ return SignalFirstPacketReceived_; >+ } > > // Forward SignalSentPacket to worker thread. > sigslot::signal1<const rtc::SentPacket&> SignalSentPacket; >@@ -162,6 +168,11 @@ class BaseChannel : public rtc::MessageHandler, > return nullptr; > } > >+ // Returns media transport, can be null if media transport is not available. >+ webrtc::MediaTransportInterface* media_transport() { >+ return media_transport_; >+ } >+ > // From RtpTransport - public for testing only > void OnTransportReadyToSend(bool ready); > >@@ -169,8 +180,6 @@ class BaseChannel : public rtc::MessageHandler, > int SetOption(SocketType type, rtc::Socket::Option o, int val) override; > int SetOption_n(SocketType type, rtc::Socket::Option o, int val); > >- virtual cricket::MediaType media_type() = 0; >- > // RtpPacketSinkInterface overrides. 
> void OnRtpPacket(const webrtc::RtpPacketReceived& packet) override; > >@@ -180,9 +189,9 @@ class BaseChannel : public rtc::MessageHandler, > transport_name_ = transport_name; > } > >- protected: >- virtual MediaChannel* media_channel() const { return media_channel_.get(); } >+ MediaChannel* media_channel() const override { return media_channel_.get(); } > >+ protected: > bool was_ever_writable() const { return was_ever_writable_; } > void set_local_content_direction(webrtc::RtpTransceiverDirection direction) { > local_content_direction_ = direction; >@@ -225,14 +234,14 @@ class BaseChannel : public rtc::MessageHandler, > const rtc::PacketOptions& options); > > void OnRtcpPacketReceived(rtc::CopyOnWriteBuffer* packet, >- const rtc::PacketTime& packet_time); >+ int64_t packet_time_us); > > void OnPacketReceived(bool rtcp, > const rtc::CopyOnWriteBuffer& packet, >- const rtc::PacketTime& packet_time); >+ int64_t packet_time_us); > void ProcessPacket(bool rtcp, > const rtc::CopyOnWriteBuffer& packet, >- const rtc::PacketTime& packet_time); >+ int64_t packet_time_us); > > void EnableMedia_w(); > void DisableMedia_w(); >@@ -295,10 +304,14 @@ class BaseChannel : public rtc::MessageHandler, > void SignalSentPacket_n(const rtc::SentPacket& sent_packet); > void SignalSentPacket_w(const rtc::SentPacket& sent_packet); > bool IsReadyToSendMedia_n() const; >+ >+ // MediaTransportNetworkChangeCallback override. >+ void OnNetworkRouteChanged(const rtc::NetworkRoute& network_route) override; > rtc::Thread* const worker_thread_; > rtc::Thread* const network_thread_; > rtc::Thread* const signaling_thread_; > rtc::AsyncInvoker invoker_; >+ sigslot::signal1<ChannelInterface*> SignalFirstPacketReceived_; > > const std::string content_name_; > >@@ -307,13 +320,18 @@ class BaseChannel : public rtc::MessageHandler, > > webrtc::RtpTransportInternal* rtp_transport_ = nullptr; > >+ // Optional media transport (experimental). 
>+ // If provided, audio and video will be sent through media_transport instead >+ // of RTP/RTCP. Currently media_transport can co-exist with rtp_transport. >+ webrtc::MediaTransportInterface* media_transport_ = nullptr; >+ > std::vector<std::pair<rtc::Socket::Option, int> > socket_options_; > std::vector<std::pair<rtc::Socket::Option, int> > rtcp_socket_options_; > bool writable_ = false; > bool was_ever_writable_ = false; > bool has_received_packet_ = false; > const bool srtp_required_ = true; >- rtc::CryptoOptions crypto_options_; >+ webrtc::CryptoOptions crypto_options_; > > // MediaChannel related members that should be accessed from the worker > // thread. >@@ -343,7 +361,7 @@ class VoiceChannel : public BaseChannel { > std::unique_ptr<VoiceMediaChannel> channel, > const std::string& content_name, > bool srtp_required, >- rtc::CryptoOptions crypto_options); >+ webrtc::CryptoOptions crypto_options); > ~VoiceChannel(); > > // downcasts a MediaChannel >@@ -351,10 +369,9 @@ class VoiceChannel : public BaseChannel { > return static_cast<VoiceMediaChannel*>(BaseChannel::media_channel()); > } > >- webrtc::RtpParameters GetRtpSendParameters_w(uint32_t ssrc) const; >- webrtc::RTCError SetRtpSendParameters_w(uint32_t ssrc, >- webrtc::RtpParameters parameters); >- cricket::MediaType media_type() override { return cricket::MEDIA_TYPE_AUDIO; } >+ cricket::MediaType media_type() const override { >+ return cricket::MEDIA_TYPE_AUDIO; >+ } > > private: > // overrides from BaseChannel >@@ -383,7 +400,7 @@ class VideoChannel : public BaseChannel { > std::unique_ptr<VideoMediaChannel> media_channel, > const std::string& content_name, > bool srtp_required, >- rtc::CryptoOptions crypto_options); >+ webrtc::CryptoOptions crypto_options); > ~VideoChannel(); > > // downcasts a MediaChannel >@@ -393,7 +410,9 @@ class VideoChannel : public BaseChannel { > > void FillBitrateInfo(BandwidthEstimationInfo* bwe_info); > >- cricket::MediaType media_type() override { return 
cricket::MEDIA_TYPE_VIDEO; } >+ cricket::MediaType media_type() const override { >+ return cricket::MEDIA_TYPE_VIDEO; >+ } > > private: > // overrides from BaseChannel >@@ -422,7 +441,7 @@ class RtpDataChannel : public BaseChannel { > std::unique_ptr<DataMediaChannel> channel, > const std::string& content_name, > bool srtp_required, >- rtc::CryptoOptions crypto_options); >+ webrtc::CryptoOptions crypto_options); > ~RtpDataChannel(); > // TODO(zhihuang): Remove this once the RtpTransport can be shared between > // BaseChannels. >@@ -445,7 +464,9 @@ class RtpDataChannel : public BaseChannel { > // That occurs when the channel is enabled, the transport is writable, > // both local and remote descriptions are set, and the channel is unblocked. > sigslot::signal1<bool> SignalReadyToSendData; >- cricket::MediaType media_type() override { return cricket::MEDIA_TYPE_DATA; } >+ cricket::MediaType media_type() const override { >+ return cricket::MEDIA_TYPE_DATA; >+ } > > protected: > // downcasts a MediaChannel. 
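The channel.h hunks above introduce `ChannelInterface` and move `SignalFirstPacketReceived` from a public member to a virtual accessor, so callers now bind through the interface rather than the concrete `BaseChannel`. A minimal sketch of that accessor pattern follows; the `Signal` template is a hand-rolled stand-in for `sigslot::signal1` (which lives in rtc_base), and all names besides `ChannelInterface`/`BaseChannel`/`SignalFirstPacketReceived` are illustrative:

```cpp
#include <cassert>
#include <functional>
#include <utility>
#include <vector>

// Illustrative stand-in for sigslot::signal1: a list of callbacks.
template <typename Arg>
class Signal {
 public:
  void connect(std::function<void(Arg)> cb) {
    callbacks_.push_back(std::move(cb));
  }
  void emit(Arg a) {
    for (auto& cb : callbacks_) cb(a);
  }

 private:
  std::vector<std::function<void(Arg)>> callbacks_;
};

// The interface exposes the signal via a virtual accessor...
class ChannelInterface {
 public:
  virtual ~ChannelInterface() = default;
  virtual Signal<ChannelInterface*>& SignalFirstPacketReceived() = 0;
};

// ...and the concrete channel owns the signal object (named with a
// trailing underscore in the patch: SignalFirstPacketReceived_).
class BaseChannel : public ChannelInterface {
 public:
  Signal<ChannelInterface*>& SignalFirstPacketReceived() override {
    return signal_first_packet_received_;
  }
  void OnFirstPacket() { signal_first_packet_received_.emit(this); }

 private:
  Signal<ChannelInterface*> signal_first_packet_received_;
};
```

The design point is that subscribers depend only on `ChannelInterface`, which is what lets the patch retype the signal's argument from `BaseChannel*` to `ChannelInterface*`.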
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channel_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channel_unittest.cc >index 58cb17b8bbf5dd1f7cbe400fc5c5d5009c6d72b7..5d4d4c7f9e004e56d5ecb8af207680b9da926337 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channel_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channel_unittest.cc >@@ -15,7 +15,6 @@ > #include "media/base/fakemediaengine.h" > #include "media/base/fakertp.h" > #include "media/base/mediachannel.h" >-#include "media/base/testutils.h" > #include "p2p/base/fakecandidatepair.h" > #include "p2p/base/fakedtlstransport.h" > #include "p2p/base/fakepackettransport.h" >@@ -250,8 +249,8 @@ class ChannelTest : public testing::Test, public sigslot::has_slots<> { > rtc::Thread* signaling_thread = rtc::Thread::Current(); > auto channel = absl::make_unique<typename T::Channel>( > worker_thread, network_thread, signaling_thread, engine, std::move(ch), >- cricket::CN_AUDIO, (flags & DTLS) != 0, rtc::CryptoOptions()); >- channel->Init_w(rtp_transport); >+ cricket::CN_AUDIO, (flags & DTLS) != 0, webrtc::CryptoOptions()); >+ channel->Init_w(rtp_transport, /*media_transport=*/nullptr); > return channel; > } > >@@ -577,6 +576,37 @@ class ChannelTest : public testing::Test, public sigslot::has_slots<> { > CodecMatches(content.codecs()[0], media_channel1_->codecs()[0])); > } > >+ // Test that SetLocalContent and SetRemoteContent properly configure >+ // extmap-allow-mixed. >+ void TestSetContentsExtmapAllowMixedCaller(bool offer, bool answer) { >+ // For a caller, SetLocalContent() is called first with an offer and next >+ // SetRemoteContent() is called with the answer. >+ CreateChannels(0, 0); >+ typename T::Content content; >+ CreateContent(0, kPcmuCodec, kH264Codec, &content); >+ auto offer_enum = offer ? (T::Content::kSession) : (T::Content::kNo); >+ auto answer_enum = answer ? 
(T::Content::kSession) : (T::Content::kNo); >+ content.set_extmap_allow_mixed_enum(offer_enum); >+ EXPECT_TRUE(channel1_->SetLocalContent(&content, SdpType::kOffer, NULL)); >+ content.set_extmap_allow_mixed_enum(answer_enum); >+ EXPECT_TRUE(channel1_->SetRemoteContent(&content, SdpType::kAnswer, NULL)); >+ EXPECT_EQ(answer, media_channel1_->ExtmapAllowMixed()); >+ } >+ void TestSetContentsExtmapAllowMixedCallee(bool offer, bool answer) { >+ // For a callee, SetRemoteContent() is called first with an offer and next >+ // SetLocalContent() is called with the answer. >+ CreateChannels(0, 0); >+ typename T::Content content; >+ CreateContent(0, kPcmuCodec, kH264Codec, &content); >+ auto offer_enum = offer ? (T::Content::kSession) : (T::Content::kNo); >+ auto answer_enum = answer ? (T::Content::kSession) : (T::Content::kNo); >+ content.set_extmap_allow_mixed_enum(offer_enum); >+ EXPECT_TRUE(channel1_->SetRemoteContent(&content, SdpType::kOffer, NULL)); >+ content.set_extmap_allow_mixed_enum(answer_enum); >+ EXPECT_TRUE(channel1_->SetLocalContent(&content, SdpType::kAnswer, NULL)); >+ EXPECT_EQ(answer, media_channel1_->ExtmapAllowMixed()); >+ } >+ > // Test that SetLocalContent and SetRemoteContent properly deals > // with an empty offer. 
> void TestSetContentsNullOffer() { >@@ -1545,8 +1575,8 @@ std::unique_ptr<cricket::VideoChannel> ChannelTest<VideoTraits>::CreateChannel( > rtc::Thread* signaling_thread = rtc::Thread::Current(); > auto channel = absl::make_unique<cricket::VideoChannel>( > worker_thread, network_thread, signaling_thread, std::move(ch), >- cricket::CN_VIDEO, (flags & DTLS) != 0, rtc::CryptoOptions()); >- channel->Init_w(rtp_transport); >+ cricket::CN_VIDEO, (flags & DTLS) != 0, webrtc::CryptoOptions()); >+ channel->Init_w(rtp_transport, /*media_transport=*/nullptr); > return channel; > } > >@@ -1615,6 +1645,24 @@ TEST_F(VoiceChannelSingleThreadTest, TestSetContents) { > Base::TestSetContents(); > } > >+TEST_F(VoiceChannelSingleThreadTest, TestSetContentsExtmapAllowMixedAsCaller) { >+ Base::TestSetContentsExtmapAllowMixedCaller(/*offer=*/true, /*answer=*/true); >+} >+ >+TEST_F(VoiceChannelSingleThreadTest, >+ TestSetContentsExtmapAllowMixedNotSupportedAsCaller) { >+ Base::TestSetContentsExtmapAllowMixedCaller(/*offer=*/true, /*answer=*/false); >+} >+ >+TEST_F(VoiceChannelSingleThreadTest, TestSetContentsExtmapAllowMixedAsCallee) { >+ Base::TestSetContentsExtmapAllowMixedCallee(/*offer=*/true, /*answer=*/true); >+} >+ >+TEST_F(VoiceChannelSingleThreadTest, >+ TestSetContentsExtmapAllowMixedNotSupportedAsCallee) { >+ Base::TestSetContentsExtmapAllowMixedCallee(/*offer=*/true, /*answer=*/false); >+} >+ > TEST_F(VoiceChannelSingleThreadTest, TestSetContentsNullOffer) { > Base::TestSetContentsNullOffer(); > } >@@ -1750,6 +1798,24 @@ TEST_F(VoiceChannelDoubleThreadTest, TestSetContents) { > Base::TestSetContents(); > } > >+TEST_F(VoiceChannelDoubleThreadTest, TestSetContentsExtmapAllowMixedAsCaller) { >+ Base::TestSetContentsExtmapAllowMixedCaller(/*offer=*/true, /*answer=*/true); >+} >+ >+TEST_F(VoiceChannelDoubleThreadTest, >+ TestSetContentsExtmapAllowMixedNotSupportedAsCaller) { >+ Base::TestSetContentsExtmapAllowMixedCaller(/*offer=*/true, /*answer=*/false); >+} >+ 
>+TEST_F(VoiceChannelDoubleThreadTest, TestSetContentsExtmapAllowMixedAsCallee) { >+ Base::TestSetContentsExtmapAllowMixedCallee(/*offer=*/true, /*answer=*/true); >+} >+ >+TEST_F(VoiceChannelDoubleThreadTest, >+ TestSetContentsExtmapAllowMixedNotSupportedAsCallee) { >+ Base::TestSetContentsExtmapAllowMixedCallee(/*offer=*/true, /*answer=*/false); >+} >+ > TEST_F(VoiceChannelDoubleThreadTest, TestSetContentsNullOffer) { > Base::TestSetContentsNullOffer(); > } >@@ -1883,6 +1949,24 @@ TEST_F(VideoChannelSingleThreadTest, TestSetContents) { > Base::TestSetContents(); > } > >+TEST_F(VideoChannelSingleThreadTest, TestSetContentsExtmapAllowMixedAsCaller) { >+ Base::TestSetContentsExtmapAllowMixedCaller(/*offer=*/true, /*answer=*/true); >+} >+ >+TEST_F(VideoChannelSingleThreadTest, >+ TestSetContentsExtmapAllowMixedNotSupportedAsCaller) { >+ Base::TestSetContentsExtmapAllowMixedCaller(/*offer=*/true, /*answer=*/false); >+} >+ >+TEST_F(VideoChannelSingleThreadTest, TestSetContentsExtmapAllowMixedAsCallee) { >+ Base::TestSetContentsExtmapAllowMixedCallee(/*offer=*/true, /*answer=*/true); >+} >+ >+TEST_F(VideoChannelSingleThreadTest, >+ TestSetContentsExtmapAllowMixedNotSupportedAsCallee) { >+ Base::TestSetContentsExtmapAllowMixedCallee(/*offer=*/true, /*answer=*/false); >+} >+ > TEST_F(VideoChannelSingleThreadTest, TestSetContentsNullOffer) { > Base::TestSetContentsNullOffer(); > } >@@ -2016,6 +2100,24 @@ TEST_F(VideoChannelDoubleThreadTest, TestSetContents) { > Base::TestSetContents(); > } > >+TEST_F(VideoChannelDoubleThreadTest, TestSetContentsExtmapAllowMixedAsCaller) { >+ Base::TestSetContentsExtmapAllowMixedCaller(/*offer=*/true, /*answer=*/true); >+} >+ >+TEST_F(VideoChannelDoubleThreadTest, >+ TestSetContentsExtmapAllowMixedNotSupportedAsCaller) { >+ Base::TestSetContentsExtmapAllowMixedCaller(/*offer=*/true, /*answer=*/false); >+} >+ >+TEST_F(VideoChannelDoubleThreadTest, TestSetContentsExtmapAllowMixedAsCallee) { >+ 
Base::TestSetContentsExtmapAllowMixedCallee(/*offer=*/true, /*answer=*/true); >+} >+ >+TEST_F(VideoChannelDoubleThreadTest, >+ TestSetContentsExtmapAllowMixedNotSupportedAsCallee) { >+ Base::TestSetContentsExtmapAllowMixedCallee(/*offer=*/true, /*answer=*/false); >+} >+ > TEST_F(VideoChannelDoubleThreadTest, TestSetContentsNullOffer) { > Base::TestSetContentsNullOffer(); > } >@@ -2164,7 +2266,7 @@ std::unique_ptr<cricket::RtpDataChannel> ChannelTest<DataTraits>::CreateChannel( > rtc::Thread* signaling_thread = rtc::Thread::Current(); > auto channel = absl::make_unique<cricket::RtpDataChannel>( > worker_thread, network_thread, signaling_thread, std::move(ch), >- cricket::CN_DATA, (flags & DTLS) != 0, rtc::CryptoOptions()); >+ cricket::CN_DATA, (flags & DTLS) != 0, webrtc::CryptoOptions()); > channel->Init_w(rtp_transport); > return channel; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channelinterface.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channelinterface.h >new file mode 100644 >index 0000000000000000000000000000000000000000..8e4109a04542f535656bda412e92e652237161a9 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channelinterface.h >@@ -0,0 +1,68 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#ifndef PC_CHANNELINTERFACE_H_ >+#define PC_CHANNELINTERFACE_H_ >+ >+#include <string> >+ >+#include "api/jsep.h" >+#include "api/mediatypes.h" >+#include "media/base/mediachannel.h" >+#include "pc/rtptransportinternal.h" >+ >+namespace cricket { >+ >+class MediaContentDescription; >+ >+// ChannelInterface contains methods common to voice, video and data channels. >+// As more methods are added to BaseChannel, they should be included in the >+// interface as well. >+class ChannelInterface { >+ public: >+ virtual cricket::MediaType media_type() const = 0; >+ >+ virtual MediaChannel* media_channel() const = 0; >+ >+ // TODO(deadbeef): This is redundant; remove this. >+ virtual const std::string& transport_name() const = 0; >+ >+ virtual const std::string& content_name() const = 0; >+ >+ virtual bool enabled() const = 0; >+ >+ // Enables or disables this channel >+ virtual bool Enable(bool enable) = 0; >+ >+ // Used for latency measurements. >+ virtual sigslot::signal1<ChannelInterface*>& SignalFirstPacketReceived() = 0; >+ >+ // Channel control >+ virtual bool SetLocalContent(const MediaContentDescription* content, >+ webrtc::SdpType type, >+ std::string* error_desc) = 0; >+ virtual bool SetRemoteContent(const MediaContentDescription* content, >+ webrtc::SdpType type, >+ std::string* error_desc) = 0; >+ >+ // Set an RTP level transport. >+ // Some examples: >+ // * An RtpTransport without encryption. >+ // * An SrtpTransport for SDES. >+ // * A DtlsSrtpTransport for DTLS-SRTP. 
>+ virtual bool SetRtpTransport(webrtc::RtpTransportInternal* rtp_transport) = 0; >+ >+ protected: >+ virtual ~ChannelInterface() = default; >+}; >+ >+} // namespace cricket >+ >+#endif // PC_CHANNELINTERFACE_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channelmanager.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channelmanager.cc >index dab622ad70e42f079fb430d2e901fca9dd49923c..eda5a2dd57922cbd7ae4fe84982ca7ef08dfec91 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channelmanager.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channelmanager.cc >@@ -14,10 +14,10 @@ > #include <utility> > > #include "absl/memory/memory.h" >+#include "absl/strings/match.h" > #include "media/base/rtpdataengine.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" >-#include "rtc_base/stringutils.h" > #include "rtc_base/trace_event.h" > > namespace cricket { >@@ -66,7 +66,7 @@ void ChannelManager::GetSupportedAudioSendCodecs( > if (!media_engine_) { > return; > } >- *codecs = media_engine_->audio_send_codecs(); >+ *codecs = media_engine_->voice().send_codecs(); > } > > void ChannelManager::GetSupportedAudioReceiveCodecs( >@@ -74,7 +74,7 @@ void ChannelManager::GetSupportedAudioReceiveCodecs( > if (!media_engine_) { > return; > } >- *codecs = media_engine_->audio_recv_codecs(); >+ *codecs = media_engine_->voice().recv_codecs(); > } > > void ChannelManager::GetSupportedAudioRtpHeaderExtensions( >@@ -82,7 +82,7 @@ void ChannelManager::GetSupportedAudioRtpHeaderExtensions( > if (!media_engine_) { > return; > } >- *ext = media_engine_->GetAudioCapabilities().header_extensions; >+ *ext = media_engine_->voice().GetCapabilities().header_extensions; > } > > void ChannelManager::GetSupportedVideoCodecs( >@@ -92,10 +92,10 @@ void ChannelManager::GetSupportedVideoCodecs( > } > codecs->clear(); > >- std::vector<VideoCodec> video_codecs = media_engine_->video_codecs(); >+ std::vector<VideoCodec> video_codecs = media_engine_->video().codecs(); > for 
(const auto& video_codec : video_codecs) { > if (!enable_rtx_ && >- _stricmp(kRtxCodecName, video_codec.name.c_str()) == 0) { >+ absl::EqualsIgnoreCase(kRtxCodecName, video_codec.name)) { > continue; > } > codecs->push_back(video_codec); >@@ -107,7 +107,7 @@ void ChannelManager::GetSupportedVideoRtpHeaderExtensions( > if (!media_engine_) { > return; > } >- *ext = media_engine_->GetVideoCapabilities().header_extensions; >+ *ext = media_engine_->video().GetCapabilities().header_extensions; > } > > void ChannelManager::GetSupportedDataCodecs( >@@ -156,16 +156,17 @@ VoiceChannel* ChannelManager::CreateVoiceChannel( > webrtc::Call* call, > const cricket::MediaConfig& media_config, > webrtc::RtpTransportInternal* rtp_transport, >+ webrtc::MediaTransportInterface* media_transport, > rtc::Thread* signaling_thread, > const std::string& content_name, > bool srtp_required, >- const rtc::CryptoOptions& crypto_options, >+ const webrtc::CryptoOptions& crypto_options, > const AudioOptions& options) { > if (!worker_thread_->IsCurrent()) { > return worker_thread_->Invoke<VoiceChannel*>(RTC_FROM_HERE, [&] { > return CreateVoiceChannel(call, media_config, rtp_transport, >- signaling_thread, content_name, srtp_required, >- crypto_options, options); >+ media_transport, signaling_thread, content_name, >+ srtp_required, crypto_options, options); > }); > } > >@@ -176,8 +177,8 @@ VoiceChannel* ChannelManager::CreateVoiceChannel( > return nullptr; > } > >- VoiceMediaChannel* media_channel = >- media_engine_->CreateChannel(call, media_config, options); >+ VoiceMediaChannel* media_channel = media_engine_->voice().CreateMediaChannel( >+ call, media_config, options, crypto_options); > if (!media_channel) { > return nullptr; > } >@@ -187,7 +188,7 @@ VoiceChannel* ChannelManager::CreateVoiceChannel( > absl::WrapUnique(media_channel), content_name, srtp_required, > crypto_options); > >- voice_channel->Init_w(rtp_transport); >+ voice_channel->Init_w(rtp_transport, media_transport); > > 
VoiceChannel* voice_channel_ptr = voice_channel.get(); > voice_channels_.push_back(std::move(voice_channel)); >@@ -226,7 +227,7 @@ VideoChannel* ChannelManager::CreateVideoChannel( > rtc::Thread* signaling_thread, > const std::string& content_name, > bool srtp_required, >- const rtc::CryptoOptions& crypto_options, >+ const webrtc::CryptoOptions& crypto_options, > const VideoOptions& options) { > if (!worker_thread_->IsCurrent()) { > return worker_thread_->Invoke<VideoChannel*>(RTC_FROM_HERE, [&] { >@@ -243,8 +244,8 @@ VideoChannel* ChannelManager::CreateVideoChannel( > return nullptr; > } > >- VideoMediaChannel* media_channel = >- media_engine_->CreateVideoChannel(call, media_config, options); >+ VideoMediaChannel* media_channel = media_engine_->video().CreateMediaChannel( >+ call, media_config, options, crypto_options); > if (!media_channel) { > return nullptr; > } >@@ -253,7 +254,9 @@ VideoChannel* ChannelManager::CreateVideoChannel( > worker_thread_, network_thread_, signaling_thread, > absl::WrapUnique(media_channel), content_name, srtp_required, > crypto_options); >- video_channel->Init_w(rtp_transport); >+ >+ // TODO(sukhanov): Add media_transport support for video channel. 
>+ video_channel->Init_w(rtp_transport, /*media_transport=*/nullptr); > > VideoChannel* video_channel_ptr = video_channel.get(); > video_channels_.push_back(std::move(video_channel)); >@@ -291,7 +294,7 @@ RtpDataChannel* ChannelManager::CreateRtpDataChannel( > rtc::Thread* signaling_thread, > const std::string& content_name, > bool srtp_required, >- const rtc::CryptoOptions& crypto_options) { >+ const webrtc::CryptoOptions& crypto_options) { > if (!worker_thread_->IsCurrent()) { > return worker_thread_->Invoke<RtpDataChannel*>(RTC_FROM_HERE, [&] { > return CreateRtpDataChannel(media_config, rtp_transport, signaling_thread, >@@ -346,13 +349,13 @@ void ChannelManager::DestroyRtpDataChannel(RtpDataChannel* data_channel) { > bool ChannelManager::StartAecDump(rtc::PlatformFile file, > int64_t max_size_bytes) { > return worker_thread_->Invoke<bool>(RTC_FROM_HERE, [&] { >- return media_engine_->StartAecDump(file, max_size_bytes); >+ return media_engine_->voice().StartAecDump(file, max_size_bytes); > }); > } > > void ChannelManager::StopAecDump() { > worker_thread_->Invoke<void>(RTC_FROM_HERE, >- [&] { media_engine_->StopAecDump(); }); >+ [&] { media_engine_->voice().StopAecDump(); }); > } > > } // namespace cricket >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channelmanager.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channelmanager.h >index c6b601ea8f6cb0e0017b0bf0aca703de0249658a..5cafd8cfa72f56377fc581573148a3a95b480c77 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channelmanager.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channelmanager.h >@@ -80,14 +80,16 @@ class ChannelManager final { > // call the appropriate Destroy*Channel method when done. > > // Creates a voice channel, to be associated with the specified session. 
>- VoiceChannel* CreateVoiceChannel(webrtc::Call* call, >- const cricket::MediaConfig& media_config, >- webrtc::RtpTransportInternal* rtp_transport, >- rtc::Thread* signaling_thread, >- const std::string& content_name, >- bool srtp_required, >- const rtc::CryptoOptions& crypto_options, >- const AudioOptions& options); >+ VoiceChannel* CreateVoiceChannel( >+ webrtc::Call* call, >+ const cricket::MediaConfig& media_config, >+ webrtc::RtpTransportInternal* rtp_transport, >+ webrtc::MediaTransportInterface* media_transport, >+ rtc::Thread* signaling_thread, >+ const std::string& content_name, >+ bool srtp_required, >+ const webrtc::CryptoOptions& crypto_options, >+ const AudioOptions& options); > // Destroys a voice channel created by CreateVoiceChannel. > void DestroyVoiceChannel(VoiceChannel* voice_channel); > >@@ -100,7 +102,7 @@ class ChannelManager final { > rtc::Thread* signaling_thread, > const std::string& content_name, > bool srtp_required, >- const rtc::CryptoOptions& crypto_options, >+ const webrtc::CryptoOptions& crypto_options, > const VideoOptions& options); > // Destroys a video channel created by CreateVideoChannel. > void DestroyVideoChannel(VideoChannel* video_channel); >@@ -111,7 +113,7 @@ class ChannelManager final { > rtc::Thread* signaling_thread, > const std::string& content_name, > bool srtp_required, >- const rtc::CryptoOptions& crypto_options); >+ const webrtc::CryptoOptions& crypto_options); > // Destroys a data channel created by CreateRtpDataChannel. 
> void DestroyRtpDataChannel(RtpDataChannel* data_channel); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channelmanager_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channelmanager_unittest.cc >index 47c95306ee70f8b261b80ac04ece7d388402712d..6e9cab6cda81de6c60ebe7ee87fc1ab77bb95ac2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channelmanager_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/channelmanager_unittest.cc >@@ -11,6 +11,7 @@ > #include <memory> > #include <utility> > >+#include "api/test/fake_media_transport.h" > #include "media/base/fakemediaengine.h" > #include "media/base/testutils.h" > #include "media/engine/fakewebrtccall.h" >@@ -61,20 +62,32 @@ class ChannelManagerTest : public testing::Test { > return dtls_srtp_transport; > } > >- void TestCreateDestroyChannels(webrtc::RtpTransportInternal* rtp_transport) { >+ std::unique_ptr<webrtc::MediaTransportInterface> CreateMediaTransport( >+ rtc::PacketTransportInternal* packet_transport) { >+ auto media_transport_result = >+ fake_media_transport_factory_.CreateMediaTransport(packet_transport, >+ network_.get(), >+ /*is_caller=*/true); >+ RTC_CHECK(media_transport_result.ok()); >+ return media_transport_result.MoveValue(); >+ } >+ >+ void TestCreateDestroyChannels( >+ webrtc::RtpTransportInternal* rtp_transport, >+ webrtc::MediaTransportInterface* media_transport) { > cricket::VoiceChannel* voice_channel = cm_->CreateVoiceChannel( >- &fake_call_, cricket::MediaConfig(), rtp_transport, >+ &fake_call_, cricket::MediaConfig(), rtp_transport, media_transport, > rtc::Thread::Current(), cricket::CN_AUDIO, kDefaultSrtpRequired, >- rtc::CryptoOptions(), AudioOptions()); >+ webrtc::CryptoOptions(), AudioOptions()); > EXPECT_TRUE(voice_channel != nullptr); > cricket::VideoChannel* video_channel = cm_->CreateVideoChannel( > &fake_call_, cricket::MediaConfig(), rtp_transport, > rtc::Thread::Current(), cricket::CN_VIDEO, kDefaultSrtpRequired, >- 
rtc::CryptoOptions(), VideoOptions()); >+ webrtc::CryptoOptions(), VideoOptions()); > EXPECT_TRUE(video_channel != nullptr); > cricket::RtpDataChannel* rtp_data_channel = cm_->CreateRtpDataChannel( > cricket::MediaConfig(), rtp_transport, rtc::Thread::Current(), >- cricket::CN_DATA, kDefaultSrtpRequired, rtc::CryptoOptions()); >+ cricket::CN_DATA, kDefaultSrtpRequired, webrtc::CryptoOptions()); > EXPECT_TRUE(rtp_data_channel != nullptr); > cm_->DestroyVideoChannel(video_channel); > cm_->DestroyVoiceChannel(voice_channel); >@@ -90,6 +103,7 @@ class ChannelManagerTest : public testing::Test { > cricket::FakeDataEngine* fdme_; > std::unique_ptr<cricket::ChannelManager> cm_; > cricket::FakeCall fake_call_; >+ webrtc::FakeMediaTransportFactory fake_media_transport_factory_; > }; > > // Test that we startup/shutdown properly. >@@ -154,7 +168,15 @@ TEST_F(ChannelManagerTest, SetVideoRtxEnabled) { > TEST_F(ChannelManagerTest, CreateDestroyChannels) { > EXPECT_TRUE(cm_->Init()); > auto rtp_transport = CreateDtlsSrtpTransport(); >- TestCreateDestroyChannels(rtp_transport.get()); >+ TestCreateDestroyChannels(rtp_transport.get(), /*media_transport=*/nullptr); >+} >+ >+TEST_F(ChannelManagerTest, CreateDestroyChannelsWithMediaTransport) { >+ EXPECT_TRUE(cm_->Init()); >+ auto rtp_transport = CreateDtlsSrtpTransport(); >+ auto media_transport = >+ CreateMediaTransport(rtp_transport->rtcp_packet_transport()); >+ TestCreateDestroyChannels(rtp_transport.get(), media_transport.get()); > } > > TEST_F(ChannelManagerTest, CreateDestroyChannelsOnThread) { >@@ -164,7 +186,7 @@ TEST_F(ChannelManagerTest, CreateDestroyChannelsOnThread) { > EXPECT_TRUE(cm_->set_network_thread(network_.get())); > EXPECT_TRUE(cm_->Init()); > auto rtp_transport = CreateDtlsSrtpTransport(); >- TestCreateDestroyChannels(rtp_transport.get()); >+ TestCreateDestroyChannels(rtp_transport.get(), /*media_transport=*/nullptr); > } > > } // namespace cricket >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/createpeerconnectionfactory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/createpeerconnectionfactory.cc >deleted file mode 100644 >index bbc4ebbe730f99ae38027f23812918188b5a019c..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/createpeerconnectionfactory.cc >+++ /dev/null >@@ -1,172 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include "api/call/callfactoryinterface.h" >-#include "api/peerconnectioninterface.h" >-#include "api/video_codecs/video_decoder_factory.h" >-#include "api/video_codecs/video_encoder_factory.h" >-#include "logging/rtc_event_log/rtc_event_log_factory_interface.h" >-#include "media/engine/webrtcmediaengine.h" >-#include "modules/audio_device/include/audio_device.h" >-#include "modules/audio_processing/include/audio_processing.h" >-#include "rtc_base/bind.h" >-#include "rtc_base/scoped_ref_ptr.h" >-#include "rtc_base/thread.h" >- >-namespace webrtc { >- >-#if defined(USE_BUILTIN_SW_CODECS) >-rtc::scoped_refptr<PeerConnectionFactoryInterface> CreatePeerConnectionFactory( >- rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >- rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory) { >- return CreatePeerConnectionFactoryWithAudioMixer( >- nullptr /*network_thread*/, nullptr /*worker_thread*/, >- nullptr /*signaling_thread*/, nullptr /*default_adm*/, >- audio_encoder_factory, audio_decoder_factory, >- nullptr /*video_encoder_factory*/, nullptr /*video_decoder_factory*/, >- nullptr /*audio_mixer*/); >-} >- >-// Note: all the other 
CreatePeerConnectionFactory variants just end up calling >-// this, ultimately. >-rtc::scoped_refptr<PeerConnectionFactoryInterface> CreatePeerConnectionFactory( >- rtc::Thread* network_thread, >- rtc::Thread* worker_thread, >- rtc::Thread* signaling_thread, >- AudioDeviceModule* default_adm, >- rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >- rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory, >- cricket::WebRtcVideoEncoderFactory* video_encoder_factory, >- cricket::WebRtcVideoDecoderFactory* video_decoder_factory, >- rtc::scoped_refptr<AudioMixer> audio_mixer, >- rtc::scoped_refptr<AudioProcessing> audio_processing) { >- rtc::scoped_refptr<AudioProcessing> audio_processing_use = audio_processing; >- if (!audio_processing_use) { >- audio_processing_use = AudioProcessingBuilder().Create(); >- } >- >- std::unique_ptr<cricket::MediaEngineInterface> media_engine( >- cricket::WebRtcMediaEngineFactory::Create( >- default_adm, audio_encoder_factory, audio_decoder_factory, >- video_encoder_factory, video_decoder_factory, audio_mixer, >- audio_processing_use)); >- >- std::unique_ptr<CallFactoryInterface> call_factory = CreateCallFactory(); >- >- std::unique_ptr<RtcEventLogFactoryInterface> event_log_factory = >- CreateRtcEventLogFactory(); >- >- return CreateModularPeerConnectionFactory( >- network_thread, worker_thread, signaling_thread, std::move(media_engine), >- std::move(call_factory), std::move(event_log_factory)); >-} >- >-rtc::scoped_refptr<PeerConnectionFactoryInterface> CreatePeerConnectionFactory( >- rtc::Thread* network_thread, >- rtc::Thread* worker_thread, >- rtc::Thread* signaling_thread, >- AudioDeviceModule* default_adm, >- rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >- rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory, >- cricket::WebRtcVideoEncoderFactory* video_encoder_factory, >- cricket::WebRtcVideoDecoderFactory* video_decoder_factory, >- rtc::scoped_refptr<AudioMixer> audio_mixer, >- 
rtc::scoped_refptr<AudioProcessing> audio_processing, >- std::unique_ptr<FecControllerFactoryInterface> fec_controller_factory, >- std::unique_ptr<NetworkControllerFactoryInterface> >- network_controller_factory) { >- rtc::scoped_refptr<AudioProcessing> audio_processing_use = audio_processing; >- if (!audio_processing_use) { >- audio_processing_use = AudioProcessingBuilder().Create(); >- } >- >- std::unique_ptr<cricket::MediaEngineInterface> media_engine( >- cricket::WebRtcMediaEngineFactory::Create( >- default_adm, audio_encoder_factory, audio_decoder_factory, >- video_encoder_factory, video_decoder_factory, audio_mixer, >- audio_processing_use)); >- >- std::unique_ptr<CallFactoryInterface> call_factory = CreateCallFactory(); >- >- std::unique_ptr<RtcEventLogFactoryInterface> event_log_factory = >- CreateRtcEventLogFactory(); >- >- return CreateModularPeerConnectionFactory( >- network_thread, worker_thread, signaling_thread, std::move(media_engine), >- std::move(call_factory), std::move(event_log_factory), >- std::move(fec_controller_factory), std::move(network_controller_factory)); >-} >-#endif >- >-rtc::scoped_refptr<PeerConnectionFactoryInterface> CreatePeerConnectionFactory( >- rtc::Thread* network_thread, >- rtc::Thread* worker_thread, >- rtc::Thread* signaling_thread, >- rtc::scoped_refptr<AudioDeviceModule> default_adm, >- rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >- rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory, >- std::unique_ptr<VideoEncoderFactory> video_encoder_factory, >- std::unique_ptr<VideoDecoderFactory> video_decoder_factory, >- rtc::scoped_refptr<AudioMixer> audio_mixer, >- rtc::scoped_refptr<AudioProcessing> audio_processing) { >- if (!audio_processing) >- audio_processing = AudioProcessingBuilder().Create(); >- >- std::unique_ptr<cricket::MediaEngineInterface> media_engine = >- cricket::WebRtcMediaEngineFactory::Create( >- default_adm, audio_encoder_factory, audio_decoder_factory, >- 
std::move(video_encoder_factory), std::move(video_decoder_factory), >- audio_mixer, audio_processing); >- >- std::unique_ptr<CallFactoryInterface> call_factory = CreateCallFactory(); >- >- std::unique_ptr<RtcEventLogFactoryInterface> event_log_factory = >- CreateRtcEventLogFactory(); >- >- return CreateModularPeerConnectionFactory( >- network_thread, worker_thread, signaling_thread, std::move(media_engine), >- std::move(call_factory), std::move(event_log_factory)); >-} >- >-#if defined(USE_BUILTIN_SW_CODECS) >-rtc::scoped_refptr<PeerConnectionFactoryInterface> >-CreatePeerConnectionFactoryWithAudioMixer( >- rtc::Thread* network_thread, >- rtc::Thread* worker_thread, >- rtc::Thread* signaling_thread, >- AudioDeviceModule* default_adm, >- rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >- rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory, >- cricket::WebRtcVideoEncoderFactory* video_encoder_factory, >- cricket::WebRtcVideoDecoderFactory* video_decoder_factory, >- rtc::scoped_refptr<AudioMixer> audio_mixer) { >- return CreatePeerConnectionFactory( >- network_thread, worker_thread, signaling_thread, default_adm, >- audio_encoder_factory, audio_decoder_factory, video_encoder_factory, >- video_decoder_factory, audio_mixer, nullptr); >-} >- >-rtc::scoped_refptr<PeerConnectionFactoryInterface> CreatePeerConnectionFactory( >- rtc::Thread* network_thread, >- rtc::Thread* worker_thread, >- rtc::Thread* signaling_thread, >- AudioDeviceModule* default_adm, >- rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >- rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory, >- cricket::WebRtcVideoEncoderFactory* video_encoder_factory, >- cricket::WebRtcVideoDecoderFactory* video_decoder_factory) { >- return CreatePeerConnectionFactoryWithAudioMixer( >- network_thread, worker_thread, signaling_thread, default_adm, >- audio_encoder_factory, audio_decoder_factory, video_encoder_factory, >- video_decoder_factory, nullptr); >-} >-#endif >- >-} 
// namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/datachannel.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/datachannel.cc >index f819d26695760515d08fe194ba0131520370408d..e989586607d77bf3a41f9283d524d3ad3f73b0e1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/datachannel.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/datachannel.cc >@@ -118,6 +118,10 @@ rtc::scoped_refptr<DataChannel> DataChannel::Create( > return channel; > } > >+bool DataChannel::IsSctpLike(cricket::DataChannelType type) { >+ return type == cricket::DCT_SCTP || type == cricket::DCT_MEDIA_TRANSPORT; >+} >+ > DataChannel::DataChannel(DataChannelProviderInterface* provider, > cricket::DataChannelType dct, > const std::string& label) >@@ -147,7 +151,7 @@ bool DataChannel::Init(const InternalDataChannelInit& config) { > return false; > } > handshake_state_ = kHandshakeReady; >- } else if (data_channel_type_ == cricket::DCT_SCTP) { >+ } else if (IsSctpLike(data_channel_type_)) { > if (config.id < -1 || config.maxRetransmits < -1 || > config.maxRetransmitTime < -1) { > RTC_LOG(LS_ERROR) << "Failed to initialize the SCTP data channel due to " >@@ -241,7 +245,7 @@ bool DataChannel::Send(const DataBuffer& buffer) { > if (!queued_send_data_.Empty()) { > // Only SCTP DataChannel queues the outgoing data when the transport is > // blocked. 
>- RTC_DCHECK(data_channel_type_ == cricket::DCT_SCTP); >+ RTC_DCHECK(IsSctpLike(data_channel_type_)); > if (!QueueSendDataMessage(buffer)) { > RTC_LOG(LS_ERROR) << "Closing the DataChannel due to a failure to queue " > "additional data."; >@@ -273,7 +277,7 @@ void DataChannel::SetReceiveSsrc(uint32_t receive_ssrc) { > void DataChannel::SetSctpSid(int sid) { > RTC_DCHECK_LT(config_.id, 0); > RTC_DCHECK_GE(sid, 0); >- RTC_DCHECK_EQ(data_channel_type_, cricket::DCT_SCTP); >+ RTC_DCHECK(IsSctpLike(data_channel_type_)); > if (config_.id == sid) { > return; > } >@@ -283,7 +287,7 @@ void DataChannel::SetSctpSid(int sid) { > } > > void DataChannel::OnClosingProcedureStartedRemotely(int sid) { >- if (data_channel_type_ == cricket::DCT_SCTP && sid == config_.id && >+ if (IsSctpLike(data_channel_type_) && sid == config_.id && > state_ != kClosing && state_ != kClosed) { > // Don't bother sending queued data since the side that initiated the > // closure wouldn't receive it anyway. See crbug.com/559394 for a lengthy >@@ -299,7 +303,7 @@ void DataChannel::OnClosingProcedureStartedRemotely(int sid) { > } > > void DataChannel::OnClosingProcedureComplete(int sid) { >- if (data_channel_type_ == cricket::DCT_SCTP && sid == config_.id) { >+ if (IsSctpLike(data_channel_type_) && sid == config_.id) { > // If the closing procedure is complete, we should have finished sending > // all pending data and transitioned to kClosing already. 
> RTC_DCHECK_EQ(state_, kClosing); >@@ -310,7 +314,7 @@ void DataChannel::OnClosingProcedureComplete(int sid) { > } > > void DataChannel::OnTransportChannelCreated() { >- RTC_DCHECK(data_channel_type_ == cricket::DCT_SCTP); >+ RTC_DCHECK(IsSctpLike(data_channel_type_)); > if (!connected_to_provider_) { > connected_to_provider_ = provider_->ConnectDataChannel(this); > } >@@ -348,12 +352,12 @@ void DataChannel::OnDataReceived(const cricket::ReceiveDataParams& params, > if (data_channel_type_ == cricket::DCT_RTP && params.ssrc != receive_ssrc_) { > return; > } >- if (data_channel_type_ == cricket::DCT_SCTP && params.sid != config_.id) { >+ if (IsSctpLike(data_channel_type_) && params.sid != config_.id) { > return; > } > > if (params.type == cricket::DMT_CONTROL) { >- RTC_DCHECK(data_channel_type_ == cricket::DCT_SCTP); >+ RTC_DCHECK(IsSctpLike(data_channel_type_)); > if (handshake_state_ != kHandshakeWaitingForAck) { > // Ignore it if we are not expecting an ACK message. > RTC_LOG(LS_WARNING) >@@ -570,7 +574,7 @@ bool DataChannel::SendDataMessage(const DataBuffer& buffer, > bool queue_if_blocked) { > cricket::SendDataParams send_params; > >- if (data_channel_type_ == cricket::DCT_SCTP) { >+ if (IsSctpLike(data_channel_type_)) { > send_params.ordered = config_.ordered; > // Send as ordered if it is still going through OPEN/ACK signaling. 
> if (handshake_state_ != kHandshakeReady && !config_.ordered) { >@@ -597,7 +601,7 @@ bool DataChannel::SendDataMessage(const DataBuffer& buffer, > return true; > } > >- if (data_channel_type_ != cricket::DCT_SCTP) { >+ if (!IsSctpLike(data_channel_type_)) { > return false; > } > >@@ -649,7 +653,7 @@ void DataChannel::QueueControlMessage(const rtc::CopyOnWriteBuffer& buffer) { > bool DataChannel::SendControlMessage(const rtc::CopyOnWriteBuffer& buffer) { > bool is_open_message = handshake_state_ == kHandshakeShouldSendOpen; > >- RTC_DCHECK_EQ(data_channel_type_, cricket::DCT_SCTP); >+ RTC_DCHECK(IsSctpLike(data_channel_type_)); > RTC_DCHECK(writable_); > RTC_DCHECK_GE(config_.id, 0); > RTC_DCHECK(!is_open_message || !config_.negotiated); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/datachannel.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/datachannel.h >index cbb0c8b2cf0aba568ca7dbff4b3d1b93a5cb95b1..22ea354c211632db86e7942348bd5c328664f7f3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/datachannel.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/datachannel.h >@@ -122,6 +122,8 @@ class DataChannel : public DataChannelInterface, public sigslot::has_slots<> { > const std::string& label, > const InternalDataChannelInit& config); > >+ static bool IsSctpLike(cricket::DataChannelType type); >+ > virtual void RegisterObserver(DataChannelObserver* observer); > virtual void UnregisterObserver(); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/dtlssrtptransport.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/dtlssrtptransport.cc >index e31b2f5872ca96b573bc8124a05cd1d38020cd84..2835d34a4677654972b54b7f6033a1f420396333 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/dtlssrtptransport.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/dtlssrtptransport.cc >@@ -301,6 +301,8 @@ void DtlsSrtpTransport::OnDtlsState(cricket::DtlsTransportInternal* transport, > RTC_DCHECK(transport == rtp_dtls_transport_ || > transport 
== rtcp_dtls_transport_); > >+ SignalDtlsStateChange(); >+ > if (state != cricket::DTLS_TRANSPORT_CONNECTED) { > ResetParams(); > return; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/dtlssrtptransport.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/dtlssrtptransport.h >index 498f02e917622b73004946061158fe0986f7eb71..cac560e79ace85af3672835c4dabf95c0ccaf3dc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/dtlssrtptransport.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/dtlssrtptransport.h >@@ -42,7 +42,8 @@ class DtlsSrtpTransport : public SrtpTransport { > void UpdateRecvEncryptedHeaderExtensionIds( > const std::vector<int>& recv_extension_ids); > >- sigslot::signal2<DtlsSrtpTransport*, bool> SignalDtlsSrtpSetupFailure; >+ sigslot::signal<DtlsSrtpTransport*, bool> SignalDtlsSrtpSetupFailure; >+ sigslot::signal<> SignalDtlsStateChange; > > RTCError SetSrtpSendKey(const cricket::CryptoParams& params) override { > return RTCError(RTCErrorType::UNSUPPORTED_OPERATION, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jsepicecandidate.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jsepicecandidate.cc >index 277f1bab4fff10d384f7d13a74fae3a8d48a78be..2133d9dadd258184b49776897cc339ffa2c6b87a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jsepicecandidate.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jsepicecandidate.cc >@@ -14,7 +14,6 @@ > > #include "absl/memory/memory.h" > #include "pc/webrtcsdp.h" >-#include "rtc_base/stringencode.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jsepsessiondescription.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jsepsessiondescription.cc >index 64a7a2bd591b0a691ed21c91a6f8ba73954e4d32..13d074380820053428f23eda1985c182a42d0969 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jsepsessiondescription.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jsepsessiondescription.cc >@@ -17,7 +17,6 @@ > #include 
"pc/mediasession.h" > #include "pc/webrtcsdp.h" > #include "rtc_base/arraysize.h" >-#include "rtc_base/stringencode.h" > > using cricket::SessionDescription; > >@@ -93,37 +92,9 @@ void UpdateConnectionAddress( > > } // namespace > >-const char SessionDescriptionInterface::kOffer[] = "offer"; >-const char SessionDescriptionInterface::kPrAnswer[] = "pranswer"; >-const char SessionDescriptionInterface::kAnswer[] = "answer"; >- > const int JsepSessionDescription::kDefaultVideoCodecId = 100; > const char JsepSessionDescription::kDefaultVideoCodecName[] = "VP8"; > >-const char* SdpTypeToString(SdpType type) { >- switch (type) { >- case SdpType::kOffer: >- return SessionDescriptionInterface::kOffer; >- case SdpType::kPrAnswer: >- return SessionDescriptionInterface::kPrAnswer; >- case SdpType::kAnswer: >- return SessionDescriptionInterface::kAnswer; >- } >- return ""; >-} >- >-absl::optional<SdpType> SdpTypeFromString(const std::string& type_str) { >- if (type_str == SessionDescriptionInterface::kOffer) { >- return SdpType::kOffer; >- } else if (type_str == SessionDescriptionInterface::kPrAnswer) { >- return SdpType::kPrAnswer; >- } else if (type_str == SessionDescriptionInterface::kAnswer) { >- return SdpType::kAnswer; >- } else { >- return absl::nullopt; >- } >-} >- > // TODO(steveanton): Remove this default implementation once Chromium has been > // updated. 
> SdpType SessionDescriptionInterface::GetType() const { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jseptransport.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jseptransport.cc >index c3dfd32464ded2579a194c01d7ec1da8102f6dd5..88d20a45b17927f41d3625b14b11cf05ec4a1cb0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jseptransport.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jseptransport.cc >@@ -13,7 +13,6 @@ > #include <memory> > #include <utility> // for std::pair > >-#include "absl/memory/memory.h" > #include "api/candidate.h" > #include "p2p/base/p2pconstants.h" > #include "p2p/base/p2ptransportchannel.h" >@@ -94,11 +93,13 @@ JsepTransport::JsepTransport( > std::unique_ptr<webrtc::SrtpTransport> sdes_transport, > std::unique_ptr<webrtc::DtlsSrtpTransport> dtls_srtp_transport, > std::unique_ptr<DtlsTransportInternal> rtp_dtls_transport, >- std::unique_ptr<DtlsTransportInternal> rtcp_dtls_transport) >+ std::unique_ptr<DtlsTransportInternal> rtcp_dtls_transport, >+ std::unique_ptr<webrtc::MediaTransportInterface> media_transport) > : mid_(mid), > local_certificate_(local_certificate), > rtp_dtls_transport_(std::move(rtp_dtls_transport)), >- rtcp_dtls_transport_(std::move(rtcp_dtls_transport)) { >+ rtcp_dtls_transport_(std::move(rtcp_dtls_transport)), >+ media_transport_(std::move(media_transport)) { > RTC_DCHECK(rtp_dtls_transport_); > if (unencrypted_rtp_transport) { > RTC_DCHECK(!sdes_transport); >@@ -114,9 +115,17 @@ JsepTransport::JsepTransport( > RTC_DCHECK(!sdes_transport); > dtls_srtp_transport_ = std::move(dtls_srtp_transport); > } >+ >+ if (media_transport_) { >+ media_transport_->SetMediaTransportStateCallback(this); >+ } > } > >-JsepTransport::~JsepTransport() {} >+JsepTransport::~JsepTransport() { >+ if (media_transport_) { >+ media_transport_->SetMediaTransportStateCallback(nullptr); >+ } >+} > > webrtc::RTCError JsepTransport::SetLocalJsepTransportDescription( > const JsepTransportDescription& jsep_description, 
>@@ -315,8 +324,9 @@ webrtc::RTCError JsepTransport::VerifyCertificateFingerprint( > return webrtc::RTCError(webrtc::RTCErrorType::INVALID_PARAMETER, > "Fingerprint provided but no identity available."); > } >- std::unique_ptr<rtc::SSLFingerprint> fp_tmp(rtc::SSLFingerprint::Create( >- fingerprint->algorithm, certificate->identity())); >+ std::unique_ptr<rtc::SSLFingerprint> fp_tmp = >+ rtc::SSLFingerprint::CreateUnique(fingerprint->algorithm, >+ *certificate->identity()); > RTC_DCHECK(fp_tmp.get() != NULL); > if (*fp_tmp == *fingerprint) { > return webrtc::RTCError::OK(); >@@ -505,7 +515,8 @@ webrtc::RTCError JsepTransport::NegotiateAndSetDtlsParameters( > "Local fingerprint supplied when caller didn't offer DTLS."); > } else { > // We are not doing DTLS >- remote_fingerprint = absl::make_unique<rtc::SSLFingerprint>("", nullptr, 0); >+ remote_fingerprint = absl::make_unique<rtc::SSLFingerprint>( >+ "", rtc::ArrayView<const uint8_t>()); > } > // Now that we have negotiated everything, push it downward. > // Note that we cache the result so that if we have race conditions >@@ -633,4 +644,12 @@ bool JsepTransport::GetTransportStats(DtlsTransportInternal* dtls_transport, > return true; > } > >+void JsepTransport::OnStateChanged(webrtc::MediaTransportState state) { >+ // TODO(bugs.webrtc.org/9719) This method currently fires on the network >+ // thread, but media transport does not make such guarantees. We need to make >+ // sure this callback is guaranteed to be executed on the network thread. 
>+ media_transport_state_ = state; >+ SignalMediaTransportStateChanged(); >+} >+ > } // namespace cricket >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jseptransport.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jseptransport.h >index 8e898533ef97c1ec6d2bee478ffa96d20501fed0..952f2ccb8f74848f6465871070dea35b08c16200 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jseptransport.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jseptransport.h >@@ -19,6 +19,7 @@ > #include "absl/types/optional.h" > #include "api/candidate.h" > #include "api/jsep.h" >+#include "api/media_transport_interface.h" > #include "p2p/base/dtlstransport.h" > #include "p2p/base/p2pconstants.h" > #include "p2p/base/transportinfo.h" >@@ -70,11 +71,16 @@ struct JsepTransportDescription { > // > // On Threading: JsepTransport performs work solely on the network thread, and > // so its methods should only be called on the network thread. >-class JsepTransport : public sigslot::has_slots<> { >+class JsepTransport : public sigslot::has_slots<>, >+ public webrtc::MediaTransportStateCallback { > public: > // |mid| is just used for log statements in order to identify the Transport. > // Note that |local_certificate| is allowed to be null since a remote > // description may be set before a local certificate is generated. >+ // >+ // |media_transport| is optional (experimental). If available it will be used >+ // to send / receive encoded audio and video frames instead of RTP. >+ // Currently |media_transport| can co-exist with RTP / RTCP transports.
> JsepTransport( > const std::string& mid, > const rtc::scoped_refptr<rtc::RTCCertificate>& local_certificate, >@@ -82,7 +88,8 @@ class JsepTransport : public sigslot::has_slots<> { > std::unique_ptr<webrtc::SrtpTransport> sdes_transport, > std::unique_ptr<webrtc::DtlsSrtpTransport> dtls_srtp_transport, > std::unique_ptr<DtlsTransportInternal> rtp_dtls_transport, >- std::unique_ptr<DtlsTransportInternal> rtcp_dtls_transport); >+ std::unique_ptr<DtlsTransportInternal> rtcp_dtls_transport, >+ std::unique_ptr<webrtc::MediaTransportInterface> media_transport); > > ~JsepTransport() override; > >@@ -158,11 +165,26 @@ class JsepTransport : public sigslot::has_slots<> { > return rtcp_dtls_transport_.get(); > } > >+ // Returns media transport, if available. >+ // Note that media transport is owned by jseptransport and the pointer >+ // to media transport becomes invalid after destruction of jseptransport. >+ webrtc::MediaTransportInterface* media_transport() const { >+ return media_transport_.get(); >+ } >+ >+ // Returns the latest media transport state. >+ webrtc::MediaTransportState media_transport_state() const { >+ return media_transport_state_; >+ } >+ > // This is signaled when RTCP-mux becomes active and > // |rtcp_dtls_transport_| is destroyed. The JsepTransportController will > // handle the signal and update the aggregate transport states. > sigslot::signal<> SignalRtcpMuxActive; > >+ // This is signaled for changes in |media_transport_| state. >+ sigslot::signal<> SignalMediaTransportStateChanged; >+ > // TODO(deadbeef): The methods below are only public for testing. Should make > // them utility functions or objects so they can be tested independently from > // this class. >@@ -218,6 +240,9 @@ class JsepTransport : public sigslot::has_slots<> { > bool GetTransportStats(DtlsTransportInternal* dtls_transport, > TransportStats* stats); > >+ // Invoked whenever the state of the media transport changes. 
>+ void OnStateChanged(webrtc::MediaTransportState state) override; >+ > const std::string mid_; > // needs-ice-restart bit as described in JSEP. > bool needs_ice_restart_ = false; >@@ -241,6 +266,14 @@ class JsepTransport : public sigslot::has_slots<> { > absl::optional<std::vector<int>> send_extension_ids_; > absl::optional<std::vector<int>> recv_extension_ids_; > >+ // Optional media transport (experimental). >+ std::unique_ptr<webrtc::MediaTransportInterface> media_transport_; >+ >+ // If |media_transport_| is provided, this variable represents the state of >+ // media transport. >+ webrtc::MediaTransportState media_transport_state_ = >+ webrtc::MediaTransportState::kPending; >+ > RTC_DISALLOW_COPY_AND_ASSIGN(JsepTransport); > }; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jseptransport_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jseptransport_unittest.cc >index 56a742c338bc7fe03ad3c21878e1ca40f091cc75..e518301ce2c862fc75f6491aff4ad03fbc169e6b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jseptransport_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jseptransport_unittest.cc >@@ -98,11 +98,16 @@ class JsepTransport2Test : public testing::Test, public sigslot::has_slots<> { > RTC_NOTREACHED(); > } > >+ // TODO(sukhanov): Currently there is no media_transport specific >+ // logic in jseptransport, so jseptransport unittests are created with >+ // media_transport = nullptr. In the future we will probably add >+ // more logic that require unit tests. Note that creation of media_transport >+ // is covered in jseptransportcontroller_unittest. 
> auto jsep_transport = absl::make_unique<JsepTransport>( > kTransportName, /*local_certificate=*/nullptr, > std::move(unencrypted_rtp_transport), std::move(sdes_transport), > std::move(dtls_srtp_transport), std::move(rtp_dtls_transport), >- std::move(rtcp_dtls_transport)); >+ std::move(rtcp_dtls_transport), /*media_transport=*/nullptr); > > signal_rtcp_mux_active_received_ = false; > jsep_transport->SignalRtcpMuxActive.connect( >@@ -121,7 +126,7 @@ class JsepTransport2Test : public testing::Test, public sigslot::has_slots<> { > > std::unique_ptr<rtc::SSLFingerprint> fingerprint; > if (cert) { >- fingerprint.reset(rtc::SSLFingerprint::CreateFromCertificate(cert)); >+ fingerprint = rtc::SSLFingerprint::CreateFromCertificate(*cert); > } > jsep_description.transport_desc = > TransportDescription(std::vector<std::string>(), ufrag, pwd, >@@ -378,11 +383,12 @@ TEST_P(JsepTransport2WithRtcpMux, VerifyCertificateFingerprint) { > ASSERT_NE(nullptr, certificate); > > std::string digest_algorithm; >- ASSERT_TRUE(certificate->ssl_certificate().GetSignatureDigestAlgorithm( >+ ASSERT_TRUE(certificate->GetSSLCertificate().GetSignatureDigestAlgorithm( > &digest_algorithm)); > ASSERT_FALSE(digest_algorithm.empty()); >- std::unique_ptr<rtc::SSLFingerprint> good_fingerprint( >- rtc::SSLFingerprint::Create(digest_algorithm, certificate->identity())); >+ std::unique_ptr<rtc::SSLFingerprint> good_fingerprint = >+ rtc::SSLFingerprint::CreateUnique(digest_algorithm, >+ *certificate->identity()); > ASSERT_NE(nullptr, good_fingerprint); > > EXPECT_TRUE(jsep_transport_ >@@ -1045,7 +1051,7 @@ class JsepTransport2HeaderExtensionTest > void OnReadPacket1(rtc::PacketTransportInternal* transport, > const char* data, > size_t size, >- const rtc::PacketTime& time, >+ const int64_t& /* packet_time_us */, > int flags) { > RTC_LOG(LS_INFO) << "JsepTransport 1 Received a packet."; > CompareHeaderExtensions( >@@ -1058,7 +1064,7 @@ class JsepTransport2HeaderExtensionTest > void 
OnReadPacket2(rtc::PacketTransportInternal* transport, > const char* data, > size_t size, >- const rtc::PacketTime& time, >+ const int64_t& /* packet_time_us */, > int flags) { > RTC_LOG(LS_INFO) << "JsepTransport 2 Received a packet."; > CompareHeaderExtensions( >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jseptransportcontroller.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jseptransportcontroller.cc >index 4c69a6a07e29588ce9bfa4dfa7be34855a2ff012..78ecaf31deac4e41e8db488a0df6ca81504d2cb7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jseptransportcontroller.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jseptransportcontroller.cc >@@ -14,10 +14,11 @@ > #include <memory> > #include <utility> > >-#include "absl/memory/memory.h" > #include "p2p/base/port.h" >+#include "pc/srtpfilter.h" > #include "rtc_base/bind.h" > #include "rtc_base/checks.h" >+#include "rtc_base/key_derivation.h" > #include "rtc_base/thread.h" > > using webrtc::SdpType; >@@ -133,6 +134,24 @@ RtpTransportInternal* JsepTransportController::GetRtpTransport( > return jsep_transport->rtp_transport(); > } > >+MediaTransportInterface* JsepTransportController::GetMediaTransport( >+ const std::string& mid) const { >+ auto jsep_transport = GetJsepTransportForMid(mid); >+ if (!jsep_transport) { >+ return nullptr; >+ } >+ return jsep_transport->media_transport(); >+} >+ >+MediaTransportState JsepTransportController::GetMediaTransportState( >+ const std::string& mid) const { >+ auto jsep_transport = GetJsepTransportForMid(mid); >+ if (!jsep_transport) { >+ return MediaTransportState::kPending; >+ } >+ return jsep_transport->media_transport_state(); >+} >+ > cricket::DtlsTransportInternal* JsepTransportController::GetDtlsTransport( > const std::string& mid) const { > auto jsep_transport = GetJsepTransportForMid(mid); >@@ -370,6 +389,15 @@ void JsepTransportController::SetActiveResetSrtpParams( > } > } > >+void JsepTransportController::SetMediaTransportFactory( >+ 
MediaTransportFactory* media_transport_factory) { >+ RTC_DCHECK(media_transport_factory == config_.media_transport_factory || >+ jsep_transports_by_name_.empty()) >+ << "You can only call SetMediaTransportFactory before " >+ "JsepTransportController created its first transport."; >+ config_.media_transport_factory = media_transport_factory; >+} >+ > std::unique_ptr<cricket::DtlsTransportInternal> > JsepTransportController::CreateDtlsTransport(const std::string& transport_name, > bool rtcp) { >@@ -471,6 +499,8 @@ JsepTransportController::CreateDtlsSrtpTransport( > rtcp_dtls_transport); > dtls_srtp_transport->SetActiveResetSrtpParams( > config_.active_reset_srtp_params); >+ dtls_srtp_transport->SignalDtlsStateChange.connect( >+ this, &JsepTransportController::UpdateAggregateStates_n); > return dtls_srtp_transport; > } > >@@ -523,7 +553,7 @@ RTCError JsepTransportController::ApplyDescription_n( > (IsBundled(content_info.name) && content_info.name != *bundled_mid())) { > continue; > } >- error = MaybeCreateJsepTransport(content_info); >+ error = MaybeCreateJsepTransport(local, content_info); > if (!error.ok()) { > return error; > } >@@ -732,6 +762,10 @@ bool JsepTransportController::HandleBundledContent( > // BaseChannel/SctpTransport change the RtpTransport/DtlsTransport first, > // then destroy the cricket::JsepTransport. > if (SetTransportForMid(content_info.name, jsep_transport)) { >+ // TODO(bugs.webrtc.org/9719) For media transport this is far from ideal, >+ // because it means that we first create media transport and start >+ // connecting it, and then we destroy it. We will need to address it before >+ // video path is enabled. 
> MaybeDestroyJsepTransport(content_info.name); > return true; > } >@@ -749,12 +783,12 @@ bool JsepTransportController::SetTransportForMid( > mid_to_transport_[mid] = jsep_transport; > return config_.transport_observer->OnTransportChanged( > mid, jsep_transport->rtp_transport(), >- jsep_transport->rtp_dtls_transport()); >+ jsep_transport->rtp_dtls_transport(), jsep_transport->media_transport()); > } > > void JsepTransportController::RemoveTransportForMid(const std::string& mid) { >- bool ret = >- config_.transport_observer->OnTransportChanged(mid, nullptr, nullptr); >+ bool ret = config_.transport_observer->OnTransportChanged(mid, nullptr, >+ nullptr, nullptr); > // Calling OnTransportChanged with nullptr should always succeed, since it is > // only expected to fail when adding media to a transport (not removing). > RTC_DCHECK(ret); >@@ -806,7 +840,7 @@ std::vector<int> JsepTransportController::GetEncryptedHeaderExtensionIds( > static_cast<const cricket::MediaContentDescription*>( > content_info.description); > >- if (!config_.crypto_options.enable_encrypted_rtp_header_extensions) { >+ if (!config_.crypto_options.srtp.enable_encrypted_rtp_header_extensions) { > return std::vector<int>(); > } > >@@ -888,7 +922,94 @@ cricket::JsepTransport* JsepTransportController::GetJsepTransportByName( > return (it == jsep_transports_by_name_.end()) ? nullptr : it->second.get(); > } > >+std::unique_ptr<webrtc::MediaTransportInterface> >+JsepTransportController::MaybeCreateMediaTransport( >+ const cricket::ContentInfo& content_info, >+ bool local, >+ cricket::IceTransportInternal* ice_transport) { >+ absl::optional<cricket::CryptoParams> selected_crypto_for_media_transport; >+ if (content_info.media_description() && >+ !content_info.media_description()->cryptos().empty()) { >+ // Order of cryptos is deterministic (rfc4568, 5.1.1), so we just select the >+ // first one (in fact the first one should be the most preferred one.) 
We >+ // ignore the HMAC size, as media transport crypto settings currently don't >+ // expose HMAC size, nor crypto protocol for that matter. >+ selected_crypto_for_media_transport = >+ content_info.media_description()->cryptos()[0]; >+ } >+ >+ if (config_.media_transport_factory != nullptr) { >+ if (!selected_crypto_for_media_transport.has_value()) { >+ RTC_LOG(LS_WARNING) << "a=crypto line was not found in the offer. Most " >+ "likely you did not enable SDES. " >+ "Make sure to pass config.enable_dtls_srtp=false " >+ "to RTCConfiguration. " >+ "Cannot continue with media transport. Falling " >+ "back to RTP. is_local=" >+ << local; >+ >+ // Remove media_transport_factory from config, because we don't want to >+ // use it on the subsequent call (for the other side of the offer). >+ config_.media_transport_factory = nullptr; >+ } else { >+ // Note that we ignore the lifetime and length here. >+ // In fact we take those bits (inline, lifetime and length) and keep them as >+ // part of key derivation. >+ // >+ // Technically, we are also not following rfc4568, which requires us to >+ // send an answer with the key that we chose. In practice, for media >+ // transport, the current approach should be sufficient (we take the key >+ // that the sender offered, and the caller assumes we will use it. We are not >+ // signaling back that we indeed used it.) 
>+ std::unique_ptr<rtc::KeyDerivation> key_derivation = >+ rtc::KeyDerivation::Create(rtc::KeyDerivationAlgorithm::HKDF_SHA256); >+ const std::string label = "MediaTransportLabel"; >+ constexpr int kDerivedKeyByteSize = 32; >+ >+ int key_len, salt_len; >+ if (!rtc::GetSrtpKeyAndSaltLengths( >+ rtc::SrtpCryptoSuiteFromName( >+ selected_crypto_for_media_transport.value().cipher_suite), >+ &key_len, &salt_len)) { >+ RTC_CHECK(false) << "Cannot set up secure media transport"; >+ } >+ rtc::ZeroOnFreeBuffer<uint8_t> raw_key(key_len + salt_len); >+ >+ cricket::SrtpFilter::ParseKeyParams( >+ selected_crypto_for_media_transport.value().key_params, >+ raw_key.data(), raw_key.size()); >+ absl::optional<rtc::ZeroOnFreeBuffer<uint8_t>> key = >+ key_derivation->DeriveKey( >+ raw_key, >+ /*salt=*/nullptr, >+ rtc::ArrayView<const uint8_t>( >+ reinterpret_cast<const uint8_t*>(label.data()), label.size()), >+ kDerivedKeyByteSize); >+ >+ // We want to crash the app if we don't have a key, and not silently fall >+ // back to the unsecure communication. >+ RTC_CHECK(key.has_value()); >+ MediaTransportSettings settings; >+ settings.is_caller = local; >+ settings.pre_shared_key = >+ std::string(reinterpret_cast<const char*>(key.value().data()), >+ key.value().size()); >+ auto media_transport_result = >+ config_.media_transport_factory->CreateMediaTransport( >+ ice_transport, network_thread_, settings); >+ >+ // TODO(sukhanov): Proper error handling. 
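[Editor's illustration, not part of the patch.] The derivation above takes the raw SDES key-params from the selected a=crypto line and stretches them into a 32-byte pre-shared key with HKDF-SHA256 under the fixed label "MediaTransportLabel". A rough Python sketch of that flow, assuming rtc::KeyDerivation implements plain RFC 5869 HKDF with an empty salt (the patch does not show KeyDerivation's internals, so treat the exact construction as an assumption):

```python
import base64
import hashlib
import hmac

def hkdf_sha256(ikm, salt, info, length):
    """Minimal RFC 5869 HKDF with SHA-256: extract, then expand to |length| bytes."""
    # Extract: an empty salt defaults to HashLen zero bytes per the RFC.
    prk = hmac.new(salt or b"\x00" * 32, ikm, hashlib.sha256).digest()
    # Expand: T(i) = HMAC(PRK, T(i-1) || info || i), concatenated and truncated.
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

# The inline key-params carry base64(key || salt); for AES_CM_128_HMAC_SHA1_80
# that is 16 + 14 = 30 bytes. This sample value is the one used by the
# AddCryptoSettings test helper elsewhere in this patch.
raw_key = base64.b64decode("YUJDZGVmZ2hpSktMbW9QUXJzVHVWd3l6MTIzNDU2")
psk = hkdf_sha256(raw_key, salt=b"", info=b"MediaTransportLabel", length=32)
```

The resulting 32-byte value corresponds to the pre_shared_key the controller places into MediaTransportSettings before calling CreateMediaTransport.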
>+ RTC_CHECK(media_transport_result.ok()); >+ >+ return media_transport_result.MoveValue(); >+ } >+ } >+ >+ return nullptr; >+} >+ > RTCError JsepTransportController::MaybeCreateJsepTransport( >+ bool local, > const cricket::ContentInfo& content_info) { > RTC_DCHECK(network_thread_->IsCurrent()); > cricket::JsepTransport* transport = GetJsepTransportByName(content_info.name); >@@ -906,17 +1027,24 @@ RTCError JsepTransportController::MaybeCreateJsepTransport( > > std::unique_ptr<cricket::DtlsTransportInternal> rtp_dtls_transport = > CreateDtlsTransport(content_info.name, /*rtcp =*/false); >+ > std::unique_ptr<cricket::DtlsTransportInternal> rtcp_dtls_transport; >+ std::unique_ptr<RtpTransport> unencrypted_rtp_transport; >+ std::unique_ptr<SrtpTransport> sdes_transport; >+ std::unique_ptr<DtlsSrtpTransport> dtls_srtp_transport; >+ std::unique_ptr<MediaTransportInterface> media_transport; >+ > if (config_.rtcp_mux_policy != > PeerConnectionInterface::kRtcpMuxPolicyRequire && > content_info.type == cricket::MediaProtocolType::kRtp) { > rtcp_dtls_transport = > CreateDtlsTransport(content_info.name, /*rtcp =*/true); > } >+ media_transport = MaybeCreateMediaTransport( >+ content_info, local, rtp_dtls_transport->ice_transport()); > >- std::unique_ptr<RtpTransport> unencrypted_rtp_transport; >- std::unique_ptr<SrtpTransport> sdes_transport; >- std::unique_ptr<DtlsSrtpTransport> dtls_srtp_transport; >+ // TODO(sukhanov): Do not create RTP/RTCP transports if media transport is >+ // used. 
> if (config_.disable_encryption) { > unencrypted_rtp_transport = CreateUnencryptedRtpTransport( > content_info.name, rtp_dtls_transport.get(), rtcp_dtls_transport.get()); >@@ -932,9 +1060,12 @@ RTCError JsepTransportController::MaybeCreateJsepTransport( > absl::make_unique<cricket::JsepTransport>( > content_info.name, certificate_, std::move(unencrypted_rtp_transport), > std::move(sdes_transport), std::move(dtls_srtp_transport), >- std::move(rtp_dtls_transport), std::move(rtcp_dtls_transport)); >+ std::move(rtp_dtls_transport), std::move(rtcp_dtls_transport), >+ std::move(media_transport)); > jsep_transport->SignalRtcpMuxActive.connect( > this, &JsepTransportController::UpdateAggregateStates_n); >+ jsep_transport->SignalMediaTransportStateChanged.connect( >+ this, &JsepTransportController::OnMediaTransportStateChanged_n); > SetTransportForMid(content_info.name, jsep_transport.get()); > > jsep_transports_by_name_[content_info.name] = std::move(jsep_transport); >@@ -956,12 +1087,19 @@ void JsepTransportController::MaybeDestroyJsepTransport( > return; > } > } >+ > jsep_transports_by_name_.erase(mid); > UpdateAggregateStates_n(); > } > > void JsepTransportController::DestroyAllJsepTransports_n() { > RTC_DCHECK(network_thread_->IsCurrent()); >+ >+ for (const auto& jsep_transport : jsep_transports_by_name_) { >+ config_.transport_observer->OnTransportChanged(jsep_transport.first, >+ nullptr, nullptr, nullptr); >+ } >+ > jsep_transports_by_name_.clear(); > } > >@@ -1116,18 +1254,35 @@ void JsepTransportController::OnTransportStateChanged_n( > UpdateAggregateStates_n(); > } > >+void JsepTransportController::OnMediaTransportStateChanged_n() { >+ SignalMediaTransportStateChanged(); >+ UpdateAggregateStates_n(); >+} >+ > void JsepTransportController::UpdateAggregateStates_n() { > RTC_DCHECK(network_thread_->IsCurrent()); > > auto dtls_transports = GetDtlsTransports(); > cricket::IceConnectionState new_connection_state = > cricket::kIceConnectionConnecting; >+ 
PeerConnectionInterface::IceConnectionState new_ice_connection_state = >+ PeerConnectionInterface::IceConnectionState::kIceConnectionNew; >+ PeerConnectionInterface::PeerConnectionState new_combined_state = >+ PeerConnectionInterface::PeerConnectionState::kNew; > cricket::IceGatheringState new_gathering_state = cricket::kIceGatheringNew; > bool any_failed = false; >+ >+ // TODO(http://bugs.webrtc.org/9719) If(when) media_transport disables >+ // dtls_transports entirely, the below line will have to be changed to account >+ // for the fact that dtls transports might be absent. > bool all_connected = !dtls_transports.empty(); > bool all_completed = !dtls_transports.empty(); > bool any_gathering = false; > bool all_done_gathering = !dtls_transports.empty(); >+ >+ std::map<IceTransportState, int> ice_state_counts; >+ std::map<cricket::DtlsTransportState, int> dtls_state_counts; >+ > for (const auto& dtls : dtls_transports) { > any_failed = any_failed || dtls->ice_transport()->GetState() == > cricket::IceTransportState::STATE_FAILED; >@@ -1144,7 +1299,35 @@ void JsepTransportController::UpdateAggregateStates_n() { > all_done_gathering = > all_done_gathering && dtls->ice_transport()->gathering_state() == > cricket::kIceGatheringComplete; >+ >+ dtls_state_counts[dtls->dtls_state()]++; >+ ice_state_counts[dtls->ice_transport()->GetIceTransportState()]++; > } >+ >+ for (auto it = jsep_transports_by_name_.begin(); >+ it != jsep_transports_by_name_.end(); ++it) { >+ auto jsep_transport = it->second.get(); >+ if (!jsep_transport->media_transport()) { >+ continue; >+ } >+ >+ // There is no 'kIceConnectionDisconnected', so we only need to handle >+ // connected and completed. >+ // We treat kClosed as failed, because if it happens before shutting down >+ // media transports it means that there was a failure. 
>+ // MediaTransportInterface allows to flip back and forth between kWritable >+ // and kPending, but there does not exist an implementation that does that, >+ // and the contract of jsep transport controller doesn't quite expect that. >+ // When this happens, we would go from connected to connecting state, but >+ // this may change in future. >+ any_failed |= jsep_transport->media_transport_state() == >+ webrtc::MediaTransportState::kClosed; >+ all_completed &= jsep_transport->media_transport_state() == >+ webrtc::MediaTransportState::kWritable; >+ all_connected &= jsep_transport->media_transport_state() == >+ webrtc::MediaTransportState::kWritable; >+ } >+ > if (any_failed) { > new_connection_state = cricket::kIceConnectionFailed; > } else if (all_completed) { >@@ -1160,6 +1343,127 @@ void JsepTransportController::UpdateAggregateStates_n() { > }); > } > >+ // Compute the current RTCIceConnectionState as described in >+ // https://www.w3.org/TR/webrtc/#dom-rtciceconnectionstate. >+ // The PeerConnection is responsible for handling the "closed" state. >+ int total_ice_checking = ice_state_counts[IceTransportState::kChecking]; >+ int total_ice_connected = ice_state_counts[IceTransportState::kConnected]; >+ int total_ice_completed = ice_state_counts[IceTransportState::kCompleted]; >+ int total_ice_failed = ice_state_counts[IceTransportState::kFailed]; >+ int total_ice_disconnected = >+ ice_state_counts[IceTransportState::kDisconnected]; >+ int total_ice_closed = ice_state_counts[IceTransportState::kClosed]; >+ int total_ice_new = ice_state_counts[IceTransportState::kNew]; >+ int total_ice = dtls_transports.size(); >+ >+ if (total_ice_failed > 0) { >+ // Any of the RTCIceTransports are in the "failed" state. >+ new_ice_connection_state = PeerConnectionInterface::kIceConnectionFailed; >+ } else if (total_ice_disconnected > 0) { >+ // Any of the RTCIceTransports are in the "disconnected" state and none of >+ // them are in the "failed" state. 
>+ new_ice_connection_state = >+ PeerConnectionInterface::kIceConnectionDisconnected; >+ } else if (total_ice_checking > 0) { >+ // Any of the RTCIceTransports are in the "checking" state and none of them >+ // are in the "disconnected" or "failed" state. >+ new_ice_connection_state = PeerConnectionInterface::kIceConnectionChecking; >+ } else if (total_ice_completed + total_ice_closed == total_ice && >+ total_ice_completed > 0) { >+ // All RTCIceTransports are in the "completed" or "closed" state and at >+ // least one of them is in the "completed" state. >+ new_ice_connection_state = PeerConnectionInterface::kIceConnectionCompleted; >+ } else if (total_ice_connected + total_ice_completed + total_ice_closed == >+ total_ice && >+ total_ice_connected > 0) { >+ // All RTCIceTransports are in the "connected", "completed" or "closed" >+ // state and at least one of them is in the "connected" state. >+ new_ice_connection_state = PeerConnectionInterface::kIceConnectionConnected; >+ } else if ((total_ice_new > 0 && >+ total_ice_checking + total_ice_disconnected + total_ice_failed == >+ 0) || >+ total_ice == total_ice_closed) { >+ // Any of the RTCIceTransports are in the "new" state and none of them are >+ // in the "checking", "disconnected" or "failed" state, or all >+ // RTCIceTransports are in the "closed" state, or there are no transports. >+ new_ice_connection_state = PeerConnectionInterface::kIceConnectionNew; >+ } else { >+ RTC_NOTREACHED(); >+ } >+ >+ if (standardized_ice_connection_state_ != new_ice_connection_state) { >+ standardized_ice_connection_state_ = new_ice_connection_state; >+ invoker_.AsyncInvoke<void>( >+ RTC_FROM_HERE, signaling_thread_, [this, new_ice_connection_state] { >+ SignalStandardizedIceConnectionState(new_ice_connection_state); >+ }); >+ } >+ >+ // Compute the current RTCPeerConnectionState as described in >+ // https://www.w3.org/TR/webrtc/#dom-rtcpeerconnectionstate. >+ // The PeerConnection is responsible for handling the "closed" state. 
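[Editor's illustration, not part of the patch.] The counting logic above collapses the per-transport ICE states into one RTCIceConnectionState following the W3C rules cited in the comments. A compact Python sketch of the same decision ladder (illustrative only: plain strings stand in for the IceTransportState enum, and the "closed" aggregate is left to the PeerConnection, as in the patch):

```python
from collections import Counter

def aggregate_ice_state(states):
    """Map a list of per-transport ICE states onto a single
    RTCIceConnectionState, mirroring UpdateAggregateStates_n."""
    n = Counter(states)
    total = len(states)
    if n["failed"] > 0:
        return "failed"          # any transport failed
    if n["disconnected"] > 0:
        return "disconnected"    # any disconnected, none failed
    if n["checking"] > 0:
        return "checking"        # any checking, none disconnected/failed
    if total and n["completed"] + n["closed"] == total and n["completed"] > 0:
        return "completed"       # all completed/closed, at least one completed
    if total and n["connected"] + n["completed"] + n["closed"] == total and n["connected"] > 0:
        return "connected"       # all connected/completed/closed, one connected
    return "new"                 # any new (none checking/disconnected/failed),
                                 # all closed, or no transports at all
```

For example, one "checking" transport keeps the aggregate at "checking" even if every other transport is already "connected".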
>+ // Note that "connecting" is only a valid state for DTLS transports while >+ // "checking", "completed" and "disconnected" are only valid for ICE >+ // transports. >+ int total_connected = total_ice_connected + >+ dtls_state_counts[cricket::DTLS_TRANSPORT_CONNECTED]; >+ int total_dtls_connecting = >+ dtls_state_counts[cricket::DTLS_TRANSPORT_CONNECTING]; >+ int total_failed = >+ total_ice_failed + dtls_state_counts[cricket::DTLS_TRANSPORT_FAILED]; >+ int total_closed = >+ total_ice_closed + dtls_state_counts[cricket::DTLS_TRANSPORT_CLOSED]; >+ int total_new = >+ total_ice_new + dtls_state_counts[cricket::DTLS_TRANSPORT_NEW]; >+ int total_transports = total_ice * 2; >+ >+ if (total_failed > 0) { >+ // Any of the RTCIceTransports or RTCDtlsTransports are in a "failed" state. >+ new_combined_state = PeerConnectionInterface::PeerConnectionState::kFailed; >+ } else if (total_ice_disconnected > 0 && >+ total_dtls_connecting + total_ice_checking == 0) { >+ // Any of the RTCIceTransports or RTCDtlsTransports are in the >+ // "disconnected" state and none of them are in the "failed" or "connecting" >+ // or "checking" state. >+ new_combined_state = >+ PeerConnectionInterface::PeerConnectionState::kDisconnected; >+ } else if (total_dtls_connecting + total_ice_checking > 0) { >+ // Any of the RTCIceTransports or RTCDtlsTransports are in the "connecting" >+ // or "checking" state and none of them is in the "failed" state. >+ new_combined_state = >+ PeerConnectionInterface::PeerConnectionState::kConnecting; >+ } else if (total_connected + total_ice_completed + total_closed == >+ total_transports && >+ total_connected + total_ice_completed > 0) { >+ // All RTCIceTransports and RTCDtlsTransports are in the "connected", >+ // "completed" or "closed" state and at least one of them is in the >+ // "connected" or "completed" state. 
>+ new_combined_state = >+ PeerConnectionInterface::PeerConnectionState::kConnected; >+ } else if ((total_new > 0 && total_dtls_connecting + total_ice_checking + >+ total_failed + total_ice_disconnected == >+ 0) || >+ total_transports == total_closed) { >+ // Any of the RTCIceTransports or RTCDtlsTransports are in the "new" state >+ // and none of the transports are in the "connecting", "checking", "failed" >+ // or "disconnected" state, or all transports are in the "closed" state, or >+ // there are no transports. >+ // >+ // Note that if none of the other conditions hold this is guaranteed to be >+ // true. >+ new_combined_state = PeerConnectionInterface::PeerConnectionState::kNew; >+ } else { >+ RTC_NOTREACHED(); >+ } >+ >+ if (combined_connection_state_ != new_combined_state) { >+ combined_connection_state_ = new_combined_state; >+ invoker_.AsyncInvoke<void>(RTC_FROM_HERE, signaling_thread_, >+ [this, new_combined_state] { >+ SignalConnectionState(new_combined_state); >+ }); >+ } >+ > if (all_done_gathering) { > new_gathering_state = cricket::kIceGatheringComplete; > } else if (any_gathering) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jseptransportcontroller.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jseptransportcontroller.h >index d9340cf357d0e1345db635893e2fb27a34451480..3ed7f5f433b7a6250b345ed96086fe2f383d30c2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jseptransportcontroller.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jseptransportcontroller.h >@@ -18,6 +18,8 @@ > #include <vector> > > #include "api/candidate.h" >+#include "api/crypto/cryptooptions.h" >+#include "api/media_transport_interface.h" > #include "api/peerconnectioninterface.h" > #include "logging/rtc_event_log/rtc_event_log.h" > #include "media/sctp/sctptransportinternal.h" >@@ -32,7 +34,6 @@ > #include "rtc_base/asyncinvoker.h" > #include "rtc_base/constructormagic.h" > #include "rtc_base/refcountedobject.h" >-#include 
"rtc_base/sslstreamadapter.h" > #include "rtc_base/third_party/sigslot/sigslot.h" > > namespace rtc { >@@ -56,7 +57,8 @@ class JsepTransportController : public sigslot::has_slots<> { > virtual bool OnTransportChanged( > const std::string& mid, > RtpTransportInternal* rtp_transport, >- cricket::DtlsTransportInternal* dtls_transport) = 0; >+ cricket::DtlsTransportInternal* dtls_transport, >+ MediaTransportInterface* media_transport) = 0; > }; > > struct Config { >@@ -67,7 +69,7 @@ class JsepTransportController : public sigslot::has_slots<> { > rtc::SSLProtocolVersion ssl_max_version = rtc::SSL_PROTOCOL_DTLS_12; > // |crypto_options| is used to determine if created DTLS transports > // negotiate GCM crypto suites or not. >- rtc::CryptoOptions crypto_options; >+ webrtc::CryptoOptions crypto_options; > PeerConnectionInterface::BundlePolicy bundle_policy = > PeerConnectionInterface::kBundlePolicyBalanced; > PeerConnectionInterface::RtcpMuxPolicy rtcp_mux_policy = >@@ -79,6 +81,13 @@ class JsepTransportController : public sigslot::has_slots<> { > Observer* transport_observer = nullptr; > bool active_reset_srtp_params = false; > RtcEventLog* event_log = nullptr; >+ >+ // Optional media transport factory (experimental). If provided it will be >+ // used to create media_transport and will be used to send / receive >+ // audio and video frames instead of RTP. Note that currently >+ // media_transport co-exists with RTP / RTCP transports and uses the same >+ // underlying ICE transport. >+ MediaTransportFactory* media_transport_factory = nullptr; > }; > > // The ICE related events are signaled on the |signaling_thread|. 
>@@ -108,6 +117,9 @@ class JsepTransportController : public sigslot::has_slots<> { > cricket::DtlsTransportInternal* GetRtcpDtlsTransport( > const std::string& mid) const; > >+ MediaTransportInterface* GetMediaTransport(const std::string& mid) const; >+ MediaTransportState GetMediaTransportState(const std::string& mid) const; >+ > /********************* > * ICE-related methods > ********************/ >@@ -157,6 +169,12 @@ class JsepTransportController : public sigslot::has_slots<> { > > void SetActiveResetSrtpParams(bool active_reset_srtp_params); > >+ // Allows to overwrite the settings from config. You may set or reset the >+ // media transport factory on the jsep transport controller, as long as you >+ // did not call 'GetMediaTransport' or 'MaybeCreateJsepTransport'. Once Jsep >+ // transport is created, you can't change this setting. >+ void SetMediaTransportFactory(MediaTransportFactory* media_transport_factory); >+ > // All of these signals are fired on the signaling thread. > > // If any transport failed => failed, >@@ -165,6 +183,11 @@ class JsepTransportController : public sigslot::has_slots<> { > // Else => connecting > sigslot::signal1<cricket::IceConnectionState> SignalIceConnectionState; > >+ sigslot::signal1<PeerConnectionInterface::PeerConnectionState> >+ SignalConnectionState; >+ sigslot::signal1<PeerConnectionInterface::IceConnectionState> >+ SignalStandardizedIceConnectionState; >+ > // If all transports done gathering => complete, > // Else if any are gathering => gathering, > // Else => new >@@ -179,6 +202,8 @@ class JsepTransportController : public sigslot::has_slots<> { > > sigslot::signal1<rtc::SSLHandshakeError> SignalDtlsHandshakeError; > >+ sigslot::signal<> SignalMediaTransportStateChanged; >+ > private: > RTCError ApplyDescription_n(bool local, > SdpType type, >@@ -241,7 +266,20 @@ class JsepTransportController : public sigslot::has_slots<> { > cricket::JsepTransport* GetJsepTransportByName( > const std::string& transport_name); > >- 
RTCError MaybeCreateJsepTransport(const cricket::ContentInfo& content_info); >+ // Creates jsep transport. Noop if transport is already created. >+ // Transport is created either during SetLocalDescription (|local| == true) or >+ // during SetRemoteDescription (|local| == false). Passing |local| helps to >+ // differentiate initiator (caller) from answerer (callee). >+ RTCError MaybeCreateJsepTransport(bool local, >+ const cricket::ContentInfo& content_info); >+ >+ // Creates media transport if config wants to use it, and pre-shared key is >+ // provided in content info. It modifies the config to disable media transport >+ // if pre-shared key is not provided. >+ std::unique_ptr<webrtc::MediaTransportInterface> MaybeCreateMediaTransport( >+ const cricket::ContentInfo& content_info, >+ bool local, >+ cricket::IceTransportInternal* ice_transport); > void MaybeDestroyJsepTransport(const std::string& mid); > void DestroyAllJsepTransports_n(); > >@@ -285,6 +323,7 @@ class JsepTransportController : public sigslot::has_slots<> { > const cricket::Candidates& candidates); > void OnTransportRoleConflict_n(cricket::IceTransportInternal* transport); > void OnTransportStateChanged_n(cricket::IceTransportInternal* transport); >+ void OnMediaTransportStateChanged_n(); > > void UpdateAggregateStates_n(); > >@@ -301,9 +340,16 @@ class JsepTransportController : public sigslot::has_slots<> { > // (BaseChannel/SctpTransport) and the JsepTransport underneath. > std::map<std::string, cricket::JsepTransport*> mid_to_transport_; > >- // Aggregate state for Transports. >+ // Aggregate states for Transports. 
>+ // standardized_ice_connection_state_ is intended to replace >+ // ice_connection_state, see bugs.webrtc.org/9308 > cricket::IceConnectionState ice_connection_state_ = > cricket::kIceConnectionConnecting; >+ PeerConnectionInterface::IceConnectionState >+ standardized_ice_connection_state_ = >+ PeerConnectionInterface::kIceConnectionNew; >+ PeerConnectionInterface::PeerConnectionState combined_connection_state_ = >+ PeerConnectionInterface::PeerConnectionState::kNew; > cricket::IceGatheringState ice_gathering_state_ = cricket::kIceGatheringNew; > > Config config_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jseptransportcontroller_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jseptransportcontroller_unittest.cc >index 6f8693bea7a9b57150258d0486c0a7f24eefb3f7..129d22a4fcc84730e1f99b7a854e967ca2ef9a0f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jseptransportcontroller_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/jseptransportcontroller_unittest.cc >@@ -11,7 +11,8 @@ > #include <map> > #include <memory> > >-#include "absl/memory/memory.h" >+#include "api/media_transport_interface.h" >+#include "api/test/fake_media_transport.h" > #include "p2p/base/fakedtlstransport.h" > #include "p2p/base/fakeicetransport.h" > #include "p2p/base/transportfactoryinterface.h" >@@ -41,6 +42,20 @@ static const char kDataMid1[] = "data1"; > > namespace webrtc { > >+namespace { >+ >+// Media transport factory requires crypto settings to be present in order to >+// create media transport. 
>+void AddCryptoSettings(cricket::SessionDescription* description) { >+ for (auto& content : description->contents()) { >+ content.media_description()->AddCrypto(cricket::CryptoParams( >+ /*t=*/0, std::string(rtc::CS_AES_CM_128_HMAC_SHA1_80), >+ "inline:YUJDZGVmZ2hpSktMbW9QUXJzVHVWd3l6MTIzNDU2", "")); >+ } >+} >+ >+} // namespace >+ > class FakeTransportFactory : public cricket::TransportFactoryInterface { > public: > std::unique_ptr<cricket::IceTransportInternal> CreateIceTransport( >@@ -52,7 +67,7 @@ class FakeTransportFactory : public cricket::TransportFactoryInterface { > > std::unique_ptr<cricket::DtlsTransportInternal> CreateDtlsTransport( > std::unique_ptr<cricket::IceTransportInternal> ice, >- const rtc::CryptoOptions& crypto_options) override { >+ const webrtc::CryptoOptions& crypto_options) override { > std::unique_ptr<cricket::FakeIceTransport> fake_ice( > static_cast<cricket::FakeIceTransport*>(ice.release())); > return absl::make_unique<FakeDtlsTransport>(std::move(fake_ice)); >@@ -84,6 +99,10 @@ class JsepTransportControllerTest : public JsepTransportController::Observer, > void ConnectTransportControllerSignals() { > transport_controller_->SignalIceConnectionState.connect( > this, &JsepTransportControllerTest::OnConnectionState); >+ transport_controller_->SignalStandardizedIceConnectionState.connect( >+ this, &JsepTransportControllerTest::OnStandardizedIceConnectionState); >+ transport_controller_->SignalConnectionState.connect( >+ this, &JsepTransportControllerTest::OnCombinedConnectionState); > transport_controller_->SignalIceGatheringState.connect( > this, &JsepTransportControllerTest::OnGatheringState); > transport_controller_->SignalIceCandidatesGathered.connect( >@@ -170,7 +189,7 @@ class JsepTransportControllerTest : public JsepTransportController::Observer, > rtc::scoped_refptr<rtc::RTCCertificate> cert) { > std::unique_ptr<rtc::SSLFingerprint> fingerprint; > if (cert) { >- fingerprint.reset(rtc::SSLFingerprint::CreateFromCertificate(cert)); 
>+ fingerprint = rtc::SSLFingerprint::CreateFromCertificate(*cert); > } > > cricket::TransportDescription transport_desc(std::vector<std::string>(), >@@ -243,6 +262,24 @@ class JsepTransportControllerTest : public JsepTransportController::Observer, > ++connection_state_signal_count_; > } > >+ void OnStandardizedIceConnectionState( >+ PeerConnectionInterface::IceConnectionState state) { >+ if (!signaling_thread_->IsCurrent()) { >+ signaled_on_non_signaling_thread_ = true; >+ } >+ ice_connection_state_ = state; >+ ++ice_connection_state_signal_count_; >+ } >+ >+ void OnCombinedConnectionState( >+ PeerConnectionInterface::PeerConnectionState state) { >+ if (!signaling_thread_->IsCurrent()) { >+ signaled_on_non_signaling_thread_ = true; >+ } >+ combined_connection_state_ = state; >+ ++combined_connection_state_signal_count_; >+ } >+ > void OnGatheringState(cricket::IceGatheringState state) { > if (!signaling_thread_->IsCurrent()) { > signaled_on_non_signaling_thread_ = true; >@@ -262,31 +299,37 @@ class JsepTransportControllerTest : public JsepTransportController::Observer, > } > > // JsepTransportController::Observer overrides. >- bool OnTransportChanged( >- const std::string& mid, >- RtpTransportInternal* rtp_transport, >- cricket::DtlsTransportInternal* dtls_transport) override { >+ bool OnTransportChanged(const std::string& mid, >+ RtpTransportInternal* rtp_transport, >+ cricket::DtlsTransportInternal* dtls_transport, >+ MediaTransportInterface* media_transport) override { > changed_rtp_transport_by_mid_[mid] = rtp_transport; > changed_dtls_transport_by_mid_[mid] = dtls_transport; >+ changed_media_transport_by_mid_[mid] = media_transport; > return true; > } > > // Information received from signals from transport controller. 
> cricket::IceConnectionState connection_state_ = > cricket::kIceConnectionConnecting; >+ PeerConnectionInterface::IceConnectionState ice_connection_state_ = >+ PeerConnectionInterface::kIceConnectionNew; >+ PeerConnectionInterface::PeerConnectionState combined_connection_state_ = >+ PeerConnectionInterface::PeerConnectionState::kNew; > bool receiving_ = false; > cricket::IceGatheringState gathering_state_ = cricket::kIceGatheringNew; > // transport_name => candidates > std::map<std::string, Candidates> candidates_; > // Counts of each signal emitted. > int connection_state_signal_count_ = 0; >+ int ice_connection_state_signal_count_ = 0; >+ int combined_connection_state_signal_count_ = 0; > int receiving_signal_count_ = 0; > int gathering_state_signal_count_ = 0; > int candidates_signal_count_ = 0; > > // |network_thread_| should be destroyed after |transport_controller_| > std::unique_ptr<rtc::Thread> network_thread_; >- std::unique_ptr<JsepTransportController> transport_controller_; > std::unique_ptr<FakeTransportFactory> fake_transport_factory_; > rtc::Thread* const signaling_thread_ = nullptr; > bool signaled_on_non_signaling_thread_ = false; >@@ -295,6 +338,12 @@ class JsepTransportControllerTest : public JsepTransportController::Observer, > std::map<std::string, RtpTransportInternal*> changed_rtp_transport_by_mid_; > std::map<std::string, cricket::DtlsTransportInternal*> > changed_dtls_transport_by_mid_; >+ std::map<std::string, MediaTransportInterface*> >+ changed_media_transport_by_mid_; >+ >+ // Transport controller needs to be destroyed first, because it may issue >+ // callbacks that modify the changed_*_by_mid in the destructor. 
>+ std::unique_ptr<JsepTransportController> transport_controller_; > }; > > TEST_F(JsepTransportControllerTest, GetRtpTransport) { >@@ -341,6 +390,118 @@ TEST_F(JsepTransportControllerTest, GetDtlsTransportWithRtcpMux) { > EXPECT_EQ(nullptr, transport_controller_->GetRtcpDtlsTransport(kAudioMid1)); > EXPECT_NE(nullptr, transport_controller_->GetDtlsTransport(kVideoMid1)); > EXPECT_EQ(nullptr, transport_controller_->GetRtcpDtlsTransport(kVideoMid1)); >+ EXPECT_EQ(nullptr, transport_controller_->GetMediaTransport(kAudioMid1)); >+} >+ >+TEST_F(JsepTransportControllerTest, GetMediaTransportInCaller) { >+ FakeMediaTransportFactory fake_media_transport_factory; >+ JsepTransportController::Config config; >+ >+ config.rtcp_mux_policy = PeerConnectionInterface::kRtcpMuxPolicyNegotiate; >+ config.media_transport_factory = &fake_media_transport_factory; >+ CreateJsepTransportController(config); >+ auto description = CreateSessionDescriptionWithoutBundle(); >+ AddCryptoSettings(description.get()); >+ >+ EXPECT_TRUE(transport_controller_ >+ ->SetLocalDescription(SdpType::kOffer, description.get()) >+ .ok()); >+ >+ FakeMediaTransport* media_transport = static_cast<FakeMediaTransport*>( >+ transport_controller_->GetMediaTransport(kAudioMid1)); >+ >+ ASSERT_NE(nullptr, media_transport); >+ >+ // After SetLocalDescription, media transport should be created as caller. >+ EXPECT_TRUE(media_transport->is_caller()); >+ EXPECT_TRUE(media_transport->pre_shared_key().has_value()); >+ >+ // Return nullptr for non-existing mids. 
>+ EXPECT_EQ(nullptr, transport_controller_->GetMediaTransport(kVideoMid2)); >+} >+ >+TEST_F(JsepTransportControllerTest, GetMediaTransportInCallee) { >+ FakeMediaTransportFactory fake_media_transport_factory; >+ JsepTransportController::Config config; >+ >+ config.rtcp_mux_policy = PeerConnectionInterface::kRtcpMuxPolicyNegotiate; >+ config.media_transport_factory = &fake_media_transport_factory; >+ CreateJsepTransportController(config); >+ auto description = CreateSessionDescriptionWithoutBundle(); >+ AddCryptoSettings(description.get()); >+ EXPECT_TRUE(transport_controller_ >+ ->SetRemoteDescription(SdpType::kOffer, description.get()) >+ .ok()); >+ >+ FakeMediaTransport* media_transport = static_cast<FakeMediaTransport*>( >+ transport_controller_->GetMediaTransport(kAudioMid1)); >+ >+ ASSERT_NE(nullptr, media_transport); >+ >+ // After SetRemoteDescription, media transport should be created as callee. >+ EXPECT_FALSE(media_transport->is_caller()); >+ EXPECT_TRUE(media_transport->pre_shared_key().has_value()); >+ >+ // Return nullptr for non-existing mids. >+ EXPECT_EQ(nullptr, transport_controller_->GetMediaTransport(kVideoMid2)); >+} >+ >+TEST_F(JsepTransportControllerTest, GetMediaTransportIsNotSetIfNoSdes) { >+ FakeMediaTransportFactory fake_media_transport_factory; >+ JsepTransportController::Config config; >+ >+ config.rtcp_mux_policy = PeerConnectionInterface::kRtcpMuxPolicyNegotiate; >+ config.media_transport_factory = &fake_media_transport_factory; >+ CreateJsepTransportController(config); >+ auto description = CreateSessionDescriptionWithoutBundle(); >+ EXPECT_TRUE(transport_controller_ >+ ->SetRemoteDescription(SdpType::kOffer, description.get()) >+ .ok()); >+ >+ EXPECT_EQ(nullptr, transport_controller_->GetMediaTransport(kAudioMid1)); >+ >+ // Even if we set local description with crypto now (after the remote offer >+ // was set), media transport won't be provided. 
>+ auto description2 = CreateSessionDescriptionWithoutBundle(); >+ AddCryptoSettings(description2.get()); >+ EXPECT_TRUE(transport_controller_ >+ ->SetLocalDescription(SdpType::kAnswer, description2.get()) >+ .ok()); >+ >+ EXPECT_EQ(nullptr, transport_controller_->GetMediaTransport(kAudioMid1)); >+} >+ >+TEST_F(JsepTransportControllerTest, >+ AfterSettingAnswerTheSameMediaTransportIsReturned) { >+ FakeMediaTransportFactory fake_media_transport_factory; >+ JsepTransportController::Config config; >+ >+ config.rtcp_mux_policy = PeerConnectionInterface::kRtcpMuxPolicyNegotiate; >+ config.media_transport_factory = &fake_media_transport_factory; >+ CreateJsepTransportController(config); >+ auto description = CreateSessionDescriptionWithoutBundle(); >+ AddCryptoSettings(description.get()); >+ EXPECT_TRUE(transport_controller_ >+ ->SetRemoteDescription(SdpType::kOffer, description.get()) >+ .ok()); >+ >+ FakeMediaTransport* media_transport = static_cast<FakeMediaTransport*>( >+ transport_controller_->GetMediaTransport(kAudioMid1)); >+ EXPECT_NE(nullptr, media_transport); >+ EXPECT_TRUE(media_transport->pre_shared_key().has_value()); >+ >+ // Even if we set local description with crypto now (after the remote offer >+ // was set), media transport won't be provided. >+ auto description2 = CreateSessionDescriptionWithoutBundle(); >+ AddCryptoSettings(description2.get()); >+ >+ RTCError result = transport_controller_->SetLocalDescription( >+ SdpType::kAnswer, description2.get()); >+ EXPECT_TRUE(result.ok()) << result.message(); >+ >+ // Media transport did not change. 
>+ EXPECT_EQ(media_transport, >+ transport_controller_->GetMediaTransport(kAudioMid1)); > } > > TEST_F(JsepTransportControllerTest, SetIceConfig) { >@@ -568,9 +729,16 @@ TEST_F(JsepTransportControllerTest, SignalConnectionStateFailed) { > fake_ice->SetConnectionCount(0); > EXPECT_EQ_WAIT(cricket::kIceConnectionFailed, connection_state_, kTimeout); > EXPECT_EQ(1, connection_state_signal_count_); >+ EXPECT_EQ_WAIT(PeerConnectionInterface::kIceConnectionFailed, >+ ice_connection_state_, kTimeout); >+ EXPECT_EQ(1, ice_connection_state_signal_count_); >+ EXPECT_EQ_WAIT(PeerConnectionInterface::PeerConnectionState::kFailed, >+ combined_connection_state_, kTimeout); >+ EXPECT_EQ(1, combined_connection_state_signal_count_); > } > >-TEST_F(JsepTransportControllerTest, SignalConnectionStateConnected) { >+TEST_F(JsepTransportControllerTest, >+ SignalConnectionStateConnectedNoMediaTransport) { > CreateJsepTransportController(JsepTransportController::Config()); > auto description = CreateSessionDescriptionWithoutBundle(); > EXPECT_TRUE(transport_controller_ >@@ -595,13 +763,114 @@ TEST_F(JsepTransportControllerTest, SignalConnectionStateConnected) { > > EXPECT_EQ_WAIT(cricket::kIceConnectionFailed, connection_state_, kTimeout); > EXPECT_EQ(1, connection_state_signal_count_); >- >+ EXPECT_EQ_WAIT(PeerConnectionInterface::kIceConnectionFailed, >+ ice_connection_state_, kTimeout); >+ EXPECT_EQ(1, ice_connection_state_signal_count_); >+ EXPECT_EQ_WAIT(PeerConnectionInterface::PeerConnectionState::kFailed, >+ combined_connection_state_, kTimeout); >+ EXPECT_EQ(1, combined_connection_state_signal_count_); >+ >+ fake_audio_dtls->SetDtlsState(cricket::DTLS_TRANSPORT_CONNECTED); >+ fake_video_dtls->SetDtlsState(cricket::DTLS_TRANSPORT_CONNECTED); > // Set the connection count to be 2 and the cricket::FakeIceTransport will set > // the transport state to be STATE_CONNECTING. 
> fake_video_dtls->fake_ice_transport()->SetConnectionCount(2); > fake_video_dtls->SetWritable(true); > EXPECT_EQ_WAIT(cricket::kIceConnectionConnected, connection_state_, kTimeout); > EXPECT_EQ(2, connection_state_signal_count_); >+ EXPECT_EQ_WAIT(PeerConnectionInterface::kIceConnectionConnected, >+ ice_connection_state_, kTimeout); >+ EXPECT_EQ(2, ice_connection_state_signal_count_); >+ EXPECT_EQ_WAIT(PeerConnectionInterface::PeerConnectionState::kConnected, >+ combined_connection_state_, kTimeout); >+ EXPECT_EQ(2, combined_connection_state_signal_count_); >+} >+ >+TEST_F(JsepTransportControllerTest, >+ SignalConnectionStateConnectedWithMediaTransport) { >+ FakeMediaTransportFactory fake_media_transport_factory; >+ JsepTransportController::Config config; >+ config.media_transport_factory = &fake_media_transport_factory; >+ CreateJsepTransportController(config); >+ auto description = CreateSessionDescriptionWithoutBundle(); >+ AddCryptoSettings(description.get()); >+ EXPECT_TRUE(transport_controller_ >+ ->SetLocalDescription(SdpType::kOffer, description.get()) >+ .ok()); >+ >+ auto fake_audio_dtls = static_cast<FakeDtlsTransport*>( >+ transport_controller_->GetDtlsTransport(kAudioMid1)); >+ auto fake_video_dtls = static_cast<FakeDtlsTransport*>( >+ transport_controller_->GetDtlsTransport(kVideoMid1)); >+ fake_audio_dtls->SetWritable(true); >+ fake_video_dtls->SetWritable(true); >+ // Decreasing connection count from 2 to 1 triggers connection state event. >+ fake_audio_dtls->fake_ice_transport()->SetConnectionCount(2); >+ fake_audio_dtls->fake_ice_transport()->SetConnectionCount(1); >+ fake_video_dtls->fake_ice_transport()->SetConnectionCount(2); >+ fake_video_dtls->fake_ice_transport()->SetConnectionCount(1); >+ fake_audio_dtls->SetDtlsState(cricket::DTLS_TRANSPORT_CONNECTED); >+ fake_video_dtls->SetDtlsState(cricket::DTLS_TRANSPORT_CONNECTED); >+ >+ // Still not connected, because we are waiting for media transport. 
>+ EXPECT_EQ_WAIT(cricket::kIceConnectionConnecting, connection_state_, >+ kTimeout); >+ >+ FakeMediaTransport* media_transport = static_cast<FakeMediaTransport*>( >+ transport_controller_->GetMediaTransport(kAudioMid1)); >+ >+ media_transport->SetState(webrtc::MediaTransportState::kWritable); >+ EXPECT_EQ_WAIT(cricket::kIceConnectionConnecting, connection_state_, >+ kTimeout); >+ >+ // Still waiting for the second media transport. >+ media_transport = static_cast<FakeMediaTransport*>( >+ transport_controller_->GetMediaTransport(kVideoMid1)); >+ media_transport->SetState(webrtc::MediaTransportState::kWritable); >+ >+ EXPECT_EQ_WAIT(cricket::kIceConnectionConnected, connection_state_, kTimeout); >+} >+ >+TEST_F(JsepTransportControllerTest, >+ SignalConnectionStateFailedWhenMediaTransportClosed) { >+ FakeMediaTransportFactory fake_media_transport_factory; >+ JsepTransportController::Config config; >+ config.media_transport_factory = &fake_media_transport_factory; >+ CreateJsepTransportController(config); >+ auto description = CreateSessionDescriptionWithoutBundle(); >+ AddCryptoSettings(description.get()); >+ EXPECT_TRUE(transport_controller_ >+ ->SetLocalDescription(SdpType::kOffer, description.get()) >+ .ok()); >+ >+ auto fake_audio_dtls = static_cast<FakeDtlsTransport*>( >+ transport_controller_->GetDtlsTransport(kAudioMid1)); >+ auto fake_video_dtls = static_cast<FakeDtlsTransport*>( >+ transport_controller_->GetDtlsTransport(kVideoMid1)); >+ fake_audio_dtls->SetWritable(true); >+ fake_video_dtls->SetWritable(true); >+ // Decreasing connection count from 2 to 1 triggers connection state event. 
>+ fake_audio_dtls->fake_ice_transport()->SetConnectionCount(2); >+ fake_audio_dtls->fake_ice_transport()->SetConnectionCount(1); >+ fake_video_dtls->fake_ice_transport()->SetConnectionCount(2); >+ fake_video_dtls->fake_ice_transport()->SetConnectionCount(1); >+ fake_audio_dtls->SetDtlsState(cricket::DTLS_TRANSPORT_CONNECTED); >+ fake_video_dtls->SetDtlsState(cricket::DTLS_TRANSPORT_CONNECTED); >+ >+ FakeMediaTransport* media_transport = static_cast<FakeMediaTransport*>( >+ transport_controller_->GetMediaTransport(kAudioMid1)); >+ >+ media_transport->SetState(webrtc::MediaTransportState::kWritable); >+ >+ media_transport = static_cast<FakeMediaTransport*>( >+ transport_controller_->GetMediaTransport(kVideoMid1)); >+ >+ media_transport->SetState(webrtc::MediaTransportState::kWritable); >+ >+ EXPECT_EQ_WAIT(cricket::kIceConnectionConnected, connection_state_, kTimeout); >+ >+ media_transport->SetState(webrtc::MediaTransportState::kClosed); >+ EXPECT_EQ_WAIT(cricket::kIceConnectionFailed, connection_state_, kTimeout); > } > > TEST_F(JsepTransportControllerTest, SignalConnectionStateComplete) { >@@ -629,13 +898,27 @@ TEST_F(JsepTransportControllerTest, SignalConnectionStateComplete) { > > EXPECT_EQ_WAIT(cricket::kIceConnectionFailed, connection_state_, kTimeout); > EXPECT_EQ(1, connection_state_signal_count_); >- >+ EXPECT_EQ_WAIT(PeerConnectionInterface::kIceConnectionFailed, >+ ice_connection_state_, kTimeout); >+ EXPECT_EQ(1, ice_connection_state_signal_count_); >+ EXPECT_EQ_WAIT(PeerConnectionInterface::PeerConnectionState::kFailed, >+ combined_connection_state_, kTimeout); >+ EXPECT_EQ(1, combined_connection_state_signal_count_); >+ >+ fake_audio_dtls->SetDtlsState(cricket::DTLS_TRANSPORT_CONNECTED); >+ fake_video_dtls->SetDtlsState(cricket::DTLS_TRANSPORT_CONNECTED); > // Set the connection count to be 1 and the cricket::FakeIceTransport will set > // the transport state to be STATE_COMPLETED. 
> fake_video_dtls->fake_ice_transport()->SetConnectionCount(1); > fake_video_dtls->SetWritable(true); > EXPECT_EQ_WAIT(cricket::kIceConnectionCompleted, connection_state_, kTimeout); > EXPECT_EQ(2, connection_state_signal_count_); >+ EXPECT_EQ_WAIT(PeerConnectionInterface::kIceConnectionCompleted, >+ ice_connection_state_, kTimeout); >+ EXPECT_EQ(2, ice_connection_state_signal_count_); >+ EXPECT_EQ_WAIT(PeerConnectionInterface::PeerConnectionState::kConnected, >+ combined_connection_state_, kTimeout); >+ EXPECT_EQ(2, combined_connection_state_signal_count_); > } > > TEST_F(JsepTransportControllerTest, SignalIceGatheringStateGathering) { >@@ -709,6 +992,7 @@ TEST_F(JsepTransportControllerTest, > fake_audio_dtls->SetWritable(true); > fake_audio_dtls->fake_ice_transport()->SetCandidatesGatheringComplete(); > fake_audio_dtls->fake_ice_transport()->SetConnectionCount(1); >+ fake_audio_dtls->SetDtlsState(cricket::DTLS_TRANSPORT_CONNECTED); > EXPECT_EQ(1, gathering_state_signal_count_); > > // Set the remote description and enable the bundle. 
>@@ -721,6 +1005,10 @@ TEST_F(JsepTransportControllerTest, > transport_controller_->GetDtlsTransport(kVideoMid1)); > EXPECT_EQ(fake_audio_dtls, fake_video_dtls); > EXPECT_EQ_WAIT(cricket::kIceConnectionCompleted, connection_state_, kTimeout); >+ EXPECT_EQ(PeerConnectionInterface::kIceConnectionCompleted, >+ ice_connection_state_); >+ EXPECT_EQ(PeerConnectionInterface::PeerConnectionState::kConnected, >+ combined_connection_state_); > EXPECT_EQ_WAIT(cricket::kIceGatheringComplete, gathering_state_, kTimeout); > EXPECT_EQ(2, gathering_state_signal_count_); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/localaudiosource.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/localaudiosource.h >index c5f65304d0b3ceb5b770406554424d34cc3c7f49..c48f5407d4c1e534f5e6cb36cb97ca2f54f6342b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/localaudiosource.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/localaudiosource.h >@@ -29,7 +29,7 @@ class LocalAudioSource : public Notifier<AudioSourceInterface> { > SourceState state() const override { return kLive; } > bool remote() const override { return false; } > >- virtual const cricket::AudioOptions& options() const { return options_; } >+ const cricket::AudioOptions options() const override { return options_; } > > void AddSink(AudioTrackSinkInterface* sink) override {} > void RemoveSink(AudioTrackSinkInterface* sink) override {} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/mediasession.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/mediasession.cc >index 9ae46be27c5bf85a03a2c82de7ae7f6a74a115b4..7239af86c399e7bdbe486d8b06bdaca69a612c0b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/mediasession.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/mediasession.cc >@@ -18,9 +18,9 @@ > #include <unordered_map> > #include <utility> > >+#include "absl/strings/match.h" > #include "absl/types/optional.h" > #include "api/cryptoparams.h" >-#include "common_types.h" // 
NOLINT(build/include) > #include "media/base/h264_profile_level_id.h" > #include "media/base/mediaconstants.h" > #include "p2p/base/p2pconstants.h" >@@ -39,10 +39,10 @@ using webrtc::RtpTransceiverDirection; > > const char kInline[] = "inline:"; > >-void GetSupportedSdesCryptoSuiteNames(void (*func)(const rtc::CryptoOptions&, >- std::vector<int>*), >- const rtc::CryptoOptions& crypto_options, >- std::vector<std::string>* names) { >+void GetSupportedSdesCryptoSuiteNames( >+ void (*func)(const webrtc::CryptoOptions&, std::vector<int>*), >+ const webrtc::CryptoOptions& crypto_options, >+ std::vector<std::string>* names) { > std::vector<int> crypto_suites; > func(crypto_options, &crypto_suites); > for (const auto crypto : crypto_suites) { >@@ -195,28 +195,30 @@ bool FindMatchingCrypto(const CryptoParamsVec& cryptos, > > // For audio, HMAC 32 (if enabled) is prefered over HMAC 80 because of the > // low overhead. >-void GetSupportedAudioSdesCryptoSuites(const rtc::CryptoOptions& crypto_options, >- std::vector<int>* crypto_suites) { >- if (crypto_options.enable_gcm_crypto_suites) { >+void GetSupportedAudioSdesCryptoSuites( >+ const webrtc::CryptoOptions& crypto_options, >+ std::vector<int>* crypto_suites) { >+ if (crypto_options.srtp.enable_gcm_crypto_suites) { > crypto_suites->push_back(rtc::SRTP_AEAD_AES_256_GCM); > crypto_suites->push_back(rtc::SRTP_AEAD_AES_128_GCM); > } >- if (crypto_options.enable_aes128_sha1_32_crypto_cipher) { >+ if (crypto_options.srtp.enable_aes128_sha1_32_crypto_cipher) { > crypto_suites->push_back(rtc::SRTP_AES128_CM_SHA1_32); > } > crypto_suites->push_back(rtc::SRTP_AES128_CM_SHA1_80); > } > > void GetSupportedAudioSdesCryptoSuiteNames( >- const rtc::CryptoOptions& crypto_options, >+ const webrtc::CryptoOptions& crypto_options, > std::vector<std::string>* crypto_suite_names) { > GetSupportedSdesCryptoSuiteNames(GetSupportedAudioSdesCryptoSuites, > crypto_options, crypto_suite_names); > } > >-void GetSupportedVideoSdesCryptoSuites(const 
rtc::CryptoOptions& crypto_options, >- std::vector<int>* crypto_suites) { >- if (crypto_options.enable_gcm_crypto_suites) { >+void GetSupportedVideoSdesCryptoSuites( >+ const webrtc::CryptoOptions& crypto_options, >+ std::vector<int>* crypto_suites) { >+ if (crypto_options.srtp.enable_gcm_crypto_suites) { > crypto_suites->push_back(rtc::SRTP_AEAD_AES_256_GCM); > crypto_suites->push_back(rtc::SRTP_AEAD_AES_128_GCM); > } >@@ -224,15 +226,16 @@ void GetSupportedVideoSdesCryptoSuites(const rtc::CryptoOptions& crypto_options, > } > > void GetSupportedVideoSdesCryptoSuiteNames( >- const rtc::CryptoOptions& crypto_options, >+ const webrtc::CryptoOptions& crypto_options, > std::vector<std::string>* crypto_suite_names) { > GetSupportedSdesCryptoSuiteNames(GetSupportedVideoSdesCryptoSuites, > crypto_options, crypto_suite_names); > } > >-void GetSupportedDataSdesCryptoSuites(const rtc::CryptoOptions& crypto_options, >- std::vector<int>* crypto_suites) { >- if (crypto_options.enable_gcm_crypto_suites) { >+void GetSupportedDataSdesCryptoSuites( >+ const webrtc::CryptoOptions& crypto_options, >+ std::vector<int>* crypto_suites) { >+ if (crypto_options.srtp.enable_gcm_crypto_suites) { > crypto_suites->push_back(rtc::SRTP_AEAD_AES_256_GCM); > crypto_suites->push_back(rtc::SRTP_AEAD_AES_128_GCM); > } >@@ -240,7 +243,7 @@ void GetSupportedDataSdesCryptoSuites(const rtc::CryptoOptions& crypto_options, > } > > void GetSupportedDataSdesCryptoSuiteNames( >- const rtc::CryptoOptions& crypto_options, >+ const webrtc::CryptoOptions& crypto_options, > std::vector<std::string>* crypto_suite_names) { > GetSupportedSdesCryptoSuiteNames(GetSupportedDataSdesCryptoSuites, > crypto_options, crypto_suite_names); >@@ -252,17 +255,17 @@ void GetSupportedDataSdesCryptoSuiteNames( > // Pick the crypto in the list that is supported. 
> static bool SelectCrypto(const MediaContentDescription* offer, > bool bundle, >- const rtc::CryptoOptions& crypto_options, >+ const webrtc::CryptoOptions& crypto_options, > CryptoParams* crypto_out) { > bool audio = offer->type() == MEDIA_TYPE_AUDIO; > const CryptoParamsVec& cryptos = offer->cryptos(); > > for (const CryptoParams& crypto : cryptos) { >- if ((crypto_options.enable_gcm_crypto_suites && >+ if ((crypto_options.srtp.enable_gcm_crypto_suites && > rtc::IsGcmCryptoSuiteName(crypto.cipher_suite)) || > rtc::CS_AES_CM_128_HMAC_SHA1_80 == crypto.cipher_suite || > (rtc::CS_AES_CM_128_HMAC_SHA1_32 == crypto.cipher_suite && audio && >- !bundle && crypto_options.enable_aes128_sha1_32_crypto_cipher)) { >+ !bundle && crypto_options.srtp.enable_aes128_sha1_32_crypto_cipher)) { > return CreateCryptoParams(crypto.tag, crypto.cipher_suite, crypto_out); > } > } >@@ -309,7 +312,8 @@ void FilterDataCodecs(std::vector<DataCodec>* codecs, bool sctp) { > sctp ? kGoogleRtpDataCodecName : kGoogleSctpDataCodecName; > codecs->erase(std::remove_if(codecs->begin(), codecs->end(), > [&codec_name](const DataCodec& codec) { >- return CodecNamesEq(codec.name, codec_name); >+ return absl::EqualsIgnoreCase(codec.name, >+ codec_name); > }), > codecs->end()); > } >@@ -650,7 +654,7 @@ static bool ContainsRtxCodec(const std::vector<C>& codecs) { > > template <class C> > static bool IsRtxCodec(const C& codec) { >- return STR_CASE_CMP(codec.name.c_str(), kRtxCodecName) == 0; >+ return absl::EqualsIgnoreCase(codec.name, kRtxCodecName); > } > > template <class C> >@@ -665,7 +669,7 @@ static bool ContainsFlexfecCodec(const std::vector<C>& codecs) { > > template <class C> > static bool IsFlexfecCodec(const C& codec) { >- return STR_CASE_CMP(codec.name.c_str(), kFlexfecCodecName) == 0; >+ return absl::EqualsIgnoreCase(codec.name, kFlexfecCodecName); > } > > // Create a media content to be offered for the given |sender_options|, >@@ -743,7 +747,7 @@ static void NegotiateCodecs(const 
std::vector<C>& local_codecs, 
> RTC_DCHECK(apt_it != theirs.params.end()); 
> negotiated.SetParam(kCodecParamAssociatedPayloadType, apt_it->second); 
> } 
>- if (CodecNamesEq(ours.name.c_str(), kH264CodecName)) { 
>+ if (absl::EqualsIgnoreCase(ours.name, kH264CodecName)) { 
> webrtc::H264::GenerateProfileLevelIdForAnswer( 
> ours.params, theirs.params, &negotiated.params); 
> } 
>@@ -1038,9 +1042,8 @@ static void NegotiateRtpHeaderExtensions( 
> static void StripCNCodecs(AudioCodecs* audio_codecs) { 
> audio_codecs->erase(std::remove_if(audio_codecs->begin(), audio_codecs->end(), 
> [](const AudioCodec& codec) { 
>- return STR_CASE_CMP( 
>- codec.name.c_str(), 
>- kComfortNoiseCodecName) == 0; 
>+ return absl::EqualsIgnoreCase( 
>+ codec.name, kComfortNoiseCodecName); 
> }), 
> audio_codecs->end()); 
> } 
>@@ -1069,6 +1072,8 @@ static bool CreateMediaContentAnswer( 
> NegotiateCodecs(local_codecs, offer->codecs(), &negotiated_codecs); 
> answer->AddCodecs(negotiated_codecs); 
> answer->set_protocol(offer->protocol()); 
>+ 
>+ answer->set_extmap_allow_mixed_enum(offer->extmap_allow_mixed_enum()); 
> RtpHeaderExtensions negotiated_rtp_extensions; 
> NegotiateRtpHeaderExtensions( 
> local_rtp_extenstions, offer->rtp_header_extensions(), 
>@@ -1262,6 +1267,8 @@ SessionDescription* MediaSessionDescriptionFactory::CreateOffer( 
> const SessionDescription* current_description) const { 
> std::unique_ptr<SessionDescription> offer(new SessionDescription()); 
> 
>+ IceCredentialsIterator ice_credentials( 
>+ session_options.pooled_ice_credentials); 
> StreamParamsVec current_streams; 
> GetCurrentStreamParams(current_description, &current_streams); 
> 
>@@ -1305,18 +1312,18 @@ SessionDescription* MediaSessionDescriptionFactory::CreateOffer( 
> } 
> switch (media_description_options.type) { 
> case MEDIA_TYPE_AUDIO: 
>- if (!AddAudioContentForOffer(media_description_options, session_options, 
>- current_content, current_description, 
>- audio_rtp_extensions, offer_audio_codecs, 
>- &current_streams, offer.get())) { 
>+ if 
(!AddAudioContentForOffer( 
>+ media_description_options, session_options, current_content, 
>+ current_description, audio_rtp_extensions, offer_audio_codecs, 
>+ &current_streams, offer.get(), &ice_credentials)) { 
> return nullptr; 
> } 
> break; 
> case MEDIA_TYPE_VIDEO: 
>- if (!AddVideoContentForOffer(media_description_options, session_options, 
>- current_content, current_description, 
>- video_rtp_extensions, offer_video_codecs, 
>- &current_streams, offer.get())) { 
>+ if (!AddVideoContentForOffer( 
>+ media_description_options, session_options, current_content, 
>+ current_description, video_rtp_extensions, offer_video_codecs, 
>+ &current_streams, offer.get(), &ice_credentials)) { 
> return nullptr; 
> } 
> break; 
>@@ -1324,7 +1331,7 @@ SessionDescription* MediaSessionDescriptionFactory::CreateOffer( 
> if (!AddDataContentForOffer(media_description_options, session_options, 
> current_content, current_description, 
> offer_data_codecs, &current_streams, 
>- offer.get())) { 
>+ offer.get(), &ice_credentials)) { 
> return nullptr; 
> } 
> break; 
>@@ -1371,6 +1378,8 @@ SessionDescription* MediaSessionDescriptionFactory::CreateOffer( 
> offer->set_msid_signaling(cricket::kMsidSignalingSsrcAttribute); 
> } 
> 
>+ offer->set_extmap_allow_mixed(session_options.offer_extmap_allow_mixed); 
>+ 
> return offer.release(); 
> } 
> 
>@@ -1381,6 +1390,10 @@ SessionDescription* MediaSessionDescriptionFactory::CreateAnswer( 
> if (!offer) { 
> return nullptr; 
> } 
>+ 
>+ IceCredentialsIterator ice_credentials( 
>+ session_options.pooled_ice_credentials); 
>+ 
> // The answer contains the intersection of the codecs in the offer with the 
> // codecs we support. As indicated by XEP-0167, we retain the same payload ids 
> // from the offer in the answer. 
>@@ -1396,6 +1409,8 @@ SessionDescription* MediaSessionDescriptionFactory::CreateAnswer( 
> // Transport info shared by the bundle group. 
> std::unique_ptr<TransportInfo> bundle_transport; > >+ answer->set_extmap_allow_mixed(offer->extmap_allow_mixed()); >+ > // Get list of all possible codecs that respects existing payload type > // mappings and uses a single payload type space. > // >@@ -1440,7 +1455,7 @@ SessionDescription* MediaSessionDescriptionFactory::CreateAnswer( > media_description_options, session_options, offer_content, > offer, current_content, current_description, > bundle_transport.get(), answer_audio_codecs, &current_streams, >- answer.get())) { >+ answer.get(), &ice_credentials)) { > return nullptr; > } > break; >@@ -1449,16 +1464,16 @@ SessionDescription* MediaSessionDescriptionFactory::CreateAnswer( > media_description_options, session_options, offer_content, > offer, current_content, current_description, > bundle_transport.get(), answer_video_codecs, &current_streams, >- answer.get())) { >+ answer.get(), &ice_credentials)) { > return nullptr; > } > break; > case MEDIA_TYPE_DATA: >- if (!AddDataContentForAnswer(media_description_options, session_options, >- offer_content, offer, current_content, >- current_description, >- bundle_transport.get(), answer_data_codecs, >- &current_streams, answer.get())) { >+ if (!AddDataContentForAnswer( >+ media_description_options, session_options, offer_content, >+ offer, current_content, current_description, >+ bundle_transport.get(), answer_data_codecs, &current_streams, >+ answer.get(), &ice_credentials)) { > return nullptr; > } > break; >@@ -1765,13 +1780,15 @@ bool MediaSessionDescriptionFactory::AddTransportOffer( > const std::string& content_name, > const TransportOptions& transport_options, > const SessionDescription* current_desc, >- SessionDescription* offer_desc) const { >+ SessionDescription* offer_desc, >+ IceCredentialsIterator* ice_credentials) const { > if (!transport_desc_factory_) > return false; > const TransportDescription* current_tdesc = > GetTransportDescription(content_name, current_desc); > std::unique_ptr<TransportDescription> new_tdesc( >- 
transport_desc_factory_->CreateOffer(transport_options, current_tdesc)); >+ transport_desc_factory_->CreateOffer(transport_options, current_tdesc, >+ ice_credentials)); > bool ret = > (new_tdesc.get() != NULL && > offer_desc->AddTransportInfo(TransportInfo(content_name, *new_tdesc))); >@@ -1787,7 +1804,8 @@ TransportDescription* MediaSessionDescriptionFactory::CreateTransportAnswer( > const SessionDescription* offer_desc, > const TransportOptions& transport_options, > const SessionDescription* current_desc, >- bool require_transport_attributes) const { >+ bool require_transport_attributes, >+ IceCredentialsIterator* ice_credentials) const { > if (!transport_desc_factory_) > return NULL; > const TransportDescription* offer_tdesc = >@@ -1796,7 +1814,7 @@ TransportDescription* MediaSessionDescriptionFactory::CreateTransportAnswer( > GetTransportDescription(content_name, current_desc); > return transport_desc_factory_->CreateAnswer(offer_tdesc, transport_options, > require_transport_attributes, >- current_tdesc); >+ current_tdesc, ice_credentials); > } > > bool MediaSessionDescriptionFactory::AddTransportAnswer( >@@ -1832,7 +1850,8 @@ bool MediaSessionDescriptionFactory::AddAudioContentForOffer( > const RtpHeaderExtensions& audio_rtp_extensions, > const AudioCodecs& audio_codecs, > StreamParamsVec* current_streams, >- SessionDescription* desc) const { >+ SessionDescription* desc, >+ IceCredentialsIterator* ice_credentials) const { > // Filter audio_codecs (which includes all codecs, with correctly remapped > // payload types) based on transceiver direction. 
> const AudioCodecs& supported_audio_codecs = >@@ -1888,7 +1907,7 @@ bool MediaSessionDescriptionFactory::AddAudioContentForOffer( > media_description_options.stopped, audio.release()); > if (!AddTransportOffer(media_description_options.mid, > media_description_options.transport_options, >- current_description, desc)) { >+ current_description, desc, ice_credentials)) { > return false; > } > >@@ -1903,7 +1922,8 @@ bool MediaSessionDescriptionFactory::AddVideoContentForOffer( > const RtpHeaderExtensions& video_rtp_extensions, > const VideoCodecs& video_codecs, > StreamParamsVec* current_streams, >- SessionDescription* desc) const { >+ SessionDescription* desc, >+ IceCredentialsIterator* ice_credentials) const { > cricket::SecurePolicy sdes_policy = > IsDtlsActive(current_content, current_description) ? cricket::SEC_DISABLED > : secure(); >@@ -1957,7 +1977,7 @@ bool MediaSessionDescriptionFactory::AddVideoContentForOffer( > media_description_options.stopped, video.release()); > if (!AddTransportOffer(media_description_options.mid, > media_description_options.transport_options, >- current_description, desc)) { >+ current_description, desc, ice_credentials)) { > return false; > } > return true; >@@ -1970,7 +1990,8 @@ bool MediaSessionDescriptionFactory::AddDataContentForOffer( > const SessionDescription* current_description, > const DataCodecs& data_codecs, > StreamParamsVec* current_streams, >- SessionDescription* desc) const { >+ SessionDescription* desc, >+ IceCredentialsIterator* ice_credentials) const { > bool secure_transport = (transport_desc_factory_->secure() != SEC_DISABLED); > > std::unique_ptr<DataContentDescription> data(new DataContentDescription()); >@@ -2024,7 +2045,7 @@ bool MediaSessionDescriptionFactory::AddDataContentForOffer( > } > if (!AddTransportOffer(media_description_options.mid, > media_description_options.transport_options, >- current_description, desc)) { >+ current_description, desc, ice_credentials)) { > return false; > } > return true; 
>@@ -2052,15 +2073,16 @@ bool MediaSessionDescriptionFactory::AddAudioContentForAnswer( > const TransportInfo* bundle_transport, > const AudioCodecs& audio_codecs, > StreamParamsVec* current_streams, >- SessionDescription* answer) const { >+ SessionDescription* answer, >+ IceCredentialsIterator* ice_credentials) const { > RTC_CHECK(IsMediaContentOfType(offer_content, MEDIA_TYPE_AUDIO)); > const AudioContentDescription* offer_audio_description = > offer_content->media_description()->as_audio(); > >- std::unique_ptr<TransportDescription> audio_transport( >- CreateTransportAnswer(media_description_options.mid, offer_description, >- media_description_options.transport_options, >- current_description, bundle_transport != nullptr)); >+ std::unique_ptr<TransportDescription> audio_transport(CreateTransportAnswer( >+ media_description_options.mid, offer_description, >+ media_description_options.transport_options, current_description, >+ bundle_transport != nullptr, ice_credentials)); > if (!audio_transport) { > return false; > } >@@ -2146,15 +2168,16 @@ bool MediaSessionDescriptionFactory::AddVideoContentForAnswer( > const TransportInfo* bundle_transport, > const VideoCodecs& video_codecs, > StreamParamsVec* current_streams, >- SessionDescription* answer) const { >+ SessionDescription* answer, >+ IceCredentialsIterator* ice_credentials) const { > RTC_CHECK(IsMediaContentOfType(offer_content, MEDIA_TYPE_VIDEO)); > const VideoContentDescription* offer_video_description = > offer_content->media_description()->as_video(); > >- std::unique_ptr<TransportDescription> video_transport( >- CreateTransportAnswer(media_description_options.mid, offer_description, >- media_description_options.transport_options, >- current_description, bundle_transport != nullptr)); >+ std::unique_ptr<TransportDescription> video_transport(CreateTransportAnswer( >+ media_description_options.mid, offer_description, >+ media_description_options.transport_options, current_description, >+ bundle_transport != 
nullptr, ice_credentials)); > if (!video_transport) { > return false; > } >@@ -2232,11 +2255,12 @@ bool MediaSessionDescriptionFactory::AddDataContentForAnswer( > const TransportInfo* bundle_transport, > const DataCodecs& data_codecs, > StreamParamsVec* current_streams, >- SessionDescription* answer) const { >- std::unique_ptr<TransportDescription> data_transport( >- CreateTransportAnswer(media_description_options.mid, offer_description, >- media_description_options.transport_options, >- current_description, bundle_transport != nullptr)); >+ SessionDescription* answer, >+ IceCredentialsIterator* ice_credentials) const { >+ std::unique_ptr<TransportDescription> data_transport(CreateTransportAnswer( >+ media_description_options.mid, offer_description, >+ media_description_options.transport_options, current_description, >+ bundle_transport != nullptr, ice_credentials)); > if (!data_transport) { > return false; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/mediasession.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/mediasession.h >index b40974039b820a965f57d57f35c689021c445430..5904605f10a4da25fbd7cdc95cfb90f949914630 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/mediasession.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/mediasession.h >@@ -21,6 +21,7 @@ > #include "api/mediatypes.h" > #include "media/base/mediaconstants.h" > #include "media/base/mediaengine.h" // For DataChannelType >+#include "p2p/base/icecredentialsiterator.h" > #include "p2p/base/transportdescriptionfactory.h" > #include "pc/jseptransport.h" > #include "pc/sessiondescription.h" >@@ -93,11 +94,13 @@ struct MediaSessionOptions { > bool rtcp_mux_enabled = true; > bool bundle_enabled = false; > bool is_unified_plan = false; >+ bool offer_extmap_allow_mixed = false; > std::string rtcp_cname = kDefaultRtcpCname; >- rtc::CryptoOptions crypto_options; >+ webrtc::CryptoOptions crypto_options; > // List of media description options in the same order that the media > // 
descriptions will be generated. > std::vector<MediaDescriptionOptions> media_description_options; >+ std::vector<IceParameters> pooled_ice_credentials; > }; > > // Creates media session descriptions according to the supplied codecs and >@@ -186,14 +189,16 @@ class MediaSessionDescriptionFactory { > bool AddTransportOffer(const std::string& content_name, > const TransportOptions& transport_options, > const SessionDescription* current_desc, >- SessionDescription* offer) const; >+ SessionDescription* offer, >+ IceCredentialsIterator* ice_credentials) const; > > TransportDescription* CreateTransportAnswer( > const std::string& content_name, > const SessionDescription* offer_desc, > const TransportOptions& transport_options, > const SessionDescription* current_desc, >- bool require_transport_attributes) const; >+ bool require_transport_attributes, >+ IceCredentialsIterator* ice_credentials) const; > > bool AddTransportAnswer(const std::string& content_name, > const TransportDescription& transport_desc, >@@ -211,7 +216,8 @@ class MediaSessionDescriptionFactory { > const RtpHeaderExtensions& audio_rtp_extensions, > const AudioCodecs& audio_codecs, > StreamParamsVec* current_streams, >- SessionDescription* desc) const; >+ SessionDescription* desc, >+ IceCredentialsIterator* ice_credentials) const; > > bool AddVideoContentForOffer( > const MediaDescriptionOptions& media_description_options, >@@ -221,7 +227,8 @@ class MediaSessionDescriptionFactory { > const RtpHeaderExtensions& video_rtp_extensions, > const VideoCodecs& video_codecs, > StreamParamsVec* current_streams, >- SessionDescription* desc) const; >+ SessionDescription* desc, >+ IceCredentialsIterator* ice_credentials) const; > > bool AddDataContentForOffer( > const MediaDescriptionOptions& media_description_options, >@@ -230,7 +237,8 @@ class MediaSessionDescriptionFactory { > const SessionDescription* current_description, > const DataCodecs& data_codecs, > StreamParamsVec* current_streams, >- SessionDescription* 
desc) const; >+ SessionDescription* desc, >+ IceCredentialsIterator* ice_credentials) const; > > bool AddAudioContentForAnswer( > const MediaDescriptionOptions& media_description_options, >@@ -242,7 +250,8 @@ class MediaSessionDescriptionFactory { > const TransportInfo* bundle_transport, > const AudioCodecs& audio_codecs, > StreamParamsVec* current_streams, >- SessionDescription* answer) const; >+ SessionDescription* answer, >+ IceCredentialsIterator* ice_credentials) const; > > bool AddVideoContentForAnswer( > const MediaDescriptionOptions& media_description_options, >@@ -254,7 +263,8 @@ class MediaSessionDescriptionFactory { > const TransportInfo* bundle_transport, > const VideoCodecs& video_codecs, > StreamParamsVec* current_streams, >- SessionDescription* answer) const; >+ SessionDescription* answer, >+ IceCredentialsIterator* ice_credentials) const; > > bool AddDataContentForAnswer( > const MediaDescriptionOptions& media_description_options, >@@ -266,7 +276,8 @@ class MediaSessionDescriptionFactory { > const TransportInfo* bundle_transport, > const DataCodecs& data_codecs, > StreamParamsVec* current_streams, >- SessionDescription* answer) const; >+ SessionDescription* answer, >+ IceCredentialsIterator* ice_credentials) const; > > void ComputeAudioCodecsIntersectionAndUnion(); > >@@ -327,20 +338,23 @@ DataContentDescription* GetFirstDataContentDescription( > SessionDescription* sdesc); > > // Helper functions to return crypto suites used for SDES. 
>-void GetSupportedAudioSdesCryptoSuites(const rtc::CryptoOptions& crypto_options, >- std::vector<int>* crypto_suites); >-void GetSupportedVideoSdesCryptoSuites(const rtc::CryptoOptions& crypto_options, >- std::vector<int>* crypto_suites); >-void GetSupportedDataSdesCryptoSuites(const rtc::CryptoOptions& crypto_options, >- std::vector<int>* crypto_suites); >+void GetSupportedAudioSdesCryptoSuites( >+ const webrtc::CryptoOptions& crypto_options, >+ std::vector<int>* crypto_suites); >+void GetSupportedVideoSdesCryptoSuites( >+ const webrtc::CryptoOptions& crypto_options, >+ std::vector<int>* crypto_suites); >+void GetSupportedDataSdesCryptoSuites( >+ const webrtc::CryptoOptions& crypto_options, >+ std::vector<int>* crypto_suites); > void GetSupportedAudioSdesCryptoSuiteNames( >- const rtc::CryptoOptions& crypto_options, >+ const webrtc::CryptoOptions& crypto_options, > std::vector<std::string>* crypto_suite_names); > void GetSupportedVideoSdesCryptoSuiteNames( >- const rtc::CryptoOptions& crypto_options, >+ const webrtc::CryptoOptions& crypto_options, > std::vector<std::string>* crypto_suite_names); > void GetSupportedDataSdesCryptoSuiteNames( >- const rtc::CryptoOptions& crypto_options, >+ const webrtc::CryptoOptions& crypto_options, > std::vector<std::string>* crypto_suite_names); > > // Returns true if the given media section protocol indicates use of RTP. 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/mediasession_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/mediasession_unittest.cc >index db325ed2ef167ebf9b3ade832cb4ccdbf7f73c16..076ad12a5b34980c00c31209749a1eb536fc34d4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/mediasession_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/mediasession_unittest.cc >@@ -26,10 +26,11 @@ > #include "rtc_base/messagedigest.h" > #include "rtc_base/ssladapter.h" > #include "rtc_base/strings/string_builder.h" >+#include "test/gmock.h" > > #define ASSERT_CRYPTO(cd, s, cs) \ > ASSERT_EQ(s, cd->cryptos().size()); \ >- ASSERT_EQ(std::string(cs), cd->cryptos()[0].cipher_suite) >+ ASSERT_EQ(cs, cd->cryptos()[0].cipher_suite) > > typedef std::vector<cricket::Candidate> Candidates; > >@@ -71,6 +72,7 @@ using rtc::CS_AES_CM_128_HMAC_SHA1_32; > using rtc::CS_AES_CM_128_HMAC_SHA1_80; > using rtc::CS_AEAD_AES_128_GCM; > using rtc::CS_AEAD_AES_256_GCM; >+using testing::ElementsAreArray; > using webrtc::RtpExtension; > using webrtc::RtpTransceiverDirection; > >@@ -545,7 +547,7 @@ class MediaSessionDescriptionFactoryTest : public testing::Test { > EXPECT_TRUE(CompareCryptoParams(audio_media_desc->cryptos(), > video_media_desc->cryptos())); > EXPECT_EQ(1u, audio_media_desc->cryptos().size()); >- EXPECT_EQ(std::string(kDefaultSrtpCryptoSuite), >+ EXPECT_EQ(kDefaultSrtpCryptoSuite, > audio_media_desc->cryptos()[0].cipher_suite); > > // Verify the selected crypto is one from the reference audio >@@ -609,11 +611,11 @@ class MediaSessionDescriptionFactoryTest : public testing::Test { > void TestVideoGcmCipher(bool gcm_offer, bool gcm_answer) { > MediaSessionOptions offer_opts; > AddAudioVideoSections(RtpTransceiverDirection::kRecvOnly, &offer_opts); >- offer_opts.crypto_options.enable_gcm_crypto_suites = gcm_offer; >+ offer_opts.crypto_options.srtp.enable_gcm_crypto_suites = gcm_offer; > > MediaSessionOptions answer_opts; > 
AddAudioVideoSections(RtpTransceiverDirection::kRecvOnly, &answer_opts); >- answer_opts.crypto_options.enable_gcm_crypto_suites = gcm_answer; >+ answer_opts.crypto_options.srtp.enable_gcm_crypto_suites = gcm_answer; > > f1_.set_secure(SEC_ENABLED); > f2_.set_secure(SEC_ENABLED); >@@ -631,7 +633,7 @@ class MediaSessionDescriptionFactoryTest : public testing::Test { > const AudioContentDescription* acd = ac->media_description()->as_audio(); > const VideoContentDescription* vcd = vc->media_description()->as_video(); > EXPECT_EQ(MEDIA_TYPE_AUDIO, acd->type()); >- EXPECT_EQ(MAKE_VECTOR(kAudioCodecsAnswer), acd->codecs()); >+ EXPECT_THAT(acd->codecs(), ElementsAreArray(kAudioCodecsAnswer)); > EXPECT_EQ(kAutoBandwidth, acd->bandwidth()); // negotiated auto bw > EXPECT_EQ(0U, acd->first_ssrc()); // no sender is attached > EXPECT_TRUE(acd->rtcp_mux()); // negotiated rtcp-mux >@@ -641,7 +643,7 @@ class MediaSessionDescriptionFactoryTest : public testing::Test { > ASSERT_CRYPTO(acd, 1U, kDefaultSrtpCryptoSuite); > } > EXPECT_EQ(MEDIA_TYPE_VIDEO, vcd->type()); >- EXPECT_EQ(MAKE_VECTOR(kVideoCodecsAnswer), vcd->codecs()); >+ EXPECT_THAT(vcd->codecs(), ElementsAreArray(kVideoCodecsAnswer)); > EXPECT_EQ(0U, vcd->first_ssrc()); // no sender is attached > EXPECT_TRUE(vcd->rtcp_mux()); // negotiated rtcp-mux > if (gcm_offer && gcm_answer) { >@@ -649,7 +651,7 @@ class MediaSessionDescriptionFactoryTest : public testing::Test { > } else { > ASSERT_CRYPTO(vcd, 1U, kDefaultSrtpCryptoSuite); > } >- EXPECT_EQ(std::string(cricket::kMediaProtocolSavpf), vcd->protocol()); >+ EXPECT_EQ(cricket::kMediaProtocolSavpf, vcd->protocol()); > } > > protected: >@@ -677,7 +679,7 @@ TEST_F(MediaSessionDescriptionFactoryTest, TestCreateAudioOffer) { > EXPECT_EQ(kAutoBandwidth, acd->bandwidth()); // default bandwidth (auto) > EXPECT_TRUE(acd->rtcp_mux()); // rtcp-mux defaults on > ASSERT_CRYPTO(acd, 1U, kDefaultSrtpCryptoSuite); >- EXPECT_EQ(std::string(cricket::kMediaProtocolSavpf), acd->protocol()); >+ 
EXPECT_EQ(cricket::kMediaProtocolSavpf, acd->protocol()); > } > > // Create a typical video offer, and ensure it matches what we expect. >@@ -701,14 +703,14 @@ TEST_F(MediaSessionDescriptionFactoryTest, TestCreateVideoOffer) { > EXPECT_EQ(kAutoBandwidth, acd->bandwidth()); // default bandwidth (auto) > EXPECT_TRUE(acd->rtcp_mux()); // rtcp-mux defaults on > ASSERT_CRYPTO(acd, 1U, kDefaultSrtpCryptoSuite); >- EXPECT_EQ(std::string(cricket::kMediaProtocolSavpf), acd->protocol()); >+ EXPECT_EQ(cricket::kMediaProtocolSavpf, acd->protocol()); > EXPECT_EQ(MEDIA_TYPE_VIDEO, vcd->type()); > EXPECT_EQ(f1_.video_codecs(), vcd->codecs()); > EXPECT_EQ(0U, vcd->first_ssrc()); // no sender is attached > EXPECT_EQ(kAutoBandwidth, vcd->bandwidth()); // default bandwidth (auto) > EXPECT_TRUE(vcd->rtcp_mux()); // rtcp-mux defaults on > ASSERT_CRYPTO(vcd, 1U, kDefaultSrtpCryptoSuite); >- EXPECT_EQ(std::string(cricket::kMediaProtocolSavpf), vcd->protocol()); >+ EXPECT_EQ(cricket::kMediaProtocolSavpf, vcd->protocol()); > } > > // Test creating an offer with bundle where the Codecs have the same dynamic >@@ -779,11 +781,11 @@ TEST_F(MediaSessionDescriptionFactoryTest, > EXPECT_TRUE(NULL != dcd); > > ASSERT_CRYPTO(acd, 1U, kDefaultSrtpCryptoSuite); >- EXPECT_EQ(std::string(cricket::kMediaProtocolSavpf), acd->protocol()); >+ EXPECT_EQ(cricket::kMediaProtocolSavpf, acd->protocol()); > ASSERT_CRYPTO(vcd, 1U, kDefaultSrtpCryptoSuite); >- EXPECT_EQ(std::string(cricket::kMediaProtocolSavpf), vcd->protocol()); >+ EXPECT_EQ(cricket::kMediaProtocolSavpf, vcd->protocol()); > ASSERT_CRYPTO(dcd, 1U, kDefaultSrtpCryptoSuite); >- EXPECT_EQ(std::string(cricket::kMediaProtocolSavpf), dcd->protocol()); >+ EXPECT_EQ(cricket::kMediaProtocolSavpf, dcd->protocol()); > } > > // Create a RTP data offer, and ensure it matches what we expect. 
>@@ -808,7 +810,7 @@ TEST_F(MediaSessionDescriptionFactoryTest, TestCreateRtpDataOffer) { > EXPECT_EQ(kAutoBandwidth, acd->bandwidth()); // default bandwidth (auto) > EXPECT_TRUE(acd->rtcp_mux()); // rtcp-mux defaults on > ASSERT_CRYPTO(acd, 1U, kDefaultSrtpCryptoSuite); >- EXPECT_EQ(std::string(cricket::kMediaProtocolSavpf), acd->protocol()); >+ EXPECT_EQ(cricket::kMediaProtocolSavpf, acd->protocol()); > EXPECT_EQ(MEDIA_TYPE_DATA, dcd->type()); > EXPECT_EQ(f1_.data_codecs(), dcd->codecs()); > EXPECT_EQ(0U, dcd->first_ssrc()); // no sender is attached. >@@ -816,7 +818,7 @@ TEST_F(MediaSessionDescriptionFactoryTest, TestCreateRtpDataOffer) { > dcd->bandwidth()); // default bandwidth (auto) > EXPECT_TRUE(dcd->rtcp_mux()); // rtcp-mux defaults on > ASSERT_CRYPTO(dcd, 1U, kDefaultSrtpCryptoSuite); >- EXPECT_EQ(std::string(cricket::kMediaProtocolSavpf), dcd->protocol()); >+ EXPECT_EQ(cricket::kMediaProtocolSavpf, dcd->protocol()); > } > > // Create an SCTP data offer with bundle without error. 
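Throughout the mediasession_unittest.cc hunks, `EXPECT_EQ(MAKE_VECTOR(kCodecs), cd->codecs())` becomes `EXPECT_THAT(cd->codecs(), ElementsAreArray(kCodecs))`, which matches the container against the C array directly (no intermediate vector) and reports per-element mismatches on failure. A self-contained sketch of the check that matcher performs, using a hypothetical helper name rather than gmock itself:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Sketch of what gmock's ElementsAreArray(expected) asserts about a container:
// same length as the array, and equal elements in the same order.
template <typename T, std::size_t N>
bool MatchesElementsArray(const std::vector<T>& actual,
                          const T (&expected)[N]) {
  return actual.size() == N &&
         std::equal(actual.begin(), actual.end(), expected);
}
```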
>@@ -939,12 +941,12 @@ TEST_F(MediaSessionDescriptionFactoryTest, TestCreateAudioAnswer) { > EXPECT_EQ(MediaProtocolType::kRtp, ac->type); > const AudioContentDescription* acd = ac->media_description()->as_audio(); > EXPECT_EQ(MEDIA_TYPE_AUDIO, acd->type()); >- EXPECT_EQ(MAKE_VECTOR(kAudioCodecsAnswer), acd->codecs()); >+ EXPECT_THAT(acd->codecs(), ElementsAreArray(kAudioCodecsAnswer)); > EXPECT_EQ(0U, acd->first_ssrc()); // no sender is attached > EXPECT_EQ(kAutoBandwidth, acd->bandwidth()); // negotiated auto bw > EXPECT_TRUE(acd->rtcp_mux()); // negotiated rtcp-mux > ASSERT_CRYPTO(acd, 1U, kDefaultSrtpCryptoSuite); >- EXPECT_EQ(std::string(cricket::kMediaProtocolSavpf), acd->protocol()); >+ EXPECT_EQ(cricket::kMediaProtocolSavpf, acd->protocol()); > } > > // Create a typical audio answer with GCM ciphers enabled, and ensure it >@@ -953,7 +955,7 @@ TEST_F(MediaSessionDescriptionFactoryTest, TestCreateAudioAnswerGcm) { > f1_.set_secure(SEC_ENABLED); > f2_.set_secure(SEC_ENABLED); > MediaSessionOptions opts = CreatePlanBMediaSessionOptions(); >- opts.crypto_options.enable_gcm_crypto_suites = true; >+ opts.crypto_options.srtp.enable_gcm_crypto_suites = true; > std::unique_ptr<SessionDescription> offer(f1_.CreateOffer(opts, NULL)); > ASSERT_TRUE(offer.get() != NULL); > std::unique_ptr<SessionDescription> answer( >@@ -965,12 +967,12 @@ TEST_F(MediaSessionDescriptionFactoryTest, TestCreateAudioAnswerGcm) { > EXPECT_EQ(MediaProtocolType::kRtp, ac->type); > const AudioContentDescription* acd = ac->media_description()->as_audio(); > EXPECT_EQ(MEDIA_TYPE_AUDIO, acd->type()); >- EXPECT_EQ(MAKE_VECTOR(kAudioCodecsAnswer), acd->codecs()); >+ EXPECT_THAT(acd->codecs(), ElementsAreArray(kAudioCodecsAnswer)); > EXPECT_EQ(0U, acd->first_ssrc()); // no sender is attached > EXPECT_EQ(kAutoBandwidth, acd->bandwidth()); // negotiated auto bw > EXPECT_TRUE(acd->rtcp_mux()); // negotiated rtcp-mux > ASSERT_CRYPTO(acd, 1U, kDefaultSrtpCryptoSuiteGcm); >- 
EXPECT_EQ(std::string(cricket::kMediaProtocolSavpf), acd->protocol()); >+ EXPECT_EQ(cricket::kMediaProtocolSavpf, acd->protocol()); > } > > // Create a typical video answer, and ensure it matches what we expect. >@@ -992,17 +994,17 @@ TEST_F(MediaSessionDescriptionFactoryTest, TestCreateVideoAnswer) { > const AudioContentDescription* acd = ac->media_description()->as_audio(); > const VideoContentDescription* vcd = vc->media_description()->as_video(); > EXPECT_EQ(MEDIA_TYPE_AUDIO, acd->type()); >- EXPECT_EQ(MAKE_VECTOR(kAudioCodecsAnswer), acd->codecs()); >+ EXPECT_THAT(acd->codecs(), ElementsAreArray(kAudioCodecsAnswer)); > EXPECT_EQ(kAutoBandwidth, acd->bandwidth()); // negotiated auto bw > EXPECT_EQ(0U, acd->first_ssrc()); // no sender is attached > EXPECT_TRUE(acd->rtcp_mux()); // negotiated rtcp-mux > ASSERT_CRYPTO(acd, 1U, kDefaultSrtpCryptoSuite); > EXPECT_EQ(MEDIA_TYPE_VIDEO, vcd->type()); >- EXPECT_EQ(MAKE_VECTOR(kVideoCodecsAnswer), vcd->codecs()); >+ EXPECT_THAT(vcd->codecs(), ElementsAreArray(kVideoCodecsAnswer)); > EXPECT_EQ(0U, vcd->first_ssrc()); // no sender is attached > EXPECT_TRUE(vcd->rtcp_mux()); // negotiated rtcp-mux > ASSERT_CRYPTO(vcd, 1U, kDefaultSrtpCryptoSuite); >- EXPECT_EQ(std::string(cricket::kMediaProtocolSavpf), vcd->protocol()); >+ EXPECT_EQ(cricket::kMediaProtocolSavpf, vcd->protocol()); > } > > // Create a typical video answer with GCM ciphers enabled, and ensure it >@@ -1041,23 +1043,23 @@ TEST_F(MediaSessionDescriptionFactoryTest, TestCreateDataAnswer) { > const AudioContentDescription* acd = ac->media_description()->as_audio(); > const DataContentDescription* dcd = dc->media_description()->as_data(); > EXPECT_EQ(MEDIA_TYPE_AUDIO, acd->type()); >- EXPECT_EQ(MAKE_VECTOR(kAudioCodecsAnswer), acd->codecs()); >+ EXPECT_THAT(acd->codecs(), ElementsAreArray(kAudioCodecsAnswer)); > EXPECT_EQ(kAutoBandwidth, acd->bandwidth()); // negotiated auto bw > EXPECT_EQ(0U, acd->first_ssrc()); // no sender is attached > 
EXPECT_TRUE(acd->rtcp_mux()); // negotiated rtcp-mux > ASSERT_CRYPTO(acd, 1U, kDefaultSrtpCryptoSuite); > EXPECT_EQ(MEDIA_TYPE_DATA, dcd->type()); >- EXPECT_EQ(MAKE_VECTOR(kDataCodecsAnswer), dcd->codecs()); >+ EXPECT_THAT(dcd->codecs(), ElementsAreArray(kDataCodecsAnswer)); > EXPECT_EQ(0U, dcd->first_ssrc()); // no sender is attached > EXPECT_TRUE(dcd->rtcp_mux()); // negotiated rtcp-mux > ASSERT_CRYPTO(dcd, 1U, kDefaultSrtpCryptoSuite); >- EXPECT_EQ(std::string(cricket::kMediaProtocolSavpf), dcd->protocol()); >+ EXPECT_EQ(cricket::kMediaProtocolSavpf, dcd->protocol()); > } > > TEST_F(MediaSessionDescriptionFactoryTest, TestCreateDataAnswerGcm) { > MediaSessionOptions opts = CreatePlanBMediaSessionOptions(); > AddDataSection(cricket::DCT_RTP, RtpTransceiverDirection::kRecvOnly, &opts); >- opts.crypto_options.enable_gcm_crypto_suites = true; >+ opts.crypto_options.srtp.enable_gcm_crypto_suites = true; > f1_.set_secure(SEC_ENABLED); > f2_.set_secure(SEC_ENABLED); > std::unique_ptr<SessionDescription> offer(f1_.CreateOffer(opts, NULL)); >@@ -1073,17 +1075,17 @@ TEST_F(MediaSessionDescriptionFactoryTest, TestCreateDataAnswerGcm) { > const AudioContentDescription* acd = ac->media_description()->as_audio(); > const DataContentDescription* dcd = dc->media_description()->as_data(); > EXPECT_EQ(MEDIA_TYPE_AUDIO, acd->type()); >- EXPECT_EQ(MAKE_VECTOR(kAudioCodecsAnswer), acd->codecs()); >+ EXPECT_THAT(acd->codecs(), ElementsAreArray(kAudioCodecsAnswer)); > EXPECT_EQ(kAutoBandwidth, acd->bandwidth()); // negotiated auto bw > EXPECT_EQ(0U, acd->first_ssrc()); // no sender is attached > EXPECT_TRUE(acd->rtcp_mux()); // negotiated rtcp-mux > ASSERT_CRYPTO(acd, 1U, kDefaultSrtpCryptoSuiteGcm); > EXPECT_EQ(MEDIA_TYPE_DATA, dcd->type()); >- EXPECT_EQ(MAKE_VECTOR(kDataCodecsAnswer), dcd->codecs()); >+ EXPECT_THAT(dcd->codecs(), ElementsAreArray(kDataCodecsAnswer)); > EXPECT_EQ(0U, dcd->first_ssrc()); // no sender is attached > EXPECT_TRUE(dcd->rtcp_mux()); // negotiated rtcp-mux > 
ASSERT_CRYPTO(dcd, 1U, kDefaultSrtpCryptoSuiteGcm); >- EXPECT_EQ(std::string(cricket::kMediaProtocolSavpf), dcd->protocol()); >+ EXPECT_EQ(cricket::kMediaProtocolSavpf, dcd->protocol()); > } > > // The use_sctpmap flag should be set in a DataContentDescription by default. >@@ -1264,7 +1266,7 @@ TEST_F(MediaSessionDescriptionFactoryTest, AudioOfferAnswerWithCryptoDisabled) { > const AudioContentDescription* offer_acd = > GetFirstAudioContentDescription(offer.get()); > ASSERT_TRUE(offer_acd != NULL); >- EXPECT_EQ(std::string(cricket::kMediaProtocolAvpf), offer_acd->protocol()); >+ EXPECT_EQ(cricket::kMediaProtocolAvpf, offer_acd->protocol()); > > std::unique_ptr<SessionDescription> answer( > f2_.CreateAnswer(offer.get(), opts, NULL)); >@@ -1276,7 +1278,7 @@ TEST_F(MediaSessionDescriptionFactoryTest, AudioOfferAnswerWithCryptoDisabled) { > const AudioContentDescription* answer_acd = > GetFirstAudioContentDescription(answer.get()); > ASSERT_TRUE(answer_acd != NULL); >- EXPECT_EQ(std::string(cricket::kMediaProtocolAvpf), answer_acd->protocol()); >+ EXPECT_EQ(cricket::kMediaProtocolAvpf, answer_acd->protocol()); > } > > // Create a video offer and answer and ensure the RTP header extensions >@@ -1581,6 +1583,67 @@ TEST_F(MediaSessionDescriptionFactoryTest, > EXPECT_TRUE(dc->rejected); > } > >+TEST_F(MediaSessionDescriptionFactoryTest, >+ CreateAnswerSupportsMixedOneAndTwoByteHeaderExtensions) { >+ MediaSessionOptions opts; >+ std::unique_ptr<SessionDescription> offer(f1_.CreateOffer(opts, NULL)); >+ // Offer without request of mixed one- and two-byte header extensions. >+ offer->set_extmap_allow_mixed(false); >+ ASSERT_TRUE(offer.get() != NULL); >+ std::unique_ptr<SessionDescription> answer_no_support( >+ f2_.CreateAnswer(offer.get(), opts, NULL)); >+ EXPECT_FALSE(answer_no_support->extmap_allow_mixed()); >+ >+ // Offer with request of mixed one- and two-byte header extensions. 
>+ offer->set_extmap_allow_mixed(true); >+ ASSERT_TRUE(offer.get() != NULL); >+ std::unique_ptr<SessionDescription> answer_support( >+ f2_.CreateAnswer(offer.get(), opts, NULL)); >+ EXPECT_TRUE(answer_support->extmap_allow_mixed()); >+} >+ >+TEST_F(MediaSessionDescriptionFactoryTest, >+ CreateAnswerSupportsMixedOneAndTwoByteHeaderExtensionsOnMediaLevel) { >+ MediaSessionOptions opts; >+ AddAudioVideoSections(RtpTransceiverDirection::kSendRecv, &opts); >+ std::unique_ptr<SessionDescription> offer(f1_.CreateOffer(opts, NULL)); >+ MediaContentDescription* video_offer = >+ offer->GetContentDescriptionByName("video"); >+ ASSERT_TRUE(video_offer); >+ MediaContentDescription* audio_offer = >+ offer->GetContentDescriptionByName("audio"); >+ ASSERT_TRUE(audio_offer); >+ >+ // Explicit disable of mixed one-two byte header support in offer. >+ video_offer->set_extmap_allow_mixed_enum(MediaContentDescription::kNo); >+ audio_offer->set_extmap_allow_mixed_enum(MediaContentDescription::kNo); >+ >+ ASSERT_TRUE(offer.get() != NULL); >+ std::unique_ptr<SessionDescription> answer_no_support( >+ f2_.CreateAnswer(offer.get(), opts, NULL)); >+ MediaContentDescription* video_answer = >+ answer_no_support->GetContentDescriptionByName("video"); >+ MediaContentDescription* audio_answer = >+ answer_no_support->GetContentDescriptionByName("audio"); >+ EXPECT_EQ(MediaContentDescription::kNo, >+ video_answer->extmap_allow_mixed_enum()); >+ EXPECT_EQ(MediaContentDescription::kNo, >+ audio_answer->extmap_allow_mixed_enum()); >+ >+ // Enable mixed one-two byte header support in offer. 
>+ video_offer->set_extmap_allow_mixed_enum(MediaContentDescription::kMedia); >+ audio_offer->set_extmap_allow_mixed_enum(MediaContentDescription::kMedia); >+ ASSERT_TRUE(offer.get() != NULL); >+ std::unique_ptr<SessionDescription> answer_support( >+ f2_.CreateAnswer(offer.get(), opts, NULL)); >+ video_answer = answer_support->GetContentDescriptionByName("video"); >+ audio_answer = answer_support->GetContentDescriptionByName("audio"); >+ EXPECT_EQ(MediaContentDescription::kMedia, >+ video_answer->extmap_allow_mixed_enum()); >+ EXPECT_EQ(MediaContentDescription::kMedia, >+ audio_answer->extmap_allow_mixed_enum()); >+} >+ > // Create an audio and video offer with: > // - one video track > // - two audio tracks >@@ -1814,7 +1877,7 @@ TEST_F(MediaSessionDescriptionFactoryTest, TestCreateMultiStreamVideoAnswer) { > ASSERT_CRYPTO(dcd, 1U, kDefaultSrtpCryptoSuite); > > EXPECT_EQ(MEDIA_TYPE_AUDIO, acd->type()); >- EXPECT_EQ(MAKE_VECTOR(kAudioCodecsAnswer), acd->codecs()); >+ EXPECT_THAT(acd->codecs(), ElementsAreArray(kAudioCodecsAnswer)); > > const StreamParamsVec& audio_streams = acd->streams(); > ASSERT_EQ(2U, audio_streams.size()); >@@ -1830,7 +1893,7 @@ TEST_F(MediaSessionDescriptionFactoryTest, TestCreateMultiStreamVideoAnswer) { > EXPECT_TRUE(acd->rtcp_mux()); // rtcp-mux defaults on > > EXPECT_EQ(MEDIA_TYPE_VIDEO, vcd->type()); >- EXPECT_EQ(MAKE_VECTOR(kVideoCodecsAnswer), vcd->codecs()); >+ EXPECT_THAT(vcd->codecs(), ElementsAreArray(kVideoCodecsAnswer)); > > const StreamParamsVec& video_streams = vcd->streams(); > ASSERT_EQ(1U, video_streams.size()); >@@ -1840,7 +1903,7 @@ TEST_F(MediaSessionDescriptionFactoryTest, TestCreateMultiStreamVideoAnswer) { > EXPECT_TRUE(vcd->rtcp_mux()); // rtcp-mux defaults on > > EXPECT_EQ(MEDIA_TYPE_DATA, dcd->type()); >- EXPECT_EQ(MAKE_VECTOR(kDataCodecsAnswer), dcd->codecs()); >+ EXPECT_THAT(dcd->codecs(), ElementsAreArray(kDataCodecsAnswer)); > > const StreamParamsVec& data_streams = dcd->streams(); > ASSERT_EQ(2U, 
data_streams.size()); >@@ -1923,11 +1986,11 @@ TEST_F(MediaSessionDescriptionFactoryTest, > > const AudioContentDescription* acd = > GetFirstAudioContentDescription(answer.get()); >- EXPECT_EQ(MAKE_VECTOR(kAudioCodecsAnswer), acd->codecs()); >+ EXPECT_THAT(acd->codecs(), ElementsAreArray(kAudioCodecsAnswer)); > > const VideoContentDescription* vcd = > GetFirstVideoContentDescription(answer.get()); >- EXPECT_EQ(MAKE_VECTOR(kVideoCodecsAnswer), vcd->codecs()); >+ EXPECT_THAT(vcd->codecs(), ElementsAreArray(kVideoCodecsAnswer)); > > std::unique_ptr<SessionDescription> updated_offer( > f2_.CreateOffer(opts, answer.get())); >@@ -1950,11 +2013,11 @@ TEST_F(MediaSessionDescriptionFactoryTest, > > const AudioContentDescription* updated_acd = > GetFirstAudioContentDescription(updated_offer.get()); >- EXPECT_EQ(MAKE_VECTOR(kUpdatedAudioCodecOffer), updated_acd->codecs()); >+ EXPECT_THAT(updated_acd->codecs(), ElementsAreArray(kUpdatedAudioCodecOffer)); > > const VideoContentDescription* updated_vcd = > GetFirstVideoContentDescription(updated_offer.get()); >- EXPECT_EQ(MAKE_VECTOR(kUpdatedVideoCodecOffer), updated_vcd->codecs()); >+ EXPECT_THAT(updated_vcd->codecs(), ElementsAreArray(kUpdatedVideoCodecOffer)); > } > > // Create an updated offer after creating an answer to the original offer and >@@ -2079,7 +2142,7 @@ TEST_F(MediaSessionDescriptionFactoryTest, > > const AudioContentDescription* acd = > GetFirstAudioContentDescription(answer.get()); >- EXPECT_EQ(MAKE_VECTOR(kAudioCodecsAnswer), acd->codecs()); >+ EXPECT_THAT(acd->codecs(), ElementsAreArray(kAudioCodecsAnswer)); > > // Now - let |f2_| add video with RTX and let the payload type the RTX codec > // reference be the same as an audio codec that was negotiated in the >@@ -2101,13 +2164,13 @@ TEST_F(MediaSessionDescriptionFactoryTest, > > const AudioContentDescription* updated_acd = > GetFirstAudioContentDescription(answer.get()); >- EXPECT_EQ(MAKE_VECTOR(kAudioCodecsAnswer), updated_acd->codecs()); >+ 
EXPECT_THAT(updated_acd->codecs(), ElementsAreArray(kAudioCodecsAnswer)); > > const VideoContentDescription* updated_vcd = > GetFirstVideoContentDescription(updated_answer.get()); > > ASSERT_EQ("H264", updated_vcd->codecs()[0].name); >- ASSERT_EQ(std::string(cricket::kRtxCodecName), updated_vcd->codecs()[1].name); >+ ASSERT_EQ(cricket::kRtxCodecName, updated_vcd->codecs()[1].name); > int new_h264_pl_type = updated_vcd->codecs()[0].id; > EXPECT_NE(used_pl_type, new_h264_pl_type); > VideoCodec rtx = updated_vcd->codecs()[1]; >@@ -2776,8 +2839,7 @@ TEST_F(MediaSessionDescriptionFactoryTest, TestOfferDtlsSavpfCreateAnswer) { > > const AudioContentDescription* answer_audio_desc = > answer_content->media_description()->as_audio(); >- EXPECT_EQ(std::string(cricket::kMediaProtocolDtlsSavpf), >- answer_audio_desc->protocol()); >+ EXPECT_EQ(cricket::kMediaProtocolDtlsSavpf, answer_audio_desc->protocol()); > } > > // Test that we include both SDES and DTLS in the offer, but only include SDES >@@ -2842,10 +2904,8 @@ TEST_F(MediaSessionDescriptionFactoryTest, TestCryptoDtls) { > ASSERT_TRUE(video_media_desc != NULL); > EXPECT_TRUE(audio_media_desc->cryptos().empty()); > EXPECT_TRUE(video_media_desc->cryptos().empty()); >- EXPECT_EQ(std::string(cricket::kMediaProtocolSavpf), >- audio_media_desc->protocol()); >- EXPECT_EQ(std::string(cricket::kMediaProtocolSavpf), >- video_media_desc->protocol()); >+ EXPECT_EQ(cricket::kMediaProtocolSavpf, audio_media_desc->protocol()); >+ EXPECT_EQ(cricket::kMediaProtocolSavpf, video_media_desc->protocol()); > > audio_trans_desc = answer->GetTransportDescriptionByName("audio"); > ASSERT_TRUE(audio_trans_desc != NULL); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection.cc >index 6d67329242d75c1ca8256d07a4514c72d9f40287..a6b47c16deba381e57abfbf1230118165fb158a1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection.cc >@@ -22,6 +22,7 @@ > #include "api/jsepsessiondescription.h" > #include "api/mediastreamproxy.h" > #include "api/mediastreamtrackproxy.h" >+#include "api/umametrics.h" > #include "call/call.h" > #include "logging/rtc_event_log/icelogger.h" > #include "logging/rtc_event_log/output/rtc_event_log_output_file.h" >@@ -630,6 +631,35 @@ absl::optional<int> RTCConfigurationToIceConfigOptionalInt( > return rtc_configuration_parameter; > } > >+cricket::DataMessageType ToCricketDataMessageType(DataMessageType type) { >+ switch (type) { >+ case DataMessageType::kText: >+ return cricket::DMT_TEXT; >+ case DataMessageType::kBinary: >+ return cricket::DMT_BINARY; >+ case DataMessageType::kControl: >+ return cricket::DMT_CONTROL; >+ default: >+ return cricket::DMT_NONE; >+ } >+ return cricket::DMT_NONE; >+} >+ >+DataMessageType ToWebrtcDataMessageType(cricket::DataMessageType type) { >+ switch (type) { >+ case cricket::DMT_TEXT: >+ return DataMessageType::kText; >+ case cricket::DMT_BINARY: >+ return DataMessageType::kBinary; >+ case cricket::DMT_CONTROL: >+ return DataMessageType::kControl; >+ case cricket::DMT_NONE: >+ default: >+ RTC_NOTREACHED(); >+ } >+ return DataMessageType::kControl; >+} >+ > } // namespace > > // Upon completion, posts a task to execute the callback of the >@@ -684,6 +714,7 @@ bool PeerConnectionInterface::RTCConfiguration::operator==( > CandidateNetworkPolicy candidate_network_policy; > int audio_jitter_buffer_max_packets; > bool audio_jitter_buffer_fast_accelerate; >+ int audio_jitter_buffer_min_delay_ms; > int ice_connection_receiving_timeout; > int ice_backup_candidate_pair_ping_interval; > ContinualGatheringPolicy continual_gathering_policy; >@@ -704,6 +735,10 @@ bool PeerConnectionInterface::RTCConfiguration::operator==( > SdpSemantics sdp_semantics; > absl::optional<rtc::AdapterType> network_preference; > bool active_reset_srtp_params; >+ bool use_media_transport; >+ bool 
use_media_transport_for_data_channels; >+ absl::optional<CryptoOptions> crypto_options; >+ bool offer_extmap_allow_mixed; > }; > static_assert(sizeof(stuff_being_tested_for_equality) == sizeof(*this), > "Did you add something to RTCConfiguration and forget to " >@@ -716,6 +751,8 @@ bool PeerConnectionInterface::RTCConfiguration::operator==( > audio_jitter_buffer_max_packets == o.audio_jitter_buffer_max_packets && > audio_jitter_buffer_fast_accelerate == > o.audio_jitter_buffer_fast_accelerate && >+ audio_jitter_buffer_min_delay_ms == >+ o.audio_jitter_buffer_min_delay_ms && > ice_connection_receiving_timeout == > o.ice_connection_receiving_timeout && > ice_backup_candidate_pair_ping_interval == >@@ -751,7 +788,12 @@ bool PeerConnectionInterface::RTCConfiguration::operator==( > turn_customizer == o.turn_customizer && > sdp_semantics == o.sdp_semantics && > network_preference == o.network_preference && >- active_reset_srtp_params == o.active_reset_srtp_params; >+ active_reset_srtp_params == o.active_reset_srtp_params && >+ use_media_transport == o.use_media_transport && >+ use_media_transport_for_data_channels == >+ o.use_media_transport_for_data_channels && >+ crypto_options == o.crypto_options && >+ offer_extmap_allow_mixed == o.offer_extmap_allow_mixed; > } > > bool PeerConnectionInterface::RTCConfiguration::operator!=( >@@ -820,6 +862,7 @@ PeerConnection::~PeerConnection() { > webrtc_session_desc_factory_.reset(); > sctp_invoker_.reset(); > sctp_factory_.reset(); >+ media_transport_invoker_.reset(); > transport_controller_.reset(); > > // port_allocator_ lives on the network thread and should be destroyed there. 
>@@ -929,18 +972,40 @@ bool PeerConnection::Initialize( > config.disable_encryption = options.disable_encryption; > config.bundle_policy = configuration.bundle_policy; > config.rtcp_mux_policy = configuration.rtcp_mux_policy; >- config.crypto_options = options.crypto_options; >+ // TODO(bugs.webrtc.org/9891) - Remove options.crypto_options then remove this >+ // stub. >+ config.crypto_options = configuration.crypto_options.has_value() >+ ? *configuration.crypto_options >+ : options.crypto_options; > config.transport_observer = this; > config.event_log = event_log_.get(); > #if defined(ENABLE_EXTERNAL_AUTH) > config.enable_external_auth = true; > #endif > config.active_reset_srtp_params = configuration.active_reset_srtp_params; >+ >+ if (configuration.use_media_transport || >+ configuration.use_media_transport_for_data_channels) { >+ if (!factory_->media_transport_factory()) { >+ RTC_DCHECK(false) >+ << "PeerConnection is initialized with use_media_transport = true or " >+ << "use_media_transport_for_data_channels = true " >+ << "but media transport factory is not set in PeerConnectionFactory"; >+ return false; >+ } >+ >+ config.media_transport_factory = factory_->media_transport_factory(); >+ } >+ > transport_controller_.reset(new JsepTransportController( > signaling_thread(), network_thread(), port_allocator_.get(), > async_resolver_factory_.get(), config)); > transport_controller_->SignalIceConnectionState.connect( > this, &PeerConnection::OnTransportControllerConnectionState); >+ transport_controller_->SignalStandardizedIceConnectionState.connect( >+ this, &PeerConnection::SetStandardizedIceConnectionState); >+ transport_controller_->SignalConnectionState.connect( >+ this, &PeerConnection::SetConnectionState); > transport_controller_->SignalIceGatheringState.connect( > this, &PeerConnection::OnTransportControllerGatheringState); > transport_controller_->SignalIceCandidatesGathered.connect( >@@ -979,10 +1044,18 @@ bool PeerConnection::Initialize( > } > } > >- 
Enable creation of RTP data channels if the kEnableRtpDataChannels is set. >- // It takes precendence over the disable_sctp_data_channels >- // PeerConnectionFactoryInterface::Options. >- if (configuration.enable_rtp_data_channel) { >+ if (configuration.use_media_transport_for_data_channels) { >+ if (configuration.enable_rtp_data_channel) { >+ RTC_LOG(LS_ERROR) << "enable_rtp_data_channel and " >+ "use_media_transport_for_data_channels are " >+ "incompatible and cannot both be set to true"; >+ return false; >+ } >+ data_channel_type_ = cricket::DCT_MEDIA_TRANSPORT; >+ } else if (configuration.enable_rtp_data_channel) { >+ // Enable creation of RTP data channels if the kEnableRtpDataChannels is >+ // set. It takes precedence over the disable_sctp_data_channels >+ // PeerConnectionFactoryInterface::Options. > data_channel_type_ = cricket::DCT_RTP; > } else { > // DTLS has to be enabled to use SCTP. >@@ -1002,6 +1075,9 @@ bool PeerConnection::Initialize( > audio_options_.audio_jitter_buffer_fast_accelerate = > configuration.audio_jitter_buffer_fast_accelerate; > >+ audio_options_.audio_jitter_buffer_min_delay_ms = >+ configuration.audio_jitter_buffer_min_delay_ms; >+ > // Whether the certificate generator/certificate is null or not determines > // what PeerConnectionDescriptionFactory will do, so make sure that we give it > // the right instructions by clearing the variables if needed. >@@ -1024,7 +1100,7 @@ bool PeerConnection::Initialize( > } > > webrtc_session_desc_factory_->set_enable_encrypted_rtp_header_extensions( >- options.crypto_options.enable_encrypted_rtp_header_extensions); >+ GetCryptoOptions().srtp.enable_encrypted_rtp_header_extensions); > > // Add default audio/video transceivers for Plan B SDP. 
> if (!IsUnifiedPlan()) { >@@ -1183,7 +1259,7 @@ PeerConnection::AddTrackPlanB( > auto new_sender = > CreateSender(media_type, track->id(), track, adjusted_stream_ids, {}); > if (track->kind() == MediaStreamTrackInterface::kAudioKind) { >- new_sender->internal()->SetVoiceMediaChannel(voice_media_channel()); >+ new_sender->internal()->SetMediaChannel(voice_media_channel()); > GetAudioTransceiver()->internal()->AddSender(new_sender); > const RtpSenderInfo* sender_info = > FindSenderInfo(local_audio_sender_infos_, >@@ -1193,7 +1269,7 @@ PeerConnection::AddTrackPlanB( > } > } else { > RTC_DCHECK_EQ(MediaStreamTrackInterface::kVideoKind, track->kind()); >- new_sender->internal()->SetVideoMediaChannel(video_media_channel()); >+ new_sender->internal()->SetMediaChannel(video_media_channel()); > GetVideoTransceiver()->internal()->AddSender(new_sender); > const RtpSenderInfo* sender_info = > FindSenderInfo(local_video_sender_infos_, >@@ -1523,14 +1599,14 @@ rtc::scoped_refptr<RtpSenderInterface> PeerConnection::CreateSender( > if (kind == MediaStreamTrackInterface::kAudioKind) { > auto* audio_sender = new AudioRtpSender( > worker_thread(), rtc::CreateRandomUuid(), stats_.get()); >- audio_sender->SetVoiceMediaChannel(voice_media_channel()); >+ audio_sender->SetMediaChannel(voice_media_channel()); > new_sender = RtpSenderProxyWithInternal<RtpSenderInternal>::Create( > signaling_thread(), audio_sender); > GetAudioTransceiver()->internal()->AddSender(new_sender); > } else if (kind == MediaStreamTrackInterface::kVideoKind) { > auto* video_sender = > new VideoRtpSender(worker_thread(), rtc::CreateRandomUuid()); >- video_sender->SetVideoMediaChannel(video_media_channel()); >+ video_sender->SetMediaChannel(video_media_channel()); > new_sender = RtpSenderProxyWithInternal<RtpSenderInternal>::Create( > signaling_thread(), video_sender); > GetVideoTransceiver()->internal()->AddSender(new_sender); >@@ -1647,7 +1723,7 @@ void PeerConnection::GetStats( > break; > } > } >- // If there is 
no |internal_sender| then |selector| is either null or does not >+ // If there is no |internal_sender| then |selector| is either null or does not > // belong to the PeerConnection (in Plan B, senders can be removed from the > // PeerConnection). This means that "all the stats objects representing the > // selector" is an empty set. Invoking GetStatsReport() with a null selector >@@ -1675,7 +1751,7 @@ void PeerConnection::GetStats( > break; > } > } >- // If there is no |internal_receiver| then |selector| is either null or does >+ // If there is no |internal_receiver| then |selector| is either null or does > // not belong to the PeerConnection (in Plan B, receivers can be removed from > // the PeerConnection). This means that "all the stats objects representing > // the selector" is an empty set. Invoking GetStatsReport() with a null >@@ -1692,6 +1768,16 @@ PeerConnection::ice_connection_state() { > return ice_connection_state_; > } > >+PeerConnectionInterface::IceConnectionState >+PeerConnection::standardized_ice_connection_state() { >+ return standardized_ice_connection_state_; >+} >+ >+PeerConnectionInterface::PeerConnectionState >+PeerConnection::peer_connection_state() { >+ return connection_state_; >+} >+ > PeerConnectionInterface::IceGatheringState > PeerConnection::ice_gathering_state() { > return ice_gathering_state_; >@@ -1995,6 +2081,16 @@ RTCError PeerConnection::ApplyLocalDescription( > // |local_description()|. > RTC_DCHECK(local_description()); > >+ if (!is_caller_) { >+ if (remote_description()) { >+ // Remote description was applied first, so this PC is the callee. >+ is_caller_ = false; >+ } else { >+ // Local description is applied first, so this PC is the caller. 
>+ is_caller_ = true; >+ } >+ } >+ > RTCError error = PushdownTransportDescription(cricket::CS_LOCAL, type); > if (!error.ok()) { > return error; >@@ -2077,7 +2173,7 @@ RTCError PeerConnection::ApplyLocalDescription( > // If setting the description decided our SSL role, allocate any necessary > // SCTP sids. > rtc::SSLRole role; >- if (data_channel_type() == cricket::DCT_SCTP && GetSctpSslRole(&role)) { >+ if (DataChannel::IsSctpLike(data_channel_type_) && GetSctpSslRole(&role)) { > AllocateSctpSids(role); > } > >@@ -2352,7 +2448,7 @@ RTCError PeerConnection::ApplyRemoteDescription( > // If setting the description decided our SSL role, allocate any necessary > // SCTP sids. > rtc::SSLRole role; >- if (data_channel_type() == cricket::DCT_SCTP && GetSctpSslRole(&role)) { >+ if (DataChannel::IsSctpLike(data_channel_type_) && GetSctpSslRole(&role)) { > AllocateSctpSids(role); > } > >@@ -2646,11 +2742,11 @@ RTCError PeerConnection::UpdateTransceiverChannel( > const cricket::ContentGroup* bundle_group) { > RTC_DCHECK(IsUnifiedPlan()); > RTC_DCHECK(transceiver); >- cricket::BaseChannel* channel = transceiver->internal()->channel(); >+ cricket::ChannelInterface* channel = transceiver->internal()->channel(); > if (content.rejected) { > if (channel) { > transceiver->internal()->SetChannel(nullptr); >- DestroyBaseChannel(channel); >+ DestroyChannelInterface(channel); > } > } else { > if (!channel) { >@@ -2683,7 +2779,7 @@ RTCError PeerConnection::UpdateDataChannel( > if (content.rejected) { > DestroyDataChannel(); > } else { >- if (!rtp_data_channel_ && !sctp_transport_) { >+ if (!rtp_data_channel_ && !sctp_transport_ && !media_transport_) { > if (!CreateDataChannel(content.name)) { > LOG_AND_RETURN_ERROR(RTCErrorType::INTERNAL_ERROR, > "Failed to create data channel."); >@@ -2870,6 +2966,49 @@ bool PeerConnection::SetConfiguration(const RTCConfiguration& configuration, > return SafeSetError(RTCErrorType::INVALID_MODIFICATION, error); > } > >+ if (local_description() && >+ 
configuration.use_media_transport != configuration_.use_media_transport) { >+ RTC_LOG(LS_ERROR) << "Can't change media_transport after calling " >+ "SetLocalDescription."; >+ return SafeSetError(RTCErrorType::INVALID_MODIFICATION, error); >+ } >+ >+ if (remote_description() && >+ configuration.use_media_transport != configuration_.use_media_transport) { >+ RTC_LOG(LS_ERROR) << "Can't change media_transport after calling " >+ "SetRemoteDescription."; >+ return SafeSetError(RTCErrorType::INVALID_MODIFICATION, error); >+ } >+ >+ if (local_description() && >+ configuration.use_media_transport_for_data_channels != >+ configuration_.use_media_transport_for_data_channels) { >+ RTC_LOG(LS_ERROR) << "Can't change media_transport_for_data_channels " >+ "after calling SetLocalDescription."; >+ return SafeSetError(RTCErrorType::INVALID_MODIFICATION, error); >+ } >+ >+ if (remote_description() && >+ configuration.use_media_transport_for_data_channels != >+ configuration_.use_media_transport_for_data_channels) { >+ RTC_LOG(LS_ERROR) << "Can't change media_transport_for_data_channels " >+ "after calling SetRemoteDescription."; >+ return SafeSetError(RTCErrorType::INVALID_MODIFICATION, error); >+ } >+ >+ if (local_description() && >+ configuration.crypto_options != configuration_.crypto_options) { >+ RTC_LOG(LS_ERROR) << "Can't change crypto_options after calling " >+ "SetLocalDescription."; >+ return SafeSetError(RTCErrorType::INVALID_MODIFICATION, error); >+ } >+ >+ if (configuration.use_media_transport_for_data_channels || >+ configuration.use_media_transport) { >+ RTC_CHECK(configuration.bundle_policy == kBundlePolicyMaxBundle) >+ << "Media transport requires MaxBundle policy."; >+ } >+ > // The simplest (and most future-compatible) way to tell if the config was > // modified in an invalid way is to copy each property we do support > // modifying, then use operator==. 
There are far more properties we don't >@@ -2894,6 +3033,9 @@ bool PeerConnection::SetConfiguration(const RTCConfiguration& configuration, > modified_config.network_preference = configuration.network_preference; > modified_config.active_reset_srtp_params = > configuration.active_reset_srtp_params; >+ modified_config.use_media_transport = configuration.use_media_transport; >+ modified_config.use_media_transport_for_data_channels = >+ configuration.use_media_transport_for_data_channels; > if (configuration != modified_config) { > RTC_LOG(LS_ERROR) << "Modifying the configuration in an unsupported way."; > return SafeSetError(RTCErrorType::INVALID_MODIFICATION, error); >@@ -2951,6 +3093,11 @@ bool PeerConnection::SetConfiguration(const RTCConfiguration& configuration, > } > > transport_controller_->SetIceConfig(ParseIceConfig(modified_config)); >+ transport_controller_->SetMediaTransportFactory( >+ modified_config.use_media_transport || >+ modified_config.use_media_transport_for_data_channels >+ ? 
factory_->media_transport_factory() >+ : nullptr); > > if (configuration_.active_reset_srtp_params != > modified_config.active_reset_srtp_params) { >@@ -3119,7 +3266,7 @@ void PeerConnection::SetAudioPlayout(bool playout) { > return; > } > auto audio_state = >- factory_->channel_manager()->media_engine()->GetAudioState(); >+ factory_->channel_manager()->media_engine()->voice().GetAudioState(); > audio_state->SetPlayout(playout); > } > >@@ -3131,7 +3278,7 @@ void PeerConnection::SetAudioRecording(bool recording) { > return; > } > auto audio_state = >- factory_->channel_manager()->media_engine()->GetAudioState(); >+ factory_->channel_manager()->media_engine()->voice().GetAudioState(); > audio_state->SetRecording(recording); > } > >@@ -3141,7 +3288,7 @@ PeerConnection::GetRemoteAudioSSLCertificate() { > if (!chain || !chain->GetSize()) { > return nullptr; > } >- return chain->Get(0).GetUniqueReference(); >+ return chain->Get(0).Clone(); > } > > std::unique_ptr<rtc::SSLCertChain> >@@ -3170,9 +3317,13 @@ bool PeerConnection::StartRtcEventLog(rtc::PlatformFile file, > const size_t max_size = (max_size_bytes < 0) > ? RtcEventLog::kUnlimitedOutput > : rtc::saturated_cast<size_t>(max_size_bytes); >+ int64_t output_period_ms = webrtc::RtcEventLog::kImmediateOutput; >+ if (field_trial::IsEnabled("WebRTC-RtcEventLogNewFormat")) { >+ output_period_ms = 5000; >+ } > return StartRtcEventLog( > absl::make_unique<RtcEventLogOutputFile>(file, max_size), >- webrtc::RtcEventLog::kImmediateOutput); >+ output_period_ms); > } > > bool PeerConnection::StartRtcEventLog(std::unique_ptr<RtcEventLogOutput> output, >@@ -3347,7 +3498,7 @@ void PeerConnection::CreateAudioReceiver( > // the constructor taking stream IDs instead. 
> auto* audio_receiver = new AudioRtpReceiver( > worker_thread(), remote_sender_info.sender_id, streams); >- audio_receiver->SetVoiceMediaChannel(voice_media_channel()); >+ audio_receiver->SetMediaChannel(voice_media_channel()); > audio_receiver->SetupMediaChannel(remote_sender_info.first_ssrc); > auto receiver = RtpReceiverProxyWithInternal<RtpReceiverInternal>::Create( > signaling_thread(), audio_receiver); >@@ -3365,7 +3516,7 @@ void PeerConnection::CreateVideoReceiver( > // the constructor taking stream IDs instead. > auto* video_receiver = new VideoRtpReceiver( > worker_thread(), remote_sender_info.sender_id, streams); >- video_receiver->SetVideoMediaChannel(video_media_channel()); >+ video_receiver->SetMediaChannel(video_media_channel()); > video_receiver->SetupMediaChannel(remote_sender_info.first_ssrc); > auto receiver = RtpReceiverProxyWithInternal<RtpReceiverInternal>::Create( > signaling_thread(), video_receiver); >@@ -3408,7 +3559,7 @@ void PeerConnection::AddAudioTrack(AudioTrackInterface* track, > // Normal case; we've never seen this track before. > auto new_sender = CreateSender(cricket::MEDIA_TYPE_AUDIO, track->id(), track, > {stream->id()}, {}); >- new_sender->internal()->SetVoiceMediaChannel(voice_media_channel()); >+ new_sender->internal()->SetMediaChannel(voice_media_channel()); > GetAudioTransceiver()->internal()->AddSender(new_sender); > // If the sender has already been configured in SDP, we call SetSsrc, > // which will connect the sender to the underlying transport. This can >@@ -3453,7 +3604,7 @@ void PeerConnection::AddVideoTrack(VideoTrackInterface* track, > // Normal case; we've never seen this track before. 
> auto new_sender = CreateSender(cricket::MEDIA_TYPE_VIDEO, track->id(), track, > {stream->id()}, {}); >- new_sender->internal()->SetVideoMediaChannel(video_media_channel()); >+ new_sender->internal()->SetMediaChannel(video_media_channel()); > GetVideoTransceiver()->internal()->AddSender(new_sender); > const RtpSenderInfo* sender_info = > FindSenderInfo(local_video_sender_infos_, stream->id(), track->id()); >@@ -3495,6 +3646,29 @@ void PeerConnection::SetIceConnectionState(IceConnectionState new_state) { > Observer()->OnIceConnectionChange(ice_connection_state_); > } > >+void PeerConnection::SetStandardizedIceConnectionState( >+ PeerConnectionInterface::IceConnectionState new_state) { >+ RTC_DCHECK(signaling_thread()->IsCurrent()); >+ if (standardized_ice_connection_state_ == new_state) >+ return; >+ if (IsClosed()) >+ return; >+ standardized_ice_connection_state_ = new_state; >+ // TODO(jonasolsson): Pass this value on to OnIceConnectionChange instead of >+ // the old one once disconnects are handled properly. 
>+} >+ >+void PeerConnection::SetConnectionState( >+ PeerConnectionInterface::PeerConnectionState new_state) { >+ RTC_DCHECK(signaling_thread()->IsCurrent()); >+ if (connection_state_ == new_state) >+ return; >+ if (IsClosed()) >+ return; >+ connection_state_ = new_state; >+ Observer()->OnConnectionChange(new_state); >+} >+ > void PeerConnection::OnIceGatheringChange( > PeerConnectionInterface::IceGatheringState new_state) { > RTC_DCHECK(signaling_thread()->IsCurrent()); >@@ -3542,6 +3716,10 @@ void PeerConnection::ChangeSignalingState( > if (signaling_state == kClosed) { > ice_connection_state_ = kIceConnectionClosed; > Observer()->OnIceConnectionChange(ice_connection_state_); >+ standardized_ice_connection_state_ = >+ PeerConnectionInterface::IceConnectionState::kIceConnectionClosed; >+ connection_state_ = PeerConnectionInterface::PeerConnectionState::kClosed; >+ Observer()->OnConnectionChange(connection_state_); > if (ice_gathering_state_ != kIceGatheringComplete) { > ice_gathering_state_ = kIceGatheringComplete; > Observer()->OnIceGatheringChange(ice_gathering_state_); >@@ -3641,8 +3819,15 @@ void PeerConnection::GetOptionsForOffer( > } > > session_options->rtcp_cname = rtcp_cname_; >- session_options->crypto_options = factory_->options().crypto_options; >+ session_options->crypto_options = GetCryptoOptions(); > session_options->is_unified_plan = IsUnifiedPlan(); >+ session_options->pooled_ice_credentials = >+ network_thread()->Invoke<std::vector<cricket::IceParameters>>( >+ RTC_FROM_HERE, >+ rtc::Bind(&cricket::PortAllocator::GetPooledIceCredentials, >+ port_allocator_.get())); >+ session_options->offer_extmap_allow_mixed = >+ configuration_.offer_extmap_allow_mixed; > } > > void PeerConnection::GetOptionsForPlanBOffer( >@@ -3901,8 +4086,13 @@ void PeerConnection::GetOptionsForAnswer( > } > > session_options->rtcp_cname = rtcp_cname_; >- session_options->crypto_options = factory_->options().crypto_options; >+ session_options->crypto_options = 
GetCryptoOptions(); > session_options->is_unified_plan = IsUnifiedPlan(); >+ session_options->pooled_ice_credentials = >+ network_thread()->Invoke<std::vector<cricket::IceParameters>>( >+ RTC_FROM_HERE, >+ rtc::Bind(&cricket::PortAllocator::GetPooledIceCredentials, >+ port_allocator_.get())); > } > > void PeerConnection::GetOptionsForPlanBAnswer( >@@ -4069,6 +4259,8 @@ absl::optional<std::string> PeerConnection::GetDataMid() const { > return rtp_data_channel_->content_name(); > case cricket::DCT_SCTP: > return sctp_mid_; >+ case cricket::DCT_MEDIA_TRANSPORT: >+ return media_transport_data_mid_; > default: > return absl::nullopt; > } >@@ -4431,7 +4623,7 @@ rtc::scoped_refptr<DataChannel> PeerConnection::InternalCreateDataChannel( > } > InternalDataChannelInit new_config = > config ? (*config) : InternalDataChannelInit(); >- if (data_channel_type() == cricket::DCT_SCTP) { >+ if (DataChannel::IsSctpLike(data_channel_type_)) { > if (new_config.id < 0) { > rtc::SSLRole role; > if ((GetSctpSslRole(&role)) && >@@ -4462,7 +4654,7 @@ rtc::scoped_refptr<DataChannel> PeerConnection::InternalCreateDataChannel( > } > rtp_data_channels_[channel->label()] = channel; > } else { >- RTC_DCHECK(channel->data_channel_type() == cricket::DCT_SCTP); >+ RTC_DCHECK(DataChannel::IsSctpLike(data_channel_type_)); > sctp_data_channels_.push_back(channel); > channel->SignalClosed.connect(this, > &PeerConnection::OnSctpDataChannelClosed); >@@ -4542,6 +4734,27 @@ void PeerConnection::OnDataChannelOpenMessage( > NoteUsageEvent(UsageEvent::DATA_ADDED); > } > >+bool PeerConnection::HandleOpenMessage_s( >+ const cricket::ReceiveDataParams& params, >+ const rtc::CopyOnWriteBuffer& buffer) { >+ if (params.type == cricket::DMT_CONTROL && IsOpenMessage(buffer)) { >+ // Received OPEN message; parse and signal that a new data channel should >+ // be created. 
>+ std::string label; >+ InternalDataChannelInit config; >+ config.id = params.ssrc; >+ if (!ParseDataChannelOpenMessage(buffer, &label, &config)) { >+ RTC_LOG(LS_WARNING) << "Failed to parse the OPEN message for ssrc " >+ << params.ssrc; >+ return true; >+ } >+ config.open_handshake_role = InternalDataChannelInit::kAcker; >+ OnDataChannelOpenMessage(label, config); >+ return true; >+ } >+ return false; >+} >+ > rtc::scoped_refptr<RtpTransceiverProxyWithInternal<RtpTransceiver>> > PeerConnection::GetAudioTransceiver() const { > // This method only works with Plan B SDP, where there is a single >@@ -4769,10 +4982,10 @@ void PeerConnection::StopRtcEventLog_w() { > } > } > >-cricket::BaseChannel* PeerConnection::GetChannel( >+cricket::ChannelInterface* PeerConnection::GetChannel( > const std::string& content_name) { > for (auto transceiver : transceivers_) { >- cricket::BaseChannel* channel = transceiver->internal()->channel(); >+ cricket::ChannelInterface* channel = transceiver->internal()->channel(); > if (channel && channel->content_name() == content_name) { > return channel; > } >@@ -4785,19 +4998,25 @@ cricket::BaseChannel* PeerConnection::GetChannel( > } > > bool PeerConnection::GetSctpSslRole(rtc::SSLRole* role) { >+ RTC_DCHECK_RUN_ON(signaling_thread()); > if (!local_description() || !remote_description()) { > RTC_LOG(LS_INFO) > << "Local and Remote descriptions must be applied to get the " > "SSL Role of the SCTP transport."; > return false; > } >- if (!sctp_transport_) { >+ if (!sctp_transport_ && !media_transport_) { > RTC_LOG(LS_INFO) << "Non-rejected SCTP m= section is needed to get the " > "SSL Role of the SCTP transport."; > return false; > } > >- auto dtls_role = transport_controller_->GetDtlsRole(*sctp_mid_); >+ absl::optional<rtc::SSLRole> dtls_role; >+ if (sctp_mid_) { >+ dtls_role = transport_controller_->GetDtlsRole(*sctp_mid_); >+ } else if (is_caller_) { >+ dtls_role = *is_caller_ ? 
rtc::SSL_SERVER : rtc::SSL_CLIENT; >+ } > if (dtls_role) { > *role = *dtls_role; > return true; >@@ -4883,7 +5102,7 @@ RTCError PeerConnection::PushdownMediaDescription( > for (auto transceiver : transceivers_) { > const ContentInfo* content_info = > FindMediaSectionForTransceiver(transceiver, sdesc); >- cricket::BaseChannel* channel = transceiver->internal()->channel(); >+ cricket::ChannelInterface* channel = transceiver->internal()->channel(); > if (!channel || !content_info || content_info->rejected) { > continue; > } >@@ -5043,11 +5262,22 @@ bool PeerConnection::GetRemoteTrackIdBySsrc(uint32_t ssrc, > bool PeerConnection::SendData(const cricket::SendDataParams& params, > const rtc::CopyOnWriteBuffer& payload, > cricket::SendDataResult* result) { >- if (!rtp_data_channel_ && !sctp_transport_) { >- RTC_LOG(LS_ERROR) << "SendData called when rtp_data_channel_ " >- "and sctp_transport_ are NULL."; >+ if (!rtp_data_channel_ && !sctp_transport_ && !media_transport_) { >+ RTC_LOG(LS_ERROR) << "SendData called when rtp_data_channel_, " >+ "sctp_transport_, and media_transport_ are NULL."; > return false; > } >+ if (media_transport_) { >+ SendDataParams send_params; >+ send_params.type = ToWebrtcDataMessageType(params.type); >+ send_params.ordered = params.ordered; >+ if (params.max_rtx_count >= 0) { >+ send_params.max_rtx_count = params.max_rtx_count; >+ } else if (params.max_rtx_ms >= 0) { >+ send_params.max_rtx_ms = params.max_rtx_ms; >+ } >+ return media_transport_->SendData(params.sid, send_params, payload).ok(); >+ } > return rtp_data_channel_ > ? 
rtp_data_channel_->SendData(params, payload, result) > : network_thread()->Invoke<bool>( >@@ -5057,13 +5287,23 @@ bool PeerConnection::SendData(const cricket::SendDataParams& params, > } > > bool PeerConnection::ConnectDataChannel(DataChannel* webrtc_data_channel) { >- if (!rtp_data_channel_ && !sctp_transport_) { >+ RTC_DCHECK_RUN_ON(signaling_thread()); >+ if (!rtp_data_channel_ && !sctp_transport_ && !media_transport_) { > // Don't log an error here, because DataChannels are expected to call > // ConnectDataChannel in this state. It's the only way to initially tell > // whether or not the underlying transport is ready. > return false; > } >- if (rtp_data_channel_) { >+ if (media_transport_) { >+ SignalMediaTransportWritable_s.connect(webrtc_data_channel, >+ &DataChannel::OnChannelReady); >+ SignalMediaTransportReceivedData_s.connect(webrtc_data_channel, >+ &DataChannel::OnDataReceived); >+ SignalMediaTransportChannelClosing_s.connect( >+ webrtc_data_channel, &DataChannel::OnClosingProcedureStartedRemotely); >+ SignalMediaTransportChannelClosed_s.connect( >+ webrtc_data_channel, &DataChannel::OnClosingProcedureComplete); >+ } else if (rtp_data_channel_) { > rtp_data_channel_->SignalReadyToSendData.connect( > webrtc_data_channel, &DataChannel::OnChannelReady); > rtp_data_channel_->SignalDataReceived.connect(webrtc_data_channel, >@@ -5082,13 +5322,19 @@ bool PeerConnection::ConnectDataChannel(DataChannel* webrtc_data_channel) { > } > > void PeerConnection::DisconnectDataChannel(DataChannel* webrtc_data_channel) { >- if (!rtp_data_channel_ && !sctp_transport_) { >+ RTC_DCHECK_RUN_ON(signaling_thread()); >+ if (!rtp_data_channel_ && !sctp_transport_ && !media_transport_) { > RTC_LOG(LS_ERROR) > << "DisconnectDataChannel called when rtp_data_channel_ and " > "sctp_transport_ are NULL."; > return; > } >- if (rtp_data_channel_) { >+ if (media_transport_) { >+ SignalMediaTransportWritable_s.disconnect(webrtc_data_channel); >+ 
SignalMediaTransportReceivedData_s.disconnect(webrtc_data_channel); >+ SignalMediaTransportChannelClosing_s.disconnect(webrtc_data_channel); >+ SignalMediaTransportChannelClosed_s.disconnect(webrtc_data_channel); >+ } else if (rtp_data_channel_) { > rtp_data_channel_->SignalReadyToSendData.disconnect(webrtc_data_channel); > rtp_data_channel_->SignalDataReceived.disconnect(webrtc_data_channel); > } else { >@@ -5100,6 +5346,10 @@ void PeerConnection::DisconnectDataChannel(DataChannel* webrtc_data_channel) { > } > > void PeerConnection::AddSctpDataStream(int sid) { >+ if (media_transport_) { >+ // No-op. Media transport does not need to add streams. >+ return; >+ } > if (!sctp_transport_) { > RTC_LOG(LS_ERROR) > << "AddSctpDataStream called when sctp_transport_ is NULL."; >@@ -5111,6 +5361,10 @@ void PeerConnection::AddSctpDataStream(int sid) { > } > > void PeerConnection::RemoveSctpDataStream(int sid) { >+ if (media_transport_) { >+ media_transport_->CloseChannel(sid); >+ return; >+ } > if (!sctp_transport_) { > RTC_LOG(LS_ERROR) << "RemoveSctpDataStream called when sctp_transport_ is " > "NULL."; >@@ -5122,10 +5376,43 @@ void PeerConnection::RemoveSctpDataStream(int sid) { > } > > bool PeerConnection::ReadyToSendData() const { >+ RTC_DCHECK_RUN_ON(signaling_thread()); > return (rtp_data_channel_ && rtp_data_channel_->ready_to_send_data()) || >+ (media_transport_ && media_transport_ready_to_send_data_) || > sctp_ready_to_send_data_; > } > >+void PeerConnection::OnDataReceived(int channel_id, >+ DataMessageType type, >+ const rtc::CopyOnWriteBuffer& buffer) { >+ cricket::ReceiveDataParams params; >+ params.sid = channel_id; >+ params.type = ToCricketDataMessageType(type); >+ media_transport_invoker_->AsyncInvoke<void>( >+ RTC_FROM_HERE, signaling_thread(), [this, params, buffer] { >+ RTC_DCHECK_RUN_ON(signaling_thread()); >+ if (!HandleOpenMessage_s(params, buffer)) { >+ SignalMediaTransportReceivedData_s(params, buffer); >+ } >+ }); >+} >+ >+void 
PeerConnection::OnChannelClosing(int channel_id) { >+ media_transport_invoker_->AsyncInvoke<void>( >+ RTC_FROM_HERE, signaling_thread(), [this, channel_id] { >+ RTC_DCHECK_RUN_ON(signaling_thread()); >+ SignalMediaTransportChannelClosing_s(channel_id); >+ }); >+} >+ >+void PeerConnection::OnChannelClosed(int channel_id) { >+ media_transport_invoker_->AsyncInvoke<void>( >+ RTC_FROM_HERE, signaling_thread(), [this, channel_id] { >+ RTC_DCHECK_RUN_ON(signaling_thread()); >+ SignalMediaTransportChannelClosed_s(channel_id); >+ }); >+} >+ > absl::optional<std::string> PeerConnection::sctp_transport_name() const { > if (sctp_mid_ && transport_controller_) { > auto dtls_transport = transport_controller_->GetDtlsTransport(*sctp_mid_); >@@ -5150,7 +5437,7 @@ std::map<std::string, std::string> PeerConnection::GetTransportNamesByMid() > const { > std::map<std::string, std::string> transport_names_by_mid; > for (auto transceiver : transceivers_) { >- cricket::BaseChannel* channel = transceiver->internal()->channel(); >+ cricket::ChannelInterface* channel = transceiver->internal()->channel(); > if (channel) { > transport_names_by_mid[channel->content_name()] = > channel->transport_name(); >@@ -5327,7 +5614,7 @@ void PeerConnection::OnTransportControllerDtlsHandshakeError( > > void PeerConnection::EnableSending() { > for (auto transceiver : transceivers_) { >- cricket::BaseChannel* channel = transceiver->internal()->channel(); >+ cricket::ChannelInterface* channel = transceiver->internal()->channel(); > if (channel && !channel->enabled()) { > channel->Enable(true); > } >@@ -5486,7 +5773,7 @@ RTCError PeerConnection::CreateChannels(const SessionDescription& desc) { > > const cricket::ContentInfo* data = cricket::GetFirstDataContent(&desc); > if (data_channel_type_ != cricket::DCT_NONE && data && !data->rejected && >- !rtp_data_channel_ && !sctp_transport_) { >+ !rtp_data_channel_ && !sctp_transport_ && !media_transport_) { > if (!CreateDataChannel(data->name)) { > 
LOG_AND_RETURN_ERROR(RTCErrorType::INTERNAL_ERROR, > "Failed to create data channel."); >@@ -5499,13 +5786,16 @@ RTCError PeerConnection::CreateChannels(const SessionDescription& desc) { > // TODO(steveanton): Perhaps this should be managed by the RtpTransceiver. > cricket::VoiceChannel* PeerConnection::CreateVoiceChannel( > const std::string& mid) { >- RtpTransportInternal* rtp_transport = >- transport_controller_->GetRtpTransport(mid); >- RTC_DCHECK(rtp_transport); >+ RtpTransportInternal* rtp_transport = GetRtpTransport(mid); >+ MediaTransportInterface* media_transport = nullptr; >+ if (configuration_.use_media_transport) { >+ media_transport = GetMediaTransport(mid); >+ } >+ > cricket::VoiceChannel* voice_channel = channel_manager()->CreateVoiceChannel( >- call_.get(), configuration_.media_config, rtp_transport, >- signaling_thread(), mid, SrtpRequired(), >- factory_->options().crypto_options, audio_options_); >+ call_.get(), configuration_.media_config, rtp_transport, media_transport, >+ signaling_thread(), mid, SrtpRequired(), GetCryptoOptions(), >+ audio_options_); > if (!voice_channel) { > return nullptr; > } >@@ -5521,13 +5811,13 @@ cricket::VoiceChannel* PeerConnection::CreateVoiceChannel( > // TODO(steveanton): Perhaps this should be managed by the RtpTransceiver. > cricket::VideoChannel* PeerConnection::CreateVideoChannel( > const std::string& mid) { >- RtpTransportInternal* rtp_transport = >- transport_controller_->GetRtpTransport(mid); >- RTC_DCHECK(rtp_transport); >+ RtpTransportInternal* rtp_transport = GetRtpTransport(mid); >+ >+ // TODO(sukhanov): Propagate media_transport to video channel. 
> cricket::VideoChannel* video_channel = channel_manager()->CreateVideoChannel( > call_.get(), configuration_.media_config, rtp_transport, >- signaling_thread(), mid, SrtpRequired(), >- factory_->options().crypto_options, video_options_); >+ signaling_thread(), mid, SrtpRequired(), GetCryptoOptions(), >+ video_options_); > if (!video_channel) { > return nullptr; > } >@@ -5541,37 +5831,49 @@ cricket::VideoChannel* PeerConnection::CreateVideoChannel( > } > > bool PeerConnection::CreateDataChannel(const std::string& mid) { >- bool sctp = (data_channel_type_ == cricket::DCT_SCTP); >- if (sctp) { >- if (!sctp_factory_) { >- RTC_LOG(LS_ERROR) >- << "Trying to create SCTP transport, but didn't compile with " >- "SCTP support (HAVE_SCTP)"; >- return false; >- } >- if (!network_thread()->Invoke<bool>( >- RTC_FROM_HERE, >- rtc::Bind(&PeerConnection::CreateSctpTransport_n, this, mid))) { >- return false; >- } >- for (const auto& channel : sctp_data_channels_) { >- channel->OnTransportChannelCreated(); >- } >- } else { >- RtpTransportInternal* rtp_transport = >- transport_controller_->GetRtpTransport(mid); >- RTC_DCHECK(rtp_transport); >- rtp_data_channel_ = channel_manager()->CreateRtpDataChannel( >- configuration_.media_config, rtp_transport, signaling_thread(), mid, >- SrtpRequired(), factory_->options().crypto_options); >- if (!rtp_data_channel_) { >+ switch (data_channel_type_) { >+ case cricket::DCT_MEDIA_TRANSPORT: >+ if (network_thread()->Invoke<bool>( >+ RTC_FROM_HERE, >+ rtc::Bind(&PeerConnection::SetupMediaTransportForDataChannels_n, >+ this, mid))) { >+ for (const auto& channel : sctp_data_channels_) { >+ channel->OnTransportChannelCreated(); >+ } >+ return true; >+ } > return false; >- } >- rtp_data_channel_->SignalDtlsSrtpSetupFailure.connect( >- this, &PeerConnection::OnDtlsSrtpSetupFailure); >- rtp_data_channel_->SignalSentPacket.connect( >- this, &PeerConnection::OnSentPacket_w); >- rtp_data_channel_->SetRtpTransport(rtp_transport); >+ case cricket::DCT_SCTP: 
>+ if (!sctp_factory_) { >+ RTC_LOG(LS_ERROR) >+ << "Trying to create SCTP transport, but didn't compile with " >+ "SCTP support (HAVE_SCTP)"; >+ return false; >+ } >+ if (!network_thread()->Invoke<bool>( >+ RTC_FROM_HERE, >+ rtc::Bind(&PeerConnection::CreateSctpTransport_n, this, mid))) { >+ return false; >+ } >+ for (const auto& channel : sctp_data_channels_) { >+ channel->OnTransportChannelCreated(); >+ } >+ return true; >+ case cricket::DCT_RTP: >+ default: >+ RtpTransportInternal* rtp_transport = GetRtpTransport(mid); >+ rtp_data_channel_ = channel_manager()->CreateRtpDataChannel( >+ configuration_.media_config, rtp_transport, signaling_thread(), mid, >+ SrtpRequired(), GetCryptoOptions()); >+ if (!rtp_data_channel_) { >+ return false; >+ } >+ rtp_data_channel_->SignalDtlsSrtpSetupFailure.connect( >+ this, &PeerConnection::OnDtlsSrtpSetupFailure); >+ rtp_data_channel_->SignalSentPacket.connect( >+ this, &PeerConnection::OnSentPacket_w); >+ rtp_data_channel_->SetRtpTransport(rtp_transport); >+ return true; > } > > return true; >@@ -5661,22 +5963,8 @@ void PeerConnection::OnSctpTransportDataReceived_n( > void PeerConnection::OnSctpTransportDataReceived_s( > const cricket::ReceiveDataParams& params, > const rtc::CopyOnWriteBuffer& payload) { >- RTC_DCHECK(signaling_thread()->IsCurrent()); >- if (params.type == cricket::DMT_CONTROL && IsOpenMessage(payload)) { >- // Received OPEN message; parse and signal that a new data channel should >- // be created. >- std::string label; >- InternalDataChannelInit config; >- config.id = params.ssrc; >- if (!ParseDataChannelOpenMessage(payload, &label, &config)) { >- RTC_LOG(LS_WARNING) << "Failed to parse the OPEN message for sid " >- << params.ssrc; >- return; >- } >- config.open_handshake_role = InternalDataChannelInit::kAcker; >- OnDataChannelOpenMessage(label, config); >- } else { >- // Otherwise just forward the signal. 
>+ RTC_DCHECK_RUN_ON(signaling_thread()); >+ if (!HandleOpenMessage_s(params, payload)) { > SignalSctpDataReceived(params, payload); > } > } >@@ -5699,6 +5987,49 @@ void PeerConnection::OnSctpClosingProcedureComplete_n(int sid) { > &SignalSctpClosingProcedureComplete, sid)); > } > >+bool PeerConnection::SetupMediaTransportForDataChannels_n( >+ const std::string& mid) { >+ media_transport_ = transport_controller_->GetMediaTransport(mid); >+ if (!media_transport_) { >+ RTC_LOG(LS_ERROR) << "Media transport is not available for data channels"; >+ return false; >+ } >+ >+ media_transport_invoker_ = absl::make_unique<rtc::AsyncInvoker>(); >+ media_transport_->SetDataSink(this); >+ media_transport_data_mid_ = mid; >+ transport_controller_->SignalMediaTransportStateChanged.connect( >+ this, &PeerConnection::OnMediaTransportStateChanged_n); >+ // Check the initial state right away, in case transport is already writable. >+ OnMediaTransportStateChanged_n(); >+ return true; >+} >+ >+void PeerConnection::TeardownMediaTransportForDataChannels_n() { >+ if (!media_transport_) { >+ return; >+ } >+ transport_controller_->SignalMediaTransportStateChanged.disconnect(this); >+ media_transport_data_mid_.reset(); >+ media_transport_->SetDataSink(nullptr); >+ media_transport_invoker_ = nullptr; >+ media_transport_ = nullptr; >+} >+ >+void PeerConnection::OnMediaTransportStateChanged_n() { >+ if (!media_transport_data_mid_ || >+ transport_controller_->GetMediaTransportState( >+ *media_transport_data_mid_) != MediaTransportState::kWritable) { >+ return; >+ } >+ media_transport_invoker_->AsyncInvoke<void>( >+ RTC_FROM_HERE, signaling_thread(), [this] { >+ RTC_DCHECK_RUN_ON(signaling_thread()); >+ media_transport_ready_to_send_data_ = true; >+ SignalMediaTransportWritable_s(media_transport_ready_to_send_data_); >+ }); >+} >+ > // Returns false if bundle is enabled and rtcp_mux is disabled. 
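[Reviewer note] Both the media-transport OnDataReceived() path and OnSctpTransportDataReceived_s() above now funnel through the new HandleOpenMessage_s(), which consumes data-channel OPEN control messages and lets everything else fall through to the data-received signal. A minimal standalone sketch of that handle-or-forward shape — the names and the "OPEN:" encoding here are hypothetical simplifications, not the real ParseDataChannelOpenMessage() wire format:

```cpp
#include <string>
#include <vector>

// Hypothetical stand-in for HandleOpenMessage_s(): returns true only when the
// payload is an OPEN control message, recording the new channel label as a
// side effect (the real code creates a DataChannel).
bool HandleOpenMessage(const std::string& payload,
                       std::vector<std::string>* opened_labels) {
  const std::string kPrefix = "OPEN:";  // illustrative, not the SCTP wire format
  if (payload.rfind(kPrefix, 0) != 0) {
    return false;  // not an OPEN message; the caller should forward it
  }
  opened_labels->push_back(payload.substr(kPrefix.size()));
  return true;
}

// Mirrors the call sites in the patch: try the OPEN handler first, and only
// forward the payload (SignalSctpDataReceived /
// SignalMediaTransportReceivedData_s in the real code) when it was not consumed.
void OnDataReceived(const std::string& payload,
                    std::vector<std::string>* opened_labels,
                    std::vector<std::string>* forwarded) {
  if (!HandleOpenMessage(payload, opened_labels)) {
    forwarded->push_back(payload);
  }
}
```

The point of the refactor is that both transports share one OPEN-message policy instead of duplicating the parse-and-create logic per backend.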
> bool PeerConnection::ValidateBundleSettings(const SessionDescription* desc) { > bool bundle_enabled = desc->HasGroup(cricket::GROUP_TYPE_BUNDLE); >@@ -6169,7 +6500,7 @@ void PeerConnection::OnSentPacket_w(const rtc::SentPacket& sent_packet) { > > const std::string PeerConnection::GetTransportName( > const std::string& content_name) { >- cricket::BaseChannel* channel = GetChannel(content_name); >+ cricket::ChannelInterface* channel = GetChannel(content_name); > if (channel) { > return channel->transport_name(); > } >@@ -6188,17 +6519,17 @@ void PeerConnection::DestroyTransceiverChannel( > transceiver) { > RTC_DCHECK(transceiver); > >- cricket::BaseChannel* channel = transceiver->internal()->channel(); >+ cricket::ChannelInterface* channel = transceiver->internal()->channel(); > if (channel) { > transceiver->internal()->SetChannel(nullptr); >- DestroyBaseChannel(channel); >+ DestroyChannelInterface(channel); > } > } > > void PeerConnection::DestroyDataChannel() { > if (rtp_data_channel_) { > OnDataChannelDestroyed(); >- DestroyBaseChannel(rtp_data_channel_); >+ DestroyChannelInterface(rtp_data_channel_); > rtp_data_channel_ = nullptr; > } > >@@ -6213,9 +6544,18 @@ void PeerConnection::DestroyDataChannel() { > network_thread()->Invoke<void>(RTC_FROM_HERE, > [this] { DestroySctpTransport_n(); }); > } >+ >+ if (media_transport_) { >+ OnDataChannelDestroyed(); >+ network_thread()->Invoke<void>(RTC_FROM_HERE, [this] { >+ RTC_DCHECK_RUN_ON(network_thread()); >+ TeardownMediaTransportForDataChannels_n(); >+ }); >+ } > } > >-void PeerConnection::DestroyBaseChannel(cricket::BaseChannel* channel) { >+void PeerConnection::DestroyChannelInterface( >+ cricket::ChannelInterface* channel) { > RTC_DCHECK(channel); > switch (channel->media_type()) { > case cricket::MEDIA_TYPE_AUDIO: >@@ -6239,7 +6579,8 @@ void PeerConnection::DestroyBaseChannel(cricket::BaseChannel* channel) { > bool PeerConnection::OnTransportChanged( > const std::string& mid, > RtpTransportInternal* 
rtp_transport, >- cricket::DtlsTransportInternal* dtls_transport) { >+ cricket::DtlsTransportInternal* dtls_transport, >+ MediaTransportInterface* media_transport) { > bool ret = true; > auto base_channel = GetChannel(mid); > if (base_channel) { >@@ -6248,6 +6589,9 @@ bool PeerConnection::OnTransportChanged( > if (sctp_transport_ && mid == sctp_mid_) { > sctp_transport_->SetDtlsTransport(dtls_transport); > } >+ >+ call_->MediaTransportChange(media_transport); >+ > return ret; > } > >@@ -6261,6 +6605,14 @@ PeerConnectionObserver* PeerConnection::Observer() const { > return observer_; > } > >+CryptoOptions PeerConnection::GetCryptoOptions() { >+ // TODO(bugs.webrtc.org/9891) - Remove PeerConnectionFactory::CryptoOptions >+ // after it has been removed. >+ return configuration_.crypto_options.has_value() >+ ? *configuration_.crypto_options >+ : factory_->options().crypto_options; >+} >+ > void PeerConnection::ClearStatsCache() { > if (stats_collector_) { > stats_collector_->ClearCachedStatsReport(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection.h >index 37c9d73bd46c881c91c5abdda27a97a8aadac677..7e97afab7c605395db0c9b404463682e515d376b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection.h >@@ -17,6 +17,7 @@ > #include <string> > #include <vector> > >+#include "api/media_transport_interface.h" > #include "api/peerconnectioninterface.h" > #include "api/turncustomizer.h" > #include "pc/iceserverparsing.h" >@@ -51,6 +52,7 @@ class RtcEventLog; > // - Generating stats. 
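[Reviewer note] The new PeerConnection::GetCryptoOptions() above gives RTCConfiguration.crypto_options precedence over the factory-wide options, which is what lets the call sites in CreateVoiceChannel()/CreateVideoChannel()/CreateDataChannel() drop their direct use of factory_->options(). The precedence rule condenses to optional-with-fallback; a sketch using std::optional and a hypothetical one-field CryptoOptions:

```cpp
#include <optional>

struct CryptoOptions {  // reduced to a single field for illustration
  bool enable_gcm_crypto_suites = false;
};

// Same precedence as PeerConnection::GetCryptoOptions(): a value set on the
// RTCConfiguration wins; otherwise fall back to the PeerConnectionFactory
// options.
CryptoOptions EffectiveCryptoOptions(
    const std::optional<CryptoOptions>& configuration_crypto_options,
    const CryptoOptions& factory_crypto_options) {
  return configuration_crypto_options.value_or(factory_crypto_options);
}
```

This is the behavior the new RTCConfigurationCryptoOptionOverridesFactory test below exercises: factory enables GCM suites, the configuration disables them, and the configuration value is expected to win.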
> class PeerConnection : public PeerConnectionInternal, > public DataChannelProviderInterface, >+ public DataChannelSink, > public JsepTransportController::Observer, > public rtc::MessageHandler, > public sigslot::has_slots<> { >@@ -145,6 +147,8 @@ class PeerConnection : public PeerConnectionInternal, > SignalingState signaling_state() override; > > IceConnectionState ice_connection_state() override; >+ IceConnectionState standardized_ice_connection_state(); >+ PeerConnectionState peer_connection_state() override; > IceGatheringState ice_gathering_state() override; > > const SessionDescriptionInterface* local_description() const override; >@@ -372,6 +376,11 @@ class PeerConnection : public PeerConnectionInternal, > receiver); > > void SetIceConnectionState(IceConnectionState new_state); >+ void SetStandardizedIceConnectionState( >+ PeerConnectionInterface::IceConnectionState new_state); >+ void SetConnectionState( >+ PeerConnectionInterface::PeerConnectionState new_state); >+ > // Called any time the IceGatheringState changes > void OnIceGatheringChange(IceGatheringState new_state); > // New ICE candidate has been gathered. >@@ -624,6 +633,11 @@ class PeerConnection : public PeerConnectionInternal, > // Called when a valid data channel OPEN message is received. > void OnDataChannelOpenMessage(const std::string& label, > const InternalDataChannelInit& config); >+ // Parses and handles open messages. Returns true if the message is an open >+ // message, false otherwise. 
>+ bool HandleOpenMessage_s(const cricket::ReceiveDataParams& params, >+ const rtc::CopyOnWriteBuffer& buffer) >+ RTC_RUN_ON(signaling_thread()); > > // Returns true if the PeerConnection is configured to use Unified Plan > // semantics for creating offers/answers and setting local/remote >@@ -705,7 +719,7 @@ class PeerConnection : public PeerConnectionInternal, > SessionError session_error() const { return session_error_; } > const std::string& session_error_desc() const { return session_error_desc_; } > >- cricket::BaseChannel* GetChannel(const std::string& content_name); >+ cricket::ChannelInterface* GetChannel(const std::string& content_name); > > // Get current SSL role used by SCTP's underlying transport. > bool GetSctpSslRole(rtc::SSLRole* role); >@@ -725,6 +739,13 @@ class PeerConnection : public PeerConnectionInternal, > > cricket::DataChannelType data_channel_type() const; > >+ // Implements DataChannelSink. >+ void OnDataReceived(int channel_id, >+ DataMessageType type, >+ const rtc::CopyOnWriteBuffer& buffer) override; >+ void OnChannelClosing(int channel_id) override; >+ void OnChannelClosed(int channel_id) override; >+ > // Called when an RTCCertificate is generated or retrieved by > // WebRTCSessionDescriptionFactory. Should happen before setLocalDescription. > void OnCertificateReady( >@@ -822,6 +843,11 @@ class PeerConnection : public PeerConnectionInternal, > void OnSctpClosingProcedureStartedRemotely_n(int sid); > void OnSctpClosingProcedureComplete_n(int sid); > >+ bool SetupMediaTransportForDataChannels_n(const std::string& mid) >+ RTC_RUN_ON(network_thread()); >+ void OnMediaTransportStateChanged_n() RTC_RUN_ON(network_thread()); >+ void TeardownMediaTransportForDataChannels_n() RTC_RUN_ON(network_thread()); >+ > bool ValidateBundleSettings(const cricket::SessionDescription* desc); > bool HasRtcpMuxEnabled(const cricket::ContentInfo* content); > // Below methods are helper methods which verifies SDP. 
>@@ -896,9 +922,9 @@ class PeerConnection : public PeerConnectionInternal, > // Destroys the RTP data channel and/or the SCTP data channel and clears it. > void DestroyDataChannel(); > >- // Destroys the given BaseChannel. The channel cannot be accessed after this >- // method is called. >- void DestroyBaseChannel(cricket::BaseChannel* channel); >+ // Destroys the given ChannelInterface. >+ // The channel cannot be accessed after this method is called. >+ void DestroyChannelInterface(cricket::ChannelInterface* channel); > > // JsepTransportController::Observer override. > // >@@ -906,14 +932,41 @@ class PeerConnection : public PeerConnectionInternal, > // from a session description, and the mapping from m= sections to transports > // changed (as a result of BUNDLE negotiation, or m= sections being > // rejected). >- bool OnTransportChanged( >- const std::string& mid, >- RtpTransportInternal* rtp_transport, >- cricket::DtlsTransportInternal* dtls_transport) override; >+ bool OnTransportChanged(const std::string& mid, >+ RtpTransportInternal* rtp_transport, >+ cricket::DtlsTransportInternal* dtls_transport, >+ MediaTransportInterface* media_transport) override; > > // Returns the observer. Will crash on CHECK if the observer is removed. > PeerConnectionObserver* Observer() const; > >+ // Returns the CryptoOptions for this PeerConnection. This will always >+ // return the RTCConfiguration.crypto_options if set and will only default >+ // back to the PeerConnectionFactory settings if nothing was set. >+ CryptoOptions GetCryptoOptions(); >+ >+ // Returns rtp transport, result can not be nullptr. >+ RtpTransportInternal* GetRtpTransport(const std::string& mid) { >+ auto rtp_transport = transport_controller_->GetRtpTransport(mid); >+ RTC_DCHECK(rtp_transport); >+ return rtp_transport; >+ } >+ >+ // Returns media transport, if PeerConnection was created with configuration >+ // to use media transport. Otherwise returns nullptr. 
>+ MediaTransportInterface* GetMediaTransport(const std::string& mid) { >+ auto media_transport = transport_controller_->GetMediaTransport(mid); >+ RTC_DCHECK((configuration_.use_media_transport || >+ configuration_.use_media_transport_for_data_channels) == >+ (media_transport != nullptr)) >+ << "configuration_.use_media_transport=" >+ << configuration_.use_media_transport >+ << ", configuration_.use_media_transport_for_data_channels=" >+ << configuration_.use_media_transport_for_data_channels >+ << ", (media_transport != nullptr)=" << (media_transport != nullptr); >+ return media_transport; >+ } >+ > sigslot::signal1<DataChannel*> SignalDataChannelCreated_; > > // Storing the factory as a scoped reference pointer ensures that the memory >@@ -930,6 +983,11 @@ class PeerConnection : public PeerConnectionInternal, > > SignalingState signaling_state_ = kStable; > IceConnectionState ice_connection_state_ = kIceConnectionNew; >+ PeerConnectionInterface::IceConnectionState >+ standardized_ice_connection_state_ = kIceConnectionNew; >+ PeerConnectionInterface::PeerConnectionState connection_state_ = >+ PeerConnectionState::kNew; >+ > IceGatheringState ice_gathering_state_ = kIceGatheringNew; > PeerConnectionInterface::RTCConfiguration configuration_; > >@@ -1010,6 +1068,33 @@ class PeerConnection : public PeerConnectionInternal, > sigslot::signal1<int> SignalSctpClosingProcedureStartedRemotely; > sigslot::signal1<int> SignalSctpClosingProcedureComplete; > >+ // Whether this peer is the caller. Set when the local description is applied. >+ absl::optional<bool> is_caller_ RTC_GUARDED_BY(signaling_thread()); >+ >+ // Content name (MID) for media transport data channels in SDP. >+ absl::optional<std::string> media_transport_data_mid_; >+ >+ // Media transport used for data channels. Thread-safe. >+ MediaTransportInterface* media_transport_ = nullptr; >+ >+ // Cached value of whether the media transport is ready to send. 
>+ bool media_transport_ready_to_send_data_ RTC_GUARDED_BY(signaling_thread()) = >+ false; >+ >+ // Used to invoke media transport signals on the signaling thread. >+ std::unique_ptr<rtc::AsyncInvoker> media_transport_invoker_; >+ >+ // Identical to the signals for SCTP, but from media transport: >+ sigslot::signal1<bool> SignalMediaTransportWritable_s >+ RTC_GUARDED_BY(signaling_thread()); >+ sigslot::signal2<const cricket::ReceiveDataParams&, >+ const rtc::CopyOnWriteBuffer&> >+ SignalMediaTransportReceivedData_s RTC_GUARDED_BY(signaling_thread()); >+ sigslot::signal1<int> SignalMediaTransportChannelClosing_s >+ RTC_GUARDED_BY(signaling_thread()); >+ sigslot::signal1<int> SignalMediaTransportChannelClosed_s >+ RTC_GUARDED_BY(signaling_thread()); >+ > std::unique_ptr<SessionDescriptionInterface> current_local_description_; > std::unique_ptr<SessionDescriptionInterface> pending_local_description_; > std::unique_ptr<SessionDescriptionInterface> current_remote_description_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_bundle_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_bundle_unittest.cc >index 907669b5ad30c4866aeaac83b3c626b4076f2c42..6c372b4b8982b8289921c75c07a2a69bfa2e6df4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_bundle_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_bundle_unittest.cc >@@ -10,6 +10,7 @@ > > #include "api/audio_codecs/builtin_audio_decoder_factory.h" > #include "api/audio_codecs/builtin_audio_encoder_factory.h" >+#include "api/create_peerconnection_factory.h" > #include "api/peerconnectionproxy.h" > #include "api/video_codecs/builtin_video_decoder_factory.h" > #include "api/video_codecs/builtin_video_encoder_factory.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_crypto_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_crypto_unittest.cc >index 
1ccfe5560f09509f5143c60c4f87f484a4b5b033..6f7e23b94f5ad350534a3ed71432ad5ed1e5cee1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_crypto_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_crypto_unittest.cc >@@ -10,6 +10,7 @@ > > #include "api/audio_codecs/builtin_audio_decoder_factory.h" > #include "api/audio_codecs/builtin_audio_encoder_factory.h" >+#include "api/create_peerconnection_factory.h" > #include "api/video_codecs/builtin_video_decoder_factory.h" > #include "api/video_codecs/builtin_video_encoder_factory.h" > #include "p2p/base/fakeportallocator.h" >@@ -76,6 +77,7 @@ class PeerConnectionCryptoBaseTest : public ::testing::Test { > return nullptr; > } > >+ observer->SetPeerConnectionInterface(pc.get()); > return absl::make_unique<PeerConnectionWrapper>(pc_factory_, pc, > std::move(observer)); > } >@@ -278,13 +280,35 @@ TEST_P(PeerConnectionCryptoTest, CorrectCryptoInAnswerWhenEncryptionDisabled) { > answer->description())); > } > >+// CryptoOptions has been promoted to RTCConfiguration. As such if it is ever >+// set in the configuration it should override the settings set in the factory. 
>+TEST_P(PeerConnectionCryptoTest, RTCConfigurationCryptoOptionOverridesFactory) { >+ PeerConnectionFactoryInterface::Options options; >+ options.crypto_options.srtp.enable_gcm_crypto_suites = true; >+ pc_factory_->SetOptions(options); >+ >+ RTCConfiguration config; >+ config.enable_dtls_srtp.emplace(false); >+ CryptoOptions crypto_options; >+ crypto_options.srtp.enable_gcm_crypto_suites = false; >+ config.crypto_options = crypto_options; >+ auto caller = CreatePeerConnectionWithAudioVideo(config); >+ >+ auto offer = caller->CreateOffer(); >+ ASSERT_TRUE(offer); >+ >+ ASSERT_FALSE(offer->description()->contents().empty()); >+ // This should exist if GCM is enabled see CorrectCryptoInOfferWithSdesAndGcm >+ EXPECT_FALSE(SdpContentsAll(HaveSdesGcmCryptos(3), offer->description())); >+} >+ > // When DTLS is disabled and GCM cipher suites are enabled, the SDP offer/answer > // should have the correct ciphers in the SDES crypto options. > // With GCM cipher suites enabled, there will be 3 cryptos in the offer and 1 > // in the answer. 
> TEST_P(PeerConnectionCryptoTest, CorrectCryptoInOfferWithSdesAndGcm) { > PeerConnectionFactoryInterface::Options options; >- options.crypto_options.enable_gcm_crypto_suites = true; >+ options.crypto_options.srtp.enable_gcm_crypto_suites = true; > pc_factory_->SetOptions(options); > > RTCConfiguration config; >@@ -297,9 +321,10 @@ TEST_P(PeerConnectionCryptoTest, CorrectCryptoInOfferWithSdesAndGcm) { > ASSERT_FALSE(offer->description()->contents().empty()); > EXPECT_TRUE(SdpContentsAll(HaveSdesGcmCryptos(3), offer->description())); > } >+ > TEST_P(PeerConnectionCryptoTest, CorrectCryptoInAnswerWithSdesAndGcm) { > PeerConnectionFactoryInterface::Options options; >- options.crypto_options.enable_gcm_crypto_suites = true; >+ options.crypto_options.srtp.enable_gcm_crypto_suites = true; > pc_factory_->SetOptions(options); > > RTCConfiguration config; >@@ -317,7 +342,7 @@ TEST_P(PeerConnectionCryptoTest, CorrectCryptoInAnswerWithSdesAndGcm) { > > TEST_P(PeerConnectionCryptoTest, CanSetSdesGcmRemoteOfferAndLocalAnswer) { > PeerConnectionFactoryInterface::Options options; >- options.crypto_options.enable_gcm_crypto_suites = true; >+ options.crypto_options.srtp.enable_gcm_crypto_suites = true; > pc_factory_->SetOptions(options); > > RTCConfiguration config; >@@ -703,8 +728,8 @@ TEST_P(PeerConnectionCryptoTest, SessionErrorIfFingerprintInvalid) { > invalid_answer->description()->GetTransportInfoByName( > audio_content->name); > ASSERT_TRUE(audio_transport_info); >- audio_transport_info->description.identity_fingerprint.reset( >- rtc::SSLFingerprint::CreateFromCertificate(other_certificate)); >+ audio_transport_info->description.identity_fingerprint = >+ rtc::SSLFingerprint::CreateFromCertificate(*other_certificate); > > // Set the invalid answer and expect a fingerprint error. 
> std::string error; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_datachannel_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_datachannel_unittest.cc >index 033a609f98adab2391119ce96dfc715eab6651e1..cfb5dde22af3cbe0487d46e9cf9c932351c04879 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_datachannel_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_datachannel_unittest.cc >@@ -11,6 +11,7 @@ > #include <tuple> > > #include "api/peerconnectionproxy.h" >+#include "api/test/fake_media_transport.h" > #include "media/base/fakemediaengine.h" > #include "pc/mediasession.h" > #include "pc/peerconnection.h" >@@ -31,17 +32,39 @@ using RTCConfiguration = PeerConnectionInterface::RTCConfiguration; > using RTCOfferAnswerOptions = PeerConnectionInterface::RTCOfferAnswerOptions; > using ::testing::Values; > >+namespace { >+ >+PeerConnectionFactoryDependencies CreatePeerConnectionFactoryDependencies( >+ rtc::Thread* network_thread, >+ rtc::Thread* worker_thread, >+ rtc::Thread* signaling_thread, >+ std::unique_ptr<cricket::MediaEngineInterface> media_engine, >+ std::unique_ptr<CallFactoryInterface> call_factory, >+ std::unique_ptr<MediaTransportFactory> media_transport_factory) { >+ PeerConnectionFactoryDependencies deps; >+ deps.network_thread = network_thread; >+ deps.worker_thread = worker_thread; >+ deps.signaling_thread = signaling_thread; >+ deps.media_engine = std::move(media_engine); >+ deps.call_factory = std::move(call_factory); >+ deps.media_transport_factory = std::move(media_transport_factory); >+ return deps; >+} >+ >+} // namespace >+ > class PeerConnectionFactoryForDataChannelTest > : public rtc::RefCountedObject<PeerConnectionFactory> { > public: > PeerConnectionFactoryForDataChannelTest() > : rtc::RefCountedObject<PeerConnectionFactory>( >- rtc::Thread::Current(), >- rtc::Thread::Current(), >- rtc::Thread::Current(), >- 
absl::make_unique<cricket::FakeMediaEngine>(), >- CreateCallFactory(), >- nullptr) {} >+ CreatePeerConnectionFactoryDependencies( >+ rtc::Thread::Current(), >+ rtc::Thread::Current(), >+ rtc::Thread::Current(), >+ absl::make_unique<cricket::FakeMediaEngine>(), >+ CreateCallFactory(), >+ absl::make_unique<FakeMediaTransportFactory>())) {} > > std::unique_ptr<cricket::SctpTransportInternalFactory> > CreateSctpTransportInternalFactory() { >@@ -123,6 +146,7 @@ class PeerConnectionDataChannelBaseTest : public ::testing::Test { > return nullptr; > } > >+ observer->SetPeerConnectionInterface(pc.get()); > auto wrapper = absl::make_unique<PeerConnectionWrapperForDataChannelTest>( > pc_factory, pc, std::move(observer)); > RTC_DCHECK(pc_factory->last_fake_sctp_transport_factory_); >@@ -323,6 +347,52 @@ TEST_P(PeerConnectionDataChannelTest, SctpPortPropagatedFromSdpToTransport) { > EXPECT_EQ(kNewRecvPort, callee_transport->local_port()); > } > >+TEST_P(PeerConnectionDataChannelTest, >+ NoSctpTransportCreatedIfMediaTransportDataChannelsEnabled) { >+ RTCConfiguration config; >+ config.use_media_transport_for_data_channels = true; >+ config.enable_dtls_srtp = false; // SDES is required to use media transport. >+ auto caller = CreatePeerConnectionWithDataChannel(config); >+ >+ ASSERT_TRUE(caller->SetLocalDescription(caller->CreateOffer())); >+ EXPECT_FALSE(caller->sctp_transport_factory()->last_fake_sctp_transport()); >+} >+ >+TEST_P(PeerConnectionDataChannelTest, >+ MediaTransportDataChannelCreatedEvenIfSctpAvailable) { >+ RTCConfiguration config; >+ config.use_media_transport_for_data_channels = true; >+ config.enable_dtls_srtp = false; // SDES is required to use media transport. 
>+ PeerConnectionFactoryInterface::Options options; >+ options.disable_sctp_data_channels = false; >+ auto caller = CreatePeerConnectionWithDataChannel(config, options); >+ >+ ASSERT_TRUE(caller->SetLocalDescription(caller->CreateOffer())); >+ EXPECT_FALSE(caller->sctp_transport_factory()->last_fake_sctp_transport()); >+} >+ >+TEST_P(PeerConnectionDataChannelTest, >+ CannotEnableBothMediaTransportAndRtpDataChannels) { >+ RTCConfiguration config; >+ config.enable_rtp_data_channel = true; >+ config.use_media_transport_for_data_channels = true; >+ config.enable_dtls_srtp = false; // SDES is required to use media transport. >+ EXPECT_EQ(CreatePeerConnection(config), nullptr); >+} >+ >+TEST_P(PeerConnectionDataChannelTest, >+ MediaTransportDataChannelFailsWithoutSdes) { >+ RTCConfiguration config; >+ config.use_media_transport_for_data_channels = true; >+ config.enable_dtls_srtp = true; // Disables SDES for data sections. >+ auto caller = CreatePeerConnectionWithDataChannel(config); >+ >+ std::string error; >+ ASSERT_FALSE(caller->SetLocalDescription(caller->CreateOffer(), &error)); >+ EXPECT_EQ(error, >+ "Failed to set local offer sdp: Failed to create data channel."); >+} >+ > INSTANTIATE_TEST_CASE_P(PeerConnectionDataChannelTest, > PeerConnectionDataChannelTest, > Values(SdpSemantics::kPlanB, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_histogram_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_histogram_unittest.cc >index b5c07f49ec411b0680ca13bdd750ba86e044795e..497e33a688c65db49107768a2c9b2c9977c096da 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_histogram_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_histogram_unittest.cc >@@ -251,6 +251,7 @@ class PeerConnectionUsageHistogramTest : public ::testing::Test { > return nullptr; > } > >+ observer->SetPeerConnectionInterface(pc.get()); > auto wrapper = > 
absl::make_unique<PeerConnectionWrapperForUsageHistogramTest>( > pc_factory, pc, std::move(observer)); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_ice_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_ice_unittest.cc >index 965956c42a3241b1180e5db8735814ede36f5375..928549f3800f2564fe9b674c2d4c6f805ef0aa92 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_ice_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_ice_unittest.cc >@@ -21,7 +21,9 @@ > #include "absl/memory/memory.h" > #include "api/audio_codecs/builtin_audio_decoder_factory.h" > #include "api/audio_codecs/builtin_audio_encoder_factory.h" >+#include "api/create_peerconnection_factory.h" > #include "api/peerconnectionproxy.h" >+#include "api/umametrics.h" > #include "api/video_codecs/builtin_video_decoder_factory.h" > #include "api/video_codecs/builtin_video_encoder_factory.h" > #include "pc/test/fakeaudiocapturemodule.h" >@@ -77,6 +79,9 @@ class PeerConnectionWrapperForIceTest : public PeerConnectionWrapper { > > void set_network(rtc::FakeNetworkManager* network) { network_ = network; } > >+ // The port allocator used by this PC. 
>+ cricket::PortAllocator* port_allocator_; >+ > private: > rtc::FakeNetworkManager* network_; > }; >@@ -114,15 +119,18 @@ class PeerConnectionIceBaseTest : public ::testing::Test { > RTCConfiguration modified_config = config; > modified_config.sdp_semantics = sdp_semantics_; > auto observer = absl::make_unique<MockPeerConnectionObserver>(); >+ auto port_allocator_copy = port_allocator.get(); > auto pc = pc_factory_->CreatePeerConnection( > modified_config, std::move(port_allocator), nullptr, observer.get()); > if (!pc) { > return nullptr; > } > >+ observer->SetPeerConnectionInterface(pc.get()); > auto wrapper = absl::make_unique<PeerConnectionWrapperForIceTest>( > pc_factory_, pc, std::move(observer)); > wrapper->set_network(fake_network); >+ wrapper->port_allocator_ = port_allocator_copy; > return wrapper; > } > >@@ -205,7 +213,12 @@ class PeerConnectionIceBaseTest : public ::testing::Test { > PeerConnection* pc = static_cast<PeerConnection*>(pc_proxy->internal()); > for (auto transceiver : pc->GetTransceiversInternal()) { > if (transceiver->media_type() == cricket::MEDIA_TYPE_AUDIO) { >- cricket::BaseChannel* channel = transceiver->internal()->channel(); >+ // TODO(amithi): This test seems to be using a method that should not >+ // be public, |rtp_packet_transport|. Because the test is not mocking >+ // the channels or transceiver, the workaround is to |static_cast| >+ // the channel until the method is rewritten. 
>+ cricket::BaseChannel* channel = static_cast<cricket::BaseChannel*>( >+ transceiver->internal()->channel()); > if (channel) { > auto dtls_transport = static_cast<cricket::DtlsTransportInternal*>( > channel->rtp_packet_transport()); >@@ -1007,4 +1020,41 @@ TEST_F(PeerConnectionIceConfigTest, SetStunCandidateKeepaliveInterval) { > EXPECT_EQ(actual_stun_keepalive_interval.value_or(-1), 321); > } > >+TEST_P(PeerConnectionIceTest, IceCredentialsCreateOffer) { >+ RTCConfiguration config; >+ config.ice_candidate_pool_size = 1; >+ auto pc = CreatePeerConnectionWithAudioVideo(config); >+ ASSERT_NE(pc->port_allocator_, nullptr); >+ auto offer = pc->CreateOffer(); >+ auto credentials = pc->port_allocator_->GetPooledIceCredentials(); >+ ASSERT_EQ(1u, credentials.size()); >+ >+ auto* desc = offer->description(); >+ for (const auto& content : desc->contents()) { >+ auto* transport_info = desc->GetTransportInfoByName(content.name); >+ EXPECT_EQ(transport_info->description.ice_ufrag, credentials[0].ufrag); >+ EXPECT_EQ(transport_info->description.ice_pwd, credentials[0].pwd); >+ } >+} >+ >+TEST_P(PeerConnectionIceTest, IceCredentialsCreateAnswer) { >+ RTCConfiguration config; >+ config.ice_candidate_pool_size = 1; >+ auto pc = CreatePeerConnectionWithAudioVideo(config); >+ ASSERT_NE(pc->port_allocator_, nullptr); >+ auto offer = pc->CreateOffer(); >+ ASSERT_TRUE(pc->SetRemoteDescription(std::move(offer))); >+ auto answer = pc->CreateAnswer(); >+ >+ auto credentials = pc->port_allocator_->GetPooledIceCredentials(); >+ ASSERT_EQ(1u, credentials.size()); >+ >+ auto* desc = answer->description(); >+ for (const auto& content : desc->contents()) { >+ auto* transport_info = desc->GetTransportInfoByName(content.name); >+ EXPECT_EQ(transport_info->description.ice_ufrag, credentials[0].ufrag); >+ EXPECT_EQ(transport_info->description.ice_pwd, credentials[0].pwd); >+ } >+} >+ > } // namespace webrtc >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_integrationtest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_integrationtest.cc >index 28e0110355037ed4d7ec916326ae68f171ce13d3..a7f7aad0f54b31caed3ea7fd45f63f47c94a9ee0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_integrationtest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_integrationtest.cc >@@ -28,6 +28,8 @@ > #include "api/peerconnectioninterface.h" > #include "api/peerconnectionproxy.h" > #include "api/rtpreceiverinterface.h" >+#include "api/test/loopback_media_transport.h" >+#include "api/umametrics.h" > #include "api/video_codecs/builtin_video_decoder_factory.h" > #include "api/video_codecs/builtin_video_encoder_factory.h" > #include "api/video_codecs/sdp_video_format.h" >@@ -250,7 +252,8 @@ class PeerConnectionWrapper : public webrtc::PeerConnectionObserver, > webrtc::PeerConnectionDependencies dependencies(nullptr); > dependencies.cert_generator = std::move(cert_generator); > if (!client->Init(nullptr, nullptr, std::move(dependencies), network_thread, >- worker_thread, nullptr)) { >+ worker_thread, nullptr, >+ /*media_transport_factory=*/nullptr)) { > delete client; > return nullptr; > } >@@ -317,6 +320,12 @@ class PeerConnectionWrapper : public webrtc::PeerConnectionObserver, > ice_connection_state_history_.clear(); > } > >+ // Every PeerConnection state in order that has been seen by the observer. >+ std::vector<PeerConnectionInterface::PeerConnectionState> >+ peer_connection_state_history() const { >+ return peer_connection_state_history_; >+ } >+ > // Every ICE gathering state in order that has been seen by the observer. 
> std::vector<PeerConnectionInterface::IceGatheringState> > ice_gathering_state_history() const { >@@ -581,12 +590,14 @@ class PeerConnectionWrapper : public webrtc::PeerConnectionObserver, > explicit PeerConnectionWrapper(const std::string& debug_name) > : debug_name_(debug_name) {} > >- bool Init(const PeerConnectionFactory::Options* options, >- const PeerConnectionInterface::RTCConfiguration* config, >- webrtc::PeerConnectionDependencies dependencies, >- rtc::Thread* network_thread, >- rtc::Thread* worker_thread, >- std::unique_ptr<webrtc::FakeRtcEventLogFactory> event_log_factory) { >+ bool Init( >+ const PeerConnectionFactory::Options* options, >+ const PeerConnectionInterface::RTCConfiguration* config, >+ webrtc::PeerConnectionDependencies dependencies, >+ rtc::Thread* network_thread, >+ rtc::Thread* worker_thread, >+ std::unique_ptr<webrtc::FakeRtcEventLogFactory> event_log_factory, >+ std::unique_ptr<webrtc::MediaTransportFactory> media_transport_factory) { > // There's an error in this test code if Init ends up being called twice. 
> RTC_DCHECK(!peer_connection_); > RTC_DCHECK(!peer_connection_factory_); >@@ -624,6 +635,10 @@ class PeerConnectionWrapper : public webrtc::PeerConnectionObserver, > pc_factory_dependencies.event_log_factory = > webrtc::CreateRtcEventLogFactory(); > } >+ if (media_transport_factory) { >+ pc_factory_dependencies.media_transport_factory = >+ std::move(media_transport_factory); >+ } > peer_connection_factory_ = webrtc::CreateModularPeerConnectionFactory( > std::move(pc_factory_dependencies)); > >@@ -913,6 +928,11 @@ class PeerConnectionWrapper : public webrtc::PeerConnectionObserver, > EXPECT_EQ(pc()->ice_connection_state(), new_state); > ice_connection_state_history_.push_back(new_state); > } >+ void OnConnectionChange( >+ webrtc::PeerConnectionInterface::PeerConnectionState new_state) override { >+ peer_connection_state_history_.push_back(new_state); >+ } >+ > void OnIceGatheringChange( > webrtc::PeerConnectionInterface::IceGatheringState new_state) override { > EXPECT_EQ(pc()->ice_gathering_state(), new_state); >@@ -1009,6 +1029,8 @@ class PeerConnectionWrapper : public webrtc::PeerConnectionObserver, > > std::vector<PeerConnectionInterface::IceConnectionState> > ice_connection_state_history_; >+ std::vector<PeerConnectionInterface::PeerConnectionState> >+ peer_connection_state_history_; > std::vector<PeerConnectionInterface::IceGatheringState> > ice_gathering_state_history_; > >@@ -1142,7 +1164,8 @@ class PeerConnectionIntegrationBaseTest : public testing::Test { > ss_(new rtc::VirtualSocketServer()), > fss_(new rtc::FirewallSocketServer(ss_.get())), > network_thread_(new rtc::Thread(fss_.get())), >- worker_thread_(rtc::Thread::Create()) { >+ worker_thread_(rtc::Thread::Create()), >+ loopback_media_transports_(network_thread_.get()) { > network_thread_->SetName("PCNetworkThread", this); > worker_thread_->SetName("PCWorkerThread", this); > RTC_CHECK(network_thread_->Start()); >@@ -1198,7 +1221,8 @@ class PeerConnectionIntegrationBaseTest : public testing::Test { > 
const PeerConnectionFactory::Options* options, > const RTCConfiguration* config, > webrtc::PeerConnectionDependencies dependencies, >- std::unique_ptr<webrtc::FakeRtcEventLogFactory> event_log_factory) { >+ std::unique_ptr<webrtc::FakeRtcEventLogFactory> event_log_factory, >+ std::unique_ptr<webrtc::MediaTransportFactory> media_transport_factory) { > RTCConfiguration modified_config; > if (config) { > modified_config = *config; >@@ -1213,7 +1237,8 @@ class PeerConnectionIntegrationBaseTest : public testing::Test { > > if (!client->Init(options, &modified_config, std::move(dependencies), > network_thread_.get(), worker_thread_.get(), >- std::move(event_log_factory))) { >+ std::move(event_log_factory), >+ std::move(media_transport_factory))) { > return nullptr; > } > return client; >@@ -1229,7 +1254,8 @@ class PeerConnectionIntegrationBaseTest : public testing::Test { > new webrtc::FakeRtcEventLogFactory(rtc::Thread::Current())); > return CreatePeerConnectionWrapper(debug_name, options, config, > std::move(dependencies), >- std::move(event_log_factory)); >+ std::move(event_log_factory), >+ /*media_transport_factory=*/nullptr); > } > > bool CreatePeerConnectionWrappers() { >@@ -1250,11 +1276,11 @@ class PeerConnectionIntegrationBaseTest : public testing::Test { > sdp_semantics_ = caller_semantics; > caller_ = CreatePeerConnectionWrapper( > "Caller", nullptr, nullptr, webrtc::PeerConnectionDependencies(nullptr), >- nullptr); >+ nullptr, /*media_transport_factory=*/nullptr); > sdp_semantics_ = callee_semantics; > callee_ = CreatePeerConnectionWrapper( > "Callee", nullptr, nullptr, webrtc::PeerConnectionDependencies(nullptr), >- nullptr); >+ nullptr, /*media_transport_factory=*/nullptr); > sdp_semantics_ = original_semantics; > return caller_ && callee_; > } >@@ -1264,10 +1290,28 @@ class PeerConnectionIntegrationBaseTest : public testing::Test { > const PeerConnectionInterface::RTCConfiguration& callee_config) { > caller_ = CreatePeerConnectionWrapper( > "Caller", 
nullptr, &caller_config, >- webrtc::PeerConnectionDependencies(nullptr), nullptr); >+ webrtc::PeerConnectionDependencies(nullptr), nullptr, >+ /*media_transport_factory=*/nullptr); > callee_ = CreatePeerConnectionWrapper( > "Callee", nullptr, &callee_config, >- webrtc::PeerConnectionDependencies(nullptr), nullptr); >+ webrtc::PeerConnectionDependencies(nullptr), nullptr, >+ /*media_transport_factory=*/nullptr); >+ return caller_ && callee_; >+ } >+ >+ bool CreatePeerConnectionWrappersWithConfigAndMediaTransportFactory( >+ const PeerConnectionInterface::RTCConfiguration& caller_config, >+ const PeerConnectionInterface::RTCConfiguration& callee_config, >+ std::unique_ptr<webrtc::MediaTransportFactory> caller_factory, >+ std::unique_ptr<webrtc::MediaTransportFactory> callee_factory) { >+ caller_ = >+ CreatePeerConnectionWrapper("Caller", nullptr, &caller_config, >+ webrtc::PeerConnectionDependencies(nullptr), >+ nullptr, std::move(caller_factory)); >+ callee_ = >+ CreatePeerConnectionWrapper("Callee", nullptr, &callee_config, >+ webrtc::PeerConnectionDependencies(nullptr), >+ nullptr, std::move(callee_factory)); > return caller_ && callee_; > } > >@@ -1278,10 +1322,12 @@ class PeerConnectionIntegrationBaseTest : public testing::Test { > webrtc::PeerConnectionDependencies callee_dependencies) { > caller_ = > CreatePeerConnectionWrapper("Caller", nullptr, &caller_config, >- std::move(caller_dependencies), nullptr); >+ std::move(caller_dependencies), nullptr, >+ /*media_transport_factory=*/nullptr); > callee_ = > CreatePeerConnectionWrapper("Callee", nullptr, &callee_config, >- std::move(callee_dependencies), nullptr); >+ std::move(callee_dependencies), nullptr, >+ /*media_transport_factory=*/nullptr); > return caller_ && callee_; > } > >@@ -1290,10 +1336,12 @@ class PeerConnectionIntegrationBaseTest : public testing::Test { > const PeerConnectionFactory::Options& callee_options) { > caller_ = CreatePeerConnectionWrapper( > "Caller", &caller_options, nullptr, >- 
webrtc::PeerConnectionDependencies(nullptr), nullptr); >+ webrtc::PeerConnectionDependencies(nullptr), nullptr, >+ /*media_transport_factory=*/nullptr); > callee_ = CreatePeerConnectionWrapper( > "Callee", &callee_options, nullptr, >- webrtc::PeerConnectionDependencies(nullptr), nullptr); >+ webrtc::PeerConnectionDependencies(nullptr), nullptr, >+ /*media_transport_factory=*/nullptr); > return caller_ && callee_; > } > >@@ -1317,7 +1365,8 @@ class PeerConnectionIntegrationBaseTest : public testing::Test { > webrtc::PeerConnectionDependencies dependencies(nullptr); > dependencies.cert_generator = std::move(cert_generator); > return CreatePeerConnectionWrapper("New Peer", nullptr, nullptr, >- std::move(dependencies), nullptr); >+ std::move(dependencies), nullptr, >+ /*media_transport_factory=*/nullptr); > } > > cricket::TestTurnServer* CreateTurnServer( >@@ -1405,6 +1454,10 @@ class PeerConnectionIntegrationBaseTest : public testing::Test { > > rtc::VirtualSocketServer* virtual_socket_server() { return ss_.get(); } > >+ webrtc::MediaTransportPair* loopback_media_transports() { >+ return &loopback_media_transports_; >+ } >+ > PeerConnectionWrapper* caller() { return caller_.get(); } > > // Set the |caller_| to the |wrapper| passed in and return the >@@ -1557,9 +1610,11 @@ class PeerConnectionIntegrationBaseTest : public testing::Test { > bool remote_gcm_enabled, > int expected_cipher_suite) { > PeerConnectionFactory::Options caller_options; >- caller_options.crypto_options.enable_gcm_crypto_suites = local_gcm_enabled; >+ caller_options.crypto_options.srtp.enable_gcm_crypto_suites = >+ local_gcm_enabled; > PeerConnectionFactory::Options callee_options; >- callee_options.crypto_options.enable_gcm_crypto_suites = remote_gcm_enabled; >+ callee_options.crypto_options.srtp.enable_gcm_crypto_suites = >+ remote_gcm_enabled; > TestNegotiatedCipherSuite(caller_options, callee_options, > expected_cipher_suite); > } >@@ -1581,6 +1636,7 @@ class PeerConnectionIntegrationBaseTest : 
public testing::Test { > // on the network thread. > std::vector<std::unique_ptr<cricket::TestTurnServer>> turn_servers_; > std::vector<std::unique_ptr<cricket::TestTurnCustomizer>> turn_customizers_; >+ webrtc::MediaTransportPair loopback_media_transports_; > std::unique_ptr<PeerConnectionWrapper> caller_; > std::unique_ptr<PeerConnectionWrapper> callee_; > }; >@@ -1805,26 +1861,26 @@ TEST_P(PeerConnectionIntegrationTest, > > auto caller_remote_cert = GetRemoteAudioSSLCertificate(caller()); > ASSERT_TRUE(caller_remote_cert); >- EXPECT_EQ(callee_cert->ssl_certificate().ToPEMString(), >+ EXPECT_EQ(callee_cert->GetSSLCertificate().ToPEMString(), > caller_remote_cert->ToPEMString()); > > auto callee_remote_cert = GetRemoteAudioSSLCertificate(callee()); > ASSERT_TRUE(callee_remote_cert); >- EXPECT_EQ(caller_cert->ssl_certificate().ToPEMString(), >+ EXPECT_EQ(caller_cert->GetSSLCertificate().ToPEMString(), > callee_remote_cert->ToPEMString()); > > auto caller_remote_cert_chain = GetRemoteAudioSSLCertChain(caller()); > ASSERT_TRUE(caller_remote_cert_chain); > ASSERT_EQ(1U, caller_remote_cert_chain->GetSize()); > auto remote_cert = &caller_remote_cert_chain->Get(0); >- EXPECT_EQ(callee_cert->ssl_certificate().ToPEMString(), >+ EXPECT_EQ(callee_cert->GetSSLCertificate().ToPEMString(), > remote_cert->ToPEMString()); > > auto callee_remote_cert_chain = GetRemoteAudioSSLCertChain(callee()); > ASSERT_TRUE(callee_remote_cert_chain); > ASSERT_EQ(1U, callee_remote_cert_chain->GetSize()); > remote_cert = &callee_remote_cert_chain->Get(0); >- EXPECT_EQ(caller_cert->ssl_certificate().ToPEMString(), >+ EXPECT_EQ(caller_cert->GetSSLCertificate().ToPEMString(), > remote_cert->ToPEMString()); > } > >@@ -2842,9 +2898,10 @@ TEST_P(PeerConnectionIntegrationTest, CallerDtls10ToCalleeDtls12) { > TEST_P(PeerConnectionIntegrationTest, > Aes128Sha1_32_CipherNotUsedWhenOnlyCallerSupported) { > PeerConnectionFactory::Options caller_options; >- 
caller_options.crypto_options.enable_aes128_sha1_32_crypto_cipher = true; >+ caller_options.crypto_options.srtp.enable_aes128_sha1_32_crypto_cipher = true; > PeerConnectionFactory::Options callee_options; >- callee_options.crypto_options.enable_aes128_sha1_32_crypto_cipher = false; >+ callee_options.crypto_options.srtp.enable_aes128_sha1_32_crypto_cipher = >+ false; > int expected_cipher_suite = rtc::SRTP_AES128_CM_SHA1_80; > TestNegotiatedCipherSuite(caller_options, callee_options, > expected_cipher_suite); >@@ -2853,9 +2910,10 @@ TEST_P(PeerConnectionIntegrationTest, > TEST_P(PeerConnectionIntegrationTest, > Aes128Sha1_32_CipherNotUsedWhenOnlyCalleeSupported) { > PeerConnectionFactory::Options caller_options; >- caller_options.crypto_options.enable_aes128_sha1_32_crypto_cipher = false; >+ caller_options.crypto_options.srtp.enable_aes128_sha1_32_crypto_cipher = >+ false; > PeerConnectionFactory::Options callee_options; >- callee_options.crypto_options.enable_aes128_sha1_32_crypto_cipher = true; >+ callee_options.crypto_options.srtp.enable_aes128_sha1_32_crypto_cipher = true; > int expected_cipher_suite = rtc::SRTP_AES128_CM_SHA1_80; > TestNegotiatedCipherSuite(caller_options, callee_options, > expected_cipher_suite); >@@ -2863,9 +2921,9 @@ TEST_P(PeerConnectionIntegrationTest, > > TEST_P(PeerConnectionIntegrationTest, Aes128Sha1_32_CipherUsedWhenSupported) { > PeerConnectionFactory::Options caller_options; >- caller_options.crypto_options.enable_aes128_sha1_32_crypto_cipher = true; >+ caller_options.crypto_options.srtp.enable_aes128_sha1_32_crypto_cipher = true; > PeerConnectionFactory::Options callee_options; >- callee_options.crypto_options.enable_aes128_sha1_32_crypto_cipher = true; >+ callee_options.crypto_options.srtp.enable_aes128_sha1_32_crypto_cipher = true; > int expected_cipher_suite = rtc::SRTP_AES128_CM_SHA1_32; > TestNegotiatedCipherSuite(caller_options, callee_options, > expected_cipher_suite); >@@ -2915,7 +2973,7 @@ 
TEST_P(PeerConnectionIntegrationTest, > // works with it. > TEST_P(PeerConnectionIntegrationTest, EndToEndCallWithGcmCipher) { > PeerConnectionFactory::Options gcm_options; >- gcm_options.crypto_options.enable_gcm_crypto_suites = true; >+ gcm_options.crypto_options.srtp.enable_gcm_crypto_suites = true; > ASSERT_TRUE( > CreatePeerConnectionWrappersWithOptions(gcm_options, gcm_options)); > ConnectFakeSignaling(); >@@ -3329,6 +3387,146 @@ TEST_P(PeerConnectionIntegrationTest, > > #endif // HAVE_SCTP > >+// This test sets up a call between two parties with a media transport data >+// channel. >+TEST_P(PeerConnectionIntegrationTest, MediaTransportDataChannelEndToEnd) { >+ PeerConnectionInterface::RTCConfiguration rtc_config; >+ rtc_config.use_media_transport_for_data_channels = true; >+ rtc_config.enable_dtls_srtp = false; // SDES is required for media transport. >+ ASSERT_TRUE(CreatePeerConnectionWrappersWithConfigAndMediaTransportFactory( >+ rtc_config, rtc_config, loopback_media_transports()->first_factory(), >+ loopback_media_transports()->second_factory())); >+ ConnectFakeSignaling(); >+ >+ // Expect that data channel created on caller side will show up for callee as >+ // well. >+ caller()->CreateDataChannel(); >+ caller()->CreateAndSetAndSignalOffer(); >+ ASSERT_TRUE_WAIT(SignalingStateStable(), kDefaultTimeout); >+ >+ // Ensure that the media transport is ready. >+ loopback_media_transports()->SetState(webrtc::MediaTransportState::kWritable); >+ loopback_media_transports()->FlushAsyncInvokes(); >+ >+ // Caller data channel should already exist (it created one). Callee data >+ // channel may not exist yet, since negotiation happens in-band, not in SDP. 
>+ ASSERT_NE(nullptr, caller()->data_channel()); >+ ASSERT_TRUE_WAIT(callee()->data_channel() != nullptr, kDefaultTimeout); >+ EXPECT_TRUE_WAIT(caller()->data_observer()->IsOpen(), kDefaultTimeout); >+ EXPECT_TRUE_WAIT(callee()->data_observer()->IsOpen(), kDefaultTimeout); >+ >+ // Ensure data can be sent in both directions. >+ std::string data = "hello world"; >+ caller()->data_channel()->Send(DataBuffer(data)); >+ EXPECT_EQ_WAIT(data, callee()->data_observer()->last_message(), >+ kDefaultTimeout); >+ callee()->data_channel()->Send(DataBuffer(data)); >+ EXPECT_EQ_WAIT(data, caller()->data_observer()->last_message(), >+ kDefaultTimeout); >+} >+ >+// Ensure that when the callee closes a media transport data channel, the >+// closing procedure results in the data channel being closed for the caller >+// as well. >+TEST_P(PeerConnectionIntegrationTest, MediaTransportDataChannelCalleeCloses) { >+ PeerConnectionInterface::RTCConfiguration rtc_config; >+ rtc_config.use_media_transport_for_data_channels = true; >+ rtc_config.enable_dtls_srtp = false; // SDES is required for media transport. >+ ASSERT_TRUE(CreatePeerConnectionWrappersWithConfigAndMediaTransportFactory( >+ rtc_config, rtc_config, loopback_media_transports()->first_factory(), >+ loopback_media_transports()->second_factory())); >+ ConnectFakeSignaling(); >+ >+ // Create a data channel on the caller and signal it to the callee. >+ caller()->CreateDataChannel(); >+ caller()->CreateAndSetAndSignalOffer(); >+ ASSERT_TRUE_WAIT(SignalingStateStable(), kDefaultTimeout); >+ >+ // Ensure that the media transport is ready. >+ loopback_media_transports()->SetState(webrtc::MediaTransportState::kWritable); >+ loopback_media_transports()->FlushAsyncInvokes(); >+ >+ // Data channels exist and open on both ends of the connection. 
>+ ASSERT_NE(nullptr, caller()->data_channel()); >+ ASSERT_TRUE_WAIT(callee()->data_channel() != nullptr, kDefaultTimeout); >+ ASSERT_TRUE_WAIT(caller()->data_observer()->IsOpen(), kDefaultTimeout); >+ ASSERT_TRUE_WAIT(callee()->data_observer()->IsOpen(), kDefaultTimeout); >+ >+ // Close the data channel on the callee side, and wait for it to reach the >+ // "closed" state on both sides. >+ callee()->data_channel()->Close(); >+ EXPECT_TRUE_WAIT(!caller()->data_observer()->IsOpen(), kDefaultTimeout); >+ EXPECT_TRUE_WAIT(!callee()->data_observer()->IsOpen(), kDefaultTimeout); >+} >+ >+TEST_P(PeerConnectionIntegrationTest, >+ MediaTransportDataChannelConfigSentToOtherSide) { >+ PeerConnectionInterface::RTCConfiguration rtc_config; >+ rtc_config.use_media_transport_for_data_channels = true; >+ rtc_config.enable_dtls_srtp = false; // SDES is required for media transport. >+ ASSERT_TRUE(CreatePeerConnectionWrappersWithConfigAndMediaTransportFactory( >+ rtc_config, rtc_config, loopback_media_transports()->first_factory(), >+ loopback_media_transports()->second_factory())); >+ ConnectFakeSignaling(); >+ >+ // Create a data channel with a non-default configuration and signal it to the >+ // callee. >+ webrtc::DataChannelInit init; >+ init.id = 53; >+ init.maxRetransmits = 52; >+ caller()->CreateDataChannel("data-channel", &init); >+ caller()->CreateAndSetAndSignalOffer(); >+ ASSERT_TRUE_WAIT(SignalingStateStable(), kDefaultTimeout); >+ >+ // Ensure that the media transport is ready. >+ loopback_media_transports()->SetState(webrtc::MediaTransportState::kWritable); >+ loopback_media_transports()->FlushAsyncInvokes(); >+ >+ // Ensure that the data channel exists on the callee with the correct >+ // configuration. 
>+ ASSERT_TRUE_WAIT(callee()->data_channel() != nullptr, kDefaultTimeout); >+ ASSERT_TRUE_WAIT(callee()->data_observer()->IsOpen(), kDefaultTimeout); >+ EXPECT_EQ(init.id, callee()->data_channel()->id()); >+ EXPECT_EQ("data-channel", callee()->data_channel()->label()); >+ EXPECT_EQ(init.maxRetransmits, callee()->data_channel()->maxRetransmits()); >+ EXPECT_FALSE(callee()->data_channel()->negotiated()); >+} >+ >+TEST_P(PeerConnectionIntegrationTest, MediaTransportBidirectionalAudio) { >+ PeerConnectionInterface::RTCConfiguration rtc_config; >+ rtc_config.use_media_transport = true; >+ rtc_config.enable_dtls_srtp = false; // SDES is required for media transport. >+ ASSERT_TRUE(CreatePeerConnectionWrappersWithConfigAndMediaTransportFactory( >+ rtc_config, rtc_config, loopback_media_transports()->first_factory(), >+ loopback_media_transports()->second_factory())); >+ ConnectFakeSignaling(); >+ >+ caller()->AddAudioTrack(); >+ callee()->AddAudioTrack(); >+ // Start offer/answer exchange and wait for it to complete. >+ caller()->CreateAndSetAndSignalOffer(); >+ ASSERT_TRUE_WAIT(SignalingStateStable(), kDefaultTimeout); >+ >+ // Ensure that the media transport is ready. 
>+ loopback_media_transports()->SetState(webrtc::MediaTransportState::kWritable); >+ loopback_media_transports()->FlushAsyncInvokes(); >+ >+ MediaExpectations media_expectations; >+ media_expectations.ExpectBidirectionalAudio(); >+ ASSERT_TRUE(ExpectNewFrames(media_expectations)); >+ >+ webrtc::MediaTransportPair::Stats first_stats = >+ loopback_media_transports()->FirstStats(); >+ webrtc::MediaTransportPair::Stats second_stats = >+ loopback_media_transports()->SecondStats(); >+ >+ EXPECT_GT(first_stats.received_audio_frames, 0); >+ EXPECT_GE(second_stats.sent_audio_frames, first_stats.received_audio_frames); >+ >+ EXPECT_GT(second_stats.received_audio_frames, 0); >+ EXPECT_GE(first_stats.sent_audio_frames, second_stats.received_audio_frames); >+} >+ > // Test that the ICE connection and gathering states eventually reach > // "complete". > TEST_P(PeerConnectionIntegrationTest, IceStatesReachCompletion) { >@@ -3557,6 +3755,17 @@ TEST_P(PeerConnectionIntegrationIceStatesTest, VerifyIceStates) { > ElementsAre(PeerConnectionInterface::kIceConnectionChecking, > PeerConnectionInterface::kIceConnectionConnected, > PeerConnectionInterface::kIceConnectionCompleted)); >+ // After the ICE transport transitions from checking to connected, we revert >+ // back to "new", as the standard requires: at that point the DTLS transport >+ // is in the "new" state while no transports are "connecting", "checking", >+ // "failed", or "disconnected". This is unintuitive, and we might want to >+ // amend the spec to handle this case more gracefully. 
>+ EXPECT_THAT( >+ caller()->peer_connection_state_history(), >+ ElementsAre(PeerConnectionInterface::PeerConnectionState::kConnecting, >+ PeerConnectionInterface::PeerConnectionState::kNew, >+ PeerConnectionInterface::PeerConnectionState::kConnecting, >+ PeerConnectionInterface::PeerConnectionState::kConnected)); > EXPECT_THAT(caller()->ice_gathering_state_history(), > ElementsAre(PeerConnectionInterface::kIceGatheringGathering, > PeerConnectionInterface::kIceGatheringComplete)); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_jsep_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_jsep_unittest.cc >index 7de0a3ff14a4ab97a98cbc767a9398cc7fa888b1..fe878b05f78c231770068aeae2ccea61903524c5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_jsep_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_jsep_unittest.cc >@@ -95,6 +95,7 @@ class PeerConnectionJsepTest : public ::testing::Test { > return nullptr; > } > >+ observer->SetPeerConnectionInterface(pc.get()); > return absl::make_unique<PeerConnectionWrapper>(pc_factory, pc, > std::move(observer)); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_media_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_media_unittest.cc >index 6f4fe1e3f489e9892d6492168b37ebdd5030e5bf..6af0c9887e6e2dfe4a1c07cad220ebd120e56486 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_media_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_media_unittest.cc >@@ -15,6 +15,7 @@ > #include <tuple> > > #include "api/call/callfactoryinterface.h" >+#include "api/test/fake_media_transport.h" > #include "logging/rtc_event_log/rtc_event_log_factory.h" > #include "media/base/fakemediaengine.h" > #include "p2p/base/fakeportallocator.h" >@@ -71,13 +72,26 @@ class PeerConnectionMediaBaseTest : public ::testing::Test { > return 
CreatePeerConnection(RTCConfiguration()); > } > >+ // Creates PeerConnectionFactory and PeerConnection for given configuration. >+ // Note that PeerConnectionFactory is created with MediaTransportFactory, >+ // because some tests pass config.use_media_transport = true. > WrapperPtr CreatePeerConnection(const RTCConfiguration& config) { > auto media_engine = absl::make_unique<FakeMediaEngine>(); > auto* media_engine_ptr = media_engine.get(); >- auto pc_factory = CreateModularPeerConnectionFactory( >- rtc::Thread::Current(), rtc::Thread::Current(), rtc::Thread::Current(), >- std::move(media_engine), CreateCallFactory(), >- CreateRtcEventLogFactory()); >+ >+ PeerConnectionFactoryDependencies factory_dependencies; >+ >+ factory_dependencies.network_thread = rtc::Thread::Current(); >+ factory_dependencies.worker_thread = rtc::Thread::Current(); >+ factory_dependencies.signaling_thread = rtc::Thread::Current(); >+ factory_dependencies.media_engine = std::move(media_engine); >+ factory_dependencies.call_factory = CreateCallFactory(); >+ factory_dependencies.event_log_factory = CreateRtcEventLogFactory(); >+ factory_dependencies.media_transport_factory = >+ absl::make_unique<FakeMediaTransportFactory>(); >+ >+ auto pc_factory = >+ CreateModularPeerConnectionFactory(std::move(factory_dependencies)); > > auto fake_port_allocator = absl::make_unique<cricket::FakePortAllocator>( > rtc::Thread::Current(), nullptr); >@@ -91,12 +105,25 @@ class PeerConnectionMediaBaseTest : public ::testing::Test { > return nullptr; > } > >+ observer->SetPeerConnectionInterface(pc.get()); > auto wrapper = absl::make_unique<PeerConnectionWrapperForMediaTest>( > pc_factory, pc, std::move(observer)); > wrapper->set_media_engine(media_engine_ptr); > return wrapper; > } > >+ // Accepts the same arguments as CreatePeerConnection and adds default audio >+ // track (but no video). >+ template <typename... Args> >+ WrapperPtr CreatePeerConnectionWithAudio(Args&&... 
args) { >+ auto wrapper = CreatePeerConnection(std::forward<Args>(args)...); >+ if (!wrapper) { >+ return nullptr; >+ } >+ wrapper->AddAudioTrack("a"); >+ return wrapper; >+ } >+ > // Accepts the same arguments as CreatePeerConnection and adds default audio > // and video tracks. > template <typename... Args> >@@ -677,7 +704,7 @@ void AddComfortNoiseCodecsToSend(cricket::FakeMediaEngine* media_engine) { > const cricket::AudioCodec kComfortNoiseCodec8k(102, "CN", 8000, 0, 1); > const cricket::AudioCodec kComfortNoiseCodec16k(103, "CN", 16000, 0, 1); > >- auto codecs = media_engine->audio_send_codecs(); >+ auto codecs = media_engine->voice().send_codecs(); > codecs.push_back(kComfortNoiseCodec8k); > codecs.push_back(kComfortNoiseCodec16k); > media_engine->SetAudioCodecs(codecs); >@@ -1072,6 +1099,128 @@ TEST_P(PeerConnectionMediaTest, > audio_options.combined_audio_video_bwe); > } > >+TEST_P(PeerConnectionMediaTest, MediaTransportPropagatedToVoiceEngine) { >+ RTCConfiguration config; >+ >+ // Setup PeerConnection to use media transport. >+ config.use_media_transport = true; >+ >+ // Force SDES. >+ config.enable_dtls_srtp = false; >+ >+ auto caller = CreatePeerConnectionWithAudio(config); >+ auto callee = CreatePeerConnectionWithAudio(config); >+ >+ ASSERT_TRUE(callee->SetRemoteDescription(caller->CreateOfferAndSetAsLocal())); >+ auto answer = callee->CreateAnswer(); >+ ASSERT_TRUE(callee->SetLocalDescription(std::move(answer))); >+ >+ auto caller_voice = caller->media_engine()->GetVoiceChannel(0); >+ auto callee_voice = callee->media_engine()->GetVoiceChannel(0); >+ ASSERT_TRUE(caller_voice); >+ ASSERT_TRUE(callee_voice); >+ >+ // Make sure media transport is propagated to voice channel. 
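The `CreatePeerConnectionWithAudio` helper added above uses a variadic template with perfect forwarding so it accepts exactly the argument lists that `CreatePeerConnection` does (none, or an `RTCConfiguration`). A minimal standalone sketch of that pattern, with hypothetical stand-in types rather than the patch's real wrapper classes:

```cpp
#include <cassert>
#include <memory>
#include <string>
#include <utility>
#include <vector>

// Hypothetical stand-ins for the test wrapper types in the patch.
struct Config {
  bool use_media_transport = false;
};

struct Wrapper {
  Config config;
  std::vector<std::string> audio_tracks;
  void AddAudioTrack(const std::string& label) { audio_tracks.push_back(label); }
};

std::unique_ptr<Wrapper> CreatePeerConnection(const Config& config = Config()) {
  auto w = std::make_unique<Wrapper>();
  w->config = config;
  return w;
}

// Forwards any argument list to CreatePeerConnection, then adds one default
// audio track -- the same shape as CreatePeerConnectionWithAudio above.
template <typename... Args>
std::unique_ptr<Wrapper> CreatePeerConnectionWithAudio(Args&&... args) {
  auto wrapper = CreatePeerConnection(std::forward<Args>(args)...);
  if (!wrapper) {
    return nullptr;
  }
  wrapper->AddAudioTrack("a");
  return wrapper;
}
```

The benefit of this shape is that new `CreatePeerConnection` overloads are picked up by the helper automatically, without touching its signature.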
>+ FakeMediaTransport* caller_voice_media_transport = >+ static_cast<FakeMediaTransport*>(caller_voice->media_transport()); >+ FakeMediaTransport* callee_voice_media_transport = >+ static_cast<FakeMediaTransport*>(callee_voice->media_transport()); >+ ASSERT_NE(nullptr, caller_voice_media_transport); >+ ASSERT_NE(nullptr, callee_voice_media_transport); >+ >+ // Make sure media transport is created with correct is_caller. >+ EXPECT_TRUE(caller_voice_media_transport->is_caller()); >+ EXPECT_FALSE(callee_voice_media_transport->is_caller()); >+ >+ // TODO(sukhanov): Propagate media transport to video channel. >+ // This test does NOT set up video channels, because currently it causes >+ // us to create two media transports. >+} >+ >+TEST_P(PeerConnectionMediaTest, MediaTransportOnlyForDataChannels) { >+ RTCConfiguration config; >+ >+ // Setup PeerConnection to use media transport for data channels. >+ config.use_media_transport_for_data_channels = true; >+ >+ // Force SDES. >+ config.enable_dtls_srtp = false; >+ >+ auto caller = CreatePeerConnectionWithAudio(config); >+ auto callee = CreatePeerConnectionWithAudio(config); >+ >+ ASSERT_TRUE(callee->SetRemoteDescription(caller->CreateOfferAndSetAsLocal())); >+ ASSERT_TRUE(callee->SetLocalDescription(callee->CreateAnswer())); >+ >+ auto caller_voice = caller->media_engine()->GetVoiceChannel(0); >+ auto callee_voice = callee->media_engine()->GetVoiceChannel(0); >+ ASSERT_TRUE(caller_voice); >+ ASSERT_TRUE(callee_voice); >+ >+ // Make sure media transport is not propagated to voice channel. >+ EXPECT_EQ(nullptr, caller_voice->media_transport()); >+ EXPECT_EQ(nullptr, callee_voice->media_transport()); >+} >+ >+TEST_P(PeerConnectionMediaTest, MediaTransportForMediaAndDataChannels) { >+ RTCConfiguration config; >+ >+ // Setup PeerConnection to use media transport for both media and data >+ // channels. >+ config.use_media_transport = true; >+ config.use_media_transport_for_data_channels = true; >+ >+ // Force SDES. 
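The media-transport tests above verify propagation by `static_cast`-ing the channel's transport pointer back to `FakeMediaTransport`, which is safe because the test injected the fake itself via the factory. A standalone sketch of that inject-then-downcast testing pattern, using hypothetical minimal types rather than the real webrtc interfaces:

```cpp
#include <cassert>
#include <memory>
#include <utility>

// Minimal stand-in for a transport interface and a fake that records which
// side created it, mirroring FakeMediaTransport::is_caller() in the patch.
class MediaTransport {
 public:
  virtual ~MediaTransport() = default;
};

class FakeMediaTransport : public MediaTransport {
 public:
  explicit FakeMediaTransport(bool is_caller) : is_caller_(is_caller) {}
  bool is_caller() const { return is_caller_; }

 private:
  bool is_caller_;
};

// A channel exposes only the interface type, as the voice channel does.
class Channel {
 public:
  explicit Channel(std::unique_ptr<MediaTransport> transport)
      : transport_(std::move(transport)) {}
  MediaTransport* media_transport() { return transport_.get(); }

 private:
  std::unique_ptr<MediaTransport> transport_;
};

// Because the test created the fake, the downcast cannot fail, and it lets
// the assertion inspect state that only the fake records.
bool TransportIsCaller(Channel& channel) {
  auto* fake = static_cast<FakeMediaTransport*>(channel.media_transport());
  return fake != nullptr && fake->is_caller();
}
```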
>+ config.enable_dtls_srtp = false; >+ >+ auto caller = CreatePeerConnectionWithAudio(config); >+ auto callee = CreatePeerConnectionWithAudio(config); >+ >+ ASSERT_TRUE(callee->SetRemoteDescription(caller->CreateOfferAndSetAsLocal())); >+ ASSERT_TRUE(callee->SetLocalDescription(callee->CreateAnswer())); >+ >+ auto caller_voice = caller->media_engine()->GetVoiceChannel(0); >+ auto callee_voice = callee->media_engine()->GetVoiceChannel(0); >+ ASSERT_TRUE(caller_voice); >+ ASSERT_TRUE(callee_voice); >+ >+ // Make sure media transport is propagated to voice channel. >+ FakeMediaTransport* caller_voice_media_transport = >+ static_cast<FakeMediaTransport*>(caller_voice->media_transport()); >+ FakeMediaTransport* callee_voice_media_transport = >+ static_cast<FakeMediaTransport*>(callee_voice->media_transport()); >+ ASSERT_NE(nullptr, caller_voice_media_transport); >+ ASSERT_NE(nullptr, callee_voice_media_transport); >+ >+ // Make sure media transport is created with correct is_caller. >+ EXPECT_TRUE(caller_voice_media_transport->is_caller()); >+ EXPECT_FALSE(callee_voice_media_transport->is_caller()); >+} >+ >+TEST_P(PeerConnectionMediaTest, MediaTransportNotPropagatedToVoiceEngine) { >+ auto caller = CreatePeerConnectionWithAudioVideo(); >+ auto callee = CreatePeerConnectionWithAudioVideo(); >+ >+ ASSERT_TRUE(callee->SetRemoteDescription(caller->CreateOfferAndSetAsLocal())); >+ auto answer = callee->CreateAnswer(); >+ ASSERT_TRUE(callee->SetLocalDescription(std::move(answer))); >+ >+ auto caller_voice = caller->media_engine()->GetVoiceChannel(0); >+ auto callee_voice = callee->media_engine()->GetVoiceChannel(0); >+ ASSERT_TRUE(caller_voice); >+ ASSERT_TRUE(callee_voice); >+ >+ // Since we did not setup PeerConnection to use media transport, media >+ // transport should not be created / propagated to the voice engine. 
>+ ASSERT_EQ(nullptr, caller_voice->media_transport()); >+ ASSERT_EQ(nullptr, callee_voice->media_transport()); >+ >+ auto caller_video = caller->media_engine()->GetVideoChannel(0); >+ auto callee_video = callee->media_engine()->GetVideoChannel(0); >+ ASSERT_EQ(nullptr, caller_video->media_transport()); >+ ASSERT_EQ(nullptr, callee_video->media_transport()); >+} >+ > INSTANTIATE_TEST_CASE_P(PeerConnectionMediaTest, > PeerConnectionMediaTest, > Values(SdpSemantics::kPlanB, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_rampup_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_rampup_tests.cc >index 850212bf0dca4bd75282828869032ceda8c9330d..74f8018fb09b7e0da107f7834943227214e9c8c5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_rampup_tests.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_rampup_tests.cc >@@ -10,6 +10,7 @@ > > #include "api/audio_codecs/builtin_audio_decoder_factory.h" > #include "api/audio_codecs/builtin_audio_encoder_factory.h" >+#include "api/create_peerconnection_factory.h" > #include "api/stats/rtcstats_objects.h" > #include "api/video_codecs/builtin_video_decoder_factory.h" > #include "api/video_codecs/builtin_video_encoder_factory.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_rtp_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_rtp_unittest.cc >index bc3662cbbd43e341d1aae823c7d8168318ae282e..7098470baaeee98d0e74a6fc7074a87611f9596b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_rtp_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_rtp_unittest.cc >@@ -14,9 +14,11 @@ > #include "absl/memory/memory.h" > #include "api/audio_codecs/builtin_audio_decoder_factory.h" > #include "api/audio_codecs/builtin_audio_encoder_factory.h" >+#include "api/create_peerconnection_factory.h" > #include "api/jsep.h" > #include "api/mediastreaminterface.h" > 
#include "api/peerconnectioninterface.h" >+#include "api/umametrics.h" > #include "api/video_codecs/builtin_video_decoder_factory.h" > #include "api/video_codecs/builtin_video_encoder_factory.h" > #include "pc/mediasession.h" >@@ -116,6 +118,8 @@ class PeerConnectionRtpBaseTest : public testing::Test { > auto observer = absl::make_unique<MockPeerConnectionObserver>(); > auto pc = pc_factory_->CreatePeerConnection(config, nullptr, nullptr, > observer.get()); >+ EXPECT_TRUE(pc.get()); >+ observer->SetPeerConnectionInterface(pc.get()); > return absl::make_unique<PeerConnectionWrapper>(pc_factory_, pc, > std::move(observer)); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_signaling_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_signaling_unittest.cc >index 95f464b2ee249224042418d67b19cff9ba84c82b..f57f806b647f01339fbb0252cc1071ea3a242d4a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_signaling_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnection_signaling_unittest.cc >@@ -15,6 +15,7 @@ > > #include "api/audio_codecs/builtin_audio_decoder_factory.h" > #include "api/audio_codecs/builtin_audio_encoder_factory.h" >+#include "api/create_peerconnection_factory.h" > #include "api/peerconnectionproxy.h" > #include "api/video_codecs/builtin_video_decoder_factory.h" > #include "api/video_codecs/builtin_video_encoder_factory.h" >@@ -90,6 +91,7 @@ class PeerConnectionSignalingBaseTest : public ::testing::Test { > return nullptr; > } > >+ observer->SetPeerConnectionInterface(pc.get()); > return absl::make_unique<PeerConnectionWrapperForSignalingTest>( > pc_factory_, pc, std::move(observer)); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectionendtoend_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectionendtoend_unittest.cc >index 60da2d242c1aa35dcf1cddef47fe6b323fede9ff..dee253b7c71870a728c22b98a0f67b5390cc94c7 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectionendtoend_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectionendtoend_unittest.cc >@@ -11,6 +11,7 @@ > #include <memory> > > #include "absl/memory/memory.h" >+#include "absl/strings/match.h" > #include "api/audio_codecs/L16/audio_decoder_L16.h" > #include "api/audio_codecs/L16/audio_encoder_L16.h" > #include "api/audio_codecs/audio_codec_pair_id.h" >@@ -20,7 +21,6 @@ > #include "api/audio_codecs/builtin_audio_encoder_factory.h" > #include "rtc_base/gunit.h" > #include "rtc_base/logging.h" >-#include "rtc_base/stringencode.h" > #include "rtc_base/stringutils.h" > > #ifdef WEBRTC_ANDROID >@@ -31,6 +31,7 @@ > #include "pc/test/mockpeerconnectionobservers.h" > #include "test/mock_audio_decoder.h" > #include "test/mock_audio_decoder_factory.h" >+#include "test/mock_audio_encoder_factory.h" > > using testing::AtLeast; > using testing::Invoke; >@@ -300,7 +301,7 @@ CreateForwardingMockDecoderFactory( > struct AudioEncoderUnicornSparklesRainbow { > using Config = webrtc::AudioEncoderL16::Config; > static absl::optional<Config> SdpToConfig(webrtc::SdpAudioFormat format) { >- if (STR_CASE_CMP(format.name.c_str(), "UnicornSparklesRainbow") == 0) { >+ if (absl::EqualsIgnoreCase(format.name, "UnicornSparklesRainbow")) { > const webrtc::SdpAudioFormat::Parameters expected_params = { > {"num_horns", "1"}}; > EXPECT_EQ(expected_params, format.parameters); >@@ -337,7 +338,7 @@ struct AudioEncoderUnicornSparklesRainbow { > struct AudioDecoderUnicornSparklesRainbow { > using Config = webrtc::AudioDecoderL16::Config; > static absl::optional<Config> SdpToConfig(webrtc::SdpAudioFormat format) { >- if (STR_CASE_CMP(format.name.c_str(), "UnicornSparklesRainbow") == 0) { >+ if (absl::EqualsIgnoreCase(format.name, "UnicornSparklesRainbow")) { > const webrtc::SdpAudioFormat::Parameters expected_params = { > {"num_horns", "1"}}; > EXPECT_EQ(expected_params, format.parameters); >@@ -481,7 +482,7 @@ 
TEST_P(PeerConnectionEndToEndTest, CallWithCustomCodec) { > // Verifies that a DataChannel created before the negotiation can transition to > // "OPEN" and transfer data. > TEST_P(PeerConnectionEndToEndTest, CreateDataChannelBeforeNegotiate) { >- CreatePcs(webrtc::CreateBuiltinAudioEncoderFactory(), >+ CreatePcs(webrtc::MockAudioEncoderFactory::CreateEmptyFactory(), > webrtc::MockAudioDecoderFactory::CreateEmptyFactory()); > > webrtc::DataChannelInit init; >@@ -506,7 +507,7 @@ TEST_P(PeerConnectionEndToEndTest, CreateDataChannelBeforeNegotiate) { > // Verifies that a DataChannel created after the negotiation can transition to > // "OPEN" and transfer data. > TEST_P(PeerConnectionEndToEndTest, CreateDataChannelAfterNegotiate) { >- CreatePcs(webrtc::CreateBuiltinAudioEncoderFactory(), >+ CreatePcs(webrtc::MockAudioEncoderFactory::CreateEmptyFactory(), > webrtc::MockAudioDecoderFactory::CreateEmptyFactory()); > > webrtc::DataChannelInit init; >@@ -538,7 +539,7 @@ TEST_P(PeerConnectionEndToEndTest, CreateDataChannelAfterNegotiate) { > > // Verifies that a DataChannel created can transfer large messages. > TEST_P(PeerConnectionEndToEndTest, CreateDataChannelLargeTransfer) { >- CreatePcs(webrtc::CreateBuiltinAudioEncoderFactory(), >+ CreatePcs(webrtc::MockAudioEncoderFactory::CreateEmptyFactory(), > webrtc::MockAudioDecoderFactory::CreateEmptyFactory()); > > webrtc::DataChannelInit init; >@@ -572,7 +573,7 @@ TEST_P(PeerConnectionEndToEndTest, CreateDataChannelLargeTransfer) { > > // Verifies that DataChannel IDs are even/odd based on the DTLS roles. > TEST_P(PeerConnectionEndToEndTest, DataChannelIdAssignment) { >- CreatePcs(webrtc::CreateBuiltinAudioEncoderFactory(), >+ CreatePcs(webrtc::MockAudioEncoderFactory::CreateEmptyFactory(), > webrtc::MockAudioDecoderFactory::CreateEmptyFactory()); > > webrtc::DataChannelInit init; >@@ -600,7 +601,7 @@ TEST_P(PeerConnectionEndToEndTest, DataChannelIdAssignment) { > // there are multiple DataChannels. 
> TEST_P(PeerConnectionEndToEndTest, > MessageTransferBetweenTwoPairsOfDataChannels) { >- CreatePcs(webrtc::CreateBuiltinAudioEncoderFactory(), >+ CreatePcs(webrtc::MockAudioEncoderFactory::CreateEmptyFactory(), > webrtc::MockAudioDecoderFactory::CreateEmptyFactory()); > > webrtc::DataChannelInit init; >@@ -640,7 +641,7 @@ TEST_P(PeerConnectionEndToEndTest, > // channel, and the closed channel was incorrectly still assigned to the ID. > TEST_P(PeerConnectionEndToEndTest, > DataChannelFromOpenWorksAfterPreviousChannelClosed) { >- CreatePcs(webrtc::CreateBuiltinAudioEncoderFactory(), >+ CreatePcs(webrtc::MockAudioEncoderFactory::CreateEmptyFactory(), > webrtc::MockAudioDecoderFactory::CreateEmptyFactory()); > > webrtc::DataChannelInit init; >@@ -673,7 +674,7 @@ TEST_P(PeerConnectionEndToEndTest, > // closing before creating the second one. > TEST_P(PeerConnectionEndToEndTest, > DataChannelFromOpenWorksWhilePreviousChannelClosing) { >- CreatePcs(webrtc::CreateBuiltinAudioEncoderFactory(), >+ CreatePcs(webrtc::MockAudioEncoderFactory::CreateEmptyFactory(), > webrtc::MockAudioDecoderFactory::CreateEmptyFactory()); > > webrtc::DataChannelInit init; >@@ -704,7 +705,7 @@ TEST_P(PeerConnectionEndToEndTest, > // reference count), no memory access violation will occur. 
> // See: https://code.google.com/p/chromium/issues/detail?id=565048 > TEST_P(PeerConnectionEndToEndTest, CloseDataChannelRemotelyWhileNotReferenced) { >- CreatePcs(webrtc::CreateBuiltinAudioEncoderFactory(), >+ CreatePcs(webrtc::MockAudioEncoderFactory::CreateEmptyFactory(), > webrtc::MockAudioDecoderFactory::CreateEmptyFactory()); > > webrtc::DataChannelInit init; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectionfactory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectionfactory.cc >index 53f4da8516e46c8f588463ed9222f56edb202294..75261508b272d2e7e662a5f873f35af8bde538a0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectionfactory.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectionfactory.cc >@@ -15,6 +15,7 @@ > > #include "absl/memory/memory.h" > #include "api/fec_controller.h" >+#include "api/media_transport_interface.h" > #include "api/mediaconstraintsinterface.h" > #include "api/mediastreamproxy.h" > #include "api/mediastreamtrackproxy.h" >@@ -46,6 +47,7 @@ > #include "pc/videocapturertracksource.h" > #include "pc/videotrack.h" > #include "rtc_base/experiments/congestion_controller_experiment.h" >+#include "system_wrappers/include/field_trial.h" > > namespace webrtc { > >@@ -187,7 +189,9 @@ PeerConnectionFactory::PeerConnectionFactory( > std::move(dependencies.call_factory), > std::move(dependencies.event_log_factory), > std::move(dependencies.fec_controller_factory), >- std::move(dependencies.network_controller_factory)) {} >+ std::move(dependencies.network_controller_factory)) { >+ media_transport_factory_ = std::move(dependencies.media_transport_factory); >+} > > PeerConnectionFactory::~PeerConnectionFactory() { > RTC_DCHECK(signaling_thread_->IsCurrent()); >@@ -356,9 +360,12 @@ PeerConnectionFactory::CreatePeerConnection( > network_thread_); > } > if (!dependencies.allocator) { >- dependencies.allocator.reset(new cricket::BasicPortAllocator( >- default_network_manager_.get(), 
default_socket_factory_.get(), >- configuration.turn_customizer)); >+ network_thread_->Invoke<void>(RTC_FROM_HERE, [this, &configuration, >+ &dependencies]() { >+ dependencies.allocator = absl::make_unique<cricket::BasicPortAllocator>( >+ default_network_manager_.get(), default_socket_factory_.get(), >+ configuration.turn_customizer); >+ }); > } > > // TODO(zstein): Once chromium injects its own AsyncResolverFactory, set >@@ -442,7 +449,10 @@ rtc::Thread* PeerConnectionFactory::network_thread() { > > std::unique_ptr<RtcEventLog> PeerConnectionFactory::CreateRtcEventLog_w() { > RTC_DCHECK_RUN_ON(worker_thread_); >- const auto encoding_type = RtcEventLog::EncodingType::Legacy; >+ >+ auto encoding_type = RtcEventLog::EncodingType::Legacy; >+ if (field_trial::IsEnabled("WebRTC-RtcEventLogNewFormat")) >+ encoding_type = RtcEventLog::EncodingType::NewFormat; > return event_log_factory_ > ? event_log_factory_->CreateRtcEventLog(encoding_type) > : absl::make_unique<RtcEventLogNullImpl>(); >@@ -460,7 +470,8 @@ std::unique_ptr<Call> PeerConnectionFactory::CreateCall_w( > if (!channel_manager_->media_engine() || !call_factory_) { > return nullptr; > } >- call_config.audio_state = channel_manager_->media_engine()->GetAudioState(); >+ call_config.audio_state = >+ channel_manager_->media_engine()->voice().GetAudioState(); > call_config.bitrate_config.min_bitrate_bps = kMinBandwidthBps; > call_config.bitrate_config.start_bitrate_bps = kStartBandwidthBps; > call_config.bitrate_config.max_bitrate_bps = kMaxBandwidthBps; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectionfactory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectionfactory.h >index f36a4e0c044bc445bb68ecc988477491f37052b1..1c1ea94fe28a73aaccb62471eae79bc0d5437b69 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectionfactory.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectionfactory.h >@@ -15,6 +15,7 @@ > #include <memory> > #include <string> > 
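The `CreateRtcEventLog_w` hunk above replaces a hard-coded `Legacy` encoding with a choice gated on the `WebRTC-RtcEventLogNewFormat` field trial. A standalone sketch of that flag-gated-default pattern; `IsTrialEnabled` here is a hypothetical stand-in for `webrtc::field_trial::IsEnabled`, backed by a plain set instead of the real trial string:

```cpp
#include <cassert>
#include <set>
#include <string>

// Hypothetical stand-in for the field-trial registry: a trial is "enabled"
// if its name has been inserted into this set.
static std::set<std::string> g_enabled_trials;

bool IsTrialEnabled(const std::string& name) {
  return g_enabled_trials.count(name) > 0;
}

enum class EncodingType { kLegacy, kNewFormat };

// Mirrors the shape of the CreateRtcEventLog_w change: default to the
// legacy encoding unless the new-format trial is turned on.
EncodingType ChooseEncoding() {
  auto encoding = EncodingType::kLegacy;
  if (IsTrialEnabled("WebRTC-RtcEventLogNewFormat"))
    encoding = EncodingType::kNewFormat;
  return encoding;
}
```

Keeping the legacy value as the unconditional default means rollout can be reversed purely by disabling the trial, with no code change.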
>+#include "api/media_transport_interface.h" > #include "api/mediastreaminterface.h" > #include "api/peerconnectioninterface.h" > #include "media/sctp/sctptransportinternal.h" >@@ -96,6 +97,10 @@ class PeerConnectionFactory : public PeerConnectionFactoryInterface { > virtual rtc::Thread* network_thread(); > const Options& options() const { return options_; } > >+ MediaTransportFactory* media_transport_factory() { >+ return media_transport_factory_.get(); >+ } >+ > protected: > PeerConnectionFactory( > rtc::Thread* network_thread, >@@ -148,6 +153,7 @@ class PeerConnectionFactory : public PeerConnectionFactoryInterface { > injected_network_controller_factory_; > std::unique_ptr<NetworkControllerFactoryInterface> > bbr_network_controller_factory_; >+ std::unique_ptr<MediaTransportFactory> media_transport_factory_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectionfactory_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectionfactory_unittest.cc >index 84c828f0961042a5d51cbcf12f6081b76462e93a..24441432d7a38b88409713406aa39fa13a5520af 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectionfactory_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectionfactory_unittest.cc >@@ -15,6 +15,7 @@ > > #include "api/audio_codecs/builtin_audio_decoder_factory.h" > #include "api/audio_codecs/builtin_audio_encoder_factory.h" >+#include "api/create_peerconnection_factory.h" > #include "api/mediastreaminterface.h" > #include "api/video_codecs/builtin_video_decoder_factory.h" > #include "api/video_codecs/builtin_video_encoder_factory.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectioninterface_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectioninterface_unittest.cc >index 719fdbdd7f41cc80d8bb3b808219d69d3a0a5536..dd5d0d16a91986b0ec45f24b7df030e11ea40b30 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectioninterface_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectioninterface_unittest.cc >@@ -17,6 +17,7 @@ > #include "absl/memory/memory.h" > #include "api/audio_codecs/builtin_audio_decoder_factory.h" > #include "api/audio_codecs/builtin_audio_encoder_factory.h" >+#include "api/create_peerconnection_factory.h" > #include "api/jsepsessiondescription.h" > #include "api/mediastreaminterface.h" > #include "api/peerconnectioninterface.h" >@@ -1397,6 +1398,8 @@ TEST_P(PeerConnectionInterfaceTest, > rtc::scoped_refptr<PeerConnectionInterface> pc( > pc_factory->CreatePeerConnection(config, std::move(port_allocator), > nullptr, &observer_)); >+ EXPECT_TRUE(pc.get()); >+ observer_.SetPeerConnectionInterface(pc.get()); > > // Now validate that the config fields set above were applied to the > // PortAllocator, as flags or otherwise. >@@ -1424,15 +1427,22 @@ TEST_P(PeerConnectionInterfaceTest, GetConfigurationAfterCreatePeerConnection) { > // Check that GetConfiguration returns the last configuration passed into > // SetConfiguration. 
> TEST_P(PeerConnectionInterfaceTest, GetConfigurationAfterSetConfiguration) { >- CreatePeerConnection(); >+ PeerConnectionInterface::RTCConfiguration starting_config; >+ starting_config.bundle_policy = >+ webrtc::PeerConnection::kBundlePolicyMaxBundle; >+ CreatePeerConnection(starting_config); > > PeerConnectionInterface::RTCConfiguration config = pc_->GetConfiguration(); > config.type = PeerConnectionInterface::kRelay; >+ config.use_media_transport = true; >+ config.use_media_transport_for_data_channels = true; > EXPECT_TRUE(pc_->SetConfiguration(config)); > > PeerConnectionInterface::RTCConfiguration returned_config = > pc_->GetConfiguration(); > EXPECT_EQ(PeerConnectionInterface::kRelay, returned_config.type); >+ EXPECT_TRUE(returned_config.use_media_transport); >+ EXPECT_TRUE(returned_config.use_media_transport_for_data_channels); > } > > TEST_P(PeerConnectionInterfaceTest, SetConfigurationFailsAfterClose) { >@@ -3938,6 +3948,21 @@ TEST_P(PeerConnectionInterfaceTest, > EXPECT_FALSE(DoSetLocalDescription(std::move(offer))); > } > >+TEST_P(PeerConnectionInterfaceTest, ExtmapAllowMixedIsConfigurable) { >+ RTCConfiguration config; >+ // Default behavior is false. >+ CreatePeerConnection(config); >+ std::unique_ptr<SessionDescriptionInterface> offer; >+ ASSERT_TRUE(DoCreateOffer(&offer, nullptr)); >+ EXPECT_FALSE(offer->description()->extmap_allow_mixed()); >+ // Possible to set to true. 
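The `GetConfigurationAfterSetConfiguration` change above extends the round-trip check to the new media-transport flags: whatever `SetConfiguration` accepts, `GetConfiguration` must hand back. A standalone sketch of that contract with hypothetical minimal types (the real `RTCConfiguration` has many more fields and `SetConfiguration` performs validation):

```cpp
#include <cassert>

enum class IceTransportsType { kAll, kRelay };

// Hypothetical pared-down configuration carrying just the fields the test
// round-trips.
struct RTCConfiguration {
  IceTransportsType type = IceTransportsType::kAll;
  bool use_media_transport = false;
  bool use_media_transport_for_data_channels = false;
};

class PeerConnection {
 public:
  bool SetConfiguration(const RTCConfiguration& config) {
    config_ = config;  // The real implementation validates before storing.
    return true;
  }
  RTCConfiguration GetConfiguration() const { return config_; }

 private:
  RTCConfiguration config_;
};
```

Starting the test from `pc_->GetConfiguration()` rather than a fresh struct, as the hunk does, ensures only the intentionally changed fields differ from the connection's current state.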
>+ config.offer_extmap_allow_mixed = true; >+ CreatePeerConnection(config); >+ offer.release(); >+ ASSERT_TRUE(DoCreateOffer(&offer, nullptr)); >+ EXPECT_TRUE(offer->description()->extmap_allow_mixed()); >+} >+ > INSTANTIATE_TEST_CASE_P(PeerConnectionInterfaceTest, > PeerConnectionInterfaceTest, > Values(SdpSemantics::kPlanB, >@@ -3954,6 +3979,7 @@ class PeerConnectionMediaConfigTest : public testing::Test { > rtc::scoped_refptr<PeerConnectionInterface> pc( > pcf_->CreatePeerConnection(config, nullptr, nullptr, &observer_)); > EXPECT_TRUE(pc.get()); >+ observer_.SetPeerConnectionInterface(pc.get()); > return pc->GetConfiguration().media_config; > } > >@@ -3961,6 +3987,17 @@ class PeerConnectionMediaConfigTest : public testing::Test { > MockPeerConnectionObserver observer_; > }; > >+// This sanity check validates the test infrastructure itself. >+TEST_F(PeerConnectionMediaConfigTest, TestCreateAndClose) { >+ PeerConnectionInterface::RTCConfiguration config; >+ rtc::scoped_refptr<PeerConnectionInterface> pc( >+ pcf_->CreatePeerConnection(config, nullptr, nullptr, &observer_)); >+ EXPECT_TRUE(pc.get()); >+ observer_.SetPeerConnectionInterface(pc.get()); // Required. >+ pc->Close(); // No abort -> ok. >+ SUCCEED(); >+} >+ > // This test verifies the default behaviour with no constraints and a > // default RTCConfiguration. 
> TEST_F(PeerConnectionMediaConfigTest, TestDefaults) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtcpmuxfilter_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtcpmuxfilter_unittest.cc >index 555ab5c4681054c40a71e1dce6c34aa970ce124a..a6ac4478fcd5ae75ba5716c036f90003fe08aa0b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtcpmuxfilter_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtcpmuxfilter_unittest.cc >@@ -9,7 +9,6 @@ > */ > > #include "pc/rtcpmuxfilter.h" >-#include "media/base/testutils.h" > #include "rtc_base/gunit.h" > > TEST(RtcpMuxFilterTest, IsActiveSender) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtcstats_integrationtest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtcstats_integrationtest.cc >index 3c07e38b5a495aa93e7db9ffd1ef2f5594083ba8..49084de4d20717c22ddc891b7ea463204cb63b4c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtcstats_integrationtest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtcstats_integrationtest.cc >@@ -604,11 +604,18 @@ class RTCStatsReportVerifier { > media_stream_track.concealed_samples); > verifier.TestMemberIsNonNegative<uint64_t>( > media_stream_track.concealment_events); >+ verifier.TestMemberIsNonNegative<uint64_t>( >+ media_stream_track.jitter_buffer_flushes); >+ verifier.TestMemberIsNonNegative<uint64_t>( >+ media_stream_track.delayed_packet_outage_samples); > } else { > verifier.TestMemberIsUndefined(media_stream_track.jitter_buffer_delay); > verifier.TestMemberIsUndefined(media_stream_track.total_samples_received); > verifier.TestMemberIsUndefined(media_stream_track.concealed_samples); > verifier.TestMemberIsUndefined(media_stream_track.concealment_events); >+ verifier.TestMemberIsUndefined(media_stream_track.jitter_buffer_flushes); >+ verifier.TestMemberIsUndefined( >+ media_stream_track.delayed_packet_outage_samples); > } > return verifier.ExpectAllMembersSuccessfullyTested(); > } >@@ -806,7 +813,11 @@ 
TEST_F(RTCStatsIntegrationTest, GetStatsWithInvalidReceiverSelector) { > EXPECT_FALSE(report->size()); > } > >-TEST_F(RTCStatsIntegrationTest, GetsStatsWhileDestroyingPeerConnections) { >+// TODO(bugs.webrtc.org/10041) For now this is equivalent to the following >+// test GetsStatsWhileClosingPeerConnection, because pc() is closed by >+// PeerConnectionTestWrapper. See: bugs.webrtc.org/9847 >+TEST_F(RTCStatsIntegrationTest, >+ DISABLED_GetStatsWhileDestroyingPeerConnection) { > StartCall(); > > rtc::scoped_refptr<RTCStatsObtainer> stats_obtainer = >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtcstatscollector.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtcstatscollector.cc >index d8839430ebef8423e2f98f583443c1e45bcf92d7..d48ecc01f359e51b900259828c085c8b5e1617f2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtcstatscollector.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtcstatscollector.cc >@@ -454,6 +454,10 @@ ProduceMediaStreamTrackStatsFromVoiceReceiverInfo( > audio_track_stats->concealed_samples = voice_receiver_info.concealed_samples; > audio_track_stats->concealment_events = > voice_receiver_info.concealment_events; >+ audio_track_stats->jitter_buffer_flushes = >+ voice_receiver_info.jitter_buffer_flushes; >+ audio_track_stats->delayed_packet_outage_samples = >+ voice_receiver_info.delayed_packet_outage_samples; > return audio_track_stats; > } > >@@ -1388,7 +1392,7 @@ RTCStatsCollector::PrepareTransportCertificateStats_n( > rtc::scoped_refptr<rtc::RTCCertificate> local_certificate; > if (pc_->GetLocalCertificate(transport_name, &local_certificate)) { > certificate_stats_pair.local = >- local_certificate->ssl_cert_chain().GetStats(); >+ local_certificate->GetSSLCertificateChain().GetStats(); > } > > std::unique_ptr<rtc::SSLCertChain> remote_cert_chain = >@@ -1426,7 +1430,7 @@ RTCStatsCollector::PrepareTransceiverStatsInfos_s() const { > stats.transceiver = transceiver->internal(); > stats.media_type = media_type; > >- 
cricket::BaseChannel* channel = transceiver->internal()->channel(); >+ cricket::ChannelInterface* channel = transceiver->internal()->channel(); > if (!channel) { > // The remaining fields require a BaseChannel. > continue; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtcstatscollector_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtcstatscollector_unittest.cc >index 7404d492ef37dd1aadcb7f423bd7b886d65adcb3..adca9ef6c4657ecddcf4f58d71e35d060c639505 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtcstatscollector_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtcstatscollector_unittest.cc >@@ -143,10 +143,10 @@ std::unique_ptr<CertificateInfo> CreateFakeCertificateAndInfoFromDers( > } > // Fingerprints for the whole certificate chain, starting with leaf > // certificate. >- const rtc::SSLCertChain& chain = info->certificate->ssl_cert_chain(); >+ const rtc::SSLCertChain& chain = info->certificate->GetSSLCertificateChain(); > std::unique_ptr<rtc::SSLFingerprint> fp; > for (size_t i = 0; i < chain.GetSize(); i++) { >- fp.reset(rtc::SSLFingerprint::Create("sha-1", &chain.Get(i))); >+ fp = rtc::SSLFingerprint::Create("sha-1", chain.Get(i)); > EXPECT_TRUE(fp); > info->fingerprints.push_back(fp->GetRfc4572Fingerprint()); > } >@@ -704,7 +704,7 @@ TEST_F(RTCStatsCollectorTest, CollectRTCCertificateStatsSingle) { > std::vector<std::string>({"(remote) single certificate"})); > pc_->SetRemoteCertChain( > kTransportName, >- remote_certinfo->certificate->ssl_cert_chain().UniqueCopy()); >+ remote_certinfo->certificate->GetSSLCertificateChain().Clone()); > > rtc::scoped_refptr<const RTCStatsReport> report = stats_->GetStatsReport(); > >@@ -818,7 +818,7 @@ TEST_F(RTCStatsCollectorTest, CollectRTCCertificateStatsMultiple) { > std::vector<std::string>({"(remote) audio"})); > pc_->SetRemoteCertChain( > kAudioTransport, >- audio_remote_certinfo->certificate->ssl_cert_chain().UniqueCopy()); >+ 
audio_remote_certinfo->certificate->GetSSLCertificateChain().Clone()); > > pc_->AddVideoChannel("video", kVideoTransport); > std::unique_ptr<CertificateInfo> video_local_certinfo = >@@ -830,7 +830,7 @@ TEST_F(RTCStatsCollectorTest, CollectRTCCertificateStatsMultiple) { > std::vector<std::string>({"(remote) video"})); > pc_->SetRemoteCertChain( > kVideoTransport, >- video_remote_certinfo->certificate->ssl_cert_chain().UniqueCopy()); >+ video_remote_certinfo->certificate->GetSSLCertificateChain().Clone()); > > rtc::scoped_refptr<const RTCStatsReport> report = stats_->GetStatsReport(); > ExpectReportContainsCertificateInfo(report, *audio_local_certinfo); >@@ -855,7 +855,7 @@ TEST_F(RTCStatsCollectorTest, CollectRTCCertificateStatsChain) { > "(remote) chain"}); > pc_->SetRemoteCertChain( > kTransportName, >- remote_certinfo->certificate->ssl_cert_chain().UniqueCopy()); >+ remote_certinfo->certificate->GetSSLCertificateChain().Clone()); > > rtc::scoped_refptr<const RTCStatsReport> report = stats_->GetStatsReport(); > ExpectReportContainsCertificateInfo(report, *local_certinfo); >@@ -1426,6 +1426,8 @@ TEST_F(RTCStatsCollectorTest, > voice_receiver_info.concealed_samples = 123; > voice_receiver_info.concealment_events = 12; > voice_receiver_info.jitter_buffer_delay_seconds = 3456; >+ voice_receiver_info.jitter_buffer_flushes = 7; >+ voice_receiver_info.delayed_packet_outage_samples = 15; > > stats_->CreateMockRtpSendersReceiversAndChannels( > {}, {std::make_pair(remote_audio_track.get(), voice_receiver_info)}, {}, >@@ -1459,6 +1461,8 @@ TEST_F(RTCStatsCollectorTest, > expected_remote_audio_track.concealed_samples = 123; > expected_remote_audio_track.concealment_events = 12; > expected_remote_audio_track.jitter_buffer_delay = 3456; >+ expected_remote_audio_track.jitter_buffer_flushes = 7; >+ expected_remote_audio_track.delayed_packet_outage_samples = 15; > ASSERT_TRUE(report->Get(expected_remote_audio_track.id())); > EXPECT_EQ(expected_remote_audio_track, > 
report->Get(expected_remote_audio_track.id()) >@@ -1957,7 +1961,7 @@ TEST_F(RTCStatsCollectorTest, CollectRTCTransportStats) { > {"(remote) local", "(remote) chain"}); > pc_->SetRemoteCertChain( > kTransportName, >- remote_certinfo->certificate->ssl_cert_chain().UniqueCopy()); >+ remote_certinfo->certificate->GetSSLCertificateChain().Clone()); > > report = stats_->GetFreshStatsReport(); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpparametersconversion.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpparametersconversion.cc >index e519b9b06de6c33a026d4959e9b73ce6d4e2d790..865685d73c4324fb8f2f61fea06847db9c656328 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpparametersconversion.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpparametersconversion.cc >@@ -396,6 +396,7 @@ RtpCapabilities ToRtpCapabilities( > bool have_red = false; > bool have_ulpfec = false; > bool have_flexfec = false; >+ bool have_rtx = false; > for (const C& cricket_codec : cricket_codecs) { > if (cricket_codec.name == cricket::kRedCodecName) { > have_red = true; >@@ -403,8 +404,19 @@ RtpCapabilities ToRtpCapabilities( > have_ulpfec = true; > } else if (cricket_codec.name == cricket::kFlexfecCodecName) { > have_flexfec = true; >+ } else if (cricket_codec.name == cricket::kRtxCodecName) { >+ if (have_rtx) { >+ // There should only be one RTX codec entry >+ continue; >+ } >+ have_rtx = true; >+ } >+ auto codec_capability = ToRtpCodecCapability(cricket_codec); >+ if (cricket_codec.name == cricket::kRtxCodecName) { >+ // RTX codec should not have any parameter >+ codec_capability.parameters.clear(); > } >- capabilities.codecs.push_back(ToRtpCodecCapability(cricket_codec)); >+ capabilities.codecs.push_back(codec_capability); > } > for (const RtpExtension& cricket_extension : cricket_extensions) { > capabilities.header_extensions.emplace_back(cricket_extension.uri, >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpparametersconversion_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpparametersconversion_unittest.cc >index 257d26ebb50c3b8e6f25c99708a66137bf5c30c1..f4f1b9270c6c96afed9994780e9892b1e7d92d60 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpparametersconversion_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpparametersconversion_unittest.cc >@@ -574,11 +574,23 @@ TEST(RtpParametersConversionTest, ToRtpCapabilities) { > flexfec.id = 102; > flexfec.clockrate = 90000; > >+ cricket::VideoCodec rtx; >+ rtx.name = "rtx"; >+ rtx.id = 104; >+ rtx.params.insert({"apt", "101"}); >+ >+ cricket::VideoCodec rtx2; >+ rtx2.name = "rtx"; >+ rtx2.id = 105; >+ rtx2.params.insert({"apt", "109"}); >+ > RtpCapabilities capabilities = ToRtpCapabilities<cricket::VideoCodec>( >- {vp8, ulpfec}, {{"uri", 1}, {"uri2", 3}}); >- ASSERT_EQ(2u, capabilities.codecs.size()); >+ {vp8, ulpfec, rtx, rtx2}, {{"uri", 1}, {"uri2", 3}}); >+ ASSERT_EQ(3u, capabilities.codecs.size()); > EXPECT_EQ("VP8", capabilities.codecs[0].name); > EXPECT_EQ("ulpfec", capabilities.codecs[1].name); >+ EXPECT_EQ("rtx", capabilities.codecs[2].name); >+ EXPECT_EQ(0u, capabilities.codecs[2].parameters.size()); > ASSERT_EQ(2u, capabilities.header_extensions.size()); > EXPECT_EQ("uri", capabilities.header_extensions[0].uri); > EXPECT_EQ(1, capabilities.header_extensions[0].preferred_id); >@@ -587,8 +599,8 @@ TEST(RtpParametersConversionTest, ToRtpCapabilities) { > EXPECT_EQ(0u, capabilities.fec.size()); > > capabilities = ToRtpCapabilities<cricket::VideoCodec>( >- {vp8, red, ulpfec}, cricket::RtpHeaderExtensions()); >- EXPECT_EQ(3u, capabilities.codecs.size()); >+ {vp8, red, ulpfec, rtx}, cricket::RtpHeaderExtensions()); >+ EXPECT_EQ(4u, capabilities.codecs.size()); > EXPECT_EQ(2u, capabilities.fec.size()); > EXPECT_NE(capabilities.fec.end(), > std::find(capabilities.fec.begin(), capabilities.fec.end(), >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpreceiver.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpreceiver.cc >index d4a1f62f83b84cca670faa0be58291e0c2a3bc4f..1916a7367032c4bb1f81a6510d2eef11c45a9279 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpreceiver.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpreceiver.cc >@@ -50,9 +50,10 @@ void MaybeAttachFrameDecryptorToMediaChannel( > const absl::optional<uint32_t>& ssrc, > rtc::Thread* worker_thread, > rtc::scoped_refptr<webrtc::FrameDecryptorInterface> frame_decryptor, >- cricket::MediaChannel* media_channel) { >- if (media_channel && ssrc.has_value()) { >- return worker_thread->Invoke<void>(RTC_FROM_HERE, [&] { >+ cricket::MediaChannel* media_channel, >+ bool stopped) { >+ if (media_channel && frame_decryptor && ssrc.has_value() && !stopped) { >+ worker_thread->Invoke<void>(RTC_FROM_HERE, [&] { > media_channel->SetFrameDecryptor(*ssrc, frame_decryptor); > }); > } >@@ -156,9 +157,12 @@ bool AudioRtpReceiver::SetParameters(const RtpParameters& parameters) { > void AudioRtpReceiver::SetFrameDecryptor( > rtc::scoped_refptr<FrameDecryptorInterface> frame_decryptor) { > frame_decryptor_ = std::move(frame_decryptor); >- // Attach the frame decryptor to the media channel if it exists. >- MaybeAttachFrameDecryptorToMediaChannel(ssrc_, worker_thread_, >- frame_decryptor_, media_channel_); >+ // Special Case: Set the frame decryptor to any value on any existing channel. >+ if (media_channel_ && ssrc_.has_value() && !stopped_) { >+ worker_thread_->Invoke<void>(RTC_FROM_HERE, [&] { >+ media_channel_->SetFrameDecryptor(*ssrc_, frame_decryptor_); >+ }); >+ } > } > > rtc::scoped_refptr<FrameDecryptorInterface> >@@ -252,8 +256,8 @@ void AudioRtpReceiver::Reconfigure() { > RTC_NOTREACHED(); > } > // Reattach the frame decryptor if we were reconfigured. 
>- MaybeAttachFrameDecryptorToMediaChannel(ssrc_, worker_thread_, >- frame_decryptor_, media_channel_); >+ MaybeAttachFrameDecryptorToMediaChannel( >+ ssrc_, worker_thread_, frame_decryptor_, media_channel_, stopped_); > } > > void AudioRtpReceiver::SetObserver(RtpReceiverObserverInterface* observer) { >@@ -264,9 +268,10 @@ void AudioRtpReceiver::SetObserver(RtpReceiverObserverInterface* observer) { > } > } > >-void AudioRtpReceiver::SetVoiceMediaChannel( >- cricket::VoiceMediaChannel* voice_media_channel) { >- media_channel_ = voice_media_channel; >+void AudioRtpReceiver::SetMediaChannel(cricket::MediaChannel* media_channel) { >+ RTC_DCHECK(media_channel == nullptr || >+ media_channel->media_type() == media_type()); >+ media_channel_ = static_cast<cricket::VoiceMediaChannel*>(media_channel); > } > > void AudioRtpReceiver::NotifyFirstPacketReceived() { >@@ -347,9 +352,12 @@ bool VideoRtpReceiver::SetParameters(const RtpParameters& parameters) { > void VideoRtpReceiver::SetFrameDecryptor( > rtc::scoped_refptr<FrameDecryptorInterface> frame_decryptor) { > frame_decryptor_ = std::move(frame_decryptor); >- // Attach the new frame decryptor the media channel if it exists yet. >- MaybeAttachFrameDecryptorToMediaChannel(ssrc_, worker_thread_, >- frame_decryptor_, media_channel_); >+ // Special Case: Set the frame decryptor to any value on any existing channel. >+ if (media_channel_ && ssrc_.has_value() && !stopped_) { >+ worker_thread_->Invoke<void>(RTC_FROM_HERE, [&] { >+ media_channel_->SetFrameDecryptor(*ssrc_, frame_decryptor_); >+ }); >+ } > } > > rtc::scoped_refptr<FrameDecryptorInterface> >@@ -387,8 +395,8 @@ void VideoRtpReceiver::SetupMediaChannel(uint32_t ssrc) { > ssrc_ = ssrc; > SetSink(source_->sink()); > // Attach any existing frame decryptor to the media channel. 
>- MaybeAttachFrameDecryptorToMediaChannel(ssrc_, worker_thread_, >- frame_decryptor_, media_channel_); >+ MaybeAttachFrameDecryptorToMediaChannel( >+ ssrc_, worker_thread_, frame_decryptor_, media_channel_, stopped_); > } > > void VideoRtpReceiver::set_stream_ids(std::vector<std::string> stream_ids) { >@@ -436,9 +444,10 @@ void VideoRtpReceiver::SetObserver(RtpReceiverObserverInterface* observer) { > } > } > >-void VideoRtpReceiver::SetVideoMediaChannel( >- cricket::VideoMediaChannel* video_media_channel) { >- media_channel_ = video_media_channel; >+void VideoRtpReceiver::SetMediaChannel(cricket::MediaChannel* media_channel) { >+ RTC_DCHECK(media_channel == nullptr || >+ media_channel->media_type() == media_type()); >+ media_channel_ = static_cast<cricket::VideoMediaChannel*>(media_channel); > } > > void VideoRtpReceiver::NotifyFirstPacketReceived() { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpreceiver.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpreceiver.h >index 85be8c258bfba2f5b092eff1bfcd52c4122c3f69..06331a66c4dd95ca2e5e1588462bb1efefd8727c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpreceiver.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpreceiver.h >@@ -34,14 +34,10 @@ class RtpReceiverInternal : public RtpReceiverInterface { > virtual void Stop() = 0; > > // Sets the underlying MediaEngine channel associated with this RtpSender. >- // SetVoiceMediaChannel should be used for audio RtpSenders and >- // SetVideoMediaChannel should be used for video RtpSenders. Must call the >- // appropriate SetXxxMediaChannel(nullptr) before the media channel is >- // destroyed. >- virtual void SetVoiceMediaChannel( >- cricket::VoiceMediaChannel* voice_media_channel) = 0; >- virtual void SetVideoMediaChannel( >- cricket::VideoMediaChannel* video_media_channel) = 0; >+ // A VoiceMediaChannel should be used for audio RtpSenders and >+ // a VideoMediaChannel should be used for video RtpSenders. 
>+ // Must call SetMediaChannel(nullptr) before the media channel is destroyed. >+ virtual void SetMediaChannel(cricket::MediaChannel* media_channel) = 0; > > // Configures the RtpReceiver with the underlying media channel, with the > // given SSRC as the stream identifier. If |ssrc| is 0, the receiver will >@@ -130,13 +126,8 @@ class AudioRtpReceiver : public ObserverInterface, > void SetStreams(const std::vector<rtc::scoped_refptr<MediaStreamInterface>>& > streams) override; > void SetObserver(RtpReceiverObserverInterface* observer) override; >- void SetVoiceMediaChannel( >- cricket::VoiceMediaChannel* voice_media_channel) override; > >- void SetVideoMediaChannel( >- cricket::VideoMediaChannel* video_media_channel) override { >- RTC_NOTREACHED(); >- } >+ void SetMediaChannel(cricket::MediaChannel* media_channel) override; > > std::vector<RtpSource> GetSources() const override; > int AttachmentId() const override { return attachment_id_; } >@@ -217,13 +208,7 @@ class VideoRtpReceiver : public rtc::RefCountedObject<RtpReceiverInternal> { > > void SetObserver(RtpReceiverObserverInterface* observer) override; > >- void SetVoiceMediaChannel( >- cricket::VoiceMediaChannel* voice_media_channel) override { >- RTC_NOTREACHED(); >- } >- >- void SetVideoMediaChannel( >- cricket::VideoMediaChannel* video_media_channel) override; >+ void SetMediaChannel(cricket::MediaChannel* media_channel) override; > > int AttachmentId() const override { return attachment_id_; } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpsender.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpsender.cc >index 3bb90f24da14b914d303f5ba8ec3c637cd4b52bd..76fdca6e94209483ea7673251dda42d5e9c6f88f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpsender.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpsender.cc >@@ -58,7 +58,8 @@ bool UnimplementedRtpEncodingParameterHasValue( > // layer. 
> bool PerSenderRtpEncodingParameterHasValue( > const RtpEncodingParameters& encoding_params) { >- if (encoding_params.bitrate_priority != kDefaultBitratePriority) { >+ if (encoding_params.bitrate_priority != kDefaultBitratePriority || >+ encoding_params.network_priority != kDefaultBitratePriority) { > return true; > } > return false; >@@ -68,13 +69,14 @@ bool PerSenderRtpEncodingParameterHasValue( > // correct worker thread only if both the media channel exists and a ssrc has > // been allocated to the stream. > void MaybeAttachFrameEncryptorToMediaChannel( >- const absl::optional<uint32_t> ssrc, >+ const uint32_t ssrc, > rtc::Thread* worker_thread, > rtc::scoped_refptr<webrtc::FrameEncryptorInterface> frame_encryptor, >- cricket::MediaChannel* media_channel) { >- if (media_channel && ssrc.has_value()) { >- return worker_thread->Invoke<void>(RTC_FROM_HERE, [&] { >- media_channel->SetFrameEncryptor(*ssrc, frame_encryptor); >+ cricket::MediaChannel* media_channel, >+ bool stopped) { >+ if (media_channel && frame_encryptor && ssrc && !stopped) { >+ worker_thread->Invoke<void>(RTC_FROM_HERE, [&] { >+ media_channel->SetFrameEncryptor(ssrc, frame_encryptor); > }); > } > } >@@ -305,8 +307,12 @@ rtc::scoped_refptr<DtmfSenderInterface> AudioRtpSender::GetDtmfSender() const { > void AudioRtpSender::SetFrameEncryptor( > rtc::scoped_refptr<FrameEncryptorInterface> frame_encryptor) { > frame_encryptor_ = std::move(frame_encryptor); >- MaybeAttachFrameEncryptorToMediaChannel(ssrc_, worker_thread_, >- frame_encryptor_, media_channel_); >+ // Special Case: Set the frame encryptor to any value on any existing channel. 
>+ if (media_channel_ && ssrc_ && !stopped_) { >+ worker_thread_->Invoke<void>(RTC_FROM_HERE, [&] { >+ media_channel_->SetFrameEncryptor(ssrc_, frame_encryptor_); >+ }); >+ } > } > > rtc::scoped_refptr<FrameEncryptorInterface> AudioRtpSender::GetFrameEncryptor() >@@ -356,8 +362,8 @@ void AudioRtpSender::SetSsrc(uint32_t ssrc) { > }); > } > // Each time there is an ssrc update. >- MaybeAttachFrameEncryptorToMediaChannel(ssrc_, worker_thread_, >- frame_encryptor_, media_channel_); >+ MaybeAttachFrameEncryptorToMediaChannel( >+ ssrc_, worker_thread_, frame_encryptor_, media_channel_, stopped_); > } > > void AudioRtpSender::Stop() { >@@ -380,9 +386,10 @@ void AudioRtpSender::Stop() { > stopped_ = true; > } > >-void AudioRtpSender::SetVoiceMediaChannel( >- cricket::VoiceMediaChannel* voice_media_channel) { >- media_channel_ = voice_media_channel; >+void AudioRtpSender::SetMediaChannel(cricket::MediaChannel* media_channel) { >+ RTC_DCHECK(media_channel == nullptr || >+ media_channel->media_type() == media_type()); >+ media_channel_ = static_cast<cricket::VoiceMediaChannel*>(media_channel); > } > > void AudioRtpSender::SetAudioSend() { >@@ -399,9 +406,7 @@ void AudioRtpSender::SetAudioSend() { > // options since it is also applied to all streams/channels, local or remote. > if (track_->enabled() && track_->GetSource() && > !track_->GetSource()->remote()) { >- // TODO(xians): Remove this static_cast since we should be able to connect >- // a remote audio track to a peer connection. 
>- options = static_cast<LocalAudioSource*>(track_->GetSource())->options(); >+ options = track_->GetSource()->options(); > } > #endif > >@@ -557,8 +562,12 @@ rtc::scoped_refptr<DtmfSenderInterface> VideoRtpSender::GetDtmfSender() const { > void VideoRtpSender::SetFrameEncryptor( > rtc::scoped_refptr<FrameEncryptorInterface> frame_encryptor) { > frame_encryptor_ = std::move(frame_encryptor); >- MaybeAttachFrameEncryptorToMediaChannel(ssrc_, worker_thread_, >- frame_encryptor_, media_channel_); >+ // Special Case: Set the frame encryptor to any value on any existing channel. >+ if (media_channel_ && ssrc_ && !stopped_) { >+ worker_thread_->Invoke<void>(RTC_FROM_HERE, [&] { >+ media_channel_->SetFrameEncryptor(ssrc_, frame_encryptor_); >+ }); >+ } > } > > rtc::scoped_refptr<FrameEncryptorInterface> VideoRtpSender::GetFrameEncryptor() >@@ -601,8 +610,8 @@ void VideoRtpSender::SetSsrc(uint32_t ssrc) { > init_parameters_.encodings.clear(); > }); > } >- MaybeAttachFrameEncryptorToMediaChannel(ssrc_, worker_thread_, >- frame_encryptor_, media_channel_); >+ MaybeAttachFrameEncryptorToMediaChannel( >+ ssrc_, worker_thread_, frame_encryptor_, media_channel_, stopped_); > } > > void VideoRtpSender::Stop() { >@@ -621,9 +630,10 @@ void VideoRtpSender::Stop() { > stopped_ = true; > } > >-void VideoRtpSender::SetVideoMediaChannel( >- cricket::VideoMediaChannel* video_media_channel) { >- media_channel_ = video_media_channel; >+void VideoRtpSender::SetMediaChannel(cricket::MediaChannel* media_channel) { >+ RTC_DCHECK(media_channel == nullptr || >+ media_channel->media_type() == media_type()); >+ media_channel_ = static_cast<cricket::VideoMediaChannel*>(media_channel); > } > > void VideoRtpSender::SetVideoSend() { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpsender.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpsender.h >index 2a8289f417eac94e2483a98be7f78b6bbd7f827c..07fae6b9547c494a23ff3d758e2e3a62aeffc8d0 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpsender.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpsender.h >@@ -36,14 +36,10 @@ bool UnimplementedRtpParameterHasValue(const RtpParameters& parameters); > class RtpSenderInternal : public RtpSenderInterface { > public: > // Sets the underlying MediaEngine channel associated with this RtpSender. >- // SetVoiceMediaChannel should be used for audio RtpSenders and >- // SetVideoMediaChannel should be used for video RtpSenders. Must call the >- // appropriate SetXxxMediaChannel(nullptr) before the media channel is >- // destroyed. >- virtual void SetVoiceMediaChannel( >- cricket::VoiceMediaChannel* voice_media_channel) = 0; >- virtual void SetVideoMediaChannel( >- cricket::VideoMediaChannel* video_media_channel) = 0; >+ // A VoiceMediaChannel should be used for audio RtpSenders and >+ // a VideoMediaChannel should be used for video RtpSenders. >+ // Must call SetMediaChannel(nullptr) before the media channel is destroyed. >+ virtual void SetMediaChannel(cricket::MediaChannel* media_channel) = 0; > > // Used to set the SSRC of the sender, once a local description has been set. 
> // If |ssrc| is 0, this indiates that the sender should disconnect from the >@@ -156,13 +152,7 @@ class AudioRtpSender : public DtmfProviderInterface, > > int AttachmentId() const override { return attachment_id_; } > >- void SetVoiceMediaChannel( >- cricket::VoiceMediaChannel* voice_media_channel) override; >- >- void SetVideoMediaChannel( >- cricket::VideoMediaChannel* video_media_channel) override { >- RTC_NOTREACHED(); >- } >+ void SetMediaChannel(cricket::MediaChannel* media_channel) override; > > private: > // TODO(nisse): Since SSRC == 0 is technically valid, figure out >@@ -253,13 +243,7 @@ class VideoRtpSender : public ObserverInterface, > void Stop() override; > int AttachmentId() const override { return attachment_id_; } > >- void SetVoiceMediaChannel( >- cricket::VoiceMediaChannel* voice_media_channel) override { >- RTC_NOTREACHED(); >- } >- >- void SetVideoMediaChannel( >- cricket::VideoMediaChannel* video_media_channel) override; >+ void SetMediaChannel(cricket::MediaChannel* media_channel) override; > > private: > bool can_send_track() const { return track_ && ssrc_; } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpsenderreceiver_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpsenderreceiver_unittest.cc >index 07ff6a3865446474a2186e2b691e8ffc9ca90344..3be69bafc9c01f5d6ef4d42f071882a23e23a66a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpsenderreceiver_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpsenderreceiver_unittest.cc >@@ -80,12 +80,12 @@ class RtpSenderReceiverTest : public testing::Test, > > voice_channel_ = channel_manager_.CreateVoiceChannel( > &fake_call_, cricket::MediaConfig(), rtp_transport_.get(), >- rtc::Thread::Current(), cricket::CN_AUDIO, srtp_required, >- rtc::CryptoOptions(), cricket::AudioOptions()); >+ /*media_transport=*/nullptr, rtc::Thread::Current(), cricket::CN_AUDIO, >+ srtp_required, webrtc::CryptoOptions(), cricket::AudioOptions()); > video_channel_ = 
channel_manager_.CreateVideoChannel( > &fake_call_, cricket::MediaConfig(), rtp_transport_.get(), > rtc::Thread::Current(), cricket::CN_VIDEO, srtp_required, >- rtc::CryptoOptions(), cricket::VideoOptions()); >+ webrtc::CryptoOptions(), cricket::VideoOptions()); > voice_channel_->Enable(true); > video_channel_->Enable(true); > voice_media_channel_ = media_engine_->GetVoiceChannel(0); >@@ -154,7 +154,7 @@ class RtpSenderReceiverTest : public testing::Test, > new AudioRtpSender(worker_thread_, audio_track_->id(), nullptr); > ASSERT_TRUE(audio_rtp_sender_->SetTrack(audio_track_)); > audio_rtp_sender_->set_stream_ids({local_stream_->id()}); >- audio_rtp_sender_->SetVoiceMediaChannel(voice_media_channel_); >+ audio_rtp_sender_->SetMediaChannel(voice_media_channel_); > audio_rtp_sender_->SetSsrc(kAudioSsrc); > audio_rtp_sender_->GetOnDestroyedSignal()->connect( > this, &RtpSenderReceiverTest::OnAudioSenderDestroyed); >@@ -163,7 +163,7 @@ class RtpSenderReceiverTest : public testing::Test, > > void CreateAudioRtpSenderWithNoTrack() { > audio_rtp_sender_ = new AudioRtpSender(worker_thread_, /*id=*/"", nullptr); >- audio_rtp_sender_->SetVoiceMediaChannel(voice_media_channel_); >+ audio_rtp_sender_->SetMediaChannel(voice_media_channel_); > } > > void OnAudioSenderDestroyed() { audio_sender_destroyed_signal_fired_ = true; } >@@ -191,13 +191,13 @@ class RtpSenderReceiverTest : public testing::Test, > video_rtp_sender_ = new VideoRtpSender(worker_thread_, video_track_->id()); > ASSERT_TRUE(video_rtp_sender_->SetTrack(video_track_)); > video_rtp_sender_->set_stream_ids({local_stream_->id()}); >- video_rtp_sender_->SetVideoMediaChannel(video_media_channel_); >+ video_rtp_sender_->SetMediaChannel(video_media_channel_); > video_rtp_sender_->SetSsrc(ssrc); > VerifyVideoChannelInput(ssrc); > } > void CreateVideoRtpSenderWithNoTrack() { > video_rtp_sender_ = new VideoRtpSender(worker_thread_, /*id=*/""); >- video_rtp_sender_->SetVideoMediaChannel(video_media_channel_); >+ 
video_rtp_sender_->SetMediaChannel(video_media_channel_); > } > > void DestroyAudioRtpSender() { >@@ -214,7 +214,7 @@ class RtpSenderReceiverTest : public testing::Test, > std::vector<rtc::scoped_refptr<MediaStreamInterface>> streams = {}) { > audio_rtp_receiver_ = new AudioRtpReceiver( > rtc::Thread::Current(), kAudioTrackId, std::move(streams)); >- audio_rtp_receiver_->SetVoiceMediaChannel(voice_media_channel_); >+ audio_rtp_receiver_->SetMediaChannel(voice_media_channel_); > audio_rtp_receiver_->SetupMediaChannel(kAudioSsrc); > audio_track_ = audio_rtp_receiver_->audio_track(); > VerifyVoiceChannelOutput(); >@@ -224,12 +224,30 @@ class RtpSenderReceiverTest : public testing::Test, > std::vector<rtc::scoped_refptr<MediaStreamInterface>> streams = {}) { > video_rtp_receiver_ = new VideoRtpReceiver( > rtc::Thread::Current(), kVideoTrackId, std::move(streams)); >- video_rtp_receiver_->SetVideoMediaChannel(video_media_channel_); >+ video_rtp_receiver_->SetMediaChannel(video_media_channel_); > video_rtp_receiver_->SetupMediaChannel(kVideoSsrc); > video_track_ = video_rtp_receiver_->video_track(); > VerifyVideoChannelOutput(); > } > >+ void CreateVideoRtpReceiverWithSimulcast( >+ std::vector<rtc::scoped_refptr<MediaStreamInterface>> streams = {}, >+ int num_layers = kVideoSimulcastLayerCount) { >+ std::vector<uint32_t> ssrcs; >+ for (int i = 0; i < num_layers; ++i) >+ ssrcs.push_back(kVideoSsrcSimulcast + i); >+ cricket::StreamParams stream_params = >+ cricket::CreateSimStreamParams("cname", ssrcs); >+ video_media_channel_->AddRecvStream(stream_params); >+ uint32_t primary_ssrc = stream_params.first_ssrc(); >+ >+ video_rtp_receiver_ = new VideoRtpReceiver( >+ rtc::Thread::Current(), kVideoTrackId, std::move(streams)); >+ video_rtp_receiver_->SetMediaChannel(video_media_channel_); >+ video_rtp_receiver_->SetupMediaChannel(primary_ssrc); >+ video_track_ = video_rtp_receiver_->video_track(); >+ } >+ > void DestroyAudioRtpReceiver() { > audio_rtp_receiver_ = nullptr; > 
VerifyVoiceChannelNoOutput(); >@@ -664,7 +682,7 @@ TEST_F(RtpSenderReceiverTest, AudioSenderInitParametersMovedAfterNegotiation) { > cricket::StreamParams stream_params = > cricket::CreateSimStreamParams("cname", ssrcs); > voice_media_channel_->AddSendStream(stream_params); >- audio_rtp_sender_->SetVoiceMediaChannel(voice_media_channel_); >+ audio_rtp_sender_->SetMediaChannel(voice_media_channel_); > audio_rtp_sender_->SetSsrc(1); > > params = audio_rtp_sender_->GetParameters(); >@@ -903,7 +921,7 @@ TEST_F(RtpSenderReceiverTest, VideoSenderInitParametersMovedAfterNegotiation) { > cricket::StreamParams stream_params = > cricket::CreateSimStreamParams("cname", ssrcs); > video_media_channel_->AddSendStream(stream_params); >- video_rtp_sender_->SetVideoMediaChannel(video_media_channel_); >+ video_rtp_sender_->SetMediaChannel(video_media_channel_); > video_rtp_sender_->SetSsrc(kVideoSsrcSimulcast); > > params = video_rtp_sender_->GetParameters(); >@@ -938,7 +956,7 @@ TEST_F(RtpSenderReceiverTest, > cricket::StreamParams stream_params = > cricket::CreateSimStreamParams("cname", ssrcs); > video_media_channel_->AddSendStream(stream_params); >- video_rtp_sender_->SetVideoMediaChannel(video_media_channel_); >+ video_rtp_sender_->SetMediaChannel(video_media_channel_); > video_rtp_sender_->SetSsrc(kVideoSsrcSimulcast); > > params = video_rtp_sender_->GetParameters(); >@@ -1263,6 +1281,15 @@ TEST_F(RtpSenderReceiverTest, VideoReceiverCanSetParameters) { > DestroyVideoRtpReceiver(); > } > >+TEST_F(RtpSenderReceiverTest, VideoReceiverCanGetParametersWithSimulcast) { >+ CreateVideoRtpReceiverWithSimulcast({}, 2); >+ >+ RtpParameters params = video_rtp_receiver_->GetParameters(); >+ EXPECT_EQ(2u, params.encodings.size()); >+ >+ DestroyVideoRtpReceiver(); >+} >+ > // Test that makes sure that a video track content hint translates to the proper > // value for sources that are not screencast. 
> TEST_F(RtpSenderReceiverTest, PropagatesVideoTrackContentHint) { >@@ -1332,7 +1359,7 @@ TEST_F(RtpSenderReceiverTest, > video_rtp_sender_ = new VideoRtpSender(worker_thread_, video_track_->id()); > ASSERT_TRUE(video_rtp_sender_->SetTrack(video_track_)); > video_rtp_sender_->set_stream_ids({local_stream_->id()}); >- video_rtp_sender_->SetVideoMediaChannel(video_media_channel_); >+ video_rtp_sender_->SetMediaChannel(video_media_channel_); > video_track_->set_enabled(true); > > // Sender is not ready to send (no SSRC) so no option should have been set. >@@ -1424,6 +1451,18 @@ TEST_F(RtpSenderReceiverTest, AudioSenderCanSetFrameEncryptor) { > audio_rtp_sender_->GetFrameEncryptor().get()); > } > >+// Validate that setting a FrameEncryptor after the send stream is stopped does >+// nothing. >+TEST_F(RtpSenderReceiverTest, AudioSenderCannotSetFrameEncryptorAfterStop) { >+ CreateAudioRtpSender(); >+ rtc::scoped_refptr<FrameEncryptorInterface> fake_frame_encryptor( >+ new FakeFrameEncryptor()); >+ EXPECT_EQ(nullptr, audio_rtp_sender_->GetFrameEncryptor()); >+ audio_rtp_sender_->Stop(); >+ audio_rtp_sender_->SetFrameEncryptor(fake_frame_encryptor); >+ // TODO(webrtc:9926) - Validate media channel not set once fakes updated. >+} >+ > // Validate that the default FrameEncryptor setting is nullptr. > TEST_F(RtpSenderReceiverTest, AudioReceiverCanSetFrameDecryptor) { > CreateAudioRtpReceiver(); >@@ -1435,4 +1474,60 @@ TEST_F(RtpSenderReceiverTest, AudioReceiverCanSetFrameDecryptor) { > audio_rtp_receiver_->GetFrameDecryptor().get()); > } > >+// Validate that the default FrameEncryptor setting is nullptr. 
>+TEST_F(RtpSenderReceiverTest, AudioReceiverCannotSetFrameDecryptorAfterStop) { >+ CreateAudioRtpReceiver(); >+ rtc::scoped_refptr<FrameDecryptorInterface> fake_frame_decryptor( >+ new FakeFrameDecryptor()); >+ EXPECT_EQ(nullptr, audio_rtp_receiver_->GetFrameDecryptor()); >+ audio_rtp_receiver_->Stop(); >+ audio_rtp_receiver_->SetFrameDecryptor(fake_frame_decryptor); >+ // TODO(webrtc:9926) - Validate media channel not set once fakes updated. >+} >+ >+// Validate that the default FrameEncryptor setting is nullptr. >+TEST_F(RtpSenderReceiverTest, VideoSenderCanSetFrameEncryptor) { >+ CreateVideoRtpSender(); >+ rtc::scoped_refptr<FrameEncryptorInterface> fake_frame_encryptor( >+ new FakeFrameEncryptor()); >+ EXPECT_EQ(nullptr, video_rtp_sender_->GetFrameEncryptor()); >+ video_rtp_sender_->SetFrameEncryptor(fake_frame_encryptor); >+ EXPECT_EQ(fake_frame_encryptor.get(), >+ video_rtp_sender_->GetFrameEncryptor().get()); >+} >+ >+// Validate that setting a FrameEncryptor after the send stream is stopped does >+// nothing. >+TEST_F(RtpSenderReceiverTest, VideoSenderCannotSetFrameEncryptorAfterStop) { >+ CreateVideoRtpSender(); >+ rtc::scoped_refptr<FrameEncryptorInterface> fake_frame_encryptor( >+ new FakeFrameEncryptor()); >+ EXPECT_EQ(nullptr, video_rtp_sender_->GetFrameEncryptor()); >+ video_rtp_sender_->Stop(); >+ video_rtp_sender_->SetFrameEncryptor(fake_frame_encryptor); >+ // TODO(webrtc:9926) - Validate media channel not set once fakes updated. >+} >+ >+// Validate that the default FrameEncryptor setting is nullptr. 
>+TEST_F(RtpSenderReceiverTest, VideoReceiverCanSetFrameDecryptor) { >+ CreateVideoRtpReceiver(); >+ rtc::scoped_refptr<FrameDecryptorInterface> fake_frame_decryptor( >+ new FakeFrameDecryptor()); >+ EXPECT_EQ(nullptr, video_rtp_receiver_->GetFrameDecryptor()); >+ video_rtp_receiver_->SetFrameDecryptor(fake_frame_decryptor); >+ EXPECT_EQ(fake_frame_decryptor.get(), >+ video_rtp_receiver_->GetFrameDecryptor().get()); >+} >+ >+// Validate that the default FrameEncryptor setting is nullptr. >+TEST_F(RtpSenderReceiverTest, VideoReceiverCannotSetFrameDecryptorAfterStop) { >+ CreateVideoRtpReceiver(); >+ rtc::scoped_refptr<FrameDecryptorInterface> fake_frame_decryptor( >+ new FakeFrameDecryptor()); >+ EXPECT_EQ(nullptr, video_rtp_receiver_->GetFrameDecryptor()); >+ video_rtp_receiver_->Stop(); >+ video_rtp_receiver_->SetFrameDecryptor(fake_frame_decryptor); >+ // TODO(webrtc:9926) - Validate media channel not set once fakes updated. >+} >+ > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransceiver.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransceiver.cc >index 0fe8ea6f50a9930f3e8c8ab3151696d3befd2402..8b56b8b4f10a529f9b9a3b3e0814214d1664b769 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransceiver.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransceiver.cc >@@ -38,52 +38,45 @@ RtpTransceiver::~RtpTransceiver() { > Stop(); > } > >-void RtpTransceiver::SetChannel(cricket::BaseChannel* channel) { >+void RtpTransceiver::SetChannel(cricket::ChannelInterface* channel) { >+ // Cannot set a non-null channel on a stopped transceiver. 
>+ if (stopped_ && channel) { >+ return; >+ } >+ > if (channel) { > RTC_DCHECK_EQ(media_type(), channel->media_type()); > } > > if (channel_) { >- channel_->SignalFirstPacketReceived.disconnect(this); >+ channel_->SignalFirstPacketReceived().disconnect(this); > } > > channel_ = channel; > > if (channel_) { >- channel_->SignalFirstPacketReceived.connect( >+ channel_->SignalFirstPacketReceived().connect( > this, &RtpTransceiver::OnFirstPacketReceived); > } > > for (auto sender : senders_) { >- if (media_type() == cricket::MEDIA_TYPE_AUDIO) { >- auto* voice_channel = static_cast<cricket::VoiceChannel*>(channel); >- sender->internal()->SetVoiceMediaChannel( >- voice_channel ? voice_channel->media_channel() : nullptr); >- } else { >- auto* video_channel = static_cast<cricket::VideoChannel*>(channel); >- sender->internal()->SetVideoMediaChannel( >- video_channel ? video_channel->media_channel() : nullptr); >- } >+ sender->internal()->SetMediaChannel(channel_ ? channel_->media_channel() >+ : nullptr); > } > > for (auto receiver : receivers_) { >- if (!channel) { >+ if (!channel_) { > receiver->internal()->Stop(); > } >- if (media_type() == cricket::MEDIA_TYPE_AUDIO) { >- auto* voice_channel = static_cast<cricket::VoiceChannel*>(channel); >- receiver->internal()->SetVoiceMediaChannel( >- voice_channel ? voice_channel->media_channel() : nullptr); >- } else { >- auto* video_channel = static_cast<cricket::VideoChannel*>(channel); >- receiver->internal()->SetVideoMediaChannel( >- video_channel ? video_channel->media_channel() : nullptr); >- } >+ >+ receiver->internal()->SetMediaChannel(channel_ ? 
channel_->media_channel() >+ : nullptr); > } > } > > void RtpTransceiver::AddSender( > rtc::scoped_refptr<RtpSenderProxyWithInternal<RtpSenderInternal>> sender) { >+ RTC_DCHECK(!stopped_); > RTC_DCHECK(!unified_plan_); > RTC_DCHECK(sender); > RTC_DCHECK_EQ(media_type(), sender->media_type()); >@@ -109,6 +102,7 @@ bool RtpTransceiver::RemoveSender(RtpSenderInterface* sender) { > void RtpTransceiver::AddReceiver( > rtc::scoped_refptr<RtpReceiverProxyWithInternal<RtpReceiverInternal>> > receiver) { >+ RTC_DCHECK(!stopped_); > RTC_DCHECK(!unified_plan_); > RTC_DCHECK(receiver); > RTC_DCHECK_EQ(media_type(), receiver->media_type()); >@@ -152,7 +146,7 @@ absl::optional<std::string> RtpTransceiver::mid() const { > return mid_; > } > >-void RtpTransceiver::OnFirstPacketReceived(cricket::BaseChannel* channel) { >+void RtpTransceiver::OnFirstPacketReceived(cricket::ChannelInterface*) { > for (auto receiver : receivers_) { > receiver->internal()->NotifyFirstPacketReceived(); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransceiver.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransceiver.h >index 3e0d433782611576ca92e516e4e6b49543ae2950..07db196edd9abdca53cc8eee33c62a2bb4ea64bd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransceiver.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransceiver.h >@@ -70,11 +70,11 @@ class RtpTransceiver final > > // Returns the Voice/VideoChannel set for this transceiver. May be null if > // the transceiver is not in the currently set local/remote description. >- cricket::BaseChannel* channel() const { return channel_; } >+ cricket::ChannelInterface* channel() const { return channel_; } > > // Sets the Voice/VideoChannel. The caller must pass in the correct channel > // implementation based on the type of the transceiver. 
>- void SetChannel(cricket::BaseChannel* channel); >+ void SetChannel(cricket::ChannelInterface* channel); > > // Adds an RtpSender of the appropriate type to be owned by this transceiver. > // Must not be null. >@@ -177,7 +177,7 @@ class RtpTransceiver final > void SetCodecPreferences(rtc::ArrayView<RtpCodecCapability> codecs) override; > > private: >- void OnFirstPacketReceived(cricket::BaseChannel* channel); >+ void OnFirstPacketReceived(cricket::ChannelInterface* channel); > > const bool unified_plan_; > const cricket::MediaType media_type_; >@@ -196,7 +196,7 @@ class RtpTransceiver final > bool created_by_addtrack_ = false; > bool has_ever_been_used_to_send_ = false; > >- cricket::BaseChannel* channel_ = nullptr; >+ cricket::ChannelInterface* channel_ = nullptr; > }; > > BEGIN_SIGNALING_PROXY_MAP(RtpTransceiver) >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransceiver_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransceiver_unittest.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..b57d2123d152fb2bf335a896965e7fe61052f587 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransceiver_unittest.cc >@@ -0,0 +1,71 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+// This file contains tests for |RtpTransceiver|. >+ >+#include "pc/rtptransceiver.h" >+#include "rtc_base/gunit.h" >+#include "test/gmock.h" >+#include "test/mock_channelinterface.h" >+ >+using ::testing::Return; >+using ::testing::ReturnRef; >+ >+namespace webrtc { >+ >+// Checks that a channel cannot be set on a stopped |RtpTransceiver|. 
>+TEST(RtpTransceiverTest, CannotSetChannelOnStoppedTransceiver) { >+ RtpTransceiver transceiver(cricket::MediaType::MEDIA_TYPE_AUDIO); >+ cricket::MockChannelInterface channel1; >+ sigslot::signal1<cricket::ChannelInterface*> signal; >+ EXPECT_CALL(channel1, media_type()) >+ .WillRepeatedly(Return(cricket::MediaType::MEDIA_TYPE_AUDIO)); >+ EXPECT_CALL(channel1, SignalFirstPacketReceived()) >+ .WillRepeatedly(ReturnRef(signal)); >+ >+ transceiver.SetChannel(&channel1); >+ EXPECT_EQ(&channel1, transceiver.channel()); >+ >+ // Stop the transceiver. >+ transceiver.Stop(); >+ EXPECT_EQ(&channel1, transceiver.channel()); >+ >+ cricket::MockChannelInterface channel2; >+ EXPECT_CALL(channel2, media_type()) >+ .WillRepeatedly(Return(cricket::MediaType::MEDIA_TYPE_AUDIO)); >+ >+ // Channel can no longer be set, so this call should be a no-op. >+ transceiver.SetChannel(&channel2); >+ EXPECT_EQ(&channel1, transceiver.channel()); >+} >+ >+// Checks that a channel can be unset on a stopped |RtpTransceiver| >+TEST(RtpTransceiverTest, CanUnsetChannelOnStoppedTransceiver) { >+ RtpTransceiver transceiver(cricket::MediaType::MEDIA_TYPE_VIDEO); >+ cricket::MockChannelInterface channel; >+ sigslot::signal1<cricket::ChannelInterface*> signal; >+ EXPECT_CALL(channel, media_type()) >+ .WillRepeatedly(Return(cricket::MediaType::MEDIA_TYPE_VIDEO)); >+ EXPECT_CALL(channel, SignalFirstPacketReceived()) >+ .WillRepeatedly(ReturnRef(signal)); >+ >+ transceiver.SetChannel(&channel); >+ EXPECT_EQ(&channel, transceiver.channel()); >+ >+ // Stop the transceiver. >+ transceiver.Stop(); >+ EXPECT_EQ(&channel, transceiver.channel()); >+ >+ // Set the channel to |nullptr|. 
>+  transceiver.SetChannel(nullptr);
>+  EXPECT_EQ(nullptr, transceiver.channel());
>+}
>+
>+}  // namespace webrtc
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransport.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransport.cc
>index 9e994e9b9804a51ac56b6797a567d0af7136ecd0..4a573016f21e87bcc8b21e0fa71eeb846481d77b 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransport.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransport.cc
>@@ -187,7 +187,7 @@ RtpTransportParameters RtpTransport::GetParameters() const {
> }
> 
> void RtpTransport::DemuxPacket(rtc::CopyOnWriteBuffer* packet,
>-                               const rtc::PacketTime& time) {
>+                               int64_t packet_time_us) {
>   webrtc::RtpPacketReceived parsed_packet(&header_extension_map_);
>   if (!parsed_packet.Parse(std::move(*packet))) {
>     RTC_LOG(LS_ERROR)
>@@ -195,8 +195,8 @@ void RtpTransport::DemuxPacket(rtc::CopyOnWriteBuffer* packet,
>     return;
>   }
> 
>-  if (time.timestamp != -1) {
>-    parsed_packet.set_arrival_time_ms((time.timestamp + 500) / 1000);
>+  if (packet_time_us != -1) {
>+    parsed_packet.set_arrival_time_ms((packet_time_us + 500) / 1000);
>   }
>   rtp_demuxer_.OnRtpPacket(parsed_packet);
> }
>@@ -236,19 +236,19 @@ void RtpTransport::OnSentPacket(rtc::PacketTransportInternal* packet_transport,
> }
> 
> void RtpTransport::OnRtpPacketReceived(rtc::CopyOnWriteBuffer* packet,
>-                                       const rtc::PacketTime& packet_time) {
>-  DemuxPacket(packet, packet_time);
>+                                       int64_t packet_time_us) {
>+  DemuxPacket(packet, packet_time_us);
> }
> 
> void RtpTransport::OnRtcpPacketReceived(rtc::CopyOnWriteBuffer* packet,
>-                                        const rtc::PacketTime& packet_time) {
>-  SignalRtcpPacketReceived(packet, packet_time);
>+                                        int64_t packet_time_us) {
>+  SignalRtcpPacketReceived(packet, packet_time_us);
> }
> 
> void RtpTransport::OnReadPacket(rtc::PacketTransportInternal* transport,
>                                 const char* data,
>                                 size_t len,
>-                                const rtc::PacketTime& packet_time,
>+                                const int64_t& packet_time_us,
>                                 int flags) {
>   TRACE_EVENT0("webrtc", "RtpTransport::OnReadPacket");
> 
>@@ -272,9 +272,9 @@ void RtpTransport::OnReadPacket(rtc::PacketTransportInternal* transport,
>   }
> 
>   if (rtcp) {
>-    OnRtcpPacketReceived(&packet, packet_time);
>+    OnRtcpPacketReceived(&packet, packet_time_us);
>   } else {
>-    OnRtpPacketReceived(&packet, packet_time);
>+    OnRtpPacketReceived(&packet, packet_time_us);
>   }
> }
> 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransport.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransport.h
>index 793ac901aa5c8cb48cf2cf33e5aeb9bd46b9d098..dab5903171558ef7793cf9cfe8f9b44d42793cf7 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransport.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransport.h
>@@ -87,7 +87,7 @@ class RtpTransport : public RtpTransportInternal {
>   RtpTransportAdapter* GetInternal() override;
> 
>   // These methods will be used in the subclasses.
>-  void DemuxPacket(rtc::CopyOnWriteBuffer* packet, const rtc::PacketTime& time);
>+  void DemuxPacket(rtc::CopyOnWriteBuffer* packet, int64_t packet_time_us);
> 
>   bool SendPacket(bool rtcp,
>                   rtc::CopyOnWriteBuffer* packet,
>@@ -98,9 +98,9 @@ class RtpTransport : public RtpTransportInternal {
>   virtual void OnNetworkRouteChanged(
>       absl::optional<rtc::NetworkRoute> network_route);
>   virtual void OnRtpPacketReceived(rtc::CopyOnWriteBuffer* packet,
>-                                   const rtc::PacketTime& packet_time);
>+                                   int64_t packet_time_us);
>   virtual void OnRtcpPacketReceived(rtc::CopyOnWriteBuffer* packet,
>-                                    const rtc::PacketTime& packet_time);
>+                                    int64_t packet_time_us);
>   // Overridden by SrtpTransport and DtlsSrtpTransport.
> virtual void OnWritableState(rtc::PacketTransportInternal* packet_transport); > >@@ -111,7 +111,7 @@ class RtpTransport : public RtpTransportInternal { > void OnReadPacket(rtc::PacketTransportInternal* transport, > const char* data, > size_t len, >- const rtc::PacketTime& packet_time, >+ const int64_t& packet_time_us, > int flags); > > // Updates "ready to send" for an individual channel and fires >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransportinternal.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransportinternal.h >index ca0568071086ce51ad1680b62cf29e7ec0e59edd..e6338366c0637ecf06d3b28fe31e19e657745d0a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransportinternal.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransportinternal.h >@@ -24,7 +24,6 @@ > namespace rtc { > class CopyOnWriteBuffer; > struct PacketOptions; >-struct PacketTime; > } // namespace rtc > > namespace webrtc { >@@ -59,8 +58,7 @@ class RtpTransportInternal : public SrtpTransportInterface, > // Called whenever an RTCP packet is received. There is no equivalent signal > // for RTP packets because they would be forwarded to the BaseChannel through > // the RtpDemuxer callback. >- sigslot::signal2<rtc::CopyOnWriteBuffer*, const rtc::PacketTime&> >- SignalRtcpPacketReceived; >+ sigslot::signal2<rtc::CopyOnWriteBuffer*, int64_t> SignalRtcpPacketReceived; > > // Called whenever the network route of the P2P layer transport changes. > // The argument is an optional network route. 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransporttestutil.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransporttestutil.h
>index eb337730d107f8c8d81a8d28a2aa04f32e602254..f2c35122aaf44a347267fad37f9f225c2447638f 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransporttestutil.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtptransporttestutil.h
>@@ -39,7 +39,7 @@ class TransportObserver : public RtpPacketSinkInterface,
>   }
> 
>   void OnRtcpPacketReceived(rtc::CopyOnWriteBuffer* packet,
>-                            const rtc::PacketTime& packet_time) {
>+                            int64_t packet_time_us) {
>     rtcp_count_++;
>     last_recv_rtcp_packet_ = *packet;
>   }
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/sessiondescription.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/sessiondescription.cc
>index f3e89b47fadb66c000011dc888f2cc3634b9957d..3a9b18ee6919b37ff5940ee6064eccb7b46a8d4d 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/sessiondescription.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/sessiondescription.cc
>@@ -84,21 +84,6 @@ bool ContentGroup::RemoveContentName(const std::string& content_name) {
> }
> 
> SessionDescription::SessionDescription() = default;
>-
>-SessionDescription::SessionDescription(const ContentInfos& contents)
>-    : contents_(contents) {}
>-
>-SessionDescription::SessionDescription(const ContentInfos& contents,
>-                                       const ContentGroups& groups)
>-    : contents_(contents), content_groups_(groups) {}
>-
>-SessionDescription::SessionDescription(const ContentInfos& contents,
>-                                       const TransportInfos& transports,
>-                                       const ContentGroups& groups)
>-    : contents_(contents),
>-      transport_infos_(transports),
>-      content_groups_(groups) {}
>-
> SessionDescription::SessionDescription(const SessionDescription&) = default;
> 
> SessionDescription::~SessionDescription() {
>@@ -162,7 +147,7 @@ void SessionDescription::AddContent(const std::string& name,
>   ContentInfo content(type);
>   content.name = name;
>   content.description = description;
>-  contents_.push_back(std::move(content));
>+  AddContent(&content);
> }
> 
> void SessionDescription::AddContent(const std::string& name,
>@@ -173,7 +158,7 @@ void SessionDescription::AddContent(const std::string& name,
>   content.name = name;
>   content.rejected = rejected;
>   content.description = description;
>-  contents_.push_back(std::move(content));
>+  AddContent(&content);
> }
> 
> void SessionDescription::AddContent(const std::string& name,
>@@ -186,7 +171,16 @@ void SessionDescription::AddContent(const std::string& name,
>   content.rejected = rejected;
>   content.bundle_only = bundle_only;
>   content.description = description;
>-  contents_.push_back(std::move(content));
>+  AddContent(&content);
>+}
>+
>+void SessionDescription::AddContent(ContentInfo* content) {
>+  if (extmap_allow_mixed()) {
>+    // Mixed support on session level overrides setting on media level.
>+    content->description->set_extmap_allow_mixed_enum(
>+        MediaContentDescription::kSession);
>+  }
>+  contents_.push_back(std::move(*content));
> }
> 
> bool SessionDescription::RemoveContentByName(const std::string& name) {
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/sessiondescription.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/sessiondescription.h
>index 645ebc54f86e57974640af7bd59c86cea0776272..38291484cd8cb4e8888737a78d6b0f82154e80f4 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/sessiondescription.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/sessiondescription.h
>@@ -183,6 +183,22 @@ class MediaContentDescription {
>     return connection_address_;
>   }
> 
>+  // Determines if it's allowed to mix one- and two-byte rtp header extensions
>+  // within the same rtp stream.
>+  enum ExtmapAllowMixed { kNo, kSession, kMedia };
>+  void set_extmap_allow_mixed_enum(ExtmapAllowMixed new_extmap_allow_mixed) {
>+    if (new_extmap_allow_mixed == kMedia &&
>+        extmap_allow_mixed_enum_ == kSession) {
>+      // Do not downgrade from session level to media level.
>+      return;
>+    }
>+    extmap_allow_mixed_enum_ = new_extmap_allow_mixed;
>+  }
>+  ExtmapAllowMixed extmap_allow_mixed_enum() const {
>+    return extmap_allow_mixed_enum_;
>+  }
>+  bool extmap_allow_mixed() const { return extmap_allow_mixed_enum_ != kNo; }
>+
>  protected:
>   bool rtcp_mux_ = false;
>   bool rtcp_reduced_size_ = false;
>@@ -196,6 +212,10 @@ class MediaContentDescription {
>   webrtc::RtpTransceiverDirection direction_ =
>       webrtc::RtpTransceiverDirection::kSendRecv;
>   rtc::SocketAddress connection_address_;
>+  // Mixed one- and two-byte header not included in offer on media level or
>+  // session level, but we will respond that we support it. The plan is to add
>+  // it to our offer on session level. See todo in SessionDescription.
>+  ExtmapAllowMixed extmap_allow_mixed_enum_ = kNo;
> };
> 
> // TODO(bugs.webrtc.org/8620): Remove this alias once downstream projects have
>@@ -378,11 +398,6 @@ enum MsidSignaling {
> class SessionDescription {
>  public:
>   SessionDescription();
>-  explicit SessionDescription(const ContentInfos& contents);
>-  SessionDescription(const ContentInfos& contents, const ContentGroups& groups);
>-  SessionDescription(const ContentInfos& contents,
>-                     const TransportInfos& transports,
>-                     const ContentGroups& groups);
>   ~SessionDescription();
> 
>   SessionDescription* Copy() const;
>@@ -412,6 +427,8 @@ class SessionDescription {
>                   bool rejected,
>                   bool bundle_only,
>                   MediaContentDescription* description);
>+  void AddContent(ContentInfo* content);
>+
>   bool RemoveContentByName(const std::string& name);
> 
>   // Transport accessors.
>@@ -455,6 +472,24 @@ class SessionDescription {
>   }
>   int msid_signaling() const { return msid_signaling_; }
> 
>+  // Determines if it's allowed to mix one- and two-byte rtp header extensions
>+  // within the same rtp stream.
>+  void set_extmap_allow_mixed(bool supported) {
>+    extmap_allow_mixed_ = supported;
>+    MediaContentDescription::ExtmapAllowMixed media_level_setting =
>+        supported ? MediaContentDescription::kSession
>+                  : MediaContentDescription::kNo;
>+    for (auto& content : contents_) {
>+      // Do not set to kNo if the current setting is kMedia.
>+      if (supported || content.media_description()->extmap_allow_mixed_enum() !=
>+                           MediaContentDescription::kMedia) {
>+        content.media_description()->set_extmap_allow_mixed_enum(
>+            media_level_setting);
>+      }
>+    }
>+  }
>+  bool extmap_allow_mixed() const { return extmap_allow_mixed_; }
>+
>  private:
>   SessionDescription(const SessionDescription&);
> 
>@@ -465,6 +500,12 @@ class SessionDescription {
>   // Default to what Plan B would do.
>   // TODO(bugs.webrtc.org/8530): Change default to kMsidSignalingMediaSection.
>   int msid_signaling_ = kMsidSignalingSsrcAttribute;
>+  // TODO(webrtc:9985): Activate mixed one- and two-byte header extension in
>+  // offer at session level. It's currently not included in offer by default
>+  // because clients prior to https://bugs.webrtc.org/9712 cannot parse this
>+  // correctly. If it's included in offer to us we will respond that we support
>+  // it.
>+  bool extmap_allow_mixed_ = false;
> };
> 
> // Indicates whether a session description was sent by the local client or
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/sessiondescription_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/sessiondescription_unittest.cc
>new file mode 100644
>index 0000000000000000000000000000000000000000..dcacf587f562b4f9a5267b4ac275add788e2731b
>--- /dev/null
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/sessiondescription_unittest.cc
>@@ -0,0 +1,130 @@
>+/*
>+ *  Copyright 2018 The WebRTC project authors. All Rights Reserved.
>+ *
>+ *  Use of this source code is governed by a BSD-style license
>+ *  that can be found in the LICENSE file in the root of the source
>+ *  tree. An additional intellectual property rights grant can be found
>+ *  in the file PATENTS.  All contributing project authors may
>+ *  be found in the AUTHORS file in the root of the source tree.
>+ */ >+#include "pc/sessiondescription.h" >+#include "rtc_base/gunit.h" >+ >+namespace cricket { >+ >+TEST(MediaContentDescriptionTest, ExtmapAllowMixedDefaultValue) { >+ VideoContentDescription video_desc; >+ EXPECT_EQ(MediaContentDescription::kNo, video_desc.extmap_allow_mixed_enum()); >+} >+ >+TEST(MediaContentDescriptionTest, SetExtmapAllowMixed) { >+ VideoContentDescription video_desc; >+ video_desc.set_extmap_allow_mixed_enum(MediaContentDescription::kNo); >+ EXPECT_EQ(MediaContentDescription::kNo, video_desc.extmap_allow_mixed_enum()); >+ video_desc.set_extmap_allow_mixed_enum(MediaContentDescription::kMedia); >+ EXPECT_EQ(MediaContentDescription::kMedia, >+ video_desc.extmap_allow_mixed_enum()); >+ video_desc.set_extmap_allow_mixed_enum(MediaContentDescription::kSession); >+ EXPECT_EQ(MediaContentDescription::kSession, >+ video_desc.extmap_allow_mixed_enum()); >+ >+ // Not allowed to downgrade from kSession to kMedia. >+ video_desc.set_extmap_allow_mixed_enum(MediaContentDescription::kMedia); >+ EXPECT_EQ(MediaContentDescription::kSession, >+ video_desc.extmap_allow_mixed_enum()); >+ >+ // Always okay to set not supported. 
>+ video_desc.set_extmap_allow_mixed_enum(MediaContentDescription::kNo); >+ EXPECT_EQ(MediaContentDescription::kNo, video_desc.extmap_allow_mixed_enum()); >+ video_desc.set_extmap_allow_mixed_enum(MediaContentDescription::kMedia); >+ EXPECT_EQ(MediaContentDescription::kMedia, >+ video_desc.extmap_allow_mixed_enum()); >+ video_desc.set_extmap_allow_mixed_enum(MediaContentDescription::kNo); >+ EXPECT_EQ(MediaContentDescription::kNo, video_desc.extmap_allow_mixed_enum()); >+} >+ >+TEST(MediaContentDescriptionTest, MixedOneTwoByteHeaderSupported) { >+ VideoContentDescription video_desc; >+ video_desc.set_extmap_allow_mixed_enum(MediaContentDescription::kNo); >+ EXPECT_FALSE(video_desc.extmap_allow_mixed()); >+ video_desc.set_extmap_allow_mixed_enum(MediaContentDescription::kMedia); >+ EXPECT_TRUE(video_desc.extmap_allow_mixed()); >+ video_desc.set_extmap_allow_mixed_enum(MediaContentDescription::kSession); >+ EXPECT_TRUE(video_desc.extmap_allow_mixed()); >+} >+ >+TEST(SessionDescriptionTest, SetExtmapAllowMixed) { >+ SessionDescription session_desc; >+ session_desc.set_extmap_allow_mixed(true); >+ EXPECT_TRUE(session_desc.extmap_allow_mixed()); >+ session_desc.set_extmap_allow_mixed(false); >+ EXPECT_FALSE(session_desc.extmap_allow_mixed()); >+} >+ >+TEST(SessionDescriptionTest, SetExtmapAllowMixedPropagatesToMediaLevel) { >+ SessionDescription session_desc; >+ MediaContentDescription* video_desc = new VideoContentDescription(); >+ session_desc.AddContent("video", MediaProtocolType::kRtp, video_desc); >+ >+ // Setting true on session level propagates to media level. 
>+ session_desc.set_extmap_allow_mixed(true); >+ EXPECT_EQ(MediaContentDescription::kSession, >+ video_desc->extmap_allow_mixed_enum()); >+ >+ // Don't downgrade from session level to media level >+ video_desc->set_extmap_allow_mixed_enum(MediaContentDescription::kMedia); >+ EXPECT_EQ(MediaContentDescription::kSession, >+ video_desc->extmap_allow_mixed_enum()); >+ >+ // Setting false on session level propagates to media level if the current >+ // state is kSession. >+ session_desc.set_extmap_allow_mixed(false); >+ EXPECT_EQ(MediaContentDescription::kNo, >+ video_desc->extmap_allow_mixed_enum()); >+ >+ // Now possible to set at media level. >+ video_desc->set_extmap_allow_mixed_enum(MediaContentDescription::kMedia); >+ EXPECT_EQ(MediaContentDescription::kMedia, >+ video_desc->extmap_allow_mixed_enum()); >+ >+ // Setting false on session level does not override on media level if current >+ // state is kMedia. >+ session_desc.set_extmap_allow_mixed(false); >+ EXPECT_EQ(MediaContentDescription::kMedia, >+ video_desc->extmap_allow_mixed_enum()); >+ >+ // Setting true on session level overrides setting on media level. >+ session_desc.set_extmap_allow_mixed(true); >+ EXPECT_EQ(MediaContentDescription::kSession, >+ video_desc->extmap_allow_mixed_enum()); >+} >+ >+TEST(SessionDescriptionTest, AddContentTransfersExtmapAllowMixedSetting) { >+ SessionDescription session_desc; >+ session_desc.set_extmap_allow_mixed(false); >+ MediaContentDescription* audio_desc = new AudioContentDescription(); >+ audio_desc->set_extmap_allow_mixed_enum(MediaContentDescription::kMedia); >+ >+ // If session setting is false, media level setting is preserved when new >+ // content is added. >+ session_desc.AddContent("audio", MediaProtocolType::kRtp, audio_desc); >+ EXPECT_EQ(MediaContentDescription::kMedia, >+ audio_desc->extmap_allow_mixed_enum()); >+ >+ // If session setting is true, it's transferred to media level when new >+ // content is added. 
>+ session_desc.set_extmap_allow_mixed(true); >+ MediaContentDescription* video_desc = new VideoContentDescription(); >+ session_desc.AddContent("video", MediaProtocolType::kRtp, video_desc); >+ EXPECT_EQ(MediaContentDescription::kSession, >+ video_desc->extmap_allow_mixed_enum()); >+ >+ // Session level setting overrides media level when new content is added. >+ MediaContentDescription* data_desc = new DataContentDescription; >+ data_desc->set_extmap_allow_mixed_enum(MediaContentDescription::kMedia); >+ session_desc.AddContent("data", MediaProtocolType::kRtp, data_desc); >+ EXPECT_EQ(MediaContentDescription::kSession, >+ data_desc->extmap_allow_mixed_enum()); >+} >+ >+} // namespace cricket >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/srtpfilter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/srtpfilter.cc >index 5ffe34212e508ba73a00e3f86948ae507ff62e10..cd0393a03e331b1d73cff4a2259e23ae7a52f463 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/srtpfilter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/srtpfilter.cc >@@ -19,7 +19,6 @@ > #include "rtc_base/byteorder.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" >-#include "rtc_base/stringencode.h" > #include "rtc_base/third_party/base64/base64.h" > #include "rtc_base/timeutils.h" > #include "rtc_base/zero_memory.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/srtpfilter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/srtpfilter.h >index a4dd54fa8bdebca68e19c0fddd0839b6910e69d5..4ab0dd72c1a412c1423c8c0245ade794ea8ef898 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/srtpfilter.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/srtpfilter.h >@@ -78,6 +78,10 @@ class SrtpFilter { > > bool ResetParams(); > >+ static bool ParseKeyParams(const std::string& params, >+ uint8_t* key, >+ size_t len); >+ > absl::optional<int> send_cipher_suite() { return send_cipher_suite_; } > absl::optional<int> recv_cipher_suite() { return recv_cipher_suite_; } 
> >@@ -104,10 +108,6 @@ class SrtpFilter { > > bool ApplyRecvParams(const CryptoParams& recv_params); > >- static bool ParseKeyParams(const std::string& params, >- uint8_t* key, >- size_t len); >- > enum State { > ST_INIT, // SRTP filter unused. > ST_SENTOFFER, // Offer with SRTP parameters sent. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/srtpsession.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/srtpsession.cc >index a5fda8897feb9d726295d5e5378de456fcf0b537..aadd4711a503e1e74733f3f84625e0a811d82f27 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/srtpsession.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/srtpsession.cc >@@ -138,7 +138,15 @@ bool SrtpSession::UnprotectRtp(void* p, int in_len, int* out_len) { > *out_len = in_len; > int err = srtp_unprotect(session_, p, out_len); > if (err != srtp_err_status_ok) { >- RTC_LOG(LS_WARNING) << "Failed to unprotect SRTP packet, err=" << err; >+ // Limit the error logging to avoid excessive logs when there are lots of >+ // bad packets. 
>+    const int kFailureLogThrottleCount = 100;
>+    if (decryption_failure_count_ % kFailureLogThrottleCount == 0) {
>+      RTC_LOG(LS_WARNING) << "Failed to unprotect SRTP packet, err=" << err
>+                          << ", previous failure count: "
>+                          << decryption_failure_count_;
>+    }
>+    ++decryption_failure_count_;
>     RTC_HISTOGRAM_ENUMERATION("WebRTC.PeerConnection.SrtpUnprotectError",
>                               static_cast<int>(err), kSrtpErrorCodeBoundary);
>     return false;
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/srtpsession.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/srtpsession.h
>index c3a01bd139bb0a536cbf8d9774c4e2bfa7e76193..ce05ce6d7258622c84973b2f771ddc6bb0f3d49c 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/srtpsession.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/srtpsession.h
>@@ -122,6 +122,7 @@ class SrtpSession {
>   int last_send_seq_num_ = -1;
>   bool external_auth_active_ = false;
>   bool external_auth_enabled_ = false;
>+  int decryption_failure_count_ = 0;
>   RTC_DISALLOW_COPY_AND_ASSIGN(SrtpSession);
> };
> 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/srtptransport.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/srtptransport.cc
>index 6fd8be37c242481b9c9789930be43cdda1470f3b..0cf4106203e635150de7fe898fc4a917568d3e3b 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/srtptransport.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/srtptransport.cc
>@@ -193,7 +193,7 @@ bool SrtpTransport::SendRtcpPacket(rtc::CopyOnWriteBuffer* packet,
> }
> 
> void SrtpTransport::OnRtpPacketReceived(rtc::CopyOnWriteBuffer* packet,
>-                                        const rtc::PacketTime& packet_time) {
>+                                        int64_t packet_time_us) {
>   if (!IsSrtpActive()) {
>     RTC_LOG(LS_WARNING)
>         << "Inactive SRTP transport received an RTP packet. Drop it.";
>@@ -207,16 +207,25 @@ void SrtpTransport::OnRtpPacketReceived(rtc::CopyOnWriteBuffer* packet,
>     uint32_t ssrc = 0;
>     cricket::GetRtpSeqNum(data, len, &seq_num);
>     cricket::GetRtpSsrc(data, len, &ssrc);
>-    RTC_LOG(LS_ERROR) << "Failed to unprotect RTP packet: size=" << len
>-                      << ", seqnum=" << seq_num << ", SSRC=" << ssrc;
>+
>+    // Limit the error logging to avoid excessive logs when there are lots of
>+    // bad packets.
>+    const int kFailureLogThrottleCount = 100;
>+    if (decryption_failure_count_ % kFailureLogThrottleCount == 0) {
>+      RTC_LOG(LS_ERROR) << "Failed to unprotect RTP packet: size=" << len
>+                        << ", seqnum=" << seq_num << ", SSRC=" << ssrc
>+                        << ", previous failure count: "
>+                        << decryption_failure_count_;
>+    }
>+    ++decryption_failure_count_;
>     return;
>   }
>   packet->SetSize(len);
>-  DemuxPacket(packet, packet_time);
>+  DemuxPacket(packet, packet_time_us);
> }
> 
> void SrtpTransport::OnRtcpPacketReceived(rtc::CopyOnWriteBuffer* packet,
>-                                         const rtc::PacketTime& packet_time) {
>+                                         int64_t packet_time_us) {
>   if (!IsSrtpActive()) {
>     RTC_LOG(LS_WARNING)
>         << "Inactive SRTP transport received an RTCP packet. Drop it.";
>@@ -233,7 +242,7 @@ void SrtpTransport::OnRtcpPacketReceived(rtc::CopyOnWriteBuffer* packet,
>     return;
>   }
>   packet->SetSize(len);
>-  SignalRtcpPacketReceived(packet, packet_time);
>+  SignalRtcpPacketReceived(packet, packet_time_us);
> }
> 
> void SrtpTransport::OnNetworkRouteChanged(
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/srtptransport.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/srtptransport.h
>index 198024f5a54ca4d9414575088384b46e98ad995d..c62359b944d48723990fa005039f1a3dc3421c86 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/srtptransport.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/srtptransport.h
>@@ -113,9 +113,9 @@ class SrtpTransport : public RtpTransport {
>   void CreateSrtpSessions();
> 
>   void OnRtpPacketReceived(rtc::CopyOnWriteBuffer* packet,
>-                           const rtc::PacketTime& packet_time) override;
>+                           int64_t packet_time_us) override;
>   void OnRtcpPacketReceived(rtc::CopyOnWriteBuffer* packet,
>-                            const rtc::PacketTime& packet_time) override;
>+                            int64_t packet_time_us) override;
>   void OnNetworkRouteChanged(
>       absl::optional<rtc::NetworkRoute> network_route) override;
> 
>@@ -160,6 +160,8 @@ class SrtpTransport : public RtpTransport {
>   bool external_auth_enabled_ = false;
> 
>   int rtp_abs_sendtime_extn_id_ = -1;
>+
>+  int decryption_failure_count_ = 0;
> };
> 
> }  // namespace webrtc
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/statscollector.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/statscollector.cc
>index 23a449308c317e91110cad840400f2db35d78120..a1d52db04e92de8d457c93a0689a85221acd7140 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/statscollector.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/statscollector.cc
>@@ -783,8 +783,8 @@ void StatsCollector::ExtractSessionInfo() {
>     StatsReport::Id local_cert_report_id, remote_cert_report_id;
>     rtc::scoped_refptr<rtc::RTCCertificate> certificate;
>     if (pc_->GetLocalCertificate(transport_name, &certificate)) {
>-      StatsReport* r =
>-          AddCertificateReports(certificate->ssl_cert_chain().GetStats());
>+      StatsReport* r = AddCertificateReports(
>+          certificate->GetSSLCertificateChain().GetStats());
>       if (r)
>         local_cert_report_id = r->id();
>     }
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/statscollector_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/statscollector_unittest.cc
>index fa4d3db81bd7c576f75cd1977294e36ddf251f89..53eb2eaa5eaba4677205bd3a9416a3420a6767d1 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/statscollector_unittest.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/statscollector_unittest.cc
>@@ -22,6 +22,7 @@
> #include "pc/test/fakevideotracksource.h"
> #include "pc/videotrack.h"
> #include "rtc_base/fakesslidentity.h"
>+#include "rtc_base/messagedigest.h"
> #include "rtc_base/third_party/base64/base64.h"
> #include "test/gtest.h"
> 
>@@ -56,14 +57,6 @@ class FakeAudioProcessor : public AudioProcessorInterface {
>   ~FakeAudioProcessor() {}
> 
>  private:
>-  void GetStats(AudioProcessorInterface::AudioProcessorStats* stats) override {
>-    stats->typing_noise_detected = true;
>-    stats->echo_return_loss = 2;
>-    stats->echo_return_loss_enhancement = 3;
>-    stats->echo_delay_median_ms = 4;
>-    stats->echo_delay_std_ms = 6;
>-  }
>-
>   AudioProcessorInterface::AudioProcessorStatistics GetStats(
>       bool has_recv_streams) override {
>     AudioProcessorStatistics stats;
>@@ -107,14 +100,6 @@ class FakeAudioProcessorWithInitValue : public AudioProcessorInterface {
>   ~FakeAudioProcessorWithInitValue() {}
> 
>  private:
>-  void GetStats(AudioProcessorInterface::AudioProcessorStats* stats) override {
>-    stats->typing_noise_detected = false;
>-    stats->echo_return_loss = -100;
>-    stats->echo_return_loss_enhancement = -100;
>-    stats->echo_delay_median_ms = -1;
>-    stats->echo_delay_std_ms = -1;
>-  }
>-
>   AudioProcessorInterface::AudioProcessorStatistics GetStats(
>       bool /*has_recv_streams*/) override {
>     AudioProcessorStatistics stats;
>@@ -494,10 +479,10 @@ void InitVoiceSenderInfo(cricket::VoiceSenderInfo* voice_sender_info) {
>   voice_sender_info->packets_lost = 105;
>   voice_sender_info->ext_seqnum = 106;
>   voice_sender_info->audio_level = 107;
>-  voice_sender_info->echo_return_loss = 108;
>-  voice_sender_info->echo_return_loss_enhancement = 109;
>-  voice_sender_info->echo_delay_median_ms = 110;
>-  voice_sender_info->echo_delay_std_ms = 111;
>+  voice_sender_info->apm_statistics.echo_return_loss = 108;
>+  voice_sender_info->apm_statistics.echo_return_loss_enhancement = 109;
>+  voice_sender_info->apm_statistics.delay_median_ms = 110;
>+  voice_sender_info->apm_statistics.delay_standard_deviation_ms = 111;
>   voice_sender_info->typing_noise_detected = false;
>   voice_sender_info->ana_statistics.bitrate_action_counter = 112;
>   voice_sender_info->ana_statistics.channel_action_counter = 113;
>@@ -641,7 +626,7 @@ class StatsCollectorTest : public testing::Test {
>             std::unique_ptr<rtc::SSLIdentity>(local_identity.GetReference())));
>     pc->SetLocalCertificate(kTransportName, local_certificate);
>     pc->SetRemoteCertChain(kTransportName,
>-                           remote_identity.cert_chain().UniqueCopy());
>+                           remote_identity.cert_chain().Clone());
> 
>     stats->UpdateStats(PeerConnectionInterface::kStatsOutputLevelStandard);
> 
>@@ -1567,7 +1552,7 @@ TEST_P(StatsCollectorTrackTest, GetStatsAfterRemoveAudioStream) {
> 
>   // Verifies the values in the track report, no value will be changed by the
>   // AudioTrackInterface::GetSignalValue() and
>-  // AudioProcessorInterface::AudioProcessorStats::GetStats();
>+  // AudioProcessorInterface::GetStats();
>   VerifyVoiceSenderInfoReport(report, voice_sender_info);
> }
> 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/fakepeerconnectionforstats.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/fakepeerconnectionforstats.h
>index 642fa011eb4e88096c8249e26c20a572fe9e77d2..af86639eca4d59849f6d5ee89a451e4b6400b217 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/fakepeerconnectionforstats.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/fakepeerconnectionforstats.h
>@@ -126,7 +126,7 @@ class FakePeerConnectionForStats : public FakePeerConnectionBase {
>     voice_channel_ = absl::make_unique<cricket::VoiceChannel>(
>         worker_thread_, network_thread_, signaling_thread_, nullptr,
>         std::move(voice_media_channel), mid, kDefaultSrtpRequired,
>-        rtc::CryptoOptions());
>+        webrtc::CryptoOptions());
>     voice_channel_->set_transport_name_for_testing(transport_name);
>     GetOrCreateFirstTransceiverOfType(cricket::MEDIA_TYPE_AUDIO)
>         ->internal()
>@@ -144,7 +144,7 @@ class FakePeerConnectionForStats : public FakePeerConnectionBase {
>     video_channel_ = absl::make_unique<cricket::VideoChannel>(
>         worker_thread_, network_thread_, signaling_thread_,
>         std::move(video_media_channel), mid, kDefaultSrtpRequired,
>-        rtc::CryptoOptions());
>+        webrtc::CryptoOptions());
>     video_channel_->set_transport_name_for_testing(transport_name);
>     GetOrCreateFirstTransceiverOfType(cricket::MEDIA_TYPE_VIDEO)
>         ->internal()
>@@ -319,7 +319,7 @@ class FakePeerConnectionForStats : public FakePeerConnectionBase {
>       const std::string& transport_name) override {
>     auto it = remote_cert_chains_by_transport_.find(transport_name);
>     if (it != remote_cert_chains_by_transport_.end()) {
>-      return it->second->UniqueCopy();
>+      return it->second->Clone();
>     } else {
>       return nullptr;
>     }
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/mock_channelinterface.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/mock_channelinterface.h
>new file mode 100644
>index 0000000000000000000000000000000000000000..4c32a8fb8d979c05d99004cd1d44c42bbf4edd60
>--- /dev/null
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/mock_channelinterface.h
>@@ -0,0 +1,47 @@
>+/*
>+ *  Copyright 2018 The WebRTC project authors. All Rights Reserved.
>+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef PC_TEST_MOCK_CHANNELINTERFACE_H_ >+#define PC_TEST_MOCK_CHANNELINTERFACE_H_ >+ >+#include <string> >+ >+#include "pc/channelinterface.h" >+#include "test/gmock.h" >+ >+namespace cricket { >+ >+// Mock class for BaseChannel. >+// Use this class in unit tests to avoid dependency on a specific >+// implementation of BaseChannel. >+class MockChannelInterface : public cricket::ChannelInterface { >+ public: >+ MOCK_CONST_METHOD0(media_type, cricket::MediaType()); >+ MOCK_CONST_METHOD0(media_channel, MediaChannel*()); >+ MOCK_CONST_METHOD0(transport_name, const std::string&()); >+ MOCK_CONST_METHOD0(content_name, const std::string&()); >+ MOCK_CONST_METHOD0(enabled, bool()); >+ MOCK_METHOD1(Enable, bool(bool)); >+ MOCK_METHOD0(SignalFirstPacketReceived, >+ sigslot::signal1<ChannelInterface*>&()); >+ MOCK_METHOD3(SetLocalContent, >+ bool(const cricket::MediaContentDescription*, >+ webrtc::SdpType, >+ std::string*)); >+ MOCK_METHOD3(SetRemoteContent, >+ bool(const cricket::MediaContentDescription*, >+ webrtc::SdpType, >+ std::string*)); >+ MOCK_METHOD1(SetRtpTransport, bool(webrtc::RtpTransportInternal*)); >+}; >+ >+} // namespace cricket >+ >+#endif // PC_TEST_MOCK_CHANNELINTERFACE_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/mock_rtpreceiverinternal.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/mock_rtpreceiverinternal.h >index 850d5a9f8e5990d010fd135c98e055101bccca1a..84b0d71766b9705e55b5eb408dbb81c82455664c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/mock_rtpreceiverinternal.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/mock_rtpreceiverinternal.h >@@ -41,8 
+41,7 @@ class MockRtpReceiverInternal : public RtpReceiverInternal { > > // RtpReceiverInternal methods. > MOCK_METHOD0(Stop, void()); >- MOCK_METHOD1(SetVoiceMediaChannel, void(cricket::VoiceMediaChannel*)); >- MOCK_METHOD1(SetVideoMediaChannel, void(cricket::VideoMediaChannel*)); >+ MOCK_METHOD1(SetMediaChannel, void(cricket::MediaChannel*)); > MOCK_METHOD1(SetupMediaChannel, void(uint32_t)); > MOCK_CONST_METHOD0(ssrc, uint32_t()); > MOCK_METHOD0(NotifyFirstPacketReceived, void()); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/mock_rtpsenderinternal.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/mock_rtpsenderinternal.h >index 5e14f7ad7de625c48b2a8de2582662f465b08269..dd4f69497df41291610c157ac370dd9d3c0a75a1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/mock_rtpsenderinternal.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/mock_rtpsenderinternal.h >@@ -39,8 +39,7 @@ class MockRtpSenderInternal : public RtpSenderInternal { > rtc::scoped_refptr<FrameEncryptorInterface>()); > > // RtpSenderInternal methods. 
>- MOCK_METHOD1(SetVoiceMediaChannel, void(cricket::VoiceMediaChannel*)); >- MOCK_METHOD1(SetVideoMediaChannel, void(cricket::VideoMediaChannel*)); >+ MOCK_METHOD1(SetMediaChannel, void(cricket::MediaChannel*)); > MOCK_METHOD1(SetSsrc, void(uint32_t)); > MOCK_METHOD1(set_stream_ids, void(const std::vector<std::string>&)); > MOCK_METHOD1(set_init_send_encodings, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/mockpeerconnectionobservers.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/mockpeerconnectionobservers.h >index 6f5e9bc665259e710b368cdb4d6e59908c0620f8..62c0c9415f6c4183bece1278870e2b067ad43903 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/mockpeerconnectionobservers.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/mockpeerconnectionobservers.h >@@ -67,6 +67,7 @@ class MockPeerConnectionObserver : public PeerConnectionObserver { > } > void OnSignalingChange( > PeerConnectionInterface::SignalingState new_state) override { >+ RTC_DCHECK(pc_); > RTC_DCHECK(pc_->signaling_state() == new_state); > state_ = new_state; > } >@@ -92,6 +93,7 @@ class MockPeerConnectionObserver : public PeerConnectionObserver { > > void OnIceConnectionChange( > PeerConnectionInterface::IceConnectionState new_state) override { >+ RTC_DCHECK(pc_); > RTC_DCHECK(pc_->ice_connection_state() == new_state); > // When ICE is finished, the caller will get to a kIceConnectionCompleted > // state, because it has the ICE controlling role, while the callee >@@ -104,12 +106,14 @@ class MockPeerConnectionObserver : public PeerConnectionObserver { > } > void OnIceGatheringChange( > PeerConnectionInterface::IceGatheringState new_state) override { >+ RTC_DCHECK(pc_); > RTC_DCHECK(pc_->ice_gathering_state() == new_state); > ice_gathering_complete_ = > new_state == PeerConnectionInterface::kIceGatheringComplete; > callback_triggered_ = true; > } > void OnIceCandidate(const IceCandidateInterface* candidate) override { >+ RTC_DCHECK(pc_); > 
RTC_DCHECK(PeerConnectionInterface::kIceGatheringNew != > pc_->ice_gathering_state()); > candidates_.push_back(absl::make_unique<JsepIceCandidate>( >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/peerconnectiontestwrapper.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/peerconnectiontestwrapper.cc >index 9be1596373c681ad420c5688d80c56003373a7f4..a1db9ed9e6f30c3e072695494e08c5a68f9eb094 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/peerconnectiontestwrapper.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/peerconnectiontestwrapper.cc >@@ -12,6 +12,7 @@ > #include <utility> > #include <vector> > >+#include "api/create_peerconnection_factory.h" > #include "api/video_codecs/builtin_video_decoder_factory.h" > #include "api/video_codecs/builtin_video_encoder_factory.h" > #include "modules/audio_processing/include/audio_processing.h" >@@ -22,6 +23,7 @@ > #include "pc/test/mockpeerconnectionobservers.h" > #include "pc/test/peerconnectiontestwrapper.h" > #include "rtc_base/gunit.h" >+#include "rtc_base/thread_checker.h" > #include "rtc_base/timeutils.h" > > using webrtc::FakeConstraints; >@@ -64,9 +66,20 @@ PeerConnectionTestWrapper::PeerConnectionTestWrapper( > rtc::Thread* worker_thread) > : name_(name), > network_thread_(network_thread), >- worker_thread_(worker_thread) {} >+ worker_thread_(worker_thread) { >+ pc_thread_checker_.DetachFromThread(); >+} > >-PeerConnectionTestWrapper::~PeerConnectionTestWrapper() {} >+PeerConnectionTestWrapper::~PeerConnectionTestWrapper() { >+ RTC_DCHECK_RUN_ON(&pc_thread_checker_); >+ // Either network_thread or worker_thread might be active at this point. >+ // Relying on ~PeerConnection to properly wait for them doesn't work, >+ // as a vptr race might occur (before we enter the destruction body). 
>+ // See: bugs.webrtc.org/9847 >+ if (pc()) { >+ pc()->Close(); >+ } >+} > > bool PeerConnectionTestWrapper::CreatePc( > const webrtc::PeerConnectionInterface::RTCConfiguration& config, >@@ -75,6 +88,8 @@ bool PeerConnectionTestWrapper::CreatePc( > std::unique_ptr<cricket::PortAllocator> port_allocator( > new cricket::FakePortAllocator(network_thread_, nullptr)); > >+ RTC_DCHECK_RUN_ON(&pc_thread_checker_); >+ > fake_audio_capture_module_ = FakeAudioCaptureModule::Create(); > if (fake_audio_capture_module_ == NULL) { > return false; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/peerconnectiontestwrapper.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/peerconnectiontestwrapper.h >index 21ba89acbf07305839a51d5198fad4dd89111f52..b6a57f35d3823311ba5f3d3d7f6646ff07d20888 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/peerconnectiontestwrapper.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/test/peerconnectiontestwrapper.h >@@ -20,6 +20,7 @@ > #include "pc/test/fakeaudiocapturemodule.h" > #include "pc/test/fakevideotrackrenderer.h" > #include "rtc_base/third_party/sigslot/sigslot.h" >+#include "rtc_base/thread_checker.h" > > class PeerConnectionTestWrapper > : public webrtc::PeerConnectionObserver, >@@ -106,6 +107,7 @@ class PeerConnectionTestWrapper > std::string name_; > rtc::Thread* const network_thread_; > rtc::Thread* const worker_thread_; >+ rtc::ThreadChecker pc_thread_checker_; > rtc::scoped_refptr<webrtc::PeerConnectionInterface> peer_connection_; > rtc::scoped_refptr<webrtc::PeerConnectionFactoryInterface> > peer_connection_factory_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/videocapturertracksource_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/videocapturertracksource_unittest.cc >index d2b8e1574cebd083d62372dc248ce7ec648ce88c..67bec345b697a4a725a332443fa50c3c961e1ec3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/videocapturertracksource_unittest.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/videocapturertracksource_unittest.cc >@@ -126,7 +126,7 @@ class VideoCapturerTrackSourceTest : public testing::Test { > } > > void CaptureSingleFrame() { >- rtc::Event event(false, false); >+ rtc::Event event; > task_queue_.PostTask([this, &event]() { > ASSERT_TRUE(capturer_->CaptureFrame()); > event.Set(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/videotrack.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/videotrack.h >index e669e0814c7253e0a873774f8933a34216bb550d..f119dd86d83de6806d63d072a7ea3e29882647b6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/videotrack.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/videotrack.h >@@ -17,6 +17,7 @@ > #include "media/base/videosourcebase.h" > #include "pc/mediastreamtrack.h" > #include "rtc_base/scoped_ref_ptr.h" >+#include "rtc_base/thread.h" > #include "rtc_base/thread_checker.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/videotracksource.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/videotracksource.h >index 50488dd2c6011d336abe9bba277e10f646467e53..cedb75b3c88dd730f6387595e1b16a0a62ada94b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/videotracksource.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/videotracksource.h >@@ -15,13 +15,14 @@ > #include "api/notifier.h" > #include "api/video/video_sink_interface.h" > #include "media/base/mediachannel.h" >+#include "rtc_base/system/rtc_export.h" > #include "rtc_base/thread_checker.h" > > namespace webrtc { > > // VideoTrackSource is a convenience base class for implementations of > // VideoTrackSourceInterface. 
>-class VideoTrackSource : public Notifier<VideoTrackSourceInterface> { >+class RTC_EXPORT VideoTrackSource : public Notifier<VideoTrackSourceInterface> { > public: > explicit VideoTrackSource(bool remote); > void SetState(SourceState new_state); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/webrtcsdp.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/webrtcsdp.cc >index 665b4d411a050a98430f26d4f4fab347f0ba47a8..0cd6beceefa377dffe7f934b9e5b96635046c545 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/webrtcsdp.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/webrtcsdp.cc >@@ -20,13 +20,15 @@ > #include <set> > #include <string> > #include <unordered_map> >+#include <utility> > #include <vector> > >-#include "api/mediatypes.h" >+#include "absl/strings/match.h" > #include "api/candidate.h" > #include "api/cryptoparams.h" > #include "api/jsepicecandidate.h" > #include "api/jsepsessiondescription.h" >+#include "api/mediatypes.h" > // for RtpExtension > #include "api/rtpparameters.h" > #include "media/base/codec.h" >@@ -119,6 +121,7 @@ static const char kAttributeRtcpMux[] = "rtcp-mux"; > static const char kAttributeRtcpReducedSize[] = "rtcp-rsize"; > static const char kAttributeSsrc[] = "ssrc"; > static const char kSsrcAttributeCname[] = "cname"; >+static const char kAttributeExtmapAllowMixed[] = "extmap-allow-mixed"; > static const char kAttributeExtmap[] = "extmap"; > // draft-alvestrand-mmusic-msid-01 > // a=msid-semantic: WMS >@@ -334,9 +337,10 @@ static bool ParseIceOptions(const std::string& line, > static bool ParseExtmap(const std::string& line, > RtpExtension* extmap, > SdpParseError* error); >-static bool ParseFingerprintAttribute(const std::string& line, >- rtc::SSLFingerprint** fingerprint, >- SdpParseError* error); >+static bool ParseFingerprintAttribute( >+ const std::string& line, >+ std::unique_ptr<rtc::SSLFingerprint>* fingerprint, >+ SdpParseError* error); > static bool ParseDtlsSetup(const std::string& line, > 
cricket::ConnectionRole* role, > SdpParseError* error); >@@ -852,6 +856,12 @@ std::string SdpSerialize(const JsepSessionDescription& jdesc) { > AddLine(group_line, &message); > } > >+ // Mixed one- and two-byte header extension. >+ if (desc->extmap_allow_mixed()) { >+ InitAttrLine(kAttributeExtmapAllowMixed, &os); >+ AddLine(os.str(), &message); >+ } >+ > // MediaStream semantics > InitAttrLine(kAttributeMsidSemantics, &os); > os << kSdpDelimiterColon << " " << kMediaStreamSemantic; >@@ -1303,8 +1313,8 @@ void BuildMediaDescription(const ContentInfo* content_info, > > if (data_desc->use_sctpmap()) { > for (const cricket::DataCodec& codec : data_desc->codecs()) { >- if (cricket::CodecNamesEq(codec.name, >- cricket::kGoogleSctpDataCodecName) && >+ if (absl::EqualsIgnoreCase(codec.name, >+ cricket::kGoogleSctpDataCodecName) && > codec.GetParam(cricket::kCodecParamPort, &sctp_port)) { > break; > } >@@ -1482,7 +1492,17 @@ void BuildRtpContentAttributes(const MediaContentDescription* media_desc, > int msid_signaling, > std::string* message) { > rtc::StringBuilder os; >- // RFC 5285 >+ // RFC 8285 >+ // a=extmap-allow-mixed >+ // The attribute MUST be either on session level or media level. We support >+ // responding on both levels, however, we don't respond on media level if it's >+ // set on session level. >+ if (media_desc->extmap_allow_mixed_enum() == >+ MediaContentDescription::kMedia) { >+ InitAttrLine(kAttributeExtmapAllowMixed, &os); >+ AddLine(os.str(), message); >+ } >+ // RFC 8285 > // a=extmap:<value>["/"<direction>] <URI> <extensionattributes> > // The definitions MUST be either all session level or all media level. This > // implementation uses all media level. >@@ -1609,7 +1629,12 @@ void BuildRtpContentAttributes(const MediaContentDescription* media_desc, > // Since a=ssrc msid signaling is used in Plan B SDP semantics, and > // multiple stream ids are not supported for Plan B, we are only adding > // a line for the first media stream id here. 
>- const std::string& stream_id = track.first_stream_id(); >+ const std::string& track_stream_id = track.first_stream_id(); >+ // We use a special msid-id value of "-" to represent no streams, >+ // for Unified Plan compatibility. Plan B will always have a >+ // track_stream_id. >+ const std::string& stream_id = >+ track_stream_id.empty() ? kNoStreamMsid : track_stream_id; > InitAttrLine(kAttributeSsrc, &os); > os << kSdpDelimiterColon << ssrc << kSdpDelimiterSpace > << kSsrcAttributeMsid << kSdpDelimiterColon << stream_id >@@ -1727,7 +1752,7 @@ void AddRtcpFbLines(const T& codec, std::string* message) { > > bool AddSctpDataCodec(DataContentDescription* media_desc, int sctp_port) { > for (const auto& codec : media_desc->codecs()) { >- if (cricket::CodecNamesEq(codec.name, cricket::kGoogleSctpDataCodecName)) { >+ if (absl::EqualsIgnoreCase(codec.name, cricket::kGoogleSctpDataCodecName)) { > return ParseFailed("", "Can't have multiple sctp port attributes.", NULL); > } > } >@@ -1996,7 +2021,7 @@ bool ParseSessionDescription(const std::string& message, > std::string line; > > desc->set_msid_supported(false); >- >+ desc->set_extmap_allow_mixed(false); > // RFC 4566 > // v= (protocol version) > if (!GetLineWithType(message, pos, &line, kLineTypeVersion)) { >@@ -2117,11 +2142,11 @@ bool ParseSessionDescription(const std::string& message, > "Can't have multiple fingerprint attributes at the same level.", > error); > } >- rtc::SSLFingerprint* fingerprint = NULL; >+ std::unique_ptr<rtc::SSLFingerprint> fingerprint; > if (!ParseFingerprintAttribute(line, &fingerprint, error)) { > return false; > } >- session_td->identity_fingerprint.reset(fingerprint); >+ session_td->identity_fingerprint = std::move(fingerprint); > } else if (HasAttribute(line, kAttributeSetup)) { > if (!ParseDtlsSetup(line, &(session_td->connection_role), error)) { > return false; >@@ -2133,6 +2158,8 @@ bool ParseSessionDescription(const std::string& message, > } > desc->set_msid_supported( > 
CaseInsensitiveFind(semantics, kMediaStreamSemantic)); >+ } else if (HasAttribute(line, kAttributeExtmapAllowMixed)) { >+ desc->set_extmap_allow_mixed(true); > } else if (HasAttribute(line, kAttributeExtmap)) { > RtpExtension extmap; > if (!ParseExtmap(line, &extmap, error)) { >@@ -2166,9 +2193,10 @@ bool ParseGroupAttribute(const std::string& line, > return true; > } > >-static bool ParseFingerprintAttribute(const std::string& line, >- rtc::SSLFingerprint** fingerprint, >- SdpParseError* error) { >+static bool ParseFingerprintAttribute( >+ const std::string& line, >+ std::unique_ptr<rtc::SSLFingerprint>* fingerprint, >+ SdpParseError* error) { > if (!IsLineType(line, kLineTypeAttributes) || > !HasAttribute(line, kAttributeFingerprint)) { > return ParseFailedExpectLine(line, 0, kLineTypeAttributes, >@@ -2194,7 +2222,8 @@ static bool ParseFingerprintAttribute(const std::string& line, > ::tolower); > > // The second field is the digest value. De-hexify it. >- *fingerprint = rtc::SSLFingerprint::CreateFromRfc4572(algorithm, fields[1]); >+ *fingerprint = >+ rtc::SSLFingerprint::CreateUniqueFromRfc4572(algorithm, fields[1]); > if (!*fingerprint) { > return ParseFailed(line, "Failed to create fingerprint from the digest.", > error); >@@ -2834,12 +2863,11 @@ bool ParseContent(const std::string& message, > return false; > } > } else if (HasAttribute(line, kAttributeFingerprint)) { >- rtc::SSLFingerprint* fingerprint = NULL; >- >+ std::unique_ptr<rtc::SSLFingerprint> fingerprint; > if (!ParseFingerprintAttribute(line, &fingerprint, error)) { > return false; > } >- transport->identity_fingerprint.reset(fingerprint); >+ transport->identity_fingerprint = std::move(fingerprint); > } else if (HasAttribute(line, kAttributeSetup)) { > if (!ParseDtlsSetup(line, &(transport->connection_role), error)) { > return false; >@@ -2902,6 +2930,9 @@ bool ParseContent(const std::string& message, > media_desc->set_direction(RtpTransceiverDirection::kInactive); > } else if (HasAttribute(line, 
kAttributeSendRecv)) { > media_desc->set_direction(RtpTransceiverDirection::kSendRecv); >+ } else if (HasAttribute(line, kAttributeExtmapAllowMixed)) { >+ media_desc->set_extmap_allow_mixed_enum( >+ MediaContentDescription::kMedia); > } else if (HasAttribute(line, kAttributeExtmap)) { > RtpExtension extmap; > if (!ParseExtmap(line, &extmap, error)) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/webrtcsdp.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/webrtcsdp.h >index 2d842a1d5d4b8091bb74a94f96efab609d077d29..f185c0893200898bd1706519f1b659d5f496b2b1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/webrtcsdp.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/webrtcsdp.h >@@ -22,6 +22,8 @@ > > #include <string> > >+#include "rtc_base/system/rtc_export.h" >+ > namespace cricket { > class Candidate; > } // namespace cricket >@@ -46,7 +48,8 @@ std::string SdpSerializeCandidate(const IceCandidateInterface& candidate); > > // Serializes a cricket Candidate. > // candidate - The candidate to be serialized. >-std::string SdpSerializeCandidate(const cricket::Candidate& candidate); >+RTC_EXPORT std::string SdpSerializeCandidate( >+ const cricket::Candidate& candidate); > > // Deserializes the passed in SDP string to a JsepSessionDescription. > // message - SDP string to be Deserialized. >@@ -76,10 +79,10 @@ bool SdpDeserializeCandidate(const std::string& message, > // candidate - The cricket Candidate from the SDP string. > // error - The detail error information when parsing fails. > // return - true on success, false on failure. 
>-bool SdpDeserializeCandidate(const std::string& transport_name, >- const std::string& message, >- cricket::Candidate* candidate, >- SdpParseError* error); >+RTC_EXPORT bool SdpDeserializeCandidate(const std::string& transport_name, >+ const std::string& message, >+ cricket::Candidate* candidate, >+ SdpParseError* error); > > } // namespace webrtc > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/webrtcsdp_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/webrtcsdp_unittest.cc >index 9c1055463153be712254a0342127a80046ef66ee..e7c5c77bb61ade2b5f948926610257b10965d07a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/webrtcsdp_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/webrtcsdp_unittest.cc >@@ -22,6 +22,7 @@ > #include "rtc_base/checks.h" > #include "rtc_base/gunit.h" > #include "rtc_base/logging.h" >+#include "rtc_base/messagedigest.h" > #include "rtc_base/stringencode.h" > #include "rtc_base/stringutils.h" > >@@ -89,6 +90,7 @@ static const char kAttributeCryptoVideo[] = > static const char kFingerprint[] = > "a=fingerprint:sha-1 " > "4A:AD:B9:B1:3F:82:18:3B:54:02:12:DF:3E:5D:49:6B:19:E5:7C:AB\r\n"; >+static const char kExtmapAllowMixed[] = "a=extmap-allow-mixed\r\n"; > static const int kExtmapId = 1; > static const char kExtmapUri[] = "http://example.com/082005/ext.htm#ttime"; > static const char kExtmap[] = >@@ -621,8 +623,8 @@ static const char kUnifiedPlanSdpFullStringWithSpecialMsid[] = > "generation 2\r\n" > "a=ice-ufrag:ufrag_voice\r\na=ice-pwd:pwd_voice\r\n" > "a=mid:audio_content_name\r\n" >- "a=msid:local_stream_1 audio_track_id_1\r\n" > "a=sendrecv\r\n" >+ "a=msid:local_stream_1 audio_track_id_1\r\n" > "a=rtcp-mux\r\n" > "a=rtcp-rsize\r\n" > "a=crypto:1 AES_CM_128_HMAC_SHA1_32 " >@@ -641,9 +643,9 @@ static const char kUnifiedPlanSdpFullStringWithSpecialMsid[] = > "a=rtcp:9 IN IP4 0.0.0.0\r\n" > "a=ice-ufrag:ufrag_voice_2\r\na=ice-pwd:pwd_voice_2\r\n" > "a=mid:audio_content_name_2\r\n" >+ "a=sendrecv\r\n" 
> "a=msid:local_stream_1 audio_track_id_2\r\n" > "a=msid:local_stream_2 audio_track_id_2\r\n" >- "a=sendrecv\r\n" > "a=rtcp-mux\r\n" > "a=rtcp-rsize\r\n" > "a=crypto:1 AES_CM_128_HMAC_SHA1_32 " >@@ -664,8 +666,8 @@ static const char kUnifiedPlanSdpFullStringWithSpecialMsid[] = > "a=rtcp:9 IN IP4 0.0.0.0\r\n" > "a=ice-ufrag:ufrag_voice_3\r\na=ice-pwd:pwd_voice_3\r\n" > "a=mid:audio_content_name_3\r\n" >- "a=msid:- audio_track_id_3\r\n" > "a=sendrecv\r\n" >+ "a=msid:- audio_track_id_3\r\n" > "a=rtcp-mux\r\n" > "a=rtcp-rsize\r\n" > "a=crypto:1 AES_CM_128_HMAC_SHA1_32 " >@@ -674,7 +676,10 @@ static const char kUnifiedPlanSdpFullStringWithSpecialMsid[] = > "a=rtpmap:111 opus/48000/2\r\n" > "a=rtpmap:103 ISAC/16000\r\n" > "a=rtpmap:104 ISAC/32000\r\n" >- "a=ssrc:7 cname:stream_2_cname\r\n"; >+ "a=ssrc:7 cname:stream_2_cname\r\n" >+ "a=ssrc:7 msid:- audio_track_id_3\r\n" >+ "a=ssrc:7 mslabel:-\r\n" >+ "a=ssrc:7 label:audio_track_id_3\r\n"; > > // One candidate reference string as per W3c spec. > // candidate:<blah> not a=candidate:<blah>CRLF >@@ -1160,7 +1165,7 @@ class WebRtcSdpTest : public testing::Test { > // with 3 audio MediaContentDescriptions with special StreamParams that > // contain 0 or multiple stream ids: - audio track 1 has 1 media stream id - > // audio track 2 has 2 media stream ids - audio track 3 has 0 media stream ids >- void MakeUnifiedPlanDescriptionMultipleStreamIds() { >+ void MakeUnifiedPlanDescriptionMultipleStreamIds(const int msid_signaling) { > desc_.RemoveContentByName(kVideoContentName); > desc_.RemoveTransportInfoByName(kVideoContentName); > RemoveVideoCandidates(); >@@ -1188,9 +1193,7 @@ class WebRtcSdpTest : public testing::Test { > desc_.AddContent(kAudioContentName3, MediaProtocolType::kRtp, audio_desc_3); > EXPECT_TRUE(desc_.AddTransportInfo(TransportInfo( > kAudioContentName3, TransportDescription(kUfragVoice3, kPwdVoice3)))); >- // Make sure to create both a=msid lines. 
>- desc_.set_msid_signaling(cricket::kMsidSignalingMediaSection | >- cricket::kMsidSignalingSsrcAttribute); >+ desc_.set_msid_signaling(msid_signaling); > ASSERT_TRUE(jdesc_.Initialize(desc_.Copy(), jdesc_.session_id(), > jdesc_.session_version())); > } >@@ -1282,6 +1285,9 @@ class WebRtcSdpTest : public testing::Test { > // streams > EXPECT_EQ(cd1->streams(), cd2->streams()); > >+ // extmap-allow-mixed >+ EXPECT_EQ(cd1->extmap_allow_mixed_enum(), cd2->extmap_allow_mixed_enum()); >+ > // extmap > ASSERT_EQ(cd1->rtp_header_extensions().size(), > cd2->rtp_header_extensions().size()); >@@ -1398,6 +1404,7 @@ class WebRtcSdpTest : public testing::Test { > > // global attributes > EXPECT_EQ(desc1.msid_supported(), desc2.msid_supported()); >+ EXPECT_EQ(desc1.extmap_allow_mixed(), desc2.extmap_allow_mixed()); > } > > bool CompareSessionDescription(const JsepSessionDescription& desc1, >@@ -1495,8 +1502,7 @@ class WebRtcSdpTest : public testing::Test { > void AddFingerprint() { > desc_.RemoveTransportInfoByName(kAudioContentName); > desc_.RemoveTransportInfoByName(kVideoContentName); >- rtc::SSLFingerprint fingerprint(rtc::DIGEST_SHA_1, kIdentityDigest, >- sizeof(kIdentityDigest)); >+ rtc::SSLFingerprint fingerprint(rtc::DIGEST_SHA_1, kIdentityDigest); > EXPECT_TRUE(desc_.AddTransportInfo(TransportInfo( > kAudioContentName, > TransportDescription(std::vector<std::string>(), kUfragVoice, kPwdVoice, >@@ -1540,6 +1546,27 @@ class WebRtcSdpTest : public testing::Test { > } > } > >+ // Removes all a=ssrc lines from the SDP string, except for the >+ // "a=ssrc:... cname:..." lines. 
>+ void RemoveSsrcMsidLinesFromSdpString(std::string* sdp_string) { >+ const char kAttributeSsrc[] = "a=ssrc"; >+ const char kAttributeCname[] = "cname"; >+ size_t ssrc_line_pos = sdp_string->find(kAttributeSsrc); >+ while (ssrc_line_pos != std::string::npos) { >+ size_t beg_line_pos = sdp_string->rfind('\n', ssrc_line_pos); >+ size_t end_line_pos = sdp_string->find('\n', ssrc_line_pos); >+ size_t cname_pos = sdp_string->find(kAttributeCname, ssrc_line_pos); >+ if (cname_pos == std::string::npos || cname_pos > end_line_pos) { >+ // Only erase a=ssrc lines that don't contain "cname". >+ sdp_string->erase(beg_line_pos, end_line_pos - beg_line_pos); >+ ssrc_line_pos = sdp_string->find(kAttributeSsrc, beg_line_pos); >+ } else { >+ // Skip the "a=ssrc:... cname" line and find the next "a=ssrc" line. >+ ssrc_line_pos = sdp_string->find(kAttributeSsrc, end_line_pos); >+ } >+ } >+ } >+ > // Removes all a=ssrc lines from the SDP string. > void RemoveSsrcLinesFromSdpString(std::string* sdp_string) { > const char kAttributeSsrc[] = "a=ssrc"; >@@ -2070,9 +2097,8 @@ TEST_F(WebRtcSdpTest, SerializeWithSctpDataChannelAndNewPort) { > > char default_portstr[16]; > char new_portstr[16]; >- rtc::sprintfn(default_portstr, sizeof(default_portstr), "%d", >- kDefaultSctpPort); >- rtc::sprintfn(new_portstr, sizeof(new_portstr), "%d", kNewPort); >+ snprintf(default_portstr, sizeof(default_portstr), "%d", kDefaultSctpPort); >+ snprintf(new_portstr, sizeof(new_portstr), "%d", kNewPort); > rtc::replace_substrs(default_portstr, strlen(default_portstr), new_portstr, > strlen(new_portstr), &expected_sdp); > >@@ -2094,6 +2120,25 @@ TEST_F(WebRtcSdpTest, SerializeSessionDescriptionWithDataChannelAndBandwidth) { > EXPECT_EQ(expected_sdp, message); > } > >+TEST_F(WebRtcSdpTest, SerializeSessionDescriptionWithExtmapAllowMixed) { >+ jdesc_.description()->set_extmap_allow_mixed(true); >+ TestSerialize(jdesc_); >+} >+ >+TEST_F(WebRtcSdpTest, SerializeMediaContentDescriptionWithExtmapAllowMixed) { >+ 
cricket::MediaContentDescription* video_desc = >+ jdesc_.description()->GetContentDescriptionByName(kVideoContentName); >+ ASSERT_TRUE(video_desc); >+ cricket::MediaContentDescription* audio_desc = >+ jdesc_.description()->GetContentDescriptionByName(kAudioContentName); >+ ASSERT_TRUE(audio_desc); >+ video_desc->set_extmap_allow_mixed_enum( >+ cricket::MediaContentDescription::kMedia); >+ audio_desc->set_extmap_allow_mixed_enum( >+ cricket::MediaContentDescription::kMedia); >+ TestSerialize(jdesc_); >+} >+ > TEST_F(WebRtcSdpTest, SerializeSessionDescriptionWithExtmap) { > bool encrypted = false; > AddExtmap(encrypted); >@@ -2431,6 +2476,53 @@ TEST_F(WebRtcSdpTest, DeserializeSessionDescriptionWithoutMsid) { > EXPECT_TRUE(CompareSessionDescription(jdesc_, jdesc)); > } > >+TEST_F(WebRtcSdpTest, DeserializeSessionDescriptionWithExtmapAllowMixed) { >+ jdesc_.description()->set_extmap_allow_mixed(true); >+ std::string sdp_with_extmap_allow_mixed = kSdpFullString; >+ InjectAfter("t=0 0\r\n", kExtmapAllowMixed, &sdp_with_extmap_allow_mixed); >+ // Deserialize >+ JsepSessionDescription jdesc_deserialized(kDummyType); >+ EXPECT_TRUE(SdpDeserialize(sdp_with_extmap_allow_mixed, &jdesc_deserialized)); >+ // Verify >+ EXPECT_TRUE(CompareSessionDescription(jdesc_, jdesc_deserialized)); >+} >+ >+TEST_F(WebRtcSdpTest, DeserializeSessionDescriptionWithoutExtmapAllowMixed) { >+ jdesc_.description()->set_extmap_allow_mixed(false); >+ std::string sdp_without_extmap_allow_mixed = kSdpFullString; >+ // Deserialize >+ JsepSessionDescription jdesc_deserialized(kDummyType); >+ EXPECT_TRUE( >+ SdpDeserialize(sdp_without_extmap_allow_mixed, &jdesc_deserialized)); >+ // Verify >+ EXPECT_TRUE(CompareSessionDescription(jdesc_, jdesc_deserialized)); >+} >+ >+TEST_F(WebRtcSdpTest, DeserializeMediaContentDescriptionWithExtmapAllowMixed) { >+ cricket::MediaContentDescription* video_desc = >+ jdesc_.description()->GetContentDescriptionByName(kVideoContentName); >+ ASSERT_TRUE(video_desc); >+ 
cricket::MediaContentDescription* audio_desc = >+ jdesc_.description()->GetContentDescriptionByName(kAudioContentName); >+ ASSERT_TRUE(audio_desc); >+ video_desc->set_extmap_allow_mixed_enum( >+ cricket::MediaContentDescription::kMedia); >+ audio_desc->set_extmap_allow_mixed_enum( >+ cricket::MediaContentDescription::kMedia); >+ >+ std::string sdp_with_extmap_allow_mixed = kSdpFullString; >+ InjectAfter("a=mid:audio_content_name\r\n", kExtmapAllowMixed, >+ &sdp_with_extmap_allow_mixed); >+ InjectAfter("a=mid:video_content_name\r\n", kExtmapAllowMixed, >+ &sdp_with_extmap_allow_mixed); >+ >+ // Deserialize >+ JsepSessionDescription jdesc_deserialized(kDummyType); >+ EXPECT_TRUE(SdpDeserialize(sdp_with_extmap_allow_mixed, &jdesc_deserialized)); >+ // Verify >+ EXPECT_TRUE(CompareSessionDescription(jdesc_, jdesc_deserialized)); >+} >+ > TEST_F(WebRtcSdpTest, DeserializeCandidate) { > JsepIceCandidate jcandidate(kDummyMid, kDummyIndex); > >@@ -3362,12 +3454,16 @@ TEST_F(WebRtcSdpTest, SerializeUnifiedPlanSessionDescription) { > } > > // This tests deserializing a Unified Plan SDP that is compatible with both >-// Unified Plan and Plan B style SDP. It tests the case for audio/video tracks >+// Unified Plan and Plan B style SDP, meaning that it contains both "a=ssrc >+// msid" lines and "a=msid " lines. It tests the case for audio/video tracks > // with no stream ids and multiple stream ids. For parsing this, the Unified > // Plan a=msid lines should take priority, because the Plan B style a=ssrc msid > // lines do not support multiple stream ids and no stream ids. >-TEST_F(WebRtcSdpTest, DeserializeUnifiedPlanSessionDescriptionSpecialMsid) { >- MakeUnifiedPlanDescriptionMultipleStreamIds(); >+TEST_F(WebRtcSdpTest, DeserializeSessionDescriptionSpecialMsid) { >+ // Create both msid lines for Plan B and Unified Plan support. 
>+ MakeUnifiedPlanDescriptionMultipleStreamIds( >+ cricket::kMsidSignalingMediaSection | >+ cricket::kMsidSignalingSsrcAttribute); > > JsepSessionDescription deserialized_description(kDummyType); > EXPECT_TRUE(SdpDeserialize(kUnifiedPlanSdpFullStringWithSpecialMsid, >@@ -3376,8 +3472,50 @@ TEST_F(WebRtcSdpTest, DeserializeUnifiedPlanSessionDescriptionSpecialMsid) { > EXPECT_TRUE(CompareSessionDescription(jdesc_, deserialized_description)); > } > >-TEST_F(WebRtcSdpTest, SerializeUnifiedPlanSessionDescriptionSpecialMsid) { >- MakeUnifiedPlanDescriptionMultipleStreamIds(); >+// Tests the serialization of a Unified Plan SDP that is compatible with both >+// Unified Plan and Plan B style SDPs, meaning that it contains both "a=ssrc >+// msid" lines and "a=msid " lines. It tests the case for no stream ids and >+// multiple stream ids. >+TEST_F(WebRtcSdpTest, SerializeSessionDescriptionSpecialMsid) { >+ // Create both msid lines for Plan B and Unified Plan support. >+ MakeUnifiedPlanDescriptionMultipleStreamIds( >+ cricket::kMsidSignalingMediaSection | >+ cricket::kMsidSignalingSsrcAttribute); >+ std::string serialized_sdp = webrtc::SdpSerialize(jdesc_); >+ // We explicitly test that the serialized SDP string is equal to the hard >+ // coded SDP string. This is necessary, because in the parser "a=msid" lines >+ // take priority over "a=ssrc msid" lines. This means if we just used >+ // TestSerialize(), it could serialize an SDP that omits "a=ssrc msid" lines, >+ // and still pass, because the deserialized version would be the same. >+ EXPECT_EQ(kUnifiedPlanSdpFullStringWithSpecialMsid, serialized_sdp); >+} >+ >+// Tests that a Unified Plan style SDP (does not contain "a=ssrc msid" lines >+// that signal stream IDs) is deserialized appropriately. It tests the case for >+// no stream ids and multiple stream ids. >+TEST_F(WebRtcSdpTest, UnifiedPlanDeserializeSessionDescriptionSpecialMsid) { >+ // Only create a=msid lines for strictly Unified Plan stream ID support. 
>+ MakeUnifiedPlanDescriptionMultipleStreamIds( >+ cricket::kMsidSignalingMediaSection); >+ >+ JsepSessionDescription deserialized_description(kDummyType); >+ std::string unified_plan_sdp_string = >+ kUnifiedPlanSdpFullStringWithSpecialMsid; >+ RemoveSsrcMsidLinesFromSdpString(&unified_plan_sdp_string); >+ EXPECT_TRUE( >+ SdpDeserialize(unified_plan_sdp_string, &deserialized_description)); >+ >+ EXPECT_TRUE(CompareSessionDescription(jdesc_, deserialized_description)); >+} >+ >+// Tests that a Unified Plan style SDP (does not contain "a=ssrc msid" lines >+// that signal stream IDs) is serialized appropriately. It tests the case for no >+// stream ids and multiple stream ids. >+TEST_F(WebRtcSdpTest, UnifiedPlanSerializeSessionDescriptionSpecialMsid) { >+ // Only create a=msid lines for strictly Unified Plan stream ID support. >+ MakeUnifiedPlanDescriptionMultipleStreamIds( >+ cricket::kMsidSignalingMediaSection); >+ > TestSerialize(jdesc_); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/webrtcsessiondescriptionfactory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/webrtcsessiondescriptionfactory.cc >index 808e411a39985852b963317666bc881b48d7a8c5..07b8e147cc8277407e26146585ea60a70c40a206 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/webrtcsessiondescriptionfactory.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/webrtcsessiondescriptionfactory.cc >@@ -116,7 +116,6 @@ void WebRtcSessionDescriptionFactory::CopyCandidatesFromSessionDescription( > } > } > >-// Private constructor called by other constructors. 
> WebRtcSessionDescriptionFactory::WebRtcSessionDescriptionFactory( > rtc::Thread* signaling_thread, > cricket::ChannelManager* channel_manager, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F00_tlm10.OUT20.sha1 b/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F00_tlm10.OUT20.sha1 >deleted file mode 100644 >index 5cb8e52cb25df314af960b1042980144525609d2..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F00_tlm10.OUT20.sha1 >+++ /dev/null >@@ -1 +0,0 @@ >-96fb5327ff7a1fe87bd4512773ce7347b4d72888 >\ No newline at end of file >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F00_tlm10.OUT30.sha1 b/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F00_tlm10.OUT30.sha1 >deleted file mode 100644 >index 1470013b72af0deff58742f2cb9f7e593e8d0eaa..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F00_tlm10.OUT30.sha1 >+++ /dev/null >@@ -1 +0,0 @@ >-ea527e8e61241ea73265abba5765793511c42291 >\ No newline at end of file >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F01_tlm10.OUT20.sha1 b/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F01_tlm10.OUT20.sha1 >deleted file mode 100644 >index d33eb7e1c7475b854d831cb4cbb82dbbc0e23a15..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F01_tlm10.OUT20.sha1 >+++ /dev/null >@@ -1 +0,0 @@ >-a5e8c268936d7c8d03edd708c675254474aed944 >\ No newline at end of file >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F01_tlm10.OUT30.sha1 b/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F01_tlm10.OUT30.sha1 >deleted file mode 100644 >index 953656fe01c0b3383da368f054faf3d3290ef052..0000000000000000000000000000000000000000 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F01_tlm10.OUT30.sha1 >+++ /dev/null >@@ -1 +0,0 @@ >-9893504bdaa560a4ddb67a99e6701e8d4244896a >\ No newline at end of file >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F02_tlm10.OUT20.sha1 b/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F02_tlm10.OUT20.sha1 >deleted file mode 100644 >index b8a52534b3ae8160c3c07242893856d9edc19a29..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F02_tlm10.OUT20.sha1 >+++ /dev/null >@@ -1 +0,0 @@ >-5e0053a0de7b4708402fcc0a1b5ae024815938a3 >\ No newline at end of file >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F02_tlm10.OUT30.sha1 b/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F02_tlm10.OUT30.sha1 >deleted file mode 100644 >index d3bbc9866d15687c7268d754cfea81bce66ea798..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F02_tlm10.OUT30.sha1 >+++ /dev/null >@@ -1 +0,0 @@ >-d57986d3dc65c1cadc41419a77ed1472b73050b4 >\ No newline at end of file >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F04.BIT20.sha1 b/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F04.BIT20.sha1 >deleted file mode 100644 >index a304fa41811a89de08c15e1af18237c152dd6552..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F04.BIT20.sha1 >+++ /dev/null >@@ -1 +0,0 @@ >-8cc1f82ee0d1b7df39840895d3e99756920b85ad >\ No newline at end of file >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F04.BIT30.sha1 b/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F04.BIT30.sha1 >deleted file mode 100644 >index c6cbe13afb4b87a11ff967a1a501465b48c7a9c2..0000000000000000000000000000000000000000 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F04.BIT30.sha1 >+++ /dev/null >@@ -1 +0,0 @@ >-85ca3d9a5ca61c1bb9d798ba6d3551945ab7667d >\ No newline at end of file >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F04.INP.sha1 b/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F04.INP.sha1 >deleted file mode 100644 >index 53118fcb35c1d62f1e6422a004dee8f1786037f6..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F04.INP.sha1 >+++ /dev/null >@@ -1 +0,0 @@ >-730581035d5af832d1b05c8f3a623ec93863b3e3 >\ No newline at end of file >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F04.OUT20.sha1 b/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F04.OUT20.sha1 >deleted file mode 100644 >index fd27f041e09eaa31a94a40c589c83ed5cef2c390..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F04.OUT20.sha1 >+++ /dev/null >@@ -1 +0,0 @@ >-985e5ffa0b98718afadfb1f02b5595fb24a9476c >\ No newline at end of file >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F04.OUT30.sha1 b/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F04.OUT30.sha1 >deleted file mode 100644 >index 3ccec4effefb7feef0439de2a76c09fd0cc2b683..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F04.OUT30.sha1 >+++ /dev/null >@@ -1 +0,0 @@ >-8b035fb0f9ce3bdc2ae6c983e232c257a515a781 >\ No newline at end of file >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F05.BIT20.sha1 b/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F05.BIT20.sha1 >deleted file mode 100644 >index 39d82e6f521a72e0b93345c29033efdb6ec419bf..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F05.BIT20.sha1 
>+++ /dev/null >@@ -1 +0,0 @@ >-9069fecc3cd13bf901781af4ece8b374b5285984 >\ No newline at end of file >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F05.BIT30.sha1 b/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F05.BIT30.sha1 >deleted file mode 100644 >index 513c9362a89119db2c62ff21e4f8eefc193477a4..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F05.BIT30.sha1 >+++ /dev/null >@@ -1 +0,0 @@ >-44a04e133c15e013f6b4424877528a4445e28bef >\ No newline at end of file >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F05.INP.sha1 b/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F05.INP.sha1 >deleted file mode 100644 >index 0ee0a39f3ca412d0f06836812859a8f65759b7ad..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F05.INP.sha1 >+++ /dev/null >@@ -1 +0,0 @@ >-c3dde10c32f12da58181ecaccb7aeaa515239233 >\ No newline at end of file >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F05.OUT20.sha1 b/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F05.OUT20.sha1 >deleted file mode 100644 >index df514526521a0589a4a4cb68674feeed8cbc2688..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F05.OUT20.sha1 >+++ /dev/null >@@ -1 +0,0 @@ >-260f51f8320d92d3c8b834c1f38879baa12a7a5a >\ No newline at end of file >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F05.OUT30.sha1 b/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F05.OUT30.sha1 >deleted file mode 100644 >index bcc318c96f498b23581a7fb3349bc5f0d9d51d56..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F05.OUT30.sha1 >+++ /dev/null >@@ -1 +0,0 @@ >-72abb4c1d84a4cff1efed4d3e01280d839a9f691 >\ No 
newline at end of file >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F06.BIT20.sha1 b/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F06.BIT20.sha1 >deleted file mode 100644 >index 6b6e3dfb6845b51014347878513ad0ac3480eb42..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F06.BIT20.sha1 >+++ /dev/null >@@ -1 +0,0 @@ >-b6530247acdfd7ae7b3fd80c677d71045351ad07 >\ No newline at end of file >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F06.BIT30.sha1 b/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F06.BIT30.sha1 >deleted file mode 100644 >index f9f7975730e5c1d8495cf898718b42d826df0b52..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F06.BIT30.sha1 >+++ /dev/null >@@ -1 +0,0 @@ >-0318de4dc48422bfe125fd6f827e6e7114dc61bc >\ No newline at end of file >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F06.INP.sha1 b/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F06.INP.sha1 >deleted file mode 100644 >index 1210209652c76d7d4b7e5a29039e757f7d16e5c0..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F06.INP.sha1 >+++ /dev/null >@@ -1 +0,0 @@ >-80359cff72cbdc04a337a51c2fcc0c74890710b4 >\ No newline at end of file >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F06.OUT20.sha1 b/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F06.OUT20.sha1 >deleted file mode 100644 >index 7d50df8258d334d0f2e49447fdd0cce2e3b26255..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F06.OUT20.sha1 >+++ /dev/null >@@ -1 +0,0 @@ >-c7bc71abc973440123a517f61e3dbcb96089ee3f >\ No newline at end of file >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F06.OUT30.sha1 b/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F06.OUT30.sha1 >deleted file mode 100644 >index ad9219dd0ad07f7b224057626538cf8c8ba226d7..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/F06.OUT30.sha1 >+++ /dev/null >@@ -1 +0,0 @@ >-5fde2defd2350b5ca7d3b8f8ceef75d2b7fd7f03 >\ No newline at end of file >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/tlm10.chn.sha1 b/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/tlm10.chn.sha1 >deleted file mode 100644 >index 801f4487bc636578b94d9fe65ee74794da557f00..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/resources/audio_coding/tlm10.chn.sha1 >+++ /dev/null >@@ -1 +0,0 @@ >-21c78516c2470667a75c7ed85fe37c53a3514456 >\ No newline at end of file >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/BUILD.gn >index e9efbd1360f124b2971104d368090ff12134d43a..75c4397ab7d48afc1775adaeb362f482722404a8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/BUILD.gn >@@ -67,9 +67,102 @@ rtc_source_set("compile_assert_c") { > ] > } > >+# The subset of rtc_base approved for use outside of libjingle. >+# TODO(bugs.webrtc.org/9838): Create small and focused build targets and remove >+# the old concept of rtc_base and rtc_base_approved. 
> rtc_source_set("rtc_base_approved") { > visibility = [ "*" ] >- public_deps = [ >+ deps = [ >+ ":checks", >+ ":rtc_task_queue", >+ ":safe_compare", >+ ":safe_minmax", >+ ":type_traits", >+ "../api:array_view", >+ "../system_wrappers:field_trial", >+ "experiments:field_trial_parser", >+ "system:arch", >+ "system:unused", >+ "third_party/base64", >+ "//third_party/abseil-cpp/absl/memory:memory", >+ "//third_party/abseil-cpp/absl/types:optional", >+ ] >+ >+ sources = [ >+ "bind.h", >+ "bitbuffer.cc", >+ "bitbuffer.h", >+ "bitrateallocationstrategy.cc", >+ "bitrateallocationstrategy.h", >+ "buffer.h", >+ "bufferqueue.cc", >+ "bufferqueue.h", >+ "bytebuffer.cc", >+ "bytebuffer.h", >+ "byteorder.h", >+ "copyonwritebuffer.cc", >+ "copyonwritebuffer.h", >+ "event_tracer.cc", >+ "event_tracer.h", >+ "file.cc", >+ "file.h", >+ "flags.cc", >+ "flags.h", >+ "function_view.h", >+ "ignore_wundef.h", >+ "location.cc", >+ "location.h", >+ "message_buffer_reader.h", >+ "numerics/histogram_percentile_counter.cc", >+ "numerics/histogram_percentile_counter.h", >+ "numerics/mod_ops.h", >+ "numerics/moving_max_counter.h", >+ "numerics/sample_counter.cc", >+ "numerics/sample_counter.h", >+ "onetimeevent.h", >+ "platform_file.cc", >+ "platform_file.h", >+ "race_checker.cc", >+ "race_checker.h", >+ "random.cc", >+ "random.h", >+ "rate_statistics.cc", >+ "rate_statistics.h", >+ "ratetracker.cc", >+ "ratetracker.h", >+ "swap_queue.h", >+ "template_util.h", >+ "timestampaligner.cc", >+ "timestampaligner.h", >+ "trace_event.h", >+ "zero_memory.cc", >+ "zero_memory.h", >+ ] >+ >+ if (is_posix || is_fuchsia) { >+ sources += [ "file_posix.cc" ] >+ } >+ >+ if (is_win) { >+ sources += [ >+ "file_win.cc", >+ "win/windows_version.cc", >+ "win/windows_version.h", >+ ] >+ data_deps = [ >+ "//build/win:runtime_libs", >+ ] >+ } >+ >+ if (is_nacl) { >+ deps += [ "//native_client_sdk/src/libraries/nacl_io" ] >+ } >+ >+ if (is_android) { >+ libs = [ "log" ] >+ } >+ >+ public_deps = [ # no-presubmit-check 
TODO(webrtc:8603) > ":atomicops", > ":criticalsection", > ":logging", >@@ -78,16 +171,12 @@ rtc_source_set("rtc_base_approved") { > ":platform_thread_types", > ":ptr_util", > ":refcount", >- ":rtc_base_approved_generic", > ":rtc_event", > ":safe_conversions", > ":stringutils", > ":thread_checker", > ":timeutils", > ] >- if (is_mac && !build_with_chromium) { >- public_deps += [ ":rtc_base_approved_objc" ] >- } > } > > rtc_source_set("macromagic") { >@@ -117,6 +206,9 @@ rtc_source_set("ptr_util") { > sources = [ > "scoped_ref_ptr.h", > ] >+ deps = [ >+ "../api:scoped_refptr", >+ ] > } > > rtc_source_set("refcount") { >@@ -149,9 +241,9 @@ rtc_source_set("criticalsection") { > rtc_source_set("platform_thread") { > visibility = [ > ":rtc_base_approved", >- ":rtc_base_approved_generic", > ":rtc_task_queue_libevent", > ":rtc_task_queue_win", >+ ":rtc_task_queue_stdlib", > ":sequenced_task_checker", > ] > sources = [ >@@ -172,7 +264,6 @@ rtc_source_set("platform_thread") { > rtc_source_set("rtc_event") { > deps = [ > ":checks", >- ":macromagic", > ] > > if (build_with_chromium) { >@@ -192,7 +283,9 @@ rtc_source_set("rtc_event") { > > rtc_source_set("logging") { > visibility = [ "*" ] >+ libs = [] > deps = [ >+ ":checks", > ":criticalsection", > ":macromagic", > ":platform_thread_types", >@@ -219,9 +312,17 @@ rtc_source_set("logging") { > ] > deps += [ "system:inline" ] > >+ if (is_mac) { >+ deps += [ ":logging_mac" ] >+ } >+ > # logging.h needs the deprecation header while downstream projects are > # removing code that depends on logging implementation details. > deps += [ ":deprecation" ] >+ >+ if (is_android) { >+ libs += [ "log" ] >+ } > } > } > >@@ -248,6 +349,7 @@ rtc_source_set("atomicops") { > rtc_source_set("checks") { > # TODO(bugs.webrtc.org/9607): This should not be public. 
> visibility = [ "*" ] >+ libs = [] > sources = [ > "checks.cc", > "checks.h", >@@ -256,6 +358,9 @@ rtc_source_set("checks") { > ":safe_compare", > "system:inline", > ] >+ if (is_android) { >+ libs += [ "log" ] >+ } > } > > rtc_source_set("rate_limiter") { >@@ -266,6 +371,7 @@ rtc_source_set("rate_limiter") { > deps = [ > ":rtc_base_approved", > "../system_wrappers", >+ "//third_party/abseil-cpp/absl/types:optional", > ] > } > >@@ -273,6 +379,9 @@ rtc_source_set("sanitizer") { > sources = [ > "sanitizer.h", > ] >+ deps = [ >+ "//third_party/abseil-cpp/absl/meta:type_traits", >+ ] > } > > rtc_source_set("safe_compare") { >@@ -362,137 +471,14 @@ rtc_source_set("deprecation") { > ] > } > >-# The subset of rtc_base approved for use outside of libjingle. >-rtc_source_set("rtc_base_approved_generic") { >- visibility = [ >- ":rtc_base_approved", >- ":weak_ptr_unittests", >- ] >- >- cflags = [] >- defines = [] >- libs = [] >- data_deps = [] >- deps = [ >- ":atomicops", >- ":checks", >- ":criticalsection", >- ":logging", >- ":macromagic", >- ":platform_thread", >- ":platform_thread_types", >- ":ptr_util", >- ":refcount", >- ":rtc_event", >- ":rtc_task_queue", >- ":safe_compare", >- ":safe_conversions", >- ":stringutils", >- ":thread_checker", >- ":timeutils", >- ":type_traits", >- "system:arch", >- "system:unused", >- "third_party/base64", >- ] >- >- sources = [ >- "bind.h", >- "bitbuffer.cc", >- "bitbuffer.h", >- "bitrateallocationstrategy.cc", >- "bitrateallocationstrategy.h", >- "buffer.h", >- "bufferqueue.cc", >- "bufferqueue.h", >- "bytebuffer.cc", >- "bytebuffer.h", >- "byteorder.h", >- "copyonwritebuffer.cc", >- "copyonwritebuffer.h", >- "event_tracer.cc", >- "event_tracer.h", >- "file.cc", >- "file.h", >- "flags.cc", >- "flags.h", >- "function_view.h", >- "ignore_wundef.h", >- "location.cc", >- "location.h", >- "message_buffer_reader.h", >- "numerics/histogram_percentile_counter.cc", >- "numerics/histogram_percentile_counter.h", >- "numerics/mod_ops.h", >- 
"numerics/moving_max_counter.h", >- "numerics/sample_counter.cc", >- "numerics/sample_counter.h", >- "onetimeevent.h", >- "pathutils.cc", >- "pathutils.h", >- "platform_file.cc", >- "platform_file.h", >- "race_checker.cc", >- "race_checker.h", >- "random.cc", >- "random.h", >- "rate_statistics.cc", >- "rate_statistics.h", >- "ratetracker.cc", >- "ratetracker.h", >- "swap_queue.h", >- "template_util.h", >- "timestampaligner.cc", >- "timestampaligner.h", >- "trace_event.h", >- "zero_memory.cc", >- "zero_memory.h", >- ] >- >- deps += [ >- "..:webrtc_common", >- "../api:array_view", >- "//third_party/abseil-cpp/absl/memory:memory", >- "//third_party/abseil-cpp/absl/types:optional", >- ] >- >- if (is_android) { >- libs += [ "log" ] >- } >- >- if (is_posix || is_fuchsia) { >- sources += [ "file_posix.cc" ] >- } >- >- if (is_win) { >- sources += [ >- "file_win.cc", >- "win/windows_version.cc", >- "win/windows_version.h", >- ] >- data_deps += [ "//build/win:runtime_libs" ] >- } >- >- if (is_nacl) { >- deps += [ "//native_client_sdk/src/libraries/nacl_io" ] >- } >-} >- > if (is_mac && !build_with_chromium) { >- config("rtc_base_approved_objc_all_dependent_config") { >- visibility = [ ":rtc_base_approved_objc" ] >- libs = [ "Foundation.framework" ] # needed for logging_mac.mm >- } >- >- rtc_source_set("rtc_base_approved_objc") { >- visibility = [ ":rtc_base_approved" ] >- all_dependent_configs = [ ":rtc_base_approved_objc_all_dependent_config" ] >+ rtc_source_set("logging_mac") { >+ visibility = [ ":logging" ] >+ libs = [ "Foundation.framework" ] > sources = [ >+ "logging_mac.h", > "logging_mac.mm", > ] >- deps = [ >- ":logging", >- ] > } > } > >@@ -529,6 +515,7 @@ rtc_source_set("rtc_task_queue_api") { > deps = [ > ":macromagic", > ":ptr_util", >+ "system:rtc_export", > "//third_party/abseil-cpp/absl/memory", > ] > } >@@ -565,7 +552,9 @@ if (rtc_enable_libevent) { > ":checks", > ":criticalsection", > ":logging", >+ ":macromagic", > ":platform_thread", >+ 
":platform_thread_types", > ":ptr_util", > ":refcount", > ":rtc_task_queue_api", >@@ -619,6 +608,26 @@ if (is_win) { > } > } > >+rtc_source_set("rtc_task_queue_stdlib") { >+ visibility = [ ":rtc_task_queue_impl" ] >+ sources = [ >+ "task_queue_stdlib.cc", >+ ] >+ deps = [ >+ ":checks", >+ ":criticalsection", >+ ":logging", >+ ":macromagic", >+ ":platform_thread", >+ ":ptr_util", >+ ":refcount", >+ ":rtc_event", >+ ":rtc_task_queue_api", >+ ":safe_conversions", >+ ":timeutils", >+ ] >+} >+ > rtc_source_set("rtc_task_queue_impl") { > visibility = [ "*" ] > if (rtc_enable_libevent) { >@@ -632,9 +641,15 @@ rtc_source_set("rtc_task_queue_impl") { > ] > } > if (is_win) { >- deps = [ >- ":rtc_task_queue_win", >- ] >+ if (current_os == "winuwp") { >+ deps = [ >+ ":rtc_task_queue_stdlib", >+ ] >+ } else { >+ deps = [ >+ ":rtc_task_queue_win", >+ ] >+ } > } > } > } >@@ -670,6 +685,8 @@ rtc_static_library("rtc_numerics") { > sources = [ > "numerics/exp_filter.cc", > "numerics/exp_filter.h", >+ "numerics/moving_average.cc", >+ "numerics/moving_average.h", > "numerics/moving_median_filter.h", > "numerics/percentile_filter.h", > "numerics/sequence_number_util.h", >@@ -707,48 +724,23 @@ rtc_source_set("rtc_json") { > > rtc_static_library("rtc_base") { > visibility = [ "*" ] >- public_deps = [] >- if (!build_with_mozilla) { >- public_deps += [ ":rtc_base_generic" ] >- } >- if (is_win) { >- sources = [ >- "noop.cc", >- ] >- } >- if (is_ios || is_mac) { >- sources = [ >- "noop.mm", >- ] >- public_deps += [ ":rtc_base_objc" ] >- } >-} >- >-if (is_ios || is_mac) { >- rtc_source_set("rtc_base_objc") { >- sources = [ >- "thread_darwin.mm", >- ] >- deps = [ >- ":rtc_base_generic", >- ] >- visibility = [ ":rtc_base" ] >- } >-} >- >-rtc_static_library("rtc_base_generic") { > cflags = [] > cflags_cc = [] > libs = [] > defines = [] > deps = [ > ":checks", >+ >+ # For deprecation of rtc::PacketTime, in asyncpacketsocket.h. 
>+ ":deprecation", > ":stringutils", > "..:webrtc_common", > "../api:array_view", >+ "network:sent_packet", > "third_party/base64", > "third_party/sigslot", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/strings", > "//third_party/abseil-cpp/absl/types:optional", > ] > public_deps = [ >@@ -781,8 +773,6 @@ rtc_static_library("rtc_base_generic") { > "dscp.h", > "filerotatingstream.cc", > "filerotatingstream.h", >- "fileutils.cc", >- "fileutils.h", > "gunit_prod.h", > "helpers.cc", > "helpers.h", >@@ -791,6 +781,8 @@ rtc_static_library("rtc_base_generic") { > "ipaddress.cc", > "ipaddress.h", > "keep_ref_until_done.h", >+ "key_derivation.cc", >+ "key_derivation.h", > "mdns_responder_interface.h", > "messagedigest.cc", > "messagedigest.h", >@@ -811,6 +803,8 @@ rtc_static_library("rtc_base_generic") { > "nullsocketserver.cc", > "nullsocketserver.h", > "openssl.h", >+ "openssl_key_derivation_hkdf.cc", >+ "openssl_key_derivation_hkdf.h", > "openssladapter.cc", > "openssladapter.h", > "opensslcertificate.cc", >@@ -846,8 +840,6 @@ rtc_static_library("rtc_base_generic") { > "socketaddresspair.h", > "socketfactory.h", > "socketserver.h", >- "socketstream.cc", >- "socketstream.h", > "ssladapter.cc", > "ssladapter.h", > "sslcertificate.cc", >@@ -864,11 +856,6 @@ rtc_static_library("rtc_base_generic") { > "thread.h", > ] > >- visibility = [ >- ":rtc_base", >- ":rtc_base_objc", >- ] >- > if (build_with_chromium) { > include_dirs = [ "../../boringssl/src/include" ] > public_configs += [ ":rtc_base_chromium_config" ] >@@ -878,8 +865,6 @@ rtc_static_library("rtc_base_generic") { > "logsinks.cc", > "logsinks.h", > "numerics/mathutils.h", >- "optionsfile.cc", >- "optionsfile.h", > "rollingaccumulator.h", > "sslroots.h", > ] >@@ -913,6 +898,7 @@ rtc_static_library("rtc_base_generic") { > > if (is_ios || is_mac) { > sources += [ "macifaddrs_converter.cc" ] >+ deps += [ "system:cocoa_threading" ] > } > > if (rtc_use_x11) { >@@ -938,18 +924,12 @@ 
rtc_static_library("rtc_base_generic") { > "macutils.cc", > "macutils.h", > ] >- libs += [ >- # For ProcessInformationCopyDictionary in unixfilesystem.cc. >- "ApplicationServices.framework", >- ] > } > > if (is_win) { > sources += [ > "win32.cc", > "win32.h", >- "win32filesystem.cc", >- "win32filesystem.h", > "win32window.cc", > "win32window.h", > ] >@@ -967,8 +947,6 @@ rtc_static_library("rtc_base_generic") { > sources += [ > "ifaddrs_converter.cc", > "ifaddrs_converter.h", >- "unixfilesystem.cc", >- "unixfilesystem.h", > ] > } > >@@ -986,11 +964,41 @@ rtc_source_set("gtest_prod") { > ] > } > >+rtc_source_set("gunit_helpers") { >+ testonly = true >+ sources = [ >+ "gunit.cc", >+ "gunit.h", >+ ] >+ deps = [ >+ ":logging", >+ ":rtc_base", >+ ":rtc_base_tests_utils", >+ ":stringutils", >+ "../test:test_support", >+ ] >+} >+ >+rtc_source_set("testclient") { >+ testonly = true >+ sources = [ >+ "testclient.cc", >+ "testclient.h", >+ ] >+ deps = [ >+ ":criticalsection", >+ ":gunit_helpers", >+ ":macromagic", >+ ":rtc_base", >+ ":rtc_base_tests_utils", >+ ":timeutils", >+ "//third_party/abseil-cpp/absl/memory:memory", >+ ] >+} >+ > rtc_source_set("rtc_base_tests_utils") { > testonly = true > sources = [ >- # Also use this as a convenient dumping ground for misc files that are >- # included by multiple targets below. 
> "cpu_time.cc", > "cpu_time.h", > "fake_mdns_responder.h", >@@ -1001,8 +1009,8 @@ rtc_source_set("rtc_base_tests_utils") { > "fakesslidentity.h", > "firewallsocketserver.cc", > "firewallsocketserver.h", >- "gunit.cc", >- "gunit.h", >+ "memory_stream.cc", >+ "memory_stream.h", > "memory_usage.cc", > "memory_usage.h", > "natserver.cc", >@@ -1015,10 +1023,10 @@ rtc_source_set("rtc_base_tests_utils") { > "proxyserver.h", > "sigslottester.h", > "sigslottester.h.pump", >+ "socketstream.cc", >+ "socketstream.h", > "testbase64.h", > "testcertificateverifier.h", >- "testclient.cc", >- "testclient.h", > "testechoserver.cc", > "testechoserver.h", > "testutils.cc", >@@ -1029,16 +1037,10 @@ rtc_source_set("rtc_base_tests_utils") { > deps = [ > ":checks", > ":rtc_base", >- ":stringutils", > "../api/units:time_delta", >- "../test:test_support", >- "system:fallthrough", > "third_party/sigslot", > "//third_party/abseil-cpp/absl/memory", > ] >- public_deps = [ >- "//testing/gtest", >- ] > } > > rtc_source_set("rtc_task_queue_for_test") { >@@ -1063,6 +1065,7 @@ if (rtc_include_tests) { > "sigslot_unittest.cc", > ] > deps = [ >+ ":gunit_helpers", > ":rtc_base", > ":rtc_base_tests_utils", > "third_party/sigslot", >@@ -1075,13 +1078,13 @@ if (rtc_include_tests) { > "unittest_main.cc", > ] > deps = [ >+ ":gunit_helpers", > ":rtc_base", > ":rtc_base_approved", > ":rtc_base_tests_utils", > "../system_wrappers:field_trial", > "../system_wrappers:metrics", > "../test:field_trial", >- "../test:fileutils", > "../test:test_support", > ] > >@@ -1104,9 +1107,11 @@ if (rtc_include_tests) { > ] > deps = [ > ":checks", >+ ":gunit_helpers", > ":rtc_base", > ":rtc_base_tests_main", > ":rtc_base_tests_utils", >+ ":testclient", > "../system_wrappers:system_wrappers", > "../test:fileutils", > "../test:test_support", >@@ -1147,7 +1152,6 @@ if (rtc_include_tests) { > "numerics/safe_minmax_unittest.cc", > "numerics/sample_counter_unittest.cc", > "onetimeevent_unittest.cc", >- "pathutils_unittest.cc", > 
"platform_file_unittest.cc", > "platform_thread_unittest.cc", > "random_unittest.cc", >@@ -1174,6 +1178,7 @@ if (rtc_include_tests) { > } > deps = [ > ":checks", >+ ":gunit_helpers", > ":rate_limiter", > ":rtc_base", > ":rtc_base_approved", >@@ -1184,6 +1189,7 @@ if (rtc_include_tests) { > ":safe_minmax", > ":sanitizer", > ":stringutils", >+ ":testclient", > "../api:array_view", > "../system_wrappers:system_wrappers", > "../test:fileutils", >@@ -1203,6 +1209,7 @@ if (rtc_include_tests) { > "task_queue_unittest.cc", > ] > deps = [ >+ ":gunit_helpers", > ":rtc_base_approved", > ":rtc_base_tests_main", > ":rtc_base_tests_utils", >@@ -1236,7 +1243,8 @@ if (rtc_include_tests) { > "weak_ptr_unittest.cc", > ] > deps = [ >- ":rtc_base_approved_generic", >+ ":gunit_helpers", >+ ":rtc_base_approved", > ":rtc_base_tests_main", > ":rtc_base_tests_utils", > ":rtc_event", >@@ -1251,6 +1259,7 @@ if (rtc_include_tests) { > > sources = [ > "numerics/exp_filter_unittest.cc", >+ "numerics/moving_average_unittest.cc", > "numerics/moving_median_filter_unittest.cc", > "numerics/percentile_filter_unittest.cc", > "numerics/sequence_number_util_unittest.cc", >@@ -1270,6 +1279,7 @@ if (rtc_include_tests) { > "strings/json_unittest.cc", > ] > deps = [ >+ ":gunit_helpers", > ":rtc_base_tests_main", > ":rtc_base_tests_utils", > ":rtc_json", >@@ -1292,7 +1302,6 @@ if (rtc_include_tests) { > "messagequeue_unittest.cc", > "nat_unittest.cc", > "network_unittest.cc", >- "optionsfile_unittest.cc", > "proxy_unittest.cc", > "rollingaccumulator_unittest.cc", > "rtccertificate_unittest.cc", >@@ -1311,6 +1320,7 @@ if (rtc_include_tests) { > } > if (is_posix || is_fuchsia) { > sources += [ >+ "openssl_key_derivation_hkdf_unittest.cc", > "openssladapter_unittest.cc", > "opensslsessioncache_unittest.cc", > "opensslutility_unittest.cc", >@@ -1321,9 +1331,11 @@ if (rtc_include_tests) { > } > deps = [ > ":checks", >+ ":gunit_helpers", > ":rtc_base_tests_main", > ":rtc_base_tests_utils", > ":stringutils", >+ 
":testclient", > "../api:array_view", > "../test:fileutils", > "../test:test_support", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/Dummy.java b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/Dummy.java >deleted file mode 100644 >index d8f02c90dbf660dd058ee5d136e91a24eefa2e21..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/Dummy.java >+++ /dev/null >@@ -1,19 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-/** >- * This class only exists as glue in a transition. >- * TODO(kjellander): Remove. >- * See https://bugs.webrtc.org/7634 for more details. 
>- */ >-class Dummy { >- Dummy() { >- } >-} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/OWNERS b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/OWNERS >index 69ce253087f59b17b76f2a5ce807290bcbf2bb14..e67e8340f7c7890f70e51b2368955a17e9615e8d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/OWNERS >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/OWNERS >@@ -2,7 +2,6 @@ hta@webrtc.org > juberti@webrtc.org > kwiberg@webrtc.org > mflodman@webrtc.org >-perkj@webrtc.org > pthatcher@webrtc.org > qingsi@webrtc.org > sergeyu@chromium.org >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asyncinvoker.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asyncinvoker.cc >index e255fb98fccb7276b49f326e81777e7098045a00..f0dd1881ef3472b85c260d476d661fde83230712 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asyncinvoker.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asyncinvoker.cc >@@ -17,7 +17,7 @@ namespace rtc { > > AsyncInvoker::AsyncInvoker() > : pending_invocations_(0), >- invocation_complete_(new RefCountedObject<Event>(false, false)), >+ invocation_complete_(new RefCountedObject<Event>()), > destroying_(false) {} > > AsyncInvoker::~AsyncInvoker() { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asyncpacketsocket.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asyncpacketsocket.cc >index e1b1eae6a31d735b703af24b58cc8184b8c1dcb9..7e0cc8f81011e596c43805b20c9ed629aa4b1a88 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asyncpacketsocket.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asyncpacketsocket.cc >@@ -9,6 +9,7 @@ > */ > > #include "rtc_base/asyncpacketsocket.h" >+#include "rtc_base/nethelper.h" > > namespace rtc { > >@@ -33,9 +34,11 @@ void CopySocketInformationToPacketInfo(size_t packet_size_bytes, > bool is_connectionless, > rtc::PacketInfo* info) { > info->packet_size_bytes = packet_size_bytes; >- 
info->local_socket_address = socket_from.GetLocalAddress(); >- if (!is_connectionless) { >- info->remote_socket_address = socket_from.GetRemoteAddress(); >+ // TODO(srte): Make sure that the family of the local socket is always set >+ // in the VirtualSocket implementation and remove this check. >+ int family = socket_from.GetLocalAddress().family(); >+ if (family != 0) { >+ info->ip_overhead_bytes = cricket::GetIpOverhead(family); > } > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asyncpacketsocket.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asyncpacketsocket.h >index bb0b3bcc048e75fe486efdf6557f3ca7171cbb02..44d6c674267a6165f2b195f81f2d882f3364138f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asyncpacketsocket.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asyncpacketsocket.h >@@ -12,6 +12,7 @@ > #define RTC_BASE_ASYNCPACKETSOCKET_H_ > > #include "rtc_base/constructormagic.h" >+#include "rtc_base/deprecation.h" > #include "rtc_base/dscp.h" > #include "rtc_base/socket.h" > #include "rtc_base/third_party/sigslot/sigslot.h" >@@ -50,25 +51,9 @@ struct PacketOptions { > PacketInfo info_signaled_after_sent; > }; > >-// This structure will have the information about when packet is actually >-// received by socket. >-struct PacketTime { >- PacketTime() : timestamp(-1), not_before(-1) {} >- PacketTime(int64_t timestamp, int64_t not_before) >- : timestamp(timestamp), not_before(not_before) {} >- >- int64_t timestamp; // Receive time after socket delivers the data. >- >- // Earliest possible time the data could have arrived, indicating the >- // potential error in the |timestamp| value, in case the system, is busy. For >- // example, the time of the last select() call. >- // If unknown, this value will be set to zero. 
>- int64_t not_before; >-}; >- >-inline PacketTime CreatePacketTime(int64_t not_before) { >- return PacketTime(TimeMicros(), not_before); >-} >+// TODO(bugs.webrtc.org/9584): Compatibility alias, delete as soon as downstream >+// code is updated. >+typedef int64_t PacketTime; > > // Provides the ability to receive packets asynchronously. Sends are not > // buffered since it is acceptable to drop packets under high load. >@@ -120,7 +105,9 @@ class AsyncPacketSocket : public sigslot::has_slots<> { > const char*, > size_t, > const SocketAddress&, >- const PacketTime&> >+ // TODO(bugs.webrtc.org/9584): Change to passing the int64_t >+ // timestamp by value. >+ const int64_t&> > SignalReadPacket; > > // Emitted each time a packet is sent. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asyncresolverinterface.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asyncresolverinterface.h >index 5b2303f4d80286081b79a9b129f8d7f205e0051e..f3df8842493fde3c76c4ad5e99dc9e0691e4f7b8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asyncresolverinterface.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asyncresolverinterface.h >@@ -22,9 +22,12 @@ class AsyncResolverInterface { > AsyncResolverInterface(); > virtual ~AsyncResolverInterface(); > >- // Start address resolve process. >+ // Start address resolution of the hostname in |addr|. > virtual void Start(const SocketAddress& addr) = 0; >- // Returns top most resolved address of |family| >+ // Returns true iff the address from |Start| was successfully resolved. >+ // If the address was successfully resolved, sets |addr| to a copy of the >+ // address from |Start| with the IP address set to the top most resolved >+ // address of |family| (|addr| will have both hostname and the resolved ip). > virtual bool GetResolvedAddress(int family, SocketAddress* addr) const = 0; > // Returns error from resolver. 
> virtual int GetError() const = 0; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asynctcpsocket.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asynctcpsocket.cc >index 3d68a2a46a6c637a8ed2fa7a2fbd9cb3e5ccf7dd..666b3357aaaf553fa30741ecb6507742866da2fb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asynctcpsocket.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asynctcpsocket.cc >@@ -18,6 +18,7 @@ > #include "rtc_base/byteorder.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" >+#include "rtc_base/timeutils.h" // for TimeMillis > > #if defined(WEBRTC_POSIX) > #include <errno.h> >@@ -320,7 +321,7 @@ void AsyncTCPSocket::ProcessInput(char* data, size_t* len) { > return; > > SignalReadPacket(this, data + kPacketLenSize, pkt_len, remote_addr, >- CreatePacketTime(0)); >+ TimeMicros()); > > *len -= kPacketLenSize + pkt_len; > if (*len > 0) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asynctcpsocket.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asynctcpsocket.h >index 943e010f4dac019c1c812d8f406487cd2c1f5bda..9567dd91432b74c003e8d7b2d28278c3b2972d6b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asynctcpsocket.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asynctcpsocket.h >@@ -11,12 +11,15 @@ > #ifndef RTC_BASE_ASYNCTCPSOCKET_H_ > #define RTC_BASE_ASYNCTCPSOCKET_H_ > >+#include <stddef.h> > #include <memory> > > #include "rtc_base/asyncpacketsocket.h" >+#include "rtc_base/asyncsocket.h" > #include "rtc_base/buffer.h" > #include "rtc_base/constructormagic.h" >-#include "rtc_base/socketfactory.h" >+#include "rtc_base/socket.h" >+#include "rtc_base/socketaddress.h" > > namespace rtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asyncudpsocket.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asyncudpsocket.cc >index ba5fa88aadb3e2f4d6f4124a934f144d502368eb..2f9011cab143aabd40acff0de9c551333ab3825d 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asyncudpsocket.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asyncudpsocket.cc >@@ -74,7 +74,6 @@ int AsyncUDPSocket::SendTo(const void* pv, > rtc::SentPacket sent_packet(options.packet_id, rtc::TimeMillis(), > options.info_signaled_after_sent); > CopySocketInformationToPacketInfo(cb, *this, true, &sent_packet.info); >- sent_packet.info.remote_socket_address = addr; > int ret = socket_->SendTo(pv, cb, addr); > SignalSentPacket(this, sent_packet); > return ret; >@@ -123,9 +122,8 @@ void AsyncUDPSocket::OnReadEvent(AsyncSocket* socket) { > > // TODO: Make sure that we got all of the packet. > // If we did not, then we should resize our buffer to be large enough. >- SignalReadPacket( >- this, buf_, static_cast<size_t>(len), remote_addr, >- (timestamp > -1 ? PacketTime(timestamp, 0) : CreatePacketTime(0))); >+ SignalReadPacket(this, buf_, static_cast<size_t>(len), remote_addr, >+ (timestamp > -1 ? timestamp : TimeMicros())); > } > > void AsyncUDPSocket::OnWriteEvent(AsyncSocket* socket) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asyncudpsocket.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asyncudpsocket.h >index d814b4bfd0c5dbe5d7e4d4ac486c7b8fb56098f4..030946d8a03a5c6cf84b3f87b776c179bcc0927c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asyncudpsocket.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/asyncudpsocket.h >@@ -11,9 +11,13 @@ > #ifndef RTC_BASE_ASYNCUDPSOCKET_H_ > #define RTC_BASE_ASYNCUDPSOCKET_H_ > >+#include <stddef.h> > #include <memory> > > #include "rtc_base/asyncpacketsocket.h" >+#include "rtc_base/asyncsocket.h" >+#include "rtc_base/socket.h" >+#include "rtc_base/socketaddress.h" > #include "rtc_base/socketfactory.h" > > namespace rtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/base64_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/base64_unittest.cc >index 
4b857f1aa94008419ed846fa5d33b88405f8440d..bdf8559c127f47b41af2582cdd5b537c98c94203 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/base64_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/base64_unittest.cc >@@ -11,7 +11,6 @@ > #include "rtc_base/third_party/base64/base64.h" > #include "rtc_base/gunit.h" > #include "rtc_base/logging.h" >-#include "rtc_base/stringutils.h" > > #include "rtc_base/testbase64.h" > >@@ -430,10 +429,10 @@ TEST(Base64, EncodeDecodeBattery) { > > // try putting some extra stuff after the equals signs, or in between them > if (equals == 2) { >- sprintfn(first_equals, 6, " = = "); >+ snprintf(first_equals, 6, " = = "); > len = first_equals - encode_buffer + 5; > } else { >- sprintfn(first_equals, 6, " = "); >+ snprintf(first_equals, 6, " = "); > len = first_equals - encode_buffer + 3; > } > decoded2.assign("this junk should be ignored"); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bitrateallocationstrategy.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bitrateallocationstrategy.cc >index d2a06cd9efec81cc7a4eeab1036315d66dde7cd1..46e66741e0a0411555273dced76141a14dfd8a35 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bitrateallocationstrategy.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bitrateallocationstrategy.cc >@@ -9,9 +9,31 @@ > */ > > #include "rtc_base/bitrateallocationstrategy.h" >+ > #include <algorithm> >+#include <cstddef> >+#include <cstdint> >+#include <map> > #include <utility> > >+#include "rtc_base/numerics/safe_minmax.h" >+#include "system_wrappers/include/field_trial.h" >+ >+namespace webrtc { >+AudioPriorityConfig::AudioPriorityConfig() >+ : min_rate("min"), max_rate("max"), target_rate("target") { >+ std::string trial_string; >+// TODO(bugs.webrtc.org/9889): Remove this when Chromium build has been fixed. 
>+#if !defined(WEBRTC_CHROMIUM_BUILD) >+ trial_string = field_trial::FindFullName("WebRTC-Bwe-AudioPriority"); >+#endif >+ ParseFieldTrial({&min_rate, &max_rate, &target_rate}, trial_string); >+} >+AudioPriorityConfig::AudioPriorityConfig(const AudioPriorityConfig&) = default; >+AudioPriorityConfig::~AudioPriorityConfig() = default; >+ >+} // namespace webrtc >+ > namespace rtc { > > // The purpose of this is to allow video streams to use extra bandwidth for FEC. >@@ -21,31 +43,31 @@ namespace rtc { > const int kTransmissionMaxBitrateMultiplier = 2; > > std::vector<uint32_t> BitrateAllocationStrategy::SetAllBitratesToMinimum( >- const ArrayView<const TrackConfig*> track_configs) { >+ const std::vector<BitrateAllocationStrategy::TrackConfig>& track_configs) { > std::vector<uint32_t> track_allocations; >- for (const auto* track_config : track_configs) { >- track_allocations.push_back(track_config->min_bitrate_bps); >+ for (const auto& track_config : track_configs) { >+ track_allocations.push_back(track_config.min_bitrate_bps); > } > return track_allocations; > } > > std::vector<uint32_t> BitrateAllocationStrategy::DistributeBitratesEvenly( >- const ArrayView<const TrackConfig*> track_configs, >+ const std::vector<BitrateAllocationStrategy::TrackConfig>& track_configs, > uint32_t available_bitrate) { > std::vector<uint32_t> track_allocations = > SetAllBitratesToMinimum(track_configs); > uint32_t sum_min_bitrates = 0; > uint32_t sum_max_bitrates = 0; >- for (const auto* track_config : track_configs) { >- sum_min_bitrates += track_config->min_bitrate_bps; >- sum_max_bitrates += track_config->max_bitrate_bps; >+ for (const auto& track_config : track_configs) { >+ sum_min_bitrates += track_config.min_bitrate_bps; >+ sum_max_bitrates += track_config.max_bitrate_bps; > } > if (sum_min_bitrates >= available_bitrate) { > return track_allocations; > } else if (available_bitrate >= sum_max_bitrates) { > auto track_allocations_it = track_allocations.begin(); >- for (const auto* 
track_config : track_configs) { >- *track_allocations_it++ = track_config->max_bitrate_bps; >+ for (const auto& track_config : track_configs) { >+ *track_allocations_it++ = track_config.max_bitrate_bps; > } > return track_allocations; > } else { >@@ -54,11 +76,10 @@ std::vector<uint32_t> BitrateAllocationStrategy::DistributeBitratesEvenly( > // lowest max_bitrate_bps. Remainder of available bitrate split evenly among > // remaining tracks. > std::multimap<uint32_t, size_t> max_bitrate_sorted_configs; >- for (const TrackConfig** track_configs_it = track_configs.begin(); >- track_configs_it != track_configs.end(); ++track_configs_it) { >+ for (const auto& track_config : track_configs) { > max_bitrate_sorted_configs.insert( >- std::make_pair((*track_configs_it)->max_bitrate_bps, >- track_configs_it - track_configs.begin())); >+ std::make_pair(track_config.max_bitrate_bps, >+ &track_config - &track_configs.front())); > } > uint32_t total_available_increase = available_bitrate - sum_min_bitrates; > int processed_configs = 0; >@@ -67,8 +88,8 @@ std::vector<uint32_t> BitrateAllocationStrategy::DistributeBitratesEvenly( > total_available_increase / > (static_cast<uint32_t>(track_configs.size() - processed_configs)); > uint32_t consumed_increase = >- std::min(track_configs[track_config_pair.second]->max_bitrate_bps - >- track_configs[track_config_pair.second]->min_bitrate_bps, >+ std::min(track_configs[track_config_pair.second].max_bitrate_bps - >+ track_configs[track_config_pair.second].min_bitrate_bps, > available_increase); > track_allocations[track_config_pair.second] += consumed_increase; > total_available_increase -= consumed_increase; >@@ -77,53 +98,50 @@ std::vector<uint32_t> BitrateAllocationStrategy::DistributeBitratesEvenly( > return track_allocations; > } > } >- > AudioPriorityBitrateAllocationStrategy::AudioPriorityBitrateAllocationStrategy( > std::string audio_track_id, > uint32_t sufficient_audio_bitrate) > : audio_track_id_(audio_track_id), >- 
sufficient_audio_bitrate_(sufficient_audio_bitrate) {} >+ sufficient_audio_bitrate_(sufficient_audio_bitrate) { >+ if (config_.target_rate) { >+ sufficient_audio_bitrate_ = config_.target_rate->bps(); >+ } >+} > > std::vector<uint32_t> AudioPriorityBitrateAllocationStrategy::AllocateBitrates( > uint32_t available_bitrate, >- const ArrayView<const TrackConfig*> track_configs) { >- const TrackConfig* audio_track_config = NULL; >+ std::vector<BitrateAllocationStrategy::TrackConfig> track_configs) { >+ TrackConfig* audio_track_config = nullptr; > size_t audio_config_index = 0; > uint32_t sum_min_bitrates = 0; > uint32_t sum_max_bitrates = 0; > >- for (const auto*& track_config : track_configs) { >- sum_min_bitrates += track_config->min_bitrate_bps; >- sum_max_bitrates += track_config->max_bitrate_bps; >- if (track_config->track_id == audio_track_id_) { >- audio_track_config = track_config; >+ for (auto& track_config : track_configs) { >+ if (track_config.track_id == audio_track_id_) { > audio_config_index = &track_config - &track_configs[0]; >+ audio_track_config = &track_config; >+ if (config_.min_rate) >+ audio_track_config->min_bitrate_bps = config_.min_rate->bps(); >+ if (config_.max_rate) >+ audio_track_config->max_bitrate_bps = config_.max_rate->bps(); > } >+ sum_min_bitrates += track_config.min_bitrate_bps; >+ sum_max_bitrates += track_config.max_bitrate_bps; > } > if (sum_max_bitrates < available_bitrate) { > // Allow non audio streams to go above max upto > // kTransmissionMaxBitrateMultiplier * max_bitrate_bps >- size_t track_configs_size = track_configs.size(); >- std::vector<TrackConfig> increased_track_configs(track_configs_size); >- std::vector<const TrackConfig*> increased_track_configs_ptr( >- track_configs_size); >- for (unsigned long i = 0; i < track_configs_size; i++) { >- increased_track_configs[i] = (*track_configs[i]); >- increased_track_configs_ptr[i] = &increased_track_configs[i]; >- if (track_configs[i]->track_id != audio_track_id_) { >- 
increased_track_configs[i].max_bitrate_bps = >- track_configs[i]->max_bitrate_bps * >- kTransmissionMaxBitrateMultiplier; >- } >+ for (auto& track_config : track_configs) { >+ if (&track_config != audio_track_config) >+ track_config.max_bitrate_bps *= kTransmissionMaxBitrateMultiplier; > } >- return DistributeBitratesEvenly(increased_track_configs_ptr, >- available_bitrate); >+ return DistributeBitratesEvenly(track_configs, available_bitrate); > } >- if (audio_track_config == nullptr) { >+ if (!audio_track_config) { > return DistributeBitratesEvenly(track_configs, available_bitrate); > } >- auto safe_sufficient_audio_bitrate = std::min( >- std::max(audio_track_config->min_bitrate_bps, sufficient_audio_bitrate_), >+ auto safe_sufficient_audio_bitrate = rtc::SafeClamp( >+ sufficient_audio_bitrate_, audio_track_config->min_bitrate_bps, > audio_track_config->max_bitrate_bps); > if (available_bitrate <= sum_min_bitrates) { > return SetAllBitratesToMinimum(track_configs); >@@ -139,9 +157,7 @@ std::vector<uint32_t> AudioPriorityBitrateAllocationStrategy::AllocateBitrates( > // Setting audio track minimum to safe_sufficient_audio_bitrate will > // allow using DistributeBitratesEvenly to allocate at least sufficient > // bitrate for audio and the rest evenly. 
>- TrackConfig sufficient_track_config(*track_configs[audio_config_index]); >- sufficient_track_config.min_bitrate_bps = safe_sufficient_audio_bitrate; >- track_configs[audio_config_index] = &sufficient_track_config; >+ audio_track_config->min_bitrate_bps = safe_sufficient_audio_bitrate; > std::vector<uint32_t> track_allocations = > DistributeBitratesEvenly(track_configs, available_bitrate); > return track_allocations; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bitrateallocationstrategy.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bitrateallocationstrategy.h >index f711d1f18ffd814eeb16b8d349994f9aca1342c2..13a4eee524e2e98b1c767c95ddfa72174be0db06 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bitrateallocationstrategy.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bitrateallocationstrategy.h >@@ -11,12 +11,13 @@ > #ifndef RTC_BASE_BITRATEALLOCATIONSTRATEGY_H_ > #define RTC_BASE_BITRATEALLOCATIONSTRATEGY_H_ > >-#include <map> >-#include <memory> >+#include <stdint.h> > #include <string> > #include <vector> >+ > #include "api/array_view.h" >-#include "rtc_base/checks.h" >+#include "rtc_base/experiments/field_trial_parser.h" >+#include "rtc_base/experiments/field_trial_units.h" > > namespace rtc { > >@@ -56,10 +57,12 @@ class BitrateAllocationStrategy { > std::string track_id; > }; > >+ // These are only used by AudioPriorityBitrateAllocationStrategy. They are >+ // exposed here so they can be unit tested. 
> static std::vector<uint32_t> SetAllBitratesToMinimum( >- const ArrayView<const TrackConfig*> track_configs); >+ const std::vector<BitrateAllocationStrategy::TrackConfig>& track_configs); > static std::vector<uint32_t> DistributeBitratesEvenly( >- const ArrayView<const TrackConfig*> track_configs, >+ const std::vector<BitrateAllocationStrategy::TrackConfig>& track_configs, > uint32_t available_bitrate); > > // Strategy is expected to allocate all available_bitrate up to the sum of >@@ -74,11 +77,25 @@ class BitrateAllocationStrategy { > // available_bitrate decrease. > virtual std::vector<uint32_t> AllocateBitrates( > uint32_t available_bitrate, >- const ArrayView<const TrackConfig*> track_configs) = 0; >+ std::vector<BitrateAllocationStrategy::TrackConfig> track_configs) = 0; > > virtual ~BitrateAllocationStrategy() = default; > }; >+} // namespace rtc > >+namespace webrtc { >+struct AudioPriorityConfig { >+ FieldTrialOptional<DataRate> min_rate; >+ FieldTrialOptional<DataRate> max_rate; >+ FieldTrialOptional<DataRate> target_rate; >+ AudioPriorityConfig(); >+ AudioPriorityConfig(const AudioPriorityConfig&); >+ AudioPriorityConfig& operator=(const AudioPriorityConfig&) = default; >+ ~AudioPriorityConfig(); >+}; >+} // namespace webrtc >+ >+namespace rtc { > // Simple allocation strategy giving priority to audio until > // sufficient_audio_bitrate is reached. Bitrate is distributed evenly between > // the tracks after sufficient_audio_bitrate is reached. 
This implementation >@@ -90,9 +107,11 @@ class AudioPriorityBitrateAllocationStrategy > uint32_t sufficient_audio_bitrate); > std::vector<uint32_t> AllocateBitrates( > uint32_t available_bitrate, >- const ArrayView<const TrackConfig*> track_configs) override; >+ std::vector<BitrateAllocationStrategy::TrackConfig> track_configs) >+ override; > > private: >+ webrtc::AudioPriorityConfig config_; > std::string audio_track_id_; > uint32_t sufficient_audio_bitrate_; > }; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bitrateallocationstrategy_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bitrateallocationstrategy_unittest.cc >index bfc41f56e80102afd875096895d0a06d3cd2bc39..f4c7ee7043b4646a69f31b846c12d673ed784f67 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bitrateallocationstrategy_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bitrateallocationstrategy_unittest.cc >@@ -43,11 +43,8 @@ TEST(BitrateAllocationStrategyTest, SetAllBitratesToMinimum) { > BitrateAllocationStrategy::TrackConfig(min_other_bitrate, > max_other_bitrate, false, "")}; > >- std::vector<const rtc::BitrateAllocationStrategy::TrackConfig*> >- track_config_ptrs = MakeTrackConfigPtrsVector(track_configs); >- > std::vector<uint32_t> allocations = >- BitrateAllocationStrategy::SetAllBitratesToMinimum(track_config_ptrs); >+ BitrateAllocationStrategy::SetAllBitratesToMinimum(track_configs); > EXPECT_EQ(min_audio_bitrate, allocations[0]); > EXPECT_EQ(min_video_bitrate, allocations[1]); > EXPECT_EQ(min_other_bitrate, allocations[2]); >@@ -76,11 +73,8 @@ TEST(BitrateAllocationStrategyTest, DistributeBitratesEvenly) { > BitrateAllocationStrategy::TrackConfig(min_other_bitrate, > max_other_bitrate, false, "")}; > >- std::vector<const rtc::BitrateAllocationStrategy::TrackConfig*> >- track_config_ptrs = MakeTrackConfigPtrsVector(track_configs); >- > std::vector<uint32_t> allocations = >- 
BitrateAllocationStrategy::DistributeBitratesEvenly(track_config_ptrs, >+ BitrateAllocationStrategy::DistributeBitratesEvenly(track_configs, > available_bitrate); > EXPECT_EQ(min_audio_bitrate + even_bitrate_increase, allocations[0]); > EXPECT_EQ(min_video_bitrate + even_bitrate_increase, allocations[1]); >@@ -108,11 +102,7 @@ std::vector<uint32_t> RunAudioPriorityAllocation( > BitrateAllocationStrategy::TrackConfig(min_other_bitrate, > max_other_bitrate, false, "")}; > >- std::vector<const rtc::BitrateAllocationStrategy::TrackConfig*> >- track_config_ptrs = MakeTrackConfigPtrsVector(track_configs); >- >- return allocation_strategy.AllocateBitrates(available_bitrate, >- track_config_ptrs); >+ return allocation_strategy.AllocateBitrates(available_bitrate, track_configs); > } > > // Test that when the available bitrate is less than the sum of the minimum >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/buffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/buffer.h >index 22f89372debc16d55d6c8c4c37f8f0dda3569114..f9291b99e5a103244c7d6c46cba82641e96eb176 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/buffer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/buffer.h >@@ -149,11 +149,13 @@ class BufferT { > } > > BufferT& operator=(BufferT&& buf) { >- RTC_DCHECK(IsConsistent()); > RTC_DCHECK(buf.IsConsistent()); >+ MaybeZeroCompleteBuffer(); > size_ = buf.size_; > capacity_ = buf.capacity_; >- data_ = std::move(buf.data_); >+ using std::swap; >+ swap(data_, buf.data_); >+ buf.data_.reset(); > buf.OnMovedFrom(); > return *this; > } >@@ -374,10 +376,10 @@ class BufferT { > > // Zero the complete buffer if template argument "ZeroOnFree" is true. 
> void MaybeZeroCompleteBuffer() { >- if (ZeroOnFree && capacity_) { >+ if (ZeroOnFree && capacity_ > 0) { > // It would be sufficient to only zero "size_" elements, as all other > // methods already ensure that the unused capacity contains no sensitive >- // data - but better safe than sorry. >+ // data---but better safe than sorry. > ExplicitZeroMemory(data_.get(), capacity_ * sizeof(T)); > } > } >@@ -389,7 +391,7 @@ class BufferT { > ExplicitZeroMemory(data_.get() + size_, count * sizeof(T)); > } > >- // Precondition for all methods except Clear and the destructor. >+ // Precondition for all methods except Clear, operator= and the destructor. > // Postcondition for all methods except move construction and move > // assignment, which leave the moved-from object in a possibly inconsistent > // state. >@@ -400,15 +402,16 @@ class BufferT { > // Called when *this has been moved from. Conceptually it's a no-op, but we > // can mutate the state slightly to help subsequent sanity checks catch bugs. > void OnMovedFrom() { >+ RTC_DCHECK(!data_); // Our heap block should have been stolen. > #if RTC_DCHECK_IS_ON >+ // Ensure that *this is always inconsistent, to provoke bugs. >+ size_ = 1; >+ capacity_ = 0; >+#else > // Make *this consistent and empty. Shouldn't be necessary, but better safe > // than sorry. > size_ = 0; > capacity_ = 0; >-#else >- // Ensure that *this is always inconsistent, to provoke bugs. 
>- size_ = 1; >- capacity_ = 0; > #endif > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/buffer_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/buffer_unittest.cc >index ae976f1394322f11b1a9496466d7385e2bac94aa..b2f47c16ecbe3f3df8d1fc683f6da80ca3f8d02b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/buffer_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/buffer_unittest.cc >@@ -185,6 +185,17 @@ TEST(BufferTest, TestMoveAssign) { > EXPECT_TRUE(buf1.empty()); > } > >+TEST(BufferTest, TestMoveAssignSelf) { >+ // Move self-assignment isn't required to produce a meaningful state, but >+ // should not leave the object in an inconsistent state. (Such inconsistent >+ // state could be caught by the DCHECKs and/or by the leak checker.) We need >+ // to be sneaky when testing this; if we're doing a too-obvious >+ // move-assign-to-self, clang's -Wself-move triggers at compile time. >+ Buffer buf(kTestData, 3, 40); >+ Buffer* buf_ptr = &buf; >+ buf = std::move(*buf_ptr); >+} >+ > TEST(BufferTest, TestSwap) { > Buffer buf1(kTestData, 3); > Buffer buf2(kTestData, 6, 40); >@@ -434,6 +445,19 @@ TEST(BufferTest, TestStruct) { > EXPECT_EQ(kObsidian, buf[2].stone); > } > >+TEST(BufferTest, DieOnUseAfterMove) { >+ Buffer buf(17); >+ Buffer buf2 = std::move(buf); >+ EXPECT_EQ(buf2.size(), 17u); >+#if RTC_DCHECK_IS_ON >+#if GTEST_HAS_DEATH_TEST && !defined(WEBRTC_ANDROID) >+ EXPECT_DEATH(buf.empty(), ""); >+#endif >+#else >+ EXPECT_TRUE(buf.empty()); >+#endif >+} >+ > TEST(ZeroOnFreeBufferTest, TestZeroOnSetData) { > ZeroOnFreeBuffer<uint8_t> buf(kTestData, 7); > const uint8_t* old_data = buf.data(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bufferqueue.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bufferqueue.cc >index 48ff2e6d317d78208a6b330efb398add64e3df7c..74f7a502c0b910e61f5aaa8bc109e6334b8a918a 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bufferqueue.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bufferqueue.cc >@@ -10,6 +10,8 @@ > > #include "rtc_base/bufferqueue.h" > >+#include <stdint.h> >+#include <string.h> > #include <algorithm> > > namespace rtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bufferqueue.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bufferqueue.h >index 94ab0ca4c9eb0cf3bb3f964f7e6cc68a39e1d5d7..63f5182509f62204be01115798b90f1e0a8e2560 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bufferqueue.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bufferqueue.h >@@ -11,12 +11,14 @@ > #ifndef RTC_BASE_BUFFERQUEUE_H_ > #define RTC_BASE_BUFFERQUEUE_H_ > >+#include <stddef.h> > #include <deque> > #include <vector> > > #include "rtc_base/buffer.h" > #include "rtc_base/constructormagic.h" > #include "rtc_base/criticalsection.h" >+#include "rtc_base/thread_annotations.h" > > namespace rtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bytebuffer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bytebuffer.cc >index 94fc6acf08027412a25ed7367320f76e21ddad10..f8ce1a29efe16687ac654fe9afcb25edfaa308f7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bytebuffer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bytebuffer.cc >@@ -12,8 +12,6 @@ > > #include <string.h> > >-#include <algorithm> >- > namespace rtc { > > ByteBufferWriter::ByteBufferWriter() : ByteBufferWriterT() {} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bytebuffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bytebuffer.h >index 9e08f025a84237e944a1ff3e538baf8972dce293..4d25c210507d731059f83bd53abd9737be4d0f84 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bytebuffer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/bytebuffer.h >@@ -11,6 +11,8 @@ > #ifndef RTC_BASE_BYTEBUFFER_H_ > #define 
RTC_BASE_BYTEBUFFER_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <string> > > #include "rtc_base/buffer.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/cancelable_periodic_task_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/cancelable_periodic_task_unittest.cc >index fe27ea7d48b656ec5e98b9efaf56efc2668b9947..badd623415ad55319f72ddffffbf1b3bc2e4b768 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/cancelable_periodic_task_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/cancelable_periodic_task_unittest.cc >@@ -68,7 +68,7 @@ TEST(CancelablePeriodicTaskTest, CanCallCancelOnEmptyHandle) { > } > > TEST(CancelablePeriodicTaskTest, CancelTaskBeforeItRuns) { >- rtc::Event done(false, false); >+ rtc::Event done; > MockClosure mock; > EXPECT_CALL(mock, Call).Times(0); > EXPECT_CALL(mock, Delete).WillOnce(Invoke([&done] { done.Set(); })); >@@ -84,7 +84,7 @@ TEST(CancelablePeriodicTaskTest, CancelTaskBeforeItRuns) { > } > > TEST(CancelablePeriodicTaskTest, CancelDelayedTaskBeforeItRuns) { >- rtc::Event done(false, false); >+ rtc::Event done; > MockClosure mock; > EXPECT_CALL(mock, Call).Times(0); > EXPECT_CALL(mock, Delete).WillOnce(Invoke([&done] { done.Set(); })); >@@ -100,7 +100,7 @@ TEST(CancelablePeriodicTaskTest, CancelDelayedTaskBeforeItRuns) { > } > > TEST(CancelablePeriodicTaskTest, CancelTaskAfterItRuns) { >- rtc::Event done(false, false); >+ rtc::Event done; > MockClosure mock; > EXPECT_CALL(mock, Call).WillOnce(Return(100)); > EXPECT_CALL(mock, Delete).WillOnce(Invoke([&done] { done.Set(); })); >@@ -117,7 +117,7 @@ TEST(CancelablePeriodicTaskTest, CancelTaskAfterItRuns) { > > TEST(CancelablePeriodicTaskTest, ZeroReturnValueRepostsTheTask) { > NiceMock<MockClosure> closure; >- rtc::Event done(false, false); >+ rtc::Event done; > EXPECT_CALL(closure, Call()).WillOnce(Return(0)).WillOnce(Invoke([&done] { > done.Set(); > return kTimeoutMs; >@@ -130,7 +130,7 @@ 
TEST(CancelablePeriodicTaskTest, ZeroReturnValueRepostsTheTask) { > > TEST(CancelablePeriodicTaskTest, StartPeriodicTask) { > MockFunction<int()> closure; >- rtc::Event done(false, false); >+ rtc::Event done; > EXPECT_CALL(closure, Call()) > .WillOnce(Return(20)) > .WillOnce(Return(20)) >@@ -146,7 +146,7 @@ TEST(CancelablePeriodicTaskTest, StartPeriodicTask) { > > // Validates perfect forwarding doesn't keep reference to deleted copy. > TEST(CancelablePeriodicTaskTest, CreateWithCopyOfAClosure) { >- rtc::Event done(false, false); >+ rtc::Event done; > MockClosure mock; > EXPECT_CALL(mock, Call).WillOnce(Invoke([&done] { > done.Set(); >@@ -166,7 +166,7 @@ TEST(CancelablePeriodicTaskTest, CreateWithCopyOfAClosure) { > } > > TEST(CancelablePeriodicTaskTest, DeletingHandleDoesntStopTheTask) { >- rtc::Event run(false, false); >+ rtc::Event run; > rtc::TaskQueue task_queue("queue"); > auto task = rtc::CreateCancelablePeriodicTask(([&] { > run.Set(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/copyonwritebuffer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/copyonwritebuffer.cc >index 6c48d52f17172ee8c4cfaefad3e9051ccb420933..8f5126a28268cf198acb4cda9ff4ac74981e77b7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/copyonwritebuffer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/copyonwritebuffer.cc >@@ -10,6 +10,8 @@ > > #include "rtc_base/copyonwritebuffer.h" > >+#include <stddef.h> >+ > namespace rtc { > > CopyOnWriteBuffer::CopyOnWriteBuffer() { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/copyonwritebuffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/copyonwritebuffer.h >index 0514e2fe6611a25966fd633e8f692f4623a2f5cf..cc174dfc7c969e728995de3d34daa4ac8d2833cd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/copyonwritebuffer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/copyonwritebuffer.h >@@ -11,12 +11,15 @@ > #ifndef RTC_BASE_COPYONWRITEBUFFER_H_ > 
#define RTC_BASE_COPYONWRITEBUFFER_H_ > >+#include <stdint.h> > #include <algorithm> >+#include <cstring> >+#include <string> >+#include <type_traits> > #include <utility> > > #include "rtc_base/buffer.h" > #include "rtc_base/checks.h" >-#include "rtc_base/refcount.h" > #include "rtc_base/refcountedobject.h" > #include "rtc_base/scoped_ref_ptr.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/cpu_time.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/cpu_time.cc >index de4a6bd6844b1eb4322b417d175fa0f166c8ee23..ad91ecace973195c495bb22791425929d33362ed 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/cpu_time.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/cpu_time.cc >@@ -65,6 +65,9 @@ int64_t GetProcessCpuTimeNanos() { > } else { > RTC_LOG_ERR(LS_ERROR) << "GetProcessTimes() failed."; > } >+#elif defined(WEBRTC_FUCHSIA) >+ RTC_LOG_ERR(LS_ERROR) << "GetProcessCpuTimeNanos() not implemented"; >+ return 0; > #else > // Not implemented yet. > static_assert( >@@ -107,10 +110,13 @@ int64_t GetThreadCpuTimeNanos() { > } else { > RTC_LOG_ERR(LS_ERROR) << "GetThreadTimes() failed."; > } >+#elif defined(WEBRTC_FUCHSIA) >+ RTC_LOG_ERR(LS_ERROR) << "GetThreadCpuTimeNanos() not implemented"; >+ return 0; > #else > // Not implemented yet. 
> static_assert( >- false, "GetProcessCpuTimeNanos() platform support not yet implemented."); >+ false, "GetThreadCpuTimeNanos() platform support not yet implemented."); > #endif > return -1; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/criticalsection.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/criticalsection.cc >index d8a5b48c0513902587ba21954f60847c516a395a..4e00be968de902ad2adfa7ee41020dfebf219ba0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/criticalsection.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/criticalsection.cc >@@ -10,6 +10,8 @@ > > #include "rtc_base/criticalsection.h" > >+#include <time.h> >+ > #include "rtc_base/atomicops.h" > #include "rtc_base/checks.h" > #include "rtc_base/platform_thread_types.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/criticalsection_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/criticalsection_unittest.cc >index db4f9e7452f57c660c7d953241b38c9d20f09804..6016f85f898e5e48ea6505dda59ce8a6587bf23c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/criticalsection_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/criticalsection_unittest.cc >@@ -389,7 +389,7 @@ class PerfTestThread { > // The test is disabled by default to avoid unecessarily loading the bots. 
> TEST(CriticalSectionTest, DISABLED_Performance) { > PerfTestThread threads[8]; >- Event event(false, false); >+ Event event; > > static const int kThreadRepeats = 10000000; > static const int kExpectedCount = kThreadRepeats * arraysize(threads); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/event.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/event.cc >index 6c9639b4fecc111a807eeaa18aef03ba71816baa..42c22a29ccb0968a837deee9db6bddec1e4e0ab2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/event.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/event.cc >@@ -24,6 +24,8 @@ > > namespace rtc { > >+Event::Event() : Event(false, false) {} >+ > #if defined(WEBRTC_WIN) > > Event::Event(bool manual_reset, bool initially_signaled) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/event.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/event.h >index 7e61c4c8a6878fdea13eb700a39c025cdf1caa5b..2e11002066da91fc59184aa90f156e1f525daf1c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/event.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/event.h >@@ -11,7 +11,6 @@ > #ifndef RTC_BASE_EVENT_H_ > #define RTC_BASE_EVENT_H_ > >-#include "rtc_base/constructormagic.h" > #if defined(WEBRTC_WIN) > #include <windows.h> > #elif defined(WEBRTC_POSIX) >@@ -26,7 +25,10 @@ class Event { > public: > static const int kForever = -1; > >+ Event(); > Event(bool manual_reset, bool initially_signaled); >+ Event(const Event&) = delete; >+ Event& operator=(const Event&) = delete; > ~Event(); > > void Set(); >@@ -45,8 +47,6 @@ class Event { > const bool is_manual_reset_; > bool event_status_; > #endif >- >- RTC_DISALLOW_IMPLICIT_CONSTRUCTORS(Event); > }; > > // This class is provided for compatibility with Chromium. 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/event_tracer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/event_tracer.cc >index 31f4271d31e2a39db7b88413dc1868ea37cb7f2b..af88c9ddd4c101cd89f1dcbf486c03bfb4982fc5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/event_tracer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/event_tracer.cc >@@ -10,7 +10,9 @@ > #include "rtc_base/event_tracer.h" > > #include <inttypes.h> >- >+#include <stdint.h> >+#include <stdio.h> >+#include <string.h> > #include <string> > #include <vector> > >@@ -20,7 +22,9 @@ > #include "rtc_base/event.h" > #include "rtc_base/logging.h" > #include "rtc_base/platform_thread.h" >-#include "rtc_base/stringutils.h" >+#include "rtc_base/platform_thread_types.h" >+#include "rtc_base/thread_annotations.h" >+#include "rtc_base/thread_checker.h" > #include "rtc_base/timeutils.h" > #include "rtc_base/trace_event.h" > >@@ -86,8 +90,7 @@ class EventLogger final { > : logging_thread_(EventTracingThreadFunc, > this, > "EventTracingThread", >- kLowPriority), >- shutdown_event_(false, false) {} >+ kLowPriority) {} > ~EventLogger() { RTC_DCHECK(thread_checker_.CalledOnValidThread()); } > > void AddTraceEvent(const char* name, >@@ -286,19 +289,19 @@ class EventLogger final { > } > break; > case TRACE_VALUE_TYPE_UINT: >- print_length = sprintfn(&output[0], kTraceArgBufferLength, "%llu", >+ print_length = snprintf(&output[0], kTraceArgBufferLength, "%llu", > arg.value.as_uint); > break; > case TRACE_VALUE_TYPE_INT: >- print_length = sprintfn(&output[0], kTraceArgBufferLength, "%lld", >+ print_length = snprintf(&output[0], kTraceArgBufferLength, "%lld", > arg.value.as_int); > break; > case TRACE_VALUE_TYPE_DOUBLE: >- print_length = sprintfn(&output[0], kTraceArgBufferLength, "%f", >+ print_length = snprintf(&output[0], kTraceArgBufferLength, "%f", > arg.value.as_double); > break; > case TRACE_VALUE_TYPE_POINTER: >- print_length = sprintfn(&output[0], 
kTraceArgBufferLength, "\"%p\"", >+ print_length = snprintf(&output[0], kTraceArgBufferLength, "\"%p\"", > arg.value.as_pointer); > break; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/event_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/event_unittest.cc >index 050619e849541cac07d090efd7bfc0f3b50b8d40..a65111bc90f7f3837e38d2378adcc40a8d896b75 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/event_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/event_unittest.cc >@@ -32,7 +32,7 @@ TEST(EventTest, ManualReset) { > } > > TEST(EventTest, AutoReset) { >- Event event(false, false); >+ Event event; > ASSERT_FALSE(event.Wait(0)); > > event.Set(); >@@ -59,7 +59,7 @@ class SignalerThread { > me->reader_->Wait(Event::kForever); > } > } >- Event stop_event_{false, false}; >+ Event stop_event_; > Event* writer_; > Event* reader_; > PlatformThread thread_; >@@ -68,7 +68,7 @@ class SignalerThread { > // These tests are disabled by default and only intended to be run manually. 
> TEST(EventTest, DISABLED_PerformanceSingleThread) { > static const int kNumIterations = 10000000; >- Event event(false, false); >+ Event event; > for (int i = 0; i < kNumIterations; ++i) { > event.Set(); > event.Wait(0); >@@ -77,8 +77,8 @@ TEST(EventTest, DISABLED_PerformanceSingleThread) { > > TEST(EventTest, DISABLED_PerformanceMultiThread) { > static const int kNumIterations = 10000; >- Event read(false, false); >- Event write(false, false); >+ Event read; >+ Event write; > SignalerThread thread; > thread.Start(&read, &write); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/BUILD.gn >index b10d0c154592062671a9f2ac9d8a92f5036826da..d36a43e9469055edcc4df695a3ad15884ebf7317 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/BUILD.gn >@@ -28,11 +28,11 @@ rtc_static_library("field_trial_parser") { > "field_trial_units.h", > ] > deps = [ >- "../:rtc_base_approved", > "../../api/units:data_rate", > "../../api/units:data_size", > "../../api/units:time_delta", > "../../rtc_base:checks", >+ "../../rtc_base:logging", > "//third_party/abseil-cpp/absl/types:optional", > ] > } >@@ -56,13 +56,36 @@ rtc_static_library("quality_scaling_experiment") { > ] > deps = [ > "../:rtc_base_approved", >- "../..:webrtc_common", > "../../api/video_codecs:video_codecs_api", > "../../system_wrappers:field_trial", > "//third_party/abseil-cpp/absl/types:optional", > ] > } > >+rtc_static_library("normalize_simulcast_size_experiment") { >+ sources = [ >+ "normalize_simulcast_size_experiment.cc", >+ "normalize_simulcast_size_experiment.h", >+ ] >+ deps = [ >+ "../:rtc_base_approved", >+ "../../system_wrappers:field_trial", >+ "//third_party/abseil-cpp/absl/types:optional", >+ ] >+} >+ >+rtc_static_library("cpu_speed_experiment") { >+ sources = [ >+ "cpu_speed_experiment.cc", >+ 
"cpu_speed_experiment.h", >+ ] >+ deps = [ >+ "../:rtc_base_approved", >+ "../../system_wrappers:field_trial", >+ "//third_party/abseil-cpp/absl/types:optional", >+ ] >+} >+ > rtc_static_library("rtt_mult_experiment") { > sources = [ > "rtt_mult_experiment.cc", >@@ -74,26 +97,44 @@ rtc_static_library("rtt_mult_experiment") { > ] > } > >+rtc_static_library("jitter_upper_bound_experiment") { >+ sources = [ >+ "jitter_upper_bound_experiment.cc", >+ "jitter_upper_bound_experiment.h", >+ ] >+ deps = [ >+ "../:rtc_base_approved", >+ "../../system_wrappers:field_trial", >+ "//third_party/abseil-cpp/absl/types:optional", >+ ] >+} >+ > if (rtc_include_tests) { > rtc_source_set("experiments_unittests") { > testonly = true > > sources = [ > "congestion_controller_experiment_unittest.cc", >+ "cpu_speed_experiment_unittest.cc", > "field_trial_parser_unittest.cc", > "field_trial_units_unittest.cc", >+ "normalize_simulcast_size_experiment_unittest.cc", > "quality_scaling_experiment_unittest.cc", > "rtt_mult_experiment_unittest.cc", > ] > deps = [ > ":congestion_controller_experiment", >+ ":cpu_speed_experiment", > ":field_trial_parser", >+ ":normalize_simulcast_size_experiment", > ":quality_scaling_experiment", > ":rtt_mult_experiment", >+ "..:gunit_helpers", > "../:rtc_base_tests_main", > "../:rtc_base_tests_utils", > "../../system_wrappers:field_trial", > "../../test:field_trial", >+ "../../test:test_support", > ] > } > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/OWNERS b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/OWNERS >new file mode 100644 >index 0000000000000000000000000000000000000000..9c5587226f8da22455e014ed961a112ce1fa8e0c >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/OWNERS >@@ -0,0 +1,8 @@ >+per-file alr_experiment*=sprang@webrtc.org >+per-file congestion_controller_experiment*=srte@webrtc.org >+per-file cpu_speed_experiment*=asapersson@webrtc.org >+per-file 
field_trial*=srte@webrtc.org >+per-file jitter_upper_bound_experiment*=sprang@webrtc.org >+per-file normalize_simulcast_size_experiment*=asapersson@webrtc.org >+per-file quality_scaling_experiment*=asapersson@webrtc.org >+per-file rtt_mult_experiment*=mhoro@webrtc.org >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/alr_experiment.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/alr_experiment.cc >index dff5acecc8b7566a08c1569ee70ff1508e267444..25e948d98a0e7684bacd399db662a6985d57d37b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/alr_experiment.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/alr_experiment.cc >@@ -10,9 +10,10 @@ > > #include "rtc_base/experiments/alr_experiment.h" > >+#include <inttypes.h> >+#include <stdio.h> > #include <string> > >-#include "rtc_base/format_macros.h" > #include "rtc_base/logging.h" > #include "system_wrappers/include/field_trial.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/alr_experiment.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/alr_experiment.h >index 4d9fd001304965542fd558d144127ea3377842fe..876bd0251730b6a040cca3ae5f91031135ce9458 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/alr_experiment.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/alr_experiment.h >@@ -11,6 +11,8 @@ > #ifndef RTC_BASE_EXPERIMENTS_ALR_EXPERIMENT_H_ > #define RTC_BASE_EXPERIMENTS_ALR_EXPERIMENT_H_ > >+#include <stdint.h> >+ > #include "absl/types/optional.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/cpu_speed_experiment.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/cpu_speed_experiment.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..f39540cf0390f1b99894acd1e4f04f4176c4fd4f >--- /dev/null >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/cpu_speed_experiment.cc >@@ -0,0 +1,70 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "rtc_base/experiments/cpu_speed_experiment.h" >+ >+#include <string> >+ >+#include "rtc_base/logging.h" >+#include "system_wrappers/include/field_trial.h" >+ >+namespace webrtc { >+namespace { >+constexpr char kFieldTrial[] = "WebRTC-VP8-CpuSpeed-Arm"; >+constexpr int kMinSetting = -16; >+constexpr int kMaxSetting = -1; >+} // namespace >+ >+absl::optional<std::vector<CpuSpeedExperiment::Config>> >+CpuSpeedExperiment::GetConfigs() { >+ if (!webrtc::field_trial::IsEnabled(kFieldTrial)) >+ return absl::nullopt; >+ >+ const std::string group = webrtc::field_trial::FindFullName(kFieldTrial); >+ if (group.empty()) >+ return absl::nullopt; >+ >+ std::vector<Config> configs(3); >+ if (sscanf(group.c_str(), "Enabled-%d,%d,%d,%d,%d,%d", &(configs[0].pixels), >+ &(configs[0].cpu_speed), &(configs[1].pixels), >+ &(configs[1].cpu_speed), &(configs[2].pixels), >+ &(configs[2].cpu_speed)) != 6) { >+ RTC_LOG(LS_WARNING) << "Too few parameters provided."; >+ return absl::nullopt; >+ } >+ >+ for (const auto& config : configs) { >+ if (config.cpu_speed < kMinSetting || config.cpu_speed > kMaxSetting) { >+ RTC_LOG(LS_WARNING) << "Unsupported cpu speed setting, value ignored."; >+ return absl::nullopt; >+ } >+ } >+ >+ for (size_t i = 1; i < configs.size(); ++i) { >+ if (configs[i].pixels < configs[i - 1].pixels || >+ configs[i].cpu_speed > configs[i - 1].cpu_speed) { >+ RTC_LOG(LS_WARNING) << "Invalid parameter value provided."; >+ return absl::nullopt; 
>+ } >+ } >+ >+ return absl::optional<std::vector<Config>>(configs); >+} >+ >+int CpuSpeedExperiment::GetValue(int pixels, >+ const std::vector<Config>& configs) { >+ for (const auto& config : configs) { >+ if (pixels <= config.pixels) >+ return config.cpu_speed; >+ } >+ return kMinSetting; >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/cpu_speed_experiment.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/cpu_speed_experiment.h >new file mode 100644 >index 0000000000000000000000000000000000000000..e6c83409430141e5701ae20ba144c7d672ce9d9b >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/cpu_speed_experiment.h >@@ -0,0 +1,41 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef RTC_BASE_EXPERIMENTS_CPU_SPEED_EXPERIMENT_H_ >+#define RTC_BASE_EXPERIMENTS_CPU_SPEED_EXPERIMENT_H_ >+ >+#include <vector> >+ >+#include "absl/types/optional.h" >+ >+namespace webrtc { >+ >+class CpuSpeedExperiment { >+ public: >+ struct Config { >+ bool operator==(const Config& o) const { >+ return pixels == o.pixels && cpu_speed == o.cpu_speed; >+ } >+ >+ int pixels; // The video frame size. >+ int cpu_speed; // The |cpu_speed| to be used if the frame size is less >+ // than or equal to |pixels|. >+ }; >+ >+ // Returns the configurations from field trial on success. >+ static absl::optional<std::vector<Config>> GetConfigs(); >+ >+ // Gets the cpu speed from the |configs| based on |pixels|. 
>+ static int GetValue(int pixels, const std::vector<Config>& configs); >+}; >+ >+} // namespace webrtc >+ >+#endif // RTC_BASE_EXPERIMENTS_CPU_SPEED_EXPERIMENT_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/cpu_speed_experiment_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/cpu_speed_experiment_unittest.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..edc782c0addbd694421e3bbb59cb478bb32eec2f >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/cpu_speed_experiment_unittest.cc >@@ -0,0 +1,85 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "rtc_base/experiments/cpu_speed_experiment.h" >+ >+#include "rtc_base/gunit.h" >+#include "test/field_trial.h" >+#include "test/gmock.h" >+ >+namespace webrtc { >+ >+TEST(CpuSpeedExperimentTest, GetConfigsFailsIfNotEnabled) { >+ EXPECT_FALSE(CpuSpeedExperiment::GetConfigs()); >+} >+ >+TEST(CpuSpeedExperimentTest, GetConfigsFailsForTooFewParameters) { >+ webrtc::test::ScopedFieldTrials field_trials( >+ "WebRTC-VP8-CpuSpeed-Arm/Enabled-1000,-1,2000,-10,3000/"); >+ EXPECT_FALSE(CpuSpeedExperiment::GetConfigs()); >+} >+ >+TEST(CpuSpeedExperimentTest, GetConfigs) { >+ webrtc::test::ScopedFieldTrials field_trials( >+ "WebRTC-VP8-CpuSpeed-Arm/Enabled-1000,-1,2000,-10,3000,-16/"); >+ >+ const absl::optional<std::vector<CpuSpeedExperiment::Config>> kConfigs = >+ CpuSpeedExperiment::GetConfigs(); >+ ASSERT_TRUE(kConfigs); >+ EXPECT_THAT(*kConfigs, >+ ::testing::ElementsAre(CpuSpeedExperiment::Config{1000, -1}, >+ CpuSpeedExperiment::Config{2000, -10}, >+ CpuSpeedExperiment::Config{3000, -16})); >+} >+ >+TEST(CpuSpeedExperimentTest, GetValue) { >+ webrtc::test::ScopedFieldTrials field_trials( >+ "WebRTC-VP8-CpuSpeed-Arm/Enabled-1000,-5,2000,-10,3000,-12/"); >+ >+ const absl::optional<std::vector<CpuSpeedExperiment::Config>> kConfigs = >+ CpuSpeedExperiment::GetConfigs(); >+ ASSERT_TRUE(kConfigs); >+ ASSERT_EQ(3u, (*kConfigs).size()); >+ EXPECT_EQ(-5, CpuSpeedExperiment::GetValue(1, *kConfigs)); >+ EXPECT_EQ(-5, CpuSpeedExperiment::GetValue(1000, *kConfigs)); >+ EXPECT_EQ(-10, CpuSpeedExperiment::GetValue(1000 + 1, *kConfigs)); >+ EXPECT_EQ(-10, CpuSpeedExperiment::GetValue(2000, *kConfigs)); >+ EXPECT_EQ(-12, CpuSpeedExperiment::GetValue(2000 + 1, *kConfigs)); >+ EXPECT_EQ(-12, CpuSpeedExperiment::GetValue(3000, *kConfigs)); >+ EXPECT_EQ(-16, CpuSpeedExperiment::GetValue(3000 + 1, *kConfigs)); >+} >+ >+TEST(CpuSpeedExperimentTest, GetConfigsFailsForTooSmallValue) { >+ // Supported range: [-16, -1]. 
>+ webrtc::test::ScopedFieldTrials field_trials( >+ "WebRTC-VP8-CpuSpeed-Arm/Enabled-1000,-1,2000,-10,3000,-17/"); >+ EXPECT_FALSE(CpuSpeedExperiment::GetConfigs()); >+} >+ >+TEST(CpuSpeedExperimentTest, GetConfigsFailsForTooLargeValue) { >+ // Supported range: [-16, -1]. >+ webrtc::test::ScopedFieldTrials field_trials( >+ "WebRTC-VP8-CpuSpeed-Arm/Enabled-1000,0,2000,-10,3000,-16/"); >+ EXPECT_FALSE(CpuSpeedExperiment::GetConfigs()); >+} >+ >+TEST(CpuSpeedExperimentTest, GetConfigsFailsIfPixelsDecreasing) { >+ webrtc::test::ScopedFieldTrials field_trials( >+ "WebRTC-VP8-CpuSpeed-Arm/Enabled-1000,-5,999,-10,3000,-16/"); >+ EXPECT_FALSE(CpuSpeedExperiment::GetConfigs()); >+} >+ >+TEST(CpuSpeedExperimentTest, GetConfigsFailsIfCpuSpeedIncreasing) { >+ webrtc::test::ScopedFieldTrials field_trials( >+ "WebRTC-VP8-CpuSpeed-Arm/Enabled-1000,-5,2000,-4,3000,-16/"); >+ EXPECT_FALSE(CpuSpeedExperiment::GetConfigs()); >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/field_trial_parser.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/field_trial_parser.cc >index a2d7f97bde1f4643f7e5efdd41a2a24aef29fb98..936487cb015e0262ef90a2802ac26dabf347350f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/field_trial_parser.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/field_trial_parser.cc >@@ -169,6 +169,9 @@ template class FieldTrialParameter<double>; > template class FieldTrialParameter<int>; > template class FieldTrialParameter<std::string>; > >+template class FieldTrialConstrained<double>; >+template class FieldTrialConstrained<int>; >+ > template class FieldTrialOptional<double>; > template class FieldTrialOptional<int>; > template class FieldTrialOptional<bool>; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/field_trial_parser.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/field_trial_parser.h >index 
22a88897170bc55072fd8d1e31a0dd7f037d8fd1..8bdd9b5d8ebe8601a44e6a22c9a768d9bae6ff9b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/field_trial_parser.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/field_trial_parser.h >@@ -92,6 +92,43 @@ class FieldTrialParameter : public FieldTrialParameterInterface { > T value_; > }; > >+// This class uses the ParseTypedParameter function to implement a parameter >+// implementation with an enforced default value and a range constraint. Values >+// outside the configured range will be ignored. >+template <typename T> >+class FieldTrialConstrained : public FieldTrialParameterInterface { >+ public: >+ FieldTrialConstrained(std::string key, >+ T default_value, >+ absl::optional<T> lower_limit, >+ absl::optional<T> upper_limit) >+ : FieldTrialParameterInterface(key), >+ value_(default_value), >+ lower_limit_(lower_limit), >+ upper_limit_(upper_limit) {} >+ T Get() const { return value_; } >+ operator T() const { return Get(); } >+ const T* operator->() const { return &value_; } >+ >+ protected: >+ bool Parse(absl::optional<std::string> str_value) override { >+ if (str_value) { >+ absl::optional<T> value = ParseTypedParameter<T>(*str_value); >+ if (value && (!lower_limit_ || *value >= *lower_limit_) && >+ (!upper_limit_ || *value <= *upper_limit_)) { >+ value_ = *value; >+ return true; >+ } >+ } >+ return false; >+ } >+ >+ private: >+ T value_; >+ absl::optional<T> lower_limit_; >+ absl::optional<T> upper_limit_; >+}; >+ > class AbstractFieldTrialEnum : public FieldTrialParameterInterface { > public: > AbstractFieldTrialEnum(std::string key, >@@ -191,6 +228,9 @@ extern template class FieldTrialParameter<int>; > // Using the given value as is. 
> extern template class FieldTrialParameter<std::string>; > >+extern template class FieldTrialConstrained<double>; >+extern template class FieldTrialConstrained<int>; >+ > extern template class FieldTrialOptional<double>; > extern template class FieldTrialOptional<int>; > extern template class FieldTrialOptional<bool>; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/field_trial_parser_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/field_trial_parser_unittest.cc >index de977ec8ba809722030408d6597f57bac14ecbdc..0d067f5a49138a75cd8cde18724e6572bc9ad984 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/field_trial_parser_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/field_trial_parser_unittest.cc >@@ -99,6 +99,19 @@ TEST(FieldTrialParserTest, IgnoresInvalid) { > EXPECT_EQ(exp.ping.Get(), false); > EXPECT_EQ(exp.hash.Get(), "a80"); > } >+TEST(FieldTrialParserTest, IgnoresOutOfRange) { >+ FieldTrialConstrained<double> low("low", 10, absl::nullopt, 100); >+ FieldTrialConstrained<double> high("high", 10, 5, absl::nullopt); >+ ParseFieldTrial({&low, &high}, "low:1000,high:0"); >+ EXPECT_EQ(low.Get(), 10); >+ EXPECT_EQ(high.Get(), 10); >+ ParseFieldTrial({&low, &high}, "low:inf,high:nan"); >+ EXPECT_EQ(low.Get(), 10); >+ EXPECT_EQ(high.Get(), 10); >+ ParseFieldTrial({&low, &high}, "low:20,high:20"); >+ EXPECT_EQ(low.Get(), 20); >+ EXPECT_EQ(high.Get(), 20); >+} > TEST(FieldTrialParserTest, ParsesOptionalParameters) { > FieldTrialOptional<int> max_count("c", absl::nullopt); > ParseFieldTrial({&max_count}, ""); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/field_trial_units.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/field_trial_units.cc >index f53978b21153d4fd5024e1c99564d57ebf91adf7..5311a3a941c780b9e035c0fb0b38781c437ba283 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/field_trial_units.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/field_trial_units.cc >@@ -9,6 +9,7 @@ > */ > #include "rtc_base/experiments/field_trial_units.h" > >+#include <stdio.h> > #include <limits> > #include <string> > >@@ -86,6 +87,10 @@ template class FieldTrialParameter<DataRate>; > template class FieldTrialParameter<DataSize>; > template class FieldTrialParameter<TimeDelta>; > >+template class FieldTrialConstrained<DataRate>; >+template class FieldTrialConstrained<DataSize>; >+template class FieldTrialConstrained<TimeDelta>; >+ > template class FieldTrialOptional<DataRate>; > template class FieldTrialOptional<DataSize>; > template class FieldTrialOptional<TimeDelta>; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/field_trial_units.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/field_trial_units.h >index 932c5cb45001cf06e582e1b9dc660fb9312080df..af88f4a6c00df35106b9c5cda46339701259a6bf 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/field_trial_units.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/field_trial_units.h >@@ -21,6 +21,10 @@ extern template class FieldTrialParameter<DataRate>; > extern template class FieldTrialParameter<DataSize>; > extern template class FieldTrialParameter<TimeDelta>; > >+extern template class FieldTrialConstrained<DataRate>; >+extern template class FieldTrialConstrained<DataSize>; >+extern template class FieldTrialConstrained<TimeDelta>; >+ > extern template class FieldTrialOptional<DataRate>; > extern template class FieldTrialOptional<DataSize>; > extern template class FieldTrialOptional<TimeDelta>; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/field_trial_units_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/field_trial_units_unittest.cc >index 
80771d926abd37e7c80fde91804eff355d2740e8..57022c2f23f46e30b82313172b8392e3ba3a2267 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/field_trial_units_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/field_trial_units_unittest.cc >@@ -58,5 +58,25 @@ TEST(FieldTrialParserUnitsTest, ParsesOtherUnitParameters) { > EXPECT_EQ(*exp.max_buffer.GetOptional(), DataSize::bytes(8)); > EXPECT_EQ(exp.period.Get(), TimeDelta::ms(300)); > } >+TEST(FieldTrialParserUnitsTest, IgnoresOutOfRange) { >+ FieldTrialConstrained<DataRate> rate("r", DataRate::kbps(30), >+ DataRate::kbps(10), DataRate::kbps(100)); >+ FieldTrialConstrained<TimeDelta> delta("d", TimeDelta::ms(30), >+ TimeDelta::ms(10), TimeDelta::ms(100)); >+ FieldTrialConstrained<DataSize> size( >+ "s", DataSize::bytes(30), DataSize::bytes(10), DataSize::bytes(100)); >+ ParseFieldTrial({&rate, &delta, &size}, "r:0,d:0,s:0"); >+ EXPECT_EQ(rate->kbps(), 30); >+ EXPECT_EQ(delta->ms(), 30); >+ EXPECT_EQ(size->bytes(), 30); >+ ParseFieldTrial({&rate, &delta, &size}, "r:300,d:300,s:300"); >+ EXPECT_EQ(rate->kbps(), 30); >+ EXPECT_EQ(delta->ms(), 30); >+ EXPECT_EQ(size->bytes(), 30); >+ ParseFieldTrial({&rate, &delta, &size}, "r:50,d:50,s:50"); >+ EXPECT_EQ(rate->kbps(), 50); >+ EXPECT_EQ(delta->ms(), 50); >+ EXPECT_EQ(size->bytes(), 50); >+} > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/jitter_upper_bound_experiment.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/jitter_upper_bound_experiment.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..b3e923019b3cfd82dc648b3ffa8934df406fdbf5 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/jitter_upper_bound_experiment.cc >@@ -0,0 +1,46 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. 
>+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "rtc_base/experiments/jitter_upper_bound_experiment.h" >+ >+#include <algorithm> >+#include <string> >+ >+#include "rtc_base/logging.h" >+#include "system_wrappers/include/field_trial.h" >+ >+namespace webrtc { >+ >+const char JitterUpperBoundExperiment::kJitterUpperBoundExperimentName[] = >+ "WebRTC-JitterUpperBound"; >+ >+absl::optional<double> JitterUpperBoundExperiment::GetUpperBoundSigmas() { >+ if (!field_trial::IsEnabled(kJitterUpperBoundExperimentName)) { >+ return absl::nullopt; >+ } >+ const std::string group = >+ webrtc::field_trial::FindFullName(kJitterUpperBoundExperimentName); >+ >+ double upper_bound_sigmas; >+ if (sscanf(group.c_str(), "Enabled-%lf", &upper_bound_sigmas) != 1) { >+ RTC_LOG(LS_WARNING) << "Invalid number of parameters provided."; >+ return absl::nullopt; >+ } >+ >+ if (upper_bound_sigmas < 0) { >+ RTC_LOG(LS_WARNING) << "Invalid jitter upper bound sigmas, must be >= 0.0: " >+ << upper_bound_sigmas; >+ return absl::nullopt; >+ } >+ >+ return upper_bound_sigmas; >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/jitter_upper_bound_experiment.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/jitter_upper_bound_experiment.h >new file mode 100644 >index 0000000000000000000000000000000000000000..262cd79efa6cccfd7ac1a9044ffd9700e78f20b0 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/jitter_upper_bound_experiment.h >@@ -0,0 +1,31 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. 
>+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef RTC_BASE_EXPERIMENTS_JITTER_UPPER_BOUND_EXPERIMENT_H_ >+#define RTC_BASE_EXPERIMENTS_JITTER_UPPER_BOUND_EXPERIMENT_H_ >+ >+#include "absl/types/optional.h" >+ >+namespace webrtc { >+ >+class JitterUpperBoundExperiment { >+ public: >+ // Returns nullopt if experiment is not on, otherwise returns the configured >+ // upper bound for frame delay delta used in jitter estimation, expressed as >+ // number of standard deviations of the current deviation from the expected >+ // delay. >+ static absl::optional<double> GetUpperBoundSigmas(); >+ >+ static const char kJitterUpperBoundExperimentName[]; >+}; >+ >+} // namespace webrtc >+ >+#endif // RTC_BASE_EXPERIMENTS_JITTER_UPPER_BOUND_EXPERIMENT_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/normalize_simulcast_size_experiment.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/normalize_simulcast_size_experiment.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..9ce5f57f8f0652d283f2ee7f37d3012670fd0dc1 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/normalize_simulcast_size_experiment.cc >@@ -0,0 +1,47 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "rtc_base/experiments/normalize_simulcast_size_experiment.h" >+ >+#include <string> >+ >+#include "rtc_base/logging.h" >+#include "system_wrappers/include/field_trial.h" >+ >+namespace webrtc { >+namespace { >+constexpr char kFieldTrial[] = "WebRTC-NormalizeSimulcastResolution"; >+constexpr int kMinSetting = 0; >+constexpr int kMaxSetting = 5; >+} // namespace >+ >+absl::optional<int> NormalizeSimulcastSizeExperiment::GetBase2Exponent() { >+ if (!webrtc::field_trial::IsEnabled(kFieldTrial)) >+ return absl::nullopt; >+ >+ const std::string group = webrtc::field_trial::FindFullName(kFieldTrial); >+ if (group.empty()) >+ return absl::nullopt; >+ >+ int exponent; >+ if (sscanf(group.c_str(), "Enabled-%d", &exponent) != 1) { >+ RTC_LOG(LS_WARNING) << "No parameter provided."; >+ return absl::nullopt; >+ } >+ >+ if (exponent < kMinSetting || exponent > kMaxSetting) { >+ RTC_LOG(LS_WARNING) << "Unsupported exp value provided, value ignored."; >+ return absl::nullopt; >+ } >+ >+ return absl::optional<int>(exponent); >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/normalize_simulcast_size_experiment.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/normalize_simulcast_size_experiment.h >new file mode 100644 >index 0000000000000000000000000000000000000000..6b358202b272ada414cc6874b360a25c7fb567cb >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/normalize_simulcast_size_experiment.h >@@ -0,0 +1,25 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#ifndef RTC_BASE_EXPERIMENTS_NORMALIZE_SIMULCAST_SIZE_EXPERIMENT_H_ >+#define RTC_BASE_EXPERIMENTS_NORMALIZE_SIMULCAST_SIZE_EXPERIMENT_H_ >+ >+#include "absl/types/optional.h" >+ >+namespace webrtc { >+class NormalizeSimulcastSizeExperiment { >+ public: >+ // Returns the base two exponent from field trial. >+ static absl::optional<int> GetBase2Exponent(); >+}; >+ >+} // namespace webrtc >+ >+#endif // RTC_BASE_EXPERIMENTS_NORMALIZE_SIMULCAST_SIZE_EXPERIMENT_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/normalize_simulcast_size_experiment_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/normalize_simulcast_size_experiment_unittest.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..c37b80917fe3c60eb1e6698448c44f3e40e8dfce >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/normalize_simulcast_size_experiment_unittest.cc >@@ -0,0 +1,59 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "rtc_base/experiments/normalize_simulcast_size_experiment.h" >+ >+#include "rtc_base/gunit.h" >+#include "test/field_trial.h" >+ >+namespace webrtc { >+ >+TEST(NormalizeSimulcastSizeExperimentTest, GetExponent) { >+ webrtc::test::ScopedFieldTrials field_trials( >+ "WebRTC-NormalizeSimulcastResolution/Enabled-2/"); >+ EXPECT_EQ(2, NormalizeSimulcastSizeExperiment::GetBase2Exponent()); >+} >+ >+TEST(NormalizeSimulcastSizeExperimentTest, GetExponentWithTwoParameters) { >+ webrtc::test::ScopedFieldTrials field_trials( >+ "WebRTC-NormalizeSimulcastResolution/Enabled-3-4/"); >+ EXPECT_EQ(3, NormalizeSimulcastSizeExperiment::GetBase2Exponent()); >+} >+ >+TEST(NormalizeSimulcastSizeExperimentTest, GetExponentFailsIfNotEnabled) { >+ webrtc::test::ScopedFieldTrials field_trials( >+ "WebRTC-NormalizeSimulcastResolution/Disabled/"); >+ EXPECT_FALSE(NormalizeSimulcastSizeExperiment::GetBase2Exponent()); >+} >+ >+TEST(NormalizeSimulcastSizeExperimentTest, >+ GetExponentFailsForInvalidFieldTrial) { >+ webrtc::test::ScopedFieldTrials field_trials( >+ "WebRTC-NormalizeSimulcastResolution/Enabled-invalid/"); >+ EXPECT_FALSE(NormalizeSimulcastSizeExperiment::GetBase2Exponent()); >+} >+ >+TEST(NormalizeSimulcastSizeExperimentTest, >+ GetExponentFailsForNegativeOutOfBoundValue) { >+ // Supported range: [0, 5]. >+ webrtc::test::ScopedFieldTrials field_trials( >+ "WebRTC-NormalizeSimulcastResolution/Enabled--1/"); >+ EXPECT_FALSE(NormalizeSimulcastSizeExperiment::GetBase2Exponent()); >+} >+ >+TEST(NormalizeSimulcastSizeExperimentTest, >+ GetExponentFailsForPositiveOutOfBoundValue) { >+ // Supported range: [0, 5]. 
>+ webrtc::test::ScopedFieldTrials field_trials( >+ "WebRTC-NormalizeSimulcastResolution/Enabled-6/"); >+ EXPECT_FALSE(NormalizeSimulcastSizeExperiment::GetBase2Exponent()); >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/quality_scaling_experiment.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/quality_scaling_experiment.h >index 80a25ef1eee58b191adc0f1bb5a015559b1e2351..14833c00b127978cb6b65e924c8c0449a9d323ea 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/quality_scaling_experiment.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/experiments/quality_scaling_experiment.h >@@ -12,7 +12,6 @@ > > #include "absl/types/optional.h" > #include "api/video_codecs/video_encoder.h" >-#include "common_types.h" // NOLINT(build/include) > > namespace webrtc { > class QualityScalingExperiment { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/fake_mdns_responder.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/fake_mdns_responder.h >index 32d69ba6836a7f9fc39e6976f6e66e26a39874d1..1e60a5d9de2d1ccc4fdf154bf6b30598d5b65bc5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/fake_mdns_responder.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/fake_mdns_responder.h >@@ -21,10 +21,10 @@ > > namespace webrtc { > >-class FakeMDnsResponder : public MDnsResponderInterface { >+class FakeMdnsResponder : public MdnsResponderInterface { > public: >- FakeMDnsResponder() = default; >- ~FakeMDnsResponder() = default; >+ explicit FakeMdnsResponder(rtc::Thread* thread) : thread_(thread) {} >+ ~FakeMdnsResponder() = default; > > void CreateNameForAddress(const rtc::IPAddress& addr, > NameCreatedCallback callback) override { >@@ -35,7 +35,9 @@ class FakeMDnsResponder : public MDnsResponderInterface { > name = std::to_string(next_available_id_++) + ".local"; > addr_name_map_[addr] = name; > } >- callback(addr, name); >+ 
invoker_.AsyncInvoke<void>( >+ RTC_FROM_HERE, thread_, >+ [callback, addr, name]() { callback(addr, name); }); > } > void RemoveNameForAddress(const rtc::IPAddress& addr, > NameRemovedCallback callback) override { >@@ -43,12 +45,16 @@ class FakeMDnsResponder : public MDnsResponderInterface { > if (it != addr_name_map_.end()) { > addr_name_map_.erase(it); > } >- callback(it != addr_name_map_.end()); >+ bool result = it != addr_name_map_.end(); >+ invoker_.AsyncInvoke<void>(RTC_FROM_HERE, thread_, >+ [callback, result]() { callback(result); }); > } > > private: > uint32_t next_available_id_ = 0; > std::map<rtc::IPAddress, std::string> addr_name_map_; >+ rtc::Thread* thread_; >+ rtc::AsyncInvoker invoker_; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/fakenetwork.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/fakenetwork.h >index d5426a3e2c90955456df17c2d3e66b25c16eb077..cb890ec583dc876be6f4d44d2ebdf0803c05ab57 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/fakenetwork.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/fakenetwork.h >@@ -82,16 +82,17 @@ class FakeNetworkManager : public NetworkManagerBase, public MessageHandler { > // MessageHandler interface. 
> virtual void OnMessage(Message* msg) { DoUpdateNetworks(); } > >- void CreateMDnsResponder() { >+ void CreateMdnsResponder() { > if (mdns_responder_ == nullptr) { >- mdns_responder_ = absl::make_unique<webrtc::FakeMDnsResponder>(); >+ mdns_responder_ = >+ absl::make_unique<webrtc::FakeMdnsResponder>(rtc::Thread::Current()); > } > } > > using NetworkManagerBase::set_enumeration_permission; > using NetworkManagerBase::set_default_local_addresses; > >- webrtc::MDnsResponderInterface* GetMDnsResponder() const override { >+ webrtc::MdnsResponderInterface* GetMdnsResponder() const override { > return mdns_responder_.get(); > } > >@@ -131,7 +132,7 @@ class FakeNetworkManager : public NetworkManagerBase, public MessageHandler { > IPAddress default_local_ipv4_address_; > IPAddress default_local_ipv6_address_; > >- std::unique_ptr<webrtc::FakeMDnsResponder> mdns_responder_; >+ std::unique_ptr<webrtc::FakeMdnsResponder> mdns_responder_; > }; > > } // namespace rtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/fakesslidentity.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/fakesslidentity.cc >index 80a3e788878441ed1dca6897863c3db9b121362b..62ac9dd0207d856a01b50e87a62ac1a309c9cd85 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/fakesslidentity.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/fakesslidentity.cc >@@ -29,8 +29,8 @@ FakeSSLCertificate::FakeSSLCertificate(const FakeSSLCertificate&) = default; > > FakeSSLCertificate::~FakeSSLCertificate() = default; > >-FakeSSLCertificate* FakeSSLCertificate::GetReference() const { >- return new FakeSSLCertificate(*this); >+std::unique_ptr<SSLCertificate> FakeSSLCertificate::Clone() const { >+ return absl::make_unique<FakeSSLCertificate>(*this); > } > > std::string FakeSSLCertificate::ToPEMString() const { >@@ -83,10 +83,10 @@ FakeSSLIdentity::FakeSSLIdentity(const std::vector<std::string>& pem_strings) { > } > > FakeSSLIdentity::FakeSSLIdentity(const FakeSSLCertificate& cert) 
>- : cert_chain_(absl::make_unique<SSLCertChain>(&cert)) {} >+ : cert_chain_(absl::make_unique<SSLCertChain>(cert.Clone())) {} > > FakeSSLIdentity::FakeSSLIdentity(const FakeSSLIdentity& o) >- : cert_chain_(o.cert_chain_->UniqueCopy()) {} >+ : cert_chain_(o.cert_chain_->Clone()) {} > > FakeSSLIdentity::~FakeSSLIdentity() = default; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/fakesslidentity.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/fakesslidentity.h >index 4494a524efab907916463fffb0c7178636974d1d..b19cbfbb492ca16e66ec1075b69fa5f95584ea56 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/fakesslidentity.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/fakesslidentity.h >@@ -14,6 +14,7 @@ > #include <memory> > #include <vector> > >+#include "rtc_base/sslcertificate.h" > #include "rtc_base/sslidentity.h" > > namespace rtc { >@@ -28,7 +29,7 @@ class FakeSSLCertificate : public SSLCertificate { > ~FakeSSLCertificate() override; > > // SSLCertificate implementation. 
>- FakeSSLCertificate* GetReference() const override; >+ std::unique_ptr<SSLCertificate> Clone() const override; > std::string ToPEMString() const override; > void ToDER(Buffer* der_buffer) const override; > int64_t CertificateExpirationTime() const override; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/file.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/file.cc >index 62024119fcc17a49c5a7d331da42ef7ed22f6505..a793500646af623d603896b234a10d3162a6b611 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/file.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/file.cc >@@ -10,8 +10,6 @@ > > #include "rtc_base/file.h" > >-#include <utility> >- > namespace rtc { > > File::File(PlatformFile file) : file_(file) {} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/file.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/file.h >index 75fd93d13101e25fcbdc7221dcf9b9a568d3af34..bc0974ae2a0bbf5f10840ed55aa1c9a29dd96de5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/file.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/file.h >@@ -11,8 +11,8 @@ > #ifndef RTC_BASE_FILE_H_ > #define RTC_BASE_FILE_H_ > >+#include <stddef.h> > #include <stdint.h> >- > #include <string> > > #include "rtc_base/constructormagic.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/file_posix.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/file_posix.cc >index b0fec9f9fcea0014cc76dd30f07705f8a42a179f..492019280c535af61327c652e4914e73fa971dea 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/file_posix.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/file_posix.cc >@@ -8,17 +8,16 @@ > * be found in the AUTHORS file in the root of the source tree. 
> */ > >-#include "rtc_base/file.h" >- > #include <errno.h> > #include <fcntl.h> >-#include <sys/stat.h> >-#include <sys/types.h> >+#include <stddef.h> >+#include <stdint.h> > #include <unistd.h> >- > #include <limits> > > #include "rtc_base/checks.h" >+#include "rtc_base/file.h" >+#include "rtc_base/platform_file.h" > > namespace rtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/filerotatingstream.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/filerotatingstream.cc >index c9a663aafff869f3706f1fddde38502b4d840559..b1dc5ff99817851c1c9521b4e087b02abe509369 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/filerotatingstream.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/filerotatingstream.cc >@@ -13,17 +13,159 @@ > #include <algorithm> > #include <cstdio> > #include <string> >- >+#include <utility> >+ >+#if defined(WEBRTC_WIN) >+#include <windows.h> >+#include "rtc_base/stringutils.h" >+#else >+#include <dirent.h> >+#include <sys/stat.h> >+#include <unistd.h> >+#endif // WEBRTC_WIN >+ >+#include "absl/strings/match.h" > #include "rtc_base/checks.h" >-#include "rtc_base/fileutils.h" >-#include "rtc_base/pathutils.h" >-#include "rtc_base/strings/string_builder.h" >+#include "rtc_base/logging.h" > > // Note: We use fprintf for logging in the write paths of this stream to avoid > // infinite loops when logging. > > namespace rtc { > >+namespace { >+ >+std::string AddTrailingPathDelimiterIfNeeded(std::string directory); >+ >+// |dir| must have a trailing delimiter. |prefix| must not include wild card >+// characters. 
>+std::vector<std::string> GetFilesWithPrefix(const std::string& directory, >+ const std::string& prefix); >+bool DeleteFile(const std::string& file); >+bool MoveFile(const std::string& old_file, const std::string& new_file); >+bool IsFile(const std::string& file); >+bool IsFolder(const std::string& file); >+absl::optional<size_t> GetFileSize(const std::string& file); >+ >+#if defined(WEBRTC_WIN) >+ >+std::string AddTrailingPathDelimiterIfNeeded(std::string directory) { >+ if (absl::EndsWith(directory, "\\")) { >+ return directory; >+ } >+ return directory + "\\"; >+} >+ >+std::vector<std::string> GetFilesWithPrefix(const std::string& directory, >+ const std::string& prefix) { >+ RTC_DCHECK(absl::EndsWith(directory, "\\")); >+ WIN32_FIND_DATA data; >+ HANDLE handle; >+ handle = ::FindFirstFile(ToUtf16(directory + prefix + '*').c_str(), &data); >+ if (handle == INVALID_HANDLE_VALUE) >+ return {}; >+ >+ std::vector<std::string> file_list; >+ do { >+ file_list.emplace_back(directory + ToUtf8(data.cFileName)); >+ } while (::FindNextFile(handle, &data) == TRUE); >+ >+ ::FindClose(handle); >+ return file_list; >+} >+ >+bool DeleteFile(const std::string& file) { >+ return ::DeleteFile(ToUtf16(file).c_str()) != 0; >+} >+ >+bool MoveFile(const std::string& old_file, const std::string& new_file) { >+ return ::MoveFile(ToUtf16(old_file).c_str(), ToUtf16(new_file).c_str()) != 0; >+} >+ >+bool IsFile(const std::string& file) { >+ WIN32_FILE_ATTRIBUTE_DATA data = {0}; >+ if (0 == ::GetFileAttributesEx(ToUtf16(file).c_str(), GetFileExInfoStandard, >+ &data)) >+ return false; >+ return (data.dwFileAttributes & FILE_ATTRIBUTE_DIRECTORY) == 0; >+} >+ >+bool IsFolder(const std::string& file) { >+ WIN32_FILE_ATTRIBUTE_DATA data = {0}; >+ if (0 == ::GetFileAttributesEx(ToUtf16(file).c_str(), GetFileExInfoStandard, >+ &data)) >+ return false; >+ return (data.dwFileAttributes & FILE_ATTRIBUTE_DIRECTORY) == >+ FILE_ATTRIBUTE_DIRECTORY; >+} >+ >+absl::optional<size_t> GetFileSize(const 
std::string& file) { >+ WIN32_FILE_ATTRIBUTE_DATA data = {0}; >+ if (::GetFileAttributesEx(ToUtf16(file).c_str(), GetFileExInfoStandard, >+ &data) == 0) >+ return absl::nullopt; >+ return data.nFileSizeLow; >+} >+ >+#else // defined(WEBRTC_WIN) >+ >+std::string AddTrailingPathDelimiterIfNeeded(std::string directory) { >+ if (absl::EndsWith(directory, "/")) { >+ return directory; >+ } >+ return directory + "/"; >+} >+ >+std::vector<std::string> GetFilesWithPrefix(const std::string& directory, >+ const std::string& prefix) { >+ RTC_DCHECK(absl::EndsWith(directory, "/")); >+ DIR* dir = ::opendir(directory.c_str()); >+ if (dir == nullptr) >+ return {}; >+ std::vector<std::string> file_list; >+ for (struct dirent* dirent = ::readdir(dir); dirent; >+ dirent = ::readdir(dir)) { >+ std::string name = dirent->d_name; >+ if (name.compare(0, prefix.size(), prefix) == 0) { >+ file_list.emplace_back(directory + name); >+ } >+ } >+ ::closedir(dir); >+ return file_list; >+} >+ >+bool DeleteFile(const std::string& file) { >+ return ::unlink(file.c_str()) == 0; >+} >+ >+bool MoveFile(const std::string& old_file, const std::string& new_file) { >+ return ::rename(old_file.c_str(), new_file.c_str()) == 0; >+} >+ >+bool IsFile(const std::string& file) { >+ struct stat st; >+ int res = ::stat(file.c_str(), &st); >+ // Treat symlinks, named pipes, etc. all as files. 
>+ return res == 0 && !S_ISDIR(st.st_mode); >+} >+ >+bool IsFolder(const std::string& file) { >+ struct stat st; >+ int res = ::stat(file.c_str(), &st); >+ return res == 0 && S_ISDIR(st.st_mode); >+} >+ >+absl::optional<size_t> GetFileSize(const std::string& file) { >+ struct stat st; >+ if (::stat(file.c_str(), &st) != 0) >+ return absl::nullopt; >+ return st.st_size; >+} >+ >+#endif >+ >+} // namespace >+ > FileRotatingStream::FileRotatingStream(const std::string& dir_path, > const std::string& file_prefix) > : FileRotatingStream(dir_path, file_prefix, 0, 0, kRead) {} >@@ -46,7 +188,7 @@ FileRotatingStream::FileRotatingStream(const std::string& dir_path, > size_t max_file_size, > size_t num_files, > Mode mode) >- : dir_path_(dir_path), >+ : dir_path_(AddTrailingPathDelimiterIfNeeded(dir_path)), > file_prefix_(file_prefix), > mode_(mode), > file_stream_(nullptr), >@@ -55,7 +197,7 @@ FileRotatingStream::FileRotatingStream(const std::string& dir_path, > rotation_index_(0), > current_bytes_written_(0), > disable_buffering_(false) { >- RTC_DCHECK(Filesystem::IsFolder(dir_path)); >+ RTC_DCHECK(IsFolder(dir_path)); > switch (mode) { > case kWrite: { > file_names_.clear(); >@@ -66,7 +208,7 @@ FileRotatingStream::FileRotatingStream(const std::string& dir_path, > break; > } > case kRead: { >- file_names_ = GetFilesWithPrefix(); >+ file_names_ = GetFilesWithPrefix(dir_path_, file_prefix_); > std::sort(file_names_.begin(), file_names_.end()); > if (file_names_.size() > 0) { > // |file_names_| is sorted newest first, so read from the end. 
>@@ -187,11 +329,7 @@ bool FileRotatingStream::GetSize(size_t* size) const { > *size = 0; > size_t total_size = 0; > for (auto file_name : file_names_) { >- Pathname pathname(file_name); >- size_t file_size = 0; >- if (Filesystem::GetFileSize(file_name, &file_size)) { >- total_size += file_size; >- } >+ total_size += GetFileSize(file_name).value_or(0); > } > *size = total_size; > return true; >@@ -209,9 +347,10 @@ bool FileRotatingStream::Open() { > return true; > case kWrite: { > // Delete existing files when opening for write. >- std::vector<std::string> matching_files = GetFilesWithPrefix(); >- for (auto matching_file : matching_files) { >- if (!Filesystem::DeleteFile(matching_file)) { >+ std::vector<std::string> matching_files = >+ GetFilesWithPrefix(dir_path_, file_prefix_); >+ for (const auto& matching_file : matching_files) { >+ if (!DeleteFile(matching_file)) { > std::fprintf(stderr, "Failed to delete: %s\n", matching_file.c_str()); > } > } >@@ -282,16 +421,16 @@ void FileRotatingStream::RotateFiles() { > // See header file comments for example. 
> RTC_DCHECK_LT(rotation_index_, file_names_.size()); > std::string file_to_delete = file_names_[rotation_index_]; >- if (Filesystem::IsFile(file_to_delete)) { >- if (!Filesystem::DeleteFile(file_to_delete)) { >+ if (IsFile(file_to_delete)) { >+ if (!DeleteFile(file_to_delete)) { > std::fprintf(stderr, "Failed to delete: %s\n", file_to_delete.c_str()); > } > } > for (auto i = rotation_index_; i > 0; --i) { > std::string rotated_name = file_names_[i]; > std::string unrotated_name = file_names_[i - 1]; >- if (Filesystem::IsFile(unrotated_name)) { >- if (!Filesystem::MoveFile(unrotated_name, rotated_name)) { >+ if (IsFile(unrotated_name)) { >+ if (!MoveFile(unrotated_name, rotated_name)) { > std::fprintf(stderr, "Failed to move: %s to %s\n", > unrotated_name.c_str(), rotated_name.c_str()); > } >@@ -302,26 +441,6 @@ void FileRotatingStream::RotateFiles() { > OnRotation(); > } > >-std::vector<std::string> FileRotatingStream::GetFilesWithPrefix() const { >- std::vector<std::string> files; >- // Iterate over the files in the directory. 
>- DirectoryIterator it; >- Pathname dir_path; >- dir_path.SetFolder(dir_path_); >- if (!it.Iterate(dir_path)) { >- return files; >- } >- do { >- std::string current_name = it.Name(); >- if (current_name.size() && !it.IsDirectory() && >- current_name.compare(0, file_prefix_.size(), file_prefix_) == 0) { >- Pathname path(dir_path_, current_name); >- files.push_back(path.pathname()); >- } >- } while (it.Next()); >- return files; >-} >- > std::string FileRotatingStream::GetFilePath(size_t index, > size_t num_files) const { > RTC_DCHECK_LT(index, num_files); >@@ -333,8 +452,7 @@ std::string FileRotatingStream::GetFilePath(size_t index, > RTC_DCHECK_LT(1 + max_digits, buffer_size); > std::snprintf(file_postfix, buffer_size, "_%0*zu", max_digits, index); > >- Pathname file_path(dir_path_, file_prefix_ + file_postfix); >- return file_path.pathname(); >+ return dir_path_ + file_prefix_ + file_postfix; > } > > CallSessionFileRotatingStream::CallSessionFileRotatingStream( >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/filerotatingstream.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/filerotatingstream.h >index 4dab3455db0193b21332ad5e13a0285042631e08..c75ee153b6e4a60e4c34f8754e0f16a213a875a9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/filerotatingstream.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/filerotatingstream.h >@@ -11,6 +11,7 @@ > #ifndef RTC_BASE_FILEROTATINGSTREAM_H_ > #define RTC_BASE_FILEROTATINGSTREAM_H_ > >+#include <stddef.h> > #include <memory> > #include <string> > #include <vector> >@@ -101,8 +102,6 @@ class FileRotatingStream : public StreamInterface { > // create new file_0 > void RotateFiles(); > >- // Returns a list of file names in the directory beginning with the prefix. >- std::vector<std::string> GetFilesWithPrefix() const; > // Private version of GetFilePath. 
> std::string GetFilePath(size_t index, size_t num_files) const; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/filerotatingstream_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/filerotatingstream_unittest.cc >index 19055162fd01945d5bcadf2c1f7d9ba09f536c89..172be57eacc69d48c9c25f0e58784186b9c71cd6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/filerotatingstream_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/filerotatingstream_unittest.cc >@@ -13,9 +13,7 @@ > #include "rtc_base/arraysize.h" > #include "rtc_base/checks.h" > #include "rtc_base/filerotatingstream.h" >-#include "rtc_base/fileutils.h" > #include "rtc_base/gunit.h" >-#include "rtc_base/pathutils.h" > #include "test/testsupport/fileutils.h" > > namespace rtc { >@@ -46,12 +44,15 @@ class MAYBE_FileRotatingStreamTest : public ::testing::Test { > void Init(const std::string& dir_name, > const std::string& file_prefix, > size_t max_file_size, >- size_t num_log_files) { >+ size_t num_log_files, >+ bool ensure_trailing_delimiter = true) { > dir_path_ = webrtc::test::OutputPath(); > > // Append per-test output path in order to run within gtest parallel. > dir_path_.append(dir_name); >- dir_path_.push_back(Pathname::DefaultFolderDelimiter()); >+ if (ensure_trailing_delimiter) { >+ dir_path_.append(webrtc::test::kPathDelimiter); >+ } > ASSERT_TRUE(webrtc::test::CreateDir(dir_path_)); > stream_.reset(new FileRotatingStream(dir_path_, file_prefix, max_file_size, > num_log_files)); >@@ -159,12 +160,12 @@ TEST_F(MAYBE_FileRotatingStreamTest, WriteAndRead) { > } > // Check that exactly three files exist. 
> for (size_t i = 0; i < arraysize(messages); ++i) { >- EXPECT_TRUE(Filesystem::IsFile(stream_->GetFilePath(i))); >+ EXPECT_TRUE(webrtc::test::FileExists(stream_->GetFilePath(i))); > } > std::string message("d"); > WriteAndFlush(message.c_str(), message.size()); > for (size_t i = 0; i < arraysize(messages); ++i) { >- EXPECT_TRUE(Filesystem::IsFile(stream_->GetFilePath(i))); >+ EXPECT_TRUE(webrtc::test::FileExists(stream_->GetFilePath(i))); > } > // TODO(tkchin): Maybe check all the files in the dir. > >@@ -174,6 +175,53 @@ TEST_F(MAYBE_FileRotatingStreamTest, WriteAndRead) { > dir_path_, kFilePrefix); > } > >+// Tests that a write operation (with dir name without delimiter) followed by a >+// read returns the expected data and writes to the expected files. >+TEST_F(MAYBE_FileRotatingStreamTest, WriteWithoutDelimiterAndRead) { >+ Init("FileRotatingStreamTestWriteWithoutDelimiterAndRead", kFilePrefix, >+ kMaxFileSize, 3, >+ /* ensure_trailing_delimiter*/ false); >+ >+ ASSERT_TRUE(stream_->Open()); >+ // The test is set up to create three log files of length 2. Write and check >+ // contents. >+ std::string messages[3] = {"aa", "bb", "cc"}; >+ for (size_t i = 0; i < arraysize(messages); ++i) { >+ const std::string& message = messages[i]; >+ WriteAndFlush(message.c_str(), message.size()); >+ } >+ std::string message("d"); >+ WriteAndFlush(message.c_str(), message.size()); >+ >+ // Reopen for read. >+ std::string expected_contents("bbccd"); >+ VerifyStreamRead(expected_contents.c_str(), expected_contents.size(), >+ dir_path_ + webrtc::test::kPathDelimiter, kFilePrefix); >+} >+ >+// Tests that a write operation followed by a read (without trailing delimiter) >+// returns the expected data and writes to the expected files. 
>+TEST_F(MAYBE_FileRotatingStreamTest, WriteAndReadWithoutDelimiter) { >+ Init("FileRotatingStreamTestWriteAndReadWithoutDelimiter", kFilePrefix, >+ kMaxFileSize, 3); >+ >+ ASSERT_TRUE(stream_->Open()); >+ // The test is set up to create three log files of length 2. Write and check >+ // contents. >+ std::string messages[3] = {"aa", "bb", "cc"}; >+ for (size_t i = 0; i < arraysize(messages); ++i) { >+ const std::string& message = messages[i]; >+ WriteAndFlush(message.c_str(), message.size()); >+ } >+ std::string message("d"); >+ WriteAndFlush(message.c_str(), message.size()); >+ >+ // Reopen for read. >+ std::string expected_contents("bbccd"); >+ VerifyStreamRead(expected_contents.c_str(), expected_contents.size(), >+ dir_path_.substr(0, dir_path_.size() - 1), kFilePrefix); >+} >+ > // Tests that writing data greater than the total capacity of the files > // overwrites the files correctly and is read correctly after. > TEST_F(MAYBE_FileRotatingStreamTest, WriteOverflowAndRead) { >@@ -218,7 +266,7 @@ class MAYBE_CallSessionFileRotatingStreamTest : public ::testing::Test { > > // Append per-test output path in order to run within gtest parallel. > dir_path_.append(dir_name); >- dir_path_.push_back(Pathname::DefaultFolderDelimiter()); >+ dir_path_.append(webrtc::test::kPathDelimiter); > ASSERT_TRUE(webrtc::test::CreateDir(dir_path_)); > stream_.reset( > new CallSessionFileRotatingStream(dir_path_, max_total_log_size)); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/fileutils.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/fileutils.cc >deleted file mode 100644 >index 7d83f97fae4c1ca06914745509bc047178ae93fa..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/fileutils.cc >+++ /dev/null >@@ -1,130 +0,0 @@ >-/* >- * Copyright 2004 The WebRTC Project Authors. All rights reserved. 
>- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include "rtc_base/fileutils.h" >- >-#include "rtc_base/arraysize.h" >-#include "rtc_base/checks.h" >-#include "rtc_base/pathutils.h" >-#include "rtc_base/stringutils.h" >- >-#if defined(WEBRTC_WIN) >-#include "rtc_base/win32filesystem.h" >-#else >-#include "rtc_base/unixfilesystem.h" >-#endif >- >-#if !defined(WEBRTC_WIN) >-#define MAX_PATH 260 >-#endif >- >-namespace rtc { >- >-////////////////////////// >-// Directory Iterator // >-////////////////////////// >- >-// A DirectoryIterator is created with a given directory. It originally points >-// to the first file in the directory, and can be advanecd with Next(). This >-// allows you to get information about each file. >- >-// Constructor >-DirectoryIterator::DirectoryIterator() >-#ifdef WEBRTC_WIN >- : handle_(INVALID_HANDLE_VALUE) { >-#else >- : dir_(nullptr), >- dirent_(nullptr){ >-#endif >-} >- >-// Destructor >-DirectoryIterator::~DirectoryIterator() { >-#if defined(WEBRTC_WIN) >- if (handle_ != INVALID_HANDLE_VALUE) >- ::FindClose(handle_); >-#else >- if (dir_) >- closedir(dir_); >-#endif >-} >- >-// Starts traversing a directory. 
>-// dir is the directory to traverse >-// returns true if the directory exists and is valid >-bool DirectoryIterator::Iterate(const Pathname& dir) { >- directory_ = dir.pathname(); >-#if defined(WEBRTC_WIN) >- if (handle_ != INVALID_HANDLE_VALUE) >- ::FindClose(handle_); >- std::string d = dir.pathname() + '*'; >- handle_ = ::FindFirstFile(ToUtf16(d).c_str(), &data_); >- if (handle_ == INVALID_HANDLE_VALUE) >- return false; >-#else >- if (dir_ != nullptr) >- closedir(dir_); >- dir_ = ::opendir(directory_.c_str()); >- if (dir_ == nullptr) >- return false; >- dirent_ = readdir(dir_); >- if (dirent_ == nullptr) >- return false; >- >- if (::stat(std::string(directory_ + Name()).c_str(), &stat_) != 0) >- return false; >-#endif >- return true; >-} >- >-// Advances to the next file >-// returns true if there were more files in the directory. >-bool DirectoryIterator::Next() { >-#if defined(WEBRTC_WIN) >- return ::FindNextFile(handle_, &data_) == TRUE; >-#else >- dirent_ = ::readdir(dir_); >- if (dirent_ == nullptr) >- return false; >- >- return ::stat(std::string(directory_ + Name()).c_str(), &stat_) == 0; >-#endif >-} >- >-// returns true if the file currently pointed to is a directory >-bool DirectoryIterator::IsDirectory() const { >-#if defined(WEBRTC_WIN) >- return (data_.dwFileAttributes & FILE_ATTRIBUTE_DIRECTORY) != FALSE; >-#else >- return S_ISDIR(stat_.st_mode); >-#endif >-} >- >-// returns the name of the file currently pointed to >-std::string DirectoryIterator::Name() const { >-#if defined(WEBRTC_WIN) >- return ToUtf8(data_.cFileName); >-#else >- RTC_DCHECK(dirent_); >- return dirent_->d_name; >-#endif >-} >- >-FilesystemInterface* Filesystem::GetFilesystem() { >-#if defined(WEBRTC_WIN) >- static FilesystemInterface* const filesystem = new Win32Filesystem(); >-#else >- static FilesystemInterface* const filesystem = new UnixFilesystem(); >-#endif >- >- return filesystem; >-} >- >-} // namespace rtc >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/fileutils.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/fileutils.h >deleted file mode 100644 >index 132fd884401cff5ffb023efe85e8f08956751ff7..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/fileutils.h >+++ /dev/null >@@ -1,133 +0,0 @@ >-/* >- * Copyright 2004 The WebRTC Project Authors. All rights reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef RTC_BASE_FILEUTILS_H_ >-#define RTC_BASE_FILEUTILS_H_ >- >-#include <string> >- >-#if defined(WEBRTC_WIN) >-#include <windows.h> >-#else >-#include <dirent.h> >-#include <stdio.h> >-#include <sys/stat.h> >-#include <sys/types.h> >-#include <unistd.h> >-#endif // WEBRTC_WIN >- >-#include "rtc_base/checks.h" >-#include "rtc_base/constructormagic.h" >-#include "rtc_base/platform_file.h" >- >-namespace rtc { >- >-class FileStream; >-class Pathname; >- >-////////////////////////// >-// Directory Iterator // >-////////////////////////// >- >-// A DirectoryIterator is created with a given directory. It originally points >-// to the first file in the directory, and can be advanecd with Next(). This >-// allows you to get information about each file. 
>- >-class DirectoryIterator { >- friend class Filesystem; >- >- public: >- // Constructor >- DirectoryIterator(); >- // Destructor >- virtual ~DirectoryIterator(); >- >- // Starts traversing a directory >- // dir is the directory to traverse >- // returns true if the directory exists and is valid >- // The iterator will point to the first entry in the directory >- virtual bool Iterate(const Pathname& path); >- >- // Advances to the next file >- // returns true if there were more files in the directory. >- virtual bool Next(); >- >- // returns true if the file currently pointed to is a directory >- virtual bool IsDirectory() const; >- >- // returns the name of the file currently pointed to >- virtual std::string Name() const; >- >- private: >- std::string directory_; >-#if defined(WEBRTC_WIN) >- WIN32_FIND_DATA data_; >- HANDLE handle_; >-#else >- DIR* dir_; >- struct dirent* dirent_; >- struct stat stat_; >-#endif >-}; >- >-class FilesystemInterface { >- public: >- virtual ~FilesystemInterface() {} >- >- // This will attempt to delete the path located at filename. >- // It DCHECKs and returns false if the path points to a folder or a >- // non-existent file. >- virtual bool DeleteFile(const Pathname& filename) = 0; >- >- // This moves a file from old_path to new_path, where "old_path" is a >- // plain file. This DCHECKs and returns false if old_path points to a >- // directory, and returns true if the function succeeds. >- virtual bool MoveFile(const Pathname& old_path, const Pathname& new_path) = 0; >- >- // Returns true if pathname refers to a directory >- virtual bool IsFolder(const Pathname& pathname) = 0; >- >- // Returns true if pathname refers to a file >- virtual bool IsFile(const Pathname& pathname) = 0; >- >- // Determines the size of the file indicated by path. 
>- virtual bool GetFileSize(const Pathname& path, size_t* size) = 0; >-}; >- >-class Filesystem { >- public: >- static bool DeleteFile(const Pathname& filename) { >- return GetFilesystem()->DeleteFile(filename); >- } >- >- static bool MoveFile(const Pathname& old_path, const Pathname& new_path) { >- return GetFilesystem()->MoveFile(old_path, new_path); >- } >- >- static bool IsFolder(const Pathname& pathname) { >- return GetFilesystem()->IsFolder(pathname); >- } >- >- static bool IsFile(const Pathname& pathname) { >- return GetFilesystem()->IsFile(pathname); >- } >- >- static bool GetFileSize(const Pathname& path, size_t* size) { >- return GetFilesystem()->GetFileSize(path, size); >- } >- >- private: >- static FilesystemInterface* GetFilesystem(); >- RTC_DISALLOW_IMPLICIT_CONSTRUCTORS(Filesystem); >-}; >- >-} // namespace rtc >- >-#endif // RTC_BASE_FILEUTILS_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/flags.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/flags.cc >index 4f294594bd4afcf29a3f89ac270e28abcbf75531..6b43b91e026acdc584215dd4eb42ddba5e5c04aa 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/flags.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/flags.cc >@@ -15,7 +15,6 @@ > #include <string.h> > > #include "rtc_base/checks.h" >-#include "rtc_base/stringutils.h" > > #if defined(WEBRTC_WIN) > // clang-format off >@@ -23,6 +22,8 @@ > #include <windows.h> > #include <shellapi.h> // must come after windows.h > // clang-format on >+ >+#include "rtc_base/stringutils.h" // For ToUtf8 > #endif > > namespace { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/flags.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/flags.h >index 33f6e5bda1dbda54c6df8e5f268fd860fd4768ad..a08bfd29b85d5ff2b0d1daf511165ecb4f2103dd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/flags.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/flags.h >@@ -23,7 +23,10 @@ > #define 
RTC_BASE_FLAGS_H_ > > #include "rtc_base/checks.h" >+ >+#if defined(WEBRTC_WIN) > #include "rtc_base/constructormagic.h" >+#endif > > namespace rtc { > >@@ -33,7 +36,7 @@ union FlagValue { > // bool values ('bool b = "false";' results in b == true!), we pass > // and int argument to New_BOOL as this appears to be safer - sigh. > // In particular, it prevents the (not uncommon!) bug where a bool >- // flag is defined via: DEFINE_bool(flag, "false", "some comment");. >+ // flag is defined via: WEBRTC_DEFINE_bool(flag, "false", "some comment");. > static FlagValue New_BOOL(int b) { > FlagValue v; > v.b = (b != 0); >@@ -152,7 +155,7 @@ class Flag { > }; > > // Internal use only. >-#define DEFINE_FLAG(type, c_type, name, default, comment) \ >+#define WEBRTC_DEFINE_FLAG(type, c_type, name, default, comment) \ > /* define and initialize the flag */ \ > c_type FLAG_##name = (default); \ > /* register the flag */ \ >@@ -161,25 +164,25 @@ class Flag { > rtc::FlagValue::New_##type(default)) > > // Internal use only. 
>-#define DECLARE_FLAG(c_type, name) \ >- /* declare the external flag */ \ >+#define WEBRTC_DECLARE_FLAG(c_type, name) \ >+ /* declare the external flag */ \ > extern c_type FLAG_##name > > // Use the following macros to define a new flag: >-#define DEFINE_bool(name, default, comment) \ >- DEFINE_FLAG(BOOL, bool, name, default, comment) >-#define DEFINE_int(name, default, comment) \ >- DEFINE_FLAG(INT, int, name, default, comment) >-#define DEFINE_float(name, default, comment) \ >- DEFINE_FLAG(FLOAT, double, name, default, comment) >-#define DEFINE_string(name, default, comment) \ >- DEFINE_FLAG(STRING, const char*, name, default, comment) >+#define WEBRTC_DEFINE_bool(name, default, comment) \ >+ WEBRTC_DEFINE_FLAG(BOOL, bool, name, default, comment) >+#define WEBRTC_DEFINE_int(name, default, comment) \ >+ WEBRTC_DEFINE_FLAG(INT, int, name, default, comment) >+#define WEBRTC_DEFINE_float(name, default, comment) \ >+ WEBRTC_DEFINE_FLAG(FLOAT, double, name, default, comment) >+#define WEBRTC_DEFINE_string(name, default, comment) \ >+ WEBRTC_DEFINE_FLAG(STRING, const char*, name, default, comment) > > // Use the following macros to declare a flag defined elsewhere: >-#define DECLARE_bool(name) DECLARE_FLAG(bool, name) >-#define DECLARE_int(name) DECLARE_FLAG(int, name) >-#define DECLARE_float(name) DECLARE_FLAG(double, name) >-#define DECLARE_string(name) DECLARE_FLAG(const char*, name) >+#define WEBRTC_DECLARE_bool(name) WEBRTC_DECLARE_FLAG(bool, name) >+#define WEBRTC_DECLARE_int(name) WEBRTC_DECLARE_FLAG(int, name) >+#define WEBRTC_DECLARE_float(name) WEBRTC_DECLARE_FLAG(double, name) >+#define WEBRTC_DECLARE_string(name) WEBRTC_DECLARE_FLAG(const char*, name) > > // The global list of all flags. 
> class FlagList { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/helpers.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/helpers.h >index d3b09cfd48fb5bd3e5afd93bd6f854a11ace0ef7..a93b321eab92673de903b9e87ec2656366e31e45 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/helpers.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/helpers.h >@@ -11,6 +11,8 @@ > #ifndef RTC_BASE_HELPERS_H_ > #define RTC_BASE_HELPERS_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <string> > > namespace rtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/httpcommon.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/httpcommon.cc >index 716a41e16e93e72c6e90c162be625ee221458b35..7926f884f9d881b375f8c0d77c0465f08ea64db3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/httpcommon.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/httpcommon.cc >@@ -18,17 +18,23 @@ > #include <security.h> > #endif > >+#include <ctype.h> // for isspace >+#include <stdio.h> // for sprintf > #include <algorithm> >+#include <utility> // for pair >+#include <vector> > >+#include "absl/strings/match.h" > #include "rtc_base/arraysize.h" > #include "rtc_base/checks.h" >-#include "rtc_base/cryptstring.h" >+#include "rtc_base/cryptstring.h" // for CryptString > #include "rtc_base/httpcommon.h" >+#include "rtc_base/logging.h" > #include "rtc_base/messagedigest.h" > #include "rtc_base/socketaddress.h" > #include "rtc_base/strings/string_builder.h" >-#include "rtc_base/third_party/base64/base64.h" >-#include "rtc_base/zero_memory.h" >+#include "rtc_base/third_party/base64/base64.h" // for Base64 >+#include "rtc_base/zero_memory.h" // for ExplicitZeroMemory > > namespace rtc { > namespace { >@@ -260,7 +266,7 @@ HttpAuthResult HttpAuthenticate(const char* challenge, > return HAR_IGNORE; > > // BASIC >- if (_stricmp(auth_method.c_str(), "basic") == 0) { >+ if (absl::EqualsIgnoreCase(auth_method, "basic")) { > if 
(context) > return HAR_CREDENTIALS; // Bad credentials > if (username.empty()) >@@ -288,7 +294,7 @@ HttpAuthResult HttpAuthenticate(const char* challenge, > } > > // DIGEST >- if (_stricmp(auth_method.c_str(), "digest") == 0) { >+ if (absl::EqualsIgnoreCase(auth_method, "digest")) { > if (context) > return HAR_CREDENTIALS; // Bad credentials > if (username.empty()) >@@ -355,8 +361,8 @@ HttpAuthResult HttpAuthenticate(const char* challenge, > > #if defined(WEBRTC_WIN) > #if 1 >- bool want_negotiate = (_stricmp(auth_method.c_str(), "negotiate") == 0); >- bool want_ntlm = (_stricmp(auth_method.c_str(), "ntlm") == 0); >+ bool want_negotiate = absl::EqualsIgnoreCase(auth_method, "negotiate"); >+ bool want_ntlm = absl::EqualsIgnoreCase(auth_method, "ntlm"); > // SPNEGO & NTLM > if (want_negotiate || want_ntlm) { > const size_t MAX_MESSAGE = 12000, MAX_SPN = 256; >@@ -371,7 +377,7 @@ HttpAuthResult HttpAuthenticate(const char* challenge, > return HAR_IGNORE; > } > #else >- sprintfn(spn, MAX_SPN, "HTTP/%s", server.ToString().c_str()); >+ snprintf(spn, MAX_SPN, "HTTP/%s", server.ToString().c_str()); > #endif > > SecBuffer out_sec; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/httpcommon.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/httpcommon.h >index 581bd1ee8a86c612cb62f6ab234153ad4d2c808d..fbad280e0250a0b7ad535d938381727dd7acd1c4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/httpcommon.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/httpcommon.h >@@ -11,14 +11,7 @@ > #ifndef RTC_BASE_HTTPCOMMON_H_ > #define RTC_BASE_HTTPCOMMON_H_ > >-#include <map> >-#include <memory> > #include <string> >-#include <vector> >- >-#include "rtc_base/checks.h" >-#include "rtc_base/stream.h" >-#include "rtc_base/stringutils.h" > > namespace rtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/ipaddress.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/ipaddress.cc >index 
c52c9a4adddac30dc4ed009dffae0aec55edb6b6..027a7b219e02a9e6c5370bc505f593ad8b40a47c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/ipaddress.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/ipaddress.cc >@@ -11,24 +11,17 @@ > #if defined(WEBRTC_POSIX) > #include <netinet/in.h> > #include <sys/socket.h> >-#include <sys/types.h> > #ifdef OPENBSD > #include <netinet/in_systm.h> > #endif > #ifndef __native_client__ > #include <netinet/ip.h> > #endif >-#include <arpa/inet.h> > #include <netdb.h> >-#include <unistd.h> > #endif > >-#include <stdio.h> >- > #include "rtc_base/byteorder.h" >-#include "rtc_base/checks.h" > #include "rtc_base/ipaddress.h" >-#include "rtc_base/logging.h" > #include "rtc_base/nethelpers.h" > #include "rtc_base/stringutils.h" > >@@ -162,11 +155,10 @@ std::string IPAddress::ToSensitiveString() const { > std::string result; > result.resize(INET6_ADDRSTRLEN); > in6_addr addr = ipv6_address(); >- size_t len = >- rtc::sprintfn(&(result[0]), result.size(), "%x:%x:%x:x:x:x:x:x", >- (addr.s6_addr[0] << 8) + addr.s6_addr[1], >- (addr.s6_addr[2] << 8) + addr.s6_addr[3], >- (addr.s6_addr[4] << 8) + addr.s6_addr[5]); >+ size_t len = snprintf(&(result[0]), result.size(), "%x:%x:%x:x:x:x:x:x", >+ (addr.s6_addr[0] << 8) + addr.s6_addr[1], >+ (addr.s6_addr[2] << 8) + addr.s6_addr[3], >+ (addr.s6_addr[4] << 8) + addr.s6_addr[5]); > result.resize(len); > return result; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/key_derivation.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/key_derivation.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..288e407ebc5a974310d840ba69cf405752978c2c >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/key_derivation.cc >@@ -0,0 +1,31 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. 
>+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "rtc_base/key_derivation.h" >+ >+#include "absl/memory/memory.h" >+#include "rtc_base/openssl_key_derivation_hkdf.h" >+ >+namespace rtc { >+ >+KeyDerivation::KeyDerivation() = default; >+KeyDerivation::~KeyDerivation() = default; >+ >+// static >+std::unique_ptr<KeyDerivation> KeyDerivation::Create( >+ KeyDerivationAlgorithm key_derivation_algorithm) { >+ switch (key_derivation_algorithm) { >+ case KeyDerivationAlgorithm::HKDF_SHA256: >+ return absl::make_unique<OpenSSLKeyDerivationHKDF>(); >+ } >+ RTC_NOTREACHED(); >+} >+ >+} // namespace rtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/key_derivation.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/key_derivation.h >new file mode 100644 >index 0000000000000000000000000000000000000000..fa329aeb4ce4c9d98eabe10061678bc91b7b6f39 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/key_derivation.h >@@ -0,0 +1,70 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#ifndef RTC_BASE_KEY_DERIVATION_H_ >+#define RTC_BASE_KEY_DERIVATION_H_ >+ >+#include <memory> >+ >+#include "absl/types/optional.h" >+#include "api/array_view.h" >+#include "rtc_base/buffer.h" >+#include "rtc_base/constructormagic.h" >+ >+namespace rtc { >+ >+// Defines the set of key derivation algorithms that are supported. It is ideal >+// to keep this list as small as possible. >+enum class KeyDerivationAlgorithm { >+ // This algorithm is not suitable to generate a key from a password. Please >+ // only use with a cryptographically random master secret. >+ HKDF_SHA256 >+}; >+ >+// KeyDerivation provides a generic interface for deriving keys in WebRTC. This >+// class should be used over directly accessing openssl or boringssl primitives >+// so that we can maintain separate implementations. >+// Example: >+// auto kd = KeyDerivation::Create(KeyDerivationAlgorithm::HKDF_SHA256); >+// if (kd == nullptr) return; >+// auto derived_key_or = kd->DeriveKey(secret, salt, label); >+// if (!derived_key_or.ok()) return; >+// DoSomethingWithKey(derived_key_or.value()); >+class KeyDerivation { >+ public: >+ KeyDerivation(); >+ virtual ~KeyDerivation(); >+ >+ // Derives a new key from existing key material. >+ // secret - The random secret value you wish to derive a key from. >+ // salt - Optional but recommended (non secret) cryptographically random. >+ // label - A non secret but unique label value to determine the derivation. >+ // derived_key_byte_size - This must be at least 128 bits. >+ // return - An optional ZeroOnFreeBuffer containing the derived key or >+ // absl::nullopt. Nullopt indicates a failure in derivation.
>+ virtual absl::optional<ZeroOnFreeBuffer<uint8_t>> DeriveKey( >+ rtc::ArrayView<const uint8_t> secret, >+ rtc::ArrayView<const uint8_t> salt, >+ rtc::ArrayView<const uint8_t> label, >+ size_t derived_key_byte_size) = 0; >+ >+ // Static factory that will return an implementation that is capable of >+ // handling the key derivation with the requested algorithm. If no >+ // implementation is available nullptr will be returned. >+ static std::unique_ptr<KeyDerivation> Create( >+ KeyDerivationAlgorithm key_derivation_algorithm); >+ >+ private: >+ RTC_DISALLOW_COPY_AND_ASSIGN(KeyDerivation); >+}; >+ >+} // namespace rtc >+ >+#endif // RTC_BASE_KEY_DERIVATION_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/location.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/location.cc >index 9c90d9ec1797745b91c7396984efbca906473b5a..c95ad9c488a07d415b0e98e5ed5a7e25e190003b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/location.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/location.cc >@@ -10,8 +10,6 @@ > > #include "rtc_base/location.h" > >-#include "rtc_base/stringutils.h" >- > namespace rtc { > > Location::Location(const char* function_name, const char* file_and_line) >@@ -31,7 +29,7 @@ Location& Location::operator=(const Location& other) { > > std::string Location::ToString() const { > char buf[256]; >- sprintfn(buf, sizeof(buf), "%s@%s", function_name_, file_and_line_); >+ snprintf(buf, sizeof(buf), "%s@%s", function_name_, file_and_line_); > return buf; > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging.h >index 1bd0c7205e69267a1d1173817ff3419ee45305ac..c7d083e42d37a319126a2adf7e6037b7b0b6d8bc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging.h >@@ -51,16 +51,15 @@ > #include <string> > #include <utility> > >-#if defined(WEBRTC_MAC) && 
!defined(WEBRTC_IOS) >-#include <CoreServices/CoreServices.h> >-#endif >- > #include "absl/strings/string_view.h" > #include "rtc_base/constructormagic.h" > #include "rtc_base/deprecation.h" > #include "rtc_base/strings/string_builder.h" > #include "rtc_base/system/inline.h" >-#include "rtc_base/thread_annotations.h" >+ >+#if defined(WEBRTC_MAC) && !defined(WEBRTC_IOS) >+#include "rtc_base/logging_mac.h" >+#endif // WEBRTC_MAC > > #if !defined(NDEBUG) || defined(DLOG_ALWAYS_ON) > #define RTC_DLOG_IS_ON 1 >@@ -70,11 +69,6 @@ > > namespace rtc { > >-#if defined(WEBRTC_MAC) && !defined(WEBRTC_IOS) >-// Returns a UTF8 description from an OS X Status error. >-std::string DescriptionFromOSStatus(OSStatus err); >-#endif >- > ////////////////////////////////////////////////////////////////////// > > // Note that the non-standard LoggingSeverity aliases exist because they are >@@ -121,6 +115,8 @@ class LogSink { > virtual void OnLogMessage(const std::string& msg, > LoggingSeverity severity, > const char* tag); >+ virtual void OnLogMessage(const std::string& message, >+ LoggingSeverity severity); > virtual void OnLogMessage(const std::string& message) = 0; > }; > >@@ -523,10 +519,10 @@ class LogMessage { > ? 
static_cast<void>(0) \ > : rtc::webrtc_logging_impl::LogMessageVoidify()& > >-#define RTC_LOG_FILE_LINE(sev, file, line) \ >- rtc::webrtc_logging_impl::LogCall() & \ >- rtc::webrtc_logging_impl::LogStreamer<>() \ >- << rtc::webrtc_logging_impl::LogMetadata(__FILE__, __LINE__, sev) >+#define RTC_LOG_FILE_LINE(sev, file, line) \ >+ rtc::webrtc_logging_impl::LogCall() & \ >+ rtc::webrtc_logging_impl::LogStreamer<>() \ >+ << rtc::webrtc_logging_impl::LogMetadata(file, line, sev) > > #define RTC_LOG(sev) RTC_LOG_FILE_LINE(rtc::sev, __FILE__, __LINE__) > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging_mac.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging_mac.h >new file mode 100644 >index 0000000000000000000000000000000000000000..e65db560c1879f2b880377339419880cf6187c58 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging_mac.h >@@ -0,0 +1,28 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+#ifndef RTC_BASE_LOGGING_MAC_H_ >+#define RTC_BASE_LOGGING_MAC_H_ >+ >+#if !defined(WEBRTC_MAC) || defined(WEBRTC_IOS) >+#error "Only include this header in macOS builds" >+#endif >+ >+#include <CoreServices/CoreServices.h> >+ >+#include <string> >+ >+namespace rtc { >+ >+// Returns a UTF8 description from an OS X Status error. 
>+std::string DescriptionFromOSStatus(OSStatus err); >+ >+} // namespace rtc >+ >+#endif // RTC_BASE_LOGGING_MAC_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging_mac.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging_mac.mm >index 378cfbfcb9cd147701180b30b07b859297b5fb57..bd5f2b9d38fbf5e8d90075bfbc49a40036444b54 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging_mac.mm >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging_mac.mm >@@ -8,15 +8,15 @@ > * be found in the AUTHORS file in the root of the source tree. > */ > >-#include "rtc_base/logging.h" >+#include "rtc_base/logging_mac.h" > > #import <Foundation/Foundation.h> > >- > namespace rtc { > std::string DescriptionFromOSStatus(OSStatus err) { > NSError* error = > [NSError errorWithDomain:NSOSStatusErrorDomain code:err userInfo:nil]; > return error.description.UTF8String; > } >+ > } // namespace rtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging_unittest.cc >index 6b2263923fa8501b67b3bdeacc04677e04c6df36..4de1cf2fe4caab3e7edd44eb9bfd447a06dfb018 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging_unittest.cc >@@ -221,7 +221,7 @@ TEST(LogTest, SingleStream) { > > #if GTEST_HAS_DEATH_TEST && !defined(WEBRTC_ANDROID) > TEST(LogTest, Checks) { >- EXPECT_DEATH(RTC_FATAL() << "message", >+ EXPECT_DEATH(FATAL() << "message", > "\n\n#\n" > "# Fatal error in: \\S+, line \\w+\n" > "# last system error: \\w+\n" >@@ -292,7 +292,7 @@ class LogThread { > static void ThreadEntry(void* p) { static_cast<LogThread*>(p)->Run(); } > > PlatformThread thread_; >- Event event_{false, false}; >+ Event event_; > }; > > // Ensure we don't crash when adding/removing streams while threads are going. 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logsinks.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logsinks.cc >index 662b1f20de9785c75360840aa5ee8b09c7d9e717..c01bafbdd2280f45988b2927cc75bcb04da4f86d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logsinks.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logsinks.cc >@@ -10,10 +10,12 @@ > > #include "rtc_base/logsinks.h" > >+#include <string.h> > #include <cstdio> > #include <string> > > #include "rtc_base/checks.h" >+#include "rtc_base/stream.h" > > namespace rtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logsinks.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logsinks.h >index caf4a5f728a37945250d0f89257290b5cbc54797..d0867a22024bbd15e7eddf8455c3c97dd0f9614c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logsinks.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logsinks.h >@@ -11,6 +11,7 @@ > #ifndef RTC_BASE_LOGSINKS_H_ > #define RTC_BASE_LOGSINKS_H_ > >+#include <stddef.h> > #include <memory> > #include <string> > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/mdns_responder_interface.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/mdns_responder_interface.h >index 9dbaf56ad88f1059bf3f3f0048aa1c0a6156a87b..71938b2576d79dd8a6c717e6a205d4365ac217ab 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/mdns_responder_interface.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/mdns_responder_interface.h >@@ -12,36 +12,35 @@ > #define RTC_BASE_MDNS_RESPONDER_INTERFACE_H_ > > #include <functional> >-#include <map> >-#include <memory> >-#include <set> > #include <string> > > #include "rtc_base/ipaddress.h" >-#include "rtc_base/socketaddress.h" > > namespace webrtc { > > // Defines an mDNS responder that can be used in ICE candidate gathering, where >-// the local IP addresses of host candidates are obfuscated by mDNS hostnames. 
>-class MDnsResponderInterface { >+// the local IP addresses of host candidates are replaced by mDNS hostnames. >+class MdnsResponderInterface { > public: > using NameCreatedCallback = > std::function<void(const rtc::IPAddress&, const std::string&)>; > using NameRemovedCallback = std::function<void(bool)>; > >- MDnsResponderInterface() = default; >- virtual ~MDnsResponderInterface() = default; >+ MdnsResponderInterface() = default; >+ virtual ~MdnsResponderInterface() = default; > >- // Asynchronously creates a type-4 UUID hostname for an IP address. The >- // created name should be given to |callback| with the address that it >- // represents. >+ // Asynchronously creates and returns a new name via |callback| for |addr| if >+ // there is no name mapped to it by this responder, and initializes the >+ // reference count of this name to one. Otherwise the existing name mapped to >+ // |addr| is returned and its reference count is incremented by one. > virtual void CreateNameForAddress(const rtc::IPAddress& addr, > NameCreatedCallback callback) = 0; >- // Removes the name mapped to the given address if there is such an >- // name-address mapping previously created via CreateNameForAddress. The >- // result of whether an associated name-address mapping is removed should be >- // given to |callback|. >+ // Decrements the reference count of the mapped name of |addr|, if >+ // there is a map created previously via CreateNameForAddress; asynchronously >+ // removes the association between |addr| and its mapped name, and returns >+ // true via |callback| if the decremented reference count reaches zero. >+ // Otherwise no operation is done and false is returned via |callback| >+ // asynchronously. 
> virtual void RemoveNameForAddress(const rtc::IPAddress& addr, > NameRemovedCallback callback) = 0; > }; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/memory/aligned_malloc.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/memory/aligned_malloc.cc >index a1d54bdb785a8dfda5e3493d026b9ac4ed392119..c893c96ef2af0883c03fa120eff385a58e3acb48 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/memory/aligned_malloc.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/memory/aligned_malloc.cc >@@ -10,8 +10,8 @@ > > #include "rtc_base/memory/aligned_malloc.h" > >-#include <stdlib.h> >-#include <cstring> >+#include <stdlib.h> // for free, malloc >+#include <string.h> // for memcpy > > #ifdef _WIN32 > #include <windows.h> >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/memory_stream.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/memory_stream.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..541de070c7281cb8f7f499576e6c6dc1087196de >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/memory_stream.cc >@@ -0,0 +1,143 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include <algorithm> >+ >+#include "rtc_base/memory_stream.h" >+ >+namespace rtc { >+ >+StreamState MemoryStream::GetState() const { >+ return SS_OPEN; >+} >+ >+StreamResult MemoryStream::Read(void* buffer, >+ size_t bytes, >+ size_t* bytes_read, >+ int* error) { >+ if (seek_position_ >= data_length_) { >+ return SR_EOS; >+ } >+ size_t available = data_length_ - seek_position_; >+ if (bytes > available) { >+ // Read partial buffer >+ bytes = available; >+ } >+ memcpy(buffer, &buffer_[seek_position_], bytes); >+ seek_position_ += bytes; >+ if (bytes_read) { >+ *bytes_read = bytes; >+ } >+ return SR_SUCCESS; >+} >+ >+StreamResult MemoryStream::Write(const void* buffer, >+ size_t bytes, >+ size_t* bytes_written, >+ int* error) { >+ size_t available = buffer_length_ - seek_position_; >+ if (0 == available) { >+ // Increase buffer size to the larger of: >+ // a) new position rounded up to next 256 bytes >+ // b) double the previous length >+ size_t new_buffer_length = >+ std::max(((seek_position_ + bytes) | 0xFF) + 1, buffer_length_ * 2); >+ StreamResult result = DoReserve(new_buffer_length, error); >+ if (SR_SUCCESS != result) { >+ return result; >+ } >+ RTC_DCHECK(buffer_length_ >= new_buffer_length); >+ available = buffer_length_ - seek_position_; >+ } >+ >+ if (bytes > available) { >+ bytes = available; >+ } >+ memcpy(&buffer_[seek_position_], buffer, bytes); >+ seek_position_ += bytes; >+ if (data_length_ < seek_position_) { >+ data_length_ = seek_position_; >+ } >+ if (bytes_written) { >+ *bytes_written = bytes; >+ } >+ return SR_SUCCESS; >+} >+ >+void MemoryStream::Close() { >+ // nothing to do >+} >+ >+bool MemoryStream::SetPosition(size_t position) { >+ if (position > data_length_) >+ return false; >+ seek_position_ = position; >+ return true; >+} >+ >+bool MemoryStream::GetPosition(size_t* position) const { >+ if (position) >+ *position = seek_position_; >+ return true; >+} >+ >+bool MemoryStream::GetSize(size_t* size) const { >+ if (size) >+ *size 
= data_length_; >+ return true; >+} >+ >+bool MemoryStream::ReserveSize(size_t size) { >+ return (SR_SUCCESS == DoReserve(size, nullptr)); >+} >+ >+/////////////////////////////////////////////////////////////////////////////// >+ >+MemoryStream::MemoryStream() {} >+ >+MemoryStream::MemoryStream(const char* data) { >+ SetData(data, strlen(data)); >+} >+ >+MemoryStream::MemoryStream(const void* data, size_t length) { >+ SetData(data, length); >+} >+ >+MemoryStream::~MemoryStream() { >+ delete[] buffer_; >+} >+ >+void MemoryStream::SetData(const void* data, size_t length) { >+ data_length_ = buffer_length_ = length; >+ delete[] buffer_; >+ buffer_ = new char[buffer_length_]; >+ memcpy(buffer_, data, data_length_); >+ seek_position_ = 0; >+} >+ >+StreamResult MemoryStream::DoReserve(size_t size, int* error) { >+ if (buffer_length_ >= size) >+ return SR_SUCCESS; >+ >+ if (char* new_buffer = new char[size]) { >+ memcpy(new_buffer, buffer_, data_length_); >+ delete[] buffer_; >+ buffer_ = new_buffer; >+ buffer_length_ = size; >+ return SR_SUCCESS; >+ } >+ >+ if (error) { >+ *error = ENOMEM; >+ } >+ return SR_ERROR; >+} >+ >+} // namespace rtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/memory_stream.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/memory_stream.h >new file mode 100644 >index 0000000000000000000000000000000000000000..936f71b34c010b2263e478029329b1ecfd270d70 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/memory_stream.h >@@ -0,0 +1,59 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#ifndef RTC_BASE_MEMORY_STREAM_H_ >+#define RTC_BASE_MEMORY_STREAM_H_ >+ >+#include "rtc_base/stream.h" >+ >+namespace rtc { >+ >+// MemoryStream dynamically resizes to accomodate written data. >+ >+class MemoryStream final : public StreamInterface { >+ public: >+ MemoryStream(); >+ explicit MemoryStream(const char* data); // Calls SetData(data, strlen(data)) >+ MemoryStream(const void* data, size_t length); // Calls SetData(data, length) >+ ~MemoryStream() override; >+ >+ StreamState GetState() const override; >+ StreamResult Read(void* buffer, >+ size_t bytes, >+ size_t* bytes_read, >+ int* error) override; >+ StreamResult Write(const void* buffer, >+ size_t bytes, >+ size_t* bytes_written, >+ int* error) override; >+ void Close() override; >+ bool SetPosition(size_t position) override; >+ bool GetPosition(size_t* position) const override; >+ bool GetSize(size_t* size) const override; >+ bool ReserveSize(size_t size) override; >+ >+ char* GetBuffer() { return buffer_; } >+ const char* GetBuffer() const { return buffer_; } >+ >+ void SetData(const void* data, size_t length); >+ >+ private: >+ StreamResult DoReserve(size_t size, int* error); >+ >+ // Invariant: 0 <= seek_position <= data_length_ <= buffer_length_ >+ char* buffer_ = nullptr; >+ size_t buffer_length_ = 0; >+ size_t data_length_ = 0; >+ size_t seek_position_ = 0; >+}; >+ >+} // namespace rtc >+ >+#endif // RTC_BASE_MEMORY_STREAM_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/memory_usage.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/memory_usage.cc >index a70c54753fc4af7c857435c48d2a74eb1a244bf5..9cd36d372099aa497b0b43abc866ebbe23ca646e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/memory_usage.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/memory_usage.cc >@@ -61,6 +61,9 @@ int64_t GetProcessResidentSizeBytes() { > return -1; > } > return pmc.WorkingSetSize; >+#elif defined(WEBRTC_FUCHSIA) >+ RTC_LOG_ERR(LS_ERROR) << 
"GetProcessResidentSizeBytes() not implemented"; >+ return 0; > #else > // Not implemented yet. > static_assert(false, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/messagedigest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/messagedigest.cc >index 9c10bcd486a761bbed6dfa3a39b6dba80da44f89..5a0d16a9a04b672ec69cff77fc98287450677a74 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/messagedigest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/messagedigest.cc >@@ -10,9 +10,9 @@ > > #include "rtc_base/messagedigest.h" > >-#include <memory> >- > #include <string.h> >+#include <cstdint> >+#include <memory> > > #include "rtc_base/openssldigest.h" > #include "rtc_base/stringencode.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/messagedigest.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/messagedigest.h >index fc820885436f8ce71d45f6cb2153a7ef6173b398..757f914876b07e4130781e8e4bc17fa5faa97f97 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/messagedigest.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/messagedigest.h >@@ -11,6 +11,7 @@ > #ifndef RTC_BASE_MESSAGEDIGEST_H_ > #define RTC_BASE_MESSAGEDIGEST_H_ > >+#include <stddef.h> > #include <string> > > namespace rtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/messagehandler.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/messagehandler.h >index df2d1ada8ddcecc4493e5ba245a874270eb03af9..0c40853e214855c46478c5f7a90eff142264c70a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/messagehandler.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/messagehandler.h >@@ -11,7 +11,6 @@ > #ifndef RTC_BASE_MESSAGEHANDLER_H_ > #define RTC_BASE_MESSAGEHANDLER_H_ > >-#include <memory> > #include <utility> > > #include "rtc_base/constructormagic.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/messagequeue.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/messagequeue.cc >index a561af42eb8084a74a39bebc9a448877b6233d8f..204952a7e2abfb42338834e0ec934c4ebcfe5d0b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/messagequeue.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/messagequeue.cc >@@ -8,13 +8,15 @@ > * be found in the AUTHORS file in the root of the source tree. > */ > #include <algorithm> >+#include <string> >+#include <utility> > > #include "rtc_base/atomicops.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" > #include "rtc_base/messagequeue.h" >-#include "rtc_base/stringencode.h" > #include "rtc_base/thread.h" >+#include "rtc_base/timeutils.h" > #include "rtc_base/trace_event.h" > > namespace rtc { >@@ -348,8 +350,10 @@ void MessageQueue::Post(const Location& posted_from, > uint32_t id, > MessageData* pdata, > bool time_sensitive) { >- if (IsQuitting()) >+ if (IsQuitting()) { >+ delete pdata; > return; >+ } > > // Keep thread safe > // Add the message to the end of the queue >@@ -405,6 +409,7 @@ void MessageQueue::DoDelayPost(const Location& posted_from, > uint32_t id, > MessageData* pdata) { > if (IsQuitting()) { >+ delete pdata; > return; > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/messagequeue.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/messagequeue.h >index e7e479285d20c8339a62ee63eaf3aa50d099e02e..c1b9b5a916a895ac06f2bdeb7b599cac63de78b5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/messagequeue.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/messagequeue.h >@@ -17,7 +17,6 @@ > #include <list> > #include <memory> > #include <queue> >-#include <utility> > #include <vector> > > #include "rtc_base/constructormagic.h" >@@ -28,7 +27,6 @@ > #include "rtc_base/socketserver.h" > #include "rtc_base/third_party/sigslot/sigslot.h" > #include "rtc_base/thread_annotations.h" >-#include "rtc_base/timeutils.h" > > namespace rtc { > >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/natserver.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/natserver.cc >index 81193761a63c055c72691a91dfc63c033389e10f..b005ecae7a0d1a5bf6a0a6a4582d10a1c78d9a68 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/natserver.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/natserver.cc >@@ -158,7 +158,7 @@ void NATServer::OnInternalUDPPacket(AsyncPacketSocket* socket, > const char* buf, > size_t size, > const SocketAddress& addr, >- const PacketTime& packet_time) { >+ const int64_t& /* packet_time_us */) { > // Read the intended destination from the wire. > SocketAddress dest_addr; > size_t length = UnpackAddressFromNAT(buf, size, &dest_addr); >@@ -184,7 +184,7 @@ void NATServer::OnExternalUDPPacket(AsyncPacketSocket* socket, > const char* buf, > size_t size, > const SocketAddress& remote_addr, >- const PacketTime& packet_time) { >+ const int64_t& /* packet_time_us */) { > SocketAddress local_addr = socket->GetLocalAddress(); > > // Find the translation for this addresses. 
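The natserver.cc/natserver.h hunks above show the theme the ChangeLog calls out ("use directly int64_t for packet time"): receive callbacks that used to take a `rtc::PacketTime` struct now take the capture timestamp directly as microseconds in an `int64_t`. A minimal sketch of that callback shape, outside the webrtc tree (the `sketch` namespace, `Packet`, and `Dispatch` are hypothetical illustrations, not libwebrtc API; the `-1` sentinel for "no timestamp" is an assumption based on the common convention):

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <functional>
#include <string>

namespace sketch {

// Hypothetical packet record: the old PacketTime struct is replaced by a
// plain microsecond timestamp, with -1 meaning "no timestamp available".
struct Packet {
  std::string payload;
  int64_t packet_time_us = -1;
};

// A receive handler in the new style; this mirrors the parameter shape of
// NATServer::OnInternalUDPPacket after this patch (buf, size, time).
using PacketHandler =
    std::function<void(const char* buf, size_t size,
                       const int64_t& packet_time_us)>;

// Deliver one packet to a handler, forwarding the raw int64_t timestamp.
inline void Dispatch(const Packet& p, const PacketHandler& handler) {
  handler(p.payload.data(), p.payload.size(), p.packet_time_us);
}

}  // namespace sketch
```

Handlers that never consult the timestamp can simply name the parameter out, as the patch does with `const int64_t& /* packet_time_us */`.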
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/natserver.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/natserver.h >index f4cabcf0d9b7e130b5c8d299a95c48b89110de64..d16b5370f0507c6c84ee88813539cde9befc6a5d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/natserver.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/natserver.h >@@ -81,12 +81,12 @@ class NATServer : public sigslot::has_slots<> { > const char* buf, > size_t size, > const SocketAddress& addr, >- const PacketTime& packet_time); >+ const int64_t& packet_time_us); > void OnExternalUDPPacket(AsyncPacketSocket* socket, > const char* buf, > size_t size, > const SocketAddress& remote_addr, >- const PacketTime& packet_time); >+ const int64_t& packet_time_us); > > private: > typedef std::set<SocketAddress, AddrCmp> AddressSet; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/nethelper.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/nethelper.h >index e86d126cbf45c6f79ea201515501588d1a583deb..f9561380fb7d23032aae9f6b06ce921933c0ee17 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/nethelper.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/nethelper.h >@@ -10,7 +10,6 @@ > #ifndef RTC_BASE_NETHELPER_H_ > #define RTC_BASE_NETHELPER_H_ > >-#include <cstdlib> > #include <string> > > // This header contains helper functions and constants used by different types >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/nethelpers.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/nethelpers.cc >index b1221c3d0014db634418bbdbfb1d7e65878ce85b..81cd1afcef4a069e6a76a076da20d1ce1071b315 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/nethelpers.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/nethelpers.cc >@@ -25,10 +25,9 @@ > #endif > #endif // defined(WEBRTC_POSIX) && !defined(__native_client__) > >-#include "rtc_base/byteorder.h" >-#include "rtc_base/checks.h" > #include 
"rtc_base/logging.h" > #include "rtc_base/signalthread.h" >+#include "rtc_base/third_party/sigslot/sigslot.h" // for signal_with_thread... > > namespace rtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/nethelpers.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/nethelpers.h >index f9d188f61d09a778251373c738dae7e869324a1e..138f9583ccca1aca8a808fdadc412c8fb4a80fc0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/nethelpers.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/nethelpers.h >@@ -12,18 +12,17 @@ > #define RTC_BASE_NETHELPERS_H_ > > #if defined(WEBRTC_POSIX) >-#include <netdb.h> >-#include <stddef.h> >+#include <sys/socket.h> > #elif WEBRTC_WIN > #include <winsock2.h> // NOLINT > #endif > >-#include <list> >+#include <vector> > > #include "rtc_base/asyncresolverinterface.h" >+#include "rtc_base/ipaddress.h" > #include "rtc_base/signalthread.h" > #include "rtc_base/socketaddress.h" >-#include "rtc_base/third_party/sigslot/sigslot.h" > > namespace rtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/network.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/network.cc >index 67888ede8342310a0706cf13af824ae9b5f32992..5c7b019d66c2fdd3255d520ce3203f7c96b91bc8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/network.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/network.cc >@@ -38,7 +38,6 @@ > #include "rtc_base/logging.h" > #include "rtc_base/networkmonitor.h" > #include "rtc_base/socket.h" // includes something that makes windows happy >-#include "rtc_base/stream.h" > #include "rtc_base/stringencode.h" > #include "rtc_base/strings/string_builder.h" > #include "rtc_base/stringutils.h" >@@ -260,7 +259,7 @@ bool NetworkManager::GetDefaultLocalAddress(int family, IPAddress* addr) const { > return false; > } > >-webrtc::MDnsResponderInterface* NetworkManager::GetMDnsResponder() const { >+webrtc::MdnsResponderInterface* NetworkManager::GetMdnsResponder() 
const { > return nullptr; > } > >@@ -286,7 +285,7 @@ void NetworkManagerBase::GetAnyAddressNetworks(NetworkList* networks) { > new rtc::Network("any", "any", ipv4_any_address, 0, ADAPTER_TYPE_ANY)); > ipv4_any_address_network_->set_default_local_address_provider(this); > ipv4_any_address_network_->AddIP(ipv4_any_address); >- ipv4_any_address_network_->SetMDnsResponder(GetMDnsResponder()); >+ ipv4_any_address_network_->SetMdnsResponder(GetMdnsResponder()); > } > networks->push_back(ipv4_any_address_network_.get()); > >@@ -297,7 +296,7 @@ void NetworkManagerBase::GetAnyAddressNetworks(NetworkList* networks) { > "any", "any", ipv6_any_address, 0, ADAPTER_TYPE_ANY)); > ipv6_any_address_network_->set_default_local_address_provider(this); > ipv6_any_address_network_->AddIP(ipv6_any_address); >- ipv6_any_address_network_->SetMDnsResponder(GetMDnsResponder()); >+ ipv6_any_address_network_->SetMdnsResponder(GetMdnsResponder()); > } > networks->push_back(ipv6_any_address_network_.get()); > } >@@ -387,7 +386,7 @@ void NetworkManagerBase::MergeNetworkList(const NetworkList& new_networks, > delete net; > } > } >- networks_map_[key]->SetMDnsResponder(GetMDnsResponder()); >+ networks_map_[key]->SetMdnsResponder(GetMdnsResponder()); > } > // It may still happen that the merged list is a subset of |networks_|. > // To detect this change, we compare their sizes. 
>@@ -775,26 +774,28 @@ bool BasicNetworkManager::CreateNetworks(bool include_ignored, > > #if defined(WEBRTC_LINUX) > bool IsDefaultRoute(const std::string& network_name) { >- FileStream fs; >- if (!fs.Open("/proc/net/route", "r", nullptr)) { >+ FILE* f = fopen("/proc/net/route", "r"); >+ if (!f) { > RTC_LOG(LS_WARNING) > << "Couldn't read /proc/net/route, skipping default " > << "route check (assuming everything is a default route)."; > return true; >- } else { >- std::string line; >- while (fs.ReadLine(&line) == SR_SUCCESS) { >- char iface_name[256]; >- unsigned int iface_ip, iface_gw, iface_mask, iface_flags; >- if (sscanf(line.c_str(), "%255s %8X %8X %4X %*d %*u %*d %8X", iface_name, >- &iface_ip, &iface_gw, &iface_flags, &iface_mask) == 5 && >- network_name == iface_name && iface_mask == 0 && >- (iface_flags & (RTF_UP | RTF_HOST)) == RTF_UP) { >- return true; >- } >+ } >+ bool is_default_route = false; >+ char line[500]; >+ while (fgets(line, sizeof(line), f)) { >+ char iface_name[256]; >+ unsigned int iface_ip, iface_gw, iface_mask, iface_flags; >+ if (sscanf(line, "%255s %8X %8X %4X %*d %*u %*d %8X", iface_name, &iface_ip, >+ &iface_gw, &iface_flags, &iface_mask) == 5 && >+ network_name == iface_name && iface_mask == 0 && >+ (iface_flags & (RTF_UP | RTF_HOST)) == RTF_UP) { >+ is_default_route = true; >+ break; > } > } >- return false; >+ fclose(f); >+ return is_default_route; > } > #endif > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/network.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/network.h >index 601e39f02771d03293e8d73008ac83d4b277b786..8b1a5fcc5c3ded1183ba86b8037dd8d86a90d4cd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/network.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/network.h >@@ -141,7 +141,7 @@ class NetworkManager : public DefaultLocalAddressProvider { > > // Returns the mDNS responder that can be used to obfuscate the local IP > // addresses of ICE host candidates by mDNS 
hostnames. >- virtual webrtc::MDnsResponderInterface* GetMDnsResponder() const; >+ virtual webrtc::MdnsResponderInterface* GetMdnsResponder() const; > }; > > // Base class for NetworkManager implementations. >@@ -367,11 +367,11 @@ class Network { > // created name will be resolved by the responder. > // > // The mDNS responder, if not null, should outlive this rtc::Network. >- void SetMDnsResponder(webrtc::MDnsResponderInterface* mdns_responder) { >+ void SetMdnsResponder(webrtc::MdnsResponderInterface* mdns_responder) { > mdns_responder_ = mdns_responder; > } > // Returns the mDNS responder, which is null by default. >- webrtc::MDnsResponderInterface* GetMDnsResponder() const { >+ webrtc::MdnsResponderInterface* GetMdnsResponder() const { > return mdns_responder_; > } > >@@ -446,7 +446,7 @@ class Network { > int prefix_length_; > std::string key_; > std::vector<InterfaceAddress> ips_; >- webrtc::MDnsResponderInterface* mdns_responder_ = nullptr; >+ webrtc::MdnsResponderInterface* mdns_responder_ = nullptr; > int scope_id_; > bool ignored_; > AdapterType type_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/network/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/network/BUILD.gn >new file mode 100644 >index 0000000000000000000000000000000000000000..0fbdbb1edbb0afc24e1f4bb7c135436a48ee842c >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/network/BUILD.gn >@@ -0,0 +1,19 @@ >+# Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+# >+# Use of this source code is governed by a BSD-style license >+# that can be found in the LICENSE file in the root of the source >+# tree. An additional intellectual property rights grant can be found >+# in the file PATENTS. All contributing project authors may >+# be found in the AUTHORS file in the root of the source tree. 
>+ >+import("../../webrtc.gni") >+ >+rtc_source_set("sent_packet") { >+ sources = [ >+ "sent_packet.cc", >+ "sent_packet.h", >+ ] >+ deps = [ >+ "//third_party/abseil-cpp/absl/types:optional", >+ ] >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/network/sent_packet.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/network/sent_packet.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..8cc49737efdeca7081e155cdc7541e835c1c0bee >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/network/sent_packet.cc >@@ -0,0 +1,27 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "rtc_base/network/sent_packet.h" >+ >+namespace rtc { >+ >+PacketInfo::PacketInfo() = default; >+PacketInfo::PacketInfo(const PacketInfo& info) = default; >+PacketInfo::~PacketInfo() = default; >+ >+SentPacket::SentPacket() = default; >+SentPacket::SentPacket(int64_t packet_id, int64_t send_time_ms) >+ : packet_id(packet_id), send_time_ms(send_time_ms) {} >+SentPacket::SentPacket(int64_t packet_id, >+ int64_t send_time_ms, >+ const rtc::PacketInfo& info) >+ : packet_id(packet_id), send_time_ms(send_time_ms), info(info) {} >+ >+} // namespace rtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/network/sent_packet.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/network/sent_packet.h >new file mode 100644 >index 0000000000000000000000000000000000000000..0cad31ca91c0f8902bb4ae4a8ddf478014c1a435 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/network/sent_packet.h >@@ -0,0 +1,68 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#ifndef RTC_BASE_NETWORK_SENT_PACKET_H_ >+#define RTC_BASE_NETWORK_SENT_PACKET_H_ >+ >+#include <stddef.h> >+#include <stdint.h> >+ >+#include "absl/types/optional.h" >+ >+namespace rtc { >+ >+enum class PacketType { >+ kUnknown, >+ kData, >+ kIceConnectivityCheck, >+ kIceConnectivityCheckResponse, >+ kStunMessage, >+ kTurnMessage, >+}; >+ >+enum class PacketInfoProtocolType { >+ kUnknown, >+ kUdp, >+ kTcp, >+ kSsltcp, >+ kTls, >+}; >+ >+struct PacketInfo { >+ PacketInfo(); >+ PacketInfo(const PacketInfo& info); >+ ~PacketInfo(); >+ >+ bool included_in_feedback = false; >+ bool included_in_allocation = false; >+ PacketType packet_type = PacketType::kUnknown; >+ PacketInfoProtocolType protocol = PacketInfoProtocolType::kUnknown; >+ // A unique id assigned by the network manager, and absl::nullopt if not set. >+ absl::optional<uint16_t> network_id; >+ size_t packet_size_bytes = 0; >+ size_t turn_overhead_bytes = 0; >+ size_t ip_overhead_bytes = 0; >+}; >+ >+struct SentPacket { >+ SentPacket(); >+ SentPacket(int64_t packet_id, int64_t send_time_ms); >+ SentPacket(int64_t packet_id, >+ int64_t send_time_ms, >+ const rtc::PacketInfo& info); >+ >+ int64_t packet_id = -1; >+ int64_t send_time_ms = -1; >+ rtc::PacketInfo info; >+}; >+ >+} // namespace rtc >+ >+#endif // RTC_BASE_NETWORK_SENT_PACKET_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/networkmonitor.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/networkmonitor.cc >index ad6805acec7e798ff8a9c6ea33dc9775580cbbb0..0185eab85613a3b19561a61fe4aaca165f3fece8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/networkmonitor.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/networkmonitor.cc >@@ -10,7 +10,11 @@ > > #include "rtc_base/networkmonitor.h" > >+#include <stdint.h> >+ > #include "rtc_base/checks.h" >+#include "rtc_base/location.h" >+#include "rtc_base/logging.h" > > namespace { > const uint32_t UPDATE_NETWORKS_MESSAGE = 1; >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/networkmonitor.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/networkmonitor.h >index a84a30aed2bc3368665b7d4314775e63bf05561f..1ad76636696cc8e66fedbbeb01fb25e66cc77cac 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/networkmonitor.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/networkmonitor.h >@@ -11,7 +11,6 @@ > #ifndef RTC_BASE_NETWORKMONITOR_H_ > #define RTC_BASE_NETWORKMONITOR_H_ > >-#include "rtc_base/logging.h" > #include "rtc_base/network_constants.h" > #include "rtc_base/third_party/sigslot/sigslot.h" > #include "rtc_base/thread.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/noop.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/noop.cc >deleted file mode 100644 >index 16a8e6d5c10b97134e68b08e2c65ef2fff16bd52..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/noop.cc >+++ /dev/null >@@ -1,13 +0,0 @@ >-/* >- * Copyright 2015 The WebRTC Project Authors. All rights reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-// This file is only needed to make ninja happy on some platforms. >-// On some platforms it is not possible to link an rtc_static_library >-// without any source file listed in the GN target. 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/noop.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/noop.mm >deleted file mode 100644 >index 16a8e6d5c10b97134e68b08e2c65ef2fff16bd52..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/noop.mm >+++ /dev/null >@@ -1,13 +0,0 @@ >-/* >- * Copyright 2015 The WebRTC Project Authors. All rights reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-// This file is only needed to make ninja happy on some platforms. >-// On some platforms it is not possible to link an rtc_static_library >-// without any source file listed in the GN target. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/nullsocketserver.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/nullsocketserver.cc >index c890c6f3dcf653f9694efa6bbc5fc82991cac91d..ec042dd09517dc18e7e1e04b849ae517850a30d5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/nullsocketserver.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/nullsocketserver.cc >@@ -13,7 +13,7 @@ > > namespace rtc { > >-NullSocketServer::NullSocketServer() : event_(false, false) {} >+NullSocketServer::NullSocketServer() = default; > NullSocketServer::~NullSocketServer() {} > > bool NullSocketServer::Wait(int cms, bool process_io) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/nullsocketserver.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/nullsocketserver.h >index 408bcd1b2f5163621f131015b6c5f6defab39c1a..47a7fa611871c4423d2516e398c9480c485f83ef 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/nullsocketserver.h >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/nullsocketserver.h >@@ -11,7 +11,9 @@ > #ifndef RTC_BASE_NULLSOCKETSERVER_H_ > #define RTC_BASE_NULLSOCKETSERVER_H_ > >+#include "rtc_base/asyncsocket.h" > #include "rtc_base/event.h" >+#include "rtc_base/socket.h" > #include "rtc_base/socketserver.h" > > namespace rtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/numerics/moving_average.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/numerics/moving_average.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..c825839227ebb8a4aac7c591138f60940508d27f >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/numerics/moving_average.cc >@@ -0,0 +1,60 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "rtc_base/numerics/moving_average.h" >+ >+#include <algorithm> >+ >+#include "rtc_base/checks.h" >+ >+namespace rtc { >+ >+MovingAverage::MovingAverage(size_t window_size) : history_(window_size, 0) { >+ // Limit window size to avoid overflow. 
>+ RTC_DCHECK_LE(window_size, (int64_t{1} << 32) - 1); >+} >+MovingAverage::~MovingAverage() = default; >+ >+void MovingAverage::AddSample(int sample) { >+ count_++; >+ size_t index = count_ % history_.size(); >+ if (count_ > history_.size()) >+ sum_ -= history_[index]; >+ sum_ += sample; >+ history_[index] = sample; >+} >+ >+absl::optional<int> MovingAverage::GetAverageRoundedDown() const { >+ if (count_ == 0) >+ return absl::nullopt; >+ return sum_ / Size(); >+} >+ >+absl::optional<int> MovingAverage::GetAverageRoundedToClosest() const { >+ if (count_ == 0) >+ return absl::nullopt; >+ return (sum_ + Size() / 2) / Size(); >+} >+ >+absl::optional<double> MovingAverage::GetUnroundedAverage() const { >+ if (count_ == 0) >+ return absl::nullopt; >+ return sum_ / static_cast<double>(Size()); >+} >+ >+void MovingAverage::Reset() { >+ count_ = 0; >+ sum_ = 0; >+} >+ >+size_t MovingAverage::Size() const { >+ return std::min(count_, history_.size()); >+} >+} // namespace rtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/numerics/moving_average.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/numerics/moving_average.h >new file mode 100644 >index 0000000000000000000000000000000000000000..770e47d86f8128fcf7774d4bc4509676975c62ac >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/numerics/moving_average.h >@@ -0,0 +1,63 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#ifndef RTC_BASE_NUMERICS_MOVING_AVERAGE_H_ >+#define RTC_BASE_NUMERICS_MOVING_AVERAGE_H_ >+ >+#include <vector> >+ >+#include "absl/types/optional.h" >+ >+namespace rtc { >+ >+// Calculates average over fixed size window. If there are less than window >+// size elements, calculates average of all inserted so far elements. >+// >+class MovingAverage { >+ public: >+ // Maximum supported window size is 2^32 - 1. >+ explicit MovingAverage(size_t window_size); >+ ~MovingAverage(); >+ // MovingAverage is neither copyable nor movable. >+ MovingAverage(const MovingAverage&) = delete; >+ MovingAverage& operator=(const MovingAverage&) = delete; >+ >+ // Adds new sample. If the window is full, the oldest element is pushed out. >+ void AddSample(int sample); >+ >+ // Returns rounded down average of last |window_size| elements or all >+ // elements if there are not enough of them. Returns nullopt if there were >+ // no elements added. >+ absl::optional<int> GetAverageRoundedDown() const; >+ >+ // Same as above but rounded to the closest integer. >+ absl::optional<int> GetAverageRoundedToClosest() const; >+ >+ // Returns unrounded average over the window. >+ absl::optional<double> GetUnroundedAverage() const; >+ >+ // Resets to the initial state before any elements were added. >+ void Reset(); >+ >+ // Returns number of elements in the window. >+ size_t Size() const; >+ >+ private: >+ // Total number of samples added to the class since last reset. >+ size_t count_ = 0; >+ // Sum of the samples in the moving window. >+ int64_t sum_ = 0; >+ // Circular buffer for all the samples in the moving window. 
>+ // Size is always |window_size| >+ std::vector<int> history_; >+}; >+ >+} // namespace rtc >+#endif // RTC_BASE_NUMERICS_MOVING_AVERAGE_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/numerics/moving_average_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/numerics/moving_average_unittest.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..9bc9a1aef89a14ba9366be888fd22b2d13960959 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/numerics/moving_average_unittest.cc >@@ -0,0 +1,87 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "rtc_base/numerics/moving_average.h" >+ >+#include "test/gtest.h" >+ >+namespace test { >+ >+TEST(MovingAverageTest, EmptyAverage) { >+ rtc::MovingAverage moving_average(1); >+ EXPECT_EQ(0u, moving_average.Size()); >+ EXPECT_EQ(absl::nullopt, moving_average.GetAverageRoundedDown()); >+} >+ >+// Test single value. 
>+TEST(MovingAverageTest, OneElement) { >+ rtc::MovingAverage moving_average(1); >+ moving_average.AddSample(3); >+ EXPECT_EQ(1u, moving_average.Size()); >+ EXPECT_EQ(3, *moving_average.GetAverageRoundedDown()); >+} >+ >+TEST(MovingAverageTest, GetAverage) { >+ rtc::MovingAverage moving_average(1024); >+ moving_average.AddSample(1); >+ moving_average.AddSample(1); >+ moving_average.AddSample(3); >+ moving_average.AddSample(3); >+ EXPECT_EQ(*moving_average.GetAverageRoundedDown(), 2); >+ EXPECT_EQ(*moving_average.GetAverageRoundedToClosest(), 2); >+} >+ >+TEST(MovingAverageTest, GetAverageRoundedDownRounds) { >+ rtc::MovingAverage moving_average(1024); >+ moving_average.AddSample(1); >+ moving_average.AddSample(2); >+ moving_average.AddSample(2); >+ moving_average.AddSample(2); >+ EXPECT_EQ(*moving_average.GetAverageRoundedDown(), 1); >+} >+ >+TEST(MovingAverageTest, GetAverageRoundedToClosestRounds) { >+ rtc::MovingAverage moving_average(1024); >+ moving_average.AddSample(1); >+ moving_average.AddSample(2); >+ moving_average.AddSample(2); >+ moving_average.AddSample(2); >+ EXPECT_EQ(*moving_average.GetAverageRoundedToClosest(), 2); >+} >+ >+TEST(MovingAverageTest, Reset) { >+ rtc::MovingAverage moving_average(5); >+ moving_average.AddSample(1); >+ EXPECT_EQ(1, *moving_average.GetAverageRoundedDown()); >+ EXPECT_EQ(1, *moving_average.GetAverageRoundedToClosest()); >+ >+ moving_average.Reset(); >+ >+ EXPECT_FALSE(moving_average.GetAverageRoundedDown()); >+ moving_average.AddSample(10); >+ EXPECT_EQ(10, *moving_average.GetAverageRoundedDown()); >+ EXPECT_EQ(10, *moving_average.GetAverageRoundedToClosest()); >+} >+ >+TEST(MovingAverageTest, ManySamples) { >+ rtc::MovingAverage moving_average(10); >+ for (int i = 1; i < 11; i++) { >+ moving_average.AddSample(i); >+ } >+ EXPECT_EQ(*moving_average.GetAverageRoundedDown(), 5); >+ EXPECT_EQ(*moving_average.GetAverageRoundedToClosest(), 6); >+ for (int i = 1; i < 2001; i++) { >+ moving_average.AddSample(i); >+ } >+ 
EXPECT_EQ(*moving_average.GetAverageRoundedDown(), 1995); >+ EXPECT_EQ(*moving_average.GetAverageRoundedToClosest(), 1996); >+} >+ >+} // namespace test >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/numerics/sample_counter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/numerics/sample_counter.h >index 4fe71d19427a23ee229ecf4d21b61710d85080ee..18bd36b8f3052a652a1aa7be986d6db3c9717927 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/numerics/sample_counter.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/numerics/sample_counter.h >@@ -11,6 +11,8 @@ > #ifndef RTC_BASE_NUMERICS_SAMPLE_COUNTER_H_ > #define RTC_BASE_NUMERICS_SAMPLE_COUNTER_H_ > >+#include <stdint.h> >+ > #include "absl/types/optional.h" > > namespace rtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/openssl_key_derivation_hkdf.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/openssl_key_derivation_hkdf.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..52af667645b39ee7703ce7b74ff20455b9e8acae >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/openssl_key_derivation_hkdf.cc >@@ -0,0 +1,70 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "rtc_base/openssl_key_derivation_hkdf.h" >+ >+#include <openssl/digest.h> >+#include <openssl/err.h> >+#include <openssl/hkdf.h> >+#include <openssl/sha.h> >+ >+#include <algorithm> >+#include <utility> >+ >+#include "rtc_base/buffer.h" >+#include "rtc_base/openssl.h" >+ >+namespace rtc { >+ >+OpenSSLKeyDerivationHKDF::OpenSSLKeyDerivationHKDF() = default; >+OpenSSLKeyDerivationHKDF::~OpenSSLKeyDerivationHKDF() = default; >+ >+const size_t OpenSSLKeyDerivationHKDF::kMinKeyByteSize = 16; >+const size_t OpenSSLKeyDerivationHKDF::kMaxKeyByteSize = >+ 255 * SHA256_DIGEST_LENGTH; >+const size_t OpenSSLKeyDerivationHKDF::kMinSecretByteSize = 16; >+ >+absl::optional<ZeroOnFreeBuffer<uint8_t>> OpenSSLKeyDerivationHKDF::DeriveKey( >+ rtc::ArrayView<const uint8_t> secret, >+ rtc::ArrayView<const uint8_t> salt, >+ rtc::ArrayView<const uint8_t> label, >+ size_t derived_key_byte_size) { >+ // Prevent deriving less than 128 bits of key material or more than the max. >+ if (derived_key_byte_size < kMinKeyByteSize || >+ derived_key_byte_size > kMaxKeyByteSize) { >+ return absl::nullopt; >+ } >+ // The secret must reach the minimum number of bits to be secure. >+ if (secret.data() == nullptr || secret.size() < kMinSecretByteSize) { >+ return absl::nullopt; >+ } >+ // Empty labels are always invalid in derivation. >+ if (label.data() == nullptr || label.size() == 0) { >+ return absl::nullopt; >+ } >+ // If a random salt is not provided use all zeros. >+ rtc::Buffer salt_buffer; >+ if (salt.data() == nullptr || salt.size() == 0) { >+ salt_buffer.SetSize(SHA256_DIGEST_LENGTH); >+ std::fill(salt_buffer.begin(), salt_buffer.end(), 0); >+ salt = salt_buffer; >+ } >+ // This buffer will erase itself on release. 
>+ ZeroOnFreeBuffer<uint8_t> derived_key_buffer(derived_key_byte_size, 0); >+ if (!HKDF(derived_key_buffer.data(), derived_key_buffer.size(), EVP_sha256(), >+ secret.data(), secret.size(), salt.data(), salt.size(), >+ label.data(), label.size())) { >+ return absl::nullopt; >+ } >+ return absl::optional<ZeroOnFreeBuffer<uint8_t>>( >+ std::move(derived_key_buffer)); >+} >+ >+} // namespace rtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/openssl_key_derivation_hkdf.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/openssl_key_derivation_hkdf.h >new file mode 100644 >index 0000000000000000000000000000000000000000..ebf43cfaa7abad7a9a11aa5d5096e81efa1ee59f >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/openssl_key_derivation_hkdf.h >@@ -0,0 +1,54 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef RTC_BASE_OPENSSL_KEY_DERIVATION_HKDF_H_ >+#define RTC_BASE_OPENSSL_KEY_DERIVATION_HKDF_H_ >+ >+#include "rtc_base/constructormagic.h" >+#include "rtc_base/key_derivation.h" >+ >+namespace rtc { >+ >+// OpenSSLKeyDerivationHKDF provides a concrete implementation of the >+// KeyDerivation interface to support the HKDF algorithm using the >+// OpenSSL/BoringSSL internal implementation. >+class OpenSSLKeyDerivationHKDF final : public KeyDerivation { >+ public: >+ OpenSSLKeyDerivationHKDF(); >+ ~OpenSSLKeyDerivationHKDF() override; >+ >+ // General users shouldn't be generating keys smaller than 128 bits. 
>+ static const size_t kMinKeyByteSize; >+ // The maximum available derivation size 255*DIGEST_LENGTH >+ static const size_t kMaxKeyByteSize; >+ // The minimum acceptable secret size. >+ static const size_t kMinSecretByteSize; >+ >+ // Derives a new key from existing key material using HKDF. >+ // secret - The random secret value you wish to derive a key from. >+ // salt - Optional (non secret) cryptographically random value. >+ // label - A non secret but unique label value to determine the derivation. >+ // derived_key_byte_size - The size of the derived key. >+ // return - A ZeroOnFreeBuffer containing the derived key or an error >+ // condition. Checking error codes is explicit in the API and error should >+ // never be ignored. >+ absl::optional<ZeroOnFreeBuffer<uint8_t>> DeriveKey( >+ rtc::ArrayView<const uint8_t> secret, >+ rtc::ArrayView<const uint8_t> salt, >+ rtc::ArrayView<const uint8_t> label, >+ size_t derived_key_byte_size) override; >+ >+ private: >+ RTC_DISALLOW_COPY_AND_ASSIGN(OpenSSLKeyDerivationHKDF); >+}; >+ >+} // namespace rtc >+ >+#endif // RTC_BASE_OPENSSL_KEY_DERIVATION_HKDF_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/openssl_key_derivation_hkdf_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/openssl_key_derivation_hkdf_unittest.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..92df42ffd30d6b4fd359c14b13d3883dea9af4db >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/openssl_key_derivation_hkdf_unittest.cc >@@ -0,0 +1,107 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "rtc_base/openssl_key_derivation_hkdf.h" >+ >+#include <utility> >+ >+#include "test/gmock.h" >+ >+namespace rtc { >+namespace { >+ >+// Validates that a basic valid call works correctly. >+TEST(OpenSSLKeyDerivationHKDF, DerivationBasicTest) { >+ rtc::Buffer secret(32); >+ rtc::Buffer salt(32); >+ rtc::Buffer label(32); >+ const size_t derived_key_byte_size = 16; >+ >+ OpenSSLKeyDerivationHKDF hkdf; >+ auto key_or = hkdf.DeriveKey(secret, salt, label, derived_key_byte_size); >+ EXPECT_TRUE(key_or.has_value()); >+ ZeroOnFreeBuffer<uint8_t> key = std::move(key_or.value()); >+ EXPECT_EQ(derived_key_byte_size, key.size()); >+} >+ >+// Derivation fails if output is too small. >+TEST(OpenSSLKeyDerivationHKDF, DerivationFailsIfOutputIsTooSmall) { >+ rtc::Buffer secret(32); >+ rtc::Buffer salt(32); >+ rtc::Buffer label(32); >+ const size_t derived_key_byte_size = 15; >+ >+ OpenSSLKeyDerivationHKDF hkdf; >+ auto key_or = hkdf.DeriveKey(secret, salt, label, derived_key_byte_size); >+ EXPECT_FALSE(key_or.has_value()); >+} >+ >+// Derivation fails if output is too large. >+TEST(OpenSSLKeyDerivationHKDF, DerivationFailsIfOutputIsTooLarge) { >+ rtc::Buffer secret(32); >+ rtc::Buffer salt(32); >+ rtc::Buffer label(32); >+ const size_t derived_key_byte_size = 256 * 32; >+ >+ OpenSSLKeyDerivationHKDF hkdf; >+ auto key_or = hkdf.DeriveKey(secret, salt, label, derived_key_byte_size); >+ EXPECT_FALSE(key_or.has_value()); >+} >+ >+// Validates that too little key material causes a failure. 
>+TEST(OpenSSLKeyDerivationHKDF, DerivationFailsWithInvalidSecret) { >+ rtc::Buffer secret(15); >+ rtc::Buffer salt(32); >+ rtc::Buffer label(32); >+ const size_t derived_key_byte_size = 16; >+ >+ OpenSSLKeyDerivationHKDF hkdf; >+ auto key_or_0 = hkdf.DeriveKey(secret, salt, label, derived_key_byte_size); >+ EXPECT_FALSE(key_or_0.has_value()); >+ >+ auto key_or_1 = hkdf.DeriveKey(nullptr, salt, label, derived_key_byte_size); >+ EXPECT_FALSE(key_or_1.has_value()); >+ >+ rtc::Buffer secret_empty; >+ auto key_or_2 = >+ hkdf.DeriveKey(secret_empty, salt, label, derived_key_byte_size); >+ EXPECT_FALSE(key_or_2.has_value()); >+} >+ >+// Validates that HKDF works without a salt being set. >+TEST(OpenSSLKeyDerivationHKDF, DerivationWorksWithNoSalt) { >+ rtc::Buffer secret(32); >+ rtc::Buffer label(32); >+ const size_t derived_key_byte_size = 16; >+ >+ OpenSSLKeyDerivationHKDF hkdf; >+ auto key_or = hkdf.DeriveKey(secret, nullptr, label, derived_key_byte_size); >+ EXPECT_TRUE(key_or.has_value()); >+} >+ >+// Validates that a label is required to work correctly. 
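The unit tests above pin down the input-validation rules of `DeriveKey`. As a standalone sketch of just those rules (a hypothetical helper: the real method additionally runs BoringSSL's `HKDF()` once the checks pass, and uses `rtc::ArrayView` rather than `std::vector`):

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical validation helper mirroring the constants in
// openssl_key_derivation_hkdf.cc: 16-byte minimum derived key and secret,
// 255 * SHA256_DIGEST_LENGTH (255 * 32) maximum derived key.
bool HkdfInputsAreValid(const std::vector<uint8_t>& secret,
                        const std::vector<uint8_t>& label,
                        size_t derived_key_byte_size) {
  constexpr size_t kMinKeyByteSize = 16;
  constexpr size_t kMaxKeyByteSize = 255 * 32;
  constexpr size_t kMinSecretByteSize = 16;
  if (derived_key_byte_size < kMinKeyByteSize ||
      derived_key_byte_size > kMaxKeyByteSize) {
    return false;  // Refuse to derive fewer than 128 bits or more than the max.
  }
  if (secret.size() < kMinSecretByteSize) {
    return false;  // Too little input keying material to be secure.
  }
  if (label.empty()) {
    return false;  // Empty labels are always invalid in derivation.
  }
  // A missing salt is acceptable: the real code substitutes 32 zero bytes.
  return true;
}
```

Note that the salt is deliberately absent from the checks; as the `DerivationWorksWithNoSalt` test shows, an empty salt is replaced with an all-zero buffer rather than rejected.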
>+TEST(OpenSSLKeyDerivationHKDF, DerivationRequiresLabel) { >+ rtc::Buffer secret(32); >+ rtc::Buffer salt(32); >+ rtc::Buffer label(1); >+ const size_t derived_key_byte_size = 16; >+ >+ OpenSSLKeyDerivationHKDF hkdf; >+ auto key_or_0 = hkdf.DeriveKey(secret, salt, label, derived_key_byte_size); >+ EXPECT_TRUE(key_or_0.has_value()); >+ ZeroOnFreeBuffer<uint8_t> key = std::move(key_or_0.value()); >+ EXPECT_EQ(key.size(), derived_key_byte_size); >+ >+ auto key_or_1 = hkdf.DeriveKey(secret, salt, nullptr, derived_key_byte_size); >+ EXPECT_FALSE(key_or_1.has_value()); >+} >+ >+} // namespace >+} // namespace rtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/openssladapter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/openssladapter.cc >index 50284a6719fe45146ace9266e0f9cbdb3bd633d8..fcfa53b7ee70d84ec1baad00b867454e4313901d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/openssladapter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/openssladapter.cc >@@ -10,25 +10,25 @@ > > #include "rtc_base/openssladapter.h" > >-#if defined(WEBRTC_POSIX) >-#include <unistd.h> >-#endif >+#include <errno.h> > > #include <openssl/bio.h> >-#include <openssl/crypto.h> > #include <openssl/err.h> >-#include <openssl/opensslv.h> > #include <openssl/rand.h> > #include <openssl/x509.h> >-#include <openssl/x509v3.h> > #include "rtc_base/openssl.h" > >+#include <string.h> >+#include <time.h> >+ >+#include "absl/memory/memory.h" > #include "rtc_base/checks.h" >+#include "rtc_base/location.h" > #include "rtc_base/logging.h" > #include "rtc_base/numerics/safe_conversions.h" >+#include "rtc_base/opensslcertificate.h" > #include "rtc_base/opensslutility.h" > #include "rtc_base/stringencode.h" >-#include "rtc_base/stringutils.h" > #include "rtc_base/thread.h" > > #ifndef OPENSSL_IS_BORINGSSL >@@ -392,15 +392,17 @@ int OpenSSLAdapter::BeginSSL() { > > // Do the connect. 
> err = ContinueSSL(); >- if (err != 0) >+ if (err != 0) { > goto ssl_error; >+ } > > return err; > > ssl_error: > Cleanup(); >- if (bio) >+ if (bio) { > BIO_free(bio); >+ } > > return err; > } >@@ -424,14 +426,13 @@ int OpenSSLAdapter::ContinueSSL() { > > state_ = SSL_CONNECTED; > AsyncSocketAdapter::OnConnectEvent(this); >-#if 0 // TODO(benwright): worry about this >- // Don't let ourselves go away during the callbacks >- PRefPtr<OpenSSLAdapter> lock(this); >- RTC_LOG(LS_INFO) << " -- onStreamReadable"; >- AsyncSocketAdapter::OnReadEvent(this); >- RTC_LOG(LS_INFO) << " -- onStreamWriteable"; >- AsyncSocketAdapter::OnWriteEvent(this); >-#endif >+ // TODO(benwright): Refactor this code path. >+ // Don't let ourselves go away during the callbacks >+ // PRefPtr<OpenSSLAdapter> lock(this); >+ // RTC_LOG(LS_INFO) << " -- onStreamReadable"; >+ // AsyncSocketAdapter::OnReadEvent(this); >+ // RTC_LOG(LS_INFO) << " -- onStreamWriteable"; >+ // AsyncSocketAdapter::OnWriteEvent(this); > break; > > case SSL_ERROR_WANT_READ: >@@ -462,8 +463,9 @@ void OpenSSLAdapter::Error(const char* context, int err, bool signal) { > << ")"; > state_ = SSL_ERROR; > SetError(err); >- if (signal) >+ if (signal) { > AsyncSocketAdapter::OnCloseEvent(this, err); >+ } > } > > void OpenSSLAdapter::Cleanup() { >@@ -528,23 +530,20 @@ int OpenSSLAdapter::DoSslWrite(const void* pv, size_t cb, int* error) { > return SOCKET_ERROR; > } > >-// >+/////////////////////////////////////////////////////////////////////////////// > // AsyncSocket Implementation >-// >+/////////////////////////////////////////////////////////////////////////////// > > int OpenSSLAdapter::Send(const void* pv, size_t cb) { > switch (state_) { > case SSL_NONE: > return AsyncSocketAdapter::Send(pv, cb); >- > case SSL_WAIT: > case SSL_CONNECTING: > SetError(ENOTCONN); > return SOCKET_ERROR; >- > case SSL_CONNECTED: > break; >- > case SSL_ERROR: > default: > return SOCKET_ERROR; >@@ -567,8 +566,9 @@ int OpenSSLAdapter::Send(const void* 
pv, size_t cb) { > } > > // OpenSSL will return an error if we try to write zero bytes >- if (cb == 0) >+ if (cb == 0) { > return 0; >+ } > > ret = DoSslWrite(pv, cb, &error); > >@@ -595,7 +595,6 @@ int OpenSSLAdapter::Send(const void* pv, size_t cb) { > // size. The user of this class can consider it sent. > return rtc::dchecked_cast<int>(cb); > } >- > return ret; > } > >@@ -608,7 +607,6 @@ int OpenSSLAdapter::SendTo(const void* pv, > } > > SetError(ENOTCONN); >- > return SOCKET_ERROR; > } > >@@ -616,28 +614,26 @@ int OpenSSLAdapter::Recv(void* pv, size_t cb, int64_t* timestamp) { > switch (state_) { > case SSL_NONE: > return AsyncSocketAdapter::Recv(pv, cb, timestamp); >- > case SSL_WAIT: > case SSL_CONNECTING: > SetError(ENOTCONN); > return SOCKET_ERROR; >- > case SSL_CONNECTED: > break; >- > case SSL_ERROR: > default: > return SOCKET_ERROR; > } > > // Don't trust OpenSSL with zero byte reads >- if (cb == 0) >+ if (cb == 0) { > return 0; >+ } > > ssl_read_needs_write_ = false; >- > int code = SSL_read(ssl_, pv, checked_cast<int>(cb)); > int error = SSL_get_error(ssl_, code); >+ > switch (error) { > case SSL_ERROR_NONE: > return code; >@@ -660,7 +656,6 @@ int OpenSSLAdapter::Recv(void* pv, size_t cb, int64_t* timestamp) { > Error("SSL_read", (code ? 
code : -1), false); > break; > } >- > return SOCKET_ERROR; > } > >@@ -670,14 +665,11 @@ int OpenSSLAdapter::RecvFrom(void* pv, > int64_t* timestamp) { > if (socket_->GetState() == Socket::CS_CONNECTED) { > int ret = Recv(pv, cb, timestamp); >- > *paddr = GetRemoteAddress(); >- > return ret; > } > > SetError(ENOTCONN); >- > return SOCKET_ERROR; > } > >@@ -688,12 +680,11 @@ int OpenSSLAdapter::Close() { > } > > Socket::ConnState OpenSSLAdapter::GetState() const { >- // if (signal_close_) >- // return CS_CONNECTED; > ConnState state = socket_->GetState(); > if ((state == CS_CONNECTED) && >- ((state_ == SSL_WAIT) || (state_ == SSL_CONNECTING))) >+ ((state_ == SSL_WAIT) || (state_ == SSL_CONNECTING))) { > state = CS_CONNECTING; >+ } > return state; > } > >@@ -736,8 +727,9 @@ void OpenSSLAdapter::OnReadEvent(AsyncSocket* socket) { > return; > } > >- if (state_ != SSL_CONNECTED) >+ if (state_ != SSL_CONNECTED) { > return; >+ } > > // Don't let ourselves go away during the callbacks > // PRefPtr<OpenSSLAdapter> lock(this); // TODO(benwright): fix this >@@ -761,8 +753,9 @@ void OpenSSLAdapter::OnWriteEvent(AsyncSocket* socket) { > return; > } > >- if (state_ != SSL_CONNECTED) >+ if (state_ != SSL_CONNECTED) { > return; >+ } > > // Don't let ourselves go away during the callbacks > // PRefPtr<OpenSSLAdapter> lock(this); // TODO(benwright): fix this >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/openssladapter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/openssladapter.h >index 50a7c0845ca459dedd282105c773ac97ea8a2d6a..2e3a355f439e750e5c22c8aa9bc6ab696b85c8b5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/openssladapter.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/openssladapter.h >@@ -11,25 +11,28 @@ > #ifndef RTC_BASE_OPENSSLADAPTER_H_ > #define RTC_BASE_OPENSSLADAPTER_H_ > >-#include <openssl/ossl_typ.h> >- >-#include <map> >+#include <stddef.h> >+#include <stdint.h> > #include <memory> > #include <string> > 
#include <vector> > >-#include "absl/memory/memory.h" >+#include "rtc_base/asyncsocket.h" > #include "rtc_base/buffer.h" > #include "rtc_base/messagehandler.h" > #include "rtc_base/messagequeue.h" >-#include "rtc_base/opensslcertificate.h" > #include "rtc_base/opensslidentity.h" > #include "rtc_base/opensslsessioncache.h" >+#include "rtc_base/socket.h" >+#include "rtc_base/socketaddress.h" > #include "rtc_base/ssladapter.h" >+#include "rtc_base/sslcertificate.h" >+#include "rtc_base/sslidentity.h" >+#include "rtc_base/sslstreamadapter.h" > > namespace rtc { > >-class OpenSSLAdapter : public SSLAdapter, public MessageHandler { >+class OpenSSLAdapter final : public SSLAdapter, public MessageHandler { > public: > static bool InitializeSSL(); > static bool CleanupSSL(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslcertificate.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslcertificate.cc >index 236bef55c8aac95a6c714374b8a2c0cea77b12dc..4e325dcec018fc92536a0ce5c828c32a98293830 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslcertificate.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslcertificate.cc >@@ -10,10 +10,6 @@ > > #include "rtc_base/opensslcertificate.h" > >-#include <memory> >-#include <utility> >-#include <vector> >- > #if defined(WEBRTC_WIN) > // Must be included first before openssl headers. 
> #include "rtc_base/win32.h" // NOLINT >@@ -21,63 +17,69 @@ > > #include <openssl/bio.h> > #include <openssl/bn.h> >-#include <openssl/crypto.h> >-#include <openssl/err.h> > #include <openssl/pem.h> >-#include <openssl/rsa.h> >+#include <time.h> > > #include "absl/memory/memory.h" >-#include "rtc_base/arraysize.h" > #include "rtc_base/checks.h" > #include "rtc_base/helpers.h" > #include "rtc_base/logging.h" >-#include "rtc_base/numerics/safe_conversions.h" >-#include "rtc_base/openssl.h" >+#include "rtc_base/messagedigest.h" > #include "rtc_base/openssldigest.h" > #include "rtc_base/opensslidentity.h" > #include "rtc_base/opensslutility.h" >-#ifndef WEBRTC_EXCLUDE_BUILT_IN_SSL_ROOT_CERTS >-#include "rtc_base/sslroots.h" >-#endif // WEBRTC_EXCLUDE_BUILT_IN_SSL_ROOT_CERTS > > namespace rtc { >- >-////////////////////////////////////////////////////////////////////// >-// OpenSSLCertificate >-////////////////////////////////////////////////////////////////////// >- >-// We could have exposed a myriad of parameters for the crypto stuff, >-// but keeping it simple seems best. >+namespace { > > // Random bits for certificate serial number > static const int SERIAL_RAND_BITS = 64; > >+#if !defined(NDEBUG) >+// Print a certificate to the log, for debugging. >+static void PrintCert(X509* x509) { >+ BIO* temp_memory_bio = BIO_new(BIO_s_mem()); >+ if (!temp_memory_bio) { >+ RTC_DLOG_F(LS_ERROR) << "Failed to allocate temporary memory bio"; >+ return; >+ } >+ X509_print_ex(temp_memory_bio, x509, XN_FLAG_SEP_CPLUS_SPC, 0); >+ BIO_write(temp_memory_bio, "\0", 1); >+ char* buffer; >+ BIO_get_mem_data(temp_memory_bio, &buffer); >+ RTC_DLOG(LS_VERBOSE) << buffer; >+ BIO_free(temp_memory_bio); >+} >+#endif >+ > // Generate a self-signed certificate, with the public key from the > // given key pair. Caller is responsible for freeing the returned object. 
> static X509* MakeCertificate(EVP_PKEY* pkey, const SSLIdentityParams& params) { > RTC_LOG(LS_INFO) << "Making certificate for " << params.common_name; >- X509* x509 = nullptr; >+ >+ ASN1_INTEGER* asn1_serial_number = nullptr; > BIGNUM* serial_number = nullptr; >+ X509* x509 = nullptr; > X509_NAME* name = nullptr; > time_t epoch_off = 0; // Time offset since epoch. > >- if ((x509 = X509_new()) == nullptr) >+ if ((x509 = X509_new()) == nullptr) { > goto error; >- >- if (!X509_set_pubkey(x509, pkey)) >+ } >+ if (!X509_set_pubkey(x509, pkey)) { > goto error; >- >- // serial number >- // temporary reference to serial number inside x509 struct >- ASN1_INTEGER* asn1_serial_number; >+ } >+ // serial number - temporary reference to serial number inside x509 struct > if ((serial_number = BN_new()) == nullptr || > !BN_pseudo_rand(serial_number, SERIAL_RAND_BITS, 0, 0) || > (asn1_serial_number = X509_get_serialNumber(x509)) == nullptr || >- !BN_to_ASN1_INTEGER(serial_number, asn1_serial_number)) >+ !BN_to_ASN1_INTEGER(serial_number, asn1_serial_number)) { > goto error; >- >- if (!X509_set_version(x509, 2L)) // version 3 >+ } >+ // Set version to X509.V3 >+ if (!X509_set_version(x509, 2L)) { > goto error; >+ } > > // There are a lot of possible components for the name entries. 
In > // our P2P SSL mode however, the certificates are pre-exchanged >@@ -90,15 +92,16 @@ static X509* MakeCertificate(EVP_PKEY* pkey, const SSLIdentityParams& params) { > !X509_NAME_add_entry_by_NID(name, NID_commonName, MBSTRING_UTF8, > (unsigned char*)params.common_name.c_str(), > -1, -1, 0) || >- !X509_set_subject_name(x509, name) || !X509_set_issuer_name(x509, name)) >+ !X509_set_subject_name(x509, name) || !X509_set_issuer_name(x509, name)) { > goto error; >- >+ } > if (!X509_time_adj(X509_get_notBefore(x509), params.not_before, &epoch_off) || >- !X509_time_adj(X509_get_notAfter(x509), params.not_after, &epoch_off)) >+ !X509_time_adj(X509_get_notAfter(x509), params.not_after, &epoch_off)) { > goto error; >- >- if (!X509_sign(x509, pkey, EVP_sha256())) >+ } >+ if (!X509_sign(x509, pkey, EVP_sha256())) { > goto error; >+ } > > BN_free(serial_number); > X509_NAME_free(name); >@@ -112,28 +115,14 @@ error: > return nullptr; > } > >-#if !defined(NDEBUG) >-// Print a certificate to the log, for debugging. 
>-static void PrintCert(X509* x509) { >- BIO* temp_memory_bio = BIO_new(BIO_s_mem()); >- if (!temp_memory_bio) { >- RTC_DLOG_F(LS_ERROR) << "Failed to allocate temporary memory bio"; >- return; >- } >- X509_print_ex(temp_memory_bio, x509, XN_FLAG_SEP_CPLUS_SPC, 0); >- BIO_write(temp_memory_bio, "\0", 1); >- char* buffer; >- BIO_get_mem_data(temp_memory_bio, &buffer); >- RTC_DLOG(LS_VERBOSE) << buffer; >- BIO_free(temp_memory_bio); >-} >-#endif >+} // namespace > > OpenSSLCertificate::OpenSSLCertificate(X509* x509) : x509_(x509) { >- AddReference(); >+ RTC_DCHECK(x509_ != nullptr); >+ X509_up_ref(x509_); > } > >-OpenSSLCertificate* OpenSSLCertificate::Generate( >+std::unique_ptr<OpenSSLCertificate> OpenSSLCertificate::Generate( > OpenSSLKeyPair* key_pair, > const SSLIdentityParams& params) { > SSLIdentityParams actual_params(params); >@@ -149,25 +138,27 @@ OpenSSLCertificate* OpenSSLCertificate::Generate( > #if !defined(NDEBUG) > PrintCert(x509); > #endif >- OpenSSLCertificate* ret = new OpenSSLCertificate(x509); >+ auto ret = absl::make_unique<OpenSSLCertificate>(x509); > X509_free(x509); > return ret; > } > >-OpenSSLCertificate* OpenSSLCertificate::FromPEMString( >+std::unique_ptr<OpenSSLCertificate> OpenSSLCertificate::FromPEMString( > const std::string& pem_string) { > BIO* bio = BIO_new_mem_buf(const_cast<char*>(pem_string.c_str()), -1); >- if (!bio) >+ if (!bio) { > return nullptr; >+ } >+ > BIO_set_mem_eof_return(bio, 0); > X509* x509 = > PEM_read_bio_X509(bio, nullptr, nullptr, const_cast<char*>("\0")); > BIO_free(bio); // Frees the BIO, but not the pointed-to string. 
> >- if (!x509) >+ if (!x509) { > return nullptr; >- >- OpenSSLCertificate* ret = new OpenSSLCertificate(x509); >+ } >+ auto ret = absl::make_unique<OpenSSLCertificate>(x509); > X509_free(x509); > return ret; > } >@@ -229,19 +220,16 @@ bool OpenSSLCertificate::ComputeDigest(const X509* x509, > unsigned char* digest, > size_t size, > size_t* length) { >- const EVP_MD* md; >- unsigned int n; >- >- if (!OpenSSLDigest::GetDigestEVP(algorithm, &md)) >+ const EVP_MD* md = nullptr; >+ unsigned int n = 0; >+ if (!OpenSSLDigest::GetDigestEVP(algorithm, &md)) { > return false; >- >- if (size < static_cast<size_t>(EVP_MD_size(md))) >+ } >+ if (size < static_cast<size_t>(EVP_MD_size(md))) { > return false; >- >+ } > X509_digest(x509, md, digest, &n); >- > *length = n; >- > return true; > } > >@@ -249,18 +237,18 @@ OpenSSLCertificate::~OpenSSLCertificate() { > X509_free(x509_); > } > >-OpenSSLCertificate* OpenSSLCertificate::GetReference() const { >- return new OpenSSLCertificate(x509_); >+std::unique_ptr<SSLCertificate> OpenSSLCertificate::Clone() const { >+ return absl::make_unique<OpenSSLCertificate>(x509_); > } > > std::string OpenSSLCertificate::ToPEMString() const { > BIO* bio = BIO_new(BIO_s_mem()); > if (!bio) { >- RTC_FATAL() << "unreachable code"; >+ RTC_FATAL() << "Unreachable code."; > } > if (!PEM_write_bio_X509(bio, x509_)) { > BIO_free(bio); >- RTC_FATAL() << "unreachable code"; >+ RTC_FATAL() << "Unreachable code."; > } > BIO_write(bio, "\0", 1); > char* buffer; >@@ -273,27 +261,21 @@ std::string OpenSSLCertificate::ToPEMString() const { > void OpenSSLCertificate::ToDER(Buffer* der_buffer) const { > // In case of failure, make sure to leave the buffer empty. > der_buffer->SetSize(0); >- > // Calculates the DER representation of the certificate, from scratch. 
> BIO* bio = BIO_new(BIO_s_mem()); > if (!bio) { >- RTC_FATAL() << "unreachable code"; >+ RTC_FATAL() << "Unreachable code."; > } > if (!i2d_X509_bio(bio, x509_)) { > BIO_free(bio); >- RTC_FATAL() << "unreachable code"; >+ RTC_FATAL() << "Unreachable code."; > } >- char* data; >+ char* data = nullptr; > size_t length = BIO_get_mem_data(bio, &data); > der_buffer->SetData(data, length); > BIO_free(bio); > } > >-void OpenSSLCertificate::AddReference() const { >- RTC_DCHECK(x509_ != nullptr); >- X509_up_ref(x509_); >-} >- > bool OpenSSLCertificate::operator==(const OpenSSLCertificate& other) const { > return X509_cmp(x509_, other.x509_) == 0; > } >@@ -302,11 +284,9 @@ bool OpenSSLCertificate::operator!=(const OpenSSLCertificate& other) const { > return !(*this == other); > } > >-// Documented in sslidentity.h. > int64_t OpenSSLCertificate::CertificateExpirationTime() const { > ASN1_TIME* expire_time = X509_get_notAfter(x509_); > bool long_format; >- > if (expire_time->type == V_ASN1_UTCTIME) { > long_format = false; > } else if (expire_time->type == V_ASN1_GENERALIZEDTIME) { >@@ -314,7 +294,6 @@ int64_t OpenSSLCertificate::CertificateExpirationTime() const { > } else { > return -1; > } >- > return ASN1TimeToSec(expire_time->data, expire_time->length, long_format); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslcertificate.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslcertificate.h >index c730ffd0dc9e769c6bd0f97d4ee1e7bdc5429ff2..088725c0d755653d3b230365c56331cd042c069f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslcertificate.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslcertificate.h >@@ -11,38 +11,38 @@ > #ifndef RTC_BASE_OPENSSLCERTIFICATE_H_ > #define RTC_BASE_OPENSSLCERTIFICATE_H_ > >-#include <openssl/evp.h> >-#include <openssl/x509.h> >+#include <openssl/ossl_typ.h> > >-#include <memory> >+#include <stddef.h> >+#include <stdint.h> > #include <string> > >-#include 
"rtc_base/checks.h" >+#include "rtc_base/buffer.h" > #include "rtc_base/constructormagic.h" > #include "rtc_base/sslcertificate.h" > #include "rtc_base/sslidentity.h" > >-typedef struct ssl_ctx_st SSL_CTX; >- > namespace rtc { > > class OpenSSLKeyPair; > > // OpenSSLCertificate encapsulates an OpenSSL X509* certificate object, > // which is also reference counted inside the OpenSSL library. >-class OpenSSLCertificate : public SSLCertificate { >+class OpenSSLCertificate final : public SSLCertificate { > public: > // X509 object has its reference count incremented. So the caller and > // OpenSSLCertificate share ownership. > explicit OpenSSLCertificate(X509* x509); > >- static OpenSSLCertificate* Generate(OpenSSLKeyPair* key_pair, >- const SSLIdentityParams& params); >- static OpenSSLCertificate* FromPEMString(const std::string& pem_string); >+ static std::unique_ptr<OpenSSLCertificate> Generate( >+ OpenSSLKeyPair* key_pair, >+ const SSLIdentityParams& params); >+ static std::unique_ptr<OpenSSLCertificate> FromPEMString( >+ const std::string& pem_string); > > ~OpenSSLCertificate() override; > >- OpenSSLCertificate* GetReference() const override; >+ std::unique_ptr<SSLCertificate> Clone() const override; > > X509* x509() const { return x509_; } > >@@ -69,8 +69,6 @@ class OpenSSLCertificate : public SSLCertificate { > int64_t CertificateExpirationTime() const override; > > private: >- void AddReference() const; >- > X509* x509_; // NOT OWNED > RTC_DISALLOW_COPY_AND_ASSIGN(OpenSSLCertificate); > }; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/openssldigest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/openssldigest.cc >index 9b644c460b5a5834b839fe68f1f74be90c7c5c93..da90b65f8221ca96c6a03339f2887d70b82b5a64 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/openssldigest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/openssldigest.cc >@@ -10,7 +10,7 @@ > > #include "rtc_base/openssldigest.h" > >-#include 
"rtc_base/checks.h" >+#include "rtc_base/checks.h" // RTC_DCHECK, RTC_CHECK > #include "rtc_base/openssl.h" > > namespace rtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/openssldigest.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/openssldigest.h >index c4cd1e0314ed4e0abedcdbc0676c89006a1b6eb0..82dc9a96cd84bbaf0c7cdaf271ac22470ac58290 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/openssldigest.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/openssldigest.h >@@ -11,14 +11,16 @@ > #ifndef RTC_BASE_OPENSSLDIGEST_H_ > #define RTC_BASE_OPENSSLDIGEST_H_ > >-#include <openssl/evp.h> >+#include <openssl/base.h> >+#include <stddef.h> >+#include <string> > > #include "rtc_base/messagedigest.h" > > namespace rtc { > > // An implementation of the digest class that uses OpenSSL. >-class OpenSSLDigest : public MessageDigest { >+class OpenSSLDigest final : public MessageDigest { > public: > // Creates an OpenSSLDigest with |algorithm| as the hash algorithm. 
> explicit OpenSSLDigest(const std::string& algorithm); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslidentity.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslidentity.cc >index a8c6919779bede4790f9fdf0f115b572ced6e685..9850c855ea1679a413792690e65f99862c4d449b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslidentity.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslidentity.cc >@@ -21,18 +21,17 @@ > > #include <openssl/bio.h> > #include <openssl/bn.h> >-#include <openssl/crypto.h> > #include <openssl/err.h> > #include <openssl/pem.h> > #include <openssl/rsa.h> > >+#include <stdint.h> >+ > #include "absl/memory/memory.h" > #include "rtc_base/checks.h" >-#include "rtc_base/helpers.h" > #include "rtc_base/logging.h" > #include "rtc_base/numerics/safe_conversions.h" > #include "rtc_base/openssl.h" >-#include "rtc_base/openssldigest.h" > #include "rtc_base/opensslutility.h" > > namespace rtc { >@@ -316,7 +315,7 @@ const SSLCertChain& OpenSSLIdentity::cert_chain() const { > > OpenSSLIdentity* OpenSSLIdentity::GetReference() const { > return new OpenSSLIdentity(absl::WrapUnique(key_pair_->GetReference()), >- absl::WrapUnique(cert_chain_->Copy())); >+ cert_chain_->Clone()); > } > > bool OpenSSLIdentity::ConfigureIdentity(SSL_CTX* ctx) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslidentity.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslidentity.h >index 34044276ca7659bdd785545bd9bef11b8caf9c92..fcf7debf9072feb135f7472f24cd6b0c0cacec1c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslidentity.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslidentity.h >@@ -11,24 +11,23 @@ > #ifndef RTC_BASE_OPENSSLIDENTITY_H_ > #define RTC_BASE_OPENSSLIDENTITY_H_ > >-#include <openssl/evp.h> >-#include <openssl/x509.h> >+#include <openssl/ossl_typ.h> > >+#include <ctime> > #include <memory> > #include <string> > > #include 
"rtc_base/checks.h" > #include "rtc_base/constructormagic.h" > #include "rtc_base/opensslcertificate.h" >+#include "rtc_base/sslcertificate.h" > #include "rtc_base/sslidentity.h" > >-typedef struct ssl_ctx_st SSL_CTX; >- > namespace rtc { > > // OpenSSLKeyPair encapsulates an OpenSSL EVP_PKEY* keypair object, > // which is reference counted inside the OpenSSL library. >-class OpenSSLKeyPair { >+class OpenSSLKeyPair final { > public: > explicit OpenSSLKeyPair(EVP_PKEY* pkey) : pkey_(pkey) { > RTC_DCHECK(pkey_ != nullptr); >@@ -59,7 +58,7 @@ class OpenSSLKeyPair { > > // Holds a keypair and certificate together, and a method to generate > // them consistently. >-class OpenSSLIdentity : public SSLIdentity { >+class OpenSSLIdentity final : public SSLIdentity { > public: > static OpenSSLIdentity* GenerateWithExpiration(const std::string& common_name, > const KeyParams& key_params, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslstreamadapter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslstreamadapter.cc >index fd54a082a7d738583e884c888eee5082f9434f8e..727cb84aae95da375c2474a92fbcc8eb8d91d8c0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslstreamadapter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslstreamadapter.cc >@@ -22,6 +22,7 @@ > #endif > > #include <memory> >+#include <utility> > #include <vector> > > #include "rtc_base/checks.h" >@@ -31,6 +32,7 @@ > #include "rtc_base/openssladapter.h" > #include "rtc_base/openssldigest.h" > #include "rtc_base/opensslidentity.h" >+#include "rtc_base/sslcertificate.h" > #include "rtc_base/stream.h" > #include "rtc_base/stringutils.h" > #include "rtc_base/thread.h" >@@ -180,8 +182,9 @@ static BIO_METHOD* BIO_stream_method() { > > static BIO* BIO_new_stream(StreamInterface* stream) { > BIO* ret = BIO_new(BIO_stream_method()); >- if (ret == nullptr) >+ if (ret == nullptr) { > return nullptr; >+ } > BIO_set_data(ret, stream); > return ret; > } >@@ 
-196,14 +199,16 @@ static int stream_new(BIO* b) { > } > > static int stream_free(BIO* b) { >- if (b == nullptr) >+ if (b == nullptr) { > return 0; >+ } > return 1; > } > > static int stream_read(BIO* b, char* out, int outl) { >- if (!out) >+ if (!out) { > return -1; >+ } > StreamInterface* stream = static_cast<StreamInterface*>(BIO_get_data(b)); > BIO_clear_retry_flags(b); > size_t read; >@@ -218,8 +223,9 @@ static int stream_read(BIO* b, char* out, int outl) { > } > > static int stream_write(BIO* b, const char* in, int inl) { >- if (!in) >+ if (!in) { > return -1; >+ } > StreamInterface* stream = static_cast<StreamInterface*>(BIO_get_data(b)); > BIO_clear_retry_flags(b); > size_t written; >@@ -296,7 +302,7 @@ bool OpenSSLStreamAdapter::SetPeerCertificateDigest( > size_t digest_len, > SSLPeerCertificateDigestError* error) { > RTC_DCHECK(!peer_certificate_verified_); >- RTC_DCHECK(!has_peer_certificate_digest()); >+ RTC_DCHECK(!HasPeerCertificateDigest()); > size_t expected_len; > if (error) { > *error = SSLPeerCertificateDigestError::NONE; >@@ -362,8 +368,9 @@ std::string OpenSSLStreamAdapter::SslCipherSuiteToName(int cipher_suite) { > } > > bool OpenSSLStreamAdapter::GetSslCipherSuite(int* cipher_suite) { >- if (state_ != SSL_CONNECTED) >+ if (state_ != SSL_CONNECTED) { > return false; >+ } > > const SSL_CIPHER* current_cipher = SSL_get_current_cipher(ssl_); > if (current_cipher == nullptr) { >@@ -380,17 +387,19 @@ int OpenSSLStreamAdapter::GetSslVersion() const { > > int ssl_version = SSL_version(ssl_); > if (ssl_mode_ == SSL_MODE_DTLS) { >- if (ssl_version == DTLS1_VERSION) >+ if (ssl_version == DTLS1_VERSION) { > return SSL_PROTOCOL_DTLS_10; >- else if (ssl_version == DTLS1_2_VERSION) >+ } else if (ssl_version == DTLS1_2_VERSION) { > return SSL_PROTOCOL_DTLS_12; >+ } > } else { >- if (ssl_version == TLS1_VERSION) >+ if (ssl_version == TLS1_VERSION) { > return SSL_PROTOCOL_TLS_10; >- else if (ssl_version == TLS1_1_VERSION) >+ } else if (ssl_version == 
TLS1_1_VERSION) { > return SSL_PROTOCOL_TLS_11; >- else if (ssl_version == TLS1_2_VERSION) >+ } else if (ssl_version == TLS1_2_VERSION) { > return SSL_PROTOCOL_TLS_12; >+ } > } > > return -1; >@@ -403,15 +412,11 @@ bool OpenSSLStreamAdapter::ExportKeyingMaterial(const std::string& label, > bool use_context, > uint8_t* result, > size_t result_len) { >- int i; >- >- i = SSL_export_keying_material(ssl_, result, result_len, label.c_str(), >+ if (SSL_export_keying_material(ssl_, result, result_len, label.c_str(), > label.length(), const_cast<uint8_t*>(context), >- context_len, use_context); >- >- if (i != 1) >+ context_len, use_context) != 1) { > return false; >- >+ } > return true; > } > >@@ -419,8 +424,9 @@ bool OpenSSLStreamAdapter::SetDtlsSrtpCryptoSuites( > const std::vector<int>& ciphers) { > std::string internal_ciphers; > >- if (state_ != SSL_NONE) >+ if (state_ != SSL_NONE) { > return false; >+ } > > for (std::vector<int>::const_iterator cipher = ciphers.begin(); > cipher != ciphers.end(); ++cipher) { >@@ -429,8 +435,9 @@ bool OpenSSLStreamAdapter::SetDtlsSrtpCryptoSuites( > ++entry) { > if (*cipher == entry->id) { > found = true; >- if (!internal_ciphers.empty()) >+ if (!internal_ciphers.empty()) { > internal_ciphers += ":"; >+ } > internal_ciphers += entry->internal_name; > break; > } >@@ -442,8 +449,9 @@ bool OpenSSLStreamAdapter::SetDtlsSrtpCryptoSuites( > } > } > >- if (internal_ciphers.empty()) >+ if (internal_ciphers.empty()) { > return false; >+ } > > srtp_ciphers_ = internal_ciphers; > return true; >@@ -451,14 +459,16 @@ bool OpenSSLStreamAdapter::SetDtlsSrtpCryptoSuites( > > bool OpenSSLStreamAdapter::GetDtlsSrtpCryptoSuite(int* crypto_suite) { > RTC_DCHECK(state_ == SSL_CONNECTED); >- if (state_ != SSL_CONNECTED) >+ if (state_ != SSL_CONNECTED) { > return false; >+ } > > const SRTP_PROTECTION_PROFILE* srtp_profile = > SSL_get_selected_srtp_profile(ssl_); > >- if (!srtp_profile) >+ if (!srtp_profile) { > return false; >+ } > > *crypto_suite = 
srtp_profile->id; > RTC_DCHECK(!SrtpCryptoSuiteToName(*crypto_suite).empty()); >@@ -470,8 +480,8 @@ bool OpenSSLStreamAdapter::IsTlsConnected() { > } > > int OpenSSLStreamAdapter::StartSSL() { >+ // Don't allow StartSSL to be called twice. > if (state_ != SSL_NONE) { >- // Don't allow StartSSL to be called twice. > return -1; > } > >@@ -524,7 +534,7 @@ StreamResult OpenSSLStreamAdapter::Write(const void* data, > return SR_BLOCK; > > case SSL_CONNECTED: >- if (waiting_to_verify_peer_certificate()) { >+ if (WaitingToVerifyPeerCertificate()) { > return SR_BLOCK; > } > break; >@@ -532,15 +542,17 @@ StreamResult OpenSSLStreamAdapter::Write(const void* data, > case SSL_ERROR: > case SSL_CLOSED: > default: >- if (error) >+ if (error) { > *error = ssl_error_code_; >+ } > return SR_ERROR; > } > > // OpenSSL will return an error if we try to write zero bytes > if (data_len == 0) { >- if (written) >+ if (written) { > *written = 0; >+ } > return SR_SUCCESS; > } > >@@ -567,8 +579,9 @@ StreamResult OpenSSLStreamAdapter::Write(const void* data, > case SSL_ERROR_ZERO_RETURN: > default: > Error("SSL_write", (ssl_error ? 
ssl_error : -1), 0, false); >- if (error) >+ if (error) { > *error = ssl_error_code_; >+ } > return SR_ERROR; > } > // not reached >@@ -583,31 +596,29 @@ StreamResult OpenSSLStreamAdapter::Read(void* data, > case SSL_NONE: > // pass-through in clear text > return StreamAdapterInterface::Read(data, data_len, read, error); >- > case SSL_WAIT: > case SSL_CONNECTING: > return SR_BLOCK; >- > case SSL_CONNECTED: >- if (waiting_to_verify_peer_certificate()) { >+ if (WaitingToVerifyPeerCertificate()) { > return SR_BLOCK; > } > break; >- > case SSL_CLOSED: > return SR_EOS; >- > case SSL_ERROR: > default: >- if (error) >+ if (error) { > *error = ssl_error_code_; >+ } > return SR_ERROR; > } > > // Don't trust OpenSSL with zero byte reads > if (data_len == 0) { >- if (read) >+ if (read) { > *read = 0; >+ } > return SR_SUCCESS; > } > >@@ -620,8 +631,9 @@ StreamResult OpenSSLStreamAdapter::Read(void* data, > RTC_LOG(LS_VERBOSE) << " -- success"; > RTC_DCHECK_GT(code, 0); > RTC_DCHECK_LE(code, data_len); >- if (read) >+ if (read) { > *read = code; >+ } > > if (ssl_mode_ == SSL_MODE_DTLS) { > // Enforce atomic reads -- this is a short read >@@ -630,8 +642,9 @@ StreamResult OpenSSLStreamAdapter::Read(void* data, > if (pending) { > RTC_LOG(LS_INFO) << " -- short DTLS read. flushing"; > FlushInput(pending); >- if (error) >+ if (error) { > *error = SSE_MSG_TRUNC; >+ } > return SR_ERROR; > } > } >@@ -650,8 +663,9 @@ StreamResult OpenSSLStreamAdapter::Read(void* data, > break; > default: > Error("SSL_read", (ssl_error ? 
ssl_error : -1), 0, false); >- if (error) >+ if (error) { > *error = ssl_error_code_; >+ } > return SR_ERROR; > } > // not reached >@@ -694,13 +708,13 @@ StreamState OpenSSLStreamAdapter::GetState() const { > case SSL_CONNECTING: > return SS_OPENING; > case SSL_CONNECTED: >- if (waiting_to_verify_peer_certificate()) { >+ if (WaitingToVerifyPeerCertificate()) { > return SS_OPENING; > } > return SS_OPEN; > default: > return SS_CLOSED; >- }; >+ } > // not reached > } > >@@ -710,6 +724,7 @@ void OpenSSLStreamAdapter::OnEvent(StreamInterface* stream, > int events_to_signal = 0; > int signal_error = 0; > RTC_DCHECK(stream == this->stream()); >+ > if ((events & SE_OPEN)) { > RTC_LOG(LS_VERBOSE) << "OpenSSLStreamAdapter::OnEvent SE_OPEN"; > if (state_ != SSL_WAIT) { >@@ -723,6 +738,7 @@ void OpenSSLStreamAdapter::OnEvent(StreamInterface* stream, > } > } > } >+ > if ((events & (SE_READ | SE_WRITE))) { > RTC_LOG(LS_VERBOSE) << "OpenSSLStreamAdapter::OnEvent" > << ((events & SE_READ) ? " SE_READ" : "") >@@ -747,6 +763,7 @@ void OpenSSLStreamAdapter::OnEvent(StreamInterface* stream, > } > } > } >+ > if ((events & SE_CLOSE)) { > RTC_LOG(LS_VERBOSE) << "OpenSSLStreamAdapter::OnEvent(SE_CLOSE, " << err > << ")"; >@@ -756,8 +773,10 @@ void OpenSSLStreamAdapter::OnEvent(StreamInterface* stream, > RTC_DCHECK(signal_error == 0); > signal_error = err; > } >- if (events_to_signal) >+ >+ if (events_to_signal) { > StreamAdapterInterface::OnEvent(stream, events_to_signal, signal_error); >+ } > } > > int OpenSSLStreamAdapter::BeginSSL() { >@@ -770,12 +789,14 @@ int OpenSSLStreamAdapter::BeginSSL() { > // First set up the context. > RTC_DCHECK(ssl_ctx_ == nullptr); > ssl_ctx_ = SetupSSLContext(); >- if (!ssl_ctx_) >+ if (!ssl_ctx_) { > return -1; >+ } > > bio = BIO_new_stream(static_cast<StreamInterface*>(stream())); >- if (!bio) >+ if (!bio) { > return -1; >+ } > > ssl_ = SSL_new(ssl_ctx_); > if (!ssl_) { >@@ -805,8 +826,9 @@ int OpenSSLStreamAdapter::BeginSSL() { > // commonly supported. 
BoringSSL doesn't need explicit configuration and has > // a reasonable default set. > EC_KEY* ecdh = EC_KEY_new_by_curve_name(NID_X9_62_prime256v1); >- if (ecdh == nullptr) >+ if (ecdh == nullptr) { > return -1; >+ } > SSL_set_options(ssl_, SSL_OP_SINGLE_ECDH_USE); > SSL_set_tmp_ecdh(ssl_, ecdh); > EC_KEY_free(ecdh); >@@ -830,10 +852,10 @@ int OpenSSLStreamAdapter::ContinueSSL() { > RTC_LOG(LS_VERBOSE) << " -- success"; > // By this point, OpenSSL should have given us a certificate, or errored > // out if one was missing. >- RTC_DCHECK(peer_cert_chain_ || !client_auth_enabled()); >+ RTC_DCHECK(peer_cert_chain_ || !GetClientAuthEnabled()); > > state_ = SSL_CONNECTED; >- if (!waiting_to_verify_peer_certificate()) { >+ if (!WaitingToVerifyPeerCertificate()) { > // We have everything we need to start the connection, so signal > // SE_OPEN. If we need a client certificate fingerprint and don't have > // it yet, we'll instead signal SE_OPEN in SetPeerCertificateDigest. >@@ -886,8 +908,9 @@ void OpenSSLStreamAdapter::Error(const char* context, > state_ = SSL_ERROR; > ssl_error_code_ = err; > Cleanup(alert); >- if (signal) >+ if (signal) { > StreamAdapterInterface::OnEvent(stream(), SE_CLOSE, err); >+ } > } > > void OpenSSLStreamAdapter::Cleanup(uint8_t alert) { >@@ -990,8 +1013,9 @@ SSL_CTX* OpenSSLStreamAdapter::SetupSSLContext() { > ctx = SSL_CTX_new(method); > #endif // OPENSSL_IS_BORINGSSL > >- if (ctx == nullptr) >+ if (ctx == nullptr) { > return nullptr; >+ } > > #ifdef OPENSSL_IS_BORINGSSL > SSL_CTX_set_min_proto_version( >@@ -1026,7 +1050,7 @@ SSL_CTX* OpenSSLStreamAdapter::SetupSSLContext() { > #endif > > int mode = SSL_VERIFY_PEER; >- if (client_auth_enabled()) { >+ if (GetClientAuthEnabled()) { > // Require a certificate from the client. > // Note: Normally this is always true in production, but it may be disabled > // for testing purposes (e.g. SSLAdapter unit tests). 
>@@ -1058,7 +1082,7 @@ SSL_CTX* OpenSSLStreamAdapter::SetupSSLContext() { > } > > bool OpenSSLStreamAdapter::VerifyPeerCertificate() { >- if (!has_peer_certificate_digest() || !peer_cert_chain_ || >+ if (!HasPeerCertificateDigest() || !peer_cert_chain_ || > !peer_cert_chain_->GetSize()) { > RTC_LOG(LS_WARNING) << "Missing digest or peer certificate."; > return false; >@@ -1091,7 +1115,7 @@ bool OpenSSLStreamAdapter::VerifyPeerCertificate() { > > std::unique_ptr<SSLCertChain> OpenSSLStreamAdapter::GetPeerSSLCertChain() > const { >- return peer_cert_chain_ ? peer_cert_chain_->UniqueCopy() : nullptr; >+ return peer_cert_chain_ ? peer_cert_chain_->Clone() : nullptr; > } > > int OpenSSLStreamAdapter::SSLVerifyCallback(X509_STORE_CTX* store, void* arg) { >@@ -1176,15 +1200,17 @@ static const cipher_list OK_ECDSA_ciphers[] = { > bool OpenSSLStreamAdapter::IsAcceptableCipher(int cipher, KeyType key_type) { > if (key_type == KT_RSA) { > for (const cipher_list& c : OK_RSA_ciphers) { >- if (cipher == c.cipher) >+ if (cipher == c.cipher) { > return true; >+ } > } > } > > if (key_type == KT_ECDSA) { > for (const cipher_list& c : OK_ECDSA_ciphers) { >- if (cipher == c.cipher) >+ if (cipher == c.cipher) { > return true; >+ } > } > } > >@@ -1195,22 +1221,24 @@ bool OpenSSLStreamAdapter::IsAcceptableCipher(const std::string& cipher, > KeyType key_type) { > if (key_type == KT_RSA) { > for (const cipher_list& c : OK_RSA_ciphers) { >- if (cipher == c.cipher_str) >+ if (cipher == c.cipher_str) { > return true; >+ } > } > } > > if (key_type == KT_ECDSA) { > for (const cipher_list& c : OK_ECDSA_ciphers) { >- if (cipher == c.cipher_str) >+ if (cipher == c.cipher_str) { > return true; >+ } > } > } > > return false; > } > >-void OpenSSLStreamAdapter::enable_time_callback_for_testing() { >+void OpenSSLStreamAdapter::EnableTimeCallbackForTesting() { > g_use_time_callback_for_testing = true; > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslstreamadapter.h 
b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslstreamadapter.h >index 61ffc3dbfe2d6538bd3031fc970e752cea4f43ff..e012d172a62a0b4595c905cc61281c4989756ef3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslstreamadapter.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslstreamadapter.h >@@ -13,13 +13,18 @@ > > #include <openssl/ossl_typ.h> > >+#include <stddef.h> >+#include <stdint.h> > #include <memory> > #include <string> > #include <vector> > > #include "rtc_base/buffer.h" >+#include "rtc_base/messagequeue.h" > #include "rtc_base/opensslidentity.h" >+#include "rtc_base/sslidentity.h" > #include "rtc_base/sslstreamadapter.h" >+#include "rtc_base/stream.h" > > namespace rtc { > >@@ -47,11 +52,11 @@ namespace rtc { > > // Look in sslstreamadapter.h for documentation of the methods. > >-class OpenSSLIdentity; >+class SSLCertChain; > > /////////////////////////////////////////////////////////////////////////////// > >-class OpenSSLStreamAdapter : public SSLStreamAdapter { >+class OpenSSLStreamAdapter final : public SSLStreamAdapter { > public: > explicit OpenSSLStreamAdapter(StreamInterface* stream); > ~OpenSSLStreamAdapter() override; >@@ -115,7 +120,7 @@ class OpenSSLStreamAdapter : public SSLStreamAdapter { > > // Use our timeutils.h source of timing in BoringSSL, allowing us to test > // using a fake clock. >- static void enable_time_callback_for_testing(); >+ static void EnableTimeCallbackForTesting(); > > protected: > void OnEvent(StreamInterface* stream, int events, int err) override; >@@ -170,11 +175,11 @@ class OpenSSLStreamAdapter : public SSLStreamAdapter { > // SSL_CTX_set_cert_verify_callback. 
> static int SSLVerifyCallback(X509_STORE_CTX* store, void* arg); > >- bool waiting_to_verify_peer_certificate() const { >- return client_auth_enabled() && !peer_certificate_verified_; >+ bool WaitingToVerifyPeerCertificate() const { >+ return GetClientAuthEnabled() && !peer_certificate_verified_; > } > >- bool has_peer_certificate_digest() const { >+ bool HasPeerCertificateDigest() const { > return !peer_certificate_digest_algorithm_.empty() && > !peer_certificate_digest_value_.empty(); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslutility.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslutility.cc >index 46f4547436e1158a7b80308043f8f677d25e19e8..a3f33478ebc7b05100b73cb304281a0246b9256b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslutility.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslutility.cc >@@ -9,29 +9,21 @@ > */ > > #include "rtc_base/opensslutility.h" >- >-#include <memory> >- >-#if defined(WEBRTC_POSIX) >-#include <unistd.h> >-#endif >- > #if defined(WEBRTC_WIN) > // Must be included first before openssl headers. 
> #include "rtc_base/win32.h" // NOLINT > #endif // WEBRTC_WIN > >-#include <openssl/bio.h> >-#include <openssl/crypto.h> > #include <openssl/err.h> > #include <openssl/x509.h> > #include <openssl/x509v3.h> >+#include "rtc_base/openssl.h" >+ >+#include <stddef.h> > > #include "rtc_base/arraysize.h" >-#include "rtc_base/checks.h" > #include "rtc_base/logging.h" > #include "rtc_base/numerics/safe_conversions.h" >-#include "rtc_base/openssl.h" > #include "rtc_base/opensslcertificate.h" > #ifndef WEBRTC_EXCLUDE_BUILT_IN_SSL_ROOT_CERTS > #include "rtc_base/sslroots.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslutility.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslutility.h >index 7cb38b5b52763a51ddb40258936a656bd6721f9a..77ed0b1e7cca1cb7968a681acbf065a9fa3c3298 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslutility.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslutility.h >@@ -13,7 +13,6 @@ > > #include <openssl/ossl_typ.h> > #include <string> >-#include "rtc_base/sslcertificate.h" > > namespace rtc { > // The openssl namespace holds static helper methods. All methods related >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/optionsfile.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/optionsfile.cc >deleted file mode 100644 >index 535859b4cc075364b4bc62d1f85ec33de51c0c14..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/optionsfile.cc >+++ /dev/null >@@ -1,181 +0,0 @@ >-/* >- * Copyright 2008 The WebRTC Project Authors. All rights reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include "rtc_base/optionsfile.h" >- >-#include <ctype.h> >- >-#include "rtc_base/logging.h" >-#include "rtc_base/stream.h" >-#include "rtc_base/stringencode.h" >- >-namespace rtc { >- >-OptionsFile::OptionsFile(const std::string& path) : path_(path) {} >- >-OptionsFile::~OptionsFile() = default; >- >-bool OptionsFile::Load() { >- options_.clear(); >- // Open file. >- FileStream stream; >- int err; >- if (!stream.Open(path_, "r", &err)) { >- RTC_LOG_F(LS_WARNING) << "Could not open file, err=" << err; >- // We do not consider this an error because we expect there to be no file >- // until the user saves a setting. >- return true; >- } >- // Read in all its data. >- std::string line; >- StreamResult res; >- for (;;) { >- res = stream.ReadLine(&line); >- if (res != SR_SUCCESS) { >- break; >- } >- size_t equals_pos = line.find('='); >- if (equals_pos == std::string::npos) { >- // We do not consider this an error. Instead we ignore the line and >- // keep going. >- RTC_LOG_F(LS_WARNING) << "Ignoring malformed line in " << path_; >- continue; >- } >- std::string key(line, 0, equals_pos); >- std::string value(line, equals_pos + 1, line.length() - (equals_pos + 1)); >- options_[key] = value; >- } >- if (res != SR_EOS) { >- RTC_LOG_F(LS_ERROR) << "Error when reading from file"; >- return false; >- } else { >- return true; >- } >-} >- >-bool OptionsFile::Save() { >- // Open file. >- FileStream stream; >- int err; >- if (!stream.Open(path_, "w", &err)) { >- RTC_LOG_F(LS_ERROR) << "Could not open file, err=" << err; >- return false; >- } >- // Write out all the data. 
>- StreamResult res = SR_SUCCESS; >- size_t written; >- int error; >- for (OptionsMap::const_iterator i = options_.begin(); i != options_.end(); >- ++i) { >- res = >- stream.WriteAll(i->first.c_str(), i->first.length(), &written, &error); >- if (res != SR_SUCCESS) { >- break; >- } >- res = stream.WriteAll("=", 1, &written, &error); >- if (res != SR_SUCCESS) { >- break; >- } >- res = stream.WriteAll(i->second.c_str(), i->second.length(), &written, >- &error); >- if (res != SR_SUCCESS) { >- break; >- } >- res = stream.WriteAll("\n", 1, &written, &error); >- if (res != SR_SUCCESS) { >- break; >- } >- } >- if (res != SR_SUCCESS) { >- RTC_LOG_F(LS_ERROR) << "Unable to write to file"; >- return false; >- } else { >- return true; >- } >-} >- >-bool OptionsFile::IsLegalName(const std::string& name) { >- for (size_t pos = 0; pos < name.length(); ++pos) { >- if (name[pos] == '\n' || name[pos] == '\\' || name[pos] == '=') { >- // Illegal character. >- RTC_LOG(LS_WARNING) << "Ignoring operation for illegal option " << name; >- return false; >- } >- } >- return true; >-} >- >-bool OptionsFile::IsLegalValue(const std::string& value) { >- for (size_t pos = 0; pos < value.length(); ++pos) { >- if (value[pos] == '\n' || value[pos] == '\\') { >- // Illegal character. 
>- RTC_LOG(LS_WARNING) << "Ignoring operation for illegal value " << value; >- return false; >- } >- } >- return true; >-} >- >-bool OptionsFile::GetStringValue(const std::string& option, >- std::string* out_val) const { >- RTC_LOG(LS_VERBOSE) << "OptionsFile::GetStringValue " << option; >- if (!IsLegalName(option)) { >- return false; >- } >- OptionsMap::const_iterator i = options_.find(option); >- if (i == options_.end()) { >- return false; >- } >- *out_val = i->second; >- return true; >-} >- >-bool OptionsFile::GetIntValue(const std::string& option, int* out_val) const { >- RTC_LOG(LS_VERBOSE) << "OptionsFile::GetIntValue " << option; >- if (!IsLegalName(option)) { >- return false; >- } >- OptionsMap::const_iterator i = options_.find(option); >- if (i == options_.end()) { >- return false; >- } >- return FromString(i->second, out_val); >-} >- >-bool OptionsFile::SetStringValue(const std::string& option, >- const std::string& value) { >- RTC_LOG(LS_VERBOSE) << "OptionsFile::SetStringValue " << option << ":" >- << value; >- if (!IsLegalName(option) || !IsLegalValue(value)) { >- return false; >- } >- options_[option] = value; >- return true; >-} >- >-bool OptionsFile::SetIntValue(const std::string& option, int value) { >- RTC_LOG(LS_VERBOSE) << "OptionsFile::SetIntValue " << option << ":" << value; >- if (!IsLegalName(option)) { >- return false; >- } >- options_[option] = ToString(value); >- return true; >-} >- >-bool OptionsFile::RemoveValue(const std::string& option) { >- RTC_LOG(LS_VERBOSE) << "OptionsFile::RemoveValue " << option; >- if (!IsLegalName(option)) { >- return false; >- } >- options_.erase(option); >- return true; >-} >- >-} // namespace rtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/optionsfile.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/optionsfile.h >deleted file mode 100644 >index 55660ff2598be3decb2e56962e55e853be15e2b2..0000000000000000000000000000000000000000 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/optionsfile.h >+++ /dev/null >@@ -1,50 +0,0 @@ >-/* >- * Copyright 2008 The WebRTC Project Authors. All rights reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef RTC_BASE_OPTIONSFILE_H_ >-#define RTC_BASE_OPTIONSFILE_H_ >- >-#include <map> >-#include <string> >- >-namespace rtc { >- >-// Implements storage of simple options in a text file on disk. This is >-// cross-platform, but it is intended mostly for Linux where there is no >-// first-class options storage system. >-class OptionsFile { >- public: >- OptionsFile(const std::string& path); >- ~OptionsFile(); >- >- // Loads the file from disk, overwriting the in-memory values. >- bool Load(); >- // Saves the contents in memory, overwriting the on-disk values. 
>- bool Save(); >- >- bool GetStringValue(const std::string& option, std::string* out_val) const; >- bool GetIntValue(const std::string& option, int* out_val) const; >- bool SetStringValue(const std::string& option, const std::string& val); >- bool SetIntValue(const std::string& option, int val); >- bool RemoveValue(const std::string& option); >- >- private: >- typedef std::map<std::string, std::string> OptionsMap; >- >- static bool IsLegalName(const std::string& name); >- static bool IsLegalValue(const std::string& value); >- >- std::string path_; >- OptionsMap options_; >-}; >- >-} // namespace rtc >- >-#endif // RTC_BASE_OPTIONSFILE_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/optionsfile_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/optionsfile_unittest.cc >deleted file mode 100644 >index fc5bc823ff4158f72db2c506e26f7b89d01d457a..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/optionsfile_unittest.cc >+++ /dev/null >@@ -1,178 +0,0 @@ >-/* >- * Copyright 2008 The WebRTC Project Authors. All rights reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include <memory> >- >-#include "rtc_base/checks.h" >-#include "rtc_base/gunit.h" >-#include "rtc_base/optionsfile.h" >-#include "test/testsupport/fileutils.h" >- >-namespace rtc { >- >-static const std::string kTestOptionA = "test-option-a"; >-static const std::string kTestOptionB = "test-option-b"; >-static const std::string kTestString1 = "a string"; >-static const std::string kTestString2 = "different string"; >-static const std::string kOptionWithEquals = "foo=bar"; >-static const std::string kOptionWithNewline = "foo\nbar"; >-static const std::string kValueWithEquals = "baz=quux"; >-static const std::string kValueWithNewline = "baz\nquux"; >-static const std::string kEmptyString = ""; >-static const char kOptionWithUtf8[] = {'O', 'p', 't', '\302', '\256', >- 'i', 'o', 'n', '\342', '\204', >- '\242', '\0'}; // Opt(R)io(TM). >-static const char kValueWithUtf8[] = { >- 'V', 'a', 'l', '\302', '\256', 'v', >- 'e', '\342', '\204', '\242', '\0'}; // Val(R)ue(TM). >-static int kTestInt1 = 12345; >-static int kTestInt2 = 67890; >-static int kNegInt = -634; >-static int kZero = 0; >- >-#if defined(WEBRTC_ANDROID) >-// Fails on Android: https://bugs.chromium.org/p/webrtc/issues/detail?id=4364. >-#define MAYBE_OptionsFileTest DISABLED_OptionsFileTest >-#else >-#define MAYBE_OptionsFileTest OptionsFileTest >-#endif >- >-class MAYBE_OptionsFileTest : public testing::Test { >- public: >- MAYBE_OptionsFileTest() { >- test_file_ = >- webrtc::test::TempFilename(webrtc::test::OutputPath(), ".testfile"); >- OpenStore(); >- } >- >- ~MAYBE_OptionsFileTest() override { webrtc::test::RemoveFile(test_file_); } >- >- protected: >- void OpenStore() { store_.reset(new OptionsFile(test_file_)); } >- >- std::unique_ptr<OptionsFile> store_; >- >- private: >- std::string test_file_; >-}; >- >-TEST_F(MAYBE_OptionsFileTest, GetSetString) { >- // Clear contents of the file on disk. 
>- EXPECT_TRUE(store_->Save()); >- std::string out1, out2; >- EXPECT_FALSE(store_->GetStringValue(kTestOptionA, &out1)); >- EXPECT_FALSE(store_->GetStringValue(kTestOptionB, &out2)); >- EXPECT_TRUE(store_->SetStringValue(kTestOptionA, kTestString1)); >- EXPECT_TRUE(store_->Save()); >- EXPECT_TRUE(store_->Load()); >- EXPECT_TRUE(store_->SetStringValue(kTestOptionB, kTestString2)); >- EXPECT_TRUE(store_->Save()); >- EXPECT_TRUE(store_->Load()); >- EXPECT_TRUE(store_->GetStringValue(kTestOptionA, &out1)); >- EXPECT_TRUE(store_->GetStringValue(kTestOptionB, &out2)); >- EXPECT_EQ(kTestString1, out1); >- EXPECT_EQ(kTestString2, out2); >- EXPECT_TRUE(store_->RemoveValue(kTestOptionA)); >- EXPECT_TRUE(store_->Save()); >- EXPECT_TRUE(store_->Load()); >- EXPECT_TRUE(store_->RemoveValue(kTestOptionB)); >- EXPECT_TRUE(store_->Save()); >- EXPECT_TRUE(store_->Load()); >- EXPECT_FALSE(store_->GetStringValue(kTestOptionA, &out1)); >- EXPECT_FALSE(store_->GetStringValue(kTestOptionB, &out2)); >-} >- >-TEST_F(MAYBE_OptionsFileTest, GetSetInt) { >- // Clear contents of the file on disk. 
>- EXPECT_TRUE(store_->Save()); >- int out1, out2; >- EXPECT_FALSE(store_->GetIntValue(kTestOptionA, &out1)); >- EXPECT_FALSE(store_->GetIntValue(kTestOptionB, &out2)); >- EXPECT_TRUE(store_->SetIntValue(kTestOptionA, kTestInt1)); >- EXPECT_TRUE(store_->Save()); >- EXPECT_TRUE(store_->Load()); >- EXPECT_TRUE(store_->SetIntValue(kTestOptionB, kTestInt2)); >- EXPECT_TRUE(store_->Save()); >- EXPECT_TRUE(store_->Load()); >- EXPECT_TRUE(store_->GetIntValue(kTestOptionA, &out1)); >- EXPECT_TRUE(store_->GetIntValue(kTestOptionB, &out2)); >- EXPECT_EQ(kTestInt1, out1); >- EXPECT_EQ(kTestInt2, out2); >- EXPECT_TRUE(store_->RemoveValue(kTestOptionA)); >- EXPECT_TRUE(store_->Save()); >- EXPECT_TRUE(store_->Load()); >- EXPECT_TRUE(store_->RemoveValue(kTestOptionB)); >- EXPECT_TRUE(store_->Save()); >- EXPECT_TRUE(store_->Load()); >- EXPECT_FALSE(store_->GetIntValue(kTestOptionA, &out1)); >- EXPECT_FALSE(store_->GetIntValue(kTestOptionB, &out2)); >- EXPECT_TRUE(store_->SetIntValue(kTestOptionA, kNegInt)); >- EXPECT_TRUE(store_->GetIntValue(kTestOptionA, &out1)); >- EXPECT_EQ(kNegInt, out1); >- EXPECT_TRUE(store_->SetIntValue(kTestOptionA, kZero)); >- EXPECT_TRUE(store_->GetIntValue(kTestOptionA, &out1)); >- EXPECT_EQ(kZero, out1); >-} >- >-TEST_F(MAYBE_OptionsFileTest, Persist) { >- // Clear contents of the file on disk. >- EXPECT_TRUE(store_->Save()); >- EXPECT_TRUE(store_->SetStringValue(kTestOptionA, kTestString1)); >- EXPECT_TRUE(store_->SetIntValue(kTestOptionB, kNegInt)); >- EXPECT_TRUE(store_->Save()); >- >- // Load the saved contents from above. >- OpenStore(); >- EXPECT_TRUE(store_->Load()); >- std::string out1; >- int out2; >- EXPECT_TRUE(store_->GetStringValue(kTestOptionA, &out1)); >- EXPECT_TRUE(store_->GetIntValue(kTestOptionB, &out2)); >- EXPECT_EQ(kTestString1, out1); >- EXPECT_EQ(kNegInt, out2); >-} >- >-TEST_F(MAYBE_OptionsFileTest, SpecialCharacters) { >- // Clear contents of the file on disk. 
>- EXPECT_TRUE(store_->Save()); >- std::string out; >- EXPECT_FALSE(store_->SetStringValue(kOptionWithEquals, kTestString1)); >- EXPECT_FALSE(store_->GetStringValue(kOptionWithEquals, &out)); >- EXPECT_FALSE(store_->SetStringValue(kOptionWithNewline, kTestString1)); >- EXPECT_FALSE(store_->GetStringValue(kOptionWithNewline, &out)); >- EXPECT_TRUE(store_->SetStringValue(kOptionWithUtf8, kValueWithUtf8)); >- EXPECT_TRUE(store_->SetStringValue(kTestOptionA, kTestString1)); >- EXPECT_TRUE(store_->Save()); >- EXPECT_TRUE(store_->Load()); >- EXPECT_TRUE(store_->GetStringValue(kTestOptionA, &out)); >- EXPECT_EQ(kTestString1, out); >- EXPECT_TRUE(store_->GetStringValue(kOptionWithUtf8, &out)); >- EXPECT_EQ(kValueWithUtf8, out); >- EXPECT_FALSE(store_->SetStringValue(kTestOptionA, kValueWithNewline)); >- EXPECT_TRUE(store_->GetStringValue(kTestOptionA, &out)); >- EXPECT_EQ(kTestString1, out); >- EXPECT_TRUE(store_->SetStringValue(kTestOptionA, kValueWithEquals)); >- EXPECT_TRUE(store_->Save()); >- EXPECT_TRUE(store_->Load()); >- EXPECT_TRUE(store_->GetStringValue(kTestOptionA, &out)); >- EXPECT_EQ(kValueWithEquals, out); >- EXPECT_TRUE(store_->SetStringValue(kEmptyString, kTestString2)); >- EXPECT_TRUE(store_->Save()); >- EXPECT_TRUE(store_->Load()); >- EXPECT_TRUE(store_->GetStringValue(kEmptyString, &out)); >- EXPECT_EQ(kTestString2, out); >- EXPECT_TRUE(store_->SetStringValue(kTestOptionB, kEmptyString)); >- EXPECT_TRUE(store_->Save()); >- EXPECT_TRUE(store_->Load()); >- EXPECT_TRUE(store_->GetStringValue(kTestOptionB, &out)); >- EXPECT_EQ(kEmptyString, out); >-} >- >-} // namespace rtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/pathutils.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/pathutils.cc >deleted file mode 100644 >index 594deb7f0fc132fa7c436ec1742e6829892a1a11..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/pathutils.cc >+++ /dev/null >@@ -1,155 +0,0 @@ >-/* >- * Copyright 2004 
The WebRTC Project Authors. All rights reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#if defined(WEBRTC_WIN) >-#include <windows.h> >-#include <shellapi.h> >-#include <shlobj.h> >-#include <tchar.h> >-#endif // WEBRTC_WIN >- >-#include "rtc_base/checks.h" >-#include "rtc_base/logging.h" >-#include "rtc_base/pathutils.h" >-#include "rtc_base/stringutils.h" >- >-namespace rtc { >- >-static const char EMPTY_STR[] = ""; >- >-// EXT_DELIM separates a file basename from extension >-const char EXT_DELIM = '.'; >- >-// FOLDER_DELIMS separate folder segments and the filename >-const char* const FOLDER_DELIMS = "/\\"; >- >-// DEFAULT_FOLDER_DELIM is the preferred delimiter for this platform >-#ifdef WEBRTC_WIN >-const char DEFAULT_FOLDER_DELIM = '\\'; >-#else // !WEBRTC_WIN >-const char DEFAULT_FOLDER_DELIM = '/'; >-#endif // !WEBRTC_WIN >- >-/////////////////////////////////////////////////////////////////////////////// >-// Pathname - parsing of pathnames into components, and vice versa >-/////////////////////////////////////////////////////////////////////////////// >- >-bool Pathname::IsFolderDelimiter(char ch) { >- return (nullptr != ::strchr(FOLDER_DELIMS, ch)); >-} >- >-char Pathname::DefaultFolderDelimiter() { >- return DEFAULT_FOLDER_DELIM; >-} >- >-Pathname::Pathname() >- : folder_delimiter_(DEFAULT_FOLDER_DELIM) { >-} >- >-Pathname::Pathname(const Pathname&) = default; >-Pathname::Pathname(Pathname&&) = default; >- >-Pathname::Pathname(const std::string& pathname) >- : folder_delimiter_(DEFAULT_FOLDER_DELIM) { >- SetPathname(pathname); >-} >- >-Pathname::Pathname(const std::string& folder, const std::string& filename) >- : 
folder_delimiter_(DEFAULT_FOLDER_DELIM) { >- SetPathname(folder, filename); >-} >- >-Pathname& Pathname::operator=(const Pathname&) = default; >-Pathname& Pathname::operator=(Pathname&&) = default; >- >-std::string Pathname::pathname() const { >- std::string pathname(folder_); >- pathname.append(basename_); >- pathname.append(extension_); >- if (pathname.empty()) { >- // Instead of the empty pathname, return the current working directory. >- pathname.push_back('.'); >- pathname.push_back(folder_delimiter_); >- } >- return pathname; >-} >- >-void Pathname::SetPathname(const std::string& pathname) { >- std::string::size_type pos = pathname.find_last_of(FOLDER_DELIMS); >- if (pos != std::string::npos) { >- SetFolder(pathname.substr(0, pos + 1)); >- SetFilename(pathname.substr(pos + 1)); >- } else { >- SetFolder(EMPTY_STR); >- SetFilename(pathname); >- } >-} >- >-void Pathname::SetPathname(const std::string& folder, >- const std::string& filename) { >- SetFolder(folder); >- SetFilename(filename); >-} >- >-void Pathname::SetFolder(const std::string& folder) { >- folder_.assign(folder); >- // Ensure folder ends in a path delimiter >- if (!folder_.empty() && !IsFolderDelimiter(folder_[folder_.length()-1])) { >- folder_.push_back(folder_delimiter_); >- } >-} >- >-void Pathname::AppendFolder(const std::string& folder) { >- folder_.append(folder); >- // Ensure folder ends in a path delimiter >- if (!folder_.empty() && !IsFolderDelimiter(folder_[folder_.length()-1])) { >- folder_.push_back(folder_delimiter_); >- } >-} >- >-bool Pathname::SetBasename(const std::string& basename) { >- if(basename.find_first_of(FOLDER_DELIMS) != std::string::npos) { >- return false; >- } >- basename_.assign(basename); >- return true; >-} >- >-bool Pathname::SetExtension(const std::string& extension) { >- if (extension.find_first_of(FOLDER_DELIMS) != std::string::npos || >- extension.find_first_of(EXT_DELIM, 1) != std::string::npos) { >- return false; >- } >- extension_.assign(extension); >- // 
Ensure extension begins with the extension delimiter >- if (!extension_.empty() && (extension_[0] != EXT_DELIM)) { >- extension_.insert(extension_.begin(), EXT_DELIM); >- } >- return true; >-} >- >-std::string Pathname::filename() const { >- std::string filename(basename_); >- filename.append(extension_); >- return filename; >-} >- >-bool Pathname::SetFilename(const std::string& filename) { >- std::string::size_type pos = filename.rfind(EXT_DELIM); >- if ((pos == std::string::npos) || (pos == 0)) { >- return SetExtension(EMPTY_STR) && SetBasename(filename); >- } else { >- return SetExtension(filename.substr(pos)) && SetBasename(filename.substr(0, pos)); >- } >-} >- >-/////////////////////////////////////////////////////////////////////////////// >- >-} // namespace rtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/pathutils.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/pathutils.h >deleted file mode 100644 >index 9543be073f245d3409fd690b5cf9cc97db6e5a7e..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/pathutils.h >+++ /dev/null >@@ -1,81 +0,0 @@ >-/* >- * Copyright 2004 The WebRTC Project Authors. All rights reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef RTC_BASE_PATHUTILS_H_ >-#define RTC_BASE_PATHUTILS_H_ >- >-#include <string> >- >-#include "rtc_base/checks.h" >- >-namespace rtc { >- >-/////////////////////////////////////////////////////////////////////////////// >-// Pathname - parsing of pathnames into components, and vice versa. >-// >-// To establish consistent terminology, a filename never contains a folder >-// component. A folder never contains a filename. 
A pathname may include >-// a folder and/or filename component. Here are some examples: >-// >-// pathname() /home/john/example.txt >-// folder() /home/john/ >-// filename() example.txt >-// parent_folder() /home/ >-// folder_name() john/ >-// basename() example >-// extension() .txt >-// >-// Basename may begin, end, and/or include periods, but no folder delimiters. >-// If extension exists, it consists of a period followed by zero or more >-// non-period/non-delimiter characters, and basename is non-empty. >-/////////////////////////////////////////////////////////////////////////////// >- >-class Pathname { >- public: >- // Folder delimiters are slash and backslash >- static bool IsFolderDelimiter(char ch); >- static char DefaultFolderDelimiter(); >- >- Pathname(); >- Pathname(const Pathname&); >- Pathname(Pathname&&); >- Pathname(const std::string& pathname); >- Pathname(const std::string& folder, const std::string& filename); >- >- Pathname& operator=(const Pathname&); >- Pathname& operator=(Pathname&&); >- >- // Returns the folder and filename components. If the pathname is empty, >- // returns a string representing the current directory (as a relative path, >- // i.e., "."). >- std::string pathname() const; >- void SetPathname(const std::string& pathname); >- void SetPathname(const std::string& folder, const std::string& filename); >- >- // SetFolder and AppendFolder will append a folder delimiter, if needed. >- void SetFolder(const std::string& folder); >- void AppendFolder(const std::string& folder); >- >- bool SetBasename(const std::string& basename); >- >- // SetExtension will prefix a period, if needed. 
>- bool SetExtension(const std::string& extension); >- >- std::string filename() const; >- bool SetFilename(const std::string& filename); >- >- private: >- std::string folder_, basename_, extension_; >- char folder_delimiter_; >-}; >- >-} // namespace rtc >- >-#endif // RTC_BASE_PATHUTILS_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/pathutils_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/pathutils_unittest.cc >deleted file mode 100644 >index fae4f0aba59bf5e4f8e89bf9d9508d0f579dbee3..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/pathutils_unittest.cc >+++ /dev/null >@@ -1,37 +0,0 @@ >-/* >- * Copyright 2007 The WebRTC Project Authors. All rights reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#include "rtc_base/pathutils.h" >-#include "rtc_base/gunit.h" >- >-TEST(Pathname, ReturnsDotForEmptyPathname) { >- const std::string kCWD = >- std::string(".") + rtc::Pathname::DefaultFolderDelimiter(); >- >- rtc::Pathname path("/", ""); >- EXPECT_TRUE (path.filename().empty()); >- EXPECT_FALSE(path.pathname().empty()); >- EXPECT_EQ(std::string("/"), path.pathname()); >- >- path.SetPathname("", "foo"); >- EXPECT_FALSE(path.filename().empty()); >- EXPECT_FALSE(path.pathname().empty()); >- EXPECT_EQ(std::string("foo"), path.pathname()); >- >- path.SetPathname("", ""); >- EXPECT_TRUE (path.filename().empty()); >- EXPECT_FALSE(path.pathname().empty()); >- EXPECT_EQ(kCWD, path.pathname()); >- >- path.SetPathname(kCWD, ""); >- EXPECT_TRUE (path.filename().empty()); >- EXPECT_FALSE(path.pathname().empty()); >- EXPECT_EQ(kCWD, path.pathname()); >-} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/physicalsocketserver_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/physicalsocketserver_unittest.cc >index 1e046c057e5d8211c235d854f5d6ce712e6bbaa3..4b36cd5f608e8f87fe9e6d5b9d90d9e910e4eecd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/physicalsocketserver_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/physicalsocketserver_unittest.cc >@@ -547,7 +547,13 @@ Thread* PosixSignalDeliveryTest::signaled_thread_ = nullptr; > > // Test receiving a synchronous signal while not in Wait() and then entering > // Wait() afterwards. 
>-TEST_F(PosixSignalDeliveryTest, RaiseThenWait) {
>+// TODO(webrtc:7864): Fails on real iOS devices
>+#if defined(WEBRTC_IOS) && defined(WEBRTC_ARCH_ARM_FAMILY)
>+#define MAYBE_RaiseThenWait DISABLED_RaiseThenWait
>+#else
>+#define MAYBE_RaiseThenWait RaiseThenWait
>+#endif
>+TEST_F(PosixSignalDeliveryTest, MAYBE_RaiseThenWait) {
> ASSERT_TRUE(ss_->SetPosixSignalHandler(SIGTERM, &RecordSignal));
> raise(SIGTERM);
> EXPECT_TRUE(ss_->Wait(0, true));
>@@ -557,7 +563,13 @@ TEST_F(PosixSignalDeliveryTest, RaiseThenWait) {
>
> // Test that we can handle getting tons of repeated signals and that we see all
> // the different ones.
>-TEST_F(PosixSignalDeliveryTest, InsanelyManySignals) {
>+// TODO(webrtc:7864): Fails on real iOS devices
>+#if defined(WEBRTC_IOS) && defined(WEBRTC_ARCH_ARM_FAMILY)
>+#define MAYBE_InsanelyManySignals DISABLED_InsanelyManySignals
>+#else
>+#define MAYBE_InsanelyManySignals InsanelyManySignals
>+#endif
>+TEST_F(PosixSignalDeliveryTest, MAYBE_InsanelyManySignals) {
> ss_->SetPosixSignalHandler(SIGTERM, &RecordSignal);
> ss_->SetPosixSignalHandler(SIGINT, &RecordSignal);
> for (int i = 0; i < 10000; ++i) {
>@@ -597,7 +609,13 @@ class RaiseSigTermRunnable : public Runnable {
>
> // Test that it works no matter what thread the kernel chooses to give the
> // signal to (since it's not guaranteed to be the one that Wait() runs on).
>-TEST_F(PosixSignalDeliveryTest, SignalOnDifferentThread) {
>+// TODO(webrtc:7864): Fails on real iOS devices
>+#if defined(WEBRTC_IOS) && defined(WEBRTC_ARCH_ARM_FAMILY)
>+#define MAYBE_SignalOnDifferentThread DISABLED_SignalOnDifferentThread
>+#else
>+#define MAYBE_SignalOnDifferentThread SignalOnDifferentThread
>+#endif
>+TEST_F(PosixSignalDeliveryTest, MAYBE_SignalOnDifferentThread) {
> ss_->SetPosixSignalHandler(SIGTERM, &RecordSignal);
> // Mask out SIGTERM so that it can't be delivered to this thread.
> sigset_t mask; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/platform_file.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/platform_file.cc >index a4c906a1ed028adf6db0bff2528407b90b9bccc9..baefb229313bd2a73a34dd1324808a5929d8d035 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/platform_file.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/platform_file.cc >@@ -10,14 +10,13 @@ > > #include "rtc_base/platform_file.h" > >-#include "rtc_base/stringutils.h" >- > #if defined(WEBRTC_WIN) > #include <io.h> >+ >+#include "rtc_base/stringutils.h" // For ToUtf16 > #else > #include <fcntl.h> > #include <sys/stat.h> >-#include <sys/types.h> > #include <unistd.h> > #endif > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/platform_thread.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/platform_thread.cc >index 79d9d53aeee02e0d8be0411b0e019b6075d07e1e..ba84b6ae8e049548773c7fed473155de5c6a343f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/platform_thread.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/platform_thread.cc >@@ -10,15 +10,17 @@ > > #include "rtc_base/platform_thread.h" > >+#if !defined(WEBRTC_WIN) >+#include <sched.h> >+#endif >+#include <stdint.h> >+#include <time.h> >+#include <algorithm> >+ > #include "rtc_base/atomicops.h" > #include "rtc_base/checks.h" > #include "rtc_base/timeutils.h" > >-#if defined(WEBRTC_LINUX) >-#include <sys/prctl.h> >-#include <sys/syscall.h> >-#endif >- > namespace rtc { > namespace { > #if defined(WEBRTC_WIN) >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/platform_thread.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/platform_thread.h >index 33921c209da0289b0747daa47b1545bca5446d54..47c23dc1b5b47f4b29885228f38b306e23ad2b0e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/platform_thread.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/platform_thread.h >@@ -11,10 +11,12 @@ 
> #ifndef RTC_BASE_PLATFORM_THREAD_H_ > #define RTC_BASE_PLATFORM_THREAD_H_ > >+#ifndef WEBRTC_WIN >+#include <pthread.h> >+#endif > #include <string> > > #include "rtc_base/constructormagic.h" >-#include "rtc_base/event.h" > #include "rtc_base/platform_thread_types.h" > #include "rtc_base/thread_checker.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/proxy_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/proxy_unittest.cc >index f42039f50ba2118ddccaab845c1dc940e2a84c02..010189384f8b757099ae68bc8a36d0e98ae2bd60 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/proxy_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/proxy_unittest.cc >@@ -18,7 +18,6 @@ > #include "rtc_base/virtualsocketserver.h" > > using rtc::Socket; >-using rtc::Thread; > using rtc::SocketAddress; > > static const SocketAddress kSocksProxyIntAddr("1.2.3.4", 1080); >@@ -49,7 +48,8 @@ TEST_F(ProxyTest, TestSocks5Connect) { > socket, kSocksProxyIntAddr, "", rtc::CryptString()); > // TODO: IPv6-ize these tests when proxy supports IPv6. 
> >- rtc::TestEchoServer server(Thread::Current(), SocketAddress(INADDR_ANY, 0)); >+ rtc::TestEchoServer server(rtc::Thread::Current(), >+ SocketAddress(INADDR_ANY, 0)); > > std::unique_ptr<rtc::AsyncTCPSocket> packet_socket( > rtc::AsyncTCPSocket::Create(proxy_socket, SocketAddress(INADDR_ANY, 0), >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/proxyserver.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/proxyserver.cc >index 55cab80975c36fd1543e84eda512aff382bde74c..71c4879099f05648c48506096374bcba9d2f5305 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/proxyserver.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/proxyserver.cc >@@ -13,6 +13,7 @@ > #include <algorithm> > > #include "rtc_base/checks.h" >+#include "rtc_base/logging.h" > #include "rtc_base/socketfactory.h" > > namespace rtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/race_checker.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/race_checker.h >index 73567e98d8ac07db766e81c897db34c0d3ad8dec..4d574601eb0b4ba62ace5cdb46d690a3413fa3ff 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/race_checker.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/race_checker.h >@@ -12,7 +12,7 @@ > #define RTC_BASE_RACE_CHECKER_H_ > > #include "rtc_base/checks.h" >-#include "rtc_base/platform_thread.h" >+#include "rtc_base/platform_thread_types.h" > #include "rtc_base/thread_annotations.h" > > namespace rtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/random.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/random.h >index 2faa9856dc6aee1105cfebb9ff6c8a5d609b6e09..e1c3bb7058e82697494994490086f02712dba044 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/random.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/random.h >@@ -11,6 +11,7 @@ > #ifndef RTC_BASE_RANDOM_H_ > #define RTC_BASE_RANDOM_H_ > >+#include <stdint.h> > #include <limits> > > #include 
"rtc_base/checks.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rate_limiter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rate_limiter.cc >index 0343f256b293f0a11448f167dca3190cf7505d56..5c7bdefb0c8b97a068cadd93477f5a05db726ca8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rate_limiter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rate_limiter.cc >@@ -9,6 +9,10 @@ > */ > > #include "rtc_base/rate_limiter.h" >+ >+#include <limits> >+ >+#include "absl/types/optional.h" > #include "system_wrappers/include/clock.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rate_limiter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rate_limiter.h >index 0bfde0da259bb7a4a2e5da0106753f0393e96f1e..43ef88d8fc356eb0e91c9782c1629cb8da6389c1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rate_limiter.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rate_limiter.h >@@ -11,11 +11,13 @@ > #ifndef RTC_BASE_RATE_LIMITER_H_ > #define RTC_BASE_RATE_LIMITER_H_ > >-#include <limits> >+#include <stddef.h> >+#include <stdint.h> > > #include "rtc_base/constructormagic.h" > #include "rtc_base/criticalsection.h" > #include "rtc_base/rate_statistics.h" >+#include "rtc_base/thread_annotations.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rate_limiter_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rate_limiter_unittest.cc >index 6efea5458e1b087cfa205ce1e5e1b4f37165086f..ac0625fa186f79ceb17cdc242648a07ebe919e3d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rate_limiter_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rate_limiter_unittest.cc >@@ -112,9 +112,7 @@ static const int64_t kMaxTimeoutMs = 30000; > class ThreadTask { > public: > explicit ThreadTask(RateLimiter* rate_limiter) >- : rate_limiter_(rate_limiter), >- start_signal_(false, false), >- 
end_signal_(false, false) {} >+ : rate_limiter_(rate_limiter) {} > virtual ~ThreadTask() {} > > void Run() { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rate_statistics.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rate_statistics.h >index 68035c9a68e98ba0032b57dd3da002d26cc1e976..d4ccc59bf5bdfd1c89388fd7dfef12bfd80be0a6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rate_statistics.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rate_statistics.h >@@ -11,6 +11,8 @@ > #ifndef RTC_BASE_RATE_STATISTICS_H_ > #define RTC_BASE_RATE_STATISTICS_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <memory> > > #include "absl/types/optional.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/ratetracker.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/ratetracker.cc >index e31d26627ab19edb27c1dbe020b883e5df73d0e9..7c96ca9f60858892b95501ef2e6861b0b2178968 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/ratetracker.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/ratetracker.cc >@@ -10,8 +10,6 @@ > > #include "rtc_base/ratetracker.h" > >-#include <stddef.h> >- > #include <algorithm> > > #include "rtc_base/checks.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rtccertificate.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rtccertificate.cc >index 7f027ba5f80f48886d73e50bb3945160f5a23bf9..875068f4fc988970e026f871f296f3d25f5ea2ee 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rtccertificate.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rtccertificate.cc >@@ -14,6 +14,9 @@ > > #include "rtc_base/checks.h" > #include "rtc_base/refcountedobject.h" >+#include "rtc_base/sslcertificate.h" >+#include "rtc_base/sslidentity.h" >+#include "rtc_base/timeutils.h" > > namespace rtc { > >@@ -29,7 +32,7 @@ RTCCertificate::RTCCertificate(SSLIdentity* identity) : identity_(identity) { > 
RTCCertificate::~RTCCertificate() {} > > uint64_t RTCCertificate::Expires() const { >- int64_t expires = ssl_certificate().CertificateExpirationTime(); >+ int64_t expires = GetSSLCertificate().CertificateExpirationTime(); > if (expires != -1) > return static_cast<uint64_t>(expires) * kNumMillisecsPerSec; > // If the expiration time could not be retrieved return an expired timestamp. >@@ -40,17 +43,22 @@ bool RTCCertificate::HasExpired(uint64_t now) const { > return Expires() <= now; > } > >+const SSLCertificate& RTCCertificate::GetSSLCertificate() const { >+ return identity_->certificate(); >+} >+ >+// Deprecated: TODO(benwright) - Remove once chromium is updated. > const SSLCertificate& RTCCertificate::ssl_certificate() const { > return identity_->certificate(); > } > >-const SSLCertChain& RTCCertificate::ssl_cert_chain() const { >+const SSLCertChain& RTCCertificate::GetSSLCertificateChain() const { > return identity_->cert_chain(); > } > > RTCCertificatePEM RTCCertificate::ToPEM() const { > return RTCCertificatePEM(identity_->PrivateKeyToPEMString(), >- ssl_certificate().ToPEMString()); >+ GetSSLCertificate().ToPEMString()); > } > > scoped_refptr<RTCCertificate> RTCCertificate::FromPEM( >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rtccertificate.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rtccertificate.h >index d5422f8b4a47d2fe1928b6123c7b5c072e06364d..561ea0f9e7e906f4c78d0fa11999a9ba872c7037 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rtccertificate.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rtccertificate.h >@@ -12,15 +12,18 @@ > #define RTC_BASE_RTCCERTIFICATE_H_ > > #include <stdint.h> >- > #include <memory> >+#include <string> > > #include "rtc_base/refcount.h" > #include "rtc_base/scoped_ref_ptr.h" >-#include "rtc_base/sslidentity.h" > > namespace rtc { > >+class SSLCertChain; >+class SSLCertificate; >+class SSLIdentity; >+ > // This class contains PEM strings of an RTCCertificate's 
private key and > // certificate and acts as a text representation of RTCCertificate. Certificates > // can be serialized and deserialized to and from this format, which allows for >@@ -55,11 +58,15 @@ class RTCCertificate : public RefCountInterface { > // Checks if the certificate has expired, where |now| is expressed in ms > // relative to epoch, 1970-01-01T00:00:00Z. > bool HasExpired(uint64_t now) const; >+ >+ const SSLCertificate& GetSSLCertificate() const; >+ const SSLCertChain& GetSSLCertificateChain() const; >+ >+ // Deprecated: TODO(benwright) - Remove once chromium is updated. > const SSLCertificate& ssl_certificate() const; >- const SSLCertChain& ssl_cert_chain() const; > > // TODO(hbos): If possible, remove once RTCCertificate and its >- // ssl_certificate() is used in all relevant places. Should not pass around >+ // GetSSLCertificate() is used in all relevant places. Should not pass around > // raw SSLIdentity* for the sake of accessing SSLIdentity::certificate(). > // However, some places might need SSLIdentity* for its public/private key... > SSLIdentity* identity() const { return identity_.get(); } >@@ -77,7 +84,7 @@ class RTCCertificate : public RefCountInterface { > > private: > // The SSLIdentity is the owner of the SSLCertificate. To protect our >- // ssl_certificate() we take ownership of |identity_|. >+ // GetSSLCertificate() we take ownership of |identity_|. 
> std::unique_ptr<SSLIdentity> identity_; > }; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rtccertificategenerator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rtccertificategenerator.cc >index 0b51c611badf537014bd204b3e5bab6853ffb2e2..114b35c6658fcc9e55725e7a3444549bc12df896 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rtccertificategenerator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rtccertificategenerator.cc >@@ -10,10 +10,15 @@ > > #include "rtc_base/rtccertificategenerator.h" > >+#include <time.h> > #include <algorithm> > #include <memory> >+#include <utility> > > #include "rtc_base/checks.h" >+#include "rtc_base/location.h" >+#include "rtc_base/messagehandler.h" >+#include "rtc_base/messagequeue.h" > #include "rtc_base/refcountedobject.h" > #include "rtc_base/sslidentity.h" > >@@ -23,7 +28,6 @@ namespace { > > // A certificates' subject and issuer name. > const char kIdentityName[] = "WebRTC"; >- > const uint64_t kYearInSeconds = 365 * 24 * 60 * 60; > > enum { >@@ -60,11 +64,9 @@ class RTCCertificateGenerationTask : public RefCountInterface, > switch (msg->message_id) { > case MSG_GENERATE: > RTC_DCHECK(worker_thread_->IsCurrent()); >- > // Perform the certificate generation work here on the worker thread. > certificate_ = RTCCertificateGenerator::GenerateCertificate( > key_params_, expires_ms_); >- > // Handle callbacks on signaling thread. Pass on the |msg->pdata| > // (which references |this| with ref counting) to that thread. > signaling_thread_->Post(RTC_FROM_HERE, this, MSG_GENERATE_DONE, >@@ -72,14 +74,12 @@ class RTCCertificateGenerationTask : public RefCountInterface, > break; > case MSG_GENERATE_DONE: > RTC_DCHECK(signaling_thread_->IsCurrent()); >- > // Perform callback with result here on the signaling thread. 
> if (certificate_) { > callback_->OnSuccess(certificate_); > } else { > callback_->OnFailure(); > } >- > // Destroy |msg->pdata| which references |this| with ref counting. This > // may result in |this| being deleted - do not touch member variables > // after this line. >@@ -105,9 +105,11 @@ class RTCCertificateGenerationTask : public RefCountInterface, > scoped_refptr<RTCCertificate> RTCCertificateGenerator::GenerateCertificate( > const KeyParams& key_params, > const absl::optional<uint64_t>& expires_ms) { >- if (!key_params.IsValid()) >+ if (!key_params.IsValid()) { > return nullptr; >- SSLIdentity* identity; >+ } >+ >+ SSLIdentity* identity = nullptr; > if (!expires_ms) { > identity = SSLIdentity::Generate(kIdentityName, key_params); > } else { >@@ -124,8 +126,9 @@ scoped_refptr<RTCCertificate> RTCCertificateGenerator::GenerateCertificate( > identity = SSLIdentity::GenerateWithExpiration(kIdentityName, key_params, > cert_lifetime_s); > } >- if (!identity) >+ if (!identity) { > return nullptr; >+ } > std::unique_ptr<SSLIdentity> identity_sptr(identity); > return RTCCertificate::Create(std::move(identity_sptr)); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rtccertificategenerator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rtccertificategenerator.h >index a6c503a9f29cf04d6d26a0572be5e97442ff45d2..fed075e6ddf4e67b434baaeab1c6329daac3bcc2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rtccertificategenerator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/rtccertificategenerator.h >@@ -11,6 +11,8 @@ > #ifndef RTC_BASE_RTCCERTIFICATEGENERATOR_H_ > #define RTC_BASE_RTCCERTIFICATEGENERATOR_H_ > >+#include <stdint.h> >+ > #include "absl/types/optional.h" > #include "rtc_base/refcount.h" > #include "rtc_base/rtccertificate.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sanitizer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sanitizer.h >index 
23a748f84fc027d4cdf5efbfb2430b2f56d82e93..8af0824b67c85bcdedb3efbc35e410e6e01ae99e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sanitizer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sanitizer.h >@@ -14,7 +14,7 @@ > #include <stddef.h> // For size_t. > > #ifdef __cplusplus >-#include <type_traits> >+#include "absl/meta/type_traits.h" > #endif > > #if defined(__has_feature) >@@ -98,10 +98,10 @@ namespace sanitizer_impl { > > template <typename T> > constexpr bool IsTriviallyCopyable() { >- return static_cast<bool>(std::is_trivially_copy_constructible<T>::value && >- (std::is_trivially_copy_assignable<T>::value || >+ return static_cast<bool>(absl::is_trivially_copy_constructible<T>::value && >+ (absl::is_trivially_copy_assignable<T>::value || > !std::is_copy_assignable<T>::value) && >- std::is_trivially_destructible<T>::value); >+ absl::is_trivially_destructible<T>::value); > } > > } // namespace sanitizer_impl >@@ -123,9 +123,11 @@ inline void MsanMarkUninitialized(const T& mem) { > > template <typename T> > inline T MsanUninitialized(T t) { >+#if RTC_HAS_MSAN > // TODO(bugs.webrtc.org/8762): Switch to std::is_trivially_copyable when it > // becomes available in downstream projects. > static_assert(sanitizer_impl::IsTriviallyCopyable<T>(), ""); >+#endif > rtc_MsanMarkUninitialized(&t, sizeof(T), 1); > return t; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/scoped_ref_ptr.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/scoped_ref_ptr.h >index a583aa99b533bb29f6973b2ff297ae2fbba3fa1e..b961ff57e864a7393280934d3b2c812c71b9ee33 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/scoped_ref_ptr.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/scoped_ref_ptr.h >@@ -8,154 +8,12 @@ > * be found in the AUTHORS file in the root of the source tree. > */ > >-// Originally these classes are from Chromium. 
>-// http://src.chromium.org/viewvc/chrome/trunk/src/base/memory/ref_counted.h?view=markup >- >-// >-// A smart pointer class for reference counted objects. Use this class instead >-// of calling AddRef and Release manually on a reference counted object to >-// avoid common memory leaks caused by forgetting to Release an object >-// reference. Sample usage: >-// >-// class MyFoo : public RefCounted<MyFoo> { >-// ... >-// }; >-// >-// void some_function() { >-// scoped_refptr<MyFoo> foo = new MyFoo(); >-// foo->Method(param); >-// // |foo| is released when this function returns >-// } >-// >-// void some_other_function() { >-// scoped_refptr<MyFoo> foo = new MyFoo(); >-// ... >-// foo = nullptr; // explicitly releases |foo| >-// ... >-// if (foo) >-// foo->Method(param); >-// } >-// >-// The above examples show how scoped_refptr<T> acts like a pointer to T. >-// Given two scoped_refptr<T> classes, it is also possible to exchange >-// references between the two objects, like so: >-// >-// { >-// scoped_refptr<MyFoo> a = new MyFoo(); >-// scoped_refptr<MyFoo> b; >-// >-// b.swap(a); >-// // now, |b| references the MyFoo object, and |a| references null. >-// } >-// >-// To make both |a| and |b| in the above example reference the same MyFoo >-// object, simply use the assignment operator: >-// >-// { >-// scoped_refptr<MyFoo> a = new MyFoo(); >-// scoped_refptr<MyFoo> b; >-// >-// b = a; >-// // now, |a| and |b| each own a reference to the same MyFoo object. 
>-// } >-// >- > #ifndef RTC_BASE_SCOPED_REF_PTR_H_ > #define RTC_BASE_SCOPED_REF_PTR_H_ > >-#include <memory> >- >-namespace rtc { >- >-template <class T> >-class scoped_refptr { >- public: >- scoped_refptr() : ptr_(nullptr) {} >- >- scoped_refptr(T* p) : ptr_(p) { >- if (ptr_) >- ptr_->AddRef(); >- } >- >- scoped_refptr(const scoped_refptr<T>& r) : ptr_(r.ptr_) { >- if (ptr_) >- ptr_->AddRef(); >- } >- >- template <typename U> >- scoped_refptr(const scoped_refptr<U>& r) : ptr_(r.get()) { >- if (ptr_) >- ptr_->AddRef(); >- } >- >- // Move constructors. >- scoped_refptr(scoped_refptr<T>&& r) : ptr_(r.release()) {} >- >- template <typename U> >- scoped_refptr(scoped_refptr<U>&& r) : ptr_(r.release()) {} >- >- ~scoped_refptr() { >- if (ptr_) >- ptr_->Release(); >- } >- >- T* get() const { return ptr_; } >- operator T*() const { return ptr_; } >- T* operator->() const { return ptr_; } >- >- // Returns the (possibly null) raw pointer, and makes the scoped_refptr hold a >- // null pointer, all without touching the reference count of the underlying >- // pointed-to object. The object is still reference counted, and the caller of >- // release() is now the proud owner of one reference, so it is responsible for >- // calling Release() once on the object when no longer using it. 
>- T* release() { >- T* retVal = ptr_; >- ptr_ = nullptr; >- return retVal; >- } >- >- scoped_refptr<T>& operator=(T* p) { >- // AddRef first so that self assignment should work >- if (p) >- p->AddRef(); >- if (ptr_) >- ptr_->Release(); >- ptr_ = p; >- return *this; >- } >- >- scoped_refptr<T>& operator=(const scoped_refptr<T>& r) { >- return *this = r.ptr_; >- } >- >- template <typename U> >- scoped_refptr<T>& operator=(const scoped_refptr<U>& r) { >- return *this = r.get(); >- } >- >- scoped_refptr<T>& operator=(scoped_refptr<T>&& r) { >- scoped_refptr<T>(std::move(r)).swap(*this); >- return *this; >- } >- >- template <typename U> >- scoped_refptr<T>& operator=(scoped_refptr<U>&& r) { >- scoped_refptr<T>(std::move(r)).swap(*this); >- return *this; >- } >- >- void swap(T** pp) { >- T* p = ptr_; >- ptr_ = *pp; >- *pp = p; >- } >- >- void swap(scoped_refptr<T>& r) { swap(&r.ptr_); } >- >- protected: >- T* ptr_; >-}; >+// TODO(bugs.webrtc.org/9887): This is a forward header for backwards >+// compatibility. Remove when downstream clients are updated. 
> >-} // namespace rtc >+#include "api/scoped_refptr.h" > > #endif // RTC_BASE_SCOPED_REF_PTR_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sequenced_task_checker_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sequenced_task_checker_impl.cc >index 16069c25f7ab064672d96f3bb0426b14790d2b47..717cb95040835a57b14805bab2025cfab9838575 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sequenced_task_checker_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sequenced_task_checker_impl.cc >@@ -14,6 +14,7 @@ > #include <dispatch/dispatch.h> > #endif > >+#include "rtc_base/checks.h" > #include "rtc_base/sequenced_task_checker.h" > #include "rtc_base/task_queue.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sequenced_task_checker_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sequenced_task_checker_unittest.cc >index 96e655baad4f37c3706b23b72d14183ebfdf3826..7b7247c64c015fad3f1677f8ac712afec9b89021 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sequenced_task_checker_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sequenced_task_checker_unittest.cc >@@ -12,6 +12,7 @@ > > #include "rtc_base/checks.h" > #include "rtc_base/constructormagic.h" >+#include "rtc_base/event.h" > #include "rtc_base/platform_thread.h" > #include "rtc_base/task_queue.h" > #include "rtc_base/thread_checker.h" >@@ -43,7 +44,6 @@ class CallCalledSequentiallyOnThread { > CallCalledSequentiallyOnThread(bool expect_true, > SequencedTaskChecker* sequenced_task_checker) > : expect_true_(expect_true), >- thread_has_run_event_(false, false), > thread_(&Run, this, "call_do_stuff_on_thread"), > sequenced_task_checker_(sequenced_task_checker) { > thread_.Start(); >@@ -77,7 +77,6 @@ class DeleteSequencedCheckerOnThread { > explicit DeleteSequencedCheckerOnThread( > std::unique_ptr<SequencedTaskChecker> sequenced_task_checker) > : thread_(&Run, this, 
"delete_sequenced_task_checker_on_thread"), >- thread_has_run_event_(false, false), > sequenced_task_checker_(std::move(sequenced_task_checker)) { > thread_.Start(); > } >@@ -118,7 +117,7 @@ void RunMethodOnDifferentTaskQueue(bool expect_true) { > > static const char kQueueName[] = "MethodNotAllowedOnDifferentTq"; > TaskQueue queue(kQueueName); >- Event done_event(false, false); >+ Event done_event; > queue.PostTask([&sequenced_task_checker, &done_event, expect_true] { > if (expect_true) > EXPECT_TRUE(sequenced_task_checker->CalledSequentially()); >@@ -135,7 +134,7 @@ void DetachThenCallFromDifferentTaskQueue(bool expect_true) { > > sequenced_task_checker->Detach(); > >- Event done_event(false, false); >+ Event done_event; > TaskQueue queue1("DetachThenCallFromDifferentTaskQueueImpl1"); > queue1.PostTask([&sequenced_task_checker, &done_event] { > EXPECT_TRUE(sequenced_task_checker->CalledSequentially()); >@@ -193,7 +192,7 @@ TEST(SequencedTaskCheckerTest, DetachFromThreadAndUseOnTaskQueue) { > sequenced_task_checker->Detach(); > static const char kQueueName[] = "DetachFromThreadAndUseOnTaskQueue"; > TaskQueue queue(kQueueName); >- Event done_event(false, false); >+ Event done_event; > queue.PostTask([&sequenced_task_checker, &done_event] { > EXPECT_TRUE(sequenced_task_checker->CalledSequentially()); > done_event.Set(); >@@ -203,7 +202,7 @@ TEST(SequencedTaskCheckerTest, DetachFromThreadAndUseOnTaskQueue) { > > TEST(SequencedTaskCheckerTest, DetachFromTaskQueueAndUseOnThread) { > TaskQueue queue("DetachFromTaskQueueAndUseOnThread"); >- Event done_event(false, false); >+ Event done_event; > queue.PostTask([&done_event] { > std::unique_ptr<SequencedTaskChecker> sequenced_task_checker( > new SequencedTaskChecker()); >@@ -271,7 +270,7 @@ void TestAnnotationsOnWrongQueue() { > TestAnnotations annotations; > static const char kQueueName[] = "TestAnnotationsOnWrongQueueDebug"; > TaskQueue queue(kQueueName); >- Event done_event(false, false); >+ Event done_event; > 
queue.PostTask([&annotations, &done_event] { > annotations.ModifyTestVar(); > done_event.Set(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/signalthread.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/signalthread.cc >index eb79dc84fe8e5465dac1ee6e8944d0be71b4efb6..5dd93879315c39ead0ef065ff3c4f921b971f8fe 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/signalthread.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/signalthread.cc >@@ -10,8 +10,13 @@ > > #include "rtc_base/signalthread.h" > >+#include <memory> >+ > #include "absl/memory/memory.h" > #include "rtc_base/checks.h" >+#include "rtc_base/location.h" >+#include "rtc_base/nullsocketserver.h" >+#include "rtc_base/socketserver.h" > > namespace rtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/signalthread.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/signalthread.h >index 021cf4d78df8e4a35db8d6b372787bfc107830d3..9208e2c2467787537e083b5ad9f73b5f8245d34a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/signalthread.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/signalthread.h >@@ -15,9 +15,12 @@ > > #include "rtc_base/checks.h" > #include "rtc_base/constructormagic.h" >-#include "rtc_base/nullsocketserver.h" >+#include "rtc_base/criticalsection.h" >+#include "rtc_base/messagehandler.h" >+#include "rtc_base/messagequeue.h" > #include "rtc_base/third_party/sigslot/sigslot.h" > #include "rtc_base/thread.h" >+#include "rtc_base/thread_annotations.h" > > namespace rtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socket.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socket.cc >index a9749a473ee1181fe5c9804e8314197bc4cb1b1f..f19b34412f4a58f96dc1a0155ee2102260fb5b0e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socket.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socket.cc >@@ -12,16 +12,4 @@ > > namespace rtc { > 
>-PacketInfo::PacketInfo() = default; >-PacketInfo::PacketInfo(const PacketInfo& info) = default; >-PacketInfo::~PacketInfo() = default; >- >-SentPacket::SentPacket() = default; >-SentPacket::SentPacket(int64_t packet_id, int64_t send_time_ms) >- : packet_id(packet_id), send_time_ms(send_time_ms) {} >-SentPacket::SentPacket(int64_t packet_id, >- int64_t send_time_ms, >- const rtc::PacketInfo& info) >- : packet_id(packet_id), send_time_ms(send_time_ms), info(info) {} >- > } // namespace rtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socket.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socket.h >index b8290bb139b2e69a57b2074ea5763589d40669cf..e7e82108cd29d2abdb319bd236d9b647611ac518 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socket.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socket.h >@@ -25,8 +25,8 @@ > #include "rtc_base/win32.h" > #endif > >-#include "absl/types/optional.h" > #include "rtc_base/constructormagic.h" >+#include "rtc_base/network/sent_packet.h" > #include "rtc_base/socketaddress.h" > > // Rather than converting errors into a private namespace, >@@ -123,50 +123,6 @@ inline bool IsBlockingError(int e) { > return (e == EWOULDBLOCK) || (e == EAGAIN) || (e == EINPROGRESS); > } > >-enum class PacketType { >- kUnknown, >- kData, >- kIceConnectivityCheck, >- kIceConnectivityCheckResponse, >- kStunMessage, >- kTurnMessage, >-}; >- >-enum class PacketInfoProtocolType { >- kUnknown, >- kUdp, >- kTcp, >- kSsltcp, >- kTls, >-}; >- >-struct PacketInfo { >- PacketInfo(); >- PacketInfo(const PacketInfo& info); >- ~PacketInfo(); >- >- PacketType packet_type = PacketType::kUnknown; >- PacketInfoProtocolType protocol = PacketInfoProtocolType::kUnknown; >- // A unique id assigned by the network manager, and absl::nullopt if not set. 
>- absl::optional<uint16_t> network_id; >- size_t packet_size_bytes = 0; >- size_t turn_overhead_bytes = 0; >- SocketAddress local_socket_address; >- SocketAddress remote_socket_address; >-}; >- >-struct SentPacket { >- SentPacket(); >- SentPacket(int64_t packet_id, int64_t send_time_ms); >- SentPacket(int64_t packet_id, >- int64_t send_time_ms, >- const rtc::PacketInfo& info); >- >- int64_t packet_id = -1; >- int64_t send_time_ms = -1; >- rtc::PacketInfo info; >-}; >- > // General interface for the socket implementations of various networks. The > // methods match those of normal UNIX sockets very closely. > class Socket { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socketadapters.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socketadapters.cc >index acd9b633858c8814f6d4b8f86ca1d9d3d5184e81..945192889142a8e04ed3f9d3d78cab2def852b47 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socketadapters.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socketadapters.cc >@@ -13,24 +13,29 @@ > #endif > > #include <errno.h> >+#include <stdio.h> >+#include <stdlib.h> >+#include <string.h> > #include <time.h> > > #if defined(WEBRTC_WIN) > #include <windows.h> > #include <winsock2.h> > #include <ws2tcpip.h> >+ > #define SECURITY_WIN32 > #include <security.h> > #endif > > #include <algorithm> > >+#include "absl/strings/match.h" >+#include "rtc_base/buffer.h" > #include "rtc_base/bytebuffer.h" > #include "rtc_base/checks.h" > #include "rtc_base/httpcommon.h" > #include "rtc_base/logging.h" > #include "rtc_base/socketadapters.h" >-#include "rtc_base/stringencode.h" > #include "rtc_base/strings/string_builder.h" > #include "rtc_base/stringutils.h" > #include "rtc_base/zero_memory.h" >@@ -457,7 +462,7 @@ void AsyncHttpsProxySocket::ProcessLine(char* data, size_t len) { > return; > } > } else if ((state_ == PS_AUTHENTICATE) && >- (_strnicmp(data, "Proxy-Authenticate:", 19) == 0)) { >+ absl::StartsWithIgnoreCase(data, 
"Proxy-Authenticate:")) { > std::string response, auth_method; > switch (HttpAuthenticate(data + 19, len - 19, proxy_, "CONNECT", "/", user_, > pass_, context_, response, auth_method)) { >@@ -485,12 +490,12 @@ > unknown_mechanisms_.clear(); > break; > } >- } else if (_strnicmp(data, "Content-Length:", 15) == 0) { >+ } else if (absl::StartsWithIgnoreCase(data, "Content-Length:")) { > content_length_ = strtoul(data + 15, 0, 0); >- } else if (_strnicmp(data, "Proxy-Connection: Keep-Alive", 28) == 0) { >+ } else if (absl::StartsWithIgnoreCase(data, "Proxy-Connection: Keep-Alive")) { > expect_close_ = false; > /* >- } else if (_strnicmp(data, "Connection: close", 17) == 0) { >+ } else if (absl::StartsWithIgnoreCase(data, "Connection: close")) { > expect_close_ = true; > */ > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socketadapters.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socketadapters.h >index ad88fe6b6fb8cf2003beef81539c3e222382d10f..062f75c3feb1fffc300944144719eee5238a7942 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socketadapters.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socketadapters.h >@@ -11,13 +11,11 @@ > #ifndef RTC_BASE_SOCKETADAPTERS_H_ > #define RTC_BASE_SOCKETADAPTERS_H_ > >-#include <map> > #include <string> > > #include "rtc_base/asyncsocket.h" > #include "rtc_base/constructormagic.h" > #include "rtc_base/cryptstring.h" >-#include "rtc_base/logging.h" > > namespace rtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socketaddress.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socketaddress.h >index bff8e76495eb74bca589824b8ed93021ccf20b21..b1a52b93716a8e24ca28b7e5ded15ed995efd90a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socketaddress.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socketaddress.h >@@ -15,7 +15,6 @@ > #ifdef UNIT_TEST > #include 
<ostream> // no-presubmit-check TODO(webrtc:8982) > #endif // UNIT_TEST >-#include <vector> > #include "rtc_base/ipaddress.h" > > #undef SetPort >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socketaddresspair.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socketaddresspair.h >index 5ff148a75d87604e1db1fc61d8490c1d2df722e9..6691386d3abff8b9e8e430c3b7355d29479f9ed4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socketaddresspair.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socketaddresspair.h >@@ -11,6 +11,8 @@ > #ifndef RTC_BASE_SOCKETADDRESSPAIR_H_ > #define RTC_BASE_SOCKETADDRESSPAIR_H_ > >+#include <stddef.h> >+ > #include "rtc_base/socketaddress.h" > > namespace rtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socketstream.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socketstream.cc >index 2ea1cec0ac736d35501103eda2ed06e8181574ec..8978404cf45f02d9d09016a6f7c15499d2abf479 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socketstream.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/socketstream.cc >@@ -11,6 +11,7 @@ > #include "rtc_base/socketstream.h" > > #include "rtc_base/checks.h" >+#include "rtc_base/socket.h" > > namespace rtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/ssladapter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/ssladapter.h >index 4843d264fcccd366d480f6438614103b9ea9785b..7ebedcadf026fef9ab9b1c133ad3598a3754ce0b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/ssladapter.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/ssladapter.h >@@ -16,6 +16,7 @@ > > #include "rtc_base/asyncsocket.h" > #include "rtc_base/sslcertificate.h" >+#include "rtc_base/sslidentity.h" > #include "rtc_base/sslstreamadapter.h" > > namespace rtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/ssladapter_unittest.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/ssladapter_unittest.cc >index ec532b1c442618ff2d74f558d56c602f6fde2809..c84c668c7e1a81281af894f26e05fb45afb85fbf 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/ssladapter_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/ssladapter_unittest.cc >@@ -15,6 +15,7 @@ > #include "absl/memory/memory.h" > #include "rtc_base/gunit.h" > #include "rtc_base/ipaddress.h" >+#include "rtc_base/messagedigest.h" > #include "rtc_base/socketstream.h" > #include "rtc_base/ssladapter.h" > #include "rtc_base/sslidentity.h" >@@ -266,7 +267,7 @@ class SSLAdapterTestDummyServer : public sigslot::has_slots<> { > // (e.g. a WebRTC-based application and an RFC 5766 TURN server), where > // clients are not required to provide a certificate during handshake. > // Accordingly, we must disable client authentication here. >- ssl_stream_adapter_->set_client_auth_enabled(false); >+ ssl_stream_adapter_->SetClientAuthEnabledForTesting(false); > > ssl_stream_adapter_->SetIdentity(ssl_identity_->GetReference()); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslcertificate.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslcertificate.cc >index 9a38fc0384b704942a0f28e87a1370297bc29cd5..934848fb730457b3978f91a172cded67751eeb28 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslcertificate.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslcertificate.cc >@@ -10,13 +10,12 @@ > > #include "rtc_base/sslcertificate.h" > >-#include <ctime> >+#include <algorithm> > #include <string> > #include <utility> > > #include "absl/memory/memory.h" > #include "rtc_base/checks.h" >-#include "rtc_base/logging.h" > #include "rtc_base/opensslcertificate.h" > #include "rtc_base/sslfingerprint.h" > #include "rtc_base/third_party/base64/base64.h" >@@ -31,7 +30,7 @@ SSLCertificateStats::SSLCertificateStats( > std::string&& fingerprint, > std::string&& fingerprint_algorithm, > 
std::string&& base64_certificate, >- std::unique_ptr<SSLCertificateStats>&& issuer) >+ std::unique_ptr<SSLCertificateStats> issuer) > : fingerprint(std::move(fingerprint)), > fingerprint_algorithm(std::move(fingerprint_algorithm)), > base64_certificate(std::move(base64_certificate)), >@@ -55,8 +54,8 @@ std::unique_ptr<SSLCertificateStats> SSLCertificate::GetStats() const { > // |SSLCertificate::GetSignatureDigestAlgorithm| is not supported by the > // implementation of |SSLCertificate::ComputeDigest|. This currently happens > // with MD5- and SHA-224-signed certificates when linked to libNSS. >- std::unique_ptr<SSLFingerprint> ssl_fingerprint( >- SSLFingerprint::Create(digest_algorithm, this)); >+ std::unique_ptr<SSLFingerprint> ssl_fingerprint = >+ SSLFingerprint::Create(digest_algorithm, *this); > if (!ssl_fingerprint) > return nullptr; > std::string fingerprint = ssl_fingerprint->GetRfc4572Fingerprint(); >@@ -71,49 +70,30 @@ std::unique_ptr<SSLCertificateStats> SSLCertificate::GetStats() const { > std::move(der_base64), nullptr); > } > >-std::unique_ptr<SSLCertificate> SSLCertificate::GetUniqueReference() const { >- return absl::WrapUnique(GetReference()); >-} >- > ////////////////////////////////////////////////////////////////////// > // SSLCertChain > ////////////////////////////////////////////////////////////////////// > >-SSLCertChain::SSLCertChain(std::vector<std::unique_ptr<SSLCertificate>> certs) >- : certs_(std::move(certs)) {} >- >-SSLCertChain::SSLCertChain(const std::vector<SSLCertificate*>& certs) { >- RTC_DCHECK(!certs.empty()); >- certs_.resize(certs.size()); >- std::transform( >- certs.begin(), certs.end(), certs_.begin(), >- [](const SSLCertificate* cert) -> std::unique_ptr<SSLCertificate> { >- return cert->GetUniqueReference(); >- }); >+SSLCertChain::SSLCertChain(std::unique_ptr<SSLCertificate> single_cert) { >+ certs_.push_back(std::move(single_cert)); > } > >-SSLCertChain::SSLCertChain(const SSLCertificate* cert) { >- 
certs_.push_back(cert->GetUniqueReference()); >-} >+SSLCertChain::SSLCertChain(std::vector<std::unique_ptr<SSLCertificate>> certs) >+ : certs_(std::move(certs)) {} > > SSLCertChain::SSLCertChain(SSLCertChain&& rhs) = default; > > SSLCertChain& SSLCertChain::operator=(SSLCertChain&&) = default; > >-SSLCertChain::~SSLCertChain() {} >+SSLCertChain::~SSLCertChain() = default; > >-SSLCertChain* SSLCertChain::Copy() const { >+std::unique_ptr<SSLCertChain> SSLCertChain::Clone() const { > std::vector<std::unique_ptr<SSLCertificate>> new_certs(certs_.size()); >- std::transform(certs_.begin(), certs_.end(), new_certs.begin(), >- [](const std::unique_ptr<SSLCertificate>& cert) >- -> std::unique_ptr<SSLCertificate> { >- return cert->GetUniqueReference(); >- }); >- return new SSLCertChain(std::move(new_certs)); >-} >- >-std::unique_ptr<SSLCertChain> SSLCertChain::UniqueCopy() const { >- return absl::WrapUnique(Copy()); >+ std::transform( >+ certs_.begin(), certs_.end(), new_certs.begin(), >+ [](const std::unique_ptr<SSLCertificate>& cert) >+ -> std::unique_ptr<SSLCertificate> { return cert->Clone(); }); >+ return absl::make_unique<SSLCertChain>(std::move(new_certs)); > } > > std::unique_ptr<SSLCertificateStats> SSLCertChain::GetStats() const { >@@ -135,7 +115,8 @@ std::unique_ptr<SSLCertificateStats> SSLCertChain::GetStats() const { > } > > // static >-SSLCertificate* SSLCertificate::FromPEMString(const std::string& pem_string) { >+std::unique_ptr<SSLCertificate> SSLCertificate::FromPEMString( >+ const std::string& pem_string) { > return OpenSSLCertificate::FromPEMString(pem_string); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslcertificate.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslcertificate.h >index 29c4db58ef56c80035e832cd242cad72e9dc8dd4..eb81c209e4f0c3265664e015d7a733c4480600e3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslcertificate.h >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslcertificate.h >@@ -15,15 +15,14 @@ > #ifndef RTC_BASE_SSLCERTIFICATE_H_ > #define RTC_BASE_SSLCERTIFICATE_H_ > >-#include <algorithm> >+#include <stddef.h> >+#include <stdint.h> > #include <memory> > #include <string> > #include <vector> > > #include "rtc_base/buffer.h" > #include "rtc_base/constructormagic.h" >-#include "rtc_base/messagedigest.h" >-#include "rtc_base/timeutils.h" > > namespace rtc { > >@@ -31,7 +30,7 @@ struct SSLCertificateStats { > SSLCertificateStats(std::string&& fingerprint, > std::string&& fingerprint_algorithm, > std::string&& base64_certificate, >- std::unique_ptr<SSLCertificateStats>&& issuer); >+ std::unique_ptr<SSLCertificateStats> issuer); > ~SSLCertificateStats(); > std::string fingerprint; > std::string fingerprint_algorithm; >@@ -54,17 +53,13 @@ class SSLCertificate { > // The length of the string representation of the certificate is > // stored in *pem_length if it is non-null, and only if > // parsing was successful. >- // Caller is responsible for freeing the returned object. >- static SSLCertificate* FromPEMString(const std::string& pem_string); >- virtual ~SSLCertificate() {} >+ static std::unique_ptr<SSLCertificate> FromPEMString( >+ const std::string& pem_string); >+ virtual ~SSLCertificate() = default; > > // Returns a new SSLCertificate object instance wrapping the same >- // underlying certificate, including its chain if present. Caller is >- // responsible for freeing the returned object. Use GetUniqueReference >- // instead. >- virtual SSLCertificate* GetReference() const = 0; >- >- std::unique_ptr<SSLCertificate> GetUniqueReference() const; >+ // underlying certificate, including its chain if present. >+ virtual std::unique_ptr<SSLCertificate> Clone() const = 0; > > // Returns a PEM encoded string representation of the certificate. 
> virtual std::string ToPEMString() const = 0; >@@ -95,13 +90,10 @@ class SSLCertificate { > // SSLCertChain is a simple wrapper for a vector of SSLCertificates. It serves > // primarily to ensure proper memory management (especially deletion) of the > // SSLCertificate pointers. >-class SSLCertChain { >+class SSLCertChain final { > public: >+ explicit SSLCertChain(std::unique_ptr<SSLCertificate> single_cert); > explicit SSLCertChain(std::vector<std::unique_ptr<SSLCertificate>> certs); >- // These constructors copy the provided SSLCertificate(s), so the caller >- // retains ownership. >- explicit SSLCertChain(const std::vector<SSLCertificate*>& certs); >- explicit SSLCertChain(const SSLCertificate* cert); > // Allow move semantics for the object. > SSLCertChain(SSLCertChain&&); > SSLCertChain& operator=(SSLCertChain&&); >@@ -115,10 +107,8 @@ class SSLCertChain { > const SSLCertificate& Get(size_t pos) const { return *(certs_[pos]); } > > // Returns a new SSLCertChain object instance wrapping the same underlying >- // certificate chain. Caller is responsible for freeing the returned object. >- SSLCertChain* Copy() const; >- // Same as above, but returning a unique_ptr for convenience. >- std::unique_ptr<SSLCertChain> UniqueCopy() const; >+ // certificate chain. >+ std::unique_ptr<SSLCertChain> Clone() const; > > // Gets information (fingerprint, etc.) about this certificate chain. 
This is > // used for certificate stats, see >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslfingerprint.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslfingerprint.cc >index b651a3d6285205b55a2fb03677104b15a97b7662..b296d33ddd9c43123a4154b94b5f0f3a864a81e3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslfingerprint.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslfingerprint.cc >@@ -11,67 +11,82 @@ > #include "rtc_base/sslfingerprint.h" > > #include <ctype.h> >+#include <algorithm> >+#include <cstdint> > #include <string> > >-#include "rtc_base/helpers.h" >+#include "absl/memory/memory.h" > #include "rtc_base/logging.h" > #include "rtc_base/messagedigest.h" >+#include "rtc_base/rtccertificate.h" >+#include "rtc_base/sslcertificate.h" >+#include "rtc_base/sslidentity.h" > #include "rtc_base/stringencode.h" > > namespace rtc { > > SSLFingerprint* SSLFingerprint::Create(const std::string& algorithm, > const rtc::SSLIdentity* identity) { >- if (!identity) { >- return nullptr; >- } >+ return CreateUnique(algorithm, *identity).release(); >+} > >- return Create(algorithm, &(identity->certificate())); >+std::unique_ptr<SSLFingerprint> SSLFingerprint::CreateUnique( >+ const std::string& algorithm, >+ const rtc::SSLIdentity& identity) { >+ return Create(algorithm, identity.certificate()); > } > >-SSLFingerprint* SSLFingerprint::Create(const std::string& algorithm, >- const rtc::SSLCertificate* cert) { >+std::unique_ptr<SSLFingerprint> SSLFingerprint::Create( >+ const std::string& algorithm, >+ const rtc::SSLCertificate& cert) { > uint8_t digest_val[64]; > size_t digest_len; >- bool ret = cert->ComputeDigest(algorithm, digest_val, sizeof(digest_val), >- &digest_len); >+ bool ret = cert.ComputeDigest(algorithm, digest_val, sizeof(digest_val), >+ &digest_len); > if (!ret) { > return nullptr; > } >- >- return new SSLFingerprint(algorithm, digest_val, digest_len); >+ return absl::make_unique<SSLFingerprint>( >+ 
algorithm, ArrayView<const uint8_t>(digest_val, digest_len)); > } > > SSLFingerprint* SSLFingerprint::CreateFromRfc4572( > const std::string& algorithm, > const std::string& fingerprint) { >+ return CreateUniqueFromRfc4572(algorithm, fingerprint).release(); >+} >+ >+std::unique_ptr<SSLFingerprint> SSLFingerprint::CreateUniqueFromRfc4572( >+ const std::string& algorithm, >+ const std::string& fingerprint) { > if (algorithm.empty() || !rtc::IsFips180DigestAlgorithm(algorithm)) > return nullptr; > > if (fingerprint.empty()) > return nullptr; > >- size_t value_len; > char value[rtc::MessageDigest::kMaxSize]; >- value_len = rtc::hex_decode_with_delimiter( >+ size_t value_len = rtc::hex_decode_with_delimiter( > value, sizeof(value), fingerprint.c_str(), fingerprint.length(), ':'); > if (!value_len) > return nullptr; > >- return new SSLFingerprint(algorithm, reinterpret_cast<uint8_t*>(value), >- value_len); >+ return absl::make_unique<SSLFingerprint>( >+ algorithm, >+ ArrayView<const uint8_t>(reinterpret_cast<uint8_t*>(value), value_len)); > } > >-SSLFingerprint* SSLFingerprint::CreateFromCertificate( >- const RTCCertificate* cert) { >+std::unique_ptr<SSLFingerprint> SSLFingerprint::CreateFromCertificate( >+ const RTCCertificate& cert) { > std::string digest_alg; >- if (!cert->ssl_certificate().GetSignatureDigestAlgorithm(&digest_alg)) { >+ if (!cert.GetSSLCertificate().GetSignatureDigestAlgorithm(&digest_alg)) { > RTC_LOG(LS_ERROR) > << "Failed to retrieve the certificate's digest algorithm"; > return nullptr; > } > >- SSLFingerprint* fingerprint = Create(digest_alg, cert->identity()); >+ std::unique_ptr<SSLFingerprint> fingerprint = >+ CreateUnique(digest_alg, *cert.identity()); > if (!fingerprint) { > RTC_LOG(LS_ERROR) << "Failed to create identity fingerprint, alg=" > << digest_alg; >@@ -79,12 +94,14 @@ SSLFingerprint* SSLFingerprint::CreateFromCertificate( > return fingerprint; > } > >+SSLFingerprint::SSLFingerprint(const std::string& algorithm, >+ ArrayView<const 
uint8_t> digest_view) >+ : algorithm(algorithm), digest(digest_view.data(), digest_view.size()) {} >+ > SSLFingerprint::SSLFingerprint(const std::string& algorithm, > const uint8_t* digest_in, > size_t digest_len) >- : algorithm(algorithm) { >- digest.SetData(digest_in, digest_len); >-} >+ : SSLFingerprint(algorithm, MakeArrayView(digest_in, digest_len)) {} > > SSLFingerprint::SSLFingerprint(const SSLFingerprint& from) > : algorithm(from.algorithm), digest(from.digest) {} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslfingerprint.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslfingerprint.h >index b204bc77f300211c73f4c47c216f38b3b91eeed9..ea10ede7206ba6f3691f97f3907bb02fae295998 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslfingerprint.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslfingerprint.h >@@ -11,30 +11,47 @@ > #ifndef RTC_BASE_SSLFINGERPRINT_H_ > #define RTC_BASE_SSLFINGERPRINT_H_ > >+#include <stddef.h> >+#include <stdint.h> > #include <string> > > #include "rtc_base/copyonwritebuffer.h" >-#include "rtc_base/rtccertificate.h" >-#include "rtc_base/sslidentity.h" > > namespace rtc { > >+class RTCCertificate; > class SSLCertificate; >+class SSLIdentity; > > struct SSLFingerprint { >+ // TODO(steveanton): Remove once downstream projects have moved off of this. > static SSLFingerprint* Create(const std::string& algorithm, > const rtc::SSLIdentity* identity); >+ // TODO(steveanton): Rename to Create once projects have migrated. >+ static std::unique_ptr<SSLFingerprint> CreateUnique( >+ const std::string& algorithm, >+ const rtc::SSLIdentity& identity); > >- static SSLFingerprint* Create(const std::string& algorithm, >- const rtc::SSLCertificate* cert); >+ static std::unique_ptr<SSLFingerprint> Create( >+ const std::string& algorithm, >+ const rtc::SSLCertificate& cert); > >+ // TODO(steveanton): Remove once downstream projects have moved off of this. 
> static SSLFingerprint* CreateFromRfc4572(const std::string& algorithm, > const std::string& fingerprint); >+ // TODO(steveanton): Rename to CreateFromRfc4572 once projects have migrated. >+ static std::unique_ptr<SSLFingerprint> CreateUniqueFromRfc4572( >+ const std::string& algorithm, >+ const std::string& fingerprint); > > // Creates a fingerprint from a certificate, using the same digest algorithm > // as the certificate's signature. >- static SSLFingerprint* CreateFromCertificate(const RTCCertificate* cert); >+ static std::unique_ptr<SSLFingerprint> CreateFromCertificate( >+ const RTCCertificate& cert); > >+ SSLFingerprint(const std::string& algorithm, >+ ArrayView<const uint8_t> digest_view); >+ // TODO(steveanton): Remove once downstream projects have moved off of this. > SSLFingerprint(const std::string& algorithm, > const uint8_t* digest_in, > size_t digest_len); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslidentity.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslidentity.cc >index 823fc388a3cef1b8150f60cf698b8736a1642b37..41eb35d5f52b6737e71b9336140eb73cc9771a2e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslidentity.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslidentity.cc >@@ -11,20 +11,95 @@ > // Handling of certificates and keypairs for SSLStreamAdapter's peer mode. 
> #include "rtc_base/sslidentity.h" > >-#include <ctime> >+#include <string.h> >+#include <time.h> > #include <string> >-#include <utility> > >-#include "absl/memory/memory.h" > #include "rtc_base/checks.h" >-#include "rtc_base/logging.h" > #include "rtc_base/opensslidentity.h" >-#include "rtc_base/sslfingerprint.h" >+#include "rtc_base/sslcertificate.h" > #include "rtc_base/strings/string_builder.h" > #include "rtc_base/third_party/base64/base64.h" >+#include "rtc_base/timeutils.h" > > namespace rtc { > >+////////////////////////////////////////////////////////////////////// >+// Helper Functions >+////////////////////////////////////////////////////////////////////// >+ >+namespace { >+// Read |n| bytes from ASN1 number string at *|pp| and return the numeric value. >+// Update *|pp| and *|np| to reflect number of read bytes. >+// TODO(bugs.webrtc.org/9860) - Remove this code. >+inline int ASN1ReadInt(const unsigned char** pp, size_t* np, size_t n) { >+ const unsigned char* p = *pp; >+ int x = 0; >+ for (size_t i = 0; i < n; i++) { >+ x = 10 * x + p[i] - '0'; >+ } >+ *pp = p + n; >+ *np = *np - n; >+ return x; >+} >+ >+} // namespace >+ >+// TODO(bugs.webrtc.org/9860) - Remove this code. >+int64_t ASN1TimeToSec(const unsigned char* s, size_t length, bool long_format) { >+ size_t bytes_left = length; >+ // Make sure the string ends with Z. Doing it here protects the strspn call >+ // from running off the end of the string in Z's absense. >+ if (length == 0 || s[length - 1] != 'Z') { >+ return -1; >+ } >+ // Make sure we only have ASCII digits so that we don't need to clutter the >+ // code below and ASN1ReadInt with error checking. >+ size_t n = strspn(reinterpret_cast<const char*>(s), "0123456789"); >+ if (n + 1 != length) { >+ return -1; >+ } >+ // Read out ASN1 year, in either 2-char "UTCTIME" or 4-char "GENERALIZEDTIME" >+ // format. Both format use UTC in this context. 
>+ int year = 0; >+ if (long_format) { >+ // ASN1 format: yyyymmddhh[mm[ss[.fff]]]Z where the Z is literal, but >+ // RFC 5280 requires us to only support exactly yyyymmddhhmmssZ. >+ if (bytes_left < 11) { >+ return -1; >+ } >+ year = ASN1ReadInt(&s, &bytes_left, 4); >+ year -= 1900; >+ } else { >+ // ASN1 format: yymmddhhmm[ss]Z where the Z is literal, but RFC 5280 >+ // requires us to only support exactly yymmddhhmmssZ. >+ if (bytes_left < 9) { >+ return -1; >+ } >+ year = ASN1ReadInt(&s, &bytes_left, 2); >+ // Per RFC 5280 4.1.2.5.1 >+ if (year < 50) { >+ year += 100; >+ } >+ } >+ >+ // Read out remaining ASN1 time data and store it in |tm| in documented >+ // std::tm format. >+ tm tm; >+ tm.tm_year = year; >+ tm.tm_mon = ASN1ReadInt(&s, &bytes_left, 2) - 1; >+ tm.tm_mday = ASN1ReadInt(&s, &bytes_left, 2); >+ tm.tm_hour = ASN1ReadInt(&s, &bytes_left, 2); >+ tm.tm_min = ASN1ReadInt(&s, &bytes_left, 2); >+ tm.tm_sec = ASN1ReadInt(&s, &bytes_left, 2); >+ >+ // Now just Z should remain. Its existence was asserted above. >+ if (bytes_left != 1) { >+ return -1; >+ } >+ return TmToSeconds(tm); >+} >+ > ////////////////////////////////////////////////////////////////////// > // KeyParams > ////////////////////////////////////////////////////////////////////// >@@ -93,22 +168,21 @@ KeyType IntKeyTypeFamilyToKeyType(int key_type_family) { > bool SSLIdentity::PemToDer(const std::string& pem_type, > const std::string& pem_string, > std::string* der) { >- // Find the inner body. We need this to fulfill the contract of >- // returning pem_length. >+ // Find the inner body. We need this to fulfill the contract of returning >+ // pem_length. 
> size_t header = pem_string.find("-----BEGIN " + pem_type + "-----"); >- if (header == std::string::npos) >+ if (header == std::string::npos) { > return false; >- >+ } > size_t body = pem_string.find("\n", header); >- if (body == std::string::npos) >+ if (body == std::string::npos) { > return false; >- >+ } > size_t trailer = pem_string.find("-----END " + pem_type + "-----"); >- if (trailer == std::string::npos) >+ if (trailer == std::string::npos) { > return false; >- >+ } > std::string inner = pem_string.substr(body + 1, trailer - (body + 1)); >- > *der = Base64::Decode(inner, Base64::DO_PARSE_WHITE | Base64::DO_PAD_ANY | > Base64::DO_TERM_BUFFER); > return true; >@@ -118,14 +192,12 @@ std::string SSLIdentity::DerToPem(const std::string& pem_type, > const unsigned char* data, > size_t length) { > rtc::StringBuilder result; >- > result << "-----BEGIN " << pem_type << "-----\n"; > > std::string b64_encoded; > Base64::EncodeFromArray(data, length, &b64_encoded); >- >- // Divide the Base-64 encoded data into 64-character chunks, as per >- // 4.3.2.4 of RFC 1421. >+ // Divide the Base-64 encoded data into 64-character chunks, as per 4.3.2.4 >+ // of RFC 1421. > static const size_t kChunkSize = 64; > size_t chunks = (b64_encoded.size() + (kChunkSize - 1)) / kChunkSize; > for (size_t i = 0, chunk_offset = 0; i < chunks; >@@ -133,9 +205,7 @@ std::string SSLIdentity::DerToPem(const std::string& pem_type, > result << b64_encoded.substr(chunk_offset, kChunkSize); > result << "\n"; > } >- > result << "-----END " << pem_type << "-----\n"; >- > return result.Release(); > } > >@@ -186,78 +256,4 @@ bool operator!=(const SSLIdentity& a, const SSLIdentity& b) { > return !(a == b); > } > >-////////////////////////////////////////////////////////////////////// >-// Helper Functions >-////////////////////////////////////////////////////////////////////// >- >-// Read |n| bytes from ASN1 number string at *|pp| and return the numeric value. 
>-// Update *|pp| and *|np| to reflect number of read bytes. >-static inline int ASN1ReadInt(const unsigned char** pp, size_t* np, size_t n) { >- const unsigned char* p = *pp; >- int x = 0; >- for (size_t i = 0; i < n; i++) >- x = 10 * x + p[i] - '0'; >- *pp = p + n; >- *np = *np - n; >- return x; >-} >- >-int64_t ASN1TimeToSec(const unsigned char* s, size_t length, bool long_format) { >- size_t bytes_left = length; >- >- // Make sure the string ends with Z. Doing it here protects the strspn call >- // from running off the end of the string in Z's absense. >- if (length == 0 || s[length - 1] != 'Z') >- return -1; >- >- // Make sure we only have ASCII digits so that we don't need to clutter the >- // code below and ASN1ReadInt with error checking. >- size_t n = strspn(reinterpret_cast<const char*>(s), "0123456789"); >- if (n + 1 != length) >- return -1; >- >- int year; >- >- // Read out ASN1 year, in either 2-char "UTCTIME" or 4-char "GENERALIZEDTIME" >- // format. Both format use UTC in this context. >- if (long_format) { >- // ASN1 format: yyyymmddhh[mm[ss[.fff]]]Z where the Z is literal, but >- // RFC 5280 requires us to only support exactly yyyymmddhhmmssZ. >- >- if (bytes_left < 11) >- return -1; >- >- year = ASN1ReadInt(&s, &bytes_left, 4); >- year -= 1900; >- } else { >- // ASN1 format: yymmddhhmm[ss]Z where the Z is literal, but RFC 5280 >- // requires us to only support exactly yymmddhhmmssZ. >- >- if (bytes_left < 9) >- return -1; >- >- year = ASN1ReadInt(&s, &bytes_left, 2); >- if (year < 50) // Per RFC 5280 4.1.2.5.1 >- year += 100; >- } >- >- std::tm tm; >- tm.tm_year = year; >- >- // Read out remaining ASN1 time data and store it in |tm| in documented >- // std::tm format. 
>- tm.tm_mon = ASN1ReadInt(&s, &bytes_left, 2) - 1; >- tm.tm_mday = ASN1ReadInt(&s, &bytes_left, 2); >- tm.tm_hour = ASN1ReadInt(&s, &bytes_left, 2); >- tm.tm_min = ASN1ReadInt(&s, &bytes_left, 2); >- tm.tm_sec = ASN1ReadInt(&s, &bytes_left, 2); >- >- if (bytes_left != 1) { >- // Now just Z should remain. Its existence was asserted above. >- return -1; >- } >- >- return TmToSeconds(tm); >-} >- > } // namespace rtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslidentity.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslidentity.h >index 1379d733be5f4f1a2f877c64f5d3fba5fe79ffb5..39feeabab312c14054d2653a53ac243f57ee02b2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslidentity.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslidentity.h >@@ -13,19 +13,15 @@ > #ifndef RTC_BASE_SSLIDENTITY_H_ > #define RTC_BASE_SSLIDENTITY_H_ > >-#include <algorithm> >-#include <memory> >+#include <stdint.h> >+#include <ctime> > #include <string> >-#include <vector> >- >-#include "rtc_base/buffer.h" >-#include "rtc_base/constructormagic.h" >-#include "rtc_base/messagedigest.h" >-#include "rtc_base/sslcertificate.h" >-#include "rtc_base/timeutils.h" > > namespace rtc { > >+class SSLCertChain; >+class SSLCertificate; >+ > // KT_LAST is intended for vector declarations and loops over all key types; > // it does not represent any key type in itself. > // KT_DEFAULT is used as the default KeyType for KeyParams. 
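The ASN1TimeToSec code moved above (from the bottom of sslidentity.cc into an anonymous namespace near the top) parses RFC 5280 UTCTIME/GENERALIZEDTIME strings digit-pair by digit-pair. A minimal standalone sketch of its two core steps — a fixed-width ASCII digit reader and the RFC 5280 4.1.2.5.1 two-digit-year rule — is below; `ReadDigits` and `TwoDigitYearToFull` are illustrative names, not libwebrtc APIs:

```cpp
#include <cstddef>

// Same shape as the ASN1ReadInt helper in the patch: consume |n| ASCII
// digits from *pp, advance the cursor, shrink the remaining-byte count,
// and return the decimal value. Callers must have validated that the
// bytes are digits (the patch does this with strspn up front).
int ReadDigits(const unsigned char** pp, size_t* np, size_t n) {
  const unsigned char* p = *pp;
  int x = 0;
  for (size_t i = 0; i < n; i++) {
    x = 10 * x + (p[i] - '0');
  }
  *pp = p + n;
  *np -= n;
  return x;
}

// RFC 5280 4.1.2.5.1: in two-digit UTCTIME years, values below 50 mean
// 20xx and values 50..99 mean 19xx. The patch stores (year - 1900) in
// tm.tm_year; this helper returns the full calendar year instead.
int TwoDigitYearToFull(int yy) {
  return 1900 + (yy < 50 ? yy + 100 : yy);
}
```

For example, for the UTCTIME string `181203222046Z` the reader yields 18, 12, 03, 22, 20, 46 in order, and the year rule maps 18 to 2018.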
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslidentity_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslidentity_unittest.cc >index 132e240c2f5a5c0dc48c77ac88e75e86a3ac28c0..ba53d17da431a2acf2ab6b18f41c8ed616514dd2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslidentity_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslidentity_unittest.cc >@@ -14,6 +14,7 @@ > #include "rtc_base/fakesslidentity.h" > #include "rtc_base/gunit.h" > #include "rtc_base/helpers.h" >+#include "rtc_base/messagedigest.h" > #include "rtc_base/ssladapter.h" > #include "rtc_base/sslfingerprint.h" > #include "rtc_base/sslidentity.h" >@@ -179,7 +180,7 @@ IdentityAndInfo CreateFakeIdentityAndInfoFromDers( > const rtc::SSLCertChain& chain = info.identity->cert_chain(); > std::unique_ptr<rtc::SSLFingerprint> fp; > for (size_t i = 0; i < chain.GetSize(); i++) { >- fp.reset(rtc::SSLFingerprint::Create("sha-1", &chain.Get(i))); >+ fp = rtc::SSLFingerprint::Create("sha-1", chain.Get(i)); > EXPECT_TRUE(fp); > info.fingerprints.push_back(fp->GetRfc4572Fingerprint()); > } >@@ -200,7 +201,7 @@ class SSLIdentityTest : public testing::Test { > ASSERT_TRUE(identity_ecdsa1_); > ASSERT_TRUE(identity_ecdsa2_); > >- test_cert_.reset(rtc::SSLCertificate::FromPEMString(kTestCertificate)); >+ test_cert_ = rtc::SSLCertificate::FromPEMString(kTestCertificate); > ASSERT_TRUE(test_cert_); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslstreamadapter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslstreamadapter.cc >index 746ebd50063dca8bfae68985dbdb6eed7f532e0e..9c33a9c933c98a904049fc86bcf0110a77dfb772 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslstreamadapter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslstreamadapter.cc >@@ -89,40 +89,12 @@ bool IsGcmCryptoSuiteName(const std::string& crypto_suite) { > crypto_suite == CS_AEAD_AES_128_GCM); > } > >-// static 
>-CryptoOptions CryptoOptions::NoGcm() { >- CryptoOptions options; >- options.enable_gcm_crypto_suites = false; >- return options; >-} >- >-std::vector<int> GetSupportedDtlsSrtpCryptoSuites( >- const rtc::CryptoOptions& crypto_options) { >- std::vector<int> crypto_suites; >- if (crypto_options.enable_gcm_crypto_suites) { >- crypto_suites.push_back(rtc::SRTP_AEAD_AES_256_GCM); >- crypto_suites.push_back(rtc::SRTP_AEAD_AES_128_GCM); >- } >- // Note: SRTP_AES128_CM_SHA1_80 is what is required to be supported (by >- // draft-ietf-rtcweb-security-arch), but SRTP_AES128_CM_SHA1_32 is allowed as >- // well, and saves a few bytes per packet if it ends up selected. >- // As the cipher suite is potentially insecure, it will only be used if >- // enabled by both peers. >- if (crypto_options.enable_aes128_sha1_32_crypto_cipher) { >- crypto_suites.push_back(rtc::SRTP_AES128_CM_SHA1_32); >- } >- crypto_suites.push_back(rtc::SRTP_AES128_CM_SHA1_80); >- return crypto_suites; >-} >- > SSLStreamAdapter* SSLStreamAdapter::Create(StreamInterface* stream) { > return new OpenSSLStreamAdapter(stream); > } > > SSLStreamAdapter::SSLStreamAdapter(StreamInterface* stream) >- : StreamAdapterInterface(stream), >- ignore_bad_cert_(false), >- client_auth_enabled_(true) {} >+ : StreamAdapterInterface(stream) {} > > SSLStreamAdapter::~SSLStreamAdapter() {} > >@@ -161,8 +133,13 @@ bool SSLStreamAdapter::IsAcceptableCipher(const std::string& cipher, > std::string SSLStreamAdapter::SslCipherSuiteToName(int cipher_suite) { > return OpenSSLStreamAdapter::SslCipherSuiteToName(cipher_suite); > } >-void SSLStreamAdapter::enable_time_callback_for_testing() { >- OpenSSLStreamAdapter::enable_time_callback_for_testing(); >+ >+/////////////////////////////////////////////////////////////////////////////// >+// Test only settings >+/////////////////////////////////////////////////////////////////////////////// >+ >+void SSLStreamAdapter::EnableTimeCallbackForTesting() { >+ 
OpenSSLStreamAdapter::EnableTimeCallbackForTesting(); > } > > /////////////////////////////////////////////////////////////////////////////// >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslstreamadapter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslstreamadapter.h >index 2d4e19f9b50fef30c08cf063adf65464133ffc85..25f4f33486dadd984c8a42be1b31d3f2aa9ab583 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslstreamadapter.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslstreamadapter.h >@@ -70,34 +70,6 @@ bool IsGcmCryptoSuite(int crypto_suite); > // Returns true if the given crypto suite name uses a GCM cipher. > bool IsGcmCryptoSuiteName(const std::string& crypto_suite); > >-struct CryptoOptions { >- CryptoOptions() {} >- >- // Helper method to return an instance of the CryptoOptions with GCM crypto >- // suites disabled. This method should be used instead of depending on current >- // default values set by the constructor. >- static CryptoOptions NoGcm(); >- >- // Enable GCM crypto suites from RFC 7714 for SRTP. GCM will only be used >- // if both sides enable it. >- bool enable_gcm_crypto_suites = false; >- >- // If set to true, the (potentially insecure) crypto cipher >- // SRTP_AES128_CM_SHA1_32 will be included in the list of supported ciphers >- // during negotiation. It will only be used if both peers support it and no >- // other ciphers get preferred. >- bool enable_aes128_sha1_32_crypto_cipher = false; >- >- // If set to true, encrypted RTP header extensions as defined in RFC 6904 >- // will be negotiated. They will only be used if both peers support them. >- bool enable_encrypted_rtp_header_extensions = false; >-}; >- >-// Returns supported crypto suites, given |crypto_options|. >-// CS_AES_CM_128_HMAC_SHA1_32 will be preferred by default. 
>-std::vector<int> GetSupportedDtlsSrtpCryptoSuites( >- const rtc::CryptoOptions& crypto_options); >- > // SSLStreamAdapter : A StreamInterfaceAdapter that does SSL/TLS. > // After SSL has been started, the stream will only open on successful > // SSL verification of certificates, and the communication is >@@ -144,12 +116,6 @@ class SSLStreamAdapter : public StreamAdapterInterface { > explicit SSLStreamAdapter(StreamInterface* stream); > ~SSLStreamAdapter() override; > >- void set_ignore_bad_cert(bool ignore) { ignore_bad_cert_ = ignore; } >- bool ignore_bad_cert() const { return ignore_bad_cert_; } >- >- void set_client_auth_enabled(bool enabled) { client_auth_enabled_ = enabled; } >- bool client_auth_enabled() const { return client_auth_enabled_; } >- > // Specify our SSL identity: key and certificate. SSLStream takes ownership > // of the SSLIdentity object and will free it when appropriate. Should be > // called no more than once on a given SSLStream instance. >@@ -263,22 +229,32 @@ class SSLStreamAdapter : public StreamAdapterInterface { > // depending on specific SSL implementation. > static std::string SslCipherSuiteToName(int cipher_suite); > >+ //////////////////////////////////////////////////////////////////////////// >+ // Testing only member functions >+ //////////////////////////////////////////////////////////////////////////// >+ > // Use our timeutils.h source of timing in BoringSSL, allowing us to test > // using a fake clock. >- static void enable_time_callback_for_testing(); >+ static void EnableTimeCallbackForTesting(); >+ >+ // Deprecated. Do not use this API outside of testing. >+ // Do not set this to false outside of testing. >+ void SetClientAuthEnabledForTesting(bool enabled) { >+ client_auth_enabled_ = enabled; >+ } >+ >+ // Deprecated. Do not use this API outside of testing. >+ // Returns true by default, else false if explicitly set to disable client >+ // authentication. 
>+ bool GetClientAuthEnabled() const { return client_auth_enabled_; } > > sigslot::signal1<SSLHandshakeError> SignalSSLHandshakeError; > > private: >- // If true, the server certificate need not match the configured >- // server_name, and in fact missing certificate authority and other >- // verification errors are ignored. >- bool ignore_bad_cert_; >- > // If true (default), the client is required to provide a certificate during > // handshake. If no certificate is given, handshake fails. This applies to > // server mode only. >- bool client_auth_enabled_; >+ bool client_auth_enabled_ = true; > }; > > } // namespace rtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslstreamadapter_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslstreamadapter_unittest.cc >index 3403bdbe727ee33f0862860c9f71aad1db6d9fe7..6fbb1d7c2cc09822d26ba67e9a70ea52c315a71a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslstreamadapter_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/sslstreamadapter_unittest.cc >@@ -17,6 +17,8 @@ > #include "rtc_base/checks.h" > #include "rtc_base/gunit.h" > #include "rtc_base/helpers.h" >+#include "rtc_base/memory_stream.h" >+#include "rtc_base/messagedigest.h" > #include "rtc_base/ssladapter.h" > #include "rtc_base/sslidentity.h" > #include "rtc_base/sslstreamadapter.h" >@@ -587,8 +589,7 @@ class SSLStreamAdapterTestBase : public testing::Test, > chain = client_ssl_->GetPeerSSLCertChain(); > else > chain = server_ssl_->GetPeerSSLCertChain(); >- return (chain && chain->GetSize()) ? chain->Get(0).GetUniqueReference() >- : nullptr; >+ return (chain && chain->GetSize()) ? 
chain->Get(0).Clone() : nullptr; > } > > bool GetSslCipherSuite(bool client, int* retval) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stream.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stream.cc >index d1ec1400de60b1de3feeda9f779a009672f2190d..783625c67a94602511937ae40d66343aa856e3b2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stream.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stream.cc >@@ -7,29 +7,23 @@ > * in the file PATENTS. All contributing project authors may > * be found in the AUTHORS file in the root of the source tree. > */ >- >-#if defined(WEBRTC_POSIX) >-#include <sys/file.h> >-#endif // WEBRTC_POSIX > #include <errno.h> >+#include <string.h> > #include <sys/stat.h> >-#include <sys/types.h> >- > #include <algorithm> > #include <string> > > #include "rtc_base/checks.h" >-#include "rtc_base/logging.h" >+#include "rtc_base/location.h" > #include "rtc_base/messagequeue.h" > #include "rtc_base/stream.h" >-#include "rtc_base/stringencode.h" >-#include "rtc_base/stringutils.h" > #include "rtc_base/thread.h" >-#include "rtc_base/timeutils.h" > > #if defined(WEBRTC_WIN) > #include <windows.h> >+ > #define fileno _fileno >+#include "rtc_base/stringutils.h" > #endif > > namespace rtc { >@@ -75,26 +69,6 @@ StreamResult StreamInterface::ReadAll(void* buffer, > return result; > } > >-StreamResult StreamInterface::ReadLine(std::string* line) { >- line->clear(); >- StreamResult result = SR_SUCCESS; >- while (true) { >- char ch; >- result = Read(&ch, sizeof(ch), nullptr, nullptr); >- if (result != SR_SUCCESS) { >- break; >- } >- if (ch == '\n') { >- break; >- } >- line->push_back(ch); >- } >- if (!line->empty()) { // give back the line we've collected so far with >- result = SR_SUCCESS; // a success code. 
Otherwise return the last code >- } >- return result; >-} >- > void StreamInterface::PostEvent(Thread* t, int events, int err) { > t->Post(RTC_FROM_HERE, this, MSG_POST_EVENT, > new StreamEventData(events, err)); >@@ -104,14 +78,6 @@ void StreamInterface::PostEvent(int events, int err) { > PostEvent(Thread::Current(), events, err); > } > >-const void* StreamInterface::GetReadData(size_t* data_len) { >- return nullptr; >-} >- >-void* StreamInterface::GetWriteBuffer(size_t* buf_len) { >- return nullptr; >-} >- > bool StreamInterface::SetPosition(size_t position) { > return false; > } >@@ -124,10 +90,6 @@ bool StreamInterface::GetSize(size_t* size) const { > return false; > } > >-bool StreamInterface::GetWriteRemaining(size_t* size) const { >- return false; >-} >- > bool StreamInterface::Flush() { > return false; > } >@@ -188,10 +150,6 @@ bool StreamAdapterInterface::GetSize(size_t* size) const { > return stream_->GetSize(size); > } > >-bool StreamAdapterInterface::GetWriteRemaining(size_t* size) const { >- return stream_->GetWriteRemaining(size); >-} >- > bool StreamAdapterInterface::ReserveSize(size_t size) { > return stream_->ReserveSize(size); > } >@@ -389,143 +347,6 @@ void FileStream::DoClose() { > fclose(file_); > } > >-/////////////////////////////////////////////////////////////////////////////// >-// MemoryStream >-/////////////////////////////////////////////////////////////////////////////// >- >-MemoryStreamBase::MemoryStreamBase() >- : buffer_(nullptr), buffer_length_(0), data_length_(0), seek_position_(0) {} >- >-StreamState MemoryStreamBase::GetState() const { >- return SS_OPEN; >-} >- >-StreamResult MemoryStreamBase::Read(void* buffer, >- size_t bytes, >- size_t* bytes_read, >- int* error) { >- if (seek_position_ >= data_length_) { >- return SR_EOS; >- } >- size_t available = data_length_ - seek_position_; >- if (bytes > available) { >- // Read partial buffer >- bytes = available; >- } >- memcpy(buffer, &buffer_[seek_position_], bytes); >- 
seek_position_ += bytes; >- if (bytes_read) { >- *bytes_read = bytes; >- } >- return SR_SUCCESS; >-} >- >-StreamResult MemoryStreamBase::Write(const void* buffer, >- size_t bytes, >- size_t* bytes_written, >- int* error) { >- size_t available = buffer_length_ - seek_position_; >- if (0 == available) { >- // Increase buffer size to the larger of: >- // a) new position rounded up to next 256 bytes >- // b) double the previous length >- size_t new_buffer_length = >- std::max(((seek_position_ + bytes) | 0xFF) + 1, buffer_length_ * 2); >- StreamResult result = DoReserve(new_buffer_length, error); >- if (SR_SUCCESS != result) { >- return result; >- } >- RTC_DCHECK(buffer_length_ >= new_buffer_length); >- available = buffer_length_ - seek_position_; >- } >- >- if (bytes > available) { >- bytes = available; >- } >- memcpy(&buffer_[seek_position_], buffer, bytes); >- seek_position_ += bytes; >- if (data_length_ < seek_position_) { >- data_length_ = seek_position_; >- } >- if (bytes_written) { >- *bytes_written = bytes; >- } >- return SR_SUCCESS; >-} >- >-void MemoryStreamBase::Close() { >- // nothing to do >-} >- >-bool MemoryStreamBase::SetPosition(size_t position) { >- if (position > data_length_) >- return false; >- seek_position_ = position; >- return true; >-} >- >-bool MemoryStreamBase::GetPosition(size_t* position) const { >- if (position) >- *position = seek_position_; >- return true; >-} >- >-bool MemoryStreamBase::GetSize(size_t* size) const { >- if (size) >- *size = data_length_; >- return true; >-} >- >-bool MemoryStreamBase::ReserveSize(size_t size) { >- return (SR_SUCCESS == DoReserve(size, nullptr)); >-} >- >-StreamResult MemoryStreamBase::DoReserve(size_t size, int* error) { >- return (buffer_length_ >= size) ? 
SR_SUCCESS : SR_EOS; >-} >- >-/////////////////////////////////////////////////////////////////////////////// >- >-MemoryStream::MemoryStream() {} >- >-MemoryStream::MemoryStream(const char* data) { >- SetData(data, strlen(data)); >-} >- >-MemoryStream::MemoryStream(const void* data, size_t length) { >- SetData(data, length); >-} >- >-MemoryStream::~MemoryStream() { >- delete[] buffer_; >-} >- >-void MemoryStream::SetData(const void* data, size_t length) { >- data_length_ = buffer_length_ = length; >- delete[] buffer_; >- buffer_ = new char[buffer_length_]; >- memcpy(buffer_, data, data_length_); >- seek_position_ = 0; >-} >- >-StreamResult MemoryStream::DoReserve(size_t size, int* error) { >- if (buffer_length_ >= size) >- return SR_SUCCESS; >- >- if (char* new_buffer = new char[size]) { >- memcpy(new_buffer, buffer_, data_length_); >- delete[] buffer_; >- buffer_ = new_buffer; >- buffer_length_ = size; >- return SR_SUCCESS; >- } >- >- if (error) { >- *error = ENOMEM; >- } >- return SR_ERROR; >-} >- > /////////////////////////////////////////////////////////////////////////////// > // FifoBuffer > /////////////////////////////////////////////////////////////////////////////// >@@ -758,68 +579,4 @@ StreamResult FifoBuffer::WriteOffsetLocked(const void* buffer, > return SR_SUCCESS; > } > >- >-/////////////////////////////////////////////////////////////////////////////// >- >-StreamResult Flow(StreamInterface* source, >- char* buffer, >- size_t buffer_len, >- StreamInterface* sink, >- size_t* data_len /* = nullptr */) { >- RTC_DCHECK(buffer_len > 0); >- >- StreamResult result; >- size_t count, read_pos, write_pos; >- if (data_len) { >- read_pos = *data_len; >- } else { >- read_pos = 0; >- } >- >- bool end_of_stream = false; >- do { >- // Read until buffer is full, end of stream, or error >- while (!end_of_stream && (read_pos < buffer_len)) { >- result = source->Read(buffer + read_pos, buffer_len - read_pos, &count, >- nullptr); >- if (result == SR_EOS) { >- 
end_of_stream = true; >- } else if (result != SR_SUCCESS) { >- if (data_len) { >- *data_len = read_pos; >- } >- return result; >- } else { >- read_pos += count; >- } >- } >- >- // Write until buffer is empty, or error (including end of stream) >- write_pos = 0; >- while (write_pos < read_pos) { >- result = sink->Write(buffer + write_pos, read_pos - write_pos, &count, >- nullptr); >- if (result != SR_SUCCESS) { >- if (data_len) { >- *data_len = read_pos - write_pos; >- if (write_pos > 0) { >- memmove(buffer, buffer + write_pos, *data_len); >- } >- } >- return result; >- } >- write_pos += count; >- } >- >- read_pos = 0; >- } while (!end_of_stream); >- >- if (data_len) { >- *data_len = 0; >- } >- return SR_SUCCESS; >-} >- >-/////////////////////////////////////////////////////////////////////////////// >- > } // namespace rtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stream.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stream.h >index 81c5e41801a5baad651b42b5dadd8453b5123378..43e7f58bc7c0961371ca9ce1ed865c00bac0be52 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stream.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stream.h >@@ -18,7 +18,6 @@ > #include "rtc_base/buffer.h" > #include "rtc_base/constructormagic.h" > #include "rtc_base/criticalsection.h" >-#include "rtc_base/logging.h" > #include "rtc_base/messagehandler.h" > #include "rtc_base/messagequeue.h" > #include "rtc_base/third_party/sigslot/sigslot.h" >@@ -108,59 +107,6 @@ class StreamInterface : public MessageHandler { > // Like the aforementioned method, but posts to the current thread. > void PostEvent(int events, int err); > >- // >- // OPTIONAL OPERATIONS >- // >- // Not all implementations will support the following operations. In general, >- // a stream will only support an operation if it reasonably efficient to do >- // so. For example, while a socket could buffer incoming data to support >- // seeking, it will not do so. 
Instead, a buffering stream adapter should >- // be used. >- // >- // Even though several of these operations are related, you should >- // always use whichever operation is most relevant. >- // >- // The following four methods are used to avoid copying data multiple times. >- >- // GetReadData returns a pointer to a buffer which is owned by the stream. >- // The buffer contains data_len bytes. null is returned if no data is >- // available, or if the method fails. If the caller processes the data, it >- // must call ConsumeReadData with the number of processed bytes. GetReadData >- // does not require a matching call to ConsumeReadData if the data is not >- // processed. Read and ConsumeReadData invalidate the buffer returned by >- // GetReadData. >- virtual const void* GetReadData(size_t* data_len); >- virtual void ConsumeReadData(size_t used) {} >- >- // GetWriteBuffer returns a pointer to a buffer which is owned by the stream. >- // The buffer has a capacity of buf_len bytes. null is returned if there is >- // no buffer available, or if the method fails. The call may write data to >- // the buffer, and then call ConsumeWriteBuffer with the number of bytes >- // written. GetWriteBuffer does not require a matching call to >- // ConsumeWriteData if no data is written. Write, ForceWrite, and >- // ConsumeWriteData invalidate the buffer returned by GetWriteBuffer. >- // TODO: Allow the caller to specify a minimum buffer size. If the specified >- // amount of buffer is not yet available, return null and Signal SE_WRITE >- // when it is available. If the requested amount is too large, return an >- // error. >- virtual void* GetWriteBuffer(size_t* buf_len); >- virtual void ConsumeWriteBuffer(size_t used) {} >- >- // Write data_len bytes found in data, circumventing any throttling which >- // would could cause SR_BLOCK to be returned. Returns true if all the data >- // was written. 
Otherwise, the method is unsupported, or an unrecoverable >- // error occurred, and the error value is set. This method should be used >- // sparingly to write critical data which should not be throttled. A stream >- // which cannot circumvent its blocking constraints should not implement this >- // method. >- // NOTE: This interface is being considered experimentally at the moment. It >- // would be used by JUDP and BandwidthStream as a way to circumvent certain >- // soft limits in writing. >- // virtual bool ForceWrite(const void* data, size_t data_len, int* error) { >- // if (error) *error = -1; >- // return false; >- //} >- > // Seek to a byte offset from the beginning of the stream. Returns false if > // the stream does not support seeking, or cannot seek to the specified > // position. >@@ -174,10 +120,6 @@ class StreamInterface : public MessageHandler { > // is not known. > virtual bool GetSize(size_t* size) const; > >- // Return the number of Write()-able bytes remaining before end-of-stream. >- // Returns false if not known. >- virtual bool GetWriteRemaining(size_t* size) const; >- > // Return true if flush is successful. > virtual bool Flush(); > >@@ -214,12 +156,6 @@ class StreamInterface : public MessageHandler { > size_t* read, > int* error); > >- // ReadLine is a helper function which repeatedly calls Read until it hits >- // the end-of-line character, or something other than SR_SUCCESS. >- // TODO: this is too inefficient to keep here. Break this out into a buffered >- // readline object or adapter >- StreamResult ReadLine(std::string* line); >- > protected: > StreamInterface(); > >@@ -255,37 +191,9 @@ class StreamAdapterInterface : public StreamInterface, > int* error) override; > void Close() override; > >- // Optional Stream Interface >- /* Note: Many stream adapters were implemented prior to this Read/Write >- interface. Therefore, a simple pass through of data in those cases may >- be broken. 
At a later time, we should do a once-over pass of all >- adapters, and make them compliant with these interfaces, after which this >- code can be uncommented. >- virtual const void* GetReadData(size_t* data_len) { >- return stream_->GetReadData(data_len); >- } >- virtual void ConsumeReadData(size_t used) { >- stream_->ConsumeReadData(used); >- } >- >- virtual void* GetWriteBuffer(size_t* buf_len) { >- return stream_->GetWriteBuffer(buf_len); >- } >- virtual void ConsumeWriteBuffer(size_t used) { >- stream_->ConsumeWriteBuffer(used); >- } >- */ >- >- /* Note: This interface is currently undergoing evaluation. >- virtual bool ForceWrite(const void* data, size_t data_len, int* error) { >- return stream_->ForceWrite(data, data_len, error); >- } >- */ >- > bool SetPosition(size_t position) override; > bool GetPosition(size_t* position) const override; > bool GetSize(size_t* size) const override; >- bool GetWriteRemaining(size_t* size) const override; > bool ReserveSize(size_t size) override; > bool Flush() override; > >@@ -353,68 +261,11 @@ class FileStream : public StreamInterface { > RTC_DISALLOW_COPY_AND_ASSIGN(FileStream); > }; > >-/////////////////////////////////////////////////////////////////////////////// >-// MemoryStream is a simple implementation of a StreamInterface over in-memory >-// data. Data is read and written at the current seek position. Reads return >-// end-of-stream when they reach the end of data. Writes actually extend the >-// end of data mark. 
>-/////////////////////////////////////////////////////////////////////////////// >- >-class MemoryStreamBase : public StreamInterface { >- public: >- StreamState GetState() const override; >- StreamResult Read(void* buffer, >- size_t bytes, >- size_t* bytes_read, >- int* error) override; >- StreamResult Write(const void* buffer, >- size_t bytes, >- size_t* bytes_written, >- int* error) override; >- void Close() override; >- bool SetPosition(size_t position) override; >- bool GetPosition(size_t* position) const override; >- bool GetSize(size_t* size) const override; >- bool ReserveSize(size_t size) override; >- >- char* GetBuffer() { return buffer_; } >- const char* GetBuffer() const { return buffer_; } >- >- protected: >- MemoryStreamBase(); >- >- virtual StreamResult DoReserve(size_t size, int* error); >- >- // Invariant: 0 <= seek_position <= data_length_ <= buffer_length_ >- char* buffer_; >- size_t buffer_length_; >- size_t data_length_; >- size_t seek_position_; >- >- private: >- RTC_DISALLOW_COPY_AND_ASSIGN(MemoryStreamBase); >-}; >- >-// MemoryStream dynamically resizes to accomodate written data. >- >-class MemoryStream : public MemoryStreamBase { >- public: >- MemoryStream(); >- explicit MemoryStream(const char* data); // Calls SetData(data, strlen(data)) >- MemoryStream(const void* data, size_t length); // Calls SetData(data, length) >- ~MemoryStream() override; >- >- void SetData(const void* data, size_t length); >- >- protected: >- StreamResult DoReserve(size_t size, int* error) override; >-}; >- > // FifoBuffer allows for efficient, thread-safe buffering of data between > // writer and reader. As the data can wrap around the end of the buffer, > // MemoryStreamBase can't help us here. > >-class FifoBuffer : public StreamInterface { >+class FifoBuffer final : public StreamInterface { > public: > // Creates a FIFO buffer with the specified capacity. 
> explicit FifoBuffer(size_t length); >@@ -455,11 +306,28 @@ class FifoBuffer : public StreamInterface { > size_t* bytes_written, > int* error) override; > void Close() override; >- const void* GetReadData(size_t* data_len) override; >- void ConsumeReadData(size_t used) override; >- void* GetWriteBuffer(size_t* buf_len) override; >- void ConsumeWriteBuffer(size_t used) override; >- bool GetWriteRemaining(size_t* size) const override; >+ // GetReadData returns a pointer to a buffer which is owned by the stream. >+ // The buffer contains data_len bytes. null is returned if no data is >+ // available, or if the method fails. If the caller processes the data, it >+ // must call ConsumeReadData with the number of processed bytes. GetReadData >+ // does not require a matching call to ConsumeReadData if the data is not >+ // processed. Read and ConsumeReadData invalidate the buffer returned by >+ // GetReadData. >+ const void* GetReadData(size_t* data_len); >+ void ConsumeReadData(size_t used); >+ // GetWriteBuffer returns a pointer to a buffer which is owned by the stream. >+ // The buffer has a capacity of buf_len bytes. null is returned if there is >+ // no buffer available, or if the method fails. The call may write data to >+ // the buffer, and then call ConsumeWriteBuffer with the number of bytes >+ // written. GetWriteBuffer does not require a matching call to >+ // ConsumeWriteData if no data is written. Write and >+ // ConsumeWriteData invalidate the buffer returned by GetWriteBuffer. >+ void* GetWriteBuffer(size_t* buf_len); >+ void ConsumeWriteBuffer(size_t used); >+ >+ // Return the number of Write()-able bytes remaining before end-of-stream. >+ // Returns false if not known. >+ bool GetWriteRemaining(size_t* size) const; > > private: > // Helper method that implements ReadOffset. 
Caller must acquire a lock >@@ -495,25 +363,6 @@ class FifoBuffer : public StreamInterface { > RTC_DISALLOW_COPY_AND_ASSIGN(FifoBuffer); > }; > >-/////////////////////////////////////////////////////////////////////////////// >- >-// Flow attempts to move bytes from source to sink via buffer of size >-// buffer_len. The function returns SR_SUCCESS when source reaches >-// end-of-stream (returns SR_EOS), and all the data has been written successful >-// to sink. Alternately, if source returns SR_BLOCK or SR_ERROR, or if sink >-// returns SR_BLOCK, SR_ERROR, or SR_EOS, then the function immediately returns >-// with the unexpected StreamResult value. >-// data_len is the length of the valid data in buffer. in case of error >-// this is the data that read from source but can't move to destination. >-// as a pass in parameter, it indicates data in buffer that should move to sink >-StreamResult Flow(StreamInterface* source, >- char* buffer, >- size_t buffer_len, >- StreamInterface* sink, >- size_t* data_len = nullptr); >- >-/////////////////////////////////////////////////////////////////////////////// >- > } // namespace rtc > > #endif // RTC_BASE_STREAM_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stream_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stream_unittest.cc >index 616783cb4dc76011f60793a06a28b08ff798320e..2ca2526c3caaea70f4543c09862b4bbcfaf4cda7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stream_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stream_unittest.cc >@@ -105,56 +105,52 @@ TEST(FifoBufferTest, TestAll) { > const void* q; > size_t bytes; > FifoBuffer buf(kSize); >- StreamInterface* stream = &buf; > > // Test assumptions about base state >- EXPECT_EQ(SS_OPEN, stream->GetState()); >- EXPECT_EQ(SR_BLOCK, stream->Read(out, kSize, &bytes, nullptr)); >- EXPECT_TRUE(nullptr != stream->GetReadData(&bytes)); >- EXPECT_EQ((size_t)0, bytes); >- stream->ConsumeReadData(0); >- 
EXPECT_TRUE(nullptr != stream->GetWriteBuffer(&bytes)); >+ EXPECT_EQ(SS_OPEN, buf.GetState()); >+ EXPECT_EQ(SR_BLOCK, buf.Read(out, kSize, &bytes, nullptr)); >+ EXPECT_TRUE(nullptr != buf.GetWriteBuffer(&bytes)); > EXPECT_EQ(kSize, bytes); >- stream->ConsumeWriteBuffer(0); >+ buf.ConsumeWriteBuffer(0); > > // Try a full write >- EXPECT_EQ(SR_SUCCESS, stream->Write(in, kSize, &bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Write(in, kSize, &bytes, nullptr)); > EXPECT_EQ(kSize, bytes); > > // Try a write that should block >- EXPECT_EQ(SR_BLOCK, stream->Write(in, kSize, &bytes, nullptr)); >+ EXPECT_EQ(SR_BLOCK, buf.Write(in, kSize, &bytes, nullptr)); > > // Try a full read >- EXPECT_EQ(SR_SUCCESS, stream->Read(out, kSize, &bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Read(out, kSize, &bytes, nullptr)); > EXPECT_EQ(kSize, bytes); > EXPECT_EQ(0, memcmp(in, out, kSize)); > > // Try a read that should block >- EXPECT_EQ(SR_BLOCK, stream->Read(out, kSize, &bytes, nullptr)); >+ EXPECT_EQ(SR_BLOCK, buf.Read(out, kSize, &bytes, nullptr)); > > // Try a too-big write >- EXPECT_EQ(SR_SUCCESS, stream->Write(in, kSize * 2, &bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Write(in, kSize * 2, &bytes, nullptr)); > EXPECT_EQ(bytes, kSize); > > // Try a too-big read >- EXPECT_EQ(SR_SUCCESS, stream->Read(out, kSize * 2, &bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Read(out, kSize * 2, &bytes, nullptr)); > EXPECT_EQ(kSize, bytes); > EXPECT_EQ(0, memcmp(in, out, kSize)); > > // Try some small writes and reads >- EXPECT_EQ(SR_SUCCESS, stream->Write(in, kSize / 2, &bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Write(in, kSize / 2, &bytes, nullptr)); > EXPECT_EQ(kSize / 2, bytes); >- EXPECT_EQ(SR_SUCCESS, stream->Read(out, kSize / 2, &bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Read(out, kSize / 2, &bytes, nullptr)); > EXPECT_EQ(kSize / 2, bytes); > EXPECT_EQ(0, memcmp(in, out, kSize / 2)); >- EXPECT_EQ(SR_SUCCESS, stream->Write(in, kSize / 2, &bytes, nullptr)); >+ 
EXPECT_EQ(SR_SUCCESS, buf.Write(in, kSize / 2, &bytes, nullptr)); > EXPECT_EQ(kSize / 2, bytes); >- EXPECT_EQ(SR_SUCCESS, stream->Write(in, kSize / 2, &bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Write(in, kSize / 2, &bytes, nullptr)); > EXPECT_EQ(kSize / 2, bytes); >- EXPECT_EQ(SR_SUCCESS, stream->Read(out, kSize / 2, &bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Read(out, kSize / 2, &bytes, nullptr)); > EXPECT_EQ(kSize / 2, bytes); > EXPECT_EQ(0, memcmp(in, out, kSize / 2)); >- EXPECT_EQ(SR_SUCCESS, stream->Read(out, kSize / 2, &bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Read(out, kSize / 2, &bytes, nullptr)); > EXPECT_EQ(kSize / 2, bytes); > EXPECT_EQ(0, memcmp(in, out, kSize / 2)); > >@@ -166,51 +162,51 @@ TEST(FifoBufferTest, TestAll) { > // XXXXWWWWWWWWXXXX 4567012345670123 > // RRRRXXXXXXXXRRRR ....01234567.... > // ....RRRRRRRR.... ................ >- EXPECT_EQ(SR_SUCCESS, stream->Write(in, kSize * 3 / 4, &bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Write(in, kSize * 3 / 4, &bytes, nullptr)); > EXPECT_EQ(kSize * 3 / 4, bytes); >- EXPECT_EQ(SR_SUCCESS, stream->Read(out, kSize / 2, &bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Read(out, kSize / 2, &bytes, nullptr)); > EXPECT_EQ(kSize / 2, bytes); > EXPECT_EQ(0, memcmp(in, out, kSize / 2)); >- EXPECT_EQ(SR_SUCCESS, stream->Write(in, kSize / 2, &bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Write(in, kSize / 2, &bytes, nullptr)); > EXPECT_EQ(kSize / 2, bytes); >- EXPECT_EQ(SR_SUCCESS, stream->Read(out, kSize / 4, &bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Read(out, kSize / 4, &bytes, nullptr)); > EXPECT_EQ(kSize / 4, bytes); > EXPECT_EQ(0, memcmp(in + kSize / 2, out, kSize / 4)); >- EXPECT_EQ(SR_SUCCESS, stream->Write(in, kSize / 2, &bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Write(in, kSize / 2, &bytes, nullptr)); > EXPECT_EQ(kSize / 2, bytes); >- EXPECT_EQ(SR_SUCCESS, stream->Read(out, kSize / 2, &bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Read(out, kSize / 2, &bytes, nullptr)); > 
EXPECT_EQ(kSize / 2, bytes); > EXPECT_EQ(0, memcmp(in, out, kSize / 2)); >- EXPECT_EQ(SR_SUCCESS, stream->Read(out, kSize / 2, &bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Read(out, kSize / 2, &bytes, nullptr)); > EXPECT_EQ(kSize / 2, bytes); > EXPECT_EQ(0, memcmp(in, out, kSize / 2)); > > // Use GetWriteBuffer to reset the read_position for the next tests >- stream->GetWriteBuffer(&bytes); >- stream->ConsumeWriteBuffer(0); >+ buf.GetWriteBuffer(&bytes); >+ buf.ConsumeWriteBuffer(0); > > // Try using GetReadData to do a full read >- EXPECT_EQ(SR_SUCCESS, stream->Write(in, kSize, &bytes, nullptr)); >- q = stream->GetReadData(&bytes); >+ EXPECT_EQ(SR_SUCCESS, buf.Write(in, kSize, &bytes, nullptr)); >+ q = buf.GetReadData(&bytes); > EXPECT_TRUE(nullptr != q); > EXPECT_EQ(kSize, bytes); > EXPECT_EQ(0, memcmp(q, in, kSize)); >- stream->ConsumeReadData(kSize); >- EXPECT_EQ(SR_BLOCK, stream->Read(out, kSize, &bytes, nullptr)); >+ buf.ConsumeReadData(kSize); >+ EXPECT_EQ(SR_BLOCK, buf.Read(out, kSize, &bytes, nullptr)); > > // Try using GetReadData to do some small reads >- EXPECT_EQ(SR_SUCCESS, stream->Write(in, kSize, &bytes, nullptr)); >- q = stream->GetReadData(&bytes); >+ EXPECT_EQ(SR_SUCCESS, buf.Write(in, kSize, &bytes, nullptr)); >+ q = buf.GetReadData(&bytes); > EXPECT_TRUE(nullptr != q); > EXPECT_EQ(kSize, bytes); > EXPECT_EQ(0, memcmp(q, in, kSize / 2)); >- stream->ConsumeReadData(kSize / 2); >- q = stream->GetReadData(&bytes); >+ buf.ConsumeReadData(kSize / 2); >+ q = buf.GetReadData(&bytes); > EXPECT_TRUE(nullptr != q); > EXPECT_EQ(kSize / 2, bytes); > EXPECT_EQ(0, memcmp(q, in + kSize / 2, kSize / 2)); >- stream->ConsumeReadData(kSize / 2); >- EXPECT_EQ(SR_BLOCK, stream->Read(out, kSize, &bytes, nullptr)); >+ buf.ConsumeReadData(kSize / 2); >+ EXPECT_EQ(SR_BLOCK, buf.Read(out, kSize, &bytes, nullptr)); > > // Try using GetReadData in a wraparound case > // WWWWWWWWWWWWWWWW 0123456789ABCDEF >@@ -218,46 +214,46 @@ TEST(FifoBufferTest, TestAll) { > // 
WWWWWWWW....XXXX 01234567....CDEF > // ............RRRR 01234567........ > // RRRRRRRR........ ................ >- EXPECT_EQ(SR_SUCCESS, stream->Write(in, kSize, &bytes, nullptr)); >- EXPECT_EQ(SR_SUCCESS, stream->Read(out, kSize * 3 / 4, &bytes, nullptr)); >- EXPECT_EQ(SR_SUCCESS, stream->Write(in, kSize / 2, &bytes, nullptr)); >- q = stream->GetReadData(&bytes); >+ EXPECT_EQ(SR_SUCCESS, buf.Write(in, kSize, &bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Read(out, kSize * 3 / 4, &bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Write(in, kSize / 2, &bytes, nullptr)); >+ q = buf.GetReadData(&bytes); > EXPECT_TRUE(nullptr != q); > EXPECT_EQ(kSize / 4, bytes); > EXPECT_EQ(0, memcmp(q, in + kSize * 3 / 4, kSize / 4)); >- stream->ConsumeReadData(kSize / 4); >- q = stream->GetReadData(&bytes); >+ buf.ConsumeReadData(kSize / 4); >+ q = buf.GetReadData(&bytes); > EXPECT_TRUE(nullptr != q); > EXPECT_EQ(kSize / 2, bytes); > EXPECT_EQ(0, memcmp(q, in, kSize / 2)); >- stream->ConsumeReadData(kSize / 2); >+ buf.ConsumeReadData(kSize / 2); > > // Use GetWriteBuffer to reset the read_position for the next tests >- stream->GetWriteBuffer(&bytes); >- stream->ConsumeWriteBuffer(0); >+ buf.GetWriteBuffer(&bytes); >+ buf.ConsumeWriteBuffer(0); > > // Try using GetWriteBuffer to do a full write >- p = stream->GetWriteBuffer(&bytes); >+ p = buf.GetWriteBuffer(&bytes); > EXPECT_TRUE(nullptr != p); > EXPECT_EQ(kSize, bytes); > memcpy(p, in, kSize); >- stream->ConsumeWriteBuffer(kSize); >- EXPECT_EQ(SR_SUCCESS, stream->Read(out, kSize, &bytes, nullptr)); >+ buf.ConsumeWriteBuffer(kSize); >+ EXPECT_EQ(SR_SUCCESS, buf.Read(out, kSize, &bytes, nullptr)); > EXPECT_EQ(kSize, bytes); > EXPECT_EQ(0, memcmp(in, out, kSize)); > > // Try using GetWriteBuffer to do some small writes >- p = stream->GetWriteBuffer(&bytes); >+ p = buf.GetWriteBuffer(&bytes); > EXPECT_TRUE(nullptr != p); > EXPECT_EQ(kSize, bytes); > memcpy(p, in, kSize / 2); >- stream->ConsumeWriteBuffer(kSize / 2); >- p = 
stream->GetWriteBuffer(&bytes); >+ buf.ConsumeWriteBuffer(kSize / 2); >+ p = buf.GetWriteBuffer(&bytes); > EXPECT_TRUE(nullptr != p); > EXPECT_EQ(kSize / 2, bytes); > memcpy(p, in + kSize / 2, kSize / 2); >- stream->ConsumeWriteBuffer(kSize / 2); >- EXPECT_EQ(SR_SUCCESS, stream->Read(out, kSize, &bytes, nullptr)); >+ buf.ConsumeWriteBuffer(kSize / 2); >+ EXPECT_EQ(SR_SUCCESS, buf.Read(out, kSize, &bytes, nullptr)); > EXPECT_EQ(kSize, bytes); > EXPECT_EQ(0, memcmp(in, out, kSize)); > >@@ -267,53 +263,53 @@ TEST(FifoBufferTest, TestAll) { > // ........XXXXWWWW ........89AB0123 > // WWWW....XXXXXXXX 4567....89AB0123 > // RRRR....RRRRRRRR ................ >- EXPECT_EQ(SR_SUCCESS, stream->Write(in, kSize * 3 / 4, &bytes, nullptr)); >- EXPECT_EQ(SR_SUCCESS, stream->Read(out, kSize / 2, &bytes, nullptr)); >- p = stream->GetWriteBuffer(&bytes); >+ EXPECT_EQ(SR_SUCCESS, buf.Write(in, kSize * 3 / 4, &bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Read(out, kSize / 2, &bytes, nullptr)); >+ p = buf.GetWriteBuffer(&bytes); > EXPECT_TRUE(nullptr != p); > EXPECT_EQ(kSize / 4, bytes); > memcpy(p, in, kSize / 4); >- stream->ConsumeWriteBuffer(kSize / 4); >- p = stream->GetWriteBuffer(&bytes); >+ buf.ConsumeWriteBuffer(kSize / 4); >+ p = buf.GetWriteBuffer(&bytes); > EXPECT_TRUE(nullptr != p); > EXPECT_EQ(kSize / 2, bytes); > memcpy(p, in + kSize / 4, kSize / 4); >- stream->ConsumeWriteBuffer(kSize / 4); >- EXPECT_EQ(SR_SUCCESS, stream->Read(out, kSize * 3 / 4, &bytes, nullptr)); >+ buf.ConsumeWriteBuffer(kSize / 4); >+ EXPECT_EQ(SR_SUCCESS, buf.Read(out, kSize * 3 / 4, &bytes, nullptr)); > EXPECT_EQ(kSize * 3 / 4, bytes); > EXPECT_EQ(0, memcmp(in + kSize / 2, out, kSize / 4)); > EXPECT_EQ(0, memcmp(in, out + kSize / 4, kSize / 4)); > > // Check that the stream is now empty >- EXPECT_EQ(SR_BLOCK, stream->Read(out, kSize, &bytes, nullptr)); >+ EXPECT_EQ(SR_BLOCK, buf.Read(out, kSize, &bytes, nullptr)); > > // Try growing the buffer >- EXPECT_EQ(SR_SUCCESS, stream->Write(in, kSize, 
&bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Write(in, kSize, &bytes, nullptr)); > EXPECT_EQ(kSize, bytes); > EXPECT_TRUE(buf.SetCapacity(kSize * 2)); >- EXPECT_EQ(SR_SUCCESS, stream->Write(in + kSize, kSize, &bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Write(in + kSize, kSize, &bytes, nullptr)); > EXPECT_EQ(kSize, bytes); >- EXPECT_EQ(SR_SUCCESS, stream->Read(out, kSize * 2, &bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Read(out, kSize * 2, &bytes, nullptr)); > EXPECT_EQ(kSize * 2, bytes); > EXPECT_EQ(0, memcmp(in, out, kSize * 2)); > > // Try shrinking the buffer >- EXPECT_EQ(SR_SUCCESS, stream->Write(in, kSize, &bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Write(in, kSize, &bytes, nullptr)); > EXPECT_EQ(kSize, bytes); > EXPECT_TRUE(buf.SetCapacity(kSize)); >- EXPECT_EQ(SR_BLOCK, stream->Write(in, kSize, &bytes, nullptr)); >- EXPECT_EQ(SR_SUCCESS, stream->Read(out, kSize, &bytes, nullptr)); >+ EXPECT_EQ(SR_BLOCK, buf.Write(in, kSize, &bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Read(out, kSize, &bytes, nullptr)); > EXPECT_EQ(kSize, bytes); > EXPECT_EQ(0, memcmp(in, out, kSize)); > > // Write to the stream, close it, read the remaining bytes >- EXPECT_EQ(SR_SUCCESS, stream->Write(in, kSize / 2, &bytes, nullptr)); >- stream->Close(); >- EXPECT_EQ(SS_CLOSED, stream->GetState()); >- EXPECT_EQ(SR_EOS, stream->Write(in, kSize / 2, &bytes, nullptr)); >- EXPECT_EQ(SR_SUCCESS, stream->Read(out, kSize / 2, &bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Write(in, kSize / 2, &bytes, nullptr)); >+ buf.Close(); >+ EXPECT_EQ(SS_CLOSED, buf.GetState()); >+ EXPECT_EQ(SR_EOS, buf.Write(in, kSize / 2, &bytes, nullptr)); >+ EXPECT_EQ(SR_SUCCESS, buf.Read(out, kSize / 2, &bytes, nullptr)); > EXPECT_EQ(0, memcmp(in, out, kSize / 2)); >- EXPECT_EQ(SR_EOS, stream->Read(out, kSize / 2, &bytes, nullptr)); >+ EXPECT_EQ(SR_EOS, buf.Read(out, kSize / 2, &bytes, nullptr)); > } > > TEST(FifoBufferTest, FullBufferCheck) { >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/string_to_number.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/string_to_number.cc >index 9201242d5f578d7f558ef1f073ac35b558467d69..634652b83f6be2bea367fb9128c1da4dffd91304 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/string_to_number.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/string_to_number.cc >@@ -10,6 +10,7 @@ > > #include "rtc_base/string_to_number.h" > >+#include <ctype.h> > #include <cerrno> > #include <cstdlib> > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/string_to_number.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/string_to_number.h >index 7ea9f251ed468bbb6a8a0cf76c6888d491c0309e..4cb521595defadd2f10bfd6dab4a38d1553617b3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/string_to_number.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/string_to_number.h >@@ -13,6 +13,7 @@ > > #include <limits> > #include <string> >+#include <type_traits> > > #include "absl/types/optional.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringencode.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringencode.cc >index a57845ffdad5a80b0bfc24f9b545922b064ef26b..fc4e3bc8e7315cc7235c1f32769a5a6135258522 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringencode.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringencode.cc >@@ -10,8 +10,7 @@ > > #include "rtc_base/stringencode.h" > >-#include <stdio.h> >-#include <stdlib.h> >+#include <cstdio> > > #include "rtc_base/arraysize.h" > #include "rtc_base/checks.h" >@@ -51,67 +50,6 @@ size_t url_decode(char* buffer, > return bufpos; > } > >-size_t utf8_decode(const char* source, size_t srclen, unsigned long* value) { >- const unsigned char* s = reinterpret_cast<const unsigned char*>(source); >- if ((s[0] & 0x80) == 0x00) { // Check s[0] == 0xxxxxxx >- *value = s[0]; >- return 1; >- } >- if ((srclen < 2) || ((s[1] 
& 0xC0) != 0x80)) { // Check s[1] != 10xxxxxx >- return 0; >- } >- // Accumulate the trailer byte values in value16, and combine it with the >- // relevant bits from s[0], once we've determined the sequence length. >- unsigned long value16 = (s[1] & 0x3F); >- if ((s[0] & 0xE0) == 0xC0) { // Check s[0] == 110xxxxx >- *value = ((s[0] & 0x1F) << 6) | value16; >- return 2; >- } >- if ((srclen < 3) || ((s[2] & 0xC0) != 0x80)) { // Check s[2] != 10xxxxxx >- return 0; >- } >- value16 = (value16 << 6) | (s[2] & 0x3F); >- if ((s[0] & 0xF0) == 0xE0) { // Check s[0] == 1110xxxx >- *value = ((s[0] & 0x0F) << 12) | value16; >- return 3; >- } >- if ((srclen < 4) || ((s[3] & 0xC0) != 0x80)) { // Check s[3] != 10xxxxxx >- return 0; >- } >- value16 = (value16 << 6) | (s[3] & 0x3F); >- if ((s[0] & 0xF8) == 0xF0) { // Check s[0] == 11110xxx >- *value = ((s[0] & 0x07) << 18) | value16; >- return 4; >- } >- return 0; >-} >- >-size_t utf8_encode(char* buffer, size_t buflen, unsigned long value) { >- if ((value <= 0x7F) && (buflen >= 1)) { >- buffer[0] = static_cast<unsigned char>(value); >- return 1; >- } >- if ((value <= 0x7FF) && (buflen >= 2)) { >- buffer[0] = 0xC0 | static_cast<unsigned char>(value >> 6); >- buffer[1] = 0x80 | static_cast<unsigned char>(value & 0x3F); >- return 2; >- } >- if ((value <= 0xFFFF) && (buflen >= 3)) { >- buffer[0] = 0xE0 | static_cast<unsigned char>(value >> 12); >- buffer[1] = 0x80 | static_cast<unsigned char>((value >> 6) & 0x3F); >- buffer[2] = 0x80 | static_cast<unsigned char>(value & 0x3F); >- return 3; >- } >- if ((value <= 0x1FFFFF) && (buflen >= 4)) { >- buffer[0] = 0xF0 | static_cast<unsigned char>(value >> 18); >- buffer[1] = 0x80 | static_cast<unsigned char>((value >> 12) & 0x3F); >- buffer[2] = 0x80 | static_cast<unsigned char>((value >> 6) & 0x3F); >- buffer[3] = 0x80 | static_cast<unsigned char>(value & 0x3F); >- return 4; >- } >- return 0; >-} >- > static const char HEX[] = "0123456789abcdef"; > > char hex_encode(unsigned char val) { >@@ 
-122,9 +60,9 @@ char hex_encode(unsigned char val) { > bool hex_decode(char ch, unsigned char* val) { > if ((ch >= '0') && (ch <= '9')) { > *val = ch - '0'; >- } else if ((ch >= 'A') && (ch <= 'Z')) { >+ } else if ((ch >= 'A') && (ch <= 'F')) { > *val = (ch - 'A') + 10; >- } else if ((ch >= 'a') && (ch <= 'z')) { >+ } else if ((ch >= 'a') && (ch <= 'f')) { > *val = (ch - 'a') + 10; > } else { > return false; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringencode.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringencode.h >index 5d436dfd57ba08a57307f5ac3d3103e3fce6e081..09bf77f4461525a3bafc1fa1faca69c296e2d2c2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringencode.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringencode.h >@@ -11,9 +11,12 @@ > #ifndef RTC_BASE_STRINGENCODE_H_ > #define RTC_BASE_STRINGENCODE_H_ > >+#include <stddef.h> > #include <string> >+#include <type_traits> > #include <vector> > >+#include "absl/types/optional.h" > #include "rtc_base/checks.h" > #include "rtc_base/string_to_number.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringencode_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringencode_unittest.cc >index 9bdc592c29aa0663778ae2c7d86dca601819b1f4..f21c4cb78c6f8484f7853572840bea293292b9c1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringencode_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringencode_unittest.cc >@@ -141,7 +141,8 @@ TEST_F(HexEncodeTest, TestDecodeTooShort) { > > // Test that decoding non-hex data fails. 
> TEST_F(HexEncodeTest, TestDecodeBogusData) { >- dec_res_ = hex_decode_with_delimiter(decoded_, sizeof(decoded_), "xyz", 3, 0); >+ dec_res_ = >+ hex_decode_with_delimiter(decoded_, sizeof(decoded_), "axyz", 4, 0); > ASSERT_EQ(0U, dec_res_); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/strings/audio_format_to_string.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/strings/audio_format_to_string.cc >index a1493441985fdc9d72ee254121460a258cf697f3..7e91c3b49dd415585c18dcf657e3393b921def10 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/strings/audio_format_to_string.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/strings/audio_format_to_string.cc >@@ -10,6 +10,8 @@ > > #include "rtc_base/strings/audio_format_to_string.h" > >+#include <utility> >+ > #include "rtc_base/strings/string_builder.h" > > namespace rtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/strings/string_builder.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/strings/string_builder.cc >index 2fd4aba19cdcdbd32dfe9e999a09b8ff92e3638d..adf4fa9e819a389dd5b1c2fe9edff9360ddc1b6d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/strings/string_builder.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/strings/string_builder.cc >@@ -10,6 +10,12 @@ > > #include "rtc_base/strings/string_builder.h" > >+#include <stdarg.h> >+#include <cstring> >+ >+#include "rtc_base/checks.h" >+#include "rtc_base/numerics/safe_minmax.h" >+ > namespace rtc { > > SimpleStringBuilder::SimpleStringBuilder(rtc::ArrayView<char> buffer) >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/strings/string_builder.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/strings/string_builder.h >index 186c2f7b9d79816e31acc02ec50daea6ff6d8ff2..27001d1be5a3c12588830ab3dc60fca46a733584 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/strings/string_builder.h >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/strings/string_builder.h >@@ -12,13 +12,11 @@ > #define RTC_BASE_STRINGS_STRING_BUILDER_H_ > > #include <cstdio> >-#include <cstring> > #include <string> >+#include <utility> > > #include "absl/strings/string_view.h" > #include "api/array_view.h" >-#include "rtc_base/checks.h" >-#include "rtc_base/numerics/safe_minmax.h" > #include "rtc_base/stringencode.h" > #include "rtc_base/stringutils.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringutils.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringutils.cc >index 6baddd11bdd36d196db9fe31cc294a8fea4ca53e..c808eb2cc41ee607c3b9e73a9ec60db762eef1ad 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringutils.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringutils.cc >@@ -7,97 +7,29 @@ > * in the file PATENTS. All contributing project authors may > * be found in the AUTHORS file in the root of the source tree. > */ >-#include <algorithm> >-#include <cstdio> > >-#include "rtc_base/checks.h" > #include "rtc_base/stringutils.h" > > namespace rtc { > >-bool memory_check(const void* memory, int c, size_t count) { >- const char* char_memory = static_cast<const char*>(memory); >- char char_c = static_cast<char>(c); >- for (size_t i = 0; i < count; ++i) { >- if (char_memory[i] != char_c) { >- return false; >- } >- } >- return true; >-} >- >-bool string_match(const char* target, const char* pattern) { >- while (*pattern) { >- if (*pattern == '*') { >- if (!*++pattern) { >- return true; >- } >- while (*target) { >- if ((toupper(*pattern) == toupper(*target)) && >- string_match(target + 1, pattern + 1)) { >- return true; >- } >- ++target; >- } >- return false; >- } else { >- if (toupper(*pattern) != toupper(*target)) { >- return false; >- } >- ++target; >- ++pattern; >- } >- } >- return !*target; >-} >- >-#if defined(WEBRTC_WIN) >-int ascii_string_compare(const wchar_t* s1, >- const char* s2, >- size_t n, >- 
CharacterTransformation transformation) { >- wchar_t c1, c2; >- while (true) { >- if (n-- == 0) >- return 0; >- c1 = transformation(*s1); >- // Double check that characters are not UTF-8 >- RTC_DCHECK_LT(*s2, 128); >- // Note: *s2 gets implicitly promoted to wchar_t >- c2 = transformation(*s2); >- if (c1 != c2) >- return (c1 < c2) ? -1 : 1; >- if (!c1) >- return 0; >- ++s1; >- ++s2; >- } >-} >- >-size_t asccpyn(wchar_t* buffer, >+size_t strcpyn(char* buffer, > size_t buflen, > const char* source, >- size_t srclen) { >+ size_t srclen /* = SIZE_UNKNOWN */) { > if (buflen <= 0) > return 0; > > if (srclen == SIZE_UNKNOWN) { >- srclen = strlenn(source, buflen - 1); >- } else if (srclen >= buflen) { >+ srclen = strlen(source); >+ } >+ if (srclen >= buflen) { > srclen = buflen - 1; > } >-#if RTC_DCHECK_IS_ON >- // Double check that characters are not UTF-8 >- for (size_t pos = 0; pos < srclen; ++pos) >- RTC_DCHECK_LT(source[pos], 128); >-#endif >- std::copy(source, source + srclen, buffer); >+ memcpy(buffer, source, srclen); > buffer[srclen] = 0; > return srclen; > } > >-#endif // WEBRTC_WIN >- > void replace_substrs(const char* search, > size_t search_len, > const char* replace, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringutils.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringutils.h >index 26c4627d5ca59803a4e3fb5da2977863bca97972..c9dabfb8aaafb85b5ebb28f2f2b9637e777d4326 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringutils.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringutils.h >@@ -13,8 +13,7 @@ > > #if !defined(WEBRTC_WEBKIT_BUILD) > #include <ctype.h> >-#endif >- >+#endif // !defined(WEBRTC_WEBKIT_BUILD) > #include <stdarg.h> > #include <stdio.h> > #include <string.h> >@@ -23,6 +22,7 @@ > #include <malloc.h> > #include <wchar.h> > #include <windows.h> >+ > #define alloca _alloca > #endif // WEBRTC_WIN > >@@ -32,6 +32,7 @@ > #else // BSD > #include <alloca.h> > #endif // !BSD >+#include 
<strings.h> > #endif // WEBRTC_POSIX > > #include <string> >@@ -43,49 +44,6 @@ > #define STACK_ARRAY(TYPE, LEN) \ > static_cast<TYPE*>(::alloca((LEN) * sizeof(TYPE))) > >-namespace rtc { >- >-// Determines whether the simple wildcard pattern matches target. >-// Alpha characters in pattern match case-insensitively. >-// Asterisks in pattern match 0 or more characters. >-// Ex: string_match("www.TEST.GOOGLE.COM", "www.*.com") -> true >-bool string_match(const char* target, const char* pattern); >- >-} // namespace rtc >- >-#if !defined(WEBRTC_WEBKIT_BUILD) >-/////////////////////////////////////////////////////////////////////////////// >-// Rename a few common string functions so they are consistent across platforms. >-// tolowercase is like tolower, but not compatible with end-of-file value >-// >-// It's not clear if we will ever use wchar_t strings on unix. In theory, >-// all strings should be Utf8 all the time, except when interfacing with Win32 >-// APIs that require Utf16. >-/////////////////////////////////////////////////////////////////////////////// >-inline char tolowercase(char c) { >- return static_cast<char>(tolower(c)); >-} >-#endif // !defined(WEBRTC_WEBKIT_BUILD) >- >-#if defined(WEBRTC_WIN) >- >-inline wchar_t tolowercase(wchar_t c) { >- return static_cast<wchar_t>(towlower(c)); >-} >- >-#endif // WEBRTC_WIN >- >-#if defined(WEBRTC_POSIX) >- >-inline int _stricmp(const char* s1, const char* s2) { >- return strcasecmp(s1, s2); >-} >-inline int _strnicmp(const char* s1, const char* s2, size_t n) { >- return strncasecmp(s1, s2, n); >-} >- >-#endif // WEBRTC_POSIX >- > /////////////////////////////////////////////////////////////////////////////// > // Traits simplifies porting string functions to be CTYPE-agnostic > /////////////////////////////////////////////////////////////////////////////// >@@ -94,191 +52,12 @@ namespace rtc { > > const size_t SIZE_UNKNOWN = static_cast<size_t>(-1); > >-template <class CTYPE> >-struct Traits { >- // STL string 
type >- // typedef XXX string; >- // Null-terminated string >- // inline static const CTYPE* empty_str(); >-}; >- >-/////////////////////////////////////////////////////////////////////////////// >-// String utilities which work with char or wchar_t >-/////////////////////////////////////////////////////////////////////////////// >- >-template <class CTYPE> >-inline const CTYPE* nonnull(const CTYPE* str, const CTYPE* def_str = nullptr) { >- return str ? str : (def_str ? def_str : Traits<CTYPE>::empty_str()); >-} >- >-template <class CTYPE> >-const CTYPE* strchr(const CTYPE* str, const CTYPE* chs) { >- for (size_t i = 0; str[i]; ++i) { >- for (size_t j = 0; chs[j]; ++j) { >- if (str[i] == chs[j]) { >- return str + i; >- } >- } >- } >- return 0; >-} >- >-template <class CTYPE> >-const CTYPE* strchrn(const CTYPE* str, size_t slen, CTYPE ch) { >- for (size_t i = 0; i < slen && str[i]; ++i) { >- if (str[i] == ch) { >- return str + i; >- } >- } >- return 0; >-} >- >-template <class CTYPE> >-size_t strlenn(const CTYPE* buffer, size_t buflen) { >- size_t bufpos = 0; >- while (buffer[bufpos] && (bufpos < buflen)) { >- ++bufpos; >- } >- return bufpos; >-} >- >-// Safe versions of strncpy, strncat, snprintf and vsnprintf that always >-// null-terminate. 
>- >-template <class CTYPE> >-size_t strcpyn(CTYPE* buffer, >- size_t buflen, >- const CTYPE* source, >- size_t srclen = SIZE_UNKNOWN) { >- if (buflen <= 0) >- return 0; >- >- if (srclen == SIZE_UNKNOWN) { >- srclen = strlenn(source, buflen - 1); >- } else if (srclen >= buflen) { >- srclen = buflen - 1; >- } >- memcpy(buffer, source, srclen * sizeof(CTYPE)); >- buffer[srclen] = 0; >- return srclen; >-} >- >-template <class CTYPE> >-size_t strcatn(CTYPE* buffer, >- size_t buflen, >- const CTYPE* source, >- size_t srclen = SIZE_UNKNOWN) { >- if (buflen <= 0) >- return 0; >- >- size_t bufpos = strlenn(buffer, buflen - 1); >- return bufpos + strcpyn(buffer + bufpos, buflen - bufpos, source, srclen); >-} >- >-// Some compilers (clang specifically) require vsprintfn be defined before >-// sprintfn. >-template <class CTYPE> >-size_t vsprintfn(CTYPE* buffer, >- size_t buflen, >- const CTYPE* format, >- va_list args) { >- int len = vsnprintf(buffer, buflen, format, args); >- if ((len < 0) || (static_cast<size_t>(len) >= buflen)) { >- len = static_cast<int>(buflen - 1); >- buffer[len] = 0; >- } >- return len; >-} >- >-template <class CTYPE> >-size_t sprintfn(CTYPE* buffer, size_t buflen, const CTYPE* format, ...); >-template <class CTYPE> >-size_t sprintfn(CTYPE* buffer, size_t buflen, const CTYPE* format, ...) { >- va_list args; >- va_start(args, format); >- size_t len = vsprintfn(buffer, buflen, format, args); >- va_end(args); >- return len; >-} >- >-/////////////////////////////////////////////////////////////////////////////// >-// Allow safe comparing and copying ascii (not UTF-8) with both wide and >-// non-wide character strings. 
>-/////////////////////////////////////////////////////////////////////////////// >- >-inline int asccmp(const char* s1, const char* s2) { >- return strcmp(s1, s2); >-} >-inline int ascicmp(const char* s1, const char* s2) { >- return _stricmp(s1, s2); >-} >-inline int ascncmp(const char* s1, const char* s2, size_t n) { >- return strncmp(s1, s2, n); >-} >-inline int ascnicmp(const char* s1, const char* s2, size_t n) { >- return _strnicmp(s1, s2, n); >-} >-inline size_t asccpyn(char* buffer, >- size_t buflen, >- const char* source, >- size_t srclen = SIZE_UNKNOWN) { >- return strcpyn(buffer, buflen, source, srclen); >-} >- >-#if defined(WEBRTC_WIN) >- >-typedef wchar_t (*CharacterTransformation)(wchar_t); >-inline wchar_t identity(wchar_t c) { >- return c; >-} >-int ascii_string_compare(const wchar_t* s1, >- const char* s2, >- size_t n, >- CharacterTransformation transformation); >- >-inline int asccmp(const wchar_t* s1, const char* s2) { >- return ascii_string_compare(s1, s2, static_cast<size_t>(-1), identity); >-} >-inline int ascicmp(const wchar_t* s1, const char* s2) { >- return ascii_string_compare(s1, s2, static_cast<size_t>(-1), tolowercase); >-} >-inline int ascncmp(const wchar_t* s1, const char* s2, size_t n) { >- return ascii_string_compare(s1, s2, n, identity); >-} >-inline int ascnicmp(const wchar_t* s1, const char* s2, size_t n) { >- return ascii_string_compare(s1, s2, n, tolowercase); >-} >-size_t asccpyn(wchar_t* buffer, >+// Safe version of strncpy that always null-terminates.
>+size_t strcpyn(char* buffer, > size_t buflen, > const char* source, > size_t srclen = SIZE_UNKNOWN); > >-#endif // WEBRTC_WIN >- >-/////////////////////////////////////////////////////////////////////////////// >-// Traits<char> specializations >-/////////////////////////////////////////////////////////////////////////////// >- >-template <> >-struct Traits<char> { >- typedef std::string string; >- inline static const char* empty_str() { return ""; } >-}; >- >-/////////////////////////////////////////////////////////////////////////////// >-// Traits<wchar_t> specializations (Windows only, currently) >-/////////////////////////////////////////////////////////////////////////////// >- >-#if defined(WEBRTC_WIN) >- >-template <> >-struct Traits<wchar_t> { >- typedef std::wstring string; >- inline static const wchar_t* empty_str() { return L""; } >-}; >- >-#endif // WEBRTC_WIN >- > /////////////////////////////////////////////////////////////////////////////// > // UTF helpers (Windows only) > /////////////////////////////////////////////////////////////////////////////// >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringutils_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringutils_unittest.cc >index a6e64680bb7359d81ed1ae4effcb6396508b6cc6..663e9768f608a346c842045a11ced5704e76cf95 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringutils_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringutils_unittest.cc >@@ -13,70 +13,6 @@ > > namespace rtc { > >-// Tests for string_match(). 
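The patch keeps only a char-based `strcpyn` with the guarantee its comment states: unlike `strncpy`, the destination is always null-terminated. A minimal sketch of that guarantee (the name `strcpyn_sketch` is a stand-in for illustration; the real declaration lives in stringutils.h and also accepts an optional `srclen`):

```cpp
#include <cstddef>
#include <cstring>

// Sketch of a strncpy-like copy that always null-terminates, truncating
// if needed, and returns the number of characters copied (excluding the
// terminator). Hypothetical helper, not the exact stringutils.h signature.
size_t strcpyn_sketch(char* buffer, size_t buflen, const char* source) {
  if (buflen == 0)
    return 0;
  size_t srclen = std::strlen(source);
  if (srclen >= buflen)
    srclen = buflen - 1;  // truncate, leaving room for the terminator
  std::memcpy(buffer, source, srclen);
  buffer[srclen] = '\0';  // always terminate, even on truncation
  return srclen;
}
```

The key difference from plain `strncpy` is the unconditional terminator write, which avoids the classic unterminated-buffer pitfall on truncation.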
>- >-TEST(string_matchTest, Matches) { >- EXPECT_TRUE(string_match("A.B.C.D", "a.b.c.d")); >- EXPECT_TRUE(string_match("www.TEST.GOOGLE.COM", "www.*.com")); >- EXPECT_TRUE(string_match("127.0.0.1", "12*.0.*1")); >- EXPECT_TRUE(string_match("127.1.0.21", "12*.0.*1")); >- EXPECT_FALSE(string_match("127.0.0.0", "12*.0.*1")); >- EXPECT_FALSE(string_match("127.0.0.0", "12*.0.*1")); >- EXPECT_FALSE(string_match("127.1.1.21", "12*.0.*1")); >-} >- >-// It's not clear if we will ever use wchar_t strings on unix. In theory, >-// all strings should be Utf8 all the time, except when interfacing with Win32 >-// APIs that require Utf16. >- >-#if defined(WEBRTC_WIN) >- >-// Tests for ascii_string_compare(). >- >-// Tests null input. >-TEST(ascii_string_compareTest, NullInput) { >- // The following results in an access violation in >- // ascii_string_compare. Is this a bug or by design? stringutils.h >- // should document the expected behavior in this case. >- >- // EXPECT_EQ(0, ascii_string_compare(nullptr, nullptr, 1, identity)); >-} >- >-// Tests comparing two strings of different lengths. >-TEST(ascii_string_compareTest, DifferentLengths) { >- EXPECT_EQ(-1, ascii_string_compare(L"Test", "Test1", 5, identity)); >-} >- >-// Tests the case where the buffer size is smaller than the string >-// lengths. >-TEST(ascii_string_compareTest, SmallBuffer) { >- EXPECT_EQ(0, ascii_string_compare(L"Test", "Test1", 3, identity)); >-} >- >-// Tests the case where the buffer is not full. >-TEST(ascii_string_compareTest, LargeBuffer) { >- EXPECT_EQ(0, ascii_string_compare(L"Test", "Test", 10, identity)); >-} >- >-// Tests comparing two eqaul strings. >-TEST(ascii_string_compareTest, Equal) { >- EXPECT_EQ(0, ascii_string_compare(L"Test", "Test", 5, identity)); >- EXPECT_EQ(0, ascii_string_compare(L"TeSt", "tEsT", 5, tolowercase)); >-} >- >-// Tests comparing a smller string to a larger one. 
>-TEST(ascii_string_compareTest, LessThan) { >- EXPECT_EQ(-1, ascii_string_compare(L"abc", "abd", 4, identity)); >- EXPECT_EQ(-1, ascii_string_compare(L"ABC", "abD", 5, tolowercase)); >-} >- >-// Tests comparing a larger string to a smaller one. >-TEST(ascii_string_compareTest, GreaterThan) { >- EXPECT_EQ(1, ascii_string_compare(L"xyz", "xy", 5, identity)); >- EXPECT_EQ(1, ascii_string_compare(L"abc", "ABB", 5, tolowercase)); >-} >-#endif // WEBRTC_WIN >- > TEST(string_trim_Test, Trimming) { > EXPECT_EQ("temp", string_trim("\n\r\t temp \n\r\t")); > EXPECT_EQ("temp\n\r\t temp", string_trim(" temp\n\r\t temp ")); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/synchronization/rw_lock_posix.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/synchronization/rw_lock_posix.cc >index 7e37dc924fdf81cb70916e31aa7507b62ce62003..15ef3d706e8aecadddbccfed6118eea30bdf4c0b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/synchronization/rw_lock_posix.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/synchronization/rw_lock_posix.cc >@@ -10,6 +10,8 @@ > > #include "rtc_base/synchronization/rw_lock_posix.h" > >+#include <stddef.h> >+ > namespace webrtc { > > RWLockPosix::RWLockPosix() : lock_() {} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/system/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/system/BUILD.gn >index f22efe14a0488e33b178d6cb6eb3522c43f6accc..e5aa32b4ef1d8c522a8243a16b44043be7bc215b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/system/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/system/BUILD.gn >@@ -65,3 +65,15 @@ rtc_source_set("rtc_export") { > "rtc_export.h", > ] > } >+ >+if (is_mac || is_ios) { >+ rtc_source_set("cocoa_threading") { >+ sources = [ >+ "cocoa_threading.h", >+ "cocoa_threading.mm", >+ ] >+ deps = [ >+ "..:checks", >+ ] >+ } >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/system/cocoa_threading.h 
b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/system/cocoa_threading.h >new file mode 100644 >index 0000000000000000000000000000000000000000..518cb71786baf34ac0e3d2830da8518398c2f729 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/system/cocoa_threading.h >@@ -0,0 +1,24 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef RTC_BASE_SYSTEM_COCOA_THREADING_H_ >+#define RTC_BASE_SYSTEM_COCOA_THREADING_H_ >+ >+// If Cocoa is to be used on more than one thread, it must know that the >+// application is multithreaded. Since it's possible to enter Cocoa code >+// from threads created by pthread_thread_create, Cocoa won't necessarily >+// be aware that the application is multithreaded. Spawning an NSThread is >+// enough to get Cocoa to set up for multithreaded operation, so this is done >+// if necessary before pthread_thread_create spawns any threads. >+// >+// http://developer.apple.com/documentation/Cocoa/Conceptual/Multithreading/CreatingThreads/chapter_4_section_4.html >+void InitCocoaMultiThreading(); >+ >+#endif // RTC_BASE_SYSTEM_COCOA_THREADING_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/system/cocoa_threading.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/system/cocoa_threading.mm >new file mode 100644 >index 0000000000000000000000000000000000000000..c09862e7e5f750db05ab884e05be24b015cd3dbc >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/system/cocoa_threading.mm >@@ -0,0 +1,24 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. 
>+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+#include "rtc_base/system/cocoa_threading.h" >+ >+#import <Foundation/Foundation.h> >+ >+#include "rtc_base/checks.h" >+ >+void InitCocoaMultiThreading() { >+ static BOOL is_cocoa_multithreaded = [NSThread isMultiThreaded]; >+ if (!is_cocoa_multithreaded) { >+ // +[NSObject class] is idempotent. >+ [NSThread detachNewThreadSelector:@selector(class) toTarget:[NSObject class] withObject:nil]; >+ is_cocoa_multithreaded = YES; >+ RTC_DCHECK([NSThread isMultiThreaded]); >+ } >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/system/file_wrapper.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/system/file_wrapper.cc >index 72f5f259dbdf689a03584bbcd7515b11b9ab5038..c033a792a50c3e9238896255ab6c657e3880943d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/system/file_wrapper.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/system/file_wrapper.cc >@@ -13,14 +13,11 @@ > #ifdef _WIN32 > #include <Windows.h> > #else >-#include <stdarg.h> > #include <string.h> > #endif > > #include <utility> > >-#include "rtc_base/checks.h" >- > namespace webrtc { > namespace { > FILE* FileOpen(const char* file_name_utf8, bool read_only) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/system/file_wrapper.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/system/file_wrapper.h >index 5411b04f6992d75d3eab9212690949b036cdc000..0bb86a397f9060bfe433d9e356f4f989f6555f00 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/system/file_wrapper.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/system/file_wrapper.h >@@ -14,7 +14,6 @@ > #include <stddef.h> > #include 
<stdio.h> > >-#include "common_types.h" // NOLINT(build/include) > #include "rtc_base/criticalsection.h" > > // Implementation that can read (exclusive) or write from/to a file. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/system/rtc_export.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/system/rtc_export.h >index b7a0fe416826082197b98462798f5caabdbddc58..d1eb60ad783a5f1c8a7a80742ebfded31ab735ca 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/system/rtc_export.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/system/rtc_export.h >@@ -16,7 +16,7 @@ > // When WebRTC is built as a static library the RTC_EXPORT macro expands to > // nothing. > >-#ifdef COMPONENT_BUILD >+#ifdef WEBRTC_ENABLE_SYMBOL_EXPORT > > #ifdef WEBRTC_WIN > >@@ -34,7 +34,7 @@ > > #endif // WEBRTC_WIN > >-#endif // COMPONENT_BUILD >+#endif // WEBRTC_ENABLE_SYMBOL_EXPORT > > #ifndef RTC_EXPORT > #define RTC_EXPORT >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/task_queue.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/task_queue.h >index b8b307ca6a4058354e4b1b888698848b5394e793..888e203967cd365ad082b33e809dc1a749c16d8b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/task_queue.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/task_queue.h >@@ -18,6 +18,7 @@ > #include "absl/memory/memory.h" > #include "rtc_base/constructormagic.h" > #include "rtc_base/scoped_ref_ptr.h" >+#include "rtc_base/system/rtc_export.h" > #include "rtc_base/thread_annotations.h" > > namespace rtc { >@@ -150,7 +151,7 @@ static std::unique_ptr<QueuedTask> NewClosure(Closure&& closure, > // TaskQueue itself has been deleted or it may happen synchronously while the > // TaskQueue instance is being deleted. This may vary from one OS to the next > // so assumptions about lifetimes of pending tasks should not be made. 
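The rtc_export.h hunk above swaps the guard macro from COMPONENT_BUILD to WEBRTC_ENABLE_SYMBOL_EXPORT but keeps the usual visibility-macro pattern. A self-contained sketch of that pattern, with `MY_EXPORT` / `MY_ENABLE_SYMBOL_EXPORT` as stand-in names (the real header uses `RTC_EXPORT`):

```cpp
// Sketch of the export-macro pattern used by rtc_export.h: the macro only
// expands to a visibility attribute when the guard is defined (i.e. when
// building a shared component); in a static build it expands to nothing.
#if defined(MY_ENABLE_SYMBOL_EXPORT)
#if defined(_WIN32)
#define MY_EXPORT __declspec(dllexport)
#else
#define MY_EXPORT __attribute__((visibility("default")))
#endif
#endif

#ifndef MY_EXPORT
#define MY_EXPORT  // static build: no annotation
#endif

MY_EXPORT int Answer() { return 42; }
```

This is why the patch can annotate `class RTC_LOCKABLE RTC_EXPORT TaskQueue` without affecting WebKit's static-library build of libwebrtc.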
>-class RTC_LOCKABLE TaskQueue { >+class RTC_LOCKABLE RTC_EXPORT TaskQueue { > public: > // TaskQueue priority levels. On some platforms these will map to thread > // priorities, on others such as Mac and iOS, GCD queue priorities. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/task_queue_for_test.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/task_queue_for_test.h >index 70c58fb516c1aa6ad43edfeaa93bd3cab2daca1a..9162e81b88ace9582b08d8f20d393cc1e0a24ca8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/task_queue_for_test.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/task_queue_for_test.h >@@ -33,7 +33,7 @@ class RTC_LOCKABLE TaskQueueForTest : public TaskQueue { > template <class Closure> > void SendTask(Closure* task) { > RTC_DCHECK(!IsCurrent()); >- rtc::Event event(false, false); >+ rtc::Event event; > PostTask(rtc::NewClosure( > [&task]() { > RTC_CHECK_EQ(false, static_cast<QueuedTask*>(task)->Run()); >@@ -47,7 +47,7 @@ class RTC_LOCKABLE TaskQueueForTest : public TaskQueue { > template <class Closure> > void SendTask(Closure&& task) { > RTC_DCHECK(!IsCurrent()); >- rtc::Event event(false, false); >+ rtc::Event event; > PostTask(rtc::NewClosure(std::move(task), [&event]() { event.Set(); })); > event.Wait(rtc::Event::kForever); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/task_queue_libevent.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/task_queue_libevent.cc >index cbed3334d77fd7aa8657ab895a8f13ceaeb56584..905bbdac0e2e5a1bedd1e64affeef9d0831d8fd2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/task_queue_libevent.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/task_queue_libevent.cc >@@ -10,30 +10,31 @@ > > #include "rtc_base/task_queue.h" > >+#include <errno.h> > #include <fcntl.h> >+#include <pthread.h> > #include <signal.h> >-#include <string.h> >+#include <stdint.h> >+#include <time.h> > #include <unistd.h> > #include <list> >+#include 
<memory> >+#include <type_traits> >+#include <utility> > >-#if defined(WEBRTC_LINUX) >-#include <event2/event.h> >-#include <event2/event_compat.h> >-#include <event2/event_struct.h> >-#else > #include "base/third_party/libevent/event.h" >-#endif >- > #include "rtc_base/checks.h" > #include "rtc_base/criticalsection.h" > #include "rtc_base/logging.h" > #include "rtc_base/numerics/safe_conversions.h" > #include "rtc_base/platform_thread.h" >+#include "rtc_base/platform_thread_types.h" > #include "rtc_base/refcount.h" > #include "rtc_base/refcountedobject.h" >+#include "rtc_base/scoped_ref_ptr.h" > #include "rtc_base/system/unused.h" >-#include "rtc_base/task_queue.h" > #include "rtc_base/task_queue_posix.h" >+#include "rtc_base/thread_annotations.h" > #include "rtc_base/timeutils.h" > > namespace rtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/task_queue_stdlib.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/task_queue_stdlib.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..0fb0ed28b951ebc910e103878b1893ac7b8088a6 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/task_queue_stdlib.cc >@@ -0,0 +1,399 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "rtc_base/task_queue.h" >+ >+#include <string.h> >+#include <algorithm> >+#include <atomic> >+#include <condition_variable> >+#include <map> >+#include <queue> >+#include <utility> >+ >+#include "rtc_base/checks.h" >+#include "rtc_base/criticalsection.h" >+#include "rtc_base/event.h" >+#include "rtc_base/logging.h" >+#include "rtc_base/platform_thread.h" >+#include "rtc_base/refcount.h" >+#include "rtc_base/refcountedobject.h" >+#include "rtc_base/thread_annotations.h" >+#include "rtc_base/timeutils.h" >+ >+namespace rtc { >+namespace { >+ >+using Priority = TaskQueue::Priority; >+ >+ThreadPriority TaskQueuePriorityToThreadPriority(Priority priority) { >+ switch (priority) { >+ case Priority::HIGH: >+ return kRealtimePriority; >+ case Priority::LOW: >+ return kLowPriority; >+ case Priority::NORMAL: >+ return kNormalPriority; >+ default: >+ RTC_NOTREACHED(); >+ return kNormalPriority; >+ } >+ return kNormalPriority; >+} >+ >+} // namespace >+ >+class TaskQueue::Impl : public RefCountInterface { >+ public: >+ Impl(const char* queue_name, TaskQueue* queue, Priority priority); >+ ~Impl() override; >+ >+ static TaskQueue::Impl* Current(); >+ static TaskQueue* CurrentQueue(); >+ >+ // Used for DCHECKing the current queue. 
>+ bool IsCurrent() const; >+ >+ template <class Closure, >+ typename std::enable_if<!std::is_convertible< >+ Closure, >+ std::unique_ptr<QueuedTask>>::value>::type* = nullptr> >+ void PostTask(Closure&& closure) { >+ PostTask(NewClosure(std::forward<Closure>(closure))); >+ } >+ >+ void PostTask(std::unique_ptr<QueuedTask> task); >+ void PostTaskAndReply(std::unique_ptr<QueuedTask> task, >+ std::unique_ptr<QueuedTask> reply, >+ TaskQueue::Impl* reply_queue); >+ >+ void PostDelayedTask(std::unique_ptr<QueuedTask> task, uint32_t milliseconds); >+ >+ class WorkerThread : public PlatformThread { >+ public: >+ WorkerThread(ThreadRunFunction func, >+ void* obj, >+ const char* thread_name, >+ ThreadPriority priority) >+ : PlatformThread(func, obj, thread_name, priority) {} >+ }; >+ >+ using OrderId = uint64_t; >+ >+ struct DelayedEntryTimeout { >+ int64_t next_fire_at_ms_{}; >+ OrderId order_{}; >+ >+ bool operator<(const DelayedEntryTimeout& o) const { >+ return std::tie(next_fire_at_ms_, order_) < >+ std::tie(o.next_fire_at_ms_, o.order_); >+ } >+ }; >+ >+ struct NextTask { >+ bool final_task_{false}; >+ std::unique_ptr<QueuedTask> run_task_; >+ int64_t sleep_time_ms_{}; >+ }; >+ >+ protected: >+ NextTask GetNextTask(); >+ >+ private: >+ // TaskQueue::Current() must return the task queue that is active on >+ // the current thread. This thread_local variable holds that value: it >+ // is set to this Impl on the initialized worker thread that holds the >+ // queue. >+ static thread_local TaskQueue::Impl* thread_context_; >+ >+ static void ThreadMain(void* context); >+ >+ void ProcessTasks(); >+ >+ void NotifyWake(); >+ >+ // The back pointer from this implementation detail >+ // to the owning task queue object. >+ TaskQueue* const queue_; >+ >+ // Indicates if the thread has started. >+ Event started_; >+ >+ // Indicates if the thread has stopped. >+ Event stopped_; >+ >+ // Signaled whenever a new task is pending. 
>+ Event flag_notify_; >+ >+ // Contains the active worker thread assigned to processing >+ // tasks (including delayed tasks). >+ WorkerThread thread_; >+ >+ rtc::CriticalSection pending_lock_; >+ >+ // Indicates if the worker thread needs to shut down now. >+ bool thread_should_quit_ RTC_GUARDED_BY(pending_lock_){false}; >+ >+ // Holds the next order to use for the next task to be >+ // put into one of the pending queues. >+ OrderId thread_posting_order_ RTC_GUARDED_BY(pending_lock_){}; >+ >+ // The list of all pending tasks that need to be processed in the >+ // FIFO queue ordering on the worker thread. >+ std::queue<std::pair<OrderId, std::unique_ptr<QueuedTask>>> pending_queue_ >+ RTC_GUARDED_BY(pending_lock_); >+ >+ // The list of all pending tasks that need to be processed at a future >+ // time based upon a delay. On the off chance that a delayed task should >+ // fire at exactly the same time as another task, the tasks are processed >+ // in FIFO ordering. std::priority_queue was >+ // considered but rejected due to its inability to extract the >+ // std::unique_ptr out of the queue without the presence of a hack. 
>+ std::map<DelayedEntryTimeout, std::unique_ptr<QueuedTask>> delayed_queue_ >+ RTC_GUARDED_BY(pending_lock_); >+}; >+ >+// static >+thread_local TaskQueue::Impl* TaskQueue::Impl::thread_context_ = nullptr; >+ >+TaskQueue::Impl::Impl(const char* queue_name, >+ TaskQueue* queue, >+ Priority priority) >+ : queue_(queue), >+ started_(/*manual_reset=*/false, /*initially_signaled=*/false), >+ stopped_(/*manual_reset=*/false, /*initially_signaled=*/false), >+ flag_notify_(/*manual_reset=*/false, /*initially_signaled=*/false), >+ thread_(&TaskQueue::Impl::ThreadMain, >+ this, >+ queue_name, >+ TaskQueuePriorityToThreadPriority(priority)) { >+ RTC_DCHECK(queue_name); >+ thread_.Start(); >+ started_.Wait(Event::kForever); >+} >+ >+TaskQueue::Impl::~Impl() { >+ RTC_DCHECK(!IsCurrent()); >+ >+ { >+ CritScope lock(&pending_lock_); >+ thread_should_quit_ = true; >+ } >+ >+ NotifyWake(); >+ >+ stopped_.Wait(Event::kForever); >+ thread_.Stop(); >+} >+ >+// static >+TaskQueue::Impl* TaskQueue::Impl::Current() { >+ return thread_context_; >+} >+ >+// static >+TaskQueue* TaskQueue::Impl::CurrentQueue() { >+ TaskQueue::Impl* current = Current(); >+ return current ? 
current->queue_ : nullptr; >+} >+ >+bool TaskQueue::Impl::IsCurrent() const { >+ return IsThreadRefEqual(thread_.GetThreadRef(), CurrentThreadRef()); >+} >+ >+void TaskQueue::Impl::PostTask(std::unique_ptr<QueuedTask> task) { >+ { >+ CritScope lock(&pending_lock_); >+ OrderId order = thread_posting_order_++; >+ >+ pending_queue_.push(std::pair<OrderId, std::unique_ptr<QueuedTask>>( >+ order, std::move(task))); >+ } >+ >+ NotifyWake(); >+} >+ >+void TaskQueue::Impl::PostDelayedTask(std::unique_ptr<QueuedTask> task, >+ uint32_t milliseconds) { >+ auto fire_at = rtc::TimeMillis() + milliseconds; >+ >+ DelayedEntryTimeout delay; >+ delay.next_fire_at_ms_ = fire_at; >+ >+ { >+ CritScope lock(&pending_lock_); >+ delay.order_ = ++thread_posting_order_; >+ delayed_queue_[delay] = std::move(task); >+ } >+ >+ NotifyWake(); >+} >+ >+void TaskQueue::Impl::PostTaskAndReply(std::unique_ptr<QueuedTask> task, >+ std::unique_ptr<QueuedTask> reply, >+ TaskQueue::Impl* reply_queue) { >+ QueuedTask* task_ptr = task.release(); >+ QueuedTask* reply_task_ptr = reply.release(); >+ PostTask([task_ptr, reply_task_ptr, reply_queue]() { >+ if (task_ptr->Run()) >+ delete task_ptr; >+ >+ reply_queue->PostTask(std::unique_ptr<QueuedTask>(reply_task_ptr)); >+ }); >+} >+ >+TaskQueue::Impl::NextTask TaskQueue::Impl::GetNextTask() { >+ NextTask result{}; >+ >+ auto tick = rtc::TimeMillis(); >+ >+ CritScope lock(&pending_lock_); >+ >+ if (thread_should_quit_) { >+ result.final_task_ = true; >+ return result; >+ } >+ >+ if (delayed_queue_.size() > 0) { >+ auto delayed_entry = delayed_queue_.begin(); >+ const auto& delay_info = delayed_entry->first; >+ auto& delay_run = delayed_entry->second; >+ if (tick >= delay_info.next_fire_at_ms_) { >+ if (pending_queue_.size() > 0) { >+ auto& entry = pending_queue_.front(); >+ auto& entry_order = entry.first; >+ auto& entry_run = entry.second; >+ if (entry_order < delay_info.order_) { >+ result.run_task_ = std::move(entry_run); >+ pending_queue_.pop(); >+ return 
result; >+ } >+ } >+ >+ result.run_task_ = std::move(delay_run); >+ delayed_queue_.erase(delayed_entry); >+ return result; >+ } >+ >+ result.sleep_time_ms_ = delay_info.next_fire_at_ms_ - tick; >+ } >+ >+ if (pending_queue_.size() > 0) { >+ auto& entry = pending_queue_.front(); >+ result.run_task_ = std::move(entry.second); >+ pending_queue_.pop(); >+ } >+ >+ return result; >+} >+ >+// static >+void TaskQueue::Impl::ThreadMain(void* context) { >+ TaskQueue::Impl* me = static_cast<TaskQueue::Impl*>(context); >+ me->ProcessTasks(); >+} >+ >+void TaskQueue::Impl::ProcessTasks() { >+ thread_context_ = this; >+ started_.Set(); >+ >+ while (true) { >+ auto task = GetNextTask(); >+ >+ if (task.final_task_) >+ break; >+ >+ if (task.run_task_) { >+ // process entry immediately then try again >+ QueuedTask* release_ptr = task.run_task_.release(); >+ if (release_ptr->Run()) >+ delete release_ptr; >+ >+ // attempt to sleep again >+ continue; >+ } >+ >+ if (0 == task.sleep_time_ms_) >+ flag_notify_.Wait(Event::kForever); >+ else >+ flag_notify_.Wait(task.sleep_time_ms_); >+ } >+ >+ stopped_.Set(); >+} >+ >+void TaskQueue::Impl::NotifyWake() { >+ // The queue holds pending tasks to complete. Either tasks are to be >+ // executed immediately or tasks are to be run at some future delayed time. >+ // For immediate tasks the task queue's thread is busy running the task and >+ // the thread will not be waiting on the flag_notify_ event. If no immediate >+ // tasks are available but a delayed task is pending then the thread will be >+ // waiting on flag_notify_ with a delayed time-out of the nearest timed task >+ // to run. If no immediate or pending tasks are available, the thread will >+ // wait on flag_notify_ until signaled that a task has been added (or the >+ // thread is told to shut down). >+ >+ // In all cases, when a new immediate task, delayed task, or request to >+ // shut down the thread is added, the flag_notify_ is signaled after. 
If the >+ // thread was waiting then the thread will wake up immediately and re-assess >+ // what task needs to be run next (i.e. run a task now, wait for the nearest >+ // timed delayed task, or shut down the thread). If the thread was not waiting >+ // then the thread will remain signaled to wake up the next time any >+ // attempt to wait on the flag_notify_ event occurs. >+ >+ // Any immediate or delayed pending task (or request to shut down the thread) >+ // must always be added to the queue prior to signaling flag_notify_ to wake >+ // up the possibly sleeping thread. This prevents a race condition where the >+ // thread is notified to wake up but the task queue's thread finds nothing to >+ // do so it waits once again to be signaled where such a signal may never >+ // happen. >+ flag_notify_.Set(); >+} >+ >+// Boilerplate for the PIMPL pattern. >+TaskQueue::TaskQueue(const char* queue_name, Priority priority) >+ : impl_(new RefCountedObject<TaskQueue::Impl>(queue_name, this, priority)) { >+} >+ >+TaskQueue::~TaskQueue() {} >+ >+// static >+TaskQueue* TaskQueue::Current() { >+ return TaskQueue::Impl::CurrentQueue(); >+} >+ >+// Used for DCHECKing the current queue. 
>+bool TaskQueue::IsCurrent() const { >+ return impl_->IsCurrent(); >+} >+ >+void TaskQueue::PostTask(std::unique_ptr<QueuedTask> task) { >+ return TaskQueue::impl_->PostTask(std::move(task)); >+} >+ >+void TaskQueue::PostTaskAndReply(std::unique_ptr<QueuedTask> task, >+ std::unique_ptr<QueuedTask> reply, >+ TaskQueue* reply_queue) { >+ return TaskQueue::impl_->PostTaskAndReply(std::move(task), std::move(reply), >+ reply_queue->impl_.get()); >+} >+ >+void TaskQueue::PostTaskAndReply(std::unique_ptr<QueuedTask> task, >+ std::unique_ptr<QueuedTask> reply) { >+ return TaskQueue::impl_->PostTaskAndReply(std::move(task), std::move(reply), >+ impl_.get()); >+} >+ >+void TaskQueue::PostDelayedTask(std::unique_ptr<QueuedTask> task, >+ uint32_t milliseconds) { >+ return TaskQueue::impl_->PostDelayedTask(std::move(task), milliseconds); >+} >+ >+} // namespace rtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/task_queue_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/task_queue_unittest.cc >index cedf68e2e851359c505e4c34b14b9cd89bdaee63..0af39a5e715fb1578c359be6c28cc41ad3dd8247 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/task_queue_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/task_queue_unittest.cc >@@ -64,7 +64,7 @@ TEST(TaskQueueTest, Construct) { > > TEST(TaskQueueTest, PostAndCheckCurrent) { > static const char kQueueName[] = "PostAndCheckCurrent"; >- Event event(false, false); >+ Event event; > TaskQueue queue(kQueueName); > > // We're not running a task, so there shouldn't be a current queue. 
>@@ -106,7 +106,7 @@ TEST(TaskQueueTest, PostLambda) { > > TEST(TaskQueueTest, PostDelayedZero) { > static const char kQueueName[] = "PostDelayedZero"; >- Event event(false, false); >+ Event event; > TaskQueue queue(kQueueName); > > queue.PostDelayedTask([&event]() { event.Set(); }, 0); >@@ -115,7 +115,7 @@ TEST(TaskQueueTest, PostDelayedZero) { > > TEST(TaskQueueTest, PostFromQueue) { > static const char kQueueName[] = "PostFromQueue"; >- Event event(false, false); >+ Event event; > TaskQueue queue(kQueueName); > > queue.PostTask( >@@ -125,7 +125,7 @@ TEST(TaskQueueTest, PostFromQueue) { > > TEST(TaskQueueTest, PostDelayed) { > static const char kQueueName[] = "PostDelayed"; >- Event event(false, false); >+ Event event; > TaskQueue queue(kQueueName, TaskQueue::Priority::HIGH); > > uint32_t start = Time(); >@@ -145,7 +145,7 @@ TEST(TaskQueueTest, DISABLED_PostDelayedHighRes) { > EnableHighResTimers high_res_scope; > > static const char kQueueName[] = "PostDelayedHighRes"; >- Event event(false, false); >+ Event event; > TaskQueue queue(kQueueName, TaskQueue::Priority::HIGH); > > uint32_t start = Time(); >@@ -165,7 +165,7 @@ TEST(TaskQueueTest, PostMultipleDelayed) { > > std::vector<std::unique_ptr<Event>> events; > for (int i = 0; i < 100; ++i) { >- events.push_back(std::unique_ptr<Event>(new Event(false, false))); >+ events.push_back(absl::make_unique<Event>()); > queue.PostDelayedTask(Bind(&CheckCurrent, events.back().get(), &queue), i); > } > >@@ -175,8 +175,8 @@ TEST(TaskQueueTest, PostMultipleDelayed) { > > TEST(TaskQueueTest, PostDelayedAfterDestruct) { > static const char kQueueName[] = "PostDelayedAfterDestruct"; >- Event run(false, false); >- Event deleted(false, false); >+ Event run; >+ Event deleted; > { > TaskQueue queue(kQueueName); > queue.PostDelayedTask( >@@ -191,7 +191,7 @@ TEST(TaskQueueTest, PostDelayedAfterDestruct) { > TEST(TaskQueueTest, PostAndReply) { > static const char kPostQueue[] = "PostQueue"; > static const char kReplyQueue[] = 
"ReplyQueue"; >- Event event(false, false); >+ Event event; > TaskQueue post_queue(kPostQueue); > TaskQueue reply_queue(kReplyQueue); > >@@ -204,7 +204,7 @@ TEST(TaskQueueTest, PostAndReply) { > TEST(TaskQueueTest, PostAndReuse) { > static const char kPostQueue[] = "PostQueue"; > static const char kReplyQueue[] = "ReplyQueue"; >- Event event(false, false); >+ Event event; > TaskQueue post_queue(kPostQueue); > TaskQueue reply_queue(kReplyQueue); > >@@ -251,7 +251,7 @@ TEST(TaskQueueTest, PostAndReuse) { > TEST(TaskQueueTest, PostAndReplyLambda) { > static const char kPostQueue[] = "PostQueue"; > static const char kReplyQueue[] = "ReplyQueue"; >- Event event(false, false); >+ Event event; > TaskQueue post_queue(kPostQueue); > TaskQueue reply_queue(kReplyQueue); > >@@ -287,7 +287,7 @@ TEST(TaskQueueTest, PostCopyableClosure) { > > int num_copies = 0; > int num_moves = 0; >- Event event(false, false); >+ Event event; > > static const char kPostQueue[] = "PostCopyableClosure"; > TaskQueue post_queue(kPostQueue); >@@ -323,7 +323,7 @@ TEST(TaskQueueTest, PostMoveOnlyClosure) { > }; > > int num_moves = 0; >- Event event(false, false); >+ Event event; > std::unique_ptr<SomeState> state(new SomeState(&event)); > > static const char kPostQueue[] = "PostMoveOnlyClosure"; >@@ -346,8 +346,8 @@ TEST(TaskQueueTest, PostMoveOnlyCleanup) { > std::unique_ptr<SomeState> state; > }; > >- Event event_run(false, false); >- Event event_cleanup(false, false); >+ Event event_run; >+ Event event_cleanup; > std::unique_ptr<SomeState> state_run(new SomeState(&event_run)); > std::unique_ptr<SomeState> state_cleanup(new SomeState(&event_cleanup)); > >@@ -367,7 +367,7 @@ TEST(TaskQueueTest, PostMoveOnlyCleanup) { > // written in a way that makes it likely and by running with --gtest_repeat=1000 > // the bug would occur. Alas, now it should be fixed. 
> TEST(TaskQueueTest, PostAndReplyDeadlock) { >- Event event(false, false); >+ Event event; > TaskQueue post_queue("PostQueue"); > TaskQueue reply_queue("ReplyQueue"); > >@@ -384,8 +384,8 @@ TEST(TaskQueueTest, PostAndReplyDeadlock) { > #define MAYBE_DeleteTaskQueueAfterPostAndReply DeleteTaskQueueAfterPostAndReply > #endif > TEST(TaskQueueTest, MAYBE_DeleteTaskQueueAfterPostAndReply) { >- Event task_deleted(false, false); >- Event reply_deleted(false, false); >+ Event task_deleted; >+ Event reply_deleted; > auto* task_queue = new TaskQueue("Queue"); > > task_queue->PostTaskAndReply( >@@ -413,7 +413,7 @@ void TestPostTaskAndReply(TaskQueue* work_queue, Event* event) { > TEST(TaskQueueTest, PostAndReply2) { > static const char kQueueName[] = "PostAndReply2"; > static const char kWorkQueueName[] = "PostAndReply2_Worker"; >- Event event(false, false); >+ Event event; > TaskQueue queue(kQueueName); > TaskQueue work_queue(kWorkQueueName); > >@@ -425,7 +425,7 @@ TEST(TaskQueueTest, PostAndReply2) { > // In situations like that, tasks will get dropped. > TEST(TaskQueueTest, PostALot) { > // To destruct the event after the queue has gone out of scope. 
>- Event event(false, false); >+ Event event; > > int tasks_executed = 0; > int tasks_cleaned_up = 0; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/testclient.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/testclient.cc >index 7c151c75101e7b0587faac802d56031da4d4254f..a5b90ddd2017dd6f280bfac7d7a5a62a796eaa7c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/testclient.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/testclient.cc >@@ -97,7 +97,7 @@ bool TestClient::CheckNextPacket(const char* buf, > std::unique_ptr<Packet> packet = NextPacket(kTimeoutMs); > if (packet) { > res = (packet->size == size && memcmp(packet->buf, buf, size) == 0 && >- CheckTimestamp(packet->packet_time.timestamp)); >+ CheckTimestamp(packet->packet_time_us)); > if (addr) > *addr = packet->addr; > } >@@ -144,10 +144,10 @@ void TestClient::OnPacket(AsyncPacketSocket* socket, > const char* buf, > size_t size, > const SocketAddress& remote_addr, >- const PacketTime& packet_time) { >+ const int64_t& packet_time_us) { > CritScope cs(&crit_); > packets_.push_back( >- absl::make_unique<Packet>(remote_addr, buf, size, packet_time)); >+ absl::make_unique<Packet>(remote_addr, buf, size, packet_time_us)); > } > > void TestClient::OnReadyToSend(AsyncPacketSocket* socket) { >@@ -157,14 +157,14 @@ void TestClient::OnReadyToSend(AsyncPacketSocket* socket) { > TestClient::Packet::Packet(const SocketAddress& a, > const char* b, > size_t s, >- const PacketTime& packet_time) >- : addr(a), buf(0), size(s), packet_time(packet_time) { >+ int64_t packet_time_us) >+ : addr(a), buf(0), size(s), packet_time_us(packet_time_us) { > buf = new char[size]; > memcpy(buf, b, size); > } > > TestClient::Packet::Packet(const Packet& p) >- : addr(p.addr), buf(0), size(p.size), packet_time(p.packet_time) { >+ : addr(p.addr), buf(0), size(p.size), packet_time_us(p.packet_time_us) { > buf = new char[size]; > memcpy(buf, p.buf, size); > } >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/testclient.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/testclient.h >index a0d98b30d1654bc60d2a3d9741c89da141db52ad..16fb6ba677c2226df898b09eb3db9b4a7c952689 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/testclient.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/testclient.h >@@ -29,14 +29,14 @@ class TestClient : public sigslot::has_slots<> { > Packet(const SocketAddress& a, > const char* b, > size_t s, >- const PacketTime& packet_time); >+ int64_t packet_time_us); > Packet(const Packet& p); > virtual ~Packet(); > > SocketAddress addr; > char* buf; > size_t size; >- PacketTime packet_time; >+ int64_t packet_time_us; > }; > > // Default timeout for NextPacket reads. >@@ -97,7 +97,7 @@ class TestClient : public sigslot::has_slots<> { > const char* buf, > size_t len, > const SocketAddress& remote_addr, >- const PacketTime& packet_time); >+ const int64_t& packet_time_us); > void OnReadyToSend(AsyncPacketSocket* socket); > bool CheckTimestamp(int64_t packet_timestamp); > void AdvanceTime(int ms); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/testechoserver.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/testechoserver.h >index 5e714eb63dd30e0e69675b1e0df9fe479f2aff4a..9ffd78622a45f7202c89752b1e3292724da0b8ec 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/testechoserver.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/testechoserver.h >@@ -44,7 +44,7 @@ class TestEchoServer : public sigslot::has_slots<> { > const char* buf, > size_t size, > const SocketAddress& remote_addr, >- const PacketTime& packet_time) { >+ const int64_t& /* packet_time_us */) { > rtc::PacketOptions options; > socket->Send(buf, size, options); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/testutils.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/testutils.cc >index 
b4a7433fbd7f183050b4626f558d7b32054c877f..f3292fd59f7cee0b20e7f81992ec21e3257e2400 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/testutils.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/testutils.cc >@@ -14,75 +14,7 @@ namespace webrtc { > namespace testing { > > StreamSink::StreamSink() = default; >- > StreamSink::~StreamSink() = default; > >-StreamSource::StreamSource() { >- Clear(); >-} >- >-StreamSource::~StreamSource() = default; >- >-StreamState StreamSource::GetState() const { >- return state_; >-} >- >-StreamResult StreamSource::Read(void* buffer, >- size_t buffer_len, >- size_t* read, >- int* error) { >- if (SS_CLOSED == state_) { >- if (error) >- *error = -1; >- return SR_ERROR; >- } >- if ((SS_OPENING == state_) || (readable_data_.size() <= read_block_)) { >- return SR_BLOCK; >- } >- size_t count = std::min(buffer_len, readable_data_.size() - read_block_); >- memcpy(buffer, &readable_data_[0], count); >- size_t new_size = readable_data_.size() - count; >- // Avoid undefined access beyond the last element of the vector. >- // This only happens when new_size is 0. 
>- if (count < readable_data_.size()) { >- memmove(&readable_data_[0], &readable_data_[count], new_size); >- } >- readable_data_.resize(new_size); >- if (read) >- *read = count; >- return SR_SUCCESS; >-} >- >-StreamResult StreamSource::Write(const void* data, >- size_t data_len, >- size_t* written, >- int* error) { >- if (SS_CLOSED == state_) { >- if (error) >- *error = -1; >- return SR_ERROR; >- } >- if (SS_OPENING == state_) { >- return SR_BLOCK; >- } >- if (SIZE_UNKNOWN != write_block_) { >- if (written_data_.size() >= write_block_) { >- return SR_BLOCK; >- } >- if (data_len > (write_block_ - written_data_.size())) { >- data_len = write_block_ - written_data_.size(); >- } >- } >- if (written) >- *written = data_len; >- const char* cdata = static_cast<const char*>(data); >- written_data_.insert(written_data_.end(), cdata, cdata + data_len); >- return SR_SUCCESS; >-} >- >-void StreamSource::Close() { >- state_ = SS_CLOSED; >-} >- > } // namespace testing > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/testutils.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/testutils.h >index 2ab4a35956a059eacb743915b5ae28a7254cb5b8..ac74203cf9f93cf7baa2a02e2d69ca6982dd0b61 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/testutils.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/testutils.h >@@ -19,9 +19,7 @@ > #include <vector> > #include "rtc_base/asyncsocket.h" > #include "rtc_base/checks.h" >-#include "rtc_base/gunit.h" > #include "rtc_base/stream.h" >-#include "rtc_base/stringutils.h" > > namespace webrtc { > namespace testing { >@@ -137,91 +135,6 @@ class StreamSink : public sigslot::has_slots<> { > EventMap events_; > }; > >-/////////////////////////////////////////////////////////////////////////////// >-// StreamSource - Implements stream interface and simulates asynchronous >-// events on the stream, without a network. Also buffers written data. 
>-/////////////////////////////////////////////////////////////////////////////// >- >-class StreamSource : public StreamInterface { >- public: >- StreamSource(); >- ~StreamSource() override; >- >- void Clear() { >- readable_data_.clear(); >- written_data_.clear(); >- state_ = SS_CLOSED; >- read_block_ = 0; >- write_block_ = SIZE_UNKNOWN; >- } >- void QueueString(const char* data) { QueueData(data, strlen(data)); } >-#if defined(__GNUC__) >- // Note: Implicit |this| argument counts as the first argument. >- __attribute__((__format__(__printf__, 2, 3))) >-#endif >- void >- QueueStringF(const char* format, ...) { >- va_list args; >- va_start(args, format); >- char buffer[1024]; >- size_t len = vsprintfn(buffer, sizeof(buffer), format, args); >- RTC_CHECK(len < sizeof(buffer) - 1); >- va_end(args); >- QueueData(buffer, len); >- } >- void QueueData(const char* data, size_t len) { >- readable_data_.insert(readable_data_.end(), data, data + len); >- if ((SS_OPEN == state_) && (readable_data_.size() == len)) { >- SignalEvent(this, SE_READ, 0); >- } >- } >- std::string ReadData() { >- std::string data; >- // avoid accessing written_data_[0] if it is undefined >- if (written_data_.size() > 0) { >- data.insert(0, &written_data_[0], written_data_.size()); >- } >- written_data_.clear(); >- return data; >- } >- void SetState(StreamState state) { >- int events = 0; >- if ((SS_OPENING == state_) && (SS_OPEN == state)) { >- events |= SE_OPEN; >- if (!readable_data_.empty()) { >- events |= SE_READ; >- } >- } else if ((SS_CLOSED != state_) && (SS_CLOSED == state)) { >- events |= SE_CLOSE; >- } >- state_ = state; >- if (events) { >- SignalEvent(this, events, 0); >- } >- } >- // Will cause Read to block when there are pos bytes in the read queue. >- void SetReadBlock(size_t pos) { read_block_ = pos; } >- // Will cause Write to block when there are pos bytes in the write queue. 
>- void SetWriteBlock(size_t pos) { write_block_ = pos; } >- >- StreamState GetState() const override; >- StreamResult Read(void* buffer, >- size_t buffer_len, >- size_t* read, >- int* error) override; >- StreamResult Write(const void* data, >- size_t data_len, >- size_t* written, >- int* error) override; >- void Close() override; >- >- private: >- typedef std::vector<char> Buffer; >- Buffer readable_data_, written_data_; >- StreamState state_; >- size_t read_block_, write_block_; >-}; >- > } // namespace testing > } // namespace webrtc > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/third_party/sigslot/sigslot.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/third_party/sigslot/sigslot.h >index c77e4e675df504a084223932227eac8b54fbb435..8bd1c7064a3841e3aabb7cc5fd71ed4d7806548e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/third_party/sigslot/sigslot.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/third_party/sigslot/sigslot.h >@@ -96,7 +96,6 @@ > #ifndef RTC_BASE_THIRD_PARTY_SIGSLOT_SIGSLOT_H_ > #define RTC_BASE_THIRD_PARTY_SIGSLOT_SIGSLOT_H_ > >-#include <stdlib.h> > #include <cstring> > #include <list> > #include <set> >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/thread.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/thread.cc >index 2d5704e0be780a4748ee254f723423b42f1669f7..4dd5fd2224f12a6aaeb52ae81cfb845b2bdb4885 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/thread.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/thread.cc >@@ -24,14 +24,43 @@ > #pragma warning(disable : 4722) > #endif > >+#include <stdio.h> >+#include <utility> >+ > #include "rtc_base/checks.h" >+#include "rtc_base/criticalsection.h" > #include "rtc_base/logging.h" > #include "rtc_base/nullsocketserver.h" >-#include "rtc_base/platform_thread.h" >-#include "rtc_base/stringutils.h" > #include "rtc_base/timeutils.h" > #include "rtc_base/trace_event.h" > >+#if defined(WEBRTC_MAC) 
>+#include "rtc_base/system/cocoa_threading.h"
>+
>+/*
>+ * These are forward-declarations for methods that are part of the
>+ * ObjC runtime. They are declared in the private header objc-internal.h.
>+ * These calls are what clang inserts when using @autoreleasepool in ObjC,
>+ * but here they are used directly in order to keep this file C++.
>+ * https://clang.llvm.org/docs/AutomaticReferenceCounting.html#runtime-support
>+ */
>+extern "C" {
>+void* objc_autoreleasePoolPush(void);
>+void objc_autoreleasePoolPop(void* pool);
>+}
>+
>+namespace {
>+class ScopedAutoReleasePool {
>+ public:
>+ ScopedAutoReleasePool() : pool_(objc_autoreleasePoolPush()) {}
>+ ~ScopedAutoReleasePool() { objc_autoreleasePoolPop(pool_); }
>+
>+ private:
>+ void* const pool_;
>+};
>+} // namespace
>+#endif
>+
> namespace rtc {
>
> ThreadManager* ThreadManager::Instance() {
>@@ -61,11 +90,12 @@ Thread* Thread::Current() {
> }
>
> #if defined(WEBRTC_POSIX)
>-#if !defined(WEBRTC_MAC)
> ThreadManager::ThreadManager() : main_thread_ref_(CurrentThreadRef()) {
>+#if defined(WEBRTC_MAC)
>+ InitCocoaMultiThreading();
>+#endif
> pthread_key_create(&key_, nullptr);
> }
>-#endif
>
> Thread* ThreadManager::CurrentThread() {
> return static_cast<Thread*>(pthread_getspecific(key_));
>@@ -193,8 +223,10 @@ bool Thread::SetName(const std::string& name, const void* obj) {
>
> name_ = name;
> if (obj) {
>- char buf[16];
>- sprintfn(buf, sizeof(buf), " 0x%p", obj);
>+ // The %p specifier typically produces at most 16 hex digits, possibly with a
>+ // 0x prefix. But the format is implementation-defined, so add some margin.
>+ char buf[30]; >+ snprintf(buf, sizeof(buf), " 0x%p", obj); > name_ += buf; > } > return true; >@@ -300,7 +332,6 @@ void Thread::AssertBlockingIsAllowedOnCurrentThread() { > } > > // static >-#if !defined(WEBRTC_MAC) > #if defined(WEBRTC_WIN) > DWORD WINAPI Thread::PreRun(LPVOID pv) { > #else >@@ -309,6 +340,9 @@ void* Thread::PreRun(void* pv) { > ThreadInit* init = static_cast<ThreadInit*>(pv); > ThreadManager::Instance()->SetCurrentThread(init->thread); > rtc::SetCurrentThreadName(init->thread->name_.c_str()); >+#if defined(WEBRTC_MAC) >+ ScopedAutoReleasePool pool; >+#endif > if (init->runnable) { > init->runnable->Run(init->thread); > } else { >@@ -322,7 +356,6 @@ void* Thread::PreRun(void* pv) { > return nullptr; > #endif > } >-#endif > > void Thread::Run() { > ProcessMessages(kForever); >@@ -485,9 +518,6 @@ void Thread::Clear(MessageHandler* phandler, > ClearInternal(phandler, id, removed); > } > >-#if !defined(WEBRTC_MAC) >-// Note that these methods have a separate implementation for mac and ios >-// defined in webrtc/rtc_base/thread_darwin.mm. 
> bool Thread::ProcessMessages(int cmsLoop) { > // Using ProcessMessages with a custom clock for testing and a time greater > // than 0 doesn't work, since it's not guaranteed to advance the custom >@@ -498,6 +528,9 @@ bool Thread::ProcessMessages(int cmsLoop) { > int cmsNext = cmsLoop; > > while (true) { >+#if defined(WEBRTC_MAC) >+ ScopedAutoReleasePool pool; >+#endif > Message msg; > if (!Get(&msg, cmsNext)) > return !IsQuitting(); >@@ -510,7 +543,6 @@ bool Thread::ProcessMessages(int cmsLoop) { > } > } > } >-#endif > > bool Thread::WrapCurrentWithThreadManager(ThreadManager* thread_manager, > bool need_synchronize_access) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/thread.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/thread.h >index 5a466104aa273e774adc7525cac59628e94e6236..039192ce7f674695ab2f1c758ec0c85497f2aa85 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/thread.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/thread.h >@@ -11,19 +11,22 @@ > #ifndef RTC_BASE_THREAD_H_ > #define RTC_BASE_THREAD_H_ > >-#include <algorithm> >+#include <stdint.h> > #include <list> > #include <memory> > #include <string> >-#include <utility> >-#include <vector> >+#include <type_traits> > > #if defined(WEBRTC_POSIX) > #include <pthread.h> > #endif > #include "rtc_base/constructormagic.h" >+#include "rtc_base/location.h" >+#include "rtc_base/messagehandler.h" > #include "rtc_base/messagequeue.h" > #include "rtc_base/platform_thread_types.h" >+#include "rtc_base/socketserver.h" >+#include "rtc_base/thread_annotations.h" > > #if defined(WEBRTC_WIN) > #include "rtc_base/win32.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/thread_darwin.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/thread_darwin.mm >deleted file mode 100644 >index a404849c72b4cefd56f1f5457092cff0a23d7aa8..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/thread_darwin.mm >+++ 
/dev/null >@@ -1,84 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC Project Authors. All rights reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include "rtc_base/thread.h" >- >-#import <Foundation/Foundation.h> >- >-#include "rtc_base/platform_thread.h" >- >-/* >- * This file contains platform-specific implementations for several >- * methods in rtc::Thread. >- */ >- >-namespace { >-void InitCocoaMultiThreading() { >- if ([NSThread isMultiThreaded] == NO) { >- // The sole purpose of this autorelease pool is to avoid a console >- // message on Leopard that tells us we're autoreleasing the thread >- // with no autorelease pool in place. >- @autoreleasepool { >- [NSThread detachNewThreadSelector:@selector(class) >- toTarget:[NSObject class] >- withObject:nil]; >- } >- } >- >- RTC_DCHECK([NSThread isMultiThreaded]); >-} >-} >- >-namespace rtc { >- >-ThreadManager::ThreadManager() : main_thread_ref_(CurrentThreadRef()) { >- pthread_key_create(&key_, nullptr); >- // This is necessary to alert the cocoa runtime of the fact that >- // we are running in a multithreaded environment. >- InitCocoaMultiThreading(); >-} >- >-// static >-void* Thread::PreRun(void* pv) { >- ThreadInit* init = static_cast<ThreadInit*>(pv); >- ThreadManager::Instance()->SetCurrentThread(init->thread); >- rtc::SetCurrentThreadName(init->thread->name_.c_str()); >- @autoreleasepool { >- if (init->runnable) { >- init->runnable->Run(init->thread); >- } else { >- init->thread->Run(); >- } >- } >- ThreadManager::Instance()->SetCurrentThread(nullptr); >- delete init; >- return nullptr; >-} >- >-bool Thread::ProcessMessages(int cmsLoop) { >- int64_t msEnd = (kForever == cmsLoop) ? 
0 : TimeAfter(cmsLoop); >- int cmsNext = cmsLoop; >- >- while (true) { >- @autoreleasepool { >- Message msg; >- if (!Get(&msg, cmsNext)) >- return !IsQuitting(); >- Dispatch(&msg); >- >- if (cmsLoop != kForever) { >- cmsNext = static_cast<int>(TimeUntil(msEnd)); >- if (cmsNext < 0) >- return true; >- } >- } >- } >-} >-} // namespace rtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/thread_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/thread_unittest.cc >index d5c53f8a7cc0a27fc4fa647bf97d1cf6f29cc2ec..7e392affd94a5b0a7bdbe3809e3b6a68d71d3d67 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/thread_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/thread_unittest.cc >@@ -69,7 +69,7 @@ class SocketClient : public TestGenerator, public sigslot::has_slots<> { > const char* buf, > size_t size, > const SocketAddress& remote_addr, >- const PacketTime& packet_time) { >+ const int64_t& packet_time_us) { > EXPECT_EQ(size, sizeof(uint32_t)); > uint32_t prev = reinterpret_cast<const uint32_t*>(buf)[0]; > uint32_t result = Next(prev); >@@ -470,9 +470,9 @@ TEST_F(AsyncInvokeTest, KillInvokerDuringExecute) { > // Use these events to get in a state where the functor is in the middle of > // executing, and then to wait for it to finish, ensuring the "EXPECT_FALSE" > // is run. >- Event functor_started(false, false); >- Event functor_continue(false, false); >- Event functor_finished(false, false); >+ Event functor_started; >+ Event functor_continue; >+ Event functor_finished; > > auto thread = Thread::CreateWithSocketServer(); > thread->Start(); >@@ -507,7 +507,7 @@ TEST_F(AsyncInvokeTest, KillInvokerDuringExecute) { > // destroyed. This shouldn't deadlock or crash; this second invocation should > // just be ignored. 
> TEST_F(AsyncInvokeTest, KillInvokerDuringExecuteWithReentrantInvoke) { >- Event functor_started(false, false); >+ Event functor_started; > // Flag used to verify that the recursively invoked task never actually runs. > bool reentrant_functor_run = false; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/timestampaligner.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/timestampaligner.cc >index a9bcafbdc5f5adcbd2f6e65e6e85a39ec8a868e8..f2da101727110d47aeb99c0de2cc43b434ce61a1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/timestampaligner.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/timestampaligner.cc >@@ -8,6 +8,7 @@ > * be found in the AUTHORS file in the root of the source tree. > */ > >+#include <cstdlib> > #include <limits> > > #include "rtc_base/checks.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/timeutils.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/timeutils.cc >index de8fc3259c28ec4e7c07df5a56f8e49733609737..dc5b611f647bec9452206de052d7e5fb69f4593c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/timeutils.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/timeutils.cc >@@ -14,6 +14,7 @@ > #include <sys/time.h> > #if defined(WEBRTC_MAC) > #include <mach/mach_time.h> >+#include "rtc_base/numerics/safe_conversions.h" > #endif > #endif > >@@ -28,7 +29,6 @@ > #endif > > #include "rtc_base/checks.h" >-#include "rtc_base/numerics/safe_conversions.h" > #include "rtc_base/timeutils.h" > > namespace rtc { >@@ -154,7 +154,7 @@ int64_t TimestampWrapAroundHandler::Unwrap(uint32_t ts) { > return ts + (num_wrap_ << 32); > } > >-int64_t TmToSeconds(const std::tm& tm) { >+int64_t TmToSeconds(const tm& tm) { > static short int mdays[12] = {31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31}; > static short int cumul_mdays[12] = {0, 31, 59, 90, 120, 151, > 181, 212, 243, 273, 304, 334}; >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/timeutils.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/timeutils.h >index 3a412a47cf8425bc9f1a4e3300d56d4501cca5ac..4e38a031409f0f05ab5838774bc7eef10a6907b3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/timeutils.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/timeutils.h >@@ -13,8 +13,6 @@ > > #include <stdint.h> > #include <time.h> >- >-#include <ctime> > #include <string> > > #include "rtc_base/checks.h" >@@ -110,10 +108,10 @@ class TimestampWrapAroundHandler { > int64_t num_wrap_; > }; > >-// Convert from std::tm, which is relative to 1900-01-01 00:00 to number of >-// seconds from 1970-01-01 00:00 ("epoch"). Don't return time_t since that >+// Convert from tm, which is relative to 1900-01-01 00:00 to number of >+// seconds from 1970-01-01 00:00 ("epoch"). Don't return time_t since that > // is still 32 bits on many systems. >-int64_t TmToSeconds(const std::tm& tm); >+int64_t TmToSeconds(const tm& tm); > > // Return the number of microseconds since January 1, 1970, UTC. > // Useful mainly when producing logs to be correlated with other >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/timeutils_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/timeutils_unittest.cc >index 0e1949aee31125bd36a721200338416e02b07c96..577efda0255b03cb679f4595bdebc5559cf482f1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/timeutils_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/timeutils_unittest.cc >@@ -262,7 +262,7 @@ TEST(FakeClock, SettingTimeWakesThreads) { > worker->Start(); > > // Post an event that won't be executed for 10 seconds. 
>- Event message_handler_dispatched(false, false); >+ Event message_handler_dispatched; > auto functor = [&message_handler_dispatched] { > message_handler_dispatched.Set(); > }; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/units/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/units/BUILD.gn >new file mode 100644 >index 0000000000000000000000000000000000000000..8c722bbc7b8e70fb206f806c112b61ddbfe985b3 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/units/BUILD.gn >@@ -0,0 +1,37 @@ >+# Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+# >+# Use of this source code is governed by a BSD-style license >+# that can be found in the LICENSE file in the root of the source >+# tree. An additional intellectual property rights grant can be found >+# in the file PATENTS. All contributing project authors may >+# be found in the AUTHORS file in the root of the source tree. >+ >+import("../../webrtc.gni") >+ >+rtc_source_set("unit_base") { >+ visibility = [ >+ "../../api/units:*", >+ ":*", >+ ] >+ sources = [ >+ "unit_base.h", >+ ] >+ >+ deps = [ >+ "../../rtc_base:checks", >+ "../../rtc_base:safe_conversions", >+ ] >+} >+ >+if (rtc_include_tests) { >+ rtc_source_set("units_unittests") { >+ testonly = true >+ sources = [ >+ "unit_base_unittest.cc", >+ ] >+ deps = [ >+ ":unit_base", >+ "../../test:test_support", >+ ] >+ } >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/units/unit_base.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/units/unit_base.h >new file mode 100644 >index 0000000000000000000000000000000000000000..5503a329937667642cfa70f7a17d29e333b4f064 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/units/unit_base.h >@@ -0,0 +1,304 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. 
>+ *
>+ * Use of this source code is governed by a BSD-style license
>+ * that can be found in the LICENSE file in the root of the source
>+ * tree. An additional intellectual property rights grant can be found
>+ * in the file PATENTS. All contributing project authors may
>+ * be found in the AUTHORS file in the root of the source tree.
>+ */
>+#ifndef RTC_BASE_UNITS_UNIT_BASE_H_
>+#define RTC_BASE_UNITS_UNIT_BASE_H_
>+
>+#include <stdint.h>
>+#include <algorithm>
>+#include <cmath>
>+#include <limits>
>+#include <type_traits>
>+
>+#include "rtc_base/checks.h"
>+#include "rtc_base/numerics/safe_conversions.h"
>+
>+namespace webrtc {
>+namespace rtc_units_impl {
>+
>+// UnitBase is a base class for implementing custom value types with a specific
>+// unit. It provides type safety and commonly useful operations. The underlying
>+// storage is always an int64_t; it's up to the unit implementation to choose
>+// what scale it represents.
>+//
>+// It's used like:
>+// class MyUnit: public UnitBase<MyUnit> {...};
>+//
>+// Unit_T is the subclass representing the specific unit.
>+template <class Unit_T> >+class UnitBase { >+ public: >+ UnitBase() = delete; >+ static constexpr Unit_T Zero() { return Unit_T(0); } >+ static constexpr Unit_T PlusInfinity() { return Unit_T(PlusInfinityVal()); } >+ static constexpr Unit_T MinusInfinity() { return Unit_T(MinusInfinityVal()); } >+ >+ constexpr bool IsZero() const { return value_ == 0; } >+ constexpr bool IsFinite() const { return !IsInfinite(); } >+ constexpr bool IsInfinite() const { >+ return value_ == PlusInfinityVal() || value_ == MinusInfinityVal(); >+ } >+ constexpr bool IsPlusInfinity() const { return value_ == PlusInfinityVal(); } >+ constexpr bool IsMinusInfinity() const { >+ return value_ == MinusInfinityVal(); >+ } >+ >+ constexpr bool operator==(const Unit_T& other) const { >+ return value_ == other.value_; >+ } >+ constexpr bool operator!=(const Unit_T& other) const { >+ return value_ != other.value_; >+ } >+ constexpr bool operator<=(const Unit_T& other) const { >+ return value_ <= other.value_; >+ } >+ constexpr bool operator>=(const Unit_T& other) const { >+ return value_ >= other.value_; >+ } >+ constexpr bool operator>(const Unit_T& other) const { >+ return value_ > other.value_; >+ } >+ constexpr bool operator<(const Unit_T& other) const { >+ return value_ < other.value_; >+ } >+ >+ protected: >+ template <int64_t value> >+ static constexpr Unit_T FromStaticValue() { >+ static_assert(value >= 0 || !Unit_T::one_sided, ""); >+ static_assert(value > MinusInfinityVal(), ""); >+ static_assert(value < PlusInfinityVal(), ""); >+ return Unit_T(value); >+ } >+ >+ template <int64_t fraction_value, int64_t Denominator> >+ static constexpr Unit_T FromStaticFraction() { >+ static_assert(fraction_value >= 0 || !Unit_T::one_sided, ""); >+ static_assert(fraction_value > MinusInfinityVal() / Denominator, ""); >+ static_assert(fraction_value < PlusInfinityVal() / Denominator, ""); >+ return Unit_T(fraction_value * Denominator); >+ } >+ >+ template < >+ typename T, >+ typename 
std::enable_if<std::is_integral<T>::value>::type* = nullptr> >+ static Unit_T FromValue(T value) { >+ if (Unit_T::one_sided) >+ RTC_DCHECK_GE(value, 0); >+ RTC_DCHECK_GT(value, MinusInfinityVal()); >+ RTC_DCHECK_LT(value, PlusInfinityVal()); >+ return Unit_T(rtc::dchecked_cast<int64_t>(value)); >+ } >+ template <typename T, >+ typename std::enable_if<std::is_floating_point<T>::value>::type* = >+ nullptr> >+ static Unit_T FromValue(T value) { >+ if (value == std::numeric_limits<T>::infinity()) { >+ return PlusInfinity(); >+ } else if (value == -std::numeric_limits<T>::infinity()) { >+ return MinusInfinity(); >+ } else { >+ RTC_DCHECK(!std::isnan(value)); >+ return FromValue(rtc::dchecked_cast<int64_t>(value)); >+ } >+ } >+ >+ template < >+ int64_t Denominator, >+ typename T, >+ typename std::enable_if<std::is_integral<T>::value>::type* = nullptr> >+ static Unit_T FromFraction(T value) { >+ if (Unit_T::one_sided) >+ RTC_DCHECK_GE(value, 0); >+ RTC_DCHECK_GT(value, MinusInfinityVal() / Denominator); >+ RTC_DCHECK_LT(value, PlusInfinityVal() / Denominator); >+ return Unit_T(rtc::dchecked_cast<int64_t>(value * Denominator)); >+ } >+ template <int64_t Denominator, >+ typename T, >+ typename std::enable_if<std::is_floating_point<T>::value>::type* = >+ nullptr> >+ static Unit_T FromFraction(T value) { >+ return FromValue(value * Denominator); >+ } >+ >+ template <typename T = int64_t> >+ typename std::enable_if<std::is_integral<T>::value, T>::type ToValue() const { >+ RTC_DCHECK(IsFinite()); >+ return rtc::dchecked_cast<T>(value_); >+ } >+ template <typename T> >+ constexpr typename std::enable_if<std::is_floating_point<T>::value, T>::type >+ ToValue() const { >+ return IsPlusInfinity() >+ ? std::numeric_limits<T>::infinity() >+ : IsMinusInfinity() ? -std::numeric_limits<T>::infinity() >+ : value_; >+ } >+ template <typename T> >+ constexpr T ToValueOr(T fallback_value) const { >+ return IsFinite() ? 
value_ : fallback_value; >+ } >+ >+ template <int64_t Denominator, typename T = int64_t> >+ typename std::enable_if<std::is_integral<T>::value, T>::type ToFraction() >+ const { >+ RTC_DCHECK(IsFinite()); >+ if (Unit_T::one_sided) { >+ return rtc::dchecked_cast<T>( >+ DivRoundPositiveToNearest(value_, Denominator)); >+ } else { >+ return rtc::dchecked_cast<T>(DivRoundToNearest(value_, Denominator)); >+ } >+ } >+ template <int64_t Denominator, typename T> >+ constexpr typename std::enable_if<std::is_floating_point<T>::value, T>::type >+ ToFraction() const { >+ return ToValue<T>() * (1 / static_cast<T>(Denominator)); >+ } >+ >+ template <int64_t Denominator> >+ constexpr int64_t ToFractionOr(int64_t fallback_value) const { >+ return IsFinite() ? Unit_T::one_sided >+ ? DivRoundPositiveToNearest(value_, Denominator) >+ : DivRoundToNearest(value_, Denominator) >+ : fallback_value; >+ } >+ >+ template <int64_t Factor, typename T = int64_t> >+ typename std::enable_if<std::is_integral<T>::value, T>::type ToMultiple() >+ const { >+ RTC_DCHECK_GE(ToValue(), std::numeric_limits<T>::min() / Factor); >+ RTC_DCHECK_LE(ToValue(), std::numeric_limits<T>::max() / Factor); >+ return rtc::dchecked_cast<T>(ToValue() * Factor); >+ } >+ template <int64_t Factor, typename T> >+ constexpr typename std::enable_if<std::is_floating_point<T>::value, T>::type >+ ToMultiple() const { >+ return ToValue<T>() * Factor; >+ } >+ >+ explicit constexpr UnitBase(int64_t value) : value_(value) {} >+ >+ private: >+ template <class RelativeUnit_T> >+ friend class RelativeUnit; >+ >+ static inline constexpr int64_t PlusInfinityVal() { >+ return std::numeric_limits<int64_t>::max(); >+ } >+ static inline constexpr int64_t MinusInfinityVal() { >+ return std::numeric_limits<int64_t>::min(); >+ } >+ >+ Unit_T& AsSubClassRef() { return reinterpret_cast<Unit_T&>(*this); } >+ constexpr const Unit_T& AsSubClassRef() const { >+ return reinterpret_cast<const Unit_T&>(*this); >+ } >+ // Assumes that n >= 0 and d > 0. 
>+ static constexpr int64_t DivRoundPositiveToNearest(int64_t n, int64_t d) { >+ return (n + d / 2) / d; >+ } >+ // Assumes that d > 0. >+ static constexpr int64_t DivRoundToNearest(int64_t n, int64_t d) { >+ return (n + (n >= 0 ? d / 2 : -d / 2)) / d; >+ } >+ >+ int64_t value_; >+}; >+ >+// Extends UnitBase to provide operations for relative units, that is, units >+// that have a meaningful relation between values such that a += b is a >+// sensible thing to do. For a,b <- same unit. >+template <class Unit_T> >+class RelativeUnit : public UnitBase<Unit_T> { >+ public: >+ Unit_T Clamped(Unit_T min_value, Unit_T max_value) const { >+ return std::max(min_value, >+ std::min(UnitBase<Unit_T>::AsSubClassRef(), max_value)); >+ } >+ void Clamp(Unit_T min_value, Unit_T max_value) { >+ *this = Clamped(min_value, max_value); >+ } >+ Unit_T operator+(const Unit_T other) const { >+ if (this->IsPlusInfinity() || other.IsPlusInfinity()) { >+ RTC_DCHECK(!this->IsMinusInfinity()); >+ RTC_DCHECK(!other.IsMinusInfinity()); >+ return this->PlusInfinity(); >+ } else if (this->IsMinusInfinity() || other.IsMinusInfinity()) { >+ RTC_DCHECK(!this->IsPlusInfinity()); >+ RTC_DCHECK(!other.IsPlusInfinity()); >+ return this->MinusInfinity(); >+ } >+ return UnitBase<Unit_T>::FromValue(this->ToValue() + other.ToValue()); >+ } >+ Unit_T operator-(const Unit_T other) const { >+ if (this->IsPlusInfinity() || other.IsMinusInfinity()) { >+ RTC_DCHECK(!this->IsMinusInfinity()); >+ RTC_DCHECK(!other.IsPlusInfinity()); >+ return this->PlusInfinity(); >+ } else if (this->IsMinusInfinity() || other.IsPlusInfinity()) { >+ RTC_DCHECK(!this->IsPlusInfinity()); >+ RTC_DCHECK(!other.IsMinusInfinity()); >+ return this->MinusInfinity(); >+ } >+ return UnitBase<Unit_T>::FromValue(this->ToValue() - other.ToValue()); >+ } >+ Unit_T& operator+=(const Unit_T other) { >+ *this = *this + other; >+ return this->AsSubClassRef(); >+ } >+ Unit_T& operator-=(const Unit_T other) { >+ *this = *this - other; >+ return 
this->AsSubClassRef(); >+ } >+ constexpr double operator/(const Unit_T other) const { >+ return UnitBase<Unit_T>::template ToValue<double>() / >+ other.template ToValue<double>(); >+ } >+ template <typename T> >+ typename std::enable_if<std::is_arithmetic<T>::value, Unit_T>::type operator/( >+ const T& scalar) const { >+ return UnitBase<Unit_T>::FromValue( >+ std::round(UnitBase<Unit_T>::template ToValue<int64_t>() / scalar)); >+ } >+ Unit_T operator*(const double scalar) const { >+ return UnitBase<Unit_T>::FromValue(std::round(this->ToValue() * scalar)); >+ } >+ Unit_T operator*(const int64_t scalar) const { >+ return UnitBase<Unit_T>::FromValue(this->ToValue() * scalar); >+ } >+ Unit_T operator*(const int32_t scalar) const { >+ return UnitBase<Unit_T>::FromValue(this->ToValue() * scalar); >+ } >+ >+ protected: >+ using UnitBase<Unit_T>::UnitBase; >+}; >+ >+template <class Unit_T> >+inline Unit_T operator*(const double scalar, const RelativeUnit<Unit_T> other) { >+ return other * scalar; >+} >+template <class Unit_T> >+inline Unit_T operator*(const int64_t scalar, >+ const RelativeUnit<Unit_T> other) { >+ return other * scalar; >+} >+template <class Unit_T> >+inline Unit_T operator*(const int32_t& scalar, >+ const RelativeUnit<Unit_T> other) { >+ return other * scalar; >+} >+ >+} // namespace rtc_units_impl >+ >+} // namespace webrtc >+ >+#endif // RTC_BASE_UNITS_UNIT_BASE_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/units/unit_base_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/units/unit_base_unittest.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..f8c8503decb47833e49d0df16d2039cb0ac04cfd >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/units/unit_base_unittest.cc >@@ -0,0 +1,234 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. 
>+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "rtc_base/units/unit_base.h" >+ >+#include "test/gtest.h" >+ >+namespace webrtc { >+namespace { >+class TestUnit final : public rtc_units_impl::RelativeUnit<TestUnit> { >+ public: >+ TestUnit() = delete; >+ >+ using UnitBase::FromStaticValue; >+ using UnitBase::FromValue; >+ using UnitBase::ToValue; >+ using UnitBase::ToValueOr; >+ >+ template <int64_t kilo> >+ static constexpr TestUnit FromStaticKilo() { >+ return FromStaticFraction<kilo, 1000>(); >+ } >+ template <typename T> >+ static TestUnit FromKilo(T kilo) { >+ return FromFraction<1000>(kilo); >+ } >+ template <typename T = int64_t> >+ T ToKilo() const { >+ return UnitBase::ToFraction<1000, T>(); >+ } >+ constexpr int64_t ToKiloOr(int64_t fallback) const { >+ return UnitBase::ToFractionOr<1000>(fallback); >+ } >+ template <typename T> >+ constexpr T ToMilli() const { >+ return UnitBase::ToMultiple<1000, T>(); >+ } >+ >+ private: >+ friend class UnitBase<TestUnit>; >+ static constexpr bool one_sided = false; >+ using RelativeUnit<TestUnit>::RelativeUnit; >+}; >+} // namespace >+namespace test { >+TEST(UnitBaseTest, ConstExpr) { >+ constexpr int64_t kValue = -12345; >+ constexpr TestUnit kTestUnitZero = TestUnit::Zero(); >+ constexpr TestUnit kTestUnitPlusInf = TestUnit::PlusInfinity(); >+ constexpr TestUnit kTestUnitMinusInf = TestUnit::MinusInfinity(); >+ static_assert(kTestUnitZero.IsZero(), ""); >+ static_assert(kTestUnitPlusInf.IsPlusInfinity(), ""); >+ static_assert(kTestUnitMinusInf.IsMinusInfinity(), ""); >+ static_assert(kTestUnitPlusInf.ToKiloOr(-1) == -1, ""); >+ >+ static_assert(kTestUnitPlusInf > kTestUnitZero, ""); >+ >+ constexpr TestUnit 
kTestUnitKilo = TestUnit::FromStaticKilo<kValue>(); >+ constexpr TestUnit kTestUnitValue = TestUnit::FromStaticValue<kValue>(); >+ >+ static_assert(kTestUnitKilo.ToKiloOr(0) == kValue, ""); >+ static_assert(kTestUnitValue.ToValueOr(0) == kValue, ""); >+} >+ >+TEST(UnitBaseTest, GetBackSameValues) { >+ const int64_t kValue = 499; >+ for (int sign = -1; sign <= 1; ++sign) { >+ int64_t value = kValue * sign; >+ EXPECT_EQ(TestUnit::FromKilo(value).ToKilo(), value); >+ EXPECT_EQ(TestUnit::FromValue(value).ToValue<int64_t>(), value); >+ } >+ EXPECT_EQ(TestUnit::Zero().ToValue<int64_t>(), 0); >+} >+ >+TEST(UnitBaseTest, GetDifferentPrefix) { >+ const int64_t kValue = 3000000; >+ EXPECT_EQ(TestUnit::FromValue(kValue).ToKilo(), kValue / 1000); >+ EXPECT_EQ(TestUnit::FromKilo(kValue).ToValue<int64_t>(), kValue * 1000); >+} >+ >+TEST(UnitBaseTest, IdentityChecks) { >+ const int64_t kValue = 3000; >+ EXPECT_TRUE(TestUnit::Zero().IsZero()); >+ EXPECT_FALSE(TestUnit::FromKilo(kValue).IsZero()); >+ >+ EXPECT_TRUE(TestUnit::PlusInfinity().IsInfinite()); >+ EXPECT_TRUE(TestUnit::MinusInfinity().IsInfinite()); >+ EXPECT_FALSE(TestUnit::Zero().IsInfinite()); >+ EXPECT_FALSE(TestUnit::FromKilo(-kValue).IsInfinite()); >+ EXPECT_FALSE(TestUnit::FromKilo(kValue).IsInfinite()); >+ >+ EXPECT_FALSE(TestUnit::PlusInfinity().IsFinite()); >+ EXPECT_FALSE(TestUnit::MinusInfinity().IsFinite()); >+ EXPECT_TRUE(TestUnit::FromKilo(-kValue).IsFinite()); >+ EXPECT_TRUE(TestUnit::FromKilo(kValue).IsFinite()); >+ EXPECT_TRUE(TestUnit::Zero().IsFinite()); >+ >+ EXPECT_TRUE(TestUnit::PlusInfinity().IsPlusInfinity()); >+ EXPECT_FALSE(TestUnit::MinusInfinity().IsPlusInfinity()); >+ >+ EXPECT_TRUE(TestUnit::MinusInfinity().IsMinusInfinity()); >+ EXPECT_FALSE(TestUnit::PlusInfinity().IsMinusInfinity()); >+} >+ >+TEST(UnitBaseTest, ComparisonOperators) { >+ const int64_t kSmall = 450; >+ const int64_t kLarge = 451; >+ const TestUnit small = TestUnit::FromKilo(kSmall); >+ const TestUnit large = 
TestUnit::FromKilo(kLarge); >+ >+ EXPECT_EQ(TestUnit::Zero(), TestUnit::FromKilo(0)); >+ EXPECT_EQ(TestUnit::PlusInfinity(), TestUnit::PlusInfinity()); >+ EXPECT_EQ(small, TestUnit::FromKilo(kSmall)); >+ EXPECT_LE(small, TestUnit::FromKilo(kSmall)); >+ EXPECT_GE(small, TestUnit::FromKilo(kSmall)); >+ EXPECT_NE(small, TestUnit::FromKilo(kLarge)); >+ EXPECT_LE(small, TestUnit::FromKilo(kLarge)); >+ EXPECT_LT(small, TestUnit::FromKilo(kLarge)); >+ EXPECT_GE(large, TestUnit::FromKilo(kSmall)); >+ EXPECT_GT(large, TestUnit::FromKilo(kSmall)); >+ EXPECT_LT(TestUnit::Zero(), small); >+ EXPECT_GT(TestUnit::Zero(), TestUnit::FromKilo(-kSmall)); >+ EXPECT_GT(TestUnit::Zero(), TestUnit::FromKilo(-kSmall)); >+ >+ EXPECT_GT(TestUnit::PlusInfinity(), large); >+ EXPECT_LT(TestUnit::MinusInfinity(), TestUnit::Zero()); >+} >+ >+TEST(UnitBaseTest, Clamping) { >+ const TestUnit upper = TestUnit::FromKilo(800); >+ const TestUnit lower = TestUnit::FromKilo(100); >+ const TestUnit under = TestUnit::FromKilo(100); >+ const TestUnit inside = TestUnit::FromKilo(500); >+ const TestUnit over = TestUnit::FromKilo(1000); >+ EXPECT_EQ(under.Clamped(lower, upper), lower); >+ EXPECT_EQ(inside.Clamped(lower, upper), inside); >+ EXPECT_EQ(over.Clamped(lower, upper), upper); >+ >+ TestUnit mutable_delta = lower; >+ mutable_delta.Clamp(lower, upper); >+ EXPECT_EQ(mutable_delta, lower); >+ mutable_delta = inside; >+ mutable_delta.Clamp(lower, upper); >+ EXPECT_EQ(mutable_delta, inside); >+ mutable_delta = over; >+ mutable_delta.Clamp(lower, upper); >+ EXPECT_EQ(mutable_delta, upper); >+} >+ >+TEST(UnitBaseTest, CanBeInititializedFromLargeInt) { >+ const int kMaxInt = std::numeric_limits<int>::max(); >+ EXPECT_EQ(TestUnit::FromKilo(kMaxInt).ToValue<int64_t>(), >+ static_cast<int64_t>(kMaxInt) * 1000); >+} >+ >+TEST(UnitBaseTest, ConvertsToAndFromDouble) { >+ const int64_t kValue = 17017; >+ const double kMilliDouble = kValue * 1e3; >+ const double kValueDouble = kValue; >+ const double kKiloDouble = 
kValue * 1e-3; >+ >+ EXPECT_EQ(TestUnit::FromValue(kValue).ToKilo<double>(), kKiloDouble); >+ EXPECT_EQ(TestUnit::FromKilo(kKiloDouble).ToValue<int64_t>(), kValue); >+ >+ EXPECT_EQ(TestUnit::FromValue(kValue).ToValue<double>(), kValueDouble); >+ EXPECT_EQ(TestUnit::FromValue(kValueDouble).ToValue<int64_t>(), kValue); >+ >+ EXPECT_NEAR(TestUnit::FromValue(kValue).ToMilli<double>(), kMilliDouble, 1); >+ >+ const double kPlusInfinity = std::numeric_limits<double>::infinity(); >+ const double kMinusInfinity = -kPlusInfinity; >+ >+ EXPECT_EQ(TestUnit::PlusInfinity().ToKilo<double>(), kPlusInfinity); >+ EXPECT_EQ(TestUnit::MinusInfinity().ToKilo<double>(), kMinusInfinity); >+ EXPECT_EQ(TestUnit::PlusInfinity().ToValue<double>(), kPlusInfinity); >+ EXPECT_EQ(TestUnit::MinusInfinity().ToValue<double>(), kMinusInfinity); >+ EXPECT_EQ(TestUnit::PlusInfinity().ToMilli<double>(), kPlusInfinity); >+ EXPECT_EQ(TestUnit::MinusInfinity().ToMilli<double>(), kMinusInfinity); >+ >+ EXPECT_TRUE(TestUnit::FromKilo(kPlusInfinity).IsPlusInfinity()); >+ EXPECT_TRUE(TestUnit::FromKilo(kMinusInfinity).IsMinusInfinity()); >+ EXPECT_TRUE(TestUnit::FromValue(kPlusInfinity).IsPlusInfinity()); >+ EXPECT_TRUE(TestUnit::FromValue(kMinusInfinity).IsMinusInfinity()); >+} >+ >+TEST(UnitBaseTest, MathOperations) { >+ const int64_t kValueA = 267; >+ const int64_t kValueB = 450; >+ const TestUnit delta_a = TestUnit::FromKilo(kValueA); >+ const TestUnit delta_b = TestUnit::FromKilo(kValueB); >+ EXPECT_EQ((delta_a + delta_b).ToKilo(), kValueA + kValueB); >+ EXPECT_EQ((delta_a - delta_b).ToKilo(), kValueA - kValueB); >+ >+ const int32_t kInt32Value = 123; >+ const double kFloatValue = 123.0; >+ EXPECT_EQ((TestUnit::FromValue(kValueA) * kValueB).ToValue<int64_t>(), >+ kValueA * kValueB); >+ EXPECT_EQ((TestUnit::FromValue(kValueA) * kInt32Value).ToValue<int64_t>(), >+ kValueA * kInt32Value); >+ EXPECT_EQ((TestUnit::FromValue(kValueA) * kFloatValue).ToValue<int64_t>(), >+ kValueA * kFloatValue); >+ >+ 
EXPECT_EQ((delta_b / 10).ToKilo(), kValueB / 10); >+ EXPECT_EQ(delta_b / delta_a, static_cast<double>(kValueB) / kValueA); >+ >+ TestUnit mutable_delta = TestUnit::FromKilo(kValueA); >+ mutable_delta += TestUnit::FromKilo(kValueB); >+ EXPECT_EQ(mutable_delta, TestUnit::FromKilo(kValueA + kValueB)); >+ mutable_delta -= TestUnit::FromKilo(kValueB); >+ EXPECT_EQ(mutable_delta, TestUnit::FromKilo(kValueA)); >+} >+ >+TEST(UnitBaseTest, InfinityOperations) { >+ const int64_t kValue = 267; >+ const TestUnit finite = TestUnit::FromKilo(kValue); >+ EXPECT_TRUE((TestUnit::PlusInfinity() + finite).IsPlusInfinity()); >+ EXPECT_TRUE((TestUnit::PlusInfinity() - finite).IsPlusInfinity()); >+ EXPECT_TRUE((finite + TestUnit::PlusInfinity()).IsPlusInfinity()); >+ EXPECT_TRUE((finite - TestUnit::MinusInfinity()).IsPlusInfinity()); >+ >+ EXPECT_TRUE((TestUnit::MinusInfinity() + finite).IsMinusInfinity()); >+ EXPECT_TRUE((TestUnit::MinusInfinity() - finite).IsMinusInfinity()); >+ EXPECT_TRUE((finite + TestUnit::MinusInfinity()).IsMinusInfinity()); >+ EXPECT_TRUE((finite - TestUnit::PlusInfinity()).IsMinusInfinity()); >+} >+} // namespace test >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/unittest_main.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/unittest_main.cc >index 8d4ff2df2d3930cf7ed4c859e197f91f43ce083c..5fd3a996876f8dca3544d0884412547747009999 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/unittest_main.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/unittest_main.cc >@@ -22,7 +22,6 @@ > #include "system_wrappers/include/field_trial.h" > #include "system_wrappers/include/metrics.h" > #include "test/field_trial.h" >-#include "test/testsupport/fileutils.h" > > #if defined(WEBRTC_WIN) > #include "rtc_base/win32socketinit.h" >@@ -32,19 +31,20 @@ > #include "test/ios/test_support.h" > #endif > >-DEFINE_bool(help, false, "prints this message"); >-DEFINE_string(log, "", "logging options to use"); 
>-DEFINE_string( >+WEBRTC_DEFINE_bool(help, false, "prints this message"); >+WEBRTC_DEFINE_string(log, "", "logging options to use"); >+WEBRTC_DEFINE_string( > force_fieldtrials, > "", > "Field trials control experimental feature code which can be forced. " > "E.g. running with --force_fieldtrials=WebRTC-FooFeature/Enable/" > " will assign the group Enable to field trial WebRTC-FooFeature."); > #if defined(WEBRTC_WIN) >-DEFINE_int(crt_break_alloc, -1, "memory allocation to break on"); >-DEFINE_bool(default_error_handlers, >- false, >- "leave the default exception/dbg handler functions in place"); >+WEBRTC_DEFINE_int(crt_break_alloc, -1, "memory allocation to break on"); >+WEBRTC_DEFINE_bool( >+ default_error_handlers, >+ false, >+ "leave the default exception/dbg handler functions in place"); > > void TestInvalidParameterHandler(const wchar_t* expression, > const wchar_t* function, >@@ -81,7 +81,6 @@ int main(int argc, char* argv[]) { > return 0; > } > >- webrtc::test::SetExecutablePath(argv[0]); > webrtc::test::ValidateFieldTrialsStringOrDie(FLAG_force_fieldtrials); > // InitFieldTrialsFromString stores the char*, so the char array must outlive > // the application. >@@ -118,7 +117,7 @@ int main(int argc, char* argv[]) { > > // Initialize SSL which are used by several tests. > rtc::InitializeSSL(); >- rtc::SSLStreamAdapter::enable_time_callback_for_testing(); >+ rtc::SSLStreamAdapter::EnableTimeCallbackForTesting(); > > #if defined(WEBRTC_IOS) > rtc::test::InitTestSuite(RUN_ALL_TESTS, argc, argv, false); >@@ -138,5 +137,13 @@ int main(int argc, char* argv[]) { > _CrtSetReportHook2(_CRT_RPTHOOK_REMOVE, TestCrtReportHandler); > #endif > >+#if defined(ADDRESS_SANITIZER) || defined(LEAK_SANITIZER) || \ >+ defined(MEMORY_SANITIZER) || defined(THREAD_SANITIZER) || \ >+ defined(UNDEFINED_SANITIZER) >+ // We want the test flagged as failed only for sanitizer defects, >+ // in which case the sanitizer will override exit code with 66. 
>+ return 0; >+#endif >+ > return res; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/unixfilesystem.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/unixfilesystem.cc >deleted file mode 100644 >index 023c34c8c943f873d8cbfaf125f5d71b89c8530e..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/unixfilesystem.cc >+++ /dev/null >@@ -1,114 +0,0 @@ >-/* >- * Copyright 2004 The WebRTC Project Authors. All rights reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include "rtc_base/unixfilesystem.h" >- >-#include <errno.h> >-#include <fcntl.h> >-#include <stdlib.h> >-#include <sys/stat.h> >-#include <unistd.h> >- >-#if defined(WEBRTC_MAC) && !defined(WEBRTC_IOS) >-#include <CoreServices/CoreServices.h> >-#include <IOKit/IOCFBundle.h> >-#include <sys/statvfs.h> >-#include "rtc_base/macutils.h" >-#endif // WEBRTC_MAC && !defined(WEBRTC_IOS) >- >-#if defined(WEBRTC_POSIX) && !defined(WEBRTC_MAC) || defined(WEBRTC_IOS) >-#include <sys/types.h> >-#if defined(WEBRTC_ANDROID) >-#include <sys/statfs.h> >-#elif !defined(__native_client__) >-#include <sys/statvfs.h> >-#endif // !defined(__native_client__) >-#include <limits.h> >-#include <pwd.h> >-#include <stdio.h> >-#endif // WEBRTC_POSIX && !WEBRTC_MAC || WEBRTC_IOS >- >-#if defined(WEBRTC_LINUX) && !defined(WEBRTC_ANDROID) >-#include <ctype.h> >-#include <algorithm> >-#endif >- >-#if defined(__native_client__) && !defined(__GLIBC__) >-#include <sys/syslimits.h> >-#endif >- >-#include "rtc_base/arraysize.h" >-#include "rtc_base/checks.h" >-#include "rtc_base/fileutils.h" >-#include "rtc_base/pathutils.h" >-#include "rtc_base/stream.h" >-#include 
"rtc_base/stringutils.h" >- >-namespace rtc { >- >-UnixFilesystem::UnixFilesystem() {} >- >-UnixFilesystem::~UnixFilesystem() {} >- >-bool UnixFilesystem::DeleteFile(const Pathname& filename) { >- RTC_LOG(LS_INFO) << "Deleting file:" << filename.pathname(); >- >- if (!IsFile(filename)) { >- RTC_DCHECK(IsFile(filename)); >- return false; >- } >- return ::unlink(filename.pathname().c_str()) == 0; >-} >- >-bool UnixFilesystem::MoveFile(const Pathname& old_path, >- const Pathname& new_path) { >- if (!IsFile(old_path)) { >- RTC_DCHECK(IsFile(old_path)); >- return false; >- } >- RTC_LOG(LS_VERBOSE) << "Moving " << old_path.pathname() << " to " >- << new_path.pathname(); >- if (rename(old_path.pathname().c_str(), new_path.pathname().c_str()) != 0) { >- return false; >- } >- return true; >-} >- >-bool UnixFilesystem::IsFolder(const Pathname& path) { >- struct stat st; >- if (stat(path.pathname().c_str(), &st) < 0) >- return false; >- return S_ISDIR(st.st_mode); >-} >- >-bool UnixFilesystem::IsFile(const Pathname& pathname) { >- struct stat st; >- int res = ::stat(pathname.pathname().c_str(), &st); >- // Treat symlinks, named pipes, etc. all as files. 
>- return res == 0 && !S_ISDIR(st.st_mode); >-} >- >-bool UnixFilesystem::GetFileSize(const Pathname& pathname, size_t* size) { >- struct stat st; >- if (::stat(pathname.pathname().c_str(), &st) != 0) >- return false; >- *size = st.st_size; >- return true; >-} >- >-} // namespace rtc >- >-#if defined(__native_client__) >-extern "C" int __attribute__((weak)) >-link(const char* oldpath, const char* newpath) { >- errno = EACCES; >- return -1; >-} >-#endif >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/unixfilesystem.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/unixfilesystem.h >deleted file mode 100644 >index 711d7b3ea192e149c9111a0d94b9e8fdceeda276..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/unixfilesystem.h >+++ /dev/null >@@ -1,45 +0,0 @@ >-/* >- * Copyright 2004 The WebRTC Project Authors. All rights reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef RTC_BASE_UNIXFILESYSTEM_H_ >-#define RTC_BASE_UNIXFILESYSTEM_H_ >- >-#include <sys/types.h> >- >-#include "rtc_base/fileutils.h" >- >-namespace rtc { >- >-class UnixFilesystem : public FilesystemInterface { >- public: >- UnixFilesystem(); >- ~UnixFilesystem() override; >- >- // This will attempt to delete the file located at filename. >- // It will fail with VERIY if you pass it a non-existant file, or a directory. >- bool DeleteFile(const Pathname& filename) override; >- >- // This moves a file from old_path to new_path, where "file" can be a plain >- // file or directory, which will be moved recursively. >- // Returns true if function succeeds. 
>- bool MoveFile(const Pathname& old_path, const Pathname& new_path) override; >- >- // Returns true if a pathname is a directory >- bool IsFolder(const Pathname& pathname) override; >- >- // Returns true of pathname represents an existing file >- bool IsFile(const Pathname& pathname) override; >- >- bool GetFileSize(const Pathname& path, size_t* size) override; >-}; >- >-} // namespace rtc >- >-#endif // RTC_BASE_UNIXFILESYSTEM_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/virtualsocket_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/virtualsocket_unittest.cc >index d2bff5468837bd16d9303831b58436fc37ad757b..d44f46a65e5c1483d55ccdc1082c3f535dfb9fc3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/virtualsocket_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/virtualsocket_unittest.cc >@@ -104,7 +104,7 @@ struct Receiver : public MessageHandler, public sigslot::has_slots<> { > const char* data, > size_t size, > const SocketAddress& remote_addr, >- const PacketTime& packet_time) { >+ const int64_t& /* packet_time_us */) { > ASSERT_EQ(socket.get(), s); > ASSERT_GE(size, 4U); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/virtualsocketserver.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/virtualsocketserver.cc >index ae40d7888f551bf7c5abcb88ba4a77f66c40b76a..bfce0f54541e9d1816c6d132b66e3729641702ad 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/virtualsocketserver.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/virtualsocketserver.cc >@@ -525,7 +525,6 @@ VirtualSocketServer::VirtualSocketServer() : VirtualSocketServer(nullptr) {} > > VirtualSocketServer::VirtualSocketServer(FakeClock* fake_clock) > : fake_clock_(fake_clock), >- wakeup_(/*manual_reset=*/false, /*initially_signaled=*/false), > msg_queue_(nullptr), > stop_on_idle_(false), > next_ipv4_(kInitialNextIPv4), >@@ -1027,8 +1026,9 @@ void 
VirtualSocketServer::UpdateDelayDistribution() { > } > } > >+ > static double Normal(double x, double mean, double stddev) { >- static const double PI = 4 * atan(1.0); >+ static double PI = 4 * atan(1.0); > double a = (x - mean) * (x - mean) / (2 * stddev * stddev); > return exp(-a) / (stddev * sqrt(2 * PI)); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/weak_ptr_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/weak_ptr_unittest.cc >index 954171834e67b29044e67448bb88ef5249b074f2..66f2b4df5b18fd8f6299b58334d6194c25c54b71 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/weak_ptr_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/weak_ptr_unittest.cc >@@ -202,7 +202,7 @@ template <class T> > std::unique_ptr<T> NewObjectCreatedOnTaskQueue() { > std::unique_ptr<T> obj; > TaskQueue queue("NewObjectCreatedOnTaskQueue"); >- Event event(false, false); >+ Event event; > queue.PostTask([&event, &obj] { > obj.reset(new T()); > event.Set(); >@@ -229,7 +229,7 @@ TEST(WeakPtrTest, WeakPtrInitiateAndUseOnDifferentThreads) { > // Create weak ptr on main thread > WeakPtr<Target> weak_ptr = target->factory.GetWeakPtr(); > rtc::TaskQueue queue("queue"); >- rtc::Event done(false, false); >+ rtc::Event done; > queue.PostTask([&] { > // Dereference and invalide weak_ptr on another thread. 
> EXPECT_EQ(weak_ptr.get(), target.get()); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/win32.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/win32.cc >index d81d685f16c0f28c0dcd3e8a75ec7c9ebda8e010..e3482e3238448f6e5f2614e4659956c8b2a5d2cd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/win32.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/win32.cc >@@ -72,9 +72,9 @@ const char* inet_ntop_v4(const void* src, char* dst, socklen_t size) { > } > const struct in_addr* as_in_addr = > reinterpret_cast<const struct in_addr*>(src); >- rtc::sprintfn(dst, size, "%d.%d.%d.%d", as_in_addr->S_un.S_un_b.s_b1, >- as_in_addr->S_un.S_un_b.s_b2, as_in_addr->S_un.S_un_b.s_b3, >- as_in_addr->S_un.S_un_b.s_b4); >+ snprintf(dst, size, "%d.%d.%d.%d", as_in_addr->S_un.S_un_b.s_b1, >+ as_in_addr->S_un.S_un_b.s_b2, as_in_addr->S_un.S_un_b.s_b3, >+ as_in_addr->S_un.S_un_b.s_b4); > return dst; > } > >@@ -127,7 +127,7 @@ const char* inet_ntop_v6(const void* src, char* dst, socklen_t size) { > *cursor++ = ':'; > *cursor++ = ':'; > if (maxpos == 4) { >- cursor += rtc::sprintfn(cursor, INET6_ADDRSTRLEN - 2, "ffff:"); >+ cursor += snprintf(cursor, INET6_ADDRSTRLEN - 2, "ffff:"); > } > const struct in_addr* as_v4 = > reinterpret_cast<const struct in_addr*>(&(as_shorts[6])); >@@ -136,8 +136,8 @@ const char* inet_ntop_v6(const void* src, char* dst, socklen_t size) { > } else { > for (int i = 0; i < run_array_size; ++i) { > if (runpos[i] == -1) { >- cursor += rtc::sprintfn(cursor, INET6_ADDRSTRLEN - (cursor - dst), "%x", >- NetworkToHost16(as_shorts[i])); >+ cursor += snprintf(cursor, INET6_ADDRSTRLEN - (cursor - dst), "%x", >+ NetworkToHost16(as_shorts[i])); > if (i != 7 && runpos[i + 1] != 1) { > *cursor++ = ':'; > } >@@ -224,8 +224,8 @@ int inet_pton_v6(const char* src, void* dst) { > *(readcursor + 2) != 0) { > // Check for periods, which we'll take as a sign of v4 addresses. 
> const char* addrstart = readcursor + 2; >- if (rtc::strchr(addrstart, ".")) { >- const char* colon = rtc::strchr(addrstart, "::"); >+ if (strchr(addrstart, '.')) { >+ const char* colon = strchr(addrstart, ':'); > if (colon) { > uint16_t a_short; > int bytesread = 0; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/win32filesystem.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/win32filesystem.cc >deleted file mode 100644 >index b500e5e629b8a6fc507a622ffa7ed5d92c6cd939..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/win32filesystem.cc >+++ /dev/null >@@ -1,83 +0,0 @@ >-/* >- * Copyright 2004 The WebRTC Project Authors. All rights reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include "rtc_base/win32filesystem.h" >- >-#include <shellapi.h> >-#include <shlobj.h> >-#include <tchar.h> >-#include "rtc_base/win32.h" >- >-#include <memory> >- >-#include "rtc_base/arraysize.h" >-#include "rtc_base/checks.h" >-#include "rtc_base/fileutils.h" >-#include "rtc_base/pathutils.h" >-#include "rtc_base/stream.h" >-#include "rtc_base/stringutils.h" >- >-// In several places in this file, we test the integrity level of the process >-// before calling GetLongPathName. We do this because calling GetLongPathName >-// when running under protected mode IE (a low integrity process) can result in >-// a virtualized path being returned, which is wrong if you only plan to read. >-// TODO: Waiting to hear back from IE team on whether this is the >-// best approach; IEIsProtectedModeProcess is another possible solution. 
>- >-namespace rtc { >- >-bool Win32Filesystem::DeleteFile(const Pathname& filename) { >- RTC_LOG(LS_INFO) << "Deleting file " << filename.pathname(); >- if (!IsFile(filename)) { >- RTC_DCHECK(IsFile(filename)); >- return false; >- } >- return ::DeleteFile(ToUtf16(filename.pathname()).c_str()) != 0; >-} >- >-bool Win32Filesystem::MoveFile(const Pathname& old_path, >- const Pathname& new_path) { >- if (!IsFile(old_path)) { >- RTC_DCHECK(IsFile(old_path)); >- return false; >- } >- RTC_LOG(LS_INFO) << "Moving " << old_path.pathname() << " to " >- << new_path.pathname(); >- return ::MoveFile(ToUtf16(old_path.pathname()).c_str(), >- ToUtf16(new_path.pathname()).c_str()) != 0; >-} >- >-bool Win32Filesystem::IsFolder(const Pathname& path) { >- WIN32_FILE_ATTRIBUTE_DATA data = {0}; >- if (0 == ::GetFileAttributesEx(ToUtf16(path.pathname()).c_str(), >- GetFileExInfoStandard, &data)) >- return false; >- return (data.dwFileAttributes & FILE_ATTRIBUTE_DIRECTORY) == >- FILE_ATTRIBUTE_DIRECTORY; >-} >- >-bool Win32Filesystem::IsFile(const Pathname& path) { >- WIN32_FILE_ATTRIBUTE_DATA data = {0}; >- if (0 == ::GetFileAttributesEx(ToUtf16(path.pathname()).c_str(), >- GetFileExInfoStandard, &data)) >- return false; >- return (data.dwFileAttributes & FILE_ATTRIBUTE_DIRECTORY) == 0; >-} >- >-bool Win32Filesystem::GetFileSize(const Pathname& pathname, size_t* size) { >- WIN32_FILE_ATTRIBUTE_DATA data = {0}; >- if (::GetFileAttributesEx(ToUtf16(pathname.pathname()).c_str(), >- GetFileExInfoStandard, &data) == 0) >- return false; >- *size = data.nFileSizeLow; >- return true; >-} >- >-} // namespace rtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/win32filesystem.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/win32filesystem.h >deleted file mode 100644 >index d26741e67b87048e181fcf0113208af5fad873da..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/win32filesystem.h >+++ /dev/null >@@ -1,41 +0,0 @@ >-/* >- * 
Copyright 2004 The WebRTC Project Authors. All rights reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef RTC_BASE_WIN32FILESYSTEM_H_ >-#define RTC_BASE_WIN32FILESYSTEM_H_ >- >-#include "fileutils.h" >- >-namespace rtc { >- >-class Win32Filesystem : public FilesystemInterface { >- public: >- // This will attempt to delete the path located at filename. >- // If the path points to a folder, it will fail with VERIFY >- bool DeleteFile(const Pathname& filename) override; >- >- // This moves a file from old_path to new_path. If the new path is on a >- // different volume than the old, it will attempt to copy and then delete >- // the folder >- // Returns true if the file is successfully moved >- bool MoveFile(const Pathname& old_path, const Pathname& new_path) override; >- >- // Returns true if a pathname is a directory >- bool IsFolder(const Pathname& pathname) override; >- >- // Returns true if a file exists at path >- bool IsFile(const Pathname& path) override; >- >- bool GetFileSize(const Pathname& path, size_t* size) override; >-}; >- >-} // namespace rtc >- >-#endif // RTC_BASE_WIN32FILESYSTEM_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/win32socketserver.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/win32socketserver.cc >index cab751a63173abfdc979feca2a3806cd761ff723..230f3ed46f9057e8ed6788ee6289ff6eacae185b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/win32socketserver.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/win32socketserver.cc >@@ -16,6 +16,7 @@ > #include "rtc_base/byteorder.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" >+#include "rtc_base/timeutils.h" // 
For Time, TimeSince > #include "rtc_base/win32window.h" > > namespace rtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/zero_memory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/zero_memory.h >index cb4646c1fe4b359f71acba7cd0808f494e4634c9..f697bcbd6b81c762b1567bbce6c5615f25e32e8c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/zero_memory.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/zero_memory.h >@@ -11,6 +11,7 @@ > #ifndef RTC_BASE_ZERO_MEMORY_H_ > #define RTC_BASE_ZERO_MEMORY_H_ > >+#include <stddef.h> > #include <type_traits> > > #include "api/array_view.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/BUILD.gn >index 252aec4547f43a12cd1ae590908404c61d86f732..5214bc3bd0f24fbf942dcbd0b4a92af7e3081681 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/BUILD.gn >@@ -25,9 +25,6 @@ group("rtc_tools") { > ":psnr_ssim_analyzer", > ":rgba_to_i420_converter", > ] >- if (rtc_include_internal_audio_device) { >- deps += [ ":force_mic_volume_max" ] >- } > if (rtc_enable_protobuf) { > deps += [ ":chart_proto" ] > } >@@ -39,8 +36,10 @@ group("rtc_tools") { > ":tools_unittests", > ] > if (rtc_enable_protobuf) { >+ if (!build_with_chromium) { >+ deps += [ ":event_log_visualizer" ] >+ } > deps += [ >- ":event_log_visualizer", > ":rtp_analyzer", > ":unpack_aecdump", > "network_tester", >@@ -89,6 +88,12 @@ rtc_static_library("video_file_writer") { > > rtc_static_library("video_quality_analysis") { > sources = [ >+ "frame_analyzer/linear_least_squares.cc", >+ "frame_analyzer/linear_least_squares.h", >+ "frame_analyzer/video_color_aligner.cc", >+ "frame_analyzer/video_color_aligner.h", >+ "frame_analyzer/video_geometry_aligner.cc", >+ "frame_analyzer/video_geometry_aligner.h", > "frame_analyzer/video_quality_analysis.cc", > 
"frame_analyzer/video_quality_analysis.h", > "frame_analyzer/video_temporal_aligner.cc", >@@ -96,6 +101,8 @@ rtc_static_library("video_quality_analysis") { > ] > deps = [ > ":video_file_reader", >+ "../api:array_view", >+ "../api/video:video_frame", > "../api/video:video_frame_i420", > "../common_video", > "../rtc_base:checks", >@@ -211,20 +218,6 @@ if (!build_with_chromium) { > ] > } > >- # It doesn't make sense to build this tool without the ADM enabled. >- if (rtc_include_internal_audio_device) { >- rtc_executable("force_mic_volume_max") { >- sources = [ >- "force_mic_volume_max/force_mic_volume_max.cc", >- ] >- >- deps = [ >- "../modules/audio_device", >- "../modules/audio_device:audio_device_impl", >- ] >- } >- } >- > if (rtc_enable_protobuf) { > proto_library("chart_proto") { > sources = [ >@@ -272,10 +265,13 @@ if (!build_with_chromium) { > > # TODO(kwiberg): Remove this dependency. > "../api/audio_codecs:audio_codecs_api", >+ "../api/transport:goog_cc", > "../modules/congestion_controller", > "../modules/congestion_controller/goog_cc:delay_based_bwe", > "../modules/congestion_controller/goog_cc:estimators", >+ "../modules/congestion_controller/rtp:transport_feedback", > "../modules/pacing", >+ "../modules/remote_bitrate_estimator", > "../modules/rtp_rtcp", > "//third_party/abseil-cpp/absl/memory", > ] >@@ -284,7 +280,7 @@ if (!build_with_chromium) { > } > > if (rtc_include_tests) { >- if (rtc_enable_protobuf) { >+ if (rtc_enable_protobuf && !build_with_chromium) { > rtc_executable("event_log_visualizer") { > testonly = true > sources = [ >@@ -347,7 +343,10 @@ if (rtc_include_tests) { > testonly = true > > sources = [ >+ "frame_analyzer/linear_least_squares_unittest.cc", > "frame_analyzer/reference_less_video_analysis_unittest.cc", >+ "frame_analyzer/video_color_aligner_unittest.cc", >+ "frame_analyzer/video_geometry_aligner_unittest.cc", > "frame_analyzer/video_quality_analysis_unittest.cc", > "frame_analyzer/video_temporal_aligner_unittest.cc", > 
"frame_editing/frame_editing_unittest.cc", >@@ -364,8 +363,6 @@ if (rtc_include_tests) { > > deps = [ > ":command_line_parser", >- ":frame_editing_lib", >- ":reference_less_video_analysis_lib", > ":video_file_reader", > ":video_file_writer", > ":video_quality_analysis", >@@ -374,10 +371,19 @@ if (rtc_include_tests) { > "../rtc_base:checks", > "../test:fileutils", > "../test:test_main", >+ "../test:test_support", > "//testing/gtest", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/libyuv", > ] > >+ if (!build_with_chromium) { >+ deps += [ >+ ":frame_editing_lib", >+ ":reference_less_video_analysis_lib", >+ ] >+ } >+ > if (rtc_enable_protobuf) { > deps += [ "network_tester:network_tester_unittests" ] > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/DEPS b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/DEPS >index d9b162cd15fb2c704597baad1aa20f4fd3b892ce..4015ae265ac8e49208774e7b96bb1fabab84d593 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/DEPS >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/DEPS >@@ -7,6 +7,7 @@ include_rules = [ > "+modules/audio_coding/neteq/tools", > "+modules/audio_processing", > "+modules/bitrate_controller", >+ "+modules/remote_bitrate_estimator", > "+modules/congestion_controller", > "+modules/pacing", > "+modules/rtp_rtcp", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/OWNERS b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/OWNERS >index c1b87c9bdca37e0881d03bc681b30e7be78670b8..77385fcd316177a7c1c56584e1b1492305adae78 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/OWNERS >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/OWNERS >@@ -3,3 +3,6 @@ per-file BUILD.gn=* > phoglund@webrtc.org > oprypin@webrtc.org > mbonadei@webrtc.org >+ >+# For video analysis tools >+magjed@webrtc.org >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/agc/activity_metric.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/agc/activity_metric.cc >index b4ed3facb359b5a81ba83b3d4e52fca7534cd3fa..9b2276f16f1f5e27536f958d9d9e972f8599ee4a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/agc/activity_metric.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/agc/activity_metric.cc >@@ -30,32 +30,34 @@ > static const int kAgcAnalWindowSamples = 100; > static const float kDefaultActivityThreshold = 0.3f; > >-DEFINE_bool(standalone_vad, true, "enable stand-alone VAD"); >-DEFINE_string(true_vad, >- "", >- "name of a file containing true VAD in 'int'" >- " format"); >-DEFINE_string(video_vad, >- "", >- "name of a file containing video VAD (activity" >- " probabilities) in double format. One activity per 10ms is" >- " required. If no file is given the video information is not" >- " incorporated. Negative activity is interpreted as video is" >- " not adapted and the statistics are not computed during" >- " the learning phase. Note that the negative video activities" >- " are ONLY allowed at the beginning."); >-DEFINE_string(result, >- "", >- "name of a file to write the results. The results" >- " will be appended to the end of the file. This is optional."); >-DEFINE_string(audio_content, >- "", >- "name of a file where audio content is written" >- " to, in double format."); >-DEFINE_float(activity_threshold, >- kDefaultActivityThreshold, >- "Activity threshold"); >-DEFINE_bool(help, false, "prints this message"); >+WEBRTC_DEFINE_bool(standalone_vad, true, "enable stand-alone VAD"); >+WEBRTC_DEFINE_string(true_vad, >+ "", >+ "name of a file containing true VAD in 'int'" >+ " format"); >+WEBRTC_DEFINE_string( >+ video_vad, >+ "", >+ "name of a file containing video VAD (activity" >+ " probabilities) in double format. One activity per 10ms is" >+ " required. If no file is given the video information is not" >+ " incorporated. 
Negative activity is interpreted as video is" >+ " not adapted and the statistics are not computed during" >+ " the learning phase. Note that the negative video activities" >+ " are ONLY allowed at the beginning."); >+WEBRTC_DEFINE_string( >+ result, >+ "", >+ "name of a file to write the results. The results" >+ " will be appended to the end of the file. This is optional."); >+WEBRTC_DEFINE_string(audio_content, >+ "", >+ "name of a file where audio content is written" >+ " to, in double format."); >+WEBRTC_DEFINE_float(activity_threshold, >+ kDefaultActivityThreshold, >+ "Activity threshold"); >+WEBRTC_DEFINE_bool(help, false, "prints this message"); > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/barcode_tools/DEPS b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/barcode_tools/DEPS >deleted file mode 100644 >index d0325a65aa1e344990120cf22f2d34a23808e205..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/barcode_tools/DEPS >+++ /dev/null >@@ -1,13 +0,0 @@ >-# This is trimmed down version of the main tools DEPS file which is to be used >-# in Chromiums PyAuto WebRTC video quality measurement test. We will only >-# need the Zxing dependencies as we only use the barcode tools in this test. 
>- >-deps = { >- # Used by barcode_tools >- "barcode_tools/third_party/zxing/core": >- "http://zxing.googlecode.com/svn/trunk/core@2349", >- >- # Used by barcode_tools >- "barcode_tools/third_party/zxing/javase": >- "http://zxing.googlecode.com/svn/trunk/javase@2349", >-} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/barcode_tools/README b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/barcode_tools/README >deleted file mode 100644 >index a23e798064def22c99d668383fbd8ddccc435768..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/barcode_tools/README >+++ /dev/null >@@ -1,34 +0,0 @@ >-This file explains how to get the dependencies needed for the barcode tools. >- >-barcode_encoder.py >-================== >-This script depends on: >-* Zxing (Java version) >-* Ant (must be installed manually) >-* Java >- >-To automatically download Zxing for the encoder script, checkout this directory >-as a separate gclient solution, like this: >-gclient config http://webrtc.googlecode.com/svn/trunk/webrtc/rtc_tools/barcode_tools >-gclient sync >-Then the Zxing Java source code will be put in third_party/zxing. >- >-In order to run barcode_encoder.py you then need to build: >-* zxing/core/core.jar >-* zxing/javase/javase.jar >-These are compiled using Ant by running build_zxing.py: >-python build_zxing.py >- >-For more info about Zxing, see https://code.google.com/p/zxing/ >- >- >-barcode_decoder.py >-================== >-This script depends on: >-* Zxing (C++ version). You need to checkout from Subversion and build the libs >- and zxing SCons targets. SVN URL: http://zxing.googlecode.com/svn/trunk/cpp >-* FFMPEG fmpeg 0.11.1 >- >-These dependencies must be precompiled separately before running the script. >-Make sure to add FFMPEG to the PATH environment variable and provide the path >-to the zxing executable using the mandatory command line flag to the script. 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/barcode_tools/barcode_decoder.py b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/barcode_tools/barcode_decoder.py >deleted file mode 100644 >index 078e47f1bac724307939e38ef846878e0eb14bce..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/barcode_tools/barcode_decoder.py >+++ /dev/null >@@ -1,295 +0,0 @@ >-#!/usr/bin/env python >-# Copyright (c) 2012 The WebRTC project authors. All Rights Reserved. >-# >-# Use of this source code is governed by a BSD-style license >-# that can be found in the LICENSE file in the root of the source >-# tree. An additional intellectual property rights grant can be found >-# in the file PATENTS. All contributing project authors may >-# be found in the AUTHORS file in the root of the source tree. >- >-import optparse >-import os >-import sys >- >-if __name__ == '__main__': >- # Make sure we always can import helper_functions. >- sys.path.append(os.path.dirname(__file__)) >- >-import helper_functions >- >-# Chrome browsertests will throw away stderr; avoid that output gets lost. >-sys.stderr = sys.stdout >- >- >-def ConvertYuvToPngFiles(yuv_file_name, yuv_frame_width, yuv_frame_height, >- output_directory, ffmpeg_path): >- """Converts a YUV video file into PNG frames. >- >- The function uses ffmpeg to convert the YUV file. The output of ffmpeg is in >- the form frame_xxxx.png, where xxxx is the frame number, starting from 0001. >- >- Args: >- yuv_file_name(string): The name of the YUV file. >- yuv_frame_width(int): The width of one YUV frame. >- yuv_frame_height(int): The height of one YUV frame. >- output_directory(string): The output directory where the PNG frames will be >- stored. >- ffmpeg_path(string): The path to the ffmpeg executable. If None, the PATH >- will be searched for it. >- >- Return: >- (bool): True if the conversion was OK. 
>- """ >- size_string = str(yuv_frame_width) + 'x' + str(yuv_frame_height) >- output_files_pattern = os.path.join(output_directory, 'frame_%04d.png') >- if not ffmpeg_path: >- ffmpeg_path = 'ffmpeg.exe' if sys.platform == 'win32' else 'ffmpeg' >- if yuv_file_name.endswith('.yuv'): >- command = [ffmpeg_path, '-s', '%s' % size_string, '-i', '%s' >- % yuv_file_name, '-f', 'image2', '-vcodec', 'png', >- '%s' % output_files_pattern] >- else: >- command = [ffmpeg_path, '-i', '%s' % yuv_file_name, '-f', 'image2', >- '-vcodec', 'png', '%s' % output_files_pattern] >- try: >- print 'Converting YUV file to PNG images (may take a while)...' >- print ' '.join(command) >- helper_functions.RunShellCommand( >- command, fail_msg='Error during YUV to PNG conversion') >- except helper_functions.HelperError, err: >- print 'Error executing command: %s. Error: %s' % (command, err) >- return False >- except OSError: >- print 'Did not find %s. Have you installed it?' % ffmpeg_path >- return False >- return True >- >- >-def DecodeFrames(input_directory, zxing_path): >- """Decodes the barcodes overlaid in each frame. >- >- The function uses the Zxing command-line tool from the Zxing C++ distribution >- to decode the barcode in every PNG frame from the input directory. The frames >- should be named frame_xxxx.png, where xxxx is the frame number. The frame >- numbers should be consecutive and should start from 0001. >- The decoding results in a frame_xxxx.txt file for every successfully decoded >- barcode. This file contains the decoded barcode as 12-digit string (UPC-A >- format: 11 digits content + one check digit). >- >- Args: >- input_directory(string): The input directory from where the PNG frames are >- read. >- zxing_path(string): The path to the zxing binary. If specified as None, >- the PATH will be searched for it. >- Return: >- (bool): True if the decoding succeeded. 
>- """ >- if not zxing_path: >- zxing_path = 'zxing.exe' if sys.platform == 'win32' else 'zxing' >- print 'Decoding barcodes from PNG files with %s...' % zxing_path >- return helper_functions.PerformActionOnAllFiles( >- directory=input_directory, file_pattern='frame_', >- file_extension='png', start_number=1, action=_DecodeBarcodeInFile, >- command_line_decoder=zxing_path) >- >- >-def _DecodeBarcodeInFile(file_name, command_line_decoder): >- """Decodes the barcode in the upper left corner of a PNG file. >- >- Args: >- file_name(string): File name of the PNG file. >- command_line_decoder(string): The ZXing command-line decoding tool. >- >- Return: >- (bool): True upon success, False otherwise. >- """ >- command = [command_line_decoder, '--try-harder', '--dump-raw', file_name] >- try: >- out = helper_functions.RunShellCommand( >- command, fail_msg='Error during decoding of %s' % file_name) >- text_file = open('%s.txt' % file_name[:-4], 'w') >- text_file.write(out) >- text_file.close() >- except helper_functions.HelperError, err: >- print 'Barcode in %s cannot be decoded.' % file_name >- print err >- return False >- except OSError: >- print 'Did not find %s. Have you installed it?' % command_line_decoder >- return False >- return True >- >- >-def _GenerateStatsFile(stats_file_name, input_directory='.'): >- """Generate statistics file. >- >- The function generates a statistics file. The contents of the file are in the >- format <frame_name> <barcode>, where frame name is the name of every frame >- (effectively the frame number) and barcode is the decoded barcode. The frames >- and the helper .txt files are removed after they have been used. 
>- """ >- file_prefix = os.path.join(input_directory, 'frame_') >- stats_file = open(stats_file_name, 'w') >- >- print 'Generating stats file: %s' % stats_file_name >- for i in range(1, _CountFramesIn(input_directory=input_directory) + 1): >- frame_number = helper_functions.ZeroPad(i) >- barcode_file_name = file_prefix + frame_number + '.txt' >- png_frame = file_prefix + frame_number + '.png' >- entry_frame_number = helper_functions.ZeroPad(i-1) >- entry = 'frame_' + entry_frame_number + ' ' >- >- if os.path.isfile(barcode_file_name): >- barcode = _ReadBarcodeFromTextFile(barcode_file_name) >- os.remove(barcode_file_name) >- >- if _CheckBarcode(barcode): >- entry += (helper_functions.ZeroPad(int(barcode[0:11])) + '\n') >- else: >- entry += 'Barcode error\n' # Barcode is wrongly detected. >- else: # Barcode file doesn't exist. >- entry += 'Barcode error\n' >- >- stats_file.write(entry) >- os.remove(png_frame) >- >- stats_file.close() >- >- >-def _ReadBarcodeFromTextFile(barcode_file_name): >- """Reads the decoded barcode for a .txt file. >- >- Args: >- barcode_file_name(string): The name of the .txt file. >- Return: >- (string): The decoded barcode. >- """ >- barcode_file = open(barcode_file_name, 'r') >- barcode = barcode_file.read() >- barcode_file.close() >- return barcode >- >- >-def _CheckBarcode(barcode): >- """Check weather the UPC-A barcode was decoded correctly. >- >- This function calculates the check digit of the provided barcode and compares >- it to the check digit that was decoded. >- >- Args: >- barcode(string): The barcode (12-digit). >- Return: >- (bool): True if the barcode was decoded correctly. 
>- """ >- if len(barcode) != 12: >- return False >- >- r1 = range(0, 11, 2) # Odd digits >- r2 = range(1, 10, 2) # Even digits except last >- dsum = 0 >- # Sum all the even digits >- for i in r1: >- dsum += int(barcode[i]) >- # Multiply the sum by 3 >- dsum *= 3 >- # Add all the even digits except the check digit (12th digit) >- for i in r2: >- dsum += int(barcode[i]) >- # Get the modulo 10 >- dsum = dsum % 10 >- # If not 0 substract from 10 >- if dsum != 0: >- dsum = 10 - dsum >- # Compare result and check digit >- return dsum == int(barcode[11]) >- >- >-def _CountFramesIn(input_directory='.'): >- """Calculates the number of frames in the input directory. >- >- The function calculates the number of frames in the input directory. The >- frames should be named frame_xxxx.png, where xxxx is the number of the frame. >- The numbers should start from 1 and should be consecutive. >- >- Args: >- input_directory(string): The input directory. >- Return: >- (int): The number of frames. >- """ >- file_prefix = os.path.join(input_directory, 'frame_') >- file_exists = True >- num = 1 >- >- while file_exists: >- file_name = (file_prefix + helper_functions.ZeroPad(num) + '.png') >- if os.path.isfile(file_name): >- num += 1 >- else: >- file_exists = False >- return num - 1 >- >- >-def _ParseArgs(): >- """Registers the command-line options.""" >- usage = "usage: %prog [options]" >- parser = optparse.OptionParser(usage=usage) >- >- parser.add_option('--zxing_path', type='string', >- help=('The path to where the zxing executable is located. ' >- 'If omitted, it will be assumed to be present in the ' >- 'PATH with the name zxing[.exe].')) >- parser.add_option('--ffmpeg_path', type='string', >- help=('The path to where the ffmpeg executable is located. ' >- 'If omitted, it will be assumed to be present in the ' >- 'PATH with the name ffmpeg[.exe].')) >- parser.add_option('--yuv_frame_width', type='int', default=640, >- help='Width of the YUV file\'s frames. 
Default: %default') >- parser.add_option('--yuv_frame_height', type='int', default=480, >- help='Height of the YUV file\'s frames. Default: %default') >- parser.add_option('--yuv_file', type='string', default='output.yuv', >- help='The YUV file to be decoded. Default: %default') >- parser.add_option('--stats_file', type='string', default='stats.txt', >- help='The output stats file. Default: %default') >- parser.add_option('--png_working_dir', type='string', default='.', >- help=('The directory for temporary PNG images to be stored ' >- 'in when decoding from YUV before they\'re barcode ' >- 'decoded. If using Windows and a Cygwin-compiled ' >- 'zxing.exe, you should keep the default value to ' >- 'avoid problems. Default: %default')) >- options, _ = parser.parse_args() >- return options >- >- >-def main(): >- """The main function. >- >- A simple invocation is: >- ./webrtc/rtc_tools/barcode_tools/barcode_decoder.py >- --yuv_file=<path_and_name_of_overlaid_yuv_video> >- --yuv_frame_width=640 --yuv_frame_height=480 >- --stats_file=<path_and_name_to_stats_file> >- """ >- options = _ParseArgs() >- >- # Convert the overlaid YUV video into a set of PNG frames. >- if not ConvertYuvToPngFiles(options.yuv_file, options.yuv_frame_width, >- options.yuv_frame_height, >- output_directory=options.png_working_dir, >- ffmpeg_path=options.ffmpeg_path): >- print 'An error occurred converting from YUV to PNG frames.' >- return -1 >- >- # Decode the barcodes from the PNG frames. >- if not DecodeFrames(input_directory=options.png_working_dir, >- zxing_path=options.zxing_path): >- print 'An error occurred decoding barcodes from PNG frames.' >- return -2 >- >- # Generate statistics file. >- _GenerateStatsFile(options.stats_file, >- input_directory=options.png_working_dir) >- print 'Completed barcode decoding.' 
>- return 0 >- >-if __name__ == '__main__': >- sys.exit(main()) >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/barcode_tools/barcode_encoder.py b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/barcode_tools/barcode_encoder.py >deleted file mode 100644 >index 9ab8b5075423992f92b8c7f87393ce02ce9e6b90..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/barcode_tools/barcode_encoder.py >+++ /dev/null >@@ -1,372 +0,0 @@ >-#!/usr/bin/env python >-# Copyright (c) 2012 The WebRTC project authors. All Rights Reserved. >-# >-# Use of this source code is governed by a BSD-style license >-# that can be found in the LICENSE file in the root of the source >-# tree. An additional intellectual property rights grant can be found >-# in the file PATENTS. All contributing project authors may >-# be found in the AUTHORS file in the root of the source tree. >- >-import optparse >-import os >-import sys >- >-import helper_functions >- >-_DEFAULT_BARCODE_WIDTH = 352 >-_DEFAULT_BARCODES_FILE = 'barcodes.yuv' >- >- >-def GenerateUpcaBarcodes(number_of_barcodes, barcode_width, barcode_height, >- output_directory='.', >- path_to_zxing='zxing-read-only'): >- """Generates UPC-A barcodes. >- >- This function generates a number_of_barcodes UPC-A barcodes. The function >- calls an example Java encoder from the Zxing library. The barcodes are >- generated as PNG images. The width of the barcodes shouldn't be less than 102 >- pixels because otherwise Zxing can't properly generate the barcodes. >- >- Args: >- number_of_barcodes(int): The number of barcodes to generate. >- barcode_width(int): Width of barcode in pixels. >- barcode_height(int): Height of barcode in pixels. >- output_directory(string): Output directory where to store generated >- barcodes. >- path_to_zxing(string): The path to Zxing. >- >- Return: >- (bool): True if the conversion is successful. 
>- """ >- base_file_name = os.path.join(output_directory, "barcode_") >- jars = _FormJarsString(path_to_zxing) >- command_line_encoder = 'com.google.zxing.client.j2se.CommandLineEncoder' >- barcode_width = str(barcode_width) >- barcode_height = str(barcode_height) >- >- errors = False >- for i in range(number_of_barcodes): >- suffix = helper_functions.ZeroPad(i) >- # Barcodes starting from 0 >- content = helper_functions.ZeroPad(i, 11) >- output_file_name = base_file_name + suffix + ".png" >- >- command = ["java", "-cp", jars, command_line_encoder, >- "--barcode_format=UPC_A", "--height=%s" % barcode_height, >- "--width=%s" % barcode_width, >- "--output=%s" % (output_file_name), "%s" % (content)] >- try: >- helper_functions.RunShellCommand( >- command, fail_msg=('Error during barcode %s generation' % content)) >- except helper_functions.HelperError as err: >- print >> sys.stderr, err >- errors = True >- return not errors >- >- >-def ConvertPngToYuvBarcodes(input_directory='.', output_directory='.'): >- """Converts PNG barcodes to YUV barcode images. >- >- This function reads all the PNG files from the input directory which are in >- the format frame_xxxx.png, where xxxx is the number of the frame, starting >- from 0000. The frames should be consecutive numbers. The output YUV file is >- named frame_xxxx.yuv. The function uses ffmpeg to do the conversion. >- >- Args: >- input_directory(string): The input direcotry to read the PNG barcodes from. >- output_directory(string): The putput directory to write the YUV files to. >- Return: >- (bool): True if the conversion was without errors. >- """ >- return helper_functions.PerformActionOnAllFiles( >- input_directory, 'barcode_', 'png', 0, _ConvertToYuvAndDelete, >- output_directory=output_directory, pattern='barcode_') >- >- >-def _ConvertToYuvAndDelete(output_directory, file_name, pattern): >- """Converts a PNG file to a YUV file and deletes the PNG file. 
>- >- Args: >- output_directory(string): The output directory for the YUV file. >- file_name(string): The PNG file name. >- pattern(string): The file pattern of the PNG/YUV file. The PNG/YUV files are >- named patternxx..x.png/yuv, where xx..x are digits starting from 00..0. >- Return: >- (bool): True upon successful conversion, false otherwise. >- """ >- # Pattern should be in file name >- if not pattern in file_name: >- return False >- pattern_position = file_name.rfind(pattern) >- >- # Strip the path to the PNG file and replace the png extension with yuv >- yuv_file_name = file_name[pattern_position:-3] + 'yuv' >- yuv_file_name = os.path.join(output_directory, yuv_file_name) >- >- command = ['ffmpeg', '-i', '%s' % (file_name), '-pix_fmt', 'yuv420p', >- '%s' % (yuv_file_name)] >- try: >- helper_functions.RunShellCommand( >- command, fail_msg=('Error during PNG to YUV conversion of %s' % >- file_name)) >- os.remove(file_name) >- except helper_functions.HelperError as err: >- print >> sys.stderr, err >- return False >- return True >- >- >-def CombineYuvFramesIntoOneFile(output_file_name, input_directory='.'): >- """Combines several YUV frames into one YUV video file. >- >- The function combines the YUV frames from input_directory into one YUV video >- file. The frames should be named in the format frame_xxxx.yuv where xxxx >- stands for the frame number. The numbers have to be consecutive and start from >- 0000. The YUV frames are removed after they have been added to the video. >- >- Args: >- output_file_name(string): The name of the file to produce. >- input_directory(string): The directory from which the YUV frames are read. >- Return: >- (bool): True if the frame stitching went OK. 
>- """ >- output_file = open(output_file_name, "wb") >- success = helper_functions.PerformActionOnAllFiles( >- input_directory, 'barcode_', 'yuv', 0, _AddToFileAndDelete, >- output_file=output_file) >- output_file.close() >- return success >- >-def _AddToFileAndDelete(output_file, file_name): >- """Adds the contents of a file to a previously opened file. >- >- Args: >- output_file(file): The ouput file, previously opened. >- file_name(string): The file name of the file to add to the output file. >- >- Return: >- (bool): True if successful, False otherwise. >- """ >- input_file = open(file_name, "rb") >- input_file_contents = input_file.read() >- output_file.write(input_file_contents) >- input_file.close() >- try: >- os.remove(file_name) >- except OSError as e: >- print >> sys.stderr, 'Error deleting file %s.\nError: %s' % (file_name, e) >- return False >- return True >- >- >-def _OverlayBarcodeAndBaseFrames(barcodes_file, base_file, output_file, >- barcodes_component_sizes, >- base_component_sizes): >- """Overlays the next YUV frame from a file with a barcode. >- >- Args: >- barcodes_file(FileObject): The YUV file containing the barcodes (opened). >- base_file(FileObject): The base YUV file (opened). >- output_file(FileObject): The output overlaid file (opened). >- barcodes_component_sizes(list of tuples): The width and height of each Y, U >- and V plane of the barcodes YUV file. >- base_component_sizes(list of tuples): The width and height of each Y, U and >- V plane of the base YUV file. >- Return: >- (bool): True if there are more planes (i.e. frames) in the base file, false >- otherwise. 
>- """ >- # We will loop three times - once for the Y, U and V planes >- for ((barcode_comp_width, barcode_comp_height), >- (base_comp_width, base_comp_height)) in zip(barcodes_component_sizes, >- base_component_sizes): >- for base_row in range(base_comp_height): >- barcode_plane_traversed = False >- if (base_row < barcode_comp_height) and not barcode_plane_traversed: >- barcode_plane = barcodes_file.read(barcode_comp_width) >- if barcode_plane == "": >- barcode_plane_traversed = True >- else: >- barcode_plane_traversed = True >- base_plane = base_file.read(base_comp_width) >- >- if base_plane == "": >- return False >- >- if not barcode_plane_traversed: >- # Substitute part of the base component with the top component >- output_file.write(barcode_plane) >- base_plane = base_plane[barcode_comp_width:] >- output_file.write(base_plane) >- return True >- >- >-def OverlayYuvFiles(barcode_width, barcode_height, base_width, base_height, >- barcodes_file_name, base_file_name, output_file_name): >- """Overlays two YUV files starting from the upper left corner of both. >- >- Args: >- barcode_width(int): The width of the barcode (to be overlaid). >- barcode_height(int): The height of the barcode (to be overlaid). >- base_width(int): The width of a frame of the base file. >- base_height(int): The height of a frame of the base file. >- barcodes_file_name(string): The name of the YUV file containing the YUV >- barcodes. >- base_file_name(string): The name of the base YUV file. >- output_file_name(string): The name of the output file where the overlaid >- video will be written. 
>- """ >- # Component sizes = [Y_sizes, U_sizes, V_sizes] >- barcodes_component_sizes = [(barcode_width, barcode_height), >- (barcode_width/2, barcode_height/2), >- (barcode_width/2, barcode_height/2)] >- base_component_sizes = [(base_width, base_height), >- (base_width/2, base_height/2), >- (base_width/2, base_height/2)] >- >- barcodes_file = open(barcodes_file_name, 'rb') >- base_file = open(base_file_name, 'rb') >- output_file = open(output_file_name, 'wb') >- >- data_left = True >- while data_left: >- data_left = _OverlayBarcodeAndBaseFrames(barcodes_file, base_file, >- output_file, >- barcodes_component_sizes, >- base_component_sizes) >- >- barcodes_file.close() >- base_file.close() >- output_file.close() >- >- >-def CalculateFramesNumberFromYuv(yuv_width, yuv_height, file_name): >- """Calculates the number of frames of a YUV video. >- >- Args: >- yuv_width(int): Width of a frame of the yuv file. >- yuv_height(int): Height of a frame of the YUV file. >- file_name(string): The name of the YUV file. >- Return: >- (int): The number of frames in the YUV file. >- """ >- file_size = os.path.getsize(file_name) >- >- y_plane_size = yuv_width * yuv_height >- u_plane_size = (yuv_width/2) * (yuv_height/2) # Equals to V plane size too >- frame_size = y_plane_size + (2 * u_plane_size) >- return int(file_size/frame_size) # Should be int anyway >- >- >-def _FormJarsString(path_to_zxing): >- """Forms the the Zxing core and javase jars argument. >- >- Args: >- path_to_zxing(string): The path to the Zxing checkout folder. >- Return: >- (string): The newly formed jars argument. 
>- """ >- javase_jar = os.path.join(path_to_zxing, "javase", "javase.jar") >- core_jar = os.path.join(path_to_zxing, "core", "core.jar") >- delimiter = ':' >- if os.name != 'posix': >- delimiter = ';' >- return javase_jar + delimiter + core_jar >- >-def _ParseArgs(): >- """Registers the command-line options.""" >- usage = "usage: %prog [options]" >- parser = optparse.OptionParser(usage=usage) >- >- parser.add_option('--barcode_width', type='int', >- default=_DEFAULT_BARCODE_WIDTH, >- help=('Width of the barcodes to be overlaid on top of the' >- ' base file. Default: %default')) >- parser.add_option('--barcode_height', type='int', default=32, >- help=('Height of the barcodes to be overlaid on top of the' >- ' base file. Default: %default')) >- parser.add_option('--base_frame_width', type='int', default=352, >- help=('Width of the base YUV file\'s frames. ' >- 'Default: %default')) >- parser.add_option('--base_frame_height', type='int', default=288, >- help=('Height of the top YUV file\'s frames. ' >- 'Default: %default')) >- parser.add_option('--barcodes_yuv', type='string', >- default=_DEFAULT_BARCODES_FILE, >- help=('The YUV file with the barcodes in YUV. ' >- 'Default: %default')) >- parser.add_option('--base_yuv', type='string', default='base.yuv', >- help=('The base YUV file to be overlaid. ' >- 'Default: %default')) >- parser.add_option('--output_yuv', type='string', default='output.yuv', >- help=('The output YUV file containing the base overlaid' >- ' with the barcodes. Default: %default')) >- parser.add_option('--png_barcodes_output_dir', type='string', default='.', >- help=('Output directory where the PNG barcodes will be ' >- 'generated. Default: %default')) >- parser.add_option('--png_barcodes_input_dir', type='string', default='.', >- help=('Input directory from where the PNG barcodes will be ' >- 'read. 
Default: %default')) >- parser.add_option('--yuv_barcodes_output_dir', type='string', default='.', >- help=('Output directory where the YUV barcodes will be ' >- 'generated. Default: %default')) >- parser.add_option('--yuv_frames_input_dir', type='string', default='.', >- help=('Input directory from where the YUV will be ' >- 'read before combination. Default: %default')) >- parser.add_option('--zxing_dir', type='string', default='zxing', >- help=('Path to the Zxing barcodes library. ' >- 'Default: %default')) >- options = parser.parse_args()[0] >- return options >- >- >-def main(): >- """The main function. >- >- A simple invocation will be: >- ./webrtc/rtc_tools/barcode_tools/barcode_encoder.py --barcode_height=32 >- --base_frame_width=352 --base_frame_height=288 >- --base_yuv=<path_and_name_of_base_file> >- --output_yuv=<path and name_of_output_file> >- """ >- options = _ParseArgs() >- # The barcodes with will be different than the base frame width only if >- # explicitly specified at the command line. >- if options.barcode_width == _DEFAULT_BARCODE_WIDTH: >- options.barcode_width = options.base_frame_width >- # If the user provides a value for the barcodes YUV video file, we will keep >- # it. Otherwise we create a temp file which is removed after it has been used. >- keep_barcodes_yuv_file = False >- if options.barcodes_yuv != _DEFAULT_BARCODES_FILE: >- keep_barcodes_yuv_file = True >- >- # Calculate the number of barcodes - it is equal to the number of frames in >- # the base file. >- number_of_barcodes = CalculateFramesNumberFromYuv( >- options.base_frame_width, options.base_frame_height, options.base_yuv) >- >- script_dir = os.path.dirname(os.path.abspath(sys.argv[0])) >- zxing_dir = os.path.join(script_dir, 'third_party', 'zxing') >- # Generate barcodes - will generate them in PNG. 
>- GenerateUpcaBarcodes(number_of_barcodes, options.barcode_width, >- options.barcode_height, >- output_directory=options.png_barcodes_output_dir, >- path_to_zxing=zxing_dir) >- # Convert the PNG barcodes to to YUV format. >- ConvertPngToYuvBarcodes(options.png_barcodes_input_dir, >- options.yuv_barcodes_output_dir) >- # Combine the YUV barcodes into one YUV file. >- CombineYuvFramesIntoOneFile(options.barcodes_yuv, >- input_directory=options.yuv_frames_input_dir) >- # Overlay the barcodes over the base file. >- OverlayYuvFiles(options.barcode_width, options.barcode_height, >- options.base_frame_width, options.base_frame_height, >- options.barcodes_yuv, options.base_yuv, options.output_yuv) >- >- if not keep_barcodes_yuv_file: >- # Remove the temporary barcodes YUV file >- os.remove(options.barcodes_yuv) >- >- >-if __name__ == '__main__': >- sys.exit(main()) >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/barcode_tools/build_zxing.py b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/barcode_tools/build_zxing.py >deleted file mode 100644 >index 92696a78c2c36fd123a9fc12531dc34c2ed0e9d6..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/barcode_tools/build_zxing.py >+++ /dev/null >@@ -1,44 +0,0 @@ >-#!/usr/bin/env python >-# Copyright (c) 2012 The WebRTC project authors. All Rights Reserved. >-# >-# Use of this source code is governed by a BSD-style license >-# that can be found in the LICENSE file in the root of the source >-# tree. An additional intellectual property rights grant can be found >-# in the file PATENTS. All contributing project authors may >-# be found in the AUTHORS file in the root of the source tree. 
>- >-import os >-import subprocess >-import sys >- >- >-def RunAntBuildCommand(path_to_ant_build_file): >- """Tries to build the passed build file with ant.""" >- ant_executable = 'ant' >- if sys.platform == 'win32': >- if os.getenv('ANT_HOME'): >- ant_executable = os.path.join(os.getenv('ANT_HOME'), 'bin', 'ant.bat') >- else: >- ant_executable = 'ant.bat' >- cmd = [ant_executable, '-buildfile', path_to_ant_build_file] >- try: >- process = subprocess.Popen(cmd, stdout=sys.stdout, stderr=sys.stderr) >- process.wait() >- if process.returncode != 0: >- print >> sys.stderr, 'Failed to execute: %s' % ' '.join(cmd) >- return process.returncode >- except subprocess.CalledProcessError as e: >- print >> sys.stderr, 'Failed to execute: %s.\nCause: %s' % (' '.join(cmd), >- e) >- return -1 >- >-def main(): >- core_build = os.path.join('third_party', 'zxing', 'core', 'build.xml') >- RunAntBuildCommand(core_build) >- >- javase_build = os.path.join('third_party', 'zxing', 'javase', 'build.xml') >- return RunAntBuildCommand(javase_build) >- >- >-if __name__ == '__main__': >- sys.exit(main()) >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/barcode_tools/helper_functions.py b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/barcode_tools/helper_functions.py >deleted file mode 100644 >index e27322f41f92670b711ae5f8a8a7b6071e1407c3..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/barcode_tools/helper_functions.py >+++ /dev/null >@@ -1,105 +0,0 @@ >-#!/usr/bin/env python >-# Copyright (c) 2012 The WebRTC project authors. All Rights Reserved. >-# >-# Use of this source code is governed by a BSD-style license >-# that can be found in the LICENSE file in the root of the source >-# tree. An additional intellectual property rights grant can be found >-# in the file PATENTS. All contributing project authors may >-# be found in the AUTHORS file in the root of the source tree. 
>- >-import multiprocessing >-import os >-import subprocess >-import sys >- >-_DEFAULT_PADDING = 4 >- >- >-class HelperError(Exception): >- """Exception raised for errors in the helper.""" >- pass >- >- >-def ZeroPad(number, padding=_DEFAULT_PADDING): >- """Converts an int into a zero padded string. >- >- Args: >- number(int): The number to convert. >- padding(int): The number of chars in the output. Note that if you pass for >- example number=23456 and padding=4, the output will still be '23456', >- i.e. it will not be cropped. If you pass number=2 and padding=4, the >- return value will be '0002'. >- Return: >- (string): The zero padded number converted to string. >- """ >- return str(number).zfill(padding) >- >- >-def RunShellCommand(cmd_list, fail_msg=None): >- """Executes a command. >- >- Args: >- cmd_list(list): Command list to execute. >- fail_msg(string): Message describing the error in case the command fails. >- >- Return: >- (string): The standard output from running the command. >- >- Raise: >- HelperError: If command fails. >- """ >- process = subprocess.Popen(cmd_list, stdout=subprocess.PIPE, >- stderr=subprocess.PIPE) >- output, error = process.communicate() >- if process.returncode != 0: >- if fail_msg: >- print >> sys.stderr, fail_msg >- raise HelperError('Failed to run %s: command returned %d and printed ' >- '%s and %s' % (' '.join(cmd_list), process.returncode, >- output, error)) >- return output.strip() >- >- >-def PerformActionOnAllFiles(directory, file_pattern, file_extension, >- start_number, action, **kwargs): >- """Function that performs a given action on all files matching a pattern. >- >- It is assumed that the files are named file_patternxxxx.file_extension, where >- xxxx are digits starting from start_number. >- >- Args: >- directory(string): The directory where the files live. >- file_pattern(string): The name pattern of the files. >- file_extension(string): The files' extension. 
>- start_number(int): From where to start to count frames. >- action(function): The action to be performed over the files. Must return >- False if the action failed, True otherwise. It should take a file name >- as the first argument and **kwargs as arguments. The function must be >- possible to pickle, so it cannot be a bound function (for instance). >- >- Return: >- (bool): Whether performing the action over all files was successful or not. >- """ >- file_prefix = os.path.join(directory, file_pattern) >- file_number = start_number >- >- process_pool = multiprocessing.Pool(processes=multiprocessing.cpu_count()) >- results = [] >- while True: >- zero_padded_file_number = ZeroPad(file_number) >- file_name = file_prefix + zero_padded_file_number + '.' + file_extension >- if not os.path.isfile(file_name): >- break >- future = process_pool.apply_async(action, args=(file_name,), kwds=kwargs) >- results.append(future) >- file_number += 1 >- >- successful = True >- for result in results: >- if not result.get(): >- print "At least one action %s failed for files %sxxxx.%s." % ( >- action, file_pattern, file_extension) >- successful = False >- >- process_pool.close() >- return successful >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/barcode_tools/yuv_cropper.py b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/barcode_tools/yuv_cropper.py >deleted file mode 100644 >index 609f07dc9cc7ad90ec2d429167f92b53a782cf3a..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/barcode_tools/yuv_cropper.py >+++ /dev/null >@@ -1,125 +0,0 @@ >-#!/usr/bin/env python >-# Copyright (c) 2012 The WebRTC project authors. All Rights Reserved. >-# >-# Use of this source code is governed by a BSD-style license >-# that can be found in the LICENSE file in the root of the source >-# tree. An additional intellectual property rights grant can be found >-# in the file PATENTS. 
All contributing project authors may >-# be found in the AUTHORS file in the root of the source tree. >- >-import optparse >-import os >-import sys >- >- >-def _CropOneFrame(yuv_file, output_file, component_sizes): >- """Crops one frame. >- >- This function crops one frame going through all the YUV planes and cropping >- respective amount of rows. >- >- Args: >- yuv_file(file): The opened (for binary reading) YUV file. >- output_file(file): The opened (for binary writing) file. >- component_sizes(list of 3 3-ples): The list contains the sizes for all the >- planes (Y, U, V) of the YUV file plus the crop_height scaled for every >- plane. The sizes equal width, height and crop_height for the Y plane, >- and are equal to width/2, height/2 and crop_height/2 for the U and V >- planes. >- Return: >- (bool): True if there are more frames to crop, False otherwise. >- """ >- for comp_width, comp_height, comp_crop_height in component_sizes: >- for row in range(comp_height): >- # Read the plane data for this row. >- yuv_plane = yuv_file.read(comp_width) >- >- # If the plane is empty, we have reached the end of the file. >- if yuv_plane == "": >- return False >- >- # Only write the plane data for the rows bigger than crop_height. >- if row >= comp_crop_height: >- output_file.write(yuv_plane) >- return True >- >- >-def CropFrames(yuv_file_name, output_file_name, width, height, crop_height): >- """Crops rows of pixels from the top of the YUV frames. >- >- This function goes through all the frames in a video and crops the crop_height >- top pixel rows of every frame. >- >- Args: >- yuv_file_name(string): The name of the YUV file to be cropped. >- output_file_name(string): The name of the output file where the result will >- be written. >- width(int): The width of the original YUV file. >- height(int): The height of the original YUV file. >- crop_height(int): The height (the number of pixel rows) to be cropped from >- the frames. 
>- """ >- # Component sizes = [Y_sizes, U_sizes, V_sizes]. >- component_sizes = [(width, height, crop_height), >- (width/2, height/2, crop_height/2), >- (width/2, height/2, crop_height/2)] >- >- yuv_file = open(yuv_file_name, 'rb') >- output_file = open(output_file_name, 'wb') >- >- data_left = True >- while data_left: >- data_left = _CropOneFrame(yuv_file, output_file, component_sizes) >- >- yuv_file.close() >- output_file.close() >- >- >-def _ParseArgs(): >- """Registers the command-line options.""" >- usage = "usage: %prog [options]" >- parser = optparse.OptionParser(usage=usage) >- >- parser.add_option('--width', type='int', >- default=352, >- help=('Width of the YUV file\'s frames. ' >- 'Default: %default')) >- parser.add_option('--height', type='int', default=288, >- help=('Height of the YUV file\'s frames. ' >- 'Default: %default')) >- parser.add_option('--crop_height', type='int', default=32, >- help=('How much of the top of the YUV file to crop. ' >- 'Has to be module of 2. Default: %default')) >- parser.add_option('--yuv_file', type='string', >- help=('The YUV file to be cropped.')) >- parser.add_option('--output_file', type='string', default='output.yuv', >- help=('The output YUV file containing the cropped YUV. ' >- 'Default: %default')) >- options = parser.parse_args()[0] >- if not options.yuv_file: >- parser.error('yuv_file argument missing. Please specify input YUV file!') >- return options >- >- >-def main(): >- """A tool to crop rows of pixels from the top part of a YUV file. >- >- A simple invocation will be: >- ./yuv_cropper.py --width=640 --height=480 --crop_height=32 >- --yuv_file=<path_and_name_of_yuv_file> >- --output_yuv=<path and name_of_output_file> >- """ >- options = _ParseArgs() >- >- if os.path.getsize(options.yuv_file) == 0: >- sys.stderr.write('Error: The YUV file you have passed has size 0. 
The ' >- 'produced output will also have size 0.\n') >- return -1 >- >- CropFrames(options.yuv_file, options.output_file, options.width, >- options.height, options.crop_height) >- return 0 >- >- >-if __name__ == '__main__': >- sys.exit(main()) >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/compare_videos.py b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/compare_videos.py >index 79790439b176daac191727cd0c217c946d52579d..0cb4a6ddb199f560f467c4ddd9b2551cf443c601 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/compare_videos.py >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/compare_videos.py >@@ -7,6 +7,7 @@ > # in the file PATENTS. All contributing project authors may > # be found in the AUTHORS file in the root of the source tree. > >+import json > import optparse > import os > import shutil >@@ -45,25 +46,15 @@ def _ParseArgs(): > parser.add_option('--vmaf_phone_model', action='store_true', > help='Whether to use phone model in VMAF.') > parser.add_option('--barcode_decoder', type='string', >- help=('Path to the barcode decoder script. By default, we ' >- 'will assume we can find it in barcode_tools/' >- 'relative to this directory.')) >+ help=('DEPRECATED')) > parser.add_option('--ffmpeg_path', type='string', >- help=('The path to where the ffmpeg executable is located. ' >- 'If omitted, it will be assumed to be present in the ' >- 'PATH with the name ffmpeg[.exe].')) >+ help=('DEPRECATED')) > parser.add_option('--zxing_path', type='string', >- help=('The path to where the zxing executable is located. ' >- 'If omitted, it will be assumed to be present in the ' >- 'PATH with the name zxing[.exe].')) >+ help=('DEPRECATED')) > parser.add_option('--stats_file_ref', type='string', default='stats_ref.txt', >- help=('Path to the temporary stats file to be created and ' >- 'used for the reference video file. 
' >- 'Default: %default')) >+ help=('DEPRECATED')) > parser.add_option('--stats_file_test', type='string', >- default='stats_test.txt', >- help=('Path to the temporary stats file to be created and ' >- 'used for the test video file. Default: %default')) >+ help=('DEPRECATED')) > parser.add_option('--stats_file', type='string', > help=('DEPRECATED')) > parser.add_option('--yuv_frame_width', type='int', default=640, >@@ -74,11 +65,6 @@ def _ParseArgs(): > help='Where to store perf results in chartjson format.') > options, _ = parser.parse_args() > >- if options.stats_file: >- options.stats_file_test = options.stats_file >- print ('WARNING: Using deprecated switch --stats_file. ' >- 'The new flag is --stats_file_test.') >- > if not options.ref_video: > parser.error('You must provide a path to the reference video!') > if not os.path.exists(options.ref_video): >@@ -107,34 +93,6 @@ def _DevNull(): > """ > return open(os.devnull, 'r') > >-def DecodeBarcodesInVideo(options, path_to_decoder, video, stat_file): >- # Run barcode decoder on the test video to identify frame numbers. >- png_working_directory = tempfile.mkdtemp() >- cmd = [ >- sys.executable, >- path_to_decoder, >- '--yuv_file=%s' % video, >- '--yuv_frame_width=%d' % options.yuv_frame_width, >- '--yuv_frame_height=%d' % options.yuv_frame_height, >- '--stats_file=%s' % stat_file, >- '--png_working_dir=%s' % png_working_directory, >- ] >- if options.zxing_path: >- cmd.append('--zxing_path=%s' % options.zxing_path) >- if options.ffmpeg_path: >- cmd.append('--ffmpeg_path=%s' % options.ffmpeg_path) >- >- >- barcode_decoder = subprocess.Popen(cmd, stdin=_DevNull(), >- stdout=sys.stdout, stderr=sys.stderr) >- barcode_decoder.wait() >- >- shutil.rmtree(png_working_directory) >- if barcode_decoder.returncode != 0: >- print 'Failed to run barcode decoder script.' 
>- return 1 >- return 0 >- > > def _RunFrameAnalyzer(options, yuv_directory=None): > """Run frame analyzer to compare the videos and print output.""" >@@ -162,12 +120,9 @@ def _RunFrameAnalyzer(options, yuv_directory=None): > return frame_analyzer.returncode > > >-def _RunVmaf(options, yuv_directory): >+def _RunVmaf(options, yuv_directory, logfile): > """ Run VMAF to compare videos and print output. > >- The provided vmaf directory is assumed to contain a c++ wrapper executable >- and a model. >- > The yuv_directory is assumed to have been populated with a reference and test > video in .yuv format, with names according to the label. > """ >@@ -179,26 +134,29 @@ def _RunVmaf(options, yuv_directory): > os.path.join(yuv_directory, "ref.yuv"), > os.path.join(yuv_directory, "test.yuv"), > options.vmaf_model, >+ '--log', >+ logfile, >+ '--log-fmt', >+ 'json', > ] > if options.vmaf_phone_model: > cmd.append('--phone-model') > > vmaf = subprocess.Popen(cmd, stdin=_DevNull(), >- stdout=subprocess.PIPE, stderr=sys.stderr) >+ stdout=sys.stdout, stderr=sys.stderr) > vmaf.wait() > if vmaf.returncode != 0: > print 'Failed to run VMAF.' > return 1 >- output = vmaf.stdout.read() >- # Extract score from VMAF output. >- try: >- score = float(output.split('\n')[2].split()[3]) >- except (ValueError, IndexError): >- print 'Error in VMAF output (expected "VMAF score = [float]" on line 3):' >- print output >- return 1 > >- print 'RESULT Vmaf: %s= %f' % (options.label, score) >+ # Read per-frame scores from VMAF output and print. 
>+ with open(logfile) as f: >+ vmaf_data = json.load(f) >+ vmaf_scores = [] >+ for frame in vmaf_data['frames']: >+ vmaf_scores.append(frame['metrics']['vmaf']) >+ print 'RESULT VMAF: %s=' % options.label, vmaf_scores >+ > return 0 > > >@@ -213,40 +171,24 @@ def main(): > > Running vmaf requires the following arguments: > --vmaf, --vmaf_model, --yuv_frame_width, --yuv_frame_height >- >- Notice that the prerequisites for barcode_decoder.py also applies to this >- script. The means the following executables have to be available in the PATH: >- * zxing >- * ffmpeg > """ > options = _ParseArgs() > >- if options.barcode_decoder: >- path_to_decoder = options.barcode_decoder >- else: >- path_to_decoder = os.path.join(SCRIPT_DIR, 'barcode_tools', >- 'barcode_decoder.py') >- >- if DecodeBarcodesInVideo(options, path_to_decoder, >- options.ref_video, options.stats_file_ref) != 0: >- return 1 >- if DecodeBarcodesInVideo(options, path_to_decoder, >- options.test_video, options.stats_file_test) != 0: >- return 1 >- > if options.vmaf: > try: > # Directory to save temporary YUV files for VMAF in frame_analyzer. > yuv_directory = tempfile.mkdtemp() >+ _, vmaf_logfile = tempfile.mkstemp() > > # Run frame analyzer to compare the videos and print output. > if _RunFrameAnalyzer(options, yuv_directory=yuv_directory) != 0: > return 1 > > # Run VMAF for further video comparison and print output. 
>- return _RunVmaf(options, yuv_directory) >+ return _RunVmaf(options, yuv_directory, vmaf_logfile) > finally: > shutil.rmtree(yuv_directory) >+ os.remove(vmaf_logfile) > else: > return _RunFrameAnalyzer(options) > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/event_log_visualizer/analyzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/event_log_visualizer/analyzer.cc >index 43cbd69fc8e18c9461729d352fdaeb0b0d3185e5..3bdfbf32a0e1b516a54978451243252c0934c019 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/event_log_visualizer/analyzer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/event_log_visualizer/analyzer.cc >@@ -18,6 +18,7 @@ > #include <utility> > > #include "absl/memory/memory.h" >+#include "api/transport/goog_cc_factory.h" > #include "call/audio_receive_stream.h" > #include "call/audio_send_stream.h" > #include "call/call.h" >@@ -36,8 +37,9 @@ > #include "modules/congestion_controller/goog_cc/bitrate_estimator.h" > #include "modules/congestion_controller/goog_cc/delay_based_bwe.h" > #include "modules/congestion_controller/include/receive_side_congestion_controller.h" >-#include "modules/congestion_controller/include/send_side_congestion_controller.h" >+#include "modules/congestion_controller/rtp/transport_feedback_adapter.h" > #include "modules/pacing/packet_router.h" >+#include "modules/remote_bitrate_estimator/include/bwe_defines.h" > #include "modules/rtp_rtcp/include/rtp_rtcp.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > #include "modules/rtp_rtcp/source/rtcp_packet.h" >@@ -175,13 +177,13 @@ constexpr float kBottomMargin = 0.02f; > constexpr float kTopMargin = 0.05f; > > absl::optional<double> NetworkDelayDiff_AbsSendTime( >- const LoggedRtpPacket& old_packet, >- const LoggedRtpPacket& new_packet) { >- if (old_packet.header.extension.hasAbsoluteSendTime && >- new_packet.header.extension.hasAbsoluteSendTime) { >+ const LoggedRtpPacketIncoming& old_packet, >+ const 
LoggedRtpPacketIncoming& new_packet) { >+ if (old_packet.rtp.header.extension.hasAbsoluteSendTime && >+ new_packet.rtp.header.extension.hasAbsoluteSendTime) { > int64_t send_time_diff = WrappingDifference( >- new_packet.header.extension.absoluteSendTime, >- old_packet.header.extension.absoluteSendTime, 1ul << 24); >+ new_packet.rtp.header.extension.absoluteSendTime, >+ old_packet.rtp.header.extension.absoluteSendTime, 1ul << 24); > int64_t recv_time_diff = > new_packet.log_time_us() - old_packet.log_time_us(); > double delay_change_us = >@@ -193,34 +195,31 @@ absl::optional<double> NetworkDelayDiff_AbsSendTime( > } > > absl::optional<double> NetworkDelayDiff_CaptureTime( >- const LoggedRtpPacket& old_packet, >- const LoggedRtpPacket& new_packet) { >- int64_t send_time_diff = WrappingDifference( >- new_packet.header.timestamp, old_packet.header.timestamp, 1ull << 32); >+ const LoggedRtpPacketIncoming& old_packet, >+ const LoggedRtpPacketIncoming& new_packet, >+ const double sample_rate) { >+ int64_t send_time_diff = >+ WrappingDifference(new_packet.rtp.header.timestamp, >+ old_packet.rtp.header.timestamp, 1ull << 32); > int64_t recv_time_diff = new_packet.log_time_us() - old_packet.log_time_us(); > >- const double kVideoSampleRate = 90000; >- // TODO(terelius): We treat all streams as video for now, even though >- // audio might be sampled at e.g. 16kHz, because it is really difficult to >- // figure out the true sampling rate of a stream. The effect is that the >- // delay will be scaled incorrectly for non-video streams. >- > double delay_change = > static_cast<double>(recv_time_diff) / 1000 - >- static_cast<double>(send_time_diff) / kVideoSampleRate * 1000; >+ static_cast<double>(send_time_diff) / sample_rate * 1000; > if (delay_change < -10000 || 10000 < delay_change) { > RTC_LOG(LS_WARNING) << "Very large delay change. 
Timestamps correct?"; >- RTC_LOG(LS_WARNING) << "Old capture time " << old_packet.header.timestamp >- << ", received time " << old_packet.log_time_us(); >- RTC_LOG(LS_WARNING) << "New capture time " << new_packet.header.timestamp >- << ", received time " << new_packet.log_time_us(); >+ RTC_LOG(LS_WARNING) << "Old capture time " >+ << old_packet.rtp.header.timestamp << ", received time " >+ << old_packet.log_time_us(); >+ RTC_LOG(LS_WARNING) << "New capture time " >+ << new_packet.rtp.header.timestamp << ", received time " >+ << new_packet.log_time_us(); > RTC_LOG(LS_WARNING) << "Receive time difference " << recv_time_diff << " = " > << static_cast<double>(recv_time_diff) / > kNumMicrosecsPerSec > << "s"; > RTC_LOG(LS_WARNING) << "Send time difference " << send_time_diff << " = " >- << static_cast<double>(send_time_diff) / >- kVideoSampleRate >+ << static_cast<double>(send_time_diff) / sample_rate > << "s"; > } > return delay_change; >@@ -274,9 +273,10 @@ void AccumulatePairs( > for (size_t i = 1; i < data.size(); i++) { > float x = fx(data[i]); > absl::optional<ResultType> y = fy(data[i - 1], data[i]); >- if (y) >+ if (y) { > sum += *y; >- result->points.emplace_back(x, static_cast<float>(sum)); >+ result->points.emplace_back(x, static_cast<float>(sum)); >+ } > } > } > >@@ -486,17 +486,15 @@ EventLogAnalyzer::EventLogAnalyzer(const ParsedRtcEventLogNew& log, > << " (LOG_START, LOG_END) segments in log."; > } > >-class BitrateObserver : public NetworkChangedObserver, >- public RemoteBitrateObserver { >+class BitrateObserver : public RemoteBitrateObserver { > public: > BitrateObserver() : last_bitrate_bps_(0), bitrate_updated_(false) {} > >- void OnNetworkChanged(uint32_t bitrate_bps, >- uint8_t fraction_lost, >- int64_t rtt_ms, >- int64_t probing_interval_ms) override { >- last_bitrate_bps_ = bitrate_bps; >- bitrate_updated_ = true; >+ void Update(NetworkControlUpdate update) { >+ if (update.target_rate) { >+ last_bitrate_bps_ = 
update.target_rate->target_rate.bps(); >+ bitrate_updated_ = true; >+ } > } > > void OnReceiveBitrateChanged(const std::vector<uint32_t>& ssrcs, >@@ -737,71 +735,58 @@ void EventLogAnalyzer::CreateIncomingPacketLossGraph(Plot* plot) { > plot->SetTitle("Estimated incoming loss rate"); > } > >-void EventLogAnalyzer::CreateIncomingDelayDeltaGraph(Plot* plot) { >- for (const auto& stream : parsed_log_.rtp_packets_by_ssrc(kIncomingPacket)) { >+void EventLogAnalyzer::CreateIncomingDelayGraph(Plot* plot) { >+ for (const auto& stream : parsed_log_.incoming_rtp_packets_by_ssrc()) { > // Filter on SSRC. > if (!MatchingSsrc(stream.ssrc, desired_ssrc_) || >- IsAudioSsrc(kIncomingPacket, stream.ssrc) || >- !IsVideoSsrc(kIncomingPacket, stream.ssrc) || > IsRtxSsrc(kIncomingPacket, stream.ssrc)) { > continue; > } > >- TimeSeries capture_time_data( >- GetStreamName(kIncomingPacket, stream.ssrc) + " capture-time", >- LineStyle::kBar); >- auto ToCallTime = [this](const LoggedRtpPacket& packet) { >- return this->ToCallTimeSec(packet.log_time_us()); >- }; >- ProcessPairs<LoggedRtpPacket, double>( >- ToCallTime, NetworkDelayDiff_CaptureTime, stream.packet_view, >- &capture_time_data); >- plot->AppendTimeSeries(std::move(capture_time_data)); >- >- TimeSeries send_time_data( >- GetStreamName(kIncomingPacket, stream.ssrc) + " abs-send-time", >- LineStyle::kBar); >- ProcessPairs<LoggedRtpPacket, double>(ToCallTime, >- NetworkDelayDiff_AbsSendTime, >- stream.packet_view, &send_time_data); >- plot->AppendTimeSeries(std::move(send_time_data)); >- } >- >- plot->SetXAxis(ToCallTimeSec(begin_time_), call_duration_s_, "Time (s)", >- kLeftMargin, kRightMargin); >- plot->SetSuggestedYAxis(0, 1, "Latency change (ms)", kBottomMargin, >- kTopMargin); >- plot->SetTitle("Network latency difference between consecutive packets"); >-} >- >-void EventLogAnalyzer::CreateIncomingDelayGraph(Plot* plot) { >- for (const auto& stream : parsed_log_.rtp_packets_by_ssrc(kIncomingPacket)) { >- // Filter on SSRC. 
>- if (!MatchingSsrc(stream.ssrc, desired_ssrc_) || >- IsAudioSsrc(kIncomingPacket, stream.ssrc) || >- !IsVideoSsrc(kIncomingPacket, stream.ssrc) || >- IsRtxSsrc(kIncomingPacket, stream.ssrc)) { >+ const std::vector<LoggedRtpPacketIncoming>& packets = >+ stream.incoming_packets; >+ if (packets.size() < 100) { >+ RTC_LOG(LS_WARNING) << "Can't estimate the RTP clock frequency with " >+ << packets.size() << " packets in the stream."; >+ continue; >+ } >+ int64_t end_time_us = log_segments_.empty() >+ ? std::numeric_limits<int64_t>::max() >+ : log_segments_.front().second; >+ absl::optional<uint32_t> estimated_frequency = >+ EstimateRtpClockFrequency(packets, end_time_us); >+ if (!estimated_frequency) >+ continue; >+ const double frequency_hz = *estimated_frequency; >+ if (IsVideoSsrc(kIncomingPacket, stream.ssrc) && frequency_hz != 90000) { >+ RTC_LOG(LS_WARNING) >+ << "Video stream should use a 90 kHz clock but appears to use " >+ << frequency_hz / 1000 << ". Discarding."; > continue; > } > >+ auto ToCallTime = [this](const LoggedRtpPacketIncoming& packet) { >+ return this->ToCallTimeSec(packet.log_time_us()); >+ }; >+ auto ToNetworkDelay = [frequency_hz]( >+ const LoggedRtpPacketIncoming& old_packet, >+ const LoggedRtpPacketIncoming& new_packet) { >+ return NetworkDelayDiff_CaptureTime(old_packet, new_packet, frequency_hz); >+ }; >+ > TimeSeries capture_time_data( > GetStreamName(kIncomingPacket, stream.ssrc) + " capture-time", > LineStyle::kLine); >- auto ToCallTime = [this](const LoggedRtpPacket& packet) { >- return this->ToCallTimeSec(packet.log_time_us()); >- }; >- AccumulatePairs<LoggedRtpPacket, double>( >- ToCallTime, NetworkDelayDiff_CaptureTime, stream.packet_view, >- &capture_time_data); >+ AccumulatePairs<LoggedRtpPacketIncoming, double>( >+ ToCallTime, ToNetworkDelay, packets, &capture_time_data); > plot->AppendTimeSeries(std::move(capture_time_data)); > > TimeSeries send_time_data( > GetStreamName(kIncomingPacket, stream.ssrc) + " abs-send-time", > 
LineStyle::kLine); >- AccumulatePairs<LoggedRtpPacket, double>( >- ToCallTime, NetworkDelayDiff_AbsSendTime, stream.packet_view, >- &send_time_data); >- plot->AppendTimeSeries(std::move(send_time_data)); >+ AccumulatePairs<LoggedRtpPacketIncoming, double>( >+ ToCallTime, NetworkDelayDiff_AbsSendTime, packets, &send_time_data); >+ plot->AppendTimeSeriesIfNotEmpty(std::move(send_time_data)); > } > > plot->SetXAxis(ToCallTimeSec(begin_time_), call_duration_s_, "Time (s)", >@@ -1081,10 +1066,15 @@ void EventLogAnalyzer::CreateSendSideBweSimulationGraph(Plot* plot) { > RtcEventLogNullImpl null_event_log; > PacketRouter packet_router; > PacedSender pacer(&clock, &packet_router, &null_event_log); >- SendSideCongestionController cc(&clock, &observer, &null_event_log, &pacer); >+ TransportFeedbackAdapter transport_feedback(&clock); >+ auto factory = GoogCcNetworkControllerFactory(&null_event_log); >+ TimeDelta process_interval = factory.GetProcessInterval(); > // TODO(holmer): Log the call config and use that here instead. 
> static const uint32_t kDefaultStartBitrateBps = 300000; >- cc.SetBweBitrates(0, kDefaultStartBitrateBps, -1); >+ NetworkControllerConfig cc_config; >+ cc_config.constraints.at_time = Timestamp::us(clock.TimeInMicroseconds()); >+ cc_config.constraints.starting_rate = DataRate::bps(kDefaultStartBitrateBps); >+ auto goog_cc = factory.Create(cc_config); > > TimeSeries time_series("Delay-based estimate", LineStyle::kStep, > PointStyle::kHighlight); >@@ -1107,12 +1097,12 @@ void EventLogAnalyzer::CreateSendSideBweSimulationGraph(Plot* plot) { > return static_cast<int64_t>(rtcp_iterator->log_time_us()); > return std::numeric_limits<int64_t>::max(); > }; >+ int64_t next_process_time_us_ = clock.TimeInMicroseconds(); > > auto NextProcessTime = [&]() { > if (rtcp_iterator != incoming_rtcp.end() || > rtp_iterator != outgoing_rtp.end()) { >- return clock.TimeInMicroseconds() + >- std::max<int64_t>(cc.TimeUntilNextProcess() * 1000, 0); >+ return next_process_time_us_; > } > return std::numeric_limits<int64_t>::max(); > }; >@@ -1129,24 +1119,32 @@ void EventLogAnalyzer::CreateSendSideBweSimulationGraph(Plot* plot) { > AcknowledgedBitrateEstimator acknowledged_bitrate_estimator( > absl::make_unique<BitrateEstimator>()); > #endif // !(BWE_TEST_LOGGING_COMPILE_TIME_ENABLE) >- int64_t time_us = std::min(NextRtpTime(), NextRtcpTime()); >+ int64_t time_us = >+ std::min({NextRtpTime(), NextRtcpTime(), NextProcessTime()}); > int64_t last_update_us = 0; > while (time_us != std::numeric_limits<int64_t>::max()) { > clock.AdvanceTimeMicroseconds(time_us - clock.TimeInMicroseconds()); > if (clock.TimeInMicroseconds() >= NextRtcpTime()) { > RTC_DCHECK_EQ(clock.TimeInMicroseconds(), NextRtcpTime()); >- cc.OnTransportFeedback(rtcp_iterator->transport_feedback); >- std::vector<PacketFeedback> feedback = cc.GetTransportFeedbackVector(); >- SortPacketFeedbackVector(&feedback); >+ >+ auto feedback_msg = transport_feedback.ProcessTransportFeedback( >+ rtcp_iterator->transport_feedback); > 
absl::optional<uint32_t> bitrate_bps; >- if (!feedback.empty()) { >+ if (feedback_msg) { >+ observer.Update(goog_cc->OnTransportPacketsFeedback(*feedback_msg)); >+ std::vector<PacketFeedback> feedback = >+ transport_feedback.GetTransportFeedbackVector(); >+ SortPacketFeedbackVector(&feedback); >+ if (!feedback.empty()) { > #if !(BWE_TEST_LOGGING_COMPILE_TIME_ENABLE) >- acknowledged_bitrate_estimator.IncomingPacketFeedbackVector(feedback); >+ acknowledged_bitrate_estimator.IncomingPacketFeedbackVector(feedback); > #endif // !(BWE_TEST_LOGGING_COMPILE_TIME_ENABLE) >- for (const PacketFeedback& packet : feedback) >- acked_bitrate.Update(packet.payload_size, packet.arrival_time_ms); >- bitrate_bps = acked_bitrate.Rate(feedback.back().arrival_time_ms); >+ for (const PacketFeedback& packet : feedback) >+ acked_bitrate.Update(packet.payload_size, packet.arrival_time_ms); >+ bitrate_bps = acked_bitrate.Rate(feedback.back().arrival_time_ms); >+ } > } >+ > float x = ToCallTimeSec(clock.TimeInMicroseconds()); > float y = bitrate_bps.value_or(0) / 1000; > acked_time_series.points.emplace_back(x, y); >@@ -1161,19 +1159,25 @@ void EventLogAnalyzer::CreateSendSideBweSimulationGraph(Plot* plot) { > const RtpPacketType& rtp_packet = *rtp_iterator->second; > if (rtp_packet.rtp.header.extension.hasTransportSequenceNumber) { > RTC_DCHECK(rtp_packet.rtp.header.extension.hasTransportSequenceNumber); >- cc.AddPacket(rtp_packet.rtp.header.ssrc, >- rtp_packet.rtp.header.extension.transportSequenceNumber, >- rtp_packet.rtp.total_length, PacedPacketInfo()); >+ transport_feedback.AddPacket( >+ rtp_packet.rtp.header.ssrc, >+ rtp_packet.rtp.header.extension.transportSequenceNumber, >+ rtp_packet.rtp.total_length, PacedPacketInfo()); > rtc::SentPacket sent_packet( > rtp_packet.rtp.header.extension.transportSequenceNumber, > rtp_packet.rtp.log_time_us() / 1000); >- cc.OnSentPacket(sent_packet); >+ auto sent_msg = transport_feedback.ProcessSentPacket(sent_packet); >+ if (sent_msg) >+ 
observer.Update(goog_cc->OnSentPacket(*sent_msg));
> }
> ++rtp_iterator;
> }
> if (clock.TimeInMicroseconds() >= NextProcessTime()) {
> RTC_DCHECK_EQ(clock.TimeInMicroseconds(), NextProcessTime());
>- cc.Process();
>+ ProcessInterval msg;
>+ msg.at_time = Timestamp::us(clock.TimeInMicroseconds());
>+ observer.Update(goog_cc->OnProcessInterval(msg));
>+ next_process_time_us_ += process_interval.us();
> }
> if (observer.GetAndResetBitrateUpdated() ||
> time_us - last_update_us >= 1e6) {
>@@ -1277,9 +1281,10 @@ void EventLogAnalyzer::CreateReceiveSideBweSimulationGraph(Plot* plot) {
> void EventLogAnalyzer::CreateNetworkDelayFeedbackGraph(Plot* plot) {
> TimeSeries late_feedback_series("Late feedback results.", LineStyle::kNone,
> PointStyle::kHighlight);
>- TimeSeries time_series("Network Delay Change", LineStyle::kLine,
>+ TimeSeries time_series("Network delay", LineStyle::kLine,
> PointStyle::kHighlight);
>- int64_t estimated_base_delay_ms = std::numeric_limits<int64_t>::max();
>+ int64_t min_send_receive_diff_ms = std::numeric_limits<int64_t>::max();
>+ int64_t min_rtt_ms = std::numeric_limits<int64_t>::max();
>
> int64_t prev_y = 0;
> for (auto packet : GetNetworkTrace(parsed_log_)) {
>@@ -1292,16 +1297,22 @@ void EventLogAnalyzer::CreateNetworkDelayFeedbackGraph(Plot* plot) {
> }
> int64_t y = packet.arrival_time_ms - packet.send_time_ms;
> prev_y = y;
>- estimated_base_delay_ms = std::min(y, estimated_base_delay_ms);
>+ int64_t rtt_ms = packet.feedback_arrival_time_ms - packet.send_time_ms;
>+ min_rtt_ms = std::min(rtt_ms, min_rtt_ms);
>+ min_send_receive_diff_ms = std::min(y, min_send_receive_diff_ms);
> time_series.points.emplace_back(x, y);
> }
>
>- // We assume that the base network delay (w/o queues) is the min delay
>- // observed during the call.
>+ // We assume that the base network delay (w/o queues) is equal to half
>+ // the minimum RTT. Therefore rescale the delays by subtracting the minimum
>+ // observed one-way delay and adding half the minimum RTT.
>+ const int64_t estimated_clock_offset_ms = >+ min_send_receive_diff_ms - min_rtt_ms / 2; > for (TimeSeriesPoint& point : time_series.points) >- point.y -= estimated_base_delay_ms; >+ point.y -= estimated_clock_offset_ms; > for (TimeSeriesPoint& point : late_feedback_series.points) >- point.y -= estimated_base_delay_ms; >+ point.y -= estimated_clock_offset_ms; >+ > // Add the data set to the plot. > plot->AppendTimeSeriesIfNotEmpty(std::move(time_series)); > plot->AppendTimeSeriesIfNotEmpty(std::move(late_feedback_series)); >@@ -1309,7 +1320,7 @@ void EventLogAnalyzer::CreateNetworkDelayFeedbackGraph(Plot* plot) { > plot->SetXAxis(ToCallTimeSec(begin_time_), call_duration_s_, "Time (s)", > kLeftMargin, kRightMargin); > plot->SetSuggestedYAxis(0, 10, "Delay (ms)", kBottomMargin, kTopMargin); >- plot->SetTitle("Network Delay Change."); >+ plot->SetTitle("Network delay (based on per-packet feedback)"); > } > > void EventLogAnalyzer::CreatePacerDelayGraph(Plot* plot) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/event_log_visualizer/analyzer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/event_log_visualizer/analyzer.h >index 4b8c27c9517d0cbcc0401d48ab118cc23d90128a..c6606c2b1f16b122d35b2e1fd6f3ced3ebf9e60f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/event_log_visualizer/analyzer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/event_log_visualizer/analyzer.h >@@ -45,7 +45,6 @@ class EventLogAnalyzer { > > void CreateIncomingPacketLossGraph(Plot* plot); > >- void CreateIncomingDelayDeltaGraph(Plot* plot); > void CreateIncomingDelayGraph(Plot* plot); > > void CreateFractionLossGraph(Plot* plot); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/event_log_visualizer/main.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/event_log_visualizer/main.cc >index 0c3689c1bcae3a0fdd003862b1d8dff44dab2ee1..0b4e7617e6068d756f390f4be2c74cc91ebca620 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/event_log_visualizer/main.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/event_log_visualizer/main.cc >@@ -20,151 +20,169 @@ > #include "test/field_trial.h" > #include "test/testsupport/fileutils.h" > >-DEFINE_string(plot_profile, >- "default", >- "A profile that selects a certain subset of the plots. Currently " >- "defined profiles are \"all\", \"none\", \"sendside_bwe\"," >- "\"receiveside_bwe\" and \"default\""); >+WEBRTC_DEFINE_string( >+ plot_profile, >+ "default", >+ "A profile that selects a certain subset of the plots. Currently " >+ "defined profiles are \"all\", \"none\", \"sendside_bwe\"," >+ "\"receiveside_bwe\" and \"default\""); > >-DEFINE_bool(plot_incoming_packet_sizes, >- false, >- "Plot bar graph showing the size of each incoming packet."); >-DEFINE_bool(plot_outgoing_packet_sizes, >- false, >- "Plot bar graph showing the size of each outgoing packet."); >-DEFINE_bool(plot_incoming_packet_count, >- false, >- "Plot the accumulated number of packets for each incoming stream."); >-DEFINE_bool(plot_outgoing_packet_count, >- false, >- "Plot the accumulated number of packets for each outgoing stream."); >-DEFINE_bool(plot_audio_playout, >- false, >- "Plot bar graph showing the time between each audio playout."); >-DEFINE_bool(plot_audio_level, >- false, >- "Plot line graph showing the audio level of incoming audio."); >-DEFINE_bool(plot_incoming_sequence_number_delta, >- false, >- "Plot the sequence number difference between consecutive incoming " >- "packets."); >-DEFINE_bool( >- plot_incoming_delay_delta, >+WEBRTC_DEFINE_bool(plot_incoming_packet_sizes, >+ false, >+ "Plot bar graph showing the size of each incoming packet."); >+WEBRTC_DEFINE_bool(plot_outgoing_packet_sizes, >+ false, >+ "Plot bar graph showing the size of each outgoing packet."); >+WEBRTC_DEFINE_bool( >+ plot_incoming_packet_count, > false, >- "Plot the difference in 1-way path delay between consecutive packets."); 
>-DEFINE_bool(plot_incoming_delay, >- true, >- "Plot the 1-way path delay for incoming packets, normalized so " >- "that the first packet has delay 0."); >-DEFINE_bool(plot_incoming_loss_rate, >- true, >- "Compute the loss rate for incoming packets using a method that's " >- "similar to the one used for RTCP SR and RR fraction lost. Note " >- "that the loss rate can be negative if packets are duplicated or " >- "reordered."); >-DEFINE_bool(plot_incoming_bitrate, >- true, >- "Plot the total bitrate used by all incoming streams."); >-DEFINE_bool(plot_outgoing_bitrate, >- true, >- "Plot the total bitrate used by all outgoing streams."); >-DEFINE_bool(plot_incoming_stream_bitrate, >- true, >- "Plot the bitrate used by each incoming stream."); >-DEFINE_bool(plot_outgoing_stream_bitrate, >- true, >- "Plot the bitrate used by each outgoing stream."); >-DEFINE_bool(plot_simulated_receiveside_bwe, >- false, >- "Run the receive-side bandwidth estimator with the incoming rtp " >- "packets and plot the resulting estimate."); >-DEFINE_bool(plot_simulated_sendside_bwe, >- false, >- "Run the send-side bandwidth estimator with the outgoing rtp and " >- "incoming rtcp and plot the resulting estimate."); >-DEFINE_bool(plot_network_delay_feedback, >- true, >- "Compute network delay based on sent packets and the received " >- "transport feedback."); >-DEFINE_bool(plot_fraction_loss_feedback, >- true, >- "Plot packet loss in percent for outgoing packets (as perceived by " >- "the send-side bandwidth estimator)."); >-DEFINE_bool(plot_pacer_delay, >- false, >- "Plot the time each sent packet has spent in the pacer (based on " >- "the difference between the RTP timestamp and the send " >- "timestamp)."); >-DEFINE_bool(plot_timestamps, >- false, >- "Plot the rtp timestamps of all rtp and rtcp packets over time."); >-DEFINE_bool(plot_rtcp_details, >- false, >- "Plot the contents of all report blocks in all sender and receiver " >- "reports. 
This includes fraction lost, cumulative number of lost " >- "packets, extended highest sequence number and time since last " >- "received SR."); >-DEFINE_bool(plot_audio_encoder_bitrate_bps, >- false, >- "Plot the audio encoder target bitrate."); >-DEFINE_bool(plot_audio_encoder_frame_length_ms, >- false, >- "Plot the audio encoder frame length."); >-DEFINE_bool( >+ "Plot the accumulated number of packets for each incoming stream."); >+WEBRTC_DEFINE_bool( >+ plot_outgoing_packet_count, >+ false, >+ "Plot the accumulated number of packets for each outgoing stream."); >+WEBRTC_DEFINE_bool( >+ plot_audio_playout, >+ false, >+ "Plot bar graph showing the time between each audio playout."); >+WEBRTC_DEFINE_bool( >+ plot_audio_level, >+ false, >+ "Plot line graph showing the audio level of incoming audio."); >+WEBRTC_DEFINE_bool( >+ plot_incoming_sequence_number_delta, >+ false, >+ "Plot the sequence number difference between consecutive incoming " >+ "packets."); >+WEBRTC_DEFINE_bool( >+ plot_incoming_delay, >+ true, >+ "Plot the 1-way path delay for incoming packets, normalized so " >+ "that the first packet has delay 0."); >+WEBRTC_DEFINE_bool( >+ plot_incoming_loss_rate, >+ true, >+ "Compute the loss rate for incoming packets using a method that's " >+ "similar to the one used for RTCP SR and RR fraction lost. 
Note " >+ "that the loss rate can be negative if packets are duplicated or " >+ "reordered."); >+WEBRTC_DEFINE_bool(plot_incoming_bitrate, >+ true, >+ "Plot the total bitrate used by all incoming streams."); >+WEBRTC_DEFINE_bool(plot_outgoing_bitrate, >+ true, >+ "Plot the total bitrate used by all outgoing streams."); >+WEBRTC_DEFINE_bool(plot_incoming_stream_bitrate, >+ true, >+ "Plot the bitrate used by each incoming stream."); >+WEBRTC_DEFINE_bool(plot_outgoing_stream_bitrate, >+ true, >+ "Plot the bitrate used by each outgoing stream."); >+WEBRTC_DEFINE_bool( >+ plot_simulated_receiveside_bwe, >+ false, >+ "Run the receive-side bandwidth estimator with the incoming rtp " >+ "packets and plot the resulting estimate."); >+WEBRTC_DEFINE_bool( >+ plot_simulated_sendside_bwe, >+ false, >+ "Run the send-side bandwidth estimator with the outgoing rtp and " >+ "incoming rtcp and plot the resulting estimate."); >+WEBRTC_DEFINE_bool( >+ plot_network_delay_feedback, >+ true, >+ "Compute network delay based on sent packets and the received " >+ "transport feedback."); >+WEBRTC_DEFINE_bool( >+ plot_fraction_loss_feedback, >+ true, >+ "Plot packet loss in percent for outgoing packets (as perceived by " >+ "the send-side bandwidth estimator)."); >+WEBRTC_DEFINE_bool( >+ plot_pacer_delay, >+ false, >+ "Plot the time each sent packet has spent in the pacer (based on " >+ "the difference between the RTP timestamp and the send " >+ "timestamp)."); >+WEBRTC_DEFINE_bool( >+ plot_timestamps, >+ false, >+ "Plot the rtp timestamps of all rtp and rtcp packets over time."); >+WEBRTC_DEFINE_bool( >+ plot_rtcp_details, >+ false, >+ "Plot the contents of all report blocks in all sender and receiver " >+ "reports. 
This includes fraction lost, cumulative number of lost " >+ "packets, extended highest sequence number and time since last " >+ "received SR."); >+WEBRTC_DEFINE_bool(plot_audio_encoder_bitrate_bps, >+ false, >+ "Plot the audio encoder target bitrate."); >+WEBRTC_DEFINE_bool(plot_audio_encoder_frame_length_ms, >+ false, >+ "Plot the audio encoder frame length."); >+WEBRTC_DEFINE_bool( > plot_audio_encoder_packet_loss, > false, > "Plot the uplink packet loss fraction which is sent to the audio encoder."); >-DEFINE_bool(plot_audio_encoder_fec, false, "Plot the audio encoder FEC."); >-DEFINE_bool(plot_audio_encoder_dtx, false, "Plot the audio encoder DTX."); >-DEFINE_bool(plot_audio_encoder_num_channels, >- false, >- "Plot the audio encoder number of channels."); >-DEFINE_bool(plot_neteq_stats, false, "Plot the NetEq statistics."); >-DEFINE_bool(plot_ice_candidate_pair_config, >- false, >- "Plot the ICE candidate pair config events."); >-DEFINE_bool(plot_ice_connectivity_check, >- false, >- "Plot the ICE candidate pair connectivity checks."); >+WEBRTC_DEFINE_bool(plot_audio_encoder_fec, >+ false, >+ "Plot the audio encoder FEC."); >+WEBRTC_DEFINE_bool(plot_audio_encoder_dtx, >+ false, >+ "Plot the audio encoder DTX."); >+WEBRTC_DEFINE_bool(plot_audio_encoder_num_channels, >+ false, >+ "Plot the audio encoder number of channels."); >+WEBRTC_DEFINE_bool(plot_neteq_stats, false, "Plot the NetEq statistics."); >+WEBRTC_DEFINE_bool(plot_ice_candidate_pair_config, >+ false, >+ "Plot the ICE candidate pair config events."); >+WEBRTC_DEFINE_bool(plot_ice_connectivity_check, >+ false, >+ "Plot the ICE candidate pair connectivity checks."); > >-DEFINE_string( >+WEBRTC_DEFINE_string( > force_fieldtrials, > "", > "Field trials control experimental feature code which can be forced. " > "E.g. running with --force_fieldtrials=WebRTC-FooFeature/Enabled/" > " will assign the group Enabled to field trial WebRTC-FooFeature. 
Multiple " > "trials are separated by \"/\""); >-DEFINE_string(wav_filename, >- "", >- "Path to wav file used for simulation of jitter buffer"); >-DEFINE_bool(help, false, "prints this message"); >+WEBRTC_DEFINE_string(wav_filename, >+ "", >+ "Path to wav file used for simulation of jitter buffer"); >+WEBRTC_DEFINE_bool(help, false, "prints this message"); > >-DEFINE_bool(show_detector_state, >- false, >- "Show the state of the delay based BWE detector on the total " >- "bitrate graph"); >+WEBRTC_DEFINE_bool( >+ show_detector_state, >+ false, >+ "Show the state of the delay based BWE detector on the total " >+ "bitrate graph"); > >-DEFINE_bool(show_alr_state, >- false, >- "Show the state ALR state on the total bitrate graph"); >+WEBRTC_DEFINE_bool(show_alr_state, >+ false, >+ "Show the state ALR state on the total bitrate graph"); > >-DEFINE_bool(parse_unconfigured_header_extensions, >- true, >- "Attempt to parse unconfigured header extensions using the default " >- "WebRTC mapping. This can give very misleading results if the " >- "application negotiates a different mapping."); >+WEBRTC_DEFINE_bool( >+ parse_unconfigured_header_extensions, >+ true, >+ "Attempt to parse unconfigured header extensions using the default " >+ "WebRTC mapping. This can give very misleading results if the " >+ "application negotiates a different mapping."); > >-DEFINE_bool(print_triage_alerts, >- false, >- "Print triage alerts, i.e. a list of potential problems."); >+WEBRTC_DEFINE_bool(print_triage_alerts, >+ false, >+ "Print triage alerts, i.e. 
a list of potential problems."); > >-DEFINE_bool(normalize_time, >- true, >- "Normalize the log timestamps so that the call starts at time 0."); >+WEBRTC_DEFINE_bool( >+ normalize_time, >+ true, >+ "Normalize the log timestamps so that the call starts at time 0."); > >-DEFINE_bool(protobuf_output, >- false, >- "Output charts as protobuf instead of python code."); >+WEBRTC_DEFINE_bool(protobuf_output, >+ false, >+ "Output charts as protobuf instead of python code."); > > void SetAllPlotFlags(bool setting); > >@@ -194,7 +212,6 @@ int main(int argc, char* argv[]) { > } else if (strcmp(FLAG_plot_profile, "receiveside_bwe") == 0) { > SetAllPlotFlags(false); > FLAG_plot_incoming_packet_sizes = true; >- FLAG_plot_incoming_delay_delta = true; > FLAG_plot_incoming_delay = true; > FLAG_plot_incoming_loss_rate = true; > FLAG_plot_incoming_bitrate = true; >@@ -218,7 +235,6 @@ int main(int argc, char* argv[]) { > return 0; > } > >- webrtc::test::SetExecutablePath(argv[0]); > webrtc::test::ValidateFieldTrialsStringOrDie(FLAG_force_fieldtrials); > // InitFieldTrialsFromString stores the char*, so the char array must outlive > // the application. >@@ -236,9 +252,7 @@ int main(int argc, char* argv[]) { > > if (!parsed_log.ParseFile(filename)) { > std::cerr << "Could not parse the entire log file." << std::endl; >- std::cerr << "Proceeding to analyze the first " >- << parsed_log.GetNumberOfEvents() << " events in the file." >- << std::endl; >+ std::cerr << "Only the parsable events will be analyzed." 
<< std::endl; > } > > webrtc::EventLogAnalyzer analyzer(parsed_log, FLAG_normalize_time); >@@ -277,9 +291,6 @@ int main(int argc, char* argv[]) { > if (FLAG_plot_incoming_sequence_number_delta) { > analyzer.CreateSequenceNumberGraph(collection->AppendNewPlot()); > } >- if (FLAG_plot_incoming_delay_delta) { >- analyzer.CreateIncomingDelayDeltaGraph(collection->AppendNewPlot()); >- } > if (FLAG_plot_incoming_delay) { > analyzer.CreateIncomingDelayGraph(collection->AppendNewPlot()); > } >@@ -466,7 +477,6 @@ void SetAllPlotFlags(bool setting) { > FLAG_plot_audio_playout = setting; > FLAG_plot_audio_level = setting; > FLAG_plot_incoming_sequence_number_delta = setting; >- FLAG_plot_incoming_delay_delta = setting; > FLAG_plot_incoming_delay = setting; > FLAG_plot_incoming_loss_rate = setting; > FLAG_plot_incoming_bitrate = setting; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/force_mic_volume_max/force_mic_volume_max.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/force_mic_volume_max/force_mic_volume_max.cc >deleted file mode 100644 >index 2cd1b36d9047b89ce959a68bfc659b1ef94fdb04..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/force_mic_volume_max/force_mic_volume_max.cc >+++ /dev/null >@@ -1,58 +0,0 @@ >-/* >- * Copyright (c) 2013 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-// This utility will portably force the volume of the default microphone to max. 
>- >-#include <stdio.h> >- >-#include "modules/audio_device/include/audio_device.h" >- >-using webrtc::AudioDeviceModule; >- >-#if defined(_WIN32) >-#define DEFAULT_INPUT_DEVICE (AudioDeviceModule::kDefaultCommunicationDevice) >-#else >-#define DEFAULT_INPUT_DEVICE (0) >-#endif >- >-int main(int /*argc*/, char* /*argv*/ []) { >- // Create and initialize the ADM. >- rtc::scoped_refptr<AudioDeviceModule> adm( >- AudioDeviceModule::Create(AudioDeviceModule::kPlatformDefaultAudio)); >- if (!adm.get()) { >- fprintf(stderr, "Failed to create Audio Device Module.\n"); >- return 1; >- } >- if (adm->Init() != 0) { >- fprintf(stderr, "Failed to initialize Audio Device Module.\n"); >- return 1; >- } >- if (adm->SetRecordingDevice(DEFAULT_INPUT_DEVICE) != 0) { >- fprintf(stderr, "Failed to set the default input device.\n"); >- return 1; >- } >- if (adm->InitMicrophone() != 0) { >- fprintf(stderr, "Failed to to initialize the microphone.\n"); >- return 1; >- } >- >- // Set mic volume to max. >- uint32_t max_vol = 0; >- if (adm->MaxMicrophoneVolume(&max_vol) != 0) { >- fprintf(stderr, "Failed to get max volume.\n"); >- return 1; >- } >- if (adm->SetMicrophoneVolume(max_vol) != 0) { >- fprintf(stderr, "Failed to set mic volume.\n"); >- return 1; >- } >- >- return 0; >-} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/frame_analyzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/frame_analyzer.cc >index 89440ea9845a25ef15898811db67d301affbcaf1..fb5f8de227afbc22949d05ddef4e0aeaca049099 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/frame_analyzer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/frame_analyzer.cc >@@ -15,7 +15,10 @@ > #include <string> > #include <vector> > >+#include "rtc_base/strings/string_builder.h" > #include "rtc_base/stringutils.h" >+#include "rtc_tools/frame_analyzer/video_color_aligner.h" >+#include 
"rtc_tools/frame_analyzer/video_geometry_aligner.h" > #include "rtc_tools/frame_analyzer/video_quality_analysis.h" > #include "rtc_tools/frame_analyzer/video_temporal_aligner.h" > #include "rtc_tools/simple_command_line_parser.h" >@@ -133,8 +136,35 @@ int main(int argc, char* argv[]) { > const std::vector<size_t> matching_indices = > webrtc::test::FindMatchingFrameIndices(reference_video, test_video); > >- results.frames = >- webrtc::test::RunAnalysis(reference_video, test_video, matching_indices); >+ // Align the reference video both temporally and geometrically. I.e. align the >+ // frames to match up in order to the test video, and align a crop region of >+ // the reference video to match up to the test video. >+ const rtc::scoped_refptr<webrtc::test::Video> aligned_reference_video = >+ AdjustCropping(ReorderVideo(reference_video, matching_indices), >+ test_video); >+ >+ // Calculate if there is any systematic color difference between the reference >+ // and test video. >+ const webrtc::test::ColorTransformationMatrix color_transformation = >+ CalculateColorTransformationMatrix(aligned_reference_video, test_video); >+ >+ char buf[256]; >+ rtc::SimpleStringBuilder string_builder(buf); >+ for (int i = 0; i < 3; ++i) { >+ string_builder << "\n"; >+ for (int j = 0; j < 4; ++j) >+ string_builder.AppendFormat("%6.2f ", color_transformation[i][j]); >+ } >+ printf("Adjusting test video with color transformation: %s\n", >+ string_builder.str()); >+ >+ // Adjust all frames in the test video with the calculated color >+ // transformation. 
>+ const rtc::scoped_refptr<webrtc::test::Video> color_adjusted_test_video = >+ AdjustColors(color_transformation, test_video); >+ >+ results.frames = webrtc::test::RunAnalysis( >+ aligned_reference_video, color_adjusted_test_video, matching_indices); > > const std::vector<webrtc::test::Cluster> clusters = > webrtc::test::CalculateFrameClusters(matching_indices); >@@ -151,20 +181,17 @@ int main(int argc, char* argv[]) { > if (!chartjson_result_file.empty()) { > webrtc::test::WritePerfResults(chartjson_result_file); > } >- rtc::scoped_refptr<webrtc::test::Video> reordered_video = >- webrtc::test::GenerateAlignedReferenceVideo(reference_video, >- matching_indices); > std::string aligned_output_file = parser.GetFlag("aligned_output_file"); > if (!aligned_output_file.empty()) { >- webrtc::test::WriteVideoToFile(reordered_video, aligned_output_file, >+ webrtc::test::WriteVideoToFile(aligned_reference_video, aligned_output_file, > /*fps=*/30); > } > std::string yuv_directory = parser.GetFlag("yuv_directory"); > if (!yuv_directory.empty()) { >- webrtc::test::WriteVideoToFile(reordered_video, >+ webrtc::test::WriteVideoToFile(aligned_reference_video, > JoinFilename(yuv_directory, "ref.yuv"), > /*fps=*/30); >- webrtc::test::WriteVideoToFile(test_video, >+ webrtc::test::WriteVideoToFile(color_adjusted_test_video, > JoinFilename(yuv_directory, "test.yuv"), > /*fps=*/30); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/linear_least_squares.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/linear_least_squares.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..706d212d174abe9d76ce20039a0ba83e3fe842dd >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/linear_least_squares.cc >@@ -0,0 +1,200 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. 
>+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "rtc_tools/frame_analyzer/linear_least_squares.h" >+ >+#include <numeric> >+#include <utility> >+ >+#include "rtc_base/checks.h" >+#include "rtc_base/logging.h" >+ >+namespace webrtc { >+namespace test { >+ >+template <class T> >+using Matrix = std::valarray<std::valarray<T>>; >+ >+namespace { >+ >+template <typename R, typename T> >+R DotProduct(const std::valarray<T>& a, const std::valarray<T>& b) { >+ RTC_CHECK_EQ(a.size(), b.size()); >+ return std::inner_product(std::begin(a), std::end(a), std::begin(b), R(0)); >+} >+ >+// Calculates a^T * b. >+template <typename R, typename T> >+Matrix<R> MatrixMultiply(const Matrix<T>& a, const Matrix<T>& b) { >+ Matrix<R> result(std::valarray<R>(a.size()), b.size()); >+ for (size_t i = 0; i < a.size(); ++i) { >+ for (size_t j = 0; j < b.size(); ++j) >+ result[j][i] = DotProduct<R>(a[i], b[j]); >+ } >+ >+ return result; >+} >+ >+template <typename T> >+Matrix<T> Transpose(const Matrix<T>& matrix) { >+ if (matrix.size() == 0) >+ return Matrix<T>(); >+ const size_t rows = matrix.size(); >+ const size_t columns = matrix[0].size(); >+ Matrix<T> result(std::valarray<T>(rows), columns); >+ >+ for (size_t i = 0; i < rows; ++i) { >+ for (size_t j = 0; j < columns; ++j) >+ result[j][i] = matrix[i][j]; >+ } >+ >+ return result; >+} >+ >+// Convert valarray from type T to type R. >+template <typename R, typename T> >+std::valarray<R> ConvertTo(const std::valarray<T>& v) { >+ std::valarray<R> result(v.size()); >+ for (size_t i = 0; i < v.size(); ++i) >+ result[i] = static_cast<R>(v[i]); >+ return result; >+} >+ >+// Convert valarray Matrix from type T to type R. 
>+template <typename R, typename T> >+Matrix<R> ConvertTo(const Matrix<T>& mat) { >+ Matrix<R> result(mat.size()); >+ for (size_t i = 0; i < mat.size(); ++i) >+ result[i] = ConvertTo<R>(mat[i]); >+ return result; >+} >+ >+// Convert from valarray Matrix back to the more conventional std::vector. >+template <typename T> >+std::vector<std::vector<T>> ToVectorMatrix(const Matrix<T>& m) { >+ std::vector<std::vector<T>> result; >+ for (const std::valarray<T>& v : m) >+ result.emplace_back(std::begin(v), std::end(v)); >+ return result; >+} >+ >+// Create a valarray Matrix from a conventional std::vector. >+template <typename T> >+Matrix<T> FromVectorMatrix(const std::vector<std::vector<T>>& mat) { >+ Matrix<T> result(mat.size()); >+ for (size_t i = 0; i < mat.size(); ++i) >+ result[i] = std::valarray<T>(mat[i].data(), mat[i].size()); >+ return result; >+} >+ >+// Returns |matrix_to_invert|^-1 * |right_hand_matrix|. |matrix_to_invert| must >+// have square size. >+Matrix<double> GaussianElimination(Matrix<double> matrix_to_invert, >+ Matrix<double> right_hand_matrix) { >+ // |n| is the width/height of |matrix_to_invert|. >+ const size_t n = matrix_to_invert.size(); >+ // Make sure |matrix_to_invert| has square size. >+ for (const std::valarray<double>& column : matrix_to_invert) >+ RTC_CHECK_EQ(n, column.size()); >+ // Make sure |right_hand_matrix| has correct size. >+ for (const std::valarray<double>& column : right_hand_matrix) >+ RTC_CHECK_EQ(n, column.size()); >+ >+ // Transpose the matrices before and after so that we can perform Gaussian >+ // elimination on the columns instead of the rows, since that is easier with >+ // our representation. >+ matrix_to_invert = Transpose(matrix_to_invert); >+ right_hand_matrix = Transpose(right_hand_matrix); >+ >+ // Loop over the diagonal of |matrix_to_invert| and perform column reduction. 
>+ // Column reduction is a sequence of elementary column operations that is
>+ // performed on both |matrix_to_invert| and |right_hand_matrix| until
>+ // |matrix_to_invert| has been transformed to the identity matrix.
>+ for (size_t diagonal_index = 0; diagonal_index < n; ++diagonal_index) {
>+ // Make sure the diagonal element has the highest absolute value by
>+ // swapping columns if necessary.
>+ for (size_t column = diagonal_index + 1; column < n; ++column) {
>+ if (std::abs(matrix_to_invert[column][diagonal_index]) >
>+ std::abs(matrix_to_invert[diagonal_index][diagonal_index])) {
>+ std::swap(matrix_to_invert[column], matrix_to_invert[diagonal_index]);
>+ std::swap(right_hand_matrix[column], right_hand_matrix[diagonal_index]);
>+ }
>+ }
>+
>+ // Reduce the diagonal element to 1 by dividing the column by that value.
>+ // If the diagonal element is 0, the system of equations has many
>+ // solutions, and in that case we will return an arbitrary solution.
>+ if (matrix_to_invert[diagonal_index][diagonal_index] == 0.0) {
>+ RTC_LOG(LS_WARNING) << "Matrix is not invertible, ignoring.";
>+ continue;
>+ }
>+ const double diagonal_element =
>+ matrix_to_invert[diagonal_index][diagonal_index];
>+ matrix_to_invert[diagonal_index] /= diagonal_element;
>+ right_hand_matrix[diagonal_index] /= diagonal_element;
>+
>+ // Eliminate the other entries in row |diagonal_index| by making them zero.
>+ for (size_t column = 0; column < n; ++column) {
>+ if (column == diagonal_index)
>+ continue;
>+ const double row_element = matrix_to_invert[column][diagonal_index];
>+ matrix_to_invert[column] -=
>+ row_element * matrix_to_invert[diagonal_index];
>+ right_hand_matrix[column] -=
>+ row_element * right_hand_matrix[diagonal_index];
>+ }
>+ }
>+
>+ // Transpose the result before returning it, as explained in the comment
>+ // above.
>+ return Transpose(right_hand_matrix); >+} >+ >+} // namespace >+ >+IncrementalLinearLeastSquares::IncrementalLinearLeastSquares() = default; >+IncrementalLinearLeastSquares::~IncrementalLinearLeastSquares() = default; >+ >+void IncrementalLinearLeastSquares::AddObservations( >+ const std::vector<std::vector<uint8_t>>& x, >+ const std::vector<std::vector<uint8_t>>& y) { >+ if (x.empty() || y.empty()) >+ return; >+ // Make sure all columns are the same size. >+ const size_t n = x[0].size(); >+ for (const std::vector<uint8_t>& column : x) >+ RTC_CHECK_EQ(n, column.size()); >+ for (const std::vector<uint8_t>& column : y) >+ RTC_CHECK_EQ(n, column.size()); >+ >+ // We will multiply the uint8_t values together, so we need to expand to a >+ // type that can safely store those values, i.e. uint16_t. >+ const Matrix<uint16_t> unpacked_x = ConvertTo<uint16_t>(FromVectorMatrix(x)); >+ const Matrix<uint16_t> unpacked_y = ConvertTo<uint16_t>(FromVectorMatrix(y)); >+ >+ const Matrix<uint64_t> xx = MatrixMultiply<uint64_t>(unpacked_x, unpacked_x); >+ const Matrix<uint64_t> xy = MatrixMultiply<uint64_t>(unpacked_x, unpacked_y); >+ if (sum_xx && sum_xy) { >+ *sum_xx += xx; >+ *sum_xy += xy; >+ } else { >+ sum_xx = xx; >+ sum_xy = xy; >+ } >+} >+ >+std::vector<std::vector<double>> >+IncrementalLinearLeastSquares::GetBestSolution() const { >+ RTC_CHECK(sum_xx && sum_xy) << "No observations have been added"; >+ return ToVectorMatrix(GaussianElimination(ConvertTo<double>(*sum_xx), >+ ConvertTo<double>(*sum_xy))); >+} >+ >+} // namespace test >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/linear_least_squares.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/linear_least_squares.h >new file mode 100644 >index 0000000000000000000000000000000000000000..1b07dc1bde640afb792f1dc7625b3e63d8b582b3 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/linear_least_squares.h >@@ 
-0,0 +1,53 @@
>+/*
>+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved.
>+ *
>+ * Use of this source code is governed by a BSD-style license
>+ * that can be found in the LICENSE file in the root of the source
>+ * tree. An additional intellectual property rights grant can be found
>+ * in the file PATENTS. All contributing project authors may
>+ * be found in the AUTHORS file in the root of the source tree.
>+ */
>+
>+#ifndef RTC_TOOLS_FRAME_ANALYZER_LINEAR_LEAST_SQUARES_H_
>+#define RTC_TOOLS_FRAME_ANALYZER_LINEAR_LEAST_SQUARES_H_
>+
>+#include <valarray>
>+#include <vector>
>+#include "absl/types/optional.h"
>+
>+namespace webrtc {
>+namespace test {
>+
>+// This class is used for finding a matrix b that roughly solves the equation:
>+// y = x * b. This is generally impossible to do exactly, so the problem is
>+// rephrased as finding the matrix b that minimizes the difference:
>+// |y - x * b|^2. Calling AddObservations() multiple times is equivalent to
>+// concatenating the observation vectors and calling AddObservations() once.
>+// The reason for doing it incrementally is that we can't store the raw YUV
>+// values for a whole video file in memory at once. This class has a constant
>+// memory footprint, regardless of how many times AddObservations() is called.
>+class IncrementalLinearLeastSquares {
>+ public:
>+ IncrementalLinearLeastSquares();
>+ ~IncrementalLinearLeastSquares();
>+
>+ // Add a number of observations. The subvectors of x and y must have the same
>+ // length.
>+ void AddObservations(const std::vector<std::vector<uint8_t>>& x,
>+ const std::vector<std::vector<uint8_t>>& y);
>+
>+ // Calculate and return the best linear solution, given the observations so
>+ // far.
>+ std::vector<std::vector<double>> GetBestSolution() const;
>+
>+ private:
>+ // Running sum of x^T * x.
>+ absl::optional<std::valarray<std::valarray<uint64_t>>> sum_xx;
>+ // Running sum of x^T * y.
>+ absl::optional<std::valarray<std::valarray<uint64_t>>> sum_xy; >+}; >+ >+} // namespace test >+} // namespace webrtc >+ >+#endif // RTC_TOOLS_FRAME_ANALYZER_LINEAR_LEAST_SQUARES_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/linear_least_squares_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/linear_least_squares_unittest.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..b074aacce856a8ff639696b8860624a9ed99c7cf >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/linear_least_squares_unittest.cc >@@ -0,0 +1,91 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "rtc_tools/frame_analyzer/linear_least_squares.h" >+ >+#include "test/gtest.h" >+ >+namespace webrtc { >+namespace test { >+ >+TEST(LinearLeastSquares, ScalarIdentityOneObservation) { >+ IncrementalLinearLeastSquares lls; >+ lls.AddObservations({{1}}, {{1}}); >+ EXPECT_EQ(std::vector<std::vector<double>>({{1.0}}), lls.GetBestSolution()); >+} >+ >+TEST(LinearLeastSquares, ScalarIdentityTwoObservationsOneCall) { >+ IncrementalLinearLeastSquares lls; >+ lls.AddObservations({{1, 2}}, {{1, 2}}); >+ EXPECT_EQ(std::vector<std::vector<double>>({{1.0}}), lls.GetBestSolution()); >+} >+ >+TEST(LinearLeastSquares, ScalarIdentityTwoObservationsTwoCalls) { >+ IncrementalLinearLeastSquares lls; >+ lls.AddObservations({{1}}, {{1}}); >+ lls.AddObservations({{2}}, {{2}}); >+ EXPECT_EQ(std::vector<std::vector<double>>({{1.0}}), lls.GetBestSolution()); >+} >+ >+TEST(LinearLeastSquares, MatrixIdentityOneObservation) { >+ IncrementalLinearLeastSquares lls; >+ lls.AddObservations({{1, 2}, {3, 4}}, {{1, 2}, {3, 4}}); >+ EXPECT_EQ(std::vector<std::vector<double>>({{1.0, 0.0}, {0.0, 1.0}}), >+ lls.GetBestSolution()); >+} >+ >+TEST(LinearLeastSquares, MatrixManyObservations) { >+ IncrementalLinearLeastSquares lls; >+ // Test that we can find the solution of the overspecified equation system: >+ // [1, 2] [1, 3] = [5, 11] >+ // [3, 4] [2, 4] [11, 25] >+ // [5, 6] [17, 39] >+ lls.AddObservations({{1}, {2}}, {{5}, {11}}); >+ lls.AddObservations({{3}, {4}}, {{11}, {25}}); >+ lls.AddObservations({{5}, {6}}, {{17}, {39}}); >+ >+ const std::vector<std::vector<double>> result = lls.GetBestSolution(); >+ // We allow some numerical flexibility here. 
>+ EXPECT_DOUBLE_EQ(1.0, result[0][0]); >+ EXPECT_DOUBLE_EQ(2.0, result[0][1]); >+ EXPECT_DOUBLE_EQ(3.0, result[1][0]); >+ EXPECT_DOUBLE_EQ(4.0, result[1][1]); >+} >+ >+TEST(LinearLeastSquares, MatrixVectorOneObservation) { >+ IncrementalLinearLeastSquares lls; >+ // Test that we can find the solution of the overspecified equation system: >+ // [1, 2] [1] = [5] >+ // [3, 4] [2] [11] >+ // [5, 6] [17] >+ lls.AddObservations({{1, 3, 5}, {2, 4, 6}}, {{5, 11, 17}}); >+ >+ const std::vector<std::vector<double>> result = lls.GetBestSolution(); >+ // We allow some numerical flexibility here. >+ EXPECT_DOUBLE_EQ(1.0, result[0][0]); >+ EXPECT_DOUBLE_EQ(2.0, result[0][1]); >+} >+ >+TEST(LinearLeastSquares, LinearLeastSquaresNonPerfectSolution) { >+ IncrementalLinearLeastSquares lls; >+ // Test that we can find the non-perfect solution of the overspecified >+ // equation system: >+ // [1] [20] = [21] >+ // [2] [39] >+ // [3] [60] >+ // [2] [41] >+ // [1] [19] >+ lls.AddObservations({{1, 2, 3, 2, 1}}, {{21, 39, 60, 41, 19}}); >+ >+ EXPECT_DOUBLE_EQ(20.0, lls.GetBestSolution()[0][0]); >+} >+ >+} // namespace test >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_color_aligner.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_color_aligner.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..7afb1e46950f9921f900aaa6a1d189e0a2a5d22b >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_color_aligner.cc >@@ -0,0 +1,239 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */
>+
>+#include "rtc_tools/frame_analyzer/video_color_aligner.h"
>+
>+#include <algorithm>
>+#include <cmath>
>+#include <deque>
>+#include <limits>
>+#include <vector>
>+
>+#include "api/array_view.h"
>+#include "api/video/i420_buffer.h"
>+#include "rtc_base/checks.h"
>+#include "rtc_base/logging.h"
>+#include "rtc_base/refcountedobject.h"
>+#include "rtc_tools/frame_analyzer/linear_least_squares.h"
>+#include "rtc_tools/frame_analyzer/video_quality_analysis.h"
>+#include "third_party/libyuv/include/libyuv/compare.h"
>+#include "third_party/libyuv/include/libyuv/planar_functions.h"
>+#include "third_party/libyuv/include/libyuv/scale.h"
>+
>+namespace webrtc {
>+namespace test {
>+
>+namespace {
>+
>+// Helper function for AdjustColors(). This function calculates a single output
>+// row for y with the given color coefficients. The u/v channels are assumed to
>+// be subsampled by a factor of 2, which is the case for I420.
>+void CalculateYChannel(rtc::ArrayView<const uint8_t> y_data,
>+ rtc::ArrayView<const uint8_t> u_data,
>+ rtc::ArrayView<const uint8_t> v_data,
>+ const std::array<float, 4>& coeff,
>+ rtc::ArrayView<uint8_t> output) {
>+ RTC_CHECK_EQ(y_data.size(), output.size());
>+ // Each u/v element represents two y elements. Make sure we have enough to
>+ // cover the Y values.
>+ RTC_CHECK_GE(u_data.size() * 2, y_data.size());
>+ RTC_CHECK_GE(v_data.size() * 2, y_data.size());
>+
>+ // Do two pixels at a time since u/v are subsampled.
>+ for (size_t i = 0; i * 2 < y_data.size() - 1; ++i) {
>+ const float uv_contribution =
>+ coeff[1] * u_data[i] + coeff[2] * v_data[i] + coeff[3];
>+
>+ const float val0 = coeff[0] * y_data[i * 2 + 0] + uv_contribution;
>+ const float val1 = coeff[0] * y_data[i * 2 + 1] + uv_contribution;
>+
>+ // Clamp result to a byte.
>+ output[i * 2 + 0] = static_cast<uint8_t>(
>+ std::round(std::max(0.0f, std::min(val0, 255.0f))));
>+ output[i * 2 + 1] = static_cast<uint8_t>(
>+ std::round(std::max(0.0f, std::min(val1, 255.0f))));
>+ }
>+
>+ // Handle the last pixel for odd widths.
>+ if (y_data.size() % 2 == 1) {
>+ const float val = coeff[0] * y_data[y_data.size() - 1] +
>+ coeff[1] * u_data[(y_data.size() - 1) / 2] +
>+ coeff[2] * v_data[(y_data.size() - 1) / 2] + coeff[3];
>+ output[y_data.size() - 1] =
>+ static_cast<uint8_t>(std::round(std::max(0.0f, std::min(val, 255.0f))));
>+ }
>+}
>+
>+// Helper function for AdjustColors(). This function calculates a single output
>+// row for either u or v, with the given color coefficients. Y, U, and V are
>+// assumed to be the same size, i.e. no subsampling.
>+void CalculateUVChannel(rtc::ArrayView<const uint8_t> y_data,
>+ rtc::ArrayView<const uint8_t> u_data,
>+ rtc::ArrayView<const uint8_t> v_data,
>+ const std::array<float, 4>& coeff,
>+ rtc::ArrayView<uint8_t> output) {
>+ RTC_CHECK_EQ(y_data.size(), u_data.size());
>+ RTC_CHECK_EQ(y_data.size(), v_data.size());
>+ RTC_CHECK_EQ(y_data.size(), output.size());
>+
>+ for (size_t x = 0; x < y_data.size(); ++x) {
>+ const float val = coeff[0] * y_data[x] + coeff[1] * u_data[x] +
>+ coeff[2] * v_data[x] + coeff[3];
>+ // Clamp result to a byte.
>+ output[x] =
>+ static_cast<uint8_t>(std::round(std::max(0.0f, std::min(val, 255.0f))));
>+ }
>+}
>+
>+// Convert a frame to four vectors consisting of [y, u, v, 1].
>+std::vector<std::vector<uint8_t>> FlattenYuvData(
>+ const rtc::scoped_refptr<I420BufferInterface>& frame) {
>+ std::vector<std::vector<uint8_t>> result(
>+ 4, std::vector<uint8_t>(frame->ChromaWidth() * frame->ChromaHeight()));
>+
>+ // Downscale the Y plane so that all YUV planes are the same size.
>+ libyuv::ScalePlane(frame->DataY(), frame->StrideY(), frame->width(), >+ frame->height(), result[0].data(), frame->ChromaWidth(), >+ frame->ChromaWidth(), frame->ChromaHeight(), >+ libyuv::kFilterBox); >+ >+ libyuv::CopyPlane(frame->DataU(), frame->StrideU(), result[1].data(), >+ frame->ChromaWidth(), frame->ChromaWidth(), >+ frame->ChromaHeight()); >+ >+ libyuv::CopyPlane(frame->DataV(), frame->StrideV(), result[2].data(), >+ frame->ChromaWidth(), frame->ChromaWidth(), >+ frame->ChromaHeight()); >+ >+ std::fill(result[3].begin(), result[3].end(), 1u); >+ >+ return result; >+} >+ >+ColorTransformationMatrix VectorToColorMatrix( >+ const std::vector<std::vector<double>>& v) { >+ ColorTransformationMatrix color_transformation; >+ for (int i = 0; i < 3; ++i) { >+ for (int j = 0; j < 4; ++j) >+ color_transformation[i][j] = v[i][j]; >+ } >+ return color_transformation; >+} >+ >+} // namespace >+ >+ColorTransformationMatrix CalculateColorTransformationMatrix( >+ const rtc::scoped_refptr<I420BufferInterface>& reference_frame, >+ const rtc::scoped_refptr<I420BufferInterface>& test_frame) { >+ IncrementalLinearLeastSquares incremental_lls; >+ incremental_lls.AddObservations(FlattenYuvData(test_frame), >+ FlattenYuvData(reference_frame)); >+ return VectorToColorMatrix(incremental_lls.GetBestSolution()); >+} >+ >+ColorTransformationMatrix CalculateColorTransformationMatrix( >+ const rtc::scoped_refptr<Video>& reference_video, >+ const rtc::scoped_refptr<Video>& test_video) { >+ RTC_CHECK_GE(reference_video->number_of_frames(), >+ test_video->number_of_frames()); >+ >+ IncrementalLinearLeastSquares incremental_lls; >+ for (size_t i = 0; i < test_video->number_of_frames(); ++i) { >+ incremental_lls.AddObservations( >+ FlattenYuvData(test_video->GetFrame(i)), >+ FlattenYuvData(reference_video->GetFrame(i))); >+ } >+ >+ return VectorToColorMatrix(incremental_lls.GetBestSolution()); >+} >+ >+rtc::scoped_refptr<Video> AdjustColors( >+ const ColorTransformationMatrix& 
color_transformation, >+ const rtc::scoped_refptr<Video>& video) { >+ class ColorAdjustedVideo : public rtc::RefCountedObject<Video> { >+ public: >+ ColorAdjustedVideo(const ColorTransformationMatrix& color_transformation, >+ const rtc::scoped_refptr<Video>& video) >+ : color_transformation_(color_transformation), video_(video) {} >+ >+ int width() const override { return video_->width(); } >+ int height() const override { return video_->height(); } >+ size_t number_of_frames() const override { >+ return video_->number_of_frames(); >+ } >+ >+ rtc::scoped_refptr<I420BufferInterface> GetFrame( >+ size_t index) const override { >+ return AdjustColors(color_transformation_, video_->GetFrame(index)); >+ } >+ >+ private: >+ const ColorTransformationMatrix color_transformation_; >+ const rtc::scoped_refptr<Video> video_; >+ }; >+ >+ return new ColorAdjustedVideo(color_transformation, video); >+} >+ >+rtc::scoped_refptr<I420BufferInterface> AdjustColors( >+ const ColorTransformationMatrix& color_matrix, >+ const rtc::scoped_refptr<I420BufferInterface>& frame) { >+ // Allocate I420 buffer that will hold the color adjusted frame. >+ rtc::scoped_refptr<I420Buffer> adjusted_frame = >+ I420Buffer::Create(frame->width(), frame->height()); >+ >+ // Create a downscaled Y plane with the same size as the U/V planes to >+ // simplify converting the U/V planes. >+ std::vector<uint8_t> downscaled_y_plane(frame->ChromaWidth() * >+ frame->ChromaHeight()); >+ libyuv::ScalePlane(frame->DataY(), frame->StrideY(), frame->width(), >+ frame->height(), downscaled_y_plane.data(), >+ frame->ChromaWidth(), frame->ChromaWidth(), >+ frame->ChromaHeight(), libyuv::kFilterBox); >+ >+ // Fill in the adjusted data row by row. 
>+ for (int y = 0; y < frame->height(); ++y) { >+ const int half_y = y / 2; >+ rtc::ArrayView<const uint8_t> y_row(frame->DataY() + frame->StrideY() * y, >+ frame->width()); >+ rtc::ArrayView<const uint8_t> u_row( >+ frame->DataU() + frame->StrideU() * half_y, frame->ChromaWidth()); >+ rtc::ArrayView<const uint8_t> v_row( >+ frame->DataV() + frame->StrideV() * half_y, frame->ChromaWidth()); >+ rtc::ArrayView<uint8_t> output_y_row( >+ adjusted_frame->MutableDataY() + adjusted_frame->StrideY() * y, >+ frame->width()); >+ >+ CalculateYChannel(y_row, u_row, v_row, color_matrix[0], output_y_row); >+ >+ // Chroma channels only exist every second row for I420. >+ if (y % 2 == 0) { >+ rtc::ArrayView<const uint8_t> downscaled_y_row( >+ downscaled_y_plane.data() + frame->ChromaWidth() * half_y, >+ frame->ChromaWidth()); >+ rtc::ArrayView<uint8_t> output_u_row( >+ adjusted_frame->MutableDataU() + adjusted_frame->StrideU() * half_y, >+ frame->ChromaWidth()); >+ rtc::ArrayView<uint8_t> output_v_row( >+ adjusted_frame->MutableDataV() + adjusted_frame->StrideV() * half_y, >+ frame->ChromaWidth()); >+ >+ CalculateUVChannel(downscaled_y_row, u_row, v_row, color_matrix[1], >+ output_u_row); >+ CalculateUVChannel(downscaled_y_row, u_row, v_row, color_matrix[2], >+ output_v_row); >+ } >+ } >+ >+ return adjusted_frame; >+} >+ >+} // namespace test >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_color_aligner.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_color_aligner.h >new file mode 100644 >index 0000000000000000000000000000000000000000..997cbf65829f6826f6e21cb8291c336bfb7a4dae >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_color_aligner.h >@@ -0,0 +1,49 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. 
>+ *
>+ * Use of this source code is governed by a BSD-style license
>+ * that can be found in the LICENSE file in the root of the source
>+ * tree. An additional intellectual property rights grant can be found
>+ * in the file PATENTS. All contributing project authors may
>+ * be found in the AUTHORS file in the root of the source tree.
>+ */
>+
>+#ifndef RTC_TOOLS_FRAME_ANALYZER_VIDEO_COLOR_ALIGNER_H_
>+#define RTC_TOOLS_FRAME_ANALYZER_VIDEO_COLOR_ALIGNER_H_
>+
>+#include <array>
>+
>+#include "rtc_tools/video_file_reader.h"
>+
>+namespace webrtc {
>+namespace test {
>+
>+// Represents a linear color transformation from [y, u, v] to [y', u', v']
>+// through the equation: [y', u', v'] = [y, u, v, 1] * matrix.
>+using ColorTransformationMatrix = std::array<std::array<float, 4>, 3>;
>+
>+// Calculate the optimal color transformation that should be applied to the
>+// test video so that it matches the reference video as closely as possible.
>+ColorTransformationMatrix CalculateColorTransformationMatrix(
>+ const rtc::scoped_refptr<Video>& reference_video,
>+ const rtc::scoped_refptr<Video>& test_video);
>+
>+// Calculate color transformation for a single I420 frame.
>+ColorTransformationMatrix CalculateColorTransformationMatrix(
>+ const rtc::scoped_refptr<I420BufferInterface>& reference_frame,
>+ const rtc::scoped_refptr<I420BufferInterface>& test_frame);
>+
>+// Apply a color transformation to a video.
>+rtc::scoped_refptr<Video> AdjustColors(
>+ const ColorTransformationMatrix& color_matrix,
>+ const rtc::scoped_refptr<Video>& video);
>+
>+// Apply a color transformation to a single I420 frame.
>+rtc::scoped_refptr<I420BufferInterface> AdjustColors( >+ const ColorTransformationMatrix& color_matrix, >+ const rtc::scoped_refptr<I420BufferInterface>& frame); >+ >+} // namespace test >+} // namespace webrtc >+ >+#endif // RTC_TOOLS_FRAME_ANALYZER_VIDEO_COLOR_ALIGNER_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_color_aligner_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_color_aligner_unittest.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..520d6f459e7052b5fadb0a4aabf9dfa465acc90e >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_color_aligner_unittest.cc >@@ -0,0 +1,174 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "rtc_tools/frame_analyzer/video_color_aligner.h" >+ >+#include "rtc_tools/frame_analyzer/video_quality_analysis.h" >+#include "rtc_tools/video_file_reader.h" >+#include "test/gtest.h" >+#include "test/testsupport/fileutils.h" >+#include "third_party/libyuv/include/libyuv/scale.h" >+ >+namespace webrtc { >+namespace test { >+ >+namespace { >+ >+const ColorTransformationMatrix kIdentityColorMatrix = { >+ {{1, 0, 0, 0}, {0, 1, 0, 0}, {0, 0, 1, 0}}}; >+ >+void ExpectNear(const ColorTransformationMatrix& expected, >+ const ColorTransformationMatrix& actual) { >+ // The scaling factor on y/u/v should be pretty precise. 
>+ for (int i = 0; i < 3; ++i) { >+ for (int j = 0; j < 3; ++j) >+ EXPECT_NEAR(expected[i][j], actual[i][j], /* abs_error= */ 1.0e-3) >+ << "at element i: " << i << ", j: " << j; >+ } >+ // The offset can be less precise since the range is [0, 255]. >+ for (int i = 0; i < 3; ++i) >+ EXPECT_NEAR(expected[i][3], actual[i][3], /* abs_error= */ 0.1) >+ << "at element i: " << i; >+} >+ >+} // namespace >+ >+class VideoColorAlignerTest : public ::testing::Test { >+ protected: >+ void SetUp() { >+ reference_video_ = >+ OpenYuvFile(ResourcePath("foreman_128x96", "yuv"), 128, 96); >+ ASSERT_TRUE(reference_video_); >+ } >+ >+ rtc::scoped_refptr<Video> reference_video_; >+}; >+ >+TEST_F(VideoColorAlignerTest, AdjustColorsFrameIdentity) { >+ const rtc::scoped_refptr<I420BufferInterface> test_frame = >+ reference_video_->GetFrame(0); >+ >+ // Assume perfect match, i.e. ssim == 1. >+ EXPECT_EQ(1.0, >+ Ssim(test_frame, AdjustColors(kIdentityColorMatrix, test_frame))); >+} >+ >+TEST_F(VideoColorAlignerTest, AdjustColorsFrame1x1) { >+ const ColorTransformationMatrix color_matrix = { >+ {{1, 2, 3, 4}, {5, 6, 7, 8}, {9, 10, 11, 12}}}; >+ >+ const uint8_t data_y[] = {2}; >+ const uint8_t data_u[] = {6}; >+ const uint8_t data_v[] = {7}; >+ const rtc::scoped_refptr<I420BufferInterface> i420_buffer = I420Buffer::Copy( >+ /* width= */ 1, /* height= */ 1, data_y, /* stride_y= */ 1, data_u, >+ /* stride_u= */ 1, data_v, /* stride_v= */ 1); >+ >+ const rtc::scoped_refptr<I420BufferInterface> adjusted_buffer = >+ AdjustColors(color_matrix, i420_buffer); >+ >+ EXPECT_EQ(2 * 1 + 6 * 2 + 7 * 3 + 4, adjusted_buffer->DataY()[0]); >+ EXPECT_EQ(2 * 5 + 6 * 6 + 7 * 7 + 8, adjusted_buffer->DataU()[0]); >+ EXPECT_EQ(2 * 9 + 6 * 10 + 7 * 11 + 12, adjusted_buffer->DataV()[0]); >+} >+ >+TEST_F(VideoColorAlignerTest, AdjustColorsFrame1x1Negative) { >+ const ColorTransformationMatrix color_matrix = { >+ {{-1, 0, 0, 255}, {0, -1, 0, 255}, {0, 0, -1, 255}}}; >+ >+ const uint8_t data_y[] = {2}; >+ const 
uint8_t data_u[] = {6}; >+ const uint8_t data_v[] = {7}; >+ const rtc::scoped_refptr<I420BufferInterface> i420_buffer = I420Buffer::Copy( >+ /* width= */ 1, /* height= */ 1, data_y, /* stride_y= */ 1, data_u, >+ /* stride_u= */ 1, data_v, /* stride_v= */ 1); >+ >+ const rtc::scoped_refptr<I420BufferInterface> adjusted_buffer = >+ AdjustColors(color_matrix, i420_buffer); >+ >+ EXPECT_EQ(255 - 2, adjusted_buffer->DataY()[0]); >+ EXPECT_EQ(255 - 6, adjusted_buffer->DataU()[0]); >+ EXPECT_EQ(255 - 7, adjusted_buffer->DataV()[0]); >+} >+ >+TEST_F(VideoColorAlignerTest, AdjustColorsFrame2x2) { >+ const ColorTransformationMatrix color_matrix = { >+ {{1, 2, 3, 4}, {5, 6, 7, 8}, {9, 10, 11, 12}}}; >+ >+ const uint8_t data_y[] = {0, 1, 3, 4}; >+ const uint8_t data_u[] = {6}; >+ const uint8_t data_v[] = {7}; >+ const rtc::scoped_refptr<I420BufferInterface> i420_buffer = I420Buffer::Copy( >+ /* width= */ 2, /* height= */ 2, data_y, /* stride_y= */ 2, data_u, >+ /* stride_u= */ 1, data_v, /* stride_v= */ 1); >+ >+ const rtc::scoped_refptr<I420BufferInterface> adjusted_buffer = >+ AdjustColors(color_matrix, i420_buffer); >+ >+ EXPECT_EQ(0 * 1 + 6 * 2 + 7 * 3 + 4, adjusted_buffer->DataY()[0]); >+ EXPECT_EQ(1 * 1 + 6 * 2 + 7 * 3 + 4, adjusted_buffer->DataY()[1]); >+ EXPECT_EQ(3 * 1 + 6 * 2 + 7 * 3 + 4, adjusted_buffer->DataY()[2]); >+ EXPECT_EQ(4 * 1 + 6 * 2 + 7 * 3 + 4, adjusted_buffer->DataY()[3]); >+ >+ EXPECT_EQ(2 * 5 + 6 * 6 + 7 * 7 + 8, adjusted_buffer->DataU()[0]); >+ EXPECT_EQ(2 * 9 + 6 * 10 + 7 * 11 + 12, adjusted_buffer->DataV()[0]); >+} >+ >+TEST_F(VideoColorAlignerTest, CalculateColorTransformationMatrixIdentity) { >+ EXPECT_EQ(kIdentityColorMatrix, CalculateColorTransformationMatrix( >+ reference_video_, reference_video_)); >+} >+ >+TEST_F(VideoColorAlignerTest, CalculateColorTransformationMatrixOffset) { >+ const uint8_t small_data_y[] = {0, 1, 2, 3, 4, 5, 6, 7, >+ 8, 9, 10, 11, 12, 13, 14, 15}; >+ const uint8_t small_data_u[] = {15, 13, 17, 29}; >+ const uint8_t 
small_data_v[] = {3, 200, 170, 29}; >+ const rtc::scoped_refptr<I420BufferInterface> small_i420_buffer = >+ I420Buffer::Copy( >+ /* width= */ 4, /* height= */ 4, small_data_y, /* stride_y= */ 4, >+ small_data_u, /* stride_u= */ 2, small_data_v, /* stride_v= */ 2); >+ >+ uint8_t big_data_y[16]; >+ uint8_t big_data_u[4]; >+ uint8_t big_data_v[4]; >+ // Create another I420 frame where all values are 10 bigger. >+ for (int i = 0; i < 16; ++i) >+ big_data_y[i] = small_data_y[i] + 10; >+ for (int i = 0; i < 4; ++i) >+ big_data_u[i] = small_data_u[i] + 10; >+ for (int i = 0; i < 4; ++i) >+ big_data_v[i] = small_data_v[i] + 10; >+ >+ const rtc::scoped_refptr<I420BufferInterface> big_i420_buffer = >+ I420Buffer::Copy( >+ /* width= */ 4, /* height= */ 4, big_data_y, /* stride_y= */ 4, >+ big_data_u, /* stride_u= */ 2, big_data_v, /* stride_v= */ 2); >+ >+ const ColorTransformationMatrix color_matrix = >+ CalculateColorTransformationMatrix(big_i420_buffer, small_i420_buffer); >+ >+ ExpectNear({{{1, 0, 0, 10}, {0, 1, 0, 10}, {0, 0, 1, 10}}}, color_matrix); >+} >+ >+TEST_F(VideoColorAlignerTest, CalculateColorTransformationMatrix) { >+ // Arbitrary color transformation matrix. 
>+ const ColorTransformationMatrix org_color_matrix = { >+ {{0.8, 0.05, 0.04, -4}, {-0.2, 0.7, 0.1, 10}, {0.1, 0.2, 0.4, 20}}}; >+ >+ const ColorTransformationMatrix result_color_matrix = >+ CalculateColorTransformationMatrix( >+ AdjustColors(org_color_matrix, reference_video_), reference_video_); >+ >+ ExpectNear(org_color_matrix, result_color_matrix); >+} >+ >+} // namespace test >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_geometry_aligner.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_geometry_aligner.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..d3cbc051c9bf133c1fdf1e9cb7f52f506c06cc5f >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_geometry_aligner.cc >@@ -0,0 +1,178 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "rtc_tools/frame_analyzer/video_geometry_aligner.h" >+ >+#include <map> >+ >+#include "api/video/i420_buffer.h" >+#include "rtc_base/checks.h" >+#include "rtc_base/refcountedobject.h" >+#include "rtc_tools/frame_analyzer/video_quality_analysis.h" >+#include "third_party/libyuv/include/libyuv/scale.h" >+ >+namespace webrtc { >+namespace test { >+ >+namespace { >+ >+bool IsValidRegion(const CropRegion& region, >+ const rtc::scoped_refptr<I420BufferInterface>& frame) { >+ return region.left >= 0 && region.right >= 0 && region.top >= 0 && >+ region.bottom >= 0 && region.left + region.right < frame->width() && >+ region.top + region.bottom < frame->height(); >+} >+ >+} // namespace >+ >+rtc::scoped_refptr<I420BufferInterface> CropAndZoom( >+ const CropRegion& crop_region, >+ const rtc::scoped_refptr<I420BufferInterface>& frame) { >+ RTC_CHECK(IsValidRegion(crop_region, frame)); >+ >+ const int uv_crop_left = crop_region.left / 2; >+ const int uv_crop_top = crop_region.top / 2; >+ >+ const int cropped_width = >+ frame->width() - crop_region.left - crop_region.right; >+ const int cropped_height = >+ frame->height() - crop_region.top - crop_region.bottom; >+ >+ // Crop by only adjusting pointers. >+ const uint8_t* y_plane = >+ frame->DataY() + frame->StrideY() * crop_region.top + crop_region.left; >+ const uint8_t* u_plane = >+ frame->DataU() + frame->StrideU() * uv_crop_top + uv_crop_left; >+ const uint8_t* v_plane = >+ frame->DataV() + frame->StrideV() * uv_crop_top + uv_crop_left; >+ >+ // Stretch the cropped frame to the original size using libyuv. 
>+ rtc::scoped_refptr<I420Buffer> adjusted_frame = >+ I420Buffer::Create(frame->width(), frame->height()); >+ libyuv::I420Scale(y_plane, frame->StrideY(), u_plane, frame->StrideU(), >+ v_plane, frame->StrideV(), cropped_width, cropped_height, >+ adjusted_frame->MutableDataY(), adjusted_frame->StrideY(), >+ adjusted_frame->MutableDataU(), adjusted_frame->StrideU(), >+ adjusted_frame->MutableDataV(), adjusted_frame->StrideV(), >+ frame->width(), frame->height(), libyuv::kFilterBilinear); >+ >+ return adjusted_frame; >+} >+ >+CropRegion CalculateCropRegion( >+ const rtc::scoped_refptr<I420BufferInterface>& reference_frame, >+ const rtc::scoped_refptr<I420BufferInterface>& test_frame) { >+ RTC_CHECK_EQ(reference_frame->width(), test_frame->width()); >+ RTC_CHECK_EQ(reference_frame->height(), test_frame->height()); >+ >+ CropRegion best_region; >+ double best_ssim = Ssim(reference_frame, test_frame); >+ >+ typedef int CropRegion::*CropParameter; >+ CropParameter crop_parameters[4] = {&CropRegion::left, &CropRegion::top, >+ &CropRegion::right, &CropRegion::bottom}; >+ >+ while (true) { >+ // Find the parameter in which direction SSIM improves the most. >+ CropParameter best_parameter = nullptr; >+ const CropRegion prev_best_region = best_region; >+ >+ for (CropParameter crop_parameter : crop_parameters) { >+ CropRegion test_region = prev_best_region; >+ ++(test_region.*crop_parameter); >+ >+ if (!IsValidRegion(test_region, reference_frame)) >+ continue; >+ >+ const double ssim = >+ Ssim(CropAndZoom(test_region, reference_frame), test_frame); >+ >+ if (ssim > best_ssim) { >+ best_ssim = ssim; >+ best_parameter = crop_parameter; >+ best_region = test_region; >+ } >+ } >+ >+ // No improvement among any direction, stop iteration. >+ if (best_parameter == nullptr) >+ break; >+ >+ // Iterate in the best direction as long as it improves SSIM. 
>+ for (CropRegion test_region = best_region; >+ IsValidRegion(test_region, reference_frame); >+ ++(test_region.*best_parameter)) { >+ const double ssim = >+ Ssim(CropAndZoom(test_region, reference_frame), test_frame); >+ if (ssim <= best_ssim) >+ break; >+ >+ best_ssim = ssim; >+ best_region = test_region; >+ } >+ } >+ >+ return best_region; >+} >+ >+rtc::scoped_refptr<I420BufferInterface> AdjustCropping( >+ const rtc::scoped_refptr<I420BufferInterface>& reference_frame, >+ const rtc::scoped_refptr<I420BufferInterface>& test_frame) { >+ return CropAndZoom(CalculateCropRegion(reference_frame, test_frame), >+ reference_frame); >+} >+ >+rtc::scoped_refptr<Video> AdjustCropping( >+ const rtc::scoped_refptr<Video>& reference_video, >+ const rtc::scoped_refptr<Video>& test_video) { >+ class CroppedVideo : public rtc::RefCountedObject<Video> { >+ public: >+ CroppedVideo(const rtc::scoped_refptr<Video>& reference_video, >+ const rtc::scoped_refptr<Video>& test_video) >+ : reference_video_(reference_video), test_video_(test_video) { >+ RTC_CHECK_EQ(reference_video->number_of_frames(), >+ test_video->number_of_frames()); >+ RTC_CHECK_EQ(reference_video->width(), test_video->width()); >+ RTC_CHECK_EQ(reference_video->height(), test_video->height()); >+ } >+ >+ int width() const override { return test_video_->width(); } >+ int height() const override { return test_video_->height(); } >+ size_t number_of_frames() const override { >+ return test_video_->number_of_frames(); >+ } >+ >+ rtc::scoped_refptr<I420BufferInterface> GetFrame( >+ size_t index) const override { >+ const rtc::scoped_refptr<I420BufferInterface> reference_frame = >+ reference_video_->GetFrame(index); >+ >+ // Only calculate cropping region once per frame since it's expensive. 
>+ if (!crop_regions_.count(index)) {
>+ crop_regions_[index] =
>+ CalculateCropRegion(reference_frame, test_video_->GetFrame(index));
>+ }
>+
>+ return CropAndZoom(crop_regions_[index], reference_frame);
>+ }
>+
>+ private:
>+ const rtc::scoped_refptr<Video> reference_video_;
>+ const rtc::scoped_refptr<Video> test_video_;
>+ // Mutable since this is a cache that affects performance and not logical
>+ // behavior.
>+ mutable std::map<size_t, CropRegion> crop_regions_;
>+ };
>+
>+ return new CroppedVideo(reference_video, test_video);
>+}
>+
>+} // namespace test
>+} // namespace webrtc
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_geometry_aligner.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_geometry_aligner.h
>new file mode 100644
>index 0000000000000000000000000000000000000000..47667b0d135da263ee40d0002f0b4937f1da6e6a
>--- /dev/null
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_geometry_aligner.h
>@@ -0,0 +1,57 @@
>+/*
>+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved.
>+ *
>+ * Use of this source code is governed by a BSD-style license
>+ * that can be found in the LICENSE file in the root of the source
>+ * tree. An additional intellectual property rights grant can be found
>+ * in the file PATENTS. All contributing project authors may
>+ * be found in the AUTHORS file in the root of the source tree.
>+ */
>+
>+#ifndef RTC_TOOLS_FRAME_ANALYZER_VIDEO_GEOMETRY_ALIGNER_H_
>+#define RTC_TOOLS_FRAME_ANALYZER_VIDEO_GEOMETRY_ALIGNER_H_
>+
>+#include "api/video/video_frame_buffer.h"
>+#include "rtc_tools/video_file_reader.h"
>+
>+namespace webrtc {
>+namespace test {
>+
>+struct CropRegion {
>+ // Each value represents how much to crop from each side. Left is where x=0,
>+ // and top is where y=0. All values equal to zero represent no cropping.
>+ int left = 0; >+ int right = 0; >+ int top = 0; >+ int bottom = 0; >+}; >+ >+// Crops and zooms in on the cropped region so that the returned frame has the >+// same resolution as the input frame. >+rtc::scoped_refptr<I420BufferInterface> CropAndZoom( >+ const CropRegion& crop_region, >+ const rtc::scoped_refptr<I420BufferInterface>& frame); >+ >+// Calculate the optimal cropping region on the reference frame to maximize SSIM >+// to the test frame. >+CropRegion CalculateCropRegion( >+ const rtc::scoped_refptr<I420BufferInterface>& reference_frame, >+ const rtc::scoped_refptr<I420BufferInterface>& test_frame); >+ >+// Returns a cropped and zoomed version of the reference frame that matches up >+// to the test frame. This is a simple helper function on top of >+// CalculateCropRegion() and CropAndZoom(). >+rtc::scoped_refptr<I420BufferInterface> AdjustCropping( >+ const rtc::scoped_refptr<I420BufferInterface>& reference_frame, >+ const rtc::scoped_refptr<I420BufferInterface>& test_frame); >+ >+// Returns a cropped and zoomed version of the reference video that matches up >+// to the test video. Frames are individually adjusted for cropping. >+rtc::scoped_refptr<Video> AdjustCropping( >+ const rtc::scoped_refptr<Video>& reference_video, >+ const rtc::scoped_refptr<Video>& test_video); >+ >+} // namespace test >+} // namespace webrtc >+ >+#endif // RTC_TOOLS_FRAME_ANALYZER_VIDEO_GEOMETRY_ALIGNER_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_geometry_aligner_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_geometry_aligner_unittest.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..526f7772e1b92046e900dfb40a3afc6a5f7bd15b >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_geometry_aligner_unittest.cc >@@ -0,0 +1,150 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. 
>+ *
>+ * Use of this source code is governed by a BSD-style license
>+ * that can be found in the LICENSE file in the root of the source
>+ * tree. An additional intellectual property rights grant can be found
>+ * in the file PATENTS. All contributing project authors may
>+ * be found in the AUTHORS file in the root of the source tree.
>+ */
>+
>+#include "rtc_tools/frame_analyzer/video_geometry_aligner.h"
>+
>+#include <vector>
>+
>+#include "rtc_tools/frame_analyzer/video_quality_analysis.h"
>+#include "rtc_tools/video_file_reader.h"
>+#include "test/gtest.h"
>+#include "test/testsupport/fileutils.h"
>+
>+namespace webrtc {
>+namespace test {
>+
>+class VideoGeometryAlignerTest : public ::testing::Test {
>+ protected:
>+ void SetUp() {
>+ reference_video_ =
>+ OpenYuvFile(ResourcePath("foreman_128x96", "yuv"), 128, 96);
>+ ASSERT_TRUE(reference_video_);
>+
>+ // Very simple 4x4 frame used for verifying CropAndZoom.
>+ const uint8_t data_y[] = {0, 1, 2, 3, 4, 5, 6, 7,
>+ 8, 9, 10, 11, 12, 13, 14, 15};
>+ const uint8_t data_u[] = {0, 1, 2, 3};
>+ const uint8_t data_v[] = {0, 1, 2, 3};
>+ test_frame_ = I420Buffer::Copy(
>+ /* width= */ 4, /* height= */ 4, data_y, /* stride_y= */ 4, data_u,
>+ /* stride_u= */ 2, data_v, /* stride_v= */ 2);
>+ }
>+
>+ rtc::scoped_refptr<Video> reference_video_;
>+ rtc::scoped_refptr<I420BufferInterface> test_frame_;
>+};
>+
>+// Teach gtest how to compare CropRegions.
>+bool operator==(const CropRegion& a, const CropRegion& b) {
>+ return a.left == b.left && a.top == b.top && a.right == b.right &&
>+ a.bottom == b.bottom;
>+}
>+
>+TEST_F(VideoGeometryAlignerTest, CropAndZoomIdentity) {
>+ const rtc::scoped_refptr<I420BufferInterface> frame =
>+ reference_video_->GetFrame(0);
>+
>+ // Assume perfect match, i.e. SSIM == 1.
>+ CropRegion identity_region; >+ EXPECT_EQ(1.0, Ssim(frame, CropAndZoom(identity_region, frame))); >+} >+ >+TEST_F(VideoGeometryAlignerTest, CropAndZoomLeft) { >+ CropRegion region; >+ region.left = 2; >+ const rtc::scoped_refptr<I420BufferInterface> cropped_frame = >+ CropAndZoom(region, test_frame_); >+ EXPECT_EQ(std::vector<uint8_t>( >+ {2, 2, 3, 3, 6, 6, 7, 7, 10, 10, 11, 11, 14, 14, 15, 15}), >+ std::vector<uint8_t>(cropped_frame->DataY(), >+ cropped_frame->DataY() + 16)); >+ EXPECT_EQ( >+ std::vector<uint8_t>({1, 1, 3, 3}), >+ std::vector<uint8_t>(cropped_frame->DataU(), cropped_frame->DataU() + 4)); >+ EXPECT_EQ( >+ std::vector<uint8_t>({1, 1, 3, 3}), >+ std::vector<uint8_t>(cropped_frame->DataV(), cropped_frame->DataV() + 4)); >+} >+ >+TEST_F(VideoGeometryAlignerTest, CropAndZoomTop) { >+ CropRegion region; >+ region.top = 2; >+ const rtc::scoped_refptr<I420BufferInterface> cropped_frame = >+ CropAndZoom(region, test_frame_); >+ EXPECT_EQ(std::vector<uint8_t>( >+ {8, 9, 10, 11, 10, 11, 12, 13, 12, 13, 14, 15, 12, 13, 14, 15}), >+ std::vector<uint8_t>(cropped_frame->DataY(), >+ cropped_frame->DataY() + 16)); >+ EXPECT_EQ( >+ std::vector<uint8_t>({2, 3, 2, 3}), >+ std::vector<uint8_t>(cropped_frame->DataU(), cropped_frame->DataU() + 4)); >+ EXPECT_EQ( >+ std::vector<uint8_t>({2, 3, 2, 3}), >+ std::vector<uint8_t>(cropped_frame->DataV(), cropped_frame->DataV() + 4)); >+} >+ >+TEST_F(VideoGeometryAlignerTest, CropAndZoomRight) { >+ CropRegion region; >+ region.right = 2; >+ const rtc::scoped_refptr<I420BufferInterface> cropped_frame = >+ CropAndZoom(region, test_frame_); >+ EXPECT_EQ(std::vector<uint8_t>( >+ {0, 0, 1, 1, 4, 4, 5, 5, 8, 8, 9, 9, 12, 12, 13, 13}), >+ std::vector<uint8_t>(cropped_frame->DataY(), >+ cropped_frame->DataY() + 16)); >+ EXPECT_EQ( >+ std::vector<uint8_t>({0, 0, 2, 2}), >+ std::vector<uint8_t>(cropped_frame->DataU(), cropped_frame->DataU() + 4)); >+ EXPECT_EQ( >+ std::vector<uint8_t>({0, 0, 2, 2}), >+ 
std::vector<uint8_t>(cropped_frame->DataV(), cropped_frame->DataV() + 4)); >+} >+ >+TEST_F(VideoGeometryAlignerTest, CropAndZoomBottom) { >+ CropRegion region; >+ region.bottom = 2; >+ const rtc::scoped_refptr<I420BufferInterface> cropped_frame = >+ CropAndZoom(region, test_frame_); >+ EXPECT_EQ( >+ std::vector<uint8_t>({0, 1, 2, 3, 2, 3, 4, 5, 4, 5, 6, 7, 4, 5, 6, 7}), >+ std::vector<uint8_t>(cropped_frame->DataY(), >+ cropped_frame->DataY() + 16)); >+ EXPECT_EQ( >+ std::vector<uint8_t>({0, 1, 0, 1}), >+ std::vector<uint8_t>(cropped_frame->DataU(), cropped_frame->DataU() + 4)); >+ EXPECT_EQ( >+ std::vector<uint8_t>({0, 1, 0, 1}), >+ std::vector<uint8_t>(cropped_frame->DataV(), cropped_frame->DataV() + 4)); >+} >+ >+TEST_F(VideoGeometryAlignerTest, CalculateCropRegionIdentity) { >+ const rtc::scoped_refptr<I420BufferInterface> frame = >+ reference_video_->GetFrame(0); >+ CropRegion identity_region; >+ EXPECT_EQ(identity_region, CalculateCropRegion(frame, frame)); >+} >+ >+TEST_F(VideoGeometryAlignerTest, CalculateCropRegionArbitrary) { >+ // Arbitrary crop region. 
>+ CropRegion crop_region; >+ crop_region.left = 2; >+ crop_region.top = 4; >+ crop_region.right = 5; >+ crop_region.bottom = 3; >+ >+ const rtc::scoped_refptr<I420BufferInterface> frame = >+ reference_video_->GetFrame(0); >+ >+ EXPECT_EQ(crop_region, >+ CalculateCropRegion(frame, CropAndZoom(crop_region, frame))); >+} >+ >+} // namespace test >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_quality_analysis.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_quality_analysis.cc >index 1c30d08c56551ba3951cdc3cff4f0ed37616b329..40026802c7743da841294209bdaabf1ee8220abd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_quality_analysis.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_quality_analysis.cc >@@ -13,6 +13,7 @@ > #include <algorithm> > #include <numeric> > >+#include "rtc_base/logging.h" > #include "test/testsupport/perf_test.h" > #include "third_party/libyuv/include/libyuv/compare.h" > #include "third_party/libyuv/include/libyuv/convert.h" >@@ -56,16 +57,11 @@ std::vector<AnalysisResult> RunAnalysis( > const rtc::scoped_refptr<webrtc::test::Video>& test_video, > const std::vector<size_t>& test_frame_indices) { > std::vector<AnalysisResult> results; >- for (size_t i = 0; i < test_frame_indices.size(); ++i) { >- // Ignore duplicated frames in the test video. >- if (i > 0 && test_frame_indices[i] == test_frame_indices[i - 1]) >- continue; >- >+ for (size_t i = 0; i < test_video->number_of_frames(); ++i) { > const rtc::scoped_refptr<I420BufferInterface>& test_frame = > test_video->GetFrame(i); > const rtc::scoped_refptr<I420BufferInterface>& reference_frame = >- reference_video->GetFrame(test_frame_indices[i] % >- reference_video->number_of_frames()); >+ reference_video->GetFrame(i); > > // Fill in the result struct. 
> AnalysisResult result; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_temporal_aligner.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_temporal_aligner.cc >index 2ebffbc4e75b3be400d9ed8d4d684481dab1ce80..fa25f73f8e1448c87d40d7799e2d9011e8d043f8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_temporal_aligner.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/frame_analyzer/video_temporal_aligner.cc >@@ -215,7 +215,7 @@ std::vector<size_t> FindMatchingFrameIndices( > > rtc::scoped_refptr<Video> ReorderVideo(const rtc::scoped_refptr<Video>& video, > const std::vector<size_t>& indices) { >- return new ReorderedVideo(video, indices); >+ return new ReorderedVideo(new LoopingVideo(video), indices); > } > > rtc::scoped_refptr<Video> GenerateAlignedReferenceVideo( >@@ -228,7 +228,7 @@ rtc::scoped_refptr<Video> GenerateAlignedReferenceVideo( > rtc::scoped_refptr<Video> GenerateAlignedReferenceVideo( > const rtc::scoped_refptr<Video>& reference_video, > const std::vector<size_t>& indices) { >- return ReorderVideo(new LoopingVideo(reference_video), indices); >+ return ReorderVideo(reference_video, indices); > } > > } // namespace test >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/network_tester/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/network_tester/BUILD.gn >index a5036dcea9cd186001d8d493f4125750092e72b6..47b81430472b179574b056a33faed8b06624dbcc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/network_tester/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/network_tester/BUILD.gn >@@ -6,8 +6,8 @@ > # in the file PATENTS. All contributing project authors may > # be found in the AUTHORS file in the root of the source tree. 
> >-import("../../webrtc.gni") > import("//third_party/protobuf/proto_library.gni") >+import("../../webrtc.gni") > > if (rtc_enable_protobuf) { > proto_library("network_tester_config_proto") { >@@ -44,6 +44,7 @@ if (rtc_enable_protobuf) { > "../../p2p", > "../../rtc_base:checks", > "../../rtc_base:protobuf_utils", >+ "../../rtc_base:rtc_base", > "../../rtc_base:rtc_base_approved", > "../../rtc_base:rtc_task_queue", > "../../rtc_base:sequenced_task_checker", >@@ -80,7 +81,7 @@ if (rtc_enable_protobuf) { > > deps = [ > ":network_tester", >- "../../rtc_base:rtc_base_tests_utils", >+ "../../rtc_base:gunit_helpers", > "../../test:fileutils", > "../../test:test_support", > "//testing/gtest", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/network_tester/test_controller.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/network_tester/test_controller.cc >index e5bd92efa93c6db1550548ab628875f259b5b081..9bfdfa7732c820e86019fd913180834c1884c0e5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/network_tester/test_controller.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/network_tester/test_controller.cc >@@ -10,6 +10,9 @@ > > #include "rtc_tools/network_tester/test_controller.h" > >+#include "absl/types/optional.h" >+#include "rtc_base/thread.h" >+ > namespace webrtc { > > TestController::TestController(int min_port, >@@ -24,17 +27,15 @@ TestController::TestController(int min_port, > RTC_DCHECK_RUN_ON(&test_controller_thread_checker_); > packet_sender_checker_.Detach(); > send_data_.fill(42); >- auto socket = >+ udp_socket_ = > std::unique_ptr<rtc::AsyncPacketSocket>(socket_factory_.CreateUdpSocket( > rtc::SocketAddress(rtc::GetAnyIP(AF_INET), 0), min_port, max_port)); >- socket->SignalReadPacket.connect(this, &TestController::OnReadPacket); >- udp_transport_.reset( >- new cricket::UdpTransport("network tester transport", std::move(socket))); >+ udp_socket_->SignalReadPacket.connect(this, &TestController::OnReadPacket); 
> } > > void TestController::SendConnectTo(const std::string& hostname, int port) { > RTC_DCHECK_RUN_ON(&test_controller_thread_checker_); >- udp_transport_->SetRemoteAddress(rtc::SocketAddress(hostname, port)); >+ remote_address_ = rtc::SocketAddress(hostname, port); > NetworkTesterPacket packet; > packet.set_type(NetworkTesterPacket::HAND_SHAKING); > SendData(packet, absl::nullopt); >@@ -57,8 +58,8 @@ void TestController::SendData(const NetworkTesterPacket& packet, > packet.SerializeToArray(&send_data_[1], std::numeric_limits<char>::max()); > if (data_size && *data_size > packet_size) > packet_size = *data_size; >- udp_transport_->SendPacket(send_data_.data(), packet_size, >- rtc::PacketOptions(), 0); >+ udp_socket_->SendTo((const void*)send_data_.data(), packet_size, >+ remote_address_, rtc::PacketOptions()); > } > > void TestController::OnTestDone() { >@@ -80,7 +81,7 @@ void TestController::OnReadPacket(rtc::AsyncPacketSocket* socket, > const char* data, > size_t len, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time) { >+ const int64_t& packet_time_us) { > RTC_DCHECK_RUN_ON(&test_controller_thread_checker_); > size_t packet_size = data[0]; > std::string receive_data(&data[1], packet_size); >@@ -91,7 +92,7 @@ void TestController::OnReadPacket(rtc::AsyncPacketSocket* socket, > case NetworkTesterPacket::HAND_SHAKING: { > NetworkTesterPacket packet; > packet.set_type(NetworkTesterPacket::TEST_START); >- udp_transport_->SetRemoteAddress(remote_addr); >+ remote_address_ = remote_addr; > SendData(packet, absl::nullopt); > packet_sender_.reset(new PacketSender(this, config_file_path_)); > packet_sender_->StartSending(); >@@ -109,7 +110,7 @@ void TestController::OnReadPacket(rtc::AsyncPacketSocket* socket, > break; > } > case NetworkTesterPacket::TEST_DATA: { >- packet.set_arrival_timestamp(packet_time.timestamp); >+ packet.set_arrival_timestamp(packet_time_us); > packet.set_packet_size(len); > packet_logger_.LogPacket(packet); > break; 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/network_tester/test_controller.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/network_tester/test_controller.h >index cf65e174595616995f3fd3873234e092b08fc38d..a65272a09d1eb40a5f0b90f537ef7b8bb622229c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/network_tester/test_controller.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/network_tester/test_controller.h >@@ -18,7 +18,7 @@ > #include <utility> > > #include "p2p/base/basicpacketsocketfactory.h" >-#include "p2p/base/udptransport.h" >+#include "rtc_base/asyncpacketsocket.h" > #include "rtc_base/constructormagic.h" > #include "rtc_base/ignore_wundef.h" > #include "rtc_tools/network_tester/packet_logger.h" >@@ -60,7 +60,7 @@ class TestController : public sigslot::has_slots<> { > const char* data, > size_t len, > const rtc::SocketAddress& remote_addr, >- const rtc::PacketTime& packet_time); >+ const int64_t& packet_time_us); > rtc::ThreadChecker test_controller_thread_checker_; > rtc::SequencedTaskChecker packet_sender_checker_; > rtc::BasicPacketSocketFactory socket_factory_; >@@ -70,7 +70,8 @@ class TestController : public sigslot::has_slots<> { > bool local_test_done_ RTC_GUARDED_BY(local_test_done_lock_); > bool remote_test_done_; > std::array<char, kEthernetMtu> send_data_; >- std::unique_ptr<cricket::UdpTransport> udp_transport_; >+ std::unique_ptr<rtc::AsyncPacketSocket> udp_socket_; >+ rtc::SocketAddress remote_address_; > std::unique_ptr<PacketSender> packet_sender_; > > RTC_DISALLOW_COPY_AND_ASSIGN(TestController); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/unpack_aecdump/unpack.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/unpack_aecdump/unpack.cc >index 4c75608e7488a60435fc11cf14489666a28c63ca..142b49730a36170b6576174a5b731228fd657ba3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/unpack_aecdump/unpack.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/unpack_aecdump/unpack.cc >@@ -29,27 +29,34 @@ RTC_PUSH_IGNORING_WUNDEF() > RTC_POP_IGNORING_WUNDEF() > > // TODO(andrew): unpack more of the data. >-DEFINE_string(input_file, "input", "The name of the input stream file."); >-DEFINE_string(output_file, >- "ref_out", >- "The name of the reference output stream file."); >-DEFINE_string(reverse_file, >- "reverse", >- "The name of the reverse input stream file."); >-DEFINE_string(delay_file, "delay.int32", "The name of the delay file."); >-DEFINE_string(drift_file, "drift.int32", "The name of the drift file."); >-DEFINE_string(level_file, "level.int32", "The name of the level file."); >-DEFINE_string(keypress_file, "keypress.bool", "The name of the keypress file."); >-DEFINE_string(callorder_file, >- "callorder", >- "The name of the render/capture call order file."); >-DEFINE_string(settings_file, "settings.txt", "The name of the settings file."); >-DEFINE_bool(full, false, "Unpack the full set of files (normally not needed)."); >-DEFINE_bool(raw, false, "Write raw data instead of a WAV file."); >-DEFINE_bool(text, >- false, >- "Write non-audio files as text files instead of binary files."); >-DEFINE_bool(help, false, "Print this message."); >+WEBRTC_DEFINE_string(input_file, "input", "The name of the input stream file."); >+WEBRTC_DEFINE_string(output_file, >+ "ref_out", >+ "The name of the reference output stream file."); >+WEBRTC_DEFINE_string(reverse_file, >+ "reverse", >+ "The name of the reverse input stream file."); >+WEBRTC_DEFINE_string(delay_file, "delay.int32", "The name of the delay file."); >+WEBRTC_DEFINE_string(drift_file, "drift.int32", "The name of the drift file."); >+WEBRTC_DEFINE_string(level_file, "level.int32", "The name of the level file."); >+WEBRTC_DEFINE_string(keypress_file, >+ "keypress.bool", >+ "The name of the keypress file."); >+WEBRTC_DEFINE_string(callorder_file, >+ "callorder", >+ "The name of the render/capture call order file."); 
>+WEBRTC_DEFINE_string(settings_file, >+ "settings.txt", >+ "The name of the settings file."); >+WEBRTC_DEFINE_bool(full, >+ false, >+ "Unpack the full set of files (normally not needed)."); >+WEBRTC_DEFINE_bool(raw, false, "Write raw data instead of a WAV file."); >+WEBRTC_DEFINE_bool( >+ text, >+ false, >+ "Write non-audio files as text files instead of binary files."); >+WEBRTC_DEFINE_bool(help, false, "Print this message."); > > #define PRINT_CONFIG(field_name) \ > if (msg.has_##field_name()) { \ >@@ -87,6 +94,10 @@ void WriteCallOrderData(const bool render_call, > WriteData(&call_type, sizeof(call_type), file, filename.c_str()); > } > >+bool WritingCallOrderFile() { >+ return FLAG_full; >+} >+ > } // namespace > > int do_main(int argc, char* argv[]) { >@@ -125,7 +136,9 @@ int do_main(int argc, char* argv[]) { > > rtc::StringBuilder callorder_raw_name; > callorder_raw_name << FLAG_callorder_file << ".char"; >- FILE* callorder_char_file = OpenFile(callorder_raw_name.str(), "wb"); >+ FILE* callorder_char_file = WritingCallOrderFile() >+ ? 
OpenFile(callorder_raw_name.str(), "wb") >+ : nullptr; > FILE* settings_file = OpenFile(FLAG_settings_file, "wb"); > > while (ReadMessageFromFile(debug_file, &event_msg)) { >@@ -163,8 +176,10 @@ int do_main(int argc, char* argv[]) { > reverse_raw_file.get()); > } > if (FLAG_full) { >- WriteCallOrderData(true /* render_call */, callorder_char_file, >- FLAG_callorder_file); >+ if (WritingCallOrderFile()) { >+ WriteCallOrderData(true /* render_call */, callorder_char_file, >+ FLAG_callorder_file); >+ } > } > } else if (event_msg.type() == Event::STREAM) { > frame_count++; >@@ -222,8 +237,10 @@ int do_main(int argc, char* argv[]) { > } > > if (FLAG_full) { >- WriteCallOrderData(false /* render_call */, callorder_char_file, >- FLAG_callorder_file); >+ if (WritingCallOrderFile()) { >+ WriteCallOrderData(false /* render_call */, callorder_char_file, >+ FLAG_callorder_file); >+ } > if (msg.has_delay()) { > static FILE* delay_file = OpenFile(FLAG_delay_file, "wb"); > int32_t delay = msg.delay(); >@@ -359,9 +376,11 @@ int do_main(int argc, char* argv[]) { > output_wav_file.reset(new WavWriter( > output_name.str(), output_sample_rate, num_output_channels)); > >- rtc::StringBuilder callorder_name; >- callorder_name << FLAG_callorder_file << frame_count << ".char"; >- callorder_char_file = OpenFile(callorder_name.str(), "wb"); >+ if (WritingCallOrderFile()) { >+ rtc::StringBuilder callorder_name; >+ callorder_name << FLAG_callorder_file << frame_count << ".char"; >+ callorder_char_file = OpenFile(callorder_name.str(), "wb"); >+ } > } > } > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/video_analysis.py b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/video_analysis.py >index a5d267b0b29a3a306ce696b5dce7bfa964808ecd..0c22617ecf210ba787ae99825cb574706313435f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/video_analysis.py >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_tools/video_analysis.py >@@ -79,7 +79,7 @@ def _ParseArgs(): 
> help='Path to the frame analyzer executable.' > 'Default: %default') > parser.add_option('--zxing_path', type='string', >- help='Path to the zebra xing barcode analyzer.') >+ help='DEPRECATED.') > parser.add_option('--ref_rec_dir', type='string', default='ref', > help='Path to where reference recordings will be created.' > 'Ideally keep the ref and test directories on separate' >@@ -118,8 +118,6 @@ def _ParseArgs(): > 'generated!') > if not os.path.isfile(options.frame_analyzer): > parser.warning('Cannot find frame_analyzer, no metrics will be generated!') >- if not os.path.isfile(options.zxing_path): >- parser.warning('Cannot find Zebra Xing, no metrics will be generated!') > > return options > >@@ -442,12 +440,6 @@ def CompareVideos(options, cropped_ref_file, cropped_test_file): > '--ref_video=%s' % cropped_ref_file, > '--test_video=%s' % cropped_test_file, > '--frame_analyzer=%s' % os.path.abspath(options.frame_analyzer), >- '--zxing_path=%s' % options.zxing_path, >- '--ffmpeg_path=%s' % options.ffmpeg, >- '--stats_file_ref=%s_stats.txt' % >- os.path.join(os.path.dirname(cropped_ref_file), cropped_ref_file), >- '--stats_file_test=%s_stats.txt' % >- os.path.join(os.path.dirname(cropped_test_file), cropped_test_file), > '--yuv_frame_height=%d' % crop_height, > '--yuv_frame_width=%d' % crop_width > ] >@@ -472,7 +464,6 @@ def main(): > --app_name AppRTCMobile \ > --ffmpeg ./ffmpeg --ref_video_device=/dev/video0 \ > --test_video_device=/dev/video1 \ >- --zxing_path ./zxing \ > --test_crop_parameters 'crop=950:420:130:56' \ > --ref_crop_parameters 'hflip, crop=950:420:130:56' \ > --ref_rec_dir /tmp/ref \ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/BUILD.gn >index b4f193d1f326a6a550b32111a50a99c4e902b685..60ea4734e9520c181db51ace4a7e37a00621e229 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/BUILD.gn >@@ -8,8 +8,8 @@ > > 
import("../webrtc.gni") > if (is_ios) { >- import("//build/config/ios/rules.gni") > import("//build/config/ios/ios_sdk.gni") >+ import("//build/config/ios/rules.gni") > } > if (is_mac) { > import("//build/config/mac/rules.gni") >@@ -40,6 +40,7 @@ if (is_ios || is_mac) { > "objc/Framework/Headers", # TODO(bugs.webrtc.org/9627): Remove this. > ] > cflags = [ >+ "-Wimplicit-retain-self", > "-Wstrict-overflow", > "-Wmissing-field-initializers", > ] >@@ -262,6 +263,8 @@ if (is_ios || is_mac) { > "../system_wrappers:metrics", > ] > >+ libs = [ "AudioToolbox.framework" ] >+ > if (is_clang) { > # Suppress warnings from the Chromium Clang plugin > # (bugs.webrtc.org/163). >@@ -287,6 +290,8 @@ if (is_ios || is_mac) { > > public_configs = [ ":common_config_objc" ] > >+ libs = [ "AVFoundation.framework" ] >+ > deps = [ > ":audio_session_observer", > ":base_objc", >@@ -756,8 +761,8 @@ if (is_ios || is_mac) { > deps = [ > ":base_objc", > ":helpers_objc", >+ "../api/video:encoded_image", > "../api/video_codecs:video_codecs_api", >- "../common_video", > "../modules:module_api", > "../modules/video_coding:video_codec_interface", > "../rtc_base:rtc_base", >@@ -793,6 +798,8 @@ if (is_ios || is_mac) { > "objc/api/peerconnection/RTCConfiguration+Private.h", > "objc/api/peerconnection/RTCConfiguration.h", > "objc/api/peerconnection/RTCConfiguration.mm", >+ "objc/api/peerconnection/RTCCryptoOptions.h", >+ "objc/api/peerconnection/RTCCryptoOptions.mm", > "objc/api/peerconnection/RTCDataChannel+Private.h", > "objc/api/peerconnection/RTCDataChannel.h", > "objc/api/peerconnection/RTCDataChannel.mm", >@@ -903,6 +910,7 @@ if (is_ios || is_mac) { > ":videorendereradapter_objc", > ":videosource_objc", > ":videotoolbox_objc", >+ "../api:create_peerconnection_factory", > "../api:libjingle_peerconnection_api", > "../api/audio_codecs:audio_codecs_api", > "../api/audio_codecs:builtin_audio_decoder_factory", >@@ -913,9 +921,9 @@ if (is_ios || is_mac) { > "../media:rtc_media_base", > 
"../modules:module_api", > "../modules/audio_device:audio_device_api", >+ "../modules/audio_processing:api", > "../modules/audio_processing:audio_processing", > "../modules/video_coding:video_codec_interface", >- "../pc:create_pc_factory", > "../pc:peerconnection", > "../rtc_base:checks", > "../rtc_base:rtc_base", >@@ -1031,6 +1039,7 @@ if (is_ios || is_mac) { > "objc/unittests/RTCPeerConnectionFactory_xctest.m", > "objc/unittests/frame_buffer_helpers.h", > "objc/unittests/frame_buffer_helpers.mm", >+ "objc/unittests/nalu_rewriter_xctest.mm", > ] > > # TODO(peterhanspers): Reenable these tests on simulator. >@@ -1053,6 +1062,7 @@ if (is_ios || is_mac) { > ":native_api_audio_device_module", > ":native_video", > ":peerconnectionfactory_base_objc", >+ ":video_toolbox_cc", > ":videocapture_objc", > ":videocodec_objc", > ":videoframebuffer_objc", >@@ -1063,8 +1073,8 @@ if (is_ios || is_mac) { > "../media:rtc_media_base", > "../media:rtc_media_tests_utils", > "../modules:module_api", >+ "../rtc_base:gunit_helpers", > "../rtc_base:rtc_base", >- "../rtc_base:rtc_base_tests_utils", > "../system_wrappers:system_wrappers", > "//third_party/libyuv", > ] >@@ -1098,7 +1108,7 @@ if (is_ios || is_mac) { > rtc_ios_xctest_test("sdk_unittests") { > info_plist = "//test/ios/Info.plist" > sources = [ >- "objc/unittests/main.m", >+ "objc/unittests/main.mm", > ] > > _bundle_id_suffix = ios_generic_test_bundle_id_suffix >@@ -1107,6 +1117,7 @@ if (is_ios || is_mac) { > ":peerconnectionfactory_base_objc", > ":sdk_unittests_bundle_data", > ":sdk_unittests_sources", >+ "//test:test_support", > ] > ldflags = [ "-all_load" ] > } >@@ -1116,7 +1127,7 @@ if (is_ios || is_mac) { > info_plist = "//test/ios/Info.plist" > sources = [ > "objc/unittests/RTCDoNotPutCPlusPlusInFrameworkHeaders_xctest.m", >- "objc/unittests/main.m", >+ "objc/unittests/main.mm", > ] > > _bundle_id_suffix = ios_generic_test_bundle_id_suffix >@@ -1124,6 +1135,7 @@ if (is_ios || is_mac) { > deps = [ > ":framework_objc+link", > 
":ios_framework_bundle", >+ "//test:test_support", > ] > } > } >@@ -1170,6 +1182,7 @@ if (is_ios || is_mac) { > ":videoframebuffer_objc", > ":videosource_objc", > ":videotoolbox_objc", >+ "../api:libjingle_peerconnection_api", > "../api/audio_codecs:audio_codecs_api", > "../api/audio_codecs:builtin_audio_decoder_factory", > "../api/audio_codecs:builtin_audio_encoder_factory", >@@ -1177,11 +1190,12 @@ if (is_ios || is_mac) { > "../media:rtc_media_base", > "../modules:module_api", > "../modules/audio_device:audio_device_api", >+ "../modules/audio_processing:api", > "../modules/audio_processing:audio_processing", > "../modules/video_coding:video_codec_interface", >- "../rtc_base:rtc_base_tests_utils", >+ "../rtc_base:gunit_helpers", > "../rtc_base/system:unused", >- "//test:test_support", >+ "../test:test_support", > "//third_party/ocmock", > ] > >@@ -1269,6 +1283,7 @@ if (is_ios || is_mac) { > "objc/api/peerconnection/RTCSessionDescription.h", > "objc/api/peerconnection/RTCTracing.h", > "objc/api/peerconnection/RTCCertificate.h", >+ "objc/api/peerconnection/RTCCryptoOptions.h", > "objc/api/peerconnection/RTCVideoSource.h", > "objc/api/peerconnection/RTCVideoTrack.h", > "objc/api/video_codec/RTCVideoCodecConstants.h", >@@ -1357,6 +1372,7 @@ if (is_ios || is_mac) { > "objc/api/peerconnection/RTCAudioTrack.h", > "objc/api/peerconnection/RTCCertificate.h", > "objc/api/peerconnection/RTCConfiguration.h", >+ "objc/api/peerconnection/RTCCryptoOptions.h", > "objc/api/peerconnection/RTCDataChannel.h", > "objc/api/peerconnection/RTCDataChannelConfiguration.h", > "objc/api/peerconnection/RTCDtmfSender.h", >@@ -1590,7 +1606,10 @@ if (is_ios || is_mac) { > } > > rtc_static_library("video_toolbox_cc") { >- visibility = [ ":videotoolbox_objc" ] >+ visibility = [ >+ ":videotoolbox_objc", >+ ":sdk_unittests_sources", >+ ] > sources = [ > "objc/components/video_codec/helpers.cc", > "objc/components/video_codec/helpers.h", >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/WebKit/WebKitUtilities.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/WebKit/WebKitUtilities.mm >index 2d958b76dbe78317279b470f4efb3fcad8b83bd9..65eefa36f2dfb2069ecca7caa49224b262955bba 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/WebKit/WebKitUtilities.mm >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/WebKit/WebKitUtilities.mm >@@ -29,8 +29,9 @@ > #import "WebRTC/RTCVideoCodecH264.h" > > #include "api/video/video_frame.h" >-#include "third_party/libyuv/include/libyuv/convert_from.h" > #include "native/src/objc_frame_buffer.h" >+#include "third_party/libyuv/include/libyuv/convert_from.h" >+#include "webrtc/rtc_base/checks.h" > #include "Framework/Headers/WebRTC/RTCVideoCodecFactory.h" > #include "Framework/Headers/WebRTC/RTCVideoFrame.h" > #include "Framework/Headers/WebRTC/RTCVideoFrameBuffer.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/BUILD.gn >index 63c5bd20c10436b000032b1025ef693597bab86c..edc5da30b79da763dbc74304df39acb797ccf04d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/BUILD.gn >@@ -7,9 +7,9 @@ > # be found in the AUTHORS file in the root of the source tree. 
> > if (is_android) { >- import("../../webrtc.gni") > import("//build/config/android/config.gni") > import("//build/config/android/rules.gni") >+ import("../../webrtc.gni") > > group("android") { > if (!build_with_chromium && is_android) { >@@ -38,6 +38,7 @@ if (is_android) { > deps = [ > ":audio_api_java", > ":base_java", >+ ":builtin_audio_codecs_java", > ":camera_java", > ":default_video_codec_factory_java", > ":filevideo_java", >@@ -110,23 +111,22 @@ if (is_android) { > public_deps = [ # no-presubmit-check TODO(webrtc:8603) > ":audio_jni", > ":base_jni", >+ ":builtin_audio_codecs_jni", >+ ":default_video_codec_factory_jni", > ":java_audio_device_module_jni", >+ ":legacy_hwcodecs_jni", > ":media_jni", > ":peerconnection_jni", > ":video_jni", >- "../../pc:create_pc_factory", >+ "../../api:create_peerconnection_factory", > ] > } > > rtc_static_library("libjingle_peerconnection_metrics_default_jni") { > visibility = [ "*" ] >- >- allow_poison = [ "software_video_codecs" ] # TODO(bugs.webrtc.org/7925): Remove. 
>- > sources = [ > "src/jni/androidmetrics.cc", > ] >- > deps = [ > ":base_jni", > ":generated_metrics_jni", >@@ -204,7 +204,11 @@ if (is_android) { > } > > rtc_android_library("audio_api_java") { >- java_files = [ "api/org/webrtc/audio/AudioDeviceModule.java" ] >+ java_files = [ >+ "api/org/webrtc/audio/AudioDeviceModule.java", >+ "api/org/webrtc/AudioDecoderFactoryFactory.java", >+ "api/org/webrtc/AudioEncoderFactoryFactory.java", >+ ] > > deps = [ > ":base_java", >@@ -250,6 +254,7 @@ if (is_android) { > "api/org/webrtc/VideoFrameDrawer.java", > "api/org/webrtc/YuvConverter.java", > "api/org/webrtc/YuvHelper.java", >+ "api/org/webrtc/TimestampAligner.java", > "src/java/org/webrtc/EglBase10.java", > "src/java/org/webrtc/EglBase14.java", > "src/java/org/webrtc/GlGenericDrawer.java", >@@ -276,9 +281,11 @@ if (is_android) { > "api/org/webrtc/AudioSource.java", > "api/org/webrtc/AudioTrack.java", > "api/org/webrtc/CallSessionFileRotatingLogSink.java", >+ "api/org/webrtc/CryptoOptions.java", > "api/org/webrtc/DataChannel.java", > "api/org/webrtc/DtmfSender.java", > "api/org/webrtc/FecControllerFactoryFactoryInterface.java", >+ "api/org/webrtc/MediaTransportFactoryFactory.java", > "api/org/webrtc/FrameDecryptor.java", > "api/org/webrtc/FrameEncryptor.java", > "api/org/webrtc/IceCandidate.java", >@@ -295,6 +302,7 @@ if (is_android) { > "api/org/webrtc/PeerConnection.java", > "api/org/webrtc/PeerConnectionDependencies.java", > "api/org/webrtc/PeerConnectionFactory.java", >+ "api/org/webrtc/RtcCertificatePem.java", > "api/org/webrtc/RTCStats.java", > "api/org/webrtc/RTCStatsCollectorCallback.java", > "api/org/webrtc/RTCStatsReport.java", >@@ -317,6 +325,7 @@ if (is_android) { > deps = [ > ":audio_api_java", > ":base_java", >+ ":builtin_audio_codecs_java", > ":default_video_codec_factory_java", > ":logging_java", > ":swcodecs_java", >@@ -426,6 +435,17 @@ if (is_android) { > ] > } > >+ rtc_android_library("builtin_audio_codecs_java") { >+ java_files = [ >+ 
"api/org/webrtc/BuiltinAudioDecoderFactoryFactory.java", >+ "api/org/webrtc/BuiltinAudioEncoderFactoryFactory.java", >+ ] >+ >+ deps = [ >+ ":audio_api_java", >+ ] >+ } >+ > rtc_android_library("screencapturer_java") { > java_files = [ "api/org/webrtc/ScreenCapturerAndroid.java" ] > >@@ -449,18 +469,44 @@ if (is_android) { > ] > } > >+ rtc_android_library("libvpx_vp8_java") { >+ visibility = [ "*" ] >+ java_files = [ >+ "api/org/webrtc/LibvpxVp8Decoder.java", >+ "api/org/webrtc/LibvpxVp8Encoder.java", >+ ] >+ deps = [ >+ ":base_java", >+ ":video_api_java", >+ ":video_java", >+ "//rtc_base:base_java", >+ ] >+ } >+ >+ rtc_android_library("libvpx_vp9_java") { >+ visibility = [ "*" ] >+ java_files = [ >+ "api/org/webrtc/LibvpxVp9Decoder.java", >+ "api/org/webrtc/LibvpxVp9Encoder.java", >+ ] >+ deps = [ >+ ":base_java", >+ ":video_api_java", >+ ":video_java", >+ "//rtc_base:base_java", >+ ] >+ } >+ > rtc_android_library("swcodecs_java") { > java_files = [ > "api/org/webrtc/SoftwareVideoDecoderFactory.java", > "api/org/webrtc/SoftwareVideoEncoderFactory.java", >- "src/java/org/webrtc/VP8Encoder.java", >- "src/java/org/webrtc/VP8Decoder.java", >- "src/java/org/webrtc/VP9Encoder.java", >- "src/java/org/webrtc/VP9Decoder.java", > ] > > deps = [ > ":base_java", >+ ":libvpx_vp8_java", >+ ":libvpx_vp9_java", > ":video_api_java", > ":video_java", > "//rtc_base:base_java", >@@ -511,23 +557,66 @@ if (is_android) { > > deps = [ > ":base_jni", >+ ":builtin_audio_codecs_jni", > "../../api/audio_codecs:builtin_audio_decoder_factory", > "../../api/audio_codecs:builtin_audio_encoder_factory", >+ "../../modules/audio_processing:api", > "../../modules/audio_processing:audio_processing", > "../../rtc_base:rtc_base_approved", > ] > } > >- rtc_static_library("video_jni") { >- sources = [] >+ rtc_static_library("builtin_audio_codecs_jni") { >+ sources = [ >+ "src/jni/builtinaudiodecoderfactoryfactory.cc", >+ "src/jni/builtinaudioencoderfactoryfactory.cc", >+ ] >+ > deps = [ >- 
"../../system_wrappers:field_trial", >+ ":base_jni", >+ ":generated_builtin_audio_codecs_jni", >+ ":native_api_jni", >+ "../../api/audio_codecs:builtin_audio_decoder_factory", >+ "../../api/audio_codecs:builtin_audio_encoder_factory", >+ "../../rtc_base:rtc_base_approved", > ] >+ } > >- sources += [ >+ # Corresponds to MediaCodecVideoEncoder/Decoder in Java. >+ rtc_static_library("legacy_hwcodecs_jni") { >+ visibility = [ "*" ] >+ allow_poison = [ "software_video_codecs" ] >+ sources = [ > "src/jni/androidmediacodeccommon.h", > "src/jni/androidmediadecoder.cc", > "src/jni/androidmediaencoder.cc", >+ ] >+ deps = [ >+ ":base_jni", >+ ":default_video_codec_factory_jni", >+ ":generated_video_jni", >+ ":native_api_jni", >+ ":video_jni", >+ ":videoframe_jni", >+ "../..:webrtc_common", >+ "../../api/video_codecs:video_codecs_api", >+ "../../common_video:common_video", >+ "../../media:rtc_internal_video_codecs", >+ "../../media:rtc_media_base", >+ "../../modules/video_coding:video_codec_interface", >+ "../../modules/video_coding:video_coding_utility", >+ "../../rtc_base:checks", >+ "../../rtc_base:rtc_base", >+ "../../rtc_base:rtc_task_queue_api", >+ "../../rtc_base:sequenced_task_checker", >+ "../../rtc_base:weak_ptr", >+ "../../system_wrappers:field_trial", >+ "//third_party/libyuv", >+ ] >+ } >+ >+ rtc_static_library("video_jni") { >+ sources = [ > "src/jni/androidvideotracksource.cc", > "src/jni/androidvideotracksource.h", > "src/jni/encodedimage.cc", >@@ -540,6 +629,7 @@ if (is_android) { > "src/jni/nv12buffer.cc", > "src/jni/nv21buffer.cc", > "src/jni/pc/video.cc", >+ "src/jni/timestampaligner.cc", > "src/jni/videocodecinfo.cc", > "src/jni/videocodecinfo.h", > "src/jni/videocodecstatus.cc", >@@ -561,36 +651,25 @@ if (is_android) { > "src/jni/yuvhelper.cc", > ] > >- deps += [ >+ deps = [ > ":base_jni", > ":generated_video_jni", > ":native_api_jni", > ":videoframe_jni", >- ":vp8_jni", # TODO(bugs.webrtc.org/7925): Remove. 
>- ":vp9_jni", # TODO(bugs.webrtc.org/7925): Remove. > "../..:webrtc_common", > "../../api:libjingle_peerconnection_api", >+ "../../api/video:encoded_image", > "../../api/video:video_frame", >- "../../api/video_codecs:builtin_video_decoder_factory", >- "../../api/video_codecs:builtin_video_encoder_factory", > "../../api/video_codecs:rtc_software_fallback_wrappers", > "../../api/video_codecs:video_codecs_api", > "../../common_video:common_video", >- "../../media:rtc_audio_video", >- "../../media:rtc_h264_profile_id", >- "../../media:rtc_internal_video_codecs", > "../../media:rtc_media_base", >- "../../modules/utility:utility", > "../../modules/video_coding:codec_globals_headers", > "../../modules/video_coding:video_codec_interface", > "../../modules/video_coding:video_coding_utility", > "../../rtc_base:checks", > "../../rtc_base:rtc_base", >- "../../rtc_base:rtc_base_approved", >- "../../rtc_base:rtc_task_queue", >- "../../rtc_base:sequenced_task_checker", >- "../../rtc_base:weak_ptr", >- "../../rtc_base/memory:aligned_malloc", >+ "../../rtc_base:rtc_task_queue_api", > "//third_party/libyuv", > ] > } >@@ -606,6 +685,8 @@ if (is_android) { > "src/jni/pc/androidnetworkmonitor.h", > "src/jni/pc/audiotrack.cc", > "src/jni/pc/callsessionfilerotatinglogsink.cc", >+ "src/jni/pc/cryptooptions.cc", >+ "src/jni/pc/cryptooptions.h", > "src/jni/pc/datachannel.cc", > "src/jni/pc/datachannel.h", > "src/jni/pc/dtmfsender.cc", >@@ -624,6 +705,8 @@ if (is_android) { > "src/jni/pc/peerconnection.h", > "src/jni/pc/peerconnectionfactory.cc", > "src/jni/pc/peerconnectionfactory.h", >+ "src/jni/pc/rtccertificate.cc", >+ "src/jni/pc/rtccertificate.h", > "src/jni/pc/rtcstatscollectorcallbackwrapper.cc", > "src/jni/pc/rtcstatscollectorcallbackwrapper.h", > "src/jni/pc/rtpparameters.cc", >@@ -669,6 +752,7 @@ if (is_android) { > "../../rtc_base:stringutils", > "../../system_wrappers:field_trial", > "//third_party/abseil-cpp/absl/memory", >+ "//third_party/abseil-cpp/absl/types:optional", > ] 
> } > >@@ -685,6 +769,50 @@ if (is_android) { > ] > } > >+ rtc_static_library("default_video_codec_factory_jni") { >+ visibility = [ "*" ] >+ allow_poison = [ "software_video_codecs" ] >+ deps = [ >+ ":swcodecs_jni", >+ ":video_jni", >+ ] >+ } >+ >+ rtc_static_library("libvpx_vp8_jni") { >+ visibility = [ "*" ] >+ allow_poison = [ "software_video_codecs" ] >+ sources = [ >+ "src/jni/vp8codec.cc", >+ ] >+ deps = [ >+ ":base_jni", >+ ":generated_libvpx_vp8_jni", >+ ":video_jni", >+ "../../modules/video_coding:webrtc_vp8", >+ ] >+ } >+ >+ rtc_static_library("libvpx_vp9_jni") { >+ visibility = [ "*" ] >+ allow_poison = [ "software_video_codecs" ] >+ sources = [ >+ "src/jni/vp9codec.cc", >+ ] >+ deps = [ >+ ":base_jni", >+ ":generated_libvpx_vp9_jni", >+ ":video_jni", >+ "../../modules/video_coding:webrtc_vp9", >+ ] >+ } >+ >+ rtc_static_library("swcodecs_jni") { >+ deps = [ >+ ":libvpx_vp8_jni", >+ ":libvpx_vp9_jni", >+ ] >+ } >+ > ###################### > # Native API targets # > ###################### >@@ -769,10 +897,7 @@ if (is_android) { > # objects. > rtc_static_library("native_api_codecs") { > visibility = [ "*" ] >- allow_poison = [ >- "audio_codecs", # TODO(bugs.webrtc.org/8396): Remove. >- "software_video_codecs", # TODO(bugs.webrtc.org/7925): Remove. >- ] >+ allow_poison = [ "audio_codecs" ] # TODO(bugs.webrtc.org/8396): Remove. > sources = [ > "native_api/codecs/wrapper.cc", > "native_api/codecs/wrapper.h", >@@ -792,14 +917,10 @@ if (is_android) { > # API for creating Java PeerConnectionFactory from C++ equivalents. > rtc_static_library("native_api_peerconnection") { > visibility = [ "*" ] >- >- allow_poison = [ "software_video_codecs" ] # TODO(bugs.webrtc.org/7925): Remove. >- > sources = [ > "native_api/peerconnection/peerconnectionfactory.cc", > "native_api/peerconnection/peerconnectionfactory.h", > ] >- > deps = [ > ":base_jni", > ":peerconnection_jni", >@@ -813,10 +934,7 @@ if (is_android) { > # video interfaces from their Java equivalents. 
> rtc_static_library("native_api_video") { > visibility = [ "*" ] >- allow_poison = [ >- "audio_codecs", # TODO(bugs.webrtc.org/8396): Remove. >- "software_video_codecs", # TODO(bugs.webrtc.org/7925): Remove. >- ] >+ allow_poison = [ "audio_codecs" ] # TODO(bugs.webrtc.org/8396): Remove. > sources = [ > "native_api/video/videosource.cc", > "native_api/video/videosource.h", >@@ -829,6 +947,7 @@ if (is_android) { > ":videoframe_jni", > "//api:libjingle_peerconnection_api", > "//api/video:video_frame", >+ "//rtc_base:rtc_base", > "//rtc_base:rtc_base_approved", > "//third_party/abseil-cpp/absl/memory", > ] >@@ -891,6 +1010,7 @@ if (is_android) { > "../../logging:rtc_event_log_api", > "../../media:rtc_audio_video", > "../../modules/audio_device:audio_device", >+ "../../modules/audio_processing:api", > "../../modules/audio_processing:audio_processing", > ] > } >@@ -929,30 +1049,6 @@ if (is_android) { > ] > } > >- rtc_static_library("vp8_jni") { >- sources = [ >- "src/jni/vp8codec.cc", >- ] >- >- deps = [ >- ":base_jni", >- ":generated_vp8_jni", >- "../../modules/video_coding:webrtc_vp8", >- ] >- } >- >- rtc_static_library("vp9_jni") { >- sources = [ >- "src/jni/vp9codec.cc", >- ] >- >- deps = [ >- ":base_jni", >- ":generated_vp9_jni", >- "../../modules/video_coding:webrtc_vp9", >- ] >- } >- > rtc_static_library("logging_jni") { > visibility = [ "*" ] > sources = [ >@@ -981,6 +1077,7 @@ if (is_android) { > ":base_jni", > ":generated_audio_device_module_base_jni", > ":native_api_jni", >+ "../../modules/audio_device:audio_device_api", > "../../modules/audio_device:audio_device_buffer", > "../../rtc_base:checks", > "../../rtc_base:rtc_base_approved", >@@ -1114,6 +1211,7 @@ if (is_android) { > "api/org/webrtc/JavaI420Buffer.java", > "api/org/webrtc/MediaCodecVideoDecoder.java", > "api/org/webrtc/MediaCodecVideoEncoder.java", >+ "api/org/webrtc/TimestampAligner.java", > "api/org/webrtc/VideoCodecInfo.java", > "api/org/webrtc/VideoCodecStatus.java", > 
"api/org/webrtc/VideoDecoder.java", >@@ -1140,10 +1238,10 @@ if (is_android) { > jni_generator_include = "//sdk/android/src/jni/jni_generator_helper.h" > } > >- generate_jni("generated_vp8_jni") { >+ generate_jni("generated_libvpx_vp8_jni") { > sources = [ >- "src/java/org/webrtc/VP8Decoder.java", >- "src/java/org/webrtc/VP8Encoder.java", >+ "api/org/webrtc/LibvpxVp8Decoder.java", >+ "api/org/webrtc/LibvpxVp8Encoder.java", > ] > > jni_package = "" >@@ -1151,10 +1249,10 @@ if (is_android) { > jni_generator_include = "//sdk/android/src/jni/jni_generator_helper.h" > } > >- generate_jni("generated_vp9_jni") { >+ generate_jni("generated_libvpx_vp9_jni") { > sources = [ >- "src/java/org/webrtc/VP9Decoder.java", >- "src/java/org/webrtc/VP9Encoder.java", >+ "api/org/webrtc/LibvpxVp9Decoder.java", >+ "api/org/webrtc/LibvpxVp9Encoder.java", > ] > > jni_package = "" >@@ -1166,6 +1264,7 @@ if (is_android) { > sources = [ > "api/org/webrtc/AudioTrack.java", > "api/org/webrtc/CallSessionFileRotatingLogSink.java", >+ "api/org/webrtc/CryptoOptions.java", > "api/org/webrtc/DataChannel.java", > "api/org/webrtc/DtmfSender.java", > "api/org/webrtc/IceCandidate.java", >@@ -1178,6 +1277,7 @@ if (is_android) { > "api/org/webrtc/RTCStats.java", > "api/org/webrtc/RTCStatsCollectorCallback.java", > "api/org/webrtc/RTCStatsReport.java", >+ "api/org/webrtc/RtcCertificatePem.java", > "api/org/webrtc/RtpParameters.java", > "api/org/webrtc/RtpReceiver.java", > "api/org/webrtc/RtpSender.java", >@@ -1203,6 +1303,16 @@ if (is_android) { > jni_generator_include = "//sdk/android/src/jni/jni_generator_helper.h" > } > >+ generate_jni("generated_builtin_audio_codecs_jni") { >+ sources = [ >+ "api/org/webrtc/BuiltinAudioDecoderFactoryFactory.java", >+ "api/org/webrtc/BuiltinAudioEncoderFactoryFactory.java", >+ ] >+ jni_package = "" >+ namespace = "webrtc::jni" >+ jni_generator_include = "//sdk/android/src/jni/jni_generator_helper.h" >+ } >+ > # Generated JNI for native API targets > > 
generate_jni("generated_native_api_jni") { >@@ -1255,6 +1365,7 @@ if (is_android) { > > java_files = [ > "instrumentationtests/src/org/webrtc/AndroidVideoDecoderInstrumentationTest.java", >+ "instrumentationtests/src/org/webrtc/BuiltinAudioCodecsFactoryFactoryTest.java", > "instrumentationtests/src/org/webrtc/Camera1CapturerUsingByteBufferTest.java", > "instrumentationtests/src/org/webrtc/Camera1CapturerUsingTextureTest.java", > "instrumentationtests/src/org/webrtc/Camera2CapturerTest.java", >@@ -1270,11 +1381,13 @@ if (is_android) { > "instrumentationtests/src/org/webrtc/PeerConnectionFactoryTest.java", > "instrumentationtests/src/org/webrtc/PeerConnectionTest.java", > "instrumentationtests/src/org/webrtc/RendererCommonTest.java", >+ "instrumentationtests/src/org/webrtc/RtcCertificatePemTest.java", > "instrumentationtests/src/org/webrtc/SurfaceTextureHelperTest.java", > "instrumentationtests/src/org/webrtc/SurfaceViewRendererOnMeasureTest.java", > "instrumentationtests/src/org/webrtc/TestConstants.java", > "instrumentationtests/src/org/webrtc/VideoFileRendererTest.java", > "instrumentationtests/src/org/webrtc/VideoFrameBufferTest.java", >+ "instrumentationtests/src/org/webrtc/TimestampAlignerTest.java", > "instrumentationtests/src/org/webrtc/WebRtcJniBootTest.java", > "instrumentationtests/src/org/webrtc/YuvHelperTest.java", > ] >@@ -1294,6 +1407,7 @@ if (is_android) { > "//rtc_base:base_java", > "//third_party/android_support_test_runner:rules_java", > "//third_party/android_support_test_runner:runner_java", >+ "//third_party/google-truth:google_truth_java", > "//third_party/junit", > ] > >@@ -1371,23 +1485,23 @@ if (is_android) { > ":native_api_video", > ":opensles_audio_device_module", > ":video_jni", >- "../../system_wrappers:system_wrappers", >- "//api/audio_codecs:builtin_audio_decoder_factory", >- "//api/audio_codecs:builtin_audio_encoder_factory", >- "//api/video:video_frame", >- "//media:rtc_audio_video", >- "//media:rtc_internal_video_codecs", >- 
"//media:rtc_media_base", >- "//modules/audio_device:audio_device", >- "//modules/audio_device:mock_audio_device", >- "//modules/audio_processing:audio_processing", >- "//modules/utility:utility", >- "//pc:libjingle_peerconnection", >- "//rtc_base:checks", >- "//rtc_base:rtc_base_approved", >- "//test:fileutils", >- "//test:test_support", >- "//testing/gtest", >+ "../../api/audio_codecs:builtin_audio_decoder_factory", >+ "../../api/audio_codecs:builtin_audio_encoder_factory", >+ "../../api/video:video_frame", >+ "../../media:rtc_audio_video", >+ "../../media:rtc_internal_video_codecs", >+ "../../media:rtc_media_base", >+ "../../modules/audio_device:audio_device", >+ "../../modules/audio_device:mock_audio_device", >+ "../../modules/audio_processing:api", >+ "../../modules/audio_processing:audio_processing", >+ "../../modules/utility:utility", >+ "../../pc:libjingle_peerconnection", >+ "../../rtc_base:checks", >+ "../../rtc_base:rtc_base_approved", >+ "../../test:fileutils", >+ "../../test:test_support", >+ "../../testing/gtest", > "//third_party/abseil-cpp/absl/memory", > ] > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/AudioDecoderFactoryFactory.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/AudioDecoderFactoryFactory.java >new file mode 100644 >index 0000000000000000000000000000000000000000..dd3e262896cab84441e4a7d2b7f838fb59c3ab83 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/AudioDecoderFactoryFactory.java >@@ -0,0 +1,21 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+package org.webrtc; >+ >+/** >+ * Implementations of this interface can create a native {@code webrtc::AudioDecoderFactory}. >+ */ >+public interface AudioDecoderFactoryFactory { >+ /** >+ * Returns a pointer to a {@code webrtc::AudioDecoderFactory}. The caller takes ownership. >+ */ >+ long createNativeAudioDecoderFactory(); >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/AudioEncoderFactoryFactory.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/AudioEncoderFactoryFactory.java >new file mode 100644 >index 0000000000000000000000000000000000000000..814b71aba1f060d66ee0c3a304f84c43eb39e653 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/AudioEncoderFactoryFactory.java >@@ -0,0 +1,21 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+package org.webrtc; >+ >+/** >+ * Implementations of this interface can create a native {@code webrtc::AudioEncoderFactory}. >+ */ >+public interface AudioEncoderFactoryFactory { >+ /** >+ * Returns a pointer to a {@code webrtc::AudioEncoderFactory}. The caller takes ownership. 
>+ */ >+ long createNativeAudioEncoderFactory(); >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/BuiltinAudioDecoderFactoryFactory.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/BuiltinAudioDecoderFactoryFactory.java >new file mode 100644 >index 0000000000000000000000000000000000000000..5ebc19f25dceea3bfa4efa881158504459ebef2e >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/BuiltinAudioDecoderFactoryFactory.java >@@ -0,0 +1,23 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+package org.webrtc; >+ >+/** >+ * Creates a native {@code webrtc::AudioDecoderFactory} with the builtin audio decoders. >+ */ >+public class BuiltinAudioDecoderFactoryFactory implements AudioDecoderFactoryFactory { >+ @Override >+ public long createNativeAudioDecoderFactory() { >+ return nativeCreateBuiltinAudioDecoderFactory(); >+ } >+ >+ private static native long nativeCreateBuiltinAudioDecoderFactory(); >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/BuiltinAudioEncoderFactoryFactory.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/BuiltinAudioEncoderFactoryFactory.java >new file mode 100644 >index 0000000000000000000000000000000000000000..e884d4c3b98d6cd7d3c2bb5d5a4a01b93b30c2e7 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/BuiltinAudioEncoderFactoryFactory.java >@@ -0,0 +1,23 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. 
>+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+package org.webrtc; >+ >+/** >+ * This class creates a native {@code webrtc::AudioEncoderFactory} with the builtin audio encoders. >+ */ >+public class BuiltinAudioEncoderFactoryFactory implements AudioEncoderFactoryFactory { >+ @Override >+ public long createNativeAudioEncoderFactory() { >+ return nativeCreateBuiltinAudioEncoderFactory(); >+ } >+ >+ private static native long nativeCreateBuiltinAudioEncoderFactory(); >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/CryptoOptions.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/CryptoOptions.java >new file mode 100644 >index 0000000000000000000000000000000000000000..77b9552aad87a718d621288108618ddf85631e90 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/CryptoOptions.java >@@ -0,0 +1,145 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+package org.webrtc; >+ >+/** >+ * CryptoOptions defines advanced cryptographic settings for native WebRTC. >+ * These settings must be passed into RTCConfiguration. WebRTC is secure by >+ * default and you should not need to set any of these options unless you are >+ * specifically looking for an additional crypto feature such as AES_GCM >+ * support. 
This class is the Java binding of native api/crypto/cryptooptions.h >+ */ >+public final class CryptoOptions { >+ /** >+ * SRTP Related Peer Connection Options. >+ */ >+ public final class Srtp { >+ /** >+ * Enable GCM crypto suites from RFC 7714 for SRTP. GCM will only be used >+ * if both sides enable it. >+ */ >+ private final boolean enableGcmCryptoSuites; >+ /** >+ * If set to true, the (potentially insecure) crypto cipher >+ * SRTP_AES128_CM_SHA1_32 will be included in the list of supported ciphers >+ * during negotiation. It will only be used if both peers support it and no >+ * other ciphers get preferred. >+ */ >+ private final boolean enableAes128Sha1_32CryptoCipher; >+ /** >+ * If set to true, encrypted RTP header extensions as defined in RFC 6904 >+ * will be negotiated. They will only be used if both peers support them. >+ */ >+ private final boolean enableEncryptedRtpHeaderExtensions; >+ >+ private Srtp(boolean enableGcmCryptoSuites, boolean enableAes128Sha1_32CryptoCipher, >+ boolean enableEncryptedRtpHeaderExtensions) { >+ this.enableGcmCryptoSuites = enableGcmCryptoSuites; >+ this.enableAes128Sha1_32CryptoCipher = enableAes128Sha1_32CryptoCipher; >+ this.enableEncryptedRtpHeaderExtensions = enableEncryptedRtpHeaderExtensions; >+ } >+ >+ @CalledByNative("Srtp") >+ public boolean getEnableGcmCryptoSuites() { >+ return enableGcmCryptoSuites; >+ } >+ >+ @CalledByNative("Srtp") >+ public boolean getEnableAes128Sha1_32CryptoCipher() { >+ return enableAes128Sha1_32CryptoCipher; >+ } >+ >+ @CalledByNative("Srtp") >+ public boolean getEnableEncryptedRtpHeaderExtensions() { >+ return enableEncryptedRtpHeaderExtensions; >+ } >+ } >+ >+ /** >+ * Options to be used when the FrameEncryptor / FrameDecryptor APIs are used. >+ */ >+ public final class SFrame { >+ /** >+ * If set, all RtpSenders must have a FrameEncryptor attached to them before >+ * they are allowed to send packets. 
All RtpReceivers must have a >+ * FrameDecryptor attached to them before they are able to receive packets. >+ */ >+ private final boolean requireFrameEncryption; >+ >+ private SFrame(boolean requireFrameEncryption) { >+ this.requireFrameEncryption = requireFrameEncryption; >+ } >+ >+ @CalledByNative("SFrame") >+ public boolean getRequireFrameEncryption() { >+ return requireFrameEncryption; >+ } >+ } >+ >+ private final Srtp srtp; >+ private final SFrame sframe; >+ >+ private CryptoOptions(boolean enableGcmCryptoSuites, boolean enableAes128Sha1_32CryptoCipher, >+ boolean enableEncryptedRtpHeaderExtensions, boolean requireFrameEncryption) { >+ this.srtp = new Srtp( >+ enableGcmCryptoSuites, enableAes128Sha1_32CryptoCipher, enableEncryptedRtpHeaderExtensions); >+ this.sframe = new SFrame(requireFrameEncryption); >+ } >+ >+ public static Builder builder() { >+ return new Builder(); >+ } >+ >+ @CalledByNative >+ public Srtp getSrtp() { >+ return srtp; >+ } >+ >+ @CalledByNative >+ public SFrame getSFrame() { >+ return sframe; >+ } >+ >+ public static class Builder { >+ private boolean enableGcmCryptoSuites; >+ private boolean enableAes128Sha1_32CryptoCipher; >+ private boolean enableEncryptedRtpHeaderExtensions; >+ private boolean requireFrameEncryption; >+ >+ private Builder() {} >+ >+ public Builder setEnableGcmCryptoSuites(boolean enableGcmCryptoSuites) { >+ this.enableGcmCryptoSuites = enableGcmCryptoSuites; >+ return this; >+ } >+ >+ public Builder setEnableAes128Sha1_32CryptoCipher(boolean enableAes128Sha1_32CryptoCipher) { >+ this.enableAes128Sha1_32CryptoCipher = enableAes128Sha1_32CryptoCipher; >+ return this; >+ } >+ >+ public Builder setEnableEncryptedRtpHeaderExtensions( >+ boolean enableEncryptedRtpHeaderExtensions) { >+ this.enableEncryptedRtpHeaderExtensions = enableEncryptedRtpHeaderExtensions; >+ return this; >+ } >+ >+ public Builder setRequireFrameEncryption(boolean requireFrameEncryption) { >+ this.requireFrameEncryption = requireFrameEncryption; >+ 
return this; >+ } >+ >+ public CryptoOptions createCryptoOptions() { >+ return new CryptoOptions(enableGcmCryptoSuites, enableAes128Sha1_32CryptoCipher, >+ enableEncryptedRtpHeaderExtensions, requireFrameEncryption); >+ } >+ } >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/EglRenderer.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/EglRenderer.java >index d0a1d98b2bdefca959dcc4d854c1f69427057c0c..e8100530036183cf33a781dcac2e2d371f9d0249 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/EglRenderer.java >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/EglRenderer.java >@@ -17,6 +17,7 @@ import android.opengl.GLES20; > import android.os.Handler; > import android.os.HandlerThread; > import android.os.Looper; >+import android.os.Message; > import android.view.Surface; > import java.nio.ByteBuffer; > import java.text.DecimalFormat; >@@ -79,6 +80,29 @@ public class EglRenderer implements VideoSink { > } > } > >+ /** >+ * Handler that triggers a callback when an uncaught exception occurs while handling a message. 
>+ */ >+ private static class HandlerWithExceptionCallback extends Handler { >+ private final Runnable exceptionCallback; >+ >+ public HandlerWithExceptionCallback(Looper looper, Runnable exceptionCallback) { >+ super(looper); >+ this.exceptionCallback = exceptionCallback; >+ } >+ >+ @Override >+ public void dispatchMessage(Message msg) { >+ try { >+ super.dispatchMessage(msg); >+ } catch (Exception e) { >+ Logging.e(TAG, "Exception on EglRenderer thread", e); >+ exceptionCallback.run(); >+ throw e; >+ } >+ } >+ } >+ > protected final String name; > > // |renderThreadHandler| is a handler for communicating with |renderThread|, and is synchronized >@@ -101,6 +125,7 @@ public class EglRenderer implements VideoSink { > @Nullable private EglBase eglBase; > private final VideoFrameDrawer frameDrawer = new VideoFrameDrawer(); > @Nullable private RendererCommon.GlDrawer drawer; >+ private boolean usePresentationTimeStamp; > private final Matrix drawMatrix = new Matrix(); > > // Pending frame to render. Serves as a queue with size 1. Synchronized on |frameLock|. >@@ -161,20 +186,31 @@ public class EglRenderer implements VideoSink { > * Initialize this class, sharing resources with |sharedContext|. The custom |drawer| will be used > * for drawing frames on the EGLSurface. This class is responsible for calling release() on > * |drawer|. It is allowed to call init() to reinitialize the renderer after a previous >- * init()/release() cycle. >+ * init()/release() cycle. If usePresentationTimeStamp is true, eglPresentationTimeANDROID will be >+ * set with the frame timestamps, which specifies desired presentation time and might be useful >+ * for e.g. syncing audio and video. 
> */ > public void init(@Nullable final EglBase.Context sharedContext, final int[] configAttributes, >- RendererCommon.GlDrawer drawer) { >+ RendererCommon.GlDrawer drawer, boolean usePresentationTimeStamp) { > synchronized (handlerLock) { > if (renderThreadHandler != null) { > throw new IllegalStateException(name + "Already initialized"); > } > logD("Initializing EglRenderer"); > this.drawer = drawer; >+ this.usePresentationTimeStamp = usePresentationTimeStamp; > > final HandlerThread renderThread = new HandlerThread(name + "EglRenderer"); > renderThread.start(); >- renderThreadHandler = new Handler(renderThread.getLooper()); >+ renderThreadHandler = >+ new HandlerWithExceptionCallback(renderThread.getLooper(), new Runnable() { >+ @Override >+ public void run() { >+ synchronized (handlerLock) { >+ renderThreadHandler = null; >+ } >+ } >+ }); > // Create EGL context on the newly created render thread. It should be possibly to create the > // context on this thread and make it current on the render thread, but this causes failure on > // some Marvel based JB devices. https://bugs.chromium.org/p/webrtc/issues/detail?id=6350. >@@ -198,6 +234,16 @@ public class EglRenderer implements VideoSink { > } > } > >+ /** >+ * Same as above with usePresentationTimeStamp set to false. 
>+ * >+ * @see #init(EglBase.Context, int[], RendererCommon.GlDrawer, boolean) >+ */ >+ public void init(@Nullable final EglBase.Context sharedContext, final int[] configAttributes, >+ RendererCommon.GlDrawer drawer) { >+ init(sharedContext, configAttributes, drawer, /* usePresentationTimeStamp= */ false); >+ } >+ > public void createEglSurface(Surface surface) { > createEglSurfaceInternal(surface); > } >@@ -585,7 +631,11 @@ public class EglRenderer implements VideoSink { > eglBase.surfaceWidth(), eglBase.surfaceHeight()); > > final long swapBuffersStartTimeNs = System.nanoTime(); >- eglBase.swapBuffers(); >+ if (usePresentationTimeStamp) { >+ eglBase.swapBuffers(frame.getTimestampNs()); >+ } else { >+ eglBase.swapBuffers(); >+ } > > final long currentTimeNs = System.nanoTime(); > synchronized (statisticsLock) { >@@ -652,7 +702,7 @@ public class EglRenderer implements VideoSink { > } > > private String averageTimeAsString(long sumTimeNs, int count) { >- return (count <= 0) ? "NA" : TimeUnit.NANOSECONDS.toMicros(sumTimeNs / count) + " μs"; >+ return (count <= 0) ? "NA" : TimeUnit.NANOSECONDS.toMicros(sumTimeNs / count) + " us"; > } > > private void logStatistics() { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/LibvpxVp8Decoder.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/LibvpxVp8Decoder.java >new file mode 100644 >index 0000000000000000000000000000000000000000..54ad0aa137c7520ebb5909e0ee34ab2da5cf5def >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/LibvpxVp8Decoder.java >@@ -0,0 +1,20 @@ >+/* >+ * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. 
All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+package org.webrtc; >+ >+public class LibvpxVp8Decoder extends WrappedNativeVideoDecoder { >+ @Override >+ public long createNativeVideoDecoder() { >+ return nativeCreateDecoder(); >+ } >+ >+ static native long nativeCreateDecoder(); >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/LibvpxVp8Encoder.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/LibvpxVp8Encoder.java >new file mode 100644 >index 0000000000000000000000000000000000000000..4be9e52c1421f241c4ccc91899be159203e3a681 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/LibvpxVp8Encoder.java >@@ -0,0 +1,25 @@ >+/* >+ * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+package org.webrtc; >+ >+public class LibvpxVp8Encoder extends WrappedNativeVideoEncoder { >+ @Override >+ public long createNativeVideoEncoder() { >+ return nativeCreateEncoder(); >+ } >+ >+ static native long nativeCreateEncoder(); >+ >+ @Override >+ public boolean isHardwareEncoder() { >+ return false; >+ } >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/LibvpxVp9Decoder.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/LibvpxVp9Decoder.java >new file mode 100644 >index 0000000000000000000000000000000000000000..90a24433a3ef741cc9b81d45e076ec228bc4db6b >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/LibvpxVp9Decoder.java >@@ -0,0 +1,22 @@ >+/* >+ * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+package org.webrtc; >+ >+public class LibvpxVp9Decoder extends WrappedNativeVideoDecoder { >+ @Override >+ public long createNativeVideoDecoder() { >+ return nativeCreateDecoder(); >+ } >+ >+ static native long nativeCreateDecoder(); >+ >+ static native boolean nativeIsSupported(); >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/LibvpxVp9Encoder.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/LibvpxVp9Encoder.java >new file mode 100644 >index 0000000000000000000000000000000000000000..1211ae93fb8ac5a179d3f77dabf7e3c9f2685f30 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/LibvpxVp9Encoder.java >@@ -0,0 +1,27 @@ >+/* >+ * Copyright (c) 2017 The WebRTC project authors. 
All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+package org.webrtc; >+ >+public class LibvpxVp9Encoder extends WrappedNativeVideoEncoder { >+ @Override >+ public long createNativeVideoEncoder() { >+ return nativeCreateEncoder(); >+ } >+ >+ static native long nativeCreateEncoder(); >+ >+ @Override >+ public boolean isHardwareEncoder() { >+ return false; >+ } >+ >+ static native boolean nativeIsSupported(); >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/MediaTransportFactoryFactory.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/MediaTransportFactoryFactory.java >new file mode 100644 >index 0000000000000000000000000000000000000000..c16a37a6d777539a460fc2946df1a220affa2365 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/MediaTransportFactoryFactory.java >@@ -0,0 +1,22 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+package org.webrtc; >+ >+/** >+ * Factory for creating webrtc::MediaTransportFactory instances. >+ */ >+public interface MediaTransportFactoryFactory { >+ /** >+ * Dynamically allocates a webrtc::MediaTransportFactory instance and returns a pointer to it. >+ * The caller takes ownership of the object. 
>+ */ >+ public long createNativeMediaTransportFactory(); >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/NetworkMonitor.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/NetworkMonitor.java >index ba6bae0ed83498321656deb8251fd16b5d258751..9764a0cb354ff772aa3a3555479d41ab802ef5fe 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/NetworkMonitor.java >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/NetworkMonitor.java >@@ -14,11 +14,10 @@ import static org.webrtc.NetworkMonitorAutoDetect.INVALID_NET_ID; > > import android.content.Context; > import android.os.Build; >-import javax.annotation.Nullable; > import java.util.ArrayList; > import java.util.List; >-import org.webrtc.NetworkMonitorAutoDetect.ConnectionType; >-import org.webrtc.NetworkMonitorAutoDetect.NetworkInformation; >+import javax.annotation.Nullable; >+import org.webrtc.NetworkMonitorAutoDetect; > > /** > * Borrowed from Chromium's >@@ -33,7 +32,7 @@ public class NetworkMonitor { > * Alerted when the connection type of the network changes. The alert is fired on the UI thread. > */ > public interface NetworkObserver { >- public void onConnectionTypeChanged(ConnectionType connectionType); >+ public void onConnectionTypeChanged(NetworkMonitorAutoDetect.ConnectionType connectionType); > } > > private static final String TAG = "NetworkMonitor"; >@@ -55,13 +54,13 @@ public class NetworkMonitor { > // Also guarded by autoDetectLock. 
> private int numObservers; > >- private volatile ConnectionType currentConnectionType; >+ private volatile NetworkMonitorAutoDetect.ConnectionType currentConnectionType; > > private NetworkMonitor() { > nativeNetworkObservers = new ArrayList<Long>(); > networkObservers = new ArrayList<NetworkObserver>(); > numObservers = 0; >- currentConnectionType = ConnectionType.CONNECTION_UNKNOWN; >+ currentConnectionType = NetworkMonitorAutoDetect.ConnectionType.CONNECTION_UNKNOWN; > } > > // TODO(sakal): Remove once downstream dependencies have been updated. >@@ -155,7 +154,7 @@ public class NetworkMonitor { > return Build.VERSION.SDK_INT; > } > >- private ConnectionType getCurrentConnectionType() { >+ private NetworkMonitorAutoDetect.ConnectionType getCurrentConnectionType() { > return currentConnectionType; > } > >@@ -169,12 +168,13 @@ public class NetworkMonitor { > return new NetworkMonitorAutoDetect(new NetworkMonitorAutoDetect.Observer() { > > @Override >- public void onConnectionTypeChanged(ConnectionType newConnectionType) { >+ public void onConnectionTypeChanged( >+ NetworkMonitorAutoDetect.ConnectionType newConnectionType) { > updateCurrentConnectionType(newConnectionType); > } > > @Override >- public void onNetworkConnect(NetworkInformation networkInfo) { >+ public void onNetworkConnect(NetworkMonitorAutoDetect.NetworkInformation networkInfo) { > notifyObserversOfNetworkConnect(networkInfo); > } > >@@ -185,13 +185,15 @@ public class NetworkMonitor { > }, appContext); > } > >- private void updateCurrentConnectionType(ConnectionType newConnectionType) { >+ private void updateCurrentConnectionType( >+ NetworkMonitorAutoDetect.ConnectionType newConnectionType) { > currentConnectionType = newConnectionType; > notifyObserversOfConnectionTypeChange(newConnectionType); > } > > /** Alerts all observers of a connection change. 
*/ >- private void notifyObserversOfConnectionTypeChange(ConnectionType newConnectionType) { >+ private void notifyObserversOfConnectionTypeChange( >+ NetworkMonitorAutoDetect.ConnectionType newConnectionType) { > List<Long> nativeObservers = getNativeNetworkObserversSync(); > for (Long nativeObserver : nativeObservers) { > nativeNotifyConnectionTypeChanged(nativeObserver); >@@ -206,7 +208,8 @@ public class NetworkMonitor { > } > } > >- private void notifyObserversOfNetworkConnect(NetworkInformation networkInfo) { >+ private void notifyObserversOfNetworkConnect( >+ NetworkMonitorAutoDetect.NetworkInformation networkInfo) { > List<Long> nativeObservers = getNativeNetworkObserversSync(); > for (Long nativeObserver : nativeObservers) { > nativeNotifyOfNetworkConnect(nativeObserver, networkInfo); >@@ -221,7 +224,7 @@ public class NetworkMonitor { > } > > private void updateObserverActiveNetworkList(long nativeObserver) { >- List<NetworkInformation> networkInfoList; >+ List<NetworkMonitorAutoDetect.NetworkInformation> networkInfoList; > synchronized (autoDetectLock) { > networkInfoList = (autoDetect == null) ? null : autoDetect.getActiveNetworkList(); > } >@@ -229,7 +232,8 @@ public class NetworkMonitor { > return; > } > >- NetworkInformation[] networkInfos = new NetworkInformation[networkInfoList.size()]; >+ NetworkMonitorAutoDetect.NetworkInformation[] networkInfos = >+ new NetworkMonitorAutoDetect.NetworkInformation[networkInfoList.size()]; > networkInfos = networkInfoList.toArray(networkInfos); > nativeNotifyOfActiveNetworkList(nativeObserver, networkInfos); > } >@@ -274,17 +278,18 @@ public class NetworkMonitor { > > /** Checks if there currently is connectivity. 
*/ > public static boolean isOnline() { >- ConnectionType connectionType = getInstance().getCurrentConnectionType(); >- return connectionType != ConnectionType.CONNECTION_NONE; >+ NetworkMonitorAutoDetect.ConnectionType connectionType = >+ getInstance().getCurrentConnectionType(); >+ return connectionType != NetworkMonitorAutoDetect.ConnectionType.CONNECTION_NONE; > } > > private native void nativeNotifyConnectionTypeChanged(long nativeAndroidNetworkMonitor); > private native void nativeNotifyOfNetworkConnect( >- long nativeAndroidNetworkMonitor, NetworkInformation networkInfo); >+ long nativeAndroidNetworkMonitor, NetworkMonitorAutoDetect.NetworkInformation networkInfo); > private native void nativeNotifyOfNetworkDisconnect( > long nativeAndroidNetworkMonitor, long networkHandle); > private native void nativeNotifyOfActiveNetworkList( >- long nativeAndroidNetworkMonitor, NetworkInformation[] networkInfos); >+ long nativeAndroidNetworkMonitor, NetworkMonitorAutoDetect.NetworkInformation[] networkInfos); > > // For testing only. > @Nullable >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/NetworkMonitorAutoDetect.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/NetworkMonitorAutoDetect.java >index 3d74f5fd7c5da4d82c94c2e9cdecdf4a01b9acd1..179f83d3761be1bc508b26ddc3d1be6fc77fd3f9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/NetworkMonitorAutoDetect.java >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/NetworkMonitorAutoDetect.java >@@ -297,7 +297,8 @@ public class NetworkMonitorAutoDetect extends BroadcastReceiver { > // https://android.googlesource.com/platform/frameworks/base/+/d6a7980d > NetworkInfo underlyingActiveNetworkInfo = connectivityManager.getActiveNetworkInfo(); > // We use the NetworkInfo of the underlying network if it is not of TYPE_VPN itself. 
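The `NetworkMonitorAutoDetect` hunk above is a null-pointer fix: `ConnectivityManager.getActiveNetworkInfo()` can return `null` (for example when the VPN is the only network and there is no underlying active network), and the old code dereferenced the result unconditionally. A minimal stand-alone sketch of the guarded pattern — `NetworkInfoStub` and the type constants are illustrative stand-ins for the Android API, not the real classes:

```java
// Illustrative stand-in for android.net.NetworkInfo; only getType() is modeled.
class NetworkInfoStub {
    private final int type;
    NetworkInfoStub(int type) { this.type = type; }
    int getType() { return type; }
}

public class VpnGuardExample {
    // Stand-in constants mirroring ConnectivityManager.TYPE_VPN / TYPE_WIFI.
    static final int TYPE_VPN = 17;
    static final int TYPE_WIFI = 1;

    // Returns the underlying network type when a non-VPN underlying network is
    // known, or -1 when there is none -- the null case the patch now handles
    // instead of throwing a NullPointerException.
    static int underlyingTypeOrUnknown(NetworkInfoStub underlying) {
        if (underlying != null && underlying.getType() != TYPE_VPN) {
            return underlying.getType();
        }
        return -1;
    }

    public static void main(String[] args) {
        System.out.println(underlyingTypeOrUnknown(new NetworkInfoStub(TYPE_WIFI))); // 1
        System.out.println(underlyingTypeOrUnknown(null)); // -1
    }
}
```

The shape matches the patched code: check for `null` first, then check the type, and fall back to the "unknown underlying network" path in both failure cases.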
>- if (underlyingActiveNetworkInfo.getType() != ConnectivityManager.TYPE_VPN) { >+ if (underlyingActiveNetworkInfo != null >+ && underlyingActiveNetworkInfo.getType() != ConnectivityManager.TYPE_VPN) { > return new NetworkState(networkInfo.isConnected(), ConnectivityManager.TYPE_VPN, -1, > underlyingActiveNetworkInfo.getType(), underlyingActiveNetworkInfo.getSubtype()); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/PeerConnection.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/PeerConnection.java >index 84d49ebb8a7dcd412d4096b9c2cb0a172d39b2ca..47c54ee549c4f98a860e34013067dfbe8d968b05 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/PeerConnection.java >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/PeerConnection.java >@@ -14,6 +14,9 @@ import java.util.ArrayList; > import java.util.Collections; > import java.util.List; > import javax.annotation.Nullable; >+import org.webrtc.DataChannel; >+import org.webrtc.MediaStreamTrack; >+import org.webrtc.RtpTransceiver; > > /** > * Java-land version of the PeerConnection APIs; wraps the C++ API >@@ -50,6 +53,21 @@ public class PeerConnection { > } > } > >+ /** Tracks PeerConnectionInterface::PeerConnectionState */ >+ public enum PeerConnectionState { >+ NEW, >+ CONNECTING, >+ CONNECTED, >+ DISCONNECTED, >+ FAILED, >+ CLOSED; >+ >+ @CalledByNative("PeerConnectionState") >+ static PeerConnectionState fromNativeIndex(int nativeIndex) { >+ return values()[nativeIndex]; >+ } >+ } >+ > /** Tracks PeerConnectionInterface::TlsCertPolicy */ > public enum TlsCertPolicy { > TLS_CERT_POLICY_SECURE, >@@ -79,6 +97,10 @@ public class PeerConnection { > /** Triggered when the IceConnectionState changes. */ > @CalledByNative("Observer") void onIceConnectionChange(IceConnectionState newState); > >+ /** Triggered when the PeerConnectionState changes. 
*/ >+ @CalledByNative("Observer") >+ default void onConnectionChange(PeerConnectionState newState) {} >+ > /** Triggered when the ICE connection receiving status changes. */ > @CalledByNative("Observer") void onIceConnectionReceivingChange(boolean receiving); > >@@ -385,6 +407,7 @@ public class PeerConnection { > public IceTransportsType iceTransportsType; > public List<IceServer> iceServers; > public BundlePolicy bundlePolicy; >+ @Nullable public RtcCertificatePem certificate; > public RtcpMuxPolicy rtcpMuxPolicy; > public TcpCandidatePolicy tcpCandidatePolicy; > public CandidateNetworkPolicy candidateNetworkPolicy; >@@ -461,6 +484,25 @@ public class PeerConnection { > // every offer/answer negotiation.This is only intended to be a workaround for crbug.com/835958 > public boolean activeResetSrtpParams; > >+ /* >+ * Experimental flag that enables a use of media transport. If this is true, the media transport >+ * factory MUST be provided to the PeerConnectionFactory. >+ */ >+ public boolean useMediaTransport; >+ >+ /* >+ * Experimental flag that enables a use of media transport for data channels. If this is true, >+ * the media transport factory MUST be provided to the PeerConnectionFactory. >+ */ >+ public boolean useMediaTransportForDataChannels; >+ >+ /** >+ * Defines advanced optional cryptographic settings related to SRTP and >+ * frame encryption for native WebRTC. Setting this will overwrite any >+ * options set through the PeerConnectionFactory (which is deprecated). >+ */ >+ @Nullable public CryptoOptions cryptoOptions; >+ > // TODO(deadbeef): Instead of duplicating the defaults here, we should do > // something to pick up the defaults from C++. The Objective-C equivalent > // of RTCConfiguration does that. 
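The new `PeerConnectionState` enum added above maps a native index to a Java constant with `values()[nativeIndex]`, which silently depends on the Java declaration order matching the C++ `PeerConnectionInterface::PeerConnectionState` order. A stand-alone sketch of that mapping pattern — the enum constants and order are copied from the patch, while the surrounding class is just scaffolding for illustration:

```java
public class PeerConnectionStateExample {
    // Same constants, same order as the PeerConnectionState enum in the patch.
    enum PeerConnectionState {
        NEW, CONNECTING, CONNECTED, DISCONNECTED, FAILED, CLOSED;

        // Mirrors the @CalledByNative fromNativeIndex helper: the native side
        // passes an int, and ordinal position encodes the state.
        static PeerConnectionState fromNativeIndex(int nativeIndex) {
            return values()[nativeIndex];
        }
    }

    public static void main(String[] args) {
        System.out.println(PeerConnectionState.fromNativeIndex(0)); // NEW
        System.out.println(PeerConnectionState.fromNativeIndex(4)); // FAILED
    }
}
```

Because the mapping is positional, reordering or inserting enum constants on either the Java or C++ side would corrupt every state delivered through `onConnectionChange`; the same idiom is used by the existing `IceConnectionState` and `SignalingState` enums.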
>@@ -500,6 +542,9 @@ public class PeerConnection { > networkPreference = AdapterType.UNKNOWN; > sdpSemantics = SdpSemantics.PLAN_B; > activeResetSrtpParams = false; >+ useMediaTransport = false; >+ useMediaTransportForDataChannels = false; >+ cryptoOptions = null; > } > > @CalledByNative("RTCConfiguration") >@@ -517,6 +562,12 @@ public class PeerConnection { > return bundlePolicy; > } > >+ @Nullable >+ @CalledByNative("RTCConfiguration") >+ RtcCertificatePem getCertificate() { >+ return certificate; >+ } >+ > @CalledByNative("RTCConfiguration") > RtcpMuxPolicy getRtcpMuxPolicy() { > return rtcpMuxPolicy; >@@ -692,6 +743,22 @@ public class PeerConnection { > boolean getActiveResetSrtpParams() { > return activeResetSrtpParams; > } >+ >+ @CalledByNative("RTCConfiguration") >+ boolean getUseMediaTransport() { >+ return useMediaTransport; >+ } >+ >+ @CalledByNative("RTCConfiguration") >+ boolean getUseMediaTransportForDataChannels() { >+ return useMediaTransportForDataChannels; >+ } >+ >+ @Nullable >+ @CalledByNative("RTCConfiguration") >+ CryptoOptions getCryptoOptions() { >+ return cryptoOptions; >+ } > }; > > private final List<MediaStream> localStreams = new ArrayList<>(); >@@ -721,6 +788,10 @@ public class PeerConnection { > return nativeGetRemoteDescription(); > } > >+ public RtcCertificatePem getCertificate() { >+ return nativeGetCertificate(); >+ } >+ > public DataChannel createDataChannel(String label, DataChannel.Init init) { > return nativeCreateDataChannel(label, init); > } >@@ -1044,6 +1115,10 @@ public class PeerConnection { > return nativeIceConnectionState(); > } > >+ public PeerConnectionState connectionState() { >+ return nativeConnectionState(); >+ } >+ > public IceGatheringState iceGatheringState() { > return nativeIceGatheringState(); > } >@@ -1107,6 +1182,7 @@ public class PeerConnection { > private native long nativeGetNativePeerConnection(); > private native SessionDescription nativeGetLocalDescription(); > private native SessionDescription 
nativeGetRemoteDescription(); >+ private native RtcCertificatePem nativeGetCertificate(); > private native DataChannel nativeCreateDataChannel(String label, DataChannel.Init init); > private native void nativeCreateOffer(SdpObserver observer, MediaConstraints constraints); > private native void nativeCreateAnswer(SdpObserver observer, MediaConstraints constraints); >@@ -1117,6 +1193,7 @@ public class PeerConnection { > private native boolean nativeSetBitrate(Integer min, Integer current, Integer max); > private native SignalingState nativeSignalingState(); > private native IceConnectionState nativeIceConnectionState(); >+ private native PeerConnectionState nativeConnectionState(); > private native IceGatheringState nativeIceGatheringState(); > private native void nativeClose(); > private static native long nativeCreatePeerConnectionObserver(Observer observer); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/PeerConnectionFactory.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/PeerConnectionFactory.java >index eb98e923f6cc856cf8060e5dec67e903b8ec9de0..018fb76e8570c4379c4f31a3da66ef9fb7234b9b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/PeerConnectionFactory.java >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/PeerConnectionFactory.java >@@ -14,6 +14,7 @@ import android.content.Context; > import java.util.List; > import javax.annotation.Nullable; > import org.webrtc.Logging.Severity; >+import org.webrtc.PeerConnection; > import org.webrtc.audio.AudioDeviceModule; > import org.webrtc.audio.LegacyAudioDeviceModule; > >@@ -122,8 +123,6 @@ public class PeerConnectionFactory { > public int networkIgnoreMask; > public boolean disableEncryption; > public boolean disableNetworkMonitor; >- public boolean enableAes128Sha1_32CryptoCipher; >- public boolean enableGcmCryptoSuites; > > @CalledByNative("Options") > int getNetworkIgnoreMask() { >@@ -139,25 
+138,20 @@ public class PeerConnectionFactory { > boolean getDisableNetworkMonitor() { > return disableNetworkMonitor; > } >- >- @CalledByNative("Options") >- boolean getEnableAes128Sha1_32CryptoCipher() { >- return enableAes128Sha1_32CryptoCipher; >- } >- >- @CalledByNative("Options") >- boolean getEnableGcmCryptoSuites() { >- return enableGcmCryptoSuites; >- } > } > > public static class Builder { >- private @Nullable Options options; >- private @Nullable AudioDeviceModule audioDeviceModule = new LegacyAudioDeviceModule(); >- private @Nullable VideoEncoderFactory encoderFactory; >- private @Nullable VideoDecoderFactory decoderFactory; >- private @Nullable AudioProcessingFactory audioProcessingFactory; >- private @Nullable FecControllerFactoryFactoryInterface fecControllerFactoryFactory; >+ @Nullable private Options options; >+ @Nullable private AudioDeviceModule audioDeviceModule = new LegacyAudioDeviceModule(); >+ private AudioEncoderFactoryFactory audioEncoderFactoryFactory = >+ new BuiltinAudioEncoderFactoryFactory(); >+ private AudioDecoderFactoryFactory audioDecoderFactoryFactory = >+ new BuiltinAudioDecoderFactoryFactory(); >+ @Nullable private VideoEncoderFactory videoEncoderFactory; >+ @Nullable private VideoDecoderFactory videoDecoderFactory; >+ @Nullable private AudioProcessingFactory audioProcessingFactory; >+ @Nullable private FecControllerFactoryFactoryInterface fecControllerFactoryFactory; >+ @Nullable private MediaTransportFactoryFactory mediaTransportFactoryFactory; > > private Builder() {} > >@@ -171,13 +165,33 @@ public class PeerConnectionFactory { > return this; > } > >- public Builder setVideoEncoderFactory(VideoEncoderFactory encoderFactory) { >- this.encoderFactory = encoderFactory; >+ public Builder setAudioEncoderFactoryFactory( >+ AudioEncoderFactoryFactory audioEncoderFactoryFactory) { >+ if (audioEncoderFactoryFactory == null) { >+ throw new IllegalArgumentException( >+ "PeerConnectionFactory.Builder does not accept a null 
AudioEncoderFactoryFactory."); >+ } >+ this.audioEncoderFactoryFactory = audioEncoderFactoryFactory; >+ return this; >+ } >+ >+ public Builder setAudioDecoderFactoryFactory( >+ AudioDecoderFactoryFactory audioDecoderFactoryFactory) { >+ if (audioDecoderFactoryFactory == null) { >+ throw new IllegalArgumentException( >+ "PeerConnectionFactory.Builder does not accept a null AudioDecoderFactoryFactory."); >+ } >+ this.audioDecoderFactoryFactory = audioDecoderFactoryFactory; >+ return this; >+ } >+ >+ public Builder setVideoEncoderFactory(VideoEncoderFactory videoEncoderFactory) { >+ this.videoEncoderFactory = videoEncoderFactory; > return this; > } > >- public Builder setVideoDecoderFactory(VideoDecoderFactory decoderFactory) { >- this.decoderFactory = decoderFactory; >+ public Builder setVideoDecoderFactory(VideoDecoderFactory videoDecoderFactory) { >+ this.videoDecoderFactory = videoDecoderFactory; > return this; > } > >@@ -196,9 +210,17 @@ public class PeerConnectionFactory { > return this; > } > >+ /** Sets a MediaTransportFactoryFactory for a PeerConnectionFactory. 
*/ >+ public Builder setMediaTransportFactoryFactory( >+ MediaTransportFactoryFactory mediaTransportFactoryFactory) { >+ this.mediaTransportFactoryFactory = mediaTransportFactoryFactory; >+ return this; >+ } >+ > public PeerConnectionFactory createPeerConnectionFactory() { >- return new PeerConnectionFactory(options, audioDeviceModule, encoderFactory, decoderFactory, >- audioProcessingFactory, fecControllerFactoryFactory); >+ return new PeerConnectionFactory(options, audioDeviceModule, audioEncoderFactoryFactory, >+ audioDecoderFactoryFactory, videoEncoderFactory, videoDecoderFactory, >+ audioProcessingFactory, fecControllerFactoryFactory, mediaTransportFactoryFactory); > } > } > >@@ -277,15 +299,24 @@ public class PeerConnectionFactory { > } > > private PeerConnectionFactory(Options options, @Nullable AudioDeviceModule audioDeviceModule, >- @Nullable VideoEncoderFactory encoderFactory, @Nullable VideoDecoderFactory decoderFactory, >+ AudioEncoderFactoryFactory audioEncoderFactoryFactory, >+ AudioDecoderFactoryFactory audioDecoderFactoryFactory, >+ @Nullable VideoEncoderFactory videoEncoderFactory, >+ @Nullable VideoDecoderFactory videoDecoderFactory, > @Nullable AudioProcessingFactory audioProcessingFactory, >- @Nullable FecControllerFactoryFactoryInterface fecControllerFactoryFactory) { >+ @Nullable FecControllerFactoryFactoryInterface fecControllerFactoryFactory, >+ @Nullable MediaTransportFactoryFactory mediaTransportFactoryFactory) { > checkInitializeHasBeenCalled(); > nativeFactory = nativeCreatePeerConnectionFactory(ContextUtils.getApplicationContext(), options, > audioDeviceModule == null ? 0 : audioDeviceModule.getNativeAudioDeviceModulePointer(), >- encoderFactory, decoderFactory, >+ audioEncoderFactoryFactory.createNativeAudioEncoderFactory(), >+ audioDecoderFactoryFactory.createNativeAudioDecoderFactory(), videoEncoderFactory, >+ videoDecoderFactory, > audioProcessingFactory == null ? 
0 : audioProcessingFactory.createNative(), >- fecControllerFactoryFactory == null ? 0 : fecControllerFactoryFactory.createNative()); >+ fecControllerFactoryFactory == null ? 0 : fecControllerFactoryFactory.createNative(), >+ mediaTransportFactoryFactory == null >+ ? 0 >+ : mediaTransportFactoryFactory.createNativeMediaTransportFactory()); > if (nativeFactory == 0) { > throw new RuntimeException("Failed to initialize PeerConnectionFactory!"); > } >@@ -369,9 +400,25 @@ public class PeerConnectionFactory { > return new MediaStream(nativeCreateLocalMediaStream(nativeFactory, label)); > } > >- public VideoSource createVideoSource(boolean isScreencast) { >+ /** >+ * Create video source with given parameters. If alignTimestamps is false, the caller is >+ * responsible for aligning the frame timestamps to rtc::TimeNanos(). This can be used to achieve >+ * higher accuracy if there is a big delay between frame creation and frames being delivered to >+ * the returned video source. If alignTimestamps is true, timestamps will be aligned to >+ * rtc::TimeNanos() when they arrive to the returned video source. >+ */ >+ public VideoSource createVideoSource(boolean isScreencast, boolean alignTimestamps) { > checkPeerConnectionFactoryExists(); >- return new VideoSource(nativeCreateVideoSource(nativeFactory, isScreencast)); >+ return new VideoSource(nativeCreateVideoSource(nativeFactory, isScreencast, alignTimestamps)); >+ } >+ >+ /** >+ * Same as above with alignTimestamps set to true. 
>+ * >+ * @see #createVideoSource(boolean, boolean) >+ */ >+ public VideoSource createVideoSource(boolean isScreencast) { >+ return createVideoSource(isScreencast, /* alignTimestamps= */ true); > } > > public VideoTrack createVideoTrack(String id, VideoSource source) { >@@ -486,15 +533,18 @@ public class PeerConnectionFactory { > private static native void nativeShutdownInternalTracer(); > private static native boolean nativeStartInternalTracingCapture(String tracingFilename); > private static native void nativeStopInternalTracingCapture(); >+ > private static native long nativeCreatePeerConnectionFactory(Context context, Options options, >- long nativeAudioDeviceModule, VideoEncoderFactory encoderFactory, >- VideoDecoderFactory decoderFactory, long nativeAudioProcessor, >- long nativeFecControllerFactory); >+ long nativeAudioDeviceModule, long audioEncoderFactory, long audioDecoderFactory, >+ VideoEncoderFactory encoderFactory, VideoDecoderFactory decoderFactory, >+ long nativeAudioProcessor, long nativeFecControllerFactory, long mediaTransportFactory); >+ > private static native long nativeCreatePeerConnection(long factory, > PeerConnection.RTCConfiguration rtcConfig, MediaConstraints constraints, long nativeObserver, > SSLCertificateVerifier sslCertificateVerifier); > private static native long nativeCreateLocalMediaStream(long factory, String label); >- private static native long nativeCreateVideoSource(long factory, boolean is_screencast); >+ private static native long nativeCreateVideoSource( >+ long factory, boolean is_screencast, boolean alignTimestamps); > private static native long nativeCreateVideoTrack( > long factory, String id, long nativeVideoSource); > private static native long nativeCreateAudioSource(long factory, MediaConstraints constraints); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/RtcCertificatePem.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/RtcCertificatePem.java >new 
file mode 100644 >index 0000000000000000000000000000000000000000..6070135b3e12f27ef753596b45f9ce9ddf9415f0 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/RtcCertificatePem.java >@@ -0,0 +1,75 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+package org.webrtc; >+ >+import org.webrtc.PeerConnection; >+ >+/** >+ * Easily storable/serializable version of a native C++ RTCCertificatePEM. >+ */ >+public class RtcCertificatePem { >+ /** PEM string representation of the private key. */ >+ public final String privateKey; >+ /** PEM string representation of the certificate. */ >+ public final String certificate; >+ /** Default expiration time of 30 days. */ >+ private static final long DEFAULT_EXPIRY = 60 * 60 * 24 * 30; >+ >+ /** Instantiate an RtcCertificatePem object from stored strings. */ >+ @CalledByNative >+ public RtcCertificatePem(String privateKey, String certificate) { >+ this.privateKey = privateKey; >+ this.certificate = certificate; >+ } >+ >+ @CalledByNative >+ String getPrivateKey() { >+ return privateKey; >+ } >+ >+ @CalledByNative >+ String getCertificate() { >+ return certificate; >+ } >+ >+ /** >+ * Generate a new RtcCertificatePem with the default settings of KeyType = ECDSA and >+ * expires = 30 days. >+ */ >+ public static RtcCertificatePem generateCertificate() { >+ return nativeGenerateCertificate(PeerConnection.KeyType.ECDSA, DEFAULT_EXPIRY); >+ } >+ >+ /** >+ * Generate a new RtcCertificatePem with a custom KeyType and the default setting of >+ * expires = 30 days. 
>+ */ >+ public static RtcCertificatePem generateCertificate(PeerConnection.KeyType keyType) { >+ return nativeGenerateCertificate(keyType, DEFAULT_EXPIRY); >+ } >+ >+ /** >+ * Generate a new RtcCertificatePem with a custom expires and the default setting of >+ * KeyType = ECDSA. >+ */ >+ public static RtcCertificatePem generateCertificate(long expires) { >+ return nativeGenerateCertificate(PeerConnection.KeyType.ECDSA, expires); >+ } >+ >+ /** Generate a new RtcCertificatePem with a custom KeyType and a custom expires. */ >+ public static RtcCertificatePem generateCertificate( >+ PeerConnection.KeyType keyType, long expires) { >+ return nativeGenerateCertificate(keyType, expires); >+ } >+ >+ private static native RtcCertificatePem nativeGenerateCertificate( >+ PeerConnection.KeyType keyType, long expires); >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/RtpTransceiver.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/RtpTransceiver.java >index a4a5aa0540581c611b017aaedf69ba0542cd0fff..7f875e6c8ead46d281c3131e96c72ba5549278eb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/RtpTransceiver.java >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/RtpTransceiver.java >@@ -13,6 +13,7 @@ package org.webrtc; > import java.util.ArrayList; > import java.util.Collections; > import java.util.List; >+import org.webrtc.MediaStreamTrack; > > /** > * Java wrapper for a C++ RtpTransceiverInterface. 
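Reviewer note: the RtcCertificatePem hunk above layers four public generateCertificate() overloads over a single native entry point, each filling in the defaults (KeyType = ECDSA, 30-day expiry in seconds). A self-contained sketch of that delegation pattern — the class, method names, and string "certificate" stand-in here are hypothetical, not part of this patch:

```java
// Sketch of the overload-delegation pattern used by RtcCertificatePem:
// every public overload funnels into one entry point (the native method in
// the real class) with defaults filled in. All names here are illustrative.
public class CertificateFacade {
  public enum KeyType { ECDSA, RSA }

  /** Default expiration time of 30 days, expressed in seconds. */
  private static final long DEFAULT_EXPIRY = 60 * 60 * 24 * 30;

  public static String generate() {
    return generate(KeyType.ECDSA, DEFAULT_EXPIRY);
  }

  public static String generate(KeyType keyType) {
    return generate(keyType, DEFAULT_EXPIRY);
  }

  public static String generate(long expires) {
    return generate(KeyType.ECDSA, expires);
  }

  /** Single funnel point; in the real class this is the native JNI method. */
  public static String generate(KeyType keyType, long expires) {
    return keyType + ":" + expires;
  }

  public static void main(String[] args) {
    System.out.println(generate());            // ECDSA:2592000
    System.out.println(generate(KeyType.RSA)); // RSA:2592000
  }
}
```

The pattern keeps the JNI surface minimal (one native method) while giving Java callers idiomatic defaults.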
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/SoftwareVideoDecoderFactory.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/SoftwareVideoDecoderFactory.java >index 86dfb7a2c6f211eca41fbc3a9c41a63ea2007333..f421f16b1a1f8b97bff07e3b9c7380a6b9ce91bb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/SoftwareVideoDecoderFactory.java >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/SoftwareVideoDecoderFactory.java >@@ -27,10 +27,10 @@ public class SoftwareVideoDecoderFactory implements VideoDecoderFactory { > @Override > public VideoDecoder createDecoder(VideoCodecInfo codecType) { > if (codecType.getName().equalsIgnoreCase("VP8")) { >- return new VP8Decoder(); >+ return new LibvpxVp8Decoder(); > } >- if (codecType.getName().equalsIgnoreCase("VP9") && VP9Decoder.nativeIsSupported()) { >- return new VP9Decoder(); >+ if (codecType.getName().equalsIgnoreCase("VP9") && LibvpxVp9Decoder.nativeIsSupported()) { >+ return new LibvpxVp9Decoder(); > } > > return null; >@@ -45,7 +45,7 @@ public class SoftwareVideoDecoderFactory implements VideoDecoderFactory { > List<VideoCodecInfo> codecs = new ArrayList<VideoCodecInfo>(); > > codecs.add(new VideoCodecInfo("VP8", new HashMap<>())); >- if (VP9Decoder.nativeIsSupported()) { >+ if (LibvpxVp9Decoder.nativeIsSupported()) { > codecs.add(new VideoCodecInfo("VP9", new HashMap<>())); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/SoftwareVideoEncoderFactory.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/SoftwareVideoEncoderFactory.java >index a2ddf36c389d7c142939acbd638627fcabf6caeb..3c36c4d56a283313151fd9fedaf88376747a2351 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/SoftwareVideoEncoderFactory.java >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/SoftwareVideoEncoderFactory.java >@@ 
-20,10 +20,10 @@ public class SoftwareVideoEncoderFactory implements VideoEncoderFactory { > @Override > public VideoEncoder createEncoder(VideoCodecInfo info) { > if (info.name.equalsIgnoreCase("VP8")) { >- return new VP8Encoder(); >+ return new LibvpxVp8Encoder(); > } >- if (info.name.equalsIgnoreCase("VP9") && VP9Encoder.nativeIsSupported()) { >- return new VP9Encoder(); >+ if (info.name.equalsIgnoreCase("VP9") && LibvpxVp9Encoder.nativeIsSupported()) { >+ return new LibvpxVp9Encoder(); > } > > return null; >@@ -38,7 +38,7 @@ public class SoftwareVideoEncoderFactory implements VideoEncoderFactory { > List<VideoCodecInfo> codecs = new ArrayList<VideoCodecInfo>(); > > codecs.add(new VideoCodecInfo("VP8", new HashMap<>())); >- if (VP9Encoder.nativeIsSupported()) { >+ if (LibvpxVp9Encoder.nativeIsSupported()) { > codecs.add(new VideoCodecInfo("VP9", new HashMap<>())); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/SurfaceTextureHelper.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/SurfaceTextureHelper.java >index bb67d1f9fe0a62c70109a92482252c4c44de0e0b..29df688901e1626d1f3e046a6d41f99d7486bd32 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/SurfaceTextureHelper.java >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/SurfaceTextureHelper.java >@@ -36,10 +36,14 @@ public class SurfaceTextureHelper { > /** > * Construct a new SurfaceTextureHelper sharing OpenGL resources with |sharedContext|. A dedicated > * thread and handler is created for handling the SurfaceTexture. May return null if EGL fails to >- * initialize a pixel buffer surface and make it current. >+ * initialize a pixel buffer surface and make it current. If alignTimestamps is true, the frame >+ * timestamps will be aligned to rtc::TimeNanos(). 
If frame timestamps are aligned to >+ * rtc::TimeNanos() there is no need for aligning timestamps again in >+ * PeerConnectionFactory.createVideoSource(). This makes the timestamps more accurate and >+ * closer to actual creation time. > */ > public static SurfaceTextureHelper create( >- final String threadName, final EglBase.Context sharedContext) { >+ final String threadName, final EglBase.Context sharedContext, boolean alignTimestamps) { > final HandlerThread thread = new HandlerThread(threadName); > thread.start(); > final Handler handler = new Handler(thread.getLooper()); >@@ -53,7 +57,7 @@ public class SurfaceTextureHelper { > @Override > public SurfaceTextureHelper call() { > try { >- return new SurfaceTextureHelper(sharedContext, handler); >+ return new SurfaceTextureHelper(sharedContext, handler, alignTimestamps); > } catch (RuntimeException e) { > Logging.e(TAG, threadName + " create failure", e); > return null; >@@ -62,11 +66,22 @@ public class SurfaceTextureHelper { > }); > } > >+ /** >+ * Same as above with alignTimestamps set to false. >+ * >+ * @see #create(String, EglBase.Context, boolean) >+ */ >+ public static SurfaceTextureHelper create( >+ final String threadName, final EglBase.Context sharedContext) { >+ return create(threadName, sharedContext, /* alignTimestamps= */ false); >+ } >+ > private final Handler handler; > private final EglBase eglBase; > private final SurfaceTexture surfaceTexture; > private final int oesTextureId; > private final YuvConverter yuvConverter = new YuvConverter(); >+ @Nullable private final TimestampAligner timestampAligner; > > // These variables are only accessed from the |handler| thread. 
> @Nullable private VideoSink listener; >@@ -95,11 +110,13 @@ public class SurfaceTextureHelper { > } > }; > >- private SurfaceTextureHelper(EglBase.Context sharedContext, Handler handler) { >+ private SurfaceTextureHelper( >+ EglBase.Context sharedContext, Handler handler, boolean alignTimestamps) { > if (handler.getLooper().getThread() != Thread.currentThread()) { > throw new IllegalStateException("SurfaceTextureHelper must be created on the handler thread"); > } > this.handler = handler; >+ this.timestampAligner = alignTimestamps ? new TimestampAligner() : null; > > eglBase = EglBase.create(sharedContext, EglBase.CONFIG_PIXEL_BUFFER); > try { >@@ -264,7 +281,10 @@ public class SurfaceTextureHelper { > > final float[] transformMatrix = new float[16]; > surfaceTexture.getTransformMatrix(transformMatrix); >- final long timestampNs = surfaceTexture.getTimestamp(); >+ long timestampNs = surfaceTexture.getTimestamp(); >+ if (timestampAligner != null) { >+ timestampNs = timestampAligner.translateTimestamp(timestampNs); >+ } > if (textureWidth == 0 || textureHeight == 0) { > throw new RuntimeException("Texture size has not been set."); > } >@@ -289,5 +309,8 @@ public class SurfaceTextureHelper { > surfaceTexture.release(); > eglBase.release(); > handler.getLooper().quit(); >+ if (timestampAligner != null) { >+ timestampAligner.dispose(); >+ } > } > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/TimestampAligner.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/TimestampAligner.java >new file mode 100644 >index 0000000000000000000000000000000000000000..7c4bed4d8fba4484d5d1ea5e84a0a1a28f29954f >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/TimestampAligner.java >@@ -0,0 +1,59 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. 
>+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+package org.webrtc; >+ >+/** >+ * The TimestampAligner class helps translate camera timestamps into the same timescale as is >+ * used by rtc::TimeNanos(). Some cameras have built-in timestamping which is more accurate than >+ * reading the system clock, but using a different epoch and unknown clock drift. Frame timestamps >+ * in webrtc should use rtc::TimeNanos (system monotonic time), and this class provides a filter >+ * which lets us use the rtc::TimeNanos timescale, and at the same time take advantage of the higher >+ * accuracy of the camera clock. This class is a wrapper on top of rtc::TimestampAligner. >+ */ >+public class TimestampAligner { >+ /** >+ * Wrapper around rtc::TimeNanos(). This is normally the same as System.nanoTime(), but call this >+ * function to be safe. >+ */ >+ public static long getRtcTimeNanos() { >+ return nativeRtcTimeNanos(); >+ } >+ >+ private volatile long nativeTimestampAligner = nativeCreateTimestampAligner(); >+ >+ /** >+ * Translates camera timestamps to the same timescale as is used by rtc::TimeNanos(). >+ * |cameraTimeNs| is assumed to be accurate, but with an unknown epoch and clock drift. Returns >+ * the translated timestamp. >+ */ >+ public long translateTimestamp(long cameraTimeNs) { >+ checkNativeAlignerExists(); >+ return nativeTranslateTimestamp(nativeTimestampAligner, cameraTimeNs); >+ } >+ >+ /** Dispose native timestamp aligner. 
*/ >+ public void dispose() { >+ checkNativeAlignerExists(); >+ nativeReleaseTimestampAligner(nativeTimestampAligner); >+ nativeTimestampAligner = 0; >+ } >+ >+ private void checkNativeAlignerExists() { >+ if (nativeTimestampAligner == 0) { >+ throw new IllegalStateException("TimestampAligner has been disposed."); >+ } >+ } >+ >+ private static native long nativeRtcTimeNanos(); >+ private static native long nativeCreateTimestampAligner(); >+ private static native void nativeReleaseTimestampAligner(long timestampAligner); >+ private static native long nativeTranslateTimestamp(long timestampAligner, long cameraTimeNs); >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/VideoEncoder.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/VideoEncoder.java >index 8dc7582926b243ab9d60d228e66b5c036cfafe2b..afc5cd0239d1a5a2e2082ca91dd7c8a34f58e0c2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/VideoEncoder.java >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/VideoEncoder.java >@@ -210,14 +210,6 @@ public interface VideoEncoder { > */ > @CalledByNative VideoCodecStatus encode(VideoFrame frame, EncodeInfo info); > >- /** >- * Informs the encoder of the packet loss and the round-trip time of the network. >- * >- * @param packetLoss How many packets are lost on average per 255 packets. >- * @param roundTripTimeMs Round-trip time of the network in milliseconds. >- */ >- @CalledByNative VideoCodecStatus setChannelParameters(short packetLoss, long roundTripTimeMs); >- > /** Sets the bitrate allocation and the target framerate for the encoder. 
*/ > @CalledByNative VideoCodecStatus setRateAllocation(BitrateAllocation allocation, int framerate); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/VideoSource.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/VideoSource.java >index 7f9f3e5323ff530f83f171b57b97fe41fa33cac7..a8ef6620acfcb4a43c9bcdef6afd284ed9d90875 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/VideoSource.java >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/api/org/webrtc/VideoSource.java >@@ -30,7 +30,20 @@ public class VideoSource extends MediaSource { > * maintain the input orientation, so it doesn't matter if e.g. 1280x720 or 720x1280 is requested. > */ > public void adaptOutputFormat(int width, int height, int fps) { >- nativeAdaptOutputFormat(getNativeVideoTrackSource(), width, height, fps); >+ final int maxSide = Math.max(width, height); >+ final int minSide = Math.min(width, height); >+ adaptOutputFormat(maxSide, minSide, minSide, maxSide, fps); >+ } >+ >+ /** >+ * Same as above, but allows setting two different target resolutions depending on incoming >+ * frame orientation. This gives more fine-grained control and can e.g. be used to force landscape >+ * video to be cropped to portrait video. >+ */ >+ public void adaptOutputFormat( >+ int landscapeWidth, int landscapeHeight, int portraitWidth, int portraitHeight, int fps) { >+ nativeAdaptOutputFormat(getNativeVideoTrackSource(), landscapeWidth, landscapeHeight, >+ portraitWidth, portraitHeight, fps); > } > > public CapturerObserver getCapturerObserver() { >@@ -44,5 +57,6 @@ public class VideoSource extends MediaSource { > > // Returns source->internal() from webrtc::VideoTrackSourceProxy. 
> private static native long nativeGetInternalSource(long source); >- private static native void nativeAdaptOutputFormat(long source, int width, int height, int fps); >+ private static native void nativeAdaptOutputFormat(long source, int landscapeWidth, >+ int landscapeHeight, int portraitWidth, int portraitHeight, int fps); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/BuiltinAudioCodecsFactoryFactoryTest.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/BuiltinAudioCodecsFactoryFactoryTest.java >new file mode 100644 >index 0000000000000000000000000000000000000000..36ee0e932e1cc202afddd1d9ba7fe34ee5775179 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/BuiltinAudioCodecsFactoryFactoryTest.java >@@ -0,0 +1,58 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+package org.webrtc; >+ >+import static com.google.common.truth.Truth.assertThat; >+ >+import android.support.test.filters.SmallTest; >+ >+import org.junit.Before; >+import org.junit.Test; >+import org.junit.runner.RunWith; >+import org.junit.runners.JUnit4; >+ >+@RunWith(JUnit4.class) >+public final class BuiltinAudioCodecsFactoryFactoryTest { >+ @Before >+ public void setUp() { >+ System.loadLibrary(TestConstants.NATIVE_LIBRARY); >+ } >+ >+ @Test >+ @SmallTest >+ public void testAudioEncoderFactoryFactoryTest() throws Exception { >+ BuiltinAudioEncoderFactoryFactory factory = new BuiltinAudioEncoderFactoryFactory(); >+ long aef = 0; >+ try { >+ aef = factory.createNativeAudioEncoderFactory(); >+ assertThat(aef).isNotEqualTo(0); >+ } finally { >+ if (aef != 0) { >+ JniCommon.nativeReleaseRef(aef); >+ } >+ } >+ } >+ >+ @Test >+ @SmallTest >+ public void testAudioDecoderFactoryFactoryTest() throws Exception { >+ BuiltinAudioDecoderFactoryFactory factory = new BuiltinAudioDecoderFactoryFactory(); >+ long adf = 0; >+ try { >+ adf = factory.createNativeAudioDecoderFactory(); >+ assertThat(adf).isNotEqualTo(0); >+ } finally { >+ if (adf != 0) { >+ JniCommon.nativeReleaseRef(adf); >+ } >+ } >+ } >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/Camera1CapturerUsingByteBufferTest.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/Camera1CapturerUsingByteBufferTest.java >index ef7669102f9bb648b6db15421104c9b066e9b35d..17b0977d595fd27aff58661ef9aa7d1cc7d8759f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/Camera1CapturerUsingByteBufferTest.java >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/Camera1CapturerUsingByteBufferTest.java >@@ -169,6 +169,14 @@ public class Camera1CapturerUsingByteBufferTest { > fixtures.scaleCameraOutput(); > } > >+ // This test 
verifies that frames forwarded to a renderer are cropped to a new orientation if >+ // adaptOutputFormat is called in such a way. This tests both the Java and C++ parts of the stack. >+ @Test >+ @MediumTest >+ public void testCropCameraOutput() throws InterruptedException { >+ fixtures.cropCameraOutput(); >+ } >+ > // This test that an error is reported if the camera is already opened > // when CameraVideoCapturer is started. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/Camera1CapturerUsingTextureTest.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/Camera1CapturerUsingTextureTest.java >index 28fa825ad74e61cc9406b32d24b9b6dbe60c8855..4dc003726a7196c626a47560f94918c4151155fb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/Camera1CapturerUsingTextureTest.java >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/Camera1CapturerUsingTextureTest.java >@@ -172,6 +172,14 @@ public class Camera1CapturerUsingTextureTest { > fixtures.scaleCameraOutput(); > } > >+ // This tests that frames forwarded to a renderer are cropped to a new orientation if >+ // adaptOutputFormat is called in such a way. This tests both the Java and C++ parts of the stack. >+ @Test >+ @MediumTest >+ public void testCropCameraOutput() throws InterruptedException { >+ fixtures.cropCameraOutput(); >+ } >+ > // This test that an error is reported if the camera is already opened > // when CameraVideoCapturer is started. 
> @Test >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/Camera2CapturerTest.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/Camera2CapturerTest.java >index 77e8d77f0aef1d6568b1c7ada534dcb8c6411bc1..fba943176a8586c28ecf4918713922b4bfa4f44b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/Camera2CapturerTest.java >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/Camera2CapturerTest.java >@@ -302,6 +302,14 @@ public class Camera2CapturerTest { > fixtures.scaleCameraOutput(); > } > >+ // This tests that frames forwarded to a renderer are cropped to a new orientation if >+ // adaptOutputFormat is called in such a way. This tests both the Java and C++ parts of the stack. >+ @Test >+ @MediumTest >+ public void testCropCameraOutput() throws InterruptedException { >+ fixtures.cropCameraOutput(); >+ } >+ > // This test that an error is reported if the camera is already opened > // when CameraVideoCapturer is started. 
> @Test >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/CameraVideoCapturerTestFixtures.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/CameraVideoCapturerTestFixtures.java >index dd0d780df08c7196260261ed182d9b83cc9e5b6d..b47fee08d15796f2349896123dca0ac43c3beeae 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/CameraVideoCapturerTestFixtures.java >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/CameraVideoCapturerTestFixtures.java >@@ -688,6 +688,49 @@ class CameraVideoCapturerTestFixtures { > assertTrue(gotExpectedResolution); > } > >+ public void cropCameraOutput() throws InterruptedException { >+ final CapturerInstance capturerInstance = createCapturer(false /* initialize */); >+ final VideoTrackWithRenderer videoTrackWithRenderer = >+ createVideoTrackWithRenderer(capturerInstance.capturer); >+ assertTrue(videoTrackWithRenderer.rendererCallbacks.waitForNextFrameToRender() > 0); >+ >+ final int startWidth = videoTrackWithRenderer.rendererCallbacks.frameWidth(); >+ final int startHeight = videoTrackWithRenderer.rendererCallbacks.frameHeight(); >+ final int frameRate = 30; >+ final int cropWidth; >+ final int cropHeight; >+ if (startWidth > startHeight) { >+ // Landscape input, request portrait output. >+ cropWidth = 360; >+ cropHeight = 640; >+ } else { >+ // Portrait input, request landscape output. >+ cropWidth = 640; >+ cropHeight = 360; >+ } >+ >+ // Request different output orientation than input. 
>+ videoTrackWithRenderer.source.adaptOutputFormat( >+ cropWidth, cropHeight, cropWidth, cropHeight, frameRate); >+ >+ boolean gotExpectedOrientation = false; >+ int numberOfInspectedFrames = 0; >+ >+ do { >+ videoTrackWithRenderer.rendererCallbacks.waitForNextFrameToRender(); >+ ++numberOfInspectedFrames; >+ >+ gotExpectedOrientation = (cropWidth > cropHeight) >+ == (videoTrackWithRenderer.rendererCallbacks.frameWidth() >+ > videoTrackWithRenderer.rendererCallbacks.frameHeight()); >+ } while (!gotExpectedOrientation && numberOfInspectedFrames < 30); >+ >+ disposeCapturer(capturerInstance); >+ disposeVideoTrackWithRenderer(videoTrackWithRenderer); >+ >+ assertTrue(gotExpectedOrientation); >+ } >+ > public void startWhileCameraIsAlreadyOpen() throws InterruptedException { > final String cameraName = testObjectFactory.getNameOfBackFacingDevice(); > // At this point camera is not actually opened. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/PeerConnectionTest.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/PeerConnectionTest.java >index 7d0edc598e6c668c90b4db2259163b619c76d451..2063199e72dea8507326908e65acbf27e25b2abd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/PeerConnectionTest.java >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/PeerConnectionTest.java >@@ -11,6 +11,7 @@ > package org.webrtc; > > import static org.junit.Assert.assertEquals; >+import static org.junit.Assert.assertNotEquals; > import static org.junit.Assert.assertFalse; > import static org.junit.Assert.assertNotNull; > import static org.junit.Assert.assertNull; >@@ -44,6 +45,7 @@ import org.junit.runner.RunWith; > import org.webrtc.Metrics.HistogramInfo; > import org.webrtc.PeerConnection.IceConnectionState; > import org.webrtc.PeerConnection.IceGatheringState; >+import 
org.webrtc.PeerConnection.PeerConnectionState; > import org.webrtc.PeerConnection.SignalingState; > > /** End-to-end tests for PeerConnection.java. */ >@@ -73,6 +75,7 @@ public class PeerConnectionTest { > private int expectedTracksAdded; > private Queue<SignalingState> expectedSignalingChanges = new ArrayDeque<>(); > private Queue<IceConnectionState> expectedIceConnectionChanges = new ArrayDeque<>(); >+ private Queue<PeerConnectionState> expectedConnectionChanges = new ArrayDeque<>(); > private Queue<IceGatheringState> expectedIceGatheringChanges = new ArrayDeque<>(); > private Queue<String> expectedAddStreamLabels = new ArrayDeque<>(); > private Queue<String> expectedRemoveStreamLabels = new ArrayDeque<>(); >@@ -187,11 +190,29 @@ public class PeerConnectionTest { > assertEquals(expectedIceConnectionChanges.remove(), newState); > } > >+ // TODO(bugs.webrtc.org/8491): Remove NoSynchronizedMethodCheck suppression. >+ @SuppressWarnings("NoSynchronizedMethodCheck") >+ public synchronized void expectConnectionChange(PeerConnectionState newState) { >+ expectedConnectionChanges.add(newState); >+ } >+ >+ @Override >+ // TODO(bugs.webrtc.org/8491): Remove NoSynchronizedMethodCheck suppression. >+ @SuppressWarnings("NoSynchronizedMethodCheck") >+ public synchronized void onConnectionChange(PeerConnectionState newState) { >+ if (expectedConnectionChanges.isEmpty()) { >+ System.out.println(name + " got an unexpected DTLS connection change " + newState); >+ return; >+ } >+ >+ assertEquals(expectedConnectionChanges.remove(), newState); >+ } >+ > @Override > // TODO(bugs.webrtc.org/8491): Remove NoSynchronizedMethodCheck suppression. 
> @SuppressWarnings("NoSynchronizedMethodCheck") > public synchronized void onIceConnectionReceivingChange(boolean receiving) { >- System.out.println(name + "Got an ICE connection receiving change " + receiving); >+ System.out.println(name + " got an ICE connection receiving change " + receiving); > } > > // TODO(bugs.webrtc.org/8491): Remove NoSynchronizedMethodCheck suppression. >@@ -663,6 +684,45 @@ public class PeerConnectionTest { > assertNotNull(offeringPC); > } > >+ @Test >+ @SmallTest >+ public void testCreationWithCertificate() throws Exception { >+ PeerConnectionFactory factory = PeerConnectionFactory.builder().createPeerConnectionFactory(); >+ PeerConnection.RTCConfiguration config = new PeerConnection.RTCConfiguration(Arrays.asList()); >+ >+ // Test certificate. >+ RtcCertificatePem originalCert = RtcCertificatePem.generateCertificate(); >+ config.certificate = originalCert; >+ >+ ObserverExpectations offeringExpectations = new ObserverExpectations("PCTest:offerer"); >+ PeerConnection offeringPC = factory.createPeerConnection(config, offeringExpectations); >+ >+ RtcCertificatePem restoredCert = offeringPC.getCertificate(); >+ assertEquals(originalCert.privateKey, restoredCert.privateKey); >+ assertEquals(originalCert.certificate, restoredCert.certificate); >+ } >+ >+ @Test >+ @SmallTest >+ public void testCreationWithCryptoOptions() throws Exception { >+ PeerConnectionFactory factory = PeerConnectionFactory.builder().createPeerConnectionFactory(); >+ PeerConnection.RTCConfiguration config = new PeerConnection.RTCConfiguration(Arrays.asList()); >+ >+ assertNull(config.cryptoOptions); >+ >+ CryptoOptions cryptoOptions = CryptoOptions.builder() >+ .setEnableGcmCryptoSuites(true) >+ .setEnableAes128Sha1_32CryptoCipher(true) >+ .setEnableEncryptedRtpHeaderExtensions(true) >+ .setRequireFrameEncryption(true) >+ .createCryptoOptions(); >+ config.cryptoOptions = cryptoOptions; >+ >+ ObserverExpectations offeringExpectations = new 
ObserverExpectations("PCTest:offerer"); >+ PeerConnection offeringPC = factory.createPeerConnection(config, offeringExpectations); >+ assertNotNull(offeringPC); >+ } >+ > @Test > @MediumTest > public void testCompleteSession() throws Exception { >@@ -759,12 +819,14 @@ public class PeerConnectionTest { > > sdpLatch = new SdpObserverLatch(); > answeringExpectations.expectSignalingChange(SignalingState.STABLE); >+ answeringExpectations.expectConnectionChange(PeerConnectionState.CONNECTING); > answeringPC.setLocalDescription(sdpLatch, answerSdp); > assertTrue(sdpLatch.await()); > assertNull(sdpLatch.getSdp()); > > sdpLatch = new SdpObserverLatch(); > offeringExpectations.expectSignalingChange(SignalingState.HAVE_LOCAL_OFFER); >+ offeringExpectations.expectConnectionChange(PeerConnectionState.CONNECTING); > offeringPC.setLocalDescription(sdpLatch, offerSdp); > assertTrue(sdpLatch.await()); > assertNull(sdpLatch.getSdp()); >@@ -774,6 +836,9 @@ public class PeerConnectionTest { > > offeringExpectations.expectIceConnectionChange(IceConnectionState.CHECKING); > offeringExpectations.expectIceConnectionChange(IceConnectionState.CONNECTED); >+ offeringExpectations.expectConnectionChange(PeerConnectionState.NEW); >+ offeringExpectations.expectConnectionChange(PeerConnectionState.CONNECTING); >+ offeringExpectations.expectConnectionChange(PeerConnectionState.CONNECTED); > // TODO(bemasc): uncomment once delivery of ICECompleted is reliable > // (https://code.google.com/p/webrtc/issues/detail?id=3021). 
> // >@@ -781,6 +846,7 @@ public class PeerConnectionTest { > // IceConnectionState.COMPLETED); > answeringExpectations.expectIceConnectionChange(IceConnectionState.CHECKING); > answeringExpectations.expectIceConnectionChange(IceConnectionState.CONNECTED); >+ answeringExpectations.expectConnectionChange(PeerConnectionState.CONNECTED); > > offeringPC.setRemoteDescription(sdpLatch, answerSdp); > assertTrue(sdpLatch.await()); >@@ -989,12 +1055,14 @@ public class PeerConnectionTest { > > sdpLatch = new SdpObserverLatch(); > answeringExpectations.expectSignalingChange(SignalingState.STABLE); >+ answeringExpectations.expectConnectionChange(PeerConnectionState.CONNECTING); > answeringPC.setLocalDescription(sdpLatch, answerSdp); > assertTrue(sdpLatch.await()); > assertNull(sdpLatch.getSdp()); > > sdpLatch = new SdpObserverLatch(); > offeringExpectations.expectSignalingChange(SignalingState.HAVE_LOCAL_OFFER); >+ offeringExpectations.expectConnectionChange(PeerConnectionState.CONNECTING); > offeringPC.setLocalDescription(sdpLatch, offerSdp); > assertTrue(sdpLatch.await()); > assertNull(sdpLatch.getSdp()); >@@ -1003,10 +1071,14 @@ public class PeerConnectionTest { > > offeringExpectations.expectIceConnectionChange(IceConnectionState.CHECKING); > offeringExpectations.expectIceConnectionChange(IceConnectionState.CONNECTED); >+ offeringExpectations.expectConnectionChange(PeerConnectionState.NEW); >+ offeringExpectations.expectConnectionChange(PeerConnectionState.CONNECTING); >+ offeringExpectations.expectConnectionChange(PeerConnectionState.CONNECTED); > // TODO(bemasc): uncomment once delivery of ICECompleted is reliable > // (https://code.google.com/p/webrtc/issues/detail?id=3021). 
> answeringExpectations.expectIceConnectionChange(IceConnectionState.CHECKING); > answeringExpectations.expectIceConnectionChange(IceConnectionState.CONNECTED); >+ answeringExpectations.expectConnectionChange(PeerConnectionState.CONNECTED); > > offeringPC.setRemoteDescription(sdpLatch, answerSdp); > assertTrue(sdpLatch.await()); >@@ -1139,6 +1211,7 @@ public class PeerConnectionTest { > offeringExpectations.expectSignalingChange(SignalingState.HAVE_LOCAL_OFFER); > offeringExpectations.expectIceCandidates(2); > offeringExpectations.expectIceGatheringChange(IceGatheringState.COMPLETE); >+ offeringExpectations.expectConnectionChange(PeerConnectionState.CONNECTING); > offeringPC.setLocalDescription(sdpLatch, offerSdp); > assertTrue(sdpLatch.await()); > assertNull(sdpLatch.getSdp()); >@@ -1170,6 +1243,7 @@ public class PeerConnectionTest { > answeringExpectations.expectSignalingChange(SignalingState.STABLE); > answeringExpectations.expectIceCandidates(2); > answeringExpectations.expectIceGatheringChange(IceGatheringState.COMPLETE); >+ answeringExpectations.expectConnectionChange(PeerConnectionState.CONNECTING); > answeringPC.setLocalDescription(sdpLatch, answerSdp); > assertTrue(sdpLatch.await()); > assertNull(sdpLatch.getSdp()); >@@ -1181,6 +1255,9 @@ public class PeerConnectionTest { > > offeringExpectations.expectIceConnectionChange(IceConnectionState.CHECKING); > offeringExpectations.expectIceConnectionChange(IceConnectionState.CONNECTED); >+ offeringExpectations.expectConnectionChange(PeerConnectionState.NEW); >+ offeringExpectations.expectConnectionChange(PeerConnectionState.CONNECTING); >+ offeringExpectations.expectConnectionChange(PeerConnectionState.CONNECTED); > // TODO(bemasc): uncomment once delivery of ICECompleted is reliable > // (https://code.google.com/p/webrtc/issues/detail?id=3021). 
> // >@@ -1188,6 +1265,7 @@ public class PeerConnectionTest { > // IceConnectionState.COMPLETED); > answeringExpectations.expectIceConnectionChange(IceConnectionState.CHECKING); > answeringExpectations.expectIceConnectionChange(IceConnectionState.CONNECTED); >+ answeringExpectations.expectConnectionChange(PeerConnectionState.CONNECTED); > > offeringPC.setRemoteDescription(sdpLatch, answerSdp); > assertTrue(sdpLatch.await()); >@@ -1573,6 +1651,7 @@ public class PeerConnectionTest { > assertTrue(expectations.waitForAllExpectationsToBeSatisfied(TIMEOUT_SECONDS)); > > expectations.expectIceConnectionChange(IceConnectionState.CLOSED); >+ expectations.expectConnectionChange(PeerConnectionState.CLOSED); > expectations.expectSignalingChange(SignalingState.CLOSED); > pc.close(); > assertTrue(expectations.waitForAllExpectationsToBeSatisfied(TIMEOUT_SECONDS)); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/RtcCertificatePemTest.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/RtcCertificatePemTest.java >new file mode 100644 >index 0000000000000000000000000000000000000000..00f295c942f57ff4f8bd3badc67c1287d9130db5 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/RtcCertificatePemTest.java >@@ -0,0 +1,73 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+package org.webrtc; >+ >+import static com.google.common.truth.Truth.assertThat; >+ >+import android.support.test.filters.SmallTest; >+import org.chromium.base.test.BaseJUnit4ClassRunner; >+import org.junit.Before; >+import org.junit.Test; >+import org.junit.runner.RunWith; >+import org.webrtc.PeerConnection; >+import org.webrtc.RtcCertificatePem; >+ >+/** Tests for RtcCertificatePem.java. */ >+@RunWith(BaseJUnit4ClassRunner.class) >+public class RtcCertificatePemTest { >+ @Before >+ public void setUp() { >+ System.loadLibrary(TestConstants.NATIVE_LIBRARY); >+ } >+ >+ @Test >+ @SmallTest >+ public void testConstructor() { >+ RtcCertificatePem original = RtcCertificatePem.generateCertificate(); >+ RtcCertificatePem recreated = new RtcCertificatePem(original.privateKey, original.certificate); >+ assertThat(original.privateKey).isEqualTo(recreated.privateKey); >+ assertThat(original.certificate).isEqualTo(recreated.certificate); >+ } >+ >+ @Test >+ @SmallTest >+ public void testGenerateCertificateDefaults() { >+ RtcCertificatePem rtcCertificate = RtcCertificatePem.generateCertificate(); >+ assertThat(rtcCertificate.privateKey).isNotEmpty(); >+ assertThat(rtcCertificate.certificate).isNotEmpty(); >+ } >+ >+ @Test >+ @SmallTest >+ public void testGenerateCertificateCustomKeyTypeDefaultExpires() { >+ RtcCertificatePem rtcCertificate = >+ RtcCertificatePem.generateCertificate(PeerConnection.KeyType.RSA); >+ assertThat(rtcCertificate.privateKey).isNotEmpty(); >+ assertThat(rtcCertificate.certificate).isNotEmpty(); >+ } >+ >+ @Test >+ @SmallTest >+ public void testGenerateCertificateCustomExpiresDefaultKeyType() { >+ RtcCertificatePem rtcCertificate = RtcCertificatePem.generateCertificate(60 * 60 * 24); >+ assertThat(rtcCertificate.privateKey).isNotEmpty(); >+ assertThat(rtcCertificate.certificate).isNotEmpty(); >+ } >+ >+ @Test >+ @SmallTest >+ public void testGenerateCertificateCustomKeyTypeAndExpires() { >+ RtcCertificatePem rtcCertificate = >+ 
RtcCertificatePem.generateCertificate(PeerConnection.KeyType.RSA, 60 * 60 * 24); >+ assertThat(rtcCertificate.privateKey).isNotEmpty(); >+ assertThat(rtcCertificate.certificate).isNotEmpty(); >+ } >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/TimestampAlignerTest.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/TimestampAlignerTest.java >new file mode 100644 >index 0000000000000000000000000000000000000000..1d944b57f9d654d89db007e48407a6180c67b4a6 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/instrumentationtests/src/org/webrtc/TimestampAlignerTest.java >@@ -0,0 +1,46 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+package org.webrtc; >+ >+import android.support.test.filters.SmallTest; >+import org.chromium.base.test.params.BaseJUnit4RunnerDelegate; >+import org.chromium.base.test.params.ParameterAnnotations.UseRunnerDelegate; >+import org.junit.BeforeClass; >+import org.junit.Test; >+ >+@UseRunnerDelegate(BaseJUnit4RunnerDelegate.class) >+public class TimestampAlignerTest { >+ @BeforeClass >+ public static void setUp() { >+ System.loadLibrary(TestConstants.NATIVE_LIBRARY); >+ } >+ >+ @Test >+ @SmallTest >+ public void testGetRtcTimeNanos() { >+ TimestampAligner.getRtcTimeNanos(); >+ } >+ >+ @Test >+ @SmallTest >+ public void testDispose() { >+ final TimestampAligner timestampAligner = new TimestampAligner(); >+ timestampAligner.dispose(); >+ } >+ >+ @Test >+ @SmallTest >+ public void testTranslateTimestamp() { >+ final TimestampAligner timestampAligner = new TimestampAligner(); >+ timestampAligner.translateTimestamp(/* cameraTimeNs= */ 123); >+ timestampAligner.dispose(); >+ } >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/native_api/video/videosource.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/native_api/video/videosource.cc >index 4c302da0ed9eb0b38c311f013f6443e4ff692236..aae1de7ff0342a0eb5ccc715dc7faea7c4e4d9a1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/native_api/video/videosource.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/native_api/video/videosource.cc >@@ -25,12 +25,14 @@ class JavaVideoTrackSourceImpl : public JavaVideoTrackSourceInterface { > public: > JavaVideoTrackSourceImpl(JNIEnv* env, > rtc::Thread* signaling_thread, >- bool is_screencast) >+ bool is_screencast, >+ bool align_timestamps) > : android_video_track_source_( > new rtc::RefCountedObject<jni::AndroidVideoTrackSource>( > signaling_thread, > env, >- is_screencast)), >+ is_screencast, >+ align_timestamps)), > native_capturer_observer_(jni::CreateJavaNativeCapturerObserver( > env, > 
android_video_track_source_)) {} >@@ -96,9 +98,10 @@ class JavaVideoTrackSourceImpl : public JavaVideoTrackSourceInterface { > rtc::scoped_refptr<JavaVideoTrackSourceInterface> CreateJavaVideoSource( > JNIEnv* jni, > rtc::Thread* signaling_thread, >- bool is_screencast) { >+ bool is_screencast, >+ bool align_timestamps) { > return new rtc::RefCountedObject<JavaVideoTrackSourceImpl>( >- jni, signaling_thread, is_screencast); >+ jni, signaling_thread, is_screencast, align_timestamps); > } > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/native_api/video/videosource.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/native_api/video/videosource.h >index 0c4f33342ebad6d4b5979daf3caee97603b06f87..75ebdbce30263e3085540414f1283f610361a01b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/native_api/video/videosource.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/native_api/video/videosource.h >@@ -14,6 +14,7 @@ > #include <jni.h> > > #include "api/mediastreaminterface.h" >+#include "rtc_base/thread.h" > #include "sdk/android/native_api/jni/scoped_java_ref.h" > > namespace webrtc { >@@ -32,7 +33,8 @@ class JavaVideoTrackSourceInterface : public VideoTrackSourceInterface { > rtc::scoped_refptr<JavaVideoTrackSourceInterface> CreateJavaVideoSource( > JNIEnv* env, > rtc::Thread* signaling_thread, >- bool is_screencast); >+ bool is_screencast, >+ bool align_timestamps); > > } // namespace webrtc > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/native_unittests/audio_device/audio_device_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/native_unittests/audio_device/audio_device_unittest.cc >index 5e46052b90048156bf1c3d698a2513684df8d369..98d585387a04d8b66467fe2eacee460fd97b2ede 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/native_unittests/audio_device/audio_device_unittest.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/native_unittests/audio_device/audio_device_unittest.cc >@@ -15,6 +15,7 @@ > #include "modules/audio_device/include/mock_audio_transport.h" > #include "rtc_base/arraysize.h" > #include "rtc_base/criticalsection.h" >+#include "rtc_base/event.h" > #include "rtc_base/format_macros.h" > #include "rtc_base/scoped_ref_ptr.h" > #include "rtc_base/timeutils.h" >@@ -25,7 +26,6 @@ > #include "sdk/android/src/jni/audio_device/audio_device_module.h" > #include "sdk/android/src/jni/audio_device/opensles_common.h" > #include "sdk/android/src/jni/jni_helpers.h" >-#include "system_wrappers/include/event_wrapper.h" > #include "test/gmock.h" > #include "test/gtest.h" > #include "test/testsupport/fileutils.h" >@@ -364,7 +364,7 @@ class MockAudioTransportAndroid : public test::MockAudioTransport { > > // Set default actions of the mock object. We are delegating to fake > // implementations (of AudioStreamInterface) here. >- void HandleCallbacks(EventWrapper* test_is_done, >+ void HandleCallbacks(rtc::Event* test_is_done, > AudioStreamInterface* audio_stream, > int num_callbacks) { > test_is_done_ = test_is_done; >@@ -447,7 +447,7 @@ class MockAudioTransportAndroid : public test::MockAudioTransport { > bool rec_mode() const { return type_ & kRecording; } > > private: >- EventWrapper* test_is_done_; >+ rtc::Event* test_is_done_; > size_t num_callbacks_; > int type_; > size_t play_count_; >@@ -459,7 +459,7 @@ class MockAudioTransportAndroid : public test::MockAudioTransport { > // AudioDeviceTest test fixture. > class AudioDeviceTest : public ::testing::Test { > protected: >- AudioDeviceTest() : test_is_done_(EventWrapper::Create()) { >+ AudioDeviceTest() { > // One-time initialization of JVM and application context. Ensures that we > // can do calls between C++ and Java. Initializes both Java and OpenSL ES > // implementations. 
>@@ -683,7 +683,7 @@ class AudioDeviceTest : public ::testing::Test { > > JNIEnv* jni_; > std::unique_ptr<JavaParamRef<jobject>> context_; >- std::unique_ptr<EventWrapper> test_is_done_; >+ rtc::Event test_is_done_; > rtc::scoped_refptr<AudioDeviceModule> audio_device_; > ScopedJavaLocalRef<jobject> audio_manager_; > AudioParameters output_parameters_; >@@ -880,14 +880,14 @@ TEST_F(AudioDeviceTest, StopRecordingRequiresInitToRestart) { > // audio samples to play out using the NeedMorePlayData callback. > TEST_F(AudioDeviceTest, StartPlayoutVerifyCallbacks) { > MockAudioTransportAndroid mock(kPlayout); >- mock.HandleCallbacks(test_is_done_.get(), nullptr, kNumCallbacks); >+ mock.HandleCallbacks(&test_is_done_, nullptr, kNumCallbacks); > EXPECT_CALL(mock, NeedMorePlayData(playout_frames_per_10ms_buffer(), > kBytesPerSample, playout_channels(), > playout_sample_rate(), NotNull(), _, _, _)) > .Times(AtLeast(kNumCallbacks)); > EXPECT_EQ(0, audio_device()->RegisterAudioCallback(&mock)); > StartPlayout(); >- test_is_done_->Wait(kTestTimeOutInMilliseconds); >+ test_is_done_.Wait(kTestTimeOutInMilliseconds); > StopPlayout(); > } > >@@ -895,7 +895,7 @@ TEST_F(AudioDeviceTest, StartPlayoutVerifyCallbacks) { > // audio samples via the RecordedDataIsAvailable callback. 
> TEST_F(AudioDeviceTest, StartRecordingVerifyCallbacks) { > MockAudioTransportAndroid mock(kRecording); >- mock.HandleCallbacks(test_is_done_.get(), nullptr, kNumCallbacks); >+ mock.HandleCallbacks(&test_is_done_, nullptr, kNumCallbacks); > EXPECT_CALL( > mock, RecordedDataIsAvailable(NotNull(), record_frames_per_10ms_buffer(), > kBytesPerSample, record_channels(), >@@ -904,7 +904,7 @@ TEST_F(AudioDeviceTest, StartRecordingVerifyCallbacks) { > > EXPECT_EQ(0, audio_device()->RegisterAudioCallback(&mock)); > StartRecording(); >- test_is_done_->Wait(kTestTimeOutInMilliseconds); >+ test_is_done_.Wait(kTestTimeOutInMilliseconds); > StopRecording(); > } > >@@ -912,7 +912,7 @@ TEST_F(AudioDeviceTest, StartRecordingVerifyCallbacks) { > // active in both directions. > TEST_F(AudioDeviceTest, StartPlayoutAndRecordingVerifyCallbacks) { > MockAudioTransportAndroid mock(kPlayout | kRecording); >- mock.HandleCallbacks(test_is_done_.get(), nullptr, kNumCallbacks); >+ mock.HandleCallbacks(&test_is_done_, nullptr, kNumCallbacks); > EXPECT_CALL(mock, NeedMorePlayData(playout_frames_per_10ms_buffer(), > kBytesPerSample, playout_channels(), > playout_sample_rate(), NotNull(), _, _, _)) >@@ -925,7 +925,7 @@ TEST_F(AudioDeviceTest, StartPlayoutAndRecordingVerifyCallbacks) { > EXPECT_EQ(0, audio_device()->RegisterAudioCallback(&mock)); > StartPlayout(); > StartRecording(); >- test_is_done_->Wait(kTestTimeOutInMilliseconds); >+ test_is_done_.Wait(kTestTimeOutInMilliseconds); > StopRecording(); > StopPlayout(); > } >@@ -941,12 +941,11 @@ TEST_F(AudioDeviceTest, RunPlayoutWithFileAsSource) { > std::string file_name = GetFileName(playout_sample_rate()); > std::unique_ptr<FileAudioStream> file_audio_stream( > new FileAudioStream(num_callbacks, file_name, playout_sample_rate())); >- mock.HandleCallbacks(test_is_done_.get(), file_audio_stream.get(), >- num_callbacks); >+ mock.HandleCallbacks(&test_is_done_, file_audio_stream.get(), num_callbacks); > // SetMaxPlayoutVolume(); > EXPECT_EQ(0, 
audio_device()->RegisterAudioCallback(&mock)); > StartPlayout(); >- test_is_done_->Wait(kTestTimeOutInMilliseconds); >+ test_is_done_.Wait(kTestTimeOutInMilliseconds); > StopPlayout(); > } > >@@ -1084,13 +1083,13 @@ TEST_F(AudioDeviceTest, DISABLED_RunPlayoutAndRecordingInFullDuplex) { > NiceMock<MockAudioTransportAndroid> mock(kPlayout | kRecording); > std::unique_ptr<FifoAudioStream> fifo_audio_stream( > new FifoAudioStream(playout_frames_per_10ms_buffer())); >- mock.HandleCallbacks(test_is_done_.get(), fifo_audio_stream.get(), >+ mock.HandleCallbacks(&test_is_done_, fifo_audio_stream.get(), > kFullDuplexTimeInSec * kNumCallbacksPerSecond); > SetMaxPlayoutVolume(); > EXPECT_EQ(0, audio_device()->RegisterAudioCallback(&mock)); > StartRecording(); > StartPlayout(); >- test_is_done_->Wait( >+ test_is_done_.Wait( > std::max(kTestTimeOutInMilliseconds, 1000 * kFullDuplexTimeInSec)); > StopPlayout(); > StopRecording(); >@@ -1117,14 +1116,14 @@ TEST_F(AudioDeviceTest, DISABLED_MeasureLoopbackLatency) { > NiceMock<MockAudioTransportAndroid> mock(kPlayout | kRecording); > std::unique_ptr<LatencyMeasuringAudioStream> latency_audio_stream( > new LatencyMeasuringAudioStream(playout_frames_per_10ms_buffer())); >- mock.HandleCallbacks(test_is_done_.get(), latency_audio_stream.get(), >+ mock.HandleCallbacks(&test_is_done_, latency_audio_stream.get(), > kMeasureLatencyTimeInSec * kNumCallbacksPerSecond); > EXPECT_EQ(0, audio_device()->RegisterAudioCallback(&mock)); > SetMaxPlayoutVolume(); > DisableBuiltInAECIfAvailable(); > StartRecording(); > StartPlayout(); >- test_is_done_->Wait( >+ test_is_done_.Wait( > std::max(kTestTimeOutInMilliseconds, 1000 * kMeasureLatencyTimeInSec)); > StopPlayout(); > StopRecording(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/native_unittests/org/webrtc/JavaVideoSourceTestHelper.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/native_unittests/org/webrtc/JavaVideoSourceTestHelper.java >index 
bde34e95cf6333b0d92501e66a220e98035955e4..309c78654ef366d655c581d5b93b36de50049ac8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/native_unittests/org/webrtc/JavaVideoSourceTestHelper.java >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/native_unittests/org/webrtc/JavaVideoSourceTestHelper.java >@@ -24,8 +24,9 @@ public class JavaVideoSourceTestHelper { > } > > @CalledByNative >- public static void deliverFrame(int width, int height, int rotation, CapturerObserver observer) { >+ public static void deliverFrame( >+ int width, int height, int rotation, long timestampNs, CapturerObserver observer) { > observer.onFrameCaptured( >- new VideoFrame(JavaI420Buffer.allocate(width, height), rotation, 0 /* timestampNs= */)); >+ new VideoFrame(JavaI420Buffer.allocate(width, height), rotation, timestampNs)); > } > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/native_unittests/video/videosource_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/native_unittests/video/videosource_unittest.cc >index b58ed1a45a54ed07b9d87fc243082bd9de070c7b..7ecf9f8eb4b8a0f789653aef6ed679c33548b387 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/native_unittests/video/videosource_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/native_unittests/video/videosource_unittest.cc >@@ -40,9 +40,9 @@ TEST(JavaVideoSourceTest, CreateJavaVideoSource) { > rtc::ThreadManager::Instance()->WrapCurrentThread(); > > rtc::scoped_refptr<JavaVideoTrackSourceInterface> video_track_source = >- CreateJavaVideoSource(env, >- rtc::ThreadManager::Instance()->CurrentThread(), >- false /* is_screencast */); >+ CreateJavaVideoSource( >+ env, rtc::ThreadManager::Instance()->CurrentThread(), >+ false /* is_screencast */, true /* align_timestamps */); > > ASSERT_NE(nullptr, video_track_source); > EXPECT_NE(nullptr, >@@ -57,9 +57,9 @@ TEST(JavaVideoSourceTest, OnFrameCapturedFrameIsDeliveredToSink) { > 
rtc::ThreadManager::Instance()->WrapCurrentThread(); > > rtc::scoped_refptr<JavaVideoTrackSourceInterface> video_track_source = >- CreateJavaVideoSource(env, >- rtc::ThreadManager::Instance()->CurrentThread(), >- false /* is_screencast */); >+ CreateJavaVideoSource( >+ env, rtc::ThreadManager::Instance()->CurrentThread(), >+ false /* is_screencast */, true /* align_timestamps */); > video_track_source->AddOrUpdateSink(&test_video_sink, rtc::VideoSinkWants()); > > jni::Java_JavaVideoSourceTestHelper_startCapture( >@@ -68,8 +68,9 @@ TEST(JavaVideoSourceTest, OnFrameCapturedFrameIsDeliveredToSink) { > const int width = 20; > const int height = 32; > const int rotation = 180; >+ const int64_t timestamp = 987654321; > jni::Java_JavaVideoSourceTestHelper_deliverFrame( >- env, width, height, rotation, >+ env, width, height, rotation, timestamp, > video_track_source->GetJavaVideoCapturerObserver(env)); > > std::vector<VideoFrame> frames = test_video_sink.GetFrames(); >@@ -80,15 +81,49 @@ TEST(JavaVideoSourceTest, OnFrameCapturedFrameIsDeliveredToSink) { > EXPECT_EQ(rotation, frame.rotation()); > } > >+TEST(JavaVideoSourceTest, >+ OnFrameCapturedFrameIsDeliveredToSinkWithPreservedTimestamp) { >+ TestVideoSink test_video_sink; >+ >+ JNIEnv* env = AttachCurrentThreadIfNeeded(); >+ // Wrap test thread so it can be used as the signaling thread. 
>+ rtc::ThreadManager::Instance()->WrapCurrentThread();
>+
>+ rtc::scoped_refptr<JavaVideoTrackSourceInterface> video_track_source =
>+ CreateJavaVideoSource(
>+ env, rtc::ThreadManager::Instance()->CurrentThread(),
>+ false /* is_screencast */, false /* align_timestamps */);
>+ video_track_source->AddOrUpdateSink(&test_video_sink, rtc::VideoSinkWants());
>+
>+ jni::Java_JavaVideoSourceTestHelper_startCapture(
>+ env, video_track_source->GetJavaVideoCapturerObserver(env),
>+ true /* success */);
>+ const int width = 20;
>+ const int height = 32;
>+ const int rotation = 180;
>+ const int64_t timestamp = 987654321;
>+ jni::Java_JavaVideoSourceTestHelper_deliverFrame(
>+ env, width, height, rotation, timestamp,
>+ video_track_source->GetJavaVideoCapturerObserver(env));
>+
>+ std::vector<VideoFrame> frames = test_video_sink.GetFrames();
>+ ASSERT_EQ(1u, frames.size());
>+ webrtc::VideoFrame frame = frames[0];
>+ EXPECT_EQ(width, frame.width());
>+ EXPECT_EQ(height, frame.height());
>+ EXPECT_EQ(rotation, frame.rotation());
>+ EXPECT_EQ(timestamp / 1000, frame.timestamp_us());
>+}
>+
> TEST(JavaVideoSourceTest, CapturerStartedSuccessStateBecomesLive) {
> JNIEnv* env = AttachCurrentThreadIfNeeded();
> // Wrap test thread so it can be used as the signaling thread. 
> rtc::ThreadManager::Instance()->WrapCurrentThread(); > > rtc::scoped_refptr<JavaVideoTrackSourceInterface> video_track_source = >- CreateJavaVideoSource(env, >- rtc::ThreadManager::Instance()->CurrentThread(), >- false /* is_screencast */); >+ CreateJavaVideoSource( >+ env, rtc::ThreadManager::Instance()->CurrentThread(), >+ false /* is_screencast */, true /* align_timestamps */); > > jni::Java_JavaVideoSourceTestHelper_startCapture( > env, video_track_source->GetJavaVideoCapturerObserver(env), >@@ -104,9 +139,9 @@ TEST(JavaVideoSourceTest, CapturerStartedFailureStateBecomesEnded) { > rtc::ThreadManager::Instance()->WrapCurrentThread(); > > rtc::scoped_refptr<JavaVideoTrackSourceInterface> video_track_source = >- CreateJavaVideoSource(env, >- rtc::ThreadManager::Instance()->CurrentThread(), >- false /* is_screencast */); >+ CreateJavaVideoSource( >+ env, rtc::ThreadManager::Instance()->CurrentThread(), >+ false /* is_screencast */, true /* align_timestamps */); > > jni::Java_JavaVideoSourceTestHelper_startCapture( > env, video_track_source->GetJavaVideoCapturerObserver(env), >@@ -122,9 +157,9 @@ TEST(JavaVideoSourceTest, CapturerStoppedStateBecomesEnded) { > rtc::ThreadManager::Instance()->WrapCurrentThread(); > > rtc::scoped_refptr<JavaVideoTrackSourceInterface> video_track_source = >- CreateJavaVideoSource(env, >- rtc::ThreadManager::Instance()->CurrentThread(), >- false /* is_screencast */); >+ CreateJavaVideoSource( >+ env, rtc::ThreadManager::Instance()->CurrentThread(), >+ false /* is_screencast */, true /* align_timestamps */); > > jni::Java_JavaVideoSourceTestHelper_startCapture( > env, video_track_source->GetJavaVideoCapturerObserver(env), >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/java/org/webrtc/HardwareVideoEncoder.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/java/org/webrtc/HardwareVideoEncoder.java >index ebf9de374f14b01d93a181cf7be2765f6d4387d9..c36a68cd21b7f1b530fe148381f88365f4f290c3 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/java/org/webrtc/HardwareVideoEncoder.java >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/java/org/webrtc/HardwareVideoEncoder.java >@@ -403,12 +403,6 @@ class HardwareVideoEncoder implements VideoEncoder { > return VideoCodecStatus.OK; > } > >- @Override >- public VideoCodecStatus setChannelParameters(short packetLoss, long roundTripTimeMs) { >- encodeThreadChecker.checkIsOnValidThread(); >- return VideoCodecStatus.OK; // No op. >- } >- > @Override > public VideoCodecStatus setRateAllocation(BitrateAllocation bitrateAllocation, int framerate) { > encodeThreadChecker.checkIsOnValidThread(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/java/org/webrtc/NativeCapturerObserver.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/java/org/webrtc/NativeCapturerObserver.java >index 24869818a66a38e9c03e122a0a3a9042380c3c8e..f673b24885b2b62eedc20ab20eb65fe18a138796 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/java/org/webrtc/NativeCapturerObserver.java >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/java/org/webrtc/NativeCapturerObserver.java >@@ -11,6 +11,7 @@ > package org.webrtc; > > import javax.annotation.Nullable; >+import org.webrtc.VideoFrame; > > /** > * Implements VideoCapturer.CapturerObserver and feeds frames to >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/java/org/webrtc/VP8Decoder.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/java/org/webrtc/VP8Decoder.java >deleted file mode 100644 >index ab29c349c7a10218717b9fc694fe04056ab13ec9..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/java/org/webrtc/VP8Decoder.java >+++ /dev/null >@@ -1,20 +0,0 @@ >-/* >- * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. 
>- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-package org.webrtc; >- >-class VP8Decoder extends WrappedNativeVideoDecoder { >- @Override >- public long createNativeVideoDecoder() { >- return nativeCreateDecoder(); >- } >- >- static native long nativeCreateDecoder(); >-} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/java/org/webrtc/VP8Encoder.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/java/org/webrtc/VP8Encoder.java >deleted file mode 100644 >index c2f8f504177deeef9c3ea7c775852da28e43b4e7..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/java/org/webrtc/VP8Encoder.java >+++ /dev/null >@@ -1,25 +0,0 @@ >-/* >- * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-package org.webrtc; >- >-class VP8Encoder extends WrappedNativeVideoEncoder { >- @Override >- public long createNativeVideoEncoder() { >- return nativeCreateEncoder(); >- } >- >- static native long nativeCreateEncoder(); >- >- @Override >- public boolean isHardwareEncoder() { >- return false; >- } >-} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/java/org/webrtc/VP9Decoder.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/java/org/webrtc/VP9Decoder.java >deleted file mode 100644 >index b6fee2d362b2ae583758cbeb9dcd1ad7da9ee1b6..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/java/org/webrtc/VP9Decoder.java >+++ /dev/null >@@ -1,22 +0,0 @@ >-/* >- * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-package org.webrtc; >- >-class VP9Decoder extends WrappedNativeVideoDecoder { >- @Override >- public long createNativeVideoDecoder() { >- return nativeCreateDecoder(); >- } >- >- static native long nativeCreateDecoder(); >- >- static native boolean nativeIsSupported(); >-} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/java/org/webrtc/VP9Encoder.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/java/org/webrtc/VP9Encoder.java >deleted file mode 100644 >index bc705bd9c6017eb66449c727f3640715f71913df..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/java/org/webrtc/VP9Encoder.java >+++ /dev/null >@@ -1,27 +0,0 @@ >-/* >- * Copyright (c) 2017 The WebRTC project authors. All Rights Reserved. 
>- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-package org.webrtc; >- >-class VP9Encoder extends WrappedNativeVideoEncoder { >- @Override >- public long createNativeVideoEncoder() { >- return nativeCreateEncoder(); >- } >- >- static native long nativeCreateEncoder(); >- >- @Override >- public boolean isHardwareEncoder() { >- return false; >- } >- >- static native boolean nativeIsSupported(); >-} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/java/org/webrtc/WrappedNativeVideoEncoder.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/java/org/webrtc/WrappedNativeVideoEncoder.java >index 07e842862101207107cfd42783bb9d6050a51d5b..959cafca56907c0a18bf51c66f8bfab261ccc26d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/java/org/webrtc/WrappedNativeVideoEncoder.java >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/java/org/webrtc/WrappedNativeVideoEncoder.java >@@ -32,11 +32,6 @@ abstract class WrappedNativeVideoEncoder implements VideoEncoder { > throw new UnsupportedOperationException("Not implemented."); > } > >- @Override >- public VideoCodecStatus setChannelParameters(short packetLoss, long roundTripTimeMs) { >- throw new UnsupportedOperationException("Not implemented."); >- } >- > @Override > public VideoCodecStatus setRateAllocation(BitrateAllocation allocation, int framerate) { > throw new UnsupportedOperationException("Not implemented."); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/androidmediadecoder.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/androidmediadecoder.cc >index 
47ddaac4672eb01da0c24147479d5a3cb1e1cf41..4df50829c36b10e1c0e00992123c5368d9fd8fc1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/androidmediadecoder.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/androidmediadecoder.cc >@@ -35,7 +35,6 @@ > #include "third_party/libyuv/include/libyuv/video_common.h" > > using rtc::Bind; >-using rtc::Thread; > using rtc::ThreadManager; > namespace webrtc { > namespace jni { >@@ -121,7 +120,7 @@ class MediaCodecVideoDecoder : public VideoDecoder, public rtc::MessageHandler { > > // State that is constant for the lifetime of this object once the ctor > // returns. >- std::unique_ptr<Thread> >+ std::unique_ptr<rtc::Thread> > codec_thread_; // Thread on which to operate MediaCodec. > ScopedJavaGlobalRef<jobject> j_media_codec_video_decoder_; > >@@ -137,7 +136,7 @@ MediaCodecVideoDecoder::MediaCodecVideoDecoder(JNIEnv* jni, > inited_(false), > sw_fallback_required_(false), > use_surface_(use_surface), >- codec_thread_(Thread::Create()), >+ codec_thread_(rtc::Thread::Create()), > j_media_codec_video_decoder_( > jni, > Java_MediaCodecVideoDecoder_Constructor(jni)) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/androidmediaencoder.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/androidmediaencoder.cc >index 435bc713a8580cd9ab4f1a9671680e14a89849bc..09b2ccf43f40cd491bea05618bb2babbe1167239 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/androidmediaencoder.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/androidmediaencoder.cc >@@ -47,7 +47,6 @@ > #include "third_party/libyuv/include/libyuv/video_common.h" > > using rtc::Bind; >-using rtc::Thread; > using rtc::ThreadManager; > > namespace webrtc { >@@ -105,13 +104,9 @@ class MediaCodecVideoEncoder : public VideoEncoder { > int32_t RegisterEncodeCompleteCallback( > EncodedImageCallback* callback) override; > int32_t Release() override; >- 
int32_t SetChannelParameters(uint32_t /* packet_loss */, >- int64_t /* rtt */) override; > int32_t SetRateAllocation(const VideoBitrateAllocation& rate_allocation, > uint32_t frame_rate) override; >- >- bool SupportsNativeHandle() const override { return has_egl_context_; } >- const char* ImplementationName() const override; >+ EncoderInfo GetEncoderInfo() const override; > > // Fills the input buffer with data from the buffers passed as parameters. > bool FillInputBuffer(JNIEnv* jni, >@@ -180,7 +175,7 @@ class MediaCodecVideoEncoder : public VideoEncoder { > // true on success. > bool DeliverPendingOutputs(JNIEnv* jni); > >- VideoEncoder::ScalingSettings GetScalingSettings() const override; >+ VideoEncoder::ScalingSettings GetScalingSettingsInternal() const; > > // Displays encoder statistics. > void LogStatistics(bool force_log); >@@ -271,6 +266,7 @@ class MediaCodecVideoEncoder : public VideoEncoder { > size_t gof_idx_; > > const bool has_egl_context_; >+ EncoderInfo encoder_info_; > > // Temporary fix for VP8. 
> // Sends a key frame if frames are largely spaced apart (possibly >@@ -358,17 +354,16 @@ int32_t MediaCodecVideoEncoder::InitEncode(const VideoCodec* codec_settings, > ALOGD << "H.264 profile: " << profile_; > } > >+ encoder_info_.supports_native_handle = has_egl_context_; >+ encoder_info_.implementation_name = "MediaCodec"; >+ encoder_info_.scaling_settings = GetScalingSettingsInternal(); >+ > return InitEncodeInternal( > init_width, init_height, codec_settings->startBitrate, > codec_settings->maxFramerate, > codec_settings->expect_encode_from_texture && has_egl_context_); > } > >-int32_t MediaCodecVideoEncoder::SetChannelParameters(uint32_t /* packet_loss */, >- int64_t /* rtt */) { >- return WEBRTC_VIDEO_CODEC_OK; >-} >- > bool MediaCodecVideoEncoder::ResetCodec() { > RTC_DCHECK_CALLED_SEQUENTIALLY(&encoder_queue_checker_); > ALOGE << "Reset"; >@@ -930,6 +925,10 @@ int32_t MediaCodecVideoEncoder::SetRateAllocation( > return WEBRTC_VIDEO_CODEC_OK; > } > >+VideoEncoder::EncoderInfo MediaCodecVideoEncoder::GetEncoderInfo() const { >+ return encoder_info_; >+} >+ > bool MediaCodecVideoEncoder::DeliverPendingOutputs(JNIEnv* jni) { > RTC_DCHECK_CALLED_SEQUENTIALLY(&encoder_queue_checker_); > >@@ -1148,8 +1147,8 @@ void MediaCodecVideoEncoder::LogStatistics(bool force_log) { > } > } > >-VideoEncoder::ScalingSettings MediaCodecVideoEncoder::GetScalingSettings() >- const { >+VideoEncoder::ScalingSettings >+MediaCodecVideoEncoder::GetScalingSettingsInternal() const { > if (!scale_) > return VideoEncoder::ScalingSettings::kOff; > >@@ -1206,10 +1205,6 @@ VideoEncoder::ScalingSettings MediaCodecVideoEncoder::GetScalingSettings() > return VideoEncoder::ScalingSettings::kOff; > } > >-const char* MediaCodecVideoEncoder::ImplementationName() const { >- return "MediaCodec"; >-} >- > static void JNI_MediaCodecVideoEncoder_FillInputBuffer( > JNIEnv* jni, > const JavaParamRef<jclass>&, >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/androidnetworkmonitor.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/androidnetworkmonitor.cc >index b1351ef1e1b745550497561f9bc0d192f9512f76..7ac17281f97caf9fce7361e547e6d9be2d70c8ea 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/androidnetworkmonitor.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/androidnetworkmonitor.cc >@@ -19,6 +19,7 @@ > #include "rtc_base/bind.h" > #include "rtc_base/checks.h" > #include "rtc_base/ipaddress.h" >+#include "rtc_base/logging.h" > #include "rtc_base/strings/string_builder.h" > #include "sdk/android/generated_base_jni/jni/NetworkMonitorAutoDetect_jni.h" > #include "sdk/android/generated_base_jni/jni/NetworkMonitor_jni.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/androidvideotracksource.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/androidvideotracksource.cc >index 41d4278ac9cf3aad47c1faa6d6d2836005c1f89f..e72850f24988b0487a8850ba0511c7c4b5b1e4bd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/androidvideotracksource.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/androidvideotracksource.cc >@@ -25,10 +25,12 @@ const int kRequiredResolutionAlignment = 2; > > AndroidVideoTrackSource::AndroidVideoTrackSource(rtc::Thread* signaling_thread, > JNIEnv* jni, >- bool is_screencast) >+ bool is_screencast, >+ bool align_timestamps) > : AdaptedVideoTrackSource(kRequiredResolutionAlignment), > signaling_thread_(signaling_thread), >- is_screencast_(is_screencast) { >+ is_screencast_(is_screencast), >+ align_timestamps_(align_timestamps) { > RTC_LOG(LS_INFO) << "AndroidVideoTrackSource ctor"; > camera_thread_checker_.DetachFromThread(); > } >@@ -75,7 +77,9 @@ void AndroidVideoTrackSource::OnFrameCaptured( > > int64_t camera_time_us = timestamp_ns / rtc::kNumNanosecsPerMicrosec; > int64_t 
translated_camera_time_us = >- timestamp_aligner_.TranslateTimestamp(camera_time_us, rtc::TimeMicros()); >+ align_timestamps_ ? timestamp_aligner_.TranslateTimestamp( >+ camera_time_us, rtc::TimeMicros()) >+ : camera_time_us; > > int adapted_width; > int adapted_height; >@@ -84,10 +88,19 @@ void AndroidVideoTrackSource::OnFrameCaptured( > int crop_x; > int crop_y; > >- if (!AdaptFrame(width, height, camera_time_us, &adapted_width, >- &adapted_height, &crop_width, &crop_height, &crop_x, >- &crop_y)) { >- return; >+ if (rotation % 180 == 0) { >+ if (!AdaptFrame(width, height, camera_time_us, &adapted_width, >+ &adapted_height, &crop_width, &crop_height, &crop_x, >+ &crop_y)) { >+ return; >+ } >+ } else { >+ // Swap all width/height and x/y. >+ if (!AdaptFrame(height, width, camera_time_us, &adapted_height, >+ &adapted_width, &crop_height, &crop_width, &crop_y, >+ &crop_x)) { >+ return; >+ } > } > > rtc::scoped_refptr<VideoFrameBuffer> buffer = >@@ -103,12 +116,16 @@ void AndroidVideoTrackSource::OnFrameCaptured( > OnFrame(VideoFrame(buffer, rotation, translated_camera_time_us)); > } > >-void AndroidVideoTrackSource::OnOutputFormatRequest(int width, >- int height, >+void AndroidVideoTrackSource::OnOutputFormatRequest(int landscape_width, >+ int landscape_height, >+ int portrait_width, >+ int portrait_height, > int fps) { >- cricket::VideoFormat format(width, height, >- cricket::VideoFormat::FpsToInterval(fps), 0); >- video_adapter()->OnOutputFormatRequest(format); >+ video_adapter()->OnOutputFormatRequest( >+ std::make_pair(landscape_width, landscape_height), >+ landscape_width * landscape_height, >+ std::make_pair(portrait_width, portrait_height), >+ portrait_width * portrait_height, fps); > } > > } // namespace jni >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/androidvideotracksource.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/androidvideotracksource.h >index 
8f092c1b041d6aea3b4952b607b2c9a72182de4c..4f8442e57781580fdd19b287c4b8e703c5c54f4b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/androidvideotracksource.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/androidvideotracksource.h >@@ -29,7 +29,8 @@ class AndroidVideoTrackSource : public rtc::AdaptedVideoTrackSource { > public: > AndroidVideoTrackSource(rtc::Thread* signaling_thread, > JNIEnv* jni, >- bool is_screencast = false); >+ bool is_screencast, >+ bool align_timestamps); > ~AndroidVideoTrackSource() override; > > bool is_screencast() const override; >@@ -53,15 +54,20 @@ class AndroidVideoTrackSource : public rtc::AdaptedVideoTrackSource { > VideoRotation rotation, > const JavaRef<jobject>& j_video_frame_buffer); > >- void OnOutputFormatRequest(int width, int height, int fps); >+ void OnOutputFormatRequest(int landscape_width, >+ int landscape_height, >+ int portrait_width, >+ int portrait_height, >+ int fps); > > private: > rtc::Thread* signaling_thread_; > rtc::AsyncInvoker invoker_; > rtc::ThreadChecker camera_thread_checker_; > SourceState state_; >- rtc::TimestampAligner timestamp_aligner_; > const bool is_screencast_; >+ rtc::TimestampAligner timestamp_aligner_; >+ const bool align_timestamps_; > }; > > } // namespace jni >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/audio_device/audio_device_module.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/audio_device/audio_device_module.cc >index dc3a8fe4ba553a7dae1428c32e0441a664c26a7e..e04a01faead1bf60c3c6f4e1c18cbd52ef8ddef1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/audio_device/audio_device_module.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/audio_device/audio_device_module.cc >@@ -12,6 +12,7 @@ > > #include <utility> > >+#include "modules/audio_device/audio_device_buffer.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" > #include 
"rtc_base/refcountedobject.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/audio_device/audio_device_module.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/audio_device/audio_device_module.h >index cddd3e0676978555940389b3492e61022ebe1f02..476da14ceb92825e9c334892441ebf405cd8be4a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/audio_device/audio_device_module.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/audio_device/audio_device_module.h >@@ -14,10 +14,13 @@ > #include <memory> > > #include "absl/types/optional.h" >-#include "modules/audio_device/audio_device_buffer.h" >+#include "modules/audio_device/include/audio_device.h" > #include "sdk/android/native_api/jni/scoped_java_ref.h" > > namespace webrtc { >+ >+class AudioDeviceBuffer; >+ > namespace jni { > > class AudioInput { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/builtinaudiodecoderfactoryfactory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/builtinaudiodecoderfactoryfactory.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..68a2843637496764d418412c3c9f44df4699262f >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/builtinaudiodecoderfactoryfactory.cc >@@ -0,0 +1,28 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "sdk/android/generated_builtin_audio_codecs_jni/jni/BuiltinAudioDecoderFactoryFactory_jni.h" >+#include "sdk/android/native_api/jni/java_types.h" >+#include "sdk/android/src/jni/jni_helpers.h" >+ >+#include "api/audio_codecs/builtin_audio_decoder_factory.h" >+ >+namespace webrtc { >+namespace jni { >+ >+static jlong >+JNI_BuiltinAudioDecoderFactoryFactory_CreateBuiltinAudioDecoderFactory( >+ JNIEnv* env, >+ const JavaParamRef<jclass>& jcaller) { >+ return NativeToJavaPointer(CreateBuiltinAudioDecoderFactory().release()); >+} >+ >+} // namespace jni >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/builtinaudioencoderfactoryfactory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/builtinaudioencoderfactoryfactory.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..c1aefba9ab278ef643db022d60e279e91ace82bc >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/builtinaudioencoderfactoryfactory.cc >@@ -0,0 +1,28 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "sdk/android/generated_builtin_audio_codecs_jni/jni/BuiltinAudioEncoderFactoryFactory_jni.h" >+#include "sdk/android/native_api/jni/java_types.h" >+#include "sdk/android/src/jni/jni_helpers.h" >+ >+#include "api/audio_codecs/builtin_audio_encoder_factory.h" >+ >+namespace webrtc { >+namespace jni { >+ >+static jlong >+JNI_BuiltinAudioEncoderFactoryFactory_CreateBuiltinAudioEncoderFactory( >+ JNIEnv* env, >+ const JavaParamRef<jclass>& jcaller) { >+ return NativeToJavaPointer(CreateBuiltinAudioEncoderFactory().release()); >+} >+ >+} // namespace jni >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/encodedimage.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/encodedimage.cc >index c0ea28e2dc323927a8e27b00d73df7c7e5d3dc61..21f83364d97d67a5f5949784c55ddaa06aa15fe4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/encodedimage.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/encodedimage.cc >@@ -10,7 +10,7 @@ > > #include "sdk/android/src/jni/encodedimage.h" > >-#include "common_video/include/video_frame.h" >+#include "api/video/encoded_image.h" > #include "rtc_base/timeutils.h" > #include "sdk/android/generated_video_jni/jni/EncodedImage_jni.h" > #include "sdk/android/native_api/jni/java_types.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/jni_generator_helper.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/jni_generator_helper.h >index 53e1d84127d7b61b276bc3160662149e2b180e2b..e6d6f7ef159d68aa542540f131e047e534da410b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/jni_generator_helper.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/jni_generator_helper.h >@@ -33,12 +33,6 @@ > RTC_CHECK(!jni->ExceptionCheck()) \ > << (jni->ExceptionDescribe(), jni->ExceptionClear(), "") > >-namespace jni_generator { >-inline void 
CheckException(JNIEnv* env) { >- CHECK_EXCEPTION(env); >-} >-} // namespace jni_generator >- > namespace webrtc { > > // This function will initialize |atomic_class_id| to contain a global ref to >@@ -86,4 +80,80 @@ using webrtc::MethodID; > } // namespace android > } // namespace base > >+namespace jni_generator { >+inline void CheckException(JNIEnv* env) { >+ CHECK_EXCEPTION(env); >+} >+ >+// A 32 bit number could be an address on stack. Random 64 bit marker on the >+// stack is much less likely to be present on stack. >+constexpr uint64_t kJniStackMarkerValue = 0xbdbdef1bebcade1b; >+ >+// Context about the JNI call with exception checked to be stored in stack. >+struct BASE_EXPORT JniJavaCallContextUnchecked { >+ inline JniJavaCallContextUnchecked() { >+// TODO(ssid): Implement for other architectures. >+#if defined(__arm__) || defined(__aarch64__) >+ // This assumes that this method does not increment the stack pointer. >+ asm volatile("mov %0, sp" : "=r"(sp)); >+#else >+ sp = 0; >+#endif >+ } >+ >+ // Force no inline to reduce code size. >+ template <base::android::MethodID::Type type> >+ void Init(JNIEnv* env, >+ jclass clazz, >+ const char* method_name, >+ const char* jni_signature, >+ std::atomic<jmethodID>* atomic_method_id) { >+ env1 = env; >+ >+ // Make sure compiler doesn't optimize out the assignment. >+ memcpy(&marker, &kJniStackMarkerValue, sizeof(kJniStackMarkerValue)); >+ // Gets PC of the calling function. >+ pc = reinterpret_cast<uintptr_t>(__builtin_return_address(0)); >+ >+ method_id = base::android::MethodID::LazyGet<type>( >+ env, clazz, method_name, jni_signature, atomic_method_id); >+ } >+ >+ ~JniJavaCallContextUnchecked() { >+ // Reset so that spurious marker finds are avoided. >+ memset(&marker, 0, sizeof(marker)); >+ } >+ >+ uint64_t marker; >+ uintptr_t sp; >+ uintptr_t pc; >+ >+ JNIEnv* env1; >+ jmethodID method_id; >+}; >+ >+// Context about the JNI call with exception unchecked to be stored in stack. 
>+struct BASE_EXPORT JniJavaCallContextChecked { >+ // Force no inline to reduce code size. >+ template <base::android::MethodID::Type type> >+ void Init(JNIEnv* env, >+ jclass clazz, >+ const char* method_name, >+ const char* jni_signature, >+ std::atomic<jmethodID>* atomic_method_id) { >+ base.Init<type>(env, clazz, method_name, jni_signature, atomic_method_id); >+ // Reset |pc| to correct caller. >+ base.pc = reinterpret_cast<uintptr_t>(__builtin_return_address(0)); >+ } >+ >+ ~JniJavaCallContextChecked() { jni_generator::CheckException(base.env1); } >+ >+ JniJavaCallContextUnchecked base; >+}; >+ >+static_assert(sizeof(JniJavaCallContextChecked) == >+ sizeof(JniJavaCallContextUnchecked), >+ "Stack unwinder cannot work with structs of different sizes."); >+} // namespace jni_generator >+ > #endif // SDK_ANDROID_SRC_JNI_JNI_GENERATOR_HELPER_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/cryptooptions.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/cryptooptions.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..0f38b13f78bdd733e962b85af69879d21491f7e7 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/cryptooptions.cc >@@ -0,0 +1,43 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "sdk/android/src/jni/pc/cryptooptions.h" >+ >+#include "sdk/android/generated_peerconnection_jni/jni/CryptoOptions_jni.h" >+ >+namespace webrtc { >+namespace jni { >+ >+absl::optional<CryptoOptions> JavaToNativeOptionalCryptoOptions( >+ JNIEnv* jni, >+ const JavaRef<jobject>& j_crypto_options) { >+ if (j_crypto_options.is_null()) { >+ return absl::nullopt; >+ } >+ >+ ScopedJavaLocalRef<jobject> j_srtp = >+ Java_CryptoOptions_getSrtp(jni, j_crypto_options); >+ ScopedJavaLocalRef<jobject> j_sframe = >+ Java_CryptoOptions_getSFrame(jni, j_crypto_options); >+ >+ CryptoOptions native_crypto_options; >+ native_crypto_options.srtp.enable_gcm_crypto_suites = >+ Java_Srtp_getEnableGcmCryptoSuites(jni, j_srtp); >+ native_crypto_options.srtp.enable_aes128_sha1_32_crypto_cipher = >+ Java_Srtp_getEnableAes128Sha1_32CryptoCipher(jni, j_srtp); >+ native_crypto_options.srtp.enable_encrypted_rtp_header_extensions = >+ Java_Srtp_getEnableEncryptedRtpHeaderExtensions(jni, j_srtp); >+ native_crypto_options.sframe.require_frame_encryption = >+ Java_SFrame_getRequireFrameEncryption(jni, j_sframe); >+ return absl::optional<CryptoOptions>(native_crypto_options); >+} >+ >+} // namespace jni >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/cryptooptions.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/cryptooptions.h >new file mode 100644 >index 0000000000000000000000000000000000000000..9890264a55c6f2c388027918ebd912b74c3720e7 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/cryptooptions.h >@@ -0,0 +1,30 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. 
All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef SDK_ANDROID_SRC_JNI_PC_CRYPTOOPTIONS_H_ >+#define SDK_ANDROID_SRC_JNI_PC_CRYPTOOPTIONS_H_ >+ >+#include <jni.h> >+ >+#include "absl/types/optional.h" >+#include "api/crypto/cryptooptions.h" >+#include "sdk/android/native_api/jni/scoped_java_ref.h" >+ >+namespace webrtc { >+namespace jni { >+ >+absl::optional<CryptoOptions> JavaToNativeOptionalCryptoOptions( >+ JNIEnv* jni, >+ const JavaRef<jobject>& j_crypto_options); >+ >+} // namespace jni >+} // namespace webrtc >+ >+#endif // SDK_ANDROID_SRC_JNI_PC_CRYPTOOPTIONS_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/null_video.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/null_video.cc >index 0e0b80d4612f25e874b90d8ea5a722403164674a..54a401d605c191283ae7fcffe464729420211ccb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/null_video.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/null_video.cc >@@ -31,7 +31,8 @@ VideoDecoderFactory* CreateVideoDecoderFactory( > void* CreateVideoSource(JNIEnv* env, > rtc::Thread* signaling_thread, > rtc::Thread* worker_thread, >- jboolean is_screencast) { >+ jboolean is_screencast, >+ jboolean align_timestamps) { > return nullptr; > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/ownedfactoryandthreads.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/ownedfactoryandthreads.cc >index 9987f252275e327c2ddad4f38049d0c889e9499e..2d62ea32a131cc5e136b82a147b30f0d79772aae 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/ownedfactoryandthreads.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/ownedfactoryandthreads.cc >@@ -21,9 +21,9 @@ PeerConnectionFactoryInterface* factoryFromJava(jlong j_p) { > } > > OwnedFactoryAndThreads::OwnedFactoryAndThreads( 
>- std::unique_ptr<Thread> network_thread, >- std::unique_ptr<Thread> worker_thread, >- std::unique_ptr<Thread> signaling_thread, >+ std::unique_ptr<rtc::Thread> network_thread, >+ std::unique_ptr<rtc::Thread> worker_thread, >+ std::unique_ptr<rtc::Thread> signaling_thread, > rtc::NetworkMonitorFactory* network_monitor_factory, > PeerConnectionFactoryInterface* factory) > : network_thread_(std::move(network_thread)), >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/ownedfactoryandthreads.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/ownedfactoryandthreads.h >index 2ea8570282d955ca000618f103c30974c44cc9e8..bacd42dfcd6c17f790ceab5dfe84f90b349336a0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/ownedfactoryandthreads.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/ownedfactoryandthreads.h >@@ -18,8 +18,6 @@ > #include "api/peerconnectioninterface.h" > #include "rtc_base/thread.h" > >-using rtc::Thread; >- > namespace webrtc { > namespace jni { > >@@ -33,18 +31,18 @@ PeerConnectionFactoryInterface* factoryFromJava(jlong j_p); > // single thing for Java to hold and eventually free. 
> class OwnedFactoryAndThreads { > public: >- OwnedFactoryAndThreads(std::unique_ptr<Thread> network_thread, >- std::unique_ptr<Thread> worker_thread, >- std::unique_ptr<Thread> signaling_thread, >+ OwnedFactoryAndThreads(std::unique_ptr<rtc::Thread> network_thread, >+ std::unique_ptr<rtc::Thread> worker_thread, >+ std::unique_ptr<rtc::Thread> signaling_thread, > rtc::NetworkMonitorFactory* network_monitor_factory, > PeerConnectionFactoryInterface* factory); > > ~OwnedFactoryAndThreads(); > > PeerConnectionFactoryInterface* factory() { return factory_; } >- Thread* network_thread() { return network_thread_.get(); } >- Thread* signaling_thread() { return signaling_thread_.get(); } >- Thread* worker_thread() { return worker_thread_.get(); } >+ rtc::Thread* network_thread() { return network_thread_.get(); } >+ rtc::Thread* signaling_thread() { return signaling_thread_.get(); } >+ rtc::Thread* worker_thread() { return worker_thread_.get(); } > rtc::NetworkMonitorFactory* network_monitor_factory() { > return network_monitor_factory_; > } >@@ -52,9 +50,9 @@ class OwnedFactoryAndThreads { > void InvokeJavaCallbacksOnFactoryThreads(); > > private: >- const std::unique_ptr<Thread> network_thread_; >- const std::unique_ptr<Thread> worker_thread_; >- const std::unique_ptr<Thread> signaling_thread_; >+ const std::unique_ptr<rtc::Thread> network_thread_; >+ const std::unique_ptr<rtc::Thread> worker_thread_; >+ const std::unique_ptr<rtc::Thread> signaling_thread_; > rtc::NetworkMonitorFactory* network_monitor_factory_; > PeerConnectionFactoryInterface* factory_; // Const after ctor except dtor. 
> }; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/peerconnection.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/peerconnection.cc >index 79da797e38877d8bfee40b3d289e8e57820629ef..0515ce264e739d7cf5315dec9905f91785e01f5c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/peerconnection.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/peerconnection.cc >@@ -43,10 +43,12 @@ > #include "sdk/android/generated_peerconnection_jni/jni/PeerConnection_jni.h" > #include "sdk/android/native_api/jni/java_types.h" > #include "sdk/android/src/jni/jni_helpers.h" >+#include "sdk/android/src/jni/pc/cryptooptions.h" > #include "sdk/android/src/jni/pc/datachannel.h" > #include "sdk/android/src/jni/pc/icecandidate.h" > #include "sdk/android/src/jni/pc/mediaconstraints.h" > #include "sdk/android/src/jni/pc/mediastreamtrack.h" >+#include "sdk/android/src/jni/pc/rtccertificate.h" > #include "sdk/android/src/jni/pc/rtcstatscollectorcallbackwrapper.h" > #include "sdk/android/src/jni/pc/rtpsender.h" > #include "sdk/android/src/jni/pc/sdpobserver.h" >@@ -129,6 +131,8 @@ void JavaToNativeRTCConfiguration( > Java_RTCConfiguration_getBundlePolicy(jni, j_rtc_config); > ScopedJavaLocalRef<jobject> j_rtcp_mux_policy = > Java_RTCConfiguration_getRtcpMuxPolicy(jni, j_rtc_config); >+ ScopedJavaLocalRef<jobject> j_rtc_certificate = >+ Java_RTCConfiguration_getCertificate(jni, j_rtc_config); > ScopedJavaLocalRef<jobject> j_tcp_candidate_policy = > Java_RTCConfiguration_getTcpCandidatePolicy(jni, j_rtc_config); > ScopedJavaLocalRef<jobject> j_candidate_network_policy = >@@ -143,11 +147,20 @@ void JavaToNativeRTCConfiguration( > Java_RTCConfiguration_getNetworkPreference(jni, j_rtc_config); > ScopedJavaLocalRef<jobject> j_sdp_semantics = > Java_RTCConfiguration_getSdpSemantics(jni, j_rtc_config); >+ ScopedJavaLocalRef<jobject> j_crypto_options = >+ Java_RTCConfiguration_getCryptoOptions(jni, 
j_rtc_config); > > rtc_config->type = JavaToNativeIceTransportsType(jni, j_ice_transports_type); > rtc_config->bundle_policy = JavaToNativeBundlePolicy(jni, j_bundle_policy); > rtc_config->rtcp_mux_policy = > JavaToNativeRtcpMuxPolicy(jni, j_rtcp_mux_policy); >+ if (!j_rtc_certificate.is_null()) { >+ rtc::scoped_refptr<rtc::RTCCertificate> certificate = >+ rtc::RTCCertificate::FromPEM( >+ JavaToNativeRTCCertificatePEM(jni, j_rtc_certificate)); >+ RTC_CHECK(certificate != nullptr) << "supplied certificate is malformed."; >+ rtc_config->certificates.push_back(certificate); >+ } > rtc_config->tcp_candidate_policy = > JavaToNativeTcpCandidatePolicy(jni, j_tcp_candidate_policy); > rtc_config->candidate_network_policy = >@@ -234,6 +247,13 @@ void JavaToNativeRTCConfiguration( > rtc_config->sdp_semantics = JavaToNativeSdpSemantics(jni, j_sdp_semantics); > rtc_config->active_reset_srtp_params = > Java_RTCConfiguration_getActiveResetSrtpParams(jni, j_rtc_config); >+ rtc_config->use_media_transport = >+ Java_RTCConfiguration_getUseMediaTransport(jni, j_rtc_config); >+ rtc_config->use_media_transport_for_data_channels = >+ Java_RTCConfiguration_getUseMediaTransportForDataChannels(jni, >+ j_rtc_config); >+ rtc_config->crypto_options = >+ JavaToNativeOptionalCryptoOptions(jni, j_crypto_options); > } > > rtc::KeyType GetRtcConfigKeyType(JNIEnv* env, >@@ -279,6 +299,14 @@ void PeerConnectionObserverJni::OnIceConnectionChange( > Java_IceConnectionState_fromNativeIndex(env, new_state)); > } > >+void PeerConnectionObserverJni::OnConnectionChange( >+ PeerConnectionInterface::PeerConnectionState new_state) { >+ JNIEnv* env = AttachCurrentThreadIfNeeded(); >+ Java_Observer_onConnectionChange(env, j_observer_global_, >+ Java_PeerConnectionState_fromNativeIndex( >+ env, static_cast<int>(new_state))); >+} >+ > void PeerConnectionObserverJni::OnIceConnectionReceivingChange(bool receiving) { > JNIEnv* env = AttachCurrentThreadIfNeeded(); > Java_Observer_onIceConnectionReceivingChange(env, 
j_observer_global_, >@@ -429,6 +457,16 @@ static ScopedJavaLocalRef<jobject> JNI_PeerConnection_GetRemoteDescription( > return sdp ? NativeToJavaSessionDescription(jni, sdp) : nullptr; > } > >+static ScopedJavaLocalRef<jobject> JNI_PeerConnection_GetCertificate( >+ JNIEnv* jni, >+ const JavaParamRef<jobject>& j_pc) { >+ const PeerConnectionInterface::RTCConfiguration rtc_config = >+ ExtractNativePC(jni, j_pc)->GetConfiguration(); >+ rtc::scoped_refptr<rtc::RTCCertificate> certificate = >+ rtc_config.certificates[0]; >+ return NativeToJavaRTCCertificatePEM(jni, certificate->ToPEM()); >+} >+ > static ScopedJavaLocalRef<jobject> JNI_PeerConnection_CreateDataChannel( > JNIEnv* jni, > const JavaParamRef<jobject>& j_pc, >@@ -722,6 +760,14 @@ static ScopedJavaLocalRef<jobject> JNI_PeerConnection_IceConnectionState( > env, ExtractNativePC(env, j_pc)->ice_connection_state()); > } > >+static ScopedJavaLocalRef<jobject> JNI_PeerConnection_ConnectionState( >+ JNIEnv* env, >+ const JavaParamRef<jobject>& j_pc) { >+ return Java_PeerConnectionState_fromNativeIndex( >+ env, >+ static_cast<int>(ExtractNativePC(env, j_pc)->peer_connection_state())); >+} >+ > static ScopedJavaLocalRef<jobject> JNI_PeerConnection_IceGatheringState( > JNIEnv* env, > const JavaParamRef<jobject>& j_pc) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/peerconnection.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/peerconnection.h >index f20166727ea0b4954b06a121eccc93cc6cac75c0..a9d7b39c3d4217ce2ec8257a1397fb68723e2989 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/peerconnection.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/peerconnection.h >@@ -51,6 +51,8 @@ class PeerConnectionObserverJni : public PeerConnectionObserver { > PeerConnectionInterface::SignalingState new_state) override; > void OnIceConnectionChange( > PeerConnectionInterface::IceConnectionState new_state) override; >+ void 
OnConnectionChange( >+ PeerConnectionInterface::PeerConnectionState new_state) override; > void OnIceConnectionReceivingChange(bool receiving) override; > void OnIceGatheringChange( > PeerConnectionInterface::IceGatheringState new_state) override; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/peerconnectionfactory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/peerconnectionfactory.cc >index 3de79555a63c1dd805cae135338db4aae38ecc99..c2b8b9c5d7dfa7bfe2eb8416a8e122f40166728b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/peerconnectionfactory.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/peerconnectionfactory.cc >@@ -50,10 +50,6 @@ JavaToNativePeerConnectionFactoryOptions(JNIEnv* jni, > bool disable_encryption = Java_Options_getDisableEncryption(jni, options); > bool disable_network_monitor = > Java_Options_getDisableNetworkMonitor(jni, options); >- bool enable_aes128_sha1_32_crypto_cipher = >- Java_Options_getEnableAes128Sha1_32CryptoCipher(jni, options); >- bool enable_gcm_crypto_suites = >- Java_Options_getEnableGcmCryptoSuites(jni, options); > > PeerConnectionFactoryInterface::Options native_options; > >@@ -63,10 +59,6 @@ JavaToNativePeerConnectionFactoryOptions(JNIEnv* jni, > native_options.disable_encryption = disable_encryption; > native_options.disable_network_monitor = disable_network_monitor; > >- native_options.crypto_options.enable_aes128_sha1_32_crypto_cipher = >- enable_aes128_sha1_32_crypto_cipher; >- native_options.crypto_options.enable_gcm_crypto_suites = >- enable_gcm_crypto_suites; > return native_options; > } > >@@ -199,15 +191,21 @@ static void JNI_PeerConnectionFactory_ShutdownInternalTracer( > rtc::tracing::ShutdownInternalTracer(); > } > >+// Following parameters are optional: >+// |audio_device_module|, |jencoder_factory|, |jdecoder_factory|, >+// |audio_processor|, |media_transport_factory|, |fec_controller_factory|. 
> jlong CreatePeerConnectionFactoryForJava( > JNIEnv* jni, > const JavaParamRef<jobject>& jcontext, > const JavaParamRef<jobject>& joptions, > rtc::scoped_refptr<AudioDeviceModule> audio_device_module, >+ rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory, >+ rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory, > const JavaParamRef<jobject>& jencoder_factory, > const JavaParamRef<jobject>& jdecoder_factory, > rtc::scoped_refptr<AudioProcessing> audio_processor, >- std::unique_ptr<FecControllerFactoryInterface> fec_controller_factory) { >+ std::unique_ptr<FecControllerFactoryInterface> fec_controller_factory, >+ std::unique_ptr<MediaTransportFactory> media_transport_factory) { > // talk/ assumes pretty widely that the current Thread is ThreadManager'd, but > // ThreadManager only WrapCurrentThread()s the thread where it is first > // created. Since the semantics around when auto-wrapping happens in >@@ -229,8 +227,6 @@ jlong CreatePeerConnectionFactoryForJava( > RTC_CHECK(signaling_thread->Start()) << "Failed to start thread"; > > rtc::NetworkMonitorFactory* network_monitor_factory = nullptr; >- auto audio_encoder_factory = CreateAudioEncoderFactory(); >- auto audio_decoder_factory = CreateAudioDecoderFactory(); > > PeerConnectionFactoryInterface::Options options; > bool has_options = !joptions.is_null(); >@@ -257,12 +253,19 @@ jlong CreatePeerConnectionFactoryForJava( > std::unique_ptr<VideoDecoderFactory>( > CreateVideoDecoderFactory(jni, jdecoder_factory)), > audio_mixer, audio_processor)); >+ PeerConnectionFactoryDependencies dependencies; >+ dependencies.network_thread = network_thread.get(); >+ dependencies.worker_thread = worker_thread.get(); >+ dependencies.signaling_thread = signaling_thread.get(); >+ dependencies.media_engine = std::move(media_engine); >+ dependencies.call_factory = std::move(call_factory); >+ dependencies.event_log_factory = std::move(rtc_event_log_factory); >+ dependencies.fec_controller_factory = 
std::move(fec_controller_factory); >+ dependencies.media_transport_factory = std::move(media_transport_factory); > > rtc::scoped_refptr<PeerConnectionFactoryInterface> factory( >- CreateModularPeerConnectionFactory( >- network_thread.get(), worker_thread.get(), signaling_thread.get(), >- std::move(media_engine), std::move(call_factory), >- std::move(rtc_event_log_factory), std::move(fec_controller_factory))); >+ CreateModularPeerConnectionFactory(std::move(dependencies))); >+ > RTC_CHECK(factory) << "Failed to create the peer connection factory; " > << "WebRTC/libjingle init likely failed on this device"; > // TODO(honghaiz): Maybe put the options as the argument of >@@ -283,21 +286,39 @@ static jlong JNI_PeerConnectionFactory_CreatePeerConnectionFactory( > const JavaParamRef<jobject>& jcontext, > const JavaParamRef<jobject>& joptions, > jlong native_audio_device_module, >+ jlong native_audio_encoder_factory, >+ jlong native_audio_decoder_factory, > const JavaParamRef<jobject>& jencoder_factory, > const JavaParamRef<jobject>& jdecoder_factory, > jlong native_audio_processor, >- jlong native_fec_controller_factory) { >+ jlong native_fec_controller_factory, >+ jlong native_media_transport_factory) { > rtc::scoped_refptr<AudioProcessing> audio_processor = > reinterpret_cast<AudioProcessing*>(native_audio_processor); >+ AudioEncoderFactory* audio_encoder_factory_ptr = >+ reinterpret_cast<AudioEncoderFactory*>(native_audio_encoder_factory); >+ rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory( >+ audio_encoder_factory_ptr); >+ // Release the caller's reference count. >+ audio_encoder_factory->Release(); >+ AudioDecoderFactory* audio_decoder_factory_ptr = >+ reinterpret_cast<AudioDecoderFactory*>(native_audio_decoder_factory); >+ rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory( >+ audio_decoder_factory_ptr); >+ // Release the caller's reference count. 
>+ audio_decoder_factory->Release(); > std::unique_ptr<FecControllerFactoryInterface> fec_controller_factory( > reinterpret_cast<FecControllerFactoryInterface*>( > native_fec_controller_factory)); >+ std::unique_ptr<MediaTransportFactory> media_transport_factory( >+ reinterpret_cast<MediaTransportFactory*>(native_media_transport_factory)); > return CreatePeerConnectionFactoryForJava( > jni, jcontext, joptions, > reinterpret_cast<AudioDeviceModule*>(native_audio_device_module), >- jencoder_factory, jdecoder_factory, >+ audio_encoder_factory, audio_decoder_factory, jencoder_factory, >+ jdecoder_factory, > audio_processor ? audio_processor : CreateAudioProcessing(), >- std::move(fec_controller_factory)); >+ std::move(fec_controller_factory), std::move(media_transport_factory)); > } > > static void JNI_PeerConnectionFactory_FreeFactory(JNIEnv*, >@@ -396,18 +417,20 @@ static jlong JNI_PeerConnectionFactory_CreatePeerConnection( > PeerConnectionInterface::RTCConfigurationType::kAggressive); > JavaToNativeRTCConfiguration(jni, j_rtc_config, &rtc_config); > >- // Generate non-default certificate. >- rtc::KeyType key_type = GetRtcConfigKeyType(jni, j_rtc_config); >- if (key_type != rtc::KT_DEFAULT) { >- rtc::scoped_refptr<rtc::RTCCertificate> certificate = >- rtc::RTCCertificateGenerator::GenerateCertificate( >- rtc::KeyParams(key_type), absl::nullopt); >- if (!certificate) { >- RTC_LOG(LS_ERROR) << "Failed to generate certificate. KeyType: " >- << key_type; >- return 0; >+ if (rtc_config.certificates.empty()) { >+ // Generate non-default certificate. >+ rtc::KeyType key_type = GetRtcConfigKeyType(jni, j_rtc_config); >+ if (key_type != rtc::KT_DEFAULT) { >+ rtc::scoped_refptr<rtc::RTCCertificate> certificate = >+ rtc::RTCCertificateGenerator::GenerateCertificate( >+ rtc::KeyParams(key_type), absl::nullopt); >+ if (!certificate) { >+ RTC_LOG(LS_ERROR) << "Failed to generate certificate. 
KeyType: " >+ << key_type; >+ return 0; >+ } >+ rtc_config.certificates.push_back(certificate); > } >- rtc_config.certificates.push_back(certificate); > } > > std::unique_ptr<MediaConstraintsInterface> constraints; >@@ -437,12 +460,13 @@ static jlong JNI_PeerConnectionFactory_CreateVideoSource( > JNIEnv* jni, > const JavaParamRef<jclass>&, > jlong native_factory, >- jboolean is_screencast) { >+ jboolean is_screencast, >+ jboolean align_timestamps) { > OwnedFactoryAndThreads* factory = > reinterpret_cast<OwnedFactoryAndThreads*>(native_factory); > return jlongFromPointer(CreateVideoSource(jni, factory->signaling_thread(), > factory->worker_thread(), >- is_screencast)); >+ is_screencast, align_timestamps)); > } > > static jlong JNI_PeerConnectionFactory_CreateVideoTrack( >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/rtccertificate.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/rtccertificate.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..8c598cce09e35c6a4ac2f11f0dd5fcfd1af9e165 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/rtccertificate.cc >@@ -0,0 +1,60 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "sdk/android/src/jni/pc/rtccertificate.h" >+#include "sdk/android/src/jni/pc/icecandidate.h" >+ >+#include "rtc_base/refcount.h" >+#include "rtc_base/rtccertificate.h" >+#include "rtc_base/rtccertificategenerator.h" >+#include "sdk/android/generated_peerconnection_jni/jni/RtcCertificatePem_jni.h" >+#include "sdk/android/native_api/jni/java_types.h" >+#include "sdk/android/src/jni/jni_helpers.h" >+ >+namespace webrtc { >+namespace jni { >+ >+rtc::RTCCertificatePEM JavaToNativeRTCCertificatePEM( >+ JNIEnv* jni, >+ const JavaRef<jobject>& j_rtc_certificate) { >+ ScopedJavaLocalRef<jstring> privatekey_field = >+ Java_RtcCertificatePem_getPrivateKey(jni, j_rtc_certificate); >+ ScopedJavaLocalRef<jstring> certificate_field = >+ Java_RtcCertificatePem_getCertificate(jni, j_rtc_certificate); >+ return rtc::RTCCertificatePEM(JavaToNativeString(jni, privatekey_field), >+ JavaToNativeString(jni, certificate_field)); >+} >+ >+ScopedJavaLocalRef<jobject> NativeToJavaRTCCertificatePEM( >+ JNIEnv* jni, >+ const rtc::RTCCertificatePEM& certificate) { >+ return Java_RtcCertificatePem_Constructor( >+ jni, NativeToJavaString(jni, certificate.private_key()), >+ NativeToJavaString(jni, certificate.certificate())); >+} >+ >+static ScopedJavaLocalRef<jobject> JNI_RtcCertificatePem_GenerateCertificate( >+ JNIEnv* jni, >+ const JavaParamRef<jclass>&, >+ const JavaParamRef<jobject>& j_key_type, >+ jlong j_expires) { >+ rtc::KeyType key_type = JavaToNativeKeyType(jni, j_key_type); >+ uint64_t expires = (uint64_t)j_expires; >+ rtc::scoped_refptr<rtc::RTCCertificate> certificate = >+ rtc::RTCCertificateGenerator::GenerateCertificate( >+ rtc::KeyParams(key_type), expires); >+ rtc::RTCCertificatePEM pem = certificate->ToPEM(); >+ return Java_RtcCertificatePem_Constructor( >+ jni, NativeToJavaString(jni, pem.private_key()), >+ NativeToJavaString(jni, pem.certificate())); >+} >+ >+} // namespace jni >+} // namespace webrtc >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/rtccertificate.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/rtccertificate.h >new file mode 100644 >index 0000000000000000000000000000000000000000..f7d1f75d13993d50cbeaa6ed7f6156c829659adf >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/rtccertificate.h >@@ -0,0 +1,33 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef SDK_ANDROID_SRC_JNI_PC_RTCCERTIFICATE_H_ >+#define SDK_ANDROID_SRC_JNI_PC_RTCCERTIFICATE_H_ >+ >+#include "rtc_base/refcount.h" >+#include "rtc_base/rtccertificate.h" >+#include "sdk/android/native_api/jni/java_types.h" >+#include "sdk/android/src/jni/jni_helpers.h" >+ >+namespace webrtc { >+namespace jni { >+ >+rtc::RTCCertificatePEM JavaToNativeRTCCertificatePEM( >+ JNIEnv* jni, >+ const JavaRef<jobject>& j_rtc_certificate); >+ >+ScopedJavaLocalRef<jobject> NativeToJavaRTCCertificatePEM( >+ JNIEnv* env, >+ const rtc::RTCCertificatePEM& certificate); >+ >+} // namespace jni >+} // namespace webrtc >+ >+#endif // SDK_ANDROID_SRC_JNI_PC_RTCCERTIFICATE_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/video.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/video.cc >index dd10fd51cb6ad5f4f770f288b27fafbd7fc083dd..c7c6e76d5d9ba73e05b900d3bf5e0c0c10e55bc9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/video.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/video.cc >@@ -44,10 +44,11 @@ VideoDecoderFactory* CreateVideoDecoderFactory( > void* 
CreateVideoSource(JNIEnv* env, > rtc::Thread* signaling_thread, > rtc::Thread* worker_thread, >- jboolean is_screencast) { >+ jboolean is_screencast, >+ jboolean align_timestamps) { > rtc::scoped_refptr<AndroidVideoTrackSource> source( >- new rtc::RefCountedObject<AndroidVideoTrackSource>(signaling_thread, env, >- is_screencast)); >+ new rtc::RefCountedObject<AndroidVideoTrackSource>( >+ signaling_thread, env, is_screencast, align_timestamps)); > return VideoTrackSourceProxy::Create(signaling_thread, worker_thread, source) > .release(); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/video.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/video.h >index 183770b8206283902f79b35f6073b6eaf79e98fb..04390ea453ea0a3b8ce12b1b4e35fcb8e7e650d8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/video.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/pc/video.h >@@ -36,7 +36,8 @@ VideoDecoderFactory* CreateVideoDecoderFactory( > void* CreateVideoSource(JNIEnv* env, > rtc::Thread* signaling_thread, > rtc::Thread* worker_thread, >- jboolean is_screencast); >+ jboolean is_screencast, >+ jboolean align_timestamps); > > } // namespace jni > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/timestampaligner.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/timestampaligner.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..abe9b0fdc513454f50dcb2b6bfd12b7bac10bbb4 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/timestampaligner.cc >@@ -0,0 +1,52 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. 
All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include <jni.h> >+ >+#include "rtc_base/timestampaligner.h" >+#include "rtc_base/timeutils.h" >+#include "sdk/android/generated_video_jni/jni/TimestampAligner_jni.h" >+#include "sdk/android/src/jni/jni_helpers.h" >+ >+namespace webrtc { >+namespace jni { >+ >+static jlong JNI_TimestampAligner_RtcTimeNanos( >+ JNIEnv* env, >+ const JavaParamRef<jclass>& j_caller) { >+ return rtc::TimeNanos(); >+} >+ >+static jlong JNI_TimestampAligner_CreateTimestampAligner( >+ JNIEnv* env, >+ const JavaParamRef<jclass>& j_caller) { >+ return jlongFromPointer(new rtc::TimestampAligner()); >+} >+ >+static void JNI_TimestampAligner_ReleaseTimestampAligner( >+ JNIEnv* env, >+ const JavaParamRef<jclass>& j_caller, >+ jlong timestamp_aligner) { >+ delete reinterpret_cast<rtc::TimestampAligner*>(timestamp_aligner); >+} >+ >+static jlong JNI_TimestampAligner_TranslateTimestamp( >+ JNIEnv* env, >+ const JavaParamRef<jclass>& j_caller, >+ jlong timestamp_aligner, >+ jlong camera_time_ns) { >+ return reinterpret_cast<rtc::TimestampAligner*>(timestamp_aligner) >+ ->TranslateTimestamp(camera_time_ns / rtc::kNumNanosecsPerMicrosec, >+ rtc::TimeMicros()) * >+ rtc::kNumNanosecsPerMicrosec; >+} >+ >+} // namespace jni >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/videoencoderwrapper.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/videoencoderwrapper.cc >index 6381c63cb06f9072968178dca09b144e90dc900d..1e4b84eac60f2b7c9badef1ac1cbe052b9aae95e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/videoencoderwrapper.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/videoencoderwrapper.cc >@@ -32,8 +32,6 @@ namespace jni { > VideoEncoderWrapper::VideoEncoderWrapper(JNIEnv* jni, > const JavaRef<jobject>& j_encoder) > : encoder_(jni, j_encoder), int_array_class_(GetClass(jni, 
"[I")) { >- implementation_name_ = GetImplementationName(jni); >- > initialized_ = false; > num_resets_ = 0; > } >@@ -82,6 +80,10 @@ int32_t VideoEncoderWrapper::InitEncodeInternal(JNIEnv* jni) { > jni, Java_VideoEncoder_initEncode(jni, encoder_, settings, callback)); > RTC_LOG(LS_INFO) << "initEncode: " << status; > >+ encoder_info_.supports_native_handle = true; >+ encoder_info_.implementation_name = GetImplementationName(jni); >+ encoder_info_.scaling_settings = GetScalingSettingsInternal(jni); >+ > if (status == WEBRTC_VIDEO_CODEC_OK) { > initialized_ = true; > } >@@ -136,14 +138,6 @@ int32_t VideoEncoderWrapper::Encode( > return HandleReturnCode(jni, ret, "encode"); > } > >-int32_t VideoEncoderWrapper::SetChannelParameters(uint32_t packet_loss, >- int64_t rtt) { >- JNIEnv* jni = AttachCurrentThreadIfNeeded(); >- ScopedJavaLocalRef<jobject> ret = Java_VideoEncoder_setChannelParameters( >- jni, encoder_, (jshort)packet_loss, (jlong)rtt); >- return HandleReturnCode(jni, ret, "setChannelParameters"); >-} >- > int32_t VideoEncoderWrapper::SetRateAllocation( > const VideoBitrateAllocation& allocation, > uint32_t framerate) { >@@ -156,9 +150,12 @@ int32_t VideoEncoderWrapper::SetRateAllocation( > return HandleReturnCode(jni, ret, "setRateAllocation"); > } > >-VideoEncoderWrapper::ScalingSettings VideoEncoderWrapper::GetScalingSettings() >- const { >- JNIEnv* jni = AttachCurrentThreadIfNeeded(); >+VideoEncoder::EncoderInfo VideoEncoderWrapper::GetEncoderInfo() const { >+ return encoder_info_; >+} >+ >+VideoEncoderWrapper::ScalingSettings >+VideoEncoderWrapper::GetScalingSettingsInternal(JNIEnv* jni) const { > ScopedJavaLocalRef<jobject> j_scaling_settings = > Java_VideoEncoder_getScalingSettings(jni, encoder_); > bool isOn = >@@ -206,14 +203,6 @@ VideoEncoderWrapper::ScalingSettings VideoEncoderWrapper::GetScalingSettings() > } > } > >-bool VideoEncoderWrapper::SupportsNativeHandle() const { >- return true; >-} >- >-const char* 
VideoEncoderWrapper::ImplementationName() const { >- return implementation_name_.c_str(); >-} >- > void VideoEncoderWrapper::OnEncodedFrame(JNIEnv* jni, > const JavaRef<jobject>& j_caller, > const JavaRef<jobject>& j_buffer, >@@ -385,7 +374,6 @@ CodecSpecificInfo VideoEncoderWrapper::ParseCodecSpecificInfo( > CodecSpecificInfo info; > memset(&info, 0, sizeof(info)); > info.codecType = codec_settings_.codecType; >- info.codec_name = implementation_name_.c_str(); > > switch (codec_settings_.codecType) { > case kVideoCodecVP8: >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/videoencoderwrapper.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/videoencoderwrapper.h >index c30560503cb75ef9aa542d181905539ae64d4dd8..3618f7847375e590345cf77c3089ec0d67cfd12b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/videoencoderwrapper.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/videoencoderwrapper.h >@@ -45,14 +45,10 @@ class VideoEncoderWrapper : public VideoEncoder { > const CodecSpecificInfo* codec_specific_info, > const std::vector<FrameType>* frame_types) override; > >- int32_t SetChannelParameters(uint32_t packet_loss, int64_t rtt) override; >- > int32_t SetRateAllocation(const VideoBitrateAllocation& allocation, > uint32_t framerate) override; > >- ScalingSettings GetScalingSettings() const override; >- >- bool SupportsNativeHandle() const override; >+ EncoderInfo GetEncoderInfo() const override; > > // Should only be called by JNI. > void OnEncodedFrame(JNIEnv* jni, >@@ -66,8 +62,6 @@ class VideoEncoderWrapper : public VideoEncoder { > jboolean complete_frame, > const JavaRef<jobject>& j_qp); > >- const char* ImplementationName() const override; >- > private: > struct FrameExtraInfo { > int64_t capture_time_ns; // Used as an identifier of the frame. 
>@@ -92,11 +86,11 @@ class VideoEncoderWrapper : public VideoEncoder { > const VideoBitrateAllocation& allocation); > std::string GetImplementationName(JNIEnv* jni) const; > >+ ScalingSettings GetScalingSettingsInternal(JNIEnv* jni) const; >+ > const ScopedJavaGlobalRef<jobject> encoder_; > const ScopedJavaGlobalRef<jclass> int_array_class_; > >- std::string implementation_name_; >- > rtc::TaskQueue* encoder_queue_; > std::deque<FrameExtraInfo> frame_extra_infos_; > EncodedImageCallback* callback_; >@@ -104,6 +98,7 @@ class VideoEncoderWrapper : public VideoEncoder { > int num_resets_; > int number_of_cores_; > VideoCodec codec_settings_; >+ EncoderInfo encoder_info_; > H264BitstreamParser h264_bitstream_parser_; > > // VP9 variables to populate codec specific structure. >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/videosource.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/videosource.cc >index 09b538ebc265ba987e0c22c6bcc395131e85507f..37ba0b04b115101547e316457fbca968c5a9533c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/videosource.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/videosource.cc >@@ -33,13 +33,16 @@ static jlong JNI_VideoSource_GetInternalSource(JNIEnv* jni, > static void JNI_VideoSource_AdaptOutputFormat(JNIEnv* jni, > const JavaParamRef<jclass>&, > jlong j_source, >- jint j_width, >- jint j_height, >+ jint j_landscape_width, >+ jint j_landscape_height, >+ jint j_portrait_width, >+ jint j_portrait_height, > jint j_fps) { > RTC_LOG(LS_INFO) << "VideoSource_nativeAdaptOutputFormat"; > AndroidVideoTrackSource* source = > AndroidVideoTrackSourceFromJavaProxy(j_source); >- source->OnOutputFormatRequest(j_width, j_height, j_fps); >+ source->OnOutputFormatRequest(j_landscape_width, j_landscape_height, >+ j_portrait_width, j_portrait_height, j_fps); > } > > } // namespace jni >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/vp8codec.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/vp8codec.cc >index 295a03a610770f06b944511cfe003a2c275c9fc5..2756b102faf19324ec0d6b4e091c3fbd6c417f2f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/vp8codec.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/vp8codec.cc >@@ -11,20 +11,20 @@ > #include <jni.h> > > #include "modules/video_coding/codecs/vp8/include/vp8.h" >-#include "sdk/android/generated_vp8_jni/jni/VP8Decoder_jni.h" >-#include "sdk/android/generated_vp8_jni/jni/VP8Encoder_jni.h" >+#include "sdk/android/generated_libvpx_vp8_jni/jni/LibvpxVp8Decoder_jni.h" >+#include "sdk/android/generated_libvpx_vp8_jni/jni/LibvpxVp8Encoder_jni.h" > #include "sdk/android/src/jni/jni_helpers.h" > > namespace webrtc { > namespace jni { > >-static jlong JNI_VP8Encoder_CreateEncoder(JNIEnv* jni, >- const JavaParamRef<jclass>&) { >+static jlong JNI_LibvpxVp8Encoder_CreateEncoder(JNIEnv* jni, >+ const JavaParamRef<jclass>&) { > return jlongFromPointer(VP8Encoder::Create().release()); > } > >-static jlong JNI_VP8Decoder_CreateDecoder(JNIEnv* jni, >- const JavaParamRef<jclass>&) { >+static jlong JNI_LibvpxVp8Decoder_CreateDecoder(JNIEnv* jni, >+ const JavaParamRef<jclass>&) { > return jlongFromPointer(VP8Decoder::Create().release()); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/vp9codec.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/vp9codec.cc >index ca712e6d8654d1ec5e90238d8f2b7b7b7f5942fa..92f9754d563311a073ec2b8a314c66e1a30fd508 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/vp9codec.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/src/jni/vp9codec.cc >@@ -11,30 +11,30 @@ > #include <jni.h> > > #include "modules/video_coding/codecs/vp9/include/vp9.h" >-#include "sdk/android/generated_vp9_jni/jni/VP9Decoder_jni.h" >-#include 
"sdk/android/generated_vp9_jni/jni/VP9Encoder_jni.h" >+#include "sdk/android/generated_libvpx_vp9_jni/jni/LibvpxVp9Decoder_jni.h" >+#include "sdk/android/generated_libvpx_vp9_jni/jni/LibvpxVp9Encoder_jni.h" > #include "sdk/android/src/jni/jni_helpers.h" > > namespace webrtc { > namespace jni { > >-static jlong JNI_VP9Encoder_CreateEncoder(JNIEnv* jni, >- const JavaParamRef<jclass>& w) { >+static jlong JNI_LibvpxVp9Encoder_CreateEncoder(JNIEnv* jni, >+ const JavaParamRef<jclass>& w) { > return jlongFromPointer(VP9Encoder::Create().release()); > } > >-static jboolean JNI_VP9Encoder_IsSupported(JNIEnv* jni, >- const JavaParamRef<jclass>&) { >+static jboolean JNI_LibvpxVp9Encoder_IsSupported(JNIEnv* jni, >+ const JavaParamRef<jclass>&) { > return !SupportedVP9Codecs().empty(); > } > >-static jlong JNI_VP9Decoder_CreateDecoder(JNIEnv* jni, >- const JavaParamRef<jclass>& w) { >+static jlong JNI_LibvpxVp9Decoder_CreateDecoder(JNIEnv* jni, >+ const JavaParamRef<jclass>& w) { > return jlongFromPointer(VP9Decoder::Create().release()); > } > >-static jboolean JNI_VP9Decoder_IsSupported(JNIEnv* jni, >- const JavaParamRef<jclass>&) { >+static jboolean JNI_LibvpxVp9Decoder_IsSupported(JNIEnv* jni, >+ const JavaParamRef<jclass>&) { > return !SupportedVP9Codecs().empty(); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/tests/src/org/webrtc/CryptoOptionsTest.java b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/tests/src/org/webrtc/CryptoOptionsTest.java >new file mode 100644 >index 0000000000000000000000000000000000000000..f03811e6fcad52c8504d68c67e92579cb22f49d6 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/android/tests/src/org/webrtc/CryptoOptionsTest.java >@@ -0,0 +1,74 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. 
An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+package org.webrtc; >+ >+import static com.google.common.truth.Truth.assertThat; >+ >+import org.chromium.testing.local.LocalRobolectricTestRunner; >+import org.junit.Test; >+import org.junit.runner.RunWith; >+import org.robolectric.annotation.Config; >+import org.webrtc.CryptoOptions; >+ >+@RunWith(LocalRobolectricTestRunner.class) >+@Config(manifest = Config.NONE) >+public class CryptoOptionsTest { >+ // Validates the builder builds by default all false options. >+ @Test >+ public void testBuilderDefaultsAreFalse() { >+ CryptoOptions cryptoOptions = CryptoOptions.builder().createCryptoOptions(); >+ assertThat(cryptoOptions.getSrtp().getEnableGcmCryptoSuites()).isFalse(); >+ assertThat(cryptoOptions.getSrtp().getEnableAes128Sha1_32CryptoCipher()).isFalse(); >+ assertThat(cryptoOptions.getSrtp().getEnableEncryptedRtpHeaderExtensions()).isFalse(); >+ assertThat(cryptoOptions.getSFrame().getRequireFrameEncryption()).isFalse(); >+ } >+ >+ // Validates the builder sets the correct parameters. 
>+ @Test >+ public void testBuilderCorrectlyInitializingGcmCrypto() { >+ CryptoOptions cryptoOptions = >+ CryptoOptions.builder().setEnableGcmCryptoSuites(true).createCryptoOptions(); >+ assertThat(cryptoOptions.getSrtp().getEnableGcmCryptoSuites()).isTrue(); >+ assertThat(cryptoOptions.getSrtp().getEnableAes128Sha1_32CryptoCipher()).isFalse(); >+ assertThat(cryptoOptions.getSrtp().getEnableEncryptedRtpHeaderExtensions()).isFalse(); >+ assertThat(cryptoOptions.getSFrame().getRequireFrameEncryption()).isFalse(); >+ } >+ >+ @Test >+ public void testBuilderCorrectlyInitializingAes128Sha1_32CryptoCipher() { >+ CryptoOptions cryptoOptions = >+ CryptoOptions.builder().setEnableAes128Sha1_32CryptoCipher(true).createCryptoOptions(); >+ assertThat(cryptoOptions.getSrtp().getEnableGcmCryptoSuites()).isFalse(); >+ assertThat(cryptoOptions.getSrtp().getEnableAes128Sha1_32CryptoCipher()).isTrue(); >+ assertThat(cryptoOptions.getSrtp().getEnableEncryptedRtpHeaderExtensions()).isFalse(); >+ assertThat(cryptoOptions.getSFrame().getRequireFrameEncryption()).isFalse(); >+ } >+ >+ @Test >+ public void testBuilderCorrectlyInitializingEncryptedRtpHeaderExtensions() { >+ CryptoOptions cryptoOptions = >+ CryptoOptions.builder().setEnableEncryptedRtpHeaderExtensions(true).createCryptoOptions(); >+ assertThat(cryptoOptions.getSrtp().getEnableGcmCryptoSuites()).isFalse(); >+ assertThat(cryptoOptions.getSrtp().getEnableAes128Sha1_32CryptoCipher()).isFalse(); >+ assertThat(cryptoOptions.getSrtp().getEnableEncryptedRtpHeaderExtensions()).isTrue(); >+ assertThat(cryptoOptions.getSFrame().getRequireFrameEncryption()).isFalse(); >+ } >+ >+ @Test >+ public void testBuilderCorrectlyInitializingRequireFrameEncryption() { >+ CryptoOptions cryptoOptions = >+ CryptoOptions.builder().setRequireFrameEncryption(true).createCryptoOptions(); >+ assertThat(cryptoOptions.getSrtp().getEnableGcmCryptoSuites()).isFalse(); >+ assertThat(cryptoOptions.getSrtp().getEnableAes128Sha1_32CryptoCipher()).isFalse(); >+ 
assertThat(cryptoOptions.getSrtp().getEnableEncryptedRtpHeaderExtensions()).isFalse(); >+ assertThat(cryptoOptions.getSFrame().getRequireFrameEncryption()).isTrue(); >+ } >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/Framework/Classes/PeerConnection/RTCH264ProfileLevelId.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/Framework/Classes/PeerConnection/RTCH264ProfileLevelId.mm >deleted file mode 100644 >index 04a5689417c20737278a8b2b022f121ffe99662a..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/Framework/Classes/PeerConnection/RTCH264ProfileLevelId.mm >+++ /dev/null >@@ -1,58 +0,0 @@ >-/* >- * Copyright 2018 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- * >- */ >- >-#import "WebRTC/RTCVideoCodecH264.h" >- >-#include "media/base/h264_profile_level_id.h" >- >-@interface RTCH264ProfileLevelId () >- >-@property(nonatomic, assign) RTCH264Profile profile; >-@property(nonatomic, assign) RTCH264Level level; >-@property(nonatomic, strong) NSString *hexString; >- >-@end >- >-@implementation RTCH264ProfileLevelId >- >-@synthesize profile = _profile; >-@synthesize level = _level; >-@synthesize hexString = _hexString; >- >-- (instancetype)initWithHexString:(NSString *)hexString { >- if (self = [super init]) { >- self.hexString = hexString; >- >- absl::optional<webrtc::H264::ProfileLevelId> profile_level_id = >- webrtc::H264::ParseProfileLevelId([hexString cStringUsingEncoding:NSUTF8StringEncoding]); >- if (profile_level_id.has_value()) { >- self.profile = static_cast<RTCH264Profile>(profile_level_id->profile); >- self.level = static_cast<RTCH264Level>(profile_level_id->level); >- } >- } >- return self; >-} >- >-- (instancetype)initWithProfile:(RTCH264Profile)profile level:(RTCH264Level)level { >- if (self = [super init]) { >- self.profile = profile; >- self.level = level; >- >- absl::optional<std::string> hex_string = >- webrtc::H264::ProfileLevelIdToString(webrtc::H264::ProfileLevelId( >- static_cast<webrtc::H264::Profile>(profile), static_cast<webrtc::H264::Level>(level))); >- self.hexString = >- [NSString stringWithCString:hex_string.value_or("").c_str() encoding:NSUTF8StringEncoding]; >- } >- return self; >-} >- >-@end >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/logging/RTCCallbackLogger.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/logging/RTCCallbackLogger.h >index 4867b31e365cc1b83b47c0966774b655049eed8c..2bce03fe0f8ac5e0946e1698fd5d27ebee15cf2c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/logging/RTCCallbackLogger.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/logging/RTCCallbackLogger.h >@@ -15,6 +15,10 @@ > > NS_ASSUME_NONNULL_BEGIN > 
>+typedef void (^RTCCallbackLoggerMessageHandler)(NSString *message);
>+typedef void (^RTCCallbackLoggerMessageAndSeverityHandler)(NSString *message,
>+                                                           RTCLoggingSeverity severity);
>+
> // This class intercepts WebRTC logs and forwards them to a registered block.
> // This class is not threadsafe.
> RTC_OBJC_EXPORT
>@@ -23,10 +27,12 @@ RTC_OBJC_EXPORT
> // The severity level to capture. The default is kRTCLoggingSeverityInfo.
> @property(nonatomic, assign) RTCLoggingSeverity severity;
> 
>-// The callback will be called on the same thread that does the logging, so
>-// if the logging callback can be slow it may be a good idea to implement
>-// dispatching to some other queue.
>-- (void)start:(nullable void (^)(NSString*))callback;
>+// The callback handler will be called on the same thread that does the
>+// logging, so if the logging callback can be slow it may be a good idea
>+// to implement dispatching to some other queue.
>+- (void)start:(nullable RTCCallbackLoggerMessageHandler)handler;
>+- (void)startWithMessageAndSeverityHandler:
>+    (nullable RTCCallbackLoggerMessageAndSeverityHandler)handler;
> 
> - (void)stop;
> 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/logging/RTCCallbackLogger.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/logging/RTCCallbackLogger.mm
>index 8440d63791ef52ad22061425785c4e44800c4650..9bbb53043409a27c56a64c5a17cb84a9dc0fb884 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/logging/RTCCallbackLogger.mm
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/logging/RTCCallbackLogger.mm
>@@ -18,11 +18,8 @@
> 
> class CallbackLogSink : public rtc::LogSink {
>  public:
>-  CallbackLogSink(void (^callbackHandler)(NSString *message)) {
>-    callback_handler_ = callbackHandler;
>-  }
>-
>-  ~CallbackLogSink() override { callback_handler_ = nil; }
>+  CallbackLogSink(RTCCallbackLoggerMessageHandler callbackHandler)
>+      : callback_handler_(callbackHandler) {}
> 
>   void OnLogMessage(const std::string &message) override {
>     if (callback_handler_) {
>@@ -31,12 +28,47 @@ class CallbackLogSink : public rtc::LogSink {
>   }
> 
>  private:
>-  void (^callback_handler_)(NSString *message);
>+  RTCCallbackLoggerMessageHandler callback_handler_;
>+};
>+
>+class CallbackWithSeverityLogSink : public rtc::LogSink {
>+ public:
>+  CallbackWithSeverityLogSink(RTCCallbackLoggerMessageAndSeverityHandler callbackHandler)
>+      : callback_handler_(callbackHandler) {}
>+
>+  void OnLogMessage(const std::string& message) override { RTC_NOTREACHED(); }
>+
>+  void OnLogMessage(const std::string& message, rtc::LoggingSeverity severity) override {
>+    if (callback_handler_) {
>+      RTCLoggingSeverity loggingSeverity = NativeSeverityToObjcSeverity(severity);
>+      callback_handler_([NSString stringWithUTF8String:message.c_str()], loggingSeverity);
>+    }
>+  }
>+
>+ private:
>+  static RTCLoggingSeverity NativeSeverityToObjcSeverity(rtc::LoggingSeverity severity) {
>+    switch (severity) {
>+      case rtc::LS_SENSITIVE:
>+        return RTCLoggingSeveritySensitive;
>+      case rtc::LS_VERBOSE:
>+        return RTCLoggingSeverityVerbose;
>+      case rtc::LS_INFO:
>+        return RTCLoggingSeverityInfo;
>+      case rtc::LS_WARNING:
>+        return RTCLoggingSeverityWarning;
>+      case rtc::LS_ERROR:
>+        return RTCLoggingSeverityError;
>+      case rtc::LS_NONE:
>+        return RTCLoggingSeverityNone;
>+    }
>+  }
>+
>+  RTCCallbackLoggerMessageAndSeverityHandler callback_handler_;
> };
> 
> @implementation RTCCallbackLogger {
>   BOOL _hasStarted;
>-  std::unique_ptr<CallbackLogSink> _logSink;
>+  std::unique_ptr<rtc::LogSink> _logSink;
> }
> 
> @synthesize severity = _severity;
>@@ -53,12 +85,24 @@ - (void)dealloc {
>   [self stop];
> }
> 
>-- (void)start:(nullable void (^)(NSString *))callback {
>+- (void)start:(nullable RTCCallbackLoggerMessageHandler)handler {
>+  if (_hasStarted) {
>+    return;
>+  }
>+
>+  _logSink.reset(new CallbackLogSink(handler));
>+
>+  rtc::LogMessage::AddLogToStream(_logSink.get(), [self rtcSeverity]);
>+  _hasStarted = YES;
>+}
>+
>+- (void)startWithMessageAndSeverityHandler:
>+    (nullable RTCCallbackLoggerMessageAndSeverityHandler)handler {
>   if (_hasStarted) {
>     return;
>   }
> 
>-  _logSink.reset(new CallbackLogSink(callback));
>+  _logSink.reset(new CallbackWithSeverityLogSink(handler));
> 
>   rtc::LogMessage::AddLogToStream(_logSink.get(), [self rtcSeverity]);
>   _hasStarted = YES;
>@@ -78,6 +122,8 @@ - (void)stop {
> 
> - (rtc::LoggingSeverity)rtcSeverity {
>   switch (_severity) {
>+    case RTCLoggingSeveritySensitive:
>+      return rtc::LS_SENSITIVE;
>     case RTCLoggingSeverityVerbose:
>       return rtc::LS_VERBOSE;
>     case RTCLoggingSeverityInfo:
>@@ -86,6 +132,8 @@ - (rtc::LoggingSeverity)rtcSeverity {
>       return rtc::LS_WARNING;
>     case RTCLoggingSeverityError:
>       return rtc::LS_ERROR;
>+    case RTCLoggingSeverityNone:
>+      return rtc::LS_NONE;
>   }
> }
> 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCConfiguration.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCConfiguration.h
>index 9fb8b4df3591d7a67a15d74ea9e175a5eac0bf8a..f9e6edfd97415a82190b3793999b317e07d25705 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCConfiguration.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCConfiguration.h
>@@ -11,6 +11,7 @@
> #import <Foundation/Foundation.h>
> 
> #import "RTCCertificate.h"
>+#import "RTCCryptoOptions.h"
> #import "RTCMacros.h"
> 
> @class RTCIceServer;
>@@ -91,6 +92,19 @@ RTC_OBJC_EXPORT
> @property(nonatomic, assign) RTCCandidateNetworkPolicy candidateNetworkPolicy;
> @property(nonatomic, assign) RTCContinualGatheringPolicy continualGatheringPolicy;
> 
>+/** If set to YES, don't gather IPv6 ICE candidates.
>+ * Default is NO.
>+ */
>+@property(nonatomic, assign) BOOL disableIPV6;
>+
>+/** If set to YES, don't gather IPv6 ICE candidates on Wi-Fi.
>+ * Only intended to be used on specific devices. Certain phones disable IPv6
>+ * when the screen is turned off and it would be better to just disable the
>+ * IPv6 ICE candidates on Wi-Fi in those cases.
>+ * Default is NO.
>+ */
>+@property(nonatomic, assign) BOOL disableIPV6OnWiFi;
>+
> /** By default, the PeerConnection will use a limited number of IPv6 network
>  * interfaces, in order to avoid too many ICE candidate pairs being created
>  * and delaying ICE completion.
>@@ -168,6 +182,35 @@ RTC_OBJC_EXPORT
>  */
> @property(nonatomic, assign) BOOL activeResetSrtpParams;
> 
>+/**
>+ * If MediaTransportFactory is provided in PeerConnectionFactory, this flag informs PeerConnection
>+ * that it should use the MediaTransportInterface.
>+ */
>+@property(nonatomic, assign) BOOL useMediaTransport;
>+
>+/**
>+ * If MediaTransportFactory is provided in PeerConnectionFactory, this flag informs PeerConnection
>+ * that it should use the MediaTransportInterface for data channels.
>+ */
>+@property(nonatomic, assign) BOOL useMediaTransportForDataChannels;
>+
>+/**
>+ * Defines advanced optional cryptographic settings related to SRTP and
>+ * frame encryption for native WebRTC. Setting this will overwrite any
>+ * options set through the PeerConnectionFactory (which is deprecated).
>+ */
>+@property(nonatomic, nullable) RTCCryptoOptions *cryptoOptions;
>+
>+/**
>+ * Time interval between audio RTCP reports.
>+ */
>+@property(nonatomic, assign) int rtcpAudioReportIntervalMs;
>+
>+/**
>+ * Time interval between video RTCP reports.
>+ */
>+@property(nonatomic, assign) int rtcpVideoReportIntervalMs;
>+
> - (instancetype)init;
> 
> @end
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCConfiguration.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCConfiguration.mm
>index 73a0fa9aa5fd80db4cd27cd293b650b7b800695f..4041f8399fc8b35567cbd989a25427eeb4ba1ebb 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCConfiguration.mm
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCConfiguration.mm
>@@ -31,6 +31,8 @@ @synthesize rtcpMuxPolicy = _rtcpMuxPolicy;
> @synthesize tcpCandidatePolicy = _tcpCandidatePolicy;
> @synthesize candidateNetworkPolicy = _candidateNetworkPolicy;
> @synthesize continualGatheringPolicy = _continualGatheringPolicy;
>+@synthesize disableIPV6 = _disableIPV6;
>+@synthesize disableIPV6OnWiFi = _disableIPV6OnWiFi;
> @synthesize maxIPv6Networks = _maxIPv6Networks;
> @synthesize disableLinkLocalNetworks = _disableLinkLocalNetworks;
> @synthesize audioJitterBufferMaxPackets = _audioJitterBufferMaxPackets;
>@@ -48,6 +50,11 @@ @synthesize iceRegatherIntervalRange = _iceRegatherIntervalRange;
> @synthesize sdpSemantics = _sdpSemantics;
> @synthesize turnCustomizer = _turnCustomizer;
> @synthesize activeResetSrtpParams = _activeResetSrtpParams;
>+@synthesize useMediaTransport = _useMediaTransport;
>+@synthesize useMediaTransportForDataChannels = _useMediaTransportForDataChannels;
>+@synthesize cryptoOptions = _cryptoOptions;
>+@synthesize rtcpAudioReportIntervalMs = _rtcpAudioReportIntervalMs;
>+@synthesize rtcpVideoReportIntervalMs = _rtcpVideoReportIntervalMs;
> 
> - (instancetype)init {
>   // Copy defaults.
>@@ -86,6 +93,8 @@ - (instancetype)initWithNativeConfiguration:
>         config.continual_gathering_policy;
>     _continualGatheringPolicy =
>         [[self class] continualGatheringPolicyForNativePolicy:nativePolicy];
>+    _disableIPV6 = config.disable_ipv6;
>+    _disableIPV6OnWiFi = config.disable_ipv6_on_wifi;
>     _maxIPv6Networks = config.max_ipv6_networks;
>     _disableLinkLocalNetworks = config.disable_link_local_networks;
>     _audioJitterBufferMaxPackets = config.audio_jitter_buffer_max_packets;
>@@ -93,6 +102,8 @@ - (instancetype)initWithNativeConfiguration:
>     _iceConnectionReceivingTimeout = config.ice_connection_receiving_timeout;
>     _iceBackupCandidatePairPingInterval =
>         config.ice_backup_candidate_pair_ping_interval;
>+    _useMediaTransport = config.use_media_transport;
>+    _useMediaTransportForDataChannels = config.use_media_transport_for_data_channels;
>     _keyType = RTCEncryptionKeyTypeECDSA;
>     _iceCandidatePoolSize = config.ice_candidate_pool_size;
>     _shouldPruneTurnPorts = config.prune_turn_ports;
>@@ -110,14 +121,27 @@ - (instancetype)initWithNativeConfiguration:
>     _sdpSemantics = [[self class] sdpSemanticsForNativeSdpSemantics:config.sdp_semantics];
>     _turnCustomizer = config.turn_customizer;
>     _activeResetSrtpParams = config.active_reset_srtp_params;
>+    if (config.crypto_options) {
>+      _cryptoOptions = [[RTCCryptoOptions alloc]
>+                 initWithSrtpEnableGcmCryptoSuites:config.crypto_options->srtp
>+                                                       .enable_gcm_crypto_suites
>+               srtpEnableAes128Sha1_32CryptoCipher:config.crypto_options->srtp
>+                                                       .enable_aes128_sha1_32_crypto_cipher
>+            srtpEnableEncryptedRtpHeaderExtensions:config.crypto_options->srtp
>+                                                       .enable_encrypted_rtp_header_extensions
>+                      sframeRequireFrameEncryption:config.crypto_options->sframe
>+                                                       .require_frame_encryption];
>+    }
>+    _rtcpAudioReportIntervalMs = config.audio_rtcp_report_interval_ms();
>+    _rtcpVideoReportIntervalMs = config.video_rtcp_report_interval_ms();
>   }
>   return self;
> }
> 
> - (NSString *)description {
>-  static NSString *formatString =
>-      @"RTCConfiguration: "
>-      @"{\n%@\n%@\n%@\n%@\n%@\n%@\n%@\n%@\n%d\n%d\n%d\n%d\n%d\n%d\n%d\n%@\n%@\n%d\n%d\n%d\n}\n";
>+  static NSString *formatString = @"RTCConfiguration: "
>+                                  @"{\n%@\n%@\n%@\n%@\n%@\n%@\n%@\n%@\n%d\n%d\n%d\n%d\n%d\n%d\n"
>+                                  @"%d\n%@\n%@\n%d\n%d\n%d\n%d\n%d\n%@\n}\n";
> 
>   return [NSString
>       stringWithFormat:formatString,
>@@ -139,8 +163,11 @@ - (NSString *)description {
>                        _iceCheckMinInterval,
>                        _iceRegatherIntervalRange,
>                        _disableLinkLocalNetworks,
>+                       _disableIPV6,
>+                       _disableIPV6OnWiFi,
>                        _maxIPv6Networks,
>-                       _activeResetSrtpParams];
>+                       _activeResetSrtpParams,
>+                       _useMediaTransport];
> }
> 
> #pragma mark - Private
>@@ -166,6 +193,8 @@ - (webrtc::PeerConnectionInterface::RTCConfiguration *)
>       nativeCandidateNetworkPolicyForPolicy:_candidateNetworkPolicy];
>   nativeConfig->continual_gathering_policy = [[self class]
>       nativeContinualGatheringPolicyForPolicy:_continualGatheringPolicy];
>+  nativeConfig->disable_ipv6 = _disableIPV6;
>+  nativeConfig->disable_ipv6_on_wifi = _disableIPV6OnWiFi;
>   nativeConfig->max_ipv6_networks = _maxIPv6Networks;
>   nativeConfig->disable_link_local_networks = _disableLinkLocalNetworks;
>   nativeConfig->audio_jitter_buffer_max_packets = _audioJitterBufferMaxPackets;
>@@ -175,6 +204,8 @@ - (webrtc::PeerConnectionInterface::RTCConfiguration *)
>       _iceConnectionReceivingTimeout;
>   nativeConfig->ice_backup_candidate_pair_ping_interval =
>       _iceBackupCandidatePairPingInterval;
>+  nativeConfig->use_media_transport = _useMediaTransport;
>+  nativeConfig->use_media_transport_for_data_channels = _useMediaTransportForDataChannels;
>   rtc::KeyType keyType =
>       [[self class] nativeEncryptionKeyTypeForKeyType:_keyType];
>   if (_certificate != nullptr) {
>@@ -222,6 +253,20 @@ - (webrtc::PeerConnectionInterface::RTCConfiguration *)
>     nativeConfig->turn_customizer = _turnCustomizer;
>   }
>   nativeConfig->active_reset_srtp_params = _activeResetSrtpParams ? true : false;
>+  if (_cryptoOptions) {
>+    webrtc::CryptoOptions nativeCryptoOptions;
>+    nativeCryptoOptions.srtp.enable_gcm_crypto_suites =
>+        _cryptoOptions.srtpEnableGcmCryptoSuites ? true : false;
>+    nativeCryptoOptions.srtp.enable_aes128_sha1_32_crypto_cipher =
>+        _cryptoOptions.srtpEnableAes128Sha1_32CryptoCipher ? true : false;
>+    nativeCryptoOptions.srtp.enable_encrypted_rtp_header_extensions =
>+        _cryptoOptions.srtpEnableEncryptedRtpHeaderExtensions ? true : false;
>+    nativeCryptoOptions.sframe.require_frame_encryption =
>+        _cryptoOptions.sframeRequireFrameEncryption ? true : false;
>+    nativeConfig->crypto_options = absl::optional<webrtc::CryptoOptions>(nativeCryptoOptions);
>+  }
>+  nativeConfig->set_audio_rtcp_report_interval_ms(_rtcpAudioReportIntervalMs);
>+  nativeConfig->set_video_rtcp_report_interval_ms(_rtcpVideoReportIntervalMs);
>   return nativeConfig.release();
> }
> 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCCryptoOptions.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCCryptoOptions.h
>new file mode 100644
>index 0000000000000000000000000000000000000000..b465bb5a73bdfb8e37f882514a9cb1696c03cfcb
>--- /dev/null
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCCryptoOptions.h
>@@ -0,0 +1,63 @@
>+/*
>+ * Copyright 2018 The WebRTC project authors. All Rights Reserved.
>+ *
>+ * Use of this source code is governed by a BSD-style license
>+ * that can be found in the LICENSE file in the root of the source
>+ * tree. An additional intellectual property rights grant can be found
>+ * in the file PATENTS. All contributing project authors may
>+ * be found in the AUTHORS file in the root of the source tree.
>+ */
>+
>+#import <Foundation/Foundation.h>
>+
>+#import "RTCMacros.h"
>+
>+NS_ASSUME_NONNULL_BEGIN
>+
>+/**
>+ * Objective-C bindings for webrtc::CryptoOptions. This API had to be flattened
>+ * as Objective-C doesn't support nested structures.
>+ */
>+RTC_OBJC_EXPORT
>+@interface RTCCryptoOptions : NSObject
>+
>+/**
>+ * Enable GCM crypto suites from RFC 7714 for SRTP. GCM will only be used
>+ * if both sides enable it
>+ */
>+@property(nonatomic, assign) BOOL srtpEnableGcmCryptoSuites;
>+/**
>+ * If set to true, the (potentially insecure) crypto cipher
>+ * SRTP_AES128_CM_SHA1_32 will be included in the list of supported ciphers
>+ * during negotiation. It will only be used if both peers support it and no
>+ * other ciphers get preferred.
>+ */
>+@property(nonatomic, assign) BOOL srtpEnableAes128Sha1_32CryptoCipher;
>+/**
>+ * If set to true, encrypted RTP header extensions as defined in RFC 6904
>+ * will be negotiated. They will only be used if both peers support them.
>+ */
>+@property(nonatomic, assign) BOOL srtpEnableEncryptedRtpHeaderExtensions;
>+
>+/**
>+ * If set all RtpSenders must have an FrameEncryptor attached to them before
>+ * they are allowed to send packets. All RtpReceivers must have a
>+ * FrameDecryptor attached to them before they are able to receive packets.
>+ */
>+@property(nonatomic, assign) BOOL sframeRequireFrameEncryption;
>+
>+/**
>+ * Initializes CryptoOptions with all possible options set explicitly. This
>+ * is done when converting from a native RTCConfiguration.crypto_options.
>+ */
>+- (instancetype)initWithSrtpEnableGcmCryptoSuites:(BOOL)srtpEnableGcmCryptoSuites
>+              srtpEnableAes128Sha1_32CryptoCipher:(BOOL)srtpEnableAes128Sha1_32CryptoCipher
>+           srtpEnableEncryptedRtpHeaderExtensions:(BOOL)srtpEnableEncryptedRtpHeaderExtensions
>+                     sframeRequireFrameEncryption:(BOOL)sframeRequireFrameEncryption
>+    NS_DESIGNATED_INITIALIZER;
>+
>+- (instancetype)init NS_UNAVAILABLE;
>+
>+@end
>+
>+NS_ASSUME_NONNULL_END
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCCryptoOptions.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCCryptoOptions.mm
>new file mode 100644
>index 0000000000000000000000000000000000000000..a059f755994e743030321da5f2e0432f0b211ff9
>--- /dev/null
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCCryptoOptions.mm
>@@ -0,0 +1,33 @@
>+/*
>+ * Copyright 2018 The WebRTC project authors. All Rights Reserved.
>+ *
>+ * Use of this source code is governed by a BSD-style license
>+ * that can be found in the LICENSE file in the root of the source
>+ * tree. An additional intellectual property rights grant can be found
>+ * in the file PATENTS. All contributing project authors may
>+ * be found in the AUTHORS file in the root of the source tree.
>+ */
>+
>+#import "RTCCryptoOptions.h"
>+
>+@implementation RTCCryptoOptions
>+
>+@synthesize srtpEnableGcmCryptoSuites = _srtpEnableGcmCryptoSuites;
>+@synthesize srtpEnableAes128Sha1_32CryptoCipher = _srtpEnableAes128Sha1_32CryptoCipher;
>+@synthesize srtpEnableEncryptedRtpHeaderExtensions = _srtpEnableEncryptedRtpHeaderExtensions;
>+@synthesize sframeRequireFrameEncryption = _sframeRequireFrameEncryption;
>+
>+- (instancetype)initWithSrtpEnableGcmCryptoSuites:(BOOL)srtpEnableGcmCryptoSuites
>+              srtpEnableAes128Sha1_32CryptoCipher:(BOOL)srtpEnableAes128Sha1_32CryptoCipher
>+           srtpEnableEncryptedRtpHeaderExtensions:(BOOL)srtpEnableEncryptedRtpHeaderExtensions
>+                     sframeRequireFrameEncryption:(BOOL)sframeRequireFrameEncryption {
>+  if (self = [super init]) {
>+    _srtpEnableGcmCryptoSuites = srtpEnableGcmCryptoSuites;
>+    _srtpEnableAes128Sha1_32CryptoCipher = srtpEnableAes128Sha1_32CryptoCipher;
>+    _srtpEnableEncryptedRtpHeaderExtensions = srtpEnableEncryptedRtpHeaderExtensions;
>+    _sframeRequireFrameEncryption = sframeRequireFrameEncryption;
>+  }
>+  return self;
>+}
>+
>+@end
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCEncodedImage+Private.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCEncodedImage+Private.h
>index 1b125f1039a22f229f092b9a76cf342b08c6bc41..c77a7ad85d7bb7a8dbb5f0a8cd7109577d582ef4 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCEncodedImage+Private.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCEncodedImage+Private.h
>@@ -10,7 +10,7 @@
> 
> #import "base/RTCEncodedImage.h"
> 
>-#include "common_video/include/video_frame.h"
>+#include "api/video/encoded_image.h"
> 
> NS_ASSUME_NONNULL_BEGIN
> 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCEncodedImage+Private.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCEncodedImage+Private.mm
>index 94cb1f110f96cf03cd51ad7bed450e15b8218947..f9901bd327e8f24bcf0bf176b004ff9e1af5a748 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCEncodedImage+Private.mm
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCEncodedImage+Private.mm
>@@ -35,7 +35,6 @@ - (instancetype)initWithNativeEncodedImage:(webrtc::EncodedImage)encodedImage {
>     self.contentType = (encodedImage.content_type_ == webrtc::VideoContentType::SCREENSHARE) ?
>         RTCVideoContentTypeScreenshare :
>         RTCVideoContentTypeUnspecified;
>-    self.spatialIndex = encodedImage.SpatialIndex() ? *encodedImage.SpatialIndex() : 0;
>   }
> 
>   return self;
>@@ -57,11 +56,11 @@ - (webrtc::EncodedImage)nativeEncodedImage {
>   encodedImage.rotation_ = webrtc::VideoRotation(self.rotation);
>   encodedImage._completeFrame = self.completeFrame;
>   encodedImage.qp_ = self.qp ? self.qp.intValue : -1;
>+  encodedImage.SetSpatialIndex(self.spatialIndex);
>   encodedImage.content_type_ = (self.contentType == RTCVideoContentTypeScreenshare) ?
>       webrtc::VideoContentType::SCREENSHARE :
>       webrtc::VideoContentType::UNSPECIFIED;
>-  if (self.spatialIndex)
>-    encodedImage.SetSpatialIndex(self.spatialIndex);
>+
>   return encodedImage;
> }
> 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCFieldTrials.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCFieldTrials.h
>index effc470686f810b4015f23b51c4a4c1f391740eb..cf648d32c3a91081beb1d271f55c9d2ba221720b 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCFieldTrials.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCFieldTrials.h
>@@ -20,24 +20,12 @@ RTC_EXTERN NSString * const kRTCFieldTrialAudioForceABWENoTWCCKey;
> RTC_EXTERN NSString * const kRTCFieldTrialSendSideBweWithOverheadKey;
> RTC_EXTERN NSString * const kRTCFieldTrialFlexFec03AdvertisedKey;
> RTC_EXTERN NSString * const kRTCFieldTrialFlexFec03Key;
>-RTC_EXTERN NSString * const kRTCFieldTrialImprovedBitrateEstimateKey;
> RTC_EXTERN NSString * const kRTCFieldTrialH264HighProfileKey;
> RTC_EXTERN NSString * const kRTCFieldTrialMinimizeResamplingOnMobileKey;
> 
> /** The valid value for field trials above. */
> RTC_EXTERN NSString * const kRTCFieldTrialEnabledValue;
> 
>-/** Use a string returned by RTCFieldTrialMedianSlopeFilterValue as the value. */
>-RTC_EXTERN NSString * const kRTCFieldTrialMedianSlopeFilterKey;
>-RTC_EXTERN NSString *RTCFieldTrialMedianSlopeFilterValue(
>-    size_t windowSize, double thresholdGain);
>-
>-/** Use a string returned by RTCFieldTrialTrendlineFilterValue as the value. */
>-RTC_EXTERN NSString * const kRTCFieldTrialTrendlineFilterKey;
>-/** Returns a valid value for kRTCFieldTrialTrendlineFilterKey. */
>-RTC_EXTERN NSString *RTCFieldTrialTrendlineFilterValue(
>-    size_t windowSize, double smoothingCoeff, double thresholdGain);
>-
> /** Initialize field trials using a dictionary mapping field trial keys to their
>  * values. See above for valid keys and values. Must be called before any other
>  * call into WebRTC. See: webrtc/system_wrappers/include/field_trial.h
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCFieldTrials.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCFieldTrials.mm
>index 2212c7dec9714b799bd291f57ff136fd854c6206..127ce6feb89f37d3a414b55c907318a4cd5430aa 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCFieldTrials.mm
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCFieldTrials.mm
>@@ -23,9 +23,6 @@ NSString * const kRTCFieldTrialAudioForceABWENoTWCCKey = @"WebRTC-Audio-ABWENoTW
> NSString * const kRTCFieldTrialSendSideBweWithOverheadKey = @"WebRTC-SendSideBwe-WithOverhead";
> NSString * const kRTCFieldTrialFlexFec03AdvertisedKey = @"WebRTC-FlexFEC-03-Advertised";
> NSString * const kRTCFieldTrialFlexFec03Key = @"WebRTC-FlexFEC-03";
>-NSString * const kRTCFieldTrialImprovedBitrateEstimateKey = @"WebRTC-ImprovedBitrateEstimate";
>-NSString * const kRTCFieldTrialMedianSlopeFilterKey = @"WebRTC-BweMedianSlopeFilter";
>-NSString * const kRTCFieldTrialTrendlineFilterKey = @"WebRTC-BweTrendlineFilter";
> NSString * const kRTCFieldTrialH264HighProfileKey = @"WebRTC-H264HighProfile";
> NSString * const kRTCFieldTrialMinimizeResamplingOnMobileKey =
>     @"WebRTC-Audio-MinimizeResamplingOnMobile";
>@@ -33,18 +30,6 @@ NSString * const kRTCFieldTrialEnabledValue = @"Enabled";
> 
> static std::unique_ptr<char[]> gFieldTrialInitString;
> 
>-NSString *RTCFieldTrialMedianSlopeFilterValue(
>-    size_t windowSize, double thresholdGain) {
>-  NSString *format = @"Enabled-%zu,%lf";
>-  return [NSString stringWithFormat:format, windowSize, thresholdGain];
>-}
>-
>-NSString *RTCFieldTrialTrendlineFilterValue(
>-    size_t windowSize, double smoothingCoeff, double thresholdGain) {
>-  NSString *format = @"Enabled-%zu,%lf,%lf";
>-  return [NSString stringWithFormat:format, windowSize, smoothingCoeff, thresholdGain];
>-}
>-
> void RTCInitFieldTrialDictionary(NSDictionary<NSString *, NSString *> *fieldTrials) {
>   if (!fieldTrials) {
>     RTCLogWarning(@"No fieldTrials provided.");
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnection+Private.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnection+Private.h
>index b6440cd434a857ebd7a3216f82fb333b977b24c1..d26eb2e592554bd23cfb4db8a18713dcc7321111 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnection+Private.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnection+Private.h
>@@ -39,6 +39,8 @@ class PeerConnectionDelegateAdapter : public PeerConnectionObserver {
> 
>   void OnIceConnectionChange(PeerConnectionInterface::IceConnectionState new_state) override;
> 
>+  void OnConnectionChange(PeerConnectionInterface::PeerConnectionState new_state) override;
>+
>   void OnIceGatheringChange(PeerConnectionInterface::IceGatheringState new_state) override;
> 
>   void OnIceCandidate(const IceCandidateInterface *candidate) override;
>@@ -85,11 +87,19 @@ class PeerConnectionDelegateAdapter : public PeerConnectionObserver {
> + (webrtc::PeerConnectionInterface::IceConnectionState)nativeIceConnectionStateForState:
>         (RTCIceConnectionState)state;
> 
>++ (webrtc::PeerConnectionInterface::PeerConnectionState)nativeConnectionStateForState:
>+        (RTCPeerConnectionState)state;
>+
> + (RTCIceConnectionState)iceConnectionStateForNativeState:
>         (webrtc::PeerConnectionInterface::IceConnectionState)nativeState;
> 
>++ (RTCPeerConnectionState)connectionStateForNativeState:
>+        (webrtc::PeerConnectionInterface::PeerConnectionState)nativeState;
>+
> + (NSString *)stringForIceConnectionState:(RTCIceConnectionState)state;
> 
>++ (NSString *)stringForConnectionState:(RTCPeerConnectionState)state;
>+
> + (webrtc::PeerConnectionInterface::IceGatheringState)nativeIceGatheringStateForState:
>         (RTCIceGatheringState)state;
> 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnection.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnection.h
>index 0179ec0e26df93ab51ee2b7df5722c1f2ff4cba1..393a50bc8984ec99c014b74e6341e5e7af26c532 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnection.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnection.h
>@@ -57,6 +57,16 @@ typedef NS_ENUM(NSInteger, RTCIceConnectionState) {
>   RTCIceConnectionStateCount,
> };
> 
>+/** Represents the combined ice+dtls connection state of the peer connection. */
>+typedef NS_ENUM(NSInteger, RTCPeerConnectionState) {
>+  RTCPeerConnectionStateNew,
>+  RTCPeerConnectionStateConnecting,
>+  RTCPeerConnectionStateConnected,
>+  RTCPeerConnectionStateDisconnected,
>+  RTCPeerConnectionStateFailed,
>+  RTCPeerConnectionStateClosed,
>+};
>+
> /** Represents the ice gathering state of the peer connection. */
> typedef NS_ENUM(NSInteger, RTCIceGatheringState) {
>   RTCIceGatheringStateNew,
>@@ -115,11 +125,14 @@ RTC_OBJC_EXPORT
>  * This is only called with RTCSdpSemanticsUnifiedPlan specified.
>  */
> @optional
>+/** Called any time the PeerConnectionState changes. */
>+- (void)peerConnection:(RTCPeerConnection *)peerConnection
>+    didChangeConnectionState:(RTCPeerConnectionState)newState;
>+
> - (void)peerConnection:(RTCPeerConnection *)peerConnection
>     didStartReceivingOnTransceiver:(RTCRtpTransceiver *)transceiver;
> 
> /** Called when a receiver and its track are created. */
>-@optional
> - (void)peerConnection:(RTCPeerConnection *)peerConnection
>         didAddReceiver:(RTCRtpReceiver *)rtpReceiver
>                streams:(NSArray<RTCMediaStream *> *)mediaStreams;
>@@ -145,6 +158,7 @@ RTC_OBJC_EXPORT
> @property(nonatomic, readonly, nullable) RTCSessionDescription *remoteDescription;
> @property(nonatomic, readonly) RTCSignalingState signalingState;
> @property(nonatomic, readonly) RTCIceConnectionState iceConnectionState;
>+@property(nonatomic, readonly) RTCPeerConnectionState connectionState;
> @property(nonatomic, readonly) RTCIceGatheringState iceGatheringState;
> @property(nonatomic, readonly, copy) RTCConfiguration *configuration;
> 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnection.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnection.mm
>index 5277489747b0b9d388a039c0f3c8070252772e2c..b59dafec35ff92254933a74e9ba150d77554d3b1 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnection.mm
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnection.mm
>@@ -29,6 +29,7 @@
> #include <memory>
> 
> #include "api/jsepicecandidate.h"
>+#include "api/media_transport_interface.h"
> #include "rtc_base/checks.h"
> 
> NSString * const kRTCPeerConnectionErrorDomain =
>@@ -174,11 +175,17 @@ void PeerConnectionDelegateAdapter::OnRenegotiationNeeded() {
> 
> void PeerConnectionDelegateAdapter::OnIceConnectionChange(
>     PeerConnectionInterface::IceConnectionState new_state) {
>-  RTCIceConnectionState state =
>-      [[RTCPeerConnection class] iceConnectionStateForNativeState:new_state];
>-  RTCPeerConnection *peer_connection = peer_connection_;
>-  [peer_connection.delegate peerConnection:peer_connection
>-               didChangeIceConnectionState:state];
>+  RTCIceConnectionState state = [RTCPeerConnection iceConnectionStateForNativeState:new_state];
>+  [peer_connection_.delegate peerConnection:peer_connection_ didChangeIceConnectionState:state];
>+}
>+
>+void PeerConnectionDelegateAdapter::OnConnectionChange(
>+    PeerConnectionInterface::PeerConnectionState new_state) {
>+  if ([peer_connection_.delegate
>+          respondsToSelector:@selector(peerConnection:didChangeConnectionState:)]) {
>+    RTCPeerConnectionState state = [RTCPeerConnection connectionStateForNativeState:new_state];
>+    [peer_connection_.delegate peerConnection:peer_connection_ didChangeConnectionState:state];
>+  }
> }
> 
> void PeerConnectionDelegateAdapter::OnIceGatheringChange(
>@@ -321,6 +328,10 @@ - (RTCIceConnectionState)iceConnectionState {
>                                      _peerConnection->ice_connection_state()];
> }
> 
>+- (RTCPeerConnectionState)connectionState {
>+  return [[self class] connectionStateForNativeState:_peerConnection->peer_connection_state()];
>+}
>+
> - (RTCIceGatheringState)iceGatheringState {
>   return [[self class] iceGatheringStateForNativeState:
>                            _peerConnection->ice_gathering_state()];
>@@ -631,6 +642,59 @@ + (NSString *)stringForSignalingState:(RTCSignalingState)state {
>   }
> }
> 
>++ (webrtc::PeerConnectionInterface::PeerConnectionState)nativeConnectionStateForState:
>+        (RTCPeerConnectionState)state {
>+  switch (state) {
>+    case RTCPeerConnectionStateNew:
>+      return webrtc::PeerConnectionInterface::PeerConnectionState::kNew;
>+    case RTCPeerConnectionStateConnecting:
>+      return webrtc::PeerConnectionInterface::PeerConnectionState::kConnecting;
>+    case RTCPeerConnectionStateConnected:
>+      return webrtc::PeerConnectionInterface::PeerConnectionState::kConnected;
>+    case RTCPeerConnectionStateFailed:
>+      return webrtc::PeerConnectionInterface::PeerConnectionState::kFailed;
>+    case RTCPeerConnectionStateDisconnected:
>+      return webrtc::PeerConnectionInterface::PeerConnectionState::kDisconnected;
>+    case RTCPeerConnectionStateClosed:
>+      return webrtc::PeerConnectionInterface::PeerConnectionState::kClosed;
>+  }
>+}
>+
>++ (RTCPeerConnectionState)connectionStateForNativeState:
>+        (webrtc::PeerConnectionInterface::PeerConnectionState)nativeState {
>+  switch (nativeState) {
>+    case webrtc::PeerConnectionInterface::PeerConnectionState::kNew:
>+      return RTCPeerConnectionStateNew;
>+    case webrtc::PeerConnectionInterface::PeerConnectionState::kConnecting:
>+      return RTCPeerConnectionStateConnecting;
>+    case webrtc::PeerConnectionInterface::PeerConnectionState::kConnected:
>+      return RTCPeerConnectionStateConnected;
>+    case webrtc::PeerConnectionInterface::PeerConnectionState::kFailed:
>+      return RTCPeerConnectionStateFailed;
>+    case webrtc::PeerConnectionInterface::PeerConnectionState::kDisconnected:
>+      return RTCPeerConnectionStateDisconnected;
>+    case webrtc::PeerConnectionInterface::PeerConnectionState::kClosed:
>+      return RTCPeerConnectionStateClosed;
>+  }
>+}
>+
>++ (NSString *)stringForConnectionState:(RTCPeerConnectionState)state {
>+  switch (state) {
>+    case RTCPeerConnectionStateNew:
>+      return @"NEW";
>+    case RTCPeerConnectionStateConnecting:
>+      return @"CONNECTING";
>+    case RTCPeerConnectionStateConnected:
>+      return @"CONNECTED";
>+    case RTCPeerConnectionStateFailed:
>+      return @"FAILED";
>+    case RTCPeerConnectionStateDisconnected:
>+      return @"DISCONNECTED";
>+    case RTCPeerConnectionStateClosed:
>+      return @"CLOSED";
>+  }
>+}
>+
> + (webrtc::PeerConnectionInterface::IceConnectionState)
>     nativeIceConnectionStateForState:(RTCIceConnectionState)state {
>   switch (state) {
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnectionFactory+Native.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnectionFactory+Native.h
>index 60aa08f7ca3a023faa03c3329fbceac1e71a4e2a..2752cf411766b9046f4123232f34f4a45a701a96 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnectionFactory+Native.h
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnectionFactory+Native.h
>@@ -17,6 +17,7 @@ namespace webrtc {
> class AudioDeviceModule;
> class AudioEncoderFactory;
> class AudioDecoderFactory;
>+class MediaTransportFactory;
> class VideoEncoderFactory;
> class VideoDecoderFactory;
> class AudioProcessing;
>@@ -49,6 +50,25 @@ NS_ASSUME_NONNULL_BEGIN
>                   audioProcessingModule:
>                       (rtc::scoped_refptr<webrtc::AudioProcessing>)audioProcessingModule;
> 
>+- (instancetype)
>+    initWithNativeAudioEncoderFactory:
>+        (rtc::scoped_refptr<webrtc::AudioEncoderFactory>)audioEncoderFactory
>+            nativeAudioDecoderFactory:
>+                (rtc::scoped_refptr<webrtc::AudioDecoderFactory>)audioDecoderFactory
>+            nativeVideoEncoderFactory:
>+                (std::unique_ptr<webrtc::VideoEncoderFactory>)videoEncoderFactory
>+            nativeVideoDecoderFactory:
>+                (std::unique_ptr<webrtc::VideoDecoderFactory>)videoDecoderFactory
>+                    audioDeviceModule:(nullable webrtc::AudioDeviceModule *)audioDeviceModule
>+                audioProcessingModule:
>+                    (rtc::scoped_refptr<webrtc::AudioProcessing>)audioProcessingModule
>+                mediaTransportFactory:
>+                    (std::unique_ptr<webrtc::MediaTransportFactory>)mediaTransportFactory;
>+
>+- (instancetype)initWithEncoderFactory:(nullable id<RTCVideoEncoderFactory>)encoderFactory
>+                        decoderFactory:(nullable id<RTCVideoDecoderFactory>)decoderFactory
>+                 mediaTransportFactory:
>+                     (std::unique_ptr<webrtc::MediaTransportFactory>)mediaTransportFactory;
> @end
> 
> NS_ASSUME_NONNULL_END
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnectionFactory.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnectionFactory.mm
>index 403db040fb8622648d4832a0ec2086097ab37d78..48476f2368f0145240deb50e454ce6938979cfe4 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnectionFactory.mm
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnectionFactory.mm
>@@ -50,6 +50,7 @@
> // TODO(zhihuang): Remove nogncheck once MediaEngineInterface is moved to C++
> // API layer.
> #include "absl/memory/memory.h" >+#include "api/media_transport_interface.h" > #include "media/engine/webrtcmediaengine.h" // nogncheck > > @implementation RTCPeerConnectionFactory { >@@ -80,12 +81,15 @@ - (instancetype)init { > nativeVideoDecoderFactory:webrtc::ObjCToNativeVideoDecoderFactory( > [[RTCVideoDecoderFactoryH264 alloc] init]) > audioDeviceModule:[self audioDeviceModule] >- audioProcessingModule:nullptr]; >+ audioProcessingModule:nullptr >+ mediaTransportFactory:nullptr]; > #endif > } > > - (instancetype)initWithEncoderFactory:(nullable id<RTCVideoEncoderFactory>)encoderFactory >- decoderFactory:(nullable id<RTCVideoDecoderFactory>)decoderFactory { >+ decoderFactory:(nullable id<RTCVideoDecoderFactory>)decoderFactory >+ mediaTransportFactory: >+ (std::unique_ptr<webrtc::MediaTransportFactory>)mediaTransportFactory { > #ifdef HAVE_NO_MEDIA > return [self initWithNoMedia]; > #else >@@ -102,9 +106,16 @@ - (instancetype)initWithEncoderFactory:(nullable id<RTCVideoEncoderFactory>)enco > nativeVideoEncoderFactory:std::move(native_encoder_factory) > nativeVideoDecoderFactory:std::move(native_decoder_factory) > audioDeviceModule:[self audioDeviceModule] >- audioProcessingModule:nullptr]; >+ audioProcessingModule:nullptr >+ mediaTransportFactory:std::move(mediaTransportFactory)]; > #endif > } >+- (instancetype)initWithEncoderFactory:(nullable id<RTCVideoEncoderFactory>)encoderFactory >+ decoderFactory:(nullable id<RTCVideoDecoderFactory>)decoderFactory { >+ return [self initWithEncoderFactory:encoderFactory >+ decoderFactory:decoderFactory >+ mediaTransportFactory:nullptr]; >+} > > - (instancetype)initNative { > if (self = [super init]) { >@@ -152,20 +163,57 @@ - (instancetype)initWithNativeAudioEncoderFactory: > (nullable webrtc::AudioDeviceModule *)audioDeviceModule > audioProcessingModule: > (rtc::scoped_refptr<webrtc::AudioProcessing>)audioProcessingModule { >+ return [self initWithNativeAudioEncoderFactory:audioEncoderFactory >+ 
nativeAudioDecoderFactory:audioDecoderFactory >+ nativeVideoEncoderFactory:std::move(videoEncoderFactory) >+ nativeVideoDecoderFactory:std::move(videoDecoderFactory) >+ audioDeviceModule:audioDeviceModule >+ audioProcessingModule:audioProcessingModule >+ mediaTransportFactory:nullptr]; >+} >+ >+- (instancetype) >+ initWithNativeAudioEncoderFactory: >+ (rtc::scoped_refptr<webrtc::AudioEncoderFactory>)audioEncoderFactory >+ nativeAudioDecoderFactory: >+ (rtc::scoped_refptr<webrtc::AudioDecoderFactory>)audioDecoderFactory >+ nativeVideoEncoderFactory: >+ (std::unique_ptr<webrtc::VideoEncoderFactory>)videoEncoderFactory >+ nativeVideoDecoderFactory: >+ (std::unique_ptr<webrtc::VideoDecoderFactory>)videoDecoderFactory >+ audioDeviceModule:(nullable webrtc::AudioDeviceModule *)audioDeviceModule >+ audioProcessingModule: >+ (rtc::scoped_refptr<webrtc::AudioProcessing>)audioProcessingModule >+ mediaTransportFactory: >+ (std::unique_ptr<webrtc::MediaTransportFactory>)mediaTransportFactory { > #ifdef HAVE_NO_MEDIA > return [self initWithNoMedia]; > #else > if (self = [self initNative]) { >- _nativeFactory = webrtc::CreatePeerConnectionFactory(_networkThread.get(), >- _workerThread.get(), >- _signalingThread.get(), >- audioDeviceModule, >- audioEncoderFactory, >- audioDecoderFactory, >- std::move(videoEncoderFactory), >- std::move(videoDecoderFactory), >- nullptr, // audio mixer >- audioProcessingModule); >+ if (!audioProcessingModule) audioProcessingModule = webrtc::AudioProcessingBuilder().Create(); >+ >+ std::unique_ptr<cricket::MediaEngineInterface> media_engine = >+ cricket::WebRtcMediaEngineFactory::Create(audioDeviceModule, >+ audioEncoderFactory, >+ audioDecoderFactory, >+ std::move(videoEncoderFactory), >+ std::move(videoDecoderFactory), >+ nullptr, // audio mixer >+ audioProcessingModule); >+ >+ std::unique_ptr<webrtc::CallFactoryInterface> call_factory = webrtc::CreateCallFactory(); >+ >+ std::unique_ptr<webrtc::RtcEventLogFactoryInterface> event_log_factory = >+ 
webrtc::CreateRtcEventLogFactory(); >+ webrtc::PeerConnectionFactoryDependencies dependencies; >+ dependencies.network_thread = _networkThread.get(); >+ dependencies.worker_thread = _workerThread.get(); >+ dependencies.signaling_thread = _signalingThread.get(); >+ dependencies.media_engine = std::move(media_engine); >+ dependencies.call_factory = std::move(call_factory); >+ dependencies.event_log_factory = std::move(event_log_factory); >+ dependencies.media_transport_factory = std::move(mediaTransportFactory); >+ _nativeFactory = webrtc::CreateModularPeerConnectionFactory(std::move(dependencies)); > NSAssert(_nativeFactory, @"Failed to initialize PeerConnectionFactory!"); > } > return self; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnectionFactoryBuilder.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnectionFactoryBuilder.mm >index a26a639e28e44f21b8c2756d4fd0c2137d27115c..0adaa30e4f088946b6df47504fc5b91189fbeafa 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnectionFactoryBuilder.mm >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnectionFactoryBuilder.mm >@@ -13,6 +13,7 @@ > > #include "api/audio_codecs/audio_decoder_factory.h" > #include "api/audio_codecs/audio_encoder_factory.h" >+#include "api/media_transport_interface.h" > #include "api/video_codecs/video_decoder_factory.h" > #include "api/video_codecs/video_encoder_factory.h" > #include "modules/audio_device/include/audio_device.h" >@@ -25,6 +26,7 @@ @implementation RTCPeerConnectionFactoryBuilder { > rtc::scoped_refptr<webrtc::AudioDecoderFactory> _audioDecoderFactory; > rtc::scoped_refptr<webrtc::AudioDeviceModule> _audioDeviceModule; > rtc::scoped_refptr<webrtc::AudioProcessing> _audioProcessingModule; >+ std::unique_ptr<webrtc::MediaTransportFactory> _mediaTransportFactory; > } > > + (RTCPeerConnectionFactoryBuilder *)builder { >@@ 
-38,7 +40,8 @@ - (RTCPeerConnectionFactory *)createPeerConnectionFactory { > nativeVideoEncoderFactory:std::move(_videoEncoderFactory) > nativeVideoDecoderFactory:std::move(_videoDecoderFactory) > audioDeviceModule:_audioDeviceModule >- audioProcessingModule:_audioProcessingModule]; >+ audioProcessingModule:_audioProcessingModule >+ mediaTransportFactory:std::move(_mediaTransportFactory)]; > } > > - (void)setVideoEncoderFactory:(std::unique_ptr<webrtc::VideoEncoderFactory>)videoEncoderFactory { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnectionFactoryOptions.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnectionFactoryOptions.h >index af327f987b27bca77b148d30f980d962646b3df7..4bec8695bdbf1979f50739ef95c64ade13590b08 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnectionFactoryOptions.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnectionFactoryOptions.h >@@ -31,10 +31,6 @@ RTC_OBJC_EXPORT > > @property(nonatomic, assign) BOOL ignoreEthernetNetworkAdapter; > >-@property(nonatomic, assign) BOOL enableAes128Sha1_32CryptoCipher; >- >-@property(nonatomic, assign) BOOL enableGcmCryptoSuites; >- > - (instancetype)init NS_DESIGNATED_INITIALIZER; > > @end >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnectionFactoryOptions.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnectionFactoryOptions.mm >index 103a130390c062bc6ec76ca0ae77c0f52fec3fe9..f0cc6a6c81f0c11ef622ce2b3e283e293bd20a35 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnectionFactoryOptions.mm >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCPeerConnectionFactoryOptions.mm >@@ -34,8 +34,6 @@ @synthesize ignoreVPNNetworkAdapter = _ignoreVPNNetworkAdapter; > @synthesize 
ignoreCellularNetworkAdapter = _ignoreCellularNetworkAdapter; > @synthesize ignoreWiFiNetworkAdapter = _ignoreWiFiNetworkAdapter; > @synthesize ignoreEthernetNetworkAdapter = _ignoreEthernetNetworkAdapter; >-@synthesize enableAes128Sha1_32CryptoCipher = _enableAes128Sha1_32CryptoCipher; >-@synthesize enableGcmCryptoSuites = _enableGcmCryptoSuites; > > - (instancetype)init { > return [super init]; >@@ -52,9 +50,6 @@ - (webrtc::PeerConnectionFactoryInterface::Options)nativeOptions { > setNetworkBit(&options, rtc::ADAPTER_TYPE_WIFI, self.ignoreWiFiNetworkAdapter); > setNetworkBit(&options, rtc::ADAPTER_TYPE_ETHERNET, self.ignoreEthernetNetworkAdapter); > >- options.crypto_options.enable_aes128_sha1_32_crypto_cipher = self.enableAes128Sha1_32CryptoCipher; >- options.crypto_options.enable_gcm_crypto_suites = self.enableGcmCryptoSuites; >- > return options; > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCRtpTransceiver.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCRtpTransceiver.h >index 8b962e4884b142f29b4626ba70d486e2b2f9b675..8ef3fc1d42e7a30178608e3641ae98dec7ba7f96 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCRtpTransceiver.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCRtpTransceiver.h >@@ -28,6 +28,7 @@ typedef NS_ENUM(NSInteger, RTCRtpTransceiverDirection) { > * RTCPeerConnection.addTransceiver. > * https://w3c.github.io/webrtc-pc/#dom-rtcrtptransceiverinit > */ >+RTC_OBJC_EXPORT > @interface RTCRtpTransceiverInit : NSObject > > /** Direction of the RTCRtpTransceiver. See RTCRtpTransceiver.direction. 
*/ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCVideoSource+Private.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCVideoSource+Private.h >index 2441e0cdbe6248aaebb3e88ee409257d44d0efe6..828aad87bd07498ce942e2851dd95a0bd91ab966 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCVideoSource+Private.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCVideoSource+Private.h >@@ -13,6 +13,7 @@ > #import "RTCMediaSource+Private.h" > > #include "api/mediastreaminterface.h" >+#include "rtc_base/thread.h" > > NS_ASSUME_NONNULL_BEGIN > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCLogging.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCLogging.h >index be62db25b569c8443ee0525cb285afdc56c61ffc..b20a4b4d428b580600e07464c3660241eed133ce 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCLogging.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCLogging.h >@@ -14,10 +14,12 @@ > > // Subset of rtc::LoggingSeverity. > typedef NS_ENUM(NSInteger, RTCLoggingSeverity) { >+ RTCLoggingSeveritySensitive, > RTCLoggingSeverityVerbose, > RTCLoggingSeverityInfo, > RTCLoggingSeverityWarning, > RTCLoggingSeverityError, >+ RTCLoggingSeverityNone, > }; > > // Wrapper for C++ RTC_LOG(sev) macros. 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCLogging.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCLogging.mm >index 5c8f9fb4f8bf944f1818f4484dafca5430f61a9a..3aae47a240cc400a2b2ff0b0bf1bf03454369385 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCLogging.mm >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCLogging.mm >@@ -14,6 +14,8 @@ > > rtc::LoggingSeverity RTCGetNativeLoggingSeverity(RTCLoggingSeverity severity) { > switch (severity) { >+ case RTCLoggingSeveritySensitive: >+ return rtc::LS_SENSITIVE; > case RTCLoggingSeverityVerbose: > return rtc::LS_VERBOSE; > case RTCLoggingSeverityInfo: >@@ -22,6 +24,8 @@ rtc::LoggingSeverity RTCGetNativeLoggingSeverity(RTCLoggingSeverity severity) { > return rtc::LS_WARNING; > case RTCLoggingSeverityError: > return rtc::LS_ERROR; >+ case RTCLoggingSeverityNone: >+ return rtc::LS_NONE; > } > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/capturer/RTCCameraVideoCapturer.m b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/capturer/RTCCameraVideoCapturer.m >index 2909405e7d8c511c580048c381dbf3fba32ede97..db833b398cbf1cc1211d996d3b21f9757224889b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/capturer/RTCCameraVideoCapturer.m >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/capturer/RTCCameraVideoCapturer.m >@@ -26,18 +26,18 @@ > > @interface RTCCameraVideoCapturer ()<AVCaptureVideoDataOutputSampleBufferDelegate> > @property(nonatomic, readonly) dispatch_queue_t frameQueue; >+@property(nonatomic, strong) AVCaptureDevice *currentDevice; >+@property(nonatomic, assign) BOOL hasRetriedOnFatalError; >+@property(nonatomic, assign) BOOL isRunning; >+// Will the session be running once all asynchronous operations have been completed? 
>+@property(nonatomic, assign) BOOL willBeRunning; > @end > > @implementation RTCCameraVideoCapturer { > AVCaptureVideoDataOutput *_videoDataOutput; > AVCaptureSession *_captureSession; >- AVCaptureDevice *_currentDevice; > FourCharCode _preferredOutputPixelFormat; > FourCharCode _outputPixelFormat; >- BOOL _hasRetriedOnFatalError; >- BOOL _isRunning; >- // Will the session be running once all asynchronous operations have been completed? >- BOOL _willBeRunning; > RTCVideoRotation _rotation; > #if TARGET_OS_IPHONE > UIDeviceOrientation _orientation; >@@ -46,6 +46,10 @@ @implementation RTCCameraVideoCapturer { > > @synthesize frameQueue = _frameQueue; > @synthesize captureSession = _captureSession; >+@synthesize currentDevice = _currentDevice; >+@synthesize hasRetriedOnFatalError = _hasRetriedOnFatalError; >+@synthesize isRunning = _isRunning; >+@synthesize willBeRunning = _willBeRunning; > > - (instancetype)init { > return [self initWithDelegate:nil captureSession:[[AVCaptureSession alloc] init]]; >@@ -157,25 +161,26 @@ - (void)startCaptureWithDevice:(AVCaptureDevice *)device > [[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications]; > #endif > >- _currentDevice = device; >+ self.currentDevice = device; > > NSError *error = nil; >- if (![_currentDevice lockForConfiguration:&error]) { >- RTCLogError( >- @"Failed to lock device %@. Error: %@", _currentDevice, error.userInfo); >+ if (![self.currentDevice lockForConfiguration:&error]) { >+ RTCLogError(@"Failed to lock device %@. 
Error: %@", >+ self.currentDevice, >+ error.userInfo); > if (completionHandler) { > completionHandler(error); > } >- _willBeRunning = NO; >+ self.willBeRunning = NO; > return; > } > [self reconfigureCaptureSessionInput]; > [self updateOrientation]; > [self updateDeviceCaptureFormat:format fps:fps]; > [self updateVideoDataOutputPixelFormat:format]; >- [_captureSession startRunning]; >- [_currentDevice unlockForConfiguration]; >- _isRunning = YES; >+ [self.captureSession startRunning]; >+ [self.currentDevice unlockForConfiguration]; >+ self.isRunning = YES; > if (completionHandler) { > completionHandler(nil); > } >@@ -188,16 +193,16 @@ - (void)stopCaptureWithCompletionHandler:(nullable void (^)(void))completionHand > dispatchAsyncOnType:RTCDispatcherTypeCaptureSession > block:^{ > RTCLogInfo("Stop"); >- _currentDevice = nil; >- for (AVCaptureDeviceInput *oldInput in [_captureSession.inputs copy]) { >- [_captureSession removeInput:oldInput]; >+ self.currentDevice = nil; >+ for (AVCaptureDeviceInput *oldInput in [self.captureSession.inputs copy]) { >+ [self.captureSession removeInput:oldInput]; > } >- [_captureSession stopRunning]; >+ [self.captureSession stopRunning]; > > #if TARGET_OS_IPHONE > [[UIDevice currentDevice] endGeneratingDeviceOrientationNotifications]; > #endif >- _isRunning = NO; >+ self.isRunning = NO; > if (completionHandler) { > completionHandler(); > } >@@ -340,7 +345,7 @@ - (void)handleCaptureSessionDidStartRunning:(NSNotification *)notification { > block:^{ > // If we successfully restarted after an unknown error, > // allow future retries on fatal errors. 
>- _hasRetriedOnFatalError = NO; >+ self.hasRetriedOnFatalError = NO; > }]; > } > >@@ -352,10 +357,10 @@ - (void)handleFatalError { > [RTCDispatcher > dispatchAsyncOnType:RTCDispatcherTypeCaptureSession > block:^{ >- if (!_hasRetriedOnFatalError) { >+ if (!self.hasRetriedOnFatalError) { > RTCLogWarning(@"Attempting to recover from fatal capture error."); > [self handleNonFatalError]; >- _hasRetriedOnFatalError = YES; >+ self.hasRetriedOnFatalError = YES; > } else { > RTCLogError(@"Previous fatal error recovery failed."); > } >@@ -366,8 +371,8 @@ - (void)handleNonFatalError { > [RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeCaptureSession > block:^{ > RTCLog(@"Restarting capture session after error."); >- if (_isRunning) { >- [_captureSession startRunning]; >+ if (self.isRunning) { >+ [self.captureSession startRunning]; > } > }]; > } >@@ -379,9 +384,9 @@ - (void)handleNonFatalError { > - (void)handleApplicationDidBecomeActive:(NSNotification *)notification { > [RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeCaptureSession > block:^{ >- if (_isRunning && !_captureSession.isRunning) { >+ if (self.isRunning && !self.captureSession.isRunning) { > RTCLog(@"Restarting capture session on active."); >- [_captureSession startRunning]; >+ [self.captureSession startRunning]; > } > }]; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/capturer/RTCFileVideoCapturer.m b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/capturer/RTCFileVideoCapturer.m >index 00143e909fd0aa02fe771f80d790e9677848799a..207a21d8c0d3f6915d228af4cc21d2bd80500a15 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/capturer/RTCFileVideoCapturer.m >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/capturer/RTCFileVideoCapturer.m >@@ -27,15 +27,21 @@ typedef NS_ENUM(NSInteger, RTCFileVideoCapturerStatus) { > RTCFileVideoCapturerStatusStopped > }; > >+@interface RTCFileVideoCapturer () >+@property(nonatomic, assign) 
CMTime lastPresentationTime; >+@property(nonatomic, strong) NSURL *fileURL; >+@end >+ > @implementation RTCFileVideoCapturer { > AVAssetReader *_reader; > AVAssetReaderTrackOutput *_outTrack; > RTCFileVideoCapturerStatus _status; >- CMTime _lastPresentationTime; > dispatch_queue_t _frameQueue; >- NSURL *_fileURL; > } > >+@synthesize lastPresentationTime = _lastPresentationTime; >+@synthesize fileURL = _fileURL; >+ > - (void)startCapturingFromFileNamed:(NSString *)nameOfFile > onError:(RTCFileVideoCapturerErrorBlock)errorBlock { > if (_status == RTCFileVideoCapturerStatusStarted) { >@@ -62,9 +68,9 @@ - (void)startCapturingFromFileNamed:(NSString *)nameOfFile > return; > } > >- _lastPresentationTime = CMTimeMake(0, 0); >+ self.lastPresentationTime = CMTimeMake(0, 0); > >- _fileURL = [NSURL fileURLWithPath:pathForFile]; >+ self.fileURL = [NSURL fileURLWithPath:pathForFile]; > [self setupReaderOnError:errorBlock]; > }); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/renderer/metal/RTCMTLRenderer.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/renderer/metal/RTCMTLRenderer.mm >index 8f372bcdbd8152fbd25ccd4ba3ebecf11f1ade5a..fb478d25331e9ccec1494012f60209677f91f692 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/renderer/metal/RTCMTLRenderer.mm >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/renderer/metal/RTCMTLRenderer.mm >@@ -176,7 +176,8 @@ - (BOOL)setupTexturesForFrame:(nonnull RTCVideoFrame *)frame { > RTCVideoRotation rotation; > NSValue *rotationOverride = self.rotationOverride; > if (rotationOverride) { >-#if defined(__IPHONE_11_0) && (__IPHONE_OS_VERSION_MAX_ALLOWED >= __IPHONE_11_0) >+#if defined(__IPHONE_11_0) && defined(__IPHONE_OS_VERSION_MAX_ALLOWED) && \ >+ (__IPHONE_OS_VERSION_MAX_ALLOWED >= __IPHONE_11_0) > if (@available(iOS 11, *)) { > [rotationOverride getValue:&rotation size:sizeof(rotation)]; > } else >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/renderer/metal/RTCMTLVideoView.m b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/renderer/metal/RTCMTLVideoView.m >index 7ad1d79d3e3f6d63be46dfe0861161811a9aa543..f8575c0cfed52fc4a55b8f6215fe8b3cb5865665 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/renderer/metal/RTCMTLVideoView.m >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/renderer/metal/RTCMTLVideoView.m >@@ -141,6 +141,10 @@ - (void)drawInMTKView:(nonnull MTKView *)view { > return; > } > >+ if (CGRectIsEmpty(view.bounds)) { >+ return; >+ } >+ > RTCMTLRenderer *renderer; > if ([videoFrame.buffer isKindOfClass:[RTCCVPixelBuffer class]]) { > RTCCVPixelBuffer *buffer = (RTCCVPixelBuffer*)videoFrame.buffer; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoDecoderH264.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoDecoderH264.mm >index 6317013180daa3712ad72225e0d6f8a6059c5177..1c18fab17df964563eecd8e2e42edd0f080ee4e1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoDecoderH264.mm >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoDecoderH264.mm >@@ -67,12 +67,23 @@ void decompressionOutputCallback(void *decoderRef, > // Decoder. 
> @implementation RTCVideoDecoderH264 { > CMVideoFormatDescriptionRef _videoFormat; >+ CMMemoryPoolRef _memoryPool; > VTDecompressionSessionRef _decompressionSession; > RTCVideoDecoderCallback _callback; > OSStatus _error; > } > >+- (instancetype)init { >+ self = [super init]; >+ if (self) { >+ _memoryPool = CMMemoryPoolCreate(nil); >+ } >+ return self; >+} >+ > - (void)dealloc { >+ CMMemoryPoolInvalidate(_memoryPool); >+ CFRelease(_memoryPool); > [self destroyDecompressionSession]; > [self setVideoFormat:nullptr]; > } >@@ -125,7 +136,8 @@ - (NSInteger)decode:(RTCEncodedImage *)inputImage > if (!webrtc::H264AnnexBBufferToCMSampleBuffer((uint8_t *)inputImage.buffer.bytes, > inputImage.buffer.length, > _videoFormat, >- &sampleBuffer)) { >+ &sampleBuffer, >+ _memoryPool)) { > return WEBRTC_VIDEO_CODEC_ERROR; > } > RTC_DCHECK(sampleBuffer); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/nalu_rewriter.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/nalu_rewriter.cc >index a849d3f053e39d21e3901ed4559a750862d46cfa..55ed1c2b4ca3700e0804ade2cbbea0e807b1e5ab 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/nalu_rewriter.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/nalu_rewriter.cc >@@ -160,7 +160,8 @@ bool H264CMSampleBufferToAnnexBBuffer( > bool H264AnnexBBufferToCMSampleBuffer(const uint8_t* annexb_buffer, > size_t annexb_buffer_size, > CMVideoFormatDescriptionRef video_format, >- CMSampleBufferRef* out_sample_buffer) { >+ CMSampleBufferRef* out_sample_buffer, >+ CMMemoryPoolRef memory_pool) { > RTC_DCHECK(annexb_buffer); > RTC_DCHECK(out_sample_buffer); > RTC_DCHECK(video_format); >@@ -185,11 +186,11 @@ bool H264AnnexBBufferToCMSampleBuffer(const uint8_t* annexb_buffer, > } > > // Allocate memory as a block buffer. >- // TODO(tkchin): figure out how to use a pool. 
> CMBlockBufferRef block_buffer = nullptr; >+ CFAllocatorRef block_allocator = CMMemoryPoolGetAllocator(memory_pool); > OSStatus status = CMBlockBufferCreateWithMemoryBlock( >- nullptr, nullptr, reader.BytesRemaining(), nullptr, nullptr, 0, >- reader.BytesRemaining(), kCMBlockBufferAssureMemoryNowFlag, >+ kCFAllocatorDefault, nullptr, reader.BytesRemaining(), block_allocator, >+ nullptr, 0, reader.BytesRemaining(), kCMBlockBufferAssureMemoryNowFlag, > &block_buffer); > if (status != kCMBlockBufferNoErr) { > RTC_LOG(LS_ERROR) << "Failed to create block buffer."; >@@ -199,8 +200,9 @@ bool H264AnnexBBufferToCMSampleBuffer(const uint8_t* annexb_buffer, > // Make sure block buffer is contiguous. > CMBlockBufferRef contiguous_buffer = nullptr; > if (!CMBlockBufferIsRangeContiguous(block_buffer, 0, 0)) { >- status = CMBlockBufferCreateContiguous( >- nullptr, block_buffer, nullptr, nullptr, 0, 0, 0, &contiguous_buffer); >+ status = CMBlockBufferCreateContiguous(kCFAllocatorDefault, block_buffer, >+ block_allocator, nullptr, 0, 0, 0, >+ &contiguous_buffer); > if (status != noErr) { > RTC_LOG(LS_ERROR) << "Failed to flatten non-contiguous block buffer: " > << status; >@@ -236,9 +238,9 @@ bool H264AnnexBBufferToCMSampleBuffer(const uint8_t* annexb_buffer, > } > > // Create sample buffer. 
>- status = CMSampleBufferCreate(nullptr, contiguous_buffer, true, nullptr, >- nullptr, video_format, 1, 0, nullptr, 0, >- nullptr, out_sample_buffer); >+ status = CMSampleBufferCreate(kCFAllocatorDefault, contiguous_buffer, true, >+ nullptr, nullptr, video_format, 1, 0, nullptr, >+ 0, nullptr, out_sample_buffer); > if (status != noErr) { > RTC_LOG(LS_ERROR) << "Failed to create sample buffer."; > CFRelease(contiguous_buffer); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/nalu_rewriter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/nalu_rewriter.h >index cd5a51079ab5b225877d00ce235c67d0b1eb0e98..a0c1aa90af5759de39b5d06c221d284294389876 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/nalu_rewriter.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/nalu_rewriter.h >@@ -44,7 +44,8 @@ bool H264CMSampleBufferToAnnexBBuffer( > bool H264AnnexBBufferToCMSampleBuffer(const uint8_t* annexb_buffer, > size_t annexb_buffer_size, > CMVideoFormatDescriptionRef video_format, >- CMSampleBufferRef* out_sample_buffer); >+ CMSampleBufferRef* out_sample_buffer, >+ CMMemoryPoolRef memory_pool); > > // Returns a video format description created from the sps/pps information in > // the Annex B buffer. If there is no such information, nullptr is returned. 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_frame_buffer/RTCCVPixelBuffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_frame_buffer/RTCCVPixelBuffer.h >index abe9dfca93b04bf552ec0302b807597df261edb1..432a38257464b26cb53234c8a37af67d58ef80b1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_frame_buffer/RTCCVPixelBuffer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_frame_buffer/RTCCVPixelBuffer.h >@@ -17,7 +17,6 @@ NS_ASSUME_NONNULL_BEGIN > > /** RTCVideoFrameBuffer containing a CVPixelBufferRef */ > RTC_OBJC_EXPORT >-__attribute__((objc_runtime_name("WK_RTCCVPixelBuffer"))) > @interface RTCCVPixelBuffer : NSObject <RTCVideoFrameBuffer> > > @property(nonatomic, readonly) CVPixelBufferRef pixelBuffer; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/helpers/RTCCameraPreviewView.m b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/helpers/RTCCameraPreviewView.m >index 2add46cf71f6a47db05f983c469e3f401b0e36ff..adc62cc30af497bdc16644a3181e59a45c8dd34f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/helpers/RTCCameraPreviewView.m >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/helpers/RTCCameraPreviewView.m >@@ -47,19 +47,22 @@ - (void)setCaptureSession:(AVCaptureSession *)captureSession { > if (_captureSession == captureSession) { > return; > } >- [RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeMain >- block:^{ >- _captureSession = captureSession; >- AVCaptureVideoPreviewLayer *previewLayer = [self previewLayer]; >- [RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeCaptureSession >- block:^{ >- previewLayer.session = captureSession; >- [RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeMain >- block:^{ >- [self setCorrectVideoOrientation]; >- }]; >- }]; >- }]; >+ [RTCDispatcher >+ dispatchAsyncOnType:RTCDispatcherTypeMain >+ block:^{ >+ self.captureSession = captureSession; >+ 
AVCaptureVideoPreviewLayer *previewLayer = [self previewLayer]; >+ [RTCDispatcher >+ dispatchAsyncOnType:RTCDispatcherTypeCaptureSession >+ block:^{ >+ previewLayer.session = captureSession; >+ [RTCDispatcher >+ dispatchAsyncOnType:RTCDispatcherTypeMain >+ block:^{ >+ [self setCorrectVideoOrientation]; >+ }]; >+ }]; >+ }]; > } > > - (void)layoutSubviews { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/helpers/UIDevice+RTCDevice.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/helpers/UIDevice+RTCDevice.h >index eb39ea9cc4d767341a3f7aa396b98e706fb13e04..736237582fe828bd736e5c8c0bddeae4e8beba29 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/helpers/UIDevice+RTCDevice.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/helpers/UIDevice+RTCDevice.h >@@ -34,6 +34,9 @@ typedef NS_ENUM(NSInteger, RTCDeviceType) { > RTCDeviceTypeIPhone8, > RTCDeviceTypeIPhone8Plus, > RTCDeviceTypeIPhoneX, >+ RTCDeviceTypeIPhoneXS, >+ RTCDeviceTypeIPhoneXSMax, >+ RTCDeviceTypeIPhoneXR, > RTCDeviceTypeIPodTouch1G, > RTCDeviceTypeIPodTouch2G, > RTCDeviceTypeIPodTouch3G, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/helpers/UIDevice+RTCDevice.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/helpers/UIDevice+RTCDevice.mm >index 96adb9af780e13f09882ead7d848aee7c616c090..6f68e9f7a0da46b04e5a979509874d6d2a3aa2f1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/helpers/UIDevice+RTCDevice.mm >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/helpers/UIDevice+RTCDevice.mm >@@ -17,81 +17,85 @@ @implementation UIDevice (RTCDevice) > > + (RTCDeviceType)deviceType { > NSDictionary *machineNameToType = @{ >- @"iPhone1,1": @(RTCDeviceTypeIPhone1G), >- @"iPhone1,2": @(RTCDeviceTypeIPhone3G), >- @"iPhone2,1": @(RTCDeviceTypeIPhone3GS), >- @"iPhone3,1": @(RTCDeviceTypeIPhone4), >- @"iPhone3,2": @(RTCDeviceTypeIPhone4), >- @"iPhone3,3": @(RTCDeviceTypeIPhone4Verizon), >- @"iPhone4,1": @(RTCDeviceTypeIPhone4S), >- 
@"iPhone5,1": @(RTCDeviceTypeIPhone5GSM), >- @"iPhone5,2": @(RTCDeviceTypeIPhone5GSM_CDMA), >- @"iPhone5,3": @(RTCDeviceTypeIPhone5CGSM), >- @"iPhone5,4": @(RTCDeviceTypeIPhone5CGSM_CDMA), >- @"iPhone6,1": @(RTCDeviceTypeIPhone5SGSM), >- @"iPhone6,2": @(RTCDeviceTypeIPhone5SGSM_CDMA), >- @"iPhone7,1": @(RTCDeviceTypeIPhone6Plus), >- @"iPhone7,2": @(RTCDeviceTypeIPhone6), >- @"iPhone8,1": @(RTCDeviceTypeIPhone6S), >- @"iPhone8,2": @(RTCDeviceTypeIPhone6SPlus), >- @"iPhone8,4": @(RTCDeviceTypeIPhoneSE), >- @"iPhone9,1": @(RTCDeviceTypeIPhone7), >- @"iPhone9,2": @(RTCDeviceTypeIPhone7Plus), >- @"iPhone9,3": @(RTCDeviceTypeIPhone7), >- @"iPhone9,4": @(RTCDeviceTypeIPhone7Plus), >- @"iPhone10,1": @(RTCDeviceTypeIPhone8), >- @"iPhone10,2": @(RTCDeviceTypeIPhone8Plus), >- @"iPhone10,3": @(RTCDeviceTypeIPhoneX), >- @"iPhone10,4": @(RTCDeviceTypeIPhone8), >- @"iPhone10,5": @(RTCDeviceTypeIPhone8Plus), >- @"iPhone10,6": @(RTCDeviceTypeIPhoneX), >- @"iPod1,1": @(RTCDeviceTypeIPodTouch1G), >- @"iPod2,1": @(RTCDeviceTypeIPodTouch2G), >- @"iPod3,1": @(RTCDeviceTypeIPodTouch3G), >- @"iPod4,1": @(RTCDeviceTypeIPodTouch4G), >- @"iPod5,1": @(RTCDeviceTypeIPodTouch5G), >- @"iPod7,1": @(RTCDeviceTypeIPodTouch6G), >- @"iPad1,1": @(RTCDeviceTypeIPad), >- @"iPad2,1": @(RTCDeviceTypeIPad2Wifi), >- @"iPad2,2": @(RTCDeviceTypeIPad2GSM), >- @"iPad2,3": @(RTCDeviceTypeIPad2CDMA), >- @"iPad2,4": @(RTCDeviceTypeIPad2Wifi2), >- @"iPad2,5": @(RTCDeviceTypeIPadMiniWifi), >- @"iPad2,6": @(RTCDeviceTypeIPadMiniGSM), >- @"iPad2,7": @(RTCDeviceTypeIPadMiniGSM_CDMA), >- @"iPad3,1": @(RTCDeviceTypeIPad3Wifi), >- @"iPad3,2": @(RTCDeviceTypeIPad3GSM_CDMA), >- @"iPad3,3": @(RTCDeviceTypeIPad3GSM), >- @"iPad3,4": @(RTCDeviceTypeIPad4Wifi), >- @"iPad3,5": @(RTCDeviceTypeIPad4GSM), >- @"iPad3,6": @(RTCDeviceTypeIPad4GSM_CDMA), >- @"iPad4,1": @(RTCDeviceTypeIPadAirWifi), >- @"iPad4,2": @(RTCDeviceTypeIPadAirCellular), >- @"iPad4,3": @(RTCDeviceTypeIPadAirWifiCellular), >- @"iPad4,4": 
@(RTCDeviceTypeIPadMini2GWifi), >- @"iPad4,5": @(RTCDeviceTypeIPadMini2GCellular), >- @"iPad4,6": @(RTCDeviceTypeIPadMini2GWifiCellular), >- @"iPad4,7": @(RTCDeviceTypeIPadMini3), >- @"iPad4,8": @(RTCDeviceTypeIPadMini3), >- @"iPad4,9": @(RTCDeviceTypeIPadMini3), >- @"iPad5,1": @(RTCDeviceTypeIPadMini4), >- @"iPad5,2": @(RTCDeviceTypeIPadMini4), >- @"iPad5,3": @(RTCDeviceTypeIPadAir2), >- @"iPad5,4": @(RTCDeviceTypeIPadAir2), >- @"iPad6,3": @(RTCDeviceTypeIPadPro9Inch), >- @"iPad6,4": @(RTCDeviceTypeIPadPro9Inch), >- @"iPad6,7": @(RTCDeviceTypeIPadPro12Inch), >- @"iPad6,8": @(RTCDeviceTypeIPadPro12Inch), >- @"iPad6,11": @(RTCDeviceTypeIPad5), >- @"iPad6,12": @(RTCDeviceTypeIPad5), >- @"iPad7,1": @(RTCDeviceTypeIPadPro12Inch2), >- @"iPad7,2": @(RTCDeviceTypeIPadPro12Inch2), >- @"iPad7,3": @(RTCDeviceTypeIPadPro10Inch), >- @"iPad7,4": @(RTCDeviceTypeIPadPro10Inch), >- @"iPad7,5": @(RTCDeviceTypeIPad6), >- @"iPad7,6": @(RTCDeviceTypeIPad6), >- @"i386": @(RTCDeviceTypeSimulatori386), >- @"x86_64": @(RTCDeviceTypeSimulatorx86_64), >+ @"iPhone1,1" : @(RTCDeviceTypeIPhone1G), >+ @"iPhone1,2" : @(RTCDeviceTypeIPhone3G), >+ @"iPhone2,1" : @(RTCDeviceTypeIPhone3GS), >+ @"iPhone3,1" : @(RTCDeviceTypeIPhone4), >+ @"iPhone3,2" : @(RTCDeviceTypeIPhone4), >+ @"iPhone3,3" : @(RTCDeviceTypeIPhone4Verizon), >+ @"iPhone4,1" : @(RTCDeviceTypeIPhone4S), >+ @"iPhone5,1" : @(RTCDeviceTypeIPhone5GSM), >+ @"iPhone5,2" : @(RTCDeviceTypeIPhone5GSM_CDMA), >+ @"iPhone5,3" : @(RTCDeviceTypeIPhone5CGSM), >+ @"iPhone5,4" : @(RTCDeviceTypeIPhone5CGSM_CDMA), >+ @"iPhone6,1" : @(RTCDeviceTypeIPhone5SGSM), >+ @"iPhone6,2" : @(RTCDeviceTypeIPhone5SGSM_CDMA), >+ @"iPhone7,1" : @(RTCDeviceTypeIPhone6Plus), >+ @"iPhone7,2" : @(RTCDeviceTypeIPhone6), >+ @"iPhone8,1" : @(RTCDeviceTypeIPhone6S), >+ @"iPhone8,2" : @(RTCDeviceTypeIPhone6SPlus), >+ @"iPhone8,4" : @(RTCDeviceTypeIPhoneSE), >+ @"iPhone9,1" : @(RTCDeviceTypeIPhone7), >+ @"iPhone9,2" : @(RTCDeviceTypeIPhone7Plus), >+ @"iPhone9,3" : 
@(RTCDeviceTypeIPhone7), >+ @"iPhone9,4" : @(RTCDeviceTypeIPhone7Plus), >+ @"iPhone10,1" : @(RTCDeviceTypeIPhone8), >+ @"iPhone10,2" : @(RTCDeviceTypeIPhone8Plus), >+ @"iPhone10,3" : @(RTCDeviceTypeIPhoneX), >+ @"iPhone10,4" : @(RTCDeviceTypeIPhone8), >+ @"iPhone10,5" : @(RTCDeviceTypeIPhone8Plus), >+ @"iPhone10,6" : @(RTCDeviceTypeIPhoneX), >+ @"iPhone11,2" : @(RTCDeviceTypeIPhoneXS), >+ @"iPhone11,4" : @(RTCDeviceTypeIPhoneXSMax), >+ @"iPhone11,6" : @(RTCDeviceTypeIPhoneXSMax), >+ @"iPhone11,8" : @(RTCDeviceTypeIPhoneXR), >+ @"iPod1,1" : @(RTCDeviceTypeIPodTouch1G), >+ @"iPod2,1" : @(RTCDeviceTypeIPodTouch2G), >+ @"iPod3,1" : @(RTCDeviceTypeIPodTouch3G), >+ @"iPod4,1" : @(RTCDeviceTypeIPodTouch4G), >+ @"iPod5,1" : @(RTCDeviceTypeIPodTouch5G), >+ @"iPod7,1" : @(RTCDeviceTypeIPodTouch6G), >+ @"iPad1,1" : @(RTCDeviceTypeIPad), >+ @"iPad2,1" : @(RTCDeviceTypeIPad2Wifi), >+ @"iPad2,2" : @(RTCDeviceTypeIPad2GSM), >+ @"iPad2,3" : @(RTCDeviceTypeIPad2CDMA), >+ @"iPad2,4" : @(RTCDeviceTypeIPad2Wifi2), >+ @"iPad2,5" : @(RTCDeviceTypeIPadMiniWifi), >+ @"iPad2,6" : @(RTCDeviceTypeIPadMiniGSM), >+ @"iPad2,7" : @(RTCDeviceTypeIPadMiniGSM_CDMA), >+ @"iPad3,1" : @(RTCDeviceTypeIPad3Wifi), >+ @"iPad3,2" : @(RTCDeviceTypeIPad3GSM_CDMA), >+ @"iPad3,3" : @(RTCDeviceTypeIPad3GSM), >+ @"iPad3,4" : @(RTCDeviceTypeIPad4Wifi), >+ @"iPad3,5" : @(RTCDeviceTypeIPad4GSM), >+ @"iPad3,6" : @(RTCDeviceTypeIPad4GSM_CDMA), >+ @"iPad4,1" : @(RTCDeviceTypeIPadAirWifi), >+ @"iPad4,2" : @(RTCDeviceTypeIPadAirCellular), >+ @"iPad4,3" : @(RTCDeviceTypeIPadAirWifiCellular), >+ @"iPad4,4" : @(RTCDeviceTypeIPadMini2GWifi), >+ @"iPad4,5" : @(RTCDeviceTypeIPadMini2GCellular), >+ @"iPad4,6" : @(RTCDeviceTypeIPadMini2GWifiCellular), >+ @"iPad4,7" : @(RTCDeviceTypeIPadMini3), >+ @"iPad4,8" : @(RTCDeviceTypeIPadMini3), >+ @"iPad4,9" : @(RTCDeviceTypeIPadMini3), >+ @"iPad5,1" : @(RTCDeviceTypeIPadMini4), >+ @"iPad5,2" : @(RTCDeviceTypeIPadMini4), >+ @"iPad5,3" : @(RTCDeviceTypeIPadAir2), >+ @"iPad5,4" : 
@(RTCDeviceTypeIPadAir2), >+ @"iPad6,3" : @(RTCDeviceTypeIPadPro9Inch), >+ @"iPad6,4" : @(RTCDeviceTypeIPadPro9Inch), >+ @"iPad6,7" : @(RTCDeviceTypeIPadPro12Inch), >+ @"iPad6,8" : @(RTCDeviceTypeIPadPro12Inch), >+ @"iPad6,11" : @(RTCDeviceTypeIPad5), >+ @"iPad6,12" : @(RTCDeviceTypeIPad5), >+ @"iPad7,1" : @(RTCDeviceTypeIPadPro12Inch2), >+ @"iPad7,2" : @(RTCDeviceTypeIPadPro12Inch2), >+ @"iPad7,3" : @(RTCDeviceTypeIPadPro10Inch), >+ @"iPad7,4" : @(RTCDeviceTypeIPadPro10Inch), >+ @"iPad7,5" : @(RTCDeviceTypeIPad6), >+ @"iPad7,6" : @(RTCDeviceTypeIPad6), >+ @"i386" : @(RTCDeviceTypeSimulatori386), >+ @"x86_64" : @(RTCDeviceTypeSimulatorx86_64), > }; > > RTCDeviceType deviceType = RTCDeviceTypeUnknown; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/native/api/video_capturer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/native/api/video_capturer.h >index 5b6f3f939baa7c662335d791e6880693389a1d51..d6f396b4701cac5708b0f676da46636654f36a45 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/native/api/video_capturer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/native/api/video_capturer.h >@@ -15,6 +15,7 @@ > > #include "api/mediastreaminterface.h" > #include "rtc_base/scoped_ref_ptr.h" >+#include "rtc_base/thread.h" > > namespace webrtc { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/native/src/objc_video_encoder_factory.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/native/src/objc_video_encoder_factory.mm >index c4c85cc6ab903535e5ef01e7cf5a81531d170e6f..ae45033a570f1c1989db31c54e1fbef19efc4a15 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/native/src/objc_video_encoder_factory.mm >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/native/src/objc_video_encoder_factory.mm >@@ -86,28 +86,26 @@ class ObjCVideoEncoder : public VideoEncoder { > frameTypes:rtcFrameTypes]; > } > >- int32_t SetChannelParameters(uint32_t packet_loss, int64_t rtt) { return 
WEBRTC_VIDEO_CODEC_OK; } >- > int32_t SetRates(uint32_t bitrate, uint32_t framerate) { > return [encoder_ setBitrate:bitrate framerate:framerate]; > } > > int32_t SetRateAllocation(const VideoBitrateAllocation& allocation, uint32_t framerate) { >- RTCVideoBitrateAllocation *bitrateAllocation = >- [[RTCVideoBitrateAllocation alloc] initWithNativeVideoBitrateAllocation:&allocation]; >- return [encoder_ setRateAllocation: bitrateAllocation framerate:framerate]; >+ auto *rtcAllocation = [[RTCVideoBitrateAllocation alloc] initWithNativeVideoBitrateAllocation:&allocation]; >+ return [encoder_ setRateAllocation: rtcAllocation framerate:framerate]; > } > >- bool SupportsNativeHandle() const { return true; } >+ VideoEncoder::EncoderInfo GetEncoderInfo() const { >+ EncoderInfo info; >+ info.supports_native_handle = true; >+ info.implementation_name = implementation_name_; > >- VideoEncoder::ScalingSettings GetScalingSettings() const { > RTCVideoEncoderQpThresholds *qp_thresholds = [encoder_ scalingSettings]; >- return qp_thresholds ? ScalingSettings(qp_thresholds.low, qp_thresholds.high) : >- ScalingSettings::kOff; >+ info.scaling_settings = qp_thresholds ? 
ScalingSettings(qp_thresholds.low, qp_thresholds.high) : >+ ScalingSettings::kOff; >+ return info; > } > >- const char *ImplementationName() const { return implementation_name_.c_str(); } >- > private: > id<RTCVideoEncoder> encoder_; > const std::string implementation_name_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/RTCAudioDeviceModule_xctest.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/RTCAudioDeviceModule_xctest.mm >index 689de581d04eaf58ae473ccdfe52183b0256437e..55d48c28ba71e93f03f9deb7873f54113c0c1ea4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/RTCAudioDeviceModule_xctest.mm >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/RTCAudioDeviceModule_xctest.mm >@@ -128,17 +128,20 @@ static const NSUInteger kFullDuplexTimeInSec = 10; > static const NSUInteger kNumIgnoreFirstCallbacks = 50; > > @interface RTCAudioDeviceModuleTests : XCTestCase { >- > rtc::scoped_refptr<webrtc::AudioDeviceModule> audioDeviceModule; >- webrtc::AudioParameters playoutParameters; >- webrtc::AudioParameters recordParameters; > MockAudioTransport mock; > } > >+@property(nonatomic, assign) webrtc::AudioParameters playoutParameters; >+@property(nonatomic, assign) webrtc::AudioParameters recordParameters; >+ > @end > > @implementation RTCAudioDeviceModuleTests > >+@synthesize playoutParameters; >+@synthesize recordParameters; >+ > - (void)setUp { > [super setUp]; > audioDeviceModule = webrtc::CreateAudioDeviceModule(); >@@ -254,10 +257,10 @@ - (void)testStartPlayoutOnTwoInstances { > int64_t *elapsed_time_ms, > int64_t *ntp_time_ms) { > nSamplesOut = nSamples; >- XCTAssertEqual(nSamples, playoutParameters.frames_per_10ms_buffer()); >+ XCTAssertEqual(nSamples, self.playoutParameters.frames_per_10ms_buffer()); > XCTAssertEqual(nBytesPerSample, kBytesPerSample); >- XCTAssertEqual(nChannels, playoutParameters.channels()); >- XCTAssertEqual((int) samplesPerSec, playoutParameters.sample_rate()); >+ 
XCTAssertEqual(nChannels, self.playoutParameters.channels()); >+ XCTAssertEqual((int)samplesPerSec, self.playoutParameters.sample_rate()); > XCTAssertNotEqual((void*)NULL, audioSamples); > > return 0; >@@ -291,10 +294,10 @@ - (void)testStartPlayoutOnTwoInstances { > int64_t *elapsed_time_ms, > int64_t *ntp_time_ms) { > nSamplesOut = nSamples; >- XCTAssertEqual(nSamples, playoutParameters.frames_per_10ms_buffer()); >+ XCTAssertEqual(nSamples, self.playoutParameters.frames_per_10ms_buffer()); > XCTAssertEqual(nBytesPerSample, kBytesPerSample); >- XCTAssertEqual(nChannels, playoutParameters.channels()); >- XCTAssertEqual((int) samplesPerSec, playoutParameters.sample_rate()); >+ XCTAssertEqual(nChannels, self.playoutParameters.channels()); >+ XCTAssertEqual((int)samplesPerSec, self.playoutParameters.sample_rate()); > XCTAssertNotEqual((void*)NULL, audioSamples); > if (++num_callbacks == kNumCallbacks) { > [playoutExpectation fulfill]; >@@ -330,10 +333,10 @@ - (void)testStartPlayoutVerifyCallbacks { > int64_t *elapsed_time_ms, > int64_t *ntp_time_ms) { > nSamplesOut = nSamples; >- XCTAssertEqual(nSamples, playoutParameters.frames_per_10ms_buffer()); >+ XCTAssertEqual(nSamples, self.playoutParameters.frames_per_10ms_buffer()); > XCTAssertEqual(nBytesPerSample, kBytesPerSample); >- XCTAssertEqual(nChannels, playoutParameters.channels()); >- XCTAssertEqual((int) samplesPerSec, playoutParameters.sample_rate()); >+ XCTAssertEqual(nChannels, self.playoutParameters.channels()); >+ XCTAssertEqual((int)samplesPerSec, self.playoutParameters.sample_rate()); > XCTAssertNotEqual((void*)NULL, audioSamples); > if (++num_callbacks == kNumCallbacks) { > [playoutExpectation fulfill]; >@@ -366,10 +369,10 @@ - (void)testStartRecordingVerifyCallbacks { > const bool keyPressed, > uint32_t& newMicLevel) { > XCTAssertNotEqual((void*)NULL, audioSamples); >- XCTAssertEqual(nSamples, recordParameters.frames_per_10ms_buffer()); >+ XCTAssertEqual(nSamples, 
self.recordParameters.frames_per_10ms_buffer()); > XCTAssertEqual(nBytesPerSample, kBytesPerSample); >- XCTAssertEqual(nChannels, recordParameters.channels()); >- XCTAssertEqual((int) samplesPerSec, recordParameters.sample_rate()); >+ XCTAssertEqual(nChannels, self.recordParameters.channels()); >+ XCTAssertEqual((int)samplesPerSec, self.recordParameters.sample_rate()); > XCTAssertEqual(0, clockDrift); > XCTAssertEqual(0u, currentMicLevel); > XCTAssertFalse(keyPressed); >@@ -405,10 +408,10 @@ - (void)testStartPlayoutAndRecordingVerifyCallbacks { > int64_t *elapsed_time_ms, > int64_t *ntp_time_ms) { > nSamplesOut = nSamples; >- XCTAssertEqual(nSamples, playoutParameters.frames_per_10ms_buffer()); >+ XCTAssertEqual(nSamples, self.playoutParameters.frames_per_10ms_buffer()); > XCTAssertEqual(nBytesPerSample, kBytesPerSample); >- XCTAssertEqual(nChannels, playoutParameters.channels()); >- XCTAssertEqual((int) samplesPerSec, playoutParameters.sample_rate()); >+ XCTAssertEqual(nChannels, self.playoutParameters.channels()); >+ XCTAssertEqual((int)samplesPerSec, self.playoutParameters.sample_rate()); > XCTAssertNotEqual((void*)NULL, audioSamples); > if (callbackCount++ >= kNumCallbacks) { > [playoutExpectation fulfill]; >@@ -428,10 +431,10 @@ - (void)testStartPlayoutAndRecordingVerifyCallbacks { > const bool keyPressed, > uint32_t& newMicLevel) { > XCTAssertNotEqual((void*)NULL, audioSamples); >- XCTAssertEqual(nSamples, recordParameters.frames_per_10ms_buffer()); >+ XCTAssertEqual(nSamples, self.recordParameters.frames_per_10ms_buffer()); > XCTAssertEqual(nBytesPerSample, kBytesPerSample); >- XCTAssertEqual(nChannels, recordParameters.channels()); >- XCTAssertEqual((int) samplesPerSec, recordParameters.sample_rate()); >+ XCTAssertEqual(nChannels, self.recordParameters.channels()); >+ XCTAssertEqual((int)samplesPerSec, self.recordParameters.sample_rate()); > XCTAssertEqual(0, clockDrift); > XCTAssertEqual(0u, currentMicLevel); > XCTAssertFalse(keyPressed); >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/RTCCallbackLogger_xctest.m b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/RTCCallbackLogger_xctest.m >index 4a0a2b3e6d15bd71d96a592222acd5a3474c2a9e..ceaa762f1f8a1f097d51c55e09252d92ed99b71f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/RTCCallbackLogger_xctest.m >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/RTCCallbackLogger_xctest.m >@@ -49,6 +49,23 @@ - (void)testCallbackGetsCalledForAppropriateLevel { > [self waitForExpectations:@[ callbackExpectation ] timeout:10.0]; > } > >+- (void)testCallbackWithSeverityGetsCalledForAppropriateLevel { >+ self.logger.severity = RTCLoggingSeverityWarning; >+ >+ XCTestExpectation *callbackExpectation = [self expectationWithDescription:@"callbackWarning"]; >+ >+ [self.logger >+ startWithMessageAndSeverityHandler:^(NSString *message, RTCLoggingSeverity severity) { >+ XCTAssertTrue([message hasSuffix:@"Horrible error\n"]); >+ XCTAssertEqual(severity, RTCLoggingSeverityError); >+ [callbackExpectation fulfill]; >+ }]; >+ >+ RTCLogError("Horrible error"); >+ >+ [self waitForExpectations:@[ callbackExpectation ] timeout:10.0]; >+} >+ > - (void)testCallbackDoesNotGetCalledForOtherLevels { > self.logger.severity = RTCLoggingSeverityError; > >@@ -66,12 +83,76 @@ - (void)testCallbackDoesNotGetCalledForOtherLevels { > [self waitForExpectations:@[ callbackExpectation ] timeout:10.0]; > } > >+- (void)testCallbackWithSeverityDoesNotGetCalledForOtherLevels { >+ self.logger.severity = RTCLoggingSeverityError; >+ >+ XCTestExpectation *callbackExpectation = [self expectationWithDescription:@"callbackError"]; >+ >+ [self.logger >+ startWithMessageAndSeverityHandler:^(NSString *message, RTCLoggingSeverity severity) { >+ XCTAssertTrue([message hasSuffix:@"Horrible error\n"]); >+ XCTAssertEqual(severity, RTCLoggingSeverityError); >+ [callbackExpectation fulfill]; >+ }]; >+ >+ RTCLogInfo("Just some info"); >+ 
RTCLogWarning("Warning warning"); >+ RTCLogError("Horrible error"); >+ >+ [self waitForExpectations:@[ callbackExpectation ] timeout:10.0]; >+} >+ >+- (void)testCallbackDoesNotgetCalledForSeverityNone { >+ self.logger.severity = RTCLoggingSeverityNone; >+ >+ XCTestExpectation *callbackExpectation = [self expectationWithDescription:@"unexpectedCallback"]; >+ >+ [self.logger start:^(NSString *message) { >+ [callbackExpectation fulfill]; >+ XCTAssertTrue(false); >+ }]; >+ >+ RTCLogInfo("Just some info"); >+ RTCLogWarning("Warning warning"); >+ RTCLogError("Horrible error"); >+ >+ XCTWaiter *waiter = [[XCTWaiter alloc] init]; >+ XCTWaiterResult result = [waiter waitForExpectations:@[ callbackExpectation ] timeout:1.0]; >+ XCTAssertEqual(result, XCTWaiterResultTimedOut); >+} >+ >+- (void)testCallbackWithSeverityDoesNotgetCalledForSeverityNone { >+ self.logger.severity = RTCLoggingSeverityNone; >+ >+ XCTestExpectation *callbackExpectation = [self expectationWithDescription:@"unexpectedCallback"]; >+ >+ [self.logger >+ startWithMessageAndSeverityHandler:^(NSString *message, RTCLoggingSeverity severity) { >+ [callbackExpectation fulfill]; >+ XCTAssertTrue(false); >+ }]; >+ >+ RTCLogInfo("Just some info"); >+ RTCLogWarning("Warning warning"); >+ RTCLogError("Horrible error"); >+ >+ XCTWaiter *waiter = [[XCTWaiter alloc] init]; >+ XCTWaiterResult result = [waiter waitForExpectations:@[ callbackExpectation ] timeout:1.0]; >+ XCTAssertEqual(result, XCTWaiterResultTimedOut); >+} >+ > - (void)testStartingWithNilCallbackDoesNotCrash { > [self.logger start:nil]; > > RTCLogError("Horrible error"); > } > >+- (void)testStartingWithNilCallbackWithSeverityDoesNotCrash { >+ [self.logger startWithMessageAndSeverityHandler:nil]; >+ >+ RTCLogError("Horrible error"); >+} >+ > - (void)testStopCallbackLogger { > XCTestExpectation *callbackExpectation = [self expectationWithDescription:@"stopped"]; > >@@ -88,6 +169,23 @@ - (void)testStopCallbackLogger { > XCTAssertEqual(result, 
XCTWaiterResultTimedOut); > } > >+- (void)testStopCallbackWithSeverityLogger { >+ XCTestExpectation *callbackExpectation = [self expectationWithDescription:@"stopped"]; >+ >+ [self.logger >+ startWithMessageAndSeverityHandler:^(NSString *message, RTCLoggingSeverity loggingServerity) { >+ [callbackExpectation fulfill]; >+ }]; >+ >+ [self.logger stop]; >+ >+ RTCLogInfo("Just some info"); >+ >+ XCTWaiter *waiter = [[XCTWaiter alloc] init]; >+ XCTWaiterResult result = [waiter waitForExpectations:@[ callbackExpectation ] timeout:1.0]; >+ XCTAssertEqual(result, XCTWaiterResultTimedOut); >+} >+ > - (void)testDestroyingCallbackLogger { > XCTestExpectation *callbackExpectation = [self expectationWithDescription:@"destroyed"]; > >@@ -104,4 +202,43 @@ - (void)testDestroyingCallbackLogger { > XCTAssertEqual(result, XCTWaiterResultTimedOut); > } > >+- (void)testDestroyingCallbackWithSeverityLogger { >+ XCTestExpectation *callbackExpectation = [self expectationWithDescription:@"destroyed"]; >+ >+ [self.logger >+ startWithMessageAndSeverityHandler:^(NSString *message, RTCLoggingSeverity loggingServerity) { >+ [callbackExpectation fulfill]; >+ }]; >+ >+ self.logger = nil; >+ >+ RTCLogInfo("Just some info"); >+ >+ XCTWaiter *waiter = [[XCTWaiter alloc] init]; >+ XCTWaiterResult result = [waiter waitForExpectations:@[ callbackExpectation ] timeout:1.0]; >+ XCTAssertEqual(result, XCTWaiterResultTimedOut); >+} >+ >+- (void)testCallbackWithSeverityLoggerCannotStartTwice { >+ self.logger.severity = RTCLoggingSeverityWarning; >+ >+ XCTestExpectation *callbackExpectation = [self expectationWithDescription:@"callbackWarning"]; >+ >+ [self.logger >+ startWithMessageAndSeverityHandler:^(NSString *message, RTCLoggingSeverity loggingServerity) { >+ XCTAssertTrue([message hasSuffix:@"Horrible error\n"]); >+ XCTAssertEqual(loggingServerity, RTCLoggingSeverityError); >+ [callbackExpectation fulfill]; >+ }]; >+ >+ [self.logger start:^(NSString *message) { >+ [callbackExpectation fulfill]; >+ 
XCTAssertTrue(false); >+ }]; >+ >+ RTCLogError("Horrible error"); >+ >+ [self waitForExpectations:@[ callbackExpectation ] timeout:10.0]; >+} >+ > @end >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/RTCCertificateTest.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/RTCCertificateTest.mm >index 2585a278118e99bf375b401b13b84f0625d4e05f..5bf1eb3fe4707477ffb84223b554f46ae84d56c8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/RTCCertificateTest.mm >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/RTCCertificateTest.mm >@@ -71,7 +71,7 @@ - (void)testCertificateIsUsedInConfig { > > @end > >-TEST(CertificateTest, CertificateIsUsedInConfig) { >+TEST(CertificateTest, DISABLED_CertificateIsUsedInConfig) { > RTCCertificateTest *test = [[RTCCertificateTest alloc] init]; > [test testCertificateIsUsedInConfig]; > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/RTCConfigurationTest.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/RTCConfigurationTest.mm >index 1ef718ad4958de7d776ff59bb7433d05703a99fc..f31fcfd858d929f6064f87c48ef107a46e8bf8ec 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/RTCConfigurationTest.mm >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/RTCConfigurationTest.mm >@@ -50,6 +50,12 @@ - (void)testConversionToNativeConfiguration { > RTCContinualGatheringPolicyGatherContinually; > config.shouldPruneTurnPorts = YES; > config.iceRegatherIntervalRange = range; >+ config.cryptoOptions = [[RTCCryptoOptions alloc] initWithSrtpEnableGcmCryptoSuites:YES >+ srtpEnableAes128Sha1_32CryptoCipher:YES >+ srtpEnableEncryptedRtpHeaderExtensions:YES >+ sframeRequireFrameEncryption:YES]; >+ config.rtcpAudioReportIntervalMs = 2500; >+ config.rtcpVideoReportIntervalMs = 3750; > > std::unique_ptr<webrtc::PeerConnectionInterface::RTCConfiguration> > nativeConfig([config createNativeConfiguration]); >@@ 
-78,6 +84,12 @@ - (void)testConversionToNativeConfiguration { > EXPECT_EQ(true, nativeConfig->prune_turn_ports); > EXPECT_EQ(range.min, nativeConfig->ice_regather_interval_range->min()); > EXPECT_EQ(range.max, nativeConfig->ice_regather_interval_range->max()); >+ EXPECT_EQ(true, nativeConfig->crypto_options->srtp.enable_gcm_crypto_suites); >+ EXPECT_EQ(true, nativeConfig->crypto_options->srtp.enable_aes128_sha1_32_crypto_cipher); >+ EXPECT_EQ(true, nativeConfig->crypto_options->srtp.enable_encrypted_rtp_header_extensions); >+ EXPECT_EQ(true, nativeConfig->crypto_options->sframe.require_frame_encryption); >+ EXPECT_EQ(2500, nativeConfig->audio_rtcp_report_interval_ms()); >+ EXPECT_EQ(3750, nativeConfig->video_rtcp_report_interval_ms()); > } > > - (void)testNativeConversionToConfiguration { >@@ -103,6 +115,12 @@ - (void)testNativeConversionToConfiguration { > RTCContinualGatheringPolicyGatherContinually; > config.shouldPruneTurnPorts = YES; > config.iceRegatherIntervalRange = range; >+ config.cryptoOptions = [[RTCCryptoOptions alloc] initWithSrtpEnableGcmCryptoSuites:YES >+ srtpEnableAes128Sha1_32CryptoCipher:NO >+ srtpEnableEncryptedRtpHeaderExtensions:NO >+ sframeRequireFrameEncryption:NO]; >+ config.rtcpAudioReportIntervalMs = 1500; >+ config.rtcpVideoReportIntervalMs = 2150; > > webrtc::PeerConnectionInterface::RTCConfiguration *nativeConfig = > [config createNativeConfiguration]; >@@ -130,6 +148,21 @@ - (void)testNativeConversionToConfiguration { > EXPECT_EQ(config.shouldPruneTurnPorts, newConfig.shouldPruneTurnPorts); > EXPECT_EQ(config.iceRegatherIntervalRange.min, newConfig.iceRegatherIntervalRange.min); > EXPECT_EQ(config.iceRegatherIntervalRange.max, newConfig.iceRegatherIntervalRange.max); >+ EXPECT_EQ(config.cryptoOptions.srtpEnableGcmCryptoSuites, >+ newConfig.cryptoOptions.srtpEnableGcmCryptoSuites); >+ EXPECT_EQ(config.cryptoOptions.srtpEnableAes128Sha1_32CryptoCipher, >+ newConfig.cryptoOptions.srtpEnableAes128Sha1_32CryptoCipher); >+ 
EXPECT_EQ(config.cryptoOptions.srtpEnableEncryptedRtpHeaderExtensions, >+ newConfig.cryptoOptions.srtpEnableEncryptedRtpHeaderExtensions); >+ EXPECT_EQ(config.cryptoOptions.sframeRequireFrameEncryption, >+ newConfig.cryptoOptions.sframeRequireFrameEncryption); >+ EXPECT_EQ(config.rtcpAudioReportIntervalMs, newConfig.rtcpAudioReportIntervalMs); >+ EXPECT_EQ(config.rtcpVideoReportIntervalMs, newConfig.rtcpVideoReportIntervalMs); >+} >+ >+- (void)testDefaultValues { >+ RTCConfiguration *config = [[RTCConfiguration alloc] init]; >+ EXPECT_EQ(config.cryptoOptions, nil); > } > > @end >@@ -139,5 +172,6 @@ TEST(RTCConfigurationTest, NativeConfigurationConversionTest) { > RTCConfigurationTest *test = [[RTCConfigurationTest alloc] init]; > [test testConversionToNativeConfiguration]; > [test testNativeConversionToConfiguration]; >+ [test testDefaultValues]; > } > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/RTCPeerConnectionFactoryBuilderTest.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/RTCPeerConnectionFactoryBuilderTest.mm >index 9bb5179521f34a161cce3e15f7623ffd57ed5804..5f889a699da610851190043e3acfdfd767bd9a06 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/RTCPeerConnectionFactoryBuilderTest.mm >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/RTCPeerConnectionFactoryBuilderTest.mm >@@ -22,6 +22,7 @@ extern "C" { > > #include "api/audio_codecs/builtin_audio_decoder_factory.h" > #include "api/audio_codecs/builtin_audio_encoder_factory.h" >+#include "api/media_transport_interface.h" > #include "api/video_codecs/video_decoder_factory.h" > #include "api/video_codecs/video_encoder_factory.h" > #include "modules/audio_device/include/audio_device.h" >@@ -49,7 +50,8 @@ - (void)testBuilder { > nativeVideoEncoderFactory:nullptr > nativeVideoDecoderFactory:nullptr > audioDeviceModule:nullptr >- audioProcessingModule:nullptr]); >+ audioProcessingModule:nullptr >+ 
mediaTransportFactory:nullptr]); > #endif > RTCPeerConnectionFactoryBuilder* builder = [[RTCPeerConnectionFactoryBuilder alloc] init]; > RTCPeerConnectionFactory* peerConnectionFactory = [builder createPeerConnectionFactory]; >@@ -69,7 +71,8 @@ - (void)testDefaultComponentsBuilder { > nativeVideoEncoderFactory:nullptr > nativeVideoDecoderFactory:nullptr > audioDeviceModule:nullptr >- audioProcessingModule:nullptr]); >+ audioProcessingModule:nullptr >+ mediaTransportFactory:nullptr]); > #endif > RTCPeerConnectionFactoryBuilder* builder = [RTCPeerConnectionFactoryBuilder defaultBuilder]; > RTCPeerConnectionFactory* peerConnectionFactory = [builder createPeerConnectionFactory]; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/RTCPeerConnectionTest.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/RTCPeerConnectionTest.mm >index 28fdee6a2bd6ed81e0700b4ec110a24b2ca0a3cd..35322587996d98ac6769b0655a648b19a3ad9d3a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/RTCPeerConnectionTest.mm >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/RTCPeerConnectionTest.mm >@@ -16,6 +16,7 @@ > > #import "api/peerconnection/RTCConfiguration+Private.h" > #import "api/peerconnection/RTCConfiguration.h" >+#import "api/peerconnection/RTCCryptoOptions.h" > #import "api/peerconnection/RTCIceServer.h" > #import "api/peerconnection/RTCMediaConstraints.h" > #import "api/peerconnection/RTCPeerConnection.h" >@@ -50,6 +51,10 @@ - (void)testConfigurationGetter { > RTCContinualGatheringPolicyGatherContinually; > config.shouldPruneTurnPorts = YES; > config.activeResetSrtpParams = YES; >+ config.cryptoOptions = [[RTCCryptoOptions alloc] initWithSrtpEnableGcmCryptoSuites:YES >+ srtpEnableAes128Sha1_32CryptoCipher:YES >+ srtpEnableEncryptedRtpHeaderExtensions:NO >+ sframeRequireFrameEncryption:NO]; > > RTCMediaConstraints *contraints = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:@{} > 
optionalConstraints:nil]; >@@ -89,6 +94,14 @@ - (void)testConfigurationGetter { > EXPECT_EQ(config.continualGatheringPolicy, newConfig.continualGatheringPolicy); > EXPECT_EQ(config.shouldPruneTurnPorts, newConfig.shouldPruneTurnPorts); > EXPECT_EQ(config.activeResetSrtpParams, newConfig.activeResetSrtpParams); >+ EXPECT_EQ(config.cryptoOptions.srtpEnableGcmCryptoSuites, >+ newConfig.cryptoOptions.srtpEnableGcmCryptoSuites); >+ EXPECT_EQ(config.cryptoOptions.srtpEnableAes128Sha1_32CryptoCipher, >+ newConfig.cryptoOptions.srtpEnableAes128Sha1_32CryptoCipher); >+ EXPECT_EQ(config.cryptoOptions.srtpEnableEncryptedRtpHeaderExtensions, >+ newConfig.cryptoOptions.srtpEnableEncryptedRtpHeaderExtensions); >+ EXPECT_EQ(config.cryptoOptions.sframeRequireFrameEncryption, >+ newConfig.cryptoOptions.sframeRequireFrameEncryption); > } > > @end >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/main.m b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/main.m >deleted file mode 100644 >index 217fb4f479c056e7c2a3fa28d8ca57f586dca4ac..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/main.m >+++ /dev/null >@@ -1,18 +0,0 @@ >-/* >- * Copyright 2017 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. 
>- */ >- >-#import <Foundation/Foundation.h> >-#import <UIKit/UIKit.h> >- >-int main(int argc, char* argv[]) { >- @autoreleasepool { >- return UIApplicationMain(argc, argv, nil, nil); >- } >-} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/main.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/main.mm >new file mode 100644 >index 0000000000000000000000000000000000000000..77a88a64d3d74e35da8ee055e788010823d6fa6e >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/main.mm >@@ -0,0 +1,21 @@ >+/* >+ * Copyright 2017 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#import <Foundation/Foundation.h> >+#import <UIKit/UIKit.h> >+#include "test/ios/coverage_util_ios.h" >+ >+int main(int argc, char* argv[]) { >+ rtc::test::ConfigureCoverageReportPath(); >+ >+ @autoreleasepool { >+ return UIApplicationMain(argc, argv, nil, nil); >+ } >+} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/nalu_rewriter_xctest.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/nalu_rewriter_xctest.mm >new file mode 100644 >index 0000000000000000000000000000000000000000..4b049901c51ab77656b9347b2ed301f0d342061f >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/nalu_rewriter_xctest.mm >@@ -0,0 +1,414 @@ >+/* >+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. 
All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include "common_video/h264/h264_common.h" >+#include "components/video_codec/nalu_rewriter.h" >+#include "rtc_base/arraysize.h" >+#include "rtc_base/gunit.h" >+ >+#import <XCTest/XCTest.h> >+ >+#if TARGET_OS_IPHONE >+#import <AVFoundation/AVFoundation.h> >+#import <UIKit/UIKit.h> >+#endif >+ >+@interface NaluRewriterTests : XCTestCase >+ >+@end >+ >+static const uint8_t NALU_TEST_DATA_0[] = {0xAA, 0xBB, 0xCC}; >+static const uint8_t NALU_TEST_DATA_1[] = {0xDE, 0xAD, 0xBE, 0xEF}; >+ >+// clang-format off >+static const uint8_t SPS_PPS_BUFFER[] = { >+ // SPS nalu. >+ 0x00, 0x00, 0x00, 0x01, 0x27, 0x42, 0x00, 0x1E, 0xAB, 0x40, 0xF0, 0x28, >+ 0xD3, 0x70, 0x20, 0x20, 0x20, 0x20, >+ // PPS nalu. >+ 0x00, 0x00, 0x00, 0x01, 0x28, 0xCE, 0x3C, 0x30}; >+// clang-format on >+ >+@implementation NaluRewriterTests >+ >+- (void)testCreateVideoFormatDescription { >+ CMVideoFormatDescriptionRef description = >+ webrtc::CreateVideoFormatDescription(SPS_PPS_BUFFER, arraysize(SPS_PPS_BUFFER)); >+ XCTAssertTrue(description); >+ if (description) { >+ CFRelease(description); >+ description = nullptr; >+ } >+ >+ // clang-format off >+ const uint8_t sps_pps_not_at_start_buffer[] = { >+ // Add some non-SPS/PPS NALUs at the beginning >+ 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x01, 0xFF, 0x00, 0x00, 0x00, 0x01, >+ 0xAB, 0x33, 0x21, >+ // SPS nalu. >+ 0x00, 0x00, 0x01, 0x27, 0x42, 0x00, 0x1E, 0xAB, 0x40, 0xF0, 0x28, 0xD3, >+ 0x70, 0x20, 0x20, 0x20, 0x20, >+ // PPS nalu. 
>+ 0x00, 0x00, 0x01, 0x28, 0xCE, 0x3C, 0x30}; >+ // clang-format on >+ description = webrtc::CreateVideoFormatDescription(sps_pps_not_at_start_buffer, >+ arraysize(sps_pps_not_at_start_buffer)); >+ >+ XCTAssertTrue(description); >+ >+ if (description) { >+ CFRelease(description); >+ description = nullptr; >+ } >+ >+ const uint8_t other_buffer[] = {0x00, 0x00, 0x00, 0x01, 0x28}; >+ XCTAssertFalse(webrtc::CreateVideoFormatDescription(other_buffer, arraysize(other_buffer))); >+} >+ >+- (void)testReadEmptyInput { >+ const uint8_t annex_b_test_data[] = {0x00}; >+ webrtc::AnnexBBufferReader reader(annex_b_test_data, 0); >+ const uint8_t* nalu = nullptr; >+ size_t nalu_length = 0; >+ XCTAssertEqual(0u, reader.BytesRemaining()); >+ XCTAssertFalse(reader.ReadNalu(&nalu, &nalu_length)); >+ XCTAssertEqual(nullptr, nalu); >+ XCTAssertEqual(0u, nalu_length); >+} >+ >+- (void)testReadSingleNalu { >+ const uint8_t annex_b_test_data[] = {0x00, 0x00, 0x00, 0x01, 0xAA}; >+ webrtc::AnnexBBufferReader reader(annex_b_test_data, arraysize(annex_b_test_data)); >+ const uint8_t* nalu = nullptr; >+ size_t nalu_length = 0; >+ XCTAssertEqual(arraysize(annex_b_test_data), reader.BytesRemaining()); >+ XCTAssertTrue(reader.ReadNalu(&nalu, &nalu_length)); >+ XCTAssertEqual(annex_b_test_data + 4, nalu); >+ XCTAssertEqual(1u, nalu_length); >+ XCTAssertEqual(0u, reader.BytesRemaining()); >+ XCTAssertFalse(reader.ReadNalu(&nalu, &nalu_length)); >+ XCTAssertEqual(nullptr, nalu); >+ XCTAssertEqual(0u, nalu_length); >+} >+ >+- (void)testReadSingleNalu3ByteHeader { >+ const uint8_t annex_b_test_data[] = {0x00, 0x00, 0x01, 0xAA}; >+ webrtc::AnnexBBufferReader reader(annex_b_test_data, arraysize(annex_b_test_data)); >+ const uint8_t* nalu = nullptr; >+ size_t nalu_length = 0; >+ XCTAssertEqual(arraysize(annex_b_test_data), reader.BytesRemaining()); >+ XCTAssertTrue(reader.ReadNalu(&nalu, &nalu_length)); >+ XCTAssertEqual(annex_b_test_data + 3, nalu); >+ XCTAssertEqual(1u, nalu_length); >+ 
XCTAssertEqual(0u, reader.BytesRemaining()); >+ XCTAssertFalse(reader.ReadNalu(&nalu, &nalu_length)); >+ XCTAssertEqual(nullptr, nalu); >+ XCTAssertEqual(0u, nalu_length); >+} >+ >+- (void)testReadMissingNalu { >+ // clang-format off >+ const uint8_t annex_b_test_data[] = {0x01, >+ 0x00, 0x01, >+ 0x00, 0x00, 0x00, 0xFF}; >+ // clang-format on >+ webrtc::AnnexBBufferReader reader(annex_b_test_data, arraysize(annex_b_test_data)); >+ const uint8_t* nalu = nullptr; >+ size_t nalu_length = 0; >+ XCTAssertEqual(0u, reader.BytesRemaining()); >+ XCTAssertFalse(reader.ReadNalu(&nalu, &nalu_length)); >+ XCTAssertEqual(nullptr, nalu); >+ XCTAssertEqual(0u, nalu_length); >+} >+ >+- (void)testReadMultipleNalus { >+ // clang-format off >+ const uint8_t annex_b_test_data[] = {0x00, 0x00, 0x00, 0x01, 0xFF, >+ 0x01, >+ 0x00, 0x01, >+ 0x00, 0x00, 0x00, 0xFF, >+ 0x00, 0x00, 0x01, 0xAA, 0xBB}; >+ // clang-format on >+ webrtc::AnnexBBufferReader reader(annex_b_test_data, arraysize(annex_b_test_data)); >+ const uint8_t* nalu = nullptr; >+ size_t nalu_length = 0; >+ XCTAssertEqual(arraysize(annex_b_test_data), reader.BytesRemaining()); >+ XCTAssertTrue(reader.ReadNalu(&nalu, &nalu_length)); >+ XCTAssertEqual(annex_b_test_data + 4, nalu); >+ XCTAssertEqual(8u, nalu_length); >+ XCTAssertEqual(5u, reader.BytesRemaining()); >+ XCTAssertTrue(reader.ReadNalu(&nalu, &nalu_length)); >+ XCTAssertEqual(annex_b_test_data + 15, nalu); >+ XCTAssertEqual(2u, nalu_length); >+ XCTAssertEqual(0u, reader.BytesRemaining()); >+ XCTAssertFalse(reader.ReadNalu(&nalu, &nalu_length)); >+ XCTAssertEqual(nullptr, nalu); >+ XCTAssertEqual(0u, nalu_length); >+} >+ >+- (void)testEmptyOutputBuffer { >+ const uint8_t expected_buffer[] = {0x00}; >+ const size_t buffer_size = 1; >+ std::unique_ptr<uint8_t[]> buffer(new uint8_t[buffer_size]); >+ memset(buffer.get(), 0, buffer_size); >+ webrtc::AvccBufferWriter writer(buffer.get(), 0); >+ XCTAssertEqual(0u, writer.BytesRemaining()); >+ 
XCTAssertFalse(writer.WriteNalu(NALU_TEST_DATA_0, arraysize(NALU_TEST_DATA_0))); >+ XCTAssertEqual(0, memcmp(expected_buffer, buffer.get(), arraysize(expected_buffer))); >+} >+ >+- (void)testWriteSingleNalu { >+ const uint8_t expected_buffer[] = { >+ 0x00, 0x00, 0x00, 0x03, 0xAA, 0xBB, 0xCC, >+ }; >+ const size_t buffer_size = arraysize(NALU_TEST_DATA_0) + 4; >+ std::unique_ptr<uint8_t[]> buffer(new uint8_t[buffer_size]); >+ webrtc::AvccBufferWriter writer(buffer.get(), buffer_size); >+ XCTAssertEqual(buffer_size, writer.BytesRemaining()); >+ XCTAssertTrue(writer.WriteNalu(NALU_TEST_DATA_0, arraysize(NALU_TEST_DATA_0))); >+ XCTAssertEqual(0u, writer.BytesRemaining()); >+ XCTAssertFalse(writer.WriteNalu(NALU_TEST_DATA_1, arraysize(NALU_TEST_DATA_1))); >+ XCTAssertEqual(0, memcmp(expected_buffer, buffer.get(), arraysize(expected_buffer))); >+} >+ >+- (void)testWriteMultipleNalus { >+ // clang-format off >+ const uint8_t expected_buffer[] = { >+ 0x00, 0x00, 0x00, 0x03, 0xAA, 0xBB, 0xCC, >+ 0x00, 0x00, 0x00, 0x04, 0xDE, 0xAD, 0xBE, 0xEF >+ }; >+ // clang-format on >+ const size_t buffer_size = arraysize(NALU_TEST_DATA_0) + arraysize(NALU_TEST_DATA_1) + 8; >+ std::unique_ptr<uint8_t[]> buffer(new uint8_t[buffer_size]); >+ webrtc::AvccBufferWriter writer(buffer.get(), buffer_size); >+ XCTAssertEqual(buffer_size, writer.BytesRemaining()); >+ XCTAssertTrue(writer.WriteNalu(NALU_TEST_DATA_0, arraysize(NALU_TEST_DATA_0))); >+ XCTAssertEqual(buffer_size - (arraysize(NALU_TEST_DATA_0) + 4), writer.BytesRemaining()); >+ XCTAssertTrue(writer.WriteNalu(NALU_TEST_DATA_1, arraysize(NALU_TEST_DATA_1))); >+ XCTAssertEqual(0u, writer.BytesRemaining()); >+ XCTAssertEqual(0, memcmp(expected_buffer, buffer.get(), arraysize(expected_buffer))); >+} >+ >+- (void)testOverflow { >+ const uint8_t expected_buffer[] = {0x00, 0x00, 0x00}; >+ const size_t buffer_size = arraysize(NALU_TEST_DATA_0); >+ std::unique_ptr<uint8_t[]> buffer(new uint8_t[buffer_size]); >+ memset(buffer.get(), 0, 
buffer_size); >+ webrtc::AvccBufferWriter writer(buffer.get(), buffer_size); >+ XCTAssertEqual(buffer_size, writer.BytesRemaining()); >+ XCTAssertFalse(writer.WriteNalu(NALU_TEST_DATA_0, arraysize(NALU_TEST_DATA_0))); >+ XCTAssertEqual(buffer_size, writer.BytesRemaining()); >+ XCTAssertEqual(0, memcmp(expected_buffer, buffer.get(), arraysize(expected_buffer))); >+} >+ >+- (void)testH264AnnexBBufferToCMSampleBuffer { >+ // clang-format off >+ const uint8_t annex_b_test_data[] = { >+ 0x00, >+ 0x00, 0x00, 0x01, >+ 0x01, 0x00, 0x00, 0xFF, // first chunk, 4 bytes >+ 0x00, 0x00, 0x01, >+ 0xAA, 0xFF, // second chunk, 2 bytes >+ 0x00, 0x00, 0x01, >+ 0xBB}; // third chunk, 1 byte, will not fit into output array >+ >+ const uint8_t expected_cmsample_data[] = { >+ 0x00, 0x00, 0x00, 0x04, >+ 0x01, 0x00, 0x00, 0xFF, // first chunk, 4 bytes >+ 0x00, 0x00, 0x00, 0x02, >+ 0xAA, 0xFF}; // second chunk, 2 bytes >+ // clang-format on >+ >+ CMMemoryPoolRef memory_pool = CMMemoryPoolCreate(nil); >+ CMSampleBufferRef out_sample_buffer = nil; >+ CMVideoFormatDescriptionRef description = [self createDescription]; >+ >+ Boolean result = webrtc::H264AnnexBBufferToCMSampleBuffer(annex_b_test_data, >+ arraysize(annex_b_test_data), >+ description, >+ &out_sample_buffer, >+ memory_pool); >+ >+ XCTAssertTrue(result); >+ >+ XCTAssertEqual(description, CMSampleBufferGetFormatDescription(out_sample_buffer)); >+ >+ char* data_ptr = nullptr; >+ CMBlockBufferRef block_buffer = CMSampleBufferGetDataBuffer(out_sample_buffer); >+ size_t block_buffer_size = CMBlockBufferGetDataLength(block_buffer); >+ CMBlockBufferGetDataPointer(block_buffer, 0, nullptr, nullptr, &data_ptr); >+ XCTAssertEqual(block_buffer_size, arraysize(annex_b_test_data)); >+ >+ int data_comparison_result = >+ memcmp(expected_cmsample_data, data_ptr, arraysize(expected_cmsample_data)); >+ >+ XCTAssertEqual(0, data_comparison_result); >+ >+ if (description) { >+ CFRelease(description); >+ description = nullptr; >+ } >+ >+ 
CMMemoryPoolInvalidate(memory_pool); >+ CFRelease(memory_pool); >+} >+ >+- (void)testH264CMSampleBufferToAnnexBBuffer { >+ // clang-format off >+ const uint8_t cmsample_data[] = { >+ 0x00, 0x00, 0x00, 0x04, >+ 0x01, 0x00, 0x00, 0xFF, // first chunk, 4 bytes >+ 0x00, 0x00, 0x00, 0x02, >+ 0xAA, 0xFF}; // second chunk, 2 bytes >+ >+ const uint8_t expected_annex_b_data[] = { >+ 0x00, 0x00, 0x00, 0x01, >+ 0x01, 0x00, 0x00, 0xFF, // first chunk, 4 bytes >+ 0x00, 0x00, 0x00, 0x01, >+ 0xAA, 0xFF}; // second chunk, 2 bytes >+ // clang-format on >+ >+ rtc::Buffer annexb_buffer(arraysize(cmsample_data)); >+ std::unique_ptr<webrtc::RTPFragmentationHeader> out_header_ptr; >+ CMSampleBufferRef sample_buffer = >+ [self createCMSampleBufferRef:(void*)cmsample_data cmsampleSize:arraysize(cmsample_data)]; >+ >+ Boolean result = webrtc::H264CMSampleBufferToAnnexBBuffer(sample_buffer, >+ /* is_keyframe */ false, >+ &annexb_buffer, >+ &out_header_ptr); >+ >+ XCTAssertTrue(result); >+ >+ XCTAssertEqual(arraysize(expected_annex_b_data), annexb_buffer.size()); >+ >+ int data_comparison_result = >+ memcmp(expected_annex_b_data, annexb_buffer.data(), arraysize(expected_annex_b_data)); >+ >+ XCTAssertEqual(0, data_comparison_result); >+ >+ webrtc::RTPFragmentationHeader* out_header = out_header_ptr.get(); >+ >+ XCTAssertEqual(2, (int)out_header->Size()); >+ >+ XCTAssertEqual(4, (int)out_header->Offset(0)); >+ XCTAssertEqual(4, (int)out_header->Length(0)); >+ XCTAssertEqual(0, (int)out_header->TimeDiff(0)); >+ XCTAssertEqual(0, (int)out_header->PayloadType(0)); >+ >+ XCTAssertEqual(12, (int)out_header->Offset(1)); >+ XCTAssertEqual(2, (int)out_header->Length(1)); >+ XCTAssertEqual(0, (int)out_header->TimeDiff(1)); >+ XCTAssertEqual(0, (int)out_header->PayloadType(1)); >+} >+ >+- (void)testH264CMSampleBufferToAnnexBBufferWithKeyframe { >+ // clang-format off >+ const uint8_t cmsample_data[] = { >+ 0x00, 0x00, 0x00, 0x04, >+ 0x01, 0x00, 0x00, 0xFF, // first chunk, 4 bytes >+ 0x00, 0x00, 0x00, 
0x02, >+ 0xAA, 0xFF}; // second chunk, 2 bytes >+ >+ const uint8_t expected_annex_b_data[] = { >+ 0x00, 0x00, 0x00, 0x01, >+ 0x01, 0x00, 0x00, 0xFF, // first chunk, 4 bytes >+ 0x00, 0x00, 0x00, 0x01, >+ 0xAA, 0xFF}; // second chunk, 2 bytes >+ // clang-format on >+ >+ rtc::Buffer annexb_buffer(arraysize(cmsample_data)); >+ std::unique_ptr<webrtc::RTPFragmentationHeader> out_header_ptr; >+ CMSampleBufferRef sample_buffer = >+ [self createCMSampleBufferRef:(void*)cmsample_data cmsampleSize:arraysize(cmsample_data)]; >+ >+ Boolean result = webrtc::H264CMSampleBufferToAnnexBBuffer(sample_buffer, >+ /* is_keyframe */ true, >+ &annexb_buffer, >+ &out_header_ptr); >+ >+ XCTAssertTrue(result); >+ >+ XCTAssertEqual(arraysize(SPS_PPS_BUFFER) + arraysize(expected_annex_b_data), >+ annexb_buffer.size()); >+ >+ XCTAssertEqual(0, memcmp(SPS_PPS_BUFFER, annexb_buffer.data(), arraysize(SPS_PPS_BUFFER))); >+ >+ XCTAssertEqual(0, >+ memcmp(expected_annex_b_data, >+ annexb_buffer.data() + arraysize(SPS_PPS_BUFFER), >+ arraysize(expected_annex_b_data))); >+ >+ webrtc::RTPFragmentationHeader* out_header = out_header_ptr.get(); >+ >+ XCTAssertEqual(4, (int)out_header->Size()); >+ >+ XCTAssertEqual(4, (int)out_header->Offset(0)); >+ XCTAssertEqual(14, (int)out_header->Length(0)); >+ XCTAssertEqual(0, (int)out_header->TimeDiff(0)); >+ XCTAssertEqual(0, (int)out_header->PayloadType(0)); >+ >+ XCTAssertEqual(22, (int)out_header->Offset(1)); >+ XCTAssertEqual(4, (int)out_header->Length(1)); >+ XCTAssertEqual(0, (int)out_header->TimeDiff(1)); >+ XCTAssertEqual(0, (int)out_header->PayloadType(1)); >+ >+ XCTAssertEqual(30, (int)out_header->Offset(2)); >+ XCTAssertEqual(4, (int)out_header->Length(2)); >+ XCTAssertEqual(0, (int)out_header->TimeDiff(2)); >+ XCTAssertEqual(0, (int)out_header->PayloadType(2)); >+ >+ XCTAssertEqual(38, (int)out_header->Offset(3)); >+ XCTAssertEqual(2, (int)out_header->Length(3)); >+ XCTAssertEqual(0, (int)out_header->TimeDiff(3)); >+ XCTAssertEqual(0, 
(int)out_header->PayloadType(3)); >+} >+ >+- (CMVideoFormatDescriptionRef)createDescription { >+ CMVideoFormatDescriptionRef description = >+ webrtc::CreateVideoFormatDescription(SPS_PPS_BUFFER, arraysize(SPS_PPS_BUFFER)); >+ XCTAssertTrue(description); >+ return description; >+} >+ >+- (CMSampleBufferRef)createCMSampleBufferRef:(void*)cmsampleData cmsampleSize:(size_t)cmsampleSize { >+ CMSampleBufferRef sample_buffer = nil; >+ OSStatus status; >+ >+ CMVideoFormatDescriptionRef description = [self createDescription]; >+ CMBlockBufferRef block_buffer = nullptr; >+ >+ status = CMBlockBufferCreateWithMemoryBlock(nullptr, >+ cmsampleData, >+ cmsampleSize, >+ nullptr, >+ nullptr, >+ 0, >+ cmsampleSize, >+ kCMBlockBufferAssureMemoryNowFlag, >+ &block_buffer); >+ >+ status = CMSampleBufferCreate(nullptr, >+ block_buffer, >+ true, >+ nullptr, >+ nullptr, >+ description, >+ 1, >+ 0, >+ nullptr, >+ 0, >+ nullptr, >+ &sample_buffer); >+ >+ return sample_buffer; >+} >+ >+@end >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/objc_video_encoder_factory_tests.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/objc_video_encoder_factory_tests.mm >index d009d51bba3154547c2bc14eed2caf63066265b0..50c30c0c7442753371607a01fb0fe5deea2fe9fe 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/objc_video_encoder_factory_tests.mm >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/unittests/objc_video_encoder_factory_tests.mm >@@ -82,7 +82,6 @@ TEST(ObjCVideoEncoderFactoryTest, EncodeReturnsOKOnSuccess) { > webrtc::VideoFrame frame(buffer, webrtc::kVideoRotation_0, 0); > webrtc::CodecSpecificInfo info; > info.codecType = webrtc::kVideoCodecH264; >- info.codec_name = "H264"; > std::vector<webrtc::FrameType> frame_types; > > EXPECT_EQ(encoder->Encode(frame, &info, &frame_types), WEBRTC_VIDEO_CODEC_OK); >@@ -99,7 +98,6 @@ TEST(ObjCVideoEncoderFactoryTest, EncodeReturnsErrorOnFail) { > webrtc::VideoFrame frame(buffer, 
webrtc::kVideoRotation_0, 0); > webrtc::CodecSpecificInfo info; > info.codecType = webrtc::kVideoCodecH264; >- info.codec_name = "H264"; > std::vector<webrtc::FrameType> frame_types; > > EXPECT_EQ(encoder->Encode(frame, &info, &frame_types), WEBRTC_VIDEO_CODEC_ERROR); >@@ -117,12 +115,6 @@ TEST(ObjCVideoEncoderFactoryTest, ReleaseEncodeReturnsErrorOnFail) { > EXPECT_EQ(encoder->Release(), WEBRTC_VIDEO_CODEC_ERROR); > } > >-TEST(ObjCVideoEncoderFactoryTest, SetChannelParametersAlwaysReturnsOK) { >- std::unique_ptr<webrtc::VideoEncoder> encoder = GetObjCEncoder(CreateErrorEncoderFactory()); >- >- EXPECT_EQ(encoder->SetChannelParameters(1, 1), WEBRTC_VIDEO_CODEC_OK); >-} >- > TEST(ObjCVideoEncoderFactoryTest, SetRatesReturnsOKOnSuccess) { > std::unique_ptr<webrtc::VideoEncoder> encoder = GetObjCEncoder(CreateOKEncoderFactory()); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/stats/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/stats/BUILD.gn >index de300f18cf2faf3b94b6f5900fed5f7bbd152778..c67a5be3654de6224442ea13f9548a70cc3558f6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/stats/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/stats/BUILD.gn >@@ -40,6 +40,7 @@ rtc_source_set("rtc_stats_test_utils") { > deps = [ > ":rtc_stats", > "../api:rtc_stats_api", >+ "../rtc_base/system:rtc_export", > ] > } > >@@ -56,9 +57,9 @@ if (rtc_include_tests) { > ":rtc_stats_test_utils", > "../api:rtc_stats_api", > "../rtc_base:checks", >+ "../rtc_base:gunit_helpers", > "../rtc_base:rtc_base_approved", > "../rtc_base:rtc_base_tests_main", >- "../rtc_base:rtc_base_tests_utils", > "../rtc_base:rtc_json", > ] > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/stats/rtcstats_objects.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/stats/rtcstats_objects.cc >index dcc1180f16f10ffc285adbfe6702f166c71ea851..cd52a55971beba3efff69057aae2e6206fd5723a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/stats/rtcstats_objects.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/stats/rtcstats_objects.cc >@@ -374,7 +374,9 @@ WEBRTC_RTCSTATS_IMPL(RTCMediaStreamTrackStats, RTCStats, "track", > &total_samples_received, > &total_samples_duration, > &concealed_samples, >- &concealment_events); >+ &concealment_events, >+ &jitter_buffer_flushes, >+ &delayed_packet_outage_samples); > // clang-format on > > RTCMediaStreamTrackStats::RTCMediaStreamTrackStats(const std::string& id, >@@ -410,7 +412,9 @@ RTCMediaStreamTrackStats::RTCMediaStreamTrackStats(std::string&& id, > total_samples_received("totalSamplesReceived"), > total_samples_duration("totalSamplesDuration"), > concealed_samples("concealedSamples"), >- concealment_events("concealmentEvents") { >+ concealment_events("concealmentEvents"), >+ jitter_buffer_flushes("jitterBufferFlushes"), >+ delayed_packet_outage_samples("delayedPacketOutageSamples") { > RTC_DCHECK(kind == RTCMediaStreamTrackKind::kAudio || > kind == RTCMediaStreamTrackKind::kVideo); > } >@@ -442,7 +446,9 @@ RTCMediaStreamTrackStats::RTCMediaStreamTrackStats( > total_samples_received(other.total_samples_received), > total_samples_duration(other.total_samples_duration), > concealed_samples(other.concealed_samples), >- concealment_events(other.concealment_events) {} >+ concealment_events(other.concealment_events), >+ jitter_buffer_flushes(other.jitter_buffer_flushes), >+ delayed_packet_outage_samples(other.delayed_packet_outage_samples) {} > > RTCMediaStreamTrackStats::~RTCMediaStreamTrackStats() {} > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/stats/test/rtcteststats.h b/Source/ThirdParty/libwebrtc/Source/webrtc/stats/test/rtcteststats.h >index 7c8e1051ff10ace4568c25749a92d796bbcad8a5..a7d2c91da061d2452251b860ff02c0a5d129c7a1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/stats/test/rtcteststats.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/stats/test/rtcteststats.h >@@ -15,10 +15,11 @@ > #include <vector> > > #include "api/stats/rtcstats.h" >+#include 
"rtc_base/system/rtc_export.h" > > namespace webrtc { > >-class RTCTestStats : public RTCStats { >+class RTC_EXPORT RTCTestStats : public RTCStats { > public: > WEBRTC_RTCSTATS_DECL(); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/style-guide.md b/Source/ThirdParty/libwebrtc/Source/webrtc/style-guide.md >index 2a35fdc5d1a476ed75692e2101481e367683e7ab..391c45b644d041a8942401ad3debd6de84161cb4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/style-guide.md >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/style-guide.md >@@ -77,6 +77,18 @@ instead of | use > > See [the source](api/array_view.h) for more detailed docs. > >+### `absl::optional<T>` as function argument >+ >+`absl::optional<T>` is generally a good choice when you want to pass a >+possibly missing `T` to a function—provided of course that `T` >+is a type that it makes sense to pass by value. >+ >+However, when you want to avoid pass-by-value, generally **do not pass >+`const absl::optional<T>&`; use `const T*` instead.** `const >+absl::optional<T>&` forces the caller to store the `T` in an >+`absl::optional<T>`; `const T*`, on the other hand, makes no >+assumptions about how the `T` is stored. 
>+ > ### sigslot > > sigslot is a lightweight library that adds a signal/slot language >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/BUILD.gn >index eb6005226593b4eea44b19fd94afe830b8391dbc..da83d12e8db1919428a49391ab1146a40f8ef3b2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/BUILD.gn >@@ -33,7 +33,7 @@ rtc_static_library("system_wrappers") { > libs = [] > deps = [ > ":cpu_features_api", >- "..:webrtc_common", >+ "../api:array_view", > "../modules:module_api_public", > "../rtc_base:checks", > "../rtc_base/synchronization:rw_lock_wrapper", >@@ -89,9 +89,6 @@ rtc_source_set("cpu_features_api") { > sources = [ > "include/cpu_features_wrapper.h", > ] >- deps = [ >- "..:webrtc_common", >- ] > } > > rtc_source_set("field_trial") { >@@ -105,6 +102,16 @@ rtc_source_set("field_trial") { > if (rtc_exclude_field_trial_default) { > defines = [ "WEBRTC_EXCLUDE_FIELD_TRIAL_DEFAULT" ] > } >+ if (build_with_chromium) { >+ # When WebRTC is built as part of Chromium it should exclude the default >+ # implementation of field_trial unless it is building for NACL or >+ # Chromecast. 
>+ if (!is_nacl && !is_chromecast) { >+ deps = [ >+ "../../webrtc_overrides:field_trial", >+ ] >+ } >+ } > } > > rtc_source_set("metrics") { >@@ -123,6 +130,9 @@ rtc_source_set("metrics") { > "../rtc_base:checks", > "../rtc_base:rtc_base_approved", > ] >+ if (build_with_chromium) { >+ deps += [ "../../webrtc_overrides:metrics" ] >+ } > } > > if (is_android && !build_with_mozilla) { >@@ -166,6 +176,7 @@ if (rtc_include_tests) { > "..:webrtc_common", > "../rtc_base:rtc_base_approved", > "../test:test_main", >+ "../test:test_support", > "//testing/gtest", > ] > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/OWNERS b/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/OWNERS >index 13b08a36ef8f482b49b6aff2a284af0d310275fd..a08f6114b9275df12de45c15efef24fef99d83df 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/OWNERS >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/OWNERS >@@ -2,7 +2,6 @@ henrika@webrtc.org > mflodman@webrtc.org > niklas.enbom@webrtc.org > nisse@webrtc.org >-perkj@webrtc.org > > # These are for the common case of adding or renaming files. If you're doing > # structural changes, please get a review from a reviewer in this file. 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/include/clock.h b/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/include/clock.h >index 4b6eab8edcebe598a3b8dcad83081ab243b2f2c5..f1fc11f17298e8762c93ccbdc6846043a21b14de 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/include/clock.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/include/clock.h >@@ -11,6 +11,7 @@ > #ifndef SYSTEM_WRAPPERS_INCLUDE_CLOCK_H_ > #define SYSTEM_WRAPPERS_INCLUDE_CLOCK_H_ > >+#include <stdint.h> > #include <memory> > > #include "rtc_base/synchronization/rw_lock_wrapper.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/include/event_wrapper.h b/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/include/event_wrapper.h >index 0531ddb5638be0ece8456e98f6c3f421d59f570e..989e7929b39a08182301110ae4f2b522f991b4eb 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/include/event_wrapper.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/include/event_wrapper.h >@@ -12,11 +12,7 @@ > #define SYSTEM_WRAPPERS_INCLUDE_EVENT_WRAPPER_H_ > > namespace webrtc { >-enum EventTypeWrapper { >- kEventSignaled = 1, >- kEventError = 2, >- kEventTimeout = 3 >-}; >+enum EventTypeWrapper { kEventSignaled = 1, kEventTimeout = 2 }; > > #define WEBRTC_EVENT_INFINITE 0xffffffff > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/include/metrics.h b/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/include/metrics.h >index 2a2cda02d03f59a34d621088ea8ea9c06c455012..f00ecf2622aefd817717086b63ab4eb4d9b33929 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/include/metrics.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/include/metrics.h >@@ -15,7 +15,6 @@ > #include <memory> > #include <string> > >-#include "common_types.h" // NOLINT(build/include) > #include "rtc_base/atomicops.h" > #include "rtc_base/checks.h" > 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/include/rtp_to_ntp_estimator.h b/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/include/rtp_to_ntp_estimator.h >index d7009d8b1f4504db5d7283bfa79d92e2889f445c..c244c4ff2745209bfd48ec45d9c8b6434aefce6a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/include/rtp_to_ntp_estimator.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/include/rtp_to_ntp_estimator.h >@@ -11,10 +11,12 @@ > #ifndef SYSTEM_WRAPPERS_INCLUDE_RTP_TO_NTP_ESTIMATOR_H_ > #define SYSTEM_WRAPPERS_INCLUDE_RTP_TO_NTP_ESTIMATOR_H_ > >+#include <stdint.h> > #include <list> > > #include "absl/types/optional.h" > #include "modules/include/module_common_types_public.h" >+#include "rtc_base/checks.h" > #include "rtc_base/numerics/moving_median_filter.h" > #include "system_wrappers/include/ntp_time.h" > >@@ -40,23 +42,13 @@ class RtpToNtpEstimator { > > // Estimated parameters from RTP and NTP timestamp pairs in |measurements_|. > struct Parameters { >- // Implicit conversion from int because MovingMedianFilter returns 0 >- // internally if no samples are present. However, it should never happen as >- // we don't ask smoothing_filter_ to return anything if there were no >- // samples. >- Parameters(const int& value) { // NOLINT >- RTC_NOTREACHED(); >- } > Parameters() : frequency_khz(0.0), offset_ms(0.0) {} > >+ Parameters(double frequency_khz, double offset_ms) >+ : frequency_khz(frequency_khz), offset_ms(offset_ms) {} >+ > double frequency_khz; > double offset_ms; >- >- // Needed to make it work inside MovingMedianFilter >- bool operator<(const Parameters& other) const; >- bool operator==(const Parameters& other) const; >- bool operator<=(const Parameters& other) const; >- bool operator!=(const Parameters& other) const; > }; > > // Updates measurements with RTP/NTP timestamp pair from a RTCP sender report. 
>@@ -68,7 +60,7 @@ class RtpToNtpEstimator { > > // Converts an RTP timestamp to the NTP domain in milliseconds. > // Returns true on success, false otherwise. >- bool Estimate(int64_t rtp_timestamp, int64_t* rtp_timestamp_ms) const; >+ bool Estimate(int64_t rtp_timestamp, int64_t* ntp_timestamp_ms) const; > > // Returns estimated rtp to ntp linear transform parameters. > const absl::optional<Parameters> params() const; >@@ -80,8 +72,7 @@ class RtpToNtpEstimator { > > int consecutive_invalid_samples_; > std::list<RtcpMeasurement> measurements_; >- MovingMedianFilter<Parameters> smoothing_filter_; >- bool params_calculated_; >+ absl::optional<Parameters> params_; > mutable TimestampUnwrapper unwrapper_; > }; > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/source/clock.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/source/clock.cc >index 35ab5f0878379ecf55c5bea85d9f1ac56e32036d..4f5d9cff1f3a3a73c39abb32adf7713da9b0e586 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/source/clock.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/source/clock.cc >@@ -17,6 +17,8 @@ > > #include <mmsystem.h> > >+#include "rtc_base/criticalsection.h" >+ > #elif defined(WEBRTC_POSIX) > > #include <sys/time.h> >@@ -24,7 +26,6 @@ > > #endif // defined(WEBRTC_POSIX) > >-#include "rtc_base/criticalsection.h" > #include "rtc_base/synchronization/rw_lock_wrapper.h" > #include "rtc_base/timeutils.h" > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/source/event.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/source/event.cc >index d1d2cdaff077cc3cbb3d4b27784b5a2d138c5d5e..0c4ce10157c21611169dbbeb73cd09318d590262 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/source/event.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/source/event.cc >@@ -14,9 +14,6 @@ > #include <windows.h> > #elif defined(WEBRTC_MAC) && 
!defined(WEBRTC_IOS) > #include <ApplicationServices/ApplicationServices.h> >-#include <pthread.h> >-#else >-#include <pthread.h> > #endif > > #include "rtc_base/event.h" >@@ -25,7 +22,6 @@ namespace webrtc { > > class EventWrapperImpl : public EventWrapper { > public: >- EventWrapperImpl() : event_(false, false) {} > ~EventWrapperImpl() override {} > > bool Set() override { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/source/field_trial.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/source/field_trial.cc >index 60158f48865bc51aafd7a371323524ab5c2feab3..ac7131144aef2b9713500fc363fa49bc361759e3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/source/field_trial.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/source/field_trial.cc >@@ -9,6 +9,7 @@ > > #include "system_wrappers/include/field_trial.h" > >+#include <stddef.h> > #include <string> > > // Simple field trial implementation, which allows client to >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/source/rtp_to_ntp_estimator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/source/rtp_to_ntp_estimator.cc >index 730c4f6691fccc14dcf240041cb240fd1f235002..4bbf6096d90c5cc164c7eb4c76f83b3b90fe4561 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/source/rtp_to_ntp_estimator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/source/rtp_to_ntp_estimator.cc >@@ -10,30 +10,23 @@ > > #include "system_wrappers/include/rtp_to_ntp_estimator.h" > >+#include <stddef.h> >+#include <cmath> >+#include <vector> >+ >+#include "api/array_view.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" >-#include "system_wrappers/include/clock.h" > > namespace webrtc { > namespace { >-// Number of RTCP SR reports to use to map between RTP and NTP. >-const size_t kNumRtcpReportsToUse = 2; >-// Number of parameters samples used to smooth. 
>-const size_t kNumSamplesToSmooth = 20; >- >-// Calculates the RTP timestamp frequency from two pairs of NTP/RTP timestamps. >-bool CalculateFrequency(int64_t ntp_ms1, >- uint32_t rtp_timestamp1, >- int64_t ntp_ms2, >- uint32_t rtp_timestamp2, >- double* frequency_khz) { >- if (ntp_ms1 <= ntp_ms2) >- return false; >- >- *frequency_khz = static_cast<double>(rtp_timestamp1 - rtp_timestamp2) / >- static_cast<double>(ntp_ms1 - ntp_ms2); >- return true; >-} >+// Maximum number of RTCP SR reports to use to map between RTP and NTP. >+const size_t kNumRtcpReportsToUse = 20; >+// Don't allow NTP timestamps to jump more than 1 hour. Chosen arbitrarily as >+// large enough not to affect normal use cases, yet smaller than the RTP >+// wrap-around half-period (a 90 kHz RTP clock wraps around every 13.25 hours). >+// After half the wrap-around period it is impossible to unwrap RTP timestamps >+// correctly. >+const int kMaxAllowedRtcpNtpIntervalMs = 60 * 60 * 1000; > > bool Contains(const std::list<RtpToNtpEstimator::RtcpMeasurement>& measurements, > const RtpToNtpEstimator::RtcpMeasurement& other) { >@@ -43,29 +36,47 @@ bool Contains(const std::list<RtpToNtpEstimator::RtcpMeasurement>& measurements, > } > return false; > } >-} // namespace > >-bool RtpToNtpEstimator::Parameters::operator<(const Parameters& other) const { >- if (frequency_khz < other.frequency_khz - 1e-6) { >- return true; >- } else if (frequency_khz > other.frequency_khz + 1e-6) { >+// Given x[] and y[], computes k and b such that the line y = k*x + b best >+// approximates the given points (least-squares method). 
>+bool LinearRegression(rtc::ArrayView<const double> x, >+ rtc::ArrayView<const double> y, >+ double* k, >+ double* b) { >+ size_t n = x.size(); >+ if (n < 2) >+ return false; >+ >+ if (y.size() != n) > return false; >- } else { >- return offset_ms < other.offset_ms - 1e-6; >+ >+ double avg_x = 0; >+ double avg_y = 0; >+ for (size_t i = 0; i < n; ++i) { >+ avg_x += x[i]; >+ avg_y += y[i]; >+ } >+ avg_x /= n; >+ avg_y /= n; >+ >+ double variance_x = 0; >+ double covariance_xy = 0; >+ for (size_t i = 0; i < n; ++i) { >+ double normalized_x = x[i] - avg_x; >+ double normalized_y = y[i] - avg_y; >+ variance_x += normalized_x * normalized_x; >+ covariance_xy += normalized_x * normalized_y; > } >-} > >-bool RtpToNtpEstimator::Parameters::operator==(const Parameters& other) const { >- return !(other < *this || *this < other); >-} >+ if (std::fabs(variance_x) < 1e-8) >+ return false; > >-bool RtpToNtpEstimator::Parameters::operator!=(const Parameters& other) const { >- return other < *this || *this < other; >+ *k = static_cast<double>(covariance_xy / variance_x); >+ *b = static_cast<double>(avg_y - (*k) * avg_x); >+ return true; > } > >-bool RtpToNtpEstimator::Parameters::operator<=(const Parameters& other) const { >- return !(other < *this); >-} >+} // namespace > > RtpToNtpEstimator::RtcpMeasurement::RtcpMeasurement(uint32_t ntp_secs, > uint32_t ntp_frac, >@@ -82,31 +93,29 @@ bool RtpToNtpEstimator::RtcpMeasurement::IsEqual( > } > > // Class for converting an RTP timestamp to the NTP domain. 
>-RtpToNtpEstimator::RtpToNtpEstimator() >- : consecutive_invalid_samples_(0), >- smoothing_filter_(kNumSamplesToSmooth), >- params_calculated_(false) {} >+RtpToNtpEstimator::RtpToNtpEstimator() : consecutive_invalid_samples_(0) {} > > RtpToNtpEstimator::~RtpToNtpEstimator() {} > > void RtpToNtpEstimator::UpdateParameters() { >- if (measurements_.size() != kNumRtcpReportsToUse) >+ if (measurements_.size() < 2) > return; > >- Parameters params; >- int64_t timestamp_new = measurements_.front().unwrapped_rtp_timestamp; >- int64_t timestamp_old = measurements_.back().unwrapped_rtp_timestamp; >- >- int64_t ntp_ms_new = measurements_.front().ntp_time.ToMs(); >- int64_t ntp_ms_old = measurements_.back().ntp_time.ToMs(); >+ std::vector<double> x; >+ std::vector<double> y; >+ x.reserve(measurements_.size()); >+ y.reserve(measurements_.size()); >+ for (auto it = measurements_.begin(); it != measurements_.end(); ++it) { >+ x.push_back(it->unwrapped_rtp_timestamp); >+ y.push_back(it->ntp_time.ToMs()); >+ } >+ double slope, offset; > >- if (!CalculateFrequency(ntp_ms_new, timestamp_new, ntp_ms_old, timestamp_old, >- ¶ms.frequency_khz)) { >+ if (!LinearRegression(x, y, &slope, &offset)) { > return; > } >- params.offset_ms = timestamp_new - params.frequency_khz * ntp_ms_new; >- params_calculated_ = true; >- smoothing_filter_.Insert(params); >+ >+ params_.emplace(1 / slope, offset); > } > > bool RtpToNtpEstimator::UpdateMeasurements(uint32_t ntp_secs, >@@ -132,7 +141,8 @@ bool RtpToNtpEstimator::UpdateMeasurements(uint32_t ntp_secs, > if (!measurements_.empty()) { > int64_t old_rtp_timestamp = measurements_.front().unwrapped_rtp_timestamp; > int64_t old_ntp_ms = measurements_.front().ntp_time.ToMs(); >- if (ntp_ms_new <= old_ntp_ms) { >+ if (ntp_ms_new <= old_ntp_ms || >+ ntp_ms_new > old_ntp_ms + kMaxAllowedRtcpNtpIntervalMs) { > invalid_sample = true; > } else if (unwrapped_rtp_timestamp <= old_rtp_timestamp) { > RTC_LOG(LS_WARNING) >@@ -152,8 +162,7 @@ bool 
RtpToNtpEstimator::UpdateMeasurements(uint32_t ntp_secs, > RTC_LOG(LS_WARNING) << "Multiple consecutively invalid RTCP SR reports, " > "clearing measurements."; > measurements_.clear(); >- smoothing_filter_.Reset(); >- params_calculated_ = false; >+ params_ = absl::nullopt; > } > consecutive_invalid_samples_ = 0; > >@@ -170,35 +179,29 @@ bool RtpToNtpEstimator::UpdateMeasurements(uint32_t ntp_secs, > } > > bool RtpToNtpEstimator::Estimate(int64_t rtp_timestamp, >- int64_t* rtp_timestamp_ms) const { >- if (!params_calculated_) >+ int64_t* ntp_timestamp_ms) const { >+ if (!params_) > return false; > > int64_t rtp_timestamp_unwrapped = unwrapper_.Unwrap(rtp_timestamp); > >- Parameters params = smoothing_filter_.GetFilteredValue(); >- > // params_calculated_ should not be true unless ms params.frequency_khz has > // been calculated to something non zero. >- RTC_DCHECK_NE(params.frequency_khz, 0.0); >+ RTC_DCHECK_NE(params_->frequency_khz, 0.0); > double rtp_ms = >- (static_cast<double>(rtp_timestamp_unwrapped) - params.offset_ms) / >- params.frequency_khz + >- 0.5f; >+ static_cast<double>(rtp_timestamp_unwrapped) / params_->frequency_khz + >+ params_->offset_ms + 0.5f; > > if (rtp_ms < 0) > return false; > >- *rtp_timestamp_ms = rtp_ms; >+ *ntp_timestamp_ms = rtp_ms; >+ > return true; > } > > const absl::optional<RtpToNtpEstimator::Parameters> RtpToNtpEstimator::params() > const { >- absl::optional<Parameters> res; >- if (params_calculated_) { >- res.emplace(smoothing_filter_.GetFilteredValue()); >- } >- return res; >+ return params_; > } > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/source/rtp_to_ntp_estimator_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/source/rtp_to_ntp_estimator_unittest.cc >index 0647ec8cad6398c26c5bbbdbef5dac4557bb3d40..b0b83bbbb1c7666b4e61553825480cc269cf336c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/source/rtp_to_ntp_estimator_unittest.cc 
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/system_wrappers/source/rtp_to_ntp_estimator_unittest.cc
>@@ -9,11 +9,13 @@
>  */
> 
> #include "system_wrappers/include/rtp_to_ntp_estimator.h"
>+#include "rtc_base/random.h"
> #include "test/gtest.h"
> 
> namespace webrtc {
> namespace {
> const uint32_t kOneMsInNtpFrac = 4294967;
>+const uint32_t kOneHourInNtpSec = 60 * 60;
> const uint32_t kTimestampTicksPerMs = 90;
> }  // namespace
> 
>@@ -224,6 +226,22 @@ TEST(UpdateRtcpMeasurementTests, FailsForOldNtp) {
>       estimator.UpdateMeasurements(ntp_sec, ntp_frac, timestamp, &new_sr));
> }
> 
>+TEST(UpdateRtcpMeasurementTests, FailsForTooNewNtp) {
>+  RtpToNtpEstimator estimator;
>+  uint32_t ntp_sec = 1;
>+  uint32_t ntp_frac = 699925050;
>+  uint32_t timestamp = 0x12345678;
>+  bool new_sr;
>+  EXPECT_TRUE(
>+      estimator.UpdateMeasurements(ntp_sec, ntp_frac, timestamp, &new_sr));
>+  EXPECT_TRUE(new_sr);
>+  // Ntp time from far future, list not updated.
>+  ntp_sec += kOneHourInNtpSec * 2;
>+  timestamp += kTimestampTicksPerMs * 10;
>+  EXPECT_FALSE(
>+      estimator.UpdateMeasurements(ntp_sec, ntp_frac, timestamp, &new_sr));
>+}
>+
> TEST(UpdateRtcpMeasurementTests, FailsForEqualTimestamp) {
>   RtpToNtpEstimator estimator;
>   uint32_t ntp_sec = 0;
>@@ -292,4 +310,37 @@ TEST(RtpToNtpTests, FailsForNoParameters) {
>   EXPECT_FALSE(estimator.Estimate(timestamp, &timestamp_ms));
> }
> 
>+TEST(RtpToNtpTests, AveragesErrorOut) {
>+  RtpToNtpEstimator estimator;
>+  uint32_t ntp_sec = 1;
>+  uint32_t ntp_frac = 90000000;  // More than 1 ms.
>+  uint32_t timestamp = 0x12345678;
>+  const int kNtpSecStep = 1;  // 1 second.
>+  const int kRtpTicksPerMs = 90;
>+  const int kRtpStep = kRtpTicksPerMs * 1000;
>+  bool new_sr;
>+  EXPECT_TRUE(
>+      estimator.UpdateMeasurements(ntp_sec, ntp_frac, timestamp, &new_sr));
>+  EXPECT_TRUE(new_sr);
>+
>+  Random rand(1123536L);
>+  for (size_t i = 0; i < 1000; i++) {
>+    // Advance both timestamps by exactly 1 second. 
>+ ntp_sec += kNtpSecStep; >+ timestamp += kRtpStep; >+ // Add upto 1ms of errors to NTP and RTP timestamps passed to estimator. >+ EXPECT_TRUE(estimator.UpdateMeasurements( >+ ntp_sec, >+ ntp_frac + rand.Rand(-static_cast<int>(kOneMsInNtpFrac), >+ static_cast<int>(kOneMsInNtpFrac)), >+ timestamp + rand.Rand(-kRtpTicksPerMs, kRtpTicksPerMs), &new_sr)); >+ EXPECT_TRUE(new_sr); >+ >+ int64_t estimated_ntp_ms; >+ EXPECT_TRUE(estimator.Estimate(timestamp, &estimated_ntp_ms)); >+ // Allow upto 2 ms of error. >+ EXPECT_NEAR(NtpTime(ntp_sec, ntp_frac).ToMs(), estimated_ntp_ms, 2); >+ } >+} >+ > }; // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/test/BUILD.gn >index 28146e0599cde9b55200b3d6fa9e771b109d5f0d..7cd7b3eaa1fb93724a26d74cdd69265a15cf0537 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/BUILD.gn >@@ -137,6 +137,8 @@ if (is_ios) { > testonly = true > visibility = [ ":test_support" ] > sources = [ >+ "ios/coverage_util_ios.h", >+ "ios/coverage_util_ios.mm", > "ios/test_support.h", > "ios/test_support.mm", > ] >@@ -144,6 +146,15 @@ if (is_ios) { > ":perf_test", > "../sdk:helpers_objc", > ] >+ configs += [ ":test_support_objc_config" ] >+ } >+ >+ config("test_support_objc_config") { >+ defines = [] >+ >+ if (use_clang_coverage) { >+ defines += [ "WEBRTC_IOS_ENABLE_COVERAGE" ] >+ } > } > } > >@@ -196,24 +207,41 @@ rtc_source_set("test_support") { > } > > if (rtc_include_tests) { >- rtc_source_set("test_main") { >+ rtc_source_set("test_main_lib") { > visibility = [ "*" ] > testonly = true > sources = [ >- "test_main.cc", >+ "test_main_lib.cc", >+ "test_main_lib.h", > ] > >- public_deps = [ >- ":test_support", >- ] > deps = [ > ":field_trial", >- ":fileutils", > ":perf_test", >+ ":test_support", > "../rtc_base:rtc_base", > "../system_wrappers:field_trial", > "../system_wrappers:metrics", > "//testing/gtest", >+ 
"//third_party/abseil-cpp/absl/memory:memory", >+ >+ # TODO(bugs.webrtc.org/9792): This is needed for downstream projects on >+ # Android, where it's replaced by an internal version of fileutils that >+ # has a certain flag. Remove this once the internal fileutils has been >+ # eliminated. >+ "../test:fileutils", >+ ] >+ } >+ >+ rtc_source_set("test_main") { >+ visibility = [ "*" ] >+ testonly = true >+ sources = [ >+ "test_main.cc", >+ ] >+ >+ deps = [ >+ ":test_main_lib", > ] > } > >@@ -309,6 +337,8 @@ if (rtc_include_tests) { > ":video_test_support", > "../api:create_simulcast_test_fixture_api", > "../api:simulcast_test_fixture_api", >+ "../api/test/video:function_video_factory", >+ "../api/video:builtin_video_bitrate_allocator_factory", > "../api/video:video_frame_i420", > "../modules/rtp_rtcp:rtp_rtcp", > "../modules/video_capture", >@@ -360,7 +390,10 @@ if (rtc_include_tests) { > > if (is_ios) { > rtc_source_set("fileutils_ios_objc") { >- visibility = [ ":fileutils" ] >+ visibility = [ >+ ":fileutils", >+ ":fileutils_override_impl", >+ ] > sources = [ > "testsupport/iosfileutils.h", > "testsupport/iosfileutils.mm", >@@ -376,7 +409,10 @@ if (is_ios) { > > if (is_mac) { > rtc_source_set("fileutils_mac_objc") { >- visibility = [ ":fileutils" ] >+ visibility = [ >+ ":fileutils", >+ ":fileutils_override_impl", >+ ] > sources = [ > "testsupport/macfileutils.h", > "testsupport/macfileutils.mm", >@@ -395,6 +431,42 @@ rtc_source_set("fileutils") { > "testsupport/fileutils.h", > ] > deps = [ >+ ":fileutils_override_api", >+ ":fileutils_override_impl", >+ "..:webrtc_common", >+ "../rtc_base:checks", >+ "../rtc_base:rtc_base_approved", >+ "../rtc_base/system:arch", >+ "//third_party/abseil-cpp/absl/types:optional", >+ ] >+ if (is_ios) { >+ deps += [ ":fileutils_ios_objc" ] >+ } >+ if (is_mac) { >+ deps += [ ":fileutils_mac_objc" ] >+ } >+ if (is_win) { >+ deps += [ "../rtc_base:rtc_base" ] >+ } >+} >+ >+# We separate header into own target to make it possible for 
downstream >+# projects to override implementation. >+rtc_source_set("fileutils_override_api") { >+ testonly = true >+ sources = [ >+ "testsupport/fileutils_override.h", >+ ] >+} >+ >+rtc_source_set("fileutils_override_impl") { >+ testonly = true >+ visibility = [ ":fileutils" ] >+ sources = [ >+ "testsupport/fileutils_override.cc", >+ ] >+ deps = [ >+ ":fileutils_override_api", > "..:webrtc_common", > "../rtc_base:checks", > "../rtc_base:rtc_base_approved", >@@ -523,6 +595,8 @@ rtc_source_set("fake_video_codecs") { > "fake_decoder.h", > "fake_encoder.cc", > "fake_encoder.h", >+ "fake_vp8_decoder.cc", >+ "fake_vp8_decoder.h", > "fake_vp8_encoder.cc", > "fake_vp8_encoder.h", > ] >@@ -532,7 +606,9 @@ rtc_source_set("fake_video_codecs") { > } > deps = [ > "..:webrtc_common", >+ "../api/video:encoded_image", > "../api/video:video_frame_i420", >+ "../api/video_codecs:create_vp8_temporal_layers", > "../api/video_codecs:video_codecs_api", > "../common_video:common_video", > "../modules/video_coding:video_codec_interface", >@@ -560,8 +636,6 @@ rtc_source_set("test_common") { > "encoder_settings.cc", > "encoder_settings.h", > "fake_videorenderer.h", >- "function_video_decoder_factory.h", >- "function_video_encoder_factory.h", > "layer_filtering_transport.cc", > "layer_filtering_transport.h", > "mock_transport.cc", >@@ -599,6 +673,9 @@ rtc_source_set("test_common") { > "../api:transport_api", > "../api/audio_codecs:builtin_audio_decoder_factory", > "../api/audio_codecs:builtin_audio_encoder_factory", >+ "../api/test/video:function_video_factory", >+ "../api/video:builtin_video_bitrate_allocator_factory", >+ "../api/video:video_bitrate_allocator_factory", > "../api/video:video_frame", > "../api/video_codecs:video_codecs_api", > "../audio", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/call_test.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/call_test.cc >index f4fbf221be90f54e87fdd8650ea82b684f68035a..6e25b85a56c725385e35d2e8b3e5dc84b46a10da 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/test/call_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/call_test.cc >@@ -15,6 +15,7 @@ > #include "absl/memory/memory.h" > #include "api/audio_codecs/builtin_audio_decoder_factory.h" > #include "api/audio_codecs/builtin_audio_encoder_factory.h" >+#include "api/video/builtin_video_bitrate_allocator_factory.h" > #include "api/video_codecs/video_encoder_config.h" > #include "call/fake_network_pipe.h" > #include "call/rtp_transport_controller_send.h" >@@ -39,7 +40,8 @@ CallTest::CallTest() > send_event_log_(RtcEventLog::CreateNull()), > recv_event_log_(RtcEventLog::CreateNull()), > sender_call_transport_controller_(nullptr), >- audio_send_config_(nullptr), >+ audio_send_config_(/*send_transport=*/nullptr, >+ /*media_transport=*/nullptr), > audio_send_stream_(nullptr), > bbr_network_controller_factory_(new BbrNetworkControllerFactory()), > fake_encoder_factory_([this]() { >@@ -53,6 +55,7 @@ CallTest::CallTest() > return fake_encoder; > }), > fake_decoder_factory_([]() { return absl::make_unique<FakeDecoder>(); }), >+ bitrate_allocator_factory_(CreateBuiltinVideoBitrateAllocatorFactory()), > num_video_streams_(1), > num_audio_streams_(0), > num_flexfec_streams_(0), >@@ -75,7 +78,7 @@ void CallTest::RunBaseTest(BaseTest* test) { > num_flexfec_streams_ = test->GetNumFlexfecStreams(); > RTC_DCHECK(num_video_streams_ > 0 || num_audio_streams_ > 0); > Call::Config send_config(send_event_log_.get()); >- test->ModifySenderCallConfig(&send_config); >+ test->ModifySenderBitrateConfig(&send_config.bitrate_config); > if (num_audio_streams_ > 0) { > CreateFakeAudioDevices(test->CreateCapturer(), test->CreateRenderer()); > test->OnFakeAudioDevicesCreated(fake_send_audio_device_.get(), >@@ -99,7 +102,7 @@ void CallTest::RunBaseTest(BaseTest* test) { > } > if (test->ShouldCreateReceivers()) { > Call::Config recv_config(recv_event_log_.get()); >- test->ModifyReceiverCallConfig(&recv_config); >+ 
test->ModifyReceiverBitrateConfig(&recv_config.bitrate_config);
>     if (num_audio_streams_ > 0) {
>       AudioState::Config audio_state_config;
>       audio_state_config.audio_mixer = AudioMixerImpl::Create();
>@@ -163,6 +166,7 @@ void CallTest::RunBaseTest(BaseTest* test) {
>     int height = kDefaultHeight;
>     int frame_rate = kDefaultFramerate;
>     test->ModifyVideoCaptureStartResolution(&width, &height, &frame_rate);
>+    test->ModifyVideoDegradationPreference(&degradation_preference_);
>     CreateFrameGeneratorCapturer(frame_rate, width, height);
>     test->OnFrameGeneratorCapturerCreated(frame_generator_capturer_);
>   }
>@@ -232,6 +236,8 @@ void CallTest::CreateVideoSendConfig(VideoSendStream::Config* video_config,
>   RTC_DCHECK_LE(num_video_streams + num_used_ssrcs, kNumSsrcs);
>   *video_config = VideoSendStream::Config(send_transport);
>   video_config->encoder_settings.encoder_factory = &fake_encoder_factory_;
>+  video_config->encoder_settings.bitrate_allocator_factory =
>+      bitrate_allocator_factory_.get();
>   video_config->rtp.payload_name = "FAKE";
>   video_config->rtp.payload_type = kFakeVideoSendPayloadType;
>   video_config->rtp.extensions.push_back(
>@@ -259,7 +265,8 @@ void CallTest::CreateAudioAndFecSendConfigs(size_t num_audio_streams,
>   RTC_DCHECK_LE(num_audio_streams, 1);
>   RTC_DCHECK_LE(num_flexfec_streams, 1);
>   if (num_audio_streams > 0) {
>-    AudioSendStream::Config audio_send_config(send_transport);
>+    AudioSendStream::Config audio_send_config(send_transport,
>+                                              /*media_transport=*/nullptr);
>     audio_send_config.rtp.ssrc = kAudioSendSsrc;
>     audio_send_config.send_codec_spec = AudioSendStream::Config::SendCodecSpec(
>         kAudioSendPayloadType, {"opus", 48000, 2, {{"stereo", "1"}}});
>@@ -726,9 +733,10 @@ void BaseTest::OnFakeAudioDevicesCreated(
>     TestAudioDeviceModule* send_audio_device,
>     TestAudioDeviceModule* recv_audio_device) {}
> 
>-void BaseTest::ModifySenderCallConfig(Call::Config* config) {}
>+void BaseTest::ModifySenderBitrateConfig(BitrateConstraints* bitrate_config) {}
> 
>-void BaseTest::ModifyReceiverCallConfig(Call::Config* config) {} >+void BaseTest::ModifyReceiverBitrateConfig(BitrateConstraints* bitrate_config) { >+} > > void BaseTest::OnRtpTransportControllerSendCreated( > RtpTransportControllerSend* controller) {} >@@ -742,8 +750,8 @@ test::PacketTransport* BaseTest::CreateSendTransport( > task_queue, sender_call, this, test::PacketTransport::kSender, > CallTest::payload_type_map_, > absl::make_unique<FakeNetworkPipe>( >- Clock::GetRealTimeClock(), absl::make_unique<SimulatedNetwork>( >- DefaultNetworkSimulationConfig()))); >+ Clock::GetRealTimeClock(), >+ absl::make_unique<SimulatedNetwork>(BuiltInNetworkBehaviorConfig()))); > } > > test::PacketTransport* BaseTest::CreateReceiveTransport( >@@ -752,8 +760,8 @@ test::PacketTransport* BaseTest::CreateReceiveTransport( > task_queue, nullptr, this, test::PacketTransport::kReceiver, > CallTest::payload_type_map_, > absl::make_unique<FakeNetworkPipe>( >- Clock::GetRealTimeClock(), absl::make_unique<SimulatedNetwork>( >- DefaultNetworkSimulationConfig()))); >+ Clock::GetRealTimeClock(), >+ absl::make_unique<SimulatedNetwork>(BuiltInNetworkBehaviorConfig()))); > } > > size_t BaseTest::GetNumVideoStreams() const { >@@ -777,6 +785,9 @@ void BaseTest::ModifyVideoCaptureStartResolution(int* width, > int* heigt, > int* frame_rate) {} > >+void BaseTest::ModifyVideoDegradationPreference( >+ DegradationPreference* degradation_preference) {} >+ > void BaseTest::OnVideoStreamsCreated( > VideoSendStream* send_stream, > const std::vector<VideoReceiveStream*>& receive_streams) {} >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/call_test.h b/Source/ThirdParty/libwebrtc/Source/webrtc/test/call_test.h >index 7a5c3e9b8c1a00ed706f8042d192210677371663..8961b7fccb3f9390e12691279914ebc6425389df 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/call_test.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/call_test.h >@@ -13,6 +13,9 @@ > #include <memory> > #include <vector> > 
>+#include "api/test/video/function_video_decoder_factory.h" >+#include "api/test/video/function_video_encoder_factory.h" >+#include "api/video/video_bitrate_allocator_factory.h" > #include "call/call.h" > #include "call/rtp_transport_controller_send.h" > #include "logging/rtc_event_log/rtc_event_log.h" >@@ -22,8 +25,6 @@ > #include "test/fake_videorenderer.h" > #include "test/fake_vp8_encoder.h" > #include "test/frame_generator_capturer.h" >-#include "test/function_video_decoder_factory.h" >-#include "test/function_video_encoder_factory.h" > #include "test/rtp_rtcp_observer.h" > #include "test/single_threaded_task_queue.h" > >@@ -205,6 +206,7 @@ class CallTest : public ::testing::Test { > test::FunctionVideoEncoderFactory fake_encoder_factory_; > int fake_encoder_max_bitrate_ = -1; > test::FunctionVideoDecoderFactory fake_decoder_factory_; >+ std::unique_ptr<VideoBitrateAllocatorFactory> bitrate_allocator_factory_; > // Number of simulcast substreams. > size_t num_video_streams_; > size_t num_audio_streams_; >@@ -241,8 +243,8 @@ class BaseTest : public RtpRtcpObserver { > TestAudioDeviceModule* send_audio_device, > TestAudioDeviceModule* recv_audio_device); > >- virtual void ModifySenderCallConfig(Call::Config* config); >- virtual void ModifyReceiverCallConfig(Call::Config* config); >+ virtual void ModifySenderBitrateConfig(BitrateConstraints* bitrate_config); >+ virtual void ModifyReceiverBitrateConfig(BitrateConstraints* bitrate_config); > > virtual void OnRtpTransportControllerSendCreated( > RtpTransportControllerSend* controller); >@@ -261,6 +263,9 @@ class BaseTest : public RtpRtcpObserver { > virtual void ModifyVideoCaptureStartResolution(int* width, > int* heigt, > int* frame_rate); >+ virtual void ModifyVideoDegradationPreference( >+ DegradationPreference* degradation_preference); >+ > virtual void OnVideoStreamsCreated( > VideoSendStream* send_stream, > const std::vector<VideoReceiveStream*>& receive_streams); >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/test/configurable_frame_size_encoder.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/configurable_frame_size_encoder.cc >index c0fe2966e85ff21b712ad48282a94aea95885835..bdd61d032fc834a1d8d99e6c2416bfc438bd65fc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/configurable_frame_size_encoder.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/configurable_frame_size_encoder.cc >@@ -12,7 +12,7 @@ > > #include <string.h> > >-#include "common_video/include/video_frame.h" >+#include "api/video/encoded_image.h" > #include "modules/video_coding/include/video_codec_interface.h" > #include "rtc_base/checks.h" > >@@ -54,7 +54,9 @@ int32_t ConfigurableFrameSizeEncoder::Encode( > CodecSpecificInfo specific{}; > specific.codecType = codec_type_; > callback_->OnEncodedImage(encodedImage, &specific, fragmentation); >- >+ if (post_encode_callback_) { >+ (*post_encode_callback_)(); >+ } > return WEBRTC_VIDEO_CODEC_OK; > } > >@@ -68,11 +70,6 @@ int32_t ConfigurableFrameSizeEncoder::Release() { > return WEBRTC_VIDEO_CODEC_OK; > } > >-int32_t ConfigurableFrameSizeEncoder::SetChannelParameters(uint32_t packet_loss, >- int64_t rtt) { >- return WEBRTC_VIDEO_CODEC_OK; >-} >- > int32_t ConfigurableFrameSizeEncoder::SetRateAllocation( > const VideoBitrateAllocation& allocation, > uint32_t framerate) { >@@ -89,5 +86,10 @@ void ConfigurableFrameSizeEncoder::SetCodecType(VideoCodecType codec_type) { > codec_type_ = codec_type; > } > >+void ConfigurableFrameSizeEncoder::RegisterPostEncodeCallback( >+ std::function<void(void)> post_encode_callback) { >+ post_encode_callback_ = std::move(post_encode_callback); >+} >+ > } // namespace test > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/configurable_frame_size_encoder.h b/Source/ThirdParty/libwebrtc/Source/webrtc/test/configurable_frame_size_encoder.h >index 1dc9e48f47f5ab7bf0c478038e8bdf5831bc59dd..f881513fcc27361f7a04b4bf68ae5570d3ea8709 100644 
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/configurable_frame_size_encoder.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/configurable_frame_size_encoder.h >@@ -37,8 +37,6 @@ class ConfigurableFrameSizeEncoder : public VideoEncoder { > > int32_t Release() override; > >- int32_t SetChannelParameters(uint32_t packet_loss, int64_t rtt) override; >- > int32_t SetRateAllocation(const VideoBitrateAllocation& allocation, > uint32_t framerate) override; > >@@ -46,8 +44,13 @@ class ConfigurableFrameSizeEncoder : public VideoEncoder { > > void SetCodecType(VideoCodecType codec_type_); > >+ void RegisterPostEncodeCallback( >+ std::function<void(void)> post_encode_callback); >+ > private: > EncodedImageCallback* callback_; >+ absl::optional<std::function<void(void)>> post_encode_callback_; >+ > const size_t max_frame_size_; > size_t current_frame_size_; > std::unique_ptr<uint8_t[]> buffer_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/direct_transport.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/direct_transport.cc >index 6aa8c9158f0de6720e225bb48eae463a79a53313..fd7369171a2200c714d49727da1f60f57385cb99 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/direct_transport.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/direct_transport.cc >@@ -72,6 +72,8 @@ bool DirectTransport::SendRtp(const uint8_t* data, > if (send_call_) { > rtc::SentPacket sent_packet(options.packet_id, > clock_->TimeInMilliseconds()); >+ sent_packet.info.included_in_feedback = options.included_in_feedback; >+ sent_packet.info.included_in_allocation = options.included_in_allocation; > sent_packet.info.packet_size_bytes = length; > sent_packet.info.packet_type = rtc::PacketType::kData; > send_call_->OnSentPacket(sent_packet); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/encoder_settings.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/encoder_settings.cc >index 
945559b4ee02ee85519f733ba85d689c7129970e..bfbd2bdec47a9834991a137c3e9521ff7590ada7 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/encoder_settings.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/encoder_settings.cc >@@ -61,6 +61,14 @@ std::vector<VideoStream> CreateVideoStreams( > ? stream.target_bitrate_bps > : DefaultVideoStreamFactory::kMaxBitratePerStream[i]; > target_bitrate_bps = std::min(max_bitrate_bps, target_bitrate_bps); >+ >+ if (stream.max_framerate > 0) { >+ stream_settings[i].max_framerate = stream.max_framerate; >+ } >+ if (stream.num_temporal_layers) { >+ RTC_DCHECK_GE(*stream.num_temporal_layers, 1); >+ stream_settings[i].num_temporal_layers = stream.num_temporal_layers; >+ } > } else { > max_bitrate_bps = std::min( > bitrate_left_bps, DefaultVideoStreamFactory::kMaxBitratePerStream[i]); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_encoder.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_encoder.cc >index bc3c7b1087b4ea0f6a95879c7ae4726a3119179c..ddaba77a4f45eba13708db8ab6e9b9a443eee063 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_encoder.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_encoder.cc >@@ -15,16 +15,33 @@ > #include <algorithm> > #include <memory> > >-#include "common_types.h" // NOLINT(build/include) >-#include "modules/video_coding/codecs/vp8/include/vp8_temporal_layers.h" >+#include "api/video_codecs/vp8_temporal_layers.h" > #include "modules/video_coding/include/video_codec_interface.h" > #include "rtc_base/checks.h" > #include "system_wrappers/include/sleep.h" > > namespace webrtc { > namespace test { >+namespace { >+const int kKeyframeSizeFactor = 5; >+ >+// Inverse of proportion of frames assigned to each temporal layer for all >+// possible temporal layers numbers. 
>+const int kTemporalLayerRateFactor[4][4] = { >+ {1, 0, 0, 0}, // 1/1 >+ {2, 2, 0, 0}, // 1/2 + 1/2 >+ {4, 4, 2, 0}, // 1/4 + 1/4 + 1/2 >+ {8, 8, 4, 2}, // 1/8 + 1/8 + 1/4 + 1/2 >+}; >+ >+void WriteCounter(unsigned char* payload, uint32_t counter) { >+ payload[0] = (counter & 0x00FF); >+ payload[1] = (counter & 0xFF00) >> 8; >+ payload[2] = (counter & 0xFF0000) >> 16; >+ payload[3] = (counter & 0xFF000000) >> 24; >+} > >-const int kKeyframeSizeFactor = 10; >+}; // namespace > > FakeEncoder::FakeEncoder(Clock* clock) > : clock_(clock), >@@ -32,6 +49,7 @@ FakeEncoder::FakeEncoder(Clock* clock) > configured_input_framerate_(-1), > max_target_bitrate_kbps_(-1), > pending_keyframe_(true), >+ counter_(0), > debt_bytes_(0) { > // Generate some arbitrary not-all-zero data > for (size_t i = 0; i < sizeof(encoded_buffer_); ++i) { >@@ -46,6 +64,7 @@ void FakeEncoder::SetMaxBitrate(int max_kbps) { > RTC_DCHECK_GE(max_kbps, -1); // max_kbps == -1 disables it. > rtc::CritScope cs(&crit_sect_); > max_target_bitrate_kbps_ = max_kbps; >+ SetRateAllocation(target_bitrate_, configured_input_framerate_); > } > > int32_t FakeEncoder::InitEncode(const VideoCodec* config, >@@ -71,6 +90,7 @@ int32_t FakeEncoder::Encode(const VideoFrame& input_image, > int framerate; > VideoCodecMode mode; > bool keyframe; >+ uint32_t counter; > { > rtc::CritScope cs(&crit_sect_); > max_framerate = config_.maxFramerate; >@@ -88,13 +108,15 @@ int32_t FakeEncoder::Encode(const VideoFrame& input_image, > } > keyframe = pending_keyframe_; > pending_keyframe_ = false; >+ counter = counter_++; > } > > FrameInfo frame_info = > NextFrame(frame_types, keyframe, num_simulcast_streams, target_bitrate, > simulcast_streams, framerate); > for (uint8_t i = 0; i < frame_info.layers.size(); ++i) { >- if (frame_info.layers[i].size == 0) { >+ constexpr int kMinPayLoadLength = 14; >+ if (frame_info.layers[i].size < kMinPayLoadLength) { > // Drop this temporal layer. 
> continue; > } >@@ -104,7 +126,10 @@ int32_t FakeEncoder::Encode(const VideoFrame& input_image, > specifics.codecType = kVideoCodecGeneric; > std::unique_ptr<uint8_t[]> encoded_buffer( > new uint8_t[frame_info.layers[i].size]); >- memcpy(encoded_buffer.get(), encoded_buffer_, frame_info.layers[i].size); >+ memcpy(encoded_buffer.get(), encoded_buffer_, >+ frame_info.layers[i].size - 4); >+ // Write a counter to the image to make each frame unique. >+ WriteCounter(encoded_buffer.get() + frame_info.layers[i].size - 4, counter); > EncodedImage encoded(encoded_buffer.get(), frame_info.layers[i].size, > sizeof(encoded_buffer_)); > encoded.SetTimestamp(input_image.timestamp()); >@@ -118,7 +143,6 @@ int32_t FakeEncoder::Encode(const VideoFrame& input_image, > ? VideoContentType::SCREENSHARE > : VideoContentType::UNSPECIFIED; > encoded.SetSpatialIndex(i); >- specifics.codec_name = ImplementationName(); > if (callback->OnEncodedImage(encoded, &specifics, nullptr).error != > EncodedImageCallback::Result::OK) { > return -1; >@@ -146,6 +170,7 @@ FakeEncoder::FrameInfo FakeEncoder::NextFrame( > } > } > >+ rtc::CritScope cs(&crit_sect_); > for (uint8_t i = 0; i < num_simulcast_streams; ++i) { > if (target_bitrate.GetBitrate(i, 0) > 0) { > int temporal_id = last_frame_info_.layers.size() > i >@@ -166,7 +191,9 @@ FakeEncoder::FrameInfo FakeEncoder::NextFrame( > if (frame_info.keyframe) { > layer_info.temporal_id = 0; > size_t avg_frame_size = >- (target_bitrate.GetBitrate(i, 0) + 7) / (8 * framerate); >+ (target_bitrate.GetBitrate(i, 0) + 7) * >+ kTemporalLayerRateFactor[frame_info.layers.size() - 1][i] / >+ (8 * framerate); > > // The first frame is a key frame and should be larger. 
> // Store the overshoot bytes and distribute them over the coming frames, >@@ -175,7 +202,8 @@ FakeEncoder::FrameInfo FakeEncoder::NextFrame( > layer_info.size = kKeyframeSizeFactor * avg_frame_size; > } else { > size_t avg_frame_size = >- (target_bitrate.GetBitrate(i, layer_info.temporal_id) + 7) / >+ (target_bitrate.GetBitrate(i, layer_info.temporal_id) + 7) * >+ kTemporalLayerRateFactor[frame_info.layers.size() - 1][i] / > (8 * framerate); > layer_info.size = avg_frame_size; > if (debt_bytes_ > 0) { >@@ -201,10 +229,6 @@ int32_t FakeEncoder::Release() { > return 0; > } > >-int32_t FakeEncoder::SetChannelParameters(uint32_t packet_loss, int64_t rtt) { >- return 0; >-} >- > int32_t FakeEncoder::SetRateAllocation( > const VideoBitrateAllocation& rate_allocation, > uint32_t framerate) { >@@ -236,8 +260,10 @@ int32_t FakeEncoder::SetRateAllocation( > } > > const char* FakeEncoder::kImplementationName = "fake_encoder"; >-const char* FakeEncoder::ImplementationName() const { >- return kImplementationName; >+VideoEncoder::EncoderInfo FakeEncoder::GetEncoderInfo() const { >+ EncoderInfo info; >+ info.implementation_name = kImplementationName; >+ return info; > } > > int FakeEncoder::GetConfiguredInputFramerate() const { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_encoder.h b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_encoder.h >index 67acebec69c94b7270f6fd28713ffef78df3895f..da38f11b456cfabaa47e4646d50ffcb36abeb39c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_encoder.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_encoder.h >@@ -15,7 +15,6 @@ > #include <vector> > > #include "api/video_codecs/video_encoder.h" >-#include "common_types.h" // NOLINT(build/include) > #include "rtc_base/criticalsection.h" > #include "rtc_base/sequenced_task_checker.h" > #include "rtc_base/task_queue.h" >@@ -41,11 +40,10 @@ class FakeEncoder : public VideoEncoder { > int32_t RegisterEncodeCompleteCallback( > 
EncodedImageCallback* callback) override; > int32_t Release() override; >- int32_t SetChannelParameters(uint32_t packet_loss, int64_t rtt) override; > int32_t SetRateAllocation(const VideoBitrateAllocation& rate_allocation, > uint32_t framerate) override; >- const char* ImplementationName() const override; > int GetConfiguredInputFramerate() const; >+ EncoderInfo GetEncoderInfo() const override; > > static const char* kImplementationName; > >@@ -71,7 +69,7 @@ class FakeEncoder : public VideoEncoder { > SimulcastStream simulcast_streams[kMaxSimulcastStreams], > int framerate); > >- FrameInfo last_frame_info_; >+ FrameInfo last_frame_info_ RTC_GUARDED_BY(crit_sect_); > Clock* const clock_; > > VideoCodec config_ RTC_GUARDED_BY(crit_sect_); >@@ -80,6 +78,7 @@ class FakeEncoder : public VideoEncoder { > int configured_input_framerate_ RTC_GUARDED_BY(crit_sect_); > int max_target_bitrate_kbps_ RTC_GUARDED_BY(crit_sect_); > bool pending_keyframe_ RTC_GUARDED_BY(crit_sect_); >+ uint32_t counter_ RTC_GUARDED_BY(crit_sect_); > rtc::CriticalSection crit_sect_; > > uint8_t encoded_buffer_[100000]; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_vp8_decoder.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_vp8_decoder.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..f15bb2149dea4f2ef6ec79e9bb358db238bbb2b0 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_vp8_decoder.cc >@@ -0,0 +1,77 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "test/fake_vp8_decoder.h" >+ >+#include "api/video/i420_buffer.h" >+#include "rtc_base/timeutils.h" >+ >+namespace webrtc { >+namespace test { >+ >+namespace { >+// Read width and height from the payload of the frame if it is a key frame the >+// same way as the real VP8 decoder. >+// FakeEncoder writes width, height and frame type. >+void ParseFakeVp8(const unsigned char* data, int* width, int* height) { >+ bool key_frame = data[0] == 0; >+ if (key_frame) { >+ *width = ((data[7] << 8) + data[6]) & 0x3FFF; >+ *height = ((data[9] << 8) + data[8]) & 0x3FFF; >+ } >+} >+} // namespace >+ >+FakeVp8Decoder::FakeVp8Decoder() : callback_(nullptr), width_(0), height_(0) {} >+ >+int32_t FakeVp8Decoder::InitDecode(const VideoCodec* config, >+ int32_t number_of_cores) { >+ return WEBRTC_VIDEO_CODEC_OK; >+} >+ >+int32_t FakeVp8Decoder::Decode(const EncodedImage& input, >+ bool missing_frames, >+ const CodecSpecificInfo* codec_specific_info, >+ int64_t render_time_ms) { >+ constexpr size_t kMinPayLoadHeaderLength = 10; >+ if (input._length < kMinPayLoadHeaderLength) { >+ return WEBRTC_VIDEO_CODEC_ERROR; >+ } >+ ParseFakeVp8(input._buffer, &width_, &height_); >+ >+ VideoFrame frame(I420Buffer::Create(width_, height_), >+ webrtc::kVideoRotation_0, >+ render_time_ms * rtc::kNumMicrosecsPerMillisec); >+ frame.set_timestamp(input.Timestamp()); >+ frame.set_ntp_time_ms(input.ntp_time_ms_); >+ >+ callback_->Decoded(frame, /*decode_time_ms=*/absl::nullopt, >+ /*qp=*/absl::nullopt); >+ >+ return WEBRTC_VIDEO_CODEC_OK; >+} >+ >+int32_t FakeVp8Decoder::RegisterDecodeCompleteCallback( >+ DecodedImageCallback* callback) { >+ callback_ = callback; >+ return WEBRTC_VIDEO_CODEC_OK; >+} >+ >+int32_t FakeVp8Decoder::Release() { >+ return WEBRTC_VIDEO_CODEC_OK; >+} >+ >+const char* FakeVp8Decoder::kImplementationName = "fake_vp8_decoder"; >+const char* FakeVp8Decoder::ImplementationName() const { >+ return kImplementationName; >+} >+ >+} // namespace test >+} // namespace 
webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_vp8_decoder.h b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_vp8_decoder.h >new file mode 100644 >index 0000000000000000000000000000000000000000..974af403d692040d6a56d546a23c05a64c5aba08 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_vp8_decoder.h >@@ -0,0 +1,51 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef TEST_FAKE_VP8_DECODER_H_ >+#define TEST_FAKE_VP8_DECODER_H_ >+ >+#include "modules/video_coding/include/video_codec_interface.h" >+#include "system_wrappers/include/clock.h" >+ >+namespace webrtc { >+namespace test { >+ >+class FakeVp8Decoder : public VideoDecoder { >+ public: >+ FakeVp8Decoder(); >+ ~FakeVp8Decoder() override {} >+ >+ int32_t InitDecode(const VideoCodec* config, >+ int32_t number_of_cores) override; >+ >+ int32_t Decode(const EncodedImage& input, >+ bool missing_frames, >+ const CodecSpecificInfo* codec_specific_info, >+ int64_t render_time_ms) override; >+ >+ int32_t RegisterDecodeCompleteCallback( >+ DecodedImageCallback* callback) override; >+ >+ int32_t Release() override; >+ >+ const char* ImplementationName() const override; >+ >+ static const char* kImplementationName; >+ >+ private: >+ DecodedImageCallback* callback_; >+ int width_; >+ int height_; >+}; >+ >+} // namespace test >+} // namespace webrtc >+ >+#endif // TEST_FAKE_VP8_DECODER_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_vp8_encoder.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_vp8_encoder.cc >index 
4038837798315601ad812ecb63ed9aaec5bc4fac..9180bec04996b688583fea6567e1fa0478ca14d8 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_vp8_encoder.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_vp8_encoder.cc >@@ -10,8 +10,8 @@ > > #include "test/fake_vp8_encoder.h" > >-#include "common_types.h" // NOLINT(build/include) >-#include "modules/video_coding/codecs/vp8/include/vp8_temporal_layers.h" >+#include "api/video_codecs/create_vp8_temporal_layers.h" >+#include "api/video_codecs/vp8_temporal_layers.h" > #include "modules/video_coding/include/video_codec_interface.h" > #include "modules/video_coding/include/video_error_codes.h" > #include "modules/video_coding/utility/simulcast_utility.h" >@@ -20,6 +20,26 @@ > #include "rtc_base/random.h" > #include "rtc_base/timeutils.h" > >+namespace { >+ >+// Write width and height to the payload the same way as the real encoder does. >+// It requires that |payload| has a size of at least kMinPayLoadHeaderLength. >+void WriteFakeVp8(unsigned char* payload, >+ int width, >+ int height, >+ bool key_frame) { >+ payload[0] = key_frame ? 0 : 0x01; >+ >+ if (key_frame) { >+ payload[9] = (height & 0x3F00) >> 8; >+ payload[8] = (height & 0x00FF); >+ >+ payload[7] = (width & 0x3F00) >> 8; >+ payload[6] = (width & 0x00FF); >+ } >+} >+} // namespace >+ > namespace webrtc { > > namespace test { >@@ -63,18 +83,18 @@ void FakeVP8Encoder::SetupTemporalLayers(const VideoCodec& codec) { > > int num_streams = SimulcastUtility::NumberOfSimulcastStreams(codec); > for (int i = 0; i < num_streams; ++i) { >- TemporalLayersType type; >+ Vp8TemporalLayersType type; > int num_temporal_layers = > SimulcastUtility::NumberOfTemporalLayers(codec, i); > if (SimulcastUtility::IsConferenceModeScreenshare(codec) && i == 0) { >- type = TemporalLayersType::kBitrateDynamic; >+ type = Vp8TemporalLayersType::kBitrateDynamic; > // Legacy screenshare layers supports max 2 layers. 
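The fake codec pair added in this update shares a tiny payload header: FakeVP8Encoder's WriteFakeVp8 stores the frame type in byte 0 and, for key frames, packs 14-bit width and height into bytes 6-9, which FakeVp8Decoder's ParseFakeVp8 reads back. A minimal round-trip sketch of that layout, restated outside the patch (the helper names here are illustrative, not identifiers from the patch):

```cpp
#include <cassert>

// Mirrors the fake VP8 payload header used by FakeVP8Encoder/FakeVp8Decoder
// in this patch: byte 0 is the frame type (0 == key frame); for key frames,
// bytes 6-7 and 8-9 hold width and height little-endian, 14 bits each.
void WriteFakeVp8Header(unsigned char* payload, int width, int height,
                        bool key_frame) {
  payload[0] = key_frame ? 0 : 0x01;
  if (key_frame) {
    payload[9] = (height & 0x3F00) >> 8;
    payload[8] = (height & 0x00FF);
    payload[7] = (width & 0x3F00) >> 8;
    payload[6] = (width & 0x00FF);
  }
}

void ParseFakeVp8Header(const unsigned char* data, int* width, int* height) {
  if (data[0] == 0) {  // Key frame: recover the encoded dimensions.
    *width = ((data[7] << 8) + data[6]) & 0x3FFF;
    *height = ((data[9] << 8) + data[8]) & 0x3FFF;
  }
}
```

This is why FakeVp8Decoder::Decode rejects payloads shorter than 10 bytes (kMinPayLoadHeaderLength) before parsing.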
> num_temporal_layers = std::max<int>(2, num_temporal_layers); > } else { >- type = TemporalLayersType::kFixedPattern; >+ type = Vp8TemporalLayersType::kFixedPattern; > } > temporal_layers_.emplace_back( >- TemporalLayers::CreateTemporalLayers(type, num_temporal_layers)); >+ CreateVp8TemporalLayers(type, num_temporal_layers)); > } > } > >@@ -85,7 +105,6 @@ void FakeVP8Encoder::PopulateCodecSpecific(CodecSpecificInfo* codec_specific, > uint32_t timestamp) { > RTC_DCHECK_CALLED_SEQUENTIALLY(&sequence_checker_); > codec_specific->codecType = kVideoCodecVP8; >- codec_specific->codec_name = ImplementationName(); > CodecSpecificInfoVP8* vp8Info = &(codec_specific->codecSpecific.VP8); > vp8Info->keyIdx = kNoKeyIdx; > vp8Info->nonReference = false; >@@ -105,9 +124,20 @@ EncodedImageCallback::Result FakeVP8Encoder::OnEncodedImage( > encoded_image._frameType, stream_idx, > encoded_image.Timestamp()); > >+ // Write width and height to the payload the same way as the real encoder >+ // does. >+ WriteFakeVp8(encoded_image._buffer, encoded_image._encodedWidth, >+ encoded_image._encodedHeight, >+ encoded_image._frameType == kVideoFrameKey); > return callback_->OnEncodedImage(encoded_image, &overrided_specific_info, > fragments); > } > >+VideoEncoder::EncoderInfo FakeVP8Encoder::GetEncoderInfo() const { >+ EncoderInfo info; >+ info.implementation_name = "FakeVp8Encoder"; >+ return info; >+} >+ > } // namespace test > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_vp8_encoder.h b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_vp8_encoder.h >index 7518e3a2db5d358405ffc16c86a73d02571c8ce4..a4d63a9ae30a6701ca0bb2a6eb39c495f07b11af 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_vp8_encoder.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_vp8_encoder.h >@@ -14,7 +14,7 @@ > #include <memory> > #include <vector> > >-#include "modules/video_coding/codecs/vp8/include/vp8_temporal_layers.h" >+#include 
"api/video_codecs/vp8_temporal_layers.h" > #include "test/fake_encoder.h" > > #include "rtc_base/criticalsection.h" >@@ -37,12 +37,12 @@ class FakeVP8Encoder : public FakeEncoder, public EncodedImageCallback { > > int32_t Release() override; > >- const char* ImplementationName() const override { return "FakeVp8Encoder"; } >- > Result OnEncodedImage(const EncodedImage& encodedImage, > const CodecSpecificInfo* codecSpecificInfo, > const RTPFragmentationHeader* fragments) override; > >+ EncoderInfo GetEncoderInfo() const override; >+ > private: > void SetupTemporalLayers(const VideoCodec& codec); > void PopulateCodecSpecific(CodecSpecificInfo* codec_specific, >@@ -54,7 +54,7 @@ class FakeVP8Encoder : public FakeEncoder, public EncodedImageCallback { > rtc::SequencedTaskChecker sequence_checker_; > EncodedImageCallback* callback_ RTC_GUARDED_BY(sequence_checker_); > >- std::vector<std::unique_ptr<TemporalLayers>> temporal_layers_ >+ std::vector<std::unique_ptr<Vp8TemporalLayers>> temporal_layers_ > RTC_GUARDED_BY(sequence_checker_); > }; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_vp8_encoder_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_vp8_encoder_unittest.cc >index c79ba0c5515c5fd65404bda058cee0c82daeb438..4e576fe9c9c25364f1489c9f199885faa289915d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_vp8_encoder_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fake_vp8_encoder_unittest.cc >@@ -14,11 +14,11 @@ > #include "absl/memory/memory.h" > #include "api/test/create_simulcast_test_fixture.h" > #include "api/test/simulcast_test_fixture.h" >+#include "api/test/video/function_video_decoder_factory.h" >+#include "api/test/video/function_video_encoder_factory.h" > #include "modules/video_coding/utility/simulcast_test_fixture_impl.h" >-#include "test/fake_decoder.h" >+#include "test/fake_vp8_decoder.h" > #include "test/fake_vp8_encoder.h" >-#include "test/function_video_decoder_factory.h" 
>-#include "test/function_video_encoder_factory.h" > > namespace webrtc { > namespace test { >@@ -32,7 +32,7 @@ std::unique_ptr<SimulcastTestFixture> CreateSpecificSimulcastTestFixture() { > }); > std::unique_ptr<VideoDecoderFactory> decoder_factory = > absl::make_unique<FunctionVideoDecoderFactory>( >- []() { return absl::make_unique<FakeDecoder>(); }); >+ []() { return absl::make_unique<FakeVp8Decoder>(); }); > return CreateSimulcastTestFixture(std::move(encoder_factory), > std::move(decoder_factory), > SdpVideoFormat("VP8")); >@@ -99,5 +99,10 @@ TEST(TestFakeVp8Codec, TestSpatioTemporalLayers333PatternEncoder) { > fixture->TestSpatioTemporalLayers333PatternEncoder(); > } > >+TEST(TestFakeVp8Codec, TestDecodeWidthHeightSet) { >+ auto fixture = CreateSpecificSimulcastTestFixture(); >+ fixture->TestDecodeWidthHeightSet(); >+} >+ > } // namespace test > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/frame_generator.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/frame_generator.cc >index acd63f931220b58836b1f5cbb03e6dcf4da6b060..f6fd285c1a13eaeedaf0853e3080dd22af55515a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/frame_generator.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/frame_generator.cc >@@ -390,11 +390,11 @@ class ScrollingImageFrameGenerator : public FrameGenerator { > (pixels_scrolled_x / 2); > > current_frame_ = webrtc::VideoFrame( >- new rtc::RefCountedObject<webrtc::WrappedI420Buffer>( >- target_width_, target_height_, &i420_buffer->DataY()[offset_y], >- i420_buffer->StrideY(), &i420_buffer->DataU()[offset_u], >- i420_buffer->StrideU(), &i420_buffer->DataV()[offset_v], >- i420_buffer->StrideV(), KeepRefUntilDone(i420_buffer)), >+ WrapI420Buffer(target_width_, target_height_, >+ &i420_buffer->DataY()[offset_y], i420_buffer->StrideY(), >+ &i420_buffer->DataU()[offset_u], i420_buffer->StrideU(), >+ &i420_buffer->DataV()[offset_v], i420_buffer->StrideV(), >+ KeepRefUntilDone(i420_buffer)), > 
kVideoRotation_0, 0); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/function_video_decoder_factory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/test/function_video_decoder_factory.h >deleted file mode 100644 >index 984f8534be00bb5901e9ab799423018f3039be01..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/function_video_decoder_factory.h >+++ /dev/null >@@ -1,56 +0,0 @@ >-/* >- * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef TEST_FUNCTION_VIDEO_DECODER_FACTORY_H_ >-#define TEST_FUNCTION_VIDEO_DECODER_FACTORY_H_ >- >-#include <functional> >-#include <memory> >-#include <utility> >-#include <vector> >- >-#include "api/video_codecs/sdp_video_format.h" >-#include "api/video_codecs/video_decoder_factory.h" >-#include "rtc_base/checks.h" >- >-namespace webrtc { >-namespace test { >- >-// A decoder factory producing decoders by calling a supplied create function. >-class FunctionVideoDecoderFactory final : public VideoDecoderFactory { >- public: >- explicit FunctionVideoDecoderFactory( >- std::function<std::unique_ptr<VideoDecoder>()> create) >- : create_([create](const SdpVideoFormat&) { return create(); }) {} >- explicit FunctionVideoDecoderFactory( >- std::function<std::unique_ptr<VideoDecoder>(const SdpVideoFormat&)> >- create) >- : create_(std::move(create)) {} >- >- // Unused by tests. 
>- std::vector<SdpVideoFormat> GetSupportedFormats() const override { >- RTC_NOTREACHED(); >- return {}; >- } >- >- std::unique_ptr<VideoDecoder> CreateVideoDecoder( >- const SdpVideoFormat& format) override { >- return create_(format); >- } >- >- private: >- const std::function<std::unique_ptr<VideoDecoder>(const SdpVideoFormat&)> >- create_; >-}; >- >-} // namespace test >-} // namespace webrtc >- >-#endif // TEST_FUNCTION_VIDEO_DECODER_FACTORY_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/function_video_encoder_factory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/test/function_video_encoder_factory.h >deleted file mode 100644 >index 567397f6c0c697994c60b5149b1d6dc37aa40d04..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/function_video_encoder_factory.h >+++ /dev/null >@@ -1,65 +0,0 @@ >-/* >- * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#ifndef TEST_FUNCTION_VIDEO_ENCODER_FACTORY_H_ >-#define TEST_FUNCTION_VIDEO_ENCODER_FACTORY_H_ >- >-#include <functional> >-#include <memory> >-#include <utility> >-#include <vector> >- >-#include "api/video_codecs/sdp_video_format.h" >-#include "api/video_codecs/video_encoder_factory.h" >-#include "rtc_base/checks.h" >- >-namespace webrtc { >-namespace test { >- >-// An encoder factory producing encoders by calling a supplied create >-// function. 
>-class FunctionVideoEncoderFactory final : public VideoEncoderFactory { >- public: >- explicit FunctionVideoEncoderFactory( >- std::function<std::unique_ptr<VideoEncoder>()> create) >- : create_([create](const SdpVideoFormat&) { return create(); }) {} >- explicit FunctionVideoEncoderFactory( >- std::function<std::unique_ptr<VideoEncoder>(const SdpVideoFormat&)> >- create) >- : create_(std::move(create)) {} >- >- // Unused by tests. >- std::vector<SdpVideoFormat> GetSupportedFormats() const override { >- RTC_NOTREACHED(); >- return {}; >- } >- >- CodecInfo QueryVideoEncoder( >- const SdpVideoFormat& /* format */) const override { >- CodecInfo codec_info; >- codec_info.is_hardware_accelerated = false; >- codec_info.has_internal_source = false; >- return codec_info; >- } >- >- std::unique_ptr<VideoEncoder> CreateVideoEncoder( >- const SdpVideoFormat& format) override { >- return create_(format); >- } >- >- private: >- const std::function<std::unique_ptr<VideoEncoder>(const SdpVideoFormat&)> >- create_; >-}; >- >-} // namespace test >-} // namespace webrtc >- >-#endif // TEST_FUNCTION_VIDEO_ENCODER_FACTORY_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/BUILD.gn >index ccc15a61005e9d3ae862f00c62d2d5a689f3a497..43de5f998cef6108000418d0e7fc5c9504daf6c3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/BUILD.gn >@@ -6,9 +6,9 @@ > # in the file PATENTS. All contributing project authors may > # be found in the AUTHORS file in the root of the source tree. 
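The BUILD.gn hunks below remove the per-target `libfuzzer_options = [ "max_len=..." ]` entries; the input-size cap moves into each fuzzer's `FuzzOneInput` instead, as the agc_fuzzer.cc and audio_decoder_*_fuzzer.cc hunks that follow show. A minimal sketch of that guard pattern (kMaxInputSize and AcceptFuzzInput are illustrative names, not part of webrtc):

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>

// The in-code equivalent of a GN-level "max_len=200000": oversized inputs
// are simply ignored, so the fuzzer never exercises them.
constexpr size_t kMaxInputSize = 200000;

bool AcceptFuzzInput(const uint8_t* data, size_t size) {
  if (data == nullptr || size > kMaxInputSize) {
    return false;  // Reject, mirroring what max_len used to enforce.
  }
  return true;
}
```

Keeping the cap next to the fuzzer body means it holds regardless of how the target is built or driven.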
> >-import("../../webrtc.gni") > import("//build/config/features.gni") > import("//testing/libfuzzer/fuzzer_test.gni") >+import("../../webrtc.gni") > > rtc_static_library("webrtc_fuzzer_main") { > sources = [ >@@ -125,7 +125,6 @@ webrtc_fuzzer_test("forward_error_correction_fuzzer") { > "../../modules/rtp_rtcp:rtp_rtcp_format", > "../../rtc_base:rtc_base_approved", > ] >- libfuzzer_options = [ "max_len=5000" ] > } > > webrtc_fuzzer_test("flexfec_header_reader_fuzzer") { >@@ -148,7 +147,6 @@ webrtc_fuzzer_test("flexfec_sender_fuzzer") { > "../../modules/rtp_rtcp:rtp_rtcp_format", > "../../system_wrappers", > ] >- libfuzzer_options = [ "max_len=200" ] > } > > webrtc_fuzzer_test("ulpfec_header_reader_fuzzer") { >@@ -186,7 +184,6 @@ webrtc_fuzzer_test("ulpfec_receiver_fuzzer") { > "../../modules/rtp_rtcp:rtp_rtcp_format", > "../../rtc_base:rtc_base_approved", > ] >- libfuzzer_options = [ "max_len=2000" ] > } > > webrtc_fuzzer_test("flexfec_receiver_fuzzer") { >@@ -198,7 +195,6 @@ webrtc_fuzzer_test("flexfec_receiver_fuzzer") { > "../../modules/rtp_rtcp:rtp_rtcp_format", > "../../rtc_base:rtc_base_approved", > ] >- libfuzzer_options = [ "max_len=2000" ] > } > > webrtc_fuzzer_test("packet_buffer_fuzzer") { >@@ -209,7 +205,6 @@ webrtc_fuzzer_test("packet_buffer_fuzzer") { > "../../modules/video_coding/", > "../../system_wrappers", > ] >- libfuzzer_options = [ "max_len=200000" ] > } > > webrtc_fuzzer_test("rtcp_receiver_fuzzer") { >@@ -283,8 +278,6 @@ webrtc_fuzzer_test("audio_decoder_ilbc_fuzzer") { > ":audio_decoder_fuzzer", > "../../modules/audio_coding:ilbc", > ] >- >- libfuzzer_options = [ "max_len=10000" ] > } > > webrtc_fuzzer_test("audio_decoder_isac_fuzzer") { >@@ -295,8 +288,6 @@ webrtc_fuzzer_test("audio_decoder_isac_fuzzer") { > ":audio_decoder_fuzzer", > "../../modules/audio_coding:isac", > ] >- >- libfuzzer_options = [ "max_len=20000" ] > } > > webrtc_fuzzer_test("audio_decoder_isac_incoming_packet_fuzzer") { >@@ -307,8 +298,6 @@ 
webrtc_fuzzer_test("audio_decoder_isac_incoming_packet_fuzzer") { > ":audio_decoder_fuzzer", > "../../modules/audio_coding:isac", > ] >- >- libfuzzer_options = [ "max_len=20000" ] > } > > webrtc_fuzzer_test("audio_decoder_isacfix_fuzzer") { >@@ -319,8 +308,6 @@ webrtc_fuzzer_test("audio_decoder_isacfix_fuzzer") { > ":audio_decoder_fuzzer", > "../../modules/audio_coding:isac_fix", > ] >- >- libfuzzer_options = [ "max_len=20000" ] > } > > webrtc_fuzzer_test("audio_decoder_opus_fuzzer") { >@@ -377,10 +364,7 @@ webrtc_fuzzer_test("neteq_rtp_fuzzer") { > "../../modules/audio_coding:pcm16b", > "../../modules/rtp_rtcp:rtp_rtcp_format", > "../../rtc_base:rtc_base_approved", >- "../../rtc_base:rtc_base_tests_utils", > ] >- >- libfuzzer_options = [ "max_len=100000" ] > } > > webrtc_fuzzer_test("neteq_signal_fuzzer") { >@@ -394,10 +378,7 @@ webrtc_fuzzer_test("neteq_signal_fuzzer") { > "../../modules/audio_coding:neteq_tools_minimal", > "../../modules/audio_coding:pcm16b", > "../../rtc_base:rtc_base_approved", >- "../../rtc_base:rtc_base_tests_utils", > ] >- >- libfuzzer_options = [ "max_len=100000" ] > } > > webrtc_fuzzer_test("residual_echo_detector_fuzzer") { >@@ -420,7 +401,6 @@ webrtc_fuzzer_test("sdp_parser_fuzzer") { > "../../pc:libjingle_peerconnection", > ] > seed_corpus = "corpora/sdp-corpus" >- libfuzzer_options = [ "max_len=16384" ] > } > > webrtc_fuzzer_test("stun_parser_fuzzer") { >@@ -487,6 +467,7 @@ rtc_static_library("audio_processing_fuzzer_helper") { > ":fuzz_data_helper", > "../../api/audio:audio_frame_api", > "../../modules/audio_processing", >+ "../../modules/audio_processing:api", > "../../rtc_base:checks", > "../../rtc_base:rtc_base_approved", > "//third_party/abseil-cpp/absl/types:optional", >@@ -501,10 +482,13 @@ webrtc_fuzzer_test("audio_processing_fuzzer") { > ":audio_processing_fuzzer_helper", > "../../api/audio:aec3_factory", > "../../modules/audio_processing", >+ "../../modules/audio_processing:api", > "../../modules/audio_processing/aec3", >- 
"../../modules/audio_processing/aec_dump:mock_aec_dump", >+ "../../modules/audio_processing/aec_dump", >+ "../../modules/audio_processing/aec_dump:aec_dump_impl", > "../../rtc_base:ptr_util", > "../../rtc_base:rtc_base_approved", >+ "../../rtc_base:rtc_task_queue", > "../../rtc_base:safe_minmax", > "../../system_wrappers:field_trial", > "//third_party/abseil-cpp/absl/memory", >@@ -519,13 +503,13 @@ webrtc_fuzzer_test("agc_fuzzer") { > deps = [ > ":fuzz_data_helper", > "../../modules/audio_processing", >+ "../../modules/audio_processing:api", > "../../rtc_base:rtc_base_approved", > "../../rtc_base:safe_minmax", > "//third_party/abseil-cpp/absl/memory", > ] > > seed_corpus = "corpora/agc-corpus" >- libfuzzer_options = [ "max_len=200000" ] > } > > webrtc_fuzzer_test("comfort_noise_decoder_fuzzer") { >@@ -534,11 +518,9 @@ webrtc_fuzzer_test("comfort_noise_decoder_fuzzer") { > ] > deps = [ > "../../api:array_view", >- "../../modules/audio_coding:cng", >+ "../../modules/audio_coding:webrtc_cng", > "../../rtc_base:rtc_base_approved", > ] >- >- libfuzzer_options = [ "max_len=5000" ] > } > > webrtc_fuzzer_test("rtp_frame_reference_finder_fuzzer") { >@@ -551,7 +533,6 @@ webrtc_fuzzer_test("rtp_frame_reference_finder_fuzzer") { > "../../system_wrappers", > "//third_party/abseil-cpp/absl/memory", > ] >- libfuzzer_options = [ "max_len=20000" ] > } > > webrtc_fuzzer_test("frame_buffer2_fuzzer") { >@@ -562,5 +543,4 @@ webrtc_fuzzer_test("frame_buffer2_fuzzer") { > "../../modules/video_coding/", > "../../system_wrappers:system_wrappers", > ] >- libfuzzer_options = [ "max_len=10000" ] > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/agc_fuzzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/agc_fuzzer.cc >index 5bd921e74dae3b99859ccf52214cc11faba09f40..f2c90480c835eadee3db2289222a5ccadeb99e87 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/agc_fuzzer.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/agc_fuzzer.cc >@@ -109,6 +109,9 @@ void FuzzGainController(test::FuzzDataHelper* fuzz_data, GainControlImpl* gci) { > } // namespace > > void FuzzOneInput(const uint8_t* data, size_t size) { >+ if (size > 200000) { >+ return; >+ } > test::FuzzDataHelper fuzz_data(rtc::ArrayView<const uint8_t>(data, size)); > rtc::CriticalSection crit_capture; > rtc::CriticalSection crit_render; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/audio_decoder_ilbc_fuzzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/audio_decoder_ilbc_fuzzer.cc >index a68725d6aacd162e2a2fb40f69a1110fc74027b3..8548645c63f9b767a5952b7601cb2c0f1c3dabd5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/audio_decoder_ilbc_fuzzer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/audio_decoder_ilbc_fuzzer.cc >@@ -13,6 +13,9 @@ > > namespace webrtc { > void FuzzOneInput(const uint8_t* data, size_t size) { >+ if (size > 10000) { >+ return; >+ } > AudioDecoderIlbcImpl dec; > static const int kSampleRateHz = 8000; > static const size_t kAllocatedOuputSizeSamples = kSampleRateHz / 10; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/audio_decoder_isac_fuzzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/audio_decoder_isac_fuzzer.cc >index e79996e05b28a34bf11d1b89697406b46d120e70..b579083956c6bce064be4f98b75e95f003b67e2c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/audio_decoder_isac_fuzzer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/audio_decoder_isac_fuzzer.cc >@@ -13,6 +13,9 @@ > > namespace webrtc { > void FuzzOneInput(const uint8_t* data, size_t size) { >+ if (size > 20000) { >+ return; >+ } > const int sample_rate_hz = size % 2 == 0 ? 16000 : 32000; // 16 or 32 kHz. > static const size_t kAllocatedOuputSizeSamples = 32000 / 10; // 100 ms. 
> int16_t output[kAllocatedOuputSizeSamples]; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/audio_decoder_isac_incoming_packet_fuzzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/audio_decoder_isac_incoming_packet_fuzzer.cc >index 5645142a034fa2326ca153e2393194bec9462684..9bd6234fa1495e7175fc01221e36eae24a6303aa 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/audio_decoder_isac_incoming_packet_fuzzer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/audio_decoder_isac_incoming_packet_fuzzer.cc >@@ -13,6 +13,9 @@ > > namespace webrtc { > void FuzzOneInput(const uint8_t* data, size_t size) { >+ if (size > 20000) { >+ return; >+ } > AudioDecoderIsacFloatImpl dec(16000); > FuzzAudioDecoderIncomingPacket(data, size, &dec); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/audio_decoder_isacfix_fuzzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/audio_decoder_isacfix_fuzzer.cc >index 444395b17df20c5c9d90c9208e1d0cf205255354..6477dc361bddd777f60b830baf21050c3df07a1c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/audio_decoder_isacfix_fuzzer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/audio_decoder_isacfix_fuzzer.cc >@@ -13,6 +13,9 @@ > > namespace webrtc { > void FuzzOneInput(const uint8_t* data, size_t size) { >+ if (size > 20000) { >+ return; >+ } > static const int kSampleRateHz = 16000; > static const size_t kAllocatedOuputSizeSamples = 16000 / 10; // 100 ms. 
> int16_t output[kAllocatedOuputSizeSamples]; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/audio_processing_configs_fuzzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/audio_processing_configs_fuzzer.cc >index 8117c52aaf2b315f7f066b4b192919891a41d333..dd0013684edbcbcf34081faf38ae21421bfb2378 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/audio_processing_configs_fuzzer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/audio_processing_configs_fuzzer.cc >@@ -13,10 +13,11 @@ > > #include "absl/memory/memory.h" > #include "api/audio/echo_canceller3_factory.h" >-#include "modules/audio_processing/aec_dump/mock_aec_dump.h" >+#include "modules/audio_processing/aec_dump/aec_dump_factory.h" > #include "modules/audio_processing/include/audio_processing.h" > #include "rtc_base/arraysize.h" > #include "rtc_base/numerics/safe_minmax.h" >+#include "rtc_base/task_queue.h" > #include "system_wrappers/include/field_trial.h" > #include "test/fuzzers/audio_processing_fuzzer_helper.h" > #include "test/fuzzers/fuzz_data_helper.h" >@@ -25,36 +26,45 @@ namespace webrtc { > namespace { > > const std::string kFieldTrialNames[] = { >- "WebRTC-Aec3TransparentModeKillSwitch", >- "WebRTC-Aec3StationaryRenderImprovementsKillSwitch", >- "WebRTC-Aec3EnforceDelayAfterRealignmentKillSwitch", >- "WebRTC-Aec3UseShortDelayEstimatorWindow", >- "WebRTC-Aec3ReverbBasedOnRenderKillSwitch", >- "WebRTC-Aec3ReverbModellingKillSwitch", >- "WebRTC-Aec3FilterAnalyzerPreprocessorKillSwitch", >- "WebRTC-Aec3TransparencyImprovementsKillSwitch", >- "WebRTC-Aec3SoftTransparentModeKillSwitch", >- "WebRTC-Aec3OverrideEchoPathGainKillSwitch", >- "WebRTC-Aec3ZeroExternalDelayHeadroomKillSwitch", >- "WebRTC-Aec3DownSamplingFactor8KillSwitch", >+ "WebRTC-Aec3AdaptErleOnLowRenderKillSwitch", >+ "WebRTC-Aec3AgcGainChangeResponseKillSwitch", >+ "WebRTC-Aec3BoundedNearendKillSwitch", >+ "WebRTC-Aec3EarlyShadowFilterJumpstartKillSwitch", >+ 
"WebRTC-Aec3EnableAdaptiveEchoReverbEstimation", >+ "WebRTC-Aec3EnableLegacyDominantNearend", >+ "WebRTC-Aec3EnableUnityInitialRampupGain", >+ "WebRTC-Aec3EnableUnityNonZeroRampupGain", > "WebRTC-Aec3EnforceSkewHysteresis1", > "WebRTC-Aec3EnforceSkewHysteresis2", >- "WebRTC-Aec3NewSuppressionKillSwitch", >- "WebRTC-Aec3LinearModeWithDivergedFilterKillSwitch", >+ "WebRTC-Aec3FilterAnalyzerPreprocessorKillSwitch", > "WebRTC-Aec3MisadjustmentEstimatorKillSwitch", >+ "WebRTC-Aec3NewFilterParamsKillSwitch", >+ "WebRTC-Aec3NewRenderBufferingKillSwitch", >+ "WebRTC-Aec3OverrideEchoPathGainKillSwitch", > "WebRTC-Aec3RapidAgcGainRecoveryKillSwitch", >- "WebRTC-Aec3SlowFilterAdaptationKillSwitch", >- "WebRTC-Aec3SmoothUpdatesTailFreqRespKillSwitch", >- "WebRTC-Aec3SuppressorNearendAveragingKillSwitch", >- "WebRTC-Aec3AgcGainChangeResponseKillSwitch", >+ "WebRTC-Aec3ResetErleAtGainChangesKillSwitch", >+ "WebRTC-Aec3ReverbBasedOnRenderKillSwitch", >+ "WebRTC-Aec3ReverbModellingKillSwitch", >+ "WebRTC-Aec3ShadowFilterBoostedJumpstartKillSwitch", > "WebRTC-Aec3ShadowFilterJumpstartKillSwitch", >- "WebRTC-Aec3EarlyLinearFilterUsageKillSwitch", >- "WebRTC-Aec3ShortInitialStateKillSwitch", >+ "WebRTC-Aec3ShortReverbKillSwitch", >+ "WebRTC-Aec3SmoothSignalTransitionsKillSwitch", >+ "WebRTC-Aec3SmoothUpdatesTailFreqRespKillSwitch", >+ "WebRTC-Aec3SoftTransparentModeKillSwitch", > "WebRTC-Aec3StandardNonlinearReverbModelKillSwitch", >- "WebRTC-Aec3EnableAdaptiveEchoReverbEstimation"}; >+ "WebRTC-Aec3StrictDivergenceCheckKillSwitch", >+ "WebRTC-Aec3UseLegacyNormalSuppressorTuning", >+ "WebRTC-Aec3UseOffsetBlocks", >+ "WebRTC-Aec3UseShortDelayEstimatorWindow", >+ "WebRTC-Aec3UseStationarityPropertiesKillSwitch", >+ "WebRTC-Aec3UtilizeShadowFilterOutputKillSwitch", >+ "WebRTC-Aec3ZeroExternalDelayHeadroomKillSwitch", >+ "WebRTC-Aec3EarlyDelayDetectionKillSwitch", >+}; > > std::unique_ptr<AudioProcessing> CreateApm(test::FuzzDataHelper* fuzz_data, >- std::string* field_trial_string) { >+ 
std::string* field_trial_string, >+ rtc::TaskQueue* worker_queue) { > // Parse boolean values for optionally enabling different > // configurable public components of APM. > bool exp_agc = fuzz_data->ReadOrDefaultValue(true); >@@ -79,12 +89,13 @@ std::unique_ptr<AudioProcessing> CreateApm(test::FuzzDataHelper* fuzz_data, > > // Read an int8 value, but don't let it be too large or small. > const float gain_controller2_gain_db = >- rtc::SafeClamp<int>(fuzz_data->ReadOrDefaultValue<int8_t>(0), -50, 50); >+ rtc::SafeClamp<int>(fuzz_data->ReadOrDefaultValue<int8_t>(0), -40, 40); > > constexpr size_t kNumFieldTrials = arraysize(kFieldTrialNames); > // Verify that the read data type has enough bits to fuzz the field trials. >- using FieldTrialBitmaskType = uint32_t; >- RTC_DCHECK_LE(kNumFieldTrials, sizeof(FieldTrialBitmaskType) * 8); >+ using FieldTrialBitmaskType = uint64_t; >+ static_assert(kNumFieldTrials <= sizeof(FieldTrialBitmaskType) * 8, >+ "FieldTrialBitmaskType is not large enough."); > std::bitset<kNumFieldTrials> field_trial_bitmask( > fuzz_data->ReadOrDefaultValue<FieldTrialBitmaskType>(0)); > for (size_t i = 0; i < kNumFieldTrials; ++i) { >@@ -129,8 +140,9 @@ std::unique_ptr<AudioProcessing> CreateApm(test::FuzzDataHelper* fuzz_data, > .SetEchoControlFactory(std::move(echo_control_factory)) > .Create(config)); > >- apm->AttachAecDump( >- absl::make_unique<testing::NiceMock<webrtc::test::MockAecDump>>()); >+#ifdef WEBRTC_LINUX >+ apm->AttachAecDump(AecDumpFactory::Create("/dev/null", -1, worker_queue)); >+#endif > > webrtc::AudioProcessing::Config apm_config; > apm_config.echo_canceller.enabled = use_aec || use_aecm; >@@ -139,7 +151,7 @@ std::unique_ptr<AudioProcessing> CreateApm(test::FuzzDataHelper* fuzz_data, > apm_config.high_pass_filter.enabled = hpf; > apm_config.gain_controller2.enabled = use_agc2_limiter; > >- apm_config.gain_controller2.fixed_gain_db = gain_controller2_gain_db; >+ apm_config.gain_controller2.fixed_digital.gain_db = 
gain_controller2_gain_db; > > apm->ApplyConfig(apm_config); > >@@ -158,7 +170,10 @@ void FuzzOneInput(const uint8_t* data, size_t size) { > // This string must be in scope during execution, according to documentation > // for field_trial.h. Hence it's created here and not in CreateApm. > std::string field_trial_string = ""; >- auto apm = CreateApm(&fuzz_data, &field_trial_string); >+ >+ std::unique_ptr<rtc::TaskQueue> worker_queue( >+ new rtc::TaskQueue("rtc-low-prio", rtc::TaskQueue::Priority::LOW)); >+ auto apm = CreateApm(&fuzz_data, &field_trial_string, worker_queue.get()); > > if (apm) { > FuzzAudioProcessing(&fuzz_data, std::move(apm)); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/audio_processing_fuzzer_helper.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/audio_processing_fuzzer_helper.cc >index 073ab17bd931c4bbec133d0e90ad5c1b3cbaf4c3..dec4e1f05e4d422a4dfef86a680fa430be076f75 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/audio_processing_fuzzer_helper.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/audio_processing_fuzzer_helper.cc >@@ -130,10 +130,8 @@ void FuzzAudioProcessing(test::FuzzDataHelper* fuzz_data, > } > } > >- // Make calls to stats gathering functions to cover these >- // codeways. >- static_cast<void>(apm->GetStatistics()); >- static_cast<void>(apm->GetStatistics(true)); >+ // Cover stats gathering code paths. 
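In the audio_processing_configs_fuzzer.cc hunk above, the field-trial list grows past 32 entries, so the fuzzed bitmask type widens from uint32_t to uint64_t and the runtime RTC_DCHECK becomes a compile-time static_assert. A hedged sketch of turning bitmask bits into an enabled-trial string (BuildFieldTrialString is an illustrative helper, and the "/Enabled/" formatting is an assumption, not taken from the patch):

```cpp
#include <bitset>
#include <cassert>
#include <cstdint>
#include <string>
#include <vector>

// Each set bit i enables names[i]; a 64-bit mask is needed once the list
// exceeds 32 entries, which is what the patch's static_assert guards.
std::string BuildFieldTrialString(const std::vector<std::string>& names,
                                  uint64_t bitmask) {
  static_assert(sizeof(bitmask) * 8 >= 64, "bitmask type too narrow");
  std::bitset<64> bits(bitmask);
  std::string result;
  for (size_t i = 0; i < names.size() && i < 64; ++i) {
    if (bits[i]) {
      result += names[i] + "/Enabled/";  // Assumed trial-string format.
    }
  }
  return result;
}
```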
>+ static_cast<void>(apm->GetStatistics(true /*has_remote_tracks*/)); > static_cast<void>(apm->UpdateHistogramsOnCallEnd()); > > RTC_DCHECK_NE(apm_return_code, AudioProcessing::kBadDataLengthError); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/comfort_noise_decoder_fuzzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/comfort_noise_decoder_fuzzer.cc >index 50166d7df778250041e52dc577e174b4a48b94ce..7f44af99fb53035d43c725fba93779800bf92fcd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/comfort_noise_decoder_fuzzer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/comfort_noise_decoder_fuzzer.cc >@@ -50,6 +50,9 @@ void FuzzOneInputTest(rtc::ArrayView<const uint8_t> data) { > } // namespace test > > void FuzzOneInput(const uint8_t* data, size_t size) { >+ if (size > 5000) { >+ return; >+ } > test::FuzzOneInputTest(rtc::ArrayView<const uint8_t>(data, size)); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/flexfec_receiver_fuzzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/flexfec_receiver_fuzzer.cc >index d96a328fa2c58bfef0a5ea54c431cd202a115d4f..c5034bb933ee83b46a18fc40a1a8f69b953b9867 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/flexfec_receiver_fuzzer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/flexfec_receiver_fuzzer.cc >@@ -25,7 +25,7 @@ class DummyCallback : public RecoveredPacketReceiver { > > void FuzzOneInput(const uint8_t* data, size_t size) { > constexpr size_t kMinDataNeeded = 12; >- if (size < kMinDataNeeded) { >+ if (size < kMinDataNeeded || size > 2000) { > return; > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/flexfec_sender_fuzzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/flexfec_sender_fuzzer.cc >index 8e79f95cf3fee323ac502e3dd2cd0231a1aaa0d4..4882f7df51a5c6bfff3824dc0004b70b351d76c0 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/flexfec_sender_fuzzer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/flexfec_sender_fuzzer.cc >@@ -31,10 +31,9 @@ const std::vector<RtpExtensionSize> kNoRtpHeaderExtensionSizes; > > void FuzzOneInput(const uint8_t* data, size_t size) { > size_t i = 0; >- if (size < 5) { >+ if (size < 5 || size > 200) { > return; > } >- > SimulatedClock clock(1 + data[i++]); > FlexfecSender sender(kFlexfecPayloadType, kFlexfecSsrc, kMediaSsrc, kNoMid, > kNoRtpHeaderExtensions, kNoRtpHeaderExtensionSizes, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/forward_error_correction_fuzzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/forward_error_correction_fuzzer.cc >index 9d5b872434ad3163c65980924cd9c8873abc7e2e..2eb357b74a81426bd3c5f26bd849bc8b51575c07 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/forward_error_correction_fuzzer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/forward_error_correction_fuzzer.cc >@@ -26,6 +26,9 @@ constexpr size_t kMaxPacketsInBuffer = 48; > } // namespace > > void FuzzOneInput(const uint8_t* data, size_t size) { >+ if (size > 5000) { >+ return; >+ } > // Object under test. 
> std::unique_ptr<ForwardErrorCorrection> fec = > ForwardErrorCorrection::CreateFlexfec(kFecSsrc, kMediaSsrc); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/frame_buffer2_fuzzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/frame_buffer2_fuzzer.cc >index 2d5830979c124e409c7f4aa19a23ef15719053ea..a5591041ff69b091a31a35dd35be32fa4c04d988 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/frame_buffer2_fuzzer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/frame_buffer2_fuzzer.cc >@@ -63,6 +63,9 @@ class FuzzyFrameObject : public video_coding::EncodedFrame { > } // namespace > > void FuzzOneInput(const uint8_t* data, size_t size) { >+ if (size > 10000) { >+ return; >+ } > DataReader reader(data, size); > Clock* clock = Clock::GetRealTimeClock(); > VCMJitterEstimator jitter_estimator(clock, 0, 0); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/mdns_parser_fuzzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/mdns_parser_fuzzer.cc >index c3f765ef787b482b098c0e881982345c3a9da7e8..294f683226c07603f8e3fb887505c976e7b4fc86 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/mdns_parser_fuzzer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/mdns_parser_fuzzer.cc >@@ -19,7 +19,7 @@ namespace webrtc { > > void FuzzOneInput(const uint8_t* data, size_t size) { > MessageBufferReader buf(reinterpret_cast<const char*>(data), size); >- auto mdns_msg = absl::make_unique<MDnsMessage>(); >+ auto mdns_msg = absl::make_unique<MdnsMessage>(); > mdns_msg->Read(&buf); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/neteq_rtp_fuzzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/neteq_rtp_fuzzer.cc >index 8aa6be59dd0e1c8d74cf9bc1d8735103e03b2798..94dbef39ebe5edeec46a3c2e5b0c1460946ad854 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/neteq_rtp_fuzzer.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/neteq_rtp_fuzzer.cc >@@ -146,6 +146,9 @@ void FuzzOneInputTest(const uint8_t* data, size_t size) { > } // namespace test > > void FuzzOneInput(const uint8_t* data, size_t size) { >+ if (size > 100000) { >+ return; >+ } > test::FuzzOneInputTest(data, size); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/neteq_signal_fuzzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/neteq_signal_fuzzer.cc >index 16e776072bc2f1cc62e6c1149091bf74546ad896..25302c31a8857a10ac0228584384f9368adb99c5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/neteq_signal_fuzzer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/neteq_signal_fuzzer.cc >@@ -140,8 +140,9 @@ class FuzzSignalInput : public NetEqInput { > } // namespace > > void FuzzOneInputTest(const uint8_t* data, size_t size) { >- if (size < 1) >+ if (size < 1 || size > 90000) { > return; >+ } > > FuzzDataHelper fuzz_data(rtc::ArrayView<const uint8_t>(data, size)); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/packet_buffer_fuzzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/packet_buffer_fuzzer.cc >index d226764cb25d82bff85fde48de87895b9750bead..56d1557072f47f52516f095f7e8d2b06efe1b4ed 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/packet_buffer_fuzzer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/packet_buffer_fuzzer.cc >@@ -21,6 +21,9 @@ class NullCallback : public video_coding::OnReceivedFrameCallback { > } // namespace > > void FuzzOneInput(const uint8_t* data, size_t size) { >+ if (size > 200000) { >+ return; >+ } > VCMPacket packet; > NullCallback callback; > SimulatedClock clock(0); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/rtcp_receiver_fuzzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/rtcp_receiver_fuzzer.cc >index 
cd763d0f605880416302b50a2ba3224dd21524e9..7f8cabafd68ea48beb9a4b87e9570cc4e303def3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/rtcp_receiver_fuzzer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/rtcp_receiver_fuzzer.cc >@@ -15,6 +15,8 @@ > namespace webrtc { > namespace { > >+constexpr int kRtcpIntervalMs = 1000; >+ > class NullModuleRtpRtcp : public RTCPReceiver::ModuleRtpRtcp { > public: > void SetTmmbn(std::vector<rtcp::TmmbItem>) override {} >@@ -30,7 +32,7 @@ void FuzzOneInput(const uint8_t* data, size_t size) { > SimulatedClock clock(1234); > > RTCPReceiver receiver(&clock, false, nullptr, nullptr, nullptr, nullptr, >- nullptr, &rtp_rtcp_module); >+ nullptr, kRtcpIntervalMs, &rtp_rtcp_module); > > receiver.IncomingPacket(data, size); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/rtp_frame_reference_finder_fuzzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/rtp_frame_reference_finder_fuzzer.cc >index fd98892a0480fd9cf0eb40efbe8a488174d21c0d..57d7a9623a07a0d12887ee7618433e2a94e1787a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/rtp_frame_reference_finder_fuzzer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/rtp_frame_reference_finder_fuzzer.cc >@@ -101,6 +101,9 @@ class FuzzyPacketBuffer : public video_coding::PacketBuffer { > } // namespace > > void FuzzOneInput(const uint8_t* data, size_t size) { >+ if (size > 20000) { >+ return; >+ } > DataReader reader(data, size); > rtc::scoped_refptr<FuzzyPacketBuffer> pb(new FuzzyPacketBuffer(&reader)); > NullCallback cb; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/rtp_packet_fuzzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/rtp_packet_fuzzer.cc >index 8bf8e74aba94cf7345a5dff1ec160ba59ebcc4fc..f774c0cb8b0184343601a8f2ffbc6230902794d0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/rtp_packet_fuzzer.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/rtp_packet_fuzzer.cc >@@ -121,6 +121,11 @@ void FuzzOneInput(const uint8_t* data, size_t size) { > packet.GetExtension<RtpGenericFrameDescriptorExtension>(&descriptor); > break; > } >+ case kRtpExtensionColorSpace: { >+ ColorSpace color_space; >+ packet.GetExtension<ColorSpaceExtension>(&color_space); >+ break; >+ } > } > } > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/sdp_parser_fuzzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/sdp_parser_fuzzer.cc >index e47156c571e5cd901530c0db6d8d87f510a78714..763dbc594acb74125bc328ecce881ab9afe6e368 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/sdp_parser_fuzzer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/sdp_parser_fuzzer.cc >@@ -15,6 +15,9 @@ > > namespace webrtc { > void FuzzOneInput(const uint8_t* data, size_t size) { >+ if (size > 16384) { >+ return; >+ } > std::string message(reinterpret_cast<const char*>(data), size); > webrtc::SdpParseError error; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/ulpfec_receiver_fuzzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/ulpfec_receiver_fuzzer.cc >index 6b74f3ac95d4d3ff3ad29074a12f67410c174298..7cb0fc61b5d2f20bcd5c42b709d1c8c7bd5d2c1a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/ulpfec_receiver_fuzzer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/fuzzers/ulpfec_receiver_fuzzer.cc >@@ -25,7 +25,7 @@ class DummyCallback : public RecoveredPacketReceiver { > > void FuzzOneInput(const uint8_t* data, size_t size) { > constexpr size_t kMinDataNeeded = 12; >- if (size < kMinDataNeeded) { >+ if (size < kMinDataNeeded || size > 2000) { > return; > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/ios/coverage_util_ios.h b/Source/ThirdParty/libwebrtc/Source/webrtc/test/ios/coverage_util_ios.h >new file mode 100644 >index 
0000000000000000000000000000000000000000..a17b69dca8011b1d216e3cbb8dde86d8779fabbf >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/ios/coverage_util_ios.h >@@ -0,0 +1,24 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef TEST_IOS_COVERAGE_UTIL_IOS_H_ >+#define TEST_IOS_COVERAGE_UTIL_IOS_H_ >+ >+namespace rtc { >+namespace test { >+ >+// In debug builds, if IOS_ENABLE_COVERAGE is defined, sets the filename of the >+// coverage file. Otherwise, it does nothing. >+void ConfigureCoverageReportPath(); >+ >+} // namespace test >+} // namespace rtc >+ >+#endif // TEST_IOS_COVERAGE_UTIL_IOS_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/ios/coverage_util_ios.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/test/ios/coverage_util_ios.mm >new file mode 100644 >index 0000000000000000000000000000000000000000..c21a16def24746a8ffd0fd22d9ac72b2d2484ae0 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/ios/coverage_util_ios.mm >@@ -0,0 +1,42 @@ >+/* >+ * Copyright 2018 The WebRTC Project Authors. All rights reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#import <Foundation/Foundation.h> >+ >+#ifdef WEBRTC_IOS_ENABLE_COVERAGE >+extern "C" void __llvm_profile_set_filename(const char* name); >+#endif >+ >+namespace rtc { >+namespace test { >+ >+void ConfigureCoverageReportPath() { >+#ifdef WEBRTC_IOS_ENABLE_COVERAGE >+ static dispatch_once_t once_token; >+ dispatch_once(&once_token, ^{ >+ // Writes the profraw file to the Documents directory, where the app has >+ // write rights. >+ NSArray* paths = >+ NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES); >+ NSString* documents_directory = [paths firstObject]; >+ NSString* file_name = [documents_directory stringByAppendingPathComponent:@"coverage.profraw"]; >+ >+ // For documentation, see: >+ // http://clang.llvm.org/docs/SourceBasedCodeCoverage.html >+ __llvm_profile_set_filename([file_name cStringUsingEncoding:NSUTF8StringEncoding]); >+ >+ // Print the path for easier retrieval. >+ NSLog(@"Coverage data at %@.", file_name); >+ }); >+#endif // ifdef WEBRTC_IOS_ENABLE_COVERAGE >+} >+ >+} // namespace test >+} // namespace rtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/ios/test_support.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/test/ios/test_support.mm >index fec597885497d6ba2996cdebb6bd96bd220cf06b..86005974fb2216c6365cc50c167b3968a8244d06 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/ios/test_support.mm >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/ios/test_support.mm >@@ -10,6 +10,7 @@ > > #import <UIKit/UIKit.h> > >+#include "test/ios/coverage_util_ios.h" > #include "test/ios/test_support.h" > #include "test/testsupport/perf_test.h" > >@@ -70,6 +71,8 @@ - (BOOL)application:(UIApplication *)application > } > > - (void)runTests { >+ rtc::test::ConfigureCoverageReportPath(); >+ > int exitStatus = g_test_suite(); > > if (g_save_chartjson_result) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/rtp_file_reader.h 
b/Source/ThirdParty/libwebrtc/Source/webrtc/test/rtp_file_reader.h >index 4913522ba701a6ad8100477b5ce3ce959bc80bf3..805044f52f9f9c7399c824910f19d12113eff9d1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/rtp_file_reader.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/rtp_file_reader.h >@@ -13,8 +13,6 @@ > #include <set> > #include <string> > >-#include "common_types.h" // NOLINT(build/include) >- > namespace webrtc { > namespace test { > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/rtp_file_writer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/test/rtp_file_writer.h >index 4496ca4a25fdc0867f177a49bb7a5e0257990be6..5e560d737560a4772be39d93215da1925d1c055a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/rtp_file_writer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/rtp_file_writer.h >@@ -12,7 +12,6 @@ > > #include <string> > >-#include "common_types.h" // NOLINT(build/include) > #include "test/rtp_file_reader.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/rtp_rtcp_observer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/test/rtp_rtcp_observer.h >index c9744109553742e7fd96513ccfaa3beb0ab4b7e2..d4aadfe0dadaf0f93fe3086e3985746beaa46d20 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/rtp_rtcp_observer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/rtp_rtcp_observer.h >@@ -71,9 +71,7 @@ class RtpRtcpObserver { > protected: > RtpRtcpObserver() : RtpRtcpObserver(0) {} > explicit RtpRtcpObserver(int event_timeout_ms) >- : observation_complete_(false, false), >- parser_(RtpHeaderParser::Create()), >- timeout_ms_(event_timeout_ms) { >+ : parser_(RtpHeaderParser::Create()), timeout_ms_(event_timeout_ms) { > parser_->RegisterRtpHeaderExtension(kRtpExtensionTransmissionTimeOffset, > kTOffsetExtensionId); > parser_->RegisterRtpHeaderExtension(kRtpExtensionAbsoluteSendTime, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/run_loop.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/test/run_loop.cc >index dc2ec18eea79f4d46ce7ec39cb4fb80cec24008e..052662a464e94229d4621db74ce60555b3bf92a0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/run_loop.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/run_loop.cc >@@ -14,7 +14,7 @@ > namespace webrtc { > namespace test { > >-void PressEnterToContinue() { >+void PressEnterToContinue(SingleThreadedTaskQueueForTesting&) { > puts(">> Press ENTER to continue..."); > while (getc(stdin) != '\n' && !feof(stdin)) > ; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/run_loop.h b/Source/ThirdParty/libwebrtc/Source/webrtc/test/run_loop.h >index 90063dc3e873d7ff249812fc50be079f8cffa731..36bfa0669708abe8abbbe031e76ca170b92f8b34 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/run_loop.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/run_loop.h >@@ -10,11 +10,13 @@ > #ifndef TEST_RUN_LOOP_H_ > #define TEST_RUN_LOOP_H_ > >+#include "test/single_threaded_task_queue.h" >+ > namespace webrtc { > namespace test { > > // Blocks until the user presses enter. 
>-void PressEnterToContinue(); >+void PressEnterToContinue(SingleThreadedTaskQueueForTesting &task_queue); > > } // namespace test > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/BUILD.gn >index 3703aa84e72e6c5cb5639607a3c633c308df8613..e9bda5112cb8ac9bd62f80b68242b97dc201981c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/BUILD.gn >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/BUILD.gn >@@ -37,14 +37,15 @@ if (rtc_include_tests) { > "../:test_common", > "../:test_support", > "../:video_test_common", >- "../..:webrtc_common", > "../../api:libjingle_peerconnection_api", > "../../api:transport_api", > "../../api/audio_codecs:builtin_audio_decoder_factory", > "../../api/audio_codecs:builtin_audio_encoder_factory", >+ "../../api/test/video:function_video_factory", > "../../api/units:data_rate", > "../../api/units:time_delta", > "../../api/units:timestamp", >+ "../../api/video:builtin_video_bitrate_allocator_factory", > "../../api/video:video_frame", > "../../api/video:video_frame_i420", > "../../api/video_codecs:video_codecs_api", >@@ -82,6 +83,7 @@ if (rtc_include_tests) { > "../../rtc_base:rtc_base_approved", > "../../rtc_base:rtc_base_tests_utils", > "../../rtc_base:rtc_task_queue", >+ "../../rtc_base:safe_minmax", > "../../rtc_base:sequenced_task_checker", > "../../rtc_base:stringutils", > "../../system_wrappers", >@@ -94,6 +96,9 @@ if (rtc_include_tests) { > } else if (is_ios || is_mac) { > deps += [ "../../modules/video_coding:objc_codec_factory_helper" ] > } >+ if (rtc_enable_protobuf) { >+ deps += [ "../../modules/audio_coding:ana_config_proto" ] >+ } > if (!build_with_chromium && is_clang) { > suppressed_configs += [ "//build/config/clang:find_bad_constructs" ] > } >@@ -111,11 +116,10 @@ if (rtc_include_tests) { > "../../logging:mocks", > "../../rtc_base:checks", > "../../rtc_base:rtc_base_approved", >- 
"../../rtc_base:rtc_base_tests_utils", > "../../system_wrappers", >+ "../../system_wrappers:field_trial", > "../../test:field_trial", > "../../test:test_support", >- "//system_wrappers:field_trial", > "//testing/gmock", > "//third_party/abseil-cpp/absl/memory", > ] >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/audio_stream.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/audio_stream.cc >index bd317d97d8b7c48819a0dfe65a1cb868788c6e1e..8359bd09ec58ece19ef9bcfd26b0df5bca087430 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/audio_stream.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/audio_stream.cc >@@ -9,10 +9,58 @@ > */ > #include "test/scenario/audio_stream.h" > >+#include "rtc_base/bitrateallocationstrategy.h" > #include "test/call_test.h" > >+#if WEBRTC_ENABLE_PROTOBUF >+RTC_PUSH_IGNORING_WUNDEF() >+#ifdef WEBRTC_ANDROID_PLATFORM_BUILD >+#include "external/webrtc/webrtc/modules/audio_coding/audio_network_adaptor/config.pb.h" >+#else >+#include "modules/audio_coding/audio_network_adaptor/config.pb.h" >+#endif >+RTC_POP_IGNORING_WUNDEF() >+#endif >+ > namespace webrtc { > namespace test { >+namespace { >+absl::optional<std::string> CreateAdaptationString( >+ AudioStreamConfig::NetworkAdaptation config) { >+#if WEBRTC_ENABLE_PROTOBUF >+ >+ audio_network_adaptor::config::ControllerManager cont_conf; >+ if (config.frame.max_rate_for_60_ms.IsFinite()) { >+ auto controller = >+ cont_conf.add_controllers()->mutable_frame_length_controller(); >+ controller->set_fl_decreasing_packet_loss_fraction( >+ config.frame.min_packet_loss_for_decrease); >+ controller->set_fl_increasing_packet_loss_fraction( >+ config.frame.max_packet_loss_for_increase); >+ >+ controller->set_fl_20ms_to_60ms_bandwidth_bps( >+ config.frame.min_rate_for_20_ms.bps<int32_t>()); >+ controller->set_fl_60ms_to_20ms_bandwidth_bps( >+ config.frame.max_rate_for_60_ms.bps<int32_t>()); >+ >+ if 
(config.frame.max_rate_for_120_ms.IsFinite()) { >+ controller->set_fl_60ms_to_120ms_bandwidth_bps( >+ config.frame.min_rate_for_60_ms.bps<int32_t>()); >+ controller->set_fl_120ms_to_60ms_bandwidth_bps( >+ config.frame.max_rate_for_120_ms.bps<int32_t>()); >+ } >+ } >+ cont_conf.add_controllers()->mutable_bitrate_controller(); >+ std::string config_string = cont_conf.SerializeAsString(); >+ return config_string; >+#else >+ RTC_LOG(LS_ERROR) << "audio_network_adaptation is enabled" >+ " but WEBRTC_ENABLE_PROTOBUF is false.\n" >+ "Ignoring settings."; >+ return absl::nullopt; >+#endif // WEBRTC_ENABLE_PROTOBUF >+} >+} // namespace > > SendAudioStream::SendAudioStream( > CallClient* sender, >@@ -20,7 +68,8 @@ SendAudioStream::SendAudioStream( > rtc::scoped_refptr<AudioEncoderFactory> encoder_factory, > Transport* send_transport) > : sender_(sender), config_(config) { >- AudioSendStream::Config send_config(send_transport); >+ AudioSendStream::Config send_config(send_transport, >+ /*media_transport=*/nullptr); > ssrc_ = sender->GetNextAudioSsrc(); > send_config.rtp.ssrc = ssrc_; > SdpAudioFormat::Parameters sdp_params; >@@ -42,6 +91,10 @@ SendAudioStream::SendAudioStream( > send_config.send_codec_spec->target_bitrate_bps = > config.encoder.fixed_rate->bps(); > >+ if (config.network_adaptation) { >+ send_config.audio_network_adaptor_config = >+ CreateAdaptationString(config.adapt); >+ } > if (config.encoder.allocate_bitrate || > config.stream.in_bandwidth_estimation) { > DataRate min_rate = DataRate::Infinity(); >@@ -54,11 +107,20 @@ SendAudioStream::SendAudioStream( > max_rate = *config.encoder.max_rate; > } > if (field_trial::IsEnabled("WebRTC-SendSideBwe-WithOverhead")) { >- TimeDelta frame_length = config.encoder.initial_frame_length; >+ TimeDelta min_frame_length = TimeDelta::ms(20); >+ // Note, depends on WEBRTC_OPUS_SUPPORT_120MS_PTIME being set, which is >+ // the default. 
>+ TimeDelta max_frame_length = TimeDelta::ms(120); > DataSize rtp_overhead = DataSize::bytes(12); >- DataSize total_overhead = config.stream.packet_overhead + rtp_overhead; >- min_rate += total_overhead / frame_length; >- max_rate += total_overhead / frame_length; >+ // Note that this does not include rtp extension overhead and will not >+ // follow updates in the transport overhead over time. >+ DataSize total_overhead = >+ sender_->transport_.packet_overhead() + rtp_overhead; >+ >+ min_rate += total_overhead / max_frame_length; >+ // In WebRTCVoiceEngine the max rate is also based on the max frame >+ // length. >+ max_rate += total_overhead / min_frame_length; > } > send_config.min_bitrate_bps = min_rate.bps(); > send_config.max_bitrate_bps = max_rate.bps(); >@@ -70,13 +132,17 @@ SendAudioStream::SendAudioStream( > {RtpExtension::kTransportSequenceNumberUri, 8}}; > } > >- if (config.stream.rate_allocation_priority) { >+ if (config.encoder.priority_rate) { > send_config.track_id = sender->GetNextPriorityId(); >+ sender_->call_->SetBitrateAllocationStrategy( >+ absl::make_unique<rtc::AudioPriorityBitrateAllocationStrategy>( >+ send_config.track_id, >+ config.encoder.priority_rate->bps<uint32_t>())); > } > send_stream_ = sender_->call_->CreateAudioSendStream(send_config); > if (field_trial::IsEnabled("WebRTC-SendSideBwe-WithOverhead")) { > sender->call_->OnAudioTransportOverheadChanged( >- config.stream.packet_overhead.bytes()); >+ sender_->transport_.packet_overhead().bytes()); > } > } > >@@ -86,17 +152,19 @@ SendAudioStream::~SendAudioStream() { > > void SendAudioStream::Start() { > send_stream_->Start(); >+ sender_->call_->SignalChannelNetworkState(MediaType::AUDIO, kNetworkUp); > } > >-bool SendAudioStream::TryDeliverPacket(rtc::CopyOnWriteBuffer packet, >- uint64_t receiver, >- Timestamp at_time) { >- // Removes added overhead before delivering RTCP packet to sender. 
>- RTC_DCHECK_GE(packet.size(), config_.stream.packet_overhead.bytes()); >- packet.SetSize(packet.size() - config_.stream.packet_overhead.bytes()); >- sender_->DeliverPacket(MediaType::AUDIO, packet, at_time); >- return true; >+ColumnPrinter SendAudioStream::StatsPrinter() { >+ return ColumnPrinter::Lambda( >+ "audio_target_rate", >+ [this](rtc::SimpleStringBuilder& sb) { >+ AudioSendStream::Stats stats = send_stream_->GetStats(); >+ sb.AppendFormat("%.0lf", stats.target_bitrate_bps / 8.0); >+ }, >+ 64); > } >+ > ReceiveAudioStream::ReceiveAudioStream( > CallClient* receiver, > AudioStreamConfig config, >@@ -108,11 +176,13 @@ ReceiveAudioStream::ReceiveAudioStream( > recv_config.rtp.local_ssrc = CallTest::kReceiverLocalAudioSsrc; > recv_config.rtcp_send_transport = feedback_transport; > recv_config.rtp.remote_ssrc = send_stream->ssrc_; >+ receiver->ssrc_media_types_[recv_config.rtp.remote_ssrc] = MediaType::AUDIO; > if (config.stream.in_bandwidth_estimation) { > recv_config.rtp.transport_cc = true; > recv_config.rtp.extensions = { > {RtpExtension::kTransportSequenceNumberUri, 8}}; > } >+ receiver_->AddExtensions(recv_config.rtp.extensions); > recv_config.decoder_factory = decoder_factory; > recv_config.decoder_map = { > {CallTest::kAudioSendPayloadType, {"opus", 48000, 2}}}; >@@ -123,49 +193,26 @@ ReceiveAudioStream::~ReceiveAudioStream() { > receiver_->call_->DestroyAudioReceiveStream(receive_stream_); > } > >-bool ReceiveAudioStream::TryDeliverPacket(rtc::CopyOnWriteBuffer packet, >- uint64_t receiver, >- Timestamp at_time) { >- RTC_DCHECK_GE(packet.size(), config_.stream.packet_overhead.bytes()); >- packet.SetSize(packet.size() - config_.stream.packet_overhead.bytes()); >- receiver_->DeliverPacket(MediaType::AUDIO, packet, at_time); >- return true; >+void ReceiveAudioStream::Start() { >+ receive_stream_->Start(); >+ receiver_->call_->SignalChannelNetworkState(MediaType::AUDIO, kNetworkUp); > } > > AudioStreamPair::~AudioStreamPair() = default; > > 
AudioStreamPair::AudioStreamPair( > CallClient* sender, >- std::vector<NetworkNode*> send_link, >- uint64_t send_receiver_id, > rtc::scoped_refptr<AudioEncoderFactory> encoder_factory, > CallClient* receiver, >- std::vector<NetworkNode*> return_link, >- uint64_t return_receiver_id, > rtc::scoped_refptr<AudioDecoderFactory> decoder_factory, > AudioStreamConfig config) > : config_(config), >- send_link_(send_link), >- return_link_(return_link), >- send_transport_(sender, >- send_link.front(), >- send_receiver_id, >- config.stream.packet_overhead), >- return_transport_(receiver, >- return_link.front(), >- return_receiver_id, >- config.stream.packet_overhead), >- send_stream_(sender, config, encoder_factory, &send_transport_), >+ send_stream_(sender, config, encoder_factory, &sender->transport_), > receive_stream_(receiver, > config, > &send_stream_, > decoder_factory, >- &return_transport_) { >- NetworkNode::Route(send_transport_.ReceiverId(), send_link_, >- &receive_stream_); >- NetworkNode::Route(return_transport_.ReceiverId(), return_link_, >- &send_stream_); >-} >+ &receiver->transport_) {} > > } // namespace test > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/audio_stream.h b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/audio_stream.h >index 17b8bc370efd3d1921598b860fb0f87b30238926..3ab0a1f2def74c9ebd87d35c9ecb578a23d2ffc0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/audio_stream.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/audio_stream.h >@@ -24,11 +24,12 @@ namespace test { > > // SendAudioStream represents sending of audio. It can be used for starting the > // stream if neccessary. 
>-class SendAudioStream : public NetworkReceiverInterface { >+class SendAudioStream { > public: > RTC_DISALLOW_COPY_AND_ASSIGN(SendAudioStream); > ~SendAudioStream(); > void Start(); >+ ColumnPrinter StatsPrinter(); > > private: > friend class Scenario; >@@ -38,11 +39,6 @@ class SendAudioStream : public NetworkReceiverInterface { > AudioStreamConfig config, > rtc::scoped_refptr<AudioEncoderFactory> encoder_factory, > Transport* send_transport); >- // Handles RTCP feedback for this stream. >- bool TryDeliverPacket(rtc::CopyOnWriteBuffer packet, >- uint64_t receiver, >- Timestamp at_time) override; >- > AudioSendStream* send_stream_ = nullptr; > CallClient* const sender_; > const AudioStreamConfig config_; >@@ -50,10 +46,11 @@ class SendAudioStream : public NetworkReceiverInterface { > }; > > // ReceiveAudioStream represents an audio receiver. It can't be used directly. >-class ReceiveAudioStream : public NetworkReceiverInterface { >+class ReceiveAudioStream { > public: > RTC_DISALLOW_COPY_AND_ASSIGN(ReceiveAudioStream); > ~ReceiveAudioStream(); >+ void Start(); > > private: > friend class Scenario; >@@ -63,9 +60,6 @@ class ReceiveAudioStream : public NetworkReceiverInterface { > SendAudioStream* send_stream, > rtc::scoped_refptr<AudioDecoderFactory> decoder_factory, > Transport* feedback_transport); >- bool TryDeliverPacket(rtc::CopyOnWriteBuffer packet, >- uint64_t receiver, >- Timestamp at_time) override; > AudioReceiveStream* receive_stream_ = nullptr; > CallClient* const receiver_; > const AudioStreamConfig config_; >@@ -84,23 +78,13 @@ class AudioStreamPair { > private: > friend class Scenario; > AudioStreamPair(CallClient* sender, >- std::vector<NetworkNode*> send_link, >- uint64_t send_receiver_id, > rtc::scoped_refptr<AudioEncoderFactory> encoder_factory, >- > CallClient* receiver, >- std::vector<NetworkNode*> return_link, >- uint64_t return_receiver_id, > rtc::scoped_refptr<AudioDecoderFactory> decoder_factory, > AudioStreamConfig config); > > private: > 
const AudioStreamConfig config_; >- std::vector<NetworkNode*> send_link_; >- std::vector<NetworkNode*> return_link_; >- NetworkNodeTransport send_transport_; >- NetworkNodeTransport return_transport_; >- > SendAudioStream send_stream_; > ReceiveAudioStream receive_stream_; > }; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/call_client.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/call_client.cc >index 47cc3bc57b82d2536e22fcde9e62c37ba43d260e..86eb4dc1db6a40a5f85169a36fe2e46411ce53a9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/call_client.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/call_client.cc >@@ -21,6 +21,39 @@ namespace webrtc { > namespace test { > namespace { > const char* kPriorityStreamId = "priority-track"; >+ >+CallClientFakeAudio InitAudio() { >+ CallClientFakeAudio setup; >+ auto capturer = TestAudioDeviceModule::CreatePulsedNoiseCapturer(256, 48000); >+ auto renderer = TestAudioDeviceModule::CreateDiscardRenderer(48000); >+ setup.fake_audio_device = TestAudioDeviceModule::CreateTestAudioDeviceModule( >+ std::move(capturer), std::move(renderer), 1.f); >+ setup.apm = AudioProcessingBuilder().Create(); >+ setup.fake_audio_device->Init(); >+ AudioState::Config audio_state_config; >+ audio_state_config.audio_mixer = AudioMixerImpl::Create(); >+ audio_state_config.audio_processing = setup.apm; >+ audio_state_config.audio_device_module = setup.fake_audio_device; >+ setup.audio_state = AudioState::Create(audio_state_config); >+ setup.fake_audio_device->RegisterAudioCallback( >+ setup.audio_state->audio_transport()); >+ return setup; >+} >+ >+Call* CreateCall(CallClientConfig config, >+ LoggingNetworkControllerFactory* network_controller_factory_, >+ rtc::scoped_refptr<AudioState> audio_state) { >+ CallConfig call_config(network_controller_factory_->GetEventLog()); >+ call_config.bitrate_config.max_bitrate_bps = >+ config.transport.rates.max_rate.bps_or(-1); >+ 
call_config.bitrate_config.min_bitrate_bps = >+ config.transport.rates.min_rate.bps(); >+ call_config.bitrate_config.start_bitrate_bps = >+ config.transport.rates.start_rate.bps(); >+ call_config.network_controller_factory = network_controller_factory_; >+ call_config.audio_state = audio_state; >+ return Call::Create(call_config); >+} > } > > LoggingNetworkControllerFactory::LoggingNetworkControllerFactory( >@@ -107,31 +140,17 @@ CallClient::CallClient(Clock* clock, > std::string log_filename, > CallClientConfig config) > : clock_(clock), >- network_controller_factory_(log_filename, config.transport) { >- CallConfig call_config(network_controller_factory_.GetEventLog()); >- call_config.bitrate_config.max_bitrate_bps = >- config.transport.rates.max_rate.bps_or(-1); >- call_config.bitrate_config.min_bitrate_bps = >- config.transport.rates.min_rate.bps(); >- call_config.bitrate_config.start_bitrate_bps = >- config.transport.rates.start_rate.bps(); >- call_config.network_controller_factory = &network_controller_factory_; >- call_config.audio_state = InitAudio(); >- call_.reset(Call::Create(call_config)); >- if (!config.priority_target_rate.IsZero() && >- config.priority_target_rate.IsFinite()) { >- call_->SetBitrateAllocationStrategy( >- absl::make_unique<rtc::AudioPriorityBitrateAllocationStrategy>( >- kPriorityStreamId, config.priority_target_rate.bps())); >- } >+ network_controller_factory_(log_filename, config.transport), >+ fake_audio_setup_(InitAudio()), >+ call_(CreateCall(config, >+ &network_controller_factory_, >+ fake_audio_setup_.audio_state)), >+ transport_(clock_, call_.get()), >+ header_parser_(RtpHeaderParser::Create()) { > } // namespace test > >-CallClient::~CallClient() {} >- >-void CallClient::DeliverPacket(MediaType media_type, >- rtc::CopyOnWriteBuffer packet, >- Timestamp at_time) { >- call_->Receiver()->DeliverPacket(media_type, packet, at_time.us()); >+CallClient::~CallClient() { >+ delete header_parser_; > } > > ColumnPrinter 
CallClient::StatsPrinter() { >@@ -149,6 +168,26 @@ Call::Stats CallClient::GetStats() { > return call_->GetStats(); > } > >+bool CallClient::TryDeliverPacket(rtc::CopyOnWriteBuffer packet, >+ uint64_t receiver, >+ Timestamp at_time) { >+ // Removes added overhead before delivering packet to sender. >+ RTC_DCHECK_GE(packet.size(), route_overhead_.at(receiver).bytes()); >+ packet.SetSize(packet.size() - route_overhead_.at(receiver).bytes()); >+ >+ MediaType media_type = MediaType::ANY; >+ if (!RtpHeaderParser::IsRtcp(packet.cdata(), packet.size())) { >+ RTPHeader header; >+ bool success = >+ header_parser_->Parse(packet.cdata(), packet.size(), &header); >+ if (!success) >+ return false; >+ media_type = ssrc_media_types_[header.ssrc]; >+ } >+ call_->Receiver()->DeliverPacket(media_type, packet, at_time.us()); >+ return true; >+} >+ > uint32_t CallClient::GetNextVideoSsrc() { > RTC_CHECK_LT(next_video_ssrc_index_, CallTest::kNumSsrcs); > return CallTest::kVideoSendSsrcs[next_video_ssrc_index_++]; >@@ -170,21 +209,12 @@ std::string CallClient::GetNextPriorityId() { > return kPriorityStreamId; > } > >-rtc::scoped_refptr<AudioState> CallClient::InitAudio() { >- auto capturer = TestAudioDeviceModule::CreatePulsedNoiseCapturer(256, 48000); >- auto renderer = TestAudioDeviceModule::CreateDiscardRenderer(48000); >- fake_audio_device_ = TestAudioDeviceModule::CreateTestAudioDeviceModule( >- std::move(capturer), std::move(renderer), 1.f); >- apm_ = AudioProcessingBuilder().Create(); >- fake_audio_device_->Init(); >- AudioState::Config audio_state_config; >- audio_state_config.audio_mixer = AudioMixerImpl::Create(); >- audio_state_config.audio_processing = apm_; >- audio_state_config.audio_device_module = fake_audio_device_; >- auto audio_state = AudioState::Create(audio_state_config); >- fake_audio_device_->RegisterAudioCallback(audio_state->audio_transport()); >- return audio_state; >+void CallClient::AddExtensions(std::vector<RtpExtension> extensions) { >+ for (const auto& 
extension : extensions) >+ header_parser_->RegisterRtpHeaderExtension(extension); > } > >+CallClientPair::~CallClientPair() = default; >+ > } // namespace test > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/call_client.h b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/call_client.h >index 80e2faf1626afd1d0b7ac4d4f7cf88b1d9988776..a12e06a065648979940bb841d24c658041edd615 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/call_client.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/call_client.h >@@ -17,8 +17,10 @@ > #include "logging/rtc_event_log/rtc_event_log.h" > #include "modules/audio_device/include/test_audio_device.h" > #include "modules/congestion_controller/test/controller_printer.h" >+#include "modules/rtp_rtcp/include/rtp_header_parser.h" > #include "rtc_base/constructormagic.h" > #include "test/scenario/column_printer.h" >+#include "test/scenario/network_node.h" > #include "test/scenario/scenario_config.h" > > namespace webrtc { >@@ -45,10 +47,15 @@ class LoggingNetworkControllerFactory > FILE* cc_out_ = nullptr; > }; > >+struct CallClientFakeAudio { >+ rtc::scoped_refptr<AudioProcessing> apm; >+ rtc::scoped_refptr<TestAudioDeviceModule> fake_audio_device; >+ rtc::scoped_refptr<AudioState> audio_state; >+}; > // CallClient represents a participant in a call scenario. It is created by the > // Scenario class and is used as sender and receiver when setting up a media > // stream session. 
>-class CallClient { >+class CallClient : public NetworkReceiverInterface { > public: > CallClient(Clock* clock, std::string log_filename, CallClientConfig config); > RTC_DISALLOW_COPY_AND_ASSIGN(CallClient); >@@ -56,37 +63,63 @@ class CallClient { > ~CallClient(); > ColumnPrinter StatsPrinter(); > Call::Stats GetStats(); >+ DataRate send_bandwidth() { >+ return DataRate::bps(GetStats().send_bandwidth_bps); >+ } >+ >+ bool TryDeliverPacket(rtc::CopyOnWriteBuffer packet, >+ uint64_t receiver, >+ Timestamp at_time) override; > > private: > friend class Scenario; >+ friend class CallClientPair; > friend class SendVideoStream; >+ friend class VideoStreamPair; > friend class ReceiveVideoStream; > friend class SendAudioStream; > friend class ReceiveAudioStream; >+ friend class AudioStreamPair; > friend class NetworkNodeTransport; >- // TODO(srte): Consider using the Columnprinter interface for this. >- void DeliverPacket(MediaType media_type, >- rtc::CopyOnWriteBuffer packet, >- Timestamp at_time); > uint32_t GetNextVideoSsrc(); > uint32_t GetNextAudioSsrc(); > uint32_t GetNextRtxSsrc(); > std::string GetNextPriorityId(); >+ void AddExtensions(std::vector<RtpExtension> extensions); > > Clock* clock_; > LoggingNetworkControllerFactory network_controller_factory_; >+ CallClientFakeAudio fake_audio_setup_; > std::unique_ptr<Call> call_; >- >- rtc::scoped_refptr<AudioState> InitAudio(); >- >- rtc::scoped_refptr<AudioProcessing> apm_; >- rtc::scoped_refptr<TestAudioDeviceModule> fake_audio_device_; >+ NetworkNodeTransport transport_; >+ RtpHeaderParser* const header_parser_; > > std::unique_ptr<FecControllerFactoryInterface> fec_controller_factory_; >+ // Stores the configured overhead per known incoming route. This is used to >+ // subtract the overhead before processing. 
>+ std::map<uint64_t, DataSize> route_overhead_; > int next_video_ssrc_index_ = 0; > int next_rtx_ssrc_index_ = 0; > int next_audio_ssrc_index_ = 0; > int next_priority_index_ = 0; >+ std::map<uint32_t, MediaType> ssrc_media_types_; >+}; >+ >+class CallClientPair { >+ public: >+ RTC_DISALLOW_COPY_AND_ASSIGN(CallClientPair); >+ ~CallClientPair(); >+ CallClient* first() { return first_; } >+ CallClient* second() { return second_; } >+ std::pair<CallClient*, CallClient*> forward() { return {first(), second()}; } >+ std::pair<CallClient*, CallClient*> reverse() { return {second(), first()}; } >+ >+ private: >+ friend class Scenario; >+ CallClientPair(CallClient* first, CallClient* second) >+ : first_(first), second_(second) {} >+ CallClient* const first_; >+ CallClient* const second_; > }; > } // namespace test > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/network_node.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/network_node.cc >index ff138939abb1efd51c1ae4a58ba179500dddbaca..b39d67bd25c8909a32035d3aeaa8a3e3dc3e1c9f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/network_node.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/network_node.cc >@@ -12,6 +12,8 @@ > #include <algorithm> > #include <vector> > >+#include "rtc_base/numerics/safe_minmax.h" >+ > namespace webrtc { > namespace test { > namespace { >@@ -168,29 +170,29 @@ SimulationNode::SimulationNode( > simulated_network_(simulation), > config_(config) {} > >-NetworkNodeTransport::NetworkNodeTransport(CallClient* sender, >- NetworkNode* send_net, >- uint64_t receiver, >- DataSize packet_overhead) >- : sender_(sender), >- send_net_(send_net), >- receiver_id_(receiver), >- packet_overhead_(packet_overhead) {} >+NetworkNodeTransport::NetworkNodeTransport(const Clock* sender_clock, >+ Call* sender_call) >+ : sender_clock_(sender_clock), sender_call_(sender_call) {} > > NetworkNodeTransport::~NetworkNodeTransport() = 
default; > > bool NetworkNodeTransport::SendRtp(const uint8_t* packet, > size_t length, > const PacketOptions& options) { >- int64_t send_time_ms = sender_->clock_->TimeInMilliseconds(); >+ int64_t send_time_ms = sender_clock_->TimeInMilliseconds(); > rtc::SentPacket sent_packet; > sent_packet.packet_id = options.packet_id; >+ sent_packet.info.included_in_feedback = options.included_in_feedback; >+ sent_packet.info.included_in_allocation = options.included_in_allocation; > sent_packet.send_time_ms = send_time_ms; > sent_packet.info.packet_size_bytes = length; > sent_packet.info.packet_type = rtc::PacketType::kData; >- sender_->call_->OnSentPacket(sent_packet); >+ sender_call_->OnSentPacket(sent_packet); > > Timestamp send_time = Timestamp::ms(send_time_ms); >+ rtc::CritScope crit(&crit_sect_); >+ if (!send_net_) >+ return false; > rtc::CopyOnWriteBuffer buffer(packet, length, > length + packet_overhead_.bytes()); > buffer.SetSize(length + packet_overhead_.bytes()); >@@ -199,13 +201,29 @@ bool NetworkNodeTransport::SendRtp(const uint8_t* packet, > > bool NetworkNodeTransport::SendRtcp(const uint8_t* packet, size_t length) { > rtc::CopyOnWriteBuffer buffer(packet, length); >- Timestamp send_time = Timestamp::ms(sender_->clock_->TimeInMilliseconds()); >+ Timestamp send_time = Timestamp::ms(sender_clock_->TimeInMilliseconds()); >+ rtc::CritScope crit(&crit_sect_); > buffer.SetSize(length + packet_overhead_.bytes()); >+ if (!send_net_) >+ return false; > return send_net_->TryDeliverPacket(buffer, receiver_id_, send_time); > } > >-uint64_t NetworkNodeTransport::ReceiverId() const { >- return receiver_id_; >+void NetworkNodeTransport::Connect(NetworkNode* send_node, >+ uint64_t receiver_id, >+ DataSize packet_overhead) { >+ rtc::CritScope crit(&crit_sect_); >+ send_net_ = send_node; >+ receiver_id_ = receiver_id; >+ packet_overhead_ = packet_overhead; >+ >+ rtc::NetworkRoute route; >+ route.connected = true; >+ route.local_network_id = receiver_id; >+ 
route.remote_network_id = receiver_id; >+ std::string transport_name = "dummy"; >+ sender_call_->GetTransportControllerSend()->OnNetworkRouteChanged( >+ transport_name, route); > } > > CrossTrafficSource::CrossTrafficSource(NetworkReceiverInterface* target, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/network_node.h b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/network_node.h >index af6db0a0099cc387fdb4708b98de1ba595f4ceb2..4e90f6f252a2c7cf5de76b88dd09a7e907a8686c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/network_node.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/network_node.h >@@ -18,10 +18,10 @@ > > #include "api/call/transport.h" > #include "api/units/timestamp.h" >+#include "call/call.h" > #include "call/simulated_network.h" > #include "rtc_base/constructormagic.h" > #include "rtc_base/copyonwritebuffer.h" >-#include "test/scenario/call_client.h" > #include "test/scenario/column_printer.h" > #include "test/scenario/scenario_config.h" > >@@ -119,23 +119,30 @@ class SimulationNode : public NetworkNode { > > class NetworkNodeTransport : public Transport { > public: >- NetworkNodeTransport(CallClient* sender, >- NetworkNode* send_net, >- uint64_t receiver, >- DataSize packet_overhead); >+ NetworkNodeTransport(const Clock* sender_clock, Call* sender_call); > ~NetworkNodeTransport() override; > > bool SendRtp(const uint8_t* packet, > size_t length, > const PacketOptions& options) override; > bool SendRtcp(const uint8_t* packet, size_t length) override; >- uint64_t ReceiverId() const; >+ >+ void Connect(NetworkNode* send_node, >+ uint64_t receiver_id, >+ DataSize packet_overhead); >+ >+ DataSize packet_overhead() { >+ rtc::CritScope crit(&crit_sect_); >+ return packet_overhead_; >+ } > > private: >- CallClient* const sender_; >- NetworkNode* const send_net_; >- const uint64_t receiver_id_; >- const DataSize packet_overhead_; >+ rtc::CriticalSection crit_sect_; >+ const Clock* const 
sender_clock_; >+ Call* const sender_call_; >+ NetworkNode* send_net_ RTC_GUARDED_BY(crit_sect_) = nullptr; >+ uint64_t receiver_id_ RTC_GUARDED_BY(crit_sect_) = 0; >+ DataSize packet_overhead_ RTC_GUARDED_BY(crit_sect_) = DataSize::Zero(); > }; > > // CrossTrafficSource is created by a Scenario and generates cross traffic. It >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/scenario.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/scenario.cc >index 7eb09e8bb7a1d1248624327ab4baae2eaea391c5..7589878d5ba353978f530cc2fbc0669bd5e64ba2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/scenario.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/scenario.cc >@@ -16,7 +16,7 @@ > #include "rtc_base/flags.h" > #include "test/testsupport/fileutils.h" > >-DEFINE_bool(scenario_logs, false, "Save logs from scenario framework."); >+WEBRTC_DEFINE_bool(scenario_logs, false, "Save logs from scenario framework."); > > namespace webrtc { > namespace test { >@@ -75,6 +75,8 @@ Scenario::Scenario(std::string file_name, bool real_time) > } > > Scenario::~Scenario() { >+ if (start_time_.IsFinite()) >+ Stop(); > if (!real_time_mode_) > rtc::SetClockForTesting(nullptr); > } >@@ -123,14 +125,50 @@ CallClient* Scenario::CreateClient( > return CreateClient(name, config); > } > >+CallClientPair* Scenario::CreateRoutes(CallClient* first, >+ std::vector<NetworkNode*> send_link, >+ CallClient* second, >+ std::vector<NetworkNode*> return_link) { >+ return CreateRoutes(first, send_link, >+ DataSize::bytes(PacketOverhead::kDefault), second, >+ return_link, DataSize::bytes(PacketOverhead::kDefault)); >+} >+ >+CallClientPair* Scenario::CreateRoutes(CallClient* first, >+ std::vector<NetworkNode*> send_link, >+ DataSize first_overhead, >+ CallClient* second, >+ std::vector<NetworkNode*> return_link, >+ DataSize second_overhead) { >+ CallClientPair* client_pair = new CallClientPair(first, second); >+ ChangeRoute(client_pair->forward(), 
send_link, first_overhead); >+ ChangeRoute(client_pair->reverse(), return_link, second_overhead); >+ client_pairs_.emplace_back(client_pair); >+ return client_pair; >+} >+ >+void Scenario::ChangeRoute(std::pair<CallClient*, CallClient*> clients, >+ std::vector<NetworkNode*> over_nodes) { >+ ChangeRoute(clients, over_nodes, DataSize::bytes(PacketOverhead::kDefault)); >+} >+ >+void Scenario::ChangeRoute(std::pair<CallClient*, CallClient*> clients, >+ std::vector<NetworkNode*> over_nodes, >+ DataSize overhead) { >+ uint64_t route_id = next_route_id_++; >+ clients.second->route_overhead_.insert({route_id, overhead}); >+ NetworkNode::Route(route_id, over_nodes, clients.second); >+ clients.first->transport_.Connect(over_nodes.front(), route_id, overhead); >+} >+ > SimulatedTimeClient* Scenario::CreateSimulatedTimeClient( > std::string name, > SimulatedTimeClientConfig config, > std::vector<PacketStreamConfig> stream_configs, > std::vector<NetworkNode*> send_link, > std::vector<NetworkNode*> return_link) { >- uint64_t send_id = next_receiver_id_++; >- uint64_t return_id = next_receiver_id_++; >+ uint64_t send_id = next_route_id_++; >+ uint64_t return_id = next_route_id_++; > SimulatedTimeClient* client = new SimulatedTimeClient( > GetFullPathOrEmpty(name), config, stream_configs, send_link, return_link, > send_id, return_id, Now()); >@@ -179,21 +217,21 @@ NetworkNode* Scenario::CreateNetworkNode( > void Scenario::TriggerPacketBurst(std::vector<NetworkNode*> over_nodes, > size_t num_packets, > size_t packet_size) { >- int64_t receiver_id = next_receiver_id_++; >- NetworkNode::Route(receiver_id, over_nodes, &null_receiver_); >+ int64_t route_id = next_route_id_++; >+ NetworkNode::Route(route_id, over_nodes, &null_receiver_); > for (size_t i = 0; i < num_packets; ++i) > over_nodes[0]->TryDeliverPacket(rtc::CopyOnWriteBuffer(packet_size), >- receiver_id, Now()); >+ route_id, Now()); > } > > void Scenario::NetworkDelayedAction(std::vector<NetworkNode*> over_nodes, > size_t 
packet_size, > std::function<void()> action) { >- int64_t receiver_id = next_receiver_id_++; >+ int64_t route_id = next_route_id_++; > action_receivers_.emplace_back(new ActionReceiver(action)); >- NetworkNode::Route(receiver_id, over_nodes, action_receivers_.back().get()); >- over_nodes[0]->TryDeliverPacket(rtc::CopyOnWriteBuffer(packet_size), >- receiver_id, Now()); >+ NetworkNode::Route(route_id, over_nodes, action_receivers_.back().get()); >+ over_nodes[0]->TryDeliverPacket(rtc::CopyOnWriteBuffer(packet_size), route_id, >+ Now()); > } > > CrossTrafficSource* Scenario::CreateCrossTraffic( >@@ -207,65 +245,46 @@ CrossTrafficSource* Scenario::CreateCrossTraffic( > CrossTrafficSource* Scenario::CreateCrossTraffic( > std::vector<NetworkNode*> over_nodes, > CrossTrafficConfig config) { >- int64_t receiver_id = next_receiver_id_++; >+ int64_t route_id = next_route_id_++; > cross_traffic_sources_.emplace_back( >- new CrossTrafficSource(over_nodes.front(), receiver_id, config)); >+ new CrossTrafficSource(over_nodes.front(), route_id, config)); > CrossTrafficSource* node = cross_traffic_sources_.back().get(); >- NetworkNode::Route(receiver_id, over_nodes, &null_receiver_); >+ NetworkNode::Route(route_id, over_nodes, &null_receiver_); > Every(config.min_packet_interval, > [this, node](TimeDelta delta) { node->Process(Now(), delta); }); > return node; > } > > VideoStreamPair* Scenario::CreateVideoStream( >- CallClient* sender, >- std::vector<NetworkNode*> send_link, >- CallClient* receiver, >- std::vector<NetworkNode*> return_link, >+ std::pair<CallClient*, CallClient*> clients, > std::function<void(VideoStreamConfig*)> config_modifier) { > VideoStreamConfig config; > config_modifier(&config); >- return CreateVideoStream(sender, send_link, receiver, return_link, config); >+ return CreateVideoStream(clients, config); > } > > VideoStreamPair* Scenario::CreateVideoStream( >- CallClient* sender, >- std::vector<NetworkNode*> send_link, >- CallClient* receiver, >- 
std::vector<NetworkNode*> return_link, >+ std::pair<CallClient*, CallClient*> clients, > VideoStreamConfig config) { >- uint64_t send_receiver_id = next_receiver_id_++; >- uint64_t return_receiver_id = next_receiver_id_++; >- > video_streams_.emplace_back( >- new VideoStreamPair(sender, send_link, send_receiver_id, receiver, >- return_link, return_receiver_id, config)); >+ new VideoStreamPair(clients.first, clients.second, config)); > return video_streams_.back().get(); > } > > AudioStreamPair* Scenario::CreateAudioStream( >- CallClient* sender, >- std::vector<NetworkNode*> send_link, >- CallClient* receiver, >- std::vector<NetworkNode*> return_link, >+ std::pair<CallClient*, CallClient*> clients, > std::function<void(AudioStreamConfig*)> config_modifier) { > AudioStreamConfig config; > config_modifier(&config); >- return CreateAudioStream(sender, send_link, receiver, return_link, config); >+ return CreateAudioStream(clients, config); > } > > AudioStreamPair* Scenario::CreateAudioStream( >- CallClient* sender, >- std::vector<NetworkNode*> send_link, >- CallClient* receiver, >- std::vector<NetworkNode*> return_link, >+ std::pair<CallClient*, CallClient*> clients, > AudioStreamConfig config) { >- uint64_t send_receiver_id = next_receiver_id_++; >- uint64_t return_receiver_id = next_receiver_id_++; >- >- audio_streams_.emplace_back(new AudioStreamPair( >- sender, send_link, send_receiver_id, audio_encoder_factory_, receiver, >- return_link, return_receiver_id, audio_decoder_factory_, config)); >+ audio_streams_.emplace_back( >+ new AudioStreamPair(clients.first, audio_encoder_factory_, clients.second, >+ audio_decoder_factory_, config)); > return audio_streams_.back().get(); > } > >@@ -288,37 +307,20 @@ void Scenario::At(TimeDelta offset, std::function<void()> function) { > } > > void Scenario::RunFor(TimeDelta duration) { >- RunUntil(duration, TimeDelta::PlusInfinity(), []() { return false; }); >+ RunUntil(Duration() + duration); >+} >+ >+void 
Scenario::RunUntil(TimeDelta max_duration) { >+ RunUntil(max_duration, TimeDelta::PlusInfinity(), []() { return false; }); > } > > void Scenario::RunUntil(TimeDelta max_duration, > TimeDelta poll_interval, > std::function<bool()> exit_function) { >- start_time_ = Timestamp::us(clock_->TimeInMicroseconds()); >- for (auto& activity : repeated_activities_) { >- activity->SetStartTime(start_time_); >- } >+ if (start_time_.IsInfinite()) >+ Start(); > >- for (auto& stream_pair : video_streams_) >- stream_pair->receive()->receive_stream_->Start(); >- for (auto& stream_pair : audio_streams_) >- stream_pair->receive()->receive_stream_->Start(); >- for (auto& stream_pair : video_streams_) { >- if (stream_pair->config_.autostart) { >- stream_pair->send()->Start(); >- } >- } >- for (auto& stream_pair : audio_streams_) { >- if (stream_pair->config_.autostart) { >- stream_pair->send()->Start(); >- } >- } >- for (auto& call : clients_) { >- call->call_->SignalChannelNetworkState(MediaType::AUDIO, kNetworkUp); >- call->call_->SignalChannelNetworkState(MediaType::VIDEO, kNetworkUp); >- } >- >- rtc::Event done_(false, false); >+ rtc::Event done_; > while (!exit_function() && Duration() < max_duration) { > Timestamp current_time = Now(); > TimeDelta duration = current_time - start_time_; >@@ -346,6 +348,32 @@ void Scenario::RunUntil(TimeDelta max_duration, > 1000); > } > } >+} >+ >+void Scenario::Start() { >+ start_time_ = Timestamp::us(clock_->TimeInMicroseconds()); >+ for (auto& activity : repeated_activities_) { >+ activity->SetStartTime(start_time_); >+ } >+ >+ for (auto& stream_pair : video_streams_) >+ stream_pair->receive()->Start(); >+ for (auto& stream_pair : audio_streams_) >+ stream_pair->receive()->Start(); >+ for (auto& stream_pair : video_streams_) { >+ if (stream_pair->config_.autostart) { >+ stream_pair->send()->Start(); >+ } >+ } >+ for (auto& stream_pair : audio_streams_) { >+ if (stream_pair->config_.autostart) { >+ stream_pair->send()->Start(); >+ } >+ } >+} >+ 
>+void Scenario::Stop() { >+ RTC_DCHECK(start_time_.IsFinite()); > for (auto& stream_pair : video_streams_) { > stream_pair->send()->video_capturer_->Stop(); > stream_pair->send()->send_stream_->Stop(); >@@ -356,6 +384,7 @@ void Scenario::RunUntil(TimeDelta max_duration, > stream_pair->receive()->receive_stream_->Stop(); > for (auto& stream_pair : audio_streams_) > stream_pair->receive()->receive_stream_->Stop(); >+ start_time_ = Timestamp::PlusInfinity(); > } > > Timestamp Scenario::Now() { >@@ -363,6 +392,8 @@ Timestamp Scenario::Now() { > } > > TimeDelta Scenario::Duration() { >+ if (start_time_.IsInfinite()) >+ return TimeDelta::Zero(); > return Now() - start_time_; > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/scenario.h b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/scenario.h >index 05876ac49c1563872f26ed38dc006839e8490379..38bef442ecdd53029513768dcea12eb55737949a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/scenario.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/scenario.h >@@ -78,6 +78,25 @@ class Scenario { > std::string name, > std::function<void(CallClientConfig*)> config_modifier); > >+ CallClientPair* CreateRoutes(CallClient* first, >+ std::vector<NetworkNode*> send_link, >+ CallClient* second, >+ std::vector<NetworkNode*> return_link); >+ >+ CallClientPair* CreateRoutes(CallClient* first, >+ std::vector<NetworkNode*> send_link, >+ DataSize first_overhead, >+ CallClient* second, >+ std::vector<NetworkNode*> return_link, >+ DataSize second_overhead); >+ >+ void ChangeRoute(std::pair<CallClient*, CallClient*> clients, >+ std::vector<NetworkNode*> over_nodes); >+ >+ void ChangeRoute(std::pair<CallClient*, CallClient*> clients, >+ std::vector<NetworkNode*> over_nodes, >+ DataSize overhead); >+ > SimulatedTimeClient* CreateSimulatedTimeClient( > std::string name, > SimulatedTimeClientConfig config, >@@ -86,28 +105,18 @@ class Scenario { > std::vector<NetworkNode*> 
return_link); > > VideoStreamPair* CreateVideoStream( >- CallClient* sender, >- std::vector<NetworkNode*> send_link, >- CallClient* receiver, >- std::vector<NetworkNode*> return_link, >+ std::pair<CallClient*, CallClient*> clients, > std::function<void(VideoStreamConfig*)> config_modifier); >- VideoStreamPair* CreateVideoStream(CallClient* sender, >- std::vector<NetworkNode*> send_link, >- CallClient* receiver, >- std::vector<NetworkNode*> return_link, >- VideoStreamConfig config); >+ VideoStreamPair* CreateVideoStream( >+ std::pair<CallClient*, CallClient*> clients, >+ VideoStreamConfig config); > > AudioStreamPair* CreateAudioStream( >- CallClient* sender, >- std::vector<NetworkNode*> send_link, >- CallClient* receiver, >- std::vector<NetworkNode*> return_link, >+ std::pair<CallClient*, CallClient*> clients, > std::function<void(AudioStreamConfig*)> config_modifier); >- AudioStreamPair* CreateAudioStream(CallClient* sender, >- std::vector<NetworkNode*> send_link, >- CallClient* receiver, >- std::vector<NetworkNode*> return_link, >- AudioStreamConfig config); >+ AudioStreamPair* CreateAudioStream( >+ std::pair<CallClient*, CallClient*> clients, >+ AudioStreamConfig config); > > CrossTrafficSource* CreateCrossTraffic( > std::vector<NetworkNode*> over_nodes, >@@ -131,9 +140,12 @@ class Scenario { > // Runs the scenario for the given time or until the exit function returns > // true. > void RunFor(TimeDelta duration); >+ void RunUntil(TimeDelta max_duration); > void RunUntil(TimeDelta max_duration, > TimeDelta probe_interval, > std::function<bool()> exit_function); >+ void Start(); >+ void Stop(); > > // Triggers sending of dummy packets over the given nodes. 
> void TriggerPacketBurst(std::vector<NetworkNode*> over_nodes, >@@ -167,6 +179,7 @@ class Scenario { > rtc::FakeClock event_log_fake_clock_; > > std::vector<std::unique_ptr<CallClient>> clients_; >+ std::vector<std::unique_ptr<CallClientPair>> client_pairs_; > std::vector<std::unique_ptr<NetworkNode>> network_nodes_; > std::vector<std::unique_ptr<CrossTrafficSource>> cross_traffic_sources_; > std::vector<std::unique_ptr<VideoStreamPair>> video_streams_; >@@ -179,7 +192,7 @@ class Scenario { > std::vector<std::unique_ptr<PendingActivity>> pending_activities_; > std::vector<std::unique_ptr<StatesPrinter>> printers_; > >- int64_t next_receiver_id_ = 40000; >+ int64_t next_route_id_ = 40000; > rtc::scoped_refptr<AudioDecoderFactory> audio_decoder_factory_; > rtc::scoped_refptr<AudioEncoderFactory> audio_encoder_factory_; > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/scenario_config.h b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/scenario_config.h >index e652e2ce3921df9d634957c1d05a1dc8443e97a1..88fdb8af01735cbc0567c1b1548a531cd3cfe02c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/scenario_config.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/scenario_config.h >@@ -28,7 +28,12 @@ struct PacketOverhead { > static constexpr size_t kIpv6 = 40; > static constexpr size_t kUdp = 8; > static constexpr size_t kSrtp = 10; >- static constexpr size_t kTurn = 4; >+ static constexpr size_t kStun = 4; >+ // TURN messages can be sent either with or without an established channel. >+ // In the latter case, a TURN Send/Data Indication is sent which has >+ // significantly more overhead. 
>+ static constexpr size_t kTurnChannelMessage = 4; >+ static constexpr size_t kTurnIndicationMessage = 36; > static constexpr size_t kDefault = kIpv4 + kUdp + kSrtp; > }; > struct TransportControllerConfig { >@@ -46,7 +51,6 @@ struct TransportControllerConfig { > > struct CallClientConfig { > TransportControllerConfig transport; >- DataRate priority_target_rate = DataRate::Zero(); > }; > > struct SimulatedTimeClientConfig { >@@ -71,6 +75,10 @@ struct PacketStreamConfig { > struct VideoStreamConfig { > bool autostart = true; > struct Source { >+ enum class ContentType { >+ kVideo, >+ kScreen, >+ } content_type = ContentType::kVideo; > enum Capture { > kGenerator, > kVideoFile, >@@ -103,6 +111,7 @@ struct VideoStreamConfig { > absl::optional<int> key_frame_interval = 3000; > > absl::optional<DataRate> max_data_rate; >+ absl::optional<int> max_framerate; > size_t num_simulcast_streams = 1; > using DegradationPreference = DegradationPreference; > DegradationPreference degradation_preference = >@@ -114,10 +123,10 @@ struct VideoStreamConfig { > ~Stream(); > bool packet_feedback = true; > bool use_rtx = true; >+ DataRate pad_to_rate = DataRate::Zero(); > TimeDelta nack_history_time = TimeDelta::ms(1000); > bool use_flexfec = false; > bool use_ulpfec = false; >- DataSize packet_overhead = DataSize::bytes(PacketOverhead::kDefault); > } stream; > struct Renderer { > enum Type { kFake } type = kFake; >@@ -132,6 +141,17 @@ struct AudioStreamConfig { > struct Source { > int channels = 1; > } source; >+ bool network_adaptation = false; >+ struct NetworkAdaptation { >+ struct FrameLength { >+ double min_packet_loss_for_decrease = 0; >+ double max_packet_loss_for_increase = 1; >+ DataRate min_rate_for_20_ms = DataRate::Zero(); >+ DataRate max_rate_for_60_ms = DataRate::Infinity(); >+ DataRate min_rate_for_60_ms = DataRate::Zero(); >+ DataRate max_rate_for_120_ms = DataRate::Infinity(); >+ } frame; >+ } adapt; > struct Encoder { > Encoder(); > Encoder(const Encoder&); >@@ -140,6 
+160,7 @@ struct AudioStreamConfig { > absl::optional<DataRate> fixed_rate; > absl::optional<DataRate> min_rate; > absl::optional<DataRate> max_rate; >+ absl::optional<DataRate> priority_rate; > TimeDelta initial_frame_length = TimeDelta::ms(20); > } encoder; > struct Stream { >@@ -147,8 +168,6 @@ struct AudioStreamConfig { > Stream(const Stream&); > ~Stream(); > bool in_bandwidth_estimation = false; >- bool rate_allocation_priority = false; >- DataSize packet_overhead = DataSize::bytes(PacketOverhead::kDefault); > } stream; > struct Render { > std::string sync_group; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/scenario_tests/bbr_performance.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/scenario_tests/bbr_performance.cc >index e87cc68dd31f70c8f8fce3f0dda757a6a2759d29..28cabc070fda07fb7663a130e4f87795c187f549 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/scenario_tests/bbr_performance.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/scenario_tests/bbr_performance.cc >@@ -167,31 +167,31 @@ TEST_P(BbrScenarioTest, ReceivesVideo) { > net_conf.simulation.delay_std_dev = conf_.scenario.delay_noise; > SimulationNode* send_net = s.CreateSimulationNode(net_conf); > SimulationNode* ret_net = s.CreateSimulationNode(net_conf); >- VideoStreamPair* alice_video = s.CreateVideoStream( >- alice, {send_net}, bob, {ret_net}, [&](VideoStreamConfig* c) { >+ auto route = s.CreateRoutes(alice, {send_net}, bob, {ret_net}); >+ >+ VideoStreamPair* alice_video = >+ s.CreateVideoStream(route->forward(), [&](VideoStreamConfig* c) { > c->encoder.fake.max_rate = DataRate::kbps(1800); > }); >- s.CreateAudioStream(alice, {send_net}, bob, {ret_net}, >- [&](AudioStreamConfig* c) { >- if (conf_.tuning.use_bbr) { >- c->stream.in_bandwidth_estimation = true; >- c->encoder.fixed_rate = DataRate::kbps(31); >- } >- }); >+ s.CreateAudioStream(route->forward(), [&](AudioStreamConfig* c) { >+ if (conf_.tuning.use_bbr) { >+ 
c->stream.in_bandwidth_estimation = true; >+ c->encoder.fixed_rate = DataRate::kbps(31); >+ } >+ }); > > VideoStreamPair* bob_video = nullptr; > if (conf_.scenario.return_traffic) { >- bob_video = s.CreateVideoStream( >- bob, {ret_net}, alice, {send_net}, [&](VideoStreamConfig* c) { >+ bob_video = >+ s.CreateVideoStream(route->reverse(), [&](VideoStreamConfig* c) { > c->encoder.fake.max_rate = DataRate::kbps(1800); > }); >- s.CreateAudioStream(bob, {ret_net}, alice, {send_net}, >- [&](AudioStreamConfig* c) { >- if (conf_.tuning.use_bbr) { >- c->stream.in_bandwidth_estimation = true; >- c->encoder.fixed_rate = DataRate::kbps(31); >- } >- }); >+ s.CreateAudioStream(route->reverse(), [&](AudioStreamConfig* c) { >+ if (conf_.tuning.use_bbr) { >+ c->stream.in_bandwidth_estimation = true; >+ c->encoder.fixed_rate = DataRate::kbps(31); >+ } >+ }); > } > CrossTrafficConfig cross_config; > cross_config.peak_rate = conf_.scenario.cross_traffic; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/scenario_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/scenario_unittest.cc >index 305f8660ccd744e9e7f563d8f33e2bfdc24c8ce3..97285d391459f2cfc8c3afc665b381c8151f7cc1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/scenario_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/scenario_unittest.cc >@@ -21,14 +21,19 @@ TEST(ScenarioTest, StartsAndStopsWithoutErrors) { > NetworkNodeConfig network_config; > auto alice_net = s.CreateSimulationNode(network_config); > auto bob_net = s.CreateSimulationNode(network_config); >+ auto route = s.CreateRoutes(alice, {alice_net}, bob, {bob_net}); > > VideoStreamConfig video_stream_config; >- s.CreateVideoStream(alice, {alice_net}, bob, {bob_net}, video_stream_config); >- s.CreateVideoStream(bob, {bob_net}, alice, {alice_net}, video_stream_config); >+ s.CreateVideoStream(route->forward(), video_stream_config); >+ s.CreateVideoStream(route->reverse(), 
video_stream_config); > > AudioStreamConfig audio_stream_config; >- s.CreateAudioStream(alice, {alice_net}, bob, {bob_net}, audio_stream_config); >- s.CreateAudioStream(bob, {bob_net}, alice, {alice_net}, audio_stream_config); >+ audio_stream_config.encoder.min_rate = DataRate::kbps(6); >+ audio_stream_config.encoder.max_rate = DataRate::kbps(64); >+ audio_stream_config.encoder.allocate_bitrate = true; >+ audio_stream_config.stream.in_bandwidth_estimation = false; >+ s.CreateAudioStream(route->forward(), audio_stream_config); >+ s.CreateAudioStream(route->reverse(), audio_stream_config); > > CrossTrafficConfig cross_traffic_config; > s.CreateCrossTraffic({alice_net}, cross_traffic_config); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/simulated_time.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/simulated_time.cc >index f29a82c455029f239614f3c1eaa42dcd16f9b313..a5977e7d965d236e9954e3801de4e1451dc9813f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/simulated_time.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/simulated_time.cc >@@ -144,7 +144,7 @@ TransportPacketsFeedback SimulatedSender::PullFeedbackReport( > } > } > >- data_in_flight_ -= feedback.sent_packet->size; >+ data_in_flight_ -= feedback.sent_packet.size; > report.packet_feedbacks.push_back(feedback); > } > } >@@ -252,11 +252,12 @@ SimulatedTimeClient::SimulatedTimeClient( > return_link_(return_link), > sender_(send_link.front(), send_receiver_id), > feedback_(config, return_receiver_id, return_link.front()) { >+ current_contraints_.at_time = at_time; >+ current_contraints_.starting_rate = config.transport.rates.start_rate; >+ current_contraints_.min_data_rate = config.transport.rates.min_rate; >+ current_contraints_.max_data_rate = config.transport.rates.max_rate; > NetworkControllerConfig initial_config; >- initial_config.constraints.at_time = at_time; >- initial_config.constraints.starting_rate = 
config.transport.rates.start_rate; >- initial_config.constraints.min_data_rate = config.transport.rates.min_rate; >- initial_config.constraints.max_data_rate = config.transport.rates.max_rate; >+ initial_config.constraints = current_contraints_; > congestion_controller_ = network_controller_factory_.Create(initial_config); > for (auto& stream_config : stream_configs) > packet_streams_.emplace_back(new PacketStream(stream_config)); >@@ -283,9 +284,9 @@ bool SimulatedTimeClient::TryDeliverPacket(rtc::CopyOnWriteBuffer raw_buffer, > for (PacketResult& feedback : report.packet_feedbacks) { > if (packet_log_) > fprintf(packet_log_, "%" PRId64 " %" PRId64 " %.3lf %.3lf %.3lf\n", >- feedback.sent_packet->sequence_number, >- feedback.sent_packet->size.bytes(), >- feedback.sent_packet->send_time.seconds<double>(), >+ feedback.sent_packet.sequence_number, >+ feedback.sent_packet.size.bytes(), >+ feedback.sent_packet.send_time.seconds<double>(), > feedback.receive_time.seconds<double>(), > at_time.seconds<double>()); > } >@@ -307,6 +308,7 @@ void SimulatedTimeClient::Update(NetworkControlUpdate update) { > DataRate rate_per_stream = > update.target_rate->target_rate * ratio_per_stream; > target_rate_ = update.target_rate->target_rate; >+ link_capacity_ = update.target_rate->network_estimate.bandwidth; > for (auto& stream : packet_streams_) > stream->OnTargetRateUpdate(rate_per_stream); > } >@@ -336,10 +338,22 @@ void SimulatedTimeClient::ProcessFrames(Timestamp at_time) { > } > } > >+void SimulatedTimeClient::TriggerFakeReroute(Timestamp at_time) { >+ NetworkRouteChange msg; >+ msg.at_time = at_time; >+ msg.constraints = current_contraints_; >+ msg.constraints.at_time = at_time; >+ Update(congestion_controller_->OnNetworkRouteChange(msg)); >+} >+ > TimeDelta SimulatedTimeClient::GetNetworkControllerProcessInterval() const { > return network_controller_factory_.GetProcessInterval(); > } > >+DataRate SimulatedTimeClient::link_capacity() const { >+ return link_capacity_; >+} >+ 
> double SimulatedTimeClient::target_rate_kbps() const { > return target_rate_.kbps<double>(); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/simulated_time.h b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/simulated_time.h >index b9c6de124d41f5d53abe92819b5ea65dd412bd4b..79fdf845736d26a442f17e1b62c680289e30d58a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/simulated_time.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/simulated_time.h >@@ -129,8 +129,10 @@ class SimulatedTimeClient : NetworkReceiverInterface { > void CongestionProcess(Timestamp at_time); > void PacerProcess(Timestamp at_time); > void ProcessFrames(Timestamp at_time); >+ void TriggerFakeReroute(Timestamp at_time); > TimeDelta GetNetworkControllerProcessInterval() const; > double target_rate_kbps() const; >+ DataRate link_capacity() const; > > bool TryDeliverPacket(rtc::CopyOnWriteBuffer packet, > uint64_t receiver, >@@ -144,7 +146,9 @@ class SimulatedTimeClient : NetworkReceiverInterface { > std::vector<NetworkNode*> return_link_; > SimulatedSender sender_; > SimulatedFeedback feedback_; >+ TargetRateConstraints current_contraints_; > DataRate target_rate_ = DataRate::Infinity(); >+ DataRate link_capacity_ = DataRate::Infinity(); > FILE* packet_log_ = nullptr; > > std::vector<std::unique_ptr<PacketStream>> packet_streams_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/video_stream.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/video_stream.cc >index b85034ffb20223302b69232358aeb3597bdfd03c..36399234adcaf98d2ab6c46818614b3f5dd6c1cc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/video_stream.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/video_stream.cc >@@ -12,13 +12,14 @@ > #include <algorithm> > #include <utility> > >+#include "api/test/video/function_video_encoder_factory.h" >+#include "api/video/builtin_video_bitrate_allocator_factory.h" > 
#include "media/base/mediaconstants.h" > #include "media/engine/internaldecoderfactory.h" > #include "media/engine/internalencoderfactory.h" > #include "media/engine/webrtcvideoengine.h" > #include "test/call_test.h" > #include "test/fake_encoder.h" >-#include "test/function_video_encoder_factory.h" > #include "test/scenario/hardware_codecs.h" > #include "test/testsupport/fileutils.h" > >@@ -131,11 +132,20 @@ VideoEncoderConfig CreateVideoEncoderConfig(VideoStreamConfig config) { > size_t num_streams = config.encoder.num_simulcast_streams; > VideoEncoderConfig encoder_config; > encoder_config.codec_type = config.encoder.codec; >- encoder_config.content_type = VideoEncoderConfig::ContentType::kRealtimeVideo; >+ switch (config.source.content_type) { >+ case VideoStreamConfig::Source::ContentType::kVideo: >+ encoder_config.content_type = >+ VideoEncoderConfig::ContentType::kRealtimeVideo; >+ break; >+ case VideoStreamConfig::Source::ContentType::kScreen: >+ encoder_config.content_type = VideoEncoderConfig::ContentType::kScreen; >+ break; >+ } > encoder_config.video_format = > SdpVideoFormat(CodecTypeToPayloadString(config.encoder.codec), {}); > encoder_config.number_of_streams = num_streams; > encoder_config.simulcast_layers = std::vector<VideoStream>(num_streams); >+ encoder_config.min_transmit_bitrate_bps = config.stream.pad_to_rate.bps(); > > std::string cricket_codec = CodecTypeToCodecName(config.encoder.codec); > if (!cricket_codec.empty()) { >@@ -153,6 +163,12 @@ VideoEncoderConfig CreateVideoEncoderConfig(VideoStreamConfig config) { > } > encoder_config.encoder_specific_settings = > CreateEncoderSpecificSettings(config); >+ if (config.encoder.max_framerate) { >+ for (auto& layer : encoder_config.simulcast_layers) { >+ layer.max_framerate = *config.encoder.max_framerate; >+ } >+ } >+ > return encoder_config; > } > } // namespace >@@ -193,11 +209,13 @@ SendVideoStream::SendVideoStream(CallClient* sender, > case Encoder::Implementation::kFake: > if 
(config.encoder.codec == Codec::kVideoCodecGeneric) { > encoder_factory_ = >- absl::make_unique<FunctionVideoEncoderFactory>([this, config]() { >+ absl::make_unique<FunctionVideoEncoderFactory>([this]() { >+ rtc::CritScope cs(&crit_); > auto encoder = > absl::make_unique<test::FakeEncoder>(sender_->clock_); >- if (config.encoder.fake.max_rate.IsFinite()) >- encoder->SetMaxBitrate(config.encoder.fake.max_rate.kbps()); >+ fake_encoders_.push_back(encoder.get()); >+ if (config_.encoder.fake.max_rate.IsFinite()) >+ encoder->SetMaxBitrate(config_.encoder.fake.max_rate.kbps()); > return encoder; > }); > } else { >@@ -213,9 +231,15 @@ SendVideoStream::SendVideoStream(CallClient* sender, > } > RTC_CHECK(encoder_factory_); > >+ bitrate_allocator_factory_ = CreateBuiltinVideoBitrateAllocatorFactory(); >+ RTC_CHECK(bitrate_allocator_factory_); >+ > VideoSendStream::Config send_config = > CreateVideoSendStreamConfig(config, ssrcs_, send_transport); > send_config.encoder_settings.encoder_factory = encoder_factory_.get(); >+ send_config.encoder_settings.bitrate_allocator_factory = >+ bitrate_allocator_factory_.get(); >+ > VideoEncoderConfig encoder_config = CreateVideoEncoderConfig(config); > > send_stream_ = sender_->call_->CreateVideoSendStream( >@@ -232,29 +256,34 @@ SendVideoStream::~SendVideoStream() { > void SendVideoStream::Start() { > send_stream_->Start(); > video_capturer_->Start(); >+ sender_->call_->SignalChannelNetworkState(MediaType::VIDEO, kNetworkUp); > } > >-bool SendVideoStream::TryDeliverPacket(rtc::CopyOnWriteBuffer packet, >- uint64_t receiver, >- Timestamp at_time) { >- // Removes added overhead before delivering RTCP packet to sender. 
>- RTC_DCHECK_GE(packet.size(), config_.stream.packet_overhead.bytes()); >- packet.SetSize(packet.size() - config_.stream.packet_overhead.bytes()); >- sender_->DeliverPacket(MediaType::VIDEO, packet, at_time); >- return true; >+void SendVideoStream::UpdateConfig( >+ std::function<void(VideoStreamConfig*)> modifier) { >+ rtc::CritScope cs(&crit_); >+ VideoStreamConfig prior_config = config_; >+ modifier(&config_); >+ if (prior_config.encoder.fake.max_rate != config_.encoder.fake.max_rate) { >+ for (auto* encoder : fake_encoders_) { >+ encoder->SetMaxBitrate(config_.encoder.fake.max_rate.kbps()); >+ } >+ } >+ // TODO(srte): Add more conditions that should cause reconfiguration. >+ if (prior_config.encoder.max_framerate != config_.encoder.max_framerate) { >+ VideoEncoderConfig encoder_config = CreateVideoEncoderConfig(config_); >+ send_stream_->ReconfigureVideoEncoder(std::move(encoder_config)); >+ } >+ if (prior_config.source.framerate != config_.source.framerate) { >+ SetCaptureFramerate(config_.source.framerate); >+ } > } > > void SendVideoStream::SetCaptureFramerate(int framerate) { > RTC_CHECK(frame_generator_) > << "Framerate change only implemented for generators"; > frame_generator_->ChangeFramerate(framerate); >-} > >-void SendVideoStream::SetMaxFramerate(absl::optional<int> max_framerate) { >- VideoEncoderConfig encoder_config = CreateVideoEncoderConfig(config_); >- RTC_DCHECK_EQ(encoder_config.simulcast_layers.size(), 1); >- encoder_config.simulcast_layers[0].max_framerate = max_framerate.value_or(-1); >- send_stream_->ReconfigureVideoEncoder(std::move(encoder_config)); > } > > VideoSendStream::Stats SendVideoStream::GetStats() const { >@@ -284,15 +313,14 @@ ReceiveVideoStream::ReceiveVideoStream(CallClient* receiver, > SendVideoStream* send_stream, > size_t chosen_stream, > Transport* feedback_transport) >- : receiver_(receiver), >- config_(config), >- decoder_factory_(absl::make_unique<InternalDecoderFactory>()) { >+ : receiver_(receiver), config_(config) 
{ > renderer_ = absl::make_unique<FakeVideoRenderer>(); > VideoReceiveStream::Config recv_config(feedback_transport); > recv_config.rtp.remb = !config.stream.packet_feedback; > recv_config.rtp.transport_cc = config.stream.packet_feedback; > recv_config.rtp.local_ssrc = CallTest::kReceiverLocalVideoSsrc; > recv_config.rtp.extensions = GetVideoRtpExtensions(config); >+ receiver_->AddExtensions(recv_config.rtp.extensions); > RTC_DCHECK(!config.stream.use_rtx || > config.stream.nack_history_time > TimeDelta::Zero()); > recv_config.rtp.nack.rtp_history_ms = config.stream.nack_history_time.ms(); >@@ -300,14 +328,24 @@ ReceiveVideoStream::ReceiveVideoStream(CallClient* receiver, > recv_config.renderer = renderer_.get(); > if (config.stream.use_rtx) { > recv_config.rtp.rtx_ssrc = send_stream->rtx_ssrcs_[chosen_stream]; >+ receiver->ssrc_media_types_[recv_config.rtp.rtx_ssrc] = MediaType::VIDEO; > recv_config.rtp > .rtx_associated_payload_types[CallTest::kSendRtxPayloadType] = > CodecTypeToPayloadType(config.encoder.codec); > } > recv_config.rtp.remote_ssrc = send_stream->ssrcs_[chosen_stream]; >+ receiver->ssrc_media_types_[recv_config.rtp.remote_ssrc] = MediaType::VIDEO; >+ > VideoReceiveStream::Decoder decoder = > CreateMatchingDecoder(CodecTypeToPayloadType(config.encoder.codec), > CodecTypeToPayloadString(config.encoder.codec)); >+ if (config.encoder.codec == >+ VideoStreamConfig::Encoder::Codec::kVideoCodecGeneric) { >+ decoder_factory_ = absl::make_unique<FunctionVideoDecoderFactory>( >+ []() { return absl::make_unique<FakeDecoder>(); }); >+ } else { >+ decoder_factory_ = absl::make_unique<InternalDecoderFactory>(); >+ } > decoder.decoder_factory = decoder_factory_.get(); > recv_config.decoders.push_back(decoder); > >@@ -316,6 +354,7 @@ ReceiveVideoStream::ReceiveVideoStream(CallClient* receiver, > FlexfecReceiveStream::Config flexfec_config(feedback_transport); > flexfec_config.payload_type = CallTest::kFlexfecPayloadType; > flexfec_config.remote_ssrc = 
CallTest::kFlexfecSendSsrc; >+ receiver->ssrc_media_types_[flexfec_config.remote_ssrc] = MediaType::VIDEO; > flexfec_config.protected_media_ssrcs = send_stream->rtx_ssrcs_; > flexfec_config.local_ssrc = recv_config.rtp.local_ssrc; > flecfec_stream_ = >@@ -337,46 +376,23 @@ ReceiveVideoStream::~ReceiveVideoStream() { > receiver_->call_->DestroyFlexfecReceiveStream(flecfec_stream_); > } > >-bool ReceiveVideoStream::TryDeliverPacket(rtc::CopyOnWriteBuffer packet, >- uint64_t receiver, >- Timestamp at_time) { >- RTC_DCHECK_GE(packet.size(), config_.stream.packet_overhead.bytes()); >- packet.SetSize(packet.size() - config_.stream.packet_overhead.bytes()); >- receiver_->DeliverPacket(MediaType::VIDEO, packet, at_time); >- return true; >+void ReceiveVideoStream::Start() { >+ receive_stream_->Start(); >+ receiver_->call_->SignalChannelNetworkState(MediaType::VIDEO, kNetworkUp); > } > > VideoStreamPair::~VideoStreamPair() = default; > > VideoStreamPair::VideoStreamPair(CallClient* sender, >- std::vector<NetworkNode*> send_link, >- uint64_t send_receiver_id, > CallClient* receiver, >- std::vector<NetworkNode*> return_link, >- uint64_t return_receiver_id, > VideoStreamConfig config) > : config_(config), >- send_link_(send_link), >- return_link_(return_link), >- send_transport_(sender, >- send_link.front(), >- send_receiver_id, >- config.stream.packet_overhead), >- return_transport_(receiver, >- return_link.front(), >- return_receiver_id, >- config.stream.packet_overhead), >- send_stream_(sender, config, &send_transport_), >+ send_stream_(sender, config, &sender->transport_), > receive_stream_(receiver, > config, > &send_stream_, > /*chosen_stream=*/0, >- &return_transport_) { >- NetworkNode::Route(send_transport_.ReceiverId(), send_link_, >- &receive_stream_); >- NetworkNode::Route(return_transport_.ReceiverId(), return_link_, >- &send_stream_); >-} >+ &receiver->transport_) {} > > } // namespace test > } // namespace webrtc >diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/video_stream.h b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/video_stream.h >index e6696183a7ebd503cfee0527e78cac048ec9b601..ebf9c951b706923596518058068dc64bebc3bd1a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/video_stream.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/scenario/video_stream.h >@@ -14,6 +14,7 @@ > #include <vector> > > #include "rtc_base/constructormagic.h" >+#include "test/fake_encoder.h" > #include "test/frame_generator_capturer.h" > #include "test/scenario/call_client.h" > #include "test/scenario/column_printer.h" >@@ -25,15 +26,15 @@ namespace webrtc { > namespace test { > // SendVideoStream provides an interface for changing parameters and retrieving > // states at run time. >-class SendVideoStream : public NetworkReceiverInterface { >+class SendVideoStream { > public: > RTC_DISALLOW_COPY_AND_ASSIGN(SendVideoStream); > ~SendVideoStream(); > void SetCaptureFramerate(int framerate); >- void SetMaxFramerate(absl::optional<int> max_framerate); > VideoSendStream::Stats GetStats() const; > ColumnPrinter StatsPrinter(); > void Start(); >+ void UpdateConfig(std::function<void(VideoStreamConfig*)> modifier); > > private: > friend class Scenario; >@@ -43,25 +44,28 @@ class SendVideoStream : public NetworkReceiverInterface { > SendVideoStream(CallClient* sender, > VideoStreamConfig config, > Transport* send_transport); >- bool TryDeliverPacket(rtc::CopyOnWriteBuffer packet, >- uint64_t receiver, >- Timestamp at_time) override; > >+ rtc::CriticalSection crit_; > std::vector<uint32_t> ssrcs_; > std::vector<uint32_t> rtx_ssrcs_; > VideoSendStream* send_stream_ = nullptr; > CallClient* const sender_; >- const VideoStreamConfig config_; >+ VideoStreamConfig config_ RTC_GUARDED_BY(crit_); > std::unique_ptr<VideoEncoderFactory> encoder_factory_; >+ std::vector<test::FakeEncoder*> fake_encoders_ RTC_GUARDED_BY(crit_); >+ 
std::unique_ptr<VideoBitrateAllocatorFactory> bitrate_allocator_factory_; > std::unique_ptr<TestVideoCapturer> video_capturer_; > FrameGeneratorCapturer* frame_generator_ = nullptr; >+ int next_local_network_id_ = 0; >+ int next_remote_network_id_ = 0; > }; > > // ReceiveVideoStream represents a video receiver. It can't be used directly. >-class ReceiveVideoStream : public NetworkReceiverInterface { >+class ReceiveVideoStream { > public: > RTC_DISALLOW_COPY_AND_ASSIGN(ReceiveVideoStream); > ~ReceiveVideoStream(); >+ void Start(); > > private: > friend class Scenario; >@@ -71,9 +75,7 @@ class ReceiveVideoStream : public NetworkReceiverInterface { > SendVideoStream* send_stream, > size_t chosen_stream, > Transport* feedback_transport); >- bool TryDeliverPacket(rtc::CopyOnWriteBuffer packet, >- uint64_t receiver, >- Timestamp at_time) override; >+ > VideoReceiveStream* receive_stream_ = nullptr; > FlexfecReceiveStream* flecfec_stream_ = nullptr; > std::unique_ptr<rtc::VideoSinkInterface<webrtc::VideoFrame>> renderer_; >@@ -95,18 +97,10 @@ class VideoStreamPair { > private: > friend class Scenario; > VideoStreamPair(CallClient* sender, >- std::vector<NetworkNode*> send_link, >- uint64_t send_receiver_id, > CallClient* receiver, >- std::vector<NetworkNode*> return_link, >- uint64_t return_receiver_id, > VideoStreamConfig config); > > const VideoStreamConfig config_; >- std::vector<NetworkNode*> send_link_; >- std::vector<NetworkNode*> return_link_; >- NetworkNodeTransport send_transport_; >- NetworkNodeTransport return_transport_; > > SendVideoStream send_stream_; > ReceiveVideoStream receive_stream_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/single_threaded_task_queue.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/single_threaded_task_queue.cc >index ad7a50327b900bf250ab231381437ef6c52a9288..5c35c2e43b0939f916c27843141684dc2985e434 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/single_threaded_task_queue.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/test/single_threaded_task_queue.cc >@@ -32,10 +32,7 @@ SingleThreadedTaskQueueForTesting::QueuedTask::~QueuedTask() = default; > > SingleThreadedTaskQueueForTesting::SingleThreadedTaskQueueForTesting( > const char* name) >- : thread_(Run, this, name), >- running_(true), >- next_task_id_(0), >- wake_up_(false, false) { >+ : thread_(Run, this, name), running_(true), next_task_id_(0) { > thread_.Start(); > } > >@@ -83,7 +80,7 @@ SingleThreadedTaskQueueForTesting::PostDelayedTask(Task task, > } > > void SingleThreadedTaskQueueForTesting::SendTask(Task task) { >- rtc::Event done(true, false); >+ rtc::Event done; > PostTask([&task, &done]() { > task(); > done.Set(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/single_threaded_task_queue_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/single_threaded_task_queue_unittest.cc >index 084f0d7e8cbb8212cfe2716532066bc95cd310c3..04961650c760e14ab2cb407868bfa2edeae036b3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/single_threaded_task_queue_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/single_threaded_task_queue_unittest.cc >@@ -37,7 +37,7 @@ TEST(SingleThreadedTaskQueueForTestingTest, ExecutesPostedTasks) { > SingleThreadedTaskQueueForTesting task_queue("task_queue"); > > std::atomic<bool> executed(false); >- rtc::Event done(true, false); >+ rtc::Event done; > > task_queue.PostTask([&executed, &done]() { > executed.store(true); >@@ -60,14 +60,14 @@ TEST(SingleThreadedTaskQueueForTestingTest, > > std::vector<std::unique_ptr<rtc::Event>> done_events; > for (size_t i = 0; i < kCount; i++) { >- done_events.emplace_back(absl::make_unique<rtc::Event>(false, false)); >+ done_events.emplace_back(absl::make_unique<rtc::Event>()); > } > > // To avoid the tasks which comprise the actual test from running before they > // have all be posted, which could result in only one task ever being in the > // queue at any given time, post one 
waiting task that would block the > // task-queue, and unblock only after all tasks have been posted. >- rtc::Event rendezvous(true, false); >+ rtc::Event rendezvous; > task_queue.PostTask( > [&rendezvous]() { ASSERT_TRUE(rendezvous.Wait(kMaxWaitTimeMs)); }); > >@@ -95,7 +95,7 @@ TEST(SingleThreadedTaskQueueForTestingTest, PostToTaskQueueFromOwnThread) { > SingleThreadedTaskQueueForTesting task_queue("task_queue"); > > std::atomic<bool> executed(false); >- rtc::Event done(true, false); >+ rtc::Event done; > > auto internally_posted_task = [&executed, &done]() { > executed.store(true); >@@ -124,7 +124,7 @@ TEST(SingleThreadedTaskQueueForTestingTest, TasksExecutedInSequence) { > > // Prevent the chain from being set in motion before we've had time to > // schedule it all, lest the queue only contain one task at a time. >- rtc::Event rendezvous(true, false); >+ rtc::Event rendezvous; > task_queue.PostTask( > [&rendezvous]() { ASSERT_TRUE(rendezvous.Wait(kMaxWaitTimeMs)); }); > >@@ -136,7 +136,7 @@ TEST(SingleThreadedTaskQueueForTestingTest, TasksExecutedInSequence) { > } > > // The test will wait for the task-queue to finish. >- rtc::Event done(true, false); >+ rtc::Event done; > task_queue.PostTask([&done]() { done.Set(); }); > > rendezvous.Set(); // Set the chain in motion. >@@ -150,7 +150,7 @@ TEST(SingleThreadedTaskQueueForTestingTest, ExecutesPostedDelayedTask) { > SingleThreadedTaskQueueForTesting task_queue("task_queue"); > > std::atomic<bool> executed(false); >- rtc::Event done(true, false); >+ rtc::Event done; > > constexpr int64_t delay_ms = 20; > static_assert(delay_ms < kMaxWaitTimeMs / 2, "Delay too long for tests."); >@@ -177,7 +177,7 @@ TEST(SingleThreadedTaskQueueForTestingTest, DoesNotExecuteDelayedTaskTooSoon) { > task_queue.PostDelayedTask([&executed]() { executed.store(true); }, delay_ms); > > // Wait less than is enough, make sure the task was not yet executed. 
>- rtc::Event not_done(true, false); >+ rtc::Event not_done; > ASSERT_FALSE(not_done.Wait(delay_ms / 2)); > EXPECT_FALSE(executed.load()); > } >@@ -195,7 +195,7 @@ TEST(SingleThreadedTaskQueueForTestingTest, > static_assert(earlier_delay_ms + later_delay_ms < kMaxWaitTimeMs / 2, > "Delay too long for tests."); > >- rtc::Event done(true, false); >+ rtc::Event done; > > auto earlier_task = [&earlier_executed, &later_executed]() { > EXPECT_FALSE(later_executed.load()); >@@ -229,7 +229,7 @@ TEST(SingleThreadedTaskQueueForTestingTest, > static_assert(earlier_delay_ms + later_delay_ms < kMaxWaitTimeMs / 2, > "Delay too long for tests."); > >- rtc::Event done(true, false); >+ rtc::Event done; > > auto earlier_task = [&earlier_executed, &later_executed]() { > EXPECT_FALSE(later_executed.load()); >@@ -253,11 +253,11 @@ TEST(SingleThreadedTaskQueueForTestingTest, > TEST(SingleThreadedTaskQueueForTestingTest, ExternalThreadCancelsTask) { > SingleThreadedTaskQueueForTesting task_queue("task_queue"); > >- rtc::Event done(true, false); >+ rtc::Event done; > > // Prevent the to-be-cancelled task from being executed before we've had > // time to cancel it. >- rtc::Event rendezvous(true, false); >+ rtc::Event rendezvous; > task_queue.PostTask( > [&rendezvous]() { ASSERT_TRUE(rendezvous.Wait(kMaxWaitTimeMs)); }); > >@@ -279,10 +279,10 @@ TEST(SingleThreadedTaskQueueForTestingTest, ExternalThreadCancelsTask) { > TEST(SingleThreadedTaskQueueForTestingTest, InternalThreadCancelsTask) { > SingleThreadedTaskQueueForTesting task_queue("task_queue"); > >- rtc::Event done(true, false); >+ rtc::Event done; > > // Prevent the chain from being set-off before we've set everything up. 
>- rtc::Event rendezvous(true, false); >+ rtc::Event rendezvous; > task_queue.PostTask( > [&rendezvous]() { ASSERT_TRUE(rendezvous.Wait(kMaxWaitTimeMs)); }); > >@@ -316,7 +316,7 @@ TEST(SingleThreadedTaskQueueForTestingTest, SendTask) { > task_queue.SendTask([&executed]() { > // Intentionally delay, so that if SendTask didn't block, the sender thread > // would have time to read |executed|. >- rtc::Event delay(true, false); >+ rtc::Event delay; > ASSERT_FALSE(delay.Wait(1000)); > executed.store(true); > }); >@@ -335,7 +335,7 @@ TEST(SingleThreadedTaskQueueForTestingTest, > for (size_t i = 0; i < tasks; i++) { > task_queue->PostTask([&counter]() { > std::atomic_fetch_add(&counter, static_cast<size_t>(1)); >- rtc::Event delay(true, false); >+ rtc::Event delay; > ASSERT_FALSE(delay.Wait(500)); > }); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/test_main.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/test_main.cc >index c8d0b1b584e6a75c95f076a6df75a7c309e065cf..1a311f63c279cc9b8829decfa9b5ccd42901af90 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/test_main.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/test_main.cc >@@ -8,129 +8,15 @@ > * be found in the AUTHORS file in the root of the source tree. 
> */ > >-#include <fstream> >+#include <memory> > >-#include "rtc_base/flags.h" >-#include "rtc_base/logging.h" >-#include "rtc_base/thread.h" >-#include "system_wrappers/include/field_trial.h" >-#include "system_wrappers/include/metrics.h" >-#include "test/field_trial.h" >-#include "test/gmock.h" >-#include "test/gtest.h" >-#include "test/testsupport/fileutils.h" >-#include "test/testsupport/perf_test.h" >- >-#if defined(WEBRTC_WIN) >-#include "rtc_base/win32socketinit.h" >-#endif >- >-#if defined(WEBRTC_IOS) >-#include "test/ios/test_support.h" >- >-DEFINE_string(NSTreatUnknownArgumentsAsOpen, >- "", >- "Intentionally ignored flag intended for iOS simulator."); >-DEFINE_string(ApplePersistenceIgnoreState, >- "", >- "Intentionally ignored flag intended for iOS simulator."); >-DEFINE_bool( >- save_chartjson_result, >- false, >- "Store the perf results in Documents/perf_result.json in the format " >- "described by " >- "https://github.com/catapult-project/catapult/blob/master/dashboard/docs/" >- "data-format.md."); >- >-#else >- >-DEFINE_string( >- isolated_script_test_output, >- "", >- "Path to output an empty JSON file which Chromium infra requires."); >- >-DEFINE_string( >- isolated_script_test_perf_output, >- "", >- "Path where the perf results should be stored in the JSON format described " >- "by " >- "https://github.com/catapult-project/catapult/blob/master/dashboard/docs/" >- "data-format.md."); >- >-#endif >- >-DEFINE_bool(logs, false, "print logs to stderr"); >- >-DEFINE_string( >- force_fieldtrials, >- "", >- "Field trials control experimental feature code which can be forced. " >- "E.g. 
running with --force_fieldtrials=WebRTC-FooFeature/Enable/" >- " will assign the group Enable to field trial WebRTC-FooFeature."); >- >-DEFINE_bool(help, false, "Print this message."); >+#include "test/test_main_lib.h" > > int main(int argc, char* argv[]) { >- ::testing::InitGoogleMock(&argc, argv); >- >- // Default to LS_INFO, even for release builds to provide better test logging. >- // TODO(pbos): Consider adding a command-line override. >- if (rtc::LogMessage::GetLogToDebug() > rtc::LS_INFO) >- rtc::LogMessage::LogToDebug(rtc::LS_INFO); >- >- if (rtc::FlagList::SetFlagsFromCommandLine(&argc, argv, false)) { >- return 1; >- } >- if (FLAG_help) { >- rtc::FlagList::Print(nullptr, false); >- return 0; >+ std::unique_ptr<webrtc::TestMain> main = webrtc::TestMain::Create(); >+ int err_code = main->Init(&argc, argv); >+ if (err_code != 0) { >+ return err_code; > } >- >- webrtc::test::SetExecutablePath(argv[0]); >- webrtc::test::ValidateFieldTrialsStringOrDie(FLAG_force_fieldtrials); >- // InitFieldTrialsFromString stores the char*, so the char array must outlive >- // the application. >- webrtc::field_trial::InitFieldTrialsFromString(FLAG_force_fieldtrials); >- webrtc::metrics::Enable(); >- >-#if defined(WEBRTC_WIN) >- rtc::WinsockInitializer winsock_init; >-#endif >- >- rtc::LogMessage::SetLogToStderr(FLAG_logs); >- >- // Ensure that main thread gets wrapped as an rtc::Thread. >- // TODO(bugs.webrt.org/9714): It might be better to avoid wrapping the main >- // thread, or leave it to individual tests that need it. But as long as we >- // have automatic thread wrapping, we need this to avoid that some other >- // random thread (which one depending on which tests are run) gets >- // automatically wrapped. 
>- rtc::ThreadManager::Instance()->WrapCurrentThread(); >- RTC_CHECK(rtc::Thread::Current()); >- >-#if defined(WEBRTC_IOS) >- >- rtc::test::InitTestSuite(RUN_ALL_TESTS, argc, argv, >- FLAG_save_chartjson_result); >- rtc::test::RunTestsFromIOSApp(); >- >-#else >- >- int exit_code = RUN_ALL_TESTS(); >- >- std::string chartjson_result_file = FLAG_isolated_script_test_perf_output; >- if (!chartjson_result_file.empty()) { >- webrtc::test::WritePerfResults(chartjson_result_file); >- } >- >- std::string result_filename = FLAG_isolated_script_test_output; >- if (!result_filename.empty()) { >- std::ofstream result_file(result_filename); >- result_file << "{\"version\": 3}"; >- result_file.close(); >- } >- >- return exit_code; >- >-#endif >+ return main->Run(argc, argv); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/test_main_lib.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/test_main_lib.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..95144c98fcb9347a07d64fe7117895f6135abe85 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/test_main_lib.cc >@@ -0,0 +1,177 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "test/test_main_lib.h" >+ >+#include <fstream> >+#include <string> >+ >+#include "absl/memory/memory.h" >+#include "rtc_base/flags.h" >+#include "rtc_base/logging.h" >+#include "rtc_base/thread.h" >+#include "system_wrappers/include/field_trial.h" >+#include "system_wrappers/include/metrics.h" >+#include "test/field_trial.h" >+#include "test/gmock.h" >+#include "test/gtest.h" >+#include "test/testsupport/fileutils.h" >+#include "test/testsupport/perf_test.h" >+ >+#if defined(WEBRTC_WIN) >+#include "rtc_base/win32socketinit.h" >+#endif >+ >+#if defined(WEBRTC_IOS) >+#include "test/ios/test_support.h" >+ >+WEBRTC_DEFINE_string(NSTreatUnknownArgumentsAsOpen, >+ "", >+ "Intentionally ignored flag intended for iOS simulator."); >+WEBRTC_DEFINE_string(ApplePersistenceIgnoreState, >+ "", >+ "Intentionally ignored flag intended for iOS simulator."); >+WEBRTC_DEFINE_bool( >+ save_chartjson_result, >+ false, >+ "Store the perf results in Documents/perf_result.json in the format " >+ "described by " >+ "https://github.com/catapult-project/catapult/blob/master/dashboard/docs/" >+ "data-format.md."); >+ >+#else >+ >+WEBRTC_DEFINE_string( >+ isolated_script_test_output, >+ "", >+ "Path to output an empty JSON file which Chromium infra requires."); >+ >+WEBRTC_DEFINE_string( >+ isolated_script_test_perf_output, >+ "", >+ "Path where the perf results should be stored in the JSON format described " >+ "by " >+ "https://github.com/catapult-project/catapult/blob/master/dashboard/docs/" >+ "data-format.md."); >+ >+#endif >+ >+WEBRTC_DEFINE_bool(logs, false, "print logs to stderr"); >+ >+WEBRTC_DEFINE_string( >+ force_fieldtrials, >+ "", >+ "Field trials control experimental feature code which can be forced. " >+ "E.g. 
running with --force_fieldtrials=WebRTC-FooFeature/Enable/" >+ " will assign the group Enable to field trial WebRTC-FooFeature."); >+ >+WEBRTC_DEFINE_bool(help, false, "Print this message."); >+ >+namespace webrtc { >+ >+namespace { >+ >+class TestMainImpl : public TestMain { >+ public: >+ int Init(int* argc, char* argv[]) override { >+ ::testing::InitGoogleMock(argc, argv); >+ >+ // Default to LS_INFO, even for release builds to provide better test >+ // logging. >+ // TODO(pbos): Consider adding a command-line override. >+ if (rtc::LogMessage::GetLogToDebug() > rtc::LS_INFO) >+ rtc::LogMessage::LogToDebug(rtc::LS_INFO); >+ >+ if (rtc::FlagList::SetFlagsFromCommandLine(argc, argv, false)) { >+ return 1; >+ } >+ if (FLAG_help) { >+ rtc::FlagList::Print(nullptr, false); >+ return 0; >+ } >+ >+ // TODO(bugs.webrtc.org/9792): we need to reference something from >+ // fileutils.h so that our downstream hack where we replace fileutils.cc >+ // works. Otherwise the downstream flag implementation will take over and >+ // botch the flag introduced by the hack. Remove this awful thing once the >+ // downstream implementation has been eliminated. >+ (void)webrtc::test::JoinFilename("horrible", "hack"); >+ >+ webrtc::test::ValidateFieldTrialsStringOrDie(FLAG_force_fieldtrials); >+ // InitFieldTrialsFromString stores the char*, so the char array must >+ // outlive the application. >+ webrtc::field_trial::InitFieldTrialsFromString(FLAG_force_fieldtrials); >+ webrtc::metrics::Enable(); >+ >+#if defined(WEBRTC_WIN) >+ winsock_init_ = absl::make_unique<rtc::WinsockInitializer>(); >+#endif >+ >+ rtc::LogMessage::SetLogToStderr(FLAG_logs); >+ >+ // Ensure that the main thread gets wrapped as an rtc::Thread. >+ // TODO(bugs.webrtc.org/9714): It might be better to avoid wrapping the main >+ // thread, or leave it to individual tests that need it.
But as long as we >+ // have automatic thread wrapping, we need this to avoid that some other >+ // random thread (which one depending on which tests are run) gets >+ // automatically wrapped. >+ rtc::ThreadManager::Instance()->WrapCurrentThread(); >+ RTC_CHECK(rtc::Thread::Current()); >+ return 0; >+ } >+ >+ int Run(int argc, char* argv[]) override { >+#if defined(WEBRTC_IOS) >+ rtc::test::InitTestSuite(RUN_ALL_TESTS, argc, argv, >+ FLAG_save_chartjson_result); >+ rtc::test::RunTestsFromIOSApp(); >+ return 0; >+#else >+ int exit_code = RUN_ALL_TESTS(); >+ >+ std::string chartjson_result_file = FLAG_isolated_script_test_perf_output; >+ if (!chartjson_result_file.empty()) { >+ webrtc::test::WritePerfResults(chartjson_result_file); >+ } >+ >+ std::string result_filename = FLAG_isolated_script_test_output; >+ if (!result_filename.empty()) { >+ std::ofstream result_file(result_filename); >+ result_file << "{\"version\": 3}"; >+ result_file.close(); >+ } >+ >+#if defined(ADDRESS_SANITIZER) || defined(LEAK_SANITIZER) || \ >+ defined(MEMORY_SANITIZER) || defined(THREAD_SANITIZER) || \ >+ defined(UNDEFINED_SANITIZER) >+ // We want the test flagged as failed only for sanitizer defects, >+ // in which case the sanitizer will override exit code with 66. 
>+ return 0; >+#endif >+ >+ return exit_code; >+#endif >+ } >+ >+ ~TestMainImpl() override = default; >+ >+ private: >+#if defined(WEBRTC_WIN) >+ std::unique_ptr<rtc::WinsockInitializer> winsock_init_; >+#endif >+}; >+ >+} // namespace >+ >+std::unique_ptr<TestMain> TestMain::Create() { >+ return absl::make_unique<TestMainImpl>(); >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/test_main_lib.h b/Source/ThirdParty/libwebrtc/Source/webrtc/test/test_main_lib.h >new file mode 100644 >index 0000000000000000000000000000000000000000..44dd764ae24d3dfb863f6ec134a4d693e8dd030f >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/test_main_lib.h >@@ -0,0 +1,38 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+#ifndef TEST_TEST_MAIN_LIB_H_ >+#define TEST_TEST_MAIN_LIB_H_ >+ >+#include <memory> >+ >+namespace webrtc { >+ >+// Class to initialize test environment and run tests. >+class TestMain { >+ public: >+ virtual ~TestMain() {} >+ >+ static std::unique_ptr<TestMain> Create(); >+ >+ // Initializes test environment. Clients can add their own initialization >+ // steps after call to this method and before running tests. >+ // Returns 0 if initialization was successful and non-zero otherwise. >+ virtual int Init(int* argc, char* argv[]) = 0; >+ >+ // Runs tests and returns the result error code. 0 - no errors.
>+ virtual int Run(int argc, char* argv[]) = 0; >+ >+ protected: >+ TestMain() = default; >+}; >+ >+} // namespace webrtc >+ >+#endif // TEST_TEST_MAIN_LIB_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/fileutils.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/fileutils.cc >index cb88472e425bf2afc6a898864c9309cbd4ebb535..a71d8f8db57d57ee94034114db4b68e0288e68ab 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/fileutils.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/fileutils.cc >@@ -59,35 +59,17 @@ > #include "rtc_base/arraysize.h" > #include "rtc_base/checks.h" > #include "rtc_base/stringutils.h" >+#include "test/testsupport/fileutils_override.h" > > namespace webrtc { > namespace test { > >-namespace { >- > #if defined(WEBRTC_WIN) > const char* kPathDelimiter = "\\"; > #else > const char* kPathDelimiter = "/"; > #endif > >-#if defined(WEBRTC_ANDROID) >-// This is a special case in Chrome infrastructure. See >-// base/test/test_support_android.cc. >-const char* kAndroidChromiumTestsRoot = "/sdcard/chromium_tests_root/"; >-#endif >- >-#if !defined(WEBRTC_IOS) >-const char* kResourcesDirName = "resources"; >-#endif >- >-char relative_dir_path[FILENAME_MAX]; >-bool relative_dir_path_set = false; >- >-} // namespace >- >-const char* kCannotFindProjectRootDir = "ERROR_CANNOT_FIND_PROJECT_ROOT_DIR"; >- > std::string DirName(const std::string& path) { > if (path.empty()) > return ""; >@@ -101,27 +83,6 @@ std::string DirName(const std::string& path) { > return result.substr(0, result.find_last_of(kPathDelimiter)); > } > >-void SetExecutablePath(const std::string& path) { >- std::string working_dir = WorkingDir(); >- std::string temp_path = path; >- >- // Handle absolute paths; convert them to relative paths to the working dir. 
>- if (path.find(working_dir) != std::string::npos) { >- temp_path = path.substr(working_dir.length() + 1); >- } >-// On Windows, when tests are run under memory tools like DrMemory and TSan, >-// slashes occur in the path as directory separators. Make sure we replace >-// such cases with backslashes in order for the paths to be correct. >-#ifdef WIN32 >- std::replace(temp_path.begin(), temp_path.end(), '/', '\\'); >-#endif >- >- // Trim away the executable name; only store the relative dir path. >- temp_path = DirName(temp_path); >- strncpy(relative_dir_path, temp_path.c_str(), FILENAME_MAX); >- relative_dir_path_set = true; >-} >- > bool FileExists(const std::string& file_name) { > struct stat file_info = {0}; > return stat(file_name.c_str(), &file_info) == 0; >@@ -133,69 +94,12 @@ bool DirExists(const std::string& directory_name) { > S_ISDIR(directory_info.st_mode); > } > >-// Finds the WebRTC src dir. >-// The returned path always ends with a path separator. >-std::string ProjectRootPath() { >-#if defined(WEBRTC_ANDROID) >- return kAndroidChromiumTestsRoot; >-#elif defined WEBRTC_IOS >- return IOSRootPath(); >-#elif defined(WEBRTC_MAC) >- std::string path; >- GetNSExecutablePath(&path); >- std::string exe_dir = DirName(path); >- // On Mac, tests execute in out/Whatever, so src is two levels up except if >- // the test is bundled (which our tests are not), in which case it's 5 levels. >- return DirName(DirName(exe_dir)) + kPathDelimiter; >-#elif defined(WEBRTC_POSIX) >- char buf[PATH_MAX]; >- ssize_t count = ::readlink("/proc/self/exe", buf, arraysize(buf)); >- if (count <= 0) { >- RTC_NOTREACHED() << "Unable to resolve /proc/self/exe."; >- return kCannotFindProjectRootDir; >- } >- // On POSIX, tests execute in out/Whatever, so src is two levels up. 
>- std::string exe_dir = DirName(std::string(buf, count)); >- return DirName(DirName(exe_dir)) + kPathDelimiter; >-#elif defined(WEBRTC_WIN) >- wchar_t buf[MAX_PATH]; >- buf[0] = 0; >- if (GetModuleFileName(NULL, buf, MAX_PATH) == 0) >- return kCannotFindProjectRootDir; >- >- std::string exe_path = rtc::ToUtf8(std::wstring(buf)); >- std::string exe_dir = DirName(exe_path); >- return DirName(DirName(exe_dir)) + kPathDelimiter; >-#endif >-} >- > std::string OutputPath() { >-#if defined(WEBRTC_IOS) >- return IOSOutputPath(); >-#elif defined(WEBRTC_ANDROID) >- return kAndroidChromiumTestsRoot; >-#else >- std::string path = ProjectRootPath(); >- RTC_DCHECK_NE(path, kCannotFindProjectRootDir); >- path += "out"; >- if (!CreateDir(path)) { >- return "./"; >- } >- return path + kPathDelimiter; >-#endif >+ return webrtc::test::internal::OutputPath(); > } > > std::string WorkingDir() { >-#if defined(WEBRTC_ANDROID) >- return kAndroidChromiumTestsRoot; >-#endif >- char path_buffer[FILENAME_MAX]; >- if (!GET_CURRENT_DIR(path_buffer, sizeof(path_buffer))) { >- fprintf(stderr, "Cannot get current directory!\n"); >- return "./"; >- } else { >- return std::string(path_buffer); >- } >+ return webrtc::test::internal::WorkingDir(); > } > > // Generate a temporary filename in a safe way. >@@ -322,30 +226,7 @@ bool RemoveFile(const std::string& file_name) { > > std::string ResourcePath(const std::string& name, > const std::string& extension) { >-#if defined(WEBRTC_IOS) >- return IOSResourcePath(name, extension); >-#else >- std::string platform = "win"; >-#ifdef WEBRTC_LINUX >- platform = "linux"; >-#endif // WEBRTC_LINUX >-#ifdef WEBRTC_MAC >- platform = "mac"; >-#endif // WEBRTC_MAC >-#ifdef WEBRTC_ANDROID >- platform = "android"; >-#endif // WEBRTC_ANDROID >- >- std::string resources_path = >- ProjectRootPath() + kResourcesDirName + kPathDelimiter; >- std::string resource_file = >- resources_path + name + "_" + platform + "." 
+ extension; >- if (FileExists(resource_file)) { >- return resource_file; >- } >- // Fall back on name without platform. >- return resources_path + name + "." + extension; >-#endif // defined (WEBRTC_IOS) >+ return webrtc::test::internal::ResourcePath(name, extension); > } > > std::string JoinFilename(const std::string& dir, const std::string& name) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/fileutils.h b/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/fileutils.h >index 5ff019600ed5981ecfef25461169e975d492296a..8a186988d638086d16445be21f5fe14c1db567f1 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/fileutils.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/fileutils.h >@@ -25,6 +25,9 @@ namespace test { > // to find the project root. > extern const char* kCannotFindProjectRootDir; > >+// Slash or backslash, depending on platform. NUL-terminated string. >+extern const char* kPathDelimiter; >+ > // Returns the absolute path to the output directory where log files and other > // test artifacts should be put. The output directory is generally a directory > // named "out" at the project root. This root is assumed to be two levels above >@@ -51,18 +54,8 @@ std::string TempFilename(const std::string& dir, const std::string& prefix); > std::string GenerateTempFilename(const std::string& dir, > const std::string& prefix); > >-// Returns a path to a resource file for the currently executing platform. >-// Adapts to what filenames are currently present in the >-// [project-root]/resources/ dir. >-// Returns an absolute path according to this priority list (the directory >-// part of the path is left out for readability): >-// 1. [name]_[platform]_[architecture].[extension] >-// 2. [name]_[platform].[extension] >-// 3. [name]_[architecture].[extension] >-// 4. [name].[extension] >-// Where >-// * platform is either of "win", "mac" or "linux". >-// * architecture is either of "32" or "64". 
>+// Returns a path to a resource file in [project-root]/resources/ dir. >+// Returns an absolute path. > // > // Arguments: > // name - Name of the resource file. If a plain filename (no directory path) >@@ -110,15 +103,6 @@ std::string DirName(const std::string& path); > // empty or if the file does not exist/is readable. > size_t GetFileSize(const std::string& filename); > >-// Sets the executable path, i.e. the path to the executable that is being used >-// when launching it. This is usually the path relative to the working directory >-// but can also be an absolute path. The intention with this function is to pass >-// the argv[0] being sent into the main function to make it possible for >-// fileutils.h to find the correct project paths even when the working directory >-// is outside the project tree (which happens in some cases). >-// TODO(bugs.webrtc.org/9792): Deprecated - going away soon. >-void SetExecutablePath(const std::string& path_to_executable); >- > } // namespace test > } // namespace webrtc > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/fileutils_override.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/fileutils_override.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..884069c10dd4f610cdbea906acf61aa323b2a0ac >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/fileutils_override.cc >@@ -0,0 +1,151 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree.
>+ */ >+ >+#include "test/testsupport/fileutils_override.h" >+ >+#if defined(WEBRTC_WIN) >+#include <direct.h> >+#include <tchar.h> >+#include <windows.h> >+#include <algorithm> >+#include <codecvt> >+#include <locale> >+ >+#include "Shlwapi.h" >+#include "WinDef.h" >+ >+#include "rtc_base/win32.h" >+#define GET_CURRENT_DIR _getcwd >+#else >+#include <dirent.h> >+#include <unistd.h> >+#define GET_CURRENT_DIR getcwd >+#endif >+ >+#if defined(WEBRTC_IOS) >+#include "test/testsupport/iosfileutils.h" >+#endif >+ >+#if defined(WEBRTC_MAC) >+#include "test/testsupport/macfileutils.h" >+#endif >+ >+#include "absl/types/optional.h" >+#include "rtc_base/arraysize.h" >+#include "rtc_base/checks.h" >+#include "rtc_base/stringutils.h" >+ >+namespace webrtc { >+namespace test { >+ >+std::string DirName(const std::string& path); >+bool CreateDir(const std::string& directory_name); >+ >+namespace internal { >+ >+namespace { >+#if defined(WEBRTC_WIN) >+const char* kPathDelimiter = "\\"; >+#elif !defined(WEBRTC_IOS) >+const char* kPathDelimiter = "/"; >+#endif >+ >+#if defined(WEBRTC_ANDROID) >+// This is a special case in Chrome infrastructure. See >+// base/test/test_support_android.cc. >+const char* kAndroidChromiumTestsRoot = "/sdcard/chromium_tests_root/"; >+#endif >+ >+#if !defined(WEBRTC_IOS) >+const char* kResourcesDirName = "resources"; >+#endif >+ >+} // namespace >+ >+// Finds the WebRTC src dir. >+// The returned path always ends with a path separator. >+absl::optional<std::string> ProjectRootPath() { >+#if defined(WEBRTC_ANDROID) >+ return kAndroidChromiumTestsRoot; >+#elif defined WEBRTC_IOS >+ return IOSRootPath(); >+#elif defined(WEBRTC_MAC) >+ std::string path; >+ GetNSExecutablePath(&path); >+ std::string exe_dir = DirName(path); >+ // On Mac, tests execute in out/Whatever, so src is two levels up except if >+ // the test is bundled (which our tests are not), in which case it's 5 levels. 
>+ return DirName(DirName(exe_dir)) + kPathDelimiter; >+#elif defined(WEBRTC_POSIX) >+ char buf[PATH_MAX]; >+ ssize_t count = ::readlink("/proc/self/exe", buf, arraysize(buf)); >+ if (count <= 0) { >+ RTC_NOTREACHED() << "Unable to resolve /proc/self/exe."; >+ return absl::nullopt; >+ } >+ // On POSIX, tests execute in out/Whatever, so src is two levels up. >+ std::string exe_dir = DirName(std::string(buf, count)); >+ return DirName(DirName(exe_dir)) + kPathDelimiter; >+#elif defined(WEBRTC_WIN) >+ wchar_t buf[MAX_PATH]; >+ buf[0] = 0; >+ if (GetModuleFileName(NULL, buf, MAX_PATH) == 0) >+ return absl::nullopt; >+ >+ std::string exe_path = rtc::ToUtf8(std::wstring(buf)); >+ std::string exe_dir = DirName(exe_path); >+ return DirName(DirName(exe_dir)) + kPathDelimiter; >+#endif >+} >+ >+std::string OutputPath() { >+#if defined(WEBRTC_IOS) >+ return IOSOutputPath(); >+#elif defined(WEBRTC_ANDROID) >+ return kAndroidChromiumTestsRoot; >+#else >+ absl::optional<std::string> path_opt = ProjectRootPath(); >+ RTC_DCHECK(path_opt); >+ std::string path = *path_opt + "out"; >+ if (!CreateDir(path)) { >+ return "./"; >+ } >+ return path + kPathDelimiter; >+#endif >+} >+ >+std::string WorkingDir() { >+#if defined(WEBRTC_ANDROID) >+ return kAndroidChromiumTestsRoot; >+#endif >+ char path_buffer[FILENAME_MAX]; >+ if (!GET_CURRENT_DIR(path_buffer, sizeof(path_buffer))) { >+ fprintf(stderr, "Cannot get current directory!\n"); >+ return "./"; >+ } else { >+ return std::string(path_buffer); >+ } >+} >+ >+std::string ResourcePath(const std::string& name, >+ const std::string& extension) { >+#if defined(WEBRTC_IOS) >+ return IOSResourcePath(name, extension); >+#else >+ absl::optional<std::string> path_opt = ProjectRootPath(); >+ RTC_DCHECK(path_opt); >+ std::string resources_path = *path_opt + kResourcesDirName + kPathDelimiter; >+ return resources_path + name + "." 
+ extension; >+#endif >+} >+ >+} // namespace internal >+} // namespace test >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/fileutils_override.h b/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/fileutils_override.h >new file mode 100644 >index 0000000000000000000000000000000000000000..141c76860716ad7892edddbd8359931cf20b026b >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/fileutils_override.h >@@ -0,0 +1,57 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#include <stdio.h> >+#include <string> >+ >+#ifndef TEST_TESTSUPPORT_FILEUTILS_OVERRIDE_H_ >+#define TEST_TESTSUPPORT_FILEUTILS_OVERRIDE_H_ >+ >+namespace webrtc { >+namespace test { >+namespace internal { >+ >+// Returns the absolute path to the output directory where log files and other >+// test artifacts should be put. The output directory is generally a directory >+// named "out" at the project root. This root is assumed to be two levels above >+// where the test binary is located; this is because tests execute in a dir >+// out/Whatever relative to the project root. This convention is also followed >+// in Chromium. >+// >+// The exception is Android where we use /sdcard/ instead. >+// >+// If symbolic links occur in the path they will be resolved and the actual >+// directory will be returned. >+// >+// Returns the path WITH a trailing path delimiter. If the project root is not >+// found, the current working directory ("./") is returned as a fallback. 
>+std::string OutputPath(); >+ >+// Gets the current working directory for the executing program. >+// Returns "./" if for some reason it is not possible to find the working >+// directory. >+std::string WorkingDir(); >+ >+// Returns a path to a resource file in [project-root]/resources/ dir. >+// Returns an absolute path. >+// >+// Arguments: >+// name - Name of the resource file. If a plain filename (no directory path) >+// is supplied, the file is assumed to be located in resources/ >+// If a directory path is prepended to the filename, a subdirectory >+// hierarchy reflecting that path is assumed to be present. >+// extension - File extension, without the dot, e.g. "bmp" or "yuv". >+std::string ResourcePath(const std::string& name, const std::string& extension); >+ >+} // namespace internal >+} // namespace test >+} // namespace webrtc >+ >+#endif // TEST_TESTSUPPORT_FILEUTILS_OVERRIDE_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/fileutils_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/fileutils_unittest.cc >index c6dc86deb200a3c3c02d28d104e265266870982a..d39f46800418da7c376768fad123ee71969604b6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/fileutils_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/fileutils_unittest.cc >@@ -24,9 +24,6 @@ > > #ifdef WIN32 > #define chdir _chdir >-static const char* kPathDelimiter = "\\"; >-#else >-static const char* kPathDelimiter = "/"; > #endif > > using ::testing::EndsWith; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/jpeg_frame_writer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/jpeg_frame_writer.cc >index 537e85cfa5e55f10029114432a25e977f68b5f98..8bf1ee4630e1f3b84a7da43fd171c9d1b1263457 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/jpeg_frame_writer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/jpeg_frame_writer.cc
>@@ -10,7 +10,6 @@ > > #include <stdio.h> > >-#include "common_types.h" // NOLINT(build/include) > #include "common_video/libyuv/include/webrtc_libyuv.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/test_artifacts.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/test_artifacts.cc >index 0f7e0cd4ff86606d0b9ab2ccfbcca352a390d7ee..9438cefbc4fd7e186bbc22efb7dabca2849a44ef 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/test_artifacts.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/test_artifacts.cc >@@ -24,9 +24,9 @@ const std::string& DefaultArtifactPath() { > } > } // namespace > >-DEFINE_string(test_artifacts_dir, >- DefaultArtifactPath().c_str(), >- "The output folder where test output should be saved."); >+WEBRTC_DEFINE_string(test_artifacts_dir, >+ DefaultArtifactPath().c_str(), >+ "The output folder where test output should be saved."); > > namespace webrtc { > namespace test { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/test_artifacts_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/test_artifacts_unittest.cc >index c423cd90eada908af3c92ffa9de2e1f86196ddb9..df02d27c6725d6167226ef2410f57040513c395b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/test_artifacts_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/testsupport/test_artifacts_unittest.cc >@@ -16,12 +16,11 @@ > > #include "rtc_base/file.h" > #include "rtc_base/flags.h" >-#include "rtc_base/pathutils.h" > #include "rtc_base/platform_file.h" > #include "test/gtest.h" > #include "test/testsupport/fileutils.h" > >-DECLARE_string(test_artifacts_dir); >+WEBRTC_DECLARE_string(test_artifacts_dir); > > namespace webrtc { > namespace test { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/vcm_capturer.h 
b/Source/ThirdParty/libwebrtc/Source/webrtc/test/vcm_capturer.h >index 59df4c71e2aa863e1bcd89f6f0013aff4ab93420..5f4707f911551f449cb8cf42f46b64036180f7fc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/vcm_capturer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/vcm_capturer.h >@@ -12,7 +12,6 @@ > > #include <memory> > >-#include "common_types.h" // NOLINT(build/include) > #include "common_video/libyuv/include/webrtc_libyuv.h" > #include "modules/video_capture/video_capture.h" > #include "rtc_base/criticalsection.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/video_encoder_proxy_factory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/test/video_encoder_proxy_factory.h >index e7289c0399e2667ee8f13256648d603755e345f4..9e5ea12ce2bcbb9382f335381b559b95740dd72a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/video_encoder_proxy_factory.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/video_encoder_proxy_factory.h >@@ -71,23 +71,17 @@ class VideoEncoderProxyFactory final : public VideoEncoderFactory { > size_t max_payload_size) override { > return encoder_->InitEncode(config, number_of_cores, max_payload_size); > } >- VideoEncoder::ScalingSettings GetScalingSettings() const override { >- return encoder_->GetScalingSettings(); >- } > int32_t RegisterEncodeCompleteCallback( > EncodedImageCallback* callback) override { > return encoder_->RegisterEncodeCompleteCallback(callback); > } > int32_t Release() override { return encoder_->Release(); } >- int32_t SetChannelParameters(uint32_t packet_loss, int64_t rtt) override { >- return encoder_->SetChannelParameters(packet_loss, rtt); >- } > int32_t SetRateAllocation(const VideoBitrateAllocation& rate_allocation, > uint32_t framerate) override { > return encoder_->SetRateAllocation(rate_allocation, framerate); > } >- const char* ImplementationName() const override { >- return encoder_->ImplementationName(); >+ VideoEncoder::EncoderInfo GetEncoderInfo() const override { >+ 
return encoder_->GetEncoderInfo(); > } > > VideoEncoder* const encoder_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/test/win/run_loop_win.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/test/win/run_loop_win.cc >index 28dc9334e086ff5919c639bd660275f56851b9e4..ee936358fa6d2be03fa7598bd84870b47b5f90b5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/test/win/run_loop_win.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/test/win/run_loop_win.cc >@@ -18,15 +18,18 @@ > namespace webrtc { > namespace test { > >-void PressEnterToContinue() { >+void PressEnterToContinue(SingleThreadedTaskQueueForTesting &task_queue) { > puts(">> Press ENTER to continue..."); > >- MSG msg; >- BOOL ret; >- while ((ret = GetMessage(&msg, NULL, 0, 0)) != 0) { >- assert(ret != -1); >- TranslateMessage(&msg); >- DispatchMessage(&msg); >+ while (!_kbhit() || _getch() != '\r') { >+ // Drive the message loop for the thread running the task_queue >+ task_queue.SendTask([&]() { >+ MSG msg; >+ if (PeekMessage(&msg, NULL, 0, 0, PM_REMOVE)) { >+ TranslateMessage(&msg); >+ DispatchMessage(&msg); >+ } >+ }); > } > } > } // namespace test >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/android/suppressions.xml b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/android/suppressions.xml >index 67e5b4165302fe83c1bfecca6ec86a23522da19a..45375c2fe71c8d3626ccf9fd01e6305d5eca3211 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/android/suppressions.xml >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/android/suppressions.xml >@@ -22,7 +22,10 @@ > > <issue id="GoogleAppIndexingWarning" severity="ignore"/> > <issue id="MissingRegistered" severity="ignore"/> >- >+ <issue id="LintError"> >+ <!-- We no longer supply class files to lint. --> >+ <ignore regexp="No `.class` files were found in project"/> >+ </issue> > <!-- These are just from the dummy AndroidManifest.xml we use for linting. > It's in the same directory as this file. 
--> > <issue id="MissingApplicationIcon" severity="ignore"/> >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/autoroller/roll_deps.py b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/autoroller/roll_deps.py >index 9999c6f6afbf6df1da110d152039f2da20c38df5..d3a8bdbca8f9fb27f2f1134f88dd051d9ee1de8e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/autoroller/roll_deps.py >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/autoroller/roll_deps.py >@@ -70,6 +70,8 @@ ANDROID_DEPS_END = r'=== ANDROID_DEPS Generated Code End ===' > # Location of automically gathered android deps. > ANDROID_DEPS_PATH = 'src/third_party/android_deps/' > >+NOTIFY_EMAIL = 'webrtc-trooper@grotations.appspotmail.com' >+ > > sys.path.append(os.path.join(CHECKOUT_SRC_DIR, 'build')) > import find_depot_tools >@@ -256,8 +258,9 @@ def BuildDepsentryDict(deps_dict): > > > def _FindChangedCipdPackages(path, old_pkgs, new_pkgs): >- assert ({p['package'] for p in old_pkgs} == >- {p['package'] for p in new_pkgs}) >+ pkgs_equal = ({p['package'] for p in old_pkgs} == >+ {p['package'] for p in new_pkgs}) >+ assert pkgs_equal, 'Old: %s\n New: %s' % (old_pkgs, new_pkgs) > for old_pkg in old_pkgs: > for new_pkg in new_pkgs: > old_version = old_pkg['version'] >@@ -592,10 +595,11 @@ def _UploadCL(commit_queue_mode): > - 1: Run trybots but do not submit to CQ. > - 0: Skip CQ, upload only. 
> """ >- cmd = ['git', 'cl', 'upload', '--force', '--bypass-hooks'] >+ cmd = ['git', 'cl', 'upload', '--force', '--bypass-hooks', '--send-mail'] >+ cmd.extend(['--cc', NOTIFY_EMAIL]) > if commit_queue_mode >= 2: > logging.info('Sending the CL to the CQ...') >- cmd.extend(['--use-commit-queue', '--send-mail']) >+ cmd.extend(['--use-commit-queue']) > elif commit_queue_mode >= 1: > logging.info('Starting CQ dry run...') > cmd.extend(['--cq-dry-run']) >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/coverage/generate_coverage_command.py b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/coverage/generate_coverage_command.py >index a402f38da7d45893dba77d32adc83eacb312d073..856666816dff1659d2f681a62807f477278c3e91 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/coverage/generate_coverage_command.py >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/coverage/generate_coverage_command.py >@@ -22,13 +22,13 @@ TESTS = [ > 'video_engine_tests', > 'tools_unittests', > 'test_support_unittests', >+ 'slow_tests', > 'system_wrappers_unittests', > 'rtc_unittests', > 'rtc_stats_unittests', > 'rtc_pc_unittests', > 'rtc_media_unittests', > 'peerconnection_unittests', >- 'ortc_unittests', > 'modules_unittests', > 'modules_tests', > 'low_bandwidth_audio_test', >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/coverage/generate_ios_coverage_command.py b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/coverage/generate_ios_coverage_command.py >new file mode 100644 >index 0000000000000000000000000000000000000000..f81ee2c62b0a75d2e9ace9c2afc95ff6db62c8d4 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/coverage/generate_ios_coverage_command.py >@@ -0,0 +1,174 @@ >+#!/usr/bin/env python >+# Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. 
>+# >+# Use of this source code is governed by a BSD-style license >+# that can be found in the LICENSE file in the root of the source >+# tree. An additional intellectual property rights grant can be found >+# in the file PATENTS. All contributing project authors may >+# be found in the AUTHORS file in the root of the source tree. >+ >+"""Generates command-line instructions to produce one-time iOS coverage using >+coverage.py. >+ >+This script is usable for both real devices and the simulator. >+But for real devices, actual execution should be done manually from Xcode >+and coverage.profraw files should also be copied manually from the device. >+ >+Additional prerequisites: >+ >+1. Xcode 10+ with iPhone Simulator SDK. Can be installed with the command: >+ $ mac_toolchain install -kind ios -xcode-version 10l232m \ >+ -output-dir build/mac_files/Xcode.app >+ >+2. For computing coverage on a real device you probably also need to apply >+the following patch to the code_coverage/coverage.py script: >+ >+========== BEGINNING OF PATCH ========== >+--- a/code_coverage/coverage.py >++++ b/code_coverage/coverage.py >+@@ -693,8 +693,7 @@ def _AddArchArgumentForIOSIfNeeded(cmd_list, num_archs): >+ to use, and one architecture needs to be specified for each binary.
>+ "" " >+if _IsIOS(): >+- cmd_list.extend(['-arch=x86_64'] * num_archs) >++ cmd_list.extend(['-arch=arm64'] * num_archs) >+ >+ >+def _GetBinaryPath(command): >+@@ -836,8 +835,8 @@ def _GetBinaryPathsFromTargets(targets, build_dir): >+ binary_path = os.path.join(build_dir, target) >+ if coverage_utils.GetHostPlatform() == 'win': >+ binary_path += '.exe' >++ elif coverage_utils.GetHostPlatform() == 'mac': >++ binary_path += '.app/%s' % target >+ >+if os.path.exists(binary_path): >+ binary_paths.append(binary_path) >+========== ENDING OF PATCH ========== >+ >+""" >+ >+import sys >+ >+DIRECTORY = 'out/coverage' >+ >+TESTS = [ >+ 'audio_decoder_unittests', >+ 'common_audio_unittests', >+ 'common_video_unittests', >+ 'modules_tests', >+ 'modules_unittests', >+ 'rtc_media_unittests', >+ 'rtc_pc_unittests', >+ 'rtc_stats_unittests', >+ 'rtc_unittests', >+ 'slow_tests', >+ 'system_wrappers_unittests', >+ 'test_support_unittests', >+ 'tools_unittests', >+ 'video_capture_tests', >+ 'video_engine_tests', >+ 'webrtc_nonparallel_tests', >+] >+ >+XC_TESTS = [ >+ 'apprtcmobile_tests', >+ 'sdk_framework_unittests', >+ 'sdk_unittests', >+] >+ >+ >+def FormatIossimTest(test_name, is_xctest=False): >+ args = ['%s/%s.app' % (DIRECTORY, test_name)] >+ if is_xctest: >+ args += ['%s/%s_module.xctest' % (DIRECTORY, test_name)] >+ >+ return '-c \'%s/iossim %s\'' % (DIRECTORY, ' '.join(args)) >+ >+ >+def GetGNArgs(is_simulator): >+ target_cpu = 'x64' if is_simulator else 'arm64' >+ return ([] + >+ ['target_os="ios"'] + >+ ['target_cpu="%s"' % target_cpu] + >+ ['use_clang_coverage=true'] + >+ ['is_component_build=false'] + >+ ['dcheck_always_on=true']) >+ >+ >+def GenerateIOSSimulatorCommand(): >+ gn_args_string = ' '.join(GetGNArgs(is_simulator=True)) >+ gn_cmd = ['gn', 'gen', DIRECTORY, '--args=\'%s\'' % gn_args_string] >+ >+ coverage_cmd = ( >+ [sys.executable, 'tools/code_coverage/coverage.py'] + >+ ["%s.app" % t for t in XC_TESTS + TESTS] + >+ ['-b %s' % DIRECTORY, '-o out/report'] + >+ 
['-i=\'.*/out/.*|.*/third_party/.*|.*test.*\''] + >+ [FormatIossimTest(t, is_xctest=True) for t in XC_TESTS] + >+ [FormatIossimTest(t, is_xctest=False) for t in TESTS] >+ ) >+ >+ print 'To get code coverage using iOS simulator just run following commands:' >+ print '' >+ print ' '.join(gn_cmd) >+ print '' >+ print ' '.join(coverage_cmd) >+ return 0 >+ >+ >+def GenerateIOSDeviceCommand(): >+ gn_args_string = ' '.join(GetGNArgs(is_simulator=False)) >+ >+ coverage_report_cmd = ( >+ [sys.executable, 'tools/code_coverage/coverage.py'] + >+ ['%s.app' % t for t in TESTS] + >+ ['-b %s' % DIRECTORY] + >+ ['-o out/report'] + >+ ['-p %s/merged.profdata' % DIRECTORY] + >+ ['-i=\'.*/out/.*|.*/third_party/.*|.*test.*\''] >+ ) >+ >+ print 'Computing code coverage for real iOS device is a little bit tedious.' >+ print '' >+ print 'You will need:' >+ print '' >+ print '1. Generate xcode project and open it with Xcode 10+:' >+ print ' gn gen %s --ide=xcode --args=\'%s\'' % (DIRECTORY, gn_args_string) >+ print ' open %s/all.xcworkspace' % DIRECTORY >+ print '' >+ print '2. Execute these Run targets manually with Xcode Run button and ' >+ print 'manually save generated coverage.profraw file to %s:' % DIRECTORY >+ print '\n'.join('- %s' % t for t in TESTS) >+ print '' >+ print '3. Execute these Test targets manually with Xcode Test button and ' >+ print 'manually save generated coverage.profraw file to %s:' % DIRECTORY >+ print '\n'.join('- %s' % t for t in XC_TESTS) >+ print '' >+ print '4. Merge *.profraw files to *.profdata using llvm-profdata tool:' >+ print (' build/mac_files/Xcode.app/Contents/Developer/Toolchains/' + >+ 'XcodeDefault.xctoolchain/usr/bin/llvm-profdata merge ' + >+ '-o %s/merged.profdata ' % DIRECTORY + >+ '-sparse=true %s/*.profraw' % DIRECTORY) >+ print '' >+ print '5. 
Generate coverage report:' >+ print ' ' + ' '.join(coverage_report_cmd) >+ return 0 >+ >+ >+def Main(): >+ if len(sys.argv) < 2: >+ print 'Please specify type of coverage:' >+ print ' %s simulator' % sys.argv[0] >+ print ' %s device' % sys.argv[0] >+ elif sys.argv[1] == 'simulator': >+ GenerateIOSSimulatorCommand() >+ elif sys.argv[1] == 'device': >+ GenerateIOSDeviceCommand() >+ else: >+ print 'Unsupported type of coverage' >+ >+ return 0 >+ >+if __name__ == '__main__': >+ sys.exit(Main()) >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/gn_check_autofix.py b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/gn_check_autofix.py >index b939636d0c831e928772389ab5f19155eddefa6f..e44f7122dfff9ef037cbc9d46062ccc9462a759d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/gn_check_autofix.py >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/gn_check_autofix.py >@@ -31,6 +31,9 @@ from collections import defaultdict > > SCRIPT_DIR = os.path.dirname(os.path.abspath(__file__)) > >+CHROMIUM_DIRS = ['base', 'build', 'buildtools', >+ 'testing', 'third_party', 'tools'] >+ > TARGET_RE = re.compile( > r'(?P<indentation_level>\s*)\w*\("(?P<target_name>\w*)"\) {$') > >@@ -88,22 +91,48 @@ def FixErrors(filename, missing_deps, deleted_sources): > > Run(['gn', 'format', filename]) > >+def FirstNonEmpty(iterable): >+ """Return first item which evaluates to True, or fallback to None.""" >+ return next((x for x in iterable if x), None) >+ > def Rebase(base_path, dependency_path, dependency): >- base_path = base_path.split(os.path.sep) >- dependency_path = dependency_path.split(os.path.sep) >- >- first_difference = None >- shortest_length = min(len(dependency_path), len(base_path)) >- for i in range(shortest_length): >- if dependency_path[i] != base_path[i]: >- first_difference = i >- break >- >- first_difference = first_difference or shortest_length >- base_path = base_path[first_difference:] >- dependency_path = 
dependency_path[first_difference:] >- return (os.path.sep.join((['..'] * len(base_path)) + dependency_path) + >- ':' + dependency) >+ """Adapt paths so they work both in stand-alone WebRTC and Chromium tree. >+ >+ To cope with varying top-level directory (WebRTC VS Chromium), we use: >+ * relative paths for WebRTC modules. >+ * absolute paths for shared ones. >+ E.g. '//common_audio/...' -> '../../common_audio/' >+ '//third_party/...' remains as is. >+ >+ Args: >+ base_path: current module path (E.g. '//video') >+ dependency_path: path from root (E.g. '//rtc_base/time') >+ dependency: target itself (E.g. 'timestamp_extrapolator') >+ >+ Returns: >+ Full target path (E.g. '../rtc_base/time:timestamp_extrapolator'). >+ """ >+ >+ root = FirstNonEmpty(dependency_path.split('/')) >+ if root in CHROMIUM_DIRS: >+ # Chromium paths must remain absolute. E.g. //third_party//abseil-cpp... >+ rebased = dependency_path >+ else: >+ base_path = base_path.split(os.path.sep) >+ dependency_path = dependency_path.split(os.path.sep) >+ >+ first_difference = None >+ shortest_length = min(len(dependency_path), len(base_path)) >+ for i in range(shortest_length): >+ if dependency_path[i] != base_path[i]: >+ first_difference = i >+ break >+ >+ first_difference = first_difference or shortest_length >+ base_path = base_path[first_difference:] >+ dependency_path = dependency_path[first_difference:] >+ rebased = os.path.sep.join((['..'] * len(base_path)) + dependency_path) >+ return rebased + ':' + dependency > > def main(): > deleted_sources = set() >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS32_Debug.json b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS32_Debug.json >index d2966ec3ad4d67ece0e87edf2bc5bd5b6aa52fad..b14809ff3d4effc3dc773be9bae433c457a38134 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS32_Debug.json >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS32_Debug.json >@@ -1,6 +1,7 @@ > { > "comments": [ >- "Builder for 32-bit devices." >+ "Builder for 32-bit devices.", >+ "NOTE: Update cache entry in cr-buildbucket.cfg when changing xcode version." > ], > "xcode build version": "10l232m", > "gn_args": [ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS32_Release.json b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS32_Release.json >index e72a2291787c70f9e301b78a66df56a14cf3e4a5..97263256de9919c9b1f7e932bbf4dd4cdef8f12b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS32_Release.json >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS32_Release.json >@@ -1,6 +1,7 @@ > { > "comments": [ >- "Builder for 32-bit devices." >+ "Builder for 32-bit devices.", >+ "NOTE: Update cache entry in cr-buildbucket.cfg when changing xcode version." > ], > "xcode build version": "10l232m", > "gn_args": [ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS32_Sim_Debug_(iOS_9.0).json b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS32_Sim_Debug_(iOS_9.0).json >index 7a56d4618075068ceb74b114418c24fa66b7cb6a..03b272bb5347b3cdd2890469e3138ad4fac9157a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS32_Sim_Debug_(iOS_9.0).json >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS32_Sim_Debug_(iOS_9.0).json >@@ -1,6 +1,7 @@ > { > "comments": [ >- "Tests for 32-bit iOS simulators." >+ "Tests for 32-bit iOS simulators.", >+ "NOTE: Update cache entry in cr-buildbucket.cfg when changing xcode version." 
> ], > "xcode build version": "10l232m", > "gn_args": [ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS64_Debug.json b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS64_Debug.json >index 6532c90f5097bab8e6e3c0cd66d6d5f6cc7f5cf2..f1fb7fa71901a80d3338939830610501928b8195 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS64_Debug.json >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS64_Debug.json >@@ -1,6 +1,7 @@ > { > "comments": [ >- "Builder for 64-bit devices." >+ "Builder for 64-bit devices.", >+ "NOTE: Update cache entry in cr-buildbucket.cfg when changing xcode version." > ], > "xcode build version": "10l232m", > "gn_args": [ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS64_Release.json b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS64_Release.json >index 8ca464c3c4300555fc8a689658aa9298e2d0da74..c2555c6a170d420a9cb46c4ff603de29172d4641 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS64_Release.json >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS64_Release.json >@@ -1,6 +1,7 @@ > { > "comments": [ >- "Builder for 64-bit devices." >+ "Builder for 64-bit devices.", >+ "NOTE: Update cache entry in cr-buildbucket.cfg when changing xcode version." 
> ], > "xcode build version": "10l232m", > "gn_args": [ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS64_Sim_Debug_(iOS_10.0).json b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS64_Sim_Debug_(iOS_10.0).json >index e759d95e3686b972c524d98c9a2d01d8075bb561..d8a71f665a342ad913537af80b630d489769125d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS64_Sim_Debug_(iOS_10.0).json >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS64_Sim_Debug_(iOS_10.0).json >@@ -1,6 +1,7 @@ > { > "comments": [ >- "Tests for 64-bit iOS simulators." >+ "Tests for 64-bit iOS simulators.", >+ "NOTE: Update cache entry in cr-buildbucket.cfg when changing xcode version." > ], > "xcode build version": "10l232m", > "gn_args": [ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS64_Sim_Debug_(iOS_11.0).json b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS64_Sim_Debug_(iOS_11.0).json >index 3dad91ba64f7e3faf78d4188624ad4823e48bc25..c303088b44977e10856fd394ee099a54ecfc89de 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS64_Sim_Debug_(iOS_11.0).json >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS64_Sim_Debug_(iOS_11.0).json >@@ -1,6 +1,7 @@ > { > "comments": [ >- "Tests for 64-bit iOS simulators." >+ "Tests for 64-bit iOS simulators.", >+ "NOTE: Update cache entry in cr-buildbucket.cfg when changing xcode version." 
> ], > "xcode build version": "10l232m", > "gn_args": [ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS64_Sim_Debug_(iOS_9.0).json b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS64_Sim_Debug_(iOS_9.0).json >index d27cdaa45f540c56a0827cda6aaa6f939dfbed79..bcf52d728326ecb278c5d12446e7b9135920de19 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS64_Sim_Debug_(iOS_9.0).json >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/client.webrtc/iOS64_Sim_Debug_(iOS_9.0).json >@@ -1,6 +1,7 @@ > { > "comments": [ >- "Tests for 64-bit iOS simulators." >+ "Tests for 64-bit iOS simulators.", >+ "NOTE: Update cache entry in cr-buildbucket.cfg when changing xcode version." > ], > "xcode build version": "10l232m", > "gn_args": [ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/generate_modulemap.py b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/generate_modulemap.py >new file mode 100644 >index 0000000000000000000000000000000000000000..45bd3d875e7cd0a062c75fc0a2ff13fc7d1fe66d >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/generate_modulemap.py >@@ -0,0 +1,32 @@ >+# Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+# >+# Use of this source code is governed by a BSD-style license >+# that can be found in the LICENSE file in the root of the source >+# tree. An additional intellectual property rights grant can be found >+# in the file PATENTS. All contributing project authors may >+# be found in the AUTHORS file in the root of the source tree. 
>+ >+import argparse >+import sys >+ >+def GenerateModulemap(): >+ parser = argparse.ArgumentParser(description='Generate modulemap') >+ parser.add_argument("-o", "--out", type=str, help="Output file.") >+ parser.add_argument("-n", "--name", type=str, help="Name of binary.") >+ >+ args = parser.parse_args() >+ >+ with open(args.out, "w") as outfile: >+ module_template = 'framework module %s {\n' \ >+ ' umbrella header "%s.h"\n' \ >+ '\n' \ >+ ' export *\n' \ >+ ' module * { export * }\n' \ >+ '}\n' % (args.name, args.name) >+ outfile.write(module_template) >+ return 0 >+ >+ >+if __name__ == '__main__': >+ sys.exit(GenerateModulemap()) >+ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/internal.client.webrtc/iOS64_Debug.json b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/internal.client.webrtc/iOS64_Debug.json >index 7c3b3cee1da128adc6b3bca0853a184e5e4e3b33..a97d8b706f82150fd236e11855ab4bd37a77ad83 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/internal.client.webrtc/iOS64_Debug.json >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/internal.client.webrtc/iOS64_Debug.json >@@ -21,6 +21,7 @@ > "tests": [ > { > "include": "real_device_tests.json", >+ "device type": "iPhone 6s", > "os": "11.4.1", > "dimensions": [ > { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/internal.client.webrtc/iOS64_Perf.json b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/internal.client.webrtc/iOS64_Perf.json >index b5357661a8427b21cf3eb37226c7267498f8d9ab..88bfe75264ad56b2552afd01b34bfab9a9a4b9ed 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/internal.client.webrtc/iOS64_Perf.json >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/internal.client.webrtc/iOS64_Perf.json >@@ -21,13 +21,8 @@ > "tests": [ > { > "include": "perf_tests.json", >+ "device type": "iPhone 7", > "os": "11.4.1", >- "dimensions": [ >- { >- "os": "Mac", >- 
"pool": "Chrome" >- } >- ] > } > ] > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/internal.client.webrtc/iOS64_Release.json b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/internal.client.webrtc/iOS64_Release.json >index ac84db10bf5c6228f37c0f992cd80826f777c544..d75cde07ff681ba69d0291e4cb0968262c0b6986 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/internal.client.webrtc/iOS64_Release.json >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/internal.client.webrtc/iOS64_Release.json >@@ -22,6 +22,7 @@ > "tests": [ > { > "include": "real_device_tests.json", >+ "device type": "iPhone 6s", > "os": "11.4.1", > "dimensions": [ > { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/internal.tryserver.webrtc/ios_arm64_perf.json b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/internal.tryserver.webrtc/ios_arm64_perf.json >index 0230c74a92ad07d158ef048423a7695378e6d237..0c30581da1df01076d88bf838b507c45db3e2c7f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/internal.tryserver.webrtc/ios_arm64_perf.json >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/internal.tryserver.webrtc/ios_arm64_perf.json >@@ -23,12 +23,6 @@ > "include": "perf_trybot_tests.json", > "device type": "iPhone 7", > "os": "11.0.3", >- "dimensions": [ >- { >- "os": "Mac", >- "pool": "Chrome" >- } >- ] > } > ] > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tests/common_tests.json b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tests/common_tests.json >index 5eeb408387fec473712d4c679eaba83b1e52f06c..7459503d7f06832539baf473f40c8fac43922d59 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tests/common_tests.json >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tests/common_tests.json >@@ -27,9 +27,6 @@ > { > "app": "modules_unittests" > }, >- { >- "app": "ortc_unittests" 
>- }, > { > "app": "rtc_media_unittests" > }, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tests/real_device_tests.json b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tests/real_device_tests.json >index 8417470ddb882e29852c9048ae0c79b1bbed9e3b..dcee51688d6cf9fd55faa0ae471ef2bd4f514363 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tests/real_device_tests.json >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tests/real_device_tests.json >@@ -12,9 +12,6 @@ > { > "app": "modules_unittests" > }, >- { >- "app": "ortc_unittests" >- }, > { > "app": "rtc_pc_unittests" > }, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios32_sim_ios9_dbg.json b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios32_sim_ios9_dbg.json >index 29a113d41388c7feb6eb9eb358c34075fc025ff1..1eacc9295fb1a72bfcce36eae273929dcc825dfd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios32_sim_ios9_dbg.json >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios32_sim_ios9_dbg.json >@@ -1,6 +1,7 @@ > { > "comments": [ >- "Tests for 32-bit iOS simulators." >+ "Tests for 32-bit iOS simulators.", >+ "NOTE: Update cache entry in cr-buildbucket.cfg when changing xcode version." 
> ], > "xcode build version": "10l232m", > "gn_args": [ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios64_sim_ios10_dbg.json b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios64_sim_ios10_dbg.json >index 6484ca1050de0eaa4f19d81005bf05d0b67f6a0a..8cecf4a41ef599e55a8ea63cb66cfd72165f793a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios64_sim_ios10_dbg.json >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios64_sim_ios10_dbg.json >@@ -1,6 +1,7 @@ > { > "comments": [ >- "Tests for 64-bit iOS simulators." >+ "Tests for 64-bit iOS simulators.", >+ "NOTE: Update cache entry in cr-buildbucket.cfg when changing xcode version." > ], > "xcode build version": "10l232m", > "gn_args": [ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios64_sim_ios11_dbg.json b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios64_sim_ios11_dbg.json >index b54f212acbd68adc1ed426eb41e9f6ac490f9f97..3a5f7c0cf6b0f7dd04132d5b4ffdb94229e66259 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios64_sim_ios11_dbg.json >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios64_sim_ios11_dbg.json >@@ -1,6 +1,7 @@ > { > "comments": [ >- "Tests for 64-bit iOS simulators." >+ "Tests for 64-bit iOS simulators.", >+ "NOTE: Update cache entry in cr-buildbucket.cfg when changing xcode version." 
> ], > "xcode build version": "10l232m", > "gn_args": [ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios64_sim_ios9_dbg.json b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios64_sim_ios9_dbg.json >index fdfa7b382d5ae2a09f29f29e531e03ff1194c31f..6c9848ee6dd968715672810cab0d5c1ec5b73837 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios64_sim_ios9_dbg.json >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios64_sim_ios9_dbg.json >@@ -1,6 +1,7 @@ > { > "comments": [ >- "Tests for 64-bit iOS simulators." >+ "Tests for 64-bit iOS simulators.", >+ "NOTE: Update cache entry in cr-buildbucket.cfg when changing xcode version." > ], > "xcode build version": "10l232m", > "gn_args": [ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios_arm64_dbg.json b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios_arm64_dbg.json >index 7594cd2a36d66e6c8dceb6099227ca45fbe80d02..43add6710e6c5729a322dc92bbb185597707f655 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios_arm64_dbg.json >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios_arm64_dbg.json >@@ -1,6 +1,7 @@ > { > "comments": [ >- "Builder for 64-bit devices." >+ "Builder for 64-bit devices.", >+ "NOTE: Update cache entry in cr-buildbucket.cfg when changing xcode version." 
> ], > "xcode build version": "10l232m", > "gn_args": [ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios_arm64_rel.json b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios_arm64_rel.json >index 08ae7e198332124b31e4536467e7f826739cf080..95a57e52262e52ae6feee95a35cfbad76de5e451 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios_arm64_rel.json >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios_arm64_rel.json >@@ -1,6 +1,7 @@ > { > "comments": [ >- "Builder for 64-bit devices." >+ "Builder for 64-bit devices.", >+ "NOTE: Update cache entry in cr-buildbucket.cfg when changing xcode version." > ], > "xcode build version": "10l232m", > "gn_args": [ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios_dbg.json b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios_dbg.json >index e9d06c2a76140048809ebb235a467f02d7024f78..8be9e95c0f0b5aa1cd32b8c96e66aad5252ccd8d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios_dbg.json >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios_dbg.json >@@ -1,6 +1,7 @@ > { > "comments": [ >- "Builder for 32-bit devices." >+ "Builder for 32-bit devices.", >+ "NOTE: Update cache entry in cr-buildbucket.cfg when changing xcode version." 
> ], > "xcode build version": "10l232m", > "gn_args": [ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios_rel.json b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios_rel.json >index 4521ef515372155759376a73abecc51f520b4205..f644940c7c400d8d18fefc44aa8453573afc0acf 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios_rel.json >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/ios/tryserver.webrtc/ios_rel.json >@@ -1,6 +1,7 @@ > { > "comments": [ >- "Builder for 32-bit devices." >+ "Builder for 32-bit devices.", >+ "NOTE: Update cache entry in cr-buildbucket.cfg when changing xcode version." > ], > "xcode build version": "10l232m", > "gn_args": [ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/libs/generate_licenses.py b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/libs/generate_licenses.py >index 892fb46c05178f4e5b49576ab4de4e37762e364a..d1a76a64eec1a35467f1ede98eef7f9627339dc9 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/libs/generate_licenses.py >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/libs/generate_licenses.py >@@ -36,7 +36,6 @@ LIB_TO_LICENSES_DICT = { > 'bazel': ['third_party/bazel/LICENSE'], > 'boringssl': ['third_party/boringssl/src/LICENSE'], > 'errorprone': ['third_party/errorprone/LICENSE'], >- 'expat': ['third_party/expat/files/COPYING'], > 'fiat': ['third_party/boringssl/src/third_party/fiat/LICENSE'], > 'guava': ['third_party/guava/LICENSE'], > 'ijar': ['third_party/ijar/LICENSE'], >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/mb/gn_isolate_map.pyl b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/mb/gn_isolate_map.pyl >index fecc56c38e1d1ea694889b00b6615407eceb0d13..d72bbdc0090af848bf89c26c9ee3b32d17af582b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/mb/gn_isolate_map.pyl >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/mb/gn_isolate_map.pyl >@@ -82,10 +82,6 @@ > "label": "//modules:modules_unittests", > "type": "windowed_test_launcher", > }, >- "ortc_unittests": { >- "label": "//ortc:ortc_unittests", >- "type": "console_test_launcher", >- }, > "peerconnection_unittests": { > "label": "//pc:peerconnection_unittests", > "type": "console_test_launcher", >@@ -106,6 +102,10 @@ > "label": "//:rtc_unittests", > "type": "console_test_launcher", > }, >+ "slow_tests": { >+ "label": "//:slow_tests", >+ "type": "console_test_launcher", >+ }, > "system_wrappers_unittests": { > "label": "//system_wrappers:system_wrappers_unittests", > "type": "console_test_launcher", >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/mb/mb.py b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/mb/mb.py >index 7f196bb733846c8cf0031af11782f25939699eb9..485fceb858c98aa39d2e9ffe23c6614f7e0dd0b2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/mb/mb.py >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/mb/mb.py >@@ -833,6 +833,7 @@ class MetaBuildWrapper(object): > '../../testing/test_env.py', > ] > >+ must_retry = False > if test_type == 'script': > cmdline = ['../../' + self.ToSrcRelPath(isolate_map[target]['script'])] > elif is_android: >@@ -874,12 +875,21 @@ class MetaBuildWrapper(object): > # so it can exit cleanly and report results, instead of being > # interrupted by swarming and not reporting anything. > '--timeout=%s' % timeout, >- '--retry_failed=3', > ] > if test_type == 'non_parallel_console_test_launcher': > # Still use the gtest-parallel-wrapper.py script since we need it to > # run tests on swarming, but don't execute tests in parallel. 
> cmdline.append('--workers=1') >+ must_retry = True >+ >+ asan = 'is_asan=true' in vals['gn_args'] >+ lsan = 'is_lsan=true' in vals['gn_args'] >+ msan = 'is_msan=true' in vals['gn_args'] >+ tsan = 'is_tsan=true' in vals['gn_args'] >+ sanitizer = asan or lsan or msan or tsan >+ if must_retry and not sanitizer: >+ # Retry would hide most sanitizers detections. >+ cmdline.append('--retry_failed=3') > > executable_prefix = '.\\' if self.platform == 'win32' else './' > executable_suffix = '.exe' if self.platform == 'win32' else '' >@@ -887,11 +897,6 @@ class MetaBuildWrapper(object): > > cmdline.append(executable) > >- asan = 'is_asan=true' in vals['gn_args'] >- lsan = 'is_lsan=true' in vals['gn_args'] >- msan = 'is_msan=true' in vals['gn_args'] >- tsan = 'is_tsan=true' in vals['gn_args'] >- > cmdline.extend([ > '--asan=%d' % asan, > '--lsan=%d' % lsan, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/mb/mb_config.pyl b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/mb/mb_config.pyl >index 56c85837fe6a5a10ab12139193fc7c40c47de838..549ee4cb68ecbe1a8e9d6a40d9308637d941a0a4 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/mb/mb_config.pyl >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/mb/mb_config.pyl >@@ -24,7 +24,6 @@ > 'iOS32 Release': 'ios', > 'iOS64 Debug': 'ios', > 'iOS64 Release': 'ios', >- 'iOS32 Sim Debug (iOS 9.0)': 'ios', > 'iOS64 Sim Debug (iOS 9.0)': 'ios', > 'iOS64 Sim Debug (iOS 10.0)': 'ios', > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/mb/mb_unittest.py b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/mb/mb_unittest.py >index 09093c06bae98511f5a5f40b82db01c537d71c4a..0e7173abb4ce66a6755686e3512d66a01a746eb2 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/mb/mb_unittest.py >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/mb/mb_unittest.py >@@ -382,8 +382,8 @@ class UnitTest(unittest.TestCase): > 
'--output_dir=${ISOLATED_OUTDIR}/test_logs', > '--gtest_color=no', > '--timeout=500', >- '--retry_failed=3', > '--workers=1', >+ '--retry_failed=3', > './base_unittests', > '--asan=0', > '--lsan=0', >@@ -502,8 +502,8 @@ class UnitTest(unittest.TestCase): > '--output_dir=${ISOLATED_OUTDIR}/test_logs', > '--gtest_color=no', > '--timeout=900', >- '--retry_failed=3', > '--workers=1', >+ '--retry_failed=3', > './base_unittests', > '--asan=0', > '--lsan=0', >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/video_quality_toolchain/linux/zxing.sha1 b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/video_quality_toolchain/linux/zxing.sha1 >deleted file mode 100644 >index f69a6ac4371f3ce5cc710cffe63a43961a55f5bb..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/video_quality_toolchain/linux/zxing.sha1 >+++ /dev/null >@@ -1 +0,0 @@ >-3e2a25eaa2e207a1e569c6585e50addb20bca89a >\ No newline at end of file >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/video_quality_toolchain/mac/zxing.sha1 b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/video_quality_toolchain/mac/zxing.sha1 >deleted file mode 100644 >index 1b496e7ce6a904e61680c20cb12bc4917277837b..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/video_quality_toolchain/mac/zxing.sha1 >+++ /dev/null >@@ -1 +0,0 @@ >-6778e3ac489f4fc18a1ed1252bbd6eca54c9960e >\ No newline at end of file >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/video_quality_toolchain/win/zxing.exe.sha1 b/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/video_quality_toolchain/win/zxing.exe.sha1 >deleted file mode 100644 >index 0e5290fb346c53af866f39c4084d717e9d27ffd5..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/tools_webrtc/video_quality_toolchain/win/zxing.exe.sha1 >+++ /dev/null >@@ -1 +0,0 @@ 
>-86ca3148778079665ad5b9459311d343a1865943
>\ No newline at end of file
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/BUILD.gn b/Source/ThirdParty/libwebrtc/Source/webrtc/video/BUILD.gn
>index 3273fda25d552868eb95805f2e54b27be079ca5a..b53d730ae07e06f8004e35536cae722a1a5a6ec6 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/BUILD.gn
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/BUILD.gn
>@@ -52,7 +52,7 @@ rtc_static_library("video") {
> }
>
> deps = [
>- "..:webrtc_common",
>+ ":frame_dumping_decoder",
> "../api:fec_controller_api",
> "../api:libjingle_peerconnection_api",
> "../api:transport_api",
>@@ -133,6 +133,25 @@ rtc_source_set("video_stream_decoder_impl") {
> ]
> }
>
>+rtc_source_set("frame_dumping_decoder") {
>+ visibility = [ "*" ]
>+
>+ sources = [
>+ "frame_dumping_decoder.cc",
>+ "frame_dumping_decoder.h",
>+ ]
>+
>+ deps = [
>+ "../api/video:encoded_frame",
>+ "../api/video_codecs:video_codecs_api",
>+ "../modules/video_coding:video_codec_interface",
>+ "../modules/video_coding:video_coding",
>+ "../modules/video_coding:video_coding_utility",
>+ "../rtc_base:rtc_base_approved",
>+ "//third_party/abseil-cpp/absl/memory",
>+ ]
>+}
>+
> rtc_source_set("video_stream_encoder_impl") {
> visibility = [ "*" ]
>
>@@ -152,7 +171,9 @@ rtc_source_set("video_stream_encoder_impl") {
> }
>
> deps = [
>+ "../api/video:encoded_image",
> "../api/video:video_bitrate_allocator",
>+ "../api/video:video_bitrate_allocator_factory",
> "../api/video:video_frame",
> "../api/video:video_frame_i420",
> "../api/video:video_stream_encoder",
>@@ -203,15 +224,19 @@ if (rtc_include_tests) {
> "video_quality_test.h",
> ]
> deps = [
>+ ":frame_dumping_decoder",
> "../api:fec_controller_api",
> "../api:test_dependency_factory",
> "../api:video_quality_test_fixture_api",
>+ "../api/video:builtin_video_bitrate_allocator_factory",
>+ "../api/video:video_bitrate_allocator_factory",
> "../call:fake_network",
> "../call:simulated_network",
> "../logging:rtc_event_log_api",
> "../logging:rtc_event_log_impl_output",
> "../media:rtc_audio_video",
> "../media:rtc_internal_video_codecs",
>+ "../media:rtc_vp8_encoder_simulcast_proxy",
> "../modules/audio_device:audio_device_api",
> "../modules/audio_device:audio_device_module_from_input_and_output",
> "../modules/audio_device:windows_core_audio_utility",
>@@ -350,7 +375,7 @@ if (rtc_include_tests) {
> "replay.cc",
> ]
> deps = [
>- "..:webrtc_common",
>+ "../api/test/video:function_video_factory",
> "../api/video_codecs:video_codecs_api",
> "../call:call_interfaces",
> "../common_video",
>@@ -384,6 +409,7 @@ if (rtc_include_tests) {
> defines = []
> sources = [
> "call_stats_unittest.cc",
>+ "cpu_scaling_tests.cc",
> "encoder_rtcp_feedback_unittest.cc",
> "end_to_end_tests/bandwidth_tests.cc",
> "end_to_end_tests/call_operation_tests.cc",
>@@ -391,6 +417,7 @@
> "end_to_end_tests/config_tests.cc",
> "end_to_end_tests/extended_reports_tests.cc",
> "end_to_end_tests/fec_tests.cc",
>+ "end_to_end_tests/frame_encryption_tests.cc",
> "end_to_end_tests/histogram_tests.cc",
> "end_to_end_tests/multi_codec_receive_tests.cc",
> "end_to_end_tests/multi_stream_tester.cc",
>@@ -398,7 +425,6 @@
> "end_to_end_tests/multi_stream_tests.cc",
> "end_to_end_tests/network_state_tests.cc",
> "end_to_end_tests/probing_tests.cc",
>- "end_to_end_tests/receive_time_tests.cc",
> "end_to_end_tests/retransmission_tests.cc",
> "end_to_end_tests/rtp_rtcp_tests.cc",
> "end_to_end_tests/ssrc_tests.cc",
>@@ -424,9 +450,15 @@
> ":video",
> ":video_mocks",
> ":video_stream_encoder_impl",
>+ "../api:fake_frame_decryptor",
>+ "../api:fake_frame_encryptor",
> "../api:simulated_network_api",
>+ "../api/test/video:function_video_factory",
>+ "../api/video:builtin_video_bitrate_allocator_factory",
>+ "../api/video:encoded_image",
> "../api/video:video_frame",
> "../api/video:video_frame_i420",
>+ "../api/video_codecs:create_vp8_temporal_layers",
> "../api/video_codecs:video_codecs_api",
> "../call:call_interfaces",
> "../call:fake_network",
>@@ -443,6 +475,7 @@
> "../media:rtc_media",
> "../media:rtc_media_base",
> "../media:rtc_media_tests_utils",
>+ "../media:rtc_simulcast_encoder_adapter",
> "../modules:module_api",
> "../modules/pacing",
> "../modules/rtp_rtcp",
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/call_stats_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/call_stats_unittest.cc
>index 71faa692a578a96ebe32e9a5d449fdeb04252ab1..dc6c69af9e98c71b5638b2dd34bb235245bbe7a8 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/call_stats_unittest.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/call_stats_unittest.cc
>@@ -55,7 +55,7 @@ class CallStatsTest : public ::testing::Test {
> };
>
> TEST_F(CallStatsTest, AddAndTriggerCallback) {
>- rtc::Event event(false, false);
>+ rtc::Event event;
>
> static constexpr const int64_t kRtt = 25;
>
>@@ -77,7 +77,7 @@ TEST_F(CallStatsTest, AddAndTriggerCallback) {
> }
>
> TEST_F(CallStatsTest, ProcessTime) {
>- rtc::Event event(false, false);
>+ rtc::Event event;
>
> static constexpr const int64_t kRtt = 100;
> static constexpr const int64_t kRtt2 = 80;
>@@ -128,8 +128,8 @@ TEST_F(CallStatsTest, MultipleObservers) {
> static constexpr const int64_t kRtt = 100;
>
> // Verify both observers are updated.
>- rtc::Event ev1(false, false);
>- rtc::Event ev2(false, false);
>+ rtc::Event ev1;
>+ rtc::Event ev2;
> EXPECT_CALL(stats_observer_1, OnRttUpdate(kRtt, kRtt))
> .Times(AnyNumber())
> .WillOnce(InvokeWithoutArgs([&ev1] { ev1.Set(); }))
>@@ -167,7 +167,7 @@ TEST_F(CallStatsTest, MultipleObservers) {
>
> // Flush the queue on the process thread to make sure we return after
> // Process() has been called.
>- rtc::Event event(false, false);
>+ rtc::Event event;
> process_thread_->PostTask(rtc::NewClosure([&event]() { event.Set(); }));
> event.Wait(rtc::Event::kForever);
> }
>@@ -183,7 +183,7 @@ TEST_F(CallStatsTest, ChangeRtt) {
> call_stats_.RegisterStatsObserver(&stats_observer);
> RtcpRttStats* rtcp_rtt_stats = &call_stats_;
>
>- rtc::Event event(false, false);
>+ rtc::Event event;
>
> static constexpr const int64_t kFirstRtt = 100;
> static constexpr const int64_t kLowRtt = kFirstRtt - 20;
>@@ -240,7 +240,7 @@ TEST_F(CallStatsTest, ChangeRtt) {
> }
>
> TEST_F(CallStatsTest, LastProcessedRtt) {
>- rtc::Event event(false, false);
>+ rtc::Event event;
> MockStatsObserver stats_observer;
> call_stats_.RegisterStatsObserver(&stats_observer);
> RtcpRttStats* rtcp_rtt_stats = &call_stats_;
>@@ -289,7 +289,7 @@ TEST_F(CallStatsTest, LastProcessedRtt) {
>
> TEST_F(CallStatsTest, ProducesHistogramMetrics) {
> metrics::Reset();
>- rtc::Event event(false, false);
>+ rtc::Event event;
> static constexpr const int64_t kRtt = 123;
> RtcpRttStats* rtcp_rtt_stats = &call_stats_;
> MockStatsObserver stats_observer;
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/cpu_scaling_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/cpu_scaling_tests.cc
>new file mode 100644
>index 0000000000000000000000000000000000000000..608f05e05247db9f236c3f2d08108e39b1d37bdd
>--- /dev/null
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/cpu_scaling_tests.cc
>@@ -0,0 +1,131 @@
>+/*
>+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved.
>+ *
>+ * Use of this source code is governed by a BSD-style license
>+ * that can be found in the LICENSE file in the root of the source
>+ * tree. An additional intellectual property rights grant can be found
>+ * in the file PATENTS. All contributing project authors may
>+ * be found in the AUTHORS file in the root of the source tree.
>+ */
>+
>+#include "test/call_test.h"
>+#include "test/field_trial.h"
>+#include "test/frame_generator_capturer.h"
>+
>+namespace webrtc {
>+namespace {
>+constexpr int kWidth = 1280;
>+constexpr int kHeight = 720;
>+constexpr int kFps = 28;
>+} // namespace
>+
>+// Minimal normal usage at start, then 60s overuse.
>+class CpuOveruseTest : public test::CallTest {
>+ protected:
>+ CpuOveruseTest()
>+ : field_trials_("WebRTC-ForceSimulatedOveruseIntervalMs/1-60000-60000/") {
>+ }
>+
>+ void RunTestAndCheckForAdaptation(
>+ const DegradationPreference& degradation_preference,
>+ bool expect_adaptation);
>+
>+ test::ScopedFieldTrials field_trials_;
>+};
>+
>+void CpuOveruseTest::RunTestAndCheckForAdaptation(
>+ const DegradationPreference& degradation_preference,
>+ bool expect_adaptation) {
>+ class OveruseObserver
>+ : public test::SendTest,
>+ public test::FrameGeneratorCapturer::SinkWantsObserver {
>+ public:
>+ OveruseObserver(const DegradationPreference& degradation_preference,
>+ bool expect_adaptation)
>+ : SendTest(expect_adaptation ? kLongTimeoutMs : kDefaultTimeoutMs),
>+ degradation_preference_(degradation_preference),
>+ expect_adaptation_(expect_adaptation) {}
>+
>+ private:
>+ void OnFrameGeneratorCapturerCreated(
>+ test::FrameGeneratorCapturer* frame_generator_capturer) override {
>+ frame_generator_capturer->SetSinkWantsObserver(this);
>+ // Set initial resolution.
>+ frame_generator_capturer->ChangeResolution(kWidth, kHeight);
>+ }
>+
>+ // Called when FrameGeneratorCapturer::AddOrUpdateSink is called.
>+ void OnSinkWantsChanged(rtc::VideoSinkInterface<VideoFrame>* sink,
>+ const rtc::VideoSinkWants& wants) override {
>+ if (wants.max_pixel_count == std::numeric_limits<int>::max() &&
>+ wants.max_framerate_fps == kFps) {
>+ // Max configured framerate is initially set.
>+ return;
>+ }
>+ switch (degradation_preference_) {
>+ case DegradationPreference::MAINTAIN_FRAMERATE:
>+ EXPECT_LT(wants.max_pixel_count, kWidth * kHeight);
>+ observation_complete_.Set();
>+ break;
>+ case DegradationPreference::MAINTAIN_RESOLUTION:
>+ EXPECT_LT(wants.max_framerate_fps, kFps);
>+ observation_complete_.Set();
>+ break;
>+ case DegradationPreference::BALANCED:
>+ if (wants.max_pixel_count == std::numeric_limits<int>::max() &&
>+ wants.max_framerate_fps == std::numeric_limits<int>::max()) {
>+ // |adapt_counters_| map in VideoStreamEncoder is reset when
>+ // balanced mode is set.
>+ break;
>+ }
>+ EXPECT_TRUE(wants.max_pixel_count < kWidth * kHeight ||
>+ wants.max_framerate_fps < kFps);
>+ observation_complete_.Set();
>+ break;
>+ default:
>+ RTC_NOTREACHED();
>+ }
>+ }
>+
>+ void ModifyVideoConfigs(
>+ VideoSendStream::Config* send_config,
>+ std::vector<VideoReceiveStream::Config>* receive_configs,
>+ VideoEncoderConfig* encoder_config) override {
>+ EXPECT_FALSE(encoder_config->simulcast_layers.empty());
>+ encoder_config->simulcast_layers[0].max_framerate = kFps;
>+ }
>+
>+ void ModifyVideoDegradationPreference(
>+ DegradationPreference* degradation_preference) override {
>+ *degradation_preference = degradation_preference_;
>+ }
>+
>+ void PerformTest() override {
>+ EXPECT_EQ(expect_adaptation_, Wait())
>+ << "Timed out while waiting for a scale down.";
>+ }
>+
>+ const DegradationPreference degradation_preference_;
>+ const bool expect_adaptation_;
>+ } test(degradation_preference, expect_adaptation);
>+
>+ RunBaseTest(&test);
>+}
>+
>+TEST_F(CpuOveruseTest, AdaptsDownInResolutionOnOveruse) {
>+ RunTestAndCheckForAdaptation(DegradationPreference::MAINTAIN_FRAMERATE, true);
>+}
>+
>+TEST_F(CpuOveruseTest, AdaptsDownInFpsOnOveruse) {
>+ RunTestAndCheckForAdaptation(DegradationPreference::MAINTAIN_RESOLUTION,
>+ true);
>+}
>+
>+TEST_F(CpuOveruseTest, AdaptsDownInResolutionOrFpsOnOveruse) {
>+ RunTestAndCheckForAdaptation(DegradationPreference::BALANCED, true);
>+}
>+
>+TEST_F(CpuOveruseTest, NoAdaptDownOnOveruse) {
>+ RunTestAndCheckForAdaptation(DegradationPreference::DISABLED, false);
>+}
>+} // namespace webrtc
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/bandwidth_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/bandwidth_tests.cc
>index ab5e5687b997babebb420c37f6f5166f04b397c3..a35f08c3f5f6be3e6b2621418a1929059a8b2a27 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/bandwidth_tests.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/bandwidth_tests.cc
>@@ -9,6 +9,7 @@
> */
>
> #include "api/test/simulated_network.h"
>+#include "api/video/builtin_video_bitrate_allocator_factory.h"
> #include "call/fake_network_pipe.h"
> #include "call/simulated_network.h"
> #include "modules/rtp_rtcp/include/rtp_rtcp.h"
>@@ -145,7 +146,6 @@ TEST_F(BandwidthEndToEndTest, RembWithSendSideBwe) {
> sender_ssrc_(0),
> remb_bitrate_bps_(1000000),
> receive_transport_(nullptr),
>- stop_event_(false, false),
> poller_thread_(&BitrateStatsPollingThread,
> this,
> "BitrateStatsPollingThread"),
>@@ -160,15 +160,15 @@
> task_queue, nullptr, this, test::PacketTransport::kReceiver,
> payload_type_map_,
> absl::make_unique<FakeNetworkPipe>(
>- Clock::GetRealTimeClock(),
>- absl::make_unique<SimulatedNetwork>(
>- DefaultNetworkSimulationConfig()));
>+ Clock::GetRealTimeClock(), absl::make_unique<SimulatedNetwork>(
>+ BuiltInNetworkBehaviorConfig())));
> return receive_transport_;
> }
>
>- void ModifySenderCallConfig(Call::Config* config) override {
>+ void ModifySenderBitrateConfig(
>+ BitrateConstraints* bitrate_config) override {
> // Set a high start bitrate to reduce the test completion time.
>- config->bitrate_config.start_bitrate_bps = remb_bitrate_bps_;
>+ bitrate_config->start_bitrate_bps = remb_bitrate_bps_;
> }
>
> void ModifyVideoConfigs(
>@@ -274,6 +274,8 @@ TEST_F(BandwidthEndToEndTest, ReportsSetEncoderRates) {
> task_queue_(task_queue),
> send_stream_(nullptr),
> encoder_factory_(this),
>+ bitrate_allocator_factory_(
>+ CreateBuiltinVideoBitrateAllocatorFactory()),
> bitrate_kbps_(0) {}
>
> void OnVideoStreamsCreated(
>@@ -287,6 +289,8 @@
> std::vector<VideoReceiveStream::Config>* receive_configs,
> VideoEncoderConfig* encoder_config) override {
> send_config->encoder_settings.encoder_factory = &encoder_factory_;
>+ send_config->encoder_settings.bitrate_allocator_factory =
>+ bitrate_allocator_factory_.get();
> RTC_DCHECK_EQ(1, encoder_config->number_of_streams);
> }
>
>@@ -345,6 +349,7 @@ TEST_F(BandwidthEndToEndTest, ReportsSetEncoderRates) {
> rtc::CriticalSection crit_;
> VideoSendStream* send_stream_;
> test::VideoEncoderProxyFactory encoder_factory_;
>+ std::unique_ptr<VideoBitrateAllocatorFactory> bitrate_allocator_factory_;
> uint32_t bitrate_kbps_ RTC_GUARDED_BY(crit_);
> } test(&task_queue_);
>
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/call_operation_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/call_operation_tests.cc
>index 2acd1b9fe812918701ad6f6f6c80f6d57132c51f..ec3ca84bfb547c10d6439c9bc3e317c36b060f6d 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/call_operation_tests.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/call_operation_tests.cc
>@@ -92,8 +92,6 @@ TEST_P(CallOperationEndToEndTest, RendersSingleDelayedFrame) {
>
> class Renderer : public rtc::VideoSinkInterface<VideoFrame> {
> public:
>- Renderer() : event_(false, false) {}
>-
> void OnFrame(const VideoFrame& video_frame) override {
> SleepMs(kRenderDelayMs);
> event_.Set();
>@@ -114,15 +112,15 @@ TEST_P(CallOperationEndToEndTest, RendersSingleDelayedFrame) {
>
> sender_transport = absl::make_unique<test::DirectTransport>(
> &task_queue_,
>- absl::make_unique<FakeNetworkPipe>(
>- Clock::GetRealTimeClock(), absl::make_unique<SimulatedNetwork>(
>- DefaultNetworkSimulationConfig())),
>+ absl::make_unique<FakeNetworkPipe>(Clock::GetRealTimeClock(),
>+ absl::make_unique<SimulatedNetwork>(
>+ BuiltInNetworkBehaviorConfig())),
> sender_call_.get(), payload_type_map_);
> receiver_transport = absl::make_unique<test::DirectTransport>(
> &task_queue_,
>- absl::make_unique<FakeNetworkPipe>(
>- Clock::GetRealTimeClock(), absl::make_unique<SimulatedNetwork>(
>- DefaultNetworkSimulationConfig())),
>+ absl::make_unique<FakeNetworkPipe>(Clock::GetRealTimeClock(),
>+ absl::make_unique<SimulatedNetwork>(
>+ BuiltInNetworkBehaviorConfig())),
> receiver_call_.get(), payload_type_map_);
> sender_transport->SetReceiver(receiver_call_->Receiver());
> receiver_transport->SetReceiver(sender_call_->Receiver());
>@@ -161,8 +159,6 @@ TEST_P(CallOperationEndToEndTest, RendersSingleDelayedFrame) {
> TEST_P(CallOperationEndToEndTest, TransmitsFirstFrame) {
> class Renderer : public rtc::VideoSinkInterface<VideoFrame> {
> public:
>- Renderer() : event_(false, false) {}
>-
> void OnFrame(const VideoFrame& video_frame) override { event_.Set(); }
>
> bool Wait() { return event_.Wait(kDefaultTimeoutMs); }
>@@ -182,15 +178,15 @@ TEST_P(CallOperationEndToEndTest, TransmitsFirstFrame) {
>
> sender_transport = absl::make_unique<test::DirectTransport>(
> &task_queue_,
>- absl::make_unique<FakeNetworkPipe>(
>- Clock::GetRealTimeClock(), absl::make_unique<SimulatedNetwork>(
>- DefaultNetworkSimulationConfig())),
>+ absl::make_unique<FakeNetworkPipe>(Clock::GetRealTimeClock(),
>+ absl::make_unique<SimulatedNetwork>(
>+ BuiltInNetworkBehaviorConfig())),
> sender_call_.get(), payload_type_map_);
> receiver_transport = absl::make_unique<test::DirectTransport>(
> &task_queue_,
>- absl::make_unique<FakeNetworkPipe>(
>- Clock::GetRealTimeClock(), absl::make_unique<SimulatedNetwork>(
>- DefaultNetworkSimulationConfig())),
>+ absl::make_unique<FakeNetworkPipe>(Clock::GetRealTimeClock(),
>+ absl::make_unique<SimulatedNetwork>(
>+ BuiltInNetworkBehaviorConfig())),
> receiver_call_.get(), payload_type_map_);
> sender_transport->SetReceiver(receiver_call_->Receiver());
> receiver_transport->SetReceiver(sender_call_->Receiver());
>@@ -221,79 +217,4 @@ TEST_P(CallOperationEndToEndTest, TransmitsFirstFrame) {
> });
> }
>
>-TEST_P(CallOperationEndToEndTest, ObserversEncodedFrames) {
>- class EncodedFrameTestObserver : public EncodedFrameObserver {
>- public:
>- EncodedFrameTestObserver()
>- : length_(0), frame_type_(kEmptyFrame), called_(false, false) {}
>- virtual ~EncodedFrameTestObserver() {}
>-
>- virtual void EncodedFrameCallback(const EncodedFrame& encoded_frame) {
>- frame_type_ = encoded_frame.frame_type_;
>- length_ = encoded_frame.length_;
>- buffer_.reset(new uint8_t[length_]);
>- memcpy(buffer_.get(), encoded_frame.data_, length_);
>- called_.Set();
>- }
>-
>- bool Wait() { return called_.Wait(kDefaultTimeoutMs); }
>-
>- private:
>- std::unique_ptr<uint8_t[]> buffer_;
>- size_t length_;
>- FrameType frame_type_;
>- rtc::Event called_;
>- };
>-
>- EncodedFrameTestObserver post_encode_observer;
>- test::FrameForwarder forwarder;
>- std::unique_ptr<test::FrameGenerator> frame_generator;
>-
>- std::unique_ptr<test::DirectTransport> sender_transport;
>- std::unique_ptr<test::DirectTransport> receiver_transport;
>-
>- task_queue_.SendTask([&]() {
>- CreateCalls();
>-
>- sender_transport = absl::make_unique<test::DirectTransport>(
>- &task_queue_,
>- absl::make_unique<FakeNetworkPipe>(
>- Clock::GetRealTimeClock(), absl::make_unique<SimulatedNetwork>(
>- DefaultNetworkSimulationConfig())),
>- sender_call_.get(), payload_type_map_);
>- receiver_transport = absl::make_unique<test::DirectTransport>(
>- &task_queue_,
>- absl::make_unique<FakeNetworkPipe>(
>- Clock::GetRealTimeClock(), absl::make_unique<SimulatedNetwork>(
>- DefaultNetworkSimulationConfig())),
>- receiver_call_.get(), payload_type_map_);
>- sender_transport->SetReceiver(receiver_call_->Receiver());
>- receiver_transport->SetReceiver(sender_call_->Receiver());
>-
>- CreateSendConfig(1, 0, 0, sender_transport.get());
>- CreateMatchingReceiveConfigs(receiver_transport.get());
>- GetVideoSendConfig()->post_encode_callback = &post_encode_observer;
>-
>- CreateVideoStreams();
>- Start();
>-
>- frame_generator = test::FrameGenerator::CreateSquareGenerator(
>- kDefaultWidth, kDefaultHeight, absl::nullopt, absl::nullopt);
>- GetVideoSendStream()->SetSource(&forwarder,
>- DegradationPreference::MAINTAIN_FRAMERATE);
>- forwarder.IncomingCapturedFrame(*frame_generator->NextFrame());
>- });
>-
>- EXPECT_TRUE(post_encode_observer.Wait())
>- << "Timed out while waiting for send-side encoded-frame callback.";
>-
>- task_queue_.SendTask([this, &sender_transport, &receiver_transport]() {
>- Stop();
>- DestroyStreams();
>- sender_transport.reset();
>- receiver_transport.reset();
>- DestroyCalls();
>- });
>-}
>-
> } // namespace webrtc
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/codec_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/codec_tests.cc
>index fad3ac7edef3d0dabc3d2a15f888669f8131fbb8..67ef611a7088823d7d2ddc43332e48a6a0dfe6ea 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/codec_tests.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/codec_tests.cc
>@@ -8,6 +8,7 @@
> * be found in the AUTHORS file in the root of the source tree.
> */
>
>+#include "api/test/video/function_video_encoder_factory.h"
> #include "media/engine/internaldecoderfactory.h"
> #include "media/engine/internalencoderfactory.h"
> #include "modules/video_coding/codecs/h264/include/h264.h"
>@@ -18,7 +19,6 @@
> #include "test/call_test.h"
> #include "test/encoder_settings.h"
> #include "test/field_trial.h"
>-#include "test/function_video_encoder_factory.h"
> #include "test/gtest.h"
>
> namespace webrtc {
>@@ -118,7 +118,7 @@ TEST_P(CodecEndToEndTest, SendsAndReceivesVP8Rotation90) {
> RunBaseTest(&test);
> }
>
>-#if !defined(RTC_DISABLE_VP9)
>+#if defined(RTC_ENABLE_VP9)
> TEST_P(CodecEndToEndTest, SendsAndReceivesVP9) {
> test::FunctionVideoEncoderFactory encoder_factory(
> []() { return VP9Encoder::Create(); });
>@@ -177,7 +177,7 @@ TEST_P(CodecEndToEndTest, SendsAndReceivesMultiplexVideoRotation90) {
> RunBaseTest(&test);
> }
>
>-#endif // !defined(RTC_DISABLE_VP9)
>+#endif // defined(RTC_ENABLE_VP9)
>
> #if defined(WEBRTC_USE_H264)
> class EndToEndTestH264 : public test::CallTest,
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/config_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/config_tests.cc
>index d32b1118b7c321747124e88792e66b7116299a48..1d33589c9d1ced7cf028207330e32d5732081f38 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/config_tests.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/config_tests.cc
>@@ -49,6 +49,12 @@ TEST_F(ConfigEndToEndTest, VerifyDefaultSendConfigParameters) {
> << "Enabling RTX requires rtpmap: rtx negotiation.";
> EXPECT_TRUE(default_send_config.rtp.extensions.empty())
> << "Enabling RTP extensions require negotiation.";
>+ EXPECT_EQ(nullptr, default_send_config.frame_encryptor)
>+ << "Enabling Frame Encryption requires a frame encryptor to be attached";
>+ EXPECT_FALSE(
>+ default_send_config.crypto_options.sframe.require_frame_encryption)
>+ << "Enabling Require Frame Encryption means an encryptor must be "
>+ "attached";
>
> VerifyEmptyNackConfig(default_send_config.rtp.nack);
> VerifyEmptyUlpfecConfig(default_send_config.rtp.ulpfec);
>@@ -70,12 +76,16 @@ TEST_F(ConfigEndToEndTest, VerifyDefaultVideoReceiveConfigParameters) {
> << "Enabling RTX requires rtpmap: rtx negotiation.";
> EXPECT_TRUE(default_receive_config.rtp.extensions.empty())
> << "Enabling RTP extensions require negotiation.";
>-
> VerifyEmptyNackConfig(default_receive_config.rtp.nack);
> EXPECT_EQ(-1, default_receive_config.rtp.ulpfec_payload_type)
> << "Enabling ULPFEC requires rtpmap: ulpfec negotiation.";
> EXPECT_EQ(-1, default_receive_config.rtp.red_payload_type)
> << "Enabling ULPFEC requires rtpmap: red negotiation.";
>+ EXPECT_EQ(nullptr, default_receive_config.frame_decryptor)
>+ << "Enabling Frame Decryption requires a frame decryptor to be attached";
>+ EXPECT_FALSE(
>+ default_receive_config.crypto_options.sframe.require_frame_encryption)
>+ << "Enabling Require Frame Encryption means a decryptor must be attached";
> }
>
> TEST_F(ConfigEndToEndTest, VerifyDefaultFlexfecReceiveConfigParameters) {
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/extended_reports_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/extended_reports_tests.cc
>index ba5ca693f2b5e8d5ba83fb6d3ea1c1edd61d1822..4de0441b3dfc444ffe3e022d7ebe9483d074ef2d 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/extended_reports_tests.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/extended_reports_tests.cc
>@@ -8,9 +8,11 @@
> * be found in the AUTHORS file in the root of the source tree.
> */
>
>+#include "api/video_codecs/video_encoder_config.h"
> #include "call/fake_network_pipe.h"
> #include "call/simulated_network.h"
> #include "test/call_test.h"
>+#include "test/field_trial.h"
> #include "test/gtest.h"
> #include "test/rtcp_packet_parser.h"
>
>@@ -21,12 +23,14 @@ class ExtendedReportsEndToEndTest : public test::CallTest {};
> class RtcpXrObserver : public test::EndToEndTest {
> public:
> RtcpXrObserver(bool enable_rrtr,
>- bool enable_target_bitrate,
>- bool enable_zero_target_bitrate)
>+ bool expect_target_bitrate,
>+ bool enable_zero_target_bitrate,
>+ VideoEncoderConfig::ContentType content_type)
> : EndToEndTest(test::CallTest::kDefaultTimeoutMs),
> enable_rrtr_(enable_rrtr),
>- enable_target_bitrate_(enable_target_bitrate),
>+ expect_target_bitrate_(expect_target_bitrate),
> enable_zero_target_bitrate_(enable_zero_target_bitrate),
>+ content_type_(content_type),
> sent_rtcp_sr_(0),
> sent_rtcp_rr_(0),
> sent_rtcp_rrtr_(0),
>@@ -95,7 +99,7 @@ class RtcpXrObserver : public test::EndToEndTest {
>
> if (sent_rtcp_sr_ > kNumRtcpReportPacketsToObserve &&
> sent_rtcp_rr_ > kNumRtcpReportPacketsToObserve &&
>- (sent_rtcp_target_bitrate_ || !enable_target_bitrate_) &&
>+ (sent_rtcp_target_bitrate_ || !expect_target_bitrate_) &&
> (sent_zero_rtcp_target_bitrate_ || !enable_zero_target_bitrate_)) {
> if (enable_rrtr_) {
> EXPECT_GT(sent_rtcp_rrtr_, 0);
>@@ -104,7 +108,7 @@
> EXPECT_EQ(sent_rtcp_rrtr_, 0);
> EXPECT_EQ(sent_rtcp_dlrr_, 0);
> }
>- EXPECT_EQ(enable_target_bitrate_, sent_rtcp_target_bitrate_);
>+ EXPECT_EQ(expect_target_bitrate_, sent_rtcp_target_bitrate_);
> EXPECT_EQ(enable_zero_target_bitrate_, sent_zero_rtcp_target_bitrate_);
> observation_complete_.Set();
> }
>@@ -144,10 +148,7 @@ class RtcpXrObserver : public test::EndToEndTest {
> (*receive_configs)[0].decoders[0].video_format =
> SdpVideoFormat(send_config->rtp.payload_name);
> }
>- if (enable_target_bitrate_) {
>- // TargetBitrate only signaled for screensharing.
>- encoder_config->content_type = VideoEncoderConfig::ContentType::kScreen;
>- }
>+ encoder_config->content_type = content_type_;
> (*receive_configs)[0].rtp.rtcp_mode = RtcpMode::kReducedSize;
> (*receive_configs)[0].rtp.rtcp_xr.receiver_reference_time_report =
> enable_rrtr_;
>@@ -162,50 +163,65 @@
>
> rtc::CriticalSection crit_;
> const bool enable_rrtr_;
>- const bool enable_target_bitrate_;
>+ const bool expect_target_bitrate_;
> const bool enable_zero_target_bitrate_;
>+ const VideoEncoderConfig::ContentType content_type_;
> int sent_rtcp_sr_;
> int sent_rtcp_rr_ RTC_GUARDED_BY(&crit_);
> int sent_rtcp_rrtr_ RTC_GUARDED_BY(&crit_);
> bool sent_rtcp_target_bitrate_ RTC_GUARDED_BY(&crit_);
> bool sent_zero_rtcp_target_bitrate_ RTC_GUARDED_BY(&crit_);
> int sent_rtcp_dlrr_;
>- DefaultNetworkSimulationConfig forward_transport_config_;
>+ BuiltInNetworkBehaviorConfig forward_transport_config_;
> SimulatedNetwork* send_simulated_network_;
> };
>
> TEST_F(ExtendedReportsEndToEndTest,
> TestExtendedReportsWithRrtrWithoutTargetBitrate) {
>- RtcpXrObserver test(/*enable_rrtr=*/true, /*enable_target_bitrate=*/false,
>- /*enable_zero_target_bitrate=*/false);
>+ RtcpXrObserver test(/*enable_rrtr=*/true, /*expect_target_bitrate=*/false,
>+ /*enable_zero_target_bitrate=*/false,
>+ VideoEncoderConfig::ContentType::kRealtimeVideo);
> RunBaseTest(&test);
> }
>
> TEST_F(ExtendedReportsEndToEndTest,
> TestExtendedReportsWithoutRrtrWithoutTargetBitrate) {
>- RtcpXrObserver test(/*enable_rrtr=*/false, /*enable_target_bitrate=*/false,
>- /*enable_zero_target_bitrate=*/false);
>+ RtcpXrObserver test(/*enable_rrtr=*/false, /*expect_target_bitrate=*/false,
>+ /*enable_zero_target_bitrate=*/false,
>+ VideoEncoderConfig::ContentType::kRealtimeVideo);
> RunBaseTest(&test);
> }
>
> TEST_F(ExtendedReportsEndToEndTest,
> TestExtendedReportsWithRrtrWithTargetBitrate) {
>- RtcpXrObserver test(/*enable_rrtr=*/true, /*enable_target_bitrate=*/true,
>- /*enable_zero_target_bitrate=*/false);
>+ RtcpXrObserver test(/*enable_rrtr=*/true, /*expect_target_bitrate=*/true,
>+ /*enable_zero_target_bitrate=*/false,
>+ VideoEncoderConfig::ContentType::kScreen);
> RunBaseTest(&test);
> }
>
> TEST_F(ExtendedReportsEndToEndTest,
> TestExtendedReportsWithoutRrtrWithTargetBitrate) {
>- RtcpXrObserver test(/*enable_rrtr=*/false, /*enable_target_bitrate=*/true,
>- /*enable_zero_target_bitrate=*/false);
>+ RtcpXrObserver test(/*enable_rrtr=*/false, /*expect_target_bitrate=*/true,
>+ /*enable_zero_target_bitrate=*/false,
>+ VideoEncoderConfig::ContentType::kScreen);
>+ RunBaseTest(&test);
>+}
>+
>+TEST_F(ExtendedReportsEndToEndTest,
>+ TestExtendedReportsWithoutRrtrWithTargetBitrateFromFieldTrial) {
>+ test::ScopedFieldTrials field_trials("WebRTC-Target-Bitrate-Rtcp/Enabled/");
>+ RtcpXrObserver test(/*enable_rrtr=*/false, /*expect_target_bitrate=*/true,
>+ /*enable_zero_target_bitrate=*/false,
>+ VideoEncoderConfig::ContentType::kRealtimeVideo);
> RunBaseTest(&test);
> }
>
> TEST_F(ExtendedReportsEndToEndTest,
> TestExtendedReportsCanSignalZeroTargetBitrate) {
>- RtcpXrObserver test(/*enable_rrtr=*/false, /*enable_target_bitrate=*/true,
>- /*enable_zero_target_bitrate=*/true);
>+ RtcpXrObserver test(/*enable_rrtr=*/false, /*expect_target_bitrate=*/true,
>+ /*enable_zero_target_bitrate=*/true,
>+ VideoEncoderConfig::ContentType::kScreen);
> RunBaseTest(&test);
> }
> } // namespace webrtc
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/fec_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/fec_tests.cc
>index 0300d12f8bcac6d2aa28bebffd125186894383f3..2c53ccc7cbec34099d78809fd57f66f441a693be 100644
>--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/fec_tests.cc
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/fec_tests.cc
>@@ -9,6 +9,7 @@
> */
>
> #include "api/test/simulated_network.h"
>+#include "api/test/video/function_video_encoder_factory.h"
> #include "call/fake_network_pipe.h"
> #include "call/simulated_network.h"
> #include "media/engine/internaldecoderfactory.h"
>@@ -16,7 +17,6 @@
> #include "modules/video_coding/codecs/vp8/include/vp8.h"
> #include "test/call_test.h"
> #include "test/field_trial.h"
>-#include "test/function_video_encoder_factory.h"
> #include "test/gtest.h"
> #include "test/rtcp_packet_parser.h"
>
>@@ -234,7 +234,7 @@ class FlexfecRenderObserver : public test::EndToEndTest,
> Call* sender_call) override {
> // At low RTT (< kLowRttNackMs) -> NACK only, no FEC.
> const int kNetworkDelayMs = 100;
>- DefaultNetworkSimulationConfig config;
>+ BuiltInNetworkBehaviorConfig config;
> config.queue_delay_ms = kNetworkDelayMs;
> return new test::PacketTransport(
> task_queue, sender_call, this, test::PacketTransport::kSender,
>@@ -421,7 +421,7 @@ TEST_F(FecEndToEndTest, ReceivedUlpfecPacketsNotNacked) {
> // At low RTT (< kLowRttNackMs) -> NACK only, no FEC.
> // Configure some network delay.
> const int kNetworkDelayMs = 50;
>- DefaultNetworkSimulationConfig config;
>+ BuiltInNetworkBehaviorConfig config;
> config.queue_delay_ms = kNetworkDelayMs;
> return new test::PacketTransport(
> task_queue, sender_call, this, test::PacketTransport::kSender,
>@@ -433,9 +433,10 @@
>
> // TODO(holmer): Investigate why we don't send FEC packets when the bitrate
> // is 10 kbps.
>- void ModifySenderCallConfig(Call::Config* config) override {
>+ void ModifySenderBitrateConfig(
>+ BitrateConstraints* bitrate_config) override {
> const int kMinBitrateBps = 30000;
>- config->bitrate_config.min_bitrate_bps = kMinBitrateBps;
>+ bitrate_config->min_bitrate_bps = kMinBitrateBps;
> }
>
> void ModifyVideoConfigs(
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/frame_encryption_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/frame_encryption_tests.cc
>new file mode 100644
>index 0000000000000000000000000000000000000000..8cd7b478493594012afac17cf3563e97ec4148f6
>--- /dev/null
>+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/frame_encryption_tests.cc
>@@ -0,0 +1,81 @@
>+/*
>+ * Copyright 2018 The WebRTC project authors. All Rights Reserved.
>+ *
>+ * Use of this source code is governed by a BSD-style license
>+ * that can be found in the LICENSE file in the root of the source
>+ * tree. An additional intellectual property rights grant can be found
>+ * in the file PATENTS. All contributing project authors may
>+ * be found in the AUTHORS file in the root of the source tree.
>+ */
>+
>+#include "api/test/fake_frame_decryptor.h"
>+#include "api/test/fake_frame_encryptor.h"
>+#include "media/engine/internaldecoderfactory.h"
>+#include "modules/video_coding/codecs/vp8/include/vp8.h"
>+#include "test/call_test.h"
>+#include "test/field_trial.h"
>+#include "test/gtest.h"
>+
>+namespace webrtc {
>+
>+class FrameEncryptionEndToEndTest : public test::CallTest {
>+ public:
>+ FrameEncryptionEndToEndTest() = default;
>+
>+ private:
>+ // GenericDescriptor is required for FrameEncryption to work.
>+ test::ScopedFieldTrials field_trials_{"WebRTC-GenericDescriptor/Enabled/"};
>+};
>+
>+// Validates that payloads cannot be sent without a frame encryptor and frame
>+// decryptor attached.
>+TEST_F(FrameEncryptionEndToEndTest, RequireFrameEncryptionEnforced) { >+ class DecryptedFrameObserver : public test::EndToEndTest, >+ public rtc::VideoSinkInterface<VideoFrame> { >+ public: >+ DecryptedFrameObserver() >+ : EndToEndTest(kDefaultTimeoutMs), >+ encoder_factory_([]() { return VP8Encoder::Create(); }) {} >+ >+ private: >+ void ModifyVideoConfigs( >+ VideoSendStream::Config* send_config, >+ std::vector<VideoReceiveStream::Config>* receive_configs, >+ VideoEncoderConfig* encoder_config) override { >+ // Use VP8 instead of FAKE. >+ send_config->encoder_settings.encoder_factory = &encoder_factory_; >+ send_config->rtp.payload_name = "VP8"; >+ send_config->rtp.payload_type = kVideoSendPayloadType; >+ send_config->frame_encryptor = new FakeFrameEncryptor(); >+ send_config->crypto_options.sframe.require_frame_encryption = true; >+ encoder_config->codec_type = kVideoCodecVP8; >+ VideoReceiveStream::Decoder decoder = >+ test::CreateMatchingDecoder(*send_config); >+ decoder.decoder_factory = &decoder_factory_; >+ for (auto& recv_config : *receive_configs) { >+ recv_config.decoders.clear(); >+ recv_config.decoders.push_back(decoder); >+ recv_config.renderer = this; >+ recv_config.frame_decryptor = new FakeFrameDecryptor(); >+ recv_config.crypto_options.sframe.require_frame_encryption = true; >+ } >+ } >+ >+ // Validate that rotation is preserved. 
>+ void OnFrame(const VideoFrame& video_frame) override { >+ observation_complete_.Set(); >+ } >+ >+ void PerformTest() override { >+ EXPECT_TRUE(Wait()) >+ << "Timed out waiting for decrypted frames to be rendered."; >+ } >+ >+ std::unique_ptr<VideoEncoder> encoder_; >+ test::FunctionVideoEncoderFactory encoder_factory_; >+ InternalDecoderFactory decoder_factory_; >+ } test; >+ >+ RunBaseTest(&test); >+} >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/histogram_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/histogram_tests.cc >index a4aa0c810a96295a9f00c8f3bec08d7498ff622e..312924c986b862c81aae600ca067a6a91e5e4fa0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/histogram_tests.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/histogram_tests.cc >@@ -9,10 +9,10 @@ > */ > > #include "absl/types/optional.h" >+#include "api/test/video/function_video_encoder_factory.h" > #include "modules/video_coding/codecs/vp8/include/vp8.h" > #include "system_wrappers/include/metrics.h" > #include "test/call_test.h" >-#include "test/function_video_encoder_factory.h" > #include "test/gtest.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/multi_codec_receive_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/multi_codec_receive_tests.cc >index e13bc351ca1cc70738a41bc30a332fd1dc7b987a..c58f0fe16a8e2ead05709a86127b39d574c23d9c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/multi_codec_receive_tests.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/multi_codec_receive_tests.cc >@@ -9,13 +9,13 @@ > */ > > #include "api/test/simulated_network.h" >+#include "api/test/video/function_video_encoder_factory.h" > #include "call/fake_network_pipe.h" > #include "call/simulated_network.h" > #include 
"modules/video_coding/codecs/h264/include/h264.h" > #include "modules/video_coding/codecs/vp8/include/vp8.h" > #include "modules/video_coding/codecs/vp9/include/vp9.h" > #include "test/call_test.h" >-#include "test/function_video_encoder_factory.h" > #include "test/gtest.h" > > namespace webrtc { >@@ -51,29 +51,6 @@ int RemoveOlderOrEqual(uint32_t timestamp, std::vector<uint32_t>* timestamps) { > return num_removed; > } > >-class VideoStreamFactoryTest >- : public VideoEncoderConfig::VideoStreamFactoryInterface { >- public: >- explicit VideoStreamFactoryTest(size_t num_temporal_layers) >- : num_temporal_layers_(num_temporal_layers) {} >- >- private: >- std::vector<VideoStream> CreateEncoderStreams( >- int width, >- int height, >- const VideoEncoderConfig& encoder_config) override { >- std::vector<VideoStream> streams = >- test::CreateVideoStreams(width, height, encoder_config); >- >- for (size_t i = 0; i < encoder_config.number_of_streams; ++i) >- streams[i].num_temporal_layers = num_temporal_layers_; >- >- return streams; >- } >- >- const size_t num_temporal_layers_; >-}; >- > class FrameObserver : public test::RtpRtcpObserver, > public rtc::VideoSinkInterface<VideoFrame> { > public: >@@ -153,18 +130,16 @@ class MultiCodecReceiveTest : public test::CallTest { > &task_queue_, sender_call_.get(), &observer_, > test::PacketTransport::kSender, kPayloadTypeMap, > absl::make_unique<FakeNetworkPipe>( >- Clock::GetRealTimeClock(), >- absl::make_unique<SimulatedNetwork>( >- DefaultNetworkSimulationConfig())))); >+ Clock::GetRealTimeClock(), absl::make_unique<SimulatedNetwork>( >+ BuiltInNetworkBehaviorConfig())))); > send_transport_->SetReceiver(receiver_call_->Receiver()); > > receive_transport_.reset(new test::PacketTransport( > &task_queue_, receiver_call_.get(), &observer_, > test::PacketTransport::kReceiver, kPayloadTypeMap, > absl::make_unique<FakeNetworkPipe>( >- Clock::GetRealTimeClock(), >- absl::make_unique<SimulatedNetwork>( >- 
DefaultNetworkSimulationConfig())))); >+ Clock::GetRealTimeClock(), absl::make_unique<SimulatedNetwork>( >+ BuiltInNetworkBehaviorConfig())))); > receive_transport_->SetReceiver(sender_call_->Receiver()); > }); > } >@@ -219,9 +194,9 @@ void MultiCodecReceiveTest::ConfigureEncoder(const CodecConfig& config) { > PayloadNameToPayloadType(config.payload_name); > GetVideoEncoderConfig()->codec_type = > PayloadStringToCodecType(config.payload_name); >- GetVideoEncoderConfig()->video_stream_factory = >- new rtc::RefCountedObject<VideoStreamFactoryTest>( >- config.num_temporal_layers); >+ EXPECT_EQ(1u, GetVideoEncoderConfig()->simulcast_layers.size()); >+ GetVideoEncoderConfig()->simulcast_layers[0].num_temporal_layers = >+ config.num_temporal_layers; > } > > void MultiCodecReceiveTest::RunTestWithCodecs( >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/multi_stream_tester.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/multi_stream_tester.cc >index 7d66976001adb2fd456467ce7f4cb76ebbf252a2..5d5da65ca8f7f80512ecc0f6a80dd56ea5efbf3f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/multi_stream_tester.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/multi_stream_tester.cc >@@ -15,6 +15,8 @@ > #include <vector> > > #include "api/test/simulated_network.h" >+#include "api/test/video/function_video_encoder_factory.h" >+#include "api/video/builtin_video_bitrate_allocator_factory.h" > #include "call/fake_network_pipe.h" > #include "call/simulated_network.h" > #include "logging/rtc_event_log/rtc_event_log.h" >@@ -22,7 +24,6 @@ > #include "modules/video_coding/codecs/vp8/include/vp8.h" > #include "test/call_test.h" > #include "test/encoder_settings.h" >-#include "test/function_video_encoder_factory.h" > > namespace webrtc { > >@@ -54,6 +55,8 @@ void MultiStreamTester::RunTest() { > test::FrameGeneratorCapturer* frame_generators[kNumStreams]; > test::FunctionVideoEncoderFactory 
encoder_factory( > []() { return VP8Encoder::Create(); }); >+ std::unique_ptr<VideoBitrateAllocatorFactory> bitrate_allocator_factory = >+ CreateBuiltinVideoBitrateAllocatorFactory(); > InternalDecoderFactory decoder_factory; > > task_queue_->SendTask([&]() { >@@ -75,6 +78,8 @@ void MultiStreamTester::RunTest() { > VideoSendStream::Config send_config(sender_transport.get()); > send_config.rtp.ssrcs.push_back(ssrc); > send_config.encoder_settings.encoder_factory = &encoder_factory; >+ send_config.encoder_settings.bitrate_allocator_factory = >+ bitrate_allocator_factory.get(); > send_config.rtp.payload_name = "VP8"; > send_config.rtp.payload_type = kVideoPayloadType; > VideoEncoderConfig encoder_config; >@@ -143,9 +148,9 @@ test::DirectTransport* MultiStreamTester::CreateSendTransport( > Call* sender_call) { > return new test::DirectTransport( > task_queue, >- absl::make_unique<FakeNetworkPipe>(Clock::GetRealTimeClock(), >- absl::make_unique<SimulatedNetwork>( >- DefaultNetworkSimulationConfig())), >+ absl::make_unique<FakeNetworkPipe>( >+ Clock::GetRealTimeClock(), >+ absl::make_unique<SimulatedNetwork>(BuiltInNetworkBehaviorConfig())), > sender_call, payload_type_map_); > } > >@@ -154,9 +159,9 @@ test::DirectTransport* MultiStreamTester::CreateReceiveTransport( > Call* receiver_call) { > return new test::DirectTransport( > task_queue, >- absl::make_unique<FakeNetworkPipe>(Clock::GetRealTimeClock(), >- absl::make_unique<SimulatedNetwork>( >- DefaultNetworkSimulationConfig())), >+ absl::make_unique<FakeNetworkPipe>( >+ Clock::GetRealTimeClock(), >+ absl::make_unique<SimulatedNetwork>(BuiltInNetworkBehaviorConfig())), > receiver_call, payload_type_map_); > } > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/multi_stream_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/multi_stream_tests.cc >index dbead24564fe0d70be0001faf12592639cab4c5b..9c754fb70a57de67fd02a04420e87f1965c35d0f 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/multi_stream_tests.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/multi_stream_tests.cc >@@ -30,10 +30,7 @@ TEST_F(MultiStreamEndToEndTest, SendsAndReceivesMultipleStreams) { > VideoOutputObserver(const MultiStreamTester::CodecSettings& settings, > uint32_t ssrc, > test::FrameGeneratorCapturer** frame_generator) >- : settings_(settings), >- ssrc_(ssrc), >- frame_generator_(frame_generator), >- done_(false, false) {} >+ : settings_(settings), ssrc_(ssrc), frame_generator_(frame_generator) {} > > void OnFrame(const VideoFrame& video_frame) override { > EXPECT_EQ(settings_.width, video_frame.width()); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/network_state_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/network_state_tests.cc >index 937777f6b3479276506739d25a3869c0cea6c0c8..d66634a135f32aa66bb032fa13ba9f7b13d2241e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/network_state_tests.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/network_state_tests.cc >@@ -118,9 +118,9 @@ void NetworkStateEndToEndTest::VerifyNewVideoReceiveStreamsRespectNetworkState( > receiver_call_->SignalChannelNetworkState(network_to_bring_up, kNetworkUp); > sender_transport = absl::make_unique<test::DirectTransport>( > &task_queue_, >- absl::make_unique<FakeNetworkPipe>( >- Clock::GetRealTimeClock(), absl::make_unique<SimulatedNetwork>( >- DefaultNetworkSimulationConfig())), >+ absl::make_unique<FakeNetworkPipe>(Clock::GetRealTimeClock(), >+ absl::make_unique<SimulatedNetwork>( >+ BuiltInNetworkBehaviorConfig())), > sender_call_.get(), payload_type_map_); > sender_transport->SetReceiver(receiver_call_->Receiver()); > CreateSendConfig(1, 0, 0, sender_transport.get()); >@@ -158,8 +158,6 @@ TEST_F(NetworkStateEndToEndTest, RespectsNetworkState) { > : EndToEndTest(kDefaultTimeoutMs), > 
FakeEncoder(Clock::GetRealTimeClock()), > task_queue_(task_queue), >- encoded_frames_(false, false), >- packet_event_(false, false), > sender_call_(nullptr), > receiver_call_(nullptr), > encoder_factory_(this), >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/probing_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/probing_tests.cc >index 7dff657ad0397cbf96340a94cf437f60620c0c94..7970d4a1bf7989b95ffc69d3f76cb9b296beb742 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/probing_tests.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/probing_tests.cc >@@ -41,8 +41,8 @@ class ProbingTest : public test::EndToEndTest { > state_(0), > sender_call_(nullptr) {} > >- void ModifySenderCallConfig(Call::Config* config) override { >- config->bitrate_config.start_bitrate_bps = start_bitrate_bps_; >+ void ModifySenderBitrateConfig(BitrateConstraints* bitrate_config) override { >+ bitrate_config->start_bitrate_bps = start_bitrate_bps_; > } > > void OnCallsCreated(Call* sender_call, Call* receiver_call) override { >@@ -221,7 +221,7 @@ TEST_P(ProbingEndToEndTest, ProbeOnVideoEncoderReconfiguration) { > test::SingleThreadedTaskQueueForTesting* task_queue, > Call* sender_call) override { > auto network = >- absl::make_unique<SimulatedNetwork>(DefaultNetworkSimulationConfig()); >+ absl::make_unique<SimulatedNetwork>(BuiltInNetworkBehaviorConfig()); > send_simulated_network_ = network.get(); > return new test::PacketTransport( > task_queue, sender_call, this, test::PacketTransport::kSender, >@@ -245,7 +245,7 @@ TEST_P(ProbingEndToEndTest, ProbeOnVideoEncoderReconfiguration) { > // bitrate). 
> if (stats.send_bandwidth_bps >= 250000 && > stats.send_bandwidth_bps <= 350000) { >- DefaultNetworkSimulationConfig config; >+ BuiltInNetworkBehaviorConfig config; > config.link_capacity_kbps = 200; > send_simulated_network_->SetConfig(config); > >@@ -260,7 +260,7 @@ TEST_P(ProbingEndToEndTest, ProbeOnVideoEncoderReconfiguration) { > break; > case 1: > if (stats.send_bandwidth_bps <= 210000) { >- DefaultNetworkSimulationConfig config; >+ BuiltInNetworkBehaviorConfig config; > config.link_capacity_kbps = 5000; > send_simulated_network_->SetConfig(config); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/receive_time_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/receive_time_tests.cc >deleted file mode 100644 >index 3ca66f1d3d67e276e57b0a88308cf18470bab4d0..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/receive_time_tests.cc >+++ /dev/null >@@ -1,149 +0,0 @@ >-/* >- * Copyright 2018 The WebRTC project authors. All Rights Reserved. >- * >- * Use of this source code is governed by a BSD-style license >- * that can be found in the LICENSE file in the root of the source >- * tree. An additional intellectual property rights grant can be found >- * in the file PATENTS. All contributing project authors may >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >-#include "api/test/simulated_network.h" >-#include "call/fake_network_pipe.h" >-#include "call/simulated_network.h" >-#include "rtc_base/criticalsection.h" >-#include "rtc_base/timeutils.h" >-#include "test/call_test.h" >-#include "test/field_trial.h" >-#include "test/rtcp_packet_parser.h" >- >-namespace webrtc { >-namespace { >- >-// This tester simulates a series of clock reset events where different offsets >-// are added to the receive time. It detects jumps in the resulting reported >-// receive times of more than 200 ms. 
>-class ReportedReceiveTimeTester : public test::EndToEndTest { >- public: >- struct TimeJump { >- int64_t at_send_time_ms; >- int64_t add_offset_ms; >- static constexpr int64_t kStop = 0; >- }; >- >- ReportedReceiveTimeTester() >- : EndToEndTest(test::CallTest::kDefaultTimeoutMs) { >- // These should be let trough without correction and filtered if correction >- // is enabled. >- jumps_.push({500, 2000}); >- jumps_.push({1000, -400}); >- jumps_.push({1500, 2000000}); >- jumps_.push({1700, TimeJump::kStop}); >- } >- bool JumpInReportedTimes() { return jump_in_reported_times_; } >- >- protected: >- Action OnReceiveRtcp(const uint8_t* data, size_t length) override { >- test::RtcpPacketParser parser; >- EXPECT_TRUE(parser.Parse(data, length)); >- const auto& fb = parser.transport_feedback(); >- if (fb->num_packets() > 0) { >- int64_t arrival_time_us = fb->GetBaseTimeUs(); >- for (const auto& pkt : fb->GetReceivedPackets()) { >- arrival_time_us += pkt.delta_us(); >- if (last_arrival_time_us_ != 0) { >- int64_t delta_us = arrival_time_us - last_arrival_time_us_; >- rtc::CritScope crit(&send_times_crit_); >- if (send_times_us_.size() >= 2) { >- int64_t ground_truth_delta_us = >- send_times_us_[1] - send_times_us_[0]; >- send_times_us_.pop_front(); >- int64_t delta_diff_ms = (delta_us - ground_truth_delta_us) / 1000; >- if (std::abs(delta_diff_ms) > 200) { >- jump_in_reported_times_ = true; >- observation_complete_.Set(); >- } >- } >- } >- last_arrival_time_us_ = arrival_time_us; >- } >- } >- return SEND_PACKET; >- } >- Action OnSendRtp(const uint8_t* data, size_t length) override { >- { >- rtc::CritScope crit(&send_times_crit_); >- send_times_us_.push_back(rtc::TimeMicros()); >- } >- int64_t now_ms = rtc::TimeMillis(); >- if (!first_send_time_ms_) >- first_send_time_ms_ = now_ms; >- int64_t send_time_ms = now_ms - first_send_time_ms_; >- if (send_time_ms >= jumps_.front().at_send_time_ms) { >- if (jumps_.front().add_offset_ms == TimeJump::kStop) { >- 
observation_complete_.Set(); >- jumps_.pop(); >- return SEND_PACKET; >- } >- clock_offset_ms_ += jumps_.front().add_offset_ms; >- send_pipe_->SetClockOffset(clock_offset_ms_); >- jumps_.pop(); >- } >- return SEND_PACKET; >- } >- test::PacketTransport* CreateSendTransport( >- test::SingleThreadedTaskQueueForTesting* task_queue, >- Call* sender_call) override { >- auto pipe = absl::make_unique<FakeNetworkPipe>( >- Clock::GetRealTimeClock(), >- absl::make_unique<SimulatedNetwork>(DefaultNetworkSimulationConfig())); >- send_pipe_ = pipe.get(); >- return send_transport_ = new test::PacketTransport( >- task_queue, sender_call, this, test::PacketTransport::kSender, >- test::CallTest::payload_type_map_, std::move(pipe)); >- } >- void PerformTest() override { >- observation_complete_.Wait(test::CallTest::kDefaultTimeoutMs); >- } >- size_t GetNumVideoStreams() const override { return 1; } >- size_t GetNumAudioStreams() const override { return 0; } >- >- private: >- int64_t last_arrival_time_us_ = 0; >- int64_t first_send_time_ms_ = 0; >- rtc::CriticalSection send_times_crit_; >- std::deque<int64_t> send_times_us_ RTC_GUARDED_BY(send_times_crit_); >- bool jump_in_reported_times_ = false; >- FakeNetworkPipe* send_pipe_; >- test::PacketTransport* send_transport_; >- int64_t clock_offset_ms_ = 0; >- std::queue<TimeJump> jumps_; >-}; >-} // namespace >- >-class ReceiveTimeEndToEndTest : public test::CallTest { >- public: >- ReceiveTimeEndToEndTest() {} >- >- virtual ~ReceiveTimeEndToEndTest() {} >-}; >- >-TEST_F(ReceiveTimeEndToEndTest, ReceiveTimeJumpsWithoutFieldTrial) { >- // Without the field trial, the jumps in clock offset should be let trough and >- // be detected. 
>- ReportedReceiveTimeTester test; >- RunBaseTest(&test); >- EXPECT_TRUE(test.JumpInReportedTimes()); >-} >- >-TEST_F(ReceiveTimeEndToEndTest, ReceiveTimeSteadyWithFieldTrial) { >- // Since all the added jumps by the tester are outside the interval of -100 ms >- // to 1000 ms, they should all be filtered by the field trial below, and no >- // jumps should be detected. >- test::ScopedFieldTrials field_trial( >- "WebRTC-BweReceiveTimeCorrection/Enabled,-100,1000/"); >- ReportedReceiveTimeTester test; >- RunBaseTest(&test); >- EXPECT_FALSE(test.JumpInReportedTimes()); >-} >-} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/retransmission_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/retransmission_tests.cc >index 73a5585e747191b2e385867abf27257475728ef1..d1a69e3ed3181c9f5bc5da78b976e6f876598f1c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/retransmission_tests.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/retransmission_tests.cc >@@ -9,13 +9,13 @@ > */ > > #include "api/test/simulated_network.h" >+#include "api/test/video/function_video_encoder_factory.h" > #include "call/fake_network_pipe.h" > #include "call/simulated_network.h" > #include "modules/video_coding/codecs/vp8/include/vp8.h" > #include "system_wrappers/include/sleep.h" > #include "test/call_test.h" > #include "test/field_trial.h" >-#include "test/function_video_encoder_factory.h" > #include "test/gtest.h" > #include "test/rtcp_packet_parser.h" > >@@ -131,9 +131,8 @@ TEST_F(RetransmissionEndToEndTest, ReceivesNackAndRetransmitsAudio) { > task_queue, nullptr, this, test::PacketTransport::kReceiver, > payload_type_map_, > absl::make_unique<FakeNetworkPipe>( >- Clock::GetRealTimeClock(), >- absl::make_unique<SimulatedNetwork>( >- DefaultNetworkSimulationConfig()))); >+ Clock::GetRealTimeClock(), absl::make_unique<SimulatedNetwork>( >+ BuiltInNetworkBehaviorConfig()))); 
> receive_transport_ = receive_transport; > return receive_transport; > } >@@ -166,7 +165,6 @@ TEST_F(RetransmissionEndToEndTest, ReceivesNackAndRetransmitsAudio) { > void ModifyAudioConfigs( > AudioSendStream::Config* send_config, > std::vector<AudioReceiveStream::Config>* receive_configs) override { >- send_config->rtp.nack.rtp_history_ms = kNackRtpHistoryMs; > (*receive_configs)[0].rtp.nack.rtp_history_ms = kNackRtpHistoryMs; > local_ssrc_ = (*receive_configs)[0].rtp.local_ssrc; > remote_ssrc_ = (*receive_configs)[0].rtp.remote_ssrc; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/rtp_rtcp_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/rtp_rtcp_tests.cc >index 7d96dcca7610eda2ba258ba4ccf28121c8bbbcbb..f19c6ad3faa5f6a41870c3858da8b540ab2bda63 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/rtp_rtcp_tests.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/rtp_rtcp_tests.cc >@@ -282,13 +282,13 @@ void RtpRtcpEndToEndTest::TestRtpStatePreservation( > test::PacketTransport::kSender, payload_type_map_, > absl::make_unique<FakeNetworkPipe>( > Clock::GetRealTimeClock(), absl::make_unique<SimulatedNetwork>( >- DefaultNetworkSimulationConfig()))); >+ BuiltInNetworkBehaviorConfig()))); > receive_transport = absl::make_unique<test::PacketTransport>( > &task_queue_, nullptr, &observer, test::PacketTransport::kReceiver, > payload_type_map_, > absl::make_unique<FakeNetworkPipe>( > Clock::GetRealTimeClock(), absl::make_unique<SimulatedNetwork>( >- DefaultNetworkSimulationConfig()))); >+ BuiltInNetworkBehaviorConfig()))); > send_transport->SetReceiver(receiver_call_->Receiver()); > receive_transport->SetReceiver(sender_call_->Receiver()); > >@@ -474,7 +474,7 @@ TEST_F(RtpRtcpEndToEndTest, DISABLED_TestFlexfecRtpStatePreservation) { > task_queue_.SendTask([&]() { > CreateCalls(); > >- DefaultNetworkSimulationConfig lossy_delayed_link; >+ BuiltInNetworkBehaviorConfig 
lossy_delayed_link; > lossy_delayed_link.loss_percent = 2; > lossy_delayed_link.queue_delay_ms = 50; > >@@ -486,7 +486,7 @@ TEST_F(RtpRtcpEndToEndTest, DISABLED_TestFlexfecRtpStatePreservation) { > absl::make_unique<SimulatedNetwork>(lossy_delayed_link))); > send_transport->SetReceiver(receiver_call_->Receiver()); > >- DefaultNetworkSimulationConfig flawless_link; >+ BuiltInNetworkBehaviorConfig flawless_link; > receive_transport = absl::make_unique<test::PacketTransport>( > &task_queue_, nullptr, &observer, test::PacketTransport::kReceiver, > payload_type_map_, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/ssrc_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/ssrc_tests.cc >index c03b49477d60a8be8285519d3eaf962df292abff..86faa0465b5fb3c1d8dda882ab39cefd9a929957 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/ssrc_tests.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/ssrc_tests.cc >@@ -48,7 +48,7 @@ TEST_F(SsrcEndToEndTest, UnknownRtpPacketGivesUnknownSsrcReturnCode) { > class PacketInputObserver : public PacketReceiver { > public: > explicit PacketInputObserver(PacketReceiver* receiver) >- : receiver_(receiver), delivered_packet_(false, false) {} >+ : receiver_(receiver) {} > > bool Wait() { return delivered_packet_.Wait(kDefaultTimeoutMs); } > >@@ -82,15 +82,15 @@ TEST_F(SsrcEndToEndTest, UnknownRtpPacketGivesUnknownSsrcReturnCode) { > > send_transport = absl::make_unique<test::DirectTransport>( > &task_queue_, >- absl::make_unique<FakeNetworkPipe>( >- Clock::GetRealTimeClock(), absl::make_unique<SimulatedNetwork>( >- DefaultNetworkSimulationConfig())), >+ absl::make_unique<FakeNetworkPipe>(Clock::GetRealTimeClock(), >+ absl::make_unique<SimulatedNetwork>( >+ BuiltInNetworkBehaviorConfig())), > sender_call_.get(), payload_type_map_); > receive_transport = absl::make_unique<test::DirectTransport>( > &task_queue_, >- 
absl::make_unique<FakeNetworkPipe>( >- Clock::GetRealTimeClock(), absl::make_unique<SimulatedNetwork>( >- DefaultNetworkSimulationConfig())), >+ absl::make_unique<FakeNetworkPipe>(Clock::GetRealTimeClock(), >+ absl::make_unique<SimulatedNetwork>( >+ BuiltInNetworkBehaviorConfig())), > receiver_call_.get(), payload_type_map_); > input_observer = > absl::make_unique<PacketInputObserver>(receiver_call_->Receiver()); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/stats_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/stats_tests.cc >index af6bbda168e244dec7760fe8a51405ea45980379..d6e92c8af203aef5cbef54417fbd2800946b9ef5 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/stats_tests.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/stats_tests.cc >@@ -9,6 +9,7 @@ > */ > > #include "api/test/simulated_network.h" >+#include "api/test/video/function_video_encoder_factory.h" > #include "call/fake_network_pipe.h" > #include "call/simulated_network.h" > #include "modules/rtp_rtcp/source/rtp_utility.h" >@@ -18,7 +19,6 @@ > #include "system_wrappers/include/sleep.h" > #include "test/call_test.h" > #include "test/fake_encoder.h" >-#include "test/function_video_encoder_factory.h" > #include "test/gtest.h" > #include "test/rtcp_packet_parser.h" > >@@ -47,8 +47,7 @@ TEST_F(StatsEndToEndTest, GetStats) { > Clock::GetRealTimeClock(), 10); > }), > send_stream_(nullptr), >- expected_send_ssrcs_(), >- check_stats_event_(false, false) {} >+ expected_send_ssrcs_() {} > > private: > Action OnSendRtp(const uint8_t* packet, size_t length) override { >@@ -231,7 +230,7 @@ TEST_F(StatsEndToEndTest, GetStats) { > test::PacketTransport* CreateSendTransport( > test::SingleThreadedTaskQueueForTesting* task_queue, > Call* sender_call) override { >- DefaultNetworkSimulationConfig network_config; >+ BuiltInNetworkBehaviorConfig network_config; > network_config.loss_percent = 5; > return new 
test::PacketTransport( > task_queue, sender_call, this, test::PacketTransport::kSender, >@@ -240,8 +239,9 @@ TEST_F(StatsEndToEndTest, GetStats) { > Clock::GetRealTimeClock(), > absl::make_unique<SimulatedNetwork>(network_config))); > } >- void ModifySenderCallConfig(Call::Config* config) override { >- config->bitrate_config.start_bitrate_bps = kStartBitrateBps; >+ void ModifySenderBitrateConfig( >+ BitrateConstraints* bitrate_config) override { >+ bitrate_config->start_bitrate_bps = kStartBitrateBps; > } > > // This test use other VideoStream settings than the the default settings >@@ -515,9 +515,9 @@ TEST_F(StatsEndToEndTest, MAYBE_ContentTypeSwitches) { > metrics::Reset(); > > Call::Config send_config(send_event_log_.get()); >- test.ModifySenderCallConfig(&send_config); >+ test.ModifySenderBitrateConfig(&send_config.bitrate_config); > Call::Config recv_config(recv_event_log_.get()); >- test.ModifyReceiverCallConfig(&recv_config); >+ test.ModifyReceiverBitrateConfig(&recv_config.bitrate_config); > > VideoEncoderConfig encoder_config_with_screenshare; > >@@ -713,7 +713,7 @@ TEST_F(StatsEndToEndTest, CallReportsRttForSender) { > std::unique_ptr<test::DirectTransport> receiver_transport; > > task_queue_.SendTask([this, &sender_transport, &receiver_transport]() { >- DefaultNetworkSimulationConfig config; >+ BuiltInNetworkBehaviorConfig config; > config.queue_delay_ms = kSendDelayMs; > CreateCalls(); > sender_transport = absl::make_unique<test::DirectTransport>( >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/transport_feedback_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/transport_feedback_tests.cc >index 1437cd46e485eb4f3699071239ede28bce2efa14..94db0383305198e6016f567d824c547811d0d446 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/transport_feedback_tests.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/end_to_end_tests/transport_feedback_tests.cc >@@ -55,10 +55,9 
@@ TEST_P(TransportFeedbackEndToEndTest, AssignsTransportSequenceNumbers) { > absl::make_unique<FakeNetworkPipe>( > Clock::GetRealTimeClock(), > absl::make_unique<SimulatedNetwork>( >- DefaultNetworkSimulationConfig())), >+ BuiltInNetworkBehaviorConfig())), > sender_call, > payload_type_map), >- done_(false, false), > parser_(RtpHeaderParser::Create()), > first_media_ssrc_(first_media_ssrc), > rtx_to_media_ssrcs_(ssrc_map), >@@ -415,8 +414,9 @@ TEST_P(TransportFeedbackEndToEndTest, > EXPECT_TRUE(parser.Parse(data, length)); > return parser.transport_feedback()->num_packets() > 0; > } >- void ModifySenderCallConfig(Call::Config* config) override { >- config->bitrate_config.max_bitrate_bps = 300000; >+ void ModifySenderBitrateConfig( >+ BitrateConstraints* bitrate_config) override { >+ bitrate_config->max_bitrate_bps = 300000; > } > > void PerformTest() override { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/frame_dumping_decoder.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/frame_dumping_decoder.cc >new file mode 100644 >index 0000000000000000000000000000000000000000..7798511b5439a3ba71cfc8ecc3062ebd03794e2e >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/frame_dumping_decoder.cc >@@ -0,0 +1,61 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. >+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+#include "video/frame_dumping_decoder.h" >+ >+#include <utility> >+ >+#include "modules/video_coding/include/video_codec_interface.h" >+ >+namespace webrtc { >+ >+FrameDumpingDecoder::FrameDumpingDecoder(std::unique_ptr<VideoDecoder> decoder, >+ rtc::PlatformFile file) >+ : decoder_(std::move(decoder)), >+ writer_(IvfFileWriter::Wrap(rtc::File(file), >+ /* byte_limit= */ 100000000)) {} >+ >+FrameDumpingDecoder::~FrameDumpingDecoder() = default; >+ >+int32_t FrameDumpingDecoder::InitDecode(const VideoCodec* codec_settings, >+ int32_t number_of_cores) { >+ return decoder_->InitDecode(codec_settings, number_of_cores); >+} >+ >+int32_t FrameDumpingDecoder::Decode( >+ const EncodedImage& input_image, >+ bool missing_frames, >+ const CodecSpecificInfo* codec_specific_info, >+ int64_t render_time_ms) { >+ int32_t ret = decoder_->Decode(input_image, missing_frames, >+ codec_specific_info, render_time_ms); >+ writer_->WriteFrame(input_image, codec_specific_info->codecType); >+ >+ return ret; >+} >+ >+int32_t FrameDumpingDecoder::RegisterDecodeCompleteCallback( >+ DecodedImageCallback* callback) { >+ return decoder_->RegisterDecodeCompleteCallback(callback); >+} >+ >+int32_t FrameDumpingDecoder::Release() { >+ return decoder_->Release(); >+} >+ >+bool FrameDumpingDecoder::PrefersLateDecoding() const { >+ return decoder_->PrefersLateDecoding(); >+} >+ >+const char* FrameDumpingDecoder::ImplementationName() const { >+ return decoder_->ImplementationName(); >+} >+ >+} // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/frame_dumping_decoder.h b/Source/ThirdParty/libwebrtc/Source/webrtc/video/frame_dumping_decoder.h >new file mode 100644 >index 0000000000000000000000000000000000000000..e19e9b4db089d835a2f3eb8255940e271fec090b >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/frame_dumping_decoder.h >@@ -0,0 +1,48 @@ >+/* >+ * Copyright (c) 2018 The WebRTC project authors. All Rights Reserved. 
>+ * >+ * Use of this source code is governed by a BSD-style license >+ * that can be found in the LICENSE file in the root of the source >+ * tree. An additional intellectual property rights grant can be found >+ * in the file PATENTS. All contributing project authors may >+ * be found in the AUTHORS file in the root of the source tree. >+ */ >+ >+#ifndef VIDEO_FRAME_DUMPING_DECODER_H_ >+#define VIDEO_FRAME_DUMPING_DECODER_H_ >+ >+#include <memory> >+ >+#include "api/video_codecs/video_decoder.h" >+#include "modules/video_coding/utility/ivf_file_writer.h" >+#include "rtc_base/platform_file.h" >+ >+namespace webrtc { >+ >+// A decoder wrapper that writes the encoded frames to a file. >+class FrameDumpingDecoder : public VideoDecoder { >+ public: >+ FrameDumpingDecoder(std::unique_ptr<VideoDecoder> decoder, >+ rtc::PlatformFile file); >+ ~FrameDumpingDecoder() override; >+ >+ int32_t InitDecode(const VideoCodec* codec_settings, >+ int32_t number_of_cores) override; >+ int32_t Decode(const EncodedImage& input_image, >+ bool missing_frames, >+ const CodecSpecificInfo* codec_specific_info, >+ int64_t render_time_ms) override; >+ int32_t RegisterDecodeCompleteCallback( >+ DecodedImageCallback* callback) override; >+ int32_t Release() override; >+ bool PrefersLateDecoding() const override; >+ const char* ImplementationName() const override; >+ >+ private: >+ std::unique_ptr<VideoDecoder> decoder_; >+ std::unique_ptr<IvfFileWriter> writer_; >+}; >+ >+} // namespace webrtc >+ >+#endif // VIDEO_FRAME_DUMPING_DECODER_H_ >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/full_stack_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/full_stack_tests.cc >index 77da2534489a2aae8027dc79ea838a12d1eb967c..9a8d252bfcc51c6fad13b5416901618702104e94 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/full_stack_tests.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/full_stack_tests.cc >@@ -22,21 +22,24 @@ > namespace webrtc { > namespace flags { > 
>-DEFINE_string(rtc_event_log_name, >- "", >- "Filename for rtc event log. Two files " >- "with \"_send\" and \"_recv\" suffixes will be created."); >+WEBRTC_DEFINE_string(rtc_event_log_name, >+ "", >+ "Filename for rtc event log. Two files " >+ "with \"_send\" and \"_recv\" suffixes will be created."); > std::string RtcEventLogName() { > return static_cast<std::string>(FLAG_rtc_event_log_name); > } >-DEFINE_string(rtp_dump_name, "", "Filename for dumped received RTP stream."); >+WEBRTC_DEFINE_string(rtp_dump_name, >+ "", >+ "Filename for dumped received RTP stream."); > std::string RtpDumpName() { > return static_cast<std::string>(FLAG_rtp_dump_name); > } >-DEFINE_string(encoded_frame_path, >- "", >- "The base path for encoded frame logs. Created files will have " >- "the form <encoded_frame_path>.<n>.(recv|send.<m>).ivf"); >+WEBRTC_DEFINE_string( >+ encoded_frame_path, >+ "", >+ "The base path for encoded frame logs. Created files will have " >+ "the form <encoded_frame_path>.<n>.(recv|send.<m>).ivf"); > std::string EncodedFramePath() { > return static_cast<std::string>(FLAG_encoded_frame_path); > } >@@ -47,10 +50,10 @@ namespace webrtc { > > namespace { > static const int kFullStackTestDurationSecs = 45; >-const char kScreenshareSimulcastExperiment[] = >- "WebRTC-SimulcastScreenshare/Enabled/"; > const char kPacerPushBackExperiment[] = > "WebRTC-PacerPushbackExperiment/Enabled/"; >+const char kVp8TrustedRateControllerFieldTrial[] = >+ "WebRTC-LibvpxVp8TrustedRateController/Enabled/"; > > struct ParamsWithLogging : public VideoQualityTest::Params { > public: >@@ -58,7 +61,7 @@ struct ParamsWithLogging : public VideoQualityTest::Params { > // Use these logging flags by default, for everything. 
> logging = {flags::RtcEventLogName(), flags::RtpDumpName(), > flags::EncodedFramePath()}; >- this->config = DefaultNetworkSimulationConfig(); >+ this->config = BuiltInNetworkBehaviorConfig(); > } > }; > >@@ -106,7 +109,7 @@ class GenericDescriptorTest : public ::testing::TestWithParam<std::string> { > bool generic_descriptor_enabled_; > }; > >-#if !defined(RTC_DISABLE_VP9) >+#if defined(RTC_ENABLE_VP9) > TEST(FullStackTest, ForemanCifWithoutPacketLossVp9) { > auto fixture = CreateVideoQualityTestFixture(); > ParamsWithLogging foreman_cif; >@@ -187,7 +190,7 @@ TEST(FullStackTest, GeneratorWithoutPacketLossMultiplexI420AFrame) { > fixture->RunWithAnalyzer(generator); > } > >-#endif // !defined(RTC_DISABLE_VP9) >+#endif // defined(RTC_ENABLE_VP9) > > #if defined(WEBRTC_LINUX) > // Crashes on the linux trusty perf bot: bugs.webrtc.org/9129. >@@ -234,6 +237,25 @@ TEST_P(GenericDescriptorTest, ForemanCif30kbpsWithoutPacketLoss) { > fixture->RunWithAnalyzer(foreman_cif); > } > >+// TODO(webrtc:9722): Remove when experiment is cleaned up. >+TEST_P(GenericDescriptorTest, >+ ForemanCif30kbpsWithoutPacketLossTrustedRateControl) { >+ test::ScopedFieldTrials override_field_trials( >+ AppendFieldTrials(kVp8TrustedRateControllerFieldTrial)); >+ auto fixture = CreateVideoQualityTestFixture(); >+ >+ ParamsWithLogging foreman_cif; >+ foreman_cif.call.send_side_bwe = true; >+ foreman_cif.video[0] = {true, 352, 288, 10, 30000, 30000, 30000, >+ false, "VP8", 1, 0, 0, false, false, >+ false, "foreman_cif"}; >+ foreman_cif.analyzer = { >+ GetTestName("foreman_cif_30kbps_net_delay_0_0_plr_0_trusted_rate_ctrl"), >+ 0.0, 0.0, kFullStackTestDurationSecs}; >+ foreman_cif.call.generic_descriptor = GenericDescriptorEnabled(); >+ fixture->RunWithAnalyzer(foreman_cif); >+} >+ > // Link capacity below default start rate. Automatic down scaling enabled. 
> TEST(FullStackTest, ForemanCifLink150kbpsWithoutPacketLoss) { > auto fixture = CreateVideoQualityTestFixture(); >@@ -365,9 +387,9 @@ TEST_P(GenericDescriptorTest, ForemanCifPlr5H264) { > } > > TEST(FullStackTest, ForemanCifPlr5H264SpsPpsIdrIsKeyframe) { >- auto fixture = CreateVideoQualityTestFixture(); > test::ScopedFieldTrials override_field_trials( > AppendFieldTrials("WebRTC-SpsPpsIdrIsH264Keyframe/Enabled/")); >+ auto fixture = CreateVideoQualityTestFixture(); > > ParamsWithLogging foreman_cif; > foreman_cif.call.send_side_bwe = true; >@@ -521,16 +543,21 @@ TEST(FullStackTest, ConferenceMotionHd2000kbps100msLimitedQueue) { > fixture->RunWithAnalyzer(conf_motion_hd); > } > >-TEST(FullStackTest, ConferenceMotionHd1TLModerateLimits) { >+// TODO(webrtc:9722): Remove when experiment is cleaned up. >+TEST(FullStackTest, ConferenceMotionHd1TLModerateLimitsWhitelistVp8) { >+ test::ScopedFieldTrials override_field_trials( >+ AppendFieldTrials(kVp8TrustedRateControllerFieldTrial)); > auto fixture = CreateVideoQualityTestFixture(); >+ > ParamsWithLogging conf_motion_hd; > conf_motion_hd.call.send_side_bwe = true; > conf_motion_hd.video[0] = { > true, 1280, 720, 50, 30000, > 3000000, 3000000, false, "VP8", 1, > -1, 0, false, false, false, "ConferenceMotion_1280_720_50"}; >- conf_motion_hd.analyzer = {"conference_motion_hd_1tl_moderate_limits", 0.0, >- 0.0, kFullStackTestDurationSecs}; >+ conf_motion_hd.analyzer = { >+ "conference_motion_hd_1tl_moderate_limits_trusted_rate_ctrl", 0.0, 0.0, >+ kFullStackTestDurationSecs}; > conf_motion_hd.config->queue_length_packets = 50; > conf_motion_hd.config->loss_percent = 3; > conf_motion_hd.config->queue_delay_ms = 100; >@@ -592,17 +619,40 @@ TEST(FullStackTest, ConferenceMotionHd4TLModerateLimits) { > } > > TEST(FullStackTest, ConferenceMotionHd3TLModerateLimitsAltTLPattern) { >- auto fixture = CreateVideoQualityTestFixture(); > test::ScopedFieldTrials field_trial( > AppendFieldTrials("WebRTC-UseShortVP8TL3Pattern/Enabled/")); >+ 
auto fixture = CreateVideoQualityTestFixture(); >+ ParamsWithLogging conf_motion_hd; >+ conf_motion_hd.call.send_side_bwe = true; >+ conf_motion_hd.video[0] = { >+ true, 1280, 720, 50, >+ 30000, 3000000, 3000000, false, >+ "VP8", 3, -1, 0, >+ false, false, false, "ConferenceMotion_1280_720_50"}; >+ conf_motion_hd.analyzer = {"conference_motion_hd_3tl_alt_moderate_limits", >+ 0.0, 0.0, kFullStackTestDurationSecs}; >+ conf_motion_hd.config->queue_length_packets = 50; >+ conf_motion_hd.config->loss_percent = 3; >+ conf_motion_hd.config->queue_delay_ms = 100; >+ conf_motion_hd.config->link_capacity_kbps = 2000; >+ fixture->RunWithAnalyzer(conf_motion_hd); >+} >+ >+TEST(FullStackTest, >+ ConferenceMotionHd3TLModerateLimitsAltTLPatternAndBaseHeavyTLAllocation) { >+ auto fixture = CreateVideoQualityTestFixture(); >+ test::ScopedFieldTrials field_trial( >+ AppendFieldTrials("WebRTC-UseShortVP8TL3Pattern/Enabled/" >+ "WebRTC-UseBaseHeavyVP8TL3RateAllocation/Enabled/")); > ParamsWithLogging conf_motion_hd; > conf_motion_hd.call.send_side_bwe = true; > conf_motion_hd.video[0] = { > true, 1280, 720, 50, 30000, > 3000000, 3000000, false, "VP8", 3, > -1, 0, false, false, false, "ConferenceMotion_1280_720_50"}; >- conf_motion_hd.analyzer = {"conference_motion_hd_3tl_alt_moderate_limits", >- 0.0, 0.0, kFullStackTestDurationSecs}; >+ conf_motion_hd.analyzer = { >+ "conference_motion_hd_3tl_alt_heavy_moderate_limits", 0.0, 0.0, >+ kFullStackTestDurationSecs}; > conf_motion_hd.config->queue_length_packets = 50; > conf_motion_hd.config->loss_percent = 3; > conf_motion_hd.config->queue_delay_ms = 100; >@@ -610,7 +660,7 @@ TEST(FullStackTest, ConferenceMotionHd3TLModerateLimitsAltTLPattern) { > fixture->RunWithAnalyzer(conf_motion_hd); > } > >-#if !defined(RTC_DISABLE_VP9) >+#if defined(RTC_ENABLE_VP9) > TEST(FullStackTest, ConferenceMotionHd2000kbps100msLimitedQueueVP9) { > auto fixture = CreateVideoQualityTestFixture(); > ParamsWithLogging conf_motion_hd; >@@ -642,9 +692,15 @@ 
TEST(FullStackTest, ScreenshareSlidesVP8_2TL) { > fixture->RunWithAnalyzer(screenshare); > } > >+// TODO(bugs.webrtc.org/9840): Investigate why is this test flaky on MAC. >+#if !defined(WEBRTC_MAC) >+const char kScreenshareSimulcastExperiment[] = >+ "WebRTC-SimulcastScreenshare/Enabled/"; >+ > TEST(FullStackTest, ScreenshareSlidesVP8_3TL_Simulcast) { >+ test::ScopedFieldTrials field_trial( >+ AppendFieldTrials(kScreenshareSimulcastExperiment)); > auto fixture = CreateVideoQualityTestFixture(); >- test::ScopedFieldTrials field_trial(kScreenshareSimulcastExperiment); > ParamsWithLogging screenshare; > screenshare.call.send_side_bwe = true; > screenshare.screenshare[0] = {true, false, 10}; >@@ -670,6 +726,7 @@ TEST(FullStackTest, ScreenshareSlidesVP8_3TL_Simulcast) { > false}; > fixture->RunWithAnalyzer(screenshare); > } >+#endif // !defined(WEBRTC_MAC) > > TEST(FullStackTest, ScreenshareSlidesVP8_2TL_Scroll) { > auto fixture = CreateVideoQualityTestFixture(); >@@ -771,7 +828,7 @@ const ParamsWithLogging::Video kSimulcastVp8VideoLow = { > 150000, 200000, false, "VP8", 3, > 2, 400000, false, false, false, "ConferenceMotion_1280_720_50"}; > >-#if !defined(RTC_DISABLE_VP9) >+#if defined(RTC_ENABLE_VP9) > TEST(FullStackTest, ScreenshareSlidesVP9_2SL) { > auto fixture = CreateVideoQualityTestFixture(); > ParamsWithLogging screenshare; >@@ -891,9 +948,28 @@ TEST(FullStackTest, VP9KSVC_3SL_Medium_Network_Restricted) { > simulcast.config->queue_delay_ms = 100; > fixture->RunWithAnalyzer(simulcast); > } >+ >+// TODO(webrtc:9722): Remove when experiment is cleaned up. 
>+TEST(FullStackTest, VP9KSVC_3SL_Medium_Network_Restricted_Trusted_Rate) { >+ webrtc::test::ScopedFieldTrials override_trials( >+ AppendFieldTrials("WebRTC-Vp9IssueKeyFrameOnLayerDeactivation/Enabled/" >+ "WebRTC-LibvpxVp9TrustedRateController/Enabled/")); >+ auto fixture = CreateVideoQualityTestFixture(); >+ ParamsWithLogging simulcast; >+ simulcast.call.send_side_bwe = true; >+ simulcast.video[0] = kSvcVp9Video; >+ simulcast.analyzer = {"vp9ksvc_3sl_medium_network_restricted_trusted_rate", >+ 0.0, 0.0, kFullStackTestDurationSecs}; >+ simulcast.ss[0] = { >+ std::vector<VideoStream>(), 0, 3, -1, InterLayerPredMode::kOnKeyPic, >+ std::vector<SpatialLayer>(), false}; >+ simulcast.config->link_capacity_kbps = 1000; >+ simulcast.config->queue_delay_ms = 100; >+ fixture->RunWithAnalyzer(simulcast); >+} > #endif // !defined(WEBRTC_MAC) > >-#endif // !defined(RTC_DISABLE_VP9) >+#endif // defined(RTC_ENABLE_VP9) > > // Android bots can't handle FullHD, so disable the test. > // TODO(bugs.webrtc.org/9220): Investigate source of flakiness on Mac. >@@ -1104,7 +1180,8 @@ INSTANTIATE_TEST_CASE_P(FullStackTest, > class DualStreamsTest : public ::testing::TestWithParam<int> {}; > > // Disable dual video test on mobile device becuase it's too heavy. >-#if !defined(WEBRTC_ANDROID) && !defined(WEBRTC_IOS) >+// TODO(bugs.webrtc.org/9840): Investigate why is this test flaky on MAC. 
>+#if !defined(WEBRTC_ANDROID) && !defined(WEBRTC_IOS) && !defined(WEBRTC_MAC) > TEST_P(DualStreamsTest, > ModeratelyRestricted_SlidesVp8_3TL_Simulcast_Video_Simulcast_High) { > test::ScopedFieldTrials field_trial( >@@ -1168,7 +1245,8 @@ TEST_P(DualStreamsTest, > auto fixture = CreateVideoQualityTestFixture(); > fixture->RunWithAnalyzer(dual_streams); > } >-#endif // !defined(WEBRTC_ANDROID) && !defined(WEBRTC_IOS) >+#endif // !defined(WEBRTC_ANDROID) && !defined(WEBRTC_IOS) && >+ // !defined(WEBRTC_MAC) > > TEST_P(DualStreamsTest, Conference_Restricted) { > test::ScopedFieldTrials field_trial( >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/overuse_frame_detector_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/overuse_frame_detector_unittest.cc >index ac019754c54e261da17fab89c96401d67ec61cb8..91050d5056f01b4cfe798d2883741cf43b3c8cb3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/overuse_frame_detector_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/overuse_frame_detector_unittest.cc >@@ -10,8 +10,8 @@ > > #include <memory> > >+#include "api/video/encoded_image.h" > #include "api/video/i420_buffer.h" >-#include "common_video/include/video_frame.h" > #include "modules/video_coding/utility/quality_scaler.h" > #include "rtc_base/event.h" > #include "rtc_base/fakeclock.h" >@@ -406,7 +406,7 @@ TEST_F(OveruseFrameDetectorTest, UpdatesExistingSamples) { > TEST_F(OveruseFrameDetectorTest, RunOnTqNormalUsage) { > rtc::TaskQueue queue("OveruseFrameDetectorTestQueue"); > >- rtc::Event event(false, false); >+ rtc::Event event; > queue.PostTask([this, &event] { > overuse_detector_->StartCheckForOveruse(options_, observer_); > event.Set(); >@@ -870,7 +870,7 @@ TEST_F(OveruseFrameDetectorTest2, UpdatesExistingSamples) { > TEST_F(OveruseFrameDetectorTest2, RunOnTqNormalUsage) { > rtc::TaskQueue queue("OveruseFrameDetectorTestQueue"); > >- rtc::Event event(false, false); >+ rtc::Event event; > queue.PostTask([this, 
&event] { > overuse_detector_->StartCheckForOveruse(options_, observer_); > event.Set(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/picture_id_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/picture_id_tests.cc >index 155da13ae49dc4967aba9ab27690f383c5ff37f4..6ce0c99ffe7b32701aa25289c10edde57d22912b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/picture_id_tests.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/picture_id_tests.cc >@@ -9,6 +9,7 @@ > */ > > #include "api/test/simulated_network.h" >+#include "api/test/video/function_video_encoder_factory.h" > #include "call/fake_network_pipe.h" > #include "call/simulated_network.h" > #include "media/engine/internalencoderfactory.h" >@@ -19,7 +20,6 @@ > #include "rtc_base/numerics/safe_conversions.h" > #include "rtc_base/numerics/sequence_number_util.h" > #include "test/call_test.h" >-#include "test/function_video_encoder_factory.h" > > namespace webrtc { > namespace { >@@ -299,7 +299,7 @@ void PictureIdTest::SetupEncoder(VideoEncoderFactory* encoder_factory, > test::PacketTransport::kSender, payload_type_map_, > absl::make_unique<FakeNetworkPipe>( > Clock::GetRealTimeClock(), absl::make_unique<SimulatedNetwork>( >- DefaultNetworkSimulationConfig())))); >+ BuiltInNetworkBehaviorConfig())))); > > CreateSendConfig(kNumSimulcastStreams, 0, 0, send_transport_.get()); > GetVideoSendConfig()->encoder_settings.encoder_factory = encoder_factory; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/quality_scaling_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/quality_scaling_tests.cc >index 9b1d0faeff2128c7eaccc24e72b1768670277904..01328f81925725f2062b0dfc462c7411bc99ccfe 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/quality_scaling_tests.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/quality_scaling_tests.cc >@@ -10,6 +10,7 @@ > > #include <string> > >+#include "api/test/video/function_video_encoder_factory.h" > #include 
"media/engine/internalencoderfactory.h" > #include "modules/video_coding/codecs/h264/include/h264.h" > #include "modules/video_coding/codecs/vp8/include/vp8.h" >@@ -17,7 +18,6 @@ > #include "test/call_test.h" > #include "test/field_trial.h" > #include "test/frame_generator_capturer.h" >-#include "test/function_video_encoder_factory.h" > > namespace webrtc { > namespace { >@@ -103,8 +103,9 @@ void QualityScalingTest::RunTest(VideoEncoderFactory* encoder_factory, > if (wants.max_pixel_count < kWidth * kHeight) > observation_complete_.Set(); > } >- void ModifySenderCallConfig(Call::Config* config) override { >- config->bitrate_config.start_bitrate_bps = start_bps_; >+ void ModifySenderBitrateConfig( >+ BitrateConstraints* bitrate_config) override { >+ bitrate_config->start_bitrate_bps = start_bps_; > } > > void ModifyVideoConfigs( >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/receive_statistics_proxy.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/receive_statistics_proxy.cc >index ccd2c20f65f8a729eed1dd56d177cceaace8ed7c..e20b7d294fd6224942c7798ae66403b5d8357ddd 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/receive_statistics_proxy.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/receive_statistics_proxy.cc >@@ -759,6 +759,9 @@ void ReceiveStatisticsProxy::OnRenderedFrame(const VideoFrame& frame) { > RTC_DCHECK_GT(height, 0); > int64_t now_ms = clock_->TimeInMilliseconds(); > rtc::CritScope lock(&crit_); >+ >+ video_quality_observer_->OnRenderedFrame(now_ms); >+ > ContentSpecificStats* content_specific_stats = > &content_specific_stats_[last_content_type_]; > renders_fps_estimator_.Update(1, now_ms); >@@ -813,8 +816,15 @@ void ReceiveStatisticsProxy::OnCompleteFrame(bool is_keyframe, > ++stats_.frame_counts.delta_frames; > } > >+ // Content type extension is set only for keyframes and should be propagated >+ // for all the following delta frames. 
Here we may receive frames out of order >+ // and miscategorise some delta frames near the layer switch. >+ // This may slightly offset calculated bitrate and keyframes permille metrics. >+ VideoContentType propagated_content_type = >+ is_keyframe ? content_type : last_content_type_; >+ > ContentSpecificStats* content_specific_stats = >- &content_specific_stats_[content_type]; >+ &content_specific_stats_[propagated_content_type]; > > content_specific_stats->total_media_bytes += size_bytes; > if (is_keyframe) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/receive_statistics_proxy.h b/Source/ThirdParty/libwebrtc/Source/webrtc/video/receive_statistics_proxy.h >index 4daefe047d658994e893bd8353320c3c281fc6de..100e75116c47360020c294036df4e18b59912855 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/receive_statistics_proxy.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/receive_statistics_proxy.h >@@ -18,8 +18,6 @@ > > #include "absl/types/optional.h" > #include "call/video_receive_stream.h" >-#include "common_types.h" // NOLINT(build/include) >-#include "common_video/include/frame_callback.h" > #include "modules/video_coding/include/video_coding_defines.h" > #include "rtc_base/criticalsection.h" > #include "rtc_base/numerics/histogram_percentile_counter.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/receive_statistics_proxy_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/receive_statistics_proxy_unittest.cc >index b8a96b2f5e824248351285606e093acf65e018c2..b74e4601aab0a916e8496ac3ea024be1ea11041d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/receive_statistics_proxy_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/receive_statistics_proxy_unittest.cc >@@ -1057,15 +1057,19 @@ TEST_P(ReceiveStatisticsProxyTest, FreezesAreReported) { > const int kFreezeDelayMs = 200; > const int kCallDurationMs = > kMinRequiredSamples * kInterFrameDelayMs + kFreezeDelayMs; >+ 
webrtc::VideoFrame frame(webrtc::I420Buffer::Create(1, 1), 0, 0, >+ webrtc::kVideoRotation_0); > for (int i = 0; i < kMinRequiredSamples; ++i) { > statistics_proxy_->OnDecodedFrame(absl::nullopt, kWidth, kHeight, > content_type); >+ statistics_proxy_->OnRenderedFrame(frame); > fake_clock_.AdvanceTimeMilliseconds(kInterFrameDelayMs); > } > // Add extra freeze. > fake_clock_.AdvanceTimeMilliseconds(kFreezeDelayMs); > statistics_proxy_->OnDecodedFrame(absl::nullopt, kWidth, kHeight, > content_type); >+ statistics_proxy_->OnRenderedFrame(frame); > > statistics_proxy_.reset(); > const int kExpectedTimeBetweenFreezes = >@@ -1095,9 +1099,12 @@ TEST_P(ReceiveStatisticsProxyTest, PausesAreIgnored) { > const VideoContentType content_type = GetParam(); > const int kInterFrameDelayMs = 33; > const int kPauseDurationMs = 10000; >+ webrtc::VideoFrame frame(webrtc::I420Buffer::Create(1, 1), 0, 0, >+ webrtc::kVideoRotation_0); > for (int i = 0; i <= kMinRequiredSamples; ++i) { > statistics_proxy_->OnDecodedFrame(absl::nullopt, kWidth, kHeight, > content_type); >+ statistics_proxy_->OnRenderedFrame(frame); > fake_clock_.AdvanceTimeMilliseconds(kInterFrameDelayMs); > } > // Add a pause. 
>@@ -1108,6 +1115,7 @@ TEST_P(ReceiveStatisticsProxyTest, PausesAreIgnored) { > for (int i = 0; i <= kMinRequiredSamples * 3; ++i) { > statistics_proxy_->OnDecodedFrame(absl::nullopt, kWidth, kHeight, > content_type); >+ statistics_proxy_->OnRenderedFrame(frame); > fake_clock_.AdvanceTimeMilliseconds(kInterFrameDelayMs); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/replay.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/replay.cc >index 5c17e5b3b7d947bd846e9f6967bf1d1a23a0b453..2c1c8a55ad4f5f8527cb162e8555d19b28acb649 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/replay.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/replay.cc >@@ -15,6 +15,7 @@ > #include <memory> > #include <sstream> > >+#include "api/test/video/function_video_decoder_factory.h" > #include "api/video_codecs/video_decoder.h" > #include "call/call.h" > #include "common_video/libyuv/include/webrtc_libyuv.h" >@@ -32,7 +33,6 @@ > #include "test/call_test.h" > #include "test/encoder_settings.h" > #include "test/fake_decoder.h" >-#include "test/function_video_decoder_factory.h" > #include "test/gtest.h" > #include "test/null_transport.h" > #include "test/rtp_file_reader.h" >@@ -72,39 +72,39 @@ namespace flags { > // TODO(pbos): Multiple receivers. > > // Flag for payload type. >-DEFINE_int(media_payload_type, >- test::CallTest::kPayloadTypeVP8, >- "Media payload type"); >+WEBRTC_DEFINE_int(media_payload_type, >+ test::CallTest::kPayloadTypeVP8, >+ "Media payload type"); > static int MediaPayloadType() { > return static_cast<int>(FLAG_media_payload_type); > } > > // Flag for RED payload type. >-DEFINE_int(red_payload_type, >- test::CallTest::kRedPayloadType, >- "RED payload type"); >+WEBRTC_DEFINE_int(red_payload_type, >+ test::CallTest::kRedPayloadType, >+ "RED payload type"); > static int RedPayloadType() { > return static_cast<int>(FLAG_red_payload_type); > } > > // Flag for ULPFEC payload type. 
>-DEFINE_int(ulpfec_payload_type, >- test::CallTest::kUlpfecPayloadType, >- "ULPFEC payload type"); >+WEBRTC_DEFINE_int(ulpfec_payload_type, >+ test::CallTest::kUlpfecPayloadType, >+ "ULPFEC payload type"); > static int UlpfecPayloadType() { > return static_cast<int>(FLAG_ulpfec_payload_type); > } > >-DEFINE_int(media_payload_type_rtx, >- test::CallTest::kSendRtxPayloadType, >- "Media over RTX payload type"); >+WEBRTC_DEFINE_int(media_payload_type_rtx, >+ test::CallTest::kSendRtxPayloadType, >+ "Media over RTX payload type"); > static int MediaPayloadTypeRtx() { > return static_cast<int>(FLAG_media_payload_type_rtx); > } > >-DEFINE_int(red_payload_type_rtx, >- test::CallTest::kRtxRedPayloadType, >- "RED over RTX payload type"); >+WEBRTC_DEFINE_int(red_payload_type_rtx, >+ test::CallTest::kRtxRedPayloadType, >+ "RED over RTX payload type"); > static int RedPayloadTypeRtx() { > return static_cast<int>(FLAG_red_payload_type_rtx); > } >@@ -115,7 +115,7 @@ const std::string& DefaultSsrc() { > std::to_string(test::CallTest::kVideoSendSsrcs[0]); > return ssrc; > } >-DEFINE_string(ssrc, DefaultSsrc().c_str(), "Incoming SSRC"); >+WEBRTC_DEFINE_string(ssrc, DefaultSsrc().c_str(), "Incoming SSRC"); > static uint32_t Ssrc() { > return rtc::StringToNumber<uint32_t>(FLAG_ssrc).value(); > } >@@ -125,54 +125,56 @@ const std::string& DefaultSsrcRtx() { > std::to_string(test::CallTest::kSendRtxSsrcs[0]); > return ssrc_rtx; > } >-DEFINE_string(ssrc_rtx, DefaultSsrcRtx().c_str(), "Incoming RTX SSRC"); >+WEBRTC_DEFINE_string(ssrc_rtx, DefaultSsrcRtx().c_str(), "Incoming RTX SSRC"); > static uint32_t SsrcRtx() { > return rtc::StringToNumber<uint32_t>(FLAG_ssrc_rtx).value(); > } > > // Flag for abs-send-time id. >-DEFINE_int(abs_send_time_id, -1, "RTP extension ID for abs-send-time"); >+WEBRTC_DEFINE_int(abs_send_time_id, -1, "RTP extension ID for abs-send-time"); > static int AbsSendTimeId() { > return static_cast<int>(FLAG_abs_send_time_id); > } > > // Flag for transmission-offset id. 
>-DEFINE_int(transmission_offset_id, >- -1, >- "RTP extension ID for transmission-offset"); >+WEBRTC_DEFINE_int(transmission_offset_id, >+ -1, >+ "RTP extension ID for transmission-offset"); > static int TransmissionOffsetId() { > return static_cast<int>(FLAG_transmission_offset_id); > } > > // Flag for rtpdump input file. >-DEFINE_string(input_file, "", "input file"); >+WEBRTC_DEFINE_string(input_file, "", "input file"); > static std::string InputFile() { > return static_cast<std::string>(FLAG_input_file); > } > >-DEFINE_string(config_file, "", "config file"); >+WEBRTC_DEFINE_string(config_file, "", "config file"); > static std::string ConfigFile() { > return static_cast<std::string>(FLAG_config_file); > } > > // Flag for raw output files. >-DEFINE_string(out_base, "", "Basename (excluding .jpg) for raw output"); >+WEBRTC_DEFINE_string(out_base, "", "Basename (excluding .jpg) for raw output"); > static std::string OutBase() { > return static_cast<std::string>(FLAG_out_base); > } > >-DEFINE_string(decoder_bitstream_filename, "", "Decoder bitstream output file"); >+WEBRTC_DEFINE_string(decoder_bitstream_filename, >+ "", >+ "Decoder bitstream output file"); > static std::string DecoderBitstreamFilename() { > return static_cast<std::string>(FLAG_decoder_bitstream_filename); > } > > // Flag for video codec. 
>-DEFINE_string(codec, "VP8", "Video codec"); >+WEBRTC_DEFINE_string(codec, "VP8", "Video codec"); > static std::string Codec() { > return static_cast<std::string>(FLAG_codec); > } > >-DEFINE_bool(help, false, "Print this message."); >+WEBRTC_DEFINE_bool(help, false, "Print this message."); > } // namespace flags > > static const uint32_t kReceiverLocalSsrc = 0x123456; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/report_block_stats.h b/Source/ThirdParty/libwebrtc/Source/webrtc/video/report_block_stats.h >index 241fec76a69923842959dcac864e860c0c47375a..90badf70862fc1ab70d6a4b1296754988c451577 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/report_block_stats.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/report_block_stats.h >@@ -14,7 +14,7 @@ > #include <map> > #include <vector> > >-#include "common_types.h" // NOLINT(build/include) >+#include "modules/rtp_rtcp/include/rtcp_statistics.h" > #include "modules/rtp_rtcp/include/rtp_rtcp_defines.h" > > namespace webrtc { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/rtp_video_stream_receiver.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/rtp_video_stream_receiver.cc >index 64ec544b70a6f1b3288840bac4006506cf903c40..fdf02ff329d741faf289084e38ad8aa7e738b44b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/rtp_video_stream_receiver.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/rtp_video_stream_receiver.cc >@@ -16,7 +16,6 @@ > > #include "absl/memory/memory.h" > >-#include "common_types.h" // NOLINT(build/include) > #include "media/base/mediaconstants.h" > #include "modules/pacing/packet_router.h" > #include "modules/remote_bitrate_estimator/include/remote_bitrate_estimator.h" >@@ -50,7 +49,7 @@ namespace { > // TODO(philipel): Change kPacketBufferStartSize back to 32 in M63 see: > // crbug.com/752886 > constexpr int kPacketBufferStartSize = 512; >-constexpr int kPacketBufferMaxSixe = 2048; >+constexpr int kPacketBufferMaxSize = 2048; > } 
// namespace > > std::unique_ptr<RtpRtcp> CreateRtpRtcpModule( >@@ -95,7 +94,8 @@ RtpVideoStreamReceiver::RtpVideoStreamReceiver( > ProcessThread* process_thread, > NackSender* nack_sender, > KeyFrameRequestSender* keyframe_request_sender, >- video_coding::OnCompleteFrameCallback* complete_frame_callback) >+ video_coding::OnCompleteFrameCallback* complete_frame_callback, >+ rtc::scoped_refptr<FrameDecryptorInterface> frame_decryptor) > : clock_(Clock::GetRealTimeClock()), > config_(*config), > packet_router_(packet_router), >@@ -113,11 +113,10 @@ RtpVideoStreamReceiver::RtpVideoStreamReceiver( > packet_router)), > complete_frame_callback_(complete_frame_callback), > keyframe_request_sender_(keyframe_request_sender), >- has_received_frame_(false) { >+ has_received_frame_(false), >+ frame_decryptor_(frame_decryptor) { > constexpr bool remb_candidate = true; > packet_router_->AddReceiveRtpModule(rtp_rtcp_.get(), remb_candidate); >- rtp_receive_statistics_->RegisterRtpStatisticsCallback(receive_stats_proxy); >- rtp_receive_statistics_->RegisterRtcpStatisticsCallback(receive_stats_proxy); > > RTC_DCHECK(config_.rtp.rtcp_mode != RtcpMode::kOff) > << "A stream should not be configured with RTCP disabled. This value is " >@@ -147,14 +146,29 @@ RtpVideoStreamReceiver::RtpVideoStreamReceiver( > process_thread_->RegisterModule(rtp_rtcp_.get(), RTC_FROM_HERE); > > if (config_.rtp.nack.rtp_history_ms != 0) { >- nack_module_.reset( >- new NackModule(clock_, nack_sender, keyframe_request_sender)); >+ nack_module_ = absl::make_unique<NackModule>(clock_, nack_sender, >+ keyframe_request_sender); > process_thread_->RegisterModule(nack_module_.get(), RTC_FROM_HERE); > } > >+ // The group here must be a positive power of 2, in which case that is used as >+ // size. All other values shall result in the default value being used. 
>+ const std::string group_name = >+ webrtc::field_trial::FindFullName("WebRTC-PacketBufferMaxSize"); >+ int packet_buffer_max_size = kPacketBufferMaxSize; >+ if (!group_name.empty() && >+ (sscanf(group_name.c_str(), "%d", &packet_buffer_max_size) != 1 || >+ packet_buffer_max_size <= 0 || >+ // Verify that the number is a positive power of 2. >+ (packet_buffer_max_size & (packet_buffer_max_size - 1)) != 0)) { >+ RTC_LOG(LS_WARNING) << "Invalid packet buffer max size: " << group_name; >+ packet_buffer_max_size = kPacketBufferMaxSize; >+ } >+ > packet_buffer_ = video_coding::PacketBuffer::Create( >- clock_, kPacketBufferStartSize, kPacketBufferMaxSixe, this); >- reference_finder_.reset(new video_coding::RtpFrameReferenceFinder(this)); >+ clock_, kPacketBufferStartSize, packet_buffer_max_size, this); >+ reference_finder_ = >+ absl::make_unique<video_coding::RtpFrameReferenceFinder>(this); > } > > RtpVideoStreamReceiver::~RtpVideoStreamReceiver() { >@@ -202,14 +216,15 @@ int32_t RtpVideoStreamReceiver::OnReceivedPayloadData( > size_t payload_size, > const WebRtcRTPHeader* rtp_header) { > return OnReceivedPayloadData(payload_data, payload_size, rtp_header, >- absl::nullopt); >+ absl::nullopt, false); > } > > int32_t RtpVideoStreamReceiver::OnReceivedPayloadData( > const uint8_t* payload_data, > size_t payload_size, > const WebRtcRTPHeader* rtp_header, >- const absl::optional<RtpGenericFrameDescriptor>& generic_descriptor) { >+ const absl::optional<RtpGenericFrameDescriptor>& generic_descriptor, >+ bool is_recovered) { > WebRtcRTPHeader rtp_header_with_ntp = *rtp_header; > rtp_header_with_ntp.ntp_time_ms = > ntp_estimator_.Estimate(rtp_header->header.timestamp); >@@ -221,7 +236,8 @@ int32_t RtpVideoStreamReceiver::OnReceivedPayloadData( > rtp_header->frameType == kVideoFrameKey; > > packet.timesNacked = nack_module_->OnReceivedPacket( >- rtp_header->header.sequenceNumber, is_keyframe); >+ rtp_header->header.sequenceNumber, is_keyframe, is_recovered); >+ > } else { > 
packet.timesNacked = -1; > } >@@ -302,7 +318,8 @@ void RtpVideoStreamReceiver::OnRtpPacket(const RtpPacketReceived& packet) { > last_received_rtp_system_time_ms_ = now_ms; > > std::vector<uint32_t> csrcs = packet.Csrcs(); >- contributing_sources_.Update(now_ms, csrcs); >+ contributing_sources_.Update(now_ms, csrcs, >+ /* audio level */ absl::nullopt); > } > // Periodically log the RTP header of incoming packets. > if (now_ms - last_packet_log_ms_ > kPacketLogIntervalMs) { >@@ -363,10 +380,61 @@ int32_t RtpVideoStreamReceiver::ResendPackets(const uint16_t* sequence_numbers, > > void RtpVideoStreamReceiver::OnReceivedFrame( > std::unique_ptr<video_coding::RtpFrameObject> frame) { >+ // Request a key frame as soon as possible. >+ bool key_frame_requested = false; > if (!has_received_frame_) { > has_received_frame_ = true; >- if (frame->FrameType() != kVideoFrameKey) >+ if (frame->FrameType() != kVideoFrameKey) { >+ key_frame_requested = true; > keyframe_request_sender_->RequestKeyFrame(); >+ } >+ } >+ >+ // Optionally attempt to decrypt the raw video frame if it was provided. >+ if (frame_decryptor_ != nullptr) { >+ // When using encryption we expect the frame to have the generic descriptor. >+ absl::optional<RtpGenericFrameDescriptor> descriptor = >+ frame->GetGenericFrameDescriptor(); >+ if (!descriptor) { >+ RTC_LOG(LS_ERROR) << "No generic frame descriptor found, dropping frame."; >+ return; >+ } >+ >+ // Retrieve the bitstream of the encrypted video frame. >+ rtc::ArrayView<const uint8_t> encrypted_frame_bitstream(frame->Buffer(), >+ frame->size()); >+ // Retrieve the maximum possible size of the decrypted payload. >+ const size_t max_plaintext_byte_size = >+ frame_decryptor_->GetMaxPlaintextByteSize(cricket::MEDIA_TYPE_VIDEO, >+ frame->size()); >+ RTC_CHECK(max_plaintext_byte_size <= frame->size()); >+ // Place the decrypted frame inline into the existing frame.
>+ rtc::ArrayView<uint8_t> inline_decrypted_bitstream(frame->MutableBuffer(), >+ max_plaintext_byte_size); >+ >+ // Attempt to decrypt the video frame. >+ size_t bytes_written = 0; >+ if (frame_decryptor_->Decrypt( >+ cricket::MEDIA_TYPE_VIDEO, /*csrcs=*/{}, >+ /*additional_data=*/nullptr, encrypted_frame_bitstream, >+ inline_decrypted_bitstream, &bytes_written) != 0) { >+ return; >+ } >+ >+ if (!has_received_decrypted_frame_ && !key_frame_requested) { >+ has_received_decrypted_frame_ = true; >+ if (frame->FrameType() != kVideoFrameKey) { >+ keyframe_request_sender_->RequestKeyFrame(); >+ } >+ } >+ >+ RTC_CHECK(bytes_written <= max_plaintext_byte_size); >+ // Update the frame to contain just the written bytes. >+ frame->SetLength(bytes_written); >+ } else if (config_.crypto_options.sframe.require_frame_encryption) { >+ RTC_LOG(LS_WARNING) << "Frame decryption required but not attached to this " >+ "stream. Dropping frame."; >+ return; > } > > reference_finder_->ManageFrame(std::move(frame)); >@@ -459,8 +527,6 @@ void RtpVideoStreamReceiver::ReceivePacket(const RtpPacketReceived& packet) { > webrtc_rtp_header.video_header().content_type = VideoContentType::UNSPECIFIED; > webrtc_rtp_header.video_header().video_timing.flags = > VideoSendTiming::kInvalid; >- webrtc_rtp_header.video_header().playout_delay.min_ms = -1; >- webrtc_rtp_header.video_header().playout_delay.max_ms = -1; > webrtc_rtp_header.video_header().is_last_packet_in_frame = > webrtc_rtp_header.header.markerBit; > >@@ -486,12 +552,23 @@ void RtpVideoStreamReceiver::ReceivePacket(const RtpPacketReceived& packet) { > webrtc_rtp_header.header.markerBit || > (generic_descriptor_wire->LastSubFrameInFrame() && > generic_descriptor_wire->LastPacketInSubFrame()); >+ >+ if (generic_descriptor_wire->FirstPacketInSubFrame()) { >+ webrtc_rtp_header.frameType = >+ generic_descriptor_wire->FrameDependenciesDiffs().empty() >+ ? 
kVideoFrameKey >+ : kVideoFrameDelta; >+ } >+ >+ webrtc_rtp_header.video_header().width = generic_descriptor_wire->Width(); >+ webrtc_rtp_header.video_header().height = generic_descriptor_wire->Height(); > } else { > generic_descriptor_wire.reset(); > } > > OnReceivedPayloadData(parsed_payload.payload, parsed_payload.payload_length, >- &webrtc_rtp_header, generic_descriptor_wire); >+ &webrtc_rtp_header, generic_descriptor_wire, >+ packet.recovered()); > } > > void RtpVideoStreamReceiver::ParseAndHandleEncapsulatingHeader( >@@ -523,7 +600,8 @@ void RtpVideoStreamReceiver::NotifyReceiverOfEmptyPacket(uint16_t seq_num) { > reference_finder_->PaddingReceived(seq_num); > packet_buffer_->PaddingReceived(seq_num); > if (nack_module_) { >- nack_module_->OnReceivedPacket(seq_num, /* is_keyframe = */ false); >+ nack_module_->OnReceivedPacket(seq_num, /* is_keyframe = */ false, >+ /* is_recovered = */ false); > } > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/rtp_video_stream_receiver.h b/Source/ThirdParty/libwebrtc/Source/webrtc/video/rtp_video_stream_receiver.h >index ca5d7a0abd94258147b1f6b9b2aabc59ada553b0..a2006d1bb2a087246a680b6eed057e2d24af0b4c 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/rtp_video_stream_receiver.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/rtp_video_stream_receiver.h >@@ -19,6 +19,7 @@ > > #include "absl/types/optional.h" > >+#include "api/crypto/framedecryptorinterface.h" > #include "api/video_codecs/video_codec.h" > #include "call/rtp_packet_sink_interface.h" > #include "call/syncable.h" >@@ -67,7 +68,8 @@ class RtpVideoStreamReceiver : public RecoveredPacketReceiver, > ProcessThread* process_thread, > NackSender* nack_sender, > KeyFrameRequestSender* keyframe_request_sender, >- video_coding::OnCompleteFrameCallback* complete_frame_callback); >+ video_coding::OnCompleteFrameCallback* complete_frame_callback, >+ rtc::scoped_refptr<FrameDecryptorInterface> frame_decryptor); >
~RtpVideoStreamReceiver() override; > > void AddReceiveCodec(const VideoCodec& video_codec, >@@ -102,7 +104,8 @@ class RtpVideoStreamReceiver : public RecoveredPacketReceiver, > const uint8_t* payload_data, > size_t payload_size, > const WebRtcRTPHeader* rtp_header, >- const absl::optional<RtpGenericFrameDescriptor>& generic_descriptor); >+ const absl::optional<RtpGenericFrameDescriptor>& generic_descriptor, >+ bool is_recovered); > > // Implements RecoveredPacketReceiver. > void OnRecoveredPacket(const uint8_t* packet, size_t packet_length) override; >@@ -203,6 +206,11 @@ class RtpVideoStreamReceiver : public RecoveredPacketReceiver, > RTC_GUARDED_BY(rtp_sources_lock_); > absl::optional<int64_t> last_received_rtp_system_time_ms_ > RTC_GUARDED_BY(rtp_sources_lock_); >+ >+ // E2EE Video Frame Decryptor (Optional) >+ rtc::scoped_refptr<FrameDecryptorInterface> frame_decryptor_; >+ // Set to true on the first successfully decrypted frame. >+ bool has_received_decrypted_frame_ = false; > }; > > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/rtp_video_stream_receiver_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/rtp_video_stream_receiver_unittest.cc >index ec2bf334e9b6d0e7984fe8c665100749a4169653..10c5e7eebce2af1fcb0ccc9c9f4e26fae35ce76e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/rtp_video_stream_receiver_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/rtp_video_stream_receiver_unittest.cc >@@ -133,7 +133,7 @@ class RtpVideoStreamReceiverTest : public testing::Test { > &mock_transport_, nullptr, &packet_router_, &config_, > rtp_receive_statistics_.get(), nullptr, process_thread_.get(), > &mock_nack_sender_, &mock_key_frame_request_sender_, >- &mock_on_complete_frame_callback_, nullptr); > } > > WebRtcRTPHeader GetDefaultPacket() { >@@ -568,10 +568,8 @@ TEST_F(RtpVideoStreamReceiverTest, ParseGenericDescriptorTwoPackets) { >
first_packet_descriptor.SetFirstSubFrameInFrame(true); > first_packet_descriptor.SetLastSubFrameInFrame(true); > first_packet_descriptor.SetFrameId(100); >- first_packet_descriptor.SetTemporalLayer(1); > first_packet_descriptor.SetSpatialLayersBitmask(1 << kSpatialIndex); >- first_packet_descriptor.AddFrameDependencyDiff(90); >- first_packet_descriptor.AddFrameDependencyDiff(80); >+ first_packet_descriptor.SetResolution(480, 360); > EXPECT_TRUE(first_packet.SetExtension<RtpGenericFrameDescriptorExtension>( > first_packet_descriptor)); > >@@ -606,10 +604,10 @@ TEST_F(RtpVideoStreamReceiverTest, ParseGenericDescriptorTwoPackets) { > > EXPECT_CALL(mock_on_complete_frame_callback_, DoOnCompleteFrame) > .WillOnce(Invoke([kSpatialIndex](video_coding::EncodedFrame* frame) { >- EXPECT_EQ(frame->num_references, 2U); >- EXPECT_EQ(frame->references[0], frame->id.picture_id - 90); >- EXPECT_EQ(frame->references[1], frame->id.picture_id - 80); >+ EXPECT_EQ(frame->num_references, 0U); > EXPECT_EQ(frame->id.spatial_layer, kSpatialIndex); >+ EXPECT_EQ(frame->EncodedImage()._encodedWidth, 480u); >+ EXPECT_EQ(frame->EncodedImage()._encodedHeight, 360u); > })); > > rtp_video_stream_receiver_->OnRtpPacket(second_packet); >@@ -628,4 +626,13 @@ TEST_F(RtpVideoStreamReceiverTest, RepeatedSecondarySinkDisallowed) { > } > #endif > >+// Initialization of WebRtcRTPHeader is a bit convoluted, with some fields >+// zero-initialized. RtpVideoStreamReceiver depends on proper default values for >+// the playout delay. 
>+TEST(WebRtcRTPHeader, DefaultPlayoutDelayIsUnspecified) { >+ WebRtcRTPHeader webrtc_rtp_header = {}; >+ EXPECT_EQ(webrtc_rtp_header.video_header().playout_delay.min_ms, -1); >+ EXPECT_EQ(webrtc_rtp_header.video_header().playout_delay.max_ms, -1); >+} >+ > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/screenshare_loopback.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/screenshare_loopback.cc >index 1d6e87e71ff4ba92e47cc74d9ae8b020b3e39e01..4696666908707d450e2a3fb3a17e62a20636cadc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/screenshare_loopback.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/screenshare_loopback.cc >@@ -12,7 +12,6 @@ > > #include "rtc_base/flags.h" > #include "rtc_base/logging.h" >-#include "rtc_base/stringencode.h" > #include "system_wrappers/include/field_trial.h" > #include "test/field_trial.h" > #include "test/gtest.h" >@@ -23,73 +22,76 @@ namespace webrtc { > namespace flags { > > // Flags common with video loopback, with different default values. 
>-DEFINE_int(width, 1850, "Video width (crops source)."); >+WEBRTC_DEFINE_int(width, 1850, "Video width (crops source)."); > size_t Width() { > return static_cast<size_t>(FLAG_width); > } > >-DEFINE_int(height, 1110, "Video height (crops source)."); >+WEBRTC_DEFINE_int(height, 1110, "Video height (crops source)."); > size_t Height() { > return static_cast<size_t>(FLAG_height); > } > >-DEFINE_int(fps, 5, "Frames per second."); >+WEBRTC_DEFINE_int(fps, 5, "Frames per second."); > int Fps() { > return static_cast<int>(FLAG_fps); > } > >-DEFINE_int(min_bitrate, 50, "Call and stream min bitrate in kbps."); >+WEBRTC_DEFINE_int(min_bitrate, 50, "Call and stream min bitrate in kbps."); > int MinBitrateKbps() { > return static_cast<int>(FLAG_min_bitrate); > } > >-DEFINE_int(start_bitrate, 300, "Call start bitrate in kbps."); >+WEBRTC_DEFINE_int(start_bitrate, 300, "Call start bitrate in kbps."); > int StartBitrateKbps() { > return static_cast<int>(FLAG_start_bitrate); > } > >-DEFINE_int(target_bitrate, 200, "Stream target bitrate in kbps."); >+WEBRTC_DEFINE_int(target_bitrate, 200, "Stream target bitrate in kbps."); > int TargetBitrateKbps() { > return static_cast<int>(FLAG_target_bitrate); > } > >-DEFINE_int(max_bitrate, 1000, "Call and stream max bitrate in kbps."); >+WEBRTC_DEFINE_int(max_bitrate, 1000, "Call and stream max bitrate in kbps."); > int MaxBitrateKbps() { > return static_cast<int>(FLAG_max_bitrate); > } > >-DEFINE_int(num_temporal_layers, 2, "Number of temporal layers to use."); >+WEBRTC_DEFINE_int(num_temporal_layers, 2, "Number of temporal layers to use."); > int NumTemporalLayers() { > return static_cast<int>(FLAG_num_temporal_layers); > } > > // Flags common with video loopback, with equal default values. 
>-DEFINE_string(codec, "VP8", "Video codec to use."); >+WEBRTC_DEFINE_string(codec, "VP8", "Video codec to use."); > std::string Codec() { > return static_cast<std::string>(FLAG_codec); > } > >-DEFINE_string(rtc_event_log_name, >- "", >- "Filename for rtc event log. Two files " >- "with \"_send\" and \"_recv\" suffixes will be created."); >+WEBRTC_DEFINE_string(rtc_event_log_name, >+ "", >+ "Filename for rtc event log. Two files " >+ "with \"_send\" and \"_recv\" suffixes will be created."); > std::string RtcEventLogName() { > return static_cast<std::string>(FLAG_rtc_event_log_name); > } > >-DEFINE_string(rtp_dump_name, "", "Filename for dumped received RTP stream."); >+WEBRTC_DEFINE_string(rtp_dump_name, >+ "", >+ "Filename for dumped received RTP stream."); > std::string RtpDumpName() { > return static_cast<std::string>(FLAG_rtp_dump_name); > } > >-DEFINE_int(selected_tl, >- -1, >- "Temporal layer to show or analyze. -1 to disable filtering."); >+WEBRTC_DEFINE_int( >+ selected_tl, >+ -1, >+ "Temporal layer to show or analyze. -1 to disable filtering."); > int SelectedTL() { > return static_cast<int>(FLAG_selected_tl); > } > >-DEFINE_int( >+WEBRTC_DEFINE_int( > duration, > 0, > "Duration of the test in seconds. 
If 0, rendered will be shown instead."); >@@ -97,71 +99,74 @@ int DurationSecs() { > return static_cast<int>(FLAG_duration); > } > >-DEFINE_string(output_filename, "", "Target graph data filename."); >+WEBRTC_DEFINE_string(output_filename, "", "Target graph data filename."); > std::string OutputFilename() { > return static_cast<std::string>(FLAG_output_filename); > } > >-DEFINE_string(graph_title, >- "", >- "If empty, title will be generated automatically."); >+WEBRTC_DEFINE_string(graph_title, >+ "", >+ "If empty, title will be generated automatically."); > std::string GraphTitle() { > return static_cast<std::string>(FLAG_graph_title); > } > >-DEFINE_int(loss_percent, 0, "Percentage of packets randomly lost."); >+WEBRTC_DEFINE_int(loss_percent, 0, "Percentage of packets randomly lost."); > int LossPercent() { > return static_cast<int>(FLAG_loss_percent); > } > >-DEFINE_int(link_capacity, >- 0, >- "Capacity (kbps) of the fake link. 0 means infinite."); >+WEBRTC_DEFINE_int(link_capacity, >+ 0, >+ "Capacity (kbps) of the fake link. 
0 means infinite."); > int LinkCapacityKbps() { > return static_cast<int>(FLAG_link_capacity); > } > >-DEFINE_int(queue_size, 0, "Size of the bottleneck link queue in packets."); >+WEBRTC_DEFINE_int(queue_size, >+ 0, >+ "Size of the bottleneck link queue in packets."); > int QueueSize() { > return static_cast<int>(FLAG_queue_size); > } > >-DEFINE_int(avg_propagation_delay_ms, >- 0, >- "Average link propagation delay in ms."); >+WEBRTC_DEFINE_int(avg_propagation_delay_ms, >+ 0, >+ "Average link propagation delay in ms."); > int AvgPropagationDelayMs() { > return static_cast<int>(FLAG_avg_propagation_delay_ms); > } > >-DEFINE_int(std_propagation_delay_ms, >- 0, >- "Link propagation delay standard deviation in ms."); >+WEBRTC_DEFINE_int(std_propagation_delay_ms, >+ 0, >+ "Link propagation delay standard deviation in ms."); > int StdPropagationDelayMs() { > return static_cast<int>(FLAG_std_propagation_delay_ms); > } > >-DEFINE_int(num_streams, 0, "Number of streams to show or analyze."); >+WEBRTC_DEFINE_int(num_streams, 0, "Number of streams to show or analyze."); > int NumStreams() { > return static_cast<int>(FLAG_num_streams); > } > >-DEFINE_int(selected_stream, >- 0, >- "ID of the stream to show or analyze. " >- "Set to the number of streams to show them all."); >+WEBRTC_DEFINE_int(selected_stream, >+ 0, >+ "ID of the stream to show or analyze. " >+ "Set to the number of streams to show them all."); > int SelectedStream() { > return static_cast<int>(FLAG_selected_stream); > } > >-DEFINE_int(num_spatial_layers, 1, "Number of spatial layers to use."); >+WEBRTC_DEFINE_int(num_spatial_layers, 1, "Number of spatial layers to use."); > int NumSpatialLayers() { > return static_cast<int>(FLAG_num_spatial_layers); > } > >-DEFINE_int(inter_layer_pred, >- 0, >- "Inter-layer prediction mode. " >- "0 - enabled, 1 - disabled, 2 - enabled only for key pictures."); >+WEBRTC_DEFINE_int( >+ inter_layer_pred, >+ 0, >+ "Inter-layer prediction mode. 
" >+ "0 - enabled, 1 - disabled, 2 - enabled only for key pictures."); > InterLayerPredMode InterLayerPred() { > if (FLAG_inter_layer_pred == 0) { > return InterLayerPredMode::kOn; >@@ -173,58 +178,65 @@ InterLayerPredMode InterLayerPred() { > } > } > >-DEFINE_int(selected_sl, >- -1, >- "Spatial layer to show or analyze. -1 to disable filtering."); >+WEBRTC_DEFINE_int(selected_sl, >+ -1, >+ "Spatial layer to show or analyze. -1 to disable filtering."); > int SelectedSL() { > return static_cast<int>(FLAG_selected_sl); > } > >-DEFINE_string(stream0, >- "", >- "Comma separated values describing VideoStream for stream #0."); >+WEBRTC_DEFINE_string( >+ stream0, >+ "", >+ "Comma separated values describing VideoStream for stream #0."); > std::string Stream0() { > return static_cast<std::string>(FLAG_stream0); > } > >-DEFINE_string(stream1, >- "", >- "Comma separated values describing VideoStream for stream #1."); >+WEBRTC_DEFINE_string( >+ stream1, >+ "", >+ "Comma separated values describing VideoStream for stream #1."); > std::string Stream1() { > return static_cast<std::string>(FLAG_stream1); > } > >-DEFINE_string(sl0, >- "", >- "Comma separated values describing SpatialLayer for layer #0."); >+WEBRTC_DEFINE_string( >+ sl0, >+ "", >+ "Comma separated values describing SpatialLayer for layer #0."); > std::string SL0() { > return static_cast<std::string>(FLAG_sl0); > } > >-DEFINE_string(sl1, >- "", >- "Comma separated values describing SpatialLayer for layer #1."); >+WEBRTC_DEFINE_string( >+ sl1, >+ "", >+ "Comma separated values describing SpatialLayer for layer #1."); > std::string SL1() { > return static_cast<std::string>(FLAG_sl1); > } > >-DEFINE_string(encoded_frame_path, >- "", >- "The base path for encoded frame logs. Created files will have " >- "the form <encoded_frame_path>.<n>.(recv|send.<m>).ivf"); >+WEBRTC_DEFINE_string( >+ encoded_frame_path, >+ "", >+ "The base path for encoded frame logs. 
Created files will have " >+ "the form <encoded_frame_path>.<n>.(recv|send.<m>).ivf"); > std::string EncodedFramePath() { > return static_cast<std::string>(FLAG_encoded_frame_path); > } > >-DEFINE_bool(logs, false, "print logs to stderr"); >+WEBRTC_DEFINE_bool(logs, false, "print logs to stderr"); > >-DEFINE_bool(send_side_bwe, true, "Use send-side bandwidth estimation"); >+WEBRTC_DEFINE_bool(send_side_bwe, true, "Use send-side bandwidth estimation"); > >-DEFINE_bool(generic_descriptor, false, "Use the generic frame descriptor."); >+WEBRTC_DEFINE_bool(generic_descriptor, >+ false, >+ "Use the generic frame descriptor."); > >-DEFINE_bool(allow_reordering, false, "Allow packet reordering to occur"); >+WEBRTC_DEFINE_bool(allow_reordering, false, "Allow packet reordering to occur"); > >-DEFINE_string( >+WEBRTC_DEFINE_string( > force_fieldtrials, > "", > "Field trials control experimental feature code which can be forced. " >@@ -233,12 +245,14 @@ DEFINE_string( > "trials are separated by \"/\""); > > // Screenshare-specific flags. >-DEFINE_int(min_transmit_bitrate, 400, "Min transmit bitrate incl. padding."); >+WEBRTC_DEFINE_int(min_transmit_bitrate, >+ 400, >+ "Min transmit bitrate incl. 
padding."); > int MinTransmitBitrateKbps() { > return FLAG_min_transmit_bitrate; > } > >-DEFINE_bool( >+WEBRTC_DEFINE_bool( > generate_slides, > false, > "Whether to use randomly generated slides or read them from files."); >@@ -246,14 +260,14 @@ bool GenerateSlides() { > return static_cast<int>(FLAG_generate_slides); > } > >-DEFINE_int(slide_change_interval, >- 10, >- "Interval (in seconds) between simulated slide changes."); >+WEBRTC_DEFINE_int(slide_change_interval, >+ 10, >+ "Interval (in seconds) between simulated slide changes."); > int SlideChangeInterval() { > return static_cast<int>(FLAG_slide_change_interval); > } > >-DEFINE_int( >+WEBRTC_DEFINE_int( > scroll_duration, > 0, > "Duration (in seconds) during which a slide will be scrolled into place."); >@@ -261,9 +275,10 @@ int ScrollDuration() { > return static_cast<int>(FLAG_scroll_duration); > } > >-DEFINE_string(slides, >- "", >- "Comma-separated list of *.yuv files to display as slides."); >+WEBRTC_DEFINE_string( >+ slides, >+ "", >+ "Comma-separated list of *.yuv files to display as slides."); > std::vector<std::string> Slides() { > std::vector<std::string> slides; > std::string slides_list = FLAG_slides; >@@ -271,12 +286,12 @@ std::vector<std::string> Slides() { > return slides; > } > >-DEFINE_bool(help, false, "prints this message"); >+WEBRTC_DEFINE_bool(help, false, "prints this message"); > > } // namespace flags > > void Loopback() { >- DefaultNetworkSimulationConfig pipe_config; >+ BuiltInNetworkBehaviorConfig pipe_config; > pipe_config.loss_percent = flags::LossPercent(); > pipe_config.link_capacity_kbps = flags::LinkCapacityKbps(); > pipe_config.queue_length_packets = flags::QueueSize(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/send_delay_stats.h b/Source/ThirdParty/libwebrtc/Source/webrtc/video/send_delay_stats.h >index 3337899069a006097660030ba005decfda02a576..b44516b285541aba081f110b316fe22de9c0bab9 100644 >--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/video/send_delay_stats.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/send_delay_stats.h >@@ -16,7 +16,6 @@ > #include <set> > > #include "call/video_send_stream.h" >-#include "common_types.h" // NOLINT(build/include) > #include "rtc_base/criticalsection.h" > #include "rtc_base/thread_annotations.h" > #include "system_wrappers/include/clock.h" >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/send_statistics_proxy.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/send_statistics_proxy.cc >index 9839ed7a3b7094bdffe74de834d1f77315ecbafb..24e964194714503748f2055331ccb59597f7d1b3 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/send_statistics_proxy.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/send_statistics_proxy.cc >@@ -15,7 +15,6 @@ > #include <limits> > #include <utility> > >-#include "common_types.h" // NOLINT(build/include) > #include "modules/video_coding/include/video_codec_interface.h" > #include "rtc_base/checks.h" > #include "rtc_base/logging.h" >@@ -823,10 +822,13 @@ void SendStatisticsProxy::UpdateEncoderFallbackStats( > > const int64_t now_ms = clock_->TimeInMilliseconds(); > bool is_active = fallback_info->is_active; >- if (codec_info->codec_name != stats_.encoder_implementation_name) { >+ if (encoder_changed_) { > // Implementation changed. >- is_active = strcmp(codec_info->codec_name, kVp8SwCodecName) == 0; >- if (!is_active && stats_.encoder_implementation_name != kVp8SwCodecName) { >+ const bool last_was_vp8_software = >+ encoder_changed_->previous_encoder_implementation == kVp8SwCodecName; >+ is_active = encoder_changed_->new_encoder_implementation == kVp8SwCodecName; >+ encoder_changed_.reset(); >+ if (!is_active && !last_was_vp8_software) { > // First or not a VP8 SW change, update stats on next call. 
> return; > } >@@ -865,7 +867,7 @@ void SendStatisticsProxy::UpdateFallbackDisabledStats( > } > > if (!IsForcedFallbackPossible(codec_info, simulcast_index) || >- strcmp(codec_info->codec_name, kVp8SwCodecName) == 0) { >+ stats_.encoder_implementation_name == kVp8SwCodecName) { > uma_container_->fallback_info_disabled_.is_possible = false; > return; > } >@@ -895,13 +897,9 @@ void SendStatisticsProxy::OnSendEncodedImage( > rtc::CritScope lock(&crit_); > ++stats_.frames_encoded; > if (codec_info) { >- if (codec_info->codec_name) { >- UpdateEncoderFallbackStats( >- codec_info, >- encoded_image._encodedWidth * encoded_image._encodedHeight, >- simulcast_idx); >- stats_.encoder_implementation_name = codec_info->codec_name; >- } >+ UpdateEncoderFallbackStats( >+ codec_info, encoded_image._encodedWidth * encoded_image._encodedHeight, >+ simulcast_idx); > } > > if (static_cast<size_t>(simulcast_idx) >= rtp_config_.ssrcs.size()) { >@@ -981,6 +979,14 @@ void SendStatisticsProxy::OnSendEncodedImage( > } > } > >+void SendStatisticsProxy::OnEncoderImplementationChanged( >+ const std::string& implementation_name) { >+ rtc::CritScope lock(&crit_); >+ encoder_changed_ = EncoderChangeEvent{stats_.encoder_implementation_name, >+ implementation_name}; >+ stats_.encoder_implementation_name = implementation_name; >+} >+ > int SendStatisticsProxy::GetInputFrameRate() const { > rtc::CritScope lock(&crit_); > return round(uma_container_->input_frame_rate_tracker_.ComputeRate()); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/send_statistics_proxy.h b/Source/ThirdParty/libwebrtc/Source/webrtc/video/send_statistics_proxy.h >index b5868d0890727f05260e9030b4570909fd2f4186..7316f3b55aec0d34fd5f5aa092e0f19ca889bc41 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/send_statistics_proxy.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/send_statistics_proxy.h >@@ -18,7 +18,6 @@ > > #include "api/video/video_stream_encoder_observer.h" > #include 
"call/video_send_stream.h" >-#include "common_types.h" // NOLINT(build/include) > #include "modules/video_coding/include/video_codec_interface.h" > #include "modules/video_coding/include/video_coding_defines.h" > #include "rtc_base/criticalsection.h" >@@ -53,6 +52,10 @@ class SendStatisticsProxy : public VideoStreamEncoderObserver, > > void OnSendEncodedImage(const EncodedImage& encoded_image, > const CodecSpecificInfo* codec_info) override; >+ >+ void OnEncoderImplementationChanged( >+ const std::string& implementation_name) override; >+ > // Used to update incoming frame rate. > void OnIncomingFrame(int width, int height) override; > >@@ -241,6 +244,14 @@ class SendStatisticsProxy : public VideoStreamEncoderObserver, > > absl::optional<int64_t> last_outlier_timestamp_ RTC_GUARDED_BY(crit_); > >+ struct EncoderChangeEvent { >+ std::string previous_encoder_implementation; >+ std::string new_encoder_implementation; >+ }; >+ // Stores the last change in encoder implementation in an optional, so that >+ // the event can be consumed. >+ absl::optional<EncoderChangeEvent> encoder_changed_; >+ > // Contains stats used for UMA histograms. These stats will be reset if > // content type changes between real-time video and screenshare, since these > // will be reported separately. 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/send_statistics_proxy_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/send_statistics_proxy_unittest.cc >index 3d27c712c6b6a3727f9f993bdb16d1a312daf443..586d57fb284880fff1733ec9e7c82055b48f339b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/send_statistics_proxy_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/send_statistics_proxy_unittest.cc >@@ -2167,13 +2167,9 @@ TEST_F(SendStatisticsProxyTest, FecBitrateNotReportedWhenNotEnabled) { > } > > TEST_F(SendStatisticsProxyTest, GetStatsReportsEncoderImplementationName) { >- const char* kName = "encoderName"; >- EncodedImage encoded_image; >- CodecSpecificInfo codec_info; >- codec_info.codec_name = kName; >- statistics_proxy_->OnSendEncodedImage(encoded_image, &codec_info); >- EXPECT_STREQ( >- kName, statistics_proxy_->GetStats().encoder_implementation_name.c_str()); >+ const std::string kName = "encoderName"; >+ statistics_proxy_->OnEncoderImplementationChanged(kName); >+ EXPECT_EQ(kName, statistics_proxy_->GetStats().encoder_implementation_name); > } > > TEST_F(SendStatisticsProxyTest, Vp9SvcLowSpatialLayerDoesNotUpdateResolution) { >@@ -2219,7 +2215,6 @@ class ForcedFallbackTest : public SendStatisticsProxyTest { > : SendStatisticsProxyTest(field_trials) { > codec_info_.codecType = kVideoCodecVP8; > codec_info_.codecSpecific.VP8.temporalIdx = 0; >- codec_info_.codec_name = "fake_codec"; > encoded_image_._encodedWidth = kWidth; > encoded_image_._encodedHeight = kHeight; > encoded_image_.SetSpatialIndex(0); >@@ -2229,6 +2224,8 @@ class ForcedFallbackTest : public SendStatisticsProxyTest { > > protected: > void InsertEncodedFrames(int num_frames, int interval_ms) { >+ statistics_proxy_->OnEncoderImplementationChanged(codec_name_); >+ > // First frame is not updating stats, insert initial frame. 
> if (statistics_proxy_->GetStats().frames_encoded == 0) { > statistics_proxy_->OnSendEncodedImage(encoded_image_, &codec_info_); >@@ -2243,6 +2240,7 @@ class ForcedFallbackTest : public SendStatisticsProxyTest { > > EncodedImage encoded_image_; > CodecSpecificInfo codec_info_; >+ std::string codec_name_; > const std::string kPrefix = "WebRTC.Video.Encoder.ForcedSw"; > const int kFrameIntervalMs = 1000; > const int kMinFrames = 20; // Min run time 20 sec. >@@ -2321,7 +2319,7 @@ TEST_F(ForcedFallbackEnabled, EnteredLowResolutionNotSetIfNotLibvpx) { > } > > TEST_F(ForcedFallbackEnabled, EnteredLowResolutionSetIfLibvpx) { >- codec_info_.codec_name = "libvpx"; >+ codec_name_ = "libvpx"; > InsertEncodedFrames(1, kFrameIntervalMs); > EXPECT_TRUE(statistics_proxy_->GetStats().has_entered_low_resolution); > } >@@ -2333,7 +2331,7 @@ TEST_F(ForcedFallbackDisabled, EnteredLowResolutionNotSetIfAboveMaxPixels) { > } > > TEST_F(ForcedFallbackDisabled, EnteredLowResolutionNotSetIfLibvpx) { >- codec_info_.codec_name = "libvpx"; >+ codec_name_ = "libvpx"; > InsertEncodedFrames(1, kFrameIntervalMs); > EXPECT_FALSE(statistics_proxy_->GetStats().has_entered_low_resolution); > } >@@ -2351,7 +2349,7 @@ TEST_F(ForcedFallbackEnabled, OneFallbackEvent) { > EXPECT_FALSE(statistics_proxy_->GetStats().has_entered_low_resolution); > InsertEncodedFrames(15, 1000); > EXPECT_FALSE(statistics_proxy_->GetStats().has_entered_low_resolution); >- codec_info_.codec_name = "libvpx"; >+ codec_name_ = "libvpx"; > InsertEncodedFrames(5, 1000); > EXPECT_TRUE(statistics_proxy_->GetStats().has_entered_low_resolution); > >@@ -2369,18 +2367,18 @@ TEST_F(ForcedFallbackEnabled, ThreeFallbackEvents) { > // Three changes. Video: 60000 ms, fallback: 15000 ms (25%). 
> InsertEncodedFrames(10, 1000); > EXPECT_FALSE(statistics_proxy_->GetStats().has_entered_low_resolution); >- codec_info_.codec_name = "libvpx"; >+ codec_name_ = "libvpx"; > InsertEncodedFrames(15, 500); > EXPECT_TRUE(statistics_proxy_->GetStats().has_entered_low_resolution); >- codec_info_.codec_name = "notlibvpx"; >+ codec_name_ = "notlibvpx"; > InsertEncodedFrames(20, 1000); > InsertEncodedFrames(3, kMaxFrameDiffMs); // Should not be included. > InsertEncodedFrames(10, 1000); > EXPECT_TRUE(statistics_proxy_->GetStats().has_entered_low_resolution); >- codec_info_.codec_name = "notlibvpx2"; >+ codec_name_ = "notlibvpx2"; > InsertEncodedFrames(10, 500); > EXPECT_TRUE(statistics_proxy_->GetStats().has_entered_low_resolution); >- codec_info_.codec_name = "libvpx"; >+ codec_name_ = "libvpx"; > InsertEncodedFrames(15, 500); > EXPECT_TRUE(statistics_proxy_->GetStats().has_entered_low_resolution); > >@@ -2393,7 +2391,7 @@ TEST_F(ForcedFallbackEnabled, ThreeFallbackEvents) { > > TEST_F(ForcedFallbackEnabled, NoFallbackIfAboveMaxPixels) { > encoded_image_._encodedWidth = kWidth + 1; >- codec_info_.codec_name = "libvpx"; >+ codec_name_ = "libvpx"; > InsertEncodedFrames(kMinFrames, kFrameIntervalMs); > > EXPECT_FALSE(statistics_proxy_->GetStats().has_entered_low_resolution); >@@ -2404,7 +2402,7 @@ TEST_F(ForcedFallbackEnabled, NoFallbackIfAboveMaxPixels) { > > TEST_F(ForcedFallbackEnabled, FallbackIfAtMaxPixels) { > encoded_image_._encodedWidth = kWidth; >- codec_info_.codec_name = "libvpx"; >+ codec_name_ = "libvpx"; > InsertEncodedFrames(kMinFrames, kFrameIntervalMs); > > EXPECT_TRUE(statistics_proxy_->GetStats().has_entered_low_resolution); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/sv_loopback.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/sv_loopback.cc >index fd72c999124d155a5e404a9d6c22624149118d1b..1ff4956ca2950cebf179723044d9aed1003c95fe 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/sv_loopback.cc >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/video/sv_loopback.cc >@@ -12,7 +12,6 @@ > > #include "rtc_base/flags.h" > #include "rtc_base/logging.h" >-#include "rtc_base/stringencode.h" > #include "system_wrappers/include/field_trial.h" > #include "test/field_trial.h" > #include "test/gtest.h" >@@ -34,73 +33,79 @@ InterLayerPredMode IntToInterLayerPredMode(int inter_layer_pred) { > } > > // Flags for video. >-DEFINE_int(vwidth, 640, "Video width."); >+WEBRTC_DEFINE_int(vwidth, 640, "Video width."); > size_t VideoWidth() { > return static_cast<size_t>(FLAG_vwidth); > } > >-DEFINE_int(vheight, 480, "Video height."); >+WEBRTC_DEFINE_int(vheight, 480, "Video height."); > size_t VideoHeight() { > return static_cast<size_t>(FLAG_vheight); > } > >-DEFINE_int(vfps, 30, "Video frames per second."); >+WEBRTC_DEFINE_int(vfps, 30, "Video frames per second."); > int VideoFps() { > return static_cast<int>(FLAG_vfps); > } > >-DEFINE_int(capture_device_index, >- 0, >- "Capture device to select for video stream"); >+WEBRTC_DEFINE_int(capture_device_index, >+ 0, >+ "Capture device to select for video stream"); > size_t GetCaptureDevice() { > return static_cast<size_t>(FLAG_capture_device_index); > } > >-DEFINE_int(vtarget_bitrate, 400, "Video stream target bitrate in kbps."); >+WEBRTC_DEFINE_int(vtarget_bitrate, 400, "Video stream target bitrate in kbps."); > int VideoTargetBitrateKbps() { > return static_cast<int>(FLAG_vtarget_bitrate); > } > >-DEFINE_int(vmin_bitrate, 100, "Video stream min bitrate in kbps."); >+WEBRTC_DEFINE_int(vmin_bitrate, 100, "Video stream min bitrate in kbps."); > int VideoMinBitrateKbps() { > return static_cast<int>(FLAG_vmin_bitrate); > } > >-DEFINE_int(vmax_bitrate, 2000, "Video stream max bitrate in kbps."); >+WEBRTC_DEFINE_int(vmax_bitrate, 2000, "Video stream max bitrate in kbps."); > int VideoMaxBitrateKbps() { > return static_cast<int>(FLAG_vmax_bitrate); > } > >-DEFINE_bool(suspend_below_min_bitrate, >- false, >- "Suspends video below the configured 
min bitrate."); >+WEBRTC_DEFINE_bool(suspend_below_min_bitrate, >+ false, >+ "Suspends video below the configured min bitrate."); > >-DEFINE_int(vnum_temporal_layers, >- 1, >- "Number of temporal layers for video. Set to 1-4 to override."); >+WEBRTC_DEFINE_int( >+ vnum_temporal_layers, >+ 1, >+ "Number of temporal layers for video. Set to 1-4 to override."); > int VideoNumTemporalLayers() { > return static_cast<int>(FLAG_vnum_temporal_layers); > } > >-DEFINE_int(vnum_streams, 0, "Number of video streams to show or analyze."); >+WEBRTC_DEFINE_int(vnum_streams, >+ 0, >+ "Number of video streams to show or analyze."); > int VideoNumStreams() { > return static_cast<int>(FLAG_vnum_streams); > } > >-DEFINE_int(vnum_spatial_layers, 1, "Number of video spatial layers to use."); >+WEBRTC_DEFINE_int(vnum_spatial_layers, >+ 1, >+ "Number of video spatial layers to use."); > int VideoNumSpatialLayers() { > return static_cast<int>(FLAG_vnum_spatial_layers); > } > >-DEFINE_int(vinter_layer_pred, >- 2, >- "Video inter-layer prediction mode. " >- "0 - enabled, 1 - disabled, 2 - enabled only for key pictures."); >+WEBRTC_DEFINE_int( >+ vinter_layer_pred, >+ 2, >+ "Video inter-layer prediction mode. 
" >+ "0 - enabled, 1 - disabled, 2 - enabled only for key pictures."); > InterLayerPredMode VideoInterLayerPred() { > return IntToInterLayerPredMode(FLAG_vinter_layer_pred); > } > >-DEFINE_string( >+WEBRTC_DEFINE_string( > vstream0, > "", > "Comma separated values describing VideoStream for video stream #0."); >@@ -108,7 +113,7 @@ std::string VideoStream0() { > return static_cast<std::string>(FLAG_vstream0); > } > >-DEFINE_string( >+WEBRTC_DEFINE_string( > vstream1, > "", > "Comma separated values describing VideoStream for video stream #1."); >@@ -116,7 +121,7 @@ std::string VideoStream1() { > return static_cast<std::string>(FLAG_vstream1); > } > >-DEFINE_string( >+WEBRTC_DEFINE_string( > vsl0, > "", > "Comma separated values describing SpatialLayer for video layer #0."); >@@ -124,7 +129,7 @@ std::string VideoSL0() { > return static_cast<std::string>(FLAG_vsl0); > } > >-DEFINE_string( >+WEBRTC_DEFINE_string( > vsl1, > "", > "Comma separated values describing SpatialLayer for video layer #1."); >@@ -132,98 +137,105 @@ std::string VideoSL1() { > return static_cast<std::string>(FLAG_vsl1); > } > >-DEFINE_int(vselected_tl, >- -1, >- "Temporal layer to show or analyze for screenshare. -1 to disable " >- "filtering."); >+WEBRTC_DEFINE_int( >+ vselected_tl, >+ -1, >+ "Temporal layer to show or analyze for screenshare. -1 to disable " >+ "filtering."); > int VideoSelectedTL() { > return static_cast<int>(FLAG_vselected_tl); > } > >-DEFINE_int(vselected_stream, >- 0, >- "ID of the stream to show or analyze for screenshare." >- "Set to the number of streams to show them all."); >+WEBRTC_DEFINE_int(vselected_stream, >+ 0, >+ "ID of the stream to show or analyze for screenshare." >+ "Set to the number of streams to show them all."); > int VideoSelectedStream() { > return static_cast<int>(FLAG_vselected_stream); > } > >-DEFINE_int(vselected_sl, >- -1, >- "Spatial layer to show or analyze for screenshare. 
-1 to disable " >- "filtering."); >+WEBRTC_DEFINE_int( >+ vselected_sl, >+ -1, >+ "Spatial layer to show or analyze for screenshare. -1 to disable " >+ "filtering."); > int VideoSelectedSL() { > return static_cast<int>(FLAG_vselected_sl); > } > > // Flags for screenshare. >-DEFINE_int(min_transmit_bitrate, >- 400, >- "Min transmit bitrate incl. padding for screenshare."); >+WEBRTC_DEFINE_int(min_transmit_bitrate, >+ 400, >+ "Min transmit bitrate incl. padding for screenshare."); > int ScreenshareMinTransmitBitrateKbps() { > return FLAG_min_transmit_bitrate; > } > >-DEFINE_int(swidth, 1850, "Screenshare width (crops source)."); >+WEBRTC_DEFINE_int(swidth, 1850, "Screenshare width (crops source)."); > size_t ScreenshareWidth() { > return static_cast<size_t>(FLAG_swidth); > } > >-DEFINE_int(sheight, 1110, "Screenshare height (crops source)."); >+WEBRTC_DEFINE_int(sheight, 1110, "Screenshare height (crops source)."); > size_t ScreenshareHeight() { > return static_cast<size_t>(FLAG_sheight); > } > >-DEFINE_int(sfps, 5, "Frames per second for screenshare."); >+WEBRTC_DEFINE_int(sfps, 5, "Frames per second for screenshare."); > int ScreenshareFps() { > return static_cast<int>(FLAG_sfps); > } > >-DEFINE_int(starget_bitrate, 100, "Screenshare stream target bitrate in kbps."); >+WEBRTC_DEFINE_int(starget_bitrate, >+ 100, >+ "Screenshare stream target bitrate in kbps."); > int ScreenshareTargetBitrateKbps() { > return static_cast<int>(FLAG_starget_bitrate); > } > >-DEFINE_int(smin_bitrate, 100, "Screenshare stream min bitrate in kbps."); >+WEBRTC_DEFINE_int(smin_bitrate, 100, "Screenshare stream min bitrate in kbps."); > int ScreenshareMinBitrateKbps() { > return static_cast<int>(FLAG_smin_bitrate); > } > >-DEFINE_int(smax_bitrate, 2000, "Screenshare stream max bitrate in kbps."); >+WEBRTC_DEFINE_int(smax_bitrate, >+ 2000, >+ "Screenshare stream max bitrate in kbps."); > int ScreenshareMaxBitrateKbps() { > return static_cast<int>(FLAG_smax_bitrate); > } > 
>-DEFINE_int(snum_temporal_layers, >- 2, >- "Number of temporal layers to use in screenshare."); >+WEBRTC_DEFINE_int(snum_temporal_layers, >+ 2, >+ "Number of temporal layers to use in screenshare."); > int ScreenshareNumTemporalLayers() { > return static_cast<int>(FLAG_snum_temporal_layers); > } > >-DEFINE_int(snum_streams, >- 0, >- "Number of screenshare streams to show or analyze."); >+WEBRTC_DEFINE_int(snum_streams, >+ 0, >+ "Number of screenshare streams to show or analyze."); > int ScreenshareNumStreams() { > return static_cast<int>(FLAG_snum_streams); > } > >-DEFINE_int(snum_spatial_layers, >- 1, >- "Number of screenshare spatial layers to use."); >+WEBRTC_DEFINE_int(snum_spatial_layers, >+ 1, >+ "Number of screenshare spatial layers to use."); > int ScreenshareNumSpatialLayers() { > return static_cast<int>(FLAG_snum_spatial_layers); > } > >-DEFINE_int(sinter_layer_pred, >- 0, >- "Screenshare inter-layer prediction mode. " >- "0 - enabled, 1 - disabled, 2 - enabled only for key pictures."); >+WEBRTC_DEFINE_int( >+ sinter_layer_pred, >+ 0, >+ "Screenshare inter-layer prediction mode. 
" >+ "0 - enabled, 1 - disabled, 2 - enabled only for key pictures."); > InterLayerPredMode ScreenshareInterLayerPred() { > return IntToInterLayerPredMode(FLAG_sinter_layer_pred); > } > >-DEFINE_string( >+WEBRTC_DEFINE_string( > sstream0, > "", > "Comma separated values describing VideoStream for screenshare stream #0."); >@@ -231,7 +243,7 @@ std::string ScreenshareStream0() { > return static_cast<std::string>(FLAG_sstream0); > } > >-DEFINE_string( >+WEBRTC_DEFINE_string( > sstream1, > "", > "Comma separated values describing VideoStream for screenshare stream #1."); >@@ -239,7 +251,7 @@ std::string ScreenshareStream1() { > return static_cast<std::string>(FLAG_sstream1); > } > >-DEFINE_string( >+WEBRTC_DEFINE_string( > ssl0, > "", > "Comma separated values describing SpatialLayer for screenshare layer #0."); >@@ -247,7 +259,7 @@ std::string ScreenshareSL0() { > return static_cast<std::string>(FLAG_ssl0); > } > >-DEFINE_string( >+WEBRTC_DEFINE_string( > ssl1, > "", > "Comma separated values describing SpatialLayer for screenshare layer #1."); >@@ -255,31 +267,33 @@ std::string ScreenshareSL1() { > return static_cast<std::string>(FLAG_ssl1); > } > >-DEFINE_int(sselected_tl, >- -1, >- "Temporal layer to show or analyze for screenshare. -1 to disable " >- "filtering."); >+WEBRTC_DEFINE_int( >+ sselected_tl, >+ -1, >+ "Temporal layer to show or analyze for screenshare. -1 to disable " >+ "filtering."); > int ScreenshareSelectedTL() { > return static_cast<int>(FLAG_sselected_tl); > } > >-DEFINE_int(sselected_stream, >- 0, >- "ID of the stream to show or analyze for screenshare." >- "Set to the number of streams to show them all."); >+WEBRTC_DEFINE_int(sselected_stream, >+ 0, >+ "ID of the stream to show or analyze for screenshare." >+ "Set to the number of streams to show them all."); > int ScreenshareSelectedStream() { > return static_cast<int>(FLAG_sselected_stream); > } > >-DEFINE_int(sselected_sl, >- -1, >- "Spatial layer to show or analyze for screenshare. 
-1 to disable " >- "filtering."); >+WEBRTC_DEFINE_int( >+ sselected_sl, >+ -1, >+ "Spatial layer to show or analyze for screenshare. -1 to disable " >+ "filtering."); > int ScreenshareSelectedSL() { > return static_cast<int>(FLAG_sselected_sl); > } > >-DEFINE_bool( >+WEBRTC_DEFINE_bool( > generate_slides, > false, > "Whether to use randomly generated slides or read them from files."); >@@ -287,14 +301,14 @@ bool GenerateSlides() { > return static_cast<int>(FLAG_generate_slides); > } > >-DEFINE_int(slide_change_interval, >- 10, >- "Interval (in seconds) between simulated slide changes."); >+WEBRTC_DEFINE_int(slide_change_interval, >+ 10, >+ "Interval (in seconds) between simulated slide changes."); > int SlideChangeInterval() { > return static_cast<int>(FLAG_slide_change_interval); > } > >-DEFINE_int( >+WEBRTC_DEFINE_int( > scroll_duration, > 0, > "Duration (in seconds) during which a slide will be scrolled into place."); >@@ -302,9 +316,10 @@ int ScrollDuration() { > return static_cast<int>(FLAG_scroll_duration); > } > >-DEFINE_string(slides, >- "", >- "Comma-separated list of *.yuv files to display as slides."); >+WEBRTC_DEFINE_string( >+ slides, >+ "", >+ "Comma-separated list of *.yuv files to display as slides."); > std::vector<std::string> Slides() { > std::vector<std::string> slides; > std::string slides_list = FLAG_slides; >@@ -313,31 +328,31 @@ std::vector<std::string> Slides() { > } > > // Flags common with screenshare and video loopback, with equal default values. 
>-DEFINE_int(start_bitrate, 600, "Call start bitrate in kbps."); >+WEBRTC_DEFINE_int(start_bitrate, 600, "Call start bitrate in kbps."); > int StartBitrateKbps() { > return static_cast<int>(FLAG_start_bitrate); > } > >-DEFINE_string(codec, "VP8", "Video codec to use."); >+WEBRTC_DEFINE_string(codec, "VP8", "Video codec to use."); > std::string Codec() { > return static_cast<std::string>(FLAG_codec); > } > >-DEFINE_bool(analyze_video, >- false, >- "Analyze video stream (if --duration is present)"); >+WEBRTC_DEFINE_bool(analyze_video, >+ false, >+ "Analyze video stream (if --duration is present)"); > bool AnalyzeVideo() { > return static_cast<bool>(FLAG_analyze_video); > } > >-DEFINE_bool(analyze_screenshare, >- false, >- "Analyze screenshare stream (if --duration is present)"); >+WEBRTC_DEFINE_bool(analyze_screenshare, >+ false, >+ "Analyze screenshare stream (if --duration is present)"); > bool AnalyzeScreenshare() { > return static_cast<bool>(FLAG_analyze_screenshare); > } > >-DEFINE_int( >+WEBRTC_DEFINE_int( > duration, > 0, > "Duration of the test in seconds. 
If 0, rendered will be shown instead."); >@@ -345,100 +360,113 @@ int DurationSecs() { > return static_cast<int>(FLAG_duration); > } > >-DEFINE_string(output_filename, "", "Target graph data filename."); >+WEBRTC_DEFINE_string(output_filename, "", "Target graph data filename."); > std::string OutputFilename() { > return static_cast<std::string>(FLAG_output_filename); > } > >-DEFINE_string(graph_title, >- "", >- "If empty, title will be generated automatically."); >+WEBRTC_DEFINE_string(graph_title, >+ "", >+ "If empty, title will be generated automatically."); > std::string GraphTitle() { > return static_cast<std::string>(FLAG_graph_title); > } > >-DEFINE_int(loss_percent, 0, "Percentage of packets randomly lost."); >+WEBRTC_DEFINE_int(loss_percent, 0, "Percentage of packets randomly lost."); > int LossPercent() { > return static_cast<int>(FLAG_loss_percent); > } > >-DEFINE_int(avg_burst_loss_length, -1, "Average burst length of lost packets."); >+WEBRTC_DEFINE_int(avg_burst_loss_length, >+ -1, >+ "Average burst length of lost packets."); > int AvgBurstLossLength() { > return static_cast<int>(FLAG_avg_burst_loss_length); > } > >-DEFINE_int(link_capacity, >- 0, >- "Capacity (kbps) of the fake link. 0 means infinite."); >+WEBRTC_DEFINE_int(link_capacity, >+ 0, >+ "Capacity (kbps) of the fake link. 
0 means infinite."); > int LinkCapacityKbps() { > return static_cast<int>(FLAG_link_capacity); > } > >-DEFINE_int(queue_size, 0, "Size of the bottleneck link queue in packets."); >+WEBRTC_DEFINE_int(queue_size, >+ 0, >+ "Size of the bottleneck link queue in packets."); > int QueueSize() { > return static_cast<int>(FLAG_queue_size); > } > >-DEFINE_int(avg_propagation_delay_ms, >- 0, >- "Average link propagation delay in ms."); >+WEBRTC_DEFINE_int(avg_propagation_delay_ms, >+ 0, >+ "Average link propagation delay in ms."); > int AvgPropagationDelayMs() { > return static_cast<int>(FLAG_avg_propagation_delay_ms); > } > >-DEFINE_string(rtc_event_log_name, >- "", >- "Filename for rtc event log. Two files " >- "with \"_send\" and \"_recv\" suffixes will be created. " >- "Works only when --duration is set."); >+WEBRTC_DEFINE_string(rtc_event_log_name, >+ "", >+ "Filename for rtc event log. Two files " >+ "with \"_send\" and \"_recv\" suffixes will be created. " >+ "Works only when --duration is set."); > std::string RtcEventLogName() { > return static_cast<std::string>(FLAG_rtc_event_log_name); > } > >-DEFINE_string(rtp_dump_name, "", "Filename for dumped received RTP stream."); >+WEBRTC_DEFINE_string(rtp_dump_name, >+ "", >+ "Filename for dumped received RTP stream."); > std::string RtpDumpName() { > return static_cast<std::string>(FLAG_rtp_dump_name); > } > >-DEFINE_int(std_propagation_delay_ms, >- 0, >- "Link propagation delay standard deviation in ms."); >+WEBRTC_DEFINE_int(std_propagation_delay_ms, >+ 0, >+ "Link propagation delay standard deviation in ms."); > int StdPropagationDelayMs() { > return static_cast<int>(FLAG_std_propagation_delay_ms); > } > >-DEFINE_string(encoded_frame_path, >- "", >- "The base path for encoded frame logs. Created files will have " >- "the form <encoded_frame_path>.<n>.(recv|send.<m>).ivf"); >+WEBRTC_DEFINE_string( >+ encoded_frame_path, >+ "", >+ "The base path for encoded frame logs. 
Created files will have " >+ "the form <encoded_frame_path>.<n>.(recv|send.<m>).ivf"); > std::string EncodedFramePath() { > return static_cast<std::string>(FLAG_encoded_frame_path); > } > >-DEFINE_bool(logs, false, "print logs to stderr"); >+WEBRTC_DEFINE_bool(logs, false, "print logs to stderr"); > >-DEFINE_bool(send_side_bwe, true, "Use send-side bandwidth estimation"); >+WEBRTC_DEFINE_bool(send_side_bwe, true, "Use send-side bandwidth estimation"); > >-DEFINE_bool(generic_descriptor, false, "Use the generic frame descriptor."); >+WEBRTC_DEFINE_bool(generic_descriptor, >+ false, >+ "Use the generic frame descriptor."); > >-DEFINE_bool(allow_reordering, false, "Allow packet reordering to occur"); >+WEBRTC_DEFINE_bool(allow_reordering, false, "Allow packet reordering to occur"); > >-DEFINE_bool(use_ulpfec, false, "Use RED+ULPFEC forward error correction."); >+WEBRTC_DEFINE_bool(use_ulpfec, >+ false, >+ "Use RED+ULPFEC forward error correction."); > >-DEFINE_bool(use_flexfec, false, "Use FlexFEC forward error correction."); >+WEBRTC_DEFINE_bool(use_flexfec, false, "Use FlexFEC forward error correction."); > >-DEFINE_bool(audio, false, "Add audio stream"); >+WEBRTC_DEFINE_bool(audio, false, "Add audio stream"); > >-DEFINE_bool(audio_video_sync, >- false, >- "Sync audio and video stream (no effect if" >- " audio is false)"); >+WEBRTC_DEFINE_bool(audio_video_sync, >+ false, >+ "Sync audio and video stream (no effect if" >+ " audio is false)"); > >-DEFINE_bool(audio_dtx, false, "Enable audio DTX (no effect if audio is false)"); >+WEBRTC_DEFINE_bool(audio_dtx, >+ false, >+ "Enable audio DTX (no effect if audio is false)"); > >-DEFINE_bool(video, true, "Add video stream"); >+WEBRTC_DEFINE_bool(video, true, "Add video stream"); > >-DEFINE_string( >+WEBRTC_DEFINE_string( > force_fieldtrials, > "", > "Field trials control experimental feature code which can be forced. " >@@ -447,15 +475,16 @@ DEFINE_string( > "trials are separated by \"/\""); > > // Video-specific flags. 
>-DEFINE_string(vclip, >- "", >- "Name of the clip to show. If empty, the camera is used. Use " >- "\"Generator\" for chroma generator."); >+WEBRTC_DEFINE_string( >+ vclip, >+ "", >+ "Name of the clip to show. If empty, the camera is used. Use " >+ "\"Generator\" for chroma generator."); > std::string VideoClip() { > return static_cast<std::string>(FLAG_vclip); > } > >-DEFINE_bool(help, false, "prints this message"); >+WEBRTC_DEFINE_bool(help, false, "prints this message"); > > } // namespace flags > >@@ -475,7 +504,7 @@ void Loopback() { > screenshare_idx = 0; > } > >- DefaultNetworkSimulationConfig pipe_config; >+ BuiltInNetworkBehaviorConfig pipe_config; > pipe_config.loss_percent = flags::LossPercent(); > pipe_config.avg_burst_loss_length = flags::AvgBurstLossLength(); > pipe_config.link_capacity_kbps = flags::LinkCapacityKbps(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/transport_adapter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/video/transport_adapter.h >index 8ec53fdf25a916c71029c7deae24b5887fe272af..786858b0e21c74103410ec4feb2d44ec4899ae9a 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/transport_adapter.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/transport_adapter.h >@@ -13,7 +13,6 @@ > #include <atomic> > > #include "api/call/transport.h" >-#include "common_types.h" // NOLINT(build/include) > > namespace webrtc { > namespace internal { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_analyzer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_analyzer.cc >index 05abe1fcf3b240b229e23d2dd02137411ae2d831..1fd22ff8060b094a5dd8541b89d4b5b4e8c2e60e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_analyzer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_analyzer.cc >@@ -18,17 +18,18 @@ > #include "rtc_base/flags.h" > #include "rtc_base/format_macros.h" > #include "rtc_base/memory_usage.h" >-#include "rtc_base/pathutils.h" > #include 
"system_wrappers/include/cpu_info.h" > #include "test/call_test.h" >+#include "test/testsupport/fileutils.h" > #include "test/testsupport/frame_writer.h" > #include "test/testsupport/perf_test.h" > #include "test/testsupport/test_artifacts.h" > >-DEFINE_bool(save_worst_frame, >- false, >- "Enable saving a frame with the lowest PSNR to a jpeg file in the " >- "test_artifacts_dir"); >+WEBRTC_DEFINE_bool( >+ save_worst_frame, >+ false, >+ "Enable saving a frame with the lowest PSNR to a jpeg file in the " >+ "test_artifacts_dir"); > > namespace webrtc { > namespace { >@@ -70,7 +71,6 @@ VideoAnalyzer::VideoAnalyzer(test::LayerFilteringTransport* transport, > selected_stream_(selected_stream), > selected_sl_(selected_sl), > selected_tl_(selected_tl), >- pre_encode_proxy_(this), > last_fec_bytes_(0), > frames_to_process_(duration_frames), > frames_recorded_(0), >@@ -91,7 +91,6 @@ VideoAnalyzer::VideoAnalyzer(test::LayerFilteringTransport* transport, > avg_ssim_threshold_(avg_ssim_threshold), > is_quick_test_enabled_(is_quick_test_enabled), > stats_polling_thread_(&PollStatsThread, this, "StatsPoller"), >- comparison_available_event_(false, false), > done_(true, false), > clock_(clock), > start_ms_(clock->TimeInMilliseconds()) { >@@ -231,10 +230,10 @@ void VideoAnalyzer::PreEncodeOnFrame(const VideoFrame& video_frame) { > } > } > >-void VideoAnalyzer::EncodedFrameCallback(const EncodedFrame& encoded_frame) { >+void VideoAnalyzer::PostEncodeOnFrame(size_t stream_id, uint32_t timestamp) { > rtc::CritScope lock(&crit_); >- if (!first_sent_timestamp_ && encoded_frame.stream_id_ == selected_stream_) { >- first_sent_timestamp_ = encoded_frame.timestamp_; >+ if (!first_sent_timestamp_ && stream_id == selected_stream_) { >+ first_sent_timestamp_ = timestamp; > } > } > >@@ -366,10 +365,6 @@ void VideoAnalyzer::Wait() { > stats_polling_thread_.Stop(); > } > >-rtc::VideoSinkInterface<VideoFrame>* VideoAnalyzer::pre_encode_proxy() { >- return &pre_encode_proxy_; >-} >- > void 
VideoAnalyzer::StartMeasuringCpuProcessTime() { > rtc::CritScope lock(&cpu_measurement_lock_); > cpu_time_ -= rtc::GetProcessCpuTimeNanos(); >@@ -611,7 +606,7 @@ void VideoAnalyzer::PrintResults() { > std::string output_dir; > test::GetTestArtifactsDir(&output_dir); > std::string output_path = >- rtc::Pathname(output_dir, test_label_ + ".jpg").pathname(); >+ test::JoinFilename(output_dir, test_label_ + ".jpg"); > RTC_LOG(LS_INFO) << "Saving worst frame to " << output_path; > test::JpegFrameWriter frame_writer(output_path); > RTC_CHECK( >@@ -841,13 +836,6 @@ VideoAnalyzer::Sample::Sample(int dropped, > psnr(psnr), > ssim(ssim) {} > >-VideoAnalyzer::PreEncodeProxy::PreEncodeProxy(VideoAnalyzer* parent) >- : parent_(parent) {} >- >-void VideoAnalyzer::PreEncodeProxy::OnFrame(const VideoFrame& video_frame) { >- parent_->PreEncodeOnFrame(video_frame); >-} >- > VideoAnalyzer::CapturedFrameForwarder::CapturedFrameForwarder( > VideoAnalyzer* analyzer, > Clock* clock) >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_analyzer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_analyzer.h >index 681d1f464f481c6e7baf3f1ff61fdcae2f40d7cd..d2c4fa84cda3a15b37e6007698b7bcc9f97fae7b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_analyzer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_analyzer.h >@@ -25,8 +25,7 @@ namespace webrtc { > > class VideoAnalyzer : public PacketReceiver, > public Transport, >- public rtc::VideoSinkInterface<VideoFrame>, >- public EncodedFrameObserver { >+ public rtc::VideoSinkInterface<VideoFrame> { > public: > VideoAnalyzer(test::LayerFilteringTransport* transport, > const std::string& test_label, >@@ -61,9 +60,7 @@ class VideoAnalyzer : public PacketReceiver, > int64_t packet_time_us) override; > > void PreEncodeOnFrame(const VideoFrame& video_frame); >- >- // EncodedFrameObserver implementation, wired to post_encode_callback. 
>- void EncodedFrameCallback(const EncodedFrame& encoded_frame) override; >+ void PostEncodeOnFrame(size_t stream_id, uint32_t timestamp); > > bool SendRtp(const uint8_t* packet, > size_t length, >@@ -73,8 +70,6 @@ class VideoAnalyzer : public PacketReceiver, > void OnFrame(const VideoFrame& video_frame) override; > void Wait(); > >- rtc::VideoSinkInterface<VideoFrame>* pre_encode_proxy(); >- > void StartMeasuringCpuProcessTime(); > void StopMeasuringCpuProcessTime(); > void StartExcludingCpuThreadTime(); >@@ -132,17 +127,6 @@ class VideoAnalyzer : public PacketReceiver, > double ssim; > }; > >- // This class receives the send-side OnFrame callback and is provided to not >- // conflict with the receiver-side renderer callback. >- class PreEncodeProxy : public rtc::VideoSinkInterface<VideoFrame> { >- public: >- explicit PreEncodeProxy(VideoAnalyzer* parent); >- void OnFrame(const VideoFrame& video_frame) override; >- >- private: >- VideoAnalyzer* const parent_; >- }; >- > // Implements VideoSinkInterface to receive captured frames from a > // FrameGeneratorCapturer. Implements VideoSourceInterface to be able to act > // as a source to VideoSendStream. >@@ -221,7 +205,6 @@ class VideoAnalyzer : public PacketReceiver, > const size_t selected_stream_; > const int selected_sl_; > const int selected_tl_; >- PreEncodeProxy pre_encode_proxy_; > > rtc::CriticalSection comparison_lock_; > std::vector<Sample> samples_ RTC_GUARDED_BY(comparison_lock_); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_loopback.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_loopback.cc >index 57590ed144954756e9097366a7c09a76eb03c1f1..4cdddb94e5285850d93d93b098ec813d72b0e363 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_loopback.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_loopback.cc >@@ -22,61 +22,62 @@ namespace webrtc { > namespace flags { > > // Flags common with screenshare loopback, with different default values. 
>-DEFINE_int(width, 640, "Video width."); >+WEBRTC_DEFINE_int(width, 640, "Video width."); > size_t Width() { > return static_cast<size_t>(FLAG_width); > } > >-DEFINE_int(height, 480, "Video height."); >+WEBRTC_DEFINE_int(height, 480, "Video height."); > size_t Height() { > return static_cast<size_t>(FLAG_height); > } > >-DEFINE_int(fps, 30, "Frames per second."); >+WEBRTC_DEFINE_int(fps, 30, "Frames per second."); > int Fps() { > return static_cast<int>(FLAG_fps); > } > >-DEFINE_int(capture_device_index, 0, "Capture device to select"); >+WEBRTC_DEFINE_int(capture_device_index, 0, "Capture device to select"); > size_t GetCaptureDevice() { > return static_cast<size_t>(FLAG_capture_device_index); > } > >-DEFINE_int(min_bitrate, 50, "Call and stream min bitrate in kbps."); >+WEBRTC_DEFINE_int(min_bitrate, 50, "Call and stream min bitrate in kbps."); > int MinBitrateKbps() { > return static_cast<int>(FLAG_min_bitrate); > } > >-DEFINE_int(start_bitrate, 300, "Call start bitrate in kbps."); >+WEBRTC_DEFINE_int(start_bitrate, 300, "Call start bitrate in kbps."); > int StartBitrateKbps() { > return static_cast<int>(FLAG_start_bitrate); > } > >-DEFINE_int(target_bitrate, 800, "Stream target bitrate in kbps."); >+WEBRTC_DEFINE_int(target_bitrate, 800, "Stream target bitrate in kbps."); > int TargetBitrateKbps() { > return static_cast<int>(FLAG_target_bitrate); > } > >-DEFINE_int(max_bitrate, 800, "Call and stream max bitrate in kbps."); >+WEBRTC_DEFINE_int(max_bitrate, 800, "Call and stream max bitrate in kbps."); > int MaxBitrateKbps() { > return static_cast<int>(FLAG_max_bitrate); > } > >-DEFINE_bool(suspend_below_min_bitrate, >- false, >- "Suspends video below the configured min bitrate."); >+WEBRTC_DEFINE_bool(suspend_below_min_bitrate, >+ false, >+ "Suspends video below the configured min bitrate."); > >-DEFINE_int(num_temporal_layers, >- 1, >- "Number of temporal layers. 
Set to 1-4 to override."); >+WEBRTC_DEFINE_int(num_temporal_layers, >+ 1, >+ "Number of temporal layers. Set to 1-4 to override."); > int NumTemporalLayers() { > return static_cast<int>(FLAG_num_temporal_layers); > } > >-DEFINE_int(inter_layer_pred, >- 2, >- "Inter-layer prediction mode. " >- "0 - enabled, 1 - disabled, 2 - enabled only for key pictures."); >+WEBRTC_DEFINE_int( >+ inter_layer_pred, >+ 2, >+ "Inter-layer prediction mode. " >+ "0 - enabled, 1 - disabled, 2 - enabled only for key pictures."); > InterLayerPredMode InterLayerPred() { > if (FLAG_inter_layer_pred == 0) { > return InterLayerPredMode::kOn; >@@ -89,19 +90,20 @@ InterLayerPredMode InterLayerPred() { > } > > // Flags common with screenshare loopback, with equal default values. >-DEFINE_string(codec, "VP8", "Video codec to use."); >+WEBRTC_DEFINE_string(codec, "VP8", "Video codec to use."); > std::string Codec() { > return static_cast<std::string>(FLAG_codec); > } > >-DEFINE_int(selected_tl, >- -1, >- "Temporal layer to show or analyze. -1 to disable filtering."); >+WEBRTC_DEFINE_int( >+ selected_tl, >+ -1, >+ "Temporal layer to show or analyze. -1 to disable filtering."); > int SelectedTL() { > return static_cast<int>(FLAG_selected_tl); > } > >-DEFINE_int( >+WEBRTC_DEFINE_int( > duration, > 0, > "Duration of the test in seconds. 
If 0, rendered will be shown instead."); >@@ -109,177 +111,196 @@ int DurationSecs() { > return static_cast<int>(FLAG_duration); > } > >-DEFINE_string(output_filename, "", "Target graph data filename."); >+WEBRTC_DEFINE_string(output_filename, "", "Target graph data filename."); > std::string OutputFilename() { > return static_cast<std::string>(FLAG_output_filename); > } > >-DEFINE_string(graph_title, >- "", >- "If empty, title will be generated automatically."); >+WEBRTC_DEFINE_string(graph_title, >+ "", >+ "If empty, title will be generated automatically."); > std::string GraphTitle() { > return static_cast<std::string>(FLAG_graph_title); > } > >-DEFINE_int(loss_percent, 0, "Percentage of packets randomly lost."); >+WEBRTC_DEFINE_int(loss_percent, 0, "Percentage of packets randomly lost."); > int LossPercent() { > return static_cast<int>(FLAG_loss_percent); > } > >-DEFINE_int(avg_burst_loss_length, -1, "Average burst length of lost packets."); >+WEBRTC_DEFINE_int(avg_burst_loss_length, >+ -1, >+ "Average burst length of lost packets."); > int AvgBurstLossLength() { > return static_cast<int>(FLAG_avg_burst_loss_length); > } > >-DEFINE_int(link_capacity, >- 0, >- "Capacity (kbps) of the fake link. 0 means infinite."); >+WEBRTC_DEFINE_int(link_capacity, >+ 0, >+ "Capacity (kbps) of the fake link. 
0 means infinite."); > int LinkCapacityKbps() { > return static_cast<int>(FLAG_link_capacity); > } > >-DEFINE_int(queue_size, 0, "Size of the bottleneck link queue in packets."); >+WEBRTC_DEFINE_int(queue_size, >+ 0, >+ "Size of the bottleneck link queue in packets."); > int QueueSize() { > return static_cast<int>(FLAG_queue_size); > } > >-DEFINE_int(avg_propagation_delay_ms, >- 0, >- "Average link propagation delay in ms."); >+WEBRTC_DEFINE_int(avg_propagation_delay_ms, >+ 0, >+ "Average link propagation delay in ms."); > int AvgPropagationDelayMs() { > return static_cast<int>(FLAG_avg_propagation_delay_ms); > } > >-DEFINE_string(rtc_event_log_name, >- "", >- "Filename for rtc event log. Two files " >- "with \"_send\" and \"_recv\" suffixes will be created."); >+WEBRTC_DEFINE_string(rtc_event_log_name, >+ "", >+ "Filename for rtc event log. Two files " >+ "with \"_send\" and \"_recv\" suffixes will be created."); > std::string RtcEventLogName() { > return static_cast<std::string>(FLAG_rtc_event_log_name); > } > >-DEFINE_string(rtp_dump_name, "", "Filename for dumped received RTP stream."); >+WEBRTC_DEFINE_string(rtp_dump_name, >+ "", >+ "Filename for dumped received RTP stream."); > std::string RtpDumpName() { > return static_cast<std::string>(FLAG_rtp_dump_name); > } > >-DEFINE_int(std_propagation_delay_ms, >- 0, >- "Link propagation delay standard deviation in ms."); >+WEBRTC_DEFINE_int(std_propagation_delay_ms, >+ 0, >+ "Link propagation delay standard deviation in ms."); > int StdPropagationDelayMs() { > return static_cast<int>(FLAG_std_propagation_delay_ms); > } > >-DEFINE_int(num_streams, 0, "Number of streams to show or analyze."); >+WEBRTC_DEFINE_int(num_streams, 0, "Number of streams to show or analyze."); > int NumStreams() { > return static_cast<int>(FLAG_num_streams); > } > >-DEFINE_int(selected_stream, >- 0, >- "ID of the stream to show or analyze. 
" >- "Set to the number of streams to show them all."); >+WEBRTC_DEFINE_int(selected_stream, >+ 0, >+ "ID of the stream to show or analyze. " >+ "Set to the number of streams to show them all."); > int SelectedStream() { > return static_cast<int>(FLAG_selected_stream); > } > >-DEFINE_int(num_spatial_layers, 1, "Number of spatial layers to use."); >+WEBRTC_DEFINE_int(num_spatial_layers, 1, "Number of spatial layers to use."); > int NumSpatialLayers() { > return static_cast<int>(FLAG_num_spatial_layers); > } > >-DEFINE_int(selected_sl, >- -1, >- "Spatial layer to show or analyze. -1 to disable filtering."); >+WEBRTC_DEFINE_int(selected_sl, >+ -1, >+ "Spatial layer to show or analyze. -1 to disable filtering."); > int SelectedSL() { > return static_cast<int>(FLAG_selected_sl); > } > >-DEFINE_string(stream0, >- "", >- "Comma separated values describing VideoStream for stream #0."); >+WEBRTC_DEFINE_string( >+ stream0, >+ "", >+ "Comma separated values describing VideoStream for stream #0."); > std::string Stream0() { > return static_cast<std::string>(FLAG_stream0); > } > >-DEFINE_string(stream1, >- "", >- "Comma separated values describing VideoStream for stream #1."); >+WEBRTC_DEFINE_string( >+ stream1, >+ "", >+ "Comma separated values describing VideoStream for stream #1."); > std::string Stream1() { > return static_cast<std::string>(FLAG_stream1); > } > >-DEFINE_string(sl0, >- "", >- "Comma separated values describing SpatialLayer for layer #0."); >+WEBRTC_DEFINE_string( >+ sl0, >+ "", >+ "Comma separated values describing SpatialLayer for layer #0."); > std::string SL0() { > return static_cast<std::string>(FLAG_sl0); > } > >-DEFINE_string(sl1, >- "", >- "Comma separated values describing SpatialLayer for layer #1."); >+WEBRTC_DEFINE_string( >+ sl1, >+ "", >+ "Comma separated values describing SpatialLayer for layer #1."); > std::string SL1() { > return static_cast<std::string>(FLAG_sl1); > } > >-DEFINE_string(encoded_frame_path, >- "", >- "The base path for encoded 
frame logs. Created files will have " >- "the form <encoded_frame_path>.<n>.(recv|send.<m>).ivf"); >+WEBRTC_DEFINE_string( >+ encoded_frame_path, >+ "", >+ "The base path for encoded frame logs. Created files will have " >+ "the form <encoded_frame_path>.<n>.(recv|send.<m>).ivf"); > std::string EncodedFramePath() { > return static_cast<std::string>(FLAG_encoded_frame_path); > } > >-DEFINE_bool(logs, false, "print logs to stderr"); >+WEBRTC_DEFINE_bool(logs, false, "print logs to stderr"); > >-DEFINE_bool(send_side_bwe, true, "Use send-side bandwidth estimation"); >+WEBRTC_DEFINE_bool(send_side_bwe, true, "Use send-side bandwidth estimation"); > >-DEFINE_bool(generic_descriptor, false, "Use the generic frame descriptor."); >+WEBRTC_DEFINE_bool(generic_descriptor, >+ false, >+ "Use the generic frame descriptor."); > >-DEFINE_bool(allow_reordering, false, "Allow packet reordering to occur"); >+WEBRTC_DEFINE_bool(allow_reordering, false, "Allow packet reordering to occur"); > >-DEFINE_bool(use_ulpfec, false, "Use RED+ULPFEC forward error correction."); >+WEBRTC_DEFINE_bool(use_ulpfec, >+ false, >+ "Use RED+ULPFEC forward error correction."); > >-DEFINE_bool(use_flexfec, false, "Use FlexFEC forward error correction."); >+WEBRTC_DEFINE_bool(use_flexfec, false, "Use FlexFEC forward error correction."); > >-DEFINE_bool(audio, false, "Add audio stream"); >+WEBRTC_DEFINE_bool(audio, false, "Add audio stream"); > >-DEFINE_bool(use_real_adm, >- false, >- "Use real ADM instead of fake (no effect if audio is false)"); >+WEBRTC_DEFINE_bool( >+ use_real_adm, >+ false, >+ "Use real ADM instead of fake (no effect if audio is false)"); > >-DEFINE_bool(audio_video_sync, >- false, >- "Sync audio and video stream (no effect if" >- " audio is false)"); >+WEBRTC_DEFINE_bool(audio_video_sync, >+ false, >+ "Sync audio and video stream (no effect if" >+ " audio is false)"); > >-DEFINE_bool(audio_dtx, false, "Enable audio DTX (no effect if audio is false)"); >+WEBRTC_DEFINE_bool(audio_dtx, >+ 
false, >+ "Enable audio DTX (no effect if audio is false)"); > >-DEFINE_bool(video, true, "Add video stream"); >+WEBRTC_DEFINE_bool(video, true, "Add video stream"); > >-DEFINE_string( >+WEBRTC_DEFINE_string( > force_fieldtrials, > "", > "Field trials control experimental feature code which can be forced. " >- "E.g. running with --force_fieldtrials=WebRTC-FooFeature/Enable/" >+ "E.g. running with --force_fieldtrials=WebRTC-FooFeature/Enabled/" > " will assign the group Enable to field trial WebRTC-FooFeature. Multiple " > "trials are separated by \"/\""); > > // Video-specific flags. >-DEFINE_string(clip, >- "", >- "Name of the clip to show. If empty, using chroma generator."); >+WEBRTC_DEFINE_string( >+ clip, >+ "", >+ "Name of the clip to show. If empty, using chroma generator."); > std::string Clip() { > return static_cast<std::string>(FLAG_clip); > } > >-DEFINE_bool(help, false, "prints this message"); >+WEBRTC_DEFINE_bool(help, false, "prints this message"); > > } // namespace flags > > void Loopback() { >- DefaultNetworkSimulationConfig pipe_config; >+ BuiltInNetworkBehaviorConfig pipe_config; > pipe_config.loss_percent = flags::LossPercent(); > pipe_config.avg_burst_loss_length = flags::AvgBurstLossLength(); > pipe_config.link_capacity_kbps = flags::LinkCapacityKbps(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_quality_observer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_quality_observer.cc >index b05c3535e0ce717152321f4a6b0a6fa514478d09..c4a574abfe0e60a182879c5349f1336878ef4359 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_quality_observer.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_quality_observer.cc >@@ -34,7 +34,9 @@ const int kBlockyQpThresholdVp9 = 60; // TODO(ilnik): tune this value. 
> > VideoQualityObserver::VideoQualityObserver(VideoContentType content_type) > : last_frame_decoded_ms_(-1), >+ last_frame_rendered_ms_(-1), > num_frames_decoded_(0), >+ num_frames_rendered_(0), > first_frame_decoded_ms_(-1), > last_frame_pixels_(0), > last_frame_qp_(0), >@@ -119,6 +121,46 @@ void VideoQualityObserver::UpdateHistograms() { > RTC_LOG(LS_INFO) << log_stream.str(); > } > >+void VideoQualityObserver::OnRenderedFrame(int64_t now_ms) { >+ if (num_frames_rendered_ == 0) { >+ last_unfreeze_time_ = now_ms; >+ } >+ >+ ++num_frames_rendered_; >+ >+ if (!is_paused_ && num_frames_rendered_ > 1) { >+ // Process inter-frame delay. >+ int64_t interframe_delay_ms = now_ms - last_frame_rendered_ms_; >+ render_interframe_delays_.Add(interframe_delay_ms); >+ absl::optional<int> avg_interframe_delay = >+ render_interframe_delays_.Avg(kMinFrameSamplesToDetectFreeze); >+ // Check if it was a freeze. >+ if (avg_interframe_delay && >+ interframe_delay_ms >= >+ std::max(3 * *avg_interframe_delay, >+ *avg_interframe_delay + kMinIncreaseForFreezeMs)) { >+ freezes_durations_.Add(interframe_delay_ms); >+ smooth_playback_durations_.Add(last_frame_rendered_ms_ - >+ last_unfreeze_time_); >+ last_unfreeze_time_ = now_ms; >+ } >+ } >+ >+ if (is_paused_) { >+ // If the stream was paused since the previous frame, do not count the >+ // pause toward smooth playback. Explicitly count the part before it and >+ // start the new smooth playback interval from this frame. 
>+ is_paused_ = false; >+ if (last_frame_rendered_ms_ > last_unfreeze_time_) { >+ smooth_playback_durations_.Add(last_frame_rendered_ms_ - >+ last_unfreeze_time_); >+ } >+ last_unfreeze_time_ = now_ms; >+ } >+ >+ last_frame_rendered_ms_ = now_ms; >+} >+ > void VideoQualityObserver::OnDecodedFrame(absl::optional<uint8_t> qp, > int width, > int height, >@@ -126,7 +168,6 @@ void VideoQualityObserver::OnDecodedFrame(absl::optional<uint8_t> qp, > VideoCodecType codec) { > if (num_frames_decoded_ == 0) { > first_frame_decoded_ms_ = now_ms; >- last_unfreeze_time_ = now_ms; > } > > ++num_frames_decoded_; >@@ -134,21 +175,14 @@ void VideoQualityObserver::OnDecodedFrame(absl::optional<uint8_t> qp, > if (!is_paused_ && num_frames_decoded_ > 1) { > // Process inter-frame delay. > int64_t interframe_delay_ms = now_ms - last_frame_decoded_ms_; >- interframe_delays_.Add(interframe_delay_ms); >+ decode_interframe_delays_.Add(interframe_delay_ms); > absl::optional<int> avg_interframe_delay = >- interframe_delays_.Avg(kMinFrameSamplesToDetectFreeze); >- // Check if it was a freeze. >- if (avg_interframe_delay && >- interframe_delay_ms >= >+ decode_interframe_delays_.Avg(kMinFrameSamplesToDetectFreeze); >+ // Count spatial metrics if there were no freeze. >+ if (!avg_interframe_delay || >+ interframe_delay_ms < > std::max(3 * *avg_interframe_delay, > *avg_interframe_delay + kMinIncreaseForFreezeMs)) { >- freezes_durations_.Add(interframe_delay_ms); >- smooth_playback_durations_.Add(last_frame_decoded_ms_ - >- last_unfreeze_time_); >- last_unfreeze_time_ = now_ms; >- } else { >- // Only count inter-frame delay as playback time if there >- // was no freeze. > time_in_resolution_ms_[current_resolution_] += interframe_delay_ms; > absl::optional<int> qp_blocky_threshold; > // TODO(ilnik): add other codec types when we have QP for them. 
>@@ -168,18 +202,6 @@ void VideoQualityObserver::OnDecodedFrame(absl::optional<uint8_t> qp, > } > } > >- if (is_paused_) { >- // If the stream was paused since the previous frame, do not count the >- // pause toward smooth playback. Explicitly count the part before it and >- // start the new smooth playback interval from this frame. >- is_paused_ = false; >- if (last_frame_decoded_ms_ > last_unfreeze_time_) { >- smooth_playback_durations_.Add(last_frame_decoded_ms_ - >- last_unfreeze_time_); >- } >- last_unfreeze_time_ = now_ms; >- } >- > int64_t pixels = width * height; > if (pixels >= kPixelsInHighResolution) { > current_resolution_ = Resolution::High; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_quality_observer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_quality_observer.h >index 8ba4af39842a8de010683e1cda7448c1f6d3dbb1..74f201e8c45f53f567ee0c9759f8598cb7344f7d 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_quality_observer.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_quality_observer.h >@@ -15,8 +15,8 @@ > #include <vector> > > #include "absl/types/optional.h" >+#include "api/video/video_codec_type.h" > #include "api/video/video_content_type.h" >-#include "common_types.h" // NOLINT(build/include) > #include "rtc_base/numerics/sample_counter.h" > > namespace webrtc { >@@ -35,6 +35,7 @@ class VideoQualityObserver { > int height, > int64_t now_ms, > VideoCodecType codec); >+ void OnRenderedFrame(int64_t now_ms); > > void OnStreamInactive(); > >@@ -48,13 +49,16 @@ class VideoQualityObserver { > }; > > int64_t last_frame_decoded_ms_; >+ int64_t last_frame_rendered_ms_; > int64_t num_frames_decoded_; >+ int64_t num_frames_rendered_; > int64_t first_frame_decoded_ms_; > int64_t last_frame_pixels_; > uint8_t last_frame_qp_; > // Decoded timestamp of the last delayed frame. 
> int64_t last_unfreeze_time_; >- rtc::SampleCounter interframe_delays_; >+ rtc::SampleCounter render_interframe_delays_; >+ rtc::SampleCounter decode_interframe_delays_; > // An inter-frame delay is counted as a freeze if it's significantly longer > // than average inter-frame delay. > rtc::SampleCounter freezes_durations_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_quality_test.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_quality_test.cc >index 2a3be672d76068a141543e890ba1d8b61b5bc9ab..d6ccb6522e5832177fc5b485329dadd91a027672 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_quality_test.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_quality_test.cc >@@ -16,6 +16,7 @@ > #include <string> > #include <vector> > >+#include "api/video/builtin_video_bitrate_allocator_factory.h" > #include "call/fake_network_pipe.h" > #include "call/simulated_network.h" > #include "logging/rtc_event_log/output/rtc_event_log_output_file.h" >@@ -35,7 +36,7 @@ > #include "test/run_loop.h" > #include "test/testsupport/fileutils.h" > #include "test/video_renderer.h" >-#include "video/video_analyzer.h" >+#include "video/frame_dumping_decoder.h" > #ifdef WEBRTC_WIN > #include "modules/audio_device/include/audio_device_factory.h" > #endif >@@ -75,61 +76,16 @@ class VideoStreamFactory > std::vector<VideoStream> streams_; > }; > >-// A decoder wrapper that writes the encoded frames to a file. >-class FrameDumpingDecoder : public VideoDecoder { >+// This wrapper provides two features needed by the video quality tests: >+// 1. Invoke VideoAnalyzer callbacks before and after encoding each frame. >+// 2. Write the encoded frames to file, one file per simulcast layer. 
>+class QualityTestVideoEncoder : public VideoEncoder, >+ private EncodedImageCallback { > public: >- FrameDumpingDecoder(std::unique_ptr<VideoDecoder> decoder, >- rtc::PlatformFile file) >- : decoder_(std::move(decoder)), >- writer_(IvfFileWriter::Wrap(rtc::File(file), >- /* byte_limit= */ 100000000)) {} >- >- int32_t InitDecode(const VideoCodec* codec_settings, >- int32_t number_of_cores) override { >- return decoder_->InitDecode(codec_settings, number_of_cores); >- } >- >- int32_t Decode(const EncodedImage& input_image, >- bool missing_frames, >- const CodecSpecificInfo* codec_specific_info, >- int64_t render_time_ms) override { >- int32_t ret = decoder_->Decode(input_image, missing_frames, >- codec_specific_info, render_time_ms); >- writer_->WriteFrame(input_image, codec_specific_info->codecType); >- >- return ret; >- } >- >- int32_t RegisterDecodeCompleteCallback( >- DecodedImageCallback* callback) override { >- return decoder_->RegisterDecodeCompleteCallback(callback); >- } >- >- int32_t Release() override { return decoder_->Release(); } >- >- // Returns true if the decoder prefer to decode frames late. >- // That is, it can not decode infinite number of frames before the decoded >- // frame is consumed. >- bool PrefersLateDecoding() const override { >- return decoder_->PrefersLateDecoding(); >- } >- >- const char* ImplementationName() const override { >- return decoder_->ImplementationName(); >- } >- >- private: >- std::unique_ptr<VideoDecoder> decoder_; >- std::unique_ptr<IvfFileWriter> writer_; >-}; >- >-// An encoder wrapper that writes the encoded frames to file, one per simulcast >-// layer. 
>-class FrameDumpingEncoder : public VideoEncoder, private EncodedImageCallback { >- public: >- FrameDumpingEncoder(std::unique_ptr<VideoEncoder> encoder, >- std::vector<rtc::PlatformFile> files) >- : encoder_(std::move(encoder)) { >+ QualityTestVideoEncoder(std::unique_ptr<VideoEncoder> encoder, >+ VideoAnalyzer* analyzer, >+ std::vector<rtc::PlatformFile> files) >+ : encoder_(std::move(encoder)), analyzer_(analyzer) { > for (rtc::PlatformFile file : files) { > writers_.push_back( > IvfFileWriter::Wrap(rtc::File(file), /* byte_limit= */ 100000000)); >@@ -151,11 +107,11 @@ class FrameDumpingEncoder : public VideoEncoder, private EncodedImageCallback { > int32_t Encode(const VideoFrame& frame, > const CodecSpecificInfo* codec_specific_info, > const std::vector<FrameType>* frame_types) { >+ if (analyzer_) { >+ analyzer_->PreEncodeOnFrame(frame); >+ } > return encoder_->Encode(frame, codec_specific_info, frame_types); > } >- int32_t SetChannelParameters(uint32_t packet_loss, int64_t rtt) override { >- return encoder_->SetChannelParameters(packet_loss, rtt); >- } > int32_t SetRates(uint32_t bitrate, uint32_t framerate) override { > return encoder_->SetRates(bitrate, framerate); > } >@@ -163,14 +119,8 @@ class FrameDumpingEncoder : public VideoEncoder, private EncodedImageCallback { > uint32_t framerate) override { > return encoder_->SetRateAllocation(allocation, framerate); > } >- ScalingSettings GetScalingSettings() const override { >- return encoder_->GetScalingSettings(); >- } >- bool SupportsNativeHandle() const override { >- return encoder_->SupportsNativeHandle(); >- } >- const char* ImplementationName() const override { >- return encoder_->ImplementationName(); >+ EncoderInfo GetEncoderInfo() const override { >+ return encoder_->GetEncoderInfo(); > } > > private: >@@ -186,6 +136,10 @@ class FrameDumpingEncoder : public VideoEncoder, private EncodedImageCallback { > simulcast_index = encoded_image.SpatialIndex().value_or(0); > } > RTC_DCHECK_GE(simulcast_index, 
0); >+ if (analyzer_) { >+ analyzer_->PostEncodeOnFrame(simulcast_index, >+ encoded_image.Timestamp()); >+ } > if (static_cast<size_t>(simulcast_index) < writers_.size()) { > writers_[simulcast_index]->WriteFrame(encoded_image, > codec_specific_info->codecType); >@@ -202,6 +156,7 @@ class FrameDumpingEncoder : public VideoEncoder, private EncodedImageCallback { > > std::unique_ptr<VideoEncoder> encoder_; > EncodedImageCallback* callback_ = nullptr; >+ VideoAnalyzer* const analyzer_; > std::vector<std::unique_ptr<IvfFileWriter>> writers_; > }; > >@@ -228,7 +183,8 @@ std::unique_ptr<VideoDecoder> VideoQualityTest::CreateVideoDecoder( > } > > std::unique_ptr<VideoEncoder> VideoQualityTest::CreateVideoEncoder( >- const SdpVideoFormat& format) { >+ const SdpVideoFormat& format, >+ VideoAnalyzer* analyzer) { > std::unique_ptr<VideoEncoder> encoder; > if (format.name == "VP8") { > encoder = absl::make_unique<VP8EncoderSimulcastProxy>( >@@ -245,12 +201,17 @@ std::unique_ptr<VideoEncoder> VideoQualityTest::CreateVideoEncoder( > sb << send_logs_++; > std::string prefix = > params_.logging.encoded_frame_base_path + "." 
+ sb.str() + ".send."; >- encoder = absl::make_unique<FrameDumpingEncoder>( >- std::move(encoder), std::vector<rtc::PlatformFile>( >- {rtc::CreatePlatformFile(prefix + "1.ivf"), >- rtc::CreatePlatformFile(prefix + "2.ivf"), >- rtc::CreatePlatformFile(prefix + "3.ivf")})); >+ encoder = absl::make_unique<QualityTestVideoEncoder>( >+ std::move(encoder), analyzer, >+ std::vector<rtc::PlatformFile>( >+ {rtc::CreatePlatformFile(prefix + "1.ivf"), >+ rtc::CreatePlatformFile(prefix + "2.ivf"), >+ rtc::CreatePlatformFile(prefix + "3.ivf")})); >+ } else if (analyzer) { >+ encoder = absl::make_unique<QualityTestVideoEncoder>( >+ std::move(encoder), analyzer, std::vector<rtc::PlatformFile>()); > } >+ > return encoder; > } > >@@ -261,11 +222,18 @@ VideoQualityTest::VideoQualityTest( > return this->CreateVideoDecoder(format); > }), > video_encoder_factory_([this](const SdpVideoFormat& format) { >- return this->CreateVideoEncoder(format); >+ return this->CreateVideoEncoder(format, nullptr); > }), >+ video_encoder_factory_with_analyzer_( >+ [this](const SdpVideoFormat& format) { >+ return this->CreateVideoEncoder(format, analyzer_.get()); >+ }), >+ video_bitrate_allocator_factory_( >+ CreateBuiltinVideoBitrateAllocatorFactory()), > receive_logs_(0), > send_logs_(0), >- injection_components_(std::move(injection_components)) { >+ injection_components_(std::move(injection_components)), >+ num_video_streams_(0) { > if (injection_components_ == nullptr) { > injection_components_ = absl::make_unique<InjectionComponents>(); > } >@@ -294,7 +262,6 @@ VideoQualityTest::Params::Params() > audio({false, false, false, false}), > screenshare{{false, false, 10, 0}, {false, false, 10, 0}}, > analyzer({"", 0.0, 0.0, 0, "", ""}), >- pipe(), > config(absl::nullopt), > ss{{std::vector<VideoStream>(), 0, 0, -1, InterLayerPredMode::kOn, > std::vector<SpatialLayer>()}, >@@ -331,9 +298,7 @@ void VideoQualityTest::CheckParamsAndInjectionComponents() { > } > if (!params_.config && 
injection_components_->sender_network == nullptr && > injection_components_->receiver_network == nullptr) { >- // TODO(titovartem) replace with default config creation when removing >- // pipe. >- params_.config = params_.pipe; >+ params_.config = BuiltInNetworkBehaviorConfig(); > } > RTC_CHECK( > (params_.config && injection_components_->sender_network == nullptr && >@@ -594,7 +559,10 @@ void VideoQualityTest::SetupVideo(Transport* send_transport, > return; > } > video_send_configs_[video_idx].encoder_settings.encoder_factory = >- &video_encoder_factory_; >+ (video_idx == 0) ? &video_encoder_factory_with_analyzer_ >+ : &video_encoder_factory_; >+ video_send_configs_[video_idx].encoder_settings.bitrate_allocator_factory = >+ video_bitrate_allocator_factory_.get(); > > video_send_configs_[video_idx].rtp.payload_name = > params_.video[video_idx].codec; >@@ -702,6 +670,10 @@ void VideoQualityTest::SetupVideo(Transport* send_transport, > vp9_settings.numberOfSpatialLayers = static_cast<unsigned char>( > params_.ss[video_idx].num_spatial_layers); > vp9_settings.interLayerPred = params_.ss[video_idx].inter_layer_pred; >+ // High FPS vp9 screenshare requires flexible mode. >+ if (params_.video[video_idx].fps > 5) { >+ vp9_settings.flexibleMode = true; >+ } > video_encoder_configs_[video_idx].encoder_specific_settings = > new rtc::RefCountedObject< > VideoEncoderConfig::Vp9EncoderSpecificSettings>(vp9_settings); >@@ -783,6 +755,8 @@ void VideoQualityTest::SetupThumbnails(Transport* send_transport, > // TODO(nisse): Could use a simpler VP8-only encoder factory. 
> thumbnail_send_config.encoder_settings.encoder_factory = > &video_encoder_factory_; >+ thumbnail_send_config.encoder_settings.bitrate_allocator_factory = >+ video_bitrate_allocator_factory_.get(); > thumbnail_send_config.rtp.payload_name = params_.video[0].codec; > thumbnail_send_config.rtp.payload_type = kPayloadTypeVP8; > thumbnail_send_config.rtp.nack.rtp_history_ms = kNackRtpHistoryMs; >@@ -1043,7 +1017,6 @@ void VideoQualityTest::RunWithAnalyzer(const Params& params) { > std::unique_ptr<test::LayerFilteringTransport> send_transport; > std::unique_ptr<test::DirectTransport> recv_transport; > FILE* graph_data_output_file = nullptr; >- std::unique_ptr<VideoAnalyzer> analyzer; > > params_ = params; > // TODO(ivica): Merge with RunWithRenderer and use a flag / argument to >@@ -1100,7 +1073,7 @@ void VideoQualityTest::RunWithAnalyzer(const Params& params) { > if (graph_title.empty()) > graph_title = VideoQualityTest::GenerateGraphTitle(); > bool is_quick_test_enabled = field_trial::IsEnabled("WebRTC-QuickPerfTest"); >- analyzer = absl::make_unique<VideoAnalyzer>( >+ analyzer_ = absl::make_unique<VideoAnalyzer>( > send_transport.get(), params_.analyzer.test_label, > params_.analyzer.avg_psnr_threshold, params_.analyzer.avg_ssim_threshold, > is_quick_test_enabled >@@ -1114,26 +1087,23 @@ void VideoQualityTest::RunWithAnalyzer(const Params& params) { > is_quick_test_enabled, clock_, params_.logging.rtp_dump_name); > > task_queue_.SendTask([&]() { >- analyzer->SetCall(sender_call_.get()); >- analyzer->SetReceiver(receiver_call_->Receiver()); >- send_transport->SetReceiver(analyzer.get()); >+ analyzer_->SetCall(sender_call_.get()); >+ analyzer_->SetReceiver(receiver_call_->Receiver()); >+ send_transport->SetReceiver(analyzer_.get()); > recv_transport->SetReceiver(sender_call_->Receiver()); > >- SetupVideo(analyzer.get(), recv_transport.get()); >- SetupThumbnails(analyzer.get(), recv_transport.get()); >+ SetupVideo(analyzer_.get(), recv_transport.get()); >+ 
SetupThumbnails(analyzer_.get(), recv_transport.get()); > video_receive_configs_[params_.ss[0].selected_stream].renderer = >- analyzer.get(); >- GetVideoSendConfig()->pre_encode_callback = analyzer->pre_encode_proxy(); >- RTC_DCHECK(!GetVideoSendConfig()->post_encode_callback); >- GetVideoSendConfig()->post_encode_callback = analyzer.get(); >+ analyzer_.get(); > > CreateFlexfecStreams(); > CreateVideoStreams(); >- analyzer->SetSendStream(video_send_streams_[0]); >+ analyzer_->SetSendStream(video_send_streams_[0]); > if (video_receive_streams_.size() == 1) >- analyzer->SetReceiveStream(video_receive_streams_[0]); >+ analyzer_->SetReceiveStream(video_receive_streams_[0]); > >- GetVideoSendStream()->SetSource(analyzer->OutputInterface(), >+ GetVideoSendStream()->SetSource(analyzer_->OutputInterface(), > degradation_preference_); > SetupThumbnailCapturers(params_.call.num_thumbnails); > for (size_t i = 0; i < thumbnail_send_streams_.size(); ++i) { >@@ -1143,7 +1113,8 @@ void VideoQualityTest::RunWithAnalyzer(const Params& params) { > > CreateCapturers(); > >- analyzer->SetSource(video_capturers_[0].get(), params_.ss[0].infer_streams); >+ analyzer_->SetSource(video_capturers_[0].get(), >+ params_.ss[0].infer_streams); > > for (size_t video_idx = 1; video_idx < num_video_streams_; ++video_idx) { > video_send_streams_[video_idx]->SetSource( >@@ -1153,16 +1124,16 @@ void VideoQualityTest::RunWithAnalyzer(const Params& params) { > if (params_.audio.enabled) { > SetupAudio(send_transport.get()); > StartAudioStreams(); >- analyzer->SetAudioReceiveStream(audio_receive_streams_[0]); >+ analyzer_->SetAudioReceiveStream(audio_receive_streams_[0]); > } > StartVideoStreams(); > StartThumbnails(); >- analyzer->StartMeasuringCpuProcessTime(); >+ analyzer_->StartMeasuringCpuProcessTime(); > StartVideoCapture(); > StartThumbnailCapture(); > }); > >- analyzer->Wait(); >+ analyzer_->Wait(); > > task_queue_.SendTask([&]() { > StopThumbnailCapture(); >@@ -1181,6 +1152,7 @@ void 
VideoQualityTest::RunWithAnalyzer(const Params& params) { > > DestroyCalls(); > }); >+ analyzer_ = nullptr; > } > > rtc::scoped_refptr<AudioDeviceModule> VideoQualityTest::CreateAudioDevice() { >@@ -1239,7 +1211,8 @@ void VideoQualityTest::InitializeAudioDevice(Call::Config* send_call_config, > } > > void VideoQualityTest::SetupAudio(Transport* transport) { >- AudioSendStream::Config audio_send_config(transport); >+ AudioSendStream::Config audio_send_config(transport, >+ /*media_transport=*/nullptr); > audio_send_config.rtp.ssrc = kAudioSendSsrc; > > // Add extension to enable audio send side BWE, and allow audio bit rate >@@ -1278,7 +1251,28 @@ void VideoQualityTest::RunWithRenderers(const Params& params) { > std::unique_ptr<test::DirectTransport> recv_transport; > std::unique_ptr<test::VideoRenderer> local_preview; > std::vector<std::unique_ptr<test::VideoRenderer>> loopback_renderers; >- RtcEventLogNullImpl null_event_log; >+ >+ if (!params.logging.rtc_event_log_name.empty()) { >+ send_event_log_ = RtcEventLog::Create(RtcEventLog::EncodingType::Legacy); >+ recv_event_log_ = RtcEventLog::Create(RtcEventLog::EncodingType::Legacy); >+ std::unique_ptr<RtcEventLogOutputFile> send_output( >+ absl::make_unique<RtcEventLogOutputFile>( >+ params.logging.rtc_event_log_name + "_send", >+ RtcEventLog::kUnlimitedOutput)); >+ std::unique_ptr<RtcEventLogOutputFile> recv_output( >+ absl::make_unique<RtcEventLogOutputFile>( >+ params.logging.rtc_event_log_name + "_recv", >+ RtcEventLog::kUnlimitedOutput)); >+ bool event_log_started = >+ send_event_log_->StartLogging(std::move(send_output), >+ /*output_period_ms=*/5000) && >+ recv_event_log_->StartLogging(std::move(recv_output), >+ /*output_period_ms=*/5000); >+ RTC_DCHECK(event_log_started); >+ } else { >+ send_event_log_ = RtcEventLog::CreateNull(); >+ recv_event_log_ = RtcEventLog::CreateNull(); >+ } > > task_queue_.SendTask([&]() { > params_ = params; >@@ -1286,9 +1280,9 @@ void VideoQualityTest::RunWithRenderers(const 
Params& params) { > > // TODO(ivica): Remove bitrate_config and use the default Call::Config(), to > // match the full stack tests. >- Call::Config send_call_config(&null_event_log); >+ Call::Config send_call_config(send_event_log_.get()); > send_call_config.bitrate_config = params_.call.call_bitrate_config; >- Call::Config recv_call_config(&null_event_log); >+ Call::Config recv_call_config(recv_event_log_.get()); > > if (params_.audio.enabled) > InitializeAudioDevice( >@@ -1364,7 +1358,7 @@ void VideoQualityTest::RunWithRenderers(const Params& params) { > Start(); > }); > >- test::PressEnterToContinue(); >+ test::PressEnterToContinue(task_queue_); > > task_queue_.SendTask([&]() { > Stop(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_quality_test.h b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_quality_test.h >index 7ec7e8da7e44b627ffba871124a01c558de72c7b..f67e65bc13797428c0cc7cc61ca678a526870ebc 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_quality_test.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_quality_test.h >@@ -17,12 +17,14 @@ > > #include "api/fec_controller.h" > #include "api/test/video_quality_test_fixture.h" >+#include "api/video/video_bitrate_allocator_factory.h" > #include "call/fake_network_pipe.h" > #include "media/engine/internaldecoderfactory.h" > #include "media/engine/internalencoderfactory.h" > #include "test/call_test.h" > #include "test/frame_generator.h" > #include "test/layer_filtering_transport.h" >+#include "video/video_analyzer.h" > #ifdef WEBRTC_WIN > #include "modules/audio_device/win/core_audio_utility_win.h" > #endif >@@ -75,8 +77,8 @@ class VideoQualityTest : > void SetupThumbnailCapturers(size_t num_thumbnail_streams); > std::unique_ptr<VideoDecoder> CreateVideoDecoder( > const SdpVideoFormat& format); >- std::unique_ptr<VideoEncoder> CreateVideoEncoder( >- const SdpVideoFormat& format); >+ std::unique_ptr<VideoEncoder> CreateVideoEncoder(const SdpVideoFormat& 
format, >+ VideoAnalyzer* analyzer); > void SetupVideo(Transport* send_transport, Transport* recv_transport); > void SetupThumbnails(Transport* send_transport, Transport* recv_transport); > void StartAudioStreams(); >@@ -103,6 +105,9 @@ class VideoQualityTest : > test::FunctionVideoDecoderFactory video_decoder_factory_; > InternalDecoderFactory internal_decoder_factory_; > test::FunctionVideoEncoderFactory video_encoder_factory_; >+ test::FunctionVideoEncoderFactory video_encoder_factory_with_analyzer_; >+ std::unique_ptr<VideoBitrateAllocatorFactory> >+ video_bitrate_allocator_factory_; > InternalEncoderFactory internal_encoder_factory_; > std::vector<VideoSendStream::Config> thumbnail_send_configs_; > std::vector<VideoEncoderConfig> thumbnail_encoder_configs_; >@@ -116,6 +121,9 @@ class VideoQualityTest : > Params params_; > std::unique_ptr<InjectionComponents> injection_components_; > >+ // Set non-null when running with analyzer. >+ std::unique_ptr<VideoAnalyzer> analyzer_; >+ > // Note: not same as similarly named member in CallTest. This is the number of > // separate send streams, the one in CallTest is the number of substreams for > // a single send stream. 
>diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_receive_stream.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_receive_stream.cc >index 86b93686ca5f9733920456d9b7189eb380fe45d5..bba33aa363e8a695dac6cf4480cbdc4777615300 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_receive_stream.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_receive_stream.cc >@@ -21,7 +21,6 @@ > #include "api/video_codecs/video_decoder_factory.h" > #include "call/rtp_stream_receiver_controller_interface.h" > #include "call/rtx_receive_stream.h" >-#include "common_types.h" // NOLINT(build/include) > #include "common_video/h264/profile_level_id.h" > #include "common_video/include/incoming_video_stream.h" > #include "common_video/libyuv/include/webrtc_libyuv.h" >@@ -35,10 +34,13 @@ > #include "rtc_base/checks.h" > #include "rtc_base/location.h" > #include "rtc_base/logging.h" >+#include "rtc_base/platform_file.h" >+#include "rtc_base/strings/string_builder.h" > #include "rtc_base/trace_event.h" > #include "system_wrappers/include/clock.h" > #include "system_wrappers/include/field_trial.h" > #include "video/call_stats.h" >+#include "video/frame_dumping_decoder.h" > #include "video/receive_statistics_proxy.h" > > namespace webrtc { >@@ -107,6 +109,11 @@ class NullVideoDecoder : public webrtc::VideoDecoder { > const char* ImplementationName() const override { return "NullVideoDecoder"; } > }; > >+// TODO(https://bugs.webrtc.org/9974): Consider removing this workaround. >+// Maximum time between frames before resetting the FrameBuffer to avoid RTP >+// timestamps wraparound to affect FrameBuffer. >+constexpr int kInactiveStreamThresholdMs = 600000; // 10 minutes. 
>+ > } // namespace > > namespace internal { >@@ -128,10 +135,11 @@ VideoReceiveStream::VideoReceiveStream( > "DecodingThread", > rtc::kHighestPriority), > call_stats_(call_stats), >- rtp_receive_statistics_(ReceiveStatistics::Create(clock_)), >- timing_(new VCMTiming(clock_)), >- video_receiver_(clock_, nullptr, timing_.get(), this, this), > stats_proxy_(&config_, clock_), >+ rtp_receive_statistics_( >+ ReceiveStatistics::Create(clock_, &stats_proxy_, &stats_proxy_)), >+ timing_(new VCMTiming(clock_)), >+ video_receiver_(clock_, timing_.get(), this, this), > rtp_video_stream_receiver_(&transport_adapter_, > call_stats, > packet_router, >@@ -139,9 +147,10 @@ VideoReceiveStream::VideoReceiveStream( > rtp_receive_statistics_.get(), > &stats_proxy_, > process_thread_, >- this, // NackSender >- this, // KeyFrameRequestSender >- this), // OnCompleteFrameCallback >+ this, // NackSender >+ this, // KeyFrameRequestSender >+ this, // OnCompleteFrameCallback >+ config_.frame_decryptor), > rtp_stream_sync_(this) { > RTC_LOG(LS_INFO) << "VideoReceiveStream: " << config_.ToString(); > >@@ -245,6 +254,18 @@ void VideoReceiveStream::Start() { > if (!video_decoder) { > video_decoder = absl::make_unique<NullVideoDecoder>(); > } >+ >+ std::string decoded_output_file = >+ field_trial::FindFullName("WebRTC-DecoderDataDumpDirectory"); >+ if (!decoded_output_file.empty()) { >+ char filename_buffer[256]; >+ rtc::SimpleStringBuilder ssb(filename_buffer); >+ ssb << decoded_output_file << "/webrtc_receive_stream_" >+ << this->config_.rtp.local_ssrc << ".ivf"; >+ video_decoder = absl::make_unique<FrameDumpingDecoder>( >+ std::move(video_decoder), rtc::CreatePlatformFile(ssb.str())); >+ } >+ > video_decoders_.push_back(std::move(video_decoder)); > > video_receiver_.RegisterExternalDecoder(video_decoders_.back().get(), >@@ -357,6 +378,14 @@ void VideoReceiveStream::OnCompleteFrame( > // partially enabled inter-layer prediction. 
> frame->id.spatial_layer = 0; > >+ // TODO(https://bugs.webrtc.org/9974): Consider removing this workaround. >+ int64_t time_now_ms = rtc::TimeMillis(); >+ if (last_complete_frame_time_ms_ > 0 && >+ time_now_ms - last_complete_frame_time_ms_ > kInactiveStreamThresholdMs) { >+ frame_buffer_->Clear(); >+ } >+ last_complete_frame_time_ms_ = time_now_ms; >+ > int64_t last_continuous_pid = frame_buffer_->InsertFrame(std::move(frame)); > if (last_continuous_pid != -1) > rtp_video_stream_receiver_.FrameContinuous(last_continuous_pid); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_receive_stream.h b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_receive_stream.h >index ae5ced51c3b7b19fea184326b0912df97aba088e..5c9fd83c62a9ed1153bb20e5108d4eb3b3b2ba85 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_receive_stream.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_receive_stream.h >@@ -17,7 +17,6 @@ > #include "call/rtp_packet_sink_interface.h" > #include "call/syncable.h" > #include "call/video_receive_stream.h" >-#include "common_video/libyuv/include/webrtc_libyuv.h" > #include "modules/rtp_rtcp/include/flexfec_receiver.h" > #include "modules/video_coding/frame_buffer2.h" > #include "modules/video_coding/video_coding_impl.h" >@@ -115,6 +114,7 @@ class VideoReceiveStream : public webrtc::VideoReceiveStream, > > CallStats* const call_stats_; > >+ ReceiveStatisticsProxy stats_proxy_; > // Shared by media and rtx stream receivers, since the latter has no RtpRtcp > // module of its own. > const std::unique_ptr<ReceiveStatistics> rtp_receive_statistics_; >@@ -122,7 +122,6 @@ class VideoReceiveStream : public webrtc::VideoReceiveStream, > std::unique_ptr<VCMTiming> timing_; // Jitter buffer experiment. 
> vcm::VideoReceiver video_receiver_; > std::unique_ptr<rtc::VideoSinkInterface<VideoFrame>> incoming_video_stream_; >- ReceiveStatisticsProxy stats_proxy_; > RtpVideoStreamReceiver rtp_video_stream_receiver_; > std::unique_ptr<VideoStreamDecoder> video_stream_decoder_; > RtpStreamsSynchronizer rtp_stream_sync_; >@@ -147,6 +146,7 @@ class VideoReceiveStream : public webrtc::VideoReceiveStream, > bool frame_decoded_ = false; > > int64_t last_keyframe_request_ms_ = 0; >+ int64_t last_complete_frame_time_ms_ = 0; > }; > } // namespace internal > } // namespace webrtc >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_receive_stream_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_receive_stream_unittest.cc >index e4ba69b7b92be4bef077dcb6c9b24c7acef0f67a..0ec9e1773384d3e72b661ce468ae5794d56a774b 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_receive_stream_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_receive_stream_unittest.cc >@@ -118,7 +118,7 @@ TEST_F(VideoReceiveStreamTest, CreateFrameFromH264FmtpSpropAndIdr) { > rtppacket.SetPayloadType(99); > rtppacket.SetSequenceNumber(1); > rtppacket.SetTimestamp(0); >- rtc::Event init_decode_event_(false, false); >+ rtc::Event init_decode_event_; > EXPECT_CALL(mock_h264_video_decoder_, InitDecode(_, _)) > .WillOnce(Invoke([&init_decode_event_](const VideoCodec* config, > int32_t number_of_cores) { >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_send_stream.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_send_stream.cc >index 054d27e421c00179d2cfad96cf018a2a8cfa11ce..059a6150af9b9ce3ed4f879ac4c97abf41223c67 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_send_stream.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_send_stream.cc >@@ -15,12 +15,15 @@ > #include "modules/rtp_rtcp/source/rtp_header_extension_size.h" > #include "modules/rtp_rtcp/source/rtp_sender.h" > #include 
"rtc_base/logging.h" >+#include "system_wrappers/include/field_trial.h" > #include "video/video_send_stream_impl.h" > > namespace webrtc { > > namespace { > >+constexpr char kTargetBitrateRtcpFieldTrial[] = "WebRTC-Target-Bitrate-Rtcp"; >+ > size_t CalculateMaxHeaderSize(const RtpConfig& config) { > size_t header_size = kRtpHeaderSize; > size_t extensions_size = 0; >@@ -62,7 +65,7 @@ VideoSendStream::VideoSendStream( > rtc::TaskQueue* worker_queue, > CallStats* call_stats, > RtpTransportControllerSendInterface* transport, >- BitrateAllocator* bitrate_allocator, >+ BitrateAllocatorInterface* bitrate_allocator, > SendDelayStats* send_delay_stats, > RtcEventLog* event_log, > VideoSendStream::Config config, >@@ -71,13 +74,13 @@ VideoSendStream::VideoSendStream( > const std::map<uint32_t, RtpPayloadState>& suspended_payload_states, > std::unique_ptr<FecController> fec_controller) > : worker_queue_(worker_queue), >- thread_sync_event_(false /* manual_reset */, false), > stats_proxy_(Clock::GetRealTimeClock(), > config, > encoder_config.content_type), > config_(std::move(config)), > content_type_(encoder_config.content_type) { > RTC_DCHECK(config_.encoder_settings.encoder_factory); >+ RTC_DCHECK(config_.encoder_settings.bitrate_allocator_factory); > > video_stream_encoder_ = CreateVideoStreamEncoder(num_cpu_cores, &stats_proxy_, > config_.encoder_settings, >@@ -104,9 +107,10 @@ VideoSendStream::VideoSendStream( > // it was created on. > thread_sync_event_.Wait(rtc::Event::kForever); > send_stream_->RegisterProcessThread(module_process_thread); >- // TODO(sprang): Enable this also for regular video calls if it works well. >- if (encoder_config.content_type == VideoEncoderConfig::ContentType::kScreen) { >- // Only signal target bitrate for screenshare streams, for now. >+ // TODO(sprang): Enable this also for regular video calls by default, if it >+ // works well. 
>+ if (encoder_config.content_type == VideoEncoderConfig::ContentType::kScreen || >+ field_trial::IsEnabled(kTargetBitrateRtcpFieldTrial)) { > video_stream_encoder_->SetBitrateAllocationObserver(send_stream_.get()); > } > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_send_stream.h b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_send_stream.h >index 0945203df31b5a579819a1b1d5bb3252b57577dd..22aa4b1c6d5dbd36275efa7a61af95181d638a1e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_send_stream.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_send_stream.h >@@ -20,7 +20,6 @@ > #include "call/bitrate_allocator.h" > #include "call/video_receive_stream.h" > #include "call/video_send_stream.h" >-#include "common_video/libyuv/include/webrtc_libyuv.h" > #include "rtc_base/criticalsection.h" > #include "rtc_base/event.h" > #include "rtc_base/task_queue.h" >@@ -60,7 +59,7 @@ class VideoSendStream : public webrtc::VideoSendStream { > rtc::TaskQueue* worker_queue, > CallStats* call_stats, > RtpTransportControllerSendInterface* transport, >- BitrateAllocator* bitrate_allocator, >+ BitrateAllocatorInterface* bitrate_allocator, > SendDelayStats* send_delay_stats, > RtcEventLog* event_log, > VideoSendStream::Config config, >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_send_stream_impl.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_send_stream_impl.cc >index 58caf4777462d7595ca00af6c3f58a0e9741457b..5dcf18243d5f876f01206750464b197fb3c2d149 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_send_stream_impl.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_send_stream_impl.cc >@@ -111,6 +111,14 @@ int CalculateMaxPadBitrateBps(const std::vector<VideoStream>& streams, > return pad_up_to_bitrate_bps; > } > >+RtpSenderFrameEncryptionConfig CreateFrameEncryptionConfig( >+ const VideoSendStream::Config* config) { >+ RtpSenderFrameEncryptionConfig 
frame_encryption_config; >+ frame_encryption_config.frame_encryptor = config->frame_encryptor; >+ frame_encryption_config.crypto_options = config->crypto_options; >+ return frame_encryption_config; >+} >+ > RtpSenderObservers CreateObservers(CallStats* call_stats, > EncoderRtcpFeedback* encoder_feedback, > SendStatisticsProxy* stats_proxy, >@@ -151,6 +159,16 @@ bool SameStreamsEnabled(const VideoBitrateAllocation& lhs, > } > } // namespace > >+PacingConfig::PacingConfig() >+ : pacing_factor("factor", PacedSender::kDefaultPaceMultiplier), >+ max_pacing_delay("max_delay", >+ TimeDelta::ms(PacedSender::kMaxQueueLengthMs)) { >+ ParseFieldTrial({&pacing_factor, &max_pacing_delay}, >+ field_trial::FindFullName("WebRTC-Video-Pacing")); >+} >+PacingConfig::PacingConfig(const PacingConfig&) = default; >+PacingConfig::~PacingConfig() = default; >+ > // CheckEncoderActivityTask is used for tracking when the encoder last produced > // and encoded video frame. If the encoder has not produced anything the last > // kEncoderTimeOutMs we also want to stop sending padding. 
>@@ -221,6 +239,7 @@ VideoSendStreamImpl::VideoSendStreamImpl( > std::unique_ptr<FecController> fec_controller) > : has_alr_probing_(config->periodic_alr_bandwidth_probing || > GetAlrSettings(content_type)), >+ pacing_config_(PacingConfig()), > stats_proxy_(stats_proxy), > config_(config), > worker_queue_(worker_queue), >@@ -238,19 +257,20 @@ VideoSendStreamImpl::VideoSendStreamImpl( > config_->rtp.ssrcs, > video_stream_encoder), > bandwidth_observer_(transport->GetBandwidthObserver()), >- rtp_video_sender_( >- transport_->CreateRtpVideoSender(config_->rtp.ssrcs, >- suspended_ssrcs, >- suspended_payload_states, >- config_->rtp, >- config_->rtcp, >- config_->send_transport, >- CreateObservers(call_stats, >- &encoder_feedback_, >- stats_proxy_, >- send_delay_stats), >- event_log, >- std::move(fec_controller))), >+ rtp_video_sender_(transport_->CreateRtpVideoSender( >+ config_->rtp.ssrcs, >+ suspended_ssrcs, >+ suspended_payload_states, >+ config_->rtp, >+ config_->rtcp_report_interval_ms, >+ config_->send_transport, >+ CreateObservers(call_stats, >+ &encoder_feedback_, >+ stats_proxy_, >+ send_delay_stats), >+ event_log, >+ std::move(fec_controller), >+ CreateFrameEncryptionConfig(config_))), > weak_ptr_factory_(this) { > RTC_DCHECK_RUN_ON(worker_queue_); > RTC_LOG(LS_INFO) << "VideoSendStreamInternal: " << config_->ToString(); >@@ -293,9 +313,9 @@ VideoSendStreamImpl::VideoSendStreamImpl( > transport->SetQueueTimeLimit(alr_settings->max_paced_queue_time); > } else { > transport->EnablePeriodicAlrProbing(false); >- transport->SetPacingFactor(PacedSender::kDefaultPaceMultiplier); >- configured_pacing_factor_ = PacedSender::kDefaultPaceMultiplier; >- transport->SetQueueTimeLimit(PacedSender::kMaxQueueLengthMs); >+ transport->SetPacingFactor(pacing_config_.pacing_factor); >+ configured_pacing_factor_ = pacing_config_.pacing_factor; >+ transport->SetQueueTimeLimit(pacing_config_.max_pacing_delay.Get().ms()); > } > } > >@@ -563,17 +583,6 @@ EncodedImageCallback::Result 
VideoSendStreamImpl::OnEncodedImage( > // Encoded is called on whatever thread the real encoder implementation run > // on. In the case of hardware encoders, there might be several encoders > // running in parallel on different threads. >- const size_t simulcast_idx = >- (codec_specific_info->codecType != kVideoCodecVP9) >- ? encoded_image.SpatialIndex().value_or(0) >- : 0; >- if (config_->post_encode_callback) { >- // TODO(nisse): Delete webrtc::EncodedFrame class, pass EncodedImage >- // instead. >- config_->post_encode_callback->EncodedFrameCallback(EncodedFrame( >- encoded_image._buffer, encoded_image._length, encoded_image._frameType, >- simulcast_idx, encoded_image.Timestamp())); >- } > { > rtc::CritScope lock(&encoder_activity_crit_sect_); > if (check_encoder_activity_task_) >@@ -613,21 +622,22 @@ std::map<uint32_t, RtpPayloadState> VideoSendStreamImpl::GetRtpPayloadStates() > return rtp_video_sender_->GetRtpPayloadStates(); > } > >-uint32_t VideoSendStreamImpl::OnBitrateUpdated(uint32_t bitrate_bps, >- uint8_t fraction_loss, >- int64_t rtt, >- int64_t probing_interval_ms) { >+uint32_t VideoSendStreamImpl::OnBitrateUpdated(BitrateAllocationUpdate update) { > RTC_DCHECK_RUN_ON(worker_queue_); > RTC_DCHECK(rtp_video_sender_->IsActive()) > << "VideoSendStream::Start has not been called."; > >- rtp_video_sender_->OnBitrateUpdated(bitrate_bps, fraction_loss, rtt, >- stats_proxy_->GetSendFrameRate()); >+ rtp_video_sender_->OnBitrateUpdated( >+ update.target_bitrate.bps(), >+ rtc::dchecked_cast<uint8_t>(update.packet_loss_ratio * 256), >+ update.round_trip_time.ms(), stats_proxy_->GetSendFrameRate()); > encoder_target_rate_bps_ = rtp_video_sender_->GetPayloadBitrateBps(); > encoder_target_rate_bps_ = > std::min(encoder_max_bitrate_bps_, encoder_target_rate_bps_); >- video_stream_encoder_->OnBitrateUpdated(encoder_target_rate_bps_, >- fraction_loss, rtt); >+ video_stream_encoder_->OnBitrateUpdated( >+ encoder_target_rate_bps_, >+ 
rtc::dchecked_cast<uint8_t>(update.packet_loss_ratio * 256), >+ update.round_trip_time.ms()); > stats_proxy_->OnSetEncoderTargetRate(encoder_target_rate_bps_); > return rtp_video_sender_->GetProtectionBitrateBps(); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_send_stream_impl.h b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_send_stream_impl.h >index eed9f567fb08186fa851dd29c42c54170ba33993..9caabacdab92448d13677caf910517f1b2affeb0 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_send_stream_impl.h >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_send_stream_impl.h >@@ -18,7 +18,6 @@ > #include "api/video/video_stream_encoder_interface.h" > #include "call/bitrate_allocator.h" > #include "call/rtp_video_sender_interface.h" >-#include "common_types.h" // NOLINT(build/include) > #include "modules/utility/include/process_thread.h" > #include "rtc_base/weak_ptr.h" > #include "video/call_stats.h" >@@ -30,6 +29,16 @@ > namespace webrtc { > namespace internal { > >+// Pacing buffer config; overridden by ALR config if provided. >+struct PacingConfig { >+ PacingConfig(); >+ PacingConfig(const PacingConfig&); >+ PacingConfig& operator=(const PacingConfig&) = default; >+ ~PacingConfig(); >+ FieldTrialParameter<double> pacing_factor; >+ FieldTrialParameter<TimeDelta> max_pacing_delay; >+}; >+ > // VideoSendStreamImpl implements internal::VideoSendStream. > // It is created and destroyed on |worker_queue|. The intent is to decrease the > // need for locking and to ensure methods are called in sequence. >@@ -83,10 +92,7 @@ class VideoSendStreamImpl : public webrtc::BitrateAllocatorObserver, > class CheckEncoderActivityTask; > > // Implements BitrateAllocatorObserver. 
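[Reviewer note] OnBitrateUpdated now receives a typed BitrateAllocationUpdate and converts it back to the raw bps / Q8-loss / RTT-ms arguments the legacy RTP path expects. A minimal sketch of just the loss-fraction conversion; the struct is a stand-in for the webrtc types (DataRate, TimeDelta):

```cpp
#include <cassert>
#include <cstdint>

// Stand-in for webrtc::BitrateAllocationUpdate.
struct AllocationUpdateSketch {
  int64_t target_bitrate_bps;
  double packet_loss_ratio;  // 0.0 .. 1.0
  int64_t round_trip_time_ms;
};

// Loss ratio scaled into the 8-bit fraction used on the wire. The patch uses
// rtc::dchecked_cast here, which would trap a ratio of 1.0 (256 does not fit
// in uint8_t); callers are expected to pass a ratio strictly below 1.0.
uint8_t LossFractionQ8(double packet_loss_ratio) {
  return static_cast<uint8_t>(packet_loss_ratio * 256);
}
```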
>- uint32_t OnBitrateUpdated(uint32_t bitrate_bps, >- uint8_t fraction_loss, >- int64_t rtt, >- int64_t probing_interval_ms) override; >+ uint32_t OnBitrateUpdated(BitrateAllocationUpdate update) override; > > void OnEncoderConfigurationChanged(std::vector<VideoStream> streams, > int min_transmit_bitrate_bps) override; >@@ -115,6 +121,7 @@ class VideoSendStreamImpl : public webrtc::BitrateAllocatorObserver, > void SignalEncoderActive(); > > const bool has_alr_probing_; >+ const PacingConfig pacing_config_; > > SendStatisticsProxy* const stats_proxy_; > const VideoSendStream::Config* const config_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_send_stream_impl_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_send_stream_impl_unittest.cc >index dd036909a6af1ec56798d48b958a169d0b7cdb18..32d0df1c2e61b7dc5efcdae83f63a41881071371 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_send_stream_impl_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_send_stream_impl_unittest.cc >@@ -69,6 +69,14 @@ class MockRtpVideoSender : public RtpVideoSenderInterface { > MOCK_CONST_METHOD0(GetProtectionBitrateBps, uint32_t()); > MOCK_METHOD3(SetEncodingData, void(size_t, size_t, size_t)); > }; >+ >+BitrateAllocationUpdate CreateAllocation(int bitrate_bps) { >+ BitrateAllocationUpdate update; >+ update.target_bitrate = DataRate::bps(bitrate_bps); >+ update.packet_loss_ratio = 0; >+ update.round_trip_time = TimeDelta::Zero(); >+ return update; >+} > } // namespace > > class VideoSendStreamImplTest : public ::testing::Test { >@@ -91,7 +99,7 @@ class VideoSendStreamImplTest : public ::testing::Test { > EXPECT_CALL(transport_controller_, packet_router()) > .WillRepeatedly(Return(&packet_router_)); > EXPECT_CALL(transport_controller_, >- CreateRtpVideoSender(_, _, _, _, _, _, _, _, _)) >+ CreateRtpVideoSender(_, _, _, _, _, _, _, _, _, _)) > .WillRepeatedly(Return(&rtp_video_sender_)); > 
EXPECT_CALL(rtp_video_sender_, SetActive(_)) > .WillRepeatedly(testing::Invoke( >@@ -367,7 +375,7 @@ TEST_F(VideoSendStreamImplTest, ForwardsVideoBitrateAllocationWhenEnabled) { > .Times(1) > .WillOnce(Return(kBitrateBps)); > static_cast<BitrateAllocatorObserver*>(vss_impl.get()) >- ->OnBitrateUpdated(kBitrateBps, 0, 0, 0); >+ ->OnBitrateUpdated(CreateAllocation(kBitrateBps)); > EXPECT_CALL(rtp_video_sender_, OnBitrateAllocationUpdated(alloc)).Times(1); > observer->OnBitrateAllocationUpdated(alloc); > >@@ -376,7 +384,7 @@ TEST_F(VideoSendStreamImplTest, ForwardsVideoBitrateAllocationWhenEnabled) { > .Times(1) > .WillOnce(Return(0)); > static_cast<BitrateAllocatorObserver*>(vss_impl.get()) >- ->OnBitrateUpdated(0, 0, 0, 0); >+ ->OnBitrateUpdated(CreateAllocation(0)); > EXPECT_CALL(rtp_video_sender_, OnBitrateAllocationUpdated(alloc)).Times(0); > observer->OnBitrateAllocationUpdated(alloc); > >@@ -396,7 +404,7 @@ TEST_F(VideoSendStreamImplTest, ThrottlesVideoBitrateAllocationWhenTooSimilar) { > .Times(1) > .WillOnce(Return(kBitrateBps)); > static_cast<BitrateAllocatorObserver*>(vss_impl.get()) >- ->OnBitrateUpdated(kBitrateBps, 0, 0, 0); >+ ->OnBitrateUpdated(CreateAllocation(kBitrateBps)); > VideoBitrateAllocationObserver* const observer = > static_cast<VideoBitrateAllocationObserver*>(vss_impl.get()); > >@@ -450,7 +458,7 @@ TEST_F(VideoSendStreamImplTest, ForwardsVideoBitrateAllocationOnLayerChange) { > .Times(1) > .WillOnce(Return(kBitrateBps)); > static_cast<BitrateAllocatorObserver*>(vss_impl.get()) >- ->OnBitrateUpdated(kBitrateBps, 0, 0, 0); >+ ->OnBitrateUpdated(CreateAllocation(kBitrateBps)); > VideoBitrateAllocationObserver* const observer = > static_cast<VideoBitrateAllocationObserver*>(vss_impl.get()); > >@@ -494,7 +502,7 @@ TEST_F(VideoSendStreamImplTest, ForwardsVideoBitrateAllocationAfterTimeout) { > .Times(1) > .WillRepeatedly(Return(kBitrateBps)); > static_cast<BitrateAllocatorObserver*>(vss_impl.get()) >- ->OnBitrateUpdated(kBitrateBps, 0, 0, 0); >+ 
->OnBitrateUpdated(CreateAllocation(kBitrateBps)); > VideoBitrateAllocationObserver* const observer = > static_cast<VideoBitrateAllocationObserver*>(vss_impl.get()); > >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_send_stream_tests.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_send_stream_tests.cc >index 09ccf1c49893a53cf3bc739ddffb70fc2175ac99..ea77cd65291493795c9d21ea4c27801f33d8955f 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_send_stream_tests.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_send_stream_tests.cc >@@ -12,12 +12,11 @@ > #include <vector> > > #include "api/test/simulated_network.h" >+#include "api/video/encoded_image.h" > #include "call/call.h" > #include "call/fake_network_pipe.h" > #include "call/rtp_transport_controller_send.h" > #include "call/simulated_network.h" >-#include "common_video/include/frame_callback.h" >-#include "common_video/include/video_frame.h" > #include "modules/rtp_rtcp/include/rtp_header_parser.h" > #include "modules/rtp_rtcp/include/rtp_rtcp.h" > #include "modules/rtp_rtcp/source/rtcp_sender.h" >@@ -70,10 +69,14 @@ class VideoSendStreamPeer { > }; > } // namespace test > >+namespace { >+constexpr int64_t kRtcpIntervalMs = 1000; >+ > enum VideoFormat { > kGeneric, > kVP8, > }; >+} // namespace > > void ExpectEqualFramesVector(const std::vector<VideoFrame>& frames1, > const std::vector<VideoFrame>& frames2); >@@ -541,7 +544,7 @@ class UlpfecObserver : public test::EndToEndTest { > // At low RTT (< kLowRttNackMs) -> NACK only, no FEC. > // Configure some network delay. 
> const int kNetworkDelayMs = 100; >- DefaultNetworkSimulationConfig config; >+ BuiltInNetworkBehaviorConfig config; > config.loss_percent = 5; > config.queue_delay_ms = kNetworkDelayMs; > return new test::PacketTransport( >@@ -655,7 +658,7 @@ TEST_P(VideoSendStreamTest, DISABLED_DoesUtilizeUlpfecForVp8WithNackEnabled) { > RunBaseTest(&test); > } > >-#if !defined(RTC_DISABLE_VP9) >+#if defined(RTC_ENABLE_VP9) > // Disabled as flaky, see https://crbug.com/webrtc/7285 for details. > TEST_P(VideoSendStreamTest, DISABLED_DoesUtilizeUlpfecForVp9WithNackEnabled) { > test::FunctionVideoEncoderFactory encoder_factory( >@@ -663,7 +666,7 @@ TEST_P(VideoSendStreamTest, DISABLED_DoesUtilizeUlpfecForVp9WithNackEnabled) { > UlpfecObserver test(false, true, true, true, "VP9", &encoder_factory); > RunBaseTest(&test); > } >-#endif // !defined(RTC_DISABLE_VP9) >+#endif // defined(RTC_ENABLE_VP9) > > TEST_P(VideoSendStreamTest, SupportsUlpfecWithMultithreadedH264) { > test::FunctionVideoEncoderFactory encoder_factory([]() { >@@ -731,7 +734,7 @@ class FlexfecObserver : public test::EndToEndTest { > // At low RTT (< kLowRttNackMs) -> NACK only, no FEC. > // Therefore we need some network delay. 
> const int kNetworkDelayMs = 100; >- DefaultNetworkSimulationConfig config; >+ BuiltInNetworkBehaviorConfig config; > config.loss_percent = 5; > config.queue_delay_ms = kNetworkDelayMs; > return new test::PacketTransport( >@@ -806,7 +809,7 @@ TEST_P(VideoSendStreamTest, SupportsFlexfecWithRtpExtensionsVp8) { > RunBaseTest(&test); > } > >-#if !defined(RTC_DISABLE_VP9) >+#if defined(RTC_ENABLE_VP9) > TEST_P(VideoSendStreamTest, SupportsFlexfecVp9) { > test::FunctionVideoEncoderFactory encoder_factory( > []() { return VP9Encoder::Create(); }); >@@ -820,7 +823,7 @@ TEST_P(VideoSendStreamTest, SupportsFlexfecWithNackVp9) { > FlexfecObserver test(false, true, "VP9", &encoder_factory, 1); > RunBaseTest(&test); > } >-#endif // defined(RTC_DISABLE_VP9) >+#endif // defined(RTC_ENABLE_VP9) > > TEST_P(VideoSendStreamTest, SupportsFlexfecH264) { > test::FunctionVideoEncoderFactory encoder_factory([]() { >@@ -866,19 +869,27 @@ void VideoSendStreamTest::TestNackRetransmission( > RTPHeader header; > EXPECT_TRUE(parser_->Parse(packet, length, &header)); > >- int kRetransmitTarget = 6; >+ // NACK packets two times at some arbitrary points. >+ const int kNackedPacketsAtOnceCount = 3; >+ const int kRetransmitTarget = kNackedPacketsAtOnceCount * 2; >+ >+ // Skip padding packets because they will never be retransmitted. >+ if (header.paddingLength + header.headerLength == length) { >+ return SEND_PACKET; >+ } >+ > ++send_count_; >+ >+ // NACK packets at arbitrary points. 
> if (send_count_ == 5 || send_count_ == 25) { >- nacked_sequence_numbers_.push_back( >- static_cast<uint16_t>(header.sequenceNumber - 3)); >- nacked_sequence_numbers_.push_back( >- static_cast<uint16_t>(header.sequenceNumber - 2)); >- nacked_sequence_numbers_.push_back( >- static_cast<uint16_t>(header.sequenceNumber - 1)); >+ nacked_sequence_numbers_.insert( >+ nacked_sequence_numbers_.end(), >+ non_padding_sequence_numbers_.end() - kNackedPacketsAtOnceCount, >+ non_padding_sequence_numbers_.end()); > > RTCPSender rtcp_sender(false, Clock::GetRealTimeClock(), nullptr, > nullptr, nullptr, transport_adapter_.get(), >- RtcpIntervalConfig{}); >+ kRtcpIntervalMs); > > rtcp_sender.SetRTCPStatus(RtcpMode::kReducedSize); > rtcp_sender.SetRemoteSSRC(kVideoSendSsrcs[0]); >@@ -892,7 +903,6 @@ void VideoSendStreamTest::TestNackRetransmission( > } > > uint16_t sequence_number = header.sequenceNumber; >- > if (header.ssrc == retransmit_ssrc_ && > retransmit_ssrc_ != kVideoSendSsrcs[0]) { > // Not kVideoSendSsrcs[0], assume correct RTX packet. 
Extract sequence >@@ -900,6 +910,7 @@ void VideoSendStreamTest::TestNackRetransmission( > const uint8_t* rtx_header = packet + header.headerLength; > sequence_number = (rtx_header[0] << 8) + rtx_header[1]; > } >+ > auto found = std::find(nacked_sequence_numbers_.begin(), > nacked_sequence_numbers_.end(), sequence_number); > if (found != nacked_sequence_numbers_.end()) { >@@ -910,6 +921,8 @@ void VideoSendStreamTest::TestNackRetransmission( > EXPECT_EQ(retransmit_payload_type_, header.payloadType); > observation_complete_.Set(); > } >+ } else { >+ non_padding_sequence_numbers_.push_back(sequence_number); > } > > return SEND_PACKET; >@@ -938,6 +951,7 @@ void VideoSendStreamTest::TestNackRetransmission( > uint32_t retransmit_ssrc_; > uint8_t retransmit_payload_type_; > std::vector<uint16_t> nacked_sequence_numbers_; >+ std::vector<uint16_t> non_padding_sequence_numbers_; > } test(retransmit_ssrc, retransmit_payload_type); > > RunBaseTest(&test); >@@ -964,8 +978,7 @@ void VideoSendStreamTest::TestPacketFragmentationSize(VideoFormat format, > > // Observer that verifies that the expected number of packets and bytes > // arrive for each frame size, from start_size to stop_size. >- class FrameFragmentationTest : public test::SendTest, >- public EncodedFrameObserver { >+ class FrameFragmentationTest : public test::SendTest { > public: > FrameFragmentationTest(size_t max_packet_size, > size_t start_size, >@@ -1090,7 +1103,7 @@ void VideoSendStreamTest::TestPacketFragmentationSize(VideoFormat format, > loss_ratio); // Loss percent. 
> RTCPSender rtcp_sender(false, Clock::GetRealTimeClock(), > &lossy_receive_stats, nullptr, nullptr, >- transport_adapter_.get(), RtcpIntervalConfig{}); >+ transport_adapter_.get(), kRtcpIntervalMs); > > rtcp_sender.SetRTCPStatus(RtcpMode::kReducedSize); > rtcp_sender.SetRemoteSSRC(kVideoSendSsrcs[0]); >@@ -1101,7 +1114,7 @@ void VideoSendStreamTest::TestPacketFragmentationSize(VideoFormat format, > } > } > >- void EncodedFrameCallback(const EncodedFrame& encoded_frame) override { >+ void UpdateConfiguration() { > rtc::CritScope lock(&mutex_); > // Increase frame size for next encoded frame, in the context of the > // encoder thread. >@@ -1110,9 +1123,10 @@ void VideoSendStreamTest::TestPacketFragmentationSize(VideoFormat format, > } > encoder_.SetFrameSize(static_cast<size_t>(current_size_frame_)); > } >- void ModifySenderCallConfig(Call::Config* config) override { >+ void ModifySenderBitrateConfig( >+ BitrateConstraints* bitrate_config) override { > const int kMinBitrateBps = 30000; >- config->bitrate_config.min_bitrate_bps = kMinBitrateBps; >+ bitrate_config->min_bitrate_bps = kMinBitrateBps; > } > > void ModifyVideoConfigs( >@@ -1132,38 +1146,15 @@ void VideoSendStreamTest::TestPacketFragmentationSize(VideoFormat format, > > send_config->encoder_settings.encoder_factory = &encoder_factory_; > send_config->rtp.max_packet_size = kMaxPacketSize; >- send_config->post_encode_callback = this; >+ encoder_.RegisterPostEncodeCallback([this]() { UpdateConfiguration(); }); > > // Make sure there is at least one extension header, to make the RTP > // header larger than the base length of 12 bytes. > EXPECT_FALSE(send_config->rtp.extensions.empty()); > > // Setup screen content disables frame dropping which makes this easier. 
>- class VideoStreamFactory >- : public VideoEncoderConfig::VideoStreamFactoryInterface { >- public: >- explicit VideoStreamFactory(size_t num_temporal_layers) >- : num_temporal_layers_(num_temporal_layers) { >- EXPECT_GT(num_temporal_layers, 0u); >- } >- >- private: >- std::vector<VideoStream> CreateEncoderStreams( >- int width, >- int height, >- const VideoEncoderConfig& encoder_config) override { >- std::vector<VideoStream> streams = >- test::CreateVideoStreams(width, height, encoder_config); >- for (VideoStream& stream : streams) { >- stream.num_temporal_layers = num_temporal_layers_; >- } >- return streams; >- } >- const size_t num_temporal_layers_; >- }; >- >- encoder_config->video_stream_factory = >- new rtc::RefCountedObject<VideoStreamFactory>(2); >+ EXPECT_EQ(1u, encoder_config->simulcast_layers.size()); >+ encoder_config->simulcast_layers[0].num_temporal_layers = 2; > encoder_config->content_type = VideoEncoderConfig::ContentType::kScreen; > } > >@@ -1347,7 +1338,7 @@ TEST_P(VideoSendStreamTest, SuspendBelowMinBitrate) { > FakeReceiveStatistics receive_stats(kVideoSendSsrcs[0], > last_sequence_number_, rtp_count_, 0); > RTCPSender rtcp_sender(false, clock_, &receive_stats, nullptr, nullptr, >- transport_adapter_.get(), RtcpIntervalConfig{}); >+ transport_adapter_.get(), kRtcpIntervalMs); > > rtcp_sender.SetRTCPStatus(RtcpMode::kReducedSize); > rtcp_sender.SetRemoteSSRC(kVideoSendSsrcs[0]); >@@ -1396,12 +1387,20 @@ TEST_P(VideoSendStreamTest, NoPaddingWhenVideoIsMuted) { > header.headerLength + header.paddingLength == length; > > if (test_state_ == kBeforeStopCapture) { >+ // Packets are flowing, stop camera. > capturer_->Stop(); > test_state_ = kWaitingForPadding; > } else if (test_state_ == kWaitingForPadding && only_padding) { >+ // We're still getting padding, after stopping camera. 
> test_state_ = kWaitingForNoPackets; >- } else if (test_state_ == kWaitingForPaddingAfterCameraRestart && >+ } else if (test_state_ == kWaitingForMediaAfterCameraRestart && >+ !only_padding) { >+ // Media packets are flowing again, stop camera a second time. >+ capturer_->Stop(); >+ test_state_ = kWaitingForPaddingAfterCameraStopsAgain; >+ } else if (test_state_ == kWaitingForPaddingAfterCameraStopsAgain && > only_padding) { >+ // Padding is still flowing, test ok. > observation_complete_.Set(); > } > return SEND_PACKET; >@@ -1414,13 +1413,20 @@ TEST_P(VideoSendStreamTest, NoPaddingWhenVideoIsMuted) { > (last_packet_time_ms_ > 0 && > clock_->TimeInMilliseconds() - last_packet_time_ms_ > > kNoPacketsThresholdMs)) { >+ // No packets seen for |kNoPacketsThresholdMs|, restart camera. > capturer_->Start(); >- test_state_ = kWaitingForPaddingAfterCameraRestart; >+ test_state_ = kWaitingForMediaAfterCameraRestart; > } > return SEND_PACKET; > } > >- size_t GetNumVideoStreams() const override { return 3; } >+ void ModifyVideoConfigs( >+ VideoSendStream::Config* send_config, >+ std::vector<VideoReceiveStream::Config>* receive_configs, >+ VideoEncoderConfig* encoder_config) override { >+ // Make sure padding is sent if encoder is not producing media. 
>+ encoder_config->min_transmit_bitrate_bps = 50000; >+ } > > void OnFrameGeneratorCapturerCreated( > test::FrameGeneratorCapturer* frame_generator_capturer) override { >@@ -1437,7 +1443,8 @@ TEST_P(VideoSendStreamTest, NoPaddingWhenVideoIsMuted) { > kBeforeStopCapture, > kWaitingForPadding, > kWaitingForNoPackets, >- kWaitingForPaddingAfterCameraRestart >+ kWaitingForMediaAfterCameraRestart, >+ kWaitingForPaddingAfterCameraStopsAgain > }; > > TestState test_state_ = kBeforeStopCapture; >@@ -1481,7 +1488,7 @@ TEST_P(VideoSendStreamTest, PaddingIsPrimarilyRetransmissions) { > test::SingleThreadedTaskQueueForTesting* task_queue, > Call* sender_call) override { > const int kNetworkDelayMs = 50; >- DefaultNetworkSimulationConfig config; >+ BuiltInNetworkBehaviorConfig config; > config.loss_percent = 10; > config.link_capacity_kbps = kCapacityKbps; > config.queue_delay_ms = kNetworkDelayMs; >@@ -1780,7 +1787,6 @@ class MaxPaddingSetTest : public test::SendTest { > > explicit MaxPaddingSetTest(bool test_switch_content_type, T* stream_reset_fun) > : SendTest(test::CallTest::kDefaultTimeoutMs), >- content_switch_event_(false, false), > call_(nullptr), > send_stream_(nullptr), > send_stream_config_(nullptr), >@@ -1910,7 +1916,6 @@ TEST_P(VideoSendStreamTest, > public: > EncoderObserver() > : FakeEncoder(Clock::GetRealTimeClock()), >- init_encode_called_(false, false), > number_of_initializations_(0), > last_initialized_frame_width_(0), > last_initialized_frame_height_(0) {} >@@ -1993,7 +1998,6 @@ TEST_P(VideoSendStreamTest, CanReconfigureToUseStartBitrateAbovePreviousMax) { > public: > StartBitrateObserver() > : FakeEncoder(Clock::GetRealTimeClock()), >- start_bitrate_changed_(false, false), > start_bitrate_kbps_(0) {} > int32_t InitEncode(const VideoCodec* config, > int32_t number_of_cores, >@@ -2068,10 +2072,7 @@ TEST_P(VideoSendStreamTest, CanReconfigureToUseStartBitrateAbovePreviousMax) { > > class StartStopBitrateObserver : public test::FakeEncoder { > public: >- 
StartStopBitrateObserver() >- : FakeEncoder(Clock::GetRealTimeClock()), >- encoder_init_(false, false), >- bitrate_changed_(false, false) {} >+ StartStopBitrateObserver() : FakeEncoder(Clock::GetRealTimeClock()) {} > int32_t InitEncode(const VideoCodec* config, > int32_t number_of_cores, > size_t max_payload_size) override { >@@ -2228,8 +2229,6 @@ TEST_P(VideoSendStreamTest, VideoSendStreamUpdateActiveSimulcastLayers) { > TEST_P(VideoSendStreamTest, CapturesTextureAndVideoFrames) { > class FrameObserver : public rtc::VideoSinkInterface<VideoFrame> { > public: >- FrameObserver() : output_frame_event_(false, false) {} >- > void OnFrame(const VideoFrame& video_frame) override { > output_frames_.push_back(video_frame); > output_frame_event_.Set(); >@@ -2392,11 +2391,6 @@ TEST_P(VideoSendStreamTest, EncoderIsProperlyInitializedAndDestroyed) { > return 0; > } > >- int32_t SetChannelParameters(uint32_t packetLoss, int64_t rtt) override { >- EXPECT_TRUE(IsReadyForEncode()); >- return 0; >- } >- > int32_t SetRates(uint32_t newBitRate, uint32_t frameRate) override { > EXPECT_TRUE(IsReadyForEncode()); > return 0; >@@ -2458,7 +2452,6 @@ TEST_P(VideoSendStreamTest, EncoderSetupPropagatesCommonEncoderConfigValues) { > VideoCodecConfigObserver() > : SendTest(kDefaultTimeoutMs), > FakeEncoder(Clock::GetRealTimeClock()), >- init_encode_event_(false, false), > num_initializations_(0), > stream_(nullptr), > encoder_factory_(this) {} >@@ -2530,7 +2523,6 @@ class VideoCodecConfigObserver : public test::SendTest, > FakeEncoder(Clock::GetRealTimeClock()), > video_codec_type_(video_codec_type), > codec_name_(codec_name), >- init_encode_event_(false, false), > num_initializations_(0), > stream_(nullptr), > encoder_factory_(this) { >@@ -2538,26 +2530,6 @@ class VideoCodecConfigObserver : public test::SendTest, > } > > private: >- class VideoStreamFactory >- : public VideoEncoderConfig::VideoStreamFactoryInterface { >- public: >- VideoStreamFactory() {} >- >- private: >- 
std::vector<VideoStream> CreateEncoderStreams( >- int width, >- int height, >- const VideoEncoderConfig& encoder_config) override { >- std::vector<VideoStream> streams = >- test::CreateVideoStreams(width, height, encoder_config); >- for (size_t i = 0; i < streams.size(); ++i) { >- streams[i].num_temporal_layers = >- kVideoCodecConfigObserverNumberOfTemporalLayers; >- } >- return streams; >- } >- }; >- > void ModifyVideoConfigs( > VideoSendStream::Config* send_config, > std::vector<VideoReceiveStream::Config>* receive_configs, >@@ -2567,8 +2539,9 @@ class VideoCodecConfigObserver : public test::SendTest, > > encoder_config->codec_type = video_codec_type_; > encoder_config->encoder_specific_settings = GetEncoderSpecificSettings(); >- encoder_config->video_stream_factory = >- new rtc::RefCountedObject<VideoStreamFactory>(); >+ EXPECT_EQ(1u, encoder_config->simulcast_layers.size()); >+ encoder_config->simulcast_layers[0].num_temporal_layers = >+ kVideoCodecConfigObserverNumberOfTemporalLayers; > encoder_config_ = encoder_config->Copy(); > } > >@@ -2794,8 +2767,6 @@ TEST_P(VideoSendStreamTest, TranslatesTwoLayerScreencastToTargetBitrate) { > const VideoEncoderConfig& encoder_config) override { > std::vector<VideoStream> streams = > test::CreateVideoStreams(width, height, encoder_config); >- EXPECT_FALSE(streams[0].num_temporal_layers.has_value()); >- streams[0].num_temporal_layers = 2; > RTC_CHECK_GT(streams[0].max_bitrate_bps, > kScreencastMaxTargetBitrateDeltaKbps); > streams[0].target_bitrate_bps = >@@ -2833,6 +2804,8 @@ TEST_P(VideoSendStreamTest, TranslatesTwoLayerScreencastToTargetBitrate) { > EXPECT_EQ(1u, encoder_config->number_of_streams); > encoder_config->video_stream_factory = > new rtc::RefCountedObject<VideoStreamFactory>(); >+ EXPECT_EQ(1u, encoder_config->simulcast_layers.size()); >+ encoder_config->simulcast_layers[0].num_temporal_layers = 2; > encoder_config->content_type = VideoEncoderConfig::ContentType::kScreen; > } > >@@ -2863,8 +2836,6 @@ 
TEST_P(VideoSendStreamTest, ReconfigureBitratesSetsEncoderBitratesCorrectly) { > : SendTest(kDefaultTimeoutMs), > FakeEncoder(Clock::GetRealTimeClock()), > task_queue_(task_queue), >- init_encode_event_(false, false), >- bitrate_changed_event_(false, false), > target_bitrate_(0), > num_initializations_(0), > call_(nullptr), >@@ -2927,10 +2898,11 @@ TEST_P(VideoSendStreamTest, ReconfigureBitratesSetsEncoderBitratesCorrectly) { > EXPECT_EQ(expected_bitrate, target_bitrate_); > } > >- void ModifySenderCallConfig(Call::Config* config) override { >- config->bitrate_config.min_bitrate_bps = kMinBitrateKbps * 1000; >- config->bitrate_config.start_bitrate_bps = kStartBitrateKbps * 1000; >- config->bitrate_config.max_bitrate_bps = kMaxBitrateKbps * 1000; >+ void ModifySenderBitrateConfig( >+ BitrateConstraints* bitrate_config) override { >+ bitrate_config->min_bitrate_bps = kMinBitrateKbps * 1000; >+ bitrate_config->start_bitrate_bps = kStartBitrateKbps * 1000; >+ bitrate_config->max_bitrate_bps = kMaxBitrateKbps * 1000; > } > > class VideoStreamFactory >@@ -3115,7 +3087,7 @@ TEST_P(VideoSendStreamTest, ReportsSentResolution) { > RunBaseTest(&test); > } > >-#if !defined(RTC_DISABLE_VP9) >+#if defined(RTC_ENABLE_VP9) > class Vp9HeaderObserver : public test::SendTest { > public: > Vp9HeaderObserver() >@@ -3137,26 +3109,6 @@ class Vp9HeaderObserver : public test::SendTest { > private: > const int kVp9PayloadType = test::CallTest::kVideoSendPayloadType; > >- class VideoStreamFactory >- : public VideoEncoderConfig::VideoStreamFactoryInterface { >- public: >- explicit VideoStreamFactory(size_t number_of_temporal_layers) >- : number_of_temporal_layers_(number_of_temporal_layers) {} >- >- private: >- std::vector<VideoStream> CreateEncoderStreams( >- int width, >- int height, >- const VideoEncoderConfig& encoder_config) override { >- std::vector<VideoStream> streams = >- test::CreateVideoStreams(width, height, encoder_config); >- streams.back().num_temporal_layers = 
number_of_temporal_layers_; >- return streams; >- } >- >- const size_t number_of_temporal_layers_; >- }; >- > void ModifyVideoConfigs( > VideoSendStream::Config* send_config, > std::vector<VideoReceiveStream::Config>* receive_configs, >@@ -3168,9 +3120,9 @@ class Vp9HeaderObserver : public test::SendTest { > encoder_config->encoder_specific_settings = new rtc::RefCountedObject< > VideoEncoderConfig::Vp9EncoderSpecificSettings>(vp9_settings_); > EXPECT_EQ(1u, encoder_config->number_of_streams); >- encoder_config->video_stream_factory = >- new rtc::RefCountedObject<VideoStreamFactory>( >- vp9_settings_.numberOfTemporalLayers); >+ EXPECT_EQ(1u, encoder_config->simulcast_layers.size()); >+ encoder_config->simulcast_layers[0].num_temporal_layers = >+ vp9_settings_.numberOfTemporalLayers; > encoder_config_ = encoder_config->Copy(); > } > >@@ -3628,7 +3580,7 @@ TEST_P(VideoSendStreamTest, MAYBE_Vp9FlexModeRefCount) { > > RunBaseTest(&test); > } >-#endif // !defined(RTC_DISABLE_VP9) >+#endif // defined(RTC_ENABLE_VP9) > > void VideoSendStreamTest::TestRequestSourceRotateVideo( > bool support_orientation_ext) { >@@ -3663,6 +3615,39 @@ TEST_P(VideoSendStreamTest, > TestRequestSourceRotateVideo(true); > } > >+TEST_P(VideoSendStreamTest, EncoderConfigMaxFramerateReportedToSource) { >+ static const int kMaxFps = 22; >+ class FpsObserver : public test::SendTest, >+ public test::FrameGeneratorCapturer::SinkWantsObserver { >+ public: >+ FpsObserver() : SendTest(kDefaultTimeoutMs) {} >+ >+ void OnFrameGeneratorCapturerCreated( >+ test::FrameGeneratorCapturer* frame_generator_capturer) override { >+ frame_generator_capturer->SetSinkWantsObserver(this); >+ } >+ >+ void OnSinkWantsChanged(rtc::VideoSinkInterface<VideoFrame>* sink, >+ const rtc::VideoSinkWants& wants) override { >+ if (wants.max_framerate_fps == kMaxFps) >+ observation_complete_.Set(); >+ } >+ >+ void ModifyVideoConfigs( >+ VideoSendStream::Config* send_config, >+ std::vector<VideoReceiveStream::Config>* 
receive_configs, >+ VideoEncoderConfig* encoder_config) override { >+ encoder_config->simulcast_layers[0].max_framerate = kMaxFps; >+ } >+ >+ void PerformTest() override { >+ EXPECT_TRUE(Wait()) << "Timed out while waiting for fps to be reported."; >+ } >+ } test; >+ >+ RunBaseTest(&test); >+} >+ > // This test verifies that overhead is removed from the bandwidth estimate by > // testing that the maximum possible target payload rate is smaller than the > // maximum bandwidth estimate by the overhead rate. >@@ -3680,8 +3665,7 @@ TEST_P(VideoSendStreamTest, RemoveOverheadFromBandwidth) { > encoder_factory_(this), > call_(nullptr), > max_bitrate_bps_(0), >- first_packet_sent_(false), >- bitrate_changed_event_(false, false) {} >+ first_packet_sent_(false) {} > > int32_t SetRateAllocation(const VideoBitrateAllocation& bitrate, > uint32_t frameRate) override { >@@ -3894,7 +3878,6 @@ class ContentSwitchTest : public test::SendTest { > > explicit ContentSwitchTest(T* stream_reset_fun) > : SendTest(test::CallTest::kDefaultTimeoutMs), >- content_switch_event_(false, false), > call_(nullptr), > state_(StreamState::kBeforeSwitch), > send_stream_(nullptr), >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_stream_encoder.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_stream_encoder.cc >index b7b5a326113f57a3ff14d9db2b01722800d6298f..a22d58adf1eaadf887bc54b387f0e4d62e6cee4e 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_stream_encoder.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_stream_encoder.cc >@@ -15,8 +15,9 @@ > #include <numeric> > #include <utility> > >+#include "api/video/encoded_image.h" > #include "api/video/i420_buffer.h" >-#include "common_video/include/video_frame.h" >+#include "api/video/video_bitrate_allocator_factory.h" > #include "modules/video_coding/include/video_codec_initializer.h" > #include "modules/video_coding/include/video_coding.h" > #include "rtc_base/arraysize.h" >@@ -363,6 +364,8 
@@ VideoStreamEncoder::VideoStreamEncoder( > max_framerate_(-1), > pending_encoder_reconfiguration_(false), > pending_encoder_creation_(false), >+ crop_width_(0), >+ crop_height_(0), > encoder_start_bitrate_bps_(0), > max_data_payload_length_(0), > last_observed_bitrate_bps_(0), >@@ -376,6 +379,7 @@ VideoStreamEncoder::VideoStreamEncoder( > last_frame_log_ms_(clock_->TimeInMilliseconds()), > captured_frame_count_(0), > dropped_frame_count_(0), >+ pending_frame_post_time_us_(0), > bitrate_observer_(nullptr), > encoder_queue_("EncoderQueue") { > RTC_DCHECK(encoder_stats_observer); >@@ -531,11 +535,14 @@ void VideoStreamEncoder::ReconfigureEncoder() { > crop_height_ = last_frame_info_->height - highest_stream_height; > > VideoCodec codec; >- if (!VideoCodecInitializer::SetupCodec(encoder_config_, streams, &codec, >- &rate_allocator_)) { >+ if (!VideoCodecInitializer::SetupCodec(encoder_config_, streams, &codec)) { > RTC_LOG(LS_ERROR) << "Failed to create encoder configuration."; > } > >+ rate_allocator_ = >+ settings_.bitrate_allocator_factory->CreateVideoBitrateAllocator(codec); >+ RTC_CHECK(rate_allocator_) << "Failed to create bitrate allocator."; >+ > // Set min_bitrate_bps, max_bitrate_bps, and max padding bit rate for VP9. 
> if (encoder_config_.codec_type == kVideoCodecVP9) { > RTC_DCHECK_EQ(1U, streams.size()); >@@ -629,7 +636,7 @@ void VideoStreamEncoder::ReconfigureEncoder() { > > void VideoStreamEncoder::ConfigureQualityScaler() { > RTC_DCHECK_RUN_ON(&encoder_queue_); >- const auto scaling_settings = encoder_->GetScalingSettings(); >+ const auto scaling_settings = encoder_->GetEncoderInfo().scaling_settings; > const bool quality_scaling_allowed = > IsResolutionScalingEnabled(degradation_preference_) && > scaling_settings.thresholds; >@@ -887,7 +894,15 @@ void VideoStreamEncoder::EncodeVideoFrame(const VideoFrame& video_frame, > > overuse_detector_->FrameCaptured(out_frame, time_when_posted_us); > >- video_sender_.AddVideoFrame(out_frame, nullptr); >+ // Encoder metadata needs to be updated before encode complete callback. >+ VideoEncoder::EncoderInfo info = encoder_->GetEncoderInfo(); >+ if (info.implementation_name != encoder_info_.implementation_name) { >+ encoder_stats_observer_->OnEncoderImplementationChanged( >+ info.implementation_name); >+ } >+ encoder_info_ = info; >+ >+ video_sender_.AddVideoFrame(out_frame, nullptr, encoder_info_); > } > > void VideoStreamEncoder::SendKeyFrame() { >@@ -1085,7 +1100,7 @@ void VideoStreamEncoder::AdaptDown(AdaptReason reason) { > bool min_pixels_reached = false; > if (!source_proxy_->RequestResolutionLowerThan( > adaptation_request.input_pixel_count_, >- encoder_->GetScalingSettings().min_pixels_per_frame, >+ encoder_->GetEncoderInfo().scaling_settings.min_pixels_per_frame, > &min_pixels_reached)) { > if (min_pixels_reached) > encoder_stats_observer_->OnMinPixelLimitReached(); >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_stream_encoder.h b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_stream_encoder.h >index 4008426bac9cf0e6dc28e92864484713001256d5..8c5efbe5c369991b39bc3e9cf8278e364164a831 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_stream_encoder.h >+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_stream_encoder.h >@@ -268,6 +268,8 @@ class VideoStreamEncoder : public VideoStreamEncoderInterface, > absl::optional<int64_t> last_parameters_update_ms_ > RTC_GUARDED_BY(&encoder_queue_); > >+ VideoEncoder::EncoderInfo encoder_info_ RTC_GUARDED_BY(&encoder_queue_); >+ > // All public methods are proxied to |encoder_queue_|. It must must be > // destroyed first to make sure no tasks are run that use other members. > rtc::TaskQueue encoder_queue_; >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_stream_encoder_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_stream_encoder_unittest.cc >index 80f8d5a592175f3c22579033de06fcda3ab390d7..05a23c05b45e736d07dd8761b549be134a2280aa 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_stream_encoder_unittest.cc >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/video/video_stream_encoder_unittest.cc >@@ -8,13 +8,17 @@ > * be found in the AUTHORS file in the root of the source tree. 
> */ > >+#include "video/video_stream_encoder.h" >+ > #include <algorithm> > #include <limits> > #include <utility> > >+#include "api/video/builtin_video_bitrate_allocator_factory.h" > #include "api/video/i420_buffer.h" >+#include "api/video_codecs/create_vp8_temporal_layers.h" >+#include "api/video_codecs/vp8_temporal_layers.h" > #include "media/base/videoadapter.h" >-#include "modules/video_coding/codecs/vp8/include/vp8_temporal_layers.h" > #include "modules/video_coding/codecs/vp9/include/vp9_globals.h" > #include "modules/video_coding/utility/default_video_bitrate_allocator.h" > #include "rtc_base/fakeclock.h" >@@ -30,7 +34,6 @@ > #include "test/gtest.h" > #include "test/video_encoder_proxy_factory.h" > #include "video/send_statistics_proxy.h" >-#include "video/video_stream_encoder.h" > > namespace webrtc { > >@@ -102,7 +105,7 @@ class VideoStreamEncoderUnderTest : public VideoStreamEncoder { > new CpuOveruseDetectorProxy(stats_proxy))) {} > > void PostTaskAndWait(bool down, AdaptReason reason) { >- rtc::Event event(false, false); >+ rtc::Event event; > encoder_queue()->PostTask([this, &event, reason, down] { > down ? AdaptDown(reason) : AdaptUp(reason); > event.Set(); >@@ -113,7 +116,7 @@ class VideoStreamEncoderUnderTest : public VideoStreamEncoder { > // This is used as a synchronisation mechanism, to make sure that the > // encoder queue is not blocked before we start sending it frames. 
> void WaitUntilTaskQueueIsIdle() { >- rtc::Event event(false, false); >+ rtc::Event event; > encoder_queue()->PostTask([&event] { event.Set(); }); > ASSERT_TRUE(event.Wait(5000)); > } >@@ -277,6 +280,7 @@ class VideoStreamEncoderTest : public ::testing::Test { > max_framerate_(kDefaultFramerate), > fake_encoder_(), > encoder_factory_(&fake_encoder_), >+ bitrate_allocator_factory_(CreateBuiltinVideoBitrateAllocatorFactory()), > stats_proxy_(new MockableSendStatisticsProxy( > Clock::GetRealTimeClock(), > video_send_config_, >@@ -287,6 +291,8 @@ class VideoStreamEncoderTest : public ::testing::Test { > metrics::Reset(); > video_send_config_ = VideoSendStream::Config(nullptr); > video_send_config_.encoder_settings.encoder_factory = &encoder_factory_; >+ video_send_config_.encoder_settings.bitrate_allocator_factory = >+ bitrate_allocator_factory_.get(); > video_send_config_.rtp.payload_name = "FAKE"; > video_send_config_.rtp.payload_type = 125; > >@@ -492,9 +498,7 @@ class VideoStreamEncoderTest : public ::testing::Test { > > class TestEncoder : public test::FakeEncoder { > public: >- TestEncoder() >- : FakeEncoder(Clock::GetRealTimeClock()), >- continue_encode_event_(false, false) {} >+ TestEncoder() : FakeEncoder(Clock::GetRealTimeClock()) {} > > VideoCodec codec_config() const { > rtc::CritScope lock(&crit_sect_); >@@ -506,11 +510,14 @@ class VideoStreamEncoderTest : public ::testing::Test { > block_next_encode_ = true; > } > >- VideoEncoder::ScalingSettings GetScalingSettings() const override { >+ VideoEncoder::EncoderInfo GetEncoderInfo() const override { >+ EncoderInfo info; > rtc::CritScope lock(&local_crit_sect_); >- if (quality_scaling_) >- return VideoEncoder::ScalingSettings(1, 2, kMinPixelsPerFrame); >- return VideoEncoder::ScalingSettings::kOff; >+ if (quality_scaling_) { >+ info.scaling_settings = >+ VideoEncoder::ScalingSettings(1, 2, kMinPixelsPerFrame); >+ } >+ return info; > } > > void ContinueEncode() { continue_encode_event_.Set(); } >@@ -569,9 
+576,8 @@ class VideoStreamEncoderTest : public ::testing::Test { > int num_streams = std::max<int>(1, config->numberOfSimulcastStreams); > for (int i = 0; i < num_streams; ++i) { > allocated_temporal_layers_.emplace_back( >- TemporalLayers::CreateTemporalLayers( >- TemporalLayersType::kFixedPattern, >- config->VP8().numberOfTemporalLayers)); >+ CreateVp8TemporalLayers(Vp8TemporalLayersType::kFixedPattern, >+ config->VP8().numberOfTemporalLayers)); > } > } > if (force_init_encode_failed_) >@@ -587,7 +593,7 @@ class VideoStreamEncoderTest : public ::testing::Test { > int last_input_width_ RTC_GUARDED_BY(local_crit_sect_) = 0; > int last_input_height_ RTC_GUARDED_BY(local_crit_sect_) = 0; > bool quality_scaling_ RTC_GUARDED_BY(local_crit_sect_) = true; >- std::vector<std::unique_ptr<TemporalLayers>> allocated_temporal_layers_ >+ std::vector<std::unique_ptr<Vp8TemporalLayers>> allocated_temporal_layers_ > RTC_GUARDED_BY(local_crit_sect_); > bool force_init_encode_failed_ RTC_GUARDED_BY(local_crit_sect_) = false; > }; >@@ -595,7 +601,7 @@ class VideoStreamEncoderTest : public ::testing::Test { > class TestSink : public VideoStreamEncoder::EncoderSink { > public: > explicit TestSink(TestEncoder* test_encoder) >- : test_encoder_(test_encoder), encoded_frame_event_(false, false) {} >+ : test_encoder_(test_encoder) {} > > void WaitForEncodedFrame(int64_t expected_ntp_time) { > EXPECT_TRUE( >@@ -694,6 +700,7 @@ class VideoStreamEncoderTest : public ::testing::Test { > int max_framerate_; > TestEncoder fake_encoder_; > test::VideoEncoderProxyFactory encoder_factory_; >+ std::unique_ptr<VideoBitrateAllocatorFactory> bitrate_allocator_factory_; > std::unique_ptr<MockableSendStatisticsProxy> stats_proxy_; > TestSink sink_; > AdaptingFrameForwarder video_source_; >@@ -703,7 +710,7 @@ class VideoStreamEncoderTest : public ::testing::Test { > > TEST_F(VideoStreamEncoderTest, EncodeOneFrame) { > video_stream_encoder_->OnBitrateUpdated(kTargetBitrateBps, 0, 0); >- rtc::Event 
frame_destroyed_event(false, false); >+ rtc::Event frame_destroyed_event; > video_source_.IncomingCapturedFrame(CreateFrame(1, &frame_destroyed_event)); > WaitForEncodedFrame(1); > EXPECT_TRUE(frame_destroyed_event.Wait(kDefaultTimeoutMs)); >@@ -712,7 +719,7 @@ TEST_F(VideoStreamEncoderTest, EncodeOneFrame) { > > TEST_F(VideoStreamEncoderTest, DropsFramesBeforeFirstOnBitrateUpdated) { > // Dropped since no target bitrate has been set. >- rtc::Event frame_destroyed_event(false, false); >+ rtc::Event frame_destroyed_event; > // The encoder will cache up to one frame for a short duration. Adding two > // frames means that the first frame will be dropped and the second frame will > // be sent when the encoder is enabled. >@@ -770,7 +777,7 @@ TEST_F(VideoStreamEncoderTest, DropsFrameAfterStop) { > > video_stream_encoder_->Stop(); > sink_.SetExpectNoFrames(); >- rtc::Event frame_destroyed_event(false, false); >+ rtc::Event frame_destroyed_event; > video_source_.IncomingCapturedFrame(CreateFrame(2, &frame_destroyed_event)); > EXPECT_TRUE(frame_destroyed_event.Wait(kDefaultTimeoutMs)); > } >diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/webrtc.gni b/Source/ThirdParty/libwebrtc/Source/webrtc/webrtc.gni >index 6ecf859ba103d0f68607659d298eee7c6a3c7184..5c5214e8a157f16fca8b83803782b523b5947ad6 100644 >--- a/Source/ThirdParty/libwebrtc/Source/webrtc/webrtc.gni >+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/webrtc.gni >@@ -9,6 +9,7 @@ import("//build/config/arm.gni") > import("//build/config/features.gni") > import("//build/config/mips.gni") > import("//build/config/sanitizers/sanitizers.gni") >+import("//build/config/sysroot.gni") > import("//build/config/ui.gni") > import("//build_overrides/build.gni") > >@@ -35,6 +36,10 @@ if (is_mac) { > } > > declare_args() { >+ # Setting this to true will make RTC_EXPORT (see rtc_base/system/rtc_export.h) >+ # expand to code that will manage symbols visibility. 
>+ rtc_enable_symbol_export = false >+ > # Setting this to true will define WEBRTC_EXCLUDE_FIELD_TRIAL_DEFAULT which > # will tell the pre-processor to remove the default definition of symbols > # needed to use field_trial. In that case a new implementation needs to be >@@ -103,6 +108,9 @@ declare_args() { > # Set this to false to skip building code that requires X11. > rtc_use_x11 = use_x11 > >+ # Set this to use PipeWire on the Wayland display server. >+ rtc_use_pipewire = false >+ > # Enable to use the Mozilla internal settings. > build_with_mozilla = false > >@@ -225,7 +233,8 @@ rtc_libvpx_dir = "//third_party/libvpx" > rtc_opus_dir = "//third_party/opus" > > # Desktop capturer is supported only on Windows, OSX and Linux. >-rtc_desktop_capture_supported = is_win || is_mac || (is_linux && rtc_use_x11) >+rtc_desktop_capture_supported = >+ is_win || is_mac || (is_linux && (rtc_use_x11 || rtc_use_pipewire)) > > ############################################################################### > # Templates >@@ -649,6 +658,8 @@ if (is_ios) { > } > } > >+ # TODO: Generate module.modulemap file to enable use in Swift >+ # projects. See "mac_framework_bundle_with_umbrella_header". 
> template("ios_framework_bundle_with_umbrella_header") { > forward_variables_from(invoker, [ "output_name" ]) > umbrella_header_path = >@@ -699,12 +710,17 @@ if (is_mac) { > forward_variables_from(invoker, [ "output_name" ]) > this_target_name = target_name > umbrella_header_path = "$target_gen_dir/umbrella_header/$output_name.h" >+ modulemap_path = "$target_gen_dir/Modules/module.modulemap" > > mac_framework_bundle(target_name) { > forward_variables_from(invoker, "*", []) > > framework_version = "A" >- framework_contents = [ "Headers" ] >+ framework_contents = [ >+ "Headers", >+ "Modules", >+ "Resources", >+ ] > > ldflags = [ > "-all_load", >@@ -714,7 +730,9 @@ if (is_mac) { > > deps += [ > ":copy_framework_headers_$this_target_name", >+ ":copy_modulemap_$this_target_name", > ":copy_umbrella_header_$this_target_name", >+ ":modulemap_$this_target_name", > ":umbrella_header_$this_target_name", > ] > } >@@ -727,6 +745,31 @@ if (is_mac) { > ] > } > >+ action("modulemap_$this_target_name") { >+ script = "//tools_webrtc/ios/generate_modulemap.py" >+ args = [ >+ "--out", >+ rebase_path(modulemap_path, root_build_dir), >+ "--name", >+ output_name, >+ ] >+ outputs = [ >+ modulemap_path, >+ ] >+ } >+ >+ bundle_data("copy_modulemap_$this_target_name") { >+ sources = [ >+ modulemap_path, >+ ] >+ outputs = [ >+ "{{bundle_contents_dir}}/Modules/module.modulemap", >+ ] >+ deps = [ >+ ":modulemap_$this_target_name", >+ ] >+ } >+ > action("umbrella_header_$this_target_name") { > forward_variables_from(invoker, [ "sources" ]) > >@@ -769,14 +812,14 @@ if (is_android) { > "visibility", > ]) > >- javac_args = [] >+ errorprone_args = [] > > # Treat warnings as errors. 
>- javac_args += [ "-Werror" ] >+ errorprone_args += [ "-Werror" ] > > # TODO(crbug.com/824679): Find out why this fails in Chromium > if (!build_with_chromium) { >- javac_args += [ >+ errorprone_args += [ > "-Xep:ParameterNotNullable:ERROR", > "-Xep:FieldMissingNullable:ERROR", > "-Xep:ReturnMissingNullable:ERROR", >@@ -784,8 +827,8 @@ if (is_android) { > } > > # Add any arguments defined by the invoker. >- if (defined(invoker.javac_args)) { >- javac_args += invoker.javac_args >+ if (defined(invoker.errorprone_args)) { >+ errorprone_args += invoker.errorprone_args > } > > if (!defined(deps)) { >@@ -809,11 +852,11 @@ if (is_android) { > ]) > > # Treat warnings as errors. >- javac_args = [ "-Werror" ] >+ errorprone_args = [ "-Werror" ] > > # TODO(crbug.com/824679): Find out why this fails in Chromium > if (!build_with_chromium) { >- javac_args += [ >+ errorprone_args += [ > "-Xep:ParameterNotNullable:ERROR", > "-Xep:FieldMissingNullable:ERROR", > "-Xep:ReturnMissingNullable:ERROR", >@@ -841,11 +884,11 @@ if (is_android) { > ]) > > # Treat warnings as errors. 
>- javac_args = [ "-Werror" ] >+ errorprone_args = [ "-Werror" ] > > # TODO(crbug.com/824679): Find out why this fails in Chromium > if (!build_with_chromium) { >- javac_args += [ >+ errorprone_args += [ > "-Xep:ParameterNotNullable:ERROR", > "-Xep:FieldMissingNullable:ERROR", > "-Xep:ReturnMissingNullable:ERROR", >diff --git a/Source/ThirdParty/libwebrtc/WebKit/0001-libwebrtc-changes.patch b/Source/ThirdParty/libwebrtc/WebKit/0001-libwebrtc-changes.patch >deleted file mode 100644 >index 6b0134bb2d1a0f04461678e1f4bd58bd74faefd2..0000000000000000000000000000000000000000 >--- a/Source/ThirdParty/libwebrtc/WebKit/0001-libwebrtc-changes.patch >+++ /dev/null >@@ -1,2082 +0,0 @@ >-From 38a838284552711d6228db9bab2dec728402098d Mon Sep 17 00:00:00 2001 >-From: Youenn Fablet <youenn@apple.com> >-Date: Mon, 8 Oct 2018 10:38:12 -0700 >-Subject: [PATCH 1/2] libwebrtc changes >- >---- >- .../audio_codecs/ilbc/audio_encoder_ilbc.cc | 2 +- >- .../libwebrtc/Source/webrtc/api/mediatypes.cc | 4 +- >- .../Source/webrtc/audio/remix_resample.cc | 4 +- >- .../Source/webrtc/common_audio/wav_file.cc | 4 +- >- .../webrtc/common_video/video_frame_buffer.cc | 2 +- >- .../modules/audio_coding/acm2/rent_a_codec.cc | 4 +- >- .../codecs/cng/audio_encoder_cng.cc | 2 +- >- .../codecs/ilbc/audio_encoder_ilbc.cc | 4 +- >- .../neteq/tools/rtp_file_source.cc | 2 +- >- .../Source/webrtc/p2p/base/tcpport.cc | 10 + >- .../libwebrtc/Source/webrtc/pc/bundlefilter.h | 2 +- >- .../Source/webrtc/pc/peerconnectionfactory.cc | 4 +- >- .../webrtc/pc/rtpparametersconversion.cc | 2 +- >- .../libwebrtc/Source/webrtc/rtc_base/checks.h | 6 +- >- .../libwebrtc/Source/webrtc/rtc_base/flags.cc | 6 +- >- .../Source/webrtc/rtc_base/location.h | 2 +- >- .../Source/webrtc/rtc_base/logging.cc | 54 ++- >- .../webrtc/rtc_base/logging_unittest.cc | 2 +- >- .../Source/webrtc/rtc_base/never_destroyed.h | 69 ++++ >- .../rtc_base/numerics/safe_conversions.h | 4 +- >- .../webrtc/rtc_base/opensslcertificate.cc | 8 +- >- 
.../Source/webrtc/rtc_base/stringize_macros.h | 4 +- >- .../webrtc/rtc_base/virtualsocketserver.cc | 3 +- >- .../PeerConnection/RTCH264ProfileLevelId.mm | 58 +++ >- .../peerconnection/RTCEncodedImage+Private.mm | 4 +- >- .../peerconnection/RTCRtpEncodingParameters.h | 1 + >- .../RTCVideoEncoderSettings+Private.h | 7 + >- .../RTCVideoEncoderSettings+Private.mm | 52 ++- >- .../api/video_codec/RTCVideoCodecConstants.mm | 4 +- >- .../objc/api/video_codec/RTCVideoEncoderVP8.h | 1 + >- .../RTCWrappedNativeVideoEncoder.mm | 5 + >- .../video_frame_buffer/RTCNativeI420Buffer.h | 1 + >- .../RTCNativeMutableI420Buffer.h | 1 + >- .../sdk/objc/base/RTCCodecSpecificInfo.h | 1 + >- .../webrtc/sdk/objc/base/RTCEncodedImage.h | 2 + >- .../webrtc/sdk/objc/base/RTCEncodedImage.m | 1 + >- .../webrtc/sdk/objc/base/RTCI420Buffer.h | 1 + >- .../sdk/objc/base/RTCRtpFragmentationHeader.h | 1 + >- .../sdk/objc/base/RTCRtpFragmentationHeader.m | 3 +- >- .../webrtc/sdk/objc/base/RTCVideoCodecInfo.h | 1 + >- .../webrtc/sdk/objc/base/RTCVideoDecoder.h | 1 + >- .../sdk/objc/base/RTCVideoDecoderFactory.h | 1 + >- .../webrtc/sdk/objc/base/RTCVideoEncoder.h | 8 +- >- .../sdk/objc/base/RTCVideoEncoderFactory.h | 1 + >- .../objc/base/RTCVideoEncoderQpThresholds.h | 1 + >- .../sdk/objc/base/RTCVideoEncoderSettings.h | 6 + >- .../webrtc/sdk/objc/base/RTCVideoFrame.h | 1 + >- .../sdk/objc/base/RTCVideoFrameBuffer.h | 1 + >- .../video_codec/RTCCodecSpecificInfoH264.h | 3 +- >- .../RTCDefaultVideoDecoderFactory.h | 1 + >- .../RTCDefaultVideoEncoderFactory.h | 1 + >- .../video_codec/RTCH264ProfileLevelId.h | 1 + >- .../video_codec/RTCH264ProfileLevelId.mm | 16 +- >- .../video_codec/RTCVideoDecoderFactoryH264.h | 1 + >- .../video_codec/RTCVideoDecoderH264.h | 1 + >- .../video_codec/RTCVideoEncoderFactoryH264.h | 1 + >- .../video_codec/RTCVideoEncoderH264.h | 1 + >- .../video_codec/RTCVideoEncoderH264.mm | 382 ++++++++++++++---- >- .../objc/components/video_codec/helpers.cc | 16 +- >- 
.../sdk/objc/components/video_codec/helpers.h | 10 +- >- .../video_frame_buffer/RTCCVPixelBuffer.h | 1 + >- .../native/src/objc_video_encoder_factory.mm | 6 + >- 62 files changed, 637 insertions(+), 172 deletions(-) >- create mode 100644 Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/never_destroyed.h >- create mode 100644 Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/Framework/Classes/PeerConnection/RTCH264ProfileLevelId.mm >- >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_encoder_ilbc.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_encoder_ilbc.cc >-index efcef382ff7..2ae75474ccd 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_encoder_ilbc.cc >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_encoder_ilbc.cc >-@@ -33,7 +33,7 @@ int GetIlbcBitrate(int ptime) { >- // 50 bytes per frame of 30 ms => (approx) 13333 bits/s. >- return 13333; >- default: >-- FATAL(); >-+ RTC_FATAL(); >- } >- } >- } // namespace >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/mediatypes.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/mediatypes.cc >-index 599542db08f..140d0ae9625 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/mediatypes.cc >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/mediatypes.cc >-@@ -28,7 +28,7 @@ std::string MediaTypeToString(MediaType type) { >- case MEDIA_TYPE_DATA: >- return kMediaTypeData; >- } >-- FATAL(); >-+ RTC_FATAL(); >- // Not reachable; avoids compile warning. >- return ""; >- } >-@@ -41,7 +41,7 @@ MediaType MediaTypeFromString(const std::string& type_str) { >- } else if (type_str == kMediaTypeData) { >- return MEDIA_TYPE_DATA; >- } >-- FATAL(); >-+ RTC_FATAL(); >- // Not reachable; avoids compile warning. 
>- return static_cast<MediaType>(-1); >- } >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/remix_resample.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/remix_resample.cc >-index eda70c7c378..97222e947a4 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/remix_resample.cc >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/remix_resample.cc >-@@ -58,7 +58,7 @@ void RemixAndResample(const int16_t* src_data, >- >- if (resampler->InitializeIfNeeded(sample_rate_hz, dst_frame->sample_rate_hz_, >- audio_ptr_num_channels) == -1) { >-- FATAL() << "InitializeIfNeeded failed: sample_rate_hz = " << sample_rate_hz >-+ RTC_FATAL() << "InitializeIfNeeded failed: sample_rate_hz = " << sample_rate_hz >- << ", dst_frame->sample_rate_hz_ = " << dst_frame->sample_rate_hz_ >- << ", audio_ptr_num_channels = " << audio_ptr_num_channels; >- } >-@@ -72,7 +72,7 @@ void RemixAndResample(const int16_t* src_data, >- resampler->Resample(audio_ptr, src_length, dst_frame->mutable_data(), >- AudioFrame::kMaxDataSizeSamples); >- if (out_length == -1) { >-- FATAL() << "Resample failed: audio_ptr = " << audio_ptr >-+ RTC_FATAL() << "Resample failed: audio_ptr = " << audio_ptr >- << ", src_length = " << src_length >- << ", dst_frame->mutable_data() = " << dst_frame->mutable_data(); >- } >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/wav_file.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/wav_file.cc >-index 008891f9cfb..0209f52d2c7 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/wav_file.cc >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/wav_file.cc >-@@ -53,7 +53,7 @@ WavReader::WavReader(rtc::PlatformFile file) { >- if (!rtc::ClosePlatformFile(file)) { >- RTC_LOG(LS_ERROR) << "Can't close file."; >- } >-- FATAL() << "Could not open wav file for reading."; >-+ RTC_FATAL() << "Could not open wav file for reading."; >- } >- >- ReadableWavFile readable(file_handle_); >-@@ -138,7 +138,7 @@ 
WavWriter::WavWriter(rtc::PlatformFile file, >- if (!rtc::ClosePlatformFile(file)) { >- RTC_LOG(LS_ERROR) << "Can't close file."; >- } >-- FATAL() << "Could not open wav file for writing."; >-+ RTC_FATAL() << "Could not open wav file for writing."; >- } >- >- RTC_CHECK(CheckWavParameters(num_channels_, sample_rate_, kWavFormat, >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/video_frame_buffer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/video_frame_buffer.cc >-index 2cd0b290f15..be0b4927bbe 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/video_frame_buffer.cc >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/video_frame_buffer.cc >-@@ -323,7 +323,7 @@ rtc::scoped_refptr<PlanarYuvBuffer> WrapYuvBuffer( >- return WrapI444Buffer(width, height, y_plane, y_stride, u_plane, u_stride, >- v_plane, v_stride, no_longer_used); >- default: >-- FATAL() << "Unexpected frame buffer type."; >-+ RTC_FATAL() << "Unexpected frame buffer type."; >- return nullptr; >- } >- } >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/rent_a_codec.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/rent_a_codec.cc >-index 0a9ce117e2b..45d78d74a5d 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/rent_a_codec.cc >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/acm2/rent_a_codec.cc >-@@ -211,7 +211,7 @@ std::unique_ptr<AudioEncoder> CreateCngEncoder( >- config.vad_mode = Vad::kVadVeryAggressive; >- break; >- default: >-- FATAL(); >-+ RTC_FATAL(); >- } >- return std::unique_ptr<AudioEncoder>(new AudioEncoderCng(std::move(config))); >- } >-@@ -226,7 +226,7 @@ std::unique_ptr<AudioDecoder> CreateIsacDecoder( >- return std::unique_ptr<AudioDecoder>( >- new AudioDecoderIsacFloatImpl(sample_rate_hz, bwinfo)); >- #else >-- FATAL() << "iSAC is not supported."; >-+ RTC_FATAL() << "iSAC is not supported."; >- return 
std::unique_ptr<AudioDecoder>(); >- #endif >- } >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/audio_encoder_cng.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/audio_encoder_cng.cc >-index 4cda3401fa2..055190497a5 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/audio_encoder_cng.cc >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/audio_encoder_cng.cc >-@@ -139,7 +139,7 @@ AudioEncoder::EncodedInfo AudioEncoderCng::EncodeImpl( >- break; >- } >- case Vad::kError: { >-- FATAL(); // Fails only if fed invalid data. >-+ RTC_FATAL(); // Fails only if fed invalid data. >- break; >- } >- } >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/ilbc/audio_encoder_ilbc.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/ilbc/audio_encoder_ilbc.cc >-index 84695e37630..dae956b30c0 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/ilbc/audio_encoder_ilbc.cc >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/ilbc/audio_encoder_ilbc.cc >-@@ -40,7 +40,7 @@ int GetIlbcBitrate(int ptime) { >- // 50 bytes per frame of 30 ms => (approx) 13333 bits/s. 
>- return 13333; >- default: >-- FATAL(); >-+ RTC_FATAL(); >- } >- } >- >-@@ -147,7 +147,7 @@ size_t AudioEncoderIlbcImpl::RequiredOutputSizeBytes() const { >- case 6: >- return 2 * 50; >- default: >-- FATAL(); >-+ RTC_FATAL(); >- } >- } >- >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtp_file_source.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtp_file_source.cc >-index 806bba7a9fe..7bf7d52083c 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtp_file_source.cc >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/neteq/tools/rtp_file_source.cc >-@@ -89,7 +89,7 @@ bool RtpFileSource::OpenFile(const std::string& file_name) { >- return true; >- rtp_reader_.reset(RtpFileReader::Create(RtpFileReader::kPcap, file_name)); >- if (!rtp_reader_) { >-- FATAL() << "Couldn't open input file as either a rtpdump or .pcap. Note " >-+ RTC_FATAL() << "Couldn't open input file as either a rtpdump or .pcap. 
Note " >- "that .pcapng is not supported."; >- } >- return true; >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/tcpport.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/tcpport.cc >-index d070f6ecee8..c66c7307375 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/tcpport.cc >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/tcpport.cc >-@@ -351,11 +351,21 @@ TCPConnection::TCPConnection(TCPPort* port, >- << ", port() Network:" << port->Network()->ToString(); >- const std::vector<rtc::InterfaceAddress>& desired_addresses = >- port_->Network()->GetIPs(); >-+ >-+#if defined(WEBRTC_WEBKIT_BUILD) >-+ RTC_DCHECK(socket->GetLocalAddress().IsLoopbackIP() || >-+ (std::find_if(desired_addresses.begin(), desired_addresses.end(), >-+ [this](const rtc::InterfaceAddress& addr) { >-+ return socket_->GetLocalAddress().ipaddr() == >-+ addr; >-+ }) != desired_addresses.end())); >-+ #else >- RTC_DCHECK(std::find_if(desired_addresses.begin(), desired_addresses.end(), >- [this](const rtc::InterfaceAddress& addr) { >- return socket_->GetLocalAddress().ipaddr() == >- addr; >- }) != desired_addresses.end()); >-+#endif >- ConnectSocketSignals(socket); >- } >- } >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/bundlefilter.h b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/bundlefilter.h >-index 7decbba8a4f..de477f457f2 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/bundlefilter.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/bundlefilter.h >-@@ -17,7 +17,7 @@ >- #include <vector> >- >- #include "media/base/streamparams.h" >--#include "rtc_base/basictypes.h" >-+//#include "rtc_base/basictypes.h" >- >- namespace cricket { >- >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectionfactory.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectionfactory.cc >-index b78ed8db1ee..53f4da8516e 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectionfactory.cc >-+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectionfactory.cc >-@@ -257,7 +257,7 @@ RtpCapabilities PeerConnectionFactory::GetRtpSenderCapabilities( >- return RtpCapabilities(); >- } >- // Not reached; avoids compile warning. >-- FATAL(); >-+ RTC_FATAL(); >- } >- >- RtpCapabilities PeerConnectionFactory::GetRtpReceiverCapabilities( >-@@ -284,7 +284,7 @@ RtpCapabilities PeerConnectionFactory::GetRtpReceiverCapabilities( >- return RtpCapabilities(); >- } >- // Not reached; avoids compile warning. >-- FATAL(); >-+ RTC_FATAL(); >- } >- >- rtc::scoped_refptr<AudioSourceInterface> >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpparametersconversion.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpparametersconversion.cc >-index a57066e4db0..e519b9b06de 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpparametersconversion.cc >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpparametersconversion.cc >-@@ -61,7 +61,7 @@ RTCErrorOr<cricket::FeedbackParam> ToCricketFeedbackParam( >- return cricket::FeedbackParam(cricket::kRtcpFbParamTransportCc); >- } >- // Not reached; avoids compile warning. >-- FATAL(); >-+ RTC_FATAL(); >- } >- >- template <typename C> >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/checks.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/checks.h >-index 9de6d47573c..5cf0679910e 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/checks.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/checks.h >-@@ -80,7 +80,7 @@ RTC_NORETURN void rtc_FatalMessage(const char* file, int line, const char* msg); >- // messages if the condition doesn't hold. Prefer them to raw RTC_CHECK and >- // RTC_DCHECK. >- // >--// - FATAL() aborts unconditionally. >-+// - RTC_FATAL() aborts unconditionally. >- // >- // TODO(ajm): Ideally, checks.h would be combined with logging.h, but >- // consolidation with system_wrappers/logging.h should happen first. 
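The checks.h hunk above renames FATAL() to RTC_FATAL() while keeping the `FatalLogCall<false>(...) & LogStreamer()` expansion. That expansion works because `<<` binds tighter than `&`, so the whole streamed message is assembled before the fatal handler runs. A minimal standalone sketch of the same precedence trick follows — these are not WebKit's actual classes, and the capture-into-a-string sink is an illustrative stand-in for the real abort:

```cpp
#include <sstream>
#include <string>

namespace sketch {

// Captured instead of aborting, purely for illustration.
std::string g_last_fatal;

struct LogStreamer {
  std::ostringstream stream;
  template <typename T>
  LogStreamer& operator<<(const T& value) {
    stream << value;
    return *this;
  }
};

struct FatalLogCall {
  const char* file;
  int line;
  // '<<' has higher precedence than '&', so by the time this runs the
  // streamer already holds the full "SKETCH_FATAL() << a << b" message.
  void operator&(const LogStreamer& streamer) const {
    std::ostringstream out;
    out << "# Fatal error in: " << file << ", line " << line << "\n"
        << streamer.stream.str();
    g_last_fatal = out.str();  // the real macro aborts here instead
  }
};

}  // namespace sketch

// Mirrors the shape of the renamed RTC_FATAL() macro in checks.h.
#define SKETCH_FATAL() \
  sketch::FatalLogCall{__FILE__, __LINE__} & sketch::LogStreamer()
```

Because the macro expands to an expression rather than a statement, `SKETCH_FATAL() << "reason" << code;` composes like any other stream call, which is what lets the rename be a pure text substitution across all the call sites in this patch.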
>-@@ -348,9 +348,9 @@ class FatalLogCall final { >- #define RTC_NOTREACHED() RTC_DCHECK(RTC_UNREACHABLE_CODE_HIT) >- >- // TODO(bugs.webrtc.org/8454): Add an RTC_ prefix or rename differently. >--#define FATAL() \ >-+#define RTC_FATAL() \ >- rtc::webrtc_checks_impl::FatalLogCall<false>(__FILE__, __LINE__, \ >-- "FATAL()") & \ >-+ "RTC_FATAL()") & \ >- rtc::webrtc_checks_impl::LogStreamer<>() >- >- // Performs the integer division a/b and returns the result. CHECKs that the >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/flags.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/flags.cc >-index 5b28794f6ca..4f294594bd4 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/flags.cc >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/flags.cc >-@@ -80,7 +80,7 @@ void Flag::SetToDefault() { >- variable_->s = default_.s; >- return; >- } >-- FATAL() << "unreachable code"; >-+ RTC_FATAL() << "unreachable code"; >- } >- >- static const char* Type2String(Flag::Type type) { >-@@ -94,7 +94,7 @@ static const char* Type2String(Flag::Type type) { >- case Flag::STRING: >- return "string"; >- } >-- FATAL() << "unreachable code"; >-+ RTC_FATAL() << "unreachable code"; >- } >- >- static void PrintFlagValue(Flag::Type type, FlagValue* p) { >-@@ -112,7 +112,7 @@ static void PrintFlagValue(Flag::Type type, FlagValue* p) { >- printf("%s", p->s); >- return; >- } >-- FATAL() << "unreachable code"; >-+ RTC_FATAL() << "unreachable code"; >- } >- >- void Flag::Print(bool print_current_value) { >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/location.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/location.h >-index 513bc263651..718d9589348 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/location.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/location.h >-@@ -50,7 +50,7 @@ class Location { >- #define RTC_FROM_HERE RTC_FROM_HERE_WITH_FUNCTION(__FUNCTION__) >- >- #define 
RTC_FROM_HERE_WITH_FUNCTION(function_name) \ >-- ::rtc::Location(function_name, __FILE__ ":" STRINGIZE(__LINE__)) >-+ ::rtc::Location(function_name, __FILE__ ":" RTC_STRINGIZE(__LINE__)) >- >- } // namespace rtc >- >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging.cc >-index 129a8974397..58456c599ec 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging.cc >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging.cc >-@@ -36,6 +36,7 @@ static const int kMaxLogLineSize = 1024 - 60; >- #include "rtc_base/criticalsection.h" >- #include "rtc_base/logging.h" >- #include "rtc_base/platform_thread_types.h" >-+#include "rtc_base/never_destroyed.h" >- #include "rtc_base/stringencode.h" >- #include "rtc_base/strings/string_builder.h" >- #include "rtc_base/stringutils.h" >-@@ -62,8 +63,12 @@ const char* FilenameFromPath(const char* file) { >- return (end1 > end2) ? end1 + 1 : end2 + 1; >- } >- >--// Global lock for log subsystem, only needed to serialize access to streams_. >--CriticalSection g_log_crit; >-+// Global lock for log subsystem, only needed to serialize access to streamList(). >-+CriticalSection& logCriticalScope() { >-+ static auto scope = makeNeverDestroyed<>(CriticalSection { }); >-+ return scope.get(); >-+} >-+ >- } // namespace >- >- // Inefficient default implementation, override is recommended. >-@@ -83,7 +88,15 @@ bool LogMessage::log_to_stderr_ = true; >- // Note: we explicitly do not clean this up, because of the uncertain ordering >- // of destructors at program exit. Let the person who sets the stream trigger >- // cleanup by setting to null, or let it leak (safe at program exit). 
>--LogMessage::StreamList LogMessage::streams_ RTC_GUARDED_BY(g_log_crit); >-+typedef std::pair<LogSink*, LoggingSeverity> StreamAndSeverity; >-+typedef std::list<StreamAndSeverity> StreamList; >-+ >-+// The output streams and their associated severities >-+StreamList& streamList() >-+ RTC_EXCLUSIVE_LOCKS_REQUIRED(logCriticalScope()) { >-+ static auto stream_list = makeNeverDestroyed<>(StreamList { }); >-+ return stream_list.get(); >-+} >- >- // Boolean options default to false (0) >- bool LogMessage::thread_, LogMessage::timestamp_; >-@@ -97,6 +110,10 @@ LogMessage::LogMessage(const char* file, >- LogErrorContext err_ctx, >- int err) >- : severity_(sev) { >-+ >-+ static std::once_flag callLogCriticalScopeOnce; >-+ std::call_once(callLogCriticalScopeOnce,[] { logCriticalScope(); }); >-+ >- if (timestamp_) { >- // Use SystemTimeMillis so that even if tests use fake clocks, the timestamp >- // in log messages represents the real system time. >-@@ -197,8 +214,8 @@ LogMessage::~LogMessage() { >- #endif >- } >- >-- CritScope cs(&g_log_crit); >-- for (auto& kv : streams_) { >-+ CritScope cs(&logCriticalScope()); >-+ for (auto& kv : streamList()) { >- if (severity_ >= kv.second) { >- #if defined(WEBRTC_ANDROID) >- kv.first->OnLogMessage(str, severity_, tag_); >-@@ -246,7 +263,7 @@ void LogMessage::LogTimestamps(bool on) { >- >- void LogMessage::LogToDebug(LoggingSeverity min_sev) { >- g_dbg_sev = min_sev; >-- CritScope cs(&g_log_crit); >-+ CritScope cs(&logCriticalScope()); >- UpdateMinLogSeverity(); >- } >- >-@@ -255,9 +272,9 @@ void LogMessage::SetLogToStderr(bool log_to_stderr) { >- } >- >- int LogMessage::GetLogToStream(LogSink* stream) { >-- CritScope cs(&g_log_crit); >-+ CritScope cs(&logCriticalScope()); >- LoggingSeverity sev = LS_NONE; >-- for (auto& kv : streams_) { >-+ for (auto& kv : streamList()) { >- if (!stream || stream == kv.first) { >- sev = std::min(sev, kv.second); >- } >-@@ -266,16 +283,16 @@ int LogMessage::GetLogToStream(LogSink* stream) { >- } >- 
>- void LogMessage::AddLogToStream(LogSink* stream, LoggingSeverity min_sev) { >-- CritScope cs(&g_log_crit); >-- streams_.push_back(std::make_pair(stream, min_sev)); >-+ CritScope cs(&logCriticalScope()); >-+ streamList().push_back(std::make_pair(stream, min_sev)); >- UpdateMinLogSeverity(); >- } >- >- void LogMessage::RemoveLogToStream(LogSink* stream) { >-- CritScope cs(&g_log_crit); >-- for (StreamList::iterator it = streams_.begin(); it != streams_.end(); ++it) { >-+ CritScope cs(&logCriticalScope()); >-+ for (StreamList::iterator it = streamList().begin(); it != streamList().end(); ++it) { >- if (stream == it->first) { >-- streams_.erase(it); >-+ streamList().erase(it); >- break; >- } >- } >-@@ -334,9 +351,9 @@ void LogMessage::ConfigureLogging(const char* params) { >- } >- >- void LogMessage::UpdateMinLogSeverity() >-- RTC_EXCLUSIVE_LOCKS_REQUIRED(g_log_crit) { >-+ RTC_EXCLUSIVE_LOCKS_REQUIRED(logCriticalScope()) { >- LoggingSeverity min_sev = g_dbg_sev; >-- for (const auto& kv : streams_) { >-+ for (const auto& kv : streamList()) { >- const LoggingSeverity sev = kv.second; >- min_sev = std::min(min_sev, sev); >- } >-@@ -450,11 +467,8 @@ bool LogMessage::IsNoop(LoggingSeverity severity) { >- // TODO(tommi): We're grabbing this lock for every LogMessage instance that >- // is going to be logged. This introduces unnecessary synchronization for >- // a feature that's mostly used for testing. 
>-- CritScope cs(&g_log_crit); >-- if (streams_.size() > 0) >-- return false; >-- >-- return true; >-+ CritScope cs(&logCriticalScope()); >-+ return streamList().size() == 0; >- } >- >- void LogMessage::FinishPrintStream() { >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging_unittest.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging_unittest.cc >-index a475e52d465..6b2263923fa 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging_unittest.cc >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging_unittest.cc >-@@ -221,7 +221,7 @@ TEST(LogTest, SingleStream) { >- >- #if GTEST_HAS_DEATH_TEST && !defined(WEBRTC_ANDROID) >- TEST(LogTest, Checks) { >-- EXPECT_DEATH(FATAL() << "message", >-+ EXPECT_DEATH(RTC_FATAL() << "message", >- "\n\n#\n" >- "# Fatal error in: \\S+, line \\w+\n" >- "# last system error: \\w+\n" >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/never_destroyed.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/never_destroyed.h >-new file mode 100644 >-index 00000000000..fcc62e35534 >---- /dev/null >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/never_destroyed.h >-@@ -0,0 +1,69 @@ >-+/* >-+ * Copyright (C) 2018 Apple Inc. All rights reserved. >-+ * >-+ * Redistribution and use in source and binary forms, with or without >-+ * modification, are permitted provided that the following conditions >-+ * are met: >-+ * 1. Redistributions of source code must retain the above copyright >-+ * notice, this list of conditions and the following disclaimer. >-+ * 2. Redistributions in binary form must reproduce the above copyright >-+ * notice, this list of conditions and the following disclaimer in the >-+ * documentation and/or other materials provided with the distribution. >-+ * >-+ * THIS SOFTWARE IS PROVIDED BY APPLE INC. 
AND ITS CONTRIBUTORS ``AS IS'' >-+ * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, >-+ * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR >-+ * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS >-+ * BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR >-+ * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF >-+ * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS >-+ * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN >-+ * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) >-+ * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF >-+ * THE POSSIBILITY OF SUCH DAMAGE. >-+ */ >-+ >-+#pragma once >-+ >-+#include <type_traits> >-+#include <utility> >-+ >-+namespace rtc { >-+ >-+template<typename T> class NeverDestroyed { >-+public: >-+ template<typename... Args> NeverDestroyed(Args&&... args) >-+ { >-+ new (storagePointer()) T(std::forward<Args>(args)...); >-+ } >-+ >-+ NeverDestroyed(NeverDestroyed&& other) >-+ { >-+ new (storagePointer()) T(std::move(*other.storagePointer())); >-+ } >-+ >-+ operator T&() { return *storagePointer(); } >-+ T& get() { return *storagePointer(); } >-+ >-+ operator const T&() const { return *storagePointer(); } >-+ const T& get() const { return *storagePointer(); } >-+ >-+private: >-+ NeverDestroyed(const NeverDestroyed&) = delete; >-+ NeverDestroyed& operator=(const NeverDestroyed&) = delete; >-+ >-+ using PointerType = typename std::remove_const<T>::type*; >-+ >-+ PointerType storagePointer() const { return const_cast<PointerType>(reinterpret_cast<const T*>(&m_storage)); } >-+ >-+ // FIXME: Investigate whether we should allocate a hunk of virtual memory >-+ // and hand out chunks of it to NeverDestroyed instead, to reduce fragmentation. 
>-+ typename std::aligned_storage<sizeof(T), std::alignment_of<T>::value>::type m_storage; >-+}; >-+ >-+template<typename T> inline NeverDestroyed<T> makeNeverDestroyed(T&& argument) >-+{ >-+ return NeverDestroyed<T>(std::move(argument)); >-+} >-+ >-+} // namespace rtc >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/numerics/safe_conversions.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/numerics/safe_conversions.h >-index 58efcaa746a..48c212e4d49 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/numerics/safe_conversions.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/numerics/safe_conversions.h >-@@ -63,11 +63,11 @@ inline Dst saturated_cast(Src value) { >- >- // Should fail only on attempting to assign NaN to a saturated integer. >- case internal::TYPE_INVALID: >-- FATAL(); >-+ RTC_FATAL(); >- return std::numeric_limits<Dst>::max(); >- } >- >-- FATAL(); >-+ RTC_FATAL(); >- return static_cast<Dst>(value); >- } >- >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslcertificate.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslcertificate.cc >-index ed67a8938e1..236bef55c8a 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslcertificate.cc >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslcertificate.cc >-@@ -256,11 +256,11 @@ OpenSSLCertificate* OpenSSLCertificate::GetReference() const { >- std::string OpenSSLCertificate::ToPEMString() const { >- BIO* bio = BIO_new(BIO_s_mem()); >- if (!bio) { >-- FATAL() << "unreachable code"; >-+ RTC_FATAL() << "unreachable code"; >- } >- if (!PEM_write_bio_X509(bio, x509_)) { >- BIO_free(bio); >-- FATAL() << "unreachable code"; >-+ RTC_FATAL() << "unreachable code"; >- } >- BIO_write(bio, "\0", 1); >- char* buffer; >-@@ -277,11 +277,11 @@ void OpenSSLCertificate::ToDER(Buffer* der_buffer) const { >- // Calculates the DER representation of the certificate, from scratch. 
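The never_destroyed.h addition above is what lets logging.cc replace the mutable globals (`g_log_crit`, `streams_`) with function-local accessors like `logCriticalScope()` and `streamList()`: the object is built in aligned storage, its destructor never runs, and a static local gives lazy, first-use construction. A condensed standalone sketch of that idiom, simplified from the class in the patch (names here are illustrative):

```cpp
#include <new>
#include <string>
#include <type_traits>
#include <utility>

// Condensed NeverDestroyed: construct T in aligned storage via placement
// new and never run its destructor, so the object remains valid for the
// entire program lifetime (no exit-time destruction ordering hazards).
template <typename T>
class NeverDestroyed {
 public:
  template <typename... Args>
  explicit NeverDestroyed(Args&&... args) {
    new (&storage_) T(std::forward<Args>(args)...);
  }
  NeverDestroyed(const NeverDestroyed&) = delete;
  NeverDestroyed& operator=(const NeverDestroyed&) = delete;

  T& get() { return *reinterpret_cast<T*>(&storage_); }

 private:
  typename std::aligned_storage<sizeof(T), alignof(T)>::type storage_;
};

// Accessor in the style of logging.cc's streamList(): the static local is
// constructed on first use and intentionally leaked at program exit.
std::string& logBuffer() {
  static NeverDestroyed<std::string> buffer("ready");
  return buffer.get();
}
```

Compared with a plain global, this sidesteps both the static initialization order fiasco (construction happens on first call) and the unpredictable destructor ordering at shutdown that the comment removed from logging.cc warns about.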
>- BIO* bio = BIO_new(BIO_s_mem()); >- if (!bio) { >-- FATAL() << "unreachable code"; >-+ RTC_FATAL() << "unreachable code"; >- } >- if (!i2d_X509_bio(bio, x509_)) { >- BIO_free(bio); >-- FATAL() << "unreachable code"; >-+ RTC_FATAL() << "unreachable code"; >- } >- char* data; >- size_t length = BIO_get_mem_data(bio, &data); >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringize_macros.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringize_macros.h >-index aee8d14551d..38c6f1836bd 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringize_macros.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringize_macros.h >-@@ -8,7 +8,7 @@ >- * be found in the AUTHORS file in the root of the source tree. >- */ >- >--// Modified from the Chromium original: >-+// Modified from the Chxsromium original: >- // src/base/strings/stringize_macros.h >- >- // This file defines preprocessor macros for stringizing preprocessor >-@@ -33,6 +33,6 @@ >- // Then: >- // STRINGIZE(A) produces "FOO" >- // STRINGIZE(B(y)) produces "myobj->FunctionCall(y)" >--#define STRINGIZE(x) STRINGIZE_NO_EXPANSION(x) >-+#define RTC_STRINGIZE(x) STRINGIZE_NO_EXPANSION(x) >- >- #endif // RTC_BASE_STRINGIZE_MACROS_H_ >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/virtualsocketserver.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/virtualsocketserver.cc >-index 53dad46fced..ae40d7888f5 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/virtualsocketserver.cc >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/virtualsocketserver.cc >-@@ -1027,9 +1027,8 @@ void VirtualSocketServer::UpdateDelayDistribution() { >- } >- } >- >--static double PI = 4 * atan(1.0); >-- >- static double Normal(double x, double mean, double stddev) { >-+ static const double PI = 4 * atan(1.0); >- double a = (x - mean) * (x - mean) / (2 * stddev * stddev); >- return exp(-a) / (stddev * sqrt(2 * PI)); >- } >-diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/Framework/Classes/PeerConnection/RTCH264ProfileLevelId.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/Framework/Classes/PeerConnection/RTCH264ProfileLevelId.mm >-new file mode 100644 >-index 00000000000..04a5689417c >---- /dev/null >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/Framework/Classes/PeerConnection/RTCH264ProfileLevelId.mm >-@@ -0,0 +1,58 @@ >-+/* >-+ * Copyright 2018 The WebRTC project authors. All Rights Reserved. >-+ * >-+ * Use of this source code is governed by a BSD-style license >-+ * that can be found in the LICENSE file in the root of the source >-+ * tree. An additional intellectual property rights grant can be found >-+ * in the file PATENTS. All contributing project authors may >-+ * be found in the AUTHORS file in the root of the source tree. >-+ * >-+ */ >-+ >-+#import "WebRTC/RTCVideoCodecH264.h" >-+ >-+#include "media/base/h264_profile_level_id.h" >-+ >-+@interface RTCH264ProfileLevelId () >-+ >-+@property(nonatomic, assign) RTCH264Profile profile; >-+@property(nonatomic, assign) RTCH264Level level; >-+@property(nonatomic, strong) NSString *hexString; >-+ >-+@end >-+ >-+@implementation RTCH264ProfileLevelId >-+ >-+@synthesize profile = _profile; >-+@synthesize level = _level; >-+@synthesize hexString = _hexString; >-+ >-+- (instancetype)initWithHexString:(NSString *)hexString { >-+ if (self = [super init]) { >-+ self.hexString = hexString; >-+ >-+ absl::optional<webrtc::H264::ProfileLevelId> profile_level_id = >-+ webrtc::H264::ParseProfileLevelId([hexString cStringUsingEncoding:NSUTF8StringEncoding]); >-+ if (profile_level_id.has_value()) { >-+ self.profile = static_cast<RTCH264Profile>(profile_level_id->profile); >-+ self.level = static_cast<RTCH264Level>(profile_level_id->level); >-+ } >-+ } >-+ return self; >-+} >-+ >-+- (instancetype)initWithProfile:(RTCH264Profile)profile level:(RTCH264Level)level { >-+ if (self = [super init]) { >-+ self.profile = profile; 
>-+ self.level = level; >-+ >-+ absl::optional<std::string> hex_string = >-+ webrtc::H264::ProfileLevelIdToString(webrtc::H264::ProfileLevelId( >-+ static_cast<webrtc::H264::Profile>(profile), static_cast<webrtc::H264::Level>(level))); >-+ self.hexString = >-+ [NSString stringWithCString:hex_string.value_or("").c_str() encoding:NSUTF8StringEncoding]; >-+ } >-+ return self; >-+} >-+ >-+@end >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCEncodedImage+Private.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCEncodedImage+Private.mm >-index 6f2d1f46d9c..94cb1f110f9 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCEncodedImage+Private.mm >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCEncodedImage+Private.mm >-@@ -35,6 +35,7 @@ - (instancetype)initWithNativeEncodedImage:(webrtc::EncodedImage)encodedImage { >- self.contentType = (encodedImage.content_type_ == webrtc::VideoContentType::SCREENSHARE) ? >- RTCVideoContentTypeScreenshare : >- RTCVideoContentTypeUnspecified; >-+ self.spatialIndex = encodedImage.SpatialIndex() ? *encodedImage.SpatialIndex() : 0; >- } >- >- return self; >-@@ -59,7 +60,8 @@ - (webrtc::EncodedImage)nativeEncodedImage { >- encodedImage.content_type_ = (self.contentType == RTCVideoContentTypeScreenshare) ? 
>- webrtc::VideoContentType::SCREENSHARE : >- webrtc::VideoContentType::UNSPECIFIED; >-- >-+ if (self.spatialIndex) >-+ encodedImage.SetSpatialIndex(self.spatialIndex); >- return encodedImage; >- } >- >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCRtpEncodingParameters.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCRtpEncodingParameters.h >-index ba50bde649f..dabff725abb 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCRtpEncodingParameters.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCRtpEncodingParameters.h >-@@ -15,6 +15,7 @@ >- NS_ASSUME_NONNULL_BEGIN >- >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCRtpEncodingParameters"))) >- @interface RTCRtpEncodingParameters : NSObject >- >- /** Controls whether the encoding is currently transmitted. */ >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCVideoEncoderSettings+Private.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCVideoEncoderSettings+Private.h >-index 5b062455bc0..72af732b23b 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCVideoEncoderSettings+Private.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCVideoEncoderSettings+Private.h >-@@ -22,4 +22,11 @@ NS_ASSUME_NONNULL_BEGIN >- >- @end >- >-+@interface RTCVideoBitrateAllocation (Private) >-+ >-+- (instancetype)initWithNativeVideoBitrateAllocation:(const webrtc::VideoBitrateAllocation *__nullable)videoBitrateAllocation; >-+- (webrtc::VideoBitrateAllocation)nativeVideoBitrateAllocation; >-+ >-+@end >-+ >- NS_ASSUME_NONNULL_END >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCVideoEncoderSettings+Private.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCVideoEncoderSettings+Private.mm >-index 
6fb81dbb8af..82b5ba7bc0f 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCVideoEncoderSettings+Private.mm >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCVideoEncoderSettings+Private.mm >-@@ -12,6 +12,23 @@ >- >- #import "helpers/NSString+StdString.h" >- >-+@implementation RTCVideoEncoderSettings { >-+ webrtc::VideoCodec _nativeVideoCodec; >-+} >-+ >-+@synthesize name = _name; >-+@synthesize width = _width; >-+@synthesize height = _height; >-+@synthesize startBitrate = _startBitrate; >-+@synthesize maxBitrate = _maxBitrate; >-+@synthesize minBitrate = _minBitrate; >-+@synthesize targetBitrate = _targetBitrate; >-+@synthesize maxFramerate = _maxFramerate; >-+@synthesize qpMax = _qpMax; >-+@synthesize mode = _mode; >-+ >-+@end >-+ >- @implementation RTCVideoEncoderSettings (Private) >- >- - (instancetype)initWithNativeVideoCodec:(const webrtc::VideoCodec *)videoCodec { >-@@ -20,6 +37,7 @@ - (instancetype)initWithNativeVideoCodec:(const webrtc::VideoCodec *)videoCodec >- const char *codecName = CodecTypeToPayloadString(videoCodec->codecType); >- self.name = [NSString stringWithUTF8String:codecName]; >- >-+ _nativeVideoCodec = *videoCodec; >- self.width = videoCodec->width; >- self.height = videoCodec->height; >- self.startBitrate = videoCodec->startBitrate; >-@@ -36,18 +54,28 @@ - (instancetype)initWithNativeVideoCodec:(const webrtc::VideoCodec *)videoCodec >- } >- >- - (webrtc::VideoCodec)nativeVideoCodec { >-- webrtc::VideoCodec videoCodec; >-- videoCodec.width = self.width; >-- videoCodec.height = self.height; >-- videoCodec.startBitrate = self.startBitrate; >-- videoCodec.maxBitrate = self.maxBitrate; >-- videoCodec.minBitrate = self.minBitrate; >-- videoCodec.targetBitrate = self.targetBitrate; >-- videoCodec.maxBitrate = self.maxBitrate; >-- videoCodec.qpMax = self.qpMax; >-- videoCodec.mode = (webrtc::VideoCodecMode)self.mode; >-- >-- return videoCodec; >-+ return _nativeVideoCodec; >-+} 
>-+ >-+@end >-+ >-+@implementation RTCVideoBitrateAllocation { >-+ webrtc::VideoBitrateAllocation _nativeVideoBitrateAllocation; >-+} >-+ >-+@end >-+ >-+@implementation RTCVideoBitrateAllocation (Private) >-+ >-+- (instancetype)initWithNativeVideoBitrateAllocation:(const webrtc::VideoBitrateAllocation *)videoBitrateAllocation { >-+ if (self = [super init]) { >-+ _nativeVideoBitrateAllocation = *videoBitrateAllocation; >-+ } >-+ return self; >-+} >-+ >-+- (webrtc::VideoBitrateAllocation)nativeVideoBitrateAllocation { >-+ return _nativeVideoBitrateAllocation; >- } >- >- @end >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_codec/RTCVideoCodecConstants.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_codec/RTCVideoCodecConstants.mm >-index acbf126170c..1447d3b27e1 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_codec/RTCVideoCodecConstants.mm >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_codec/RTCVideoCodecConstants.mm >-@@ -13,5 +13,5 @@ >- >- #include "media/base/mediaconstants.h" >- >--NSString *const kRTCVideoCodecVp8Name = @(cricket::kVp8CodecName); >--NSString *const kRTCVideoCodecVp9Name = @(cricket::kVp9CodecName); >-+NSString *const kRTCVideoCodecVp8Name = @"VP8"; >-+NSString *const kRTCVideoCodecVp9Name = @"VP9"; >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_codec/RTCVideoEncoderVP8.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_codec/RTCVideoEncoderVP8.h >-index 8d87a898931..08d18504222 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_codec/RTCVideoEncoderVP8.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_codec/RTCVideoEncoderVP8.h >-@@ -14,6 +14,7 @@ >- #import "RTCVideoEncoder.h" >- >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCVideoEncoderVP8"))) >- @interface RTCVideoEncoderVP8 : NSObject >- >- /* This returns a VP8 encoder that can be returned 
from a RTCVideoEncoderFactory injected into >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_codec/RTCWrappedNativeVideoEncoder.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_codec/RTCWrappedNativeVideoEncoder.mm >-index 9afd54f55f5..b9634779f1f 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_codec/RTCWrappedNativeVideoEncoder.mm >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_codec/RTCWrappedNativeVideoEncoder.mm >-@@ -58,6 +58,11 @@ - (int)setBitrate:(uint32_t)bitrateKbit framerate:(uint32_t)framerate { >- return 0; >- } >- >-+- (int)setRateAllocation:(nonnull RTCVideoBitrateAllocation *)allocation framerate:(uint32_t)framerate { >-+ RTC_NOTREACHED(); >-+ return 0; >-+} >-+ >- - (NSString *)implementationName { >- RTC_NOTREACHED(); >- return nil; >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_frame_buffer/RTCNativeI420Buffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_frame_buffer/RTCNativeI420Buffer.h >-index 9a904f5396a..dc903bc4bca 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_frame_buffer/RTCNativeI420Buffer.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_frame_buffer/RTCNativeI420Buffer.h >-@@ -17,6 +17,7 @@ NS_ASSUME_NONNULL_BEGIN >- >- /** RTCI420Buffer implements the RTCI420Buffer protocol */ >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCI420Buffer"))) >- @interface RTCI420Buffer : NSObject<RTCI420Buffer> >- @end >- >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_frame_buffer/RTCNativeMutableI420Buffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_frame_buffer/RTCNativeMutableI420Buffer.h >-index 6cd5110460b..7cfa950abb6 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_frame_buffer/RTCNativeMutableI420Buffer.h >-+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_frame_buffer/RTCNativeMutableI420Buffer.h >-@@ -18,6 +18,7 @@ NS_ASSUME_NONNULL_BEGIN >- >- /** Mutable version of RTCI420Buffer */ >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCMutableI420Buffer"))) >- @interface RTCMutableI420Buffer : RTCI420Buffer<RTCMutableI420Buffer> >- @end >- >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCCodecSpecificInfo.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCCodecSpecificInfo.h >-index e2ae4cafa11..9dc44c12af8 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCCodecSpecificInfo.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCCodecSpecificInfo.h >-@@ -18,6 +18,7 @@ NS_ASSUME_NONNULL_BEGIN >- * Corresponds to webrtc::CodecSpecificInfo. >- */ >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCCodecSpecificInfo"))) >- @protocol RTCCodecSpecificInfo <NSObject> >- @end >- >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCEncodedImage.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCEncodedImage.h >-index 670c7276ff7..7994ce64a1f 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCEncodedImage.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCEncodedImage.h >-@@ -31,6 +31,7 @@ typedef NS_ENUM(NSUInteger, RTCVideoContentType) { >- >- /** Represents an encoded frame. Corresponds to webrtc::EncodedImage. 
*/ >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCEncodedImage"))) >- @interface RTCEncodedImage : NSObject >- >- @property(nonatomic, strong) NSData *buffer; >-@@ -47,6 +48,7 @@ RTC_OBJC_EXPORT >- @property(nonatomic, assign) BOOL completeFrame; >- @property(nonatomic, strong) NSNumber *qp; >- @property(nonatomic, assign) RTCVideoContentType contentType; >-+@property(nonatomic, assign) int spatialIndex; >- >- @end >- >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCEncodedImage.m b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCEncodedImage.m >-index 024a57c541b..85cd5e298e4 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCEncodedImage.m >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCEncodedImage.m >-@@ -26,5 +26,6 @@ @implementation RTCEncodedImage >- @synthesize completeFrame = _completeFrame; >- @synthesize qp = _qp; >- @synthesize contentType = _contentType; >-+@synthesize spatialIndex = _spatialIndex; >- >- @end >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCI420Buffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCI420Buffer.h >-index a6c7e41bcba..cfa14f8af23 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCI420Buffer.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCI420Buffer.h >-@@ -16,6 +16,7 @@ NS_ASSUME_NONNULL_BEGIN >- >- /** Protocol for RTCYUVPlanarBuffers containing I420 data */ >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCI420Buffer"))) >- @protocol RTCI420Buffer <RTCYUVPlanarBuffer> >- @end >- >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCRtpFragmentationHeader.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCRtpFragmentationHeader.h >-index 2e26b08b8af..fbbc68a8ef6 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCRtpFragmentationHeader.h >-+++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCRtpFragmentationHeader.h >-@@ -16,6 +16,7 @@ NS_ASSUME_NONNULL_BEGIN >- >- /** Information for header. Corresponds to webrtc::RTPFragmentationHeader. */ >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCRtpFragmentationHeader"))) >- @interface RTCRtpFragmentationHeader : NSObject >- >- @property(nonatomic, strong) NSArray<NSNumber *> *fragmentationOffset; >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCRtpFragmentationHeader.m b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCRtpFragmentationHeader.m >-index 8049abc411e..6cf21ddc2a1 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCRtpFragmentationHeader.m >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCRtpFragmentationHeader.m >-@@ -17,4 +17,5 @@ @implementation RTCRtpFragmentationHeader >- @synthesize fragmentationTimeDiff = _fragmentationTimeDiff; >- @synthesize fragmentationPlType = _fragmentationPlType; >- >--@end >-\ No newline at end of file >-+@end >-+ >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoCodecInfo.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoCodecInfo.h >-index 2162caaa21f..4e643d94347 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoCodecInfo.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoCodecInfo.h >-@@ -16,6 +16,7 @@ NS_ASSUME_NONNULL_BEGIN >- >- /** Holds information to identify a codec. Corresponds to webrtc::SdpVideoFormat. 
*/ >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCVideoCodecInfo"))) >- @interface RTCVideoCodecInfo : NSObject <NSCoding> >- >- - (instancetype)init NS_UNAVAILABLE; >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoDecoder.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoDecoder.h >-index 18c6f6b0006..534420d3955 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoDecoder.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoDecoder.h >-@@ -23,6 +23,7 @@ typedef void (^RTCVideoDecoderCallback)(RTCVideoFrame *frame); >- >- /** Protocol for decoder implementations. */ >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCVideoDecoder"))) >- @protocol RTCVideoDecoder <NSObject> >- >- - (void)setCallback:(RTCVideoDecoderCallback)callback; >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoDecoderFactory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoDecoderFactory.h >-index 3e24153b82c..cb95fa3f134 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoDecoderFactory.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoDecoderFactory.h >-@@ -18,6 +18,7 @@ NS_ASSUME_NONNULL_BEGIN >- >- /** RTCVideoDecoderFactory is an Objective-C version of webrtc::VideoDecoderFactory. 
*/ >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCVideoDecoderFactory"))) >- @protocol RTCVideoDecoderFactory <NSObject> >- >- - (nullable id<RTCVideoDecoder>)createDecoder:(RTCVideoCodecInfo *)info; >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoder.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoder.h >-index c5257674d83..5291bb1342b 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoder.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoder.h >-@@ -19,7 +19,11 @@ >- #import "RTCVideoFrame.h" >- >- NS_ASSUME_NONNULL_BEGIN >-- >-+/* >-+namespace webrtc { >-+class VideoBitrateAllocation; >-+}; >-+*/ >- /** Callback block for encoder. */ >- typedef BOOL (^RTCVideoEncoderCallback)(RTCEncodedImage *frame, >- id<RTCCodecSpecificInfo> info, >-@@ -27,6 +31,7 @@ typedef BOOL (^RTCVideoEncoderCallback)(RTCEncodedImage *frame, >- >- /** Protocol for encoder implementations. */ >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCVideoEncoder"))) >- @protocol RTCVideoEncoder <NSObject> >- >- - (void)setCallback:(RTCVideoEncoderCallback)callback; >-@@ -37,6 +42,7 @@ RTC_OBJC_EXPORT >- codecSpecificInfo:(nullable id<RTCCodecSpecificInfo>)info >- frameTypes:(NSArray<NSNumber *> *)frameTypes; >- - (int)setBitrate:(uint32_t)bitrateKbit framerate:(uint32_t)framerate; >-+- (int)setRateAllocation: (RTCVideoBitrateAllocation *)allocation framerate:(uint32_t)framerate; >- - (NSString *)implementationName; >- >- /** Returns QP scaling settings for encoder. 
The quality scaler adjusts the resolution in order to >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoderFactory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoderFactory.h >-index 20c603d6fe6..cbe9ea492ae 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoderFactory.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoderFactory.h >-@@ -18,6 +18,7 @@ NS_ASSUME_NONNULL_BEGIN >- >- /** RTCVideoEncoderFactory is an Objective-C version of webrtc::VideoEncoderFactory. */ >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCVideoEncoderFactory"))) >- @protocol RTCVideoEncoderFactory <NSObject> >- >- - (nullable id<RTCVideoEncoder>)createEncoder:(RTCVideoCodecInfo *)info; >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoderQpThresholds.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoderQpThresholds.h >-index 2b48f45ce0a..68cfd66df47 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoderQpThresholds.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoderQpThresholds.h >-@@ -16,6 +16,7 @@ NS_ASSUME_NONNULL_BEGIN >- >- /** QP thresholds for encoder. Corresponds to webrtc::VideoEncoder::QpThresholds. 
*/ >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCVideoEncoderQpThresholds"))) >- @interface RTCVideoEncoderQpThresholds : NSObject >- >- - (instancetype)initWithThresholdsLow:(NSInteger)low high:(NSInteger)high; >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoderSettings.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoderSettings.h >-index 69e04cac70f..ae3bf5fb5f1 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoderSettings.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoderSettings.h >-@@ -21,6 +21,7 @@ typedef NS_ENUM(NSUInteger, RTCVideoCodecMode) { >- >- /** Settings for encoder. Corresponds to webrtc::VideoCodec. */ >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCVideoEncoderSettings"))) >- @interface RTCVideoEncoderSettings : NSObject >- >- @property(nonatomic, strong) NSString *name; >-@@ -40,4 +41,9 @@ RTC_OBJC_EXPORT >- >- @end >- >-+RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCVideoBitrateAllocation"))) >-+@interface RTCVideoBitrateAllocation : NSObject >-+@end >-+ >- NS_ASSUME_NONNULL_END >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoFrame.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoFrame.h >-index 9aca7433f34..bde09c50c9d 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoFrame.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoFrame.h >-@@ -26,6 +26,7 @@ typedef NS_ENUM(NSInteger, RTCVideoRotation) { >- >- // RTCVideoFrame is an ObjectiveC version of webrtc::VideoFrame. >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCVideoFrame"))) >- @interface RTCVideoFrame : NSObject >- >- /** Width without rotation applied. 
*/ >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoFrameBuffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoFrameBuffer.h >-index bb9e6fba631..186ad0889bf 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoFrameBuffer.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoFrameBuffer.h >-@@ -18,6 +18,7 @@ NS_ASSUME_NONNULL_BEGIN >- >- // RTCVideoFrameBuffer is an ObjectiveC version of webrtc::VideoFrameBuffer. >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCVideoFrameBuffer"))) >- @protocol RTCVideoFrameBuffer <NSObject> >- >- @property(nonatomic, readonly) int width; >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCCodecSpecificInfoH264.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCCodecSpecificInfoH264.h >-index ece9570a13b..e947dc6e623 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCCodecSpecificInfoH264.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCCodecSpecificInfoH264.h >-@@ -10,7 +10,7 @@ >- >- #import <Foundation/Foundation.h> >- >--#import "RTCCodecSpecificInfo.h" >-+#import "base/RTCCodecSpecificInfo.h" >- #import "RTCMacros.h" >- >- /** Class for H264 specific config. 
*/ >-@@ -20,6 +20,7 @@ typedef NS_ENUM(NSUInteger, RTCH264PacketizationMode) { >- }; >- >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCCodecSpecificInfoH264"))) >- @interface RTCCodecSpecificInfoH264 : NSObject <RTCCodecSpecificInfo> >- >- @property(nonatomic, assign) RTCH264PacketizationMode packetizationMode; >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCDefaultVideoDecoderFactory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCDefaultVideoDecoderFactory.h >-index 7ca9463a593..d11460300d5 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCDefaultVideoDecoderFactory.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCDefaultVideoDecoderFactory.h >-@@ -19,6 +19,7 @@ NS_ASSUME_NONNULL_BEGIN >- * codecs, create custom implementations of RTCVideoEncoderFactory and RTCVideoDecoderFactory. >- */ >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCDefaultVideoDecoderFactory"))) >- @interface RTCDefaultVideoDecoderFactory : NSObject <RTCVideoDecoderFactory> >- @end >- >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCDefaultVideoEncoderFactory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCDefaultVideoEncoderFactory.h >-index c45e54362b2..9323256cead 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCDefaultVideoEncoderFactory.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCDefaultVideoEncoderFactory.h >-@@ -19,6 +19,7 @@ NS_ASSUME_NONNULL_BEGIN >- * codecs, create custom implementations of RTCVideoEncoderFactory and RTCVideoDecoderFactory. 
>- */ >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCDefaultVideoEncoderFactory"))) >- @interface RTCDefaultVideoEncoderFactory : NSObject <RTCVideoEncoderFactory> >- >- @property(nonatomic, retain) RTCVideoCodecInfo *preferredCodec; >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCH264ProfileLevelId.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCH264ProfileLevelId.h >-index 56b353215a2..d022297674e 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCH264ProfileLevelId.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCH264ProfileLevelId.h >-@@ -48,6 +48,7 @@ typedef NS_ENUM(NSUInteger, RTCH264Level) { >- }; >- >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCH264ProfileLevelId"))) >- @interface RTCH264ProfileLevelId : NSObject >- >- @property(nonatomic, readonly) RTCH264Profile profile; >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCH264ProfileLevelId.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCH264ProfileLevelId.mm >-index 359656cb97b..28f52139c17 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCH264ProfileLevelId.mm >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCH264ProfileLevelId.mm >-@@ -19,20 +19,30 @@ >- #include "media/base/h264_profile_level_id.h" >- #include "media/base/mediaconstants.h" >- >-+#if !defined(WEBRTC_WEBKIT_BUILD) >- namespace { >- >- NSString *MaxSupportedProfileLevelConstrainedHigh(); >- NSString *MaxSupportedProfileLevelConstrainedBaseline(); >- >- } // namespace >-+#endif >- >--NSString *const kRTCVideoCodecH264Name = @(cricket::kH264CodecName); >-+NSString *const kRTCVideoCodecH264Name = @"H264"; >- NSString *const kRTCLevel31ConstrainedHigh = @"640c1f"; >- NSString *const 
kRTCLevel31ConstrainedBaseline = @"42e01f"; >-+ >-+#if defined(WEBRTC_WEBKIT_BUILD) >-+NSString *const kRTCMaxSupportedH264ProfileLevelConstrainedHigh = >-+ @"640c1f"; >-+NSString *const kRTCMaxSupportedH264ProfileLevelConstrainedBaseline = >-+ @"42e01f"; >-+#else >- NSString *const kRTCMaxSupportedH264ProfileLevelConstrainedHigh = >- MaxSupportedProfileLevelConstrainedHigh(); >- NSString *const kRTCMaxSupportedH264ProfileLevelConstrainedBaseline = >- MaxSupportedProfileLevelConstrainedBaseline(); >-+#endif >- >- namespace { >- >-@@ -53,6 +63,8 @@ NSString *MaxSupportedLevelForProfile(Profile profile) { >- } >- #endif >- >-+#if !defined(WEBRTC_WEBKIT_BUILD) >-+ >- NSString *MaxSupportedProfileLevelConstrainedBaseline() { >- #if defined(WEBRTC_IOS) >- NSString *profile = MaxSupportedLevelForProfile(webrtc::H264::kProfileConstrainedBaseline); >-@@ -73,6 +85,8 @@ NSString *MaxSupportedProfileLevelConstrainedHigh() { >- return kRTCLevel31ConstrainedHigh; >- } >- >-+#endif >-+ >- } // namespace >- >- @interface RTCH264ProfileLevelId () >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoDecoderFactoryH264.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoDecoderFactoryH264.h >-index 4fcff1dff79..fdabd807480 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoDecoderFactoryH264.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoDecoderFactoryH264.h >-@@ -14,5 +14,6 @@ >- #import "RTCVideoDecoderFactory.h" >- >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCVideoDecoderFactoryH264"))) >- @interface RTCVideoDecoderFactoryH264 : NSObject <RTCVideoDecoderFactory> >- @end >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoDecoderH264.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoDecoderH264.h >-index 
b860276206c..b10d1e6a099 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoDecoderH264.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoDecoderH264.h >-@@ -14,5 +14,6 @@ >- #import "RTCVideoDecoder.h" >- >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCVideoDecoderH264"))) >- @interface RTCVideoDecoderH264 : NSObject <RTCVideoDecoder> >- @end >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoEncoderFactoryH264.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoEncoderFactoryH264.h >-index c64405e4dac..5f6f78dfa9a 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoEncoderFactoryH264.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoEncoderFactoryH264.h >-@@ -14,5 +14,6 @@ >- #import "RTCVideoEncoderFactory.h" >- >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCVideoEncoderFactoryH264"))) >- @interface RTCVideoEncoderFactoryH264 : NSObject <RTCVideoEncoderFactory> >- @end >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoEncoderH264.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoEncoderH264.h >-index a9c05580a41..7714b25eaa4 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoEncoderH264.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoEncoderH264.h >-@@ -15,6 +15,7 @@ >- #import "RTCVideoEncoder.h" >- >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCVideoEncoderH264"))) >- @interface RTCVideoEncoderH264 : NSObject <RTCVideoEncoder> >- >- - (instancetype)initWithCodecInfo:(RTCVideoCodecInfo *)codecInfo; >-diff --git 
a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoEncoderH264.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoEncoderH264.mm >-index 16ad1ff47aa..9013ba962fb 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoEncoderH264.mm >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoEncoderH264.mm >-@@ -9,39 +9,72 @@ >- * >- */ >- >--#import "RTCVideoEncoderH264.h" >-+#import "WebRTC/RTCVideoCodecH264.h" >- >- #import <VideoToolbox/VideoToolbox.h> >- #include <vector> >- >- #if defined(WEBRTC_IOS) >--#import "helpers/UIDevice+RTCDevice.h" >-+#import "Common/RTCUIApplicationStatusObserver.h" >-+#import "WebRTC/UIDevice+RTCDevice.h" >- #endif >--#import "RTCCodecSpecificInfoH264.h" >--#import "RTCH264ProfileLevelId.h" >--#import "api/peerconnection/RTCRtpFragmentationHeader+Private.h" >--#import "api/peerconnection/RTCVideoCodecInfo+Private.h" >--#import "base/RTCCodecSpecificInfo.h" >--#import "base/RTCI420Buffer.h" >--#import "base/RTCVideoEncoder.h" >--#import "base/RTCVideoFrame.h" >--#import "base/RTCVideoFrameBuffer.h" >--#import "components/video_frame_buffer/RTCCVPixelBuffer.h" >--#import "helpers.h" >-- >-+#import "PeerConnection/RTCVideoCodec+Private.h" >-+#import "WebRTC/RTCVideoCodec.h" >-+#import "WebRTC/RTCVideoFrame.h" >-+#import "WebRTC/RTCVideoFrameBuffer.h" >- #include "common_video/h264/h264_bitstream_parser.h" >- #include "common_video/h264/profile_level_id.h" >- #include "common_video/include/bitrate_adjuster.h" >-+#import "helpers.h" >- #include "modules/include/module_common_types.h" >- #include "modules/video_coding/include/video_error_codes.h" >- #include "rtc_base/buffer.h" >- #include "rtc_base/logging.h" >- #include "rtc_base/timeutils.h" >--#include "sdk/objc/components/video_codec/nalu_rewriter.h" >-+#include "sdk/objc/Framework/Classes/VideoToolbox/nalu_rewriter.h" >-+#include 
"system_wrappers/include/clock.h" >- #include "third_party/libyuv/include/libyuv/convert_from.h" >- >--@interface RTCVideoEncoderH264 () >-+#include "sdk/WebKit/EncoderUtilities.h" >-+#include "sdk/WebKit/WebKitUtilities.h" >-+ >-+#import <dlfcn.h> >-+#import <objc/runtime.h> >-+ >-+SOFT_LINK_FRAMEWORK_OPTIONAL(VideoToolBox) >-+SOFT_LINK_POINTER_OPTIONAL(VideoToolBox, kVTVideoEncoderSpecification_Usage, NSString *) >-+ >-+#if !ENABLE_VCP_ENCODER && !defined(WEBRTC_IOS) >-+static inline bool isStandardFrameSize(int32_t width, int32_t height) >-+{ >-+ // FIXME: Envision relaxing this rule, something like width and height dividable by 4 or 8 should be good enough. >-+ if (width == 1280) >-+ return height == 720; >-+ if (width == 720) >-+ return height == 1280; >-+ if (width == 960) >-+ return height == 540; >-+ if (width == 540) >-+ return height == 960; >-+ if (width == 640) >-+ return height == 480; >-+ if (width == 480) >-+ return height == 640; >-+ if (width == 288) >-+ return height == 352; >-+ if (width == 352) >-+ return height == 288; >-+ if (width == 320) >-+ return height == 240; >-+ if (width == 240) >-+ return height == 320; >-+ return false; >-+} >-+#endif >- >-+__attribute__((objc_runtime_name("WK_RTCSingleVideoEncoderH264"))) >-+@interface RTCSingleVideoEncoderH264 : NSObject <RTCVideoEncoder> >-+- (instancetype)initWithCodecInfo:(RTCVideoCodecInfo *)codecInfo simulcastIndex: (int)index; >- - (void)frameWasEncoded:(OSStatus)status >- flags:(VTEncodeInfoFlags)infoFlags >- sampleBuffer:(CMSampleBufferRef)sampleBuffer >-@@ -51,7 +84,6 @@ - (void)frameWasEncoded:(OSStatus)status >- renderTimeMs:(int64_t)renderTimeMs >- timestamp:(uint32_t)timestamp >- rotation:(RTCVideoRotation)rotation; >-- >- @end >- >- namespace { // anonymous namespace >-@@ -70,7 +102,7 @@ const OSType kNV12PixelFormat = kCVPixelFormatType_420YpCbCr8BiPlanarFullRange; >- // Struct that we pass to the encoder per frame to encode. We receive it again >- // in the encoder callback. 
>- struct RTCFrameEncodeParams { >-- RTCFrameEncodeParams(RTCVideoEncoderH264 *e, >-+ RTCFrameEncodeParams(RTCSingleVideoEncoderH264 *e, >- RTCCodecSpecificInfoH264 *csi, >- int32_t w, >- int32_t h, >-@@ -85,7 +117,7 @@ struct RTCFrameEncodeParams { >- } >- } >- >-- RTCVideoEncoderH264 *encoder; >-+ RTCSingleVideoEncoderH264 *encoder; >- RTCCodecSpecificInfoH264 *codecSpecificInfo; >- int32_t width; >- int32_t height; >-@@ -173,8 +205,8 @@ void compressionOutputCallback(void *encoder, >- rotation:encodeParams->rotation]; >- } >- >--// Extract VideoToolbox profile out of the webrtc::SdpVideoFormat. If there is >--// no specific VideoToolbox profile for the specified level, AutoLevel will be >-+// Extract VideoToolbox profile out of the cricket::VideoCodec. If there is no >-+// specific VideoToolbox profile for the specified level, AutoLevel will be >- // returned. The user must initialize the encoder with a resolution and >- // framerate conforming to the selected H264 level regardless. >- CFStringRef ExtractProfile(webrtc::SdpVideoFormat videoFormat) { >-@@ -280,7 +312,7 @@ CFStringRef ExtractProfile(webrtc::SdpVideoFormat videoFormat) { >- } >- } // namespace >- >--@implementation RTCVideoEncoderH264 { >-+@implementation RTCSingleVideoEncoderH264 { >- RTCVideoCodecInfo *_codecInfo; >- std::unique_ptr<webrtc::BitrateAdjuster> _bitrateAdjuster; >- uint32_t _targetBitrateBps; >-@@ -290,12 +322,14 @@ @implementation RTCVideoEncoderH264 { >- RTCVideoEncoderCallback _callback; >- int32_t _width; >- int32_t _height; >-- VTCompressionSessionRef _compressionSession; >-- CVPixelBufferPoolRef _pixelBufferPool; >-+ CompressionSessionRef _compressionSession; >- RTCVideoCodecMode _mode; >- >- webrtc::H264BitstreamParser _h264BitstreamParser; >- std::vector<uint8_t> _frameScaleBuffer; >-+ >-+ webrtc::VideoCodec _nativeVideoCodec; >-+ int _simulcastIndex; >- } >- >- // .5 is set as a mininum to prevent overcompensating for large temporary >-@@ -305,14 +339,19 @@ @implementation 
RTCVideoEncoderH264 { >- // drastically reduced bitrate, so we want to avoid that. In steady state >- // conditions, 0.95 seems to give us better overall bitrate over long periods >- // of time. >--- (instancetype)initWithCodecInfo:(RTCVideoCodecInfo *)codecInfo { >-+- (instancetype)initWithCodecInfo:(RTCVideoCodecInfo *)codecInfo simulcastIndex:(int)index { >- if (self = [super init]) { >- _codecInfo = codecInfo; >- _bitrateAdjuster.reset(new webrtc::BitrateAdjuster(.5, .95)); >- _packetizationMode = RTCH264PacketizationModeNonInterleaved; >- _profile = ExtractProfile([codecInfo nativeSdpVideoFormat]); >-+ _simulcastIndex = index; >- RTC_LOG(LS_INFO) << "Using profile " << CFStringToString(_profile); >- RTC_CHECK([codecInfo.name isEqualToString:kRTCVideoCodecH264Name]); >-+ >-+#if defined(WEBRTC_IOS) >-+ [RTCUIApplicationStatusObserver prepareForUse]; >-+#endif >- } >- return self; >- } >-@@ -329,9 +368,12 @@ - (NSInteger)startEncodeWithSettings:(RTCVideoEncoderSettings *)settings >- _width = settings.width; >- _height = settings.height; >- _mode = settings.mode; >-+ _nativeVideoCodec = settings.nativeVideoCodec; >-+ >-+ RTC_DCHECK(_nativeVideoCodec.numberOfSimulcastStreams != 1); >- >- // We can only set average bitrate on the HW encoder. >-- _targetBitrateBps = settings.startBitrate * 1000; // startBitrate is in kbps. >-+ _targetBitrateBps = settings.startBitrate; >- _bitrateAdjuster->SetTargetBitrateBps(_targetBitrateBps); >- >- // TODO(tkchin): Try setting payload size via >-@@ -348,10 +390,20 @@ - (NSInteger)encode:(RTCVideoFrame *)frame >- if (!_callback || !_compressionSession) { >- return WEBRTC_VIDEO_CODEC_UNINITIALIZED; >- } >-+#if defined(WEBRTC_IOS) >-+ if (![[RTCUIApplicationStatusObserver sharedInstance] isApplicationActive]) { >-+ // Ignore all encode requests when app isn't active. In this state, the >-+ // hardware encoder has been invalidated by the OS. 
>-+ return WEBRTC_VIDEO_CODEC_OK; >-+ } >-+#endif >- BOOL isKeyframeRequired = NO; >- >- // Get a pixel buffer from the pool and copy frame data over. >-- if ([self resetCompressionSessionIfNeededWithFrame:frame]) { >-+ CVPixelBufferPoolRef pixelBufferPool = >-+ CompressionSessionGetPixelBufferPool(_compressionSession); >-+ if ([self resetCompressionSessionIfNeededForPool:pixelBufferPool withFrame:frame]) { >-+ pixelBufferPool = CompressionSessionGetPixelBufferPool(_compressionSession); >- isKeyframeRequired = YES; >- } >- >-@@ -368,7 +420,7 @@ - (NSInteger)encode:(RTCVideoFrame *)frame >- CVBufferRetain(pixelBuffer); >- } else { >- // Cropping required, we need to crop and scale to a new pixel buffer. >-- pixelBuffer = CreatePixelBuffer(_pixelBufferPool); >-+ pixelBuffer = CreatePixelBuffer(pixelBufferPool); >- if (!pixelBuffer) { >- return WEBRTC_VIDEO_CODEC_ERROR; >- } >-@@ -391,7 +443,7 @@ - (NSInteger)encode:(RTCVideoFrame *)frame >- >- if (!pixelBuffer) { >- // We did not have a native frame buffer >-- pixelBuffer = CreatePixelBuffer(_pixelBufferPool); >-+ pixelBuffer = CreatePixelBuffer(pixelBufferPool); >- if (!pixelBuffer) { >- return WEBRTC_VIDEO_CODEC_ERROR; >- } >-@@ -434,7 +486,7 @@ - (NSInteger)encode:(RTCVideoFrame *)frame >- // Update the bitrate if needed. >- [self setBitrateBps:_bitrateAdjuster->GetAdjustedBitrateBps()]; >- >-- OSStatus status = VTCompressionSessionEncodeFrame(_compressionSession, >-+ OSStatus status = CompressionSessionEncodeFrame(_compressionSession, >- pixelBuffer, >- presentationTimeStamp, >- kCMTimeInvalid, >-@@ -447,14 +499,7 @@ - (NSInteger)encode:(RTCVideoFrame *)frame >- if (pixelBuffer) { >- CVBufferRelease(pixelBuffer); >- } >-- >-- if (status == kVTInvalidSessionErr) { >-- // This error occurs when entering foreground after backgrounding the app. 
>-- RTC_LOG(LS_ERROR) << "Invalid compression session, resetting."; >-- [self resetCompressionSessionWithPixelFormat:[self pixelFormatOfFrame:frame]]; >-- >-- return WEBRTC_VIDEO_CODEC_NO_OUTPUT; >-- } else if (status != noErr) { >-+ if (status != noErr) { >- RTC_LOG(LS_ERROR) << "Failed to encode frame with code: " << status; >- return WEBRTC_VIDEO_CODEC_ERROR; >- } >-@@ -465,8 +510,8 @@ - (void)setCallback:(RTCVideoEncoderCallback)callback { >- _callback = callback; >- } >- >--- (int)setBitrate:(uint32_t)bitrateKbit framerate:(uint32_t)framerate { >-- _targetBitrateBps = 1000 * bitrateKbit; >-+- (int)setBitrate:(uint32_t)bitrateBps framerate:(uint32_t)framerate { >-+ _targetBitrateBps = bitrateBps; >- _bitrateAdjuster->SetTargetBitrateBps(_targetBitrateBps); >- [self setBitrateBps:_bitrateAdjuster->GetAdjustedBitrateBps()]; >- return WEBRTC_VIDEO_CODEC_OK; >-@@ -483,44 +528,51 @@ - (NSInteger)releaseEncoder { >- return WEBRTC_VIDEO_CODEC_OK; >- } >- >--- (OSType)pixelFormatOfFrame:(RTCVideoFrame *)frame { >-- // Use NV12 for non-native frames. >-- if ([frame.buffer isKindOfClass:[RTCCVPixelBuffer class]]) { >-- RTCCVPixelBuffer *rtcPixelBuffer = (RTCCVPixelBuffer *)frame.buffer; >-- return CVPixelBufferGetPixelFormatType(rtcPixelBuffer.pixelBuffer); >-- } >-- >-- return kNV12PixelFormat; >--} >-- >--- (BOOL)resetCompressionSessionIfNeededWithFrame:(RTCVideoFrame *)frame { >-+- (BOOL)resetCompressionSessionIfNeededForPool:(CVPixelBufferPoolRef)pixelBufferPool >-+ withFrame:(RTCVideoFrame *)frame { >- BOOL resetCompressionSession = NO; >- >-+#if defined(WEBRTC_IOS) >-+ if (!pixelBufferPool) { >-+ // Kind of a hack. On backgrounding, the compression session seems to get >-+ // invalidated, which causes this pool call to fail when the application >-+ // is foregrounded and frames are being sent for encoding again. >-+ // Resetting the session when this happens fixes the issue. >-+ // In addition we request a keyframe so video can recover quickly. 
>-+ resetCompressionSession = YES; >-+ RTC_LOG(LS_INFO) << "Resetting compression session due to invalid pool."; >-+ } >-+#endif >-+ >- // If we're capturing native frames in another pixel format than the compression session is >- // configured with, make sure the compression session is reset using the correct pixel format. >-- OSType framePixelFormat = [self pixelFormatOfFrame:frame]; >-- >-- if (_compressionSession) { >-+ // If we're capturing non-native frames and the compression session is configured with a non-NV12 >-+ // format, reset it to NV12. >-+ OSType framePixelFormat = kNV12PixelFormat; >-+ if (pixelBufferPool) { >- // The pool attribute `kCVPixelBufferPixelFormatTypeKey` can contain either an array of pixel >- // formats or a single pixel format. >- NSDictionary *poolAttributes = >-- (__bridge NSDictionary *)CVPixelBufferPoolGetPixelBufferAttributes(_pixelBufferPool); >-+ (__bridge NSDictionary *)CVPixelBufferPoolGetPixelBufferAttributes(pixelBufferPool); >- id pixelFormats = >- [poolAttributes objectForKey:(__bridge NSString *)kCVPixelBufferPixelFormatTypeKey]; >- NSArray<NSNumber *> *compressionSessionPixelFormats = nil; >- if ([pixelFormats isKindOfClass:[NSArray class]]) { >- compressionSessionPixelFormats = (NSArray *)pixelFormats; >-- } else if ([pixelFormats isKindOfClass:[NSNumber class]]) { >-+ } else { >- compressionSessionPixelFormats = @[ (NSNumber *)pixelFormats ]; >- } >- >-+ if ([frame.buffer isKindOfClass:[RTCCVPixelBuffer class]]) { >-+ RTCCVPixelBuffer *rtcPixelBuffer = (RTCCVPixelBuffer *)frame.buffer; >-+ framePixelFormat = CVPixelBufferGetPixelFormatType(rtcPixelBuffer.pixelBuffer); >-+ } >-+ >- if (![compressionSessionPixelFormats >- containsObject:[NSNumber numberWithLong:framePixelFormat]]) { >- resetCompressionSession = YES; >- RTC_LOG(LS_INFO) << "Resetting compression session due to non-matching pixel format."; >- } >-- } else { >-- resetCompressionSession = YES; >- } >- >- if (resetCompressionSession) { >-@@ -557,22 +609,26 
@@ - (int)resetCompressionSessionWithPixelFormat:(OSType)framePixelFormat { >- CFRelease(pixelFormat); >- pixelFormat = nullptr; >- } >-- CFMutableDictionaryRef encoder_specs = nullptr; >--#if defined(WEBRTC_MAC) && !defined(WEBRTC_IOS) >-+ CFDictionaryRef encoderSpecs = nullptr; >-+#if !defined(WEBRTC_IOS) >-+ auto useHardwareEncoder = webrtc::isH264HardwareEncoderAllowed() ? kCFBooleanTrue : kCFBooleanFalse; >- // Currently hw accl is supported above 360p on mac, below 360p >- // the compression session will be created with hw accl disabled. >-- encoder_specs = CFDictionaryCreateMutable( >-- nullptr, 1, &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks); >-- CFDictionarySetValue(encoder_specs, >-- kVTVideoEncoderSpecification_EnableHardwareAcceleratedVideoEncoder, >-- kCFBooleanTrue); >-+ CFTypeRef sessionKeys[] = { kVTVideoEncoderSpecification_EnableHardwareAcceleratedVideoEncoder, kVTVideoEncoderSpecification_RequireHardwareAcceleratedVideoEncoder, kVTCompressionPropertyKey_RealTime }; >-+ CFTypeRef sessionValues[] = { useHardwareEncoder, useHardwareEncoder, kCFBooleanTrue }; >-+ encoderSpecs = CFDictionaryCreate(kCFAllocatorDefault, sessionKeys, sessionValues, 3, &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks); >-+#else >-+ CFTypeRef sessionKeys[] = { kVTCompressionPropertyKey_RealTime }; >-+ CFTypeRef sessionValues[] = { kCFBooleanTrue }; >-+ encoderSpecs = CFDictionaryCreate(kCFAllocatorDefault, sessionKeys, sessionValues, 1, &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks); >- #endif >-+ >- OSStatus status = >-- VTCompressionSessionCreate(nullptr, // use default allocator >-+ CompressionSessionCreate(nullptr, // use default allocator >- _width, >- _height, >-- kCMVideoCodecType_H264, >-- encoder_specs, // use hardware accelerated encoder if available >-+ kCodecTypeH264, >-+ encoderSpecs, // use hardware accelerated encoder if available >- sourceAttributes, >- nullptr, // use default compressed data allocator 
>- compressionOutputCallback, >-@@ -582,32 +638,105 @@ - (int)resetCompressionSessionWithPixelFormat:(OSType)framePixelFormat { >- CFRelease(sourceAttributes); >- sourceAttributes = nullptr; >- } >-- if (encoder_specs) { >-- CFRelease(encoder_specs); >-- encoder_specs = nullptr; >-+ if (encoderSpecs) { >-+ CFRelease(encoderSpecs); >-+ encoderSpecs = nullptr; >- } >-+ >-+#if ENABLE_VCP_ENCODER || defined(WEBRTC_IOS) >- if (status != noErr) { >- RTC_LOG(LS_ERROR) << "Failed to create compression session: " << status; >- return WEBRTC_VIDEO_CODEC_ERROR; >- } >-+#endif >- #if defined(WEBRTC_MAC) && !defined(WEBRTC_IOS) >- CFBooleanRef hwaccl_enabled = nullptr; >-- status = VTSessionCopyProperty(_compressionSession, >-+ if (status == noErr) { >-+ status = VTSessionCopyProperty(_compressionSession, >- kVTCompressionPropertyKey_UsingHardwareAcceleratedVideoEncoder, >- nullptr, >- &hwaccl_enabled); >-+ } >- if (status == noErr && (CFBooleanGetValue(hwaccl_enabled))) { >- RTC_LOG(LS_INFO) << "Compression session created with hw accl enabled"; >- } else { >- RTC_LOG(LS_INFO) << "Compression session created with hw accl disabled"; >-+ >-+#if !ENABLE_VCP_ENCODER && !defined(WEBRTC_IOS) >-+ if (!isStandardFrameSize(_width, _height)) { >-+ RTC_LOG(LS_ERROR) << "Using H264 software encoder with non standard size is not supported"; >-+ return WEBRTC_VIDEO_CODEC_ERROR; >-+ } >-+ >-+ if (!getkVTVideoEncoderSpecification_Usage()) { >-+ RTC_LOG(LS_ERROR) << "RTCSingleVideoEncoderH264 cannot create a H264 software encoder"; >-+ return WEBRTC_VIDEO_CODEC_ERROR; >-+ } >-+ >-+ CFDictionaryRef ioSurfaceValue = CreateCFTypeDictionary(nullptr, nullptr, 0); >-+ int64_t pixelFormatType = framePixelFormat; >-+ CFNumberRef pixelFormat = CFNumberCreate(nullptr, kCFNumberLongType, &pixelFormatType); >-+ >-+ const size_t attributesSize = 3; >-+ CFTypeRef keys[attributesSize] = { >-+ kCVPixelBufferOpenGLCompatibilityKey, >-+ kCVPixelBufferIOSurfacePropertiesKey, >-+ kCVPixelBufferPixelFormatTypeKey 
>-+ }; >-+ CFTypeRef values[attributesSize] = { >-+ kCFBooleanTrue, >-+ ioSurfaceValue, >-+ pixelFormat}; >-+ CFDictionaryRef sourceAttributes = CreateCFTypeDictionary(keys, values, attributesSize); >-+ >-+ if (ioSurfaceValue) { >-+ CFRelease(ioSurfaceValue); >-+ ioSurfaceValue = nullptr; >-+ } >-+ if (pixelFormat) { >-+ CFRelease(pixelFormat); >-+ pixelFormat = nullptr; >-+ } >-+ >-+ CFMutableDictionaryRef encoderSpecs = CFDictionaryCreateMutable(nullptr, 2, &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks); >-+ CFDictionarySetValue(encoderSpecs, kVTVideoEncoderSpecification_EnableHardwareAcceleratedVideoEncoder, kCFBooleanFalse); >-+ int usageValue = 1; >-+ CFNumberRef usage = CFNumberCreate(nullptr, kCFNumberIntType, &usageValue); >-+ CFDictionarySetValue(encoderSpecs, (__bridge CFStringRef)getkVTVideoEncoderSpecification_Usage(), usage); >-+ >-+ if (usage) { >-+ CFRelease(usage); >-+ usage = nullptr; >-+ } >-+ >-+ [self destroyCompressionSession]; >-+ >-+ OSStatus status = >-+ CompressionSessionCreate(nullptr, // use default allocator >-+ _width, >-+ _height, >-+ kCodecTypeH264, >-+ encoderSpecs, >-+ sourceAttributes, >-+ nullptr, // use default compressed data allocator >-+ compressionOutputCallback, >-+ nullptr, >-+ &_compressionSession); >-+ if (sourceAttributes) { >-+ CFRelease(sourceAttributes); >-+ sourceAttributes = nullptr; >-+ } >-+ if (encoderSpecs) { >-+ CFRelease(encoderSpecs); >-+ encoderSpecs = nullptr; >-+ } >-+ if (status != noErr) { >-+ return WEBRTC_VIDEO_CODEC_ERROR; >-+ } >-+#endif >- } >- #endif >- [self configureCompressionSession]; >-- >-- // The pixel buffer pool is dependent on the compression session so if the session is reset, the >-- // pool should be reset as well. 
>-- _pixelBufferPool = VTCompressionSessionGetPixelBufferPool(_compressionSession); >-- >- return WEBRTC_VIDEO_CODEC_OK; >- } >- >-@@ -616,6 +745,10 @@ - (void)configureCompressionSession { >- SetVTSessionProperty(_compressionSession, kVTCompressionPropertyKey_RealTime, true); >- SetVTSessionProperty(_compressionSession, kVTCompressionPropertyKey_ProfileLevel, _profile); >- SetVTSessionProperty(_compressionSession, kVTCompressionPropertyKey_AllowFrameReordering, false); >-+#if ENABLE_VCP_ENCODER >-+ if (auto key = getkVTVideoEncoderSpecification_Usage()) >-+ SetVTSessionProperty(_compressionSession, (__bridge CFStringRef)key, 1); >-+#endif >- [self setEncoderBitrateBps:_targetBitrateBps]; >- // TODO(tkchin): Look at entropy mode and colorspace matrices. >- // TODO(tkchin): Investigate to see if there's any way to make this work. >-@@ -633,10 +766,9 @@ - (void)configureCompressionSession { >- >- - (void)destroyCompressionSession { >- if (_compressionSession) { >-- VTCompressionSessionInvalidate(_compressionSession); >-+ CompressionSessionInvalidate(_compressionSession); >- CFRelease(_compressionSession); >- _compressionSession = nullptr; >-- _pixelBufferPool = nullptr; >- } >- } >- >-@@ -664,7 +796,7 @@ - (void)setEncoderBitrateBps:(uint32_t)bitrateBps { >- CFNumberCreate(kCFAllocatorDefault, kCFNumberSInt64Type, &oneSecondValue); >- const void *nums[2] = {bytesPerSecond, oneSecond}; >- CFArrayRef dataRateLimits = CFArrayCreate(nullptr, nums, 2, &kCFTypeArrayCallBacks); >-- OSStatus status = VTSessionSetProperty( >-+ OSStatus status = CompressionSessionSetProperty( >- _compressionSession, kVTCompressionPropertyKey_DataRateLimits, dataRateLimits); >- if (bytesPerSecond) { >- CFRelease(bytesPerSecond); >-@@ -676,7 +808,7 @@ - (void)setEncoderBitrateBps:(uint32_t)bitrateBps { >- CFRelease(dataRateLimits); >- } >- if (status != noErr) { >-- RTC_LOG(LS_ERROR) << "Failed to set data rate limit with code: " << status; >-+ RTC_LOG(LS_ERROR) << "Failed to set data rate 
limit"; >- } >- >- _encoderBitrateBps = bitrateBps; >-@@ -693,7 +825,7 @@ - (void)frameWasEncoded:(OSStatus)status >- timestamp:(uint32_t)timestamp >- rotation:(RTCVideoRotation)rotation { >- if (status != noErr) { >-- RTC_LOG(LS_ERROR) << "H264 encode failed with code: " << status; >-+ RTC_LOG(LS_ERROR) << "H264 encode failed: " << status; >- return; >- } >- if (infoFlags & kVTEncodeInfo_FrameDropped) { >-@@ -738,6 +870,7 @@ - (void)frameWasEncoded:(OSStatus)status >- frame.rotation = rotation; >- frame.contentType = (_mode == RTCVideoCodecModeScreensharing) ? RTCVideoContentTypeScreenshare : >- RTCVideoContentTypeUnspecified; >-+ frame.spatialIndex = _simulcastIndex; >- frame.flags = webrtc::VideoSendTiming::kInvalid; >- >- int qp; >-@@ -753,9 +886,88 @@ - (void)frameWasEncoded:(OSStatus)status >- _bitrateAdjuster->Update(frame.buffer.length); >- } >- >--- (nullable RTCVideoEncoderQpThresholds *)scalingSettings { >-- return [[RTCVideoEncoderQpThresholds alloc] initWithThresholdsLow:kLowH264QpThreshold >-- high:kHighH264QpThreshold]; >-+- (RTCVideoEncoderQpThresholds *)scalingSettings { >-+ return [[RTCVideoEncoderQpThresholds alloc] initWithThresholdsLow:kLowH264QpThreshold high:kHighH264QpThreshold]; >-+} >-+ >-+- (int)setRateAllocation:(RTCVideoBitrateAllocation *)allocation framerate:(uint32_t)framerate { >-+ return 0; >-+} >-+ >-+@end >-+ >-+@implementation RTCVideoEncoderH264 { >-+ NSMutableArray<RTCSingleVideoEncoderH264*> *_codecs; >-+ RTCVideoCodecInfo *_codecInfo; >-+} >-+- (instancetype)initWithCodecInfo:(RTCVideoCodecInfo *)codecInfo { >-+ if (self = [super init]) { >-+ _codecInfo = codecInfo; >-+ } >-+ return self; >-+} >-+ >-+- (void)setCallback:(RTCVideoEncoderCallback)callback { >-+ for (RTCSingleVideoEncoderH264 *codec : _codecs) >-+ [codec setCallback:callback]; >-+} >-+ >-+- (NSInteger)startEncodeWithSettings:(RTCVideoEncoderSettings *)settings numberOfCores:(int)numberOfCores { >-+ auto nativeCodecSettings = settings.nativeVideoCodec; >-+ >-+ 
_codecs = [[NSMutableArray alloc] init]; >-+ for (unsigned index = 0 ; index < nativeCodecSettings.numberOfSimulcastStreams; ++index) { >-+ auto codec = [[RTCSingleVideoEncoderH264 alloc] initWithCodecInfo:_codecInfo simulcastIndex:index]; >-+ [_codecs addObject:codec]; >-+ >-+ auto codecSettings = nativeCodecSettings; >-+ codecSettings.width = nativeCodecSettings.simulcastStream[index].width; >-+ codecSettings.height = nativeCodecSettings.simulcastStream[index].height; >-+ codecSettings.maxBitrate = nativeCodecSettings.simulcastStream[index].maxBitrate; >-+ codecSettings.targetBitrate = nativeCodecSettings.simulcastStream[index].targetBitrate; >-+ codecSettings.minBitrate = nativeCodecSettings.simulcastStream[index].minBitrate; >-+ codecSettings.qpMax = nativeCodecSettings.simulcastStream[index].qpMax; >-+ codecSettings.active = true; >-+ >-+ auto *settings = [[RTCVideoEncoderSettings alloc] initWithNativeVideoCodec:&codecSettings]; >-+ [codec startEncodeWithSettings:settings numberOfCores:numberOfCores]; >-+ } >-+ return 0; >-+} >-+ >-+- (NSInteger)releaseEncoder { >-+ for (RTCSingleVideoEncoderH264 *codec : _codecs) >-+ [codec releaseEncoder]; >-+ _codecs = nil; >-+ return 0; >-+} >-+ >-+- (NSInteger)encode:(RTCVideoFrame *)frame codecSpecificInfo:(nullable id<RTCCodecSpecificInfo>)info frameTypes:(NSArray<NSNumber *> *)frameTypes { >-+ int result = 0; >-+ for (RTCSingleVideoEncoderH264 *codec : _codecs) >-+ result |= [codec encode:frame codecSpecificInfo:info frameTypes:frameTypes]; >-+ return result; >-+} >-+ >-+- (int)setRateAllocation:(RTCVideoBitrateAllocation *)bitRateAllocation framerate:(uint32_t) framerate { >-+ int result = 0; >-+ unsigned counter = 0; >-+ auto nativeBitRateAllocation = bitRateAllocation.nativeVideoBitrateAllocation; >-+ for (RTCSingleVideoEncoderH264 *codec : _codecs) >-+ result |= [codec setBitrate:nativeBitRateAllocation.GetSpatialLayerSum(counter++) framerate:framerate]; >-+ return result; >-+} >-+ >-+- (NSString 
*)implementationName { >-+ return @"VideoToolbox"; >-+} >-+ >-+- (RTCVideoEncoderQpThresholds *)scalingSettings { >-+ return [[RTCVideoEncoderQpThresholds alloc] initWithThresholdsLow:kLowH264QpThreshold high:kHighH264QpThreshold]; >-+} >-+ >-+- (int)setBitrate:(uint32_t)bitrateKbit framerate:(uint32_t)framerate { >-+ return 0; >- } >- >- @end >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/helpers.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/helpers.cc >-index ac957f1b497..46daf44ecac 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/helpers.cc >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/helpers.cc >-@@ -35,12 +35,12 @@ std::string CFStringToString(const CFStringRef cf_string) { >- } >- >- // Convenience function for setting a VT property. >--void SetVTSessionProperty(VTSessionRef session, >-+void SetVTSessionProperty(CompressionSessionRef session, >- CFStringRef key, >- int32_t value) { >- CFNumberRef cfNum = >- CFNumberCreate(kCFAllocatorDefault, kCFNumberSInt32Type, &value); >-- OSStatus status = VTSessionSetProperty(session, key, cfNum); >-+ OSStatus status = CompressionSessionSetProperty(session, key, cfNum); >- CFRelease(cfNum); >- if (status != noErr) { >- std::string key_string = CFStringToString(key); >-@@ -50,13 +50,13 @@ void SetVTSessionProperty(VTSessionRef session, >- } >- >- // Convenience function for setting a VT property. 
>--void SetVTSessionProperty(VTSessionRef session, >-+void SetVTSessionProperty(CompressionSessionRef session, >- CFStringRef key, >- uint32_t value) { >- int64_t value_64 = value; >- CFNumberRef cfNum = >- CFNumberCreate(kCFAllocatorDefault, kCFNumberSInt64Type, &value_64); >-- OSStatus status = VTSessionSetProperty(session, key, cfNum); >-+ OSStatus status = CompressionSessionSetProperty(session, key, cfNum); >- CFRelease(cfNum); >- if (status != noErr) { >- std::string key_string = CFStringToString(key); >-@@ -66,9 +66,9 @@ void SetVTSessionProperty(VTSessionRef session, >- } >- >- // Convenience function for setting a VT property. >--void SetVTSessionProperty(VTSessionRef session, CFStringRef key, bool value) { >-+void SetVTSessionProperty(CompressionSessionRef session, CFStringRef key, bool value) { >- CFBooleanRef cf_bool = (value) ? kCFBooleanTrue : kCFBooleanFalse; >-- OSStatus status = VTSessionSetProperty(session, key, cf_bool); >-+ OSStatus status = CompressionSessionSetProperty(session, key, cf_bool); >- if (status != noErr) { >- std::string key_string = CFStringToString(key); >- RTC_LOG(LS_ERROR) << "VTSessionSetProperty failed to set: " << key_string >-@@ -77,10 +77,10 @@ void SetVTSessionProperty(VTSessionRef session, CFStringRef key, bool value) { >- } >- >- // Convenience function for setting a VT property. 
>--void SetVTSessionProperty(VTSessionRef session, >-+void SetVTSessionProperty(CompressionSessionRef session, >- CFStringRef key, >- CFStringRef value) { >-- OSStatus status = VTSessionSetProperty(session, key, value); >-+ OSStatus status = CompressionSessionSetProperty(session, key, value); >- if (status != noErr) { >- std::string key_string = CFStringToString(key); >- std::string val_string = CFStringToString(value); >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/helpers.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/helpers.h >-index 0683ea79e56..0c669e8bce6 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/helpers.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/helpers.h >-@@ -16,6 +16,8 @@ >- #include <VideoToolbox/VideoToolbox.h> >- #include <string> >- >-+#include "sdk/WebKit/EncoderUtilities.h" >-+ >- // Convenience function for creating a dictionary. >- inline CFDictionaryRef CreateCFTypeDictionary(CFTypeRef* keys, >- CFTypeRef* values, >-@@ -29,18 +31,18 @@ inline CFDictionaryRef CreateCFTypeDictionary(CFTypeRef* keys, >- std::string CFStringToString(const CFStringRef cf_string); >- >- // Convenience function for setting a VT property. >--void SetVTSessionProperty(VTSessionRef session, CFStringRef key, int32_t value); >-+void SetVTSessionProperty(CompressionSessionRef session, CFStringRef key, int32_t value); >- >- // Convenience function for setting a VT property. >--void SetVTSessionProperty(VTSessionRef session, >-+void SetVTSessionProperty(CompressionSessionRef session, >- CFStringRef key, >- uint32_t value); >- >- // Convenience function for setting a VT property. >--void SetVTSessionProperty(VTSessionRef session, CFStringRef key, bool value); >-+void SetVTSessionProperty(CompressionSessionRef session, CFStringRef key, bool value); >- >- // Convenience function for setting a VT property. 
>--void SetVTSessionProperty(VTSessionRef session, >-+void SetVTSessionProperty(CompressionSessionRef session, >- CFStringRef key, >- CFStringRef value); >- >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_frame_buffer/RTCCVPixelBuffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_frame_buffer/RTCCVPixelBuffer.h >-index 432a3825746..abe9dfca93b 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_frame_buffer/RTCCVPixelBuffer.h >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_frame_buffer/RTCCVPixelBuffer.h >-@@ -17,6 +17,7 @@ NS_ASSUME_NONNULL_BEGIN >- >- /** RTCVideoFrameBuffer containing a CVPixelBufferRef */ >- RTC_OBJC_EXPORT >-+__attribute__((objc_runtime_name("WK_RTCCVPixelBuffer"))) >- @interface RTCCVPixelBuffer : NSObject <RTCVideoFrameBuffer> >- >- @property(nonatomic, readonly) CVPixelBufferRef pixelBuffer; >-diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/native/src/objc_video_encoder_factory.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/native/src/objc_video_encoder_factory.mm >-index ba7c3e96754..c4c85cc6ab9 100644 >---- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/native/src/objc_video_encoder_factory.mm >-+++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/native/src/objc_video_encoder_factory.mm >-@@ -92,6 +92,12 @@ class ObjCVideoEncoder : public VideoEncoder { >- return [encoder_ setBitrate:bitrate framerate:framerate]; >- } >- >-+ int32_t SetRateAllocation(const VideoBitrateAllocation& allocation, uint32_t framerate) { >-+ RTCVideoBitrateAllocation *bitrateAllocation = >-+ [[RTCVideoBitrateAllocation alloc] initWithNativeVideoBitrateAllocation:&allocation]; >-+ return [encoder_ setRateAllocation: bitrateAllocation framerate:framerate]; >-+ } >-+ >- bool SupportsNativeHandle() const { return true; } >- >- VideoEncoder::ScalingSettings GetScalingSettings() const { >--- >-2.17.1 (Apple Git-112) 
>- >diff --git a/Source/ThirdParty/libwebrtc/WebKit/libwebrtc.diff b/Source/ThirdParty/libwebrtc/WebKit/libwebrtc.diff >new file mode 100644 >index 0000000000000000000000000000000000000000..92428e415dca86fc0744ecd9813eecf0d2c077e6 >--- /dev/null >+++ b/Source/ThirdParty/libwebrtc/WebKit/libwebrtc.diff >@@ -0,0 +1,2568 @@ >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_encoder_ilbc.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_encoder_ilbc.cc >+index 896ed238cc3..4d028c9aaf4 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_encoder_ilbc.cc >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/audio_codecs/ilbc/audio_encoder_ilbc.cc >+@@ -33,7 +33,7 @@ int GetIlbcBitrate(int ptime) { >+ // 50 bytes per frame of 30 ms => (approx) 13333 bits/s. >+ return 13333; >+ default: >+- FATAL(); >++ RTC_FATAL(); >+ } >+ } >+ } // namespace >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/api/mediatypes.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/api/mediatypes.cc >+index 599542db08f..140d0ae9625 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/api/mediatypes.cc >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/api/mediatypes.cc >+@@ -28,7 +28,7 @@ std::string MediaTypeToString(MediaType type) { >+ case MEDIA_TYPE_DATA: >+ return kMediaTypeData; >+ } >+- FATAL(); >++ RTC_FATAL(); >+ // Not reachable; avoids compile warning. >+ return ""; >+ } >+@@ -41,7 +41,7 @@ MediaType MediaTypeFromString(const std::string& type_str) { >+ } else if (type_str == kMediaTypeData) { >+ return MEDIA_TYPE_DATA; >+ } >+- FATAL(); >++ RTC_FATAL(); >+ // Not reachable; avoids compile warning. 
>+ return static_cast<MediaType>(-1); >+ } >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/remix_resample.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/remix_resample.cc >+index cc59e2a823c..b1158da213f 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/audio/remix_resample.cc >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/audio/remix_resample.cc >+@@ -55,7 +55,7 @@ void RemixAndResample(const int16_t* src_data, >+ >+ if (resampler->InitializeIfNeeded(sample_rate_hz, dst_frame->sample_rate_hz_, >+ audio_ptr_num_channels) == -1) { >+- FATAL() << "InitializeIfNeeded failed: sample_rate_hz = " << sample_rate_hz >++ RTC_FATAL() << "InitializeIfNeeded failed: sample_rate_hz = " << sample_rate_hz >+ << ", dst_frame->sample_rate_hz_ = " << dst_frame->sample_rate_hz_ >+ << ", audio_ptr_num_channels = " << audio_ptr_num_channels; >+ } >+@@ -69,7 +69,7 @@ void RemixAndResample(const int16_t* src_data, >+ resampler->Resample(audio_ptr, src_length, dst_frame->mutable_data(), >+ AudioFrame::kMaxDataSizeSamples); >+ if (out_length == -1) { >+- FATAL() << "Resample failed: audio_ptr = " << audio_ptr >++ RTC_FATAL() << "Resample failed: audio_ptr = " << audio_ptr >+ << ", src_length = " << src_length >+ << ", dst_frame->mutable_data() = " << dst_frame->mutable_data(); >+ } >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/wav_file.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/wav_file.cc >+index 1b8bcbd923c..90a89281e5c 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/wav_file.cc >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_audio/wav_file.cc >+@@ -63,7 +63,7 @@ WavReader::WavReader(rtc::PlatformFile file) { >+ if (!rtc::ClosePlatformFile(file)) { >+ RTC_LOG(LS_ERROR) << "Can't close file."; >+ } >+- FATAL() << "Could not open wav file for reading."; >++ RTC_FATAL() << "Could not open wav file for reading."; >+ } >+ >+ ReadableWavFile readable(file_handle_); >+@@ -148,7 +148,7 @@ 
WavWriter::WavWriter(rtc::PlatformFile file, >+ if (!rtc::ClosePlatformFile(file)) { >+ RTC_LOG(LS_ERROR) << "Can't close file."; >+ } >+- FATAL() << "Could not open wav file for writing."; >++ RTC_FATAL() << "Could not open wav file for writing."; >+ } >+ >+ RTC_CHECK(CheckWavParameters(num_channels_, sample_rate_, kWavFormat, >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/video_frame_buffer.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/video_frame_buffer.cc >+index f4f138fccb5..5f89b4acb96 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/video_frame_buffer.cc >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/common_video/video_frame_buffer.cc >+@@ -272,7 +272,7 @@ rtc::scoped_refptr<PlanarYuvBuffer> WrapYuvBuffer( >+ return WrapI444Buffer(width, height, y_plane, y_stride, u_plane, u_stride, >+ v_plane, v_stride, no_longer_used); >+ default: >+- FATAL() << "Unexpected frame buffer type."; >++ RTC_FATAL() << "Unexpected frame buffer type."; >+ return nullptr; >+ } >+ } >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/audio_encoder_cng.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/audio_encoder_cng.cc >+index 9a2926143d0..476954fe60d 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/audio_encoder_cng.cc >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/cng/audio_encoder_cng.cc >+@@ -171,7 +171,7 @@ AudioEncoder::EncodedInfo AudioEncoderCng::EncodeImpl( >+ break; >+ } >+ case Vad::kError: { >+- FATAL(); // Fails only if fed invalid data. >++ RTC_FATAL(); // Fails only if fed invalid data. 
>+ break; >+ } >+ } >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/ilbc/audio_encoder_ilbc.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/ilbc/audio_encoder_ilbc.cc >+index 8801fd55590..35e6fc64a62 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/ilbc/audio_encoder_ilbc.cc >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/audio_coding/codecs/ilbc/audio_encoder_ilbc.cc >+@@ -41,7 +41,7 @@ int GetIlbcBitrate(int ptime) { >+ // 50 bytes per frame of 30 ms => (approx) 13333 bits/s. >+ return 13333; >+ default: >+- FATAL(); >++ RTC_FATAL(); >+ } >+ } >+ >+@@ -148,7 +148,7 @@ size_t AudioEncoderIlbcImpl::RequiredOutputSizeBytes() const { >+ case 6: >+ return 2 * 50; >+ default: >+- FATAL(); >++ RTC_FATAL(); >+ } >+ } >+ >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/vp9_noop.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/vp9_noop.cc >+new file mode 100644 >+index 00000000000..b058c2d6728 >+--- /dev/null >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/modules/video_coding/codecs/vp9/vp9_noop.cc >+@@ -0,0 +1,43 @@ >++/* >++ * Copyright (c) 2016 The WebRTC project authors. All Rights Reserved. >++ * >++ * Use of this source code is governed by a BSD-style license >++ * that can be found in the LICENSE file in the root of the source >++ * tree. An additional intellectual property rights grant can be found >++ * in the file PATENTS. All contributing project authors may >++ * be found in the AUTHORS file in the root of the source tree. 
>++ * >++ */ >++ >++#if !defined(RTC_DISABLE_VP9) >++#error >++#endif // !defined(RTC_DISABLE_VP9) >++ >++#include "modules/video_coding/codecs/vp9/include/vp9.h" >++ >++#include "api/video_codecs/sdp_video_format.h" >++#include "rtc_base/checks.h" >++ >++namespace webrtc { >++ >++std::vector<SdpVideoFormat> SupportedVP9Codecs() { >++ return std::vector<SdpVideoFormat>(); >++} >++ >++std::unique_ptr<VP9Encoder> VP9Encoder::Create() { >++ RTC_NOTREACHED(); >++ return nullptr; >++} >++ >++std::unique_ptr<VP9Encoder> VP9Encoder::Create( >++ const cricket::VideoCodec& codec) { >++ RTC_NOTREACHED(); >++ return nullptr; >++} >++ >++std::unique_ptr<VP9Decoder> VP9Decoder::Create() { >++ RTC_NOTREACHED(); >++ return nullptr; >++} >++ >++} // namespace webrtc >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/tcpport.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/tcpport.cc >+index e018ea755d6..0d7aea9d60a 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/tcpport.cc >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/p2p/base/tcpport.cc >+@@ -351,11 +351,21 @@ TCPConnection::TCPConnection(TCPPort* port, >+ << ", port() Network:" << port->Network()->ToString(); >+ const std::vector<rtc::InterfaceAddress>& desired_addresses = >+ port_->Network()->GetIPs(); >++ >++#if defined(WEBRTC_WEBKIT_BUILD) >++ RTC_DCHECK(socket->GetLocalAddress().IsLoopbackIP() || >++ (std::find_if(desired_addresses.begin(), desired_addresses.end(), >++ [this](const rtc::InterfaceAddress& addr) { >++ return socket_->GetLocalAddress().ipaddr() == >++ addr; >++ }) != desired_addresses.end())); >++ #else >+ RTC_DCHECK(std::find_if(desired_addresses.begin(), desired_addresses.end(), >+ [this](const rtc::InterfaceAddress& addr) { >+ return socket_->GetLocalAddress().ipaddr() == >+ addr; >+ }) != desired_addresses.end()); >++#endif >+ ConnectSocketSignals(socket); >+ } >+ } >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectionfactory.cc 
b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectionfactory.cc >+index 37c6a0b0bcf..75261508b27 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectionfactory.cc >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/peerconnectionfactory.cc >+@@ -261,7 +261,7 @@ RtpCapabilities PeerConnectionFactory::GetRtpSenderCapabilities( >+ return RtpCapabilities(); >+ } >+ // Not reached; avoids compile warning. >+- FATAL(); >++ RTC_FATAL(); >+ } >+ >+ RtpCapabilities PeerConnectionFactory::GetRtpReceiverCapabilities( >+@@ -288,7 +288,7 @@ RtpCapabilities PeerConnectionFactory::GetRtpReceiverCapabilities( >+ return RtpCapabilities(); >+ } >+ // Not reached; avoids compile warning. >+- FATAL(); >++ RTC_FATAL(); >+ } >+ >+ rtc::scoped_refptr<AudioSourceInterface> >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpparametersconversion.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpparametersconversion.cc >+index bf3bce3daa3..865685d73c4 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpparametersconversion.cc >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/pc/rtpparametersconversion.cc >+@@ -61,7 +61,7 @@ RTCErrorOr<cricket::FeedbackParam> ToCricketFeedbackParam( >+ return cricket::FeedbackParam(cricket::kRtcpFbParamTransportCc); >+ } >+ // Not reached; avoids compile warning. >+- FATAL(); >++ RTC_FATAL(); >+ } >+ >+ template <typename C> >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/checks.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/checks.h >+index 9de6d47573c..5cf0679910e 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/checks.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/checks.h >+@@ -80,7 +80,7 @@ RTC_NORETURN void rtc_FatalMessage(const char* file, int line, const char* msg); >+ // messages if the condition doesn't hold. Prefer them to raw RTC_CHECK and >+ // RTC_DCHECK. >+ // >+-// - FATAL() aborts unconditionally. 
>++// - RTC_FATAL() aborts unconditionally. >+ // >+ // TODO(ajm): Ideally, checks.h would be combined with logging.h, but >+ // consolidation with system_wrappers/logging.h should happen first. >+@@ -348,9 +348,9 @@ class FatalLogCall final { >+ #define RTC_NOTREACHED() RTC_DCHECK(RTC_UNREACHABLE_CODE_HIT) >+ >+ // TODO(bugs.webrtc.org/8454): Add an RTC_ prefix or rename differently. >+-#define FATAL() \ >++#define RTC_FATAL() \ >+ rtc::webrtc_checks_impl::FatalLogCall<false>(__FILE__, __LINE__, \ >+- "FATAL()") & \ >++ "RTC_FATAL()") & \ >+ rtc::webrtc_checks_impl::LogStreamer<>() >+ >+ // Performs the integer division a/b and returns the result. CHECKs that the >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/flags.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/flags.cc >+index bcce0dafbc0..6b43b91e026 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/flags.cc >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/flags.cc >+@@ -81,7 +81,7 @@ void Flag::SetToDefault() { >+ variable_->s = default_.s; >+ return; >+ } >+- FATAL() << "unreachable code"; >++ RTC_FATAL() << "unreachable code"; >+ } >+ >+ static const char* Type2String(Flag::Type type) { >+@@ -95,7 +95,7 @@ static const char* Type2String(Flag::Type type) { >+ case Flag::STRING: >+ return "string"; >+ } >+- FATAL() << "unreachable code"; >++ RTC_FATAL() << "unreachable code"; >+ } >+ >+ static void PrintFlagValue(Flag::Type type, FlagValue* p) { >+@@ -113,7 +113,7 @@ static void PrintFlagValue(Flag::Type type, FlagValue* p) { >+ printf("%s", p->s); >+ return; >+ } >+- FATAL() << "unreachable code"; >++ RTC_FATAL() << "unreachable code"; >+ } >+ >+ void Flag::Print(bool print_current_value) { >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/location.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/location.h >+index 513bc263651..718d9589348 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/location.h >++++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/location.h >+@@ -50,7 +50,7 @@ class Location { >+ #define RTC_FROM_HERE RTC_FROM_HERE_WITH_FUNCTION(__FUNCTION__) >+ >+ #define RTC_FROM_HERE_WITH_FUNCTION(function_name) \ >+- ::rtc::Location(function_name, __FILE__ ":" STRINGIZE(__LINE__)) >++ ::rtc::Location(function_name, __FILE__ ":" RTC_STRINGIZE(__LINE__)) >+ >+ } // namespace rtc >+ >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging.cc >+index bb4fbfaae25..44ec3f50c19 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging.cc >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/logging.cc >+@@ -20,28 +20,26 @@ >+ #include <CoreServices/CoreServices.h> >+ #elif defined(WEBRTC_ANDROID) >+ #include <android/log.h> >+- >+ // Android has a 1024 limit on log inputs. We use 60 chars as an >+ // approx for the header/tag portion. >+ // See android/system/core/liblog/logd_write.c >+ static const int kMaxLogLineSize = 1024 - 60; >+ #endif // WEBRTC_MAC && !defined(WEBRTC_IOS) || WEBRTC_ANDROID >+ >+-#include <stdio.h> >+-#include <string.h> >++#include <limits.h> >+ #include <time.h> >++ >+ #include <algorithm> >+ #include <cstdarg> >+ #include <vector> >+ >+-#include "rtc_base/checks.h" >+ #include "rtc_base/criticalsection.h" >+ #include "rtc_base/logging.h" >+ #include "rtc_base/platform_thread_types.h" >++#include "rtc_base/never_destroyed.h" >+ #include "rtc_base/stringencode.h" >+ #include "rtc_base/strings/string_builder.h" >+ #include "rtc_base/stringutils.h" >+-#include "rtc_base/thread_annotations.h" >+ #include "rtc_base/timeutils.h" >+ >+ namespace rtc { >+@@ -65,20 +63,19 @@ const char* FilenameFromPath(const char* file) { >+ return (end1 > end2) ? end1 + 1 : end2 + 1; >+ } >+ >+-// Global lock for log subsystem, only needed to serialize access to streams_. 
>+-CriticalSection g_log_crit; >++// Global lock for log subsystem, only needed to serialize access to streamList(). >++CriticalSection& logCriticalScope() { >++ static auto scope = makeNeverDestroyed<>(CriticalSection { }); >++ return scope.get(); >++} >++ >+ } // namespace >+ >+ // Inefficient default implementation, override is recommended. >+ void LogSink::OnLogMessage(const std::string& msg, >+ LoggingSeverity severity, >+ const char* tag) { >+- OnLogMessage(tag + (": " + msg), severity); >+-} >+- >+-void LogSink::OnLogMessage(const std::string& msg, >+- LoggingSeverity /* severity */) { >+- OnLogMessage(msg); >++ OnLogMessage(tag + (": " + msg)); >+ } >+ >+ ///////////////////////////////////////////////////////////////////////////// >+@@ -91,7 +88,15 @@ bool LogMessage::log_to_stderr_ = true; >+ // Note: we explicitly do not clean this up, because of the uncertain ordering >+ // of destructors at program exit. Let the person who sets the stream trigger >+ // cleanup by setting to null, or let it leak (safe at program exit). >+-LogMessage::StreamList LogMessage::streams_ RTC_GUARDED_BY(g_log_crit); >++typedef std::pair<LogSink*, LoggingSeverity> StreamAndSeverity; >++typedef std::list<StreamAndSeverity> StreamList; >++ >++// The output streams and their associated severities >++StreamList& streamList() >++ RTC_EXCLUSIVE_LOCKS_REQUIRED(logCriticalScope()) { >++ static auto stream_list = makeNeverDestroyed<>(StreamList { }); >++ return stream_list.get(); >++} >+ >+ // Boolean options default to false (0) >+ bool LogMessage::thread_, LogMessage::timestamp_; >+@@ -105,6 +110,7 @@ LogMessage::LogMessage(const char* file, >+ LogErrorContext err_ctx, >+ int err) >+ : severity_(sev) { >++ >+ if (timestamp_) { >+ // Use SystemTimeMillis so that even if tests use fake clocks, the timestamp >+ // in log messages represents the real system time. 
>+@@ -205,13 +211,13 @@ LogMessage::~LogMessage() { >+ #endif >+ } >+ >+- CritScope cs(&g_log_crit); >+- for (auto& kv : streams_) { >++ CritScope cs(&logCriticalScope()); >++ for (auto& kv : streamList()) { >+ if (severity_ >= kv.second) { >+ #if defined(WEBRTC_ANDROID) >+ kv.first->OnLogMessage(str, severity_, tag_); >+ #else >+- kv.first->OnLogMessage(str, severity_); >++ kv.first->OnLogMessage(str); >+ #endif >+ } >+ } >+@@ -254,7 +260,7 @@ void LogMessage::LogTimestamps(bool on) { >+ >+ void LogMessage::LogToDebug(LoggingSeverity min_sev) { >+ g_dbg_sev = min_sev; >+- CritScope cs(&g_log_crit); >++ CritScope cs(&logCriticalScope()); >+ UpdateMinLogSeverity(); >+ } >+ >+@@ -263,9 +269,9 @@ void LogMessage::SetLogToStderr(bool log_to_stderr) { >+ } >+ >+ int LogMessage::GetLogToStream(LogSink* stream) { >+- CritScope cs(&g_log_crit); >++ CritScope cs(&logCriticalScope()); >+ LoggingSeverity sev = LS_NONE; >+- for (auto& kv : streams_) { >++ for (auto& kv : streamList()) { >+ if (!stream || stream == kv.first) { >+ sev = std::min(sev, kv.second); >+ } >+@@ -274,16 +280,16 @@ int LogMessage::GetLogToStream(LogSink* stream) { >+ } >+ >+ void LogMessage::AddLogToStream(LogSink* stream, LoggingSeverity min_sev) { >+- CritScope cs(&g_log_crit); >+- streams_.push_back(std::make_pair(stream, min_sev)); >++ CritScope cs(&logCriticalScope()); >++ streamList().push_back(std::make_pair(stream, min_sev)); >+ UpdateMinLogSeverity(); >+ } >+ >+ void LogMessage::RemoveLogToStream(LogSink* stream) { >+- CritScope cs(&g_log_crit); >+- for (StreamList::iterator it = streams_.begin(); it != streams_.end(); ++it) { >++ CritScope cs(&logCriticalScope()); >++ for (StreamList::iterator it = streamList().begin(); it != streamList().end(); ++it) { >+ if (stream == it->first) { >+- streams_.erase(it); >++ streamList().erase(it); >+ break; >+ } >+ } >+@@ -342,9 +348,9 @@ void LogMessage::ConfigureLogging(const char* params) { >+ } >+ >+ void LogMessage::UpdateMinLogSeverity() >+- 
RTC_EXCLUSIVE_LOCKS_REQUIRED(g_log_crit) { >++ RTC_EXCLUSIVE_LOCKS_REQUIRED(logCriticalScope()) { >+ LoggingSeverity min_sev = g_dbg_sev; >+- for (const auto& kv : streams_) { >++ for (const auto& kv : streamList()) { >+ const LoggingSeverity sev = kv.second; >+ min_sev = std::min(min_sev, sev); >+ } >+@@ -458,11 +464,8 @@ bool LogMessage::IsNoop(LoggingSeverity severity) { >+ // TODO(tommi): We're grabbing this lock for every LogMessage instance that >+ // is going to be logged. This introduces unnecessary synchronization for >+ // a feature that's mostly used for testing. >+- CritScope cs(&g_log_crit); >+- if (streams_.size() > 0) >+- return false; >+- >+- return true; >++ CritScope cs(&logCriticalScope()); >++ return streamList().size() == 0; >+ } >+ >+ void LogMessage::FinishPrintStream() { >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/never_destroyed.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/never_destroyed.h >+new file mode 100644 >+index 00000000000..fcc62e35534 >+--- /dev/null >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/never_destroyed.h >+@@ -0,0 +1,69 @@ >++/* >++ * Copyright (C) 2018 Apple Inc. All rights reserved. >++ * >++ * Redistribution and use in source and binary forms, with or without >++ * modification, are permitted provided that the following conditions >++ * are met: >++ * 1. Redistributions of source code must retain the above copyright >++ * notice, this list of conditions and the following disclaimer. >++ * 2. Redistributions in binary form must reproduce the above copyright >++ * notice, this list of conditions and the following disclaimer in the >++ * documentation and/or other materials provided with the distribution. >++ * >++ * THIS SOFTWARE IS PROVIDED BY APPLE INC. AND ITS CONTRIBUTORS ``AS IS'' >++ * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, >++ * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR >++ * PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS >++ * BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR >++ * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF >++ * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS >++ * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN >++ * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) >++ * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF >++ * THE POSSIBILITY OF SUCH DAMAGE. >++ */ >++ >++#pragma once >++ >++#include <type_traits> >++#include <utility> >++ >++namespace rtc { >++ >++template<typename T> class NeverDestroyed { >++public: >++ template<typename... Args> NeverDestroyed(Args&&... args) >++ { >++ new (storagePointer()) T(std::forward<Args>(args)...); >++ } >++ >++ NeverDestroyed(NeverDestroyed&& other) >++ { >++ new (storagePointer()) T(std::move(*other.storagePointer())); >++ } >++ >++ operator T&() { return *storagePointer(); } >++ T& get() { return *storagePointer(); } >++ >++ operator const T&() const { return *storagePointer(); } >++ const T& get() const { return *storagePointer(); } >++ >++private: >++ NeverDestroyed(const NeverDestroyed&) = delete; >++ NeverDestroyed& operator=(const NeverDestroyed&) = delete; >++ >++ using PointerType = typename std::remove_const<T>::type*; >++ >++ PointerType storagePointer() const { return const_cast<PointerType>(reinterpret_cast<const T*>(&m_storage)); } >++ >++ // FIXME: Investigate whether we should allocate a hunk of virtual memory >++ // and hand out chunks of it to NeverDestroyed instead, to reduce fragmentation. 
>++ typename std::aligned_storage<sizeof(T), std::alignment_of<T>::value>::type m_storage; >++}; >++ >++template<typename T> inline NeverDestroyed<T> makeNeverDestroyed(T&& argument) >++{ >++ return NeverDestroyed<T>(std::move(argument)); >++} >++ >++} // namespace rtc >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/numerics/safe_conversions.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/numerics/safe_conversions.h >+index 58efcaa746a..48c212e4d49 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/numerics/safe_conversions.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/numerics/safe_conversions.h >+@@ -63,11 +63,11 @@ inline Dst saturated_cast(Src value) { >+ >+ // Should fail only on attempting to assign NaN to a saturated integer. >+ case internal::TYPE_INVALID: >+- FATAL(); >++ RTC_FATAL(); >+ return std::numeric_limits<Dst>::max(); >+ } >+ >+- FATAL(); >++ RTC_FATAL(); >+ return static_cast<Dst>(value); >+ } >+ >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslcertificate.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslcertificate.cc >+index 4e61b86f3ba..4e325dcec01 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslcertificate.cc >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/opensslcertificate.cc >+@@ -244,11 +244,11 @@ std::unique_ptr<SSLCertificate> OpenSSLCertificate::Clone() const { >+ std::string OpenSSLCertificate::ToPEMString() const { >+ BIO* bio = BIO_new(BIO_s_mem()); >+ if (!bio) { >+- FATAL() << "Unreachable code."; >++ RTC_FATAL() << "Unreachable code."; >+ } >+ if (!PEM_write_bio_X509(bio, x509_)) { >+ BIO_free(bio); >+- FATAL() << "Unreachable code."; >++ RTC_FATAL() << "Unreachable code."; >+ } >+ BIO_write(bio, "\0", 1); >+ char* buffer; >+@@ -264,11 +264,11 @@ void OpenSSLCertificate::ToDER(Buffer* der_buffer) const { >+ // Calculates the DER representation of the certificate, from scratch. 
>+ BIO* bio = BIO_new(BIO_s_mem()); >+ if (!bio) { >+- FATAL() << "Unreachable code."; >++ RTC_FATAL() << "Unreachable code."; >+ } >+ if (!i2d_X509_bio(bio, x509_)) { >+ BIO_free(bio); >+- FATAL() << "Unreachable code."; >++ RTC_FATAL() << "Unreachable code."; >+ } >+ char* data = nullptr; >+ size_t length = BIO_get_mem_data(bio, &data); >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/physicalsocketserver.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/physicalsocketserver.cc >+index 4ad2857c00c..7448a7e0427 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/physicalsocketserver.cc >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/physicalsocketserver.cc >+@@ -596,7 +596,7 @@ bool SocketDispatcher::Initialize() { >+ #elif defined(WEBRTC_POSIX) >+ fcntl(s_, F_SETFL, fcntl(s_, F_GETFL, 0) | O_NONBLOCK); >+ #endif >+-#if defined(WEBRTC_IOS) >++#if defined(WEBRTC_IOS) || (defined(WEBRTC_MAC) && defined(WEBRTC_WEBKIT_BUILD)) >+ // iOS may kill sockets when the app is moved to the background >+ // (specifically, if the app doesn't use the "voip" UIBackgroundMode). When >+ // we attempt to write to such a socket, SIGPIPE will be raised, which by >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringize_macros.h b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringize_macros.h >+index aee8d14551d..38c6f1836bd 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringize_macros.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/stringize_macros.h >+@@ -8,7 +8,7 @@ >+ * be found in the AUTHORS file in the root of the source tree. 
>+ */ >+ >+-// Modified from the Chromium original: >+ // src/base/strings/stringize_macros.h >+ >+ // This file defines preprocessor macros for stringizing preprocessor >+@@ -33,6 +33,6 @@ >+ // Then: >+ // STRINGIZE(A) produces "FOO" >+ // STRINGIZE(B(y)) produces "myobj->FunctionCall(y)" >+-#define STRINGIZE(x) STRINGIZE_NO_EXPANSION(x) >++#define RTC_STRINGIZE(x) STRINGIZE_NO_EXPANSION(x) >+ >+ #endif // RTC_BASE_STRINGIZE_MACROS_H_ >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/virtualsocketserver.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/virtualsocketserver.cc >+index 79694112c81..bfce0f54541 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/virtualsocketserver.cc >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/rtc_base/virtualsocketserver.cc >+@@ -1026,9 +1026,9 @@ void VirtualSocketServer::UpdateDelayDistribution() { >+ } >+ } >+ >+-static double PI = 4 * atan(1.0); >+ >+ static double Normal(double x, double mean, double stddev) { >++ static double PI = 4 * atan(1.0); >+ double a = (x - mean) * (x - mean) / (2 * stddev * stddev); >+ return exp(-a) / (stddev * sqrt(2 * PI)); >+ } >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/WebKit/EncoderUtilities.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/WebKit/EncoderUtilities.h >+new file mode 100644 >+index 00000000000..5b9bdfda114 >+--- /dev/null >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/WebKit/EncoderUtilities.h >+@@ -0,0 +1,50 @@ >++/* >++ * Copyright (C) 2018 Apple Inc. All rights reserved. >++ * >++ * Redistribution and use in source and binary forms, with or without >++ * modification, are permitted provided that the following conditions >++ * are met: >++ * 1. Redistributions of source code must retain the above copyright >++ * notice, this list of conditions and the following disclaimer. >++ * 2.
Redistributions in binary form must reproduce the above copyright >++ * notice, this list of conditions and the following disclaimer in the >++ * documentation and/or other materials provided with the distribution. >++ * >++ * THIS SOFTWARE IS PROVIDED BY APPLE INC. AND ITS CONTRIBUTORS ``AS IS'' >++ * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, >++ * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR >++ * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS >++ * BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR >++ * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF >++ * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS >++ * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN >++ * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) >++ * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF >++ * THE POSSIBILITY OF SUCH DAMAGE. 
>++ */ >++ >++#pragma once >++ >++#include "VideoProcessingSoftLink.h" >++ >++#if ENABLE_VCP_ENCODER >++ >++#define CompressionSessionRef VCPCompressionSessionRef >++#define CompressionSessionSetProperty webrtc::VCPCompressionSessionSetProperty >++#define CompressionSessionGetPixelBufferPool webrtc::VCPCompressionSessionGetPixelBufferPool >++#define CompressionSessionEncodeFrame webrtc::VCPCompressionSessionEncodeFrame >++#define CompressionSessionCreate webrtc::VCPCompressionSessionCreate >++#define kCodecTypeH264 kVCPCodecType4CC_H264 >++#define CompressionSessionInvalidate webrtc::VCPCompressionSessionInvalidate >++ >++#else >++ >++#define CompressionSessionRef VTCompressionSessionRef >++#define CompressionSessionSetProperty VTSessionSetProperty >++#define CompressionSessionGetPixelBufferPool VTCompressionSessionGetPixelBufferPool >++#define CompressionSessionEncodeFrame VTCompressionSessionEncodeFrame >++#define CompressionSessionCreate VTCompressionSessionCreate >++#define kCodecTypeH264 kCMVideoCodecType_H264 >++#define CompressionSessionInvalidate VTCompressionSessionInvalidate >++ >++#endif >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/WebKit/VideoProcessingSoftLink.cpp b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/WebKit/VideoProcessingSoftLink.cpp >+new file mode 100644 >+index 00000000000..101e87c6677 >+--- /dev/null >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/WebKit/VideoProcessingSoftLink.cpp >+@@ -0,0 +1,78 @@ >++/* >++ * Copyright (C) 2018 Apple Inc. All rights reserved. >++ * >++ * Redistribution and use in source and binary forms, with or without >++ * modification, are permitted provided that the following conditions >++ * are met: >++ * 1. Redistributions of source code must retain the above copyright >++ * notice, this list of conditions and the following disclaimer. >++ * 2. 
Redistributions in binary form must reproduce the above copyright >++ * notice, this list of conditions and the following disclaimer in the >++ * documentation and/or other materials provided with the distribution. >++ * >++ * THIS SOFTWARE IS PROVIDED BY APPLE INC. AND ITS CONTRIBUTORS ``AS IS'' >++ * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, >++ * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR >++ * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS >++ * BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR >++ * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF >++ * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS >++ * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN >++ * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) >++ * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF >++ * THE POSSIBILITY OF SUCH DAMAGE. 
>++ */ >++ >++#include "VideoProcessingSoftLink.h" >++ >++#if ENABLE_VCP_ENCODER >++ >++#include "rtc_base/logging.h" >++#import <dlfcn.h> >++#import <objc/runtime.h> >++ >++// Macros copied from <wtf/cocoa/SoftLinking.h> >++#define SOFT_LINK_PRIVATE_FRAMEWORK_FOR_SOURCE(functionNamespace, framework) \ >++ namespace functionNamespace { \ >++ void* framework##Library(bool isOptional) \ >++ { \ >++ static void* frameworkLibrary; \ >++ static dispatch_once_t once; \ >++ dispatch_once(&once, ^{ \ >++ frameworkLibrary = dlopen("/System/Library/PrivateFrameworks/" #framework ".framework/" #framework, RTLD_NOW); \ >++ if (!isOptional && !frameworkLibrary) \ >++ RTC_LOG(LS_ERROR) << "Cannot open framework: " << dlerror(); \ >++ }); \ >++ return frameworkLibrary; \ >++ } \ >++ } >++ >++#define SOFT_LINK_FUNCTION_FOR_SOURCE(functionNamespace, framework, functionName, resultType, parameterDeclarations, parameterNames) \ >++ extern "C" { \ >++ resultType functionName parameterDeclarations; \ >++ } \ >++ namespace functionNamespace { \ >++ static resultType init##framework##functionName parameterDeclarations; \ >++ resultType (*softLink##framework##functionName) parameterDeclarations = init##framework##functionName; \ >++ static resultType init##framework##functionName parameterDeclarations \ >++ { \ >++ static dispatch_once_t once; \ >++ dispatch_once(&once, ^{ \ >++ softLink##framework##functionName = (resultType (*) parameterDeclarations) dlsym(framework##Library(), #functionName); \ >++ if (!softLink##framework##functionName) \ >++ RTC_LOG(LS_ERROR) << "Cannot find function ##functionName: " << dlerror(); \ >++ }); \ >++ return softLink##framework##functionName parameterNames; \ >++ } \ >++} >++ >++SOFT_LINK_PRIVATE_FRAMEWORK_FOR_SOURCE(webrtc, VideoProcessing) >++ >++SOFT_LINK_FUNCTION_FOR_SOURCE(webrtc, VideoProcessing, VCPCompressionSessionSetProperty, OSStatus, (VCPCompressionSessionRef session, CFStringRef key, CFTypeRef value), (session, key, value)) 
>++SOFT_LINK_FUNCTION_FOR_SOURCE(webrtc, VideoProcessing, VCPCompressionSessionGetPixelBufferPool, CVPixelBufferPoolRef, (VCPCompressionSessionRef session), (session)) >++SOFT_LINK_FUNCTION_FOR_SOURCE(webrtc, VideoProcessing, VCPCompressionSessionEncodeFrame, OSStatus, (VCPCompressionSessionRef session, CVImageBufferRef buffer, CMTime timestamp, CMTime time, CFDictionaryRef dictionary, void* data, VTEncodeInfoFlags* flags), (session, buffer, timestamp, time, dictionary, data, flags)) >++SOFT_LINK_FUNCTION_FOR_SOURCE(webrtc, VideoProcessing, VCPCompressionSessionCreate, OSStatus, (CFAllocatorRef allocator1, int32_t value1 , int32_t value2, CMVideoCodecType type, CFDictionaryRef dictionary1, CFDictionaryRef dictionary2, CFAllocatorRef allocator3, VTCompressionOutputCallback callback, void* data, VCPCompressionSessionRef* session), (allocator1, value1, value2, type, dictionary1, dictionary2, allocator3, callback, data, session)) >++SOFT_LINK_FUNCTION_FOR_SOURCE(webrtc, VideoProcessing, VCPCompressionSessionInvalidate, void, (VCPCompressionSessionRef session), (session)) >++SOFT_LINK_FUNCTION_FOR_SOURCE(webrtc, VideoProcessing, VPModuleInitialize, void, (), ()) >++ >++#endif // ENABLE_VCP_ENCODER >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/WebKit/VideoProcessingSoftLink.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/WebKit/VideoProcessingSoftLink.h >+new file mode 100644 >+index 00000000000..cf866d8a8db >+--- /dev/null >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/WebKit/VideoProcessingSoftLink.h >+@@ -0,0 +1,140 @@ >++/* >++ * Copyright (C) 2018 Apple Inc. All rights reserved. >++ * >++ * Redistribution and use in source and binary forms, with or without >++ * modification, are permitted provided that the following conditions >++ * are met: >++ * 1. Redistributions of source code must retain the above copyright >++ * notice, this list of conditions and the following disclaimer. >++ * 2. 
Redistributions in binary form must reproduce the above copyright >++ * notice, this list of conditions and the following disclaimer in the >++ * documentation and/or other materials provided with the distribution. >++ * >++ * THIS SOFTWARE IS PROVIDED BY APPLE INC. AND ITS CONTRIBUTORS ``AS IS'' >++ * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, >++ * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR >++ * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS >++ * BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR >++ * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF >++ * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS >++ * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN >++ * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) >++ * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF >++ * THE POSSIBILITY OF SUCH DAMAGE. 
>++ */ >++ >++#pragma once >++ >++#ifdef __APPLE__ >++#include <Availability.h> >++#include <AvailabilityMacros.h> >++#include <TargetConditionals.h> >++ >++// Macro taken from WTF/wtf/Platform.h >++#if defined __has_include && __has_include(<CoreFoundation/CFPriv.h>) >++ >++#if (defined(TARGET_IPHONE_SIMULATOR) && TARGET_IPHONE_SIMULATOR) >++#define ENABLE_VCP_ENCODER 0 >++#elif (defined(TARGET_OS_IPHONE) && TARGET_OS_IPHONE) >++#define ENABLE_VCP_ENCODER 1 >++#elif (defined(TARGET_OS_MAC) && TARGET_OS_MAC) >++#define ENABLE_VCP_ENCODER (__MAC_OS_X_VERSION_MIN_REQUIRED >= 101300 && __MAC_OS_X_VERSION_MAX_ALLOWED >= 101304) >++#endif >++ >++#endif >++ >++#if !defined(ENABLE_VCP_ENCODER) >++#define ENABLE_VCP_ENCODER 0 >++#endif >++ >++#if !defined(ALWAYS_INLINE) >++#define ALWAYS_INLINE inline >++#endif >++ >++#ifdef __cplusplus >++#define WTF_EXTERN_C_BEGIN extern "C" { >++#define WTF_EXTERN_C_END } >++#else >++#define WTF_EXTERN_C_BEGIN >++#define WTF_EXTERN_C_END >++#endif >++ >++// Macros copied from <wtf/cocoa/SoftLinking.h> >++#define SOFT_LINK_FRAMEWORK_FOR_HEADER(functionNamespace, framework) \ >++ namespace functionNamespace { \ >++ extern void* framework##Library(bool isOptional = false); \ >++ bool is##framework##FrameworkAvailable(); \ >++ inline bool is##framework##FrameworkAvailable() { \ >++ return framework##Library(true) != nullptr; \ >++ } \ >++ } >++ >++#define SOFT_LINK_FUNCTION_FOR_HEADER(functionNamespace, framework, functionName, resultType, parameterDeclarations, parameterNames) \ >++ WTF_EXTERN_C_BEGIN \ >++ resultType functionName parameterDeclarations; \ >++ WTF_EXTERN_C_END \ >++ namespace functionNamespace { \ >++ extern resultType (*softLink##framework##functionName) parameterDeclarations; \ >++ inline resultType softLink_##framework##_##functionName parameterDeclarations \ >++ { \ >++ return softLink##framework##functionName parameterNames; \ >++ } \ >++ } \ >++ ALWAYS_INLINE resultType functionName parameterDeclarations \ >++ {\ >++ 
return functionNamespace::softLink##framework##functionName parameterNames; \ >++ } >++ >++#define SOFT_LINK_FRAMEWORK_OPTIONAL(framework) \ >++ static void* framework##Library() \ >++ { \ >++ static void* frameworkLibrary = dlopen("/System/Library/Frameworks/" #framework ".framework/" #framework, RTLD_NOW); \ >++ return frameworkLibrary; \ >++ } >++ >++#define SOFT_LINK_POINTER_OPTIONAL(framework, name, type) \ >++ static type init##name(); \ >++ static type (*get##name)() = init##name; \ >++ static type pointer##name; \ >++ \ >++ static type name##Function() \ >++ { \ >++ return pointer##name; \ >++ } \ >++ \ >++ static type init##name() \ >++ { \ >++ void** pointer = static_cast<void**>(dlsym(framework##Library(), #name)); \ >++ if (pointer) \ >++ pointer##name = (__bridge type)(*pointer); \ >++ get##name = name##Function; \ >++ return pointer##name; \ >++ } >++ >++#if ENABLE_VCP_ENCODER >++ >++#include <VideoProcessing/VideoProcessing.h> >++ >++SOFT_LINK_FRAMEWORK_FOR_HEADER(webrtc, VideoProcessing) >++ >++SOFT_LINK_FUNCTION_FOR_HEADER(webrtc, VideoProcessing, VCPCompressionSessionSetProperty, OSStatus, (VCPCompressionSessionRef session, CFStringRef key, CFTypeRef value), (session, key, value)) >++#define VCPCompressionSessionSetProperty softLink_VideoProcessing_VCPCompressionSessionSetProperty >++ >++SOFT_LINK_FUNCTION_FOR_HEADER(webrtc, VideoProcessing, VCPCompressionSessionGetPixelBufferPool, CVPixelBufferPoolRef, (VCPCompressionSessionRef session), (session)) >++#define VCPCompressionSessionGetPixelBufferPool softLink_VideoProcessing_VCPCompressionSessionGetPixelBufferPool >++ >++SOFT_LINK_FUNCTION_FOR_HEADER(webrtc, VideoProcessing, VCPCompressionSessionEncodeFrame, OSStatus, (VCPCompressionSessionRef session, CVImageBufferRef buffer, CMTime timestamp, CMTime time, CFDictionaryRef dictionary, void * data, VTEncodeInfoFlags * flags), (session, buffer, timestamp, time, dictionary, data, flags)) >++#define VCPCompressionSessionEncodeFrame 
softLink_VideoProcessing_VCPCompressionSessionEncodeFrame >++ >++SOFT_LINK_FUNCTION_FOR_HEADER(webrtc, VideoProcessing, VCPCompressionSessionCreate, OSStatus, (CFAllocatorRef allocator1, int32_t value1 , int32_t value2, CMVideoCodecType type, CFDictionaryRef dictionary1, CFDictionaryRef dictionary2, CFAllocatorRef allocator3, VTCompressionOutputCallback callback, void * data, VCPCompressionSessionRef *session), (allocator1, value1, value2, type, dictionary1, dictionary2, allocator3, callback, data, session)) >++#define VCPCompressionSessionCreate softLink_VideoProcessing_VCPCompressionSessionCreate >++ >++SOFT_LINK_FUNCTION_FOR_HEADER(webrtc, VideoProcessing, VCPCompressionSessionInvalidate, void, (VCPCompressionSessionRef session), (session)) >++#define VCPCompressionSessionInvalidate softLink_VideoProcessing_VCPCompressionSessionInvalidate >++ >++SOFT_LINK_FUNCTION_FOR_HEADER(webrtc, VideoProcessing, VPModuleInitialize, void, (), ()) >++#define VPModuleInitialize softLink_VideoProcessing_VPModuleInitialize >++ >++#endif // ENABLE_VCP_ENCODER >++ >++#endif // __APPLE__ >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/WebKit/WebKitUtilities.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/WebKit/WebKitUtilities.h >+new file mode 100644 >+index 00000000000..ef1456eddd3 >+--- /dev/null >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/WebKit/WebKitUtilities.h >+@@ -0,0 +1,54 @@ >++/* >++ * Copyright (C) 2018 Apple Inc. All rights reserved. >++ * >++ * Redistribution and use in source and binary forms, with or without >++ * modification, are permitted provided that the following conditions >++ * are met: >++ * 1. Redistributions of source code must retain the above copyright >++ * notice, this list of conditions and the following disclaimer. >++ * 2. 
Redistributions in binary form must reproduce the above copyright >++ * notice, this list of conditions and the following disclaimer in the >++ * documentation and/or other materials provided with the distribution. >++ * >++ * THIS SOFTWARE IS PROVIDED BY APPLE INC. AND ITS CONTRIBUTORS ``AS IS'' >++ * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, >++ * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR >++ * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS >++ * BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR >++ * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF >++ * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS >++ * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN >++ * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) >++ * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF >++ * THE POSSIBILITY OF SUCH DAMAGE. 
>++ */ >++ >++#pragma once >++ >++#include "api/video/video_frame_buffer.h" >++#include "rtc_base/scoped_ref_ptr.h" >++#include "webrtc/media/engine/webrtcvideodecoderfactory.h" >++#include "webrtc/media/engine/webrtcvideoencoderfactory.h" >++ >++typedef struct __CVBuffer* CVPixelBufferRef; >++ >++namespace webrtc { >++ >++class VideoDecoderFactory; >++class VideoEncoderFactory; >++class VideoFrame; >++ >++enum class WebKitCodecSupport { H264, H264AndVP8 }; >++ >++std::unique_ptr<webrtc::VideoEncoderFactory> createWebKitEncoderFactory(WebKitCodecSupport); >++std::unique_ptr<webrtc::VideoDecoderFactory> createWebKitDecoderFactory(WebKitCodecSupport); >++ >++void setApplicationStatus(bool isActive); >++ >++void setH264HardwareEncoderAllowed(bool); >++bool isH264HardwareEncoderAllowed(); >++ >++CVPixelBufferRef pixelBufferFromFrame(const VideoFrame&, const std::function<CVPixelBufferRef(size_t, size_t)>&); >++rtc::scoped_refptr<webrtc::VideoFrameBuffer> pixelBufferToFrame(CVPixelBufferRef); >++ >++} >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/WebKit/WebKitUtilities.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/WebKit/WebKitUtilities.mm >+new file mode 100644 >+index 00000000000..65eefa36f2d >+--- /dev/null >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/WebKit/WebKitUtilities.mm >+@@ -0,0 +1,185 @@ >++/* >++ * Copyright (C) 2018 Apple Inc. All rights reserved. >++ * >++ * Redistribution and use in source and binary forms, with or without >++ * modification, are permitted provided that the following conditions >++ * are met: >++ * 1. Redistributions of source code must retain the above copyright >++ * notice, this list of conditions and the following disclaimer. >++ * 2. Redistributions in binary form must reproduce the above copyright >++ * notice, this list of conditions and the following disclaimer in the >++ * documentation and/or other materials provided with the distribution. >++ * >++ * THIS SOFTWARE IS PROVIDED BY APPLE INC. 
AND ITS CONTRIBUTORS ``AS IS'' >++ * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, >++ * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR >++ * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS >++ * BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR >++ * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF >++ * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS >++ * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN >++ * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) >++ * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF >++ * THE POSSIBILITY OF SUCH DAMAGE. >++ */ >++ >++#include "WebKitUtilities.h" >++ >++//#include "Common/RTCUIApplicationStatusObserver.h" >++#import "WebRTC/RTCVideoCodecH264.h" >++ >++#include "api/video/video_frame.h" >++#include "native/src/objc_frame_buffer.h" >++#include "third_party/libyuv/include/libyuv/convert_from.h" >++#include "webrtc/rtc_base/checks.h" >++#include "Framework/Headers/WebRTC/RTCVideoCodecFactory.h" >++#include "Framework/Headers/WebRTC/RTCVideoFrame.h" >++#include "Framework/Headers/WebRTC/RTCVideoFrameBuffer.h" >++#include "Framework/Native/api/video_decoder_factory.h" >++#include "Framework/Native/api/video_encoder_factory.h" >++/* >++#if !defined(WEBRTC_IOS) >++__attribute__((objc_runtime_name("WK_RTCUIApplicationStatusObserver"))) >++@interface RTCUIApplicationStatusObserver : NSObject >++ >+++ (instancetype)sharedInstance; >+++ (void)prepareForUse; >++ >++- (BOOL)isApplicationActive; >++ >++@end >++#endif >++ >++@implementation RTCUIApplicationStatusObserver { >++ BOOL _isActive; >++} >++ >+++ (instancetype)sharedInstance { >++ static id sharedInstance; >++ static dispatch_once_t onceToken; >++ dispatch_once(&onceToken, ^{ >++ sharedInstance = [[self alloc] init]; >++ }); >++ >++ return sharedInstance; >++} >++ >+++ 
(void)prepareForUse { >++ __unused RTCUIApplicationStatusObserver *observer = [self sharedInstance]; >++} >++ >++- (id)init { >++ _isActive = YES; >++ return self; >++} >++ >++- (void)setActive { >++ _isActive = YES; >++} >++ >++- (void)setInactive { >++ _isActive = NO; >++} >++ >++- (BOOL)isApplicationActive { >++ return _isActive; >++} >++ >++@end >++*/ >++namespace webrtc { >++ >++void setApplicationStatus(bool isActive) >++{ >++/* >++ if (isActive) >++ [[RTCUIApplicationStatusObserver sharedInstance] setActive]; >++ else >++ [[RTCUIApplicationStatusObserver sharedInstance] setInactive]; >++ */ >++} >++ >++std::unique_ptr<webrtc::VideoEncoderFactory> createWebKitEncoderFactory(WebKitCodecSupport codecSupport) >++{ >++#if ENABLE_VCP_ENCODER >++ static std::once_flag onceFlag; >++ std::call_once(onceFlag, [] { >++ webrtc::VPModuleInitialize(); >++ }); >++#endif >++ return ObjCToNativeVideoEncoderFactory(codecSupport == WebKitCodecSupport::H264AndVP8 ? [[RTCDefaultVideoEncoderFactory alloc] init] : [[RTCVideoEncoderFactoryH264 alloc] init]); >++} >++ >++std::unique_ptr<webrtc::VideoDecoderFactory> createWebKitDecoderFactory(WebKitCodecSupport codecSupport) >++{ >++ return ObjCToNativeVideoDecoderFactory(codecSupport == WebKitCodecSupport::H264AndVP8 ? 
[[RTCDefaultVideoDecoderFactory alloc] init] : [[RTCVideoDecoderFactoryH264 alloc] init]); >++} >++ >++static bool h264HardwareEncoderAllowed = true; >++void setH264HardwareEncoderAllowed(bool allowed) >++{ >++ h264HardwareEncoderAllowed = allowed; >++} >++ >++bool isH264HardwareEncoderAllowed() >++{ >++ return h264HardwareEncoderAllowed; >++} >++ >++rtc::scoped_refptr<webrtc::VideoFrameBuffer> pixelBufferToFrame(CVPixelBufferRef pixelBuffer) >++{ >++ RTCCVPixelBuffer *frameBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer]; >++ return new rtc::RefCountedObject<ObjCFrameBuffer>(frameBuffer); >++} >++ >++static bool CopyVideoFrameToPixelBuffer(const rtc::scoped_refptr<webrtc::I420BufferInterface>& frame, CVPixelBufferRef pixel_buffer) { >++ RTC_DCHECK(pixel_buffer); >++ RTC_DCHECK(CVPixelBufferGetPixelFormatType(pixel_buffer) == kCVPixelFormatType_420YpCbCr8BiPlanarFullRange || CVPixelBufferGetPixelFormatType(pixel_buffer) == kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange); >++ RTC_DCHECK_EQ(CVPixelBufferGetHeightOfPlane(pixel_buffer, 0), static_cast<size_t>(frame->height())); >++ RTC_DCHECK_EQ(CVPixelBufferGetWidthOfPlane(pixel_buffer, 0), static_cast<size_t>(frame->width())); >++ >++ if (CVPixelBufferLockBaseAddress(pixel_buffer, 0) != kCVReturnSuccess) >++ return false; >++ >++ uint8_t* dst_y = reinterpret_cast<uint8_t*>(CVPixelBufferGetBaseAddressOfPlane(pixel_buffer, 0)); >++ int dst_stride_y = CVPixelBufferGetBytesPerRowOfPlane(pixel_buffer, 0); >++ >++ uint8_t* dst_uv = reinterpret_cast<uint8_t*>(CVPixelBufferGetBaseAddressOfPlane(pixel_buffer, 1)); >++ int dst_stride_uv = CVPixelBufferGetBytesPerRowOfPlane(pixel_buffer, 1); >++ >++ int result = libyuv::I420ToNV12( >++ frame->DataY(), frame->StrideY(), >++ frame->DataU(), frame->StrideU(), >++ frame->DataV(), frame->StrideV(), >++ dst_y, dst_stride_y, dst_uv, dst_stride_uv, >++ frame->width(), frame->height()); >++ >++ CVPixelBufferUnlockBaseAddress(pixel_buffer, 0); >++ >++ if (result) >++ 
return false; >++ >++ return true; >++} >++ >++ >++CVPixelBufferRef pixelBufferFromFrame(const VideoFrame& frame, const std::function<CVPixelBufferRef(size_t, size_t)>& makePixelBuffer) >++{ >++ if (frame.video_frame_buffer()->type() != VideoFrameBuffer::Type::kNative) { >++ rtc::scoped_refptr<const I420BufferInterface> buffer = frame.video_frame_buffer()->GetI420(); >++ >++ auto pixelBuffer = makePixelBuffer(buffer->width(), buffer->height()); >++ if (pixelBuffer) >++ CopyVideoFrameToPixelBuffer(frame.video_frame_buffer()->GetI420(), pixelBuffer); >++ return pixelBuffer; >++ } >++ >++ auto *frameBuffer = static_cast<ObjCFrameBuffer*>(frame.video_frame_buffer().get())->wrapped_frame_buffer(); >++ if (![frameBuffer isKindOfClass:[RTCCVPixelBuffer class]]) >++ return nullptr; >++ >++ auto *rtcPixelBuffer = (RTCCVPixelBuffer *)frameBuffer; >++ return rtcPixelBuffer.pixelBuffer; >++} >++ >++} >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCEncodedImage+Private.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCEncodedImage+Private.mm >+index 6f2d1f46d9c..f9901bd327e 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCEncodedImage+Private.mm >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCEncodedImage+Private.mm >+@@ -56,6 +56,7 @@ - (webrtc::EncodedImage)nativeEncodedImage { >+ encodedImage.rotation_ = webrtc::VideoRotation(self.rotation); >+ encodedImage._completeFrame = self.completeFrame; >+ encodedImage.qp_ = self.qp ? self.qp.intValue : -1; >++ encodedImage.SetSpatialIndex(self.spatialIndex); >+ encodedImage.content_type_ = (self.contentType == RTCVideoContentTypeScreenshare) ? 
>+ webrtc::VideoContentType::SCREENSHARE : >+ webrtc::VideoContentType::UNSPECIFIED; >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCRtpEncodingParameters.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCRtpEncodingParameters.h >+index ba50bde649f..dabff725abb 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCRtpEncodingParameters.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCRtpEncodingParameters.h >+@@ -15,6 +15,7 @@ >+ NS_ASSUME_NONNULL_BEGIN >+ >+ RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCRtpEncodingParameters"))) >+ @interface RTCRtpEncodingParameters : NSObject >+ >+ /** Controls whether the encoding is currently transmitted. */ >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCVideoEncoderSettings+Private.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCVideoEncoderSettings+Private.h >+index 5b062455bc0..72af732b23b 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCVideoEncoderSettings+Private.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCVideoEncoderSettings+Private.h >+@@ -22,4 +22,11 @@ NS_ASSUME_NONNULL_BEGIN >+ >+ @end >+ >++@interface RTCVideoBitrateAllocation (Private) >++ >++- (instancetype)initWithNativeVideoBitrateAllocation:(const webrtc::VideoBitrateAllocation *__nullable)videoBitrateAllocation; >++- (webrtc::VideoBitrateAllocation)nativeVideoBitrateAllocation; >++ >++@end >++ >+ NS_ASSUME_NONNULL_END >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCVideoEncoderSettings+Private.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCVideoEncoderSettings+Private.mm >+index 6fb81dbb8af..82b5ba7bc0f 100644 >+--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCVideoEncoderSettings+Private.mm >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/peerconnection/RTCVideoEncoderSettings+Private.mm >+@@ -12,6 +12,23 @@ >+ >+ #import "helpers/NSString+StdString.h" >+ >++@implementation RTCVideoEncoderSettings { >++ webrtc::VideoCodec _nativeVideoCodec; >++} >++ >++@synthesize name = _name; >++@synthesize width = _width; >++@synthesize height = _height; >++@synthesize startBitrate = _startBitrate; >++@synthesize maxBitrate = _maxBitrate; >++@synthesize minBitrate = _minBitrate; >++@synthesize targetBitrate = _targetBitrate; >++@synthesize maxFramerate = _maxFramerate; >++@synthesize qpMax = _qpMax; >++@synthesize mode = _mode; >++ >++@end >++ >+ @implementation RTCVideoEncoderSettings (Private) >+ >+ - (instancetype)initWithNativeVideoCodec:(const webrtc::VideoCodec *)videoCodec { >+@@ -20,6 +37,7 @@ - (instancetype)initWithNativeVideoCodec:(const webrtc::VideoCodec *)videoCodec >+ const char *codecName = CodecTypeToPayloadString(videoCodec->codecType); >+ self.name = [NSString stringWithUTF8String:codecName]; >+ >++ _nativeVideoCodec = *videoCodec; >+ self.width = videoCodec->width; >+ self.height = videoCodec->height; >+ self.startBitrate = videoCodec->startBitrate; >+@@ -36,18 +54,28 @@ - (instancetype)initWithNativeVideoCodec:(const webrtc::VideoCodec *)videoCodec >+ } >+ >+ - (webrtc::VideoCodec)nativeVideoCodec { >+- webrtc::VideoCodec videoCodec; >+- videoCodec.width = self.width; >+- videoCodec.height = self.height; >+- videoCodec.startBitrate = self.startBitrate; >+- videoCodec.maxBitrate = self.maxBitrate; >+- videoCodec.minBitrate = self.minBitrate; >+- videoCodec.targetBitrate = self.targetBitrate; >+- videoCodec.maxBitrate = self.maxBitrate; >+- videoCodec.qpMax = self.qpMax; >+- videoCodec.mode = (webrtc::VideoCodecMode)self.mode; >+- >+- return videoCodec; >++ return _nativeVideoCodec; >++} >++ >++@end >++ >++@implementation 
RTCVideoBitrateAllocation { >++ webrtc::VideoBitrateAllocation _nativeVideoBitrateAllocation; >++} >++ >++@end >++ >++@implementation RTCVideoBitrateAllocation (Private) >++ >++- (instancetype)initWithNativeVideoBitrateAllocation:(const webrtc::VideoBitrateAllocation *)videoBitrateAllocation { >++ if (self = [super init]) { >++ _nativeVideoBitrateAllocation = *videoBitrateAllocation; >++ } >++ return self; >++} >++ >++- (webrtc::VideoBitrateAllocation)nativeVideoBitrateAllocation { >++ return _nativeVideoBitrateAllocation; >+ } >+ >+ @end >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_codec/RTCVideoCodecConstants.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_codec/RTCVideoCodecConstants.mm >+index acbf126170c..1447d3b27e1 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_codec/RTCVideoCodecConstants.mm >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_codec/RTCVideoCodecConstants.mm >+@@ -13,5 +13,5 @@ >+ >+ #include "media/base/mediaconstants.h" >+ >+-NSString *const kRTCVideoCodecVp8Name = @(cricket::kVp8CodecName); >+-NSString *const kRTCVideoCodecVp9Name = @(cricket::kVp9CodecName); >++NSString *const kRTCVideoCodecVp8Name = @"VP8"; >++NSString *const kRTCVideoCodecVp9Name = @"VP9"; >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_codec/RTCVideoEncoderVP8.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_codec/RTCVideoEncoderVP8.h >+index 8d87a898931..08d18504222 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_codec/RTCVideoEncoderVP8.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_codec/RTCVideoEncoderVP8.h >+@@ -14,6 +14,7 @@ >+ #import "RTCVideoEncoder.h" >+ >+ RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCVideoEncoderVP8"))) >+ @interface RTCVideoEncoderVP8 : NSObject >+ >+ /* This returns a VP8 encoder that can be returned from a RTCVideoEncoderFactory 
injected into >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_codec/RTCWrappedNativeVideoEncoder.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_codec/RTCWrappedNativeVideoEncoder.mm >+index 9afd54f55f5..b9634779f1f 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_codec/RTCWrappedNativeVideoEncoder.mm >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_codec/RTCWrappedNativeVideoEncoder.mm >+@@ -58,6 +58,11 @@ - (int)setBitrate:(uint32_t)bitrateKbit framerate:(uint32_t)framerate { >+ return 0; >+ } >+ >++- (int)setRateAllocation:(nonnull RTCVideoBitrateAllocation *)allocation framerate:(uint32_t)framerate { >++ RTC_NOTREACHED(); >++ return 0; >++} >++ >+ - (NSString *)implementationName { >+ RTC_NOTREACHED(); >+ return nil; >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_frame_buffer/RTCNativeI420Buffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_frame_buffer/RTCNativeI420Buffer.h >+index 9a904f5396a..dc903bc4bca 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_frame_buffer/RTCNativeI420Buffer.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_frame_buffer/RTCNativeI420Buffer.h >+@@ -17,6 +17,7 @@ NS_ASSUME_NONNULL_BEGIN >+ >+ /** RTCI420Buffer implements the RTCI420Buffer protocol */ >+ RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCI420Buffer"))) >+ @interface RTCI420Buffer : NSObject<RTCI420Buffer> >+ @end >+ >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_frame_buffer/RTCNativeMutableI420Buffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_frame_buffer/RTCNativeMutableI420Buffer.h >+index 6cd5110460b..7cfa950abb6 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_frame_buffer/RTCNativeMutableI420Buffer.h >++++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/api/video_frame_buffer/RTCNativeMutableI420Buffer.h >+@@ -18,6 +18,7 @@ NS_ASSUME_NONNULL_BEGIN >+ >+ /** Mutable version of RTCI420Buffer */ >+ RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCMutableI420Buffer"))) >+ @interface RTCMutableI420Buffer : RTCI420Buffer<RTCMutableI420Buffer> >+ @end >+ >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCCodecSpecificInfo.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCCodecSpecificInfo.h >+index e2ae4cafa11..9dc44c12af8 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCCodecSpecificInfo.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCCodecSpecificInfo.h >+@@ -18,6 +18,7 @@ NS_ASSUME_NONNULL_BEGIN >+ * Corresponds to webrtc::CodecSpecificInfo. >+ */ >+ RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCCodecSpecificInfo"))) >+ @protocol RTCCodecSpecificInfo <NSObject> >+ @end >+ >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCEncodedImage.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCEncodedImage.h >+index 670c7276ff7..7994ce64a1f 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCEncodedImage.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCEncodedImage.h >+@@ -31,6 +31,7 @@ typedef NS_ENUM(NSUInteger, RTCVideoContentType) { >+ >+ /** Represents an encoded frame. Corresponds to webrtc::EncodedImage. 
*/ >+ RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCEncodedImage"))) >+ @interface RTCEncodedImage : NSObject >+ >+ @property(nonatomic, strong) NSData *buffer; >+@@ -47,6 +48,7 @@ RTC_OBJC_EXPORT >+ @property(nonatomic, assign) BOOL completeFrame; >+ @property(nonatomic, strong) NSNumber *qp; >+ @property(nonatomic, assign) RTCVideoContentType contentType; >++@property(nonatomic, assign) int spatialIndex; >+ >+ @end >+ >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCEncodedImage.m b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCEncodedImage.m >+index 024a57c541b..85cd5e298e4 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCEncodedImage.m >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCEncodedImage.m >+@@ -26,5 +26,6 @@ @implementation RTCEncodedImage >+ @synthesize completeFrame = _completeFrame; >+ @synthesize qp = _qp; >+ @synthesize contentType = _contentType; >++@synthesize spatialIndex = _spatialIndex; >+ >+ @end >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCI420Buffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCI420Buffer.h >+index a6c7e41bcba..cfa14f8af23 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCI420Buffer.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCI420Buffer.h >+@@ -16,6 +16,7 @@ NS_ASSUME_NONNULL_BEGIN >+ >+ /** Protocol for RTCYUVPlanarBuffers containing I420 data */ >+ RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCI420Buffer"))) >+ @protocol RTCI420Buffer <RTCYUVPlanarBuffer> >+ @end >+ >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCRtpFragmentationHeader.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCRtpFragmentationHeader.h >+index 2e26b08b8af..fbbc68a8ef6 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCRtpFragmentationHeader.h >++++ 
b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCRtpFragmentationHeader.h >+@@ -16,6 +16,7 @@ NS_ASSUME_NONNULL_BEGIN >+ >+ /** Information for header. Corresponds to webrtc::RTPFragmentationHeader. */ >+ RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCRtpFragmentationHeader"))) >+ @interface RTCRtpFragmentationHeader : NSObject >+ >+ @property(nonatomic, strong) NSArray<NSNumber *> *fragmentationOffset; >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCRtpFragmentationHeader.m b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCRtpFragmentationHeader.m >+index 8049abc411e..6cf21ddc2a1 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCRtpFragmentationHeader.m >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCRtpFragmentationHeader.m >+@@ -17,4 +17,5 @@ @implementation RTCRtpFragmentationHeader >+ @synthesize fragmentationTimeDiff = _fragmentationTimeDiff; >+ @synthesize fragmentationPlType = _fragmentationPlType; >+ >+-@end >+\ No newline at end of file >++@end >++ >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoCodecInfo.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoCodecInfo.h >+index 2162caaa21f..4e643d94347 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoCodecInfo.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoCodecInfo.h >+@@ -16,6 +16,7 @@ NS_ASSUME_NONNULL_BEGIN >+ >+ /** Holds information to identify a codec. Corresponds to webrtc::SdpVideoFormat. 
*/ >+ RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCVideoCodecInfo"))) >+ @interface RTCVideoCodecInfo : NSObject <NSCoding> >+ >+ - (instancetype)init NS_UNAVAILABLE; >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoDecoder.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoDecoder.h >+index 18c6f6b0006..534420d3955 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoDecoder.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoDecoder.h >+@@ -23,6 +23,7 @@ typedef void (^RTCVideoDecoderCallback)(RTCVideoFrame *frame); >+ >+ /** Protocol for decoder implementations. */ >+ RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCVideoDecoder"))) >+ @protocol RTCVideoDecoder <NSObject> >+ >+ - (void)setCallback:(RTCVideoDecoderCallback)callback; >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoDecoderFactory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoDecoderFactory.h >+index 3e24153b82c..cb95fa3f134 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoDecoderFactory.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoDecoderFactory.h >+@@ -18,6 +18,7 @@ NS_ASSUME_NONNULL_BEGIN >+ >+ /** RTCVideoDecoderFactory is an Objective-C version of webrtc::VideoDecoderFactory. 
*/ >+ RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCVideoDecoderFactory"))) >+ @protocol RTCVideoDecoderFactory <NSObject> >+ >+ - (nullable id<RTCVideoDecoder>)createDecoder:(RTCVideoCodecInfo *)info; >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoder.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoder.h >+index c5257674d83..5291bb1342b 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoder.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoder.h >+@@ -19,7 +19,11 @@ >+ #import "RTCVideoFrame.h" >+ >+ NS_ASSUME_NONNULL_BEGIN >+- >++/* >++namespace webrtc { >++class VideoBitrateAllocation; >++}; >++*/ >+ /** Callback block for encoder. */ >+ typedef BOOL (^RTCVideoEncoderCallback)(RTCEncodedImage *frame, >+ id<RTCCodecSpecificInfo> info, >+@@ -27,6 +31,7 @@ typedef BOOL (^RTCVideoEncoderCallback)(RTCEncodedImage *frame, >+ >+ /** Protocol for encoder implementations. */ >+ RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCVideoEncoder"))) >+ @protocol RTCVideoEncoder <NSObject> >+ >+ - (void)setCallback:(RTCVideoEncoderCallback)callback; >+@@ -37,6 +42,7 @@ RTC_OBJC_EXPORT >+ codecSpecificInfo:(nullable id<RTCCodecSpecificInfo>)info >+ frameTypes:(NSArray<NSNumber *> *)frameTypes; >+ - (int)setBitrate:(uint32_t)bitrateKbit framerate:(uint32_t)framerate; >++- (int)setRateAllocation: (RTCVideoBitrateAllocation *)allocation framerate:(uint32_t)framerate; >+ - (NSString *)implementationName; >+ >+ /** Returns QP scaling settings for encoder. 
The quality scaler adjusts the resolution in order to >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoderFactory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoderFactory.h >+index 20c603d6fe6..cbe9ea492ae 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoderFactory.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoderFactory.h >+@@ -18,6 +18,7 @@ NS_ASSUME_NONNULL_BEGIN >+ >+ /** RTCVideoEncoderFactory is an Objective-C version of webrtc::VideoEncoderFactory. */ >+ RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCVideoEncoderFactory"))) >+ @protocol RTCVideoEncoderFactory <NSObject> >+ >+ - (nullable id<RTCVideoEncoder>)createEncoder:(RTCVideoCodecInfo *)info; >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoderQpThresholds.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoderQpThresholds.h >+index 2b48f45ce0a..68cfd66df47 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoderQpThresholds.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoderQpThresholds.h >+@@ -16,6 +16,7 @@ NS_ASSUME_NONNULL_BEGIN >+ >+ /** QP thresholds for encoder. Corresponds to webrtc::VideoEncoder::QpThresholds. 
*/ >+ RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCVideoEncoderQpThresholds"))) >+ @interface RTCVideoEncoderQpThresholds : NSObject >+ >+ - (instancetype)initWithThresholdsLow:(NSInteger)low high:(NSInteger)high; >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoderSettings.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoderSettings.h >+index 69e04cac70f..ae3bf5fb5f1 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoderSettings.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoEncoderSettings.h >+@@ -21,6 +21,7 @@ typedef NS_ENUM(NSUInteger, RTCVideoCodecMode) { >+ >+ /** Settings for encoder. Corresponds to webrtc::VideoCodec. */ >+ RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCVideoEncoderSettings"))) >+ @interface RTCVideoEncoderSettings : NSObject >+ >+ @property(nonatomic, strong) NSString *name; >+@@ -40,4 +41,9 @@ RTC_OBJC_EXPORT >+ >+ @end >+ >++RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCVideoBitrateAllocation"))) >++@interface RTCVideoBitrateAllocation : NSObject >++@end >++ >+ NS_ASSUME_NONNULL_END >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoFrame.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoFrame.h >+index 9aca7433f34..bde09c50c9d 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoFrame.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoFrame.h >+@@ -26,6 +26,7 @@ typedef NS_ENUM(NSInteger, RTCVideoRotation) { >+ >+ // RTCVideoFrame is an ObjectiveC version of webrtc::VideoFrame. >+ RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCVideoFrame"))) >+ @interface RTCVideoFrame : NSObject >+ >+ /** Width without rotation applied. 
*/ >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoFrameBuffer.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoFrameBuffer.h >+index bb9e6fba631..186ad0889bf 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoFrameBuffer.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/base/RTCVideoFrameBuffer.h >+@@ -18,6 +18,7 @@ NS_ASSUME_NONNULL_BEGIN >+ >+ // RTCVideoFrameBuffer is an ObjectiveC version of webrtc::VideoFrameBuffer. >+ RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCVideoFrameBuffer"))) >+ @protocol RTCVideoFrameBuffer <NSObject> >+ >+ @property(nonatomic, readonly) int width; >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCCodecSpecificInfoH264.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCCodecSpecificInfoH264.h >+index ece9570a13b..e947dc6e623 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCCodecSpecificInfoH264.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCCodecSpecificInfoH264.h >+@@ -10,7 +10,7 @@ >+ >+ #import <Foundation/Foundation.h> >+ >+-#import "RTCCodecSpecificInfo.h" >++#import "base/RTCCodecSpecificInfo.h" >+ #import "RTCMacros.h" >+ >+ /** Class for H264 specific config. 
*/ >+@@ -20,6 +20,7 @@ typedef NS_ENUM(NSUInteger, RTCH264PacketizationMode) { >+ }; >+ >+ RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCCodecSpecificInfoH264"))) >+ @interface RTCCodecSpecificInfoH264 : NSObject <RTCCodecSpecificInfo> >+ >+ @property(nonatomic, assign) RTCH264PacketizationMode packetizationMode; >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCCodecSpecificInfoH264.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCCodecSpecificInfoH264.mm >+index 57f2411e3bb..63e8bd7f42a 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCCodecSpecificInfoH264.mm >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCCodecSpecificInfoH264.mm >+@@ -20,6 +20,7 @@ @synthesize packetizationMode = _packetizationMode; >+ - (webrtc::CodecSpecificInfo)nativeCodecSpecificInfo { >+ webrtc::CodecSpecificInfo codecSpecificInfo; >+ codecSpecificInfo.codecType = webrtc::kVideoCodecH264; >++ codecSpecificInfo.codec_name = [kRTCVideoCodecH264Name cStringUsingEncoding:NSUTF8StringEncoding]; >+ codecSpecificInfo.codecSpecific.H264.packetization_mode = >+ (webrtc::H264PacketizationMode)_packetizationMode; >+ >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCDefaultVideoDecoderFactory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCDefaultVideoDecoderFactory.h >+index 7ca9463a593..d11460300d5 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCDefaultVideoDecoderFactory.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCDefaultVideoDecoderFactory.h >+@@ -19,6 +19,7 @@ NS_ASSUME_NONNULL_BEGIN >+ * codecs, create custom implementations of RTCVideoEncoderFactory and RTCVideoDecoderFactory. 
>+ */ >+ RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCDefaultVideoDecoderFactory"))) >+ @interface RTCDefaultVideoDecoderFactory : NSObject <RTCVideoDecoderFactory> >+ @end >+ >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCDefaultVideoDecoderFactory.m b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCDefaultVideoDecoderFactory.m >+index bdb18517caf..c7d08caa9c9 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCDefaultVideoDecoderFactory.m >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCDefaultVideoDecoderFactory.m >+@@ -15,7 +15,7 @@ >+ #import "api/video_codec/RTCVideoCodecConstants.h" >+ #import "api/video_codec/RTCVideoDecoderVP8.h" >+ #import "base/RTCVideoCodecInfo.h" >+-#if defined(RTC_ENABLE_VP9) >++#if !defined(RTC_DISABLE_VP9) >+ #import "api/video_codec/RTCVideoDecoderVP9.h" >+ #endif >+ >+@@ -26,7 +26,7 @@ @implementation RTCDefaultVideoDecoderFactory >+ return [[RTCVideoDecoderH264 alloc] init]; >+ } else if ([info.name isEqualToString:kRTCVideoCodecVp8Name]) { >+ return [RTCVideoDecoderVP8 vp8Decoder]; >+-#if defined(RTC_ENABLE_VP9) >++#if !defined(RTC_DISABLE_VP9) >+ } else if ([info.name isEqualToString:kRTCVideoCodecVp9Name]) { >+ return [RTCVideoDecoderVP9 vp9Decoder]; >+ #endif >+@@ -39,7 +39,7 @@ @implementation RTCDefaultVideoDecoderFactory >+ return @[ >+ [[RTCVideoCodecInfo alloc] initWithName:kRTCVideoCodecH264Name], >+ [[RTCVideoCodecInfo alloc] initWithName:kRTCVideoCodecVp8Name], >+-#if defined(RTC_ENABLE_VP9) >++#if !defined(RTC_DISABLE_VP9) >+ [[RTCVideoCodecInfo alloc] initWithName:kRTCVideoCodecVp9Name], >+ #endif >+ ]; >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCDefaultVideoEncoderFactory.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCDefaultVideoEncoderFactory.h >+index 
c45e54362b2..9323256cead 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCDefaultVideoEncoderFactory.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCDefaultVideoEncoderFactory.h >+@@ -19,6 +19,7 @@ NS_ASSUME_NONNULL_BEGIN >+ * codecs, create custom implementations of RTCVideoEncoderFactory and RTCVideoDecoderFactory. >+ */ >+ RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCDefaultVideoEncoderFactory"))) >+ @interface RTCDefaultVideoEncoderFactory : NSObject <RTCVideoEncoderFactory> >+ >+ @property(nonatomic, retain) RTCVideoCodecInfo *preferredCodec; >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCDefaultVideoEncoderFactory.m b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCDefaultVideoEncoderFactory.m >+index b72296b64f3..b01c0746792 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCDefaultVideoEncoderFactory.m >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCDefaultVideoEncoderFactory.m >+@@ -15,7 +15,7 @@ >+ #import "api/video_codec/RTCVideoCodecConstants.h" >+ #import "api/video_codec/RTCVideoEncoderVP8.h" >+ #import "base/RTCVideoCodecInfo.h" >+-#if defined(RTC_ENABLE_VP9) >++#if !defined(RTC_DISABLE_VP9) >+ #import "api/video_codec/RTCVideoEncoderVP9.h" >+ #endif >+ >+@@ -44,7 +44,7 @@ @implementation RTCDefaultVideoEncoderFactory >+ >+ RTCVideoCodecInfo *vp8Info = [[RTCVideoCodecInfo alloc] initWithName:kRTCVideoCodecVp8Name]; >+ >+-#if defined(RTC_ENABLE_VP9) >++#if !defined(RTC_DISABLE_VP9) >+ RTCVideoCodecInfo *vp9Info = [[RTCVideoCodecInfo alloc] initWithName:kRTCVideoCodecVp9Name]; >+ #endif >+ >+@@ -52,7 +52,7 @@ @implementation RTCDefaultVideoEncoderFactory >+ constrainedHighInfo, >+ constrainedBaselineInfo, >+ vp8Info, >+-#if defined(RTC_ENABLE_VP9) >++#if !defined(RTC_DISABLE_VP9) >+ vp9Info, >+ #endif >+ ]; 
>+@@ -63,7 +63,7 @@ @implementation RTCDefaultVideoEncoderFactory >+ return [[RTCVideoEncoderH264 alloc] initWithCodecInfo:info]; >+ } else if ([info.name isEqualToString:kRTCVideoCodecVp8Name]) { >+ return [RTCVideoEncoderVP8 vp8Encoder]; >+-#if defined(RTC_ENABLE_VP9) >++#if !defined(RTC_DISABLE_VP9) >+ } else if ([info.name isEqualToString:kRTCVideoCodecVp9Name]) { >+ return [RTCVideoEncoderVP9 vp9Encoder]; >+ #endif >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCH264ProfileLevelId.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCH264ProfileLevelId.h >+index 56b353215a2..d022297674e 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCH264ProfileLevelId.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCH264ProfileLevelId.h >+@@ -48,6 +48,7 @@ typedef NS_ENUM(NSUInteger, RTCH264Level) { >+ }; >+ >+ RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCH264ProfileLevelId"))) >+ @interface RTCH264ProfileLevelId : NSObject >+ >+ @property(nonatomic, readonly) RTCH264Profile profile; >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCH264ProfileLevelId.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCH264ProfileLevelId.mm >+index 359656cb97b..3d5c8337246 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCH264ProfileLevelId.mm >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCH264ProfileLevelId.mm >+@@ -19,23 +19,35 @@ >+ #include "media/base/h264_profile_level_id.h" >+ #include "media/base/mediaconstants.h" >+ >++#if !defined(WEBRTC_WEBKIT_BUILD) >+ namespace { >+ >+ NSString *MaxSupportedProfileLevelConstrainedHigh(); >+ NSString *MaxSupportedProfileLevelConstrainedBaseline(); >+ >+ } // namespace >++#endif >+ >+-NSString *const kRTCVideoCodecH264Name = 
@(cricket::kH264CodecName); >++NSString *const kRTCVideoCodecH264Name = @"H264"; >+ NSString *const kRTCLevel31ConstrainedHigh = @"640c1f"; >+ NSString *const kRTCLevel31ConstrainedBaseline = @"42e01f"; >++ >++#if defined(WEBRTC_WEBKIT_BUILD) >++NSString *const kRTCMaxSupportedH264ProfileLevelConstrainedHigh = >++ @"640c1f"; >++NSString *const kRTCMaxSupportedH264ProfileLevelConstrainedBaseline = >++ @"42e01f"; >++#else >+ NSString *const kRTCMaxSupportedH264ProfileLevelConstrainedHigh = >+ MaxSupportedProfileLevelConstrainedHigh(); >+ NSString *const kRTCMaxSupportedH264ProfileLevelConstrainedBaseline = >+ MaxSupportedProfileLevelConstrainedBaseline(); >++#endif >+ >+ namespace { >+ >++#if !defined(WEBRTC_WEBKIT_BUILD) >++ >+ #if defined(WEBRTC_IOS) >+ >+ using namespace webrtc::H264; >+@@ -73,6 +85,8 @@ NSString *MaxSupportedProfileLevelConstrainedHigh() { >+ return kRTCLevel31ConstrainedHigh; >+ } >+ >++#endif >++ >+ } // namespace >+ >+ @interface RTCH264ProfileLevelId () >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoDecoderFactoryH264.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoDecoderFactoryH264.h >+index 4fcff1dff79..fdabd807480 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoDecoderFactoryH264.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoDecoderFactoryH264.h >+@@ -14,5 +14,6 @@ >+ #import "RTCVideoDecoderFactory.h" >+ >+ RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCVideoDecoderFactoryH264"))) >+ @interface RTCVideoDecoderFactoryH264 : NSObject <RTCVideoDecoderFactory> >+ @end >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoDecoderH264.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoDecoderH264.h >+index b860276206c..b10d1e6a099 100644 >+--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoDecoderH264.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoDecoderH264.h >+@@ -14,5 +14,6 @@ >+ #import "RTCVideoDecoder.h" >+ >+ RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCVideoDecoderH264"))) >+ @interface RTCVideoDecoderH264 : NSObject <RTCVideoDecoder> >+ @end >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoDecoderH264.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoDecoderH264.mm >+index 3bfb918bf61..1c18fab17df 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoDecoderH264.mm >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoDecoderH264.mm >+@@ -19,10 +19,6 @@ >+ #import "helpers.h" >+ #import "helpers/scoped_cftyperef.h" >+ >+-#if defined(WEBRTC_IOS) >+-#import "helpers/UIDevice+RTCDevice.h" >+-#endif >+- >+ #include "modules/video_coding/include/video_error_codes.h" >+ #include "rtc_base/checks.h" >+ #include "rtc_base/logging.h" >+@@ -251,9 +247,7 @@ - (void)configureDecompressionSession { >+ - (void)destroyDecompressionSession { >+ if (_decompressionSession) { >+ #if defined(WEBRTC_IOS) >+- if ([UIDevice isIOS11OrLater]) { >+- VTDecompressionSessionWaitForAsynchronousFrames(_decompressionSession); >+- } >++ VTDecompressionSessionWaitForAsynchronousFrames(_decompressionSession); >+ #endif >+ VTDecompressionSessionInvalidate(_decompressionSession); >+ CFRelease(_decompressionSession); >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoEncoderFactoryH264.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoEncoderFactoryH264.h >+index c64405e4dac..5f6f78dfa9a 100644 >+--- 
a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoEncoderFactoryH264.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoEncoderFactoryH264.h >+@@ -14,5 +14,6 @@ >+ #import "RTCVideoEncoderFactory.h" >+ >+ RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCVideoEncoderFactoryH264"))) >+ @interface RTCVideoEncoderFactoryH264 : NSObject <RTCVideoEncoderFactory> >+ @end >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoEncoderH264.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoEncoderH264.h >+index a9c05580a41..7714b25eaa4 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoEncoderH264.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoEncoderH264.h >+@@ -15,6 +15,7 @@ >+ #import "RTCVideoEncoder.h" >+ >+ RTC_OBJC_EXPORT >++__attribute__((objc_runtime_name("WK_RTCVideoEncoderH264"))) >+ @interface RTCVideoEncoderH264 : NSObject <RTCVideoEncoder> >+ >+ - (instancetype)initWithCodecInfo:(RTCVideoCodecInfo *)codecInfo; >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoEncoderH264.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoEncoderH264.mm >+index f6690b6d861..a14357fb80c 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoEncoderH264.mm >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/RTCVideoEncoderH264.mm >+@@ -9,39 +9,68 @@ >+ * >+ */ >+ >+-#import "RTCVideoEncoderH264.h" >++#import "WebRTC/RTCVideoCodecH264.h" >+ >+ #import <VideoToolbox/VideoToolbox.h> >+ #include <vector> >+ >+-#if defined(WEBRTC_IOS) >+-#import "helpers/UIDevice+RTCDevice.h" >+-#endif >+-#import "RTCCodecSpecificInfoH264.h" >+-#import "RTCH264ProfileLevelId.h" >+-#import 
"api/peerconnection/RTCRtpFragmentationHeader+Private.h" >+-#import "api/peerconnection/RTCVideoCodecInfo+Private.h" >+-#import "base/RTCCodecSpecificInfo.h" >+-#import "base/RTCI420Buffer.h" >+-#import "base/RTCVideoEncoder.h" >+-#import "base/RTCVideoFrame.h" >+-#import "base/RTCVideoFrameBuffer.h" >+-#import "components/video_frame_buffer/RTCCVPixelBuffer.h" >+-#import "helpers.h" >+- >++#import "PeerConnection/RTCVideoCodec+Private.h" >++#import "WebRTC/RTCVideoCodec.h" >++#import "WebRTC/RTCVideoFrame.h" >++#import "WebRTC/RTCVideoFrameBuffer.h" >+ #include "common_video/h264/h264_bitstream_parser.h" >+ #include "common_video/h264/profile_level_id.h" >+ #include "common_video/include/bitrate_adjuster.h" >++#import "helpers.h" >+ #include "modules/include/module_common_types.h" >+ #include "modules/video_coding/include/video_error_codes.h" >+ #include "rtc_base/buffer.h" >+ #include "rtc_base/logging.h" >+ #include "rtc_base/timeutils.h" >+-#include "sdk/objc/components/video_codec/nalu_rewriter.h" >++#include "sdk/objc/Framework/Classes/VideoToolbox/nalu_rewriter.h" >++#include "system_wrappers/include/clock.h" >+ #include "third_party/libyuv/include/libyuv/convert_from.h" >+ >+-@interface RTCVideoEncoderH264 () >++#include "sdk/WebKit/EncoderUtilities.h" >++#include "sdk/WebKit/WebKitUtilities.h" >++ >++#import <dlfcn.h> >++#import <objc/runtime.h> >++ >++SOFT_LINK_FRAMEWORK_OPTIONAL(VideoToolBox) >++SOFT_LINK_POINTER_OPTIONAL(VideoToolBox, kVTVideoEncoderSpecification_Usage, NSString *) >++ >++#if !ENABLE_VCP_ENCODER && !defined(WEBRTC_IOS) >++static inline bool isStandardFrameSize(int32_t width, int32_t height) >++{ >++ // FIXME: Envision relaxing this rule, something like width and height dividable by 4 or 8 should be good enough. 
>++ if (width == 1280) >++ return height == 720; >++ if (width == 720) >++ return height == 1280; >++ if (width == 960) >++ return height == 540; >++ if (width == 540) >++ return height == 960; >++ if (width == 640) >++ return height == 480; >++ if (width == 480) >++ return height == 640; >++ if (width == 288) >++ return height == 352; >++ if (width == 352) >++ return height == 288; >++ if (width == 320) >++ return height == 240; >++ if (width == 240) >++ return height == 320; >++ return false; >++} >++#endif >+ >++__attribute__((objc_runtime_name("WK_RTCSingleVideoEncoderH264"))) >++@interface RTCSingleVideoEncoderH264 : NSObject <RTCVideoEncoder> >++- (instancetype)initWithCodecInfo:(RTCVideoCodecInfo *)codecInfo simulcastIndex: (int)index; >+ - (void)frameWasEncoded:(OSStatus)status >+ flags:(VTEncodeInfoFlags)infoFlags >+ sampleBuffer:(CMSampleBufferRef)sampleBuffer >+@@ -51,7 +80,6 @@ - (void)frameWasEncoded:(OSStatus)status >+ renderTimeMs:(int64_t)renderTimeMs >+ timestamp:(uint32_t)timestamp >+ rotation:(RTCVideoRotation)rotation; >+- >+ @end >+ >+ namespace { // anonymous namespace >+@@ -70,7 +98,7 @@ const OSType kNV12PixelFormat = kCVPixelFormatType_420YpCbCr8BiPlanarFullRange; >+ // Struct that we pass to the encoder per frame to encode. We receive it again >+ // in the encoder callback. >+ struct RTCFrameEncodeParams { >+- RTCFrameEncodeParams(RTCVideoEncoderH264 *e, >++ RTCFrameEncodeParams(RTCSingleVideoEncoderH264 *e, >+ RTCCodecSpecificInfoH264 *csi, >+ int32_t w, >+ int32_t h, >+@@ -85,7 +113,7 @@ struct RTCFrameEncodeParams { >+ } >+ } >+ >+- RTCVideoEncoderH264 *encoder; >++ RTCSingleVideoEncoderH264 *encoder; >+ RTCCodecSpecificInfoH264 *codecSpecificInfo; >+ int32_t width; >+ int32_t height; >+@@ -173,8 +201,8 @@ void compressionOutputCallback(void *encoder, >+ rotation:encodeParams->rotation]; >+ } >+ >+-// Extract VideoToolbox profile out of the webrtc::SdpVideoFormat. 
If there is >+-// no specific VideoToolbox profile for the specified level, AutoLevel will be >++// Extract VideoToolbox profile out of the cricket::VideoCodec. If there is no >++// specific VideoToolbox profile for the specified level, AutoLevel will be >+ // returned. The user must initialize the encoder with a resolution and >+ // framerate conforming to the selected H264 level regardless. >+ CFStringRef ExtractProfile(webrtc::SdpVideoFormat videoFormat) { >+@@ -280,23 +308,26 @@ CFStringRef ExtractProfile(webrtc::SdpVideoFormat videoFormat) { >+ } >+ } // namespace >+ >+-@implementation RTCVideoEncoderH264 { >++@implementation RTCSingleVideoEncoderH264 { >+ RTCVideoCodecInfo *_codecInfo; >+ std::unique_ptr<webrtc::BitrateAdjuster> _bitrateAdjuster; >+ uint32_t _targetBitrateBps; >+- uint32_t _encoderFrameRate; >+ uint32_t _encoderBitrateBps; >+ RTCH264PacketizationMode _packetizationMode; >+ CFStringRef _profile; >+ RTCVideoEncoderCallback _callback; >+ int32_t _width; >+ int32_t _height; >+- VTCompressionSessionRef _compressionSession; >++ CompressionSessionRef _compressionSession; >+ CVPixelBufferPoolRef _pixelBufferPool; >+ RTCVideoCodecMode _mode; >+ >+ webrtc::H264BitstreamParser _h264BitstreamParser; >+ std::vector<uint8_t> _frameScaleBuffer; >++ >++ webrtc::VideoCodec _nativeVideoCodec; >++ int _simulcastIndex; >++ bool _disableEncoding; >+ } >+ >+ // .5 is set as a mininum to prevent overcompensating for large temporary >+@@ -306,12 +337,13 @@ @implementation RTCVideoEncoderH264 { >+ // drastically reduced bitrate, so we want to avoid that. In steady state >+ // conditions, 0.95 seems to give us better overall bitrate over long periods >+ // of time. 
>+-- (instancetype)initWithCodecInfo:(RTCVideoCodecInfo *)codecInfo { >++- (instancetype)initWithCodecInfo:(RTCVideoCodecInfo *)codecInfo simulcastIndex:(int)index { >+ if (self = [super init]) { >+ _codecInfo = codecInfo; >+ _bitrateAdjuster.reset(new webrtc::BitrateAdjuster(.5, .95)); >+ _packetizationMode = RTCH264PacketizationModeNonInterleaved; >+ _profile = ExtractProfile([codecInfo nativeSdpVideoFormat]); >++ _simulcastIndex = index; >+ RTC_LOG(LS_INFO) << "Using profile " << CFStringToString(_profile); >+ RTC_CHECK([codecInfo.name isEqualToString:kRTCVideoCodecH264Name]); >+ } >+@@ -330,11 +362,13 @@ - (NSInteger)startEncodeWithSettings:(RTCVideoEncoderSettings *)settings >+ _width = settings.width; >+ _height = settings.height; >+ _mode = settings.mode; >++ _nativeVideoCodec = settings.nativeVideoCodec; >++ >++ RTC_DCHECK(_nativeVideoCodec.numberOfSimulcastStreams != 1); >+ >+ // We can only set average bitrate on the HW encoder. >+- _targetBitrateBps = settings.startBitrate * 1000; // startBitrate is in kbps. >++ _targetBitrateBps = settings.startBitrate; >+ _bitrateAdjuster->SetTargetBitrateBps(_targetBitrateBps); >+- _encoderFrameRate = settings.maxFramerate; >+ >+ // TODO(tkchin): Try setting payload size via >+ // kVTCompressionPropertyKey_MaxH264SliceBytes. >+@@ -357,6 +391,10 @@ - (NSInteger)encode:(RTCVideoFrame *)frame >+ isKeyframeRequired = YES; >+ } >+ >++ if (_disableEncoding) { >++ return WEBRTC_VIDEO_CODEC_ERROR; >++ } >++ >+ CVPixelBufferRef pixelBuffer = nullptr; >+ if ([frame.buffer isKindOfClass:[RTCCVPixelBuffer class]]) { >+ // Native frame buffer >+@@ -434,9 +472,9 @@ - (NSInteger)encode:(RTCVideoFrame *)frame >+ encodeParams->codecSpecificInfo.packetizationMode = _packetizationMode; >+ >+ // Update the bitrate if needed. 
>+- [self setBitrateBps:_bitrateAdjuster->GetAdjustedBitrateBps() frameRate:_encoderFrameRate]; >++ [self setBitrateBps:_bitrateAdjuster->GetAdjustedBitrateBps()]; >+ >+- OSStatus status = VTCompressionSessionEncodeFrame(_compressionSession, >++ OSStatus status = CompressionSessionEncodeFrame(_compressionSession, >+ pixelBuffer, >+ presentationTimeStamp, >+ kCMTimeInvalid, >+@@ -467,10 +505,10 @@ - (void)setCallback:(RTCVideoEncoderCallback)callback { >+ _callback = callback; >+ } >+ >+-- (int)setBitrate:(uint32_t)bitrateKbit framerate:(uint32_t)framerate { >+- _targetBitrateBps = 1000 * bitrateKbit; >++- (int)setBitrate:(uint32_t)bitrateBps framerate:(uint32_t)framerate { >++ _targetBitrateBps = bitrateBps; >+ _bitrateAdjuster->SetTargetBitrateBps(_targetBitrateBps); >+- [self setBitrateBps:_bitrateAdjuster->GetAdjustedBitrateBps() frameRate:framerate]; >++ [self setBitrateBps:_bitrateAdjuster->GetAdjustedBitrateBps()]; >+ return WEBRTC_VIDEO_CODEC_OK; >+ } >+ >+@@ -503,6 +541,11 @@ - (BOOL)resetCompressionSessionIfNeededWithFrame:(RTCVideoFrame *)frame { >+ OSType framePixelFormat = [self pixelFormatOfFrame:frame]; >+ >+ if (_compressionSession) { >++#if defined(WEBRTC_WEBKIT_BUILD) >++ if (!_pixelBufferPool) { >++ return NO; >++ } >++#endif >+ // The pool attribute `kCVPixelBufferPixelFormatTypeKey` can contain either an array of pixel >+ // formats or a single pixel format. 
>+ NSDictionary *poolAttributes = >+@@ -516,6 +559,11 @@ - (BOOL)resetCompressionSessionIfNeededWithFrame:(RTCVideoFrame *)frame { >+ compressionSessionPixelFormats = @[ (NSNumber *)pixelFormats ]; >+ } >+ >++ if ([frame.buffer isKindOfClass:[RTCCVPixelBuffer class]]) { >++ RTCCVPixelBuffer *rtcPixelBuffer = (RTCCVPixelBuffer *)frame.buffer; >++ framePixelFormat = CVPixelBufferGetPixelFormatType(rtcPixelBuffer.pixelBuffer); >++ } >++ >+ if (![compressionSessionPixelFormats >+ containsObject:[NSNumber numberWithLong:framePixelFormat]]) { >+ resetCompressionSession = YES; >+@@ -533,6 +581,7 @@ - (BOOL)resetCompressionSessionIfNeededWithFrame:(RTCVideoFrame *)frame { >+ >+ - (int)resetCompressionSessionWithPixelFormat:(OSType)framePixelFormat { >+ [self destroyCompressionSession]; >++ _disableEncoding = false; >+ >+ // Set source image buffer attributes. These attributes will be present on >+ // buffers retrieved from the encoder's pixel buffer pool. >+@@ -559,22 +608,26 @@ - (int)resetCompressionSessionWithPixelFormat:(OSType)framePixelFormat { >+ CFRelease(pixelFormat); >+ pixelFormat = nullptr; >+ } >+- CFMutableDictionaryRef encoder_specs = nullptr; >+-#if defined(WEBRTC_MAC) && !defined(WEBRTC_IOS) >++ CFDictionaryRef encoderSpecs = nullptr; >++#if !defined(WEBRTC_IOS) >++ auto useHardwareEncoder = webrtc::isH264HardwareEncoderAllowed() ? kCFBooleanTrue : kCFBooleanFalse; >+ // Currently hw accl is supported above 360p on mac, below 360p >+ // the compression session will be created with hw accl disabled. 
>+- encoder_specs = CFDictionaryCreateMutable( >+- nullptr, 1, &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks); >+- CFDictionarySetValue(encoder_specs, >+- kVTVideoEncoderSpecification_EnableHardwareAcceleratedVideoEncoder, >+- kCFBooleanTrue); >++ CFTypeRef sessionKeys[] = { kVTVideoEncoderSpecification_EnableHardwareAcceleratedVideoEncoder, kVTVideoEncoderSpecification_RequireHardwareAcceleratedVideoEncoder, kVTCompressionPropertyKey_RealTime }; >++ CFTypeRef sessionValues[] = { useHardwareEncoder, useHardwareEncoder, kCFBooleanTrue }; >++ encoderSpecs = CFDictionaryCreate(kCFAllocatorDefault, sessionKeys, sessionValues, 3, &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks); >++#else >++ CFTypeRef sessionKeys[] = { kVTCompressionPropertyKey_RealTime }; >++ CFTypeRef sessionValues[] = { kCFBooleanTrue }; >++ encoderSpecs = CFDictionaryCreate(kCFAllocatorDefault, sessionKeys, sessionValues, 1, &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks); >+ #endif >++ >+ OSStatus status = >+- VTCompressionSessionCreate(nullptr, // use default allocator >++ CompressionSessionCreate(nullptr, // use default allocator >+ _width, >+ _height, >+- kCMVideoCodecType_H264, >+- encoder_specs, // use hardware accelerated encoder if available >++ kCodecTypeH264, >++ encoderSpecs, // use hardware accelerated encoder if available >+ sourceAttributes, >+ nullptr, // use default compressed data allocator >+ compressionOutputCallback, >+@@ -584,32 +637,113 @@ - (int)resetCompressionSessionWithPixelFormat:(OSType)framePixelFormat { >+ CFRelease(sourceAttributes); >+ sourceAttributes = nullptr; >+ } >+- if (encoder_specs) { >+- CFRelease(encoder_specs); >+- encoder_specs = nullptr; >++ if (encoderSpecs) { >++ CFRelease(encoderSpecs); >++ encoderSpecs = nullptr; >+ } >++ >++#if ENABLE_VCP_ENCODER || defined(WEBRTC_IOS) >+ if (status != noErr) { >+ RTC_LOG(LS_ERROR) << "Failed to create compression session: " << status; >+ return 
WEBRTC_VIDEO_CODEC_ERROR; >+ } >++#endif >+ #if defined(WEBRTC_MAC) && !defined(WEBRTC_IOS) >+ CFBooleanRef hwaccl_enabled = nullptr; >+- status = VTSessionCopyProperty(_compressionSession, >++ if (status == noErr) { >++ status = VTSessionCopyProperty(_compressionSession, >+ kVTCompressionPropertyKey_UsingHardwareAcceleratedVideoEncoder, >+ nullptr, >+ &hwaccl_enabled); >++ } >+ if (status == noErr && (CFBooleanGetValue(hwaccl_enabled))) { >+ RTC_LOG(LS_INFO) << "Compression session created with hw accl enabled"; >+ } else { >+ RTC_LOG(LS_INFO) << "Compression session created with hw accl disabled"; >++ >++#if !ENABLE_VCP_ENCODER && !defined(WEBRTC_IOS) >++ if (!isStandardFrameSize(_width, _height)) { >++ _disableEncoding = true; >++ RTC_LOG(LS_ERROR) << "Using H264 software encoder with non standard size is not supported"; >++ return WEBRTC_VIDEO_CODEC_ERROR; >++ } >++ >++ if (!getkVTVideoEncoderSpecification_Usage()) { >++ _disableEncoding = true; >++ RTC_LOG(LS_ERROR) << "RTCSingleVideoEncoderH264 cannot create a H264 software encoder"; >++ return WEBRTC_VIDEO_CODEC_ERROR; >++ } >++ >++ CFDictionaryRef ioSurfaceValue = CreateCFTypeDictionary(nullptr, nullptr, 0); >++ int64_t pixelFormatType = framePixelFormat; >++ CFNumberRef pixelFormat = CFNumberCreate(nullptr, kCFNumberLongType, &pixelFormatType); >++ >++ const size_t attributesSize = 3; >++ CFTypeRef keys[attributesSize] = { >++ kCVPixelBufferOpenGLCompatibilityKey, >++ kCVPixelBufferIOSurfacePropertiesKey, >++ kCVPixelBufferPixelFormatTypeKey >++ }; >++ CFTypeRef values[attributesSize] = { >++ kCFBooleanTrue, >++ ioSurfaceValue, >++ pixelFormat}; >++ CFDictionaryRef sourceAttributes = CreateCFTypeDictionary(keys, values, attributesSize); >++ >++ if (ioSurfaceValue) { >++ CFRelease(ioSurfaceValue); >++ ioSurfaceValue = nullptr; >++ } >++ if (pixelFormat) { >++ CFRelease(pixelFormat); >++ pixelFormat = nullptr; >++ } >++ >++ CFMutableDictionaryRef encoderSpecs = CFDictionaryCreateMutable(nullptr, 2, 
&kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks); >++ CFDictionarySetValue(encoderSpecs, kVTVideoEncoderSpecification_EnableHardwareAcceleratedVideoEncoder, kCFBooleanFalse); >++ int usageValue = 1; >++ CFNumberRef usage = CFNumberCreate(nullptr, kCFNumberIntType, &usageValue); >++ CFDictionarySetValue(encoderSpecs, (__bridge CFStringRef)getkVTVideoEncoderSpecification_Usage(), usage); >++ >++ if (usage) { >++ CFRelease(usage); >++ usage = nullptr; >++ } >++ >++ [self destroyCompressionSession]; >++ >++ OSStatus status = >++ CompressionSessionCreate(nullptr, // use default allocator >++ _width, >++ _height, >++ kCodecTypeH264, >++ encoderSpecs, >++ sourceAttributes, >++ nullptr, // use default compressed data allocator >++ compressionOutputCallback, >++ nullptr, >++ &_compressionSession); >++ if (sourceAttributes) { >++ CFRelease(sourceAttributes); >++ sourceAttributes = nullptr; >++ } >++ if (encoderSpecs) { >++ CFRelease(encoderSpecs); >++ encoderSpecs = nullptr; >++ } >++ if (status != noErr) { >++ return WEBRTC_VIDEO_CODEC_ERROR; >++ } >++#endif >+ } >+ #endif >+ [self configureCompressionSession]; >+ >++#if !defined(WEBRTC_WEBKIT_BUILD) >+ // The pixel buffer pool is dependent on the compression session so if the session is reset, the >+ // pool should be reset as well. 
>+- _pixelBufferPool = VTCompressionSessionGetPixelBufferPool(_compressionSession); >+- >++ _pixelBufferPool = CompressionSessionGetPixelBufferPool(_compressionSession); >++#endif >+ return WEBRTC_VIDEO_CODEC_OK; >+ } >+ >+@@ -618,7 +752,11 @@ - (void)configureCompressionSession { >+ SetVTSessionProperty(_compressionSession, kVTCompressionPropertyKey_RealTime, true); >+ SetVTSessionProperty(_compressionSession, kVTCompressionPropertyKey_ProfileLevel, _profile); >+ SetVTSessionProperty(_compressionSession, kVTCompressionPropertyKey_AllowFrameReordering, false); >+- [self setEncoderBitrateBps:_targetBitrateBps frameRate:_encoderFrameRate]; >++#if ENABLE_VCP_ENCODER >++ if (auto key = getkVTVideoEncoderSpecification_Usage()) >++ SetVTSessionProperty(_compressionSession, (__bridge CFStringRef)key, 1); >++#endif >++ [self setEncoderBitrateBps:_targetBitrateBps]; >+ // TODO(tkchin): Look at entropy mode and colorspace matrices. >+ // TODO(tkchin): Investigate to see if there's any way to make this work. >+ // May need it to interop with Android. Currently this call just fails. 
>+@@ -635,7 +773,7 @@ - (void)configureCompressionSession { >+ >+ - (void)destroyCompressionSession { >+ if (_compressionSession) { >+- VTCompressionSessionInvalidate(_compressionSession); >++ CompressionSessionInvalidate(_compressionSession); >+ CFRelease(_compressionSession); >+ _compressionSession = nullptr; >+ _pixelBufferPool = nullptr; >+@@ -646,17 +784,15 @@ - (NSString *)implementationName { >+ return @"VideoToolbox"; >+ } >+ >+-- (void)setBitrateBps:(uint32_t)bitrateBps frameRate:(uint32_t)frameRate { >+- if (_encoderBitrateBps != bitrateBps || _encoderFrameRate != frameRate) { >+- [self setEncoderBitrateBps:bitrateBps frameRate:frameRate]; >++- (void)setBitrateBps:(uint32_t)bitrateBps { >++ if (_encoderBitrateBps != bitrateBps) { >++ [self setEncoderBitrateBps:bitrateBps]; >+ } >+ } >+ >+-- (void)setEncoderBitrateBps:(uint32_t)bitrateBps frameRate:(uint32_t)frameRate { >++- (void)setEncoderBitrateBps:(uint32_t)bitrateBps { >+ if (_compressionSession) { >+ SetVTSessionProperty(_compressionSession, kVTCompressionPropertyKey_AverageBitRate, bitrateBps); >+- SetVTSessionProperty( >+- _compressionSession, kVTCompressionPropertyKey_ExpectedFrameRate, frameRate); >+ >+ // TODO(tkchin): Add a helper method to set array value. 
>+ int64_t dataLimitBytesPerSecondValue = >+@@ -668,7 +804,7 @@ - (void)setEncoderBitrateBps:(uint32_t)bitrateBps frameRate:(uint32_t)frameRate >+ CFNumberCreate(kCFAllocatorDefault, kCFNumberSInt64Type, &oneSecondValue); >+ const void *nums[2] = {bytesPerSecond, oneSecond}; >+ CFArrayRef dataRateLimits = CFArrayCreate(nullptr, nums, 2, &kCFTypeArrayCallBacks); >+- OSStatus status = VTSessionSetProperty( >++ OSStatus status = CompressionSessionSetProperty( >+ _compressionSession, kVTCompressionPropertyKey_DataRateLimits, dataRateLimits); >+ if (bytesPerSecond) { >+ CFRelease(bytesPerSecond); >+@@ -680,11 +816,10 @@ - (void)setEncoderBitrateBps:(uint32_t)bitrateBps frameRate:(uint32_t)frameRate >+ CFRelease(dataRateLimits); >+ } >+ if (status != noErr) { >+- RTC_LOG(LS_ERROR) << "Failed to set data rate limit with code: " << status; >++ RTC_LOG(LS_ERROR) << "Failed to set data rate limit"; >+ } >+ >+ _encoderBitrateBps = bitrateBps; >+- _encoderFrameRate = frameRate; >+ } >+ } >+ >+@@ -698,7 +833,7 @@ - (void)frameWasEncoded:(OSStatus)status >+ timestamp:(uint32_t)timestamp >+ rotation:(RTCVideoRotation)rotation { >+ if (status != noErr) { >+- RTC_LOG(LS_ERROR) << "H264 encode failed with code: " << status; >++ RTC_LOG(LS_ERROR) << "H264 encode failed: " << status; >+ return; >+ } >+ if (infoFlags & kVTEncodeInfo_FrameDropped) { >+@@ -743,6 +878,7 @@ - (void)frameWasEncoded:(OSStatus)status >+ frame.rotation = rotation; >+ frame.contentType = (_mode == RTCVideoCodecModeScreensharing) ? 
RTCVideoContentTypeScreenshare : >+ RTCVideoContentTypeUnspecified; >++ frame.spatialIndex = _simulcastIndex; >+ frame.flags = webrtc::VideoSendTiming::kInvalid; >+ >+ int qp; >+@@ -758,9 +894,88 @@ - (void)frameWasEncoded:(OSStatus)status >+ _bitrateAdjuster->Update(frame.buffer.length); >+ } >+ >+-- (nullable RTCVideoEncoderQpThresholds *)scalingSettings { >+- return [[RTCVideoEncoderQpThresholds alloc] initWithThresholdsLow:kLowH264QpThreshold >+- high:kHighH264QpThreshold]; >++- (RTCVideoEncoderQpThresholds *)scalingSettings { >++ return [[RTCVideoEncoderQpThresholds alloc] initWithThresholdsLow:kLowH264QpThreshold high:kHighH264QpThreshold]; >++} >++ >++- (int)setRateAllocation:(RTCVideoBitrateAllocation *)allocation framerate:(uint32_t)framerate { >++ return 0; >++} >++ >++@end >++ >++@implementation RTCVideoEncoderH264 { >++ NSMutableArray<RTCSingleVideoEncoderH264*> *_codecs; >++ RTCVideoCodecInfo *_codecInfo; >++} >++- (instancetype)initWithCodecInfo:(RTCVideoCodecInfo *)codecInfo { >++ if (self = [super init]) { >++ _codecInfo = codecInfo; >++ } >++ return self; >++} >++ >++- (void)setCallback:(RTCVideoEncoderCallback)callback { >++ for (RTCSingleVideoEncoderH264 *codec : _codecs) >++ [codec setCallback:callback]; >++} >++ >++- (NSInteger)startEncodeWithSettings:(RTCVideoEncoderSettings *)settings numberOfCores:(int)numberOfCores { >++ auto nativeCodecSettings = settings.nativeVideoCodec; >++ >++ _codecs = [[NSMutableArray alloc] init]; >++ for (unsigned index = 0 ; index < nativeCodecSettings.numberOfSimulcastStreams; ++index) { >++ auto codec = [[RTCSingleVideoEncoderH264 alloc] initWithCodecInfo:_codecInfo simulcastIndex:index]; >++ [_codecs addObject:codec]; >++ >++ auto codecSettings = nativeCodecSettings; >++ codecSettings.width = nativeCodecSettings.simulcastStream[index].width; >++ codecSettings.height = nativeCodecSettings.simulcastStream[index].height; >++ codecSettings.maxBitrate = nativeCodecSettings.simulcastStream[index].maxBitrate; >++ 
codecSettings.targetBitrate = nativeCodecSettings.simulcastStream[index].targetBitrate; >++ codecSettings.minBitrate = nativeCodecSettings.simulcastStream[index].minBitrate; >++ codecSettings.qpMax = nativeCodecSettings.simulcastStream[index].qpMax; >++ codecSettings.active = true; >++ >++ auto *settings = [[RTCVideoEncoderSettings alloc] initWithNativeVideoCodec:&codecSettings]; >++ [codec startEncodeWithSettings:settings numberOfCores:numberOfCores]; >++ } >++ return 0; >++} >++ >++- (NSInteger)releaseEncoder { >++ for (RTCSingleVideoEncoderH264 *codec : _codecs) >++ [codec releaseEncoder]; >++ _codecs = nil; >++ return 0; >++} >++ >++- (NSInteger)encode:(RTCVideoFrame *)frame codecSpecificInfo:(nullable id<RTCCodecSpecificInfo>)info frameTypes:(NSArray<NSNumber *> *)frameTypes { >++ int result = 0; >++ for (RTCSingleVideoEncoderH264 *codec : _codecs) >++ result |= [codec encode:frame codecSpecificInfo:info frameTypes:frameTypes]; >++ return result; >++} >++ >++- (int)setRateAllocation:(RTCVideoBitrateAllocation *)bitRateAllocation framerate:(uint32_t) framerate { >++ int result = 0; >++ unsigned counter = 0; >++ auto nativeBitRateAllocation = bitRateAllocation.nativeVideoBitrateAllocation; >++ for (RTCSingleVideoEncoderH264 *codec : _codecs) >++ result |= [codec setBitrate:nativeBitRateAllocation.GetSpatialLayerSum(counter++) framerate:framerate]; >++ return result; >++} >++ >++- (NSString *)implementationName { >++ return @"VideoToolbox"; >++} >++ >++- (RTCVideoEncoderQpThresholds *)scalingSettings { >++ return [[RTCVideoEncoderQpThresholds alloc] initWithThresholdsLow:kLowH264QpThreshold high:kHighH264QpThreshold]; >++} >++ >++- (int)setBitrate:(uint32_t)bitrateKbit framerate:(uint32_t)framerate { >++ return 0; >+ } >+ >+ @end >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/UIDevice+H264Profile.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/UIDevice+H264Profile.mm >+index 
852c3d4702f..a015f7ca15c 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/UIDevice+H264Profile.mm >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/UIDevice+H264Profile.mm >+@@ -24,9 +24,6 @@ struct SupportedH264Profile { >+ >+ constexpr SupportedH264Profile kH264MaxSupportedProfiles[] = { >+ // iPhones with at least iOS 9 >+- {RTCDeviceTypeIPhoneXS, {kProfileHigh, kLevel5_2}}, // https://support.apple.com/kb/SP779 >+- {RTCDeviceTypeIPhoneXSMax, {kProfileHigh, kLevel5_2}}, // https://support.apple.com/kb/SP780 >+- {RTCDeviceTypeIPhoneXR, {kProfileHigh, kLevel5_2}}, // https://support.apple.com/kb/SP781 >+ {RTCDeviceTypeIPhoneX, {kProfileHigh, kLevel5_2}}, // https://support.apple.com/kb/SP770 >+ {RTCDeviceTypeIPhone8, {kProfileHigh, kLevel5_2}}, // https://support.apple.com/kb/SP767 >+ {RTCDeviceTypeIPhone8Plus, {kProfileHigh, kLevel5_2}}, // https://support.apple.com/kb/SP768 >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/helpers.cc b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/helpers.cc >+index ac957f1b497..46daf44ecac 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/helpers.cc >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/helpers.cc >+@@ -35,12 +35,12 @@ std::string CFStringToString(const CFStringRef cf_string) { >+ } >+ >+ // Convenience function for setting a VT property. 
>+-void SetVTSessionProperty(VTSessionRef session, >++void SetVTSessionProperty(CompressionSessionRef session, >+ CFStringRef key, >+ int32_t value) { >+ CFNumberRef cfNum = >+ CFNumberCreate(kCFAllocatorDefault, kCFNumberSInt32Type, &value); >+- OSStatus status = VTSessionSetProperty(session, key, cfNum); >++ OSStatus status = CompressionSessionSetProperty(session, key, cfNum); >+ CFRelease(cfNum); >+ if (status != noErr) { >+ std::string key_string = CFStringToString(key); >+@@ -50,13 +50,13 @@ void SetVTSessionProperty(VTSessionRef session, >+ } >+ >+ // Convenience function for setting a VT property. >+-void SetVTSessionProperty(VTSessionRef session, >++void SetVTSessionProperty(CompressionSessionRef session, >+ CFStringRef key, >+ uint32_t value) { >+ int64_t value_64 = value; >+ CFNumberRef cfNum = >+ CFNumberCreate(kCFAllocatorDefault, kCFNumberSInt64Type, &value_64); >+- OSStatus status = VTSessionSetProperty(session, key, cfNum); >++ OSStatus status = CompressionSessionSetProperty(session, key, cfNum); >+ CFRelease(cfNum); >+ if (status != noErr) { >+ std::string key_string = CFStringToString(key); >+@@ -66,9 +66,9 @@ void SetVTSessionProperty(VTSessionRef session, >+ } >+ >+ // Convenience function for setting a VT property. >+-void SetVTSessionProperty(VTSessionRef session, CFStringRef key, bool value) { >++void SetVTSessionProperty(CompressionSessionRef session, CFStringRef key, bool value) { >+ CFBooleanRef cf_bool = (value) ? kCFBooleanTrue : kCFBooleanFalse; >+- OSStatus status = VTSessionSetProperty(session, key, cf_bool); >++ OSStatus status = CompressionSessionSetProperty(session, key, cf_bool); >+ if (status != noErr) { >+ std::string key_string = CFStringToString(key); >+ RTC_LOG(LS_ERROR) << "VTSessionSetProperty failed to set: " << key_string >+@@ -77,10 +77,10 @@ void SetVTSessionProperty(VTSessionRef session, CFStringRef key, bool value) { >+ } >+ >+ // Convenience function for setting a VT property. 
>+-void SetVTSessionProperty(VTSessionRef session, >++void SetVTSessionProperty(CompressionSessionRef session, >+ CFStringRef key, >+ CFStringRef value) { >+- OSStatus status = VTSessionSetProperty(session, key, value); >++ OSStatus status = CompressionSessionSetProperty(session, key, value); >+ if (status != noErr) { >+ std::string key_string = CFStringToString(key); >+ std::string val_string = CFStringToString(value); >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/helpers.h b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/helpers.h >+index 0683ea79e56..0c669e8bce6 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/helpers.h >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/components/video_codec/helpers.h >+@@ -16,6 +16,8 @@ >+ #include <VideoToolbox/VideoToolbox.h> >+ #include <string> >+ >++#include "sdk/WebKit/EncoderUtilities.h" >++ >+ // Convenience function for creating a dictionary. >+ inline CFDictionaryRef CreateCFTypeDictionary(CFTypeRef* keys, >+ CFTypeRef* values, >+@@ -29,18 +31,18 @@ inline CFDictionaryRef CreateCFTypeDictionary(CFTypeRef* keys, >+ std::string CFStringToString(const CFStringRef cf_string); >+ >+ // Convenience function for setting a VT property. >+-void SetVTSessionProperty(VTSessionRef session, CFStringRef key, int32_t value); >++void SetVTSessionProperty(CompressionSessionRef session, CFStringRef key, int32_t value); >+ >+ // Convenience function for setting a VT property. >+-void SetVTSessionProperty(VTSessionRef session, >++void SetVTSessionProperty(CompressionSessionRef session, >+ CFStringRef key, >+ uint32_t value); >+ >+ // Convenience function for setting a VT property. >+-void SetVTSessionProperty(VTSessionRef session, CFStringRef key, bool value); >++void SetVTSessionProperty(CompressionSessionRef session, CFStringRef key, bool value); >+ >+ // Convenience function for setting a VT property. 
>+-void SetVTSessionProperty(VTSessionRef session, >++void SetVTSessionProperty(CompressionSessionRef session, >+ CFStringRef key, >+ CFStringRef value); >+ >+diff --git a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/native/src/objc_video_encoder_factory.mm b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/native/src/objc_video_encoder_factory.mm >+index fafad7466ec..ae45033a570 100644 >+--- a/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/native/src/objc_video_encoder_factory.mm >++++ b/Source/ThirdParty/libwebrtc/Source/webrtc/sdk/objc/native/src/objc_video_encoder_factory.mm >+@@ -90,6 +90,11 @@ class ObjCVideoEncoder : public VideoEncoder { >+ return [encoder_ setBitrate:bitrate framerate:framerate]; >+ } >+ >++ int32_t SetRateAllocation(const VideoBitrateAllocation& allocation, uint32_t framerate) { >++ auto *rtcAllocation = [[RTCVideoBitrateAllocation alloc] initWithNativeVideoBitrateAllocation:&allocation]; >++ return [encoder_ setRateAllocation: rtcAllocation framerate:framerate]; >++ } >++ >+ VideoEncoder::EncoderInfo GetEncoderInfo() const { >+ EncoderInfo info; >+ info.supports_native_handle = true; >diff --git a/Source/ThirdParty/libwebrtc/libwebrtc.xcodeproj/project.pbxproj b/Source/ThirdParty/libwebrtc/libwebrtc.xcodeproj/project.pbxproj >index 2a54bdb0078a4b8005a31befed10f9cc63b4de3a..229467017fea6bc5e31ab58c0fb35315c828985d 100644 >--- a/Source/ThirdParty/libwebrtc/libwebrtc.xcodeproj/project.pbxproj >+++ b/Source/ThirdParty/libwebrtc/libwebrtc.xcodeproj/project.pbxproj >@@ -81,7 +81,6 @@ > 41294093212E128D00AD95E7 /* libvpx_vp8_encoder.h in Headers */ = {isa = PBXBuildFile; fileRef = 4129408F212E128C00AD95E7 /* libvpx_vp8_encoder.h */; }; > 41294094212E128D00AD95E7 /* libvpx_vp8_decoder.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41294090212E128C00AD95E7 /* libvpx_vp8_decoder.cc */; }; > 41294095212E128D00AD95E7 /* libvpx_vp8_decoder.h in Headers */ = {isa = PBXBuildFile; fileRef = 41294091212E128C00AD95E7 /* libvpx_vp8_decoder.h 
*/; }; >- 41299B8B2127365100B3414B /* pacer_controller.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41E02CD62127363C00C27CD6 /* pacer_controller.cc */; }; > 41299B8C2127365100B3414B /* send_time_history.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41E02CD82127363D00C27CD6 /* send_time_history.cc */; }; > 41299B8D2127365100B3414B /* transport_feedback_adapter.h in Headers */ = {isa = PBXBuildFile; fileRef = 41E02CD72127363C00C27CD6 /* transport_feedback_adapter.h */; }; > 41299B912127367B00B3414B /* isac_vad.c in Sources */ = {isa = PBXBuildFile; fileRef = 41299B8E2127367A00B3414B /* isac_vad.c */; }; >@@ -142,7 +141,6 @@ > 413A22A11FE18E0600373E99 /* testbase64.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A21441FE18D4C00373E99 /* testbase64.h */; }; > 413A22A21FE18E0600373E99 /* function_view.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A21451FE18D4C00373E99 /* function_view.h */; }; > 413A22A31FE18E0600373E99 /* format_macros.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A21471FE18D4D00373E99 /* format_macros.h */; }; >- 413A22A41FE18E0600373E99 /* unixfilesystem.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A21481FE18D4D00373E99 /* unixfilesystem.h */; }; > 413A22A51FE18E0600373E99 /* testechoserver.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413A21491FE18D4E00373E99 /* testechoserver.cc */; }; > 413A22A71FE18E0600373E99 /* proxyserver.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413A214C1FE18D4F00373E99 /* proxyserver.cc */; }; > 413A22A81FE18E0600373E99 /* asyncinvoker-inl.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A214D1FE18D5000373E99 /* asyncinvoker-inl.h */; }; >@@ -228,16 +226,13 @@ > 413A23381FE18E0700373E99 /* macutils.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413A21E41FE18DB000373E99 /* macutils.cc */; }; > 413A23391FE18E0700373E99 /* task_queue.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A21E51FE18DB100373E99 /* task_queue.h */; }; > 413A233B1FE18E0700373E99 /* bitbuffer.cc in 
Sources */ = {isa = PBXBuildFile; fileRef = 413A21E71FE18DB100373E99 /* bitbuffer.cc */; }; >- 413A233C1FE18E0700373E99 /* pathutils.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413A21E81FE18DB200373E99 /* pathutils.cc */; }; > 413A233D1FE18E0700373E99 /* physicalsocketserver.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413A21E91FE18DB200373E99 /* physicalsocketserver.cc */; }; > 413A233E1FE18E0700373E99 /* byteorder.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A21EA1FE18DB300373E99 /* byteorder.h */; }; > 413A23421FE18E0700373E99 /* thread.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A21EE1FE18DB400373E99 /* thread.h */; }; > 413A23431FE18E0700373E99 /* crc32.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413A21EF1FE18DB500373E99 /* crc32.cc */; }; >- 413A23451FE18E0700373E99 /* optionsfile.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A21F11FE18DB600373E99 /* optionsfile.h */; }; > 413A23471FE18E0700373E99 /* swap_queue.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A21F31FE18DB700373E99 /* swap_queue.h */; }; > 413A23481FE18E0700373E99 /* dscp.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A21F41FE18DB700373E99 /* dscp.h */; }; > 413A23491FE18E0700373E99 /* timeutils.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A21F51FE18DB700373E99 /* timeutils.h */; }; >- 413A234A1FE18E0700373E99 /* optionsfile.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413A21F61FE18DB800373E99 /* optionsfile.cc */; }; > 413A234B1FE18E0700373E99 /* type_traits.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A21F71FE18DB800373E99 /* type_traits.h */; }; > 413A234C1FE18E0700373E99 /* random.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A21F81FE18DB900373E99 /* random.h */; }; > 413A234D1FE18E0700373E99 /* socketaddresspair.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413A21F91FE18DB900373E99 /* socketaddresspair.cc */; }; >@@ -255,7 +250,6 @@ > 413A235C1FE18E0700373E99 /* bind.h in Headers */ = {isa = PBXBuildFile; fileRef = 
413A22091FE18DC000373E99 /* bind.h */; }; > 413A235D1FE18E0700373E99 /* gunit_prod.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A220A1FE18DC000373E99 /* gunit_prod.h */; }; > 413A235E1FE18E0700373E99 /* asyncsocket.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A220B1FE18DC100373E99 /* asyncsocket.h */; }; >- 413A235F1FE18E0700373E99 /* noop.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413A220C1FE18DC100373E99 /* noop.cc */; }; > 413A23601FE18E0700373E99 /* rate_limiter.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A220D1FE18DC200373E99 /* rate_limiter.h */; }; > 413A23611FE18E0700373E99 /* socketadapters.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A220E1FE18DC200373E99 /* socketadapters.h */; }; > 413A23621FE18E0700373E99 /* ratetracker.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413A220F1FE18DC200373E99 /* ratetracker.cc */; }; >@@ -287,7 +281,6 @@ > 413A238D1FE18E0700373E99 /* rate_limiter.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413A223A1FE18DD800373E99 /* rate_limiter.cc */; }; > 413A238E1FE18E0700373E99 /* buffer.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A223B1FE18DD900373E99 /* buffer.h */; }; > 413A238F1FE18E0700373E99 /* event_tracer.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413A223C1FE18DD900373E99 /* event_tracer.cc */; }; >- 413A23901FE18E0800373E99 /* pathutils.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A223D1FE18DD900373E99 /* pathutils.h */; }; > 413A23921FE18E0800373E99 /* sslroots.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A223F1FE18DDA00373E99 /* sslroots.h */; }; > 413A23951FE18E0800373E99 /* httpcommon.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A22421FE18DDB00373E99 /* httpcommon.h */; }; > 413A23961FE18E0800373E99 /* asyncsocket.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413A22431FE18DDC00373E99 /* asyncsocket.cc */; }; >@@ -306,12 +299,10 @@ > 413A23AE1FE18E0800373E99 /* sequenced_task_checker_impl.h in Headers */ = {isa = PBXBuildFile; fileRef = 
413A225B1FE18DE600373E99 /* sequenced_task_checker_impl.h */; }; > 413A23AF1FE18E0800373E99 /* sslfingerprint.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A225C1FE18DE600373E99 /* sslfingerprint.h */; }; > 413A23B11FE18E0800373E99 /* stringize_macros.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A225E1FE18DE700373E99 /* stringize_macros.h */; }; >- 413A23B51FE18E0800373E99 /* fileutils.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413A22621FE18DE900373E99 /* fileutils.cc */; }; > 413A23B71FE18E0800373E99 /* event.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413A22641FE18DEB00373E99 /* event.cc */; }; > 413A23B91FE18E0800373E99 /* socketserver.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A22661FE18DEC00373E99 /* socketserver.h */; }; > 413A23BA1FE18E0800373E99 /* sigslottester.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A22671FE18DEC00373E99 /* sigslottester.h */; }; > 413A23BC1FE18E0800373E99 /* flags.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A22691FE18DEF00373E99 /* flags.h */; }; >- 413A23BD1FE18E0800373E99 /* fileutils.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A226A1FE18DF000373E99 /* fileutils.h */; }; > 413A23BE1FE18E0800373E99 /* logsinks.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413A226B1FE18DF000373E99 /* logsinks.cc */; }; > 413A23C01FE18E0800373E99 /* nullsocketserver.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A226D1FE18DF100373E99 /* nullsocketserver.h */; }; > 413A23C11FE18E0800373E99 /* sslstreamadapter.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A226E1FE18DF100373E99 /* sslstreamadapter.h */; }; >@@ -331,7 +322,6 @@ > 413A23D81FE18E0800373E99 /* file_posix.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413A22851FE18DFC00373E99 /* file_posix.cc */; }; > 413A23D91FE18E0800373E99 /* testutils.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A22861FE18DFC00373E99 /* testutils.h */; }; > 413A23DA1FE18E0800373E99 /* filerotatingstream.cc in Sources */ = {isa = 
PBXBuildFile; fileRef = 413A22871FE18DFD00373E99 /* filerotatingstream.cc */; }; >- 413A23DD1FE18E0800373E99 /* thread_darwin.mm in Sources */ = {isa = PBXBuildFile; fileRef = 413A228A1FE18DFE00373E99 /* thread_darwin.mm */; }; > 413A23DE1FE18E0800373E99 /* task_queue_gcd.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413A228B1FE18DFE00373E99 /* task_queue_gcd.cc */; }; > 413A23DF1FE18E0800373E99 /* firewallsocketserver.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A228C1FE18DFE00373E99 /* firewallsocketserver.h */; }; > 413A23E21FE18E0800373E99 /* cpu_time.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A22901FE18E0000373E99 /* cpu_time.h */; }; >@@ -342,7 +332,6 @@ > 413A23EB1FE18E0800373E99 /* location.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413A22991FE18E0300373E99 /* location.cc */; }; > 413A23EC1FE18E0800373E99 /* location.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A229A1FE18E0400373E99 /* location.h */; }; > 413A23ED1FE18E0800373E99 /* ipaddress.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413A229B1FE18E0400373E99 /* ipaddress.cc */; }; >- 413A23EE1FE18E0800373E99 /* noop.mm in Sources */ = {isa = PBXBuildFile; fileRef = 413A229C1FE18E0400373E99 /* noop.mm */; }; > 413A23EF1FE18E0800373E99 /* race_checker.h in Headers */ = {isa = PBXBuildFile; fileRef = 413A229D1FE18E0500373E99 /* race_checker.h */; }; > 413A23F11FE18E0800373E99 /* sslfingerprint.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413A229F1FE18E0600373E99 /* sslfingerprint.cc */; }; > 413A23F21FE18E0800373E99 /* bytebuffer.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413A22A01FE18E0600373E99 /* bytebuffer.cc */; }; >@@ -412,11 +401,8 @@ > 413E676D2169854B00EF37ED /* RTCWrappedNativeVideoEncoder.mm in Sources */ = {isa = PBXBuildFile; fileRef = 417954002169851F0028266B /* RTCWrappedNativeVideoEncoder.mm */; }; > 413E67722169863B00EF37ED /* temporal_layers_checker.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413E676E2169863A00EF37ED /* 
temporal_layers_checker.cc */; }; > 413E67732169863B00EF37ED /* libvpx_interface.h in Headers */ = {isa = PBXBuildFile; fileRef = 413E676F2169863A00EF37ED /* libvpx_interface.h */; }; >- 413E67742169863B00EF37ED /* vp8_temporal_layers.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413E67702169863B00EF37ED /* vp8_temporal_layers.cc */; }; > 413E67752169863B00EF37ED /* libvpx_interface.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413E67712169863B00EF37ED /* libvpx_interface.cc */; }; > 413E67762169866100EF37ED /* RTCVideoCodecConstants.mm in Sources */ = {isa = PBXBuildFile; fileRef = 417953FF2169851F0028266B /* RTCVideoCodecConstants.mm */; }; >- 413E67792169867A00EF37ED /* channel_receive_proxy.h in Headers */ = {isa = PBXBuildFile; fileRef = 413E67772169867A00EF37ED /* channel_receive_proxy.h */; }; >- 413E677A2169867A00EF37ED /* channel_receive_proxy.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413E67782169867A00EF37ED /* channel_receive_proxy.cc */; }; > 413E677D216986AE00EF37ED /* rtp_header_extension_size.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413E677B216986AD00EF37ED /* rtp_header_extension_size.cc */; }; > 413E677E216986AE00EF37ED /* rtp_header_extension_size.h in Headers */ = {isa = PBXBuildFile; fileRef = 413E677C216986AE00EF37ED /* rtp_header_extension_size.h */; }; > 413E67812169870F00EF37ED /* svc_config.h in Headers */ = {isa = PBXBuildFile; fileRef = 413E677F2169870F00EF37ED /* svc_config.h */; }; >@@ -434,7 +420,6 @@ > 413E679B2169886200EF37ED /* agc2_common.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413E679A2169886200EF37ED /* agc2_common.cc */; }; > 413E67A02169889500EF37ED /* RTCVideoEncoderQpThresholds.h in Headers */ = {isa = PBXBuildFile; fileRef = 413E679C2169889400EF37ED /* RTCVideoEncoderQpThresholds.h */; }; > 413E67A12169889500EF37ED /* RTCRtpFragmentationHeader.h in Headers */ = {isa = PBXBuildFile; fileRef = 413E679D2169889400EF37ED /* RTCRtpFragmentationHeader.h */; }; >- 413E67A22169889500EF37ED /* 
RTCRtpFragmentationHeader.m in Sources */ = {isa = PBXBuildFile; fileRef = 413E679E2169889400EF37ED /* RTCRtpFragmentationHeader.m */; }; > 413E67A32169889500EF37ED /* RTCVideoEncoderQpThresholds.m in Sources */ = {isa = PBXBuildFile; fileRef = 413E679F2169889500EF37ED /* RTCVideoEncoderQpThresholds.m */; }; > 413E67A5216988DC00EF37ED /* video_stream_encoder_observer.cc in Sources */ = {isa = PBXBuildFile; fileRef = 413E67A4216988DC00EF37ED /* video_stream_encoder_observer.cc */; }; > 4140B8201E4E3383007409E6 /* audio_encoder_pcm.cc in Sources */ = {isa = PBXBuildFile; fileRef = 4140B8181E4E3383007409E6 /* audio_encoder_pcm.cc */; }; >@@ -525,7 +510,6 @@ > 4145E4C61EF896D700FCF6E6 /* rtcstatsreport.h in Headers */ = {isa = PBXBuildFile; fileRef = 4145E4C21EF896CE00FCF6E6 /* rtcstatsreport.h */; }; > 4145E4C71EF896D700FCF6E6 /* rtcstats_objects.h in Headers */ = {isa = PBXBuildFile; fileRef = 4145E4C31EF896D200FCF6E6 /* rtcstats_objects.h */; }; > 4145E4C81EF896D700FCF6E6 /* rtcstats.h in Headers */ = {isa = PBXBuildFile; fileRef = 4145E4C41EF896D700FCF6E6 /* rtcstats.h */; }; >- 4145E4CA1EF8CB3200FCF6E6 /* createpeerconnectionfactory.cc in Sources */ = {isa = PBXBuildFile; fileRef = 4145E4C91EF8CB3100FCF6E6 /* createpeerconnectionfactory.cc */; }; > 4145E4CC1EF8CB8B00FCF6E6 /* rtptransport.cc in Sources */ = {isa = PBXBuildFile; fileRef = 4145E4CB1EF8CB8A00FCF6E6 /* rtptransport.cc */; }; > 4145E4CE1EF8CB9500FCF6E6 /* rtptransport.h in Headers */ = {isa = PBXBuildFile; fileRef = 4145E4CD1EF8CB9400FCF6E6 /* rtptransport.h */; }; > 4145E4D11EF8CC2000FCF6E6 /* webrtcvideoengine.cc in Sources */ = {isa = PBXBuildFile; fileRef = 4145E4CF1EF8CC1700FCF6E6 /* webrtcvideoengine.cc */; }; >@@ -561,7 +545,6 @@ > 415F1FAE21272FBA00064CBF /* fixed_digital_level_estimator.h in Headers */ = {isa = PBXBuildFile; fileRef = 41A08BDF21272F84001D5D7B /* fixed_digital_level_estimator.h */; }; > 415F1FAF21272FBA00064CBF /* gain_applier.cc in Sources */ = {isa = PBXBuildFile; fileRef = 
41A08BDE21272F84001D5D7B /* gain_applier.cc */; }; > 415F1FB021272FBA00064CBF /* gain_applier.h in Headers */ = {isa = PBXBuildFile; fileRef = 41A08BE721272F86001D5D7B /* gain_applier.h */; }; >- 415F1FB121272FBA00064CBF /* gain_curve_applier.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41A08BE021272F85001D5D7B /* gain_curve_applier.cc */; }; > 415F1FB221272FBA00064CBF /* interpolated_gain_curve.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41A08BE121272F85001D5D7B /* interpolated_gain_curve.cc */; }; > 415F1FB321272FBA00064CBF /* limiter.h in Headers */ = {isa = PBXBuildFile; fileRef = 41A08BEA21272F86001D5D7B /* limiter.h */; }; > 415F1FB421272FBA00064CBF /* noise_spectrum_estimator.h in Headers */ = {isa = PBXBuildFile; fileRef = 41A08BE221272F85001D5D7B /* noise_spectrum_estimator.h */; }; >@@ -646,9 +629,6 @@ > 416D3BE0212731C200775F09 /* biquad_filter.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41D6B45E212731A0008F9353 /* biquad_filter.cc */; }; > 416D3BE1212731C200775F09 /* biquad_filter.h in Headers */ = {isa = PBXBuildFile; fileRef = 41D6B463212731A2008F9353 /* biquad_filter.h */; }; > 416D3BE2212731C200775F09 /* down_sampler.h in Headers */ = {isa = PBXBuildFile; fileRef = 41D6B460212731A1008F9353 /* down_sampler.h */; }; >- 416D3BE3212731C200775F09 /* fixed_gain_controller.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41D6B465212731A2008F9353 /* fixed_gain_controller.cc */; }; >- 416D3BE4212731C200775F09 /* fixed_gain_controller.h in Headers */ = {isa = PBXBuildFile; fileRef = 41D6B467212731A3008F9353 /* fixed_gain_controller.h */; }; >- 416D3BE5212731C200775F09 /* gain_curve_applier.h in Headers */ = {isa = PBXBuildFile; fileRef = 41D6B469212731A3008F9353 /* gain_curve_applier.h */; }; > 416D3BE6212731C200775F09 /* limiter.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41D6B470212731A6008F9353 /* limiter.cc */; }; > 416D3BE7212731C200775F09 /* noise_level_estimator.cc in Sources */ = {isa = PBXBuildFile; fileRef = 
41D6B461212731A1008F9353 /* noise_level_estimator.cc */; }; > 416D3BE8212731C200775F09 /* noise_spectrum_estimator.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41D6B46A212731A3008F9353 /* noise_spectrum_estimator.cc */; }; >@@ -683,9 +663,7 @@ > 417953CC216982DB0028266B /* RTCCodecSpecificInfoH264.mm in Sources */ = {isa = PBXBuildFile; fileRef = 416225ED216981F200A91C9B /* RTCCodecSpecificInfoH264.mm */; }; > 417953D32169834F0028266B /* channel_send.cc in Sources */ = {isa = PBXBuildFile; fileRef = 417953CD2169834E0028266B /* channel_send.cc */; }; > 417953D42169834F0028266B /* channel_send.h in Headers */ = {isa = PBXBuildFile; fileRef = 417953CE2169834E0028266B /* channel_send.h */; }; >- 417953D52169834F0028266B /* channel_send_proxy.cc in Sources */ = {isa = PBXBuildFile; fileRef = 417953CF2169834E0028266B /* channel_send_proxy.cc */; }; > 417953D62169834F0028266B /* channel_receive.h in Headers */ = {isa = PBXBuildFile; fileRef = 417953D02169834F0028266B /* channel_receive.h */; }; >- 417953D72169834F0028266B /* channel_send_proxy.h in Headers */ = {isa = PBXBuildFile; fileRef = 417953D12169834F0028266B /* channel_send_proxy.h */; }; > 417953D82169834F0028266B /* channel_receive.cc in Sources */ = {isa = PBXBuildFile; fileRef = 417953D22169834F0028266B /* channel_receive.cc */; }; > 417953DB216983910028266B /* metrics.cc in Sources */ = {isa = PBXBuildFile; fileRef = 417953D9216983900028266B /* metrics.cc */; }; > 417953DC216983910028266B /* field_trial.cc in Sources */ = {isa = PBXBuildFile; fileRef = 417953DA216983900028266B /* field_trial.cc */; }; >@@ -752,7 +730,6 @@ > 419241542127376F00634FCF /* basicasyncresolverfactory.cc in Sources */ = {isa = PBXBuildFile; fileRef = 4192414A2127376D00634FCF /* basicasyncresolverfactory.cc */; }; > 419241552127376F00634FCF /* regatheringcontroller.h in Headers */ = {isa = PBXBuildFile; fileRef = 4192414B2127376D00634FCF /* regatheringcontroller.h */; }; > 419241562127376F00634FCF /* fakecandidatepair.h in 
Headers */ = {isa = PBXBuildFile; fileRef = 4192414C2127376E00634FCF /* fakecandidatepair.h */; }; >- 419241572127376F00634FCF /* udptransport.h in Headers */ = {isa = PBXBuildFile; fileRef = 4192414D2127376E00634FCF /* udptransport.h */; }; > 419241582127376F00634FCF /* fakepackettransport.h in Headers */ = {isa = PBXBuildFile; fileRef = 4192414E2127376E00634FCF /* fakepackettransport.h */; }; > 419241592127376F00634FCF /* fakeicetransport.h in Headers */ = {isa = PBXBuildFile; fileRef = 4192414F2127376E00634FCF /* fakeicetransport.h */; }; > 4192415A2127376F00634FCF /* icetransportinternal.h in Headers */ = {isa = PBXBuildFile; fileRef = 419241502127376E00634FCF /* icetransportinternal.h */; }; >@@ -777,7 +754,6 @@ > 419241822127497100634FCF /* bad_variant_access.h in Headers */ = {isa = PBXBuildFile; fileRef = 419241802127497100634FCF /* bad_variant_access.h */; }; > 4192418B212749C800634FCF /* send_time_history.h in Headers */ = {isa = PBXBuildFile; fileRef = 41924187212749C700634FCF /* send_time_history.h */; }; > 4192418C212749C800634FCF /* send_side_congestion_controller.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41924188212749C700634FCF /* send_side_congestion_controller.cc */; }; >- 4192418D212749C800634FCF /* pacer_controller.h in Headers */ = {isa = PBXBuildFile; fileRef = 41924189212749C800634FCF /* pacer_controller.h */; }; > 4192418E212749C800634FCF /* transport_feedback_adapter.cc in Sources */ = {isa = PBXBuildFile; fileRef = 4192418A212749C800634FCF /* transport_feedback_adapter.cc */; }; > 419241902127581000634FCF /* fec_private_tables_bursty.cc in Sources */ = {isa = PBXBuildFile; fileRef = 4192418F2127581000634FCF /* fec_private_tables_bursty.cc */; }; > 419241982127586500634FCF /* rnn_vad_weights.h in Headers */ = {isa = PBXBuildFile; fileRef = 419241932127586400634FCF /* rnn_vad_weights.h */; }; >@@ -891,8 +867,6 @@ > 419C83251FE245280040C30F /* rtc_event_rtp_packet_incoming.h in Headers */ = {isa = PBXBuildFile; fileRef = 
419C83231FE245280040C30F /* rtc_event_rtp_packet_incoming.h */; }; > 419C832E1FE245CD0040C30F /* pacer.h in Headers */ = {isa = PBXBuildFile; fileRef = 419C83281FE245CB0040C30F /* pacer.h */; }; > 419C83311FE245CD0040C30F /* interval_budget.h in Headers */ = {isa = PBXBuildFile; fileRef = 419C832B1FE245CC0040C30F /* interval_budget.h */; }; >- 419C83341FE245EA0040C30F /* time_interval.h in Headers */ = {isa = PBXBuildFile; fileRef = 419C83321FE245E90040C30F /* time_interval.h */; }; >- 419C83351FE245EA0040C30F /* time_interval.cc in Sources */ = {isa = PBXBuildFile; fileRef = 419C83331FE245E90040C30F /* time_interval.cc */; }; > 419C833C1FE246230040C30F /* audio_encoder_g722_config.h in Headers */ = {isa = PBXBuildFile; fileRef = 419C83371FE246220040C30F /* audio_encoder_g722_config.h */; }; > 419C833D1FE246230040C30F /* audio_decoder_g722.h in Headers */ = {isa = PBXBuildFile; fileRef = 419C83381FE246220040C30F /* audio_decoder_g722.h */; }; > 419C833E1FE246230040C30F /* audio_encoder_g722.cc in Sources */ = {isa = PBXBuildFile; fileRef = 419C83391FE246220040C30F /* audio_encoder_g722.cc */; }; >@@ -965,7 +939,6 @@ > 419C83D91FE247F20040C30F /* packettransportinternal.h in Headers */ = {isa = PBXBuildFile; fileRef = 419C83D01FE247EF0040C30F /* packettransportinternal.h */; }; > 419C83DA1FE247F20040C30F /* packettransportinternal.cc in Sources */ = {isa = PBXBuildFile; fileRef = 419C83D11FE247F00040C30F /* packettransportinternal.cc */; }; > 419C83DB1FE247F20040C30F /* portinterface.cc in Sources */ = {isa = PBXBuildFile; fileRef = 419C83D21FE247F00040C30F /* portinterface.cc */; }; >- 419C83DC1FE247F20040C30F /* udptransport.cc in Sources */ = {isa = PBXBuildFile; fileRef = 419C83D31FE247F00040C30F /* udptransport.cc */; }; > 419C83DD1FE247F20040C30F /* dtlstransport.h in Headers */ = {isa = PBXBuildFile; fileRef = 419C83D41FE247F10040C30F /* dtlstransport.h */; }; > 419C83DE1FE247F20040C30F /* dtlstransportinternal.cc in Sources */ = {isa = PBXBuildFile; fileRef 
= 419C83D51FE247F10040C30F /* dtlstransportinternal.cc */; }; > 419C83DF1FE247F20040C30F /* dtlstransportinternal.h in Headers */ = {isa = PBXBuildFile; fileRef = 419C83D61FE247F10040C30F /* dtlstransportinternal.h */; }; >@@ -1078,8 +1051,6 @@ > 41A08BD021272EE2001D5D7B /* gain_controller2.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41A08BCE21272EE1001D5D7B /* gain_controller2.cc */; }; > 41A08BD321272EFA001D5D7B /* round_robin_packet_queue.h in Headers */ = {isa = PBXBuildFile; fileRef = 41A08BD121272EFA001D5D7B /* round_robin_packet_queue.h */; }; > 41A08BD421272EFA001D5D7B /* round_robin_packet_queue.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41A08BD221272EFA001D5D7B /* round_robin_packet_queue.cc */; }; >- 41A08BD721272F1D001D5D7B /* congestion_window_pushback_controller.h in Headers */ = {isa = PBXBuildFile; fileRef = 41A08BD521272F1C001D5D7B /* congestion_window_pushback_controller.h */; }; >- 41A08BD821272F1D001D5D7B /* congestion_window_pushback_controller.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41A08BD621272F1C001D5D7B /* congestion_window_pushback_controller.cc */; }; > 41A391731EFC447C00C4516A /* sha1-altivec.c in Sources */ = {isa = PBXBuildFile; fileRef = 41A3914F1EFC446E00C4516A /* sha1-altivec.c */; }; > 41A391741EFC447C00C4516A /* sha1.c in Sources */ = {isa = PBXBuildFile; fileRef = 41A391501EFC446E00C4516A /* sha1.c */; }; > 41A391751EFC447C00C4516A /* sha256.c in Sources */ = {isa = PBXBuildFile; fileRef = 41A391511EFC446E00C4516A /* sha256.c */; }; >@@ -1368,7 +1339,6 @@ > 41E02CA62127352D00C27CD6 /* acknowledged_bitrate_estimator.h in Headers */ = {isa = PBXBuildFile; fileRef = 41E02C922127352A00C27CD6 /* acknowledged_bitrate_estimator.h */; }; > 41E02CA72127352D00C27CD6 /* probe_bitrate_estimator.h in Headers */ = {isa = PBXBuildFile; fileRef = 41E02C932127352A00C27CD6 /* probe_bitrate_estimator.h */; }; > 41E02CA82127352D00C27CD6 /* acknowledged_bitrate_estimator.cc in Sources */ = {isa = PBXBuildFile; fileRef = 
41E02C942127352A00C27CD6 /* acknowledged_bitrate_estimator.cc */; }; >- 41E02CA92127352D00C27CD6 /* goog_cc_factory.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41E02C952127352A00C27CD6 /* goog_cc_factory.cc */; }; > 41E02CAA2127352D00C27CD6 /* goog_cc_network_control.h in Headers */ = {isa = PBXBuildFile; fileRef = 41E02C962127352A00C27CD6 /* goog_cc_network_control.h */; }; > 41E02CAB2127352D00C27CD6 /* delay_based_bwe.h in Headers */ = {isa = PBXBuildFile; fileRef = 41E02C972127352A00C27CD6 /* delay_based_bwe.h */; }; > 41E02CAC2127352D00C27CD6 /* bitrate_estimator.h in Headers */ = {isa = PBXBuildFile; fileRef = 41E02C982127352B00C27CD6 /* bitrate_estimator.h */; }; >@@ -1593,6 +1563,65 @@ > 41F9BFE42051DE1800ABF0B9 /* sessiondescription.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41F9BFDE2051DE1700ABF0B9 /* sessiondescription.cc */; }; > 41F9BFE92051DEEA00ABF0B9 /* turnportfactory.h in Headers */ = {isa = PBXBuildFile; fileRef = 41F9BFE72051DEE900ABF0B9 /* turnportfactory.h */; }; > 41F9BFEA2051DEEA00ABF0B9 /* turnportfactory.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41F9BFE82051DEEA00ABF0B9 /* turnportfactory.cc */; }; >+ 41FCBB0E21B1F53700A5DF27 /* create_peerconnection_factory.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB0921B1F53000A5DF27 /* create_peerconnection_factory.cc */; }; >+ 41FCBB0F21B1F53700A5DF27 /* media_transport_interface.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB0A21B1F53100A5DF27 /* media_transport_interface.h */; }; >+ 41FCBB1021B1F53700A5DF27 /* scoped_refptr.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB0B21B1F53200A5DF27 /* scoped_refptr.h */; }; >+ 41FCBB1121B1F53700A5DF27 /* media_transport_interface.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB0C21B1F53200A5DF27 /* media_transport_interface.cc */; }; >+ 41FCBB1221B1F53700A5DF27 /* create_peerconnection_factory.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB0D21B1F53600A5DF27 /* create_peerconnection_factory.h */; 
}; >+ 41FCBB1521B1F7AA00A5DF27 /* moving_average.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB1321B1F7AA00A5DF27 /* moving_average.cc */; }; >+ 41FCBB1621B1F7AA00A5DF27 /* moving_average.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB1421B1F7AA00A5DF27 /* moving_average.h */; }; >+ 41FCBB1B21B1F7CD00A5DF27 /* builtin_video_bitrate_allocator_factory.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB1721B1F7CC00A5DF27 /* builtin_video_bitrate_allocator_factory.cc */; }; >+ 41FCBB1C21B1F7CD00A5DF27 /* encoded_image.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB1821B1F7CC00A5DF27 /* encoded_image.h */; }; >+ 41FCBB1D21B1F7CD00A5DF27 /* encoded_image.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB1921B1F7CC00A5DF27 /* encoded_image.cc */; }; >+ 41FCBB1E21B1F7CD00A5DF27 /* builtin_video_bitrate_allocator_factory.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB1A21B1F7CD00A5DF27 /* builtin_video_bitrate_allocator_factory.h */; }; >+ 41FCBB2221B1F82F00A5DF27 /* throw_delegate.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB2021B1F82E00A5DF27 /* throw_delegate.cc */; }; >+ 41FCBB2321B1F82F00A5DF27 /* throw_delegate.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB2121B1F82F00A5DF27 /* throw_delegate.h */; }; >+ 41FCBB2721B1F87E00A5DF27 /* sent_packet.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB2521B1F87E00A5DF27 /* sent_packet.cc */; }; >+ 41FCBB2821B1F87E00A5DF27 /* sent_packet.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB2621B1F87E00A5DF27 /* sent_packet.h */; }; >+ 41FCBB2E21B1F8B700A5DF27 /* cryptooptions.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB2A21B1F8B600A5DF27 /* cryptooptions.h */; }; >+ 41FCBB2F21B1F8B700A5DF27 /* framedecryptorinterface.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB2B21B1F8B600A5DF27 /* framedecryptorinterface.h */; }; >+ 41FCBB3021B1F8B700A5DF27 /* frameencryptorinterface.h in Headers */ = {isa = PBXBuildFile; fileRef = 
41FCBB2C21B1F8B600A5DF27 /* frameencryptorinterface.h */; }; >+ 41FCBB3121B1F8B700A5DF27 /* cryptooptions.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB2D21B1F8B600A5DF27 /* cryptooptions.cc */; }; >+ 41FCBB3821B1F8FC00A5DF27 /* control_handler.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB3421B1F8FB00A5DF27 /* control_handler.h */; }; >+ 41FCBB3921B1F8FC00A5DF27 /* control_handler.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB3521B1F8FC00A5DF27 /* control_handler.cc */; }; >+ 41FCBB3C21B1FD4000A5DF27 /* match.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB3A21B1FD3F00A5DF27 /* match.cc */; }; >+ 41FCBB3D21B1FD4000A5DF27 /* match.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB3B21B1FD3F00A5DF27 /* match.h */; }; >+ 41FCBB4021B1FDB800A5DF27 /* congestion_window_pushback_controller.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB3E21B1FDB800A5DF27 /* congestion_window_pushback_controller.h */; }; >+ 41FCBB4121B1FDB800A5DF27 /* congestion_window_pushback_controller.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB3F21B1FDB800A5DF27 /* congestion_window_pushback_controller.cc */; }; >+ 41FCBB4621B1FDEF00A5DF27 /* mdns_message.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB4221B1FDEE00A5DF27 /* mdns_message.h */; }; >+ 41FCBB4721B1FDEF00A5DF27 /* icecredentialsiterator.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB4321B1FDEE00A5DF27 /* icecredentialsiterator.cc */; }; >+ 41FCBB4821B1FDEF00A5DF27 /* mdns_message.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB4421B1FDEF00A5DF27 /* mdns_message.cc */; }; >+ 41FCBB4921B1FDEF00A5DF27 /* icecredentialsiterator.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB4521B1FDEF00A5DF27 /* icecredentialsiterator.h */; }; >+ 41FCBB5121B1FE4600A5DF27 /* cpu_speed_experiment.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB4B21B1FE4400A5DF27 /* cpu_speed_experiment.h */; }; >+ 41FCBB5221B1FE4600A5DF27 /* jitter_upper_bound_experiment.cc in 
Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB4C21B1FE4400A5DF27 /* jitter_upper_bound_experiment.cc */; }; >+ 41FCBB5321B1FE4600A5DF27 /* cpu_speed_experiment.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB4D21B1FE4500A5DF27 /* cpu_speed_experiment.cc */; }; >+ 41FCBB5421B1FE4600A5DF27 /* jitter_upper_bound_experiment.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB4E21B1FE4500A5DF27 /* jitter_upper_bound_experiment.h */; }; >+ 41FCBB5521B1FE4600A5DF27 /* normalize_simulcast_size_experiment.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB4F21B1FE4500A5DF27 /* normalize_simulcast_size_experiment.h */; }; >+ 41FCBB5621B1FE4600A5DF27 /* normalize_simulcast_size_experiment.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB5021B1FE4500A5DF27 /* normalize_simulcast_size_experiment.cc */; }; >+ 41FCBB5921B1FE6E00A5DF27 /* frame_dumping_decoder.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB5721B1FE6D00A5DF27 /* frame_dumping_decoder.cc */; }; >+ 41FCBB5A21B1FE6E00A5DF27 /* frame_dumping_decoder.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB5821B1FE6E00A5DF27 /* frame_dumping_decoder.h */; }; >+ 41FCBB5C21B1FEA300A5DF27 /* bitrate_controller.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB5B21B1FEA300A5DF27 /* bitrate_controller.cc */; }; >+ 41FCBB5F21B1FEC400A5DF27 /* goog_cc_factory.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB5D21B1FEC400A5DF27 /* goog_cc_factory.h */; }; >+ 41FCBB6021B1FEC400A5DF27 /* goog_cc_factory.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB5E21B1FEC400A5DF27 /* goog_cc_factory.cc */; }; >+ 41FCBB6421B1FEF600A5DF27 /* create_vp8_temporal_layers.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB6121B1FEF500A5DF27 /* create_vp8_temporal_layers.h */; }; >+ 41FCBB6521B1FEF600A5DF27 /* vp8_temporal_layers.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB6221B1FEF500A5DF27 /* vp8_temporal_layers.h */; }; >+ 41FCBB6621B1FEF600A5DF27 /* create_vp8_temporal_layers.cc 
in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB6321B1FEF500A5DF27 /* create_vp8_temporal_layers.cc */; }; >+ 41FCBB6921B1FF1B00A5DF27 /* loss_based_bandwidth_estimation.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB6721B1FF1B00A5DF27 /* loss_based_bandwidth_estimation.h */; }; >+ 41FCBB6A21B1FF1B00A5DF27 /* loss_based_bandwidth_estimation.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB6821B1FF1B00A5DF27 /* loss_based_bandwidth_estimation.cc */; }; >+ 41FCBB6D21B1FF3A00A5DF27 /* key_derivation.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB6B21B1FF3900A5DF27 /* key_derivation.cc */; }; >+ 41FCBB6E21B1FF3A00A5DF27 /* key_derivation.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB6C21B1FF3A00A5DF27 /* key_derivation.h */; }; >+ 41FCBB7121B1FF7400A5DF27 /* hdr_metadata.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB6F21B1FF7400A5DF27 /* hdr_metadata.cc */; }; >+ 41FCBB7221B1FF7400A5DF27 /* hdr_metadata.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB7021B1FF7400A5DF27 /* hdr_metadata.h */; }; >+ 41FCBB7521B1FFA400A5DF27 /* cocoa_threading.mm in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB7321B1FFA400A5DF27 /* cocoa_threading.mm */; }; >+ 41FCBB7621B1FFA400A5DF27 /* cocoa_threading.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB7421B1FFA400A5DF27 /* cocoa_threading.h */; }; >+ 41FCBB7921B1FFC000A5DF27 /* ascii.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB7721B1FFBF00A5DF27 /* ascii.h */; }; >+ 41FCBB7A21B1FFC000A5DF27 /* ascii.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB7821B1FFC000A5DF27 /* ascii.cc */; }; >+ 41FCBB7D21B1FFF600A5DF27 /* openssl_key_derivation_hkdf.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB7B21B1FFF500A5DF27 /* openssl_key_derivation_hkdf.cc */; }; >+ 41FCBB7E21B1FFF600A5DF27 /* openssl_key_derivation_hkdf.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB7C21B1FFF500A5DF27 /* openssl_key_derivation_hkdf.h */; }; >+ 41FCBB8021B2001200A5DF27 /* 
string_view.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB7F21B2001100A5DF27 /* string_view.cc */; }; >+ 41FCBB8421B2007B00A5DF27 /* memutil.cc in Sources */ = {isa = PBXBuildFile; fileRef = 41FCBB8221B2007B00A5DF27 /* memutil.cc */; }; >+ 41FCBB8521B2007B00A5DF27 /* memutil.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FCBB8321B2007B00A5DF27 /* memutil.h */; }; >+ 41FCBB8921B200FD00A5DF27 /* RTCRtpFragmentationHeader.m in Sources */ = {isa = PBXBuildFile; fileRef = 413E679E2169889400EF37ED /* RTCRtpFragmentationHeader.m */; }; > 41FE8651215C530700B62C07 /* nasm-listfmt.c in Sources */ = {isa = PBXBuildFile; fileRef = 41FE8650215C530600B62C07 /* nasm-listfmt.c */; }; > 41FE8653215C532500B62C07 /* nasm-eval.c in Sources */ = {isa = PBXBuildFile; fileRef = 41FE8645215C526000B62C07 /* nasm-eval.c */; }; > 41FE8654215C532500B62C07 /* nasm-pp.c in Sources */ = {isa = PBXBuildFile; fileRef = 41D12426215C52410036B057 /* nasm-pp.c */; }; >@@ -1685,8 +1714,6 @@ > 5C0885331E4A99D200403995 /* rdb.h in Headers */ = {isa = PBXBuildFile; fileRef = 5C0885211E4A99D200403995 /* rdb.h */; }; > 5C0885341E4A99D200403995 /* rdbx.h in Headers */ = {isa = PBXBuildFile; fileRef = 5C0885221E4A99D200403995 /* rdbx.h */; }; > 5C0885351E4A99D200403995 /* stat.h in Headers */ = {isa = PBXBuildFile; fileRef = 5C0885231E4A99D200403995 /* stat.h */; }; >- 5C088C101E4AA44400403995 /* bundlefilter.cc in Sources */ = {isa = PBXBuildFile; fileRef = 5C63FA3A1E41761F002CA531 /* bundlefilter.cc */; }; >- 5C088C111E4AA44400403995 /* bundlefilter.h in Headers */ = {isa = PBXBuildFile; fileRef = 5C63FA3B1E41761F002CA531 /* bundlefilter.h */; }; > 5C088C121E4AA44400403995 /* channel.cc in Sources */ = {isa = PBXBuildFile; fileRef = 5C63FA3D1E41761F002CA531 /* channel.cc */; }; > 5C088C131E4AA44400403995 /* channel.h in Headers */ = {isa = PBXBuildFile; fileRef = 5C63FA3E1E41761F002CA531 /* channel.h */; }; > 5C088C141E4AA44400403995 /* channelmanager.cc in Sources */ = {isa = PBXBuildFile; 
fileRef = 5C63FA401E41761F002CA531 /* channelmanager.cc */; }; >@@ -1980,7 +2007,6 @@ > 5C4B4C1B1E431F75002651C8 /* i420_buffer_pool.cc in Sources */ = {isa = PBXBuildFile; fileRef = 5C4B4C121E431F75002651C8 /* i420_buffer_pool.cc */; }; > 5C4B4C1C1E431F75002651C8 /* incoming_video_stream.cc in Sources */ = {isa = PBXBuildFile; fileRef = 5C4B4C131E431F75002651C8 /* incoming_video_stream.cc */; }; > 5C4B4C1E1E431F75002651C8 /* video_frame_buffer.cc in Sources */ = {isa = PBXBuildFile; fileRef = 5C4B4C151E431F75002651C8 /* video_frame_buffer.cc */; }; >- 5C4B4C1F1E431F75002651C8 /* video_frame.cc in Sources */ = {isa = PBXBuildFile; fileRef = 5C4B4C161E431F75002651C8 /* video_frame.cc */; }; > 5C4B4C201E431F75002651C8 /* video_render_frames.cc in Sources */ = {isa = PBXBuildFile; fileRef = 5C4B4C171E431F75002651C8 /* video_render_frames.cc */; }; > 5C4B4C211E431F75002651C8 /* video_render_frames.h in Headers */ = {isa = PBXBuildFile; fileRef = 5C4B4C181E431F75002651C8 /* video_render_frames.h */; }; > 5C4B4C591E431F9C002651C8 /* audio_converter.cc in Sources */ = {isa = PBXBuildFile; fileRef = 5C4B4C241E431F9C002651C8 /* audio_converter.cc */; }; >@@ -2345,8 +2371,6 @@ > 5CDD837A1E439A3500621E92 /* frame_dropper.h in Headers */ = {isa = PBXBuildFile; fileRef = 5CDD836C1E439A3500621E92 /* frame_dropper.h */; }; > 5CDD837B1E439A3500621E92 /* ivf_file_writer.cc in Sources */ = {isa = PBXBuildFile; fileRef = 5CDD836D1E439A3500621E92 /* ivf_file_writer.cc */; }; > 5CDD837C1E439A3500621E92 /* ivf_file_writer.h in Headers */ = {isa = PBXBuildFile; fileRef = 5CDD836E1E439A3500621E92 /* ivf_file_writer.h */; }; >- 5CDD837D1E439A3500621E92 /* moving_average.cc in Sources */ = {isa = PBXBuildFile; fileRef = 5CDD836F1E439A3500621E92 /* moving_average.cc */; }; >- 5CDD837E1E439A3500621E92 /* moving_average.h in Headers */ = {isa = PBXBuildFile; fileRef = 5CDD83701E439A3500621E92 /* moving_average.h */; }; > 5CDD83811E439A3500621E92 /* quality_scaler.cc in Sources */ = {isa = 
PBXBuildFile; fileRef = 5CDD83731E439A3500621E92 /* quality_scaler.cc */; }; > 5CDD83821E439A3500621E92 /* quality_scaler.h in Headers */ = {isa = PBXBuildFile; fileRef = 5CDD83741E439A3500621E92 /* quality_scaler.h */; }; > 5CDD83851E439A3500621E92 /* vp8_header_parser.cc in Sources */ = {isa = PBXBuildFile; fileRef = 5CDD83771E439A3500621E92 /* vp8_header_parser.cc */; }; >@@ -2453,8 +2477,6 @@ > 5CDD85101E43B1EA00621E92 /* audio_coding_module.cc in Sources */ = {isa = PBXBuildFile; fileRef = 5CDD84F91E43B1EA00621E92 /* audio_coding_module.cc */; }; > 5CDD85121E43B1EA00621E92 /* call_statistics.cc in Sources */ = {isa = PBXBuildFile; fileRef = 5CDD84FB1E43B1EA00621E92 /* call_statistics.cc */; }; > 5CDD85131E43B1EA00621E92 /* call_statistics.h in Headers */ = {isa = PBXBuildFile; fileRef = 5CDD84FC1E43B1EA00621E92 /* call_statistics.h */; }; >- 5CDD85151E43B1EA00621E92 /* codec_manager.cc in Sources */ = {isa = PBXBuildFile; fileRef = 5CDD84FE1E43B1EA00621E92 /* codec_manager.cc */; }; >- 5CDD85161E43B1EA00621E92 /* codec_manager.h in Headers */ = {isa = PBXBuildFile; fileRef = 5CDD84FF1E43B1EA00621E92 /* codec_manager.h */; }; > 5CDD85181E43B1EA00621E92 /* rent_a_codec.cc in Sources */ = {isa = PBXBuildFile; fileRef = 5CDD85011E43B1EA00621E92 /* rent_a_codec.cc */; }; > 5CDD85191E43B1EA00621E92 /* rent_a_codec.h in Headers */ = {isa = PBXBuildFile; fileRef = 5CDD85021E43B1EA00621E92 /* rent_a_codec.h */; }; > 5CDD852F1E43B39C00621E92 /* bitrate_allocator.cc in Sources */ = {isa = PBXBuildFile; fileRef = 5CDD851C1E43B39C00621E92 /* bitrate_allocator.cc */; }; >@@ -2841,8 +2863,6 @@ > 5CDD8A4B1E43BFB300621E92 /* random_vector.h in Headers */ = {isa = PBXBuildFile; fileRef = 5CDD89E21E43BFB300621E92 /* random_vector.h */; }; > 5CDD8A4D1E43BFB300621E92 /* red_payload_splitter.cc in Sources */ = {isa = PBXBuildFile; fileRef = 5CDD89E41E43BFB300621E92 /* red_payload_splitter.cc */; }; > 5CDD8A4E1E43BFB300621E92 /* red_payload_splitter.h in Headers */ = {isa = 
PBXBuildFile; fileRef = 5CDD89E51E43BFB300621E92 /* red_payload_splitter.h */; }; >- 5CDD8A4F1E43BFB300621E92 /* rtcp.cc in Sources */ = {isa = PBXBuildFile; fileRef = 5CDD89E61E43BFB300621E92 /* rtcp.cc */; }; >- 5CDD8A501E43BFB300621E92 /* rtcp.h in Headers */ = {isa = PBXBuildFile; fileRef = 5CDD89E71E43BFB300621E92 /* rtcp.h */; }; > 5CDD8A511E43BFB300621E92 /* statistics_calculator.cc in Sources */ = {isa = PBXBuildFile; fileRef = 5CDD89E81E43BFB300621E92 /* statistics_calculator.cc */; }; > 5CDD8A521E43BFB300621E92 /* statistics_calculator.h in Headers */ = {isa = PBXBuildFile; fileRef = 5CDD89E91E43BFB300621E92 /* statistics_calculator.h */; }; > 5CDD8A541E43BFB300621E92 /* sync_buffer.cc in Sources */ = {isa = PBXBuildFile; fileRef = 5CDD89EB1E43BFB300621E92 /* sync_buffer.cc */; }; >@@ -3798,7 +3818,6 @@ > 413A21441FE18D4C00373E99 /* testbase64.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = testbase64.h; path = rtc_base/testbase64.h; sourceTree = "<group>"; }; > 413A21451FE18D4C00373E99 /* function_view.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = function_view.h; path = rtc_base/function_view.h; sourceTree = "<group>"; }; > 413A21471FE18D4D00373E99 /* format_macros.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = format_macros.h; path = rtc_base/format_macros.h; sourceTree = "<group>"; }; >- 413A21481FE18D4D00373E99 /* unixfilesystem.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = unixfilesystem.h; path = rtc_base/unixfilesystem.h; sourceTree = "<group>"; }; > 413A21491FE18D4E00373E99 /* testechoserver.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = testechoserver.cc; path = rtc_base/testechoserver.cc; sourceTree = "<group>"; }; > 413A214A1FE18D4E00373E99 /* bitrateallocationstrategy.cc */ = {isa = PBXFileReference; 
fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = bitrateallocationstrategy.cc; path = rtc_base/bitrateallocationstrategy.cc; sourceTree = "<group>"; }; > 413A214B1FE18D4E00373E99 /* proxy_unittest.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = proxy_unittest.cc; path = rtc_base/proxy_unittest.cc; sourceTree = "<group>"; }; >@@ -3872,7 +3891,6 @@ > 413A21981FE18D8A00373E99 /* atomicops.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = atomicops.h; path = rtc_base/atomicops.h; sourceTree = "<group>"; }; > 413A219A1FE18D8B00373E99 /* openssladapter.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = openssladapter.cc; path = rtc_base/openssladapter.cc; sourceTree = "<group>"; }; > 413A219B1FE18D8B00373E99 /* socketadapters.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = socketadapters.cc; path = rtc_base/socketadapters.cc; sourceTree = "<group>"; }; >- 413A219D1FE18D8D00373E99 /* pathutils_unittest.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = pathutils_unittest.cc; path = rtc_base/pathutils_unittest.cc; sourceTree = "<group>"; }; > 413A219E1FE18D8D00373E99 /* task_queue_win.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = task_queue_win.cc; path = rtc_base/task_queue_win.cc; sourceTree = "<group>"; }; > 413A219F1FE18D8E00373E99 /* testutils.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = testutils.cc; path = rtc_base/testutils.cc; sourceTree = "<group>"; }; > 413A21A21FE18D8F00373E99 /* sigslot_unittest.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = sigslot_unittest.cc; path = rtc_base/sigslot_unittest.cc; sourceTree = "<group>"; }; >@@ -3907,7 +3925,6 @@ > 
413A21C31FE18D9E00373E99 /* callback.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = callback.h; path = rtc_base/callback.h; sourceTree = "<group>"; }; > 413A21C51FE18D9F00373E99 /* helpers_unittest.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = helpers_unittest.cc; path = rtc_base/helpers_unittest.cc; sourceTree = "<group>"; }; > 413A21C61FE18DA000373E99 /* natsocketfactory.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = natsocketfactory.cc; path = rtc_base/natsocketfactory.cc; sourceTree = "<group>"; }; >- 413A21C71FE18DA100373E99 /* optionsfile_unittest.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = optionsfile_unittest.cc; path = rtc_base/optionsfile_unittest.cc; sourceTree = "<group>"; }; > 413A21C81FE18DA100373E99 /* nat_unittest.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = nat_unittest.cc; path = rtc_base/nat_unittest.cc; sourceTree = "<group>"; }; > 413A21C91FE18DA200373E99 /* platform_thread_types.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = platform_thread_types.h; path = rtc_base/platform_thread_types.h; sourceTree = "<group>"; }; > 413A21CA1FE18DA300373E99 /* platform_thread.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = platform_thread.cc; path = rtc_base/platform_thread.cc; sourceTree = "<group>"; }; >@@ -3936,18 +3953,15 @@ > 413A21E41FE18DB000373E99 /* macutils.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = macutils.cc; path = rtc_base/macutils.cc; sourceTree = "<group>"; }; > 413A21E51FE18DB100373E99 /* task_queue.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = task_queue.h; path = rtc_base/task_queue.h; 
sourceTree = "<group>"; }; > 413A21E71FE18DB100373E99 /* bitbuffer.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = bitbuffer.cc; path = rtc_base/bitbuffer.cc; sourceTree = "<group>"; }; >- 413A21E81FE18DB200373E99 /* pathutils.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = pathutils.cc; path = rtc_base/pathutils.cc; sourceTree = "<group>"; }; > 413A21E91FE18DB200373E99 /* physicalsocketserver.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = physicalsocketserver.cc; path = rtc_base/physicalsocketserver.cc; sourceTree = "<group>"; }; > 413A21EA1FE18DB300373E99 /* byteorder.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = byteorder.h; path = rtc_base/byteorder.h; sourceTree = "<group>"; }; > 413A21EB1FE18DB300373E99 /* sslstreamadapter_unittest.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = sslstreamadapter_unittest.cc; path = rtc_base/sslstreamadapter_unittest.cc; sourceTree = "<group>"; }; > 413A21EE1FE18DB400373E99 /* thread.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = thread.h; path = rtc_base/thread.h; sourceTree = "<group>"; }; > 413A21EF1FE18DB500373E99 /* crc32.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = crc32.cc; path = rtc_base/crc32.cc; sourceTree = "<group>"; }; > 413A21F01FE18DB500373E99 /* memory_usage_unittest.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = memory_usage_unittest.cc; path = rtc_base/memory_usage_unittest.cc; sourceTree = "<group>"; }; >- 413A21F11FE18DB600373E99 /* optionsfile.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = optionsfile.h; path = rtc_base/optionsfile.h; sourceTree = "<group>"; }; > 
413A21F31FE18DB700373E99 /* swap_queue.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = swap_queue.h; path = rtc_base/swap_queue.h; sourceTree = "<group>"; }; > 413A21F41FE18DB700373E99 /* dscp.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = dscp.h; path = rtc_base/dscp.h; sourceTree = "<group>"; }; > 413A21F51FE18DB700373E99 /* timeutils.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = timeutils.h; path = rtc_base/timeutils.h; sourceTree = "<group>"; }; >- 413A21F61FE18DB800373E99 /* optionsfile.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = optionsfile.cc; path = rtc_base/optionsfile.cc; sourceTree = "<group>"; }; > 413A21F71FE18DB800373E99 /* type_traits.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = type_traits.h; path = rtc_base/type_traits.h; sourceTree = "<group>"; }; > 413A21F81FE18DB900373E99 /* random.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = random.h; path = rtc_base/random.h; sourceTree = "<group>"; }; > 413A21F91FE18DB900373E99 /* socketaddresspair.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = socketaddresspair.cc; path = rtc_base/socketaddresspair.cc; sourceTree = "<group>"; }; >@@ -3967,7 +3981,6 @@ > 413A22091FE18DC000373E99 /* bind.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = bind.h; path = rtc_base/bind.h; sourceTree = "<group>"; }; > 413A220A1FE18DC000373E99 /* gunit_prod.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = gunit_prod.h; path = rtc_base/gunit_prod.h; sourceTree = "<group>"; }; > 413A220B1FE18DC100373E99 /* asyncsocket.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = 
asyncsocket.h; path = rtc_base/asyncsocket.h; sourceTree = "<group>"; }; >- 413A220C1FE18DC100373E99 /* noop.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = noop.cc; path = rtc_base/noop.cc; sourceTree = "<group>"; }; > 413A220D1FE18DC200373E99 /* rate_limiter.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = rate_limiter.h; path = rtc_base/rate_limiter.h; sourceTree = "<group>"; }; > 413A220E1FE18DC200373E99 /* socketadapters.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = socketadapters.h; path = rtc_base/socketadapters.h; sourceTree = "<group>"; }; > 413A220F1FE18DC200373E99 /* ratetracker.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = ratetracker.cc; path = rtc_base/ratetracker.cc; sourceTree = "<group>"; }; >@@ -4002,7 +4015,6 @@ > 413A222F1FE18DD200373E99 /* bind_unittest.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = bind_unittest.cc; path = rtc_base/bind_unittest.cc; sourceTree = "<group>"; }; > 413A22311FE18DD300373E99 /* stringutils_unittest.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = stringutils_unittest.cc; path = rtc_base/stringutils_unittest.cc; sourceTree = "<group>"; }; > 413A22321FE18DD300373E99 /* compile_assert_c.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = compile_assert_c.h; path = rtc_base/compile_assert_c.h; sourceTree = "<group>"; }; >- 413A22331FE18DD400373E99 /* unixfilesystem.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = unixfilesystem.cc; path = rtc_base/unixfilesystem.cc; sourceTree = "<group>"; }; > 413A22341FE18DD400373E99 /* ssladapter_unittest.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = 
ssladapter_unittest.cc; path = rtc_base/ssladapter_unittest.cc; sourceTree = "<group>"; }; > 413A22351FE18DD500373E99 /* criticalsection.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = criticalsection.cc; path = rtc_base/criticalsection.cc; sourceTree = "<group>"; }; > 413A22361FE18DD500373E99 /* byteorder_unittest.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = byteorder_unittest.cc; path = rtc_base/byteorder_unittest.cc; sourceTree = "<group>"; }; >@@ -4012,7 +4024,6 @@ > 413A223A1FE18DD800373E99 /* rate_limiter.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = rate_limiter.cc; path = rtc_base/rate_limiter.cc; sourceTree = "<group>"; }; > 413A223B1FE18DD900373E99 /* buffer.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = buffer.h; path = rtc_base/buffer.h; sourceTree = "<group>"; }; > 413A223C1FE18DD900373E99 /* event_tracer.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = event_tracer.cc; path = rtc_base/event_tracer.cc; sourceTree = "<group>"; }; >- 413A223D1FE18DD900373E99 /* pathutils.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = pathutils.h; path = rtc_base/pathutils.h; sourceTree = "<group>"; }; > 413A223F1FE18DDA00373E99 /* sslroots.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = sslroots.h; path = rtc_base/sslroots.h; sourceTree = "<group>"; }; > 413A22401FE18DDB00373E99 /* openssldigest.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = openssldigest.cc; path = rtc_base/openssldigest.cc; sourceTree = "<group>"; }; > 413A22421FE18DDB00373E99 /* httpcommon.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = httpcommon.h; path = 
rtc_base/httpcommon.h; sourceTree = "<group>"; }; >@@ -4043,7 +4054,6 @@ > 413A225D1FE18DE600373E99 /* atomicops_unittest.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = atomicops_unittest.cc; path = rtc_base/atomicops_unittest.cc; sourceTree = "<group>"; }; > 413A225E1FE18DE700373E99 /* stringize_macros.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = stringize_macros.h; path = rtc_base/stringize_macros.h; sourceTree = "<group>"; }; > 413A22611FE18DE800373E99 /* cpu_time_unittest.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = cpu_time_unittest.cc; path = rtc_base/cpu_time_unittest.cc; sourceTree = "<group>"; }; >- 413A22621FE18DE900373E99 /* fileutils.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = fileutils.cc; path = rtc_base/fileutils.cc; sourceTree = "<group>"; }; > 413A22631FE18DEA00373E99 /* opensslstreamadapter.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = opensslstreamadapter.h; path = rtc_base/opensslstreamadapter.h; sourceTree = "<group>"; }; > 413A22641FE18DEB00373E99 /* event.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = event.cc; path = rtc_base/event.cc; sourceTree = "<group>"; }; > 413A22651FE18DEB00373E99 /* crc32_unittest.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = crc32_unittest.cc; path = rtc_base/crc32_unittest.cc; sourceTree = "<group>"; }; >@@ -4051,7 +4061,6 @@ > 413A22671FE18DEC00373E99 /* sigslottester.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = sigslottester.h; path = rtc_base/sigslottester.h; sourceTree = "<group>"; }; > 413A22681FE18DEE00373E99 /* network_unittest.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = 
sourcecode.cpp.cpp; name = network_unittest.cc; path = rtc_base/network_unittest.cc; sourceTree = "<group>"; }; > 413A22691FE18DEF00373E99 /* flags.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = flags.h; path = rtc_base/flags.h; sourceTree = "<group>"; }; >- 413A226A1FE18DF000373E99 /* fileutils.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = fileutils.h; path = rtc_base/fileutils.h; sourceTree = "<group>"; }; > 413A226B1FE18DF000373E99 /* logsinks.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = logsinks.cc; path = rtc_base/logsinks.cc; sourceTree = "<group>"; }; > 413A226C1FE18DF000373E99 /* sslidentity_unittest.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = sslidentity_unittest.cc; path = rtc_base/sslidentity_unittest.cc; sourceTree = "<group>"; }; > 413A226D1FE18DF100373E99 /* nullsocketserver.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = nullsocketserver.h; path = rtc_base/nullsocketserver.h; sourceTree = "<group>"; }; >@@ -4081,7 +4090,6 @@ > 413A22861FE18DFC00373E99 /* testutils.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = testutils.h; path = rtc_base/testutils.h; sourceTree = "<group>"; }; > 413A22871FE18DFD00373E99 /* filerotatingstream.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = filerotatingstream.cc; path = rtc_base/filerotatingstream.cc; sourceTree = "<group>"; }; > 413A22881FE18DFD00373E99 /* rtccertificate_unittest.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = rtccertificate_unittest.cc; path = rtc_base/rtccertificate_unittest.cc; sourceTree = "<group>"; }; >- 413A228A1FE18DFE00373E99 /* thread_darwin.mm */ = {isa = PBXFileReference; fileEncoding = 4; 
lastKnownFileType = sourcecode.cpp.objcpp; name = thread_darwin.mm; path = rtc_base/thread_darwin.mm; sourceTree = "<group>"; }; > 413A228B1FE18DFE00373E99 /* task_queue_gcd.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = task_queue_gcd.cc; path = rtc_base/task_queue_gcd.cc; sourceTree = "<group>"; }; > 413A228C1FE18DFE00373E99 /* firewallsocketserver.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = firewallsocketserver.h; path = rtc_base/firewallsocketserver.h; sourceTree = "<group>"; }; > 413A228E1FE18DFF00373E99 /* timeutils_unittest.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = timeutils_unittest.cc; path = rtc_base/timeutils_unittest.cc; sourceTree = "<group>"; }; >@@ -4097,7 +4105,6 @@ > 413A22991FE18E0300373E99 /* location.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = location.cc; path = rtc_base/location.cc; sourceTree = "<group>"; }; > 413A229A1FE18E0400373E99 /* location.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = location.h; path = rtc_base/location.h; sourceTree = "<group>"; }; > 413A229B1FE18E0400373E99 /* ipaddress.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = ipaddress.cc; path = rtc_base/ipaddress.cc; sourceTree = "<group>"; }; >- 413A229C1FE18E0400373E99 /* noop.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; name = noop.mm; path = rtc_base/noop.mm; sourceTree = "<group>"; }; > 413A229D1FE18E0500373E99 /* race_checker.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = race_checker.h; path = rtc_base/race_checker.h; sourceTree = "<group>"; }; > 413A229E1FE18E0500373E99 /* rate_limiter_unittest.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = 
sourcecode.cpp.cpp; name = rate_limiter_unittest.cc; path = rtc_base/rate_limiter_unittest.cc; sourceTree = "<group>"; }; > 413A229F1FE18E0600373E99 /* sslfingerprint.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = sslfingerprint.cc; path = rtc_base/sslfingerprint.cc; sourceTree = "<group>"; }; >@@ -4165,16 +4172,12 @@ > 413E67642169854500EF37ED /* RTCVideoEncoderVP8.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = RTCVideoEncoderVP8.mm; sourceTree = "<group>"; }; > 413E676E2169863A00EF37ED /* temporal_layers_checker.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = temporal_layers_checker.cc; path = codecs/vp8/temporal_layers_checker.cc; sourceTree = "<group>"; }; > 413E676F2169863A00EF37ED /* libvpx_interface.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = libvpx_interface.h; path = codecs/vp8/libvpx_interface.h; sourceTree = "<group>"; }; >- 413E67702169863B00EF37ED /* vp8_temporal_layers.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = vp8_temporal_layers.cc; path = codecs/vp8/vp8_temporal_layers.cc; sourceTree = "<group>"; }; > 413E67712169863B00EF37ED /* libvpx_interface.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = libvpx_interface.cc; path = codecs/vp8/libvpx_interface.cc; sourceTree = "<group>"; }; >- 413E67772169867A00EF37ED /* channel_receive_proxy.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = channel_receive_proxy.h; sourceTree = "<group>"; }; >- 413E67782169867A00EF37ED /* channel_receive_proxy.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = channel_receive_proxy.cc; sourceTree = "<group>"; }; > 413E677B216986AD00EF37ED /* rtp_header_extension_size.cc */ = {isa = 
PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = rtp_header_extension_size.cc; sourceTree = "<group>"; }; > 413E677C216986AE00EF37ED /* rtp_header_extension_size.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = rtp_header_extension_size.h; sourceTree = "<group>"; }; > 413E677F2169870F00EF37ED /* svc_config.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = svc_config.h; path = codecs/vp9/svc_config.h; sourceTree = "<group>"; }; > 413E67802169870F00EF37ED /* svc_config.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = svc_config.cc; path = codecs/vp9/svc_config.cc; sourceTree = "<group>"; }; > 413E67832169877D00EF37ED /* RTCVideoEncoderSettings.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = RTCVideoEncoderSettings.h; sourceTree = "<group>"; }; >- 413E67842169877D00EF37ED /* RTCVideoEncoderSettings.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = RTCVideoEncoderSettings.mm; sourceTree = "<group>"; }; > 413E6788216987B900EF37ED /* RTCCVPixelBuffer.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = RTCCVPixelBuffer.h; sourceTree = "<group>"; }; > 413E6789216987B900EF37ED /* RTCCVPixelBuffer.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = RTCCVPixelBuffer.mm; sourceTree = "<group>"; }; > 413E678C216987DB00EF37ED /* RTCVideoFrame.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = RTCVideoFrame.h; sourceTree = "<group>"; }; >@@ -4308,7 +4311,6 @@ > 4145E4C21EF896CE00FCF6E6 /* rtcstatsreport.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = rtcstatsreport.h; path = stats/rtcstatsreport.h; sourceTree = "<group>"; }; > 
4145E4C31EF896D200FCF6E6 /* rtcstats_objects.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = rtcstats_objects.h; path = stats/rtcstats_objects.h; sourceTree = "<group>"; }; > 4145E4C41EF896D700FCF6E6 /* rtcstats.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = rtcstats.h; path = stats/rtcstats.h; sourceTree = "<group>"; }; >- 4145E4C91EF8CB3100FCF6E6 /* createpeerconnectionfactory.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = createpeerconnectionfactory.cc; sourceTree = "<group>"; }; > 4145E4CB1EF8CB8A00FCF6E6 /* rtptransport.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = rtptransport.cc; sourceTree = "<group>"; }; > 4145E4CD1EF8CB9400FCF6E6 /* rtptransport.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = rtptransport.h; sourceTree = "<group>"; }; > 4145E4CF1EF8CC1700FCF6E6 /* webrtcvideoengine.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = webrtcvideoengine.cc; path = engine/webrtcvideoengine.cc; sourceTree = "<group>"; }; >@@ -4455,9 +4457,7 @@ > 417953B3216982420028266B /* RTCDefaultVideoDecoderFactory.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = RTCDefaultVideoDecoderFactory.h; sourceTree = "<group>"; }; > 417953CD2169834E0028266B /* channel_send.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = channel_send.cc; sourceTree = "<group>"; }; > 417953CE2169834E0028266B /* channel_send.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = channel_send.h; sourceTree = "<group>"; }; >- 417953CF2169834E0028266B /* channel_send_proxy.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = channel_send_proxy.cc; 
sourceTree = "<group>"; }; > 417953D02169834F0028266B /* channel_receive.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = channel_receive.h; sourceTree = "<group>"; }; >- 417953D12169834F0028266B /* channel_send_proxy.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = channel_send_proxy.h; sourceTree = "<group>"; }; > 417953D22169834F0028266B /* channel_receive.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = channel_receive.cc; sourceTree = "<group>"; }; > 417953D9216983900028266B /* metrics.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = metrics.cc; path = source/metrics.cc; sourceTree = "<group>"; }; > 417953DA216983900028266B /* field_trial.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = field_trial.cc; path = source/field_trial.cc; sourceTree = "<group>"; }; >@@ -4536,7 +4536,6 @@ > 4192414A2127376D00634FCF /* basicasyncresolverfactory.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = basicasyncresolverfactory.cc; sourceTree = "<group>"; }; > 4192414B2127376D00634FCF /* regatheringcontroller.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = regatheringcontroller.h; sourceTree = "<group>"; }; > 4192414C2127376E00634FCF /* fakecandidatepair.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = fakecandidatepair.h; sourceTree = "<group>"; }; >- 4192414D2127376E00634FCF /* udptransport.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = udptransport.h; sourceTree = "<group>"; }; > 4192414E2127376E00634FCF /* fakepackettransport.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = fakepackettransport.h; sourceTree = "<group>"; 
}; > 4192414F2127376E00634FCF /* fakeicetransport.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = fakeicetransport.h; sourceTree = "<group>"; }; > 419241502127376E00634FCF /* icetransportinternal.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = icetransportinternal.h; sourceTree = "<group>"; }; >@@ -4561,7 +4560,6 @@ > 419241802127497100634FCF /* bad_variant_access.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = bad_variant_access.h; path = types/bad_variant_access.h; sourceTree = "<group>"; }; > 41924187212749C700634FCF /* send_time_history.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = send_time_history.h; sourceTree = "<group>"; }; > 41924188212749C700634FCF /* send_side_congestion_controller.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = send_side_congestion_controller.cc; sourceTree = "<group>"; }; >- 41924189212749C800634FCF /* pacer_controller.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = pacer_controller.h; sourceTree = "<group>"; }; > 4192418A212749C800634FCF /* transport_feedback_adapter.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = transport_feedback_adapter.cc; sourceTree = "<group>"; }; > 4192418F2127581000634FCF /* fec_private_tables_bursty.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = fec_private_tables_bursty.cc; sourceTree = "<group>"; }; > 419241932127586400634FCF /* rnn_vad_weights.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = rnn_vad_weights.h; path = src/rnn_vad_weights.h; sourceTree = "<group>"; }; >@@ -4724,8 +4722,6 @@ > 419C83231FE245280040C30F /* rtc_event_rtp_packet_incoming.h */ = {isa = PBXFileReference; 
fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = rtc_event_rtp_packet_incoming.h; path = rtc_event_log/events/rtc_event_rtp_packet_incoming.h; sourceTree = "<group>"; }; > 419C83281FE245CB0040C30F /* pacer.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = pacer.h; sourceTree = "<group>"; }; > 419C832B1FE245CC0040C30F /* interval_budget.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = interval_budget.h; sourceTree = "<group>"; }; >- 419C83321FE245E90040C30F /* time_interval.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = time_interval.h; sourceTree = "<group>"; }; >- 419C83331FE245E90040C30F /* time_interval.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = time_interval.cc; sourceTree = "<group>"; }; > 419C83371FE246220040C30F /* audio_encoder_g722_config.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = audio_encoder_g722_config.h; path = audio_codecs/g722/audio_encoder_g722_config.h; sourceTree = "<group>"; }; > 419C83381FE246220040C30F /* audio_decoder_g722.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = audio_decoder_g722.h; path = audio_codecs/g722/audio_decoder_g722.h; sourceTree = "<group>"; }; > 419C83391FE246220040C30F /* audio_encoder_g722.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = audio_encoder_g722.cc; path = audio_codecs/g722/audio_encoder_g722.cc; sourceTree = "<group>"; }; >@@ -4798,7 +4794,6 @@ > 419C83D01FE247EF0040C30F /* packettransportinternal.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = packettransportinternal.h; sourceTree = "<group>"; }; > 419C83D11FE247F00040C30F /* packettransportinternal.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = 
sourcecode.cpp.cpp; path = packettransportinternal.cc; sourceTree = "<group>"; }; > 419C83D21FE247F00040C30F /* portinterface.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = portinterface.cc; sourceTree = "<group>"; }; >- 419C83D31FE247F00040C30F /* udptransport.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = udptransport.cc; sourceTree = "<group>"; }; > 419C83D41FE247F10040C30F /* dtlstransport.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = dtlstransport.h; sourceTree = "<group>"; }; > 419C83D51FE247F10040C30F /* dtlstransportinternal.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = dtlstransportinternal.cc; sourceTree = "<group>"; }; > 419C83D61FE247F10040C30F /* dtlstransportinternal.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = dtlstransportinternal.h; sourceTree = "<group>"; }; >@@ -4907,15 +4902,12 @@ > 41A08BCE21272EE1001D5D7B /* gain_controller2.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = gain_controller2.cc; sourceTree = "<group>"; }; > 41A08BD121272EFA001D5D7B /* round_robin_packet_queue.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = round_robin_packet_queue.h; sourceTree = "<group>"; }; > 41A08BD221272EFA001D5D7B /* round_robin_packet_queue.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = round_robin_packet_queue.cc; sourceTree = "<group>"; }; >- 41A08BD521272F1C001D5D7B /* congestion_window_pushback_controller.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = congestion_window_pushback_controller.h; sourceTree = "<group>"; }; >- 41A08BD621272F1C001D5D7B /* congestion_window_pushback_controller.cc */ = {isa = PBXFileReference; 
fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = congestion_window_pushback_controller.cc; sourceTree = "<group>"; }; > 41A08BDA21272F84001D5D7B /* compute_interpolated_gain_curve.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = compute_interpolated_gain_curve.h; sourceTree = "<group>"; }; > 41A08BDB21272F84001D5D7B /* adaptive_digital_gain_applier.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = adaptive_digital_gain_applier.h; sourceTree = "<group>"; }; > 41A08BDC21272F84001D5D7B /* adaptive_mode_level_estimator.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = adaptive_mode_level_estimator.h; sourceTree = "<group>"; }; > 41A08BDD21272F84001D5D7B /* vector_float_frame.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = vector_float_frame.h; sourceTree = "<group>"; }; > 41A08BDE21272F84001D5D7B /* gain_applier.cc */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.cpp; path = gain_applier.cc; sourceTree = "<group>"; }; > 41A08BDF21272F84001D5D7B /* fixed_digital_level_estimator.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = fixed_digital_level_estimator.h; sourceTree = "<group>"; }; >- 41A08BE021272F85001D5D7B /* gain_curve_applier.cc */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.cpp; path = gain_curve_applier.cc; sourceTree = "<group>"; }; > 41A08BE121272F85001D5D7B /* interpolated_gain_curve.cc */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.cpp; path = interpolated_gain_curve.cc; sourceTree = "<group>"; }; > 41A08BE221272F85001D5D7B /* noise_spectrum_estimator.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = noise_spectrum_estimator.h; sourceTree = "<group>"; }; > 41A08BE321272F85001D5D7B /* vad_with_level.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = vad_with_level.h; sourceTree = "<group>"; }; >@@ -5176,11 +5168,8 
@@ > 41D6B462212731A1008F9353 /* adaptive_mode_level_estimator_agc.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = adaptive_mode_level_estimator_agc.h; sourceTree = "<group>"; }; > 41D6B463212731A2008F9353 /* biquad_filter.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = biquad_filter.h; sourceTree = "<group>"; }; > 41D6B464212731A2008F9353 /* adaptive_agc.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = adaptive_agc.h; sourceTree = "<group>"; }; >- 41D6B465212731A2008F9353 /* fixed_gain_controller.cc */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.cpp; path = fixed_gain_controller.cc; sourceTree = "<group>"; }; > 41D6B466212731A2008F9353 /* vector_float_frame.cc */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.cpp; path = vector_float_frame.cc; sourceTree = "<group>"; }; >- 41D6B467212731A3008F9353 /* fixed_gain_controller.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = fixed_gain_controller.h; sourceTree = "<group>"; }; > 41D6B468212731A3008F9353 /* saturation_protector.cc */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.cpp; path = saturation_protector.cc; sourceTree = "<group>"; }; >- 41D6B469212731A3008F9353 /* gain_curve_applier.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = gain_curve_applier.h; sourceTree = "<group>"; }; > 41D6B46A212731A3008F9353 /* noise_spectrum_estimator.cc */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.cpp; path = noise_spectrum_estimator.cc; sourceTree = "<group>"; }; > 41D6B46B212731A3008F9353 /* saturation_protector.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = saturation_protector.h; sourceTree = "<group>"; }; > 41D6B46C212731A4008F9353 /* vad_with_level.cc */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.cpp; path = vad_with_level.cc; sourceTree = "<group>"; }; >@@ -5231,7 +5220,6 @@ > 
41E02C922127352A00C27CD6 /* acknowledged_bitrate_estimator.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = acknowledged_bitrate_estimator.h; sourceTree = "<group>"; }; > 41E02C932127352A00C27CD6 /* probe_bitrate_estimator.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = probe_bitrate_estimator.h; sourceTree = "<group>"; }; > 41E02C942127352A00C27CD6 /* acknowledged_bitrate_estimator.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = acknowledged_bitrate_estimator.cc; sourceTree = "<group>"; }; >- 41E02C952127352A00C27CD6 /* goog_cc_factory.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = goog_cc_factory.cc; sourceTree = "<group>"; }; > 41E02C962127352A00C27CD6 /* goog_cc_network_control.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = goog_cc_network_control.h; sourceTree = "<group>"; }; > 41E02C972127352A00C27CD6 /* delay_based_bwe.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = delay_based_bwe.h; sourceTree = "<group>"; }; > 41E02C982127352B00C27CD6 /* bitrate_estimator.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = bitrate_estimator.h; sourceTree = "<group>"; }; >@@ -5260,7 +5248,6 @@ > 41E02CCB212735B700C27CD6 /* bitrate_settings.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = bitrate_settings.h; sourceTree = "<group>"; }; > 41E02CD12127360600C27CD6 /* sample_counter.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = sample_counter.h; path = rtc_base/numerics/sample_counter.h; sourceTree = "<group>"; }; > 41E02CD22127360700C27CD6 /* sample_counter.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = sample_counter.cc; 
path = rtc_base/numerics/sample_counter.cc; sourceTree = "<group>"; }; >- 41E02CD62127363C00C27CD6 /* pacer_controller.cc */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.cpp; path = pacer_controller.cc; sourceTree = "<group>"; }; > 41E02CD72127363C00C27CD6 /* transport_feedback_adapter.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = transport_feedback_adapter.h; sourceTree = "<group>"; }; > 41E02CD82127363D00C27CD6 /* send_time_history.cc */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.cpp; path = send_time_history.cc; sourceTree = "<group>"; }; > 41EA53A41EFC2BFD002FF04C /* hmac.c */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.c; path = hmac.c; sourceTree = "<group>"; }; >@@ -5465,8 +5452,8 @@ > 41F9BFB32051C93600ABF0B9 /* simulcast_encoder_adapter.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = simulcast_encoder_adapter.h; path = engine/simulcast_encoder_adapter.h; sourceTree = "<group>"; }; > 41F9BFC42051CAE400ABF0B9 /* signal_processing_library.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = signal_processing_library.h; path = signal_processing/include/signal_processing_library.h; sourceTree = "<group>"; }; > 41F9BFC62051DCE800ABF0B9 /* webrtc_vad.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = webrtc_vad.h; path = vad/include/webrtc_vad.h; sourceTree = "<group>"; }; >- 41F9BFC92051DDA200ABF0B9 /* alr_experiment.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = alr_experiment.cc; path = rtc_base/experiments/alr_experiment.cc; sourceTree = "<group>"; }; >- 41F9BFCA2051DDA200ABF0B9 /* alr_experiment.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = alr_experiment.h; path = rtc_base/experiments/alr_experiment.h; sourceTree = "<group>"; }; >+ 41F9BFC92051DDA200ABF0B9 
/* alr_experiment.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = alr_experiment.cc; sourceTree = "<group>"; }; >+ 41F9BFCA2051DDA200ABF0B9 /* alr_experiment.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = alr_experiment.h; sourceTree = "<group>"; }; > 41F9BFCD2051DDE400ABF0B9 /* fft_buffer.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = fft_buffer.h; sourceTree = "<group>"; }; > 41F9BFCE2051DDE400ABF0B9 /* matrix_buffer.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = matrix_buffer.cc; sourceTree = "<group>"; }; > 41F9BFCF2051DDE400ABF0B9 /* vector_buffer.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = vector_buffer.cc; sourceTree = "<group>"; }; >@@ -5481,6 +5468,65 @@ > 41F9BFDE2051DE1700ABF0B9 /* sessiondescription.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = sessiondescription.cc; sourceTree = "<group>"; }; > 41F9BFE72051DEE900ABF0B9 /* turnportfactory.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = turnportfactory.h; sourceTree = "<group>"; }; > 41F9BFE82051DEEA00ABF0B9 /* turnportfactory.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = turnportfactory.cc; sourceTree = "<group>"; }; >+ 41FCBB0921B1F53000A5DF27 /* create_peerconnection_factory.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = create_peerconnection_factory.cc; sourceTree = "<group>"; }; >+ 41FCBB0A21B1F53100A5DF27 /* media_transport_interface.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = media_transport_interface.h; sourceTree = "<group>"; }; >+ 41FCBB0B21B1F53200A5DF27 /* scoped_refptr.h */ = {isa = PBXFileReference; 
fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = scoped_refptr.h; sourceTree = "<group>"; }; >+ 41FCBB0C21B1F53200A5DF27 /* media_transport_interface.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = media_transport_interface.cc; sourceTree = "<group>"; }; >+ 41FCBB0D21B1F53600A5DF27 /* create_peerconnection_factory.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = create_peerconnection_factory.h; sourceTree = "<group>"; }; >+ 41FCBB1321B1F7AA00A5DF27 /* moving_average.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = moving_average.cc; path = rtc_base/numerics/moving_average.cc; sourceTree = "<group>"; }; >+ 41FCBB1421B1F7AA00A5DF27 /* moving_average.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = moving_average.h; path = rtc_base/numerics/moving_average.h; sourceTree = "<group>"; }; >+ 41FCBB1721B1F7CC00A5DF27 /* builtin_video_bitrate_allocator_factory.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = builtin_video_bitrate_allocator_factory.cc; sourceTree = "<group>"; }; >+ 41FCBB1821B1F7CC00A5DF27 /* encoded_image.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = encoded_image.h; sourceTree = "<group>"; }; >+ 41FCBB1921B1F7CC00A5DF27 /* encoded_image.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = encoded_image.cc; sourceTree = "<group>"; }; >+ 41FCBB1A21B1F7CD00A5DF27 /* builtin_video_bitrate_allocator_factory.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = builtin_video_bitrate_allocator_factory.h; sourceTree = "<group>"; }; >+ 41FCBB2021B1F82E00A5DF27 /* throw_delegate.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = throw_delegate.cc; 
sourceTree = "<group>"; }; >+ 41FCBB2121B1F82F00A5DF27 /* throw_delegate.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = throw_delegate.h; sourceTree = "<group>"; }; >+ 41FCBB2521B1F87E00A5DF27 /* sent_packet.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = sent_packet.cc; sourceTree = "<group>"; }; >+ 41FCBB2621B1F87E00A5DF27 /* sent_packet.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = sent_packet.h; sourceTree = "<group>"; }; >+ 41FCBB2A21B1F8B600A5DF27 /* cryptooptions.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = cryptooptions.h; sourceTree = "<group>"; }; >+ 41FCBB2B21B1F8B600A5DF27 /* framedecryptorinterface.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = framedecryptorinterface.h; sourceTree = "<group>"; }; >+ 41FCBB2C21B1F8B600A5DF27 /* frameencryptorinterface.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = frameencryptorinterface.h; sourceTree = "<group>"; }; >+ 41FCBB2D21B1F8B600A5DF27 /* cryptooptions.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = cryptooptions.cc; sourceTree = "<group>"; }; >+ 41FCBB3421B1F8FB00A5DF27 /* control_handler.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = control_handler.h; sourceTree = "<group>"; }; >+ 41FCBB3521B1F8FC00A5DF27 /* control_handler.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = control_handler.cc; sourceTree = "<group>"; }; >+ 41FCBB3A21B1FD3F00A5DF27 /* match.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = match.cc; sourceTree = "<group>"; }; >+ 41FCBB3B21B1FD3F00A5DF27 /* match.h */ = {isa = PBXFileReference; fileEncoding = 4; 
lastKnownFileType = sourcecode.c.h; path = match.h; sourceTree = "<group>"; }; >+ 41FCBB3E21B1FDB800A5DF27 /* congestion_window_pushback_controller.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = congestion_window_pushback_controller.h; sourceTree = "<group>"; }; >+ 41FCBB3F21B1FDB800A5DF27 /* congestion_window_pushback_controller.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = congestion_window_pushback_controller.cc; sourceTree = "<group>"; }; >+ 41FCBB4221B1FDEE00A5DF27 /* mdns_message.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = mdns_message.h; sourceTree = "<group>"; }; >+ 41FCBB4321B1FDEE00A5DF27 /* icecredentialsiterator.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = icecredentialsiterator.cc; sourceTree = "<group>"; }; >+ 41FCBB4421B1FDEF00A5DF27 /* mdns_message.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = mdns_message.cc; sourceTree = "<group>"; }; >+ 41FCBB4521B1FDEF00A5DF27 /* icecredentialsiterator.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = icecredentialsiterator.h; sourceTree = "<group>"; }; >+ 41FCBB4B21B1FE4400A5DF27 /* cpu_speed_experiment.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = cpu_speed_experiment.h; sourceTree = "<group>"; }; >+ 41FCBB4C21B1FE4400A5DF27 /* jitter_upper_bound_experiment.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = jitter_upper_bound_experiment.cc; sourceTree = "<group>"; }; >+ 41FCBB4D21B1FE4500A5DF27 /* cpu_speed_experiment.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = cpu_speed_experiment.cc; sourceTree = "<group>"; }; >+ 41FCBB4E21B1FE4500A5DF27 /* jitter_upper_bound_experiment.h 
*/ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = jitter_upper_bound_experiment.h; sourceTree = "<group>"; }; >+ 41FCBB4F21B1FE4500A5DF27 /* normalize_simulcast_size_experiment.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = normalize_simulcast_size_experiment.h; sourceTree = "<group>"; }; >+ 41FCBB5021B1FE4500A5DF27 /* normalize_simulcast_size_experiment.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = normalize_simulcast_size_experiment.cc; sourceTree = "<group>"; }; >+ 41FCBB5721B1FE6D00A5DF27 /* frame_dumping_decoder.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = frame_dumping_decoder.cc; sourceTree = "<group>"; }; >+ 41FCBB5821B1FE6E00A5DF27 /* frame_dumping_decoder.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = frame_dumping_decoder.h; sourceTree = "<group>"; }; >+ 41FCBB5B21B1FEA300A5DF27 /* bitrate_controller.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = bitrate_controller.cc; path = bitrate_controller/bitrate_controller.cc; sourceTree = "<group>"; }; >+ 41FCBB5D21B1FEC400A5DF27 /* goog_cc_factory.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = goog_cc_factory.h; sourceTree = "<group>"; }; >+ 41FCBB5E21B1FEC400A5DF27 /* goog_cc_factory.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = goog_cc_factory.cc; sourceTree = "<group>"; }; >+ 41FCBB6121B1FEF500A5DF27 /* create_vp8_temporal_layers.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = create_vp8_temporal_layers.h; path = video_codecs/create_vp8_temporal_layers.h; sourceTree = "<group>"; }; >+ 41FCBB6221B1FEF500A5DF27 /* vp8_temporal_layers.h */ = {isa = PBXFileReference; fileEncoding = 4; 
lastKnownFileType = sourcecode.c.h; name = vp8_temporal_layers.h; path = video_codecs/vp8_temporal_layers.h; sourceTree = "<group>"; }; >+ 41FCBB6321B1FEF500A5DF27 /* create_vp8_temporal_layers.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = create_vp8_temporal_layers.cc; path = video_codecs/create_vp8_temporal_layers.cc; sourceTree = "<group>"; }; >+ 41FCBB6721B1FF1B00A5DF27 /* loss_based_bandwidth_estimation.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = loss_based_bandwidth_estimation.h; path = bitrate_controller/loss_based_bandwidth_estimation.h; sourceTree = "<group>"; }; >+ 41FCBB6821B1FF1B00A5DF27 /* loss_based_bandwidth_estimation.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = loss_based_bandwidth_estimation.cc; path = bitrate_controller/loss_based_bandwidth_estimation.cc; sourceTree = "<group>"; }; >+ 41FCBB6B21B1FF3900A5DF27 /* key_derivation.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = key_derivation.cc; path = rtc_base/key_derivation.cc; sourceTree = "<group>"; }; >+ 41FCBB6C21B1FF3A00A5DF27 /* key_derivation.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = key_derivation.h; path = rtc_base/key_derivation.h; sourceTree = "<group>"; }; >+ 41FCBB6F21B1FF7400A5DF27 /* hdr_metadata.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = hdr_metadata.cc; sourceTree = "<group>"; }; >+ 41FCBB7021B1FF7400A5DF27 /* hdr_metadata.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = hdr_metadata.h; sourceTree = "<group>"; }; >+ 41FCBB7321B1FFA400A5DF27 /* cocoa_threading.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = cocoa_threading.mm; sourceTree = "<group>"; }; >+ 41FCBB7421B1FFA400A5DF27 
/* cocoa_threading.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = cocoa_threading.h; sourceTree = "<group>"; }; >+ 41FCBB7721B1FFBF00A5DF27 /* ascii.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = ascii.h; sourceTree = "<group>"; }; >+ 41FCBB7821B1FFC000A5DF27 /* ascii.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = ascii.cc; sourceTree = "<group>"; }; >+ 41FCBB7B21B1FFF500A5DF27 /* openssl_key_derivation_hkdf.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = openssl_key_derivation_hkdf.cc; path = rtc_base/openssl_key_derivation_hkdf.cc; sourceTree = "<group>"; }; >+ 41FCBB7C21B1FFF500A5DF27 /* openssl_key_derivation_hkdf.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = openssl_key_derivation_hkdf.h; path = rtc_base/openssl_key_derivation_hkdf.h; sourceTree = "<group>"; }; >+ 41FCBB7F21B2001100A5DF27 /* string_view.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = string_view.cc; sourceTree = "<group>"; }; >+ 41FCBB8221B2007B00A5DF27 /* memutil.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = memutil.cc; sourceTree = "<group>"; }; >+ 41FCBB8321B2007B00A5DF27 /* memutil.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = memutil.h; sourceTree = "<group>"; }; >+ 41FCBB8721B200DB00A5DF27 /* RTCVideoEncoderSettings.m */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.objc; path = RTCVideoEncoderSettings.m; sourceTree = "<group>"; }; > 41FE8644215C525F00B62C07 /* genversion.c */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.c; path = genversion.c; sourceTree = "<group>"; }; > 41FE8645215C526000B62C07 /* nasm-eval.c */ = {isa = PBXFileReference; 
fileEncoding = 4; lastKnownFileType = sourcecode.c.c; path = "nasm-eval.c"; sourceTree = "<group>"; }; > 41FE8646215C526000B62C07 /* nasm-eval.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = "nasm-eval.h"; sourceTree = "<group>"; }; >@@ -5887,7 +5933,6 @@ > 5C4B4C121E431F75002651C8 /* i420_buffer_pool.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = i420_buffer_pool.cc; sourceTree = "<group>"; }; > 5C4B4C131E431F75002651C8 /* incoming_video_stream.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = incoming_video_stream.cc; sourceTree = "<group>"; }; > 5C4B4C151E431F75002651C8 /* video_frame_buffer.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = video_frame_buffer.cc; sourceTree = "<group>"; }; >- 5C4B4C161E431F75002651C8 /* video_frame.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = video_frame.cc; sourceTree = "<group>"; }; > 5C4B4C171E431F75002651C8 /* video_render_frames.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = video_render_frames.cc; sourceTree = "<group>"; }; > 5C4B4C181E431F75002651C8 /* video_render_frames.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = video_render_frames.h; sourceTree = "<group>"; }; > 5C4B4C241E431F9C002651C8 /* audio_converter.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = audio_converter.cc; sourceTree = "<group>"; }; >@@ -6028,8 +6073,6 @@ > 5C63F9D91E4174F6002CA531 /* turnserver.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = turnserver.cc; sourceTree = "<group>"; }; > 5C63F9DA1E4174F6002CA531 /* turnserver.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = 
turnserver.h; sourceTree = "<group>"; }; > 5C63F9DB1E4174F6002CA531 /* udpport.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = udpport.h; sourceTree = "<group>"; }; >- 5C63FA3A1E41761F002CA531 /* bundlefilter.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = bundlefilter.cc; sourceTree = "<group>"; }; >- 5C63FA3B1E41761F002CA531 /* bundlefilter.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = bundlefilter.h; sourceTree = "<group>"; }; > 5C63FA3D1E41761F002CA531 /* channel.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = channel.cc; sourceTree = "<group>"; }; > 5C63FA3E1E41761F002CA531 /* channel.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = channel.h; sourceTree = "<group>"; }; > 5C63FA401E41761F002CA531 /* channelmanager.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = channelmanager.cc; sourceTree = "<group>"; }; >@@ -6234,8 +6277,6 @@ > 5CDD836C1E439A3500621E92 /* frame_dropper.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = frame_dropper.h; sourceTree = "<group>"; }; > 5CDD836D1E439A3500621E92 /* ivf_file_writer.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = ivf_file_writer.cc; sourceTree = "<group>"; }; > 5CDD836E1E439A3500621E92 /* ivf_file_writer.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = ivf_file_writer.h; sourceTree = "<group>"; }; >- 5CDD836F1E439A3500621E92 /* moving_average.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = moving_average.cc; sourceTree = "<group>"; }; >- 5CDD83701E439A3500621E92 /* moving_average.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = 
sourcecode.c.h; path = moving_average.h; sourceTree = "<group>"; }; > 5CDD83731E439A3500621E92 /* quality_scaler.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = quality_scaler.cc; sourceTree = "<group>"; }; > 5CDD83741E439A3500621E92 /* quality_scaler.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = quality_scaler.h; sourceTree = "<group>"; }; > 5CDD83771E439A3500621E92 /* vp8_header_parser.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = vp8_header_parser.cc; sourceTree = "<group>"; }; >@@ -6346,8 +6387,6 @@ > 5CDD84F91E43B1EA00621E92 /* audio_coding_module.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = audio_coding_module.cc; sourceTree = "<group>"; }; > 5CDD84FB1E43B1EA00621E92 /* call_statistics.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = call_statistics.cc; sourceTree = "<group>"; }; > 5CDD84FC1E43B1EA00621E92 /* call_statistics.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = call_statistics.h; sourceTree = "<group>"; }; >- 5CDD84FE1E43B1EA00621E92 /* codec_manager.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = codec_manager.cc; sourceTree = "<group>"; }; >- 5CDD84FF1E43B1EA00621E92 /* codec_manager.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = codec_manager.h; sourceTree = "<group>"; }; > 5CDD85011E43B1EA00621E92 /* rent_a_codec.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = rent_a_codec.cc; sourceTree = "<group>"; }; > 5CDD85021E43B1EA00621E92 /* rent_a_codec.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = rent_a_codec.h; sourceTree = "<group>"; }; > 5CDD851C1E43B39C00621E92 /* 
bitrate_allocator.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; name = bitrate_allocator.cc; path = call/bitrate_allocator.cc; sourceTree = "<group>"; }; >@@ -6734,8 +6773,6 @@ > 5CDD89E21E43BFB300621E92 /* random_vector.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = random_vector.h; sourceTree = "<group>"; }; > 5CDD89E41E43BFB300621E92 /* red_payload_splitter.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = red_payload_splitter.cc; sourceTree = "<group>"; }; > 5CDD89E51E43BFB300621E92 /* red_payload_splitter.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = red_payload_splitter.h; sourceTree = "<group>"; }; >- 5CDD89E61E43BFB300621E92 /* rtcp.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = rtcp.cc; sourceTree = "<group>"; }; >- 5CDD89E71E43BFB300621E92 /* rtcp.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = rtcp.h; sourceTree = "<group>"; }; > 5CDD89E81E43BFB300621E92 /* statistics_calculator.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = statistics_calculator.cc; sourceTree = "<group>"; }; > 5CDD89E91E43BFB300621E92 /* statistics_calculator.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = statistics_calculator.h; sourceTree = "<group>"; }; > 5CDD89EB1E43BFB300621E92 /* sync_buffer.cc */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = sync_buffer.cc; sourceTree = "<group>"; }; >@@ -7643,6 +7680,12 @@ > 4106D56B216C2B91001E3C40 /* strings */ = { > isa = PBXGroup; > children = ( >+ 41FCBB8121B2005A00A5DF27 /* internal */, >+ 41FCBB7821B1FFC000A5DF27 /* ascii.cc */, >+ 41FCBB7721B1FFBF00A5DF27 /* ascii.h */, >+ 41FCBB3A21B1FD3F00A5DF27 /* match.cc */, >+ 
41FCBB3B21B1FD3F00A5DF27 /* match.h */, >+ 41FCBB7F21B2001100A5DF27 /* string_view.cc */, > 4106D56C216C2BA9001E3C40 /* string_view.h */, > ); > path = strings; >@@ -7713,23 +7756,26 @@ > 41DDB27621267AC100296D47 /* builtin_video_decoder_factory.h */, > 41E02CC22127358700C27CD6 /* builtin_video_encoder_factory.cc */, > 41DDB27721267AC100296D47 /* builtin_video_encoder_factory.h */, >+ 41FCBB6321B1FEF500A5DF27 /* create_vp8_temporal_layers.cc */, >+ 41FCBB6121B1FEF500A5DF27 /* create_vp8_temporal_layers.h */, > 41DDB27A21267AC200296D47 /* sdp_video_format.cc */, > 4145F6191FE1F38B00EB9CAF /* sdp_video_format.h */, > 41DDB27021267AC000296D47 /* video_codec.cc */, > 41DDB27121267AC000296D47 /* video_codec.h */, >+ 41DDB27221267AC000296D47 /* video_decoder.cc */, >+ 4145E48B1EF88B9600FCF6E6 /* video_decoder.h */, > 417953F0216984550028266B /* video_decoder_factory.cc */, > 4145F61A1FE1F38C00EB9CAF /* video_decoder_factory.h */, > 41DDB27421267AC100296D47 /* video_decoder_software_fallback_wrapper.cc */, > 41DDB27821267AC100296D47 /* video_decoder_software_fallback_wrapper.h */, >- 41DDB27221267AC000296D47 /* video_decoder.cc */, >- 4145E48B1EF88B9600FCF6E6 /* video_decoder.h */, >+ 4145F61C1FE1F38D00EB9CAF /* video_encoder.cc */, >+ 4145E48C1EF88B9D00FCF6E6 /* video_encoder.h */, > 41E02CC32127358800C27CD6 /* video_encoder_config.cc */, > 41DDB27B21267AC200296D47 /* video_encoder_config.h */, > 4145F61B1FE1F38D00EB9CAF /* video_encoder_factory.h */, > 41DDB27521267AC100296D47 /* video_encoder_software_fallback_wrapper.cc */, > 41DDB27921267AC200296D47 /* video_encoder_software_fallback_wrapper.h */, >- 4145F61C1FE1F38D00EB9CAF /* video_encoder.cc */, >- 4145E48C1EF88B9D00FCF6E6 /* video_encoder.h */, >+ 41FCBB6221B1FEF500A5DF27 /* vp8_temporal_layers.h */, > ); > name = video_codecs; > sourceTree = "<group>"; >@@ -7790,7 +7836,7 @@ > children = ( > 413A21801FE18D7A00373E99 /* DEPS */, > 419241652127387900634FCF /* experiments */, >- 41F9BFC82051DD8E00ABF0B9 /* 
experiments */, >+ 41FCBB2421B1F85D00A5DF27 /* network */, > 419C83411FE246380040C30F /* numerics */, > 413A21FD1FE18DBB00373E99 /* OWNERS */, > 41DDB267212679AE00296D47 /* strings */, >@@ -7881,8 +7927,6 @@ > 413A22871FE18DFD00373E99 /* filerotatingstream.cc */, > 413A218C1FE18D8300373E99 /* filerotatingstream.h */, > 413A225A1FE18DE500373E99 /* filerotatingstream_unittest.cc */, >- 413A22621FE18DE900373E99 /* fileutils.cc */, >- 413A226A1FE18DF000373E99 /* fileutils.h */, > 413A22171FE18DC600373E99 /* firewallsocketserver.cc */, > 413A228C1FE18DFE00373E99 /* firewallsocketserver.h */, > 413A21611FE18D5B00373E99 /* flags.cc */, >@@ -7907,6 +7951,8 @@ > 413A214E1FE18D5000373E99 /* ipaddress.h */, > 413A216A1FE18D6700373E99 /* ipaddress_unittest.cc */, > 413A22121FE18DC400373E99 /* keep_ref_until_done.h */, >+ 41FCBB6B21B1FF3900A5DF27 /* key_derivation.cc */, >+ 41FCBB6C21B1FF3A00A5DF27 /* key_derivation.h */, > 413A22991FE18E0300373E99 /* location.cc */, > 413A229A1FE18E0400373E99 /* location.h */, > 413A21721FE18D6D00373E99 /* logging.cc */, >@@ -7947,14 +7993,14 @@ > 413A22381FE18DD600373E99 /* networkmonitor.cc */, > 413A22291FE18DCF00373E99 /* networkmonitor.h */, > 413A21691FE18D6500373E99 /* networkroute.h */, >- 413A220C1FE18DC100373E99 /* noop.cc */, >- 413A229C1FE18E0400373E99 /* noop.mm */, > 413A227F1FE18DF900373E99 /* nullsocketserver.cc */, > 413A226D1FE18DF100373E99 /* nullsocketserver.h */, > 413A227C1FE18DF800373E99 /* nullsocketserver_unittest.cc */, > 413A21E21FE18DAF00373E99 /* onetimeevent.h */, > 413A21621FE18D5D00373E99 /* onetimeevent_unittest.cc */, > 413A22751FE18DF600373E99 /* openssl.h */, >+ 41FCBB7B21B1FFF500A5DF27 /* openssl_key_derivation_hkdf.cc */, >+ 41FCBB7C21B1FFF500A5DF27 /* openssl_key_derivation_hkdf.h */, > 413A219A1FE18D8B00373E99 /* openssladapter.cc */, > 413A21DB1FE18DAC00373E99 /* openssladapter.h */, > 413A227E1FE18DF900373E99 /* openssladapter_unittest.cc */, >@@ -7970,12 +8016,6 @@ > 413A22631FE18DEA00373E99 /* 
opensslstreamadapter.h */, > 41DDB25F212679A200296D47 /* opensslutility.cc */, > 41DDB25B2126799F00296D47 /* opensslutility.h */, >- 413A21F61FE18DB800373E99 /* optionsfile.cc */, >- 413A21F11FE18DB600373E99 /* optionsfile.h */, >- 413A21C71FE18DA100373E99 /* optionsfile_unittest.cc */, >- 413A21E81FE18DB200373E99 /* pathutils.cc */, >- 413A223D1FE18DD900373E99 /* pathutils.h */, >- 413A219D1FE18D8D00373E99 /* pathutils_unittest.cc */, > 413A21E91FE18DB200373E99 /* physicalsocketserver.cc */, > 413A22981FE18E0300373E99 /* physicalsocketserver.h */, > 413A227B1FE18DF800373E99 /* physicalsocketserver_unittest.cc */, >@@ -8100,7 +8140,6 @@ > 413A21FC1FE18DBB00373E99 /* thread_checker_impl.cc */, > 413A215E1FE18D5900373E99 /* thread_checker_impl.h */, > 413A21B61FE18D9800373E99 /* thread_checker_unittest.cc */, >- 413A228A1FE18DFE00373E99 /* thread_darwin.mm */, > 413A22261FE18DCD00373E99 /* thread_unittest.cc */, > 413A22281FE18DCE00373E99 /* timestampaligner.cc */, > 413A22011FE18DBD00373E99 /* timestampaligner.h */, >@@ -8111,8 +8150,6 @@ > 413A21AB1FE18D9300373E99 /* trace_event.h */, > 413A21F71FE18DB800373E99 /* type_traits.h */, > 413A224C1FE18DE000373E99 /* unittest_main.cc */, >- 413A22331FE18DD400373E99 /* unixfilesystem.cc */, >- 413A21481FE18D4D00373E99 /* unixfilesystem.h */, > 413A21DA1FE18DAC00373E99 /* virtualsocket_unittest.cc */, > 413A21BE1FE18D9C00373E99 /* virtualsocketserver.cc */, > 413A21631FE18D5E00373E99 /* virtualsocketserver.h */, >@@ -8205,11 +8242,11 @@ > 413AD19F21265ADF003F7263 /* absl */ = { > isa = PBXGroup; > children = ( >- 4106D56B216C2B91001E3C40 /* strings */, > 413AD1A021265AFA003F7263 /* algorithm */, > 413AD1A621265B35003F7263 /* base */, > 41DDB24521265B8200296D47 /* container */, > 41A08BB621268A6A001D5D7B /* memory */, >+ 4106D56B216C2B91001E3C40 /* strings */, > 41DDB24A21265BA800296D47 /* types */, > ); > name = absl; >@@ -8228,6 +8265,7 @@ > 413AD1A621265B35003F7263 /* base */ = { > isa = PBXGroup; > children = ( >+ 
41FCBB1F21B1F81400A5DF27 /* internal */, > 413AD1A921265B59003F7263 /* casts.h */, > 413AD1A821265B59003F7263 /* dynamic_annotations.cc */, > 413AD1AA21265B59003F7263 /* macros.h */, >@@ -8250,9 +8288,9 @@ > 413E67A62169893600EF37ED /* video_frame_buffer */ = { > isa = PBXGroup; > children = ( >+ 413E67A82169894C00EF37ED /* RTCNativeI420Buffer+Private.h */, > 413E67A92169894C00EF37ED /* RTCNativeI420Buffer.h */, > 413E67A72169894B00EF37ED /* RTCNativeI420Buffer.mm */, >- 413E67A82169894C00EF37ED /* RTCNativeI420Buffer+Private.h */, > 413E67AA2169894C00EF37ED /* RTCNativeMutableI420Buffer.h */, > 4144B3D021698966004363AC /* RTCNativeMutableI420Buffer.mm */, > ); >@@ -8370,9 +8408,9 @@ > children = ( > 4144B3F4216B081B004363AC /* RTCEncodedImage+Private.h */, > 4144B3F3216B081A004363AC /* RTCEncodedImage+Private.mm */, >+ 41889D74216BB7B6004614DD /* RTCRtpEncodingParameters+Private.h */, > 414FB869216BB767001F5492 /* RTCRtpEncodingParameters.h */, > 414FB86A216BB768001F5492 /* RTCRtpEncodingParameters.mm */, >- 41889D74216BB7B6004614DD /* RTCRtpEncodingParameters+Private.h */, > 414FB868216BB766001F5492 /* RTCRtpFragmentationHeader+Private.h */, > 414FB867216BB766001F5492 /* RTCRtpFragmentationHeader+Private.mm */, > 4144B3E72169C5E3004363AC /* RTCVideoCodecInfo+Private.h */, >@@ -8462,10 +8500,10 @@ > 416225D02169818100A91C9B /* video_decoder_factory.mm */, > 416225CD2169818000A91C9B /* video_encoder_factory.h */, > 416225CC2169818000A91C9B /* video_encoder_factory.mm */, >- 416225CB2169818000A91C9B /* video_frame_buffer.h */, >- 416225CF2169818100A91C9B /* video_frame_buffer.mm */, > 416225CE2169818000A91C9B /* video_frame.h */, > 416225D12169818100A91C9B /* video_frame.mm */, >+ 416225CB2169818000A91C9B /* video_frame_buffer.h */, >+ 416225CF2169818100A91C9B /* video_frame_buffer.mm */, > ); > path = api; > sourceTree = "<group>"; >@@ -8488,8 +8526,8 @@ > 416225EA216981BA00A91C9B /* components */ = { > isa = PBXGroup; > children = ( >- 413E6787216987A500EF37ED /* 
video_frame_buffer */, > 416225EB216981CE00A91C9B /* video_codec */, >+ 413E6787216987A500EF37ED /* video_frame_buffer */, > ); > name = components; > path = sdk/objc/components; >@@ -8502,9 +8540,9 @@ > 416225F5216981F400A91C9B /* helpers.h */, > 417953AC2169823D0028266B /* nalu_rewriter.cc */, > 411521D521698220000ABAF7 /* nalu_rewriter.h */, >+ 417953B2216982410028266B /* RTCCodecSpecificInfoH264+Private.h */, > 416225F1216981F300A91C9B /* RTCCodecSpecificInfoH264.h */, > 416225ED216981F200A91C9B /* RTCCodecSpecificInfoH264.mm */, >- 417953B2216982410028266B /* RTCCodecSpecificInfoH264+Private.h */, > 417953B3216982420028266B /* RTCDefaultVideoDecoderFactory.h */, > 417953AD2169823E0028266B /* RTCDefaultVideoDecoderFactory.m */, > 416225F7216981F400A91C9B /* RTCDefaultVideoEncoderFactory.h */, >@@ -8545,11 +8583,11 @@ > 417953DD216983B20028266B /* include */ = { > isa = PBXGroup; > children = ( >- 417953E0216983C90028266B /* module_common_types_public.h */, >+ 417953DF216983C90028266B /* module.h */, > 417953E1216983CA0028266B /* module_common_types.cc */, > 417953E2216983CA0028266B /* module_common_types.h */, >+ 417953E0216983C90028266B /* module_common_types_public.h */, > 417953DE216983C90028266B /* module_fec_types.h */, >- 417953DF216983C90028266B /* module.h */, > ); > path = include; > sourceTree = "<group>"; >@@ -8572,7 +8610,7 @@ > 413E679C2169889400EF37ED /* RTCVideoEncoderQpThresholds.h */, > 413E679F2169889500EF37ED /* RTCVideoEncoderQpThresholds.m */, > 413E67832169877D00EF37ED /* RTCVideoEncoderSettings.h */, >- 413E67842169877D00EF37ED /* RTCVideoEncoderSettings.mm */, >+ 41FCBB8721B200DB00A5DF27 /* RTCVideoEncoderSettings.m */, > 413E678C216987DB00EF37ED /* RTCVideoFrame.h */, > 413E678D216987DB00EF37ED /* RTCVideoFrame.mm */, > 413E678E216987DB00EF37ED /* RTCVideoFrameBuffer.h */, >@@ -8595,11 +8633,11 @@ > 417953FC216984F40028266B /* video_codec */ = { > isa = PBXGroup; > children = ( >- 413E67642169854500EF37ED /* RTCVideoEncoderVP8.mm */, > 
417953FD2169851F0028266B /* RTCVideoCodecConstants.h */, > 417953FF2169851F0028266B /* RTCVideoCodecConstants.mm */, > 41795402216985200028266B /* RTCVideoDecoderVP8.mm */, > 417954012169851F0028266B /* RTCVideoEncoderVP8.h */, >+ 413E67642169854500EF37ED /* RTCVideoEncoderVP8.mm */, > 417953FE2169851F0028266B /* RTCWrappedNativeVideoDecoder.h */, > 41795403216985200028266B /* RTCWrappedNativeVideoDecoder.mm */, > 41795404216985200028266B /* RTCWrappedNativeVideoEncoder.h */, >@@ -8752,12 +8790,20 @@ > 419241652127387900634FCF /* experiments */ = { > isa = PBXGroup; > children = ( >+ 41F9BFC92051DDA200ABF0B9 /* alr_experiment.cc */, >+ 41F9BFCA2051DDA200ABF0B9 /* alr_experiment.h */, > 4192416B2127389B00634FCF /* congestion_controller_experiment.cc */, > 419241672127389A00634FCF /* congestion_controller_experiment.h */, >+ 41FCBB4D21B1FE4500A5DF27 /* cpu_speed_experiment.cc */, >+ 41FCBB4B21B1FE4400A5DF27 /* cpu_speed_experiment.h */, > 419241682127389A00634FCF /* field_trial_parser.cc */, > 419241692127389B00634FCF /* field_trial_parser.h */, > 419241E621275A7600634FCF /* field_trial_units.cc */, > 419241E521275A7600634FCF /* field_trial_units.h */, >+ 41FCBB4C21B1FE4400A5DF27 /* jitter_upper_bound_experiment.cc */, >+ 41FCBB4E21B1FE4500A5DF27 /* jitter_upper_bound_experiment.h */, >+ 41FCBB5021B1FE4500A5DF27 /* normalize_simulcast_size_experiment.cc */, >+ 41FCBB4F21B1FE4500A5DF27 /* normalize_simulcast_size_experiment.h */, > 4192416A2127389B00634FCF /* quality_scaling_experiment.cc */, > 419241662127389A00634FCF /* quality_scaling_experiment.h */, > 417953E8216983F10028266B /* rtt_mult_experiment.cc */, >@@ -8938,6 +8984,8 @@ > 419C834C1FE246640040C30F /* histogram_percentile_counter.h */, > 419C834D1FE246640040C30F /* mathutils.h */, > 419C834A1FE246640040C30F /* mod_ops.h */, >+ 41FCBB1321B1F7AA00A5DF27 /* moving_average.cc */, >+ 41FCBB1421B1F7AA00A5DF27 /* moving_average.h */, > 419C83491FE246640040C30F /* moving_max_counter.h */, > 419C83461FE246630040C30F 
/* moving_median_filter.h */, > 419C83421FE246620040C30F /* percentile_filter.h */, >@@ -9159,14 +9207,15 @@ > 41A08BD921272F3E001D5D7B /* agc2 */ = { > isa = PBXGroup; > children = ( >+ 41299B98212736B200B3414B /* rnn_vad */, > 41D6B46E212731A4008F9353 /* adaptive_agc.cc */, > 41D6B464212731A2008F9353 /* adaptive_agc.h */, > 41A08BE421272F85001D5D7B /* adaptive_digital_gain_applier.cc */, > 41A08BDB21272F84001D5D7B /* adaptive_digital_gain_applier.h */, >- 41A08BE921272F86001D5D7B /* adaptive_mode_level_estimator_agc.cc */, >- 41D6B462212731A1008F9353 /* adaptive_mode_level_estimator_agc.h */, > 41D6B45F212731A1008F9353 /* adaptive_mode_level_estimator.cc */, > 41A08BDC21272F84001D5D7B /* adaptive_mode_level_estimator.h */, >+ 41A08BE921272F86001D5D7B /* adaptive_mode_level_estimator_agc.cc */, >+ 41D6B462212731A1008F9353 /* adaptive_mode_level_estimator_agc.h */, > 413E679A2169886200EF37ED /* agc2_common.cc */, > 41D6B46D212731A4008F9353 /* agc2_common.h */, > 41D6B45E212731A0008F9353 /* biquad_filter.cc */, >@@ -9177,19 +9226,14 @@ > 41D6B460212731A1008F9353 /* down_sampler.h */, > 41A08BE521272F86001D5D7B /* fixed_digital_level_estimator.cc */, > 41A08BDF21272F84001D5D7B /* fixed_digital_level_estimator.h */, >- 41D6B465212731A2008F9353 /* fixed_gain_controller.cc */, >- 41D6B467212731A3008F9353 /* fixed_gain_controller.h */, > 41A08BDE21272F84001D5D7B /* gain_applier.cc */, > 41A08BE721272F86001D5D7B /* gain_applier.h */, >- 41A08BE021272F85001D5D7B /* gain_curve_applier.cc */, >- 41D6B469212731A3008F9353 /* gain_curve_applier.h */, > 41A08BE121272F85001D5D7B /* interpolated_gain_curve.cc */, > 41D6B470212731A6008F9353 /* limiter.cc */, > 41A08BEA21272F86001D5D7B /* limiter.h */, > 41D6B461212731A1008F9353 /* noise_level_estimator.cc */, > 41D6B46A212731A3008F9353 /* noise_spectrum_estimator.cc */, > 41A08BE221272F85001D5D7B /* noise_spectrum_estimator.h */, >- 41299B98212736B200B3414B /* rnn_vad */, > 41D6B468212731A3008F9353 /* saturation_protector.cc */, > 
41D6B46B212731A3008F9353 /* saturation_protector.h */, > 41D6B46F212731A4008F9353 /* signal_classifier.cc */, >@@ -9627,10 +9671,11 @@ > 41E02C992127352B00C27CD6 /* alr_detector.h */, > 41E02C9A2127352B00C27CD6 /* bitrate_estimator.cc */, > 41E02C982127352B00C27CD6 /* bitrate_estimator.h */, >+ 41FCBB3F21B1FDB800A5DF27 /* congestion_window_pushback_controller.cc */, >+ 41FCBB3E21B1FDB800A5DF27 /* congestion_window_pushback_controller.h */, > 41E02C912127352900C27CD6 /* delay_based_bwe.cc */, > 41E02C972127352A00C27CD6 /* delay_based_bwe.h */, > 41E02CA32127352D00C27CD6 /* delay_increase_detector_interface.h */, >- 41E02C952127352A00C27CD6 /* goog_cc_factory.cc */, > 41E02CA22127352C00C27CD6 /* goog_cc_network_control.cc */, > 41E02C962127352A00C27CD6 /* goog_cc_network_control.h */, > 41E02C9D2127352B00C27CD6 /* median_slope_estimator.cc */, >@@ -9650,6 +9695,8 @@ > children = ( > 41E02CC7212735B600C27CD6 /* bitrate_settings.cc */, > 41E02CCB212735B700C27CD6 /* bitrate_settings.h */, >+ 41FCBB5E21B1FEC400A5DF27 /* goog_cc_factory.cc */, >+ 41FCBB5D21B1FEC400A5DF27 /* goog_cc_factory.h */, > 41E02CCA212735B700C27CD6 /* network_control.h */, > 41E02CC8212735B700C27CD6 /* network_types.cc */, > 41E02CC9212735B700C27CD6 /* network_types.h */, >@@ -9660,8 +9707,8 @@ > 41E02CD52127362300C27CD6 /* rtp */ = { > isa = PBXGroup; > children = ( >- 41E02CD62127363C00C27CD6 /* pacer_controller.cc */, >- 41924189212749C800634FCF /* pacer_controller.h */, >+ 41FCBB3521B1F8FC00A5DF27 /* control_handler.cc */, >+ 41FCBB3421B1F8FB00A5DF27 /* control_handler.h */, > 41924188212749C700634FCF /* send_side_congestion_controller.cc */, > 41E02CD82127363D00C27CD6 /* send_time_history.cc */, > 41924187212749C700634FCF /* send_time_history.h */, >@@ -9877,6 +9924,8 @@ > 41F263AB212680A800274F59 /* arch.h */, > 41F263A9212680A700274F59 /* asm_defines.h */, > 41F263A8212680A700274F59 /* BUILD.gn */, >+ 41FCBB7421B1FFA400A5DF27 /* cocoa_threading.h */, >+ 41FCBB7321B1FFA400A5DF27 /* 
cocoa_threading.mm */, > 41F263A5212680A700274F59 /* fallthrough.h */, > 41F263A3212680A600274F59 /* file_wrapper.cc */, > 41F263AA212680A800274F59 /* file_wrapper.h */, >@@ -9903,9 +9952,9 @@ > 41F411BD1EF8DB8200343C26 /* vp8 */ = { > isa = PBXGroup; > children = ( >+ 4145F6151FE1EF5C00EB9CAF /* include */, > 419C842B1FE24E7E0040C30F /* default_temporal_layers.cc */, > 419C84311FE24E800040C30F /* default_temporal_layers.h */, >- 4145F6151FE1EF5C00EB9CAF /* include */, > 413E67712169863B00EF37ED /* libvpx_interface.cc */, > 413E676F2169863A00EF37ED /* libvpx_interface.h */, > 41294090212E128C00AD95E7 /* libvpx_vp8_decoder.cc */, >@@ -9914,9 +9963,8 @@ > 4129408F212E128C00AD95E7 /* libvpx_vp8_encoder.h */, > 419C84301FE24E7F0040C30F /* screenshare_layers.cc */, > 419C842C1FE24E7F0040C30F /* screenshare_layers.h */, >- 413E676E2169863A00EF37ED /* temporal_layers_checker.cc */, > 419C84331FE24E800040C30F /* temporal_layers.h */, >- 413E67702169863B00EF37ED /* vp8_temporal_layers.cc */, >+ 413E676E2169863A00EF37ED /* temporal_layers_checker.cc */, > ); > name = vp8; > sourceTree = "<group>"; >@@ -10037,13 +10085,44 @@ > name = include; > sourceTree = "<group>"; > }; >- 41F9BFC82051DD8E00ABF0B9 /* experiments */ = { >+ 41FCBB1F21B1F81400A5DF27 /* internal */ = { > isa = PBXGroup; > children = ( >- 41F9BFC92051DDA200ABF0B9 /* alr_experiment.cc */, >- 41F9BFCA2051DDA200ABF0B9 /* alr_experiment.h */, >+ 41FCBB2021B1F82E00A5DF27 /* throw_delegate.cc */, >+ 41FCBB2121B1F82F00A5DF27 /* throw_delegate.h */, > ); >- name = experiments; >+ name = internal; >+ path = base/internal; >+ sourceTree = "<group>"; >+ }; >+ 41FCBB2421B1F85D00A5DF27 /* network */ = { >+ isa = PBXGroup; >+ children = ( >+ 41FCBB2521B1F87E00A5DF27 /* sent_packet.cc */, >+ 41FCBB2621B1F87E00A5DF27 /* sent_packet.h */, >+ ); >+ name = network; >+ path = rtc_base/network; >+ sourceTree = "<group>"; >+ }; >+ 41FCBB2921B1F8A300A5DF27 /* crypto */ = { >+ isa = PBXGroup; >+ children = ( >+ 
41FCBB2D21B1F8B600A5DF27 /* cryptooptions.cc */, >+ 41FCBB2A21B1F8B600A5DF27 /* cryptooptions.h */, >+ 41FCBB2B21B1F8B600A5DF27 /* framedecryptorinterface.h */, >+ 41FCBB2C21B1F8B600A5DF27 /* frameencryptorinterface.h */, >+ ); >+ path = crypto; >+ sourceTree = "<group>"; >+ }; >+ 41FCBB8121B2005A00A5DF27 /* internal */ = { >+ isa = PBXGroup; >+ children = ( >+ 41FCBB8221B2007B00A5DF27 /* memutil.cc */, >+ 41FCBB8321B2007B00A5DF27 /* memutil.h */, >+ ); >+ path = internal; > sourceTree = "<group>"; > }; > 41FE864E215C52E900B62C07 /* listfmts */ = { >@@ -10853,12 +10932,12 @@ > 5C4B4B581E431C49002651C8 /* objc */ = { > isa = PBXGroup; > children = ( >- 4144B3EC2169C61C004363AC /* helpers */, > 417953FB216984DC0028266B /* api */, > 417953F2216984830028266B /* base */, > 416225EA216981BA00A91C9B /* components */, >- 416225C72169814100A91C9B /* native */, > 5C4B4B591E431C55002651C8 /* Framework */, >+ 4144B3EC2169C61C004363AC /* helpers */, >+ 416225C72169814100A91C9B /* native */, > ); > name = objc; > sourceTree = "<group>"; >@@ -10894,7 +10973,6 @@ > 5C4B4C101E431F75002651C8 /* bitrate_adjuster.cc */, > 5C4B4C121E431F75002651C8 /* i420_buffer_pool.cc */, > 5C4B4C131E431F75002651C8 /* incoming_video_stream.cc */, >- 5C4B4C161E431F75002651C8 /* video_frame.cc */, > 5C4B4C151E431F75002651C8 /* video_frame_buffer.cc */, > 5C4B4C171E431F75002651C8 /* video_render_frames.cc */, > 5C4B4C181E431F75002651C8 /* video_render_frames.h */, >@@ -11097,17 +11175,25 @@ > 5C63F8E21E41732B002CA531 /* api */ = { > isa = PBXGroup; > children = ( >- 41F9BF792051C7FD00ABF0B9 /* array_view.h */, > 5CD2853B1E6A61BE0094FDC8 /* audio_codecs */, >+ 5CDD8C451E43C57900621E92 /* call */, >+ 41FCBB2921B1F8A300A5DF27 /* crypto */, >+ 4145E4C01EF8962800FCF6E6 /* stats */, >+ 41E02CC62127359F00C27CD6 /* transport */, >+ 419241D421275A1800634FCF /* units */, >+ 5CD284601E6A57DD0094FDC8 /* video */, >+ 412455581EF88AD900F11809 /* video_codecs */, >+ 41F9BF792051C7FD00ABF0B9 /* array_view.h */, > 
415F1FCA212730F000064CBF /* audio_frame.cc */, > 415F1FCB212730F000064CBF /* audio_frame.h */, > 415F1FC9212730F000064CBF /* audio_mixer.h */, > 41F2637521267B7600274F59 /* audio_options.cc */, > 41F2637821267B7600274F59 /* audio_options.h */, > 41F2637921267B7700274F59 /* bitrate_constraints.h */, >- 5CDD8C451E43C57900621E92 /* call */, > 419C830E1FE2427F0040C30F /* candidate.cc */, > 41F9BF772051C7FD00ABF0B9 /* candidate.h */, >+ 41FCBB0921B1F53000A5DF27 /* create_peerconnection_factory.cc */, >+ 41FCBB0D21B1F53600A5DF27 /* create_peerconnection_factory.h */, > 41F9BF822051C80000ABF0B9 /* cryptoparams.h */, > 41F2637721267B7600274F59 /* datachannelinterface.cc */, > 5C63F8E71E41737B002CA531 /* datachannelinterface.h */, >@@ -11119,6 +11205,8 @@ > 41F2637B21267B7700274F59 /* jsepicecandidate.cc */, > 5C63F8F11E41737B002CA531 /* jsepicecandidate.h */, > 5C63F8F31E41737B002CA531 /* jsepsessiondescription.h */, >+ 41FCBB0C21B1F53200A5DF27 /* media_transport_interface.cc */, >+ 41FCBB0A21B1F53100A5DF27 /* media_transport_interface.h */, > 5C63F8F61E41737B002CA531 /* mediaconstraintsinterface.cc */, > 5C63F8F71E41737B002CA531 /* mediaconstraintsinterface.h */, > 5CD2863F1E6A683A0094FDC8 /* mediastreaminterface.cc */, >@@ -11148,16 +11236,12 @@ > 5C63F91B1E41737B002CA531 /* rtpsenderinterface.h */, > 41F2637C21267B7700274F59 /* rtptransceiverinterface.cc */, > 41F9BF802051C7FF00ABF0B9 /* rtptransceiverinterface.h */, >+ 41FCBB0B21B1F53200A5DF27 /* scoped_refptr.h */, > 41F9BF842051C80000ABF0B9 /* setremotedescriptionobserverinterface.h */, >- 4145E4C01EF8962800FCF6E6 /* stats */, > 5C63F9211E41737B002CA531 /* statstypes.cc */, > 5C63F9221E41737B002CA531 /* statstypes.h */, >- 41E02CC62127359F00C27CD6 /* transport */, > 41F9BF832051C80000ABF0B9 /* turncustomizer.h */, > 5C63F9241E41737B002CA531 /* umametrics.h */, >- 419241D421275A1800634FCF /* units */, >- 5CD284601E6A57DD0094FDC8 /* video */, >- 412455581EF88AD900F11809 /* video_codecs */, > 5C63F9281E41737B002CA531 /* 
videosourceproxy.h */, > ); > path = api; >@@ -11191,8 +11275,12 @@ > 4192414F2127376E00634FCF /* fakeicetransport.h */, > 4192414E2127376E00634FCF /* fakepackettransport.h */, > 5C63F9951E4174F6002CA531 /* fakeportallocator.h */, >+ 41FCBB4321B1FDEE00A5DF27 /* icecredentialsiterator.cc */, >+ 41FCBB4521B1FDEF00A5DF27 /* icecredentialsiterator.h */, > 419C83D71FE247F20040C30F /* icetransportinternal.cc */, > 419241502127376E00634FCF /* icetransportinternal.h */, >+ 41FCBB4421B1FDEF00A5DF27 /* mdns_message.cc */, >+ 41FCBB4221B1FDEE00A5DF27 /* mdns_message.h */, > 5C63F9971E4174F6002CA531 /* p2pconstants.cc */, > 5C63F9981E4174F6002CA531 /* p2pconstants.h */, > 5C63F99C1E4174F6002CA531 /* p2ptransportchannel.cc */, >@@ -11242,8 +11330,6 @@ > 5C63F9D91E4174F6002CA531 /* turnserver.cc */, > 5C63F9DA1E4174F6002CA531 /* turnserver.h */, > 5C63F9DB1E4174F6002CA531 /* udpport.h */, >- 419C83D31FE247F00040C30F /* udptransport.cc */, >- 4192414D2127376E00634FCF /* udptransport.h */, > ); > path = base; > sourceTree = "<group>"; >@@ -11253,13 +11339,10 @@ > children = ( > 5CD284C41E6A60570094FDC8 /* audiotrack.cc */, > 5CD284C51E6A60570094FDC8 /* audiotrack.h */, >- 5C63FA3A1E41761F002CA531 /* bundlefilter.cc */, >- 5C63FA3B1E41761F002CA531 /* bundlefilter.h */, > 5C63FA3D1E41761F002CA531 /* channel.cc */, > 5C63FA3E1E41761F002CA531 /* channel.h */, > 5C63FA401E41761F002CA531 /* channelmanager.cc */, > 5C63FA411E41761F002CA531 /* channelmanager.h */, >- 4145E4C91EF8CB3100FCF6E6 /* createpeerconnectionfactory.cc */, > 5CD284C61E6A60570094FDC8 /* datachannel.cc */, > 5CD284C71E6A60570094FDC8 /* datachannel.h */, > 419C82D61FE20E580040C30F /* dtlssrtptransport.cc */, >@@ -11451,31 +11534,37 @@ > 5CD284601E6A57DD0094FDC8 /* video */ = { > isa = PBXGroup; > children = ( >+ 41FCBB1721B1F7CC00A5DF27 /* builtin_video_bitrate_allocator_factory.cc */, >+ 41FCBB1A21B1F7CD00A5DF27 /* builtin_video_bitrate_allocator_factory.h */, > 4102F6D221273414006AE8D7 /* color_space.cc */, > 
4102F6D821273416006AE8D7 /* color_space.h */, > 41F263C72126818A00274F59 /* encoded_frame.cc */, > 41F263C22126818900274F59 /* encoded_frame.h */, >+ 41FCBB1921B1F7CC00A5DF27 /* encoded_image.cc */, >+ 41FCBB1821B1F7CC00A5DF27 /* encoded_image.h */, >+ 41FCBB6F21B1FF7400A5DF27 /* hdr_metadata.cc */, >+ 41FCBB7021B1FF7400A5DF27 /* hdr_metadata.h */, > 4102F6D621273415006AE8D7 /* i010_buffer.cc */, > 41F263BF2126818900274F59 /* i010_buffer.h */, > 5CD284611E6A57F40094FDC8 /* i420_buffer.cc */, > 5CD284621E6A57F40094FDC8 /* i420_buffer.h */, > 4102F6D921273416006AE8D7 /* video_bitrate_allocation.cc */, > 41F263C42126818900274F59 /* video_bitrate_allocation.h */, >- 4102F6DA21273416006AE8D7 /* video_bitrate_allocator_factory.h */, > 41F263C12126818900274F59 /* video_bitrate_allocator.h */, >+ 4102F6DA21273416006AE8D7 /* video_bitrate_allocator_factory.h */, > 419C840D1FE249AA0040C30F /* video_content_type.cc */, > 419C840F1FE249AB0040C30F /* video_content_type.h */, >- 4124554A1EF8874300F11809 /* video_frame_buffer.cc */, >- 5CD284631E6A57F40094FDC8 /* video_frame_buffer.h */, > 5CD284641E6A57F40094FDC8 /* video_frame.cc */, > 5CD284651E6A57F40094FDC8 /* video_frame.h */, >+ 4124554A1EF8874300F11809 /* video_frame_buffer.cc */, >+ 5CD284631E6A57F40094FDC8 /* video_frame_buffer.h */, > 5CD284661E6A57F40094FDC8 /* video_rotation.h */, > 4102F6D321273415006AE8D7 /* video_sink_interface.h */, > 41F263C62126818A00274F59 /* video_source_interface.cc */, > 41F263BE2126818800274F59 /* video_source_interface.h */, >+ 41F263C52126818A00274F59 /* video_stream_decoder.h */, > 4102F6D121273414006AE8D7 /* video_stream_decoder_create.cc */, > 4102F6D421273415006AE8D7 /* video_stream_decoder_create.h */, >- 41F263C52126818A00274F59 /* video_stream_decoder.h */, > 4102F6D521273415006AE8D7 /* video_stream_encoder_create.cc */, > 41F263C02126818900274F59 /* video_stream_encoder_create.h */, > 41F263C32126818900274F59 /* video_stream_encoder_interface.h */, >@@ -11762,8 +11851,6 @@ > 
5CDD836C1E439A3500621E92 /* frame_dropper.h */, > 5CDD836D1E439A3500621E92 /* ivf_file_writer.cc */, > 5CDD836E1E439A3500621E92 /* ivf_file_writer.h */, >- 5CDD836F1E439A3500621E92 /* moving_average.cc */, >- 5CDD83701E439A3500621E92 /* moving_average.h */, > 5CDD83731E439A3500621E92 /* quality_scaler.cc */, > 5CDD83741E439A3500621E92 /* quality_scaler.h */, > 419241EA21275AFA00634FCF /* simulcast_rate_allocator.cc */, >@@ -11907,8 +11994,6 @@ > 5CDD84F91E43B1EA00621E92 /* audio_coding_module.cc */, > 5CDD84FB1E43B1EA00621E92 /* call_statistics.cc */, > 5CDD84FC1E43B1EA00621E92 /* call_statistics.h */, >- 5CDD84FE1E43B1EA00621E92 /* codec_manager.cc */, >- 5CDD84FF1E43B1EA00621E92 /* codec_manager.h */, > 5CDD85011E43B1EA00621E92 /* rent_a_codec.cc */, > 5CDD85021E43B1EA00621E92 /* rent_a_codec.h */, > ); >@@ -12008,6 +12093,8 @@ > 5CDD855D1E43B5C000621E92 /* call_stats.h */, > 5CDD85601E43B5C000621E92 /* encoder_rtcp_feedback.cc */, > 5CDD85611E43B5C000621E92 /* encoder_rtcp_feedback.h */, >+ 41FCBB5721B1FE6D00A5DF27 /* frame_dumping_decoder.cc */, >+ 41FCBB5821B1FE6E00A5DF27 /* frame_dumping_decoder.h */, > 5CDD85661E43B5C000621E92 /* overuse_frame_detector.cc */, > 5CDD85671E43B5C000621E92 /* overuse_frame_detector.h */, > 5CD286281E6A66C80094FDC8 /* quality_threshold.cc */, >@@ -12376,6 +12463,7 @@ > 5CDD87FA1E43BE2600621E92 /* source */ = { > isa = PBXGroup; > children = ( >+ 5CDD88EB1E43BF2500621E92 /* rtcp_packet */, > 5CDD87FC1E43BE3C00621E92 /* byte_io.h */, > 4102F6EB21273460006AE8D7 /* contributing_sources.cc */, > 5CDD87FE1E43BE3C00621E92 /* dtmf_queue.cc */, >@@ -12390,10 +12478,10 @@ > 5CDD88061E43BE3C00621E92 /* flexfec_header_reader_writer.h */, > 5CDD880A1E43BE3C00621E92 /* flexfec_receiver.cc */, > 5CDD880C1E43BE3C00621E92 /* flexfec_sender.cc */, >- 5CDD880D1E43BE3C00621E92 /* forward_error_correction_internal.cc */, >- 5CDD880E1E43BE3C00621E92 /* forward_error_correction_internal.h */, > 5CDD880F1E43BE3C00621E92 /* forward_error_correction.cc 
*/, > 5CDD88101E43BE3C00621E92 /* forward_error_correction.h */, >+ 5CDD880D1E43BE3C00621E92 /* forward_error_correction_internal.cc */, >+ 5CDD880E1E43BE3C00621E92 /* forward_error_correction_internal.h */, > 5CDD88151E43BE3C00621E92 /* packet_loss_stats.cc */, > 5CDD88161E43BE3C00621E92 /* packet_loss_stats.h */, > 5CDD88181E43BE3C00621E92 /* playout_delay_oracle.cc */, >@@ -12403,19 +12491,20 @@ > 5CDD881E1E43BE3C00621E92 /* remote_ntp_time_estimator.cc */, > 5CD285F81E6A64520094FDC8 /* rtcp_nack_stats.cc */, > 5CD285F91E6A64520094FDC8 /* rtcp_nack_stats.h */, >- 5CDD88EB1E43BF2500621E92 /* rtcp_packet */, > 5CDD88211E43BE3C00621E92 /* rtcp_packet.cc */, > 5CDD88221E43BE3C00621E92 /* rtcp_packet.h */, > 5CDD88241E43BE3C00621E92 /* rtcp_receiver.cc */, > 5CDD88251E43BE3C00621E92 /* rtcp_receiver.h */, > 5CDD88271E43BE3C00621E92 /* rtcp_sender.cc */, > 5CDD88281E43BE3C00621E92 /* rtcp_sender.h */, >+ 419C82F61FE20EFB0040C30F /* rtcp_transceiver.cc */, >+ 419C82FA1FE20EFD0040C30F /* rtcp_transceiver.h */, > 419C82F91FE20EFD0040C30F /* rtcp_transceiver_config.cc */, > 419C82F81FE20EFC0040C30F /* rtcp_transceiver_config.h */, > 419C82F71FE20EFC0040C30F /* rtcp_transceiver_impl.cc */, > 419C82FC1FE20EFE0040C30F /* rtcp_transceiver_impl.h */, >- 419C82F61FE20EFB0040C30F /* rtcp_transceiver.cc */, >- 419C82FA1FE20EFD0040C30F /* rtcp_transceiver.h */, >+ 5CDD883A1E43BE3C00621E92 /* rtp_format.cc */, >+ 5CDD883B1E43BE3C00621E92 /* rtp_format.h */, > 5CDD882E1E43BE3C00621E92 /* rtp_format_h264.cc */, > 5CDD882F1E43BE3C00621E92 /* rtp_format_h264.h */, > 5CDD88301E43BE3C00621E92 /* rtp_format_video_generic.cc */, >@@ -12424,35 +12513,33 @@ > 5CDD88361E43BE3C00621E92 /* rtp_format_vp8.h */, > 5CDD88381E43BE3C00621E92 /* rtp_format_vp9.cc */, > 5CDD88391E43BE3C00621E92 /* rtp_format_vp9.h */, >- 5CDD883A1E43BE3C00621E92 /* rtp_format.cc */, >- 5CDD883B1E43BE3C00621E92 /* rtp_format.h */, >- 419241AF2127590200634FCF /* rtp_generic_frame_descriptor_extension.cc */, >- 
4102F6E82127345F006AE8D7 /* rtp_generic_frame_descriptor_extension.h */, > 4102F6E52127345E006AE8D7 /* rtp_generic_frame_descriptor.cc */, > 4102F6E62127345F006AE8D7 /* rtp_generic_frame_descriptor.h */, >+ 419241AF2127590200634FCF /* rtp_generic_frame_descriptor_extension.cc */, >+ 4102F6E82127345F006AE8D7 /* rtp_generic_frame_descriptor_extension.h */, > 4145E4DB1EF8CCEE00FCF6E6 /* rtp_header_extension_map.cc */, > 413E677B216986AD00EF37ED /* rtp_header_extension_size.cc */, > 413E677C216986AE00EF37ED /* rtp_header_extension_size.h */, > 5CDD883F1E43BE3C00621E92 /* rtp_header_extensions.cc */, > 5CDD88401E43BE3C00621E92 /* rtp_header_extensions.h */, > 5CDD88411E43BE3C00621E92 /* rtp_header_parser.cc */, >+ 5CDD88481E43BE3C00621E92 /* rtp_packet.cc */, >+ 5CDD88491E43BE3C00621E92 /* rtp_packet.h */, > 5CDD88431E43BE3C00621E92 /* rtp_packet_history.cc */, > 5CDD88441E43BE3C00621E92 /* rtp_packet_history.h */, > 419C82FE1FE20F010040C30F /* rtp_packet_received.cc */, > 5CDD88451E43BE3C00621E92 /* rtp_packet_received.h */, > 4102F6E92127345F006AE8D7 /* rtp_packet_to_send.cc */, > 5CDD88461E43BE3C00621E92 /* rtp_packet_to_send.h */, >- 5CDD88481E43BE3C00621E92 /* rtp_packet.cc */, >- 5CDD88491E43BE3C00621E92 /* rtp_packet.h */, > 5CDD88541E43BE3C00621E92 /* rtp_rtcp_config.h */, > 5CDD88561E43BE3C00621E92 /* rtp_rtcp_impl.cc */, > 5CDD88571E43BE3C00621E92 /* rtp_rtcp_impl.h */, >+ 5CDD885D1E43BE3C00621E92 /* rtp_sender.cc */, >+ 5CDD885E1E43BE3C00621E92 /* rtp_sender.h */, > 5CDD88581E43BE3C00621E92 /* rtp_sender_audio.cc */, > 5CDD88591E43BE3C00621E92 /* rtp_sender_audio.h */, > 5CDD885B1E43BE3C00621E92 /* rtp_sender_video.cc */, > 5CDD885C1E43BE3C00621E92 /* rtp_sender_video.h */, >- 5CDD885D1E43BE3C00621E92 /* rtp_sender.cc */, >- 5CDD885E1E43BE3C00621E92 /* rtp_sender.h */, > 5CDD885F1E43BE3C00621E92 /* rtp_utility.cc */, > 5CDD88601E43BE3C00621E92 /* rtp_utility.h */, > 4102F6E72127345F006AE8D7 /* rtp_video_header.cc */, >@@ -12587,8 +12674,6 @@ > 
5CDD89E21E43BFB300621E92 /* random_vector.h */, > 5CDD89E41E43BFB300621E92 /* red_payload_splitter.cc */, > 5CDD89E51E43BFB300621E92 /* red_payload_splitter.h */, >- 5CDD89E61E43BFB300621E92 /* rtcp.cc */, >- 5CDD89E71E43BFB300621E92 /* rtcp.h */, > 5CDD89E81E43BFB300621E92 /* statistics_calculator.cc */, > 5CDD89E91E43BFB300621E92 /* statistics_calculator.h */, > 5CDD89EB1E43BFB300621E92 /* sync_buffer.cc */, >@@ -12642,6 +12727,7 @@ > 5CDD8ABB1E43C22200621E92 /* audio */ = { > isa = PBXGroup; > children = ( >+ 5CD2854A1E6A62090094FDC8 /* utility */, > 4102F69A21273261006AE8D7 /* audio_level.cc */, > 4102F69821273260006AE8D7 /* audio_level.h */, > 5CDD8ABD1E43C23900621E92 /* audio_receive_stream.cc */, >@@ -12652,12 +12738,8 @@ > 5CDD8AC41E43C23900621E92 /* audio_state.h */, > 41F9BF992051C84B00ABF0B9 /* audio_transport_impl.cc */, > 41F9BF9A2051C84B00ABF0B9 /* audio_transport_impl.h */, >- 413E67782169867A00EF37ED /* channel_receive_proxy.cc */, >- 413E67772169867A00EF37ED /* channel_receive_proxy.h */, > 417953D22169834F0028266B /* channel_receive.cc */, > 417953D02169834F0028266B /* channel_receive.h */, >- 417953CF2169834E0028266B /* channel_send_proxy.cc */, >- 417953D12169834F0028266B /* channel_send_proxy.h */, > 417953CD2169834E0028266B /* channel_send.cc */, > 417953CE2169834E0028266B /* channel_send.h */, > 5CDD8AC61E43C23900621E92 /* conversion.h */, >@@ -12665,11 +12747,8 @@ > 419C83B21FE2472E0040C30F /* null_audio_poller.h */, > 4102F69721273260006AE8D7 /* remix_resample.cc */, > 4102F69D21273261006AE8D7 /* remix_resample.h */, >- 419C83331FE245E90040C30F /* time_interval.cc */, >- 419C83321FE245E90040C30F /* time_interval.h */, > 4102F69F21273262006AE8D7 /* transport_feedback_packet_loss_tracker.cc */, > 4102F69621273260006AE8D7 /* transport_feedback_packet_loss_tracker.h */, >- 5CD2854A1E6A62090094FDC8 /* utility */, > ); > path = audio; > sourceTree = "<group>"; >@@ -12827,10 +12906,10 @@ > 5CDD8C121E43C39200621E92 /* vp9 */ = { > isa = PBXGroup; > 
children = ( >- 413E67922169881900EF37ED /* svc_rate_allocator.cc */, >- 413E67932169881900EF37ED /* svc_rate_allocator.h */, > 413E67802169870F00EF37ED /* svc_config.cc */, > 413E677F2169870F00EF37ED /* svc_config.h */, >+ 413E67922169881900EF37ED /* svc_rate_allocator.cc */, >+ 413E67932169881900EF37ED /* svc_rate_allocator.h */, > 5CDD8C131E43C3B400621E92 /* vp9_noop.cc */, > ); > name = vp9; >@@ -13207,8 +13286,6 @@ > 419241B12127591C00634FCF /* bbr */, > 41E02C8F212734E800C27CD6 /* goog_cc */, > 41E02CD52127362300C27CD6 /* rtp */, >- 41A08BD621272F1C001D5D7B /* congestion_window_pushback_controller.cc */, >- 41A08BD521272F1C001D5D7B /* congestion_window_pushback_controller.h */, > 413091FD1EF8D06D00757C55 /* receive_side_congestion_controller.cc */, > 413092141EF8D5ED00757C55 /* send_side_congestion_controller.cc */, > 5CDD8FB01E43CD0700621E92 /* transport_feedback_adapter.cc */, >@@ -13305,8 +13382,11 @@ > 5CDD90511E43D10400621E92 /* bitrate_controller */ = { > isa = PBXGroup; > children = ( >+ 41FCBB5B21B1FEA300A5DF27 /* bitrate_controller.cc */, > 5CDD90521E43D11200621E92 /* bitrate_controller_impl.cc */, > 5CDD90531E43D11200621E92 /* bitrate_controller_impl.h */, >+ 41FCBB6821B1FF1B00A5DF27 /* loss_based_bandwidth_estimation.cc */, >+ 41FCBB6721B1FF1B00A5DF27 /* loss_based_bandwidth_estimation.h */, > 5CDD905B1E43D11200621E92 /* send_side_bandwidth_estimation.cc */, > 5CDD905C1E43D11200621E92 /* send_side_bandwidth_estimation.h */, > ); >@@ -13543,7 +13623,6 @@ > isa = PBXHeadersBuildPhase; > buildActionMask = 2147483647; > files = ( >- 5C088C111E4AA44400403995 /* bundlefilter.h in Headers */, > 5C088C131E4AA44400403995 /* channel.h in Headers */, > 5C088C151E4AA44400403995 /* channelmanager.h in Headers */, > 5C088C191E4AA44400403995 /* externalhmac.h in Headers */, >@@ -13723,7 +13802,6 @@ > 416D3BDD212731C200775F09 /* adaptive_mode_level_estimator_agc.h in Headers */, > 41F411C81EF97BAF00343C26 /* adm_helpers.h in Headers */, > 5CD285A91E6A63430094FDC8 
/* aec3_common.h in Headers */, >- 4144B3D52169896D004363AC /* RTCNativeMutableI420Buffer.h in Headers */, > 5CD285AB1E6A63430094FDC8 /* aec3_fft.h in Headers */, > 5CDD86B71E43B9C200621E92 /* aec_common.h in Headers */, > 5CDD86BD1E43B9C200621E92 /* aec_core.h in Headers */, >@@ -13758,6 +13836,7 @@ > 41433D011F79B33400387B4D /* arith_routins.h in Headers */, > 41F9BF872051C80100ABF0B9 /* array_view.h in Headers */, > 413A23E41FE18E0800373E99 /* arraysize.h in Headers */, >+ 41FCBB7921B1FFC000A5DF27 /* ascii.h in Headers */, > 41F263B1212680A800274F59 /* asm_defines.h in Headers */, > 5C63F8D91E416D53002CA531 /* assertions.h in Headers */, > 413A22A81FE18E0600373E99 /* asyncinvoker-inl.h in Headers */, >@@ -13776,7 +13855,6 @@ > 5CDD84161E439B2900621E92 /* audio_decoder.h in Headers */, > 5CD285451E6A61D20094FDC8 /* audio_decoder.h in Headers */, > 5CD285431E6A61D20094FDC8 /* audio_decoder_factory.h in Headers */, >- 417953C02169824B0028266B /* RTCCodecSpecificInfoH264.h in Headers */, > 41F9BF982051C82100ABF0B9 /* audio_decoder_factory_template.h in Headers */, > 419C83E81FE248350040C30F /* audio_decoder_g711.h in Headers */, > 4140B8321E4E3396007409E6 /* audio_decoder_g722.h in Headers */, >@@ -13820,7 +13898,6 @@ > 419C83691FE246CA0040C30F /* audio_encoder_isac_float.h in Headers */, > 5CDD8C0C1E43C34600621E92 /* audio_encoder_isac_t.h in Headers */, > 5CDD8C0B1E43C34600621E92 /* audio_encoder_isac_t_impl.h in Headers */, >- 417953C92169824B0028266B /* RTCVideoEncoderH264.h in Headers */, > 419C83F31FE2488D0040C30F /* audio_encoder_L16.h in Headers */, > 5CDD8C641E43C60900621E92 /* audio_encoder_opus.h in Headers */, > 419C82F01FE20EB50040C30F /* audio_encoder_opus.h in Headers */, >@@ -13833,9 +13910,7 @@ > 415F1FCF212730F000064CBF /* audio_frame.h in Headers */, > 5CD284931E6A5F410094FDC8 /* audio_frame_manipulator.h in Headers */, > 5CD2854E1E6A62130094FDC8 /* audio_frame_operations.h in Headers */, >- 4144B3DD2169A070004363AC /* RTCVideoDecoder.h in Headers 
*/, > 4102F6A221273262006AE8D7 /* audio_level.h in Headers */, >- 413E67A02169889500EF37ED /* RTCVideoEncoderQpThresholds.h in Headers */, > 5CDD8A901E43C00F00621E92 /* audio_loop.h in Headers */, > 415F1FCD212730F000064CBF /* audio_mixer.h in Headers */, > 5CD284961E6A5F410094FDC8 /* audio_mixer_impl.h in Headers */, >@@ -13854,7 +13929,6 @@ > 5CDD8C4F1E43C58E00621E92 /* audio_sink.h in Headers */, > 5CDD8A921E43C00F00621E92 /* audio_sink.h in Headers */, > 5CDD8AD31E43C23900621E92 /* audio_state.h in Headers */, >- 417953E3216983CA0028266B /* module_fec_types.h in Headers */, > 5CD284B31E6A5F9F0094FDC8 /* audio_state.h in Headers */, > 41F9BF9C2051C84C00ABF0B9 /* audio_transport_impl.h in Headers */, > 41F9BFA02051C88500ABF0B9 /* audio_util.h in Headers */, >@@ -13877,7 +13951,6 @@ > 5C4B4D0A1E4322F6002651C8 /* basicportallocator.h in Headers */, > 419241C62127593F00634FCF /* bbr_factory.h in Headers */, > 419241C32127593F00634FCF /* bbr_network_controller.h in Headers */, >- 416225E52169819500A91C9B /* objc_video_encoder_factory.h in Headers */, > 413A235C1FE18E0700373E99 /* bind.h in Headers */, > 416D3BE1212731C200775F09 /* biquad_filter.h in Headers */, > 413A23771FE18E0700373E99 /* bitbuffer.h in Headers */, >@@ -13886,7 +13959,6 @@ > 41F2638221267B7700274F59 /* bitrate_constraints.h in Headers */, > 5CDD8F7E1E43CBE000621E92 /* bitrate_controller.h in Headers */, > 5CDD905E1E43D11200621E92 /* bitrate_controller_impl.h in Headers */, >- 4144B3EA2169C5E3004363AC /* RTCVideoEncoderSettings+Private.h in Headers */, > 41E02CAC2127352D00C27CD6 /* bitrate_estimator.h in Headers */, > 5CDD8FA11E43CCBE00621E92 /* bitrate_prober.h in Headers */, > 41E02CD0212735B700C27CD6 /* bitrate_settings.h in Headers */, >@@ -13894,7 +13966,6 @@ > 41BE71A8215C463700A7B196 /* bitvect.h in Headers */, > 5CD285AD1E6A63430094FDC8 /* block_framer.h in Headers */, > 5CDD86881E43B93900621E92 /* block_mean_calculator.h in Headers */, >- 417953C32169824B0028266B /* RTCH264ProfileLevelId.h 
in Headers */, > 5CD285B11E6A63430094FDC8 /* block_processor.h in Headers */, > 5CD285AF1E6A63430094FDC8 /* block_processor_metrics.h in Headers */, > 413A238E1FE18E0700373E99 /* buffer.h in Headers */, >@@ -13902,16 +13973,13 @@ > 413A232B1FE18E0700373E99 /* bufferqueue.h in Headers */, > 5CD285491E6A61D20094FDC8 /* builtin_audio_decoder_factory.h in Headers */, > 413091FC1EF8CFF800757C55 /* builtin_audio_encoder_factory.h in Headers */, >- 417953E4216983CA0028266B /* module.h in Headers */, >- 413E67792169867A00EF37ED /* channel_receive_proxy.h in Headers */, >+ 41FCBB1E21B1F7CD00A5DF27 /* builtin_video_bitrate_allocator_factory.h in Headers */, > 41F2636021267ADF00274F59 /* builtin_video_decoder_factory.h in Headers */, >- 413E67692169854B00EF37ED /* RTCVideoEncoderVP8.h in Headers */, > 41F2636121267ADF00274F59 /* builtin_video_encoder_factory.h in Headers */, > 5CDD8B751E43C2B500621E92 /* bw_expand.h in Headers */, > 5CDD893F1E43BF3A00621E92 /* bye.h in Headers */, > 5CDD88761E43BE3C00621E92 /* byte_io.h in Headers */, > 413A23A41FE18E0800373E99 /* bytebuffer.h in Headers */, >- 417953F8216984BE0028266B /* RTCCodecSpecificInfo.h in Headers */, > 41BE71AE215C463700A7B196 /* bytecode.h in Headers */, > 413A233E1FE18E0700373E99 /* byteorder.h in Headers */, > 5CD284B81E6A5F9F0094FDC8 /* call.h in Headers */, >@@ -13933,16 +14001,16 @@ > 5CDD8B831E43C2B500621E92 /* cb_update_best_index.h in Headers */, > 5C4B4C651E431F9C002651C8 /* channel_buffer.h in Headers */, > 5CDD8F811E43CBE000621E92 /* channel_controller.h in Headers */, >+ 417953D62169834F0028266B /* channel_receive.h in Headers */, >+ 417953D42169834F0028266B /* channel_send.h in Headers */, > 5CDD8B851E43C2B500621E92 /* chebyshev.h in Headers */, > 413A234F1FE18E0700373E99 /* checks.h in Headers */, > 5C08852A1E4A99D200403995 /* cipher.h in Headers */, >- 413E67662169854B00EF37ED /* RTCVideoCodecConstants.h in Headers */, > 5CDD86A21E43B99400621E92 /* circular_buffer.h in Headers */, >+ 
41FCBB7621B1FFA400A5DF27 /* cocoa_threading.h in Headers */, > 5C4B48DC1E42C1E3002651C8 /* codec.h in Headers */, >- 416225E82169819500A91C9B /* objc_video_decoder_factory.h in Headers */, > 5CDD87B01E43BC0500621E92 /* codec.h in Headers */, > 41433CF31F79B33400387B4D /* codec.h in Headers */, >- 5CDD85161E43B1EA00621E92 /* codec_manager.h in Headers */, > 5CDD83C91E439A6F00621E92 /* codec_timer.h in Headers */, > 4102F6E221273416006AE8D7 /* color_space.h in Headers */, > 5CDD8A0E1E43BFB300621E92 /* comfort_noise.h in Headers */, >@@ -13961,27 +14029,31 @@ > 5CDD8FEC1E43CDCA00621E92 /* config.h in Headers */, > 5C63F8DB1E416D53002CA531 /* config.h in Headers */, > 4192416D2127389B00634FCF /* congestion_controller_experiment.h in Headers */, >- 41A08BD721272F1D001D5D7B /* congestion_window_pushback_controller.h in Headers */, >+ 41FCBB4021B1FDB800A5DF27 /* congestion_window_pushback_controller.h in Headers */, > 5CDD8A941E43C00F00621E92 /* constant_pcm_packet_source.h in Headers */, > 5C4B48781E42C1BA002651C8 /* constants.h in Headers */, > 5CDD8B8A1E43C2B500621E92 /* constants.h in Headers */, > 413A236C1FE18E0700373E99 /* constructormagic.h in Headers */, > 413AD1A421265B10003F7263 /* container.h in Headers */, >+ 41FCBB3821B1F8FC00A5DF27 /* control_handler.h in Headers */, > 5CDD8F861E43CBE000621E92 /* controller.h in Headers */, > 5CDD8F841E43CBE000621E92 /* controller_manager.h in Headers */, > 5CDD8AD41E43C23900621E92 /* conversion.h in Headers */, > 41F9BFB42051C93600ABF0B9 /* convert_legacy_video_factory.h in Headers */, > 413A23AC1FE18E0800373E99 /* copyonwritebuffer.h in Headers */, > 41BE71BC215C463700A7B196 /* coretype.h in Headers */, >- 41889D76216BC4EC004614DD /* RTCRtpEncodingParameters.h in Headers */, >+ 41FCBB5121B1FE4600A5DF27 /* cpu_speed_experiment.h in Headers */, > 413A23E21FE18E0800373E99 /* cpu_time.h in Headers */, > 5CDD87B21E43BC0500621E92 /* crc.h in Headers */, > 413A22AC1FE18E0600373E99 /* crc32.h in Headers */, > 
5CDD8B8C1E43C2B500621E92 /* create_augmented_vec.h in Headers */, >+ 41FCBB1221B1F53700A5DF27 /* create_peerconnection_factory.h in Headers */, >+ 41FCBB6421B1FEF600A5DF27 /* create_vp8_temporal_layers.h in Headers */, > 413A23691FE18E0700373E99 /* criticalsection.h in Headers */, > 5CDD8A101E43BFB300621E92 /* cross_correlation.h in Headers */, > 5C08852B1E4A99D200403995 /* crypto_kernel.h in Headers */, > 5C08852C1E4A99D200403995 /* crypto_types.h in Headers */, >+ 41FCBB2E21B1F8B700A5DF27 /* cryptooptions.h in Headers */, > 5C4B48DD1E42C1E3002651C8 /* cryptoparams.h in Headers */, > 41F9BF902051C80100ABF0B9 /* cryptoparams.h in Headers */, > 413A23291FE18E0700373E99 /* cryptstring.h in Headers */, >@@ -14044,7 +14116,6 @@ > 5CDD84B11E43AF1300621E92 /* echo_control_mobile_impl.h in Headers */, > 5CD285BB1E6A63430094FDC8 /* echo_path_delay_estimator.h in Headers */, > 5CD285BD1E6A63430094FDC8 /* echo_path_variability.h in Headers */, >- 417953BD2169824B0028266B /* helpers.h in Headers */, > 5CD285C11E6A63430094FDC8 /* echo_remover.h in Headers */, > 5CD285BF1E6A63430094FDC8 /* echo_remover_metrics.h in Headers */, > 413A213E1FE0F0EF00373E99 /* ekt.h in Headers */, >@@ -14053,6 +14124,7 @@ > 5CDD8A961E43C00F00621E92 /* encode_neteq_input.h in Headers */, > 5CDD83CD1E439A6F00621E92 /* encoded_frame.h in Headers */, > 41A08BAC212681C8001D5D7B /* encoded_frame.h in Headers */, >+ 41FCBB1C21B1F7CD00A5DF27 /* encoded_image.h in Headers */, > 41E02C89212734B900C27CD6 /* encoder_database.h in Headers */, > 5CDD85A01E43B5C000621E92 /* encoder_rtcp_feedback.h in Headers */, > 41ECEAFC20646664009D5141 /* EncoderUtilities.h in Headers */, >@@ -14067,7 +14139,6 @@ > 5C08852E1E4A99D200403995 /* err.h in Headers */, > 41BE71AA215C463700A7B196 /* errwarn.h in Headers */, > 413A23A51FE18E0800373E99 /* event.h in Headers */, >- 413E67852169877D00EF37ED /* RTCVideoEncoderSettings.h in Headers */, > 5CD286151E6A66130094FDC8 /* event_log_writer.h in Headers */, > 
413A237E1FE18E0700373E99 /* event_tracer.h in Headers */, > 419C83501FE246650040C30F /* exp_filter.h in Headers */, >@@ -14076,7 +14147,6 @@ > 41BE71A6215C463700A7B196 /* expr.h in Headers */, > 5CDD894B1E43BF3A00621E92 /* extended_jitter_report.h in Headers */, > 5CDD894E1E43BF3A00621E92 /* extended_reports.h in Headers */, >- 413E67A12169889500EF37ED /* RTCRtpFragmentationHeader.h in Headers */, > 5CDD8A981E43C00F00621E92 /* fake_decode_from_file.h in Headers */, > 41299B972127369100B3414B /* fake_network_pipe.h in Headers */, > 419241562127376F00634FCF /* fakecandidatepair.h in Headers */, >@@ -14084,7 +14154,6 @@ > 419241532127376F00634FCF /* fakedtlstransport.h in Headers */, > 419241592127376F00634FCF /* fakeicetransport.h in Headers */, > 5C4B48DF1E42C1E3002651C8 /* fakemediaengine.h in Headers */, >- 413E676A2169854B00EF37ED /* RTCWrappedNativeVideoDecoder.h in Headers */, > 413A22B71FE18E0700373E99 /* fakenetwork.h in Headers */, > 5C4B48E01E42C1E3002651C8 /* fakenetworkinterface.h in Headers */, > 419241582127376F00634FCF /* fakepackettransport.h in Headers */, >@@ -14104,7 +14173,6 @@ > 41E02C8E212734B900C27CD6 /* fec_rate_table.h in Headers */, > 5CDD887C1E43BE3C00621E92 /* fec_test_helper.h in Headers */, > 41F2638921267F4000274F59 /* fft.h in Headers */, >- 416225E32169819500A91C9B /* objc_frame_buffer.h in Headers */, > 41433D221F79B33400387B4D /* fft.h in Headers */, > 41F2636F21267B3E00274F59 /* fft4g.h in Headers */, > 41F9BFD32051DDE500ABF0B9 /* fft_buffer.h in Headers */, >@@ -14119,12 +14187,10 @@ > 5CDD8C8C1E43C66000621E92 /* file_utils.h in Headers */, > 41F263B2212680A800274F59 /* file_wrapper.h in Headers */, > 413A22E11FE18E0700373E99 /* filerotatingstream.h in Headers */, >- 413A23BD1FE18E0800373E99 /* fileutils.h in Headers */, > 41433CED1F79B33400387B4D /* filterbank_internal.h in Headers */, > 41433D0D1F79B33400387B4D /* filterbank_tables.h in Headers */, > 5CDD8BA11E43C2B500621E92 /* filtered_cb_vecs.h in Headers */, > 
5C11A0041E457400004F0987 /* fine_audio_buffer.h in Headers */, >- 416225D62169818100A91C9B /* video_frame.h in Headers */, > 5CDD89511E43BF3A00621E92 /* fir.h in Headers */, > 5C4B4C6E1E431F9C002651C8 /* fir_filter.h in Headers */, > 419C831A1FE242B20040C30F /* fir_filter_c.h in Headers */, >@@ -14133,12 +14199,10 @@ > 413A23DF1FE18E0800373E99 /* firewallsocketserver.h in Headers */, > 41DDB24D21265BD700296D47 /* fixed_array.h in Headers */, > 415F1FAE21272FBA00064CBF /* fixed_digital_level_estimator.h in Headers */, >- 416D3BE4212731C200775F09 /* fixed_gain_controller.h in Headers */, > 413A23BC1FE18E0800373E99 /* flags.h in Headers */, > 5CDD887F1E43BE3C00621E92 /* flexfec_header_reader_writer.h in Headers */, > 5CDD85371E43B39C00621E92 /* flexfec_receive_stream.h in Headers */, > 5CD284BA1E6A5F9F0094FDC8 /* flexfec_receive_stream_impl.h in Headers */, >- 413E67812169870F00EF37ED /* svc_config.h in Headers */, > 419C83CB1FE247B40040C30F /* flexfec_receiver.h in Headers */, > 419C83CE1FE247B40040C30F /* flexfec_sender.h in Headers */, > 413A22A31FE18E0600373E99 /* format_macros.h in Headers */, >@@ -14152,8 +14216,11 @@ > 5CDD8BA31E43C2B500621E92 /* frame_classify.h in Headers */, > 5CD2849B1E6A5F410094FDC8 /* frame_combiner.h in Headers */, > 5CDD837A1E439A3500621E92 /* frame_dropper.h in Headers */, >+ 41FCBB5A21B1FE6E00A5DF27 /* frame_dumping_decoder.h in Headers */, > 5CDD8F911E43CBE000621E92 /* frame_length_controller.h in Headers */, > 5CDD83D41E439A6F00621E92 /* frame_object.h in Headers */, >+ 41FCBB2F21B1F8B700A5DF27 /* framedecryptorinterface.h in Headers */, >+ 41FCBB3021B1F8B700A5DF27 /* frameencryptorinterface.h in Headers */, > 413A22A21FE18E0600373E99 /* function_view.h in Headers */, > 41F2638E21267F4900274F59 /* g711.h in Headers */, > 4140B8251E4E3383007409E6 /* g711_interface.h in Headers */, >@@ -14164,8 +14231,6 @@ > 5CDD84B31E43AF1300621E92 /* gain_control_for_experimental_agc.h in Headers */, > 5CDD84B51E43AF1300621E92 /* gain_control_impl.h 
in Headers */, > 41A08BCF21272EE2001D5D7B /* gain_controller2.h in Headers */, >- 416D3BE5212731C200775F09 /* gain_curve_applier.h in Headers */, >- 41889D75216BB7B9004614DD /* RTCRtpEncodingParameters+Private.h in Headers */, > 5CDD8BA51E43C2B500621E92 /* gain_dequant.h in Headers */, > 5CDD87031E43BA7500621E92 /* gain_map_internal.h in Headers */, > 5CDD8BA71E43C2B500621E92 /* gain_quant.h in Headers */, >@@ -14176,6 +14241,7 @@ > 5CDD8BAD1E43C2B500621E92 /* get_sync_seq.h in Headers */, > 413A21411FE0F0EF00373E99 /* getopt_s.h in Headers */, > 5CDD872A1E43BABE00621E92 /* gmm.h in Headers */, >+ 41FCBB5F21B1FEC400A5DF27 /* goog_cc_factory.h in Headers */, > 41E02CAA2127352D00C27CD6 /* goog_cc_network_control.h in Headers */, > 413A23011FE18E0700373E99 /* gtest_prod_util.h in Headers */, > 413A22EA1FE18E0700373E99 /* gunit.h in Headers */, >@@ -14187,6 +14253,8 @@ > 4145E4BE1EF894F600FCF6E6 /* h264_profile_level_id.h in Headers */, > 5CD285EB1E6A639F0094FDC8 /* h264_sprop_parameter_sets.h in Headers */, > 5CDD83DA1E439A6F00621E92 /* h264_sps_pps_tracker.h in Headers */, >+ 41FCBB7221B1FF7400A5DF27 /* hdr_metadata.h in Headers */, >+ 417953BD2169824B0028266B /* helpers.h in Headers */, > 413A22D81FE18E0700373E99 /* helpers.h in Headers */, > 5CDD83DC1E439A6F00621E92 /* histogram.h in Headers */, > 419C83581FE246650040C30F /* histogram_percentile_counter.h in Headers */, >@@ -14196,7 +14264,7 @@ > 41A08BAD212681C8001D5D7B /* i010_buffer.h in Headers */, > 5CD284681E6A57F40094FDC8 /* i420_buffer.h in Headers */, > 41109AAD1E5FA19200C0955A /* i420_buffer_pool.h in Headers */, >- 4144B3DE2169A070004363AC /* RTCVideoEncoder.h in Headers */, >+ 41FCBB4921B1FDEF00A5DF27 /* icecredentialsiterator.h in Headers */, > 41A08BC321269577001D5D7B /* icelogger.h in Headers */, > 413091F71EF8CFBD00757C55 /* iceserverparsing.h in Headers */, > 4192415A2127376F00634FCF /* icetransportinternal.h in Headers */, >@@ -14231,6 +14299,7 @@ > 5CDD83E21E439A6F00621E92 /* jitter_buffer.h in 
Headers */, > 5CDD83E01E439A6F00621E92 /* jitter_buffer_common.h in Headers */, > 5CDD83E41E439A6F00621E92 /* jitter_estimator.h in Headers */, >+ 41FCBB5421B1FE4600A5DF27 /* jitter_upper_bound_experiment.h in Headers */, > 5C63F9411E41737B002CA531 /* jsep.h in Headers */, > 5C63F9431E41737B002CA531 /* jsepicecandidate.h in Headers */, > 5C63F9451E41737B002CA531 /* jsepsessiondescription.h in Headers */, >@@ -14239,12 +14308,13 @@ > 5C63F8DE1E416D53002CA531 /* json.h in Headers */, > 413A23651FE18E0700373E99 /* keep_ref_until_done.h in Headers */, > 5C0885301E4A99D200403995 /* key.h in Headers */, >+ 41FCBB6E21B1FF3A00A5DF27 /* key_derivation.h in Headers */, > 4192419B2127586500634FCF /* kiss_fft.h in Headers */, > 5CDD84201E439B2900621E92 /* legacy_encoded_audio_frame.h in Headers */, > 5CDD84B91E43AF1300621E92 /* level_estimator_impl.h in Headers */, >+ 413E67732169863B00EF37ED /* libvpx_interface.h in Headers */, > 41294095212E128D00AD95E7 /* libvpx_vp8_decoder.h in Headers */, > 41294093212E128D00AD95E7 /* libvpx_vp8_encoder.h in Headers */, >- 417953F7216984BE0028266B /* RTCEncodedImage.h in Headers */, > 415F1FB321272FBA00064CBF /* limiter.h in Headers */, > 41BE71AB215C463700A7B196 /* linemap.h in Headers */, > 41BE71B9215C463700A7B196 /* listfmt.h in Headers */, >@@ -14253,10 +14323,10 @@ > 5CDD8C101E43C34600621E92 /* locked_bandwidth_info.h in Headers */, > 413A23321FE18E0700373E99 /* logging.h in Headers */, > 413A23521FE18E0700373E99 /* logsinks.h in Headers */, >+ 41FCBB6921B1FF1B00A5DF27 /* loss_based_bandwidth_estimation.h in Headers */, > 419241C02127593F00634FCF /* loss_rate_filter.h in Headers */, > 5CDD87061E43BA7500621E92 /* loudness_histogram.h in Headers */, > 5CD2855E1E6A62ED0094FDC8 /* low_cut_filter.h in Headers */, >- 416225D52169818100A91C9B /* video_encoder_factory.h in Headers */, > 419241402127372400634FCF /* lp_residual.h in Headers */, > 5CDD87C61E43BC0500621E92 /* lpc_analysis.h in Headers */, > 5CDD8BC21E43C2B500621E92 /* 
lpc_encode.h in Headers */, >@@ -14274,15 +14344,17 @@ > 5CDD8BCE1E43C2B500621E92 /* lsp_to_lsf.h in Headers */, > 413A23971FE18E0800373E99 /* macutils.h in Headers */, > 5CD285CC1E6A63430094FDC8 /* main_filter_update_gain.h in Headers */, >+ 41FCBB3D21B1FD4000A5DF27 /* match.h in Headers */, > 5CD285D01E6A63430094FDC8 /* matched_filter.h in Headers */, >- 4144B3DF2169A070004363AC /* RTCVideoEncoderFactory.h in Headers */, > 5CD285CE1E6A63430094FDC8 /* matched_filter_lag_aggregator.h in Headers */, > 419C83591FE246650040C30F /* mathutils.h in Headers */, > 41F9BFD62051DDE500ABF0B9 /* matrix_buffer.h in Headers */, > 41BE71B8215C463700A7B196 /* md5.h in Headers */, >+ 41FCBB4621B1FDEF00A5DF27 /* mdns_message.h in Headers */, > 5CDD86A51E43B99400621E92 /* mean_variance_estimator.h in Headers */, > 5CDD83E61E439A6F00621E92 /* media_opt_util.h in Headers */, > 5CDD83E81E439A6F00621E92 /* media_optimization.h in Headers */, >+ 41FCBB0F21B1F53700A5DF27 /* media_transport_interface.h in Headers */, > 5C4B48E51E42C1E3002651C8 /* mediachannel.h in Headers */, > 4102F6B221273331006AE8D7 /* mediaconfig.h in Headers */, > 5C4B48E71E42C1E3002651C8 /* mediaconstants.h in Headers */, >@@ -14298,6 +14370,7 @@ > 5CD286331E6A67BF0094FDC8 /* mediatypes.h in Headers */, > 41A08BB821268A7D001D5D7B /* memory.h in Headers */, > 413A23091FE18E0700373E99 /* memory_usage.h in Headers */, >+ 41FCBB8521B2007B00A5DF27 /* memutil.h in Headers */, > 5CDD8A301E43BFB300621E92 /* merge.h in Headers */, > 413A22FA1FE18E0700373E99 /* messagedigest.h in Headers */, > 413A22B91FE18E0700373E99 /* messagehandler.h in Headers */, >@@ -14306,21 +14379,23 @@ > 5C11A0051E457400004F0987 /* mock_audio_device_buffer.h in Headers */, > 5CDD8FED1E43CDCA00621E92 /* mock_audio_processing.h in Headers */, > 419C83561FE246650040C30F /* mod_ops.h in Headers */, >- 5CDD837E1E439A3500621E92 /* moving_average.h in Headers */, >+ 417953E4216983CA0028266B /* module.h in Headers */, >+ 417953E7216983CA0028266B /* 
module_common_types.h in Headers */, >+ 417953E5216983CA0028266B /* module_common_types_public.h in Headers */, >+ 417953E3216983CA0028266B /* module_fec_types.h in Headers */, >+ 41FCBB1621B1F7AA00A5DF27 /* moving_average.h in Headers */, > 5CD286111E6A64C90094FDC8 /* moving_max.h in Headers */, >- 41889D78216BC4EC004614DD /* RTCRtpFragmentationHeader+Private.h in Headers */, > 419C83551FE246650040C30F /* moving_max_counter.h in Headers */, > 419C83521FE246650040C30F /* moving_median_filter.h in Headers */, > 5CDD8C8F1E43C66000621E92 /* moving_moments.h in Headers */, > 5CDD8BD01E43C2B500621E92 /* my_corr.h in Headers */, > 5CDD89541E43BF3A00621E92 /* nack.h in Headers */, > 5CDD83E91E439A6F00621E92 /* nack_fec_tables.h in Headers */, >- 417953EE216984210028266B /* vp9_profile.h in Headers */, > 5CDD83EB1E439A6F00621E92 /* nack_module.h in Headers */, > 5CDD8A331E43BFB300621E92 /* nack_tracker.h in Headers */, > 4145E4B71EF8943E00FCF6E6 /* nalu_rewriter.h in Headers */, >+ 417953BF2169824B0028266B /* nalu_rewriter.h in Headers */, > 413A22C41FE18E0700373E99 /* natserver.h in Headers */, >- 4144B3E02169A070004363AC /* RTCVideoDecoderFactory.h in Headers */, > 413A22D91FE18E0700373E99 /* natsocketfactory.h in Headers */, > 413A22E71FE18E0700373E99 /* nattypes.h in Headers */, > 5CDD8BD21E43C2B500621E92 /* nearest_neighbor.h in Headers */, >@@ -14335,7 +14410,6 @@ > 41E02CCF212735B700C27CD6 /* network_control.h in Headers */, > 41E02CCE212735B700C27CD6 /* network_types.h in Headers */, > 413A237C1FE18E0700373E99 /* networkmonitor.h in Headers */, >- 413E6791216987DB00EF37ED /* RTCVideoFrameBuffer.h in Headers */, > 413A22C21FE18E0700373E99 /* networkroute.h in Headers */, > 5CDD872B1E43BABE00621E92 /* noise_gmm_tables.h in Headers */, > 415F1FB421272FBA00064CBF /* noise_spectrum_estimator.h in Headers */, >@@ -14343,11 +14417,12 @@ > 5CDD84EA1E43B0B600621E92 /* noise_suppression_impl.h in Headers */, > 5CDD85FD1E43B84E00621E92 /* noise_suppression_x.h in Headers */, 
> 5CDD8A3E1E43BFB300621E92 /* normal.h in Headers */, >- 417953EB216983F10028266B /* rtt_mult_experiment.h in Headers */, >+ 41FCBB5521B1FE4600A5DF27 /* normalize_simulcast_size_experiment.h in Headers */, > 5CDD86A81E43B99400621E92 /* normalized_covariance_estimator.h in Headers */, > 5C63F9541E41737B002CA531 /* notifier.h in Headers */, > 5CDD86011E43B84E00621E92 /* ns_core.h in Headers */, > 41ECEAC120640F28009D5141 /* NSString+StdString.h in Headers */, >+ 4144B3EF2169C63F004363AC /* NSString+StdString.h in Headers */, > 5CDD86061E43B84E00621E92 /* nsx_core.h in Headers */, > 5CDD86071E43B84E00621E92 /* nsx_defines.h in Headers */, > 419C83B41FE2472E0040C30F /* null_audio_poller.h in Headers */, >@@ -14355,14 +14430,18 @@ > 5C0885321E4A99D200403995 /* null_cipher.h in Headers */, > 413A23C01FE18E0800373E99 /* nullsocketserver.h in Headers */, > 5CD285331E6A61110094FDC8 /* nullwebrtcvideoengine.h in Headers */, >+ 416225E32169819500A91C9B /* objc_frame_buffer.h in Headers */, >+ 416225E82169819500A91C9B /* objc_video_decoder_factory.h in Headers */, >+ 416225E52169819500A91C9B /* objc_video_encoder_factory.h in Headers */, >+ 416225E92169819500A91C9B /* objc_video_frame.h in Headers */, > 413A23361FE18E0700373E99 /* onetimeevent.h in Headers */, > 5CDD86951E43B93900621E92 /* ooura_fft.h in Headers */, > 5CDD86921E43B93900621E92 /* ooura_fft_tables_common.h in Headers */, >+ 41FCBB7E21B1FFF600A5DF27 /* openssl_key_derivation_hkdf.h in Headers */, > 41DDB266212679A300296D47 /* opensslcertificate.h in Headers */, > 41DDB263212679A300296D47 /* opensslsessioncache.h in Headers */, > 41DDB261212679A300296D47 /* opensslutility.h in Headers */, > 41DDB25121265BE900296D47 /* optional.h in Headers */, >- 413A23451FE18E0700373E99 /* optionsfile.h in Headers */, > 5CDD8C661E43C60900621E92 /* opus_inst.h in Headers */, > 5CDD8C681E43C60900621E92 /* opus_interface.h in Headers */, > 5CDD87CF1E43BC0500621E92 /* os_specific_inline.h in Headers */, >@@ -14378,7 +14457,6 @@ > 
41F2639921267F5E00274F59 /* pa_ringbuffer.h in Headers */, > 5CDD8FA31E43CCBE00621E92 /* paced_sender.h in Headers */, > 419C832E1FE245CD0040C30F /* pacer.h in Headers */, >- 4192418D212749C800634FCF /* pacer_controller.h in Headers */, > 5CDD8BD41E43C2B500621E92 /* pack_bits.h in Headers */, > 5CDD83EF1E439A6F00621E92 /* packet.h in Headers */, > 5CDD8AB01E43C00F00621E92 /* packet.h in Headers */, >@@ -14386,7 +14464,6 @@ > 5CDD8A411E43BFB300621E92 /* packet_buffer.h in Headers */, > 5CDD83ED1E439A6F00621E92 /* packet_buffer.h in Headers */, > 5CDD888D1E43BE3C00621E92 /* packet_loss_stats.h in Headers */, >- 413E67952169881900EF37ED /* svc_rate_allocator.h in Headers */, > 419241C82127593F00634FCF /* packet_number_indexed_queue.h in Headers */, > 4102F6B921273382006AE8D7 /* packet_receiver.h in Headers */, > 5CDD8FA51E43CCBE00621E92 /* packet_router.h in Headers */, >@@ -14395,7 +14472,6 @@ > 5C63F9F41E4174F6002CA531 /* packetsocketfactory.h in Headers */, > 5C63F9F51E4174F6002CA531 /* packettransportinterface.h in Headers */, > 419C83D91FE247F20040C30F /* packettransportinternal.h in Headers */, >- 413A23901FE18E0800373E99 /* pathutils.h in Headers */, > 5C4B48841E42C1BA002651C8 /* payload_type_mapper.h in Headers */, > 5CDD90011E43CE3A00621E92 /* pcm16b.h in Headers */, > 5CD285001E6A60570094FDC8 /* peerconnection.h in Headers */, >@@ -14405,7 +14481,6 @@ > 4102F69521273206006AE8D7 /* peerconnectioninternal.h in Headers */, > 5C63F95B1E41737B002CA531 /* peerconnectionproxy.h in Headers */, > 41F9BFE02051DE1800ABF0B9 /* peerconnectionwrapper.h in Headers */, >- 417953BB216982420028266B /* RTCDefaultVideoDecoderFactory.h in Headers */, > 419C834E1FE246650040C30F /* percentile_filter.h in Headers */, > 41BE71C0215C463700A7B196 /* phash.h in Headers */, > 413A23EA1FE18E0800373E99 /* physicalsocketserver.h in Headers */, >@@ -14426,10 +14501,8 @@ > 413A231D1FE18E0700373E99 /* platform_thread_types.h in Headers */, > 5CDD88901E43BE3C00621E92 /* playout_delay_oracle.h 
in Headers */, > 5CDD89571E43BF3A00621E92 /* pli.h in Headers */, >- 413E676C2169854B00EF37ED /* RTCWrappedNativeVideoEncoder.h in Headers */, > 5CDD87341E43BABE00621E92 /* pole_zero_filter.h in Headers */, > 5CDD8BD61E43C2B500621E92 /* poly_to_lsf.h in Headers */, >- 417953BF2169824B0028266B /* nalu_rewriter.h in Headers */, > 5CDD8BD81E43C2B500621E92 /* poly_to_lsp.h in Headers */, > 5C63F9F81E4174F6002CA531 /* port.h in Headers */, > 5C63F9FB1E4174F6002CA531 /* portallocator.h in Headers */, >@@ -14446,20 +14519,17 @@ > 413A231F1FE18E0700373E99 /* proxyinfo.h in Headers */, > 413A23C51FE18E0800373E99 /* proxyserver.h in Headers */, > 5C63F9FF1E4174F6002CA531 /* pseudotcp.h in Headers */, >- 4106D56D216C2BAA001E3C40 /* string_view.h in Headers */, > 5CDD89591E43BF3A00621E92 /* psfb.h in Headers */, > 5CDD87561E43BAF500621E92 /* push_sinc_resampler.h in Headers */, > 5CDD83821E439A3500621E92 /* quality_scaler.h in Headers */, > 4192416C2127389B00634FCF /* quality_scaling_experiment.h in Headers */, > 5CD2862B1E6A66C80094FDC8 /* quality_threshold.h in Headers */, >- 4144B3D42169896D004363AC /* RTCNativeI420Buffer+Private.h in Headers */, > 413A23EF1FE18E0800373E99 /* race_checker.h in Headers */, > 413A234C1FE18E0700373E99 /* random.h in Headers */, > 5CDD8A4B1E43BFB300621E92 /* random_vector.h in Headers */, > 5CDD895C1E43BF3A00621E92 /* rapid_resync_request.h in Headers */, > 413A23601FE18E0700373E99 /* rate_limiter.h in Headers */, > 413A230D1FE18E0700373E99 /* rate_statistics.h in Headers */, >- 4144B3EF2169C63F004363AC /* NSString+StdString.h in Headers */, > 413A23041FE18E0700373E99 /* ratetracker.h in Headers */, > 41A08BCB2126961F001D5D7B /* raw_logging.h in Headers */, > 5C0885331E4A99D200403995 /* rdb.h in Headers */, >@@ -14483,7 +14553,6 @@ > 5C63FA021E4174F6002CA531 /* relayport.h in Headers */, > 5C63FA051E4174F6002CA531 /* relayserver.h in Headers */, > 5CDD89621E43BF3A00621E92 /* remb.h in Headers */, >- 413E677E216986AE00EF37ED /* 
rtp_header_extension_size.h in Headers */, > 4102F6A721273262006AE8D7 /* remix_resample.h in Headers */, > 5CDD8FDC1E43CD6600621E92 /* remote_bitrate_estimator_abs_send_time.h in Headers */, > 5CDD8FDE1E43CD6600621E92 /* remote_bitrate_estimator_single_stream.h in Headers */, >@@ -14516,7 +14585,6 @@ > 41F9BFA92051C8E200ABF0B9 /* rtc_event_alr_state.h in Headers */, > 419C839D1FE246F90040C30F /* rtc_event_audio_network_adaptation.h in Headers */, > 419C83981FE246F90040C30F /* rtc_event_audio_playout.h in Headers */, >- 413E678A216987B900EF37ED /* RTCCVPixelBuffer.h in Headers */, > 419C83A11FE246F90040C30F /* rtc_event_audio_receive_stream_config.h in Headers */, > 419C838E1FE246F90040C30F /* rtc_event_audio_send_stream_config.h in Headers */, > 419C83931FE246F90040C30F /* rtc_event_bwe_update_delay_based.h in Headers */, >@@ -14529,7 +14597,6 @@ > 419C84231FE24BA60040C30F /* rtc_event_log_output_file.h in Headers */, > 419C839F1FE246F90040C30F /* rtc_event_probe_cluster_created.h in Headers */, > 419C83A01FE246F90040C30F /* rtc_event_probe_result_failure.h in Headers */, >- 4144B3D22169896D004363AC /* RTCNativeI420Buffer.h in Headers */, > 419C839A1FE246F90040C30F /* rtc_event_probe_result_success.h in Headers */, > 419C83901FE246F90040C30F /* rtc_event_rtcp_packet_incoming.h in Headers */, > 419C83941FE246F90040C30F /* rtc_event_rtcp_packet_outgoing.h in Headers */, >@@ -14537,7 +14604,6 @@ > 419C83AB1FE247080040C30F /* rtc_event_rtp_packet_outgoing.h in Headers */, > 419C83AE1FE247080040C30F /* rtc_event_video_receive_stream_config.h in Headers */, > 419C83B01FE247080040C30F /* rtc_event_video_send_stream_config.h in Headers */, >- 417953D42169834F0028266B /* channel_send.h in Headers */, > 419C841E1FE24AEF0040C30F /* rtc_stream_config.h in Headers */, > 413A24621FE1991A00373E99 /* RTCAudioSession.h in Headers */, > 413A24581FE1991A00373E99 /* RTCAudioSessionConfiguration.h in Headers */, >@@ -14545,21 +14611,28 @@ > 413A24571FE1991A00373E99 /* RTCAudioTrack.h 
in Headers */, > 413A245B1FE1991A00373E99 /* RTCCameraPreviewView.h in Headers */, > 413A244D1FE1991A00373E99 /* RTCCameraVideoCapturer.h in Headers */, >- 417953F9216984BE0028266B /* RTCI420Buffer.h in Headers */, > 413A23231FE18E0700373E99 /* rtccertificate.h in Headers */, > 413A22DB1FE18E0700373E99 /* rtccertificategenerator.h in Headers */, >+ 417953F8216984BE0028266B /* RTCCodecSpecificInfo.h in Headers */, >+ 417953BA216982420028266B /* RTCCodecSpecificInfoH264+Private.h in Headers */, >+ 417953C02169824B0028266B /* RTCCodecSpecificInfoH264.h in Headers */, > 413A24441FE1991A00373E99 /* RTCConfiguration.h in Headers */, >+ 413E678A216987B900EF37ED /* RTCCVPixelBuffer.h in Headers */, > 413A24551FE1991A00373E99 /* RTCDataChannel.h in Headers */, > 413A24701FE1991A00373E99 /* RTCDataChannelConfiguration.h in Headers */, >- 416225D22169818100A91C9B /* video_decoder_factory.h in Headers */, >+ 417953BB216982420028266B /* RTCDefaultVideoDecoderFactory.h in Headers */, >+ 417953C22169824B0028266B /* RTCDefaultVideoEncoderFactory.h in Headers */, > 413A24521FE1991A00373E99 /* RTCDispatcher.h in Headers */, > 413A245D1FE1991A00373E99 /* RTCEAGLVideoView.h in Headers */, >+ 4144B3F6216B081B004363AC /* RTCEncodedImage+Private.h in Headers */, >+ 417953F7216984BE0028266B /* RTCEncodedImage.h in Headers */, > 41F411B11EF8DA0200343C26 /* rtcerror.h in Headers */, >- 417953E5216983CA0028266B /* module_common_types_public.h in Headers */, > 41F9BF882051C80100ABF0B9 /* rtceventlogoutput.h in Headers */, > 413A24401FE1991A00373E99 /* RTCFieldTrials.h in Headers */, > 413A245A1FE1991A00373E99 /* RTCFileLogger.h in Headers */, > 413A24561FE1991A00373E99 /* RTCFileVideoCapturer.h in Headers */, >+ 417953C32169824B0028266B /* RTCH264ProfileLevelId.h in Headers */, >+ 417953F9216984BE0028266B /* RTCI420Buffer.h in Headers */, > 413A244E1FE1991A00373E99 /* RTCIceCandidate.h in Headers */, > 413A245E1FE1991A00373E99 /* RTCIceServer.h in Headers */, > 413A24481FE1991A00373E99 /* 
RTCIntervalRange.h in Headers */, >@@ -14574,12 +14647,13 @@ > 413A246A1FE1991A00373E99 /* RTCMetricsSampleInfo.h in Headers */, > 413A24671FE1991A00373E99 /* RTCMTLNSVideoView.h in Headers */, > 413A24711FE1991A00373E99 /* RTCMTLVideoView.h in Headers */, >+ 4144B3D42169896D004363AC /* RTCNativeI420Buffer+Private.h in Headers */, >+ 4144B3D22169896D004363AC /* RTCNativeI420Buffer.h in Headers */, >+ 4144B3D52169896D004363AC /* RTCNativeMutableI420Buffer.h in Headers */, > 413A24591FE1991A00373E99 /* RTCNSGLVideoView.h in Headers */, >- 5CDD8A501E43BFB300621E92 /* rtcp.h in Headers */, > 419C82B21FE20DCD0040C30F /* rtcp_demuxer.h in Headers */, > 5CD285FB1E6A64520094FDC8 /* rtcp_nack_stats.h in Headers */, > 5CDD88981E43BE3C00621E92 /* rtcp_packet.h in Headers */, >- 4144B3EB2169C5E3004363AC /* RTCVideoCodecInfo+Private.h in Headers */, > 419C82BD1FE20DCD0040C30F /* rtcp_packet_sink_interface.h in Headers */, > 5CDD889B1E43BE3C00621E92 /* rtcp_receiver.h in Headers */, > 5CDD889E1E43BE3C00621E92 /* rtcp_sender.h in Headers */, >@@ -14588,15 +14662,17 @@ > 419C83051FE20F020040C30F /* rtcp_transceiver_impl.h in Headers */, > 413A243E1FE1991A00373E99 /* RTCPeerConnection.h in Headers */, > 413A24631FE1991A00373E99 /* RTCPeerConnectionFactory.h in Headers */, >- 417953C22169824B0028266B /* RTCDefaultVideoEncoderFactory.h in Headers */, > 413A246C1FE1991A00373E99 /* RTCRtpCodecParameters.h in Headers */, >+ 41889D75216BB7B9004614DD /* RTCRtpEncodingParameters+Private.h in Headers */, >+ 41889D76216BC4EC004614DD /* RTCRtpEncodingParameters.h in Headers */, > 413A244C1FE1991A00373E99 /* RTCRtpEncodingParameters.h in Headers */, >+ 41889D78216BC4EC004614DD /* RTCRtpFragmentationHeader+Private.h in Headers */, >+ 413E67A12169889500EF37ED /* RTCRtpFragmentationHeader.h in Headers */, > 413A24451FE1991A00373E99 /* RTCRtpParameters.h in Headers */, > 413A243B1FE1991A00373E99 /* RTCRtpReceiver.h in Headers */, > 413A245C1FE1991A00373E99 /* RTCRtpSender.h in Headers */, > 
413A244A1FE1991A00373E99 /* RTCSessionDescription.h in Headers */, > 413A24661FE1991A00373E99 /* RTCSSLAdapter.h in Headers */, >- 417953C62169824B0028266B /* RTCVideoDecoderH264.h in Headers */, > 4145E4C81EF896D700FCF6E6 /* rtcstats.h in Headers */, > 4145E4C71EF896D700FCF6E6 /* rtcstats_objects.h in Headers */, > 5CD285081E6A60570094FDC8 /* rtcstatscollector.h in Headers */, >@@ -14607,19 +14683,37 @@ > 413A243D1FE1991A00373E99 /* RTCVideoCapturer.h in Headers */, > 41ECEAB620630108009D5141 /* RTCVideoCodec+Private.h in Headers */, > 413A24431FE1991A00373E99 /* RTCVideoCodec.h in Headers */, >+ 413E67662169854B00EF37ED /* RTCVideoCodecConstants.h in Headers */, > 413A24531FE1991A00373E99 /* RTCVideoCodecFactory.h in Headers */, > 413A243C1FE1991A00373E99 /* RTCVideoCodecH264.h in Headers */, >+ 4144B3EB2169C5E3004363AC /* RTCVideoCodecInfo+Private.h in Headers */, >+ 413E67992169883900EF37ED /* RTCVideoCodecInfo.h in Headers */, >+ 4144B3DD2169A070004363AC /* RTCVideoDecoder.h in Headers */, >+ 4144B3E02169A070004363AC /* RTCVideoDecoderFactory.h in Headers */, >+ 417953C52169824B0028266B /* RTCVideoDecoderFactoryH264.h in Headers */, >+ 417953C62169824B0028266B /* RTCVideoDecoderH264.h in Headers */, > 413A24651FE1991A00373E99 /* RTCVideoDecoderVP8.h in Headers */, > 413A24601FE1991A00373E99 /* RTCVideoDecoderVP9.h in Headers */, >+ 4144B3DE2169A070004363AC /* RTCVideoEncoder.h in Headers */, >+ 4144B3DF2169A070004363AC /* RTCVideoEncoderFactory.h in Headers */, >+ 417953C82169824B0028266B /* RTCVideoEncoderFactoryH264.h in Headers */, >+ 417953C92169824B0028266B /* RTCVideoEncoderH264.h in Headers */, >+ 413E67A02169889500EF37ED /* RTCVideoEncoderQpThresholds.h in Headers */, >+ 4144B3EA2169C5E3004363AC /* RTCVideoEncoderSettings+Private.h in Headers */, >+ 413E67852169877D00EF37ED /* RTCVideoEncoderSettings.h in Headers */, >+ 413E67692169854B00EF37ED /* RTCVideoEncoderVP8.h in Headers */, > 413A246E1FE1991A00373E99 /* RTCVideoEncoderVP8.h in Headers */, > 
413A24461FE1991A00373E99 /* RTCVideoEncoderVP9.h in Headers */, >- 413E67732169863B00EF37ED /* libvpx_interface.h in Headers */, > 413A244F1FE1991A00373E99 /* RTCVideoFrame.h in Headers */, >+ 413E678F216987DB00EF37ED /* RTCVideoFrame.h in Headers */, >+ 413E6791216987DB00EF37ED /* RTCVideoFrameBuffer.h in Headers */, > 413A24411FE1991A00373E99 /* RTCVideoFrameBuffer.h in Headers */, > 413A246F1FE1991A00373E99 /* RTCVideoRenderer.h in Headers */, > 413A24611FE1991A00373E99 /* RTCVideoSource.h in Headers */, > 413A24501FE1991A00373E99 /* RTCVideoTrack.h in Headers */, > 413A243A1FE1991A00373E99 /* RTCVideoViewShading.h in Headers */, >+ 413E676A2169854B00EF37ED /* RTCWrappedNativeVideoDecoder.h in Headers */, >+ 413E676C2169854B00EF37ED /* RTCWrappedNativeVideoEncoder.h in Headers */, > 419C82B81FE20DCD0040C30F /* rtp_config.h in Headers */, > 419C83C91FE247B40040C30F /* rtp_cvo.h in Headers */, > 413092011EF8D0A600757C55 /* rtp_demuxer.h in Headers */, >@@ -14630,11 +14724,11 @@ > 5CDD88AC1E43BE3C00621E92 /* rtp_format_vp8.h in Headers */, > 5CDD88AF1E43BE3C00621E92 /* rtp_format_vp9.h in Headers */, > 5CDD83F71E439A6F00621E92 /* rtp_frame_reference_finder.h in Headers */, >- 417953D62169834F0028266B /* channel_receive.h in Headers */, > 5CDD8AB91E43C00F00621E92 /* rtp_generator.h in Headers */, > 41E02C7B2127347400C27CD6 /* rtp_generic_frame_descriptor.h in Headers */, > 41E02C792127347400C27CD6 /* rtp_generic_frame_descriptor_extension.h in Headers */, > 419C83C41FE247B40040C30F /* rtp_header_extension_map.h in Headers */, >+ 413E677E216986AE00EF37ED /* rtp_header_extension_size.h in Headers */, > 5CDD88B61E43BE3C00621E92 /* rtp_header_extensions.h in Headers */, > 419C83CC1FE247B40040C30F /* rtp_header_parser.h in Headers */, > 41F9BF8D2051C80100ABF0B9 /* rtp_headers.h in Headers */, >@@ -14642,7 +14736,6 @@ > 5CDD88BA1E43BE3D00621E92 /* rtp_packet_history.h in Headers */, > 5CDD88BB1E43BE3D00621E92 /* rtp_packet_received.h in Headers */, > 
419C82B61FE20DCD0040C30F /* rtp_packet_sink_interface.h in Headers */, >- 413E678F216987DB00EF37ED /* RTCVideoFrame.h in Headers */, > 5CDD88BC1E43BE3D00621E92 /* rtp_packet_to_send.h in Headers */, > 41D6B45A21273159008F9353 /* rtp_payload_params.h in Headers */, > 419C83C31FE247B40040C30F /* rtp_rtcp.h in Headers */, >@@ -14679,6 +14772,7 @@ > 419C82DB1FE20E590040C30F /* rtptransportinternaladapter.h in Headers */, > 5C4B48F21E42C1E3002651C8 /* rtputils.h in Headers */, > 5CDD83F91E439A6F00621E92 /* rtt_filter.h in Headers */, >+ 417953EB216983F10028266B /* rtt_mult_experiment.h in Headers */, > 419241CB2127593F00634FCF /* rtt_stats.h in Headers */, > 419C82C21FE20DCD0040C30F /* rtx_receive_stream.h in Headers */, > 419241622127385B00634FCF /* rw_lock_posix.h in Headers */, >@@ -14690,6 +14784,7 @@ > 413A23281FE18E0700373E99 /* sanitizer.h in Headers */, > 416D3BEA212731C200775F09 /* saturation_protector.h in Headers */, > 413A23051FE18E0700373E99 /* scoped_ref_ptr.h in Headers */, >+ 41FCBB1021B1F53700A5DF27 /* scoped_refptr.h in Headers */, > 41F9BFB62051C93600ABF0B9 /* scopedvideodecoder.h in Headers */, > 41F9BFB72051C93600ABF0B9 /* scopedvideoencoder.h in Headers */, > 419C84361FE24E800040C30F /* screenshare_layers.h in Headers */, >@@ -14704,6 +14799,7 @@ > 5CDD85BA1E43B5C000621E92 /* send_statistics_proxy.h in Headers */, > 4192418B212749C800634FCF /* send_time_history.h in Headers */, > 5CDD89731E43BF3A00621E92 /* sender_report.h in Headers */, >+ 41FCBB2821B1F87E00A5DF27 /* sent_packet.h in Headers */, > 419241AC212758D300634FCF /* sequence_buffer.h in Headers */, > 413A22D41FE18E0700373E99 /* sequenced_task_checker.h in Headers */, > 413A23AE1FE18E0800373E99 /* sequenced_task_checker_impl.h in Headers */, >@@ -14715,7 +14811,6 @@ > 5CD285E01E6A63430094FDC8 /* shadow_filter_update_gain.h in Headers */, > 415F1FB521272FBA00064CBF /* signal_classifier.h in Headers */, > 41F9BFC52051CAE500ABF0B9 /* signal_processing_library.h in Headers */, >- 
417953D72169834F0028266B /* channel_send_proxy.h in Headers */, > 413A23811FE18E0700373E99 /* signalthread.h in Headers */, > 41F263BC2126811500274F59 /* sigslot.h in Headers */, > 413A23331FE18E0700373E99 /* sigslotrepeater.h in Headers */, >@@ -14755,7 +14850,6 @@ > 413A213F1FE0F0EF00373E99 /* srtp.h in Headers */, > 413A213D1FE0F0EF00373E99 /* srtp_priv.h in Headers */, > 419C82E11FE20E590040C30F /* srtpsession.h in Headers */, >- 417953C82169824B0028266B /* RTCVideoEncoderFactoryH264.h in Headers */, > 419C82E21FE20E590040C30F /* srtptransport.h in Headers */, > 41E02CBD2127356A00C27CD6 /* sslcertificate.h in Headers */, > 413A23AF1FE18E0800373E99 /* sslfingerprint.h in Headers */, >@@ -14766,7 +14860,6 @@ > 5CDD87371E43BABE00621E92 /* standalone_vad.h in Headers */, > 5C0885351E4A99D200403995 /* stat.h in Headers */, > 5CDD8BEC1E43C2B500621E92 /* state_construct.h in Headers */, >- 416225E92169819500A91C9B /* objc_video_frame.h in Headers */, > 5CDD8BEE1E43C2B500621E92 /* state_search.h in Headers */, > 5CDD8A521E43BFB300621E92 /* statistics_calculator.h in Headers */, > 5CDD85BD1E43B5C000621E92 /* stats_counter.h in Headers */, >@@ -14778,6 +14871,7 @@ > 5C4B48F51E42C1E3002651C8 /* streamparams.h in Headers */, > 41DDB26C212679D200296D47 /* string_builder.h in Headers */, > 413A23641FE18E0700373E99 /* string_to_number.h in Headers */, >+ 4106D56D216C2BAA001E3C40 /* string_view.h in Headers */, > 413A22EB1FE18E0700373E99 /* stringencode.h in Headers */, > 413A23B11FE18E0800373E99 /* stringize_macros.h in Headers */, > 413A22CE1FE18E0700373E99 /* stringutils.h in Headers */, >@@ -14791,13 +14885,14 @@ > 5CD285E11E6A63430094FDC8 /* subtractor_output.h in Headers */, > 5CD285E51E6A63430094FDC8 /* suppression_filter.h in Headers */, > 5CD285E71E6A63430094FDC8 /* suppression_gain.h in Headers */, >+ 413E67812169870F00EF37ED /* svc_config.h in Headers */, >+ 413E67952169881900EF37ED /* svc_rate_allocator.h in Headers */, > 5CDD8BF01E43C2B500621E92 /* swap_bytes.h in 
Headers */, > 413A23471FE18E0700373E99 /* swap_queue.h in Headers */, > 419241A9212758D300634FCF /* symmetric_matrix_buffer.h in Headers */, > 41BE71A3215C463700A7B196 /* symrec.h in Headers */, > 5CDD8A551E43BFB300621E92 /* sync_buffer.h in Headers */, > 5CD284BF1E6A5F9F0094FDC8 /* syncable.h in Headers */, >- 4144B3F6216B081B004363AC /* RTCEncodedImage+Private.h in Headers */, > 5CDD89791E43BF3A00621E92 /* target_bitrate.h in Headers */, > 413A23391FE18E0700373E99 /* task_queue.h in Headers */, > 413A237D1FE18E0700373E99 /* task_queue_posix.h in Headers */, >@@ -14816,9 +14911,9 @@ > 413A22E81FE18E0700373E99 /* thread_checker.h in Headers */, > 413A22B81FE18E0700373E99 /* thread_checker_impl.h in Headers */, > 5CDD84C41E43AF1300621E92 /* three_band_filter_bank.h in Headers */, >+ 41FCBB2321B1F82F00A5DF27 /* throw_delegate.h in Headers */, > 5CDD8A581E43BFB300621E92 /* tick_timer.h in Headers */, > 419241DD21275A3000634FCF /* time_delta.h in Headers */, >- 419C83341FE245EA0040C30F /* time_interval.h in Headers */, > 5CDD8A5B1E43BFB300621E92 /* time_stretch.h in Headers */, > 5CDD88DB1E43BE3D00621E92 /* time_util.h in Headers */, > 419241E021275A3000634FCF /* timestamp.h in Headers */, >@@ -14856,20 +14951,17 @@ > 413A234B1FE18E0700373E99 /* type_traits.h in Headers */, > 5CDD84C61E43AF1300621E92 /* typing_detection.h in Headers */, > 5C63FA311E4174F6002CA531 /* udpport.h in Headers */, >- 419241572127376F00634FCF /* udptransport.h in Headers */, > 413A246B1FE1991A00373E99 /* UIDevice+RTCDevice.h in Headers */, > 5CDD88E01E43BE3D00621E92 /* ulpfec_generator.h in Headers */, > 5CDD88E31E43BE3D00621E92 /* ulpfec_header_reader_writer.h in Headers */, > 419C83C61FE247B40040C30F /* ulpfec_receiver.h in Headers */, > 5CDD88E51E43BE3D00621E92 /* ulpfec_receiver_impl.h in Headers */, > 5C63F9751E41737B002CA531 /* umametrics.h in Headers */, >- 413A22A41FE18E0600373E99 /* unixfilesystem.h in Headers */, > 5CDD8BF21E43C2B500621E92 /* unpack_bits.h in Headers */, > 
41F263B0212680A800274F59 /* unused.h in Headers */, > 413A21401FE0F0EF00373E99 /* ut_sim.h in Headers */, > 5CDD87091E43BA7500621E92 /* utility.h in Headers */, > 5CDD87DF1E43BC2E00621E92 /* utility.h in Headers */, >- 417953BA216982420028266B /* RTCCodecSpecificInfoH264+Private.h in Headers */, > 5CDD873B1E43BABE00621E92 /* vad_audio_proc.h in Headers */, > 5CDD87381E43BABE00621E92 /* vad_audio_proc_internal.h in Headers */, > 5CDD873E1E43BABE00621E92 /* vad_circular_buffer.h in Headers */, >@@ -14888,19 +14980,21 @@ > 5CDD84021E439A6F00621E92 /* video_coding_impl.h in Headers */, > 419C84131FE249AB0040C30F /* video_content_type.h in Headers */, > 4145E48D1EF88B9D00FCF6E6 /* video_decoder.h in Headers */, >+ 416225D22169818100A91C9B /* video_decoder_factory.h in Headers */, > 4145F61E1FE1F38D00EB9CAF /* video_decoder_factory.h in Headers */, > 41F2636621267ADF00274F59 /* video_decoder_software_fallback_wrapper.h in Headers */, > 4145E48E1EF88B9D00FCF6E6 /* video_encoder.h in Headers */, > 41F2636821267ADF00274F59 /* video_encoder_config.h in Headers */, >+ 416225D52169818100A91C9B /* video_encoder_factory.h in Headers */, > 4145F61F1FE1F38D00EB9CAF /* video_encoder_factory.h in Headers */, > 41F2636A21267ADF00274F59 /* video_encoder_software_fallback_wrapper.h in Headers */, >+ 416225D62169818100A91C9B /* video_frame.h in Headers */, > 5CD2846B1E6A57F40094FDC8 /* video_frame.h in Headers */, > 41109AAE1E5FA19200C0955A /* video_frame_buffer.h in Headers */, > 5CD284691E6A57F40094FDC8 /* video_frame_buffer.h in Headers */, >+ 416225D32169818100A91C9B /* video_frame_buffer.h in Headers */, > 4102F6CF212733B7006AE8D7 /* video_quality_observer.h in Headers */, > 419C82C41FE20DCD0040C30F /* video_receive_stream.h in Headers */, >- 417953C52169824B0028266B /* RTCVideoDecoderFactoryH264.h in Headers */, >- 413E67992169883900EF37ED /* RTCVideoCodecInfo.h in Headers */, > 5CDD85CB1E43B5C000621E92 /* video_receive_stream.h in Headers */, > 5C4B4C211E431F75002651C8 /* 
video_render_frames.h in Headers */, > 5CD2846C1E6A57F40094FDC8 /* video_rotation.h in Headers */, >@@ -14932,13 +15026,14 @@ > 5CD285191E6A60570094FDC8 /* videotracksource.h in Headers */, > 413A22BC1FE18E0700373E99 /* virtualsocketserver.h in Headers */, > 5CDD87411E43BABE00621E92 /* voice_activity_detector.h in Headers */, >- 417953E7216983CA0028266B /* module_common_types.h in Headers */, > 5CDD84C81E43AF1300621E92 /* voice_detection_impl.h in Headers */, > 5CDD87421E43BABE00621E92 /* voice_gmm_tables.h in Headers */, > 5CFD53811E4BA4F500482908 /* voice_processing_audio_unit.h in Headers */, > 4145F6171FE1EFCA00EB9CAF /* vp8.h in Headers */, > 419241F421275C3200634FCF /* vp8_encoder_simulcast_proxy.h in Headers */, > 5CDD83861E439A3500621E92 /* vp8_header_parser.h in Headers */, >+ 41FCBB6521B1FEF600A5DF27 /* vp8_temporal_layers.h in Headers */, >+ 417953EE216984210028266B /* vp9_profile.h in Headers */, > 5CDD8BF41E43C2B500621E92 /* vq3.h in Headers */, > 5CDD8BF61E43C2B500621E92 /* vq4.h in Headers */, > 5C4B4C811E431F9C002651C8 /* wav_file.h in Headers */, >@@ -14955,7 +15050,6 @@ > 5C4B48931E42C1BA002651C8 /* webrtcvideocapturerfactory.h in Headers */, > 5C4B48941E42C1BA002651C8 /* webrtcvideodecoderfactory.h in Headers */, > 5C4B48961E42C1BA002651C8 /* webrtcvideoencoderfactory.h in Headers */, >- 416225D32169818100A91C9B /* video_frame_buffer.h in Headers */, > 4145E4D21EF8CC2000FCF6E6 /* webrtcvideoengine.h in Headers */, > 5C4B489E1E42C1BA002651C8 /* webrtcvoiceengine.h in Headers */, > 5CDD8BF81E43C2B500621E92 /* window32_w32.h in Headers */, >@@ -15611,7 +15705,6 @@ > isa = PBXSourcesBuildPhase; > buildActionMask = 2147483647; > files = ( >- 5C088C101E4AA44400403995 /* bundlefilter.cc in Sources */, > 5C088C121E4AA44400403995 /* channel.cc in Sources */, > 5C088C141E4AA44400403995 /* channelmanager.cc in Sources */, > 5C088C181E4AA44400403995 /* externalhmac.cc in Sources */, >@@ -16107,15 +16200,12 @@ > 5CDD8B6A1E43C2B500621E92 /* abs_quant_loop.c in 
Sources */, > 5CDD89F81E43BFB300621E92 /* accelerate.cc in Sources */, > 41E02CA82127352D00C27CD6 /* acknowledged_bitrate_estimator.cc in Sources */, >- 416225D82169818100A91C9B /* video_decoder_factory.mm in Sources */, > 5CDD85031E43B1EA00621E92 /* acm_codec_database.cc in Sources */, > 5CDD85091E43B1EA00621E92 /* acm_receiver.cc in Sources */, >- 417953D32169834F0028266B /* channel_send.cc in Sources */, > 5CDD850B1E43B1EA00621E92 /* acm_resampler.cc in Sources */, > 5C4B48D71E42C1E3002651C8 /* adaptedvideotracksource.cc in Sources */, > 416D3BDB212731C200775F09 /* adaptive_agc.cc in Sources */, > 415F1FA621272FBA00064CBF /* adaptive_digital_gain_applier.cc in Sources */, >- 413E67942169881900EF37ED /* svc_rate_allocator.cc in Sources */, > 5CD285A41E6A63430094FDC8 /* adaptive_fir_filter.cc in Sources */, > 416D3BDE212731C200775F09 /* adaptive_mode_level_estimator.cc in Sources */, > 415F1FA821272FBA00064CBF /* adaptive_mode_level_estimator_agc.cc in Sources */, >@@ -16130,6 +16220,7 @@ > 5CDD85E71E43B81000621E92 /* aecm_core.cc in Sources */, > 5CDD85E41E43B81000621E92 /* aecm_core_c.cc in Sources */, > 5CDD87011E43BA7500621E92 /* agc.cc in Sources */, >+ 413E679B2169886200EF37ED /* agc2_common.cc in Sources */, > 5CDD86FF1E43BA7500621E92 /* agc_manager_direct.cc in Sources */, > 5CDD8FD11E43CD6600621E92 /* aimd_rate_control.cc in Sources */, > 41F2639E2126800000274F59 /* aligned_malloc.cc in Sources */, >@@ -16145,6 +16236,7 @@ > 41433D291F79B33400387B4D /* arith_routines_hist.c in Sources */, > 5CDD87A81E43BC0500621E92 /* arith_routines_logist.c in Sources */, > 41433D0F1F79B33400387B4D /* arith_routines_logist.c in Sources */, >+ 41FCBB7A21B1FFC000A5DF27 /* ascii.cc in Sources */, > 413A238C1FE18E0700373E99 /* asyncinvoker.cc in Sources */, > 413A216D1FE18D6A00373E99 /* asyncpacketsocket.cc in Sources */, > 413A23551FE18E0700373E99 /* asyncresolverinterface.cc in Sources */, >@@ -16172,7 +16264,6 @@ > 4140B8221E4E3383007409E6 /* audio_decoder_pcm.cc in 
Sources */, > 5CDD8FFC1E43CE3A00621E92 /* audio_decoder_pcm16b.cc in Sources */, > 5C119FFB1E457400004F0987 /* audio_device_buffer.cc in Sources */, >- 41889D77216BC4EC004614DD /* RTCRtpEncodingParameters.mm in Sources */, > 5C11A01D1E457578004F0987 /* audio_device_dummy.cc in Sources */, > 5C119FFE1E457400004F0987 /* audio_device_generic.cc in Sources */, > 5C11A00B1E457448004F0987 /* audio_device_mac.cc in Sources */, >@@ -16222,7 +16313,6 @@ > 5C4B4C5F1E431F9C002651C8 /* audio_util.cc in Sources */, > 5CDD8A041E43BFB300621E92 /* audio_vector.cc in Sources */, > 5CD284F01E6A60570094FDC8 /* audiotrack.cc in Sources */, >- 417953D82169834F0028266B /* channel_receive.cc in Sources */, > 5CDD8B721E43C2B500621E92 /* augmented_cb_corr.c in Sources */, > 5CDD86401E43B8B500621E92 /* auto_corr_to_refl_coef.c in Sources */, > 5CDD86411E43B8B500621E92 /* auto_correlation.c in Sources */, >@@ -16244,6 +16334,7 @@ > 5C4B4C191E431F75002651C8 /* bitrate_adjuster.cc in Sources */, > 5CDD852F1E43B39C00621E92 /* bitrate_allocator.cc in Sources */, > 5CDD8F7D1E43CBE000621E92 /* bitrate_controller.cc in Sources */, >+ 41FCBB5C21B1FEA300A5DF27 /* bitrate_controller.cc in Sources */, > 5CDD905D1E43D11200621E92 /* bitrate_controller_impl.cc in Sources */, > 41E02CAE2127352D00C27CD6 /* bitrate_estimator.cc in Sources */, > 5CDD8FA01E43CCBE00621E92 /* bitrate_prober.cc in Sources */, >@@ -16257,6 +16348,7 @@ > 413A239B1FE18E0800373E99 /* bufferqueue.cc in Sources */, > 5CD285481E6A61D20094FDC8 /* builtin_audio_decoder_factory.cc in Sources */, > 413091FB1EF8CFF800757C55 /* builtin_audio_encoder_factory.cc in Sources */, >+ 41FCBB1B21B1F7CD00A5DF27 /* builtin_video_bitrate_allocator_factory.cc in Sources */, > 41F2635F21267ADF00274F59 /* builtin_video_decoder_factory.cc in Sources */, > 41E02CC42127358800C27CD6 /* builtin_video_encoder_factory.cc in Sources */, > 5CDD8B741E43C2B500621E92 /* bw_expand.c in Sources */, >@@ -16279,12 +16371,14 @@ > 5CDD8B821E43C2B500621E92 /* 
cb_update_best_index.c in Sources */, > 5C4B4C641E431F9C002651C8 /* channel_buffer.cc in Sources */, > 5CDD8F801E43CBE000621E92 /* channel_controller.cc in Sources */, >+ 417953D82169834F0028266B /* channel_receive.cc in Sources */, >+ 417953D32169834F0028266B /* channel_send.cc in Sources */, > 5CDD8B841E43C2B500621E92 /* chebyshev.c in Sources */, > 413A23731FE18E0700373E99 /* checks.cc in Sources */, > 5CDD86A11E43B99400621E92 /* circular_buffer.cc in Sources */, > 5C4B4CC61E4320A9002651C8 /* clock.cc in Sources */, >+ 41FCBB7521B1FFA400A5DF27 /* cocoa_threading.mm in Sources */, > 5C4B48DB1E42C1E3002651C8 /* codec.cc in Sources */, >- 5CDD85151E43B1EA00621E92 /* codec_manager.cc in Sources */, > 5CDD83C81E439A6F00621E92 /* codec_timer.cc in Sources */, > 4102F6DC21273416006AE8D7 /* color_space.cc in Sources */, > 5CDD8A0D1E43BFB300621E92 /* comfort_noise.cc in Sources */, >@@ -16299,13 +16393,12 @@ > 415F1FAA21272FBA00064CBF /* compute_interpolated_gain_curve.cc in Sources */, > 5CDD8FEB1E43CDCA00621E92 /* config.cc in Sources */, > 419241712127389B00634FCF /* congestion_controller_experiment.cc in Sources */, >- 41A08BD821272F1D001D5D7B /* congestion_window_pushback_controller.cc in Sources */, >+ 41FCBB4121B1FDB800A5DF27 /* congestion_window_pushback_controller.cc in Sources */, > 5CDD8A931E43C00F00621E92 /* constant_pcm_packet_source.cc in Sources */, > 5CDD8B891E43C2B500621E92 /* constants.c in Sources */, > 419C83FB1FE249370040C30F /* constants.cc in Sources */, > 41E02C782127347400C27CD6 /* contributing_sources.cc in Sources */, >- 413E67A22169889500EF37ED /* RTCRtpFragmentationHeader.m in Sources */, >- 416225E72169819500A91C9B /* objc_video_decoder_factory.mm in Sources */, >+ 41FCBB3921B1F8FC00A5DF27 /* control_handler.cc in Sources */, > 5CDD8F851E43CBE000621E92 /* controller.cc in Sources */, > 5CDD8F831E43CBE000621E92 /* controller_manager.cc in Sources */, > 41F9BFAC2051C91900ABF0B9 /* convert_legacy_video_factory.cc in Sources */, >@@ -16313,13 
+16406,16 @@ > 413A23151FE18E0700373E99 /* copyonwritebuffer.cc in Sources */, > 5C4B4CCA1E4320A9002651C8 /* cpu_features.cc in Sources */, > 5C4B4CCB1E4320A9002651C8 /* cpu_info.cc in Sources */, >+ 41FCBB5321B1FE4600A5DF27 /* cpu_speed_experiment.cc in Sources */, > 5CDD87B11E43BC0500621E92 /* crc.c in Sources */, > 413A23431FE18E0700373E99 /* crc32.cc in Sources */, > 5CDD8B8B1E43C2B500621E92 /* create_augmented_vec.c in Sources */, >- 4145E4CA1EF8CB3200FCF6E6 /* createpeerconnectionfactory.cc in Sources */, >+ 41FCBB0E21B1F53700A5DF27 /* create_peerconnection_factory.cc in Sources */, >+ 41FCBB6621B1FEF600A5DF27 /* create_vp8_temporal_layers.cc in Sources */, > 413A23881FE18E0700373E99 /* criticalsection.cc in Sources */, > 5CDD864B1E43B8B500621E92 /* cross_correlation.c in Sources */, > 5CDD8A0F1E43BFB300621E92 /* cross_correlation.cc in Sources */, >+ 41FCBB3121B1F8B700A5DF27 /* cryptooptions.cc in Sources */, > 413A23221FE18E0700373E99 /* cryptstring.cc in Sources */, > 419C848E1FE2574D0040C30F /* custom_extensions.cc in Sources */, > 419C848C1FE2574D0040C30F /* d1_both.cc in Sources */, >@@ -16353,7 +16449,6 @@ > 41E02CA52127352D00C27CD6 /* delay_based_bwe.cc in Sources */, > 5CDD868D1E43B93900621E92 /* delay_estimator.cc in Sources */, > 5CDD868B1E43B93900621E92 /* delay_estimator_wrapper.cc in Sources */, >- 417953FA216984BE0028266B /* RTCEncodedImage.m in Sources */, > 5CDD8A1D1E43BFB300621E92 /* delay_manager.cc in Sources */, > 5CDD8A201E43BFB300621E92 /* delay_peak_detector.cc in Sources */, > 5CDD86EE1E43BA6D00621E92 /* digital_agc.c in Sources */, >@@ -16388,10 +16483,10 @@ > 5CDD87B71E43BC0500621E92 /* encode.c in Sources */, > 41433D121F79B33400387B4D /* encode.c in Sources */, > 5CDD87B51E43BC0500621E92 /* encode_lpc_swb.c in Sources */, >- 417953DC216983910028266B /* field_trial.cc in Sources */, > 5CDD8A951E43C00F00621E92 /* encode_neteq_input.cc in Sources */, > 5CDD83CC1E439A6F00621E92 /* encoded_frame.cc in Sources */, > 
41A08BAB212681C8001D5D7B /* encoded_frame.cc in Sources */, >+ 41FCBB1D21B1F7CD00A5DF27 /* encoded_image.cc in Sources */, > 41E02C88212734B900C27CD6 /* encoder_database.cc in Sources */, > 5CDD859F1E43B5C000621E92 /* encoder_rtcp_feedback.cc in Sources */, > 5CDD86511E43B8B500621E92 /* energy.c in Sources */, >@@ -16414,7 +16509,6 @@ > 5CDD894A1E43BF3A00621E92 /* extended_jitter_report.cc in Sources */, > 5CDD894D1E43BF3A00621E92 /* extended_reports.cc in Sources */, > 5CDD8A971E43C00F00621E92 /* fake_decode_from_file.cc in Sources */, >- 413E67822169870F00EF37ED /* svc_config.cc in Sources */, > 41299B962127369100B3414B /* fake_network_pipe.cc in Sources */, > 413A23101FE18E0700373E99 /* fakeclock.cc in Sources */, > 413A23CC1FE18E0800373E99 /* fakesslidentity.cc in Sources */, >@@ -16425,11 +16519,11 @@ > 419241D3212759A100634FCF /* fec_private_tables_random.cc in Sources */, > 5CDD887B1E43BE3C00621E92 /* fec_test_helper.cc in Sources */, > 41F2638A21267F4000274F59 /* fft.c in Sources */, >- 417953EA216983F10028266B /* rtt_mult_experiment.cc in Sources */, > 41433CEA1F79B33400387B4D /* fft.c in Sources */, > 41F2636E21267B3E00274F59 /* fft4g.c in Sources */, > 41F9BFD72051DDE500ABF0B9 /* fft_buffer.cc in Sources */, > 4192413D2127372400634FCF /* fft_util.cc in Sources */, >+ 417953DC216983910028266B /* field_trial.cc in Sources */, > 4192416E2127389B00634FCF /* field_trial_parser.cc in Sources */, > 419241E821275A7600634FCF /* field_trial_units.cc in Sources */, > 413A22E61FE18E0700373E99 /* file.cc in Sources */, >@@ -16437,14 +16531,11 @@ > 5C11A01F1E457578004F0987 /* file_audio_device_factory.cc in Sources */, > 413A23D81FE18E0800373E99 /* file_posix.cc in Sources */, > 5CDD8C8B1E43C66000621E92 /* file_utils.cc in Sources */, >- 416225D72169818100A91C9B /* video_frame_buffer.mm in Sources */, > 41F263AC212680A800274F59 /* file_wrapper.cc in Sources */, > 413A23DA1FE18E0800373E99 /* filerotatingstream.cc in Sources */, >- 413A23B51FE18E0800373E99 /* 
fileutils.cc in Sources */, > 5CDD86551E43B8B500621E92 /* filter_ar.c in Sources */, > 5CDD86541E43B8B500621E92 /* filter_ar_fast_q12.c in Sources */, > 5CDD87BC1E43BC0500621E92 /* filter_functions.c in Sources */, >- 4144B3E82169C5E3004363AC /* RTCVideoCodecInfo+Private.mm in Sources */, > 5CDD86561E43B8B500621E92 /* filter_ma_fast_q12.c in Sources */, > 41433D271F79B33400387B4D /* filterbank_tables.c in Sources */, > 41433D061F79B33400387B4D /* filterbanks.c in Sources */, >@@ -16457,7 +16548,6 @@ > 419C831D1FE242B20040C30F /* fir_filter_factory.cc in Sources */, > 5C4B4C6A1E431F9C002651C8 /* fir_filter_sse.cc in Sources */, > 415F1FAD21272FBA00064CBF /* fixed_digital_level_estimator.cc in Sources */, >- 416D3BE3212731C200775F09 /* fixed_gain_controller.cc in Sources */, > 413A217D1FE18D7800373E99 /* flags.cc in Sources */, > 5CDD887E1E43BE3C00621E92 /* flexfec_header_reader_writer.cc in Sources */, > 4102F6BC21273382006AE8D7 /* flexfec_receive_stream.cc in Sources */, >@@ -16471,8 +16561,8 @@ > 5CDD83D11E439A6F00621E92 /* frame_buffer2.cc in Sources */, > 5CDD8BA21E43C2B500621E92 /* frame_classify.c in Sources */, > 5CD2849A1E6A5F410094FDC8 /* frame_combiner.cc in Sources */, >- 417953C72169824B0028266B /* RTCVideoDecoderH264.mm in Sources */, > 5CDD83791E439A3500621E92 /* frame_dropper.cc in Sources */, >+ 41FCBB5921B1FE6E00A5DF27 /* frame_dumping_decoder.cc in Sources */, > 5CDD8F901E43CBE000621E92 /* frame_length_controller.cc in Sources */, > 5CDD83D31E439A6F00621E92 /* frame_object.cc in Sources */, > 41F2638D21267F4900274F59 /* g711.c in Sources */, >@@ -16484,7 +16574,6 @@ > 5CDD84B21E43AF1300621E92 /* gain_control_for_experimental_agc.cc in Sources */, > 5CDD84B41E43AF1300621E92 /* gain_control_impl.cc in Sources */, > 41A08BD021272EE2001D5D7B /* gain_controller2.cc in Sources */, >- 415F1FB121272FBA00064CBF /* gain_curve_applier.cc in Sources */, > 5CDD8BA41E43C2B500621E92 /* gain_dequant.c in Sources */, > 5CDD8BA61E43C2B500621E92 /* gain_quant.c in 
Sources */, > 5CDD83D51E439A6F00621E92 /* generic_decoder.cc in Sources */, >@@ -16495,19 +16584,20 @@ > 5CDD86581E43B8B500621E92 /* get_scaling_square.c in Sources */, > 5CDD8BAC1E43C2B500621E92 /* get_sync_seq.c in Sources */, > 5CDD87291E43BABE00621E92 /* gmm.cc in Sources */, >- 41E02CA92127352D00C27CD6 /* goog_cc_factory.cc in Sources */, >+ 41FCBB6021B1FEC400A5DF27 /* goog_cc_factory.cc in Sources */, > 41E02CB62127352D00C27CD6 /* goog_cc_network_control.cc in Sources */, > 5CDD85581E43B42B00621E92 /* h264.cc in Sources */, > 5CDD83541E43257200621E92 /* h264_bitstream_parser.cc in Sources */, > 5CDD83561E43257200621E92 /* h264_common.cc in Sources */, > 4145E4BF1EF894F600FCF6E6 /* h264_profile_level_id.cc in Sources */, >- 41889D79216BC4EC004614DD /* RTCRtpFragmentationHeader+Private.mm in Sources */, > 5CD285EA1E6A639F0094FDC8 /* h264_sprop_parameter_sets.cc in Sources */, > 5CDD83D91E439A6F00621E92 /* h264_sps_pps_tracker.cc in Sources */, > 419C84691FE255FA0040C30F /* handshake.cc in Sources */, > 419C84891FE2574D0040C30F /* handshake_client.cc in Sources */, > 419C848B1FE2574D0040C30F /* handshake_server.cc in Sources */, >+ 41FCBB7121B1FF7400A5DF27 /* hdr_metadata.cc in Sources */, > 413A236E1FE18E0700373E99 /* helpers.cc in Sources */, >+ 417953BC2169824B0028266B /* helpers.cc in Sources */, > 5CDD83DB1E439A6F00621E92 /* histogram.cc in Sources */, > 419C83511FE246650040C30F /* histogram_percentile_counter.cc in Sources */, > 5CDD8BAE1E43C2B500621E92 /* hp_input.c in Sources */, >@@ -16517,6 +16607,7 @@ > 5CDD90501E43D0E900621E92 /* i420.cc in Sources */, > 5CD284671E6A57F40094FDC8 /* i420_buffer.cc in Sources */, > 5C4B4C1B1E431F75002651C8 /* i420_buffer_pool.cc in Sources */, >+ 41FCBB4721B1FDEF00A5DF27 /* icecredentialsiterator.cc in Sources */, > 41A08BC421269577001D5D7B /* icelogger.cc in Sources */, > 413091F81EF8CFBD00757C55 /* iceserverparsing.cc in Sources */, > 419C83E01FE247F20040C30F /* icetransportinternal.cc in Sources */, >@@ -16539,7 
+16630,6 @@ > 415F1FB221272FBA00064CBF /* interpolated_gain_curve.cc in Sources */, > 419C829A1FE20CA10040C30F /* interval_budget.cc in Sources */, > 5CDD87C01E43BC0500621E92 /* intialize.c in Sources */, >- 417953CC216982DB0028266B /* RTCCodecSpecificInfoH264.mm in Sources */, > 413A23ED1FE18E0800373E99 /* ipaddress.cc in Sources */, > 5CDD87C31E43BC0500621E92 /* isac.c in Sources */, > 41299B912127367B00B3414B /* isac_vad.c in Sources */, >@@ -16547,29 +16637,32 @@ > 5CDD837B1E439A3500621E92 /* ivf_file_writer.cc in Sources */, > 5CDD83E11E439A6F00621E92 /* jitter_buffer.cc in Sources */, > 5CDD83E31E439A6F00621E92 /* jitter_estimator.cc in Sources */, >+ 41FCBB5221B1FE4600A5DF27 /* jitter_upper_bound_experiment.cc in Sources */, > 41F9BF862051C80100ABF0B9 /* jsep.cc in Sources */, > 5CD284741E6A5D280094FDC8 /* jsepicecandidate.cc in Sources */, > 41F2638421267B7700274F59 /* jsepicecandidate.cc in Sources */, > 5CD284721E6A5D080094FDC8 /* jsepsessiondescription.cc in Sources */, > 41F9BFE32051DE1800ABF0B9 /* jseptransport.cc in Sources */, > 4102F69021273206006AE8D7 /* jseptransportcontroller.cc in Sources */, >+ 41FCBB6D21B1FF3A00A5DF27 /* key_derivation.cc in Sources */, > 4192419A2127586500634FCF /* kiss_fft.cc in Sources */, > 5CDD87C41E43BC0500621E92 /* lattice.c in Sources */, >- 413E67682169854B00EF37ED /* RTCVideoDecoderVP8.mm in Sources */, > 41433CF21F79B33400387B4D /* lattice.c in Sources */, >- 416225D42169818100A91C9B /* video_encoder_factory.mm in Sources */, > 41433D141F79B33400387B4D /* lattice_c.c in Sources */, > 5CDD841F1E439B2900621E92 /* legacy_encoded_audio_frame.cc in Sources */, > 5CDD84B81E43AF1300621E92 /* level_estimator_impl.cc in Sources */, > 5CDD865A1E43B8B500621E92 /* levinson_durbin.c in Sources */, >+ 413E67752169863B00EF37ED /* libvpx_interface.cc in Sources */, > 41294094212E128D00AD95E7 /* libvpx_vp8_decoder.cc in Sources */, > 41294092212E128D00AD95E7 /* libvpx_vp8_encoder.cc in Sources */, > 416D3BE6212731C200775F09 /* 
limiter.cc in Sources */, > 5CD284F61E6A60570094FDC8 /* localaudiosource.cc in Sources */, > 413A23EB1FE18E0800373E99 /* location.cc in Sources */, > 5CDD8C0F1E43C34600621E92 /* locked_bandwidth_info.cc in Sources */, >+ 4144B3D821698D3D004363AC /* logging.cc in Sources */, > 413A22C01FE18E0700373E99 /* logging_mac.mm in Sources */, > 413A23BE1FE18E0800373E99 /* logsinks.cc in Sources */, >+ 41FCBB6A21B1FF1B00A5DF27 /* loss_based_bandwidth_estimation.cc in Sources */, > 419241C52127593F00634FCF /* loss_rate_filter.cc in Sources */, > 5CDD87051E43BA7500621E92 /* loudness_histogram.cc in Sources */, > 5CD2855D1E6A62ED0094FDC8 /* low_cut_filter.cc in Sources */, >@@ -16578,7 +16671,6 @@ > 5CDD8BC11E43C2B500621E92 /* lpc_encode.c in Sources */, > 5CDD87C71E43BC0500621E92 /* lpc_gain_swb_tables.c in Sources */, > 41433CEE1F79B33400387B4D /* lpc_masking_model.c in Sources */, >- 4144B3F02169C63F004363AC /* NSString+StdString.mm in Sources */, > 5CDD87C91E43BC0500621E92 /* lpc_shape_swb12_tables.c in Sources */, > 5CDD87CB1E43BC0500621E92 /* lpc_shape_swb16_tables.c in Sources */, > 5CDD87CD1E43BC0500621E92 /* lpc_tables.c in Sources */, >@@ -16593,13 +16685,15 @@ > 413A232C1FE18E0700373E99 /* macifaddrs_converter.cc in Sources */, > 413A23381FE18E0700373E99 /* macutils.cc in Sources */, > 5CD285CB1E6A63430094FDC8 /* main_filter_update_gain.cc in Sources */, >+ 41FCBB3C21B1FD4000A5DF27 /* match.cc in Sources */, > 5CD285CF1E6A63430094FDC8 /* matched_filter.cc in Sources */, > 5CD285CD1E6A63430094FDC8 /* matched_filter_lag_aggregator.cc in Sources */, > 41F9BFD42051DDE500ABF0B9 /* matrix_buffer.cc in Sources */, >+ 41FCBB4821B1FDEF00A5DF27 /* mdns_message.cc in Sources */, > 5CDD86A41E43B99400621E92 /* mean_variance_estimator.cc in Sources */, > 5CDD83E51E439A6F00621E92 /* media_opt_util.cc in Sources */, > 5CDD83E71E439A6F00621E92 /* media_optimization.cc in Sources */, >- 416225E62169819500A91C9B /* objc_video_encoder_factory.mm in Sources */, >+ 41FCBB1121B1F53700A5DF27 
/* media_transport_interface.cc in Sources */, > 4102F6B321273331006AE8D7 /* mediachannel.cc in Sources */, > 5C4B48E61E42C1E3002651C8 /* mediaconstants.cc in Sources */, > 5C63F9481E41737B002CA531 /* mediaconstraintsinterface.cc in Sources */, >@@ -16607,24 +16701,25 @@ > 41E02CB12127352D00C27CD6 /* median_slope_estimator.cc in Sources */, > 5CD284FA1E6A60570094FDC8 /* mediastream.cc in Sources */, > 5CD286401E6A683A0094FDC8 /* mediastreaminterface.cc in Sources */, >- 4144B3D32169896D004363AC /* RTCNativeI420Buffer.mm in Sources */, > 5CD284FC1E6A60570094FDC8 /* mediastreamobserver.cc in Sources */, > 5CD286321E6A67BF0094FDC8 /* mediatypes.cc in Sources */, > 413A23801FE18E0700373E99 /* memory_usage.cc in Sources */, >+ 41FCBB8421B2007B00A5DF27 /* memutil.cc in Sources */, > 5CDD8A2F1E43BFB300621E92 /* merge.cc in Sources */, > 413A22F91FE18E0700373E99 /* messagedigest.cc in Sources */, > 413A236B1FE18E0700373E99 /* messagehandler.cc in Sources */, > 413A22B11FE18E0700373E99 /* messagequeue.cc in Sources */, >+ 417953DB216983910028266B /* metrics.cc in Sources */, > 5CDD865E1E43B8B500621E92 /* min_max_operations.c in Sources */, >- 5CDD837D1E439A3500621E92 /* moving_average.cc in Sources */, >- 413E67A32169889500EF37ED /* RTCVideoEncoderQpThresholds.m in Sources */, >+ 417953E6216983CA0028266B /* module_common_types.cc in Sources */, >+ 41FCBB1521B1F7AA00A5DF27 /* moving_average.cc in Sources */, > 5CD286101E6A64C90094FDC8 /* moving_max.cc in Sources */, > 5CDD8C8E1E43C66000621E92 /* moving_moments.cc in Sources */, > 5CDD8BCF1E43C2B500621E92 /* my_corr.c in Sources */, > 5CDD89531E43BF3A00621E92 /* nack.cc in Sources */, > 5CDD83EA1E439A6F00621E92 /* nack_module.cc in Sources */, > 5CDD8A321E43BFB300621E92 /* nack_tracker.cc in Sources */, >- 417953B9216982420028266B /* RTCVideoEncoderFactoryH264.m in Sources */, >+ 417953B4216982420028266B /* nalu_rewriter.cc in Sources */, > 413A23031FE18E0700373E99 /* nattypes.cc in Sources */, > 5CDD8BD11E43C2B500621E92 /* 
nearest_neighbor.c in Sources */, > 5CDD8A3B1E43BFB300621E92 /* neteq.cc in Sources */, >@@ -16637,46 +16732,44 @@ > 41E02CCD212735B700C27CD6 /* network_types.cc in Sources */, > 413A238B1FE18E0700373E99 /* networkmonitor.cc in Sources */, > 416D3BE7212731C200775F09 /* noise_level_estimator.cc in Sources */, >- 413E6790216987DB00EF37ED /* RTCVideoFrame.mm in Sources */, > 416D3BE8212731C200775F09 /* noise_spectrum_estimator.cc in Sources */, > 5CDD85FE1E43B84E00621E92 /* noise_suppression.c in Sources */, > 5CDD84E91E43B0B600621E92 /* noise_suppression_impl.cc in Sources */, > 5CDD85FC1E43B84E00621E92 /* noise_suppression_x.c in Sources */, >- 417953B8216982420028266B /* RTCVideoDecoderFactoryH264.m in Sources */, >- 413A235F1FE18E0700373E99 /* noop.cc in Sources */, >- 413A23EE1FE18E0800373E99 /* noop.mm in Sources */, > 5CDD8A3D1E43BFB300621E92 /* normal.cc in Sources */, >+ 41FCBB5621B1FE4600A5DF27 /* normalize_simulcast_size_experiment.cc in Sources */, > 5CDD86A71E43B99400621E92 /* normalized_covariance_estimator.cc in Sources */, > 5CDD86001E43B84E00621E92 /* ns_core.c in Sources */, >+ 4144B3F02169C63F004363AC /* NSString+StdString.mm in Sources */, > 5CDD86051E43B84E00621E92 /* nsx_core.c in Sources */, > 5CDD86021E43B84E00621E92 /* nsx_core_c.c in Sources */, > 41F411AD1EF8D91F00343C26 /* null_aec_dump_factory.cc in Sources */, > 419C83B31FE2472E0040C30F /* null_audio_poller.cc in Sources */, > 413A23D21FE18E0800373E99 /* nullsocketserver.cc in Sources */, >+ 416225E42169819500A91C9B /* objc_frame_buffer.mm in Sources */, >+ 416225E72169819500A91C9B /* objc_video_decoder_factory.mm in Sources */, >+ 416225E62169819500A91C9B /* objc_video_encoder_factory.mm in Sources */, >+ 416225E22169819500A91C9B /* objc_video_frame.mm in Sources */, > 5CDD86941E43B93900621E92 /* ooura_fft.cc in Sources */, > 5CDD86911E43B93900621E92 /* ooura_fft_sse2.cc in Sources */, >+ 41FCBB7D21B1FFF600A5DF27 /* openssl_key_derivation_hkdf.cc in Sources */, > 419C84271FE24BD10040C30F 
/* openssladapter.cc in Sources */, > 41DDB264212679A300296D47 /* opensslcertificate.cc in Sources */, > 419C84261FE24BCF0040C30F /* openssldigest.cc in Sources */, > 419C84251FE24BC80040C30F /* opensslidentity.cc in Sources */, > 41DDB262212679A300296D47 /* opensslsessioncache.cc in Sources */, > 419C843F1FE250B90040C30F /* opensslstreamadapter.cc in Sources */, >- 413E679B2169886200EF37ED /* agc2_common.cc in Sources */, > 41DDB265212679A300296D47 /* opensslutility.cc in Sources */, > 41DDB25221265BE900296D47 /* optional.cc in Sources */, >- 413A234A1FE18E0700373E99 /* optionsfile.cc in Sources */, > 5CDD8C671E43C60900621E92 /* opus_interface.c in Sources */, > 5CDD8FD71E43CD6600621E92 /* overuse_detector.cc in Sources */, >- 417953D52169834F0028266B /* channel_send_proxy.cc in Sources */, > 5CDD8FD91E43CD6600621E92 /* overuse_estimator.cc in Sources */, > 5CDD85A41E43B5C000621E92 /* overuse_frame_detector.cc in Sources */, > 5C63F9ED1E4174F6002CA531 /* p2pconstants.cc in Sources */, > 5C63F9F21E4174F6002CA531 /* p2ptransportchannel.cc in Sources */, > 41F2639A21267F5E00274F59 /* pa_ringbuffer.c in Sources */, > 5CDD8FA21E43CCBE00621E92 /* paced_sender.cc in Sources */, >- 41299B8B2127365100B3414B /* pacer_controller.cc in Sources */, > 5CDD8BD31E43C2B500621E92 /* pack_bits.c in Sources */, >- 4144B3D121698967004363AC /* RTCNativeMutableI420Buffer.mm in Sources */, > 5CDD8A421E43BFB300621E92 /* packet.cc in Sources */, > 5CDD83EE1E439A6F00621E92 /* packet.cc in Sources */, > 5CDD8AAF1E43C00F00621E92 /* packet.cc in Sources */, >@@ -16688,7 +16781,6 @@ > 412455561EF887FB00F11809 /* packetlossestimator.cc in Sources */, > 419C83F51FE248CA0040C30F /* packetsocketfactory.cc in Sources */, > 419C83DA1FE247F20040C30F /* packettransportinternal.cc in Sources */, >- 413A233C1FE18E0700373E99 /* pathutils.cc in Sources */, > 5C4B48831E42C1BA002651C8 /* payload_type_mapper.cc in Sources */, > 5CDD90001E43CE3A00621E92 /* pcm16b.c in Sources */, > 5CD284FF1E6A60570094FDC8 /* 
peerconnection.cc in Sources */, >@@ -16707,7 +16799,6 @@ > 5CDD87301E43BABE00621E92 /* pitch_internal.cc in Sources */, > 5CDD87D51E43BC0600621E92 /* pitch_lag_tables.c in Sources */, > 41433D191F79B33400387B4D /* pitch_lag_tables.c in Sources */, >- 413E676D2169854B00EF37ED /* RTCWrappedNativeVideoEncoder.mm in Sources */, > 419241412127372400634FCF /* pitch_search.cc in Sources */, > 419241442127372800634FCF /* pitch_search_internal.cc in Sources */, > 413A235A1FE18E0700373E99 /* platform_file.cc in Sources */, >@@ -16747,7 +16838,6 @@ > 413A23621FE18E0700373E99 /* ratetracker.cc in Sources */, > 41A08BCC21269620001D5D7B /* raw_logging.cc in Sources */, > 5CDD86611E43B8B500621E92 /* real_fft.c in Sources */, >- 413E67752169863B00EF37ED /* libvpx_interface.cc in Sources */, > 5C4B4C771E431F9C002651C8 /* real_fourier.cc in Sources */, > 5C4B4C721E431F9C002651C8 /* real_fourier_ooura.cc in Sources */, > 413091FE1EF8D06D00757C55 /* receive_side_congestion_controller.cc in Sources */, >@@ -16783,14 +16873,12 @@ > 5CDD86641E43B8B500621E92 /* resample_by_2_internal.c in Sources */, > 5CDD86681E43B8B500621E92 /* resample_fractional.c in Sources */, > 5CDD8AB11E43C00F00621E92 /* resample_input_audio_file.cc in Sources */, >- 413E67982169883900EF37ED /* RTCVideoCodecInfo.m in Sources */, > 5CDD87581E43BAF500621E92 /* resampler.cc in Sources */, > 5CDD84BD1E43AF1300621E92 /* residual_echo_detector.cc in Sources */, > 5CD285DD1E6A63430094FDC8 /* residual_echo_estimator.cc in Sources */, > 5C4B4C7A1E431F9C002651C8 /* ring_buffer.c in Sources */, > 5CDD84BF1E43AF1300621E92 /* rms_level.cc in Sources */, > 419241462127372800634FCF /* rnn.cc in Sources */, >- 417953BC2169824B0028266B /* helpers.cc in Sources */, > 419241992127586500634FCF /* rnn_vad_weights.cc in Sources */, > 5CFD538F1E4BD3A300482908 /* rotate_neon.cc in Sources */, > 5CFD53901E4BD3A300482908 /* rotate_neon64.cc in Sources */, >@@ -16804,7 +16892,6 @@ > 419C839B1FE246F90040C30F /* 
rtc_event_audio_receive_stream_config.cc in Sources */, > 419C839E1FE246F90040C30F /* rtc_event_audio_send_stream_config.cc in Sources */, > 419C838B1FE246F90040C30F /* rtc_event_bwe_update_delay_based.cc in Sources */, >- 417953EF216984210028266B /* vp9_profile.cc in Sources */, > 419C838D1FE246F90040C30F /* rtc_event_bwe_update_loss_based.cc in Sources */, > 41A08BC021269553001D5D7B /* rtc_event_ice_candidate_pair.cc in Sources */, > 41A08BBD21269553001D5D7B /* rtc_event_ice_candidate_pair_config.cc in Sources */, >@@ -16824,10 +16911,17 @@ > 419C841D1FE24AEF0040C30F /* rtc_stream_config.cc in Sources */, > 413A230F1FE18E0700373E99 /* rtccertificate.cc in Sources */, > 413A22BE1FE18E0700373E99 /* rtccertificategenerator.cc in Sources */, >+ 417953CC216982DB0028266B /* RTCCodecSpecificInfoH264.mm in Sources */, >+ 413E678B216987B900EF37ED /* RTCCVPixelBuffer.mm in Sources */, >+ 417953B5216982420028266B /* RTCDefaultVideoDecoderFactory.m in Sources */, >+ 417953B7216982420028266B /* RTCDefaultVideoEncoderFactory.m in Sources */, >+ 4144B3F5216B081B004363AC /* RTCEncodedImage+Private.mm in Sources */, >+ 417953FA216984BE0028266B /* RTCEncodedImage.m in Sources */, > 41F411B01EF8DA0100343C26 /* rtcerror.cc in Sources */, >- 5CDD8A4F1E43BFB300621E92 /* rtcp.cc in Sources */, >+ 4144B3D621698A1A004363AC /* RTCH264ProfileLevelId.mm in Sources */, >+ 4144B3D32169896D004363AC /* RTCNativeI420Buffer.mm in Sources */, >+ 4144B3D121698967004363AC /* RTCNativeMutableI420Buffer.mm in Sources */, > 419C82B91FE20DCD0040C30F /* rtcp_demuxer.cc in Sources */, >- 413E67762169866100EF37ED /* RTCVideoCodecConstants.mm in Sources */, > 5CD285FA1E6A64520094FDC8 /* rtcp_nack_stats.cc in Sources */, > 5CDD88971E43BE3C00621E92 /* rtcp_packet.cc in Sources */, > 5CDD889A1E43BE3C00621E92 /* rtcp_receiver.cc in Sources */, >@@ -16835,15 +16929,31 @@ > 419C82FF1FE20F020040C30F /* rtcp_transceiver.cc in Sources */, > 419C83021FE20F020040C30F /* rtcp_transceiver_config.cc in Sources */, > 
419C83001FE20F020040C30F /* rtcp_transceiver_impl.cc in Sources */, >+ 41889D77216BC4EC004614DD /* RTCRtpEncodingParameters.mm in Sources */, >+ 41889D79216BC4EC004614DD /* RTCRtpFragmentationHeader+Private.mm in Sources */, >+ 41FCBB8921B200FD00A5DF27 /* RTCRtpFragmentationHeader.m in Sources */, > 5C63FA781E417AED002CA531 /* rtcstats.cc in Sources */, > 5C63FA771E417AED002CA531 /* rtcstats_objects.cc in Sources */, > 5CD285071E6A60570094FDC8 /* rtcstatscollector.cc in Sources */, > 5C63FA791E417AED002CA531 /* rtcstatsreport.cc in Sources */, > 4102F69221273206006AE8D7 /* rtcstatstraversal.cc in Sources */, >+ 413E67762169866100EF37ED /* RTCVideoCodecConstants.mm in Sources */, >+ 4144B3E82169C5E3004363AC /* RTCVideoCodecInfo+Private.mm in Sources */, >+ 413E67982169883900EF37ED /* RTCVideoCodecInfo.m in Sources */, >+ 417953B8216982420028266B /* RTCVideoDecoderFactoryH264.m in Sources */, >+ 417953C72169824B0028266B /* RTCVideoDecoderH264.mm in Sources */, >+ 413E67682169854B00EF37ED /* RTCVideoDecoderVP8.mm in Sources */, >+ 417953B9216982420028266B /* RTCVideoEncoderFactoryH264.m in Sources */, >+ 417953CA2169824B0028266B /* RTCVideoEncoderH264.mm in Sources */, >+ 413E67A32169889500EF37ED /* RTCVideoEncoderQpThresholds.m in Sources */, >+ 4144B3F7216B09CB004363AC /* RTCVideoEncoderSettings+Private.mm in Sources */, >+ 413E67652169854600EF37ED /* RTCVideoEncoderVP8.mm in Sources */, >+ 413E6790216987DB00EF37ED /* RTCVideoFrame.mm in Sources */, >+ 413E676B2169854B00EF37ED /* RTCWrappedNativeVideoDecoder.mm in Sources */, >+ 413E676D2169854B00EF37ED /* RTCWrappedNativeVideoEncoder.mm in Sources */, > 41D6B45921273159008F9353 /* rtp_bitrate_configurator.cc in Sources */, > 419C82B41FE20DCD0040C30F /* rtp_config.cc in Sources */, > 413092021EF8D0A600757C55 /* rtp_demuxer.cc in Sources */, >- 417953B5216982420028266B /* RTCDefaultVideoDecoderFactory.m in Sources */, > 5CDD88B01E43BE3C00621E92 /* rtp_format.cc in Sources */, > 5CDD88A41E43BE3C00621E92 /* 
rtp_format_h264.cc in Sources */, > 5CDD88A61E43BE3C00621E92 /* rtp_format_video_generic.cc in Sources */, >@@ -16854,6 +16964,7 @@ > 41E02C7A2127347400C27CD6 /* rtp_generic_frame_descriptor.cc in Sources */, > 419241B02127590200634FCF /* rtp_generic_frame_descriptor_extension.cc in Sources */, > 4145E4DC1EF8CCEF00FCF6E6 /* rtp_header_extension_map.cc in Sources */, >+ 413E677D216986AE00EF37ED /* rtp_header_extension_size.cc in Sources */, > 5CDD88B51E43BE3C00621E92 /* rtp_header_extensions.cc in Sources */, > 5CDD88B71E43BE3C00621E92 /* rtp_header_parser.cc in Sources */, > 41F9BF8B2051C80100ABF0B9 /* rtp_headers.cc in Sources */, >@@ -16867,7 +16978,6 @@ > 5CDD88CC1E43BE3D00621E92 /* rtp_rtcp_impl.cc in Sources */, > 5CDD88D31E43BE3D00621E92 /* rtp_sender.cc in Sources */, > 5CDD88CE1E43BE3D00621E92 /* rtp_sender_audio.cc in Sources */, >- 417953DB216983910028266B /* metrics.cc in Sources */, > 5CDD88D11E43BE3D00621E92 /* rtp_sender_video.cc in Sources */, > 419C82BC1FE20DCD0040C30F /* rtp_stream_receiver_controller.cc in Sources */, > 5CDD85B21E43B5C000621E92 /* rtp_streams_synchronizer.cc in Sources */, >@@ -16876,26 +16986,22 @@ > 5CDD88D51E43BE3D00621E92 /* rtp_utility.cc in Sources */, > 41E02C7D2127347400C27CD6 /* rtp_video_header.cc in Sources */, > 41D6B45C21273159008F9353 /* rtp_video_sender.cc in Sources */, >- 413E678B216987B900EF37ED /* RTCCVPixelBuffer.mm in Sources */, >- 413E67742169863B00EF37ED /* vp8_temporal_layers.cc in Sources */, > 413091F41EF8CF9200757C55 /* rtp_video_stream_receiver.cc in Sources */, > 5C4B48EB1E42C1E3002651C8 /* rtpdataengine.cc in Sources */, > 5CDD896C1E43BF3A00621E92 /* rtpfb.cc in Sources */, > 419C82C91FE20E2C0040C30F /* rtpmediautils.cc in Sources */, >- 4144B3D821698D3D004363AC /* logging.cc in Sources */, > 419C83151FE242800040C30F /* rtpparameters.cc in Sources */, > 4102F69321273206006AE8D7 /* rtpparametersconversion.cc in Sources */, > 5CD285091E6A60570094FDC8 /* rtpreceiver.cc in Sources */, >- 
4144B3F5216B081B004363AC /* RTCEncodedImage+Private.mm in Sources */, >- 416225D92169818100A91C9B /* video_frame.mm in Sources */, > 41F2638321267B7700274F59 /* rtpreceiverinterface.cc in Sources */, > 5CD2850B1E6A60570094FDC8 /* rtpsender.cc in Sources */, >+ 4144B3E22169B5BD004363AC /* rtpsenderinterface.cc in Sources */, > 419C82E61FE20E590040C30F /* rtptransceiver.cc in Sources */, > 41F2638521267B7700274F59 /* rtptransceiverinterface.cc in Sources */, > 4145E4CC1EF8CB8B00FCF6E6 /* rtptransport.cc in Sources */, > 5C4B48F11E42C1E3002651C8 /* rtputils.cc in Sources */, > 5CDD83F81E439A6F00621E92 /* rtt_filter.cc in Sources */, >- 413E67722169863B00EF37ED /* temporal_layers_checker.cc in Sources */, >+ 417953EA216983F10028266B /* rtt_mult_experiment.cc in Sources */, > 419241CA2127593F00634FCF /* rtt_stats.cc in Sources */, > 419C82B71FE20DCD0040C30F /* rtx_receive_stream.cc in Sources */, > 419241642127385B00634FCF /* rw_lock_posix.cc in Sources */, >@@ -16912,7 +17018,6 @@ > 41239B3F21476DC400396F81 /* screenshare_layers.cc in Sources */, > 5CD2863C1E6A681C0094FDC8 /* sctptransport.cc in Sources */, > 5CD2850D1E6A60570094FDC8 /* sctputils.cc in Sources */, >- 413E67652169854600EF37ED /* RTCVideoEncoderVP8.mm in Sources */, > 5CDD896F1E43BF3A00621E92 /* sdes.cc in Sources */, > 41F2636221267ADF00274F59 /* sdp_video_format.cc in Sources */, > 419C82DC1FE20E590040C30F /* sdputils.cc in Sources */, >@@ -16923,11 +17028,11 @@ > 5CDD85B91E43B5C000621E92 /* send_statistics_proxy.cc in Sources */, > 41299B8C2127365100B3414B /* send_time_history.cc in Sources */, > 5CDD89721E43BF3A00621E92 /* sender_report.cc in Sources */, >+ 41FCBB2721B1F87E00A5DF27 /* sent_packet.cc in Sources */, > 413A239D1FE18E0800373E99 /* sequenced_task_checker_impl.cc in Sources */, > 5CDD83FB1E439A6F00621E92 /* session_info.cc in Sources */, > 41F9BFE42051DE1800ABF0B9 /* sessiondescription.cc in Sources */, > 5CD285DF1E6A63430094FDC8 /* shadow_filter_update_gain.cc in Sources */, >- 
413E677A2169867A00EF37ED /* channel_receive_proxy.cc in Sources */, > 416D3BEB212731C200775F09 /* signal_classifier.cc in Sources */, > 413A235B1FE18E0700373E99 /* signalthread.cc in Sources */, > 41F263BB2126811500274F59 /* sigslot.cc in Sources */, >@@ -16946,18 +17051,14 @@ > 5C4B4CE51E4320A9002651C8 /* sleep.cc in Sources */, > 5CDD8BE51E43C2B500621E92 /* smooth.c in Sources */, > 5CDD8BE31E43C2B500621E92 /* smooth_out_data.c in Sources */, >- 413E67A5216988DC00EF37ED /* video_stream_encoder_observer.cc in Sources */, >- 417953B4216982420028266B /* nalu_rewriter.cc in Sources */, > 5CD285F61E6A63F60094FDC8 /* smoothing_filter.cc in Sources */, > 41DDB2562126790A00296D47 /* socket.cc in Sources */, > 413A22F01FE18E0700373E99 /* socketadapters.cc in Sources */, > 413A23681FE18E0700373E99 /* socketaddress.cc in Sources */, >- 4144B3D621698A1A004363AC /* RTCH264ProfileLevelId.mm in Sources */, > 413A234D1FE18E0700373E99 /* socketaddresspair.cc in Sources */, > 413A22D31FE18E0700373E99 /* socketstream.cc in Sources */, > 5CDD8BE71E43C2B500621E92 /* sort_sq.c in Sources */, > 5C4B4C7D1E431F9C002651C8 /* sparse_fir_filter.cc in Sources */, >- 4144B3E22169B5BD004363AC /* rtpsenderinterface.cc in Sources */, > 419241AD212758D300634FCF /* spectral_features.cc in Sources */, > 419241A8212758D300634FCF /* spectral_features_internal.cc in Sources */, > 41433D021F79B33400387B4D /* spectrum_ar_model_tables.c in Sources */, >@@ -16992,7 +17093,6 @@ > 41E02CBE2127356A00C27CD6 /* sslcertificate.cc in Sources */, > 413A23F11FE18E0800373E99 /* sslfingerprint.cc in Sources */, > 419C84161FE24A8D0040C30F /* sslidentity.cc in Sources */, >- 413E677D216986AE00EF37ED /* rtp_header_extension_size.cc in Sources */, > 419C84171FE24AA30040C30F /* sslstreamadapter.cc in Sources */, > 5CDD87361E43BABE00621E92 /* standalone_vad.cc in Sources */, > 5CDD8BEB1E43C2B500621E92 /* state_construct.c in Sources */, >@@ -17005,8 +17105,8 @@ > 5CDD85BF1E43B5C000621E92 /* stream_synchronization.cc in 
Sources */, > 5C4B48F41E42C1E3002651C8 /* streamparams.cc in Sources */, > 41DDB26E212679D200296D47 /* string_builder.cc in Sources */, >- 413E676B2169854B00EF37ED /* RTCWrappedNativeVideoDecoder.mm in Sources */, > 413A23D61FE18E0800373E99 /* string_to_number.cc in Sources */, >+ 41FCBB8021B2001200A5DF27 /* string_view.cc in Sources */, > 413A22B51FE18E0700373E99 /* stringencode.cc in Sources */, > 413A23131FE18E0700373E99 /* stringutils.cc in Sources */, > 5C63FA0B1E4174F6002CA531 /* stun.cc in Sources */, >@@ -17016,6 +17116,8 @@ > 5CD285E21E6A63430094FDC8 /* subtractor.cc in Sources */, > 5CD285E41E6A63430094FDC8 /* suppression_filter.cc in Sources */, > 5CD285E61E6A63430094FDC8 /* suppression_gain.cc in Sources */, >+ 413E67822169870F00EF37ED /* svc_config.cc in Sources */, >+ 413E67942169881900EF37ED /* svc_rate_allocator.cc in Sources */, > 5CDD8BEF1E43C2B500621E92 /* swap_bytes.c in Sources */, > 5CDD8A541E43BFB300621E92 /* sync_buffer.cc in Sources */, > 5CD284BE1E6A5F9F0094FDC8 /* syncable.cc in Sources */, >@@ -17025,14 +17127,14 @@ > 413A23DE1FE18E0800373E99 /* task_queue_gcd.cc in Sources */, > 413A232D1FE18E0700373E99 /* task_queue_posix.cc in Sources */, > 5C63FA171E4174F6002CA531 /* tcpport.cc in Sources */, >+ 413E67722169863B00EF37ED /* temporal_layers_checker.cc in Sources */, > 413A22A51FE18E0600373E99 /* testechoserver.cc in Sources */, > 413A22E51FE18E0700373E99 /* thread.cc in Sources */, > 413A23501FE18E0700373E99 /* thread_checker_impl.cc in Sources */, >- 413A23DD1FE18E0800373E99 /* thread_darwin.mm in Sources */, > 5CDD84C31E43AF1300621E92 /* three_band_filter_bank.cc in Sources */, >+ 41FCBB2221B1F82F00A5DF27 /* throw_delegate.cc in Sources */, > 5CDD8A571E43BFB300621E92 /* tick_timer.cc in Sources */, > 419241E221275A3000634FCF /* time_delta.cc in Sources */, >- 419C83351FE245EA0040C30F /* time_interval.cc in Sources */, > 5CDD8A5A1E43BFB300621E92 /* time_stretch.cc in Sources */, > 5CDD88DA1E43BE3D00621E92 /* time_util.cc in Sources 
*/, > 419241E321275A3000634FCF /* timestamp.cc in Sources */, >@@ -17072,9 +17174,7 @@ > 41F9BFEA2051DEEA00ABF0B9 /* turnportfactory.cc in Sources */, > 5C63FA2F1E4174F6002CA531 /* turnserver.cc in Sources */, > 5C4B48F91E42C1E3002651C8 /* turnutils.cc in Sources */, >- 416225E42169819500A91C9B /* objc_frame_buffer.mm in Sources */, > 5CDD84C51E43AF1300621E92 /* typing_detection.cc in Sources */, >- 419C83DC1FE247F20040C30F /* udptransport.cc in Sources */, > 5CDD88DF1E43BE3D00621E92 /* ulpfec_generator.cc in Sources */, > 5CDD88E21E43BE3D00621E92 /* ulpfec_header_reader_writer.cc in Sources */, > 5CDD88E41E43BE3D00621E92 /* ulpfec_receiver_impl.cc in Sources */, >@@ -17087,7 +17187,6 @@ > 5CDD86D61E43BA2800621E92 /* vad_core.c in Sources */, > 5CDD86D91E43BA2800621E92 /* vad_filterbank.c in Sources */, > 5CDD86DC1E43BA2800621E92 /* vad_gmm.c in Sources */, >- 417953E6216983CA0028266B /* module_common_types.cc in Sources */, > 5CDD86DF1E43BA2800621E92 /* vad_sp.c in Sources */, > 416D3BEC212731C200775F09 /* vad_with_level.cc in Sources */, > 41F9BFD52051DDE500ABF0B9 /* vector_buffer.cc in Sources */, >@@ -17098,17 +17197,20 @@ > 5CD285351E6A61590094FDC8 /* video_codec_initializer.cc in Sources */, > 41E02C8D212734B900C27CD6 /* video_coding_defines.cc in Sources */, > 5CDD84011E439A6F00621E92 /* video_coding_impl.cc in Sources */, >- 416225E22169819500A91C9B /* objc_video_frame.mm in Sources */, > 419C84111FE249AB0040C30F /* video_content_type.cc in Sources */, > 41F2636721267ADF00274F59 /* video_decoder.cc in Sources */, >+ 417953F1216984550028266B /* video_decoder_factory.cc in Sources */, >+ 416225D82169818100A91C9B /* video_decoder_factory.mm in Sources */, > 41F2636521267ADF00274F59 /* video_decoder_software_fallback_wrapper.cc in Sources */, > 4145F6201FE1F38D00EB9CAF /* video_encoder.cc in Sources */, > 41E02CC52127358800C27CD6 /* video_encoder_config.cc in Sources */, >+ 416225D42169818100A91C9B /* video_encoder_factory.mm in Sources */, > 
41F2636921267ADF00274F59 /* video_encoder_software_fallback_wrapper.cc in Sources */, > 5CD2846A1E6A57F40094FDC8 /* video_frame.cc in Sources */, >- 5C4B4C1F1E431F75002651C8 /* video_frame.cc in Sources */, >+ 416225D92169818100A91C9B /* video_frame.mm in Sources */, > 4124554B1EF8874300F11809 /* video_frame_buffer.cc in Sources */, > 5C4B4C1E1E431F75002651C8 /* video_frame_buffer.cc in Sources */, >+ 416225D72169818100A91C9B /* video_frame_buffer.mm in Sources */, > 4102F6C9212733B7006AE8D7 /* video_quality_observer.cc in Sources */, > 419C82BE1FE20DCD0040C30F /* video_receive_stream.cc in Sources */, > 5CDD85CA1E43B5C000621E92 /* video_receive_stream.cc in Sources */, >@@ -17116,15 +17218,14 @@ > 5C4B4C201E431F75002651C8 /* video_render_frames.cc in Sources */, > 419C82B31FE20DCD0040C30F /* video_send_stream.cc in Sources */, > 5CDD85CD1E43B5C000621E92 /* video_send_stream.cc in Sources */, >- 417953CA2169824B0028266B /* RTCVideoEncoderH264.mm in Sources */, > 4102F6CC212733B7006AE8D7 /* video_send_stream_impl.cc in Sources */, >- 417953F1216984550028266B /* video_decoder_factory.cc in Sources */, > 5CDD84041E439A6F00621E92 /* video_sender.cc in Sources */, > 41A08BB0212681C8001D5D7B /* video_source_interface.cc in Sources */, > 5CDD85CF1E43B5C000621E92 /* video_stream_decoder.cc in Sources */, > 4102F6DB21273416006AE8D7 /* video_stream_decoder_create.cc in Sources */, > 4102F6CB212733B7006AE8D7 /* video_stream_decoder_impl.cc in Sources */, > 419C831F1FE242E60040C30F /* video_stream_encoder.cc in Sources */, >+ 413E67A5216988DC00EF37ED /* video_stream_encoder_observer.cc in Sources */, > 419C84101FE249AB0040C30F /* video_timing.cc in Sources */, > 5C4B48FC1E42C1E3002651C8 /* videoadapter.cc in Sources */, > 5C4B48FF1E42C1E3002651C8 /* videobroadcaster.cc in Sources */, >@@ -17142,7 +17243,7 @@ > 419241F521275C3200634FCF /* vp8_encoder_simulcast_proxy.cc in Sources */, > 5CDD83851E439A3500621E92 /* vp8_header_parser.cc in Sources */, > 5CDD8C141E43C3B400621E92 /* 
vp9_noop.cc in Sources */, >- 417953B7216982420028266B /* RTCDefaultVideoEncoderFactory.m in Sources */, >+ 417953EF216984210028266B /* vp9_profile.cc in Sources */, > 5CDD8BF31E43C2B500621E92 /* vq3.c in Sources */, > 5CDD8BF51E43C2B500621E92 /* vq4.c in Sources */, > 5C4B4C801E431F9C002651C8 /* wav_file.cc in Sources */, >@@ -17165,7 +17266,6 @@ > 5C4B4C861E431F9C002651C8 /* window_generator.cc in Sources */, > 5CDD8C981E43C66000621E92 /* wpd_node.cc in Sources */, > 5CDD8C9B1E43C66000621E92 /* wpd_tree.cc in Sources */, >- 4144B3F7216B09CB004363AC /* RTCVideoEncoderSettings+Private.mm in Sources */, > 5CDD8BF91E43C2B500621E92 /* xcorr_coef.c in Sources */, > 41DDB2592126792800296D47 /* zero_memory.cc in Sources */, > ); >diff --git a/Source/WebCore/platform/mediastream/libwebrtc/LibWebRTCProvider.cpp b/Source/WebCore/platform/mediastream/libwebrtc/LibWebRTCProvider.cpp >index e5dab669d8287ebd30e14ed61517a99fb12c04fe..c24dc0799d14052b7025aaa515fce807bf9bad1c 100644 >--- a/Source/WebCore/platform/mediastream/libwebrtc/LibWebRTCProvider.cpp >+++ b/Source/WebCore/platform/mediastream/libwebrtc/LibWebRTCProvider.cpp >@@ -37,6 +37,7 @@ ALLOW_UNUSED_PARAMETERS_BEGIN > #include <webrtc/api/asyncresolverfactory.h> > #include <webrtc/api/audio_codecs/builtin_audio_decoder_factory.h> > #include <webrtc/api/audio_codecs/builtin_audio_encoder_factory.h> >+#include <webrtc/api/create_peerconnection_factory.h> > #include <webrtc/api/peerconnectionfactoryproxy.h> > #include <webrtc/modules/audio_processing/include/audio_processing.h> > #include <webrtc/p2p/base/basicpacketsocketfactory.h> >diff --git a/Source/WebKit/NetworkProcess/webrtc/LibWebRTCSocketClient.cpp b/Source/WebKit/NetworkProcess/webrtc/LibWebRTCSocketClient.cpp >index 73b01967594e0ae4d62cbfe5df66c71d7f00ba5e..c5995bc53255fe2d376fc45fbfb9ee631ee73e71 100644 >--- a/Source/WebKit/NetworkProcess/webrtc/LibWebRTCSocketClient.cpp >+++ b/Source/WebKit/NetworkProcess/webrtc/LibWebRTCSocketClient.cpp >@@ -92,7 +92,7 @@ 
void LibWebRTCSocketClient::signalReadPacket(rtc::AsyncPacketSocket* socket, con > auto buffer = WebCore::SharedBuffer::create(value, length); > m_rtcProvider.sendFromMainThread([identifier = m_identifier, buffer = WTFMove(buffer), address = RTCNetwork::isolatedCopy(address), packetTime](IPC::Connection& connection) { > IPC::DataReference data(reinterpret_cast<const uint8_t*>(buffer->data()), buffer->size()); >- connection.send(Messages::WebRTCSocket::SignalReadPacket(data, RTCNetwork::IPAddress(address.ipaddr()), address.port(), packetTime.timestamp), identifier); >+ connection.send(Messages::WebRTCSocket::SignalReadPacket(data, RTCNetwork::IPAddress(address.ipaddr()), address.port(), packetTime), identifier); > }); > } > >diff --git a/Source/WebKit/NetworkProcess/webrtc/LibWebRTCSocketClient.h b/Source/WebKit/NetworkProcess/webrtc/LibWebRTCSocketClient.h >index 360c2e01a4f13c5e808feeed5226771422efd14c..8f205ce029713d8ba213f143ba71bb114752def5 100644 >--- a/Source/WebKit/NetworkProcess/webrtc/LibWebRTCSocketClient.h >+++ b/Source/WebKit/NetworkProcess/webrtc/LibWebRTCSocketClient.h >@@ -35,8 +35,8 @@ namespace rtc { > class AsyncPacketSocket; > class SocketAddress; > struct PacketOptions; >-struct PacketTime; > struct SentPacket; >+typedef int64_t PacketTime; > } > > namespace WebCore { >diff --git a/Source/WebKit/NetworkProcess/webrtc/NetworkRTCProvider.cpp b/Source/WebKit/NetworkProcess/webrtc/NetworkRTCProvider.cpp >index ce0f13da7594ad6fab87147e9de2d58b82ebdd93..dd97798daea40ace629ad8f0a3b411e4928d3f82 100644 >--- a/Source/WebKit/NetworkProcess/webrtc/NetworkRTCProvider.cpp >+++ b/Source/WebKit/NetworkProcess/webrtc/NetworkRTCProvider.cpp >@@ -37,6 +37,7 @@ > #include "WebRTCSocketMessages.h" > #include <WebCore/LibWebRTCMacros.h> > #include <webrtc/rtc_base/asyncpacketsocket.h> >+#include <webrtc/rtc_base/logging.h> > #include <wtf/MainThread.h> > #include <wtf/text/WTFString.h> > >diff --git a/Source/WebKit/NetworkProcess/webrtc/NetworkRTCSocket.h 
b/Source/WebKit/NetworkProcess/webrtc/NetworkRTCSocket.h >index d9deaac625503194bf2deef0ab85978c706c6f58..6e4bfe2b16419cbbc51b80d95f759efe2edd92af 100644 >--- a/Source/WebKit/NetworkProcess/webrtc/NetworkRTCSocket.h >+++ b/Source/WebKit/NetworkProcess/webrtc/NetworkRTCSocket.h >@@ -42,8 +42,8 @@ namespace rtc { > class AsyncPacketSocket; > class SocketAddress; > struct PacketOptions; >-struct PacketTime; > struct SentPacket; >+typedef int64_t PacketTime; > } > > namespace WebCore { >diff --git a/Source/WebKit/WebProcess/Network/webrtc/LibWebRTCSocket.cpp b/Source/WebKit/WebProcess/Network/webrtc/LibWebRTCSocket.cpp >index 70541573373d269271202b061eb9c15c8ec7656a..e2c5df557c40115ddf48fd4079afb949a6c25569 100644 >--- a/Source/WebKit/WebProcess/Network/webrtc/LibWebRTCSocket.cpp >+++ b/Source/WebKit/WebProcess/Network/webrtc/LibWebRTCSocket.cpp >@@ -82,7 +82,7 @@ void LibWebRTCSocket::signalAddressReady(const rtc::SocketAddress& address) > void LibWebRTCSocket::signalReadPacket(const WebCore::SharedBuffer& buffer, rtc::SocketAddress&& address, int64_t timestamp) > { > m_remoteAddress = WTFMove(address); >- SignalReadPacket(this, buffer.data(), buffer.size(), m_remoteAddress, rtc::PacketTime(timestamp, 0)); >+ SignalReadPacket(this, buffer.data(), buffer.size(), m_remoteAddress, timestamp); > } > > void LibWebRTCSocket::signalSentPacket(int rtcPacketID, int64_t sendTimeMs)